DOT National Transportation Integrated Search
2002-02-01
This report examines the literature on involuntary, high-consequence, low-probability (IHL) events like nuclear power plant meltdowns to determine what can be applied to the problem of voluntary, low-consequence, high-probability (VLH) events like tra...
Does a better model yield a better argument? An info-gap analysis
NASA Astrophysics Data System (ADS)
Ben-Haim, Yakov
2017-04-01
Theories, models and computations underlie reasoned argumentation in many areas. The possibility of error in these arguments, though of low probability, may be highly significant when the argument is used in predicting the probability of rare high-consequence events. This implies that the choice of a theory, model or computational method for predicting rare high-consequence events must account for the probability of error in these components. However, error may result from lack of knowledge or surprises of various sorts, and predicting the probability of error is highly uncertain. We show that the putatively best, most innovative and sophisticated argument may not actually have the lowest probability of error. Innovative arguments may entail greater uncertainty than more standard but less sophisticated methods, creating an innovation dilemma in formulating the argument. We employ info-gap decision theory to characterize and support the resolution of this problem and present several examples.
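The innovation dilemma described above can be made concrete with an info-gap robustness sketch. In the toy model below (the linear uncertainty model and all numbers are illustrative assumptions, not taken from the paper), each argument has a nominal error-probability estimate and an uncertainty weight; robustness is the largest horizon of uncertainty at which the worst-case error probability still meets a requirement:

```python
def robustness(p_est, s, p_crit):
    """Info-gap robustness: largest horizon alpha such that the worst-case
    error probability p_est + s * alpha still satisfies p <= p_crit.
    (Assumed uncertainty model: p is known only to within s * alpha of p_est.)"""
    if p_est > p_crit:
        return 0.0  # requirement violated even at the nominal estimate
    return (p_crit - p_est) / s

# Innovative argument: lower estimated error probability, but more uncertain.
alpha_innovative = robustness(p_est=0.01, s=0.05, p_crit=0.05)
# Standard argument: higher estimated error probability, but better understood.
alpha_standard = robustness(p_est=0.03, s=0.01, p_crit=0.05)

print(alpha_innovative, alpha_standard)
```

Here the standard argument is more robust (about 2.0 vs. 0.8) despite its worse nominal estimate, which is the dilemma in miniature.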
ERIC Educational Resources Information Center
Pitts, Laura; Dymond, Simon
2012-01-01
Research on the high-probability (high-p) request sequence shows that compliance with low-probability (low-p) requests generally increases when preceded by a series of high-p requests. Few studies have conducted formal preference assessments to identify the consequences used for compliance, which may partly explain treatment failures, and still…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mumpower, J.L.
There are strong structural similarities between risks from technological hazards and big-purse state lottery games. Risks from technological hazards are often described as low-probability, high-consequence negative events. State lotteries could be equally well characterized as low-probability, high-consequence positive events. Typical communications about state lotteries provide a virtual strategic textbook for opponents of risky technologies. The same techniques can be used to sell lottery tickets or sell opposition to risky technologies. Eight basic principles are enumerated.
Approved Methods and Algorithms for DoD Risk-Based Explosives Siting
2007-02-02
glass. Pgha: probability of a person being in the glass hazard area; Phit: probability of hit; Phit (f): probability of hit for fatality; Phit (maji...): probability of hit for major injury; Phit (mini): probability of hit for minor injury; Pi: debris probability densities at the ES; PMaj (pair): individual... combined high-angle and combined low-angle tables. A unique probability of hit is calculated for the three consequences of fatality, Phit (f), major injury...
ERIC Educational Resources Information Center
National Clearinghouse for Educational Facilities, 2008
2008-01-01
Earthquakes are low-probability, high-consequence events. Though they may occur only once in the life of a school, they can have devastating, irreversible consequences. Moderate earthquakes can cause serious damage to building contents and non-structural building systems, serious injury to students and staff, and disruption of building operations.…
Statistical surrogate models for prediction of high-consequence climate change.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Constantine, Paul; Field, Richard V., Jr.; Boslough, Mark Bruce Elrick
2011-09-01
In safety engineering, performance metrics are defined using probabilistic risk assessments focused on the low-probability, high-consequence tail of the distribution of possible events, as opposed to best estimates based on central tendencies. We frame the climate change problem and its associated risks in a similar manner. To properly explore the tails of the distribution requires extensive sampling, which is not possible with existing coupled atmospheric models due to the high computational cost of each simulation. We therefore propose the use of specialized statistical surrogate models (SSMs) for the purpose of exploring the probability law of various climate variables of interest. An SSM differs from a deterministic surrogate model in that it represents each climate variable of interest as a space/time random field. The SSM can be calibrated to available spatial and temporal data from existing climate databases, e.g., the Program for Climate Model Diagnosis and Intercomparison (PCMDI), or to a collection of outputs from a General Circulation Model (GCM), e.g., the Community Earth System Model (CESM) and its predecessors. Because of its reduced size and complexity, the realization of a large number of independent model outputs from an SSM becomes computationally straightforward, so that quantifying the risk associated with low-probability, high-consequence climate events becomes feasible. A Bayesian framework is developed to provide quantitative measures of confidence, via Bayesian credible intervals, in the use of the proposed approach to assess these risks.
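The computational appeal of an SSM is that realizations are cheap enough to sample the tails extensively. A minimal illustration, with a toy one-dimensional Gaussian random field standing in for a calibrated SSM (grid, covariance, and threshold are all assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in for an SSM: a climate variable on a 1-D spatial grid, modeled
# as a zero-mean Gaussian random field with squared-exponential covariance.
x = np.linspace(0.0, 1.0, 50)
cov = np.exp(-0.5 * ((x[:, None] - x[None, :]) / 0.2) ** 2)
L = np.linalg.cholesky(cov + 1e-6 * np.eye(x.size))  # jitter for stability

# Because each realization is cheap, extensive sampling of the tails is easy.
n = 20_000
fields = rng.standard_normal((n, x.size)) @ L.T

# Low-probability, high-consequence event: spatial maximum exceeds a threshold.
threshold = 3.0
p_tail = np.mean(fields.max(axis=1) > threshold)
print(f"estimated P(max > {threshold}) = {p_tail:.4f}")
```

The same Monte Carlo estimate would be prohibitively expensive if each realization required a full coupled-model run.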
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grippo, Mark A.; Hlohowskyj, Ihor; Fox, Laura
The U.S. Army Corps of Engineers (USACE) is conducting the Great Lakes and Mississippi River Interbasin Study (GLMRIS) to determine the aquatic nuisance species (ANS) currently established in either the Mississippi River Basin (MRB) or the Great Lakes Basin (GLB) that pose the greatest risk to the other basin. The GLMRIS study focuses specifically on ANS transfer through the Chicago Area Waterway System (CAWS), a multi-use waterway connecting the two basins. In support of GLMRIS, we conducted a qualitative risk assessment for 34 ANS in which we determined overall risk level for four time intervals over a 50-year period of analysis based on the probability of ANS establishing in a new basin and the environmental, economic, and sociopolitical consequences of their establishment. Probability of establishment and consequences of establishment were assigned qualitative ratings of high, medium, or low, and establishment and consequence ratings were then combined into an overall risk rating. Over the 50-year period of analysis, seven species were characterized as posing a medium risk and two species as posing a high risk to the MRB. Three species were characterized as posing a medium risk to the GLB, but no high-risk species were identified for this basin. Based on the time frame in which these species were considered likely to establish in the new basin, risk increased over time for some ANS. Identifying and prioritizing ANS risk supported the development and evaluation of multiple control alternatives that could reduce the probability of interbasin ANS transfer. However, both species traits and the need to balance multiple uses of the CAWS make it difficult to design cost-efficient and socially acceptable controls to reduce the probability of ANS transfer between the two basins.
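Qualitative establishment and consequence ratings of this kind are typically combined through a lookup matrix. The particular matrix below is a hypothetical illustration; the study's actual combination rule is not reproduced here:

```python
# Hypothetical high/medium/low risk matrix combining probability of
# establishment (rows) with consequences of establishment (columns).
RATING = {"low": 0, "medium": 1, "high": 2}
MATRIX = [
    ["low",    "low",    "medium"],  # low probability of establishment
    ["low",    "medium", "high"],    # medium probability of establishment
    ["medium", "high",   "high"],    # high probability of establishment
]

def overall_risk(p_establishment, consequence):
    """Combine two qualitative ratings into an overall risk rating."""
    return MATRIX[RATING[p_establishment]][RATING[consequence]]

print(overall_risk("high", "medium"))  # a high-probability, medium-consequence ANS
print(overall_risk("low", "high"))     # a low-probability, high-consequence ANS
```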
Kyyrö, J; Sahlström, L; Lyytikäinen, T
2017-12-01
The NORA rapid risk assessment tool was developed for situations where there is a change in the disease status of easily transmissible animal diseases in neighbouring countries or in countries with significant interactions with Finland. The goal was to develop a tool that is quick to use and will provide consistent results to support risk management decisions. The model contains 63 questions that define the potential for entry and exposure by nine different pathways. The magnitude of the consequences is defined by 23 statements. The weight of different pathways is defined according to the properties of the assessed disease. The model was built as an Excel spreadsheet and is intended for use by animal health control administrators. As an outcome, the model gives the possible pathways of disease entry into the country, an overall approximation for the probability of entry and the subsequent exposure, an overall estimate for the consequences and a combined overall risk estimate (probability multiplied by magnitude of consequences). Model validity was assessed by expert panels. Outside Africa, African swine fever is currently established in Russia and Sardinia. In addition, there have been cases in both wild boar and domestic pigs in Latvia, Lithuania, Poland and Estonia. Finland has frequent contacts with Russia and Estonia, especially through passengers. The risk of African swine fever (ASF) introduction into Finland was tested with NORA for the situation in December 2015, when ASF was endemic in many parts of Russia, Africa and Sardinia and was present in Baltic countries and in Poland. African swine fever was assessed to have a high probability of entry into Finland, with high consequences and therefore a high overall risk. © 2017 Blackwell Verlag GmbH.
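The NORA outcome, probability multiplied by magnitude of consequences with disease-specific pathway weights, can be sketched as follows. The pathway names, scores, and weights below are invented for illustration; the real model aggregates 63 questions and 23 statements:

```python
# Assumed per-pathway entry/exposure scores on [0, 1] (e.g., from question answers).
pathway_scores = {"passengers": 0.8, "animal_trade": 0.2, "wild_boar": 0.5}
# Assumed disease-specific pathway weights, summing to 1 for the assessed disease.
pathway_weights = {"passengers": 0.6, "animal_trade": 0.1, "wild_boar": 0.3}

# Overall approximation of the probability of entry and subsequent exposure.
p_entry_exposure = sum(pathway_scores[p] * pathway_weights[p] for p in pathway_scores)

consequence_magnitude = 0.9  # assumed overall consequence estimate, scaled to [0, 1]

# Combined overall risk estimate: probability multiplied by magnitude of consequences.
overall_risk = p_entry_exposure * consequence_magnitude
print(f"entry/exposure = {p_entry_exposure:.2f}, overall risk = {overall_risk:.3f}")
```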
DOE Office of Scientific and Technical Information (OSTI.GOV)
Krier, D. J.; Perry, F. V.
Location, timing, volume, and eruptive style of post-Miocene volcanoes have defined the volcanic hazard significant to a proposed high-level radioactive waste (HLW) and spent nuclear fuel (SNF) repository at Yucca Mountain, Nevada, as a low-probability, high-consequence event. Examination of eruptive centers in the region that may be analogues to possible future volcanic activity at Yucca Mountain has aided in defining and evaluating the consequence scenarios for intrusion into and eruption above a repository. The probability of a future event intersecting a repository at Yucca Mountain has a mean value of 1.7 × 10^-8 per year. This probability comes from the Probabilistic Volcanic Hazard Assessment (PVHA) completed in 1996 and updated to reflect changes in repository layout. Since that time, magnetic anomalies representing potential buried volcanic centers have been identified from magnetic surveys; however, these potential buried centers only slightly increase the probability of an event intersecting the repository. The proposed repository will be located in the central portion of Yucca Mountain at approximately 300 m depth. The process for assessing performance of a repository at Yucca Mountain has identified two scenarios for igneous activity that, although having a very low probability of occurrence, could have a significant consequence should an igneous event occur. Either a dike swarm intersecting repository drifts containing waste packages, or a volcanic eruption through the repository, could result in release of radioactive material to the accessible environment. Ongoing investigations are assessing the mechanisms and significance of the consequence scenarios. Lathrop Wells Cone (~80,000 yr), a key analogue for estimating potential future volcanic activity, is the youngest surface expression of apparent waning basaltic volcanism in the region.
Cone internal structure, lavas, and ash-fall tephra have been examined to estimate eruptive volume, eruption type, and subsurface disturbance accompanying conduit growth and eruption. The Lathrop Wells volcanic complex has a total estimated volume of approximately 0.1 km³. The eruptive products indicate a sequence of initial magmatic fissure fountaining, early Strombolian activity, a brief hydrovolcanic phase, and violent Strombolian phase(s). Lava flows adjacent to the Lathrop Wells Cone probably were emplaced during the mid-eruptive sequence. Ongoing investigations continue to address the potential hazards of a volcanic event at Yucca Mountain.
Highly Competitive Reindeer Males Control Female Behavior during the Rut
Body, Guillaume; Weladji, Robert B.; Holand, Øystein; Nieminen, Mauri
2014-01-01
During the rut, female ungulates move among harems or territories, either to sample mates or to avoid harassment. Females may be herded by a male, may stay with a preferred male, or may aggregate near a dominant male to avoid harassment from other males. In fission-fusion group dynamics, female movement is best described by the group's fission probability, instead of inter-harem movement. In this study, we tested whether male herding ability, female mate choice, or harassment avoidance influence fission probability. We recorded group dynamics in a herd of reindeer Rangifer tarandus equipped with GPS collars with activity sensors. We found no evidence that the harassment level in the group affected fission probability, or that females sought high-ranked (i.e., highly competitive and hence successful) males. However, the behavior of high-ranked males decreased fission probability. Male herding activity was synchronous with the decrease in fission probability observed during the rut. We concluded that male herding behavior stabilized groups, thereby increasing average group size and, consequently, the opportunity for sexual selection. PMID:24759701
Quantitative assessment of building fire risk to life safety.
Guanquan, Chu; Jinhua, Sun
2008-06-01
This article presents a quantitative risk assessment framework for evaluating fire risk to life safety. Fire risk is divided into two parts: the probability and the corresponding consequence of every fire scenario. The time-dependent event tree technique is used to analyze probable fire scenarios based on the effect of fire protection systems on fire spread and smoke movement. To obtain the variation of occurrence probability with time, a Markov chain is combined with the time-dependent event tree for stochastic analysis of the occurrence probability of fire scenarios. To obtain the consequences of every fire scenario, several uncertainties are considered in the risk analysis process. When calculating the onset time to untenable conditions, a range of fires is designed based on different fire growth rates, after which the uncertainty of the onset time to untenable conditions can be characterized by a probability distribution. When calculating occupant evacuation time, occupant pre-movement time is treated as a probability distribution. The consequences of a fire scenario can then be evaluated from the probability distributions of evacuation time and onset time to untenable conditions. Fire risk to life safety is then evaluated from the occurrence probability and consequences of every fire scenario. To express the risk assessment method in detail, a commercial building is presented as a case study, and the assessment result is compared with fire statistics.
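The consequence calculation for a single scenario, comparing uncertain evacuation time against the uncertain onset time to untenable conditions, can be sketched by Monte Carlo. All distributions and parameters below are assumptions for illustration, not the article's calibrated inputs:

```python
import random

random.seed(1)

def p_untenable_before_escape(n=100_000):
    """Estimate P(evacuation time > onset time to untenable conditions)
    for one fire scenario, with assumed input distributions."""
    failures = 0
    for _ in range(n):
        premovement = random.lognormvariate(4.0, 0.5)  # s, occupant pre-movement
        movement = random.uniform(60.0, 180.0)         # s, travel to an exit
        onset = random.normalvariate(400.0, 80.0)      # s, untenable conditions
        if premovement + movement > onset:
            failures += 1
    return failures / n

consequence = p_untenable_before_escape()
scenario_probability = 0.02  # assumed output of the time-dependent event tree

print(f"P(evac > onset) = {consequence:.3f}; "
      f"risk contribution = {scenario_probability * consequence:.5f}")
```

Summing such probability-weighted consequences over every scenario in the event tree gives the building's total fire risk to life safety.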
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oldenburg, Curtis M.; Budnitz, Robert J.
If carbon dioxide capture and storage (CCS) is to be effective in mitigating climate change, it will need to be carried out on a very large scale. This will involve many thousands of miles of dedicated high-pressure pipelines in order to transport many millions of tonnes of CO2 annually, with the CO2 delivered to many thousands of wells that will inject the CO2 underground. The new CCS infrastructure could rival in size the current U.S. upstream natural gas pipeline and well infrastructure. This new infrastructure entails hazards for life, health, animals, the environment, and natural resources. Pipelines are known to rupture due to corrosion, from external forces such as impacts by vehicles or digging equipment, from defects in construction, or from the failure of valves and seals. Similarly, wells are vulnerable to catastrophic failure due to corrosion, cement degradation, or operational mistakes. While most accidents involving pipelines and wells will be minor, there is the inevitable possibility of accidents with very high consequences, especially to public health. The most important consequence of concern is CO2 release to the environment in concentrations sufficient to cause death by asphyxiation to nearby populations. Such accidents are thought to be very unlikely, but of course they cannot be excluded, even if major engineering effort is devoted (as it will be) to keeping their probability low and their consequences minimized. This project has developed a methodology for analyzing the risks of these rare but high-consequence accidents, using a step-by-step probabilistic methodology. A key difference between risks for pipelines and wells is that the former are spatially distributed along the pipe whereas the latter are confined to the vicinity of the well.
Otherwise, the methodology we develop for risk assessment of pipeline and well failures is similar, and it provides an analysis both of the annual probabilities of accident sequences of concern and of their consequences; crucially, the methodology provides insights into what measures might be taken to mitigate those accident sequences identified as of concern. Mitigating strategies could address reducing the likelihood of an accident sequence of concern, reducing the consequences, or some combination. The methodology elucidates both local and integrated risks along the pipeline or at the well, providing information useful to decision makers at the local (e.g., property owners and town councils), regional (e.g., county and state representatives), and national (federal regulators and corporate proponents) levels.
Hotspot Identification for Shanghai Expressways Using the Quantitative Risk Assessment Method
Chen, Can; Li, Tienan; Sun, Jian; Chen, Feng
2016-01-01
Hotspot identification (HSID) is the first and key step of the expressway safety management process. This study presents a new HSID method using the quantitative risk assessment (QRA) technique. Crashes likely to occur at a specific site are treated as the risk. The aggregate crash occurrence probability over all exposed vehicles is estimated using the empirical Bayes method. As for the consequences of crashes, crashes may not only cause direct losses (e.g., occupant injuries and property damage) but also result in indirect losses. The indirect losses are expressed by the extra delays calculated using the deterministic queuing diagram method. The direct and indirect losses are uniformly monetized and treated as the consequences of this risk. The potential cost of crashes, used as the criterion to rank high-risk sites, can be explicitly expressed as the sum, over all passing vehicles, of crash probability multiplied by the corresponding crash consequences. A case study on the urban expressways of Shanghai is presented. The results show that the new QRA method for HSID enables the identification of a set of high-risk sites that truly reveal the potential crash costs to society. PMID:28036009
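The ranking criterion, expected crash cost rather than crash frequency alone, can be sketched as below. The site values are hypothetical, not from the Shanghai case study:

```python
# Hypothetical sites: empirical-Bayes expected annual crashes, direct loss per
# crash (injuries, property damage), and monetized delay (indirect loss) per crash.
sites = [
    {"id": "S1", "eb_crashes": 12.0, "direct": 40_000, "delay_cost": 25_000},
    {"id": "S2", "eb_crashes": 20.0, "direct": 15_000, "delay_cost": 5_000},
    {"id": "S3", "eb_crashes": 6.0,  "direct": 90_000, "delay_cost": 60_000},
]

def expected_cost(site):
    # Risk = expected crash frequency x monetized consequences (direct + indirect).
    return site["eb_crashes"] * (site["direct"] + site["delay_cost"])

ranked = sorted(sites, key=expected_cost, reverse=True)
print([(s["id"], expected_cost(s)) for s in ranked])
```

Note that S2 has the most crashes yet ranks last once consequences are monetized, which is exactly how a QRA-based hotspot list can differ from a frequency-based one.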
The Probabilities of Unique Events
Khemlani, Sangeet S.; Lotstein, Max; Johnson-Laird, Phil
2012-01-01
Many theorists argue that the probabilities of unique events, even real possibilities such as President Obama's re-election, are meaningless. As a consequence, psychologists have seldom investigated them. We propose a new theory (implemented in a computer program) in which such estimates depend on an intuitive non-numerical system capable only of simple procedures, and a deliberative system that maps intuitions into numbers. The theory predicts that estimates of the probabilities of conjunctions should often tend to split the difference between the probabilities of the two conjuncts. We report two experiments showing that individuals commit such violations of the probability calculus, and corroborating other predictions of the theory, e.g., individuals err in the same way even when they make non-numerical verbal estimates, such as that an event is highly improbable. PMID:23056224
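The predicted violation is easy to state numerically: if intuition splits the difference between the conjuncts, the estimate exceeds the bound the probability calculus imposes. The probabilities below are illustrative:

```python
p_a = 0.8   # P(A), e.g. "the incumbent is re-elected"
p_b = 0.3   # P(B), e.g. "the incumbent carries a particular swing state"

split = (p_a + p_b) / 2   # predicted intuitive estimate of P(A and B)
bound = min(p_a, p_b)     # the calculus requires P(A and B) <= min(P(A), P(B))

print(f"split-the-difference estimate {split:.2f} exceeds bound {bound:.2f}: "
      f"{split > bound}")
```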
Statistical Surrogate Models for Estimating Probability of High-Consequence Climate Change
NASA Astrophysics Data System (ADS)
Field, R.; Constantine, P.; Boslough, M.
2011-12-01
We have posed the climate change problem in a framework similar to that used in safety engineering, by acknowledging that probabilistic risk assessments focused on low-probability, high-consequence climate events are perhaps more appropriate than studies focused simply on best estimates. To properly explore the tails of the distribution requires extensive sampling, which is not possible with existing coupled atmospheric models due to the high computational cost of each simulation. We have developed specialized statistical surrogate models (SSMs) that can be used to make predictions about the tails of the associated probability distributions. An SSM differs from a deterministic surrogate model in that it represents each climate variable of interest as a space/time random field, that is, a random variable for every fixed location in the atmosphere at all times. The SSM can be calibrated to available spatial and temporal data from existing climate databases, or to a collection of outputs from general circulation models. Because of its reduced size and complexity, the realization of a large number of independent model outputs from an SSM becomes computationally straightforward, so that quantifying the risk associated with low-probability, high-consequence climate events becomes feasible. A Bayesian framework was also developed to provide quantitative measures of confidence, via Bayesian credible intervals, to assess these risks. To illustrate the use of the SSM, we considered two collections of NCAR CCSM 3.0 output data. The first collection corresponds to average December surface temperature for years 1990-1999, based on a collection of 8 different model runs obtained from the Program for Climate Model Diagnosis and Intercomparison (PCMDI). We calibrated the surrogate model to the available model data and made various point predictions. We also analyzed average precipitation rate in June, July, and August over a 54-year period assuming a cyclic Y2K ocean model.
We applied the calibrated surrogate model to study the probability that the precipitation rate falls below certain thresholds and utilized the Bayesian approach to quantify our confidence in these predictions. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under Contract DE-AC04-94AL85000.
Evaluating Determinants of Environmental Risk Perception for Risk Management in Contaminated Sites
Janmaimool, Piyapong; Watanabe, Tsunemi
2014-01-01
Understanding the differences in the risk judgments of residents of industrial communities potentially provides insights into how to develop appropriate risk communication strategies. This study aimed to explore citizens’ fundamental understanding of risk-related judgments and to identify the factors contributing to perceived risks. An exploratory model was created to investigate the public’s risk judgments. In this model, the relationship between laypeople’s perceived risks and the factors related to the physical nature of risks (such as perceived probability of environmental contamination, probability of receiving impacts, and severity of catastrophic consequences) were examined by means of multiple regression analysis. Psychological factors, such as the ability to control the risks, concerns, experiences, and perceived benefits of industrial development were also included in the analysis. The Maptaphut industrial area in Rayong Province, Thailand was selected as a case study. A survey of 181 residents of communities experiencing different levels of hazardous gas contamination revealed rational risk judgments by inhabitants of high-risk and moderate-risk communities, based on their perceived probability of contamination, probability of receiving impacts, and perceived catastrophic consequences. However, risks assessed by people in low-risk communities could not be rationally explained and were influenced by their collective experiences. PMID:24937530
Impact of high-risk conjunctions on Active Debris Removal target selection
NASA Astrophysics Data System (ADS)
Lidtke, Aleksander A.; Lewis, Hugh G.; Armellin, Roberto
2015-10-01
Space debris simulations show that if current space launches continue unchanged, spacecraft operations might become difficult in the congested space environment. It has been suggested that Active Debris Removal (ADR) might be necessary in order to prevent such a situation. Selection of objects to be targeted by ADR is considered important because removal of non-relevant objects will unnecessarily increase the cost of ADR. One of the factors to be used in this ADR target selection is the collision probability accumulated by every object. This paper shows the impact of high-probability conjunctions on the collision probability accumulated by individual objects as well as the probability of any collision occurring in orbit. Such conjunctions cannot be predicted far in advance and, consequently, not all the objects that will be involved in such dangerous conjunctions can be removed through ADR. Therefore, a debris remediation method that would address such events at short notice, and thus help prevent likely collisions, is suggested.
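Accumulating per-conjunction collision probabilities into a per-object total, and into the probability of any collision occurring in orbit, follows the standard complement rule. The conjunction lists below are invented for illustration:

```python
import math

# Hypothetical per-conjunction collision probabilities for three debris objects.
conjunctions = {
    "obj_A": [1e-4, 5e-5, 2e-4],
    "obj_B": [5e-3],          # a single rare, high-probability conjunction
    "obj_C": [1e-5] * 20,     # many routine, low-probability conjunctions
}

def accumulated(ps):
    """P(at least one collision) = 1 - product of per-conjunction survival probs."""
    return 1.0 - math.prod(1.0 - p for p in ps)

per_object = {name: accumulated(ps) for name, ps in conjunctions.items()}
p_any = accumulated([p for ps in conjunctions.values() for p in ps])

ranking = sorted(per_object, key=per_object.get, reverse=True)
print(ranking, f"P(any collision) = {p_any:.4f}")
```

A single high-probability conjunction (obj_B) dominates the ranking, which is the paper's point: such events cannot be predicted far in advance, so ADR target lists built from accumulated probability cannot address them all.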
ERIC Educational Resources Information Center
Belfiore, Phillip J.; Basile, Sarah Pulley; Lee, David L.
2008-01-01
One of the most problematic behaviors in children with developmental disabilities is noncompliance. Although behavioral research has provided strategies to impact noncompliance, oftentimes the methodologies are consequent techniques, which may not be conducive to implementation by the classroom teacher. In this teacher-designed and implemented…
10 CFR 26.119 - Determining “shy” bladder.
Code of Federal Regulations, 2012 CFR
2012-01-01
... donor was required to take a drug test, but was unable to provide a sufficient quantity of urine to complete the test; (2) The potential consequences of refusing to take the required drug test; and (3) The... condition has, or with a high degree of probability could have, precluded the donor from providing a...
10 CFR 26.119 - Determining “shy” bladder.
Code of Federal Regulations, 2013 CFR
2013-01-01
... donor was required to take a drug test, but was unable to provide a sufficient quantity of urine to complete the test; (2) The potential consequences of refusing to take the required drug test; and (3) The... condition has, or with a high degree of probability could have, precluded the donor from providing a...
10 CFR 26.119 - Determining “shy” bladder.
Code of Federal Regulations, 2014 CFR
2014-01-01
... donor was required to take a drug test, but was unable to provide a sufficient quantity of urine to complete the test; (2) The potential consequences of refusing to take the required drug test; and (3) The... condition has, or with a high degree of probability could have, precluded the donor from providing a...
Federal Register 2010, 2011, 2012, 2013, 2014
2013-01-29
... increase in the probability or consequences of an accident previously evaluated; or (2) create the possibility of a new or different kind of accident from any accident previously evaluated; or (3) involve a... proposed amendment involve a significant increase in the probability or consequences of an accident...
Nanoengineering of strong field processes in solids
NASA Astrophysics Data System (ADS)
Almalki, S.; Parks, A. M.; Brabec, T.; McDonald, C. R.
2018-04-01
We present a theoretical investigation of the effect of quantum confinement on high harmonic generation in semiconductor materials by systematically varying the confinement width along one or two directions transverse to the laser polarization. Our analysis shows a growth in high harmonic efficiency concurrent with a reduction of ionization. This decrease in ionization is a consequence of the increased band gap resulting from the confinement. The increase in harmonic efficiency results from a restriction of wave packet spreading, leading to greater recollision probability. Consequently, nanoengineering of one- and two-dimensional nanosystems may prove to be a viable means to increase harmonic yield and photon energy in semiconductor materials driven by intense laser fields.
Nuclear risk analysis of the Ulysses mission
NASA Astrophysics Data System (ADS)
Bartram, Bart W.; Vaughan, Frank R.; Englehart, Richard W., Dr.
1991-01-01
The use of a radioisotope thermoelectric generator fueled with plutonium-238 dioxide on the Space Shuttle-launched Ulysses mission implies some level of risk due to potential accidents. This paper describes the method used to quantify risks in the Ulysses mission Final Safety Analysis Report prepared for the U.S. Department of Energy. The analysis described herein takes as its starting point source term probability distributions supplied by the General Electric Company. A Monte Carlo technique is used to develop probability distributions of radiological consequences for a range of accident scenarios throughout the mission. Factors affecting radiological consequences are identified, the probability distribution of the effect of each factor determined, and the functional relationship among all the factors established. The probability distributions of all the factor effects are then combined using a Monte Carlo technique. The results of the analysis are presented in terms of complementary cumulative distribution functions (CCDF) by mission sub-phase, phase, and the overall mission. The CCDFs show the total probability that consequences (calculated health effects) would be equal to or greater than a given value.
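The Monte Carlo combination and CCDF construction described above can be sketched generically. The factor distributions below are invented placeholders, not the FSAR's actual source terms:

```python
import numpy as np

rng = np.random.default_rng(42)

# Assumed multiplicative factors affecting radiological consequences, each
# with its own probability distribution (placeholders, not FSAR data).
n = 100_000
source_term = rng.lognormal(mean=0.0, sigma=1.0, size=n)      # released activity
dispersion = rng.lognormal(mean=-1.0, sigma=0.5, size=n)      # atmospheric dilution
dose_to_effect = rng.lognormal(mean=-2.0, sigma=0.8, size=n)  # exposure-to-effect

# Assumed functional relationship among the factors: multiplicative.
consequences = source_term * dispersion * dose_to_effect

def ccdf(samples, level):
    """Complementary CDF: probability that consequences are >= level."""
    return float(np.mean(samples >= level))

for level in (0.01, 0.1, 1.0):
    print(f"P(C >= {level}) = {ccdf(consequences, level):.4f}")
```

Evaluating the CCDF at several consequence levels gives exactly the "probability that consequences would be equal to or greater than a given value" reported per mission phase.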
Dalthorp, Daniel; Huso, Manuela
2015-12-02
Confirming the accuracy of predicted take and providing evidence that permitted take levels have not been exceeded can be challenging because carcasses may be detected with probability much less than 1, and often no carcasses are observed. When detection probability is high, finding 0 carcasses can be interpreted as evidence that none (or few) were actually killed. As the probability of observing an individual decreases, the likelihood of missing carcasses increases, making it unclear how to interpret having observed 0 (or few) carcasses. In a practical sense, the consequences of incorrect inference can be significant: overestimating take could result in costly and unjustified mitigation, whereas underestimating could result in unanticipated declines in species populations already at risk.
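The core difficulty, interpreting a count of zero, follows directly from the detection model. As a sketch of the standard binomial argument, if each carcass is found independently with detection probability p:

```python
def p_zero_observed(n_killed, p_detect):
    """P(observe 0 carcasses | n_killed actually killed), assuming each
    carcass is detected independently with probability p_detect."""
    return (1.0 - p_detect) ** n_killed

# High detection probability: observing zero is strong evidence of low take.
high = [round(p_zero_observed(n, 0.9), 5) for n in (0, 5, 10)]
# Low detection probability: observing zero is weak evidence.
low = [round(p_zero_observed(n, 0.2), 5) for n in (0, 5, 10)]

print("p_detect=0.9:", high)
print("p_detect=0.2:", low)
```

With p_detect = 0.2, ten carcasses still leave roughly an 11% chance of observing none, so a zero count cannot by itself rule out substantial take.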
Nuclear risk analysis of the Ulysses mission
NASA Astrophysics Data System (ADS)
Bartram, Bart W.; Vaughan, Frank R.; Englehart, Richard W.
An account is given of the method used to quantify the risks associated with the use of a radioisotope thermoelectric generator fueled by Pu-238 dioxide aboard the Space Shuttle-launched Ulysses mission. After using a Monte Carlo technique to develop probability distributions for the radiological consequences of a range of accident scenarios throughout the mission, factors affecting those consequences are identified in conjunction with their probability distributions. The functional relationship among all the factors is then established, and probability distributions for all factor effects are combined by means of a Monte Carlo technique.
NASA Astrophysics Data System (ADS)
Copping, A. E.; Blake, K.; Zdanski, L.
2011-12-01
As marine and hydrokinetic (MHK) energy development projects progress towards early deployments in the U.S., the process of determining the risks to aquatic animals, habitats, and ecosystem processes from these engineered systems continues to be a significant barrier to efficient siting and permitting. Understanding the risk of MHK installations requires that the two elements of risk - consequence and probability - be evaluated. However, standard risk assessment methodologies are not easily applied to MHK interactions with marine and riverine environment as there are few data that describe the interaction of stressors (MHK devices, anchors, foundations, mooring lines and power cables) and receptors (aquatic animals, habitats and ecosystem processes). The number of possible combinations and permutations of stressors and receptors in MHK systems is large: there are many different technologies designed to harvest energy from the tides, waves and flowing rivers; each device is planned for a specific waterbody that supports an endemic ecosystem of animals and habitats, tied together by specific physical and chemical processes. With few appropriate analogue industries in the oceans and rivers, little information on the effects of these technologies on the living world is available. Similarly, without robust data sets of interactions, mathematical probability models are difficult to apply. Pacific Northwest National Laboratory scientists are working with MHK developers, researchers, engineers, and regulators to rank the consequences of planned MHK projects on living systems, and exploring alternative methodologies to estimate probabilities of these encounters. This paper will present the results of ERES, the Environmental Risk Evaluation System, which has been used to rank consequences for major animal groups and habitats for five MHK projects that are in advanced stages of development and/or early commercial deployment. 
Probability analyses have been performed for high priority stressor/receptor interactions where data are adaptable from other industries. In addition, a methodology for evaluating the probability of encounter, and therefore risk, to an endangered marine mammal from tidal turbine blades will be presented.
Risk Due to Radiological Terror Attacks With Natural Radionuclides
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steinhäusler, Friedrich; Zaitseva, Lyudmila; Rydell, Stan
2008-08-01
The naturally occurring radionuclides radium (Ra-226) and polonium (Po-210) have the potential to be used for criminal acts. Analysis of international incident data contained in the Database on Nuclear Smuggling, Theft and Orphan Radiation Sources (CSTO), operated at the University of Salzburg, shows that several acts of murder and terrorism with natural radionuclides have already been carried out in Europe and Russia. Five different modes of attack (T) are possible: (1) Covert irradiation of an individual in order to deliver a high individual dose; (2) Covert irradiation of a group of persons delivering a large collective dose; (3) Contamination of food or drink; (4) Generation of radioactive aerosols or solutions; (5) Combination of Ra-226 with conventional explosives (Dirty Bomb). This paper assesses the risk (R) of such criminal acts in terms of: (a) Probability of terrorist motivation deploying a certain attack mode T; (b) Probability of success by the terrorists for the selected attack mode T; (c) Primary damage consequence (C) to the attacked target (activity, dose); (d) Secondary damage consequence (C') to the attacked target (psychological and socio-economic effects); (e) Probability that the consequences (C, C') cannot be brought under control, resulting in a failure to manage successfully the emergency situation due to logistical and/or technical deficits in implementing adequate countermeasures. Extensive computer modelling is used to determine the potential impact of such a criminal attack on directly affected victims and on the environment.
Risk analysis for dry snow slab avalanche release by skier triggering
NASA Astrophysics Data System (ADS)
McClung, David
2013-04-01
Risk analysis is of primary importance for skier triggering of avalanches, since human triggering is responsible for about 90% of deaths from slab avalanches in Europe and North America. Two key measurable quantities about dry slab avalanche release prior to initiation are the depth to the weak layer (D) and the slope angle (ψ). Both are important in risk analysis. As the slope angle increases, the probability of avalanche release increases dramatically. As the slab depth increases, the consequences increase if an avalanche releases. Among the simplest risk definitions is (Vick, 2002): Risk = (Probability of failure) x (Consequences of failure). Here, these two components of risk are the probability or chance of avalanche release and the consequences given avalanche release. In this paper, for the first time, skier-triggered avalanches were analyzed from probability theory and its relation to risk for both D and ψ. The data consisted of two quantities, (ψ, D), taken from avalanche fracture line profiles after an avalanche has taken place. Two data sets from accidentally skier-triggered avalanches were considered: (1) 718 values of ψ and (2) a set of 1242 values of D which represent average values along the fracture line. The values of D were both estimated (about 2/3) and measured (about 1/3) by ski guides from Canadian Mountain Holidays (CMH). I also analyzed 1231 accidentally skier-triggered avalanches reported by CMH ski guides for avalanche size (representing destructive potential) on the Canadian scale. The size analysis provided a second analysis of consequences to verify the one based on D. The results showed that there is an intermediate range of both D and ψ with highest risk. For D, the risk (product of consequences and probability of occurrence) is highest for D in the approximate range 0.6 m - 1.0 m. The consequences are low for lower values of D and the chance of release is low for higher values of D. Thus, the highest product is in the intermediate range.
For slope angles, the risk analysis showed there are two ranges, approximately ψ ≤ 32° and ψ ≥ 46°, for which risk is lowest. In this case, both the range of D and the consequences vary by about a factor of two, so the probability of release dominates the risk analysis to yield low risk at the tails of the distribution of ψ, with highest risk in the middle (33° - 45°) of the expected range (25° - 55°).
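The multiplicative risk definition the paper applies can be sketched numerically. The probability and consequence curves below are hypothetical stand-ins chosen only to reproduce the qualitative shape of the argument, not McClung's fitted distributions.

```python
import math

# Hypothetical stand-in curves, NOT McClung's fitted distributions: the
# chance of skier-triggered release is assumed to fall with slab depth D,
# while the consequence of a release grows with D.
def release_probability(depth_m: float) -> float:
    return math.exp(-2.0 * depth_m)

def consequence(depth_m: float) -> float:
    return depth_m ** 2

def risk(depth_m: float) -> float:
    # Risk = (Probability of failure) x (Consequences of failure)  [Vick, 2002]
    return release_probability(depth_m) * consequence(depth_m)

# The product peaks at an intermediate depth, mirroring the paper's
# finding of a highest-risk band at intermediate D.
depths = [0.2 * k for k in range(1, 16)]  # 0.2 m .. 3.0 m
peak_depth = max(depths, key=risk)
```

With these toy curves the risk peaks between the low-consequence shallow slabs and the hard-to-trigger deep slabs, which is the qualitative mechanism the abstract describes.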
Kinetic aspects of chain growth in Fischer-Tropsch synthesis.
Filot, Ivo A W; Zijlstra, Bart; Broos, Robin J P; Chen, Wei; Pestman, Robert; Hensen, Emiel J M
2017-04-28
Microkinetics simulations are used to investigate the elementary reaction steps that control chain growth in the Fischer-Tropsch reaction. Chain growth in the FT reaction on stepped Ru surfaces proceeds via coupling of CH and CR surface intermediates. Essential to the growth mechanism are C-H dehydrogenation and C hydrogenation steps, whose kinetic consequences have been examined by formulating two novel kinetic concepts, the degree of chain-growth probability control and the thermodynamic degree of chain-growth probability control. For Ru the CO conversion rate is controlled by the removal of O atoms from the catalytic surface. The temperature of maximum CO conversion rate is higher than the temperature to obtain maximum chain-growth probability. Both maxima are determined by Sabatier behavior, but the steps that control chain-growth probability are different from those that control the overall rate. Below the optimum for obtaining long hydrocarbon chains, the reaction is limited by the high total surface coverage: in the absence of sufficient vacancies the CHCHR → CCHR + H reaction is slowed down. Beyond the optimum in chain-growth probability, CHCR + H → CHCHR and OH + H → H2O limit the chain-growth process. The thermodynamic degree of chain-growth probability control emphasizes the critical role of the H and free-site coverage and shows that at high temperature, chain depolymerization contributes to the decreased chain-growth probability. That is to say, during the FT reaction chain growth is much faster than chain depolymerization, which ensures high chain-growth probability. The chain-growth rate is also fast compared to chain-growth termination and the steps that control the overall CO conversion rate, which are O removal steps for Ru.
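The chain-growth probability this abstract optimizes is conventionally defined through the Anderson-Schulz-Flory relations. The sketch below states those textbook relations only, not the paper's microkinetic rate expressions; the rate values are placeholders.

```python
def chain_growth_probability(r_propagation: float, r_termination: float) -> float:
    """Anderson-Schulz-Flory chain-growth probability: the chance that a
    surface chain grows by one carbon rather than terminating."""
    return r_propagation / (r_propagation + r_termination)

def asf_mass_fraction(n: int, alpha: float) -> float:
    """Mass fraction of product chains with n carbons for growth probability alpha."""
    return n * (1.0 - alpha) ** 2 * alpha ** (n - 1)

# Fast growth relative to termination (and depolymerization) gives alpha
# near 1 and long chains, as the abstract notes for Ru below the optimum.
alpha = chain_growth_probability(9.0, 1.0)  # placeholder rates
```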
Yazdi, Mohammad; Korhan, Orhan; Daneshvar, Sahand
2018-05-09
This study aimed at establishing fault tree analysis (FTA) using expert opinion to compute the probability of an event. To find the probability of the top event (TE), the probabilities of all basic events (BEs) must be available when the FTA is drawn. When failure data are unavailable, expert judgment can serve as an alternative. The fuzzy analytical hierarchy process is used as a standard technique to give a specific weight to each expert, and fuzzy set theory is engaged for aggregating expert opinions. In this way, the probabilities of the BEs are computed and, consequently, the probability of the TE is obtained using Boolean algebra. Additionally, to reduce the probability of the TE in terms of three parameters (safety consequences, cost and benefit), the importance measurement technique and a modified TOPSIS were employed. The effectiveness of the proposed approach is demonstrated with a real-life case study.
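The Boolean step, combining basic-event probabilities up to the top event, can be sketched as follows. Independence of basic events is assumed, the gate structure and numbers are hypothetical, and the paper's fuzzy expert-weighting stage is omitted.

```python
def and_gate(probs):
    """AND gate: all basic events must occur (independence assumed)."""
    out = 1.0
    for p in probs:
        out *= p
    return out

def or_gate(probs):
    """OR gate: at least one basic event occurs (independence assumed)."""
    out = 1.0
    for p in probs:
        out *= 1.0 - p
    return 1.0 - out

# Hypothetical three-BE tree: TE = (BE1 AND BE2) OR BE3
p_te = or_gate([and_gate([0.1, 0.2]), 0.05])
```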
Causal illusions in children when the outcome is frequent
2017-01-01
Causal illusions occur when people perceive a causal relation between two events that are actually unrelated. One factor that has been shown to promote these mistaken beliefs is the outcome probability. Thus, people tend to overestimate the strength of a causal relation when the potential consequence (i.e. the outcome) occurs with a high probability (outcome-density bias). Given that children and adults differ in several important features involved in causal judgment, including prior knowledge and basic cognitive skills, developmental studies can be considered an outstanding approach to detect and further explore the psychological processes and mechanisms underlying this bias. However, the outcome density bias has been mainly explored in adulthood, and no previous evidence for this bias has been reported in children. Thus, the purpose of this study was to extend outcome-density bias research to childhood. In two experiments, children between 6 and 8 years old were exposed to two similar setups, both showing a non-contingent relation between the potential cause and the outcome. These two scenarios differed only in the probability of the outcome, which could either be high or low. Children judged the relation between the two events to be stronger in the high probability of the outcome setting, revealing that, like adults, they develop causal illusions when the outcome is frequent. PMID:28898294
The Sznajd model with limited persuasion: competition between high-reputation and hesitant agents
NASA Astrophysics Data System (ADS)
Crokidakis, Nuno; Murilo Castro de Oliveira, Paulo
2011-11-01
In this work we study a modified version of the two-dimensional Sznajd sociophysics model. In particular, we consider the effects of agents' reputations in the persuasion rules. In other words, a high-reputation group with a common opinion may convince its neighbors with probability p, which induces an increase of the group's reputation. On the other hand, there is always a probability q = 1 - p of the neighbors keeping their opinions, which induces a decrease of the group's reputation. These rules describe a competition between groups with high-reputation and hesitant agents, which makes the full-consensus states (with all spins pointing in one direction) more difficult to reach. As consequences, the usual phase transition does not occur for p < pc ~ 0.69 and the system presents realistic democracy-like situations, where the majority of spins are aligned in a certain direction, for a wide range of parameters.
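Probabilistic persuasion of the kind the abstract describes can be illustrated with a drastically simplified toy. The code below is a one-dimensional Sznajd-like chain without the authors' reputation dynamics or two-dimensional lattice; it only shows the mechanism of a same-opinion pair convincing neighbours with probability p.

```python
import random

def sznajd_toy(n=100, p=0.8, steps=50000, seed=7):
    """Toy 1D Sznajd-like dynamics: a same-opinion pair convinces its two
    outer neighbours with probability p (otherwise nothing happens).
    An illustration only, NOT the authors' 2D model with reputation."""
    rng = random.Random(seed)
    spins = [rng.choice([-1, 1]) for _ in range(n)]
    for _ in range(steps):
        i = rng.randrange(n)
        j = (i + 1) % n
        if spins[i] == spins[j] and rng.random() < p:
            spins[(i - 1) % n] = spins[i]
            spins[(j + 1) % n] = spins[i]
    return abs(sum(spins)) / n  # |magnetization|: 1 means full consensus

m = sznajd_toy()
```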
The National Response Framework: A Cross-Case Analysis
2014-06-13
media representatives wanting access to top officials. People with disabilities reported a lack of live captioning and sign language interpreters... use due to an inefficient check-in process. Public information shortfalls included providing inadequate information for people with disabilities, not... numbers, suggesting that the commonly used term "low-probability, high-consequence events" to describe major disasters is misleading. U.S. shores, forests
NASA Astrophysics Data System (ADS)
Grippo, Mark A.; Hlohowskyj, Ihor; Fox, Laura; Herman, Brook; Pothoff, Johanna; Yoe, Charles; Hayse, John
2017-01-01
The U.S. Army Corps of Engineers is conducting the Great Lakes and Mississippi River Interbasin Study to identify the highest risk aquatic nuisance species currently established in either the Mississippi River Basin or the Great Lakes Basin and prevent their movement into a new basin. The Great Lakes and Mississippi River Interbasin Study focuses specifically on aquatic nuisance species movement through the Chicago Area Waterway System, a multi-use waterway connecting the two basins. In support of the Great Lakes and Mississippi River Interbasin Study, we conducted a qualitative risk assessment for 33 aquatic nuisance species over a 50-year period of analysis based on the probability of aquatic nuisance species establishing in a new basin and the environmental, economic, and sociopolitical consequences of their establishment. Probability of establishment and consequences of establishment were assigned qualitative ratings of high, medium, or low after considering the species' current location, mobility, habitat suitability, and impacts in previously invaded systems. The establishment and consequence ratings were then combined into an overall risk rating. Seven species were characterized as posing a medium risk and two species as posing a high risk to the Mississippi River Basin. Three species were characterized as posing a medium risk to the Great Lakes Basin, but no high-risk species were identified for this basin. Risk increased over time for some aquatic nuisance species based on the time frame in which these species were considered likely to establish in the new basin. Both species traits and the need to balance multiple uses of the Chicago Area Waterway System must be considered when identifying control measures to prevent aquatic nuisance species movement between the two basins.
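The qualitative combination step can be sketched as a simple lookup. The rule below (overall risk is the lesser of the two ratings) is a hypothetical stand-in, not the matrix actually used in the study.

```python
RATINGS = ("low", "medium", "high")

def overall_risk(p_establishment: str, consequence: str) -> str:
    """Combine two qualitative ratings into an overall risk rating.
    Hypothetical rule: overall risk is the lesser of the two ratings."""
    i = min(RATINGS.index(p_establishment), RATINGS.index(consequence))
    return RATINGS[i]
```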
Topics in inference and decision-making with partial knowledge
NASA Technical Reports Server (NTRS)
Safavian, S. Rasoul; Landgrebe, David
1990-01-01
Two essential elements needed in the process of inference and decision-making are prior probabilities and likelihood functions. When both of these components are known accurately and precisely, the Bayesian approach provides a consistent and coherent solution to the problems of inference and decision-making. In many situations, however, either one or both of the above components may not be known, or at least may not be known precisely. This problem of partial knowledge about prior probabilities and likelihood functions is addressed. There are at least two ways to cope with this lack of precise knowledge: robust methods, and interval-valued methods. First, ways of modeling imprecision and indeterminacies in prior probabilities and likelihood functions are examined; then how imprecision in the above components carries over to the posterior probabilities is examined. Finally, the problem of decision making with imprecise posterior probabilities and the consequences of such actions are addressed. Application areas where the above problems may occur are in statistical pattern recognition problems, for example, the problem of classification of high-dimensional multispectral remote sensing image data.
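The interval-valued route can be illustrated for a two-class problem: because the posterior is monotone in the prior, an interval-valued prior yields posterior bounds at the interval's endpoints. All numbers below are illustrative.

```python
def posterior(prior: float, lik_pos: float, lik_neg: float) -> float:
    """Two-class Bayes posterior for class 1, given the prior for class 1
    and the likelihoods of the observation under each class."""
    num = prior * lik_pos
    return num / (num + (1.0 - prior) * lik_neg)

def posterior_interval(prior_lo: float, prior_hi: float,
                       lik_pos: float, lik_neg: float):
    """Posterior bounds under an interval-valued prior: the posterior is
    monotone in the prior, so the extremes occur at the endpoints."""
    return (posterior(prior_lo, lik_pos, lik_neg),
            posterior(prior_hi, lik_pos, lik_neg))

lo, hi = posterior_interval(0.3, 0.7, 0.8, 0.2)  # illustrative numbers
```

Decision-making then has to cope with the whole interval [lo, hi] rather than a point posterior, which is the problem the paper addresses.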
Miller, G Y; Ming, J; Williams, I; Gorvett, R
2012-12-01
Foot and mouth disease (FMD) continues to be a disease of major concern for the United States Department of Agriculture (USDA) and livestock industries. Foot and mouth disease virus is a high-consequence pathogen for the United States (USA). Live animal trade is a major risk factor for introduction of FMD into a country. This research estimates the probability of FMD being introduced into the USA via the legal importation of livestock. This probability is calculated by considering the potential introduction of FMD from each country from which the USA imports live animals. The total probability of introduction into the USA of FMD from imported livestock is estimated to be 0.415% per year, which is equivalent to one introduction every 241 years. In addition, to provide a basis for evaluating the significance of risk management techniques and expenditures, the sensitivity of the above result to changes in various risk parameter assumptions is determined.
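The aggregation across source countries can be sketched as a complement-product, assuming independent per-country introduction probabilities. The per-country values below are made up, not the paper's estimates.

```python
def total_introduction_probability(per_country):
    """Annual probability of at least one introduction, assuming independent
    per-country introduction probabilities (values below are hypothetical)."""
    q = 1.0
    for p in per_country:
        q *= 1.0 - p
    return 1.0 - q

p_total = total_introduction_probability([0.002, 0.001, 0.0012])
mean_recurrence_years = 1.0 / p_total
# The abstract's 0.415% per year corresponds in the same way to
# 1 / 0.00415, i.e. roughly one introduction every 241 years.
```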
Robust approaches to quantification of margin and uncertainty for sparse data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hund, Lauren; Schroeder, Benjamin B.; Rumsey, Kelin
Characterizing the tails of probability distributions plays a key role in quantification of margins and uncertainties (QMU), where the goal is characterization of low probability, high consequence events based on continuous measures of performance. When data are collected using physical experimentation, probability distributions are typically fit using statistical methods based on the collected data, and these parametric distributional assumptions are often used to extrapolate about the extreme tail behavior of the underlying probability distribution. In this project, we characterize the risk associated with such tail extrapolation. Specifically, we conducted a scaling study to demonstrate the large magnitude of the risk; then, we developed new methods for communicating risk associated with tail extrapolation from unvalidated statistical models; lastly, we proposed a Bayesian data-integration framework to mitigate tail extrapolation risk through integrating additional information. We conclude that decision-making using QMU is a complex process that cannot be achieved using statistical analyses alone.
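The tail-extrapolation risk can be illustrated analytically: two parametric fits matched on mean and variance can still disagree by orders of magnitude far out in the tail. The normal-vs-Laplace pairing below is an illustrative choice, not one of the report's case studies.

```python
import math

def normal_tail(x: float, mu: float, sigma: float) -> float:
    """P(X > x) for a normal fit."""
    return 0.5 * math.erfc((x - mu) / (sigma * math.sqrt(2.0)))

def laplace_tail(x: float, mu: float, b: float) -> float:
    """P(X > x) for a Laplace fit (x > mu)."""
    return 0.5 * math.exp(-(x - mu) / b)

# Match the Laplace variance (2 b^2) to the normal variance (sigma^2).
mu, sigma = 0.0, 1.0
b = sigma / math.sqrt(2.0)
ratio = laplace_tail(5.0, mu, b) / normal_tail(5.0, mu, sigma)
# Two fits that agree closely in the bulk disagree by orders of magnitude
# five standard deviations out -- the extrapolation risk the report studies.
```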
Global situational awareness and early warning of high-consequence climate change.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Backus, George A.; Carr, Martin J.; Boslough, Mark Bruce Elrick
2009-08-01
Global monitoring systems that have high spatial and temporal resolution, with long observational baselines, are needed to provide situational awareness of the Earth's climate system. Continuous monitoring is required for early warning of high-consequence climate change and to help anticipate and minimize the threat. Global climate has changed abruptly in the past and will almost certainly do so again, even in the absence of anthropogenic interference. It is possible that the Earth's climate could change dramatically and suddenly within a few years. An unexpected loss of climate stability would be equivalent to the failure of an engineered system on a grand scale, and would affect billions of people by causing agricultural, economic, and environmental collapses that would cascade throughout the world. The probability of such an abrupt change happening in the near future may be small, but it is nonzero. Because the consequences would be catastrophic, we argue that the problem should be treated with science-informed engineering conservatism, which focuses on various ways a system can fail and emphasizes inspection and early detection. Such an approach will require high-fidelity continuous global monitoring, informed by scientific modeling.
Boys will be boys: are there gender differences in the effect of sexual abstinence on schooling?
Sabia, Joseph J; Rees, Daniel I
2011-03-01
A recent study by Sabia and Rees (2009) found that delaying first intercourse leads to a substantial increase in the probability that female students graduate high school. However, it is unclear whether the effect of abstinence extends to male students. Here we identify exogenous variation in the timing of first intercourse using a physical development index available for both females and males. Two-stage least squares estimates suggest that abstaining from sexual intercourse increases the probability that females graduate from high school, but has little effect on the educational attainment of males. This pattern of results is consistent with evidence from previous studies that males are less likely than females to suffer adverse psychological consequences from engaging in sexual intercourse at an early age. Copyright © 2010 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Fuchs, Christopher A.; Schack, Rüdiger
2013-10-01
In the quantum-Bayesian interpretation of quantum theory (or QBism), the Born rule cannot be interpreted as a rule for setting measurement-outcome probabilities from an objective quantum state. But if not, what is the role of the rule? In this paper, the argument is given that it should be seen as an empirical addition to Bayesian reasoning itself. Particularly, it is shown how to view the Born rule as a normative rule in addition to usual Dutch-book coherence. It is a rule that takes into account how one should assign probabilities to the consequences of various intended measurements on a physical system, but explicitly in terms of prior probabilities for and conditional probabilities consequent upon the imagined outcomes of a special counterfactual reference measurement. This interpretation is exemplified by representing quantum states in terms of probabilities for the outcomes of a fixed, fiducial symmetric informationally complete measurement. The extent to which the general form of the new normative rule implies the full state-space structure of quantum mechanics is explored.
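A concrete form of the normative rule discussed here is the QBist "urgleichung": for a d-dimensional system and a fiducial symmetric informationally complete (SIC) reference measurement with outcomes H_i, the probability assigned to outcome D_j of an intended measurement is

```latex
q(D_j) \;=\; \sum_{i=1}^{d^2} \left[(d+1)\,p(H_i) - \frac{1}{d}\right] p(D_j \mid H_i),
```

which modifies the classical law of total probability by the factor (d+1) and the subtractive term 1/d.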
DOE Office of Scientific and Technical Information (OSTI.GOV)
Not Available
1989-01-25
It is the purpose of the NRAD to provide an analysis of the range of potential consequences of accidents which have been identified that are associated with the launching and deployment of the Galileo mission spacecraft. The specific consequences analyzed are those associated with the possible release of radioactive material (fuel) of the Radioisotope Thermoelectric Generators (RTGs). They are in terms of radiation doses to people and areas of deposition of radioactive material. These consequence analyses can be used in several ways. One way is to identify the potential range of consequences which might have to be dealt with if there were to be an accident with a release of fuel, so as to assure that, given such an accident, the health and safety of the public will be reasonably protected. Another use of the information, in conjunction with accident and release probabilities, is to estimate the risks associated with the mission. That is, most space launches occur without incident. Given an accident, the most probable result relative to the RTGs is complete containment of the radioactive material. Only a small fraction of accidents might result in a release of fuel and subsequent radiological consequences. The combination of probability with consequence is risk, which can be compared to other human and societal risks to assure that no undue risks are implied by undertaking the mission. Book 2 contains eight appendices.
[Genetic effects in the liquidators of consequences of Chernobyl Nuclear Power Plant accident].
Liaginskaia, A M; Tukov, A R; Osipov, V A; Prokhorova, O N
2007-01-01
The purpose of the present research was the estimation of probable genetic consequences in the liquidators of the consequences of the Chernobyl accident of 1986-1987. The research was carried out on two groups of liquidators. The first group included liquidators registered in the branch register and currently working at enterprises of the nuclear industry. The second group included 902 liquidators of the consequences of the Chernobyl accident in 1986 permanently living in the Ryazan region and under permanent observation at the regional hospital. To estimate probable genetic effects, data were analyzed on the frequency and outcomes of pregnancies of the liquidators' wives and on the condition and diseases of newborns, including intrauterine development defects (IDD). The analysis was carried out depending on irradiation dose: up to 5 cGy; 5-10 cGy; and 10-25 cGy. The materials obtained show that in liquidators with an external irradiation dose of 10-25 cGy, deterministic effects were revealed: long-lasting sterility, persisting in some of them for up to 3 years. The set of data obtained, such as the dose-dependent increase in the frequency of spontaneous abortions and of congenital developmental defects in newborns, and the increase in the frequency of diseases of newborns and in the share of newborns with low weight, supports the conclusion that genetic effects were induced in the germ cells of liquidators of the consequences of the Chernobyl accident at external irradiation doses above 10 cGy. Taking into account the high biological efficiency of alpha radiation (K = 20) and of beta radiation (K = 2-4), the equivalent effective dose to the male gonads (testes) is 3-5 times higher than that estimated from external gamma radiation alone.
Causes and implications of the correlation between forest productivity and tree mortality rates
Stephenson, Nathan L.; van Mantgem, Philip J.; Bunn, Andrew G.; Bruner, Howard; Harmon, Mark E.; O'Connell, Kari B.; Urban, Dean L.; Franklin, Jerry F.
2011-01-01
For only one of these four mechanisms, competition, can high mortality rates be considered to be a relatively direct consequence of high NPP. The remaining mechanisms force us to adopt a different view of causality, in which tree growth rates and probability of mortality can vary with at least a degree of independence along productivity gradients. In many cases, rather than being a direct cause of high mortality rates, NPP may remain high in spite of high mortality rates. The independent influence of plant enemies and other factors helps explain why forest biomass can show little correlation, or even negative correlation, with forest NPP.
New paradoxes of risky decision making.
Birnbaum, Michael H
2008-04-01
During the last 25 years, prospect theory and its successor, cumulative prospect theory, replaced expected utility as the dominant descriptive theories of risky decision making. Although these models account for the original Allais paradoxes, 11 new paradoxes show where prospect theories lead to self-contradiction or systematic false predictions. The new findings are consistent with and, in several cases, were predicted in advance by simple "configural weight" models in which probability-consequence branches are weighted by a function that depends on branch probability and ranks of consequences on discrete branches. Although they have some similarities to later models called "rank-dependent utility," configural weight models do not satisfy coalescing, the assumption that branches leading to the same consequence can be combined by adding their probabilities. Nor do they satisfy cancellation, the "independence" assumption that branches common to both alternatives can be removed. The transfer of attention exchange model, with parameters estimated from previous data, correctly predicts results with all 11 new paradoxes. Apparently, people do not frame choices as prospects but, instead, as trees with branches.
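The coalescing violation at the heart of these models can be demonstrated with a toy branch-weighting evaluation. The weighting rule below is a deliberately simplified stand-in, not Birnbaum's fitted transfer-of-attention-exchange (TAX) model.

```python
def branch_value(branches, gamma=0.7):
    """Toy branch-weighting evaluation: each (probability, outcome) branch
    gets weight p**gamma, and weights are then normalized.  A simplified
    stand-in, NOT Birnbaum's fitted TAX model."""
    weights = [p ** gamma for p, _ in branches]
    total = sum(weights)
    return sum(w * x for w, (_, x) in zip(weights, branches)) / total

coalesced = branch_value([(0.85, 100.0), (0.15, 0.0)])
split = branch_value([(0.425, 100.0), (0.425, 100.0), (0.15, 0.0)])
# Splitting the 100-outcome branch changes the evaluation, so the model
# violates coalescing -- the property the abstract says these models reject.
```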
Motives for offending among violent and psychotic men.
Taylor, P J
1985-11-01
Two hundred and three male remanded prisoners were interviewed with respect to their current offence, mental state, and social and psychiatric histories. All but nine of the sub-group of 121 psychotic men showed active symptoms at the time of committing a criminal offence; 20% of the actively ill psychotics were directly driven to offend by their psychotic symptoms, and a further 26% probably so. If some of the indirect consequences of the psychosis were taken into account, 82% of their offences were probably attributable to the illness. Among the normal and neurotic men, none claimed psychotic motives for offending, but motives suggesting high emotional arousal such as panic or retaliation triggered the greatest violence. Within the psychotic group, those driven to offend by their delusions were most likely to have been seriously violent, and psychotic symptoms probably accounted directly for most of the very violent behaviour.
Estimating the probability for major gene Alzheimer disease
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farrer, L.A.; Cupples, L.A.
1994-02-01
Alzheimer disease (AD) is a neuropsychiatric illness caused by multiple etiologies. Prediction of whether AD is genetically based in a given family is problematic because of censoring bias among unaffected relatives as a consequence of the late onset of the disorder, diagnostic uncertainties, heterogeneity, and limited information in a single family. The authors have developed a method based on Bayesian probability to compute values for a continuous variable that ranks AD families as having a major gene form of AD (MGAD). In addition, they have compared the Bayesian method with a maximum-likelihood approach. These methods incorporate sex- and age-adjusted risk estimates and allow for phenocopies and familial clustering of age of onset. Agreement is high between the two approaches for ranking families as MGAD (Spearman rank [r] = .92). When either method is used, the numerical outcomes are sensitive to assumptions of the gene frequency and cumulative incidence of the disease in the population. Consequently, risk estimates should be used cautiously for counseling purposes; however, there are numerous valid applications of these procedures in genetic and epidemiological studies. 41 refs., 4 figs., 3 tabs.
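The Bayesian ranking idea can be sketched for a single family. All parameter values below are hypothetical placeholders, not the paper's sex- and age-adjusted estimates, and the real method also handles censoring and age of onset.

```python
from math import comb

def mgad_posterior(k_affected: int, n_relatives: int, prior_mg: float = 0.1,
                   risk_mg: float = 0.5, risk_sporadic: float = 0.05) -> float:
    """Sketch: posterior probability that a family carries a major gene,
    given k affected among n relatives, with binomial likelihoods under
    the major-gene and sporadic hypotheses.  Parameters are hypothetical."""
    def binom(risk: float) -> float:
        return (comb(n_relatives, k_affected)
                * risk ** k_affected * (1.0 - risk) ** (n_relatives - k_affected))
    num = prior_mg * binom(risk_mg)
    return num / (num + (1.0 - prior_mg) * binom(risk_sporadic))

# As the abstract cautions, the output is sensitive to the assumed gene
# frequency (prior_mg) and cumulative incidence (the risk parameters).
```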
NASA Astrophysics Data System (ADS)
Raimond, Emmanuel; Decker, Kurt; Guigueno, Yves; Klug, Joakim; Loeffler, Horst
2015-04-01
The Fukushima nuclear accident in Japan resulted from the combination of two correlated extreme external events (earthquake and tsunami). The consequences, in particular flooding, went beyond what was considered in the initial engineering design design of nuclear power plants (NPPs). Such situations can in theory be identified using probabilistic safety assessment (PSA) methodology. PSA results may then lead industry (system suppliers and utilities) or Safety Authorities to take appropriate decisions to reinforce the defence-in-depth of the NPP for low probability event but high amplitude consequences. In reality, the development of such PSA remains a challenging task. Definitions of the design basis of NPPs, for example, require data on events with occurrence probabilities not higher than 10-4 per year. Today, even lower probabilities, down to 10-8, are expected and typically used for probabilistic safety analyses (PSA) of NPPs and the examination of so-called design extension conditions. Modelling the combinations of natural or man-made hazards that can affect a NPP and affecting some meaningful probability of occurrence seems to be difficult. The European project ASAMPSAE (www.asampsa.eu) gathers more than 30 organizations (industry, research, safety control) from Europe, US and Japan and aims at identifying some meaningful practices to extend the scope and the quality of the existing probabilistic safety analysis developed for nuclear power plants. It offers a framework to discuss, at a technical level, how "extended PSA" can be developed efficiently and be used to verify if the robustness of Nuclear Power Plants (NPPs) in their environment is sufficient. The paper will present the objectives of this project, some first lessons and introduce which type of guidance is being developed. 
It will explain the need for expertise from geosciences to support nuclear safety assessment in the different areas (seismotectonic, hydrological, meteorological and biological hazards, …).
[Diagnosis and treatment of Wernicke-Korsakoff syndrome in patients with alcohol abuse].
Nilsson, Maria; Sonne, Charlotte
2013-04-01
Wernicke-Korsakoff syndrome is a condition with high morbidity and mortality and occurs as a consequence of thiamine deficiency. Clinical symptoms are often ambiguous, and post-mortem examinations show that the syndrome is underdiagnosed and probably undertreated. There is sparse clinical evidence concerning optimal dosage and duration of treatment. This article reviews the current literature and concludes that all patients with a history of alcohol abuse should be treated with high-dose IV thiamine for an extended period of time, although further research is needed.
Policy on synthetic biology: deliberation, probability, and the precautionary paradox.
Wareham, Christopher; Nardini, Cecilia
2015-02-01
Synthetic biology is a cutting-edge area of research that holds the promise of unprecedented health benefits. However, in tandem with these large prospective benefits, synthetic biology projects entail a risk of catastrophic consequences whose severity may exceed that of most ordinary human undertakings. This is due to the peculiar nature of synthetic biology as a 'threshold technology' which opens doors to opportunities and applications that are essentially unpredictable. Fears about these potentially unstoppable consequences have led to declarations from civil society groups calling for the use of a precautionary principle to regulate the field. Moreover, the principle is prevalent in law and international agreements. Despite widespread political recognition of a need for caution, the precautionary principle has been extensively criticized as a guide for regulatory policy. We examine a central objection to the principle: that its application entails crippling inaction and incoherence, since whatever action one takes there is always a chance that some highly improbable cataclysm will occur. In response to this difficulty, which we call the 'precautionary paradox,' we outline a deliberative means for arriving at a threshold of probability below which potential dangers can be disregarded. In addition, we describe a Bayesian mechanism with which to assign probabilities to harmful outcomes. We argue that these steps resolve the paradox. The rehabilitated precautionary principle can thus provide a viable policy option to confront the uncharted waters of synthetic biology research. © 2013 John Wiley & Sons Ltd.
Approved Methods and Algorithms for DoD Risk-Based Explosives Siting
2009-07-21
Parameter used in determining probability of hit (Phit) by debris. [Table 31, Table 32, Table 33, Eq. (157), Eq. (158)]
- CCa: Variable. “Actual...being in the glass hazard area”. [Eq. (60), Eq. (78)]
- Phit: Variable. “Probability of hit”. An array value indexed by consequence and mass bin...Eq. (156), Eq. (157)]
- Phit(f): Variable. “Probability of hit for fatality”. [Eq. (157), Eq. (158)]
- Phit(maji): Variable. “Probability of hit for major
DOE Office of Scientific and Technical Information (OSTI.GOV)
Larsen, A. C.; Goriely, S.; Bernstein, L. A.
2015-01-01
An enhanced probability for low-energy γ-emission (upbend, Eγ < 3 MeV) at high excitation energies has been observed for several light and medium-mass nuclei close to the valley of stability. The M1 scissors mode seen in deformed nuclei also increases the γ-decay probability for low-energy γ-rays (Eγ ≈ 2–3 MeV). These phenomena, if present in neutron-rich nuclei, have the potential to increase radiative neutron-capture rates relevant for the r-process. Furthermore, the experimental and theoretical status of the upbend is discussed, and preliminary calculations of (n,γ) reaction rates for neutron-rich, mid-mass nuclei including the scissors mode are shown.
Chiu, Yu-Han; Williams, Paige L; Gillman, Matthew W; Gaskins, Audrey J; Mínguez-Alarcón, Lidia; Souter, Irene; Toth, Thomas L; Ford, Jennifer B; Hauser, Russ; Chavarro, Jorge E
2018-01-01
Animal experiments suggest that ingestion of pesticide mixtures at environmentally relevant concentrations decreases the number of live-born offspring. Whether the same is true in humans is unknown. To examine the association of preconception intake of pesticide residues in fruits and vegetables (FVs) with outcomes of infertility treatment with assisted reproductive technologies (ART). This analysis included 325 women who completed a diet assessment and subsequently underwent 541 ART cycles in the Environment and Reproductive Health (EARTH) prospective cohort study (2007-2016) at a fertility center at a teaching hospital. We categorized FVs as having high or low pesticide residues using a validated method based on surveillance data from the US Department of Agriculture. Cluster-weighted generalized estimating equations were used to analyze associations of high- and low-pesticide residue FV intake with ART outcomes. Adjusted probabilities of clinical pregnancy and live birth per treatment cycle. In the 325 participants (mean [SD] age, 35.1 [4.0] y; body mass index, 24.1 [4.3]), mean (SD) intakes of high- and low-pesticide residue FVs were 1.7 (1.0) and 2.8 (1.6) servings/d, respectively. Greater intake of high-pesticide residue FVs was associated with a lower probability of clinical pregnancy and live birth. Compared with women in the lowest quartile of high-pesticide FV intake (<1.0 servings/d), women in the highest quartile (≥2.3 servings/d) had 18% (95% CI, 5%-30%) lower probability of clinical pregnancy and 26% (95% CI, 13%-37%) lower probability of live birth. Intake of low-pesticide residue FVs was not significantly related to ART outcomes. Higher consumption of high-pesticide residue FVs was associated with lower probabilities of pregnancy and live birth following infertility treatment with ART. These data suggest that dietary pesticide exposure within the range of typical human exposure may be associated with adverse reproductive consequences.
Probability of survival during accidental immersion in cold water.
Wissler, Eugene H
2003-01-01
Estimating the probability of survival during accidental immersion in cold water presents formidable challenges for both theoreticians and empiricists. A number of theoretical models have been developed assuming that death occurs when the central body temperature, computed using a mathematical model, falls to a certain level. This paper describes a different theoretical approach to estimating the probability of survival. The human thermal model developed by Wissler is used to compute the central temperature during immersion in cold water. Simultaneously, a survival probability function is computed by solving a differential equation that defines how the probability of survival decreases with increasing time. The survival equation assumes that the probability of occurrence of a fatal event increases as the victim's central temperature decreases. Generally accepted views of the medical consequences of hypothermia and published reports of various accidents provide information useful for defining a "fatality function" that increases exponentially with decreasing central temperature. The particular function suggested in this paper yields a relationship between immersion time for 10% probability of survival and water temperature that agrees very well with Molnar's empirical observations based on World War II data. The method presented in this paper circumvents a serious difficulty with most previous models--that one's ability to survive immersion in cold water is determined almost exclusively by the ability to maintain a high level of shivering metabolism.
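The structure of the approach can be caricatured numerically. In this hedged sketch, the cooling curve, the hazard constants, and the time step are invented stand-ins for Wissler's thermal model and fitted fatality function; only the mechanism follows the abstract: survival decays at a hazard that grows exponentially as central temperature falls.

```python
# Toy version of the survival-probability approach: Tc(t) from a toy
# exponential cooling model, and dS/dt = -h(Tc) * S with a hazard that
# rises exponentially as Tc drops. All constants are illustrative
# assumptions, not Wissler's fitted values. Time is in hours.
import math

def survival_time_to(p_target, t_water, tau=3.0, h0=0.001, k=0.6, dt=0.01):
    """Hours of immersion until survival probability falls to p_target."""
    t, s = 0.0, 1.0
    while s > p_target:
        tc = t_water + (37.0 - t_water) * math.exp(-t / tau)  # toy cooling curve
        hazard = h0 * math.exp(k * (37.0 - tc))  # "fatality function": grows as Tc falls
        s -= hazard * s * dt  # Euler step of dS/dt = -h(Tc) * S
        t += dt
    return t

# Colder water should shorten the time to 10% fatality probability (S = 0.9).
print(survival_time_to(0.9, t_water=5.0) < survival_time_to(0.9, t_water=15.0))
```

Repeating the calculation across water temperatures traces out a survival-time curve of the kind the paper compares with Molnar's data.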
Parental smoking and respiratory tract infections in children.
Peat, J K; Keena, V; Harakeh, Z; Marks, G
2001-09-01
The adverse health consequences of exposing children to tobacco smoke have been well documented. Re-calculation of the data available from cohort and cross-sectional studies worldwide shows that between 500 and 2500 excess hospitalisations and between 1000 and 5000 excess diagnoses of respiratory infection per 100 000 young children can be directly attributed to parental smoking. Results of published meta-analyses support these figures, which are probably under-estimated because of the effects of non-differential misclassification bias. These excess infections are a source of preventable morbidity and have a high cost to the community. They also have important long-term consequences because children who have respiratory infections in early life are at an increased risk of developing asthma in later childhood. More effective strategies that prevent smoking in young people before they become parents have the potential to lead to reductions in these high rates of unnecessary morbidity in the next generation of children.
Characterizing High School Students Who Play Drinking Games Using Latent Class Analysis
Borsari, Brian; Zamboanga, Byron L.; Correia, Christopher; Olthuis, Janine V.; Van Tyne, Kathryne; Zadworny, Zoe; Grossbard, Joel R.; Horton, Nicholas J.
2013-01-01
Heavy alcohol use and its associated negative consequences continue to be an important health issue among adolescents. Of particular concern are risky drinking practices such as playing drinking games. Although retrospective accounts indicate that drinking game participation is common among high school students, it has yet to be assessed in current high school students. Utilizing data from high school students who reported current drinking game participation (n = 178), we used latent class analysis to investigate the negative consequences resulting from gaming and examined underlying demographic and alcohol-related behavioral characteristics of students as a function of the resultant classes. Three classes of “gamers” emerged: (1) a “lower-risk” group who had a lower probability of endorsing negative consequences compared to the other groups, (2) a “higher-risk” group who reported that they experienced hangovers and difficulties limiting their drinking, got physically sick, and became rude, obnoxious, or insulting, and (3) a “sexual regret” group who reported that they experienced poor recall and unplanned sexual activity that they later regretted. Although the frequency of participating in drinking games did not differ between these three groups, results indicated that the “lower-risk” group consumed fewer drinks in a typical gaming session compared to the other two groups. The present findings suggest that drinking games are common among high school students, but that mere participation and frequency of play are not necessarily the best indicators of risk. Instead, examination of other constructs such as game-related alcohol consumption, consequences, or psychosocial variables such as impulsivity may be more useful. PMID:23778317
Jaeschke, Roman; Stevens, Scott M.; Goodacre, Steven; Wells, Philip S.; Stevenson, Matthew D.; Kearon, Clive; Schunemann, Holger J.; Crowther, Mark; Pauker, Stephen G.; Makdissi, Regina; Guyatt, Gordon H.
2012-01-01
Background: Objective testing for DVT is crucial because clinical assessment alone is unreliable and the consequences of misdiagnosis are serious. This guideline focuses on the identification of optimal strategies for the diagnosis of DVT in ambulatory adults. Methods: The methods of this guideline follow those described in Methodology for the Development of Antithrombotic Therapy and Prevention of Thrombosis Guidelines: Antithrombotic Therapy and Prevention of Thrombosis, 9th ed: American College of Chest Physicians Evidence-Based Clinical Practice Guidelines. Results: We suggest that clinical assessment of pretest probability of DVT, rather than performing the same tests in all patients, should guide the diagnostic process for a first lower extremity DVT (Grade 2B). In patients with a low pretest probability of first lower extremity DVT, we recommend initial testing with D-dimer or ultrasound (US) of the proximal veins over no diagnostic testing (Grade 1B), venography (Grade 1B), or whole-leg US (Grade 2B). In patients with moderate pretest probability, we recommend initial testing with a highly sensitive D-dimer, proximal compression US, or whole-leg US rather than no testing (Grade 1B) or venography (Grade 1B). In patients with a high pretest probability, we recommend proximal compression or whole-leg US over no testing (Grade 1B) or venography (Grade 1B). Conclusions: Favored strategies for diagnosis of first DVT combine use of pretest probability assessment, D-dimer, and US. There is lower-quality evidence available to guide diagnosis of recurrent DVT, upper extremity DVT, and DVT during pregnancy. PMID:22315267
Brust, Vera; Bastian, Hans-Valentin; Bastian, Anita; Schmoll, Tim
2015-08-01
Re-occupation of existing nesting burrows in the European bee-eater Merops apiaster has only rarely - and if so mostly anecdotally - been documented in the literature, although such behavior would substantially save time and energy. In this study, we quantify burrow re-occupation in a German colony over a period of eleven years and identify ecological variables determining reuse probability. Of 179 recorded broods, 54% took place in a reused burrow, and the overall probability that one of 75 individually recognized burrows would be reused in a given subsequent year was estimated as 26.4%. This indicates that between-year burrow reuse is a common behavior in the study colony, which contrasts with findings from studies in other colonies. Furthermore, burrow re-occupation probability declined highly significantly with increasing age of the breeding wall. Statistical separation of within- and between-burrow effects of the age of the breeding wall revealed that a decline in re-occupation probability with individual burrow age was responsible for this, and not a selective disappearance of burrows with high re-occupation probability over time. Limited duty cycles of individual burrows may be caused by accumulating detritus or decreasing stability with increasing burrow age. Alternatively, burrow fidelity may presuppose pair fidelity, which may also explain the observed restricted burrow reuse duty cycles. A natural next step would be to extend our within-colony approach to other colonies and compare the ecological circumstances under which bee-eaters reuse breeding burrows.
Probability and Quantum Paradigms: the Interplay
NASA Astrophysics Data System (ADS)
Kracklauer, A. F.
2007-12-01
Since the introduction of Born's interpretation of quantum wave functions as yielding the probability density of presence, Quantum Theory and Probability have lived in a troubled symbiosis. Problems arise with this interpretation because quantum probabilities exhibit features alien to usual probabilities, namely non Boolean structure and non positive-definite phase space probability densities. This has inspired research into both elaborate formulations of Probability Theory and alternate interpretations for wave functions. Herein the latter tactic is taken and a suggested variant interpretation of wave functions based on photo detection physics proposed, and some empirical consequences are considered. Although incomplete in a few details, this variant is appealing in its reliance on well tested concepts and technology.
Surface friction alters the agility of a small Australian marsupial.
Wheatley, Rebecca; Clemente, Christofer J; Niehaus, Amanda C; Fisher, Diana O; Wilson, Robbie S
2018-04-23
Movement speed can underpin an animal's probability of success in ecological tasks. Prey often use agility to outmanoeuvre predators; however, faster speeds increase inertia and reduce agility. Agility is also constrained by grip, as the foot must have sufficient friction with the ground to apply the forces required for turning. Consequently, ground surface should affect optimum turning speed. We tested the speed-agility trade-off in buff-footed antechinus (Antechinus mysticus) on two different surfaces. Antechinus used slower turning speeds over smaller turning radii on both surfaces, as predicted by the speed-agility trade-off. Slipping was 64% more likely on the low-friction surface, and had a higher probability of occurring the faster the antechinus were running before the turn. However, antechinus compensated for differences in surface friction by using slower pre-turn speeds as their amount of experience on the low-friction surface increased, which consequently reduced their probability of slipping. Conversely, on the high-friction surface, antechinus used faster pre-turn speeds in later trials, which had no effect on their probability of slipping. Overall, antechinus used larger turning radii (0.733±0.062 versus 0.576±0.051 m) and slower pre-turn (1.595±0.058 versus 2.174±0.050 m s⁻¹) and turning speeds (1.649±0.061 versus 2.01±0.054 m s⁻¹) on the low-friction surface. Our results demonstrate the interactive effect of surface friction and the speed-agility trade-off on speed choice. To predict wild animals' movement speeds, future studies should examine the interactions between biomechanical trade-offs and terrain, and quantify the costs of motor mistakes in different ecological activities. © 2018. Published by The Company of Biologists Ltd.
A causal loop analysis of the sustainability of integrated community case management in Rwanda.
Sarriot, Eric; Morrow, Melanie; Langston, Anne; Weiss, Jennifer; Landegger, Justine; Tsuma, Laban
2015-04-01
Expansion of community health services in Rwanda has come with the national scale-up of integrated Community Case Management (iCCM) of malaria, pneumonia and diarrhea. We used a sustainability assessment framework as part of a large-scale project evaluation to identify factors affecting iCCM sustainability (2011). We then (2012) used causal-loop analysis to identify systems determinants of iCCM sustainability from a national systems perspective. This allows us to develop three high-probability future scenarios putting the achievements of community health at risk, and to recommend mitigating strategies. Our causal loop diagram highlights both balancing and reinforcing loops of cause and effect in the national iCCM system. Financial, political and technical scenarios carry a high probability of threatening sustainability through: (1) reduction in performance-based financing resources, (2) political shocks and erosion of political commitment for community health, and (3) insufficient progress in resolving district health systems--"building blocks"--performance gaps. In a complex health system, the consequences of choices may be delayed and hard to predict precisely. Causal loop analysis and scenario mapping make explicit complex cause-and-effect relationships and high-probability risks, which need to be anticipated and mitigated. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
Specifying design conservatism: Worst case versus probabilistic analysis
NASA Technical Reports Server (NTRS)
Miles, Ralph F., Jr.
1993-01-01
Design conservatism is the difference between specified and required performance, and is introduced when uncertainty is present. The classical approach of worst-case analysis for specifying design conservatism is presented, along with the modern approach of probabilistic analysis. The appropriate degree of design conservatism is a tradeoff between the required resources and the probability and consequences of a failure. A probabilistic analysis properly models this tradeoff, while a worst-case analysis reveals nothing about the probability of failure, and can significantly overstate the consequences of failure. Two aerospace examples will be presented that illustrate problems that can arise with a worst-case analysis.
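The contrast the abstract draws can be made concrete with a toy load-versus-capacity margin. The distributions below are assumptions for illustration, not the paper's aerospace examples:

```python
# Hedged illustration: the same design judged two ways. Worst-case
# analysis stacks extremes and says nothing about likelihood; the
# probabilistic view estimates an actual failure probability.
import random

random.seed(1)
load_mu, load_sd = 100.0, 5.0  # assumed load distribution (normal)
cap_mu, cap_sd = 130.0, 5.0    # assumed capacity distribution (normal)

# Worst-case: 3-sigma high load against 3-sigma low capacity.
worst_load = load_mu + 3 * load_sd
worst_cap = cap_mu - 3 * cap_sd
print("worst-case margin:", worst_cap - worst_load)  # zero margin: looks unacceptable

# Probabilistic: Monte Carlo estimate of P(load > capacity).
n = 100_000
fails = sum(random.gauss(load_mu, load_sd) > random.gauss(cap_mu, cap_sd)
            for _ in range(n))
print("estimated failure probability:", fails / n)  # very small (analytically ~1e-5)
```

The worst-case margin of zero suggests redesign, while the probabilistic view shows the failure probability is already tiny, which is exactly the overstatement of consequences the abstract describes.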
Federal Register 2010, 2011, 2012, 2013, 2014
2011-03-23
... social and economic consequences (poor academic performance, substance use, multiple disorders, suicides... substances, impact the whole community. Probable consequences include depression, domestic violence, child... financial resources; and Developing a biennial program plan, including specific objectives, performance...
ERIC Educational Resources Information Center
Shamblen, Stephen R.; Springer, J. Fred
2007-01-01
There is an absence of systematic, comparative research examining the negative consequences that are experienced as a result of using specific substances. Further, techniques typically used for needs assessment (i.e., prevalence proportions) do not take into account the probability of experiencing a negative consequence as a result of using…
Science in mid-Victorian Punch.
Noakes, Richard
2002-09-01
This article examines the scientific content of the most famous comic journal of the Victorian period: Punch. Concentrating on the first three decades of the periodical (1841-1871), I show that Punch usually engaged with science that was highly topical, of consequence to the lives of its bourgeois readers, and suitable for comic interpretation. But Punch's satire of scientific topics was highly complex. It often contained allusions to non-scientific topics, and its engagement with science ranged from the utterly comic to the sharply critical. Punch prompted readers to think as well as laugh about science, and probably shaped their scientific education more than we think.
Hazard perception and the economic impact of internment on residential land values
DOE Office of Scientific and Technical Information (OSTI.GOV)
Merz, J.F.
1983-04-01
The potential for large scale natural and man-made hazards exists in the form of hurricanes, earthquakes, volcanoes, floods, dams, accidents involving poisonous, explosive or radioactive materials, and severe pollution or waste disposal mishaps. Regions prone to natural hazards and areas located proximally to technological hazards may be subject to economic losses from low probability-high consequence events. Economic costs may be incurred in: evacuation and relocation of inhabitants; personal, industrial, agricultural, and tax revenue losses; decontamination; property damage or loss of value; and temporary or prolonged internment of land. The value of land in an area subjected to a low probability-high consequence event may decrease, reflecting, a fortiori, a reluctance to continue living in the area or to repopulate a region which had required internment. The future value of such land may be described as a function of location, time, interdiction period (if applicable), and variables reflecting the magnitude of the perceived hazard. This paper presents a study of these variables and proposes a model for land value estimation. As an example, the application of the model to the Love Canal area in Niagara Falls, New York is presented.
Boyle, Sarah C.; Earle, Andrew M.; LaBrie, Joseph W.; Ballou, Kayla
2016-01-01
Studies examining representations of college drinking on social media have almost exclusively focused on Facebook. However, recent research suggests college students may be more influenced by peers’ alcohol-related posts on Instagram and Snapchat, two image-based platforms popular among this demographic. One potential explanation for this differential influence is that qualitative distinctions in the types of alcohol-related content posted by students on these three platforms may exist. Informed by undergraduate focus groups, this study examined the hypothesis that, of the three platforms, students tend to use Instagram most often for photos glamorizing drinking and Snapchat for incriminating photos of alcohol misuse and negative consequences. Undergraduate research assistants aided investigators in developing hypothetical vignettes and photographic examples of posts both glamorizing and depicting negative consequences associated with college drinking. In an online survey, vignette and photo stimuli were followed by counterbalanced paired comparisons that presented each possible pair of social media platforms. Undergraduates (N=196) selected the platform from each pair on which they would be more likely to see each post. Generalized Bradley-Terry models examined the probabilities of platform selections. As predicted, Instagram was seen as the most probable destination (and Facebook least probable) for photos depicting alcohol use as attractive and glamorous. Conversely, Snapchat was selected as the most probable destination (and Facebook least probable) for items depicting negative consequences associated with heavy drinking. Results suggest that researchers aiming to mitigate the potential influences of college students’ glamorous and consequential alcohol-related social media posts should shift their focus from Facebook to Instagram and Snapchat. PMID:27776267
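The Bradley-Terry idea behind these paired comparisons can be sketched as follows. The win counts are invented for illustration, and the study's generalized models are more elaborate than this plain fit:

```python
# Hypothetical Bradley-Terry sketch: each platform gets a "strength" s_i,
# and the probability platform i is chosen over j in a paired comparison
# is s_i / (s_i + s_j). Strengths are fit by the standard fixed-point
# (MM) updates. The win counts below are invented, not the study's data.

platforms = ["Instagram", "Snapchat", "Facebook"]
# wins[i][j] = number of times platform i was chosen over platform j
wins = [[0, 60, 90],   # Instagram
        [40, 0, 80],   # Snapchat
        [10, 20, 0]]   # Facebook

s = [1.0, 1.0, 1.0]
for _ in range(200):  # iterate the MM update to convergence
    for i in range(3):
        w_i = sum(wins[i])  # total wins of platform i
        denom = sum((wins[i][j] + wins[j][i]) / (s[i] + s[j])
                    for j in range(3) if j != i)
        s[i] = w_i / denom
    total = sum(s)
    s = [x / total for x in s]  # normalize (strengths are scale-free)

def p_choose(i, j):
    """Fitted probability that platform i is selected over platform j."""
    return s[i] / (s[i] + s[j])

print("P(Instagram over Facebook):", round(p_choose(0, 2), 2))
```

With consistent dominance in the counts, the fitted strengths reproduce the ordering Instagram > Snapchat > Facebook seen in the selections.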
Risk-based maintenance of ethylene oxide production facilities.
Khan, Faisal I; Haddara, Mahmoud R
2004-05-20
This paper discusses a methodology for the design of an optimum inspection and maintenance program. The methodology, called risk-based maintenance (RBM), is based on integrating a reliability approach and a risk assessment strategy to obtain an optimum maintenance schedule. First, the likely equipment failure scenarios are formulated. Out of many likely failure scenarios, the ones which are most probable are subjected to a detailed study. Detailed consequence analysis is done for the selected scenarios. Subsequently, these failure scenarios are subjected to a fault tree analysis to determine their probabilities. Finally, risk is computed by combining the results of the consequence and the probability analyses. The calculated risk is compared against known acceptable criteria. The frequencies of the maintenance tasks are obtained by minimizing the estimated risk. A case study involving an ethylene oxide production facility is presented. Out of the five most hazardous units considered, the pipeline used for the transportation of ethylene is found to have the highest risk. Using available failure data and a lognormal reliability distribution function, human health risk factors are calculated. Both societal risk factors and individual risk factors exceeded the acceptable risk criteria. To determine an optimal maintenance interval, a reverse fault tree analysis was used. The maintenance interval was determined such that the original high risk is brought down to an acceptable level. A sensitivity analysis is also undertaken to study the impact of changing the distribution of the reliability model as well as the error in the distribution parameters on the maintenance interval.
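The core RBM loop, risk = failure probability × consequence compared against an acceptance criterion to back out a maintenance interval, can be sketched with assumed numbers. The failure model, loss figure, and criterion below are illustrative, not the paper's case-study data:

```python
# Hedged sketch of risk-based interval setting: shorten the inspection
# interval until (failure probability over the interval) x (consequence)
# meets an acceptable risk level, mirroring the paper's "reverse" use of
# the risk model. All numbers are invented for illustration.
import math

def failure_prob(interval_years, mtbf_years=25.0):
    """P(at least one failure within the interval), exponential model."""
    return 1.0 - math.exp(-interval_years / mtbf_years)

consequence = 5e6      # assumed loss per failure (currency units)
acceptable_risk = 2e5  # assumed acceptable risk per interval

interval = 10.0
while failure_prob(interval) * consequence > acceptable_risk:
    interval -= 0.25   # tighten the schedule until risk is acceptable

print("maintenance interval (years):", interval)  # prints 1.0
```

In the paper's methodology the failure probability would come from the fault tree rather than a single exponential model, but the interval-setting logic is the same.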
Dynamics in atomic signaling games.
Fox, Michael J; Touri, Behrouz; Shamma, Jeff S
2015-07-07
We study an atomic signaling game under stochastic evolutionary dynamics. There are a finite number of players who repeatedly update from a finite number of available languages/signaling strategies. Players imitate the most fit agents with high probability or mutate with low probability. We analyze the long-run distribution of states and show that, for sufficiently small mutation probability, its support is limited to efficient communication systems. We find that this behavior is insensitive to the particular choice of evolutionary dynamic, a property that is due to the game having a potential structure with a potential function corresponding to average fitness. Consequently, the model supports conclusions similar to those found in the literature on language competition. That is, we show that efficient languages eventually predominate in society while reproducing the empirical phenomenon of linguistic drift. The emergence of efficiency in the atomic case can be contrasted with results for non-atomic signaling games that establish the non-negligible possibility of convergence, under replicator dynamics, to states of unbounded efficiency loss. Copyright © 2015 Elsevier Ltd. All rights reserved.
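A stripped-down imitate-or-mutate dynamic of this flavor can be simulated directly. The population size, mutation rate, and payoff below are invented simplifications, not the paper's game: fitness here is simply how many others share your signaling strategy, so imitating the fittest agent means copying the most common strategy.

```python
# Toy imitation/mutation dynamic: with probability 1 - mu a randomly
# chosen agent copies the currently most common (most fit) strategy;
# with probability mu it mutates to a random strategy. Parameters are
# illustrative assumptions.
import random

random.seed(7)
n_agents, n_strategies, mu = 50, 4, 0.01
pop = [random.randrange(n_strategies) for _ in range(n_agents)]

for _ in range(2000):
    counts = [pop.count(s) for s in range(n_strategies)]
    fittest = counts.index(max(counts))       # most common = most fit here
    i = random.randrange(n_agents)
    if random.random() < mu:
        pop[i] = random.randrange(n_strategies)  # rare mutation
    else:
        pop[i] = fittest                         # imitate the most fit

counts = [pop.count(s) for s in range(n_strategies)]
print("share using dominant strategy:", max(counts) / n_agents)
```

For small mu the population spends almost all its time nearly monomorphic on one strategy, with occasional mutants, which is the qualitative picture (efficient coordination plus drift) the abstract describes.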
Regambal, Marci J; Alden, Lynn E
2012-09-01
Individuals with posttraumatic stress disorder (PTSD) are hypothesized to have a "sense of current threat." Perceived threat from the environment (i.e., external threat) can lead to overestimating the probability of the traumatic event reoccurring (Ehlers & Clark, 2000). However, it is unclear if external threat judgments are a pre-existing vulnerability for PTSD or a consequence of trauma exposure. We used trauma analog methodology to prospectively measure probability estimates of a traumatic event, and to investigate how these estimates were related to cognitive processes implicated in PTSD development. A total of 151 participants estimated the probability of being in car-accident-related situations, watched a movie of a car accident victim, and then completed a measure of data-driven processing during the movie. One week later, participants re-estimated the probabilities, and completed measures of reexperiencing symptoms and symptom appraisals/reactions. Path analysis revealed that higher pre-existing probability estimates predicted greater data-driven processing, which was associated with negative appraisals and responses to intrusions. Furthermore, lower pre-existing probability estimates and negative responses to intrusions were both associated with a greater change in probability estimates. Reexperiencing symptoms were predicted by negative responses to intrusions and, to a lesser degree, by greater changes in probability estimates. The undergraduate student sample may not be representative of the general public. The reexperiencing symptoms are less severe than what would be found in a trauma sample. Threat estimates present both a vulnerability and a consequence of exposure to a distressing event. Furthermore, changes in these estimates are associated with cognitive processes implicated in PTSD. Copyright © 2012 Elsevier Ltd. All rights reserved.
Austin, Jehannine C; Hippman, Catriona; Honer, William G
2012-03-30
Studies show that individuals with psychotic illnesses and their families want information about psychosis risks for other relatives. However, deriving accurate numeric probabilities for psychosis risk is challenging, and people have difficulty interpreting probabilistic information; thus, some have suggested that clinicians should use risk descriptors, such as "moderate" or "quite high", rather than numbers. Little is known about how individuals with psychosis and their family members use quantitative and qualitative descriptors of risk in the specific context of the chance for an individual to develop psychosis. We explored numeric and descriptive estimations of psychosis risk among individuals with psychotic disorders and unaffected first-degree relatives. In an online survey, respondents numerically and descriptively estimated risk for an individual to develop psychosis in scenarios where they had: A) no affected family members; and B) an affected sibling. Participants comprised 219 affected individuals and 211 first-degree relatives. Affected individuals estimated significantly higher risks than relatives. Participants attributed all descriptors between "very low" and "very high" to probabilities of 1%, 10%, 25% and 50%+. For a given numeric probability, different risk descriptors were attributed in different scenarios. Clinically, brief interventions around risk (using either probabilities or descriptors alone) are vulnerable to miscommunication and potentially negative consequences; interventions around risk are best suited to in-depth discussion. Copyright © 2012 Elsevier Ltd. All rights reserved.
A methodology for estimating risks associated with landslides of contaminated soil into rivers.
Göransson, Gunnel; Norrman, Jenny; Larson, Magnus; Alén, Claes; Rosén, Lars
2014-02-15
Urban areas adjacent to surface water are exposed to soil movements such as erosion and slope failures (landslides). A landslide is a potential mechanism for mobilisation and spreading of pollutants. This mechanism is generally not included in environmental risk assessments for contaminated sites, and the consequences associated with contamination in the soil are typically not considered in landslide risk assessments. This study suggests a methodology to estimate the environmental risks associated with landslides in contaminated sites adjacent to rivers. The methodology is probabilistic and allows for datasets with large uncertainties and the use of expert judgements, providing quantitative estimates of probabilities for defined failures. The approach is illustrated by a case study along the river Göta Älv, Sweden, where failures are defined and probabilities for those failures are estimated. Failures are defined from a pollution perspective and in terms of exceeding environmental quality standards (EQSs) and acceptable contaminant loads. Models are then suggested to estimate probabilities of these failures. A landslide analysis is carried out to assess landslide probabilities based on data from a recent landslide risk classification study along the river Göta Älv. The suggested methodology is meant to be a supplement to either landslide risk assessment (LRA) or environmental risk assessment (ERA), providing quantitative estimates of the risks associated with landslides at contaminated sites. The proposed methodology can also act as a basis for communication and discussion, thereby contributing to intersectoral management solutions. From the case study it was found that the defined failures are governed primarily by the probability of a landslide occurring.
The overall probabilities for failure are low; however, if a landslide occurs the probabilities of exceeding EQS are high and the probability of having at least a 10% increase in the contamination load within one year is also high. Copyright © 2013 Elsevier B.V. All rights reserved.
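The chain of conditional probabilities described above can be sketched numerically. All figures below are invented placeholders for illustration, not values from the Göta Älv study:

```python
# Illustrative only: the overall failure probability is the product of a
# (low) landslide probability and a (high) conditional probability of
# exceeding an environmental quality standard (EQS) given a landslide.
# All numbers are assumptions for this sketch, not values from the study.

p_landslide = 0.01       # assumed annual landslide probability
p_exceed_eqs = 0.8       # assumed P(exceed EQS | landslide)
p_load_increase = 0.7    # assumed P(>=10% contaminant load increase | landslide)

p_fail_eqs = p_landslide * p_exceed_eqs
p_fail_load = p_landslide * p_load_increase

print(f"P(exceed EQS)          = {p_fail_eqs:.4f}")
print(f"P(>=10% load increase) = {p_fail_load:.4f}")
```

Even with near-certain consequences given a slide, the overall risk remains governed by the landslide probability itself, matching the case-study finding.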
Health Consequences of Alcohol Use in Rural America.
ERIC Educational Resources Information Center
Brody, Gene H.; Neubaum, Eileen; Boyd, Gayle M.; Dufour, Mary
Results of three national surveys suggest that the prevalence of drinking was lower in nonmetropolitan areas than in metropolitan areas. However, nonmetro and metro areas were similar in the presence of risk for heavy, dependent, and problem drinking. Therefore, they probably share similar risks for health consequences of such levels of…
10 CFR 50.92 - Issuance of amendment.
Code of Federal Regulations, 2012 CFR
2012-01-01
... consequences (such as one that permits a significant increase in the amount of effluents or radiation emitted... significant increase in the probability or consequences of an accident previously evaluated; or (2) Create the possibility of a new or different kind of accident from any accident previously evaluated; or (3) Involve a...
Potential Future Igneous Activity at Yucca Mountain, Nevada
NASA Astrophysics Data System (ADS)
Cline, M.; Perry, F. V.; Valentine, G. A.; Smistad, E.
2005-12-01
Location, timing, and volumes of post-Miocene volcanic activity, along with expert judgement, provide the basis for assessing the probability of future volcanism intersecting a proposed repository for nuclear waste at Yucca Mountain, Nevada. Analog studies of eruptive centers in the region that may represent the style and extent of possible future igneous activity at Yucca Mountain have aided in defining the consequence scenarios for intrusion into and eruption through a proposed repository. Modeling of magmatic processes related to magma/proposed repository interactions has been used to assess the potential consequences of a future igneous event through a proposed repository at Yucca Mountain. Results of work to date indicate future igneous activity in the Yucca Mountain region has a very low probability of intersecting the proposed repository. Probability of a future event intersecting a proposed repository at Yucca Mountain is approximately 1.7 × 10⁻⁸ per year. Since completion of the Probabilistic Volcanic Hazard Assessment (PVHA) in 1996, anomalies representing potential buried volcanic centers have been identified from aeromagnetic surveys. A re-assessment of the hazard is currently underway to evaluate the probability of intersection in light of new information and to estimate the probability of one or more volcanic conduits located in the proposed repository along a dike that intersects the proposed repository. U.S. Nuclear Regulatory Commission regulations for siting and licensing a proposed repository require that the consequences of a disruptive event (igneous event) with annual probability greater than 1 × 10⁻⁸ be evaluated. Two consequence scenarios are considered: (1) the igneous intrusion-groundwater transport case and (2) the volcanic eruptive case.
These scenarios equate to a dike or dike swarm intersecting repository drifts containing waste packages, formation of a conduit leading to a volcanic eruption through the repository that carries the contents of the waste packages into the atmosphere, deposition of a tephra sheet, and redistribution of the contaminated ash. In both cases radioactive material is released to the accessible environment, either through groundwater transport or through atmospheric dispersal and deposition. Six Quaternary volcanic centers exist within 20 km of Yucca Mountain. Lathrop Wells cone (LWC), the youngest (approximately 75,000 yrs), is a well-preserved cinder cone with associated flows and tephra sheet that provides an excellent analogue for consequence studies related to future volcanism. Cone, lavas, hydrovolcanic ash, and ash-fall tephra have been examined to estimate eruptive volume and eruption type. LWC ejecta volumes suggest basaltic volcanism may be waning in the Yucca Mountain region. The eruptive products indicate a sequence of initial fissure fountaining, early Strombolian ash and lapilli deposition forming the scoria cone, a brief hydrovolcanic pulse (possibly limited to the NW sector), and a violent Strombolian phase. Mathematical models have been developed to represent magmatic processes and their consequences on proposed repository performance. These models address dike propagation, magma interaction and flow into drifts, eruption through the proposed repository, and post-intrusion/eruption effects. These models continue to be refined to reduce the uncertainty associated with the consequences of a possible future igneous event.
Computer-aided diagnosis with potential application to rapid detection of disease outbreaks.
Burr, Tom; Koster, Frederick; Picard, Rick; Forslund, Dave; Wokoun, Doug; Joyce, Ed; Brillman, Judith; Froman, Phil; Lee, Jack
2007-04-15
Our objectives are to quickly interpret symptoms of emergency patients to identify likely syndromes and to improve population-wide disease outbreak detection. We constructed a database of 248 syndromes, each syndrome having an estimated probability of producing any of 85 symptoms, with some two-way, three-way, and five-way probabilities reflecting correlations among symptoms. Using these multi-way probabilities in conjunction with an iterative proportional fitting algorithm allows estimation of full conditional probabilities. Combining these conditional probabilities with misdiagnosis error rates and incidence rates via Bayes' theorem, the probability of each syndrome is estimated. We tested a prototype of computer-aided differential diagnosis (CADDY) on simulated data and on more than 100 real cases, including West Nile Virus, Q fever, SARS, anthrax, plague, tularaemia and toxic shock cases. We conclude that: (1) it is important to determine whether the unrecorded positive status of a symptom means that the status is negative or that the status is unknown; (2) inclusion of misdiagnosis error rates produces more realistic results; (3) the naive Bayes classifier, which assumes all symptoms behave independently, is slightly outperformed by CADDY, which includes available multi-symptom information on correlations; as more information regarding symptom correlations becomes available, the advantage of CADDY over the naive Bayes classifier should increase; (4) overlooking low-probability, high-consequence events is less likely if the standard output summary is augmented with a list of rare syndromes that are consistent with observed symptoms; and (5) accumulating patient-level probabilities across a larger population can aid in biosurveillance for disease outbreaks. © 2007 John Wiley & Sons, Ltd.
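The Bayesian combination the abstract describes can be sketched as a naive Bayes posterior over syndromes. The syndromes, symptoms, and all probabilities below are invented for illustration; the real CADDY additionally models multi-symptom correlations and misdiagnosis error rates:

```python
import math

# Minimal naive-Bayes sketch of the CADDY idea: combine per-syndrome
# incidence rates (priors) with per-symptom probabilities via Bayes' theorem.
# All names and numbers here are illustrative assumptions.

incidence = {"flu": 0.10, "anthrax": 1e-6}   # P(syndrome) among presenting patients
p_symptom = {                                 # P(symptom present | syndrome)
    "flu":     {"fever": 0.8, "cough": 0.9, "widened_mediastinum": 0.001},
    "anthrax": {"fever": 0.9, "cough": 0.7, "widened_mediastinum": 0.6},
}

def posterior(observed):
    """P(syndrome | observed symptoms), assuming symptoms are independent."""
    scores = {}
    for s, prior in incidence.items():
        log_p = math.log(prior)
        for symptom, present in observed.items():
            p = p_symptom[s][symptom]
            log_p += math.log(p if present else 1.0 - p)
        scores[s] = log_p
    z = max(scores.values())                  # log-sum-exp for stability
    norm = sum(math.exp(v - z) for v in scores.values())
    return {s: math.exp(v - z) / norm for s, v in scores.items()}

post = posterior({"fever": True, "cough": True, "widened_mediastinum": True})
```

A rare syndrome like the hypothetical "anthrax" entry ends up with a small but nonzero posterior, which is exactly why conclusion (4) recommends listing rare syndromes consistent with the observed symptoms.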
Work statistics of charged noninteracting fermions in slowly changing magnetic fields.
Yi, Juyeon; Talkner, Peter
2011-04-01
We consider N fermionic particles in a harmonic trap initially prepared in a thermal equilibrium state at temperature β^{-1} and examine the probability density function (pdf) of the work done by a magnetic field slowly varying in time. The behavior of the pdf crucially depends on the number of particles N but also on the temperature. At high temperatures (β≪1) the pdf is given by an asymmetric Laplace distribution for a single particle, and for many particles it approaches a Gaussian distribution with variance proportional to N/β². At low temperatures the pdf becomes strongly peaked at the center with a variance that still linearly increases with N but exponentially decreases with the temperature. We point out the consequences of these findings for the experimental confirmation of the Jarzynski equality, such as the low-probability issue at high temperatures and its solution at low temperatures, together with a discussion of the crossover behavior between the two temperature regimes. ©2011 American Physical Society
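The "low probability issue" for the Jarzynski equality can be illustrated with a toy work distribution. For simplicity the sketch below assumes a Gaussian work pdf (an assumption for illustration, not the paper's Laplace/Gaussian results), for which the exponential average has a closed form, and shows how the sample estimate degrades when the pdf is broad:

```python
import math
import random

# Toy illustration of the "low probability issue": the estimator of
# <exp(-beta*W)> is dominated by rare low-work samples when the work pdf
# is broad. For W ~ N(mu, sigma^2) the exact value is
# exp(-beta*mu + beta^2*sigma^2/2).

random.seed(0)

def jarzynski_estimate(mu, sigma, beta, n):
    samples = [random.gauss(mu, sigma) for _ in range(n)]
    return sum(math.exp(-beta * w) for w in samples) / n

beta, mu = 1.0, 1.0
exact = lambda sigma: math.exp(-beta * mu + beta**2 * sigma**2 / 2)

narrow = jarzynski_estimate(mu, 0.2, beta, 10_000)  # sharply peaked pdf
broad = jarzynski_estimate(mu, 3.0, beta, 10_000)   # broad pdf

# The narrow-pdf estimate sits close to the exact value; the broad-pdf
# estimate is typically far off because the dominant rare samples
# (W << mu) are almost never drawn in a finite sample.
```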
Boyle, Sarah C; Earle, Andrew M; LaBrie, Joseph W; Ballou, Kayla
2017-02-01
Studies examining representations of college drinking on social media have almost exclusively focused on Facebook. However, recent research suggests college students may be more influenced by peers' alcohol-related posts on Instagram and Snapchat, two image-based platforms popular among this demographic. One potential explanation for this differential influence is that qualitative distinctions may exist in the types of alcohol-related content posted by students on these three platforms. Informed by undergraduate focus groups, this study examined the hypothesis that, of the three platforms, students tend to use Instagram most often for photos glamorizing drinking and Snapchat for incriminating photos of alcohol misuse and negative consequences. Undergraduate research assistants aided investigators in developing hypothetical vignettes and photographic examples of posts both glamorizing and depicting negative consequences associated with college drinking. In an online survey, vignette and photo stimuli were followed by counterbalanced paired comparisons that presented each possible pair of social media platforms. Undergraduates (N=196) selected the platform from each pair on which they would be more likely to see each post. Generalized Bradley-Terry models examined the probabilities of platform selections. As predicted, Instagram was seen as the most probable destination (and Facebook least probable) for photos depicting alcohol use as attractive and glamorous. Conversely, Snapchat was selected as the most probable destination (and Facebook least probable) for items depicting negative consequences associated with heavy drinking. Results suggest that researchers aiming to mitigate the potential influences of college students' glamorous and consequence-depicting alcohol-related social media posts should shift their focus from Facebook to Instagram and Snapchat. Copyright © 2016 Elsevier Ltd. All rights reserved.
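A Bradley-Terry fit of paired-comparison data of this kind can be sketched with the standard MM (minorization-maximization) update. The pairwise counts below are invented, not the study's data:

```python
# Bradley-Terry sketch (illustrative counts, not the authors' data).
# Each platform gets a worth parameter pi; the model predicts
# P(i chosen over j) = pi_i / (pi_i + pi_j).

wins = {  # wins[i][j] = times platform i was chosen over platform j
    "instagram": {"facebook": 90, "snapchat": 60},
    "snapchat":  {"facebook": 80, "instagram": 40},
    "facebook":  {"instagram": 10, "snapchat": 20},
}
platforms = list(wins)

pi = {p: 1.0 for p in platforms}
for _ in range(200):  # MM iterations converge to the maximum likelihood fit
    new = {}
    for i in platforms:
        w_i = sum(wins[i].values())  # total wins of platform i
        denom = sum((wins[i][j] + wins[j][i]) / (pi[i] + pi[j])
                    for j in platforms if j != i)
        new[i] = w_i / denom
    total = sum(new.values())
    pi = {p: v / total for p, v in new.items()}  # normalize worths

p_insta_over_fb = pi["instagram"] / (pi["instagram"] + pi["facebook"])
```

With these invented counts the fitted worths order the platforms Instagram > Snapchat > Facebook, mirroring the kind of "most probable destination" conclusion drawn in the study.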
Plateauing and Its Consequences for Educators and Educational Organizations.
ERIC Educational Resources Information Center
Milstein, Mike
Plateauing is an individual's conviction that continued progress is impossible. This conviction occurs as a consequence of long periods of occupational stability. When practitioners doubt the probability of promotion or the importance of their work and find their work to be boring and redundant, they are exhibiting symptoms of plateauing rather…
On the complex quantification of risk: systems-based perspective on terrorism.
Haimes, Yacov Y
2011-08-01
This article highlights the complexity of the quantification of the multidimensional risk function, develops five systems-based premises on quantifying the risk of terrorism to a threatened system, and advocates the quantification of vulnerability and resilience through the states of the system. The five premises are: (i) There exists interdependence between a specific threat to a system by terrorist networks and the states of the targeted system, as represented through the system's vulnerability, resilience, and criticality-impact. (ii) A specific threat, its probability, its timing, the states of the targeted system, and the probability of consequences can be interdependent. (iii) The two questions in the risk assessment process: "What is the likelihood?" and "What are the consequences?" can be interdependent. (iv) Risk management policy options can reduce both the likelihood of a threat to a targeted system and the associated likelihood of consequences by changing the states (including both vulnerability and resilience) of the system. (v) The quantification of risk to a vulnerable system from a specific threat must be built on a systemic and repeatable modeling process, by recognizing that the states of the system constitute an essential step to construct quantitative metrics of the consequences based on intelligence gathering, expert evidence, and other qualitative information. The fact that the states of all systems are functions of time (among other variables) makes the time frame pivotal in each component of the process of risk assessment, management, and communication. Thus, risk to a system, caused by an initiating event (e.g., a threat) is a multidimensional function of the specific threat, its probability and time frame, the states of the system (representing vulnerability and resilience), and the probabilistic multidimensional consequences. © 2011 Society for Risk Analysis.
A Method for Dynamic Risk Assessment and Management of Rockbursts in Drill and Blast Tunnels
NASA Astrophysics Data System (ADS)
Liu, Guo-Feng; Feng, Xia-Ting; Feng, Guang-Liang; Chen, Bing-Rui; Chen, Dong-Fang; Duan, Shu-Qian
2016-08-01
Focusing on the problems caused by rockburst hazards in deep tunnels, such as casualties, damage to construction equipment and facilities, construction schedule delays, and project cost increases, this research presents a methodology for dynamic risk assessment and management of rockbursts in drill-and-blast (D&B) tunnels. The basic idea of dynamic risk assessment and management of rockbursts is determined, and methods associated with each step in the rockburst risk assessment and management process are given. Among them, the main parts include a microseismic method for early warning of the occurrence probability of rockburst risk, an estimation method that aims to assess the potential consequences of rockburst risk, an evaluation method that utilizes a new quantitative index considering both occurrence probability and consequences for determining the level of rockburst risk, and dynamic updating of the assessment. Specifically, this research briefly describes the referenced microseismic method of warning of rockbursts, but focuses on the analysis of consequences and the associated risk assessment and management of rockbursts. Using the proposed method of risk assessment and management of rockbursts, the occurrence probability, potential consequences, and level of rockburst risk can be obtained in real time during tunnel excavation, which contributes to the dynamic optimisation of risk mitigation measures and their application. The applicability of the proposed method has been verified by cases from the Jinping II deep headrace and water drainage tunnels at depths of 1900-2525 m (with a total length of 11.6 km of D&B tunnels).
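A quantitative index combining occurrence probability and consequence can be sketched as below. The index definition, thresholds, and numbers are invented for illustration; the paper defines its own index and levels:

```python
# Hypothetical sketch of a risk index in the spirit of the paper's
# quantitative index: risk = occurrence probability x normalized consequence.
# Thresholds and values here are invented, not the authors' definitions.

def risk_index(p_occurrence, consequence_score):
    """Both inputs in [0, 1]; returns a risk index in [0, 1]."""
    return p_occurrence * consequence_score

def risk_level(index):
    if index >= 0.5:
        return "high"
    if index >= 0.2:
        return "moderate"
    return "low"

# As microseismic monitoring revises the warning probability during
# excavation, the level can be updated dynamically:
for p in (0.1, 0.4, 0.9):
    idx = risk_index(p, consequence_score=0.7)
    print(p, round(idx, 2), risk_level(idx))
```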
N -tag probability law of the symmetric exclusion process
NASA Astrophysics Data System (ADS)
Poncet, Alexis; Bénichou, Olivier; Démery, Vincent; Oshanin, Gleb
2018-06-01
The symmetric exclusion process (SEP), in which particles hop symmetrically on a discrete line with hard-core constraints, is a paradigmatic model of subdiffusion in confined systems. This anomalous behavior is a direct consequence of strong spatial correlations induced by the requirement that the particles cannot overtake each other. Although this fact has long been recognized qualitatively, until now there has been no full quantitative determination of these correlations. Here we study the joint probability distribution of an arbitrary number of tagged particles in the SEP. We determine analytically its large-time limit for an arbitrary density of particles, and its full dynamics in the high-density limit. In this limit, we obtain the time-dependent large deviation function of the problem and unveil a universal scaling form shared by the cumulants.
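The hard-core constraint at the heart of the SEP is easy to simulate. This toy Monte Carlo (not the paper's analytical calculation) shows how rejecting hops into occupied sites preserves the particle ordering, the mechanism behind the strong correlations:

```python
import random

# Toy SEP simulation on a line: particles hop left/right with equal
# probability, but a move into an occupied site is rejected (hard-core
# exclusion), so no particle can ever overtake its neighbor.

random.seed(1)
N, steps = 50, 20_000
positions = sorted(random.sample(range(200), N))  # distinct initial sites
occupied = set(positions)

for _ in range(steps):
    k = random.randrange(N)                       # pick a random particle
    target = positions[k] + random.choice((-1, 1))
    if target not in occupied:                    # exclusion rule
        occupied.remove(positions[k])
        positions[k] = target
        occupied.add(target)

# The initial left-to-right ordering of the tagged particles survives
# arbitrarily many steps, which is what induces the subdiffusive behavior.
```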
Relationships of Stress Exposures to Health in Gulf War Veterans
2004-10-01
…if such subgroups could be distinguished with respect to Gulf War exposures and probable posttraumatic stress disorder (PTSD). Additionally, we sought to examine the functional consequences of specific patterns of ill-health and probable PTSD ten…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crowe, B.
1980-12-31
This document summarizes an oral presentation that described the potential for volcanic activity at the proposed Yucca Mountain, Nevada repository site. Yucca Mountain is located in a broad zone of volcanic activity known as the Death Valley-Pancake Range volcanic zone. The estimated likelihood that some future volcanic event will intersect a buried repository at Yucca Mountain is low. Additionally, the radiological consequences of penetration of a repository by basaltic magma followed by eruption of the magma at the surface are limited. The combination of low probability and limited consequence suggests that the risk posed by waste storage at this site is low. (TEM)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor, L.L.; Wilson, J.R.; Sanchez, L.C.
1998-10-01
The US Department of Energy Office of Environmental Management's (DOE/EM's) National Spent Nuclear Fuel Program (NSNFP), through a collaboration between Sandia National Laboratories (SNL) and Idaho National Engineering and Environmental Laboratory (INEEL), is conducting a systematic Nuclear Dynamics Consequence Analysis (NDCA) of the disposal of SNFs in an underground geologic repository sited in unsaturated tuff. This analysis is intended to provide interim guidance to the DOE for the management of the SNF while it prepares for final compliance evaluation. This report presents results from a Nuclear Dynamics Consequence Analysis (NDCA) that examined the potential consequences and risks of criticality during the long-term disposal of spent nuclear fuel owned by DOE-EM. This analysis investigated the potential for post-closure criticality, the consequences of a criticality excursion, and the probability (frequency) of post-closure criticality. The results of the NDCA are intended to provide the DOE-EM with a technical basis for measuring risk, which can be used for screening arguments to eliminate post-closure criticality FEPs (features, events and processes) from consideration in the compliance assessment because of either low probability or low consequences. This report is composed of an executive summary (Volume 1), the methodology and results of the NDCA (Volume 2), and the applicable appendices (Volume 3).
Stimulus discriminability may bias value-based probabilistic learning.
Schutte, Iris; Slagter, Heleen A; Collins, Anne G E; Frank, Michael J; Kenemans, J Leon
2017-01-01
Reinforcement learning tasks are often used to assess participants' tendency to learn more from the positive or the negative consequences of their actions. However, this assessment often requires comparison of learning performance across different task conditions, which may differ in the relative salience or discriminability of the stimuli associated with more and less rewarding outcomes, respectively. To address this issue, in a first set of studies, participants were subjected to two versions of a common probabilistic learning task. The two versions differed with respect to the stimulus (Hiragana) characters associated with reward probability. The assignment of character to reward probability was fixed within version but reversed between versions. We found that performance was highly influenced by task version, which could be explained by the relative perceptual discriminability of characters assigned to high or low reward probabilities, as assessed by a separate discrimination experiment. Participants were more reliable in selecting rewarding characters that were more discriminable, leading to differences in learning curves and their sensitivity to reward probability. This difference in experienced reinforcement history was accompanied by performance biases in a test phase assessing ability to learn from positive vs. negative outcomes. In a subsequent large-scale web-based experiment, this impact of task version on learning and test measures was replicated and extended. Collectively, these findings imply a key role for perceptual factors in guiding reward learning and underscore the need to control stimulus discriminability when making inferences about individual differences in reinforcement learning.
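The tendency to learn more from positive than negative outcomes is commonly modeled with separate learning rates for positive and negative prediction errors. The sketch below is a generic version of that idea with assumed task parameters, not the authors' design or code:

```python
import math
import random

# Toy asymmetric Q-learning sketch: a two-option probabilistic selection
# task (reward probabilities 0.8 vs 0.2, both assumed), with distinct
# learning rates for positive and negative prediction errors.

random.seed(42)

def run(alpha_pos, alpha_neg, p_reward=(0.8, 0.2), trials=500, beta=5.0):
    q = [0.0, 0.0]
    correct = 0
    for _ in range(trials):
        # softmax choice between the two options
        ex = [math.exp(beta * v) for v in q]
        choice = 0 if random.random() < ex[0] / (ex[0] + ex[1]) else 1
        reward = 1.0 if random.random() < p_reward[choice] else 0.0
        delta = reward - q[choice]               # prediction error
        alpha = alpha_pos if delta > 0 else alpha_neg
        q[choice] += alpha * delta
        correct += (choice == 0)                 # option 0 is objectively better
    return correct / trials

acc = run(alpha_pos=0.3, alpha_neg=0.3)
```

Fitting alpha_pos and alpha_neg separately to choice data is one standard way such studies quantify positive- vs. negative-outcome learning, which is exactly the comparison the abstract argues can be confounded by stimulus discriminability.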
NASA Astrophysics Data System (ADS)
Ferguson, Elaine A.; Hampson, Katie; Cleaveland, Sarah; Consunji, Ramona; Deray, Raffy; Friar, John; Haydon, Daniel T.; Jimenez, Joji; Pancipane, Marlon; Townsend, Sunny E.
2015-12-01
Understanding the factors influencing vaccination campaign effectiveness is vital in designing efficient disease elimination programmes. We investigated the importance of spatial heterogeneity in vaccination coverage and human-mediated dog movements for the elimination of endemic canine rabies by mass dog vaccination in Region VI of the Philippines (Western Visayas). Household survey data was used to parameterise a spatially-explicit rabies transmission model with realistic dog movement and vaccination coverage scenarios, assuming a basic reproduction number for rabies drawn from the literature. This showed that heterogeneous vaccination reduces elimination prospects relative to homogeneous vaccination at the same overall level. Had the three vaccination campaigns completed in Region VI in 2010-2012 been homogeneous, they would have eliminated rabies with high probability. However, given the observed heterogeneity, three further campaigns may be required to achieve elimination with probability 0.95. We recommend that heterogeneity be reduced in future campaigns through targeted efforts in low coverage areas, even at the expense of reduced coverage in previously high coverage areas. Reported human-mediated dog movements did not reduce elimination probability, so expending limited resources on restricting dog movements is unnecessary in this endemic setting. Enhanced surveillance will be necessary post-elimination, however, given the reintroduction risk from long-distance dog movements.
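The penalty for patchy coverage can be illustrated with a back-of-envelope threshold argument (all numbers assumed, not the paper's fitted model): transmission persists wherever the local effective reproduction number R0·(1 − coverage) stays above 1, even if mean coverage is high:

```python
# Back-of-envelope sketch (assumed numbers, not the study's spatial model):
# with patchy vaccination, low-coverage pockets can keep the local
# effective reproduction number above 1 even when mean coverage is the
# same as in a homogeneous campaign.

R0 = 1.5  # assumed, within the range reported for canine rabies

homogeneous = [0.6] * 5                      # uniform coverage, mean 0.6
heterogeneous = [0.9, 0.8, 0.6, 0.4, 0.3]    # same mean 0.6, patchy delivery

def patches_above_threshold(coverages, r0=R0):
    """Coverage values whose patches still sustain transmission."""
    return [c for c in coverages if r0 * (1 - c) > 1]

print(patches_above_threshold(homogeneous))    # no patch sustains transmission
print(patches_above_threshold(heterogeneous))  # low-coverage pockets remain
```

This is why the abstract recommends targeting low-coverage areas even at the expense of previously high-coverage ones: only the pockets above threshold matter for elimination.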
NASA Astrophysics Data System (ADS)
Ismaila, Aminu; Md Kasmani, Rafiziana; Meng-Hock, Koh; Termizi Ramli, Ahmad
2017-10-01
This paper deals with the assessment of an external explosion resulting from accidental release of jet fuel from a large commercial airliner at a nuclear power plant (NPP). The study used three widely used prediction methods, Trinitrotoluene equivalence (TNT), TNO multi-energy, and Baker-Strehlow (BST), to determine the unconfined vapour cloud explosion (UVCE) overpressure within distances of 100-1400 m from the first impact location. The containment building was taken as the reference position. Fatalities and structural damage were estimated using the probit methodology. Analysis of the results shows that both the reactor building and the control room would be heavily damaged, with risk consequences and probability depending on the assumed position of the crash. Structures at a radial distance of 600 m may suffer major structural damage with probability ranging from 25 to 100%. Minor structural damage was observed throughout the bounds of the plant complex. People working within a 250 m radius may be affected, with fatality probabilities ranging from 28 to 100%. The findings of this study are valuable for evaluating the safety improvements needed at the NPP site and the risks and consequences associated with hydrocarbon fuel releases/fires due to external hazards.
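The probit methodology maps a physical dose (here, peak overpressure) to a damage or fatality probability through a probit value Y = a + b·ln(dose) and the standard normal CDF. The coefficients below are invented for illustration; real assessments use published probit constants for each damage or injury type:

```python
import math

# Generic probit sketch. Probit value Y = a + b*ln(dose); the probability
# is Phi(Y - 5), with Phi the standard normal CDF. Coefficients a and b
# below are illustrative assumptions, not published constants.

def probit_probability(dose, a, b):
    y = a + b * math.log(dose)
    return 0.5 * (1.0 + math.erf((y - 5.0) / math.sqrt(2.0)))

# e.g. probability of major structural damage vs. peak overpressure (Pa)
for overpressure in (5_000, 20_000, 80_000):
    p = probit_probability(overpressure, a=-20.0, b=2.5)
    print(overpressure, round(p, 3))
```

The logarithmic dose term gives the characteristic S-shaped damage curve: probabilities near zero at low overpressure, a steep transition, then saturation near 100%, matching the 25-100% damage bands reported in the abstract.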
Both, Christiaan; Van Turnhout, Chris A M; Bijlsma, Rob G; Siepel, Henk; Van Strien, Arco J; Foppen, Ruud P B
2010-04-22
One consequence of climate change is an increasing mismatch between timing of food requirements and food availability. Such a mismatch is primarily expected in avian long-distance migrants because of their complex annual cycle, and in habitats with a seasonal food peak. Here we show that insectivorous long-distance migrant species in The Netherlands declined strongly (1984-2004) in forests, a habitat characterized by a short spring food peak, but that they did not decline in less seasonal marshes. Also, within generalist long-distance migrant species, populations declined more strongly in forests than in marshes. Forest-inhabiting migrant species arriving latest in spring declined most sharply, probably because their mismatch with the peak in food supply is greatest. Residents and short-distance migrants had non-declining populations in both habitats, suggesting that habitat quality did not deteriorate. Habitat-related differences in trends were most probably caused by climate change because at a European scale, long-distance migrants in forests declined more severely in western Europe, where springs have become considerably warmer, when compared with northern Europe, where temperatures during spring arrival and breeding have increased less. Our results suggest that trophic mismatches may have become a major cause for population declines in long-distance migrants in highly seasonal habitats.
Dangerous "spin": the probability myth of evidence-based prescribing - a Merleau-Pontyian approach.
Morstyn, Ron
2011-08-01
The aim of this study was to examine logical positivist statistical probability statements used to support and justify "evidence-based" prescribing rules in psychiatry when viewed from the major philosophical theories of probability, and to propose "phenomenological probability" based on Maurice Merleau-Ponty's philosophy of "phenomenological positivism" as a better clinical and ethical basis for psychiatric prescribing. The logical positivist statistical probability statements currently used to support "evidence-based" prescribing rules in psychiatry have little clinical or ethical justification when subjected to critical analysis from any of the major theories of probability, and they represent dangerous "spin" because they necessarily exclude the individual, intersubjective and ambiguous meaning of mental illness. A concept of "phenomenological probability" founded on Merleau-Ponty's philosophy of "phenomenological positivism" overcomes the clinically destructive "objectivist" and "subjectivist" consequences of logical positivist statistical probability and allows psychopharmacological treatments to be appropriately integrated into psychiatric treatment.
Alban, Lis; Ellis-Iversen, Johanne; Andreasen, Margit; Dahl, Jan; Sönksen, Ute W.
2017-01-01
Antibiotic consumption in pigs can be optimized by developing treatment guidelines, which encourage veterinarians to use effective drugs with a low probability of developing resistance of importance for human health. In Denmark, treatment guidelines for use in swine production are currently under review at the Danish Veterinary and Food Administration. Use of pleuromutilins in swine has previously been associated with a very low risk for human health. However, recent international data and sporadic findings of novel resistance genes suggest a change of risk. Consequently, a reassessment was undertaken, inspired by a risk assessment framework developed by the European Medicines Agency. Livestock-associated methicillin-resistant Staphylococcus aureus of clonal complex 398 (MRSA CC398) and enterococci were identified as relevant hazards. The release assessment showed that the probability of development of pleuromutilin resistance was high in MRSA CC398 (medium uncertainty) and low in enterococci (high uncertainty). A relatively small proportion of Danes has occupational exposure to pigs, and foodborne transmission was considered relevant only for enterococci, resulting in an altogether low exposure risk. The human consequences of infection with pleuromutilin-resistant MRSA CC398 or enterococci were assessed as low for the public in general but high for vulnerable groups such as hospitalized and immunocompromised persons. For MRSA CC398, the total risk was estimated as low (low uncertainty), among other reasons owing to the current guidelines on prevention of MRSA in place at Danish hospitals, which include screening, on admission, of patients with daily contact with pigs. Moreover, MRSA CC398 has a medium human-to-human transmission potential.
For enterococci, the total risk was also estimated as low, owing to the low prevalence of resistance, the low probability of spread to humans, and low virulence; however, hospitalized patients are not screened, enterococci readily acquire resistance genes, and few alternative antimicrobials are available (high uncertainty). This assessment reflects the current situation and should be repeated if pleuromutilin consumption increases substantially, resulting in an increased prevalence of mobile, easily transmissible resistance mechanisms. Continuous monitoring of pleuromutilin resistance in selected human pathogens should therefore be considered. This should also include monitoring of linezolid resistance, since resistance mechanisms for pleuromutilins and oxazolidinones are often coupled. PMID:28603717
Early detection monitoring for larval dreissenid mussels: How much plankton sampling is enough?
Counihan, Timothy D.; Bollens, Stephen M.
2017-01-01
The development of quagga and zebra mussel (dreissenids) monitoring programs in the Pacific Northwest provides a unique opportunity to evaluate a regional invasive species detection effort early in its development. Recent studies suggest that the ecological and economic costs of a dreissenid infestation in the Pacific Northwest of the USA would be significant. Consequently, efforts are underway to monitor for the presence of dreissenids. However, assessments of whether these efforts provide for early detection are lacking. We use information collected from 2012 to 2014 to characterize the development of larval dreissenid monitoring programs in the states of Idaho, Montana, Oregon, and Washington in the context of introduction and establishment risk. We also estimate the effort needed for high-probability detection of rare planktonic taxa in four Columbia and Snake River reservoirs and assess whether the current level of effort provides for early detection. We found that the effort expended to monitor for dreissenid mussels increased substantially from 2012 to 2014, that efforts were distributed across risk categories ranging from high to very low, and that substantial gaps in our knowledge of both introduction and establishment risk exist. The estimated volume of filtered water required to fully census planktonic taxa or to provide high-probability detection of rare taxa was high for the four reservoirs examined. We conclude that the current level of effort expended does not provide for high-probability detection of larval dreissenids or other planktonic taxa when they are rare in these reservoirs. We discuss options to improve early detection capabilities.
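The "how much sampling is enough?" question in the abstract above can be framed as a simple detection-probability calculation. The sketch below assumes randomly (Poisson) distributed larvae, so the required filtered volume follows from P(detect) = 1 − exp(−density × volume); the density figure used is illustrative, not a value from the study:

```python
import math

def volume_for_detection(density_per_m3: float, target_prob: float = 0.95) -> float:
    """Volume of water (m^3) to filter so that at least one organism is
    captured with probability >= target_prob, assuming individuals are
    randomly (Poisson) distributed at the given density."""
    # P(detect) = 1 - exp(-density * volume)  =>  volume = -ln(1 - P) / density
    return -math.log(1.0 - target_prob) / density_per_m3

# At a rare density of 0.01 larvae per m^3, 95% detection requires ~300 m^3:
print(round(volume_for_detection(0.01), 1))  # 299.6
```

The steep growth of this volume as density falls is why the abstract concludes that current effort does not provide high-probability detection of rare taxa.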
Value and probability coding in a feedback-based learning task utilizing food rewards.
Tricomi, Elizabeth; Lempert, Karolina M
2015-01-01
For the consequences of our actions to guide behavior, the brain must represent different types of outcome-related information. For example, an outcome can be construed as negative because an expected reward was not delivered or because an outcome of low value was delivered. Thus behavioral consequences can differ in terms of the information they provide about outcome probability and value. We investigated the role of the striatum in processing probability-based and value-based negative feedback by training participants to associate cues with food rewards and then employing a selective satiety procedure to devalue one food outcome. Using functional magnetic resonance imaging, we examined brain activity related to receipt of expected rewards, receipt of devalued outcomes, omission of expected rewards, omission of devalued outcomes, and expected omissions of an outcome. Nucleus accumbens activation was greater for rewarding outcomes than devalued outcomes, but activity in this region did not correlate with the probability of reward receipt. Activation of the right caudate and putamen, however, was largest in response to rewarding outcomes relative to expected omissions of reward. The dorsal striatum (caudate and putamen) at the time of feedback also showed a parametric increase correlating with the trialwise probability of reward receipt. Our results suggest that the ventral striatum is sensitive to the motivational relevance, or subjective value, of the outcome, while the dorsal striatum codes for a more complex signal that incorporates reward probability. Value and probability information may be integrated in the dorsal striatum, to facilitate action planning and allocation of effort. Copyright © 2015 the American Physiological Society.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor, L.L.; Wilson, J.R.; Sanchez, L.C.
The United States Department of Energy Office of Environmental Management's (DOE/EM's) National Spent Nuclear Fuel Program (NSNFP), through a collaboration between Sandia National Laboratories (SNL) and Idaho National Engineering and Environmental Laboratory (INEEL), is conducting a systematic Nuclear Dynamics Consequence Analysis (NDCA) of the disposal of SNF in an underground geologic repository sited in unsaturated tuff. This analysis is intended to provide interim guidance to the DOE for the management of the SNF while it prepares for final compliance evaluation. This report presents results from an NDCA that examined the potential consequences and risks of criticality during the long-term disposal of spent nuclear fuel owned by DOE-EM. The analysis investigated the potential for post-closure criticality, the consequences of a criticality excursion, and the probability and frequency of post-closure criticality. The results of the NDCA are intended to provide DOE-EM with a technical basis for measuring risk, which can be used in screening arguments to eliminate post-closure criticality FEPs (features, events and processes) from consideration in the compliance assessment on the grounds of either low probability or low consequences. This report is composed of an executive summary (Volume 1), the methodology and results of the NDCA (Volume 2), and the applicable appendices (Volume 3).
Ocular toxicity of fludarabine
Ding, Xiaoyan; Herzlich, Alexandra A; Bishop, Rachel; Tuo, Jingsheng; Chan, Chi-Chao
2008-01-01
The purine analogs fludarabine and cladribine represent an important class of chemotherapy agents used to treat a broad spectrum of lymphoid malignancies. Their toxicity profiles include dose-limiting myelosuppression, immunosuppression, opportunistic infection and severe neurotoxicity. This review summarizes the neurotoxicity of high- and standard-dose fludarabine, focusing on the clinical and pathological manifestations in the eye. The mechanisms of ocular toxicity are probably multifactorial. With increasing clinical use, awareness of the neurological and ocular vulnerability, particularly to fludarabine, is important owing to the potential for life- and sight-threatening consequences. PMID:18461151
Legal consequences of the moral duty to report errors.
Hall, Jacqulyn Kay
2003-09-01
Increasingly, clinicians are under a moral duty to report errors to the patients injured by those errors. The sources of this duty are identified, and its probable impact on malpractice litigation and criminal law is discussed. The potential consequences of enforcing this new moral duty as a legal minimum are noted. One predicted consequence is that the trend toward government payment of compensation for errors will be accelerated. The effect of truth-telling on individuals is discussed.
Holbrook, Christopher M.; Perry, Russell W.; Brandes, Patricia L.; Adams, Noah S.
2013-01-01
In telemetry studies, premature tag failure causes negative bias in fish survival estimates because tag failure is interpreted as fish mortality. We used mark-recapture modeling to adjust estimates of fish survival for a previous study where premature tag failure was documented. High rates of tag failure occurred during the Vernalis Adaptive Management Plan's (VAMP) 2008 study to estimate survival of fall-run Chinook salmon (Oncorhynchus tshawytscha) during migration through the San Joaquin River and Sacramento-San Joaquin Delta, California. Because of the high rate of tag failure, the observed travel time distribution was likely negatively biased, resulting in an overestimate of tag survival probability in this study. Consequently, the bias-adjustment method resulted in only a small increase in estimated fish survival when the observed travel time distribution was used to estimate the probability of tag survival. Since the bias adjustment failed to remove the bias, we used historical travel time data and conducted a sensitivity analysis to examine how fish survival might have varied across a range of tag survival probabilities. Our analysis suggested that fish survival estimates were low (95% confidence bounds range from 0.052 to 0.227) over a wide range of plausible tag survival probabilities (0.48–1.00), and this finding is consistent with other studies in this system. When tags fail at a high rate, available methods to adjust for the bias may perform poorly. Our example highlights the importance of evaluating the tag-life assumption during survival studies and presents a simple framework for evaluating adjusted survival estimates when auxiliary travel time data are available.
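The core of the bias adjustment described above is dividing apparent survival by the probability that a tag outlived the fish's travel time. A minimal sketch of that idea follows; the function name and the example numbers are illustrative, not the authors' code or data:

```python
def adjusted_survival(observed_survival: float, tag_survival_prob: float) -> float:
    """Adjust an apparent fish survival estimate for premature tag failure.
    The observed estimate confounds fish mortality with tag failure, so
    dividing by the probability that a tag is still functioning at detection
    recovers an estimate of fish survival (capped at 1)."""
    if not 0 < tag_survival_prob <= 1:
        raise ValueError("tag survival probability must be in (0, 1]")
    return min(observed_survival / tag_survival_prob, 1.0)

# Illustrative numbers: if apparent survival is 0.10 but only 48% of tags
# (the lower bound considered plausible) outlived the travel time, then
# adjusted fish survival is about 0.208:
print(round(adjusted_survival(0.10, 0.48), 3))  # 0.208
```

A sensitivity analysis of the kind the abstract describes amounts to sweeping `tag_survival_prob` over its plausible range and inspecting the resulting survival estimates.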
DOE Office of Scientific and Technical Information (OSTI.GOV)
Joerissen, G.; Zuend, H.
From international nuclear industries fair; Basel, Switzerland (16 Oct 1972). The probability and the consequences of an aircraft crash on a nuclear power plant incorporating a light water reactor are estimated considering the probabilities of an aircraft strike, missile penetration through walls and damage of structures and systems important for safety. The estimated risks are presented in a Farmer diagram and compared with tolerable risk limits. (6 references) (auth)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lestrange, Patrick J.; Egidi, Franco; Li, Xiaosong, E-mail: xsli@uw.edu
2015-12-21
The interaction between a quantum mechanical system and plane wave light is usually modeled within the electric dipole approximation. This assumes that the intensity of the incident field is constant over the length of the system and transition probabilities are described in terms of the electric dipole transition moment. For short wavelength spectroscopies, such as X-ray absorption, the electric dipole approximation often breaks down. Higher order multipoles are then included to describe transition probabilities. The square of the magnetic dipole and electric quadrupole are often included, but this results in an origin-dependent expression for the oscillator strength. The oscillator strength can be made origin-independent if all terms through the same order in the wave vector are retained. We will show the consequences and potential pitfalls of using either of these two expressions. It is shown that the origin-dependent expression may violate the Thomas-Reiche-Kuhn sum rule and the origin-independent expression can result in negative transition probabilities.
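The multipole expansion at issue can be written schematically as follows (the notation is assumed, not taken from the paper). Squaring only the first-order magnetic dipole and electric quadrupole terms gives the origin-dependent expression; retaining every contribution through the same order in the wave vector k, including the interference of the zeroth-order dipole term with the second-order term, restores origin independence at the cost of possibly negative transition probabilities:

```latex
% Schematic expansion of the light-matter transition amplitude:
% k = wave vector, \hat{\epsilon} = polarization, p = momentum operator.
T_{fi}
  = \langle f \,|\, e^{\mathrm{i}\mathbf{k}\cdot\mathbf{r}}\,
    (\hat{\boldsymbol{\epsilon}}\cdot\mathbf{p}) \,|\, i \rangle
  \approx
  \underbrace{\langle f \,|\, \hat{\boldsymbol{\epsilon}}\cdot\mathbf{p} \,|\, i \rangle}_{\text{electric dipole}}
  + \mathrm{i}\,
  \underbrace{\langle f \,|\, (\mathbf{k}\cdot\mathbf{r})\,
    (\hat{\boldsymbol{\epsilon}}\cdot\mathbf{p}) \,|\, i \rangle}_{\substack{\text{magnetic dipole}\\[2pt] \text{+ electric quadrupole}}}
  + \mathcal{O}(k^{2})
```

Under a shift of origin $\mathbf{r} \to \mathbf{r} + \mathbf{a}$, the magnetic dipole and electric quadrupole pieces each change individually, which is why truncating the squared amplitude after them leaves an origin-dependent oscillator strength.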
Single, Complete, Probability Spaces Consistent With EPR-Bohm-Bell Experimental Data
NASA Astrophysics Data System (ADS)
Avis, David; Fischer, Paul; Hilbert, Astrid; Khrennikov, Andrei
2009-03-01
We show that paradoxical consequences of violations of Bell's inequality are induced by the use of an unsuitable probabilistic description for the EPR-Bohm-Bell experiment. The conventional description (due to Bell) is based on a combination of statistical data collected for different settings of polarization beam splitters (PBSs). In fact, such data consist of conditional probabilities which only partially define a probability space. Ignoring this conditioning leads to apparent contradictions in the classical probabilistic model (due to Kolmogorov). We show how to make a completely consistent probabilistic model by taking into account the probabilities of selecting the settings of the PBSs. Our model both matches the experimental data and is consistent with classical probability theory.
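The construction described, folding the random choice of PBS settings into one sample space, can be illustrated with a small numerical sketch. The setting angles and the uniform setting-selection probabilities below are assumptions for illustration, not the authors' values; the point is only that one normalized Kolmogorov measure covers all setting/outcome combinations:

```python
import itertools
import math

# Four possible setting pairs, each mapped to (alpha, beta) analyzer angles.
settings = {("a1", "b1"): (0.0, math.pi / 8),
            ("a1", "b2"): (0.0, 3 * math.pi / 8),
            ("a2", "b1"): (math.pi / 4, math.pi / 8),
            ("a2", "b2"): (math.pi / 4, 3 * math.pi / 8)}

def joint(setting_pair, x, y):
    """P(setting, x, y): uniform probability of choosing the setting pair,
    times the singlet-state conditional probability of outcomes x, y in {+1, -1}."""
    alpha, beta = settings[setting_pair]
    p_xy = 0.25 * (1 - x * y * math.cos(2 * (alpha - beta)))
    return 0.25 * p_xy  # 0.25 = probability of selecting this setting pair

# The measure is normalized over the full sample space (settings x outcomes):
total = sum(joint(s, x, y) for s in settings
            for x, y in itertools.product((1, -1), repeat=2))
print(round(total, 10))  # 1.0
```

The Bell-type data then appear as conditional probabilities of this single measure, conditioned on the setting pair, rather than as four unrelated probability spaces.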
Outcome Probability versus Magnitude: When Waiting Benefits One at the Cost of the Other
Young, Michael E.; Webb, Tara L.; Rung, Jillian M.; McCoy, Anthony W.
2014-01-01
Using a continuous impulsivity and risk platform (CIRP) that was constructed using a video game engine, choice was assessed under conditions in which waiting produced a continuously increasing probability of an outcome with a continuously decreasing magnitude (Experiment 1) or a continuously increasing magnitude of an outcome with a continuously decreasing probability (Experiment 2). Performance in both experiments reflected a greater desire for a higher probability even though the corresponding wait times produced substantive decreases in overall performance. These tendencies are considered to principally reflect hyperbolic discounting of probability, power discounting of magnitude, and the mathematical consequences of different response rates. Behavior in the CIRP is compared and contrasted with that in the Balloon Analogue Risk Task (BART). PMID:24892657
Decompressing recompression chamber attendants during Australian submarine rescue operations.
Reid, Michael P; Fock, Andrew; Doolette, David J
2017-09-01
Inside chamber attendants rescuing survivors from a pressurised, distressed submarine may themselves accumulate a decompression obligation that may exceed the limits of the Defence and Civil Institute of Environmental Medicine tables presently used by the Royal Australian Navy. This study assessed the probability of decompression sickness (P(DCS)) for medical attendants supervising survivors undergoing oxygen-accelerated saturation decompression according to the National Oceanic and Atmospheric Administration (NOAA) 17.11 table. The estimated P(DCS), the unit pulmonary toxicity dose (UPTD) and the volume of oxygen required were calculated for attendants breathing air during the NOAA table and compared with the introduction of various periods of oxygen breathing. The P(DCS) for medical attendants breathing air whilst supervising survivors receiving NOAA decompression is up to 4.5%. For the longest predicted profile (830 minutes at 253 kPa), oxygen breathing for 30, 60 and 90 minutes at 132 kPa partial pressure of oxygen reduced the air-breathing-associated P(DCS) to less than 3.1%, 2.1% and 1.4%, respectively. The probability of at least one incident of DCS among attendants, with consequent strain on resources, is high if attendants breathe air throughout their exposure. The introduction of 90 minutes of oxygen breathing greatly reduces the probability of this interruption to rescue operations.
Crime and punishment: the economic burden of impunity
NASA Astrophysics Data System (ADS)
Gordon, M. B.; Iglesias, J. R.; Semeshenko, V.; Nadal, J. P.
2009-03-01
Crime is an economically relevant activity. It may represent a mechanism of wealth distribution but also a social and economic burden because of the interference with regular legal activities and the cost of the law enforcement system. Sometimes it may be less costly for the society to allow for some level of criminality. However, a drawback of such a policy is that it may lead to a sharp increase in criminal activity that may become hard to reduce later on. Here we investigate the level of law enforcement required to keep crime within acceptable limits. A sharp phase transition is observed as a function of the probability of punishment. We also analyze other consequences of criminality, such as the growth of the economy, the inequality in the wealth distribution (the Gini coefficient) and other relevant quantities, under different scenarios of criminal activity and probabilities of apprehension.
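The threshold behavior described can be illustrated with a deliberately minimal agent model; this is not the authors' model, only a sketch in which each agent offends when the expected criminal gain exceeds the expected punishment cost:

```python
import random

def crime_fraction(p_punish: float, n: int = 10_000,
                   fine: float = 5.0, seed: int = 1) -> float:
    """Fraction of n agents choosing crime. Each agent has a heterogeneous
    potential gain in [0, 1] and offends when the expected gain exceeds the
    expected punishment cost (fine and gain distribution are illustrative)."""
    rng = random.Random(seed)
    criminals = 0
    for _ in range(n):
        gain = rng.uniform(0.0, 1.0)
        if (1 - p_punish) * gain > p_punish * fine:
            criminals += 1
    return criminals / n

# The criminal fraction collapses abruptly as the punishment probability rises:
for q in (0.0, 0.1, 0.2):
    print(q, round(crime_fraction(q), 2))
```

Even in this toy version, a modest rise in the probability of punishment takes the population from nearly universal offending to none, echoing the sharp transition reported in the abstract.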
The HMDS Coating Flaw Removal Tool
DOE Office of Scientific and Technical Information (OSTI.GOV)
Monticelli, M V; Nostrand, M C; Mehta, N
2008-10-24
In many high energy laser systems, optics with HMDS sol-gel antireflective coatings are placed in close proximity to each other, making them particularly susceptible to certain types of strong optical interactions. During the coating process, halo-shaped coating flaws develop around surface digs and particles. Depending on the shape and size of the flaw, the extent of laser light intensity modulation and the consequent probability of damaging downstream optics may increase significantly. To prevent these defects from causing damage, a coating flaw removal tool was developed that deploys a spot of decane with a syringe and dissolves away the coating flaw. The residual liquid is evacuated, leaving an uncoated circular spot approximately 1 mm in diameter. The resulting uncoated region causes little light intensity modulation and thus has a low probability of causing damage in optics downstream from the mitigated flaw site.
Busy Nights: High Seed Dispersal by Crickets in a Neotropical Forest.
Santana, Flávia Delgado; Baccaro, Fabricio Beggiato; Costa, Flávia Regina Capellotto
2016-11-01
Among invertebrates, ants are the most abundant and probably most important seed dispersers in both temperate and tropical environments. Crickets, also abundant in tropical forests, are omnivores and commonly attracted to fruits on the forest floor. However, their capability to remove seeds has been reported only once. We compared Marantaceae seed removal by crickets and ants to assess the role of crickets as secondary seed dispersers in Amazonia. Compared with ants, crickets dispersed an equivalent number of seeds and tended to disperse larger seeds farther. However, seed removal by crickets occurs mostly at night, suggesting that removal of arillate seeds by crickets on the tropical forest floor is probably being overlooked or wrongly attributed to other invertebrate groups. One potential consequence of seed dispersal by crickets may be a change in the local spatial distribution of arillate-seed species, due to lower aggregation around ant nests.
Toward inflation models compatible with the no-boundary proposal
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hwang, Dong-il; Yeom, Dong-han, E-mail: dongil.j.hwang@gmail.com, E-mail: innocent.yeom@gmail.com
2014-06-01
In this paper, we investigate various inflation models in the context of the no-boundary proposal. We propose that a good inflation model should satisfy three conditions: observational constraints, plausible initial conditions, and naturalness of the model. For various inflation models, we assign the probability to each initial condition using the no-boundary proposal and define a quantitative standard, typicality, to check whether the model satisfies the observational constraints with probable initial conditions. There are three possible ways to satisfy the typicality criterion: there was pre-inflation near the high energy scale; the potential is finely tuned or the inflationary field space is unbounded; or there is a sufficient number of fields that contribute to inflation. The no-boundary proposal rejects some naive inflation models, explains some traditional doubts about inflation, and may have observational consequences.
NASA Astrophysics Data System (ADS)
Zhang, Weiping; Chen, Wenyuan; Zhao, Xiaolin; Li, Shengyong; Jiang, Yong
2005-08-01
In a novel safety device based on MEMS technology for high-consequence systems, the discriminator consists of two groups of metal counter-meshing gears and two pawl/ratchet-wheel mechanisms. Each group of counter-meshing gears is one-piece and driven directly by its own axial-flux permanent-magnet micromotor. The energy-coupling element is an optical shutter with two collimators and a coupler wheel. The safety device's failure probability is less than 1/10^6. It is fabricated by a combination of a LIGA-like process and precision mechanical engineering. The device has a simple structure, few dynamic problems, high strength and strong reliability.
Liquefaction hazard for the region of Evansville, Indiana
Haase, Jennifer S.; Choi, Yoon S.; Nowack, Robert L.; Cramer, Chris H.; Boyd, Oliver S.; Bauer, Robert A.
2011-01-01
Maps of liquefaction hazard for each scenario earthquake present (1) the mean liquefaction potential index at each site, and (2) the probabilities that liquefaction potential index values exceed 5 (the threshold for expression of surface liquefaction) and 12 (the threshold for lateral spreading). Values of the liquefaction potential index are high in the River alluvium group, where the soil profiles are predominantly sand, while values in the Lacustrine terrace group are lower, owing to the predominance of clay. Liquefaction potential index values in the Outwash terrace group are less consistent because the soil profiles contain highly variable sequences of silty sand, clayey sand, and sandy clay, justifying the use of the Monte Carlo procedure to capture the consequences of this complexity.
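The Monte Carlo idea mentioned for the variable Outwash terrace profiles can be sketched as follows: draw liquefaction potential index (LPI) values from an assumed site distribution and report the probabilities of exceeding the two mapped thresholds. The lognormal form and its parameters are assumptions for illustration, not values from the study:

```python
import math
import random

def exceedance_probs(median_lpi: float, sigma_ln: float,
                     n: int = 100_000, seed: int = 42) -> tuple:
    """Monte Carlo estimate of P(LPI > 5) and P(LPI > 12), drawing LPI from
    a lognormal distribution with the given median and log-space spread
    (both illustrative). 5 = surface liquefaction, 12 = lateral spreading."""
    rng = random.Random(seed)
    mu = math.log(median_lpi)
    hits5 = hits12 = 0
    for _ in range(n):
        lpi = rng.lognormvariate(mu, sigma_ln)
        hits5 += lpi > 5
        hits12 += lpi > 12
    return hits5 / n, hits12 / n

p5, p12 = exceedance_probs(median_lpi=6.0, sigma_ln=0.6)
print(round(p5, 2), round(p12, 2))
```

Widening `sigma_ln` to mimic the variable silty/clayey/sandy sequences spreads the distribution and changes both exceedance probabilities, which is exactly the complexity the Monte Carlo procedure is meant to capture.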
Movement of fine particles on an air bubble surface studied using high-speed video microscopy.
Nguyen, Anh V; Evans, Geoffrey M
2004-05-01
A CCD high-speed video microscopy system operating at 1000 frames per second was used to obtain direct quantitative measurements of the trajectories of fine glass spheres on the surface of air bubbles. The glass spheres were rendered hydrophobic by a methylation process. Rupture of the intervening water film between a hydrophobic particle and an air bubble, with the consequent formation of a three-phase contact, was observed. The bubble-particle sliding attachment interaction is not satisfactorily described by the available theories. Surface forces had little effect on the particle sliding over the water film, which probably ruptured owing to submicrometer-sized gas bubbles existing at the hydrophobic particle-water interface.
Goyret, Joaquín; Kelber, Almut; Pfaff, Michael; Raguso, Robert A
2009-08-07
Here, we show that the consequences of deficient micronutrient (beta-carotene) intake during larval stages of Manduca sexta are carried across metamorphosis, affecting adult behaviour. Our manipulation of larval diet allowed us to examine how developmental plasticity impacts the interplay between visual and olfactory inputs on adult foraging behaviour. Larvae of M. sexta were reared on natural (Nicotiana tabacum) and artificial laboratory diets containing different concentrations of beta-carotene (standard diet, low beta-carotene, high beta-carotene and cornmeal). This vitamin-A precursor has been shown to be crucial for photoreception sensitivity in the retina of M. sexta. After completing development, post-metamorphosis, starved adults were presented with artificial feeders that could be either scented or unscented. Regardless of their larval diet, adult moths fed with relatively high probabilities on scented feeders. When feeders were unscented, moths reared on tobacco were more responsive than moths reared on beta-carotene-deficient artificial diets. Strikingly, moths reared on artificial diets supplemented with increasing amounts of beta-carotene (low beta and high beta) showed increasing probabilities of response to scentless feeders. We discuss these results in relationship to the use of complex, multi-modal sensory information by foraging animals.
Knoblauch, Theresa A K; Stauffacher, Michael; Trutnevyte, Evelina
2018-04-01
Subsurface energy activities entail the risk of induced seismicity including low-probability high-consequence (LPHC) events. For designing respective risk communication, the scientific literature lacks empirical evidence of how the public reacts to different written risk communication formats about such LPHC events and to related uncertainty or expert confidence. This study presents findings from an online experiment (N = 590) that empirically tested the public's responses to risk communication about induced seismicity and to different technology frames, namely deep geothermal energy (DGE) and shale gas (between-subject design). Three incrementally different formats of written risk communication were tested: (i) qualitative, (ii) qualitative and quantitative, and (iii) qualitative and quantitative with risk comparison. Respondents found the latter two the easiest to understand, the most exact, and liked them the most. Adding uncertainty and expert confidence statements made the risk communication less clear, less easy to understand and increased concern. Above all, the technology for which risks are communicated and its acceptance mattered strongly: respondents in the shale gas condition found the identical risk communication less trustworthy and more concerning than in the DGE conditions. They also liked the risk communication overall less. For practitioners in DGE or shale gas projects, the study shows that the public would appreciate efforts in describing LPHC risks with numbers and optionally risk comparisons. However, there seems to be a trade-off between aiming for transparency by disclosing uncertainty and limited expert confidence, and thereby decreasing clarity and increasing concern in the view of the public. © 2017 Society for Risk Analysis.
Moore, Thomas E.; Pitman, Janet K.; Moore, Thomas E.; Gautier, D.L.
2018-01-26
The Jan Mayen Microcontinent encompasses a rectangular, mostly submarine fragment of continental crust that lies north of Iceland in the middle of the North Atlantic Ocean. These continental rocks were rifted away from the eastern margin of Greenland as a consequence of a westward jump of spreading centers from the now-extinct Aegir Ridge to the currently active Kolbeinsey Ridge in the Oligocene and early Miocene. The microcontinent is composed of the high-standing Jan Mayen Ridge and a series of smaller ridges that diminish southward in elevation and includes several deep basins that are underlain by strongly attenuated continental crust. The geology of this area is known principally from a loose collection of seismic reflection and refraction lines and several deep-sea scientific drill cores. The Jan Mayen Microcontinent petroleum province encompasses the entire area of the microcontinent and was defined as a single assessment unit (AU). Although its geology is poorly known, the microcontinent is thought to consist of late Paleozoic and Mesozoic rift basin stratigraphic sequences similar to those of the highly prospective Norwegian, North Sea, and Greenland continental margins. The prospectivity of the AU may be greatly diminished, however, by pervasive extensional deformation, basaltic magmatism, and exhumation that accompanied two periods of continental rifting and breakup in the Paleogene and early Neogene. The overall probability of at least one petroleum accumulation of >50 million barrels of oil equivalent was judged to be 5.6 percent. As a consequence of the low level of probability, a quantitative assessment of this AU was not conducted.
DeFisher, Luke E.; Bonter, David N.
2013-01-01
Various invasive ant species have negatively affected reproductive success in birds by disrupting nest site selection, incubation patterns, food supply, and by direct predation on nestlings. Impacts can be particularly severe when non-native ants colonize seabird nesting islands where thousands of birds may nest in high densities on the ground or in burrows or crevices. Here we report on the first documented effects of Myrmica rubra, the European fire ant, on the reproduction of birds in its non-native range. We documented herring gulls (Larus argentatus) on Appledore Island, Maine, engaging in more erratic incubation behaviors at nests infested by the ants. Newly hatched chicks in some nests were swarmed by ants, leading to rapid chick death. Due to high overall rates of chick mortality, survival probabilities did not vary between nests with and without ant activity; however, chick growth rates were slower at nests with ants than at ant-free nests. Ant infestation likely leads to longer-term fitness consequences because slower growth rates early in life may ultimately lead to lower post-fledging survival probabilities. PMID:23691168
Integrating resource selection information with spatial capture–recapture
Royle, J. Andrew; Chandler, Richard B.; Sun, Catherine C.; Fuller, Angela K.
2013-01-01
Finally, we find that SCR models using standard symmetric and stationary encounter probability models may not fully explain variation in encounter probability due to space usage, and therefore produce biased estimates of density when animal space usage is related to resource selection. Consequently, it is important that space usage be taken into consideration, if possible, in studies focused on estimating density using capture–recapture methods.
[Ageing and work: technical standards].
De Vito, G; Riva, M A; Meroni, R; Cesana, G C
2010-01-01
Over the last few years, studies on the relationship between ageing and work have attracted growing interest due to the increased probability among workers of developing major health problems as a consequence of ageing of the working population. Negative outcomes for health are possible when an age-related imbalance appears between physical workload and physical work capacity. Interventions based on workload reductions should help to keep workers on the job for as long as allowed by law. Reference masses by age and sex are suggested by the technical standards of the ISO 11228 series, which are also quoted by Italian law D.Lgs. 81/2008, and EN 1005 series, which recommend limits valid also for manual material handling, and pushing and pulling. Decreasing low back pain prevalence or recurrence, in an ageing population with high prevalence of back disorders, could be more effective than many other approaches to enhance workers' quality of life and consequently maintain and improve workers' performance.
Fire and aquatic ecosystems in forested biomes of North America
Gresswell, Robert E.
1999-01-01
Synthesis of the literature suggests that physical, chemical, and biological elements of a watershed interact with long-term climate to influence fire regime, and that these factors, in concordance with the postfire vegetation mosaic, combine with local-scale weather to govern the trajectory and magnitude of change following a fire event. Perturbation associated with hydrological processes is probably the primary factor influencing postfire persistence of fishes, benthic macroinvertebrates, and diatoms in fluvial systems. It is apparent that salmonids have evolved strategies to survive perturbations occurring at the frequency of wildland fires (10^0–10^2 years), but local populations of a species may be more ephemeral. Habitat alteration probably has the greatest impact on individual organisms and local populations that are the least mobile, and reinvasion will be most rapid by aquatic organisms with high mobility. It is becoming increasingly apparent that during the past century fire suppression has altered fire regimes in some vegetation types, and consequently, the probability of large stand-replacing fires has increased in those areas. Current evidence suggests, however, that even in the case of extensive high-severity fires, local extirpation of fishes is patchy, and recolonization is rapid. Lasting detrimental effects on fish populations have been limited to areas where native populations have declined and become increasingly isolated because of anthropogenic activities. A strategy of protecting robust aquatic communities and restoring aquatic habitat structure and life history complexity in degraded areas may be the most effective means for ensuring the persistence of native biota where the probability of large-scale fires has increased.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bustamante, Paco, E-mail: pbustama@univ-lr.fr; Carravieri, Alice; Centre d’Etudes Biologiques de Chizé
Hg can affect the physiology of seabirds and ultimately their demography, particularly if they are top consumers. In the present study, body feathers of >200 wandering albatrosses from Possession Island in the Crozet archipelago were used to explore the potential demographic effects of long-term exposure to Hg on an apex predator. Variations of Hg with sex, age class, foraging habitat (inferred from δ13C values), and feeding habits (inferred from δ15N values) were examined, as well as the influence of Hg on current breeding output, long-term fecundity and survival. Wandering albatrosses displayed among the highest Hg feather concentrations reported for seabirds, ranging from 5.9 to 95 µg g^-1, as a consequence of their high trophic position (δ15N values). These concentrations fall within the same range as those of other wandering albatross populations from subantarctic sites, suggesting that this species has similar exposure to Hg all around the Southern Ocean. In both immature and adult albatrosses, females had higher Hg concentrations than males (28 vs. 20 µg g^-1 dw on average, respectively), probably as a consequence of females foraging at lower latitudes than males (δ13C values). Hg concentrations were higher in immature than in adult birds, and they remained fairly constant across a wide range of ages in adults. Such high levels in immature individuals raise questions about (i) the frequency of moult in young birds, (ii) the efficiency of Hg detoxification processes in immatures compared to adults, and (iii) importantly, the potential detrimental effects of Hg in early life. Despite very high Hg concentrations in their feathers, no effects were detected on adults' breeding probability, hatching failure, fledgling failure, or survival rate, suggesting that long-term bioaccumulated Hg was not under a chemical form leading to deleterious effects on reproductive parameters in adult individuals.
Highlights: • Immature albatrosses had higher feather Hg concentrations than adults. • Foraging habitat influenced Hg bioaccumulation as a result of male and female segregation. • No carry-over effects were detected on reproductive parameters.
Long-Term Evolution of the Sun and our Biosphere: Causes and Effects?
NASA Astrophysics Data System (ADS)
Des Marais, D. J.
2000-05-01
The course of early biological evolution felt the environmental consequences of changes in the solar output (discussed here), as well as long-term decreases in planetary heat flow and the flux of extraterrestrial impactors. A large, early UV flux fueled the photodissociation of atmospheric water vapor, sustaining a significant hydrogen flux to space. This flux caused Earth's crust to become oxidized relative to its mantle. Accordingly, reduced gases and aqueous solutes that were erupted volcanically into the relatively more oxidized surface environment created sources of chemical redox energy for the origin and early evolution of life. Although the solar constant has increased some 30 percent over Earth's lifetime, the oceans have remained remarkably stable for more than 3.8 billion years. Thus, very effective climate regulation was probably achieved by decreasing over time the atmospheric inventories of greenhouse gases such as carbon dioxide and methane. Such decreases probably had major consequences for the biosphere. Substantial early marine bicarbonate and carbon dioxide inventories sustained abundant abiotic precipitation of carbonates, with consequences for the stability and habitability of key aqueous environments. A long-term decline in carbon dioxide levels increased the bioenergetic requirements for carbon dioxide acquisition as well as other aspects of the physiology of photosynthetic microorganisms. The long-term trend of global mean surface temperature is still debated, as is the role of the sun's evolution in that trend. Future increases in the solar constant will drive atmospheric carbon dioxide levels down further, challenging plants to cope with ever-dwindling concentrations of carbon substrates. Climate regulation will be achieved by modulating an increasing abundance of high-albedo water vapor clouds. Future biological evolution defies precise prediction; however, it is certain that the sun's continuing evolution will play a key role.
Guerrero, Arnoldo; Embid, Cristina; Isetta, Valentina; Farre, Ramón; Duran-Cantolla, Joaquin; Parra, Olga; Barbé, Ferran; Montserrat, Josep M; Masa, Juan F
2014-08-01
Obstructive sleep apnea (OSA) diagnosis using simplified methods such as portable sleep monitoring (PM) is only recommended in patients with a high pretest probability. The aim was to determine the diagnostic efficacy, consequent therapeutic decision-making, and costs of OSA diagnosis using polysomnography (PSG) versus three consecutive studies of PM in patients with mild to moderate suspicion of sleep apnea or with comorbidity that can mask OSA symptoms. Randomized, blinded, crossover study of 3 nights of PM (3N-PM) versus PSG. The diagnostic efficacy was evaluated with receiver operating characteristic (ROC) curves. Therapeutic decisions to assess concordance between the two different approaches were made by sleep physicians and respiratory physicians (staff and residents) using agreement level and kappa coefficient. The costs of each diagnostic strategy were considered. Fifty-six patients were selected. The mean Epworth Sleepiness Scale score was 10.1 (5.3) points. A Bland-Altman plot for the apnea-hypopnea index (AHI) showed good agreement. ROC curves showed the best area under the curve in patients with PSG AHI ≥ 5 [0.955 (confidence interval = 0.862-0.993)]. For a PSG AHI ≥ 5, a PM AHI of 5 would effectively exclude and confirm OSA diagnosis. For a PSG AHI ≥ 15, a PM AHI ≥ 22 would confirm and a PM AHI < 7 would exclude OSA. The best agreement of therapeutic decisions was achieved by the sleep medicine specialists (81.8%). The best cost-diagnostic efficacy was obtained by the 3N-PM. Three consecutive nights of portable monitoring at home evaluated by a qualified sleep specialist is useful for the management of patients without high pretest probability of obstructive sleep apnea or with comorbidities. http://www.clinicaltrials.gov, registration number: NCT01820156. Guerrero A, Embid C, Isetta V, Farre R, Duran-Cantolla J, Parra O, Barbé F, Montserrat JM, Masa JF.
Management of sleep apnea without high pretest probability or with comorbidities by three nights of portable sleep monitoring.
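The two-threshold rule reported above (for detecting PSG AHI ≥ 15, a PM AHI ≥ 22 confirms OSA while a PM AHI < 7 excludes it) can be sketched as a small triage function. The cutoffs are the ones quoted in the abstract; the function name and the wording of the indeterminate case are illustrative assumptions.

```python
def classify(pm_ahi, exclude_below=7, confirm_at=22):
    """Apply the rule-out / rule-in PM cutoffs for detecting PSG AHI >= 15."""
    if pm_ahi < exclude_below:
        return "OSA excluded"
    if pm_ahi >= confirm_at:
        return "OSA confirmed"
    return "indeterminate: refer for PSG"

print(classify(4))   # OSA excluded
print(classify(30))  # OSA confirmed
print(classify(15))  # indeterminate: refer for PSG
```

Patients falling between the two cutoffs are left undecided, which is why the study pairs portable monitoring with review by a qualified sleep specialist.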
Historical precedence and technical requirements of biological weapons use : a threat assessment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Estes, Daniel P.; Vogel, Kathleen Margaret; Gaudioso, Jennifer Marie
2004-05-01
The threat from biological weapons is assessed through both a comparative historical analysis of the patterns of biological weapons use and an assessment of the technological hurdles to proliferation and use that must be overcome. The history of biological weapons is studied to learn how agents have been acquired and what types of states and substate actors have used agents. Substate actors have generally been more willing than states to use pathogens and toxins and they have focused on those agents that are more readily available. There has been an increasing trend of bioterrorism incidents over the past century, but states and substate actors have struggled with one or more of the necessary technological steps. These steps include acquisition of a suitable agent, production of an appropriate quantity and form, and effective deployment. The technological hurdles associated with the steps present a real barrier to producing a high consequence event. However, the ever increasing technological sophistication of society continually lowers the barriers, resulting in a low but increasing probability of a high consequence bioterrorism event.
Oblique impacts: Catastrophic vs. protracted effects
NASA Technical Reports Server (NTRS)
Schultz, P. H.
1988-01-01
Proposed impacts as the cause of biologic catastrophes at the end of the Cretaceous and Eocene face several enigmas: protracted extinctions, even prior to the stratigraphic cosmogenic signature; widespread but non-uniform dispersal of the meteoritic component; absence of a crater of sufficient size; and evidence for massive intensive fires. Various hypotheses provide reasonable mechanisms for mass mortalities: global cooling by continental impact sites; global warming by oceanic impact sites; contrasting effects of asteroidal, cometary, and even multiple impacts; and stress on an already fragile global environment. Yet not every known large impact is associated with a major biologic catastrophe. An alternative is expanded: the consequences of an oblique impact. The most probable angle of impact is 45 deg, with the probability decreasing for smaller angles: a vertical impact is as rare as a tangential one, with an impact angle of 5 deg or less occurring only 8 percent of the time. Consequently, a low-angle impact is a rare but probable event. Laboratory experiments at the NASA-Ames Vertical Gun Range reveal important information about cratering efficiency, impact vaporization, projectile dispersal, and phenomenology, thereby providing perspective for possible consequences of such an impact on both the Earth and Moon. Oblique impacts are rare but certain events through geologic time: a 5 deg impact by a 2 km-diameter impactor on the Earth would occur only once in about 18 my, and by a 10 km-diameter impactor once in about 450 my. Major life extinctions beginning prior to the stratigraphic cosmogenic signature, or protracted extinctions seemingly too long after the proposed event, may not be evidence against an impact as a cause but evidence for a more complex but probable sequence of events.
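The angle statistics quoted above follow from the standard result for an isotropic impactor flux, dP = sin(2θ) dθ, whose cumulative form is P(≤ θ) = sin²θ. The sketch below assumes that textbook distribution; it is not taken from this report.

```python
import math

def impact_angle_cdf(theta_deg):
    """P(impact angle <= theta) for an isotropic flux: dP = sin(2*theta) dtheta."""
    return math.sin(math.radians(theta_deg)) ** 2

# The density sin(2*theta) peaks at 45 deg, the most probable impact angle,
# and half of all impacts arrive at 45 deg or shallower:
print(impact_angle_cdf(45))  # 0.5 (up to floating-point error)
print(impact_angle_cdf(15))  # ~0.067: shallow, near-tangential impacts are rare
```

The symmetry of sin(2θ) about 45 deg is why the abstract can say a vertical impact is as rare as a tangential one.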
A Model for Risk Analysis of Oil Tankers
NASA Astrophysics Data System (ADS)
Montewka, Jakub; Krata, Przemysław; Goerland, Floris; Kujala, Pentti
2010-01-01
The paper presents a model for risk analysis regarding marine traffic, with emphasis on the two most common types of marine accident: collision and grounding. The focus is on oil tankers as these pose the highest environmental risk. A case study in selected areas of the Gulf of Finland in ice-free conditions is presented. The model utilizes a well-founded formula for risk calculation, which combines the probability of an unwanted event with its consequences. The model is thus a block-type model, consisting of blocks estimating the probability of collision and grounding, respectively, as well as blocks modelling the consequences of an accident. The probability of a vessel colliding is assessed by means of a Minimum Distance To Collision (MDTC) based model. The model defines the collision zone in a novel way, using a mathematical ship motion model, and recognizes traffic flow as a non-homogeneous process. The presented calculations address waterways crossing between Helsinki and Tallinn, where dense cross traffic during certain hours is observed. For assessment of grounding probability, a new approach is proposed, which utilizes a newly developed model where spatial interactions between objects in different locations are recognized. A ship at a seaway and navigational obstructions may be perceived as interacting objects, and their repulsion may be modelled by a deterministic formulation. Risk due to tankers running aground addresses an approach fairway to an oil terminal in Sköldvik, near Helsinki. The consequences of an accident are expressed in monetary terms, and concern costs of an oil spill, based on statistics of compensations claimed from the International Oil Pollution Compensation Funds (IOPC Funds) by parties involved.
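The well-founded formula the abstract refers to is the classical definition risk = probability × consequence, summed over accident types. A minimal sketch, with placeholder probabilities and spill costs that are not values from the study:

```python
# total risk = sum over accident types of P(event) * consequence cost.
# Probabilities and costs below are illustrative placeholders only.
scenarios = {
    "collision": {"probability": 1.2e-4, "cost_eur": 40e6},
    "grounding": {"probability": 8.0e-5, "cost_eur": 25e6},
}

total_risk = sum(s["probability"] * s["cost_eur"] for s in scenarios.values())
print(f"expected annual loss: {total_risk:.0f} EUR")  # 6800 EUR
```

Each block of the paper's model refines one factor of this product: the MDTC model supplies the collision probability, the ship-obstruction interaction model supplies the grounding probability, and IOPC Funds claim statistics supply the cost term.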
Estes, J A; Doak, D F; Springer, A M; Williams, T M
2009-06-27
Populations of sea otters, seals and sea lions have collapsed across much of southwest Alaska over the past several decades. The sea otter decline set off a trophic cascade in which the coastal marine ecosystem underwent a phase shift from kelp forests to deforested sea urchin barrens. This interaction in turn affected the distribution, abundance and productivity of numerous other species. Ecological consequences of the pinniped declines are largely unknown. Increased predation by transient (marine mammal-eating) killer whales probably caused the sea otter declines and may have caused the pinniped declines as well. Springer et al. proposed that killer whales, which purportedly fed extensively on great whales, expanded their diets to include a higher percentage of sea otters and pinnipeds following a sharp reduction in great whale numbers from post World War II industrial whaling. Critics of this hypothesis claim that great whales are not now and probably never were an important nutritional resource for killer whales. We used demographic/energetic analyses to evaluate whether or not a predator-prey system involving killer whales and the smaller marine mammals would be sustainable without some nutritional contribution from the great whales. Our results indicate that while such a system is possible, it could only exist under a narrow range of extreme conditions and is therefore highly unlikely.
Sussenbach, Samanta Pereira; Silva, Everton Nunes; Pufal, Milene Amarante; Casagrande, Daniela Shan; Padoin, Alexandre Vontobel; Mottin, Cláudio Corá
2014-01-01
Background: Because of the high prevalence of obesity, there is a growing demand for bariatric surgery worldwide. The objective of this systematic review was to analyze the difference in cost-effectiveness between the laparoscopic and laparotomic access routes for Roux-en-Y gastric bypass (RYGB). Methods: A systematic review was conducted in the electronic databases MEDLINE, Embase, Scopus, Cochrane and Lilacs in order to identify economic evaluation studies that compare the cost-effectiveness of laparoscopic and laparotomic routes in RYGB. Results: Of a total of 494 articles, only 6 fulfilled the eligibility criteria. All studies were published between 2001 and 2008 in the United States (USA). Three studies fulfilled less than half of the items used to evaluate the quality of results; two satisfied 5 of the required items, and only 1 study fulfilled 7 of 10 items. The economic evaluations alternated between cost-effectiveness and cost-consequence. Five studies considered laparoscopic surgery the dominant strategy, because it showed greater clinical benefit (lower probability of post-surgical complications, shorter hospitalization time) and lower total cost. Conclusion: This review indicates that laparoscopy is a safe and well-tolerated technique, even though the costs of surgery are higher compared with laparotomy. However, the additional costs are compensated by the lower probability of complications after surgery and, consequently, the avoidance of their costs. PMID:24945704
Beaussier, Anne-Laure; Demeritt, David; Griffiths, Alex; Rothstein, Henry
2016-01-01
In this paper, we examine why risk-based policy instruments have failed to improve the proportionality, effectiveness, and legitimacy of healthcare quality regulation in the National Health Service (NHS) in England. Rather than trying to prevent all possible harms, risk-based approaches promise to rationalise and manage the inevitable limits of what regulation can hope to achieve by focusing regulatory standard-setting and enforcement activity on the highest priority risks, as determined through formal assessments of their probability and consequences. As such, risk-based approaches have been enthusiastically adopted by healthcare quality regulators over the last decade. However, by drawing on historical policy analysis and in-depth interviews with 15 high-level UK informants in 2013–2015, we identify a series of practical problems in using risk-based policy instruments for defining, assessing, and ensuring compliance with healthcare quality standards. Based on our analysis, we go on to consider why, despite a succession of failures, healthcare regulators remain committed to developing and using risk-based approaches. We conclude by identifying several preconditions for successful risk-based regulation: goals must be clear and trade-offs between them amenable to agreement; regulators must be able to reliably assess the probability and consequences of adverse outcomes; regulators must have a range of enforcement tools that can be deployed in proportion to risk; and there must be political tolerance for adverse outcomes. PMID:27499677
Pet-Armacost, J J; Sepulveda, J; Sakude, M
1999-12-01
The US Department of Transportation was interested in the risks associated with transporting hydrazine in tanks with and without relief devices. Hydrazine is highly toxic, flammable, and corrosive. Consequently, there was a conflict as to whether a relief device should be used or not. Data were not available on the impact of relief devices on release probabilities or the impact of hydrazine on the likelihood of fires and explosions. In this paper, a Monte Carlo sensitivity analysis of the unknown parameters was used to assess the risks associated with highway transport of hydrazine. To help determine whether or not relief devices should be used, fault trees and event trees were used to model the sequences of events that could lead to adverse consequences during transport of hydrazine. The event probabilities in the event trees were derived as functions of the parameters whose effects were not known. The impacts of these parameters on the risk of toxic exposures, fires, and explosions were analyzed through a Monte Carlo sensitivity analysis and statistically through an analysis of variance. The analysis allowed the determination of which of the unknown parameters had a significant impact on the risks. It also provided the necessary support for a critical transportation decision even though the values of several key parameters were not known.
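A minimal sketch of the Monte Carlo sensitivity step described above: the unknown branch probabilities of the event tree are sampled from assumed ranges, and each draw is propagated through a toy two-branch tree. The parameter ranges and tree structure are illustrative, not the study's.

```python
import random

def release_and_fire(p_relief_fail, p_ignite):
    """Toy event tree: a release requires relief-device failure; a fire
    additionally requires ignition of the released hydrazine."""
    p_release = p_relief_fail
    return p_release, p_release * p_ignite

random.seed(0)  # reproducible draws
fires = [release_and_fire(random.uniform(1e-4, 1e-2),   # relief failure prob.
                          random.uniform(0.05, 0.5))[1]  # ignition prob.
         for _ in range(10_000)]
mean_fire = sum(fires) / len(fires)
print(f"mean fire probability over sampled parameters: {mean_fire:.2e}")
```

An analysis of variance over such draws, as in the paper, then shows which unknown parameter actually drives the spread in the computed risk.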
Elasticity of demand for water in Khartoum, Sudan.
Cairncross, S; Kinnear, J
1992-01-01
A survey of the quantities of water purchased from vendors in the squatter areas of Khartoum, Sudan, was used to assess the effect of the price charged for water and of household income on domestic water consumption. Households in two squatter communities--Meiyo and Karton Kassala--were studied by observation and by interview. In spite of the substantially higher charges, water consumption in Karton Kassala was as high as that in Meiyo. Households within these communities showed no tendency to use less water when paying a higher price for it, or when their income was below average. In other words, no price elasticity or income elasticity was detectable. This was all the more striking in view of the high proportion of income that was spent on water; 17% in Meiyo, and 56% in Karton Kassala. One consequence of this lack of elasticity is that the poorest households devote the greatest percentage of their income to the purchase of water, although the only major item in their household budget which can be sacrificed to make this possible is food. The high price of water in urban Sudan is probably a major cause of the malnutrition prevalent in the squatter areas. Another consequence is that a low-income household's consumer surplus for domestic water is very high, amounting to a substantial proportion of its total income. This has important consequences for the economic appraisal of urban water supply schemes. It also follows that wealthier households with private connections would be willing to pay at least as much for water as that currently paid by the poor.
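The elasticity the survey failed to detect is the percentage change in quantity demanded per percentage change in price. A sketch using the midpoint (arc) formula, with made-up quantities chosen to mimic the study's finding of near-zero elasticity:

```python
def arc_elasticity(q1, q2, p1, p2):
    """Midpoint formula: % change in quantity / % change in price."""
    dq = (q2 - q1) / ((q1 + q2) / 2)
    dp = (p2 - p1) / ((p1 + p2) / 2)
    return dq / dp

# Doubling the price while consumption stays nearly constant:
e = arc_elasticity(q1=100, q2=98, p1=1.0, p2=2.0)
print(round(e, 3))  # close to 0 => inelastic demand
```

An elasticity near zero means price rises translate almost one-for-one into higher spending, which is why the poorest households end up devoting over half their income to water.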
A meta-analytic review of two modes of learning and the description-experience gap.
Wulff, Dirk U; Mergenthaler-Canseco, Max; Hertwig, Ralph
2018-02-01
People can learn about the probabilistic consequences of their actions in two ways: One is by consulting descriptions of an action's consequences and probabilities (e.g., reading up on a medication's side effects). The other is by personally experiencing the probabilistic consequences of an action (e.g., beta testing software). In principle, people taking each route can reach analogous states of knowledge and consequently make analogous decisions. In the last dozen years, however, research has demonstrated systematic discrepancies between description- and experience-based choices. This description-experience gap has been attributed to factors including reliance on a small set of experiences, the impact of recency, and different weighting of probability information in the two decision types. In this meta-analysis focusing on studies using the sampling paradigm of decisions from experience, we evaluated these and other determinants of the description-experience gap by reference to more than 70,000 choices made by more than 6,000 participants. We found, first, a robust description-experience gap but also a key moderator, namely, problem structure. Second, the largest determinant of the gap was reliance on small samples and the associated sampling error: free to terminate search, individuals explored too little to experience all possible outcomes. Third, the gap persisted when sampling error was basically eliminated, suggesting other determinants. Fourth, the occurrence of recency was contingent on decision makers' autonomy to terminate search, consistent with the notion of optional stopping. Finally, we found indications of different probability weighting in decisions from experience versus decisions from description when the problem structure involved a risky and a safe option. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
Transition from Terrorist Event Management to Consequence Management, Executive Summary
1982-03-31
"This report has been reviewed in the Federal Emergency Management Agency and approved for publication. Approval does not signify that the contents...acts. It describes in general terms what mechanism is in place today. This description is derived from documentation reviewed and information obtained...probabilities was employed in the development of the values expressed in the matrices. Probabilities were established by a review of several previous
Abad, Francisco; de la Morena-Barrio, María Eugenia; Fernández-Breis, Jesualdo Tomás; Corral, Javier
2018-06-01
Translation is a key biological process controlled in eukaryotes by the initiation AUG codon. Variations affecting this codon may have pathological consequences by disturbing the correct initiation of translation. Unfortunately, there is no systematic study describing these variations in the human genome. Moreover, we aimed to develop new tools for in silico prediction of the pathogenicity of gene variations affecting AUG codons, because to date these gene defects have been wrongly classified as missense. Whole-exome analysis revealed a mean of 12 gene variations per person affecting initiation codons, mostly with high (> 0.01) minor allele frequency (MAF). Moreover, analysis of Ensembl data (December 2017) revealed 11,261 genetic variations affecting the initiation AUG codon of 7,205 genes. Most of these variations (99.5%) have low or unknown MAF, probably reflecting deleterious consequences. Only 62 variations had high MAF. Genetic variations with high MAF had closer alternative downstream AUG codons than did those with low MAF. In addition, the high-MAF group better maintained both the signal peptide and the reading frame. These differentiating elements could help to determine the pathogenicity of this kind of variation. Data and scripts in Perl and R are freely available at https://github.com/fanavarro/hemodonacion. Contact: jfernand@um.es. Supplementary data are available at Bioinformatics online.
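The abstract's two differentiating elements, distance to the nearest alternative downstream AUG and whether it preserves the reading frame, can be checked with a few lines of code. The helper below and its toy sequence are illustrative, not taken from the authors' Perl/R scripts:

```python
def next_aug(cds):
    """Return (offset, in_frame) for the first AUG after position 0,
    scanning a DNA-alphabet coding sequence."""
    for i in range(1, len(cds) - 2):
        if cds[i:i + 3] == "ATG":
            return i, i % 3 == 0
    return None, False

cds = "ATGGCCATGTTTAAA"  # toy coding sequence with a disrupted start in mind
offset, in_frame = next_aug(cds)
print(offset, in_frame)  # 6 True: alternative start 2 codons downstream, in frame
```

A nearby, in-frame alternative AUG (as here) is the pattern the study found enriched among high-MAF, presumably tolerated variants.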
ANTECEDENT VERSUS CONSEQUENT EVENTS AS PREDICTORS OF PROBLEM BEHAVIOR
Camp, Erin M; Iwata, Brian A; Hammond, Jennifer L; Bloom, Sarah E
2009-01-01
Comparisons of results from descriptive and functional analyses of problem behavior generally have shown poor correspondence. Most descriptive analyses have focused on relations between consequent events and behavior, and it has been noted that attention is a common consequence for problem behavior even though it may not be a functional reinforcer. Because attention may be prescribed simply as a means of stopping serious problem behavior, it is possible that naturally occurring antecedent events (establishing operations) might be better predictors of problem behavior than consequences. We conducted descriptive and functional analyses of the problem behaviors of 7 participants. Conditional probabilities based on combined antecedent and consequent events showed correspondence with the functional analysis data for 4 of the 7 participants, but antecedent events were no better than consequent events in identifying the function of problem behavior. PMID:19949538
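A descriptive analysis of the kind compared above computes conditional probabilities of behavior from an observation log. The log below is fabricated for illustration; it shows both directions of conditioning, on an antecedent event and on a consequent event.

```python
# Each tuple records: antecedent present, problem behavior occurred,
# attention followed the behavior. Fabricated observation intervals.
log = [
    (True, True, True), (True, True, False), (True, False, False),
    (False, True, True), (False, False, False), (False, False, False),
]

def conditional(event, given):
    """Estimate P(event | given) from the log."""
    relevant = [row for row in log if given(row)]
    return sum(1 for row in relevant if event(row)) / len(relevant)

p_behavior_given_antecedent = conditional(lambda r: r[1], lambda r: r[0])
p_attention_given_behavior = conditional(lambda r: r[2], lambda r: r[1])
print(round(p_behavior_given_antecedent, 2))  # 0.67
print(round(p_attention_given_behavior, 2))   # 0.67
```

The study's point is that a high value of the second quantity does not establish attention as a functional reinforcer, and that combining both kinds of conditioning improved correspondence with functional analyses for only some participants.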
Assessing the consequences of unrealistic optimism: Challenges and recommendations.
Shepperd, James A; Pogge, Gabrielle; Howell, Jennifer L
2017-04-01
Of the hundreds of studies published on unrealistic optimism (i.e., expecting a better personal future than is reasonably likely), most have focused on demonstrating the phenomenon, examining boundary conditions, or documenting causes. Few studies have examined the consequences of unrealistic optimism. In this article, we provide an overview of the measurement of unrealistic optimism, review possible consequences, and identify numerous challenges confronting investigators attempting to understand the consequences. Assessing the consequences of unrealistic optimism is tricky, and ultimately probably impossible when researchers assess unrealistic optimism at the group level (which reveals if a group of people is displaying unrealistic optimism on average) rather than the individual level (which reveals whether a specific individual displays unrealistic optimism). We offer recommendations to researchers who wish to examine the consequences of unrealistic optimism. Copyright © 2016 Elsevier Inc. All rights reserved.
Contraceptive failure in the United States
Trussell, James
2013-01-01
This review provides an update of previous estimates of first-year probabilities of contraceptive failure for all methods of contraception available in the United States. Estimates are provided of probabilities of failure during typical use (which includes both incorrect and inconsistent use) and during perfect use (correct and consistent use). The difference between these two probabilities reveals the consequences of imperfect use; it depends both on how unforgiving of imperfect use a method is and on how hard it is to use that method perfectly. These revisions reflect new research on contraceptive failure both during perfect use and during typical use. PMID:21477680
Acrocentric chromosome associations in man.
Jacobs, P A; Mayer, M; Morton, N E
1976-01-01
Heterogeneity among chromosomes was found to be a highly significant source of variation for association proportions, while culture, slide, and observer were negligible sources of variation for association proportions although important for numbers of associations. The consequences of these results for tests of group differences are discussed. It seems evident that each pair of acrocentric chromosomes has its own characteristic probability of entering into association. This is presumably a combination of the probability for each individual member of the pair, a proposition easily tested utilizing acrocentric chromosomes carrying polymorphisms which allow each member of the pair to be individually recognized. A mathematical theory for pairwise satellite association was developed and shown to fit observations on banded chromosomes. While we found very significant heterogeneity among individuals in the frequency with which different chromosomes entered into associations, there was no significant evidence for preferential association between any particular chromosomes, either heterologous or homologous. This finding in our material of apparently random associations between different chromosomes is contrary to claims made by other investigators and should be tested on other material. No correlation was found between the phenotype of the chromosome, as judged by cytogenetic polymorphisms, and its probability of association. PMID:795295
Gomez-Lazaro, Emilio; Bueso, Maria C.; Kessler, Mathieu; ...
2016-02-02
Here, the Weibull probability distribution has been widely applied to characterize wind speeds for wind energy resources. Wind power generation modeling is different, however, due in particular to power curve limitations, wind turbine control methods, and transmission system operation requirements. These differences are even greater for aggregated wind power generation in power systems with high wind penetration. Consequently, models based on a single Weibull component can provide poor characterizations of aggregated wind power generation. With this aim, the present paper focuses on Weibull mixtures to characterize the probability density function (PDF) of aggregated wind power generation. PDFs of wind power data are first classified according to hourly and seasonal patterns. The selection of the number of components in the mixture is analyzed through two well-known criteria: the Akaike information criterion (AIC) and the Bayesian information criterion (BIC). Finally, the optimal number of Weibull components for maximum likelihood is explored for the defined patterns, including the estimated weight, scale, and shape parameters. Results show that multi-Weibull models are more suitable for characterizing aggregated wind power data due to the impact of distributed generation, the variety of wind speed values, and wind power curtailment.
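The component-selection step the abstract describes (maximum likelihood plus AIC/BIC) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the synthetic two-component data, the initial guesses, and the Nelder-Mead optimizer are all assumptions made for the example.

```python
# Sketch: choosing the number of Weibull mixture components by maximum
# likelihood with AIC/BIC. Data and initial guesses are illustrative.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(0)
# Synthetic "aggregated wind power" sample: a mixture of two Weibulls.
data = np.concatenate([
    stats.weibull_min.rvs(c=2.0, scale=3.0, size=800, random_state=rng),
    stats.weibull_min.rvs(c=6.0, scale=10.0, size=1200, random_state=rng),
])

def mixture_nll(params, x, k):
    """Negative log-likelihood of a k-component Weibull mixture.
    params: k-1 mixing logits, then k log(shape), log(scale) pairs."""
    w = np.ones(k)
    if k > 1:
        logits = np.append(params[:k - 1], 0.0)
        w = np.exp(logits) / np.exp(logits).sum()
    sh_sc = np.exp(params[k - 1:]).reshape(k, 2)
    pdf = sum(w[i] * stats.weibull_min.pdf(x, c=sh_sc[i, 0], scale=sh_sc[i, 1])
              for i in range(k))
    return -np.sum(np.log(pdf + 1e-300))

def fit_mixture(x, k, init):
    res = optimize.minimize(mixture_nll, init, args=(x, k),
                            method="Nelder-Mead", options={"maxiter": 5000})
    n_par = (k - 1) + 2 * k          # mixing weights + (shape, scale) per component
    aic = 2 * n_par + 2 * res.fun    # AIC = 2k - 2 ln L
    bic = n_par * np.log(len(x)) + 2 * res.fun
    return aic, bic

aic1, bic1 = fit_mixture(data, 1, np.log([2.0, 5.0]))
aic2, bic2 = fit_mixture(data, 2, np.concatenate([[0.0], np.log([2.0, 3.0, 6.0, 10.0])]))
print(f"AIC: k=1 {aic1:.1f}, k=2 {aic2:.1f}")
print(f"BIC: k=1 {bic1:.1f}, k=2 {bic2:.1f}")
# With genuinely bimodal data, both criteria should prefer k = 2.
```

The BIC penalizes the extra mixture parameters more heavily than the AIC, which matters when hourly or seasonal subsets are small.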
Use of probability analysis to establish routine bioassay screening levels
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carbaugh, E.H.; Sula, M.J.; McFadden, K.M.
1990-09-01
Probability analysis was used by the Hanford Internal Dosimetry Program to establish bioassay screening levels for tritium and uranium in urine. Background environmental levels of these two radionuclides are generally detectable by the highly sensitive urine analysis procedures routinely used at Hanford. Establishing screening levels requires balancing the impact of false detection against the consequence of potentially undetected occupational dose. To establish the screening levels, tritium and uranium analyses were performed on urine samples collected from workers exposed only to environmental sources. All samples were collected at home using a simulated 12-hour collection protocol for tritium and a simulated 24-hour collection protocol for uranium. Results of the analyses of these samples were ranked according to tritium concentration or total sample uranium. The cumulative percentile was calculated and plotted using log-probability coordinates. Geometric means and screening levels corresponding to various percentiles were estimated by graphical interpolation and standard calculations. The potential annual internal dose associated with a screening level was calculated. Screening levels were selected corresponding to the 99.9th percentile, implying that, on average, 1 out of 1000 samples collected from an unexposed worker population would be expected to exceed the screening level. 4 refs., 2 figs.
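A straight line on log-probability paper corresponds to a lognormal distribution, so the graphical procedure above has a simple parametric shortcut: fit a lognormal to the background results and read off the 99.9th-percentile quantile. The sketch below uses that shortcut with hypothetical background values; it is an illustration of the idea, not the Hanford calculation.

```python
# Sketch: lognormal screening level from background bioassay results.
# The background values are hypothetical (arbitrary units).
import math
import statistics

background = [12, 15, 9, 22, 18, 11, 30, 14, 25, 17, 10, 20, 13, 28, 16]

logs = [math.log(x) for x in background]
mu = statistics.fmean(logs)
sigma = statistics.stdev(logs)

gm = math.exp(mu)                 # geometric mean of the background
z_999 = 3.0902                    # standard-normal quantile at 99.9%
screening_level = math.exp(mu + z_999 * sigma)

print(f"geometric mean = {gm:.1f}")
print(f"99.9% screening level = {screening_level:.1f}")
# On average, ~1 in 1000 samples from an unexposed population would
# exceed this level by chance.
```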
[Accidental falls in the elderly].
Heinimann, Niklas B; Kressig, Reto W
2014-06-18
Falls in the elderly are common, with consequent high mortality and morbidity. Recent data focus on the identification and therapy of intrinsic risk factors. Sarcopenia, imbalance and gait disorders represent the major risk factors. Sarcopenia is caused by a disequilibrium of protein synthesis and breakdown, probably as a consequence of age-related changes in protein metabolism. Protein supplements in combination with strength training show the greatest benefit. Disorders in balance and gait are caused by age-related or pathologic changes in a complex regulation system of gait. The individual fall risk correlates with gait variability and increases further with poor dual-task performance. Activities with high requirements of attention and body awareness are the most effective prevention for falls in the elderly (-50%).
Prospect theory on the brain? Toward a cognitive neuroscience of decision under risk.
Trepel, Christopher; Fox, Craig R; Poldrack, Russell A
2005-04-01
Most decisions must be made without advance knowledge of their consequences. Economists and psychologists have devoted much attention to modeling decisions made under conditions of risk, in which options can be characterized by a known probability distribution over possible outcomes. The descriptive shortcomings of classical economic models motivated the development of prospect theory (D. Kahneman, A. Tversky, Prospect theory: An analysis of decision under risk. Econometrica, 47 (1979) 263-291; A. Tversky, D. Kahneman, Advances in prospect theory: Cumulative representation of uncertainty. Journal of Risk and Uncertainty, 5 (4) (1992) 297-323), the most successful behavioral model of decision under risk. In prospect theory, subjective value is modeled by a value function that is concave for gains, convex for losses, and steeper for losses than for gains; the impact of probabilities is characterized by a weighting function that overweights low probabilities and underweights moderate to high probabilities. We outline the possible neural bases of the components of prospect theory, surveying evidence from human imaging, lesion, and neuropharmacology studies as well as animal neurophysiology studies. These results provide preliminary suggestions concerning the neural bases of prospect theory that include a broad set of brain regions and neuromodulatory systems. These data suggest that focused studies of decision making in the context of quantitative models may provide substantial leverage towards a fuller understanding of the cognitive neuroscience of decision making.
Dual Diagnosis and Suicide Probability in Poly-Drug Users.
Youssef, Ismail M; Fahmy, Magda T; Haggag, Wafaa L; Mohamed, Khalid A; Baalash, Amany A
2016-02-01
To determine the frequency of suicidal thoughts and suicidal probability among poly-substance abusers in a Saudi population, and to examine the relation between dual diagnosis and suicidal thoughts. Case control study. Al-Baha Psychiatric Hospital, Saudi Arabia, from May 2011 to June 2012. Participants were 239 subjects, aged 18 - 45 years. We reviewed 122 individuals who fulfilled the DSM-IV-TR criteria of substance abuse for two or more substances, and their data were compared with data collected from 117 control persons. Suicidal cases were highly prevalent among poly-substance abusers (64.75%). Amphetamine and cannabis were the most abused substances (87.7% and 70.49%, respectively). A statistically significant association with suicidality was found with longer duration of substance abuse (p < 0.001), use of alcohol (p=0.001), amphetamine (p=0.007), volatile substances (p=0.034), and the presence of comorbid psychiatric disorders (dual diagnosis) such as substance-induced mood disorder (p=0.001), schizo-affective disorder (p=0.017), major depressive disorder (p=0.001), and antisocial (p=0.016) and borderline (p=0.005) personality disorder. Suicidal cases showed significantly higher scores (p < 0.001) on the suicide probability scale and higher scores on the Beck depression inventory (p < 0.001). Abusing certain substances for a long duration, in addition to comorbid psychiatric disorders, especially those with a disturbed-mood element, may trigger suicidal thoughts in poly-substance abusers. Depression and suicide probability are common consequences of substance abuse.
Aldinger, Kyle R.; Wood, Petra B.
2015-01-01
Detection probability during point counts and its associated variables are important considerations for bird population monitoring and have implications for conservation planning by influencing population estimates. During 2008–2009, we evaluated variables hypothesized to be associated with detection probability, detection latency, and behavioral responses of male Golden-winged Warblers in pastures in the Monongahela National Forest, West Virginia, USA. This is the first study of male Golden-winged Warbler detection probability, detection latency, or behavioral response based on point-count sampling with known territory locations and identities for all males. During 3-min passive point counts, detection probability decreased as distance to a male's territory and time since sunrise increased. During 3-min point counts with playback, detection probability decreased as distance to a male's territory increased, but remained constant as time since sunrise increased. Detection probability was greater when point counts included type 2 compared with type 1 song playback, particularly during the first 2 min of type 2 song playback. Golden-winged Warblers primarily use type 1 songs (often zee bee bee bee with a higher-pitched first note) in intersexual contexts and type 2 songs (strident, rapid stutter ending with a lower-pitched buzzy note) in intrasexual contexts. Distance to a male's territory, ordinal date, and song playback type were associated with the type of behavioral response to song playback. Overall, ~2 min of type 2 song playback may increase the efficacy of point counts for monitoring populations of Golden-winged Warblers by increasing the conspicuousness of males for visual identification and offsetting the consequences of surveying later in the morning. Because playback may interfere with the ability to detect distant males, it is important to follow playback with a period of passive listening. 
Our results indicate that even in relatively open pasture vegetation, detection probability of male Golden-winged Warblers is imperfect and highly variable.
A Deterministic Approach to Active Debris Removal Target Selection
NASA Astrophysics Data System (ADS)
Lidtke, A.; Lewis, H.; Armellin, R.
2014-09-01
Many decisions, with widespread economic, political and legal consequences, are being considered based on space debris simulations showing that Active Debris Removal (ADR) may be necessary as concerns about the sustainability of spaceflight increase. The debris environment predictions are based on low-accuracy ephemerides and propagators. This raises doubts about the accuracy of those prognoses themselves, but also about the potential ADR target lists that are produced. Target selection is considered highly important, as removal of many objects will increase the overall mission cost. Selecting the most likely candidates as soon as possible would be desirable, as it would enable accurate mission design and allow thorough evaluation of in-orbit validations, which are likely to occur in the near future, before any large investments are made and implementations realized. One of the primary factors that should be used in ADR target selection is the accumulated collision probability of every object. A conjunction detection algorithm, based on the smart sieve method, has been developed. Another algorithm is then applied to the found conjunctions to compute the maximum and true probabilities of collisions taking place. The entire framework has been verified against the Conjunction Analysis Tools in AGI's Systems Toolkit, and a relative error smaller than 1.5% has been achieved in the final maximum collision probability. Two target lists are produced based on the ranking of the objects according to the probability that they will take part in any collision over the simulated time window. These probabilities are computed using the maximum probability approach, which is time-invariant, and estimates of the true collision probability computed with covariance information. The top-priority targets are compared, and the impacts of the data accuracy and its decay are highlighted. 
General conclusions regarding the importance of Space Surveillance and Tracking for the purpose of ADR are also drawn and a deterministic method for ADR target selection, which could reduce the number of ADR missions to be performed, is proposed.
Counterfactual Assessment of Decoherence in Quantum Systems
NASA Astrophysics Data System (ADS)
Russo, Onofrio; Jiang, Liang
2013-03-01
The quantum Zeno effect occurs when a system is observed at unusually short observation times, t, for which the probability of a transition between different quantum states is known to be proportional to t². This results in a decrease in the probability of transitions between states and a consequent decrease in decoherence. We consider the conditions under which these observations are made counterfactual, to assess whether this results in a significant change in decoherence.
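The t² short-time scaling is what drives the Zeno suppression. A one-line sketch, with τ an illustrative system-dependent timescale (an assumption for this example), shows why frequent observation freezes the transition:

```latex
% Short-time transition probability, and survival after n equally spaced
% measurements over a total time T:
P_{\text{trans}}(t) \approx \left(\frac{t}{\tau}\right)^{2}, \qquad
P_{\text{survive}}(T) \approx \left[\,1 - \left(\frac{T/n}{\tau}\right)^{2}\right]^{n}
\;\xrightarrow{\;n \to \infty\;}\; 1 .
```

Because each interval contributes a loss of order (T/n)², the total loss scales as n · (T/n)² = T²/n and vanishes as the measurements become more frequent.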
Tsujiuchi, Takuya; Yamaguchi, Maya; Masuda, Kazutaka; Tsuchida, Marisa; Inomata, Tadashi; Kumano, Hiroaki; Kikuchi, Yasushi; Augusterfer, Eugene F; Mollica, Richard F
2016-01-01
This study investigated post-traumatic stress symptoms in the population affected by the Fukushima Nuclear Disaster, one year after the disaster. Additionally, we investigated social factors, such as forced displacement, which we hypothesize contributed to the high prevalence of post-traumatic stress. Finally, we report on written narratives that were collected from the impacted population. Using the Impact of Event Scale-Revised (IES-R), questionnaires were sent to 2,011 households of those displaced from Fukushima prefecture living temporarily in Saitama prefecture. Of the 490 replies, 350 met the criteria for inclusion in the study. Multiple logistic regression analysis was performed to examine several characteristics and variables of social factors as predictors of probable post-traumatic stress disorder, PTSD. The mean score of the IES-R was 36.15±21.55, with 59.4% having scores of 30 or higher, thus indicating probable PTSD. No significant differences in percentages of high-risk subjects were found among sex, age, evacuation area, housing damages, tsunami affected, family split-up, and acquaintance support. By the result of multiple logistic regression analysis, the significant predictors of probable PTSD were chronic physical diseases (OR = 1.97), chronic mental diseases (OR = 6.25), worries about livelihood (OR = 2.27), lost jobs (OR = 1.71), lost social ties (OR = 2.27), and concerns about compensation (OR = 3.74). Although there are limitations in assuming a diagnosis of PTSD based on the self-report IES-R, our findings indicate that there was a high risk of PTSD strongly related to the nuclear disaster and its consequent evacuation and displacement. Therefore, recovery efforts must focus not only on medical and psychological treatment alone, but also on social and economic issues related to the displacement, as well.
Patterns of Dating Violence Victimization and Perpetration among Latino Youth.
Reyes, H Luz McNaughton; Foshee, Vangie A; Chen, May S; Ennett, Susan T
2017-08-01
Theory and research suggest that there may be significant heterogeneity in the development, manifestation, and consequences of adolescent dating violence that is not yet well understood. The current study contributed to our understanding of this heterogeneity by identifying distinct patterns of involvement in psychological, physical, and sexual dating violence victimization and perpetration in a sample of Latino youth (n = 201; M = 13.87 years; 42% male), a group that is understudied, growing, and at high risk for involvement in dating violence. Among both boys and girls, latent class analyses identified a three-class solution wherein the largest class demonstrated a low probability of involvement in dating violence across all indices ("uninvolved"; 56% of boys, 64% of girls) and the smallest class demonstrated high probability of involvement in all forms of dating violence except for sexual perpetration among girls and physical perpetration among boys ("multiform aggressive victims"; 10% of boys, 11% of girls). A third class of "psychologically aggressive victims" was identified for which there was a high probability of engaging and experiencing psychological dating violence, but low likelihood of involvement in physical or sexual dating violence (34% of boys, 24% of girls). Cultural (parent acculturation, acculturation conflict), family (conflict and cohesion) and individual (normative beliefs, conflict resolution skills, self-control) risk and protective factors were associated with class membership. Membership in the multiform vs. the uninvolved class was concurrently associated with emotional distress among girls and predicted emotional distress longitudinally among boys. The results contribute to understanding heterogeneity in patterns of involvement in dating violence among Latino youth that may reflect distinct etiological processes.
New normative standards of conditional reasoning and the dual-source model
Singmann, Henrik; Klauer, Karl Christoph; Over, David
2014-01-01
There has been a major shift in research on human reasoning toward Bayesian and probabilistic approaches, which has been called a new paradigm. The new paradigm sees most everyday and scientific reasoning as taking place in a context of uncertainty, and inference is from uncertain beliefs and not from arbitrary assumptions. In this manuscript we present an empirical test of normative standards in the new paradigm using a novel probabilized conditional reasoning task. Our results indicated that, for everyday conditionals with at least a weak causal connection between antecedent and consequent, only the conditional probability of the consequent given the antecedent contributes unique variance to predicting the probability of the conditional, but not the probability of the conjunction, nor the probability of the material conditional. Regarding normative accounts of reasoning, we found significant evidence that participants' responses were confidence preserving (i.e., p-valid in the sense of Adams, 1998) for MP inferences, but not for MT inferences. Additionally, only for MP inferences, and to a lesser degree for DA inferences, did the rate of responses inside the coherence intervals defined by mental probability logic (Pfeifer and Kleiter, 2005, 2010) exceed chance levels. In contrast to the normative accounts, the dual-source model (Klauer et al., 2010) is a descriptive model. It posits that participants integrate their background knowledge (i.e., the type of information primary to the normative approaches) and their subjective probability that a conclusion is seen as warranted based on its logical form. Model fits showed that the dual-source model, which employed participants' responses to a deductive task with abstract contents to estimate the form-based component, provided as good an account of the data as a model that solely used data from the probabilized conditional reasoning task. PMID:24860516
Sovereign cat bonds and infrastructure project financing.
Croson, David; Richter, Andreas
2003-06-01
We examine the opportunities for using catastrophe-linked securities (or equivalent forms of nondebt contingent capital) to reduce the total costs of funding infrastructure projects in emerging economies. Our objective is to elaborate on methods to reduce the necessity for unanticipated (emergency) project funding immediately after a natural disaster. We also place the existing explanations of sovereign-level contingent capital into a catastrophic risk management framework. In doing so, we address the following questions. (1) Why might catastrophe-linked securities be useful to a sovereign nation, over and above their usefulness for insurers and reinsurers? (2) Why are such financial instruments ideally suited for protecting infrastructure projects in emerging economies, under third-party sponsorship, from low-probability, high-consequence events that occur as a result of natural disasters? (3) How can the willingness to pay of a sovereign government in an emerging economy (or its external project sponsor), who values timely completion of infrastructure projects, for such instruments be calculated? To supplement our treatment of these questions, we use a multilayer spreadsheet-based model (in Microsoft Excel format) to calculate the overall cost reductions possible through the judicious use of catastrophe-based financial tools. We also report on numerical comparative statics on the value of contingent-capital financing to avoid project disruption based on varying costs of capital, probability and consequences of disasters, the feasibility of strategies for mid-stage project abandonment, and the timing of capital commitments to the infrastructure investment. We use these results to identify high-priority applications of catastrophe-linked securities so that maximal protection can be realized if the total number of catastrophe instruments is initially limited. The article concludes with potential extensions to our model and opportunities for future research.
Metabolic consequences of Helicobacter pylori infection and eradication
Buzás, György Miklós
2014-01-01
Helicobacter pylori (H. pylori) is still the most prevalent infection in the world. Colonization of the stomach by this agent will invariably induce chronic gastritis, a low-grade inflammatory state leading to local complications (peptic ulcer, gastric cancer, lymphoma) and remote manifestations. While H. pylori does not enter the circulation, these extragastric manifestations are probably mediated by the cytokines and acute phase proteins produced by the inflamed mucosa. The epidemiologic link between H. pylori infection and metabolic changes is inconstant and controversial. Growth delay was described mainly in low-income regions with high prevalence of the infection, where other nutritional and social factors probably contribute to it. The timely eradication of the infection will lead to a more healthy development of the young population, along with preventing peptic ulcers and gastric cancer. An increase of total, low-density lipoprotein and high-density lipoprotein cholesterol levels in some infected people creates an atherogenic lipid profile which could promote atherosclerosis with its complications: myocardial infarction, stroke and peripheral vascular disease. Well designed and adequately powered long-term studies are required to see whether eradication of the infection will prevent these conditions. In the case of glucose metabolism, the most consistent association was found between H. pylori and insulin resistance: again, proof that eradication prevents this common metabolic disturbance is expected. The results of eradication with standard regimens in diabetics are significantly worse than in non-diabetic patients; thus, more active regimens must be found to obtain better results. Successful eradication itself led to an increase of body mass index and cholesterol levels in some populations, while in others no such changes were encountered. Uncertainties about the metabolic consequences of H. pylori infection must be clarified in the future. PMID:24833852
Dispersal and individual quality in a long lived species
Cam, E.; Monnat, J.-Y.; Royle, J. Andrew
2004-01-01
The idea of differences in individual quality has been put forward in numerous long-term studies in long-lived species to explain differences in lifetime production among individuals. Despite the important role of individual heterogeneity in vital rates in demography, population dynamics and life history theory, the idea of 'individual quality' is elusive. It is sometimes assumed to be a static or dynamic individual characteristic. When considered as a dynamic trait, it is sometimes assumed to vary deterministically or stochastically, or to be confounded with the characteristics of the habitat. We addressed heterogeneity in reproductive performance among individuals established in higher-quality habitat in a long-lived seabird species. We used approaches to statistical inference based on individual random effects permitting quantification of heterogeneity in populations and assessment of individual variation from the population mean. We found evidence of heterogeneity in breeding probability, not success probability. We assessed the influence of dispersal on individual reproductive potential. Dispersal is likely to be destabilizing in species with high site and mate fidelity. We detected heterogeneity after dispersal, not before. Individuals may perform well regardless of quality before destabilization, including those that recruited in higher-quality habitat by chance, but only higher-quality individuals may be able to overcome the consequences of dispersal. Importantly, results differed when accounting for individual heterogeneity (an increase in mean breeding probability when individuals dispersed), or not (a decrease in mean breeding probability). In the latter case, the decrease in mean breeding probability may result from a substantial decrease in breeding probability in a few individuals and a slight increase in others. In other words, the pattern observed at the population mean level may not reflect what happens in the majority of individuals.
What is a benefit in relation to food consumption?
Przyrembel, Hildegard; Kleiner, Juliane
2008-08-15
The identification and characterization of benefits as a consequence of consumption of food, food constituents or nutrients used to be neglected in comparison to the assessment of risks because the safety of food had priority. Interest in benefit assessment is the consequence of the realisation that both adverse and positive effects on health can follow the consumption of the same food or food constituent and that a balance between the two should be the aim. Moreover, proven benefits in connection with food are the basis of health related claims on food labels. Benefit assessment should follow a procedure which is parallel to structured risk assessment and apply the same stringent criteria with respect to substantiation. Benefits will consist of either the reduction of the probability of adverse health effects or the increase of the probability of positive health effects.
Martens, Brian K; DiGennaro, Florence D; Reed, Derek D; Szczech, Frances M; Rosenthal, Blair D
2008-01-01
Descriptive assessment methods have been used in applied settings to identify consequences for problem behavior, thereby aiding in the design of effective treatment programs. Consensus has not been reached, however, regarding the types of data or analytic strategies that are most useful for describing behavior–consequence relations. One promising approach involves the analysis of conditional probabilities from sequential recordings of behavior and events that follow its occurrence. In this paper we review several strategies for identifying contingent relations from conditional probabilities, and propose an alternative strategy known as a contingency space analysis (CSA). Step-by-step procedures for conducting and interpreting a CSA using sample data are presented, followed by discussion of the potential use of a CSA for conducting descriptive assessments, informing intervention design, and evaluating changes in reinforcement contingencies following treatment. PMID:18468280
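The core of the contingency space analysis described above is a comparison of two conditional probabilities computed from sequential recordings: P(consequence | behavior) and P(consequence | no behavior). A minimal sketch with hypothetical interval data (the recordings and interval length are made up for illustration):

```python
# Sketch: the two conditional probabilities a contingency space analysis
# (CSA) plots against each other. Interval data below are hypothetical:
# behavior[i] / attention[i] indicate whether problem behavior / adult
# attention occurred in 10-s observation interval i.
behavior  = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 0]
attention = [1, 0, 1, 0, 0, 0, 1, 0, 1, 1, 0, 1]

def conditional_p(event, given):
    """P(event | given) estimated from paired interval recordings."""
    hits = sum(e for e, g in zip(event, given) if g)
    total = sum(given)
    return hits / total if total else 0.0

p_given_b   = conditional_p(attention, behavior)
p_given_nob = conditional_p(attention, [1 - b for b in behavior])

print(f"P(attention | behavior)    = {p_given_b:.2f}")   # 0.83 for these data
print(f"P(attention | no behavior) = {p_given_nob:.2f}") # 0.17 for these data
# Points far above the diagonal of the contingency space
# (p_given_b >> p_given_nob) suggest a positive contingency between
# the behavior and the consequence.
```

Plotting each behavior-consequence pair as a point in this space, rather than reporting a single conditional probability, is what distinguishes the CSA from simpler descriptive strategies.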
Van Leijenhorst, Linda; Westenberg, P Michiel; Crone, Eveline A
2008-01-01
Decision making, or the process of choosing between competing courses of action, is highly sensitive to age-related change, showing development throughout adolescence. In this study, we tested whether the development of decision making under risk is related to changes in risk-estimation abilities. Participants (N = 93) between ages 8 and 30 performed a child-friendly gambling task, the Cake Gambling task, inspired by the Cambridge Gambling Task (Rogers et al., 1999), which has previously been shown to be sensitive to orbitofrontal cortex (OFC) damage. The task allowed comparisons of the contributions to risk perception of (1) the ability to estimate probabilities and (2) the ability to evaluate rewards. Adult performance patterns were highly similar to those found in previous reports, showing increased risk taking with increases in the probability of winning and the magnitude of potential reward. Behavioral patterns in children and adolescents did not differ from adult patterns, showing a similar ability for probability estimation and reward evaluation. These data suggest that participants 8 years and older perform like adults in a gambling task, previously shown to depend on the OFC, in which all the information needed to make an advantageous decision is given on each trial and no information needs to be inferred from previous behavior. Interestingly, at all ages, females were more risk-averse than males. These results suggest that the increase in real-life risky behavior seen in adolescence is not a consequence of changes in risk perception abilities. The findings are discussed in relation to theories about the protracted development of the prefrontal cortex.
Soil acidification from atmospheric ammonium sulphate in forest canopy throughfall
NASA Astrophysics Data System (ADS)
van Breemen, N.; Burrough, P. A.; Velthorst, E. J.; van Dobben, H. F.; de Wit, Toke; Ridder, T. B.; Reijnders, H. F. R.
1982-10-01
Acid rain commonly has high concentrations of dissolved SO₄²⁻, NH₄⁺ and NO₃⁻. Sulphuric and nitric acids are usually considered to be the acidic components, whereas ammonium has a tendency to increase the pH of rainwater [1]. Ammonium can be transformed to nitric acid in soil, but this source of acidity is generally less important than wet and dry deposition of free acids [2,3]. Here we describe the occurrence of high concentrations of ammonium in canopy throughfall (rainwater falling through the tree canopy) and stemflow in woodland areas in the Netherlands, resulting in acid inputs to soils two to five times higher than those previously described for acid atmospheric deposition [2-5]. The ammonium is present as ammonium sulphate, which probably forms by interaction of ammonia (volatilized from manure) with sulphur dioxide (from fossil fuels) on the surfaces of vegetation. After leaching by rainwater, the ammonium sulphate reaching the soil oxidizes rapidly to nitric and sulphuric acid, producing extremely low pH values (2.8-3.5) and high concentrations of dissolved aluminium in the non-calcareous soils studied. Deposition of ammonium sulphate on the surfaces of vegetation and its environmental consequences are probably most important in areas with intensive animal husbandry.
Risk Management of Future Foreign Conflict Intervention
2011-12-01
subsequent tsunami-induced disaster at the Fukushima Daiichi Nuclear Power Plant on March 11, 2011, was a stark reminder that the residual risk of a...community, risk is the combination of the probability of an event and its consequences. Awareness of the consequences of various actions or events is...patently necessary for informed decision-making on public safety. If there is a core meltdown of a nuclear reactor, there will be a massive release
NASA Technical Reports Server (NTRS)
Gutierrez, Alberto, Jr.
1995-01-01
This dissertation evaluates receiver-based methods for mitigating the effects of nonlinear bandlimited signal distortion present in high data rate satellite channels. The effects of the nonlinear bandlimited distortion are illustrated for digitally modulated signals. A lucid development of the low-pass Volterra discrete-time model for a nonlinear communication channel is presented. In addition, finite-state machine models are explicitly developed for a nonlinear bandlimited satellite channel. A nonlinear fixed equalizer based on Volterra series has previously been studied for compensation of noiseless signal distortion due to a nonlinear satellite channel. This dissertation studies adaptive Volterra equalizers on a downlink-limited nonlinear bandlimited satellite channel. We employ performance in the mean-square-error and probability-of-error senses as figures of merit. In addition, a receiver consisting of a fractionally-spaced equalizer (FSE) followed by a Volterra equalizer (FSE-Volterra) is found to give improvement beyond that gained by the Volterra equalizer alone. Significant probability-of-error performance improvement is found for multilevel modulation schemes. Also, probability-of-error improvement is found to be more significant for modulation schemes, constant-amplitude and multilevel, that require higher signal-to-noise ratios (i.e., higher modulation orders) for reliable operation. The maximum likelihood sequence detection (MLSD) receiver for a nonlinear satellite channel, a bank of matched filters followed by a Viterbi detector, serves as a probability-of-error lower bound for the Volterra and FSE-Volterra equalizers. However, this receiver has not been evaluated for a specific satellite channel. In this work, an MLSD receiver is evaluated for a specific downlink-limited satellite channel. Because of the bank of matched filters, the MLSD receiver may be high in complexity.
Consequently, the probability of error performance of a more practical suboptimal MLSD receiver, requiring only a single receive filter, is evaluated.
Mølbak, K; Højlyng, N; Gaarslev, K
1988-04-01
Campylobacter was the bacterial pathogen most prevalent in 859 children, aged 6-59 months, examined in a house-to-house diarrhoea survey in two Liberian communities. 44.9% of the children from an urban slum and 28.4% from a rural area were excretors. Since the prevalence of diarrhoea was very high and consequently many convalescent carriers were found, it was not possible to evaluate the pathogenic role of campylobacter. The excretor rate increased with age and was significantly correlated to the use of supplementary feeding, inversely correlated to the quality of the water supply, and also associated with helminthic infestation. Results from re-examination of 172 children suggested a high intensity of transmission. The findings all indicate the existence of a heavy environmental contamination with campylobacter, probably of both human and animal faecal origin.
A novel approach to cardiac troponins to improve the diagnostic work-up in chest pain patients.
Eggers, Kai M; Jaffe, Allan S; Svennblad, Bodil; Lindahl, Bertil
2012-12-01
In patients with acute chest pain, current guidelines recommend serial measurements of cardiac troponins at predefined and partly late time points. Consequently, diagnostic assessment in these patients tends to be lengthy and often results in unnecessary admissions. We therefore evaluated whether an approach integrating troponin results into the clinical context provided by the individual patient's presentation might facilitate the early diagnostic work-up. In 197 chest pain patients, cardiac troponin I (cTnI; Stratus CS) was measured serially within 12 hours after hospital admission. In patient cohorts with different chances of having myocardial infarction (MI) according to clinical data, electrocardiographic findings, and admission biomarker results, pretest probabilities for MI were calculated and compared with posttest probabilities derived from subsequent cTnI results after admission. Elevated cTnI levels at 1 to 2 hours after admission revealed ≥95.0% posttest probabilities for MI in cohorts with intermediate or high chances of having MI. The posttest probabilities for the absence of MI were 94.7% to 98.2% in cohorts with low or intermediate chances of having MI when cTnI was negative at 2 hours. In conclusion, troponin testing that considers the individual patient's pretest probability of MI seems to provide clinically useful information as early as 1 to 2 hours after admission. Such an approach has the potential to identify both the patient cohorts in whom early discharge and those in whom admission for further evaluation would be appropriate. This could facilitate the early diagnostic work-up of chest pain patients, thereby improving patient flow and reducing overcrowding in healthcare facilities.
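The pretest-to-posttest updating described in this abstract is, in essence, Bayes' rule in odds form. A minimal sketch follows; the likelihood-ratio value in the example is illustrative, not one of the study's estimates.

```python
def posttest_probability(pretest_p, likelihood_ratio):
    """Convert a pretest probability into a posttest probability via odds-form Bayes' rule."""
    pretest_odds = pretest_p / (1.0 - pretest_p)
    posttest_odds = pretest_odds * likelihood_ratio
    return posttest_odds / (1.0 + posttest_odds)

# A patient with an intermediate (50%) pretest chance of MI and a strongly positive
# troponin result (illustrative positive likelihood ratio of 19) reaches 95% posttest:
p = posttest_probability(0.5, 19.0)
```

A likelihood ratio of 1 leaves the probability unchanged, which is why an uninformative test result cannot move a patient between risk categories.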
Statistical analysis of PM₁₀ concentrations at different locations in Malaysia.
Sansuddin, Nurulilyana; Ramli, Nor Azam; Yahaya, Ahmad Shukri; Yusof, Noor Faizah Fitri Md; Ghazali, Nurul Adyani; Madhoun, Wesam Ahmed Al
2011-09-01
Malaysia has experienced several haze events since the 1980s as a consequence of the transboundary movement of air pollutants emitted from forest fires and open burning activities. Hazy episodes can result from local activities and be categorized as "localized haze". General probability distributions (i.e., gamma and log-normal) were chosen to analyze the PM₁₀ concentration data at two different types of locations in Malaysia: industrial (Johor Bahru and Nilai) and residential (Kota Kinabalu and Kuantan). These areas were chosen based on their frequently high PM₁₀ concentration readings. The best models representing the areas were chosen based on their performance indicator values. The best distributions provided the probability of exceedances and the return period between the actual and predicted concentrations based on the threshold limit given by the Malaysian Ambient Air Quality Guidelines (24-h average of 150 μg/m³) for PM₁₀ concentrations. The short-term prediction for PM₁₀ exceedances in 14 days was obtained using the autoregressive model.
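The exceedance probability and return period used in this kind of analysis follow directly from a fitted distribution's survival function. A stdlib-only sketch for a log-normal fit; the parameter values below are hypothetical, not the Malaysian estimates.

```python
import math

def lognormal_exceedance(threshold, mu, sigma):
    """P(X > threshold) for X ~ LogNormal(mu, sigma), via the normal survival function."""
    z = (math.log(threshold) - mu) / sigma
    return 0.5 * math.erfc(z / math.sqrt(2.0))

# Hypothetical fitted parameters: median concentration exp(mu) = 60 ug/m3, sigma = 0.5.
p_exceed = lognormal_exceedance(150.0, math.log(60.0), 0.5)  # daily exceedance probability
return_period_days = 1.0 / p_exceed                          # mean days between exceedances
```

With these (made-up) parameters the 150 μg/m³ guideline is exceeded on roughly 3% of days, i.e., about once a month, which is the kind of summary the fitted gamma or log-normal model delivers.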
Target Tracking Using SePDAF under Ambiguous Angles for Distributed Array Radar.
Long, Teng; Zhang, Honggang; Zeng, Tao; Chen, Xinliang; Liu, Quanhua; Zheng, Le
2016-09-09
Distributed array radar can improve radar detection capability and measurement accuracy. However, it suffers cyclic ambiguity in its angle estimates because, by the spatial Nyquist sampling theorem, the large sparse array is undersampled. Consequently, state estimation accuracy and track validity probability degrade when the ambiguous angles are used directly for target tracking. This paper proposes a second probability data association filter (SePDAF)-based tracking method for distributed array radar. First, the target motion model and radar measurement model are built. Second, the fused result of each radar's estimation is fed to an extended Kalman filter (EKF) to complete the first filtering. Third, taking this result as prior knowledge and associating it with the array-processed ambiguous angles, the SePDAF is applied to accomplish the second filtering, achieving a highly accurate and stable trajectory with relatively low computational complexity. Moreover, the azimuth filtering accuracy is promoted dramatically and the position filtering accuracy also improves. Finally, simulations illustrate the effectiveness of the proposed method.
Point count length and detection of forest neotropical migrant birds
Dawson, D.K.; Smith, D.R.; Robbins, C.S.; Ralph, C. John; Sauer, John R.; Droege, Sam
1995-01-01
Comparisons of bird abundances among years or among habitats assume that the rates at which birds are detected and counted are constant within species. We use point count data collected in forests of the Mid-Atlantic states to estimate detection probabilities for Neotropical migrant bird species as a function of count length. For some species, significant differences existed among years or observers in both the probability of detecting the species and in the rate at which individuals are counted. We demonstrate the consequence that variability in species' detection probabilities can have on estimates of population change, and discuss ways for reducing this source of bias in point count studies.
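One common way to model detection probability as a function of count length is to treat detections as arising from a constant per-minute cue rate, so that longer counts asymptotically approach certain detection. The sketch below uses that assumption, which is ours for illustration and not necessarily the model of this paper; the rates are made up.

```python
import math

def detection_prob(cue_rate_per_min, count_minutes):
    """P(detecting a species at least once) if detection cues follow a Poisson process."""
    return 1.0 - math.exp(-cue_rate_per_min * count_minutes)

# A species cueing 0.2 times/min is detected on ~63% of 5-min counts but ~86% of
# 10-min counts, so raw counts from surveys of different lengths (or by observers
# with different cue rates) are not directly comparable.
p5 = detection_prob(0.2, 5.0)
p10 = detection_prob(0.2, 10.0)
```

This is exactly the bias the abstract warns about: if detection probability varies among years or observers, an apparent population change may be an artifact of changing detectability rather than abundance.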
Individual heterogeneity and identifiability in capture-recapture models
Link, W.A.
2004-01-01
Individual heterogeneity in detection probabilities is a far more serious problem for capture-recapture modeling than has previously been recognized. In this note, I illustrate that population size is not an identifiable parameter under the general closed population mark-recapture model Mh. The problem of identifiability is obvious if the population includes individuals with pi = 0, but persists even when it is assumed that individual detection probabilities are bounded away from zero. Identifiability may be attained within parametric families of distributions for pi, but not among parametric families of distributions. Consequently, in the presence of individual heterogeneity in detection probability, capture-recapture analysis is strongly model dependent.
Stauffer, Glenn E.; Rotella, Jay J.; Garrott, Robert A.; Kendall, William L.
2014-01-01
In colonial-breeding species, prebreeders often emigrate temporarily from natal reproductive colonies then subsequently return for one or more years before producing young. Variation in attendance–nonattendance patterns can have implications for subsequent recruitment. We used open robust-design multistate models and 28 years of encounter data for prebreeding female Weddell seals (Leptonychotes weddellii [Lesson]) to evaluate hypotheses about (1) the relationships of temporary emigration (TE) probabilities to environmental and population size covariates and (2) motivations for attendance and consequences of nonattendance for subsequent probability of recruitment to the breeding population. TE probabilities were density dependent (β̂BPOP = 0.66, SE = 0.17; estimated effect and standard error of population size in the previous year), increased when the fast-ice edge was distant from the breeding colonies (β̂DIST = 0.75, SE = 0.04; estimated effect and standard error of distance to the sea-ice edge in the current year on TE probability in the current year), and were strongly age and state dependent. These results suggest that trade-offs between potential benefits and costs of colony attendance vary annually and might influence motivation to attend colonies. Recruitment probabilities were greatest for seals that consistently attended colonies in two or more years (e.g., mean recruitment probability = 0.56, SD = 0.17) and lowest for seals that never or inconsistently attended prior to recruitment (e.g., mean recruitment probability = 0.32, SD = 0.15), where these means are taken over all years for 10-year-old seals in the specified prebreeder state.
In colonial-breeding seabirds, repeated colony attendance increases subsequent probability of recruitment to the adult breeding population; our results suggest similar implications for a marine mammal and are consistent with the hypothesis that prebreeders were motivated to attend reproductive colonies to gain reproductive skills or perhaps to optimally synchronize estrus through close association with mature breeding females.
NASA Astrophysics Data System (ADS)
Gürbüz, Ramazan
2010-09-01
The purpose of this study is to investigate and compare the effects of activity-based and traditional instruction on students' conceptual development of certain probability concepts. The study was conducted using a pretest-posttest control-group design with 80 seventh graders. A 'Conceptual Development Test' comprising 12 open-ended questions was developed and administered to both groups of students before and after the intervention. The data were analysed using analysis of covariance, with the pretest as covariate. The results revealed that activity-based instruction (ABI) outperformed its traditional counterpart in the development of probability concepts. Furthermore, ABI was found to contribute most to students' conceptual development of the concept of 'Probability of an Event' and least to the concept of 'Sample Space'. As a consequence, it can be deduced that the designed instructional process was effective in the instruction of probability concepts.
Decision curve analysis: a novel method for evaluating prediction models.
Vickers, Andrew J; Elkin, Elena B
2006-01-01
Diagnostic and prognostic models are typically evaluated with measures of accuracy that do not address clinical consequences. Decision-analytic techniques allow assessment of clinical outcomes but often require collection of additional information and may be cumbersome to apply to models that yield a continuous result. The authors sought a method for evaluating and comparing prediction models that incorporates clinical consequences, requires only the data set on which the models are tested, and can be applied to models that have either continuous or dichotomous results. The authors describe decision curve analysis, a simple, novel method of evaluating predictive models. They start by assuming that the threshold probability of a disease or event at which a patient would opt for treatment is informative of how the patient weighs the relative harms of a false-positive and a false-negative prediction. This theoretical relationship is then used to derive the net benefit of the model across different threshold probabilities. Plotting net benefit against threshold probability yields the "decision curve." The authors apply the method to models for the prediction of seminal vesicle invasion in prostate cancer patients. Decision curve analysis identified the range of threshold probabilities in which a model was of value, the magnitude of benefit, and which of several models was optimal. Decision curve analysis is a suitable method for evaluating alternative diagnostic and prognostic strategies that has advantages over other commonly used measures and techniques.
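The net-benefit calculation underlying a decision curve is simple enough to state in code: at threshold probability pt, false positives are down-weighted by the odds pt/(1 - pt). The formula is the standard one from decision curve analysis; the patient counts in the example are illustrative.

```python
def net_benefit(true_pos, false_pos, n, pt):
    """Net benefit of a prediction model at threshold probability pt."""
    return true_pos / n - (false_pos / n) * (pt / (1.0 - pt))

# Example: in 100 patients, a model flags 30 true positives and 20 false positives.
# At a 20% threshold probability its net benefit is 0.30 - 0.20 * 0.25 = 0.25.
nb = net_benefit(30, 20, 100, 0.20)
```

Sweeping pt over a clinically sensible range and plotting net benefit for each candidate model (plus the treat-all and treat-none strategies) produces the decision curve described in the abstract.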
Risk-targeted maps for Romania
NASA Astrophysics Data System (ADS)
Vacareanu, Radu; Pavel, Florin; Craciun, Ionut; Coliba, Veronica; Arion, Cristian; Aldea, Alexandru; Neagu, Cristian
2018-03-01
Romania has one of the highest seismic hazard levels in Europe. The seismic hazard is due to a combination of local crustal seismic sources, situated mainly in the western part of the country, and the Vrancea intermediate-depth seismic source, which can be found at the bend of the Carpathian Mountains. Recent seismic hazard studies have shown that there are consistent differences between the slopes of the seismic hazard curves for sites situated in the fore-arc and back-arc of the Carpathian Mountains. Consequently, in this study we extend this finding to the evaluation of the probability of collapse of buildings and finally to the development of uniform risk-targeted maps. The main advantage of the uniform-risk approach is that the target probability of collapse will be uniform throughout the country. Finally, the results obtained are discussed in the light of a recent study with the same focus performed at the European level using the hazard data from the SHARE project. The analyses performed in this study pointed to a dominant influence of the quantile of peak ground acceleration used for anchoring the fragility function. This parameter basically alters the shape of the risk-targeted maps, shifting the areas with higher collapse probabilities from eastern Romania to western Romania as its exceedance probability increases. Consequently, a uniform procedure for deriving risk-targeted maps appears more than necessary.
Nuclear Terrorism: The Possibilities, Probable Consequences, and Preventive Strategies.
ERIC Educational Resources Information Center
Totten, Michael
1986-01-01
This article explores the possibility of terrorist acts against nuclear power stations. It includes information on reactor security, public policy, and alternative courses of action deemed to increase public safety and cost efficiency. (JDH)
Liquid and gaseous oxygen safety review, volume 3
NASA Technical Reports Server (NTRS)
Lapin, A.
1972-01-01
Practices employed in the oxygen systems maintenance programs to minimize both accident probabilities and consequences of accidents and/or incidents are described. Appropriate sections of the operations department and industrial gas operating manuals are discussed.
Devarbhavi, Harshad; Karanth, Dheeraj; Prasanna, K S; Adarsh, C K; Patil, Mallikarjun
2011-10-01
Drug-induced liver injury (DILI) is rare in children and adolescents, and, consequently, data are remarkably limited. We analyzed the causes, clinical and biochemical features, natural history, and outcomes of children with DILI. Consecutive children with DILI from 1997 to 2004 (retrospective) and 2005 to 2010 (prospective) were studied based on standard criteria for DILI. Thirty-nine children constituted 8.7% of 450 cases of DILI. There were 22 boys and 17 girls. Median age was 16 years (range, 2.6-17). Combination antituberculous drugs were the most common cause (n = 22), followed by the anticonvulsants phenytoin (n = 10) and carbamazepine (n = 6). All 16 children (41%) who developed hypersensitivity features, such as skin rashes, fever, lymphadenopathy, and/or eosinophilia, including the 3 with Stevens-Johnson syndrome, survived. Those with hypersensitivity presented earlier (24.5 versus 35 days; P = 0.24), had less severe disease (MELD, 16 versus 29; P = 0.01), and had no mortality (0/16 versus 12/23; P < 0.001), compared to those without hypersensitivity. The 12 fatalities were largely the result of antituberculous DILI (n = 11). Encephalopathy and ascites were associated with mortality, along with hyperbilirubinemia, a high international normalized ratio, and high serum creatinine. According to the Roussel Uclaf Causality Assessment Method, 18 cases were highly probable, 14 probable, and 7 possible. Thirty-two children were hospitalized. DILI is not uncommon in children and accounts for 8.7% of all patients with DILI. Antituberculous drugs and anticonvulsants are the leading causes of DILI in India. Overall mortality is high (30.7%), largely accounted for by antituberculous drugs. Children with DILI and hypersensitivity features present early, have less severe disease, and, consequently, a better prognosis compared to those without; such cases are often associated with anticonvulsants or sulfonamides.
Copyright © 2011 American Association for the Study of Liver Diseases.
Trending in Probability of Collision Measurements via a Bayesian Zero-Inflated Beta Mixed Model
NASA Technical Reports Server (NTRS)
Vallejo, Jonathon; Hejduk, Matt; Stamey, James
2015-01-01
We investigate the performance of a generalized linear mixed model in predicting the Probabilities of Collision (Pc) for conjunction events. Specifically, we apply this model to the log10 transformation of these probabilities and argue that this transformation yields values that can be considered bounded in practice. Additionally, this bounded random variable, after scaling, is zero-inflated. Consequently, we model these values using the zero-inflated Beta distribution, and utilize the Bayesian paradigm and the mixed model framework to borrow information from past and current events. This provides a natural way to model the data and provides a basis for answering questions of interest, such as what is the likelihood of observing a probability of collision equal to the effective value of zero on a subsequent observation.
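A zero-inflated Beta variate is a two-part mixture: an exact zero with some probability, otherwise a Beta draw. A minimal generative sketch (parameter values are illustrative, not fitted to conjunction data):

```python
import random

def sample_zib(p_zero, a, b, rng=random):
    """Draw from a zero-inflated Beta: exactly 0 with probability p_zero, else Beta(a, b)."""
    if rng.random() < p_zero:
        return 0.0
    return rng.betavariate(a, b)

rng = random.Random(0)
draws = [sample_zib(0.3, 2.0, 5.0, rng) for _ in range(20000)]
zero_fraction = sum(d == 0.0 for d in draws) / len(draws)  # close to p_zero
```

The mixed-model part of the paper's approach puts regression structure and random effects on the mixture weight and the Beta parameters; the sketch only shows why the likelihood needs a separate point mass at zero.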
Random breakup of microdroplets for single-cell encapsulation
NASA Astrophysics Data System (ADS)
Um, Eujin; Lee, Seung-Goo; Park, Je-Kyun
2010-10-01
Microfluidic droplet-based technology enables encapsulation of cells in isolated aqueous chambers surrounded by an immiscible fluid, but single-cell encapsulation efficiency is usually less than 30%. In this letter, we introduce a simple microgroove structure that breaks droplets into random sizes, which further allows droplets containing a single cell [Escherichia coli (E. coli)] to be collected by their size differences. A pinched-flow separation method is integrated to sort out droplets of the sizes that have a high probability of containing one cell. Consequently, we were able to obtain more than 50% of droplets having a single E. coli inside, keeping the proportion of multiple-cell-containing droplets below 16%.
Asteroid Risk Assessment: A Probabilistic Approach.
Reinhardt, Jason C; Chen, Xi; Liu, Wenhao; Manchev, Petar; Paté-Cornell, M Elisabeth
2016-02-01
Following the 2013 Chelyabinsk event, the risks posed by asteroids attracted renewed interest, from both the scientific and policy-making communities. It reminded the world that impacts from near-Earth objects (NEOs), while rare, have the potential to cause great damage to cities and populations. Point estimates of the risk (such as mean numbers of casualties) have been proposed, but because of the low-probability, high-consequence nature of asteroid impacts, these averages provide limited actionable information. While more work is needed to further refine its input distributions (e.g., NEO diameters), the probabilistic model presented in this article allows a more complete evaluation of the risk of NEO impacts because the results are distributions that cover the range of potential casualties. This model is based on a modularized simulation that uses probabilistic inputs to estimate probabilistic risk metrics, including those of rare asteroid impacts. Illustrative results of this analysis are presented for a period of 100 years. As part of this demonstration, we assess the effectiveness of civil defense measures in mitigating the risk of human casualties. We find that they are likely to be beneficial but not a panacea. We also compute the probability, but not the consequences, of an impact with global effects ("cataclysm"). We conclude that there is a continued need for NEO observation, and for analyses of the feasibility and risk-reduction effectiveness of space missions designed to deflect or destroy asteroids that threaten the Earth. © 2015 Society for Risk Analysis.
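The modularized-simulation idea can be caricatured in a few lines: draw rare impact events and heavy-tailed consequences, then report distributional risk metrics rather than a single mean. Everything below (the yearly impact probability, the Pareto severity model, the scaling constant) is an illustrative assumption of ours, not the authors' inputs.

```python
import random

def simulate_century(impact_prob_per_year=0.01, years=100, rng=random):
    """Total casualties in one simulated century; at most one impact per year (rare-event approximation)."""
    total = 0
    for _ in range(years):
        if rng.random() < impact_prob_per_year:
            total += int(rng.paretovariate(1.5) * 1000)  # heavy-tailed severity
    return total

rng = random.Random(42)
totals = sorted(simulate_century(rng=rng) for _ in range(5000))
mean_casualties = sum(totals) / len(totals)
p95 = totals[int(0.95 * len(totals))]  # a tail metric the mean alone would hide
```

Even in this toy version, the 95th percentile and the mean tell very different stories, which is the abstract's point about why point estimates provide limited actionable information.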
Risk factors and biomarkers of life-threatening cancers
Autier, Philippe
2015-01-01
There is growing evidence that risk factors for cancer occurrence and for cancer death are not necessarily the same. Knowledge of cancer aggressiveness risk factors (CARF) may help in identifying subjects at high risk of developing a potentially deadly cancer (and not just any cancer). The availability of CARFs may have positive consequences for health policies, medical practice, and the search for biomarkers. For instance, cancer chemoprevention and cancer screening of subjects with CARFs would probably be more ethical and cost-effective than recommending chemoprevention and screening to entire segments of the population. Also, the harmful consequences of chemoprevention and of screening would be reduced while effectiveness would be optimised. We present examples of CARF already in use (e.g. mutations of the breast cancer (BRCA) gene), of promising avenues for the discovery of biomarkers thanks to the investigation of CARFs (e.g. breast radiological density and systemic inflammation), and of biomarkers commonly used that are not real CARFs (e.g. certain mammography images, prostate-specific antigen (PSA) concentration, nevus number). PMID:26635900
Brouxel, M.
1991-01-01
A clinopyroxene-rich dike of the Trinity ophiolite sheeted-dike complex shows three different magmatic pulses, probably injected in a short period of time (no well developed chilled margin), and important variations of the clinopyroxene and plagioclase percentages between its core (highly porphyritic) and margins (aphyric). This variation, interpreted as related to a flow-differentiation phenomenon (mechanical phenocryst redistribution), has important geochemical consequences. It produces increases in the FeO, MgO, CaO, Cr and Ni contents from the margin to the core, together with increases in the clinopyroxene percentage, and decreases in the SiO2, Zr, Y, Nb and REE contents, together with a decrease in the percentage of the fine-grained groundmass toward the core of the dike. This mineralogical redistribution, which also affects the incompatible trace element ratios because of the difference in plagioclase and clinopyroxene mineral/liquid partition coefficients, illustrates the importance of fractionation processes outside of a magma chamber. © 1991.
Resilience Metrics for the Electric Power System: A Performance-Based Approach.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vugrin, Eric D.; Castillo, Andrea R; Silva-Monroy, Cesar Augusto
Grid resilience is a concept related to a power system's ability to continue operating and delivering power even in the event that low-probability, high-consequence disruptions such as hurricanes, earthquakes, and cyber-attacks occur. Grid resilience objectives focus on managing and, ideally, minimizing potential consequences that occur as a result of these disruptions. Currently, no formal grid resilience definitions, metrics, or analysis methods have been universally accepted. This document describes an effort to develop and describe grid resilience metrics and analysis methods. The metrics and methods described herein extend upon the Resilience Analysis Process (RAP) developed by Watson et al. for the 2015 Quadrennial Energy Review. The extension allows both outputs from system models and historical data to serve as the basis for creating grid resilience metrics and informing grid resilience planning and response decision-making. Demonstration of the metrics and methods is shown through a set of illustrative use cases.
Bivariate categorical data analysis using normal linear conditional multinomial probability model.
Sun, Bingrui; Sutradhar, Brajendra
2015-02-10
Bivariate multinomial data, such as the left- and right-eye retinopathy status data, are analyzed either by using a joint bivariate probability model or by exploiting certain odds-ratio-based association models. However, the joint bivariate probability model yields marginal probabilities that are complicated functions of marginal and association parameters for both variables, and the odds-ratio-based association model treats the odds ratios involved in the joint probabilities as 'working' parameters, which are consequently estimated through certain arbitrary 'working' regression models. Also, this latter odds-ratio-based model does not provide any easy interpretation of the correlations between two categorical variables. On the basis of pre-specified marginal probabilities, in this paper, we develop a bivariate normal-type linear conditional multinomial probability model to understand the correlations between two categorical variables. The parameters involved in the model are consistently estimated using the optimal likelihood and generalized quasi-likelihood approaches. The proposed model and the inferences are illustrated through an intensive simulation study as well as an analysis of the well-known Wisconsin Diabetic Retinopathy status data. Copyright © 2014 John Wiley & Sons, Ltd.
On the Concept and Definition of Terrorism Risk.
Aven, Terje; Guikema, Seth
2015-12-01
In this article, we provide some reflections on how to define and understand the concept of terrorism risk in a professional risk assessment context. As a basis for this discussion we introduce a set of criteria that we believe should apply to any conceptualization of terrorism risk. These criteria are based on both criteria used in other areas of risk analysis and our experience with terrorism risk analysis. That is, these criteria offer our perspective. We show that several of the suggested perspectives and definitions have weaknesses in relation to these criteria. A main problem identified is the idea that terrorism risk can be conceptualized as a function of probability and consequence, not as a function of the interactions between adaptive individuals and organizations. We argue that perspectives based solely on probability and consequence should be used cautiously or not at all because they fail to reflect the essential features of the concept of terrorism risk, the threats and attacks, their consequences, and the uncertainties, all in the context of adaptation by the adversaries. These three elements should in our view constitute the main pillars of the terrorism risk concept. From this concept we can develop methods for assessing the risk by identifying a set of threats, attacks, and consequence measures associated with the possible outcome scenarios together with a description of the uncertainties and interactions between the adversaries. © 2015 Society for Risk Analysis.
Some limitations of frequency as a component of risk: an expository note.
Cox, Louis Anthony
2009-02-01
Students of risk analysis are often taught that "risk is frequency times consequence" or, more generally, that risk is determined by the frequency and severity of adverse consequences. But is it? This expository note reviews the concepts of frequency as the average annual occurrence rate and as the reciprocal of mean time to failure (MTTF) or mean time between failures (MTBF) in a renewal process. It points out that if two risks (represented as two (frequency, severity) pairs for adverse consequences) have identical values for severity but different values of frequency, then it is not necessarily true that the one with the smaller value of frequency is preferable, and this holds no matter how frequency is defined. In general, there is not necessarily an increasing relation between the reciprocal of the mean time until an event occurs, its long-run average occurrences per year, and other criteria, such as the probability or expected number of times that it will happen over a specific interval of interest, such as the design life of a system. Risk depends on more than the frequency and severity of consequences. It also depends on other information about the probability distribution for the time of a risk event that can become lost in simple measures of event "frequency." More flexible descriptions of risky processes, such as point process models, can avoid these limitations.
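The note's distinction between an occurrence rate and a probability over an interval can be illustrated with the simplest renewal model, a homogeneous Poisson process; the MTTF value below is an assumption for illustration, not taken from the note.

```python
import math

def prob_at_least_one(rate, T):
    """P(at least one event in [0, T]) for a Poisson process with the given rate."""
    return 1.0 - math.exp(-rate * T)

def expected_events(rate, T):
    """Expected number of events in [0, T]: grows linearly, without bound."""
    return rate * T

rate = 1.0 / 50.0  # assumed MTTF of 50 years, i.e. 0.02 events per year
for T in (1, 10, 50, 100):
    print(T, round(expected_events(rate, T), 2), round(prob_at_least_one(rate, T), 3))
# The expected count grows linearly in T while the occurrence probability
# saturates toward 1, so the two summaries can rank risks differently.
```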
The Cosmogony of Anton von Zach (Die Kosmogonie Anton von Zachs).
NASA Astrophysics Data System (ADS)
Brosche, P.
In his "Cosmogenische Betrachtungen" (1804), Anton von Zach rediscovered, probably independently, some aspects of the theories of Kant and Laplace. More originally, he also envisaged the consequences of an era of heavy impacts in the early history of the Earth.
van den Bos, Wouter; Hertwig, Ralph
2017-01-01
Although actuarial data indicate that risk-taking behavior peaks in adolescence, laboratory evidence for this developmental spike remains scarce. One possible explanation for this incongruity is that in the real world adolescents often have only vague information about the potential consequences of their behavior and the likelihoods of those consequences, whereas in the lab these are often clearly stated. How do adolescents behave under such more realistic conditions of ambiguity and uncertainty? We asked 105 participants aged 8 to 22 years to make three types of choices: (1) choices between options whose possible outcomes and probabilities were fully described (choices under risk); (2) choices between options whose possible outcomes were described but whose probability information was incomplete (choices under ambiguity); and (3) choices between unknown options whose possible outcomes and probabilities could be explored (choices under uncertainty). Relative to children and adults, two adolescent-specific markers emerged. First, adolescents were more accepting of ambiguity; second, they were also more accepting of uncertainty (as indicated by shorter pre-decisional search). Furthermore, this tolerance of the unknown was associated with motivational, but not cognitive, factors. These findings offer novel insights into the psychology of adolescent risk taking. PMID:28098227
Rhodes, Jean; Chan, Christian; Paxson, Christina; Rouse, Cecilia Elena; Waters, Mary; Fussell, Elizabeth
2010-04-01
The purpose of this study was to document changes in mental and physical health among 392 low-income parents exposed to Hurricane Katrina and to explore how hurricane-related stressors and loss relate to post-Katrina well-being. The prevalence of probable serious mental illness doubled, and nearly half of the respondents exhibited probable posttraumatic stress disorder. Higher levels of hurricane-related loss and stressors were generally associated with worse health outcomes, controlling for baseline sociodemographic and health measures. Higher baseline resources predicted fewer hurricane-associated stressors, but the consequences of stressors and loss were similar regardless of baseline resources. Adverse health consequences of Hurricane Katrina persisted for a year or more and were most severe for those experiencing the most stressors and loss. Long-term health and mental health services are needed for low-income disaster survivors, especially those who experience disaster-related stressors and loss.
Dynamic Blowout Risk Analysis Using Loss Functions.
Abimbola, Majeed; Khan, Faisal
2018-02-01
Most risk analysis approaches are static and fail to capture evolving conditions. Blowout, the most feared accident during a drilling operation, is a complex and dynamic event. Traditional risk analysis methods are useful in the early design stage of a drilling operation but fall short during evolving operational decision making. A new dynamic risk analysis approach is presented to capture evolving situations through dynamic probability and consequence models. The dynamic consequence models, the focus of this study, are developed in terms of loss functions. These models are subsequently integrated with the probability models to estimate operational risk, providing a real-time risk analysis. The real-time evolving situation is considered dependent on the changing bottom-hole pressure as drilling progresses. The application of the methodology and models is demonstrated with a case study of an offshore drilling operation evolving to a blowout. © 2017 Society for Risk Analysis.
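A broad-strokes sketch of the idea of combining a dynamic probability model with a loss-function consequence model follows. All function names, functional forms, and constants are invented placeholders for illustration, not the authors' models.

```python
import math

def blowout_probability(bhp, target, scale=500.0):
    """Toy logistic model: failure probability grows as the bottom-hole
    pressure (bhp) drifts away from the safe target value."""
    return 1.0 / (1.0 + math.exp(-(abs(bhp - target) - scale) / (scale / 5.0)))

def quadratic_loss(bhp, target, k=1e-4):
    """Taguchi-style quadratic loss in the pressure deviation."""
    return k * (bhp - target) ** 2

def dynamic_risk(bhp, target=5000.0):
    """Operational risk = probability x consequence, re-evaluated as the
    monitored pressure evolves during drilling."""
    return blowout_probability(bhp, target) * quadratic_loss(bhp, target)

# Risk rises sharply as the operation drifts from the safe operating point.
print(dynamic_risk(5100.0) < dynamic_risk(5900.0))  # → True
```

Re-evaluating such a product at each new pressure reading is what makes the analysis "dynamic" in contrast to a one-off design-stage assessment.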
Nanotube Tunneling as a Consequence of Probable Discrete Trajectories
NASA Technical Reports Server (NTRS)
Robinson, Daryl C.
2001-01-01
It has been recently reported that the electrical charge in a semiconductive carbon nanotube is not evenly distributed, but is divided into charge "islands." A clear understanding of tunneling phenomena can be useful to elucidate the mechanism for electrical conduction in nanotubes. This paper represents the first attempt to shed light on the aforementioned phenomenon through viewing tunneling as a natural consequence of "discrete trajectories." The relevance of this analysis is that it may provide further insight into the higher rate of tunneling processes, which makes tunneling devices attractive. In a situation involving particles impinging on a classically impenetrable barrier, the result of quantum mechanics that the probability of detecting transmitted particles falls off exponentially is derived without wave theory. This paper should provide a basis for calculating the charge profile over the length of the tube so that nanoscale devices' conductive properties may be fully exploited.
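The exponential falloff mentioned above matches the standard textbook estimate for a rectangular barrier, which can be written down directly; this is the conventional wave-mechanics result, not the paper's discrete-trajectory derivation, and the barrier parameters below are illustrative.

```python
import math

HBAR = 1.054571817e-34   # reduced Planck constant, J*s
M_E = 9.1093837015e-31   # electron mass, kg
EV = 1.602176634e-19     # joules per electron-volt

def transmission(E_eV, V_eV, L_m):
    """Approximate tunneling probability through a rectangular barrier of
    height V and width L for a particle of energy E < V:
    T ~ exp(-2*kappa*L), with kappa = sqrt(2 m (V - E)) / hbar."""
    kappa = math.sqrt(2.0 * M_E * (V_eV - E_eV) * EV) / HBAR
    return math.exp(-2.0 * kappa * L_m)

# Transmission falls off exponentially with barrier width: each extra
# angstrom costs roughly an order of magnitude for these parameters.
for L in (1e-10, 2e-10, 3e-10):
    print(L, transmission(1.0, 5.0, L))
```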
Boudewyn, A C; Liem, J H
1995-12-01
In this study, we selected individuals high and low on a measure of chronic self-destructiveness--the tendency to perform behaviors that later reduce positive consequences and increase the probability of experiencing negative ones--and attempted to differentiate high and low scorers based on a set of hypothesized antecedent and concurrent psychological, interpersonal, and behavioral correlates. Men and women were equally represented in high- and low-scoring groups. High scorers reported experiencing more interpersonal exploitation, greater depression, lower self-esteem, more externalizing attitudes, and less control in relationships than low scorers. High-scoring individuals also engaged in more frequent acts of acute self-destructiveness, including attempted suicide. A significant age covariate effect emerged: high-scoring men and women were younger than low-scoring individuals. These findings underscore the importance of studying chronic self-destructiveness within a developmental framework and suggest that issues of safety and self-care may be particularly germane to educational and clinical interventions aimed at young adults.
A graph-based system for network-vulnerability analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Swiler, L.P.; Phillips, C.
1998-06-01
This paper presents a graph-based approach to network vulnerability analysis. The method is flexible, allowing analysis of attacks from both outside and inside the network. It can analyze risks to a specific network asset, or examine the universe of possible consequences following a successful attack. The graph-based tool can identify the set of attack paths that have a high probability of success (or a low effort cost) for the attacker. The system could be used to test the effectiveness of making configuration changes, implementing an intrusion detection system, etc. The analysis system requires as input a database of common attacks, broken into atomic steps, specific network configuration and topology information, and an attacker profile. The attack information is matched with the network configuration information and an attacker profile to create a superset attack graph. Nodes identify a stage of attack, for example the class of machines the attacker has accessed and the user privilege level he or she has compromised. The arcs in the attack graph represent attacks or stages of attacks. By assigning probabilities of success on the arcs, or costs representing level-of-effort for the attacker, various graph algorithms such as shortest-path algorithms can identify the attack paths with the highest probability of success.
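The shortest-path step described above can be sketched with a standard trick: converting each arc's success probability p to a weight of -log(p) turns "most probable attack path" into an ordinary shortest-path problem, solvable with Dijkstra's algorithm. The four-node attack graph below is hypothetical.

```python
import heapq
import math

def most_probable_attack_path(graph, start, goal):
    """graph: {node: [(successor, success_probability), ...]}."""
    dist = {start: 0.0}          # -log of the best path probability so far
    prev = {}
    heap = [(0.0, start)]
    done = set()
    while heap:
        d, node = heapq.heappop(heap)
        if node in done:
            continue
        done.add(node)
        if node == goal:
            break
        for nxt, p in graph.get(node, []):
            nd = d - math.log(p)  # multiplying probabilities = adding -log(p)
            if nd < dist.get(nxt, float("inf")):
                dist[nxt] = nd
                prev[nxt] = node
                heapq.heappush(heap, (nd, nxt))
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1], math.exp(-dist[goal])

# Hypothetical attack graph: two routes from "outside" to the database.
g = {"outside": [("web", 0.8), ("vpn", 0.3)],
     "web": [("db", 0.5)],
     "vpn": [("db", 0.9)]}
path, p = most_probable_attack_path(g, "outside", "db")
print(path, round(p, 2))  # → ['outside', 'web', 'db'] 0.4
```

Replacing the probabilities with level-of-effort costs and dropping the log transform gives the low-effort-path variant the abstract also mentions.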
Statistics of cosmic density profiles from perturbation theory
NASA Astrophysics Data System (ADS)
Bernardeau, Francis; Pichon, Christophe; Codis, Sandrine
2014-11-01
The joint probability distribution function (PDF) of the density within multiple concentric spherical cells is considered. It is shown how its cumulant generating function can be obtained at tree order in perturbation theory as the Legendre transform of a function directly built in terms of the initial moments. In the context of the upcoming generation of large-scale structure surveys, it is conjectured that this result correctly models such a function for finite values of the variance. Detailed consequences of this assumption are explored. In particular, the corresponding one-cell density probability distribution at finite variance is computed for realistic power spectra, taking into account its scale variation. It is found to be in agreement with Λ-cold dark matter simulations at the few percent level for a wide range of density values and parameters. Related explicit analytic expansions at the low and high density tails are given. The conditional (at fixed density) and marginal probability of the slope, the density difference between adjacent cells, and its fluctuations are also computed from the two-cell joint PDF; these also compare very well to simulations. It is emphasized that this could prove useful when studying the statistical properties of voids, as it can serve as a statistical indicator to test gravity models and/or probe key cosmological parameters.
CerebroMatic: A Versatile Toolbox for Spline-Based MRI Template Creation
Wilke, Marko; Altaye, Mekibib; Holland, Scott K.
2017-01-01
Brain image spatial normalization and tissue segmentation rely on prior tissue probability maps. Appropriately selecting these tissue maps becomes particularly important when investigating “unusual” populations, such as young children or elderly subjects. When creating such priors, the disadvantage of applying more deformation must be weighed against the benefit of achieving a crisper image. We have previously suggested that statistically modeling demographic variables, instead of simply averaging images, is advantageous. Both aspects (more vs. less deformation and modeling vs. averaging) were explored here. We used imaging data from 1914 subjects, aged 13 months to 75 years, and employed multivariate adaptive regression splines to model the effects of age, field strength, gender, and data quality. Within the spm/cat12 framework, we compared an affine-only with a low- and a high-dimensional warping approach. As expected, more deformation on the individual level results in lower group dissimilarity. Consequently, effects of age in particular are less apparent in the resulting tissue maps when using a more extensive deformation scheme. Using statistically-described parameters, high-quality tissue probability maps could be generated for the whole age range; they are consistently closer to a gold standard than conventionally-generated priors based on 25, 50, or 100 subjects. Distinct effects of field strength, gender, and data quality were seen. We conclude that an extensive matching for generating tissue priors may model much of the variability inherent in the dataset which is then not contained in the resulting priors. Further, the statistical description of relevant parameters (using regression splines) allows for the generation of high-quality tissue probability maps while controlling for known confounds. The resulting CerebroMatic toolbox is available for download at http://irc.cchmc.org/software/cerebromatic.php. PMID:28275348
Guerrero, Arnoldo; Embid, Cristina; Isetta, Valentina; Farre, Ramón; Duran-Cantolla, Joaquin; Parra, Olga; Barbé, Ferran; Montserrat, Josep M.; Masa, Juan F.
2014-01-01
Study Objectives: Obstructive sleep apnea (OSA) diagnosis using simplified methods such as portable sleep monitoring (PM) is only recommended in patients with a high pretest probability. The aim is to determine the diagnostic efficacy, consequent therapeutic decision-making, and costs of OSA diagnosis using polysomnography (PSG) versus three consecutive studies of PM in patients with mild to moderate suspicion of sleep apnea or with comorbidity that can mask OSA symptoms. Design and Setting: Randomized, blinded, crossover study of 3 nights of PM (3N-PM) versus PSG. The diagnostic efficacy was evaluated with receiver operating characteristic (ROC) curves. Therapeutic decisions to assess concordance between the two different approaches were performed by sleep physicians and respiratory physicians (staff and residents) using agreement level and kappa coefficient. The costs of each diagnostic strategy were considered. Patients and Results: Fifty-six patients were selected. Epworth Sleepiness Scale was 10.1 (5.3) points. Bland-Altman plot for apnea-hypopnea index (AHI) showed good agreement. ROC curves showed the best area under the curve in patients with PSG AHI ≥ 5 [0.955 (confidence interval = 0.862–0.993)]. For a PSG AHI ≥ 5, a PM AHI of 5 would effectively exclude and confirm OSA diagnosis. For a PSG AHI ≥ 15, a PM AHI ≥ 22 would confirm and PM AHI < 7 would exclude OSA. The best agreement of therapeutic decisions was achieved by the sleep medicine specialists (81.8%). The best cost-diagnostic efficacy was obtained by the 3N-PM. Conclusions: Three consecutive nights of portable monitoring at home evaluated by a qualified sleep specialist is useful for the management of patients without high pretest probability of obstructive sleep apnea or with comorbidities. Clinical Trial Registration: http://www.clinicaltrials.gov, registration number: NCT01820156 Citation: Guerrero A, Embid C, Isetta V, Farre R, Duran-Cantolla J, Parra O, Barbé F, Montserrat JM, Masa JF. 
Management of sleep apnea without high pretest probability or with comorbidities by three nights of portable sleep monitoring. SLEEP 2014;37(8):1363-1373. PMID:25083017
33 CFR 230.8 - Emergency actions.
Code of Federal Regulations, 2013 CFR
2013-07-01
... major in scope with potentially significant environmental impacts shall be referred through the division... DEFENSE PROCEDURES FOR IMPLEMENTING NEPA § 230.8 Emergency actions. In responding to emergency situations... this regulation. District commanders shall consider the probable environmental consequences in...
[Burden of atopic dermatitis in adults].
Misery, L
2017-12-01
Atopic dermatitis may have a very important impact on adults. Visible lesions, but especially near-permanent pruritus or sometimes pain lasting for decades, necessarily have consequences on all aspects of everyday life, including sleep and professional, social, family and emotional life. Financial consequences are also possible. Stigmatization, although poorly recognized, can be real. Treatments can be very demanding. Thus, quality of life can be greatly altered, and atopic dermatitis can be a heavy burden. The psychological consequences can be major. Comorbidity is increasingly emerging as a major problem. Patients can therefore be caught in a vicious circle, with the consequences of the disease aggravating the disease. The best way out is probably to have very effective and well-tolerated treatments. © 2017 Elsevier Masson SAS. All rights reserved.
Propensity, Probability, and Quantum Theory
NASA Astrophysics Data System (ADS)
Ballentine, Leslie E.
2016-08-01
Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes' theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.
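The Law of Large Numbers result invoked above can be illustrated with a minimal simulation: a fixed single-case propensity p yields long-run relative frequencies that converge to p. The value of p and the sample sizes are arbitrary choices for the demonstration.

```python
import random

def relative_frequency(p, n, seed=0):
    """Observed relative frequency of an event with single-case propensity p
    over n independent trials (seeded for reproducibility)."""
    rng = random.Random(seed)
    return sum(rng.random() < p for _ in range(n)) / n

p = 0.3
for n in (100, 10_000, 1_000_000):
    print(n, relative_frequency(p, n))  # frequencies approach p as n grows
```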
Developing questionnaires for educational research: AMEE Guide No. 87
Artino, Anthony R.; La Rochelle, Jeffrey S.; Dezee, Kent J.; Gehlbach, Hunter
2014-01-01
In this AMEE Guide, we consider the design and development of self-administered surveys, commonly called questionnaires. Questionnaires are widely employed in medical education research. Unfortunately, the processes used to develop such questionnaires vary in quality and lack consistent, rigorous standards. Consequently, the quality of the questionnaires used in medical education research is highly variable. To address this problem, this AMEE Guide presents a systematic, seven-step process for designing high-quality questionnaires, with particular emphasis on developing survey scales. These seven steps do not address all aspects of survey design, nor do they represent the only way to develop a high-quality questionnaire. Instead, these steps synthesize multiple survey design techniques and organize them into a cohesive process for questionnaire developers of all levels. Addressing each of these steps systematically will improve the probabilities that survey designers will accurately measure what they intend to measure. PMID:24661014
Gravity and count probabilities in an expanding universe
NASA Technical Reports Server (NTRS)
Bouchet, Francois R.; Hernquist, Lars
1992-01-01
The time evolution of nonlinear clustering on large scales in cold dark matter, hot dark matter, and white noise models of the universe is investigated using N-body simulations performed with a tree code. Count probabilities in cubic cells are determined as functions of the cell size and the clustering state (redshift), and comparisons are made with various theoretical models. We isolate the features that appear to be the result of gravitational instability, those that depend on the initial conditions, and those that are likely a consequence of numerical limitations. More specifically, we study the development of skewness, kurtosis, and the fifth moment in relation to variance, the dependence of the void probability on time as well as on sparseness of sampling, and the overall shape of the count probability distribution. Implications of our results for theoretical and observational studies are discussed.
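The count-in-cells moments and the void probability studied above can be estimated from data in a few lines; the small list of counts below is a made-up stand-in for galaxy counts measured in equal-size cubic cells of an N-body snapshot.

```python
def count_statistics(counts):
    """Mean, variance, skewness ratio m3/m2^2, and void probability P(N = 0)
    for counts of objects in equal-size cells."""
    n = len(counts)
    mean = sum(counts) / n
    m2 = sum((c - mean) ** 2 for c in counts) / n   # variance
    m3 = sum((c - mean) ** 3 for c in counts) / n   # third central moment
    void_prob = sum(c == 0 for c in counts) / n     # fraction of empty cells
    s3 = m3 / m2 ** 2 if m2 > 0 else float("nan")
    return mean, m2, s3, void_prob

# Invented cell counts for illustration only.
cells = [0, 0, 1, 0, 3, 7, 0, 2, 0, 1, 0, 9, 0, 0, 2, 1]
mean, var, s3, p0 = count_statistics(cells)
print(round(mean, 3), round(p0, 4))  # → 1.625 0.5
```

Tracking how these summaries scale with cell size and redshift is the comparison the paper makes against theoretical count-probability models.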
Buechel, Eva C.; Zhang, Jiao; Morewedge, Carey K.; Vosgerau, Joachim
2014-01-01
We propose that affective forecasters overestimate the extent to which experienced hedonic responses to an outcome are influenced by the probability of its occurrence. The experience of an outcome (e.g., winning a gamble) is typically more affectively intense than the simulation of that outcome (e.g., imagining winning a gamble) upon which the affective forecast for it is based. We suggest that, as a result, experiencers allocate a larger share of their attention toward the outcome (e.g., winning the gamble) and less to its probability specifications than do affective forecasters. Consequently, hedonic responses to an outcome are less sensitive to its probability specifications than are affective forecasts for that outcome. The results of 6 experiments provide support for our theory. Affective forecasters overestimated how sensitive experiencers would be to the probability of positive and negative outcomes (Experiments 1 and 2). Consistent with our attentional account, differences in sensitivity to probability specifications disappeared when the attention of forecasters was diverted from probability specifications (Experiment 3) or when the attention of experiencers was drawn toward probability specifications (Experiment 4). Finally, differences in sensitivity to probability specifications between forecasters and experiencers were diminished when the forecasted outcome was more affectively intense (Experiments 5 and 6). PMID:24128184
NASA Astrophysics Data System (ADS)
Avanzi, Francesco; De Michele, Carlo; Gabriele, Salvatore; Ghezzi, Antonio; Rosso, Renzo
2015-04-01
Here, we show how atmospheric circulation and topography rule the variability of depth-duration-frequency (DDF) curve parameters, and we discuss the physical implications of this variability for the formation of extreme precipitation at high elevations. A DDF curve gives the value of the maximum annual precipitation H as a function of duration D and probability level F. We consider around 1500 stations over the Italian territory, each with at least 20 years of data on maximum annual precipitation depth at different durations. We estimated the DDF parameters at each location by using the asymptotic distribution of extreme values, i.e. the so-called Generalized Extreme Value (GEV) distribution, and considering a statistical simple scale invariance hypothesis. Consequently, a DDF curve depends on five parameters. A first set relates H to the duration (namely, the mean value of the annual maximum precipitation depth for unit duration and the scaling exponent), while a second set links H to F (namely, a scale, position and shape parameter). The value of the shape parameter determines the type of random variable (unbounded, upper bounded or lower bounded). This extensive analysis shows that the variability of the mean value of the annual maximum precipitation depth for unit duration obeys the coupled effect of topography and the modal direction of moisture flux during extreme events. Median values of this parameter decrease with elevation. We call this phenomenon the "reverse orographic effect" on extreme precipitation of short durations, since it contrasts with general knowledge about the orographic effect on mean precipitation. Moreover, the scaling exponent is mainly driven by topography alone (with increasing values of this parameter at increasing elevations). Therefore, the quantiles of H(D,F) at durations greater than the unit duration turn out to be more variable at high elevations than at low elevations.
Additionally, the analysis of the variability of the shape parameter with elevation shows that extreme events at high elevations appear to be distributed according to an upper-bounded probability distribution. This evidence could be a characteristic sign of how extreme precipitation events form at high elevations.
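Under the simple scale-invariance hypothesis described above, a DDF curve can be sketched as H(D, F) = m1 · D^n · K(F), with K(F) a GEV growth factor. The parameter values below are invented for illustration (the paper estimates them per station); a negative shape parameter xi gives the upper-bounded case reported at high elevations.

```python
import math

def gev_quantile(F, mu, sigma, xi):
    """Inverse CDF of the GEV distribution (xi != 0 branch)."""
    return mu + (sigma / xi) * ((-math.log(F)) ** (-xi) - 1.0)

def ddf_depth(D_hours, F, m1=20.0, n=0.4, mu=1.0, sigma=0.25, xi=-0.1):
    """Annual-maximum depth quantile H(D, F) under simple scaling:
    H = m1 * D**n * K(F), with K(F) a GEV growth factor."""
    return m1 * D_hours ** n * gev_quantile(F, mu, sigma, xi)

# Depth grows with duration and with the non-exceedance probability F;
# with xi < 0 the growth factor saturates at the upper bound mu - sigma/xi.
print(round(ddf_depth(1.0, 0.9), 1), round(ddf_depth(24.0, 0.9), 1))
```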
Johnson, Patrick S; Sweeney, Mary M; Herrmann, Evan S; Johnson, Matthew W
2016-06-01
Alcohol use, especially at binge levels, is associated with sexual HIV risk behavior, but the mechanisms through which alcohol increases sexual risk taking are not well examined. Delay discounting, that is, devaluation of future consequences as a function of delay to their occurrence, has been implicated in a variety of problem behaviors, including risky sexual behavior. Probability discounting is studied within a similar framework as delay discounting, but is a distinct process in which a consequence is devalued because it is uncertain or probabilistic. Twenty-three nondependent alcohol users (13 male, 10 female; mean age = 25.3 years) orally consumed alcohol (1 g/kg) or placebo in 2 separate experimental sessions. During sessions, participants completed tasks examining delay and probability discounting of hypothetical condom-protected sex (Sexual Delay Discounting Task, Sexual Probability Discounting Task) and of hypothetical and real money. Alcohol decreased the likelihood that participants would wait to have condom-protected sex versus having immediate, unprotected sex. Alcohol also decreased the likelihood that participants would use an immediately available condom given a specified level of sexually transmitted infection (STI) risk. Alcohol did not affect delay discounting of money, but it did increase participants' preferences for larger, probabilistic monetary rewards over smaller, certain rewards. Acute, binge-level alcohol intoxication may increase sexual HIV risk by decreasing willingness to delay sex in order to acquire a condom in situations where one is not immediately available, and by decreasing sensitivity to perceived risk of STI contraction. These findings suggest that delay and probability discounting are critical, but heretofore unrecognized, processes that may mediate the relations between alcohol use and HIV risk. Copyright © 2016 by the Research Society on Alcoholism.
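The discounting models referenced above are conventionally written in hyperbolic form; the sketch below uses the standard textbook equations with invented parameter values, not the specific tasks from this study.

```python
def delay_discounted_value(amount, delay, k):
    """Hyperbolic delay discounting: V = A / (1 + k*D)."""
    return amount / (1.0 + k * delay)

def probability_discounted_value(amount, p, h):
    """Hyperbolic probability discounting: V = A / (1 + h*theta),
    where theta = (1 - p) / p is the odds against the outcome."""
    theta = (1.0 - p) / p
    return amount / (1.0 + h * theta)

# A larger k devalues delayed outcomes more steeply; a smaller h means
# less devaluation of uncertain outcomes (greater risk tolerance).
print(delay_discounted_value(100.0, 3.0, 1.0))        # → 25.0
print(probability_discounted_value(100.0, 0.5, 1.0))  # → 50.0
```

Fitting k and h per participant and condition is the usual way such tasks quantify how intoxication shifts sensitivity to delay and to probability.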
Estimating Consequences of MMOD Penetrations on ISS
NASA Technical Reports Server (NTRS)
Evans, H.; Hyde, James; Christiansen, E.; Lear, D.
2017-01-01
The threat from micrometeoroid and orbital debris (MMOD) impacts on space vehicles is often quantified in terms of the probability of no penetration (PNP). However, for large spacecraft, especially those with multiple compartments, a penetration may have a number of possible outcomes. The extent of the damage (diameter of hole, crack length or penetration depth), the location of the damage relative to critical equipment or crew, crew response, and even the time of day of the penetration are among the many factors that can affect the outcome. For the International Space Station (ISS), a Monte-Carlo style software code called Manned Spacecraft Crew Survivability (MSCSurv) is used to predict the probability of several outcomes of an MMOD penetration - broadly classified as loss of crew (LOC), crew evacuation (Evac), loss of escape vehicle (LEV), and nominal end of mission (NEOM). By generating large numbers of MMOD impacts (typically in the billions) and tracking the consequences, MSCSurv allows for the inclusion of a large number of parameters and models as well as enabling the consideration of uncertainties in the models and parameters. MSCSurv builds upon the results from NASA's Bumper software (which provides the probability of penetration and critical input data to MSCSurv) to allow analysts to estimate the probability of LOC, Evac, LEV, and NEOM. This paper briefly describes the overall methodology used by NASA to quantify LOC, Evac, LEV, and NEOM with particular emphasis on describing in broad terms how MSCSurv works and its capabilities and most significant models.
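MSCSurv itself is not public; the toy Monte Carlo below only illustrates the general approach the abstract describes - sample many impacts, classify each outcome, and estimate outcome probabilities from the tallies. The outcome categories follow the paper (LOC, Evac, LEV, NEOM), but the damage model and thresholds are invented for illustration, not NASA's criteria.

```python
import random

# Outcome categories from the abstract; everything else is assumed.
OUTCOMES = ("LOC", "Evac", "LEV", "NEOM")

def classify(hole_diameter_cm):
    """Map a sampled damage size to an outcome (hypothetical thresholds)."""
    if hole_diameter_cm > 5.0:
        return "LOC"
    if hole_diameter_cm > 2.0:
        return "Evac"
    if hole_diameter_cm > 1.0:
        return "LEV"
    return "NEOM"

def simulate(n_impacts, seed=0):
    """Estimate outcome probabilities by sampling many penetrations."""
    rng = random.Random(seed)
    counts = {o: 0 for o in OUTCOMES}
    for _ in range(n_impacts):
        # Assumed heavy-tailed damage-size distribution (mean 1 cm).
        counts[classify(rng.expovariate(1.0))] += 1
    return {o: counts[o] / n_impacts for o in OUTCOMES}
```

With enough samples, rare but severe outcomes (here, LOC) receive stable probability estimates even though each individual impact is overwhelmingly likely to be benign.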
Predicting the Consequences of MMOD Penetrations on the International Space Station
NASA Technical Reports Server (NTRS)
Hyde, James; Christiansen, E.; Lear, D.; Evans
2018-01-01
The threat from micrometeoroid and orbital debris (MMOD) impacts on space vehicles is often quantified in terms of the probability of no penetration (PNP). However, for large spacecraft, especially those with multiple compartments, a penetration may have a number of possible outcomes. The extent of the damage (diameter of hole, crack length or penetration depth), the location of the damage relative to critical equipment or crew, crew response, and even the time of day of the penetration are among the many factors that can affect the outcome. For the International Space Station (ISS), a Monte-Carlo style software code called Manned Spacecraft Crew Survivability (MSCSurv) is used to predict the probability of several outcomes of an MMOD penetration - broadly classified as loss of crew (LOC), crew evacuation (Evac), loss of escape vehicle (LEV), and nominal end of mission (NEOM). By generating large numbers of MMOD impacts (typically in the billions) and tracking the consequences, MSCSurv allows for the inclusion of a large number of parameters and models as well as enabling the consideration of uncertainties in the models and parameters. MSCSurv builds upon the results from NASA's Bumper software (which provides the probability of penetration and critical input data to MSCSurv) to allow analysts to estimate the probability of LOC, Evac, LEV, and NEOM. This paper briefly describes the overall methodology used by NASA to quantify LOC, Evac, LEV, and NEOM with particular emphasis on describing in broad terms how MSCSurv works and its capabilities and most significant models.
10 CFR 963.17 - Postclosure suitability criteria.
Code of Federal Regulations, 2014 CFR
2014-01-01
...) Near field geochemical environment—for example, the chemical reactions and products resulting from... probability and potential consequences of a self-sustaining nuclear reaction as a result of chemical or..., drip shields, backfill, coatings, or chemical modifications, and (ii) Waste package degradation—for...
10 CFR 963.17 - Postclosure suitability criteria.
Code of Federal Regulations, 2012 CFR
2012-01-01
...) Near field geochemical environment—for example, the chemical reactions and products resulting from... probability and potential consequences of a self-sustaining nuclear reaction as a result of chemical or..., drip shields, backfill, coatings, or chemical modifications, and (ii) Waste package degradation—for...
10 CFR 963.17 - Postclosure suitability criteria.
Code of Federal Regulations, 2010 CFR
2010-01-01
...) Near field geochemical environment—for example, the chemical reactions and products resulting from... probability and potential consequences of a self-sustaining nuclear reaction as a result of chemical or..., drip shields, backfill, coatings, or chemical modifications, and (ii) Waste package degradation—for...
10 CFR 963.17 - Postclosure suitability criteria.
Code of Federal Regulations, 2013 CFR
2013-01-01
...) Near field geochemical environment—for example, the chemical reactions and products resulting from... probability and potential consequences of a self-sustaining nuclear reaction as a result of chemical or..., drip shields, backfill, coatings, or chemical modifications, and (ii) Waste package degradation—for...
10 CFR 963.17 - Postclosure suitability criteria.
Code of Federal Regulations, 2011 CFR
2011-01-01
...) Near field geochemical environment—for example, the chemical reactions and products resulting from... probability and potential consequences of a self-sustaining nuclear reaction as a result of chemical or..., drip shields, backfill, coatings, or chemical modifications, and (ii) Waste package degradation—for...
NASA Astrophysics Data System (ADS)
Ullah, Irshad; Baharom, MNR; Ahmed, H.; Luqman, HM.; Zainal, Zainab
2017-11-01
Protection against lightning remains a challenging problem for researchers. Understanding the consequences of lightning strikes on buildings of different shapes requires comprehensive knowledge if that information is to be conveyed to the public. This paper is mainly concerned with lightning strike patterns on buildings of different shapes. The work is based on practical experiments in a high-voltage laboratory. Scaled structures of different shapes were selected in order to investigate the distribution of lightning voltage; an equal distribution indicates the location of maximum lightning strike probability on the air terminal of each shape. Building shape plays a very important role in lightning protection: rooftops differ in geometry, and the Franklin rod installation varies accordingly with the shape of the rooftop. Reflecting the ambient weather conditions of Malaysia, a high-voltage impulse was applied to lightning rods installed on the different geometrical shapes. An equal distribution of the impulse voltage was obtained because the geometry of the scaled structures was identical and the air gap was kept the same for all tested objects. This equal distribution of lightning voltage also shows that lightning strikes are most probable at the corners and edges of a building structure.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Copping, Andrea E.; Blake, Kara M.; Anderson, Richard M.
2011-09-01
Potential environmental effects of marine and hydrokinetic (MHK) energy development are not well understood, and yet regulatory agencies are required to make decisions in spite of substantial uncertainty about environmental impacts and their long-term consequences. An understanding of risks associated with interactions between MHK installations and aquatic receptors, including animals, habitats, and ecosystems, can help define key uncertainties and focus regulatory actions and scientific studies on the interactions of most concern. As a first step in developing the Pacific Northwest National Laboratory (PNNL) Environmental Risk Evaluation System (ERES), PNNL scientists conducted a preliminary risk screening analysis on three initial MHK cases. During FY 2011, two additional cases were added: a tidal project in the Gulf of Maine using Ocean Renewable Power Company TidGen™ turbines and a wave project planned for the coast of Oregon using Aquamarine Oyster surge devices. Through an iterative process, the screening analysis revealed that top-tier stressors in the two FY 2011 cases were the dynamic effects of the device (e.g., strike), accidents/disasters, and effects of the static physical presence of the device (e.g., habitat alteration). Receptor interactions with these stressors at the highest tiers of risk were dominated by threatened and endangered animals. Risk to the physical environment from changes in flow regime also ranked high. Peer review of this process and its results will be conducted in early FY 2012. The ERES screening analysis assesses the vulnerability of environmental receptors to stressors associated with MHK installations; probability analysis is needed to determine specific risk levels to receptors. “Risk” has two components: (1) the likelihood, or “probability”, of the occurrence of a given interaction or event, and (2) the potential “consequence” if that interaction or event were to occur.
During FY 2011, the ERES screening analysis focused primarily on the second component of risk, “consequence”, with focused probability analysis for interactions where data were sufficient for probability modeling. Consequence analysis provides an assessment of the vulnerability of environmental receptors to stressors associated with MHK installations. Probability analysis is needed to determine specific risk levels to receptors and requires significant data inputs to drive risk models. During FY 2011, two stressor-receptor interactions were examined for the probability of occurrence. The two interactions (spill probability due to an encounter between a surface vessel and an MHK device, and toxicity from anti-biofouling paints on MHK devices) were found to present relatively low risks to the marine and freshwater receptors of greatest concern in siting and permitting MHK devices. A third probability analysis was scoped and initial steps were taken to understand the risk of encounter between marine animals and rotating turbine blades. This analysis will be completed in FY 2012.
Assessing the chances of success: naïve statistics versus kind experience.
Hogarth, Robin M; Mukherjee, Kanchan; Soyer, Emre
2013-01-01
Additive integration of information is ubiquitous in judgment and has been shown to be effective even when multiplicative rules of probability theory are prescribed. We explore the generality of these findings in the context of estimating probabilities of success in contests. We first define a normative model of these probabilities that takes account of relative skill levels in contests where only a limited number of entrants can win. We then report 4 experiments using a scenario about a competition. Experiments 1 and 2 both elicited judgments of probabilities, and, although participants' responses demonstrated considerable variability, their mean judgments provide a good fit to a simple linear model. Experiment 3 explored choices. Most participants entered most contests and showed little awareness of appropriate probabilities. Experiment 4 investigated effects of providing aids to calculate probabilities, specifically, access to expert advice and 2 simulation tools. With these aids, estimates were accurate and decisions varied appropriately with economic consequences. We discuss implications by considering when additive decision rules are dysfunctional, the interpretation of overconfidence based on contest-entry behavior, and the use of aids to help people make better decisions.
The role of natural environments in the evolution of resistance traits in pathogenic bacteria.
Martinez, Jose L
2009-07-22
Antibiotics are among the most valuable compounds used for fighting human diseases. Unfortunately, pathogenic bacteria have evolved towards resistance. One important and frequently forgotten aspect of antibiotics and their resistance genes is that they evolved in non-clinical (natural) environments before the use of antibiotics by humans. Given that the biosphere is mainly formed by micro-organisms, learning the functional role of antibiotics and their resistance elements in nature has relevant implications both for human health and from an ecological perspective. Recent works have suggested that some antibiotics may serve for signalling purposes at the low concentrations probably found in natural ecosystems, whereas some antibiotic resistance genes were originally selected in their hosts for metabolic purposes or for signal trafficking. However, the high concentrations of antibiotics released in specific habitats (for instance, clinical settings) as a consequence of human activity can shift those functional roles. The pollution of natural ecosystems by antibiotics and resistance genes might have consequences for the evolution of the microbiosphere. Whereas antibiotics produce transient and usually local challenges in microbial communities, antibiotic resistance genes present in gene-transfer units can spread in nature with consequences for human health and the evolution of environmental microbiota that are largely ignored.
Climate warming may increase aphids' dropping probabilities in response to high temperatures.
Ma, Gang; Ma, Chun-Sen
2012-11-01
Dropping off is considered an anti-predator behavior for aphids, since previous studies have shown that it reduces the risk of predation. However, little attention has been paid to dropping behavior triggered by other external stresses, such as daytime high temperatures, which are predicted to become more frequent in the context of climate warming. Here we defined a new parameter, drop-off temperature (DOT), to describe the critical temperature at which an aphid drops off its host plant when the ambient temperature increases gradually and slowly. Detailed studies were conducted to reveal the effects of short-term acclimation (temperature, exposure time at high temperature and starvation) on the DOT of an aphid species, Sitobion avenae. Our objectives were to test whether the aphids dropped off the host plant to avoid high temperatures and how short-term acclimation affected the aphids' dropping behavior in response to heat stress. We suggest that dropping is a behavioral thermoregulation to avoid heat stress, since aphids started to move before they dropped off and the dropped aphids were still able to control their muscles prior to knockdown. The adults starved for 12 h had higher DOT values than those that were unstarved or starved for 6 h, and there was a trade-off between behavioral thermoregulation and energy acquisition. Higher temperatures and longer exposure times at high temperatures significantly lowered the aphids' DOT, suggesting that the aphids avoid heat stress by dropping when exposed to high temperatures. Climate warming may therefore increase the aphids' dropping probabilities and consequently affect the aphids' individual development and population growth. Copyright © 2012 Elsevier Ltd. All rights reserved.
Stevenson, D J
1981-11-06
Combined inferences from seismology, high-pressure experiment and theory, geomagnetism, fluid dynamics, and current views of terrestrial planetary evolution lead to models of the earth's core with the following properties. Core formation was contemporaneous with earth accretion; the core is not in chemical equilibrium with the mantle; the outer core is a fluid iron alloy containing significant quantities of lighter elements and is probably almost adiabatic and compositionally uniform; the more iron-rich inner solid core is a consequence of partial freezing of the outer core, and the energy release from this process sustains the earth's magnetic field; and the thermodynamic properties of the core are well constrained by the application of liquid-state theory to seismic and laboratory data.
Distribution of Causes in Selected US Aviation Accident Reports Between 1996 and 2003
NASA Technical Reports Server (NTRS)
Holloway, C. M.; Johnson, C. W.
2004-01-01
This paper describes the results of an independent analysis of the probable and contributory causes of selected aviation accidents in the United States between 1996 and 2003. The purpose of the study was to assess the comparative frequency of a variety of causal factors in the reporting of these adverse events. Although our results show that more of these high consequence accidents were attributed to human error than to any other single factor, a large number of reports also mentioned wider systemic issues, including the managerial and regulatory context of aviation operations. These wider issues are more likely to appear as contributory rather than primary causes in this set of accident reports.
Reefs as cradles of evolution and sources of biodiversity in the Phanerozoic.
Kiessling, Wolfgang; Simpson, Carl; Foote, Michael
2010-01-08
Large-scale biodiversity gradients among environments and habitats are usually attributed to a complex array of ecological and evolutionary factors. We tested the evolutionary component of such gradients by compiling the environments of the geologically oldest occurrences of marine genera and using sampling standardization to assess if originations tended to be clustered in particular environments. Shallow, tropical environments and carbonate substrates all tend to have harbored high origination rates. Diversity within these environments tended to be preferentially generated in reefs, probably because of their habitat complexity. Reefs were also prolific at exporting diversity to other environments, which might be a consequence of low-diversity habitats being more susceptible to invasions.
Lessons learned from the dog genome.
Wayne, Robert K; Ostrander, Elaine A
2007-11-01
Extensive genetic resources and a high-quality genome sequence position the dog as an important model species for understanding genome evolution, population genetics and genes underlying complex phenotypic traits. Newly developed genomic resources have expanded our understanding of canine evolutionary history and dog origins. Domestication involved genetic contributions from multiple populations of gray wolves probably through backcrossing. More recently, the advent of controlled breeding practices has segregated genetic variability into distinct dog breeds that possess specific phenotypic traits. Consequently, genome-wide association and selective sweep scans now allow the discovery of genes underlying breed-specific characteristics. The dog is finally emerging as a novel resource for studying the genetic basis of complex traits, including behavior.
Influence of superconductor film composition on adhesion strength of coated conductors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kesgin, Ibrahim; Khatri, Narayan; Liu, Yuhao
The effect of high temperature superconductor (HTS) film composition on the adhesion strength of rare-earth barium copper oxide coated conductors (CCs) has been studied. It has been found that the mechanical integrity of the superconductor layer is very susceptible to defects, especially those along the ab plane, probably due to the weak interfaces between the defects and the matrix. Gd and Y in the standard composition were substituted with Sm, and the number of in-plane defects was drastically reduced. Consequently, a four-fold increase in adhesion or peeling strength in Sm-based CCs was achieved compared to the standard GdYBCO samples.
Risk analysis for roadways subjected to multiple landslide-related hazards
NASA Astrophysics Data System (ADS)
Corominas, Jordi; Mavrouli, Olga
2014-05-01
Roadways through mountainous terrain often involve cuts and landslide areas whose stability is precarious and which require protection and stabilization works. To optimize the allocation of resources, government and technical offices are increasingly interested in risk analysis and assessment. Risk analysis has to consider both the hazard occurrence and the consequences. The consequences can be both direct and indirect. The former include the costs of repairing the roadway, the damage to vehicles and the potential fatalities, while the latter refer to the costs related to the diversion of vehicles, the excess distance travelled, the time differences, and tolls. The types of slope instability that may affect a roadway vary, and so do their effects. Most current approaches either consider a single hazardous phenomenon at a time, or, if applied at a small (for example, national) scale, do not take into account local conditions at each section of the roadway. The objective of this work is the development of a simple and comprehensive methodology for the assessment of the risk due to multiple hazards along roadways, integrating different landslide types, including rockfalls and debris flows, and considering as well the potential failure of retaining walls. To quantify risk, all hazards are expressed in a common term: their probability of occurrence. The methodology takes into consideration the specific local conditions along the roadway. For rockfalls and debris flows, a variety of methods exist for assessing the probability of occurrence. To assess the annual probability of failure of retaining walls, we use an indicator-based model that provides a hazard index. The model parameters consist of the design safety factor and further anchorage design and construction parameters.
The probability of failure is evaluated as a function of the hazard index and then corrected (in terms of order of magnitude) according to in situ observations of increases in two dynamic factors: the service load and the wall deformation. The consequences are then calculated for each hazard type according to its characteristics (mechanism, magnitude, frequency). This method differs from other methodologies for landslide-related hazards in the hazard scenarios and consequence profiles that are investigated. The depth of the analysis makes it possible to account for local conditions concerning either the hazard or the consequences (the latter with respect to the very particular characteristics of the roadway, such as traffic, number of lanes, velocity…). Furthermore, it provides an extensive list of quantitative risk descriptors, including both individual and collective ones. The methodology was automated using Microsoft Excel spreadsheets. The results can be used to support decision-making in the planning of protection measures. Gaps in knowledge and restrictions are discussed as well.
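The common-term aggregation the abstract describes - expressing every hazard by its probability of occurrence and combining it with its consequences - can be sketched as an expected annual loss. The hazard names, annual probabilities and direct-cost figures below are invented for illustration, not values from the study:

```python
# Hypothetical multi-hazard inventory for one roadway section.
hazards = {
    "rockfall":     {"annual_prob": 0.10,  "consequence_cost": 50_000},
    "debris_flow":  {"annual_prob": 0.02,  "consequence_cost": 200_000},
    "wall_failure": {"annual_prob": 0.005, "consequence_cost": 400_000},
}

def annual_risk(hazards):
    """Expected annual loss: sum over hazards of probability x consequence."""
    return sum(h["annual_prob"] * h["consequence_cost"]
               for h in hazards.values())
```

With these illustrative numbers the section's expected annual loss is about 11,000 cost units, and the per-hazard terms show which phenomenon dominates the total.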
Antecedents and Consequences of Work Engagement Among Nurses
Sohrabizadeh, Sanaz; Sayfouri, Nasrin
2014-01-01
Background: Engaged nurses have high levels of energy and are enthusiastic about their work, which impacts the quality of health care services. However, in the context of Iran, where burnout has been observed, work engagement among nurses necessitates immediate exploration. Objectives: This investigation aimed to identify a suitable work engagement model for the nursing profession in hospitals according to the hypothesized model and to determine the antecedents and consequences of work engagement among nurses. Patients and Methods: In this cross-sectional study, a questionnaire was given to 279 randomly selected nurses working in two general teaching hospitals of Shiraz University of Medical Sciences (Shiraz, Iran) to measure the antecedents and consequences of work engagement using Saks’s (2005) model. Structural Equation Modeling was used to examine the model fitness. Results: Two paths were added using LISREL software. The resulting model showed good fitness indices (χ2 = 23.62, AGFI = 0.93, CFI = 0.97, RMSEA = 0.07), and all the coefficients of the paths were significant (t ≥ 2, t ≤ -2). A significant correlation was found between work engagement and the model variables. Conclusions: Paying adequate attention to the antecedents of work engagement can enhance the quality of performance among nurses. Additionally, rewards, organizational and supervisory support, and job characteristics should be taken into consideration to establish work engagement among nurses. Further research is required to identify other probable antecedents and consequences of nursing work engagement, which might be related to specific cultural settings. PMID:25763212
Trask, Newell J.
1994-01-01
Concern with the threat posed by terrestrial asteroid and comet impacts has heightened as the catastrophic consequences of such events have become better appreciated. Although the probabilities of such impacts are very small, a reasonable question for debate is whether such phenomena should be taken into account in deciding policy for the management of spent fuel and high-level radioactive waste. The rate at which asteroid or comet impacts would affect areas of surface storage of radioactive waste is about the same as the estimated rate at which volcanic activity would affect the Yucca Mountain area. The Underground Retrievable Storage (URS) concept could satisfactorily reduce the risk from cosmic impact with its associated uncertainties in addition to providing other benefits described by previous authors.
ERIC Educational Resources Information Center
Trede, Mildred
1991-01-01
The "Game of Decisions" is presented to encourage students to consider the consequences of banning books and/or ideas. The game involves story writing, creating probability graphs, writing a letter protesting censorship from a chosen historical period, and examining a controversial science issue. Three thesis statements for generating group…
CLIMATE VARIABILITY, ANTHROPOGENIC CHANGE, AND CONSEQUENCES IN THE MID-ATLANTIC
When compared to the preceding millennium, the rate of temperature change over the past century strongly suggests that we are in a period of rapid global climate change. Globally, continued anthropogenic increases in concentrations of atmospheric greenhouse gases probably will re...
Ockham's razor and Bayesian analysis. [statistical theory for systems evaluation
NASA Technical Reports Server (NTRS)
Jefferys, William H.; Berger, James O.
1992-01-01
'Ockham's razor', the ad hoc principle enjoining the greatest possible simplicity in theoretical explanations, is presently shown to be justifiable as a consequence of Bayesian inference; Bayesian analysis can, moreover, clarify the nature of the 'simplest' hypothesis consistent with the given data. By choosing the prior probabilities of hypotheses, it becomes possible to quantify the scientific judgment that simpler hypotheses are more likely to be correct. Bayesian analysis also shows that a hypothesis with fewer adjustable parameters intrinsically possesses an enhanced posterior probability, due to the clarity of its predictions.
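The automatic Ockham penalty described above can be illustrated numerically. The example below is invented, not from the paper: a sharp hypothesis H0 with no free parameter concentrates its probability on the observed datum, while a flexible alternative H1 spreads its predictive probability over a uniform prior on a free location parameter, and pays for that flexibility in its marginal likelihood:

```python
import math

# H0 fixes mu = 0.5; H1 lets mu vary uniformly on [0, 1].
# Both share a Gaussian likelihood with sigma = 0.05 (all values assumed).
def gauss(x, mu, sigma=0.05):
    """Gaussian likelihood density."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def marginal_likelihood_h1(x, n_grid=1000):
    """Average the likelihood over H1's uniform prior (midpoint rule)."""
    return sum(gauss(x, (i + 0.5) / n_grid) for i in range(n_grid)) / n_grid

x_obs = 0.5
bayes_factor = gauss(x_obs, 0.5) / marginal_likelihood_h1(x_obs)
# bayes_factor > 1: spreading probability over the free parameter lowers
# H1's marginal likelihood, so the simpler hypothesis is favored.
```

With these assumed numbers, the Bayes factor comes out at roughly 8 in favor of the sharper hypothesis, even though H1 can fit the datum exactly at its best-fitting parameter value.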
Continuity equation for probability as a requirement of inference over paths
NASA Astrophysics Data System (ADS)
González, Diego; Díaz, Daniela; Davis, Sergio
2016-09-01
Local conservation of probability, expressed as the continuity equation, is a central feature of non-equilibrium Statistical Mechanics. In the existing literature, the continuity equation is always motivated by heuristic arguments with no derivation from first principles. In this work we show that the continuity equation is a logical consequence of the laws of probability and the application of the formalism of inference over paths for dynamical systems. That is, the simple postulate that a system moves continuously through time following paths implies the continuity equation. The translation from the language of dynamical paths to the usual representation in terms of probability densities of states is performed by means of an identity derived from Bayes' theorem. The formalism presented here is valid independently of the nature of the system studied: it is applicable to physical systems and also to more abstract dynamics such as financial indicators and population dynamics in ecology, among others.
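For reference, the continuity equation in question - relating the probability density ρ(x, t) to the probability current J(x, t) - takes the standard form (the abstract's notation is not given, so the symbols here are the conventional ones):

```latex
% Local conservation of probability: density rho and current J
\frac{\partial \rho(x,t)}{\partial t} + \nabla \cdot \mathbf{J}(x,t) = 0
```

Integrating this over any fixed volume shows that probability inside the volume changes only through the flux of J across its boundary, which is the "local conservation" the abstract refers to.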
Nielsen, Bjørn G; Jensen, Morten Ø; Bohr, Henrik G
2003-01-01
The structure of enkephalin, a small neuropeptide with five amino acids, has been simulated on computers using molecular dynamics. Such simulations exhibit a few stable conformations, which also have been identified experimentally. The simulations provide the possibility to perform cluster analysis in the space defined by potentially pharmacophoric measures such as dihedral angles, side-chain orientation, etc. By analyzing the statistics of the resulting clusters, the probability distribution of the side-chain conformations may be determined. These probabilities allow us to predict the selectivity of [Leu]enkephalin and [Met]enkephalin to the known mu- and delta-type opiate receptors to which they bind as agonists. Other plausible consequences of these probability distributions are discussed in relation to the way in which they may influence the dynamics of the synapse. Copyright 2003 Wiley Periodicals, Inc. Biopolymers (Pept Sci) 71: 577-592, 2003
Cushing's Syndrome: Where and How to Find It.
Debono, Miguel; Newell-Price, John D
2016-01-01
The diagnosis of Cushing's syndrome is challenging to endocrinologists as patients often present with an insidious history, together with subtle external clinical features. Moreover, complications of endogenous hypercortisolism, such as visceral obesity, diabetes, hypertension and osteoporosis, are conditions commonly found in the population, and discerning whether these are truly a consequence of hypercortisolism is not straightforward. To avoid misdiagnosis, a careful investigative approach is essential. The investigation of Cushing's syndrome is a three-step process. Firstly, after exclusion of exogenous glucocorticoid use, the decision to initiate investigations should be based on whether there is a clinical index of suspicion of the disease. Specific signs of endogenous hypercortisolism raise the a priori probability of a truly positive test. Secondly, if the probability of hypercortisolism is high, one should carry out specific tests as indicated by Endocrine Society guidelines. Populations with non-distinguishing features of Cushing's syndrome should not be screened routinely as biochemical tests have a high false-positive rate if used indiscriminately. Thirdly, once hypercortisolism is confirmed, one should move to establish the cause. This usually entails distinguishing between adrenal or pituitary-related causes and the remoter possibility of the ectopic adrenocorticotropic hormone syndrome. It is crucial that the presence of Cushing's syndrome is established before any attempt at differential diagnosis. © 2016 S. Karger AG, Basel.
Risk to life due to flooding in post-Katrina New Orleans
NASA Astrophysics Data System (ADS)
Miller, A.; Jonkman, S. N.; Van Ledden, M.
2015-01-01
Since the catastrophic flooding of New Orleans due to Hurricane Katrina in 2005, the city's hurricane protection system has been improved to provide protection against a hurricane load with a 1/100 per year exceedance frequency. This paper investigates the risk to life in post-Katrina New Orleans. In a flood risk analysis the probabilities and consequences of various flood scenarios have been analyzed for the central area of the city (the metro bowl) to give a preliminary estimate of the risk to life in the post-Katrina situation. A two-dimensional hydrodynamic model has been used to simulate flood characteristics of various breaches. The model for estimation of fatality rates is based on the loss of life data for Hurricane Katrina. Results indicate that - depending on the flood scenario - the estimated loss of life in case of flooding ranges from about 100 to nearly 500, with the highest life loss due to breaching of the river levees leading to large flood depths. The probability and consequence estimates are combined to determine the individual risk and societal risk for New Orleans. When compared to risks of other large-scale engineering systems (e.g., other flood prone areas, dams and the nuclear sector) and acceptable risk criteria found in literature, the risks for the metro bowl are found to be relatively high. Thus, despite major improvements to the flood protection system, the flood risk to life of post-Katrina New Orleans is still expected to be significant. Indicative effects of reduction strategies on the risk level are discussed as a basis for further evaluation and discussion.
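The combination of probability and consequence estimates into a societal risk figure can be sketched as an expected-annual-fatalities sum over flood scenarios. The fatality range (about 100 to nearly 500) comes from the abstract; the scenario names and annual breach probabilities below are invented for illustration:

```python
# Hypothetical scenario set for the metro bowl; only the loss-of-life
# magnitudes echo the abstract, the probabilities are assumed.
scenarios = [
    {"name": "river_levee_breach", "annual_prob": 1 / 1000, "loss_of_life": 500},
    {"name": "canal_breach",       "annual_prob": 1 / 500,  "loss_of_life": 100},
]

# Expected annual fatalities: sum of scenario probability x consequence.
expected_annual_fatalities = sum(
    s["annual_prob"] * s["loss_of_life"] for s in scenarios
)
```

Plotting the same scenario set as cumulative frequency of exceedance against number of fatalities would give the FN curve commonly used to compare societal risk against acceptability criteria, as the paper does for New Orleans.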
Reliability Analysis of a Glacier Lake Warning System Using a Bayesian Net
NASA Astrophysics Data System (ADS)
Sturny, Rouven A.; Bründl, Michael
2013-04-01
Besides structural mitigation measures such as avalanche defense structures, dams and galleries, warning and alarm systems have become important measures for dealing with Alpine natural hazards. Integrating them into risk mitigation strategies and comparing their effectiveness with structural measures requires quantification of the reliability of these systems. However, little is known about how the reliability of warning systems can be quantified and which methods are suitable for comparing their contribution to risk reduction with that of structural mitigation measures. We present a reliability analysis of a warning system located in Grindelwald, Switzerland, built to warn and protect residents and tourists from glacier outburst floods caused by a rapid drainage of the glacier lake. We have set up a Bayesian Net (BN) that allows for a qualitative and quantitative reliability analysis. The Conditional Probability Tables (CPT) of the BN were determined from the manufacturer's reliability data for each component of the system, as well as by assigning weights to specific BN nodes accounting for information flows and decision-making processes of the local safety service. The presented results focus on the two alerting units: the 'visual acoustic signal' (VAS) and the 'alerting of the intervention entities' (AIE). For the summer of 2009, the reliability was determined to be 94% for the VAS and 83% for the AIE. The probability of occurrence of a major event was calculated as 0.55% per day, resulting in an overall reliability of 99.967% for the VAS and 99.906% for the AIE. We concluded that a failure of the VAS alerting unit would require a simultaneous failure of the four probes located in the lake and the gorge. 
Similarly, we deduced that the AIE would fail in the event of a simultaneous connectivity loss of the mobile and fixed networks in Grindelwald, a loss of Internet access, or a failure of the regional operations centre. However, the probability of a common failure of these components was assumed to be low. Overall, it can be stated that, owing to numerous redundancies, the investigated warning system is highly reliable and its influence on risk reduction is very high. Comparable studies are needed in the future to put these results in context and to gain more experience in how the reliability of warning systems can be determined in practice.
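The overall-reliability figures quoted above follow from combining the daily event probability with each alerting unit's failure probability. A minimal sketch of that arithmetic (the combination rule is our assumption, inferred from the numbers in the abstract):

```python
# Reconstruction of the overall-reliability arithmetic implied by the abstract
# (the combination rule is our assumption, inferred from the quoted numbers):
# a warning failure requires a major event AND a failure of the alerting unit.
p_event = 0.0055   # probability of a major event, per day (0.55 %)
r_vas = 0.94       # reliability of the 'visual acoustic signal' unit
r_aie = 0.83       # reliability of the 'alerting of the intervention entities' unit

overall_vas = 1.0 - p_event * (1.0 - r_vas)
overall_aie = 1.0 - p_event * (1.0 - r_aie)

print(f"VAS: {overall_vas:.5f}")  # the abstract quotes 99.967 %
print(f"AIE: {overall_aie:.5f}")  # the abstract quotes 99.906 %
```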
Robinson, Gilpin R.; Lesure, Frank G.; Marlowe, J.I.; Foley, Nora K.; Clark, S.H.
1992-01-01
Vermiculite was produced from a large deposit near Tigerville, S.C., in the Inner Piedmont; the deposit is worked out and the mine has been backfilled. Smaller deposits associated with ultramafic rocks in the east flank of the Blue Ridge are now uneconomic and have not been worked in the past 20 years. C. Metals: Copper occurs in three deposits: the Fontana and Hazel Creek mines in the Great Smoky Mountains National Park in the Central Blue Ridge, and the Cullowhee mine in the east flank of the Blue Ridge. D. Organic fuels: The rocks of the quadrangle contain no coal and probably lie outside the maximum range in thermal maturity permitting the survival of oil. The rocks in the Valley and Ridge, and for a short distance eastward below the west flank of the Blue Ridge, probably lie within a zone of thermal maturity permitting the survival of natural gas. Consequently, the western part of the quadrangle is an area of high risk for hydrocarbon exploration. No exploratory drilling has been done in this belt.
Is the local linearity of space-time inherited from the linearity of probabilities?
NASA Astrophysics Data System (ADS)
Müller, Markus P.; Carrozza, Sylvain; Höhn, Philipp A.
2017-02-01
The appearance of linear spaces, describing physical quantities by vectors and tensors, is ubiquitous in all of physics, from classical mechanics to the modern notion of local Lorentz invariance. However, as natural as this seems to the physicist, most computer scientists would argue that something like a ‘local linear tangent space’ is not very typical and in fact a quite surprising property of any conceivable world or algorithm. In this paper, we take the perspective of the computer scientist seriously, and ask whether there could be any inherently information-theoretic reason to expect this notion of linearity to appear in physics. We give a series of simple arguments, spanning quantum information theory, group representation theory, and renormalization in quantum gravity, that supports a surprising thesis: namely, that the local linearity of space-time might ultimately be a consequence of the linearity of probabilities. While our arguments involve a fair amount of speculation, they have the virtue of being independent of any detailed assumptions on quantum gravity, and they are in harmony with several independent recent ideas on emergent space-time in high-energy physics.
What you see is what you expect: rapid scene understanding benefits from prior experience.
Greene, Michelle R; Botros, Abraham P; Beck, Diane M; Fei-Fei, Li
2015-05-01
Although we are able to rapidly understand novel scene images, little is known about the mechanisms that support this ability. Theories of optimal coding assert that prior visual experience can be used to ease the computational burden of visual processing. A consequence of this idea is that more probable visual inputs should be facilitated relative to more unlikely stimuli. In three experiments, we compared the perceptions of highly improbable real-world scenes (e.g., an underwater press conference) with common images matched for visual and semantic features. Although the two groups of images could not be distinguished by their low-level visual features, we found profound deficits related to the improbable images: Observers wrote poorer descriptions of these images (Exp. 1), had difficulties classifying the images as unusual (Exp. 2), and even had lower sensitivity to detect these images in noise than to detect their more probable counterparts (Exp. 3). Taken together, these results place a limit on our abilities for rapid scene perception and suggest that perception is facilitated by prior visual experience.
Target Tracking Using SePDAF under Ambiguous Angles for Distributed Array Radar
Long, Teng; Zhang, Honggang; Zeng, Tao; Chen, Xinliang; Liu, Quanhua; Zheng, Le
2016-01-01
Distributed array radar can improve radar detection capability and measurement accuracy. However, it suffers from cyclic ambiguity in its angle estimates because, according to the spatial Nyquist sampling theorem, the large sparse array is undersampled. Consequently, state estimation accuracy and track validity probability degrade when the ambiguous angles are used directly for target tracking. This paper proposes a second probability data association filter (SePDAF)-based tracking method for distributed array radar. First, the target motion model and radar measurement model are built. Second, the fused result of each radar's estimates is fed into an extended Kalman filter (EKF) to complete the first filtering stage. Third, taking this result as prior knowledge and associating it with the array-processed ambiguous angles, the SePDAF is applied to accomplish the second filtering stage, achieving a highly accurate and stable trajectory with relatively low computational complexity. Moreover, the azimuth filtering accuracy improves dramatically, and the position filtering accuracy also improves. Finally, simulations illustrate the effectiveness of the proposed method. PMID:27618058
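The SePDAF itself is beyond the scope of a short sketch, but the core idea of using the filter's prediction as prior knowledge to resolve cyclic angle ambiguity can be illustrated as follows. The grating-lobe period, measurement and prediction values here are hypothetical, and picking the nearest candidate is a simplification of the probabilistic association the paper describes:

```python
import math

# Sketch (not the paper's algorithm): a sparse array yields a set of cyclically
# ambiguous angle candidates; the predicted angle from the first filtering
# stage is used as prior knowledge to select the most plausible candidate.
def ambiguous_candidates(measured, period, lo=-math.pi, hi=math.pi):
    """All angles congruent to `measured` modulo the grating-lobe period."""
    k_min = math.ceil((lo - measured) / period)
    k_max = math.floor((hi - measured) / period)
    return [measured + k * period for k in range(k_min, k_max + 1)]

def resolve(measured, predicted, period):
    """Pick the ambiguous candidate closest to the predicted angle."""
    return min(ambiguous_candidates(measured, period),
               key=lambda a: abs(a - predicted))

# Example: grating-lobe period pi/2, measurement 0.1 rad, prediction 1.7 rad
print(resolve(0.1, 1.7, math.pi / 2))  # -> 0.1 + pi/2, approx. 1.6708 rad
```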
Tsujiuchi, Takuya; Yamaguchi, Maya; Masuda, Kazutaka; Tsuchida, Marisa; Inomata, Tadashi; Kumano, Hiroaki; Kikuchi, Yasushi; Augusterfer, Eugene F.; Mollica, Richard F.
2016-01-01
Objective This study investigated post-traumatic stress symptoms in the population affected by the Fukushima Nuclear Disaster, one year after the disaster. Additionally, we investigated social factors, such as forced displacement, which we hypothesize contributed to the high prevalence of post-traumatic stress. Finally, we report on written narratives that were collected from the impacted population. Design and Settings Using the Impact of Event Scale-Revised (IES-R), questionnaires were sent to 2,011 households of those displaced from Fukushima prefecture living temporarily in Saitama prefecture. Of the 490 replies, 350 met the criteria for inclusion in the study. Multiple logistic regression analysis was performed to examine several characteristics and variables of social factors as predictors of probable post-traumatic stress disorder (PTSD). Results The mean score of the IES-R was 36.15±21.55, with 59.4% having scores of 30 or higher, indicating probable PTSD. No significant differences in the percentages of high-risk subjects were found across sex, age, evacuation area, housing damage, tsunami exposure, family split-up, and acquaintance support. By multiple logistic regression analysis, the significant predictors of probable PTSD were chronic physical diseases (OR = 1.97), chronic mental diseases (OR = 6.25), worries about livelihood (OR = 2.27), lost jobs (OR = 1.71), lost social ties (OR = 2.27), and concerns about compensation (OR = 3.74). Conclusion Although there are limitations in assuming a diagnosis of PTSD based on the self-report IES-R, our findings indicate a high risk of PTSD strongly related to the nuclear disaster and its consequent evacuation and displacement. Therefore, recovery efforts must focus not only on medical and psychological treatment but also on social and economic issues related to the displacement. PMID:27002324
Asynchronous threat awareness by observer trials using crowd simulation
NASA Astrophysics Data System (ADS)
Dunau, Patrick; Huber, Samuel; Stein, Karin U.; Wellig, Peter
2016-10-01
The last few years have shown that a high risk of asynchronous threats exists in everyday life. Especially in large crowds, a high probability of asynchronous attacks is evident. High observational abilities to detect threats are therefore desirable, and consequently highly trained security and observation personnel are needed. This paper evaluates the effectiveness of a training methodology to enhance the performance of observation personnel engaging in a specific target identification task. For this purpose, a crowd simulation video is utilized. The study first provides a measurement of the base performance before the training sessions. A training procedure is then performed, and base performance is compared to post-training performance in order to look for a training effect. A thorough evaluation of both the training sessions and the overall performance is given in this paper, using a specific hypothesis-based metric. Results are discussed in order to provide guidelines for the design of training for observational tasks.
Degradation of Organics in a Glow Discharge Under Martian Conditions
NASA Technical Reports Server (NTRS)
Hintze, P. E.; Calle, L. M.; Calle, C. I.; Buhler, C. R.; Trigwell, S.; Starnes, J. W.; Schuerger, A. C.
2006-01-01
The primary objective of this project is to understand the consequences of glow electrical discharges on the chemistry and biology of Mars. The possibility was raised some time ago that the absence of organic material and carbonaceous matter in the Martian soil samples studied by the Viking landers might be due in part to an intrinsic atmospheric mechanism such as glow discharge. The high probability of dust interactions during Martian dust storms and dust devils, combined with the cold, dry climate of Mars, most likely results in airborne dust that is highly charged. The high electrostatic potentials generated during dust storms on Earth cannot be sustained in the low-pressure CO2 environment of Mars; instead, the release of electrostatic energy in the form of glow discharges is a highly likely phenomenon. Since glow discharge methods are used for cleaning and sterilizing surfaces throughout industry, the idea that dust in the Martian atmosphere has undergone a cleaning action many times over geologic time scales appears to be a plausible one.
Chiu, Yu-Han; Williams, Paige L.; Gillman, Matthew W.; Gaskins, Audrey J.; Mínguez-Alarcón, Lidia; Souter, Irene; Toth, Thomas L.; Ford, Jennifer B.; Hauser, Russ; Chavarro, Jorge E.
2018-01-01
IMPORTANCE Animal experiments suggest that ingestion of pesticide mixtures at environmentally relevant concentrations decreases the number of live-born offspring. Whether the same is true in humans is unknown. OBJECTIVE To examine the association of preconception intake of pesticide residues in fruits and vegetables (FVs) with outcomes of infertility treatment with assisted reproductive technologies (ART). DESIGN, SETTING, AND PARTICIPANTS This analysis included 325 women who completed a diet assessment and subsequently underwent 541 ART cycles in the Environment and Reproductive Health (EARTH) prospective cohort study (2007–2016) at a fertility center at a teaching hospital. We categorized FVs as having high or low pesticide residues using a validated method based on surveillance data from the US Department of Agriculture. Cluster-weighted generalized estimating equations were used to analyze associations of high– and low–pesticide residue FV intake with ART outcomes. MAIN OUTCOMES AND MEASURES Adjusted probabilities of clinical pregnancy and live birth per treatment cycle. RESULTS In the 325 participants (mean [SD] age, 35.1 [4.0] y; body mass index, 24.1 [4.3]), mean (SD) intakes of high– and low–pesticide residue FVs were 1.7 (1.0) and 2.8 (1.6) servings/d, respectively. Greater intake of high–pesticide residue FVs was associated with a lower probability of clinical pregnancy and live birth. Compared with women in the lowest quartile of high-pesticide FV intake (<1.0 servings/d), women in the highest quartile (≥ 2.3 servings/d) had 18% (95% CI, 5%–30%) lower probability of clinical pregnancy and 26% (95% CI, 13%–37%) lower probability of live birth. Intake of low–pesticide residue FVs was not significantly related to ART outcomes. CONCLUSIONS AND RELEVANCE Higher consumption of high–pesticide residue FVs was associated with lower probabilities of pregnancy and live birth following infertility treatment with ART. 
These data suggest that dietary pesticide exposure within the range of typical human exposure may be associated with adverse reproductive consequences. PMID:29084307
NASA Astrophysics Data System (ADS)
Zhang, Jiaxin; Shields, Michael D.
2018-01-01
This paper addresses the problem of uncertainty quantification and propagation when data for characterizing probability distributions are scarce. We propose a methodology wherein the full uncertainty associated with probability model form and parameter estimation are retained and efficiently propagated. This is achieved by applying the information-theoretic multimodel inference method to identify plausible candidate probability densities and the associated probability that each candidate is the best model in the Kullback-Leibler sense. The joint parameter densities for each plausible model are then estimated using Bayes' rule. We then propagate this full set of probability models by estimating an optimal importance sampling density that is representative of all plausible models, propagating this density, and reweighting the samples according to each of the candidate probability models. This is in contrast with conventional methods that try to identify a single probability model that encapsulates the full uncertainty caused by lack of data and consequently underestimate uncertainty. The result is a complete probabilistic description of both aleatory and epistemic uncertainty achieved with several orders of magnitude reduction in computational cost. It is shown how the framework can be updated to adaptively accommodate added data and added candidate probability models. The method is applied for uncertainty analysis of plate buckling strength where it is demonstrated how dataset size affects the confidence (or lack thereof) we can place in statistical estimates of response when data are lacking.
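The model-scoring step of such a multimodel workflow can be sketched with Akaike weights, a common information-theoretic stand-in for the probability that each candidate is the best model. The two candidate models and the synthetic data below are our assumptions, not the authors' setup:

```python
import math
import random

# Sketch (assumed workflow, not the authors' code): score candidate probability
# models for scarce data by AIC and convert the scores into model weights.
random.seed(0)
data = [random.gauss(10.0, 2.0) for _ in range(20)]   # scarce synthetic data

def gauss_loglik(xs, mu, sigma):
    return sum(-0.5 * math.log(2 * math.pi * sigma**2)
               - (x - mu)**2 / (2 * sigma**2) for x in xs)

n = len(data)
mu = sum(data) / n
sigma = math.sqrt(sum((x - mu)**2 for x in data) / n)

# Two hypothetical candidates: fitted normal (2 params) vs fixed-scale normal (1 param)
candidates = {
    "normal(mu,sigma)": (gauss_loglik(data, mu, sigma), 2),
    "normal(mu,1)":     (gauss_loglik(data, mu, 1.0), 1),
}

aic = {name: 2 * k - 2 * ll for name, (ll, k) in candidates.items()}
best = min(aic.values())
raw = {name: math.exp(-0.5 * (a - best)) for name, a in aic.items()}
weights = {name: r / sum(raw.values()) for name, r in raw.items()}
print(weights)  # Akaike weights: probability each candidate is the best model
```

In the paper's framework these weights would then drive the construction of a mixture importance sampling density covering all plausible models.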
Individual Values, Learning Routines and Academic Procrastination
ERIC Educational Resources Information Center
Dietz, Franziska; Hofer, Manfred; Fries, Stefan
2007-01-01
Background: Academic procrastination, the tendency to postpone learning activities, is regarded as a consequence of postmodern values that are prominent in post-industrialized societies. When students strive for leisure goals and have no structured routines for academic tasks, delaying strenuous learning activities becomes probable. Aims: The…
Ehlers, Ute Christine; Ryeng, Eirin Olaussen; McCormack, Edward; Khan, Faisal; Ehlers, Sören
2017-02-01
The safety effects of cooperative intelligent transport systems (C-ITS) are mostly unknown and associated with uncertainties, because these systems represent emerging technology. This study proposes a bowtie analysis as a conceptual framework for evaluating the safety effect of cooperative intelligent transport systems. These seek to prevent road traffic accidents or mitigate their consequences. Under the assumption of the potential occurrence of a particular single vehicle accident, three case studies demonstrate the application of the bowtie analysis approach in road traffic safety. The approach utilizes exemplary expert estimates and knowledge from literature on the probability of the occurrence of accident risk factors and of the success of safety measures. Fuzzy set theory is applied to handle uncertainty in expert knowledge. Based on this approach, a useful tool is developed to estimate the effects of safety-related cooperative intelligent transport systems in terms of the expected change in accident occurrence and consequence probability. Copyright © 2016 Elsevier Ltd. All rights reserved.
The Madelung Picture as a Foundation of Geometric Quantum Theory
NASA Astrophysics Data System (ADS)
Reddiger, Maik
2017-10-01
Despite its age, quantum theory still suffers from serious conceptual difficulties. To create clarity, mathematical physicists have been attempting to formulate quantum theory geometrically and to find a rigorous method of quantization, but this has not resolved the problem. In this article we argue that a quantum theory recursing to quantization algorithms is necessarily incomplete. To provide an alternative approach, we show that the Schrödinger equation is a consequence of three partial differential equations governing the time evolution of a given probability density. These equations, discovered by Madelung, naturally ground the Schrödinger theory in Newtonian mechanics and Kolmogorovian probability theory. A variety of far-reaching consequences for the projection postulate, the correspondence principle, the measurement problem, the uncertainty principle, and the modeling of particle creation and annihilation are immediate. We also give a speculative interpretation of the equations following Bohm, Vigier and Tsekov, by claiming that quantum mechanical behavior is possibly caused by gravitational background noise.
Bayesian analysis of rare events
NASA Astrophysics Data System (ADS)
Straub, Daniel; Papaioannou, Iason; Betz, Wolfgang
2016-06-01
In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
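The rejection-sampling view of Bayesian updating that BUS builds on can be sketched in a few lines. The prior, the observation model and the rare-event threshold below are invented for illustration, and BUS itself replaces the crude Monte Carlo counting used here with FORM, importance sampling or Subset Simulation:

```python
import math
import random

# Minimal sketch (assumed setup, not the BUS implementation) of Bayesian
# updating by rejection sampling: draw the uncertain parameter from its prior,
# accept it with probability L(theta)/c, then estimate a rare-event probability
# under the posterior from the accepted samples.
random.seed(1)

def likelihood(theta):
    # One noisy observation y_obs = 1.2 of theta, Gaussian noise sigma = 0.5
    return math.exp(-0.5 * ((1.2 - theta) / 0.5) ** 2)

c = 1.0  # upper bound on the likelihood (its maximum, at theta = 1.2)

posterior = []
while len(posterior) < 5000:
    theta = random.gauss(0.0, 1.0)          # prior N(0, 1)
    if random.random() < likelihood(theta) / c:
        posterior.append(theta)

# Posterior probability of the (moderately) rare event theta > 2.5
p_rare = sum(t > 2.5 for t in posterior) / len(posterior)
print(p_rare)
```

The inefficiency of this plain counting estimate for genuinely rare events is exactly why the framework draws on FORM, IS and SuS.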
NASA Astrophysics Data System (ADS)
Biass, Sébastien; Falcone, Jean-Luc; Bonadonna, Costanza; Di Traglia, Federico; Pistolesi, Marco; Rosi, Mauro; Lestuzzi, Pierino
2016-10-01
We present a probabilistic approach to quantify the hazard posed by volcanic ballistic projectiles (VBP) and their potential impact on the built environment. A model named Great Balls of Fire (GBF) is introduced to describe ballistic trajectories of VBPs accounting for a variable drag coefficient and topography. It relies on input parameters easily identifiable in the field and is designed to model large numbers of VBPs stochastically. Associated functions come with the GBF code to post-process model outputs into a comprehensive probabilistic hazard assessment for VBP impacts. Outcomes include probability maps to exceed given thresholds of kinetic energies at impact, hazard curves and probabilistic isoenergy maps. Probabilities are calculated either on equally-sized pixels or zones of interest. The approach is calibrated, validated and applied to La Fossa volcano, Vulcano Island (Italy). We constructed a generic eruption scenario based on stratigraphic studies and numerical inversions of the 1888-1890 long-lasting Vulcanian cycle of La Fossa. Results suggest a ~10⁻²% probability of occurrence of VBP impacts with kinetic energies ≤ 10⁴ J at the touristic locality of Porto. In parallel, the vulnerability to roof perforation was estimated by combining field observations and published literature, allowing for a first estimate of the potential impact of VBPs during future Vulcanian eruptions. Results indicate a high physical vulnerability to the VBP hazard, and, consequently, half of the building stock having a ≥ 2.5 × 10⁻³% probability of roof perforation.
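A ballistic trajectory with drag of the kind GBF integrates can be sketched as follows. The constant drag coefficient, block size, mass and launch conditions are hypothetical values chosen for illustration (GBF itself uses a variable drag coefficient and real topography):

```python
import math

# Illustrative sketch (not the GBF code) of a 2-D ballistic trajectory with
# quadratic drag, integrated with a simple Euler scheme.
g = 9.81          # gravity, m/s^2
rho = 1.2         # air density, kg/m^3
cd = 0.6          # drag coefficient (assumed constant here)
d = 0.3           # block diameter, m (hypothetical)
m = 35.0          # block mass, kg (hypothetical)
area = math.pi * (d / 2) ** 2

def fly(v0, angle_deg, dt=1e-3):
    """Integrate until the block returns to launch height; return
    horizontal range and kinetic energy at impact."""
    vx = v0 * math.cos(math.radians(angle_deg))
    vy = v0 * math.sin(math.radians(angle_deg))
    x = y = 0.0
    while y >= 0.0:
        v = math.hypot(vx, vy)
        f = 0.5 * rho * cd * area * v * v     # drag force magnitude
        vx -= (f / m) * (vx / v) * dt
        vy -= (g + (f / m) * (vy / v)) * dt
        x += vx * dt
        y += vy * dt
    return x, 0.5 * m * v * v

rng, e_kin = fly(150.0, 45.0)
print(rng, e_kin)  # range well below the drag-free value of v0^2/g
```

Running many such trajectories with stochastically sampled launch parameters, and counting how often `e_kin` exceeds a threshold in each map pixel, is the essence of the probabilistic hazard maps described above.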
Paladino, Ombretta; Seyedsalehi, Mahdi; Massabò, Marco
2018-09-01
The use of fertilizers in greenhouse-grown crops can pose a threat to groundwater quality and, consequently, to human beings and the subterranean ecosystem, as intensive farming produces pollutant leaching. The Albenga plain (Liguria, Italy) is an alluvial area of about 45 km² historically devoted to farming. Recently the crops have evolved towards greenhouse horticulture and floriculture production, and high levels of nitrates have been detected in the area's groundwater. Lysimeters with three types of reconstituted soils (loamy sand, sandy clay loam and sandy loam) collected from different areas of the Albenga plain were used in this study to evaluate the leaching loss of nitrate (NO₃⁻) over a period of 12 weeks. Leaf lettuce (Lactuca sativa L.) was selected as a representative greenhouse-grown crop. Each of the soil samples was treated with a slow-release fertilizer, simulating the real fertilizing strategy of the tillage. In order to estimate the potential risk for aquifers as well as for organisms exposed via pore water, nitrate concentrations in groundwater were evaluated by applying a simplified attenuation model to the experimental data. Results were refined and extended from comparison of single effect and exposure values (Tier I level) up to the evaluation of probabilistic distributions of exposure and related effects (Tier II, III and IV levels). HHRA suggested HI > 1 and about a 20% probability of exceeding the RfD for all the greenhouses, regardless of the soil. ERA suggested HQ > 100 for all the greenhouses, and a 93% probability of PNEC exceedance for greenhouses containing sandy clay loam. The probability of exceeding the LC50 for 5% of the species was about 40%, and the probability corresponding to a DBQ of DEC/EC50 > 0.001 was above 90% for all the greenhouses. The significantly high risk, related to the detected nitrate leaching loss, can be attributed to excessive and inappropriate fertigation strategies. Copyright © 2018 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Martin-Rojas, Ivan; Alfaro, Pedro; Estévez, Antonio
2014-05-01
We present a study that combines several software tools (iGIS©, ArcGIS©, Autocad©, etc.) and data (geological mapping, high-resolution digital topographic data, high-resolution aerial photographs, etc.) to create a detailed 3D geometric model of an active fault propagation growth fold. This 3D model clearly shows structural features of the analysed fold, as well as growth relationships and sedimentary patterns. The results obtained permit us to discuss the kinematics and structural evolution of the fold and the fault in time and space. The studied fault propagation fold is the Crevillente syncline. This fold represents the northern limit of the Bajo Segura Basin, an intermontane basin in the Eastern Betic Cordillera (SE Spain) developed from the upper Miocene onward. The 3D features of the Crevillente syncline, including its growth pattern, indicate that limb rotation, and consequently fault activity, was higher during the Messinian than during the Tortonian. From the Pliocene onward, our data indicate that limb rotation, and hence fault activity, remained steady or probably decreased. This evolution in time of the Crevillente syncline is not the same all along the structure; in fact, the 3D geometric model indicates that the observed lateral heterogeneity is related to along-strike variation of fault displacement.
Price discrimination in obstetric services--a case study in Bangladesh.
Amin, Mohammad; Hanson, Kara; Mills, Anne
2004-06-01
This article examines the existence of price discrimination for obstetric services in two private hospitals in Bangladesh, and considers the welfare consequences of such discrimination, i.e. whether or not price discrimination benefited the poorer users. Data on 1212 normal and caesarean section patients discharged from the two hospitals were obtained. Obstetric services were chosen because they are relatively standardised and the patient population is relatively homogeneous, so minimising the scope and scale of product differentiation due to procedure and case-mix differences. The differences between the hospital list price for delivery and actual prices paid by patients were calculated to determine the average rate of discount. The welfare consequences of price discrimination were assessed by testing the differences in mean prices paid by patients from three income groups: low, middle and high. The results suggest that two different forms of price discrimination for obstetric services occurred in both these hospitals. First, there was price discrimination according to income, with the poorer users benefiting from a higher discount rate than richer ones; and second, there was price discrimination according to social status, with three high status occupational groups (doctors, senior government officials, and large businessmen) having the highest probability of receiving some level of discount. Copyright 2003 John Wiley & Sons, Ltd.
Moore, Clinton T.; Converse, Sarah J.; Folk, Martin J.; Runge, Michael C.; Nesbitt, Stephen A.
2012-01-01
The release of animals to reestablish an extirpated population is a decision problem that is often attended by considerable uncertainty about the probability of success. Annual releases of captive-reared juvenile Whooping Cranes (Grus americana) were begun in 1993 in central Florida, USA, to establish a breeding, non-migratory population. Over a 12-year period, 286 birds were released, but by 2004, the introduced flock had produced only four wild-fledged birds. Consequently, releases were halted over managers' concerns about the performance of the released flock and uncertainty about the efficacy of further releases. We used data on marked, released birds to develop predictive models for addressing whether releases should be resumed, and if so, under what schedule. To examine the outcome of different release scenarios, we simulated the survival and productivity of individual female birds under a baseline model that recognized age and breeding-class structure and which incorporated empirically estimated stochastic elements. As data on wild-fledged birds from captive-reared parents were sparse, a key uncertainty that confronts release decision-making is whether captive-reared birds and their offspring share the same vital rates. Therefore, we used data on the only population of wild Whooping Cranes in existence to construct two alternatives to the baseline model. The probability of population persistence was highly sensitive to the choice of these three models. Under the baseline model, extirpation of the population was nearly certain under any scenario of resumed releases. In contrast, the model based on estimates from wild birds projected a high probability of persistence under any release scenario, including cessation of releases. Therefore, belief in either of these models suggests that further releases are an ineffective use of resources. 
In the third model, which simulated a population Allee effect, population persistence was sensitive to the release decision: high persistence probability was achieved only through the release of more birds, whereas extirpation was highly probable with cessation of releases. Despite substantial investment of time and effort in the release program, evidence collected to date does not favor one model over another; therefore, any decision about further releases must be made under considerable biological uncertainty. However, given an assignment of credibility weight to each model, a best, informed decision about releases can be made under uncertainty. Furthermore, if managers can periodically revisit the release decision and collect monitoring data to further inform the models, then managers have a basis for confronting uncertainty and adaptively managing releases through time.
Toxic Elements in Tobacco and in Cigarette Smoke: Inflammation and Sensitization
Pappas, R.S.
2015-01-01
Biochemically and pathologically, there is strong evidence for both atopic and nonatopic airway sensitization, hyperresponsiveness, and inflammation as a consequence of exposure to tobacco mainstream or sidestream smoke particulate. There is growing evidence that exposure to mainstream and sidestream smoke is related to diseases resulting from reactive oxidant challenge and inflammation, arising both directly from the combined activity of neutrophils, macrophages, dendritic cells, eosinophils, and basophils and as a humoral immunological consequence of sensitization, and that the metal components of the particulate play an adjuvant role. As an end consequence, carcinogenicity is a known outcome of chronic inflammation. Smokeless tobacco has been evaluated by the IARC as a Group 1 carcinogen. Of the many harmful constituents in smokeless tobacco, oral tissue metallothionein gradients suggest that metals contribute to the toxicity of smokeless tobacco use and possibly to sensitization. This review examines work on the probable contributions of toxic metals from tobacco and smoke to the pathology observed as a consequence of smoking and the use of smokeless tobacco. PMID:21799956
NASA Technical Reports Server (NTRS)
Crespo, Luis G.; Bushnell, Dennis M. (Technical Monitor)
2002-01-01
This paper presents a study on the optimization of systems with structured uncertainties, whose inputs and outputs can be exhaustively described in the probabilistic sense. By propagating the uncertainty from the input to the output in the space of probability density functions and moments, optimization problems that pursue performance-, robustness-, and reliability-based designs are studied. By specifying the desired outputs first in terms of desired probability density functions and then in terms of meaningful probabilistic indices, we establish a computationally viable framework for solving practical optimization problems. Applications to static optimization and stability control are used to illustrate the relevance of incorporating uncertainty in the early stages of the design. Several examples that admit a full probabilistic description of the output in terms of the design variables and the uncertain inputs are used to elucidate the main features of the generic problem and its solution. Extensions to problems that do not admit closed-form solutions are also evaluated. Concrete evidence of the importance of using a consistent probabilistic formulation of the optimization problem and a meaningful probabilistic description of its solution is provided in the examples. In the stability control problem, the analysis shows that standard deterministic approaches lead to designs with a high probability of running into instability. The implementation of such designs can indeed have catastrophic consequences.
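As a minimal sketch of the propagation idea, assuming a hypothetical scalar static system and a Gaussian input density (both invented for illustration), one can push samples of the input PDF through the system and describe the output by its moments and a probabilistic performance index:

```python
import numpy as np

rng = np.random.default_rng(0)

def system(x, k):
    """Hypothetical static input-output map; k is a design variable."""
    return k * x - 0.1 * x**3

# Structured uncertainty: the input is described exhaustively by its PDF.
x = rng.normal(loc=1.0, scale=0.2, size=100_000)

for k in (0.5, 1.0):
    y = system(x, k)
    # Probabilistic description of the output: moments plus an index.
    print(f"k={k}: mean={y.mean():.3f}, std={y.std():.3f}, "
          f"P(y < 0.25)={np.mean(y < 0.25):.4f}")
```

Comparing designs by the full output description, rather than by a single nominal response, is what distinguishes the probabilistic formulation from a deterministic one.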
Ellison, L.E.; O'Shea, T.J.; Neubaum, D.J.; Neubaum, M.A.; Pearce, R.D.; Bowen, R.A.
2007-01-01
We compared conventional capture (primarily mist nets and harp traps) and passive integrated transponder (PIT) tagging techniques for estimating capture and survival probabilities of big brown bats (Eptesicus fuscus) roosting in buildings in Fort Collins, Colorado. A total of 987 female adult and juvenile bats were captured and marked by subdermal injection of PIT tags during the summers of 2001-2005 at five maternity colonies in buildings. Openings to roosts were equipped with PIT hoop-style readers, and exit and entry of bats were passively monitored on a daily basis throughout the summers of 2002-2005. PIT readers 'recaptured' adult and juvenile females more often than conventional capture events at each roost. Estimates of annual capture probabilities for all five colonies were on average twice as high when estimated from PIT reader data (P = 0.93-1.00) than when derived from conventional techniques (P = 0.26-0.66), and as a consequence annual survival estimates were more precisely estimated when using PIT reader encounters. Short-term, daily capture estimates were also higher using PIT readers than conventional captures. We discuss the advantages and limitations of using PIT tags and passive encounters with hoop readers vs. conventional capture techniques for estimating these vital parameters in big brown bats. © Museum and Institute of Zoology PAS.
The Science-Policy Link: Stakeholder Reactions to the Uncertainties of Future Sea Level Rise
NASA Astrophysics Data System (ADS)
Plag, H.; Bye, B.
2011-12-01
Policy makers and stakeholders in the coastal zone are equally challenged by the risk of an anticipated rise of coastal Local Sea Level (LSL) as a consequence of future global warming. Many low-lying and often densely populated coastal areas are under risk of increased inundation. More than 40% of the global population is living in or near the coastal zone and this fraction is steadily increasing. A rise in LSL will increase the vulnerability of coastal infrastructure and population dramatically, with potentially devastating consequences for the global economy, society, and environment. Policy makers are faced with a trade-off between imposing today the often very high costs of coastal protection and adaptation upon national economies and leaving the costs of potential major disasters to future generations. They are in need of actionable information that provides guidance for the development of coastal zones resilient to future sea level changes. Part of this actionable information comes from risk and vulnerability assessments, which require information on future LSL changes as input. In most cases, a deterministic approach has been applied based on predictions of the plausible range of future LSL trajectories as input. However, there is little consensus in the scientific community on how these trajectories should be determined, and what the boundaries of the plausible range are. Over the last few years, many publications in Science, Nature and other peer-reviewed scientific journals have revealed a broad range of possible futures and significant epistemic uncertainties and gaps concerning LSL changes. Based on the somewhat diffuse science input, policy and decision makers have made rather different choices for mitigation and adaptation in cases such as Venice, The Netherlands, New York City, and the San Francisco Bay area. 
Replacing the deterministic, prediction-based approach with a statistical one that fully accounts for the uncertainties and epistemic gaps would provide a different kind of science input to policy makers and stakeholders. Like in many other insurance problems (for example, earthquakes), where deterministic predictions are not possible and decisions have to be made on the basis of statistics and probabilities, the statistical approach to coastal resilience would require stakeholders to make decisions on the basis of probabilities instead of predictions. The science input for informed decisions on adaptation would consist of general probabilities of decadal to century scale sea level changes derived from paleo records, including the probabilities for large and rapid rises. Similar to other problems where the appearance of a hazard is associated with a high risk (like a fire in a house), this approach would also require a monitoring and warning system (a "smoke detector") capable of detecting any onset of a rapid sea level rise.
Task Training Emphasis for Determining Training Priority.
1987-08-01
... the relative time spent on tasks performed in their current jobs. Supervisors also rated the tasks on several different task factors, including Task Difficulty, Probable Consequences of Inadequate Performance, Task Delay Tolerance, and Recommended Training Emphasis.
Handling Alters Aggression and "Loser" Effect Formation in "Drosophila Melanogaster"
ERIC Educational Resources Information Center
Trannoy, Severine; Chowdhury, Budhaditya; Kravitz, Edward A.
2015-01-01
In "Drosophila," prior fighting experience influences the outcome of later contests: losing a fight increases the probability of losing second contests, thereby revealing "loser" effects that involve learning and memory. In these experiments, to generate and quantify the behavioral changes observed as consequences of losing…
10 CFR 100.10 - Factors to be considered when evaluating sites.
Code of Federal Regulations, 2013 CFR
2013-01-01
... reactor incorporates unique or unusual features having a significant bearing on the probability or consequences of accidental release of radioactive materials; (4) The safety features that are to be engineered... radioactive fission products. In addition, the site location and the engineered features included as...
10 CFR 100.10 - Factors to be considered when evaluating sites.
Code of Federal Regulations, 2012 CFR
2012-01-01
... reactor incorporates unique or unusual features having a significant bearing on the probability or consequences of accidental release of radioactive materials; (4) The safety features that are to be engineered... radioactive fission products. In addition, the site location and the engineered features included as...
10 CFR 100.10 - Factors to be considered when evaluating sites.
Code of Federal Regulations, 2014 CFR
2014-01-01
... reactor incorporates unique or unusual features having a significant bearing on the probability or consequences of accidental release of radioactive materials; (4) The safety features that are to be engineered... radioactive fission products. In addition, the site location and the engineered features included as...
Repair of clustered DNA damage caused by high LET radiation in human fibroblasts
NASA Technical Reports Server (NTRS)
Rydberg, B.; Lobrich, M.; Cooper, P. K.; Chatterjee, A. (Principal Investigator)
1998-01-01
It has recently been demonstrated experimentally that DNA damage induced by high LET radiation in mammalian cells is non-randomly distributed along the DNA molecule in the form of clusters of various sizes. The sizes of such clusters range from a few base-pairs to at least 200 kilobase-pairs. The high biological efficiency of high LET radiation for induction of relevant biological endpoints is probably a consequence of this clustering, although the exact mechanisms by which the clustering affects the biological outcome are not known. We discuss here results for induction and repair of base damage, single-strand breaks and double-strand breaks for low and high LET radiations. These results are discussed in the context of clustering. Of particular interest is to determine how clustering at different scales affects overall rejoining and fidelity of rejoining of DNA double-strand breaks. However, existing methods for measuring repair of DNA strand breaks are unable to resolve breaks that are close together in a cluster. This causes problems in the interpretation of current results from high LET radiation and will require new methods to be developed.
Brownian Motion with Active Fluctuations
NASA Astrophysics Data System (ADS)
Romanczuk, Pawel; Schimansky-Geier, Lutz
2011-06-01
We study the effect of different types of fluctuation on the motion of self-propelled particles in two spatial dimensions. We distinguish between passive and active fluctuations. Passive fluctuations (e.g., thermal fluctuations) are independent of the orientation of the particle. In contrast, active ones point parallel or perpendicular to the time dependent orientation of the particle. We derive analytical expressions for the speed and velocity probability density for a generic model of active Brownian particles, which yields an increased probability of low speeds in the presence of active fluctuations in comparison to the case of purely passive fluctuations. As a consequence, we predict sharply peaked Cartesian velocity probability densities at the origin. Finally, we show that such a behavior may also occur in non-Gaussian active fluctuations and discuss briefly correlations of the fluctuating stochastic forces.
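A rough numerical illustration of the claim, using a generic speed-relaxation model with illustrative parameters (relaxation rate gamma, preferred speed v0, noise intensity D, all placeholders rather than the paper's model): passive kicks act independently on the Cartesian velocity components, while active kicks act along the heading, which raises the probability of low speeds.

```python
import numpy as np

rng = np.random.default_rng(2)
n, dt, steps = 20_000, 0.01, 2_000
gamma, v0, D = 1.0, 1.0, 0.5   # illustrative parameters

# Passive fluctuations: independent Gaussian kicks on each Cartesian component.
v = np.tile([v0, 0.0], (n, 1))
for _ in range(steps):
    speed = np.linalg.norm(v, axis=1, keepdims=True)
    heading = v / np.maximum(speed, 1e-12)
    v += -gamma * (speed - v0) * heading * dt \
         + np.sqrt(2 * D * dt) * rng.normal(size=(n, 2))
passive_speed = np.linalg.norm(v, axis=1)

# Active fluctuations: kicks act along the time-dependent heading, so the
# signed speed performs a one-dimensional Ornstein-Uhlenbeck process.
s = np.full(n, v0)
for _ in range(steps):
    s += -gamma * (s - v0) * dt + np.sqrt(2 * D * dt) * rng.normal(size=n)
active_speed = np.abs(s)

low = 0.2
print("P(speed < 0.2), passive:", np.mean(passive_speed < low))
print("P(speed < 0.2), active: ", np.mean(active_speed < low))
```

In two dimensions the passive stationary speed density carries a geometric factor proportional to the speed and so vanishes at zero, whereas the active one does not; that difference is what produces the sharply peaked Cartesian velocity density at the origin.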
Fanshawe, T. R.
2015-01-01
There are many examples from the scientific literature of visual search tasks in which the length, scope and success rate of the search have been shown to vary according to the searcher's expectations of whether the search target is likely to be present. This phenomenon has major practical implications, for instance in cancer screening, when the prevalence of the condition is low and the consequences of a missed disease diagnosis are severe. We consider this problem from an empirical Bayesian perspective to explain how the effect of a low prior probability, subjectively assessed by the searcher, might impact on the extent of the search. We show how the searcher's posterior probability that the target is present depends on the prior probability and the proportion of possible target locations already searched, and also consider the implications of imperfect search, when the probability of false-positive and false-negative decisions is non-zero. The theoretical results are applied to two studies of radiologists' visual assessment of pulmonary lesions on chest radiographs. Further application areas in diagnostic medicine and airport security are also discussed. PMID:26587267
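The posterior described can be sketched as a one-line Bayes update. The function below is a simplified illustration, assuming the target is equally likely to be at any location and that searching an empty region never raises a false alarm; the symbols and defaults are mine, not the paper's notation.

```python
def posterior_present(prior, frac_searched, p_miss=0.0):
    """
    P(target present | nothing found after searching part of the scene).
    prior: searcher's prior probability that the target is present
    frac_searched: proportion of possible target locations already searched
    p_miss: false-negative probability per inspected location (imperfect search)
    """
    # If present, the target escapes detection either by lying in the
    # unsearched region or by being missed in the searched region.
    p_no_find_given_present = (1 - frac_searched) + frac_searched * p_miss
    p_no_find_given_absent = 1.0  # nothing to find, nothing falsely flagged
    num = prior * p_no_find_given_present
    return num / (num + (1 - prior) * p_no_find_given_absent)

# Low-prevalence setting: prior 0.05, 80% of locations already searched.
print(posterior_present(0.05, 0.8))        # posterior falls well below prior
print(posterior_present(0.05, 0.8, 0.3))   # imperfect search keeps it higher
```

The second call shows why imperfect search matters: a non-zero miss rate slows the decay of the posterior, so an early decision to stop searching is harder to justify.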
NASA Astrophysics Data System (ADS)
Villanueva, Anthony Allan D.
2018-02-01
We discuss a class of solutions of the time-dependent Schrödinger equation such that the position uncertainty temporarily decreases. This self-focusing or contractive behavior is a consequence of the anti-correlation of the position and momentum observables. Since the associated position density satisfies a continuity equation, upon contraction the probability current at a given fixed point may flow in the opposite direction of the group velocity of the wave packet. For definiteness, we consider a free particle incident from the left of the origin, and establish a condition for the initial position-momentum correlation such that a negative probability current at the origin is possible. This implies a decrease in the particle's detection probability in the region x > 0, and we calculate how long this occurs. Analogous results are obtained for a particle subject to a uniform gravitational force if we consider the particle approaching the turning point. We show that position-momentum anti-correlation may cause a negative probability current at the turning point, leading to a temporary decrease in the particle's detection probability in the classically forbidden region.
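The argument turns on the standard continuity equation for the position density; in the notation usual for one-dimensional quantum mechanics (not necessarily the paper's own symbols):

```latex
\frac{\partial \rho}{\partial t} + \frac{\partial j}{\partial x} = 0,
\qquad
j(x,t) = \frac{\hbar}{m}\,\operatorname{Im}\!\left(\psi^{*}\,\frac{\partial \psi}{\partial x}\right),
\qquad
\frac{d}{dt}\int_{0}^{\infty} \rho(x,t)\,dx = j(0,t).
```

The last identity (with the current vanishing at infinity for a normalizable packet) shows that a negative current at the origin entails a momentary decrease of the detection probability in x > 0, which is the effect the abstract attributes to position-momentum anti-correlation.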
Efficient expression systems for cysteine proteases of malaria parasites
Sarduy, Emir Salas; de los A. Chávez Planes, María
2013-01-01
Papain-like cysteine proteases of malaria parasites are considered important chemotherapeutic targets or valuable models for the evaluation of drug candidates. Consequently, many of these enzymes have been cloned and expressed in Escherichia coli for their biochemical characterization. However, their expression has been problematic, showing low yield and leading to the formation of insoluble aggregates. Given that highly-productive expression systems are required for the high-throughput evaluation of inhibitors, we analyzed the existing expression systems to identify the causes of such apparent issues. We found that significant divergences in codon and nucleotide composition from host genes are the most probable cause of expression failure, and propose several strategies to overcome these limitations. Finally we predict that yeast hosts Saccharomyces cerevisiae and Pichia pastoris may be better suited than E. coli for the efficient expression of plasmodial genes, presumably leading to soluble and active products reproducing structural and functional characteristics of the natural enzymes. PMID:23018863
No geochemical evidence for an asteroidal impact at late Devonian mass extinction horizon
NASA Astrophysics Data System (ADS)
McGhee, G. R., Jr.; Gilmore, J. S.; Orth, C. J.; Olsen, E.
1984-04-01
Three sedimentary sequences in New York State (Dunkirk Beach, Walnut Creek Gorge, and Mills Mills) and one sedimentary sequence in Belgium (Sinsin), that cross the Devonian Frasnian-Famennian boundary, were examined for an iridium (Ir) anomaly to determine whether the biotic extinctions at this boundary could, like those at the end of the Cretaceous, have been caused by an asteroidal impact. The sampling at three of the four areas was on 2-cm center points, and 15 to 20 g of sample were collected. The instrumental neutron activation method required 5 g samples, and consequently the distance between samples was less than 1 cm. Though the sampling scheme gave a high probability of locating an Ir anomaly in the Devonian sections studied, none was found. The highest Ir values were between 0.2 and 2 percent of those reported for the marine and terrestrial Ir analyses at the Cretaceous-Tertiary boundary, and Devonian pyrite-rich sediments did not exhibit high Ir concentrations.
Genetic structure of farmer-managed varieties in clonally-propagated crops.
Scarcelli, N; Tostain, S; Vigouroux, Y; Luong, V; Baco, M N; Agbangla, C; Daïnou, O; Pham, J L
2011-08-01
The relative role of sexual reproduction and mutation in shaping the diversity of clonally propagated crops is largely unknown. We analyzed the genetic diversity of yam, a vegetatively propagated crop, to gain insight into how these two factors shape its diversity in relation with farmers' classifications. Using 15 microsatellite loci, we analyzed 485 samples of 10 different yam varieties. We identified 33 different genotypes organized in lineages supported by high bootstrap values. We computed the probability that these genotypes appeared by sexual reproduction or mutation within and between each lineage. This allowed us to interpret each lineage as a product of sexual reproduction that has evolved by mutation. Moreover, we clearly noted a similarity between the genetic structure and farmers' classifications. Each variety could thus be interpreted as being the product of sexual reproduction having evolved by mutation. This highly structured diversity of farmer-managed varieties has consequences for the preservation of yam diversity.
A new model for soft gamma-ray repeaters and anomalous x-ray pulsars using quark stars
NASA Astrophysics Data System (ADS)
Niebergal, Brian Phillip
2007-05-01
If indeed the strange quark matter (SQM) hypothesis is true, then it is highly probable that some stars exist with an interior composed entirely of deconfined quarks. In this thesis the consequences of the SQM hypothesis are explored in the context of strange quark stars (QSs) and the manner in which they manifest themselves, namely as Soft Gamma-ray Repeaters (SGRs) and Anomalous X-ray Pulsars (AXPs). Discussed in this thesis are an effect of the highly superconducting SQM, namely the formation of an Abrikosov lattice occupying the entire QS, and the consequences of spin-down on this lattice due to magnetic braking. By including in this model a degenerate shell or torus surrounding the QS, created during the quark-nova, SGRs and AXPs can be linked into a single classification, and every observation of SGRs/AXPs to date can be explained.
Entanglement-assisted transformation is asymptotically equivalent to multiple-copy transformation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Duan Runyao; Feng Yuan; Ying Mingsheng
2005-08-15
We show that two ways of manipulating quantum entanglement, namely entanglement-assisted local transformation [D. Jonathan and M. B. Plenio, Phys. Rev. Lett. 83, 3566 (1999)] and multiple-copy transformation [S. Bandyopadhyay, V. Roychowdhury, and U. Sen, Phys. Rev. A 65, 052315 (2002)], are equivalent in the sense that they can asymptotically simulate each other's ability to implement a desired transformation from a given source state to another given target state with the same optimal success probability. As a consequence, this yields a feasible method to evaluate the optimal conversion probability of an entanglement-assisted transformation.
Evolution of the Uranus-neptune Planetesimal Swarm: Consequences for the Earth
NASA Technical Reports Server (NTRS)
Shoemaker, E. M.; Wolfe, R. F.
1984-01-01
The evolution of planetesimals in the outer Solar System was evaluated, including both stellar and planetary encounters. About 20% of the Uranus-Neptune planetesimals (UNPs) enter the comet cloud and are stored primarily in the region inside the observational limits of the Oort cloud. Half of the comets have survived to the present time; the cloud now has a mass of the order of Jupiter's mass. Most UNPs are ejected from the Solar System, and about half of the planetesimal swarm is passed to the control of Jupiter prior to ejection. Jupiter's perturbations drive a large flux of these planetesimals into Earth-crossing orbits, and it now appears highly probable that UNPs account for most of the heavy bombardment of the Moon and Earth.
NASA Technical Reports Server (NTRS)
Stevenson, D. J.
1981-01-01
Combined inferences from seismology, high-pressure experiment and theory, geomagnetism, fluid dynamics, and current views of terrestrial planetary evolution lead to models of the earth's core with five basic properties. These are that core formation was contemporaneous with earth accretion; the core is not in chemical equilibrium with the mantle; the outer core is a fluid iron alloy containing significant quantities of lighter elements and is probably almost adiabatic and compositionally uniform; the more iron-rich inner solid core is a consequence of partial freezing of the outer core, and the energy release from this process sustains the earth's magnetic field; and the thermodynamic properties of the core are well constrained by the application of liquid-state theory to seismic and laboratory data.
Treasure, Trevor E
2014-08-01
A paradigm shift in the training, practice, and study of office-based anesthesia is necessary for our specialty. Practice improvement plans are required to prevent low-probability-high-consequence anesthesia mishaps in our offices. A scarcity of statistical data exists regarding the true risk of office-based anesthesia in oral and maxillofacial surgery. Effective proactive risk management mandates accurate data to correctly outline the problem before solutions can be implemented. Only by learning from our mistakes, will we be able to reduce errors and improve patient safety: "The only real mistake is the one from which we learn nothing"--John Powell. Copyright © 2014 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.
Alien molluscs are widely distributed in U.S. streams. While some raise economic concerns on the order of billions of dollars, documentation of widespread ecological effects has, in some instances, been more elusive. A probability survey of wadeable streams of the coterminous U.S...
The Futurist Perspective: Implications for Community College Planning.
ERIC Educational Resources Information Center
Nicholson, R. Stephen; Keyser, John S.
Community college managers would probably acknowledge the importance of planning, but might not accept the need to adopt a futuristic perspective on educational planning. One of the characteristics of futurists is a belief that the future is a created reality, not a consequence of random events. Futurists conceive possible paths, examine…
ERIC Educational Resources Information Center
Brunner, Judy
2010-01-01
The purpose of swimming is to travel safely through water; the purpose of reading is to construct meaning. Both skills require perseverance from the student and patience from the teacher. Not having the necessary swimming skills may result in tragic consequences. Not having the ability to read and understand text--although probably not fatal--will…
ERIC Educational Resources Information Center
Fox, G. Thomas, Jr.
1978-01-01
Suggests that there exist great potential and much professional support for practitioner involvement (teachers, students, and administrators) in generating knowledge (creating new understandings of what is occurring in our experience and why) from field experience and that the probable consequences noted when practitioners become involved in…
The School Maintenance Rental Option
ERIC Educational Resources Information Center
Schaffer, Michael
2012-01-01
It will probably be years before all the social and structural ramifications of the economic downturn are fully realized. However, one interesting consequence already is materializing. According to Amy Hoak, a reporter for "MarketWatch", a "Wall Street Journal" publication, for the first time in decades, more people say they would rather rent a…
Robust Estimation of Latent Ability in Item Response Models
ERIC Educational Resources Information Center
Schuster, Christof; Yuan, Ke-Hai
2011-01-01
Because of response disturbances such as guessing, cheating, or carelessness, item response models often can only approximate the "true" individual response probabilities. As a consequence, maximum-likelihood estimates of ability will be biased. Typically, the nature and extent to which response disturbances are present is unknown, and, therefore,…
Marketing Science, Marketing Ourselves
ERIC Educational Resources Information Center
Montgomery, David C.
2003-01-01
In this article, the author describes how the quest for external funding has dominated academic science and argues that today's scientists should think about pledging allegiance to traditional academic values. Enthusiasm for the pre-Cold War model of the university can probably not be justified in utilitarian terms or explained as a consequence of…
Risk Communication in Special Education.
ERIC Educational Resources Information Center
Bull, Kay S.; Kimball, Sarah
This paper describes the application of a risk-based decision-making process in education and the use of risk communication with special education students and their parents. Risk-based decision making clarifies uncertainties inherent in a decision by examining the probability of a resulting harmful effect and the consequences of decisions made.…
College Students' Openness toward Autism Spectrum Disorders: Improving Peer Acceptance
ERIC Educational Resources Information Center
Nevill, Rose E. A.; White, Susan W.
2011-01-01
One probable consequence of rising rates of autism spectrum disorder diagnosis in individuals without co-occurring intellectual disability is that more young adults with diagnoses or traits of ASD will attend college and require appropriate supports. This study sought to explore college students' openness to peers who demonstrate…
Hua, Chiaho; Wu, Shengjie; Chemaitilly, Wassim; Lukose, Renin C; Merchant, Thomas E
2012-11-15
To develop a mathematical model that uses more readily available measures than stimulation tests to identify brain tumor survivors with a high likelihood of abnormal growth hormone secretion after radiotherapy (RT), and thereby avoid late recognition and a consequent delay in growth hormone replacement therapy. We analyzed 191 prospectively collected post-RT evaluations of peak growth hormone level (arginine tolerance/levodopa stimulation test), serum insulin-like growth factor 1 (IGF-1), IGF-binding protein 3, height, weight, growth velocity, and body mass index in 106 children and adolescents treated for ependymoma (n=72), low-grade glioma (n=28), or craniopharyngioma (n=6), who had normal growth hormone levels before RT. A normal level in this study was defined as a peak growth hormone response to the stimulation test of ≥7 ng/mL. Independent predictor variables identified by multivariate logistic regression with high statistical significance (p<0.0001) included IGF-1 z score, weight z score, and hypothalamic dose. The developed predictive model demonstrated strong discriminatory power, with an area under the receiver operating characteristic curve of 0.883. At a potential probability cutoff of 0.3, the sensitivity was 80% and the specificity 78%. Without unpleasant and expensive frequent stimulation tests, our model provides a quantitative approach to closely follow the growth hormone secretory capacity of brain tumor survivors. It allows identification of high-risk children for subsequent confirmatory tests and in-depth workup for diagnosis of growth hormone deficiency. Copyright © 2012 Elsevier Inc. All rights reserved.
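The form of such a model (multivariate logistic regression on IGF-1 z score, weight z score, and hypothalamic dose, thresholded at a probability cutoff) can be sketched as follows. The coefficients below are invented placeholders for illustration only; the paper's fitted values are not reproduced here.

```python
import math

def gh_deficiency_probability(igf1_z, weight_z, hypothalamic_dose_gy,
                              b0=-1.0, b_igf=-0.9, b_wt=-0.4, b_dose=0.08):
    """
    Logistic-regression-style risk score of the kind described in the study.
    All coefficient values are illustrative placeholders, not fitted values.
    """
    logit = (b0 + b_igf * igf1_z + b_wt * weight_z
             + b_dose * hypothalamic_dose_gy)
    return 1.0 / (1.0 + math.exp(-logit))

# Flag patients exceeding a 0.3 probability cutoff for confirmatory
# stimulation testing, as the abstract proposes.
risk = gh_deficiency_probability(igf1_z=-2.0, weight_z=0.5,
                                 hypothalamic_dose_gy=40.0)
needs_workup = risk > 0.3
```

The cutoff trades sensitivity against specificity along the ROC curve; moving it below 0.3 would flag more children for confirmatory testing at the cost of more false positives.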
Characterization of the Ionospheric Scintillations at High Latitude using GPS Signal
NASA Astrophysics Data System (ADS)
Mezaoui, H.; Hamza, A. M.; Jayachandran, P. T.
2013-12-01
Transionospheric radio signals experience both amplitude and phase variations as a result of propagation through a turbulent ionosphere; this phenomenon is known as ionospheric scintillation. As a result of these fluctuations, Global Positioning System (GPS) receivers lose track of signals and consequently incur position and navigational errors. There is therefore a need to study these scintillations and their causes, not only to resolve the navigational problem but also to develop analytical and numerical radio propagation models. To characterize these scintillations, we analyze the probability distribution functions (PDFs) of L1 GPS signals sampled at 50 Hz using the Canadian High Arctic Ionospheric Network (CHAIN) measurements. The raw GPS signal is detrended using a wavelet-based technique, and the detrended amplitude and phase of the signal are used to construct the PDFs of the scintillating signal. The resulting PDFs are non-Gaussian. From the PDF functional fits, the moments are estimated. The results reveal a general non-trivial parabolic relationship between the normalized fourth and third moments for both the phase and amplitude of the signal. The calculated higher-order moments of the amplitude and phase distribution functions will help quantify some of the scintillation characteristics and in the process provide a base for forecasting, i.e., for developing a scintillation climatology model. This statistical analysis, including power spectra, along with a numerical simulation will constitute the backbone of a high-latitude scintillation model.
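A minimal sketch of the moment-estimation step, assuming one simply computes normalized central moments of the detrended series (a synthetic Gaussian stands in for a quiet-time signal here; for a Gaussian the normalized third and fourth moments are 0 and 3, so departures from those values flag non-Gaussian scintillation):

```python
import numpy as np

def normalized_moments(x):
    """Normalized third and fourth central moments (skewness, kurtosis)."""
    x = np.asarray(x, dtype=float)
    mu = x.mean()
    sigma = x.std()
    m3 = np.mean((x - mu) ** 3) / sigma ** 3   # skewness
    m4 = np.mean((x - mu) ** 4) / sigma ** 4   # kurtosis
    return m3, m4

rng = np.random.default_rng(1)
gauss = rng.normal(size=200_000)           # stand-in for a quiet-time signal
skew, kurt = normalized_moments(gauss)     # near 0 and 3 for a Gaussian
print(skew, kurt)
```

Plotting the fourth moment against the third for many detrended intervals is what reveals the parabolic relationship the abstract reports.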
NASA Astrophysics Data System (ADS)
Beukema, J. J.
Annual variation in recruitment and biomass was studied during 13 years for the 5 species contributing most to total zoobenthic biomass in a tidal flat area in the westernmost part of the Wadden Sea. In all of these species annual biomass values tended to be more stable than numbers of recruits. In Cerastoderma edule and in Mytilus edulis recruitment variability was high and was passed on almost completely to biomass, probably as a consequence of rapid juvenile growth and high mortality (also in the adult stage), leaving few year-classes in the population. In Arenicola marina and in Mya arenaria biomass values varied much less than recruit numbers. Both species showed a low adult mortality rate, with many year-classes present in the population, including many old and heavy specimens that dominated biomass. Macoma balthica took an intermediate position in these respects. Recruitment was relatively stable in Arenicola and was probably controlled by the high numbers of adults. Recruitment variability was fairly low in Macoma too, but in this species juvenile mortality appeared to be directly related to their own density. Successful and poor years for recruitment were roughly the same for the 4 bivalve species. Particularly heavy spatfall was found during the summer following the severe 1978-1979 winter. Such synchronized recruitment does not fully add to variability in annual biomass values, as the time needed for the recruited cohorts to reach maximum biomass values differs greatly between most of the high-biomass species.
A pilot study of naturally occurring high-probability request sequences in hostage negotiations.
Hughes, James
2009-01-01
In the current study, the audiotapes from three hostage-taking situations were analyzed. Hostage negotiator requests to the hostage taker were characterized as either high or low probability. The results suggested that hostage-taker compliance to a hostage negotiator's low-probability request was more likely when a series of complied-with high-probability requests preceded the low-probability request. However, two of the three hostage-taking situations ended violently; therefore, the implications of the high-probability request sequence for hostage-taking situations should be assessed in future research.
PMID:19949541
Robust Emergent Climate Phenomena Associated with the High-Sensitivity Tail
NASA Astrophysics Data System (ADS)
Boslough, M.; Levy, M.; Backus, G.
2010-12-01
Because the potential effects of climate change are more severe than had previously been thought, increasing focus on uncertainty quantification is required for risk assessment needed by policy makers. Current scientific efforts focus almost exclusively on establishing best estimates of future climate change. However, the greatest consequences occur in the extreme tail of the probability density functions for climate sensitivity (the “high-sensitivity tail”). To this end, we are exploring the impacts of newly postulated, highly uncertain, but high-consequence physical mechanisms to better establish the climate change risk. We define consequence in terms of dramatic change in physical conditions and in the resulting socioeconomic impact (hence, risk) on populations. Although we are developing generally applicable risk assessment methods, we have focused our initial efforts on uncertainty and risk analyses for the Arctic region. Instead of focusing on best estimates, requiring many years of model parameterization development and evaluation, we are focusing on robust emergent phenomena (those that are not necessarily intuitive and are insensitive to assumptions, subgrid-parameterizations, and tunings). For many physical systems, under-resolved models fail to generate such phenomena, which only develop when model resolution is sufficiently high. Our ultimate goal is to discover the patterns of emergent climate precursors (those that cannot be predicted with lower-resolution models) that can be used as a "sensitivity fingerprint" and make recommendations for a climate early warning system that would use satellites and sensor arrays to look for the various predicted high-sensitivity signatures. 
Our initial simulations are focused on the Arctic region, where underpredicted phenomena such as rapid loss of sea ice are already emerging, and because of major geopolitical implications associated with increasing Arctic accessibility to natural resources, shipping routes, and strategic locations. We anticipate that regional climate will be strongly influenced by feedbacks associated with a seasonally ice-free Arctic, but with unknown emergent phenomena. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy under Contract DE-AC04-94AL85000.
Willoughby, Laura; Adams, Daniel M; Evans, R Scott; Lloyd, James F; Stevens, Scott M; Woller, Scott C; Bledsoe, Joseph R; Aston, Valerie T; Wilson, Emily L; Elliott, C Gregory
2018-05-01
Guidelines suggest anticoagulation of patients with high pretest probability of pulmonary embolism (PE) while awaiting diagnostic test results (preemptive anticoagulation). Data relevant to the practice of preemptive anticoagulation are not available. We reviewed 3,500 consecutive patients who underwent CT pulmonary angiography (CTPA) at two EDs. We classified the pretest probability for PE using the revised Geneva Score (RGS) as low (RGS 0-3), intermediate (RGS 4-10), or high (RGS 11-18). We classified patients with a high pretest probability of PE as receiving preemptive anticoagulation if therapeutic anticoagulation was given before CTPA completion. Patients with a high bleeding risk and those receiving treatment for DVT before CTPA were excluded from the preemptive anticoagulation analysis. We compared the time elapsed between ED registration and CTPA completion for patients with a low, intermediate, and high pretest probability for PE. We excluded three of 3,500 patients because CTPA preceded ED registration. Of the remaining 3,497 patients, 167 (4.8%) had a high pretest probability for PE. After excluding 29 patients for high bleeding risk and 21 patients who were treated for DVT prior to CTPA, only two of 117 patients (1.7%) with a high pretest probability for PE received preemptive anticoagulation. Furthermore, 37 of the remaining 115 patients (32%) with a high pretest probability for PE had a preexisting indication for anticoagulation but did not receive preemptive anticoagulation. The time from ED registration to CTPA completion did not differ based on the pretest probability of PE. Physicians rarely use preemptive anticoagulation in patients with a high pretest probability for PE. Clinicians do not expedite CTPA examinations for patients with a high pretest probability for PE. Copyright © 2017 American College of Chest Physicians. Published by Elsevier Inc. All rights reserved.
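The pretest-probability stratification used in the study is a pure threshold rule over the revised Geneva Score. A minimal sketch (the function name is illustrative; the cut-offs are the ones stated in the abstract):

```python
def geneva_pretest_category(rgs):
    """Map a revised Geneva Score (0-18) to the study's pretest-probability
    strata for PE: low (0-3), intermediate (4-10), high (11-18)."""
    if not 0 <= rgs <= 18:
        raise ValueError("revised Geneva Score must be in 0-18")
    if rgs <= 3:
        return "low"
    if rgs <= 10:
        return "intermediate"
    return "high"
```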
Assessing the Risk of a Canine Rabies Incursion in Northern Australia
Hudson, Emily G.; Brookes, Victoria J.; Ward, Michael P.
2017-01-01
Rabies is a globally distributed virus that causes approximately 60,000 human deaths annually, with >99% of cases caused by dog bites. Australia is currently canine rabies free. However, the recent eastward spread of rabies in the Indonesian archipelago has increased the probability of rabies entry into northern Australian communities. In addition, many northern Australian communities have large populations of free-roaming dogs, capable of maintaining rabies should an incursion occur. A risk assessment of rabies entry and transmission into these communities is needed to target control and surveillance measures. Illegal transportation of rabies-infected dogs via boat landings is a high-risk entry pathway and was the focus of the current study. A quantitative, stochastic risk assessment model was developed to evaluate the risk of rabies entry into north-west Cape York Peninsula, Australia, and rabies introduction to resident dogs in one of the communities via transport of rabies-infected dogs on illegal Indonesian fishing boats. Parameter distributions were derived from expert opinion, literature, and analysis of field studies. The estimated median probability of rabies entry into north-west Cape York Peninsula and into Seisia from individual fishing boats was 1.9 × 10−4/boat and 8.7 × 10−6/boat, respectively. The estimated annual probability that at least one rabies-infected dog enters north-west Cape York Peninsula and Seisia was 5.5 × 10−3 and 3.5 × 10−4, respectively. The estimated median probability of rabies introduction into Seisia was 4.7 × 10−8/boat, and the estimated annual probability that at least one rabies-infected dog causes rabies transmission in a resident Seisia dog was 8.3 × 10−5.
Sensitivity analysis using the Sobol method highlighted some parameters as influential, including but not limited to the prevalence of rabies in Indonesia, the probability of a dog on board an Indonesian fishing boat, and the probability of a Seisia dog being on the beach. Overall, the probabilities of rabies entry into north-west Cape York Peninsula and rabies introduction into Seisia are low. However, the potential devastating consequences of a rabies incursion in this region make this a non-negligible risk. PMID:28913341
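The step from a per-boat probability to an annual probability follows the standard at-least-one formula, 1 − (1 − p)^n. A sketch, where the 29 landings per year is an assumed figure chosen only to illustrate how the study's per-boat and annual numbers relate (it is not a value taken from the study):

```python
def annual_entry_probability(p_per_boat, n_boats):
    """Probability that at least one of n independent boat visits
    introduces a rabies-infected dog: 1 - (1 - p)^n."""
    return 1.0 - (1.0 - p_per_boat) ** n_boats

# With the reported median per-boat entry probability of 1.9e-4 and a
# hypothetical 29 landings per year, the annual risk is close to the
# reported 5.5e-3.
p_annual = annual_entry_probability(1.9e-4, 29)
```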
ERIC Educational Resources Information Center
Zuluaga, Carlos A.; Normand, Matthew P.
2008-01-01
We assessed the effects of reinforcement and no reinforcement for compliance to high-probability (high-p) instructions on compliance to low-probability (low-p) instructions using a reversal design. For both participants, compliance with the low-p instruction increased only when compliance with high-p instructions was followed by reinforcement.…
Evolution of cosmic string networks
NASA Technical Reports Server (NTRS)
Albrecht, Andreas; Turok, Neil
1989-01-01
Results on cosmic strings are summarized, including: (1) the application of non-equilibrium statistical mechanics to cosmic string evolution; (2) a simple one-scale model for the long strings which has a great deal of predictive power; (3) results from large-scale numerical simulations; and (4) a discussion of the observational consequences of our results. An upper bound on Gμ of approximately 10⁻⁷ emerges from the millisecond pulsar gravity-wave bound. How numerical uncertainties affect this bound is discussed. Any changes which weaken the bound would probably also give the long strings the dominant role in producing observational consequences.
Applications of fuzzy ranking methods to risk-management decisions
NASA Astrophysics Data System (ADS)
Mitchell, Harold A.; Carter, James C., III
1993-12-01
The Department of Energy is making significant improvements to its nuclear facilities as a result of more stringent regulation, internal audits, and recommendations from external review groups. A large backlog of upgrade projects has resulted. Currently, a prioritization method is used that relies on a matrix of potential consequence and probability of occurrence. The attributes considered include likelihood, exposure, public health and safety, environmental impact, site personnel safety, public relations, legal liability, and business loss. This paper describes an improved method that uses fuzzy multiple-attribute decision methods to rank proposed improvement projects.
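In its crisp form, the consequence-probability matrix described above amounts to ranking projects by an ordinal product score; the fuzzy method replaces the crisp ranks with fuzzy numbers. A minimal sketch of the crisp baseline only (the project names and ranks below are invented for illustration):

```python
def risk_score(probability_rank, consequence_rank):
    """Crisp risk-matrix score: the product of ordinal probability and
    consequence ranks (higher score = more urgent)."""
    return probability_rank * consequence_rank

def prioritize(projects):
    """Sort (name, p_rank, c_rank) tuples from highest to lowest score."""
    return sorted(projects, key=lambda p: risk_score(p[1], p[2]), reverse=True)

upgrades = [("vent filter", 2, 5), ("fire doors", 4, 2), ("crane rails", 3, 3)]
ordered = prioritize(upgrades)
```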
Water abundance and accretion history of terrestrial planets
NASA Technical Reports Server (NTRS)
Waenke, H.; Dreibus, G.
1994-01-01
According to a widespread belief, Earth's water was either added in the form of a late volatile-rich veneer or, as we have argued repeatedly, of all the water added to the Earth only that portion remained which arrived towards the end of accretion, when the mean oxygen fugacity of the accreting material became so high that metallic iron could no longer exist. Prior to that point, in the latter scenario, all the water would have been consumed in the oxidation of iron: Fe + H2O → FeO + H2. Huge quantities of hydrogen would be produced continuously in this scenario and would escape; at the same time, the hydrogen, on its way to the surface, would efficiently degas the growing Earth's mantle. The fact that - assuming C1 abundances - the amount of iridium in the Earth's mantle agrees within a factor of two with the total water inventory of the Earth's mantle and crust is taken as evidence for the validity of such a scenario. In both scenarios, the Earth's mantle would remain dry and devoid of other volatiles. Species soluble in metallic iron, such as carbon and hydrogen, probably partly entered the core. It is generally assumed that today a considerable portion of the Earth's total water inventory resides in the mantle. It is also clear that over the history of the Earth the water of the oceans has been recycled many times through the mantle, as a consequence of plate subduction. In a similar way, mantle convection was probably responsible for bringing water into the originally dry mantle. As a consequence, today the Earth is wet both inside and outside.
Unconditionally verifiable blind quantum computation
NASA Astrophysics Data System (ADS)
Fitzsimons, Joseph F.; Kashefi, Elham
2017-07-01
Blind quantum computing (BQC) allows a client to have a server carry out a quantum computation for them such that the client's input, output, and computation remain private. A desirable property for any BQC protocol is verification, whereby the client can verify with high probability whether the server has followed the instructions of the protocol or if there has been some deviation resulting in a corrupted output state. A verifiable BQC protocol can be viewed as an interactive proof system, leading to consequences for complexity theory. We previously proposed [A. Broadbent, J. Fitzsimons, and E. Kashefi, in Proceedings of the 50th Annual Symposium on Foundations of Computer Science, Atlanta, 2009 (IEEE, Piscataway, 2009), p. 517] a universal and unconditionally secure BQC scheme where the client only needs to be able to prepare single qubits in separable states randomly chosen from a finite set and send them to the server, who has the balance of the required quantum computational resources. In this paper we extend that protocol with additional functionality allowing blind computational basis measurements, which we use to construct another verifiable BQC protocol based on a different class of resource states. We rigorously prove that the probability of failing to detect an incorrect output is exponentially small in a security parameter, while resource overhead remains polynomial in this parameter. This resource state allows entangling gates to be performed between arbitrary pairs of logical qubits with only constant overhead. This is a significant improvement on the original scheme, which required that all computations first be put into a nearest-neighbor form, incurring linear overhead in the number of qubits. Such an improvement has important consequences for efficiency and fault-tolerance thresholds.
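The "exponentially small in a security parameter" scaling can be illustrated generically: if each of s independent trap checks catches a deviating server with some fixed probability, the chance of missing all of them decays exponentially in s. This is a schematic illustration of the scaling only, not the protocol's actual bound:

```python
def undetected_deviation_probability(p_trap, s):
    """If each of s independent trap checks catches a deviating server
    with probability p_trap, the chance that every check is missed is
    (1 - p_trap)^s, i.e. exponentially small in the security parameter s."""
    return (1.0 - p_trap) ** s
```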
Toward a microscopic model of bidirectional synaptic plasticity
Castellani, Gastone C.; Bazzani, Armando; Cooper, Leon N
2009-01-01
We show that a 2-step phospho/dephosphorylation cycle for the α-amino-3-hydroxy-5-methyl-4-isoxazole propionic acid receptor (AMPAR), as used in in vivo learning experiments to assess long-term potentiation (LTP) induction and establishment, exhibits bistability for a wide range of parameters, consistent with values derived from the biological literature. The AMPAR model we propose, hence, is a candidate for memory storage and switching behavior at a molecular-microscopic level. Furthermore, the stochastic formulation of the deterministic model leads to a mesoscopic interpretation by considering the effect of enzymatic fluctuations on the Michaelis–Menten average dynamics. Under suitable hypotheses, this leads to a stochastic dynamical system with multiplicative noise whose probability density evolves according to a Fokker–Planck equation in the Stratonovich sense. In this approach, the probability density associated with each AMPAR phosphorylation state allows one to compute the probability of any concentration value, whereas the Michaelis–Menten equations consider only the average concentration dynamics. We show that the bistable dynamics are robust to multiplicative stochastic perturbations and that the presence of both noise and bistability simulates LTP and long-term depression (LTD) behavior. Interestingly, the LTP part of this model has been experimentally verified as a result of an in vivo, one-trial inhibitory avoidance learning protocol in rats, which produced the same changes in hippocampal AMPAR phosphorylation state as observed with in vitro induction of LTP with high-frequency stimulation (HFS). A consequence of this model is the possibility of characterizing a molecular switch with a defined biochemical set of reactions showing bistability and bidirectionality. Thus, this 3-enzyme-based biophysical model can predict LTP as well as LTD and their transition rates. 
The theoretical results can be, in principle, validated by in vitro and in vivo experiments, such as fluorescence measurements and electrophysiological recordings at multiple scales, from molecules to neurons. A further consequence is that the bistable regime occurs only within certain parametric windows, which may simulate a “history-dependent threshold”. This effect might be related to the Bienenstock–Cooper–Munro theory of synaptic plasticity. PMID:19666550
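Bistable dynamics with multiplicative noise in the Stratonovich sense can be illustrated with a generic double-well SDE integrated by a Heun (predictor-corrector) scheme. This is a stand-in illustration of the qualitative behavior, not the paper's AMPAR phospho/dephosphorylation model:

```python
import math
import random

def simulate_bistable(x0, steps=5000, dt=0.01, noise=0.2, seed=1):
    """Heun integration of dx = x(1 - x^2) dt + noise * x dW, a generic
    double-well system with multiplicative noise; its two stable states
    (x = +1 and x = -1) stand in for the 'up'/'down' states of a
    bistable molecular switch."""
    def drift(y):
        return y * (1.0 - y * y)

    def diffusion(y):
        return noise * y

    rng = random.Random(seed)
    x = x0
    for _ in range(steps):
        dw = rng.gauss(0.0, math.sqrt(dt))
        # Predictor step, then trapezoidal corrector: the Heun scheme
        # converges to the Stratonovich interpretation of the SDE.
        xp = x + drift(x) * dt + diffusion(x) * dw
        x = x + 0.5 * (drift(x) + drift(xp)) * dt \
              + 0.5 * (diffusion(x) + diffusion(xp)) * dw
    return x
```

For moderate noise the trajectory fluctuates around whichever well it starts in, which is the switching-memory behavior the abstract describes.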
Why risk is not variance: an expository note.
Cox, Louis Anthony Tony
2008-08-01
Variance (or standard deviation) of return is widely used as a measure of risk in financial investment risk analysis applications, where mean-variance analysis is applied to calculate efficient frontiers and undominated portfolios. Why, then, do health, safety, and environmental (HS&E) and reliability engineering risk analysts insist on defining risk more flexibly, as being determined by probabilities and consequences, rather than simply by variances? This note suggests an answer by providing a simple proof that mean-variance decision making violates the principle that a rational decisionmaker should prefer higher to lower probabilities of receiving a fixed gain, all else being equal. Indeed, simply hypothesizing a continuous increasing indifference curve for mean-variance combinations at the origin is enough to imply that a decisionmaker must find unacceptable some prospects that offer a positive probability of gain and zero probability of loss. Unlike some previous analyses of limitations of variance as a risk metric, this expository note uses only simple mathematics and does not require the additional framework of von Neumann–Morgenstern utility theory.
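The note's point can be made concrete with a linear mean-variance score U = μ − λσ² (this specific functional form is an illustration only; the proof in the note needs nothing more than an indifference curve at the origin). A prospect with a positive probability of gain and no possibility of loss can still score below doing nothing:

```python
def mean_variance_utility(p, gain, risk_aversion=1.0):
    """Mean-variance score U = mu - lambda * sigma^2 for a binary prospect
    paying `gain` with probability p and 0 otherwise (no loss possible)."""
    mu = p * gain
    var = p * (1.0 - p) * gain ** 2
    return mu - risk_aversion * var

# A free lottery ticket: win 100 with probability 0.01, lose nothing
# (mu = 1, variance = 99). Mean-variance scoring rejects it in favour of
# doing nothing, even though the ticket stochastically dominates nothing.
u_ticket = mean_variance_utility(0.01, 100.0)
u_nothing = 0.0
```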
The probability heuristics model of syllogistic reasoning.
Chater, N; Oaksford, M
1999-03-01
A probability heuristics model (PHM) for syllogistic reasoning is proposed. An informational ordering over quantified statements suggests simple probability-based heuristics for syllogistic reasoning. The most important is the "min-heuristic": choose the type of the least informative premise as the type of the conclusion. The rationality of this heuristic is confirmed by an analysis of the probabilistic validity of syllogistic reasoning, which treats logical inference as a limiting case of probabilistic inference. A meta-analysis of past experiments reveals close fits with PHM. PHM also compares favorably with alternative accounts, including mental logics, mental models, and deduction as verbal reasoning. Crucially, PHM extends naturally to generalized quantifiers, such as Most and Few, which have not been characterized logically and are, consequently, beyond the scope of current mental logic and mental model theories. Two experiments confirm the novel predictions of PHM when generalized quantifiers are used in syllogistic arguments. PHM suggests that syllogistic reasoning performance may be determined by simple but rational informational strategies justified by probability theory rather than by logic. Copyright 1999 Academic Press.
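The min-heuristic itself is a one-line rule over PHM's informational ordering of quantifiers (All > Most > Few > Some > None > Some…not). A sketch that hard-codes that ordering (the "Some-not" label is a simplified stand-in for "Some…not"):

```python
# PHM's informational ordering, from most to least informative.
INFORMATIVENESS = ["All", "Most", "Few", "Some", "None", "Some-not"]

def min_heuristic(premise1, premise2):
    """Min-heuristic: the conclusion takes the quantifier type of the
    less informative (lower-ranked) premise."""
    rank = {q: i for i, q in enumerate(INFORMATIVENESS)}
    return premise1 if rank[premise1] > rank[premise2] else premise2

# "All A are B" + "Some B are C" -> a "Some" conclusion type is favoured.
conclusion = min_heuristic("All", "Some")
```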
Quantification of effective exoelectrogens by most probable number (MPN) in a microbial fuel cell.
Heidrich, Elizabeth S; Curtis, Thomas P; Woodcock, Stephen; Dolfing, Jan
2016-10-01
The objective of this work was to quantify the number of exoelectrogens in wastewater capable of producing current in a microbial fuel cell by adapting the classical most probable number (MPN) methodology, using current production as the end point. Inoculating a series of microbial fuel cells with various dilutions of domestic wastewater, with acetate as test substrate, yielded an apparent number of exoelectrogens of 17 per ml. Using current as a proxy for activity, the apparent exoelectrogen growth rate was 0.03 h⁻¹. With starch or wastewater as more complex test substrates, similar apparent growth rates were obtained, but the apparent MPN-based numbers of exoelectrogens in wastewater were significantly lower, probably because, in contrast to acetate, complex substrates require complex food chains to deliver the electrons to the electrodes. Consequently, the apparent MPN is a function of the combined probabilities of members of the food chain being present. Copyright © 2016 The Author(s). Published by Elsevier Ltd. All rights reserved.
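An MPN estimate like "17 per ml" is a maximum-likelihood calculation over a dilution series: assuming organisms are Poisson-distributed, a replicate inoculated with volume v is positive (here, produces current) with probability 1 − exp(−c·v). A sketch solving the likelihood equation by bisection (the dilution data shown are invented, not the study's):

```python
import math

def mpn_estimate(dilutions):
    """Maximum-likelihood most-probable-number estimate of concentration c.
    `dilutions` is a list of (volume_ml, n_positive, n_total) tuples.
    Solves d(log-likelihood)/dc = 0 by bisection on a log scale."""
    def score(c):
        # Derivative of the log-likelihood with respect to c; it is
        # monotonically decreasing, so the root is the ML estimate.
        s = 0.0
        for v, pos, tot in dilutions:
            p = 1.0 - math.exp(-c * v)
            s += pos * v * math.exp(-c * v) / p - (tot - pos) * v
        return s
    lo, hi = 1e-9, 1e6
    for _ in range(200):
        mid = math.sqrt(lo * hi)
        if score(mid) > 0:
            lo = mid
        else:
            hi = mid
    return math.sqrt(lo * hi)

# Hypothetical series: 3 replicates each at 1, 0.1, and 0.01 ml.
cells_per_ml = mpn_estimate([(1.0, 3, 3), (0.1, 2, 3), (0.01, 0, 3)])
```

For a single dilution the estimate reduces to the closed form c = −ln(1 − pos/tot)/v, which gives a convenient check on the solver.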
Selective sweep mapping of genes with large phenotypic effects.
Pollinger, John P; Bustamante, Carlos D; Fledel-Alon, Adi; Schmutz, Sheila; Gray, Melissa M; Wayne, Robert K
2005-12-01
Many domestic dog breeds have originated through fixation of discrete mutations by intense artificial selection. As a result of this process, markers in the proximity of genes influencing breed-defining traits will have reduced variation (a selective sweep) and will show divergence in allele frequency. Consequently, low-resolution genomic scans can potentially be used to identify regions containing genes that have a major influence on breed-defining traits. We model the process of breed formation and show that the probability of two or three adjacent marker loci showing a spurious signal of selection within at least one breed (i.e., Type I error or false-positive rate) is low if highly variable and moderately spaced markers are utilized. We also use simulations with selection to demonstrate that even a moderately spaced set of highly polymorphic markers (e.g., one every 0.8 cM) has high power to detect regions targeted by strong artificial selection in dogs. Further, we show that a gene responsible for black coat color in the Large Munsterlander has a 40-Mb region surrounding the gene that is very low in heterozygosity for microsatellite markers. Similarly, we survey 302 microsatellite markers in the Dachshund and find three linked monomorphic microsatellite markers all within a 10-Mb region on chromosome 3. This region contains the FGFR3 gene, which is responsible for achondroplasia in humans, but not in dogs. Consequently, our results suggest that the causative mutation is a gene or regulatory region closely linked to FGFR3.
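The Type I error argument (that runs of adjacent low-variability markers rarely arise by chance) can be sketched with a simple expected-count bound, assuming each marker is independently monomorphic with probability q. The 5% figure below is an invented illustration, not a value from the study:

```python
def expected_chance_runs(q_mono, n_markers, run_len):
    """Upper-bound expectation of the number of windows of `run_len`
    adjacent markers that are all monomorphic purely by chance, with
    each marker independently monomorphic with probability q_mono."""
    n_windows = n_markers - run_len + 1
    return n_windows * q_mono ** run_len

# With 302 markers and a hypothetical 5% per-marker monomorphism rate,
# three adjacent monomorphic markers are expected well under once by chance.
e_runs = expected_chance_runs(0.05, 302, 3)
```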
NASA Astrophysics Data System (ADS)
Anagnostou, E. N.; Seyyedi, H.; Beighley, E., II; McCollum, J.
2014-12-01
NASA Astrophysics Data System (ADS)
Augustin, C. M.
2015-12-01
Carbon capture and storage (CCS) has been suggested by the Intergovernmental Panel on Climate Change as a partial solution to the greenhouse gas emissions problem. As CCS has become mainstream, researchers have raised multiple risk assessment issues typical of emerging technologies. In our research, we examine issues occuring when stored carbon dioxide (CO2) migrates to the near-surface or surface. We believe that both the public misperception and the physical reality of potential environmental, health, and commercial impacts of leak events from such subsurface sites have prevented widespread adoption of CCS. This paper is presented in three parts; the first is an evaluation of the systemic risk of a CCS site CO2 leak and models indicating potential likelihood of a leakage event. As the likelihood of a CCS site leak is stochastic and nonlinear, we present several Bayesian simulations for leak events based on research done with other low-probability, high-consequence gaseous pollutant releases. Though we found a large, acute leak to be exceptionally rare, we demonstrate potential for a localized, chronic leak at a CCS site. To that end, we present the second piece of this paper. Using a combination of spatio-temporal models and reaction-path models, we demonstrate the interplay between leak migrations, material interactions, and atmospheric dispersion for leaks of various duration and volume. These leak-event scenarios have implications for human, environmental, and economic health; they also have a significant impact on implementation support. Public acceptance of CCS is essential for a national low-carbon future, and this is what we address in the final part of this paper. We demonstrate that CCS remains unknown to the general public in the United States. Despite its unknown state, we provide survey findings -analyzed in Slovic and Weber's 2002 framework - that show a high unknown, high dread risk perception of leaks from a CCS site. 
Secondary findings are a conflation of CCS with the more advanced, widespread technology hydraulic fracturing and corresponding strong risk associations. We conclude with suggestions on how to integrate modeling results into public conversations to improve risk awareness and we provide preliminary policy recommendations to increase public support for CCS.
The Torino Impact Hazard Scale
NASA Astrophysics Data System (ADS)
Binzel, Richard P.
2000-04-01
Newly discovered asteroids and comets have inherent uncertainties in their orbit determinations owing to the natural limits of positional measurement precision and the finite lengths of orbital arcs over which determinations are made. For some objects making predictable future close approaches to the Earth, orbital uncertainties may be such that a collision with the Earth cannot be ruled out. Careful and responsible communication between astronomers and the public is required for reporting these predictions, and a 0-10 point hazard scale, reported inseparably with the date of close encounter, is recommended as a simple and efficient tool for this purpose. The goal of this scale, endorsed as the Torino Impact Hazard Scale, is to place into context the level of public concern that is warranted for any close encounter event within the next century. Concomitant reporting of the close encounter date further conveys the sense of urgency that is warranted. The Torino Scale value for a close approach event is based upon both collision probability and the estimated kinetic energy (collision consequence), where the scale value can change as probability and energy estimates are refined by further data. On the scale, Category 1 corresponds to collision probabilities that are comparable to the current annual chance for any given size impactor. Categories 8-10 correspond to certain (probability >99%) collisions having increasingly dire consequences. While close approaches falling in Category 0 may be no cause for noteworthy public concern, there remains a professional responsibility to further refine orbital parameters for such objects, and a figure of merit is suggested for evaluating such objects. Because impact predictions represent a multi-dimensional problem, there is no unique or perfect translation into a one-dimensional system such as the Torino Scale. These limitations are discussed.
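The kinetic-energy input to the scale is just E = ½mv², usually quoted in megatons of TNT. A sketch of that one input (the stony density and the 100 m example are assumed illustrative values; actual Torino categories come from the published probability-energy chart, not from this function):

```python
import math

def impact_energy_megatons(diameter_m, velocity_km_s, density=2600.0):
    """Kinetic energy E = (1/2) m v^2 of a spherical impactor, expressed
    in megatons of TNT (1 Mt = 4.184e15 J). The default density is a
    rough stony-asteroid assumption."""
    radius = diameter_m / 2.0
    mass = density * (4.0 / 3.0) * math.pi * radius ** 3
    energy_joules = 0.5 * mass * (velocity_km_s * 1000.0) ** 2
    return energy_joules / 4.184e15

# A 100 m stony object at 20 km/s carries on the order of tens of megatons.
e_mt = impact_energy_megatons(100.0, 20.0)
```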
Wahlberg, Å; Andreen Sachs, M; Johannesson, K; Hallberg, G; Jonsson, M; Skoog Svanberg, A; Högberg, U
2017-07-01
To examine post-traumatic stress reactions among obstetricians and midwives, experiences of support, and professional consequences after severe events in the labour ward. Cross-sectional online survey from January 7 to March 10, 2014. Members of the Swedish Society of Obstetrics and Gynaecology and the Swedish Association of Midwives. Potentially traumatic events were defined as: the child died or was severely injured during delivery; maternal near-miss; maternal mortality; and other events such as violence or threat. The validated Screen Questionnaire Posttraumatic Stress Disorder (SQ-PTSD), based on the DSM-IV (1994), was used to assess partial post-traumatic stress disorder (PTSD) and probable PTSD. Partial or probable PTSD. The response rate was 47% for obstetricians (n = 706) and 40% (n = 1459) for midwives. Eighty-four percent of the obstetricians and 71% of the midwives reported experiencing at least one severe event on the delivery ward. Fifteen percent of both professions reported symptoms indicative of partial PTSD, whereas 7% of the obstetricians and 5% of the midwives indicated symptoms fulfilling PTSD criteria. Having experienced emotions of guilt or perceived insufficient support from friends predicted a higher risk of suffering from partial or probable PTSD. Obstetricians and midwives with partial PTSD symptoms chose to change their work to outpatient care significantly more often than colleagues without these symptoms. A substantial proportion of obstetricians and midwives reported symptoms of partial or probable PTSD after severe traumatic events experienced on the labour ward. Support and resilience training could reduce suffering and professional consequences for carers. In a survey, 15% of Swedish obstetricians and midwives reported PTSD symptoms after their worst obstetric event. © 2016 Royal College of Obstetricians and Gynaecologists.
Bullying victimisation and risk of psychotic phenomena: analyses of British national survey data.
Catone, Gennaro; Marwaha, Steven; Kuipers, Elizabeth; Lennox, Belinda; Freeman, Daniel; Bebbington, Paul; Broome, Matthew
2015-07-01
Being bullied is an aversive experience with short-term and long-term consequences, and is incorporated in biopsychosocial models of psychosis. We used the 2000 and the 2007 British Adult Psychiatric Morbidity Surveys to test the hypothesis that bullying is associated with individual psychotic phenomena and with psychosis, and predicts the later emergence of persecutory ideation and hallucinations. We analysed two nationally representative surveys of individuals aged 16 years or older in Great Britain (2000) and England (2007). Respondents were presented with a card listing stressful events to identify experiences of bullying over the entire lifespan. We assessed associations with the dependent variables persecutory ideation, auditory and visual hallucinations, and diagnosis of probable psychosis. All analyses were controlled for sociodemographic confounders, intelligence quotient (IQ), and other traumas. We used data for 8580 respondents from 2000 and 7403 from 2007. Bullying was associated with presence of persecutory ideation and hallucinations, remaining so after adjustment for sociodemographic factors, IQ, other traumas, and childhood sexual abuse. Bullying was associated with a diagnosis of probable psychosis. If reported at baseline, bullying predicted emergence and maintenance of persecutory ideation and hallucinations during 18 months of follow-up in the 2000 survey. Controlling for other traumas and childhood sexual abuse did not affect the association between bullying and psychotic symptoms, but reduced the significance of the association with diagnosis of probable psychosis. Bullying was most strongly associated with the presence of concurrent persecutory ideation and hallucinations. Bullying victimisation increases the risk of individual psychotic symptoms and of a diagnosis of probable psychosis. Early detection of bullying and use of treatments oriented towards its psychological consequences might ameliorate the course of psychosis. None. 
Copyright © 2015 Elsevier Ltd. All rights reserved.
Bayesian analysis of rare events
DOE Office of Scientific and Technical Information (OSTI.GOV)
Straub, Daniel, E-mail: straub@tum.de; Papaioannou, Iason; Betz, Wolfgang
2016-06-01
In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
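The core BUS construction can be illustrated with a minimal rejection sampler: a prior sample θ together with an auxiliary uniform u is accepted as a posterior sample exactly when u ≤ L(θ)/c, so posterior sampling becomes estimating the probability of the "event" u ≤ L(θ)/c. The sketch below uses a toy Gaussian prior and likelihood and plain Monte Carlo in place of FORM, IS or SuS; all numbers are illustrative choices, not from the paper.

```python
import random, math

# Toy setting: prior theta ~ N(0,1); one observation y = 1.0 with a
# Gaussian likelihood of standard deviation 0.5 (illustrative values).
y, sigma = 1.0, 0.5

def likelihood(theta):
    # Likelihood kernel, maximum value 1 at theta = y
    return math.exp(-0.5 * ((y - theta) / sigma) ** 2)

c = 1.0  # constant with c >= max likelihood

random.seed(0)
posterior = []
for _ in range(100_000):
    theta = random.gauss(0.0, 1.0)   # sample from the prior
    u = random.random()              # auxiliary uniform variable
    if u <= likelihood(theta) / c:   # the rare/acceptance "event" in BUS
        posterior.append(theta)

# Conjugate-Gaussian analytics give posterior mean y/(1 + sigma^2) = 0.8
print(sum(posterior) / len(posterior))
```

The acceptance probability equals the marginal likelihood up to the constant c, which is why reliability methods built for small event probabilities apply directly.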
Estimating transmission probability in schools for the 2009 H1N1 influenza pandemic in Italy.
Clamer, Valentina; Dorigatti, Ilaria; Fumanelli, Laura; Rizzo, Caterina; Pugliese, Andrea
2016-10-12
Epidemic models are being extensively used to understand the main pathways of spread of infectious diseases, and thus to assess control methods. Schools are well known to represent hot spots for epidemic spread; hence, understanding typical patterns of infection transmission within schools is crucial for designing adequate control strategies. The attention that was given to the 2009 A/H1N1pdm09 flu pandemic has made it possible to collect detailed data on the occurrence of influenza-like illness (ILI) symptoms in two primary schools of Trento, Italy. The data collected in the two schools were used to calibrate a discrete-time SIR model, which was designed to estimate the probabilities of influenza transmission within the classes, grades and schools using Markov Chain Monte Carlo (MCMC) methods. We found that the virus was mainly transmitted within class, with lower levels of transmission between students in the same grade and even lower, though not significantly so, among different grades within the schools. We estimated median values of R0 from the epidemic curves in the two schools of 1.16 and 1.40; on the other hand, we estimated the average number of students infected by the first school case to be 0.85 and 1.09 in the two schools. The discrepancy between the values of R0 estimated from the epidemic curve or from the within-school transmission probabilities suggests that household and community transmission played an important role in sustaining the school epidemics. The high probability of infection between students in the same class confirms that targeting within-class transmission is key to controlling the spread of influenza in school settings and, as a consequence, in the general population.
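A discrete-time SIR model of the kind calibrated in this study can be sketched as a chain-binomial simulation for a single class; the class size, per-contact transmission probability and recovery rate below are illustrative placeholders, not the MCMC estimates from the paper.

```python
import random

def simulate_class(n=25, p_within=0.05, gamma=0.33, days=60, seed=1):
    """Chain-binomial discrete-time SIR for a single class.

    p_within: per-day probability that one infectious classmate infects a
    given susceptible; gamma: daily recovery probability.  Both values are
    illustrative placeholders, not estimates from the paper.
    """
    random.seed(seed)
    S, I = n - 1, 1                        # one introduced case
    total_infected = 1
    for _ in range(days):
        p_inf = 1 - (1 - p_within) ** I    # escape-probability argument
        new_inf = sum(random.random() < p_inf for _ in range(S))
        new_rec = sum(random.random() < gamma for _ in range(I))
        S, I = S - new_inf, I + new_inf - new_rec
        total_infected += new_inf
        if I == 0:
            break
    return total_infected

sizes = [simulate_class(seed=s) for s in range(200)]
print(sum(sizes) / len(sizes))             # mean outbreak size per class
```

In a full model of this type, the same chain-binomial step would be layered with separate grade-level and school-level transmission probabilities, which is what the MCMC calibration estimates.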
Fire-probability maps for the Brazilian Amazonia
NASA Astrophysics Data System (ADS)
Cardoso, M.; Nobre, C.; Obregon, G.; Sampaio, G.
2009-04-01
Most fires in Amazonia result from the combination of climate and land-use factors. They occur mainly in the dry season and are used as an inexpensive tool for land clearing and management. However, their unintended consequences are of serious concern. Fire emissions are the most important sources of greenhouse gases and aerosols in the region, accidental fires are a major threat to protected areas, and frequent fires may lead to permanent conversion of forest areas into savannas. Fire-activity models have thus become important tools for environmental analyses in Amazonia. They are used, for example, in warning systems for monitoring the risk of burnings in protected areas, to improve the description of biogeochemical cycles and vegetation composition in ecosystem models, and to help estimate the long-term potential for savannas in biome models. Previous modeling studies for the whole region were produced in units of satellite fire pixels, which complicates their direct use for environmental applications. By reinterpreting remote-sensing-based data using a statistical approach, we were able to calibrate models for the whole region in units of probability, or the chance of fires occurring. The application of these models for the years 2005 and 2006 provided maps of fire potential at 3-month and 0.25-deg resolution as a function of precipitation and distance from main roads. In both years, the performance of the resulting maps was better for the period July-September. During these months, most satellite-based fire observations were located in areas with a relatively high chance of fire, as determined by the modeled probability maps. In addition to reproducing reasonably well the areas of maximum fire activity detected by remote sensing, the new results in units of probability are easier to apply than previous estimates from fire-pixel models.
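A fire-probability model of this general form — probability of fire per grid cell and season as a function of precipitation and distance from main roads — can be sketched as a logistic regression; the coefficients below are hypothetical placeholders, not the calibrated values from the study.

```python
import math

def fire_probability(precip_mm, road_dist_km,
                     b0=1.0, b1=-0.02, b2=-0.05):
    """Logistic fire-occurrence probability for one grid cell and season.

    Drier cells (low 3-month precipitation) and cells closer to main roads
    get higher probability.  The coefficients are illustrative only.
    """
    z = b0 + b1 * precip_mm + b2 * road_dist_km
    return 1.0 / (1.0 + math.exp(-z))

print(fire_probability(50, 5))     # dry cell near a road: markedly higher
print(fire_probability(600, 200))  # wet, remote cell: essentially zero
```

Calibration would fit b0, b1, b2 so that predicted probabilities match the observed frequency of satellite fire detections per cell.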
Fire-probability maps for the Brazilian Amazonia
NASA Astrophysics Data System (ADS)
Cardoso, Manoel; Sampaio, Gilvan; Obregon, Guillermo; Nobre, Carlos
2010-05-01
Most fires in Amazonia result from the combination of climate and land-use factors. They occur mainly in the dry season and are used as an inexpensive tool for land clearing and management. However, their unintended consequences are of serious concern. Fire emissions are the most important sources of greenhouse gases and aerosols in the region, accidental fires are a major threat to protected areas, and frequent fires may lead to permanent conversion of forest areas into savannas. Fire-activity models have thus become important tools for environmental analyses in Amazonia. They are used, for example, in warning systems for monitoring the risk of burnings in protected areas, to improve the description of biogeochemical cycles and vegetation composition in ecosystem models, and to help estimate the long-term potential for savannas in biome models. Previous modeling studies for the whole region were produced in units of satellite fire pixels, which complicates their direct use for environmental applications. By reinterpreting remote-sensing-based data using a statistical approach, we were able to calibrate models for the whole region in units of probability, or the chance of fires occurring. The application of these models for the years 2005 and 2006 provided maps of fire potential at 3-month and 0.25-deg resolution as a function of precipitation and distance from main roads. In both years, the performance of the resulting maps was better for the period July-September. During these months, most satellite-based fire observations were located in areas with a relatively high chance of fire, as determined by the modeled probability maps. In addition to reproducing reasonably well the areas of maximum fire activity detected by remote sensing, the new results in units of probability are easier to apply than previous estimates from fire-pixel models.
Adler, Peter H; Hamada, Neusa; Cavalcante do Nascimento, Jeane Marcelle; Grillet, Maria Eugenia
2017-01-01
Simulium guianense Wise is a Latin American vector complex of black flies associated with transmission of the causal agent of human onchocerciasis (river blindness). An analysis of the chromosomal banding patterns of 607 larvae of S. guianense s. l. revealed a high level of variation involving 83 macrogenomic rearrangements across 25 populations in Brazil, French Guiana, and Venezuela. The 25 populations were assigned to 13 cytoforms (A1, A2, B1–B4, C, D, E1–E4, and F), some of which are probably valid species. Based on geographical proximity, a member of the B group of cytoforms probably represents the name-bearing type specimen of S. guianense and the primary vector in the last-remaining onchocerciasis foci in the Western Hemisphere. Cytoform B3 in Amapá State is implicated as an anthropophilic simuliid in an area currently and historically free of onchocerciasis. Distributions of cytoforms are associated with geography, elevation, and drainage basin, and are largely congruent with ecoregions. Despite extraordinarily large larval populations of S. guianense s. l. in big rivers and consequent production of female flies for dispersal, the cytoforms maintain their chromosomal distinction within individual rivers, suggesting a high degree of fidelity to the specialized breeding habitats—rocky shoals—of the natal rivers. PMID:28727841
Adler, Peter H; Hamada, Neusa; Cavalcante do Nascimento, Jeane Marcelle; Grillet, Maria Eugenia
2017-01-01
Simulium guianense Wise is a Latin American vector complex of black flies associated with transmission of the causal agent of human onchocerciasis (river blindness). An analysis of the chromosomal banding patterns of 607 larvae of S. guianense s. l. revealed a high level of variation involving 83 macrogenomic rearrangements across 25 populations in Brazil, French Guiana, and Venezuela. The 25 populations were assigned to 13 cytoforms (A1, A2, B1-B4, C, D, E1-E4, and F), some of which are probably valid species. Based on geographical proximity, a member of the B group of cytoforms probably represents the name-bearing type specimen of S. guianense and the primary vector in the last-remaining onchocerciasis foci in the Western Hemisphere. Cytoform B3 in Amapá State is implicated as an anthropophilic simuliid in an area currently and historically free of onchocerciasis. Distributions of cytoforms are associated with geography, elevation, and drainage basin, and are largely congruent with ecoregions. Despite extraordinarily large larval populations of S. guianense s. l. in big rivers and consequent production of female flies for dispersal, the cytoforms maintain their chromosomal distinction within individual rivers, suggesting a high degree of fidelity to the specialized breeding habitats-rocky shoals-of the natal rivers.
NASA Astrophysics Data System (ADS)
Avila-Alonso, Dailé; Baetens, Jan M.; Cardenas, Rolando; de Baets, Bernard
2017-07-01
In this work, the photosynthesis model presented by Avila et al. in 2013 is extended, and more scenarios inhabited by ancient cyanobacteria are investigated to quantify the effects of ultraviolet (UV) radiation on their photosynthetic potential in marine environments of the Archean eon. We consider ferrous ions as blockers of UV during the Early Archean, while the absorption spectrum of chlorophyll a is used to quantify the fraction of photosynthetically active radiation absorbed by photosynthetic organisms. UV could have induced photoinhibition at the water surface, thereby strongly affecting species with low light-use efficiency. Photosynthetic potential is shown to have been higher in Early Archean marine environments than in the Late Archean, as a consequence of the attenuation of UVC and UVB by iron ions, which probably played an important role in protecting ancient free-floating bacteria from high-intensity UV radiation. Photosynthetic organisms in Archean coastal and ocean environments were probably abundant in the first 5 and 25 m of the water column, respectively. However, species with a relatively high efficiency in the use of light could have inhabited ocean waters down to a depth of 200 m, showing a deep chlorophyll maximum near 60 m depth. We show that electromagnetic radiation from the Sun, both UV and visible light, could have determined the vertical distribution of Archean marine photosynthetic organisms.
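The depth dependence of UV exposure that drives these results can be sketched with Beer-Lambert attenuation; the attenuation coefficients and the photoinhibition threshold below are illustrative assumptions, not the paper's parameter values.

```python
import math

# Beer-Lambert attenuation: E(z) = E0 * exp(-K * z).
# Illustrative attenuation coefficients (per metre): ferrous iron absorbs
# UV strongly, so K for UV in iron-rich water >> K in clearer water.
K_uv_iron, K_uv_clear = 0.9, 0.15

def irradiance(E0, K, z):
    """Irradiance at depth z (m) for surface irradiance E0."""
    return E0 * math.exp(-K * z)

def safe_depth(E0, K, threshold):
    """Shallowest depth at which irradiance falls below the (assumed)
    photoinhibition threshold."""
    return math.log(E0 / threshold) / K

E0_uv = 100.0        # surface UV, arbitrary units
uv_threshold = 10.0  # assumed photoinhibition threshold

print(safe_depth(E0_uv, K_uv_iron, uv_threshold))   # iron-rich water: ~2.6 m
print(safe_depth(E0_uv, K_uv_clear, uv_threshold))  # clearer water: ~15 m
```

The qualitative point survives the made-up numbers: stronger UV absorption by dissolved iron moves the photoinhibition boundary much closer to the surface, leaving more of the water column habitable.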
Rojas-Rodríguez, Jorge; Escobar-Linares, Luis E; Garcia-Carrasco, Mario; Escárcega, Ricardo O; Fuentes-Alexandro, Salvador; Zamora-Ustaran, Alfonso
2007-01-01
We propose that the pathogenesis of obesity-induced osteoarthritis may be explained by the metabolic changes in striated muscle induced by the interaction of insulin resistance and systemic inflammation in obese individuals with metabolic syndrome, with osteoarthritis being the final consequence of the physiological changes seen in the metabolic syndrome. Increased levels of TH1 cytokines are produced by activated macrophages in the presence of acute or chronic infectious disease and suppress the sensitivity of insulin receptors on the membranes of muscle cells and adipocytes. Both cell types are activated by inflammatory cytokines and contribute to enhancing acute inflammation and to maintaining a state of chronic, low-grade inflammation in apparently healthy obese individuals. The increased number of macrophages in the adipose tissue of obese individuals acts as an amplifier of inflammation. Patients with osteoarthritis and metabolic syndrome frequently complain of warmth and recurrent edema of the feet and hands. It is probable that hyperinsulinemia, in the presence of insulin resistance and inflammation, induces vasodilation through TNF-mediated iNOS overexpression. Patients with metabolic syndrome clinically express the consequences of poor uptake, storage and energy expenditure by muscle and other insulin-dependent tissues, and the consequences of high plasma insulin levels are vasodilation and increased protein synthesis.
The fatigue and muscle weakness induced by insulin resistance and inflammation in obese patients with metabolic syndrome increase the frequency and intensity of traumatic events in peripheral or axial joints, resulting in stretching and tearing of the tenoperiosteal junction and abrasive damage to cartilage. In these patients, with metabolic syndrome and a pro-inflammatory state, the reparative processes of cartilage and periarticular tissues would therefore be severely modified by growth-factor activity in the presence of high insulin levels.
NASA Astrophysics Data System (ADS)
Baklanov, A.; Mahura, A.; Sørensen, J. H.
2003-06-01
There are objects with periods of higher-than-normal risk of accidental atmospheric releases (nuclear, chemical, biological, etc.). Such accidents or events may occur due to natural hazards, human errors, terror acts, and during transportation of waste or various high-risk operations. A methodology for risk assessment is suggested, and it includes two approaches: 1) probabilistic analysis of possible atmospheric transport patterns using long-term trajectory and dispersion modelling, and 2) forecast and evaluation of possible contamination and consequences for the environment and population using operational dispersion modelling. The first approach could be applied during the preparation stage, and the second during the operation stage. The suggested methodology is applied to the most important phases (lifting, transportation, and decommissioning) of the "Kursk" nuclear submarine operation. It is found that the temporal variability of several probabilistic indicators (fast-transport probability fields, maximum reaching distance, maximum possible impact zone, and average integral concentration of 137Cs) showed that the fall of 2001 was the most appropriate time for the beginning of the operation. These indicators made it possible to identify the hypothetically impacted geographical regions and territories. For cases of atmospheric transport toward the most populated areas, forecasts of possible consequences during phases of high and medium potential risk, based on a unit hypothetical release (e.g. 1 Bq), are performed. The analysis showed that possible deposition fractions of 10^-11 Bq/m^2 over the Kola Peninsula, and 10^-12 to 10^-13 Bq/m^2 for the remote areas of Scandinavia and Northwest Russia, could be observed. The suggested methodology may be used successfully for any potentially dangerous object involving risk of atmospheric release of hazardous materials of nuclear, chemical or biological nature.
NASA Astrophysics Data System (ADS)
Baklanov, A.; Mahura, A.; Sørensen, J. H.
2003-03-01
There are objects with periods of higher-than-normal risk of accidental atmospheric releases (nuclear, chemical, biological, etc.). Such accidents or events may occur due to natural hazards, human errors, terror acts, and during transportation of waste or various high-risk operations. A methodology for risk assessment is suggested, and it includes two approaches: 1) probabilistic analysis of possible atmospheric transport patterns using long-term trajectory and dispersion modelling, and 2) forecast and evaluation of possible contamination and consequences for the environment and population using operational dispersion modelling. The first approach could be applied during the preparation stage, and the second during the operation stage. The suggested methodology is applied to the most important phases (lifting, transportation, and decommissioning) of the "Kursk" nuclear submarine operation. It is found that the temporal variability of several probabilistic indicators (fast-transport probability fields, maximum reaching distance, maximum possible impact zone, and average integral concentration of 137Cs) showed that the fall of 2001 was the most appropriate time for the beginning of the operation. These indicators made it possible to identify the hypothetically impacted geographical regions and territories. For cases of atmospheric transport toward the most populated areas, forecasts of possible consequences during phases of high and medium potential risk, based on a unit hypothetical release, are performed. The analysis showed that possible deposition fractions of 10^-11 Bq/m^2 over the Kola Peninsula, and 10^-12 to 10^-13 Bq/m^2 for the remote areas of Scandinavia and Northwest Russia, could be observed. The suggested methodology may be used successfully for any potentially dangerous object involving risk of atmospheric release of hazardous materials of nuclear, chemical or biological nature.
Discrete two-sex models of population dynamics: On modelling the mating function
NASA Astrophysics Data System (ADS)
Bessa-Gomes, Carmen; Legendre, Stéphane; Clobert, Jean
2010-09-01
Although sexual reproduction has long been a central subject of theoretical ecology, until recently its consequences for population dynamics were largely overlooked. This is now changing, and many studies have addressed this issue, showing that when the mating system is taken into account, the population dynamics depends on the relative abundance of males and females, and is non-linear. Moreover, sexual reproduction increases the extinction risk, notably due to the Allee effect. Nevertheless, different studies have identified diverse potential consequences, depending on the choice of mating function. In this study, we investigate the consequences of three alternative mating functions that are frequently used in discrete population models: the minimum; the harmonic mean; and the modified harmonic mean. We consider their consequences at three levels: on the probability that females will breed; on the presence and intensity of the Allee effect; and on the extinction risk. When we consider the harmonic mean, the number of times the individuals of the least abundant sex mate exceeds their mating potential, which implies that with variable sex ratios the potential reproductive rate is no longer under the modeller's control. Consequently, the female breeding probability exceeds 1 whenever the sex ratio is male-biased, which constitutes an obvious problem. The use of the harmonic mean is thus only justified if we think that this parameter should be re-defined in order to represent the females' breeding rate and the fact that females may reproduce more than once per breeding season. This phenomenon buffers the Allee effect, and reduces the extinction risk. However, when we consider birth-pulse populations, such a phenomenon is implausible because the number of times females can reproduce per birth season is limited. In general, the minimum or modified harmonic mean mating functions seem to be more suitable for assessing the impact of mating systems on population dynamics.
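The two most common mating functions at issue can be written down directly, which makes the problem with the harmonic mean visible: with a male-biased sex ratio the implied female "breeding probability" exceeds 1. The mating-potential constant k is set to 1 here for simplicity.

```python
def minimum(F, M, k=1.0):
    """Minimum mating function: matings are limited by the scarcer sex."""
    return k * min(F, M)

def harmonic_mean(F, M, k=1.0):
    """Harmonic-mean mating function."""
    return k * 2 * F * M / (F + M) if F + M > 0 else 0.0

def female_breeding_probability(mating_fn, F, M):
    """Matings per female implied by a given mating function."""
    return mating_fn(F, M) / F if F > 0 else 0.0

# With a male-biased sex ratio the harmonic-mean "probability" exceeds 1,
# the inconsistency discussed in the abstract:
F, M = 10, 40
print(female_breeding_probability(minimum, F, M))        # 1.0
print(female_breeding_probability(harmonic_mean, F, M))  # 1.6
```

The modified harmonic mean caps this quantity at the females' mating potential, which is why the abstract recommends it (or the minimum) for birth-pulse populations.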
Grapefruit and drug interactions.
2012-12-01
Since the late 1980s, grapefruit juice has been known to affect the metabolism of certain drugs. Several serious adverse effects involving drug interactions with grapefruit juice have been published in detail. The components of grapefruit juice vary considerably depending on the variety, maturity and origin of the fruit, local climatic conditions, and the manufacturing process. No single component accounts for all observed interactions. Other grapefruit products are also occasionally implicated, including preserves, lyophilised grapefruit juice, powdered whole grapefruit, grapefruit seed extract, and zest. Clinical reports of drug interactions with grapefruit juice are supported by pharmacokinetic studies, each usually involving about 10 healthy volunteers, in which the probable clinical consequences were extrapolated from the observed plasma concentrations. Grapefruit juice inhibits CYP3A4, the cytochrome P450 isoenzyme most often involved in drug metabolism. This increases plasma concentrations of the drugs concerned, creating a risk of overdose and dose-dependent adverse effects. Grapefruit juice also inhibits several other cytochrome P450 isoenzymes, but they are less frequently implicated in interactions with clinical consequences. Drugs interacting with grapefruit and inducing serious clinical consequences (confirmed or very probable) include: immunosuppressants, some statins, benzodiazepines, most calcium channel blockers, indinavir and carbamazepine. There are large inter-individual differences in enzyme efficiency. Along with the variable composition of grapefruit juice, this makes it difficult to predict the magnitude and clinical consequences of drug interactions with grapefruit juice in a given patient. There is increasing evidence that transporter proteins such as organic anion transporters and P-glycoprotein are involved in interactions between drugs and grapefruit juice. In practice, numerous drugs interact with grapefruit juice. 
Although only a few reports involving severe clinical consequences have been published, they suggest that grapefruit juice should be avoided during drug therapy, especially when the drug has a narrow therapeutic margin or carries a risk of serious dose-dependent adverse effects. Patients should be informed of this risk whenever a drug is prescribed or dispensed.
Maritime Transportation Risk Assessment of Tianjin Port with Bayesian Belief Networks.
Zhang, Jinfen; Teixeira, Ângelo P; Guedes Soares, C; Yan, Xinping; Liu, Kezhong
2016-06-01
This article develops a Bayesian belief network model for the prediction of accident consequences in the Tianjin port. The study starts with a statistical analysis of six years of historical accident data, from 2008 to 2013. A Bayesian belief network is then constructed to express the dependencies between the indicator variables and accident consequences. Statistics and expert knowledge are synthesized in the Bayesian belief network model to obtain the probability distribution of the consequences. Through a sensitivity analysis, several indicator variables that influence the consequences are identified, including navigational area, ship type and time of day. The results indicate that the consequences are most sensitive to the position where the accidents occurred, followed by time of day and ship length. The results also indicate that the navigational risk of the Tianjin port is at an acceptable level, although there is room for improvement. These results can be used by the Maritime Safety Administration to take effective measures to enhance maritime safety in the Tianjin port. © 2016 Society for Risk Analysis.
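The kind of inference such a Bayesian belief network supports can be sketched by enumeration on a toy two-parent network; the variables and conditional probabilities below are invented for illustration and are not the Tianjin statistics.

```python
from itertools import product

# Toy CPTs (illustrative numbers only).
p_area = {"channel": 0.6, "anchorage": 0.4}      # P(navigational area)
p_night = {True: 0.4, False: 0.6}                # P(night-time accident)
# P(severe consequence | area, night)
p_severe = {("channel", True): 0.30, ("channel", False): 0.15,
            ("anchorage", True): 0.10, ("anchorage", False): 0.05}

def p_severe_given(area=None, night=None):
    """Enumerate the joint distribution, conditioning on any evidence."""
    num = den = 0.0
    for a, n in product(p_area, p_night):
        if area is not None and a != area:
            continue
        if night is not None and n != night:
            continue
        w = p_area[a] * p_night[n]
        num += w * p_severe[(a, n)]
        den += w
    return num / den

print(p_severe_given())                            # marginal: 0.154
print(p_severe_given(area="channel", night=True))  # conditioned: 0.30
```

Sensitivity analysis of the sort reported in the article amounts to comparing how much such conditional probabilities move as each evidence variable is varied.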
Is seeing believing? Perceptions of wildfire risk over time
Patricia A. Champ; Hannah Brenkert-Smith
2016-01-01
Ongoing challenges to understanding how hazard exposure and disaster experiences influence perceived risk lead us to ask: Is seeing believing? We approach risk perception by attending to two components of overall risk perception: perceived probability of an event occurring and perceived consequences if an event occurs. Using a two-period longitudinal data set...
Richard Wagner: Twilight of the Nazi Spell.
ERIC Educational Resources Information Center
Lindemann, Dirk
1985-01-01
Richard Wagner was probably the most influential musician of the 19th century. However, his image as an alleged intellectual-spiritual forerunner of national socialism through his music and prose works fosters aversion among critics. Whether Wagner's entanglement of art and ideology has had any lasting consequence for his reputation is discussed. (RM)
How many universes are necessary for an ice cream to melt?
NASA Astrophysics Data System (ADS)
Cirkovic, Milan M.
We investigate a quantitative consequence of the Acausal-Anthropic approach to solving the long-standing puzzle of the thermodynamical arrow of time. Notably, the size of the required multiverse is estimated on the basis of the classical Boltzmann connection between entropy and probability, as well as the thermodynamic properties of black holes.
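The Boltzmann connection invoked here can be stated compactly: with entropy given by S = k_B ln W, the probability of a spontaneous fluctuation that lowers entropy by ΔS scales as exp(-ΔS/k_B), so an ensemble of roughly exp(ΔS/k_B) members is needed to expect one such fluctuation. A schematic of the argument, not the paper's detailed estimate:

```latex
S = k_B \ln W, \qquad
P(\Delta S) \sim e^{-\Delta S / k_B}
\quad\Rightarrow\quad
N_{\text{universes}} \sim \frac{1}{P(\Delta S)} \sim e^{\Delta S / k_B}.
```

Because thermodynamic entropy differences are enormous in units of k_B, the implied multiverse size is correspondingly (double-exponentially) large.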
Physical consequences of large organic debris in Pacific Northwest streams.
Frederick J. Swanson; George W. Lienkaemper
1978-01-01
Large organic debris in streams controls the distribution of aquatic habitats, the routing of sediment through stream systems, and the stability of streambed and banks. Management activities directly alter debris loading by addition or removal of material and indirectly by increasing the probability of debris torrents and removing standing streamside trees. We propose...
Line intersect sampling: Ell-shaped transects and multiple intersections
Timothy G. Gregoire; Harry T. Valentine
2003-01-01
The probability of selecting a population element under line intersect sampling depends on the width of the particle in the direction perpendicular to the transect, as is well known. The consequences of this when using ell-shaped transects rather than straight-line transects are explicated, and modifications that preserve design-unbiasedness of Kaiser's (1983)...
Baroreflex Sensitivity Is Reduced in Adolescents with Probable Developmental Coordination Disorder
ERIC Educational Resources Information Center
Coverdale, Nicole S.; O'Leary, Deborah D.; Faught, Brent E.; Chirico, Daniele; Hay, John; Cairney, John
2012-01-01
Developmental coordination disorder (DCD) is a neurodevelopmental condition characterized by poor motor skills leading to a significant impairment in activities of daily living. Compared to typically developing children, those with DCD are less fit and physically active, and have increased body fat. This is an important consequence as both…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-06
... significantly increase the probability or consequences of accidents. No changes are being made in the types of effluents that may be released offsite. There is no significant increase in the amount of any effluent released offsite. There is no significant increase in occupational or public radiation exposure. Therefore...
Risk Assessment: Evidence Base
NASA Technical Reports Server (NTRS)
Johnson-Throop, Kathy A.
2007-01-01
Human systems PRA (Probabilistic Risk Assessment): a) provides quantitative measures of probability, consequence, and uncertainty; and b) communicates risk and informs decision-making. Human health risks rated highest in the ISS PRA are based on a 1997 assessment of clinical events in analog operational settings. Much work remains to analyze the remaining human health risks identified in the Bioastronautics Roadmap.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-04-06
... of an accident or that supports mitigation of an accident previously evaluated. The proposed... probability or consequences of an accident previously evaluated; or (2) create the possibility of a new or different kind of accident from any accident previously evaluated; or (3) involve a significant reduction in...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-21
...) involve a significant increase in the probability or consequences of an accident previously evaluated; or (2) create the possibility of a new or different kind of accident from any accident previously... statement of the alleged facts or expert opinion which support the contention and on which the requestor...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-11-01
... [email protected] . Federal Rulemaking Web Site: Public comments and supporting materials related... increase in the probability or consequences of an accident previously evaluated; or (2) create the possibility of a new or different kind of accident from any accident previously evaluated; or (3) involve a...
Managing forest landscapes for climate change. Chapter 3.
Thomas R. Crow
2008-01-01
Climate change is the defining issue of the day and probably for many subsequent generations of resource managers. Although the public and therefore the policymakers have been slow in grasping the far-reaching consequences of climate change on our social and economic institutions, they are now desperately seeking options for dealing with novel climates, ecological...
The Theory and Practice of Alternative Certification: Implications for the Improvement of Teaching.
ERIC Educational Resources Information Center
Hawley, Willis D.
1990-01-01
Identifies questions related to the processes and consequences of alternative teacher certification (AC), answers questions with research-based facts, proposes key elements of a model AC program, and draws conclusions about the directions AC may take and its probable effects on educational reform and on the professionalization of teaching. (SM)
Setting Priorities: Personal Values, Organizational Results. Ideas into Action Guidebooks
ERIC Educational Resources Information Center
Cartwright, Talula
2007-01-01
Successful leaders get results. To get results, you need to set priorities. This book can help you do a better job of setting priorities, recognizing the personal values that motivate your decision making, the probable trade-offs and consequences of your decisions, and the importance of aligning your priorities with your organization's…
Decision making under uncertainty: a quasimetric approach.
N'Guyen, Steve; Moulin-Frier, Clément; Droulez, Jacques
2013-01-01
We propose a new approach for solving a class of discrete decision making problems under uncertainty with positive cost. This issue concerns multiple and diverse fields such as engineering, economics, artificial intelligence, cognitive science and many others. Basically, an agent has to choose a single action or series of actions from a set of options, without knowing for sure their consequences. Schematically, two main approaches have been followed: either the agent learns which option is the correct one to choose in a given situation by trial and error, or the agent already has some knowledge of the possible consequences of its decisions, this knowledge generally being expressed as a conditional probability distribution. In the latter case, several optimal or suboptimal methods have been proposed to exploit this uncertain knowledge in various contexts. In this work, we propose a different approach, based on the geometric intuition of distance. More precisely, we define a goal-independent quasimetric structure on the state space, taking into account both the cost function and the transition probability. We then compare precision and computation time with classical approaches.
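A distance structure of the kind described, combining cost and transition probability, can be sketched for a fixed goal as a stochastic-shortest-path fixed point, d(x) = min_a [c(x,a) + Σ P(x'|x,a) d(x')] with d(goal) = 0, computed by value iteration. The small MDP below is an invented toy, not an example from the paper.

```python
# States 0..3, goal = 3.  actions[x] lists (cost, {next_state: prob}).
states = [0, 1, 2, 3]
goal = 3
actions = {
    0: [(1.0, {1: 1.0}), (2.5, {2: 1.0})],
    1: [(1.0, {2: 0.8, 0: 0.2})],
    2: [(1.0, {3: 0.9, 1: 0.1})],
}

# Expected cost-to-goal, initialised high and relaxed to the fixed point:
#   d(x) = min_a [ c(x,a) + sum_x' P(x'|x,a) * d(x') ],  d(goal) = 0.
d = {x: 0.0 if x == goal else 100.0 for x in states}
for _ in range(200):
    for x in states:
        if x == goal:
            continue
        d[x] = min(c + sum(p * d[nx] for nx, p in trans.items())
                   for c, trans in actions[x])

print({x: round(v, 3) for x, v in d.items()})
```

The resulting d behaves like a distance to the goal (non-negative, zero only at the goal, obeying the triangle inequality along action chains) but is generally asymmetric, which is what makes the structure a quasimetric rather than a metric.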
Vulnerability inducing technologies: An initial appreciation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reinhardt, G.C.
The arms control community continues to act as though vulnerability were directly proportional to numbers of nuclear weapons, however rapidly they would voice their rejection of such a proposition if it were placed before them in an intellectual forum. Such neglect in matching action to knowledge is a well-known human phenomenon, but in this case it is particularly troublesome. START manages to reduce the numbers of weapons to just the right extent to encourage damage limiting. The present number of nuclear weapons on either side probably provides a robust deterrent; six thousand probably does not. To make matters worse, we live in a period of burgeoning technical expansion, so that even with the best intent on both sides, new technology threatens to cause new vulnerabilities in strategic systems. To pin a shorthand label on the problem, we will refer to vulnerability-inducing technology as "VIT." In order to appreciate VIT, we will make a rough quantification of its consequences. This will at least provide some incentive for further study because the consequences are grave indeed. 2 tabs.
NASA Technical Reports Server (NTRS)
Putcha, Chandra S.; Mikula, D. F. Kip; Dueease, Robert A.; Dang, Lan; Peercy, Robert L.
1997-01-01
This paper deals with the development of a reliability methodology to assess the consequences of using hardware, without failure analysis or corrective action, that has previously demonstrated that it did not perform per specification. The subject of this paper arose from the need to provide a detailed probabilistic analysis to calculate the change in probability of failure with respect to the base (non-failed) hardware. The methodology used for the analysis is primarily based on principles of Monte Carlo simulation. The random variables in the analysis are the Maximum Time of Operation (MTO) and the Operation Time of each Unit (OTU). A unit is considered to fail if its OTU is less than the MTO for the Normal Operational Period (NOP) in which the unit is used. The NOP as a whole uses a total of 4 units. Two cases are considered. In the first scenario, a system failure is considered to occur if any of the units used during the NOP fails. In the second scenario, a system failure is considered to occur only if any two of the units used during the NOP fail together. The probability of failure of the units and of the system as a whole is determined for 3 kinds of systems: a Perfect System, Imperfect System 1, and Imperfect System 2. In a Perfect System, the operation time of the failed unit is the same as the MTO. In Imperfect System 1, the operation time of the failed unit is assumed to be 1 percent of the MTO. In Imperfect System 2, the operation time of the failed unit is assumed to be zero. In addition, the simulated operation time of failed units is assumed to be 10 percent of that of the corresponding units before the zero value. Monte Carlo simulation analysis is used for this study, and the necessary software was developed as part of this study to perform the reliability calculations.
The results of the analysis showed that the predicted change in failure probability (P(sub F)) for the previously failed units is as high as 49 percent above the baseline (perfect system) for the worst case. The predicted change in system P(sub F) for the previously failed units is as high as 36% for single unit failure without any redundancy. For redundant systems, with dual unit failure, the predicted change in P(sub F) for the previously failed units is as high as 16%. These results will help management to make decisions regarding the consequences of using previously failed units without adequate failure analysis or corrective action.
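A minimal Monte Carlo sketch of the single-failure vs. two-unit-failure comparison described above. The paper's actual distributions are not given, so the MTO and OTU are drawn uniformly here as stand-ins, and a previously failed unit is modeled by scaling its operating time to 10 percent of a fresh unit's; all parameter values are assumptions.

```python
import random

def system_failure_prob(n_units=4, degraded=frozenset(), redundant=False,
                        trials=20_000, seed=1):
    """Monte Carlo estimate of system failure probability.

    A unit fails when its operating time (OTU) falls short of the mission
    time (MTO). `degraded` marks previously failed units, modeled by
    scaling their operating time to 10% of a fresh unit's. Without
    redundancy, the system fails on any unit failure; with redundancy,
    it fails only when two or more units fail.
    """
    rng = random.Random(seed)
    failures = 0
    for _ in range(trials):
        mto = rng.uniform(0.5, 1.0)                 # mission time
        n_failed = 0
        for u in range(n_units):
            otu = rng.uniform(0.0, 1.0)             # unit operating time
            if u in degraded:
                otu *= 0.10                         # previously failed unit
            n_failed += otu < mto
        failures += n_failed >= (2 if redundant else 1)
    return failures / trials

base = system_failure_prob()                          # all units fresh
single = system_failure_prob(degraded={0})            # one degraded unit, no redundancy
dual = system_failure_prob(degraded={0}, redundant=True)
```

The ordering of the three estimates mirrors the abstract's findings: a degraded unit raises the failure probability over the baseline, and dual-unit redundancy pulls it back down.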
Probability in reasoning: a developmental test on conditionals.
Barrouillet, Pierre; Gauffroy, Caroline
2015-04-01
Probabilistic theories have been claimed to constitute a new paradigm for the psychology of reasoning. A key assumption of these theories is captured by what they call the Equation, the hypothesis that the meaning of the conditional is probabilistic in nature and that the probability of If p then q is the conditional probability, in such a way that P(if p then q)=P(q|p). Using the probabilistic truth-table task in which participants are required to evaluate the probability of If p then q sentences, the present study explored the pervasiveness of the Equation through ages (from early adolescence to adulthood), types of conditionals (basic, causal, and inducements) and contents. The results reveal that the Equation is a late developmental achievement only endorsed by a narrow majority of educated adults for certain types of conditionals depending on the content they involve. Age-related changes in evaluating the probability of all the conditionals studied closely mirror the development of truth-value judgements observed in previous studies with traditional truth-table tasks. We argue that our modified mental model theory can account for this development, and hence for the findings related with the probability task, which do not consequently support the probabilistic approach of human reasoning over alternative theories. Copyright © 2014 Elsevier B.V. All rights reserved.
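The Equation, P(if p then q) = P(q|p), amounts to restricting attention to the p-rows of the truth table. A minimal sketch with an invented frequency table:

```python
def conditional_probability(freq):
    """The 'Equation': P(if p then q) = P(q|p), estimated from the four
    truth-table cell counts; the not-p rows are simply ignored."""
    pq = freq[("p", "q")]
    p_not_q = freq[("p", "not-q")]
    return pq / (pq + p_not_q)

# Invented example: 10 cards, "if a card is red (p), it shows an even number (q)".
freq = {("p", "q"): 3, ("p", "not-q"): 1,
        ("not-p", "q"): 4, ("not-p", "not-q"): 2}
print(conditional_probability(freq))  # 0.75
```

Note the contrast with a material-conditional reading, which would count every non-p card as confirming and give 9/10 rather than 3/4; the gap between these two answers is exactly what the probabilistic truth-table task measures.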
Masip, Jaume; Blandón-Gitlin, Iris; de la Riva, Clara; Herrero, Carmen
2016-09-01
Meta-analyses reveal that behavioral differences between liars and truth tellers are small. To facilitate lie detection, researchers are currently developing interviewing approaches to increase these differences. Some of these approaches assume that lying is cognitively more difficult than truth telling; however, they are not based on specific cognitive theories of lie production, which are rare. Here we examined one existing theory, Walczyk et al.'s (2014) Activation-Decision-Construction-Action Theory (ADCAT). We tested the Decision component. According to ADCAT, people decide whether to lie or tell the truth as if they were using a specific mathematical formula to calculate the motivation to lie from (a) the probability of a number of outcomes derived from lying vs. telling the truth, and (b) the costs/benefits associated with each outcome. In this study, participants read several hypothetical scenarios and indicated whether they would lie or tell the truth in each scenario (Questionnaire 1). Next, they answered several questions about the consequences of lying vs. telling the truth in each scenario, and rated the probability and valence of each consequence (Questionnaire 2). Significant associations were found between the participants' dichotomous decision to lie/tell the truth in Questionnaire 1 and their motivation to lie scores calculated from the Questionnaire 2 data. However, interestingly, whereas the expected consequences of truth telling were associated with the decision to lie vs. tell the truth, the expected consequences of lying were not. Suggestions are made to refine ADCAT, which can be a useful theoretical framework to guide deception research. Copyright © 2016 Elsevier B.V. All rights reserved.
Robinson, Gilpin R.; Lesure, Frank G.; Marlowe, J. I.; Foley, Nora K.; Clark, S.H.
2004-01-01
Vermiculite was produced from a large deposit near Tigerville, S.C., in the Inner Piedmont; the deposit has been worked out and the mine backfilled. Smaller deposits associated with ultramafic rocks in the east flank of the Blue Ridge are now uneconomic and have not been worked in the past 20 years. C. Metals: Copper occurs in three deposits: the Fontana and Hazel Creek mines in the Great Smoky Mountains National Park in the Central Blue Ridge, and the Cullowhee mine in the east flank of the Blue Ridge. D. Organic fuels: The rocks of the quadrangle contain no coal and probably lie outside the maximum range in thermal maturity permitting the survival of oil. The rocks in the Valley and Ridge and for a short distance eastward below the west flank of the Blue Ridge probably lie within a zone of thermal maturity permitting the survival of natural gas. Consequently, the western part of the quadrangle is an area of high risk for hydrocarbon exploration. No exploration drilling has been done in this belt. [Figure 1. Location of the Knoxville 1ºx2º quadrangle, with state and county boundaries.]
Developing Probabilistic Safety Performance Margins for Unknown and Underappreciated Risks
NASA Technical Reports Server (NTRS)
Benjamin, Allan; Dezfuli, Homayoon; Everett, Chris
2015-01-01
Probabilistic safety requirements currently formulated or proposed for space systems, nuclear reactor systems, nuclear weapon systems, and other types of systems that have a low-probability potential for high-consequence accidents depend on showing that the probability of such accidents is below a specified safety threshold or goal. Verification of compliance depends heavily upon synthetic modeling techniques such as PRA. To determine whether or not a system meets its probabilistic requirements, it is necessary to consider whether there are significant risks that are not fully considered in the PRA either because they are not known at the time or because their importance is not fully understood. The ultimate objective is to establish a reasonable margin to account for the difference between known risks and actual risks in attempting to validate compliance with a probabilistic safety threshold or goal. In this paper, we examine data accumulated over the past 60 years from the space program, from nuclear reactor experience, from aircraft systems, and from human reliability experience to formulate guidelines for estimating probabilistic margins to account for risks that are initially unknown or underappreciated. The formulation includes a review of the safety literature to identify the principal causes of such risks.
Socioeconomic status and misperception of body mass index among Mexican adults.
Arantxa Colchero, M; Caro-Vega, Yanink; Kaufer-Horwitz, Martha
2014-01-01
To estimate the association between perceived body mass index (BMI) and socioeconomic variables in adults in Mexico. We studied 32052 adults from the Mexican National Health and Nutrition Survey of 2006. We estimated BMI misperception by comparing the respondent's weight perception (as categories of BMI) with the corresponding category according to measured weight and height. Misperception was defined as the respondent's perception of a BMI category different from their actual category. Socioeconomic status was assessed using household assets. Logistic and multinomial regression models by gender and BMI category were estimated. Adult women and men highly underestimate their BMI category. We found that the probability of a correct classification was lower than the probability of getting a correct result by chance alone. Better educated and more affluent individuals are more likely to have a correct perception of their weight status, particularly among overweight adults. Given that a correct perception of weight has been associated with greater engagement in weight control, and that our results show that the studied population underestimated their BMI, interventions that communicate the definitions and consequences of overweight and obesity and encourage the population to monitor their weight could be beneficial.
Cannon, B; Bernson, V S; Nedergaard, J
1984-08-31
Brown fat mitochondria obtained from a hibernator, the golden hamster, were investigated in order to elucidate the significance of membrane permeability for metabolic functioning at different temperatures. The mitochondria were shown to have active permeases for phosphate and pyruvate, but very poorly developed permeases for di- and tricarboxylate substrate anions. This was shown with both osmotic swelling techniques and respiration-driven uptake studies. It was shown that the very limited malate permeation observed was compatible with it being a non-carrier-mediated diffusion process. The role of malate transport in supporting fatty-acid oxidation in vitro as a function of temperature was studied in detail. The results support our earlier suggestion that physiologically pyruvate carboxylase probably functions to generate oxaloacetate when high concentrations of condensing partner are needed during thermogenesis. They may also explain earlier observations that acetate was produced from palmitoyl-carnitine at low temperatures even when malate was present; this is here shown to be due to the limited malate permeability at these low temperatures. Thus, even at the body temperature of the hibernating hamster (4-5 degrees C), brown fat is probably able to combust fatty acids totally.
Cox, Melissa D; Myerscough, Mary R
2003-07-21
This paper develops and explores a model of foraging in honey bee colonies. The model may be applied to forage sources with various properties, and to colonies with different foraging-related parameters. In particular, we examine the effect of five foraging-related parameters on the foraging response and consequent nectar intake of a homogeneous colony. The parameters investigated affect different quantities critical to the foraging cycle--visit rate (affected by g), probability of dancing (mpd and bpd), duration of dancing (mcirc), or probability of abandonment (A). We show that one parameter, A, affects nectar intake in a nonlinear way. Further, we show that colonies with a midrange value of any foraging parameter perform better than the average of colonies with high- and low-range values, when profitable sources are available. Together these observations suggest that a heterogeneous colony, in which a range of parameter values are present, may perform better than a homogeneous colony. We modify the model to represent heterogeneous colonies and use it to show that the most important effect of heterogeneous foraging behaviour within the colony is to reduce the variance in the average quantity of nectar collected by heterogeneous colonies.
Dermol, Janja; Miklavčič, Damijan
2014-12-01
High voltage electric pulses cause electroporation of the cell membrane. Consequently, the flow of molecules across the membrane increases. In our study we investigated the possibility of predicting the percentage of electroporated cells in an inhomogeneous electric field on the basis of experimental results obtained when cells were exposed to a homogeneous electric field. We compared and evaluated different mathematical models previously suggested by other authors for interpolation of the results (symmetric sigmoid, asymmetric sigmoid, hyperbolic tangent and Gompertz curve). We investigated the effect of cell density and observed that it has the most significant effect on the electroporation of the cells, while all four mathematical models yielded similar results. We were able to predict electroporation of cells exposed to an inhomogeneous electric field based on mathematical modeling and using mathematical formulations of electroporation probability obtained experimentally by exposing cells of the same density to a homogeneous field. Models describing cell electroporation probability can be useful for the development and presentation of treatment planning for electrochemotherapy and non-thermal irreversible electroporation. Copyright © 2014 Elsevier B.V. All rights reserved.
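The four interpolation models compared in the study can be written down directly as maps from local field strength to electroporation probability. The parameter values below are illustrative assumptions, not the fitted values from the experiments:

```python
import math

def sym_sigmoid(E, E50=700.0, k=0.01):
    """Symmetric sigmoid: electroporation probability vs. field strength (V/cm)."""
    return 1.0 / (1.0 + math.exp(-k * (E - E50)))

def asym_sigmoid(E, E50=700.0, k=0.01, d=2.0):
    """Asymmetric (generalized logistic) sigmoid; d skews the shoulder."""
    return (1.0 + math.exp(-k * (E - E50))) ** (-d)

def gompertz(E, b=5.0, c=0.005):
    """Gompertz curve: asymmetric, with a slower approach to saturation."""
    return math.exp(-b * math.exp(-c * E))

def tanh_model(E, E50=700.0, k=0.005):
    """Hyperbolic-tangent model; algebraically a rescaled sigmoid."""
    return 0.5 * (1.0 + math.tanh(k * (E - E50)))
```

Each returns a value in [0, 1] and rises monotonically with the field, which is what lets a curve fitted under a homogeneous field be evaluated pointwise on the field map of an inhomogeneous exposure.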
Cortisol shifts financial risk preferences
Kandasamy, Narayanan; Hardy, Ben; Page, Lionel; Schaffner, Markus; Graggaber, Johann; Powlson, Andrew S.; Fletcher, Paul C.; Gurnell, Mark; Coates, John
2014-01-01
Risk taking is central to human activity. Consequently, it lies at the focal point of behavioral sciences such as neuroscience, economics, and finance. Many influential models from these sciences assume that financial risk preferences form a stable trait. Is this assumption justified and, if not, what causes the appetite for risk to fluctuate? We have previously found that traders experience a sustained increase in the stress hormone cortisol when the amount of uncertainty, in the form of market volatility, increases. Here we ask whether these elevated cortisol levels shift risk preferences. Using a double-blind, placebo-controlled, cross-over protocol we raised cortisol levels in volunteers over 8 d to the same extent previously observed in traders. We then tested for the utility and probability weighting functions underlying their risk taking and found that participants became more risk-averse. We also observed that the weighting of probabilities became more distorted among men relative to women. These results suggest that risk preferences are highly dynamic. Specifically, the stress response calibrates risk taking to our circumstances, reducing it in times of prolonged uncertainty, such as a financial crisis. Physiology-induced shifts in risk preferences may thus be an underappreciated cause of market instability. PMID:24550472
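Probability weighting functions of the kind tested in such designs are commonly parameterized with the Tversky-Kahneman (1992) one-parameter form, where a smaller gamma means stronger distortion. The study's fitted values are not reproduced here; this is a generic sketch:

```python
def tk_weight(p, gamma):
    """Tversky-Kahneman (1992) probability weighting function.

    gamma = 1 reproduces the probabilities unchanged; gamma < 1 gives the
    characteristic inverse-S distortion: small probabilities are
    overweighted and large ones underweighted."""
    num = p ** gamma
    return num / (num + (1.0 - p) ** gamma) ** (1.0 / gamma)

print(tk_weight(0.01, 0.5))  # ~0.083: a 1% chance is weighted like an 8% chance
```

"More distorted weighting among men" in this parameterization would correspond to a lower fitted gamma in the cortisol condition.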
Approach to patients with pulmonary embolism in a surgical intensive care unit.
Grigorakos, Leonidas; Sotiriou, Evangelia; Myrianthefs, P; Michail, Anastasia; Koulendi, Despina; Zidianakis, Vasilis; Gianakopoulos, K; Baltopoulos, G
2008-01-01
Pulmonary embolism (PE) is a potentially life threatening disease. Clinical signs and symptoms allow the clinician to determine the pretest probability of someone having pulmonary embolism but are insufficient to diagnose or rule out the condition. This paper aims to study the clinical presentation, identify the risk factors and evaluate the diagnostic strategies and management of patients with PE. We reviewed the medical files of 69 patients diagnosed with PE who were admitted to the Surgical Care Unit. Dyspnea, pleuritic pain, haemoptysis, fever and cough were the most common presenting symptoms. Risk factors for PE were found in 90% of cases. The D-dimer assay was elevated in all cases (100%), and the other diagnostic strategies used showed great accuracy in confirming the pretest probabilities of PE. Notably, 75% of the patients had deep vein thrombosis as assessed by venous ultrasonography. Mortality due to PE was approximately 6.9%. PE can often be overlooked, with hazardous consequences. Clinical evaluation in combination with spiral CT or lung scintigraphy, vein ultrasound and D-dimer level can establish the diagnosis in the majority of patients so that effective treatment can be started as soon as possible.
Faverjon, C; Leblond, A; Lecollinet, S; Bødker, R; de Koeijer, A A; Fischer, E A J
2017-12-01
African horse sickness (AHS) and equine encephalosis (EE) are Culicoides-borne viral diseases that could have the potential to spread across Europe if introduced, thus being potential threats for the European equine industry. Both share similar epidemiology, transmission patterns and geographical distribution. Using stochastic spatiotemporal models of virus entry, we assessed and compared the probabilities of both viruses entering France via two pathways: importation of live-infected animals or importation of infected vectors. Analyses were performed for three consecutive years (2010-2012). Seasonal and regional differences in virus entry probabilities were the same for both diseases. However, the probability of EE entry was much higher than the probability of AHS entry. Interestingly, the most likely entry route differed between AHS and EE: AHS has a higher probability to enter through an infected vector and EE has a higher probability to enter through an infectious host. Consequently, different effective protective measures were identified by 'what-if' scenarios for the two diseases. The implementation of vector protection on all animals (equine and bovine) coming from low-risk regions before their importation was the most effective in reducing the probability of AHS entry. On the other hand, the most significant reduction in the probability of EE entry was obtained by the implementation of quarantine before import for horses coming from both EU and non-EU countries. The developed models can be useful to implement risk-based surveillance. © 2016 Blackwell Verlag GmbH.
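An illustrative simplification of how pathway-specific probabilities combine into an overall entry probability, assuming independent importation events. The paper's stochastic spatiotemporal model is richer than this, and all numbers below are invented:

```python
def entry_probability(p_host, n_hosts, p_vector, n_vectors):
    """Probability that at least one infected host or infected vector
    enters, given per-importation infection probabilities and the numbers
    of imported animals and vectors, assuming independence."""
    p_no_entry = (1.0 - p_host) ** n_hosts * (1.0 - p_vector) ** n_vectors
    return 1.0 - p_no_entry

# AHS-like case: the vector pathway dominates.
ahs = entry_probability(p_host=1e-6, n_hosts=1000, p_vector=1e-4, n_vectors=500)
# EE-like case: the host pathway dominates, and overall entry is more likely.
ee = entry_probability(p_host=1e-4, n_hosts=1000, p_vector=1e-6, n_vectors=500)
```

In this toy setup, reducing `p_host` (e.g. by quarantine before import) cuts the EE-like entry probability far more than reducing `p_vector` would, which mirrors the pathway-specific measures identified by the 'what-if' scenarios.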
Thureborn, Petter; Franzetti, Andrea; Lundin, Daniel; Sjöling, Sara
2016-01-01
Baltic Sea deep water and sediments hold one of the largest anthropogenically induced hypoxic areas in the world. High nutrient input and low water exchange result in eutrophication and oxygen depletion below the halocline. As a consequence, at Landsort Deep, the deepest point of the Baltic Sea, anoxia in the sediments has been a persistent condition over the past decades. Given that microbial communities are drivers of essential ecosystem functions, we investigated the microbial community metabolisms and functions of oxygen-depleted Landsort Deep sediments by metatranscriptomics. Results show substantial expression of genes involved in protein metabolism, demonstrating that the Landsort Deep sediment microbial community is active. Expressed gene suites of metabolic pathways important for carbon transformation, including fermentation, dissimilatory sulphate reduction and methanogenesis, were identified. The presence of transcripts for these metabolic processes suggests a potential for heterotrophic-autotrophic community synergism and indicates active mineralisation of the organic matter deposited at the sediment as a consequence of the eutrophication process. Furthermore, cyanobacteria, probably deposited from the water column, are transcriptionally active in the anoxic sediment at this depth. Results also reveal a high abundance of transcripts encoding integron integrases. These results provide insight into the activity of the microbial community of the anoxic sediment at the deepest point of the Baltic Sea and its possible role in ecosystem functioning.
Aanen, Duur K.; Spelbrink, Johannes N.; Beekman, Madeleine
2014-01-01
The peculiar biology of mitochondrial DNA (mtDNA) potentially has detrimental consequences for organismal health and lifespan. Typically, eukaryotic cells contain multiple mitochondria, each with multiple mtDNA genomes. The high copy number of mtDNA implies that selection on mtDNA functionality is relaxed. Furthermore, because mtDNA replication is not strictly regulated, within-cell selection may favour mtDNA variants with a replication advantage, but a deleterious effect on cell fitness. The opportunities for selfish mtDNA mutations to spread are restricted by various organism-level adaptations, such as uniparental transmission, germline mtDNA bottlenecks, germline selection and, during somatic growth, regular alternation between fusion and fission of mitochondria. These mechanisms are all hypothesized to maintain functional mtDNA. However, the strength of selection for maintenance of functional mtDNA progressively declines with age, resulting in age-related diseases. Furthermore, organismal adaptations that most probably evolved to restrict the opportunities for selfish mtDNA create secondary problems. Owing to predominantly maternal mtDNA transmission, recombination among mtDNA from different individuals is highly restricted or absent, reducing the scope for repair. Moreover, maternal inheritance precludes selection against mtDNA variants with male-specific effects. We finish by discussing the consequences of life-history differences among taxa with respect to mtDNA evolution and make a case for the use of microorganisms to experimentally manipulate levels of selection. PMID:24864309
Lattice based Kinetic Monte Carlo Simulations of a complex chemical reaction network
NASA Astrophysics Data System (ADS)
Danielson, Thomas; Savara, Aditya; Hin, Celine
Lattice Kinetic Monte Carlo (KMC) simulations offer a powerful alternative to using ordinary differential equations for the simulation of complex chemical reaction networks. Lattice KMC provides the ability to account for local spatial configurations of species in the reaction network, resulting in a more detailed description of the reaction pathway. In KMC simulations with a large number of reactions, the range of transition probabilities can span many orders of magnitude, creating subsets of processes that occur more frequently or more rarely. Consequently, processes that have a high probability of occurring may be selected repeatedly without actually progressing the system (i.e. the forward and reverse process for the same reaction). In order to avoid the repeated occurrence of fast frivolous processes, it is necessary to throttle the transition probabilities in such a way that avoids altering the overall selectivity. Likewise, as the reaction progresses, new frequently occurring species and reactions may be introduced, making a dynamic throttling algorithm a necessity. We present a dynamic steady-state detection scheme with the goal of accurately throttling rate constants in order to optimize the KMC run time without compromising the selectivity of the reaction network. The algorithm has been applied to a large catalytic chemical reaction network, specifically that of methanol oxidative dehydrogenation, as well as additional pathways on CeO2(111) resulting in formaldehyde, CO, methanol, CO2, H2 and H2O as gas products.
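A standard rejection-free KMC selection step, plus a pair-preserving throttling helper of the kind motivated above. This is not the authors' dynamic steady-state detection algorithm, only a sketch of the underlying mechanics: scaling a quasi-equilibrated forward/reverse pair by one common factor leaves the pair's ratio, and hence the selectivity, unchanged, while freeing steps for rare productive processes. All rate values are invented.

```python
import math
import random

def kmc_step(rates, rng):
    """One rejection-free KMC step: select process i with probability
    rates[i] / sum(rates), and advance time by an exponential waiting time."""
    total = sum(rates.values())
    r = rng.random() * total
    acc = 0.0
    for name, rate in rates.items():
        acc += rate
        if r < acc:
            chosen = name
            break
    dt = -math.log(1.0 - rng.random()) / total
    return chosen, dt

def throttle(rates, pair, factor):
    """Scale a fast forward/reverse pair by one common factor, preserving
    their ratio (and hence the selectivity between them)."""
    scaled = dict(rates)
    for name in pair:
        scaled[name] *= factor
    return scaled

# A fast quasi-equilibrated pair dwarfs a slow productive step...
rates = {"A_fwd": 1.0e6, "A_rev": 1.0e6, "B": 1.0}
# ...so throttle the pair; "B" is now selected a meaningful fraction of steps.
scaled = throttle(rates, ("A_fwd", "A_rev"), 1.0e-5)
```

After throttling, the slow process "B" carries about 1 in 21 selections instead of roughly 1 in 2,000,001, while the A_fwd/A_rev balance is untouched; this is the sense in which throttling optimizes run time without compromising selectivity.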
Avian influenza virus (H5N1): a threat to human health.
Peiris, J S Malik; de Jong, Menno D; Guan, Yi
2007-04-01
Pandemic influenza virus has its origins in avian influenza viruses. The highly pathogenic avian influenza virus subtype H5N1 is already panzootic in poultry, with attendant economic consequences. It continues to cross species barriers to infect humans and other mammals, often with fatal outcomes. Therefore, H5N1 virus has rightly received attention as a potential pandemic threat. However, it is noted that the pandemics of 1957 and 1968 did not arise from highly pathogenic influenza viruses, and the next pandemic may well arise from a low-pathogenicity virus. The rationale for particular concern about an H5N1 pandemic is not its inevitability but its potential severity. An H5N1 pandemic is an event of low probability but one of high human health impact and poses a predicament for public health. Here, we review the ecology and evolution of highly pathogenic avian influenza H5N1 viruses, assess the pandemic risk, and address aspects of human H5N1 disease in relation to its epidemiology, clinical presentation, pathogenesis, diagnosis, and management.
Perry, Russell W.; Brandes, Patricia L.; Burau, Jon R.; Sandstrom, Philip T.; Skalski, John R.
2015-01-01
Juvenile Chinook Salmon Oncorhynchus tshawytscha emigrating from natal tributaries of the Sacramento River, California, must negotiate the Sacramento-San Joaquin River Delta (hereafter, the Delta), a complex network of natural and man-made channels linking the Sacramento River with San Francisco Bay. Fish that enter the interior and southern Delta—the region to the south of the Sacramento River where water pumping stations are located—survive at a lower rate than fish that use alternative migration routes. Consequently, total survival decreases as the fraction of the population entering the interior Delta increases, thus spurring management actions to reduce the proportion of fish that are entrained into the interior Delta. To better inform management actions, we modeled entrainment probability as a function of hydrodynamic variables. We fitted alternative entrainment models to telemetry data that identified when tagged fish in the Sacramento River entered two river channels leading to the interior Delta (Georgiana Slough and the gated Delta Cross Channel). We found that the probability of entrainment into the interior Delta through both channels depended strongly on the river flow and tidal stage at the time of fish arrival at the river junction. Fish that arrived during ebb tides had a low entrainment probability, whereas fish that arrived during flood tides (i.e., when the river's flow was reversed) had a high probability of entering the interior Delta. We coupled our entrainment model with a flow simulation model to evaluate the effect of nighttime closures of the Delta Cross Channel gates on the daily probability of fish entrainment into the interior Delta. Relative to 24-h gate closures, nighttime closures increased daily entrainment probability by 3 percentage points on average if fish arrived at the river junction uniformly throughout the day and by only 1.3 percentage points if 85% of fish arrived at night. 
We illustrate how our model can be used to evaluate the effects of alternative water management actions on fish entrainment into the interior Delta.
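The dependence of entrainment probability on river flow and tidal stage can be captured with a logistic model of the kind fitted to such telemetry data. The coefficients below are hypothetical, not the study's fitted values:

```python
import math

def entrainment_probability(flow_cms, flood_tide,
                            b0=-1.0, b_flow=-0.01, b_tide=2.5):
    """Hypothetical logistic model: probability that a fish arriving at the
    river junction is entrained into the interior Delta. Negative
    (reversed) flow and a flood tide (flood_tide = 1) raise the
    probability; coefficients are illustrative only."""
    x = b0 + b_flow * flow_cms + b_tide * flood_tide
    return 1.0 / (1.0 + math.exp(-x))

p_flood = entrainment_probability(-100.0, 1)  # reversed flow on a flood tide
p_ebb = entrainment_probability(300.0, 0)     # strong downstream flow, ebb tide
```

Averaging such a model over the distribution of fish arrival times is what allows daily entrainment under alternative gate-closure schedules to be compared, as in the 24-hour vs. nighttime closure scenarios above.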
Exact Time-Dependent Exchange-Correlation Potential in Electron Scattering Processes
NASA Astrophysics Data System (ADS)
Suzuki, Yasumitsu; Lacombe, Lionel; Watanabe, Kazuyuki; Maitra, Neepa T.
2017-12-01
We identify peak and valley structures in the exact exchange-correlation potential of time-dependent density functional theory that are crucial for time-resolved electron scattering in a model one-dimensional system. These structures are completely missed by adiabatic approximations that, consequently, significantly underestimate the scattering probability. A recently proposed nonadiabatic approximation is shown to correctly capture the approach of the electron to the target when the initial Kohn-Sham state is chosen judiciously, and it is more accurate than standard adiabatic functionals but ultimately fails to accurately capture reflection. These results may explain the underestimation of scattering probabilities in some recent studies on molecules and surfaces.
Quantum aspects of brain activity and the role of consciousness.
Beck, F; Eccles, J C
1992-01-01
The relationship of brain activity to conscious intentions is considered on the basis of the functional microstructure of the cerebral cortex. Each incoming nerve impulse causes the emission of transmitter molecules by the process of exocytosis. Since exocytosis is a quantal phenomenon of the presynaptic vesicular grid with a probability much less than 1, we present a quantum mechanical model for it based on a tunneling process of the trigger mechanism. Consciousness manifests itself in mental intentions. The consequent voluntary actions become effective by momentary increases of the probability of vesicular emission in the thousands of synapses on each pyramidal cell by quantal selection. PMID:1333607
The extent and consequences of p-hacking in science.
Head, Megan L; Holman, Luke; Lanfear, Rob; Kahn, Andrew T; Jennions, Michael D
2015-03-01
A focus on novel, confirmatory, and statistically significant results leads to substantial bias in the scientific literature. One type of bias, known as "p-hacking," occurs when researchers collect or select data or statistical analyses until nonsignificant results become significant. Here, we use text-mining to demonstrate that p-hacking is widespread throughout science. We then illustrate how one can test for p-hacking when performing a meta-analysis and show that, while p-hacking is probably common, its effect seems to be weak relative to the real effect sizes being measured. This result suggests that p-hacking probably does not drastically alter scientific consensuses drawn from meta-analyses.
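The text-mining test for p-hacking can be illustrated with a simple binomial comparison of p-value bins, a sketch of the evidential-value logic (the counts below are hypothetical, not the paper's data):

```python
# Under a true effect, significant p-values should pile up well below 0.05,
# so an excess in the bin just under 0.05 relative to the adjacent bin
# is suggestive of p-hacking.
from math import comb

def binom_sf(k, n, p=0.5):
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

just_under = 62     # hypothetical count of p-values in (0.045, 0.05)
further_under = 38  # hypothetical count in (0.04, 0.045)
n = just_under + further_under
# One-sided test: are p-values overrepresented just below the threshold?
p_value = binom_sf(just_under, n)
print(round(p_value, 4))
```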
Probability of success for phase III after exploratory biomarker analysis in phase II.
Götte, Heiko; Kirchner, Marietta; Sailer, Martin Oliver
2017-05-01
The probability of success, or average power, describes the potential of a future trial by weighting the power with a probability distribution of the treatment effect. The treatment effect estimate from a previous trial can be used to define such a distribution. During the development of targeted therapies, it is common practice to look for predictive biomarkers. The consequence is that the trial population for phase III is often selected on the basis of the most extreme result from phase II biomarker subgroup analyses. In such a case, there is a tendency to overestimate the treatment effect. We investigate whether the overestimation of the treatment effect estimate from phase II is transformed into a positive bias for the probability of success for phase III. We simulate a phase II/III development program for targeted therapies. This simulation allows us to investigate selection probabilities and to compare the estimated probability of success with the true one. We consider the estimated probability of success with and without subgroup selection. Depending on the true treatment effects, there is a negative bias without selection because of the weighting by the phase II distribution. In comparison, selection increases the estimated probability of success. Thus, selection does not lead to a bias in the probability of success if underestimation due to the phase II distribution and overestimation due to selection cancel each other out. We recommend performing similar simulations in practice to obtain the necessary information about the risks and chances associated with such subgroup selection designs. Copyright © 2017 John Wiley & Sons, Ltd.
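The weighting of power by a phase II effect distribution can be sketched with a small Monte Carlo computation of the probability of success (assurance); the effect estimate, standard error, and sample size below are hypothetical:

```python
import math
import random

def normal_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def power(theta, n_per_arm, sigma=1.0, alpha_z=1.959964):
    """Power of a one-sided two-sample z-test at true effect theta."""
    se = sigma * math.sqrt(2.0 / n_per_arm)
    return normal_cdf(theta / se - alpha_z)

random.seed(1)
theta_hat, se_phase2 = 0.3, 0.15   # hypothetical phase II estimate and its SE
n_phase3 = 200                     # hypothetical per-arm phase III sample size

# Probability of success: average the power over the phase II distribution.
draws = [random.gauss(theta_hat, se_phase2) for _ in range(100_000)]
pos = sum(power(t, n_phase3) for t in draws) / len(draws)

print(round(power(theta_hat, n_phase3), 3))  # power at the point estimate
print(round(pos, 3))                          # probability of success (lower)
```

Because the power curve is concave where power exceeds 0.5, weighting by the phase II distribution pulls the probability of success below the power at the point estimate, which is the "negative bias without selection" the abstract describes.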
2016-01-01
When parasites have different interests regarding how their host should behave, this can result in a conflict over host manipulation, i.e. parasite-induced changes in host behaviour that enhance parasite fitness. Such a conflict can result in the alteration, or even complete suppression, of one parasite's host manipulation. Many parasites, and probably also symbionts and commensals, have the ability to manipulate the behaviour of their host. Non-manipulating parasites should also have an interest in host behaviour. Given the frequency of multiple parasite infections in nature, potential conflicts of interest over host behaviour and manipulation may be common. This review summarizes the evidence on how parasites can alter other parasites' host manipulation. Host manipulation can have important ecological and medical consequences. I speculate on how a conflict over host manipulation could alter these consequences and potentially offer a new avenue of research to ameliorate the harmful consequences of host manipulation. PMID:27510821
Pfleger, C C H; Flachs, E M; Koch-Henriksen, Nils
2010-07-01
There is a need for follow-up studies of the familial situation of multiple sclerosis (MS) patients. The aim was to evaluate the probability that MS patients remain in a marriage or relationship with the same partner after onset of MS, in comparison with the general population. All 2538 Danes with onset of MS in 1980-1989, retrieved from the Danish MS-Registry, and 50,760 matched, randomly drawn control persons were included. Information on family status was retrieved from Statistics Denmark. Cox analyses were used with onset as the starting point. Five years after onset, the cumulative probability of remaining in the same relationship was 86% in patients vs. 89% in controls. The probabilities continued to diverge, and at 24 years the probability was 33% in patients vs. 53% in controls (p < 0.001). Among patients with young onset (< 36 years of age), those with no children had a higher risk of divorce than those with children younger than 7 years (hazard ratio 1.51; p < 0.0001), and men had a higher risk of divorce than women (hazard ratio 1.33; p < 0.01). MS significantly reduces the probability of remaining in the same relationship compared with the background population.
Assessment of accident severity in the construction industry using the Bayesian theorem.
Alizadeh, Seyed Shamseddin; Mortazavi, Seyed Bagher; Mehdi Sepehri, Mohammad
2015-01-01
Construction is a major source of employment in many countries. In construction, workers perform a great diversity of activities, each one with a specific associated risk. The aim of this paper is to identify workers who are at risk of accidents with severe consequences and classify these workers to determine appropriate control measures. We defined 48 groups of workers and used the Bayesian theorem to estimate posterior probabilities about the severity of accidents at the level of individuals in construction sector. First, the posterior probabilities of injuries based on four variables were provided. Then the probabilities of injury for 48 groups of workers were determined. With regard to marginal frequency of injury, slight injury (0.856), fatal injury (0.086) and severe injury (0.058) had the highest probability of occurrence. It was observed that workers with <1 year's work experience (0.168) had the highest probability of injury occurrence. The first group of workers, who were extensively exposed to risk of severe and fatal accidents, involved workers ≥ 50 years old, married, with 1-5 years' work experience, who had no past accident experience. The findings provide a direction for more effective safety strategies and occupational accident prevention and emergency programmes.
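The Bayes update underlying such posterior probabilities can be sketched as follows; the marginal severities come from the abstract, while the group likelihoods are hypothetical placeholders:

```python
# Bayes' theorem: P(severity | group) = P(group | severity) P(severity) / P(group).
prior = {"slight": 0.856, "fatal": 0.086, "severe": 0.058}  # marginal frequencies

# Hypothetical P(worker has <1 year experience | severity):
likelihood = {"slight": 0.15, "fatal": 0.25, "severe": 0.30}

evidence = sum(prior[s] * likelihood[s] for s in prior)  # P(group)
posterior = {s: prior[s] * likelihood[s] / evidence for s in prior}

for s, p in sorted(posterior.items(), key=lambda kv: -kv[1]):
    print(s, round(p, 3))
```

Note how the rarer but more likely-given-the-group severities gain posterior mass relative to their priors, which is exactly how the study flags worker groups at elevated risk of severe outcomes.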
Remarks on the Phase Transition in QCD
NASA Astrophysics Data System (ADS)
Wilczek, Frank
The significance of the question of the order of the phase transition in QCD, and recent evidence that real-world QCD is probably close to having a single second order transition as a function of temperature, is reviewed. Although this circumstance seems to remove the possibility that the QCD transition during the big bang might have had spectacular cosmological consequences, there is some good news: it allows highly non-trivial yet reliable quantitative predictions to be made for the behavior near the transition. These predictions can be tested in numerical simulations and perhaps even eventually in heavy ion collisions. The present paper is a very elementary discussion of the relevant concepts, meant to be an accessible introduction for those innocent of the renormalization group approach to critical phenomena and/or the details of QCD.
Regional cutaneous microvascular flow responses during gravitational and LBNP stresses
NASA Technical Reports Server (NTRS)
Breit, Gregory A.; Watenpaugh, Donald E.; Ballard, Richard E.; Murthy, Gita; Hargens, Alan R.
1993-01-01
Due to the regional variability of local hydrostatic pressures, microvascular flow responses to gravitational stress probably vary along the length of the body. Although these differences in local autoregulation have been observed previously during whole-body tilting, they have not been investigated during application of artificial gravitational stresses, such as lower body negative pressure or high gravity centrifugation. Although these stresses can create equivalent G-levels at the feet, they result in distinct distributions of vascular transmural pressure along the length of the body, and should consequently elicit different magnitudes and distributions of microvascular response. In the present study, the effects of whole-body tilting and lower body negative pressure on the level and distribution of microvascular flows within skin along the length of the body were compared.
Risks from Solar Particle Events for Long Duration Space Missions Outside Low Earth Orbit
NASA Technical Reports Server (NTRS)
Over, S.; Myers, J.; Ford, J.
2016-01-01
The Integrated Medical Model (IMM) simulates medical occurrences and mission outcomes for various mission profiles using probabilistic risk assessment techniques. As part of the work with the IMM, this project focuses on radiation risks from acute events during extended human missions outside low Earth orbit (LEO). Of primary importance in acute risk assessment are solar particle events (SPEs): low-probability, high-consequence events that could adversely affect mission outcomes through acute radiation damage to astronauts. SPEs can be further classified into coronal mass ejections (CMEs) and solar flares/impulsive events (Fig. 1). CMEs are eruptions of solar material with shock enhancements that make these events higher in total fluence than impulsive events.
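One common probabilistic-risk step, sketched here under the simple assumption that SPE arrivals follow a Poisson process, is the chance of at least one event during a mission; the rate and duration are illustrative, not IMM inputs:

```python
import math

# For a Poisson process with rate lambda, P(at least one event in time T)
# is 1 - exp(-lambda * T).
spe_rate_per_year = 5.0   # hypothetical mean SPE count per year
mission_years = 0.5       # hypothetical mission duration outside LEO

p_at_least_one = 1.0 - math.exp(-spe_rate_per_year * mission_years)
print(round(p_at_least_one, 3))
```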
Psychological distress and alcohol use among fire fighters.
Boxer, P A; Wild, D
1993-04-01
Few studies have investigated stressors to which fire fighters are subjected and the potential psychological consequences. One hundred and forty-five fire fighters were studied to enumerate potential occupational stressors, assess psychological distress and problems with alcohol use, and determine whether a relationship exists between these measures and self-reported stressors. Hearing that children are in a burning building was the highest ranked stressor. According to three self-report instruments, between 33 and 41% of the fire fighters were experiencing significant psychological distress, and 29% had possible or probable problems with alcohol use. These figures are significantly higher than would be expected in a typical community or working population. In a logistic regression analysis, no relationship was found between measures of psychological distress and alcohol use and the 10 most highly ranked work stressors.
Guzzi-Heeb, Sandro
2011-01-01
The eighteenth-century "sexual revolution" cannot simply be explained as a consequence of economic or institutional factors such as industrialization, agricultural revolution, secularization, or legal hindrances to marriage. The example of western Valais (Switzerland) shows that we have to deal with a complex configuration of factors. The micro-historical approach reveals that in the eighteenth and nineteenth centuries sexuality, and above all illicit sexuality, was a highly subversive force considerably linked to political innovation and, probably, more generally to historical change. Nonmarital sexuality was clearly tied to political dissent and to innovative ways of behavior, both among the social elites and the common people. These behavior patterns influenced crucial evolutions in the social, cultural, and economic history of the region.
The potential benefits of a new poliovirus vaccine for long-term poliovirus risk management.
Duintjer Tebbens, Radboud J; Thompson, Kimberly M
2016-12-01
To estimate the incremental net benefits (INBs) of a hypothetical ideal vaccine with all of the advantages, and none of the disadvantages, of existing oral and inactivated poliovirus vaccines compared with current vaccines available for future outbreak response. INB estimates are based on expected costs and polio cases from an existing global model of long-term poliovirus risk management. Excluding development costs, an ideal poliovirus vaccine could offer expected INBs of US$1.6 billion. The ideal vaccine yields small benefits in most realizations of long-term risks, but great benefits in low-probability, high-consequence realizations. New poliovirus vaccines may offer valuable insurance against long-term poliovirus risks, and new vaccine development efforts should continue as the world gathers more evidence about polio endgame risks.
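The pattern of small benefits in most realizations but large benefits in rare ones can be sketched as a probability-weighted expectation; all probabilities and dollar figures below are hypothetical, not the model's outputs:

```python
scenarios = [
    # (probability, INB in US$ billions)
    (0.95, 0.2),    # routine endgame: small benefit
    (0.04, 10.0),   # outbreak requiring response: large benefit
    (0.01, 100.0),  # rare catastrophic realization: very large benefit
]

expected_inb = sum(p * inb for p, inb in scenarios)
tail_share = sum(p * inb for p, inb in scenarios[1:]) / expected_inb

print(round(expected_inb, 2))  # expected INB, US$ billions
print(round(tail_share, 3))    # share contributed by low-probability realizations
```

Even though the high-consequence branches together carry only 5% probability, they dominate the expected value, which is the sense in which a new vaccine acts as "insurance."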
Temporal genetic change in the last remaining population of woolly mammoth
Nyström, Veronica; Dalén, Love; Vartanyan, Sergey; Lidén, Kerstin; Ryman, Nils; Angerbjörn, Anders
2010-01-01
During the Late Pleistocene, the woolly mammoth (Mammuthus primigenius) experienced a series of local extinctions generally attributed to human predation or environmental change. Some small and isolated populations did however survive far into the Holocene. Here, we investigated the genetic consequences of the isolation of the last remaining mammoth population on Wrangel Island. We analysed 741 bp of the mitochondrial DNA and found a loss of genetic variation in relation to the isolation event, probably caused by a demographic bottleneck or a founder event. However, in spite of ca 5000 years of isolation, we did not detect any further loss of genetic variation. Together with the relatively high number of mitochondrial haplotypes on Wrangel Island near the final disappearance, this suggests a sudden extinction of a rather stable population. PMID:20356891
Music-evoked incidental happiness modulates probability weighting during risky lottery choices
Schulreich, Stefan; Heussen, Yana G.; Gerhardt, Holger; Mohr, Peter N. C.; Binkofski, Ferdinand C.; Koelsch, Stefan; Heekeren, Hauke R.
2014-01-01
We often make decisions with uncertain consequences. The outcomes of the choices we make are usually not perfectly predictable but probabilistic, and the probabilities can be known or unknown. Probability judgments, i.e., the assessment of unknown probabilities, can be influenced by evoked emotional states. This suggests that the weighting of known probabilities in decision making under risk might also be influenced by incidental emotions, i.e., emotions unrelated to the judgments and decisions at issue. Probability weighting describes the transformation of probabilities into subjective decision weights for outcomes and is one of the central components of cumulative prospect theory (CPT) that determine risk attitudes. We hypothesized that music-evoked emotions would modulate risk attitudes in the gain domain, and in particular probability weighting. Our experiment featured a within-subject design consisting of four conditions in separate sessions. In each condition, the 41 participants listened to a different kind of music (happy, sad, or no music, or sequences of random tones) and performed a repeated pairwise lottery choice task. We found that participants chose the riskier lotteries significantly more often in the "happy" than in the "sad" and "random tones" conditions. Via structural regressions based on CPT, we found that the observed changes in participants' choices can be attributed to changes in the elevation parameter of the probability weighting function: in the "happy" condition, participants showed significantly higher decision weights associated with the larger payoffs than in the "sad" and "random tones" conditions. Moreover, elevation correlated positively with self-reported music-evoked happiness. Thus, our experimental results provide evidence in favor of a causal effect of incidental happiness on risk attitudes that can be explained by changes in probability weighting. PMID:24432007
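The elevation parameter's role can be sketched with a linear-in-log-odds probability weighting function, a common CPT parameterization (an assumption; the study's exact functional form may differ):

```python
def weight(p, gamma=0.6, delta=1.0):
    """w(p) = delta * p**gamma / (delta * p**gamma + (1-p)**gamma).

    gamma controls curvature; delta controls elevation, i.e. the overall
    height of the curve (the parameter the study links to happiness).
    """
    if p in (0.0, 1.0):
        return p
    num = delta * p**gamma
    return num / (num + (1.0 - p)**gamma)

p = 0.3  # probability of the larger payoff in a hypothetical lottery
sad, happy = weight(p, delta=0.8), weight(p, delta=1.2)
print(round(sad, 3), round(happy, 3))  # higher elevation -> larger decision weight
```

A higher elevation inflates the decision weight attached to winning, making the riskier lottery look more attractive, which matches the reported shift toward riskier choices in the "happy" condition.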
Consequences of evolution: is rhinosinusitis, like otitis media, a unique disease of humans?
Bluestone, Charles D; Pagano, Anthony S; Swarts, J Douglas; Laitman, Jeffrey T
2012-12-01
We hypothesize that if otitis media is primarily a human disease arising from consequences of evolution, then rhinosinusitis may also be limited to humans for similar reasons. If otitis media, with its associated hearing loss, occurred in animals in the wild, they probably would have been culled out by predation. Similarly, if rhinosinusitis occurred regularly in animals, they likely would have suffered severely decreased olfactory abilities, crucial for predator avoidance, and presumably would likewise have been selected against evolutionarily. Thus, both otitis media and rhinosinusitis, common conditions particularly in infants and young children, appear to be essentially human conditions. Their manifestation in our species is likely due to our unique evolutionary trajectory and may be a consequence of adaptations, including adaptations to bipedalism and speech, loss of prognathism, and immunologic and environmental factors.
Shapley, Robert M.; Xing, Dajun
2012-01-01
Theoretical considerations have led to the concept that the cerebral cortex is operating in a balanced state in which synaptic excitation is approximately balanced by synaptic inhibition from the local cortical circuit. This paper is about the functional consequences of the balanced state in sensory cortex. One consequence is gain control: there is experimental evidence and theoretical support for the idea that local circuit inhibition acts as a local automatic gain control throughout the cortex. Second, inhibition increases cortical feature selectivity: many studies of different sensory cortical areas have reported that suppressive mechanisms contribute to feature selectivity. Synaptic inhibition from the local microcircuit should be untuned (or broadly tuned) for stimulus features because of the microarchitecture of the cortical microcircuit. Untuned inhibition probably is the source of Untuned Suppression that enhances feature selectivity. Guided by a neuronal network model, we studied the function of inhibition in our experiments on orientation selectivity in the primary visual cortex (V1) of the macaque monkey. Our results revealed that Untuned Suppression, generated by local circuit inhibition, is crucial for the generation of highly orientation-selective cells in V1 cortex. PMID:23036513
Fitness Consequences of Boldness in Juvenile and Adult Largemouth Bass.
Ballew, Nicholas G; Mittelbach, Gary G; Scribner, Kim T
2017-04-01
To date, most studies investigating the relationship between personality traits and fitness have focused on a single measure of fitness (such as survival) at a specific life stage. However, many personality traits likely have multiple effects on fitness, potentially operating across different functional contexts and stages of development. Here, we address the fitness consequences of boldness, under seminatural conditions, across life stages and functional contexts in largemouth bass (Micropterus salmoides). Specifically, we report the effect of boldness on (1) juvenile survivorship in an outdoor pond containing natural prey and predators and (2) adult reproductive success in three outdoor ponds across three reproductive seasons (years). Juvenile survival was negatively affected by boldness, with bolder juveniles having a lower probability of survival than shyer juveniles. In contrast, bolder adult male bass had greater reproductive success than their shyer male counterparts. Female reproductive success was not affected by boldness. These findings demonstrate that boldness can affect fitness differently across life stages. Further, boldness was highly consistent across years and significantly heritable, which suggests that boldness has a genetic component. Thus, our results support theory suggesting that fitness trade-offs across life stages may contribute to the maintenance of personality variation within populations.
Budria, Alexandre; Candolin, Ulrika
2015-04-01
Anthropogenic activities are having profound impacts on species interactions, with further consequences for populations and communities. We investigated the influence that anthropogenic eutrophication has on the prevalence of the parasitic tapeworm Schistocephalus solidus in threespine stickleback Gasterosteus aculeatus populations. We caught stickleback from four areas along the coast of Finland, and within each area from one undisturbed and one eutrophied habitat. We found the prevalence of the parasite to be lower in the eutrophied habitats at the start of the breeding season, probably because of fewer piscivorous birds that transmit the parasite. However, while the prevalence of the parasite declined across the season in the undisturbed habitats, it did so to a lesser extent in the eutrophied habitats. We discuss different processes that could underlie these differences, such as a lower predation rate on infected fish, higher food availability, and less dispersal in eutrophied habitats. We found no effect of eutrophication on the proportion of infected stickleback that entered reproductive condition. Together with earlier findings, this suggests that eutrophication increases the proportion of infected stickleback that reproduce. This could promote the evolution of less parasite-resistant populations, with potential consequences for the viability of the interacting parties of the host-parasite system.
High Impact = High Statistical Standards? Not Necessarily So
Tressoldi, Patrizio E.; Giofré, David; Sella, Francesco; Cumming, Geoff
2013-01-01
What are the statistical practices of articles published in journals with a high impact factor? Are there differences compared with articles published in journals with a somewhat lower impact factor that have adopted editorial policies to reduce the impact of limitations of Null Hypothesis Significance Testing? To investigate these questions, the current study analyzed all articles related to psychological, neuropsychological and medical issues, published in 2011 in four journals with high impact factors: Science, Nature, The New England Journal of Medicine and The Lancet, and three journals with relatively lower impact factors: Neuropsychology, Journal of Experimental Psychology-Applied and the American Journal of Public Health. Results show that Null Hypothesis Significance Testing without any use of confidence intervals, effect sizes, prospective power, or model estimation is the prevalent statistical practice in articles published in Nature (89% of articles), followed by Science (42%). By contrast, in all other journals, with both high and lower impact factors, most articles report confidence intervals and/or effect size measures. We interpret these differences as consequences of the editorial policies adopted by the journal editors, which are probably the most effective means of improving statistical practice in journals with high or low impact factors. PMID:23418533
Potential landscape and flux field theory for turbulence and nonequilibrium fluid systems
NASA Astrophysics Data System (ADS)
Wu, Wei; Zhang, Feng; Wang, Jin
2018-02-01
Turbulence is a paradigm for far-from-equilibrium systems without time reversal symmetry. To capture the nonequilibrium irreversible nature of turbulence and investigate its implications, we develop a potential landscape and flux field theory for turbulent flow and more general nonequilibrium fluid systems governed by stochastic Navier-Stokes equations. We find that equilibrium fluid systems with time reversibility are characterized by a constraint that quantifies the detailed balance condition. In nonequilibrium fluid systems with nonequilibrium steady states, detailed balance breaking leads directly to two interconnected consequences, the non-Gaussian potential landscape and the irreversible probability flux; together with detailed balance breaking itself, these form a 'nonequilibrium trinity'. The nonequilibrium trinity characterizes the nonequilibrium irreversible essence of fluid systems with intrinsic time irreversibility and is manifested in various aspects of these systems. The nonequilibrium stochastic dynamics of fluid systems including turbulence with detailed balance breaking is shown to be driven by both the non-Gaussian potential landscape gradient and the irreversible probability flux, together with the reversible convective force and the stochastic stirring force. We reveal an underlying connection of the energy flux essential for turbulence energy cascade to the irreversible probability flux and the non-Gaussian potential landscape generated by detailed balance breaking. Using the energy flux as a center of connection, we demonstrate that the four-fifths law in fully developed turbulence is a consequence and reflection of the nonequilibrium trinity. We also show how the nonequilibrium trinity can affect the scaling laws in turbulence.
Pathogenesis and Consequences of Uniparental Disomy in Cancer
Makishima, Hideki; Maciejewski, Jaroslaw P.
2012-01-01
Systematic application of new genome-wide single nucleotide polymorphism arrays has demonstrated that somatically acquired regions of loss of heterozygosity (LOH) without changes in copy number frequently occur in many types of cancer. Until recently, the ubiquity of this type of chromosomal defect had remained unrecognized as it cannot be detected using routine cytogenetic technologies. Random and recurrent patterns of copy-neutral LOH, also referred to as uniparental disomy (UPD), can be found in specific cancer types and probably contribute to clonal outgrowth owing to various mechanisms. In this review we explore the types, topography, genesis, pathophysiological consequences and clinical implications of UPD. PMID:21518781
Advance directives and medical treatment at the end of life.
Kessler, Daniel P; McClellan, Mark B
2004-01-01
To assess the consequences of advance medical directives--which explicitly specify a patient's preferences for one or more specific types of medical treatment in the event of a loss of competence--we analyze the medical care of elderly Medicare beneficiaries who died between 1985 and 1995. We compare the care of patients from states that adopted laws enhancing incentives for compliance with advance directives and laws requiring the appointment of a health care surrogate in the absence of an advance directive to the care of patients from states that did not. We report three key findings. First, laws enhancing incentives for compliance significantly reduce the probability of dying in an acute care hospital. Second, laws requiring the appointment of a surrogate significantly increase the probability of receiving acute care in the last month of life, but decrease the probability of receiving nonacute care. Third, neither type of law leads to any savings in medical expenditures.
On the predictability of outliers in ensemble forecasts
NASA Astrophysics Data System (ADS)
Siegert, S.; Bröcker, J.; Kantz, H.
2012-03-01
In numerical weather prediction, ensembles are used to retrieve probabilistic forecasts of future weather conditions. We consider events where the verification is smaller than the smallest, or larger than the largest ensemble member of a scalar ensemble forecast. These events are called outliers. In a statistically consistent K-member ensemble, outliers should occur with a base rate of 2/(K+1). In operational ensembles this base rate tends to be higher. We study the predictability of outlier events in terms of the Brier Skill Score and find that forecast probabilities can be calculated which are more skillful than the unconditional base rate. This is shown analytically for statistically consistent ensembles. Using logistic regression, forecast probabilities for outlier events in an operational ensemble are calculated. These probabilities exhibit positive skill which is quantitatively similar to the analytical results. Possible causes of these results as well as their consequences for ensemble interpretation are discussed.
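The 2/(K+1) base rate follows from exchangeability: in a statistically consistent ensemble, the verification is equally likely to occupy any of the K+1 rank positions, two of which lie outside the ensemble range. A minimal Monte Carlo sketch (illustrative Gaussian ensembles, not an operational forecast system):

```python
import random

def outlier_rate(K, trials=100_000, seed=42):
    """Fraction of trials in which the verification falls outside the range
    of a statistically consistent K-member ensemble (members and verification
    drawn from the same distribution)."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        members = [rng.gauss(0.0, 1.0) for _ in range(K)]
        verif = rng.gauss(0.0, 1.0)
        if verif < min(members) or verif > max(members):
            hits += 1
    return hits / trials

# By exchangeability the verification is equally likely to take any of the
# K+1 rank positions, two of which lie outside the ensemble:
K = 9
print(outlier_rate(K), 2 / (K + 1))  # both near 0.2
```

An operational base rate above this value indicates an under-dispersive, statistically inconsistent ensemble.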
The Ticking of the Social Clock: Adults' Beliefs about the Timing of Transition Events.
ERIC Educational Resources Information Center
Peterson, Candida C.
1996-01-01
Two studies examined beliefs about descriptive and prescriptive age norms for adults in developmental transitions in a sample of 214 Australian university students ages 17 to 50. Discusses research methodology. The probable consequences for self-esteem, mental health, and life planning are discussed in the context of the research…
DOE Office of Scientific and Technical Information (OSTI.GOV)
PIEPHO, M.G.
Four bounding accidents postulated for the K West Basin integrated water treatment system are evaluated against applicable risk evaluation guidelines. The accidents are a spray leak during fuel retrieval, a spray leak during backflushing, a hydrogen explosion, and a fire breaching the filter vessel and enclosure. Event trees and accident probabilities are estimated. In all cases, the unmitigated dose consequences are below the risk evaluation guidelines.
Moodle: A Way for Blending VLE and Face-to-Face Instruction in the ELT Context?
ERIC Educational Resources Information Center
Ilin, Gulden
2013-01-01
This classroom research explores the probable consequences of a blended Teaching English to Young Learners (TEYLs) course comprising Moodle applications and face-to-face instruction in the English Language Teaching (ELT) context. Contrary to the previous face-to-face-only procedure, the course was divided into two segments: traditional classroom…
Federal Register 2010, 2011, 2012, 2013, 2014
2013-12-24
... amendment would not (1) involve a significant increase in the probability or consequences of an accident previously evaluated; or (2) create the possibility of a new or different kind of accident from any accident... alleged facts or expert opinion which support the contention and on which the requestor/ petitioner...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-11
... amendment would not (1) involve a significant increase in the probability or consequences of an accident previously evaluated; or (2) create the possibility of a new or different kind of accident from any accident... contention and a concise statement of the alleged facts or expert opinion which support the contention and on...
Catholic Schools and the Common Good
ERIC Educational Resources Information Center
DeFiore, Leonard
2006-01-01
There are at least two ways to think about the Common Good. One is the intentional, direct attempt to provide benefits to those beyond oneself and those connected to oneself. The second consists of those unintended consequences of the pursuit of some other benefit, usually private but not necessarily so. Probably the best-known example is that of…
Impact of Bilingualism on Infants' Ability to Learn from Talking and Nontalking Faces
ERIC Educational Resources Information Center
Fort, Mathilde; Ayneto-Gimeno, Alba; Escrichs, Anira; Sebastian-Galles, Nuria
2018-01-01
Probably to overcome the challenge of learning two languages at the same time, infants raised in a bilingual environment pay more attention to the mouth of talking faces than same-age monolinguals. Here we examined the consequences of this preference for monolingual and bilingual infants' ability to perceive nonspeech information coming from the…
A mathematical model was used to link decadal changes in the Mississippi River nutrient flux to coastal eutrophication near the Mississippi River Delta. Model simulations suggest that bottom water hypoxia intensified about 30 years ago, as a probable consequence of increased n...
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-23
... adversely affect plant safety, and would have no adverse effect on the probability of any accident. For the accidents that involve damage or melting of the fuel in the reactor core, fuel rod integrity has been shown to be unaffected by extended burnup under consideration; therefore, the consequences of an accident...
Effects of Responding to a Name and Group Call on Preschoolers' Compliance
ERIC Educational Resources Information Center
Beaulieu, Lauren; Hanley, Gregory P.; Roberson, Aleasha A.
2012-01-01
We assessed teacher-child relations with respect to children's name calls, instructions, and compliance in a preschool classroom. The most frequent consequence to a child's name being called was the provision of instructions. We also observed a higher probability of compliance when children attended to a name call. Next, we evaluated the effects…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hua Chiaho, E-mail: Chia-Ho.Hua@stjude.org; Wu Shengjie; Chemaitilly, Wassim
Purpose: To develop a mathematical model utilizing more readily available measures than stimulation tests that identifies brain tumor survivors with a high likelihood of abnormal growth hormone secretion after radiotherapy (RT), to avoid late recognition and a consequent delay in growth hormone replacement therapy. Methods and Materials: We analyzed 191 prospectively collected post-RT evaluations of peak growth hormone level (arginine tolerance/levodopa stimulation test), serum insulin-like growth factor 1 (IGF-1), IGF-binding protein 3, height, weight, growth velocity, and body mass index in 106 children and adolescents treated for ependymoma (n = 72), low-grade glioma (n = 28), or craniopharyngioma (n = 6), who had normal growth hormone levels before RT. A normal level in this study was defined as a peak growth hormone response to the stimulation test of ≥7 ng/mL. Results: Independent predictor variables identified by multivariate logistic regression with high statistical significance (p < 0.0001) included the IGF-1 z score, weight z score, and hypothalamic dose. The developed predictive model demonstrated strong discriminatory power, with an area under the receiver operating characteristic curve of 0.883. At a potential cutoff probability of 0.3, the sensitivity was 80% and the specificity 78%. Conclusions: Without unpleasant and expensive frequent stimulation tests, our model provides a quantitative approach to closely follow the growth hormone secretory capacity of brain tumor survivors. It allows identification of high-risk children for subsequent confirmatory tests and in-depth workup for diagnosis of growth hormone deficiency.
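The reported operating characteristics (area under the ROC curve of 0.883; 80% sensitivity and 78% specificity at a probability cutoff of 0.3) follow from standard definitions, sketched here with illustrative scores and labels rather than the study data:

```python
def sens_spec(scores, labels, cutoff):
    """Sensitivity and specificity of the rule 'predict deficiency when the
    predicted probability is at least `cutoff`' (labels: 1 = deficient)."""
    tp = sum(1 for s, y in zip(scores, labels) if y == 1 and s >= cutoff)
    fn = sum(1 for s, y in zip(scores, labels) if y == 1 and s < cutoff)
    tn = sum(1 for s, y in zip(scores, labels) if y == 0 and s < cutoff)
    fp = sum(1 for s, y in zip(scores, labels) if y == 0 and s >= cutoff)
    return tp / (tp + fn), tn / (tn + fp)

def auc(scores, labels):
    """Area under the ROC curve: probability that a randomly chosen positive
    case scores higher than a randomly chosen negative one (ties count 1/2)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Hypothetical predicted probabilities and outcomes, not the study data:
scores = [0.1, 0.4, 0.35, 0.8]
labels = [0, 0, 1, 1]
print(sens_spec(scores, labels, 0.3), auc(scores, labels))
```

Sweeping `cutoff` trades sensitivity against specificity; the study's 0.3 is one point on that curve.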
Oral health in children investigated by Social services on suspicion of child abuse and neglect.
Kvist, T; Annerbäck, E-M; Dahllöf, G
2018-02-01
Child abuse and neglect (CAN) are likely to have negative consequences on health; however, for oral health, studies on associated outcomes are sparse. The purpose of this study was to assess oral health and oral health behaviors in relation to suspected CAN among children being investigated by the Swedish Social Services. The material comprised data from the Social Services and dental records; the sample, 86 children and 172 matched controls. The children in the study group had a higher prevalence of dental caries than the control group; in addition, levels of non-attendance and dental avoidance were high, as was parental failure to promote good oral health. We found four factors that, taken together, indicated a high probability of being investigated because of suspected CAN: prevalence of dental caries in primary teeth, fillings in permanent teeth, dental health service avoidance, and referral to specialist pediatric dentistry clinics. If all four factors were present, the cumulative probability of being investigated was 0.918. In conclusion, there is a high prevalence of dental caries, irregular attendance, and a need for referral to a pediatric dental clinic among Swedish children under investigation due to suspected CAN. Social context is an important factor in assessing the risk of developing dental caries, the inclination to follow treatment plans, and the prerequisites for cooperation during treatment. Routinely requesting dental records during an investigation would provide important information for social workers on parental skills and abilities to fulfill the basic needs of children. Copyright © 2017 Elsevier Ltd. All rights reserved.
High probability neurotransmitter release sites represent an energy efficient design
Lu, Zhongmin; Chouhan, Amit K.; Borycz, Jolanta A.; Lu, Zhiyuan; Rossano, Adam J; Brain, Keith L.; Zhou, You; Meinertzhagen, Ian A.; Macleod, Gregory T.
2016-01-01
Nerve terminals contain multiple sites specialized for the release of neurotransmitters. Release usually occurs with low probability, a design thought to confer many advantages. High probability release sites are not uncommon but their advantages are not well understood. Here we test the hypothesis that high probability release sites represent an energy efficient design. We examined release site probabilities and energy efficiency at the terminals of two glutamatergic motor neurons synapsing on the same muscle fiber in Drosophila larvae. Through electrophysiological and ultrastructural measurements we calculated release site probabilities to differ considerably between terminals (0.33 vs. 0.11). We estimated the energy required to release and recycle glutamate from the same measurements. The energy required to remove calcium and sodium ions subsequent to nerve excitation was estimated through microfluorimetric and morphological measurements. We calculated energy efficiency as the number of glutamate molecules released per ATP molecule hydrolyzed, and high probability release site terminals were found to be more efficient (0.13 vs. 0.06). Our analytical model indicates that energy efficiency is optimal (~0.15) at high release site probabilities (~0.76). As limitations in energy supply constrain neural function, high probability release sites might ameliorate such constraints by demanding less energy. Energy efficiency can be viewed as one aspect of nerve terminal function, in balance with others, because high efficiency terminals depress significantly during episodic bursts of activity. PMID:27593375
Medina, L S; Crone, K; Kuntz, K M
2001-12-01
To assess the clinical and economic consequences of different diagnostic strategies in newborns with suspected occult spinal dysraphism. A decision-analytic model was constructed to project the cost and health outcomes of magnetic resonance imaging (MRI), ultrasound (US), plain radiographs, and no imaging in newborns with suspected occult spinal dysraphism. Morbidity and mortality rates of early versus late diagnosis of dysraphism and the sensitivity and specificity of MRI, US, and plain radiographs were obtained from the literature. Cost estimates were obtained from a hospital cost accounting database and from the Medicaid fee schedule. We found that the choice of imaging strategy depends on the underlying risk of occult spinal dysraphism. In low-risk children with intergluteal dimple or newborns of diabetic mothers (pretest probability: 0.3%-0.34%), US was the most effective strategy with an incremental cost-effectiveness ratio of $55 100 per quality-adjusted life year gained. For children with lumbosacral dimples, who have a higher pretest probability of 3.8%, US was less costly and more effective than the other 3 strategies considered. In intermediate-risk newborns with low anorectal malformation (pretest probability: 27%), US was more effective and less costly than radiographs and no imaging. However, MRI was more effective than US at an incremental cost-effectiveness of $1000 per quality-adjusted life year gained. In the high-risk group that included high anorectal malformation, cloacal malformation, and exstrophy (pretest probability: 44%-46%), MRI was actually cost-saving when compared with the other diagnostic strategies. For the intermediate-risk group, we found our analysis to be sensitive to the costs and diagnostic performances (sensitivity and specificity) of MRI and US. 
Lower MRI cost or greater MRI diagnostic performance improved the cost-effectiveness of the MRI strategy, whereas lower US cost or greater US diagnostic performance worsened the cost-effectiveness of the MRI strategy. Therefore, individual or institutional expertise with a specific diagnostic modality (MRI versus US) may influence the optimal diagnostic strategy. In newborns with suspected occult dysraphism, appropriate selection of patients and diagnostic strategy may increase quality-adjusted life expectancy and decrease cost of medical work-up.
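The strategy comparisons above reduce to two standard decision-analytic quantities: simple dominance (a strategy that is both costlier and less effective than an alternative) and the incremental cost-effectiveness ratio otherwise. A minimal sketch with hypothetical cost and QALY values:

```python
def icer(c1, e1, c0, e0):
    """Incremental cost-effectiveness ratio of strategy 1 over strategy 0:
    extra cost per extra quality-adjusted life year (QALY) gained."""
    return (c1 - c0) / (e1 - e0)

def dominated(strategies):
    """Names of strategies that are both costlier and less effective than
    some alternative; these can be dropped before computing ICERs."""
    return {a for a, (ca, ea) in strategies.items()
            for b, (cb, eb) in strategies.items()
            if cb < ca and eb > ea}

# Hypothetical (cost in $, effectiveness in QALYs) per imaging strategy:
strategies = {"US": (100.0, 1.020), "radiographs": (150.0, 1.000),
              "MRI": (651.0, 1.030)}
print(dominated(strategies))             # {'radiographs'}
print(icer(651.0, 1.030, 100.0, 1.020))  # $ per QALY gained, MRI vs. US
```

A negative ICER with higher effectiveness corresponds to the "cost-saving" case reported for MRI in the high-risk group.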
Interpolating Non-Parametric Distributions of Hourly Rainfall Intensities Using Random Mixing
NASA Astrophysics Data System (ADS)
Mosthaf, Tobias; Bárdossy, András; Hörning, Sebastian
2015-04-01
The correct spatial interpolation of hourly rainfall intensity distributions is of great importance for stochastic rainfall models. Poorly interpolated distributions may lead to over- or underestimation of rainfall and consequently to erroneous estimates in subsequent applications, such as hydrological or hydraulic models. By analyzing the spatial relation of empirical rainfall distribution functions, a persistent order of the quantile values over a wide range of non-exceedance probabilities is observed. As the order remains similar, the interpolation weights of quantile values for one certain non-exceedance probability can be applied to the other probabilities. This assumption enables the use of kernel smoothed distribution functions for interpolation purposes. Comparing the order of hourly quantile values over different gauges with the order of their daily quantile values for equal probabilities results in high correlations. The hourly quantile values also show high correlations with elevation. The incorporation of these two covariates into the interpolation is therefore tested. As only positive interpolation weights for the quantile values assure a monotonically increasing distribution function, the use of geostatistical methods like kriging is problematic; in particular, kriging with external drift cannot be employed to incorporate secondary information. Nonetheless, it would be fruitful to make use of covariates. To overcome this shortcoming, a new random mixing approach for spatial random fields is applied. Within the mixing process, hourly quantile values are considered as equality constraints and correlations with elevation values are included as relationship constraints. To profit from the dependence on daily quantile values, distribution functions of daily gauges are used to set up lower-equal and greater-equal constraints at their locations. In this way the denser daily gauge network can be included in the interpolation of the hourly distribution functions.
The applicability of this new interpolation procedure will be shown for around 250 hourly rainfall gauges in the German federal state of Baden-Württemberg. The performance of the random mixing technique within the interpolation is compared to applicable kriging methods. Additionally, the interpolation of kernel smoothed distribution functions is compared with the interpolation of fitted parametric distributions.
Possible impacts of climate change on wetlands and its biota in the Brazilian Amazon.
Barros, D F; Albernaz, A L M
2014-11-01
Wetlands cover approximately 6% of the Earth's surface. They are frequently found at the interface between terrestrial and aquatic ecosystems and are strongly dependent on the water cycle. For this reason, wetlands are extremely vulnerable to the effects of climate change. Mangroves and floodplain ecosystems are some of the most important environments for the Amazonian population, as a source of proteins and income, and are thus the types of wetlands chosen for this review. Some of the main consequences that can be predicted from climate change for wetlands are modifications in hydrological regimes, which can cause intense droughts or inundations. A possible reduction in rainfall can cause a decrease of the areas of mangroves and floodplains, with a consequent decline in their species numbers. Conversely, an increase in rainfall would probably cause the substitution of plant species, which would not be able to survive under new conditions for a long period. An elevation in water temperature on the floodplains would cause an increase in frequency and duration of hypoxic or anoxic episodes, which might further lead to a reduction in growth rates or the reproductive success of many species. In mangroves, an increase in water temperature would influence the sea level, causing losses of these environments through coastal erosion processes. Therefore, climate change will likely cause the loss of, or reduction in, Amazonian wetlands and will challenge the adaptability of species, composition and distribution, which will probably have consequences for the human population that depend on them.
High But Not Low Probability of Gain Elicits a Positive Feeling Leading to the Framing Effect.
Gosling, Corentin J; Moutier, Sylvain
2017-01-01
Human risky decision-making is known to be highly susceptible to profit-motivated responses elicited by the way in which options are framed. In fact, studies investigating the framing effect have shown that the choice between sure and risky options depends on how these options are presented. Interestingly, the probability of gain of the risky option has been highlighted as one of the main factors causing variations in susceptibility to the framing effect. However, while it has been shown that high probabilities of gain of the risky option systematically lead to framing bias, questions remain about the influence of low probabilities of gain. Therefore, the first aim of this paper was to clarify the respective roles of high and low probabilities of gain in the framing effect. Due to the difference between studies using a within- or between-subjects design, we conducted a first study investigating the respective roles of these designs. For both designs, we showed that trials with a high probability of gain led to the framing effect whereas those with a low probability did not. Second, as emotions are known to play a key role in the framing effect, we sought to determine whether they are responsible for such a debiasing effect of the low probability of gain. Our second study thus investigated the relationship between emotion and the framing effect depending on high and low probabilities. Our results revealed that positive emotion was related to risk-seeking in the loss frame, but only for trials with a high probability of gain. Taken together, these results support the interpretation that low probabilities of gain suppress the framing effect because they prevent the positive emotion of gain anticipation. PMID:28232808
The probability of monophyly of a sample of gene lineages on a species tree
Mehta, Rohan S.; Bryant, David; Rosenberg, Noah A.
2016-01-01
Monophyletic groups—groups that consist of all of the descendants of a most recent common ancestor—arise naturally as a consequence of descent processes that result in meaningful distinctions between organisms. Aspects of monophyly are therefore central to fields that examine and use genealogical descent. In particular, studies in conservation genetics, phylogeography, population genetics, species delimitation, and systematics can all make use of mathematical predictions under evolutionary models about features of monophyly. One important calculation, the probability that a set of gene lineages is monophyletic under a two-species neutral coalescent model, has been used in many studies. Here, we extend this calculation for a species tree model that contains arbitrarily many species. We study the effects of species tree topology and branch lengths on the monophyly probability. These analyses reveal new behavior, including the maintenance of nontrivial monophyly probabilities for gene lineage samples that span multiple species and even for lineages that do not derive from a monophyletic species group. We illustrate the mathematical results using an example application to data from maize and teosinte. PMID:27432988
Fault tree analysis for urban flooding.
ten Veldhuis, J A E; Clemens, F H L R; van Gelder, P H A J M
2009-01-01
Traditional methods to evaluate flood risk generally focus on heavy storm events as the principal cause of flooding. Conversely, fault tree analysis is a technique that aims at modelling all potential causes of flooding. It quantifies both overall flood probability and relative contributions of individual causes of flooding. This paper presents a fault tree model for urban flooding and an application to the case of Haarlem, a city of 147,000 inhabitants. Data from a complaint register, rainfall gauges and hydrodynamic model calculations are used to quantify probabilities of basic events in the fault tree. This results in a flood probability of 0.78/week for Haarlem. It is shown that gully pot blockages contribute to 79% of flood incidents, whereas storm events contribute only 5%. This implies that for this case more efficient gully pot cleaning is a more effective strategy to reduce flood probability than enlarging drainage system capacity. Whether this is also the most cost-effective strategy can only be decided after risk assessment has been complemented with a quantification of consequences of both types of events. To do this will be the next step in this study.
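Assuming independent basic events, the fault tree combination reduces to the standard OR/AND gate formulas, and when causes rarely overlap, each cause's share of incidents is roughly its probability over the sum. A minimal sketch (the probabilities below are illustrative, not the Haarlem estimates):

```python
def or_gate(ps):
    """P(top event) when any one of several independent basic events
    suffices: 1 - prod(1 - p_i)."""
    q = 1.0
    for p in ps:
        q *= 1.0 - p
    return 1.0 - q

def and_gate(ps):
    """P(top event) when all independent basic events must occur together."""
    r = 1.0
    for p in ps:
        r *= p
    return r

def contributions(causes):
    """Approximate share of incidents attributable to each independent cause
    (valid when probabilities are small or causes rarely overlap)."""
    total = sum(causes.values())
    return {name: p / total for name, p in causes.items()}

# Illustrative weekly basic-event probabilities (not the Haarlem estimates):
weekly = {"gully pot blockage": 0.60, "sewer overload by storm": 0.04}
print(or_gate(weekly.values()))  # combined weekly flood probability
print(contributions(weekly))     # relative contribution of each cause
```

This mirrors the paper's apportionment of flood probability between blockages and storm events.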
Significance of stress transfer in time-dependent earthquake probability calculations
Parsons, T.
2005-01-01
A sudden change in stress is seen to modify earthquake rates, but should it also revise earthquake probability? The data used to derive input parameters permit an array of forecasts; so how large a static stress change is required to cause a statistically significant earthquake probability change? To answer that question, the effects of parameter and philosophical choices are examined through all phases of sample calculations. Drawing at random from distributions of recurrence-aperiodicity pairs identifies many that recreate long paleoseismic and historic earthquake catalogs. Probability density functions built from the recurrence-aperiodicity pairs give the range of possible earthquake forecasts under a point-process renewal model. Consequences of choices made in stress transfer calculations, such as different slip models, fault rake, dip, and friction, are tracked. For interactions among large faults, calculated peak stress changes may be localized, with most of the receiving fault area changed less than the mean. Thus, to avoid overstating probability change on segments, stress change values should be drawn from a distribution reflecting the spatial pattern rather than using the segment mean. Disparity resulting from interaction probability methodology is also examined. For a fault with a well-understood earthquake history, a minimum stress change to stressing rate ratio of 10:1 to 20:1 is required to significantly skew probabilities with >80-85% confidence. That ratio must be closer to 50:1 to exceed 90-95% confidence levels. Thus revision to earthquake probability is achievable when a perturbing event is very close to the fault in question or the tectonic stressing rate is low.
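A hedged sketch of the renewal-model conditional probability, using a lognormal recurrence distribution as a stand-in for the paper's exact model and folding a static stress step in as a clock advance equal to stress change divided by stressing rate; all parameter values below are illustrative:

```python
import math

def lognorm_cdf(t, median, sigma):
    """CDF of a lognormal recurrence-time model; sigma plays the role of
    the aperiodicity parameter."""
    if t <= 0:
        return 0.0
    z = math.log(t / median) / (sigma * math.sqrt(2.0))
    return 0.5 * (1.0 + math.erf(z))

def conditional_prob(elapsed, window, median, sigma, clock_advance=0.0):
    """P(rupture within `window` years | quiet for `elapsed` years).

    A static stress step is commonly folded in as a clock advance of
    (stress change) / (tectonic stressing rate), shifting elapsed time."""
    t = elapsed + clock_advance
    F = lognorm_cdf
    return ((F(t + window, median, sigma) - F(t, median, sigma))
            / (1.0 - F(t, median, sigma)))

# Illustrative fault: median recurrence 200 yr, aperiodicity 0.5,
# 100 yr since the last event, 30-yr forecast window:
print(conditional_prob(100.0, 30.0, 200.0, 0.5))
print(conditional_prob(100.0, 30.0, 200.0, 0.5, clock_advance=20.0))
```

Comparing the two outputs shows how a modest clock advance shifts the forecast, which is the quantity the paper tests for statistical significance.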
Analysis on tank truck accidents involved in road hazardous materials transportation in china.
Shen, Xiaoyan; Yan, Ying; Li, Xiaonan; Xie, Chenjiang; Wang, Lihua
2014-01-01
Due to the sheer size and capacity of the tanker and the properties of the cargo transported in the tank, hazmat tanker accidents are more disastrous than other types of vehicle accidents. The aim of this study was to provide a current survey of accidents involving tankers transporting hazardous materials in China. Detailed descriptions of 708 tanker accidents associated with hazmat transportation in China from 2004 to 2011 were analyzed to identify causes, location, types, time of occurrence, hazard class of the materials involved, consequences, and the corresponding probability. Hazmat tanker accidents mainly occurred in eastern (38.1%) and southwest China (12.3%). The most frequent hazmat tanker accidents involved classes 2, 3, and 8. The predominant accident types were rollover (29.10%), run-off-the-road (16.67%), and rear-end collisions (13.28%), with a high likelihood of a large spill occurring. About 55.93% of the accidents occurred on freeways and class 1 roads, with the spill percentage reaching 75.00% and the proportion of spills that occurred in the total accidents amounting to 77.82%, of which 61.72% are considered large spills. The month with the highest accident probability was July (12.29%), and most crashes occurred during the early morning (4:00-6:00 a.m.) and midday (10:00 a.m.-12:00 p.m.) hours, 19.63% versus 16.10%. Human-related errors (73.8%) and vehicle-related defects (19.6%) were the primary causes of hazmat tanker crashes. The most common outcome of a hazmat tanker accident was a spill without further events (55.51%), followed by a release with fire (7.77%) and a release with an explosion (2.54%). The safety situation of China's hazmat tanker transportation is grim. Such accidents not only have high spill percentages and consistently large spills but can also cause serious consequences, such as fires and explosions. 
Improving the training of drivers and the quality of vehicles, deploying roll stability aids, enhancing vehicle inspection and maintenance, and developing good delivery schedules may all be considered effective measures for mitigating hazmat tanker accidents, especially severe crashes.
Effects of variability in probable maximum precipitation patterns on flood losses
NASA Astrophysics Data System (ADS)
Zischg, Andreas Paul; Felder, Guido; Weingartner, Rolf; Quinn, Niall; Coxon, Gemma; Neal, Jeffrey; Freer, Jim; Bates, Paul
2018-05-01
The assessment of the impacts of extreme floods is important for dealing with residual risk, particularly for critical infrastructure management and for insurance purposes. Thus, modelling of the probable maximum flood (PMF) from probable maximum precipitation (PMP) by coupling hydrological and hydraulic models has gained interest in recent years. Herein, we examine whether variability in precipitation patterns exceeds or is below selected uncertainty factors in flood loss estimation and if the flood losses within a river basin are related to the probable maximum discharge at the basin outlet. We developed a model experiment with an ensemble of probable maximum precipitation scenarios created by Monte Carlo simulations. For each rainfall pattern, we computed the flood losses with a model chain and benchmarked the effects of variability in rainfall distribution with other model uncertainties. The results show that flood losses vary considerably within the river basin and depend on the timing and superimposition of the flood peaks from the basin's sub-catchments. In addition to the flood hazard component, the other components of flood risk, exposure, and vulnerability contribute remarkably to the overall variability. This leads to the conclusion that the estimation of the probable maximum expectable flood losses in a river basin should not be based exclusively on the PMF. Consequently, the basin-specific sensitivities to different precipitation patterns and the spatial organization of the settlements within the river basin need to be considered in the analyses of probable maximum flood losses.
Caruso, Maria Vittoria; Serra, Raffaele; Perri, Paolo; Buffone, Gianluca; Caliò, Francesco Giuseppe; DE Franciscis, Stefano; Fragomeni, Fragomeni
2017-01-01
Hemodynamics has a key role in atheropathogenesis. Indeed, atherosclerotic phenomena occur in vessels characterized by complex geometry and flow patterns, like the carotid bifurcation. Moreover, lifestyle is a significant risk factor. The aim of this study is to evaluate the hemodynamic effects of two sedentary lifestyles - sitting and standing positions - in the carotid bifurcation in order to identify the worse condition and to investigate atherosclerosis incidence. Computational fluid dynamics (CFD) was chosen to carry out the analysis, in which in vivo non-invasive measurements were used as boundary conditions. Furthermore, to compare the two conditions, one patient-specific 3D model of a carotid bifurcation was reconstructed starting from computed tomography. Different mechanical indicators correlated with atherosclerosis incidence were calculated in addition to flow pattern and pressure distribution: the time-averaged wall shear stress (TAWSS), the oscillatory shear index (OSI), and the relative residence time (RRT). The results showed that the bulb and the external carotid artery emergence are the most probable regions in which atherosclerotic events could happen: these regions show low velocity and WSS values, high OSI, and, as a consequence, chaotic, swirling flow with stasis (high RRT). Moreover, the sitting position is the worse condition: over a cardiac cycle, TAWSS is lower by 17.2% and OSI and RRT are greater by 17.5% and 21.2%, respectively. This study suggests that a person who spends much time in the sitting position faces a higher risk of plaque formation and, consequently, of stenosis.
ERIC Educational Resources Information Center
Penrod, Becky; Gardella, Laura; Fernand, Jonathan
2012-01-01
Few studies have examined the effects of the high-probability instructional sequence in the treatment of food selectivity, and results of these studies have been mixed (e.g., Dawson et al., 2003; Patel et al., 2007). The present study extended previous research on the high-probability instructional sequence by combining this procedure with…
Stochastic von Bertalanffy models, with applications to fish recruitment.
Lv, Qiming; Pitchford, Jonathan W
2007-02-21
We consider three individual-based models describing growth in stochastic environments. Stochastic differential equations (SDEs) with identical von Bertalanffy deterministic parts are formulated, with a stochastic term which decreases, remains constant, or increases with organism size, respectively. Probability density functions for hitting times are evaluated in the context of fish growth and mortality. Solving the hitting time problem analytically or numerically shows that stochasticity can have a large positive impact on fish recruitment probability. It is also demonstrated that the observed mean growth rate of surviving individuals always exceeds the mean population growth rate, which itself exceeds the growth rate of the equivalent deterministic model. The consequences of these results in more general biological situations are discussed.
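The constant-noise case of the three SDEs can be simulated with a simple Euler-Maruyama scheme; all parameter values below are hypothetical, and only the additive-noise model (stochastic term independent of size) is shown.

```python
import numpy as np

# Euler-Maruyama simulation of a stochastic von Bertalanffy SDE,
#   dL = k (Linf - L) dt + sigma dW,
# i.e. the constant-noise case among the three models. Parameters are hypothetical.
rng = np.random.default_rng(1)
k, Linf, sigma = 0.5, 100.0, 2.0
L0, dt, n_steps, n_fish = 5.0, 0.01, 1000, 500

L = np.full(n_fish, L0)
for _ in range(n_steps):
    dW = rng.normal(0.0, np.sqrt(dt), n_fish)   # Brownian increments
    L = L + k * (Linf - L) * dt + sigma * dW

# Deterministic von Bertalanffy solution at t = n_steps * dt, for comparison:
t_end = n_steps * dt
L_det = Linf - (Linf - L0) * np.exp(-k * t_end)
print(L.mean(), L_det)
```

With additive noise the population mean tracks the deterministic curve; the paper's stronger results concern hitting times of a recruitment size threshold, where the spread around that mean, not the mean itself, does the work.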
NASA Astrophysics Data System (ADS)
Oliveira, F. C.; Denadai, A. M. L.; Fulgêncio, F. H.; Magalhães, W. F.; Alcântara, A. F. C.; Windmöller, D.; Machado, J. C.
2012-06-01
Positronium formation in triphenylphosphine oxide (TPPO), triphenylmethanol (TPM), and the systems TPPO(1-x)·TPM(x) has been studied. The low probability of positronium formation in the complex TPPO(0.5)·TPM(0.5) was attributed to strong hydrogen-bond and sixfold phenyl embrace interactions. These strong interactions in the complex reduce the opportunity for the n- and π-electrons to interact with positrons in the spur and, consequently, lower the probability of positronium formation. The τ3 parameter and the free volume (correlated with τ3) were also sensitive to the formation of hydrogen bonds and sixfold phenyl embrace interactions within the complex. For the physical mixture, the positron annihilation parameters remained unchanged throughout the composition range.

The Extent and Consequences of P-Hacking in Science
Head, Megan L.; Holman, Luke; Lanfear, Rob; Kahn, Andrew T.; Jennions, Michael D.
2015-01-01
A focus on novel, confirmatory, and statistically significant results leads to substantial bias in the scientific literature. One type of bias, known as “p-hacking,” occurs when researchers collect or select data or statistical analyses until nonsignificant results become significant. Here, we use text-mining to demonstrate that p-hacking is widespread throughout science. We then illustrate how one can test for p-hacking when performing a meta-analysis and show that, while p-hacking is probably common, its effect seems to be weak relative to the real effect sizes being measured. This result suggests that p-hacking probably does not drastically alter scientific consensuses drawn from meta-analyses. PMID:25768323
Determination of celestial bodies orbits and probabilities of their collisions with the Earth
NASA Astrophysics Data System (ADS)
Medvedev, Yuri; Vavilov, Dmitrii
In this work we have developed a universal method to determine the orbits of small bodies in the Solar System. The method considers different possible planes of the body's motion and selects the most appropriate one. Given an orbital plane, we can calculate geocentric distances at the times of observation and consequently determine all orbital elements. A second technique proposed here addresses the problem of estimating the probability of collisions of celestial bodies with the Earth. This technique uses a coordinate system associated with the nominal osculating orbit. We have compared the proposed technique with Monte Carlo simulation. The results of the two methods show satisfactory agreement, while the proposed method is advantageous in computation time.
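The Monte Carlo benchmark can be sketched in miniature: draw virtual bodies from the orbital-uncertainty distribution, propagate each to closest approach, and count impactors. A real orbit propagator is replaced here by a hypothetical one-dimensional signed miss distance, so only the counting logic is illustrated.

```python
import numpy as np

# Monte-Carlo impact-probability estimate: sample virtual asteroids from the
# uncertainty distribution and count those passing within one Earth radius.
# The normal miss-distance model is a hypothetical stand-in for propagation.
rng = np.random.default_rng(0)
n = 100_000

miss = rng.normal(loc=2.0, scale=1.0, size=n)  # nominal miss: 2 Earth radii
p_impact = np.mean(np.abs(miss) < 1.0)         # fraction hitting the Earth disc
print(p_impact)
```

The paper's coordinate-system technique aims to reach a comparable probability estimate without propagating the full cloud of virtual bodies, which is where the time advantage comes from.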
A New Empirical Constraint on the Prevalence of Technological Species in the Universe
NASA Astrophysics Data System (ADS)
Frank, A.; Sullivan, W. T., III
2016-05-01
In this article, we address the cosmic frequency of technological species. Recent advances in exoplanet studies provide strong constraints on all astrophysical terms in the Drake equation. Using these and modifying the form and intent of the Drake equation, we set a firm lower bound on the probability that one or more technological species have evolved anywhere and at any time in the history of the observable Universe. We find that as long as the probability that a habitable zone planet develops a technological species is larger than ˜10-24, humanity is not the only time technological intelligence has evolved. This constraint has important scientific and philosophical consequences.
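The bound reduces to one formula: with N habitable-zone planets and per-planet probability p that a technological species ever arises, the chance that at least one other did is 1 - (1 - p)^N. N below is a hypothetical round count, not the paper's estimate.

```python
import math

# Illustration of the lower-bound argument. N is a hypothetical round number
# of habitable-zone planets in the observable Universe, for demonstration only.
N = 1.0e22

def p_at_least_one_other(p, n_planets=N):
    # log1p/expm1 keep 1 - (1 - p)^N numerically stable for tiny p
    return -math.expm1(n_planets * math.log1p(-p))

print(p_at_least_one_other(1e-21))  # p well above the ~1/N threshold
print(p_at_least_one_other(1e-25))  # p well below it
```

The transition sits near p ~ 1/N: above it, "at least one other" becomes near-certain, which is the form of the paper's constraint.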
High hunting pressure selects for earlier birth date: Wild boar as a case study
Gamelon, M.; Besnard, A.; Gaillard, J.-M.; Servanty, S.; Baubet, E.; Brandt, S.; Gimenez, O.
2011-01-01
Exploitation by humans affects the size and structure of populations. This has evolutionary and demographic consequences that have typically been studied independently of one another. Here we applied a recently developed framework combining quantitative tools from population ecology and selection gradient analysis to quantify the selection acting on a quantitative trait, birth date, through its association with multiple fitness components. From the long-term monitoring (22 years) of a wild boar (Sus scrofa scrofa) population subject to markedly increasing hunting pressure, we found that birth dates advanced by up to 12 days over the study period. During the period of low hunting pressure, there was no detectable selection. However, during the period of high hunting pressure, the selection gradient linking breeding probability in the first year of life to birth date was negative, supporting life-history theory, which predicts selection for earlier birth, and hence reproduction within the first year of life, as adult mortality increases. © 2011 The Author(s). Evolution © 2011 The Society for the Study of Evolution.
Identification of pre-leukaemic haematopoietic stem cells in acute leukaemia.
Shlush, Liran I; Zandi, Sasan; Mitchell, Amanda; Chen, Weihsu Claire; Brandwein, Joseph M; Gupta, Vikas; Kennedy, James A; Schimmer, Aaron D; Schuh, Andre C; Yee, Karen W; McLeod, Jessica L; Doedens, Monica; Medeiros, Jessie J F; Marke, Rene; Kim, Hyeoung Joon; Lee, Kwon; McPherson, John D; Hudson, Thomas J; Brown, Andrew M K; Yousif, Fouad; Trinh, Quang M; Stein, Lincoln D; Minden, Mark D; Wang, Jean C Y; Dick, John E
2014-02-20
In acute myeloid leukaemia (AML), the cell of origin, nature and biological consequences of initiating lesions, and order of subsequent mutations remain poorly understood, as AML is typically diagnosed without observation of a pre-leukaemic phase. Here, highly purified haematopoietic stem cells (HSCs), progenitor and mature cell fractions from the blood of AML patients were found to contain recurrent DNMT3A mutations (DNMT3A(mut)) at high allele frequency, but without coincident NPM1 mutations (NPM1c) present in AML blasts. DNMT3A(mut)-bearing HSCs showed a multilineage repopulation advantage over non-mutated HSCs in xenografts, establishing their identity as pre-leukaemic HSCs. Pre-leukaemic HSCs were found in remission samples, indicating that they survive chemotherapy. Therefore DNMT3A(mut) arises early in AML evolution, probably in HSCs, leading to a clonally expanded pool of pre-leukaemic HSCs from which AML evolves. Our findings provide a paradigm for the detection and treatment of pre-leukaemic clones before the acquisition of additional genetic lesions engenders greater therapeutic resistance.
[Early congenital syphilis: a case report].
Cavagnaro S M, Felipe; Pereira R, Teresita; Pérez P, Carla; Vargas Del V, Fernanda; Sandoval C, Carmen
2014-02-01
Congenital syphilis (CS) is a multisystemic infection of the newborn (NB) which can produce severe symptoms and, in some cases, even be fatal. In recent years the incidence of syphilis has increased worldwide, and the cases of CS in neonates have increased accordingly. We report two cases of early and severe presentation of CS, focusing on the importance of preventing vertical transmission and of monitoring treated mothers; the diagnostic difficulties are discussed. Two premature newborns diagnosed with probable CS in the newborn period are presented. In the first case, owing to a high index of suspicion but without confirmatory testing, treatment was started, with a good clinical response. In the second case, CS was confirmed by positive serology and specific treatment was given. CS poses significant diagnostic challenges because no test allows early confirmation; a high index of suspicion may therefore be key to treatment and the consequent prognosis. Given the current epidemiology of the condition, it is also important to focus on preventive measures.
Investigation of a light fixture fire
Jurney, James D.; Cournoyer, Michael E.; Trujillo, Stanley; ...
2016-04-16
Metal-halide lamps produce light by discharging an electric arc through a gaseous mixture of vaporized mercury and metal halides. Metal-halide lamps for use in spaces with lower mounting heights can produce excessive visual glare in the normal, higher field-of-view unless they are equipped with prismatic lenses. Should the bulb fail, the high internal operating pressure of the arc tube can launch fragments of the arc tube at high velocity in all directions, striking the outer bulb of the lamp with enough force to break it. This article reports an investigation of a light fixture fire and reviews a case study of a metal-halide lamp fire. Causal analysis of the metal-halide lamp fire uncovered contributing factors that created the environment in which the incident occurred. Latent organizational conditions that created error-likely situations or weakened defenses were identified and controlled. Lastly, effective improvements that reduce the probability or consequence of similar metal-halide lamp fire incidents were implemented.
Stienen, Eric W M; Courtens, Wouter; Van de Walle, Marc; Vanermen, Nicolas; Verstraete, Hilbran
2017-02-15
Trends in oil rates of beached seabirds reflect temporal and spatial patterns in chronic oil pollution at sea. We analysed a long-term dataset of systematic beached bird surveys along the Belgian North Sea coast during 1962-2015, where extremely high oil contamination rates, and consequently high seabird mortality rates, during the 1960s coincided with intensive ship traffic. In the 1960s, >90% of all swimming seabirds that washed ashore were contaminated with oil, and the estimated oil-induced mortality of seabirds was probably several times higher than natural mortality. More than 50 years later, oil rates of seabirds have dropped to historically low levels while shipping is still very intense, indicating that chronic oil pollution has declined significantly. The declining trend is discussed in the light of a series of legislative measures enacted in the North Sea region to reduce oil pollution. Copyright © 2016 Elsevier Ltd. All rights reserved.
Bonjean, Maxime; Baker, Tanya; Bazhenov, Maxim; Cash, Sydney; Halgren, Eric; Sejnowski, Terrence
2012-01-01
Sleep spindles, bursts of 11–15 Hz activity that occur during non-REM sleep, are highly synchronous across the scalp when measured with EEG, yet simultaneously recorded MEG spindles in humans have low spatial coherence and correlate weakly with the EEG signals. We developed a computational model to explore the hypothesis that the spatial coherence of the EEG spindle is a consequence of the diffuse matrix projections of the thalamus to layer 1, compared with the focal projections of the core pathway to layer 4 recorded by the MEG. Increasing the fanout of thalamocortical connectivity in the matrix pathway while keeping the core pathway fixed increased the synchrony of spindle activity in the superficial cortical layers of the model. In agreement with cortical recordings, the latency for spindles to spread from the core to the matrix was independent of the thalamocortical fanout but highly dependent on the probability of connections between cortical areas. PMID:22496571
A model independent search for new physics in final states containing leptons at the DO experiment
NASA Astrophysics Data System (ADS)
Piper, Joel M.
The standard model is known to be the low energy limit of a more general theory. Several consequences of the standard model point to a strong probability of new physics becoming experimentally visible in high energy collisions of a few TeV, resulting in high momentum objects. The specific signatures of these collisions are topics of much debate. Rather than choosing a specific signature, this analysis broadly searches the data, preferring breadth over sensitivity. In searching for new physics, several different approaches are used. These include the comparison of data with standard model background expectation in overall number of events, comparisons of distributions of many kinematic variables, and finally comparisons on the tails of distributions that sum the momenta of the objects in an event. With 1.07 fb-1 at the DO experiment, we find no evidence of physics beyond the standard model. Several discrepancies from the standard model were found, but none of these provide a compelling case for new physics.
Basal paravian functional anatomy illuminated by high-detail body outline
Wang, Xiaoli; Pittman, Michael; Zheng, Xiaoting; Kaye, Thomas G.; Falk, Amanda R.; Hartman, Scott A.; Xu, Xing
2017-01-01
Body shape is a fundamental expression of organismal biology, but its quantitative reconstruction in fossil vertebrates is rare. Due to the absence of fossilized soft tissue evidence, the functional consequences of basal paravian body shape and its implications for the origins of avians and flight are not yet fully understood. Here we reconstruct the quantitative body outline of a fossil paravian Anchiornis based on high-definition images of soft tissues revealed by laser-stimulated fluorescence. This body outline confirms patagia-bearing arms, drumstick-shaped legs and a slender tail, features that were probably widespread among paravians. Finely preserved details also reveal similarities in propatagial and footpad form between basal paravians and modern birds, extending their record to the Late Jurassic. The body outline and soft tissue details suggest significant functional decoupling between the legs and tail in at least some basal paravians. The number of seemingly modern propatagial traits hint that feathering was a significant factor in how basal paravians utilized arm, leg and tail function for aerodynamic benefit. PMID:28248287
Martin, C E; Brandmeyer, E A; Ross, R D
2013-01-01
Leaf temperatures were lower when light entry at the leaf tip window was prevented through covering the window with reflective tape, relative to leaf temperatures of plants with leaf tip windows covered with transparent tape. This was true when leaf temperatures were measured with an infrared thermometer, but not with a fine-wire thermocouple. Leaf tip windows of Lithops growing in high-rainfall regions of southern Africa were larger than the windows of plants (numerous individuals of 17 species) growing in areas with less rainfall and, thus, more annual insolation. The results of this study indicate that leaf tip windows of desert plants with an underground growth habit can allow entry of supra-optimal levels of radiant energy, thus most likely inhibiting photosynthetic activity. Consequently, the size of the leaf tip windows correlates inversely with habitat solar irradiance, minimising the probability of photoinhibition, while maximising the absorption of irradiance in cloudy, high-rainfall regions. © 2012 German Botanical Society and The Royal Botanical Society of the Netherlands.
Bonte, Anja; Schweiger, Rabea; Pons, Caroline; Wagner, Claudia; Brühl, Ludger; Matthäus, Bertrand; Müller, Caroline
2017-12-20
Virgin rapeseed (Brassica napus) oil is a valuable niche product, if delivered with a high quality. In this study, the effects of moist storage of B. napus seeds for 1 to 4 days on the seed metabolome and the chemo-sensory properties of the produced oils were determined. The concentrations of several primary metabolites, including monosaccharides and amino acids, rapidly increased in the seeds, probably indicating the breakdown of storage compounds to support seed germination. Seed concentrations of indole glucosinolates increased with a slight time offset suggesting that amino acids may be used to modify secondary metabolism. The volatile profiles of the oils were pronouncedly influenced by moist seed storage, with the sensory quality of the oils decreasing. This study provides a direct time-resolved link between seed metabolism under moist conditions and the quality of the resulting oils, thereby emphasizing the crucial role of dry seed storage in ensuring high oil quality.
[Disability pensions in young age in Norway during 1976-1996].
Bjerkedal, T
1998-06-10
A 15% increase in the incidence of 16 to 24-year-olds drawing disability pension was observed in Norway from 1976 to 1993. This increase is mainly a consequence of the higher numbers of pensioners with birth defects and mental retardation. The prevalence of these conditions, which are clearly related to pregnancy, delivery, and inheritable disorders, may have increased as a consequence of the improved survival of newborn babies over the last two decades. A 50% increase in the incidence of disability pensions among 16 to 24-year-olds occurred in the three-year period from 1994 to 1996. The higher rate is most probably a consequence of the restrictions in rehabilitation benefits introduced in 1993 and the resultant difficulties in obtaining employment. The higher incidence is a clear indicator of the need to increase assistance for the disabled in order to avoid their being pensioned at a young age.
Pekcan-Hekim, Zeynep; Lappalainen, Jyrki
2006-07-01
Increased turbidity reduces visibility in the water column, which can negatively affect vision-oriented fish and their ability to detect prey. Young fish could consequently benefit from high turbidity levels, which can provide a protective cover and reduce predation pressure. Perch (Perca fluviatilis) are commonly found in the littoral zones of temperate lakes and in coastal areas of the Baltic Sea. Pikeperch (Sander lucioperca) spawn in these areas, so perch is a potential predator of pikeperch larvae. We conducted laboratory experiments to test the predation of perch on pikeperch larvae at different turbidity levels (5-85 nephelometric turbidity units), densities of pikeperch larvae (2-21 individuals l(-1)) and volumes of water (10-45 l). The logistic regression showed that the probability of larvae being eaten depended significantly on turbidity and on the volume of water in the bags, while the density of larvae was not significant. However, because container size is known to affect predation, the data were divided into two groups based on water volume (10-20 and 25-45 l) to reduce the effects of container size. In neither group did the probability of predation depend significantly on volume; turbidity was significant in both groups, and density was significant in the larger water volumes. Thus, high turbidity impaired perch predation and protected pikeperch larvae from it. Because the density of larvae was also a significant factor affecting perch predation, the dispersal of pikeperch larvae from spawning areas should also increase the survival of larvae.
Teaching Probabilities and Statistics to Preschool Children
ERIC Educational Resources Information Center
Pange, Jenny
2003-01-01
This study considers the teaching of probabilities and statistics to a group of preschool children using traditional classroom activities and Internet games. It was clear from this study that children can show a high level of understanding of probabilities and statistics, and demonstrate high performance in probability games. The use of Internet…
The Value, Protocols, and Scientific Ethics of Earthquake Forecasting
NASA Astrophysics Data System (ADS)
Jordan, Thomas H.
2013-04-01
Earthquakes are different from other common natural hazards because precursory signals diagnostic of the magnitude, location, and time of impending seismic events have not yet been found. Consequently, the short-term, localized prediction of large earthquakes at high probabilities with low error rates (false alarms and failures-to-predict) is not yet feasible. An alternative is short-term probabilistic forecasting based on empirical statistical models of seismic clustering. During periods of high seismic activity, short-term earthquake forecasts can attain prospective probability gains up to 1000 relative to long-term forecasts. The value of such information is by no means clear, however, because even with hundredfold increases, the probabilities of large earthquakes typically remain small, rarely exceeding a few percent over forecasting intervals of days or weeks. Civil protection agencies have been understandably cautious in implementing operational forecasting protocols in this sort of "low-probability environment." This paper will explore the complex interrelations among the valuation of low-probability earthquake forecasting, which must account for social intangibles; the protocols of operational forecasting, which must factor in large uncertainties; and the ethics that guide scientists as participants in the forecasting process, who must honor scientific principles without doing harm. Earthquake forecasts possess no intrinsic societal value; rather, they acquire value through their ability to influence decisions made by users seeking to mitigate seismic risk and improve community resilience to earthquake disasters. According to the recommendations of the International Commission on Earthquake Forecasting (www.annalsofgeophysics.eu/index.php/annals/article/view/5350), operational forecasting systems should appropriately separate the hazard-estimation role of scientists from the decision-making role of civil protection authorities and individuals. 
They should provide public sources of information on short-term probabilities that are authoritative, scientific, open, and timely. Alert procedures should be negotiated with end-users to facilitate decisions at different levels of society, based in part on objective analysis of costs and benefits but also on less tangible aspects of value-of-information, such as gains in psychological preparedness and resilience. Unfortunately, in most countries, operational forecasting systems do not conform to such high standards, and earthquake scientists are often called upon to advise the public in roles that exceed their civic authority, expertise in risk communication, and situational knowledge. Certain ethical principles are well established; e.g., announcing unreliable predictions in public forums should be avoided, because bad information can be dangerous. But what are the professional responsibilities of earthquake scientists during seismic crises, especially when the public information through official channels is thought to be inadequate or incorrect? How much should these responsibilities be discounted in the face of personal liability? How should scientists contend with highly uncertain forecasts? To what degree should the public be involved in controversies about forecasting results? No simple answers to these questions can be offered, but the need for answers can be reduced by improving operational forecasting systems. This will require more substantial, and more trustful, collaborations between scientists, civil authorities, and public stakeholders.
Estimating the carbon in coarse woody debris with perpendicular distance sampling. Chapter 6
Harry T. Valentine; Jeffrey H. Gove; Mark J. Ducey; Timothy G. Gregoire; Michael S. Williams
2008-01-01
Perpendicular distance sampling (PDS) is a design for sampling the population of pieces of coarse woody debris (logs) in a forested tract. In application, logs are selected at sample points with probability proportional to volume. Consequently, aggregate log volume per unit land area can be estimated from tallies of logs at sample points. In this chapter we provide...
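The unbiasedness behind "tallies estimate volume" can be checked with a toy simulation. The inclusion rule below abstracts PDS to "tally log i with probability V_i / F", with a hypothetical volume factor F and a synthetic log population; the field protocol that realizes these inclusion probabilities geometrically is not reproduced here.

```python
import numpy as np

# Probability-proportional-to-volume selection in the spirit of PDS: if log i
# is tallied at a random sample point with probability V_i / F (F a design
# constant), then F times the tally count is unbiased for total log volume.
rng = np.random.default_rng(7)
volumes = rng.uniform(0.05, 0.8, size=200)   # synthetic log volumes, m^3
F = 2.0                                      # volume factor, m^3 per tally
n_points = 20_000                            # independent sample points

U = rng.uniform(0.0, 1.0, size=(n_points, volumes.size))
estimates = F * (U < volumes / F).sum(axis=1)   # one volume estimate per point

print(estimates.mean(), volumes.sum())
```

Averaging the per-point estimates recovers the true aggregate volume, which is why field crews only need counts, not measurements, at each sample point.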
Andrew Youngblood; Kerry L. Metlen; Eric E. Knapp; Kenneth W. Outcalt; Scott L. Stephens; Thomas A. Waldrop; Daniel Yaussy
2005-01-01
Many fire-dependent forests today are denser, contain fewer large trees, have higher fuel loads, and greater fuel continuity than occurred under historical fire regimes. These conditions increase the probability of unnaturally severe wildfires. Silviculturists are increasingly being asked to design fuel reduction treatments to help protect existing and future forest...
Elephantiasis Nostras Verrucosa (ENV): a complication of congestive heart failure and obesity.
Baird, Drew; Bode, David; Akers, Troy; Deyoung, Zachariah
2010-01-01
Congestive heart failure (CHF) and obesity are common medical conditions that have many complications and an increasing incidence in the United States. Presented here is a case of a disfiguring skin condition that visually highlights the dermatologic consequences of poorly controlled CHF and obesity. This condition will probably become more common as CHF and obesity increase in the US.
The challenge of modelling and mapping the future distribution and impact of invasive alien species
Robert C. Venette
2015-01-01
Invasions from alien species can jeopardize the economic, environmental or social benefits derived from biological systems. Biosecurity measures seek to protect those systems from accidental or intentional introductions of species that might become injurious. Pest risk maps convey how the probability of invasion by an alien species or the potential consequences of that...
Common pitfalls in statistical analysis: The perils of multiple testing
Ranganathan, Priya; Pramesh, C. S.; Buyse, Marc
2016-01-01
Multiple testing refers to situations where a dataset is subjected to statistical testing multiple times - either at multiple time-points or through multiple subgroups or for multiple end-points. This amplifies the probability of a false-positive finding. In this article, we look at the consequences of multiple testing and explore various methods to deal with this issue. PMID:27141478
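The inflation, and the standard Bonferroni correction, can be computed directly; alpha and m below are illustrative choices.

```python
# Why multiple testing inflates false positives, and the Bonferroni fix.
# With m independent tests at level alpha, the family-wise error rate is
# 1 - (1 - alpha)^m; testing each at alpha/m restores FWER <= alpha.
alpha, m = 0.05, 20

fwer = 1 - (1 - alpha) ** m                  # ~0.64: a false positive is likely
per_test = alpha / m                         # Bonferroni-corrected level
fwer_corrected = 1 - (1 - per_test) ** m

print(fwer, per_test, fwer_corrected)
```

Twenty independent tests at the nominal 5% level make at least one false positive more likely than not, which is exactly the amplification the abstract describes.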
A Population-Based Study of Childhood Sexual Contact in China: Prevalence and Long-Term Consequences
ERIC Educational Resources Information Center
Luo, Ye; Parish, William L.; Laumann, Edward O.
2008-01-01
Objectives: This study provides national estimates of the prevalence of childhood sexual contact and its association with sexual well-being and psychological distress among adults in China. Method: A national stratified probability sample of 1,519 women and 1,475 men aged 20-64 years in urban China completed a computer-administered survey in…
[Hospital financing in 2015. Relevant changes for rheumatology].
Fiori, W; Lakomek, H-J; Buscham, K; Lehmann, H; Fuchs, A-K; Bessler, F; Roeder, N
2015-06-01
The announced major reforms will most probably not have an impact on hospital financing before 2016. Nevertheless, the numerous minor changes in the legislative framework and the new version of the German diagnosis-related groups (G-DRG) system can be important for hospitals specialized in rheumatology. The following article presents the relevant changes and discusses the consequences for hospitals specialized in rheumatology.
Loss Estimations due to Earthquakes and Secondary Technological Hazards
NASA Astrophysics Data System (ADS)
Frolova, N.; Larionov, V.; Bonnin, J.
2009-04-01
Expected loss and damage assessment for natural and technological disasters is of primary importance for emergency management just after a disaster, as well as for the development and implementation of preventive-measures plans. The paper addresses the procedures and simulation models for loss estimation due to strong earthquakes and secondary technological accidents. Mathematical models for shaking-intensity distribution, damage to buildings and structures, debris volume, and the numbers of fatalities and injuries due to earthquakes and to technological accidents at fire- and chemical-hazardous facilities are considered; these are used in geographical information systems designed for these purposes. Criteria for the occurrence of technological accidents are developed on the basis of engineering analysis of the consequences of past events. The paper provides the results of estimating the consequences of scenario earthquakes and of individual seismic risk assessment taking into account secondary technological hazards at regional and urban levels. Individual risk is understood as the probability of death (or injury) due to a possible hazardous event within one year in a given territory. It is determined as the mathematical expectation of social losses, taking into account the number of inhabitants in the considered settlement and the probability of natural and/or technological disaster.
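The definition in the last two sentences amounts to a one-line expectation; the scenario probabilities, fatality counts, and population below are hypothetical.

```python
# Individual risk as the expectation of fatalities over scenario
# probabilities, divided by the exposed population. Numbers are hypothetical.
scenarios = [
    # (annual probability, modelled fatalities if the scenario occurs)
    (0.010, 120.0),   # scenario earthquake alone
    (0.002, 400.0),   # earthquake plus secondary chemical release
]
population = 300_000

expected_deaths = sum(p * d for p, d in scenarios)
individual_risk = expected_deaths / population   # annual probability of death
print(individual_risk)
```

Including the secondary-hazard scenario alongside the earthquake alone is how the paper's "taking into account secondary technological hazards" enters the expectation.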
Dynamic decision making for dam-break emergency management - Part 1: Theoretical framework
NASA Astrophysics Data System (ADS)
Peng, M.; Zhang, L. M.
2013-02-01
An evacuation decision for dam breaks is a very serious issue. A late decision may lead to loss of lives and properties, but a very early evacuation will incur unnecessary expenses. This paper presents a risk-based framework of dynamic decision making for dam-break emergency management (DYDEM). The dam-break emergency management in both time scale and space scale is introduced first to define the dynamic decision problem. The probability of dam failure is taken as a stochastic process and estimated using a time-series analysis method. The flood consequences are taken as functions of warning time and evaluated with a human risk analysis model (HURAM) based on Bayesian networks. A decision criterion is suggested to decide whether to evacuate the population at risk (PAR) or to delay the decision. The optimum time for evacuating the PAR is obtained by minimizing the expected total loss, which integrates the time-related probabilities and flood consequences. When a delayed decision is chosen, the decision making can be updated with available new information. A specific dam-break case study is presented in a companion paper to illustrate the application of this framework to complex dam-breaching problems.
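The minimization step can be sketched directly: evacuating earlier buys warning time (fewer expected fatalities if the dam fails) but costs more if it does not. The failure probability, the fatality-vs-warning-time curve, and the evacuation cost rate below are invented placeholders for HURAM outputs and time-series estimates.

```python
import numpy as np

# Toy version of the DYDEM optimum: minimize expected total loss over the
# warning time bought by evacuating early. All numbers are hypothetical.
p_failure = 0.05

def expected_total_loss(w):
    # w: warning time (h) gained by evacuating w hours before predicted breach
    life_loss = 5000.0 * np.exp(-w / 4.0)   # fatalities shrink with warning time
    evac_cost = 40.0 * w                    # cost-equivalent of early evacuation
    return p_failure * life_loss + evac_cost

warning = np.linspace(0.0, 24.0, 241)
losses = expected_total_loss(warning)
w_opt = warning[np.argmin(losses)]
print(w_opt)
```

The interior minimum is the framework's key feature: both "evacuate now" (w large) and "wait" (w = 0) cost more than the optimum, and updating p_failure with new information shifts w_opt.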
Probability of foliar injury for Acer sp. based on foliar fluoride concentrations.
McDonough, Andrew M; Dixon, Murray J; Terry, Debbie T; Todd, Aaron K; Luciani, Michael A; Williamson, Michele L; Roszak, Danuta S; Farias, Kim A
2016-12-01
Fluoride is considered one of the most phytotoxic elements to plants, and indicative fluoride injury has been associated with a wide range of foliar fluoride concentrations. The aim of this study was to determine the probability of indicative foliar fluoride injury based on Acer sp. foliar fluoride concentrations using a logistic regression model. Foliage from Acer negundo, Acer saccharinum, Acer saccharum and Acer platanoides was collected along a distance gradient from three separate brick manufacturing facilities in southern Ontario as part of a long-term monitoring programme between 1995 and 2014. Hydrogen fluoride is the major emission associated with the manufacturing facilities, resulting in highly elevated foliar fluoride close to the facilities that decreases with distance. Consistent with other studies, indicative fluoride injury was observed over a wide range of foliar concentrations (9.9–480.0 μg F g⁻¹). The logistic regression model was statistically significant for the Acer sp. group, A. negundo and A. saccharinum, with A. negundo the most sensitive species among the group; A. saccharum and A. platanoides were not statistically significant within the model. We are unaware of published foliar fluoride values for Acer sp. within Canada, and this research provides policy makers and scientists with probabilities of indicative foliar injury for common urban Acer sp. trees that can help guide decisions about emissions controls. Further research should focus on the mechanisms driving indicative fluoride injury over wide-ranging foliar fluoride concentrations and help determine foliar fluoride thresholds for damage.
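A logistic model of this form can be sketched in a few lines of Python; the intercept and slope below are hypothetical, not the coefficients fitted in the study:

```python
import math

# Minimal sketch of the study's model class: probability of indicative
# foliar injury as a logistic function of (log) foliar fluoride
# concentration.  B0 and B1 are hypothetical coefficients.

B0, B1 = -4.0, 0.9    # hypothetical intercept and slope

def p_injury(fluoride_ug_g):
    """P(injury) from a logistic regression on log concentration."""
    z = B0 + B1 * math.log(fluoride_ug_g)
    return 1.0 / (1.0 + math.exp(-z))

for conc in (10, 50, 200, 480):
    print(f"{conc:4d} ug F/g -> P(injury) = {p_injury(conc):.2f}")
```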
Areal, F J; Touza, J; MacLeod, A; Dehnen-Schmutz, K; Perrings, C; Palmieri, M G; Spence, N J
2008-12-01
This paper analyses the cut flower market as an example of an invasion pathway along which species of non-indigenous plant pests can travel to reach new areas. The paper examines the probability of pest detection by assessing information on pest detection and detection effort associated with the import of cut flowers. We test the link between the probability of plant pest arrivals, as a precursor to potential invasion, and volume of traded flowers using count data regression models. The analysis is applied to the UK import of specific genera of cut flowers from Kenya between 1996 and 2004. There is a link between pest detection and the Genus of cut flower imported. Hence, pest detection efforts should focus on identifying and targeting those imported plants with a high risk of carrying pest species. For most of the plants studied, efforts allocated to inspection have a significant influence on the probability of pest detection. However, by better targeting inspection efforts, it is shown that plant inspection effort could be reduced without increasing the risk of pest entry. Similarly, for most of the plants analysed, an increase in volume traded will not necessarily lead to an increase in the number of pests entering the UK. For some species, such as Carthamus and Veronica, the volume of flowers traded has a significant and positive impact on the likelihood of pest detection. We conclude that analysis at the rank of plant Genus is important both to understand the effectiveness of plant pest detection efforts and consequently to manage the risk of introduction of non-indigenous species.
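A count-data model of this kind can be sketched as follows; the log-link form and every coefficient below are hypothetical illustrations of the approach, not the fitted UK-Kenya results:

```python
import math

# Sketch of a count-data (Poisson) model for pest interceptions: the
# expected number of detections depends on inspection effort and traded
# volume through a log link.  All coefficients are hypothetical.

B0, B_EFFORT, B_VOLUME = -2.0, 0.8, 0.3   # hypothetical coefficients

def expected_detections(effort, volume):
    """Poisson mean: log(mu) = b0 + b1*log(effort) + b2*log(volume)."""
    return math.exp(B0 + B_EFFORT * math.log(effort)
                    + B_VOLUME * math.log(volume))

def p_at_least_one(effort, volume):
    """Probability of detecting at least one pest in a consignment."""
    return 1.0 - math.exp(-expected_detections(effort, volume))

print(f"P(detection) = {p_at_least_one(effort=10.0, volume=1000.0):.3f}")
```

Under such a model a small volume elasticity (b2 near zero) is exactly the paper's point that more trade need not mean proportionally more pest entries.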
[Association of childhood sexual abuse and disordered eating in a sample of Mexican adolescents].
Unikel-Santoncini, Claudia; Ramos-Lira, Luciana; Juárez-García, Francisco
2011-01-01
To analyze the association between childhood sexual abuse (CSA) and disordered eating (DE), a probabilistic sample of 2,358 female high school students from public schools in the Estado de México was used. DE was more prevalent among CSA sufferers (p ≤ 0.05). Preoccupation with gaining weight, binging and restrictive behaviors were associated with CSA (p ≤ 0.05). The probability of DE was 7 times higher when the CSA experience had not been revealed and 36 times higher when CSA happened before 14 years of age. The association between CSA and DE is clear in the sample studied, as are some of its specific characteristics, which highlights the need to deepen research in this field and to incorporate the evaluation of CSA and its consequences into adolescents' mental health care.
Dynamic risk control by human nucleus accumbens
Lopez-Sosa, Fernando; Gonzalez-Rosa, Javier Jesus; Galarza, Ana; Avecillas, Josue; Pineda-Pardo, Jose Angel; Lopez-Ibor, Juan José; Reneses, Blanca; Barcia, Juan Antonio
2015-01-01
Real-world decisions about reward often involve a complex counterbalance of risk and value. Although the nucleus accumbens has been implicated in the underlying neural substrate, its criticality to human behaviour remains an open question, best addressed with interventional methodology that probes the behavioural consequences of focal neural modulation. Combining a psychometric index of risky decision-making with transient electrical modulation of the nucleus accumbens, here we reveal profound, highly dynamic alteration of the relation between probability of reward and choice during therapeutic deep brain stimulation in four patients with treatment-resistant psychiatric disease. Short-lived phasic electrical stimulation of the region of the nucleus accumbens dynamically altered risk behaviour, transiently shifting the psychometric function towards more risky decisions only for the duration of stimulation. A critical, on-line role of human nucleus accumbens in dynamic risk control is thereby established. PMID:26428667
Thelytokous parthenogenesis in eusocial Hymenoptera.
Rabeling, Christian; Kronauer, Daniel J C
2013-01-01
Female parthenogenesis, or thelytoky, is particularly common in solitary Hymenoptera. Only more recently has it become clear that many eusocial species also regularly reproduce thelytokously, and here we provide a comprehensive overview. Especially in ants, thelytoky underlies a variety of idiosyncratic life histories with unique evolutionary and ecological consequences. In all eusocial species studied, thelytoky probably has a nuclear genetic basis and the underlying cytological mechanism retains high levels of heterozygosity. This is in striking contrast to many solitary wasps, in which thelytoky is often induced by cytoplasmic bacteria and results in an immediate loss of heterozygosity. These differences are likely related to differences in haplodiploid sex determination mechanisms, which in eusocial species usually require heterozygosity for female development. At the same time, haplodiploidy might account for important preadaptations that can help explain the apparent ease with which Hymenoptera transition between sexual and asexual reproduction.
NASA Astrophysics Data System (ADS)
Liu, Changqin; Li, Zhe; Zhang, Yuanlei; Huang, Yinsheng; Ye, Miaofu; Sun, Xiaodong; Zhang, Guojie; Cao, Yiming; Xu, Kun; Jing, Chao
2018-05-01
In this work, we have developed a ferromagnetic shape memory alloy, Co50V34Ga16, with a metamagnetic martensitic transformation (MT) from the high-magnetization austenitic phase to the low-magnetization martensitic phase. As a consequence of strong coupling between the structural and magnetic degrees of freedom, the metamagnetic MT of this alloy is relatively sensitive to the external magnetic field, giving rise to a field-induced reverse MT. Associated with this unique behavior, both a considerable inverse magnetocaloric effect (9.6 J/kg K) and magnetostrain (0.07%) have been obtained under a magnetic field change of 3 T. Our experimental results indicate that this Co-V-based alloy may become a promising alternative candidate for applications in magnetic sensors and magnetic refrigeration.
Use of computers in dysmorphology.
Diliberti, J H
1988-01-01
As a consequence of the increasing power and decreasing cost of digital computers, dysmorphologists have begun to explore a wide variety of computerised applications in clinical genetics. Of considerable interest are developments in the areas of syndrome databases, expert systems, literature searches, image processing, and pattern recognition. Each of these areas is reviewed from the perspective of the underlying computer principles, existing applications, and the potential for future developments. Particular emphasis is placed on the analysis of the tasks performed by the dysmorphologist and the design of appropriate tools to facilitate these tasks. In this context the computer and associated software are considered paradigmatically as tools for the dysmorphologist and should be designed accordingly. Continuing improvements in the ability of computers to manipulate vast amounts of data rapidly makes the development of increasingly powerful tools for the dysmorphologist highly probable. PMID:3050092
Promises and pitfalls of data sharing in qualitative research
Tsai, Alexander C.; Kohrt, Brandon A.; Matthews, Lynn T.; Betancourt, Theresa S.; Lee, Jooyoung K.; Papachristos, Andrew V.; Weiser, Sheri D.; Dworkin, Shari L.
2017-01-01
The movement for research transparency has gained irresistible momentum over the past decade. Although qualitative research is rarely published in the high-impact journals that have adopted, or are most likely to adopt, data sharing policies, qualitative researchers who publish work in these and similar venues will likely encounter questions about data sharing within the next few years. The fundamental ways in which qualitative and quantitative data differ should be considered when assessing the extent to which qualitative and mixed methods researchers should be expected to adhere to data sharing policies developed with quantitative studies in mind. We outline several of the most critical concerns below, while also suggesting possible modifications that may help to reduce the probability of unintended adverse consequences and to ensure that the sharing of qualitative data is consistent with ethical standards in research. PMID:27535900
Faccia, Michele; Mastromatteo, Marianna; Conte, Amalia; Del Nobile, Matteo Alessandro
2012-11-01
In this work, the effects of adding different amounts of sodium chloride during cheese making on the shelf life of mozzarella cheese were evaluated. Mozzarella cheese quality decay was assessed during storage at 9 °C by monitoring microbiological, sensory and physico-chemical changes in the product. Results showed that Pseudomonas spp. growth was responsible for cheese unacceptability, whereas sensory quality did not limit cheese shelf life. In particular, the highest shelf life values were obtained for mozzarella without salt and with the lowest salt concentration (0·23 g NaCl), amounting to about 5 and 4 d, respectively. In contrast, high salt concentrations reduced product shelf life, probably as a consequence of progressive solubilisation of cheese casein due to the phenomenon of 'salting in'.
Biased Brownian dynamics for rate constant calculation.
Zou, G; Skeel, R D; Subramaniam, S
2000-08-01
An enhanced sampling method, biased Brownian dynamics, is developed for the calculation of diffusion-limited biomolecular association reaction rates with high energy or entropy barriers. Biased Brownian dynamics introduces a biasing force in addition to the electrostatic force between the reactants, and it associates a probability weight with each trajectory. A trajectory loses weight when it moves along the biasing force and gains weight when it moves against it. The sampling of trajectories is therefore biased, but the estimate is unbiased once the trajectory outcomes are multiplied by their weights. With a suitable choice of biasing force, more reacted trajectories are sampled; as a consequence, the variance of the estimate is reduced. In our test case, biased Brownian dynamics gives a sevenfold improvement in central processing unit (CPU) time with the choice of a simple centripetal biasing force.
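Stripped of the Brownian-dynamics machinery, the weight-and-reweight idea is ordinary importance sampling. The toy sketch below (a shifted-normal proposal for a Gaussian tail probability; all numbers illustrative, not the paper's system) shows how multiplying outcomes by probability weights keeps the estimate unbiased while sampling far more "reacted" outcomes:

```python
import math
import random

# Toy illustration of weighted biased sampling: estimate P(X > 3) for
# X ~ N(0, 1) by sampling from a shifted ("biased") normal N(3, 1)
# and re-weighting each draw by the density ratio p(x)/q(x).

random.seed(42)

def weight(x, shift=3.0):
    # Ratio of target density N(0,1) to biased density N(shift,1).
    return math.exp(-0.5 * x * x + 0.5 * (x - shift) ** 2)

n = 100_000
est = 0.0
for _ in range(n):
    x = random.gauss(3.0, 1.0)   # sample from the biased density
    if x > 3.0:                  # the rare ("reacted") outcome
        est += weight(x)
est /= n
print(f"biased-sampling estimate of P(X > 3): {est:.5f}")
# true value is about 0.00135; naive sampling would see only
# ~135 hits in 100,000 draws, so its estimate has far higher variance
```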
Effect of acute heat stress on rat adrenal glands: a morphological and stereological study.
Koko, Vesna; Djordjević, Jelena; Cvijić, Gordana; Davidović, Vukosava
2004-11-01
The morphological and stereological structure of rat adrenal gland was analysed by light microscopy after an acute (60 min) exposure to high ambient temperature (38 °C). A significant increase in plasma corticotrophin (ACTH) and serum corticosterone (CORT) concentrations was observed, confirming that acute heat exposure has a strong stressful effect. Under these conditions the adrenal gland mass and volume were decreased, probably as the consequence of adrenal cortex reduction, especially that of the zona fasciculata (ZF). Histological examination revealed that many ZF cells were deprived of lipid droplets. Fibrosis was observed in all parts of the adrenal gland, both cortex and medulla, of heat-stressed animals. Mitotic figures were absent in cortical cells after heat exposure, but there were no differences in ZF and zona reticularis (ZR) small blood vessels compared to non-stressed controls.
NASA Technical Reports Server (NTRS)
Larocque, G. R.
1980-01-01
The vulnerability of a power distribution system in Bedford and Lexington, Massachusetts to power outages as a result of exposure to carbon fibers released in a commercial aviation accident in 1993 was examined. Possible crash scenarios at Logan Airport based on current operational data and estimated carbon fiber usage levels were used to predict exposure levels and occurrence probabilities. The analysis predicts a mean time between carbon fiber induced power outages of 2300 years with an expected annual consequence of 0.7 persons losing power. In comparison to historical outage data for the system, this represents a 0.007% increase in outage rate and 0.07% increase in consequence.
EFFECT OF NONZERO θ13 ON THE MEASUREMENT OF θ23
NASA Astrophysics Data System (ADS)
Raut, Sushant K.
2013-06-01
The moderately large measured value of θ13 signals a departure from the approximate two-flavor oscillation framework. As a consequence, the relation between the value of θ23 in nature, and the mixing angle measured in νμ disappearance experiments is nontrivial. In this paper, we calculate this relation analytically. We also derive the correct conversion between degenerate values of θ23 in the two octants. Through simulations of a νμ disappearance experiment, we show that there are observable consequences of not using the correct relation in calculating oscillation probabilities. These include a wrong best-fit value for θ23, and spurious sensitivity to the octant of θ23.
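For reference, the standard leading-order relation (textbook form, not the paper's full analytic result) between the angle measured in νμ disappearance and θ23 is:

```latex
% Effective two-flavour nu_mu survival probability: disappearance
% experiments measure theta_mumu, which differs from theta_23 when
% theta_13 is nonzero.
P_{\mu\mu} \simeq 1 - \sin^2 2\theta_{\mu\mu}\,
                 \sin^2\!\left(\frac{\Delta m^2_{31} L}{4E}\right),
\qquad
\sin^2\theta_{\mu\mu} = \sin^2\theta_{23}\cos^2\theta_{13}.
```

With θ13 = 0 the two angles coincide; the measured moderately large θ13 is what makes the conversion nontrivial.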
Fullerton, Carol S; McKibben, Jodi B A; Reissman, Dori B; Scharf, Ted; Kowalski-Trakofler, Kathleen M; Shultz, James M; Ursano, Robert J
2013-02-01
We examined the relationship of probable posttraumatic stress disorder (PTSD), probable depression, and increased alcohol and/or tobacco use to disaster exposure and work demand in Florida Department of Health workers after the 2004 hurricanes. Participants (N = 2249) completed electronic questionnaires assessing PTSD, depression, alcohol and tobacco use, hurricane exposure, and work demand. Total mental and behavioral health burden (probable PTSD, probable depression, increased alcohol and/or tobacco use) was 11%. More than 4% had probable PTSD, and 3.8% had probable depression. Among those with probable PTSD, 29.2% had increased alcohol use, and 50% had increased tobacco use. Among those with probable depression, 34% indicated increased alcohol use and 55.6% increased tobacco use. Workers with greater exposure were more likely to have probable PTSD and probable depression (ORs = 3.3 and 3.06, respectively). After adjusting for demographics and work demand, those with high exposure were more likely to have probable PTSD and probable depression (ORs = 3.21 and 3.13). Those with high exposure had increased alcohol and tobacco use (ORs = 3.01 and 3.40), and those with high work demand indicated increased alcohol and tobacco use (ORs = 1.98 and 2.10). High exposure and work demand predicted increased alcohol and tobacco use, after adjusting for demographics, work demand, and exposure. Work-related disaster mental and behavioral health burden indicate the need for additional mental health interventions in the public health disaster workforce.
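The reported odds ratios follow from standard 2×2-table arithmetic; a minimal sketch with hypothetical counts (not the study's data):

```python
# Odds ratio from a 2x2 table: exposure (high vs low) against outcome
# (probable PTSD vs not).  The counts below are hypothetical, chosen
# only to show the arithmetic, not taken from the study.

def odds_ratio(exposed_cases, exposed_noncases,
               unexposed_cases, unexposed_noncases):
    """OR = (a/b) / (c/d) for a 2x2 exposure-outcome table."""
    return ((exposed_cases / exposed_noncases)
            / (unexposed_cases / unexposed_noncases))

or_ptsd = odds_ratio(60, 540, 30, 810)
print(f"odds ratio (high vs low exposure): {or_ptsd:.1f}")  # -> 3.0
```

The adjusted ORs in the abstract come from logistic regression, where each exposure coefficient exponentiates to an odds ratio holding the other covariates fixed.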
Recent research on the high-probability instructional sequence: A brief review.
Lipschultz, Joshua; Wilder, David A
2017-04-01
The high-probability (high-p) instructional sequence consists of the delivery of a series of high-probability instructions immediately before delivery of a low-probability or target instruction. It is commonly used to increase compliance in a variety of populations. Recent research has described variations of the high-p instructional sequence and examined the conditions under which the sequence is most effective. This manuscript reviews the most recent research on the sequence and identifies directions for future research. Recommendations for practitioners regarding the use of the high-p instructional sequence are also provided. © 2017 Society for the Experimental Analysis of Behavior.
Long, Clive G; Banyard, Ellen; Fulton, Barbara; Hollin, Clive R
2014-09-01
Arson and fire-setting are highly prevalent among patients in secure psychiatric settings, but valid and reliable assessment instruments are absent and no intervention approach has demonstrated a significant effect. The aim was to develop a semi-structured interview assessment specifically for fire-setting to augment structured assessments of risk and need. The extant literature was used to frame interview questions covering the antecedents, behaviour and consequences necessary to formulate a functional analysis. Questions also covered readiness to change, fire-setting self-efficacy, the probability of future fire-setting, barriers to change, and understanding of fire-setting behaviour. The assessment concludes with indications for assessment and a treatment action plan. The inventory was piloted with a sample of women in secure care and assessed for comprehensibility, reliability and validity. Staff rated the St Andrews Fire and Risk Instrument (SAFARI) as acceptable to patients and easy to administer. SAFARI was found to be comprehensible by over 95% of the general population and to have good acceptance, high internal reliability, substantial test-retest reliability, and validity. SAFARI helps to provide a clear explanation of fire-setting in terms of the complex interplay of antecedents and consequences, and facilitates the design of an individually tailored treatment programme in keeping with a cognitive-behavioural approach. Further studies are needed to verify the reliability and validity of SAFARI with male populations and across settings.
Gómez-Carrasco, Susana; González-Sánchez, Lola; Aguado, Alfredo; Sanz-Sanz, Cristina; Zanchet, Alexandre; Roncero, Octavio
2012-09-07
In this work we present a dynamically biased statistical model to describe the evolution of the title reaction from a statistical to a more direct mechanism, using quasi-classical trajectories (QCT). The method is based on the one previously proposed by Park and Light [J. Chem. Phys. 126, 044305 (2007)]. A recent global potential energy surface is used here to calculate the capture probabilities, instead of the long-range ion-induced dipole interactions. The dynamical constraints are introduced through a scrambling matrix which depends on energy and determines the probabilities of the identity/hop/exchange mechanisms. These probabilities are calculated using QCT. It is found that the high zero-point energy of the fragments is transferred to the rest of the degrees of freedom, which shortens the lifetime of H(5)(+) complexes and, as a consequence, reduces the proportion of the exchange mechanism. The zero-point energy (ZPE) is not properly described in quasi-classical trajectory calculations, so an approximation is made in which the initial ZPE of the reactants is reduced in the QCT calculations to obtain a new ZPE-biased scrambling matrix. This reduction of the ZPE is justified by the need to correct the purely classical level number of the H(5)(+) complex, as done in classical simulations of unimolecular processes and to obtain equivalent quantum and classical rate constants using Rice-Ramsperger-Kassel-Marcus theory. This matrix yields a ratio of hop/exchange mechanisms, α(T), in rather good agreement with recent experimental results by Crabtree et al. [J. Chem. Phys. 134, 194311 (2011)] at room temperature. At lower temperatures, however, the present simulations predict ratios that are too high because the biased scrambling matrix is not statistical enough. This demonstrates the importance of applying quantum methods to simulate this reaction at the low temperatures of astrophysical interest.
Reliability and Probabilistic Risk Assessment - How They Play Together
NASA Technical Reports Server (NTRS)
Safie, Fayssal; Stutts, Richard; Huang, Zhaofeng
2015-01-01
Since the Space Shuttle Challenger accident in 1986, NASA has extensively used probabilistic analysis methods to assess, understand, and communicate the risk of space launch vehicles. Probabilistic Risk Assessment (PRA), used in the nuclear industry, is one of the probabilistic analysis methods NASA utilizes to assess Loss of Mission (LOM) and Loss of Crew (LOC) risk for launch vehicles. PRA is a system scenario based risk assessment that uses a combination of fault trees, event trees, event sequence diagrams, and probability distributions to analyze the risk of a system, a process, or an activity. It is a process designed to answer three basic questions: 1) what can go wrong that would lead to loss or degraded performance (i.e., scenarios involving undesired consequences of interest), 2) how likely is it (probabilities), and 3) what is the severity of the degradation (consequences). Since the Challenger accident, PRA has been used in supporting decisions regarding safety upgrades for launch vehicles. Another area that was given a lot of emphasis at NASA after the Challenger accident is reliability engineering. Reliability engineering has been a critical design function at NASA since the early Apollo days. However, after the Challenger accident, quantitative reliability analysis and reliability predictions were given more scrutiny because of their importance in understanding failure mechanism and quantifying the probability of failure, which are key elements in resolving technical issues, performing design trades, and implementing design improvements. Although PRA and reliability are both probabilistic in nature and, in some cases, use the same tools, they are two different activities. Specifically, reliability engineering is a broad design discipline that deals with loss of function and helps understand failure mechanism and improve component and system design. 
PRA is a system scenario based risk assessment process intended to assess the risk scenarios that could lead to a major/top undesirable system event, and to identify those scenarios that are high-risk drivers. PRA output is critical to support risk informed decisions concerning system design. This paper describes the PRA process and the reliability engineering discipline in detail. It discusses their differences and similarities and how they work together as complementary analyses to support the design and risk assessment processes. Lessons learned, applications, and case studies in both areas are also discussed in the paper to demonstrate and explain these differences and similarities.
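The fault-tree arithmetic behind such scenario probabilities can be sketched in a few lines; the gate structure and event probabilities below are hypothetical, chosen only to show how basic-event probabilities propagate to a top event under an independence assumption:

```python
# Minimal sketch of PRA-style probability combination: basic-event
# probabilities propagated through AND/OR gates of a fault tree.
# Events are assumed independent; all probabilities are hypothetical.

def gate_and(*ps):
    """All inputs must fail (independent events)."""
    out = 1.0
    for p in ps:
        out *= p
    return out

def gate_or(*ps):
    """At least one input fails: 1 - product of survivals."""
    out = 1.0
    for p in ps:
        out *= (1.0 - p)
    return 1.0 - out

# Top event: loss of thrust = (pump A AND pump B fail) OR control failure
p_pump = 1e-3
p_ctrl = 5e-5
p_top = gate_or(gate_and(p_pump, p_pump), p_ctrl)
print(f"P(top event) = {p_top:.2e}")
```

In a full PRA these gate probabilities feed event trees whose end states carry the consequence assessments that answer "how likely" and "how severe".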
[Experimental analysis of some determinants of inductive reasoning].
Ono, K
1989-02-01
Three experiments were conducted from a behavioral perspective to investigate the determinants of inductive reasoning and to compare some methodological differences. The dependent variable in these experiments was the threshold of confident response (TCR), defined as "the minimal sample size required to establish generalization from instances." Experiment 1 examined the effects of population size on inductive reasoning, and the results from 35 college students showed that the TCR varied in proportion to the logarithm of population size. In Experiment 2, 30 subjects showed distinct sensitivity to both prior probability and base rate. The results from the 70 subjects who participated in Experiment 3 showed that the TCR was affected by its consequences (risk condition) and, especially, that humans were sensitive to loss situations. These results demonstrate the sensitivity of humans to statistical variables in inductive reasoning. Furthermore, methodological comparison indicated that the experimentally observed values of the TCR were close to, though not as precise as, the optimal values predicted by the Bayes model. On the other hand, the subjective TCR estimated by subjects was highly discrepant from the observed TCR. These findings suggest that various aspects of inductive reasoning can be fruitfully investigated not only through subjective estimations such as likelihood judgments but also from an objective behavioral perspective.
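As a hedged illustration of the kind of Bayesian benchmark against which an observed TCR can be compared (a uniform-prior rule of succession, not necessarily the paper's exact model):

```python
# Illustrative Bayesian benchmark for a threshold of confident response
# (TCR): with a uniform Beta(1,1) prior and n consistent observations,
# P(next instance conforms) = (n + 1) / (n + 2) (Laplace's rule of
# succession).  The 0.95 criterion below is a hypothetical choice.

def tcr(criterion):
    """Smallest n of consistent instances with (n + 1)/(n + 2) >= criterion."""
    n = 0
    while (n + 1) / (n + 2) < criterion:
        n += 1
    return n

print(tcr(0.95))  # -> 18 consistent instances before confidence >= 0.95
```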
Ayas, Mouhab; Eapen, Mary; Le-Rademacher, Jennifer; Carreras, Jeanette; Abdel-Azim, Hisham; Alter, Blanche P.; Anderlini, Paolo; Battiwalla, Minoo; Bierings, Marc; Buchbinder, David K.; Bonfim, Carmem; Camitta, Bruce M.; Fasth, Anders L.; Gale, Robert Peter; Lee, Michelle A.; Lund, Troy C.; Myers, Kasiani C.; Olsson, Richard F.; Page, Kristin M.; Prestidge, Tim D.; Radhi, Mohamed; Shah, Ami J.; Schultz, Kirk R.; Wirk, Baldeep; Wagner, John E.; Deeg, H. Joachim
2015-01-01
A second allogeneic hematopoietic cell transplantation (HCT) is the only salvage option for those who develop graft failure after their first HCT. Data on outcomes after second HCT in Fanconi anemia (FA) are scarce. We report outcomes after second allogeneic HCT for FA (n=81). The indication for second HCT was graft failure after the first HCT. Transplants occurred between 1990 and 2012. The timing of the second transplantation predicted subsequent graft failure and survival. Graft failure was high when the second transplant occurred less than 3 months after the first. The 3-month probability of graft failure was 69% when the interval between first and second transplant was less than 3 months, compared to 23% when the interval was longer (p<0.001). Consequently, survival rates were substantially lower when the interval was less than 3 months: 23% at 1 year compared to 58% when the interval was longer (p=0.001). The corresponding 5-year probabilities of survival were 16% and 45%, respectively (p=0.006). Taken together, these data suggest that fewer than half of FA patients undergoing a second HCT for graft failure are long-term survivors. There is an urgent need to develop strategies to lower graft failure after the first HCT. PMID:26116087
Experimental evidence for adaptive personalities in a wild passerine bird
Nicolaus, Marion; Tinbergen, Joost M.; Bouwman, Karen M.; Michler, Stephanie P. M.; Ubels, Richard; Both, Christiaan; Kempenaers, Bart; Dingemanse, Niels J.
2012-01-01
Individuals of the same species differ consistently in risky actions. Such ‘animal personality’ variation is intriguing because behavioural flexibility is often assumed to be the norm. Recent theory predicts that between-individual differences in propensity to take risks should evolve if individuals differ in future fitness expectations: individuals with high long-term fitness expectations (i.e. that have much to lose) should behave consistently more cautious than individuals with lower expectations. Consequently, any manipulation of future fitness expectations should result in within-individual changes in risky behaviour in the direction predicted by this adaptive theory. We tested this prediction and confirmed experimentally that individuals indeed adjust their ‘exploration behaviour’, a proxy for risk-taking behaviour, to their future fitness expectations. We show for wild great tits (Parus major) that individuals with experimentally decreased survival probability become faster explorers (i.e. increase risk-taking behaviour) compared to individuals with increased survival probability. We also show, using quantitative genetics approaches, that non-genetic effects (i.e. permanent environment effects) underpin adaptive personality variation in this species. This study thereby confirms a key prediction of adaptive personality theory based on life-history trade-offs, and implies that selection may indeed favour the evolution of personalities in situations where individuals differ in future fitness expectations. PMID:23097506
Epidemiology and social costs of hip fracture.
Veronese, Nicola; Maggi, Stefania
2018-04-20
Hip fracture is an important and debilitating condition in older people, particularly in women. The epidemiological data vary between countries, but it is globally estimated that hip fractures will affect around 18% of women and 6% of men. Although the age-standardised incidence is gradually falling in many countries, this is far outweighed by the ageing of the population. Thus, the global number of hip fractures is expected to increase from 1.26 million in 1990 to 4.5 million by the year 2050. The direct costs associated with this condition are enormous, since it requires a long period of hospitalisation and subsequent rehabilitation. Furthermore, hip fracture is associated with the development of other negative consequences, such as disability, depression, and cardiovascular diseases, with additional costs for society. In this review, we present the most recent epidemiological data regarding hip fracture, indicating the well-known risk factors and conditions that seem relevant for determining this condition. A specific section is dedicated to the social costs of hip fracture. Although the direct costs of hip fracture are probably comparable to those of other common diseases with a high hospitalisation rate (e.g. cardiovascular disease), the other social costs (due to the onset of new co-morbidities, sarcopenia, poor quality of life, disability and mortality) are probably greater. Copyright © 2018 Elsevier Ltd. All rights reserved.
Sex and boldness explain individual differences in spatial learning in a lizard.
Carazo, Pau; Noble, Daniel W A; Chandrasoma, Dani; Whiting, Martin J
2014-05-07
Understanding individual differences in cognitive performance is a major challenge to animal behaviour and cognition studies. We used the Eastern water skink (Eulamprus quoyii) to examine associations between exploration, boldness and individual variability in spatial learning, a dimension of lizard cognition with important bearing on fitness. We show that males perform better than females in a biologically relevant spatial learning task. This is the first evidence for sex differences in learning in a reptile, and we argue that it is probably owing to sex-specific selective pressures that may be widespread in lizards. Across the sexes, we found a clear association between boldness after a simulated predatory attack and the probability of learning the spatial task. In contrast to previous studies, we found a nonlinear association between boldness and learning: both 'bold' and 'shy' behavioural types were more successful learners than intermediate males. Our results do not fit with recent predictions suggesting that individual differences in learning may be linked with behavioural types via high-low-risk/reward trade-offs. We suggest the possibility that differences in spatial cognitive performance may arise in lizards as a consequence of the distinct environmental variability and complexity experienced by individuals as a result of their sex and social tactics.
Mention effect in information diffusion on a micro-blogging network.
Bao, Peng; Shen, Hua-Wei; Huang, Junming; Chen, Haiqiang
2018-01-01
Micro-blogging systems have become one of the most important channels for information sharing. Network structure and user interactions such as forwarding behaviors have attracted considerable research attention, while mention, a key feature of micro-blogging platforms that can improve the visibility of a message and direct it to a particular user beyond the underlying social structure, has seldom been studied in previous works. In this paper, we empirically study the mention effect in information diffusion, using a dataset from a population-scale social media website. We find that users with a high number of followers receive many more mentions than others. We further investigate the effect of mention in information diffusion by examining the response probability with respect to the number of mentions in a message and observe a saturation at around 5 mentions. Furthermore, we find that the response probability is highest when a reciprocal followship exists between users, and that one is more likely to receive a target user's response if they have similar social status. To illustrate these findings, we propose the response prediction task and formulate it as a binary classification problem. Extensive evaluation demonstrates the effectiveness of the discovered factors. Our results have consequences for the understanding of human dynamics on social networks, and potential implications for viral marketing and public opinion monitoring.
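The response-prediction task described above can be framed as logistic regression over the factors the study identifies (number of mentions, reciprocal followship, status similarity). The sketch below trains such a model on synthetic data; the feature coefficients and data-generating process are illustrative assumptions, not values from the study.

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_logreg(X, y, lr=0.1, epochs=500):
    """Batch gradient descent for logistic regression (w[0] is the bias)."""
    w = [0.0] * (len(X[0]) + 1)
    n = len(X)
    for _ in range(epochs):
        grad = [0.0] * len(w)
        for xi, yi in zip(X, y):
            p = sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi)))
            err = p - yi
            grad[0] += err
            for j, xj in enumerate(xi):
                grad[j + 1] += err * xj
        w = [wj - lr * g / n for wj, g in zip(w, grad)]
    return w

def predict(w, xi):
    return sigmoid(w[0] + sum(wj * xj for wj, xj in zip(w[1:], xi)))

# Synthetic features: [mentions (capped at 5 to mimic the observed saturation),
# reciprocal followship (0/1), status similarity (0..1)]. Coefficients arbitrary.
random.seed(42)
X, y = [], []
for _ in range(400):
    mentions = min(random.randint(1, 10), 5)
    reciprocal = random.randint(0, 1)
    similarity = random.random()
    p_true = sigmoid(-3.0 + 0.4 * mentions + 1.5 * reciprocal + 2.0 * similarity)
    X.append([mentions, reciprocal, similarity])
    y.append(1 if random.random() < p_true else 0)

w = train_logreg(X, y)
```

A trained model of this shape would output a response probability per message, which can then be thresholded for the binary classification task.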
The antecedents and prevention of unwanted pregnancy.
Gerrard, M; McCann, L; Geis, B D
1983-01-01
Much of the research on the antecedents and consequences of birth control has focused on teenagers and members of racial minority groups, but the trends in contraceptive use indicate that the danger of unwanted pregnancy exists for most women throughout the childbearing ages of 14-45: for white and middle-class women as well as minority women and women from lower socioeconomic levels. There are basically 4 choices open to the unmarried woman who conceives: induced abortion, giving the child up for adoption, keeping the child without marrying, and marriage. There are little data on the mental health consequences of giving a child up for adoption, but there is no question that the experience is at the very least upsetting and may cause long-term trauma. Induced abortion is less traumatic, both physically and psychologically, than carrying a pregnancy to term; however, many women suffer from long-term depression following the procedure. The social, economic, and psychological consequences of single motherhood are clearly documented for both teenagers and older women. The most frequently cited problems are delayed or truncated emotional and social activities, unemployment, and role overload resulting from the responsibility of caring for a child without the support of a spouse. The pregnant teenager who does marry has a 50% probability of divorce within 4 years, and even if the couple does stay married they suffer some adverse consequences. In 1978 Zelnick and Kantner estimated that it would be possible to reduce the number of premarital pregnancies, and presumably their psychological and economic consequences, by at least 40% if all sexually active young women were to use a contraceptive method and to use it consistently. If the majority of all sexually active women were to use the most reliable methods of contraception, the unwanted pregnancy rate would be reduced even more markedly.
Yet, reliable contraceptive behavior involves a complex sequence of psychological and behavioral events including awareness of the risk of becoming pregnant, obtaining adequate information about contraception, making decisions about contraceptive use, acquiring contraception, and regular and consistent use of a reliable contraceptive method. The literature on the psychological antecedents of contraceptive behavior clearly characterizes ineffective female contraceptors as unaccepting of their own sexuality and as having negative attitudes toward most matters pertaining to sex. Their attitudes and emotions include irrational fears about specific contraceptives, conflicting attitude and belief systems about birth control in general, and guilt. Implicit in this profile is an inability to think rationally about the high probability that unprotected sex will result in conception and an inability to engage in rational decision making about birth control. Yet, review of the prevention programs currently available reveals that the vast majority are designed to serve self-motivated women. Given that these programs have already been demonstrated to be effective, it is time to direct attention to exploring ways to reach those women who will not take adequate precautions without first experiencing changes in their attitudinal and emotional responses to sex.
M≥7 Earthquake rupture forecast and time-dependent probability for the Sea of Marmara region, Turkey
Murru, Maura; Akinci, Aybige; Falcone, Guiseppe; Pucci, Stefano; Console, Rodolfo; Parsons, Thomas E.
2016-01-01
We forecast time-independent and time-dependent earthquake ruptures in the Marmara region of Turkey for the next 30 years using a new fault-segmentation model. We also augment time-dependent Brownian Passage Time (BPT) probability with static Coulomb stress changes (ΔCFF) from interacting faults. We calculate Mw > 6.5 probability from 26 individual fault sources in the Marmara region. We also consider a multisegment rupture model that allows higher-magnitude ruptures over some segments of the Northern branch of the North Anatolian Fault Zone (NNAF) beneath the Marmara Sea. A total of 10 different Mw=7.0 to Mw=8.0 multisegment ruptures are combined with the other regional faults at rates that balance the overall moment accumulation. We use Gaussian random distributions to treat parameter uncertainties (e.g., aperiodicity, maximum expected magnitude, slip rate, and consequently mean recurrence time) of the statistical distributions associated with each fault source. We then estimate uncertainties of the 30-year probability values for the next characteristic event obtained from three different models (Poisson, BPT, and BPT+ΔCFF) using a Monte Carlo procedure. The Gerede fault segment, located at the eastern end of the Marmara region, shows the highest 30-year probability, with a Poisson value of 29% and a time-dependent interaction probability of 48%. We find an aggregated 30-year Poisson probability of 35% for M > 7.3 earthquakes at Istanbul, which increases to 47% if time dependence and stress transfer are considered. We calculate a 2-fold probability gain (the ratio of the time-dependent to the time-independent estimate) on the southern strands of the North Anatolian Fault Zone.
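As an illustration of how a BPT renewal model yields a time-dependent 30-year probability alongside a Poisson baseline, the sketch below implements the standard BPT (inverse Gaussian) CDF. The mean recurrence time, aperiodicity, and elapsed time used here are hypothetical values, not parameters from this study.

```python
import math

def phi(x):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def bpt_cdf(t, mean, alpha):
    """CDF of the Brownian Passage Time (inverse Gaussian) distribution
    with mean recurrence time `mean` and aperiodicity `alpha`."""
    if t <= 0:
        return 0.0
    u1 = (math.sqrt(t / mean) - math.sqrt(mean / t)) / alpha
    u2 = (math.sqrt(t / mean) + math.sqrt(mean / t)) / alpha
    return phi(u1) + math.exp(2.0 / alpha**2) * phi(-u2)

def conditional_prob(elapsed, window, mean, alpha):
    """P(event within the next `window` years | `elapsed` years since last event)."""
    f_t = bpt_cdf(elapsed, mean, alpha)
    f_tw = bpt_cdf(elapsed + window, mean, alpha)
    return (f_tw - f_t) / (1.0 - f_t)

def poisson_prob(window, mean):
    """Time-independent probability for the same mean rate."""
    return 1.0 - math.exp(-window / mean)

# Hypothetical fault: 250-year mean recurrence, aperiodicity 0.5,
# 200 years elapsed since the last characteristic event.
p_bpt = conditional_prob(200.0, 30.0, 250.0, 0.5)
p_poisson = poisson_prob(30.0, 250.0)
```

With these assumed numbers the renewal probability exceeds the Poisson one late in the cycle, which is the kind of "probability gain" the abstract reports; the actual study additionally perturbs the parameters by Monte Carlo and adds ΔCFF stress-interaction terms.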
TU-AB-BRB-01: Coverage Evaluation and Probabilistic Treatment Planning as a Margin Alternative
DOE Office of Scientific and Technical Information (OSTI.GOV)
Siebers, J.
The accepted clinical method to accommodate targeting uncertainties inherent in fractionated external beam radiation therapy is to utilize GTV-to-CTV and CTV-to-PTV margins during the planning process to design a PTV-conformal static dose distribution on the planning image set. Ideally, margins are selected to ensure a high (e.g. >95%) target coverage probability (CP) in spite of inherent inter- and intra-fractional positional variations, tissue motions, and initial contouring uncertainties. Robust optimization techniques, also known as probabilistic treatment planning techniques, explicitly incorporate the dosimetric consequences of targeting uncertainties by including CP evaluation into the planning optimization process along with coverage-based planning objectives. The treatment planner no longer needs to use PTV and/or PRV margins; instead robust optimization utilizes probability distributions of the underlying uncertainties in conjunction with CP-evaluation for the underlying CTVs and OARs to design an optimal treated volume. This symposium will describe CP-evaluation methods as well as various robust planning techniques including use of probability-weighted dose distributions, probability-weighted objective functions, and coverage optimized planning. Methods to compute and display the effect of uncertainties on dose distributions will be presented. The use of robust planning to accommodate inter-fractional setup uncertainties, organ deformation, and contouring uncertainties will be examined as will its use to accommodate intra-fractional organ motion. Clinical examples will be used to inter-compare robust and margin-based planning, highlighting advantages of robust plans in terms of target and normal tissue coverage. Robust-planning limitations as uncertainties approach zero and as the number of treatment fractions becomes small will be presented, as well as the factors limiting clinical implementation of robust planning.
Learning Objectives: To understand robust planning as a clinical alternative to margin-based planning. To understand conceptual differences between uncertainty and predictable motion. To understand fundamental limitations of the PTV concept that probabilistic planning can overcome. To understand the major contributing factors to target and normal tissue coverage probability. To understand the similarities and differences of various robust planning techniques. To understand the benefits and limitations of robust planning techniques.
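A minimal sketch of coverage-probability (CP) evaluation under a deliberately simplified geometry: a rigid, isotropic Gaussian setup error, so the CTV is covered exactly when the shift magnitude stays within the margin. The 3 mm error magnitude and the margin values are assumptions for illustration, not clinical recommendations.

```python
import math
import random

def coverage_probability(margin_mm, sigma_mm, n_trials=20000, seed=1):
    """Monte Carlo fraction of simulated treatments in which a rigid 3-D
    setup shift stays within the margin, so the CTV remains inside the
    treated volume. Toy model: isotropic Gaussian error, rigid shift only."""
    rng = random.Random(seed)
    covered = 0
    for _ in range(n_trials):
        dx, dy, dz = (rng.gauss(0.0, sigma_mm) for _ in range(3))
        if math.sqrt(dx * dx + dy * dy + dz * dz) <= margin_mm:
            covered += 1
    return covered / n_trials

# Coverage probability versus margin for an assumed 3 mm setup error
cps = {m: coverage_probability(m, sigma_mm=3.0) for m in (3.0, 5.0, 8.0)}
```

Because the same random shifts are reused for each margin (fixed seed), the estimated CP is strictly increasing with margin size, mirroring the trade-off that margin recipes and robust optimization both navigate.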
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, H.
TU-AB-BRB-02: Stochastic Programming Methods for Handling Uncertainty and Motion in IMRT Planning
DOE Office of Scientific and Technical Information (OSTI.GOV)
Unkelbach, J.
TU-AB-BRB-00: New Methods to Ensure Target Coverage
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
2015-06-15
Propagules of arbuscular mycorrhizal fungi in a secondary dry forest of Oaxaca, Mexico.
Guadarrama, Patricia; Castillo-Argüero, Silvia; Ramos-Zapata, José A; Camargo-Ricalde, Sara L; Alvarez-Sánchez, Javier
2008-03-01
Plant cover loss due to changes in land use promotes a decrease in the spore diversity of arbuscular mycorrhizal fungi (AMF), in viable mycelium and, therefore, in AMF colonization; this influences community diversity and, as a consequence, community recovery. To evaluate different AMF propagules, nine plots in a tropical dry forest with secondary vegetation were selected: 0, 1, 7, 10, 14, 18, 22, 25, and 27 years after abandonment in Nizanda, Oaxaca, Mexico. The secondary vegetation at different stages of development is a consequence of slash-and-burn agriculture and subsequent abandonment. Soil samples (six per plot) were collected, and the percentage of AMF field colonization, extraradical mycelium, viable spore density, infectivity, and most probable number (MPN) of AMF propagules were quantified through a bioassay. Means for field colonization ranged between 40% and 70%, and mean total mycelium length was 15.7 +/- 1.88 m g(-1) dry soil, with significant differences between plots; however, more than 40% of the extracted mycelium was not viable. Between 60 and 456 spores per 100 g of dry soil were recorded, but more than 64% showed some kind of damage. Infectivity values fluctuated between 20% and 50%, while MPN showed a mean value of 85.42 +/- 44.17 propagules per 100 g dry soil. We conclude that secondary communities generated by the elimination of vegetation for agricultural purposes in this dry forest in Nizanda do not show elimination of propagules, probably as a consequence of the low-input agricultural practices in this area, which may encourage natural regeneration.
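The most probable number (MPN) reported above is conventionally obtained by maximum likelihood from a dilution-series bioassay. Below is a self-contained sketch of that estimator; the dilution levels and positive-pot counts are hypothetical illustrations, not the study's data.

```python
import math

def mpn_score(lam, levels):
    """Derivative of the MPN log-likelihood; positive below the ML root.
    levels: list of (amount_per_pot, n_pots, n_positive)."""
    s = 0.0
    for m, n, g in levels:
        x = lam * m
        if g > 0 and x < 700.0:          # math.expm1 overflows near 710
            s += g * m / math.expm1(x)
        s -= (n - g) * m
    return s

def mpn_estimate(levels, lo=1e-9, hi=1e6):
    """Maximum-likelihood MPN (propagules per unit of soil) via bisection
    on a log scale. Requires at least one positive and one negative pot."""
    for _ in range(200):
        mid = math.sqrt(lo * hi)
        if mpn_score(mid, levels) > 0:
            lo = mid
        else:
            hi = mid
    return math.sqrt(lo * hi)

# Hypothetical dilution series: grams of dry soil per pot, pots per level,
# pots showing AMF colonization (illustrative values only).
levels = [(10.0, 5, 5), (1.0, 5, 4), (0.1, 5, 1)]
mpn = mpn_estimate(levels)  # propagules per gram of dry soil
```

In practice MPN tables (e.g. 5-tube tables) tabulate the same maximum-likelihood solution; the bisection above just solves the score equation directly.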
Adult consequences of growth failure in early childhood123
Hoddinott, John; Behrman, Jere R; Maluccio, John A; Melgar, Paul; Quisumbing, Agnes R; Ramirez-Zea, Manuel; Stein, Aryeh D; Yount, Kathryn M
2013-01-01
Background: Growth failure is associated with adverse consequences, but studies need to control adequately for confounding. Objective: We related height-for-age z scores (HAZs) and stunting at age 24 mo to adult human capital, marriage, fertility, health, and economic outcomes. Design: In 2002–2004, we collected data from 1338 Guatemalan adults (aged 25–42 y) who were studied as children in 1969–1977. We used instrumental variable regression to correct for estimation bias and adjusted for potentially confounding factors. Results: A 1-SD increase in HAZ was associated with more schooling (0.78 grades), higher test scores for reading and nonverbal cognitive skills (0.28 and 0.25 SDs, respectively), characteristics of marriage partners (partners 1.39 y older, with 1.02 grades more schooling, and 1.01 cm taller) and, for women, a higher age at first birth (0.77 y) and fewer pregnancies and children (0.63 and 0.43 fewer, respectively). A 1-SD increase in HAZ was associated with increased household per capita expenditure (21%) and a lower probability of living in poverty (10 percentage points). Conversely, being stunted at 2 y was associated with less schooling, lower test performance, lower household per capita expenditure, and an increased probability of living in poverty. For women, stunting was associated with a lower age at first birth and a higher number of pregnancies and children. There was little relation between either HAZ or stunting and adult health. Conclusion: Growth failure in early life has profound adverse consequences over the life course on human, social, and economic capital. PMID:24004889
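The instrumental-variable correction mentioned in the design can be illustrated with the single-instrument (Wald) estimator, which coincides with two-stage least squares in this case. The data below are synthetic, with an unobserved confounder that biases OLS but not IV; all coefficients are arbitrary assumptions, not estimates from the study.

```python
import random

def mean(v):
    return sum(v) / len(v)

def cov(a, b):
    ma, mb = mean(a), mean(b)
    return sum((x - ma) * (y - mb) for x, y in zip(a, b)) / len(a)

def iv_estimate(z, x, y):
    """Single-instrument IV (Wald) estimator: cov(z, y) / cov(z, x).
    Consistent when z affects y only through x and is independent of the
    confounder; identical to 2SLS with one instrument."""
    return cov(z, y) / cov(z, x)

def ols_estimate(x, y):
    return cov(x, y) / cov(x, x)

random.seed(7)
n = 50000
beta = 0.78  # "true" effect of x on y in the synthetic model
z, x, y = [], [], []
for _ in range(n):
    zi = random.gauss(0, 1)                            # instrument
    ui = random.gauss(0, 1)                            # unobserved confounder
    xi = 0.8 * zi + ui + random.gauss(0, 0.5)          # exposure
    yi = beta * xi + 1.2 * ui + random.gauss(0, 0.5)   # outcome, confounded
    z.append(zi); x.append(xi); y.append(yi)

b_iv = iv_estimate(z, x, y)    # approximately recovers beta
b_ols = ols_estimate(x, y)     # biased upward by the confounder
```

The contrast between `b_iv` and `b_ols` is exactly the estimation bias the study's IV design is meant to remove.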
Methodology of risk assessment of loss of water resources due to climate changes
NASA Astrophysics Data System (ADS)
Israfilov, Yusif; Israfilov, Rauf; Guliyev, Hatam; Afandiyev, Galib
2016-04-01
For the sustainable development and rational management of the water resources of the Republic of Azerbaijan, it is essential to forecast changes in those resources under different climate-change scenarios and to assess the possible risks of losing parts of them. Most of the Azerbaijani territory lies in an arid climate, and the vast majority of water is used in national economic production. Optimal use of conditioned groundwater and surface water is of great strategic importance for the country's economy given the overall shortage of water resources. Low annual precipitation, high evaporation, and complex natural and hydrogeological conditions hinder the sustained formation of conditioned ground and surface water resources. In addition, fresh water resources are unevenly distributed across the Azerbaijani territory. The deficit in the overall water balance creates tension in the rational use of fresh water in various sectors of the national economy, especially agriculture, and consequently in the food security of the republic. Moreover, the republic's fresh water resources depend directly on climatic factors: 75-85% of the stratum-pore groundwater resources of the piedmont plains and the fracture-vein water of the mountain regions are formed by the infiltration of rainfall and condensate water. Changes in climate parameters alter the hydrological cycle of the hydrosphere and, as a rule, are reflected in these resources. Forecasting changes in water resources under different climate-change scenarios with regional mathematical models allowed us to estimate the strength of this relationship and to improve the quality of decisions.
At the same time, additional data are needed for the assessment and management of the risk of declining water resources: for detailed analysis, for forecasting the quantitative and qualitative parameters of the resources, and for optimizing water use. We have therefore developed a risk-assessment methodology that includes statistical and fuzzy analysis of the probability-consequence relationship, classification of probabilities, and classification of consequences by degree of severity and risk. The methodology makes the results practically usable and can assist the sustainable development and risk reduction of the republic's optimal water use and, as a consequence, the national strategy of economic development.
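The probability-consequence classification at the heart of such a methodology can be sketched as a simple crisp risk matrix; the fuzzy-statistical machinery of the full methodology is beyond an abstract, and the class names and thresholds below are assumptions, not the authors' scales.

```python
# Ordinal probability and consequence-severity classes (assumed scales).
PROB_CLASSES = ["very_low", "low", "medium", "high", "very_high"]
SEVERITY_CLASSES = ["negligible", "minor", "moderate", "major", "catastrophic"]

def risk_level(prob_class, severity_class):
    """Classify risk from a probability-consequence pair using a simple
    additive risk matrix with three output bands (low / medium / high)."""
    score = PROB_CLASSES.index(prob_class) + SEVERITY_CLASSES.index(severity_class)
    if score <= 2:
        return "low"
    if score <= 5:
        return "medium"
    return "high"

# Example: a low-probability but catastrophic loss of an aquifer section
level = risk_level("low", "catastrophic")
```

A fuzzy version would replace the hard `index` lookups with membership functions over measured probabilities and damage estimates, but the matrix structure is the same.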
Skier triggering of backcountry avalanches with skilled route selection
NASA Astrophysics Data System (ADS)
Sinickas, Alexandra; Haegeli, Pascal; Jamieson, Bruce
2015-04-01
Jamieson (2009) provided numerical estimates for the baseline probabilities of triggering an avalanche by a backcountry skier making fresh tracks without skilled route selection as a function of the North American avalanche danger scale (i.e., hazard levels Low, Moderate, Considerable, High and Extreme). Using the results of an expert survey, he showed that triggering probabilities while skiing directly up, down or across a trigger zone without skilled route selection increase roughly by a factor of 10 with each step of the North American avalanche danger scale (i.e. hazard level). The objective of the present study is to examine the effect of skilled route selection on the relationship between triggering probability and hazard level. To assess the effect of skilled route selection on triggering probability by hazard level, we analysed avalanche hazard assessments as well as reports of skiing activity and triggering of avalanches from 11 Canadian helicopter and snowcat operations during two winters (2012-13 and 2013-14). These reports were submitted to the daily information exchange among Canadian avalanche safety operations, and reflect professional decision-making and route selection practices of guides leading groups of skiers. We selected all skier-controlled or accidentally triggered avalanches with a destructive size greater than size 1 according to the Canadian avalanche size classification, triggered by any member of a guided group (guide or guest). These operations forecast the avalanche hazard daily for each of three elevation bands: alpine, treeline and below treeline. In contrast to the 2009 study, an exposure was defined as a group skiing within any one of the three elevation bands, and consequently within a hazard rating, for the day (~4,300 ratings over two winters). 
For example, a group that skied below treeline (rated Moderate) and at treeline (rated Considerable) in one day would receive one count of exposure to Moderate hazard and one count of exposure to Considerable hazard. While the absolute values for triggering probability cannot be compared to the 2009 study because of the different definitions of exposure, our preliminary results suggest that with skilled route selection the triggering probability is similar across all hazard levels, except for Extreme, for which there are few exposures. This suggests that the guiding teams of backcountry skiing operations effectively control the hazard of triggering avalanches through skilled route selection. Groups were exposed relatively evenly to Low hazard (1275 times, or 29% of total exposure), Moderate hazard (1450 times, or 33%) and Considerable hazard (1215 times, or 28%). At higher levels, exposure dropped to roughly 380 times (9% of total exposure) for High hazard and only 13 times (0.3%) for Extreme hazard. We assess the sensitivity of the results to some of our key assumptions.
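Triggering probability per hazard level is a binomial proportion estimated from few events over many exposures, so interval estimates matter, especially for the sparsely observed Extreme level. The sketch below uses the Wilson score interval; the exposure counts come from the abstract, but the per-level trigger counts are hypothetical placeholders, not the study's data.

```python
import math

def wilson_interval(k, n, z=1.96):
    """95% Wilson score interval for a binomial proportion (k events in n trials)."""
    if n == 0:
        return (0.0, 1.0)
    p = k / n
    denom = 1.0 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n))
    return (centre - half, centre + half)

# Exposure counts per hazard level, as reported in the abstract.
exposures = {"Low": 1275, "Moderate": 1450, "Considerable": 1215,
             "High": 380, "Extreme": 13}
# Hypothetical numbers of triggered avalanches per level (illustrative only).
triggers = {"Low": 3, "Moderate": 6, "Considerable": 5, "High": 2, "Extreme": 0}

intervals = {lvl: wilson_interval(triggers[lvl], exposures[lvl])
             for lvl in exposures}
```

With only 13 exposures, the Extreme interval is far wider than the others, which is why the abstract excludes that level from the "similar across hazard levels" conclusion.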
Human resource crises in German hospitals--an explorative study.
Schermuly, Carsten C; Draheim, Michael; Glasberg, Ronald; Stantchev, Vladimir; Tamm, Gerrit; Hartmann, Michael; Hessel, Franz
2015-05-28
The complexity of providing medical care in a high-tech environment with a highly specialized, limited labour force makes hospitals more crisis-prone than other industries. An effective defence against crises is only possible if organizational resilience and the capacity to handle crises become part of the hospitals' organizational culture. To become more resilient to crises, raised awareness, especially in the area of human resources (HR), is necessary. The aim of this paper is to contribute to robustness against crises through the identification and evaluation of relevant HR crises and their causes in hospitals. Qualitative and quantitative methods were combined to identify and evaluate crises in the hospital HR sector. A structured workshop with experts was conducted to identify HR crises and their descriptions, as well as causes and consequences for patients and hospitals. To evaluate the findings, an online survey was carried out to rate the occurrence (past, future) and dangerousness of each crisis. Six HR crises were identified in this study: staff shortages, acute loss of personnel following a pandemic, damage to reputation, insufficient communication during restructuring, bullying, and misuse of drugs. The highest future occurrence probability was seen for staff shortages, followed by acute loss of personnel following a pandemic. Staff shortages, damage to reputation, and acute loss of personnel following a pandemic were seen as the most dangerous crises. The study concludes that coping with HR crises is existential for hospitals and requires increased awareness. The six HR crises identified occurred regularly in German hospitals in the past, and their occurrence probability for the future was rated as high.
Mechanistic Assessment of DNA Ligase as an Antibacterial Target in Staphylococcus aureus
Podos, Steven D.; Thanassi, Jane A.
2012-01-01
We report the use of a known pyridochromanone inhibitor with antibacterial activity to assess the validity of NAD+-dependent DNA ligase (LigA) as an antibacterial target in Staphylococcus aureus. Potent inhibition of purified LigA was demonstrated in a DNA ligation assay (inhibition constant [Ki] = 4.0 nM) and in a DNA-independent enzyme adenylation assay using full-length LigA (50% inhibitory concentration [IC50] = 28 nM) or its isolated adenylation domain (IC50 = 36 nM). Antistaphylococcal activity was confirmed against methicillin-susceptible and -resistant S. aureus (MSSA and MRSA) strains (MIC = 1.0 μg/ml). Analysis of spontaneous resistance potential revealed a high frequency of emergence (4 × 10−7) of high-level resistant mutants (MIC > 64) with associated ligA lesions. There were no observable effects on growth rate in these mutants. Of 22 sequenced clones, 3 encoded point substitutions within the catalytic adenylation domain and 19 in the downstream oligonucleotide-binding (OB) fold and helix-hairpin-helix (HhH) domains. In vitro characterization of the enzymatic properties of four selected mutants revealed distinct signatures underlying their resistance to inhibition. The infrequent adenylation domain mutations altered the kinetics of adenylation and probably elicited resistance directly. In contrast, the highly represented OB fold domain mutations demonstrated a generalized resistance mechanism in which covalent LigA activation proceeds normally and yet the parameters of downstream ligation steps are altered. A resulting decrease in substrate Km and a consequent increase in substrate occupancy render LigA resistant to competitive inhibition. We conclude that the observed tolerance of staphylococcal cells to such hypomorphic mutations probably invalidates LigA as a viable target for antistaphylococcal chemotherapy. PMID:22585221
Potential Standards and Methods for the National Guard’s Homeland Response Force
2011-09-01
rapidly determine a missile launch and probable impact area (Opall-Rome, 2009). Since 2006, Color Red coverage has expanded throughout the country...Manportable Air Defense (MANPAD) systems, land mines, advanced communication systems, mortars, unmanned air systems (UAS), frequency-hopping...Consequence Management Response Force (CCMRF). Internal document. Opall-Rome, B. (2009, January 19). In Israel: Anti-sniper gear spots rockets
B. M. Tkacz; H. H. Burdsall; G. A. DeNitto; A. Eglitis; J. B. Hanson; J. T. Kliejunas; W. E. Wallner; J. G. O`Brien; E. L. Smith
1998-01-01
The unmitigated pest risk potential for the importation of Pinus and Abies logs from all states of Mexico into the United States was assessed by estimating the probability and consequences of establishment of representative insects and pathogens of concern. Twenty-two individual pest risk assessments were prepared for Pinus logs, twelve dealing with insects and ten...
Improved Methodology for Developing Cost Uncertainty Models for Naval Vessels
2008-09-01
Growth: Last 700 Years (From: Deegan, 2007b) ................13 Figure 3. Business Rules to Consider: Choosing an acceptable cost risk point...requires an understanding of consequence (From: Deegan, 2007b)...............16 Figure 4. Basic Steps in Estimating Probable Systems Cost (From: Book...her guidance and assistance in the development of this thesis. Additionally, I thank Mr. Chris Deegan, the former Director of Cost Engineering and
ERIC Educational Resources Information Center
Livingston, Robert B.; And Others
The degree to which undernourishment exists in a local community such as San Diego, California, and in the U.S. at large, and whether it is severe enough to interfere with brain development, is the focus of this report. After establishing criteria for nutrition intake that would represent unambiguous jeopardy to brain development, these criteria…
The case for cellulose production on Mars
NASA Technical Reports Server (NTRS)
Volk, Tyler; Rummel, John D.
1989-01-01
Examining the consequences of not requiring that all life-support wastes be recycled back to the food plants leads to the conclusion that cellulose production could be an important input for many nonmetabolic material requirements on Mars. The fluxes of carbon in cellulose production would probably exceed those in food production, and settlements on Mars could therefore use cellulose farms in building a Mars infrastructure.
How are flood risk estimates affected by the choice of return-periods?
NASA Astrophysics Data System (ADS)
Ward, P. J.; de Moel, H.; Aerts, J. C. J. H.
2011-12-01
Flood management is increasingly adopting a risk-based approach, whereby flood risk is the product of the probability and consequences of flooding. One of the most common approaches in flood risk assessment is to estimate the damage that would occur for floods of several exceedance probabilities (or return periods), to plot these on an exceedance probability-loss curve (risk curve), and to estimate risk as the area under the curve. However, there is little insight into how the selection of the return periods (which ones and how many) used to calculate risk actually affects the final risk calculation. To gain such insights, we developed and validated an inundation model capable of rapidly simulating inundation extent and depth, and dynamically coupled this to an existing damage model. The method was applied to a section of the River Meuse in the southeast of the Netherlands. Firstly, we estimated risk based on a risk curve using yearly return periods from 2 to 10 000 yr (€34 million p.a.). We found that the overall risk is greatly affected by the number of return periods used to construct the risk curve, with over-estimations of annual risk between 33% and 100% when only three return periods are used. In addition, binary assumptions on dike failure can have a large effect (a factor-two difference) on risk estimates. Also, the minimum and maximum return periods considered in the curve affect the risk estimate considerably. The results suggest that more research is needed to develop relatively simple inundation models that can be used to produce large numbers of inundation maps, complementary to more complex 2-D/3-D hydrodynamic models. It also suggests that research into flood risk could benefit from paying more attention to the damage caused by relatively high-probability floods.
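The area-under-the-curve risk estimate described in this abstract can be sketched with trapezoidal integration; the return periods and damage figures below are hypothetical placeholders, not the Meuse values:

```python
# Hypothetical damages (million EUR) at selected return periods (years);
# illustrative numbers only, not the paper's estimates.
return_periods = [10, 100, 1000, 10000]
damages = [50.0, 400.0, 900.0, 1500.0]

# Annual exceedance probability is the reciprocal of the return period.
probs = [1.0 / T for T in return_periods]

def annual_risk(probs, losses):
    """Expected annual damage: trapezoidal area under the
    exceedance probability-loss (risk) curve."""
    pts = sorted(zip(probs, losses))  # ascending probability
    return sum((p1 - p0) * (d0 + d1) / 2.0
               for (p0, d0), (p1, d1) in zip(pts, pts[1:]))

risk_all = annual_risk(probs, damages)
# Rebuilding the curve from only the two extreme return periods changes
# the estimate markedly, illustrating the sensitivity the abstract reports.
risk_coarse = annual_risk([probs[0], probs[-1]], [damages[0], damages[-1]])
```

With these numbers the two-point curve over-estimates annual risk by roughly a factor of three, which mirrors the paper's finding that sparse risk curves can substantially bias the estimate.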
Design with brittle materials - An interdisciplinary educational program
NASA Technical Reports Server (NTRS)
Mueller, J. I.; Bollard, R. J. H.; Hartz, B. J.; Kobayashi, A. S.; Love, W. J.; Scott, W. D.; Taggart, R.; Whittemore, O. J.
1980-01-01
A series of interdisciplinary design courses offered to senior and graduate engineering students at the University of Washington is described. Attention is given to the concepts and some of the details of group design projects undertaken during the past two years. It is noted that ceramic materials normally show large scatter in strength properties. Consequently, when designing with these materials, the conventional 'mil standards' design stresses with acceptable margins of safety cannot be employed, and the designer is forced to accept a probable number of failures in structures of a given brittle material. It is this prediction of the probability of failure for structures of given, well-characterized materials that forms the basis for this series of courses.
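The "probable number of failures" approach to brittle design is conventionally formalized with Weibull strength statistics; a minimal sketch, where the characteristic strength and modulus are illustrative values, not figures from the courses:

```python
import math

def weibull_failure_probability(stress, sigma_0, m):
    """Two-parameter Weibull model: probability that a specimen fails
    at or below the applied stress.
    sigma_0: characteristic strength (stress at 63.2% failure probability)
    m: Weibull modulus; a low m means wide scatter, typical of ceramics."""
    return 1.0 - math.exp(-((stress / sigma_0) ** m))

# Illustrative values: characteristic strength 300 MPa, modulus 10.
# At a design stress of 200 MPa, roughly 1.7% of parts would be
# expected to fail, a risk the designer must accept rather than
# eliminate with a single margin of safety.
pf = weibull_failure_probability(200.0, 300.0, 10)
```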
A New Empirical Constraint on the Prevalence of Technological Species in the Universe.
Frank, A; Sullivan, W T
2016-05-01
In this article, we address the cosmic frequency of technological species. Recent advances in exoplanet studies provide strong constraints on all astrophysical terms in the Drake equation. Using these and modifying the form and intent of the Drake equation, we set a firm lower bound on the probability that one or more technological species have evolved anywhere and at any time in the history of the observable Universe. We find that as long as the probability that a habitable zone planet develops a technological species is larger than ∼10^-24, humanity is not the only time technological intelligence has evolved. This constraint has important scientific and philosophical consequences. Key words: Life, Intelligence, Extraterrestrial life. Astrobiology 2016, 359-362.
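The style of bound described above can be sketched as a back-of-the-envelope calculation; the habitable-zone planet count below is an assumed round number for illustration, and the paper's exact figures may differ:

```python
# If N habitable-zone planets each independently develop a technological
# species with probability p, the expected number of such species is N * p.
# Assumed count of habitable-zone planets in the observable Universe
# (illustrative order of magnitude, not the paper's exact value):
N_hz_planets = 2e22

# For the expected count to reach one, p must exceed 1 / N; below that
# threshold, humanity would likely be the only occurrence.
p_threshold = 1.0 / N_hz_planets
print(f"threshold probability per planet ~ {p_threshold:.0e}")
```

With this planet count the threshold lands in the 10^-23 to 10^-24 range, the same order as the abstract's ∼10^-24 bound.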
Decision making generalized by a cumulative probability weighting function
NASA Astrophysics Data System (ADS)
dos Santos, Lindomar Soares; Destefano, Natália; Martinez, Alexandre Souto
2018-01-01
Typical examples of intertemporal decision making involve situations in which individuals must choose between a smaller but more immediate reward and a larger one delivered later. Analogously, probabilistic decision making involves choices between options whose consequences differ in their probability of being received. In Economics, the expected utility theory (EUT) and the discounted utility theory (DUT) are the traditionally accepted normative models for describing, respectively, probabilistic and intertemporal decision making. A large number of experiments confirmed that the linearity assumed by the EUT does not explain some observed behaviors, such as nonlinear preference, risk-seeking and loss aversion. That observation led to the development of new theoretical models, called non-expected utility theories (NEUT), which include a nonlinear transformation of the probability scale. An essential feature of the so-called preference function of these theories is that the probabilities are transformed into decision weights by means of a (cumulative) probability weighting function, w(p). In this article we obtain a generalized function for the probabilistic discount process. This function has as particular cases well-established mathematical forms from the literature, including discount models that account for effects of psychophysical perception. We also propose a new generalized function for the functional form of w. The limiting cases of this function encompass some parametric forms already proposed in the literature. Far beyond a mere generalization, our function allows the interpretation of probabilistic decision-making theories based on the assumption that individuals behave similarly in the face of probabilities and delays, and is supported by phenomenological models.
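The authors' generalized w(p) is not reproduced here, but two classic parametric weighting functions of the kind such generalizations encompass, the Tversky-Kahneman and Prelec forms, can be sketched as:

```python
import math

def w_tversky_kahneman(p, gamma=0.61):
    """Tversky & Kahneman (1992) weighting function: inverse-S shaped,
    overweighting small probabilities and underweighting large ones.
    gamma=0.61 is their median estimate for gains."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

def w_prelec(p, alpha=0.65):
    """Prelec (1998) one-parameter weighting function; its fixed point
    sits at p = 1/e regardless of alpha."""
    return math.exp(-((-math.log(p)) ** alpha))

# Both forms overweight a small probability: w(0.01) > 0.01.
print(w_tversky_kahneman(0.01), w_prelec(0.01))
```

The parameter values shown are the commonly cited estimates from the original studies; other fitted values appear throughout the literature.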
NASA Astrophysics Data System (ADS)
Cuellar, A. D.; McKinney, D. C.
2014-12-01
Climate change has accelerated glacial retreat in high-altitude glaciated regions of Peru, leading to the growth and formation of glacier lakes. Glacial lake outburst floods (GLOFs) are sudden events triggered by an earthquake, an avalanche into the lake, or another shock that causes a sudden outflow of water. These floods are catastrophic because of their sudden onset, the difficulty of predicting them, and the enormous quantity of water and debris that rapidly floods downstream areas. Palcacocha Lake in the Peruvian Andes has experienced accelerated growth since it burst in 1941 and threatens the major city of Huaraz and surrounding communities. Since the 1941 flood, stakeholders have advocated for projects to adapt to the increasing threat posed by Palcacocha Lake. Nonetheless, discussions surrounding projects for Palcacocha have not included a rigorous analysis of the potential consequences of a flood, the probability of an event, or the costs of mitigation projects. This work presents the first step in rationally analyzing the risks posed by Palcacocha Lake and the various adaptation projects proposed. The authors use decision analysis to assess proposed adaptation measures that would mitigate damage in downstream communities from a GLOF. We use an existing hydrodynamic model of the at-risk area to determine how adaptation projects will affect downstream flooding. Flood characteristics are used in the HEC-FIA software to estimate fatalities and injuries from an outburst flood, which we convert to monetary units using the value of a statistical life. We combine the monetary consequences of a GLOF with the cost of the proposed projects and a diffuse probability distribution for the likelihood of an event to estimate the expected cost of the adaptation plans. From this analysis, we find that lowering the lake level by 15 meters has the lowest expected cost of any proposal, despite uncertainty in the effect of lake lowering on downstream flooding.
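The expected-cost comparison described above can be sketched as follows; all costs, damages, and the point probability standing in for the paper's diffuse distribution are hypothetical placeholders:

```python
# Expected cost of each adaptation alternative:
#   E[cost] = project cost + P(GLOF) * monetized consequences given that project.
# All figures are hypothetical placeholders in million USD, not the
# paper's estimates; alternative names are illustrative.
alternatives = {
    "no action":       {"project_cost": 0.0,  "damage_if_glof": 2000.0},
    "lower lake 15 m": {"project_cost": 10.0, "damage_if_glof": 400.0},
    "reinforce dam":   {"project_cost": 5.0,  "damage_if_glof": 1200.0},
}
p_glof = 0.01  # assumed point probability of an outburst flood

expected_costs = {
    name: a["project_cost"] + p_glof * a["damage_if_glof"]
    for name, a in alternatives.items()
}
best = min(expected_costs, key=expected_costs.get)
```

The paper propagates a diffuse probability distribution rather than a single p_glof, but the ranking logic, project cost plus probability-weighted consequences, is the same.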
Explaining intraspecific diversity in plant secondary metabolites in an ecological context.
Moore, Ben D; Andrew, Rose L; Külheim, Carsten; Foley, William J
2014-02-01
Plant secondary metabolites (PSMs) are ubiquitous in plants and play many ecological roles. Each compound can vary in presence and/or quantity, and the composition of the mixture of chemicals can vary, such that chemodiversity can be partitioned within and among individuals. Plant ontogeny and environmental and genetic variation are recognized as sources of chemical variation, but recent advances in understanding the molecular basis of variation may allow the future deployment of isogenic mutants to test the specific adaptive function of variation in PSMs. An important consequence of high intraspecific variation is the capacity to evolve rapidly. It is becoming increasingly clear that trait variance linked to both macro- and micro-environmental variation can also evolve and may respond more strongly to selection than mean trait values. This research, which is in its infancy in plants, highlights what could be a missing piece of the picture of PSM evolution. PSM polymorphisms are probably maintained by multiple selective forces acting across many spatial and temporal scales, but convincing examples that recognize the diversity of plant population structures are rare. We describe how diversity can be inherently beneficial for plants and suggest fruitful avenues for future research to untangle the causes and consequences of intraspecific variation. © 2013 The Authors. New Phytologist © 2013 New Phytologist Trust.
Early childhood caries update: A review of causes, diagnoses, and treatments
Çolak, Hakan; Dülgergil, Çoruh T.; Dalli, Mehmet; Hamidi, Mehmet Mustafa
2013-01-01
Dental caries (decay) is an international public health challenge, especially amongst young children. Early childhood caries (ECC) is a serious public health problem in both developing and industrialized countries. ECC can begin early in life, progresses rapidly in those who are at high risk, and often goes untreated. Its consequences can affect the immediate and long-term quality of life of the child's family and can have significant social and economic consequences beyond the immediate family as well. ECC can be a particularly virulent form of caries, beginning soon after dental eruption, developing on smooth surfaces, progressing rapidly, and having a lasting detrimental impact on the dentition. Children experiencing caries as infants or toddlers have a much greater probability of subsequent caries in both the primary and permanent dentitions. The relationship between breastfeeding and ECC is likely to be complex and confounded by many biological variables, such as mutans streptococci, enamel hypoplasia, intake of sugars, as well as social variables, such as parental education and socioeconomic status, which may affect oral health. Unlike other infectious diseases, tooth decay is not self-limiting. Decayed teeth require professional treatment to remove infection and restore tooth function. In this review, we give detailed information about ECC, from its diagnosis to management. PMID:23633832
van Harreveld, Frenk; Rotteveel, Mark; Lelieveld, Gert-Jan; Crone, Eveline A.
2014-01-01
Ambivalence is a state of inconsistency that is often experienced as affectively aversive. In this functional magnetic resonance imaging study, we investigated the role of cognitive and social-affective processes in the experience of ambivalence and in coping with its negative consequences. We examined participants’ brain activity during the dichotomous evaluation (pro vs contra) of pretested ambivalent (e.g. alcohol), positive (e.g. happiness) and negative (e.g. genocide) word stimuli. We manipulated evaluation relevance by varying the probability of evaluation consequences, under the hypothesis that ambivalence is experienced as more negative when outcomes are relevant. When making ambivalent evaluations, more activity was found in the anterior cingulate cortex, the insula, the temporal parietal junction (TPJ) and the posterior cingulate cortex (PCC)/precuneus, for both high and low evaluation relevance. After statistically conservative corrections, activity in the TPJ and PCC/precuneus was negatively correlated with experienced ambivalence after scanning, as measured by Priester and Petty’s (1996) felt ambivalence scale. The findings show that cognitive and social-affective brain areas are involved in the experience of ambivalence. However, these networks are differently associated with subsequent reduction of ambivalence, thus highlighting the importance of understanding both the cognitive and the affective processes involved in ambivalent decision-making. PMID:23685774
Greenberg, Michael; Lioy, Paul; Ozbas, Birnur; Mantell, Nancy; Isukapalli, Sastry; Lahr, Michael; Altiok, Tayfur; Bober, Joseph; Lacy, Clifton; Lowrie, Karen; Mayer, Henry; Rovito, Jennifer
2013-11-01
We built three simulation models that can assist rail transit planners and operators to evaluate high and low probability rail-centered hazard events that could lead to serious consequences for rail-centered networks and their surrounding regions. Our key objective is to provide these models to users who, through planning with these models, can prevent events or more effectively react to them. The first of the three models is an industrial systems simulation tool that closely replicates rail passenger traffic flows between New York Penn Station and Trenton, New Jersey. Second, we built and used a line source plume model to trace chemical plumes released by a slow-moving freight train that could impact rail passengers, as well as people in surrounding areas. Third, we crafted an economic simulation model that estimates the regional economic consequences of a variety of rail-related hazard events through the year 2020. Each model can work independently of the others. However, used together they help provide a coherent story about what could happen and set the stage for planning that should make rail-centered transport systems more resistant and resilient to hazard events. We highlight the limitations and opportunities presented by using these models individually or in sequence. © 2013 Society for Risk Analysis.
Fitness cost of incubation in great tits (Parus major) is related to clutch size
de Heij, Maaike E; van den Hout, Piet J; Tinbergen, Joost M
2006-01-01
Life-history theory predicts that parents produce the number of offspring that maximizes their fitness. In birds, natural selection on parental decisions regarding clutch size may act during the egg-laying, incubation or nestling phase. To study the fitness consequences of clutch size during the incubation phase, we manipulated clutch sizes during this phase only, in three breeding seasons, and measured the fitness consequences in both the short and the long term. Clutch enlargement did not affect the offspring fitness of the manipulated first clutches, but fledging probability of the subsequent clutch in the same season was reduced. Parents incubating enlarged first clutches provided adequate care for the offspring of their first clutches during the nestling phase, but paid the price when caring for the offspring of their second clutch. Parents that incubated enlarged first clutches had lower local survival in the 2 years when the population had a relatively high production of second clutches, but not in the third year, when there was a very low production of second clutches. During these 2 years, the costs of incubation were strong enough to change positive selection, as established by brood size manipulations in this study population, into stabilizing selection through the negative effect of incubation on parental fitness. PMID:16928638
DOE Office of Scientific and Technical Information (OSTI.GOV)
P. Persoff
The evaluation of impacts of potential volcanic eruptions on populations and facilities far in the future may involve detailed volcanological studies that differ from traditional hazards analyses. The proximity of Quaternary volcanoes to a proposed repository for disposal of the USA's high-level radioactive waste at Yucca Mountain, Nevada, has required in-depth study of the probability and consequences of basaltic igneous activity. Because of the underground nature of the repository, evaluation of the potential effects of dike intrusion and interaction with the waste packages stored in underground tunnels (drifts), as well as effects of eruption and ash dispersal, has been important. These studies include analyses of dike propagation, dike-drift intersection, flow of magma into drifts, heat and volcanic gas migration, atmospheric dispersal of tephra, and redistribution of waste-contaminated tephra by surficial processes. Unlike traditional volcanic hazards studies that focus on impacts on housing, transportation, communications, etc. (to name a small subset), the igneous consequences studies at Yucca Mountain have focused on evaluation of igneous impacts on nuclear waste packages and implications for enhanced radioactive dose to a hypothetical future (≤10,000 yrs) local population. Potential exposure pathways include groundwater (affected by in-situ degradation of waste packages by igneous heat and corrosion) and inhalation, ingestion, and external exposure due to deposition and redistribution of waste-contaminated tephra.