Sample records for high conditional probability

  1. Decomposition of conditional probability for high-order symbolic Markov chains.

    PubMed

    Melnik, S S; Usatenko, O V

    2017-07-01

    The main goal of this paper is to develop an estimate for the conditional probability function of random stationary ergodic symbolic sequences with elements belonging to a finite alphabet. We elaborate on a decomposition procedure for the conditional probability function of sequences considered to be high-order Markov chains. We represent the conditional probability function as the sum of multilinear memory function monomials of different orders (from zero up to the chain order). This allows us to introduce a family of Markov chain models and to construct artificial sequences via a method of successive iterations, taking into account at each step increasingly high correlations among random elements. At weak correlations, the memory functions are uniquely expressed in terms of the high-order symbolic correlation functions. The proposed method fills the gap between two approaches, namely the likelihood estimation and the additive Markov chains. The obtained results may have applications for sequential approximation of artificial neural network training.
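    The basic quantity this abstract works with — the conditional probability of the next symbol given the preceding N symbols of a finite-alphabet sequence — can be estimated directly from counts. A minimal Python sketch follows (illustrative only; it does not implement the paper's memory-function decomposition, and the toy sequence is a placeholder):

      from collections import Counter, defaultdict

      def conditional_probabilities(sequence, order):
          """Estimate P(next symbol | preceding `order` symbols) by counting."""
          context_counts = Counter()
          next_counts = defaultdict(Counter)
          for i in range(order, len(sequence)):
              context = tuple(sequence[i - order:i])
              context_counts[context] += 1
              next_counts[context][sequence[i]] += 1
          return {ctx: {sym: n / context_counts[ctx] for sym, n in nxt.items()}
                  for ctx, nxt in next_counts.items()}

      # Toy example: a binary sequence treated as a second-order chain.
      seq = "0110100110010110"
      probs = conditional_probabilities(seq, order=2)
      print(probs[("0", "1")])   # distribution of the symbol following "01"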

  2. Decomposition of conditional probability for high-order symbolic Markov chains

    NASA Astrophysics Data System (ADS)

    Melnik, S. S.; Usatenko, O. V.

    2017-07-01

    The main goal of this paper is to develop an estimate for the conditional probability function of random stationary ergodic symbolic sequences with elements belonging to a finite alphabet. We elaborate on a decomposition procedure for the conditional probability function of sequences considered to be high-order Markov chains. We represent the conditional probability function as the sum of multilinear memory function monomials of different orders (from zero up to the chain order). This allows us to introduce a family of Markov chain models and to construct artificial sequences via a method of successive iterations, taking into account at each step increasingly high correlations among random elements. At weak correlations, the memory functions are uniquely expressed in terms of the high-order symbolic correlation functions. The proposed method fills the gap between two approaches, namely the likelihood estimation and the additive Markov chains. The obtained results may have applications for sequential approximation of artificial neural network training.

  3. Encounter risk analysis of rainfall and reference crop evapotranspiration in the irrigation district

    NASA Astrophysics Data System (ADS)

    Zhang, Jinping; Lin, Xiaomin; Zhao, Yong; Hong, Yang

    2017-09-01

    Rainfall and reference crop evapotranspiration are random but mutually related variables in an irrigation district, and their encounter situation can determine water shortage risks in the context of natural water supply and demand. In reality, however, rainfall and reference crop evapotranspiration may have different marginal distributions, and their relationship is nonlinear. In this study, based on the annual rainfall and reference crop evapotranspiration data series from 1970 to 2013 in the Luhun irrigation district of China, the joint probability distribution of rainfall and reference crop evapotranspiration is developed with the Frank copula function. Using the joint probability distribution, the synchronous-asynchronous encounter risk, conditional joint probability, and conditional return period of different combinations of rainfall and reference crop evapotranspiration are analyzed. The results show that the copula-based joint probability distributions of rainfall and reference crop evapotranspiration are reasonable. The asynchronous encounter probability of rainfall and reference crop evapotranspiration is greater than their synchronous encounter probability, so the water shortage risk associated with meteorological drought (i.e., rainfall variability) is the more likely to occur. Compared with other states, the conditional joint probability is higher and the conditional return period lower under either low rainfall or high reference crop evapotranspiration. For a given high reference crop evapotranspiration of a certain frequency, the encounter risk of low rainfall and high reference crop evapotranspiration increases as that frequency decreases. For a given low rainfall of a certain frequency, this encounter risk decreases as that frequency decreases. When either high reference crop evapotranspiration exceeds a certain frequency or low rainfall does not exceed a certain frequency, the higher conditional joint probability and lower conditional return period of the various combinations are likely to cause a water shortage, although not a severe one.
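    A hedged sketch of the copula calculations described above: a Frank copula joins assumed marginal probabilities for rainfall (X) and reference crop evapotranspiration (Y), from which a joint encounter probability, a conditional probability, and a return period can be read off. The dependence parameter and thresholds are placeholders, not the fitted Luhun district values.

      import math

      def frank_copula(u, v, theta):
          """Frank copula C(u, v; theta), theta != 0."""
          num = math.expm1(-theta * u) * math.expm1(-theta * v)
          return -math.log1p(num / math.expm1(-theta)) / theta

      theta = 3.0   # hypothetical dependence parameter
      u = 0.25      # P(rainfall <= x): a "low rainfall" threshold
      v = 0.75      # P(ET <= y), so P(ET > y) = 0.25: a "high ET" threshold

      c = frank_copula(u, v, theta)
      p_low_rain_high_et = u - c                         # P(X <= x, Y > y)
      p_high_et_given_low_rain = p_low_rain_high_et / u  # P(Y > y | X <= x)
      return_period = 1.0 / p_low_rain_high_et           # in years, for annual data
      print(p_low_rain_high_et, p_high_et_given_low_rain, return_period)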

  4. Internal Medicine residents use heuristics to estimate disease probability.

    PubMed

    Phang, Sen Han; Ravani, Pietro; Schaefer, Jeffrey; Wright, Bruce; McLaughlin, Kevin

    2015-01-01

    Training in Bayesian reasoning may have limited impact on accuracy of probability estimates. In this study, our goal was to explore whether residents previously exposed to Bayesian reasoning use heuristics rather than Bayesian reasoning to estimate disease probabilities. We predicted that if residents use heuristics, then post-test probability estimates would be increased by non-discriminating clinical features or a high anchor for a target condition. We randomized 55 Internal Medicine residents to different versions of four clinical vignettes and asked them to estimate probabilities of target conditions. We manipulated the clinical data for each vignette to be consistent with either 1) use of a representative heuristic, by adding non-discriminating prototypical clinical features of the target condition, or 2) use of an anchoring-with-adjustment heuristic, by providing a high or low anchor for the target condition. When presented with additional non-discriminating data, the odds of diagnosing the target condition were increased (odds ratio (OR) 2.83, 95% confidence interval [1.30, 6.15], p = 0.009). Similarly, the odds of diagnosing the target condition were increased when a high anchor preceded the vignette (OR 2.04, [1.09, 3.81], p = 0.025). Our findings suggest that despite previous exposure to Bayesian reasoning, residents use heuristics, such as the representative heuristic and anchoring with adjustment, to estimate probabilities. Potential reasons for attribute substitution include the relative cognitive ease of heuristics vs. Bayesian reasoning, or the possibility that residents in their clinical practice use gist traces rather than precise probability estimates when diagnosing.

  5. The Estimation of Tree Posterior Probabilities Using Conditional Clade Probability Distributions

    PubMed Central

    Larget, Bret

    2013-01-01

    In this article I introduce the idea of conditional independence of separated subtrees as a principle by which to estimate the posterior probability of trees using conditional clade probability distributions rather than simple sample relative frequencies. I describe an algorithm for these calculations and software which implements these ideas. I show that these alternative calculations are very similar to simple sample relative frequencies for high probability trees but are substantially more accurate for relatively low probability trees. The method allows the posterior probability of unsampled trees to be calculated when these trees contain only clades that are in other sampled trees. Furthermore, the method can be used to estimate the total probability of the set of sampled trees which provides a measure of the thoroughness of a posterior sample. [Bayesian phylogenetics; conditional clade distributions; improved accuracy; posterior probabilities of trees.] PMID:23479066
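    The conditional clade idea can be sketched compactly: estimate, from a posterior sample, the probability of each parent clade splitting into a particular pair of child clades, and score any tree (sampled or not) as the product of those conditional probabilities. The code below is an illustrative toy, not Larget's software; rooted trees are written as nested tuples of leaf names and the "sample" is made up.

      from collections import Counter

      def clade(t):
          """Set of leaves under a subtree given as a nested tuple or a leaf name."""
          return frozenset([t]) if isinstance(t, str) else clade(t[0]) | clade(t[1])

      def splits(t, out=None):
          """All (parent clade, {left clade, right clade}) pairs in a rooted tree."""
          if out is None:
              out = []
          if not isinstance(t, str):
              left, right = clade(t[0]), clade(t[1])
              out.append((left | right, frozenset([left, right])))
              splits(t[0], out)
              splits(t[1], out)
          return out

      def ccd_probability(tree, sample):
          """P(tree) under conditional clade distributions estimated from `sample`."""
          clade_counts, split_counts = Counter(), Counter()
          for s in sample:
              for parent, split in splits(s):
                  clade_counts[parent] += 1
                  split_counts[(parent, split)] += 1
          p = 1.0
          for parent, split in splits(tree):
              if clade_counts[parent] == 0:
                  return 0.0
              p *= split_counts[(parent, split)] / clade_counts[parent]
          return p

      sample = [((("A", "B"), "C"), "D"), ((("A", "C"), "B"), "D"),
                ((("A", "B"), "C"), "D"), (("A", "B"), ("C", "D"))]
      print(ccd_probability(((("A", "C"), "B"), "D"), sample))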

  6. Internal Medicine residents use heuristics to estimate disease probability

    PubMed Central

    Phang, Sen Han; Ravani, Pietro; Schaefer, Jeffrey; Wright, Bruce; McLaughlin, Kevin

    2015-01-01

    Background Training in Bayesian reasoning may have limited impact on accuracy of probability estimates. In this study, our goal was to explore whether residents previously exposed to Bayesian reasoning use heuristics rather than Bayesian reasoning to estimate disease probabilities. We predicted that if residents use heuristics, then post-test probability estimates would be increased by non-discriminating clinical features or a high anchor for a target condition. Method We randomized 55 Internal Medicine residents to different versions of four clinical vignettes and asked them to estimate probabilities of target conditions. We manipulated the clinical data for each vignette to be consistent with either 1) use of a representative heuristic, by adding non-discriminating prototypical clinical features of the target condition, or 2) use of an anchoring-with-adjustment heuristic, by providing a high or low anchor for the target condition. Results When presented with additional non-discriminating data, the odds of diagnosing the target condition were increased (odds ratio (OR) 2.83, 95% confidence interval [1.30, 6.15], p = 0.009). Similarly, the odds of diagnosing the target condition were increased when a high anchor preceded the vignette (OR 2.04, [1.09, 3.81], p = 0.025). Conclusions Our findings suggest that despite previous exposure to Bayesian reasoning, residents use heuristics, such as the representative heuristic and anchoring with adjustment, to estimate probabilities. Potential reasons for attribute substitution include the relative cognitive ease of heuristics vs. Bayesian reasoning, or the possibility that residents in their clinical practice use gist traces rather than precise probability estimates when diagnosing. PMID:27004080

  7. The estimation of tree posterior probabilities using conditional clade probability distributions.

    PubMed

    Larget, Bret

    2013-07-01

    In this article I introduce the idea of conditional independence of separated subtrees as a principle by which to estimate the posterior probability of trees using conditional clade probability distributions rather than simple sample relative frequencies. I describe an algorithm for these calculations and software which implements these ideas. I show that these alternative calculations are very similar to simple sample relative frequencies for high probability trees but are substantially more accurate for relatively low probability trees. The method allows the posterior probability of unsampled trees to be calculated when these trees contain only clades that are in other sampled trees. Furthermore, the method can be used to estimate the total probability of the set of sampled trees which provides a measure of the thoroughness of a posterior sample.

  8. Tracking the Sensory Environment: An ERP Study of Probability and Context Updating in ASD

    PubMed Central

    Westerfield, Marissa A.; Zinni, Marla; Vo, Khang; Townsend, Jeanne

    2014-01-01

    We recorded visual event-related brain potentials (ERPs) from 32 adult male participants (16 high-functioning participants diagnosed with Autism Spectrum Disorder (ASD) and 16 typically developing (TD) control participants, ranging in age from 18–53 yrs) during a three-stimulus oddball paradigm. Target and non-target stimulus probability was varied across three probability conditions, whereas the probability of a third non-target stimulus was held constant in all conditions. P3 amplitude to target stimuli was more sensitive to probability in ASD than in TD participants, whereas P3 amplitude to non-target stimuli was less responsive to probability in ASD participants. This suggests that neural responses to changes in event probability are attention-dependent in high-functioning ASD. The implications of these findings for higher-level behaviors such as prediction and planning are discussed. PMID:24488156

  9. Probable Posttraumatic Stress Disorder in the US Veteran Population According to DSM-5: Results From the National Health and Resilience in Veterans Study.

    PubMed

    Wisco, Blair E; Marx, Brian P; Miller, Mark W; Wolf, Erika J; Mota, Natalie P; Krystal, John H; Southwick, Steven M; Pietrzak, Robert H

    2016-11-01

    With the publication of DSM-5, important changes were made to the diagnostic criteria for posttraumatic stress disorder (PTSD), including the addition of 3 new symptoms. Some have argued that these changes will further increase the already high rates of comorbidity between PTSD and other psychiatric disorders. This study examined the prevalence of DSM-5 PTSD, conditional probability of PTSD given certain trauma exposures, endorsement of specific PTSD symptoms, and psychiatric comorbidities in the US veteran population. Data were analyzed from the National Health and Resilience in Veterans Study (NHRVS), a Web-based survey of a cross-sectional, nationally representative, population-based sample of 1,484 US veterans, which was fielded from September through October 2013. Probable PTSD was assessed using the PTSD Checklist-5. The weighted lifetime and past-month prevalence of probable DSM-5 PTSD was 8.1% (SE = 0.7%) and 4.7% (SE = 0.6%), respectively. Conditional probability of lifetime probable PTSD ranged from 10.1% (sudden death of close family member or friend) to 28.0% (childhood sexual abuse). The DSM-5 PTSD symptoms with the lowest prevalence among veterans with probable PTSD were trauma-related amnesia and reckless and self-destructive behavior. Probable PTSD was associated with increased odds of mood and anxiety disorders (OR = 7.6-62.8, P < .001), substance use disorders (OR = 3.9-4.5, P < .001), and suicidal behaviors (OR = 6.7-15.1, P < .001). In US veterans, the prevalence of DSM-5 probable PTSD, conditional probability of probable PTSD, and odds of psychiatric comorbidity were similar to prior findings with DSM-IV-based measures; we found no evidence that changes in DSM-5 increase psychiatric comorbidity. Results underscore the high rates of exposure to both military and nonmilitary trauma and the high public health burden of DSM-5 PTSD and comorbid conditions in veterans. © Copyright 2016 Physicians Postgraduate Press, Inc.

  10. Recent research on the high-probability instructional sequence: A brief review.

    PubMed

    Lipschultz, Joshua; Wilder, David A

    2017-04-01

    The high-probability (high-p) instructional sequence consists of the delivery of a series of high-probability instructions immediately before delivery of a low-probability or target instruction. It is commonly used to increase compliance in a variety of populations. Recent research has described variations of the high-p instructional sequence and examined the conditions under which the sequence is most effective. This manuscript reviews the most recent research on the sequence and identifies directions for future research. Recommendations for practitioners regarding the use of the high-p instructional sequence are also provided. © 2017 Society for the Experimental Analysis of Behavior.

  11. The Effects of Phonotactic Probability and Neighborhood Density on Adults' Word Learning in Noisy Conditions

    PubMed Central

    Storkel, Holly L.; Lee, Jaehoon; Cox, Casey

    2016-01-01

    Purpose Noisy conditions make auditory processing difficult. This study explores whether noisy conditions influence the effects of phonotactic probability (the likelihood of occurrence of a sound sequence) and neighborhood density (phonological similarity among words) on adults' word learning. Method Fifty-eight adults learned nonwords varying in phonotactic probability and neighborhood density in either an unfavorable (0-dB signal-to-noise ratio [SNR]) or a favorable (+8-dB SNR) listening condition. Word learning was assessed using a picture naming task by scoring the proportion of phonemes named correctly. Results The unfavorable 0-dB SNR condition showed a significant interaction between phonotactic probability and neighborhood density in the absence of main effects. In particular, adults learned more words when phonotactic probability and neighborhood density were both low or both high. The +8-dB SNR condition did not show this interaction. These results are inconsistent with those from a prior adult word learning study conducted under quiet listening conditions that showed main effects of word characteristics. Conclusions As the listening condition worsens, adult word learning benefits from a convergence of phonotactic probability and neighborhood density. Clinical implications are discussed for potential populations who experience difficulty with auditory perception or processing, making them more vulnerable to noise. PMID:27788276

  12. The Effects of Phonotactic Probability and Neighborhood Density on Adults' Word Learning in Noisy Conditions.

    PubMed

    Han, Min Kyung; Storkel, Holly L; Lee, Jaehoon; Cox, Casey

    2016-11-01

    Noisy conditions make auditory processing difficult. This study explores whether noisy conditions influence the effects of phonotactic probability (the likelihood of occurrence of a sound sequence) and neighborhood density (phonological similarity among words) on adults' word learning. Fifty-eight adults learned nonwords varying in phonotactic probability and neighborhood density in either an unfavorable (0-dB signal-to-noise ratio [SNR]) or a favorable (+8-dB SNR) listening condition. Word learning was assessed using a picture naming task by scoring the proportion of phonemes named correctly. The unfavorable 0-dB SNR condition showed a significant interaction between phonotactic probability and neighborhood density in the absence of main effects. In particular, adults learned more words when phonotactic probability and neighborhood density were both low or both high. The +8-dB SNR condition did not show this interaction. These results are inconsistent with those from a prior adult word learning study conducted under quiet listening conditions that showed main effects of word characteristics. As the listening condition worsens, adult word learning benefits from a convergence of phonotactic probability and neighborhood density. Clinical implications are discussed for potential populations who experience difficulty with auditory perception or processing, making them more vulnerable to noise.

  13. Using conditional probability to identify trends in intra-day high-frequency equity pricing

    NASA Astrophysics Data System (ADS)

    Rechenthin, Michael; Street, W. Nick

    2013-12-01

    By examining the conditional probabilities of price movements in a popular US stock over different high-frequency intra-day timespans, varying levels of trend predictability are identified. This study demonstrates the existence of predictable short-term trends in the market; understanding the probability of price movement can be useful to high-frequency traders. Price movement was examined in trade-by-trade (tick) data along with temporal timespans ranging from 1 s to 30 min for 52 one-week periods for one highly-traded stock. We hypothesize that much of the initial predictability of trade-by-trade (tick) data is due to traditional market dynamics, or the bouncing of the price between the stock’s bid and ask. Only after timespans of between 5 and 10 s does this cease to explain the predictability; after this timespan, two consecutive movements in the same direction occur with higher probability than that of movements in the opposite direction. This pattern holds up to a one-minute interval, after which the strength of the pattern weakens.
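    The conditional probabilities at the heart of this study are simple to estimate once price changes at a chosen timespan are reduced to up/down symbols. A hedged sketch (the random series below is a placeholder for the actual tick data; trend persistence would show up as conditional probabilities above 0.5):

      import random
      from collections import Counter

      random.seed(0)
      moves = [random.choice([+1, -1]) for _ in range(10_000)]  # +1 = up, -1 = down

      pairs = Counter(zip(moves, moves[1:]))
      p_up_given_up = pairs[(+1, +1)] / (pairs[(+1, +1)] + pairs[(+1, -1)])
      p_down_given_down = pairs[(-1, -1)] / (pairs[(-1, -1)] + pairs[(-1, +1)])
      print(f"P(up | up)     = {p_up_given_up:.3f}")
      print(f"P(down | down) = {p_down_given_down:.3f}")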

  14. The World According to de Finetti: On de Finetti's Theory of Probability and Its Application to Quantum Mechanics

    NASA Astrophysics Data System (ADS)

    Berkovitz, Joseph

    Bruno de Finetti is one of the founding fathers of the subjectivist school of probability, where probabilities are interpreted as rational degrees of belief. His work on the relation between the theorems of probability and rationality is among the cornerstones of modern subjective probability theory. De Finetti maintained that rationality requires that degrees of belief be coherent, and he argued that the whole of probability theory could be derived from these coherence conditions. De Finetti's interpretation of probability has been highly influential in science. This paper focuses on the application of this interpretation to quantum mechanics. We argue that de Finetti held that the coherence conditions of degrees of belief in events depend on their verifiability. Accordingly, the standard coherence conditions of degrees of belief that are familiar from the literature on subjective probability only apply to degrees of belief in events which could (in principle) be jointly verified; and the coherence conditions of degrees of belief in events that cannot be jointly verified are weaker. While the most obvious explanation of de Finetti's verificationism is the influence of positivism, we argue that it could be motivated by the radical subjectivist and instrumental nature of probability in his interpretation; for as it turns out, in this interpretation it is difficult to make sense of the idea of coherent degrees of belief in, and accordingly probabilities of, unverifiable events. We then consider the application of this interpretation to quantum mechanics, concentrating on the Einstein-Podolsky-Rosen experiment and Bell's theorem.

  15. Prevalence and co-occurrence of addictive behaviors among former alternative high school youth: A longitudinal follow-up study.

    PubMed

    Sussman, Steve; Pokhrel, Pallav; Sun, Ping; Rohrbach, Louise A; Spruijt-Metz, Donna

    2015-09-01

    Recent work has studied addictions using a matrix measure, which taps multiple addictions through single responses for each type. This is the first longitudinal study using a matrix measure. We investigated the use of this approach among former alternative high school youth (average age = 19.8 years at baseline; longitudinal n = 538) at risk for addictions. Lifetime and last 30-day prevalence of one or more of 11 addictions reviewed in other work was the primary focus (i.e., cigarettes, alcohol, hard drugs, shopping, gambling, Internet, love, sex, eating, work, and exercise). These were examined at two time-points one year apart. Latent class and latent transition analyses (LCA and LTA) were conducted in Mplus. Prevalence rates were stable across the two time-points. As in the cross-sectional baseline analysis, the 2-class model (addiction class, non-addiction class) fit the data better at follow-up than models with more classes. Item-response or conditional probabilities for each addiction type did not differ between time-points. As a result, the LTA model constrained the conditional probabilities to be equal across the two time-points. In the addiction class, larger conditional probabilities (i.e., 0.40-0.49) were found for love, sex, exercise, and work addictions; medium conditional probabilities (i.e., 0.17-0.27) were found for cigarette, alcohol, other drugs, eating, Internet and shopping addiction; and a small conditional probability (0.06) was found for gambling. Persons in an addiction class tend to remain in this addiction class over a one-year period.

  16. Targeting the probability versus cost of feared outcomes in public speaking anxiety.

    PubMed

    Nelson, Elizabeth A; Deacon, Brett J; Lickel, James J; Sy, Jennifer T

    2010-04-01

    Cognitive-behavioral theory suggests that social phobia is maintained, in part, by overestimates of the probability and cost of negative social events. Indeed, empirically supported cognitive-behavioral treatments directly target these cognitive biases through the use of in vivo exposure or behavioral experiments. While cognitive-behavioral theories and treatment protocols emphasize the importance of targeting probability and cost biases in the reduction of social anxiety, few studies have examined specific techniques for reducing probability and cost bias, and thus the relative efficacy of exposure to the probability versus cost of negative social events is unknown. In the present study, 37 undergraduates with high public speaking anxiety were randomly assigned to a single-session intervention designed to reduce either the perceived probability or the perceived cost of negative outcomes associated with public speaking. Compared to participants in the probability treatment condition, those in the cost treatment condition demonstrated significantly greater improvement on measures of public speaking anxiety and cost estimates for negative social events. The superior efficacy of the cost treatment condition was mediated by greater treatment-related changes in social cost estimates. The clinical implications of these findings are discussed. Published by Elsevier Ltd.

  17. Shoot Development and Extension of Quercus serrata Saplings in Response to Insect Damage and Nutrient Conditions

    PubMed Central

    MIZUMACHI, ERI; MORI, AKIRA; OSAWA, NAOYA; AKIYAMA, REIKO; TOKUCHI, NAOKO

    2006-01-01

    • Background and Aims Plants have the ability to compensate for damage caused by herbivores. This is important to plant growth, because a plant cannot always avoid damage, even if it has developed defence mechanisms against herbivores. In previous work, we elucidated the herbivory-induced compensatory response of Quercus (at both the individual shoot and whole sapling levels) in both low- and high-nutrient conditions throughout one growing season. In this study, we determine how the compensatory growth of Quercus serrata saplings is achieved at different nutrient levels. • Methods Quercus serrata saplings were grown under controlled conditions. Length, number of leaves and percentage of leaf area lost on all extension units (EUs) were measured. • Key Results Both the probability of flushing and the length of subsequent EUs significantly increased with an increase in the length of the parent EU. The probability of flushing increased with an increase in leaf damage of the parent EU, but the length of subsequent EUs decreased. This indicates that EU growth is fundamentally regulated at the individual EU level. The probabilities of a second and third flush were significantly higher in plants in high-nutrient soil than those in low-nutrient soil. The subsequent EUs of damaged saplings were also significantly longer at high-nutrient conditions. • Conclusions An increase in the probability of flushes in response to herbivore damage is important for damaged saplings to produce new EUs; further, shortening the length of EUs helps to effectively reproduce foliage lost by herbivory. The probability of flushing also varied according to soil nutrient levels, suggesting that the compensatory growth of individual EUs in response to local damage levels is affected by the nutrients available to the whole sapling. PMID:16709576

  18. A prototype method for diagnosing high ice water content probability using satellite imager data

    NASA Astrophysics Data System (ADS)

    Yost, Christopher R.; Bedka, Kristopher M.; Minnis, Patrick; Nguyen, Louis; Strapp, J. Walter; Palikonda, Rabindra; Khlopenkov, Konstantin; Spangenberg, Douglas; Smith, William L., Jr.; Protat, Alain; Delanoe, Julien

    2018-03-01

    Recent studies have found that ingestion of high mass concentrations of ice particles in regions of deep convective storms, with radar reflectivity considered safe for aircraft penetration, can adversely impact aircraft engine performance. Previous aviation industry studies have used the term high ice water content (HIWC) to define such conditions. Three airborne field campaigns were conducted in 2014 and 2015 to better understand how HIWC is distributed in deep convection, both as a function of altitude and proximity to convective updraft regions, and to facilitate development of new methods for detecting HIWC conditions, in addition to many other research and regulatory goals. This paper describes a prototype method for detecting HIWC conditions using geostationary (GEO) satellite imager data coupled with in situ total water content (TWC) observations collected during the flight campaigns. Three satellite-derived parameters were determined to be most useful for determining HIWC probability: (1) the horizontal proximity of the aircraft to the nearest overshooting convective updraft or textured anvil cloud, (2) tropopause-relative infrared brightness temperature, and (3) daytime-only cloud optical depth. Statistical fits between collocated TWC and GEO satellite parameters were used to determine the membership functions for the fuzzy logic derivation of HIWC probability. The products were demonstrated using data from several campaign flights and validated using a subset of the satellite-aircraft collocation database. The daytime HIWC probability was found to agree quite well with TWC time trends and identified extreme TWC events with high probability. Discrimination of HIWC was more challenging at night with IR-only information. The products show the greatest capability for discriminating TWC ≥ 0.5 g m-3. Product validation remains challenging due to vertical TWC uncertainties and the typically coarse spatio-temporal resolution of the GEO data.
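    As a loose illustration of the fuzzy-logic step described above (membership functions for each satellite-derived parameter merged into a single HIWC probability), the sketch below uses made-up ramp-shaped memberships and weights; none of the breakpoints or weights are the campaign's fitted values.

      def ramp_down(x, lo, hi):
          """Membership of 1 below `lo`, 0 above `hi`, linear in between."""
          if x <= lo:
              return 1.0
          if x >= hi:
              return 0.0
          return (hi - x) / (hi - lo)

      def hiwc_probability(distance_km, trop_rel_bt_k, optical_depth):
          m_distance = ramp_down(distance_km, 10.0, 100.0)    # near updraft/textured anvil
          m_bt = ramp_down(trop_rel_bt_k, -5.0, 15.0)         # cold, near-tropopause tops
          m_tau = 1.0 - ramp_down(optical_depth, 20.0, 80.0)  # optically thick cloud (daytime)
          weights = (0.4, 0.4, 0.2)                           # placeholder weights
          return weights[0] * m_distance + weights[1] * m_bt + weights[2] * m_tau

      print(hiwc_probability(distance_km=15.0, trop_rel_bt_k=-3.0, optical_depth=60.0))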

  19. Modelling detection probabilities to evaluate management and control tools for an invasive species

    USGS Publications Warehouse

    Christy, M.T.; Yackel Adams, A.A.; Rodda, G.H.; Savidge, J.A.; Tyrrell, C.L.

    2010-01-01

    For most ecologists, detection probability (p) is a nuisance variable that must be modelled to estimate the state variable of interest (i.e. survival, abundance, or occupancy). However, in the realm of invasive species control, the rate of detection and removal is the rate-limiting step for management of this pervasive environmental problem. For strategic planning of an eradication (removal of every individual), one must identify the least likely individual to be removed, and determine the probability of removing it. To evaluate visual searching as a control tool for populations of the invasive brown treesnake Boiga irregularis, we designed a mark-recapture study to evaluate detection probability as a function of time, gender, size, body condition, recent detection history, residency status, searcher team and environmental covariates. We evaluated these factors using 654 captures resulting from visual detections of 117 snakes residing in a 5-ha semi-forested enclosure on Guam, fenced to prevent immigration and emigration of snakes but not their prey. Visual detection probability was low overall (0.07 per occasion) but reached 0.18 under optimal circumstances. Our results supported sex-specific differences in detectability that were a quadratic function of size, with both small and large females having lower detection probabilities than males of those sizes. There was strong evidence for individual periodic changes in detectability of a few days duration, roughly doubling detection probability (comparing peak to non-elevated detections). Snakes in poor body condition had estimated mean detection probabilities greater than snakes with high body condition. Search teams with high average detection rates exhibited detection probabilities about twice that of search teams with low average detection rates. Surveys conducted with bright moonlight and strong wind gusts exhibited moderately decreased probabilities of detecting snakes. Synthesis and applications. By emphasizing and modelling detection probabilities, we now know: (i) that eradication of this species by searching is possible, (ii) how much searching effort would be required, (iii) under what environmental conditions searching would be most efficient, and (iv) several factors that are likely to modulate this quantification when searching is applied to new areas. The same approach can be used for evaluation of any control technology or population monitoring programme. © 2009 The Authors. Journal compilation © 2009 British Ecological Society.

  20. Prevalence and co-occurrence of addictive behaviors among former alternative high school youth: A longitudinal follow-up study

    PubMed Central

    Sussman, Steve; Pokhrel, Pallav; Sun, Ping; Rohrbach, Louise A.; Spruijt-Metz, Donna

    2015-01-01

    Background and Aims Recent work has studied addictions using a matrix measure, which taps multiple addictions through single responses for each type. This is the first longitudinal study using a matrix measure. Methods We investigated the use of this approach among former alternative high school youth (average age = 19.8 years at baseline; longitudinal n = 538) at risk for addictions. Lifetime and last 30-day prevalence of one or more of 11 addictions reviewed in other work was the primary focus (i.e., cigarettes, alcohol, hard drugs, shopping, gambling, Internet, love, sex, eating, work, and exercise). These were examined at two time-points one year apart. Latent class and latent transition analyses (LCA and LTA) were conducted in Mplus. Results Prevalence rates were stable across the two time-points. As in the cross-sectional baseline analysis, the 2-class model (addiction class, non-addiction class) fit the data better at follow-up than models with more classes. Item-response or conditional probabilities for each addiction type did not differ between time-points. As a result, the LTA model constrained the conditional probabilities to be equal across the two time-points. In the addiction class, larger conditional probabilities (i.e., 0.40−0.49) were found for love, sex, exercise, and work addictions; medium conditional probabilities (i.e., 0.17−0.27) were found for cigarette, alcohol, other drugs, eating, Internet and shopping addiction; and a small conditional probability (0.06) was found for gambling. Discussion and Conclusions Persons in an addiction class tend to remain in this addiction class over a one-year period. PMID:26551909

  1. "Jumping to conclusions" in delusion-prone participants: an experimental economics approach.

    PubMed

    van der Leer, Leslie; McKay, Ryan

    2014-01-01

    That delusional and delusion-prone individuals "jump to conclusions" on probabilistic reasoning tasks is a key finding in cognitive neuropsychiatry. Here we focused on a less frequently investigated aspect of "jumping to conclusions" (JTC): certainty judgments. We incorporated rigorous procedures from experimental economics to eliminate potential confounds of miscomprehension and motivation and systematically investigated the effect of incentives on task performance. Low- and high-delusion-prone participants (n = 109) completed a series of computerised trials; on each trial, they were shown a black or a white fish, caught from one of the two lakes containing fish of both colours in complementary ratios. In the betting condition, participants were given £4 to distribute over the two lakes as they wished; in the control condition, participants simply provided an estimate of how probable each lake was. Deviations from Bayesian probabilities were investigated. Whereas high-delusion-prone participants in both the control and betting conditions underestimated the Bayesian probabilities (i.e. were conservative), low-delusion-prone participants in the control condition underestimated but those in the betting condition provided accurate estimates. In the control condition, there was a trend for high-delusion-prone participants to give higher estimates than low-delusion-prone participants, which is consistent with previous reports of "jumping to conclusions" in delusion-prone participants. However, our findings in the betting condition, where high-delusion-prone participants provided lower estimates than low-delusion-prone participants (who were accurate), are inconsistent with the jumping-to-conclusions effect in both a relative and an absolute sense. Our findings highlight the key role of task incentives and underscore the importance of comparing the responses of delusion-prone participants to an objective rational standard as well as to the responses of non-delusion-prone participants.

  2. Learning in an interactive simulation tool against landslide risks: the role of strength and availability of experiential feedback

    NASA Astrophysics Data System (ADS)

    Chaturvedi, Pratik; Arora, Akshit; Dutt, Varun

    2018-06-01

    Feedback via simulation tools is likely to help people improve their decision-making against natural disasters. However, little is known on how differing strengths of experiential feedback and feedback's availability in simulation tools influence people's decisions against landslides. We tested the influence of differing strengths of experiential feedback and feedback's availability on people's decisions against landslides in Mandi, Himachal Pradesh, India. Experiential feedback (high or low) and feedback's availability (present or absent) were varied across four between-subject conditions in a tool called the Interactive Landslide Simulation (ILS): high damage with feedback present, high damage with feedback absent, low damage with feedback present, and low damage with feedback absent. In high-damage conditions, the probabilities of damages to life and property due to landslides were 10 times higher than those in the low-damage conditions. In feedback-present conditions, experiential feedback was provided in numeric, text, and graphical formats in ILS. In feedback-absent conditions, the probabilities of damages were described; however, there was no experiential feedback present. Investments were greater in conditions where experiential feedback was present and damages were high compared to conditions where experiential feedback was absent and damages were low. Furthermore, only high-damage feedback produced learning in ILS. Simulation tools like ILS seem appropriate for landslide risk communication and for performing what-if analyses.

  3. Correlation between crash avoidance maneuvers and injury severity sustained by motorcyclists in single-vehicle crashes.

    PubMed

    Wang, Chen; Lu, Linjun; Lu, Jian; Wang, Tao

    2016-01-01

    In order to improve motorcycle safety, this article examines the correlation between crash avoidance maneuvers and injury severity sustained by motorcyclists, under multiple precrash conditions. Ten-year crash data for single-vehicle motorcycle crashes from the General Estimates Systems (GES) were analyzed, using partial proportional odds models (i.e., generalized ordered logit models). The modeling results show that "braking (no lock-up)" is associated with a higher probability of increased severity, whereas "braking (lock-up)" is associated with a higher probability of decreased severity, under all precrash conditions. "Steering" is associated with a higher probability of reduced injury severity when other vehicles are encroaching, whereas it is correlated with high injury severity under other conditions. "Braking and steering" is significantly associated with a higher probability of low severity under "animal encounter and object presence," whereas it is surprisingly correlated with high injury severity when motorcycles are traveling off the edge of the road. The results also show that a large number of motorcyclists did not perform any crash avoidance maneuvers or conducted crash avoidance maneuvers that are significantly associated with high injury severity. In general, this study suggests that precrash maneuvers are an important factor associated with motorcyclists' injury severity. To improve motorcycle safety, training/educational programs should be considered to improve safety awareness and adjust driving habits of motorcyclists. Antilock brakes and such systems are also promising, because they could effectively prevent brake lock-up and assist motorcyclists in maneuvering during critical conditions. This study also provides valuable information for the design of motorcycle training curriculum.

  4. Vertical changes in the probability distribution of downward irradiance within the near-surface ocean under sunny conditions

    NASA Astrophysics Data System (ADS)

    Gernez, Pierre; Stramski, Dariusz; Darecki, Miroslaw

    2011-07-01

    Time series measurements of fluctuations in underwater downward irradiance, Ed, within the green spectral band (532 nm) show that the probability distribution of instantaneous irradiance varies greatly as a function of depth within the near-surface ocean under sunny conditions. Because of intense light flashes caused by surface wave focusing, the near-surface probability distributions are highly skewed to the right and are heavy tailed. The coefficients of skewness and excess kurtosis at depths smaller than 1 m can exceed 3 and 20, respectively. We tested several probability models, such as lognormal, Gumbel, Fréchet, log-logistic, and Pareto, which are potentially suited to describe the highly skewed heavy-tailed distributions. We found that the models cannot approximate with consistently good accuracy the high irradiance values within the right tail of the experimental distribution where the probability of these values is less than 10%. This portion of the distribution corresponds approximately to light flashes with Ed > 1.5⟨Ed⟩, where ⟨Ed⟩ is the time-averaged downward irradiance. However, the remaining part of the probability distribution covering all irradiance values smaller than the 90th percentile can be described with a reasonable accuracy (i.e., within 20%) with a lognormal model for all 86 measurements from the top 10 m of the ocean included in this analysis. As the intensity of irradiance fluctuations decreases with depth, the probability distribution tends toward a function symmetrical around the mean like the normal distribution. For the examined data set, the skewness and excess kurtosis assumed values very close to zero at a depth of about 10 m.
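    A hedged sketch of the distributional check the abstract describes: fit a two-parameter lognormal to normalized irradiance samples and compare fitted and empirical probabilities of "flash" values exceeding 1.5 times the mean (the synthetic sample below merely stands in for the measured Ed series).

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      ed = rng.lognormal(mean=0.0, sigma=0.6, size=5000)  # placeholder irradiance series
      ed_norm = ed / ed.mean()                            # Ed normalized by its mean

      print("skewness:", stats.skew(ed_norm))
      print("excess kurtosis:", stats.kurtosis(ed_norm))

      shape, loc, scale = stats.lognorm.fit(ed_norm, floc=0)
      p_flash_fitted = stats.lognorm.sf(1.5, shape, loc=loc, scale=scale)
      p_flash_empirical = np.mean(ed_norm > 1.5)
      print("P(Ed > 1.5 * mean) fitted:", p_flash_fitted, "empirical:", p_flash_empirical)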

  5. Chemical Separation of Fe-Ni Particles after Impact

    NASA Astrophysics Data System (ADS)

    Miura, Y.; Fukuyama, S.; Kedves, M. A.; Yamori, A.; Okamoto, M.; Gucsik, A.

    Tiny grains of the Fe-Ni system originating from planetesimals or meteoroids can survive solid (or melt)-solid impact reactions even after the impact process, probably together with a high-pressure form of the Fe phase. Impact fragments dominated by the Fe-Si(-Ni) system can be formed under the vapor conditions of impact reactions at terrestrial and artificial impact craters and in spherules, and fragments with Ni-Cl(-S) compositions are formed under the vapor conditions of artificial impact experiments on the Barringer iron meteorite. These impact grains of Fe-bearing composition, or high-pressure forms of iron-rich phases, will probably be found on asteroids in future exploration.

  6. [Infant and child mortality in Latin America].

    PubMed

    Behm, H; Primante, D A

    1978-04-01

    High mortality rates persist in Latin America, and data collection is made very difficult by the lack of reliable statistics. A study was initiated in 1976 to measure the probability of mortality from birth to 2 years of age in 12 Latin American countries. The Brass method was used and applied to population censuses. The probability of mortality is extremely heterogeneous and regularly very high, varying from a maximum of 202/1000 in Bolivia to a minimum of 112/1000 in Uruguay. In comparison, the same probability is 21/1000 in the U.S. and 11/1000 in Sweden. Mortality in rural areas is much higher than in urban ones and varies according to the degree of education of the mother; children born to mothers who had 10 years of formal education have the lowest risk of death. Children born to the indigenous population, largely illiterate and living in the poorest of conditions, have the highest probability of death, a probability reaching 67% of all deaths under 2 years. National health services in Latin America, although vastly improved and improving, still do not meet the needs of the population, especially the rural population, and structural and historical conditions hamper a wider application of existing medical knowledge.

  7. Oil spill contamination probability in the southeastern Levantine basin.

    PubMed

    Goldman, Ron; Biton, Eli; Brokovich, Eran; Kark, Salit; Levin, Noam

    2015-02-15

    Recent gas discoveries in the eastern Mediterranean Sea led to multiple operations with substantial economic interest, and with them there is a risk of oil spills and their potential environmental impacts. To examine the potential spatial distribution of this threat, we created seasonal maps of the probability of oil spill pollution reaching an area in the Israeli coastal and exclusive economic zones, given knowledge of its initial sources. We performed simulations of virtual oil spills using realistic atmospheric and oceanic conditions. The resulting maps show dominance of the alongshore northerly current, which causes the high probability areas to be stretched parallel to the coast, increasing contamination probability downstream of source points. The seasonal westerly wind forcing determines how wide the high probability areas are, and may also restrict these to a small coastal region near source points. Seasonal variability in probability distribution, oil state, and pollution time is also discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.

  8. Iteroparity in the variable environment of the salamander Ambystoma tigrinum

    USGS Publications Warehouse

    Church, D.R.; Bailey, L.L.; Wilbur, H.M.; Kendall, W.L.; Hines, J.E.

    2007-01-01

    Simultaneous estimation of survival, reproduction, and movement is essential to understanding how species maximize lifetime reproduction in environments that vary across space and time. We conducted a four-year, capture–recapture study of three populations of eastern tiger salamanders (Ambystoma tigrinum tigrinum) and used multistate mark–recapture statistical methods to estimate the manner in which movement, survival, and breeding probabilities vary under different environmental conditions across years and among populations and habitats. We inferred how individuals may mitigate risks of mortality and reproductive failure by deferring breeding or by moving among populations. Movement probabilities among populations were extremely low despite high spatiotemporal variation in reproductive success and survival, suggesting possible costs to movements among breeding ponds. Breeding probabilities varied between wet and dry years and according to whether or not breeding was attempted in the previous year. Estimates of survival in the nonbreeding, forest habitat varied among populations but were consistent across time. Survival in breeding ponds was generally high in years with average or high precipitation, except for males in an especially ephemeral pond. A drought year incurred severe survival costs in all ponds to animals that attempted breeding. Female salamanders appear to defer these episodic survival costs of breeding by choosing not to breed in years when the risk of adult mortality is high. Using stochastic simulations of survival and breeding under historical climate conditions, we found that an interaction between breeding probabilities and mortality limits the probability of multiple breeding attempts differently between the sexes and among populations.

  9. Climate drives inter-annual variability in probability of high severity fire occurrence in the western United States

    NASA Astrophysics Data System (ADS)

    Keyser, Alisa; Westerling, Anthony LeRoy

    2017-05-01

    A long history of fire suppression in the western United States has significantly changed forest structure and ecological function, leading to increasingly uncharacteristic fires in terms of size and severity. Prior analyses of fire severity in California forests showed that time since last fire and fire weather conditions predicted fire severity very well, while a larger regional analysis showed that topography and climate were important predictors of high severity fire. There has not yet been a large-scale study that incorporates topography, vegetation and fire-year climate to determine regional scale high severity fire occurrence. We developed models to predict the probability of high severity fire occurrence for the western US. We predict high severity fire occurrence with some accuracy, and identify the relative importance of predictor classes in determining the probability of high severity fire. The inclusion of both vegetation and fire-year climate predictors was critical for model skill in identifying fires with high fractional fire severity. The inclusion of fire-year climate variables allows this model to forecast inter-annual variability in areas at future risk of high severity fire, beyond what slower-changing fuel conditions alone can accomplish. This allows for more targeted land management, including resource allocation for fuels reduction treatments to decrease the risk of high severity fire.

  10. The Neural Correlates of Health Risk Perception in Individuals with Low and High Numeracy

    ERIC Educational Resources Information Center

    Vogel, Stephan E.; Keller, Carmen; Koschutnig, Karl; Reishofer, Gernot; Ebner, Franz; Dohle, Simone; Siegrist, Michael; Grabner, Roland H.

    2016-01-01

    The ability to use numerical information in different contexts is a major goal of mathematics education. In health risk communication, outcomes of a medical condition are frequently expressed in probabilities. Difficulties to accurately represent probability information can result in unfavourable medical decisions. To support individuals with…

  11. Optimum space shuttle launch times relative to natural environment

    NASA Technical Reports Server (NTRS)

    King, R. L.

    1977-01-01

    Three sets of meteorological criteria were analyzed to determine the probabilities of favorable launch and landing conditions. Probabilities were computed for every 3 hours on a yearly basis using 14 years of weather data. These temporal probability distributions, applicable to the three sets of weather criteria encompassing benign, moderate and severe weather conditions, were computed for both Kennedy Space Center (KSC) and Edwards Air Force Base. In addition, conditional probabilities were computed for unfavorable weather conditions occurring after a delay which may or may not be due to weather conditions. Also, for KSC, the probabilities of favorable landing conditions at various times after favorable launch conditions have prevailed have been computed so that mission probabilities may be more accurately computed for those time periods when persistence strongly correlates weather conditions. Moreover, the probabilities and conditional probabilities of the occurrence of both favorable and unfavorable events for each individual criterion were computed to indicate the significance of each weather element to the overall result.

  12. Predictability of currency market exchange

    NASA Astrophysics Data System (ADS)

    Ohira, Toru; Sazuka, Naoya; Marumo, Kouhei; Shimizu, Tokiko; Takayasu, Misako; Takayasu, Hideki

    2002-05-01

    We analyze tick data of the yen-dollar exchange rate with a focus on its up and down movements. We show that there exists a rather particular conditional probability structure in such high frequency data. This result provides evidence to question one of the basic assumptions of traditional market theory, in which such bias in high frequency price movements is regarded as absent. We also construct systematically a random walk model reflecting this probability structure.

  13. Quantifying seining detection probability for fishes of Great Plains sand‐bed rivers

    USGS Publications Warehouse

    Mollenhauer, Robert; Logue, Daniel R.; Brewer, Shannon K.

    2018-01-01

    Species detection error (i.e., imperfect and variable detection probability) is an essential consideration when investigators map distributions and interpret habitat associations. When fish detection error that is due to highly variable instream environments needs to be addressed, sand‐bed streams of the Great Plains represent a unique challenge. We quantified seining detection probability for diminutive Great Plains fishes across a range of sampling conditions in two sand‐bed rivers in Oklahoma. Imperfect detection resulted in underestimates of species occurrence using naïve estimates, particularly for less common fishes. Seining detection probability also varied among fishes and across sampling conditions. We observed a quadratic relationship between water depth and detection probability, in which the exact nature of the relationship was species‐specific and dependent on water clarity. Similarly, the direction of the relationship between water clarity and detection probability was species‐specific and dependent on differences in water depth. The relationship between water temperature and detection probability was also species dependent, where both the magnitude and direction of the relationship varied among fishes. We showed how ignoring detection error confounded an underlying relationship between species occurrence and water depth. Despite imperfect and heterogeneous detection, our results support that determining species absence can be accomplished with two to six spatially replicated seine hauls per 200‐m reach under average sampling conditions; however, required effort would be higher under certain conditions. Detection probability was low for the Arkansas River Shiner Notropis girardi, which is federally listed as threatened, and more than 10 seine hauls per 200‐m reach would be required to assess presence across sampling conditions. Our model allows scientists to estimate sampling effort to confidently assess species occurrence, which maximizes the use of available resources. Increased implementation of approaches that consider detection error promote ecological advancements and conservation and management decisions that are better informed.
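    The sampling-effort statement at the end of this abstract follows from a standard calculation: if one haul detects a present species with probability p, the chance of at least one detection in n independent hauls is 1 - (1 - p)**n. A short sketch, with illustrative p values rather than the fitted species-specific estimates:

      import math

      def hauls_needed(p, target=0.95):
          """Smallest n with P(at least one detection in n hauls) >= target."""
          return math.ceil(math.log(1.0 - target) / math.log(1.0 - p))

      for p in (0.40, 0.20, 0.10):
          print(f"per-haul p = {p:.2f}: {hauls_needed(p)} hauls for 95% confidence")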

  14. Behavioral economic insights into physician tobacco treatment decision-making.

    PubMed

    Leone, Frank T; Evers-Casey, Sarah; Graden, Sarah; Schnoll, Robert

    2015-03-01

    Physicians self-report high adherence rates for the Ask and Advise behaviors of tobacco dependence treatment but are much less likely to engage in "next steps" consistent with sophisticated management of chronic illness. A variety of potential explanations have been offered, yet each lacks face validity in light of experience with other challenging medical conditions. Our objective was to conduct a preliminary exploration of the behavioral economics of tobacco treatment decision-making in the face of uncertain outcomes, seeking evidence that these behaviors may be explained within the framework of Prospect Theory. Four physician cohorts were polled regarding their impressions of the utility of tobacco use treatment and their estimations of "success" probabilities. Contingent valuation was estimated by asking respondents to make monetary tradeoffs relative to three common chronic conditions. Responses from all four cohorts showed a similar pattern of high utility of tobacco use treatment but low success probability when compared with the other chronic medical conditions. Following instructional methods aimed at controverting cognitive biases related to tobacco, this pattern was reversed, with success probabilities attaining higher valuation than for diabetes. Important presuppositions regarding the potential "success" of tobacco-related patient interactions are likely limiting physician engagement by favoring the most secure visit outcome despite the limited potential for health gains. Under these conditions, low engagement rates would be consistent with Prospect Theory predictions. Interventions aimed at counteracting the cognitive biases limiting estimations of success probabilities seem to effectively reverse this pattern and provide clues to improving the adoption of target clinical behaviors.
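    To make the Prospect Theory framing concrete, the sketch below applies the standard Tversky-Kahneman value and probability-weighting functions (textbook parameter values and hypothetical payoffs, not quantities fitted to the physician data): when the perceived probability of treatment "success" is low, the weighted value of engaging falls below that of the secure visit outcome.

      def value(x, alpha=0.88, beta=0.88, lam=2.25):
          """Prospect-theory value function: concave for gains, steeper for losses."""
          return x ** alpha if x >= 0 else -lam * (-x) ** beta

      def weight(p, gamma=0.61):
          """Probability weighting function for gains."""
          return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

      # Hypothetical payoffs: engaging yields a gain of 10 with perceived success
      # probability p, versus a sure smaller gain of 2 from the secure outcome.
      for p in (0.05, 0.20, 0.50):
          print(f"p = {p:.2f}: engage = {weight(p) * value(10):.2f}, secure = {value(2):.2f}")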

  15. Lost in search: (Mal-)adaptation to probabilistic decision environments in children and adults.

    PubMed

    Betsch, Tilmann; Lehmann, Anne; Lindow, Stefanie; Lang, Anna; Schoemann, Martin

    2016-02-01

    Adaptive decision making in probabilistic environments requires individuals to use probabilities as weights in predecisional information searches and/or when making subsequent choices. Within a child-friendly computerized environment (Mousekids), we tracked 205 children's (105 children 5-6 years of age and 100 children 9-10 years of age) and 103 adults' (age range: 21-22 years) search behaviors and decisions under different probability dispersions (.17, .33, .83 vs. .50, .67, .83) and constraint conditions (instructions to limit search: yes vs. no). All age groups limited their depth of search when instructed to do so and when probability dispersion was high (range: .17-.83). Unlike adults, children failed to use probabilities as weights for their searches, which were largely not systematic. When examining choices, however, elementary school children (unlike preschoolers) systematically used probabilities as weights in their decisions. This suggests that an intuitive understanding of probabilities and the capacity to use them as weights during integration is not a sufficient condition for applying simple selective search strategies that place one's focus on weight distributions. PsycINFO Database Record (c) 2016 APA, all rights reserved.

  16. The probability of object-scene co-occurrence influences object identification processes.

    PubMed

    Sauvé, Geneviève; Harmand, Mariane; Vanni, Léa; Brodeur, Mathieu B

    2017-07-01

    Contextual information allows the human brain to make predictions about the identity of objects that might be seen, and irregularities between an object and its background slow down perception and identification processes. Bar and colleagues modeled the mechanisms underlying this beneficial effect, suggesting that the brain stores information about the statistical regularities of object and scene co-occurrence. Their model suggests that these recurring regularities could be conceptualized along a continuum in which the probability of seeing an object within a given scene can be high (probable condition), moderate (improbable condition) or null (impossible condition). In the present experiment, we propose to disentangle the electrophysiological correlates of these context effects by directly comparing object-scene pairs found along this continuum. We recorded the event-related potentials of 30 healthy participants (18-34 years old) and analyzed their brain activity in three time windows associated with context effects. We observed anterior negativities between 250 and 500 ms after object onset for the improbable and impossible conditions (improbable more negative than impossible) compared to the probable condition, as well as a parieto-occipital positivity (improbable more positive than impossible). The brain may use different processing pathways to identify objects depending on whether the probability of co-occurrence with the scene is moderate (relying more on top-down effects) or null (relying more on bottom-up influences). The posterior positivity could index error monitoring aimed at ensuring that no false information is integrated into mental representations of the world.

  17. A Method of Face Detection with Bayesian Probability

    NASA Astrophysics Data System (ADS)

    Sarker, Goutam

    2010-10-01

    The objective of face detection is to identify all images which contain a face, irrespective of its orientation, illumination conditions, etc. This is a hard problem, because faces are highly variable in size, shape, lighting conditions, etc. Many methods have been designed and developed to detect faces in a single image. The present paper is based on one `Appearance Based Method', which relies on learning facial and non-facial features from image examples. This, in turn, is based on statistical analysis of examples and counterexamples of facial images and employs a Bayesian conditional classification rule to estimate the probability that a face (or non-face) is present within an image frame. The detection rate of the present system is very high, and the numbers of false positive and false negative detections are therefore substantially low.

  18. Estimation of post-test probabilities by residents: Bayesian reasoning versus heuristics?

    PubMed

    Hall, Stacey; Phang, Sen Han; Schaefer, Jeffrey P; Ghali, William; Wright, Bruce; McLaughlin, Kevin

    2014-08-01

    Although the process of diagnosing invariably begins with a heuristic, we encourage our learners to support their diagnoses by analytical cognitive processes, such as Bayesian reasoning, in an attempt to mitigate the effects of heuristics on diagnosing. There are, however, limited data on the use ± impact of Bayesian reasoning on the accuracy of disease probability estimates. In this study our objective was to explore whether Internal Medicine residents use a Bayesian process to estimate disease probabilities by comparing their disease probability estimates to literature-derived Bayesian post-test probabilities. We gave 35 Internal Medicine residents four clinical vignettes in the form of a referral letter and asked them to estimate the post-test probability of the target condition in each case. We then compared these to literature-derived probabilities. For each vignette the estimated probability was significantly different from the literature-derived probability. For the two cases with low literature-derived probability our participants significantly overestimated the probability of these target conditions being the correct diagnosis, whereas for the two cases with high literature-derived probability the estimated probability was significantly lower than the calculated value. Our results suggest that residents generate inaccurate post-test probability estimates. Possible explanations for this include ineffective application of Bayesian reasoning, attribute substitution whereby a complex cognitive task is replaced by an easier one (e.g., a heuristic), or systematic rater bias, such as central tendency bias. Further studies are needed to identify the reasons for inaccuracy of disease probability estimates and to explore ways of improving accuracy.
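
    A hedged sketch of the literature-derived calculation the residents' estimates were compared against: converting a pre-test probability into a post-test probability through the odds form of Bayes' theorem with a likelihood ratio. The numbers are illustrative, not the vignette values used in the study.

```python
def post_test_probability(pre_test_p, likelihood_ratio):
    """Odds form of Bayes' theorem: post-odds = pre-odds * LR."""
    pre_odds = pre_test_p / (1 - pre_test_p)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1 + post_odds)

# e.g., a 10% pre-test probability and a positive test with LR+ = 8
print(post_test_probability(0.10, 8))  # ~0.47
```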

  19. Students' Understanding of Conditional Probability on Entering University

    ERIC Educational Resources Information Center

    Reaburn, Robyn

    2013-01-01

    An understanding of conditional probability is essential for students of inferential statistics as it is used in Null Hypothesis Tests. Conditional probability is also used in Bayes' theorem, in the interpretation of medical screening tests and in quality control procedures. This study examines the understanding of conditional probability of…

  20. Conditional long-term survival following minimally invasive robotic mitral valve repair: a health services perspective.

    PubMed

    Efird, Jimmy T; Griffin, William F; Gudimella, Preeti; O'Neal, Wesley T; Davies, Stephen W; Crane, Patricia B; Anderson, Ethan J; Kindell, Linda C; Landrine, Hope; O'Neal, Jason B; Alwair, Hazaim; Kypson, Alan P; Nifong, Wiley L; Chitwood, W Randolph

    2015-09-01

    Conditional survival is defined as the probability of surviving an additional number of years beyond that already survived. The aim of this study was to compute conditional survival in patients who received a robotically assisted, minimally invasive mitral valve repair procedure (RMVP). Patients who received RMVP with annuloplasty band from May 2000 through April 2011 were included. A 5- and 10-year conditional survival model was computed using a multivariable product-limit method. Non-smoking men (≤65 years) who presented in sinus rhythm had a 96% probability of surviving at least 10 years if they survived their first year following surgery. In contrast, recent female smokers (>65 years) with preoperative atrial fibrillation only had an 11% probability of surviving beyond 10 years if alive after one year post-surgery. In the context of an increasingly managed healthcare environment, conditional survival provides useful information for patients needing to make important treatment decisions, physicians seeking to select patients most likely to benefit long-term following RMVP, and hospital administrators needing to comparatively assess the life-course economic value of high-tech surgical procedures.
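
    A minimal sketch of the conditional survival quantity defined above, CS(t | s) = S(t) / S(s), i.e., the probability of surviving to year t given survival to year s. The survival curve values below are invented for illustration and are not the study's product-limit estimates.

```python
def conditional_survival(surv, t, s):
    """surv maps years since surgery -> survival probability; returns P(T > t | T > s)."""
    return surv[t] / surv[s]

surv = {0: 1.00, 1: 0.93, 5: 0.78, 10: 0.62}      # assumed example curve
print(conditional_survival(surv, 10, 1))          # 10-year survival given the first year survived
```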

  1. CPROB: A COMPUTATIONAL TOOL FOR CONDUCTING CONDITIONAL PROBABILITY ANALYSIS

    EPA Science Inventory

    Conditional probability analysis measures the probability of observing one event given that another event has occurred. In an environmental context, conditional probability analysis helps assess the association between an environmental contaminant (i.e. the stressor) and the ec...

  2. Study on conditional probability of surface rupture: effect of fault dip and width of seismogenic layer

    NASA Astrophysics Data System (ADS)

    Inoue, N.

    2017-12-01

    The conditional probability of surface rupture is affected by various factors, such as shallow material properties, the earthquake process, ground motions, and so on. Toda (2013) pointed out differences in the conditional probability of strike-slip and reverse faults when the fault dip and the width of the seismogenic layer are considered. This study evaluated the conditional probability of surface rupture with the following procedure. Fault geometry was determined from a randomly generated magnitude based on the method of The Headquarters for Earthquake Research Promotion (2017). If the defined fault plane did not saturate the assumed width of the seismogenic layer, the fault plane depth was assigned randomly within the seismogenic layer. A logistic analysis was applied to two data sets: the surface displacement calculated from the defined source fault with dislocation methods (Wang et al., 2003), and the depth of the top of the defined source fault. The conditional probability estimated from surface displacement was higher for reverse faults than for strike-slip faults, which coincides with previous similar studies (e.g., Kagawa et al., 2004; Kataoka and Kusakabe, 2005). In contrast, the probability estimated from the depth of the source fault was higher for thrust faults than for strike-slip and reverse faults, a trend similar to the conditional probability from PFDHA results (Youngs et al., 2003; Moss and Ross, 2011). The combined simulated results for thrust and reverse faults also show low probability. The worldwide compiled reverse-fault data include low-dip-angle earthquakes. In the case of Japanese reverse faults, by contrast, the conditional probability computed with fewer low-dip-angle earthquakes may be low and similar to that of strike-slip faults (e.g., Takao et al., 2013). In the future, numerical simulations that consider the failure condition of the surface caused by the source fault should be performed to examine the amount of displacement and the conditional probability quantitatively.

  3. Monthly streamflow forecasting based on hidden Markov model and Gaussian Mixture Regression

    NASA Astrophysics Data System (ADS)

    Liu, Yongqi; Ye, Lei; Qin, Hui; Hong, Xiaofeng; Ye, Jiajun; Yin, Xingli

    2018-06-01

    Reliable streamflow forecasts can be highly valuable for water resources planning and management. In this study, we combined a hidden Markov model (HMM) and Gaussian Mixture Regression (GMR) for probabilistic monthly streamflow forecasting. The HMM is initialized using a kernelized K-medoids clustering method, and the Baum-Welch algorithm is then executed to learn the model parameters. GMR derives a conditional probability distribution for the predictand given covariate information, including the antecedent flow at a local station and two surrounding stations. The performance of HMM-GMR was verified based on the mean square error and continuous ranked probability score skill scores. The reliability of the forecasts was assessed by examining the uniformity of the probability integral transform values. The results show that HMM-GMR obtained reasonably high skill scores and the uncertainty spread was appropriate. Different HMM states were assumed to be different climate conditions, which would lead to different types of observed values. We demonstrated that the HMM-GMR approach can handle multimodal and heteroscedastic data.
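
    A hedged sketch of the Gaussian Mixture Regression step described above: fit a joint Gaussian mixture to (covariates, predictand) pairs and read off the conditional mean and variance of the predictand given new covariates. It uses scikit-learn and SciPy as stand-ins and synthetic data; the kernelized K-medoids initialization and the HMM state conditioning of the actual HMM-GMR scheme are omitted.

```python
import numpy as np
from scipy.stats import multivariate_normal
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
x = rng.normal(size=(500, 2))                     # e.g., antecedent flows (assumed covariates)
y = (x @ np.array([0.7, -0.3]) + rng.normal(scale=0.2, size=500)).reshape(-1, 1)
gmm = GaussianMixture(n_components=3, covariance_type="full").fit(np.hstack([x, y]))

def gmr_conditional(gmm, x_new, dx):
    """Mean and variance of p(y | x) for a one-dimensional predictand; dx = number of covariates."""
    resp, cond_means, cond_vars = [], [], []
    for pi_k, mu, S in zip(gmm.weights_, gmm.means_, gmm.covariances_):
        Sxx, Sxy, Syx, Syy = S[:dx, :dx], S[:dx, dx:], S[dx:, :dx], S[dx:, dx:]
        resp.append(pi_k * multivariate_normal.pdf(x_new, mu[:dx], Sxx))
        gain = Syx @ np.linalg.inv(Sxx)
        cond_means.append(mu[dx:] + gain @ (x_new - mu[:dx]))
        cond_vars.append(Syy - gain @ Sxy)
    resp = np.array(resp) / np.sum(resp)
    cond_means = np.array(cond_means).reshape(-1)
    cond_vars = np.array(cond_vars).reshape(-1)
    mean = np.sum(resp * cond_means)
    var = np.sum(resp * (cond_vars + cond_means**2)) - mean**2   # law of total variance
    return mean, var

print(gmr_conditional(gmm, np.array([0.5, -0.2]), dx=2))
```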

  4. Behavioral Economic Insights into Physician Tobacco Treatment Decision-Making

    PubMed Central

    Evers-Casey, Sarah; Graden, Sarah; Schnoll, Robert

    2015-01-01

    Rationale: Physicians self-report high adherence rates for Ask and Advise behaviors of tobacco dependence treatment but are much less likely to engage in “next steps” consistent with sophisticated management of chronic illness. A variety of potential explanations have been offered, yet each lacks face validity in light of experience with other challenging medical conditions. Objective: Conduct a preliminary exploration of the behavioral economics of tobacco treatment decision-making in the face of uncertain outcomes, seeking evidence that behaviors may be explained within the framework of Prospect Theory. Methods: Four physician cohorts were polled regarding their impressions of the utility of tobacco use treatment and their estimations of “success” probabilities. Contingent valuation was estimated by asking respondents to make monetary tradeoffs relative to three common chronic conditions. Measurements and Main Results: Responses from all four cohorts showed a similar pattern of high utility of tobacco use treatment but low success probability when compared with the other chronic medical conditions. Following instructional methods aimed at controverting cognitive biases related to tobacco, this pattern was reversed, with success probabilities attaining higher valuation than for diabetes. Conclusions: Important presuppositions regarding the potential “success” of tobacco-related patient interactions are likely limiting physician engagement by favoring the most secure visit outcome despite the limited potential for health gains. Under these conditions, low engagement rates would be consistent with Prospect Theory predictions. Interventions aimed at counteracting the cognitive biases limiting estimations of success probabilities seem to effectively reverse this pattern and provide clues to improving the adoption of target clinical behaviors. PMID:25664676

  5. Modeling summer month hydrological drought probabilities in the United States using antecedent flow conditions

    USGS Publications Warehouse

    Austin, Samuel H.; Nelms, David L.

    2017-01-01

    Climate change raises concern that risks of hydrological drought may be increasing. We estimate hydrological drought probabilities for rivers and streams in the United States (U.S.) using maximum likelihood logistic regression (MLLR). Streamflow data from winter months are used to estimate the chance of hydrological drought during summer months. Daily streamflow data collected from 9,144 stream gages from January 1, 1884 through January 9, 2014 provide hydrological drought streamflow probabilities for July, August, and September as functions of streamflows during October, November, December, January, and February, estimating outcomes 5-11 months ahead of their occurrence. Few drought prediction methods exploit temporal links among streamflows. We find MLLR modeling of drought streamflow probabilities exploits the explanatory power of temporally linked water flows. MLLR models with strong correct classification rates were produced for streams throughout the U.S. One ad hoc test of correct prediction rates of September 2013 hydrological droughts exceeded 90% correct classification. Some of the best-performing models coincide with areas of high concern including the West, the Midwest, Texas, the Southeast, and the Mid-Atlantic. Using hydrological drought MLLR probability estimates in a water management context can inform understanding of drought streamflow conditions, provide warning of future drought conditions, and aid water management decision making.
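
    A hedged sketch of the general MLLR idea: a logistic regression that maps winter streamflow covariates to the probability of a summer hydrological drought. The covariates, threshold, and synthetic data below are illustrative assumptions, not the authors' gage-by-gage model specification.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
winter_flow = rng.lognormal(mean=3.0, sigma=0.5, size=(400, 2))   # e.g., Nov and Jan mean flows (assumed)
# Synthetic summer drought indicator: lower winter flow raises drought odds.
p_true = 1 / (1 + np.exp(0.15 * winter_flow.sum(axis=1) - 6))
drought = rng.binomial(1, p_true)

# A large C approximates an unpenalized maximum likelihood fit.
model = LogisticRegression(C=1e6).fit(np.log(winter_flow), drought)
new_site = np.log([[18.0, 22.0]])
print("P(summer drought | winter flows):", model.predict_proba(new_site)[0, 1])
```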

  6. Organic priority substances and microbial processes in river sediments subject to contrasting hydrological conditions.

    PubMed

    Zoppini, Annamaria; Ademollo, Nicoletta; Amalfitano, Stefano; Casella, Patrizia; Patrolecco, Luisa; Polesello, Stefano

    2014-06-15

    Flood and drought events of higher intensity and frequency are expected to increase in arid and semi-arid regions, in which temporary rivers represent both a water resource and an aquatic ecosystem to be preserved. In this study, we explored the variation of two classes of hazardous substances (Polycyclic Aromatic Hydrocarbons and Nonylphenols) and the functioning of the microbial community in river sediments subject to hydrological fluctuations (Candelaro river basin, Italy). Overall, the concentration of pollutants (∑PAHs range 8-275 ng g(-1); ∑NPs range 299-4858 ng g(-1)) suggests a moderate degree of contamination. The conditions in which the sediments were tested, flow (high/low) and no flow (wet/dry/arid), were associated with significant differences in the chemical and microbial properties. The total organic carbon contribution decreased together with the stream flow reduction, while the contribution of C-PAHs and C-NPs tended to increase. NPs were relatively more concentrated in sediments under high flow, while the more hydrophobic PAHs accumulated under low and no flow conditions. Passing from high to no flow conditions, a gradual reduction of microbial processes was observed, reaching the lowest specific bacterial carbon production rates (0.06 fmol C h(-1) cell(-1)), the lowest extracellular enzyme activities, and the highest doubling time (40 h) in arid sediments. In conclusion, different scenarios for the mobilization of pollutants and microbial processes can be identified under contrasting hydrological conditions: (i) the mobilization of pollutants under high flow and a relatively higher probability of biodegradation; (ii) the accumulation of pollutants during low flow and a lower probability of biodegradation; (iii) the drastic reduction of pollutant concentrations under dry and arid conditions, probably independent of microbial activity (abiotic processes). Our findings suggest that a multi-faceted approach is needed for appropriate water resource exploitation and a more realistic prediction of the impact of pollutants in temporary waters. Copyright © 2014 Elsevier B.V. All rights reserved.

  7. Optimum space shuttle launch times relative to natural environment

    NASA Technical Reports Server (NTRS)

    King, R. L.

    1977-01-01

    The probabilities of favorable and unfavorable weather conditions for launch and landing of the STS under different criteria were computed for every three hours on a yearly basis using 14 years of weather data. These temporal probability distributions were considered for three sets of weather criteria encompassing benign, moderate and severe weather conditions for both Kennedy Space Center and Edwards Air Force Base. In addition, conditional probabilities were computed for unfavorable weather conditions occurring after a delay which may or may not have been due to weather conditions. Also computed for KSC were the probabilities of favorable landing conditions at various times after favorable launch conditions had prevailed. The probabilities were computed to indicate the significance of each weather element to the overall result.

  8. Reasoning and choice in the Monty Hall Dilemma (MHD): implications for improving Bayesian reasoning

    PubMed Central

    Tubau, Elisabet; Aguilar-Lleyda, David; Johnson, Eric D.

    2015-01-01

    The Monty Hall Dilemma (MHD) is a two-step decision problem involving counterintuitive conditional probabilities. The first choice is made among three equally probable options, whereas the second choice takes place after the elimination of one of the non-selected options which does not hide the prize. Differing from most Bayesian problems, statistical information in the MHD has to be inferred, either by learning outcome probabilities or by reasoning from the presented sequence of events. This often leads to suboptimal decisions and erroneous probability judgments. Specifically, decision makers commonly develop a wrong intuition that final probabilities are equally distributed, together with a preference for their first choice. Several studies have shown that repeated practice enhances sensitivity to the different reward probabilities, but does not facilitate correct Bayesian reasoning. However, modest improvements in probability judgments have been observed after guided explanations. To explain these dissociations, the present review focuses on two types of causes producing the observed biases: Emotional-based choice biases and cognitive limitations in understanding probabilistic information. Among the latter, we identify a crucial cause for the universal difficulty in overcoming the equiprobability illusion: Incomplete representation of prior and conditional probabilities. We conclude that repeated practice and/or high incentives can be effective for overcoming choice biases, but promoting an adequate partitioning of possibilities seems to be necessary for overcoming cognitive illusions and improving Bayesian reasoning. PMID:25873906
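
    A minimal simulation of the MHD choice described above, included only to make the counterintuitive conditional probabilities concrete: switching wins about two thirds of the time, staying about one third.

```python
import random

def monty_hall_trial(switch, rng=random):
    prize = rng.randrange(3)
    pick = rng.randrange(3)
    # The host opens a door that is neither the contestant's pick nor the prize.
    opened = next(d for d in range(3) if d != pick and d != prize)
    if switch:
        pick = next(d for d in range(3) if d != pick and d != opened)
    return pick == prize

n = 100_000
print("P(win | switch):", sum(monty_hall_trial(True) for _ in range(n)) / n)   # ~0.67
print("P(win | stay):  ", sum(monty_hall_trial(False) for _ in range(n)) / n)  # ~0.33
```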

  9. Bayesian probability analysis: a prospective demonstration of its clinical utility in diagnosing coronary disease

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Detrano, R.; Yiannikas, J.; Salcedo, E.E.

    One hundred fifty-four patients referred for coronary arteriography were prospectively studied with stress electrocardiography, stress thallium scintigraphy, cine fluoroscopy (for coronary calcifications), and coronary angiography. Pretest probabilities of coronary disease were determined based on age, sex, and type of chest pain. These and pooled literature values for the conditional probabilities of test results based on disease state were used in Bayes theorem to calculate posttest probabilities of disease. The results of the three noninvasive tests were compared for statistical independence, a necessary condition for their simultaneous use in Bayes theorem. The test results were found to demonstrate pairwise independence in patients with and those without disease. Some dependencies that were observed between the test results and the clinical variables of age and sex were not sufficient to invalidate application of the theorem. Sixty-eight of the study patients had at least one major coronary artery obstruction of greater than 50%. When these patients were divided into low-, intermediate-, and high-probability subgroups according to their pretest probabilities, noninvasive test results analyzed by Bayesian probability analysis appropriately advanced 17 of them by at least one probability subgroup while only seven were moved backward. Of the 76 patients without disease, 34 were appropriately moved into a lower probability subgroup while 10 were incorrectly moved up. We conclude that posttest probabilities calculated from Bayes theorem more accurately classified patients with and without disease than did pretest probabilities, thus demonstrating the utility of the theorem in this application.
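
    A hedged sketch of the sequential use of Bayes' theorem with conditionally independent tests, the calculation the abstract applies with pooled literature values; the sensitivities and specificities below are illustrative placeholders, not the study's pooled values.

```python
def combined_post_test_probability(pretest, results, tests):
    """results maps test name -> positive?; tests maps name -> (sensitivity, specificity)."""
    odds = pretest / (1 - pretest)
    for name, positive in results.items():
        sens, spec = tests[name]
        odds *= sens / (1 - spec) if positive else (1 - sens) / spec
    return odds / (1 + odds)

tests = {"stress_ecg": (0.68, 0.77), "thallium": (0.85, 0.85), "fluoroscopy": (0.58, 0.78)}  # assumed
pretest = 0.35    # from age, sex, and chest-pain type (assumed)
print(combined_post_test_probability(pretest, {"stress_ecg": True, "thallium": True, "fluoroscopy": False}, tests))
```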

  10. Tuberculosis in a South African prison – a transmission modelling analysis

    PubMed Central

    Johnstone-Robertson, Simon; Lawn, Stephen D; Welte, Alex; Bekker, Linda-Gail; Wood, Robin

    2015-01-01

    Background Prisons are recognised internationally as institutions with very high tuberculosis (TB) burdens where transmission is predominantly determined by contact between infectious and susceptible prisoners. A recent South African court case described the conditions under which prisoners awaiting trial were kept. With the use of these data, a mathematical model was developed to explore the interactions between incarceration conditions and TB control measures. Methods Cell dimensions, cell occupancy, lock-up time, TB incidence and treatment delays were derived from court evidence and judicial reports. Using the Wells-Riley equation and probability analyses of contact between prisoners, we estimated the current TB transmission probability within prison cells, and estimated transmission probabilities of improved levels of case finding in combination with implementation of national and international minimum standards for incarceration. Results Levels of overcrowding (230%) in communal cells and poor TB case finding result in annual TB transmission risks of 90%. Implementing current national or international cell occupancy recommendations would reduce TB transmission probabilities by 30% and 50%, respectively. Improved passive case finding, a modest ventilation increase or decreased lock-up time would have minimal impact on transmission if introduced individually. However, active case finding together with implementation of minimum national and international standards of incarceration could reduce transmission by 50% and 94%, respectively. Conclusions Current conditions of detention for awaiting-trial prisoners are highly conducive to the spread of drug-sensitive and drug-resistant TB. Combinations of simple well-established scientific control measures should be implemented urgently. PMID:22272961
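
    A hedged sketch of the Wells-Riley relation used in the modelling above, P(infection) = 1 - exp(-I q p t / Q); the parameter values are illustrative assumptions, not the cell dimensions, occupancy, or lock-up times taken from the court record.

```python
import math

def wells_riley(I, q, p, t, Q):
    """I infectors, q quanta/h per infector, p breathing rate (m3/h),
    t exposure time (h), Q outdoor-air ventilation (m3/h)."""
    return 1 - math.exp(-I * q * p * t / Q)

# e.g., one infectious prisoner, 13 quanta/h, 0.5 m3/h breathing rate,
# a 16-hour nightly lock-up, and 72 m3/h of ventilation (all values assumed)
print(wells_riley(I=1, q=13, p=0.5, t=16, Q=72))
```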

  11. An Alternative Version of Conditional Probabilities and Bayes' Rule: An Application of Probability Logic

    ERIC Educational Resources Information Center

    Satake, Eiki; Amato, Philip P.

    2008-01-01

    This paper presents an alternative version of formulas of conditional probabilities and Bayes' rule that demonstrate how the truth table of elementary mathematical logic applies to the derivations of the conditional probabilities of various complex, compound statements. This new approach is used to calculate the prior and posterior probabilities…

  12. A Comparative Study of Automated Infrasound Detectors - PMCC and AFD with Analyst Review.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Junghyun; Hayward, Chris; Zeiler, Cleat

    Automated detections calculated by the progressive multi-channel correlation (PMCC) method (Cansi, 1995) and the adaptive F detector (AFD) (Arrowsmith et al., 2009) are compared to the signals identified by five independent analysts. Each detector was applied to a four-hour time sequence recorded by the Korean infrasound array CHNAR. This array was used because it is composed of both small (<100 m) and large (~1000 m) aperture element spacing. The four-hour time sequence contained a number of easily identified signals under noise conditions with average RMS amplitudes varying from 1.2 to 4.5 mPa (1 to 5 Hz), estimated with a running five-minute window. The effectiveness of the detectors was estimated for the small aperture, large aperture, small aperture combined with the large aperture, and full array. The full and combined arrays performed the best for AFD under all noise conditions, while the large aperture array had the poorest performance for both detectors. PMCC produced results similar to AFD under the lower noise conditions, but did not produce as dramatic an increase in detections using the full and combined arrays. Both automated detectors and the analysts produced a decrease in detections under the higher noise conditions. Comparing the detection probabilities with Estimated Receiver Operating Characteristic (EROC) curves, we found that the smaller value of consistency for PMCC and the larger p-value for AFD had the highest detection probability. These parameters produced greater changes in detection probability than estimates of the false alarm rate. The detection probability was impacted the most by noise level, with low noise (average RMS amplitude of 1.7 mPa) having an average detection probability of ~40% and high noise (average RMS amplitude of 2.9 mPa) an average detection probability of ~23%.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, Shih-Jung

    Dynamic strength of the High Flux Isotope Reactor (HFIR) vessel to resist hypothetical accidents is analyzed by using the method of fracture mechanics. Vessel critical stresses are estimated by applying dynamic pressure pulses of a range of magnitudes and pulse-durations. The pulses versus time functions are assumed to be step functions. The probability of vessel fracture is then calculated by assuming a distribution of possible surface cracks of different crack depths. The probability distribution function for the crack depths is based on the form that is recommended by the Marshall report. The toughness of the vessel steel used in the analysis is based on the projected and embrittled value after 10 effective full power years from 1986. From the study made by Cheverton, Merkle and Nanstad, the weakest point on the vessel for fracture evaluation is known to be located within the region surrounding the tangential beam tube HB3. The increase in the probability of fracture is obtained as an extension of the result from that report for the regular operating condition to include conditions of higher dynamic pressures due to accident loadings. The increase in the probability of vessel fracture is plotted for a range of hoop stresses to indicate the vessel strength against hypothetical accident conditions.

  14. Streamflow distribution maps for the Cannon River drainage basin, southeast Minnesota, and the St. Louis River drainage basin, northeast Minnesota

    USGS Publications Warehouse

    Smith, Erik A.; Sanocki, Chris A.; Lorenz, David L.; Jacobsen, Katrin E.

    2017-12-27

    Streamflow distribution maps for the Cannon River and St. Louis River drainage basins were developed by the U.S. Geological Survey, in cooperation with the Legislative-Citizen Commission on Minnesota Resources, to illustrate relative and cumulative streamflow distributions. The Cannon River was selected to provide baseline data to assess the effects of potential surficial sand mining, and the St. Louis River was selected to determine the effects of ongoing Mesabi Iron Range mining. Each drainage basin (Cannon, St. Louis) was subdivided into nested drainage basins: the Cannon River was subdivided into 152 nested drainage basins, and the St. Louis River was subdivided into 353 nested drainage basins. For each smaller drainage basin, the estimated volumes of groundwater discharge (as base flow) and surface runoff flowing into all surface-water features were displayed under the following conditions: (1) extreme low-flow conditions, comparable to an exceedance-probability quantile of 0.95; (2) low-flow conditions, comparable to an exceedance-probability quantile of 0.90; (3) a median condition, comparable to an exceedance-probability quantile of 0.50; and (4) a high-flow condition, comparable to an exceedance-probability quantile of 0.02. Streamflow distribution maps were developed using flow-duration curve exceedance-probability quantiles in conjunction with Soil-Water-Balance model outputs; both the flow-duration curve and Soil-Water-Balance models were built upon previously published U.S. Geological Survey reports. The selected streamflow distribution maps provide a proactive water management tool for State cooperators by illustrating flow rates during a range of hydraulic conditions. Furthermore, after the nested drainage basins are highlighted in terms of surface-water flows, the streamflows can be evaluated in the context of meeting specific ecological flows under different flow regimes and potentially assist with decisions regarding groundwater and surface-water appropriations. Presented streamflow distribution maps are foundational work intended to support the development of additional streamflow distribution maps that include statistical constraints on the selected flow conditions.
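
    A minimal sketch of how the four map classes above relate to a flow-duration curve: an exceedance-probability quantile of p is the flow equalled or exceeded a fraction p of the time, i.e., the (1 - p) quantile of the daily flow distribution. The streamflow record below is synthetic.

```python
import numpy as np

rng = np.random.default_rng(7)
daily_flow = rng.lognormal(mean=2.0, sigma=0.8, size=365 * 30)   # assumed 30-year daily record, m3/s

def exceedance_quantile(flows, p_exceed):
    """Flow equalled or exceeded a fraction p_exceed of the time."""
    return np.quantile(flows, 1.0 - p_exceed)

for p in (0.95, 0.90, 0.50, 0.02):   # extreme low flow, low flow, median, high flow
    print(f"exceedance probability {p:.2f}: {exceedance_quantile(daily_flow, p):.2f} m3/s")
```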

  15. REGULATION OF GEOGRAPHIC VARIABILITY IN HAPLOID:DIPLOID RATIOS OF BIPHASIC SEAWEED LIFE CYCLES(1).

    PubMed

    da Silva Vieira, Vasco Manuel Nobre de Carvalho; Santos, Rui Orlando Pimenta

    2012-08-01

    The relative abundance of haploid and diploid individuals (H:D) in isomorphic marine algal biphasic cycles varies spatially, but only if vital rates of haploid and diploid phases vary differently with environmental conditions (i.e. conditional differentiation between phases). Vital rates of isomorphic phases in particular environments may be determined by subtle morphological or physiological differences. Herein, we test numerically how geographic variability in H:D is regulated by conditional differentiation between isomorphic life phases and the type of life strategy of populations (i.e. life cycles dominated by reproduction, survival or growth). Simulation conditions were selected using available data on H:D spatial variability in seaweeds. Conditional differentiation between ploidy phases had a small effect on the H:D variability for species with life strategies that invest either in fertility or in growth. Conversely, species with life strategies that invest mainly in survival, exhibited high variability in H:D through a conditional differentiation in stasis (the probability of staying in the same size class), breakage (the probability of changing to a smaller size class) or growth (the probability of changing to a bigger size class). These results were consistent with observed geographic variability in H:D of natural marine algae populations. © 2012 Phycological Society of America.

  16. Measuring Financial Gains from Genetically Superior Trees

    Treesearch

    George Dutrow; Clark Row

    1976-01-01

    Planting genetically superior loblolly pines will probably yield high profits. Forest economists have made computer simulations that predict financial gains expected from a tree improvement program under actual field conditions.

  17. An Alternative Teaching Method of Conditional Probabilities and Bayes' Rule: An Application of the Truth Table

    ERIC Educational Resources Information Center

    Satake, Eiki; Vashlishan Murray, Amy

    2015-01-01

    This paper presents a comparison of three approaches to the teaching of probability to demonstrate how the truth table of elementary mathematical logic can be used to teach the calculations of conditional probabilities. Students are typically introduced to the topic of conditional probabilities--especially the ones that involve Bayes' rule--with…

  18. Inherent limitations of probabilistic models for protein-DNA binding specificity

    PubMed Central

    Ruan, Shuxiang

    2017-01-01

    The specificities of transcription factors are most commonly represented with probabilistic models. These models provide a probability for each base occurring at each position within the binding site and the positions are assumed to contribute independently. The model is simple and intuitive and is the basis for many motif discovery algorithms. However, the model also has inherent limitations that prevent it from accurately representing true binding probabilities, especially for the highest affinity sites under conditions of high protein concentration. The limitations are not due to the assumption of independence between positions but rather are caused by the non-linear relationship between binding affinity and binding probability and the fact that independent normalization at each position skews the site probabilities. Generally probabilistic models are reasonably good approximations, but new high-throughput methods allow for biophysical models with increased accuracy that should be used whenever possible. PMID:28686588
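
    A hedged illustration of the limitation described above, contrasting a position weight matrix (PWM) site probability, which multiplies independently normalized per-position probabilities, with a saturating thermodynamic occupancy. The matrix, the score-to-Kd mapping, and the protein concentration are invented for the demonstration.

```python
import numpy as np

pwm = np.array([[0.7, 0.1, 0.1, 0.1],   # position 1: probabilities of A, C, G, T
                [0.1, 0.7, 0.1, 0.1],   # position 2
                [0.1, 0.1, 0.7, 0.1]])  # position 3
site = [0, 1, 2]                        # the consensus-like site "ACG"

pwm_prob = np.prod([pwm[i, b] for i, b in enumerate(site)])

# Thermodynamic alternative: occupancy = [P] / ([P] + Kd), which saturates near 1
# for the highest-affinity sites when the protein concentration is high.
pseudo_energy = -np.sum([np.log(pwm[i, b]) for i, b in enumerate(site)])
Kd = np.exp(pseudo_energy - 1.0)        # assumed mapping from score to dissociation constant
protein = 5.0                           # assumed protein concentration (arbitrary units)
occupancy = protein / (protein + Kd)

print(f"PWM site probability: {pwm_prob:.3f}, thermodynamic occupancy: {occupancy:.3f}")
```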

  19. Multidetector computed tomographic pulmonary angiography in patients with a high clinical probability of pulmonary embolism.

    PubMed

    Moores, L; Kline, J; Portillo, A K; Resano, S; Vicente, A; Arrieta, P; Corres, J; Tapson, V; Yusen, R D; Jiménez, D

    2016-01-01

    ESSENTIALS: When high probability of pulmonary embolism (PE), sensitivity of computed tomography (CT) is unclear. We investigated the sensitivity of multidetector CT among 134 patients with a high probability of PE. A normal CT alone may not safely exclude PE in patients with a high clinical pretest probability. In patients with no clear alternative diagnosis after CTPA, further testing should be strongly considered. Whether patients with a negative multidetector computed tomographic pulmonary angiography (CTPA) result and a high clinical pretest probability of pulmonary embolism (PE) should be further investigated is controversial. This was a prospective investigation of the sensitivity of multidetector CTPA among patients with a priori clinical assessment of a high probability of PE according to the Wells criteria. Among patients with a negative CTPA result, the diagnosis of PE required at least one of the following conditions: ventilation/perfusion lung scan showing a high probability of PE in a patient with no history of PE, abnormal findings on venous ultrasonography in a patient without previous deep vein thrombosis at that site, or the occurrence of venous thromboembolism (VTE) in a 3-month follow-up period after anticoagulation was withheld because of a negative multidetector CTPA result. We identified 498 patients with a priori clinical assessment of a high probability of PE and a completed CTPA study. CTPA excluded PE in 134 patients; in these patients, the pooled incidence of VTE was 5.2% (seven of 134 patients; 95% confidence interval [CI] 1.5-9.0). Five patients had VTEs that were confirmed by an additional imaging test despite a negative CTPA result (five of 48 patients; 10.4%; 95% CI 1.8-19.1), and two patients had objectively confirmed VTEs that occurred during clinical follow-up of at least 3 months (two of 86 patients; 2.3%; 95% CI 0-5.5). None of the patients had a fatal PE during follow-up. A normal multidetector CTPA result alone may not safely exclude PE in patients with a high clinical pretest probability. © 2015 International Society on Thrombosis and Haemostasis.

  20. Phobias and Preparedness - Republished Article.

    PubMed

    Seligman, Martin E P

    2016-09-01

    Some inadequacies of the classical conditioning analysis of phobias are discussed: phobias are highly resistant to extinction, whereas laboratory fear conditioning, unlike avoidance conditioning, extinguishes rapidly; phobias comprise a nonarbitrary and limited set of objects, whereas fear conditioning is thought to occur to an unlimited range of conditioned stimuli. Furthermore, phobias, unlike laboratory fear conditioning, are often acquired in one trial and seem quite resistant to change by "cognitive" means. An analysis of phobias using a more contemporary model of fear conditioning is proposed. In this view, phobias are seen as instances of highly "prepared" learning (Seligman, 1970). Such prepared learning is selective, highly resistant to extinction, probably noncognitive and can be acquired in one trial. A reconstruction of the notion of symbolism is suggested. Copyright © 2016 Elsevier Ltd. All rights reserved.

  1. Simulation of precipitation by weather pattern and frontal analysis

    NASA Astrophysics Data System (ADS)

    Wilby, Robert

    1995-12-01

    Daily rainfall from two sites in central and southern England was stratified according to the presence or absence of weather fronts and then cross-tabulated with the prevailing Lamb Weather Type (LWT). A semi-Markov chain model was developed for simulating daily sequences of LWTs from matrices of transition probabilities between weather types for the British Isles 1970-1990. Daily and annual rainfall distributions were then simulated from the prevailing LWTs using historic conditional probabilities for precipitation occurrence and frontal frequencies. When compared with a conventional rainfall generator the frontal model produced improved estimates of the overall size distribution of daily rainfall amounts and in particular the incidence of low-frequency high-magnitude totals. Further research is required to establish the contribution of individual frontal sub-classes to daily rainfall totals and of long-term fluctuations in frontal frequencies to conditional probabilities.
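
    A hedged sketch of the general approach, simplified to a plain first-order Markov chain rather than the paper's semi-Markov formulation: simulate a daily sequence of weather types from a transition-probability matrix, then draw rainfall occurrence from a conditional probability for each type. The weather types, transition matrix, and conditional probabilities are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
types = ["anticyclonic", "cyclonic", "westerly"]
P = np.array([[0.70, 0.15, 0.15],        # daily transition probabilities between weather types
              [0.20, 0.60, 0.20],
              [0.25, 0.25, 0.50]])
p_rain_given_type = {"anticyclonic": 0.15, "cyclonic": 0.75, "westerly": 0.55}

state, rain_days, n_days = 0, 0, 365
for _ in range(n_days):
    state = rng.choice(3, p=P[state])                             # next weather type
    rain_days += rng.random() < p_rain_given_type[types[state]]   # conditional wet-day draw
print(f"simulated wet-day fraction: {rain_days / n_days:.2f}")
```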

  2. Domestic wells have high probability of pumping septic tank leachate

    NASA Astrophysics Data System (ADS)

    Horn, J. E.; Harter, T.

    2011-06-01

    Onsite wastewater treatment systems such as septic systems are common in rural and semi-rural areas around the world; in the US, about 25-30 % of households are served by a septic system and a private drinking water well. Site-specific conditions and local groundwater flow are often ignored when installing septic systems and wells. Particularly in areas with small lots, thus a high septic system density, these typically shallow wells are prone to contamination by septic system leachate. Typically, mass balance approaches are used to determine a maximum septic system density that would prevent contamination of the aquifer. In this study, we estimate the probability of a well pumping partially septic system leachate. A detailed groundwater and transport model is used to calculate the capture zone of a typical drinking water well. A spatial probability analysis is performed to assess the probability that a capture zone overlaps with a septic system drainfield depending on aquifer properties, lot and drainfield size. We show that a high septic system density poses a high probability of pumping septic system leachate. The hydraulic conductivity of the aquifer has a strong influence on the intersection probability. We conclude that mass balances calculations applied on a regional scale underestimate the contamination risk of individual drinking water wells by septic systems. This is particularly relevant for contaminants released at high concentrations, for substances which experience limited attenuation, and those being harmful even in low concentrations.

  3. Heightened fire probability in Indonesia in non-drought conditions: the effect of increasing temperatures

    NASA Astrophysics Data System (ADS)

    Fernandes, Kátia; Verchot, Louis; Baethgen, Walter; Gutierrez-Velez, Victor; Pinedo-Vasquez, Miguel; Martius, Christopher

    2017-05-01

    In Indonesia, drought-driven fires typically occur during the warm phase of the El Niño Southern Oscillation. This was the case for the events of 1997 and 2015, which resulted in months-long hazardous atmospheric pollution levels in Equatorial Asia and record greenhouse gas emissions. Nonetheless, anomalously active fire seasons have also been observed in non-drought years. In this work, we investigated the impact of temperature on fires and found that when the July-October (JASO) period is anomalously dry, the sensitivity of fires to temperature is modest. In contrast, under normal-to-wet conditions, fire probability increases sharply when JASO is anomalously warm. This describes a regime in which an active fire season is not limited to drought years. Greater susceptibility to fires in response to a warmer environment finds support in the high evapotranspiration rates observed in normal-to-wet and warm conditions in Indonesia. We also find that fire probability in wet JASOs would be considerably less sensitive to temperature were it not for the added effect of recent positive trends. Near-term regional climate projections reveal that, despite negligible changes in precipitation, a continuing warming trend will heighten fire probability over the next few decades, especially in non-drought years. Mild fire seasons currently observed in association with wet conditions and cool temperatures will become rare events in Indonesia.

  4. The Probability Approach to English If-Conditional Sentences

    ERIC Educational Resources Information Center

    Wu, Mei

    2012-01-01

    Users of the Probability Approach choose the right one from four basic types of conditional sentences--factual, predictive, hypothetical and counterfactual conditionals, by judging how likely (i.e. the probability) the event in the result-clause will take place when the condition in the if-clause is met. Thirty-three students from the experimental…

  5. PROBABILITY SURVEYS , CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    EPA Science Inventory

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  6. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES, AND ECOLOGICAL RISK ASSESSMENT

    EPA Science Inventory

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program (EMAP), can be analyzed with a conditional probability analysis (CPA) to conduct quantitative probabi...

  7. Probability Surveys, Conditional Probability, and Ecological Risk Assessment

    EPA Science Inventory

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency’s (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  8. Interplanetary survival probability of Aspergillus terreus spores under simulated solar vacuum ultraviolet irradiation

    NASA Astrophysics Data System (ADS)

    Sarantopoulou, E.; Gomoiu, I.; Kollia, Z.; Cefalas, A. C.

    2011-01-01

    This work is part of the ESA/EU SURE project, which aims to quantify the survival probability of fungal spores in space under solar irradiation in the vacuum ultraviolet (VUV) (110-180 nm) spectral region. The contribution and impact of VUV photons, vacuum, low temperature and their synergies on the survival probability of Aspergillus terreus spores are measured under simulated space conditions on Earth. To simulate the solar VUV irradiation, the spores are irradiated with a continuous discharge VUV hydrogen photon source and a molecular fluorine laser, at low and high photon intensities of 10^15 photons m^-2 s^-1 and 3.9×10^27 photons pulse^-1 m^-2 s^-1, respectively. The survival probability of the spores is independent of the intensity and the fluence of photons, within certain limits, in agreement with previous studies. The spores are shielded by a thin carbon layer, which forms quickly on the external surface of the proteinaceous membrane at higher photon intensities at the start of the VUV irradiation. Extrapolating the results to space conditions, for an interplanetary direct transfer orbit from Mars to Earth, the spores will be irradiated with 3.3×10^21 solar VUV photons m^-2. This photon fluence is equivalent to the irradiation of spores on Earth with 54 laser pulses, with an experimental ~92% survival probability, disregarding the contribution of space vacuum and low temperature, or to continuous solar VUV irradiation for 38 days in space near the Earth, with an extrapolated ~61% survival probability. The experimental results indicate that the damage to spores is mainly from dehydration stress in vacuum. The high survival probability after 4 days in vacuum (~34%) is due to the exudation of proteins on the external membrane, which prevents further dehydration of the spores. In addition, the survival probability increases to ~54% at 10 K with 0.12 K/s cooling and heating rates.

  9. Unbiased multi-fidelity estimate of failure probability of a free plane jet

    NASA Astrophysics Data System (ADS)

    Marques, Alexandre; Kramer, Boris; Willcox, Karen; Peherstorfer, Benjamin

    2017-11-01

    Estimating failure probability related to fluid flows is a challenge because it requires a large number of evaluations of expensive models. We address this challenge by leveraging multiple low fidelity models of the flow dynamics to create an optimal unbiased estimator. In particular, we investigate the effects of uncertain inlet conditions in the width of a free plane jet. We classify a condition as failure when the corresponding jet width is below a small threshold, such that failure is a rare event (failure probability is smaller than 0.001). We estimate failure probability by combining the frameworks of multi-fidelity importance sampling and optimal fusion of estimators. Multi-fidelity importance sampling uses a low fidelity model to explore the parameter space and create a biasing distribution. An unbiased estimate is then computed with a relatively small number of evaluations of the high fidelity model. In the presence of multiple low fidelity models, this framework offers multiple competing estimators. Optimal fusion combines all competing estimators into a single estimator with minimal variance. We show that this combined framework can significantly reduce the cost of estimating failure probabilities, and thus can have a large impact in fluid flow applications. This work was funded by DARPA.
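
    A hedged, low-dimensional sketch of the multi-fidelity importance sampling step described above: a cheap surrogate locates the failure region and supplies a biasing density, and the unbiased estimate re-weights a small number of high-fidelity indicator evaluations. The two "jet width" functions are toy stand-ins for the actual flow models, and the rare-event threshold is illustrative.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
threshold = 0.5                                   # failure: jet width below this value (assumed)

def jet_width_lofi(x):                            # cheap surrogate (toy stand-in)
    return 2.0 + 0.5 * x

def jet_width_hifi(x):                            # expensive model (toy stand-in)
    return 2.0 + 0.5 * x + 0.05 * np.sin(5 * x)

nominal = norm(loc=0.0, scale=1.0)                # uncertain inlet condition

# Step 1: explore with the low-fidelity model and fit a biasing density to its failures.
x_explore = nominal.rvs(size=5000, random_state=rng)
lofi_failures = x_explore[jet_width_lofi(x_explore) < threshold]
biasing = norm(loc=lofi_failures.mean(), scale=lofi_failures.std() + 0.25)

# Step 2: a modest number of high-fidelity evaluations under the biasing density.
x_is = biasing.rvs(size=500, random_state=rng)
indicator = jet_width_hifi(x_is) < threshold
weights = nominal.pdf(x_is) / biasing.pdf(x_is)
p_fail = np.mean(indicator * weights)             # unbiased estimate of P(width < threshold)
print(f"estimated failure probability: {p_fail:.2e}")
```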

  10. Toward inflation models compatible with the no-boundary proposal

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hwang, Dong-il; Yeom, Dong-han, E-mail: dongil.j.hwang@gmail.com, E-mail: innocent.yeom@gmail.com

    2014-06-01

    In this paper, we investigate various inflation models in the context of the no-boundary proposal. We propose that a good inflation model should satisfy three conditions: observational constraints, plausible initial conditions, and naturalness of the model. For various inflation models, we assign the probability to each initial condition using the no-boundary proposal and define a quantitative standard, typicality, to check whether the model satisfies the observational constraints with probable initial conditions. There are three possible ways to satisfy the typicality criterion: there was pre-inflation near the high energy scale, the potential is finely tuned or the inflationary field space ismore » unbounded, or there are sufficient number of fields that contribute to inflation. The no-boundary proposal rejects some of naive inflation models, explains some of traditional doubts on inflation, and possibly, can have observational consequences.« less

  11. Framing of outcome and probability of recurrence: breast cancer patients' choice of adjuvant chemotherapy (ACT) in hypothetical patient scenarios.

    PubMed

    Zimmermann, C; Baldo, C; Molino, A

    2000-03-01

    To examine the effects of framing of outcome and probabilities of cancer occurrence on the treatment preference which breast cancer patients indicate for hypothetical patient scenarios. A modified version of the Decision Board Instrument (Levine et al. 1992) was administered to 35 breast cancer patients with past ACT experience. Patients expressed their choice regarding ACT for six scenarios which were characterized by either negative or positive framing of outcome and by one of the three levels of probability of recurrence (high, medium, low). The framing had no influence on ACT choices over all three probability levels. The majority chose ACT for high and medium risk and one third switched from ACT to No ACT in the low-risk condition. This switch was statistically significant. Hypothetical treatment decisions against ACT occur only when the probability of recurrence is low and the benefit of ACT is small. This finding for patients with past experience of ACT is similar to those reported for other oncological patient groups still in treatment.

  12. Two conditions for equivalence of 0-norm solution and 1-norm solution in sparse representation.

    PubMed

    Li, Yuanqing; Amari, Shun-Ichi

    2010-07-01

    In sparse representation, two important sparse solutions, the 0-norm and 1-norm solutions, have received much attention. The 0-norm solution is the sparsest; however, it is not easy to obtain. Although the 1-norm solution may not be the sparsest, it can easily be obtained by linear programming. In many cases, the 0-norm solution can be obtained by finding the 1-norm solution. Many discussions exist on the equivalence of the two sparse solutions. This paper analyzes two conditions for the equivalence of the two sparse solutions. The first condition is necessary and sufficient, but difficult to verify. The second is necessary but not sufficient, yet easy to verify. In this paper, we analyze the second condition within a stochastic framework and propose a variant. We then prove that the equivalence of the two sparse solutions holds with high probability under the variant of the second condition. Furthermore, in the limit case where the 0-norm solution is extremely sparse, the second condition is also a sufficient condition with probability 1.
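
    A hedged sketch of the relationship the paper analyzes: an underdetermined system whose sparsest (0-norm) solution is recovered by minimizing the 1-norm instead, written as a linear program (basis pursuit). Problem sizes and the random matrix are illustrative; the conditions under which the two solutions coincide are the subject of the abstract.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
m, n, k = 20, 50, 3
A = rng.normal(size=(m, n))
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.normal(size=k)   # k-sparse ground truth
b = A @ x_true

# min ||x||_1 subject to Ax = b, using the split x = u - v with u, v >= 0.
c = np.ones(2 * n)
A_eq = np.hstack([A, -A])
res = linprog(c, A_eq=A_eq, b_eq=b, bounds=[(0, None)] * (2 * n), method="highs")
x_l1 = res.x[:n] - res.x[n:]

print("max |x_l1 - x_true|:", np.max(np.abs(x_l1 - x_true)))   # near zero when the two solutions coincide
```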

  13. Prediction of Conditional Probability of Survival After Surgery for Gastric Cancer: A Study Based on Eastern and Western Large Data Sets.

    PubMed

    Zhong, Qing; Chen, Qi-Yue; Li, Ping; Xie, Jian-Wei; Wang, Jia-Bin; Lin, Jian-Xian; Lu, Jun; Cao, Long-Long; Lin, Mi; Tu, Ru-Hong; Zheng, Chao-Hui; Huang, Chang-Ming

    2018-04-20

    The dynamic prognosis of patients who have undergone curative surgery for gastric cancer has yet to be reported. Our objective was to devise an accurate tool for predicting the conditional probability of survival for these patients. We analyzed 11,551 gastric cancer patients from the Surveillance, Epidemiology, and End Results database. Two-thirds of the patients were selected randomly for the development set and one-third for the validation set. Two nomograms were constructed to predict the conditional probability of overall survival and the conditional probability of disease-specific survival, using conditional survival methods. We then applied these nomograms to the 4,001 patients in the database from Fujian Medical University Union Hospital, Fuzhou, China, one of the most active Chinese institutes. The 5-year conditional probability of overall survival of the patients was 41.6% immediately after resection and increased to 52.8%, 68.2%, and 80.4% at 1, 2, and 3 years after gastrectomy. The 5-year conditional probability of disease-specific survival "increased" from 48.9% at the time of gastrectomy to 59.8%, 74.7%, and 85.5% for patients surviving 1, 2, and 3 years, respectively. Sex; race; age; depth of tumor invasion; lymph node metastasis; and tumor size, site, and grade were associated with overall survival and disease-specific survival (P <.05). Within the Surveillance, Epidemiology, and End Results validation set, the accuracy of the conditional probability of overall survival nomogram was 0.77, 0.81, 0.82, and 0.82 at 1, 3, 5, and 10 years after gastrectomy, respectively. Within the other validation set from the Fujian Medical University Union Hospital (n = 4,001), the accuracy of the conditional probability of overall survival nomogram was 0.76, 0.79, 0.77, and 0.77 at 1, 3, 5, and 10 years, respectively. The accuracy of the conditional probability of disease-specific survival model was also favorable. The calibration curve demonstrated good agreement between the predicted and observed survival rates. Based on the large Eastern and Western data sets, we developed and validated the first conditional nomogram for prediction of conditional probability of survival for patients with gastric cancer to allow consideration of the duration of survivorship. Copyright © 2018 Elsevier Inc. All rights reserved.

  14. Effect of social influence on effort-allocation for monetary rewards.

    PubMed

    Gilman, Jodi M; Treadway, Michael T; Curran, Max T; Calderon, Vanessa; Evins, A Eden

    2015-01-01

    Though decades of research have shown that people are highly influenced by peers, few studies have directly assessed how the value of social conformity is weighed against other types of costs and benefits. Using an effort-based decision-making paradigm with a novel social influence manipulation, we measured how social influence affected individuals' decisions to allocate effort for monetary rewards during trials with either high or low probability of receiving a reward. We found that information about the effort-allocation of peers modulated participant choices, specifically during conditions of low probability of obtaining a reward. This suggests that peer influence affects effort-based choices to obtain rewards especially under conditions of risk. This study provides evidence that people value social conformity in addition to other costs and benefits when allocating effort, and suggests that neuroeconomic studies that assess trade-offs between effort and reward should consider social environment as a factor that can influence decision-making.

  15. Option volatility and the acceleration Lagrangian

    NASA Astrophysics Data System (ADS)

    Baaquie, Belal E.; Cao, Yang

    2014-01-01

    This paper develops a volatility formula for an option on an asset from an acceleration Lagrangian model, and the formula is calibrated with market data. The Black-Scholes model is a simpler case that has a velocity-dependent Lagrangian. The acceleration Lagrangian is defined, and the classical solution of the system in Euclidean time is solved by choosing proper boundary conditions. The conditional probability distribution of the final position given the initial position is obtained from the transition amplitude. The volatility is the standard deviation of the conditional probability distribution. Using the conditional probability and the path integral method, the martingale condition is applied, and one of the parameters in the Lagrangian is fixed. The call option price is obtained using the conditional probability and the path integral method.
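
    For orientation, a hedged statement of the simpler velocity-Lagrangian (Black-Scholes) case mentioned above, where the conditional probability distribution of the log-price is Gaussian and its standard deviation gives the familiar constant volatility; the acceleration-Lagrangian generalization is not reproduced here.

```latex
% Conditional density of the log-price x_T = \ln S_T given x_t = \ln S_t in the
% Black--Scholes (velocity-Lagrangian) special case:
\[
P(x_T \mid x_t) = \frac{1}{\sqrt{2\pi\sigma^{2}(T-t)}}
  \exp\!\left[-\,\frac{\bigl(x_T - x_t - (r - \tfrac{1}{2}\sigma^{2})(T-t)\bigr)^{2}}{2\sigma^{2}(T-t)}\right]
\]
% Its standard deviation is \sigma\sqrt{T-t}, so the volatility per unit time is the constant \sigma.
```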

  16. Computer-aided diagnosis with potential application to rapid detection of disease outbreaks.

    PubMed

    Burr, Tom; Koster, Frederick; Picard, Rick; Forslund, Dave; Wokoun, Doug; Joyce, Ed; Brillman, Judith; Froman, Phil; Lee, Jack

    2007-04-15

    Our objectives are to quickly interpret symptoms of emergency patients to identify likely syndromes and to improve population-wide disease outbreak detection. We constructed a database of 248 syndromes, each syndrome having an estimated probability of producing any of 85 symptoms, with some two-way, three-way, and five-way probabilities reflecting correlations among symptoms. Using these multi-way probabilities in conjunction with an iterative proportional fitting algorithm allows estimation of full conditional probabilities. Combining these conditional probabilities with misdiagnosis error rates and incidence rates via Bayes theorem, the probability of each syndrome is estimated. We tested a prototype of computer-aided differential diagnosis (CADDY) on simulated data and on more than 100 real cases, including West Nile Virus, Q fever, SARS, anthrax, plague, tularaemia and toxic shock cases. We conclude that: (1) it is important to determine whether the unrecorded positive status of a symptom means that the status is negative or that the status is unknown; (2) inclusion of misdiagnosis error rates produces more realistic results; (3) the naive Bayes classifier, which assumes all symptoms behave independently, is slightly outperformed by CADDY, which includes available multi-symptom information on correlations; as more information regarding symptom correlations becomes available, the advantage of CADDY over the naive Bayes classifier should increase; (4) overlooking low-probability, high-consequence events is less likely if the standard output summary is augmented with a list of rare syndromes that are consistent with observed symptoms, and (5) accumulating patient-level probabilities across a larger population can aid in biosurveillance for disease outbreaks. © 2007 John Wiley & Sons, Ltd.
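
    The Bayes-theorem step described above can be illustrated with a minimal sketch. This is not the CADDY code: the syndrome table, symptom rates, and incidence values below are invented, and the sketch uses the naive independence assumption that the abstract compares CADDY against, without the multi-way correlation terms or misdiagnosis error rates.

```python
# Illustrative naive-Bayes ranking of syndromes given observed symptoms.
# The syndrome/symptom table and incidence rates are made up; CADDY itself
# additionally uses multi-way symptom probabilities and misdiagnosis rates.

# P(symptom | syndrome) for a few hypothetical syndromes
p_symptom_given_syndrome = {
    "influenza":   {"fever": 0.90, "cough": 0.80, "rash": 0.02},
    "anthrax":     {"fever": 0.85, "cough": 0.70, "rash": 0.10},
    "toxic_shock": {"fever": 0.95, "cough": 0.10, "rash": 0.80},
}
incidence = {"influenza": 0.70, "anthrax": 0.0001, "toxic_shock": 0.01}

def posterior(observed):
    """observed maps symptom -> True/False; unknown symptoms are omitted."""
    scores = {}
    for syndrome, p_sym in p_symptom_given_syndrome.items():
        like = incidence[syndrome]
        for symptom, present in observed.items():
            p = p_sym.get(symptom, 0.05)          # default rate if unlisted
            like *= p if present else (1.0 - p)   # independence assumption
        scores[syndrome] = like
    total = sum(scores.values())
    return {s: v / total for s, v in scores.items()}

print(posterior({"fever": True, "rash": True, "cough": False}))
```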

  17. Wildfire risk in the wildland-urban interface: A simulation study in northwestern Wisconsin

    USGS Publications Warehouse

    Massada, Avi Bar; Radeloff, Volker C.; Stewart, Susan I.; Hawbaker, Todd J.

    2009-01-01

    The rapid growth of housing in and near the wildland–urban interface (WUI) increases wildfire risk to lives and structures. To reduce fire risk, it is necessary to identify WUI housing areas that are more susceptible to wildfire. This is challenging, because wildfire patterns depend on fire behavior and spread, which in turn depend on ignition locations, weather conditions, the spatial arrangement of fuels, and topography. The goal of our study was to assess wildfire risk to a 60,000 ha WUI area in northwestern Wisconsin while accounting for all of these factors. We conducted 6000 simulations with two dynamic fire models: Fire Area Simulator (FARSITE) and Minimum Travel Time (MTT) in order to map the spatial pattern of burn probabilities. Simulations were run under normal and extreme weather conditions to assess the effect of weather on fire spread, burn probability, and risk to structures. The resulting burn probability maps were intersected with maps of structure locations and land cover types. The simulations revealed clear hotspots of wildfire activity and a large range of wildfire risk to structures in the study area. As expected, the extreme weather conditions yielded higher burn probabilities over the entire landscape, as well as to different land cover classes and individual structures. Moreover, the spatial pattern of risk was significantly different between extreme and normal weather conditions. The results highlight the fact that extreme weather conditions not only produce higher fire risk than normal weather conditions, but also change the fine-scale locations of high risk areas in the landscape, which is of great importance for fire management in WUI areas. In addition, the choice of weather data may limit the potential for comparisons of risk maps for different areas and for extrapolating risk maps to future scenarios where weather conditions are unknown. Our approach to modeling wildfire risk to structures can aid fire risk reduction management activities by identifying areas with elevated wildfire risk and those most vulnerable under extreme weather conditions.

  18. Bayesian analysis of the astrobiological implications of life’s early emergence on Earth

    PubMed Central

    Spiegel, David S.; Turner, Edwin L.

    2012-01-01

    Life arose on Earth sometime in the first few hundred million years after the young planet had cooled to the point that it could support water-based organisms on its surface. The early emergence of life on Earth has been taken as evidence that the probability of abiogenesis is high, if starting from young Earth-like conditions. We revisit this argument quantitatively in a Bayesian statistical framework. By constructing a simple model of the probability of abiogenesis, we calculate a Bayesian estimate of its posterior probability, given the data that life emerged fairly early in Earth’s history and that, billions of years later, curious creatures noted this fact and considered its implications. We find that, given only this very limited empirical information, the choice of Bayesian prior for the abiogenesis probability parameter has a dominant influence on the computed posterior probability. Although terrestrial life's early emergence provides evidence that life might be abundant in the universe if early-Earth-like conditions are common, the evidence is inconclusive and indeed is consistent with an arbitrarily low intrinsic probability of abiogenesis for plausible uninformative priors. Finding a single case of life arising independently of our lineage (on Earth, elsewhere in the solar system, or on an extrasolar planet) would provide much stronger evidence that abiogenesis is not extremely rare in the universe. PMID:22198766
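
    The prior-sensitivity argument can be made concrete with a small numerical sketch. The model below is an illustrative simplification (a constant abiogenesis rate lambda and an assumed 0.4 Gyr window before the earliest evidence of life), not the authors' model; it only shows how strongly the posterior depends on the choice of prior when the data are this limited.

```python
# Minimal sketch of the prior-sensitivity point: posterior over an abiogenesis
# rate lambda, given only that life appeared within t_obs Gyr. The model and
# numbers are illustrative, not those of Spiegel & Turner.
import numpy as np

t_obs = 0.4                       # assumed Gyr available before earliest evidence of life
lam = np.logspace(-3, 3, 2000)    # candidate rates (events per Gyr)
likelihood = 1.0 - np.exp(-lam * t_obs)   # P(life emerged by t_obs | lambda)

for name, prior in [("uniform in lambda", np.ones_like(lam)),
                    ("uniform in log lambda", 1.0 / lam)]:
    post = likelihood * prior
    post /= np.trapz(post, lam)                          # normalize on the grid
    p_rare = np.trapz(post[lam < 0.1], lam[lam < 0.1])   # P(lambda < 0.1 | data)
    print(f"{name:>22}: P(rate < 0.1 per Gyr | data) = {p_rare:.2f}")
```

    Under the two priors the posterior probability that abiogenesis is rare differs markedly, which is the sense in which the prior dominates the limited data.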

  19. Bayesian analysis of the astrobiological implications of life's early emergence on Earth.

    PubMed

    Spiegel, David S; Turner, Edwin L

    2012-01-10

    Life arose on Earth sometime in the first few hundred million years after the young planet had cooled to the point that it could support water-based organisms on its surface. The early emergence of life on Earth has been taken as evidence that the probability of abiogenesis is high, if starting from young Earth-like conditions. We revisit this argument quantitatively in a Bayesian statistical framework. By constructing a simple model of the probability of abiogenesis, we calculate a Bayesian estimate of its posterior probability, given the data that life emerged fairly early in Earth's history and that, billions of years later, curious creatures noted this fact and considered its implications. We find that, given only this very limited empirical information, the choice of Bayesian prior for the abiogenesis probability parameter has a dominant influence on the computed posterior probability. Although terrestrial life's early emergence provides evidence that life might be abundant in the universe if early-Earth-like conditions are common, the evidence is inconclusive and indeed is consistent with an arbitrarily low intrinsic probability of abiogenesis for plausible uninformative priors. Finding a single case of life arising independently of our lineage (on Earth, elsewhere in the solar system, or on an extrasolar planet) would provide much stronger evidence that abiogenesis is not extremely rare in the universe.

  20. Effects of sampling conditions on DNA-based estimates of American black bear abundance

    USGS Publications Warehouse

    Laufenberg, Jared S.; Van Manen, Frank T.; Clark, Joseph D.

    2013-01-01

    DNA-based capture-mark-recapture techniques are commonly used to estimate American black bear (Ursus americanus) population abundance (N). Although the technique is well established, many questions remain regarding study design. In particular, relationships among N, capture probability of heterogeneity mixtures A and B (pA and pB, respectively, or p, collectively), the proportion of each mixture (π), number of capture occasions (k), and probability of obtaining reliable estimates of N are not fully understood. We investigated these relationships using 1) an empirical dataset of DNA samples for which true N was unknown and 2) simulated datasets with known properties that represented a broader array of sampling conditions. For the empirical data analysis, we used the full closed population with heterogeneity data type in Program MARK to estimate N for a black bear population in Great Smoky Mountains National Park, Tennessee. We systematically reduced the number of those samples used in the analysis to evaluate the effect that changes in capture probabilities may have on parameter estimates. Model-averaged N for females and males were 161 (95% CI = 114–272) and 100 (95% CI = 74–167), respectively (pooled N = 261, 95% CI = 192–419), and the average weekly p was 0.09 for females and 0.12 for males. When we reduced the number of samples of the empirical data, support for heterogeneity models decreased. For the simulation analysis, we generated capture data with individual heterogeneity covering a range of sampling conditions commonly encountered in DNA-based capture-mark-recapture studies and examined the relationships between those conditions and accuracy (i.e., probability of obtaining an estimated N that is within 20% of true N), coverage (i.e., probability that 95% confidence interval includes true N), and precision (i.e., probability of obtaining a coefficient of variation ≤20%) of estimates using logistic regression. The capture probability for the larger of 2 mixture proportions of the population (i.e., pA or pB, depending on the value of π) was most important for predicting accuracy and precision, whereas capture probabilities of both mixture proportions (pA and pB) were important to explain variation in coverage. Based on sampling conditions similar to parameter estimates from the empirical dataset (pA = 0.30, pB = 0.05, N = 250, π = 0.15, and k = 10), predicted accuracy and precision were low (60% and 53%, respectively), whereas coverage was high (94%). Increasing pB, the capture probability for the predominant but most difficult to capture proportion of the population, was most effective for improving accuracy under those conditions. However, manipulation of other parameters may be more effective under different conditions. In general, the probabilities of obtaining accurate and precise estimates were best when p ≥ 0.2. Our regression models can be used by managers to evaluate specific sampling scenarios and guide development of sampling frameworks or to assess reliability of DNA-based capture-mark-recapture studies.

  1. Bivariate normal, conditional and rectangular probabilities: A computer program with applications

    NASA Technical Reports Server (NTRS)

    Swaroop, R.; Brownlow, J. D.; Ashwworth, G. R.; Winter, W. R.

    1980-01-01

    Some results for the bivariate normal distribution analysis are presented. Computer programs for conditional normal probabilities, marginal probabilities, as well as joint probabilities for rectangular regions are given; routines for computing fractile points and distribution functions are also presented. Some examples from a closed-circuit television experiment are included.
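
    A minimal sketch of the kinds of quantities the report's programs compute, written here with SciPy rather than the original routines; the means, standard deviations, correlation, and region limits below are illustrative.

```python
# Illustrative bivariate-normal calculations of the kind the report describes:
# a conditional probability P(Y <= b | X = x) and a rectangular probability
# P(a1 < X < b1, a2 < Y < b2). Parameters are made up.
import numpy as np
from scipy.stats import norm, multivariate_normal

mu = np.array([0.0, 1.0])
sigma = np.array([1.0, 2.0])
rho = 0.6
cov = np.array([[sigma[0]**2, rho * sigma[0] * sigma[1]],
                [rho * sigma[0] * sigma[1], sigma[1]**2]])

# Conditional distribution of Y given X = x is normal with these moments:
x = 0.5
cond_mean = mu[1] + rho * sigma[1] / sigma[0] * (x - mu[0])
cond_sd = sigma[1] * np.sqrt(1.0 - rho**2)
print("P(Y <= 2 | X = 0.5) =", norm.cdf(2.0, loc=cond_mean, scale=cond_sd))

# Rectangular probability via inclusion-exclusion on the joint CDF:
mvn = multivariate_normal(mean=mu, cov=cov)
a, b = np.array([-1.0, -1.0]), np.array([1.0, 3.0])
p_rect = (mvn.cdf(b) - mvn.cdf([a[0], b[1]])
          - mvn.cdf([b[0], a[1]]) + mvn.cdf(a))
print("P(-1 < X < 1, -1 < Y < 3) =", p_rect)
```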

  2. OVERALL CONTROL SYSTEM FOR HIGH FLUX PILE

    DOEpatents

    Newson, H.W.; Durham, N.C.; Wigner, E.P.; Princeton, N.J.; Epler, E.P.

    1961-05-23

    A control system is given for a high flux reactor incorporating an anti-scram control feature whereby a neutron absorbing control rod acts as a fine adjustment while a neutron absorbing shim rod, actuated upon a command received from reactor period and level signals, has substantially greater effect on the neutron level and is moved prior to scram conditions to alter the reactor activity before a scram condition is created. Thus the probability that a scram will have to be initiated is substantially decreased.

  3. Identifying Changes in the Probability of High Temperature, High Humidity Heat Wave Events

    NASA Astrophysics Data System (ADS)

    Ballard, T.; Diffenbaugh, N. S.

    2016-12-01

    Understanding how heat waves will respond to climate change is critical for adequate planning and adaptation. While temperature is the primary determinant of heat wave severity, humidity has been shown to play a key role in heat wave intensity with direct links to human health and safety. Here we investigate the individual contributions of temperature and specific humidity to extreme heat wave conditions in recent decades. Using global NCEP-DOE Reanalysis II daily data, we identify regional variability in the joint probability distribution of humidity and temperature. We also identify a statistically significant positive trend in humidity over the eastern U.S. during heat wave events, leading to an increased probability of high humidity, high temperature events. The extent to which we can expect this trend to continue under climate change is complicated due to variability between CMIP5 models, in particular among projections of humidity. However, our results support the notion that heat wave dynamics are characterized by more than high temperatures alone, and understanding and quantifying the various components of the heat wave system is crucial for forecasting future impacts.
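
    The joint-exceedance idea described above can be sketched as follows; the synthetic daily series and the 95th-percentile thresholds are illustrative stand-ins for the reanalysis fields and whatever thresholds the study actually used.

```python
# Sketch of a joint-exceedance diagnostic: the probability that daily
# temperature and specific humidity both exceed their 95th percentiles,
# estimated per decade. Synthetic data stand in for the reanalysis fields.
import numpy as np

rng = np.random.default_rng(0)
years = np.repeat(np.arange(1980, 2016), 92)                      # JJA days per year
t = 28 + 0.03 * (years - 1980) + rng.normal(0, 3, years.size)     # temperature (deg C)
q = 12 + 0.04 * (years - 1980) + 0.5 * (t - 28) + rng.normal(0, 2, years.size)  # humidity (g/kg)

t_thr, q_thr = np.percentile(t, 95), np.percentile(q, 95)
for start in range(1980, 2016, 10):
    sel = (years >= start) & (years < start + 10)
    p_joint = np.mean((t[sel] > t_thr) & (q[sel] > q_thr))
    print(f"{start}s: P(T and q both above their 95th percentiles) = {p_joint:.3f}")
```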

  4. Effects of ignition location models on the burn patterns of simulated wildfires

    USGS Publications Warehouse

    Bar-Massada, A.; Syphard, A.D.; Hawbaker, T.J.; Stewart, S.I.; Radeloff, V.C.

    2011-01-01

    Fire simulation studies that use models such as FARSITE often assume that ignition locations are distributed randomly, because spatially explicit information about actual ignition locations is difficult to obtain. However, many studies show that the spatial distribution of ignition locations, whether human-caused or natural, is non-random. Thus, predictions from fire simulations based on random ignitions may be unrealistic. However, the extent to which the assumption of ignition location affects the predictions of fire simulation models has never been systematically explored. Our goal was to assess the difference in fire simulations that are based on random versus non-random ignition location patterns. We conducted four sets of 6000 FARSITE simulations for the Santa Monica Mountains in California to quantify the influence of random and non-random ignition locations and normal and extreme weather conditions on fire size distributions and spatial patterns of burn probability. Under extreme weather conditions, fires were significantly larger for non-random ignitions compared to random ignitions (mean area of 344.5 ha and 230.1 ha, respectively), but burn probability maps were highly correlated (r = 0.83). Under normal weather, random ignitions produced significantly larger fires than non-random ignitions (17.5 ha and 13.3 ha, respectively), and the spatial correlations between burn probability maps were not high (r = 0.54), though the difference in the average burn probability was small. The results of the study suggest that the location of ignitions used in fire simulation models may substantially influence the spatial predictions of fire spread patterns. However, the spatial bias introduced by using a random ignition location model may be minimized if the fire simulations are conducted under extreme weather conditions when fire spread is greatest. © 2010 Elsevier Ltd.

  5. Beginning Bayes

    ERIC Educational Resources Information Center

    Erickson, Tim

    2017-01-01

    Understanding a Bayesian perspective demands comfort with conditional probability and with probabilities that appear to change as we acquire additional information. This paper suggests a simple context in conditional probability that helps develop the understanding students would need for a successful introduction to Bayesian reasoning.

  6. Sensitivity Study for Long Term Reliability

    NASA Technical Reports Server (NTRS)

    White, Allan L.

    2008-01-01

    This paper illustrates using Markov models to establish system and maintenance requirements for small electronic controllers where the goal is a high probability of continuous service for a long period of time. The system and maintenance items considered are quality of components, various degrees of simple redundancy, redundancy with reconfiguration, diagnostic levels, periodic maintenance, and preventive maintenance. Markov models permit a quantitative investigation with comparison and contrast. An element of special interest is the use of conditional probability to study the combination of imperfect diagnostics and periodic maintenance.
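
    A minimal sketch of this kind of Markov reliability calculation: a duplex controller in which a unit failure is detected and reconfigured with some diagnostic coverage (the conditional probability of detection given a failure), and periodic maintenance repairs a detected failure. The failure rate, coverage, and maintenance interval below are illustrative assumptions, not values from the paper.

```python
# Minimal sketch of a discrete-time Markov reliability model: a duplex
# controller where a component failure is detected (and reconfigured) with
# coverage c, otherwise the system fails; periodic maintenance restores the
# degraded state. All rates and the maintenance interval are illustrative.
import numpy as np

lam = 1e-4      # per-hour failure rate of one unit (assumed)
c = 0.99        # conditional probability a failure is detected and reconfigured
maint = 720     # maintenance every 720 hours repairs a detected failure
hours = 10 * 8760

# States: 0 = both units good, 1 = degraded (one failed, detected), 2 = system failed
P = np.array([
    [1 - 2 * lam,  2 * lam * c,  2 * lam * (1 - c)],
    [0.0,          1 - lam,      lam],
    [0.0,          0.0,          1.0],
])

p = np.array([1.0, 0.0, 0.0])
for h in range(1, hours + 1):
    p = p @ P
    if h % maint == 0:
        p[0] += p[1]          # maintenance moves degraded mass back to 'good'
        p[1] = 0.0
print("P(continuous service over 10 years) =", 1.0 - p[2])
```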

  7. Mapping fire probability and severity in a Mediterranean area using different weather and fuel moisture scenarios

    NASA Astrophysics Data System (ADS)

    Arca, B.; Salis, M.; Bacciu, V.; Duce, P.; Pellizzaro, G.; Ventura, A.; Spano, D.

    2009-04-01

    Although in many countries lightning is the main cause of ignition, in the Mediterranean Basin forest fires are predominantly ignited by arson or by human negligence. The fire season peaks coincide with extreme weather conditions (mainly strong winds, hot temperatures, low atmospheric water vapour content) and high tourist presence. Many studies have reported that in the Mediterranean Basin the projected impacts of climate change will cause greater weather variability and extreme weather conditions, with drier and hotter summers and heat waves. At the long-term scale, climate changes could affect the fuel load and the dead/live fuel ratio, and therefore could change the vegetation flammability. At the short-term scale, the increase of extreme weather events could directly affect fuel water status, and it could increase large fire occurrence. In this context, detecting the areas characterized by both high probability of large fire occurrence and high fire severity could represent an important component of fire management planning. In this work we compared several fire probability and severity maps (fire occurrence, rate of spread, fireline intensity, flame length) obtained for a study area located in North Sardinia, Italy, using the FlamMap simulator (USDA Forest Service, Missoula). FlamMap computes the potential fire behaviour characteristics over a defined landscape for given weather, wind and fuel moisture data. Different weather and fuel moisture scenarios were tested to predict the potential impact of climate changes on fire parameters. The study area, characterized by a mosaic of urban areas, protected areas, and other areas subject to anthropogenic disturbances, is mainly composed of fire-prone Mediterranean maquis. The input themes needed to run FlamMap were provided as grids with a 10-m resolution; the wind data, obtained using a computational fluid-dynamic model, were provided as a gridded file with a resolution of 50 m. The analysis revealed high fire probability and severity in most of the areas, and therefore a high potential danger. The FlamMap outputs and the derived fire probability maps can be used in decision support systems for fire spread and behaviour and for fire danger assessment under actual and future fire regimes.

  8. Know the risk, take the win: how executive functions and probability processing influence advantageous decision making under risk conditions.

    PubMed

    Brand, Matthias; Schiebener, Johannes; Pertl, Marie-Theres; Delazer, Margarete

    2014-01-01

    Recent models on decision making under risk conditions have suggested that numerical abilities are important ingredients of advantageous decision-making performance, but empirical evidence is still limited. The results of our first study show that logical reasoning and basic mental calculation capacities predict ratio processing and that ratio processing predicts decision making under risk. In the second study, logical reasoning together with executive functions predicted probability processing (numeracy and probability knowledge), and probability processing predicted decision making under risk. These findings suggest that increasing an individual's understanding of ratios and probabilities should lead to more advantageous decisions under risk conditions.

  9. High reticulocyte count is an independent risk factor for cerebrovascular disease in children with sickle cell anemia.

    PubMed

    Silva, Célia Maria; Giovani, Poliana; Viana, Marcos Borato

    2011-01-01

    Transcranial Doppler ultrasonography (TCD) is an important way of detecting risk of ischemic stroke in children with sickle cell anemia. A random sample of 262 FS-hemoglobin children from a newborn screening inception cohort in Brazil (1998-2005) was followed up to May 2009. Pulsed TCD followed the STOP protocol. Children with mean blood flow velocity < 170 cm/sec in cerebral arteries were classified as low risk; between 170 and 184, low conditional risk; between 185 and 199, high conditional risk; and ≥ 200, high risk. Median age, 6.2 years (2-11.2 years); 147 female; 13 children (5%) had ischemic stroke prior to TCD; 186/249 (74.7%) were classified as low risk; 19 (7.6%) as low conditional; 7 (2.8%) as high conditional; and 8 (3.2%) as high risk; inadequate tests, 11.6%. The probability of ischemic stroke at 10 years was 8.3% (SEM 2.3%); of stroke or high-risk TCD 15.6% (3.5%). Children with stroke or altered TCD (conditional and high risk) were compared to children with normal examinations. They were younger (P = 0.03), with lower hemoglobin (P = 0.003), higher leukocytosis (P = 0.015), and higher reticulocytosis (P < 0.001). Episodes per year of acute chest syndrome were also higher in that group, but not significantly (P = 0.09). Reticulocytosis remained the only significant variable upon multivariate analysis (P = 0.004). Basilar and middle cerebral artery velocities were significantly correlated (R = 0.55; P < 0.001). The probability of stroke was similar to international reports; the probability of belonging to the high-risk group was lower. A high reticulocyte count was the most important factor associated with cerebrovascular disease. Basilar artery velocity > 130 cm/sec seems to be an indirect sign of an underlying cerebrovascular disease. Copyright © 2010 Wiley-Liss, Inc.

  10. Winter movement dynamics of Black Brant

    USGS Publications Warehouse

    Lindberg, Mark S.; Ward, David H.; Tibbitts, T. Lee; Roser, John

    2007-01-01

    Although North American geese are managed based on their breeding distributions, the dynamics of those breeding populations may be affected by events that occur during the winter. Birth rates of capital breeding geese may be influenced by wintering conditions, mortality may be influenced by timing of migration and wintering distribution, and immigration and emigration among breeding populations may depend on winter movement and timing of pair formation. We examined factors affecting movements of black brant (Branta bernicla nigricans) among their primary wintering sites in Mexico and southern California, USA, (Mar 1998-Mar 2000) using capture-recapture models. Although brant exhibited high probability (>0.85) of monthly and annual fidelity to the wintering sites we sampled, we observed movements among all wintering sites. Movement probabilities both within and among winters were negatively related to distance between sites. We observed a higher probability both of southward movement between winters (Mar to Dec) and northward movement between months within winters. Between-winter movements were probably most strongly affected by spatial and temporal variation in habitat quality as we saw movement patterns consistent with contrasting environmental conditions (e.g., La Niña and El Niño southern oscillation cycles). Month-to-month movements were related to migration patterns and may also have been affected by differences in habitat conditions among sites. Patterns of winter movements indicate that a network of wintering sites may be necessary for effective conservation of brant.

  11. Winter movement dynamics of black brant

    USGS Publications Warehouse

    Lindberg, Mark S.; Ward, David H.; Tibbitts, T. Lee; Roser, John

    2007-01-01

    Although North American geese are managed based on their breeding distributions, the dynamics of those breeding populations may be affected by events that occur during the winter. Birth rates of capital breeding geese may be influenced by wintering conditions, mortality may be influenced by timing of migration and wintering distribution, and immigration and emigration among breeding populations may depend on winter movement and timing of pair formation. We examined factors affecting movements of black brant (Branta bernicla nigricans) among their primary wintering sites in Mexico and southern California, USA, (Mar 1998–Mar 2000) using capture–recapture models. Although brant exhibited high probability (>0.85) of monthly and annual fidelity to the wintering sites we sampled, we observed movements among all wintering sites. Movement probabilities both within and among winters were negatively related to distance between sites. We observed a higher probability both of southward movement between winters (Mar to Dec) and northward movement between months within winters. Between-winter movements were probably most strongly affected by spatial and temporal variation in habitat quality as we saw movement patterns consistent with contrasting environmental conditions (e.g., La Niña and El Niño southern oscillation cycles). Month-to-month movements were related to migration patterns and may also have been affected by differences in habitat conditions among sites. Patterns of winter movements indicate that a network of wintering sites may be necessary for effective conservation of brant.

  12. New Concepts in the Evaluation of Biodegradation/Persistence of Chemical Substances Using a Microbial Inoculum

    PubMed Central

    Thouand, Gérald; Durand, Marie-José; Maul, Armand; Gancet, Christian; Blok, Han

    2011-01-01

    The European REACH Regulation (Registration, Evaluation, Authorization of CHemical substances) implies, among other things, the evaluation of the biodegradability of chemical substances produced by industry. A large set of test methods is available including detailed information on the appropriate conditions for testing. However, the inoculum used for these tests constitutes a “black box.” If biodegradation is achievable from the growth of a small group of specific microbial species with the substance as the only carbon source, the result of the test depends largely on the cell density of this group at “time zero.” If these species are relatively rare in an inoculum that is normally used, the likelihood of inoculating a test with sufficient specific cells becomes a matter of probability. Normally this probability increases with total cell density and with the diversity of species in the inoculum. Furthermore the history of the inoculum, e.g., a possible pre-exposure to the test substance or similar substances will have a significant influence on the probability. A high probability can be expected for substances that are widely used and regularly released into the environment, whereas a low probability can be expected for new xenobiotic substances that have not yet been released into the environment. Be that as it may, once the inoculum sample contains sufficient specific degraders, the performance of the biodegradation will follow a typical S shaped growth curve which depends on the specific growth rate under laboratory conditions, the so called F/M ratio (ratio between food and biomass) and the more or less toxic recalcitrant, but possible, metabolites. Normally regulators require the evaluation of the growth curve using a simple approach such as half-time. Unfortunately probability and biodegradation half-time are very often confused. As the half-time values reflect laboratory conditions which are quite different from environmental conditions (after a substance is released), these values should not be used to quantify and predict environmental behavior. The probability value could be of much greater benefit for predictions under realistic conditions. The main issue in the evaluation of probability is that the result is not based on a single inoculum from an environmental sample, but on a variety of samples. These samples can be representative of regional or local areas, climate regions, water types, and history, e.g., pristine or polluted. The above concept has provided us with a new approach, namely “Probabio.” With this approach, persistence is not only regarded as a simple intrinsic property of a substance, but also as the capability of various environmental samples to degrade a substance under realistic exposure conditions and F/M ratio. PMID:21863143

  13. Extreme river flow dependence in Northern Scotland

    NASA Astrophysics Data System (ADS)

    Villoria, M. Franco; Scott, M.; Hoey, T.; Fischbacher-Smith, D.

    2012-04-01

    Various methods for the spatial analysis of hydrologic data have been developed recently. Here we present results using the conditional probability approach proposed by Keef et al. [Appl. Stat. (2009): 58,601-18] to investigate spatial interdependence in extreme river flows in Scotland. This approach does not require the specification of a correlation function, being mostly suitable for relatively small geographical areas. The work is motivated by the Flood Risk Management Act (Scotland (2009)) which requires maps of flood risk that take account of spatial dependence in extreme river flow. The method is based on two conditional measures of spatial flood risk: firstly the conditional probability PC(p) that a set of sites Y = (Y 1,...,Y d) within a region C of interest exceed a flow threshold Qp at time t (or any lag of t), given that in the specified conditioning site X > Qp; and, secondly the expected number of sites within C that will exceed a flow Qp on average (given that X > Qp). The conditional probabilities are estimated using the conditional distribution of Y |X = x (for large x), which can be modeled using a semi-parametric approach (Heffernan and Tawn [Roy. Statist. Soc. Ser. B (2004): 66,497-546]). Once the model is fitted, pseudo-samples can be generated to estimate functionals of the joint tails of the distribution of (Y,X). Conditional return level plots were directly compared to traditional return level plots thus improving our understanding of the dependence structure of extreme river flow events. Confidence intervals were calculated using block bootstrapping methods (100 replicates). We report results from applying this approach to a set of four rivers (Dulnain, Lossie, Ewe and Ness) in Northern Scotland. These sites were chosen based on data quality, spatial location and catchment characteristics. The river Ness, being the largest (catchment size 1839.1km2) was chosen as the conditioning river. Both the Ewe (441.1km2) and Ness catchments have predominantly impermeable bedrock, with the Ewe's one being very wet. The Lossie(216km2) and Dulnain (272.2km2) both contain significant areas of glacial deposits. River flow in the Dulnain is usually affected by snowmelt. In all cases, the conditional probability of each of the three rivers (Dulnain, Lossie, Ewe) decreases as the event in the conditioning river (Ness) becomes more extreme. The Ewe, despite being the furthest of the three sites from the Ness shows the strongest dependence, with relatively high (>0.4) conditional probabilities even for very extreme events (>0.995). Although the Lossie is closer geographically to the Ness than the Ewe, it shows relatively low conditional probabilities and can be considered independent of the Ness for very extreme events (> 0.990). The conditional probabilities seem to reflect the different catchment characteristics and dominant precipitation generating events, with the Ewe being more similar to the Ness than the other two rivers. This interpretation suggests that the conditional method may yield improved estimates of extreme events, but the approach is time consuming. An alternative model that is easier to implement, using a spatial quantile regression, is currently being investigated, which would also allow the introduction of further covariates, essential as the effects of climate change are incorporated into estimation procedures.
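
    The conditional measure PC(p) can be illustrated empirically with a short sketch. The synthetic flow series below stand in for the gauged records, and the sketch uses raw empirical counts rather than the semi-parametric Heffernan and Tawn model fitted in the study.

```python
# Empirical illustration of the conditional measure described above:
# P(Y > Q_p(Y) | X > Q_p(X)) for a neighbouring gauge Y given the conditioning
# gauge X. Synthetic correlated flows stand in for the real records.
import numpy as np

rng = np.random.default_rng(1)
n = 365 * 30
z = rng.normal(size=n)                                 # shared weather driver
x = np.exp(1.0 * z + 0.5 * rng.normal(size=n))         # conditioning river
y_dep = np.exp(0.8 * z + 0.8 * rng.normal(size=n))     # strongly dependent neighbour
y_ind = np.exp(rng.normal(size=n))                     # near-independent neighbour

def cond_exceed(x, y, p):
    """Empirical P(Y > its p-quantile | X > its p-quantile)."""
    qx, qy = np.quantile(x, p), np.quantile(y, p)
    mask = x > qx
    return np.mean(y[mask] > qy)

for p in (0.95, 0.99, 0.995):
    print(f"p = {p}: dependent gauge {cond_exceed(x, y_dep, p):.2f}, "
          f"independent gauge {cond_exceed(x, y_ind, p):.2f}")
```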

  14. Investigation of shipping accident injury severity and mortality.

    PubMed

    Weng, Jinxian; Yang, Dong

    2015-03-01

    Shipping operations take place in a complex and high-risk environment, and fatal shipping accidents are the nightmares of seafarers. With ten years of worldwide ship accident data, this study develops a binary logistic regression model and a zero-truncated binomial regression model to predict the probability of fatal shipping accidents and the corresponding mortalities. The model results show that both the probability of fatal accidents and the number of mortalities are greater for collision, fire/explosion, contact, grounding, and sinking accidents that occur in adverse weather and darkness conditions. Sinking has the largest effect on the increase in fatal accident probability and mortalities. The results also show that higher numbers of mortalities are associated with shipping accidents that occur far from coastal areas, harbors, and ports. In addition, cruise ships are found to have more mortalities than non-cruise ships. The results of this study are beneficial for policy-makers in proposing efficient strategies to prevent fatal shipping accidents. Copyright © 2015 Elsevier Ltd. All rights reserved.

  15. Predicted liquefaction of East Bay fills during a repeat of the 1906 San Francisco earthquake

    USGS Publications Warehouse

    Holzer, T.L.; Blair, J.L.; Noce, T.E.; Bennett, M.J.

    2006-01-01

    Predicted conditional probabilities of surface manifestations of liquefaction during a repeat of the 1906 San Francisco (M7.8) earthquake range from 0.54 to 0.79 in the area underlain by the sandy artificial fills along the eastern shore of San Francisco Bay near Oakland, California. Despite widespread liquefaction in 1906 of sandy fills in San Francisco, most of the East Bay fills were emplaced after 1906 without soil improvement to increase their liquefaction resistance. They have yet to be shaken strongly. Probabilities are based on the liquefaction potential index computed from 82 CPT soundings using median (50th percentile) estimates of PGA based on a ground-motion prediction equation. Shaking estimates consider both distance from the San Andreas Fault and local site conditions. The high probabilities indicate extensive and damaging liquefaction will occur in East Bay fills during the next M ≥ 7.8 earthquake on the northern San Andreas Fault. © 2006, Earthquake Engineering Research Institute.

  16. Analysis of high-resolution foreign exchange data of USD-JPY for 13 years

    NASA Astrophysics Data System (ADS)

    Mizuno, Takayuki; Kurihara, Shoko; Takayasu, Misako; Takayasu, Hideki

    2003-06-01

    We analyze high-resolution foreign exchange data consisting of 20 million data points of USD-JPY for 13 years to report firm statistical laws in distributions and correlations of exchange rate fluctuations. A conditional probability density analysis clearly shows the existence of trend-following movements at a time scale of 8 ticks, about 1 min.
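
    A sketch of the type of conditional analysis described: the average next move of the rate conditioned on the preceding 8-tick move, where a positive relationship indicates trend following. The tick series below is synthetic, with a small amount of persistence built in for illustration.

```python
# Conditional diagnostic on tick returns: what is the mean next move given a
# large 8-tick up-move or down-move? Synthetic ticks stand in for USD-JPY data.
import numpy as np

rng = np.random.default_rng(2)
n = 200_000
noise = rng.normal(0, 0.01, n)
r = np.empty(n)
r[0] = noise[0]
for i in range(1, n):
    r[i] = 0.1 * r[i - 1] + noise[i]      # weak persistence, purely illustrative

window = 8
past = np.array([r[i - window:i].sum() for i in range(window, n)])  # 8-tick move
nxt = r[window:]                                                    # following tick

down = past <= np.quantile(past, 0.2)
up = past >= np.quantile(past, 0.8)
print(f"mean next move after a large 8-tick down-move: {nxt[down].mean():+.6f}")
print(f"mean next move after a large 8-tick up-move:   {nxt[up].mean():+.6f}")
```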

  17. Questioning the Relevance of Model-Based Probability Statements on Extreme Weather and Future Climate

    NASA Astrophysics Data System (ADS)

    Smith, L. A.

    2007-12-01

    We question the relevance of climate-model based Bayesian (or other) probability statements for decision support and impact assessment on spatial scales less than continental and temporal averages less than seasonal. Scientific assessment of higher resolution space and time scale information is urgently needed, given the commercial availability of "products" at high spatiotemporal resolution, their provision by nationally funded agencies for use both in industry decision making and governmental policy support, and their presentation to the public as matters of fact. Specifically we seek to establish necessary conditions for probability forecasts (projections conditioned on a model structure and a forcing scenario) to be taken seriously as reflecting the probability of future real-world events. We illustrate how risk management can profitably employ imperfect models of complicated chaotic systems, following NASA's study of near-Earth PHOs (Potentially Hazardous Objects). Our climate models will never be perfect, nevertheless the space and time scales on which they provide decision- support relevant information is expected to improve with the models themselves. Our aim is to establish a set of baselines of internal consistency; these are merely necessary conditions (not sufficient conditions) that physics based state-of-the-art models are expected to pass if their output is to be judged decision support relevant. Probabilistic Similarity is proposed as one goal which can be obtained even when our models are not empirically adequate. In short, probabilistic similarity requires that, given inputs similar to today's empirical observations and observational uncertainties, we expect future models to produce similar forecast distributions. Expert opinion on the space and time scales on which we might reasonably expect probabilistic similarity may prove of much greater utility than expert elicitation of uncertainty in parameter values in a model that is not empirically adequate; this may help to explain the reluctance of experts to provide information on "parameter uncertainty." Probability statements about the real world are always conditioned on some information set; they may well be conditioned on "False" making them of little value to a rational decision maker. In other instances, they may be conditioned on physical assumptions not held by any of the modellers whose model output is being cast as a probability distribution. Our models will improve a great deal in the next decades, and our insight into the likely climate fifty years hence will improve: maintaining the credibility of the science and the coherence of science based decision support, as our models improve, require a clear statement of our current limitations. What evidence do we have that today's state-of-the-art models provide decision-relevant probability forecasts? What space and time scales do we currently have quantitative, decision-relevant information on for 2050? 2080?

  18. Second-order contrast based on the expectation of effort and reinforcement.

    PubMed

    Clement, Tricia S; Zentall, Thomas R

    2002-01-01

    Pigeons prefer signals for reinforcement that require greater effort (or time) to obtain over those that require less effort to obtain (T. S. Clement, J. Feltus, D. H. Kaiser, & T. R. Zentall, 2000). Preference was attributed to contrast (or to the relatively greater improvement in conditions) produced by the appearance of the signal when it was preceded by greater effort. In Experiment 1, the authors of the present study demonstrated that the expectation of greater effort was sufficient to produce such a preference (a second-order contrast effect). In Experiments 2 and 3, low versus high probability of reinforcement was substituted for high versus low effort, respectively, with similar results. In Experiment 3, the authors found that the stimulus preference could be attributed to positive contrast (when the discriminative stimuli represented an improvement in the probability of reinforcement) and perhaps also negative contrast (when the discriminative stimuli represented reduction in the probability of reinforcement).

  19. Collapse susceptibility mapping in karstified gypsum terrain (Sivas basin - Turkey) by conditional probability, logistic regression, artificial neural network models

    NASA Astrophysics Data System (ADS)

    Yilmaz, Isik; Keskin, Inan; Marschalko, Marian; Bednarik, Martin

    2010-05-01

    This study compares GIS-based collapse susceptibility mapping methods, namely conditional probability (CP), logistic regression (LR), and artificial neural networks (ANN), applied to gypsum rock masses in the Sivas basin (Turkey). A Digital Elevation Model (DEM) was first constructed using GIS software. Collapse-related factors, directly or indirectly related to the causes of collapse occurrence, such as distance from faults, slope angle and aspect, topographical elevation, distance from drainage, topographic wetness index (TWI), stream power index (SPI), Normalized Difference Vegetation Index (NDVI) as a measure of vegetation cover, and distance from roads and settlements, were used in the collapse susceptibility analyses. In the last stage of the analyses, collapse susceptibility maps were produced from the CP, LR, and ANN models and were then compared by means of their validations. Area Under Curve (AUC) values obtained from all three methodologies showed that the map obtained from the ANN model appears to be more accurate than those from the other models, and the results also showed that artificial neural networks are a useful tool for preparing collapse susceptibility maps and are highly compatible with GIS operations. Key words: Collapse; doline; susceptibility map; gypsum; GIS; conditional probability; logistic regression; artificial neural networks.
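
    As an illustration of one of the three methods compared, the sketch below fits a logistic-regression susceptibility model and validates it with AUC. The predictors follow the factor list above, but the data are synthetic placeholders, and the sketch assumes scikit-learn is available.

```python
# Logistic-regression susceptibility sketch with AUC validation.
# Predictor names follow the abstract; values and coefficients are invented.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)
n = 2000
X = np.column_stack([
    rng.exponential(500, n),      # distance from faults (m)
    rng.uniform(0, 35, n),        # slope angle (deg)
    rng.uniform(0, 1, n),         # topographic wetness index (scaled)
])
logit = -1.0 - 0.002 * X[:, 0] + 0.05 * X[:, 1] + 1.5 * X[:, 2]
y = rng.random(n) < 1 / (1 + np.exp(-logit))          # collapse occurrence

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_tr, y_tr)
susceptibility = model.predict_proba(X_te)[:, 1]      # P(collapse | predictors)
print("validation AUC =", round(roc_auc_score(y_te, susceptibility), 3))
```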

  20. An Inverse Problem for a Class of Conditional Probability Measure-Dependent Evolution Equations

    PubMed Central

    Mirzaev, Inom; Byrne, Erin C.; Bortz, David M.

    2016-01-01

    We investigate the inverse problem of identifying a conditional probability measure in measure-dependent evolution equations arising in size-structured population modeling. We formulate the inverse problem as a least squares problem for the probability measure estimation. Using the Prohorov metric framework, we prove existence and consistency of the least squares estimates and outline a discretization scheme for approximating a conditional probability measure. For this scheme, we prove general method stability. The work is motivated by Partial Differential Equation (PDE) models of flocculation for which the shape of the post-fragmentation conditional probability measure greatly impacts the solution dynamics. To illustrate our methodology, we apply the theory to a particular PDE model that arises in the study of population dynamics for flocculating bacterial aggregates in suspension, and provide numerical evidence for the utility of the approach. PMID:28316360

  1. Exact calculation of loop formation probability identifies folding motifs in RNA secondary structures

    PubMed Central

    Sloma, Michael F.; Mathews, David H.

    2016-01-01

    RNA secondary structure prediction is widely used to analyze RNA sequences. In an RNA partition function calculation, free energy nearest neighbor parameters are used in a dynamic programming algorithm to estimate statistical properties of the secondary structure ensemble. Previously, partition functions have largely been used to estimate the probability that a given pair of nucleotides form a base pair, the conditional stacking probability, the accessibility to binding of a continuous stretch of nucleotides, or a representative sample of RNA structures. Here it is demonstrated that an RNA partition function can also be used to calculate the exact probability of formation of hairpin loops, internal loops, bulge loops, or multibranch loops at a given position. This calculation can also be used to estimate the probability of formation of specific helices. Benchmarking on a set of RNA sequences with known secondary structures indicated that loops that were calculated to be more probable were more likely to be present in the known structure than less probable loops. Furthermore, highly probable loops are more likely to be in the known structure than the set of loops predicted in the lowest free energy structures. PMID:27852924

  2. Endogenous modulation of low frequency oscillations by temporal expectations

    PubMed Central

    Cravo, Andre M.; Rohenkohl, Gustavo; Wyart, Valentin

    2011-01-01

    Recent studies have associated increasing temporal expectations with synchronization of higher frequency oscillations and suppression of lower frequencies. In this experiment, we explore a proposal that low-frequency oscillations provide a mechanism for regulating temporal expectations. We used a speeded Go/No-go task and manipulated temporal expectations by changing the probability of target presentation after certain intervals. Across two conditions, the temporal conditional probability of target events differed substantially at the first of three possible intervals. We found that reactions times differed significantly at this first interval across conditions, decreasing with higher temporal expectations. Interestingly, the power of theta activity (4–8 Hz), distributed over central midline sites, also differed significantly across conditions at this first interval. Furthermore, we found a transient coupling between theta phase and beta power after the first interval in the condition with high temporal expectation for targets at this time point. Our results suggest that the adjustments in theta power and the phase-power coupling between theta and beta contribute to a central mechanism for controlling neural excitability according to temporal expectations. PMID:21900508

  3. Seasonal Variability of Middle Latitude Ozone in the Lowermost Stratosphere Derived from Probability Distribution Functions

    NASA Technical Reports Server (NTRS)

    Cerniglia, M. C.; Douglass, A. R.; Rood, R. B.; Sparling, L. C..; Nielsen, J. E.

    1999-01-01

    We present a study of the distribution of ozone in the lowermost stratosphere with the goal of understanding the relative contribution to the observations of air of either distinctly tropospheric or stratospheric origin. The air in the lowermost stratosphere is divided into two population groups based on Ertel's potential vorticity at 300 hPa. High [low] potential vorticity at 300 hPa suggests that the tropopause is low [high], and the identification of the two groups helps to account for dynamic variability. Conditional probability distribution functions are used to define the statistics of the mix from both observations and model simulations. Two data sources are chosen. First, several years of ozonesonde observations are used to exploit the high vertical resolution. Second, observations made by the Halogen Occultation Experiment [HALOE] on the Upper Atmosphere Research Satellite [UARS] are used to understand the impact on the results of the spatial limitations of the ozonesonde network. The conditional probability distribution functions are calculated at a series of potential temperature surfaces spanning the domain from the midlatitude tropopause to surfaces higher than the mean tropical tropopause [about 380K]. Despite the differences in spatial and temporal sampling, the probability distribution functions are similar for the two data sources. Comparisons with the model demonstrate that the model maintains a mix of air in the lowermost stratosphere similar to the observations. The model also simulates a realistic annual cycle. By using the model, possible mechanisms for the maintenance of mix of air in the lowermost stratosphere are revealed. The relevance of the results to the assessment of the environmental impact of aircraft effluence is discussed.

  4. Seasonal Variability of Middle Latitude Ozone in the Lowermost Stratosphere Derived from Probability Distribution Functions

    NASA Technical Reports Server (NTRS)

    Cerniglia, M. C.; Douglass, A. R.; Rood, R. B.; Sparling, L. C.; Nielsen, J. E.

    1999-01-01

    We present a study of the distribution of ozone in the lowermost stratosphere with the goal of understanding the relative contribution to the observations of air of either distinctly tropospheric or stratospheric origin. The air in the lowermost stratosphere is divided into two population groups based on Ertel's potential vorticity at 300 hPa. High [low] potential vorticity at 300 hPa suggests that the tropopause is low [high], and the identification of the two groups helps to account for dynamic variability. Conditional probability distribution functions are used to define the statistics of the mix from both observations and model simulations. Two data sources are chosen. First, several years of ozonesonde observations are used to exploit the high vertical resolution. Second, observations made by the Halogen Occultation Experiment [HALOE) on the Upper Atmosphere Research Satellite [UARS] are used to understand the impact on the results of the spatial limitations of the ozonesonde network. The conditional probability distribution functions are calculated at a series of potential temperature surfaces spanning the domain from the midlatitude tropopause to surfaces higher than the mean tropical tropopause [approximately 380K]. Despite the differences in spatial and temporal sampling, the probability distribution functions are similar for the two data sources. Comparisons with the model demonstrate that the model maintains a mix of air in the lowermost stratosphere similar to the observations. The model also simulates a realistic annual cycle. By using the model, possible mechanisms for the maintenance of mix of air in the lowermost stratosphere are revealed. The relevance of the results to the assessment of the environmental impact of aircraft effluence is discussed.

  5. Updating: Learning versus Supposing

    ERIC Educational Resources Information Center

    Zhao, Jiaying; Crupi, Vincenzo; Tentori, Katya; Fitelson, Branden; Osherson, Daniel

    2012-01-01

    Bayesian orthodoxy posits a tight relationship between conditional probability and updating. Namely, the probability of an event "A" after learning "B" should equal the conditional probability of "A" given "B" prior to learning "B". We examine whether ordinary judgment conforms to the orthodox view. In three experiments we found substantial…

  6. Music-evoked incidental happiness modulates probability weighting during risky lottery choices

    PubMed Central

    Schulreich, Stefan; Heussen, Yana G.; Gerhardt, Holger; Mohr, Peter N. C.; Binkofski, Ferdinand C.; Koelsch, Stefan; Heekeren, Hauke R.

    2014-01-01

    We often make decisions with uncertain consequences. The outcomes of the choices we make are usually not perfectly predictable but probabilistic, and the probabilities can be known or unknown. Probability judgments, i.e., the assessment of unknown probabilities, can be influenced by evoked emotional states. This suggests that also the weighting of known probabilities in decision making under risk might be influenced by incidental emotions, i.e., emotions unrelated to the judgments and decisions at issue. Probability weighting describes the transformation of probabilities into subjective decision weights for outcomes and is one of the central components of cumulative prospect theory (CPT) that determine risk attitudes. We hypothesized that music-evoked emotions would modulate risk attitudes in the gain domain and in particular probability weighting. Our experiment featured a within-subject design consisting of four conditions in separate sessions. In each condition, the 41 participants listened to a different kind of music—happy, sad, or no music, or sequences of random tones—and performed a repeated pairwise lottery choice task. We found that participants chose the riskier lotteries significantly more often in the “happy” than in the “sad” and “random tones” conditions. Via structural regressions based on CPT, we found that the observed changes in participants' choices can be attributed to changes in the elevation parameter of the probability weighting function: in the “happy” condition, participants showed significantly higher decision weights associated with the larger payoffs than in the “sad” and “random tones” conditions. Moreover, elevation correlated positively with self-reported music-evoked happiness. Thus, our experimental results provide evidence in favor of a causal effect of incidental happiness on risk attitudes that can be explained by changes in probability weighting. PMID:24432007
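
    The probability-weighting component can be sketched with a common two-parameter form; the linear-in-log-odds function and the parameter values below are illustrative choices, not the estimates reported in the study.

```python
# Sketch of probability weighting with an elevation parameter: the
# linear-in-log-odds form w(p) = d*p^g / (d*p^g + (1-p)^g), where d controls
# elevation and g controls curvature. Parameter values are illustrative.
import numpy as np

def weight(p, delta, gamma):
    num = delta * p**gamma
    return num / (num + (1.0 - p)**gamma)

p = np.array([0.1, 0.3, 0.5, 0.7, 0.9])
for label, delta in [("lower elevation (e.g. 'sad')", 0.7),
                     ("higher elevation (e.g. 'happy')", 1.0)]:
    print(label, np.round(weight(p, delta, 0.65), 3))
```

    A larger elevation parameter produces uniformly higher decision weights, which is the mechanism the abstract invokes to explain riskier choices in the "happy" condition.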

  7. Music-evoked incidental happiness modulates probability weighting during risky lottery choices.

    PubMed

    Schulreich, Stefan; Heussen, Yana G; Gerhardt, Holger; Mohr, Peter N C; Binkofski, Ferdinand C; Koelsch, Stefan; Heekeren, Hauke R

    2014-01-07

    We often make decisions with uncertain consequences. The outcomes of the choices we make are usually not perfectly predictable but probabilistic, and the probabilities can be known or unknown. Probability judgments, i.e., the assessment of unknown probabilities, can be influenced by evoked emotional states. This suggests that also the weighting of known probabilities in decision making under risk might be influenced by incidental emotions, i.e., emotions unrelated to the judgments and decisions at issue. Probability weighting describes the transformation of probabilities into subjective decision weights for outcomes and is one of the central components of cumulative prospect theory (CPT) that determine risk attitudes. We hypothesized that music-evoked emotions would modulate risk attitudes in the gain domain and in particular probability weighting. Our experiment featured a within-subject design consisting of four conditions in separate sessions. In each condition, the 41 participants listened to a different kind of music-happy, sad, or no music, or sequences of random tones-and performed a repeated pairwise lottery choice task. We found that participants chose the riskier lotteries significantly more often in the "happy" than in the "sad" and "random tones" conditions. Via structural regressions based on CPT, we found that the observed changes in participants' choices can be attributed to changes in the elevation parameter of the probability weighting function: in the "happy" condition, participants showed significantly higher decision weights associated with the larger payoffs than in the "sad" and "random tones" conditions. Moreover, elevation correlated positively with self-reported music-evoked happiness. Thus, our experimental results provide evidence in favor of a causal effect of incidental happiness on risk attitudes that can be explained by changes in probability weighting.

  8. CProb: a computational tool for conducting conditional probability analysis.

    PubMed

    Hollister, Jeffrey W; Walker, Henry A; Paul, John F

    2008-01-01

    Conditional probability is the probability of observing one event given that another event has occurred. In an environmental context, conditional probability helps to assess the association between an environmental contaminant (i.e., the stressor) and the ecological condition of a resource (i.e., the response). These analyses, when combined with controlled experiments and other methodologies, show great promise in evaluating ecological conditions from observational data and in defining water quality and other environmental criteria. Current applications of conditional probability analysis (CPA) are largely done via scripts or cumbersome spreadsheet routines, which may prove daunting to end-users and do not provide access to the underlying scripts. Combining spreadsheets with scripts eases computation through a familiar interface (i.e., Microsoft Excel) and creates a transparent process through full accessibility to the scripts. With this in mind, we developed a software application, CProb, as an Add-in for Microsoft Excel with R, R(D)com Server, and Visual Basic for Applications. CProb calculates and plots scatterplots, empirical cumulative distribution functions, and conditional probability. In this short communication, we describe CPA, our motivation for developing a CPA tool, and our implementation of CPA as a Microsoft Excel Add-in. Further, we illustrate the use of our software with two examples: a water quality example and a landscape example. CProb is freely available for download at http://www.epa.gov/emap/nca/html/regions/cprob.
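
    The CPA computation that CProb wraps can be sketched directly in code; the stressor variable, the degraded-condition indicator, and the thresholds below are invented for illustration, and the sketch does not reproduce the Excel/R add-in itself.

```python
# Sketch of conditional probability analysis (CPA): for a range of stressor
# thresholds, estimate P(response is degraded | stressor > threshold).
# Data and the 'degraded' criterion are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(3)
stressor = rng.lognormal(mean=0.0, sigma=1.0, size=500)                # e.g. a contaminant
degraded = rng.random(500) < 1.0 / (1.0 + np.exp(-(stressor - 2)))     # response indicator

thresholds = np.quantile(stressor, np.linspace(0.05, 0.95, 10))
for t in thresholds:
    exceed = stressor > t
    p_cond = degraded[exceed].mean()
    print(f"P(degraded | stressor > {t:5.2f}) = {p_cond:.2f}  (n = {exceed.sum()})")
```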

  9. Probability in reasoning: a developmental test on conditionals.

    PubMed

    Barrouillet, Pierre; Gauffroy, Caroline

    2015-04-01

    Probabilistic theories have been claimed to constitute a new paradigm for the psychology of reasoning. A key assumption of these theories is captured by what they call the Equation, the hypothesis that the meaning of the conditional is probabilistic in nature and that the probability of If p then q is the conditional probability, in such a way that P(if p then q)=P(q|p). Using the probabilistic truth-table task in which participants are required to evaluate the probability of If p then q sentences, the present study explored the pervasiveness of the Equation through ages (from early adolescence to adulthood), types of conditionals (basic, causal, and inducements) and contents. The results reveal that the Equation is a late developmental achievement only endorsed by a narrow majority of educated adults for certain types of conditionals depending on the content they involve. Age-related changes in evaluating the probability of all the conditionals studied closely mirror the development of truth-value judgements observed in previous studies with traditional truth-table tasks. We argue that our modified mental model theory can account for this development, and hence for the findings related with the probability task, which do not consequently support the probabilistic approach of human reasoning over alternative theories. Copyright © 2014 Elsevier B.V. All rights reserved.
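
    A small worked example of the Equation, using invented frequencies of the kind presented in probabilistic truth-table tasks:

```python
# Worked illustration of "the Equation": with a pack of chips that are either
# square or round and either black or white, P(if square then black) should
# equal P(black | square). Frequencies are invented for illustration.
counts = {("square", "black"): 30, ("square", "white"): 10,
          ("round", "black"): 20, ("round", "white"): 40}
total = sum(counts.values())
p_square = (counts[("square", "black")] + counts[("square", "white")]) / total
p_square_and_black = counts[("square", "black")] / total
print("P(black | square) =", p_square_and_black / p_square)   # 30 / 40 = 0.75
```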

  10. Face Recognition for Access Control Systems Combining Image-Difference Features Based on a Probabilistic Model

    NASA Astrophysics Data System (ADS)

    Miwa, Shotaro; Kage, Hiroshi; Hirai, Takashi; Sumi, Kazuhiko

    We propose a probabilistic face recognition algorithm for Access Control Systems (ACSs). Compared with existing ACSs that use low-cost IC-cards, face recognition has advantages in usability and security: it does not require people to hold cards over scanners, and it does not accept impostors carrying authorized cards. Face recognition therefore attracts more interest in security markets than IC-cards. But in security markets where low-cost ACSs exist, price competition is important, and there is a limitation on the quality of available cameras and image control. ACSs using face recognition are therefore required to handle much lower-quality images, such as defocused and poorly gain-controlled images, than high-security systems such as immigration control. To tackle such image-quality problems, we developed a face recognition algorithm based on a probabilistic model that combines a variety of image-difference features trained by Real AdaBoost with their prior probability distributions. This makes it possible to evaluate and use only the reliable features among the trained ones during each authentication and to achieve high recognition rates. A field evaluation using a pseudo Access Control System installed in our office shows that the proposed system achieves a constantly high recognition rate independent of face image quality, with an EER (Equal Error Rate) about four times lower, across a variety of image conditions, than a system without prior probability distributions. In contrast, using image-difference features without priors is sensitive to image quality. We also evaluated PCA, which has worse but constant performance because it is optimized globally over all the data. Compared with PCA, Real AdaBoost without prior distributions performs twice as well under good image conditions but degrades to PCA-level performance under poor image conditions.

  11. Memory and decision making: Effects of sequential presentation of probabilities and outcomes in risky prospects.

    PubMed

    Millroth, Philip; Guath, Mona; Juslin, Peter

    2018-06-07

    The rationality of decision making under risk is of central concern in psychology and other behavioral sciences. In real life, the information relevant to a decision often arrives sequentially or changes over time, implying nontrivial demands on memory. Yet, little is known about how this affects the ability to make rational decisions, and the default assumption is rather that information about outcomes and probabilities is simultaneously available at the time of the decision. In 4 experiments, we show that participants receiving probability- and outcome information sequentially report substantially (29 to 83%) higher certainty equivalents than participants with simultaneous presentation. This holds also for monetarily incentivized participants with perfect recall of the information. Participants in the sequential conditions often violate stochastic dominance in the sense that they pay more for a lottery with low probability of an outcome than participants in the simultaneous condition pay for a high probability of the same outcome. Computational modeling demonstrates that Cumulative Prospect Theory (Tversky & Kahneman, 1992) fails to account for the effects of sequential presentation, but a model assuming anchoring-and-adjustment constrained by memory can account for the data. By implication, established assumptions of rationality may need to be reconsidered to account for the effects of memory in many real-life tasks. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
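
    For a simple two-outcome gain prospect, the certainty equivalent implied by CPT can be computed directly. The sketch below uses the standard Tversky-Kahneman (1992) value and weighting functions with their published median parameters; these are assumptions for illustration, not the parameters estimated in this study.

```python
def tk_weight(p, gamma=0.61):
    """Tversky-Kahneman (1992) probability weighting function for gains."""
    return p**gamma / (p**gamma + (1.0 - p)**gamma) ** (1.0 / gamma)

def certainty_equivalent(x, p, alpha=0.88, gamma=0.61):
    """Certainty equivalent of a prospect paying x with probability p (0 otherwise)."""
    cpt_value = tk_weight(p, gamma) * x**alpha   # decision weight times value of the gain
    return cpt_value ** (1.0 / alpha)            # invert the value function v(x) = x**alpha

print(certainty_equivalent(x=100.0, p=0.05))     # ~10: the small probability is overweighted
```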

  12. Further evaluation of leisure items in the attention condition of functional analyses.

    PubMed

    Roscoe, Eileen M; Carreau, Abbey; MacDonald, Jackie; Pence, Sacha T

    2008-01-01

    Research suggests that including leisure items in the attention condition of a functional analysis may produce engagement that masks sensitivity to attention. In this study, 4 individuals' initial functional analyses indicated that behavior was maintained by nonsocial variables (n = 3) or by attention (n = 1). A preference assessment was used to identify items for subsequent functional analyses. Four conditions were compared: attention with and without leisure items, and control with and without leisure items. Following this, either high- or low-preference items were included in the attention condition. Problem behavior was more probable during the attention condition when no leisure items or low-preference items were included, and lower levels of problem behavior were observed during the attention condition when high-preference leisure items were included. These findings suggest how preferred items may hinder detection of behavioral function.

  13. A Violation of the Conditional Independence Assumption in the Two-High-Threshold Model of Recognition Memory

    ERIC Educational Resources Information Center

    Chen, Tina; Starns, Jeffrey J.; Rotello, Caren M.

    2015-01-01

    The 2-high-threshold (2HT) model of recognition memory assumes that test items result in distinct internal states: they are either detected or not, and the probability of responding at a particular confidence level that an item is "old" or "new" depends on the state-response mapping parameters. The mapping parameters are…

  14. Laser radar system for obstacle avoidance

    NASA Astrophysics Data System (ADS)

    Bers, Karlheinz; Schulz, Karl R.; Armbruster, Walter

    2005-09-01

    The threat of hostile surveillance and weapon systems requires military aircraft to fly under extreme conditions such as low altitude, high speed, poor visibility, and incomplete terrain information. The probability of collision with natural and man-made obstacles during such contour missions is high if detection capability is restricted to conventional vision aids. Forward-looking scanning laser radars, which are built by EADS and are presently being flight tested and evaluated at German proving grounds, provide a possible solution, having a large field of view, high angular and range resolution, a high pulse repetition rate, and sufficient pulse energy to register returns from objects at distances of military relevance with a high hit-and-detect probability. The development of advanced 3D-scene analysis algorithms has increased the recognition probability and reduced the false alarm rate by using more readily recognizable objects such as terrain, poles, pylons, trees, etc. to generate a parametric description of the terrain surface as well as the class, position, orientation, size, and shape of all objects in the scene. The sensor system and the implemented algorithms can be used for other applications such as terrain following, autonomous obstacle avoidance, and automatic target recognition. This paper describes different 3D-imaging ladar sensors with a unique system architecture but different components matched to different military applications. Emphasis is placed on an obstacle warning system with a high probability of detection of thin wires, real-time processing of the measured range image data, and obstacle classification and visualization.

  15. Nitrous oxide emissions from wastewater treatment processes

    PubMed Central

    Law, Yingyu; Ye, Liu; Pan, Yuting; Yuan, Zhiguo

    2012-01-01

    Nitrous oxide (N2O) emissions from wastewater treatment plants vary substantially between plants, ranging from negligible to substantial (a few per cent of the total nitrogen load), probably because of different designs and operational conditions. In general, plants that achieve high levels of nitrogen removal emit less N2O, indicating that no compromise is required between high water quality and lower N2O emissions. N2O emissions primarily occur in aerated zones/compartments/periods owing to active stripping, and ammonia-oxidizing bacteria, rather than heterotrophic denitrifiers, are the main contributors. However, the detailed mechanisms remain to be fully elucidated, despite strong evidence suggesting that both nitrifier denitrification and the chemical breakdown of intermediates of hydroxylamine oxidation are probably involved. With increased understanding of the fundamental reactions responsible for N2O production in wastewater treatment systems and the conditions that stimulate their occurrence, reduction of N2O emissions from wastewater treatment systems through improved plant design and operation will be achieved in the near future. PMID:22451112

  16. GIS-based probability assessment of natural hazards in forested landscapes of Central and South-Eastern Europe.

    PubMed

    Lorz, C; Fürst, C; Galic, Z; Matijasic, D; Podrazky, V; Potocic, N; Simoncic, P; Strauch, M; Vacik, H; Makeschin, F

    2010-12-01

    We assessed the probability of three major natural hazards--windthrow, drought, and forest fire--for Central and South-Eastern European forests which are major threats for the provision of forest goods and ecosystem services. In addition, we analyzed spatial distribution and implications for a future oriented management of forested landscapes. For estimating the probability of windthrow, we used rooting depth and average wind speed. Probabilities of drought and fire were calculated from climatic and total water balance during growing season. As an approximation to climate change scenarios, we used a simplified approach with a general increase of pET by 20%. Monitoring data from the pan-European forests crown condition program and observed burnt areas and hot spots from the European Forest Fire Information System were used to test the plausibility of probability maps. Regions with high probabilities of natural hazard are identified and management strategies to minimize probability of natural hazards are discussed. We suggest future research should focus on (i) estimating probabilities using process based models (including sensitivity analysis), (ii) defining probability in terms of economic loss, (iii) including biotic hazards, (iv) using more detailed data sets on natural hazards, forest inventories and climate change scenarios, and (v) developing a framework of adaptive risk management.

  17. Diatremes and craters attributed to natural explosions

    USGS Publications Warehouse

    Shoemaker, Eugene Merle

    1956-01-01

    Diatremes - volcanic pipes attributed to explosion - and craters have been studied to infer the ultimate causes and physical conditions attending natural explosive processes. Initial piercement of diatremes on the Navajo reservation, Arizona, was probably along a fracture propagated by a high-pressure aqueous fluid. Gas rising at high velocity along the fracture would become converted to a gas-solid fluidized system by entrainment of wall-rock fragments. The first stages of widening of the vent are probably accomplished mainly by simple abrasion of the high-velocity fluidized system on the walls of the fracture. As the vent widens, its enlargement may be accelerated by inward spalling of the walls. The inferred mechanics of the Navajo-Hopi diatremes is used to illustrate the possibility of diatreme formation over a molten salt mass.

  18. Wildfire risk in the wildland-urban interface: A simulation study in northwestern Wisconsin

    USGS Publications Warehouse

    Bar-Massada, A.; Radeloff, V.C.; Stewart, S.I.; Hawbaker, T.J.

    2009-01-01

    The rapid growth of housing in and near the wildland-urban interface (WUI) increases wildfire risk to lives and structures. To reduce fire risk, it is necessary to identify WUI housing areas that are more susceptible to wildfire. This is challenging, because wildfire patterns depend on fire behavior and spread, which in turn depend on ignition locations, weather conditions, the spatial arrangement of fuels, and topography. The goal of our study was to assess wildfire risk to a 60,000 ha WUI area in northwestern Wisconsin while accounting for all of these factors. We conducted 6000 simulations with two dynamic fire models: Fire Area Simulator (FARSITE) and Minimum Travel Time (MTT) in order to map the spatial pattern of burn probabilities. Simulations were run under normal and extreme weather conditions to assess the effect of weather on fire spread, burn probability, and risk to structures. The resulting burn probability maps were intersected with maps of structure locations and land cover types. The simulations revealed clear hotspots of wildfire activity and a large range of wildfire risk to structures in the study area. As expected, the extreme weather conditions yielded higher burn probabilities over the entire landscape, as well as to different land cover classes and individual structures. Moreover, the spatial pattern of risk was significantly different between extreme and normal weather conditions. The results highlight the fact that extreme weather conditions not only produce higher fire risk than normal weather conditions, but also change the fine-scale locations of high risk areas in the landscape, which is of great importance for fire management in WUI areas. In addition, the choice of weather data may limit the potential for comparisons of risk maps for different areas and for extrapolating risk maps to future scenarios where weather conditions are unknown. Our approach to modeling wildfire risk to structures can aid fire risk reduction management activities by identifying areas with elevated wildfire risk and those most vulnerable under extreme weather conditions. © 2009 Elsevier B.V.
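
    At its core, a burn probability map of the kind described here is the per-cell fraction of simulations in which the cell burned. A minimal sketch, with random elliptical footprints standing in for FARSITE/MTT fire perimeters and invented structure locations:

```python
import numpy as np

rng = np.random.default_rng(7)
nrows, ncols, n_sims = 100, 100, 200
burn_count = np.zeros((nrows, ncols))

for _ in range(n_sims):
    r0, c0 = rng.integers(nrows), rng.integers(ncols)          # random "ignition" location
    rr, cc = np.ogrid[:nrows, :ncols]
    footprint = (rr - r0) ** 2 / 15**2 + (cc - c0) ** 2 / 8**2 <= 1.0
    burn_count += footprint                                     # stand-in for one simulated fire

burn_prob = burn_count / n_sims                                 # per-cell burn probability
structures = rng.integers(0, nrows, size=(50, 2))               # hypothetical structure coordinates
risk = burn_prob[structures[:, 0], structures[:, 1]]
print(risk.mean(), risk.max())
```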

  19. Partitioning Detectability Components in Populations Subject to Within-Season Temporary Emigration Using Binomial Mixture Models

    PubMed Central

    O’Donnell, Katherine M.; Thompson, Frank R.; Semlitsch, Raymond D.

    2015-01-01

    Detectability of individual animals is highly variable and nearly always < 1; imperfect detection must be accounted for to reliably estimate population sizes and trends. Hierarchical models can simultaneously estimate abundance and effective detection probability, but there are several different mechanisms that cause variation in detectability. Neglecting temporary emigration can lead to biased population estimates because availability and conditional detection probability are confounded. In this study, we extend previous hierarchical binomial mixture models to account for multiple sources of variation in detectability. The state process of the hierarchical model describes ecological mechanisms that generate spatial and temporal patterns in abundance, while the observation model accounts for the imperfect nature of counting individuals due to temporary emigration and false absences. We illustrate our model’s potential advantages, including the allowance of temporary emigration between sampling periods, with a case study of southern red-backed salamanders Plethodon serratus. We fit our model and a standard binomial mixture model to counts of terrestrial salamanders surveyed at 40 sites during 3–5 surveys each spring and fall 2010–2012. Our models generated similar parameter estimates to standard binomial mixture models. Aspect was the best predictor of salamander abundance in our case study; abundance increased as aspect became more northeasterly. Increased time-since-rainfall strongly decreased salamander surface activity (i.e. availability for sampling), while higher amounts of woody cover objects and rocks increased conditional detection probability (i.e. probability of capture, given an animal is exposed to sampling). By explicitly accounting for both components of detectability, we increased congruence between our statistical modeling and our ecological understanding of the system. We stress the importance of choosing survey locations and protocols that maximize species availability and conditional detection probability to increase population parameter estimate reliability. PMID:25775182
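
    The confounding of availability and conditional detection can be seen in a small simulation of the observation process described above: repeated counts at a site are binomial draws from the latent abundance with an effective detection probability equal to availability times conditional capture probability, so a standard binomial mixture model can only recover the product. All parameter values are invented.

```python
import numpy as np

rng = np.random.default_rng(3)
n_sites, n_surveys = 40, 5
N = rng.poisson(20.0, size=n_sites)           # latent abundance at each site (state process)

availability = 0.6                            # P(salamander is at the surface during a survey)
p_capture = 0.5                               # P(counted | at the surface)
p_effective = availability * p_capture        # what a standard binomial mixture model estimates

counts = rng.binomial(N[:, None], p_effective, size=(n_sites, n_surveys))
print(counts.mean() / 20.0, p_effective)      # naive counts recover only the product (~0.3)
```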

  20. Estimating rare events in biochemical systems using conditional sampling.

    PubMed

    Sundar, V S

    2017-01-28

    The paper focuses on development of variance reduction strategies to estimate rare events in biochemical systems. Obtaining such rare event probabilities using brute force Monte Carlo simulations in conjunction with the stochastic simulation algorithm (Gillespie's method) is computationally prohibitive. To circumvent this, importance sampling tools such as the weighted stochastic simulation algorithm and the doubly weighted stochastic simulation algorithm have been proposed. However, these strategies require an additional step of determining the important region to sample from, which is not straightforward for most of the problems. In this paper, we apply the subset simulation method, developed as a variance reduction tool in the context of structural engineering, to the problem of rare event estimation in biochemical systems. The main idea is that the rare event probability is expressed as a product of more frequent conditional probabilities. These conditional probabilities are estimated with high accuracy using Monte Carlo simulations, specifically the Markov chain Monte Carlo method with the modified Metropolis-Hastings algorithm. Generating sample realizations of the state vector using the stochastic simulation algorithm is viewed as mapping the discrete-state continuous-time random process to the standard normal random variable vector. This viewpoint opens up the possibility of applying more sophisticated and efficient sampling schemes developed elsewhere to problems in stochastic chemical kinetics. The results obtained using the subset simulation method are compared with existing variance reduction strategies for a few benchmark problems, and a satisfactory improvement in computational time is demonstrated.
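
    Independent of the biochemical application, the core of subset simulation is to write a very small probability as a product of larger conditional probabilities, each estimated from samples generated by a conditional MCMC sampler (here, a single-component modified Metropolis step). The sketch below applies the idea to a deliberately simple one-dimensional toy problem, a standard normal variable exceeding a high threshold, so the estimate can be checked against the exact value; it is not the authors' implementation.

```python
import numpy as np

def subset_simulation(g, threshold, n=2000, p0=0.1, seed=0):
    """Estimate P(g(X) >= threshold) for X ~ N(0, 1) by subset simulation."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(n)
    prob = 1.0
    for _ in range(20):                                       # cap on the number of levels
        level = min(np.quantile(g(x), 1.0 - p0), threshold)   # next intermediate level
        seeds = x[g(x) >= level]
        prob *= seeds.size / n                                # conditional probability of this level
        if level >= threshold:
            return prob
        steps = int(np.ceil(n / seeds.size))                  # chain length per seed
        samples = []
        for s in seeds:
            cur = s
            for _ in range(steps):
                cand = cur + rng.normal()
                # Metropolis step for the N(0,1) target, restricted to the current level
                if rng.uniform() < np.exp(-0.5 * (cand**2 - cur**2)) and g(np.array([cand]))[0] >= level:
                    cur = cand
                samples.append(cur)
        x = np.array(samples[:n])
    return prob

print(subset_simulation(lambda x: x, threshold=4.0))          # exact value is about 3.2e-5
```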

  1. Conditional, Time-Dependent Probabilities for Segmented Type-A Faults in the WGCEP UCERF 2

    USGS Publications Warehouse

    Field, Edward H.; Gupta, Vipin

    2008-01-01

    This appendix presents elastic-rebound-theory (ERT) motivated time-dependent probabilities, conditioned on the date of last earthquake, for the segmented type-A fault models of the 2007 Working Group on California Earthquake Probabilities (WGCEP). These probabilities are included as one option in the WGCEP's Uniform California Earthquake Rupture Forecast 2 (UCERF 2), with the other options being time-independent Poisson probabilities and an 'Empirical' model based on observed seismicity rate changes. A more general discussion of the pros and cons of all methods for computing time-dependent probabilities, as well as the justification of those chosen for UCERF 2, are given in the main body of this report (and the 'Empirical' model is also discussed in Appendix M). What this appendix addresses is the computation of conditional, time-dependent probabilities when both single- and multi-segment ruptures are included in the model. Computing conditional probabilities is relatively straightforward when a fault is assumed to obey strict segmentation in the sense that no multi-segment ruptures occur (e.g., WGCEP (1988, 1990) or see Field (2007) for a review of all previous WGCEPs; from here we assume basic familiarity with conditional probability calculations). However, and as we'll see below, the calculation is not straightforward when multi-segment ruptures are included, in essence because we are attempting to apply a point-process model to a non point process. The next section gives a review and evaluation of the single- and multi-segment rupture probability-calculation methods used in the most recent statewide forecast for California (WGCEP UCERF 1; Petersen et al., 2007). We then present results for the methodology adopted here for UCERF 2. We finish with a discussion of issues and possible alternative approaches that could be explored and perhaps applied in the future. A fault-by-fault comparison of UCERF 2 probabilities with those of previous studies is given in the main part of this report.
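
    For a single segment the conditional calculation being generalized here is straightforward: given a recurrence-time distribution F and an elapsed time t since the last earthquake, the probability of rupture in the next dT years is [F(t + dT) - F(t)] / [1 - F(t)]. The sketch below uses a lognormal recurrence model; WGCEP time-dependent calculations are typically based on a Brownian Passage Time distribution, and the parameter values here are invented, so this is illustrative only.

```python
import numpy as np
from scipy.stats import lognorm

def conditional_probability(t_since, delta_t, mean_recurrence, aperiodicity):
    """P(rupture in (t, t + delta_t] | no rupture up to t) under a lognormal recurrence model."""
    sigma = aperiodicity
    scale = mean_recurrence * np.exp(-0.5 * sigma**2)   # median chosen so the mean matches
    F = lognorm(s=sigma, scale=scale).cdf
    return (F(t_since + delta_t) - F(t_since)) / (1.0 - F(t_since))

# Hypothetical segment: 150-yr mean recurrence, aperiodicity 0.5, 100 yr since last rupture, 30-yr window
print(conditional_probability(t_since=100.0, delta_t=30.0, mean_recurrence=150.0, aperiodicity=0.5))
```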

  2. The Formalism of Generalized Contexts and Decay Processes

    NASA Astrophysics Data System (ADS)

    Losada, Marcelo; Laura, Roberto

    2013-04-01

    The formalism of generalized contexts for quantum histories is used to investigate the possibility of considering the survival probability as the probability of the no-decay property at a given time conditional on the no-decay property at an earlier time. A negative result is found for an isolated system. The inclusion of two quantum measurement instruments at two different times makes it possible to interpret the survival probability as a conditional probability of the whole system.

  3. Drug and alcohol use and family characteristics: a study among Brazilian high-school students.

    PubMed

    Carvalho, V; Pinsky, I; De Souza e Silva, R; Carlini-Cotrim, B

    1995-01-01

    The present work employs a multivariate analysis technique to study, simultaneously, family relations and alcohol/drug consumption among 16,378 Brazilian high-school students. The analysis is centered on the relation between subjective or objective family situations and consumption. Subjective situations are measured by adolescents' perception of their families, that is, the family's environmental "climate"--whether violent situations occur at home, whether there is frequent dialogue about the youngsters' problems, and whether they perceive interest on the part of parents. Objective situations refer to the conjugal status of parents. Results pointed to family violence as the factor most frequently associated with alcohol/drug use behavior. It was also found that the family's environmental climate constitutes a more important factor than the conjugal status of parents, when it comes to the development of drug use behavior. Therefore, the impact of this last variable (whether parents are living together) is determined by environmental conditions: when those conditions are favorable (no violence, problems habitually talked about, parents concerned with their offspring) the fact that parents were effectively living together meant a smaller probability of alcohol/drug use; when these conditions were unfavorable, the same fact was associated with a greater probability of consumption.

  4. Holocene rainfall runoff in the central Ethiopian highlands and evolution of the River Nile drainage system as revealed from a sediment record from Lake Dendi

    NASA Astrophysics Data System (ADS)

    Wagner, Bernd; Wennrich, Volker; Viehberg, Finn; Junginger, Annett; Kolvenbach, Anne; Rethemeyer, Janet; Schaebitz, Frank; Schmiedl, Gerhard

    2018-04-01

    A 12 m long sediment sequence was recovered from the eastern Dendi Crater lake, located on the central Ethiopian Plateau and in the region of the Blue Nile headwaters. 24 AMS radiocarbon dates from bulk organic carbon samples indicate that the sediment sequence spans the last ca. 12 cal kyr BP. Sedimentological and geochemical data from the sediment sequence that were combined with initial diatom information show only moderate change in precipitation and catchment runoff during that period, probably due to the elevated location of the study region in the Ethiopian highlands. Less humid conditions prevailed during the Younger Dryas (YD). After the return to full humid conditions of the African Humid Period (AHP), a 2 m thick tephra layer, probably originating from an eruption of the Wenchi crater 12 km to the west of the lake, was deposited at 10.2 cal kyr BP. Subsequently, single thin horizons of high clastic matter imply that short spells of dry conditions and significantly increased rainfall, respectively, superimpose the generally humid conditions. The end of the AHP is rather gradual and precedes relatively stable and less humid conditions around 3.9 cal kyr BP. Subsequently, slightly increasing catchment runoff led to sediment redeposition, increasing nutrient supply, and highest trophic states in the lake until 1.5 cal kyr BP. A highly variable increase in clastic matter indicates fluctuating and increasing catchment runoff over the last 1500 years. The data from Lake Dendi show, in concert with other records from the Nile catchment and the Eastern Mediterranean Sea (EMS), that the Blue Nile discharge was relatively high between ca. 10.0 and 8.7 cal kyr BP. Subsequent aridification peaked with some regional differences between ca. 4.0 and 2.6 cal kyr BP. Higher discharge in the Blue Nile hydraulic regime after 2.6 cal kyr BP is probably triggered by more local increase in rainfall, which is tentatively caused by a change in the influence of the Indian Ocean monsoon.

  5. Computing under-ice discharge: A proof-of-concept using hydroacoustics and the Probability Concept

    NASA Astrophysics Data System (ADS)

    Fulton, John W.; Henneberg, Mark F.; Mills, Taylor J.; Kohn, Michael S.; Epstein, Brian; Hittle, Elizabeth A.; Damschen, William C.; Laveau, Christopher D.; Lambrecht, Jason M.; Farmer, William H.

    2018-07-01

    Under-ice discharge is estimated using open-water reference hydrographs; however, the ratings for ice-affected sites are generally qualified as poor. The U.S. Geological Survey (USGS), in collaboration with the Colorado Water Conservation Board, conducted a proof-of-concept to develop an alternative method for computing under-ice discharge using hydroacoustics and the Probability Concept. The study site was located south of Minturn, Colorado (CO), USA, and was selected because of (1) its proximity to the existing USGS streamgage 09064600 Eagle River near Minturn, CO, and (2) its ease-of-access to verify discharge using a variety of conventional methods. From late September 2014 to early March 2015, hydraulic conditions varied from open water to under ice. These temporal changes led to variations in water depth and velocity. Hydroacoustics (tethered and uplooking acoustic Doppler current profilers and acoustic Doppler velocimeters) were deployed to measure the vertical-velocity profile at a singularly important vertical of the channel-cross section. Because the velocity profile was non-standard and cannot be characterized using a Power Law or Log Law, velocity data were analyzed using the Probability Concept, which is a probabilistic formulation of the velocity distribution. The Probability Concept-derived discharge was compared to conventional methods including stage-discharge and index-velocity ratings and concurrent field measurements; each is complicated by the dynamics of ice formation, pressure influences on stage measurements, and variations in cross-sectional area due to ice formation. No particular discharge method was assigned as truth. Rather one statistical metric (Kolmogorov-Smirnov; KS), agreement plots, and concurrent measurements provided a measure of comparability between various methods. Regardless of the method employed, comparisons between each method revealed encouraging results depending on the flow conditions and the absence or presence of ice cover. For example, during lower discharges dominated by under-ice and transition (intermittent open-water and under-ice) conditions, the KS metric suggests there is not sufficient information to reject the null hypothesis and implies that the Probability Concept and index-velocity rating represent similar distributions. During high-flow, open-water conditions, the comparisons are less definitive; therefore, it is important that the appropriate analytical method and instrumentation be selected. Six conventional discharge measurements were collected concurrently with Probability Concept-derived discharges with percent differences (%) of -9.0%, -21%, -8.6%, 17.8%, 3.6%, and -2.3%. This proof-of-concept demonstrates that riverine discharges can be computed using the Probability Concept for a range of hydraulic extremes (variations in discharge, open-water and under-ice conditions) immediately after the siting phase is complete, which typically requires one day. Computing real-time discharges is particularly important at sites, where (1) new streamgages are planned, (2) river hydraulics are complex, and (3) shifts in the stage-discharge rating are needed to correct the streamflow record. Use of the Probability Concept does not preclude the need to maintain a stage-area relation. Both the Probability Concept and index-velocity rating offer water-resource managers and decision makers alternatives for computing real-time discharge for open-water and under-ice conditions.
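
    One practical appeal of the Probability Concept (Chiu's entropy-based formulation of the velocity distribution) is that mean velocity can be tied to maximum velocity through a single entropy parameter M; assuming the commonly cited relation mean/maximum = e^M/(e^M - 1) - 1/M, a discharge estimate follows from u_max and the cross-sectional area. The numbers below are invented, and this is a sketch of the general idea rather than the USGS implementation.

```python
import numpy as np

def phi(M):
    """Assumed ratio of mean to maximum velocity in the entropy-based velocity distribution."""
    return np.exp(M) / (np.exp(M) - 1.0) - 1.0 / M

# Hypothetical inputs: entropy parameter from prior gaugings, u_max from an uplooking ADCP,
# and cross-sectional area from a stage-area relation.
M, u_max, area = 2.1, 1.4, 12.5            # [-], m/s, m^2
discharge = phi(M) * u_max * area          # m^3/s
print(round(discharge, 2))
```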

  6. Computing under-ice discharge: A proof-of-concept using hydroacoustics and the Probability Concept

    USGS Publications Warehouse

    Fulton, John W.; Henneberg, Mark F.; Mills, Taylor J.; Kohn, Michael S.; Epstein, Brian; Hittle, Elizabeth A.; Damschen, William C.; Laveau, Christopher D.; Lambrecht, Jason M.; Farmer, William H.

    2018-01-01

    Under-ice discharge is estimated using open-water reference hydrographs; however, the ratings for ice-affected sites are generally qualified as poor. The U.S. Geological Survey (USGS), in collaboration with the Colorado Water Conservation Board, conducted a proof-of-concept to develop an alternative method for computing under-ice discharge using hydroacoustics and the Probability Concept. The study site was located south of Minturn, Colorado (CO), USA, and was selected because of (1) its proximity to the existing USGS streamgage 09064600 Eagle River near Minturn, CO, and (2) its ease-of-access to verify discharge using a variety of conventional methods. From late September 2014 to early March 2015, hydraulic conditions varied from open water to under ice. These temporal changes led to variations in water depth and velocity. Hydroacoustics (tethered and uplooking acoustic Doppler current profilers and acoustic Doppler velocimeters) were deployed to measure the vertical-velocity profile at a singularly important vertical of the channel-cross section. Because the velocity profile was non-standard and cannot be characterized using a Power Law or Log Law, velocity data were analyzed using the Probability Concept, which is a probabilistic formulation of the velocity distribution. The Probability Concept-derived discharge was compared to conventional methods including stage-discharge and index-velocity ratings and concurrent field measurements; each is complicated by the dynamics of ice formation, pressure influences on stage measurements, and variations in cross-sectional area due to ice formation. No particular discharge method was assigned as truth. Rather one statistical metric (Kolmogorov-Smirnov; KS), agreement plots, and concurrent measurements provided a measure of comparability between various methods. Regardless of the method employed, comparisons between each method revealed encouraging results depending on the flow conditions and the absence or presence of ice cover. For example, during lower discharges dominated by under-ice and transition (intermittent open-water and under-ice) conditions, the KS metric suggests there is not sufficient information to reject the null hypothesis and implies that the Probability Concept and index-velocity rating represent similar distributions. During high-flow, open-water conditions, the comparisons are less definitive; therefore, it is important that the appropriate analytical method and instrumentation be selected. Six conventional discharge measurements were collected concurrently with Probability Concept-derived discharges with percent differences (%) of −9.0%, −21%, −8.6%, 17.8%, 3.6%, and −2.3%. This proof-of-concept demonstrates that riverine discharges can be computed using the Probability Concept for a range of hydraulic extremes (variations in discharge, open-water and under-ice conditions) immediately after the siting phase is complete, which typically requires one day. Computing real-time discharges is particularly important at sites, where (1) new streamgages are planned, (2) river hydraulics are complex, and (3) shifts in the stage-discharge rating are needed to correct the streamflow record. Use of the Probability Concept does not preclude the need to maintain a stage-area relation. Both the Probability Concept and index-velocity rating offer water-resource managers and decision makers alternatives for computing real-time discharge for open-water and under-ice conditions.

  7. Using dynamic geometry software for teaching conditional probability with area-proportional Venn diagrams

    NASA Astrophysics Data System (ADS)

    Radakovic, Nenad; McDougall, Douglas

    2012-10-01

    This classroom note illustrates how dynamic visualization can be used to teach conditional probability and Bayes' theorem. There are two features of the visualization that make it an ideal pedagogical tool in probability instruction. The first feature is the use of area-proportional Venn diagrams that, along with showing qualitative relationships, describe the quantitative relationship between two sets. The second feature is the slider and animation component of dynamic geometry software enabling students to observe how the change in the base rate of an event influences conditional probability. A hypothetical instructional sequence using a well-known breast cancer example is described.
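
    The quantitative point behind the area-proportional diagrams, namely how the base rate drives the posterior, takes only a few lines to reproduce; the sensitivity, false-positive rate, and base rates below are the round numbers often used in classroom versions of the breast cancer example, not figures taken from this note.

```python
def posterior(prior, sensitivity, false_positive_rate):
    """P(condition | positive test) by Bayes' theorem."""
    true_pos = sensitivity * prior
    false_pos = false_positive_rate * (1.0 - prior)
    return true_pos / (true_pos + false_pos)

# Sliding the base rate upward, as with the software's slider, raises the posterior sharply.
for base_rate in (0.01, 0.05, 0.10):
    print(base_rate, round(posterior(base_rate, sensitivity=0.80, false_positive_rate=0.10), 3))
```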

  8. Stability of Molecules of Biological Importance to Ionizing Radiation: Relevance in Astrobiology

    NASA Astrophysics Data System (ADS)

    Meléndez-López, A. L.; Negrón-Mendoza, A.; Ramos-Bernal, S.; Colín-García, M.; Heredia, A.

    2017-11-01

    Our aim is to study the stability of amino acids in conditions that probably existed in the primitive environments. We study aspartic acid and glutamic acid, in solid state and aqueous solution, against high doses of gamma radiation at 298 and 77 K.

  9. SELECTING LEAST-DISTURBED SURVEY SITES FOR GREAT PLAINS STREAMS AND RIVERS

    EPA Science Inventory

    True reference condition probably does not exist for streams in highly utilized regions such as the Great Plains. Selecting least-disturbed sites for large regions is confounded by the association between human uses and natural gradients, and by multiple kinds of disturbance. U...

  10. A Probability-Based Algorithm Using Image Sensors to Track the LED in a Vehicle Visible Light Communication System.

    PubMed

    Huynh, Phat; Do, Trong-Hop; Yoo, Myungsik

    2017-02-10

    This paper proposes a probability-based algorithm to track the LED in vehicle visible light communication systems using a camera. In this system, the transmitters are the vehicles' front and rear LED lights. The receivers are high speed cameras that take a series of images of the LEDs. The data embedded in the light is extracted by first detecting the position of the LEDs in these images. Traditionally, LEDs are detected according to pixel intensity. However, when the vehicle is moving, motion blur occurs in the LED images, making it difficult to detect the LEDs. Particularly at high speeds, some frames are blurred to a high degree, which makes it impossible to detect the LED as well as extract the information embedded in these frames. The proposed algorithm relies not only on the pixel intensity, but also on the optical flow of the LEDs and on statistical information obtained from previous frames. Based on this information, the conditional probability that a pixel belongs to a LED is calculated. Then, the position of the LED is determined based on this probability. To verify the suitability of the proposed algorithm, simulations are conducted by considering the incidents that can happen in a real-world situation, including a change in the position of the LEDs at each frame, as well as motion blur due to the vehicle speed.
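
    A heavily simplified sketch of the general idea, combining a pixel-intensity likelihood with a prior obtained by propagating the previous LED position along the optical flow, is shown below. It is a loose illustration rather than the authors' algorithm, and the frame, positions, and spread parameter are all invented.

```python
import numpy as np

def led_probability(frame, prev_pos, flow, sigma_pos=6.0):
    """Per-pixel LED probability: motion-based positional prior times intensity likelihood."""
    h, w = frame.shape
    yy, xx = np.mgrid[:h, :w]
    predicted = prev_pos + flow                       # LED position predicted from optical flow
    prior = np.exp(-((yy - predicted[0]) ** 2 + (xx - predicted[1]) ** 2) / (2.0 * sigma_pos**2))
    likelihood = frame / frame.max()                  # brighter pixels are more likely to be the LED
    post = prior * likelihood
    return post / post.sum()

frame = np.random.default_rng(0).uniform(size=(60, 80)) * 50.0
frame[30, 42] = 255.0                                 # synthetic bright LED pixel
prob = led_probability(frame, prev_pos=np.array([28.0, 40.0]), flow=np.array([2.0, 2.0]))
print(np.unravel_index(prob.argmax(), prob.shape))    # estimated LED position
```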

  11. Conservative Analytical Collision Probabilities for Orbital Formation Flying

    NASA Technical Reports Server (NTRS)

    Carpenter, J. Russell

    2004-01-01

    The literature offers a number of approximations for analytically and/or efficiently computing the probability of collision between two space objects. However, only one of these techniques is a completely analytical approximation that is suitable for use in the preliminary design phase, when it is more important to quickly analyze a large segment of the trade space than it is to precisely compute collision probabilities. Unfortunately, among the types of formations that one might consider, some combine a range of conditions for which this analytical method is less suitable. This work proposes a simple, conservative approximation that produces reasonable upper bounds on the collision probability in such conditions. Although its estimates are much too conservative under other conditions, such conditions are typically well suited for use of the existing method.

  12. Conservative Analytical Collision Probability for Design of Orbital Formations

    NASA Technical Reports Server (NTRS)

    Carpenter, J. Russell

    2004-01-01

    The literature offers a number of approximations for analytically and/or efficiently computing the probability of collision between two space objects. However, only one of these techniques is a completely analytical approximation that is suitable for use in the preliminary design phase, when it is more important to quickly analyze a large segment of the trade space than it is to precisely compute collision probabilities. Unfortunately, among the types of formations that one might consider, some combine a range of conditions for which this analytical method is less suitable. This work proposes a simple, conservative approximation that produces reasonable upper bounds on the collision probability in such conditions. Although its estimates are much too conservative under other conditions, such conditions are typically well suited for use of the existing method.

  13. Social networks, mental health problems, and mental health service utilization in OEF/OIF National Guard veterans.

    PubMed

    Sripada, Rebecca K; Bohnert, Amy S B; Teo, Alan R; Levine, Debra S; Pfeiffer, Paul N; Bowersox, Nicholas W; Mizruchi, Mark S; Chermack, Stephen T; Ganoczy, Dara; Walters, Heather; Valenstein, Marcia

    2015-09-01

    Low social support and small social network size have been associated with a variety of negative mental health outcomes, while their impact on mental health services use is less clear. To date, few studies have examined these associations in National Guard service members, where frequency of mental health problems is high, social support may come from military as well as other sources, and services use may be suboptimal. Surveys were administered to 1448 recently returned National Guard members. Multivariable regression models assessed the associations between social support characteristics, probable mental health conditions, and service utilization. In bivariate analyses, large social network size, high social network diversity, high perceived social support, and high military unit support were each associated with lower likelihood of having a probable mental health condition (p < .001). In adjusted analyses, high perceived social support (OR .90, CI .88-.92) and high unit support (OR .96, CI .94-.97) continued to be significantly associated with lower likelihood of mental health conditions. Two social support measures were associated with lower likelihood of receiving mental health services in bivariate analyses, but were not significant in adjusted models. General social support and military-specific support were robustly associated with reduced mental health symptoms in National Guard members. Policy makers, military leaders, and clinicians should attend to service members' level of support from both the community and their units and continue efforts to bolster these supports. Other strategies, such as focused outreach, may be needed to bring National Guard members with need into mental health care.

  14. Implementation of the Iterative Proportion Fitting Algorithm for Geostatistical Facies Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li Yupeng, E-mail: yupeng@ualberta.ca; Deutsch, Clayton V.

    2012-06-15

    In geostatistics, most stochastic algorithms for simulation of categorical variables such as facies or rock types require a conditional probability distribution. The multivariate probability distribution of all the grouped locations including the unsampled location permits calculation of the conditional probability directly based on its definition. In this article, the iterative proportion fitting (IPF) algorithm is implemented to infer this multivariate probability. Using the IPF algorithm, the multivariate probability is obtained by iterative modification to an initial estimated multivariate probability using lower order bivariate probabilities as constraints. The imposed bivariate marginal probabilities are inferred from profiles along drill holes or wells. In the IPF process, a sparse matrix is used to calculate the marginal probabilities from the multivariate probability, which makes the iterative fitting more tractable and practical. This algorithm can be extended to higher order marginal probability constraints as used in multiple point statistics. The theoretical framework is developed and illustrated with an estimation and simulation example.
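
    The fitting step itself, rescaling a joint probability table until its marginals match imposed lower-order probabilities, can be written compactly. The two-variable version below is a generic IPF sketch with made-up marginals, not the geostatistical implementation described in the article.

```python
import numpy as np

def ipf_2d(initial, row_marginal, col_marginal, tol=1e-10, max_iter=500):
    """Iteratively rescale an initial joint probability table to match the given marginals."""
    p = initial.copy()
    for _ in range(max_iter):
        p *= (row_marginal / p.sum(axis=1))[:, None]   # fit the row sums
        p *= (col_marginal / p.sum(axis=0))[None, :]   # fit the column sums
        if np.allclose(p.sum(axis=1), row_marginal, atol=tol):
            break
    return p

start = np.full((3, 3), 1.0 / 9.0)             # uninformative starting table
rows = np.array([0.5, 0.3, 0.2])               # e.g., facies proportions along one profile
cols = np.array([0.6, 0.3, 0.1])               # e.g., proportions along another profile
fitted = ipf_2d(start, rows, cols)
print(fitted.round(3), fitted.sum())
```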

  15. Domestic wells have high probability of pumping septic tank leachate

    NASA Astrophysics Data System (ADS)

    Bremer, J. E.; Harter, T.

    2012-08-01

    Onsite wastewater treatment systems are common in rural and semi-rural areas around the world; in the US, about 25-30% of households are served by a septic (onsite) wastewater treatment system, and many property owners also operate their own domestic well nearby. Site-specific conditions and local groundwater flow are often ignored when installing septic systems and wells. In areas with small lots (thus high spatial septic system densities), shallow domestic wells are prone to contamination by septic system leachate. Mass balance approaches have been used to determine a maximum septic system density that would prevent contamination of groundwater resources. In this study, a source area model based on detailed groundwater flow and transport modeling is applied for a stochastic analysis of domestic well contamination by septic leachate. Specifically, we determine the probability that a source area overlaps with a septic system drainfield as a function of aquifer properties, septic system density and drainfield size. We show that high spatial septic system density poses a high probability of pumping septic system leachate. The hydraulic conductivity of the aquifer has a strong influence on the intersection probability. We find that mass balance calculations applied on a regional scale underestimate the contamination risk of individual drinking water wells by septic systems. This is particularly relevant for contaminants released at high concentrations, for substances that experience limited attenuation, and those that are harmful even at low concentrations (e.g., pathogens).

  16. Using Dynamic Geometry Software for Teaching Conditional Probability with Area-Proportional Venn Diagrams

    ERIC Educational Resources Information Center

    Radakovic, Nenad; McDougall, Douglas

    2012-01-01

    This classroom note illustrates how dynamic visualization can be used to teach conditional probability and Bayes' theorem. There are two features of the visualization that make it an ideal pedagogical tool in probability instruction. The first feature is the use of area-proportional Venn diagrams that, along with showing qualitative relationships,…

  17. Recursive recovery of Markov transition probabilities from boundary value data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patch, Sarah Kathyrn

    1994-04-01

    In an effort to mathematically describe the anisotropic diffusion of infrared radiation in biological tissue, Gruenbaum posed an anisotropic diffusion boundary value problem in 1989. In order to accommodate anisotropy, he discretized the temporal as well as the spatial domain. The probabilistic interpretation of the diffusion equation is retained; radiation is assumed to travel according to a random walk (of sorts). In this random walk the probabilities with which photons change direction depend upon their previous as well as present location. The forward problem gives boundary value data as a function of the Markov transition probabilities. The inverse problem requires finding the transition probabilities from boundary value data. Problems in the plane are studied carefully in this thesis. Consistency conditions amongst the data are derived. These conditions have two effects: they prohibit inversion of the forward map but permit smoothing of noisy data. Next, a recursive algorithm which yields a family of solutions to the inverse problem is detailed. This algorithm takes advantage of all independent data and generates a system of highly nonlinear algebraic equations. Pluecker-Grassmann relations are instrumental in simplifying the equations. The algorithm is used to solve the 4 x 4 problem. Finally, the smallest nontrivial problem in three dimensions, the 2 x 2 x 2 problem, is solved.

  18. Simulating high spatial resolution high severity burned area in Sierra Nevada forests for California Spotted Owl habitat climate change risk assessment and management.

    NASA Astrophysics Data System (ADS)

    Keyser, A.; Westerling, A. L.; Jones, G.; Peery, M. Z.

    2017-12-01

    Sierra Nevada forests have experienced an increase in very large fires with significant areas of high burn severity, such as the Rim (2013) and King (2014) fires, that have impacted habitat of endangered species such as the California spotted owl. In order to support land manager forest management planning and risk assessment activities, we used historical wildfire histories from the Monitoring Trends in Burn Severity project and gridded hydroclimate and land surface characteristics data to develop statistical models to simulate the frequency, location and extent of high severity burned area in Sierra Nevada forest wildfires as functions of climate and land surface characteristics. We define high severity here as BA90 area: the area comprising patches with ninety percent or more basal area killed within a larger fire. We developed a system of statistical models to characterize the probability of large fire occurrence, the probability of significant BA90 area present given a large fire, and the total extent of BA90 area in a fire on a 1/16 degree lat/lon grid over the Sierra Nevada. Repeated draws from binomial and generalized Pareto distributions using these probabilities generated a library of simulated histories of high severity fire for a range of near (50 yr) future climate and fuels management scenarios. Fuels management scenarios were provided by USFS Region 5. Simulated BA90 area was then downscaled to 30 m resolution using a statistical model we developed using Random Forest techniques to estimate the probability of adjacent 30m pixels burning with ninety percent basal kill as a function of fire size and vegetation and topographic features. The result is a library of simulated high resolution maps of BA90 burned areas for a range of climate and fuels management scenarios with which we estimated conditional probabilities of owl nesting sites being impacted by high severity wildfire.
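
    The simulation step sketched here, a draw for whether a fire with significant BA90 area occurs followed by a generalized Pareto draw for its extent, can be mimicked in a few lines; the occurrence probabilities and Pareto parameters below are placeholders, not the values fitted in the study.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(11)
n_years, n_cells = 50, 400
p_fire = 0.02              # annual probability of a large fire in a grid cell (placeholder)
p_ba90 = 0.6               # probability a large fire contains significant BA90 area (placeholder)
shape, scale = 0.4, 150.0  # generalized Pareto parameters for BA90 extent, hectares (placeholder)

fires = rng.binomial(1, p_fire, size=(n_years, n_cells))
has_ba90 = fires * rng.binomial(1, p_ba90, size=(n_years, n_cells))
extent = genpareto.rvs(shape, scale=scale, size=(n_years, n_cells), random_state=rng)
ba90_area = has_ba90 * extent
print(ba90_area.sum(axis=1)[:5])   # simulated annual BA90 area over the landscape
```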

  19. Exact calculation of loop formation probability identifies folding motifs in RNA secondary structures.

    PubMed

    Sloma, Michael F; Mathews, David H

    2016-12-01

    RNA secondary structure prediction is widely used to analyze RNA sequences. In an RNA partition function calculation, free energy nearest neighbor parameters are used in a dynamic programming algorithm to estimate statistical properties of the secondary structure ensemble. Previously, partition functions have largely been used to estimate the probability that a given pair of nucleotides form a base pair, the conditional stacking probability, the accessibility to binding of a continuous stretch of nucleotides, or a representative sample of RNA structures. Here it is demonstrated that an RNA partition function can also be used to calculate the exact probability of formation of hairpin loops, internal loops, bulge loops, or multibranch loops at a given position. This calculation can also be used to estimate the probability of formation of specific helices. Benchmarking on a set of RNA sequences with known secondary structures indicated that loops that were calculated to be more probable were more likely to be present in the known structure than less probable loops. Furthermore, highly probable loops are more likely to be in the known structure than the set of loops predicted in the lowest free energy structures. © 2016 Sloma and Mathews; Published by Cold Spring Harbor Laboratory Press for the RNA Society.

  20. Inter-Population Movements of Steller Sea Lions in Alaska with Implications for Population Separation

    PubMed Central

    Jemison, Lauri A.; Pendleton, Grey W.; Fritz, Lowell W.; Hastings, Kelly K.; Maniscalco, John M.; Trites, Andrew W.; Gelatt, Tom S.

    2013-01-01

    Genetic studies and differing population trends support the separation of Steller sea lions (Eumetopias jubatus) into a western distinct population segment (WDPS) and an eastern DPS (EDPS) with the dividing line between populations at 144° W. Despite little exchange for thousands of years, the gap between the breeding ranges narrowed during the past 15–30 years with the formation of new rookeries near the DPS boundary. We analyzed >22,000 sightings of 4,172 sea lions branded as pups in each DPS from 2000–2010 to estimate probabilities of a sea lion born in one DPS being seen within the range of the other DPS (either ‘West’ or ‘East’). Males from both populations regularly traveled across the DPS boundary; probabilities were highest at ages 2–5 and for males born in Prince William Sound and southern Southeast Alaska. The probability of WDPS females being in the East at age 5 was 0.067 but 0 for EDPS females which rarely traveled to the West. Prince William Sound-born females had high probabilities of being in the East during breeding and non-breeding seasons. We present strong evidence that WDPS females have permanently emigrated to the East, reproducing at two ‘mixing zone’ rookeries. We documented breeding bulls that traveled >6,500 km round trip from their natal rookery in southern Alaska to the northern Bering Sea and central Aleutian Islands and back within one year. WDPS animals began moving East in the 1990s, following steep population declines in the central Gulf of Alaska. Results of our study, and others documenting high survival and rapid population growth in northern Southeast Alaska suggest that conditions in this mixing zone region have been optimal for sea lions. It is unclear whether eastward movement across the DPS boundary is due to less-optimal conditions in the West or a reflection of favorable conditions in the East. PMID:23940543

  1. Inter-population movements of steller sea lions in Alaska with implications for population separation.

    PubMed

    Jemison, Lauri A; Pendleton, Grey W; Fritz, Lowell W; Hastings, Kelly K; Maniscalco, John M; Trites, Andrew W; Gelatt, Tom S

    2013-01-01

    Genetic studies and differing population trends support the separation of Steller sea lions (Eumetopias jubatus) into a western distinct population segment (WDPS) and an eastern DPS (EDPS) with the dividing line between populations at 144° W. Despite little exchange for thousands of years, the gap between the breeding ranges narrowed during the past 15-30 years with the formation of new rookeries near the DPS boundary. We analyzed >22,000 sightings of 4,172 sea lions branded as pups in each DPS from 2000-2010 to estimate probabilities of a sea lion born in one DPS being seen within the range of the other DPS (either 'West' or 'East'). Males from both populations regularly traveled across the DPS boundary; probabilities were highest at ages 2-5 and for males born in Prince William Sound and southern Southeast Alaska. The probability of WDPS females being in the East at age 5 was 0.067 but 0 for EDPS females which rarely traveled to the West. Prince William Sound-born females had high probabilities of being in the East during breeding and non-breeding seasons. We present strong evidence that WDPS females have permanently emigrated to the East, reproducing at two 'mixing zone' rookeries. We documented breeding bulls that traveled >6,500 km round trip from their natal rookery in southern Alaska to the northern Bering Sea and central Aleutian Islands and back within one year. WDPS animals began moving East in the 1990s, following steep population declines in the central Gulf of Alaska. Results of our study, and others documenting high survival and rapid population growth in northern Southeast Alaska suggest that conditions in this mixing zone region have been optimal for sea lions. It is unclear whether eastward movement across the DPS boundary is due to less-optimal conditions in the West or a reflection of favorable conditions in the East.

  2. Comet Science Working Group report on the Halley Intercept Mission

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The Halley Intercept Mission is described and the scientific benefits expected from the program are defined. One characteristic of the mission is the optical navigation and resulting accurate delivery of the spacecraft to a desired point near the nucleus. This accuracy of delivery has two important implications: (1) high probability that the mass spectrometers and other in situ measurement devices will reach the cometary ionosphere and the zone of parent molecules next to the nucleus; (2) high probability that sunlit, high resolution images of Halley's nucleus will be obtained under proper lighting conditions. In addition an observatory phase is included during which high quality images of the tail and coma structure will be obtained at progressively higher spatial resolutions as the spacecraft approaches the comet. Complete measurements of the comet/solar wind interaction can be made around the time of encounter. Specific recommendations are made concerning project implementation and spacecraft requirements.

  3. Clinical decision-making by midwives: managing case complexity.

    PubMed

    Cioffi, J; Markham, R

    1997-02-01

    In making clinical judgements, it is argued that midwives use 'shortcuts' or heuristics based on estimated probabilities to simplify the decision-making task. Midwives (n = 30) were given simulated patient assessment situations of high and low complexity and were required to think aloud. Analysis of verbal protocols showed that subjective probability judgements (heuristics) were used more frequently in the high than low complexity case and predominated in the last quarter of the assessment period for the high complexity case. 'Representativeness' was identified more frequently in the high than in the low case, but was the dominant heuristic in both. Reports completed after each simulation suggest that heuristics based on memory for particular conditions affect decisions. It is concluded that midwives use heuristics, derived mainly from their clinical experiences, in an attempt to save cognitive effort and to facilitate reasonably accurate decisions in the decision-making process.

  4. Statistical learning of action: the role of conditional probability.

    PubMed

    Meyer, Meredith; Baldwin, Dare

    2011-12-01

    Identification of distinct units within a continuous flow of human action is fundamental to action processing. Such segmentation may rest in part on statistical learning. In a series of four experiments, we examined what types of statistics people can use to segment a continuous stream involving many brief, goal-directed action elements. The results of Experiment 1 showed no evidence for sensitivity to conditional probability, whereas Experiment 2 displayed learning based on joint probability. In Experiment 3, we demonstrated that additional exposure to the input failed to engender sensitivity to conditional probability. However, the results of Experiment 4 showed that a subset of adults, namely those more successful at identifying actions that had been seen more frequently than comparison sequences, was also successful at learning conditional-probability statistics. These experiments help to clarify the mechanisms subserving processing of intentional action, and they highlight important differences from, as well as similarities to, prior studies of statistical learning in other domains, including language.
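
    The distinction the experiments turn on can be shown with frequency counts over a toy stream of action elements: the joint probability of a pair is its frequency among all pairs, whereas the conditional (transitional) probability normalizes by how often the first element occurs. The element labels below are invented.

```python
from collections import Counter

stream = list("ABCABCXABCYAB")              # toy stream of action elements
pairs = list(zip(stream, stream[1:]))

pair_counts = Counter(pairs)
first_counts = Counter(first for first, _ in pairs)

joint = {pair: c / len(pairs) for pair, c in pair_counts.items()}                   # P(x1, x2)
conditional = {pair: c / first_counts[pair[0]] for pair, c in pair_counts.items()}  # P(x2 | x1)
print(joint[("A", "B")], conditional[("A", "B")])   # 0.33 vs 1.0 for the same pair
```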

  5. Early Intervention for Abused and Neglected Infants and Toddlers

    ERIC Educational Resources Information Center

    Zero to Three (J), 2006

    2006-01-01

    Children who suffer abuse or neglect, or have parents who suffer from mental health problems (especially maternal depression), substance abuse, or family violence, have as high a probability of experiencing developmental delays as do children with medical conditions that are automatically eligible for Part C services under the Individuals with…

  6. Current best management practices for harvesting and storing dry hay: a research review

    USDA-ARS?s Scientific Manuscript database

    The production of high-quality grass or legume hays in humid environments is complicated by slower drying rates, and increased probability of rainfall events compared to hay produced under arid climatic conditions. As a result, hay producers in humid environments often face the management dilemma of...

  7. Case-Control Study of Risk Factors for Meningococcal Disease in Chile

    PubMed Central

    Matute, Isabel; González, Claudia; Delgado, Iris; Poffald, Lucy; Pedroni, Elena; Alfaro, Tania; Hirmas, Macarena; Nájera, Manuel; Gormaz, Ana; López, Darío; Loayza, Sergio; Ferreccio, Catterina; Gallegos, Doris; Fuentes, Rodrigo; Vial, Pablo; Aguilera, Ximena

    2017-01-01

    An outbreak of meningococcal disease with a case-fatality rate of 30% and caused predominantly by serogroup W of Neisseria meningitidis began in Chile in 2012. This outbreak required a case-control study to assess determinants and risk factors for infection. We identified confirmed cases during January 2012-March 2013 and selected controls by random sampling of the population, matched for age and sex, resulting in 135 case-patients and 618 controls. Sociodemographic variables, habits, and previous illnesses were studied. Analyses yielded adjusted odds ratios as estimators of the probability of disease development. Results indicated that conditions of social vulnerability, such as low income and overcrowding, as well as familial history of this disease and clinical histories, especially chronic diseases and hospitalization for respiratory conditions, increased the probability of illness. Findings should contribute to directing intersectoral public policies toward a highly vulnerable social group to enable them to improve their living conditions and health. PMID:28628448

  8. [Effects of prefrontal ablations on the reaction of the active choice of feeder under different probability and value of the reinforcement on dog].

    PubMed

    Preobrazhenskaia, L A; Ioffe, M E; Mats, V N

    2004-01-01

    The role of the prefrontal cortex in the active choice between two feeders was investigated under changing value and probability of reinforcement. The experiments were performed on 2 dogs with prefrontal (g. proreus) ablations. Before the lesions, the dogs were trained to obtain food from two different feeders in response to conditioned stimuli with equally probable alimentary reinforcement. After ablation, the dogs ran from one feeder to the other during the inter-trial intervals and repeatedly chose the same feeder in response to the conditioned stimuli. This disturbance of behavior eventually recovered completely. In experiments pitting probability of reinforcement against its value, the dogs chose the feeder with the lower probability but better quality of reinforcement. In experiments with equal value but different probability, the intact dogs chose the feeder with the higher probability, whereas the dogs with prefrontal lesions chose each feeder equiprobably. Thus, under conditions of free behavior, one of the functions of the prefrontal cortex is the choice of the response with the higher probability of reinforcement.

  9. The Context Matters: Outcome Probability and Expectation Mismatch Modulate the Feedback Negativity When Self-Evaluation of Response Correctness Is Possible

    PubMed Central

    Leue, Anja; Cano Rodilla, Carmen; Beauducel, André

    2015-01-01

    Individuals typically evaluate whether their performance and the obtained feedback match. Previous research has shown that feedback negativity (FN) depends on outcome probability and feedback valence. It is, however, less clear to what extent previous effects of outcome probability on FN depend on self-evaluations of response correctness. Therefore, we investigated the effects of outcome probability on FN amplitude in a simple go/no-go task that allowed for the self-evaluation of response correctness. We also investigated effects of performance incompatibility and feedback valence. In a sample of N = 22 participants, outcome probability was manipulated by means of precues, feedback valence by means of monetary feedback, and performance incompatibility by means of feedback that induced a match versus mismatch with individuals' performance. We found that the 100% outcome probability condition induced a more negative FN following no-loss than the 50% outcome probability condition. The FN following loss was more negative in the 50% compared to the 100% outcome probability condition. Performance-incompatible loss resulted in a more negative FN than performance-compatible loss. Our results indicate that the self-evaluation of the correctness of responses should be taken into account when the effects of outcome probability and expectation mismatch on FN are investigated. PMID:26783525

  10. The Context Matters: Outcome Probability and Expectation Mismatch Modulate the Feedback Negativity When Self-Evaluation of Response Correctness Is Possible.

    PubMed

    Leue, Anja; Cano Rodilla, Carmen; Beauducel, André

    2015-01-01

    Individuals typically evaluate whether their performance and the obtained feedback match. Previous research has shown that feedback negativity (FN) depends on outcome probability and feedback valence. It is, however, less clear to what extent previous effects of outcome probability on FN depend on self-evaluations of response correctness. Therefore, we investigated the effects of outcome probability on FN amplitude in a simple go/no-go task that allowed for the self-evaluation of response correctness. We also investigated effects of performance incompatibility and feedback valence. In a sample of N = 22 participants, outcome probability was manipulated by means of precues, feedback valence by means of monetary feedback, and performance incompatibility by means of feedback that induced a match versus mismatch with individuals' performance. We found that the 100% outcome probability condition induced a more negative FN following no-loss than the 50% outcome probability condition. The FN following loss was more negative in the 50% compared to the 100% outcome probability condition. Performance-incompatible loss resulted in a more negative FN than performance-compatible loss. Our results indicate that the self-evaluation of the correctness of responses should be taken into account when the effects of outcome probability and expectation mismatch on FN are investigated.

  11. Exploiting risk-reward structures in decision making under uncertainty.

    PubMed

    Leuker, Christina; Pachur, Thorsten; Hertwig, Ralph; Pleskac, Timothy J

    2018-06-01

    People often have to make decisions under uncertainty-that is, in situations where the probabilities of obtaining a payoff are unknown or at least difficult to ascertain. One solution to this problem is to infer the probability from the magnitude of the potential payoff and thus exploit the inverse relationship between payoffs and probabilities that occurs in many domains in the environment. Here, we investigated how the mind may implement such a solution: (1) Do people learn about risk-reward relationships from the environment-and if so, how? (2) How do learned risk-reward relationships impact preferences in decision-making under uncertainty? Across three experiments (N = 352), we found that participants can learn risk-reward relationships from being exposed to choice environments with a negative, positive, or uncorrelated risk-reward relationship. They were able to learn the associations both from gambles with explicitly stated payoffs and probabilities (Experiments 1 & 2) and from gambles about epistemic events (Experiment 3). In subsequent decisions under uncertainty, participants often exploited the learned association by inferring probabilities from the magnitudes of the payoffs. This inference systematically influenced their preferences under uncertainty: Participants who had been exposed to a negative risk-reward relationship tended to prefer the uncertain option over a smaller sure option for low payoffs, but not for high payoffs. This pattern reversed in the positive condition and disappeared in the uncorrelated condition. This adaptive change in preferences is consistent with the use of the risk-reward heuristic. Copyright © 2018 Elsevier B.V. All rights reserved.

  12. Recent trends in the probability of high out-of-pocket medical expenses in the United States

    PubMed Central

    Baird, Katherine E

    2016-01-01

    Objective: This article measures the probability that out-of-pocket expenses in the United States exceed a threshold share of income. It calculates this probability separately by individuals’ health condition, income, and elderly status and estimates changes occurring in these probabilities between 2010 and 2013. Data and Method: This article uses nationally representative household survey data on 344,000 individuals. Logistic regressions estimate the probabilities that out-of-pocket expenses exceed 5% and alternatively 10% of income in the two study years. These probabilities are calculated for individuals based on their income, health status, and elderly status. Results: Despite favorable changes in both health policy and the economy, large numbers of Americans continue to be exposed to high out-of-pocket expenditures. For instance, the results indicate that in 2013 over a quarter of nonelderly low-income citizens in poor health spent 10% or more of their income on out-of-pocket expenses, and over 40% of this group spent more than 5%. Moreover, for Americans as a whole, the probability of spending in excess of 5% of income on out-of-pocket costs increased by 1.4 percentage points between 2010 and 2013, with the largest increases occurring among low-income Americans; the probability of Americans spending more than 10% of income grew from 9.3% to 9.6%, with the largest increases also occurring among the poor. Conclusion: The magnitude of out-of-pocket’s financial burden and the most recent upward trends in it underscore a need to develop good measures of the degree to which health care policy exposes individuals to financial risk, and to closely monitor the Affordable Care Act’s success in reducing Americans’ exposure to large medical bills. PMID:27651901

  13. Dopaminergic Drug Effects on Probability Weighting during Risky Decision Making.

    PubMed

    Ojala, Karita E; Janssen, Lieneke K; Hashemi, Mahur M; Timmer, Monique H M; Geurts, Dirk E M; Ter Huurne, Niels P; Cools, Roshan; Sescousse, Guillaume

    2018-01-01

    Dopamine has been associated with risky decision-making, as well as with pathological gambling, a behavioral addiction characterized by excessive risk-taking behavior. However, the specific mechanisms through which dopamine might act to foster risk-taking and pathological gambling remain elusive. Here we test the hypothesis that this might be achieved, in part, via modulation of subjective probability weighting during decision making. Human healthy controls (n = 21) and pathological gamblers (n = 16) played a decision-making task involving choices between sure monetary options and risky gambles both in the gain and loss domains. Each participant played the task twice, either under placebo or the dopamine D2/D3 receptor antagonist sulpiride, in a double-blind counterbalanced design. A prospect theory modelling approach was used to estimate subjective probability weighting and sensitivity to monetary outcomes. Consistent with prospect theory, we found that participants presented a distortion in the subjective weighting of probabilities, i.e., they overweighted low probabilities and underweighted moderate to high probabilities, both in the gain and loss domains. Compared with placebo, sulpiride attenuated this distortion in the gain domain. Across drugs, the groups did not differ in their probability weighting, although gamblers consistently underweighted losing probabilities in the placebo condition. Overall, our results reveal that dopamine D2/D3 receptor antagonism modulates the subjective weighting of probabilities in the gain domain, in the direction of more objective, economically rational decision making.
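
    The abstract does not state the exact parameterization used, but a common prospect-theory choice is the one-parameter Tversky-Kahneman (1992) weighting function, sketched below with illustrative gamma values. With gamma < 1 it produces exactly the distortion described: low probabilities are overweighted and moderate-to-high probabilities are underweighted.

    ```python
    import numpy as np

    def tk_weight(p, gamma):
        """Tversky-Kahneman (1992) one-parameter probability weighting function."""
        return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

    probs = np.array([0.05, 0.25, 0.50, 0.75, 0.95])
    for gamma in (0.6, 1.0):   # gamma < 1 distorts; gamma = 1 is objective weighting
        print(gamma, np.round(tk_weight(probs, gamma), 3))
    # With gamma = 0.6, low probabilities are overweighted (w(0.05) > 0.05)
    # and moderate-to-high probabilities are underweighted (w(0.75) < 0.75).
    ```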

  14. Atmospheric conditions, lunar phases, and childbirth: a multivariate analysis

    NASA Astrophysics Data System (ADS)

    Ochiai, Angela Megumi; Gonçalves, Fabio Luiz Teixeira; Ambrizzi, Tercio; Florentino, Lucia Cristina; Wei, Chang Yi; Soares, Alda Valeria Neves; De Araujo, Natalucia Matos; Gualda, Dulce Maria Rosa

    2012-07-01

    Our objective was to assess extrinsic influences upon childbirth. In a cohort of 1,826 days containing 17,417 childbirths among them 13,252 spontaneous labor admissions, we studied the influence of environment upon the high incidence of labor (defined by 75th percentile or higher), analyzed by logistic regression. The predictors of high labor admission included increases in outdoor temperature (odds ratio: 1.742, P = 0.045, 95%CI: 1.011 to 3.001), and decreases in atmospheric pressure (odds ratio: 1.269, P = 0.029, 95%CI: 1.055 to 1.483). In contrast, increases in tidal range were associated with a lower probability of high admission (odds ratio: 0.762, P = 0.030, 95%CI: 0.515 to 0.999). Lunar phase was not a predictor of high labor admission ( P = 0.339). Using multivariate analysis, increases in temperature and decreases in atmospheric pressure predicted high labor admission, and increases of tidal range, as a measurement of the lunar gravitational force, predicted a lower probability of high admission.

  15. Direct evidence for a dual process model of deductive inference.

    PubMed

    Markovits, Henry; Brunet, Marie-Laurence; Thompson, Valerie; Brisson, Janie

    2013-07-01

    In 2 experiments, we tested a strong version of a dual process theory of conditional inference (cf. Verschueren et al., 2005a, 2005b) that assumes that most reasoners have 2 strategies available, the choice of which is determined by situational variables, cognitive capacity, and metacognitive control. The statistical strategy evaluates inferences probabilistically, accepting those with high conditional probability. The counterexample strategy rejects inferences when a counterexample shows the inference to be invalid. To discriminate strategy use, we presented reasoners with conditional statements (if p, then q) and explicit statistical information about the relative frequency of q given p, that is, the conditional probability (50% vs. 90%). A statistical strategy would accept the more probable inferences more frequently, whereas the counterexample one would reject both. In Experiment 1, reasoners under time pressure used the statistical strategy more, but switched to the counterexample strategy when time constraints were removed; the former took less time than the latter. These data are consistent with the hypothesis that the statistical strategy is the default heuristic. Under a free-time condition, reasoners preferred the counterexample strategy and kept it when put under time pressure. Thus, it is not simply a lack of capacity that produces a statistical strategy; instead, it seems that time pressure disrupts the ability to make good metacognitive choices. In line with this conclusion, in a 2nd experiment, we measured reasoners' confidence in their performance; those under time pressure were less confident in the statistical than the counterexample strategy and more likely to switch strategies under free-time conditions. PsycINFO Database Record (c) 2013 APA, all rights reserved.

  16. Differentiation between inflammatory and neoplastic orbital conditions based on computed tomographic signs.

    PubMed

    Lederer, Kristina; Ludewig, Eberhard; Hechinger, Harald; Parry, Andrew T; Lamb, Christopher R; Kneissl, Sibylle

    2015-07-01

    To identify computed tomographic (CT) signs that could be used to differentiate inflammatory from neoplastic orbital conditions in small animals. Fifty-two animals (25 cats, 21 dogs, 4 rabbits, and 2 rodents). Case-control study in which CT images of animals with histopathologic diagnosis of inflammatory (n = 11), neoplastic orbital conditions (n = 31), or normal control animals (n = 10) were reviewed independently by five observers without the knowledge of the history or diagnosis. Observers recorded their observations regarding specific anatomical structures within the orbit using an itemized form containing the following characteristics: definitely normal; probably normal; equivocal; probably abnormal; and definitely abnormal. Results were statistically analyzed using Fleiss' kappa and logistic regression analyses. The overall level of agreement between observers about the presence or absence of abnormal CT signs in animals with orbital disease was poor to moderate, but was highest for observations concerning orbital bones (κ = 0.62) and involvement of the posterior segment (κ = 0.52). Significant associations between abnormalities and diagnosis were found for four structures: Abnormalities affecting orbital bones (odds ratio [OR], 1.7) and anterior ocular structures (OR, 1.5) were predictive of neoplasia, while abnormalities affecting extraconal fat (OR, 1.7) and skin (OR, 1.4) were predictive of inflammatory conditions. Orbital CT is an imaging test with high specificity. Fat stranding, a CT sign not previously emphasized in veterinary medicine, was significantly associated with inflammatory conditions. Low observer agreement probably reflects the limited resolution of CT for small orbital structures. © 2014 American College of Veterinary Ophthalmologists.

  17. A test of the substitution-habitat hypothesis in amphibians.

    PubMed

    Martínez-Abraín, Alejandro; Galán, Pedro

    2018-06-01

    Most examples that support the substitution-habitat hypothesis (human-made habitats act as substitutes of original habitat) deal with birds and mammals. We tested this hypothesis in 14 amphibians by using percentage occupancy as a proxy of habitat quality (i.e., higher occupancy percentages indicate higher quality). We classified water body types as original habitat (no or little human influence) depending on anatomical, behavioral, or physiological adaptations of each amphibian species. Ten species had relatively high probabilities (0.16-0.28) of occurrence in original habitat, moderate probability of occurrence in substitution habitats (0.11-0.14), and low probability of occurrence in refuge habitats (0.05-0.08). Thus, the substitution-habitat hypothesis only partially applies to amphibians because the low occupancy of refuges could be due to the negligible human persecution of this group (indicating good conservation status). However, low occupancy of refuges could also be due to low tolerance of refuge conditions, which could have led to selective extinction or colonization problems due to poor dispersal capabilities. That original habitats had the highest probabilities of occupancy suggests amphibians have a good conservation status in the region. They also appeared highly adaptable to anthropogenic substitution habitats. © 2017 Society for Conservation Biology.

  18. [Overweight and obesity in schoolchildren from Brandsen and its relationship with socio-environmental characteristics of residence].

    PubMed

    Cesani, María F; Luis, María A; Torres, María F; Castro, Luis E; Quintero, Fabián A; Luna, María E; Bergel, María L; Oyhenart, Evelia E

    2010-08-01

    Environmental factors play an important role in the etiology of overweight (S) and obesity (O), constituting the "obesogenic environment". The objectives of the present study were: a) to estimate the prevalence of overweight and obesity in 3- to 14-year-old schoolchildren from Brandsen (Provincia de Buenos Aires), and b) to analyze the probability of occurrence of overweight and obesity in relation to the socio-environmental conditions of residence. Weight and height were measured in 989 boys and girls aged 3 to 14 years. S and O were estimated following the criteria suggested by the International Obesity Task Force, and their prevalences were compared between genders and across ages. Socio-environmental information was gathered through surveys and processed by Categorical Principal Components Analysis (catPCA), and a Generalized Linear Model (logit link) was fitted to the variables S and O. S was found in 15.8% of schoolchildren and O in 7.2%, with no statistically significant differences between genders or across ages. The first axis of the catPCA separated cases with better socio-environmental conditions (positive values) from those with more unfavorable conditions (negative values). A higher probability of obesity was associated with better socio-environmental conditions (higher parental educational level, higher income, and better access to public services), whereas a higher probability of overweight was associated with less favored environments. The schoolchildren of Brandsen present high prevalences of overweight and obesity. The chance of being overweight is higher among children from households with adverse socio-environmental conditions; on the contrary, obese children are more often found in households with more favorable socio-environmental conditions.

  19. Dissociable Neural Processes Underlying Risky Decisions for Self Versus Other

    PubMed Central

    Jung, Daehyun; Sul, Sunhae; Kim, Hackjin

    2013-01-01

    Previous neuroimaging studies on decision making have mainly focused on decisions on behalf of oneself. Considering that people often make decisions on behalf of others, it is intriguing that there is little neurobiological evidence on how decisions for others differ from those for oneself. The present study directly compared risky decisions for self with those for another person using functional magnetic resonance imaging (fMRI). Participants were asked to perform a gambling task on behalf of themselves (decision-for-self condition) or another person (decision-for-other condition) while in the scanner. Their task was to choose between a low-risk option (i.e., win or lose 10 points) and a high-risk option (i.e., win or lose 90 points) with variable levels of winning probability. Compared with choices regarding others, those regarding oneself were more risk-averse at lower winning probabilities and more risk-seeking at higher winning probabilities, perhaps due to a stronger affective process during risky decisions for oneself compared with those for another. The brain-activation pattern changed according to the target, such that reward-related regions were more active in the decision-for-self condition than in the decision-for-other condition, whereas brain regions related to the theory of mind (ToM) showed greater activation in the decision-for-other condition than in the decision-for-self condition. Parametric modulation analysis using individual decision models revealed that activations of the amygdala and the dorsomedial prefrontal cortex (DMPFC) were associated with value computations for oneself and for another, respectively, during risky financial decisions. The results of the present study suggest that decisions for oneself and for another may recruit fundamentally distinct neural processes, which can be mainly characterized as dominant affective/impulsive and cognitive/regulatory processes, respectively. PMID:23519016

  20. Academic decision making and prospect theory.

    PubMed

    Mowrer, Robert R; Davidson, William B

    2011-08-01

    Two studies are reported that investigate the applicability of prospect theory to college students' academic decision making. Exp. 1 failed to provide support for the risk-seeking portion of the fourfold pattern predicted by prospect theory but did find the greater weighting of losses over gains. Using a more sensitive dependent measure, in Exp. 2 the results of the first experiment were replicated in terms of the gain-loss effect and also found some support for the fourfold pattern in the interaction between probabilities and gain versus loss. The greatest risk-seeking was found in the high probability loss condition.

  1. Condition factor variations over time and trophic position among four species of Characidae from Amazonian floodplain lakes: effects of an anomalous drought.

    PubMed

    Tribuzy-Neto, I A; Conceição, K G; Siqueira-Souza, F K; Hurd, L E; Freitas, C E C

    2018-05-01

    The effects of extreme droughts on freshwater fish remain unknown worldwide. In this paper, we estimated the condition factor, a measure of relative fitness based on the relationship of body weight to length, in four fish species representing two trophic levels (omnivores and piscivores) from Amazonian floodplain lakes for three consecutive years: 2004, 2005 (an anomalous drought year), and 2006. The two omnivores, Colossoma macropomum and Mylossoma duriventre, exhibited trends consistent with their life cycles in 2004 and 2006: high values during the hydrologic seasons of high water, receding water, and low water, with a drop following reproduction after the onset of rising water. However, during the drought year of 2005, the condition factor was much lower than normal during the receding and low water seasons, probably as a result of an abnormal reduction in resource availability in a reduced habitat. The two piscivorous piranhas, Serrasalmus spilopleura and S. elongatus, maintained relatively stable values of condition factor over the hydrologic cycles of all three years, with no apparent effect of the drought, probably because the reduction in habitat is counterbalanced by the resulting increase in relative prey density. We suggest that if predictions of increasing drought in the Amazon are correct, predatory species may benefit, at least in the short run, while omnivores may be negatively affected.

  2. The Role of Cognitive and Perceptual Loads in Inattentional Deafness

    PubMed Central

    Causse, Mickaël; Imbert, Jean-Paul; Giraudet, Louise; Jouffrais, Christophe; Tremblay, Sébastien

    2016-01-01

    The current study examines the role of cognitive and perceptual loads in inattentional deafness (the failure to perceive an auditory stimulus) and the possibility to predict this phenomenon with ocular measurements. Twenty participants performed Air Traffic Control (ATC) scenarios—in the Laby ATC-like microworld—guiding one (low cognitive load) or two (high cognitive load) aircraft while responding to visual notifications related to 7 (low perceptual load) or 21 (high perceptual load) peripheral aircraft. At the same time, participants were played standard tones which they had to ignore (probability = 0.80), or deviant tones (probability = 0.20) which they had to report. Behavioral results showed that 28.76% of alarms were not reported in the low cognitive load condition and up to 46.21% in the high cognitive load condition. On the contrary, perceptual load had no impact on the inattentional deafness rate. Finally, the mean pupil diameter of the fixations that preceded the target tones was significantly lower in the trials in which the participants did not report the tones, likely showing a momentary lapse of sustained attention, which in turn was associated to the occurrence of inattentional deafness. PMID:27458362

  3. Probabilistic attribution of individual unprecedented extreme events

    NASA Astrophysics Data System (ADS)

    Diffenbaugh, N. S.

    2016-12-01

    The last decade has seen a rapid increase in efforts to understand the influence of global warming on individual extreme climate events. Although trends in the distributions of climate observations have been thoroughly analyzed, rigorously quantifying the contribution of global-scale warming to individual events that are unprecedented in the observed record presents a particular challenge. This paper describes a method for leveraging observations and climate model ensembles to quantify the influence of historical global warming on the severity and probability of unprecedented events. This approach uses formal inferential techniques to quantify four metrics: (1) the contribution of the observed trend to the event magnitude, (2) the contribution of the observed trend to the event probability, (3) the probability of the observed trend in the current climate and a climate without human influence, and (4) the probability of the event magnitude in the current climate and a climate without human influence. Illustrative examples are presented, spanning a range of climate variables, timescales, and regions. These examples illustrate that global warming can influence the severity and probability of unprecedented extremes. In some cases - particularly high temperatures - this change is indicated by changes in the mean. However, changes in probability do not always arise from changes in the mean, suggesting that global warming can alter the frequency with which complex physical conditions co-occur. Because our framework is transparent and highly generalized, it can be readily applied to a range of climate events, regions, and levels of climate forcing.
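
    A minimal sketch of two of the four metrics described above (the probability of the event magnitude in the current climate and in a climate without human influence, and their ratio). The ensembles below are synthetic normal samples rather than climate model output, and the event magnitude is an arbitrary illustrative value.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Hypothetical ensembles of a climate index (e.g., a seasonal-mean temperature
    # anomaly): a counterfactual climate without human influence vs. the current one.
    counterfactual = rng.normal(loc=0.0, scale=1.0, size=100_000)
    current = rng.normal(loc=0.8, scale=1.0, size=100_000)

    event_magnitude = 2.5   # magnitude of the unprecedented observed event

    p_current = np.mean(current >= event_magnitude)
    p_counterfactual = np.mean(counterfactual >= event_magnitude)

    print("P(event | current climate):       ", p_current)
    print("P(event | counterfactual climate):", p_counterfactual)
    print("probability ratio:", p_current / p_counterfactual)
    ```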

  4. Neural substrates of the impaired effort expenditure decision making in schizophrenia.

    PubMed

    Huang, Jia; Yang, Xin-Hua; Lan, Yong; Zhu, Cui-Ying; Liu, Xiao-Qun; Wang, Ye-Fei; Cheung, Eric F C; Xie, Guang-Rong; Chan, Raymond C K

    2016-09-01

    Unwillingness to expend more effort to pursue high value rewards has been associated with motivational anhedonia in schizophrenia (SCZ) and abnormal dopamine activity in the nucleus accumbens (NAcc). The authors hypothesized that dysfunction of the NAcc and the associated forebrain regions is involved in the impaired effort expenditure decision-making of SCZ. A 2 (reward magnitude: low vs. high) × 3 (probability: 20% vs. 50% vs. 80%) event-related fMRI design in the effort-expenditure for reward task (EEfRT) was used to examine the neural response of 23 SCZ patients and 23 demographically matched control participants when the participants made effort expenditure decisions to pursue uncertain rewards. SCZ patients were significantly less likely to expend a high level of effort in the medium (50%) and high (80%) probability conditions than healthy controls. The neural responses in the NAcc, the posterior cingulate gyrus, and the left medial frontal gyrus were weaker in SCZ patients than in healthy controls and did not increase linearly with reward magnitude and probability. Moreover, NAcc activity was positively correlated with the willingness to expend high-level effort and concrete consummatory pleasure experience. NAcc and posterior cingulate dysfunctions in SCZ patients may be involved in their impaired effort expenditure decision-making. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  5. Comparison of hoop-net trapping and visual surveys to monitor abundance of the Rio Grande cooter (Pseudemys gorzugi).

    PubMed

    Mali, Ivana; Duarte, Adam; Forstner, Michael R J

    2018-01-01

    Abundance estimates play an important part in the regulatory and conservation decision-making process. It is important to correct monitoring data for imperfect detection when using these data to track spatial and temporal variation in abundance, especially in the case of rare and elusive species. This paper presents the first attempt to estimate abundance of the Rio Grande cooter ( Pseudemys gorzugi ) while explicitly considering the detection process. Specifically, in 2016 we monitored this rare species at two sites along the Black River, New Mexico via traditional baited hoop-net traps and less invasive visual surveys to evaluate the efficacy of these two sampling designs. We fitted the Huggins closed-capture estimator to estimate capture probabilities using the trap data and distance sampling models to estimate detection probabilities using the visual survey data. We found that only the visual survey with the highest number of observed turtles resulted in similar abundance estimates to those estimated using the trap data. However, the estimates of abundance from the remaining visual survey data were highly variable and often underestimated abundance relative to the estimates from the trap data. We suspect this pattern is related to changes in the basking behavior of the species and, thus, the availability of turtles to be detected even though all visual surveys were conducted when environmental conditions were similar. Regardless, we found that riverine habitat conditions limited our ability to properly conduct visual surveys at one site. Collectively, this suggests visual surveys may not be an effective sample design for this species in this river system. When analyzing the trap data, we found capture probabilities to be highly variable across sites and between age classes and that recapture probabilities were much lower than initial capture probabilities, highlighting the importance of accounting for detectability when monitoring this species. Although baited hoop-net traps seem to be an effective sampling design, it is important to note that this method required a relatively high trap effort to reliably estimate abundance. This information will be useful when developing a larger-scale, long-term monitoring program for this species of concern.

  6. SA45. Amotivation in Schizophrenia, Bipolar Disorder, and Major Depressive Disorder: A Preliminary Comparison Study

    PubMed Central

    Zou, Ying-min; Ni, Ke; Wang, Yang-yu; Yu, En-qing; Lui, Simon S. Y.; Cheung, Eric F. C.; Chan, Raymond C. K.

    2017-01-01

    Background: Deficits in reward processing, such as approaching motivation, reward learning and effort-based decision-making, have been observed in patients with schizophrenia (SCZ), bipolar disorder (BD), and major depressive disorder (MDD). However, little is known about the nature of reward-processing deficits in these 3 diagnostic groups. The present study aimed to compare and contrast amotivation in these 3 diagnostic groups using an effort-based decision-making task. Methods: Sixty patients (19 SCZ patients, 18 BD patients and 23 MDD patients) and 27 healthy controls (HC) were recruited for the present study. The Effort Expenditure for Reward Task (EEfRT) was administered to evaluate their effort allocation pattern. This task required participants to choose easy or hard tasks in response to different levels of reward magnitude and reward probability. Results: Results showed that SCZ, BD, and MDD patients chose fewer hard tasks compared to HC. As reward magnitude increased, MDD patients made the least effort to gain reward compared to the other groups. When reward probability was intermediate, MDD patients chose fewer hard tasks than SCZ patients, whereas BD patients and HC chose more hard tasks than MDD and SCZ patients. When the reward probability was high, all 3 groups of patients tried fewer hard tasks than HC. Moreover, SCZ and MDD patients were less likely to choose hard tasks than BD patients and HC in the intermediate estimated value conditions. However, in the highest estimated value condition, there was no group difference in hard task choices between these 3 clinical groups, and they were all less motivated than HC. Conclusion: SCZ, BD, and MDD patients shared common deficits in gaining reward if the reward probability and estimated value were high. SCZ and MDD patients showed less motivation than BD patients in gaining reward when the reward probability and estimated value was intermediate.

  7. Detection probabilities of electrofishing, hoop nets, and benthic trawls for fishes in two western North American rivers

    USGS Publications Warehouse

    Smith, Christopher D.; Quist, Michael C.; Hardy, Ryan S.

    2015-01-01

    Research comparing different sampling techniques helps improve the efficiency and efficacy of sampling efforts. We compared the effectiveness of three sampling techniques (small-mesh hoop nets, benthic trawls, boat-mounted electrofishing) for 30 species in the Green (WY, USA) and Kootenai (ID, USA) rivers by estimating conditional detection probabilities (probability of detecting a species given its presence at a site). Electrofishing had the highest detection probabilities (generally greater than 0.60) for most species (88%), but hoop nets also had high detectability for several taxa (e.g., adult burbot Lota lota, juvenile northern pikeminnow Ptychocheilus oregonensis). Benthic trawls had low detection probabilities (<0.05) for most taxa (84%). Gear-specific effects were present for most species indicating large differences in gear effectiveness among techniques. In addition to gear effects, habitat characteristics also influenced detectability of fishes. Most species-specific habitat relationships were idiosyncratic and reflected the ecology of the species. Overall findings of our study indicate that boat-mounted electrofishing and hoop nets are the most effective techniques for sampling fish assemblages in large, coldwater rivers.

  8. Anomalous night-time peaks in diurnal variations of NmF2 close to the geomagnetic equator: A statistical study

    NASA Astrophysics Data System (ADS)

    Pavlov, A. V.; Pavlova, N. M.

    2007-11-01

    We present a study of anomalous night-time NmF2 peaks, ANNPs, observed by the La Paz, Natal, Djibouti, Kodaikanal, Madras, Manila, Talara, and Huancayo Jicamarca ionosonde stations close to the geomagnetic equator. It is shown for the first time that the probabilities of occurrence of the first and second ANNPs depend on the geomagnetic longitude, and there is a longitude sector close to 110° geomagnetic longitude where the first and second ANNPs occur less frequently in comparison with the longitude regions located close to and below about 34° geomagnetic longitude and close to and above about 144° geomagnetic longitude. The frequencies of occurrence of the ANNPs are found to increase with increasing solar activity, except for the Djibouti and Kodaikanal ionosonde stations, where the probability of the first ANNP occurrence is found to decrease with increasing solar activity from low to moderate solar activity, and except for the Natal ionosonde station, where the frequencies of occurrence of the first and second ANNPs decrease with increasing solar activity from moderate to high solar activity. We found that the occurrence probabilities of ANNPs during geomagnetically disturbed conditions are greater than those during geomagnetically quiet conditions. The ANNP probabilities are largest in summer and are lowest in winter for the La Paz, Talara, and Huancayo Jicamarca sounders. These probabilities are lowest in summer for the Djibouti, Madras, and Manila ionosonde stations, and in spring for the Kodaikanal sounder. The maxima in the probabilities are found to be in autumn for the Djibouti, Madras, and Manila ionosonde stations, and in winter for the Kodaikanal sounder.

  9. New normative standards of conditional reasoning and the dual-source model

    PubMed Central

    Singmann, Henrik; Klauer, Karl Christoph; Over, David

    2014-01-01

    There has been a major shift in research on human reasoning toward Bayesian and probabilistic approaches, which has been called a new paradigm. The new paradigm sees most everyday and scientific reasoning as taking place in a context of uncertainty, and inference is from uncertain beliefs and not from arbitrary assumptions. In this manuscript we present an empirical test of normative standards in the new paradigm using a novel probabilized conditional reasoning task. Our results indicated that for everyday conditionals with at least a weak causal connection between antecedent and consequent, only the conditional probability of the consequent given the antecedent contributes unique variance to predicting the probability of the conditional, but not the probability of the conjunction, nor the probability of the material conditional. Regarding normative accounts of reasoning, we found significant evidence that participants' responses were confidence preserving (i.e., p-valid in the sense of Adams, 1998) for MP inferences, but not for MT inferences. Additionally, only for MP inferences and to a lesser degree for DA inferences did the rate of responses inside the coherence intervals defined by mental probability logic (Pfeifer and Kleiter, 2005, 2010) exceed chance levels. In contrast to the normative accounts, the dual-source model (Klauer et al., 2010) is a descriptive model. It posits that participants integrate their background knowledge (i.e., the type of information primary to the normative approaches) and their subjective probability that a conclusion is seen as warranted based on its logical form. Model fits showed that the dual-source model, which employed participants' responses to a deductive task with abstract contents to estimate the form-based component, provided as good an account of the data as a model that solely used data from the probabilized conditional reasoning task. PMID:24860516

  10. New normative standards of conditional reasoning and the dual-source model.

    PubMed

    Singmann, Henrik; Klauer, Karl Christoph; Over, David

    2014-01-01

    There has been a major shift in research on human reasoning toward Bayesian and probabilistic approaches, which has been called a new paradigm. The new paradigm sees most everyday and scientific reasoning as taking place in a context of uncertainty, and inference is from uncertain beliefs and not from arbitrary assumptions. In this manuscript we present an empirical test of normative standards in the new paradigm using a novel probabilized conditional reasoning task. Our results indicated that for everyday conditionals with at least a weak causal connection between antecedent and consequent, only the conditional probability of the consequent given the antecedent contributes unique variance to predicting the probability of the conditional, but not the probability of the conjunction, nor the probability of the material conditional. Regarding normative accounts of reasoning, we found significant evidence that participants' responses were confidence preserving (i.e., p-valid in the sense of Adams, 1998) for MP inferences, but not for MT inferences. Additionally, only for MP inferences and to a lesser degree for DA inferences did the rate of responses inside the coherence intervals defined by mental probability logic (Pfeifer and Kleiter, 2005, 2010) exceed chance levels. In contrast to the normative accounts, the dual-source model (Klauer et al., 2010) is a descriptive model. It posits that participants integrate their background knowledge (i.e., the type of information primary to the normative approaches) and their subjective probability that a conclusion is seen as warranted based on its logical form. Model fits showed that the dual-source model, which employed participants' responses to a deductive task with abstract contents to estimate the form-based component, provided as good an account of the data as a model that solely used data from the probabilized conditional reasoning task.
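
    As a worked illustration of the coherence intervals mentioned above (in the sense of mental probability logic), the sketch below computes the interval of coherent values for the MP conclusion P(q) from assumed values of P(p) and P(q|p); the input numbers are hypothetical, not taken from the study.

    ```python
    def mp_coherence_interval(p_antecedent, p_conditional):
        """
        Coherent probability interval for the MP conclusion P(q), given P(p) and
        P(q|p): since P(q) = P(q|p)P(p) + P(q|not-p)(1 - P(p)) and P(q|not-p)
        can take any value in [0, 1], P(q) is constrained to the interval below.
        """
        low = p_antecedent * p_conditional
        high = low + (1.0 - p_antecedent)
        return low, high

    # Example: P(p) = 0.8, P(q|p) = 0.9  ->  coherent P(q) lies in [0.72, 0.92]
    print(mp_coherence_interval(0.8, 0.9))
    ```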

  11. The School Experiences of Children with Epilepsy: A Phenomenological Study

    ERIC Educational Resources Information Center

    Whiting-MacKinnon, Cheryl; Roberts, Jillian

    2012-01-01

    In Canada, approximately three out of every 1,000 children have epilepsy, making it one of the most commonly diagnosed neurological conditions affecting children. It is therefore highly probable that educators will work with this population at some point in their careers. Epilepsy is linked to academic underachievement and social isolation, but…

  12. 10 CFR 26.119 - Determining “shy” bladder.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... donor was required to take a drug test, but was unable to provide a sufficient quantity of urine to complete the test; (2) The potential consequences of refusing to take the required drug test; and (3) The... condition has, or with a high degree of probability could have, precluded the donor from providing a...

  13. 10 CFR 26.119 - Determining “shy” bladder.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... donor was required to take a drug test, but was unable to provide a sufficient quantity of urine to complete the test; (2) The potential consequences of refusing to take the required drug test; and (3) The... condition has, or with a high degree of probability could have, precluded the donor from providing a...

  14. 10 CFR 26.119 - Determining “shy” bladder.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... donor was required to take a drug test, but was unable to provide a sufficient quantity of urine to complete the test; (2) The potential consequences of refusing to take the required drug test; and (3) The... condition has, or with a high degree of probability could have, precluded the donor from providing a...

  15. Physiological condition of autumn-banded mallards and its relationship to hunting vulnerability

    USGS Publications Warehouse

    Hepp, G.R.; Blohm, R.J.; Reynolds, R.E.; Hines, J.E.; Nichols, J.D.

    1986-01-01

    An important topic of waterfowl ecology concerns the relationship between the physiological condition of ducks during the nonbreeding season and fitness, i.e., survival and future reproductive success. We investigated this subject using direct band recovery records of mallards (Anas platyrhynchos) banded in autumn (1 Oct-15 Dec) 1981-83 in the Mississippi Alluvial Valley (MAV) [USA]. A condition index, weight (g)/wing length (mm), was calculated for each duck, and we tested whether condition of mallards at time of banding was related to their probability of recovery during the hunting season. In 3 years, 5,610 mallards were banded and there were 234 direct recoveries. Three binary regression models were used to test the relationship between recovery probability and condition. Likelihood-ratio tests were conducted to determine the most suitable model. For mallards banded in autumn there was a negative relationship between physical condition and the probability of recovery. Mallards in poor condition at the time of banding had a greater probability of being recovered during the hunting season. In general, this was true for all age and sex classes; however, the strongest relationship occurred for adult males.
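
    A minimal sketch of the kind of analysis described, assuming a logistic (binary) regression of recovery on the condition index; the data are simulated with an assumed negative slope, and the model is fitted with statsmodels rather than the authors' software.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)

    # Hypothetical banding data: condition index = weight (g) / wing length (mm).
    n = 5000
    condition = rng.normal(loc=4.0, scale=0.4, size=n)

    # Simulate a negative relationship: poorer condition -> higher recovery probability.
    logit = -3.0 - 1.2 * (condition - 4.0)
    p_recovery = 1.0 / (1.0 + np.exp(-logit))
    recovered = rng.binomial(1, p_recovery)

    X = sm.add_constant(condition)
    fit = sm.Logit(recovered, X).fit(disp=0)
    print(fit.params)   # a negative slope on condition mirrors the paper's finding
    ```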

  16. Quantitative assessment of building fire risk to life safety.

    PubMed

    Guanquan, Chu; Jinhua, Sun

    2008-06-01

    This article presents a quantitative risk assessment framework for evaluating fire risk to life safety. Fire risk is divided into two parts: probability and corresponding consequence of every fire scenario. The time-dependent event tree technique is used to analyze probable fire scenarios based on the effect of fire protection systems on fire spread and smoke movement. To obtain the variation of occurrence probability with time, Markov chain is combined with a time-dependent event tree for stochastic analysis on the occurrence probability of fire scenarios. To obtain consequences of every fire scenario, some uncertainties are considered in the risk analysis process. When calculating the onset time to untenable conditions, a range of fires are designed based on different fire growth rates, after which uncertainty of onset time to untenable conditions can be characterized by probability distribution. When calculating occupant evacuation time, occupant premovement time is considered as a probability distribution. Consequences of a fire scenario can be evaluated according to probability distribution of evacuation time and onset time of untenable conditions. Then, fire risk to life safety can be evaluated based on occurrence probability and consequences of every fire scenario. To express the risk assessment method in detail, a commercial building is presented as a case study. A discussion compares the assessment result of the case study with fire statistics.
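
    The sketch below illustrates only the Markov-chain ingredient of such a framework: a toy three-state fire-development chain whose state probabilities are propagated in time by a transition matrix. The states and transition probabilities are invented for illustration, not taken from the article.

    ```python
    import numpy as np

    # Hypothetical states of fire development: 0 = growing fire,
    # 1 = suppressed by sprinklers, 2 = flashover (untenable conditions).
    # Per-minute transition probabilities; values are purely illustrative.
    P = np.array([
        [0.85, 0.10, 0.05],   # growing -> growing / suppressed / flashover
        [0.00, 1.00, 0.00],   # suppressed is absorbing
        [0.00, 0.00, 1.00],   # flashover is absorbing
    ])

    state = np.array([1.0, 0.0, 0.0])   # fire starts in the "growing" state
    for minute in range(1, 16):
        state = state @ P                # time-dependent occurrence probabilities
        if minute % 5 == 0:
            print(minute, np.round(state, 3))
    ```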

  17. Technology-enhanced Interactive Teaching of Marginal, Joint and Conditional Probabilities: The Special Case of Bivariate Normal Distribution

    PubMed Central

    Dinov, Ivo D.; Kamino, Scott; Bhakhrani, Bilal; Christou, Nicolas

    2014-01-01

    Data analysis requires subtle probability reasoning to answer questions like What is the chance of event A occurring, given that event B was observed? This generic question arises in discussions of many intriguing scientific questions such as What is the probability that an adolescent weighs between 120 and 140 pounds given that they are of average height? and What is the probability of (monetary) inflation exceeding 4% and housing price index below 110? To address such problems, learning some applied, theoretical or cross-disciplinary probability concepts is necessary. Teaching such courses can be improved by utilizing modern information technology resources. Students’ understanding of multivariate distributions, conditional probabilities, correlation and causation can be significantly strengthened by employing interactive web-based science educational resources. Independent of the type of a probability course (e.g. majors, minors or service probability course, rigorous measure-theoretic, applied or statistics course) student motivation, learning experiences and knowledge retention may be enhanced by blending modern technological tools within the classical conceptual pedagogical models. We have designed, implemented and disseminated a portable open-source web-application for teaching multivariate distributions, marginal, joint and conditional probabilities using the special case of bivariate Normal distribution. A real adolescent height and weight dataset is used to demonstrate the classroom utilization of the new web-application to address problems of parameter estimation, univariate and multivariate inference. PMID:25419016

  18. Technology-enhanced Interactive Teaching of Marginal, Joint and Conditional Probabilities: The Special Case of Bivariate Normal Distribution.

    PubMed

    Dinov, Ivo D; Kamino, Scott; Bhakhrani, Bilal; Christou, Nicolas

    2013-01-01

    Data analysis requires subtle probability reasoning to answer questions like What is the chance of event A occurring, given that event B was observed? This generic question arises in discussions of many intriguing scientific questions such as What is the probability that an adolescent weighs between 120 and 140 pounds given that they are of average height? and What is the probability of (monetary) inflation exceeding 4% and housing price index below 110? To address such problems, learning some applied, theoretical or cross-disciplinary probability concepts is necessary. Teaching such courses can be improved by utilizing modern information technology resources. Students' understanding of multivariate distributions, conditional probabilities, correlation and causation can be significantly strengthened by employing interactive web-based science educational resources. Independent of the type of a probability course (e.g. majors, minors or service probability course, rigorous measure-theoretic, applied or statistics course) student motivation, learning experiences and knowledge retention may be enhanced by blending modern technological tools within the classical conceptual pedagogical models. We have designed, implemented and disseminated a portable open-source web-application for teaching multivariate distributions, marginal, joint and conditional probabilities using the special case of bivariate Normal distribution. A real adolescent height and weight dataset is used to demonstrate the classroom utilization of the new web-application to address problems of parameter estimation, univariate and multivariate inference.
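
    A short sketch of the conditional-probability computation behind the adolescent height/weight question, assuming a bivariate normal model with illustrative parameters (the article's dataset and fitted values are not reproduced here): the conditional distribution of weight given height is itself normal, with mean and variance given by the standard formulas.

    ```python
    from math import sqrt
    from scipy.stats import norm

    # Illustrative parameters: height (inches) and weight (pounds), correlation rho.
    mu_h, sd_h = 65.0, 3.0
    mu_w, sd_w = 130.0, 15.0
    rho = 0.6

    h = mu_h                                          # "average height"
    cond_mean = mu_w + rho * sd_w / sd_h * (h - mu_h) # conditional mean of weight
    cond_sd = sd_w * sqrt(1.0 - rho**2)               # conditional std. deviation

    # P(120 <= weight <= 140 | height = average)
    p = norm.cdf(140, cond_mean, cond_sd) - norm.cdf(120, cond_mean, cond_sd)
    print(round(p, 3))
    ```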

  19. Metocean design parameter estimation for fixed platform based on copula functions

    NASA Astrophysics Data System (ADS)

    Zhai, Jinjin; Yin, Qilin; Dong, Sheng

    2017-08-01

    Considering the dependent relationship among wave height, wind speed, and current velocity, we construct novel trivariate joint probability distributions via Archimedean copula functions. A total of 30 years of hindcast wave height, wind speed, and current velocity data for the Bohai Sea are sampled for the case study. Four kinds of distributions, namely, Gumbel distribution, lognormal distribution, Weibull distribution, and Pearson Type III distribution, are candidate models for marginal distributions of wave height, wind speed, and current velocity. The Pearson Type III distribution is selected as the optimal model. Bivariate and trivariate probability distributions of these environmental conditions are established based on four bivariate and trivariate Archimedean copulas, namely, Clayton, Frank, Gumbel-Hougaard, and Ali-Mikhail-Haq copulas. These joint probability models make full use of the marginal information and the dependence among the three variables. The design return values of these three variables can be obtained by three methods: univariate probability, conditional probability, and joint probability. The joint return periods of different load combinations are estimated by the proposed models. Platform responses (including base shear, overturning moment, and deck displacement) are further calculated. For the same return period, the design values of wave height, wind speed, and current velocity obtained by the conditional and joint probability models are much smaller than those by univariate probability. Considering the dependence among variables, the multivariate probability distributions provide design parameters closer to the actual sea state for ocean platform design.
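
    A minimal sketch of one ingredient of such a model: a bivariate Gumbel-Hougaard copula evaluated at the marginal non-exceedance probabilities of two variables, used to compute an "AND" joint return period. The marginal probabilities and the copula parameter theta below are assumptions for illustration, not values fitted to the Bohai Sea data.

    ```python
    import numpy as np

    def gumbel_hougaard(u, v, theta):
        """Bivariate Gumbel-Hougaard copula C(u, v), with theta >= 1."""
        return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

    # Illustrative marginal non-exceedance probabilities for annual maxima of
    # wave height and wind speed (50-year univariate design values).
    u = 0.98
    v = 0.98
    theta = 2.0   # assumed dependence parameter

    # "AND" joint exceedance: both variables exceed their design values in a year.
    p_and = 1.0 - u - v + gumbel_hougaard(u, v, theta)
    print("joint AND return period (years):", 1.0 / p_and)
    ```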

  20. The evaluation of speed skating helmet performance through peak linear and rotational accelerations.

    PubMed

    Karton, Clara; Rousseau, Philippe; Vassilyadi, Michael; Hoshizaki, Thomas Blaine

    2014-01-01

    As in many sports involving high speeds and body contact, head injuries are a concern for short track speed skating athletes and coaches. While the mandatory use of helmets has managed to nearly eliminate catastrophic head injuries such as skull fractures and cerebral haemorrhages, they may not be as effective at reducing the risk of a concussion. The purpose of this study was to evaluate the performance characteristics of speed skating helmets with respect to managing peak linear and peak rotational acceleration, and to compare their performance against other types of helmets commonly worn within the speed skating sport. Commercially available speed skating, bicycle and ice hockey helmets were evaluated using a three-impact condition test protocol at an impact velocity of 4 m/s. Two speed skating helmet models yielded mean peak linear accelerations within a range associated with a low estimated probability of sustaining a concussion for all three impact conditions. Conversely, the resulting mean peak rotational acceleration values all fell close to the high end of the probability range for sustaining a concussion. A similar tendency was observed for the bicycle and ice hockey helmets under the same impact conditions. Speed skating helmets may not be as effective at managing rotational acceleration and therefore may not successfully protect the user against risks associated with concussion injuries.

  1. Endoscopic ultrasound in common bile duct dilatation with normal liver enzymes

    PubMed Central

    De Angelis, Claudio; Marietti, Milena; Bruno, Mauro; Pellicano, Rinaldo; Rizzetto, Mario

    2015-01-01

    In recent years, isolated bile duct dilatation has been increasingly described in subjects with normal liver function tests and nonspecific abdominal symptoms, probably due to the widespread use of high-resolution imaging techniques. However, there is scant literature about the evolution of this condition and the impact of endoscopic ultrasound (EUS) in the diagnostic work up. When noninvasive imaging tests (transabdominal ultrasound, computed tomography or magnetic resonance cholangiopancreatography) fail to identify the cause of dilatation and clinical or biochemical alarm signs are absent, the probability of having biliary disease is considered low. In this setting, using EUS, the presence of pathologic findings (choledocholithiasis, strictures, chronic pancreatitis, ampullary or pancreatic tumors, cholangiocarcinoma), not always with a benign course, has been observed. The aim of this review was to evaluate the prevalence of disease among non-jaundiced patients without signs of cytolysis and/or cholestasis and to assess the yield of EUS. The data point to a promising role of EUS in the identification of a potential biliary pathology. EUS is a minimally invasive technique, with high accuracy, that could play a double cost-effective role: identifying pathologic conditions with dismal prognosis, in asymptomatic patients with negative prior imaging tests, and excluding pathologic conditions and further follow-up in healthy subjects. PMID:26191344

  2. Assessing hypotheses about nesting site occupancy dynamics

    USGS Publications Warehouse

    Bled, Florent; Royle, J. Andrew; Cam, Emmanuelle

    2011-01-01

    Hypotheses about habitat selection developed in the evolutionary ecology framework assume that individuals, under some conditions, select breeding habitat based on expected fitness in different habitats. The relationship between habitat quality and fitness may be reflected by the breeding success of individuals, which may in turn be used to assess habitat quality. Habitat quality may also be assessed via local density: if high-quality sites are preferentially used, high density may reflect high-quality habitat. Here we assessed whether site occupancy dynamics vary with site surrogates for habitat quality. We modeled nest site use probability in a seabird subcolony (the Black-legged Kittiwake, Rissa tridactyla) over a 20-year period. We estimated site persistence (an occupied site remains occupied from time t to t + 1) and colonization through two subprocesses: first colonization (site creation at the timescale of the study) and recolonization (a site is colonized again after being deserted). Our model explicitly incorporated site-specific and neighboring breeding success and conspecific density in the neighborhood. Our results provided evidence that reproductively "successful" sites have a higher persistence probability than "unsuccessful" ones. Analyses of site fidelity in marked birds and of survival probability showed that high site persistence predominantly reflects site fidelity, not immediate colonization by new owners after emigration or death of previous owners. There is a negative quadratic relationship between local density and persistence probability. First colonization probability decreases with density, whereas recolonization probability is constant. This highlights the importance of distinguishing initial colonization from recolonization to understand site occupancy. All dynamics varied positively with neighboring breeding success. We found evidence of a positive interaction between site-specific and neighboring breeding success. We addressed local population dynamics using a site occupancy approach that integrates hypotheses developed in behavioral ecology to account for individual decisions. This allows development of models of population and metapopulation dynamics that explicitly incorporate ecological and evolutionary processes.

  3. Estimating the Per-Contact Probability of Infection by Highly Pathogenic Avian Influenza (H7N7) Virus during the 2003 Epidemic in The Netherlands

    PubMed Central

    Ssematimba, Amos; Elbers, Armin R. W.; Hagenaars, Thomas J.; de Jong, Mart C. M.

    2012-01-01

    Estimates of the per-contact probability of transmission between farms of Highly Pathogenic Avian Influenza virus of the H7N7 subtype during the 2003 epidemic in the Netherlands are important for the design of better control and biosecurity strategies. We used standardized data collected during the epidemic and a model to extract data for untraced contacts based on the daily number of infectious farms within a given distance of a susceptible farm. With these data, we used a maximum likelihood estimation approach to estimate the transmission probabilities of the individual contact types, both traced and untraced. The estimated probabilities of virus transmission, conditional on the contact originating from an infectious farm, were: 0.000057 per infectious farm within 1 km per day, 0.000413 per infectious farm between 1 and 3 km per day, 0.0000895 per infectious farm between 3 and 10 km per day, 0.0011 per crisis organisation contact, 0.0414 per feed delivery contact, 0.308 per egg transport contact, 0.133 per other-professional contact and 0.246 per rendering contact. We validate these outcomes against literature data on virus genetic sequences for outbreak farms. These estimates can be used to inform further studies on the role that improved biosecurity between contacts and/or contact frequency reduction can play in eliminating between-farm spread of the virus during future epidemics. The findings also highlight the need to 1) understand the routes underlying the infections without traced contacts and 2) review whether the contact-tracing protocol is exhaustive in relation to all of a farm's day-to-day activities and practices. PMID:22808285
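
    The per-contact probabilities above were obtained by maximum likelihood; a minimal sketch of that idea, with entirely made-up contact records and only three contact types, is given below. The escape-probability form of the likelihood is standard, but the data layout, starting values, and optimizer choice are assumptions and not the authors' implementation.

      import numpy as np
      from scipy.optimize import minimize

      # Toy farm-day records: each row gives the number of contacts of each type received
      # from infectious farms that day; 'infected' = 1 if the farm became infected that day.
      # All numbers are invented for illustration.
      contacts = np.array([
          [2, 0, 1],
          [0, 1, 0],
          [5, 2, 0],
          [1, 0, 0],
          [3, 1, 2],
      ], dtype=float)
      infected = np.array([0, 0, 1, 0, 1])

      def neg_log_lik(logit_p):
          p = 1.0 / (1.0 + np.exp(-logit_p))                  # per-contact transmission probs
          escape = np.prod((1.0 - p) ** contacts, axis=1)     # P(no transmission that day)
          lik = np.where(infected == 1, 1.0 - escape, escape)
          return -np.sum(np.log(lik + 1e-12))

      res = minimize(neg_log_lik, x0=np.zeros(contacts.shape[1]), method="Nelder-Mead")
      p_hat = 1.0 / (1.0 + np.exp(-res.x))
      print("estimated per-contact transmission probabilities:", np.round(p_hat, 4))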

  4. Probability based models for estimation of wildfire risk

    Treesearch

    Haiganoush Preisler; D. R. Brillinger; R. E. Burgan; John Benoit

    2004-01-01

    We present a probability-based model for estimating fire risk. Risk is defined using three probabilities: the probability of fire occurrence; the conditional probability of a large fire given ignition; and the unconditional probability of a large fire. The model is based on grouped data at the 1 km²-day cell level. We fit a spatially and temporally explicit non-...
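
    The three-probability decomposition of risk can be written out directly; the sketch below uses invented counts at the 1 km²-day cell level and does not reproduce the spatially and temporally explicit model fitted in the paper.

      # Toy counts at the 1 km^2-day cell level (illustrative numbers only).
      cell_days = 1_000_000      # observed cell-days
      ignitions = 480            # cell-days with a fire ignition
      large_fires = 36           # ignitions that grew into a large fire

      p_ignition = ignitions / cell_days                    # probability of fire occurrence
      p_large_given_ignition = large_fires / ignitions      # conditional probability of a large fire given ignition
      p_large = p_ignition * p_large_given_ignition         # unconditional probability of a large fire

      print(f"P(ignition)={p_ignition:.2e}, "
            f"P(large|ignition)={p_large_given_ignition:.3f}, "
            f"P(large)={p_large:.2e}")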

  5. Does probability of occurrence relate to population dynamics?

    USGS Publications Warehouse

    Thuiller, Wilfried; Münkemüller, Tamara; Schiffers, Katja H.; Georges, Damien; Dullinger, Stefan; Eckhart, Vincent M.; Edwards, Thomas C.; Gravel, Dominique; Kunstler, Georges; Merow, Cory; Moore, Kara; Piedallu, Christian; Vissault, Steve; Zimmermann, Niklaus E.; Zurell, Damaris; Schurr, Frank M.

    2014-01-01

    Hutchinson defined a species' realized niche as the set of environmental conditions in which populations can persist in the presence of competitors. In terms of demography, the realized niche corresponds to the environments where the intrinsic growth rate (r) of populations is positive. Observed species occurrences should reflect the realized niche when additional processes like dispersal and local extinction lags do not have overwhelming effects. Despite the foundational nature of these ideas, quantitative assessments of the relationship between range-wide demographic performance and occurrence probability have not been made. This assessment is needed both to improve our conceptual understanding of species' niches and ranges and to develop reliable mechanistic models of species geographic distributions that incorporate demography and species interactions. The objective of this study is to analyse how demographic parameters (intrinsic growth rate r and carrying capacity K) and population density (N) relate to occurrence probability (Pocc). We hypothesized that these relationships vary with species' competitive ability. Demographic parameters, density, and occurrence probability were estimated for 108 tree species from four temperate forest inventory surveys (Québec, western USA, France and Switzerland). We used published information on shade tolerance as an indicator of light competition strategy, assuming that high tolerance denotes high competitive capacity in stable forest environments. Interestingly, relationships between demographic parameters and occurrence probability did not vary substantially across degrees of shade tolerance and regions. Although they were influenced by the uncertainty in the estimation of the demographic parameters, we found that r was generally negatively correlated with Pocc, while N, and for most regions K, were generally positively correlated with Pocc. Thus, in temperate forest trees the regions of highest occurrence probability are those with high densities but slow intrinsic population growth rates. The uncertain relationships between demography and occurrence probability suggest caution when linking species distribution and demographic models.

  6. Time varying moments, regime switch, and crisis warning: The birth-death process with changing transition probability

    NASA Astrophysics Data System (ADS)

    Tang, Yinan; Chen, Ping

    2014-06-01

    The sub-prime crisis in the U.S. reveals the limitation of diversification strategies based on mean-variance analysis. A regime switch and a turning point can be observed using a high-moment representation and time-dependent transition probabilities. Up-down price movements are induced by interactions among agents, which can be described by a birth-death (BD) process. Financial instability is visible as dramatically increasing 3rd to 5th moments one quarter before and during the crisis. The suddenly rising high moments provide effective warning signals of a regime switch or a coming crisis. The critical condition for a market breakdown can be identified from nonlinear stochastic dynamics. The master-equation approach of population dynamics provides a unified theory of calm and turbulent markets.
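
    As a rough illustration of how rising higher moments can flag a regime switch, the sketch below computes rolling standardized 3rd to 5th moments of a simulated return series; the window length, the simulated regimes, and any warning threshold a practitioner would apply are assumptions for illustration, not the authors' birth-death calibration.

      import numpy as np

      def rolling_moments(returns, window=60, orders=(3, 4, 5)):
          """Rolling standardized central moments of a return series."""
          out = {k: [] for k in orders}
          for t in range(window, len(returns) + 1):
              x = returns[t - window:t]
              mu, sigma = x.mean(), x.std()
              for k in orders:
                  out[k].append(np.mean((x - mu) ** k) / sigma ** k)
          return {k: np.array(v) for k, v in out.items()}

      rng = np.random.default_rng(0)
      calm = rng.normal(0.0, 0.01, 500)            # calm regime: thin-tailed returns
      turbulent = rng.standard_t(3, 250) * 0.02    # turbulent regime: fat-tailed returns
      moments = rolling_moments(np.concatenate([calm, turbulent]))
      print("max |skewness| :", np.abs(moments[3]).max())
      print("max kurtosis   :", moments[4].max())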

  7. Effect of noopept and afobazole on the development of neurosis of learned helplessness in rats.

    PubMed

    Uyanaev, A A; Fisenko, V P; Khitrov, N K

    2003-08-01

    We studied the effects of the new psychotropic preparations noopept and afobazole on acquisition of the conditioned active avoidance response and on the development of the neurosis of learned helplessness in rats. Noopept in doses of 0.05-0.10 mg/kg accelerated acquisition of the conditioned active avoidance response and reduced the incidence of learned helplessness in rats. Afobazole in a dose of 5 mg/kg produced the opposite effect, which is probably related to the highly selective anxiolytic activity of this preparation.

  8. Effect of drain current on appearance probability and amplitude of random telegraph noise in low-noise CMOS image sensors

    NASA Astrophysics Data System (ADS)

    Ichino, Shinya; Mawaki, Takezo; Teramoto, Akinobu; Kuroda, Rihito; Park, Hyeonwoo; Wakashima, Shunichi; Goto, Tetsuya; Suwa, Tomoyuki; Sugawa, Shigetoshi

    2018-04-01

    Random telegraph noise (RTN), which occurs in in-pixel source follower (SF) transistors, has become one of the most critical problems in high-sensitivity CMOS image sensors (CIS) because it is a limiting factor of dark random noise. In this paper, the behaviors of RTN toward changes in SF drain current conditions were analyzed using a low-noise array test circuit measurement system with a floor noise of 35 µV rms. In addition to statistical analysis by measuring a large number of transistors (18048 transistors), we also analyzed the behaviors of RTN parameters such as amplitude and time constants in the individual transistors. It is demonstrated that the appearance probability of RTN becomes small under a small drain current condition, although large-amplitude RTN tends to appear in a very small number of cells.

  9. Operational foreshock forecasting: Fifteen years after

    NASA Astrophysics Data System (ADS)

    Ogata, Y.

    2010-12-01

    We are concerned with operational forecasting of the probability that events are foreshocks of a forthcoming earthquake that is significantly larger (the mainshock). Specifically, we define foreshocks as preshocks that are substantially smaller than the mainshock, by a magnitude gap of 0.5 or larger. The probability gain of a foreshock forecast is extremely high compared to long-term forecasts by renewal processes or various alarm-based intermediate-term forecasts, because of a large event's low occurrence rate in a short period and a narrow target region. Thus, it is desirable to establish operational foreshock probability forecasting, as seismologists have done for aftershocks. When a series of earthquakes occurs in a region, we attempt to discriminate foreshocks from a swarm or a mainshock-aftershock sequence. Namely, after real-time identification of an earthquake cluster using methods such as the single-link algorithm, the probability is calculated by applying statistical features that discriminate foreshocks from other types of clusters, considering the events' stronger proximity in time and space and their tendency towards chronologically increasing magnitudes. These features were modeled for probability forecasting and the coefficients of the model were estimated in Ogata et al. (1996) for the JMA hypocenter data (M≧4, 1926-1993). Fifteen years have now passed since the publication of that work, so we are able to present the performance and validation of the forecasts (1994-2009) using the same model. Taking isolated events into consideration, the probability of the first event in a potential cluster being a foreshock varies in a range between 0+% and 10+% depending on its location. This conditional forecasting performs significantly better than the unconditional (average) foreshock probability of 3.7% throughout the Japan region. Furthermore, when there are additional events in a cluster, the forecast probabilities range more widely, from nearly 0% to about 40%, depending on the discrimination features among the events in the cluster. This conditional forecasting again performs significantly better than the unconditional foreshock probability of 7.3%, which is the average probability for the plural events in the earthquake clusters. Indeed, the frequency ratios of the actual foreshocks are consistent with the forecasted probabilities. Reference: Ogata, Y., Utsu, T. and Katsura, K. (1996). Statistical discrimination of foreshocks from other earthquake clusters, Geophys. J. Int. 127, 17-30.

  10. Stochastic approach for an unbiased estimation of the probability of a successful separation in conventional chromatography and sequential elution liquid chromatography.

    PubMed

    Ennis, Erin J; Foley, Joe P

    2016-07-15

    A stochastic approach was utilized to estimate the probability of a successful isocratic or gradient separation in conventional chromatography for numbers of sample components, peak capacities, and saturation factors ranging from 2 to 30, 20-300, and 0.017-1, respectively. The stochastic probabilities were obtained under conditions of (i) constant peak width ("gradient" conditions) and (ii) peak width increasing linearly with time ("isocratic/constant N" conditions). The isocratic and gradient probabilities obtained stochastically were compared with the probabilities predicted by Martin et al. [Anal. Chem., 58 (1986) 2200-2207] and Davis and Stoll [J. Chromatogr. A, (2014) 128-142]; for a given number of components and peak capacity the same trend is always observed: probability obtained with the isocratic stochastic approach
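
    A minimal Monte Carlo version of the constant-peak-width ("gradient") case is sketched below: component retention times are placed uniformly at random, and a trial counts as a successful separation when every adjacent pair of peaks is at least one peak width apart. The success criterion and the parameter values are illustrative assumptions, not the exact stochastic procedure used in the study.

      import numpy as np

      def p_success(m, peak_capacity, trials=20000, rng=None):
          """Monte Carlo estimate of the probability that m randomly placed peaks
          are all separated by at least one peak width (constant-width 'gradient' case)."""
          rng = rng or np.random.default_rng(0)
          width = 1.0 / peak_capacity
          hits = 0
          for _ in range(trials):
              t = np.sort(rng.random(m))            # random retention times in [0, 1]
              if np.all(np.diff(t) >= width):       # every adjacent pair resolved
                  hits += 1
          return hits / trials

      for nc in (50, 100, 200):
          print(f"m=10 components, peak capacity {nc}: P(success) ~ {p_success(10, nc):.3f}")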

  11. Probability Density Functions of the Solar Wind Driver of the Magnetosphere-Ionosphere System

    NASA Astrophysics Data System (ADS)

    Horton, W.; Mays, M. L.

    2007-12-01

    The solar-wind driven magnetosphere-ionosphere system is a complex dynamical system in that it exhibits (1) sensitivity to initial conditions; (2) multiple space-time scales; (3) bifurcation sequences with hysteresis in transitions between attractors; and (4) noncompositionality. This system is modeled by WINDMI, a network of eight coupled ordinary differential equations which describe the transfer of power from the solar wind through the geomagnetic tail, the ionosphere, and the ring current in the system. The model captures both storm activity from the plasma ring current energy, which yields a model Dst index result, and substorm activity from the region 1 field aligned current, yielding model AL and AU results. The input to the model is the solar wind driving voltage calculated from ACE solar wind parameter data, which has a regular coherent component and a broad-band turbulent component. Cross-correlation functions of the input-output data time series are computed, and the conditional probability density function for the occurrence of substorms given earlier IMF conditions is derived. The model shows a high probability of substorms for solar activity that contains a coherent, rotating IMF with magnetic cloud features. For a theoretical model of the imprint of solar convection on the solar wind we have used the Lorenz attractor (Horton et al., PoP, 1999, doi:10.1063/1.873683) as a solar wind driver. The work is supported by NSF grant ATM-0638480.

  12. HMM for hyperspectral spectrum representation and classification with endmember entropy vectors

    NASA Astrophysics Data System (ADS)

    Arabi, Samir Y. W.; Fernandes, David; Pizarro, Marco A.

    2015-10-01

    Hyperspectral images, owing to their good spectral resolution, are extensively used for classification, but their high number of bands requires greater transmission bandwidth, greater data storage capacity, and greater computational capability in processing systems. This work presents a new methodology for hyperspectral data classification that can work with a reduced number of spectral bands and achieve results comparable to processing methods that require all hyperspectral bands. The proposed method for hyperspectral spectrum classification is based on a Hidden Markov Model (HMM) associated with each endmember (EM) of a scene and the conditional probabilities of each EM belonging to every other EM. The EM conditional probabilities are transformed into an EM entropy vector, and those vectors are used as reference vectors for the classes in the scene. The conditional probabilities of a spectrum to be classified are also transformed into a spectrum entropy vector, which is assigned to a class by the minimum Euclidean distance (ED) between it and the EM entropy vectors. The methodology was tested with good results using AVIRIS spectra of a scene with 13 EMs, considering the full 209 bands and reduced sets of 128, 64, and 32 spectral bands. For the test area it is shown that only 32 spectral bands can be used instead of the original 209 bands without significant loss in the classification process.
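
    One plausible reading of the entropy-vector step is sketched below: each endmember's conditional-probability vector is mapped element-wise to -p*log(p) terms, and an unknown spectrum is assigned to the endmember whose entropy vector is nearest in Euclidean distance. The probability values and the exact form of the transformation are assumptions for illustration only, not the published formulation.

      import numpy as np

      def entropy_vector(p, eps=1e-12):
          """Element-wise entropy terms -p_i * log(p_i) of a probability vector."""
          p = np.clip(np.asarray(p, dtype=float), eps, 1.0)
          return -p * np.log(p)

      # Assumed conditional-probability vectors of each endmember with respect to all
      # endmembers (rows sum to 1); purely illustrative numbers for a 3-EM scene.
      em_cond_prob = np.array([
          [0.70, 0.20, 0.10],
          [0.15, 0.75, 0.10],
          [0.10, 0.15, 0.75],
      ])
      references = np.array([entropy_vector(row) for row in em_cond_prob])

      def classify(spectrum_cond_prob):
          """Assign the class whose EM entropy vector is nearest in Euclidean distance."""
          e = entropy_vector(spectrum_cond_prob)
          d = np.linalg.norm(references - e, axis=1)
          return int(np.argmin(d)), d

      label, dists = classify([0.60, 0.25, 0.15])
      print("assigned endmember:", label, "distances:", np.round(dists, 3))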

  13. Dopaminergic Drug Effects on Probability Weighting during Risky Decision Making

    PubMed Central

    Timmer, Monique H. M.; ter Huurne, Niels P.

    2018-01-01

    Abstract Dopamine has been associated with risky decision-making, as well as with pathological gambling, a behavioral addiction characterized by excessive risk-taking behavior. However, the specific mechanisms through which dopamine might act to foster risk-taking and pathological gambling remain elusive. Here we test the hypothesis that this might be achieved, in part, via modulation of subjective probability weighting during decision making. Healthy human controls (n = 21) and pathological gamblers (n = 16) played a decision-making task involving choices between sure monetary options and risky gambles, both in the gain and loss domains. Each participant played the task twice, either under placebo or under the dopamine D2/D3 receptor antagonist sulpiride, in a double-blind counterbalanced design. A prospect theory modelling approach was used to estimate subjective probability weighting and sensitivity to monetary outcomes. Consistent with prospect theory, we found that participants presented a distortion in the subjective weighting of probabilities, i.e., they overweighted low probabilities and underweighted moderate to high probabilities, both in the gain and loss domains. Compared with placebo, sulpiride attenuated this distortion in the gain domain. Across drugs, the groups did not differ in their probability weighting, although gamblers consistently underweighted losing probabilities in the placebo condition. Overall, our results reveal that dopamine D2/D3 receptor antagonism modulates the subjective weighting of probabilities in the gain domain, in the direction of more objective, economically rational decision making. PMID:29632870
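
    The kind of probability-weighting distortion estimated in this study is commonly modelled with the one-parameter Tversky-Kahneman weighting function; the sketch below shows how a weighting exponent below 1 overweights small probabilities and underweights moderate-to-high ones. The abstract does not state which functional form the authors fitted, so this particular form is an assumption.

      import numpy as np

      def tk_weight(p, gamma):
          """One-parameter Tversky-Kahneman probability weighting function."""
          p = np.asarray(p, dtype=float)
          return p ** gamma / (p ** gamma + (1.0 - p) ** gamma) ** (1.0 / gamma)

      probs = np.array([0.05, 0.25, 0.50, 0.75, 0.95])
      print("objective   :", probs)
      print("gamma = 0.6 :", np.round(tk_weight(probs, 0.6), 3))  # inverse-S distortion
      print("gamma = 1.0 :", np.round(tk_weight(probs, 1.0), 3))  # linear, no distortion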

  14. Holocene evolution of the River Nile drainage system as revealed from the Lake Dendi sediment record, central Ethiopian highlands

    NASA Astrophysics Data System (ADS)

    Wagner, B.; Viehberg, F. A.; Wennrich, V.; Junginger, A.; Kolvenbach, A.; Rethemeyer, J.; Schaebitz, F.; Schmiedl, G. H.

    2015-12-01

    A 12 m long sediment sequence from the Dendi Crater lakes, located on the central Ethiopian Plateau, was analysed with sedimentological and geochemical methods to reconstruct the regional environmental history. Bulk organic carbon samples from 23 horizons throughout the sequence were used for AMS radiocarbon dating and indicate that the sediment sequence spans the last ca. 12 cal kyr BP. Microscope analyses and sedimentological data reveal three tephra layers, of which the most prominent, with a thickness of ~2 m, was deposited at 10.2 cal kyr BP and probably originates from an eruption of the Wenchi crater 12 km to the west of the Dendi lakes. Sedimentological data from the pelagic deposits indicate shifts in erosion and rainfall throughout the record. A decrease in Ca and Sr at 11.6 cal kyr BP is related to the shift from the less humid conditions of the Younger Dryas (YD) back to the fully humid conditions of the African Humid Period (AHP). Single thin horizons with high carbonate content or high Ti and K imply that short spells of dry conditions and of significantly increased rainfall were superimposed on the generally more humid conditions during the AHP. The end of the AHP is gradual. Relatively stable and less humid conditions characterised the Dendi Crater lakes until around 3.9 cal kyr BP. A highly variable increase in clastic matter over the last 1500 years indicates higher erosion due to short-term variations in precipitation within the Dendi catchment. Overall, the sediment record suggests moderate changes in precipitation during the Holocene, probably owing to the lakes' exposed location in the Ethiopian highlands. The data from the Dendi Crater lakes show, in concert with other records from the Nile catchment and the Eastern Mediterranean Sea (EMS), that the Blue Nile provided the main freshwater source for maintaining EMS stratification and sapropel S1 formation between ca. 10.0 and 8.7 cal kyr BP. Subsequent aridification is recorded from equatorial East Africa to the northeastern Mediterranean and peaked, with some regional differences, between ca. 4.0 and 2.6 cal kyr BP. Significantly higher discharge in the Blue Nile hydraulic regime after 2.6 cal kyr BP is probably triggered by more local changes in precipitation, which are tentatively attributed to a change in the influence of the Indian Ocean.

  15. Probability and predictors of treatment-seeking for substance use disorders in the U.S.

    PubMed

    Blanco, Carlos; Iza, Miren; Rodríguez-Fernández, Jorge Mario; Baca-García, Enrique; Wang, Shuai; Olfson, Mark

    2015-04-01

    Little is known about the extent to which treatment-seeking behavior varies across individuals with alcohol abuse, alcohol dependence, drug abuse, and drug dependence. The sample included respondents from Wave 2 of the National Epidemiologic Survey on Alcohol and Related Conditions (NESARC) who reported a lifetime diagnosis of alcohol abuse, alcohol dependence, drug abuse, or drug dependence. Unadjusted and adjusted hazard ratios are presented for time to first treatment contact by sociodemographic characteristics and comorbid psychiatric disorders. Individuals were censored from the analyses if their condition remitted prior to seeking treatment. In the first year after disorder onset, rates of treatment-seeking were 13% for drug dependence, 5% for alcohol dependence, 2% for drug abuse, and 1% for alcohol abuse. The lifetime probability of seeking treatment among individuals who did not remit was also highest for drug dependence (90%), followed by drug abuse (60%), alcohol dependence (54%), and alcohol abuse (16%). Having had previous treatment contact for a substance use disorder (SUD) increased the probability of seeking treatment for another SUD. By contrast, an early age of SUD onset, belonging to an older cohort, and a higher level of education decreased the lifetime probability of treatment contact for SUD. The role of comorbid mental disorders was more complex, with some disorders increasing and others decreasing the probability of seeking treatment. Given the high rates of SUD and their substantial health and economic burden, these patterns suggest the need for innovative approaches to increase treatment access for individuals with SUD. Copyright © 2015. Published by Elsevier Ireland Ltd.

  16. Probability and predictors of treatment-seeking for substance use disorders in the U.S

    PubMed Central

    Blanco, Carlos; Iza, Miren; Rodríguez-Fernández, Jorge Mario; Baca-García, Enrique; Wang, Shuai; Olfson, Mark

    2016-01-01

    Background Little is known about the extent to which treatment-seeking behavior varies across individuals with alcohol abuse, alcohol dependence, drug abuse, and drug dependence. Methods The sample included respondents from Wave 2 of the National Epidemiologic Survey on Alcohol and Related Conditions (NESARC) who reported a lifetime diagnosis of alcohol abuse, alcohol dependence, drug abuse, or drug dependence. Unadjusted and adjusted hazard ratios are presented for time to first treatment contact by sociodemographic characteristics and comorbid psychiatric disorders. Individuals were censored from the analyses if their condition remitted prior to seeking treatment. Results In the first year after disorder onset, rates of treatment-seeking were 13% for drug dependence, 5% for alcohol dependence, 2% for drug abuse, and 1% for alcohol abuse. The lifetime probability of seeking treatment among individuals who did not remit was also highest for drug dependence (90%), followed by drug abuse (60%), alcohol dependence (54%), and alcohol abuse (16%). Having had previous treatment contact for a substance use disorder (SUD) increased the probability of seeking treatment for another SUD. By contrast, an early age of SUD onset, belonging to an older cohort, and a higher level of education decreased the lifetime probability of treatment contact for SUD. The role of comorbid mental disorders was more complex, with some disorders increasing and others decreasing the probability of seeking treatment. Conclusions Given the high rates of SUD and their substantial health and economic burden, these patterns suggest the need for innovative approaches to increase treatment access for individuals with SUD. PMID:25725934

  17. Height probabilities in the Abelian sandpile model on the generalized finite Bethe lattice

    NASA Astrophysics Data System (ADS)

    Chen, Haiyan; Zhang, Fuji

    2013-08-01

    In this paper, we study the sandpile model on the generalized finite Bethe lattice with a particular boundary condition. Using a combinatorial method, we give exact expressions for all single-site probabilities and some two-site joint probabilities. As a by-product, we prove that the height probabilities of bulk vertices are all the same for the Bethe lattice with the given boundary condition, which had been found from numerical evidence by Grassberger and Manna ["Some more sandpiles," J. Phys. (France) 51, 1077-1098 (1990)], 10.1051/jphys:0199000510110107700 but not previously proved.

  18. "A violation of the conditional independence assumption in the two-high-threshold Model of recognition memory": Correction to Chen, Starns, and Rotello (2015).

    PubMed

    2016-01-01

    Reports an error in "A violation of the conditional independence assumption in the two-high-threshold model of recognition memory" by Tina Chen, Jeffrey J. Starns and Caren M. Rotello (Journal of Experimental Psychology: Learning, Memory, and Cognition, 2015[Jul], Vol 41[4], 1215-1222). In the article, Chen et al. compared three models: a continuous signal detection model (SDT), a standard two-high-threshold discrete-state model in which detect states always led to correct responses (2HT), and a full-mapping version of the 2HT model in which detect states could lead to either correct or incorrect responses. After publication, Rani Moran (personal communication, April 21, 2015) identified two errors that impact the reported fit statistics for the Bayesian information criterion (BIC) metric of all models as well as the Akaike information criterion (AIC) results for the full-mapping model. The errors are described in the erratum. (The following abstract of the original article appeared in record 2014-56216-001.) The 2-high-threshold (2HT) model of recognition memory assumes that test items result in distinct internal states: they are either detected or not, and the probability of responding at a particular confidence level that an item is "old" or "new" depends on the state-response mapping parameters. The mapping parameters are independent of the probability that an item yields a particular state (e.g., both strong and weak items that are detected as old have the same probability of producing a highest-confidence "old" response). We tested this conditional independence assumption by presenting nouns 1, 2, or 4 times. To maximize the strength of some items, "superstrong" items were repeated 4 times and encoded in conjunction with pleasantness, imageability, anagram, and survival processing tasks. The 2HT model failed to simultaneously capture the response rate data for all item classes, demonstrating that the data violated the conditional independence assumption. In contrast, a Gaussian signal detection model, which posits that the level of confidence that an item is "old" or "new" is a function of its continuous strength value, provided a good account of the data. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  19. Conditional Probability Analysis: A Statistical Tool for Environmental Analysis.

    EPA Science Inventory

    The use and application of environmental conditional probability analysis (CPA) is relatively recent. The first presentation using CPA was made in 2002 at the New England Association of Environmental Biologists Annual Meeting in Newport, Rhode Island. CPA has been used since the...

  20. A probabilistic assessment of the likelihood of vegetation drought under varying climate conditions across China.

    PubMed

    Liu, Zhiyong; Li, Chao; Zhou, Ping; Chen, Xiuzhi

    2016-10-07

    Climate change significantly impacts vegetation growth and terrestrial ecosystems. Using satellite remote sensing observations, here we focus on investigating vegetation dynamics and the likelihood of vegetation-related drought under varying climate conditions across China. We first compare temporal trends of the Normalized Difference Vegetation Index (NDVI) and climatic variables over China. We find that there is in fact no significant change in vegetation over the cold regions where warming is significant. Then, we propose a joint probability model to estimate the likelihood of vegetation-related drought conditioned on different precipitation/temperature scenarios in the growing season across China. To the best of our knowledge, this study is the first to examine vegetation-related drought risk over China from a joint probability perspective. Our results demonstrate risk patterns of vegetation-related drought under both low and high precipitation/temperature conditions. We further identify the variations in vegetation-related drought risk under different climate conditions and the sensitivity of drought risk to climate variability. These findings provide insights for decision makers to evaluate drought risk and develop vegetation-related drought mitigation strategies over China in a warming world. The proposed methodology also has great potential to be applied to vegetation-related drought risk assessment in other regions worldwide.

  1. A probabilistic assessment of the likelihood of vegetation drought under varying climate conditions across China

    PubMed Central

    Liu, Zhiyong; Li, Chao; Zhou, Ping; Chen, Xiuzhi

    2016-01-01

    Climate change significantly impacts vegetation growth and terrestrial ecosystems. Using satellite remote sensing observations, here we focus on investigating vegetation dynamics and the likelihood of vegetation-related drought under varying climate conditions across China. We first compare temporal trends of the Normalized Difference Vegetation Index (NDVI) and climatic variables over China. We find that there is in fact no significant change in vegetation over the cold regions where warming is significant. Then, we propose a joint probability model to estimate the likelihood of vegetation-related drought conditioned on different precipitation/temperature scenarios in the growing season across China. To the best of our knowledge, this study is the first to examine vegetation-related drought risk over China from a joint probability perspective. Our results demonstrate risk patterns of vegetation-related drought under both low and high precipitation/temperature conditions. We further identify the variations in vegetation-related drought risk under different climate conditions and the sensitivity of drought risk to climate variability. These findings provide insights for decision makers to evaluate drought risk and develop vegetation-related drought mitigation strategies over China in a warming world. The proposed methodology also has great potential to be applied to vegetation-related drought risk assessment in other regions worldwide. PMID:27713530

  2. State and Jurisdictional Eligibility Definitions for Infants and Toddlers with Disabilities under IDEA. NECTAC Notes.

    ERIC Educational Resources Information Center

    Shackelford, Jo

    Under Part C of the Individuals with Disabilities Education Act (IDEA), participating states and jurisdictions must provide services to children who are either experiencing developmental delays, or who have a diagnosed mental or physical condition that has a high probability of resulting in developmental delay. Additionally, states may choose to…

  3. State and Jurisdictional Eligibility Definitions for Infants and Toddlers with Disabilities under IDEA. NECTAC Notes No. 16

    ERIC Educational Resources Information Center

    Shackelford, Jo

    2004-01-01

    Under Part C of the Individuals with Disabilities Education Act (IDEA), participating states and jurisdictions must provide services to children who are either experiencing developmental delays, or who have a diagnosed mental or physical condition that has a high probability of resulting in developmental delay. Additionally, states may choose to…

  4. Student and Teacher Ratings of Academic Competence: An Examination of Cross-Informant Agreement

    ERIC Educational Resources Information Center

    Kettler, Ryan J.; Elliott, Stephen N.; DiPerna, James C.; Bolt, Daniel M.; Reiser, Deitra; Resurreccion, Leilani

    2014-01-01

    Two studies were conducted with samples of middle and high school teachers and students to examine cross-informant agreement on the Academic Competence Evaluation Scales. Cross-informant agreement was examined using Pearson correlations and conditional probability indices. Results of Study 1 (N = 65) and Study 2 (N = 66) indicated that teacher and…

  5. On defense strategies for system of systems using aggregated correlations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rao, Nageswara S.; Imam, Neena; Ma, Chris Y. T.

    2017-04-01

    We consider a System of Systems (SoS) wherein each system Si, i = 1, 2, ..., N, is composed of discrete cyber and physical components which can be attacked and reinforced. We characterize the disruptions using aggregate failure correlation functions given by the conditional failure probability of the SoS given the failure of an individual system. We formulate the problem of ensuring the survival of the SoS as a game between an attacker and a provider, each with a utility function composed of a survival probability term and a cost term, both expressed in terms of the number of components attacked and reinforced. The survival probabilities of the systems satisfy simple product-form, first-order differential conditions, which simplify the Nash Equilibrium (NE) conditions. We derive sensitivity functions that highlight the dependence of the SoS survival probability at NE on the cost terms, correlation functions, and individual system survival probabilities. We apply these results to a simplified model of a distributed cloud computing infrastructure.

  6. Single, Complete, Probability Spaces Consistent With EPR-Bohm-Bell Experimental Data

    NASA Astrophysics Data System (ADS)

    Avis, David; Fischer, Paul; Hilbert, Astrid; Khrennikov, Andrei

    2009-03-01

    We show that paradoxical consequences of violations of Bell's inequality are induced by the use of an unsuitable probabilistic description for the EPR-Bohm-Bell experiment. The conventional description (due to Bell) is based on a combination of statistical data collected for different settings of polarization beam splitters (PBSs). In fact, such data consist of conditional probabilities which only partially define a probability space. Ignoring this conditioning leads to apparent contradictions in the classical probabilistic model (due to Kolmogorov). We show how to construct a completely consistent probabilistic model by taking into account the probabilities of selecting the settings of the PBSs. Our model both matches the experimental data and is consistent with classical probability theory.

  7. Logic, probability, and human reasoning.

    PubMed

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. Large margin nearest neighbor classifiers.

    PubMed

    Domeniconi, Carlotta; Gunopulos, Dimitrios; Peng, Jing

    2005-07-01

    The nearest neighbor technique is a simple and appealing approach to addressing classification problems. It relies on the assumption of locally constant class conditional probabilities. This assumption becomes invalid in high dimensions with a finite number of examples due to the curse of dimensionality. Severe bias can be introduced under these conditions when using the nearest neighbor rule. The employment of a locally adaptive metric becomes crucial in order to keep class conditional probabilities close to uniform, thereby minimizing the bias of estimates. We propose a technique that computes a locally flexible metric by means of support vector machines (SVMs). The decision function constructed by SVMs is used to determine the most discriminant direction in a neighborhood around the query. Such a direction provides a local feature weighting scheme. We formally show that our method increases the margin in the weighted space where classification takes place. Moreover, our method has the important advantage of online computational efficiency over competing locally adaptive techniques for nearest neighbor classification. We demonstrate the efficacy of our method using both real and simulated data.
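
    A much-simplified sketch of the idea is given below: a linear SVM supplies a discriminant direction, the absolute coefficients are used as feature weights, and nearest-neighbour classification is carried out in the weighted metric. The published method computes the discriminant direction locally around each query; using one global direction here is a deliberate simplification, and the synthetic data set is an assumption made for illustration.

      import numpy as np
      from sklearn.datasets import make_classification
      from sklearn.model_selection import train_test_split
      from sklearn.svm import SVC

      X, y = make_classification(n_samples=400, n_features=10, n_informative=3,
                                 n_redundant=0, random_state=0)
      Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)

      # Fit a linear SVM; its normal vector points along the most discriminant direction.
      svm = SVC(kernel="linear").fit(Xtr, ytr)
      w = np.abs(svm.coef_).ravel()
      weights = w / w.sum()                      # local feature weighting scheme (here global)

      def weighted_1nn(x):
          """1-NN classification in the feature-weighted (elliptical) metric."""
          d = np.sqrt(((Xtr - x) ** 2 * weights).sum(axis=1))
          return ytr[np.argmin(d)]

      acc = np.mean([weighted_1nn(x) == t for x, t in zip(Xte, yte)])
      print(f"weighted 1-NN accuracy: {acc:.3f}")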

  9. Experimental determination of the steady-state charging probabilities and particle size conservation in non-radioactive and radioactive bipolar aerosol chargers in the size range of 5-40 nm

    NASA Astrophysics Data System (ADS)

    Kallinger, Peter; Szymanski, Wladyslaw W.

    2015-04-01

    Three bipolar aerosol chargers, an AC-corona charger (Electrical Ionizer 1090, MSP Corp.), a soft X-ray charger (Advanced Aerosol Neutralizer 3087, TSI Inc.), and an α-radiation-based 241Am charger (tapcon & analysesysteme), were investigated with respect to their charging performance for airborne nanoparticles. The charging probabilities for negatively and positively charged particles and the particle size conservation were measured in the diameter range of 5-40 nm using sucrose nanoparticles. The chargers were operated under various flow conditions in the range of 0.6-5.0 liters per minute. Under particular experimental conditions, some deviations from the chosen theoretical model were found for all chargers. For very small particle sizes, the AC-corona charger showed particle losses at low flow rates and did not reach steady-state charge equilibrium at high flow rates. However, for all chargers, operating conditions were identified where the bipolar charge equilibrium was achieved. In practical terms, excellent particle size conservation was found for all three chargers.

  10. Evaluation of design parameters for TRISO-coated fuel particles to establish manufacturing critical limits using PARFUME

    DOE PAGES

    Skerjanc, William F.; Maki, John T.; Collin, Blaise P.; ...

    2015-12-02

    The success of modular high temperature gas-cooled reactors is highly dependent on the performance of the tristructural isotropic (TRISO) coated fuel particle and the quality to which it can be manufactured. During irradiation, TRISO-coated fuel particles act as a pressure vessel to contain fission gas and mitigate the diffusion of fission products to the coolant boundary. The fuel specifications place limits on key attributes to minimize fuel particle failure under irradiation and postulated accident conditions. PARFUME (an integrated mechanistic coated-particle fuel performance code developed at the Idaho National Laboratory) was used to calculate fuel particle failure probabilities. By systematically varying key TRISO-coated particle attributes, failure probability functions were developed to understand how each attribute contributes to fuel particle failure. Critical manufacturing limits were calculated for the key attributes of a low-enriched TRISO-coated nuclear fuel particle with a kernel diameter of 425 μm. These critical manufacturing limits identify ranges beyond which an increase in fuel particle failure probability is expected to occur.

  11. Landsat D Thematic Mapper image dimensionality reduction and geometric correction accuracy

    NASA Technical Reports Server (NTRS)

    Ford, G. E.

    1986-01-01

    To characterize and quantify the performance of the Landsat thematic mapper (TM), techniques for dimensionality reduction by linear transformation have been studied and evaluated, and the accuracy of the correction of geometric errors in TM images analyzed. Theoretical evaluations and comparisons of existing methods for the design of linear transformations for dimensionality reduction are presented. These methods include the discrete Karhunen-Loeve (KL) expansion, Multiple Discriminant Analysis (MDA), the Thematic Mapper (TM)-Tasseled Cap Linear Transformation, and Singular Value Decomposition (SVD). A unified approach to these design problems is presented in which each method involves optimizing an objective function with respect to the linear transformation matrix. From these studies, four modified methods are proposed, referred to as the Space Variant Linear Transformation, the KL Transform-MDA hybrid method, and the First and Second Versions of the Weighted MDA method. The modifications involve the assignment of weights to classes to achieve improvements in the class conditional probability of error for classes with high weights. Experimental evaluations of the existing and proposed methods have been performed using the six reflective bands of the TM data. It is shown that, in terms of probability of classification error and the percentage of the cumulative eigenvalues, the six reflective bands of the TM data require only a three-dimensional feature space. It is also shown experimentally that, for the proposed methods, the classes with high weights have improvements in class conditional probability of error estimates, as expected.
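
    A compact sketch of the two classical reductions discussed above is shown below, using synthetic six-band data as a stand-in for the TM reflective bands: the discrete Karhunen-Loeve expansion (equivalently, PCA) and multiple discriminant analysis (here LDA), each reducing to a three-dimensional feature space. The data and parameter choices are illustrative assumptions, not the study's Landsat processing chain.

      from sklearn.datasets import make_classification
      from sklearn.decomposition import PCA
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      # Synthetic stand-in for six reflective TM bands with four land-cover classes.
      X, y = make_classification(n_samples=600, n_features=6, n_informative=4,
                                 n_redundant=0, n_classes=4, n_clusters_per_class=1,
                                 random_state=0)

      # Discrete Karhunen-Loeve expansion (PCA) down to a 3-D feature space.
      pca = PCA(n_components=3).fit(X)
      print("variance captured by 3 KL components:",
            round(pca.explained_variance_ratio_.sum(), 3))

      # Multiple discriminant analysis (LDA) keeps at most n_classes - 1 = 3 directions.
      lda = LinearDiscriminantAnalysis(n_components=3).fit(X, y)
      print("LDA training accuracy in the 3-D space:", round(lda.score(X, y), 3))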

  12. Impact of a Usual Source of Care on Health Care Use, Spending, and Quality Among Adults With Mental Health Conditions.

    PubMed

    Fullerton, Catherine A; Witt, Whitney P; Chow, Clifton M; Gokhale, Manjusha; Walsh, Christine E; Crable, Erika L; Naeger, Sarah

    2018-05-01

    Physical comorbidities associated with mental health conditions contribute to high health care costs. This study examined the impact of having a usual source of care (USC) for physical health on health care utilization, spending, and quality for adults with a mental health condition using Medicaid administrative data. Having a USC decreased the probability of inpatient admissions and readmissions. It decreased expenditures on emergency department visits for physical health, 30-day readmissions, and behavioral health inpatient admissions. It also had a positive effect on several quality measures. Results underscore the importance of a USC for physical health and integrated care for adults with mental health conditions.

  13. Measurement of the Errors of Service Altimeter Installations During Landing-Approach and Take-Off Operations

    NASA Technical Reports Server (NTRS)

    Gracey, William; Jewel, Joseph W., Jr.; Carpenter, Gene T.

    1960-01-01

    The overall errors of the service altimeter installations of a variety of civil transport, military, and general-aviation airplanes have been experimentally determined during normal landing-approach and take-off operations. The average height above the runway at which the data were obtained was about 280 feet for the landings and about 440 feet for the take-offs. An analysis of the data obtained from 196 airplanes during 415 landing approaches and from 70 airplanes during 152 take-offs showed that: 1. The overall error of the altimeter installations in the landing- approach condition had a probable value (50 percent probability) of +/- 36 feet and a maximum probable value (99.7 percent probability) of +/- 159 feet with a bias of +10 feet. 2. The overall error in the take-off condition had a probable value of +/- 47 feet and a maximum probable value of +/- 207 feet with a bias of -33 feet. 3. The overall errors of the military airplanes were generally larger than those of the civil transports in both the landing-approach and take-off conditions. In the landing-approach condition the probable error and the maximum probable error of the military airplanes were +/- 43 and +/- 189 feet, respectively, with a bias of +15 feet, whereas those for the civil transports were +/- 22 and +/- 96 feet, respectively, with a bias of +1 foot. 4. The bias values of the error distributions (+10 feet for the landings and -33 feet for the take-offs) appear to represent a measure of the hysteresis characteristics (after effect and recovery) and friction of the instrument and the pressure lag of the tubing-instrument system.

  14. Probabilistic Approach to Conditional Probability of Release of Hazardous Materials from Railroad Tank Cars during Accidents

    DOT National Transportation Integrated Search

    2009-10-13

    This paper describes a probabilistic approach to estimate the conditional probability of release of hazardous materials from railroad tank cars during train accidents. Monte Carlo methods are used in developing a probabilistic model to simulate head ...

  15. The Dependence Structure of Conditional Probabilities in a Contingency Table

    ERIC Educational Resources Information Center

    Joarder, Anwar H.; Al-Sabah, Walid S.

    2002-01-01

    Conditional probability and statistical independence can be better explained with contingency tables. In this note some special cases of 2 x 2 contingency tables are considered. In turn an interesting insight into statistical dependence as well as independence of events is obtained.
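
    A tiny worked example of the point being made: in a 2 x 2 table, comparing a conditional probability P(A|B) with the marginal P(A) immediately shows whether the events are dependent. The counts below are invented for illustration.

      # A 2 x 2 contingency table of joint counts for events A (rows) and B (columns).
      #              B       not B
      table = [[30, 20],    # A
               [10, 40]]    # not A

      n = sum(sum(row) for row in table)
      p_a = sum(table[0]) / n                                  # marginal P(A)
      p_a_given_b = table[0][0] / (table[0][0] + table[1][0])  # conditional P(A|B)

      print(f"P(A) = {p_a:.2f}, P(A|B) = {p_a_given_b:.2f}")
      # The two agree only under independence; here 0.75 != 0.50, so A and B are dependent.
      print("independent?", abs(p_a_given_b - p_a) < 1e-12)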

  16. Comparison of Aperture Averaging and Receiver Diversity Techniques for Free Space Optical Links in Presence of Turbulence and Various Weather Conditions

    NASA Astrophysics Data System (ADS)

    Kaur, Prabhmandeep; Jain, Virander Kumar; Kar, Subrat

    2014-12-01

    In this paper, we investigate the performance of a Free Space Optical (FSO) link considering the impairments caused by various weather conditions, such as very clear air, drizzle, haze, and fog, and by turbulence in the atmosphere. An analytic expression for the outage probability is derived using the gamma-gamma distribution for turbulence and accounting for the effect of weather conditions through the Beer-Lambert law. The effect of receiver diversity schemes using aperture averaging and array receivers on the outage probability is studied and compared. As the aperture diameter is increased, the outage probability decreases irrespective of the turbulence strength (weak, moderate, or strong) and weather conditions. Similar effects are observed when the number of direct detection receivers in the array is increased. However, it is seen that as the desired level of performance in terms of the outage probability decreases, the array receiver becomes the preferred choice compared to the receiver with aperture averaging.
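
    A hedged numerical sketch of the outage-probability calculation is given below: the gamma-gamma irradiance density is integrated up to a detection threshold, and a Beer-Lambert attenuation factor rescales that threshold for a given weather condition. The turbulence parameters, threshold, and attenuation value are assumptions, and the expression in the paper is analytic rather than obtained by numerical integration.

      import numpy as np
      from scipy.integrate import quad
      from scipy.special import gamma as gamma_fn, kv

      def gg_pdf(I, alpha, beta):
          """Gamma-gamma irradiance pdf with unit mean."""
          k = (alpha + beta) / 2.0
          return (2.0 * (alpha * beta) ** k / (gamma_fn(alpha) * gamma_fn(beta))
                  * I ** (k - 1.0) * kv(alpha - beta, 2.0 * np.sqrt(alpha * beta * I)))

      def outage_probability(I_threshold, alpha, beta):
          """P(received irradiance < threshold), i.e. the outage probability."""
          return quad(gg_pdf, 0.0, I_threshold, args=(alpha, beta))[0]

      # Weather loss from the Beer-Lambert law, I = I0 * exp(-sigma * L); attenuating the
      # mean irradiance by this factor is equivalent to raising the effective threshold.
      attenuation = np.exp(-0.5)                     # assumed haze loss over the link
      for alpha, beta, label in [(11.6, 10.1, "weak"), (4.0, 1.9, "strong")]:
          p_out = outage_probability(0.2 / attenuation, alpha, beta)
          print(f"{label:>6} turbulence: outage probability ~ {p_out:.3e}")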

  17. Interactive effects of senescence and natural disturbance on the annual survival probabilities of snail kites

    USGS Publications Warehouse

    Reichert, Brian E.; Martin, J.; Kendall, William L.; Cattau, Christopher E.; Kitchens, Wiley M.

    2010-01-01

    Individuals in wild populations face risks associated with both intrinsic (i.e. aging) and external (i.e. environmental) sources of mortality. Condition-dependent mortality occurs when there is an interaction between such factors; however, few studies have clearly demonstrated condition-dependent mortality and some have even argued that condition-dependent mortality does not occur in wild avian populations. Using large sample sizes (2084 individuals, 3746 re-sights) of individual-based longitudinal data collected over a 33 year period (1976-2008) on multiple cohorts, we used a capture-mark-recapture framework to model age-dependent survival in the snail kite Rostrhamus sociabilis plumbeus population in Florida. Adding to the growing amount of evidence for actuarial senescence in wild populations, we found evidence of senescent declines in survival probabilities in adult kites. We also tested the hypothesis that older kites experienced condition-dependent mortality during a range-wide drought event (2000-2002). The results provide convincing evidence that the annual survival probability of senescent kites was disproportionately affected by the drought relative to the survival probability of prime-aged adults. To our knowledge, this is the first evidence of condition-dependent mortality to be demonstrated in a wild avian population, a finding which challenges recent conclusions drawn in the literature. Our study suggests that senescence and condition-dependent mortality can affect the demography of wild avian populations. Accounting for these sources of variation may be particularly important to appropriately compute estimates of population growth rate, and probabilities of quasi-extinctions.

  18. BIODEGRADATION PROBABILITY PROGRAM (BIODEG)

    EPA Science Inventory

    The Biodegradation Probability Program (BIODEG) calculates the probability that a chemical under aerobic conditions with mixed cultures of microorganisms will biodegrade rapidly or slowly. It uses fragment constants developed using multiple linear and non-linear regressions and d...

  19. Fundamental niche prediction of the pathogenic yeasts Cryptococcus neoformans and Cryptococcus gattii in Europe.

    PubMed

    Cogliati, Massimo; Puccianti, Erika; Montagna, Maria T; De Donno, Antonella; Susever, Serdar; Ergin, Cagri; Velegraki, Aristea; Ellabib, Mohamed S; Nardoni, Simona; Macci, Cristina; Trovato, Laura; Dipineto, Ludovico; Rickerts, Volker; Akcaglar, Sevim; Mlinaric-Missoni, Emilija; Bertout, Sebastien; Vencà, Ana C F; Sampaio, Ana C; Criseo, Giuseppe; Ranque, Stéphane; Çerikçioğlu, Nilgün; Marchese, Anna; Vezzulli, Luigi; Ilkit, Macit; Desnos-Ollivier, Marie; Pasquale, Vincenzo; Polacheck, Itzhack; Scopa, Antonio; Meyer, Wieland; Ferreira-Paim, Kennio; Hagen, Ferry; Boekhout, Teun; Dromer, Françoise; Varma, Ashok; Kwon-Chung, Kyung J; Inácio, Joäo; Colom, Maria F

    2017-10-01

    Fundamental niche prediction of Cryptococcus neoformans and Cryptococcus gattii in Europe is an important tool for understanding where these pathogenic yeasts have a high probability of surviving in the environment and therefore for identifying the areas with a high risk of infection. In this study, occurrence data for C. neoformans and C. gattii were compared by MaxEnt software with several bioclimatic conditions as well as with soil characteristics and land use. The results showed that C. gattii distribution can be predicted with high probability along the Mediterranean coast. The analysis of variables showed that its distribution is limited by low temperatures during the coldest season and by heavy precipitation in the driest season. C. neoformans var. grubii is able to colonize the same areas as C. gattii but is more tolerant to cold winter temperatures and summer precipitation. In contrast, the C. neoformans var. neoformans map was completely different: the best conditions for its survival were found in sub-continental areas and not along the Mediterranean coasts. In conclusion, we produced for the first time detailed prediction maps of the species and varieties of the C. neoformans and C. gattii species complex in Europe and the Mediterranean area. © 2017 Society for Applied Microbiology and John Wiley & Sons Ltd.

  20. Timescales of isotropic and anisotropic cluster collapse

    NASA Astrophysics Data System (ADS)

    Bartelmann, M.; Ehlers, J.; Schneider, P.

    1993-12-01

    From a simple estimate of the formation time of galaxy clusters, Richstone et al. have recently concluded that the evidence for non-virialized structures in a large fraction of observed clusters points towards a high value of the cosmological density parameter Omega0. This conclusion was based on a study of the spherical collapse of density perturbations, assumed to follow a Gaussian probability distribution. In this paper, we extend their treatment in several respects: first, we argue that the collapse does not start from a comoving motion of the perturbation, but that the continuity equation requires an initial velocity perturbation directly related to the density perturbation. This requirement modifies the initial condition for the evolution equation and has the effect that the collapse proceeds faster than in the case where the initial velocity perturbation is set to zero; the timescale is reduced by a factor of up to approximately 0.5. Our results thus strengthen the conclusion of Richstone et al. in favour of a high Omega0. In addition, we study the collapse of density fluctuations in the frame of the Zel'dovich approximation, using as starting condition the analytically known probability distribution of the eigenvalues of the deformation tensor, which depends only on the (Gaussian) width of the perturbation spectrum. Finally, we consider the anisotropic collapse of density perturbations dynamically, again with initial conditions drawn from the probability distribution of the deformation tensor. We find that in both cases of anisotropic collapse, in the Zel'dovich approximation and in the dynamical calculations, the resulting distribution of collapse times agrees remarkably well with the results from spherical collapse. We discuss this agreement and conclude that it is mainly due to the properties of the probability distribution for the eigenvalues of the Zel'dovich deformation tensor. Hence, the conclusions of Richstone et al. on the value of Omega0 can be verified and strengthened, even if a more general approach to the collapse of density perturbations is employed. A simple analytic formula for the cluster redshift distribution in an Einstein-de Sitter universe is derived.

  1. Non-linear relationship of cell hit and transformation probabilities in a low dose of inhaled radon progenies.

    PubMed

    Balásházy, Imre; Farkas, Arpád; Madas, Balázs Gergely; Hofmann, Werner

    2009-06-01

    Cellular hit probabilities of alpha particles emitted by inhaled radon progenies in sensitive bronchial epithelial cell nuclei were simulated at low exposure levels to obtain useful data for the rejection or support of the linear-non-threshold (LNT) hypothesis. In this study, local distributions of deposited inhaled radon progenies in airway bifurcation models were computed at exposure conditions characteristic of homes and uranium mines. Then, maximum local deposition enhancement factors at bronchial airway bifurcations, expressed as the ratio of local to average deposition densities, were determined to characterise the inhomogeneity of deposition and to elucidate their effect on resulting hit probabilities. The results obtained suggest that in the vicinity of the carinal regions of the central airways the probability of multiple hits can be quite high, even at low average doses. Assuming a uniform distribution of activity there are practically no multiple hits and the hit probability as a function of dose exhibits a linear shape in the low dose range. The results are quite the opposite in the case of hot spots revealed by realistic deposition calculations, where practically all cells receive multiple hits and the hit probability as a function of dose is non-linear in the average dose range of 10-100 mGy.

  2. Rapidly assessing the probability of exceptionally high natural hazard losses

    NASA Astrophysics Data System (ADS)

    Gollini, Isabella; Rougier, Jonathan

    2014-05-01

    One of the objectives in catastrophe modeling is to assess the probability distribution of losses for a specified period, such as a year. From the point of view of an insurance company, the whole of the loss distribution is interesting, and valuable in determining insurance premiums. But the shape of the right-hand tail is critical, because it impinges on the solvency of the company. A simple measure of the risk of insolvency is the probability that the annual loss will exceed the company's current operating capital. Imposing an upper limit on this probability is one of the objectives of the EU Solvency II directive. If a probabilistic model is supplied for the loss process, then this tail probability can be computed, either directly, or by simulation. This can be a lengthy calculation for complex losses. Given the inevitably subjective nature of quantifying loss distributions, computational resources might be better used in a sensitivity analysis. This requires either a quick approximation to the tail probability or an upper bound on the probability, ideally a tight one. We present several different bounds, all of which can be computed nearly instantly from a very general event loss table. We provide a numerical illustration, and discuss the conditions under which the bounds are tight. Although we consider the perspective of insurance and reinsurance companies, exactly the same issues concern the risk manager, who is typically very sensitive to large losses.
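
    The bounds themselves are not reproduced in the abstract; the following sketch only illustrates the general idea of bounding a tail probability from summary statistics of a hypothetical event loss table, using the classical Markov and one-sided Chebyshev (Cantelli) inequalities rather than the authors' bounds:

```python
import numpy as np

# Hypothetical event loss table: per-event annual occurrence probability and loss.
# A sketch only; the paper's bounds and table format may differ.
event_prob = np.array([0.02, 0.01, 0.005, 0.001])
event_loss = np.array([5e6, 2e7, 8e7, 3e8])

# Annual loss L = sum of Bernoulli(event_prob) * event_loss, events assumed independent.
mean_loss = np.sum(event_prob * event_loss)
var_loss = np.sum(event_prob * (1 - event_prob) * event_loss**2)

capital = 1e8   # current operating capital

# Markov bound: P(L >= c) <= E[L] / c
markov = mean_loss / capital
# One-sided Chebyshev (Cantelli) bound: P(L - E[L] >= t) <= var / (var + t^2)
t = capital - mean_loss
cantelli = var_loss / (var_loss + t**2) if t > 0 else 1.0

print(f"Markov bound:   {markov:.4f}")
print(f"Cantelli bound: {cantelli:.4f}")
```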

  3. A method to establish stimulus control and compliance with instructions.

    PubMed

    Borgen, John G; Charles Mace, F; Cavanaugh, Brenna M; Shamlian, Kenneth; Lit, Keith R; Wilson, Jillian B; Trauschke, Stephanie L

    2017-10-01

    We evaluated a unique procedure to establish compliance with instructions in four young children diagnosed with autism spectrum disorder (ASD) who had low levels of compliance. Our procedure included methods to establish a novel therapist as a source of positive reinforcement, reliably evoke orienting responses to the therapist, increase the number of exposures to instruction-compliance-reinforcer contingencies, and minimize the number of exposures to instruction-noncompliance-no reinforcer contingencies. We further alternated between instructions with a high probability of compliance (high-p instructions) and instructions that had a prior low probability of compliance (low-p instructions) as soon as low-p instructions lost stimulus control. The intervention is discussed in relation to the conditions necessary for the development of stimulus control and as an example of a variation of translational research. © 2017 Society for the Experimental Analysis of Behavior.

  4. Screening for SNPs with Allele-Specific Methylation based on Next-Generation Sequencing Data.

    PubMed

    Hu, Bo; Ji, Yuan; Xu, Yaomin; Ting, Angela H

    2013-05-01

    Allele-specific methylation (ASM) has long been studied but mainly documented in the context of genomic imprinting and X chromosome inactivation. Taking advantage of the next-generation sequencing technology, we conduct a high-throughput sequencing experiment with four prostate cell lines to survey the whole genome and identify single nucleotide polymorphisms (SNPs) with ASM. A Bayesian approach is proposed to model the counts of short reads for each SNP conditional on its genotypes of multiple subjects, leading to a posterior probability of ASM. We flag SNPs with high posterior probabilities of ASM by accounting for multiple comparisons based on posterior false discovery rates. Applying the Bayesian approach to the in-house prostate cell line data, we identify 269 SNPs as candidates of ASM. A simulation study is carried out to demonstrate the quantitative performance of the proposed approach.
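
    The paper's hierarchical model is not described in the abstract; as a rough illustration of posterior-probability-based ASM calling, the sketch below scores a single heterozygous SNP with a Beta-Binomial model of allele-assigned methylated read counts. The counts, prior, credible band, and cutoff are all illustrative assumptions:

```python
from scipy import stats

# Sketch of a simplified ASM call at one heterozygous SNP (not the paper's model).
# reads_allele_a / reads_allele_b: methylated read counts assigned to each allele.
reads_allele_a, reads_allele_b = 42, 7

# Beta(1, 1) prior on the probability that a methylated read comes from allele A.
posterior = stats.beta(1 + reads_allele_a, 1 + reads_allele_b)

# Call ASM when the allelic methylation ratio is credibly far from 0.5;
# the 0.4-0.6 band and the 0.95 cutoff are arbitrary illustrative choices.
p_asm = posterior.cdf(0.4) + posterior.sf(0.6)
print(f"Posterior probability of ASM: {p_asm:.3f}",
      "-> candidate" if p_asm > 0.95 else "")
```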

  5. On estimating probability of presence from use-availability or presence-background data.

    PubMed

    Phillips, Steven J; Elith, Jane

    2013-06-01

    A fundamental ecological modeling task is to estimate the probability that a species is present in (or uses) a site, conditional on environmental variables. For many species, available data consist of "presence" data (locations where the species [or evidence of it] has been observed), together with "background" data, a random sample of available environmental conditions. Recently published papers disagree on whether probability of presence is identifiable from such presence-background data alone. This paper aims to resolve the disagreement, demonstrating that additional information is required. We defined seven simulated species representing various simple shapes of response to environmental variables (constant, linear, convex, unimodal, S-shaped) and ran five logistic model-fitting methods using 1000 presence samples and 10 000 background samples; the simulations were repeated 100 times. The experiment revealed a stark contrast between two groups of methods: those based on a strong assumption that species' true probability of presence exactly matches a given parametric form had highly variable predictions and much larger RMS error than methods that take population prevalence (the fraction of sites in which the species is present) as an additional parameter. For six species, the former group grossly under- or overestimated probability of presence. The cause was not model structure or choice of link function, because all methods were logistic with linear and, where necessary, quadratic terms. Rather, the experiment demonstrates that an estimate of prevalence is not just helpful, but is necessary (except in special cases) for identifying probability of presence. We therefore advise against use of methods that rely on the strong assumption, due to Lele and Keim (recently advocated by Royle et al.) and Lancaster and Imbens. The methods are fragile, and their strong assumption is unlikely to be true in practice. We emphasize, however, that we are not arguing against standard statistical methods such as logistic regression, generalized linear models, and so forth, none of which requires the strong assumption. If probability of presence is required for a given application, there is no panacea for lack of data. Presence-background data must be augmented with an additional datum, e.g., species' prevalence, to reliably estimate absolute (rather than relative) probability of presence.

  6. Cyber-Physical Correlations for Infrastructure Resilience: A Game-Theoretic Approach

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rao, Nageswara S; He, Fei; Ma, Chris Y. T.

    In several critical infrastructures, the cyber and physical parts are correlated so that disruptions to one affect the other and hence the whole system. These correlations may be exploited to strategically launch attacks on components, and hence must be accounted for in ensuring infrastructure resilience, specified by its survival probability. We characterize the cyber-physical interactions at two levels: (i) the failure correlation function specifies the conditional survival probability of the cyber sub-infrastructure given the physical sub-infrastructure as a function of their marginal probabilities, and (ii) the individual survival probabilities of both sub-infrastructures are characterized by first-order differential conditions. We formulate a resilience problem for infrastructures composed of discrete components as a game between the provider and attacker, wherein their utility functions consist of an infrastructure survival probability term and a cost term expressed in terms of the number of components attacked and reinforced. We derive Nash Equilibrium conditions and sensitivity functions that highlight the dependence of infrastructure resilience on the cost term, correlation function and sub-infrastructure survival probabilities. These results generalize earlier ones based on linear failure correlation functions and independent component failures. We apply the results to models of cloud computing infrastructures and energy grids.

  7. Evidence for skipped spawning in a potamodromous cyprinid, humpback chub (Gila cypha), with implications for demographic parameter estimates

    USGS Publications Warehouse

    Pearson, Kristen Nicole; Kendall, William L.; Winkelman, Dana L.; Persons, William R.

    2015-01-01

    Our findings reveal evidence for skipped spawning in a potamodromous cyprinid, humpback chub (HBC; Gila cypha). Using closed robust design mark-recapture models, we found, on average, spawning HBC transition to the skipped spawning state with a probability of 0.45 (95% CRI (i.e. credible interval): 0.10, 0.80) and skipped spawners remain in the skipped spawning state with a probability of 0.60 (95% CRI: 0.26, 0.83), yielding an average spawning cycle of every 2.12 years, conditional on survival. As a result, migratory skipped spawners are unavailable for detection during annual sampling events. If availability is unaccounted for, survival and detection probability estimates will be biased. Therefore, we estimated annual adult survival probability (S), while accounting for skipped spawning, and found S remained reasonably stable throughout the study period, with an average of 0.75 (95% CRI: 0.66, 0.82; process variance σ² = 0.005), while skipped spawning probability was highly dynamic (σ² = 0.306). By improving understanding of HBC spawning strategies, conservation decisions can be based on less biased estimates of survival and a more informed population model structure.

  8. Translational Genomics Research Institute: Identification of Pathways Enriched with Condition-Specific Statistical Dependencies Across Four Subtypes of Glioblastoma Multiforme | Office of Cancer Genomics

    Cancer.gov

    Evaluation of Differential DependencY (EDDY) is a statistical test for the differential dependency relationship of a set of genes between two given conditions. For each condition, possible dependency network structures are enumerated and their likelihoods are computed to represent a probability distribution of dependency networks. The difference between the probability distributions of dependency networks is computed between conditions, and its statistical significance is evaluated with random permutations of condition labels on the samples.  
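
    EDDY's exact network enumeration and likelihood scoring are not given here; the sketch below only mimics the permutation step, substituting a simple divergence between condition-specific correlation matrices for the divergence between enumerated dependency-network distributions. The data, statistic, and permutation count are illustrative:

```python
import numpy as np

rng = np.random.default_rng(0)

def dependency_divergence(x, y):
    """Stand-in statistic: Frobenius distance between correlation matrices.
    EDDY itself compares probability distributions over enumerated networks."""
    return np.linalg.norm(np.corrcoef(x, rowvar=False) - np.corrcoef(y, rowvar=False))

def eddy_like_permutation_test(expr, labels, n_perm=1000):
    """expr: samples x genes matrix; labels: boolean condition labels."""
    observed = dependency_divergence(expr[labels], expr[~labels])
    null = np.empty(n_perm)
    for i in range(n_perm):
        perm = rng.permutation(labels)          # permute condition labels
        null[i] = dependency_divergence(expr[perm], expr[~perm])
    p_value = (np.sum(null >= observed) + 1) / (n_perm + 1)
    return observed, p_value

# Toy data: 40 samples, 5 genes, two conditions of 20 samples each.
expr = rng.normal(size=(40, 5))
labels = np.array([True] * 20 + [False] * 20)
print(eddy_like_permutation_test(expr, labels))
```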

  9. Translational Genomics Research Institute (TGen): Identification of Pathways Enriched with Condition-Specific Statistical Dependencies Across Four Subtypes of Glioblastoma Multiforme | Office of Cancer Genomics

    Cancer.gov

    Evaluation of Differential DependencY (EDDY) is a statistical test for the differential dependency relationship of a set of genes between two given conditions. For each condition, possible dependency network structures are enumerated and their likelihoods are computed to represent a probability distribution of dependency networks. The difference between the probability distributions of dependency networks is computed between conditions, and its statistical significance is evaluated with random permutations of condition labels on the samples.  

  10. Disruptive effects of prefeeding and haloperidol administration on multiple measures of food-maintained behavior in rats

    PubMed Central

    Hayashi, Yusuke; Wirth, Oliver

    2015-01-01

    Four rats responded under a choice reaction-time procedure. At the beginning of each trial, the rats were required to hold down a center lever for a variable duration, release it following a high- or low-pitched tone, and press either a left or right lever, conditional on the tone. Correct choices were reinforced with a probability of .95 or .05 under blinking or static houselights, respectively. After performance stabilized, disruptive effects of free access to food pellets prior to sessions (prefeeding) and intraperitoneal injection of haloperidol were examined on multiple behavioral measures (i.e., the number of trials completed, percent of correct responses, and reaction time). Resistance to prefeeding depended on the probability of food delivery for the number of trials completed and reaction time. Resistance to haloperidol, on the other hand, was not systematically affected by the probability of food delivery for any of the dependent measures. PMID:22209910

  11. Chaos in high-dimensional dissipative dynamical systems

    PubMed Central

    Ispolatov, Iaroslav; Madhok, Vaibhav; Allende, Sebastian; Doebeli, Michael

    2015-01-01

    For dissipative dynamical systems described by a system of ordinary differential equations, we address the question of how the probability of chaotic dynamics increases with the dimensionality of the phase space. We find that for a system of d globally coupled ODEs with quadratic and cubic non-linearities with randomly chosen coefficients and initial conditions, the probability of a trajectory to be chaotic increases universally from ~10⁻⁵-10⁻⁴ for d = 3 to essentially one for d ~ 50. In the limit of large d, the invariant measure of the dynamical systems exhibits universal scaling that depends on the degree of non-linearity, but not on the choice of coefficients, and the largest Lyapunov exponent converges to a universal scaling limit. Using statistical arguments, we provide analytical explanations for the observed scaling, universality, and for the probability of chaos. PMID:26224119

  12. Ditching Investigation of a 1/12-Scale Model of the Douglas F3D-2 Airplane, TED No. NACA DE 381

    NASA Technical Reports Server (NTRS)

    Fisher, Lloyd J.; Thompson, William C.

    1955-01-01

    An investigation of a 1/12-scale dynamically similar model of the Douglas F3D-2 airplane was made in calm water to observe the ditching behavior and to determine the safest procedure for making an emergency water landing. Various conditions of damage were simulated to determine the behavior which probably would occur in a full-scale ditching. The behavior of the model was determined from motion-picture records, time-history acceleration records, and visual observations. It was concluded that the airplane should be ditched at a medium-high attitude of about 8 degrees with the landing flaps down 40 degrees. In calm water the airplane will probably make a smooth run of about 550 feet and will have a maximum longitudinal deceleration of about 3g. The fuselage bottom will probably be damaged enough to allow the fuselage to fill with water very rapidly.

  13. Conditional Probabilities and Collapse in Quantum Measurements

    NASA Astrophysics Data System (ADS)

    Laura, Roberto; Vanni, Leonardo

    2008-09-01

    We show that, by including both the system and the apparatus in the quantum description of the measurement process, and using the concept of conditional probabilities, it is possible to deduce the statistical operator of the system after a measurement with a given result, which gives the probability distribution for all possible consecutive measurements on the system. This statistical operator, representing the state of the system after the first measurement, is in general not the same as the one that would be obtained using the postulate of collapse.

  14. Probabilities of good, marginal, and poor flying conditions for space shuttle ferry flights

    NASA Technical Reports Server (NTRS)

    Whiting, D. M.; Guttman, N. B.

    1977-01-01

    Empirical probabilities are provided for good, marginal, and poor flying weather for ferrying the Space Shuttle Orbiter from Edwards AFB, California, to Kennedy Space Center, Florida, and from Edwards AFB to Marshall Space Flight Center, Alabama. Results are given by month for each overall route plus segments of each route. The criteria for defining a day as good, marginal, or poor and the method of computing the relative frequencies and conditional probabilities for monthly reference periods are described.

  15. Pre-Service Teachers' Conceptions of Probability

    ERIC Educational Resources Information Center

    Odafe, Victor U.

    2011-01-01

    Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…

  16. The Efficacy of Using Diagrams When Solving Probability Word Problems in College

    ERIC Educational Resources Information Center

    Beitzel, Brian D.; Staley, Richard K.

    2015-01-01

    Previous experiments have shown a deleterious effect of visual representations on college students' ability to solve total- and joint-probability word problems. The present experiments used conditional-probability problems, known to be more difficult than total- and joint-probability problems. The diagram group was instructed in how to use tree…

  17. Relative Contributions of Three Descriptive Methods: Implications for Behavioral Assessment

    ERIC Educational Resources Information Center

    Pence, Sacha T.; Roscoe, Eileen M.; Bourret, Jason C.; Ahearn, William H.

    2009-01-01

    This study compared the outcomes of three descriptive analysis methods--the ABC method, the conditional probability method, and the conditional and background probability method--to each other and to the results obtained from functional analyses. Six individuals who had been diagnosed with developmental delays and exhibited problem behavior…

  18. Statistics concerning the Apollo command module water landing, including the probability of occurrence of various impact conditions, successful impact, and body X-axis loads

    NASA Technical Reports Server (NTRS)

    Whitnah, A. M.; Howes, D. B.

    1971-01-01

    Statistical information for the Apollo command module water landings is presented. This information includes the probability of occurrence of various impact conditions, a successful impact, and body X-axis loads of various magnitudes.

  19. Combination of a Stressor-Response Model with a Conditional Probability Analysis Approach for Developing Candidate Criteria from MBSS

    EPA Science Inventory

    I show that a conditional probability analysis using a stressor-response model based on a logistic regression provides a useful approach for developing candidate water quality criteria from empirical data, such as the Maryland Biological Streams Survey (MBSS) data.
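
    As a rough illustration of the general idea (not the MBSS analysis itself), the sketch below fits a logistic stressor-response model to hypothetical monitoring data and then reads off the conditional probability of biological impairment given that the stressor exceeds candidate criterion values:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)

# Hypothetical monitoring data (a sketch, not MBSS data): a stressor value and
# a binary indicator of degraded biological condition at each site.
stressor = rng.gamma(shape=2.0, scale=20.0, size=500)
impaired = rng.random(500) < 1 / (1 + np.exp(-(0.06 * stressor - 3.0)))

model = LogisticRegression().fit(stressor.reshape(-1, 1), impaired)

# Conditional probability of impairment given the stressor exceeds a candidate
# criterion, estimated directly from the observations above that criterion.
for criterion in (20, 40, 60):
    above = stressor > criterion
    p_cond = impaired[above].mean()
    p_model = model.predict_proba([[criterion]])[0, 1]
    print(criterion, round(p_cond, 2), round(p_model, 2))
```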

  20. Spatial prediction models for the probable biological condition of streams and rivers in the USA

    EPA Science Inventory

    The National Rivers and Streams Assessment (NRSA) is a probability-based survey conducted by the US Environmental Protection Agency and its state and tribal partners. It provides information on the ecological condition of the rivers and streams in the conterminous USA, and the ex...

  1. Random forest models for the probable biological condition of streams and rivers in the USA

    EPA Science Inventory

    The National Rivers and Streams Assessment (NRSA) is a probability based survey conducted by the US Environmental Protection Agency and its state and tribal partners. It provides information on the ecological condition of the rivers and streams in the conterminous USA, and the ex...

  2. GEOGRAPHIC-SPECIFIC WATER QUALITY CRITERIA DEVELOPMENT WITH MONITORING DATA USING CONDITIONAL PROBABILITIES - A PROPOSED APPROACH

    EPA Science Inventory

    A conditional probability approach using monitoring data to develop geographic-specific water quality criteria for protection of aquatic life is presented. Typical methods to develop criteria using existing monitoring data are limited by two issues: (1) how to extrapolate to an...

  3. [Sports and heat stroke].

    PubMed

    Yuzawa, Itsuki; Miyake, Yasufumi; Aruga, Tohru

    2012-06-01

    We describe the characteristics of heat stroke during sports activity in Japan. It was most common in teenage males, with the incidence peaking at 15 years of age for both men and women. Most patients did not need specific treatment. Most cases occurred outdoors around 3:00 p.m. between late July and mid-August. The sports most frequently involved were, in order, baseball, football, tennis, and basketball. Running and cycling were associated with high severity of illness. An awareness of environmental conditions, suitable sportswear, appropriate hydration, and conditioning management are probably the best preventive measures.

  4. Essential health care among Mexican indigenous people in a universal coverage context.

    PubMed

    Servan-Mori, Edson; Pelcastre-Villafuerte, Blanca; Heredia-Pi, Ileana; Montoya-Rodríguez, Arain

    2014-01-01

    To analyze the influence of indigenous status on essential health care among Mexican children, older people and women of reproductive age. The influence of indigenous status on the probability of receiving medical care for acute respiratory infection (ARI) and acute diarrheal disease (ADD), on vaccination coverage, and on antenatal care (ANC) was analyzed using the 2012 National Health Survey and non-experimental matching methods. Indigenous status does not per se influence vaccination coverage (in children <1 year), the probability of receiving care for ARIs and ADDs, or timely, frequent, and quality ANC. Being an indigenous older adult increases the probability of receiving a complete vaccination schedule by 9%. The unfavorable structural conditions in which Mexican indigenous people live constitute the persistent mechanism of their health vulnerability. Public policy should consider this level of intervention, so that intensive and focused health strategies contribute to improving their health condition and life.

  5. What Are Probability Surveys used by the National Aquatic Resource Surveys?

    EPA Pesticide Factsheets

    The National Aquatic Resource Surveys (NARS) use probability-survey designs to assess the condition of the nation’s waters. In probability surveys (also known as sample-surveys or statistical surveys), sampling sites are selected randomly.

  6. State and Jurisdictional Eligibility Definitions for Infants and Toddlers with Disabilities under IDEA. NECTAC Notes Issue No. 14

    ERIC Educational Resources Information Center

    Shackelford, Jo

    2004-01-01

    Under Part C of the Individuals with Disabilities Education Act (IDEA), participating states and jurisdictions must provide services to children who are either experiencing developmental delays, or who have a diagnosed mental or physical condition that has a high probability of resulting in developmental delay. Additionally, states may choose to…

  7. Noise in a phosphorelay drives stochastic entry into sporulation in Bacillus subtilis.

    PubMed

    Russell, Jonathan R; Cabeen, Matthew T; Wiggins, Paul A; Paulsson, Johan; Losick, Richard

    2017-10-02

    Entry into sporulation in Bacillus subtilis is governed by a phosphorelay in which phosphoryl groups from a histidine kinase are successively transferred via relay proteins to the response regulator Spo0A. Spo0A~P, in turn, sets in motion events that lead to asymmetric division and activation of the cell-specific transcription factor σF, a hallmark for entry into sporulation. Here, we have used a microfluidics-based platform to investigate the activation of Spo0A and σF in individual cells held under constant, sporulation-inducing conditions. The principal conclusions were that: (i) activation of σF occurs with an approximately constant probability after adaptation to conditions of nutrient limitation; (ii) activation of σF is tightly correlated with, and preceded by, Spo0A~P reaching a high threshold level; (iii) activation of Spo0A takes place abruptly just prior to asymmetric division; and (iv) the primary source of noise in the activation of Spo0A is the phosphorelay. We propose that cells exhibit a constant probability of attaining a high threshold level of Spo0A~P due to fluctuations in the flux of phosphoryl groups through the phosphorelay. © 2017 The Authors.

  8. Enceladus as a hydrothermal water world

    NASA Astrophysics Data System (ADS)

    Postberg, Frank; Hsu, Hsiang-Wen; Sekine, Yasuhito

    2014-05-01

    The composition of both salty ice grains and nanometer-sized stream particles emitted from Enceladus and measured by Cassini-CDA requires liquid water as a source. Moreover, they provide strong geochemical constraints on their origin inside the active moon. Most stream particles are composed of silica, a unique indicator, as nano-silica would only form under quite specific conditions. With high probability, on-going or geologically recent hydrothermal activity at Enceladus is required to generate these particles. Inferred reaction temperatures at the Enceladus ocean floor lie between 100 and 350 °C in a slightly alkaline environment (pH 7.5-10.5). The inferred high temperatures at great depth might require heat sources other than tides alone, such as remaining primordial heat and/or serpentinization of a probably porous rocky core. Long-term laboratory experiments were carried out to simulate the conditions at the Enceladus rock/water interface using the constraints derived from CDA measurements. These experiments allow insights into a rock/water chemistry which severely constrains the formation history of the moon and substantially enhances its astrobiological potential. Together with recent results from other Cassini instruments, a conclusive picture of Enceladus as an active water world seems to be within reach.

  9. The Statistics of Urban Scaling and Their Connection to Zipf’s Law

    PubMed Central

    Gomez-Lievano, Andres; Youn, HyeJin; Bettencourt, Luís M. A.

    2012-01-01

    Urban scaling relations characterizing how diverse properties of cities vary on average with their population size have recently been shown to be a general quantitative property of many urban systems around the world. However, in previous studies the statistics of urban indicators were not analyzed in detail, raising important questions about the full characterization of urban properties and how scaling relations may emerge in these larger contexts. Here, we build a self-consistent statistical framework that characterizes the joint probability distributions of urban indicators and city population sizes across an urban system. To develop this framework empirically we use one of the most granular and stochastic urban indicators available, specifically measuring homicides in cities of Brazil, Colombia and Mexico, three nations with high and fast changing rates of violent crime. We use these data to derive the conditional probability of the number of homicides per year given the population size of a city. To do this we use Bayes’ rule together with the estimated conditional probability of city size given their number of homicides and the distribution of total homicides. We then show that scaling laws emerge as expectation values of these conditional statistics. Knowledge of these distributions implies, in turn, a relationship between scaling and population size distribution exponents that can be used to predict Zipf’s exponent from urban indicator statistics. Our results also suggest how a general statistical theory of urban indicators may be constructed from the stochastic dynamics of social interaction processes in cities. PMID:22815745
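
    The Bayes'-rule step described above can be illustrated with a toy discretized example (hypothetical counts, not the homicide data): estimate P(size | homicides) and P(homicides) from a joint table and combine them to obtain P(homicides | size).

```python
import numpy as np

# Toy discretized counts (hypothetical, not the Brazil/Colombia/Mexico data):
# rows = homicide classes (low/medium/high), columns = city-size classes (small/medium/large).
joint_counts = np.array([[120, 60, 10],
                         [ 40, 90, 50],
                         [  5, 30, 95]], dtype=float)

p_homicides = joint_counts.sum(axis=1) / joint_counts.sum()              # P(Y)
p_size_given_h = joint_counts / joint_counts.sum(axis=1, keepdims=True)  # P(N | Y)
p_size = joint_counts.sum(axis=0) / joint_counts.sum()                   # P(N)

# Bayes' rule: P(Y | N) = P(N | Y) P(Y) / P(N)
p_h_given_size = p_size_given_h * p_homicides[:, None] / p_size[None, :]
print(np.round(p_h_given_size, 3))   # each column sums to 1
```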

  10. [HYGIENIC ASSESSMENT OFWORKING ENVIRONMENT FOR REPAIRERS OF RAILWAY ROLLING STOCK IN PLANT CONDITIONS].

    PubMed

    Sudeikina, N A; Kurenkova, G V

    2015-01-01

    The comprehensive hygienic assessment of the working environment for the main occupational groups of a railway car repair plant shows that workers are exposed to chemical factors in concentrations exceeding the maximum allowable (lead, manganese, caustic alkali, sulphuric and nitric acids, chromium trioxide, silicon-containing dust, white corundum, diiron trioxide, silicate-organic dust, wood and carbon dusts), high levels of noise, local vibration, and insufficient levels of artificial lighting. Manual labor is widely used, which determines the high severity of the labor process for most workers. Inconsistencies were identified between the qualitative and quantitative assessments of working conditions for the chemical factor obtained under the various types of control: certification of workplaces for working conditions, production control, and state control. An a priori evaluation of occupational risk in the three main workshops identified 13 occupations with mild (moderate) risk, 9 occupations with average (significant) risk, 6 occupations with a high (intolerable) risk category, and 1 occupation with a very high (intolerable) risk category. The low rates of occupational disease reported in official statistics fail to be consistent with the high probability of their occurrence in this production setting.

  11. Exploration of Use of Copulas in Analysing the Relationship between Precipitation and Meteorological Drought in Beijing, China

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fan, Linlin; Wang, Hongrui; Wang, Cheng

    Drought risk analysis is essential for regional water resource management. In this study, the probabilistic relationship between precipitation and meteorological drought in Beijing, China, was calculated under three different precipitation conditions (precipitation equal to, greater than, or less than a threshold) based on copulas. The Standardized Precipitation Evapotranspiration Index (SPEI) was calculated based on monthly total precipitation and monthly mean temperature data. The trends and variations in the SPEI were analysed using Hilbert-Huang Transform (HHT) and Mann-Kendall (MK) trend tests with a running approach. The results of the HHT and MK test indicated a significant decreasing trend in the SPEI. The copula-based conditional probability indicated that the probability of meteorological drought decreased as monthly precipitation increased and that 10 mm can be regarded as the threshold for triggering extreme drought. From a quantitative perspective, when R ≤ mm, the probabilities of moderate drought, severe drought, and extreme drought were 22.1%, 18%, and 13.6%, respectively. This conditional probability distribution not only revealed the occurrence of meteorological drought in Beijing but also provided a quantitative way to analyse the probability of drought under different precipitation conditions. Furthermore, the results provide a useful reference for future drought prediction.

  12. Exploration of Use of Copulas in Analysing the Relationship between Precipitation and Meteorological Drought in Beijing, China

    DOE PAGES

    Fan, Linlin; Wang, Hongrui; Wang, Cheng; ...

    2017-05-16

    Drought risk analysis is essential for regional water resource management. In this study, the probabilistic relationship between precipitation and meteorological drought in Beijing, China, was calculated under three different precipitation conditions (precipitation equal to, greater than, or less than a threshold) based on copulas. The Standardized Precipitation Evapotranspiration Index (SPEI) was calculated based on monthly total precipitation and monthly mean temperature data. The trends and variations in the SPEI were analysed using Hilbert-Huang Transform (HHT) and Mann-Kendall (MK) trend tests with a running approach. The results of the HHT and MK test indicated a significant decreasing trend in the SPEI. The copula-based conditional probability indicated that the probability of meteorological drought decreased as monthly precipitation increased and that 10 mm can be regarded as the threshold for triggering extreme drought. From a quantitative perspective, when R ≤ mm, the probabilities of moderate drought, severe drought, and extreme drought were 22.1%, 18%, and 13.6%, respectively. This conditional probability distribution not only revealed the occurrence of meteorological drought in Beijing but also provided a quantitative way to analyse the probability of drought under different precipitation conditions. Furthermore, the results provide a useful reference for future drought prediction.
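
    The abstract does not state which copula family or marginal distributions were fitted; the sketch below illustrates the general construction with a Gaussian copula and illustrative marginals, computing P(drought | precipitation below a threshold) as C(u, v)/u:

```python
import numpy as np
from scipy import stats

# Minimal sketch of a copula-based conditional probability (illustrative values only;
# the fitted copula family and marginal distributions are not given in the abstract).
rho = 0.8                                     # hypothetical precipitation-SPEI dependence
gauss_copula = stats.multivariate_normal(mean=[0.0, 0.0],
                                         cov=[[1.0, rho], [rho, 1.0]])

def copula_cdf(u, v):
    """Gaussian copula C(u, v) = Phi_2(Phi^-1(u), Phi^-1(v); rho)."""
    return gauss_copula.cdf([stats.norm.ppf(u), stats.norm.ppf(v)])

# Hypothetical marginals: monthly precipitation ~ Gamma, SPEI ~ N(0, 1);
# drought defined here as SPEI <= -1 (moderate drought threshold).
precip_marginal = stats.gamma(a=1.5, scale=30.0)
v = stats.norm.cdf(-1.0)

for r_mm in (10.0, 30.0, 60.0):
    u = precip_marginal.cdf(r_mm)
    p_drought_given_low_precip = copula_cdf(u, v) / u   # P(SPEI <= -1 | P <= r)
    print(f"P(drought | precip <= {r_mm:.0f} mm) = {p_drought_given_low_precip:.3f}")
```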

  13. Game-Theoretic strategies for systems of components using product-form utilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rao, Nageswara S; Ma, Cheng-Yu; Hausken, K.

    Many critical infrastructures are composed of multiple systems of components which are correlated so that disruptions to one may propagate to others. We consider such infrastructures with correlations characterized in two ways: (i) an aggregate failure correlation function specifies the conditional failure probability of the infrastructure given the failure of an individual system, and (ii) a pairwise correlation function between two systems specifies the failure probability of one system given the failure of the other. We formulate a game for ensuring the resilience of the infrastructure, wherein the utility functions of the provider and attacker are products of an infrastructure survival probability term and a cost term, both expressed in terms of the numbers of system components attacked and reinforced. The survival probabilities of individual systems satisfy first-order differential conditions that lead to simple Nash Equilibrium conditions. We then derive sensitivity functions that highlight the dependence of infrastructure resilience on the cost terms, correlation functions, and individual system survival probabilities. We apply these results to simplified models of distributed cloud computing and energy grid infrastructures.

  14. Anthropogenic warming has increased drought risk in California.

    PubMed

    Diffenbaugh, Noah S; Swain, Daniel L; Touma, Danielle

    2015-03-31

    California is currently in the midst of a record-setting drought. The drought began in 2012 and now includes the lowest calendar-year and 12-mo precipitation, the highest annual temperature, and the most extreme drought indicators on record. The extremely warm and dry conditions have led to acute water shortages, groundwater overdraft, critically low streamflow, and enhanced wildfire risk. Analyzing historical climate observations from California, we find that precipitation deficits in California were more than twice as likely to yield drought years if they occurred when conditions were warm. We find that although there has not been a substantial change in the probability of either negative or moderately negative precipitation anomalies in recent decades, the occurrence of drought years has been greater in the past two decades than in the preceding century. In addition, the probability that precipitation deficits co-occur with warm conditions and the probability that precipitation deficits produce drought have both increased. Climate model experiments with and without anthropogenic forcings reveal that human activities have increased the probability that dry precipitation years are also warm. Further, a large ensemble of climate model realizations reveals that additional global warming over the next few decades is very likely to create ∼ 100% probability that any annual-scale dry period is also extremely warm. We therefore conclude that anthropogenic warming is increasing the probability of co-occurring warm-dry conditions like those that have created the acute human and ecosystem impacts associated with the "exceptional" 2012-2014 drought in California.
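
    The kind of empirical co-occurrence calculation described above can be sketched on synthetic annual anomalies (not the California record; thresholds and the imposed trend are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic annual anomalies standing in for an observational record
# (a sketch; not the California data, and the thresholds are illustrative).
years = 120
warming_trend = np.linspace(-0.5, 1.0, years)            # imposed warming
temp_anom = warming_trend + rng.normal(0, 0.5, years)
precip_anom = rng.normal(0, 1.0, years)                   # no precipitation trend

dry = precip_anom < np.quantile(precip_anom, 0.2)         # driest 20% of years
warm = temp_anom > np.quantile(temp_anom, 0.8)            # warmest 20% of years

# Conditional probability that a dry year is also warm, early vs late period.
early, late = slice(0, years // 2), slice(years // 2, years)
for name, period in (("early", early), ("late", late)):
    d, w = dry[period], warm[period]
    p = (d & w).sum() / d.sum() if d.sum() else float("nan")
    print(f"{name}: P(warm | dry) = {p:.2f}")
```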

  15. Anthropogenic warming has increased drought risk in California

    PubMed Central

    Diffenbaugh, Noah S.; Swain, Daniel L.; Touma, Danielle

    2015-01-01

    California is currently in the midst of a record-setting drought. The drought began in 2012 and now includes the lowest calendar-year and 12-mo precipitation, the highest annual temperature, and the most extreme drought indicators on record. The extremely warm and dry conditions have led to acute water shortages, groundwater overdraft, critically low streamflow, and enhanced wildfire risk. Analyzing historical climate observations from California, we find that precipitation deficits in California were more than twice as likely to yield drought years if they occurred when conditions were warm. We find that although there has not been a substantial change in the probability of either negative or moderately negative precipitation anomalies in recent decades, the occurrence of drought years has been greater in the past two decades than in the preceding century. In addition, the probability that precipitation deficits co-occur with warm conditions and the probability that precipitation deficits produce drought have both increased. Climate model experiments with and without anthropogenic forcings reveal that human activities have increased the probability that dry precipitation years are also warm. Further, a large ensemble of climate model realizations reveals that additional global warming over the next few decades is very likely to create ∼100% probability that any annual-scale dry period is also extremely warm. We therefore conclude that anthropogenic warming is increasing the probability of co-occurring warm–dry conditions like those that have created the acute human and ecosystem impacts associated with the “exceptional” 2012–2014 drought in California. PMID:25733875

  16. Evaluation of a Diffusion/Trapping Model for Hydrogen Ingress in High- Strength Alloys

    DTIC Science & Technology

    1992-10-01

    high-strength steels [3-5], precipitation-hardened and work-hardened nickel-base alloys [3-6], and titanium [7] and was shown to be effective in...other two alloys, Ti-13-11-3 was tested in the unaged and aged conditions to establish the role of the secondary α phase precipitated during aging... maraging steel, so it probably takes the form of reversible trapping [5,29]. Hence, grain boundaries are considered to be the most likely sites for

  17. Work-Related Depression in Primary Care Teams in Brazil.

    PubMed

    da Silva, Andréa Tenório Correia; Lopes, Claudia de Souza; Susser, Ezra; Menezes, Paulo Rossi

    2016-11-01

    To identify work-related factors associated with depressive symptoms and probable major depression in primary care teams. Cross-sectional study among primary care teams (community health workers, nursing assistants, nurses, and physicians) in the city of São Paulo, Brazil (2011-2012; n = 2940), to assess depressive symptoms and probable major depression and their associations with job strain and other work-related conditions. Community health workers presented a higher prevalence of probable major depression (18%) than other primary care workers. Higher odds ratios for depressive symptoms or probable major depression were associated with longer duration of employment in primary care; having a passive, active, or high-strain job; lack of supervisor feedback regarding performance; and low social support from colleagues and supervisors. Observed levels of job-related depression can endanger the sustainability of primary care programs. Public health implications: strategies are needed to deliver care to primary care workers with depression, facilitating diagnosis and access to treatment, particularly in low- and middle-income countries. Preventive interventions can include training managers to provide feedback and creating strategies to increase job autonomy and social support at work.

  18. Delayed and forgone care for families with chronic conditions in high-deductible health plans.

    PubMed

    Galbraith, Alison A; Soumerai, Stephen B; Ross-Degnan, Dennis; Rosenthal, Meredith B; Gay, Charlene; Lieu, Tracy A

    2012-09-01

    High-deductible health plans (HDHPs) are an increasingly common strategy to contain health care costs. Individuals with chronic conditions are at particular risk for increased out-of-pocket costs in HDHPs and resulting cost-related underuse of essential health care. To evaluate whether families with chronic conditions in HDHPs have higher rates of delayed or forgone care due to cost, compared with those in traditional health insurance plans. This mail and phone survey used multiple logistic regression to compare family-level rates of reporting delayed/forgone care in HDHPs vs. traditional plans. We selected families with children that had at least one member with a chronic condition. Families had employer-sponsored insurance in a Massachusetts health plan and >12 months of enrollment in an HDHP or a traditional plan. The primary outcome was report of any delayed or forgone care due to cost (acute care, emergency department visits, chronic care, checkups, or tests) for adults or children during the prior 12 months. Respondents included 208 families in HDHPs and 370 in traditional plans. Membership in an HDHP and lower income were each independently associated with higher probability of delayed/forgone care due to cost. For adult family members, the predicted probability of delayed/forgone care due to cost was higher in HDHPs than in traditional plans [40.0% vs 15.1% among families with incomes <400% of the federal poverty level (FPL) and 16.0% vs 4.8% among those with incomes ≥400% FPL]. Similar associations were observed for children. Among families with chronic conditions, reporting of delayed/forgone care due to cost is higher for both adults and children in HDHPs than in traditional plans. Families with lower incomes are also at higher risk for delayed/forgone care.

  19. Evaluation of permafrost conditions in non-alpine scree slopes in Central Europe by geophysical methods

    NASA Astrophysics Data System (ADS)

    Gude, M.; Hauck, C.; Kneisel, C.; Krause, S.; Molenda, R.; Ruzicka, V.; Zacharda, M.

    2003-04-01

    Many slope sections covered with blocky material in the Central European highlands display special microclimatic conditions that resemble those of high-latitude or high-altitude periglacial areas. In some of these screes even permafrost-like conditions are detected, although they are located at elevations well below 1000 m a.s.l. These conditions are accompanied by a circulation of air through the open void system, which effects the formation of an ice body during winter by re-sublimation of air humidity, supported by refreezing water from snow melt and precipitation. This ice body is assumed to prevail throughout the entire summer. Population genetic investigations on alpine and polar beetle species that inhabit the screes prove the continuous existence of these extraordinarily cool conditions, with probable permafrost, throughout the Holocene; i.e. these periglacial-like conditions are relatively stable despite all Holocene climatic variations. Observations of summer ice and numerous temperature measurements lead to the assumption of permafrost, with enduring ice in the open voids, as an integral factor of the microclimatic system. To evaluate its existence, the subsurface of seven European screes was investigated in early summer by means of DC resistivity and refraction seismics. In order to resolve the multi-phase subsurface structures, tomographic survey and inversion techniques are necessary, as 1-dimensional plane-layer approximations are usually invalid. The results clearly reveal seismic and resistivity anomalies, e.g. in the Klic scree (50°49'N, 14°04'E, base at 524 m a.s.l.) in Northern Bohemia. Within a blocky layer of about 10 m thickness, velocity anomalies (2000-3000 m/s) indicate the existence of a small ground ice body, which is confirmed by the synchronous detection of resistivity anomalies in the same place. These conditions are confined to steep scree slopes in regions with thin winter snow cover, which enables air circulation. It is probable that these effects are also found in alpine regions, where they would cause a significant depression of the current lower permafrost limit.

  20. Secondary School Students' Reasoning about Conditional Probability, Samples, and Sampling Procedures

    ERIC Educational Resources Information Center

    Prodromou, Theodosia

    2016-01-01

    In the Australian mathematics curriculum, Year 12 students (aged 16-17) are asked to solve conditional probability problems that involve the representation of the problem situation with two-way tables or three-dimensional diagrams and consider sampling procedures that result in different correct answers. In a small exploratory study, we…

  1. Combination of a Stressor-Response Model with a Conditional Probability Analysis Approach to Develop Candidate Criteria from Empirical Data

    EPA Science Inventory

    We show that a conditional probability analysis that utilizes a stressor-response model based on a logistic regression provides a useful approach for developing candidate water quality criteria from empirical data. The critical step in this approach is transforming the response ...

  2. CONDITIONAL PROBABILITY ANALYSIS APPROACH FOR IDENTIFYING BIOLOGICAL THRESHOLD OF IMPACT FOR SEDIMENTATION: APPLICATION TO FRESHWATER STREAMS IN OREGON COAST RANGE ECOREGION

    EPA Science Inventory

    A conditional probability analysis (CPA) approach has been developed for identifying biological thresholds of impact for use in the development of geographic-specific water quality criteria for protection of aquatic life. This approach expresses the threshold as the likelihood ...

  3. Racial/Ethnic and County-level Disparity in Inpatient Utilization among Hawai'i Medicaid Population.

    PubMed

    Siriwardhana, Chathura; Lim, Eunjung; Aggarwal, Lovedhi; Davis, James; Hixon, Allen; Chen, John J

    2018-05-01

    We investigated racial/ethnic and county-level disparities in inpatient utilization for 15 clinical conditions among Hawaii's Medicaid population. The study was conducted using inpatient claims data from more than 200,000 Hawai'i Medicaid beneficiaries, reported in the year 2010. The analysis was performed by stratifying the Medicaid population into three age groups: the child and adolescent group (1-20 years), the adult group (21-64 years), and the elderly group (65 years and above). Among the differences found, Asians had a low probability of inpatient admissions compared to Whites for many disease categories, while Native Hawaiian/Pacific Islanders had higher probabilities than Whites, across all age groups. Pediatric and adult groups from Hawai'i County (Big Island) had lower probabilities of inpatient admissions compared to Honolulu County (O'ahu) for most disease conditions, but higher probabilities were observed for several conditions in the elderly group. Notably, the elderly population residing in Kaua'i County (Kaua'i and Ni'ihau islands) had substantially increased odds of hospital admissions for several disease conditions, compared to Honolulu.

  4. Anomalous night-time peaks in diurnal variations of NmF2 close to the geomagnetic equator: a statistical study

    NASA Astrophysics Data System (ADS)

    Pavlov, Anatoli

    We present a study of anomalous night-time NmF2 peaks, ANNPs, observed by the La Paz, Natal, Djibouti, Kodaikanal, Madras, Manila, Talara, and Huancayo-Jicamarca ionosonde stations close to the geomagnetic equator. It is shown that the probabilities of occurrence of the first and second ANNPs depend on the geomagnetic longitude, and there is a longitude sector close to 110° geomagnetic longitude where the first and second ANNPs occur less frequently in comparison with the longitude regions located close to and below about 34° geomagnetic longitude and close to and above about 144° geomagnetic longitude. The observed frequencies of occurrence of the ANNPs increase with increasing solar activity, except at the Djibouti and Kodaikanal ionosonde stations, where the probability of the first ANNP occurrence is found to decrease with increasing solar activity from low (F10.7<100) to moderate (100≤F10.7≤170) solar activity, and except at the Natal ionosonde station, where the frequencies of occurrence of the first and second ANNPs decrease with increasing solar activity from moderate to high (F10.7>170) solar activity. We found that the occurrence probabilities of ANNPs during geomagnetically disturbed conditions are greater than those during geomagnetically quiet conditions. The calculated values of these probabilities have pronounced maxima in June (La Paz and Talara) and in July (Huancayo-Jicamarca) at the ionosonde stations located in the southern geographic hemisphere. The first ANNP is least frequently observed in January (La Paz, Talara, and Huancayo-Jicamarca), and the second ANNP is least frequently measured in January (La Paz and Huancayo-Jicamarca) and in December (Talara). In the northern geographic hemisphere, the studied probabilities are lowest in June (Djibouti and Madras), in July (Manila), and in April (Kodaikanal). The maxima in the probabilities of occurrence of the first and second ANNPs are found to be in September (Djibouti), in October (Madras), in November (Manila), and in December (Kodaikanal).

  5. Experiments with crystal deflectors for high energy ion beams: Electromagnetic dissociation probability for well channeled ions

    NASA Astrophysics Data System (ADS)

    Scandale, W.; Taratin, A. M.; Kovalenko, A. D.

    2013-01-01

    The paper presents the current status of the use of crystal deflectors for high energy ion beams. The channeling properties of multicharged ions are discussed. The results of the experiments on the deflection and extraction (collimation) of high energy ion beams with bent crystals performed at the accelerator centers are briefly considered. The analysis of the recent collimation experiment with Pb nuclei of 270 GeV/c per charge at the CERN Super Proton Synchrotron showed that the channeling efficiency was as large as about 90%. For Pb ions at LHC energies a new mechanism, which can reduce the channeling efficiency, appears. The electromagnetic dissociation (ED) becomes possible for well channeled particles. However, the estimations performed in the paper show that the ED probability is small and should not visibly reduce the collimation efficiency. On the other hand, the aligned crystal gives the possibility to study the ED processes of heavy nuclei under conditions in which nuclear interactions are fully suppressed.

  6. Portfolios in Stochastic Local Search: Efficiently Computing Most Probable Explanations in Bayesian Networks

    NASA Technical Reports Server (NTRS)

    Mengshoel, Ole J.; Roth, Dan; Wilkins, David C.

    2001-01-01

    Portfolio methods support the combination of different algorithms and heuristics, including stochastic local search (SLS) heuristics, and have been identified as a promising approach to solve computationally hard problems. While successful in experiments, theoretical foundations and analytical results for portfolio-based SLS heuristics are less developed. This article aims to improve the understanding of the role of portfolios of heuristics in SLS. We emphasize the problem of computing most probable explanations (MPEs) in Bayesian networks (BNs). Algorithmically, we discuss a portfolio-based SLS algorithm for MPE computation, Stochastic Greedy Search (SGS). SGS supports the integration of different initialization operators (or initialization heuristics) and different search operators (greedy and noisy heuristics), thereby enabling new analytical and experimental results. Analytically, we introduce a novel Markov chain model tailored to portfolio-based SLS algorithms including SGS, thereby enabling us to derive expected hitting time results that explain empirical run time results. For a specific BN, we show the benefit of using a homogeneous initialization portfolio. To further illustrate the portfolio approach, we consider novel additive search heuristics for handling determinism in the form of zero entries in conditional probability tables in BNs. Our additive approach adds rather than multiplies probabilities when computing the utility of an explanation. We motivate the additive measure by studying the dramatic impact of zero entries in conditional probability tables on the number of zero-probability explanations, which again complicates the search process. We consider the relationship between MAXSAT and MPE, and show that additive utility (or gain) is a generalization, to the probabilistic setting, of the MAXSAT utility (or gain) used in the celebrated GSAT and WalkSAT algorithms and their descendants. Utilizing our Markov chain framework, we show that expected hitting time is a rational function (i.e. a ratio of two polynomials) of the probability of applying an additive search operator. Experimentally, we report on synthetically generated BNs as well as BNs from applications, and compare SGS's performance to that of Hugin, which performs BN inference by compilation to and propagation in clique trees. On synthetic networks, SGS speeds up computation by approximately two orders of magnitude compared to Hugin. In application networks, our approach is highly competitive in Bayesian networks with a high degree of determinism. In addition to showing that stochastic local search can be competitive with clique tree clustering, our empirical results provide an improved understanding of the circumstances under which portfolio-based SLS outperforms clique tree clustering and vice versa.
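
    The contrast between multiplicative and additive explanation utilities can be shown with a toy example (hypothetical CPT values, not the SGS implementation): a single zero entry collapses the multiplicative score, while the additive score still discriminates between candidate explanations.

```python
import math

# Toy illustration (not the SGS implementation): each candidate explanation is
# scored by the CPT probabilities it selects; a zero entry (determinism) makes
# the multiplicative score collapse, while the additive score still ranks them.
explanation_a = [0.9, 0.7, 0.0, 0.8]      # hits one zero CPT entry
explanation_b = [0.6, 0.5, 0.1, 0.4]

for name, probs in (("A", explanation_a), ("B", explanation_b)):
    print(name, "multiplicative:", math.prod(probs), "additive:", sum(probs))
```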

  7. Epidemiology and social costs of hip fracture.

    PubMed

    Veronese, Nicola; Maggi, Stefania

    2018-04-20

    Hip fracture is an important and debilitating condition in older people, particularly in women. The epidemiological data varies between countries, but it is globally estimated that hip fractures will affect around 18% of women and 6% of men. Although the age-standardised incidence is gradually falling in many countries, this is far outweighed by the ageing of the population. Thus, the global number of hip fractures is expected to increase from 1.26 million in 1990 to 4.5 million by the year 2050. The direct costs associated with this condition are enormous since it requires a long period of hospitalisation and subsequent rehabilitation. Furthermore, hip fracture is associated with the development of other negative consequences, such as disability, depression, and cardiovascular diseases, with additional costs for society. In this review, we show the most recent epidemiological data regarding hip fracture, indicating the well-known risk factors and conditions that seem relevant for determining this condition. A specific part is dedicated to the social costs due to hip fracture. Although the costs of hip fracture are probably comparable to other common diseases with a high hospitalisation rate (e.g. cardiovascular disease), the other social costs (due to onset of new co-morbidities, sarcopenia, poor quality of life, disability and mortality) are probably greater. Copyright © 2018 Elsevier Ltd. All rights reserved.

  8. 49 CFR 173.50 - Class 1-Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... insensitive that there is very little probability of initiation or of transition from burning to detonation under normal conditions of transport. 1 The probability of transition from burning to detonation is... contain only extremely insensitive detonating substances and which demonstrate a negligible probability of...

  9. Probability Issues in without Replacement Sampling

    ERIC Educational Resources Information Center

    Joarder, A. H.; Al-Sabah, W. S.

    2007-01-01

    Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…
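
    A small worked example of the conditional-probability reasoning involved (illustrative, not taken from the article) checks the product-rule calculation for drawing two aces without replacement against brute-force enumeration:

```python
from itertools import permutations
from fractions import Fraction

# Worked example: probability that both cards drawn without replacement
# from a deck of {4 aces, 48 others} are aces.
# Product rule: P(A1) * P(A2 | A1)
p_product = Fraction(4, 52) * Fraction(3, 51)

# Brute-force check by enumerating ordered pairs of distinct cards.
deck = ["ace"] * 4 + ["other"] * 48
pairs = list(permutations(range(52), 2))
p_enum = Fraction(sum(deck[i] == "ace" and deck[j] == "ace" for i, j in pairs), len(pairs))

print(p_product, p_enum, p_product == p_enum)
```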

  10. Habitable periglacial landscapes in martian mid-latitudes

    NASA Astrophysics Data System (ADS)

    Ulrich, M.; Wagner, D.; Hauber, E.; de Vera, J.-P.; Schirrmeister, L.

    2012-05-01

    Subsurface permafrost environments on Mars are considered to be zones where extant life could have survived. For the identification of possible habitats it is important to understand periglacial landscape evolution and related subsurface and environmental conditions. Many landforms that are interpreted to be related to ground ice are located in the martian mid-latitudinal belts. This paper summarizes the insights gained from studies of terrestrial analogs to permafrost landforms on Mars. The potential habitability of martian mid-latitude periglacial landscapes is deduced, as an example, for one such landscape, that of Utopia Planitia, by a review and discussion of environmental conditions influencing periglacial landscape evolution. Based on recent calculations of the astronomical forcing of climate changes, specific climate periods are identified within the last 10 Ma when thaw processes and liquid water were probably important for the development of permafrost geomorphology. No periods could be identified within the last 4 Ma which met the suggested threshold criteria for liquid water and habitable conditions. Implications of past and present environmental conditions such as temperature variations, ground-ice conditions, and liquid water activity are discussed with respect to the potential survival of highly specialized microorganisms known from terrestrial permafrost. We conclude that possible habitable subsurface niches might have developed in close relation to specific permafrost landform morphology on Mars. These would probably have been dominated by lithoautotrophic microorganisms (i.e. methanogenic archaea).

  11. Dose estimation for nuclear power plant 4 accident in Taiwan at Fukushima nuclear meltdown emission level.

    PubMed

    Tang, Mei-Ling; Tsuang, Ben-Jei; Kuo, Pei-Hsuan

    2016-05-01

    An advanced Gaussian trajectory dispersion model is used to evaluate the evacuation zone due to a nuclear meltdown at Nuclear Power Plant 4 (NPP4) in Taiwan, assuming the same emission level as that which occurred at the Fukushima nuclear meltdown (FNM) in 2011. Our study demonstrates that an FNM-level emission would pollute 9% of the island's land area with an annual effective dose ≥50 mSv, using the meteorological data for 11 March 2011 in Taiwan. This high-dose area is also called the permanent evacuation zone (PEZ). Both the PEZ and the emergency-planning zone (EPZ) are found to be sensitive to the meteorological conditions on the day of the event. On a sunny day under the dominant NE wind, the EPZ can extend as far as 100 km, with a first 7-day dose ≥20 mSv. Three hundred sixty-five daily events, using meteorological data from 11 March 2011 to 9 March 2012, are evaluated. The mean fraction of Taiwan's land area falling within the PEZ is 11%. In particular, the probabilities of the northern counties/cities (Keelung, New Taipei, Taipei, Taoyuan, Hsinchu City, Hsinchu County and Ilan County) becoming PEZs are high, ranging from 15% in Ilan County to 51% in Keelung City. Note that the total population of these cities/counties is as high as 10 million people. Moreover, the western valleys of the Central Mountain Range, where all of the reservoirs in western Taiwan are located, are also likely to become PEZs. For example, the probability can be as high as 3% even at the far southern tip of Taiwan Island in Pingtung County. This shows that the entire population of western Taiwan could be at risk from a shortage of clean water sources under an event at the FNM emission level, especially during the NE monsoon period. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.

  12. Modulations of stratospheric ozone by volcanic eruptions

    NASA Technical Reports Server (NTRS)

    Blanchette, Christian; Mcconnell, John C.

    1994-01-01

    We have used a time series of aerosol surface area based on the measurements of Hofmann to investigate the modulation of total column ozone caused by the perturbation to gas-phase chemistry by the reaction N2O5(gas) + H2O(aero) yields 2HNO3(gas) on the surface of stratospheric aerosols. We have tested a range of values for its reaction probability, gamma = 0.02, 0.13, and 0.26, which we compared to unperturbed homogeneous chemistry. Our analysis spans the period from Jan. 1974 to Oct. 1994. The results suggest that if lower values of gamma are the norm, then we would expect larger ozone losses for highly enhanced aerosol content than for larger values of gamma. The ozone layer is more sensitive to the magnitude of the reaction probability under background conditions than during volcanically active periods. For most conditions, the conversion of NO2 to HNO3 is saturated for reaction probabilities in the range of laboratory measurements, but is only absolutely saturated following major volcanic eruptions, when the heterogeneous loss dominates the losses of N2O5. The ozone loss due to this heterogeneous reaction increases with increasing chlorine load. Total ozone losses calculated are comparable to ozone losses reported from TOMS and Dobson data.

  13. Effects of shifts in the rate of repetitive stimulation on sustained attention

    NASA Technical Reports Server (NTRS)

    Krulewitz, J. E.; Warm, J. S.; Wohl, T. H.

    1975-01-01

    The effects of shifts in the rate of presentation of repetitive neutral events (background event rate) were studied in a visual vigilance task. Four groups of subjects experienced either a high (21 events/min) or a low (6 events/min) event rate for 20 min and then experienced either the same or the alternate event rate for an additional 40 min. The temporal occurrence of critical target signals was identical for all groups, irrespective of event rate. The density of critical signals was 12 signals/20 min. By the end of the session, shifts in event rate were associated with changes in performance which resembled contrast effects found in other experimental situations in which shift paradigms were used. Relative to constant event rate control conditions, a shift from a low to a high event rate depressed the probability of signal detections, while a shift in the opposite direction enhanced the probability of signal detections.

  14. Screening for SNPs with Allele-Specific Methylation based on Next-Generation Sequencing Data

    PubMed Central

    Hu, Bo; Xu, Yaomin

    2013-01-01

    Allele-specific methylation (ASM) has long been studied but mainly documented in the context of genomic imprinting and X chromosome inactivation. Taking advantage of the next-generation sequencing technology, we conduct a high-throughput sequencing experiment with four prostate cell lines to survey the whole genome and identify single nucleotide polymorphisms (SNPs) with ASM. A Bayesian approach is proposed to model the counts of short reads for each SNP conditional on its genotypes of multiple subjects, leading to a posterior probability of ASM. We flag SNPs with high posterior probabilities of ASM by accounting for multiple comparisons based on posterior false discovery rates. Applying the Bayesian approach to the in-house prostate cell line data, we identify 269 SNPs as candidates of ASM. A simulation study is carried out to demonstrate the quantitative performance of the proposed approach. PMID:23710259
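
    A minimal sketch of the flagging step described above, assuming a vector of per-SNP posterior probabilities of ASM is already available (the Bayesian read-count model itself is not reproduced here); the function name and the toy posteriors are illustrative only. SNPs are declared ASM candidates for the largest set whose average posterior probability of no ASM stays below a target posterior false discovery rate.

```python
import numpy as np

def flag_by_posterior_fdr(posterior_asm, alpha=0.05):
    """Flag SNPs whose posterior probability of ASM is high while controlling
    the posterior false discovery rate at level alpha.

    posterior_asm : array of P(ASM | data) for each SNP (hypothetical input).
    Returns a boolean mask of flagged SNPs.
    """
    posterior_asm = np.asarray(posterior_asm, dtype=float)
    order = np.argsort(-posterior_asm)              # most confident first
    local_fdr = 1.0 - posterior_asm[order]          # P(no ASM | data)
    cum_fdr = np.cumsum(local_fdr) / np.arange(1, len(order) + 1)
    n_flag = int(np.sum(cum_fdr <= alpha))          # largest set with posterior FDR <= alpha
    mask = np.zeros(posterior_asm.shape, dtype=bool)
    mask[order[:n_flag]] = True
    return mask

# Toy usage with made-up posterior probabilities.
probs = np.array([0.99, 0.97, 0.90, 0.60, 0.20])
print(flag_by_posterior_fdr(probs, alpha=0.05))
```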

  15. Activated recombinative desorption: A potential component in mechanisms of spacecraft glow

    NASA Technical Reports Server (NTRS)

    Cross, J. B.

    1985-01-01

    The concept of activated recombination of atomic species on surfaces can explain the production of vibrationally and translationally excited desorbed molecular species. Equilibrium statistical mechanics predicts that the molecular quantum state distributions of desorbing molecules are a function of surface temperature only when the adsorption probability is unity and independent of initial collision conditions. In most cases, the adsorption probability is dependent upon initial conditions such as collision energy or the internal quantum state distribution of impinging molecules. From detailed balance, such dynamical behavior is reflected in the internal quantum state distribution of the desorbing molecule. This concept, activated recombinative desorption, may offer a common thread in proposed mechanisms of spacecraft glow. Using molecular beam techniques and equipment available at Los Alamos, which includes a high translational energy O-atom beam source, mass spectrometric detection of desorbed species, chemiluminescence/laser induced fluorescence detection of electronic and vibrationally excited reaction products, and Auger detection of surface adsorbed reaction products, a fundamental study of the gas-surface chemistry underlying the glow process is proposed.

  16. The Effect of Leisure-Time Physical Activity on Obesity, Diabetes, High BP and Heart Disease Among Canadians: Evidence from 2000/2001 to 2005/2006.

    PubMed

    Sarma, Sisira; Devlin, Rose Anne; Gilliland, Jason; Campbell, M Karen; Zaric, Gregory S

    2015-12-01

    Although studies have looked at the effect of physical activity on obesity and other health outcomes, the causal nature of this relationship remains unclear. We fill this gap by investigating the impact of leisure-time physical activity (LTPA) and work-related physical activity (WRPA) on obesity and chronic conditions in Canadians aged 18-75 using instrumental variable and recursive bivariate probit approaches. Average local temperatures surrounding the respondents' interview month are used as a novel instrument to help identify the causal relationship between LTPA and health outcomes. We find that an active level of LTPA (i.e., walking ≥1 h/day) reduces the probability of obesity by five percentage points, which increases to 11 percentage points if also combined with some WRPA. WRPA exhibits a negative effect on the probability of obesity and chronic conditions. Copyright © 2014 John Wiley & Sons, Ltd.

  17. Race, unemployment rate, and chronic mental illness: a 15-year trend analysis.

    PubMed

    Lo, Celia C; Cheng, Tyrone C

    2014-07-01

    Before abating, the recession of the first decade of this century doubled the US unemployment rate. High unemployment is conceptualized as a stressor having serious effects on individuals' mental health. Data from surveys administered repeatedly over 15 years (1997-2011) described changes over time in the prevalence of chronic mental illness among US adults. The data allowed us to pinpoint changes characterizing the White majority--but not Black, Hispanic, or Asian minorities--and to ask whether such changes were attributable to economic conditions (measured via national unemployment rates). We combined 1.5 decades' worth of National Health Interview Survey data in one secondary analysis. We took social structural and demographic factors into account and let the adjusted probability of chronic mental illness indicate prevalence of chronic mental illness. We observed, as a general trend, that chronic mental illness probability increased as the unemployment rate rose. A greater increase in probability was observed for Blacks than Whites, notably during 2007-2011, the heart of the recession. Our results confirmed that structural risk posed by the recent recession and by vulnerability to the recession's effects was differentially linked to Blacks. This led to the group's high probability of chronic mental illness, observed even when individual-level social structural and demographic factors were controlled. Future research should specify the particular kinds of vulnerability that created the additional disadvantage experienced by Black respondents.

  18. Role of the site of synaptic competition and the balance of learning forces for Hebbian encoding of probabilistic Markov sequences

    PubMed Central

    Bouchard, Kristofer E.; Ganguli, Surya; Brainard, Michael S.

    2015-01-01

    The majority of distinct sensory and motor events occur as temporally ordered sequences with rich probabilistic structure. Sequences can be characterized by the probability of transitioning from the current state to upcoming states (forward probability), as well as the probability of having transitioned to the current state from previous states (backward probability). Despite the prevalence of probabilistic sequencing of both sensory and motor events, the Hebbian mechanisms that mold synapses to reflect the statistics of experienced probabilistic sequences are not well understood. Here, we show through analytic calculations and numerical simulations that Hebbian plasticity (correlation, covariance, and STDP) with pre-synaptic competition can develop synaptic weights equal to the conditional forward transition probabilities present in the input sequence. In contrast, post-synaptic competition can develop synaptic weights proportional to the conditional backward probabilities of the same input sequence. We demonstrate that to stably reflect the conditional probability of a neuron's inputs and outputs, local Hebbian plasticity requires balance between competitive learning forces that promote synaptic differentiation and homogenizing learning forces that promote synaptic stabilization. The balance between these forces dictates a prior over the distribution of learned synaptic weights, strongly influencing both the rate at which structure emerges and the entropy of the final distribution of synaptic weights. Together, these results demonstrate a simple correspondence between the biophysical organization of neurons, the site of synaptic competition, and the temporal flow of information encoded in synaptic weights by Hebbian plasticity while highlighting the utility of balancing learning forces to accurately encode probability distributions, and prior expectations over such probability distributions. PMID:26257637
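
    The correspondence between pre-synaptic competition and forward transition probabilities can be illustrated with a deliberately simplified simulation (not the authors' model): a correlation-type Hebbian rule accumulates co-activations of successive one-hot states, and normalizing the synapses that share a pre-synaptic neuron (a crude stand-in for pre-synaptic competition) leaves weights that approximate P(next state | current state).

```python
import numpy as np

rng = np.random.default_rng(0)

# A hypothetical 3-state Markov chain with forward transition matrix P[i, j] = P(next=j | current=i).
P = np.array([[0.1, 0.6, 0.3],
              [0.5, 0.2, 0.3],
              [0.3, 0.3, 0.4]])
K = P.shape[0]

# Generate a long state sequence from the chain.
seq = [0]
for _ in range(20000):
    seq.append(rng.choice(K, p=P[seq[-1]]))

# Correlation-type Hebbian rule: W[j, i] accumulates co-activation of presynaptic
# state i followed by postsynaptic state j; normalizing each presynaptic column
# (synapses competing for the same presynaptic neuron) yields conditional
# forward transition probabilities.
W = np.zeros((K, K))
for t in range(len(seq) - 1):
    pre = np.eye(K)[seq[t]]        # one-hot current state (presynaptic activity)
    post = np.eye(K)[seq[t + 1]]   # one-hot next state (postsynaptic activity)
    W += np.outer(post, pre)
W /= W.sum(axis=0, keepdims=True)  # presynaptic competition: each column sums to 1

print(np.round(W.T, 2))            # rows approximate P, the forward transition matrix
```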

  19. Bivariate categorical data analysis using normal linear conditional multinomial probability model.

    PubMed

    Sun, Bingrui; Sutradhar, Brajendra

    2015-02-10

    Bivariate multinomial data such as the left and right eyes retinopathy status data are analyzed either by using a joint bivariate probability model or by exploiting certain odds ratio-based association models. However, the joint bivariate probability model yields marginal probabilities, which are complicated functions of marginal and association parameters for both variables, and the odds ratio-based association model treats the odds ratios involved in the joint probabilities as 'working' parameters, which are consequently estimated through certain arbitrary 'working' regression models. Also, this latter odds ratio-based model does not provide any easy interpretation of the correlations between two categorical variables. On the basis of pre-specified marginal probabilities, in this paper, we develop a bivariate normal type linear conditional multinomial probability model to understand the correlations between two categorical variables. The parameters involved in the model are consistently estimated using the optimal likelihood and generalized quasi-likelihood approaches. The proposed model and the inferences are illustrated through an intensive simulation study as well as an analysis of the well-known Wisconsin Diabetic Retinopathy status data. Copyright © 2014 John Wiley & Sons, Ltd.

  20. Assessing the Relationship Between Chronic Health Conditions and Productivity Loss Trajectories

    PubMed Central

    Pranksy, Glenn

    2014-01-01

    Objective: To examine the relationship between health conditions and the risk for membership in longitudinal trajectories of productivity loss. Methods: Trajectories of productivity loss from the ages of 25 to 44 years, previously identified in the National Longitudinal Survey of Youth (NLSY79), were combined with information on health conditions from the age 40 years health module in the NLSY79. Multinomial logistic regression was used to examine the relative risk of being in the low-risk, early-onset increasing risk, late-onset increasing risk, or high-risk trajectories compared with the no-risk trajectory for having various health conditions. Results: The trajectories with the greatest probability of productivity loss longitudinally had a greater prevalence of the individual health conditions and a greater total number of health conditions experienced. Conclusions: Health conditions are associated with specific longitudinal patterns of experiencing productivity loss. PMID:25479294

  1. Brain mechanisms of emotions.

    PubMed

    Simonov, P V

    1997-01-01

    At the 23rd International Congress of Physiology Sciences (Tokyo, 1965) the results of experiments led us to the conclusion that emotions were determined by the actual need and estimation of probability (possibility) of its satisfaction. Low probability of need satisfaction leads to negative emotions actively minimized by the subject. Increased probability of satisfaction, as compared to the earlier forecast, generates positive emotions which the subject tries to maximize, that is, to enhance, to prolong, to repeat. We named our concept the Need-Informational Theory of Emotions. According to this theory, motivation, emotion, and estimation of probability have different neuromorphological substrates. Activation through the hypothalamic motivatiogenic structures of the frontal parts of the neocortex orients the behavior to signals with a high probability of their reinforcement. At the same time the hippocampus is necessary for reactions to signals of low probability events, which are typical for the emotionally excited brain. By comparison of motivational excitation with available stimuli or their engrams, the amygdala selects a dominant motivation, destined to be satisfied in the first instance. In the cases of classical conditioning and escape reaction the reinforcement was related to involvement of the negative emotion's hypothalamic neurons, while in the course of avoidance reaction the positive emotion's neurons were involved. The role of the left and right frontal neocortex in the appearance of positive or negative emotions depends on these informational (cognitive) functions.

  2. [The brain mechanisms of emotions].

    PubMed

    Simonov, P V

    1997-01-01

    At the 23rd International Congress of Physiological Sciences (Tokyo, 1965), the results of experiments brought us to the conclusion that emotions are determined by the actual need and the estimation of probability (possibility) of its satisfaction. Low probability of need satisfaction leads to negative emotions actively minimized by the subject. Increased probability of satisfaction, as compared to the earlier forecast, generates positive emotions which the subject tries to maximize, that is, to enhance, to prolong, to repeat. We named our concept the Need-Informational Theory of Emotions. According to this theory, motivation, emotion, and estimation of probability have different neuromorphological substrates. Activation of the frontal parts of the neocortex by motivatiogenic structures of the hypothalamus orients behavior toward signals with a high probability of reinforcement. At the same time, the hippocampus is necessary for reactions to signals of low-probability events, which are typical for the emotionally excited brain. By comparing motivational excitation with available stimuli or their engrams, the amygdala selects a dominant motivation destined to be satisfied in the first instance. In the cases of classical conditioning and escape reactions, reinforcement was related to involvement of the hypothalamic neurons of negative emotion, while in the course of avoidance reactions the neurons of positive emotion were involved. The role of the left and right frontal neocortex in the appearance of positive or negative emotions depends on these informational (cognitive) functions.

  3. Approximation of Failure Probability Using Conditional Sampling

    NASA Technical Reports Server (NTRS)

    Giesy. Daniel P.; Crespo, Luis G.; Kenney, Sean P.

    2008-01-01

    In analyzing systems which depend on uncertain parameters, one technique is to partition the uncertain parameter domain into a failure set and its complement, and judge the quality of the system by estimating the probability of failure. If this is done by a sampling technique such as Monte Carlo and the probability of failure is small, accurate approximation can require so many sample points that the computational expense is prohibitive. Previous work of the authors has shown how to bound the failure event by sets of such simple geometry that their probabilities can be calculated analytically. In this paper, it is shown how to make use of these failure bounding sets and conditional sampling within them to substantially reduce the computational burden of approximating failure probability. It is also shown how the use of these sampling techniques improves the confidence intervals for the failure probability estimate for a given number of sample points and how they reduce the number of sample point analyses needed to achieve a given level of confidence.
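
    The following toy sketch illustrates the general idea on a two-parameter example rather than the authors' aerospace application: the failure region is enclosed in a simple bounding box whose probability is known analytically, samples are drawn only inside the box to estimate the conditional failure probability, and the two factors are multiplied. All names and thresholds are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy system: 2 uncertain parameters ~ Uniform(0, 1); "failure" when both exceed 0.95.
def fails(x):
    return np.all(x > 0.95, axis=-1)

# Bounding set containing the failure region: the box [0.9, 1]^2.
# Its probability under the uniform distribution is known analytically.
lo = 0.9
p_box = (1.0 - lo) ** 2                       # P(sample falls in the box) = 0.01

# Conditional sampling: draw points only inside the bounding box.
n = 10000
x_cond = rng.uniform(lo, 1.0, size=(n, 2))
p_fail_given_box = fails(x_cond).mean()       # far fewer wasted samples

p_fail = p_box * p_fail_given_box             # total probability rule
print(f"conditional estimate: {p_fail:.5f} (exact 0.0025)")

# Naive Monte Carlo for comparison: most samples never land near the failure set.
x_naive = rng.uniform(0.0, 1.0, size=(n, 2))
print(f"naive estimate:       {fails(x_naive).mean():.5f}")
```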

  4. How long do the dead survive on the road? Carcass persistence probability and implications for road-kill monitoring surveys.

    PubMed

    Santos, Sara M; Carvalho, Filipe; Mira, António

    2011-01-01

    Road mortality is probably the best-known and most visible impact of roads upon wildlife. Although several factors influence road-kill counts, carcass persistence time is considered the most important determinant underlying underestimates of road mortality. The present study aims to describe and model carcass persistence variability on the road for different taxonomic groups under different environmental conditions throughout the year; and also to assess the effect of sampling frequency on the relative variation in road-kill estimates registered within a survey. Daily surveys of road-killed vertebrates were conducted over one year along four road sections with different traffic volumes. Survival analysis was then used to i) describe carcass persistence timings overall and for specific animal groups; ii) assess optimal sampling designs according to research objectives; and iii) model the influence of road, animal and weather factors on carcass persistence probabilities. Most animal carcasses persisted on the road for the first day only, with some groups disappearing at very high rates. The advisable periodicity of road monitoring that minimizes bias in road mortality estimates is daily monitoring for bats (in the morning) and lizards (in the afternoon), daily monitoring for toads, small birds, small mammals, snakes, salamanders, and lagomorphs; 1 day-interval (alternate days) for large birds, birds of prey, hedgehogs, and freshwater turtles; and 2 day-interval for carnivores. Multiple factors influenced the persistence probabilities of vertebrate carcasses on the road. Overall, persistence was much lower for small animals, on roads with lower traffic volumes, for carcasses located on road lanes, and under humid conditions in the wet season and high temperatures in the dry season, respectively. The guidance given here on monitoring frequencies is particularly relevant to provide conservation and transportation agencies with accurate numbers of road-kills, realistic mitigation measures, and detailed designs for road monitoring programs.
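
    As an illustration of the survival-analysis step, the sketch below computes a Kaplan-Meier estimate of carcass persistence probability from hypothetical daily-survey data (removal day and a censoring flag); the numbers are invented, and the estimator is written out explicitly rather than taken from the authors' analysis.

```python
import numpy as np

def kaplan_meier(durations, event_observed):
    """Kaplan-Meier estimate of the probability that a carcass still persists
    after t days. durations: day the carcass disappeared (or was last seen,
    if censored); event_observed: 1 if removal was observed, 0 if censored."""
    durations = np.asarray(durations, float)
    event_observed = np.asarray(event_observed, int)
    times = np.unique(durations[event_observed == 1])
    surv, s = [], 1.0
    for t in times:
        at_risk = np.sum(durations >= t)
        removed = np.sum((durations == t) & (event_observed == 1))
        s *= 1.0 - removed / at_risk
        surv.append((t, s))
    return surv

# Hypothetical daily-survey data: most carcasses gone after day 1.
days   = [1, 1, 1, 2, 2, 3, 5, 7, 7, 10]
events = [1, 1, 1, 1, 1, 1, 1, 0, 1, 0]   # 0 = still present at last check (censored)
for t, s in kaplan_meier(days, events):
    print(f"P(persist > {int(t)} d) = {s:.2f}")
```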

  5. How Long Do the Dead Survive on the Road? Carcass Persistence Probability and Implications for Road-Kill Monitoring Surveys

    PubMed Central

    Santos, Sara M.; Carvalho, Filipe; Mira, António

    2011-01-01

    Background Road mortality is probably the best-known and most visible impact of roads upon wildlife. Although several factors influence road-kill counts, carcass persistence time is considered the most important determinant underlying underestimates of road mortality. The present study aims to describe and model carcass persistence variability on the road for different taxonomic groups under different environmental conditions throughout the year; and also to assess the effect of sampling frequency on the relative variation in road-kill estimates registered within a survey. Methodology/Principal Findings Daily surveys of road-killed vertebrates were conducted over one year along four road sections with different traffic volumes. Survival analysis was then used to i) describe carcass persistence timings overall and for specific animal groups; ii) assess optimal sampling designs according to research objectives; and iii) model the influence of road, animal and weather factors on carcass persistence probabilities. Most animal carcasses persisted on the road for the first day only, with some groups disappearing at very high rates. The advisable periodicity of road monitoring that minimizes bias in road mortality estimates is daily monitoring for bats (in the morning) and lizards (in the afternoon), daily monitoring for toads, small birds, small mammals, snakes, salamanders, and lagomorphs; 1 day-interval (alternate days) for large birds, birds of prey, hedgehogs, and freshwater turtles; and 2 day-interval for carnivores. Multiple factors influenced the persistence probabilities of vertebrate carcasses on the road. Overall, persistence was much lower for small animals, on roads with lower traffic volumes, for carcasses located on road lanes, and under humid conditions in the wet season and high temperatures in the dry season, respectively. Conclusion/Significance The guidance given here on monitoring frequencies is particularly relevant to provide conservation and transportation agencies with accurate numbers of road-kills, realistic mitigation measures, and detailed designs for road monitoring programs. PMID:21980437

  6. Seasonal Variability of Middle Latitude Ozone in the Lowermost Stratosphere Derived from Probability Distribution Functions

    NASA Technical Reports Server (NTRS)

    Rood, Richard B.; Douglass, Anne R.; Cerniglia, Mark C.; Sparling, Lynn C.; Nielsen, J. Eric

    1999-01-01

    We present a study of the distribution of ozone in the lowermost stratosphere with the goal of characterizing the observed variability. The air in the lowermost stratosphere is divided into two population groups based on Ertel's potential vorticity at 300 hPa. High (low) potential vorticity at 300 hPa indicates that the tropopause is low (high), and the identification of these two groups is made to account for the dynamic variability. Conditional probability distribution functions are used to define the statistics of the ozone distribution from both observations and a three-dimensional model simulation using winds from the Goddard Earth Observing System Data Assimilation System for transport. Ozone data sets include ozonesonde observations from northern midlatitude stations (1991-96) and midlatitude observations made by the Halogen Occultation Experiment (HALOE) on the Upper Atmosphere Research Satellite (UARS) (1994-1998). The conditional probability distribution functions are calculated at a series of potential temperature surfaces spanning the domain from the midlatitude tropopause to surfaces higher than the mean tropical tropopause (approximately 380 K). The probability distribution functions are similar for the two data sources, despite differences in horizontal and vertical resolution and spatial and temporal sampling. Comparisons with the model demonstrate that the model maintains a mix of air in the lowermost stratosphere similar to the observations. The model also simulates a realistic annual cycle. Results show that during summer, much of the observed variability is explained by the height of the tropopause. During the winter and spring, when the tropopause fluctuations are larger, less of the variability is explained by tropopause height. This suggests that more mixing occurs during these seasons. During all seasons, there is a transition zone near the tropopause that contains air characteristic of both the troposphere and the stratosphere. The relevance of the results to the assessment of the environmental impact of aircraft effluent is also discussed.
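
    A schematic version of the conditioning described above, using synthetic numbers rather than ozonesonde or HALOE data: observations are split into two populations by a potential-vorticity threshold at 300 hPa, and a separate (conditional) ozone probability distribution function is computed for each. The threshold and distributions are placeholders.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic stand-in data: potential vorticity (PVU) at 300 hPa and ozone (ppbv)
# on one potential-temperature surface. High PV (low tropopause) is paired with
# more stratospheric (higher) ozone values; low PV with tropospheric values.
pv = rng.gamma(shape=2.0, scale=1.5, size=5000)
ozone = np.where(pv > 3.0,
                 rng.normal(400.0, 80.0, 5000),
                 rng.normal(120.0, 50.0, 5000))
ozone = np.clip(ozone, 0.0, None)

# Conditional probability distribution functions of ozone given the PV class.
bins = np.linspace(0.0, 700.0, 36)
for label, mask in [("high PV at 300 hPa (low tropopause)", pv > 3.0),
                    ("low PV at 300 hPa (high tropopause)", pv <= 3.0)]:
    pdf, edges = np.histogram(ozone[mask], bins=bins, density=True)
    modal_bin = edges[np.argmax(pdf)]
    print(f"{label}: n={int(mask.sum())}, modal ozone bin near {modal_bin:.0f} ppbv")
```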

  7. Prioritizing forest fuels treatments based on the probability of high-severity fire restores adaptive capacity in Sierran forests

    Treesearch

    Daniel J. Krofcheck; Matthew D. Hurteau; Robert M. Scheller; E. Louise Loudermilk

    2017-01-01

    In frequent fire forests of the western United States, a legacy of fire suppression coupled with increases in fire weather severity have altered fire regimes and vegetation dynamics. When coupled with projected climate change, these conditions have the potential to lead to vegetation type change and altered carbon (C) dynamics. In the Sierra Nevada, fuels...

  8. Pedigrees, Prizes, and Prisoners: The Misuse of Conditional Probability

    ERIC Educational Resources Information Center

    Carlton, Matthew A.

    2005-01-01

    We present and discuss three examples of misapplication of the notion of conditional probability. In each example, we present the problem along with a published and/or well-known incorrect--but seemingly plausible--solution. We then give a careful treatment of the correct solution, in large part to show how careful application of basic probability…

  9. Unsolved Problems in Evolutionary Theory

    DTIC Science & Technology

    1967-01-01

    finding the probability of survival of a single new mutant). Most natural populations probably satisfy these conditions, as is illustrated by the ... perturbations (y_kl) of small quantities adding to zero. Then, under suitable conditions on the function f(x), (3) x_i + y_{i,t+1} = f_i(x) + Σ_j y_{j,t} ∂f_i/∂x_j + O(y_t²) ... It is clear that a sufficient condition for the point x to be locally stable is that all the roots of the matrix (4) (a_ij) = ∂f_i/∂x_j should have moduli

  10. High monetary reward rates and caloric rewards decrease temporal persistence

    PubMed Central

    Bode, Stefan; Murawski, Carsten

    2017-01-01

    Temporal persistence refers to an individual's capacity to wait for future rewards, while forgoing possible alternatives. This requires a trade-off between the potential value of delayed rewards and opportunity costs, and is relevant to many real-world decisions, such as dieting. Theoretical models have previously suggested that high monetary reward rates, or positive energy balance, may result in decreased temporal persistence. In our study, 50 fasted participants engaged in a temporal persistence task, incentivised with monetary rewards. In alternating blocks of this task, rewards were delivered at delays drawn randomly from distributions with either a lower or higher maximum reward rate. During some blocks participants received either a caloric drink or water. We used survival analysis to estimate participants' probability of quitting conditional on the delay distribution and the consumed liquid. Participants had a higher probability of quitting in blocks with the higher reward rate. Furthermore, participants who consumed the caloric drink had a higher probability of quitting than those who consumed water. Our results support the predictions from the theoretical models, and importantly, suggest that both higher monetary reward rates and physiologically relevant rewards can decrease temporal persistence, which is a crucial determinant for survival in many species. PMID:28228517

  11. High monetary reward rates and caloric rewards decrease temporal persistence.

    PubMed

    Fung, Bowen J; Bode, Stefan; Murawski, Carsten

    2017-02-22

    Temporal persistence refers to an individual's capacity to wait for future rewards, while forgoing possible alternatives. This requires a trade-off between the potential value of delayed rewards and opportunity costs, and is relevant to many real-world decisions, such as dieting. Theoretical models have previously suggested that high monetary reward rates, or positive energy balance, may result in decreased temporal persistence. In our study, 50 fasted participants engaged in a temporal persistence task, incentivised with monetary rewards. In alternating blocks of this task, rewards were delivered at delays drawn randomly from distributions with either a lower or higher maximum reward rate. During some blocks participants received either a caloric drink or water. We used survival analysis to estimate participants' probability of quitting conditional on the delay distribution and the consumed liquid. Participants had a higher probability of quitting in blocks with the higher reward rate. Furthermore, participants who consumed the caloric drink had a higher probability of quitting than those who consumed water. Our results support the predictions from the theoretical models, and importantly, suggest that both higher monetary reward rates and physiologically relevant rewards can decrease temporal persistence, which is a crucial determinant for survival in many species. © 2017 The Authors.

  12. Kidnapping of chicks in emperor penguins: a hormonal by-product?

    PubMed

    Angelier, Frédéric; Barbraud, Christophe; Lormée, Hervé; Prud'homme, François; Chastel, Olivier

    2006-04-01

    The function and causes of kidnapping juveniles are little understood, because individuals sustain breeding costs to rear unrelated offspring. Here we focus on the proximal causes of this behaviour in emperor penguins (Aptenodytes forsteri), whose failed breeders often kidnap chicks. We experimentally tested the hypothesis that kidnapping behaviour was the result of high residual levels of prolactin (PRL), a hormone involved in parental behaviour. Penguins whose PRL levels were artificially decreased by bromocriptine administration kidnapped chicks less often than control penguins. Within the bromocriptine-treated group, kidnapping behaviour was not totally suppressed, and the probability of kidnapping a chick was positively correlated with PRL levels measured before treatment. During breeding, emperor penguins have to forage in remote ice-free areas. In these birds, PRL secretion is poorly influenced by chick stimuli and has probably evolved to maintain a willingness to return to the colony after a long absence at sea. Therefore, penguins that have lost their chick during a foraging trip still maintain high residual PRL levels and this, combined with colonial breeding, probably facilitates kidnapping. We suggest that kidnapping in non-cooperative systems may result from a hormonal by-product of a reproductive adaptation to extreme conditions.

  13. The demand for preventive and restorative dental services.

    PubMed

    Meyerhoefer, Chad D; Zuvekas, Samuel H; Manski, Richard

    2014-01-01

    Chronic tooth decay is the most common chronic condition in the United States among children ages 5-17 and also affects a large percentage of adults. Oral health conditions are preventable, but less than half of the US population uses dental services annually. We seek to examine the extent to which limited dental coverage and high out-of-pocket costs reduce dental service use by the nonelderly privately insured and uninsured. Using data from the 2001-2006 Medical Expenditure Panel Survey and an American Dental Association survey of dental procedure prices, we jointly estimate the probability of using preventive and both basic and major restorative services through a correlated random effects specification that controls for endogeneity. We found that dental coverage increased the probability of preventive care use by 19% and the use of restorative services by 11% to 16%. Both conditional and unconditional on dental coverage, the use of dental services was not sensitive to out-of-pocket costs. We conclude that dental coverage is an important determinant of preventive dental service use, but other nonprice factors related to consumer preferences, especially education, are equally strong, if not stronger, determinants. Copyright © 2013 John Wiley & Sons, Ltd.

  14. [Significance of motivation balance for a choice of dog's behavior under conditions of environmental uncertainty].

    PubMed

    Chilingarian, L I; Grigor'ian, G A

    2007-01-01

    Two experimental models involving a choice between two reinforcements were used to assess individual typological features of dogs. In the first model, dogs chose between homogeneous food reinforcements: a less valuable reinforcement delivered constantly and a more valuable reinforcement delivered with low probability. In the second model, the dogs chose between heterogeneous reinforcements: performing alimentary versus defensive reactions. As uncertainty rose owing to a decrease in the probability of obtaining the valuable food, two dogs continued to prefer the valuable reinforcement, while the third animal gradually shifted its behavior from the choice of a highly valuable but infrequent reward to a less valuable but easily obtained reinforcement. When choosing between the valuable food reinforcement and avoidance of electrocutaneous stimulation, the first two dogs preferred food, whereas the third animal, which had previously been oriented to the choice of the low-value constant reinforcement, steadily preferred the avoidance behavior. The data obtained are consistent with the hypothesis that the individual typological characteristics of animal (and human) behavior substantially depend on two parameters: the extent of environmental uncertainty and subjective features of reinforcement assessment.

  15. Does red noise increase or decrease extinction risk? Single extreme events versus series of unfavorable conditions.

    PubMed

    Schwager, Monika; Johst, Karin; Jeltsch, Florian

    2006-06-01

    Recent theoretical studies have shown contrasting effects of temporal correlation of environmental fluctuations (red noise) on the risk of population extinction. It is still debated whether and under which conditions red noise increases or decreases extinction risk compared with uncorrelated (white) noise. Here, we explain the opposing effects by introducing two features of red noise time series. On the one hand, positive autocorrelation increases the probability of series of poor environmental conditions, implying increasing extinction risk. On the other hand, for a given time period, the probability of at least one extremely bad year ("catastrophe") is reduced compared with white noise, implying decreasing extinction risk. Which of these two features determines extinction risk depends on the strength of environmental fluctuations and the sensitivity of population dynamics to these fluctuations. If extreme (catastrophic) events can occur (strong noise) or sensitivity is high (overcompensatory density dependence), then temporal correlation decreases extinction risk; otherwise, it increases it. Thus, our results provide a simple explanation for the contrasting previous findings and are a crucial step toward a general understanding of the effect of noise color on extinction risk.
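
    The two competing features can be made concrete with a small simulation, assuming AR(1) noise scaled to a fixed marginal variance (all thresholds and parameters are arbitrary): positive autocorrelation raises the chance of a run of consecutive poor years while lowering the chance of at least one extreme year within a fixed horizon.

```python
import numpy as np

rng = np.random.default_rng(3)

def ar1_series(n, rho, sigma=1.0):
    """AR(1) noise with autocorrelation rho, scaled to unit marginal variance."""
    eps = rng.normal(0.0, sigma * np.sqrt(1.0 - rho**2), n)
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = rho * x[t - 1] + eps[t]
    return x

def run_and_extreme_probs(rho, n_years=50, n_rep=2000, poor=-0.5, extreme=-2.5, run_len=3):
    runs, extremes = 0, 0
    for _ in range(n_rep):
        x = ar1_series(n_years, rho)
        poor_years = x < poor
        # Any run of `run_len` consecutive poor years in this replicate?
        has_run = any(poor_years[i:i + run_len].all() for i in range(n_years - run_len + 1))
        runs += has_run
        extremes += (x < extreme).any()     # at least one catastrophic year
    return runs / n_rep, extremes / n_rep

for rho in (0.0, 0.7):                      # white vs red noise
    p_run, p_ext = run_and_extreme_probs(rho)
    print(f"rho={rho}: P(>=3 poor years in a row)={p_run:.2f}, P(>=1 extreme year)={p_ext:.2f}")
```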

  16. A translational velocity command system for VTOL low speed flight

    NASA Technical Reports Server (NTRS)

    Merrick, V. K.

    1982-01-01

    A translational velocity flight controller, suitable for very low speed maneuvering, is described and its application to a large class of VTOL aircraft from jet lift to propeller driven types is analyzed. Estimates for the more critical lateral axis lead to the conclusion that the controller would provide a jet lift (high disk loading) VTOL aircraft with satisfactory "hands off" station keeping in operational conditions more stringent than any specified in current or projected requirements. It also seems likely that ducted fan or propeller driven (low disk loading) VTOL aircraft would have acceptable hovering handling qualities even in high turbulence, although in these conditions pilot intervention to maintain satisfactory station keeping would probably be required for landing in restricted areas.

  17. Hospitalisations among seafarers on merchant ships

    PubMed Central

    Hansen, H; Tuchsen, F; Hannerz, H

    2005-01-01

    Aims: To study morbidity among active seafarers in the merchant navy in order to clarify possible work related morbidity and the morbidity related to work and lifestyle where possible preventive measures may be initiated. Methods: From a register in the Danish Maritime Authority a cohort of Danish merchant seafarers who had been actively employed at sea in 1995 was identified. For each seafarer, information on all employment periods at sea, charge aboard, and ship was available. The cohort was linked with the National In-patient Register in Denmark. Standardised hospitalisation ratios (SHRs) were calculated for all major diagnostic groups using all gainfully employed as reference. Results: Seafarers were shown to be inhomogeneous, with significant differences in SHRs for the same disease groups between different groups of seafarers depending on charge and ship type. SHRs for lifestyle related diseases were high, although rates for acute conditions, such as acute myocardial infarction, were low, probably due to referral bias, as acute conditions are likely to cause hospitalisation abroad, and thus are not included in the study. SHRs for injury and poisoning were high, especially for ratings and officers aboard small ships. Conclusion: Despite pre-employment selection, a large proportion of the seafarers constitute a group of workers with evidence of poor health probably caused by lifestyle. The subgroups with high risk of hospitalisation due to lifestyle related diseases also had an increased risk of hospitalisation due to injury and poisoning. PMID:15723878

  18. Relationships between Long-Term Demography and Weather in a Sub-Arctic Population of Common Eider

    PubMed Central

    Jónsson, Jón Einar; Gardarsson, Arnthor; Gill, Jennifer A.; Pétursdóttir, Una Krístín; Petersen, Aevar; Gunnarsson, Tómas Grétar

    2013-01-01

    Effects of local weather on individuals and populations are key drivers of wildlife responses to climatic changes. However, studies often do not last long enough to identify weather conditions that influence demographic processes, or to capture rare but extreme weather events at appropriate scales. In Iceland, farmers collect nest down of wild common eider Somateria mollissima and many farmers count nests within colonies annually, which reflects annual variation in the number of breeding females. We collated these data for 17 colonies. Synchrony in breeding numbers was generally low between colonies. We evaluated 1) demographic relationships with weather in nesting colonies of common eider across Iceland during 1900–2007; and 2) impacts of episodic weather events (aberrantly cold seasons or years) on subsequent breeding numbers. Except for episodic events, breeding numbers within a colony generally had no relationship to local weather conditions in the preceding year. However, common eider are sexually mature at 2–3 years of age and we found a 3-year time lag between summer weather and breeding numbers for three colonies, indicating a positive effect of higher pressure, drier summers for one colony, and a negative effect of warmer, calmer summers for two colonies. These findings may represent weather effects on duckling production and subsequent recruitment. Weather effects were mostly limited to a few aberrant years causing reductions in breeding numbers, i.e. declines in several colonies followed severe winters (1918) and some years with high NAO (1992, 1995). In terms of life history, adult survival generally is high and stable and probably only markedly affected by inclement weather or aberrantly bad years. Conversely, breeding propensity of adults and duckling production probably do respond more to annual weather variations; i.e. unfavorable winter conditions for adults increase probability of death or skipped breeding, whereas favorable summers can promote boom years for recruitment. PMID:23805292

  19. Electrophysiological evidence that top-down knowledge controls working memory processing for subsequent visual search.

    PubMed

    Kawashima, Tomoya; Matsumoto, Eriko

    2016-03-23

    Items in working memory guide visual attention toward a memory-matching object. Recent studies have shown that when searching for an object this attentional guidance can be modulated by knowing the probability that the target will match an item in working memory. Here, we recorded the P3 and contralateral delay activity to investigate how top-down knowledge controls the processing of working memory items. Participants performed a memory task (recognition only) and a memory-or-search task (recognition or visual search) in which they were asked to maintain two colored oriented bars in working memory. For visual search, we manipulated the probability that the target had the same color as the memorized items (0, 50, or 100%). Participants knew the probabilities before the task. Target detection in the 100% match condition was faster than in the 50% match condition, indicating that participants used their knowledge of the probabilities. We found that the P3 amplitude in the 100% condition was larger than in the other conditions and that contralateral delay activity amplitude did not vary across conditions. These results suggest that more attention was allocated to the memory items when observers knew in advance that their color would likely match a target. This led to better search performance despite using qualitatively equal working memory representations.

  20. A pilot study of naturally occurring high-probability request sequences in hostage negotiations.

    PubMed

    Hughes, James

    2009-01-01

    In the current study, the audiotapes from three hostage-taking situations were analyzed. Hostage negotiator requests to the hostage taker were characterized as either high or low probability. The results suggested that hostage-taker compliance to a hostage negotiator's low-probability request was more likely when a series of complied-with high-probability requests preceded the low-probability request. However, two of the three hostage-taking situations ended violently; therefore, the implications of the high-probability request sequence for hostage-taking situations should be assessed in future research.

  1. A PILOT STUDY OF NATURALLY OCCURRING HIGH-PROBABILITY REQUEST SEQUENCES IN HOSTAGE NEGOTIATIONS

    PubMed Central

    Hughes, James

    2009-01-01

    In the current study, the audiotapes from three hostage-taking situations were analyzed. Hostage negotiator requests to the hostage taker were characterized as either high or low probability. The results suggested that hostage-taker compliance to a hostage negotiator's low-probability request was more likely when a series of complied-with high-probability requests preceded the low-probability request. However, two of the three hostage-taking situations ended violently; therefore, the implications of the high-probability request sequence for hostage-taking situations should be assessed in future research. PMID:19949541
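
    The conditional-probability comparison at the heart of such analyses can be sketched as follows, using an invented toy coding of a transcript (request type plus whether it was complied with); compliance with low-probability requests is tabulated separately depending on whether a complied-with run of high-probability requests immediately preceded them.

```python
# Each event is (request_type, complied): a toy transcript coding, not real data.
events = [("high", True), ("high", True), ("high", True), ("low", True),
          ("low", False), ("high", True), ("high", False), ("low", False),
          ("high", True), ("high", True), ("low", True)]

def low_p_compliance(events, seq_len=2):
    """Compare compliance with low-probability requests depending on whether they
    were preceded by at least `seq_len` consecutive complied-with high-p requests."""
    after_seq, other = [], []
    streak = 0
    for req, complied in events:
        if req == "high":
            streak = streak + 1 if complied else 0
        else:  # low-probability request
            (after_seq if streak >= seq_len else other).append(complied)
            streak = 0
    frac = lambda xs: sum(xs) / len(xs) if xs else float("nan")
    return frac(after_seq), frac(other)

p_after, p_other = low_p_compliance(events)
print(f"P(comply | after high-p sequence) = {p_after:.2f}")
print(f"P(comply | otherwise)             = {p_other:.2f}")
```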

  2. Preemptive Anticoagulation in Patients With a High Pretest Probability of Pulmonary Embolism: Are Guidelines Followed?

    PubMed

    Willoughby, Laura; Adams, Daniel M; Evans, R Scott; Lloyd, James F; Stevens, Scott M; Woller, Scott C; Bledsoe, Joseph R; Aston, Valerie T; Wilson, Emily L; Elliott, C Gregory

    2018-05-01

    Guidelines suggest anticoagulation of patients with high pretest probability of pulmonary embolism (PE) while awaiting diagnostic test results (preemptive anticoagulation). Data relevant to the practice of preemptive anticoagulation are not available. We reviewed 3,500 consecutive patients who underwent CT pulmonary angiography (CTPA) at two EDs. We classified the pretest probability for PE using the revised Geneva Score (RGS) as low (RGS 0-3), intermediate (RGS 4-10), or high (RGS 11-18). We classified patients with a high pretest probability of PE as receiving preemptive anticoagulation if therapeutic anticoagulation was given before CTPA completion. Patients with a high bleeding risk and those receiving treatment for DVT before CTPA were excluded from the preemptive anticoagulation analysis. We compared the time elapsed between ED registration and CTPA completion for patients with a low, intermediate, and high pretest probability for PE. We excluded three of 3,500 patients because CTPA preceded ED registration. Of the remaining 3,497 patients, 167 (4.8%) had a high pretest probability for PE. After excluding 29 patients for high bleeding risk and 21 patients who were treated for DVT prior to CTPA, only two of 117 patients (1.7%) with a high pretest probability for PE received preemptive anticoagulation. Furthermore, 37 of the remaining 115 patients (32%) with a high pretest probability for PE had a preexisting indication for anticoagulation but did not receive preemptive anticoagulation. The time from ED registration to CTPA completion did not differ based on the pretest probability of PE. Physicians rarely use preemptive anticoagulation in patients with a high pretest probability for PE. Clinicians do not expedite CTPA examinations for patients with a high pretest probability for PE. Copyright © 2017 American College of Chest Physicians. Published by Elsevier Inc. All rights reserved.

  3. ELIPGRID-PC: A PC program for calculating hot spot probabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davidson, J.R.

    1994-10-01

    ELIPGRID-PC, a new personal computer program, has been developed to provide easy access to Singer's 1972 ELIPGRID algorithm for hot-spot detection probabilities. Three features of the program are the ability to determine: (1) the grid size required for specified conditions, (2) the smallest hot spot that can be sampled with a given probability, and (3) the approximate grid size resulting from specified conditions and sampling cost. ELIPGRID-PC also provides probability of hit versus cost data for graphing with spread-sheets or graphics software. The program has been successfully tested using Singer's published ELIPGRID results. An apparent error in the original ELIPGRID code has been uncovered and an appropriate modification incorporated into the new program.
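
    Singer's ELIPGRID algorithm itself is not reproduced here, but the kind of quantity it computes can be cross-checked numerically: the sketch below uses Monte Carlo sampling of a randomly located circular hot spot to estimate the probability that a square sampling grid of given spacing hits it. The radii and spacing are illustrative.

```python
import numpy as np

rng = np.random.default_rng(4)

def hit_probability(radius, spacing, n_trials=200000):
    """Monte Carlo estimate of the probability that a square sampling grid with
    node spacing `spacing` hits (has a node inside) a circular hot spot of the
    given radius, for a hot-spot centre placed uniformly at random. This is a
    numerical cross-check of an ELIPGRID-type quantity, not Singer's algorithm.
    """
    # By symmetry, only the centre's offset within one grid cell matters.
    cx = rng.uniform(0.0, spacing, n_trials)
    cy = rng.uniform(0.0, spacing, n_trials)
    # Distance from the hot-spot centre to the nearest grid node (a cell corner).
    dx = np.minimum(cx, spacing - cx)
    dy = np.minimum(cy, spacing - cy)
    return np.mean(dx**2 + dy**2 <= radius**2)

for r in (0.3, 0.5, 0.7):
    print(f"radius {r:.1f} x spacing: hit probability ~ {hit_probability(r, 1.0):.3f}")
```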

  4. Risk estimation using probability machines

    PubMed Central

    2014-01-01

    Background Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties from the statistical machine that it is derived from. PMID:24581306

  5. Risk estimation using probability machines.

    PubMed

    Dasgupta, Abhijit; Szymczak, Silke; Moore, Jason H; Bailey-Wilson, Joan E; Malley, James D

    2014-03-01

    Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a "risk machine", will share properties from the statistical machine that it is derived from.
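
    A minimal sketch of a "probability machine" in the spirit described above, using a scikit-learn random forest on simulated logistic data (this is an illustration, not the authors' code): predict_proba supplies the conditional probability estimates, and a counterfactual effect size is obtained by averaging the change in predicted probability when the predictor of interest is switched from 0 to 1.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(5)

# Simulated data whose true data-generating model is logistic.
n = 5000
x1 = rng.integers(0, 2, n)                      # binary predictor of interest
x2 = rng.normal(size=n)                         # continuous covariate
logit = -1.0 + 1.2 * x1 + 0.8 * x2
y = rng.uniform(size=n) < 1.0 / (1.0 + np.exp(-logit))

X = np.column_stack([x1, x2])
rf = RandomForestClassifier(n_estimators=300, min_samples_leaf=25, random_state=0)
rf.fit(X, y)

# Conditional probability estimates P(y = 1 | x) straight from the forest.
p_hat = rf.predict_proba(X)[:, 1]
print(f"mean estimated risk: {p_hat.mean():.3f}")

# Counterfactual effect size of x1: average change in predicted probability when
# x1 is set to 1 versus 0 for every subject (a risk-difference analogue).
X1, X0 = X.copy(), X.copy()
X1[:, 0], X0[:, 0] = 1, 0
risk_diff = np.mean(rf.predict_proba(X1)[:, 1] - rf.predict_proba(X0)[:, 1])
print(f"estimated risk difference for x1: {risk_diff:.3f}")
```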

  6. The effects of flow on schooling Devario aequipinnatus: school structure, startle response and information transmission

    PubMed Central

    Chicoli, A.; Butail, S.; Lun, Y.; Bak-Coleman, J.; Coombs, S.; Paley, D.A.

    2014-01-01

    To assess how flow affects school structure and threat detection, startle response rates to visual looming stimuli were compared for solitary giant danio Devario aequipinnatus and small groups in flow and no-flow conditions. The instantaneous position and heading of each D. aequipinnatus were extracted from high-speed videos. Behavioural results indicate that (1) school structure is altered in flow such that D. aequipinnatus orient upstream while spanning out in a crosswise direction, (2) the probability of at least one D. aequipinnatus detecting the visual looming stimulus is higher in flow than no flow for both solitary D. aequipinnatus and groups of eight D. aequipinnatus; however, (3) the probability of three or more individuals responding is higher in no flow than flow. Taken together, these results indicate a higher probability of stimulus detection in flow but a higher probability of internal transmission of information in no flow. Finally, results were well predicted by a computational model of collective fright response that included the probability of direct detection (based on signal detection theory) and indirect detection (i.e. via interactions between group members) of threatening stimuli. This model provides a new theoretical framework for analysing the collective transfer of information among groups of fishes and other organisms. PMID:24773538
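
    The direct-detection component of such a model reduces to a simple calculation if individual detections are assumed independent (the indirect, between-member interaction term is omitted here, and the individual probabilities are made up): the probability that at least one of n fish detects the stimulus is 1 - (1 - p)^n.

```python
def p_group_detect(p_individual, n):
    """Probability that at least one of n fish detects the stimulus,
    assuming independent individual detections (direct detection only)."""
    return 1.0 - (1.0 - p_individual) ** n

# Illustrative individual detection probabilities (made up): suppose flow raises
# each fish's chance of direct detection, e.g. via upstream orientation.
for label, p in [("no flow", 0.30), ("flow", 0.45)]:
    print(f"{label}: solitary={p_group_detect(p, 1):.2f}, group of 8={p_group_detect(p, 8):.2f}")
```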

  7. An Evaluation of the High-Probability Instruction Sequence with and without Programmed Reinforcement for Compliance with High-Probability Instructions

    ERIC Educational Resources Information Center

    Zuluaga, Carlos A.; Normand, Matthew P.

    2008-01-01

    We assessed the effects of reinforcement and no reinforcement for compliance to high-probability (high-p) instructions on compliance to low-probability (low-p) instructions using a reversal design. For both participants, compliance with the low-p instruction increased only when compliance with high-p instructions was followed by reinforcement.…

  8. Three-dimensional obstacle classification in laser range data

    NASA Astrophysics Data System (ADS)

    Armbruster, Walter; Bers, Karl-Heinz

    1998-10-01

    The threat of hostile surveillance and weapon systems requires military aircraft to fly under extreme conditions such as low altitude, high speed, poor visibility and incomplete terrain information. The probability of collision with natural and man-made obstacles during such contour missions is high if detection capability is restricted to conventional vision aids. Forward-looking scanning laser rangefinders, which are presently being flight tested and evaluated at German proving grounds, provide a possible solution, having a large field of view, high angular and range resolution, a high pulse repetition rate, and sufficient pulse energy to register returns from wires at over 500 m range (depending on the system) with a high hit-and-detect probability. Despite the efficiency of the sensor, acceptance of current obstacle warning systems by test pilots is not very high, mainly due to the systems' inadequacies in obstacle recognition and visualization. This has motivated the development and testing of more advanced 3d-scene analysis algorithms at FGAN-FIM to replace the obstacle recognition component of current warning systems. The basic ideas are to increase the recognition probability and to reduce the false alarm rate for hard-to-extract obstacles such as wires by using more readily recognizable objects such as terrain, poles, pylons, and trees, and by implementing a hierarchical classification procedure to generate a parametric description of the terrain surface as well as the class, position, orientation, size and shape of all objects in the scene. The algorithms can be used for other applications such as terrain following, autonomous obstacle avoidance, and automatic target recognition.

  9. Energy Approach-Based Simulation of Structural Materials High-Cycle Fatigue

    NASA Astrophysics Data System (ADS)

    Balayev, A. F.; Korolev, A. V.; Kochetkov, A. V.; Sklyarova, A. I.; Zakharov, O. V.

    2016-02-01

    The paper describes the mechanism of micro-crack development in solid structural materials based on the theory of brittle fracture. A probability function for the energy distribution of material cracks is obtained using a probabilistic approach. The paper states energy conditions for crack growth under high-cycle loading. A formula for calculating the amount of energy absorbed during crack growth is given. The paper proposes a high-cycle fatigue evaluation criterion for determining the maximum permissible number of loading cycles of a solid body, beyond which micro-cracks start growing rapidly up to failure.

  10. Heavy metal removal from waste waters by ion flotation.

    PubMed

    Polat, H; Erdogan, D

    2007-09-05

    Flotation studies were carried out to investigate the removal of heavy metals such as copper (II), zinc (II), chromium (III) and silver (I) from waste waters. Various parameters such as pH, collector and frother concentrations and airflow rate were tested to determine the optimum flotation conditions. Sodium dodecyl sulfate and hexadecyltrimethyl ammonium bromide were used as collectors. Ethanol and methyl isobutyl carbinol (MIBC) were used as frothers. Metal removal reached about 74% under optimum conditions at low pH. At basic pH it became as high as 90%, probably due to the contribution from the flotation of metal precipitates.

  11. Archival-grade optical disc design and international standards

    NASA Astrophysics Data System (ADS)

    Fujii, Toru; Kojyo, Shinichi; Endo, Akihisa; Kodaira, Takuo; Mori, Fumi; Shimizu, Atsuo

    2015-09-01

    Optical discs currently on the market exhibit large variations in life span among discs, making them unsuitable for certain business applications. To assess and potentially mitigate this problem, we performed accelerated degradation testing under standard ISO conditions, determined the probable disc failure mechanisms, and identified the essential criteria necessary for a stable disc composition. With these criteria as necessary conditions, we analyzed the physical and chemical changes that occur in the disc components, on the basis of which we determined technological measures to reduce these degradation processes. By applying these measures to disc fabrication, we were able to develop highly stable optical discs.

  12. Development of reverse biased p-n junction electron emission

    NASA Technical Reports Server (NTRS)

    Fowler, P.; Muly, E. C.

    1971-01-01

    A cold cathode emitter of hot electrons for use as a source of electrons in vacuum gauges and mass spectrometers was developed using standard Norton electroluminescent silicon carbide p-n diodes operated under reverse bias conditions. Continued development, including variations in the geometry of these emitters, was carried out such that emitters with an emission efficiency (emitted current/junction current) as high as 3 x 10^-5 were obtained. Pulse measurements of the diode characteristics were made and showed that higher efficiency can be attained under pulse conditions, probably due to the lower temperatures resulting from such operation.

  13. Predictability of Sleep in Patients with Insomnia

    PubMed Central

    Vallières, Annie; Ivers, Hans; Beaulieu-Bonneau, Simon; Morin, Charles M.

    2011-01-01

    Study Objectives: To evaluate whether the night-to-night variability in insomnia follows specific predictable patterns and to characterize sleep patterns using objective sleep and clinical variables. Design: Prospective observational study. Setting: University-affiliated sleep disorders center. Participants: 146 participants suffering from chronic and primary insomnia. Measurements and Results: Daily sleep diaries were completed for an average of 48 days, and self-report questionnaires were completed once. Three nights were spent in the sleep laboratory for polysomnographic (PSG) assessment. Sleep efficiency, sleep onset latency, wake after sleep onset, and total sleep time were derived from sleep diaries and PSG. Time-series diary data were used to compute conditional probabilities of having an insomnia night after 1, 2, or 3 consecutive insomnia night(s). Conditional probabilities were submitted to a k-means cluster analysis. A 3-cluster solution was retained. One cluster included 38 participants exhibiting an unpredictable insomnia pattern. Another included 30 participants with a low and decreasing probability of having an insomnia night. The last cluster included 49 participants exhibiting a high probability of having insomnia every night. Clusters differed on age, insomnia severity, and mental fatigue, and on subjective sleep variables, but not on PSG sleep variables. Conclusion: These findings replicate our previous study and provide additional evidence that unpredictability is a less prevalent feature of insomnia than suggested previously in the literature. The presence of the 3 clusters is discussed in terms of sleep perception and sleep homeostasis dysregulation. Citation: Vallières A; Ivers H; Beaulieu-Bonneau S; Morin CM. Predictability of sleep in patients with insomnia. SLEEP 2011;34(5):609-617. PMID:21532954
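
    A minimal sketch of the conditional-probability measure described above, assuming a binary nightly diary (1 = insomnia night). The diary values, run lengths, and the clustering hint are illustrative, not taken from the study.

        import numpy as np

        def conditional_insomnia_probs(nights, max_run=3):
            """P(insomnia night | preceded by k consecutive insomnia nights),
            for k = 1..max_run, from a binary diary series (1 = insomnia night)."""
            nights = np.asarray(nights)
            probs = []
            for k in range(1, max_run + 1):
                num = den = 0
                for i in range(k, len(nights)):
                    if nights[i - k:i].all():   # k consecutive insomnia nights before night i
                        den += 1
                        num += int(nights[i])
                probs.append(num / den if den else float("nan"))
            return probs

        # hypothetical 48-night diary for one participant
        rng = np.random.default_rng(1)
        diary = (rng.random(48) < 0.4).astype(int)
        print(conditional_insomnia_probs(diary))
        # One row of such probabilities per participant could then be clustered,
        # e.g. with sklearn.cluster.KMeans(n_clusters=3), to look for the
        # three patterns reported above.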

  14. Global assessment of surfing conditions: seasonal, interannual and long-term variability

    NASA Astrophysics Data System (ADS)

    Espejo, A.; Losada, I.; Mendez, F.

    2012-12-01

    International surfing destinations owe a great debt to specific combinations of wind-wave and thermal conditions and local bathymetry. As surf quality depends on a vast number of geophysical variables, a multivariable standardized index based on expert judgment is proposed to analyze the surf resource in a worldwide domain. The data needed are obtained by combining several datasets (reanalyses): a 60-year satellite-calibrated spectral wave hindcast (GOW, WaveWatchIII), wind fields from NCEP/NCAR, global sea surface temperature from ERSST.v3b, and global tides from TPXO7.1. A summary of the global surf resource is presented, which highlights the high degree of variability in surfable events. Consistent with the general atmospheric circulation, results show that west-facing low- to middle-latitude coasts are more suitable for surfing, especially those in the Southern Hemisphere. Month-to-month analysis reveals strong seasonal changes in the occurrence of surfable events, most markedly in the North Atlantic and North Pacific. Interannual variability is investigated by comparing occurrence values with global and regional climate patterns, which show a strong influence at both global and regional scales. Analysis of long-term trends shows an increase in the probability of surfable events over the west-facing coasts of the planet (e.g. +30 hours/year in California). The resulting maps provide useful information for surfers and surf-related stakeholders, coastal planning, education, and basic research.
    [Figure 1. Global distribution of medium-quality (a) and high-quality (b) surf condition probability.]

  15. Bayesian anomaly detection in monitoring data applying relevance vector machine

    NASA Astrophysics Data System (ADS)

    Saito, Tomoo

    2011-04-01

    A method for automatically classifying monitoring data into two categories, normal and anomalous, is developed in order to remove anomalous data from the enormous amount of monitoring data. The relevance vector machine (RVM) is applied to a probabilistic discriminative model with basis functions and weight parameters whose posterior PDF (probability density function), conditional on the learning data set, is given by Bayes' theorem. The proposed framework is applied to actual monitoring data sets, containing some anomalous data, collected at two buildings in Tokyo, Japan; the trained models discriminate anomalous data from normal data very clearly, giving high probabilities of being normal to normal data and low probabilities of being normal to anomalous data.

  16. Variability in growth/no growth boundaries of 188 different Escherichia coli strains reveals that approximately 75% have a higher growth probability under low pH conditions than E. coli O157:H7 strain ATCC 43888.

    PubMed

    Haberbeck, L U; Oliveira, R C; Vivijs, B; Wenseleers, T; Aertsen, A; Michiels, C; Geeraerd, A H

    2015-02-01

    This study investigated the variation in growth/no growth boundaries of 188 Escherichia coli strains. Experiments were conducted in Luria-Bertani media under 36 combinations of lactic acid (LA) (0 and 25 mM), pH (3.8, 3.9, 4.0, 4.1, 4.2 and 4.3 for 0 mM LA and 4.3, 4.4, 4.5, 4.6, 4.7 and 4.8 for 25 mM LA) and temperature (20, 25 and 30 °C). After 3 days of incubation, growth was monitored through optical density measurements. For each strain, a so-called purposeful selection approach was used to fit a logistic regression model that adequately predicted the likelihood for growth. Further, to assess the growth/no growth variability for all the strains at once, a generalized linear mixed model was fitted to the data. Strain was fitted as a fixed factor and replicate as a random blocking factor. E. coli O157:H7 strain ATCC 43888 was used as reference strain allowing a comparison with the other strains. Out of the 188 strains tested, 140 strains (∼75%) presented a significantly higher probability of growth under low pH conditions than the O157:H7 strain ATCC 43888, whereas 20 strains (∼11%) showed a significantly lower probability of growth under high pH conditions. Copyright © 2014 Elsevier Ltd. All rights reserved.
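
    For illustration, a per-strain growth/no-growth logistic regression on the three factors named above can be fitted as sketched below. The observations and the plain scikit-learn fit are hypothetical stand-ins; they are not the purposeful-selection or generalized linear mixed-model procedures actually used in the study.

        import pandas as pd
        from sklearn.linear_model import LogisticRegression

        # hypothetical growth/no-growth observations for a single strain;
        # columns mirror the three experimental factors of the study
        obs = pd.DataFrame({
            "lactic_acid_mM": [0, 0, 0, 25, 25, 25, 0, 25],
            "pH":             [3.8, 4.0, 4.3, 4.3, 4.5, 4.8, 4.2, 4.6],
            "temp_C":         [20, 25, 30, 20, 25, 30, 30, 25],
            "growth":         [0, 0, 1, 0, 1, 1, 1, 1],
        })

        features = ["lactic_acid_mM", "pH", "temp_C"]
        model = LogisticRegression().fit(obs[features], obs["growth"])

        # predicted probability of growth at a new, untested condition
        new_condition = pd.DataFrame({"lactic_acid_mM": [25], "pH": [4.4], "temp_C": [25]})
        print(model.predict_proba(new_condition)[0, 1])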

  17. Quantum-correlation breaking channels, quantum conditional probability and Perron-Frobenius theory

    NASA Astrophysics Data System (ADS)

    Chruściński, Dariusz

    2013-03-01

    Using the quantum analog of conditional probability and classical Bayes theorem we discuss some aspects of particular entanglement breaking channels: quantum-classical and classical-classical channels. Applying the quantum analog of Perron-Frobenius theorem we generalize the recent result of Korbicz et al. (2012) [8] on full and spectrum broadcasting from quantum-classical channels to arbitrary quantum channels.

  18. When Is Statistical Evidence Superior to Anecdotal Evidence in Supporting Probability Claims? The Role of Argument Type

    ERIC Educational Resources Information Center

    Hoeken, Hans; Hustinx, Lettica

    2009-01-01

    Under certain conditions, statistical evidence is more persuasive than anecdotal evidence in supporting a claim about the probability that a certain event will occur. In three experiments, it is shown that the type of argument is an important condition in this respect. If the evidence is part of an argument by generalization, statistical evidence…

  19. Transfer of Solutions to Conditional Probability Problems: Effects of Example Problem Format, Solution Format, and Problem Context

    ERIC Educational Resources Information Center

    Chow, Alan F.; Van Haneghan, James P.

    2016-01-01

    This study reports the results of a study examining how easily students are able to transfer frequency solutions to conditional probability problems to novel situations. University students studied either a problem solved using the traditional Bayes formula format or using a natural frequency (tree diagram) format. In addition, the example problem…

  20. Recruitment in a Colorado population of big brown bats: Breeding probabilities, litter size, and first-year survival

    USGS Publications Warehouse

    O'Shea, T.J.; Ellison, L.E.; Neubaum, D.J.; Neubaum, M.A.; Reynolds, C.A.; Bowen, R.A.

    2010-01-01

    We used mark-recapture estimation techniques and radiography to test hypotheses about 3 important aspects of recruitment in big brown bats (Eptesicus fuscus) in Fort Collins, Colorado: adult breeding probabilities, litter size, and 1st-year survival of young. We marked 2,968 females with passive integrated transponder (PIT) tags at multiple sites during 2001-2005 and based our assessments on direct recaptures (breeding probabilities) and passive detection with automated PIT tag readers (1st-year survival). We interpreted our data in relation to hypotheses regarding demographic influences of bat age, roost, and effects of years with unusual environmental conditions: extreme drought (2002) and arrival of a West Nile virus epizootic (2003). Conditional breeding probabilities at 6 roosts sampled in 2002-2005 were estimated as 0.64 (95% confidence interval [95% CI] = 0.53-0.73) in 1-year-old females, but were consistently high (95% CI = 0.94-0.96) and did not vary by roost, year, or prior year breeding status in older adults. Mean litter size was 1.11 (95% CI = 1.05-1.17), based on examination of 112 pregnant females by radiography. Litter size was not higher in older or larger females and was similar to results of other studies in western North America despite wide variation in latitude. First-year survival was estimated as 0.67 (95% CI = 0.61-0.73) for weaned females at 5 maternity roosts over 5 consecutive years, was lower than adult survival (0.79; 95% CI = 0.77-0.81), and varied by roost. Based on model selection criteria, strong evidence exists for complex roost and year effects on 1st-year survival. First-year survival was lowest in bats born during the drought year. Juvenile females that did not return to roosts as 1-year-olds had lower body condition indices in late summer of their natal year than those known to survive. © 2009 American Society of Mammalogists.

  1. The effect of motivation on working memory: an fMRI and SEM study.

    PubMed

    Szatkowska, Iwona; Bogorodzki, Piotr; Wolak, Tomasz; Marchewka, Artur; Szeszkowski, Wojciech

    2008-09-01

    This study investigated the effective connectivity between prefrontal regions of human brain supporting motivational influence on working memory. Functional magnetic resonance imaging (fMRI) and structural equation modeling (SEM) were used to examine the interaction between the lateral orbitofrontal (OFC), medial OFC, and dorsolateral prefrontal (DLPFC) regions in the left and right hemisphere during performance of the verbal 2-back working memory task under two reinforcement conditions. The "low-motivation" condition was not associated with monetary reinforcement, while the "high-motivation" condition involved the probability of winning a certain amount of money. In the "low-motivation" condition, the OFC regions in both hemispheres positively influenced the left DLPFC activity. In the "high-motivation" condition, the connectivity in the network including the right OFC regions and left DLPFC changed from positive to negative, whereas the positive connectivity in the network composed of the left OFC and left DLPFC became slightly enhanced compared with the "low-motivation" condition. However, only the connection between the right lateral OFC and left DLPFC showed a significant condition-dependent change in the strength of influence conveyed through the pathway. This change appears to be the functional correlate of motivational influence on verbal working memory.

  2. Condition-dependent reproductive effort in frogs infected by a widespread pathogen

    PubMed Central

    Roznik, Elizabeth A.; Sapsford, Sarah J.; Pike, David A.; Schwarzkopf, Lin; Alford, Ross A.

    2015-01-01

    To minimize the negative effects of an infection on fitness, hosts can respond adaptively by altering their reproductive effort or by adjusting their timing of reproduction. We studied effects of the pathogenic fungus Batrachochytrium dendrobatidis on the probability of calling in a stream-breeding rainforest frog (Litoria rheocola). In uninfected frogs, calling probability was relatively constant across seasons and body conditions, but in infected frogs, calling probability differed among seasons (lowest in winter, highest in summer) and was strongly and positively related to body condition. Infected frogs in poor condition were up to 40% less likely to call than uninfected frogs, whereas infected frogs in good condition were up to 30% more likely to call than uninfected frogs. Our results suggest that frogs employed a pre-existing, plastic, life-history strategy in response to infection, which may have complex evolutionary implications. If infected males in good condition reproduce at rates equal to or greater than those of uninfected males, selection on factors affecting disease susceptibility may be minimal. However, because reproductive effort in infected males is positively related to body condition, there may be selection on mechanisms that limit the negative effects of infections on hosts. PMID:26063847

  3. Subsurface evaluation of the west parking lot and landfill 3 areas of Air Force Plant 4, Fort Worth, Texas, using two-dimensional direct-current resistivity profiling

    USGS Publications Warehouse

    Braun, Christopher L.; Jones, Sonya A.

    2002-01-01

    During September 1999, the U.S. Geological Survey made 10 two-dimensional direct-current resistivity profile surveys in the west parking lot and landfill 3 areas of Air Force Plant 4, Fort Worth, Texas, to identify subsurface areas of anomalously high or low resistivity that could indicate potential contamination, contaminant pathways, or anthropogenic structures. Six of the 10 surveys (transects) were in the west parking lot. Each of the inverted sections of these transects had anomalously high resistivities in the terrace alluvium/fill (the surficial subsurface layer) that probably were caused by highly resistive fill material. In addition, each of these transects had anomalously low resistivities in the Walnut Formation (a bedrock layer immediately beneath the alluvium/fill) that could have been caused by saturation of fractures within the Walnut Formation. A high-resistivity anomaly in the central part of the study area probably is associated with pea gravel fill used in construction of a French drain. Another high-resistivity anomaly in the west parking lot, slightly southeast of the French drain, could be caused by dense nonaqueous-phase liquid in the Walnut Formation. The inverted sections of the four transects in the landfill 3 area tended to have slightly higher resistivities in both the alluvium/fill and the Walnut Formation than the transects in the west parking lot. The higher resistivities in the alluvium/fill could have been caused by drier conditions in grassy areas relative to conditions in the west parking lot. Higher resistivities in parts of the Walnut Formation also could be a function of drier conditions or variations in the lithology of the Walnut Formation. In addition to the 10 vertical sections, four horizontal sections at 2-meter altitude intervals show generally increasing resistivity with decreasing altitude that most likely results from the increased influence of the Walnut Formation, which has a higher resistivity than the terrace alluvium/fill.

  4. Class dependency of fuzzy relational database using relational calculus and conditional probability

    NASA Astrophysics Data System (ADS)

    Deni Akbar, Mohammad; Mizoguchi, Yoshihiro; Adiwijaya

    2018-03-01

    In this paper, we propose a design of a fuzzy relational database that deals with a conditional probability relation using fuzzy relational calculus. Previously, several studies have addressed equivalence classes in fuzzy databases using similarity or approximate relations. It is an interesting topic to investigate fuzzy dependency using equivalence classes. Our goal is to introduce a formulation of a fuzzy relational database model using the relational calculus on the category of fuzzy relations. We also introduce general formulas of the relational calculus for database operations such as ’projection’, ’selection’, ’injection’ and ’natural join’. Using the fuzzy relational calculus and conditional probabilities, we introduce notions of equivalence class, redundancy, and dependency in the theory of fuzzy relational databases.

  5. Flood Risk and Asset Management

    DTIC Science & Technology

    2012-09-01

    use by third parties of results or methods presented in this report. The Company also stresses that various sections of this report rely on data...inundation probability  Levee contribution to risk The methods used in FRE have been applied to establish the National Flood Risk in England and...be noted that when undertaking high level probabilistic risk assessments in the UK, if a defence’s condition is unknown, grade 3 is applied with

  6. Methods, apparatus and system for notification of predictable memory failure

    DOEpatents

    Cher, Chen-Yong; Andrade Costa, Carlos H.; Park, Yoonho; Rosenburg, Bryan S.; Ryu, Kyung D.

    2017-01-03

    A method for providing notification of a predictable memory failure includes the steps of: obtaining information regarding at least one condition associated with a memory; calculating a memory failure probability as a function of the obtained information; calculating a failure probability threshold; and generating a signal when the memory failure probability exceeds the failure probability threshold, the signal being indicative of a predicted future memory failure.
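
    A minimal sketch of the notification flow described in the claim, with hypothetical condition inputs, failure model, and threshold model; none of the names or coefficients come from the patent.

        def check_memory_failure(condition_info, failure_model, threshold_model):
            """Compute a failure probability and a threshold from monitored memory
            conditions and return True when a notification signal should be raised."""
            p_fail = failure_model(condition_info)
            threshold = threshold_model(condition_info)
            return p_fail > threshold

        # hypothetical models: correctable-error counts and temperature drive the risk
        failure_model = lambda c: min(1.0, 0.001 * c["correctable_errors"]
                                      + 0.002 * max(0.0, c["temp_C"] - 70.0))
        threshold_model = lambda c: 0.05

        conditions = {"correctable_errors": 80, "temp_C": 78}
        if check_memory_failure(conditions, failure_model, threshold_model):
            print("predicted future memory failure: notify the runtime")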

  7. [Diagnostics and treatment of Wernicke-Korsakoff syndrome patients with an alcohol abuse].

    PubMed

    Nilsson, Maria; Sonne, Charlotte

    2013-04-01

    Wernicke-Korsakoff syndrome is a condition with high morbidity and mortality and occurs as a consequence of thiamine deficiency. Clinical symptoms are often ambiguous and post-mortem examinations show that the syndrome is underdiagnosed and probably undertreated. There is sparse clinical evidence concerning optimal dosage and duration of treatment. This article reviews the current literature and concludes that all patients with a history of alcohol abuse should be treated with high dosage IV thiamine for an extended period of time, albeit further research is needed.

  8. The effect of disease risk probability and disease type on interest in clinic-based versus direct-to-consumer genetic testing services.

    PubMed

    Sherman, Kerry; Shaw, Laura-Kate; Champion, Katrina; Caldeira, Fernanda; McCaskill, Margaret

    2015-10-01

    The effect of disease-specific cognitions on interest in clinic-based and direct-to-consumer (DTC) genetic testing was assessed. Participants (N = 309) responded to an online hypothetical scenario and received genetic testing-related messages that varied by risk probability (25, 50, 75 %) and disease type (Alzheimer's disease vs. Type 2 Diabetes). Post-manipulation interest increased for both testing types, but was greater for clinic-based testing. Interest was greater for Type 2 Diabetes than for Alzheimer's disease, the latter perceived as more severe and likely, and less treatable and preventable. For DTC testing only, participants allocated to the high risk condition (75 %) had greater testing interest than those in the low (25 %) category. DTC testing is perceived as a viable, but less preferred, option compared with clinic-based testing. Particularly when considering DTC genetic testing, there is a need to emphasize subjective disease-related perceptions, including risk probability.

  9. 14 CFR 25.801 - Ditching.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ..., under reasonably probable water conditions, the flotation time and trim of the airplane will allow the... provision is shown by buoyancy and trim computations, appropriate allowances must be made for probable...

  10. Short-term capture of the Earth-Moon system

    NASA Astrophysics Data System (ADS)

    Qi, Yi; de Ruiter, Anton

    2018-06-01

    In this paper, the short-term capture (STC) of an asteroid in the Earth-Moon system is proposed and investigated. First, the space condition of STC is analysed and five subsets of the feasible region are defined and discussed. Then, the time condition of STC is studied by parameter scanning in the Sun-Earth-Moon-asteroid restricted four-body problem. Numerical results indicate that there is a clear association between the distributions of the time probability of STC and the five subsets. Next, the influence of the Jacobi constant on STC is examined using the space and time probabilities of STC. Combining the space and time probabilities of STC, we propose a STC index to evaluate the probability of STC comprehensively. Finally, three potential STC asteroids are found and analysed.

  11. Diminished caudate and superior temporal gyrus responses to effort-based decision making in patients with first-episode major depressive disorder.

    PubMed

    Yang, Xin-hua; Huang, Jia; Lan, Yong; Zhu, Cui-ying; Liu, Xiao-qun; Wang, Ye-fei; Cheung, Eric F C; Xie, Guang-rong; Chan, Raymond C K

    2016-01-04

    Anhedonia, the loss of interest or pleasure in reward processing, is a hallmark feature of major depressive disorder (MDD), but its underlying neurobiological mechanism is largely unknown. The present study aimed to examine the underlying neural mechanism of reward-related decision-making in patients with MDD. We examined behavioral and neural responses to rewards in patients with first-episode MDD (N=25) and healthy controls (N=25) using the Effort-Expenditure for Rewards Task (EEfRT). The task involved choices about possible rewards of varying magnitude and probability. We tested the hypothesis that individuals with MDD would exhibit a reduced neural response in reward-related brain structures involved in cost-benefit decision-making. Compared with healthy controls, patients with MDD showed significantly weaker responses in the left caudate nucleus when contrasting the 'high reward'-'low reward' condition, and blunted responses in the left superior temporal gyrus and the right caudate nucleus when contrasting high and low probabilities. In addition, hard tasks chosen during high probability trials were negatively correlated with superior temporal gyrus activity in MDD patients, while the same choices were negatively correlated with caudate nucleus activity in healthy controls. These results indicate that reduced caudate nucleus and superior temporal gyrus activation may underpin abnormal cost-benefit decision-making in MDD. Copyright © 2015 Elsevier Inc. All rights reserved.

  12. Late quaternary lake level changes of Taro Co and neighbouring lakes, southwestern Tibetan Plateau, based on OSL dating and ostracod analysis

    NASA Astrophysics Data System (ADS)

    Alivernini, Mauro; Lai, Zhongping; Frenzel, Peter; Fürstenberg, Sascha; Wang, Junbo; Guo, Yun; Peng, Ping; Haberzettl, Torsten; Börner, Nicole; Mischke, Steffen

    2018-07-01

    The Late Quaternary lake history of Taro Co and three neighbouring lakes was investigated to reconstruct local hydrological conditions and the regional moisture availability. Ostracod-based water depth and habitat reconstructions combined with OSL and radiocarbon dating are performed to better understand the Taro Co lake system evolution during the Late Quaternary. A high-stand, observed at 36.1 ka before present, represents the highest lake level since then; it is related to a wet stage and resulted in a merging of Taro Co with its neighbouring lakes Zabuye and Lagkor Co at that time. The lake level then decreased and reached its minimum around 30 ka. After c. 20 ka, the lake rose above the present day level. A minor low-stand, with colder and drier conditions, is documented at 12.5 cal. ka BP. Taro Co, Zabuye and Lagkor Co formed one large lake with a corresponding high-stand during the early Holocene (11.2-9.7 cal. ka BP). After this Holocene lake level maximum, all three lakes shrank, probably related to drier conditions, and Lagkor Co became separated from the Taro Co-Zabuye system at c. 7 ka. Subsequently, the lake levels decreased further by about 30 m and Taro Co began to separate from Zabuye Lake at around 3.5 ka. The accelerating lake-level decrease of Taro Co was interrupted by a short-term lake level rise after 2 ka BP, probably related to minor variations of the monsoonal components. A last minor high-stand occurred at about 0.8 ka before today, and subsequently the lake level of Taro Co has registered a slight increase in recent years.

  13. What shapes fitness costs of reproduction in long-lived iteroparous species? A case study on the Alpine ibex.

    PubMed

    Garnier, Alexandre; Gaillard, Jean-Michel; Gauthier, Dominique; Besnard, Aurélien

    2016-01-01

    The fitness costs of reproduction can be masked by individual differences, and may only become apparent during adverse environmental conditions. Individual differences, however, are usually assessed by reproductive success, so how fitness costs are influenced by the interplay between the environmental context and overall individual differences requires further investigation. Here, we evaluated fitness costs of reproduction based on 15 yr of monitoring of individual Alpine ibex (Capra ibex) during a period when the population was affected by a severe disease outbreak (pneumonia). We quantified fitness costs using a novel multi-event capture-mark-recapture (CMR) modeling approach that accounted for uncertainty in reproductive status to estimate the survival and reproductive success of female ibex while also accounting for overall individual heterogeneity using mixture models. Our results show that the ability of females to reproduce was highly heterogeneous. In particular, one group including 76% of females had a much higher probability of giving birth annually (between 0.66 and 0.77, depending on the previous reproductive status) than females of the second group (24% of females, between 0 and 0.05 probability of giving birth annually). Low reproductive costs in terms of future reproduction occurred and were independent of the pneumonia outbreak. There was no survival cost of reproduction either before or after the epizootic, but the cost was high during the epizootic. Our findings indicate that adverse environmental conditions, such as disease outbreaks, may lead to survival costs of reproduction in long-lived species and select against females that have a high reproductive effort. Thereby, the occurrence of adverse conditions increases the diversity of reproductive tactics within a population.

  14. Re-assessment of road accident data-analysis policy : applying theory from involuntary, high-consequence, low-probability events like nuclear power plant meltdowns to voluntary, low-consequence, high-probability events like traffic accidents

    DOT National Transportation Integrated Search

    2002-02-01

    This report examines the literature on involuntary, high-consequence, low-probability (IHL) events like nuclear power plant meltdowns to determine what can be applied to the problem of voluntary, low-consequence high-probability (VLH) events like tra...

  15. The relationship between violence in Northern Mexico and potentially avoidable hospitalizations in the USA-Mexico border region.

    PubMed

    Geissler, Kimberley; Stearns, Sally C; Becker, Charles; Thirumurthy, Harsha; Holmes, George M

    2016-03-01

    Substantial proportions of US residents in the USA-Mexico border region cross into Mexico for health care; increases in violence in northern Mexico may have affected this access. We quantified associations between violence in Mexico and decreases in access to care for border county residents. We also examined associations between border county residence and access. We used hospital inpatient data for Arizona, California and Texas (2005-10) to estimate associations between homicide rates and the probability of hospitalization for ambulatory care sensitive (ACS) conditions. Hospitalizations for ACS conditions were compared with homicide rates in Mexican municipalities matched by patient residence. A 1 SD increase in the homicide rate of the nearest Mexican municipality was associated with a 2.2 percentage point increase in the probability of being hospitalized for an ACS condition for border county patients. Residence in a border county was associated with a 1.3 percentage point decrease in the probability of being hospitalized for an ACS condition. Increased homicide rates in Mexico were associated with increased hospitalizations for ACS conditions in the USA, although residence in a border county was associated with decreased probability of being hospitalized for an ACS condition. Expanding access in the border region may mitigate these effects by providing alternative sources of care. © The Author 2015. Published by Oxford University Press on behalf of Faculty of Public Health. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.

  16. Constructing diagnostic likelihood: clinical decisions using subjective versus statistical probability.

    PubMed

    Kinnear, John; Jackson, Ruth

    2017-07-01

    Although physicians are highly trained in the application of evidence-based medicine, and are assumed to make rational decisions, there is evidence that their decision making is prone to biases. One of the biases that has been shown to affect accuracy of judgements is that of representativeness and base-rate neglect, where the saliency of a person's features leads to overestimation of their likelihood of belonging to a group. This results in the substitution of 'subjective' probability for statistical probability. This study examines clinicians' propensity to make estimations of subjective probability when presented with clinical information that is considered typical of a medical condition. The strength of the representativeness bias is tested by presenting choices in textual and graphic form. Understanding of statistical probability is also tested by omitting all clinical information. For the questions that included clinical information, 46.7% and 45.5% of clinicians made judgements of statistical probability, respectively. Where the question omitted clinical information, 79.9% of clinicians made a judgement consistent with statistical probability. There was a statistically significant difference in responses to the questions with and without representativeness information (χ2 (1, n=254)=54.45, p<0.0001). Physicians are strongly influenced by a representativeness bias, leading to base-rate neglect, even though they understand the application of statistical probability. One of the causes for this representativeness bias may be the way clinical medicine is taught where stereotypic presentations are emphasised in diagnostic decision making. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://www.bmj.com/company/products-services/rights-and-licensing/.

  17. Naive Probability: Model-Based Estimates of Unique Events.

    PubMed

    Khemlani, Sangeet S; Lotstein, Max; Johnson-Laird, Philip N

    2015-08-01

    We describe a dual-process theory of how individuals estimate the probabilities of unique events, such as Hillary Clinton becoming U.S. President. It postulates that uncertainty is a guide to improbability. In its computer implementation, an intuitive system 1 simulates evidence in mental models and forms analog non-numerical representations of the magnitude of degrees of belief. This system has minimal computational power and combines evidence using a small repertoire of primitive operations. It resolves the uncertainty of divergent evidence for single events, for conjunctions of events, and for inclusive disjunctions of events, by taking a primitive average of non-numerical probabilities. It computes conditional probabilities in a tractable way, treating the given event as evidence that may be relevant to the probability of the dependent event. A deliberative system 2 maps the resulting representations into numerical probabilities. With access to working memory, it carries out arithmetical operations in combining numerical estimates. Experiments corroborated the theory's predictions. Participants concurred in estimates of real possibilities. They violated the complete joint probability distribution in the predicted ways, when they made estimates about conjunctions: P(A), P(B), P(A and B), disjunctions: P(A), P(B), P(A or B or both), and conditional probabilities P(A), P(B), P(B|A). They were faster to estimate the probabilities of compound propositions when they had already estimated the probabilities of each of their components. We discuss the implications of these results for theories of probabilistic reasoning. © 2014 Cognitive Science Society, Inc.

  18. Modeling highway travel time distribution with conditional probability models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oliveira Neto, Francisco Moraes; Chin, Shih-Miao; Hwang, Ho-Ling

    Under the sponsorship of the Federal Highway Administration's Office of Freight Management and Operations, the American Transportation Research Institute (ATRI) has developed performance measures through the Freight Performance Measures (FPM) initiative. Under this program, travel speed information is derived from data collected using wireless based global positioning systems. These telemetric data systems are subscribed and used by trucking industry as an operations management tool. More than one telemetric operator submits their data dumps to ATRI on a regular basis. Each data transmission contains truck location, its travel time, and a clock time/date stamp. Data from the FPM program provides a unique opportunity for studying the upstream-downstream speed distributions at different locations, as well as different time of the day and day of the week. This research is focused on the stochastic nature of successive link travel speed data on the continental United States Interstates network. Specifically, a method to estimate route probability distributions of travel time is proposed. This method uses the concepts of convolution of probability distributions and bivariate, link-to-link, conditional probability to estimate the expected distributions for the route travel time. Major contribution of this study is the consideration of speed correlation between upstream and downstream contiguous Interstate segments through conditional probability. The established conditional probability distributions, between successive segments, can be used to provide travel time reliability measures. This study also suggests an adaptive method for calculating and updating route travel time distribution as new data or information is added. This methodology can be useful to estimate performance measures as required by the recent Moving Ahead for Progress in the 21st Century Act (MAP 21).
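
    The route-level idea can be sketched as follows: discretized travel-time PMFs for successive links are combined through a link-to-link conditional distribution rather than by assuming independence. The PMFs and the Gaussian-shaped conditional weights below are hypothetical, chosen only to show the convolution-with-dependence step.

        import numpy as np

        t_support = np.arange(10, 21)                            # per-link travel times, minutes
        p_link1 = np.full(len(t_support), 1 / len(t_support))    # hypothetical PMF for link 1

        # hypothetical link-to-link conditional PMF P(t2 | t1):
        # slow upstream travel makes slow downstream travel more likely
        p_link2_given_1 = np.zeros((len(t_support), len(t_support)))
        for i in range(len(t_support)):
            w = np.exp(-0.5 * ((np.arange(len(t_support)) - i) / 2.0) ** 2)
            p_link2_given_1[i] = w / w.sum()

        # route PMF: P(route time = t1 + t2) = sum over t1 of P(t1) * P(t2 | t1)
        route_pmf = np.zeros(2 * len(t_support) - 1)
        for i, p1 in enumerate(p_link1):
            for j, p2 in enumerate(p_link2_given_1[i]):
                route_pmf[i + j] += p1 * p2

        route_support = np.arange(2 * t_support[0], 2 * t_support[-1] + 1)
        print(route_support[np.argmax(route_pmf)], route_pmf.sum())   # modal time, total mass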

  19. Time-dependent earthquake probabilities

    USGS Publications Warehouse

    Gomberg, J.; Belardinelli, M.E.; Cocco, M.; Reasenberg, P.

    2005-01-01

    We have attempted to provide a careful examination of a class of approaches for estimating the conditional probability of failure of a single large earthquake, particularly approaches that account for static stress perturbations to tectonic loading as in the approaches of Stein et al. (1997) and Hardebeck (2004). We have developed a framework based on a simple, generalized rate change formulation and applied it to these two approaches to show how they relate to one another. We also have attempted to show the connection between models of seismicity rate changes applied to (1) populations of independent faults, as in background and aftershock seismicity, and (2) changes in estimates of the conditional probability of failure of a single fault. In the former application, the notion of failure rate corresponds to successive failures of different members of a population of faults. The latter application requires specification of some probability distribution (density function, or PDF) that describes some population of potential recurrence times. This PDF may reflect our imperfect knowledge of when past earthquakes have occurred on a fault (epistemic uncertainty), the true natural variability in failure times, or some combination of both. We suggest two end-member conceptual single-fault models that may explain natural variability in recurrence times and suggest how they might be distinguished observationally. When viewed deterministically, these single-fault patch models differ significantly in their physical attributes, and when faults are immature, they differ in their responses to stress perturbations. Estimates of conditional failure probabilities effectively integrate over a range of possible deterministic fault models, usually with ranges that correspond to mature faults. Thus conditional failure probability estimates usually should not differ significantly for these models. Copyright 2005 by the American Geophysical Union.
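
    The conditional probability referred to above has the standard renewal form P(t < T <= t + dt | T > t) = [F(t + dt) - F(t)] / [1 - F(t)] for a recurrence-time distribution with CDF F. A short sketch using a hypothetical lognormal recurrence model (the parameter values are illustrative, not from the paper):

        from scipy import stats

        def conditional_failure_probability(t_elapsed, dt, recurrence_dist):
            """P(failure within the next dt years | no failure in the t_elapsed years
            since the last event), for a frozen scipy recurrence-time distribution."""
            F = recurrence_dist.cdf
            return (F(t_elapsed + dt) - F(t_elapsed)) / (1.0 - F(t_elapsed))

        # hypothetical lognormal recurrence model: median ~150 yr, aperiodicity ~0.4
        recurrence = stats.lognorm(s=0.4, scale=150.0)
        print(conditional_failure_probability(t_elapsed=120.0, dt=30.0,
                                              recurrence_dist=recurrence))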

  20. Pediatric ambulatory care sensitive conditions: Birth cohorts and the socio-economic gradient.

    PubMed

    Roos, Leslie L; Dragan, Roxana; Schroth, Robert J

    2017-09-14

    This study examines the socio-economic gradient in utilization and the risk factors associated with hospitalization for four pediatric ambulatory care sensitive conditions (dental conditions, asthma, gastroenteritis, and bacterial pneumonia). Dental conditions, where much care is provided by dentists and insurance coverage varies among different population segments, present special issues. A population registry, provider registry, physician ambulatory claims, and hospital discharge abstracts from 28 398 children born in 2003-2006 in urban centres in Manitoba, Canada were the main data sources. Physician visits and hospitalizations were compared across neighbourhood income groupings using rank correlations and logistic regressions. Very strong relationships between neighbourhood income and utilization were highlighted. Additional variables - family on income assistance, mother's age at first birth, breastfeeding - helped predict the probability of hospitalization. Despite the complete insurance coverage (including visits to dentists and physicians and for hospitalizations) provided, receiving income assistance was associated with higher probabilities of hospitalization. We found a socio-economic gradient in utilization for pediatric ambulatory care sensitive conditions, with higher rates of ambulatory visits and hospitalizations in the poorest neighbourhoods. Insurance coverage which varies between different segments of the population complicates matters. Providing funding for dental care for Manitobans on income assistance has not prevented physician visits or intensive treatment in high-cost facilities, specifically treatment under general anesthesia. When services from one type of provider (dentist) are not universally insured but those from another type (physician) are, using rates of hospitalization to indicate problems in the organization of care seems particularly difficult.

  1. The General Necessary Condition for the Validity of Dirac's Transition Perturbation Theory

    NASA Technical Reports Server (NTRS)

    Quang, Nguyen Vinh

    1996-01-01

    For the first time, from the natural requirements for the successive approximation, the general necessary condition for the validity of Dirac's method is explicitly established. It is proved that the conception of 'the transition probability per unit time' is not valid. The 'super-platinium rules' for calculating the transition probability are derived for the case of an arbitrarily strong time-independent perturbation.

  2. Probabilistic mapping of flood-induced backscatter changes in SAR time series

    NASA Astrophysics Data System (ADS)

    Schlaffer, Stefan; Chini, Marco; Giustarini, Laura; Matgen, Patrick

    2017-04-01

    The information content of flood extent maps can be increased considerably by including information on the uncertainty of the flood area delineation. This additional information can be of benefit in flood forecasting and monitoring. Furthermore, flood probability maps can be converted to binary maps showing flooded and non-flooded areas by applying a threshold probability value pF = 0.5. In this study, a probabilistic change detection approach for flood mapping based on synthetic aperture radar (SAR) time series is proposed. For this purpose, conditional probability density functions (PDFs) for land and open water surfaces were estimated from ENVISAT ASAR Wide Swath (WS) time series containing >600 images using a reference mask of permanent water bodies. A pixel-wise harmonic model was used to account for seasonality in backscatter from land areas caused by soil moisture and vegetation dynamics. The approach was evaluated for a large-scale flood event along the River Severn, United Kingdom. The retrieved flood probability maps were compared to a reference flood mask derived from high-resolution aerial imagery by means of reliability diagrams. The obtained performance measures indicate both high reliability and confidence although there was a slight under-estimation of the flood extent, which may in part be attributed to topographically induced radar shadows along the edges of the floodplain. Furthermore, the results highlight the importance of local incidence angle for the separability between flooded and non-flooded areas as specular reflection properties of open water surfaces increase with a more oblique viewing geometry.
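
    The Bayesian pixel-classification step can be sketched as below; the class-conditional Gaussians and the flat prior are hypothetical placeholders, and the pixel-wise harmonic seasonality model described above is omitted for brevity.

        import numpy as np
        from scipy import stats

        def flood_probability(sigma0_db, land_pdf, water_pdf, prior_flood=0.5):
            """Per-pixel posterior P(flood | backscatter) from class-conditional PDFs
            for land and open water and a prior flood probability."""
            num = water_pdf.pdf(sigma0_db) * prior_flood
            den = num + land_pdf.pdf(sigma0_db) * (1.0 - prior_flood)
            return num / den

        # hypothetical class-conditional backscatter distributions (dB)
        land = stats.norm(loc=-8.0, scale=2.0)
        water = stats.norm(loc=-17.0, scale=2.5)

        pixels = np.array([-6.0, -12.0, -18.0])
        p_flood = flood_probability(pixels, land, water)
        print(p_flood)
        print(p_flood > 0.5)   # binary flood map via the p_F = 0.5 threshold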

  3. Stimulus discriminability may bias value-based probabilistic learning.

    PubMed

    Schutte, Iris; Slagter, Heleen A; Collins, Anne G E; Frank, Michael J; Kenemans, J Leon

    2017-01-01

    Reinforcement learning tasks are often used to assess participants' tendency to learn more from the positive or more from the negative consequences of one's action. However, this assessment often requires comparison in learning performance across different task conditions, which may differ in the relative salience or discriminability of the stimuli associated with more and less rewarding outcomes, respectively. To address this issue, in a first set of studies, participants were subjected to two versions of a common probabilistic learning task. The two versions differed with respect to the stimulus (Hiragana) characters associated with reward probability. The assignment of character to reward probability was fixed within version but reversed between versions. We found that performance was highly influenced by task version, which could be explained by the relative perceptual discriminability of characters assigned to high or low reward probabilities, as assessed by a separate discrimination experiment. Participants were more reliable in selecting rewarding characters that were more discriminable, leading to differences in learning curves and their sensitivity to reward probability. This difference in experienced reinforcement history was accompanied by performance biases in a test phase assessing ability to learn from positive vs. negative outcomes. In a subsequent large-scale web-based experiment, this impact of task version on learning and test measures was replicated and extended. Collectively, these findings imply a key role for perceptual factors in guiding reward learning and underscore the need to control stimulus discriminability when making inferences about individual differences in reinforcement learning.

  4. Reasoning in psychosis: risky but not necessarily hasty.

    PubMed

    Moritz, Steffen; Scheu, Florian; Andreou, Christina; Pfueller, Ute; Weisbrod, Matthias; Roesch-Ely, Daniela

    2016-01-01

    A liberal acceptance (LA) threshold for hypotheses has been put forward to explain the well-replicated "jumping to conclusions" (JTC) bias in psychosis, particularly in patients with paranoid symptoms. According to this account, schizophrenia patients rest their decisions on lower subjective probability estimates. The initial formulation of the LA account also predicts an absence of the JTC bias under high task ambiguity (i.e., if more than one response option surpasses the subjective acceptance threshold). Schizophrenia patients (n = 62) with current or former delusions and healthy controls (n = 30) were compared on six scenarios of a variant of the beads task paradigm. Decision-making was assessed under low and high task ambiguity. Along with decision judgments (optional), participants were required to provide probability estimates for each option in order to determine decision thresholds (i.e., the probability the individual deems sufficient for a decision). In line with the LA account, schizophrenia patients showed a lowered decision threshold compared to controls (82% vs. 93%) which predicted both more errors and less draws to decisions. Group differences on thresholds were comparable across conditions. At the same time, patients did not show hasty decision-making, reflecting overall lowered probability estimates in patients. Results confirm core predictions derived from the LA account. Our results may (partly) explain why hasty decision-making is sometimes aggravated and sometimes abolished in psychosis. The proneness to make risky decisions may contribute to the pathogenesis of psychosis. A revised LA account is put forward.

  5. Food prices and poverty negatively affect micronutrient intakes in Guatemala.

    PubMed

    Iannotti, Lora L; Robles, Miguel; Pachón, Helena; Chiarella, Cristina

    2012-08-01

    Limited empirical evidence exists for how economic conditions affect micronutrient nutrition. We hypothesized that increasing poverty and rising food prices would reduce consumption of high-quality "luxury" foods, leading to an increased probability of inadequacy for several nutrients. The 2006 Guatemala National Living Conditions Survey was analyzed. First, energy and nutrient intakes and adequacy levels were calculated. Second, the income-nutrient relationships were investigated by assessing disparities in intakes, determining income-nutrient elasticities, and modeling nutrient intakes by reductions in income. Third, the food price-nutrient relationships were explored through determination of price-nutrient elasticities and modeling 2 price scenarios: an increase in food prices similar in magnitude to the food price crisis of 2007-2008 and a standardized 10% increase across all food groups. Disparities in nutrient intakes were greatest for vitamin B-12 (0.38 concentration index) and vitamin A (0.30 concentration index); these nutrients were highly and positively correlated with income (r = 0.22-0.54; P < 0.05). Although the baseline probability of inadequacy was highest for vitamin B-12 (83%), zinc showed the greatest increase in probability of inadequacy as income was reduced, followed by folate and vitamin A. With rising food prices, zinc intake was most acutely affected under both scenarios (P < 0.05) and folate intake in the poorest quintile (+7 percentage points) under the 10% scenario. Price-nutrient elasticities were highest for vitamin B-12 and the meat, poultry, and fish group (-0.503) and for folate and the legumes group (-0.343). The economic factors of food prices and income differentially influenced micronutrient intakes in Guatemala, notably zinc and folate intakes.

  6. Generating intrinsically disordered protein conformational ensembles from a Markov chain

    NASA Astrophysics Data System (ADS)

    Cukier, Robert I.

    2018-03-01

    Intrinsically disordered proteins (IDPs) sample a diverse conformational space. They are important to signaling and regulatory pathways in cells. An entropy penalty must be paid when an IDP becomes ordered upon interaction with another protein or a ligand. Thus, the degree of conformational disorder of an IDP is of interest. We create a dichotomic Markov model that can explore entropic features of an IDP. The Markov condition introduces local (neighbor residues in a protein sequence) rotamer dependences that arise from van der Waals and other chemical constraints. A protein sequence of length N is characterized by its (information) entropy and mutual information, MIMC, the latter providing a measure of the dependence among the random variables describing the rotamer probabilities of the residues that comprise the sequence. For a Markov chain, the MIMC is proportional to the pair mutual information MI which depends on the singlet and pair probabilities of neighbor residue rotamer sampling. All 2^N sequence states are generated, along with their probabilities, and contrasted with the probabilities under the assumption of independent residues. An efficient method to generate realizations of the chain is also provided. The chain entropy, MIMC, and state probabilities provide the ingredients to distinguish different scenarios using the terminologies: MoRF (molecular recognition feature), not-MoRF, and not-IDP. A MoRF corresponds to large entropy and large MIMC (strong dependence among the residues' rotamer sampling), a not-MoRF corresponds to large entropy but small MIMC, and not-IDP corresponds to low entropy irrespective of the MIMC. We show that MoRFs are most appropriate as descriptors of IDPs. They provide a reasonable number of high-population states that reflect the dependences between neighbor residues, thus classifying them as IDPs, yet without very large entropy that might lead to a too high entropy penalty.
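
    For a dichotomic (two-state) Markov chain, the pair mutual information between neighbouring residues follows directly from the stationary distribution and the transition matrix. A short sketch with a hypothetical transition matrix; the numerical entries are illustrative only and do not come from the paper.

        import numpy as np

        def pair_mutual_information(P):
            """Mutual information (bits) between neighbouring residues' rotamer states
            for a dichotomic Markov chain with transition matrix P[i, j] = P(j | i)."""
            evals, evecs = np.linalg.eig(P.T)
            pi = np.real(evecs[:, np.argmin(np.abs(evals - 1.0))])
            pi = pi / pi.sum()                   # stationary distribution
            joint = pi[:, None] * P              # P(current = i, next = j)
            marg = joint.sum(axis=0)
            mi = 0.0
            for i in range(P.shape[0]):
                for j in range(P.shape[1]):
                    if joint[i, j] > 0:
                        mi += joint[i, j] * np.log2(joint[i, j] / (pi[i] * marg[j]))
            return mi

        # hypothetical two-state rotamer transition matrix with moderate neighbour dependence
        P = np.array([[0.8, 0.2],
                      [0.3, 0.7]])
        print(pair_mutual_information(P))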

  7. Performance Analysis of Cluster Formation in Wireless Sensor Networks.

    PubMed

    Montiel, Edgar Romo; Rivero-Angeles, Mario E; Rubino, Gerardo; Molina-Lozano, Heron; Menchaca-Mendez, Rolando; Menchaca-Mendez, Ricardo

    2017-12-13

    Clustered-based wireless sensor networks have been extensively used in the literature in order to achieve considerable energy consumption reductions. However, two aspects of such systems have been largely overlooked. Namely, the transmission probability used during the cluster formation phase and the way in which cluster heads are selected. Both of these issues have an important impact on the performance of the system. For the former, it is common to consider that sensor nodes in a clustered-based Wireless Sensor Network (WSN) use a fixed transmission probability to send control data in order to build the clusters. However, due to the highly variable conditions experienced by these networks, a fixed transmission probability may lead to extra energy consumption. In view of this, three different transmission probability strategies are studied: optimal, fixed and adaptive. In this context, we also investigate cluster head selection schemes, specifically, we consider two intelligent schemes based on the fuzzy C-means and k-medoids algorithms and a random selection with no intelligence. We show that the use of intelligent schemes greatly improves the performance of the system, but their use entails higher complexity and selection delay. The main performance metrics considered in this work are energy consumption, successful transmission probability and cluster formation latency. As an additional feature of this work, we study the effect of errors in the wireless channel and the impact on the performance of the system under the different transmission probability schemes.

  8. Performance Analysis of Cluster Formation in Wireless Sensor Networks

    PubMed Central

    Montiel, Edgar Romo; Rivero-Angeles, Mario E.; Rubino, Gerardo; Molina-Lozano, Heron; Menchaca-Mendez, Rolando; Menchaca-Mendez, Ricardo

    2017-01-01

    Clustered-based wireless sensor networks have been extensively used in the literature in order to achieve considerable energy consumption reductions. However, two aspects of such systems have been largely overlooked. Namely, the transmission probability used during the cluster formation phase and the way in which cluster heads are selected. Both of these issues have an important impact on the performance of the system. For the former, it is common to consider that sensor nodes in a clustered-based Wireless Sensor Network (WSN) use a fixed transmission probability to send control data in order to build the clusters. However, due to the highly variable conditions experienced by these networks, a fixed transmission probability may lead to extra energy consumption. In view of this, three different transmission probability strategies are studied: optimal, fixed and adaptive. In this context, we also investigate cluster head selection schemes, specifically, we consider two intelligent schemes based on the fuzzy C-means and k-medoids algorithms and a random selection with no intelligence. We show that the use of intelligent schemes greatly improves the performance of the system, but their use entails higher complexity and selection delay. The main performance metrics considered in this work are energy consumption, successful transmission probability and cluster formation latency. As an additional feature of this work, we study the effect of errors in the wireless channel and the impact on the performance of the system under the different transmission probability schemes. PMID:29236065
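
    One simple way to see why a fixed transmission probability can be wasteful is a slotted-ALOHA-style view of the cluster-formation control phase: the chance that a control slot carries exactly one transmission is maximized when the transmission probability adapts to the number of contenders. The sketch below is a simplification under that assumption, not the full model analysed in the paper.

        def slot_success_probability(p, n):
            """Probability that exactly one of n contending sensor nodes transmits in a
            given control slot (slotted-ALOHA-style view of cluster formation)."""
            return n * p * (1.0 - p) ** (n - 1)

        n_nodes = 50
        p_fixed = 0.05             # fixed transmission probability
        p_adapted = 1.0 / n_nodes  # probability adapted to the number of contenders
        print(slot_success_probability(p_fixed, n_nodes))
        print(slot_success_probability(p_adapted, n_nodes))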

  9. Deterioration and cost information for bridge management.

    DOT National Transportation Integrated Search

    2012-05-01

    This study applies contract bid tabulations and element-level condition records to develop element-level actions, costs for actions, transition probabilities for models of deterioration of bridge elements, and transition probabilities for imp...

  10. Modeling the effect of bus stops on capacity of curb lane

    NASA Astrophysics Data System (ADS)

    Luo, Qingyu; Zheng, Tianyao; Wu, Wenjing; Jia, Hongfei; Li, Jin

    With the increase of buses and bus lines, a negative effect on road section capacity is made by the prolonged delay and queuing time at bus stops. However, existing methods of measuring the negative effect pay little attention to different bus stop types in the curb lanes. This paper uses Gap theory and Queuing theory to build models for effect-time and potential capacity in different conditions, including curbside bus stops, bus bays with overflow and bus bays without overflow. In order to make the effect-time models accurate and reliable, two types of probabilities are introduced. One is the probability that the dwell time is less than the headway of curb lane at curbside bus stops; the other is the overflow probability at bus bays. Based on the fundamental road capacity model and effect-time models, potential capacity models of curb lane are designed. The new models are calibrated by the survey data from Changchun City, and verified by the simulation software of VISSIM. Furthermore, with different arrival rates of vehicles, the setting conditions of bus stops are researched. Results show that the potential capacity models have high precision. They can offer a reference for recognizing the effect of bus stops on the capacity of curb lane, which can provide a basis for planning, design and management of urban roads and bus stops.
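
    As a hedged illustration of the first probability mentioned above, if curb-lane vehicle headways are assumed to be exponentially distributed with flow rate q in vehicles per second (an assumption made only for this sketch; the paper derives its effect-time models from gap and queuing theory), then for a dwell time t_d the probability that the dwell ends within one headway is

        P(t_d < h) = \int_{t_d}^{\infty} q\, e^{-q t}\, dt = e^{-q\, t_d},

    so longer dwell times or heavier curb-lane flow rapidly reduce the chance that a stopped bus clears before the next vehicle arrives.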

  11. High-Speed Wind-Tunnel Investigation of the Lateral Stability Characteristics of a 0.10-Scale Model of the Grumman XF9F-2 Airplane, TED No. NACA DE 301

    NASA Technical Reports Server (NTRS)

    Polhamus, Edward C.; King, Thomas J., Jr.

    1949-01-01

    An investigation was made in the Langley high-speed 7- by 10-foot tunnel to determine the high-speed lateral and directional stability characteristics of a 0.10-scale model of the Grumman XF9F-2 airplane in the Mach number range from 0.40 to 0.85. The results indicate that static lateral and directional stability is present throughout the Mach number range investigated, although in the Mach number range from 0.75 to 0.85 there is an appreciable decrease in rolling moment due to sideslip. Calculations of the dynamic stability indicate that, according to current flying-quality requirements, the damping of the lateral oscillation, although probably satisfactory for the sea-level condition, may not be satisfactory for the majority of the altitude conditions investigated.

  12. High effective algorithm of the detection and identification of substance using the noisy reflected THz pulse

    NASA Astrophysics Data System (ADS)

    Trofimov, Vyacheslav A.; Varentsova, Svetlana A.; Trofimov, Vladislav V.; Tikhomirov, Vasily V.

    2015-08-01

    Principal limitations of the standard THz-TDS method for detection and identification are demonstrated under real conditions (at a long distance of about 3.5 m and at a high relative humidity of more than 50%) using neutral substances: a thick paper bag, paper napkins, and chocolate. We also show that the THz-TDS method detects spectral features of dangerous substances even when the THz signals are measured under laboratory conditions (at a distance of 30-40 cm from the receiver and at a low relative humidity of less than 2%); silicon-based semiconductors were used as the samples. However, the integral correlation criteria, based on the SDA method, allow us to detect the absence of dangerous substances in the neutral substances. The discussed algorithm shows a high probability of substance identification and can be reliably realized in practice, especially for security applications and non-destructive testing.

  13. Use of weather data and remote sensing to predict the geographic and seasonal distribution of Phlebotomus papatasi in southwest Asia.

    PubMed

    Cross, E R; Newcomb, W W; Tucker, C J

    1996-05-01

    Sandfly fever and leishmaniasis were major causes of infectious disease morbidity among military personnel deployed to the Middle East during World War II. Recently, leishmaniasis has been reported in the United Nations Multinational Forces and Observers in the Sinai. Despite these indications of endemicity, no cases of sandfly fever and only 31 cases of leishmaniasis have been identified among U.S. veterans of the Persian Gulf War. The distribution in the Persian Gulf of the vector, Phlebotomus papatasi, is thought to be highly dependent on environmental conditions, especially temperature and relative humidity. A computer model was developed using the occurrence of P. papatasi as the dependent variable and weather data as the independent variables. The results of this model indicated that the greatest sand fly activity and thus the highest risk of sandfly fever and leishmania infections occurred during the spring/summer months before U.S. troops were deployed to the Persian Gulf. Because the weather model produced probability of occurrence information for locations of the weather stations only, normalized difference vegetation index (NDVI) levels from remotely sensed Advanced Very High Resolution Radiometer satellites were determined for each weather station. From the results of the frequency of NDVI levels by probability of occurrence, the range of NDVI levels for presence of the vector was determined. The computer then identified all pixels within the NDVI range indicated and produced a computer-generated map of the probable distribution of P. papatasi. The resulting map expanded the analysis to areas where there were no weather stations and from which no information was reported in the literature, identifying these areas as having either a high or low probability of vector occurrence.

  14. Predictive models attribute effects on fish assemblages to toxicity and habitat alteration.

    PubMed

    de Zwart, Dick; Dyer, Scott D; Posthuma, Leo; Hawkins, Charles P

    2006-08-01

    Biological assessments should both estimate the condition of a biological resource (magnitude of alteration) and provide environmental managers with a diagnosis of the potential causes of impairment. Although methods of quantifying condition are well developed, identifying and proportionately attributing impairment to probable causes remain problematic. Furthermore, analyses of both condition and cause have often been difficult to communicate. We developed an approach that (1) links fish, habitat, and chemistry data collected from hundreds of sites in Ohio (USA) streams, (2) assesses the biological condition at each site, (3) attributes impairment to multiple probable causes, and (4) provides the results of the analyses in simple-to-interpret pie charts. The data set was managed using a geographic information system. Biological condition was assessed using a RIVPACS (river invertebrate prediction and classification system)-like predictive model. The model provided probabilities of capture for 117 fish species based on the geographic location of sites and local habitat descriptors. Impaired biological condition was defined as the proportion of those native species predicted to occur at a site that were observed. The potential toxic effects of exposure to mixtures of contaminants were estimated using species sensitivity distributions and mixture toxicity principles. Generalized linear regression models described species abundance as a function of habitat characteristics. Statistically linking biological condition, habitat characteristics including mixture risks, and species abundance allowed us to evaluate the losses of species with environmental conditions. Results were mapped as simple effect and probable-cause pie charts (EPC pie diagrams), with pie sizes corresponding to magnitude of local impairment, and slice sizes to the relative probable contributions of different stressors. The types of models we used have been successfully applied in ecology and ecotoxicology, but they have not previously been used in concert to quantify impairment and its likely causes. Although data limitations constrained our ability to examine complex interactions between stressors and species, the direct relationships we detected likely represent conservative estimates of stressor contributions to local impairment. Future refinements of the general approach and specific methods described here should yield even more promising results.
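
    A minimal sketch of the condition score implied by this definition, in the spirit of RIVPACS-style O/E scoring (hypothetical species names and capture probabilities, not the study's data):

    # Python sketch: observed-to-expected condition score from modeled capture probabilities.
    def oe_condition(capture_probs, observed, threshold=0.5):
        """O/E score: observed count vs expected number of species predicted to occur."""
        predicted = [sp for sp, p in capture_probs.items() if p >= threshold]
        expected = sum(capture_probs[sp] for sp in predicted)   # E: sum of capture probabilities
        obs = sum(1 for sp in predicted if sp in observed)      # O: how many were actually seen
        return obs / expected

    probs = {"darter": 0.9, "dace": 0.7, "madtom": 0.6, "shiner": 0.2}   # hypothetical values
    print(round(oe_condition(probs, observed={"darter", "dace"}), 2))    # ~0.91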

  15. Experimental investigation of the intensity fluctuation joint probability and conditional distributions of the twin-beam quantum state.

    PubMed

    Zhang, Yun; Kasai, Katsuyuki; Watanabe, Masayoshi

    2003-01-13

    We give the intensity fluctuation joint probability of the twin-beam quantum state, which was generated with an optical parametric oscillator operating above threshold. Then we present what is, to our knowledge, the first measurement of the intensity fluctuation conditional probability distributions of twin beams. The measured inference variance of the twin beams, 0.62+/-0.02, is less than the standard quantum limit of unity, indicating inference with a precision better than that of separable states. The measured photocurrent variance exhibits a quantum correlation of as much as -4.9+/-0.2 dB between the signal and the idler.

  16. A probability space for quantum models

    NASA Astrophysics Data System (ADS)

    Lemmens, L. F.

    2017-06-01

    A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, the choice of constraints, and the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac, and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
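
    The maximum entropy assignment referred to here has a standard closed form. As a sketch (a textbook result, independent of this paper's particular choice of propositions), maximizing the entropy subject to normalization and average-value constraints yields an exponential-family distribution:

        \max_{p}\; \Big(-\sum_i p_i \ln p_i\Big) \quad \text{s.t.} \quad \sum_i p_i = 1,\;\; \sum_i p_i\, f_k(i) = F_k
        \;\;\Longrightarrow\;\; p_i = \frac{1}{Z(\lambda)} \exp\!\Big(-\sum_k \lambda_k f_k(i)\Big),
        \qquad Z(\lambda) = \sum_i \exp\!\Big(-\sum_k \lambda_k f_k(i)\Big),

    with the multipliers \lambda_k fixed by the constraints; choosing energy, particle number, or occupation propositions as the f_k recovers the familiar ensemble forms.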

  17. Rethinking the learning of belief network probabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Musick, R.

    Belief networks are a powerful tool for knowledge discovery that provide concise, understandable probabilistic models of data. There are methods grounded in probability theory to incrementally update the relationships described by the belief network when new information is seen, to perform complex inferences over any set of variables in the data, to incorporate domain expertise and prior knowledge into the model, and to automatically learn the model from data. This paper concentrates on part of the belief network induction problem, that of learning the quantitative structure (the conditional probabilities), given the qualitative structure. In particular, the current practice of rote learning the probabilities in belief networks can be significantly improved upon. We advance the idea of applying any learning algorithm to the task of conditional probability learning in belief networks, discuss potential benefits, and show results of applying neural networks and other algorithms to a medium-sized car insurance belief network. The results demonstrate from 10 to 100% improvements in model error rates over the current approaches.
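
    For context, the "rote" baseline that the paper argues can be improved upon amounts to estimating each conditional probability table by frequency counting given the parent configuration. A minimal sketch follows (hypothetical variable names; Laplace smoothing is added so unseen parent configurations do not yield undefined probabilities):

    from collections import Counter, defaultdict
    from itertools import product

    def learn_cpt(records, child, parents, values, alpha=1.0):
        """Estimate P(child | parents) from complete records by smoothed counting."""
        counts = defaultdict(Counter)
        for r in records:
            counts[tuple(r[p] for p in parents)][r[child]] += 1
        cpt = {}
        for parent_cfg in product(*(values[p] for p in parents)):
            c = counts[parent_cfg]
            total = sum(c.values()) + alpha * len(values[child])
            cpt[parent_cfg] = {v: (c[v] + alpha) / total for v in values[child]}
        return cpt

    # Hypothetical toy data in the spirit of the car-insurance example.
    data = [{"age": "young", "risk": "high"}, {"age": "young", "risk": "high"},
            {"age": "old", "risk": "low"}]
    vals = {"age": ["young", "old"], "risk": ["high", "low"]}
    print(learn_cpt(data, "risk", ["age"], vals))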

  18. Extreme weather and experience influence reproduction in an endangered bird

    USGS Publications Warehouse

    Reichert, Brian E.; Cattau, Christopher E.; Fletcher, Robert J.; Kendall, William L.; Kitchens, Wiley M.

    2012-01-01

    Using a 14-year time series spanning large variation in climatic conditions and the entirety of a population's breeding range, we estimated the effects of extreme weather conditions (drought) on the state-specific probabilities of breeding and survival of an endangered bird, the Florida Snail Kite (Rostrhamus sociabilis plumbeus). Our analysis accounted for uncertainty in breeding status assignment, a common source of uncertainty that is often ignored when states are based on field observations. Breeding probabilities in adult kites (>1 year of age) decreased during droughts, whereas the probability of breeding in young kites (1 year of age) tended to increase. Individuals attempting to breed showed no evidence of reduced future survival. Although population viability analyses of this species and other species often implicitly assume that all adults will attempt to breed, we find that breeding probabilities were significantly <1 for all 13 estimable years considered. Our results suggest that experience is an important factor determining whether or not individuals attempt to breed during harsh environmental conditions and that reproductive effort may be constrained by an individual's quality and/or despotic behavior among individuals attempting to breed.

  19. Community-specific hydraulic conductance potential of soil water decomposed for two Alpine grasslands by small-scale lysimetry

    NASA Astrophysics Data System (ADS)

    Frenck, Georg; Leitinger, Georg; Obojes, Nikolaus; Hofmann, Magdalena; Newesely, Christian; Deutschmann, Mario; Tappeiner, Ulrike; Tasser, Erich

    2018-02-01

    For central Europe, in addition to rising temperatures, an increasing variability in precipitation is predicted. This will increase the probability of drought periods in the Alps, where water supply has been sufficient in most areas so far. For Alpine grasslands, community-specific imprints on drought responses have been poorly analyzed so far because of the sufficient natural water supply. In a replicated mesocosm experiment we compared evapotranspiration (ET) and biomass productivity of two differently drought-adapted Alpine grassland communities during two artificial drought periods divided by extreme precipitation events, using high-precision small lysimeters. The drought-adapted vegetation type showed a high potential to utilize even scarce water resources, combined with a lower potential to translate atmospheric deficits into higher water conductance and a lower biomass production than those measured for the non-drought-adapted type. The non-drought-adapted type, in contrast, showed high water conductance potential and a strong increase in ET rates when environmental conditions became less constraining. With high rates even under dry conditions, this community appears not to be optimized to save water and might experience drought effects earlier and probably more strongly. As a result, the water use efficiency of the drought-adapted plant community is, at 2.6 gDW kg-1 of water, much higher than that of the non-drought-adapted plant community (0.16 gDW kg-1). In summary, the vegetation's reaction to two covarying gradients of potential evapotranspiration and soil water content revealed a clear difference in vegetation development and between water-saving and water-spending strategies regarding evapotranspiration.

  20. Healthy-unhealthy weight and time preference. Is there an association? An analysis through a consumer survey.

    PubMed

    Cavaliere, Alessia; De Marchi, Elisa; Banterle, Alessandro

    2014-12-01

    Individual time preference has been recognized as a key driver in explaining consumers' probability of having a healthy weight or of incurring excess weight problems. The term time preference refers to the rate at which a person is disposed to trade a current satisfaction for a future benefit. This characteristic may affect the extent to which individuals invest in health and may influence diet choices. The purpose of this paper is to analyse the role that time preference (measured in terms of diet-related behaviours) could play in explaining consumers' healthy or unhealthy body weight. The analysis also considers other drivers predicted to influence BMI, specifically information searching, health-related activities and socio-demographic conditions. The survey was based on face-to-face interviews with a sample of 240 consumers living in Milan. In order to test the hypothesis, we performed a set of seven ORM regressions, all having consumers' BMI as the dependent variable. Each ORM contains a different block of explanatory variables, while time preference is always included among the regressors. The results suggest that the healthy weight condition is associated with a high orientation to the future, a high interest in nutrition claims, a low attention to health-related claims, and a high level of education. Conversely, the probability of being overweight or obese increases when consumers are less future-oriented and is associated with less searching for nutrition claims and a higher interest in health claims. Copyright © 2014 Elsevier Ltd. All rights reserved.

  1. Probability, propensity and probability of propensities (and of probabilities)

    NASA Astrophysics Data System (ADS)

    D'Agostini, Giulio

    2017-06-01

    The process of doing Science under conditions of uncertainty is illustrated with a toy experiment in which the inferential and the forecasting aspects are both present. The fundamental aspects of probabilistic reasoning, also relevant in real-life applications, arise quite naturally, and the resulting discussion among non-ideologized, free-minded people offers an opportunity for clarifications.

  2. Influences of Source-Item Contingency and Schematic Knowledge on Source Monitoring: Tests of the Probability-Matching Account

    ERIC Educational Resources Information Center

    Bayen, Ute J.; Kuhlmann, Beatrice G.

    2011-01-01

    The authors investigated conditions under which judgments in source-monitoring tasks are influenced by prior schematic knowledge. According to a probability-matching account of source guessing (Spaniol & Bayen, 2002), when people do not remember the source of information, they match source-guessing probabilities to the perceived contingency…

  3. The role of probabilities in physics.

    PubMed

    Le Bellac, Michel

    2012-09-01

    Although modern physics was born in the XVIIth century as a fully deterministic theory in the form of Newtonian mechanics, the use of probabilistic arguments turned out later on to be unavoidable. Three main situations can be distinguished. (1) When the number of degrees of freedom is very large, on the order of Avogadro's number, a detailed dynamical description is not possible, and in fact not useful: we do not care about the velocity of a particular molecule in a gas, all we need is the probability distribution of the velocities. This statistical description introduced by Maxwell and Boltzmann allows us to recover equilibrium thermodynamics, gives a microscopic interpretation of entropy and underlies our understanding of irreversibility. (2) Even when the number of degrees of freedom is small (but larger than three) sensitivity to initial conditions of chaotic dynamics makes determinism irrelevant in practice, because we cannot control the initial conditions with infinite accuracy. Although die tossing is in principle predictable, the approach to chaotic dynamics in some limit implies that our ignorance of initial conditions is translated into a probabilistic description: each face comes up with probability 1/6. (3) As is well-known, quantum mechanics is incompatible with determinism. However, quantum probabilities differ in an essential way from the probabilities introduced previously: it has been shown from the work of John Bell that quantum probabilities are intrinsic and cannot be given an ignorance interpretation based on a hypothetical deeper level of description. Copyright © 2012 Elsevier Ltd. All rights reserved.

  4. The rise and fall of a challenger: the Bullet Cluster in Λ cold dark matter simulations

    NASA Astrophysics Data System (ADS)

    Thompson, Robert; Davé, Romeel; Nagamine, Kentaro

    2015-09-01

    The Bullet Cluster has provided some of the best evidence for the Λ cold dark matter (ΛCDM) model via direct empirical proof of the existence of collisionless dark matter, while posing a serious challenge owing to the unusually high inferred pairwise velocities of its progenitor clusters. Here, we investigate the probability of finding such a high-velocity pair in large-volume N-body simulations, particularly focusing on differences between halo-finding algorithms. We find that algorithms that do not account for the kinematics of infalling groups yield vastly different statistics and probabilities. When employing the ROCKSTAR halo finder that considers particle velocities, we find numerous Bullet-like pair candidates that closely match not only the high pairwise velocity, but also the mass, mass ratio, separation distance, and collision angle of the initial conditions that have been shown to produce the Bullet Cluster in non-cosmological hydrodynamic simulations. The probability of finding a high pairwise velocity pair among haloes with M_halo ≥ 10^14 M⊙ is 4.6 × 10^-4 using ROCKSTAR, while it is ≈34× lower using a friends-of-friends (FoF)-based approach as in previous studies. This is because the typical spatial extent of Bullet progenitors is such that FoF tends to group them into a single halo despite clearly distinct kinematics. Further requiring an appropriately high average mass among the two progenitors, we find the comoving number density of potential Bullet-like candidates to be of the order of ≈10^-10 Mpc^-3. Our findings suggest that ΛCDM straightforwardly produces massive, high relative velocity halo pairs analogous to Bullet Cluster progenitors, and hence the Bullet Cluster does not present a challenge to the ΛCDM model.

  5. Development and Testing of a Multiple Frequency Continuous Wave Radar for Target Detection and Classification

    DTIC Science & Technology

    2007-03-01

    ... where I0 is the modified Bessel function of zero order, given the conditional variance and the conditional probability ... the probability of detection is the area under the signal-plus-noise curve above the detection threshold ...

  6. Material Logistic Support of the Hospital Ships

    DTIC Science & Technology

    1986-12-01

    Codeine Sulfate Tablets 6505-00-132-6904; Isoniazid Tablets 6505-00-165-6545; Cephalexin Capsules 6505-00-165-6575; Rifampin Capsules 6505-00-400-2054 ... consumption rate for medical consumable items for a specific condition under Scenario A ... contribution factor for Bisacodyl Tablets for the scenario ... probability that patient condition 249 will require Bisacodyl. If the probability was twenty percent, then the amount of Bisacodyl needed would be two tablets.

  7. Modeling Spatial Dependence of Rainfall Extremes Across Multiple Durations

    NASA Astrophysics Data System (ADS)

    Le, Phuong Dong; Leonard, Michael; Westra, Seth

    2018-03-01

    Determining the probability of a flood event in a catchment given that another flood has occurred in a nearby catchment is useful in the design of infrastructure such as road networks that have multiple river crossings. These conditional flood probabilities can be estimated by calculating conditional probabilities of extreme rainfall and then transforming rainfall to runoff through a hydrologic model. Each catchment's hydrological response times are unlikely to be the same, so in order to estimate these conditional probabilities one must consider the dependence of extreme rainfall both across space and across critical storm durations. To represent these types of dependence, this study proposes a new approach for combining extreme rainfall across different durations within a spatial extreme value model using max-stable process theory. This is achieved in a stepwise manner. The first step defines a set of common parameters for the marginal distributions across multiple durations. The parameters are then spatially interpolated to develop a spatial field. Storm-level dependence is represented through the max-stable process for rainfall extremes across different durations. The dependence model shows a reasonable fit between the observed pairwise extremal coefficients and the theoretical pairwise extremal coefficient function across all durations. The study demonstrates how the approach can be applied to develop conditional maps of the return period and return level across different durations.

  8. Role of beach morphology in wave overtopping hazard assessment

    NASA Astrophysics Data System (ADS)

    Phillips, Benjamin; Brown, Jennifer; Bidlot, Jean-Raymond; Plater, Andrew

    2017-04-01

    Understanding the role of beach morphology in controlling wave overtopping volume will further minimise uncertainties in flood risk assessments at coastal locations defended by engineered structures worldwide. XBeach is used to model wave overtopping volume for a 1:200 yr joint probability distribution of waves and water levels with measured, pre- and post-storm beach profiles. The simulation with measured bathymetry is repeated with and without morphological evolution enabled during the modelled storm event. This research assesses the role of morphology in controlling wave overtopping volumes for hazardous events that meet the typical design level of coastal defence structures. Results show disabling storm-driven morphology under-represents modelled wave overtopping volumes by up to 39% under high Hs conditions, and has a greater impact on the wave overtopping rate than the variability applied within the boundary conditions due to the range of wave-water level combinations that meet the 1:200 yr joint probability criterion. Accounting for morphology in flood modelling is therefore critical for accurately predicting wave overtopping volumes and the resulting flood hazard and to assess economic losses.

  9. Illusion of control: the role of personal involvement.

    PubMed

    Yarritu, Ion; Matute, Helena; Vadillo, Miguel A

    2014-01-01

    The illusion of control consists of overestimating the influence that our behavior exerts over uncontrollable outcomes. Available evidence suggests that an important factor in development of this illusion is the personal involvement of participants who are trying to obtain the outcome. The dominant view assumes that this is due to social motivations and self-esteem protection. We propose that this may be due to a bias in contingency detection which occurs when the probability of the action (i.e., of the potential cause) is high. Indeed, personal involvement might have been often confounded with the probability of acting, as participants who are more involved tend to act more frequently than those for whom the outcome is irrelevant and therefore become mere observers. We tested these two variables separately. In two experiments, the outcome was always uncontrollable and we used a yoked design in which the participants of one condition were actively involved in obtaining it and the participants in the other condition observed the adventitious cause-effect pairs. The results support the latter approach: Those acting more often to obtain the outcome developed stronger illusions, and so did their yoked counterparts.

  10. Hard choices in assessing survival past dams — a comparison of single- and paired-release strategies

    USGS Publications Warehouse

    Zydlewski, Joseph D.; Stich, Daniel S.; Sigourney, Douglas B.

    2017-01-01

    Mark–recapture models are widely used to estimate survival of salmon smolts migrating past dams. Paired releases have been used to improve estimate accuracy by removing components of mortality not attributable to the dam. This method is accompanied by reduced precision because (i) sample size is reduced relative to a single, large release; and (ii) variance calculations inflate error. We modeled an idealized system with a single dam to assess trade-offs between accuracy and precision and compared methods using root mean squared error (RMSE). Simulations were run under predefined conditions (dam mortality, background mortality, detection probability, and sample size) to determine scenarios when the paired release was preferable to a single release. We demonstrate that a paired-release design provides a theoretical advantage over a single-release design only at large sample sizes and high probabilities of detection. At release numbers typical of many survival studies, paired release can result in overestimation of dam survival. Failures to meet model assumptions of a paired release may result in further overestimation of dam-related survival. Under most conditions, a single-release strategy was preferable.
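
    A hedged Monte Carlo sketch of the trade-off described above, assuming idealized binomial survival and perfect downstream detection (the study's mark-recapture models are considerably richer): the paired estimator divides the treatment group's survival by the control group's to strip out background mortality, removing bias at the cost of added variance, so it only wins at large release sizes.

    import numpy as np

    rng = np.random.default_rng(0)

    def rmse_single_vs_paired(n_release, s_dam, s_background, n_reps=20_000):
        """RMSE of single- vs paired-release estimates of dam passage survival."""
        # Single release: the estimate mixes dam and background mortality (biased).
        single = rng.binomial(n_release, s_dam * s_background, n_reps) / n_release
        # Paired release: treatment (dam + background) divided by control (background only).
        half = n_release // 2
        treat = rng.binomial(half, s_dam * s_background, n_reps) / half
        ctrl = rng.binomial(half, s_background, n_reps) / half
        paired = treat / np.clip(ctrl, 1e-9, None)
        rmse = lambda est: float(np.sqrt(np.mean((est - s_dam) ** 2)))
        return rmse(single), rmse(paired)

    for n in (100, 1_000, 10_000):
        print(n, rmse_single_vs_paired(n, s_dam=0.90, s_background=0.95))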

  11. Ground-water quality in east-central Idaho valleys

    USGS Publications Warehouse

    Parliman, D.J.

    1982-01-01

    From May through November 1978, water quality, geologic, and hydrologic data were collected for 108 wells in the Lemhi, Pahsimeroi, Salmon River (Stanley to Salmon), Big Lost River, and Little Lost River valleys in east-central Idaho. Data were assembled to define, on a reconnaissance level, water-quality conditions in major aquifers and to develop an understanding of factors that affected conditions in 1978 and could affect future ground-water quality. Water-quality characteristics determined include specific conductance, pH, water temperature, major dissolved cations, major dissolved anions, and coliform bacteria. Concentrations of hardness, nitrite plus nitrate, coliform bacteria, dissolved solids, sulfate, chloride, fluoride, iron, calcium, magnesium, sodium, potassium, or bicarbonate exceed public drinking water regulation limits or were anomalously high in some water samples. Highly mineralized ground water probably is due to the natural composition of the aquifers and not to surface contamination. Concentrations of coliform bacteria that exceed public drinking water limits and anomalously high dissolved nitrite-plus-nitrate concentrations are from 15- to 20-year-old irrigation wells in heavily irrigated or more densely populated areas of the valleys. Ground-water quality and quantity in most of the study area are sufficient to meet current (1978) population and economic demands. Ground water in all valleys is characterized by significant concentrations of calcium, magnesium, and bicarbonate plus carbonate ions. Variations in the general trend of ground-water composition (especially in the Lemhi Valley) probably are most directly related to variability in aquifer lithology and proximity of the sampling site to the source of recharge. (USGS)

  12. Atmospheric Visibility Monitoring for planetary optical communications

    NASA Technical Reports Server (NTRS)

    Cowles, Kelly

    1991-01-01

    The Atmospheric Visibility Monitoring project endeavors to improve current atmospheric models and generate visibility statistics relevant to prospective earth-satellite optical communications systems. Three autonomous observatories are being used to measure atmospheric conditions on the basis of observed starlight; these data will yield clear-sky and transmission statistics for three sites with high clear-sky probabilities. Ground-based data will be compared with satellite imagery to determine the correlation between satellite data and ground-based observations.

  13. Protein construct storage: Bayesian variable selection and prediction with mixtures.

    PubMed

    Clyde, M A; Parmigiani, G

    1998-07-01

    Determining optimal conditions for protein storage while maintaining a high level of protein activity is an important question in pharmaceutical research. A designed experiment based on a space-filling design was conducted to understand the effects of factors affecting protein storage and to establish optimal storage conditions. Different model-selection strategies to identify important factors may lead to very different answers about optimal conditions. Uncertainty about which factors are important, or model uncertainty, can be a critical issue in decision-making. We use Bayesian variable selection methods for linear models to identify important variables in the protein storage data, while accounting for model uncertainty. We also use the Bayesian framework to build predictions based on a large family of models, rather than an individual model, and to evaluate the probability that certain candidate storage conditions are optimal.
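
    A compressed sketch of the model-averaging idea described above, approximating posterior model probabilities with BIC weights over all subsets of a few hypothetical storage factors (the study's fully Bayesian variable-selection machinery and mixture priors are more elaborate). Predictions can then be averaged over models with these weights rather than taken from a single selected model.

    import itertools
    import numpy as np

    def bic_model_weights(X, y, names):
        """Approximate posterior probabilities over all predictor subsets via BIC weights."""
        n = len(y)
        models, bics = [], []
        for k in range(len(names) + 1):
            for subset in itertools.combinations(range(len(names)), k):
                Xs = np.column_stack([np.ones(n)] + [X[:, j] for j in subset])
                beta, *_ = np.linalg.lstsq(Xs, y, rcond=None)
                rss = float(np.sum((y - Xs @ beta) ** 2))
                bics.append(n * np.log(rss / n) + Xs.shape[1] * np.log(n))
                models.append(tuple(names[j] for j in subset))
        w = np.exp(-0.5 * (np.array(bics) - min(bics)))
        return dict(zip(models, w / w.sum()))

    # Hypothetical factors affecting stored-protein activity.
    rng = np.random.default_rng(1)
    X = rng.normal(size=(40, 3))                    # e.g., temperature, pH, additive level
    y = 2.0 * X[:, 0] - 1.0 * X[:, 1] + rng.normal(scale=0.5, size=40)
    weights = bic_model_weights(X, y, ["temp", "pH", "additive"])
    for model, prob in sorted(weights.items(), key=lambda kv: -kv[1])[:4]:
        print(model, round(float(prob), 3))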

  14. Adaptive aperture for Geiger mode avalanche photodiode flash ladar systems.

    PubMed

    Wang, Liang; Han, Shaokun; Xia, Wenze; Lei, Jieyu

    2018-02-01

    Although the Geiger-mode avalanche photodiode (GM-APD) flash ladar system offers the advantages of high sensitivity and simple construction, its detection performance is influenced not only by the incoming signal-to-noise ratio but also by the absolute number of noise photons. In this paper, we deduce a hyperbolic approximation to estimate the noise-photon number from the false-firing percentage in a GM-APD flash ladar system under dark conditions. By using this hyperbolic approximation function, we introduce a method to adapt the aperture to reduce the number of incoming background-noise photons. Finally, the simulation results show that the adaptive-aperture method decreases the false probability in all cases, increases the detection probability provided that the signal exceeds the noise, and decreases the average ranging error per frame.

  15. Adaptive aperture for Geiger mode avalanche photodiode flash ladar systems

    NASA Astrophysics Data System (ADS)

    Wang, Liang; Han, Shaokun; Xia, Wenze; Lei, Jieyu

    2018-02-01

    Although the Geiger-mode avalanche photodiode (GM-APD) flash ladar system offers the advantages of high sensitivity and simple construction, its detection performance is influenced not only by the incoming signal-to-noise ratio but also by the absolute number of noise photons. In this paper, we deduce a hyperbolic approximation to estimate the noise-photon number from the false-firing percentage in a GM-APD flash ladar system under dark conditions. By using this hyperbolic approximation function, we introduce a method to adapt the aperture to reduce the number of incoming background-noise photons. Finally, the simulation results show that the adaptive-aperture method decreases the false probability in all cases, increases the detection probability provided that the signal exceeds the noise, and decreases the average ranging error per frame.
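
    The paper's hyperbolic approximation is not reproduced here; as a hedged illustration of the underlying relation such an approximation targets, if primary noise electrons per range gate are assumed Poisson-distributed, the dark-condition false-firing probability and the mean noise-photon number are linked by P_false = 1 - exp(-N), so N can be recovered from the measured false-firing percentage:

    import math

    def noise_count_from_false_rate(p_false: float) -> float:
        """Mean noise primaries per gate from the false-firing probability, assuming
        Poisson-distributed noise (illustrative only; the paper fits a hyperbolic
        approximation rather than using this exact inversion)."""
        return -math.log(1.0 - p_false)

    for p in (0.05, 0.20, 0.50):
        print(f"false-firing {p:.0%} -> ~{noise_count_from_false_rate(p):.3f} noise photons per gate")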

  16. High probability neurotransmitter release sites represent an energy efficient design

    PubMed Central

    Lu, Zhongmin; Chouhan, Amit K.; Borycz, Jolanta A.; Lu, Zhiyuan; Rossano, Adam J; Brain, Keith L.; Zhou, You; Meinertzhagen, Ian A.; Macleod, Gregory T.

    2016-01-01

    Nerve terminals contain multiple sites specialized for the release of neurotransmitters. Release usually occurs with low probability, a design thought to confer many advantages. High probability release sites are not uncommon but their advantages are not well understood. Here we test the hypothesis that high probability release sites represent an energy efficient design. We examined release site probabilities and energy efficiency at the terminals of two glutamatergic motor neurons synapsing on the same muscle fiber in Drosophila larvae. Through electrophysiological and ultrastructural measurements we calculated release site probabilities to differ considerably between terminals (0.33 vs. 0.11). We estimated the energy required to release and recycle glutamate from the same measurements. The energy required to remove calcium and sodium ions subsequent to nerve excitation was estimated through microfluorimetric and morphological measurements. We calculated energy efficiency as the number of glutamate molecules released per ATP molecule hydrolyzed, and high probability release site terminals were found to be more efficient (0.13 vs. 0.06). Our analytical model indicates that energy efficiency is optimal (~0.15) at high release site probabilities (~0.76). As limitations in energy supply constrain neural function, high probability release sites might ameliorate such constraints by demanding less energy. Energy efficiency can be viewed as one aspect of nerve terminal function, in balance with others, because high efficiency terminals depress significantly during episodic bursts of activity. PMID:27593375

  17. Minimal preparation computed tomography instead of barium enema/colonoscopy for suspected colon cancer in frail elderly patients: an outcome analysis study.

    PubMed

    Kealey, S M; Dodd, J D; MacEneaney, P M; Gibney, R G; Malone, D E

    2004-01-01

    To evaluate the efficacy of minimal preparation computed tomography (MPCT) in diagnosing clinically significant colonic tumours in frail, elderly patients. A prospective study was performed in a group of consecutively referred, frail, elderly patients with symptoms or signs of anaemia, pain, rectal bleeding or weight loss. The MPCT protocol consisted of 1.5 l Gastrografin 1% diluted with sterile water administered during the 48 h before the procedure with no bowel preparation or administration of intravenous contrast medium. Eight millimetre contiguous scans through the abdomen and pelvis were performed. The scans were double-reported by two gastrointestinal radiologists as showing definite (>90% certain), probable (50-90% certain), possible (<50% certain) neoplasm or normal. Where observers disagreed the more pessimistic of the two reports was accepted. The gold standard was clinical outcome at 1 year with positive end-points defined as (1) histological confirmation of CRC, (2) clinical presentation consistent with CRC without histological confirmation if the patient was too unwell for biopsy/surgery, and (3) death directly attributable to colorectal carcinoma (CRC) with/without post-mortem confirmation. Negative end-points were defined as patients with no clinical, radiological or post-mortem findings of CRC. Patients were followed for 1 year or until one of the above end-points were met. Seventy-two patients were included (mean age 81; range 62-93). One-year follow-up was completed in 94.4% (n=68). Mortality from all causes was 33% (n=24). Five histologically proven tumours were diagnosed with CT and there were two probable false-negatives. Results were analysed twice: assuming all CT lesions test positive and considering "possible" lesions test negative [brackets] (95% confidence intervals): sensitivity 0.88 (0.47-1.0) [0.75 (0.35-0.97)], specificity 0.47 (0.34-0.6) [0.87 (0.75-0.94)], positive predictive value 0.18 [0.43], negative predictive value 0.97 [0.96], positive likelihood ratio result 1.6 [5.63], negative likelihood ratio result 0.27 [0.29], kappa 0.31 [0.43]. Tumour prevalence was 12%. A graph of conditional probabilities was generated and analysed. A variety of unsuspected pathology was also found in this series of patients. MPCT should be double-reported, at least initially. "Possible" lesions should be ignored. Analysis of the graph of conditional probability applied to a group of frail, elderly patients with a high mortality from all causes (33% in our study) suggests: (1) if MPCT suggests definite or probable carcinoma, regardless of the pre-test probability, the post-test probability is high enough to warrant further action, (2) frail, elderly patients with a low pre-test probability for CRC and a negative MPCT should not have further investigation, (3) frail, elderly patients with a higher pre-test probability of CRC (such as those presenting with rectal bleeding) and a negative MPCT should have either double contrast barium enema (DCBE) or colonoscopy as further investigations or be followed clinically for 3-6 months. MPCT was acceptable to patients and clinicians and may reveal significant extra-colonic pathology.
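
    The graph of conditional probabilities used here follows directly from Bayes' theorem in odds form, and the abstract's own figures make a small worked check possible: with a 12% pre-test probability (the reported prevalence), the positive likelihood ratio of 5.63 and negative likelihood ratio of 0.29 reproduce the reported predictive values.

    def post_test_probability(pre_test_prob: float, likelihood_ratio: float) -> float:
        """Bayes' theorem in odds form: post-test odds = pre-test odds x likelihood ratio."""
        odds = pre_test_prob / (1.0 - pre_test_prob) * likelihood_ratio
        return odds / (1.0 + odds)

    pre = 0.12  # tumour prevalence reported in the study
    print(round(post_test_probability(pre, 5.63), 2))  # ~0.43, matching the reported PPV [0.43]
    print(round(post_test_probability(pre, 0.29), 2))  # ~0.04, consistent with the reported NPV [0.96]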

  18. A group filter algorithm for sea mine detection

    NASA Astrophysics Data System (ADS)

    Cobb, J. Tory; An, Myoung; Tolimieri, Richard

    2005-06-01

    Automatic detection of sea mines in coastal regions is a difficult task due to the highly variable sea bottom conditions present in the underwater environment. Detection systems must be able to discriminate objects which vary in size, shape, and orientation from naturally occurring and man-made clutter. Additionally, these automated systems must be computationally efficient to be incorporated into unmanned underwater vehicle (UUV) sensor systems characterized by high sensor data rates and limited processing abilities. Using noncommutative group harmonic analysis, a fast, robust sea mine detection system is created. A family of unitary image transforms associated to noncommutative groups is generated and applied to side scan sonar image files supplied by Naval Surface Warfare Center Panama City (NSWC PC). These transforms project key image features, geometrically defined structures with orientations, and localized spectral information into distinct orthogonal components or feature subspaces of the image. The performance of the detection system is compared against the performance of an independent detection system in terms of probability of detection (Pd) and probability of false alarm (Pfa).

  19. Changing risks of resonance in extreme weather events for higher atmospheric greenhouse gas concentrations

    NASA Astrophysics Data System (ADS)

    Huntingford, Chris; Mitchell, Dann; Osprey, Scott

    2015-04-01

    A recent paper by Petoukhov et al (2013) demonstrates that many of the recent major extreme events in the NH may have been caused by resonant conditions driving very high meridional winds around slowly moving centres-of-action. Besides high amplitudes of planetary wave numbers 6, 7, and 8, additional features are identified through four further conditions that trigger system resonance. These make high-amplitude waves more likely as well as raising the possibility of more persistent events. A concern is that human-induced climate change could create conditions more conducive to tropospheric Rossby wave resonance, thereby forcing periods of extreme weather to become more commonplace and longer lasting. Whilst the CMIP5 ensemble provides much information on expected changes, fully addressing changing probabilities of extreme event occurrence - which by definition are relatively rare - is best approached through a massive ensemble modeling framework. The climateprediction.net citizen-science massive ensemble GCM modeling framework provides of order 10^4 simulations for sea-surface temperature, sea-ice extent and atmospheric gas composition representative of both pre-industrial and contemporary conditions. Here we present what these families of simulations imply in terms of the changing likelihood of conditions for mid-latitude resonance, and implications for amplitudes of Rossby waves.

  20. Modeling spatial variation in risk of presence and insecticide resistance for malaria vectors in Laos

    PubMed Central

    Marcombe, Sébastien; Laforet, Julie; Brey, Paul T.; Corbel, Vincent; Overgaard, Hans J.

    2017-01-01

    Climatic, sociological and environmental conditions are known to affect the spatial distribution of malaria vectors and disease transmission. Intensive use of insecticides in the agricultural and public health sectors exerts a strong selective pressure on resistance genes in malaria vectors. Spatio-temporal models of favorable conditions for Anopheles species’ presence were developed to estimate the probability of presence of malaria vectors and insecticide resistance in Lao PDR. These models were based on environmental and meteorological conditions, and demographic factors. GIS software was used to build and manage a spatial database with data collected from various geographic information providers. GIS was also used to build and run the models. Results showed that potential insecticide use and therefore the probability of resistance to insecticide is greater in the southwestern part of the country, specifically in Champasack province and where malaria incidence is already known to be high. These findings can help national authorities to implement targeted and effective vector control strategies for malaria prevention and elimination among populations most at risk. Results can also be used to focus the insecticide resistance surveillance in Anopheles mosquito populations in more restricted area, reducing the area of surveys, and making the implementation of surveillance system for Anopheles mosquito insecticide resistance possible. PMID:28494013

  1. Enhancement of surface-atmosphere fluxes by desert-fringe vegetation through reduction of surface albedo and of soil heat flux

    NASA Technical Reports Server (NTRS)

    Otterman, J.

    1987-01-01

    Under the arid conditions prevailing at the end of the dry season in the western Negev/northern Sinai region, vegetation causes a sharp increase relative to bare soil in the daytime sensible heat flux from the surface to the atmosphere. Two mechanisms are involved: an increase in the surface absorptivity and a decrease in the surface heat flux. By increasing the sensible heat flux to the atmosphere through the albedo and soil heat flux reductions, desert-fringe vegetation increases daytime convection and the growth of the planetary boundary layer. Removal of vegetation by overgrazing, by reducing the sensible heat flux, tends to reduce daytime convective precipitation, producing higher probabilities of drought conditions. This assessment of overgrazing is based on observations in the Sinai/Negev, where the soil albedo is high and where overgrazing produces essentially bare soil. Even if the assessment for the Sinai/Negev does not apply quantitatively throughout Africa, the current practice in many African countries of maintaining a large population of grazing animals can contribute, through the mesoscale mechanisms described, to reduced daytime convective precipitation, perpetuating higher probabilities of drought. Time-of-day analysis of precipitation in Africa appears worthwhile, to better assess the role of surface conditions in contributing to drought.

  2. Laser beam propagation through turbulence and adaptive optics for beam delivery improvement

    NASA Astrophysics Data System (ADS)

    Nicolas, Stephane

    2015-10-01

    We report results from numerical simulations of laser beam propagation through atmospheric turbulence. In particular, we study the statistical variations of the fractional beam energy hitting inside an optical aperture placed at several kilometer distance. The simulations are performed for different turbulence conditions and engagement ranges, with and without the use of turbulence mitigation. Turbulence mitigation is simulated with phase conjugation. The energy fluctuations are deduced from time sequence realizations. It is shown that turbulence mitigation leads to an increase of the mean energy inside the aperture and decrease of the fluctuations even in strong turbulence conditions and long distance engagement. As an example, the results are applied to a high energy laser countermeasure system, where we determine the probability that a single laser pulse, or one of the pulses in a sequence, will provide a lethal energy inside the target aperture. Again, turbulence mitigation contributes to increase the performance of the system at long-distance and for strong turbulence conditions in terms of kill probability. We also discuss a specific case where turbulence contributes to increase the pulse energy within the target aperture. The present analysis can be used to evaluate the performance of a variety of systems, such as directed countermeasures, laser communication, and laser weapons.

  3. Uncovering Longitudinal Health Care Behaviors for Millions of Medicaid Enrollees: A Multistate Comparison of Pediatric Asthma Utilization.

    PubMed

    Hilton, Ross; Zheng, Yuchen; Fitzpatrick, Anne; Serban, Nicoleta

    2018-01-01

    This study introduces a framework for analyzing and visualizing health care utilization for millions of children, with a focus on pediatric asthma, one of the major chronic respiratory conditions. The data source is the 2005 to 2012 Medicaid Analytic Extract claims for 10 Southeast states. The study population consists of Medicaid-enrolled children with persistent asthma. We translate multiyear, individual-level medical claims into sequences of discrete utilization events, which are modeled using Markov renewal processes and model-based clustering. Network analysis is used to visualize utilization profiles. The method is general, allowing the study of other chronic conditions. The study population consists of 1.5 million children with persistent asthma. All states have profiles with high probability of asthma controller medication, as large as 60.6% to 90.2% of the state study population. The probability of consecutive asthma controller prescriptions ranges between 0.75 and 0.95. All states have utilization profiles with uncontrolled asthma with 4.5% to 22.9% of the state study population. The probability for controller medication is larger than for short-term medication after a physician visit but not after an emergency department (ED) visit or hospitalization. Transitions from ED or hospitalization generally have a lower probability into physician office (between 0.11 and 0.38) than into ED or hospitalization (between 0.20 and 0.59). In most profiles, children who take asthma controller medication do so regularly. Follow-up physician office visits after an ED encounter or hospitalization are observed at a low rate across all states. Finally, all states have a proportion of children who have uncontrolled asthma, meaning they do not take controller medication while they have severe outcomes.
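
    A minimal sketch of the first translation step described above, assuming claims have already been reduced to per-child sequences of discrete utilization events (hypothetical event codes; the study additionally models inter-event times with Markov renewal processes and clusters the resulting profiles):

    from collections import Counter, defaultdict

    def transition_probabilities(sequences):
        """Empirical transition probabilities between discrete utilization events."""
        counts = defaultdict(Counter)
        for seq in sequences:
            for a, b in zip(seq, seq[1:]):
                counts[a][b] += 1
        return {a: {b: n / sum(c.values()) for b, n in c.items()} for a, c in counts.items()}

    # Hypothetical codes: C = controller fill, V = office visit, E = ED visit, H = hospitalization.
    sequences = [["V", "C", "C", "E", "H"], ["V", "C", "C", "C"], ["E", "E", "V", "C"]]
    print(transition_probabilities(sequences))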

  4. The geological record of life 3500 Ma ago: Coping with the rigors of a young earth during late accretion

    NASA Technical Reports Server (NTRS)

    Lowe, Donald R.

    1989-01-01

    Thin cherty sedimentary layers within the volcanic portions of the 3,500 to 3,300 Ma-old Onverwacht and Fig Tree Groups, Barberton Greenstone Belt, South Africa, and Warrawoona Group, eastern Pilbara Block, Western Australia, contain an abundant record of early Archean life. Five principal types of organic and probably biogenic remains and/or structures can be identified: stromatolites, stromatolite detritus, carbonaceous laminite or flat stromatolite, carbonaceous detrital particles, and microfossils. Early Archean stromatolites were reported from both the Barberton and eastern Pilbara greenstone belts. Systematic studies are lacking, but two main morphological types of stromatolites appear to be represented by these occurrences. The morphology of the stromatolites is described. Preserved early Archean stromatolites and carbonaceous matter appear to reflect communities of photosynthetic cyanobacteria inhabiting shallow, probably marine environments developed over the surfaces of low-relief, rapidly subsiding, simatic volcanic platforms. The overall environmental and tectonic conditions were those that probably prevailed at Earth's surface since the simatic crust and oceans formed sometime before 3,800 Ma. Recent studies also suggest that these early Archean sequences contain layers of debris formed by large-body impacts on the early Earth. If so, then these early bacterial communities had developed strategies for coping with the disruptive effects of possibly globe-encircling high-temperature impact vapor clouds, dust blankets, and impact-generated tsunamis. It is probable that these early Archean biogenic materials represent organic communities that evolved long before the beginning of the preserved geological record and were well adapted to the rigors of life on a young, volcanically active Earth during late bombardment. These conditions may have had parallels on Mars during its early evolution.

  5. Modeling stream fish distributions using interval-censored detection times.

    PubMed

    Ferreira, Mário; Filipe, Ana Filipa; Bardos, David C; Magalhães, Maria Filomena; Beja, Pedro

    2016-08-01

    Controlling for imperfect detection is important for developing species distribution models (SDMs). Occupancy-detection models based on the time needed to detect a species can be used to address this problem, but this is hindered when times to detection are not known precisely. Here, we extend the time-to-detection model to deal with detections recorded in time intervals and illustrate the method using a case study on stream fish distribution modeling. We collected electrofishing samples of six fish species across a Mediterranean watershed in Northeast Portugal. Based on a Bayesian hierarchical framework, we modeled the probability of water presence in stream channels, and the probability of species occupancy conditional on water presence, in relation to environmental and spatial variables. We also modeled time-to-first detection conditional on occupancy in relation to local factors, using modified interval-censored exponential survival models. Posterior distributions of occupancy probabilities derived from the models were used to produce species distribution maps. Simulations indicated that the modified time-to-detection model provided unbiased parameter estimates despite interval-censoring. There was a tendency for spatial variation in detection rates to be primarily influenced by depth and, to a lesser extent, stream width. Species occupancies were consistently affected by stream order, elevation, and annual precipitation. Bayesian P-values and AUCs indicated that all models had adequate fit and high discrimination ability, respectively. Mapping of predicted occupancy probabilities showed widespread distribution by most species, but uncertainty was generally higher in tributaries and upper reaches. The interval-censored time-to-detection model provides a practical solution to model occupancy-detection when detections are recorded in time intervals. This modeling framework is useful for developing SDMs while controlling for variation in detection rates, as it uses simple data that can be readily collected by field ecologists.
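
    As a sketch of the censoring device described above, with exponential detection times at rate \lambda (a simplification of the study's hierarchical Bayesian model), a detection recorded only as falling in the interval (a, b] and a survey of total length T_max with no detection contribute, respectively,

        P(a < T \le b \mid \text{occupied}) = e^{-\lambda a} - e^{-\lambda b},
        \qquad
        P(\text{no detection}) = (1 - \psi) + \psi\, e^{-\lambda T_{\max}},

    where \psi is the occupancy probability (itself conditional on water presence in the study's formulation) and \lambda may depend on local covariates such as depth and stream width.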

  6. Contrast statistics for foveated visual systems: fixation selection by minimizing contrast entropy

    NASA Astrophysics Data System (ADS)

    Raj, Raghu; Geisler, Wilson S.; Frazor, Robert A.; Bovik, Alan C.

    2005-10-01

    The human visual system combines a wide field of view with a high-resolution fovea and uses eye, head, and body movements to direct the fovea to potentially relevant locations in the visual scene. This strategy is sensible for a visual system with limited neural resources. However, for this strategy to be effective, the visual system needs sophisticated central mechanisms that efficiently exploit the varying spatial resolution of the retina. To gain insight into some of the design requirements of these central mechanisms, we have analyzed the effects of variable spatial resolution on local contrast in 300 calibrated natural images. Specifically, for each retinal eccentricity (which produces a certain effective level of blur), and for each value of local contrast observed at that eccentricity, we measured the probability distribution of the local contrast in the unblurred image. These conditional probability distributions can be regarded as posterior probability distributions for the ``true'' unblurred contrast, given an observed contrast at a given eccentricity. We find that these conditional probability distributions are adequately described by a few simple formulas. To explore how these statistics might be exploited by central perceptual mechanisms, we consider the task of selecting successive fixation points, where the goal on each fixation is to maximize total contrast information gained about the image (i.e., minimize total contrast uncertainty). We derive an entropy minimization algorithm and find that it performs optimally at reducing total contrast uncertainty and that it also works well at reducing the mean squared error between the original image and the image reconstructed from the multiple fixations. Our results show that measurements of local contrast alone could efficiently drive the scan paths of the eye when the goal is to gain as much information about the spatial structure of a scene as possible.

  7. Changes in the high-mountain vegetation of the Central Iberian Peninsula as a probable sign of global warming.

    PubMed

    Sanz-Elorza, Mario; Dana, Elías D; González, Alberto; Sobrino, Eduardo

    2003-08-01

    Aerial images of the high summits of the Spanish Central Range reveal significant changes in vegetation over the period 1957 to 1991. These changes include the replacement of high-mountain grassland communities dominated by Festuca aragonensis, typical of the Cryoro-Mediterranean belt, by shrub patches of Juniperus communis ssp. alpina and Cytisus oromediterraneus from lower altitudes (Oro-Mediterranean belt). Climatic data indicate a shift towards warmer conditions in this mountainous region since the 1940s, with the shift being particularly marked from 1960. Changes include significantly higher minimum and maximum temperatures, fewer days with snow cover and a redistribution of monthly rainfall. Total yearly precipitation showed no significant variation. There were no marked changes in land use during the time frame considered, although there were minor changes in grazing species in the 19th century. It is hypothesized that the advance of woody species into higher altitudes is probably related to climate change, which could have acted in conjunction with discrete variations in landscape management. The pronounced changes observed in the plant communities of the area reflect the susceptibility of high-mountain Mediterranean species to environmental change.

  8. Earthquake induced liquefaction hazard, probability and risk assessment in the city of Kolkata, India: its historical perspective and deterministic scenario

    NASA Astrophysics Data System (ADS)

    Nath, Sankar Kumar; Srivastava, Nishtha; Ghatak, Chitralekha; Adhikari, Manik Das; Ghosh, Ambarish; Sinha Ray, S. P.

    2018-01-01

    Liquefaction-induced ground failure is among the leading causes of infrastructure damage from large earthquakes in unconsolidated, non-cohesive, water-saturated alluvial terrains. The city of Kolkata is located on the potentially liquefiable alluvial fan deposits of the Ganga-Brahmaputra-Meghna Delta system, with a subsurface litho-stratigraphic sequence comprising varying percentages of clay, cohesionless silt, sand, and gravel interbedded with decomposed wood and peat. Additionally, the region has a moderately shallow groundwater table, especially in the post-monsoon seasons. In view of its burgeoning population, there has been unplanned expansion of settlements into hazardous geological, geomorphological, and hydrological settings, exposing the city to severe liquefaction hazard. The 1897 Shillong and 1934 Bihar-Nepal earthquakes, both of Mw 8.1, reportedly induced Modified Mercalli intensities of IV-V and VI-VII, respectively, in the city, triggering widespread to sporadic liquefaction with surface manifestations such as sand boils, lateral spreading, and ground subsidence, and thus posing a strong case for liquefaction potential analysis in the terrain. With the motivation of assessing the seismic hazard, vulnerability, and risk of the city of Kolkata through concerted federal funding stipulated for all metros and emerging urban centers in India located in BIS seismic zones III, IV, and V with populations of more than one million, an attempt has been made here to understand the liquefaction susceptibility of Kolkata under earthquake loading using modern multivariate techniques, and to predict a deterministic liquefaction scenario for the city under a probabilistic seismic hazard condition with 10% probability of exceedance in 50 years and a return period of 475 years. We conducted in-depth geophysical and geotechnical investigations in the city encompassing a 435 km² area. Stochastically synthesized bedrock ground motions for the 1897 and 1934 earthquakes, propagated through non-linear analysis of local site conditions with the DEEPSOIL geotechnical analysis package, yield surface-level peak ground accelerations of 0.05-0.14 g for the 1934 Bihar-Nepal earthquake and 0.03-0.11 g for the 1897 Shillong earthquake. The factor of safety (FOS) against liquefaction, the probability of liquefaction (PL), the liquefaction potential index (LPI), and the liquefaction risk index are estimated under the influence of these two earthquakes, and the city is classified into severe (LPI > 15), high (5 < LPI ≤ 15), moderate (0 < LPI ≤ 5), and non-liquefiable (LPI = 0) susceptibility zones. The 1934 Bihar-Nepal earthquake induced moderate to severe liquefaction hazard in the city, mostly in the deltaic plain and interdistributary marsh geomorphologic units, with 13.5% of sites exhibiting moderate hazard (median LPI of 1.8), 8.5% exhibiting high hazard (median LPI of 9.1), and 4% exhibiting severe hazard (median LPI of 18.9); the 1897 Shillong earthquake induced mostly non-liquefiable conditions, with very few sites showing moderate or high liquefaction hazard.
A conservative liquefaction hazard scenario for the city, estimated through a deterministic approach for 10% probability of exceedance in 50 years, predicts a high hazard zone in the 3.5-19 m depth range with FOS < 1 and PL > 65%, comprising coarse-grained sediments of sand, silty sand, and clayey silty sand, mostly in the deltaic plain geomorphologic unit, with 39.1% of sites depicting severe liquefaction hazard (median LPI of 28.3). A non-linear regression analysis of both the historical and deterministic liquefaction scenarios in the PL versus LPI domain, with ±1 standard deviation confidence bounds, yielded a cubic polynomial relationship between the two liquefaction hazard proxies. This study, considered a benchmark for other cities in the country and elsewhere, forms an integral part of the mega-seismic microzonation endeavors undertaken in earthquake-prone countries worldwide.
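
    The LPI severity zones used in this abstract translate directly into a small classification helper; the thresholds and the example median LPI values below come from the abstract, while the function name is ours.

    ```python
    def lpi_class(lpi: float) -> str:
        """Classify a liquefaction potential index (LPI) value into the severity
        zones quoted above: severe (LPI > 15), high (5 < LPI <= 15),
        moderate (0 < LPI <= 5), and non-liquefiable (LPI = 0)."""
        if lpi > 15:
            return "severe"
        if lpi > 5:
            return "high"
        if lpi > 0:
            return "moderate"
        return "non-liquefiable"

    # Median LPIs quoted for the 1934 and deterministic scenarios:
    for value in (1.8, 9.1, 18.9, 28.3):
        print(value, lpi_class(value))
    ```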

  9. Measurements of high energy loss rates of fast highly charged U ions channeled in thin silicon crystals

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ray, C.; Chevallier, M.; Dauvergne, D.

    2011-07-01

    The results of two channeling experiments show that highly charged heavy ions at moderate velocities (v<

  10. Economic evaluation of a psychological intervention for high distress cancer patients and carers: costs and quality-adjusted life years.

    PubMed

    Chatterton, Mary Lou; Chambers, Suzanne; Occhipinti, Stefano; Girgis, Afaf; Dunn, Jeffrey; Carter, Rob; Shih, Sophy; Mihalopoulos, Cathrine

    2016-07-01

    This study compared the cost-effectiveness of a psychologist-led, individualised cognitive behavioural intervention (PI) to a nurse-led, minimal contact self-management condition for highly distressed cancer patients and carers. This was an economic evaluation conducted alongside a randomised trial of highly distressed adult cancer patients and carers calling cancer helplines. Services used by participants were measured using a resource use questionnaire, and quality-adjusted life years were measured using the assessment of quality of life - eight-dimension - instrument collected through a computer-assisted telephone interview. The base case analysis stratified participants based on the baseline score on the Brief Symptom Inventory. Incremental cost-effectiveness ratio confidence intervals were calculated with a nonparametric bootstrap to reflect sampling uncertainty. The results were subjected to sensitivity analysis by varying unit costs for resource use and the method for handling missing data. No significant differences were found in overall total costs or quality-adjusted life years (QALYs) between intervention groups. Bootstrapped data suggest the PI had a higher probability of lower cost and greater QALYs for both carers and patients with high distress at baseline. For patients with low levels of distress at baseline, the PI had a higher probability of greater QALYs but at additional cost. Sensitivity analysis showed the results were robust. The PI may be cost-effective compared with the nurse-led, minimal contact self-management condition for highly distressed cancer patients and carers. More intensive psychological intervention for patients with greater levels of distress appears warranted. Copyright © 2015 John Wiley & Sons, Ltd.

  11. The transition probability and the probability for the left-most particle's position of the q-totally asymmetric zero range process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Korhonen, Marko; Lee, Eunghyun

    2014-01-15

    We treat the N-particle zero range process whose jumping rates satisfy a certain condition. This condition is required to use the Bethe ansatz and the resulting model is the q-boson model by Sasamoto and Wadati [“Exact results for one-dimensional totally asymmetric diffusion models,” J. Phys. A 31, 6057–6071 (1998)] or the q-totally asymmetric zero range process (TAZRP) by Borodin and Corwin [“Macdonald processes,” Probab. Theory Relat. Fields (to be published)]. We find the explicit formula of the transition probability of the q-TAZRP via the Bethe ansatz. By using the transition probability we find the probability distribution of the left-most particle's position at time t. To find the probability for the left-most particle's position we find a new identity corresponding to the identity for the asymmetric simple exclusion process by Tracy and Widom [“Integral formulas for the asymmetric simple exclusion process,” Commun. Math. Phys. 279, 815–844 (2008)]. For the initial state in which all particles occupy a single site, the probability distribution of the left-most particle's position at time t is represented by the contour integral of a determinant.

  12. Conditional survival in patients with chronic myeloid leukemia in chronic phase in the era of tyrosine kinase inhibitors.

    PubMed

    Sasaki, Koji; Kantarjian, Hagop M; Jain, Preetesh; Jabbour, Elias J; Ravandi, Farhad; Konopleva, Marina; Borthakur, Gautam; Takahashi, Koichi; Pemmaraju, Naveen; Daver, Naval; Pierce, Sherry A; O'Brien, Susan M; Cortes, Jorge E

    2016-01-15

    Tyrosine kinase inhibitors (TKIs) significantly improve survival in patients with chronic myeloid leukemia in chronic phase (CML-CP). Conditional probability provides survival information in patients who have already survived for a specific period of time after treatment. Cumulative response and survival data from 6 consecutive frontline TKI clinical trials were analyzed. Conditional probability was calculated for failure-free survival (FFS), transformation-free survival (TFS), event-free survival (EFS), and overall survival (OS) according to depth of response within 1 year of the initiation of TKIs, including complete cytogenetic response, major molecular response, and molecular response with a 4-log or 4.5-log reduction. A total of 483 patients with a median follow-up of 99.4 months from the initiation of treatment with TKIs were analyzed. Conditional probabilities of FFS, TFS, EFS, and OS for 1 additional year for patients alive after 12 months of therapy ranged from 92.0% to 99.1%, 98.5% to 100%, 96.2% to 99.6%, and 96.8% to 99.7%, respectively. Conditional FFS for 1 additional year did not improve with a deeper response each year. Conditional probabilities of TFS, EFS, and OS for 1 additional year were maintained at >95% during the period. In the era of TKIs, patients with chronic myeloid leukemia in chronic phase who survived for a certain number of years maintained excellent clinical outcomes in each age group. Cancer 2016;122:238-248. © 2015 American Cancer Society.
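
    The conditional probabilities reported above follow the standard identity P(alive at t + 1 | alive at t) = S(t + 1)/S(t). A minimal sketch, assuming an illustrative exponential survival curve rather than the trial data:

    ```python
    # Minimal sketch: conditional probability of surviving `extra` more years
    # given survival to year t, computed as S(t + extra) / S(t). The exponential
    # survival curve and hazard below are illustrative assumptions, not trial data.
    import math

    def conditional_survival(surv, t, extra=1.0):
        return surv(t + extra) / surv(t)

    hazard = 0.02                                   # assumed constant yearly hazard
    S = lambda t: math.exp(-hazard * t)             # illustrative survival function

    # Probability of surviving one additional year, given survival to year 1:
    print(round(conditional_survival(S, 1.0), 4))   # ~0.9802
    ```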

  13. [Conditional probability analysis between tinnitus and comorbidities in patients attending the National Rehabilitation Institute-LGII in the period 2012-2013].

    PubMed

    Gómez Toledo, Verónica; Gutiérrez Farfán, Ileana; Verduzco-Mendoza, Antonio; Arch-Tirado, Emilio

    Tinnitus is defined as the conscious perception of a sensation of sound that occurs in the absence of an external stimulus. This audiological symptom affects 7% to 19% of the adult population. The aim of this study is to describe the associated comorbidities present in patients with tinnitus using joint and conditional probability analysis. Patients of both genders, diagnosed with unilateral or bilateral tinnitus, aged between 20 and 45 years, and with a full computerised medical record, were selected. Study groups were formed on the basis of the following clinical aspects: 1) audiological findings; 2) vestibular findings; 3) comorbidities such as temporomandibular dysfunction, tubal dysfunction, and otosclerosis; and 4) triggering factors of tinnitus such as noise exposure, respiratory tract infection, and use of ototoxic and/or other drugs. Of the patients with tinnitus, 27 (65%) reported hearing loss, 11 (26.19%) temporomandibular dysfunction, and 11 (26.19%) vestibular disorders. The joint probability analysis found that the probability of a patient with tinnitus having hearing loss was 27/42 ≈ 0.65, and 20/42 ≈ 0.47 for the bilateral type. The result for P(A ∩ B) was 30%. Bayes' theorem, P(Ai|B) = P(Ai ∩ B)/P(B), was used, and various probabilities were calculated. For patients with temporomandibular dysfunction and vestibular disorders, a posterior probability of P(Ai|B) = 31.44% was calculated. Consideration should be given to the joint and conditional probability approach as a tool for the study of different pathologies. Copyright © 2016 Academia Mexicana de Cirugía A.C. Published by Masson Doyma México S.A. All rights reserved.
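
    A minimal sketch of the joint and conditional probability calculations described above. The helper names and the comorbidity counts in the last call are hypothetical; only the 42-patient denominator and the hearing-loss counts come from the abstract.

    ```python
    # Joint and conditional probabilities from counts, as used in the abstract.
    # The final call uses hypothetical counts purely to show Bayes' theorem.
    def joint_prob(n_both: int, n_total: int) -> float:
        """P(A and B), estimated from counts."""
        return n_both / n_total

    def conditional_prob(n_both: int, n_b: int) -> float:
        """Bayes' theorem in count form: P(A | B) = P(A and B) / P(B)."""
        return n_both / n_b

    print(joint_prob(27, 42))                  # tinnitus patients with hearing loss
    print(joint_prob(20, 42))                  # bilateral hearing loss
    print(conditional_prob(n_both=5, n_b=11))  # hypothetical comorbidity pair
    ```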

  14. Forecasting of future earthquakes in the northeast region of India considering energy released concept

    NASA Astrophysics Data System (ADS)

    Zarola, Amit; Sil, Arjun

    2018-04-01

    This study presents forecasting of the time and magnitude of the next earthquake in northeast India, using four probability distribution models (Gamma, Lognormal, Weibull, and Log-logistic) and an updated earthquake catalog of magnitude Mw ≥ 6.0 events that occurred from 1737 to 2015 in the study area. On the basis of the past seismicity of the region, two types of conditional probabilities have been estimated using the best-fit model and its parameters. The first is the probability that the seismic energy (e × 10^20 ergs) expected to be released in the future earthquake exceeds a certain level of seismic energy (E × 10^20 ergs). The second is the probability that the seismic energy expected to be released per year (a × 10^20 ergs/year) exceeds a certain level of seismic energy per year (A × 10^20 ergs/year). The log-likelihood functions (ln L) were also estimated for all four probability distribution models; a higher value of ln L indicates a better model and a lower value a worse one. The time of the future earthquake is forecast by dividing the total seismic energy expected to be released in the future earthquake by the total seismic energy expected to be released per year. The epicentres of the recent 4 January 2016 Manipur earthquake (M 6.7), 13 April 2016 Myanmar earthquake (M 6.9), and 24 August 2016 Myanmar earthquake (M 6.8) are located in zones Z.12, Z.16, and Z.15, respectively, which are identified seismic source zones in the study area; this agreement indicates that the proposed techniques and models yield good forecasting accuracy.
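
    A hedged sketch of the model-comparison and conditional-exceedance steps described above, using SciPy's distribution fitting on synthetic energies; the study itself uses the 1737-2015 catalog, and its parameter values are not reproduced here.

    ```python
    # Sketch: fit the four candidate distributions, compare log-likelihoods, and
    # compute a conditional exceedance probability P(X > e | X > E) = S(e)/S(E).
    # Energies are synthetic, not the catalog data.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    energy = rng.weibull(1.5, size=60) * 5.0        # synthetic energies, 10^20 ergs

    candidates = {
        "gamma": stats.gamma,
        "lognormal": stats.lognorm,
        "weibull": stats.weibull_min,
        "log-logistic": stats.fisk,                 # fisk = log-logistic in SciPy
    }

    fits = {}
    for name, dist in candidates.items():
        params = dist.fit(energy, floc=0)           # fix location at 0
        loglik = np.sum(dist.logpdf(energy, *params))
        fits[name] = (params, loglik)
        print(f"{name:12s} ln L = {loglik:7.2f}")   # higher ln L = better model

    best_name = max(fits, key=lambda k: fits[k][1])
    params, _ = fits[best_name]
    dist = candidates[best_name]

    # Conditional probability that the released energy exceeds e, given that it
    # exceeds a lower level E.
    E, e = 2.0, 6.0
    p_cond = dist.sf(e, *params) / dist.sf(E, *params)
    print(best_name, round(p_cond, 3))
    ```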

  15. Potential Lifestyles in Ancient Environments of Gusev Crater, Mars

    NASA Technical Reports Server (NTRS)

    DesMarais, David J.

    2006-01-01

    Habitable environments must sustain liquid water at least intermittently and also provide both chemical building blocks and useful sources of energy for life. Observations by the Spirit rover indicate that conditions have probably been too dry to sustain life, at least since the emplacement of the extensive basalts that underlie the plains around the Columbia Memorial Station landing site. Local, relatively minor aqueous alteration probably occurred under conditions where the activity of water was too low to sustain biological processes as we know them. In contrast, multiple bedrock units in West Spur and Husband Hill in the Columbia Hills have been extensively altered, probably by aqueous processes. The Fe in several of these units has been extensively oxidized, indicating that, in principle, any microbiota present during the aqueous alteration of these rocks could have obtained energy from Fe oxidation. Spirit discovered olivine-rich ultramafic rocks during her descent from Husband Hill southward into the Inner Basin. Alteration of similar ultramafic rocks on Earth can yield H2 that can provide both energy and reducing power for microorganisms. Spirit's discovery of "salty" soil horizons rich in Fe and/or Mg is consistent with the aqueous dissolution and/or alteration of olivine. Such processes can oxidize Fe and also yield H2 under appropriate conditions. Very high S concentrations in these salty deposits indicate that soluble salts were mobilized by water and/or that S oxidation, a potential energy source for life, occurred. The Athena team has not yet established whether these salt components were deposited as large beds in ancient water bodies or, for example, were concentrated by more recent groundwater activity. Collectively these observations are consistent with the possibility that habitable environments existed at least intermittently in the distant geologic past.

  16. Cenomanian-Turonian aquifer of central Israel, its development and possible use as a storage reservoir

    USGS Publications Warehouse

    Schneider, Robert

    1964-01-01

    The Cenomanian-Turonian formations constitute a highly permeable dolomite and limestone aquifer in central Israel. The aquifer is on the west limb of an anticlinorium that trends north-northeast. In places it may be as much as 800 meters thick, but in the report area, largely the foothills of the Judean-Ephraim Mountains where the water development is most intensive, its thickness is generally considerably less. In some places the aquifer occurs at or near the land surface, or it is covered by sandy and gravelly coastal-plain deposits. However, in a large part of the area, it is overlain by as much as 400 meters of relatively impermeable strata, and it is probably underlain by less permeable Lower Cretaceous strata. In general the aquifer water is under artesian pressure. The porosity of the aquifer is characterized mainly by solution channels and cavities produced by jointing and faulting. In addition to the generally high permeability of the aquifer, some regions, which probably coincide with ancient drainage patterns and (or) fault zones, have exceptionally high permeabilities. The source of most of the water in the aquifer is believed to be rain that falls on the foothills area. The westward movement of ground water from the mountainous outcrop areas appears to be impeded by a zone of low permeability which is related to structural and stratigraphic conditions along the western side of the mountains. Gradients of the piezometric surface are small, and the net direction of water movement is westward and northwestward under natural conditions. Locally, however, the flow pattern may be in other directions owing to spatial variations in permeability in the aquifer, the location of natural discharge outlets, and the relation of the aquifer to adjacent geologic formations. There probably is also a large vertical component of flow. Pumping has modified the flow pattern by producing several irregularly shaped shallow depressions in the piezometric surface although, to date, no unwatering of the aquifer has occurred. In the central part of the area, pumping has induced some infiltration from overlying coastal-plain formations. Injecting and storing surplus water seasonally in the aquifer should be feasible at almost any place. However, the movement and recovery of the injected water probably could be controlled most easily if the water were injected where depressions have been formed in the piezometric surface.

  17. High But Not Low Probability of Gain Elicits a Positive Feeling Leading to the Framing Effect.

    PubMed

    Gosling, Corentin J; Moutier, Sylvain

    2017-01-01

    Human risky decision-making is known to be highly susceptible to profit-motivated responses elicited by the way in which options are framed. In fact, studies investigating the framing effect have shown that the choice between sure and risky options depends on how these options are presented. Interestingly, the probability of gain of the risky option has been highlighted as one of the main factors causing variations in susceptibility to the framing effect. However, while it has been shown that high probabilities of gain of the risky option systematically lead to framing bias, questions remain about the influence of low probabilities of gain. Therefore, the first aim of this paper was to clarify the respective roles of high and low probabilities of gain in the framing effect. Due to the difference between studies using a within- or between-subjects design, we conducted a first study investigating the respective roles of these designs. For both designs, we showed that trials with a high probability of gain led to the framing effect whereas those with a low probability did not. Second, as emotions are known to play a key role in the framing effect, we sought to determine whether they are responsible for such a debiasing effect of the low probability of gain. Our second study thus investigated the relationship between emotion and the framing effect depending on high and low probabilities. Our results revealed that positive emotion was related to risk-seeking in the loss frame, but only for trials with a high probability of gain. Taken together, these results support the interpretation that low probabilities of gain suppress the framing effect because they prevent the positive emotion of gain anticipation.

  18. High But Not Low Probability of Gain Elicits a Positive Feeling Leading to the Framing Effect

    PubMed Central

    Gosling, Corentin J.; Moutier, Sylvain

    2017-01-01

    Human risky decision-making is known to be highly susceptible to profit-motivated responses elicited by the way in which options are framed. In fact, studies investigating the framing effect have shown that the choice between sure and risky options depends on how these options are presented. Interestingly, the probability of gain of the risky option has been highlighted as one of the main factors causing variations in susceptibility to the framing effect. However, while it has been shown that high probabilities of gain of the risky option systematically lead to framing bias, questions remain about the influence of low probabilities of gain. Therefore, the first aim of this paper was to clarify the respective roles of high and low probabilities of gain in the framing effect. Due to the difference between studies using a within- or between-subjects design, we conducted a first study investigating the respective roles of these designs. For both designs, we showed that trials with a high probability of gain led to the framing effect whereas those with a low probability did not. Second, as emotions are known to play a key role in the framing effect, we sought to determine whether they are responsible for such a debiasing effect of the low probability of gain. Our second study thus investigated the relationship between emotion and the framing effect depending on high and low probabilities. Our results revealed that positive emotion was related to risk-seeking in the loss frame, but only for trials with a high probability of gain. Taken together, these results support the interpretation that low probabilities of gain suppress the framing effect because they prevent the positive emotion of gain anticipation. PMID:28232808

  19. Quantity-activity relationship of denitrifying bacteria and environmental scaling in streams of a forested watershed

    USGS Publications Warehouse

    O'Connor, B.L.; Hondzo, Miki; Dobraca, D.; LaPara, T.M.; Finlay, J.A.; Brezonik, P.L.

    2006-01-01

    The spatial variability of subreach denitrification rates in streams was evaluated with respect to controlling environmental conditions, molecular examination of denitrifying bacteria, and dimensional analysis. Denitrification activities ranged from 0 to 800 ng-N gsed^-1 d^-1, with large variations observed within short distances (<50 m) along stream reaches. A log-normal probability distribution described the range in denitrification activities and was used to define low (16% of the probability distribution), medium (68%), and high (16%) denitrification potential groups. Denitrifying bacteria were quantified using a competitive polymerase chain reaction (cPCR) technique that amplified the nirK gene, which encodes nitrite reductase. Results showed a range of nirK quantities from 10^3 to 10^7 gene copy numbers gsed^-1. A nonparametric statistical test showed no significant difference in nirK quantities among stream reaches, but revealed that samples with a high denitrification potential had significantly higher nirK quantities. Denitrification activity was positively correlated with nirK quantities, with scatter in the data that can be attributed to varying environmental conditions along stream reaches. Dimensional analysis was used to evaluate denitrification activities according to environmental variables that describe fluid-flow properties, nitrate and organic material quantities, and dissolved oxygen flux. Buckingham's pi theorem was used to generate dimensionless groupings, and field data were used to determine scaling parameters. The resulting expressions between dimensionless NO3- flux and dimensionless groupings of environmental variables showed consistent scaling, which indicates that the subreach variability in denitrification rates can be predicted from the controlling physical, chemical, and microbiological conditions. Copyright 2006 by the American Geophysical Union.
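
    A minimal sketch of the low/medium/high grouping described above: 16%/68%/16% of a fitted log-normal distribution corresponds to cuts at roughly the 16th and 84th percentiles. The activity values below are synthetic, not the field data.

    ```python
    # Fit a log-normal distribution to denitrification activities and classify
    # each observation as low/medium/high potential using the 16th and 84th
    # percentile cuts quoted in the abstract. Activities are synthetic.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    activity = rng.lognormal(mean=4.0, sigma=1.2, size=40)   # ng-N gsed^-1 d^-1

    shape, loc, scale = stats.lognorm.fit(activity, floc=0)
    low_cut = stats.lognorm.ppf(0.16, shape, loc, scale)
    high_cut = stats.lognorm.ppf(0.84, shape, loc, scale)

    def potential_group(x):
        if x < low_cut:
            return "low"
        if x > high_cut:
            return "high"
        return "medium"

    groups = [potential_group(a) for a in activity]
    print({g: groups.count(g) for g in ("low", "medium", "high")})
    ```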

  20. Outcomes after Umbilical Cord Blood Transplantation for Myelodysplastic Syndromes.

    PubMed

    Gerds, Aaron T; Woo Ahn, Kwang; Hu, Zhen-Huan; Abdel-Azim, Hisham; Akpek, Gorgun; Aljurf, Mahmoud; Ballen, Karen K; Beitinjaneh, Amer; Bacher, Ulrike; Cahn, Jean-Yves; Chhabra, Saurabh; Cutler, Corey; Daly, Andrew; DeFilipp, Zachariah; Gale, Robert Peter; Gergis, Usama; Grunwald, Michael R; Hale, Gregory A; Hamilton, Betty Ky; Jagasia, Madan; Kamble, Rammurti T; Kindwall-Keller, Tamila; Nishihori, Taiga; Olsson, Richard F; Ramanathan, Muthalagu; Saad, Ayman A; Solh, Melhem; Ustun, Celalettin; Valcárcel, David; Warlick, Erica; Wirk, Baldeep M; Kalaycio, Matt; Alyea, Edwin; Popat, Uday; Sobecks, Ronald; Saber, Wael

    2017-06-01

    For patients with hematologic malignancies undergoing allogeneic hematopoietic cell transplantation, umbilical cord blood transplantation (UCBT) has become an acceptable alternative donor source in the absence of a matched sibling or unrelated donor. To date, however, there have been few published series dedicated solely to describing the outcomes of adult patients with myelodysplastic syndrome (MDS) who have undergone UCBT. Between 2004 and 2013, 176 adults with MDS underwent UCBT as reported to the Center for International Blood and Marrow Transplant Research. Median age at the time of transplantation was 56 years (range, 18-73 years). The study group included 10% with very low, 23% with low, 19% with intermediate, 19% with high, and 13% with very high-risk Revised International Prognostic Scoring System (IPSS-R) scores. The 100-day probability of grade II-IV acute graft-versus-host disease (GVHD) was 38%, and the 3-year probability of chronic GVHD was 28%. The probabilities of relapse and transplantation-related mortality (TRM) at 3 years were 32% and 40%, respectively, leading to a 3-year disease-free survival (DFS) of 28% and an overall survival (OS) of 31%. In multivariate analysis, increasing IPSS-R score at the time of HCT was associated with inferior TRM (P = .0056), DFS (P = .018), and OS (P = .0082), but not with GVHD or relapse. The presence of pretransplantation comorbidities was associated with TRM (P = .001), DFS (P = .02), and OS (P = .001). Reduced-intensity conditioning was associated with increased risk of relapse (relative risk, 3.95; 95% confidence interval, 1.78-8.75; P < .001), and although a higher proportion of myeloablative UCBTs were performed in patients with high-risk disease, the effect of conditioning regimen intensity was the same regardless of IPSS-R score. For some individuals who lack a matched sibling or unrelated donor, UCBT can result in long-term DFS; however, the success of UCBT in this population is hampered by a high rate of TRM. Copyright © 2017 The American Society for Blood and Marrow Transplantation. Published by Elsevier Inc. All rights reserved.

  1. Individual quality and age but not environmental or social conditions modulate costs of reproduction in a capital breeder.

    PubMed

    Debeffe, Lucie; Poissant, Jocelyn; McLoughlin, Philip D

    2017-08-01

    Costs associated with reproduction are widely known to play a role in the evolution of reproductive tactics with consequences to population and eco-evolutionary dynamics. Evaluating these costs as they pertain to species in the wild remains an important goal of evolutionary ecology. Individual heterogeneity, including differences in individual quality (i.e., among-individual differences in traits associated with survival and reproduction) or state, and variation in environmental and social conditions can modulate the costs of reproduction; however, few studies have considered effects of these factors simultaneously. Taking advantage of a detailed, long-term dataset for a population of feral horses (Sable Island, Nova Scotia, Canada), we address the question of how intrinsic (quality, age), environmental (winter severity, location), and social conditions (group size, composition, sex ratio, density) influence the costs of reproduction on subsequent reproduction. Individual quality was measured using a multivariate analysis on a combination of four static and dynamic traits expected to depict heterogeneity in individual performance. Female quality and age interacted with reproductive status of the previous year to determine current reproductive effort, while no effect of social or environmental covariates was found. High-quality females showed higher probabilities of giving birth and weaning their foal regardless of their reproductive status the previous year, while those of lower quality showed lower probabilities of producing foals in successive years. Middle-aged (prime) females had the highest probability of giving birth when they had not reproduced the year before, but no such relationship with age was found among females that had reproduced the previous year, indicating that prime-aged females bear higher costs of reproduction. We show that individual quality and age were key factors modulating the costs of reproduction in a capital breeder but that environmental or social conditions were not, highlighting the importance of considering multiple factors when studying costs of reproduction.

  2. The probability of reinforcement per trial affects posttrial responding and subsequent extinction but not within-trial responding.

    PubMed

    Harris, Justin A; Kwok, Dorothy W S

    2018-01-01

    During magazine approach conditioning, rats do not discriminate between a conditional stimulus (CS) that is consistently reinforced with food and a CS that is occasionally (partially) reinforced, as long as the CSs have the same overall reinforcement rate per second. This implies that rats are indifferent to the probability of reinforcement per trial. However, in the same rats, the per-trial reinforcement rate will affect subsequent extinction: responding extinguishes more rapidly for a CS that was consistently reinforced than for a partially reinforced CS. Here, we trained rats with consistently and partially reinforced CSs that were matched for overall reinforcement rate per second. We measured conditioned responding both during and immediately after the CSs. Differences in the per-trial probability of reinforcement did not affect the acquisition of responding during the CS but did affect subsequent extinction of that responding, and also affected the post-CS response rates during conditioning. Indeed, CSs with the same probability of reinforcement per trial evoked the same amount of post-CS responding even when they differed in overall reinforcement rate and thus evoked different amounts of responding during the CS. We conclude that reinforcement rate per second controls rats' acquisition of responding during the CS, but at the same time, rats also learn specifically about the probability of reinforcement per trial. The latter learning affects the rats' expectation of reinforcement as an outcome of the trial, which influences their ability to detect retrospectively that an opportunity for reinforcement was missed, and, in turn, drives extinction. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  3. The negated conditional: a litmus test for the suppositional conditional?

    PubMed

    Handley, Simon J; Evans, Jonathan St B T; Thompson, Valerie A

    2006-05-01

    Under the suppositional account of conditionals, when people think about a conditional assertion, "if p then q," they engage in a mental simulation in which they imagine p holds and evaluate the probability that q holds under this supposition. One implication of this account is that belief in a conditional equates to conditional probability [P(q|p)]. In this paper, the authors examine a further implication of this analysis with respect to the wide-scope negation of conditional assertions, "it is not the case that if p then q." Under the suppositional account, nothing categorically follows from the negation of a conditional, other than a second conditional, "if p then not-q." In contrast, according to the mental model theory, a negated conditional is consistent only with the determinate state of affairs, p and not-q. In 4 experiments, the authors compare the contrasting predictions that arise from each of these accounts. The findings are consistent with the suppositional theory but are incongruent with the mental model theory of conditionals.

  4. Probability of detecting perchlorate under natural conditions in deep groundwater in California and the Southwestern United States

    USGS Publications Warehouse

    Fram, Miranda S.; Belitz, Kenneth

    2011-01-01

    We use data from 1626 groundwater samples collected in California, primarily from public drinking water supply wells, to investigate the distribution of perchlorate in deep groundwater under natural conditions. The wells were sampled for the California Groundwater Ambient Monitoring and Assessment Priority Basin Project. We develop a logistic regression model for predicting probabilities of detecting perchlorate at concentrations greater than multiple threshold concentrations as a function of climate (represented by an aridity index) and potential anthropogenic contributions of perchlorate (quantified as an anthropogenic score, AS). AS is a composite categorical variable including terms for nitrate, pesticides, and volatile organic compounds. Incorporating water-quality parameters in AS permits identification of perturbation of natural occurrence patterns by flushing of natural perchlorate salts from unsaturated zones by irrigation recharge as well as addition of perchlorate from industrial and agricultural sources. The data and model results indicate low concentrations (0.1-0.5 μg/L) of perchlorate occur under natural conditions in groundwater across a wide range of climates, beyond the arid to semiarid climates in which they mostly have been previously reported. The probability of detecting perchlorate at concentrations greater than 0.1 μg/L under natural conditions ranges from 50-70% in semiarid to arid regions of California and the Southwestern United States to 5-15% in the wettest regions sampled (the Northern California coast). The probability of concentrations above 1 μg/L under natural conditions is low (generally <3%).
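
    A hedged sketch of the kind of logistic regression described above, predicting the probability of detecting perchlorate above a threshold from an aridity index and an anthropogenic score. The data, coefficients, and the ordinal treatment of the score are illustrative assumptions, not the study's model.

    ```python
    # Sketch: logistic regression for Pr(perchlorate detected above a threshold)
    # as a function of an aridity index and an anthropogenic score (AS).
    # Data and coefficients are synthetic; AS is treated as ordinal here purely
    # for simplicity, whereas the study uses a composite categorical variable.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(3)
    n = 500
    aridity = rng.uniform(0.05, 3.0, n)        # low values = arid, high = wet
    anthro_score = rng.integers(0, 3, n)       # 0, 1, 2: synthetic AS levels
    logit = 1.0 - 1.5 * aridity + 0.8 * anthro_score
    detected = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

    X = np.column_stack([aridity, anthro_score])
    model = LogisticRegression().fit(X, detected)

    # Predicted detection probability: arid site with AS = 0 vs. wet site with AS = 2.
    print(model.predict_proba([[0.2, 0], [2.5, 2]])[:, 1])
    ```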

  5. Technology-Enhanced Interactive Teaching of Marginal, Joint and Conditional Probabilities: The Special Case of Bivariate Normal Distribution

    ERIC Educational Resources Information Center

    Dinov, Ivo D.; Kamino, Scott; Bhakhrani, Bilal; Christou, Nicolas

    2013-01-01

    Data analysis requires subtle probability reasoning to answer questions like "What is the chance of event A occurring, given that event B was observed?" This generic question arises in discussions of many intriguing scientific questions such as "What is the probability that an adolescent weighs between 120 and 140 pounds given that…

  6. Bootstrap imputation with a disease probability model minimized bias from misclassification due to administrative database codes.

    PubMed

    van Walraven, Carl

    2017-04-01

    Diagnostic codes used in administrative databases cause bias due to misclassification of patient disease status. It is unclear which methods minimize this bias. Serum creatinine measures were used to determine severe renal failure status in 50,074 hospitalized patients. The true prevalence of severe renal failure and its association with covariates were measured. These were compared to results for which renal failure status was determined using surrogate measures, including the following: (1) diagnostic codes; (2) categorization of probability estimates of renal failure determined from a previously validated model; or (3) bootstrap imputation of disease status using model-derived probability estimates. Bias in estimates of severe renal failure prevalence and its association with covariates were minimal when bootstrap methods were used to impute renal failure status from model-based probability estimates. In contrast, biases were extensive when renal failure status was determined using codes or methods in which model-based condition probability was categorized. Bias due to misclassification from inaccurate diagnostic codes can be minimized using bootstrap methods to impute condition status using multivariable model-derived probability estimates. Copyright © 2017 Elsevier Inc. All rights reserved.
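
    A minimal sketch of the bootstrap imputation idea described above: in each bootstrap replicate, disease status is imputed by Bernoulli draws from model-derived probabilities and the prevalence is recomputed. The probabilities here are synthetic stand-ins for the validated model's output.

    ```python
    # Bootstrap imputation of disease status from model-derived probabilities:
    # each replicate resamples patients, imputes status as a Bernoulli draw with
    # the patient's predicted probability, and recomputes the prevalence.
    import numpy as np

    rng = np.random.default_rng(4)
    p_disease = rng.beta(0.5, 8.0, size=5000)        # model-derived probabilities

    def bootstrap_prevalence(probs, n_boot=500):
        estimates = np.empty(n_boot)
        for b in range(n_boot):
            idx = rng.integers(0, len(probs), len(probs))   # resample patients
            status = rng.random(len(idx)) < probs[idx]      # impute disease status
            estimates[b] = status.mean()
        return estimates

    est = bootstrap_prevalence(p_disease)
    print(round(est.mean(), 4), np.round(np.percentile(est, [2.5, 97.5]), 4))
    ```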

  7. Exceedance probability map: a tool helping the definition of arsenic Natural Background Level (NBL) within the Drainage Basin to the Venice Lagoon (NE Italy)

    NASA Astrophysics Data System (ADS)

    Dalla Libera, Nico; Fabbri, Paolo; Mason, Leonardo; Piccinini, Leonardo; Pola, Marco

    2017-04-01

    Arsenic contamination affects shallow groundwater bodies worldwide. Current knowledge of the origin of arsenic in groundwater indicates that most dissolved arsenic occurs naturally through the dissolution of As-bearing minerals and ores. Several studies on the shallow aquifers of both the regional Venetian Plain (NE Italy) and the local Drainage Basin to the Venice Lagoon (DBVL) show locally high arsenic concentrations related to peculiar geochemical conditions that drive arsenic mobilization. The uncertainty of the arsenic spatial distribution makes both the evaluation of the processes involved in arsenic mobilization and the stakeholders' decisions about environmental management difficult. Considering the latter aspect, the present study treats the problem of defining the Natural Background Level (NBL) as the threshold discriminating natural contamination from anthropogenic pollution. The EU's Directive 2006/118/EC suggests procedures and criteria to set up water quality standards guaranteeing a healthy status and reversing any contamination trends. In addition, the EU's BRIDGE project proposes criteria, based on the 90th percentile of the contaminant's concentration dataset, to estimate the NBL. Nevertheless, these methods provide only a single statistical NBL for the whole area, without considering the spatial variation of the contaminant's concentration. We therefore reinforce the NBL concept using a geostatistical approach that gives detailed information about the distribution of arsenic concentrations and unveils zones with high concentrations relative to the Italian drinking water standard (IDWS = 10 µg/liter). Once the spatial information about the arsenic distribution is obtained, the 90th percentile method can be applied to estimate local NBLs for every zone with arsenic higher than the IDWS. The indicator kriging method was chosen because it estimates the spatial distribution of the probabilities of exceeding pre-defined thresholds; this approach is widely used in the literature for similar environmental problems. To test the validity of the procedure, we used the dataset from the "A.Li.Na" project (funded by the Regional Environmental Agency), which defined regional NBLs of As, Fe, Mn, and NH4+ in the DBVL's groundwater. First, we defined two thresholds corresponding to the IDWS and to the median of the data above the IDWS; these values were chosen based on the dataset's statistical structure and the quality criteria of the GWD 2006/118/EC. Subsequently, we evaluated the spatial distribution of the probability of exceeding the defined thresholds using indicator kriging. The results highlight zones with high exceedance probability, ranging from 75% to 95%, with respect to both the IDWS and the median value. Considering the geological setting of the DBVL, these probability values coincide with the occurrence of both organic matter and reducing conditions. In conclusion, the spatial prediction of the exceedance probability can be useful to define the areas in which to estimate local NBLs, enhancing the NBL definition procedure. In this way, the NBL estimation is more realistic because it considers the spatial distribution of the studied contaminant, distinguishing areas with high natural concentrations from polluted ones.

  8. Adaptive Conditioning of Multiple-Point Geostatistical Facies Simulation to Flow Data with Facies Probability Maps

    NASA Astrophysics Data System (ADS)

    Khodabakhshi, M.; Jafarpour, B.

    2013-12-01

    Characterization of complex geologic patterns that create preferential flow paths in certain reservoir systems requires higher-order geostatistical modeling techniques. Multipoint statistics (MPS) provides a flexible grid-based approach for simulating such complex geologic patterns from a conceptual prior model known as a training image (TI). In this approach, a stationary TI that encodes the higher-order spatial statistics of the expected geologic patterns is used to represent the shape and connectivity of the underlying lithofacies. While MPS is quite powerful for describing complex geologic facies connectivity, the nonlinear and complex relation between the flow data and facies distribution makes flow data conditioning quite challenging. We propose an adaptive technique for conditioning facies simulation from a prior TI to nonlinear flow data. Non-adaptive strategies for conditioning facies simulation to flow data can involve many forward flow model solutions, which can be computationally very demanding. To improve the conditioning efficiency, we develop an adaptive sampling approach through a data feedback mechanism based on the sampling history. In this approach, after a short burn-in period during which unconditional samples are generated and passed through an acceptance/rejection test, an ensemble of accepted samples is identified and used to generate a facies probability map. This facies probability map contains the common features of the accepted samples and provides conditioning information about facies occurrence in each grid block, which is used to guide the conditional facies simulation process. As the sampling progresses, the initial probability map is updated according to the collective information about the facies distribution in the chain of accepted samples to increase the acceptance rate and efficiency of the conditioning. This conditioning process can be viewed as an optimization approach where each new sample is proposed based on the sampling history to improve the data mismatch objective function. We extend the application of this adaptive conditioning approach to the case where multiple training images are proposed to describe the geologic scenario in a given formation. We discuss the advantages and limitations of the proposed adaptive conditioning scheme and use numerical experiments from fluvial channel formations to demonstrate its applicability and performance compared to non-adaptive conditioning techniques.
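
    A minimal sketch of the probability-map feedback described above, with the MPS simulator and the flow-data accept/reject test replaced by stubs; only the bookkeeping that turns accepted realizations into an updated facies probability map is shown, under illustrative names.

    ```python
    # Adaptive feedback loop sketch: accepted facies realizations accumulate into
    # a per-cell occurrence count, which is converted into a probability map that
    # guides the next proposal. Simulator and accept/reject test are stand-ins.
    import numpy as np

    rng = np.random.default_rng(5)
    GRID = (20, 20)

    def simulate_facies(prob_map):
        """Stand-in for an MPS simulation guided by a facies probability map."""
        return (rng.random(GRID) < prob_map).astype(int)

    def accept(realization):
        """Stand-in for the accept/reject test against flow data."""
        return rng.random() < 0.3

    prob_map = np.full(GRID, 0.5)       # uninformative initial map
    counts = np.zeros(GRID)
    n_accepted = 0

    for it in range(200):
        realization = simulate_facies(prob_map)
        if accept(realization):
            counts += realization
            n_accepted += 1
            # Update the facies probability map from the accepted ensemble.
            prob_map = counts / n_accepted

    print(n_accepted, prob_map.mean())
    ```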

  9. Spatial patterns of breeding success of grizzly bears derived from hierarchical multistate models.

    PubMed

    Fisher, Jason T; Wheatley, Matthew; Mackenzie, Darryl

    2014-10-01

    Conservation programs often manage populations indirectly through the landscapes in which they live. Empirically, linking reproductive success with landscape structure and anthropogenic change is a first step in understanding and managing the spatial mechanisms that affect reproduction, but this link is not sufficiently informed by data. Hierarchical multistate occupancy models can forge these links by estimating spatial patterns of reproductive success across landscapes. To illustrate, we surveyed the occurrence of grizzly bears (Ursus arctos) in the Canadian Rocky Mountains, Alberta, Canada. We deployed camera traps for 6 weeks at 54 surveys sites in different types of land cover. We used hierarchical multistate occupancy models to estimate probability of detection, grizzly bear occupancy, and probability of reproductive success at each site. Grizzly bear occupancy varied among cover types and was greater in herbaceous alpine ecotones than in low-elevation wetlands or mid-elevation conifer forests. The conditional probability of reproductive success given grizzly bear occupancy was 30% (SE = 0.14). Grizzly bears with cubs had a higher probability of detection than grizzly bears without cubs, but sites were correctly classified as being occupied by breeding females 49% of the time based on raw data and thus would have been underestimated by half. Repeated surveys and multistate modeling reduced the probability of misclassifying sites occupied by breeders as unoccupied to <2%. The probability of breeding grizzly bear occupancy varied across the landscape. Those patches with the highest probabilities of breeding occupancy (herbaceous alpine ecotones) were small and highly dispersed and are projected to shrink as treelines advance due to climate warming. Understanding spatial correlates in breeding distribution is a key requirement for species conservation in the face of climate change and can help identify priorities for landscape management and protection. © 2014 Society for Conservation Biology.

  10. An Evaluation of a Progressive High-Probability Instructional Sequence Combined with Low-Probability Demand Fading in the Treatment of Food Selectivity

    ERIC Educational Resources Information Center

    Penrod, Becky; Gardella, Laura; Fernand, Jonathan

    2012-01-01

    Few studies have examined the effects of the high-probability instructional sequence in the treatment of food selectivity, and results of these studies have been mixed (e.g., Dawson et al., 2003; Patel et al., 2007). The present study extended previous research on the high-probability instructional sequence by combining this procedure with…

  11. Holocene Vegetation Changes in Eastern Kamchatka Based on Pollen and Macrofossil Records

    NASA Astrophysics Data System (ADS)

    Dirksen, V.

    2004-12-01

    Little is known about the Quaternary climate and vegetation history of Kamchatka. Only a few previous studies have provided paleoenvironmental information for this area, but these studies have poor age control and are inconsistent. To reconstruct paleoclimate and both regional and local vegetation history we are analyzing continuous, high-resolution pollen and macrofossil records from peats on Kamchatka. Thin, well-dated ash layers in these peats provide excellent age control; sections sampled thus far range in age back to 12,000 years. Herein we report results from one example, the Uka peat section (57.8°N, 162.2°E; about 10 m a.s.l.). This section is located on a morainal terrace close to a small lake. The basal section is lacustrine clay with a few spores pointing to scarce vegetation under cold conditions, probably latest Pleistocene. This clay is replaced upward by limnic peat, probably early Holocene (pollen zone 1). This zone is characterized by dominance of shrub alder and birch, herbs, and ferns. The highest value (in the whole section) of sage and the absence of tree pollen suggest a treeless landscape and thin vegetation cover under still cold conditions, while an increase in local aquatic pollen indicates lake filling and shrinking. In Zone 2 (ca. 8000-4000 BP), mire vegetation shows successive development of eutrophic fen including three pulses of sphagnum followed by sedge peaks. Pollen concentration is very low, probably indicating high deposition rates. A warming trend is suggested by the appearance ca. 5600 BP of tree birch, increasing to the end of zone 2, while alder strongly decreases. The most pronounced changes in both regional and local vegetation are found ca. 3800 BP, when tree birch pollen reaches its highest value and a few long-transported spruce pollen grains appear. The dominant (eutrophic) sedge is suddenly replaced by a more oligotrophic one. Such local components as grasses, shrub birch and willow increase, and total pollen concentration dramatically increases. All these features suggest drier conditions in zone 3 (3800-1200 BP), although this should be confirmed by further studies. Zone 4 (1200 BP to present) shows development of an oligotrophic peat bog indicated by high values of heath; the establishment of shrub pine is signaled by a sharp increase in its pollen.

  12. Co-occurrence of medical conditions: Exposing patterns through probabilistic topic modeling of snomed codes.

    PubMed

    Bhattacharya, Moumita; Jurkovitz, Claudine; Shatkay, Hagit

    2018-04-12

    Patients with multiple co-occurring health conditions often face aggravated complications and less favorable outcomes. Co-occurring conditions are especially prevalent among individuals suffering from kidney disease, an increasingly widespread condition affecting 13% of the general population in the US. This study aims to identify and characterize patterns of co-occurring medical conditions in patients using a probabilistic framework. Specifically, we apply topic modeling in a non-traditional way to find associations across SNOMED-CT codes assigned and recorded in the EHRs of >13,000 patients diagnosed with kidney disease. Unlike most prior work on topic modeling, we apply the method to codes rather than to natural language. Moreover, we quantitatively evaluate the topics, assessing their tightness and distinctiveness, and also assess the medical validity of our results. Our experiments show that each topic is succinctly characterized by a few highly probable and unique disease codes, indicating that the topics are tight. Furthermore, the inter-topic distance between each pair of topics is typically high, illustrating distinctiveness. Last, most coded conditions grouped together within a topic are indeed reported to co-occur in the medical literature. Notably, our results uncover a few indirect associations among conditions that have hitherto not been reported as correlated in the medical literature. Copyright © 2018. Published by Elsevier Inc.
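
    A hedged sketch of topic modeling applied to code lists rather than natural language, in the spirit of the approach described above; the codes are toy strings, not real SNOMED-CT identifiers, and scikit-learn's LDA stands in for whatever implementation the study used.

    ```python
    # Topic modeling over diagnosis codes: each patient's code list is treated as
    # one "document" of tokens. Codes below are toy strings, not SNOMED-CT codes.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    patients = [
        "ckd3 anemia htn",              # one space-separated code list per patient
        "ckd3 htn dm2 neuropathy",
        "dm2 neuropathy retinopathy",
        "anemia ckd4 htn",
        "dm2 retinopathy htn",
    ]

    vectorizer = CountVectorizer(token_pattern=r"\S+")
    X = vectorizer.fit_transform(patients)

    lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)
    codes = vectorizer.get_feature_names_out()
    for k, topic in enumerate(lda.components_):
        top = [codes[i] for i in topic.argsort()[::-1][:3]]
        print(f"topic {k}: most probable codes = {top}")
    ```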

  13. Quantum key distribution without the wavefunction

    NASA Astrophysics Data System (ADS)

    Niestegge, Gerd

    A well-known feature of quantum mechanics is the secure exchange of secret bit strings which can then be used as keys to encrypt messages transmitted over any classical communication channel. It is demonstrated that this quantum key distribution allows a much more general and abstract access than commonly thought. The results include some generalizations of the Hilbert space version of quantum key distribution, but are based upon a general nonclassical extension of conditional probability. A special state-independent conditional probability is identified as origin of the superior security of quantum key distribution; this is a purely algebraic property of the quantum logic and represents the transition probability between the outcomes of two consecutive quantum measurements.

  14. What Health Issues or Conditions Affect Women Differently Than Men?

    MedlinePlus

    ... tract is structured. 13 National Cancer Institute. (2010). Probability of breast cancer in American women . Retrieved August ... from http://www.cancer.gov/cancertopics/factsheet/detection/probability-breast-cancer National Cancer Institute. (2017). General information ...

  15. Probability effects on stimulus evaluation and response processes

    NASA Technical Reports Server (NTRS)

    Gehring, W. J.; Gratton, G.; Coles, M. G.; Donchin, E.

    1992-01-01

    This study investigated the effects of probability information on response preparation and stimulus evaluation. Eight subjects responded with one hand to the target letter H and with the other to the target letter S. The target letter was surrounded by noise letters that were either the same as or different from the target letter. In 2 conditions, the targets were preceded by a warning stimulus unrelated to the target letter. In 2 other conditions, a warning letter predicted that the same letter or the opposite letter would appear as the imperative stimulus with .80 probability. Correct reaction times were faster and error rates were lower when imperative stimuli confirmed the predictions of the warning stimulus. Probability information affected (a) the preparation of motor responses during the foreperiod, (b) the development of expectancies for a particular target letter, and (c) a process sensitive to the identities of letter stimuli but not to their locations.

  16. [OPEN FIELD BEHAVIOR AS A PREDICTIVE CRITERIA REFLECTING RATS CORTICOSTERONELEVEL BEFORE AND AFTER STRESS].

    PubMed

    Umriukhin, P E; Grigorchuk, O S

    2015-12-01

    In the present study we investigated whether open field behavior data can be used to predict corticosterone levels in rat blood plasma before and after stress. The open field behavior parameters that most reliably indicate a high probability of significant upregulation of corticosterone after 3 hours of immobilization are a short latency of first movement and low locomotor activity during the test. Rats with high corticosterone under normal, non-stress conditions are characterized by low locomotor activity and, conversely, by a long latency to enter the center of the open field.

  17. Study on optimization method of test conditions for fatigue crack detection using lock-in vibrothermography

    NASA Astrophysics Data System (ADS)

    Min, Qing-xu; Zhu, Jun-zhen; Feng, Fu-zhou; Xu, Chao; Sun, Ji-wei

    2017-06-01

    In this paper, lock-in vibrothermography (LVT) is utilized for defect detection. Specifically, for a metal plate with an artificial fatigue crack, the temperature rise of the defective area is used to analyze the influence of different test conditions, i.e. engagement force, excitation intensity, and modulation frequency. Multivariate nonlinear and logistic regression models are employed to estimate the POD (probability of detection) and POA (probability of alarm) of the fatigue crack, respectively. The resulting optimal selection of test conditions is presented. The study aims to provide an optimized method for selecting test conditions in a vibrothermography system with enhanced detection ability.

  18. Biostimulators: A New Trend towards Solving an Old Problem.

    PubMed

    Posmyk, Małgorzata M; Szafrańska, Katarzyna

    2016-01-01

    Stresses provoked by adverse living conditions are inherent to a changing environment (climate change and anthropogenic influence) and they are basic factors that limit plant development and yields. Agriculture has always struggled with this problem. The survey of non-toxic, natural, active substances useful in the protection and stimulation of plants growing under suboptimal and even harmful conditions, as well as the search for the most effective methods for their application, will direct our activities toward sustainable development and harmony with nature. It seems highly probable that boosting natural plant defense strategies by applying biostimulators will help to solve the old problem of poor yield in plant cultivation by promoting better growth and development even under suboptimal environmental conditions. This work is a concise review of such substances and methods of their application to plants.

  19. Collective choice in ants: the role of protein and carbohydrates ratios.

    PubMed

    Arganda, S; Nicolis, S C; Perochain, A; Péchabadens, C; Latil, G; Dussutour, A

    2014-10-01

    In a foraging context, social insects make collective decisions from individuals responding to local information. When faced with foods varying in quality, ants are known to be able to select the best food source using pheromone trails. Until now, studies investigating collective decisions have focused on single nutrients, mostly carbohydrates. In the environment, the foods available are a complex mixture and are composed of various nutrients, available in different forms. In this paper, we explore the effect of the protein to carbohydrate ratio on ants' ability to detect and choose between foods with different protein characteristics (free amino acids or whole proteins). In a two-choice setup, Argentine ants Linepithema humile were presented with two artificial foods containing either whole protein or amino acids in two different dietary conditions: high protein food or high carbohydrate food. At the collective level, when ants were faced with high carbohydrate foods, they did not show a preference between free amino acids or whole proteins, while a preference for free amino acids emerged when choosing between high protein foods. At the individual level, the probability of feeding was higher for high carbohydrate foods and for foods containing free amino acids. Two mathematical models were developed to evaluate the importance of feeding probability in collective food selection: a first model in which a forager deposits pheromone only after feeding, and a second in which a forager always deposits pheromone, but with greater intensity after feeding. Both models were able to predict free amino acid selection; however, the second was better able to reproduce the experimental results, suggesting that modulating trail strength according to feeding probability is likely the mechanism explaining amino acid preference at the collective level in Argentine ants. Copyright © 2014 Elsevier Ltd. All rights reserved.

  20. Temporal patterns of apparent leg band retention in North American geese

    USGS Publications Warehouse

    Zimmerman, Guthrie S.; Kendall, William L.; Moser, Timothy J.; White, Gary C.; Doherty, Paul F.

    2009-01-01

    An important assumption of mark–recapture studies is that individuals retain their marks, which has not been assessed for goose reward bands. We estimated aluminum leg band retention probabilities and modeled how band retention varied with band type (standard vs. reward band), band age (1-40 months), and goose characteristics (species and size class) for Canada (Branta canadensis), cackling (Branta hutchinsii), snow (Chen caerulescens), and Ross's (Chen rossii) geese that field coordinators double-leg banded during a North American goose reward band study (N = 40,999 individuals from 15 populations). We conditioned all models in this analysis on geese that were encountered with ≥1 leg band still attached (n = 5,747 dead recoveries and live recaptures). Retention probabilities for standard aluminum leg bands were high (estimate of 0.9995, SE = 0.001) and constant over 1-40 months. In contrast, apparent retention probabilities for reward bands demonstrated an interactive relationship between 5 size and species classes (small cackling, medium Canada, large Canada, snow, and Ross's geese). In addition, apparent retention probabilities for each of the 5 classes varied quadratically with time, being lower immediately after banding and at older age classes. The differential retention probabilities among band types (reward vs. standard) that we observed suggest that 1) models estimating reporting probability should incorporate differential band loss if it is nontrivial, 2) goose managers should consider the costs and benefits of double-banding geese on an operational basis, and 3) the United States Geological Survey Bird Banding Lab should modify protocols for receiving recovery data.

  1. Pyrrolizidine alkaloid-containing toxic plants (Senecio, Crotalaria, Cynoglossum, Amsinckia, Heliotropium, and Echium spp.).

    PubMed

    Stegelmeier, Bryan L

    2011-07-01

    Pyrrolizidine alkaloid (PA)-containing plants are found throughout the world and are probably the most common plant cause of poisoning of livestock, wildlife, and humans. PAs are potent liver toxins that under some conditions can be carcinogenic. This article briefly introduces high-risk North American PA-containing plants, summarizing their toxicity and subsequent pathology. Current diagnostic techniques, treatments, and strategies to avoid losses to PA poisoning are also reviewed. Published by Elsevier Inc.

  2. Physical interrelation between Fokker-Planck and random walk models with application to Coulomb interactions.

    NASA Technical Reports Server (NTRS)

    Englert, G. W.

    1971-01-01

    A model of the random walk is formulated to allow a simple computing procedure to replace the difficult problem of solution of the Fokker-Planck equation. The step sizes and probabilities of taking steps in the various directions are expressed in terms of Fokker-Planck coefficients. Application is made to many particle systems with Coulomb interactions. The relaxation of a highly peaked velocity distribution of particles to equilibrium conditions is illustrated.
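
    The mapping from Fokker-Planck coefficients to step sizes and step probabilities can be illustrated with a simple 1-D sketch (a generic construction under assumed drift and diffusion coefficients, not the paper's specific formulation): the step size is set by the diffusion coefficient and the step probability by the drift, and a peaked initial distribution relaxes toward equilibrium.

        import numpy as np

        # Hypothetical 1-D Fokker-Planck coefficients (assumptions): drift A(x) and diffusion D(x)
        def A(x):
            return -x          # relaxation toward the origin

        def D(x):
            return 1.0         # constant diffusion

        dt = 1e-3
        rng = np.random.default_rng(0)

        def random_walk(x0, n_steps):
            x = x0
            for _ in range(n_steps):
                dx = np.sqrt(2.0 * D(x) * dt)            # step size set by the diffusion coefficient
                p_right = 0.5 * (1.0 + A(x) * dt / dx)   # step probability set by the drift coefficient
                x += dx if rng.random() < p_right else -dx
            return x

        # Relaxation of a highly peaked initial distribution toward equilibrium
        samples = np.array([random_walk(3.0, 5000) for _ in range(500)])
        print(samples.mean(), samples.var())   # approach 0 and 1 for this choice of A and D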

  3. Teaching Probabilities and Statistics to Preschool Children

    ERIC Educational Resources Information Center

    Pange, Jenny

    2003-01-01

    This study considers the teaching of probabilities and statistics to a group of preschool children using traditional classroom activities and Internet games. It was clear from this study that children can show a high level of understanding of probabilities and statistics, and demonstrate high performance in probability games. The use of Internet…

  4. Cost of Crashes Related to Road Conditions, United States, 2006

    PubMed Central

    Zaloshnja, Eduard; Miller, Ted R.

    2009-01-01

    This is the first study to estimate the cost of crashes related to road conditions in the U.S. To model the probability that road conditions contributed to the involvement of a vehicle in a crash, we used 2000–03 Large Truck Crash Causation Study (LTCCS) data, the only dataset that provides detailed information on whether road conditions contributed to crash occurrence. We applied the logistic regression results to a costed national crash dataset in order to calculate the probability that road conditions contributed to the involvement of a vehicle in each crash. In crashes where someone was moderately to seriously injured (AIS 2-6) in a vehicle that harmfully impacted a large tree or a medium or large non-breakaway pole, or where the first harmful event was a collision with a bridge, we changed the calculated probability of being road-related to 1. We used the state distribution of costs of fatal crashes where road conditions contributed to crash occurrence or severity to estimate the corresponding state distribution of non-fatal crash costs. The estimated comprehensive cost of traffic crashes where road conditions contributed to crash occurrence or severity was $217.5 billion in 2006. This represented 43.6% of the total comprehensive crash cost. The large share of crash costs related to road design and conditions underlines the importance of these factors in highway safety. Road conditions are largely controllable. Road maintenance and upgrading can prevent crashes and reduce injury severity. PMID:20184840

  5. A state comparison amplifier with feed forward state correction

    NASA Astrophysics Data System (ADS)

    Mazzarella, Luca; Donaldson, Ross; Collins, Robert; Zanforlin, Ugo; Buller, Gerald; Jeffers, John

    2017-04-01

    The Quantum State Comparison AMPlifier (SCAMP) is a probabilistic amplifier that works for known sets of coherent states. The input state is mixed with a guess state at a beam splitter and one of the output ports is coupled to a detector. The other output contains the amplified state, which is accepted on the condition that no counts are recorded. The system uses only classical resources and has been shown to achieve high gain and repetition rate. However, the output fidelity is not high enough for most quantum communication purposes. Here we show how the success probability and fidelity are enhanced by repeated comparison stages, conditioning later state choices on the outcomes of earlier detections. A detector firing at an early stage means that a guess is wrong; this knowledge allows us to correct the state perfectly. The system requires fast switching between different input states, but still requires only classical resources. Figures of merit compare favourably with other schemes; most notably, the probability-fidelity product is higher than for unambiguous state discrimination. Due to its simplicity, the system is a candidate to counteract quantum signal degradation in a lossy fibre or as a quantum receiver to improve the key rate of continuous variable quantum communication. The work was supported by the QComm Project of the UK Engineering and Physical Sciences Research Council (EP/M013472/1).

  6. The electron localization as the information content of the conditional pair density

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Urbina, Andres S.; Torres, F. Javier; Universidad San Francisco de Quito

    2016-06-28

    In the present work, the information gained by an electron for “knowing” about the position of another electron with the same spin is calculated using the Kullback-Leibler divergence (D_KL) between the same-spin conditional pair probability density and the marginal probability. D_KL is proposed as an electron localization measurement, based on the observation that regions of space with high information gain can be associated with strongly correlated, localized electrons. Taking into consideration the scaling of D_KL with the number of σ-spin electrons of a system (N^σ), the quantity χ = (N^σ − 1) D_KL f_cut is introduced as a general descriptor that allows the quantification of electron localization in space. f_cut is defined such that it goes smoothly to zero for negligible densities. χ is computed for a selection of atomic and molecular systems in order to test its capability to determine the regions in space where electrons are localized. As a general conclusion, χ is able to explain the electron structure of molecules on chemical grounds with a high degree of success and to produce a clear differentiation of the localization of electrons that can be traced to the fluctuation in the average number of electrons in these regions.
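
    A minimal numerical sketch of the divergence itself, with toy one-dimensional densities standing in for the same-spin conditional pair density and the marginal density (in the paper these quantities come from electronic-structure calculations), is:

        import numpy as np

        # Toy 1-D densities on a grid (assumptions, not actual electron densities):
        # p_cond(r) = same-spin conditional pair density, p_marg(r) = marginal density
        x = np.linspace(-5, 5, 2001)
        p_marg = np.exp(-x**2 / 2)
        p_marg /= np.trapz(p_marg, x)
        p_cond = np.exp(-(x - 1.5)**2 / 0.5)
        p_cond /= np.trapz(p_cond, x)

        # Kullback-Leibler divergence D_KL(p_cond || p_marg): information gained about
        # the position of one electron from knowing the position of a same-spin electron
        eps = 1e-300
        d_kl = np.trapz(p_cond * np.log((p_cond + eps) / (p_marg + eps)), x)
        print(f"D_KL = {d_kl:.3f} nats")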

  7. Coherent nature of the radiation emitted in delayed luminescence of leaves

    PubMed

    Bajpai

    1999-06-07

    After exposure to light, a living system emits a photon signal of characteristic shape. The signal has a small decay region and a long tail region. The flux of photons in the decay region changes by 2 to 3 orders of magnitude, but remains almost constant in the tail region. The decaying part is attributed to delayed luminescence and the constant part to ultra-weak luminescence. Biophoton emission is the common name given to both kinds of luminescence, and the photons emitted are called biophotons. The decay character of the biophoton signal is not exponential, which is suggestive of a coherent signal. We sought to establish the coherent nature by measuring the conditional probability of zero photon detection in a small interval Delta. Our measurements establish the coherent nature of biophotons emitted by different leaves at various temperatures in the range 15-50 degrees C. Our setup could measure the conditional probability for Delta

  8. Quantifying restoration effectiveness using multi-scale habitat models: implications for sage-grouse in the Great Basin

    USGS Publications Warehouse

    Arkle, Robert S.; Pilliod, David S.; Hanser, Steven E.; Brooks, Matthew L.; Chambers, Jeanne C.; Grace, James B.; Knutson, Kevin C.; Pyke, David A.; Welty, Justin L.

    2014-01-01

    A recurrent challenge in the conservation of wide-ranging, imperiled species is understanding which habitats to protect and whether we are capable of restoring degraded landscapes. For Greater Sage-grouse (Centrocercus urophasianus), a species of conservation concern in the western United States, we approached this problem by developing multi-scale empirical models of occupancy in 211 randomly located plots within a 40 million ha portion of the species' range. We then used these models to predict sage-grouse habitat quality at 826 plots associated with 101 post-wildfire seeding projects implemented from 1990 to 2003. We also compared conditions at restoration sites to published habitat guidelines. Sage-grouse occupancy was positively related to plot- and landscape-level dwarf sagebrush (Artemisia arbuscula, A. nova, A. tripartita) and big sagebrush steppe prevalence, and negatively associated with non-native plants and human development. The predicted probability of sage-grouse occupancy at treated plots was low on average (0.09) and not substantially different from burned areas that had not been treated. Restoration sites with quality habitat tended to occur at higher elevation locations with low annual temperatures, high spring precipitation, and high plant diversity. Of 313 plots seeded after fire, none met all sagebrush guidelines for breeding habitats, but approximately 50% met understory guidelines, particularly for perennial grasses. This pattern was similar for summer habitat. Less than 2% of treated plots met winter habitat guidelines. Restoration actions did not increase the probability of burned areas meeting most guideline criteria. The probability of meeting guidelines was influenced by a latitudinal gradient, climate, and topography. Our results suggest that sage-grouse are relatively unlikely to use many burned areas within 20 years of fire, regardless of treatment. Understory habitat conditions are more likely to be adequate than overstory conditions, but in most climates, establishing forbs and reducing cheatgrass dominance is unlikely. Reestablishing sagebrush cover will require more than 20 years using past restoration methods. Given current fire frequencies and restoration capabilities, protection of landscapes containing a mix of dwarf sagebrush and big sagebrush steppe, minimal human development, and low non-native plant cover may provide the best opportunity for conservation of sage-grouse habitats.

  9. Influence of Psychiatric Comorbidity on Recovery and Recurrence in Generalized Anxiety Disorder, Social Phobia, and Panic Disorder: A 12-Year Prospective Study

    PubMed Central

    Bruce, Steven E.; Yonkers, Kimberly A.; Otto, Michael W.; Eisen, Jane L.; Weisberg, Risa B.; Pagano, Maria; Shea, M. Tracie; Keller, Martin B.

    2012-01-01

    Objective The authors sought to observe the long-term clinical course of anxiety disorders over 12 years and to examine the influence of comorbid psychiatric disorders on recovery from or recurrence of panic disorder, generalized anxiety disorder, and social phobia. Method Data were drawn from the Harvard/Brown Anxiety Disorders Research Program, a prospective, naturalistic, longitudinal, multicenter study of adults with a current or past history of anxiety disorders. Probabilities of recovery and recurrence were calculated by using standard survival analysis methods. Proportional hazards regression analyses with time-varying covariates were conducted to determine risk ratios for possible comorbid psychiatric predictors of recovery and recurrence. Results Survival analyses revealed an overall chronic course for the majority of the anxiety disorders. Social phobia had the smallest probability of recovery after 12 years of follow-up. Moreover, patients who had prospectively observed recovery from their intake anxiety disorder had a high probability of recurrence over the follow-up period. The overall clinical course was worsened by several comorbid psychiatric conditions, including major depression and alcohol and other substance use disorders, and by comorbidity of generalized anxiety disorder and panic disorder with agoraphobia. Conclusions These data depict the anxiety disorders as insidious, with a chronic clinical course, low rates of recovery, and relatively high probabilities of recurrence. The presence of particular comorbid psychiatric disorders significantly lowered the likelihood of recovery from anxiety disorders and increased the likelihood of their recurrence. The findings add to the understanding of the nosology and treatment of these disorders. PMID:15930067

  10. Stress vulnerability and the effects of moderate daily stress on sleep polysomnography and subjective sleepiness.

    PubMed

    Petersen, Helena; Kecklund, Göran; D'Onofrio, Paolo; Nilsson, Jens; Åkerstedt, Torbjörn

    2013-02-01

    The purpose of this study was to investigate if and how sleep physiology is affected by naturally occurring high work stress and to identify individual differences in the response of sleep to stress. Probable upcoming stress levels were estimated through weekly web questionnaire ratings. Based on the modified FIRST scale (Ford Insomnia Response to Stress), participants were grouped into high (n = 9) or low (n = 19) sensitivity to stress-related sleep disturbances (Drake et al., 2004). Sleep was recorded in 28 teachers with polysomnography, sleep diaries and actigraphs during one high stress and one low stress condition in the participants' homes. EEG showed a decrease in sleep efficiency during the high stress condition. Significant interactions between group and condition were seen for REM sleep, arousals and stage transitions. The sensitive group had an increase in arousals and stage transitions during the high stress condition and a decrease in REM, whereas the opposite was seen in the resilient group. Diary ratings during the high stress condition showed higher bedtime stress and lower ratings on the awakening index (insufficient sleep and difficulties awakening). Ratings also showed lower cognitive function and preoccupation with work thoughts in the evening. KSS ratings of sleepiness increased during stress for the sensitive group. Saliva samples of cortisol showed no effect of stress. It was concluded that moderate daily stress is associated with a moderate negative effect on sleep efficiency and fragmentation. A slightly stronger effect was seen in the sensitive group. © 2012 European Sleep Research Society.

  11. Attentional responses to stimuli associated with a reward can occur in the absence of knowledge of their predictive values.

    PubMed

    Leganes-Fonteneau, Mateo; Scott, Ryan; Duka, Theodora

    2018-04-02

    Classical conditioning theories of addiction suggest that stimuli associated with rewards acquire incentive salience, inducing emotional and attentional conditioned responses. It is not clear whether those responses can occur without contingency awareness (CA), i.e. whether they are based on explicit or implicit learning processes. Examining implicit aspects of stimulus-reward associations can improve our understanding of addictive behaviours, supporting treatment and prevention strategies. However, the acquisition of conditioned responses without CA has yet to be rigorously demonstrated, as the existing literature shows a lack of methodological agreement regarding the measurement of implicit and explicit processes. The purpose of the two experiments presented here was to study the emotional value acquired by CS through implicit emotional and attentional processes, trying to overcome critical methodological issues. Experiment 1 (n = 48) paired two stimulus categories (houses/buildings) with high (HR) or low (LR) probabilities of monetary reward. An Emotional Attentional Blink revealed preferential attention for HR over LR regardless of CA, while pleasantness ratings were unaffected, probably due to the intrinsic nature of the CS. Experiment 2 (n = 60) replicated the effect of conditioning on the Emotional Attentional Blink utilising abstract CS (octagons/squares). In addition, increased pleasantness for HR over LR was significant overall, and marginally significant for Aware but not for Unaware participants. Here CA was rigorously determined using a signal-detection analysis and metacognitive-awareness measurements. Bayesian analyses verified the unconscious nature of the learning. These findings demonstrate that attentional conditioned responses can occur without CA and advance our understanding of the mechanisms by which implicit conditioning can occur and becomes observable. Furthermore, these results can highlight how addictive behaviours might develop. Copyright © 2017 Elsevier B.V. All rights reserved.

  12. The efficacy of fuel treatment in mitigating property loss during wildfires: Insights from analysis of the severity of the catastrophic fires in 2009 in Victoria, Australia.

    PubMed

    Price, Owen F; Bradstock, Ross A

    2012-12-30

    Treatment of fuel (e.g. prescribed fire, logging) in fire-prone ecosystems is done to reduce risks to people and their property but effects require quantification, particularly under severe weather conditions when the destructive potential of fires on human infrastructure is maximised. We analysed the relative effects of fuel age (i.e. indicative of the effectiveness of prescribed fire) and logging on remotely sensed (SPOT imagery) severity of fires which occurred in eucalypt forests in Victoria, Australia in 2009. These fires burned under the most severe weather conditions recorded in Australia and caused large losses of life and property. Statistical models of the probability of contrasting extremes of severity (crown fire versus fire confined to the understorey) were developed based on effects of fuel age, logging, weather, topography and forest type. Weather was the primary influence on severity, though it was reduced at low fuel ages in Moderate but not Catastrophic, Very High or Low fire-weather conditions. Probability of crown fires was higher in recently logged areas than in areas logged decades before, indicating likely ineffectiveness as a fuel treatment. The results suggest that recently burnt areas (up to 5-10 years) may reduce the intensity of the fire but not sufficiently to increase the chance of effective suppression under severe weather conditions. Since house loss was most likely under these conditions (67%), effects of prescribed burning across landscapes on house loss are likely to be small when weather conditions are severe. Fuel treatments need to be located close to houses in order to effectively mitigate risk of loss. Copyright © 2012 Elsevier Ltd. All rights reserved.

  13. Predicted sequence of cortical tau and amyloid-β deposition in Alzheimer disease spectrum.

    PubMed

    Cho, Hanna; Lee, Hye Sun; Choi, Jae Yong; Lee, Jae Hoon; Ryu, Young Hoon; Lee, Myung Sik; Lyoo, Chul Hyoung

    2018-04-17

    We investigated the sequential order of tau and amyloid-β (Aβ) deposition in the Alzheimer disease spectrum using a conditional probability method. Two hundred twenty participants underwent ¹⁸F-flortaucipir and ¹⁸F-florbetaben positron emission tomography scans and neuropsychological tests. The presence of tau and Aβ in each region and impairment in each cognitive domain were determined by Z-score cutoffs. By comparing pairs of conditional probabilities, the sequential order of tau and Aβ deposition was determined. The probability of the presence of tau in the entorhinal cortex was higher than that of Aβ in all cortical regions, and in the medial temporal cortices, the probability of the presence of tau was higher than that of Aβ. Conversely, in the remaining neocortex above the inferior temporal cortex, the probability of the presence of Aβ was always higher than that of tau. Tau pathology in the entorhinal cortex may appear earlier than neocortical Aβ and may spread in the absence of Aβ within the neighboring medial temporal regions. However, Aβ may be required for massive tau deposition in the distant cortical areas. Copyright © 2018 Elsevier Inc. All rights reserved.
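
    The conditional probability comparison can be illustrated with a small sketch on simulated binary presence data (the data and cutoff logic here are assumptions, not the study's measurements):

        import numpy as np

        # Simulated binary presence data (assumptions): one entry per participant,
        # indicating tau in the entorhinal cortex and A-beta in a neocortical region
        rng = np.random.default_rng(1)
        tau_present = rng.random(220) < 0.6
        abeta_present = tau_present & (rng.random(220) < 0.5)   # A-beta mostly follows tau in this toy

        # Conditional probabilities estimated from co-occurrence counts
        p_tau_given_abeta = (tau_present & abeta_present).sum() / abeta_present.sum()
        p_abeta_given_tau = (tau_present & abeta_present).sum() / tau_present.sum()

        # If P(tau | A-beta) exceeds P(A-beta | tau), tau deposition is inferred to come first
        print(p_tau_given_abeta, p_abeta_given_tau)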

  14. Examining the relationship between motor assessments and handwriting consistency in children with and without probable developmental coordination disorder.

    PubMed

    Bo, Jin; Colbert, Alison; Lee, Chi-Mei; Schaffert, Jeffrey; Oswald, Kaitlin; Neill, Rebecca

    2014-09-01

    Children with Developmental Coordination Disorder (DCD) often experience difficulties in handwriting. The current study examined the relationships between three motor assessments and the spatial and temporal consistency of handwriting. Twelve children with probable DCD and 29 children from 7 to 12 years who were typically developing wrote the lowercase letters "e" and "l" in cursive and printed forms repetitively on a digitizing tablet. Three behavioral assessments, including the Beery-Buktenica Developmental Test of Visual-Motor Integration (VMI), the Minnesota Handwriting Assessment (MHA) and the Movement Assessment Battery for Children (MABC), were administered. Children with probable DCD had low scores on the VMI, MABC and MHA and showed high temporal, not spatial, variability in the letter-writing task. Their MABC scores related to temporal consistency in all handwriting conditions, and the Legibility scores in their MHA correlated with temporal consistency in cursive "e" and printed "l". It appears that children with probable DCD have prominent difficulties on the temporal aspect of handwriting. While the MHA is a good product-oriented assessment for measuring handwriting deficits, the MABC shows promise as a good assessment for capturing the temporal process of handwriting in children with DCD. Copyright © 2014 Elsevier Ltd. All rights reserved.

  15. Impact of physical properties on ozone removal by several porous materials.

    PubMed

    Gall, Elliott T; Corsi, Richard L; Siegel, Jeffrey A

    2014-04-01

    Models of reactive uptake of ozone in indoor environments generally describe materials through aerial (horizontal) projections of surface area, a potentially limiting assumption for porous materials. We investigated the effect of changing porosity/pore size, material thickness, and chamber fluid mechanic conditions on the reactive uptake of ozone to five materials: two cellulose filter papers, two cementitious materials, and an activated carbon cloth. Results include (1) material porosity and pore size distributions, (2) effective diffusion coefficients for ozone in materials, and (3) material-ozone deposition velocities and reaction probabilities. At small length scales (0.02-0.16 cm), increasing thickness caused increases in estimated reaction probabilities from 1 × 10⁻⁶ to 5 × 10⁻⁶ for one type of filter paper and from 1 × 10⁻⁶ to 1 × 10⁻⁵ for a second type of filter paper, an effect not observed for materials tested at larger thicknesses. For high porosity materials, increasing chamber transport-limited deposition velocities resulted in increases in reaction probabilities by factors of 1.4-2.0. The impact of physical properties and transport effects on values of the Thiele modulus, ranging across all materials from 0.03 to 13, is discussed in terms of the challenges in estimating reaction probabilities to porous materials in scenarios relevant to indoor environments.

  16. Determination of Acoustic Cavitation Probabilities and Thresholds Using a Single Focusing Transducer to Induce and Detect Acoustic Cavitation Events: I. Method and Terminology.

    PubMed

    Haller, Julian; Wilkens, Volker; Shaw, Adam

    2018-02-01

    A method to determine acoustic cavitation probabilities in tissue-mimicking materials (TMMs) is described that uses a high-intensity focused ultrasound (HIFU) transducer for both inducing and detecting the acoustic cavitation events. The method was evaluated by studying acoustic cavitation probabilities in agar-based TMMs with and without scatterers and for different sonication modes like continuous wave, single pulses (microseconds to milliseconds) and repeated burst signals. Acoustic cavitation thresholds (defined here as the peak rarefactional in situ pressure at which the acoustic cavitation probability reaches 50%) at a frequency of 1.06 MHz were observed between 1.1 MPa (for 1 s of continuous wave sonication) and 4.6 MPa (for 1 s of a repeated burst signal with 25-cycle burst length and 10-ms burst period) in a 3% (by weight) agar phantom without scatterers. The method and its evaluation are described, and general terminology useful for standardizing the description of insonation conditions and comparing results is provided. In the accompanying second part, the presented method is used to systematically study the acoustic cavitation thresholds in the same material for a range of sonication modes. Copyright © 2018 The Authors. Published by Elsevier Inc. All rights reserved.

  17. Lexicographic Probability, Conditional Probability, and Nonstandard Probability

    DTIC Science & Technology

    2009-11-11

    1994; Hammond 1999; Kohlberg and Reny 1997; Kreps and Wilson 1982; Myerson 1986; Selten 1965; Selten 1975]). It also arises in the analysis of...sets of measure 0): BBD considered three; Kohlberg and Reny [1997] considered two others. It turns out that these notions are perhaps best understood...number of characterizations of solution concepts depend on independence (see, for example, [Battigalli 1996; Kohlberg and Reny 1997; Battigalli and

  18. Quasi-probabilities in conditioned quantum measurement and a geometric/statistical interpretation of Aharonov's weak value

    NASA Astrophysics Data System (ADS)

    Lee, Jaeha; Tsutsui, Izumi

    2017-05-01

    We show that the joint behavior of an arbitrary pair of (generally noncommuting) quantum observables can be described by quasi-probabilities, which are an extended version of the standard probabilities used for describing the outcome of measurement for a single observable. The physical situations that require these quasi-probabilities arise when one considers quantum measurement of an observable conditioned by some other variable, with the notable example being the weak measurement employed to obtain Aharonov's weak value. Specifically, we present a general prescription for the construction of quasi-joint probability (QJP) distributions associated with a given combination of observables. These QJP distributions are introduced in two complementary approaches: one from a bottom-up, strictly operational construction realized by examining the mathematical framework of the conditioned measurement scheme, and the other from a top-down viewpoint realized by applying the results of the spectral theorem for normal operators and their Fourier transforms. It is then revealed that, for a pair of simultaneously measurable observables, the QJP distribution reduces to the unique standard joint probability distribution of the pair, whereas for a noncommuting pair there exists an inherent indefiniteness in the choice of such QJP distributions, admitting a multitude of candidates that may equally be used for describing the joint behavior of the pair. In the course of our argument, we find that the QJP distributions furnish the space of operators in the underlying Hilbert space with their characteristic geometric structures such that the orthogonal projections and inner products of observables can be given statistical interpretations as, respectively, “conditionings” and “correlations”. The weak value A_w for an observable A is then given a geometric/statistical interpretation as either the orthogonal projection of A onto the subspace generated by another observable B, or equivalently, as the conditioning of A given B with respect to the QJP distribution under consideration.

  19. NESTOR: A Computer-Based Medical Diagnostic Aid That Integrates Causal and Probabilistic Knowledge.

    DTIC Science & Technology

    1984-11-01

    individual conditional probabilities between one cause node and its effect node, but less common to know a joint conditional probability between a... [Fragment of the report documentation; author: Gregory F. Cooper, Department of Computer Science, Stanford University; supported by ONR contract N00014-81-K-0004.]

  20. Covariate-adjusted Spearman's rank correlation with probability-scale residuals.

    PubMed

    Liu, Qi; Li, Chun; Wanga, Valentine; Shepherd, Bryan E

    2018-06-01

    It is desirable to adjust Spearman's rank correlation for covariates, yet existing approaches have limitations. For example, the traditionally defined partial Spearman's correlation does not have a sensible population parameter, and the conditional Spearman's correlation defined with copulas cannot be easily generalized to discrete variables. We define population parameters for both partial and conditional Spearman's correlation through concordance-discordance probabilities. The definitions are natural extensions of Spearman's rank correlation in the presence of covariates and are general for any orderable random variables. We show that they can be neatly expressed using probability-scale residuals (PSRs). This connection allows us to derive simple estimators. Our partial estimator for Spearman's correlation between X and Y adjusted for Z is the correlation of PSRs from models of X on Z and of Y on Z, which is analogous to the partial Pearson's correlation derived as the correlation of observed-minus-expected residuals. Our conditional estimator is the conditional correlation of PSRs. We describe estimation and inference, and highlight the use of semiparametric cumulative probability models, which allow preservation of the rank-based nature of Spearman's correlation. We conduct simulations to evaluate the performance of our estimators and compare them with other popular measures of association, demonstrating their robustness and efficiency. We illustrate our method in two applications, a biomarker study and a large survey. © 2017, The International Biometric Society.
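
    A minimal sketch of the partial estimator, using simulated data and simple linear working models with normal errors in place of the semiparametric cumulative probability models used in the paper (all data and model choices below are assumptions):

        import numpy as np
        from scipy.stats import norm

        # Simulated data (assumption): Z drives both X and Y
        rng = np.random.default_rng(2)
        n = 500
        z = rng.normal(size=n)
        x = 2 * z + rng.normal(size=n)
        y = -z + rng.normal(size=n)

        def psr_linear(v, z):
            """Probability-scale residuals from a linear model of v on z with normal errors."""
            beta = np.polyfit(z, v, 1)
            resid = v - np.polyval(beta, z)
            sigma = resid.std(ddof=2)
            # PSR = P(model draw < observed) - P(model draw > observed) = 2*F(observed) - 1
            return 2 * norm.cdf(resid / sigma) - 1

        # Partial Spearman's correlation of X and Y adjusted for Z:
        # the correlation of the two sets of probability-scale residuals
        partial_spearman = np.corrcoef(psr_linear(x, z), psr_linear(y, z))[0, 1]
        print(f"partial Spearman (X, Y | Z) ~ {partial_spearman:.3f}")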

  1. GENERAL A Hierarchy of Compatibility and Comeasurability Levels in Quantum Logics with Unique Conditional Probabilities

    NASA Astrophysics Data System (ADS)

    Gerd, Niestegge

    2010-12-01

    In the quantum mechanical Hilbert space formalism, the probabilistic interpretation is a later ad-hoc add-on, more or less enforced by the experimental evidence, but not motivated by the mathematical model itself. A model involving a clear probabilistic interpretation from the very beginning is provided by the quantum logics with unique conditional probabilities. It includes the projection lattices in von Neumann algebras and here probability conditionalization becomes identical with the state transition of the Lüders-von Neumann measurement process. This motivates the definition of a hierarchy of five compatibility and comeasurability levels in the abstract setting of the quantum logics with unique conditional probabilities. Their meanings are: the absence of quantum interference or influence, the existence of a joint distribution, simultaneous measurability, and the independence of the final state after two successive measurements from the sequential order of these two measurements. A further level means that two elements of the quantum logic (events) belong to the same Boolean subalgebra. In the general case, the five compatibility and comeasurability levels appear to differ, but they all coincide in the common Hilbert space formalism of quantum mechanics, in von Neumann algebras, and in some other cases.

  2. Improving online risk assessment with equipment prognostics and health monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coble, Jamie B.; Liu, Xiaotong; Briere, Chris

    The current approach to evaluating the risk of nuclear power plant (NPP) operation relies on static probabilities of component failure, which are based on industry experience with the existing fleet of nominally similar light water reactors (LWRs). As the nuclear industry looks to advanced reactor designs that feature non-light water coolants (e.g., liquid metal, high temperature gas, molten salt), this operating history is not available. Many advanced reactor designs use advanced components, such as electromagnetic pumps, that have not been used in the US commercial nuclear fleet. Given the lack of rich operating experience, we cannot accurately estimate the evolving probability of failure for basic components to populate the fault trees and event trees that typically comprise probabilistic risk assessment (PRA) models. Online equipment prognostics and health management (PHM) technologies can bridge this gap to estimate the failure probabilities for components under operation. The enhanced risk monitor (ERM) incorporates equipment condition assessment into the existing PRA and risk monitor framework to provide accurate and timely estimates of operational risk.

  3. Two models for microsimulation of family life cycle and family structure.

    PubMed

    Bertino, S; Pinnelli, A; Vichi, M

    1988-01-01

    Two models are proposed for the microsimulation of the family and the analysis of family structure and life cycle. These models were devised primarily for teaching purposes. The families are composed of 3 generations (parents, grandparents, children). Cohabitation is not considered. The 1st model is governed by a transition mechanism based on the rules of a multidimensional, nonhomogeneous Markov chain. The 2nd model is based on stochastic point processes. Input data comprise the annual mortality probability according to sex, age, and civil status; the annual probability of 1st marriage; the age combinations between spouses; and the probability of having 1, 2, or 3 children at 6-month intervals from the previous event (marriage or birth of the nth child). The applications of the 1st model are presented using 2 mortality and fertility hypotheses (high and low) and a nuptiality hypothesis (of a West European nature). The various features of family composition are analyzed according to the duration of a couple's marriage and the age of the individual, as well as the characteristic features of the individual and family life cycle under these 2 demographic conditions.

  4. Development of a European Ensemble System for Seasonal Prediction: Application to crop yield

    NASA Astrophysics Data System (ADS)

    Terres, J. M.; Cantelaube, P.

    2003-04-01

    Western European agriculture is highly intensive, and the weather is the main source of uncertainty for crop yield assessment and for crop management. In the current system, at the time when a crop yield forecast is issued, the weather conditions leading up to harvest time are unknown and are therefore a major source of uncertainty. The use of seasonal weather forecasts would bring additional information for the remaining crop season and would have valuable benefits for improving the management of agricultural markets and environmentally sustainable farm practices. An innovative method for supplying seasonal forecast information to crop simulation models has been developed in the frame of the EU-funded research project DEMETER. It consists of running a crop model on each individual member of the seasonal hindcasts to derive a probability distribution of crop yield. Preliminary results for the cumulative probability function of wheat yield provide information on both the yield anomaly and the reliability of the forecast. Based on the spread of the probability distribution, the end-user can directly quantify the benefits and risks of taking weather-sensitive decisions.
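
    A minimal sketch of the final step, deriving a cumulative probability distribution of yield from ensemble members (the yield values below are assumptions, not DEMETER output):

        import numpy as np

        # Hypothetical wheat yields (t/ha) obtained by running a crop model on each
        # member of a seasonal hindcast ensemble (values are assumptions)
        yields = np.array([5.1, 5.8, 4.6, 6.2, 5.4, 4.9, 5.0, 6.0, 5.6, 4.4])

        # Empirical cumulative probability function of yield
        y_sorted = np.sort(yields)
        cum_prob = np.arange(1, yields.size + 1) / yields.size
        for y, p in zip(y_sorted, cum_prob):
            print(f"P(yield <= {y:.1f} t/ha) = {p:.1f}")

        # Probability of falling below a reference yield, e.g. an assumed climatological mean of 5.5 t/ha
        print("P(yield < 5.5 t/ha) =", np.mean(yields < 5.5))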

  5. Constraints on rapidity-dependent initial conditions from charged-particle pseudorapidity densities and two-particle correlations

    NASA Astrophysics Data System (ADS)

    Ke, Weiyao; Moreland, J. Scott; Bernhard, Jonah E.; Bass, Steffen A.

    2017-10-01

    We study the initial three-dimensional spatial configuration of the quark-gluon plasma (QGP) produced in relativistic heavy-ion collisions using centrality and pseudorapidity-dependent measurements of the medium's charged particle density and two-particle correlations. A cumulant-generating function is first used to parametrize the rapidity dependence of local entropy deposition and extend arbitrary boost-invariant initial conditions to nonzero beam rapidities. The model is then compared to p +Pb and Pb + Pb charged-particle pseudorapidity densities and two-particle pseudorapidity correlations and systematically optimized using Bayesian parameter estimation to extract high-probability initial condition parameters. The optimized initial conditions are then compared to a number of experimental observables including the pseudorapidity-dependent anisotropic flows, event-plane decorrelations, and flow correlations. We find that the form of the initial local longitudinal entropy profile is well constrained by these experimental measurements.

  6. Bottom-Water Conditions in a Marine Basin after the Cretaceous–Paleogene Impact Event: Timing the Recovery of Oxygen Levels and Productivity

    PubMed Central

    Sosa-Montes De Oca, Claudia; Martínez-Ruiz, Francisca; Rodríguez-Tovar, Francisco Javier

    2013-01-01

    An ultra-high-resolution analysis of major and trace element contents from the Cretaceous–Paleogene boundary interval in the Caravaca section, southeast Spain, reveals a quick recovery of depositional conditions after the impact event. Enrichment/depletion profiles of redox sensitive elements indicate significant geochemical anomalies just within the boundary ejecta layer, supporting an instantaneous recovery – some 10² years – of pre-impact conditions in terms of oxygenation. Geochemical redox proxies point to oxygen levels comparable to those at the end of the Cretaceous shortly after impact, which is further evidenced by the contemporary macrobenthic colonization of opportunistic tracemakers. Recovery of the oxygen conditions was therefore several orders of magnitude shorter than traditional proposals (10⁴–10⁵ years), suggesting a probable rapid recovery of deep-sea ecosystems at the bottom and in intermediate waters. PMID:24349232

  7. Partitioning into hazard subregions for regional peaks-over-threshold modeling of heavy precipitation

    NASA Astrophysics Data System (ADS)

    Carreau, J.; Naveau, P.; Neppel, L.

    2017-05-01

    The French Mediterranean is subject to intense precipitation events occurring mostly in autumn. These can potentially cause flash floods, the main natural danger in the area. The distribution of these events follows specific spatial patterns, i.e., some sites are more likely to be affected than others. The peaks-over-threshold approach consists in modeling extremes, such as heavy precipitation, by the generalized Pareto (GP) distribution. The shape parameter of the GP controls the probability of extreme events and can be related to the hazard level of a given site. When interpolating across a region, the shape parameter should reproduce the observed spatial patterns of the probability of heavy precipitation. However, the shape parameter estimators have high uncertainty which might hide the underlying spatial variability. As a compromise, we choose to let the shape parameter vary in a moderate fashion. More precisely, we assume that the region of interest can be partitioned into subregions with constant hazard level. We formalize the model as a conditional mixture of GP distributions. We develop a two-step inference strategy based on probability weighted moments and put forward a cross-validation procedure to select the number of subregions. A synthetic data study reveals that the inference strategy is consistent and not very sensitive to the selected number of subregions. An application on daily precipitation data from the French Mediterranean shows that the conditional mixture of GPs outperforms two interpolation approaches (with constant or smoothly varying shape parameter).
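
    The probability-weighted-moment inference step for a single site can be sketched as follows (a standard generalized Pareto fit to assumed excess data, not the conditional-mixture model developed in the paper):

        import numpy as np

        def gpd_pwm_fit(excesses):
            """Fit a generalized Pareto distribution to threshold excesses by probability
            weighted moments (Hosking & Wallis parametrization,
            F(y) = 1 - (1 - k*y/sigma)**(1/k), with shape xi = -k)."""
            y = np.sort(excesses)
            n = len(y)
            a0 = y.mean()                                              # E[Y]
            a1 = np.sum(y * (n - np.arange(1, n + 1)) / (n - 1)) / n   # E[Y * (1 - F(Y))]
            k = a0 / (a0 - 2 * a1) - 2
            sigma = 2 * a0 * a1 / (a0 - 2 * a1)
            return sigma, -k    # scale, shape xi

        # Toy daily-precipitation excesses over a high threshold (assumed data, in mm)
        rng = np.random.default_rng(3)
        excesses = rng.pareto(4.0, size=300) * 20.0
        sigma, xi = gpd_pwm_fit(excesses)
        print(f"GP scale = {sigma:.1f} mm, shape = {xi:.2f}")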

  8. The Importance of Conditional Probability in Diagnostic Reasoning and Clinical Decision Making: A Primer for the Eye Care Practitioner.

    PubMed

    Sanfilippo, Paul G; Hewitt, Alex W; Mackey, David A

    2017-04-01

    To outline and detail the importance of conditional probability in clinical decision making and to discuss the various diagnostic measures eye care practitioners should be aware of in order to improve the scope of their clinical practice. We conducted a review of the importance of conditional probability in diagnostic testing for the eye care practitioner. Eye care practitioners use diagnostic tests on a daily basis to assist in clinical decision making and in optimizing patient care and management. These tests provide probabilistic information that can enable the clinician to increase (or decrease) their level of certainty about the presence of a particular condition. While an understanding of the characteristics of diagnostic tests is essential to facilitate proper interpretation of test results and disease risk, many practitioners either confuse or misinterpret these measures. In the interests of their patients, practitioners should be aware of the basic concepts associated with diagnostic testing and the simple mathematical rule that underpins them. Importantly, the practitioner needs to recognize that the prevalence of a disease in the population greatly determines the clinical value of a diagnostic test.
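
    The simple mathematical rule referred to here is Bayes' theorem. A worked sketch with hypothetical test characteristics shows how prevalence drives the post-test probability (the sensitivity, specificity, and prevalence values are illustrative assumptions):

        def post_test_probability(prevalence, sensitivity, specificity, positive=True):
            """Bayes' rule: P(disease | test result) from test characteristics and prevalence."""
            if positive:
                true_pos = sensitivity * prevalence
                false_pos = (1 - specificity) * (1 - prevalence)
                return true_pos / (true_pos + false_pos)       # positive predictive value
            false_neg = (1 - sensitivity) * prevalence
            true_neg = specificity * (1 - prevalence)
            return false_neg / (false_neg + true_neg)           # P(disease | negative test)

        # Hypothetical screening test: 85% sensitivity, 90% specificity
        print(post_test_probability(0.02, 0.85, 0.90))   # ~0.15 in a low-prevalence population
        print(post_test_probability(0.30, 0.85, 0.90))   # ~0.78 in a high-risk clinic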

  9. Future southcentral US wildfire probability due to climate change

    USGS Publications Warehouse

    Stambaugh, Michael C.; Guyette, Richard P.; Stroh, Esther D.; Struckhoff, Matthew A.; Whittier, Joanna B.

    2018-01-01

    Globally, changing fire regimes due to climate are one of the greatest threats to ecosystems and society. In this paper, we present projections of future fire probability for the southcentral USA using downscaled climate projections and the Physical Chemistry Fire Frequency Model (PC2FM). Future fire probability is projected to both increase and decrease across the study region of Oklahoma, New Mexico, and Texas. Among all end-of-century projections, changes in fire probability (CFPs) range from −51% to +240%. The greatest absolute increases in fire probability are shown for areas within the range of approximately 75 to 160 cm mean annual precipitation (MAP), regardless of climate model. Although fire is likely to become more frequent across the southcentral USA, spatial patterns may remain similar unless significant increases in precipitation occur, whereby more extensive areas with increased fire probability are predicted. Perhaps one of the most important results is the identification of climate changes at which the fire probability response (+, −) may change direction (i.e., tipping points). Fire regimes of southcentral US ecosystems occur in a geographic transition zone from reactant- to reaction-limited conditions, potentially making them uniquely responsive to different scenarios of temperature and precipitation changes. Identification and description of these conditions may help anticipate fire regime changes that will affect human health, agriculture, species conservation, and nutrient and water cycling.

  10. Laser based in-situ and standoff detection of chemical warfare agents and explosives

    NASA Astrophysics Data System (ADS)

    Patel, C. Kumar N.

    2009-09-01

    Laser-based detection of gaseous, liquid, and solid residues and trace amounts has been under development ever since lasers were invented. However, the lack of reasonably high power tunable lasers in the spectral regions where the relevant targets can be interrogated, as well as of appropriate techniques for high-sensitivity, high-selectivity detection, has hampered the practical exploitation of techniques for the detection of targets important for homeland security and defense applications. Furthermore, emphasis has been on selectivity without particular attention being paid to the impact of interfering species on the quality of detection. High sensitivity is a necessary but not a sufficient condition. High sensitivity assures a high probability of detection of the target species. However, it is only recently that the sensor community has come to recognize that any measure of probability of detection must be associated with a probability of false alarm if it is to have any value as a measure of performance. This is especially true when one attempts to compare performance characteristics of different sensors based on different physical principles. In this paper, I will provide a methodology for characterizing the performance of sensors utilizing optical absorption measurement techniques. However, the underlying principles are equally applicable to all other sensors. While most of the current progress in high-sensitivity, high-selectivity detection of CWAs, TICs, and explosives involves identifying and quantifying the target species in-situ, there is an urgent need for standoff detection of explosives from safe distances. I will describe our results on CO2 and quantum cascade laser (QCL) based photoacoustic sensors for the detection of CWAs, TICs, and explosives, as well as very recent results on standoff detection of explosives at distances up to 150 meters. The latter results are critically important for assuring the safety of military personnel in battlefield environments, especially from improvised explosive devices (IEDs), and of civilian personnel from terrorist attacks in metropolitan areas.
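
    The point that a probability of detection is only meaningful alongside a probability of false alarm can be illustrated with a generic threshold detector in Gaussian noise (an illustration only, not the paper's sensor model; the signal and noise parameters are assumptions): both probabilities are set jointly by the choice of detection threshold.

        import numpy as np
        from scipy.stats import norm

        # Generic threshold detector: assumed mean response when the target is present,
        # zero-mean Gaussian background noise otherwise
        noise_sigma = 1.0
        signal_mean = 3.0

        thresholds = np.linspace(0, 6, 13)
        p_false_alarm = 1 - norm.cdf(thresholds, loc=0.0, scale=noise_sigma)
        p_detection = 1 - norm.cdf(thresholds, loc=signal_mean, scale=noise_sigma)

        # Raising the threshold lowers both the detection and the false-alarm probabilities
        for t, pfa, pd in zip(thresholds, p_false_alarm, p_detection):
            print(f"threshold {t:.1f}: P_D = {pd:.3f}, P_FA = {pfa:.3g}")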

  11. Posttraumatic stress disorder, depression, and alcohol and tobacco use in public health workers after the 2004 Florida hurricanes.

    PubMed

    Fullerton, Carol S; McKibben, Jodi B A; Reissman, Dori B; Scharf, Ted; Kowalski-Trakofler, Kathleen M; Shultz, James M; Ursano, Robert J

    2013-02-01

    We examined the relationship of probable posttraumatic stress disorder (PTSD), probable depression, and increased alcohol and/or tobacco use to disaster exposure and work demand in Florida Department of Health workers after the 2004 hurricanes. Participants (N = 2249) completed electronic questionnaires assessing PTSD, depression, alcohol and tobacco use, hurricane exposure, and work demand. Total mental and behavioral health burden (probable PTSD, probable depression, increased alcohol and/or tobacco use) was 11%. More than 4% had probable PTSD, and 3.8% had probable depression. Among those with probable PTSD, 29.2% had increased alcohol use, and 50% had increased tobacco use. Among those with probable depression, 34% indicated increased alcohol use and 55.6% increased tobacco use. Workers with greater exposure were more likely to have probable PTSD and probable depression (ORs = 3.3 and 3.06, respectively). After adjusting for demographics and work demand, those with high exposure were more likely to have probable PTSD and probable depression (ORs = 3.21 and 3.13). Those with high exposure had increased alcohol and tobacco use (ORs = 3.01 and 3.40), and those with high work demand indicated increased alcohol and tobacco use (ORs = 1.98 and 2.10). High exposure and work demand predicted increased alcohol and tobacco use, after adjusting for demographics, work demand, and exposure. Work-related disaster mental and behavioral health burden indicate the need for additional mental health interventions in the public health disaster workforce.

  12. Posttraumatic Stress Disorder, Depression, and Alcohol and Tobacco Use in Public Health Workers After the 2004 Florida Hurricanes

    PubMed Central

    Fullerton, Carol S.; McKibben, Jodi B.A.; Reissman, Dori B.; Scharf, Ted; Kowalski-Trakofler, Kathleen M.; Shultz, James M.; Ursano, Robert J.

    2015-01-01

    Objective We examined the relationship of probable posttraumatic stress disorder (PTSD), probable depression, and increased alcohol and/or tobacco use to disaster exposure and work demand in Florida Department of Health workers after the 2004 hurricanes. Methods Participants (N = 2249) completed electronic questionnaires assessing PTSD, depression, alcohol and tobacco use, hurricane exposure, and work demand. Results Total mental and behavioral health burden (probable PTSD, probable depression, increased alcohol and/or tobacco use) was 11%. More than 4% had probable PTSD, and 3.8% had probable depression. Among those with probable PTSD, 29.2% had increased alcohol use, and 50% had increased tobacco use. Among those with probable depression, 34% indicated increased alcohol use and 55.6% increased tobacco use. Workers with greater exposure were more likely to have probable PTSD and probable depression (ORs = 3.3 and 3.06, respectively). After adjusting for demographics and work demand, those with high exposure were more likely to have probable PTSD and probable depression (ORs = 3.21 and 3.13). Those with high exposure had increased alcohol and tobacco use (ORs = 3.01 and 3.40), and those with high work demand indicated increased alcohol and tobacco use (ORs = 1.98 and 2.10). High exposure and work demand predicted increased alcohol and tobacco use, after adjusting for demographics, work demand, and exposure. Conclusions Work-related disaster mental and behavioral health burden indicate the need for additional mental health interventions in the public health disaster workforce. PMID:24618140

  13. Sociocultural definitions of risk

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rayner, S.

    1990-10-01

    Public constituencies frequently are criticized by technical experts as being irrational in their response to low-probability risks. This presentation argued that most people are concerned with a variety of risk attributes other than probability and that it is rather irrational to exclude these from the definition and analysis of technological risk. Risk communication, which is at the heart of the right-to-know concept, is described as the creation of shared meaning rather than the mere transmission of information. A case study of utilities, public utility commissions, and public interest groups illustrates how the diversity of institutional cultures in modern society leads to problems for the creation of shared meanings in establishing trust, distributing liability, and obtaining consent to risk. This holistic approach to risk analysis is most appropriate under conditions of high uncertainty and/or decision stakes. 1 fig., 5 tabs.

  14. Void swelling and irradiation creep in austenitic and martensitic stainless steels under cyclic irradiation

    NASA Astrophysics Data System (ADS)

    Zhiyong, Zhu; Jung, Peter; Klein, Horst

    1993-07-01

    A high purity austenitic FeCrNiMo alloy and DIN 1.4914 martensitic stainless steel were irradiated with 6.2 MeV protons. The pulsed operation of a tokamak fusion reactor was simulated by simultaneous cycling of beam, temperature and stress similar to that anticipated in the NET (Next European Torus) design. Void swelling and irradiation creep of the FeCrNiMo alloy under cyclic and stationary conditions were identical within the experimental error. The martensitic steel showed no swelling at the present low doses (~0.2 dpa). The plastic deformation under continuous and cyclic irradiation was essentially determined by thermal creep. During irradiation the electrical resistivity of FeCrNiMo slightly increased, probably due to swelling, while that of DIN 1.4914 linearly decreased, probably due to segregation effects.

  15. Extended Eden model reproduces growth of an acellular slime mold.

    PubMed

    Wagner, G; Halvorsrud, R; Meakin, P

    1999-11-01

    A stochastic growth model was used to simulate the growth of the acellular slime mold Physarum polycephalum on substrates where the nutrients were confined in separate drops. Growth of Physarum on such substrates was previously studied experimentally and found to produce a range of different growth patterns [Phys. Rev. E 57, 941 (1998)]. The model represented the aging of cluster sites and differed from the original Eden model in that the occupation probability of perimeter sites depended on the time of occupation of adjacent cluster sites. This feature led to a bias in the selection of growth directions. A moderate degree of persistence was found to be crucial to reproduce the biological growth patterns under various conditions. Persistence in growth combined quick propagation in heterogeneous environments with a high probability of locating sources of nutrients.

  16. Extended Eden model reproduces growth of an acellular slime mold

    NASA Astrophysics Data System (ADS)

    Wagner, Geri; Halvorsrud, Ragnhild; Meakin, Paul

    1999-11-01

    A stochastic growth model was used to simulate the growth of the acellular slime mold Physarum polycephalum on substrates where the nutrients were confined in separate drops. Growth of Physarum on such substrates was previously studied experimentally and found to produce a range of different growth patterns [Phys. Rev. E 57, 941 (1998)]. The model represented the aging of cluster sites and differed from the original Eden model in that the occupation probability of perimeter sites depended on the time of occupation of adjacent cluster sites. This feature led to a bias in the selection of growth directions. A moderate degree of persistence was found to be crucial to reproduce the biological growth patterns under various conditions. Persistence in growth combined quick propagation in heterogeneous environments with a high probability of locating sources of nutrients.
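
    The aging mechanism described above can be sketched as a small simulation (an illustration of the idea only, not the published model; the lattice size, time scale, and exponential weighting are assumptions): empty perimeter sites adjacent to recently occupied cluster sites are more likely to be filled, which biases growth direction and produces persistent advance.

        import numpy as np

        rng = np.random.default_rng(4)
        L, steps, tau = 101, 1500, 50.0        # lattice size, growth steps, aging time scale
        occupied_at = np.full((L, L), -1.0)    # occupation time of each site; -1 means empty
        occupied_at[L // 2, L // 2] = 0.0

        for t in range(1, steps + 1):
            sites, weights = [], []
            # collect empty perimeter sites, weighting each by the "youth" of its occupied neighbour
            for i, j in np.argwhere(occupied_at >= 0):
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < L and 0 <= nj < L and occupied_at[ni, nj] < 0:
                        sites.append((ni, nj))
                        weights.append(np.exp(-(t - occupied_at[i, j]) / tau))  # younger neighbour -> larger weight
            weights = np.array(weights)
            ni, nj = sites[rng.choice(len(weights), p=weights / weights.sum())]
            occupied_at[ni, nj] = float(t)

        print("cluster size:", int(np.count_nonzero(occupied_at >= 0)))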

  17. How mutation affects evolutionary games on graphs

    PubMed Central

    Allen, Benjamin; Traulsen, Arne; Tarnita, Corina E.; Nowak, Martin A.

    2011-01-01

    Evolutionary dynamics are affected by population structure, mutation rates and update rules. Spatial or network structure facilitates the clustering of strategies, which represents a mechanism for the evolution of cooperation. Mutation dilutes this effect. Here we analyze how mutation influences evolutionary clustering on graphs. We introduce new mathematical methods to evolutionary game theory, specifically the analysis of coalescing random walks via generating functions. These techniques allow us to derive exact identity-by-descent (IBD) probabilities, which characterize spatial assortment on lattices and Cayley trees. From these IBD probabilities we obtain exact conditions for the evolution of cooperation and other game strategies, showing the dual effects of graph topology and mutation rate. High mutation rates diminish the clustering of cooperators, hindering their evolutionary success. Our model can represent either genetic evolution with mutation, or social imitation processes with random strategy exploration. PMID:21473871

  18. Peak streamflow on selected streams in Arkansas, December 2015

    USGS Publications Warehouse

    Breaker, Brian K.

    2017-01-11

    Heavy rainfall during December 2015 resulted in flooding across parts of Arkansas; rainfall amounts were as high as 12 inches over a period from December 27, 2015, to December 29, 2015. Although precipitation accumulations were highest in northwestern Arkansas, significant flooding occurred in other parts of the State. Flood damage occurred in several counties as water levels rose in streams, and disaster declarations were issued in 32 of the 75 counties in Arkansas. Given the severity of the December 2015 flooding, the U.S. Geological Survey (USGS), in cooperation with the Federal Emergency Management Agency (FEMA), conducted a study to document the meteorological and hydrological conditions prior to and during the flood; compiled flood-peak gage heights, streamflows, and flood probabilities at USGS streamflow-gaging stations; and estimated streamflows and flood probabilities at selected ungaged locations.

  19. Probabilistic cluster labeling of imagery data

    NASA Technical Reports Server (NTRS)

    Chittineni, C. B. (Principal Investigator)

    1980-01-01

    The problem of obtaining the probabilities of class labels for the clusters using spectral and spatial information from a given set of labeled patterns and their neighbors is considered. A relationship is developed between class and cluster conditional densities in terms of probabilities of class labels for the clusters. Expressions are presented for updating the a posteriori probabilities of the classes of a pixel using information from its local neighborhood. Fixed-point iteration schemes are developed for obtaining the optimal probabilities of class labels for the clusters. These schemes utilize spatial information and also the probabilities of label imperfections. Experimental results from the processing of remotely sensed multispectral scanner imagery data are presented.
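
    As a rough numerical illustration of the central quantity, probabilities of class labels for clusters, the sketch below estimates P(class | cluster) from a handful of labeled pixels with known cluster assignments by simple counting. The arrays and the smoothing constant are hypothetical, and the fixed-point iteration and neighborhood updates of the paper are not reproduced here.

        import numpy as np

        # hypothetical labeled pixels: cluster index and class label for each
        clusters = np.array([0, 0, 1, 1, 1, 2, 2, 2, 2])
        labels   = np.array([0, 0, 0, 1, 1, 1, 1, 1, 0])
        n_clusters, n_classes = clusters.max() + 1, labels.max() + 1

        # counts[k, i] = number of labeled pixels in cluster k carrying class label i
        counts = np.zeros((n_clusters, n_classes))
        np.add.at(counts, (clusters, labels), 1)

        # P(class i | cluster k), with a small additive smoothing term (assumed)
        p_class_given_cluster = (counts + 1e-3) / (counts + 1e-3).sum(axis=1, keepdims=True)
        print(p_class_given_cluster)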

  20. On the formation of granulites

    USGS Publications Warehouse

    Bohlen, S.R.

    1991-01-01

    The tectonic settings for the formation and evolution of regional granulite terranes and the lowermost continental crust can be deduced from pressure-temperature-time (P-T-time) paths and constrained by petrological and geophysical considerations. P-T conditions deduced for regional granulites require transient, average geothermal gradients of greater than 35 °C km⁻¹, implying minimum heat flow in excess of 100 mW m⁻². Such high heat flow is probably caused by magmatic heating. Tectonic settings wherein such conditions are found include convergent plate margins, continental rifts, hot spots, and the margins of large, deep-seated batholiths. Cooling paths can be constrained by solid-solid and devolatilization equilibria and geophysical modelling. -from Author

  1. Bivariate extreme value distributions

    NASA Technical Reports Server (NTRS)

    Elshamy, M.

    1992-01-01

    In certain engineering applications, such as those occurring in the analyses of ascent structural loads for the Space Transportation System (STS), some of the load variables have a lower bound of zero. Thus, the need for practical models of bivariate extreme value probability distribution functions with lower limits was identified. We discuss the Gumbel models and present practical forms of bivariate extreme probability distributions of Weibull and Frechet types with two parameters. Bivariate extreme value probability distribution functions can be expressed in terms of the marginal extremal distributions and a 'dependence' function subject to certain analytical conditions. Properties of such bivariate extreme distributions, sums and differences of paired extremals, as well as the corresponding forms of conditional distributions, are discussed. Practical estimation techniques are also given.
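
    For orientation, one standard way to express such a representation is the Pickands form given below; this specific parameterization is a common textbook convention and is not necessarily the one used in the report.

        G(x, y) = \exp\Big\{ \big[\ln G_1(x) + \ln G_2(y)\big]\,
                  A\Big( \frac{\ln G_2(y)}{\ln G_1(x) + \ln G_2(y)} \Big) \Big\},
        \qquad A(0) = A(1) = 1, \quad \max(w, 1 - w) \le A(w) \le 1, \quad A \text{ convex},

    where G_1 and G_2 are the marginal extreme value distributions and A is the dependence function.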

  2. Exposure history determines pteropod vulnerability to ocean acidification along the US West Coast.

    PubMed

    Bednaršek, N; Feely, R A; Tolimieri, N; Hermann, A J; Siedlecki, S A; Waldbusser, G G; McElhany, P; Alin, S R; Klinger, T; Moore-Maley, B; Pörtner, H O

    2017-07-03

    The pteropod Limacina helicina frequently experiences seasonal exposure to corrosive conditions (Ω_ar < 1) along the US West Coast and is recognized as one of the species most susceptible to ocean acidification (OA). Yet, little is known about their capacity to acclimatize to such conditions. We collected pteropods in the California Current Ecosystem (CCE) that differed in the severity of exposure to Ω_ar conditions in the natural environment. Combining field observations, high-CO2 perturbation experiment results, and retrospective ocean transport simulations, we investigated biological responses based on histories of magnitude and duration of exposure to Ω_ar < 1. Our results suggest that both exposure magnitude and duration affect pteropod responses in the natural environment. However, observed declines in calcification performance and survival probability under high-CO2 experimental conditions do not show acclimatization capacity or physiological tolerance related to history of exposure to corrosive conditions. Pteropods from the coastal CCE appear to be at or near the limit of their physiological capacity, and consequently, are already at extinction risk under projected acceleration of OA over the next 30 years. Our results demonstrate that Ω_ar exposure history largely determines pteropod response to experimental conditions and is essential to the interpretation of biological observations and experimental results.

  3. Effects of sporadic E-layer characteristics on spread-F generation in the nighttime midlatitude ionosphere: A climatological study

    NASA Astrophysics Data System (ADS)

    Lee, C. C.; Chen, W. S.

    2018-04-01

    The aim of this study is to examine the effects of Es-layer characteristics on spread-F generation in the nighttime midlatitude ionosphere. The Es-layer parameters and spread-F appearance over the 23rd solar cycle (1996-2008) are recorded by the Kokubunji ionosonde. The Es-layer parameters are foEs (critical frequency of the Es-layer), fbEs (blanketing frequency of the Es-layer), and Δf (≡foEs-fbEs). To explore the effects fully, the pre-midnight and post-midnight data are classified by season, solar activity, and geomagnetic condition. Results show that spread-F occurs more frequently after midnight and in summer, and that its occurrence probabilities are greater when solar activity is lower. For the occurrence probabilities of spread-F versus foEs and Δf under geomagnetically quiet conditions, the trend is increasing where the associated probabilities are significant, indicating that spread-F occurrence increases with increasing foEs and/or Δf. Further, the increasing trends demonstrate that polarization electric fields generated in the Es-layer would help generate spread-F through the electrodynamical coupling of the Es-layer and F-region. This coupling is efficient not only under quiet conditions but also under disturbed conditions, since a significant increasing trend is also found under disturbed conditions. Regarding the occurrence probabilities of spread-F versus fbEs, evident trends are mostly absent, implying that fbEs might not be a major factor in spread-F formation.

  4. Identifying HIV care enrollees at-risk for cannabis use disorder.

    PubMed

    Hartzler, Bryan; Carlini, Beatriz H; Newville, Howard; Crane, Heidi M; Eron, Joseph J; Geng, Elvin H; Mathews, W Christopher; Mayer, Kenneth H; Moore, Richard D; Mugavero, Michael J; Napravnik, Sonia; Rodriguez, Benigno; Donovan, Dennis M

    2017-07-01

    Increased scientific attention given to cannabis in the United States has particular relevance for its domestic HIV care population, given that evidence exists for both cannabis as a therapeutic agent and cannabis use disorder (CUD) as a barrier to antiretroviral medication adherence. It is critical to identify relative risk for CUD among demographic subgroups of HIV patients, as this will inform detection and intervention efforts. A Center for AIDS Research Network of Integrated Clinical Systems cohort (N = 10,652) of HIV-positive adults linked to care at seven United States sites was examined for this purpose. Based on a patient-report instrument with validated diagnostic threshold for CUD, the prevalence of recent cannabis use and corresponding conditional probabilities for CUD were calculated for the aggregate sample and demographic subgroups. Generalized estimating equations then tested models directly examining patient demographic indices as predictors of CUD, while controlling for history and geography. Conditional probability of CUD among cannabis-using patients was 49%, with the highest conditional probabilities among demographic subgroups of young adults and those with non-specified sexual orientation (67-69%) and the lowest conditional probability among females and those 50+ years of age (42% apiece). Similarly, youthful age and male gender emerged as robust multivariate model predictors of CUD. In the context of increasingly lenient policies for use of cannabis as a therapeutic agent for chronic conditions like HIV/AIDS, current study findings offer needed direction in terms of specifying targeted patient groups in HIV care on whom resources for enhanced surveillance and intervention efforts will be most impactful.

  5. Small and large wetland fragments are equally suited breeding sites for a ground-nesting passerine.

    PubMed

    Pasinelli, Gilberto; Mayer, Christian; Gouskov, Alexandre; Schiegg, Karin

    2008-06-01

    Large habitat fragments are generally thought to host more species and to offer more diverse and/or better quality habitats than small fragments. However, the importance of small fragments for population dynamics in general and for reproductive performance in particular is highly controversial. Using an information-theoretic approach, we examined reproductive performance and probability of local recruitment of color-banded reed buntings Emberiza schoeniclus in relation to the size of 18 wetland fragments in northeastern Switzerland over 4 years. We also investigated whether reproductive performance and recruitment probability were density-dependent. None of the four measures of reproductive performance (laying date, nest failure probability, fledgling production per territory, fledgling condition), nor recruitment probability, was related to wetland fragment size. In terms of fledgling production, however, fragment size interacted with year, indicating that small fragments were better reproductive grounds in some years than large fragments. Reproductive performance and recruitment probability were not density-dependent. Our results suggest that small fragments are equally suited as breeding grounds for the reed bunting as large fragments and should therefore be managed to provide a habitat for this and other specialists occurring in the same habitat. Moreover, large fragments may represent sinks in specific years because a substantial percentage of all breeding pairs in our study area breed in large fragments, and reproductive failure in these fragments due to the regularly occurring floods may have a much stronger impact on regional population dynamics than comparable events in small fragments.

  6. Solving probability reasoning based on DNA strand displacement and probability modules.

    PubMed

    Zhang, Qiang; Wang, Xiaobiao; Wang, Xiaojun; Zhou, Changjun

    2017-12-01

    In computational biology, DNA strand displacement technology is used to simulate the computation process and has shown strong computing ability. Most researchers use it to solve logic problems, but it is only rarely used in probabilistic reasoning. To support probabilistic reasoning, a conditional probability derivation model and a total probability model based on DNA strand displacement were established in this paper. The models were assessed through the game "read your mind" and shown to enable the application of probabilistic reasoning in genetic diagnosis. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Comonotonic bounds on the survival probabilities in the Lee-Carter model for mortality projection

    NASA Astrophysics Data System (ADS)

    Denuit, Michel; Dhaene, Jan

    2007-06-01

    In the Lee-Carter framework, future survival probabilities are random variables with an intricate distribution function. In large homogeneous portfolios of life annuities, value-at-risk or conditional tail expectation of the total yearly payout of the company are approximately equal to the corresponding quantities involving random survival probabilities. This paper aims to derive some bounds in the increasing convex (or stop-loss) sense on these random survival probabilities. These bounds are obtained with the help of comonotonic upper and lower bounds on sums of correlated random variables.
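
    As a reminder of why these survival probabilities are random, the Lee-Carter framework is commonly written as follows; the notation is the usual textbook one and is assumed here rather than copied from the paper.

        \ln m_{x,t} = \alpha_x + \beta_x \kappa_t + \varepsilon_{x,t},
        \qquad \kappa_t = \kappa_{t-1} + \theta + \xi_t,
        \qquad p_{x,t} \approx \exp(-m_{x,t}),

    so once the period index \kappa_t is projected with its random-walk-with-drift innovations, the future death rates m_{x,t} and hence the survival probabilities p_{x,t} inherit an intricate distribution, which is what the comonotonic bounds are designed to control.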

  8. Estimating probabilities of reservoir storage for the upper Delaware River basin

    USGS Publications Warehouse

    Hirsch, Robert M.

    1981-01-01

    A technique for estimating conditional probabilities of reservoir system storage is described and applied to the upper Delaware River Basin. The results indicate that there is a 73 percent probability that the three major New York City reservoirs (Pepacton, Cannonsville, and Neversink) would be full by June 1, 1981, and only a 9 percent probability that storage would return to the 'drought warning' sector of the operations curve sometime in the next year. In contrast, if restrictions are lifted and there is an immediate return to normal operating policies, the probability of the reservoir system being full by June 1 is 37 percent and the probability that storage would return to the 'drought warning' sector in the next year is 30 percent. (USGS)
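
    Conditional storage probabilities of this kind are often approximated with a Monte Carlo "position analysis": starting from the current storage, many candidate inflow traces are routed through the system and the fraction of traces in which the reservoirs refill by the target date is counted. The sketch below illustrates that idea only; the capacity, demand, and inflow distribution are entirely hypothetical, and it is not the procedure of the USGS report.

        import numpy as np

        rng = np.random.default_rng(0)
        capacity, storage0, demand = 1000.0, 620.0, 30.0   # hypothetical volumes per month

        def prob_full_by(months, n_traces=10_000):
            """Fraction of simulated inflow traces in which storage reaches capacity."""
            full = 0
            for _ in range(n_traces):
                storage, reached = storage0, False
                for _ in range(months):
                    inflow = rng.gamma(shape=2.0, scale=25.0)   # assumed monthly inflow
                    storage = min(capacity, max(0.0, storage + inflow - demand))
                    reached = reached or storage >= capacity
                full += reached
            return full / n_traces

        print(prob_full_by(months=4))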

  9. Probabilities for time-dependent properties in classical and quantum mechanics

    NASA Astrophysics Data System (ADS)

    Losada, Marcelo; Vanni, Leonardo; Laura, Roberto

    2013-05-01

    We present a formalism which allows one to define probabilities for expressions that involve properties at different times for classical and quantum systems and we study its lattice structure. The formalism is based on the notion of time translation of properties. In the quantum case, the properties involved should satisfy compatibility conditions in order to obtain well-defined probabilities. The formalism is applied to describe the double-slit experiment.

  10. Low Probability of Intercept Waveforms via Intersymbol Dither Performance Under Multiple Conditions

    DTIC Science & Technology

    2009-03-01

    Available excerpt (AFIT thesis AFIT/GE/ENG/09-23): a random variable governing the distribution of dither values; its probability density function; the potential performance loss of a non-cooperative receiver compared to a cooperative receiver designed to account for ISI and multipath.

  11. Low Probability of Intercept Waveforms via Intersymbol Dither Performance Under Multipath Conditions

    DTIC Science & Technology

    2009-03-01

    Available excerpt (AFIT thesis AFIT/GE/ENG/09-23): a random variable governing the distribution of dither values; its probability density function; the potential performance loss of a non-cooperative receiver compared to a cooperative receiver designed to account for ISI and multipath.

  12. The Probability of Exceedance as a Nonparametric Person-Fit Statistic for Tests of Moderate Length

    ERIC Educational Resources Information Center

    Tendeiro, Jorge N.; Meijer, Rob R.

    2013-01-01

    To classify an item score pattern as not fitting a nonparametric item response theory (NIRT) model, the probability of exceedance (PE) of an observed response vector x can be determined as the sum of the probabilities of all response vectors that are, at most, as likely as x, conditional on the test's total score. Vector x is to be considered…

  13. Knock probability estimation through an in-cylinder temperature model with exogenous noise

    NASA Astrophysics Data System (ADS)

    Bares, P.; Selmanaj, D.; Guardiola, C.; Onder, C.

    2018-01-01

    This paper presents a new knock model which combines a deterministic knock model based on the in-cylinder temperature and an exogenous noise disturbing this temperature. The autoignition of the end-gas is modelled by an Arrhenius-like function and the knock probability is estimated by propagating a virtual error probability distribution. Results show that the random nature of knock can be explained by uncertainties at the in-cylinder temperature estimation. The model only has one parameter for calibration and thus can be easily adapted online. In order to reduce the measurement uncertainties associated with the air mass flow sensor, the trapped mass is derived from the in-cylinder pressure resonance, which improves the knock probability estimation and reduces the number of sensors needed for the model. A four-stroke SI engine was used for model validation. By varying the intake temperature, the engine speed, the injected fuel mass, and the spark advance, specific tests were conducted, which furnished data with various knock intensities and probabilities. The new model is able to predict the knock probability within a sufficient range at various operating conditions. The trapped mass obtained by the acoustical model was compared in steady conditions by using a fuel balance and a lambda sensor, and differences below 1% were found.
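
    The probability propagation described here can be mimicked with a simple Monte Carlo over the temperature noise; the sketch below assumes an Arrhenius-like ignition delay tau = A * p**(-n) * exp(B / T) and a Livengood-Wu type knock criterion, with every constant and the temperature and pressure traces being illustrative placeholders rather than values from the paper.

        import numpy as np

        rng = np.random.default_rng(1)

        def knock_probability(T_trace, p_trace, dt, sigma_T=15.0, n_samples=5000,
                              A=0.02, n=1.7, B=3800.0):
            """Monte Carlo knock probability from an Arrhenius-like ignition delay.

            T_trace, p_trace: assumed end-gas temperature [K] and pressure [bar] traces
            sigma_T: standard deviation of the exogenous temperature noise (assumed Gaussian)
            Knock is counted when the integral of dt / tau reaches 1 within the window.
            """
            knocks = 0
            for _ in range(n_samples):
                T = T_trace + rng.normal(0.0, sigma_T)       # perturb the whole trace
                tau = A * p_trace ** (-n) * np.exp(B / T)
                knocks += np.sum(dt / tau) >= 1.0
            return knocks / n_samples

        # usage with a crude synthetic compression/expansion window of 4 ms
        t = np.linspace(0.0, 0.004, 200)
        T_trace = 700.0 + 500.0 * np.sin(np.pi * t / 0.004)  # K, assumed
        p_trace = 20.0 + 60.0 * np.sin(np.pi * t / 0.004)    # bar, assumed
        print(knock_probability(T_trace, p_trace, dt=t[1] - t[0]))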

  14. The Effects of Framing, Reflection, Probability, and Payoff on Risk Preference in Choice Tasks.

    PubMed

    Kühberger; Schulte-Mecklenbeck; Perner

    1999-06-01

    A meta-analysis of Asian-disease-like studies is presented to identify the factors which determine risk preference. First the confoundings between probability levels, payoffs, and framing conditions are clarified in a task analysis. Then the role of framing, reflection, probability, type, and size of payoff is evaluated in a meta-analysis. It is shown that bidirectional framing effects exist for gains and for losses. Presenting outcomes as gains tends to induce risk aversion, while presenting outcomes as losses tends to induce risk seeking. Risk preference is also shown to depend on the size of the payoffs, on the probability levels, and on the type of good at stake (money/property vs human lives). In general, higher payoffs lead to increasing risk aversion. Higher probabilities lead to increasing risk aversion for gains and to increasing risk seeking for losses. These findings are confirmed by a subsequent empirical test. Shortcomings of existing formal theories, such as prospect theory, cumulative prospect theory, venture theory, and Markowitz's utility theory, are identified. It is shown that it is not probabilities or payoffs, but the framing condition, which explains most variance. These findings are interpreted as showing that no linear combination of formally relevant predictors is sufficient to capture the essence of the framing phenomenon. Copyright 1999 Academic Press.

  15. The impact of macroeconomic conditions on obesity in Canada.

    PubMed

    Latif, Ehsan

    2014-06-01

    The paper used longitudinal Canadian data from the National Population Health Survey to estimate the impact of macroeconomic conditions measured by provincial unemployment rate on individual obesity and BMI. To control for individual-specific unobserved heterogeneity, the study utilized the conditional fixed effect logit and fixed effects models. The study found that unemployment rate had a significant positive impact on the probability of being severely obese. The study also found that unemployment rate significantly increased BMI. However, the study did not find any significant impact of unemployment rate on the probability of being overweight or obese. Copyright © 2013 John Wiley & Sons, Ltd.

  16. Increasing Compliance of Children with Autism: Effects of Programmed Reinforcement for High-Probability Requests and Varied Inter-Instruction Intervals

    ERIC Educational Resources Information Center

    Pitts, Laura; Dymond, Simon

    2012-01-01

    Research on the high-probability (high-p) request sequence shows that compliance with low-probability (low-p) requests generally increases when preceded by a series of high-p requests. Few studies have conducted formal preference assessments to identify the consequences used for compliance, which may partly explain treatment failures, and still…

  17. Random-Forest Classification of High-Resolution Remote Sensing Images and nDSM Over Urban Areas

    NASA Astrophysics Data System (ADS)

    Sun, X. F.; Lin, X. G.

    2017-09-01

    As an intermediate step between raw remote sensing data and digital urban maps, remote sensing data classification has been a challenging and long-standing research problem in the community of remote sensing. In this work, an effective classification method is proposed for classifying high-resolution remote sensing data over urban areas. Starting from high resolution multi-spectral images and 3D geometry data, our method proceeds in three main stages: feature extraction, classification, and classification result refinement. First, we extract color, vegetation index and texture features from the multi-spectral image and compute the height, elevation texture and differential morphological profile (DMP) features from the 3D geometry data. Then, in the classification stage, multiple random forest (RF) classifiers are trained separately and combined into an RF ensemble to estimate each sample's category probabilities. Finally, the probabilities, along with the feature importance indicators output by the RF ensemble, are used to construct a fully connected conditional random field (FCCRF) graph model, by which the classification results are refined through mean-field based statistical inference. Experiments on the ISPRS Semantic Labeling Contest dataset show that our proposed 3-stage method achieves 86.9% overall accuracy on the test data.
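
    The ensemble-averaged category probabilities of the second stage are straightforward to obtain with an off-the-shelf random forest; the toy sketch below uses scikit-learn on synthetic stand-ins for two feature groups and omits the conditional-random-field refinement stage entirely.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)
        X_spectral = rng.normal(size=(500, 6))      # stand-in for image-derived features
        X_geometry = rng.normal(size=(500, 4))      # stand-in for 3D-geometry features
        y = rng.integers(0, 3, size=500)            # three hypothetical land-cover classes

        # train one forest per feature group, then average their class probabilities
        groups = [X_spectral, X_geometry]
        forests = [RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
                   for X in groups]

        probs = np.mean([rf.predict_proba(X) for rf, X in zip(forests, groups)], axis=0)
        print(probs[:3])                            # per-sample category probabilities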

  18. The Hardwood Gneiss: Evidence for high P-T Archean metamorphism in the southern province of the Lake Superior region

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peterson, J.W.; Geiger, C.A.

    1990-03-01

    The Hardwood Gneiss is an areally small unit of Precambrian granulite-grade rocks exposed in the Archean gneiss terrane of the southern Lake Superior region. The rocks are located in the southwestern portion of the Upper Peninsula of Michigan and consist of a structurally conformable package of quartzitic, metapelitic, amphibolitic, and metabasic units. Three texturally distinct garnet types are present in the metabasites and are interpreted to represent two metamorphic events. Geothermobarometry indicates conditions of ~8.2-11.6 kbar and ~770 °C for M1, and conditions of ~6.0-10.1 kbar and ~610-740 °C for M2. It is proposed that M1 was Archean and contemporaneous with a high-grade metamorphic event recorded in the Minnesota River Valley. The M2 event was probably Early Proterozoic and pre-Penokean, with metamorphic conditions more intense than those generally ascribed to the Penokean Orogeny in Michigan, but similar to the conditions reported for the Kapuskasing zone of Ontario. The high paleopressures and temperatures of the M1 event make the Hardwood Gneiss distinct from any rocks previously described in the southern Lake Superior region, and suggest intense tectonic activity during the Archean.

  19. System for information discovery

    DOEpatents

    Pennock, Kelly A [Richland, WA; Miller, Nancy E [Kennewick, WA

    2002-11-19

    A sequence of word filters is used to eliminate terms in the database that do not discriminate document content, resulting in a filtered word set and a topic word set whose members are highly predictive of content. These two word sets are then formed into a two-dimensional matrix with entries calculated as the conditional probability that a document will contain the word in a row given that it contains the word in a column. The matrix representation allows the resultant vectors to be utilized to interpret document contents.
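
    Building such a matrix from a document-term incidence matrix is a few lines of code. The sketch below uses a tiny hypothetical vocabulary and corpus and computes entry (i, j) as the conditional probability that a document contains word i given that it contains word j.

        import numpy as np

        vocab = ["probability", "markov", "reservoir", "storage"]   # hypothetical topic words
        docs = [
            "probability markov chain",
            "reservoir storage probability",
            "markov model of reservoir storage",
            "storage probability estimate",
        ]

        # binary document-term incidence matrix
        D = np.array([[1 if w in doc.split() else 0 for w in vocab] for doc in docs], float)

        # co[i, j] = number of documents containing both word i and word j
        co = D.T @ D
        # M[i, j] = P(word i in document | word j in document)
        M = co / np.maximum(np.diag(co), 1.0)
        print(M)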

  20. Integrity of Ceramic Parts Predicted When Loads and Temperatures Fluctuate Over Time

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.

    2004-01-01

    Brittle materials are being used, and being considered for use, for a wide variety of high performance applications that operate in harsh environments, including static and rotating turbine parts for unmanned aerial vehicles, auxiliary power units, and distributed power generation. Other applications include thermal protection systems, dental prosthetics, fuel cells, oxygen transport membranes, radomes, and microelectromechanical systems (MEMS). In order for these high-technology ceramics to be used successfully for structural applications that push the envelope of materials capabilities, design engineers must consider that brittle materials are designed and analyzed differently than metallic materials. Unlike ductile metals, brittle materials display a stochastic strength response because of the combination of low fracture toughness and the random nature of the size, orientation, and distribution of inherent microscopic flaws. This plus the fact that the strength of a component under load may degrade over time because of slow crack growth means that a probabilistic-based life-prediction methodology must be used when the tradeoffs of failure probability, performance, and useful life are being optimized. The CARES/Life code (which was developed at the NASA Glenn Research Center) predicts the probability of ceramic components failing from spontaneous catastrophic rupture when these components are subjected to multiaxial loading and slow crack growth conditions. Enhancements to CARES/Life now allow for the component survival probability to be calculated when loading and temperature vary over time.

  1. Laser induced breakdown in gas mixtures. Experimental and statistical investigation on n-decane ignition: Pressure, mixture composition and equivalence ratio effects.

    PubMed

    Mokrani, Nabil; Gillard, Philippe

    2018-03-26

    This paper presents a physical and statistical approach to laser-induced breakdown in n-decane/N2 + O2 mixtures as a function of incident or absorbed energy. A parametric study, with pressure, fuel purity and equivalence ratio, was conducted to determine the incident and absorbed energies involved in producing breakdown, followed or not by ignition. The experiments were performed using a Q-switched Nd:YAG laser (1064 nm) inside a cylindrical 1-L combustion chamber in the range of 1-100 mJ of incident energy. A stochastic study of breakdown and ignition probabilities showed that the mixture composition had a significant effect on ignition, with large variation of the incident or absorbed energy required to obtain 50% breakdown. It was observed that the combustion products absorb more energy coming from the laser. The effect of pressure on the ignition probabilities of lean and near-stoichiometric mixtures was also investigated. It was found that a high ignition energy E50% is required for lean mixtures at high pressures (3 bar). The present study provides new data obtained on an original experimental setup, and the results, close to laboratory-produced laser ignition phenomena, will enhance the understanding of the influence of initial conditions on the breakdown or ignition probabilities for different mixtures. Copyright © 2018 Elsevier B.V. All rights reserved.

  2. A discrete choice experiment studying students' preferences for scholarships to private medical schools in Japan.

    PubMed

    Goto, Rei; Kakihara, Hiroaki

    2016-02-09

    The shortage of physicians in rural areas and in some specialties is a societal problem in Japan. Expensive tuition in private medical schools limits access to them particularly for students from middle- and low-income families. One way to reduce this barrier and lessen maldistribution is to offer conditional scholarships to private medical schools. A discrete choice experiment is carried out on a total of 374 students considering application to medical schools. The willingness to receive a conditional scholarship program to private medical schools is analyzed. The probability of attending private medical schools significantly decreased because of high tuition, a postgraduate obligation to provide a service in specific specialty areas, and the length of time of this obligation. An obligation to provide a service in rural regions had no significant effect on this probability. To motivate non-applicants to private medical schools to enroll in such schools, a decrease in tuition to around 1.2 million yen (US$ 12,000) or less, which is twice that of public schools, was found to be necessary. Further, it was found that non-applicants to private medical schools choose to apply to such schools even with restrictions if they have tuition support at the public school level. Conditional scholarships for private medical schools may widen access to medical education and simultaneously provide incentives to work in insufficiently served areas.

  3. Familiarity with breeding habitat improves daily survival in colonial cliff swallows

    PubMed Central

    BROWN, CHARLES R.; BROWN, MARY BOMBERGER; BRAZEAL, KATHLEEN R.

    2008-01-01

    One probable cost of dispersing to a new breeding habitat is unfamiliarity with local conditions such as the whereabouts of food or the habits of local predators, and consequently immigrants may have lower probabilities of survival than more experienced residents. Within a breeding season, estimated daily survival probabilities of cliff swallows (Petrochelidon pyrrhonota) at colonies in southwestern Nebraska were highest for birds that had always nested at the same site, followed by those for birds that had nested there in some (but not all) past years. Daily survival probabilities were lowest for birds that were naïve immigrants to a colony site and for yearling birds that were nesting for the first time. Birds with past experience at a colony site had monthly survival 8.6% greater than that of naïve immigrants. All colonies where experienced residents did better than immigrants were smaller than 750 nests in size, and in colonies greater than 750 nests, naïve immigrants paid no survival costs relative to experienced residents. Removal of nest ectoparasites by fumigation resulted in higher survival probabilities for all birds, on average, and diminished the differences between immigrants and past residents, probably by improving bird condition to the extent that effects of past experience were relatively less important and harder to detect. The greater survival of experienced residents could not be explained by condition or territory quality, suggesting that familiarity with a local area confers survival advantages during the breeding season for cliff swallows. Colonial nesting may help to moderate the cost of unfamiliarity with an area, likely through social transfer of information about food sources and enhanced vigilance in large groups. PMID:19802326

  4. Dynamic prediction of patient outcomes during ongoing cardiopulmonary resuscitation.

    PubMed

    Kim, Joonghee; Kim, Kyuseok; Callaway, Clifton W; Doh, Kibbeum; Choi, Jungho; Park, Jongdae; Jo, You Hwan; Lee, Jae Hyuk

    2017-02-01

    The probability of the return of spontaneous circulation (ROSC) and subsequent favourable outcomes changes dynamically during advanced cardiac life support (ACLS). We sought to model these changes using time-to-event analysis in out-of-hospital cardiac arrest (OHCA) patients. Adult (≥18 years old), non-traumatic OHCA patients without prehospital ROSC were included. Utstein variables and initial arterial blood gas measurements were used as predictors. The incidence rate of ROSC during the first 30min of ACLS in the emergency department (ED) was modelled using spline-based parametric survival analysis. Conditional probabilities of subsequent outcomes after ROSC (1-week and 1-month survival and 6-month neurologic recovery) were modelled using multivariable logistic regression. The ROSC and conditional probability models were then combined to estimate the likelihood of achieving ROSC and subsequent outcomes by providing k additional minutes of effort. A total of 727 patients were analyzed. The incidence rate of ROSC increased rapidly until the 10th minute of ED ACLS, and it subsequently decreased. The conditional probabilities of subsequent outcomes after ROSC were also dependent on the duration of resuscitation with odds ratios for 1-week and 1-month survival and neurologic recovery of 0.93 (95% CI: 0.90-0.96, p<0.001), 0.93 (0.88-0.97, p=0.001) and 0.93 (0.87-0.99, p=0.031) per 1-min increase, respectively. Calibration testing of the combined models showed good correlation between mean predicted probability and actual prevalence. The probability of ROSC and favourable subsequent outcomes changed according to a multiphasic pattern over the first 30min of ACLS, and modelling of the dynamic changes was feasible. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
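
    Schematically, combining the two models amounts to integrating the conditional incidence of ROSC against the logistic outcome model; the expression below is an assumed reading of that combination for illustration, not a formula quoted from the paper.

        P(\text{outcome} \mid \text{continue } k \text{ more min at } t)
        = \int_{t}^{t+k} f_{\mathrm{ROSC}}(s \mid \text{no ROSC by } t)\,
          P(\text{outcome} \mid \mathrm{ROSC\ at\ } s)\, \mathrm{d}s .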

  5. Using a Betabinomial distribution to estimate the prevalence of adherence to physical activity guidelines among children and youth.

    PubMed

    Garriguet, Didier

    2016-04-01

    Estimates of the prevalence of adherence to physical activity guidelines in the population are generally the result of averaging individual probability of adherence based on the number of days people meet the guidelines and the number of days they are assessed. Given this number of active and inactive days (days assessed minus days active), the conditional probability of meeting the guidelines that has been used in the past is a Beta(1 + active days, 1 + inactive days) distribution assuming the probability p of a day being active is bounded by 0 and 1 and averages 50%. A change in the assumption about the distribution of p is required to better match the discrete nature of the data and to better assess the probability of adherence when the percentage of active days in the population differs from 50%. Using accelerometry data from the Canadian Health Measures Survey, the probability of adherence to physical activity guidelines is estimated using a conditional probability given the number of active and inactive days distributed as a Betabinomial(n, α + active days, β + inactive days) assuming that p is randomly distributed as Beta(α, β) where the parameters α and β are estimated by maximum likelihood. The resulting Betabinomial distribution is discrete. For children aged 6 or older, the probability of meeting physical activity guidelines 7 out of 7 days is similar to published estimates. For pre-schoolers, the Betabinomial distribution yields higher estimates of adherence to the guidelines than the Beta distribution, in line with the probability of being active on any given day. In estimating the probability of adherence to physical activity guidelines, the Betabinomial distribution has several advantages over the previously used Beta distribution. It is a discrete distribution and maximizes the richness of accelerometer data.
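
    A hedged numerical illustration of the estimator described above: for a child assessed on d days of whom k were active, the probability of being active on all 7 days of a week is read off a Beta-binomial(7, α + k, β + (d - k)) distribution. The α and β below are placeholders, not the maximum-likelihood estimates obtained from the survey.

        from scipy.stats import betabinom

        alpha, beta = 2.0, 1.5    # hypothetical population parameters (fitted by ML in the paper)

        def prob_meets_guidelines(active_days, assessed_days, horizon=7):
            """P(active on all `horizon` days | k active out of d assessed), Beta-binomial model."""
            a = alpha + active_days
            b = beta + (assessed_days - active_days)
            return betabinom.pmf(horizon, horizon, a, b)

        # e.g. a child active on 3 of 4 valid accelerometer days
        print(prob_meets_guidelines(active_days=3, assessed_days=4))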

  6. Probability models for growth and aflatoxin B1 production as affected by intraspecies variability in Aspergillus flavus.

    PubMed

    Aldars-García, Laila; Berman, María; Ortiz, Jordi; Ramos, Antonio J; Marín, Sonia

    2018-06-01

    The probabilities of growth and aflatoxin B1 (AFB1) production of 20 isolates of Aspergillus flavus were studied using a full factorial design with eight water activity levels (0.84-0.98 a_w) and six temperature levels (15-40 °C). Binary data obtained from growth studies were modelled using linear logistic regression analysis as a function of temperature, water activity and time for each isolate. In parallel, AFB1 was extracted at different times from newly formed colonies (up to 20 mm in diameter). Although a total of 950 AFB1 values over time for all conditions studied were recorded, they were not considered to be enough to build probability models over time, and therefore, only models at 30 days were built. The confidence intervals of the regression coefficients of the probability of growth models showed some differences among the 20 growth models. Further, to assess the growth/no growth and AFB1/no-AFB1 production boundaries, 0.05 and 0.5 probabilities were plotted at 30 days for all of the isolates. The boundaries for growth and AFB1 showed that, in general, the conditions for growth were wider than those for AFB1 production. The probability of growth and AFB1 production seemed to be less variable among isolates than AFB1 accumulation. Apart from the AFB1 production probability models, using growth probability models for AFB1 probability predictions could be, although conservative, a suitable alternative. Predictive mycology should include a number of isolates to generate data to build predictive models and take into account the genetic diversity of the species and thus make predictions as similar as possible to real fungal food contamination. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. One-dimensional numerical study of charged particle trajectories in turbulent electrostatic wave fields

    NASA Technical Reports Server (NTRS)

    Graham, K. N.; Fejer, J. A.

    1976-01-01

    The paper describes a numerical simulation of electron trajectories in weak random electric fields under conditions that are approximately true for Langmuir waves whose wavelength is much longer than the Debye length. Two types of trajectory calculations were made: (1) the initial particle velocity was made equal to the mean phase velocity of the waves, or (2) it was equal to 0.7419 times the mean velocity of the waves, so that the initial velocity differed substantially from all phase velocities of the wave spectrum. When the autocorrelation time is much greater than the trapping time, the particle motion can change virtually instantaneously from one of three states - high-velocity, low-velocity, or trapped state - to another. The probability of instantaneous transition from a high- or low-velocity state becomes small when the difference between the particle velocity and the mean phase velocity of the waves becomes high in comparison to the trapping velocity. Diffusive motion becomes negligible under these conditions also.

  8. Demographic responses to weather fluctuations are context dependent in a long-lived amphibian.

    PubMed

    Cayuela, Hugo; Arsovski, Dragan; Thirion, Jean-Marc; Bonnaire, Eric; Pichenot, Julian; Boitaud, Sylvain; Miaud, Claude; Joly, Pierre; Besnard, Aurélien

    2016-08-01

    Weather fluctuations have been demonstrated to affect demographic traits in many species. In long-lived organisms, their impact on adult survival might be buffered by the evolution of traits that reduce variation in interannual adult survival. For example, skipping breeding is an effective behavioral mechanism that may limit yearly variation in adult survival when harsh weather conditions occur; however, this in turn would likely lead to strong variation in recruitment. Yet, only a few studies to date have examined the impact of weather variation on survival, recruitment and breeding probability simultaneously in different populations of the same species. To fill this gap, we studied the impact of spring temperatures and spring rainfall on survival, on reproductive skipping behavior and on recruitment in five populations of a long-lived amphibian, the yellow-bellied toad (Bombina variegata). Based on capture-recapture data, our findings demonstrate that survival depends on interactions between age, population and weather variation. Varying weather conditions in the spring result in strong variation in the survival of immature toads, whereas they have little effect on adult toads. Breeding probability depends on both the individual's previous reproductive status and on the weather conditions during the current breeding season, leading to high interannual variation in recruitment. Crucially, we found that the impact of weather variation on demographic traits is largely context dependent and may thus differ sharply between populations. Our results suggest that studies predicting the impact of climate change on population dynamics should be taken with caution when the relationship between climate and demographic traits is established using only one population or few populations. We therefore highly recommend further research that includes surveys replicated in a substantial number of populations to account for context-dependent variation in demographic processes. © 2016 John Wiley & Sons Ltd.

  9. Effects of mindful eating training on delay and probability discounting for food and money in obese and healthy-weight individuals.

    PubMed

    Hendrickson, Kelsie L; Rasmussen, Erin B

    2013-07-01

    Obese individuals tend to behave more impulsively than healthy weight individuals across a variety of measures, but it is unclear whether this pattern can be altered. The present study examined the effects of a mindful eating behavioral strategy on impulsive and risky choice patterns for hypothetical food and money. In Experiment 1, 304 participants completed computerized delay and probability discounting tasks for food-related and monetary outcomes. High percent body fat (PBF) predicted more impulsive choice for food, but not small-value money, replicating previous work. In Experiment 2, 102 randomly selected participants from Experiment 1 were assigned to participate in a 50-min workshop on mindful eating or to watch an educational video. They then completed the discounting tasks again. Participants who completed the mindful eating session showed more self-controlled and less risk-averse discounting patterns for food compared to baseline; those in the control condition discounted similarly to baseline rates. There were no changes in discounting for money for either group, suggesting stimulus specificity for food for the mindful eating condition. Copyright © 2013 Elsevier Ltd. All rights reserved.
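
    For orientation, delay and probability discounting tasks of this kind are most often summarized with hyperbolic forms; the equations below are the standard conventions in this literature and are assumed here rather than quoted from the study.

        V = \frac{A}{1 + kD}, \qquad V = \frac{A}{1 + h\Theta}, \qquad \Theta = \frac{1 - p}{p},

    where V is the subjective value of an outcome of amount A delayed by D or delivered with probability p, Θ is the odds against receiving it, larger k indicates steeper (more impulsive) delay discounting, and larger h indicates steeper (more risk-averse) probability discounting.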

  10. Immune challenge retards seasonal reproductive regression in rodents: evidence for terminal investment.

    PubMed

    Weil, Zachary M; Martin, Lynn B; Workman, Joanna L; Nelson, Randy J

    2006-09-22

    Animals must balance investments in different physiological activities to allow them to maximize fitness in the environments they inhabit. These adjustments among reproduction, growth and survival are mandated because of the competing high costs of each process. Seasonally breeding rodents generally bias their investments towards reproduction when environmental conditions are benign, but shift these investments towards processes that promote survival, including immune activity, when environmental conditions deteriorate. Because survival probability of non-tropical small mammals is generally low in winter, under certain circumstances, these animals may not allocate resources to survival mechanisms in an effort to produce as many offspring as possible in the face of increased probability of death. Such 'terminal investments' have been described in passerines, but there are few examples of such phenomena in small mammals. Here, we show that male Siberian hamsters (Phodopus sungorus) challenged with lipopolysaccharide (a component of gram-negative bacteria that activates the immune system) induced a small, but significant, retardation of seasonal regression of the reproductive system relative to saline-injected hamsters. This delayed reproductive regression likely reflects a strategy to maintain reproductive function when survival prospects are compromised by infection.

  11. New aragonite 87Sr/86Sr records of Mesozoic ammonoids and approach to the problem of N, O, C and Sr isotope cycles in the evolution of the Earth

    NASA Astrophysics Data System (ADS)

    Zakharov, Yuri D.; Dril, Sergei I.; Shigeta, Yasunari; Popov, Alexander M.; Baraboshkin, Eugenij Y.; Michailova, Irina A.; Safronov, Peter P.

    2018-02-01

    New Sr isotope data from well-preserved aragonite ammonoid shell material from the Mesozoic are compared with that from a living Nautilus shell. The prominent negative Sr isotope excursions known from the Middle Permian, Jurassic and Cretaceous probably have their origins in intensive plate tectonic activity, followed by enhanced hydrothermal activity at the mid-ocean ridges (mantle volcanism) which supplied low radiogenic Sr to seawater. The maximum positive (radiogenic) shift in the lower Mesozoic Sr isotope curve (Lower Triassic peak) was likely caused by a significant expansion of dry land surfaces (Dabie-Sulu Triassic orogeny) and their intensive silicate weathering in conditions of extreme warming and aridity in the very end of the Smithian, followed by warm and humid conditions in the late Spathian, which apparently resulted in a significant oceanic input of radiogenic Sr through riverine flux. The comparatively high 87Sr/86Sr ratio obtained from the living Nautilus shell is probably a function of both the Alpine orogeny, which was accompanied by significant continental weathering and input of radiogenic Sr to the oceans, and the weakening of mantle volcanism.

  12. The Association Between Unhealthy Alcohol Use and Acute Care Expenditures in the 30 Days Following Hospital Discharge Among Older Veterans Affairs Patients with a Medical Condition.

    PubMed

    Chavez, Laura J; Liu, Chuan-Fen; Tefft, Nathan; Hebert, Paul L; Devine, Beth; Bradley, Katharine A

    2017-10-01

    Hospital readmissions and emergency department (ED) visits within 30 days of discharge are costly. Heavy alcohol use could predict increased risk for post-discharge acute care. This study assessed 30-day acute care utilization and expenditures for different categories of alcohol use. Veterans Affairs (VA) patients age ≥65 years with past-year alcohol screening, hospitalized for a medical condition, were included. VA and Medicare health care utilization data were used. Two-part models adjusted for patient demographics. Among 416,050 hospitalized patients, 25% had 30-day acute care use. Nondrinking patients (n = 267,746) had increased probability of acute care use, mean utilization days, and expenditures (difference of $345; 95% CI $268-$423), relative to low-risk drinkers (n = 105,023). High-risk drinking patients (n = 5,300) had increased probability of acute care use and mean utilization days, but not expenditures. Although these patients did not have greater acute care expenditures than low-risk drinking patients, they may nevertheless be vulnerable to poor post-discharge outcomes.

  13. Statistical Maps of Ground Magnetic Disturbance Derived from Global Geospace Models

    NASA Astrophysics Data System (ADS)

    Rigler, E. J.; Wiltberger, M. J.; Love, J. J.

    2017-12-01

    Electric currents in space are the principal driver of magnetic variations measured at Earth's surface. These in turn induce geoelectric fields that present a natural hazard for technological systems like high-voltage power distribution networks. Modern global geospace models can reasonably simulate large-scale geomagnetic response to solar wind variations, but they are less successful at deterministic predictions of intense localized geomagnetic activity that most impacts technological systems on the ground. Still, recent studies have shown that these models can accurately reproduce the spatial statistical distributions of geomagnetic activity, suggesting that their physics are largely correct. Since the magnetosphere is a largely externally driven system, most model-measurement discrepancies probably arise from uncertain boundary conditions. So, with realistic distributions of solar wind parameters to establish its boundary conditions, we use the Lyon-Fedder-Mobarry (LFM) geospace model to build a synthetic multivariate statistical model of gridded ground magnetic disturbance. From this, we analyze the spatial modes of geomagnetic response, regress on available measurements to fill in unsampled locations on the grid, and estimate the global probability distribution of extreme magnetic disturbance. The latter offers a prototype geomagnetic "hazard map", similar to those used to characterize better-known geophysical hazards like earthquakes and floods.

  14. 75 FR 80866 - Credit Rating Standardization Study

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-23

    ... ratings using identical terms; standardizing the market stress conditions under which ratings are... probabilities and loss expectations under standardized conditions of economic stress; and standardizing credit... identical terms; (B) standardizing the market stress conditions under which ratings are evaluated; (C...

  15. Heating efficiency evaluation with mimicking plasma conditions of integrated fast-ignition experiment.

    PubMed

    Fujioka, Shinsuke; Johzaki, Tomoyuki; Arikawa, Yasunobu; Zhang, Zhe; Morace, Alessio; Ikenouchi, Takahito; Ozaki, Tetsuo; Nagai, Takahiro; Abe, Yuki; Kojima, Sadaoki; Sakata, Shohei; Inoue, Hiroaki; Utsugi, Masaru; Hattori, Shoji; Hosoda, Tatsuya; Lee, Seung Ho; Shigemori, Keisuke; Hironaka, Youichiro; Sunahara, Atsushi; Sakagami, Hitoshi; Mima, Kunioki; Fujimoto, Yasushi; Yamanoi, Kohei; Norimatsu, Takayoshi; Tokita, Shigeki; Nakata, Yoshiki; Kawanaka, Junji; Jitsuno, Takahisa; Miyanaga, Noriaki; Nakai, Mitsuo; Nishimura, Hiroaki; Shiraga, Hiroyuki; Nagatomo, Hideo; Azechi, Hiroshi

    2015-06-01

    A series of experiments were carried out to evaluate the energy-coupling efficiency from heating laser to a fuel core in the fast-ignition scheme of laser-driven inertial confinement fusion. Although the efficiency is determined by a wide variety of complex physics, from intense laser plasma interactions to the properties of high-energy density plasmas and the transport of relativistic electron beams (REB), here we simplify the physics by breaking down the efficiency into three measurable parameters: (i) energy conversion ratio from laser to REB, (ii) probability of collision between the REB and the fusion fuel core, and (iii) fraction of energy deposited in the fuel core from the REB. These three parameters were measured with the newly developed experimental platform designed for mimicking the plasma conditions of a realistic integrated fast-ignition experiment. The experimental results indicate that the high-energy tail of REB must be suppressed to heat the fuel core efficiently.

  16. An Investigation of a Hybrid Mixing Model for PDF Simulations of Turbulent Premixed Flames

    NASA Astrophysics Data System (ADS)

    Zhou, Hua; Li, Shan; Wang, Hu; Ren, Zhuyin

    2015-11-01

    Predictive simulations of turbulent premixed flames over a wide range of Damköhler numbers in the framework of the Probability Density Function (PDF) method remain challenging due to deficiencies in current micro-mixing models. In this work, a hybrid micro-mixing model, valid in both the flamelet regime and broken reaction zone regime, is proposed. A priori testing of this model is first performed by examining the conditional scalar dissipation rate and conditional scalar diffusion in a 3-D direct numerical simulation dataset of a temporally evolving turbulent slot jet flame of lean premixed H2-air in the thin reaction zone regime. Then, this new model is applied to PDF simulations of the Piloted Premixed Jet Burner (PPJB) flames, which are a set of highly shear turbulent premixed flames and feature strong turbulence-chemistry interaction at high Reynolds and Karlovitz numbers. Supported by NSFC 51476087 and NSFC 91441202.

  17. Formation of Acetylene in the Reaction of Methane with Iron Carbide Cluster Anions FeC3- under High-Temperature Conditions.

    PubMed

    Li, Hai-Fang; Jiang, Li-Xue; Zhao, Yan-Xia; Liu, Qing-Yu; Zhang, Ting; He, Sheng-Gui

    2018-03-01

    The underlying mechanism for non-oxidative methane aromatization remains controversial owing to the lack of experimental evidence for the formation of the first C-C bond. For the first time, the elementary reaction of methane with atomic clusters (FeC3-) under high-temperature conditions to produce C-C coupling products has been characterized by mass spectrometry. With the elevation of temperature from 300 K to 610 K, the production of acetylene, the important intermediate proposed in a monofunctional mechanism of methane aromatization, was significantly enhanced, which can be well-rationalized by quantum chemistry calculations. This study narrows the gap between gas-phase and condensed-phase studies on methane conversion and suggests that the monofunctional mechanism probably operates in non-oxidative methane aromatization. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. The Negated Conditional: A Litmus Test for the Suppositional Conditional?

    ERIC Educational Resources Information Center

    Handley, Simon J.; Evans, Jonathan St. B. T.; Thompson, Valerie A.

    2006-01-01

    Under the suppositional account of conditionals, when people think about a conditional assertion, "if p then q," they engage in a mental simulation in which they imagine p holds and evaluate the probability that q holds under this supposition. One implication of this account is that belief in a conditional equates to conditional probability…

  19. Realistic Clocks for a Universe Without Time

    NASA Astrophysics Data System (ADS)

    Bryan, K. L. H.; Medved, A. J. M.

    2018-01-01

    There are a number of problematic features within the current treatment of time in physical theories, including the "timelessness" of the Universe as encapsulated by the Wheeler-DeWitt equation. This paper considers one particular investigation into resolving this issue: a conditional probability interpretation that was first proposed by Page and Wootters. Those authors addressed the apparent timelessness by subdividing a faux Universe into two entangled parts, "the clock" and "the remainder of the Universe", and then synchronizing the effective dynamics of the two subsystems by way of conditional probabilities. The current treatment focuses on the possibility of using a (somewhat) realistic clock system; namely, a coherent-state description of a damped harmonic oscillator. This clock proves to be consistent with the conditional probability interpretation; in particular, a standard evolution operator is identified with the position of the clock playing the role of time for the rest of the Universe. Restrictions on the damping factor are determined and, perhaps contrary to expectations, the optimal choice of clock is not necessarily one of minimal damping.
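
    The conditional-probability prescription referred to here is usually written schematically as below; the notation is assumed for illustration and is not taken from the paper.

        P(A \mid T = t) = \frac{\operatorname{Tr}\!\big[(\hat P_A \otimes |t\rangle\langle t|)\,\rho\big]}
                               {\operatorname{Tr}\!\big[(\mathbb{1} \otimes |t\rangle\langle t|)\,\rho\big]},

    where ρ is the stationary state of the clock-plus-remainder Universe satisfying the constraint, \hat P_A projects onto outcome A of the remainder, and |t⟩ is the clock state associated with the reading t.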

  20. Conditional Independence in Applied Probability.

    ERIC Educational Resources Information Center

    Pfeiffer, Paul E.

    This material assumes the user has the background provided by a good undergraduate course in applied probability. It is felt that introductory courses in calculus, linear algebra, and perhaps some differential equations should provide the requisite experience and proficiency with mathematical concepts, notation, and argument. The document is…
