NASA Astrophysics Data System (ADS)
Gerd, Niestegge
2010-12-01
In the quantum mechanical Hilbert space formalism, the probabilistic interpretation is a later ad-hoc add-on, more or less enforced by the experimental evidence, but not motivated by the mathematical model itself. A model involving a clear probabilistic interpretation from the very beginning is provided by the quantum logics with unique conditional probabilities. It includes the projection lattices in von Neumann algebras and here probability conditionalization becomes identical with the state transition of the Lüders-von Neumann measurement process. This motivates the definition of a hierarchy of five compatibility and comeasurability levels in the abstract setting of the quantum logics with unique conditional probabilities. Their meanings are: the absence of quantum interference or influence, the existence of a joint distribution, simultaneous measurability, and the independence of the final state after two successive measurements from the sequential order of these two measurements. A further level means that two elements of the quantum logic (events) belong to the same Boolean subalgebra. In the general case, the five compatibility and comeasurability levels appear to differ, but they all coincide in the common Hilbert space formalism of quantum mechanics, in von Neumann algebras, and in some other cases.
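The probability conditionalization this abstract identifies with the Lüders-von Neumann state transition can be made concrete in a few lines of linear algebra. The following is a minimal sketch in the standard Hilbert space formalism, with a toy qubit state and projections of our own choosing, not code from the paper:

```python
import numpy as np

# Conditioning on an event (projection) E updates a state rho by the
# Lüders rule rho -> E rho E / tr(E rho E); the conditional probability
# of another event F is then tr(E rho E F) / tr(E rho E).

def lueders_conditional(rho, E, F):
    """P(F | E) via the Lüders-von Neumann state transition."""
    post = E @ rho @ E               # unnormalized post-measurement state
    norm = np.trace(post).real       # equals P(E)
    return (np.trace(post @ F) / norm).real

# Qubit example: E projects onto |0>, F onto |+> = (|0> + |1>)/sqrt(2).
rho = np.array([[0.5, 0.0], [0.0, 0.5]])   # maximally mixed state
E = np.array([[1.0, 0.0], [0.0, 0.0]])     # projection onto |0>
plus = np.array([[1.0], [1.0]]) / np.sqrt(2)
F = plus @ plus.T                           # projection onto |+>

print(lueders_conditional(rho, E, F))  # 0.5
```

When E and F commute, this expression reduces to the ordinary conditional probability P(E and F)/P(E), which is the sense in which the quantum logic generalizes the classical one.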
Naive Probability: Model-Based Estimates of Unique Events.
Khemlani, Sangeet S; Lotstein, Max; Johnson-Laird, Philip N
2015-08-01
We describe a dual-process theory of how individuals estimate the probabilities of unique events, such as Hillary Clinton becoming U.S. President. It postulates that uncertainty is a guide to improbability. In its computer implementation, an intuitive system 1 simulates evidence in mental models and forms analog non-numerical representations of the magnitude of degrees of belief. This system has minimal computational power and combines evidence using a small repertoire of primitive operations. It resolves the uncertainty of divergent evidence for single events, for conjunctions of events, and for inclusive disjunctions of events, by taking a primitive average of non-numerical probabilities. It computes conditional probabilities in a tractable way, treating the given event as evidence that may be relevant to the probability of the dependent event. A deliberative system 2 maps the resulting representations into numerical probabilities. With access to working memory, it carries out arithmetical operations in combining numerical estimates. Experiments corroborated the theory's predictions. Participants concurred in estimates of real possibilities. They violated the complete joint probability distribution in the predicted ways, when they made estimates about conjunctions: P(A), P(B), P(A and B), disjunctions: P(A), P(B), P(A or B or both), and conditional probabilities P(A), P(B), P(B|A). They were faster to estimate the probabilities of compound propositions when they had already estimated the probabilities of each of their components. We discuss the implications of these results for theories of probabilistic reasoning. © 2014 Cognitive Science Society, Inc.
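The conjunction violation the theory predicts can be sketched in a few lines. This is our own toy rendering of the "primitive average" idea, not the authors' implementation, and the numbers are hypothetical:

```python
# System 1 is said to resolve the uncertainty of a conjunction by taking
# a primitive average of the conjuncts' probabilities. That estimate can
# exceed min(P(A), P(B)), violating the probability calculus, under which
# P(A and B) <= min(P(A), P(B)).

def primitive_average(p_a, p_b):
    return (p_a + p_b) / 2.0

p_a, p_b = 0.9, 0.2
intuitive = primitive_average(p_a, p_b)   # (0.9 + 0.2)/2 = 0.55
bound = min(p_a, p_b)                     # 0.2: calculus upper bound

print(intuitive > bound)  # True: the predicted violation
```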
Decomposition of conditional probability for high-order symbolic Markov chains.
Melnik, S S; Usatenko, O V
2017-07-01
The main goal of this paper is to develop an estimate for the conditional probability function of random stationary ergodic symbolic sequences with elements belonging to a finite alphabet. We elaborate on a decomposition procedure for the conditional probability function of sequences considered to be high-order Markov chains. We represent the conditional probability function as the sum of multilinear memory function monomials of different orders (from zero up to the chain order). This allows us to introduce a family of Markov chain models and to construct artificial sequences via a method of successive iterations, taking into account at each step increasingly high correlations among random elements. At weak correlations, the memory functions are uniquely expressed in terms of the high-order symbolic correlation functions. The proposed method fills the gap between two approaches, namely the likelihood estimation and the additive Markov chains. The obtained results may have applications for sequential approximation of artificial neural network training.
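As a point of comparison for the decomposition described above, the conditional probability function of an order-m symbolic chain can be estimated directly from observed word frequencies. The toy estimator below is our own illustration (the likelihood-estimation side of the gap the paper addresses), not the paper's memory-function expansion:

```python
from collections import Counter

# Estimate P(a | previous m symbols) of an order-m symbolic Markov chain
# by counting m-word contexts and context/symbol continuations.

def conditional_probs(seq, m):
    ctx_counts = Counter()
    pair_counts = Counter()
    for i in range(m, len(seq)):
        ctx = tuple(seq[i - m:i])
        ctx_counts[ctx] += 1
        pair_counts[(ctx, seq[i])] += 1
    return {pair: n / ctx_counts[pair[0]] for pair, n in pair_counts.items()}

seq = [0, 1, 0, 1, 0, 1, 0, 1, 0, 1]
probs = conditional_probs(seq, m=1)
print(probs[((0,), 1)])  # 1.0: in this toy sequence 0 is always followed by 1
```

The paper's decomposition instead expresses this function as a sum of memory-function monomials, which remains tractable when direct counting fails for large orders m.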
NASA Technical Reports Server (NTRS)
Nitsche, Ludwig C.; Nitsche, Johannes M.; Brenner, Howard
1988-01-01
The sedimentation and diffusion of a nonneutrally buoyant Brownian particle in a vertical fluid-filled cylinder of finite length which is instantaneously inverted at regular intervals are investigated analytically. A one-dimensional convective-diffusive equation is derived to describe the temporal and spatial evolution of the probability density; a periodicity condition is formulated; the applicability of Fredholm theory is established; and the parameter-space regions are determined within which the existence and uniqueness of solutions are guaranteed. Numerical results for sample problems are presented graphically and briefly characterized.
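A one-dimensional convective-diffusive equation of the kind derived above, dp/dt = D d²p/dx² − v dp/dx for the probability density p, can be stepped numerically with a simple explicit scheme. This is our own discretization sketch, not the paper's analysis; it uses periodic boundaries for brevity, whereas the paper's finite cylinder has reflecting ends, and inverting the cylinder would correspond to flipping the sign of v:

```python
import numpy as np

# One explicit (FTCS) step of dp/dt = D d2p/dx2 - v dp/dx.
# np.roll imposes periodic boundary conditions (an assumption for brevity).

def step(p, D, v, dx, dt):
    lap = (np.roll(p, -1) - 2 * p + np.roll(p, 1)) / dx**2
    grad = (np.roll(p, -1) - np.roll(p, 1)) / (2 * dx)
    return p + dt * (D * lap - v * grad)

x = np.linspace(0.0, 1.0, 101)
p = np.exp(-((x - 0.5) ** 2) / 0.005)
p /= p.sum()                      # normalized initial density
for _ in range(100):
    p = step(p, D=1e-3, v=0.1, dx=0.01, dt=1e-3)
print(abs(p.sum() - 1.0) < 1e-9)  # True: total probability is conserved
```

The chosen time step satisfies the usual stability bounds (D dt/dx² and v dt/dx both well below their limits).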
Higher-dimensional attractors with absolutely continuous invariant probability
NASA Astrophysics Data System (ADS)
Bocker, Carlos; Bortolotti, Ricardo
2018-05-01
Consider a dynamical system of skew-product type, built from a linear expanding map E, a linear contracting map C, and a smooth coupling map f. We provide sufficient conditions on E that imply the existence of an open set of pairs (C, f) for which the corresponding dynamics T admits a unique absolutely continuous invariant probability. A geometric transversality condition between self-intersections of images of the attractor is present in the dynamics of these maps. In addition, we give a condition relating E and C under which it is possible to perturb f to obtain such a pair.
Hart, Andrew; Cortés, María Paz; Latorre, Mauricio; Martinez, Servet
2018-01-01
The analysis of codon usage bias has been widely used to characterize different communities of microorganisms. In this context, the aim of this work was to study the codon usage bias in a natural consortium of five acidophilic bacteria used for biomining. The codon usage bias of the consortium was contrasted with genes from an alternative collection of acidophilic reference strains and metagenome samples. The results indicate that acidophilic bacteria preferentially have low codon usage bias, consistent with both their capacity to live in a wide range of habitats and their slow growth rate, a characteristic probably acquired independently of their phylogenetic relationships. In addition, the analysis showed significant differences in the unique sets of genes from the autotrophic species of the consortium relative to other acidophilic organisms, principally in genes coding for proteins involved in metal and oxidative stress resistance. The lower values of codon usage bias obtained in this unique set of genes suggest higher transcriptional adaptation to living in extreme conditions, probably acquired as a means of resisting the elevated metal conditions present in the mine.
New normative standards of conditional reasoning and the dual-source model
Singmann, Henrik; Klauer, Karl Christoph; Over, David
2014-01-01
There has been a major shift in research on human reasoning toward Bayesian and probabilistic approaches, which has been called a new paradigm. The new paradigm sees most everyday and scientific reasoning as taking place in a context of uncertainty, and inference is from uncertain beliefs and not from arbitrary assumptions. In this manuscript we present an empirical test of normative standards in the new paradigm using a novel probabilized conditional reasoning task. Our results indicated that for everyday conditionals with at least a weak causal connection between antecedent and consequent, only the conditional probability of the consequent given the antecedent contributes unique variance to predicting the probability of the conditional, but not the probability of the conjunction, nor the probability of the material conditional. Regarding normative accounts of reasoning, we found significant evidence that participants' responses were confidence preserving (i.e., p-valid in the sense of Adams, 1998) for MP inferences, but not for MT inferences. Additionally, only for MP inferences, and to a lesser degree for DA inferences, did the rate of responses inside the coherence intervals defined by mental probability logic (Pfeifer and Kleiter, 2005, 2010) exceed chance levels. In contrast to the normative accounts, the dual-source model (Klauer et al., 2010) is a descriptive model. It posits that participants integrate their background knowledge (i.e., the type of information primary to the normative approaches) and their subjective probability that a conclusion is seen as warranted based on its logical form. Model fits showed that the dual-source model, which employed participants' responses to a deductive task with abstract contents to estimate the form-based component, provided as good an account of the data as a model that solely used data from the probabilized conditional reasoning task. PMID: 24860516
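The coherence intervals from mental probability logic that the abstract refers to have a simple closed form for modus ponens. The following is a worked sketch of that standard result (given P(A) = a and P(B|A) = b, any coherent P(B) lies in [ab, ab + 1 − a]), with our own illustrative numbers:

```python
# Coherence interval for the MP conclusion P(B), given premise
# probabilities P(A) = a and P(B|A) = b, as used in mental probability
# logic (Pfeifer & Kleiter): P(B) must lie in [a*b, a*b + (1 - a)].

def mp_coherence_interval(p_a, p_b_given_a):
    low = p_a * p_b_given_a
    high = low + (1.0 - p_a)
    return low, high

low, high = mp_coherence_interval(0.8, 0.9)
print(round(low, 2), round(high, 2))  # 0.72 0.92
```

A response of, say, P(B) = 0.95 would fall outside this interval and so count as incoherent for these premises.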
Fingelkurts, Alexander A.; Fingelkurts, Andrew A.
2014-01-01
For the first time, the dynamic repertoires and oscillatory types of local EEG states in 13 diverse conditions (examined over 9 studies) that covered healthy-normal, altered, and pathological brain states were quantified within the same methodological and conceptual framework. EEG oscillatory states were assessed by probability-classification analysis of short-term EEG spectral patterns. The results demonstrated that brain activity consists of a limited repertoire of local EEG states in any of the examined conditions. The size of the state repertoires was associated with changes in cognition and vigilance or neuropsychopathologic conditions. Additionally, universal, optional, and unique EEG states were observed across the 13 diverse conditions. It was also demonstrated that the EEG oscillations which constituted EEG states were characteristic of different groups of conditions, in accordance with the oscillations' functional significance. The results suggested that (a) there is a limit to the number of local states available to the cortex and many ways in which these local states can rearrange themselves and still produce the same global state, and (b) EEG individuality is determined by varying proportions of universal, optional, and unique oscillatory states. The results enrich our understanding of the dynamic microstructure of the EEG signal. PMID: 24505292
Quantifying seining detection probability for fishes of Great Plains sand‐bed rivers
Mollenhauer, Robert; Logue, Daniel R.; Brewer, Shannon K.
2018-01-01
Species detection error (i.e., imperfect and variable detection probability) is an essential consideration when investigators map distributions and interpret habitat associations. When fish detection error due to highly variable instream environments needs to be addressed, sand-bed streams of the Great Plains represent a unique challenge. We quantified seining detection probability for diminutive Great Plains fishes across a range of sampling conditions in two sand-bed rivers in Oklahoma. Imperfect detection resulted in underestimates of species occurrence using naïve estimates, particularly for less common fishes. Seining detection probability also varied among fishes and across sampling conditions. We observed a quadratic relationship between water depth and detection probability, in which the exact nature of the relationship was species-specific and dependent on water clarity. Similarly, the direction of the relationship between water clarity and detection probability was species-specific and dependent on differences in water depth. The relationship between water temperature and detection probability was also species dependent, where both the magnitude and direction of the relationship varied among fishes. We showed how ignoring detection error confounded an underlying relationship between species occurrence and water depth. Despite imperfect and heterogeneous detection, our results indicate that species absence can be established with two to six spatially replicated seine hauls per 200-m reach under average sampling conditions; however, the required effort would be higher under certain conditions. Detection probability was low for the Arkansas River Shiner Notropis girardi, which is federally listed as threatened, and more than 10 seine hauls per 200-m reach would be required to assess presence across sampling conditions. Our model allows scientists to estimate the sampling effort needed to confidently assess species occurrence, which maximizes the use of available resources. Increased implementation of approaches that consider detection error promotes ecological advancements and better-informed conservation and management decisions.
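The link between per-haul detection probability and the number of replicated hauls needed to infer absence follows from the standard cumulative-detection formula. This back-of-envelope sketch uses that textbook relation with hypothetical detection probabilities, not the authors' fitted model:

```python
import math

# With per-haul detection probability p, the chance of at least one
# detection in n hauls is 1 - (1 - p)^n. Solving for confidence level c
# gives n >= log(1 - c) / log(1 - p).

def hauls_needed(p, confidence=0.95):
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p))

print(hauls_needed(0.60))  # 4 hauls at moderately high detectability
print(hauls_needed(0.25))  # 11 hauls when detection probability is low
```

This mirrors the abstract's pattern: a handful of hauls suffices under average conditions, while poorly detected species such as the Arkansas River Shiner demand far more effort.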
Coffee, Caffeine, and Health Outcomes: An Umbrella Review.
Grosso, Giuseppe; Godos, Justyna; Galvano, Fabio; Giovannucci, Edward L
2017-08-21
To evaluate the associations between coffee and caffeine consumption and various health outcomes, we performed an umbrella review of the evidence from meta-analyses of observational studies and randomized controlled trials (RCTs). Of the 59 unique outcomes examined in the selected 112 meta-analyses of observational studies, coffee was associated with a probable decreased risk of breast, colorectal, colon, endometrial, and prostate cancers; cardiovascular disease and mortality; Parkinson's disease; and type-2 diabetes. Of the 14 unique outcomes examined in the 20 selected meta-analyses of observational studies, caffeine was associated with a probable decreased risk of Parkinson's disease and type-2 diabetes and an increased risk of pregnancy loss. Of the 12 unique acute outcomes examined in the selected 9 meta-analyses of RCTs, coffee was associated with a rise in serum lipids, but this result was affected by significant heterogeneity, and caffeine was associated with a rise in blood pressure. Given the spectrum of conditions studied and the robustness of many of the results, these findings indicate that coffee can be part of a healthful diet.
Naive Probability: Model-based Estimates of Unique Events
2014-05-04
The cost of proactive interference is constant across presentation conditions.
Endress, Ansgar D; Siddique, Aneela
2016-10-01
Proactive interference (PI) severely constrains how many items people can remember. For example, Endress and Potter (2014a) presented participants with sequences of everyday objects at 250 ms/picture, followed by a yes/no recognition test. They manipulated PI by either using new images on every trial in the unique condition (thus minimizing PI among items), or by re-using images from a limited pool for all trials in the repeated condition (thus maximizing PI among items). In the low-PI unique condition, the probability of remembering an item was essentially independent of the number of memory items, showing no clear memory limitations; more traditional working memory-like memory limitations appeared only in the high-PI repeated condition. Here, we ask whether the effects of PI are modulated by the availability of long-term memory (LTM) and verbal resources. Participants viewed sequences of 21 images, followed by a yes/no recognition test. Items were presented either quickly (250 ms/image) or sufficiently slowly (1500 ms/image) to produce LTM representations, either with or without verbal suppression. Across conditions, participants performed better in the unique than in the repeated condition, and better for slow than for fast presentations. In contrast, verbal suppression impaired performance only with slow presentations. The relative cost of PI was remarkably constant across conditions: relative to the unique condition, performance in the repeated condition was about 15% lower in all conditions. The cost of PI thus seems to be a function of the relative strength or recency of target items and interfering items, but relatively insensitive to other experimental manipulations. Copyright © 2016 Elsevier B.V. All rights reserved.
Future southcentral US wildfire probability due to climate change
Stambaugh, Michael C.; Guyette, Richard P.; Stroh, Esther D.; Struckhoff, Matthew A.; Whittier, Joanna B.
2018-01-01
Globally, changing fire regimes due to climate is one of the greatest threats to ecosystems and society. In this paper, we present projections of future fire probability for the southcentral USA using downscaled climate projections and the Physical Chemistry Fire Frequency Model (PC2FM). Future fire probability is projected to both increase and decrease across the study region of Oklahoma, New Mexico, and Texas. Among all end-of-century projections, changes in fire probability (CFPs) range from −51% to +240%. The greatest absolute increases in fire probability are shown for areas within the range of approximately 75 to 160 cm mean annual precipitation (MAP), regardless of climate model. Although fire is likely to become more frequent across the southcentral USA, spatial patterns may remain similar unless significant increases in precipitation occur, whereby more extensive areas with increased fire probability are predicted. Perhaps one of the most important results is the identification of climate changes at which the fire probability response (+, −) may reverse (i.e., tipping points). Fire regimes of southcentral US ecosystems occur in a geographic transition zone from reactant- to reaction-limited conditions, potentially making them uniquely responsive to different scenarios of temperature and precipitation changes. Identification and description of these conditions may help anticipate fire regime changes that will affect human health, agriculture, species conservation, and nutrient and water cycling.
Modeling highway travel time distribution with conditional probability models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oliveira Neto, Francisco Moraes; Chin, Shih-Miao; Hwang, Ho-Ling
Under the sponsorship of the Federal Highway Administration's Office of Freight Management and Operations, the American Transportation Research Institute (ATRI) has developed performance measures through the Freight Performance Measures (FPM) initiative. Under this program, travel speed information is derived from data collected using wireless-based global positioning systems. These telemetric data systems are subscribed to and used by the trucking industry as an operations management tool. More than one telemetric operator submits data dumps to ATRI on a regular basis. Each data transmission contains a truck location, its travel time, and a clock time/date stamp. Data from the FPM program provide a unique opportunity for studying the upstream-downstream speed distributions at different locations, as well as at different times of the day and days of the week. This research is focused on the stochastic nature of successive link travel speed data on the continental United States Interstate network. Specifically, a method to estimate route probability distributions of travel time is proposed. This method uses the concepts of convolution of probability distributions and bivariate, link-to-link, conditional probability to estimate the expected distributions for the route travel time. A major contribution of this study is the consideration of speed correlation between upstream and downstream contiguous Interstate segments through conditional probability. The established conditional probability distributions between successive segments can be used to provide travel time reliability measures. This study also suggests an adaptive method for calculating and updating the route travel time distribution as new data or information are added. This methodology can be useful for estimating performance measures as required by the recent Moving Ahead for Progress in the 21st Century Act (MAP-21).
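The convolution step of the route travel-time method can be illustrated with two toy link distributions. This sketch (our own numbers, not ATRI's data) shows the independence baseline; the study's refinement replaces independence with a link-to-link conditional distribution P(t2 | t1), of which independence is the special case where every row of the conditional matrix is identical:

```python
import numpy as np

# Route travel time as the convolution of two independent discrete
# link travel-time distributions.

link1 = np.array([0.2, 0.5, 0.3])   # P(t1 = 1, 2, 3 minutes)
link2 = np.array([0.6, 0.4])        # P(t2 = 1, 2 minutes)

route = np.convolve(link1, link2)   # P(t1 + t2 = 2, 3, 4, 5 minutes)
print(route.round(2))               # [0.12 0.38 0.38 0.12]
```

The resulting route distribution still sums to one, and its tail quantiles are the kind of travel-time reliability measure the study targets.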
On Graph Isomorphism and the PageRank Algorithm
2008-09-01
specifies the probability of visiting each node from any other node. The perturbed matrix satisfies the Perron-Frobenius theorem's conditions. Therefore... the Frobenius and Perron theorems establish that the matrix must yield the dominant eigenvalue, one. Normalizing the unique and associated dominant eigenvector...is constructed such that none of its entries equal zero. An arbitrary PageRank matrix, S, is irreducible and satisfies the Perron-Frobenius
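The damped ("perturbed") PageRank construction the snippet alludes to can be sketched directly: damping makes the transition matrix strictly positive, so by the Perron-Frobenius theorem it has a unique dominant eigenvector with eigenvalue one, recoverable by power iteration. The graph and parameters below are our own toy example:

```python
import numpy as np

def pagerank(adj, damping=0.85, iters=100):
    n = adj.shape[0]
    cols = adj / adj.sum(axis=0)                 # column-stochastic transitions
    G = damping * cols + (1 - damping) / n       # strictly positive matrix
    r = np.full(n, 1.0 / n)
    for _ in range(iters):                       # power iteration
        r = G @ r
    return r / r.sum()

# Toy 3-node graph: adj[i, j] = 1 means an edge from node j to node i.
adj = np.array([[0, 1, 1],
                [1, 0, 0],
                [1, 1, 0]], dtype=float)
r = pagerank(adj)
print(r.argmax())  # 0: node 0 receives links from both other nodes
```

Because G has no zero entries, the iteration converges to the same ranking from any positive starting vector, which is the uniqueness property the snippet's argument relies on.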
Consequences of evolution: is rhinosinusitis, like otitis media, a unique disease of humans?
Bluestone, Charles D; Pagano, Anthony S; Swarts, J Douglas; Laitman, Jeffrey T
2012-12-01
We hypothesize that if otitis media is most likely primarily a human disease due to consequences of evolution, rhinosinusitis may also be limited to humans for similar reasons. If otitis media, with its associated hearing loss, occurred in animals in the wild, they probably would have been culled out by predation. Similarly, if rhinosinusitis occurred regularly in animals, they likely would have suffered from severely decreased olfactory abilities, crucial for predator avoidance, and presumably would likewise have been selected against evolutionarily. Thus, both otitis media and rhinosinusitis, common conditions particularly in infants and young children, appear to be essentially human conditions. Their manifestation in our species is likely due to our unique evolutionary trajectory and may be a consequence of adaptations, including adaptations to bipedalism and speech, loss of prognathism, and immunologic and environmental factors.
Quantum gravity in timeless configuration space
NASA Astrophysics Data System (ADS)
Gomes, Henrique
2017-12-01
On the path towards quantum gravity we find friction between temporal relations in quantum mechanics (QM) (where they are fixed and field-independent), and in general relativity (where they are field-dependent and dynamic). This paper aims to attenuate that friction, by encoding gravity in the timeless configuration space of spatial fields with dynamics given by a path integral. The framework demands that boundary conditions for this path integral be uniquely given, but unlike other approaches where they are prescribed—such as the no-boundary and the tunneling proposals—here I postulate basic principles to identify boundary conditions in a large class of theories. Uniqueness arises only if a reduced configuration space can be defined and if it has a profoundly asymmetric fundamental structure. These requirements place strong restrictions on the field and symmetry content of theories encompassed here; shape dynamics is one such theory. When these constraints are met, any emerging theory will have a Born rule given merely by a particular volume element built from the path integral in (reduced) configuration space. Also as in other boundary proposals, Time, including space-time, emerges as an effective concept; valid for certain curves in configuration space but not assumed from the start. When some such notion of time becomes available, conservation of (positive) probability currents ensues. I show that, in the appropriate limits, a Schrödinger equation dictates the evolution of weakly coupled source fields on a classical gravitational background. Due to the asymmetry of reduced configuration space, these probabilities and currents avoid a known difficulty of standard WKB approximations for Wheeler-DeWitt in minisuperspace: the selection of a unique Hamilton–Jacobi solution to serve as background. I illustrate these constructions with a simple example of a full quantum gravitational theory (i.e. not in minisuperspace) for which the formalism is applicable, and give a formula for calculating gravitational semi-classical relative probabilities in it.
Focused sunlight factor of forest fire danger assessment using Web-GIS and RS technologies
NASA Astrophysics Data System (ADS)
Baranovskiy, Nikolay V.; Sherstnyov, Vladislav S.; Yankovich, Elena P.; Engel, Marina V.; Belov, Vladimir V.
2016-08-01
Timiryazevskiy forestry of Tomsk region (Siberia, Russia) is the study area elaborated in the current research. Forest fire danger assessment is based on a unique technology using a probabilistic criterion, statistical data on forest fires, meteorological conditions, forest site classification, and remote sensing data. MODIS products are used for estimating some meteorological conditions and the current forest fire situation. Geoinformation technologies are used for geospatial analysis of the forest fire danger situation on controlled forested territories. The GIS engine provides opportunities to construct electronic maps with different levels of forest fire probability and supports a raster layer for satellite remote sensing data on current forest fires. A web interface is used for loading data onto a specific website and for representing forest fire danger data via the World Wide Web. Special web forms provide an interface for choosing the relevant input data in order to process the forest fire danger data and assess the forest fire probability.
NASA Astrophysics Data System (ADS)
Lee, Jaeha; Tsutsui, Izumi
2017-05-01
We show that the joint behavior of an arbitrary pair of (generally noncommuting) quantum observables can be described by quasi-probabilities, which are an extended version of the standard probabilities used for describing the outcome of measurement for a single observable. The physical situations that require these quasi-probabilities arise when one considers quantum measurement of an observable conditioned by some other variable, with the notable example being the weak measurement employed to obtain Aharonov's weak value. Specifically, we present a general prescription for the construction of quasi-joint probability (QJP) distributions associated with a given combination of observables. These QJP distributions are introduced in two complementary approaches: one from a bottom-up, strictly operational construction realized by examining the mathematical framework of the conditioned measurement scheme, and the other from a top-down viewpoint realized by applying the results of the spectral theorem for normal operators and their Fourier transforms. It is then revealed that, for a pair of simultaneously measurable observables, the QJP distribution reduces to the unique standard joint probability distribution of the pair, whereas for a noncommuting pair there exists an inherent indefiniteness in the choice of such QJP distributions, admitting a multitude of candidates that may equally be used for describing the joint behavior of the pair. In the course of our argument, we find that the QJP distributions furnish the space of operators in the underlying Hilbert space with their characteristic geometric structures such that the orthogonal projections and inner products of observables can be given statistical interpretations as, respectively, "conditionings" and "correlations". The weak value A_w for an observable A is then given a geometric/statistical interpretation as either the orthogonal projection of A onto the subspace generated by another observable B, or equivalently, as the conditioning of A given B with respect to the QJP distribution under consideration.
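One concrete member of the family of QJP candidates is the Kirkwood-Dirac distribution q(a, b) = Tr(ρ Π_a Π_b). The sketch below is our own choice of example (the paper treats a whole family of such distributions); the state and projectors are toy qubit choices:

```python
import numpy as np

# Kirkwood-Dirac quasi-joint distribution q(a, b) = Tr(rho Pi_a Pi_b).
# For commuting projector families it reduces to the ordinary joint
# distribution; for noncommuting pairs its entries can be negative or
# complex, which is the "quasi" in quasi-probability.

def kirkwood_dirac(rho, projs_a, projs_b):
    return np.array([[np.trace(rho @ Pa @ Pb) for Pb in projs_b]
                     for Pa in projs_a])

z0 = np.array([[1, 0], [0, 0]], dtype=complex)     # |0><0|
z1 = np.eye(2) - z0                                # |1><1|
plus = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)   # |+><+|
minus = np.eye(2) - plus                           # |-><-|

rho = z0                                           # state |0><0|
q = kirkwood_dirac(rho, [z0, z1], [plus, minus])
print(abs(q.sum() - 1.0) < 1e-12)  # True: the QJP still normalizes to 1
```

Its marginals reproduce the standard single-observable outcome probabilities, which is the minimal requirement any QJP candidate must satisfy.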
Mayer control problem with probabilistic uncertainty on initial positions
NASA Astrophysics Data System (ADS)
Marigonda, Antonio; Quincampoix, Marc
2018-03-01
In this paper we introduce and study an optimal control problem in Mayer form in the space of probability measures on R^n endowed with the Wasserstein distance. Our aim is to study optimality conditions when the initial state and velocity are subject to uncertainty, modeled respectively by a probability measure on R^d and by a vector-valued measure on R^d. We provide a characterization of the value function of such a problem as the unique solution of a Hamilton-Jacobi-Bellman equation in the space of measures, in a suitable viscosity sense. An application to a pursuit-evasion game with uncertainty in the state space is also discussed, proving the existence of a value for the game.
Estimating earthquake-induced failure probability and downtime of critical facilities.
Porter, Keith; Ramer, Kyle
2012-01-01
Fault trees have long been used to estimate failure risk in earthquakes, especially for nuclear power plants (NPPs). One interesting application is that one can assess and manage the probability that two facilities (a primary and a backup) would be simultaneously rendered inoperative in a single earthquake. Another is that one can calculate the probabilistic time required to restore a facility to functionality, and the probability that, during any given planning period, the facility would be rendered inoperative for any specified duration. A large new peer-reviewed library of component damageability and repair-time data for the first time enables fault trees to be used to calculate the seismic risk of operational failure and downtime for a wide variety of buildings other than NPPs. With the new library, both the failure probability and the probabilistic downtime can be assessed and managed, considering the facility's unique combination of structural and non-structural components, their seismic installation conditions, and the other systems on which the facility relies. An example is offered of real computer data centres operated by a California utility. The fault trees were created and tested in collaboration with utility operators, and the failure probability and downtime results were validated in several ways.
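The primary-plus-backup calculation described above reduces to an AND gate over two facilities, each of which is an OR gate over its critical components. This toy sketch uses hypothetical component failure probabilities of our own (not the paper's data) and assumes component independence within a single earthquake scenario:

```python
# Basic fault-tree gates over independent basic-event probabilities.

def or_gate(ps):
    """P(at least one component fails)."""
    q = 1.0
    for p in ps:
        q *= (1.0 - p)
    return 1.0 - q

def and_gate(ps):
    """P(all subsystems fail)."""
    q = 1.0
    for p in ps:
        q *= p
    return q

# Hypothetical per-scenario component failure probabilities:
primary = or_gate([0.05, 0.02, 0.10])   # e.g. structure, power, cooling
backup = or_gate([0.03, 0.02])
print(round(and_gate([primary, backup]), 4))  # 0.008
```

In practice the two facilities' failures are correlated through shared ground motion, so a full analysis conditions both trees on the same earthquake scenario before combining them.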
The Probabilities of Unique Events
Khemlani, Sangeet S.; Lotstein, Max; Johnson-Laird, Phil
2012-01-01
Many theorists argue that the probabilities of unique events, even real possibilities such as President Obama's re-election, are meaningless. As a consequence, psychologists have seldom investigated them. We propose a new theory (implemented in a computer program) in which such estimates depend on an intuitive non-numerical system capable only of simple procedures, and a deliberative system that maps intuitions into numbers. The theory predicts that estimates of the probabilities of conjunctions should often tend to split the difference between the probabilities of the two conjuncts. We report two experiments showing that individuals commit such violations of the probability calculus, and corroborating other predictions of the theory, e.g., individuals err in the same way even when they make non-numerical verbal estimates, such as that an event is highly improbable. PMID:23056224
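The "split the difference" prediction for conjunctions can be stated in a few lines of code. This is a sketch of the predicted behavior only, with invented probabilities; the paper's implemented model is richer:

```python
# Sketch of the predicted "primitive average" for conjunctions, assuming
# the intuitive system simply averages the two conjunct probabilities.
# This violates the probability calculus, which requires
# P(A and B) <= min(P(A), P(B)).

def primitive_average(p_a, p_b):
    return (p_a + p_b) / 2.0

p_a, p_b = 0.9, 0.2
estimate = primitive_average(p_a, p_b)   # ≈ 0.55
violates = estimate > min(p_a, p_b)      # a conjunction fallacy
print(estimate, violates)
```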
Most recent common ancestor probability distributions in gene genealogies under selection.
Slade, P F
2000-12-01
A computational study is made of the conditional probability distribution for the allelic type of the most recent common ancestor in genealogies of samples of n genes drawn from a population under selection, given the initial sample configuration. Comparisons with the corresponding unconditional cases are presented. Such unconditional distributions differ from samples drawn from the unique stationary distribution of population allelic frequencies, known as Wright's formula, and these differences are quantified. Biallelic haploid and diploid models are considered. A simplified structure for the ancestral selection graph of S. M. Krone and C. Neuhauser (1997, Theor. Popul. Biol. 51, 210-237) is enhanced further, reducing the effective branching rate in the graph. This improves the efficiency of such a nonneutral analogue of the coalescent for use with computational likelihood-inference techniques.
Probability and surprisal in auditory comprehension of morphologically complex words.
Balling, Laura Winther; Baayen, R Harald
2012-10-01
Two auditory lexical decision experiments document for morphologically complex words two points at which the probability of a target word given the evidence shifts dramatically. The first point is reached when morphologically unrelated competitors are no longer compatible with the evidence. Adapting terminology from Marslen-Wilson (1984), we refer to this as the word's initial uniqueness point (UP1). The second point is the complex uniqueness point (CUP) introduced by Balling and Baayen (2008), at which morphologically related competitors become incompatible with the input. Later initial as well as complex uniqueness points predict longer response latencies. We argue that the effects of these uniqueness points arise due to the large surprisal (Levy, 2008) carried by the phonemes at these uniqueness points, and provide independent evidence that how cumulative surprisal builds up in the course of the word co-determines response latencies. The presence of effects of surprisal, both at the initial uniqueness point of complex words, and cumulatively throughout the word, challenges the Shortlist B model of Norris and McQueen (2008), and suggests that a Bayesian approach to auditory comprehension requires complementation from information theory in order to do justice to the cognitive cost of updating probability distributions over lexical candidates. Copyright © 2012 Elsevier B.V. All rights reserved.
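The surprisal quantity used here is standard: the surprisal of a phoneme is the negative log of its conditional probability given the input so far, so a low-probability phoneme at a uniqueness point carries a large surprisal. A toy illustration with invented per-phoneme probabilities:

```python
# Cumulative surprisal over a word's phonemes, where surprisal(p) = -log2(p)
# for a phoneme with conditional probability p given the input so far.
# The probabilities below are made up for illustration.
import math

def surprisal(p):
    return -math.log2(p)

# Hypothetical conditional probabilities; the low-probability phoneme
# (here 0.1, standing in for a uniqueness point) dominates the total.
phoneme_probs = [0.9, 0.8, 0.1, 0.95]
cumulative = sum(surprisal(p) for p in phoneme_probs)
peak = max(surprisal(p) for p in phoneme_probs)
print(round(cumulative, 3), round(peak, 3))
```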
War during childhood: The long run effects of warfare on health.
Akbulut-Yuksel, Mevlude
2017-05-01
This paper estimates the causal long-term consequences of an exposure to war in utero and during childhood on the risk of obesity and the probability of having a chronic health condition in adulthood. Using the plausibly exogenous city-by-cohort variation in the intensity of WWII destruction as a unique quasi-experiment, I find that individuals who were exposed to WWII destruction during the prenatal and early postnatal periods have higher BMIs and are more likely to be obese as adults. I also find an elevated incidence of chronic health conditions such as stroke, hypertension, diabetes, and cardiovascular disorder in adulthood among these wartime children. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Chen, Po-Hao; Botzolakis, Emmanuel; Mohan, Suyash; Bryan, R. N.; Cook, Tessa
2016-03-01
In radiology, diagnostic errors occur either through failure of detection or through incorrect interpretation. Errors are estimated to occur in 30-35% of all exams and contribute to 40-54% of medical malpractice litigations. In this work, we focus on reducing incorrect interpretation of known imaging features. Existing literature categorizes the cognitive biases that lead a radiologist to an incorrect diagnosis despite correct recognition of the abnormal imaging features: anchoring bias, framing effect, availability bias, and premature closure. Computational methods make a unique contribution, as they do not exhibit the same cognitive biases as a human. Bayesian networks formalize the diagnostic process. They modify pre-test diagnostic probabilities using clinical and imaging features, arriving at a post-test probability for each possible diagnosis. To translate Bayesian networks to clinical practice, we implemented an entirely web-based open-source software tool. In this tool, the radiologist first selects a network of choice (e.g. basal ganglia). Then, large, clearly labeled buttons displaying salient imaging features appear on the screen, serving both as a checklist and as input. As the radiologist inputs the value of an extracted imaging feature, the conditional probabilities of each possible diagnosis are updated. The software presents its level of diagnostic discrimination using a Pareto distribution chart, updated with each additional imaging feature. Active collaboration with the clinical radiologist is a feasible approach to software design and leads to design decisions closely coupling the complex mathematics of conditional probability in Bayesian networks with practice.
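The pre-test to post-test update such a tool performs can be sketched as a Bayes rule over diagnoses. For illustration only, the sketch below assumes conditionally independent imaging features (naive Bayes); the diagnoses, feature names, and likelihoods are invented, and a real Bayesian network encodes richer dependencies:

```python
# Hedged sketch: pre-test priors over diagnoses updated to post-test
# probabilities as imaging features are entered, assuming (for this toy
# example only) conditionally independent features.

def posterior(priors, likelihoods, findings):
    """Return normalized P(diagnosis | observed findings)."""
    post = dict(priors)
    for dx in post:
        for feature in findings:
            post[dx] *= likelihoods[dx][feature]
    total = sum(post.values())
    return {dx: p / total for dx, p in post.items()}

# Invented example: two candidate diagnoses and two imaging features.
priors = {"dx_A": 0.7, "dx_B": 0.3}
likelihoods = {
    "dx_A": {"calcification": 0.1, "enhancement": 0.6},
    "dx_B": {"calcification": 0.8, "enhancement": 0.5},
}
post = posterior(priors, likelihoods, ["calcification", "enhancement"])
print({dx: round(p, 3) for dx, p in post.items()})
```

Even with a high prior on dx_A, the observed features flip the post-test ranking toward dx_B, which is exactly the kind of update the tool surfaces feature by feature.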
Approved Methods and Algorithms for DoD Risk-Based Explosives Siting
2007-02-02
Symbol definitions (excerpt):
Pgha — Probability of a person being in the glass hazard area
Phit — Probability of hit
Phit (f) — Probability of hit for fatality
Phit (maji... — Probability of hit for major injury
Phit (mini) — Probability of hit for minor injury
Pi — Debris probability densities at the ES
PMaj (pair) — Individual...
...combined high-angle and combined low-angle tables. A unique probability of hit is calculated for the three consequences of fatality, Phit (f), major injury
A combinatorial perspective of the protein inference problem.
Yang, Chao; He, Zengyou; Yu, Weichuan
2013-01-01
In a shotgun proteomics experiment, proteins are the most biologically meaningful output. The success of proteomics studies depends on the ability to accurately and efficiently identify proteins. Many methods have been proposed to facilitate the identification of proteins from peptide identification results. However, the relationship between protein identification and peptide identification has not been thoroughly explained before. In this paper, we devote ourselves to a combinatorial perspective of the protein inference problem. We employ combinatorial mathematics to calculate the conditional protein probabilities (protein probability means the probability that a protein is correctly identified) under three assumptions, which lead to a lower bound, an upper bound, and an empirical estimation of protein probabilities, respectively. The combinatorial perspective enables us to obtain an analytical expression for protein inference. Our method achieves comparable results with ProteinProphet in a more efficient manner in experiments on two data sets of standard protein mixtures and two data sets of real samples. Based on our model, we study the impact of unique peptides and degenerate peptides (degenerate peptides are peptides shared by at least two proteins) on protein probabilities. Meanwhile, we also study the relationship between our model and ProteinProphet. We name our program ProteinInfer. Its Java source code, our supplementary document and experimental results are available at: http://bioinformatics.ust.hk/proteininfer.
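The role of unique peptides can be illustrated with the simplest possible model (this is not ProteinInfer itself): if a protein's unique peptides are identified independently, the protein is correctly identified whenever at least one of its peptides is correct. The peptide probabilities below are invented:

```python
# Illustrative sketch only: protein probability from unique-peptide
# probabilities under an independence assumption. Degenerate (shared)
# peptides would give weaker evidence, since their support must be
# apportioned among all proteins that could explain them.

def protein_probability(peptide_probs):
    """P(at least one peptide correct), assuming independence."""
    wrong = 1.0
    for p in peptide_probs:
        wrong *= (1.0 - p)
    return 1.0 - wrong

# A protein with two unique peptides at 0.9 and 0.5 confidence:
print(round(protein_probability([0.9, 0.5]), 3))
```

Adding even a low-confidence unique peptide raises the protein probability, which matches the intuition that unique peptides are the strongest evidence for protein identification.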
Titanium carbide and titania phases on Antarctic ice particles of probable extraterrestrial origin
NASA Technical Reports Server (NTRS)
Zolensky, M. E.; Pun, A.; Thomas, K. L.
1989-01-01
Two unique titania-rich particles found within ancient Antarctic ice have been discovered and characterized, and are believed to be of extraterrestrial origin. Both particles contain abundant submicron-sized crystals of Magneli phases (Ti(n)O(2n-1)). In addition, one particle contains a core of TiC. Whereas the Magneli phases would have been stable in the early solar nebula, and so probably formed there, the TiC is more likely to have condensed in the cool, dusty, carbon-rich outer shell of a red giant star. It is suggested that both particles are interplanetary dust particles whose Magneli phases carry a record of the PO2-T conditions of the early solar nebula. It is further suggested that the TiC grain in particle 705 is remnant interstellar dust.
A method to establish stimulus control and compliance with instructions.
Borgen, John G; Charles Mace, F; Cavanaugh, Brenna M; Shamlian, Kenneth; Lit, Keith R; Wilson, Jillian B; Trauschke, Stephanie L
2017-10-01
We evaluated a unique procedure to establish compliance with instructions in four young children diagnosed with autism spectrum disorder (ASD) who had low levels of compliance. Our procedure included methods to establish a novel therapist as a source of positive reinforcement, reliably evoke orienting responses to the therapist, increase the number of exposures to instruction-compliance-reinforcer contingencies, and minimize the number of exposures to instruction-noncompliance-no reinforcer contingencies. We further alternated between instructions with a high probability of compliance (high-p instructions) with instructions that had a prior low probability of compliance (low-p instructions) as soon as low-p instructions lost stimulus control. The intervention is discussed in relation to the conditions necessary for the development of stimulus control and as an example of a variation of translational research. © 2017 Society for the Experimental Analysis of Behavior.
Use and interpretation of logistic regression in habitat-selection studies
Keating, Kim A.; Cherry, Steve
2004-01-01
Logistic regression is an important tool for wildlife habitat-selection studies, but the method frequently has been misapplied due to an inadequate understanding of the logistic model, its interpretation, and the influence of sampling design. To promote better use of this method, we review its application and interpretation under 3 sampling designs: random, case-control, and use-availability. Logistic regression is appropriate for habitat use-nonuse studies employing random sampling and can be used to directly model the conditional probability of use in such cases. Logistic regression also is appropriate for studies employing case-control sampling designs, but careful attention is required to interpret results correctly. Unless bias can be estimated or probability of use is small for all habitats, results of case-control studies should be interpreted as odds ratios, rather than probability of use or relative probability of use. When data are gathered under a use-availability design, logistic regression can be used to estimate approximate odds ratios if probability of use is small, at least on average. More generally, however, logistic regression is inappropriate for modeling habitat selection in use-availability studies. In particular, using logistic regression to fit the exponential model of Manly et al. (2002:100) does not guarantee maximum-likelihood estimates, valid probabilities, or valid likelihoods. We show that the resource selection function (RSF) commonly used for the exponential model is proportional to a logistic discriminant function. Thus, it may be used to rank habitats with respect to probability of use and to identify important habitat characteristics or their surrogates, but it is not guaranteed to be proportional to probability of use. Other problems associated with the exponential model also are discussed. 
We describe an alternative model based on Lancaster and Imbens (1996) that offers a method for estimating conditional probability of use in use-availability studies. Although promising, this model fails to converge to a unique solution in some important situations. Further work is needed to obtain a robust method that is broadly applicable to use-availability studies.
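The odds-ratio interpretation recommended for case-control designs follows directly from the logistic model: with logit(p) = b0 + b1·x, the quantity exp(b1) is the odds ratio for a one-unit change in the covariate, while the intercept absorbs the case-control sampling bias. A small numeric check, with illustrative coefficients not taken from the paper:

```python
# With a logistic model logit(p) = b0 + b1*x, the odds ratio for a
# one-unit change in x equals exp(b1), regardless of the intercept b0.
# The coefficients below are invented for illustration.
import math

b0, b1 = -2.0, 0.8

def p_use(x):
    """Conditional probability of use under the logistic model."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))

def odds(p):
    return p / (1.0 - p)

odds_ratio = odds(p_use(1.0)) / odds(p_use(0.0))
print(round(odds_ratio, 4))   # equals exp(b1)
```

This is why case-control results remain interpretable as odds ratios even when the intercept, and hence the absolute probability of use, cannot be recovered.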
Bayesian network models for error detection in radiotherapy plans
NASA Astrophysics Data System (ADS)
Kalet, Alan M.; Gennari, John H.; Ford, Eric C.; Phillips, Mark H.
2015-04-01
The purpose of this study is to design and develop a probabilistic network for detecting errors in radiotherapy plans for use at the time of initial plan verification. Our group has initiated a multi-pronged approach to reduce these errors. We report on our development of Bayesian models of radiotherapy plans. Bayesian networks consist of joint probability distributions that define the probability of one event, given some set of other known information. Using the networks, we find the probability of obtaining certain radiotherapy parameters, given a set of initial clinical information. A low probability in a propagated network then corresponds to potential errors to be flagged for investigation. To build our networks we first interviewed medical physicists and other domain experts to identify the relevant radiotherapy concepts and their associated interdependencies and to construct a network topology. Next, to populate the network’s conditional probability tables, we used the Hugin Expert software to learn parameter distributions from a subset of de-identified data derived from a radiation oncology based clinical information database system. These data represent 4990 unique prescription cases over a 5 year period. Under test case scenarios with approximately 1.5% introduced error rates, network performance produced areas under the ROC curve of 0.88, 0.98, and 0.89 for the lung, brain and female breast cancer error detection networks, respectively. Comparison of the brain network to human experts performance (AUC of 0.90 ± 0.01) shows the Bayes network model performs better than domain experts under the same test conditions. Our results demonstrate the feasibility and effectiveness of comprehensive probabilistic models as part of decision support systems for improved detection of errors in initial radiotherapy plan verification procedures.
Conservational PDF Equations of Turbulence
NASA Technical Reports Server (NTRS)
Shih, Tsan-Hsing; Liu, Nan-Suey
2010-01-01
Recently we have revisited the traditional probability density function (PDF) equations for the velocity and species in turbulent incompressible flows. They are all unclosed due to the appearance of various conditional means which are modeled empirically. However, we have observed that it is possible to establish a closed velocity PDF equation and a closed joint velocity and species PDF equation through conditions derived from the integral form of the Navier-Stokes equations. Although, in theory, the resulting PDF equations are neither general nor unique, they nevertheless lead to the exact transport equations for the first moment as well as all higher order moments. We refer to these PDF equations as the conservational PDF equations. This observation is worth further exploration for its validity and CFD application.
The Interaction Between the Magnetosphere of Mars and that of Comet Siding Spring
NASA Astrophysics Data System (ADS)
Holmstrom, M.; Futaana, Y.; Barabash, S. V.
2015-12-01
On 19 October 2014 the comet Siding Spring flew by Mars. This was a unique opportunity to study the interaction between a cometary and a planetary magnetosphere. Here we model the magnetosphere of the comet using a hybrid plasma solver (ions as particles, electrons as a fluid). The undisturbed upstream solar wind ion conditions are estimated from observations by ASPERA-3/IMA on Mars Express during several orbits. It is found that Mars probably passed through a solar wind that was disturbed by the comet during the flyby. The uncertainty arises because the size of the disturbed solar wind region in the comet simulation is sensitive to the assumed upstream solar wind conditions, especially the solar wind proton density.
Ubiquity of Benford's law and emergence of the reciprocal distribution
Friar, James Lewis; Goldman, Terrance J.; Pérez-Mercader, J.
2016-04-07
In this paper, we apply the Law of Total Probability to the construction of scale-invariant probability distribution functions (pdf's), and require that probability measures be dimensionless and unitless under a continuous change of scales. If the scale-change distribution function is scale invariant then the constructed distribution will also be scale invariant. Repeated application of this construction on an arbitrary set of (normalizable) pdf's results again in scale-invariant distributions. The invariant function of this procedure is given uniquely by the reciprocal distribution, suggesting a kind of universality. Finally, we separately demonstrate that the reciprocal distribution results uniquely from requiring maximum entropy for size-class distributions with uniform bin sizes.
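The connection between the reciprocal distribution and Benford's law is a one-line integral: for a density proportional to 1/x on [1, 10), the probability of leading digit d is the integral of 1/(x ln 10) from d to d+1, which gives P(d) = log10(1 + 1/d). A quick numeric check:

```python
# Benford's law as the leading-digit distribution of the reciprocal
# distribution p(x) ∝ 1/x on [1, 10): P(d) = log10(1 + 1/d).
import math

def benford(d):
    return math.log10(1.0 + 1.0 / d)

probs = [benford(d) for d in range(1, 10)]
print(round(probs[0], 4))     # P(leading digit 1) ≈ 0.3010
print(round(sum(probs), 10))  # digits 1..9 exhaust the distribution
```

The sum telescopes to log10(10) = 1 exactly, confirming that the nine digit probabilities form a proper distribution.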
The global dynamics for a stochastic SIS epidemic model with isolation
NASA Astrophysics Data System (ADS)
Chen, Yiliang; Wen, Buyu; Teng, Zhidong
2018-02-01
In this paper, we investigate the dynamical behavior for a stochastic SIS epidemic model with isolation which is as an important strategy for the elimination of infectious diseases. It is assumed that the stochastic effects manifest themselves mainly as fluctuation in the transmission coefficient, the death rate and the proportional coefficient of the isolation of infective. It is shown that the extinction and persistence in the mean of the model are determined by a threshold value R0S . That is, if R0S < 1, then disease dies out with probability one, and if R0S > 1, then the disease is stochastic persistent in the means with probability one. Furthermore, the existence of a unique stationary distribution is discussed, and the sufficient conditions are established by using the Lyapunov function method. Finally, some numerical examples are carried out to confirm the analytical results.
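The extinction/persistence dichotomy can be explored numerically with an Euler-Maruyama discretization. The sketch below simulates a stochastically perturbed SIS-type model; the functional form, parameter values, and threshold comparison are illustrative stand-ins, not the paper's exact model or its R0S:

```python
# Euler-Maruyama sketch of an SIS model with a stochastically perturbed
# transmission coefficient: di = [beta*i*(1-i) - gamma*i] dt
#                               + sigma*i*(1-i) dW.
# All parameter values are illustrative, not taken from the paper.
import random

def simulate_sis(beta, gamma, sigma, i0=0.01, dt=0.001, steps=20000, seed=1):
    random.seed(seed)
    i = i0
    for _ in range(steps):
        dw = random.gauss(0.0, dt ** 0.5)       # Brownian increment
        drift = beta * i * (1.0 - i) - gamma * i
        diffusion = sigma * i * (1.0 - i)
        i += drift * dt + diffusion * dw
        i = min(max(i, 0.0), 1.0)               # keep fraction in [0, 1]
    return i

# Deterministic threshold intuition: beta/gamma < 1 dies out, > 1 persists.
print(simulate_sis(beta=0.5, gamma=1.0, sigma=0.1))
print(simulate_sis(beta=2.0, gamma=1.0, sigma=0.1))
```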
Learning Problem-Solving Rules as Search Through a Hypothesis Space.
Lee, Hee Seung; Betts, Shawn; Anderson, John R
2016-07-01
Learning to solve a class of problems can be characterized as a search through a space of hypotheses about the rules for solving these problems. A series of four experiments studied how different learning conditions affected the search among hypotheses about the solution rule for a simple computational problem. Experiment 1 showed that a problem property such as computational difficulty of the rules biased the search process and so affected learning. Experiment 2 examined the impact of examples as instructional tools and found that their effectiveness was determined by whether they uniquely pointed to the correct rule. Experiment 3 compared verbal directions with examples and found that both could guide search. The final experiment tried to improve learning by using more explicit verbal directions or by adding scaffolding to the example. While both manipulations improved learning, learning still took the form of a search through a hypothesis space of possible rules. We describe a model that embodies two assumptions: (1) the instruction can bias the rules participants hypothesize rather than directly be encoded into a rule; (2) participants do not have memory for past wrong hypotheses and are likely to retry them. These assumptions are realized in a Markov model that fits all the data by estimating two sets of probabilities. First, the learning condition induced one set of Start probabilities of trying various rules. Second, should this first hypothesis prove wrong, the learning condition induced a second set of Choice probabilities of considering various rules. These findings broaden our understanding of effective instruction and provide implications for instructional design. Copyright © 2015 Cognitive Science Society, Inc.
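The two-parameter Markov model can be sketched directly: the learning condition fixes Start probabilities for the first hypothesis and Choice probabilities for each memoryless re-selection after an error. All numbers below are invented for illustration:

```python
# Sketch of the Start/Choice Markov search model. With no memory for
# rejected hypotheses, wrong rules can be retried, so re-selection is a
# memoryless draw from the Choice probabilities.
import random

rules = ["correct", "wrong_a", "wrong_b"]
start = [0.2, 0.5, 0.3]    # induced by the learning condition
choice = [0.4, 0.3, 0.3]   # used after each failed hypothesis

def trials_to_solve(rng):
    rule = rng.choices(rules, weights=start)[0]
    trials = 1
    while rule != "correct":
        rule = rng.choices(rules, weights=choice)[0]  # memoryless re-try
        trials += 1
    return trials

rng = random.Random(42)
mean = sum(trials_to_solve(rng) for _ in range(20000)) / 20000
print(round(mean, 1))   # expected value is 1 + 0.8 * (1 / 0.4) = 3.0
```

The closed form follows because, after a wrong start (probability 0.8), the number of further draws is geometric with success probability 0.4.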
Vacuolar protein sorting mechanisms in plants.
Xiang, Li; Etxeberria, Ed; Van den Ende, Wim
2013-02-01
Plant vacuoles are unique, multifunctional organelles among eukaryotes. Considerable new insights in plant vacuolar protein sorting have been obtained recently. The basic machinery of protein export from the endoplasmic reticulum to the Golgi and the classical route to the lytic vacuole and the protein storage vacuole shows many similarities to vacuolar/lysosomal sorting in other eukaryotes. However, as a result of its unique functions in plant defence and as a storage compartment, some plant-specific entities and sorting determinants appear to exist. The alternative post-Golgi route, as found in animals and yeast, probably exists in plants as well. Likely, adaptor protein complex 3 fulfils a central role in this route. A Golgi-independent route involving plant-specific endoplasmic reticulum bodies appears to provide sedentary organisms such as plants with extra flexibility to cope with changing environmental conditions. © 2012 The Authors Journal compilation © 2012 FEBS.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sattison, M.B.; Schroeder, J.A.; Russell, K.D.
The Idaho National Engineering Laboratory (INEL) over the past year has created 75 plant-specific Accident Sequence Precursor (ASP) models using the SAPHIRE suite of PRA codes. Along with the new models, the INEL has also developed a new module for SAPHIRE which is tailored specifically to the unique needs of conditional core damage probability (CCDP) evaluations. These models and software will be the next generation of risk tools for the evaluation of accident precursors by both NRR and AEOD. This paper presents an overview of the models and software. Key characteristics include: (1) classification of the plant models according to plant response with a unique set of event trees for each plant class, (2) plant-specific fault trees using supercomponents, (3) generation and retention of all system and sequence cutsets, (4) full flexibility in modifying logic, regenerating cutsets, and requantifying results, and (5) user interface for streamlined evaluation of ASP events.
A satellite data terminal for land mobile use
NASA Technical Reports Server (NTRS)
Sutherland, Colin A.
1990-01-01
Telesat Mobile Incorporated (TMI) has recently introduced the Mobile Data Service (MDS) into Canada. This paper outlines the system design and some key aspects of the detailed design of the Mobile Earth Terminal (MET) developed by Canadian Aeronautics Limited (CAL) for use with the MDS. The technical requirements for the MET are outlined and the equipment architecture is described. The major design considerations for each functional module are then addressed. Environmental conditions unique to the land mobile service are highlighted, along with the measures taken to ensure satisfactory operation and survival of the MET. Finally, the probable direction of future developments is indicated.
NASA Astrophysics Data System (ADS)
Fan, Kuangang; Zhang, Yan; Gao, Shujing; Wei, Xiang
2017-09-01
A class of SIR epidemic model with generalized nonlinear incidence rate is presented in this paper. Temporary immunity and stochastic perturbation are also considered. The existence and uniqueness of the global positive solution is achieved. Sufficient conditions guaranteeing the extinction and persistence of the epidemic disease are established. Moreover, the threshold behavior is discussed, and the threshold value R0 is obtained. We show that if R0 < 1, the disease eventually becomes extinct with probability one, whereas if R0 > 1, then the system remains permanent in the mean.
NASA Astrophysics Data System (ADS)
Skilling, John
2005-11-01
This tutorial gives a basic overview of Bayesian methodology, from its axiomatic foundation through the conventional development of data analysis and model selection to its rôle in quantum mechanics, and ending with some comments on inference in general human affairs. The central theme is that probability calculus is the unique language within which we can develop models of our surroundings that have predictive capability. These models are patterns of belief; there is no need to claim external reality. Contents: 1. Logic and probability; 2. Probability and inference; 3. Probability and model selection; 4. Prior probabilities; 5. Probability and frequency; 6. Probability and quantum mechanics; 7. Probability and fundamentalism; 8. Probability and deception; 9. Prediction and truth.
Fox, Anthony W; Payne-James, J Jason
2012-11-30
Alleged fatalities associated with conductive-energy devices (CEDs) are similar to alleged serious adverse events (SAEs) after the use of pharmaceutical products: both types of case arise rarely, in complex (if not unique) combinations of circumstances, frequently when there are multiple concomitant putative aetiologies for the injury, and after the suspected product has been previously well-designed and tested. Attribution (or otherwise) of SAEs to pharmaceutical products is often assessed by use of the Naranjo algorithm. The purpose of this study was to investigate whether an adapted Naranjo algorithm could be used to assess alleged CED-associated fatalities. Unique cases had four independent identifiers. Prospectively, 7 (of the 10) Naranjo algorithm questions were chosen as being potentially applicable to CED use. These had a maximum score of 9, and the associated ordinal probability scale (doubtful, possible, probable, and definite) was retained by linear proportion to the integral scores. An arbitrary requirement was for database sufficiency ≥ 50% = ([n unique cases × 7 questions answerable] × 0.5); a pilot sample (n=29 unique cases) suggested feasibility (see below). One hundred and seventy-five unique cases were found, with a data sufficiency of 56.8%. Modified Naranjo algorithm scores had an unequally bimodal distribution. CED-attributability was suggested in 21 (12% of 175) cases. Substantial numbers of concomitant conditions existed among cases with low algorithm scores, all being potentially lethal under field conditions without CED exposure. The number of CED-administered shocks sustained was unrelated to CED-attributability of fatality. Two of the Naranjo questions (regarding dechallenge and the effects of challenge with a non-identical but similar agent) proved to be non-contributory. An algorithmic approach to assessment of CED-associated fatality seems feasible.
By these pharmacovigilance standards, some published case fatality rates attributable to CED exposure seem exaggerated. CED-attributable deaths have close similarity to Type-B SAEs. The latter are rare, unpredictable, and usually due to a patient idiosyncrasy. In the person being restrained, such idiosyncratic factors may be unavoidable by law enforcement officers (LEO) in the field. These are unlike predictable (Type-A) SAEs, which have their corollary amongst secondary CED-associated deaths, e.g., head injury among cyclists or ignition of an inflammable atmosphere by the CED, and are identifiable risk factors for which LEO can train. Regardless, absolute CED tolerability is obviously greater than that for firearms. A prospective registry of CED deployments would measure this more precisely. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
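The score-to-category mapping described above can be sketched in a few lines. The exact cut-points below are assumptions for illustration (one plausible reading of "linear proportion"), not the study's published thresholds:

```python
# Hypothetical sketch: mapping a modified Naranjo score (0-9) onto the
# four-level ordinal scale by linear proportion of the score range.
# The quartile cut-points are assumed, not taken from the paper.

def naranjo_category(score, max_score=9):
    fraction = score / max_score
    if fraction < 0.25:
        return "doubtful"
    if fraction < 0.50:
        return "possible"
    if fraction < 0.75:
        return "probable"
    return "definite"

print([naranjo_category(s) for s in (1, 3, 5, 9)])
```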
Enceladus as a hydrothermal water world
NASA Astrophysics Data System (ADS)
Postberg, Frank; Hsu, Hsiang-Wen; Sekine, Yasuhito
2014-05-01
The composition of both salty ice grains and nanometer-sized stream particles emitted from Enceladus and measured by Cassini-CDA requires liquid water as a source. Moreover, they provide strong geochemical constraints on their origin inside the active moon. Most stream particles are composed of silica, a unique indicator, as nano-silica would only form under quite specific conditions. With high probability, on-going or geologically recent hydrothermal activity at Enceladus is required to generate these particles. Inferred reaction temperatures at the Enceladus ocean floor lie between 100 and 350 °C in a slightly alkaline environment (pH 7.5 - 10.5). The inferred high temperatures at great depth might require heat sources other than tides alone, such as remaining primordial heat and/or serpentinization of a probably porous rocky core. Long-term laboratory experiments were carried out to simulate the conditions at the Enceladus rock/water interface using the constraints derived from CDA measurements. These experiments allow insights into a rock/water chemistry which severely constrains the formation history of the moon and substantially enhances its astrobiological potential. Together with recent results from other Cassini instruments, a conclusive picture of Enceladus as an active water world seems to be within reach.
Uncertainty analysis for the steady-state flows in a dual throat nozzle
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Q.-Y.; Gottlieb, David; Hesthaven, Jan S.
2005-03-20
It is well known that the steady state of an isentropic flow in a dual-throat nozzle with equal throat areas is not unique. In particular there is a possibility that the flow contains a shock wave, whose location is determined solely by the initial condition. In this paper, we consider cases with uncertainty in this initial condition and use generalized polynomial chaos methods to study the steady-state solutions for stochastic initial conditions. Special interest is given to the statistics of the shock location. The polynomial chaos (PC) expansion modes are shown to be smooth functions of the spatial variable x, although each solution realization is discontinuous in the spatial variable x. When the variance of the initial condition is small, the probability density function of the shock location is computed with high accuracy. Otherwise, many terms are needed in the PC expansion to produce reasonable results due to the slow convergence of the PC expansion, caused by non-smoothness in random space.
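The core operation in polynomial chaos is projecting a random quantity onto an orthogonal polynomial basis of the input uncertainty. A toy sketch, unrelated to the nozzle problem itself: for a Gaussian input, the basis is the probabilists' Hermite polynomials, and the coefficients can be estimated by Monte Carlo projection:

```python
# Toy polynomial chaos projection: expand u(ξ) = ξ², with ξ ~ N(0,1), in
# the probabilists' Hermite basis He_0 = 1, He_1 = ξ, He_2 = ξ² - 1, via
# c_k = E[u(ξ) He_k(ξ)] / k!  (using E[He_k²] = k!).
# Exactly, ξ² = 1·He_0 + 0·He_1 + 1·He_2, so c = (1, 0, 1).
import math, random

random.seed(0)
samples = [random.gauss(0.0, 1.0) for _ in range(200000)]

def he(k, x):
    return (1.0, x, x * x - 1.0)[k]

def coeff(k):
    norm = math.factorial(k)
    return sum(x * x * he(k, x) for x in samples) / len(samples) / norm

coeffs = [round(coeff(k), 2) for k in range(3)]
print(coeffs)
```

In practice quadrature replaces Monte Carlo, and for a discontinuous response (such as a shock location indicator) the expansion converges slowly, which is exactly the difficulty the abstract notes.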
NASA Astrophysics Data System (ADS)
Azarnavid, Babak; Parand, Kourosh; Abbasbandy, Saeid
2018-06-01
This article discusses an iterative reproducing kernel method with respect to its effectiveness and capability of solving a fourth-order boundary value problem with nonlinear boundary conditions modeling beams on elastic foundations. Since there is no method of obtaining a reproducing kernel that satisfies nonlinear boundary conditions, the standard reproducing kernel methods cannot be used directly to solve boundary value problems with nonlinear boundary conditions, as there is no knowledge about the existence and uniqueness of the solution. The aim of this paper is, therefore, to construct an iterative method by combining the reproducing kernel Hilbert space method with a shooting-like technique to solve the mentioned problems. Error estimation for reproducing kernel Hilbert space methods for nonlinear boundary value problems has yet to be discussed in the literature. In this paper, we present error estimation for the reproducing kernel method for nonlinear boundary value problems, probably for the first time. Some numerical results are given to demonstrate the applicability of the method.
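The shooting idea itself is simple to illustrate, separate from the reproducing kernel machinery: guess the unknown initial slope, integrate the ODE forward, and adjust the guess until the far boundary condition is met. A minimal sketch on the linear test problem u'' = u, u(0) = 0, u(1) = 1 (not the paper's fourth-order beam problem), whose exact solution is sinh(x)/sinh(1):

```python
# Shooting-method sketch: RK4 integration of u'' = u from x = 0 with
# u(0) = 0 and an unknown slope u'(0), plus bisection on the slope until
# u(1) matches the boundary value 1. Exact answer: u'(0) = 1/sinh(1).
import math

def integrate(slope, n=1000):
    """RK4 for the first-order system u' = v, v' = u on [0, 1]."""
    h = 1.0 / n
    u, v = 0.0, slope
    for _ in range(n):
        k1u, k1v = v, u
        k2u, k2v = v + 0.5 * h * k1v, u + 0.5 * h * k1u
        k3u, k3v = v + 0.5 * h * k2v, u + 0.5 * h * k2u
        k4u, k4v = v + h * k3v, u + h * k3u
        u += h * (k1u + 2 * k2u + 2 * k3u + k4u) / 6.0
        v += h * (k1v + 2 * k2v + 2 * k3v + k4v) / 6.0
    return u

# Bisection on the initial slope (u(1) is increasing in the slope here).
lo, hi = 0.0, 2.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if integrate(mid) < 1.0:
        lo = mid
    else:
        hi = mid

print(round(mid, 4))   # ≈ 1/sinh(1)
```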
Optimum space shuttle launch times relative to natural environment
NASA Technical Reports Server (NTRS)
King, R. L.
1977-01-01
Three sets of meteorological criteria were analyzed to determine the probabilities of favorable launch and landing conditions. Probabilities were computed for every 3 hours on a yearly basis using 14 years of weather data. These temporal probability distributions, applicable to the three sets of weather criteria encompassing benign, moderate and severe weather conditions, were computed for both Kennedy Space Center (KSC) and Edwards Air Force Base. In addition, conditional probabilities were computed for unfavorable weather conditions occurring after a delay which may or may not be due to weather conditions. Also, for KSC, the probabilities of favorable landing conditions at various times after favorable launch conditions have prevailed have been computed so that mission probabilities may be more accurately computed for those time periods when persistence strongly correlates weather conditions. Moreover, the probabilities and conditional probabilities of the occurrence of both favorable and unfavorable events for each individual criterion were computed to indicate the significance of each weather element to the overall result.
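The conditional probabilities described above come down to counting co-occurrences in a historical record. A minimal counting sketch, assuming a boolean "favorable weather" series sampled every 3 hours (synthetic data, not the 14-year record):

```python
import numpy as np

def conditional_favorable(favorable, lag):
    """P(favorable now | favorable `lag` steps earlier), by counting.

    `favorable` is a boolean time series (e.g. one entry per 3-hour
    interval); persistence makes this differ from the marginal rate.
    """
    now, earlier = favorable[lag:], favorable[:-lag]
    given = np.sum(earlier)
    return np.sum(now & earlier) / given if given else float("nan")

# Synthetic persistent series: the state flips with probability 0.1.
rng = np.random.default_rng(0)
state, series = True, []
for _ in range(50_000):
    if rng.random() > 0.9:
        state = not state
    series.append(state)
series = np.asarray(series)

p_marginal = series.mean()                       # ~0.5
p_cond = conditional_favorable(series, lag=1)    # ~0.9, due to persistence
```

The gap between the marginal and conditional estimates is exactly the persistence effect the abstract says must be accounted for when computing mission probabilities.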
Is MMTV associated with human breast cancer? Maybe, but probably not.
Perzova, Raisa; Abbott, Lynn; Benz, Patricia; Landas, Steve; Khan, Seema; Glaser, Jordan; Cunningham, Coleen K; Poiesz, Bernard
2017-10-13
Conflicting results regarding the association of MMTV with human breast cancer have been reported. Published sequence data have indicated unique MMTV strains in some human samples. However, concerns regarding contamination as a cause of false positive results have persisted. We performed PCR assays for MMTV on human breast cancer cell lines and on fresh frozen and formalin-fixed normal and malignant human breast epithelial samples. Assays were also performed on peripheral blood mononuclear cells from volunteer blood donors and from subjects at risk for human retroviral infections. In addition, assays were performed on DNA samples from wild and laboratory mice. Sequencing of MMTV-positive samples from both humans and mice was performed, and the sequences were compared phylogenetically. Using PCR under rigorous conditions to prevent and detect "carryover" contamination, we did detect MMTV DNA in human samples, including breast cancer samples. However, the results were not consistent and appeared to be artifactual. Further experiments indicated that the probable source of false positives was murine DNA, containing endogenous MMTV, present in our building. However, comparison of the MMTV sequences newly described herein with published data indicates that there are some unique human MMTV sequences in the literature. While we could not confirm the true presence of MMTV in our human breast cancer subjects, the data indicate that further, perhaps more traditional, retroviral studies are warranted to ascertain whether MMTV might rarely be a cause of human breast cancer.
Retrieval practice enhances the accessibility but not the quality of memory.
Sutterer, David W; Awh, Edward
2016-06-01
Numerous studies have demonstrated that retrieval from long-term memory (LTM) can enhance subsequent memory performance, a phenomenon labeled the retrieval practice effect. However, the almost exclusive reliance on categorical stimuli in this literature leaves open a basic question about the nature of this improvement in memory performance. It has not yet been determined whether retrieval practice improves the probability of successful memory retrieval or the quality of the retrieved representation. To answer this question, we conducted three experiments using a mixture modeling approach (Zhang & Luck, 2008) that provides a measure of both the probability of recall and the quality of the recalled memories. Subjects attempted to memorize the color of 400 unique shapes. After every 10 images were presented, subjects either recalled the last 10 colors (the retrieval practice condition) by clicking on a color wheel with each shape as a retrieval cue or they participated in a control condition that involved no further presentations (Experiment 1) or restudy of the 10 shape/color associations (Experiments 2 and 3). Performance in a subsequent delayed recall test revealed a robust retrieval practice effect. Subjects recalled a significantly higher proportion of items that they had previously retrieved relative to items that were untested or that they had restudied. Interestingly, retrieval practice did not elicit any improvement in the precision of the retrieved memories. The same empirical pattern also was observed following delays of greater than 24 hours. Thus, retrieval practice increases the probability of successful memory retrieval but does not improve memory quality.
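The mixture-modeling approach of Zhang and Luck (2008) separates the probability of recall from its precision by mixing a uniform "guess" component with a von Mises component centered on the target color. A minimal sketch with a coarse grid-search MLE on synthetic recall errors (the parameters and data are assumptions for illustration, not the study's):

```python
import numpy as np

def mixture_loglik(errors, g, kappa):
    """Log-likelihood of recall errors (radians, in [-pi, pi)) under a
    Zhang & Luck-style mixture: guesses are uniform with probability g,
    otherwise responses are von Mises around the target (mean 0)."""
    vm = np.exp(kappa * np.cos(errors)) / (2 * np.pi * np.i0(kappa))
    return np.sum(np.log(g / (2 * np.pi) + (1 - g) * vm))

def fit_mixture(errors):
    """Coarse grid-search MLE for (g, kappa); a sketch, not an optimizer."""
    best = (None, None, -np.inf)
    for g in np.linspace(0.01, 0.8, 80):
        for kappa in (1, 2, 4, 8, 16, 32):
            ll = mixture_loglik(errors, g, kappa)
            if ll > best[2]:
                best = (g, kappa, ll)
    return best[:2]

# Synthetic subject: guesses 30% of the time, otherwise precise recall.
rng = np.random.default_rng(1)
n, g_true, kappa_true = 3000, 0.3, 8.0
guess = rng.random(n) < g_true
errors = np.where(guess,
                  rng.uniform(-np.pi, np.pi, n),
                  rng.vonmises(0.0, kappa_true, n))
g_hat, kappa_hat = fit_mixture(errors)
```

In this framework the retrieval practice effect reported above is a drop in the guess rate g (higher probability of recall) with no change in kappa (precision).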
Meng, Xiangfei; D'Arcy, Carl
2012-08-01
To explore the common and unique risk factors for mood and anxiety disorders. What sociodemographic, psychological, and physical risk factors are associated with mood and anxiety disorders and their comorbidities? What is the impact of multiple risk factors? Data from the Canadian Community Health Survey: Mental Health and Well-Being were analyzed. Appropriate sampling weights and bootstrap variance estimation were employed. Multiple logistic regression was used to estimate odds ratios and confidence intervals. The annual prevalence of any mood disorder was 5.2%, and of any anxiety disorder 4.7%. Major depressive episode was the most prevalent mood and anxiety disorder (4.8%), followed by social phobia, panic disorder, mania, and agoraphobia. Among people with mood and anxiety disorders, 22.4% had 2 or more disorders. Risk factors common to mood and anxiety disorders were being young, having lower household income, being unmarried, experiencing greater stress, having poorer mental health, and having a medical condition. Unique risk factors were found: major depressive episode and social phobia were associated with being born in Canada; panic disorder was associated with being Caucasian; lower education was associated with panic and agoraphobia; and poor physical health was associated with mania and agoraphobia. People who were young, unmarried, not fully employed, and had a medical condition, greater stress, poorer self-rated mental health, and dissatisfaction with life, were more likely to have a comorbid mood and (or) anxiety disorder. As the number of common risk factors increases, the probability of having mood and anxiety disorders also increases. Common and unique risk factors exist for mood and anxiety disorders. Risk factors are additive in increasing the likelihood of disease.
NASA Technical Reports Server (NTRS)
Raup, D. M.; Valentine, J. W.
1983-01-01
There is some indication that life may have originated readily under primitive earth conditions. If there were multiple origins of life, the result could have been a polyphyletic biota today. Using simple stochastic models for diversification and extinction, we conclude: (1) the probability of survival of life is low unless there are multiple origins, and (2) given survival of life and given as many as 10 independent origins of life, the odds are that all but one would have gone extinct, yielding the monophyletic biota we have now. The fact of the survival of our particular form of life does not imply that it was unique or superior.
Grabau, Olga; Leonhardi, Jochen; Reimers, Carl D.
2014-01-01
Introduction: Recurrent oculomotor nerve palsies are extremely rare clinical conditions. Case report: Here, we report on a unique case of a short-lasting recurrent unilateral incomplete external and complete internal oculomotor nerve palsy. The episodic palsies were probably caused by an ipsilateral mesencephalic metastasis of a breast carcinoma and occurred after successful brain radiation therapy. Discussion: While the pathogenic mechanism remains unclear, the recurrent sudden onset and disappearance of the palsies and their decreasing frequency after antiepileptic treatment suggest the occurrence of epilepsy-like brainstem seizures. A review of case reports of spontaneous reversible oculomotor nerve palsies is presented. PMID:25104947
The Probabilities of Unique Events
2012-08-30
social justice and also participated in antinuclear demonstrations. The participants ranked the probability that Linda is a feminist bank teller as … investigated them. We propose a new theory (implemented in a computer program) in which such estimates depend on an intuitive non-numerical system capable only of simple procedures, and a deliberative system that maps intuitions into numbers. The theory predicts that estimates of the probabilities of …
Gross, Eliza L.; Low, Dennis J.
2013-01-01
Logistic regression models were created to predict and map the probability of elevated arsenic concentrations in groundwater statewide in Pennsylvania and in three intrastate regions to further improve predictions for those three regions (glacial aquifer system, Gettysburg Basin, Newark Basin). Although the Pennsylvania and regional predictive models retained some different variables, they have common characteristics that can be grouped by (1) geologic and soils variables describing arsenic sources and mobilizers, (2) geochemical variables describing the geochemical environment of the groundwater, and (3) locally specific variables that are unique to each of the three regions studied and not applicable to statewide analysis. Maps of Pennsylvania and the three intrastate regions were produced that illustrate that areas most at risk are those with geology and soils capable of functioning as an arsenic source or mobilizer and geochemical groundwater conditions able to facilitate redox reactions. The models have limitations because they may not characterize areas that have localized controls on arsenic mobility. The probability maps associated with this report are intended for regional-scale use and may not be accurate for use at the field scale or when considering individual wells.
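A logistic regression of the kind used for such probability maps can be sketched in a few lines. Below is a hedged pure-NumPy gradient-descent fit on synthetic data; the predictors `geo` and `do` are hypothetical stand-ins for the report's source/mobilizer and geochemical variables, not its actual model:

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, n_iter=5000):
    """Logistic regression by batch gradient descent on the log-loss.
    Returns weights for X augmented with an intercept column."""
    Xb = np.column_stack([np.ones(len(X)), X])
    w = np.zeros(Xb.shape[1])
    for _ in range(n_iter):
        p = 1.0 / (1.0 + np.exp(-Xb @ w))
        w -= lr * Xb.T @ (p - y) / len(y)
    return w

def predict_proba(w, X):
    """Predicted probability of the outcome (e.g. elevated arsenic)."""
    Xb = np.column_stack([np.ones(len(X)), X])
    return 1.0 / (1.0 + np.exp(-Xb @ w))

# Synthetic wells: probability of "elevated arsenic" rises with a
# standardized source/mobilizer score and falls with dissolved oxygen.
rng = np.random.default_rng(2)
n = 4000
geo = rng.normal(size=n)   # hypothetical geologic/soils score
do = rng.normal(size=n)    # hypothetical dissolved-oxygen score
logit = -1.0 + 1.5 * geo - 1.0 * do
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(float)
w = fit_logistic(np.column_stack([geo, do]), y)
```

Mapping `predict_proba` over a spatial grid of predictor values is what turns such a fitted model into a regional probability map.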
Uncertainty, imprecision, and the precautionary principle in climate change assessment.
Borsuk, M E; Tomassini, L
2005-01-01
Statistical decision theory can provide useful support for climate change decisions made under conditions of uncertainty. However, the probability distributions used to calculate expected costs in decision theory are themselves subject to uncertainty, disagreement, or ambiguity in their specification. This imprecision can be described using sets of probability measures, from which upper and lower bounds on expectations can be calculated. However, many representations, or classes, of probability measures are possible. We describe six of the more useful classes and demonstrate how each may be used to represent climate change uncertainties. When expected costs are specified by bounds, rather than precise values, the conventional decision criterion of minimum expected cost is insufficient to reach a unique decision. Alternative criteria are required, and the criterion of minimum upper expected cost may be desirable because it is consistent with the precautionary principle. Using simple climate and economics models as an example, we determine the carbon dioxide emissions levels that have minimum upper expected cost for each of the selected classes. There can be wide differences in these emissions levels and their associated costs, emphasizing the need for care when selecting an appropriate class.
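For a finite class of probability measures, upper and lower expectations reduce to taking the extremes of ordinary expectations over the class. A toy sketch, assuming (purely for illustration) a class of normal climate-sensitivity densities with mean in [2, 4.5] and a quadratic damage function:

```python
import numpy as np

def expectation_bounds(cost, densities, x):
    """Lower and upper expectation of `cost` over a finite class of
    densities, each tabulated on the grid `x` (Riemann-sum integration)."""
    dx = x[1] - x[0]
    exps = [np.sum(cost(x) * d) * dx for d in densities]
    return min(exps), max(exps)

def normal_pdf(x, mu, sigma=1.0):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Class of measures: N(mu, 1) with mu ranging over [2, 4.5] degrees C.
x = np.linspace(-5, 15, 2001)
densities = [normal_pdf(x, mu) for mu in np.linspace(2.0, 4.5, 26)]
cost = lambda s: s ** 2          # toy convex damage function
lo, hi = expectation_bounds(cost, densities, x)
```

For N(mu, 1) the exact expectation of s^2 is mu^2 + 1, so the bounds here are 5 and 21.25; the minimum-upper-expected-cost criterion the abstract discusses would act on `hi` rather than on any single expectation.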
Students' Understanding of Conditional Probability on Entering University
ERIC Educational Resources Information Center
Reaburn, Robyn
2013-01-01
An understanding of conditional probability is essential for students of inferential statistics as it is used in Null Hypothesis Tests. Conditional probability is also used in Bayes' theorem, in the interpretation of medical screening tests and in quality control procedures. This study examines the understanding of conditional probability of…
Training modalities: impact on endurance capacity.
Flueck, Martin; Eilers, Wouter
2010-03-01
Endurance athletes demonstrate an exceptional resistance to fatigue when exercising at high intensity. Much research has been devoted to the contribution of aerobic capacity to the economy of endurance performance, while important aspects of the fine-tuning of metabolic processes and power output in the endurance athlete have been overlooked. This review addresses how training paradigms exploit bioenergetic pathways in recruited muscle groups to promote the endurance phenotype. A special focus is placed on the genome-mediated mechanisms that underlie the conditioning of fatigue resistance and aerobic performance by training macrocycles and complements. The available data on work-induced muscle plasticity imply that different biologic strategies are exploited in athletic and untrained populations to boost endurance capacity. Olympic champions are probably endowed with a unique constitution that renders the conditioning of endurance capacity for competition particularly efficient. Copyright 2010 Elsevier Inc. All rights reserved.
CPROB: A COMPUTATIONAL TOOL FOR CONDUCTING CONDITIONAL PROBABILITY ANALYSIS
Conditional probability analysis measures the probability of observing one event given that another event has occurred. In an environmental context, conditional probability analysis helps assess the association between an environmental contaminant (i.e. the stressor) and the ec...
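Conditional probability analysis of this kind reduces to a counting estimate of P(impairment | stressor > threshold). A minimal sketch on synthetic site data (the arrays and threshold are invented, not CPROB's actual interface):

```python
import numpy as np

def conditional_probability(stressor, impaired, threshold):
    """P(impaired | stressor > threshold), estimated by counting.

    `stressor`: contaminant measurements; `impaired`: boolean ecological
    responses observed at the same sites."""
    exceed = stressor > threshold
    if not exceed.any():
        return float("nan")
    return impaired[exceed].mean()

# Toy sites: impairment becomes more likely as the stressor rises.
rng = np.random.default_rng(3)
stressor = rng.uniform(0, 10, 5000)
impaired = rng.random(5000) < stressor / 10
p_all = impaired.mean()                                        # ~0.5
p_high = conditional_probability(stressor, impaired, 7.0)      # ~0.85
```

Sweeping `threshold` over the observed stressor range yields the conditional-probability curve such an analysis typically reports.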
From Dualism to Unity in Quantum Physics
NASA Astrophysics Data System (ADS)
Landé, Alfred
2016-02-01
Preface; Introduction; 1. Causality, chance, continuity; 2. States, observables, probabilities; 3. The metric law of probabilities; 4. Quantum dynamics; 5. Quantum fact and fiction; Retrospect. From dualism to unity, from positivism to realism; Appendix 1. Survey of elementary postulates; Appendix 2. Two problems of uniqueness; References; Index.
NASA Astrophysics Data System (ADS)
Little, Crispin T. S.; Herrington, Richard J.; Haymon, Rachel M.; Danelian, Taniel
1999-02-01
The Figueroa massive sulfide deposit, located in Franciscan Complex rocks in the San Rafael Mountains of California, preserves the only known Jurassic hydrothermal vent fossils. The Figueroa fossil assemblage is specimen rich but of low diversity and comprises, in order of decreasing abundance, vestimentiferan worm tubes, the rhynchonellid brachiopod Anarhynchia cf. gabbi and a species of ?nododelphinulid gastropod. The Figueroa fossil organisms lived at a deep-water, high-temperature vent site located on a mid-ocean ridge or seamount at an equatorial latitude. The fossil vent site was then translated northwestward by the motion of the Farallon plate and was subsequently accreted to its present location. An iron-silica exhalite bed, the probable lateral equivalent of the Figueroa deposit, contains abundant filamentous microfossils with two distinct morphologies and probably represents a lower-temperature, diffuse-flow environment. The Figueroa fossil community was subject to the same environmental conditions as modern vent communities, but it is unique among modern and other fossil vent communities in having rhynchonellid brachiopods.
Voting on Embryonic Stem Cell Research: Citizens More Supportive than Politicians.
Stadelmann, David; Torgler, Benno
2017-01-01
As the public debate over stem cell research continues, observable voting behaviour in Switzerland offers a unique opportunity to compare the voting behaviour of politicians with that of voters. By analysing the outcomes of a referendum on a liberal new bill regulating such research, we reveal that the conditional probability of the bill being accepted was about 10 percentage points lower among politicians than among voters. Whereas the behaviour of politicians is driven almost entirely by party affiliation, citizen votes are driven not only by party attachment but also by church attendance: seldom or never attending church increases the probability of bill acceptance by over 15 percentage points, while supporting the Liberal Party or the Social Democratic Party instead of the Christian Democratic Party makes supporting the bill more likely for voters, suggesting that religious observance is important. The observation of these tendencies in Switzerland, an environment that promotes discussion through direct democratic rights, strongly suggests that citizens see the benefits of stem cell research.
NASA Astrophysics Data System (ADS)
Inoue, N.
2017-12-01
The conditional probability of surface rupture is affected by various factors, such as shallow material properties, the earthquake process, ground motions, and so on. Toda (2013) pointed out differences in the conditional probability between strike-slip and reverse faults by considering the fault dip and the width of the seismogenic layer. This study evaluated the conditional probability of surface rupture using the following procedure. Fault geometry was determined from a randomly generated magnitude based on the method of The Headquarters for Earthquake Research Promotion (2017). If the defined fault plane did not saturate the assumed width of the seismogenic layer, the fault plane depth was assigned randomly within the seismogenic layer. Logistic analysis was performed on two data sets: the surface displacement calculated by dislocation methods (Wang et al., 2003) from the defined source fault, and the depth of the top of the defined source fault. The conditional probability estimated from surface displacement was higher for reverse faults than for strike-slip faults, consistent with previous studies (i.e., Kagawa et al., 2004; Kataoka and Kusakabe, 2005). In contrast, the probability estimated from the depth of the source fault was higher for thrust faults than for strike-slip and reverse faults, a trend similar to the conditional probability obtained from PFDHA results (Youngs et al., 2003; Moss and Ross, 2011). The combined simulated results for thrust and reverse faults also show low probability. The worldwide compiled reverse-fault data include low-dip-angle earthquakes. For Japanese reverse faults, by contrast, which include fewer low-dip-angle earthquakes, the conditional probability may be low and similar to that of strike-slip faults (i.e., Takao et al., 2013).
In the future, numerical simulations that take into account the failure condition of the surface caused by the source fault should be performed in order to examine the amount of displacement and the conditional probability quantitatively.
van Lamsweerde, Amanda E; Beck, Melissa R
2015-12-01
In this study, we investigated whether the ability to learn probability information is affected by the type of representation held in visual working memory. Across 4 experiments, participants detected changes to displays of coloured shapes. While participants detected changes in 1 dimension (e.g., colour), a feature from a second, nonchanging dimension (e.g., shape) predicted which object was most likely to change. In Experiments 1 and 3, items could be grouped by similarity in the changing dimension across items (e.g., colours and shapes were repeated in the display), while in Experiments 2 and 4 items could not be grouped by similarity (all features were unique). Probability information from the predictive dimension was learned and used to increase performance, but only when all of the features within a display were unique (Experiments 2 and 4). When it was possible to group by feature similarity in the changing dimension (e.g., 2 blue objects appeared within an array), participants were unable to learn probability information and use it to improve performance (Experiments 1 and 3). The results suggest that probability information can be learned in a dimension that is not explicitly task-relevant, but only when the probability information is represented with the changing dimension in visual working memory. (c) 2015 APA, all rights reserved.
Clonal evolution of colorectal cancer in IBD.
Choi, Chang-Ho R; Bakir, Ibrahim Al; Hart, Ailsa L; Graham, Trevor A
2017-04-01
Optimizing the management of colorectal cancer (CRC) risk in IBD requires a fundamental understanding of the evolutionary process underpinning tumorigenesis. In IBD, clonal evolution begins long before the development of overt neoplasia, and is probably accelerated by the repeated cycles of epithelial wounding and repair that are characteristic of the condition. Here, we review the biological drivers of mutant clone selection in IBD with particular reference to the unique histological architecture of the intestinal epithelium coupled with the inflammatory microenvironment in IBD, and the unique mutation patterns seen in IBD-driven neoplasia when compared with sporadic adenomas and CRC. How these data can be leveraged as evolutionary-based biomarkers to predict cancer risk is discussed, as well as how the efficacy of CRC surveillance programmes and the management of dysplasia can be improved. From a research perspective, the longitudinal surveillance of patients with IBD provides an under-exploited opportunity to investigate the biology of the human gastrointestinal tract over space and time.
Fossil Microorganisms and Formation of Early Precambrian Weathering Profiles
NASA Technical Reports Server (NTRS)
Rozanov, A. Yu; Astafieva, M. M.; Vrevsky, A. B.; Alfimova, N. A.; Matrenichev, V. A.; Hoover, R. B.
2009-01-01
Weathering crusts are the only reliable evidence of the existence of continental conditions. Often they are the only source of information about exogenous processes and, consequently, about the conditions under which the development of the biosphere occurred. A complex of diverse fossil microorganisms was discovered as a result of scanning electron microscope investigations. The chemical composition of the discovered fossils is identical to that of the host rocks and is represented by Si, Al, Fe, Ca, and Mg. The microorganisms fixed in the rocks probably played the role of a catalyst: the decomposition of the minerals comprising the rocks and their transformation into clayey (argillaceous) minerals most likely occurred under the influence of microorganisms. The unique weathering crusts of the Early Precambrian may thus have formed through the interaction between the specific composition of the microorganism assemblage and the conditions of hypergene transformation. It is therefore possible to speak of the colonization of land by microbes already at that time, and of a continuous series from weathering crusts (primitive soils) to true soils.
Excluding joint probabilities from quantum theory
NASA Astrophysics Data System (ADS)
Allahverdyan, Armen E.; Danageozian, Arshag
2018-03-01
Quantum theory does not provide a unique definition for the joint probability of two noncommuting observables, which is the next important question after Born's probability for a single observable. Instead, various definitions were suggested, e.g., via quasiprobabilities or via hidden-variable theories. After reviewing open issues of the joint probability, we relate it to quantum imprecise probabilities, which are noncontextual and are consistent with all constraints expected from a quantum probability. We study two noncommuting observables in a two-dimensional Hilbert space and show that there is no precise joint probability that applies for any quantum state and is consistent with imprecise probabilities. This contrasts with theorems by Bell and Kochen-Specker that exclude joint probabilities for more than two noncommuting observables, in Hilbert spaces with dimension larger than two. If measurement contexts are included into the definition, joint probabilities are not excluded anymore, but they are still constrained by imprecise probabilities.
Optimum space shuttle launch times relative to natural environment
NASA Technical Reports Server (NTRS)
King, R. L.
1977-01-01
The probabilities of favorable and unfavorable weather conditions for launch and landing of the STS under different criteria were computed for every three hours on a yearly basis using 14 years of weather data. These temporal probability distributions were considered for three sets of weather criteria encompassing benign, moderate, and severe weather conditions for both Kennedy Space Center (KSC) and Edwards Air Force Base. In addition, conditional probabilities were computed for unfavorable weather conditions occurring after a delay, which may or may not be due to weather conditions. Also computed for KSC were the probabilities of favorable landing conditions at various times after favorable launch conditions have prevailed. The probabilities were computed to indicate the significance of each weather element to the overall result.
Competitive or weak cooperative stochastic Lotka-Volterra systems conditioned on non-extinction.
Cattiaux, Patrick; Méléard, Sylvie
2010-06-01
We are interested in the long-time behavior of a two-type density-dependent biological population conditioned on non-extinction, in both cases of competition and weak cooperation between the two species. This population is described by a stochastic Lotka-Volterra system, obtained as the limit of renormalized interacting birth and death processes. The weak cooperation assumption allows the system not to blow up. We study the existence and uniqueness of a quasi-stationary distribution, that is, convergence to equilibrium conditioned on non-extinction. To this aim we generalize to two dimensions the spectral tools developed for one-dimensional generalized Feller diffusion processes. The existence proof of a quasi-stationary distribution is reduced to the one for a d-dimensional Kolmogorov diffusion process under a symmetry assumption. The symmetry we need is satisfied under a local balance condition relating the ecological rates. A novelty is the outlined relation between the uniqueness of the quasi-stationary distribution and the ultracontractivity of the killed semi-group. By a comparison between the killing rates for the populations of each type and the one of the global population, we show that the quasi-stationary distribution can be supported either by individuals of one type (the strongest) or by individuals of the two types. We thus highlight two different long-time behaviors depending on the parameters of the model: either the model exhibits an intermediary time scale for which only one type (the dominant trait) is surviving, or there is a positive probability to have coexistence of the two species.
NASA Astrophysics Data System (ADS)
Wang, Sheng; Wang, Linshan; Wei, Tengda
2018-04-01
This paper concerns the dynamics of a stochastic predator-prey system with Markovian switching and Lévy noise. First, the existence and uniqueness of global positive solution to the system is proved. Then, by combining stochastic analytical techniques with M-matrix analysis, sufficient conditions of stochastic permanence and extinction are obtained. Furthermore, for the stochastic permanence case, by means of four constants related to the stationary probability distribution of the Markov chain and the parameters of the subsystems, both the superior limit and the inferior limit of the average in time of the sample path of the solution are estimated. Finally, our conclusions are illustrated through an example.
Drug choice as a self-handicapping strategy in response to noncontingent success.
Berglas, S; Jones, E E
1978-04-01
In two closely related experiments, college student subjects were instructed to choose between a drug that allegedly interfered with performance and a drug that allegedly enhanced performance. This choice was the main dependent measure of the experiment. The drug choice intervened between work on soluble or insoluble problems and a promised retest on similar problems. In Experiment 1, all subjects received success feedback after their initial problem-solving attempts, thus creating one condition in which the success appeared to be accidental (noncontingent on performance) and one in which the success appeared to be contingent on appropriate knowledge. Males in the noncontingent-success condition were alone in preferring the performance-inhibiting drug, presumably because they wished to externalize probable failure on the retest. The predicted effect, however, did not hold for female subjects. Experiment 2 replicated the unique preference shown by males after noncontingent success and showed the critical importance of success feedback.
Ageing of structural materials in tokamaks: TEXTOR liner study
NASA Astrophysics Data System (ADS)
Weckmann, A.; Petersson, P.; Rubel, M.; Fortuna-Zaleśna, E.; Zielinski, W.; Romelczyk-Baishya, B.; Grigore, E.; Ruset, C.; Kreter, A.
2017-12-01
After the final shut-down of the tokamak TEXTOR, all of its machine parts became accessible for comprehensive studies. This unique opportunity enabled the study of the Inconel 625 liner by a wide range of methods. The aim was to evaluate possible alterations of the surface and bulk characteristics of recessed wall elements that may influence machine performance. The surface was covered with stratified layers consisting mainly of boron, carbon, oxygen, and in some cases also silicon. Wall conditioning and limiter materials hence predominantly define deposition on the liner. Deposited layers on recessed wall elements reach micrometre thickness within decades, peel off, and may contribute to the dust inventory in tokamaks. The deuterium content was about 4.7 at.% on average, most probably due to wall conditioning with deuterated gas, while the concentration in the Inconel substrate was very low. Inconel 625 retained its mechanical strength despite 26 years of cyclic heating, stresses and particle bombardment.
The Estimation of Tree Posterior Probabilities Using Conditional Clade Probability Distributions
Larget, Bret
2013-01-01
In this article I introduce the idea of conditional independence of separated subtrees as a principle by which to estimate the posterior probability of trees using conditional clade probability distributions rather than simple sample relative frequencies. I describe an algorithm for these calculations and software which implements these ideas. I show that these alternative calculations are very similar to simple sample relative frequencies for high probability trees but are substantially more accurate for relatively low probability trees. The method allows the posterior probability of unsampled trees to be calculated when these trees contain only clades that are in other sampled trees. Furthermore, the method can be used to estimate the total probability of the set of sampled trees which provides a measure of the thoroughness of a posterior sample. [Bayesian phylogenetics; conditional clade distributions; improved accuracy; posterior probabilities of trees.] PMID:23479066
ERIC Educational Resources Information Center
Satake, Eiki; Amato, Philip P.
2008-01-01
This paper presents an alternative version of formulas of conditional probabilities and Bayes' rule that demonstrate how the truth table of elementary mathematical logic applies to the derivations of the conditional probabilities of various complex, compound statements. This new approach is used to calculate the prior and posterior probabilities…
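The truth-table idea can be sketched by enumerating a joint distribution over the truth-value combinations of two statements and reading conditional probabilities and Bayes' rule directly off the rows; the joint values below are arbitrary illustrative numbers, not the paper's examples:

```python
from fractions import Fraction

# Joint probabilities over the four truth-value combinations of A, B.
joint = {
    (True,  True):  Fraction(3, 10),
    (True,  False): Fraction(1, 10),
    (False, True):  Fraction(2, 10),
    (False, False): Fraction(4, 10),
}

def prob(pred):
    """P(pred), by summing the 'truth table' rows where pred holds."""
    return sum(p for (a, b), p in joint.items() if pred(a, b))

def cond(pred, given):
    """P(pred | given) = P(pred and given) / P(given)."""
    return prob(lambda a, b: pred(a, b) and given(a, b)) / prob(given)

p_b_given_a = cond(lambda a, b: b, lambda a, b: a)   # P(B|A) = 3/4
# Bayes' rule: P(A|B) = P(B|A) P(A) / P(B)
p_a_given_b = p_b_given_a * prob(lambda a, b: a) / prob(lambda a, b: b)
```

Because the rows are exhaustive and exclusive, the Bayes'-rule value agrees exactly with the conditional probability computed directly from the table.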
A Bayesian pick-the-winner design in a randomized phase II clinical trial.
Chen, Dung-Tsa; Huang, Po-Yu; Lin, Hui-Yi; Chiappori, Alberto A; Gabrilovich, Dmitry I; Haura, Eric B; Antonia, Scott J; Gray, Jhanelle E
2017-10-24
Many phase II clinical trials evaluate unique experimental drugs/combinations through multi-arm design to expedite the screening process (early termination of ineffective drugs) and to identify the most effective drug (pick the winner) to warrant a phase III trial. Various statistical approaches have been developed for the pick-the-winner design but have been criticized for lack of objective comparison among the drug agents. We developed a Bayesian pick-the-winner design by integrating a Bayesian posterior probability with Simon two-stage design in a randomized two-arm clinical trial. The Bayesian posterior probability, as the rule to pick the winner, is defined as probability of the response rate in one arm higher than in the other arm. The posterior probability aims to determine the winner when both arms pass the second stage of the Simon two-stage design. When both arms are competitive (i.e., both passing the second stage), the Bayesian posterior probability performs better to correctly identify the winner compared with the Fisher exact test in the simulation study. In comparison to a standard two-arm randomized design, the Bayesian pick-the-winner design has a higher power to determine a clear winner. In application to two studies, the approach is able to perform statistical comparison of two treatment arms and provides a winner probability (Bayesian posterior probability) to statistically justify the winning arm. We developed an integrated design that utilizes Bayesian posterior probability, Simon two-stage design, and randomization into a unique setting. It gives objective comparisons between the arms to determine the winner.
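The Bayesian posterior probability at the heart of this design, P(p1 > p2) under independent Beta priors, is straightforward to estimate by Monte Carlo. The sketch below is a hedged illustration, not the authors' implementation, and the response counts are invented:

```python
import numpy as np

def prob_arm1_better(x1, n1, x2, n2, a=1.0, b=1.0, n_draws=200_000, seed=0):
    """Posterior P(p1 > p2) under independent Beta(a, b) priors and
    binomial likelihoods, estimated by Monte Carlo over the two
    Beta posteriors."""
    rng = np.random.default_rng(seed)
    p1 = rng.beta(a + x1, b + n1 - x1, n_draws)
    p2 = rng.beta(a + x2, b + n2 - x2, n_draws)
    return (p1 > p2).mean()

# Hypothetical trial: 18/40 responders in arm 1 vs 9/40 in arm 2.
win_prob = prob_arm1_better(18, 40, 9, 40)
```

With conjugate Beta posteriors this probability also has a closed form, but Monte Carlo keeps the sketch short; in the design described above, `win_prob` would only be consulted once both arms pass the second stage of the Simon two-stage design.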
NASA Astrophysics Data System (ADS)
DeMarco, Adam Ward
The turbulent motions within the atmospheric boundary layer exist over a wide range of spatial and temporal scales and are very difficult to characterize. Thus, to explore the behavior of such complex flow environments, it is customary to examine their properties from a statistical perspective. Utilizing the probability density functions of velocity and temperature increments, Δu and ΔT, respectively, this work investigates their multiscale behavior to uncover unique traits that have yet to be thoroughly studied. Drawing on diverse datasets, including idealized wind tunnel experiments, atmospheric turbulence field measurements, multi-year ABL tower observations, and mesoscale model simulations, this study reveals remarkable similarities (and some differences) between the small- and larger-scale components of the increment probability density functions. This comprehensive analysis also employs a set of statistical distributions to showcase their ability to capture features of the velocity and temperature increments' probability density functions (pdfs) across multiscale atmospheric motions. An approach is proposed for estimating these pdfs with the maximum likelihood estimation (MLE) technique, which has not previously been applied to atmospheric data. Using this technique, we show that higher-order moments can be estimated accurately with a limited sample size, which has been a persistent concern for atmospheric turbulence research. Using robust goodness-of-fit (GoF) metrics, we quantify the accuracy of the distributions across the diverse datasets. Through this analysis, it is shown that the normal inverse Gaussian (NIG) distribution is a prime candidate for estimating the increment pdf fields. Therefore, using the NIG model and its parameters, we display the variations in the increments over a range of scales, revealing some unique scale-dependent qualities under various stability and flow conditions.
This novel approach can characterize increment fields using only four pdf parameters. We also investigate the capability of current state-of-the-art mesoscale atmospheric models to predict these features and highlight their potential for future model development. With the knowledge gained in this study, a number of applications, including wind energy and optical wave propagation, can benefit from our methodology.
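The MLE fitting step can be sketched with SciPy, whose `scipy.stats.norminvgauss` implements the four-parameter NIG family and whose generic `fit` method performs maximum likelihood estimation. The sample size and parameter values are illustrative, not the study's atmospheric data:

```python
import numpy as np
from scipy import stats

# Synthetic "increments" drawn from a NIG law; a and b are the NIG shape
# parameters, and loc/scale complete the four pdf parameters.
increments = stats.norminvgauss.rvs(a=2.0, b=0.3, loc=0.0, scale=1.0,
                                    size=1500, random_state=42)

# Generic maximum likelihood fit of all four NIG parameters.
a_hat, b_hat, loc_hat, scale_hat = stats.norminvgauss.fit(increments)
print(a_hat, b_hat, loc_hat, scale_hat)
```

The four fitted numbers are exactly the compact characterization the abstract refers to: an entire increment pdf is summarized by one (a, b, loc, scale) tuple per scale.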
Toomey, D E; Yang, K H; Van Ee, C A
2014-01-01
Physical biomechanical surrogates are critical for testing the efficacy of injury-mitigating safety strategies. The interpretation of measured Hybrid III neck loads in test scenarios resulting in compressive loading modes would be aided by a further understanding of the correlation between the mechanical responses in the Hybrid III neck and the probability of injury in the human cervical spine. The anthropomorphic test device (ATD) peak upper and lower neck responses were measured during dynamic compressive loading conditions comparable to those of postmortem human subject (PMHS) experiments. The peak ATD response could then be compared to the PMHS injury outcomes. A Hybrid III 50th percentile ATD head and neck assembly was tested under conditions matching those of male PMHS tests conducted on an inverted drop track. This includes variation in impact plate orientation (4 sagittal plane and 2 frontal plane orientations), impact plate surface friction, and ATD initial head/neck orientation. These unique matched data with known injury outcomes were used to evaluate existing ATD neck injury criteria. The Hybrid III ATD head and neck assembly was found to be robust and repeatable under severe loading conditions. The initial axial force response of the ATD head and neck is very comparable to PMHS experiments up to the point of PMHS cervical column buckle or material failure. An ATD lower neck peak compressive force as low as 6,290 N was associated with an unstable orthopedic cervical injury in a PMHS under equivalent impact conditions. ATD upper neck peak compressive force associated with a 5% probability of unstable cervical orthopedic injury ranged from as low as 3,708 to 3,877 N depending on the initial ATD neck angle. The correlation between peak ATD compressive neck response and PMHS test outcome in the current study resulted in a relationship between axial load and injury probability consistent with the current Hybrid III injury assessment reference values.
The results add to the current understanding of cervical injury probability based on ATD neck compressive loading in that it is the only known study, in addition to Mertz et al. (1978), formulated directly from ATD compressive loading scenarios with known human injury outcomes.
Knapp, Sabine; Kumar, Shashi; Sakurada, Yuri; Shen, Jiajun
2011-05-01
This study uses econometric models to measure the effect of significant wave height and wind strength on the probability of casualty and tests whether these effects have changed over time. While both effects are particularly relevant for stability and strength calculations of vessels, the results are also helpful for the development of ship construction standards in general, to counteract increased risk resulting from changing oceanographic conditions. The authors analyzed a unique dataset of 3.2 million observations from 20,729 individual vessels in the North Atlantic and Arctic regions gathered during the period 1979-2007. The results show that although there is a seasonal pattern in the probability of casualty, especially during the winter months, the effects of wind strength and significant wave height do not follow the same seasonal pattern. Additionally, over time, significant wave height shows an increasing effect in January, March, May and October, while wind strength shows a decreasing effect, especially in January, March and May. The models can be used to simulate and better understand these relationships. This is of particular interest to naval architects and ship designers, as well as multilateral agencies such as the International Maritime Organization (IMO) that establish global standards in ship design and construction. Copyright © 2011 Elsevier Ltd. All rights reserved.
Automatic seed selection for segmentation of liver cirrhosis in laparoscopic sequences
NASA Astrophysics Data System (ADS)
Sinha, Rahul; Marcinczak, Jan Marek; Grigat, Rolf-Rainer
2014-03-01
For computer aided diagnosis based on laparoscopic sequences, image segmentation is one of the basic steps which define the success of all further processing. However, many image segmentation algorithms require prior knowledge which is given by interaction with the clinician. We propose an automatic seed selection algorithm for segmentation of liver cirrhosis in laparoscopic sequences which assigns each pixel a probability of being cirrhotic liver tissue or background tissue. Our approach is based on a trained classifier using SIFT and RGB features with PCA. Due to the unique illumination conditions in laparoscopic sequences of the liver, a very low dimensional feature space can be used for classification via logistic regression. The methodology is evaluated on 718 cirrhotic liver and background patches that are taken from laparoscopic sequences of 7 patients. Using a linear classifier we achieve a precision of 91% in a leave-one-patient-out cross-validation. Furthermore, we demonstrate that with logistic probability estimates, seeds with high certainty of being cirrhotic liver tissue can be obtained. For example, our precision of liver seeds increases to 98.5% if only seeds with more than 95% probability of being liver are used. Finally, these automatically selected seeds can be used as priors in Graph Cuts which is demonstrated in this paper.
A Possible Operational Motivation for the Orthocomplementation in Quantum Structures
NASA Astrophysics Data System (ADS)
D'Hooghe, Bart
2010-11-01
In the foundations of quantum mechanics Gleason’s theorem dictates the uniqueness of the state transition probability via the inner product of the corresponding state vectors in Hilbert space, independent of which measurement context induces this transition. We argue that the state transition probability should not be regarded as a secondary concept which can be derived from the structure on the set of states and properties, but instead should be regarded as a primitive concept for which measurement context is crucial. Accordingly, we adopt an operational approach to quantum mechanics in which a physical entity is defined by the structure of its set of states, set of properties and the possible (measurement) contexts which can be applied to this entity. We put forward some elementary definitions to derive an operational theory from this State-COntext-Property (SCOP) formalism. We show that if the SCOP satisfies a Gleason-like condition, namely that the state transition probability is independent of which measurement context induces the change of state, then the lattice of properties is orthocomplemented, which is one of the ‘quantum axioms’ used in the Piron-Solèr representation theorem for quantum systems. In this sense we obtain a possible physical meaning for the orthocomplementation widely used in quantum structures.
ERIC Educational Resources Information Center
Satake, Eiki; Vashlishan Murray, Amy
2015-01-01
This paper presents a comparison of three approaches to the teaching of probability to demonstrate how the truth table of elementary mathematical logic can be used to teach the calculations of conditional probabilities. Students are typically introduced to the topic of conditional probabilities--especially the ones that involve Bayes' rule--with…
Practical relevance of pattern uniqueness in forensic science.
Jayaprakash, Paul T
2013-09-10
Uniqueness being unprovable, it has recently been argued that individualization in forensic science is irrelevant and that probability, as applied to DNA profiles, should be applied to all identifications. Critiques of uniqueness have omitted physical matching, a realistic and tangible form of individualization that supports uniqueness. Describing case examples illustrating pattern matches, including physical matching, it is argued that individualizations are practically relevant for forensic science because they establish facts on a definitive basis, providing firm leads that benefit criminal investigation. As a tenet of forensic identification, uniqueness forms a fundamental paradigm relevant to individualization. Evidence on the indeterministic and stochastic causal pathways of characteristics in patterns, available in the related fields of science, sufficiently supports the proposition of uniqueness. Characteristics involved in physical matching, and matches achieved in patterned evidence existing in the state of nature, are not events amenable to counting; instead, they are an ensemble of visible units occupying the entire pattern area, stretching the probability of re-occurrence of a verisimilitude pattern toward infinity and offering epistemic support to uniqueness. Observational methods are as respectable as instrumental or statistical methods, since they are capable of generating results that are tangible and obviously valid, as in physical matching. Applying the probabilistic interpretation used for DNA profiles to other patterns would be unbefitting, since the two are disparate: the causal pathways of the events (the loci) in the manipulated DNA profiles are determinable. While uniqueness enables individualizations, it does not vouch for eliminating errors. Instead of dismissing uniqueness and individualization, accepting errors as human or system failures and seeking remedial measures would benefit forensic science practice and criminal investigation.
Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
On Schrödinger's bridge problem
NASA Astrophysics Data System (ADS)
Friedland, S.
2017-11-01
In the first part of this paper we generalize Georgiou-Pavon's result that a positive square matrix can be scaled uniquely to a column stochastic matrix which maps a given positive probability vector to another given positive probability vector. In the second part we prove that a positive quantum channel can be scaled to another positive quantum channel which maps a given positive definite density matrix to another given positive definite density matrix using Brouwer's fixed point theorem. This result proves the Georgiou-Pavon conjecture for two positive definite density matrices, made in their recent paper. We show that the fixed points are unique for certain pairs of positive definite density matrices. Bibliography: 15 titles.
NASA Astrophysics Data System (ADS)
Nieuwenhuizen, Theodorus M.; Kupczynski, Marian
2017-02-01
Ilya Schmelzer recently wrote that Nieuwenhuizen argued that there exists some "contextuality loophole" in Bell's theorem, and that this claim is unjustified. It is made clear that this arose from attaching a meaning to the title and the content of the paper different from the one intended by Nieuwenhuizen. "Contextuality loophole" means only that if the supplementary parameters describing measuring instruments are correctly introduced, Bell and Bell-type inequalities may not be proven. It is also stressed that a hidden variable model suffers from a "contextuality loophole" if it tries to describe different sets of incompatible experiments using a unique probability space and a unique joint probability distribution.
Pavelková Řičánková, Věra; Robovský, Jan; Riegert, Jan
2014-01-01
Pleistocene mammalian communities display unique features which differ from present-day faunas. The paleocommunities were characterized by the extraordinarily large body size of herbivores and predators and by their unique structure consisting of species now inhabiting geographically and ecologically distinct natural zones. These features were probably the result of the unique environmental conditions of ice age ecosystems. To analyze the ecological structure of Last Glacial and Recent mammal communities we classified the species into biome and trophic-size categories, using Principal Component analysis. We found a marked similarity in ecological structure between Recent eastern Altai-Sayan mammalian assemblages and comparable Pleistocene faunas. The composition of Last Glacial and Recent eastern Altai-Sayan assemblages were characterized by the occurrence of large herbivore and predator species associated with steppe, desert and alpine biomes. These three modern biomes harbor most of the surviving Pleistocene mammals. None of the analyzed Palearctic Last Glacial faunas showed affinity to the temperate forest, taiga, or tundra biome. The Eastern part of the Altai-Sayan region could be considered a refugium of the Last Glacial-like mammalian assemblages. Glacial fauna seems to persist up to present in those areas where the forest belt does not separate alpine vegetation from the steppes and deserts. PMID:24454791
Laser radar system for obstacle avoidance
NASA Astrophysics Data System (ADS)
Bers, Karlheinz; Schulz, Karl R.; Armbruster, Walter
2005-09-01
The threat of hostile surveillance and weapon systems requires military aircraft to fly under extreme conditions such as low altitude, high speed, poor visibility and incomplete terrain information. The probability of collision with natural and man-made obstacles during such contour missions is high if detection capability is restricted to conventional vision aids. Forward-looking scanning laser radars, which are built by EADS and presently being flight tested and evaluated at German proving grounds, provide a possible solution, having a large field of view, high angular and range resolution, a high pulse repetition rate, and sufficient pulse energy to register returns from objects at distances of military relevance with a high hit-and-detect probability. The development of advanced 3D scene analysis algorithms has increased the recognition probability and reduced the false alarm rate by using more readily recognizable objects such as terrain, poles, pylons, trees, etc. to generate a parametric description of the terrain surface as well as the class, position, orientation, size and shape of all objects in the scene. The sensor system and the implemented algorithms can be used for other applications such as terrain following, autonomous obstacle avoidance, and automatic target recognition. This paper describes different 3D-imaging ladar sensors with a unique system architecture but different components matched to different military applications. Emphasis is laid on an obstacle warning system with a high probability of detection of thin wires, the real-time processing of the measured range image data, and obstacle classification and visualization.
The Probability Approach to English If-Conditional Sentences
ERIC Educational Resources Information Center
Wu, Mei
2012-01-01
Users of the Probability Approach choose the right one from four basic types of conditional sentences--factual, predictive, hypothetical and counterfactual conditionals, by judging how likely (i.e. the probability) the event in the result-clause will take place when the condition in the if-clause is met. Thirty-three students from the experimental…
PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT
We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...
PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES, AND ECOLOGICAL RISK ASSESSMENT
We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program (EMAP), can be analyzed with a conditional probability analysis (CPA) to conduct quantitative probabi...
Ren, Guo-Jian; Chang, Ze; Xu, Jian; Hu, Zhenpeng; Liu, Yan-Qing; Xu, Yue-Ling; Bu, Xian-He
2016-02-04
A novel decorated metal-organic polyhedron (MOP) based metal-organic framework with a unique 4,9-connected network is successfully constructed, which displays a relatively strong interaction toward H2 and CO2 probably due to the existence of open metal sites in the secondary building units.
Bhattacharya, Moumita; Jurkovitz, Claudine; Shatkay, Hagit
2018-04-12
Patients associated with multiple co-occurring health conditions often face aggravated complications and less favorable outcomes. Co-occurring conditions are especially prevalent among individuals suffering from kidney disease, an increasingly widespread condition affecting 13% of the general population in the US. This study aims to identify and characterize patterns of co-occurring medical conditions in patients employing a probabilistic framework. Specifically, we apply topic modeling in a non-traditional way to find associations across SNOMED-CT codes assigned and recorded in the EHRs of >13,000 patients diagnosed with kidney disease. Unlike most prior work on topic modeling, we apply the method to codes rather than to natural language. Moreover, we quantitatively evaluate the topics, assessing their tightness and distinctiveness, and also assess the medical validity of our results. Our experiments show that each topic is succinctly characterized by a few highly probable and unique disease codes, indicating that the topics are tight. Furthermore, inter-topic distance between each pair of topics is typically high, illustrating distinctiveness. Last, most coded conditions grouped together within a topic, are indeed reported to co-occur in the medical literature. Notably, our results uncover a few indirect associations among conditions that have hitherto not been reported as correlated in the medical literature. Copyright © 2018. Published by Elsevier Inc.
Probability Surveys, Conditional Probability, and Ecological Risk Assessment
We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency’s (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...
The Integrated Medical Model: Statistical Forecasting of Risks to Crew Health and Mission Success
NASA Technical Reports Server (NTRS)
Fitts, M. A.; Kerstman, E.; Butler, D. J.; Walton, M. E.; Minard, C. G.; Saile, L. G.; Toy, S.; Myers, J.
2008-01-01
The Integrated Medical Model (IMM) helps capture and use organizational knowledge across the space medicine, training, operations, engineering, and research domains. The IMM uses this domain knowledge in the context of a mission and crew profile to forecast crew health and mission success risks. The IMM is most helpful in comparing the risk of two or more mission profiles, not as a tool for predicting absolute risk. The process of building the IMM adheres to the Probabilistic Risk Assessment (PRA) techniques described in NASA Procedural Requirement (NPR) 8705.5, and uses current evidence-based information to establish a defensible position for making decisions that help ensure crew health and mission success. The IMM quantitatively describes the following input parameters: 1) medical conditions and likelihood, 2) mission duration, 3) vehicle environment, 4) crew attributes (e.g., age, sex), 5) crew activities (e.g., EVAs, lunar excursions), 6) diagnosis and treatment protocols (e.g., medical equipment, consumables, pharmaceuticals), and 7) Crew Medical Officer (CMO) training effectiveness. It is worth reiterating that the IMM uses the data sets above as inputs. Many other risk management efforts stop at determining only likelihood. The IMM is unique in that it models not only likelihood, but risk mitigations, as well as subsequent clinical outcomes based on those mitigations. Once the mathematical relationships among the above parameters are established, the IMM uses a Monte Carlo simulation technique (a random sampling of the inputs as described by their statistical distribution) to determine the probable outcomes. Because the IMM is a stochastic model (i.e., the input parameters are represented by various statistical distributions depending on the data type), when the mission is simulated 10-50,000 times with a given set of medical capabilities (risk mitigations), a prediction of the most probable outcomes can be generated.
For each mission, the IMM tracks which conditions occurred and decrements the pharmaceuticals and supplies required to diagnose and treat these medical conditions. If supplies are depleted, then the medical condition goes untreated, and crew and mission risk increase. The IMM currently models approximately 30 medical conditions. By the end of FY2008, the IMM will be modeling over 100 medical conditions, approximately 60 of which have been recorded to have occurred during short and long space missions.
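The simulation loop described above, sample which conditions occur, decrement supplies, and flag untreated conditions, can be illustrated with a toy Monte Carlo sketch. The condition names, occurrence probabilities, and supply counts below are invented for illustration and are not IMM data:

```python
import random

def simulate_missions(conditions, supply, n_missions=10000, seed=1):
    """Toy Monte Carlo in the spirit of the IMM: for each simulated
    mission, sample which medical conditions occur, consume the supply
    item needed to treat each occurrence, and count missions in which
    at least one condition goes untreated because supplies ran out."""
    rng = random.Random(seed)
    untreated_missions = 0
    for _ in range(n_missions):
        stock = dict(supply)          # fresh supplies each mission
        untreated = False
        for name, prob, item in conditions:
            if rng.random() < prob:   # condition occurs this mission
                if stock.get(item, 0) > 0:
                    stock[item] -= 1  # treat it, consuming one unit
                else:
                    untreated = True  # supplies exhausted -> added risk
        if untreated:
            untreated_missions += 1
    return untreated_missions / n_missions

conditions = [("headache", 0.8, "analgesic"),
              ("headache_recurrence", 0.5, "analgesic"),
              ("skin_rash", 0.3, "ointment")]
supply = {"analgesic": 1, "ointment": 1}
print(simulate_missions(conditions, supply))
```

Re-running the simulation with different supply manifests is the comparison the IMM is designed for: relative risk between mission profiles, not absolute risk.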
Contextuality in canonical systems of random variables
NASA Astrophysics Data System (ADS)
Dzhafarov, Ehtibar N.; Cervantes, Víctor H.; Kujala, Janne V.
2017-10-01
Random variables representing measurements, broadly understood to include any responses to any inputs, form a system in which each of them is uniquely identified by its content (that which it measures) and its context (the conditions under which it is recorded). Two random variables are jointly distributed if and only if they share a context. In a canonical representation of a system, all random variables are binary, and every content-sharing pair of random variables has a unique maximal coupling (the joint distribution imposed on them so that they coincide with maximal possible probability). The system is contextual if these maximal couplings are incompatible with the joint distributions of the context-sharing random variables. We propose to represent any system of measurements in a canonical form and to consider the system contextual if and only if its canonical representation is contextual. As an illustration, we establish a criterion for contextuality of the canonical system consisting of all dichotomizations of a single pair of content-sharing categorical random variables. This article is part of the themed issue `Second quantum revolution: foundational questions'.
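For two binary (Bernoulli) random variables, the maximal coupling mentioned above has a simple closed form: the agreement probability P(X = Y) can be pushed up to one minus the total-variation distance |p - q| between the marginals. A minimal sketch with illustrative marginals:

```python
def maximal_coupling(p, q):
    """Joint pmf on {0, 1}^2 coupling Bernoulli(p) and Bernoulli(q) so
    that P(X = Y) is maximal, namely 1 - |p - q| (one minus the
    total-variation distance between the two marginals)."""
    j11 = min(p, q)            # put as much mass as possible on agreement
    j00 = min(1 - p, 1 - q)
    j10 = max(p - q, 0.0)      # leftover mass: X = 1, Y = 0
    j01 = max(q - p, 0.0)      # leftover mass: X = 0, Y = 1
    return {(0, 0): j00, (0, 1): j01, (1, 0): j10, (1, 1): j11}

joint = maximal_coupling(0.7, 0.4)
# The marginals are preserved, and P(X = Y) attains 1 - |0.7 - 0.4| = 0.7.
print(joint[(0, 0)] + joint[(1, 1)])
```

A canonical system is then contextual exactly when these pairwise maximal couplings cannot coexist with the joint distributions imposed within each context.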
Ambiguity Aversion in Rhesus Macaques
Hayden, Benjamin Y.; Heilbronner, Sarah R.; Platt, Michael L.
2010-01-01
People generally prefer risky options, which have fully specified outcome probabilities, to ambiguous options, which have unspecified probabilities. This preference, formalized in economics, is strong enough that people will reliably prefer a risky option to an ambiguous option with a greater expected value. Explanations for ambiguity aversion often invoke uniquely human faculties like language, self-justification, or a desire to avoid public embarrassment. Challenging these ideas, here we demonstrate that a preference for unambiguous options is shared with rhesus macaques. We trained four monkeys to choose between pairs of options that both offered explicitly cued probabilities of large and small juice outcomes. We then introduced occasional trials where one of the options was obscured and examined their resulting preferences; we ran humans in a parallel experiment on a nearly identical task. We found that monkeys reliably preferred risky options to ambiguous ones, even when this bias was costly, closely matching the behavior of humans in the analogous task. Notably, ambiguity aversion varied parametrically with the extent of ambiguity. As expected, ambiguity aversion gradually declined as monkeys learned the underlying probability distribution of rewards. These data indicate that ambiguity aversion reflects fundamental cognitive biases shared with other animals rather than uniquely human factors guiding decisions. PMID:20922060
A quantum-implementable neural network model
NASA Astrophysics Data System (ADS)
Chen, Jialin; Wang, Lingli; Charbon, Edoardo
2017-10-01
A quantum-implementable neural network, namely the quantum probability neural network (QPNN) model, is proposed in this paper. QPNN can use quantum parallelism to trace all possible network states to improve the result. Due to its unique quantum nature, this model is robust to several quantum noises under certain conditions and can be efficiently implemented by the qubus quantum computer. Another advantage is that QPNN can be used as memory to retrieve the most relevant data and even to generate new data. The MATLAB experimental results on Iris data classification and MNIST handwriting recognition show that far fewer neuron resources are required in QPNN than in a classical feedforward neural network to obtain a good result. The proposed QPNN model indicates that quantum effects are useful for real-life classification tasks.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boyack, B.E.; Steiner, J.L.; Harmony, S.C.
The PIUS advanced reactor is a 640-MWe pressurized water reactor developed by Asea Brown Boveri (ABB). A unique feature of the PIUS concept is the absence of mechanical control and shutdown rods. Reactivity is normally controlled by coolant boron concentration and the temperature of the moderator coolant. ABB submitted the PIUS design to the US Nuclear Regulatory Commission (NRC) for preapplication review, and Los Alamos supported the NRC's review effort. Baseline analyses of small-break initiators at two locations were performed with the system neutronic and thermal-hydraulic analysis code TRAC-PF1/MOD2. In addition, sensitivity studies were performed to explore the robustness of the PIUS concept to severe off-normal conditions having a very low probability of occurrence.
The estimation of tree posterior probabilities using conditional clade probability distributions.
Larget, Bret
2013-07-01
In this article I introduce the idea of conditional independence of separated subtrees as a principle by which to estimate the posterior probability of trees using conditional clade probability distributions rather than simple sample relative frequencies. I describe an algorithm for these calculations and software which implements these ideas. I show that these alternative calculations are very similar to simple sample relative frequencies for high probability trees but are substantially more accurate for relatively low probability trees. The method allows the posterior probability of unsampled trees to be calculated when these trees contain only clades that are in other sampled trees. Furthermore, the method can be used to estimate the total probability of the set of sampled trees which provides a measure of the thoroughness of a posterior sample.
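The estimation idea, multiplying conditional clade (split) frequencies rather than counting whole-tree relative frequencies, can be sketched on a toy four-taxon sample. The nested-tuple tree encoding and the sample below are illustrative, not Larget's software:

```python
from collections import Counter
from fractions import Fraction

def leaf_set(tree):
    """Leaves under a node; a rooted tree is a leaf name or a pair of subtrees."""
    if isinstance(tree, str):
        return frozenset([tree])
    return leaf_set(tree[0]) | leaf_set(tree[1])

def splits(tree):
    """All (parent clade, split) pairs in a nested-tuple rooted tree."""
    if isinstance(tree, str):
        return []
    parent = leaf_set(tree)
    split = frozenset({leaf_set(tree[0]), leaf_set(tree[1])})
    return [(parent, split)] + splits(tree[0]) + splits(tree[1])

def ccd_probability(tree, sample):
    """Estimate the posterior probability of `tree` as a product of
    conditional clade (split) frequencies observed in `sample`, rather
    than as the raw relative frequency of the whole tree."""
    clade_counts, split_counts = Counter(), Counter()
    for t in sample:
        for parent, split in splits(t):
            clade_counts[parent] += 1
            split_counts[(parent, split)] += 1
    prob = Fraction(1)
    for parent, split in splits(tree):
        prob *= Fraction(split_counts[(parent, split)], clade_counts[parent])
    return prob

T1 = ((('A', 'B'), 'C'), 'D')
T2 = (('A', ('B', 'C')), 'D')
T3 = ((('A', 'C'), 'B'), 'D')
sample = [T1, T1, T2, T3]
# Root split occurs 4/4 times, the {A,B,C} -> {A,B}|{C} split 2/4 times,
# and the {A,B} -> {A}|{B} split 2/2 times, so P(T1) = 1/2.
print(ccd_probability(T1, sample))
```

Because the product only requires each clade (not the whole tree) to have been sampled, the same function assigns positive probability to trees absent from the sample, which is the key advantage over simple relative frequencies.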
Link, W.A.; Armitage, Peter; Colton, Theodore
1998-01-01
Unbiasedness is probably the best known criterion for evaluating the performance of estimators. This note describes unbiasedness, demonstrating various failings of the criterion. It is shown that unbiased estimators might not exist, or might not be unique; an example of a unique but clearly unacceptable unbiased estimator is given. It is shown that unbiased estimators are not translation invariant. Various alternative criteria are described, and are illustrated through examples.
Hydraulic Conductivity Estimation using Bayesian Model Averaging and Generalized Parameterization
NASA Astrophysics Data System (ADS)
Tsai, F. T.; Li, X.
2006-12-01
Non-uniqueness of the parameterization scheme is an inherent problem in groundwater inverse modeling due to limited data. To cope with this non-uniqueness, we introduce a Bayesian Model Averaging (BMA) method to integrate a set of selected parameterization methods. The estimation uncertainty in BMA includes the uncertainty in individual parameterization methods, as the within-parameterization variance, and the uncertainty from using different parameterization methods, as the between-parameterization variance. Moreover, the generalized parameterization (GP) method is considered in the geostatistical framework in this study. The GP method aims at increasing the flexibility of parameterization through the combination of a zonation structure and an interpolation method. The use of BMA with GP avoids over-confidence in a single parameterization method. A normalized least-squares estimation (NLSE) is adopted to calculate the posterior probability for each GP. We employ the adjoint state method for the sensitivity analysis of the weighting coefficients in the GP method. The adjoint state method is also applied to the NLSE problem. The proposed methodology is applied to the Alamitos Barrier Project (ABP) in California, where the spatially distributed hydraulic conductivity is estimated. The optimal weighting coefficients embedded in GP are identified through maximum likelihood estimation (MLE), where the misfits between the observed and calculated groundwater heads are minimized. The conditional mean and conditional variance of the estimated hydraulic conductivity distribution using BMA are obtained to assess the estimation uncertainty.
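The BMA variance decomposition described above, weighted within-parameterization variance plus the between-parameterization spread of the means, can be sketched as follows; the posterior weights and per-parameterization estimates are invented for illustration:

```python
def bma_combine(means, variances, weights):
    """Combine per-parameterization estimates by Bayesian Model Averaging.

    Returns the BMA mean plus the two uncertainty components: the
    weighted within-parameterization variance and the between-
    parameterization variance (weighted spread of the means)."""
    assert abs(sum(weights) - 1.0) < 1e-9, "posterior weights must sum to 1"
    mean = sum(w * m for w, m in zip(weights, means))
    within = sum(w * v for w, v in zip(weights, variances))
    between = sum(w * (m - mean) ** 2 for w, m in zip(weights, means))
    return mean, within, between

# Hypothetical log10-conductivity estimates from three parameterizations.
mean, within, between = bma_combine(
    means=[-3.1, -2.8, -3.4],
    variances=[0.04, 0.09, 0.05],
    weights=[0.5, 0.3, 0.2])
print(mean, within + between)
```

The between-parameterization term is exactly what a single-parameterization analysis omits, which is why BMA guards against over-confidence in any one scheme.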
Zhong, Qing; Chen, Qi-Yue; Li, Ping; Xie, Jian-Wei; Wang, Jia-Bin; Lin, Jian-Xian; Lu, Jun; Cao, Long-Long; Lin, Mi; Tu, Ru-Hong; Zheng, Chao-Hui; Huang, Chang-Ming
2018-04-20
The dynamic prognosis of patients who have undergone curative surgery for gastric cancer has yet to be reported. Our objective was to devise an accurate tool for predicting the conditional probability of survival for these patients. We analyzed 11,551 gastric cancer patients from the Surveillance, Epidemiology, and End Results database. Two-thirds of the patients were selected randomly for the development set and one-third for the validation set. Two nomograms were constructed to predict the conditional probability of overall survival and the conditional probability of disease-specific survival, using conditional survival methods. We then applied these nomograms to the 4,001 patients in the database from Fujian Medical University Union Hospital, Fuzhou, China, one of the most active Chinese institutes. The 5-year conditional probability of overall survival of the patients was 41.6% immediately after resection and increased to 52.8%, 68.2%, and 80.4% at 1, 2, and 3 years after gastrectomy. The 5-year conditional probability of disease-specific survival "increased" from 48.9% at the time of gastrectomy to 59.8%, 74.7%, and 85.5% for patients surviving 1, 2, and 3 years, respectively. Sex; race; age; depth of tumor invasion; lymph node metastasis; and tumor size, site, and grade were associated with overall survival and disease-specific survival (P <.05). Within the Surveillance, Epidemiology, and End Results validation set, the accuracy of the conditional probability of overall survival nomogram was 0.77, 0.81, 0.82, and 0.82 at 1, 3, 5, and 10 years after gastrectomy, respectively. Within the other validation set from the Fujian Medical University Union Hospital (n = 4,001), the accuracy of the conditional probability of overall survival nomogram was 0.76, 0.79, 0.77, and 0.77 at 1, 3, 5, and 10 years, respectively. The accuracy of the conditional probability of disease-specific survival model was also favorable. 
The calibration curve demonstrated good agreement between the predicted and observed survival rates. Based on the large Eastern and Western data sets, we developed and validated the first conditional nomogram for prediction of conditional probability of survival for patients with gastric cancer to allow consideration of the duration of survivorship. Copyright © 2018 Elsevier Inc. All rights reserved.
Option volatility and the acceleration Lagrangian
NASA Astrophysics Data System (ADS)
Baaquie, Belal E.; Cao, Yang
2014-01-01
This paper develops a volatility formula for an option on an asset from an acceleration Lagrangian model and calibrates the formula with market data. The Black-Scholes model is a simpler special case with a velocity-dependent Lagrangian. The acceleration Lagrangian is defined, and the classical solution of the system in Euclidean time is obtained by choosing appropriate boundary conditions. The conditional probability distribution of the final position given the initial position is obtained from the transition amplitude, and the volatility is the standard deviation of this conditional distribution. Using the conditional probability and the path integral method, the martingale condition is applied to fix one of the parameters in the Lagrangian, and the call option price is then obtained.
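The pricing logic in the abstract, obtaining the conditional (transition) probability of the final price, imposing the martingale condition, then pricing the call as a discounted expectation, can be illustrated in the simpler Black-Scholes (velocity-Lagrangian) special case. A minimal Monte Carlo sketch, with all parameter values hypothetical:

```python
import numpy as np

def bs_call_mc(S0, K, r, sigma, T, n=200_000, seed=0):
    """Call price as the discounted expectation of the payoff under the
    risk-neutral conditional (transition) density of the final price."""
    rng = np.random.default_rng(seed)
    z = rng.standard_normal(n)
    # Lognormal terminal price; the drift r - sigma^2/2 enforces the
    # martingale condition on the discounted price process.
    ST = S0 * np.exp((r - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * z)
    return np.exp(-r * T) * np.maximum(ST - K, 0.0).mean()

price = bs_call_mc(S0=100.0, K=100.0, r=0.05, sigma=0.2, T=1.0)
```

In the acceleration-Lagrangian model the transition density is no longer lognormal, but the same two steps (fix parameters by the martingale condition, then take the discounted expectation) carry over.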
Bivariate normal, conditional and rectangular probabilities: A computer program with applications
NASA Technical Reports Server (NTRS)
Swaroop, R.; Brownlow, J. D.; Ashworth, G. R.; Winter, W. R.
1980-01-01
Some results for bivariate normal distribution analysis are presented. Computer programs for conditional normal probabilities, marginal probabilities, and joint probabilities over rectangular regions are given; routines for computing fractile points and distribution functions are also presented. Some examples from a closed-circuit television experiment are included.
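The conditional and rectangular probabilities described can be reproduced with modern libraries. A minimal SciPy sketch (the function names and parameter values are illustrative, not the routines from the original program):

```python
import numpy as np
from scipy.stats import norm, multivariate_normal

def conditional_prob(y_hi, x, mx, my, sx, sy, rho):
    """P(Y <= y_hi | X = x) for a bivariate normal: the conditional is
    normal with shifted mean and reduced variance."""
    m = my + rho * sy / sx * (x - mx)
    s = sy * np.sqrt(1 - rho**2)
    return norm.cdf(y_hi, loc=m, scale=s)

def rectangle_prob(a, b, c, d, mx, my, sx, sy, rho):
    """P(a <= X <= b, c <= Y <= d) by inclusion-exclusion on the joint CDF."""
    cov = [[sx**2, rho * sx * sy], [rho * sx * sy, sy**2]]
    F = lambda x, y: multivariate_normal.cdf([x, y], mean=[mx, my], cov=cov)
    return F(b, d) - F(a, d) - F(b, c) + F(a, c)

p_cond = conditional_prob(0.0, x=1.0, mx=0, my=0, sx=1, sy=1, rho=0.6)
p_rect = rectangle_prob(-1, 1, -1, 1, 0, 0, 1, 1, 0.0)
```

With rho = 0 the rectangle probability factors into the product of two marginal interval probabilities, a useful sanity check.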
Han, Min Kyung; Storkel, Holly L; Lee, Jaehoon; Cox, Casey
2016-11-01
Noisy conditions make auditory processing difficult. This study explores whether noisy conditions influence the effects of phonotactic probability (the likelihood of occurrence of a sound sequence) and neighborhood density (phonological similarity among words) on adults' word learning. Fifty-eight adults learned nonwords varying in phonotactic probability and neighborhood density in either an unfavorable (0-dB signal-to-noise ratio [SNR]) or a favorable (+8-dB SNR) listening condition. Word learning was assessed using a picture naming task by scoring the proportion of phonemes named correctly. The unfavorable 0-dB SNR condition showed a significant interaction between phonotactic probability and neighborhood density in the absence of main effects. In particular, adults learned more words when phonotactic probability and neighborhood density were both low or both high. The +8-dB SNR condition did not show this interaction. These results are inconsistent with those from a prior adult word learning study conducted under quiet listening conditions that showed main effects of word characteristics. As the listening condition worsens, adult word learning benefits from a convergence of phonotactic probability and neighborhood density. Clinical implications are discussed for potential populations who experience difficulty with auditory perception or processing, making them more vulnerable to noise.
Tian, Haoting; Gao, Juan; Li, Hui; Boyd, Stephen A.; Gu, Cheng
2016-01-01
Here we describe a unique process that achieves complete defluorination and decomposition of perfluorinated compounds (PFCs) which comprise one of the most recalcitrant and widely distributed classes of toxic pollutant chemicals found in natural environments. Photogenerated hydrated electrons derived from 3-indole-acetic-acid within an organomodified clay induce the reductive defluorination of co-sorbed PFCs. The process proceeds to completion within a few hours under mild reaction conditions. The organomontmorillonite clay promotes the formation of highly reactive hydrated electrons by stabilizing indole radical cations formed upon photolysis, and prevents their deactivation by reaction with protons or oxygen. In the constrained interlayer regions of the clay, hydrated electrons and co-sorbed PFCs are brought in close proximity thereby increasing the probability of reaction. This novel green chemistry provides the basis for in situ and ex situ technologies to treat one of the most troublesome, recalcitrant and ubiquitous classes of environmental contaminants, i.e., PFCs, utilizing innocuous reagents, naturally occurring materials and mild reaction conditions. PMID:27608658
ERIC Educational Resources Information Center
Erickson, Tim
2017-01-01
Understanding a Bayesian perspective demands comfort with conditional probability and with probabilities that appear to change as we acquire additional information. This paper suggests a simple context in conditional probability that helps develop the understanding students would need for a successful introduction to Bayesian reasoning.
Brand, Matthias; Schiebener, Johannes; Pertl, Marie-Theres; Delazer, Margarete
2014-01-01
Recent models on decision making under risk conditions have suggested that numerical abilities are important ingredients of advantageous decision-making performance, but empirical evidence is still limited. The results of our first study show that logical reasoning and basic mental calculation capacities predict ratio processing and that ratio processing predicts decision making under risk. In the second study, logical reasoning together with executive functions predicted probability processing (numeracy and probability knowledge), and probability processing predicted decision making under risk. These findings suggest that increasing an individual's understanding of ratios and probabilities should lead to more advantageous decisions under risk conditions.
Laser velocimetry measurements in a gas turbine research combustor
NASA Technical Reports Server (NTRS)
Driscoll, J. F.; Pelaccio, D. G.
1979-01-01
The effects of turbulence on the production of pollutant species in a gas-turbine research combustor are studied using laser Doppler velocimetry (LDV) techniques. Measurements made in the primary combustion zone include mean velocity, rms velocity fluctuations, velocity probability distributions, and autocorrelation functions. A unique combustor design provides relatively uniform flow conditions and independent control of drop size, equivalence ratio, inlet temperature, and combustor pressure. Parameters which characterize the nature of the spray combustion (i.e., whether single-droplet or group combustion occurs) were determined from the LDV data. Turbulent diffusivity (eddy viscosity) reaches a value of 2930 sq cm/sec, corresponding to a convective integral length scale of 1.8 cm. The group combustion number, based on turbulent diffusivity, is measured to be 6.2.
Stochastic control system parameter identifiability
NASA Technical Reports Server (NTRS)
Lee, C. H.; Herget, C. J.
1975-01-01
The parameter identification problem of general discrete time, nonlinear, multiple input/multiple output dynamic systems with Gaussian white distributed measurement errors is considered. The system parameterization was assumed to be known. Concepts of local parameter identifiability and local constrained maximum likelihood parameter identifiability were established. A set of sufficient conditions for the existence of a region of parameter identifiability was derived. A computation procedure employing interval arithmetic was provided for finding the regions of parameter identifiability. If the vector of the true parameters is locally constrained maximum likelihood (CML) identifiable, then with probability one, the vector of true parameters is a unique maximal point of the maximum likelihood function in the region of parameter identifiability and the constrained maximum likelihood estimation sequence will converge to the vector of true parameters.
NASA Astrophysics Data System (ADS)
Southworth, B. S.; Kempf, S.; Schmidt, J.
2015-12-01
The discovery of Jupiter's moon Europa maintaining a probably sporadic water vapor plume constitutes a huge scientific opportunity for NASA's upcoming mission to this Galilean moon. Measuring properties of material emerging from interior sources offers a unique chance to understand conditions at Europa's subsurface ocean. Exploiting results obtained for the Enceladus plume, we simulate possible Europa plume configurations, analyze particle number density and surface deposition results, and estimate the expected flux of ice grains on a spacecraft. Due to Europa's high escape speed, observing an active plume will require low-altitude flybys, preferably at altitudes of 5-100 km. At higher altitudes a plume may escape detection. Our simulations provide an extensive library documenting the possible structure of Europa dust plumes, which can be quickly refined as more data on Europa dust plumes are collected.
Yeah Right! Adolescents in the Classroom. Building Success through Better Behaviour Series
ERIC Educational Resources Information Center
Long, Rob
2005-01-01
Is there more disruptive behaviour in schools today? The simple answer to this often asked question is probably yes. But the reasons lie more outside teenagers than inside. For too many teachers there can be an attitude of: "I was a teenager once, therefore I know what it is like." We all develop in a unique time and the issues are unique to that…
Estimating avian population size using Bowden's estimator
Diefenbach, D.R.
2009-01-01
Avian researchers often uniquely mark birds, and multiple estimators could be used to estimate population size using individually identified birds. However, most estimators of population size require that all sightings of marked birds be uniquely identified, and many assume homogeneous detection probabilities. Bowden's estimator can incorporate sightings of marked birds that are not uniquely identified and relax assumptions required of other estimators. I used computer simulation to evaluate the performance of Bowden's estimator for situations likely to be encountered in bird studies. When the assumptions of the estimator were met, abundance and variance estimates and confidence-interval coverage were accurate. However, precision was poor for small population sizes (N < 50) unless a large percentage of the population was marked (>75%) and multiple (≥8) sighting surveys were conducted. If additional birds are marked after sighting surveys begin, it is important to initially mark a large proportion of the population (pm ≥ 0.5 if N ≤ 100 or pm > 0.1 if N ≥ 250) and minimize sightings in which birds are not uniquely identified; otherwise, most population estimates will be overestimated by >10%. Bowden's estimator can be useful for avian studies because birds can be resighted multiple times during a single survey, not all sightings of marked birds have to uniquely identify individuals, detection probabilities among birds can vary, and the complete study area does not have to be surveyed. I provide computer code for use with pilot data to design mark-resight surveys to meet desired precision for abundance estimates.
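A toy simulation in the spirit of the study can show why a mark-resight estimate behaves well when a large fraction of the population is marked and several surveys are conducted. The estimator below is a simplified mean-resight-rate estimate, not Bowden's full estimator (which adds a variance correction and handles non-unique sightings more carefully), and all numbers are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(42)
N, M, k = 200, 100, 8   # true population size, marked animals, sighting surveys

def one_estimate():
    # Heterogeneous detection: each animal has its own sighting probability,
    # relaxing the homogeneity assumption of many other estimators.
    p = rng.uniform(0.2, 0.6, N)
    sightings = rng.binomial(k, p)   # times each animal is seen over k surveys
    marked = sightings[:M]           # marked animals are a random subset
    # Total sightings divided by the per-capita sighting rate of marked animals
    return sightings.sum() / marked.mean()

estimates = np.array([one_estimate() for _ in range(300)])
```

Because the marked animals are a random sample, their mean sighting rate estimates the population-wide rate even under heterogeneous detection, which is the key property the abstract highlights.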
An Inverse Problem for a Class of Conditional Probability Measure-Dependent Evolution Equations
Mirzaev, Inom; Byrne, Erin C.; Bortz, David M.
2016-01-01
We investigate the inverse problem of identifying a conditional probability measure in measure-dependent evolution equations arising in size-structured population modeling. We formulate the inverse problem as a least squares problem for the probability measure estimation. Using the Prohorov metric framework, we prove existence and consistency of the least squares estimates and outline a discretization scheme for approximating a conditional probability measure. For this scheme, we prove general method stability. The work is motivated by Partial Differential Equation (PDE) models of flocculation for which the shape of the post-fragmentation conditional probability measure greatly impacts the solution dynamics. To illustrate our methodology, we apply the theory to a particular PDE model that arises in the study of population dynamics for flocculating bacterial aggregates in suspension, and provide numerical evidence for the utility of the approach. PMID:28316360
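The discretization idea, approximating the unknown probability measure by nonnegative weights on a grid of Dirac masses and fitting them by least squares, can be sketched on a toy moment-matching problem. This illustrates the general scheme only, not the authors' PDE setting; the grid, the moment data, and the "true" measure are invented:

```python
import numpy as np
from scipy.optimize import nnls

# Toy forward map: observed data are moments d_j = integral of x^j dP(x),
# j = 0..4, generated from a "true" measure with masses at 0.3 and 0.7.
grid = np.linspace(0, 1, 21)                 # support points of the Dirac approximation
A = np.vstack([grid**j for j in range(5)])   # moment operator on the grid
w_true = np.zeros_like(grid)
w_true[6] = 0.4    # mass 0.4 at x = 0.30
w_true[14] = 0.6   # mass 0.6 at x = 0.70
d = A @ w_true

# Nonnegative least squares with a heavily weighted extra row enforcing sum(w) = 1,
# so the fitted weights form a probability measure.
lam = 1e3
A_aug = np.vstack([A, lam * np.ones_like(grid)])
d_aug = np.append(d, lam * 1.0)
w_hat, _ = nnls(A_aug, d_aug)
```

The fitted measure reproduces the observed moments; in particular its mean matches the true mean 0.4*0.3 + 0.6*0.7 = 0.54.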
NASA Technical Reports Server (NTRS)
Benavides J. A.; Huchard, E.; Pettorelli, N.; King, A. J.; Brown, M. E.; Archer, C. E.; Appleton, C. C.; Raymond, M.; Cowlishaw, G.
2011-01-01
Host parasite diversity plays a fundamental role in ecological and evolutionary processes, yet the factors that drive it are still poorly understood. A variety of processes, operating across a range of spatial scales, are likely to influence both the probability of parasite encounter and subsequent infection. Here, we explored eight possible determinants of parasite richness, comprising rainfall and temperature at the population level, ranging behavior and home range productivity at the group level, and age, sex, body condition, and social rank at the individual level. We used a unique dataset describing gastrointestinal parasites in a terrestrial subtropical vertebrate (chacma baboons, Papio ursinus), comprising 662 faecal samples from 86 individuals representing all age-sex classes across two groups over two dry seasons in a desert population. Three mixed models were used to identify the most important factor at each of the three spatial scales (population, group, individual); these were then standardised and combined in a single, global, mixed model. Individual age had the strongest influence on parasite richness, in a convex relationship. Parasite richness was also higher in females and animals in poor condition, albeit at a lower order of magnitude than age. Finally, with a further halving of effect size, parasite richness was positively correlated to day range and temperature. These findings indicate that a range of factors influence host parasite richness through both encounter and infection probabilities, but that individual-level processes may be more important than those at the group or population level.
NASA Astrophysics Data System (ADS)
Krechowicz, Maria
2017-10-01
Nowadays, one of the characteristic features of the construction industry is the increased complexity of a growing number of projects. Almost every construction project is unique: it has its project-specific purpose, its own structural complexity, owner's expectations, ground conditions unique to a certain location, and its own dynamics. Failure costs and costs resulting from unforeseen problems in complex construction projects are very high, and project complexity drivers pose many threats to successful completion. This paper discusses the process of effective risk management in complex construction projects in which renewable energy sources were used, on the example of the realization phase of the ENERGIS teaching-laboratory building, from the point of view of DORBUD S.A., its general contractor. The paper suggests a new approach to risk management for such projects. The risk management process was divided into six stages: gathering information; identification of the top critical project risks resulting from project complexity; construction of a fault tree for each top critical risk; logical analysis of the fault tree; quantitative risk assessment applying fuzzy logic; and development of a risk response strategy. A new methodology for the qualitative and quantitative assessment of top critical risks in complex construction projects was developed, and risk assessment was carried out by applying fuzzy fault tree analysis to one top critical risk as an example. Applying fuzzy set theory to the proposed model reduced uncertainty and eliminated the difficulty of obtaining crisp probability values for basic events, a common problem when experts are asked to assign an exact probability score to each unwanted event.
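The fuzzy fault-tree arithmetic can be sketched with triangular fuzzy probabilities, where AND gates multiply and OR gates complement component-wise (a standard approximation for triangular fuzzy numbers). The events and values below are hypothetical, not taken from the ENERGIS project:

```python
import numpy as np

# Triangular fuzzy probabilities (low, mode, high) for basic events,
# as an expert might supply instead of a single crisp value.
leak   = np.array([0.01, 0.05, 0.10])
sensor = np.array([0.02, 0.04, 0.08])
pump   = np.array([0.01, 0.02, 0.05])

def AND(*events):
    """Fuzzy AND gate: component-wise product of the triangles."""
    out = np.ones(3)
    for e in events:
        out = out * e
    return out

def OR(*events):
    """Fuzzy OR gate: component-wise 1 - product of complements."""
    out = np.ones(3)
    for e in events:
        out = out * (1 - e)
    return 1 - out

# Top event: (leak AND sensor failure) OR pump failure
top = OR(AND(leak, sensor), pump)
crisp = top.mean()   # defuzzify via the centroid of the triangle
```

The centroid defuzzification at the end turns the fuzzy top-event probability back into a single risk score for ranking.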
Absolute continuity for operator valued completely positive maps on C∗-algebras
NASA Astrophysics Data System (ADS)
Gheondea, Aurelian; Kavruk, Ali Şamil
2009-02-01
Motivated by applicability to quantum operations, quantum information, and quantum probability, we investigate the notion of absolute continuity for operator valued completely positive maps on C∗-algebras, previously introduced by Parthasarathy [in Athens Conference on Applied Probability and Time Series Analysis I (Springer-Verlag, Berlin, 1996), pp. 34-54]. We obtain an intrinsic definition of absolute continuity, we show that the Lebesgue decomposition defined by Parthasarathy is the maximal one among all other Lebesgue-type decompositions and that this maximal Lebesgue decomposition does not depend on the jointly dominating completely positive map, we obtain more flexible formulas for calculating the maximal Lebesgue decomposition, and we point out the nonuniqueness of the Lebesgue decomposition as well as a sufficient condition for uniqueness. In addition, we consider Radon-Nikodym derivatives for absolutely continuous completely positive maps that, in general, are unbounded positive self-adjoint operators affiliated to a certain von Neumann algebra, and we obtain a spectral approximation by bounded Radon-Nikodym derivatives. An application to the existence of the infimum of two completely positive maps is indicated, and formulas in terms of Choi's matrices for the Lebesgue decomposition of completely positive maps in matrix algebras are obtained.
2016-03-01
...cyclone. THORPEX: The Observing System Research and Predictability Experiment. TIGGE: THORPEX Interactive Grand Global Ensemble. TS: tropical storm. ...forecast possible, but also relay the level of uncertainty unique to a given storm. This will better inform decision makers to help protect all assets at... for any given storm. Thus, the probabilities may increase or decrease (and the probability swath may widen or narrow) to provide a more
Koerner, Naomi; Mejia, Teresa; Kusec, Andrea
2017-03-01
A number of studies have examined the association of intolerance of uncertainty (IU) to trait worry and generalized anxiety disorder (GAD). However, few studies have examined the extent of overlap between IU and other psychological constructs that bear conceptual resemblance to IU, despite the fact that IU-type constructs have been discussed and examined extensively within psychology and other disciplines. The present study investigated (1) the associations of IU, trait worry, and GAD status to a negative risk orientation, trait curiosity, indecisiveness, perceived constraints, self-oriented and socially prescribed perfectionism, intolerance of ambiguity, the need for predictability, and the need for order and structure and (2) whether IU is a unique correlate of trait worry and of the presence versus absence of Probable GAD, when overlap with other uncertainty-relevant constructs is accounted for. N = 255 adults completed self-report measures of the aforementioned constructs. Each of the constructs was significantly associated with IU. Only IU, and a subset of the other uncertainty-relevant constructs were correlated with trait worry or distinguished the Probable GAD group from the Non-GAD group. IU was the strongest unique correlate of trait worry and of the presence versus absence of Probable GAD. Indecisiveness, self-oriented perfectionism and the need for predictability were also unique correlates of trait worry or GAD status. Implications of the findings are discussed, in particular as they pertain to the definition, conceptualization, and cognitive-behavioral treatment of IU in GAD.
Updating: Learning versus Supposing
ERIC Educational Resources Information Center
Zhao, Jiaying; Crupi, Vincenzo; Tentori, Katya; Fitelson, Branden; Osherson, Daniel
2012-01-01
Bayesian orthodoxy posits a tight relationship between conditional probability and updating. Namely, the probability of an event "A" after learning "B" should equal the conditional probability of "A" given "B" prior to learning "B". We examine whether ordinary judgment conforms to the orthodox view. In three experiments we found substantial…
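The orthodox identity can be checked on a toy joint distribution: the updated probability of A after learning B must equal the prior conditional probability P(A|B). The numbers are hypothetical:

```python
# Joint distribution over (A, B); the four cells sum to 1.
P = {("a", "b"): 0.30, ("a", "not_b"): 0.10,
     ("not_a", "b"): 0.20, ("not_a", "not_b"): 0.40}

P_B = P[("a", "b")] + P[("not_a", "b")]   # marginal P(B) = 0.5
P_A_given_B = P[("a", "b")] / P_B         # conditional P(A|B) = 0.6

# Orthodox updating: after learning B, the new P(A) should equal 0.6,
# the conditional probability held *before* learning B.
```

The experiments described test whether people's revised judgments actually obey this equality.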
Tree-average distances on certain phylogenetic networks have their weights uniquely determined.
Willson, Stephen J
2012-01-01
A phylogenetic network N has vertices corresponding to species and arcs corresponding to direct genetic inheritance from the species at the tail to the species at the head. Measurements of DNA are often made on species in the leaf set, and one seeks to infer properties of the network, possibly including the graph itself. In the case of phylogenetic trees, distances between extant species are frequently used to infer the phylogenetic trees by methods such as neighbor-joining. This paper proposes a tree-average distance for networks more general than trees. The notion requires a weight on each arc measuring the genetic change along the arc. For each displayed tree the distance between two leaves is the sum of the weights along the path joining them. At a hybrid vertex, each character is inherited from one of its parents. We will assume that for each hybrid there is a probability that the inheritance of a character is from a specified parent. Assume that the inheritance events at different hybrids are independent. Then for each displayed tree there will be a probability that the inheritance of a given character follows the tree; this probability may be interpreted as the probability of the tree. The tree-average distance between the leaves is defined to be the expected value of their distance in the displayed trees. For a class of rooted networks that includes rooted trees, it is shown that the weights and the probabilities at each hybrid vertex can be calculated given the network and the tree-average distances between the leaves. Hence these weights and probabilities are uniquely determined. The hypotheses on the networks include that hybrid vertices have indegree exactly 2 and that vertices that are not leaves have a tree-child.
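In the simplest case of a single hybrid vertex, the tree-average distance reduces to a probability-weighted mean of the displayed-tree distances. A minimal numeric sketch with invented arc weights:

```python
# Toy network: one hybrid vertex h with parents p1 and p2, and probability g
# that a character is inherited via the arc from p1. There are two displayed
# trees: T1 (keeps arc p1->h) and T2 (keeps arc p2->h). Weights are illustrative.
g = 0.7
d_T1 = 5.0   # leaf-to-leaf path length in the tree displayed via p1
d_T2 = 9.0   # leaf-to-leaf path length in the tree displayed via p2

# Tree-average distance = expected distance over the displayed trees
tree_average = g * d_T1 + (1 - g) * d_T2
```

The paper's uniqueness result says that, for the networks it covers, observing such expected distances between all leaf pairs pins down both the arc weights and the hybrid probabilities like g.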
Music-evoked incidental happiness modulates probability weighting during risky lottery choices
Schulreich, Stefan; Heussen, Yana G.; Gerhardt, Holger; Mohr, Peter N. C.; Binkofski, Ferdinand C.; Koelsch, Stefan; Heekeren, Hauke R.
2014-01-01
We often make decisions with uncertain consequences. The outcomes of the choices we make are usually not perfectly predictable but probabilistic, and the probabilities can be known or unknown. Probability judgments, i.e., the assessment of unknown probabilities, can be influenced by evoked emotional states. This suggests that also the weighting of known probabilities in decision making under risk might be influenced by incidental emotions, i.e., emotions unrelated to the judgments and decisions at issue. Probability weighting describes the transformation of probabilities into subjective decision weights for outcomes and is one of the central components of cumulative prospect theory (CPT) that determine risk attitudes. We hypothesized that music-evoked emotions would modulate risk attitudes in the gain domain and in particular probability weighting. Our experiment featured a within-subject design consisting of four conditions in separate sessions. In each condition, the 41 participants listened to a different kind of music—happy, sad, or no music, or sequences of random tones—and performed a repeated pairwise lottery choice task. We found that participants chose the riskier lotteries significantly more often in the “happy” than in the “sad” and “random tones” conditions. Via structural regressions based on CPT, we found that the observed changes in participants' choices can be attributed to changes in the elevation parameter of the probability weighting function: in the “happy” condition, participants showed significantly higher decision weights associated with the larger payoffs than in the “sad” and “random tones” conditions. Moreover, elevation correlated positively with self-reported music-evoked happiness. Thus, our experimental results provide evidence in favor of a causal effect of incidental happiness on risk attitudes that can be explained by changes in probability weighting. PMID:24432007
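The elevation effect reported here can be sketched with a standard two-parameter probability weighting function (the Goldstein-Einhorn form; the paper's exact CPT specification may differ, and the parameter values below are illustrative only):

```python
def weight(p, delta, gamma):
    """Goldstein-Einhorn probability weighting: delta controls elevation
    (overall attractiveness of risky payoffs), gamma controls curvature."""
    num = delta * p**gamma
    return num / (num + (1 - p)**gamma)

p = 0.4
w_sad = weight(p, delta=0.6, gamma=0.7)     # lower elevation
w_happy = weight(p, delta=1.0, gamma=0.7)   # higher elevation -> larger decision weights
```

A higher elevation parameter raises the decision weight attached to the same stated probability, which is how the authors account for riskier choices in the "happy" condition.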
CProb: a computational tool for conducting conditional probability analysis.
Hollister, Jeffrey W; Walker, Henry A; Paul, John F
2008-01-01
Conditional probability is the probability of observing one event given that another event has occurred. In an environmental context, conditional probability helps to assess the association between an environmental contaminant (i.e., the stressor) and the ecological condition of a resource (i.e., the response). These analyses, when combined with controlled experiments and other methodologies, show great promise in evaluating ecological conditions from observational data and in defining water quality and other environmental criteria. Current applications of conditional probability analysis (CPA) are largely done via scripts or cumbersome spreadsheet routines, which may prove daunting to end-users and do not provide access to the underlying scripts. Combining spreadsheets with scripts eases computation through a familiar interface (i.e., Microsoft Excel) and creates a transparent process through full accessibility to the scripts. With this in mind, we developed a software application, CProb, as an Add-in for Microsoft Excel with R, R(D)com Server, and Visual Basic for Applications. CProb calculates and plots scatterplots, empirical cumulative distribution functions, and conditional probability. In this short communication, we describe CPA, our motivation for developing a CPA tool, and our implementation of CPA as a Microsoft Excel Add-in. Further, we illustrate the use of our software with two examples: a water quality example and a landscape example. CProb is freely available for download at http://www.epa.gov/emap/nca/html/regions/cprob.
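The core CPA computation, the probability of a poor ecological response given that the stressor exceeds a cutoff, is only a few lines in any language. A minimal sketch on synthetic data (CProb itself is an Excel/R add-in; this only illustrates the calculation, and the variables are invented):

```python
import numpy as np

# Hypothetical paired observations: a stressor (e.g., sediment contamination)
# and a response (e.g., a benthic condition index, low = degraded).
rng = np.random.default_rng(1)
stressor = rng.uniform(0, 100, 500)
response = 80 - 0.5 * stressor + rng.normal(0, 10, 500)

def cond_prob(stressor, response, cutoffs, criterion):
    """P(response < criterion | stressor >= cutoff) for each cutoff."""
    out = []
    for c in cutoffs:
        sel = stressor >= c
        out.append(np.mean(response[sel] < criterion))
    return np.array(out)

cutoffs = np.array([10, 30, 50, 70])
probs = cond_prob(stressor, response, cutoffs, criterion=50)
```

Plotting these conditional probabilities against the cutoffs gives the rising curve used to evaluate candidate environmental criteria.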
Statistical inference of seabed sound-speed structure in the Gulf of Oman Basin.
Sagers, Jason D; Knobles, David P
2014-06-01
Addressed is the statistical inference of the sound-speed depth profile of a thick soft seabed from broadband sound propagation data recorded in the Gulf of Oman Basin in 1977. The acoustic data are in the form of time series signals recorded on a sparse vertical line array and generated by explosive sources deployed along a 280 km track. The acoustic data offer a unique opportunity to study a deep-water bottom-limited thickly sedimented environment because of the large number of time series measurements, very low seabed attenuation, and auxiliary measurements. A maximum entropy method is employed to obtain a conditional posterior probability distribution (PPD) for the sound-speed ratio and the near-surface sound-speed gradient. The multiple data samples allow for a determination of the average error constraint value required to uniquely specify the PPD for each data sample. Two complicating features of the statistical inference study are addressed: (1) the need to develop an error function that can both utilize the measured multipath arrival structure and mitigate the effects of data errors and (2) the effect of small bathymetric slopes on the structure of the bottom interacting arrivals.
Probability in reasoning: a developmental test on conditionals.
Barrouillet, Pierre; Gauffroy, Caroline
2015-04-01
Probabilistic theories have been claimed to constitute a new paradigm for the psychology of reasoning. A key assumption of these theories is captured by what they call the Equation, the hypothesis that the meaning of the conditional is probabilistic in nature and that the probability of If p then q is the conditional probability, in such a way that P(if p then q)=P(q|p). Using the probabilistic truth-table task in which participants are required to evaluate the probability of If p then q sentences, the present study explored the pervasiveness of the Equation through ages (from early adolescence to adulthood), types of conditionals (basic, causal, and inducements) and contents. The results reveal that the Equation is a late developmental achievement only endorsed by a narrow majority of educated adults for certain types of conditionals depending on the content they involve. Age-related changes in evaluating the probability of all the conditionals studied closely mirror the development of truth-value judgements observed in previous studies with traditional truth-table tasks. We argue that our modified mental model theory can account for this development, and hence for the findings related with the probability task, which do not consequently support the probabilistic approach of human reasoning over alternative theories. Copyright © 2014 Elsevier B.V. All rights reserved.
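The Equation can be made concrete with a toy frequency table: under the probabilistic reading, P(if p then q) equals P(q|p) and is computed only from the p-cases, whereas the material (truth-functional) conditional also counts every not-p case as true. The counts are hypothetical:

```python
# Frequencies of the four truth-table cases for "if p then q"
n_pq, n_p_notq, n_notp_q, n_notp_notq = 30, 10, 25, 35
total = n_pq + n_p_notq + n_notp_q + n_notp_notq

# The Equation: P(if p then q) = P(q|p), ignoring the not-p cases
p_conditional = n_pq / (n_pq + n_p_notq)     # 0.75

# Material conditional: true in every case except p-and-not-q
p_material = (total - n_p_notq) / total      # 0.90
```

The gap between the two values is what the probabilistic truth-table task exploits to diagnose which interpretation a participant holds.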
Internal Medicine residents use heuristics to estimate disease probability.
Phang, Sen Han; Ravani, Pietro; Schaefer, Jeffrey; Wright, Bruce; McLaughlin, Kevin
2015-01-01
Training in Bayesian reasoning may have limited impact on the accuracy of probability estimates. In this study, our goal was to explore whether residents previously exposed to Bayesian reasoning use heuristics rather than Bayesian reasoning to estimate disease probabilities. We predicted that if residents use heuristics, then post-test probability estimates would be increased by non-discriminating clinical features or by a high anchor for a target condition. We randomized 55 Internal Medicine residents to different versions of four clinical vignettes and asked them to estimate probabilities of target conditions. We manipulated the clinical data for each vignette to be consistent with either 1) use of the representativeness heuristic, by adding non-discriminating prototypical clinical features of the target condition, or 2) use of the anchoring-and-adjustment heuristic, by providing a high or low anchor for the target condition. When presented with additional non-discriminating data, the odds of diagnosing the target condition were increased (odds ratio (OR) 2.83, 95% confidence interval [1.30, 6.15], p = 0.009). Similarly, the odds of diagnosing the target condition were increased when a high anchor preceded the vignette (OR 2.04, [1.09, 3.81], p = 0.025). Our findings suggest that, despite previous exposure to Bayesian reasoning, residents use heuristics such as representativeness and anchoring with adjustment to estimate probabilities. Potential reasons for attribute substitution include the relative cognitive ease of heuristics compared with Bayesian reasoning, or the possibility that residents in clinical practice use gist traces rather than precise probability estimates when diagnosing.
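The Bayesian calculation the residents were trained in can be sketched with the standard odds-likelihood form of Bayes' theorem; the function and the numbers below are illustrative assumptions, not the study's materials:

```python
# Hedged sketch of the normative Bayesian update for diagnosis:
# post-test odds = pre-test odds * likelihood ratio.

def post_test_probability(pre_test_prob: float, likelihood_ratio: float) -> float:
    """Convert probability to odds, apply the likelihood ratio, convert back."""
    pre_odds = pre_test_prob / (1.0 - pre_test_prob)
    post_odds = pre_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)

# A non-discriminating clinical feature has LR = 1, so a Bayesian reasoner
# should leave the estimate unchanged; the heuristic reasoners in the study
# raised it instead.
print(post_test_probability(0.2, 1.0))  # 0.2
print(post_test_probability(0.2, 4.0))  # 0.5
```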
Conditional, Time-Dependent Probabilities for Segmented Type-A Faults in the WGCEP UCERF 2
Field, Edward H.; Gupta, Vipin
2008-01-01
This appendix presents elastic-rebound-theory (ERT) motivated time-dependent probabilities, conditioned on the date of the last earthquake, for the segmented type-A fault models of the 2007 Working Group on California Earthquake Probabilities (WGCEP). These probabilities are included as one option in the WGCEP's Uniform California Earthquake Rupture Forecast 2 (UCERF 2), with the other options being time-independent Poisson probabilities and an 'Empirical' model based on observed seismicity rate changes. A more general discussion of the pros and cons of all methods for computing time-dependent probabilities, as well as the justification of those chosen for UCERF 2, is given in the main body of this report (the 'Empirical' model is also discussed in Appendix M). What this appendix addresses is the computation of conditional, time-dependent probabilities when both single- and multi-segment ruptures are included in the model. Computing conditional probabilities is relatively straightforward when a fault is assumed to obey strict segmentation in the sense that no multi-segment ruptures occur (e.g., WGCEP (1988, 1990); see Field (2007) for a review of all previous WGCEPs; from here on we assume basic familiarity with conditional probability calculations). However, as we'll see below, the calculation is not straightforward when multi-segment ruptures are included, in essence because we are attempting to apply a point-process model to a non-point process. The next section gives a review and evaluation of the single- and multi-segment rupture probability-calculation methods used in the most recent statewide forecast for California (WGCEP UCERF 1; Petersen et al., 2007). We then present results for the methodology adopted here for UCERF 2. We finish with a discussion of issues and possible alternative approaches that could be explored and perhaps applied in the future.
A fault-by-fault comparison of UCERF 2 probabilities with those of previous studies is given in the main part of this report.
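As a hedged illustration of the strictly segmented single-rupture case, a conditional time-dependent probability can be computed from any renewal model as P(t < T ≤ t+ΔT | T > t). The sketch below uses a lognormal recurrence model with hypothetical parameters; UCERF 2 itself uses BPT distributions and the more elaborate multi-segment logic discussed in the appendix:

```python
# Hedged sketch: conditional rupture probability from a renewal model,
# conditioned on the elapsed time since the last earthquake.
import math

def lognormal_cdf(t: float, mean_ln: float, sigma_ln: float) -> float:
    """CDF of a lognormal recurrence-time distribution."""
    return 0.5 * (1.0 + math.erf((math.log(t) - mean_ln) / (sigma_ln * math.sqrt(2.0))))

def conditional_prob(t_since: float, dt: float,
                     mean_recurrence: float = 150.0, aperiodicity: float = 0.5) -> float:
    """P(rupture within the next dt years | t_since years since last rupture)."""
    mean_ln = math.log(mean_recurrence)
    f_now = lognormal_cdf(t_since, mean_ln, aperiodicity)
    f_later = lognormal_cdf(t_since + dt, mean_ln, aperiodicity)
    return (f_later - f_now) / (1.0 - f_now)

# For this model the conditional probability grows as time since the last
# event elapses, which is the elastic-rebound intuition:
p_early = conditional_prob(50.0, 30.0)
p_late = conditional_prob(200.0, 30.0)
print(p_early < p_late)  # True
```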
The Formalism of Generalized Contexts and Decay Processes
NASA Astrophysics Data System (ADS)
Losada, Marcelo; Laura, Roberto
2013-04-01
The formalism of generalized contexts for quantum histories is used to investigate the possibility of considering the survival probability as the probability of the no-decay property at a given time conditional on the no-decay property at an earlier time. A negative result is found for an isolated system. The inclusion of two quantum measurement instruments at two different times makes it possible to interpret the survival probability as a conditional probability of the whole system.
Vitamin panacea: Is advertising fueling demand for products with uncertain scientific benefit?
Eisenberg, Matthew D; Avery, Rosemary J; Cantor, Jonathan H
2017-09-01
This study examines the effect of advertising on demand for vitamins, products with spiraling sales despite little evidence of efficacy. We merge seven years (2003-2009) of advertising data from Kantar Media with the Simmons National Consumer Survey to estimate individual-level vitamin print and television ad exposure effects. Identification relies on exploiting exogenous variation in year-to-year advertising exposure by controlling for each individual's unique media consumption. We find that increasing advertising exposure from zero to the mean number of ads increases the probability of consumption by 1.2 and 0.8 percentage points (or 2 and 1.4 percent) in print and television, respectively. Stratifications by the presence of health conditions suggest that print demand is driven by both healthy and sick individuals. Copyright © 2017 Elsevier B.V. All rights reserved.
Apparent impact: the hidden cost of one-shot trades
NASA Astrophysics Data System (ADS)
Mastromatteo, Iacopo
2015-06-01
We study the problem of the execution of a moderate-size order in an illiquid market within the framework of a solvable Markovian model. We suppose that, in order to avoid impact costs, a trader decides to execute her order through a single trade, waiting for enough liquidity to accumulate at the best quote. We find that, despite the absence of price impact proper, such a trader faces an execution cost arising from a non-vanishing correlation between the volume at the best quotes and price changes. We characterize analytically the statistics of the execution time and its cost by mapping the problem to the simpler one of calculating a set of first-passage probabilities on a semi-infinite strip. We finally argue that price impact cannot be completely avoided by conditioning the execution of an order on a more favorable liquidity scenario.
Friedrich Nietzsche: the wandering and learned neuropath under Dionisius.
Gomes, Marleide da Mota
2015-11-01
Friedrich Nietzsche (1844-1900) was a remarkable philologist-philosopher despite a lifelong condition of ill-health. Issues about his wandering/disruptive behavior, which might have been a consequence of and/or a protection against his cognitive decline and multifaceted disease, are presented. The life complex that raises speculation about its etiology comprises insight, creativity, and wandering behavior, besides several symptoms and signs of disease(s), mainly neurological ones. The most important issue to consider at present is not the disease diagnosis (e.g., Lissauer's general paresis or CADASIL), but Nietzsche's probably great cognitive reserve, linked to a multifactorial etiology (genetic and environmental) and to characteristics shared by creativity and psychopathology. This makes any disease seem special in Nietzsche's case, and whichever diagnostic hypothesis is advanced has to consider Nietzsche's unique background in the expression of any disease(s).
Sufficient conditions for uniqueness of the weak value
NASA Astrophysics Data System (ADS)
Dressel, J.; Jordan, A. N.
2012-01-01
We review and clarify the sufficient conditions for uniquely defining the generalized weak value as the weak limit of a conditioned average using the contextual values formalism introduced in Dressel, Agarwal and Jordan (2010 Phys. Rev. Lett. 104 240401). We also respond to criticism of our work by Parrott (arXiv:1105.4188v1) concerning a proposed counter-example to the uniqueness of the definition of the generalized weak value. The counter-example does not satisfy our prescription in the case of an underspecified measurement context. We show that when the contextual values formalism is properly applied to this example, a natural interpretation of the measurement emerges and the unique definition in the weak limit holds. We also prove a theorem regarding the uniqueness of the definition under our sufficient conditions for the general case. Finally, a second proposed counter-example by Parrott (arXiv:1105.4188v6) is shown not to satisfy the sufficiency conditions for the provided theorem.
NASA Astrophysics Data System (ADS)
Radakovic, Nenad; McDougall, Douglas
2012-10-01
This classroom note illustrates how dynamic visualization can be used to teach conditional probability and Bayes' theorem. There are two features of the visualization that make it an ideal pedagogical tool in probability instruction. The first feature is the use of area-proportional Venn diagrams that, along with showing qualitative relationships, describe the quantitative relationship between two sets. The second feature is the slider and animation component of dynamic geometry software enabling students to observe how the change in the base rate of an event influences conditional probability. A hypothetical instructional sequence using a well-known breast cancer example is described.
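The well-known breast-cancer example mentioned in the note can be computed directly from Bayes' theorem. The figures below follow the widely used textbook version of the example (1% base rate, 80% sensitivity, 9.6% false-positive rate) and are not necessarily the ones used in the note:

```python
# Bayes' theorem for the classic base-rate problem: what is the probability
# of disease given a positive test result?

def bayes_posterior(base_rate: float, sensitivity: float,
                    false_positive_rate: float) -> float:
    """P(disease | positive) = P(pos | disease)P(disease) / P(pos)."""
    true_pos = base_rate * sensitivity
    false_pos = (1.0 - base_rate) * false_positive_rate
    return true_pos / (true_pos + false_pos)

p = bayes_posterior(base_rate=0.01, sensitivity=0.8, false_positive_rate=0.096)
print(round(p, 3))  # 0.078
```

The counterintuitively small posterior (under 8% despite a "positive" test) is exactly what an area-proportional Venn diagram makes visible: the false-positive region dwarfs the true-positive one because the base rate is low.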
NASA Astrophysics Data System (ADS)
Berkovitz, Joseph
Bruno de Finetti is one of the founding fathers of the subjectivist school of probability, in which probabilities are interpreted as rational degrees of belief. His work on the relation between the theorems of probability and rationality is among the cornerstones of modern subjective probability theory. De Finetti maintained that rationality requires that degrees of belief be coherent, and he argued that the whole of probability theory could be derived from these coherence conditions. De Finetti's interpretation of probability has been highly influential in science. This paper focuses on the application of this interpretation to quantum mechanics. We argue that de Finetti held that the coherence conditions of degrees of belief in events depend on their verifiability. Accordingly, the standard coherence conditions of degrees of belief that are familiar from the literature on subjective probability only apply to degrees of belief in events which could (in principle) be jointly verified, while the coherence conditions of degrees of belief in events that cannot be jointly verified are weaker. While the most obvious explanation of de Finetti's verificationism is the influence of positivism, we argue that it could be motivated by the radical subjectivist and instrumental nature of probability in his interpretation; for as it turns out, in this interpretation it is difficult to make sense of the idea of coherent degrees of belief in, and accordingly probabilities of, unverifiable events. We then consider the application of this interpretation to quantum mechanics, concentrating on the Einstein-Podolsky-Rosen experiment and Bell's theorem.
NASA Astrophysics Data System (ADS)
Dasgupta, S.; Fang, J.; Zhang, L.; Li, J.
2012-12-01
Lipid analysis and carbon isotope ratios (δ13C) of lipids in biofilms in an acid mine drainage (AMD) site in western Indiana revealed unique biogeochemical signatures of microeukaryotes, never recorded before. Dominance of the photosynthetic microeukaryote Euglena was indicated by the detection of abundant phytadiene, phytol, phytanol, polyunsaturated n-alkenes, polyunsaturated fatty acids, short-chain (C25-32) wax esters (WE), ergosterol, and tocopherols. The WE were probably synthesized in mitochondria under anoxic conditions by the reverse β-oxidation pathway, whereas the sterols (ergosterol and ergosta-7,22-dien-3β-ol) were likely synthesized in the cytosol in the presence of molecular oxygen. The dual aerobic and anaerobic biosynthetic pathways of Euglena may have been a response for surviving the recurring anoxic and oxic conditions of the primitive Earth, whereby microeukaryotes retained this mechanism of conserved compartmentalization within their physiology to evolve and diversify in extreme conditions. Hydrocarbons, including n-alkenes, phytadienes, and wax esters, showed heavier δ13C values than usual. The primary cause of the 13C-enrichment can be attributed to a CO2-limited system that exists in the AMD, which is further regulated by the pH of the AMD. Floating biofilms BF2, 4, and 6 showed more depleted δ13C values for phytadienes and n-alkenes (average of -23.6‰) as compared to benthic biofilm BF5 (average of -20.8‰), indicating that physiology plays an important role in isotopic discrimination. 13C-enriched values of the esters could result from kinetic isotope effects at two branch points (pyruvate and/or acetyl CoA) in the biosynthetic pathway. Our understanding of biogeochemical conditions in this AMD environment would allow us to identify unique sets of biosignatures that can act as a proxy in deciphering the links between eukaryotic evolution, oxygenation of the early atmosphere, formation of banded iron formations (BIFs), and possibly iron-rich extraterrestrial environments.
Conservative Analytical Collision Probabilities for Orbital Formation Flying
NASA Technical Reports Server (NTRS)
Carpenter, J. Russell
2004-01-01
The literature offers a number of approximations for analytically and/or efficiently computing the probability of collision between two space objects. However, only one of these techniques is a completely analytical approximation that is suitable for use in the preliminary design phase, when it is more important to quickly analyze a large segment of the trade space than it is to precisely compute collision probabilities. Unfortunately, among the types of formations that one might consider, some combine a range of conditions for which this analytical method is less suitable. This work proposes a simple, conservative approximation that produces reasonable upper bounds on the collision probability in such conditions. Although its estimates are much too conservative under other conditions, such conditions are typically well suited for use of the existing method.
Implementation of the Iterative Proportion Fitting Algorithm for Geostatistical Facies Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li Yupeng, E-mail: yupeng@ualberta.ca; Deutsch, Clayton V.
2012-06-15
In geostatistics, most stochastic algorithms for simulation of categorical variables such as facies or rock types require a conditional probability distribution. The multivariate probability distribution of all the grouped locations, including the unsampled location, permits calculation of the conditional probability directly from its definition. In this article, the iterative proportional fitting (IPF) algorithm is implemented to infer this multivariate probability. Using the IPF algorithm, the multivariate probability is obtained by iterative modification of an initial estimated multivariate probability using lower-order bivariate probabilities as constraints. The imposed bivariate marginal probabilities are inferred from profiles along drill holes or wells. In the IPF process, a sparse matrix is used to calculate the marginal probabilities from the multivariate probability, which makes the iterative fitting more tractable and practical. This algorithm can be extended to higher-order marginal probability constraints as used in multiple-point statistics. The theoretical framework is developed and illustrated with an estimation and simulation example.
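A minimal two-dimensional sketch of the IPF idea follows: adjust an initial joint probability table so its marginals match imposed targets by alternately rescaling rows and columns. The initial guess and target facies proportions are hypothetical, and the article's sparse-matrix, higher-order machinery is not reproduced here:

```python
# Hedged sketch of iterative proportional fitting (IPF) in 2D: the joint
# table is alternately scaled so each row and column sum matches its target.

def ipf_2d(joint, row_targets, col_targets, iterations=100):
    joint = [row[:] for row in joint]  # work on a copy
    for _ in range(iterations):
        # scale each row to match its target marginal
        for i, target in enumerate(row_targets):
            s = sum(joint[i])
            if s > 0:
                joint[i] = [v * target / s for v in joint[i]]
        # scale each column to match its target marginal
        for j, target in enumerate(col_targets):
            s = sum(row[j] for row in joint)
            if s > 0:
                for i in range(len(joint)):
                    joint[i][j] *= target / s
    return joint

# Hypothetical uniform initial guess and target proportion marginals:
fitted = ipf_2d([[0.25, 0.25], [0.25, 0.25]], [0.7, 0.3], [0.6, 0.4])
print([round(sum(row), 3) for row in fitted])                   # [0.7, 0.3]
print([round(sum(r[j] for r in fitted), 3) for j in range(2)])  # [0.6, 0.4]
```

The fitted table preserves as much of the initial table's interaction structure as is compatible with the imposed marginals, which is why IPF is a natural tool for imposing bivariate constraints on a multivariate distribution.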
Vanrie, Jan; Béatse, Erik; Wagemans, Johan; Sunaert, Stefan; Van Hecke, Paul
2002-01-01
It has been proposed that object perception can proceed through different routes, which can be situated on a continuum ranging from complete viewpoint-dependency to complete viewpoint-independency, depending on the objects and the task at hand. Although these different routes have been extensively demonstrated on the behavioral level, the corresponding distinction in the underlying neural substrate has not received the same attention. Our goal was to disentangle, on the behavioral and the neurofunctional level, a process associated with extreme viewpoint-dependency, i.e. mental rotation, and a process associated with extreme viewpoint-independency, i.e. the use of viewpoint-invariant, diagnostic features. Two sets of 3-D block figures were created that either differed in handedness (original versus mirrored) or in the angles joining the block components (orthogonal versus skewed). Behavioral measures on a same-different judgment task were predicted to be dependent on viewpoint in the rotation condition (same versus mirrored), but not in the invariance condition (same angles versus different angles). Six subjects participated in an fMRI experiment while presented with both conditions in alternating blocks. Both reaction times and accuracy confirmed the predicted dissociation between the two conditions. Neurofunctional results indicate that all cortical areas activated in the invariance condition were also activated in the rotation condition. Parietal areas were more activated than occipito-temporal areas in the rotation condition, while this pattern was reversed in the invariance condition. Furthermore, some areas were activated uniquely by the rotation condition, probably reflecting the additional processes apparent in the behavioral response patterns.
Dealing with non-unique and non-monotonic response in particle sizing instruments
NASA Astrophysics Data System (ADS)
Rosenberg, Phil
2017-04-01
A number of instruments used as de-facto standards for measuring particle size distributions are actually incapable of uniquely determining the size of an individual particle. This is due to non-unique or non-monotonic response functions. Optical particle counters have non-monotonic response due to oscillations in the Mie response curves, especially for large aerosol and small cloud droplets. Scanning mobility particle sizers respond identically to two particles where the ratio of particle size to particle charge is approximately the same. Images of two differently sized cloud or precipitation particles taken by an optical array probe can have similar dimensions or shadowed areas depending upon where they are in the imaging plane. A number of methods exist to deal with these issues, including assuming that positive and negative errors cancel, smoothing response curves, integrating regions in measurement space before conversion to size space, and matrix inversion. Matrix inversion (also called kernel inversion) has the advantage that it determines the size distribution which best matches the observations, given specific information about the instrument (a matrix which specifies the probability that a particle of a given size will be measured in a given instrument size bin). In this way it maximises use of the information in the measurements. However, this technique can be confused by poor counting statistics, which can cause erroneous results and negative concentrations. Also, an effective method for propagating uncertainties is yet to be published or routinely implemented. Here we present a new alternative that overcomes these issues.
We use Bayesian methods to determine the probability that a given size distribution is correct given a set of instrument data, and we then use Markov chain Monte Carlo methods to sample this many-dimensional probability distribution function to determine its expectation and (co)variances, hence providing a best guess and an uncertainty for the size distribution that includes contributions from the non-unique response curve and from counting statistics, and that can also propagate calibration uncertainties.
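The Bayesian/MCMC idea can be illustrated in miniature with a Metropolis sampler. The sketch below samples the posterior of a single Poisson count rate under a flat prior, a hypothetical stand-in for the full size-distribution problem, which is many-dimensional and instrument-specific:

```python
# Hedged sketch: Metropolis sampling of a 1D posterior. The counts, prior,
# step size, and burn-in choices are all illustrative assumptions.
import math
import random

def log_posterior(rate, counts):
    if rate <= 0:
        return -math.inf
    # Poisson log-likelihood (constants dropped) with a flat prior on rate > 0
    return sum(k * math.log(rate) - rate for k in counts)

def metropolis(counts, n_samples=20000, step=0.5, seed=1):
    rng = random.Random(seed)
    x = 1.0
    lp = log_posterior(x, counts)
    samples = []
    for _ in range(n_samples):
        prop = x + rng.gauss(0.0, step)
        lp_prop = log_posterior(prop, counts)
        delta = lp_prop - lp
        if delta >= 0 or rng.random() < math.exp(delta):
            x, lp = prop, lp_prop  # accept the proposed move
        samples.append(x)
    return samples

counts = [4, 6, 5, 3, 7, 5]        # hypothetical per-bin particle counts
s = metropolis(counts)[5000:]       # discard burn-in
mean = sum(s) / len(s)
var = sum((v - mean) ** 2 for v in s) / len(s)
print(round(mean, 1))  # close to the sample mean of the counts (5.0)
```

The sample variance directly provides the uncertainty estimate that a point-estimate inversion lacks, which is the central advantage claimed in the abstract.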
Statistical learning of action: the role of conditional probability.
Meyer, Meredith; Baldwin, Dare
2011-12-01
Identification of distinct units within a continuous flow of human action is fundamental to action processing. Such segmentation may rest in part on statistical learning. In a series of four experiments, we examined what types of statistics people can use to segment a continuous stream involving many brief, goal-directed action elements. The results of Experiment 1 showed no evidence for sensitivity to conditional probability, whereas Experiment 2 displayed learning based on joint probability. In Experiment 3, we demonstrated that additional exposure to the input failed to engender sensitivity to conditional probability. However, the results of Experiment 4 showed that a subset of adults-namely, those more successful at identifying actions that had been seen more frequently than comparison sequences-were also successful at learning conditional-probability statistics. These experiments help to clarify the mechanisms subserving processing of intentional action, and they highlight important differences from, as well as similarities to, prior studies of statistical learning in other domains, including language.
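The two statistics contrasted in these experiments can be computed from a toy symbol stream as follows; the stream and target pair are hypothetical, and this is an illustration of the statistics, not the study's stimuli:

```python
# Hedged sketch: joint probability of an ordered pair versus the conditional
# probability of the second element given the first, over adjacent pairs.

def pair_statistics(stream, a, b):
    pairs = list(zip(stream, stream[1:]))
    count_ab = sum(1 for x, y in pairs if x == a and y == b)
    count_a = sum(1 for x, _ in pairs if x == a)
    joint = count_ab / len(pairs)                         # P(a followed by b)
    conditional = count_ab / count_a if count_a else 0.0  # P(b | a)
    return joint, conditional

stream = list("ABABCBAB")
joint, cond = pair_statistics(stream, "A", "B")
print(joint, cond)  # P(A then B) = 3/7 ≈ 0.43, P(B|A) = 1.0
```

The example shows how the two measures can diverge: "A then B" occurs in only 3 of 7 adjacent pairs (moderate joint probability), yet B follows A every time A occurs (maximal conditional probability), which is the kind of dissociation the experiments exploit.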
Probabilistic Parameter Uncertainty Analysis of Single Input Single Output Control Systems
NASA Technical Reports Server (NTRS)
Smith, Brett A.; Kenny, Sean P.; Crespo, Luis G.
2005-01-01
The current standards for handling uncertainty in control systems use interval bounds for the definition of the uncertain parameters. This approach gives no information about the likelihood of system performance, but simply gives the response bounds. When used in design, current methods such as μ-analysis can lead to overly conservative controller design. With these methods, worst-case conditions are weighted equally with the most likely conditions. This research explores a unique approach for probabilistic analysis of control systems. Current reliability methods are examined, showing the strong areas of each in handling probability. A hybrid method is developed using these reliability tools for efficiently propagating probabilistic uncertainty through classical control analysis problems. The method developed is applied to classical response analysis as well as to analysis methods that explore the effects of the uncertain parameters on stability and performance metrics. The benefits of using this hybrid approach for calculating the mean and variance of response cumulative distribution functions are shown. Results of the probabilistic analysis of a missile pitch control system and a non-collocated mass-spring system show the added information provided by this hybrid analysis.
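The contrast with interval-bound analysis can be illustrated with plain Monte Carlo propagation (not the paper's hybrid reliability method): draw the uncertain parameter from a distribution, push each draw through the response, and summarize. The system, distribution, and numbers below are illustrative assumptions:

```python
# Hedged sketch: propagate a normally distributed time constant through a
# first-order step response and report the mean and standard deviation of
# the response, rather than only its interval bounds.
import math
import random

def step_response(tau, t=1.0):
    """Unit step response of a first-order system 1/(tau*s + 1) at time t."""
    return 1.0 - math.exp(-t / tau)

def propagate(mean_tau=0.5, sd_tau=0.05, n=50000, seed=0):
    rng = random.Random(seed)
    vals = []
    for _ in range(n):
        tau = rng.gauss(mean_tau, sd_tau)
        if tau > 0:  # discard non-physical samples (modeling assumption)
            vals.append(step_response(tau))
    mean = sum(vals) / len(vals)
    var = sum((v - mean) ** 2 for v in vals) / len(vals)
    return mean, math.sqrt(var)

mean, sd = propagate()
print(round(mean, 2))  # near 1 - exp(-2) ≈ 0.86
```

Unlike an interval bound, the (mean, sd) pair says how likely the extreme responses actually are, which is the added information the abstract refers to.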
Nonlinear Stability and Structure of Compressible Reacting Mixing Layers
NASA Technical Reports Server (NTRS)
Day, M. J.; Mansour, N. N.; Reynolds, W. C.
2000-01-01
The parabolized stability equations (PSE) are used to investigate issues of nonlinear flow development and mixing in compressible reacting shear layers. Particular interest is placed on investigating the change in flow structure that occurs when compressibility and heat release are added to the flow. These conditions allow the 'outer' instability modes, one associated with each of the fast and slow streams, to dominate over the 'central' Kelvin-Helmholtz mode that acts alone in incompressible nonreacting mixing layers. Analysis of scalar probability density functions in flows with dominant outer modes demonstrates the ineffective, one-sided nature of the mixing that accompanies these flow structures. Colayer conditions, where two modes have equal growth rates and the mixing layer is formed by two sets of vortices, offer some opportunity for mixing enhancement. Their extent, however, is found to be limited in the mixing layer's parameter space. Extensive validation of the PSE technique also provides a unique perspective on central-mode vortex pairing, further supporting the view that pairing is primarily governed by linear processes. This perspective sheds insight on how linear stability theory is able to provide such an accurate prediction of an experimentally observed, fully nonlinear flow phenomenon.
Preobrazhenskaia, L A; Ioffe, M E; Mats, V N
2004-01-01
The role of the prefrontal cortex in the reaction of active choice between two feeders was investigated under changes in the value and probability of reinforcement. The experiments were performed on two dogs with prefrontal ablation (g. proreus). Before the lesions, the dogs were taught to receive food from two different feeders in response to conditioned stimuli with equally probable alimentary reinforcement. After ablation, the dogs ran from one feeder to the other during the inter-trial intervals, and in response to the conditioned stimuli they repeatedly chose the same feeder. This disturbance of behavior later recovered completely. In experiments pitting probability of reinforcement against its value, the dogs chose the feeder with low-probability but better-quality reinforcement. In experiments with equal value but different probability, the intact dogs chose the feeder with the higher probability, whereas the dogs with prefrontal lesions chose each feeder equiprobably. Thus, under conditions of free behavior, one of the functions of the prefrontal cortex is the choice of reactions with the higher probability of reinforcement.
Leue, Anja; Cano Rodilla, Carmen; Beauducel, André
2015-01-01
Individuals typically evaluate whether their performance and the obtained feedback match. Previous research has shown that feedback negativity (FN) depends on outcome probability and feedback valence. It is, however, less clear to what extent previous effects of outcome probability on FN depend on self-evaluations of response correctness. Therefore, we investigated the effects of outcome probability on FN amplitude in a simple go/no-go task that allowed for the self-evaluation of response correctness. We also investigated effects of performance incompatibility and feedback valence. In a sample of N = 22 participants, outcome probability was manipulated by means of precues, feedback valence by means of monetary feedback, and performance incompatibility by means of feedback that induced a match versus mismatch with individuals' performance. We found that the 100% outcome probability condition induced a more negative FN following no-loss than the 50% outcome probability condition. The FN following loss was more negative in the 50% compared to the 100% outcome probability condition. Performance-incompatible loss resulted in a more negative FN than performance-compatible loss. Our results indicate that the self-evaluation of the correctness of responses should be taken into account when the effects of outcome probability and expectation mismatch on FN are investigated. PMID:26783525
Fingerprints of exceptional points in the survival probability of resonances in atomic spectra
NASA Astrophysics Data System (ADS)
Cartarius, Holger; Moiseyev, Nimrod
2011-07-01
The unique time signature of the survival probability exactly at the exceptional point parameters is studied here for the hydrogen atom in strong static magnetic and electric fields. We show that indeed the survival probability S(t) = |⟨ψ(0)|ψ(t)⟩|^2 decays exactly as |1 - at|^2 e^(-Γ_EP t/ℏ), where Γ_EP is associated with the decay rate at the exceptional point and a is a complex constant depending solely on the initial wave packet that populates exclusively the two almost degenerate states of the non-Hermitian Hamiltonian. This may open the possibility for a first experimental detection of exceptional points in a quantum system.
Uniqueness of solutions for a mathematical model for magneto-viscoelastic flows
NASA Astrophysics Data System (ADS)
Schlömerkemper, A.; Žabenský, J.
2018-06-01
We investigate uniqueness of weak solutions for a system of partial differential equations capturing behavior of magnetoelastic materials. This system couples the Navier–Stokes equations with evolutionary equations for the deformation gradient and for the magnetization obtained from a special case of the micromagnetic energy. It turns out that the conditions on uniqueness coincide with those for the well-known Navier–Stokes equations in bounded domains: weak solutions are unique in two spatial dimensions, and weak solutions satisfying the Prodi–Serrin conditions are unique among all weak solutions in three dimensions. That is, we obtain the so-called weak-strong uniqueness result in three spatial dimensions.
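For reference, the Prodi–Serrin condition invoked above is usually stated as an additional integrability requirement on the weak solution u (here in the standard form for three spatial dimensions):

```latex
u \in L^{r}\!\left(0,T; L^{s}(\Omega)\right),
\qquad \frac{2}{r} + \frac{3}{s} \le 1, \qquad s \in (3, \infty].
```

Any weak solution satisfying this condition is unique in the class of all weak solutions, which is the weak-strong uniqueness result referred to in the abstract.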
Statistical Decoupling of a Lagrangian Fluid Parcel in Newtonian Cosmology
NASA Astrophysics Data System (ADS)
Wang, Xin; Szalay, Alex
2016-03-01
The Lagrangian dynamics of a single fluid element within a self-gravitational matter field is intrinsically non-local due to the presence of the tidal force. This complicates the theoretical investigation of the nonlinear evolution of various cosmic objects, e.g., dark matter halos, in the context of Lagrangian fluid dynamics, since fluid parcels with given initial density and shape may evolve differently depending on their environments. In this paper, we provide a statistical solution that could decouple this environmental dependence. After deriving the evolution equation for the probability distribution of the matter field, our method produces a set of closed ordinary differential equations whose solution is uniquely determined by the initial condition of the fluid element. Mathematically, it corresponds to the projected characteristic curve of the transport equation of the density-weighted probability density function (ρPDF). Consequently it is guaranteed that the one-point ρPDF would be preserved by evolving these local, yet nonlinear, curves with the same set of initial data as the real system. Physically, these trajectories describe the mean evolution averaged over all environments by substituting the tidal tensor with its conditional average. For Gaussian distributed dynamical variables, this mean tidal tensor is simply proportional to the velocity shear tensor, and the dynamical system would recover the prediction of the Zel’dovich approximation (ZA) with the further assumption of the linearized continuity equation. For a weakly non-Gaussian field, the averaged tidal tensor could be expanded perturbatively as a function of all relevant dynamical variables whose coefficients are determined by the statistics of the field.
Li, Jiabao; Rui, Junpeng; Yao, Minjie; Zhang, Shiheng; Yan, Xuefeng; Wang, Yuanpeng; Yan, Zhiying; Li, Xiangzhen
2015-01-01
The microbial-mediated anaerobic digestion (AD) process represents an efficient biological process for the treatment of organic waste along with biogas harvest. Currently, the key factors structuring bacterial communities and the potential core and unique bacterial populations in manure anaerobic digesters are not completely elucidated yet. In this study, we collected sludge samples from 20 full-scale anaerobic digesters treating cattle or swine manure, and investigated the variations of bacterial community compositions using high-throughput 16S rRNA amplicon sequencing. Clustering and correlation analysis suggested that substrate type and free ammonia (FA) play key roles in determining the bacterial community structure. The COD: NH4+-N (C:N) ratio of substrate and FA were the most important available operational parameters correlating to the bacterial communities in cattle and swine manure digesters, respectively. The bacterial populations in all of the digesters were dominated by phylum Firmicutes, followed by Bacteroidetes, Proteobacteria and Chloroflexi. Increased FA content selected Firmicutes, suggesting that they probably play more important roles under high FA content. Syntrophic metabolism by Proteobacteria, Chloroflexi, Synergistetes and Planctomycetes are likely inhibited when FA content is high. Despite the different manure substrates, operational conditions and geographical locations of digesters, core bacterial communities were identified. The core communities were best characterized by phylum Firmicutes, wherein Clostridium predominated overwhelmingly. Substrate-unique and abundant communities may reflect the properties of manure substrate and operational conditions. These findings extend our current understanding of the bacterial assembly in full-scale manure anaerobic digesters.
"Sequence space soup" of proteins and copolymers
NASA Astrophysics Data System (ADS)
Chan, Hue Sun; Dill, Ken A.
1991-09-01
To study the protein folding problem, we use exhaustive computer enumeration to explore "sequence space soup," an imaginary solution containing the "native" conformations (i.e., of lowest free energy) under folding conditions, of every possible copolymer sequence. The model is of short self-avoiding chains of hydrophobic (H) and polar (P) monomers configured on the two-dimensional square lattice. By exhaustive enumeration, we identify all native structures for every possible sequence. We find that random sequences of H/P copolymers will bear striking resemblance to known proteins: Most sequences under folding conditions will be approximately as compact as known proteins, will have considerable amounts of secondary structure, and it is most probable that an arbitrary sequence will fold to a number of lowest free energy conformations that is of order one. In these respects, this simple model shows that proteinlike behavior should arise simply in copolymers in which one monomer type is highly solvent averse. It suggests that the structures and uniquenesses of native proteins are not consequences of having 20 different monomer types, or of unique properties of amino acid monomers with regard to special packing or interactions, and thus that simple copolymers might be designable to collapse to proteinlike structures and properties. A good strategy for designing a sequence to have a minimum possible number of native states is to strategically insert many P monomers. Thus known proteins may be marginally stable due to a balance: More H residues stabilize the desired native state, but more P residues prevent simultaneous stabilization of undesired native states.
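The exhaustive enumeration described above can be sketched for very short chains. The code below enumerates self-avoiding walks on the square lattice (fixing the first step to factor out rotational symmetry), scores each conformation with the standard HP contact energy, and counts the ground-state degeneracy of a sequence; it is a minimal illustration, not the authors' implementation.

```python
MOVES = [(1, 0), (-1, 0), (0, 1), (0, -1)]

def self_avoiding_walks(n):
    """All self-avoiding walks of n monomers on the square lattice.

    The first step is fixed to (1, 0) to remove the 4-fold rotational
    symmetry; each walk is a tuple of lattice points.
    """
    if n == 1:
        return [((0, 0),)]
    walks = []

    def grow(path):
        if len(path) == n:
            walks.append(tuple(path))
            return
        x, y = path[-1]
        for dx, dy in MOVES:
            nxt = (x + dx, y + dy)
            if nxt not in path:
                path.append(nxt)
                grow(path)
                path.pop()

    grow([(0, 0), (1, 0)])
    return walks

def energy(seq, walk):
    """HP contact energy: -1 per non-bonded H-H pair adjacent on the lattice."""
    pos = {p: i for i, p in enumerate(walk)}
    e = 0
    for i, (x, y) in enumerate(walk):
        if seq[i] != 'H':
            continue
        for dx, dy in MOVES:
            j = pos.get((x + dx, y + dy))
            if j is not None and j > i + 1 and seq[j] == 'H':
                e -= 1  # each contact counted once (j > i + 1)
    return e

def native_degeneracy(seq):
    """Minimum energy and number of lowest-energy ('native') conformations."""
    energies = [energy(seq, w) for w in self_avoiding_walks(len(seq))]
    emin = min(energies)
    return emin, energies.count(emin)
```

For the 4-mer HPPH, for example, only the two mirror-image U-shaped conformations bring the terminal H monomers into contact, so the native state is doubly degenerate at energy -1.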
Encounter risk analysis of rainfall and reference crop evapotranspiration in the irrigation district
NASA Astrophysics Data System (ADS)
Zhang, Jinping; Lin, Xiaomin; Zhao, Yong; Hong, Yang
2017-09-01
Rainfall and reference crop evapotranspiration are random but mutually affected variables in the irrigation district, and their encounter situation can determine water shortage risks under the contexts of natural water supply and demand. However, in reality, the rainfall and reference crop evapotranspiration may have different marginal distributions and their relations are nonlinear. In this study, based on the annual rainfall and reference crop evapotranspiration data series from 1970 to 2013 in the Luhun irrigation district of China, the joint probability distribution of rainfall and reference crop evapotranspiration is developed with the Frank copula function. Using the joint probability distribution, the synchronous-asynchronous encounter risk, conditional joint probability, and conditional return period of different combinations of rainfall and reference crop evapotranspiration are analyzed. The results show that the copula-based joint probability distributions of rainfall and reference crop evapotranspiration are reasonable. The asynchronous encounter probability of rainfall and reference crop evapotranspiration is greater than their synchronous encounter probability, and the water shortage risk associated with meteorological drought (i.e. rainfall variability) is more prone to appear. Compared with other states, the conditional joint probability is higher and the conditional return period lower under either low rainfall or high reference crop evapotranspiration. For a specifically high reference crop evapotranspiration with a certain frequency, the encounter risk of low rainfall and high reference crop evapotranspiration increases as the frequency decreases. For a specifically low rainfall with a certain frequency, the encounter risk of low rainfall and high reference crop evapotranspiration decreases as the frequency decreases. When either the high reference crop evapotranspiration exceeds a certain frequency or the low rainfall does not exceed a certain frequency, the higher conditional joint probability and lower conditional return period of various combinations are likely to cause a water shortage, although not a severe one.
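The Frank copula used above has a closed form, so the joint and conditional probabilities it yields can be sketched directly. The dependence parameter theta below is hypothetical; the study fits it from the 1970-2013 data.

```python
import math

def frank_copula(u, v, theta):
    """Bivariate Frank copula C(u, v; theta), theta != 0.

    Couples marginal CDF values u = F(rainfall) and v = G(ET0) into a
    joint non-exceedance probability.
    """
    num = (math.exp(-theta * u) - 1.0) * (math.exp(-theta * v) - 1.0)
    den = math.exp(-theta) - 1.0
    return -math.log(1.0 + num / den) / theta

def conditional_prob(u, v, theta):
    """P(U <= u | V <= v) = C(u, v) / v."""
    return frank_copula(u, v, theta) / v

# Hypothetical dependence parameter.
theta = 2.0
joint = frank_copula(0.5, 0.5, theta)     # P(rainfall and ET0 both below median)
cond = conditional_prob(0.5, 0.5, theta)  # rainfall below median given ET0 below median
```

With positive dependence (theta > 0), the joint probability exceeds the independent value of 0.25, which is the mechanism behind the encounter-risk asymmetries reported in the abstract.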
Tracking the Sensory Environment: An ERP Study of Probability and Context Updating in ASD
Westerfield, Marissa A.; Zinni, Marla; Vo, Khang; Townsend, Jeanne
2014-01-01
We recorded visual event-related brain potentials (ERPs) from 32 adult male participants (16 high-functioning participants diagnosed with Autism Spectrum Disorder (ASD) and 16 typically developing (TD) control participants, ranging in age from 18–53 years) during a three-stimulus oddball paradigm. Target and non-target stimulus probability was varied across three probability conditions, whereas the probability of a third non-target stimulus was held constant in all conditions. P3 amplitude to target stimuli was more sensitive to probability in ASD than in TD participants, whereas P3 amplitude to non-target stimuli was less responsive to probability in ASD participants. This suggests that neural responses to changes in event probability are attention-dependent in high-functioning ASD. The implications of these findings for higher-level behaviors such as prediction and planning are discussed. PMID:24488156
Wisco, Blair E; Marx, Brian P; Miller, Mark W; Wolf, Erika J; Mota, Natalie P; Krystal, John H; Southwick, Steven M; Pietrzak, Robert H
2016-11-01
With the publication of DSM-5, important changes were made to the diagnostic criteria for posttraumatic stress disorder (PTSD), including the addition of 3 new symptoms. Some have argued that these changes will further increase the already high rates of comorbidity between PTSD and other psychiatric disorders. This study examined the prevalence of DSM-5 PTSD, conditional probability of PTSD given certain trauma exposures, endorsement of specific PTSD symptoms, and psychiatric comorbidities in the US veteran population. Data were analyzed from the National Health and Resilience in Veterans Study (NHRVS), a Web-based survey of a cross-sectional, nationally representative, population-based sample of 1,484 US veterans, which was fielded from September through October 2013. Probable PTSD was assessed using the PTSD Checklist-5. The weighted lifetime and past-month prevalence of probable DSM-5 PTSD was 8.1% (SE = 0.7%) and 4.7% (SE = 0.6%), respectively. Conditional probability of lifetime probable PTSD ranged from 10.1% (sudden death of close family member or friend) to 28.0% (childhood sexual abuse). The DSM-5 PTSD symptoms with the lowest prevalence among veterans with probable PTSD were trauma-related amnesia and reckless and self-destructive behavior. Probable PTSD was associated with increased odds of mood and anxiety disorders (OR = 7.6-62.8, P < .001), substance use disorders (OR = 3.9-4.5, P < .001), and suicidal behaviors (OR = 6.7-15.1, P < .001). In US veterans, the prevalence of DSM-5 probable PTSD, conditional probability of probable PTSD, and odds of psychiatric comorbidity were similar to prior findings with DSM-IV-based measures; we found no evidence that changes in DSM-5 increase psychiatric comorbidity. Results underscore the high rates of exposure to both military and nonmilitary trauma and the high public health burden of DSM-5 PTSD and comorbid conditions in veterans. © Copyright 2016 Physicians Postgraduate Press, Inc.
Autism Spectrum Disorders: Is Mesenchymal Stem Cell Personalized Therapy the Future?
Siniscalco, Dario; Sapone, Anna; Cirillo, Alessandra; Giordano, Catia; Maione, Sabatino; Antonucci, Nicola
2012-01-01
Autism and autism spectrum disorders (ASDs) are heterogeneous neurodevelopmental disorders. They are enigmatic conditions that have their origins in the interaction of genes and environmental factors. ASDs are characterized by dysfunctions in social interaction and communication skills, in addition to repetitive and stereotypic verbal and nonverbal behaviours. Immune dysfunction has been confirmed in autistic children. There are no defined mechanisms of pathogenesis or curative therapy presently available. Indeed, ASDs are still untreatable. Available treatments for autism can be divided into behavioural, nutritional, and medical approaches, although no defined standard approach exists. Nowadays, stem cell therapy represents a great promise for the future of molecular medicine. Among stem cell populations, mesenchymal stem cells (MSCs) probably show the best potential in medical research. Due to the particular immune and neural dysregulation observed in ASDs, mesenchymal stem cell transplantation could offer a unique tool to provide a better resolution for this disease. PMID:22496609
Methodological Gaps in Left Atrial Function Assessment by 2D Speckle Tracking Echocardiography
Rimbaş, Roxana Cristina; Dulgheru, Raluca Elena; Vinereanu, Dragoş
2015-01-01
The assessment of left atrial (LA) function is used in various cardiovascular diseases. LA plays a complementary role in cardiac performance by modulating left ventricular (LV) function. Transthoracic two-dimensional (2D) phasic volumes and Doppler echocardiography can measure LA function non-invasively. However, evaluation of LA deformation derived from 2D speckle tracking echocardiography (STE) is a new, feasible, and promising approach for assessment of LA mechanics. These parameters are able to detect subclinical LA dysfunction in different pathological conditions. Normal ranges for LA deformation and cut-off values to diagnose LA dysfunction in different diseases have been reported, but data are still conflicting, probably because of some methodological and technical issues. This review highlights the importance of a unique standardized technique to assess the LA phasic functions by STE, and discusses recent studies on the most important clinical applications of this technique. PMID:26761370
Information transport in classical statistical systems
NASA Astrophysics Data System (ADS)
Wetterich, C.
2018-02-01
For "static memory materials" the bulk properties depend on boundary conditions. Such materials can be realized by classical statistical systems which admit no unique equilibrium state. We describe the propagation of information from the boundary to the bulk by classical wave functions. The dependence of wave functions on the location of hypersurfaces in the bulk is governed by a linear evolution equation that can be viewed as a generalized Schrödinger equation. Classical wave functions obey the superposition principle, with local probabilities realized as bilinears of wave functions. For static memory materials the evolution within a subsector is unitary, as characteristic for the time evolution in quantum mechanics. The space-dependence in static memory materials can be used as an analogue representation of the time evolution in quantum mechanics - such materials are "quantum simulators". For example, an asymmetric Ising model on a Euclidean two-dimensional lattice represents the time evolution of free relativistic fermions in two-dimensional Minkowski space.
Cross over of recurrence networks to random graphs and random geometric graphs
NASA Astrophysics Data System (ADS)
Jacob, Rinku; Harikrishnan, K. P.; Misra, R.; Ambika, G.
2017-02-01
Recurrence networks are complex networks constructed from the time series of chaotic dynamical systems where the connection between two nodes is limited by the recurrence threshold. This condition makes the topology of every recurrence network unique with the degree distribution determined by the probability density variations of the representative attractor from which it is constructed. Here we numerically investigate the properties of recurrence networks from standard low-dimensional chaotic attractors using some basic network measures and show how the recurrence networks are different from random and scale-free networks. In particular, we show that all recurrence networks can cross over to random geometric graphs by adding sufficient amount of noise to the time series and into the classical random graphs by increasing the range of interaction to the system size. We also highlight the effectiveness of a combined plot of characteristic path length and clustering coefficient in capturing the small changes in the network characteristics.
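The construction described above (connecting nodes whenever embedded points recur within a threshold) can be sketched in a few lines. The embedding dimension, delay, threshold, and the logistic-map series below are illustrative choices, not the parameters used in the paper.

```python
import numpy as np

def recurrence_network(series, eps, dim=2, delay=1):
    """Adjacency matrix of a recurrence network.

    Time-delay embeds `series`, then connects nodes i and j whenever the
    embedded points lie within the recurrence threshold `eps`.
    """
    n = len(series) - (dim - 1) * delay
    # Delay embedding: row i is (x_i, x_{i+delay}, ..., x_{i+(dim-1)*delay}).
    emb = np.column_stack([series[i * delay : i * delay + n] for i in range(dim)])
    # Pairwise Euclidean distances between embedded points.
    d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    a = (d < eps).astype(int)
    np.fill_diagonal(a, 0)  # no self-loops
    return a

# Example time series: the logistic map in its chaotic regime (r = 4).
x = [0.4]
for _ in range(199):
    x.append(4.0 * x[-1] * (1.0 - x[-1]))
A = recurrence_network(np.array(x), eps=0.1)
```

The degree of node i is then `A[i].sum()`, and the degree distribution over all nodes reflects the probability density variations of the attractor, as noted in the abstract.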
Direct measurement of nonlocal entanglement of two-qubit spin quantum states.
Cheng, Liu-Yong; Yang, Guo-Hui; Guo, Qi; Wang, Hong-Fu; Zhang, Shou
2016-01-18
We propose efficient schemes of direct concurrence measurement for two-qubit spin and photon-polarization entangled states via the interaction between single-photon pulses and nitrogen-vacancy (NV) centers in diamond embedded in optical microcavities. For different entangled-state types, diversified quantum devices and operations are designed accordingly. The initial unknown entangled states are possessed by two spatially separated participants, and nonlocal spin (polarization) entanglement can be measured with the aid of detection probabilities of photon (NV center) states. This non-demolition measurement avoids complete annihilation of the initial entangled particle pair, which instead evolves into the corresponding maximally entangled state. Moreover, joint inter-qubit operation or global qubit readout is not required for the presented schemes, and the final analyses indicate favorable performance under current laboratory parameter conditions. The unique advantages of spin qubits ensure that our schemes have wide potential applications in spin-based solid-state quantum information and computation.
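For reference, the quantity being measured is the concurrence. For a pure two-qubit state it has the standard spin-flip form C = |⟨ψ|σy⊗σy|ψ*⟩|, which the minimal check below evaluates numerically (this is the textbook definition, not the paper's optical measurement scheme).

```python
import numpy as np

def concurrence_pure(psi):
    """Concurrence of a pure two-qubit state |psi> (length-4 amplitude vector).

    C = |<psi| sigma_y (x) sigma_y |psi*>|: 0 for product states,
    1 for maximally entangled (Bell) states.
    """
    psi = np.asarray(psi, dtype=complex)
    psi = psi / np.linalg.norm(psi)
    sy = np.array([[0, -1j], [1j, 0]])
    flip = np.kron(sy, sy)
    return abs(psi.conj() @ flip @ psi.conj())

bell = [1 / np.sqrt(2), 0, 0, 1 / np.sqrt(2)]  # (|00> + |11>)/sqrt(2)
product = [1, 0, 0, 0]                         # |00>, unentangled
```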
Pennsylvanian coniferopsid forests in sabkha facies reveal the nature of seasonal tropical biome
Falcon-Lang, H. J.; Jud, N.A.; John, Nelson W.; DiMichele, W.A.; Chaney, D.S.; Lucas, S.G.
2011-01-01
Pennsylvanian fossil forests are known from hundreds of sites across tropical Pangea, but nearly all comprise remains of humid Coal Forests. Here we report a unique occurrence of seasonally dry vegetation, preserved in growth position along >5 km of strike, in the Pennsylvanian (early Kasimovian, Missourian) of New Mexico (United States). Analyses of stump anatomy, diameter, and spatial density, coupled with observations of vascular traces and associated megaflora, show that this was a deciduous, mixed-age, coniferopsid woodland (~100 trees per hectare) with an open canopy. The coniferopsids colonized coastal sabkha facies and show tree rings, confirming growth under seasonally dry conditions. Such woodlands probably served as the source of coniferopsids that replaced Coal Forests farther east in central Pangea during drier climate phases. Thus, the newly discovered woodland helps unravel biome-scale vegetation dynamics and allows calibration of climate models. © 2011 Geological Society of America.
Mass movements on Venus - Preliminary results from Magellan cycle 1 observations
NASA Technical Reports Server (NTRS)
Malin, Michael C.
1992-01-01
A preliminary assessment of mass movements and their geomorphic characteristics as determined from visual inspection of Magellan cycle 1 synthetic aperture radar images is described. The primary data set was a catalog of over 200 ten-inch square photographic prints of full-resolution mosaic image data records. Venus exhibits unambiguous evidence of mass movements at a variety of scales. Mass movements appear mostly in the form of block and rock movements; there is little evidence of regolith and sediment movements. Unique Venusian conditions may play a role in the creation of some mass movement features. Dark (smooth) surfaces surrounding many rockslide avalanches are probably fine materials emplaced as part of the mass movement process, as airfall, surface-hugging density flows, or coarse-depleted debris flows. The size and efficiency of emplacement of landslide deposits on Venus are comparable to those seen on Mars, which in turn generally resemble terrestrial occurrences.
A microfluidic device to study neuronal and motor responses to acute chemical stimuli in zebrafish.
Candelier, Raphaël; Murmu, Meena Sriti; Romano, Sebastián Alejo; Jouary, Adrien; Debrégeas, Georges; Sumbre, Germán
2015-07-21
Zebrafish larva is a unique model for whole-brain functional imaging and to study sensory-motor integration in the vertebrate brain. To take full advantage of this system, one needs to design sensory environments that can mimic the complex spatiotemporal stimulus patterns experienced by the animal in natural conditions. We report on a novel open-ended microfluidic device that delivers pulses of chemical stimuli to agarose-restrained larvae with near-millisecond switching rate and unprecedented spatial and concentration accuracy and reproducibility. In combination with two-photon calcium imaging and recordings of tail movements, we found that stimuli of opposite hedonic values induced different circuit activity patterns. Moreover, by precisely controlling the duration of the stimulus (50-500 ms), we found that the probability of generating a gustatory-induced behavior is encoded by the number of neurons activated. This device may open new ways to dissect the neural-circuit principles underlying chemosensory perception.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Boyack, B.E.; Steiner, J.L.; Harmony, S.C.
The PIUS Advanced Reactor is a 640-MW(e) pressurized-water reactor developed by Asea Brown Boveri. A unique feature of the PIUS concept is the absence of mechanical control and shutdown rods. Reactivity normally is controlled by the boron concentration in the coolant and the temperature of the moderator coolant. Analyses of five initiating events have been completed on the basis of calculations performed with the system neutronic and thermal-hydraulic analysis code TRAC-PF1/MOD2. The initiating events analyzed are (1) reactor scram, (2) loss of off-site power, (3) main steam-line break, (4) small-break loss of coolant, and (5) large-break loss of coolant. In addition to the baseline calculation for each sequence, sensitivity studies were performed to explore the response of the PIUS reactor to severe off-normal conditions having a very low probability of occurrence. The sensitivity studies provide insights into the robustness of the design.
Physiological condition of autumn-banded mallards and its relationship to hunting vulnerability
Hepp, G.R.; Blohm, R.J.; Reynolds, R.E.; Hines, J.E.; Nichols, J.D.
1986-01-01
An important topic of waterfowl ecology concerns the relationship between the physiological condition of ducks during the nonbreeding season and fitness, i.e., survival and future reproductive success. We investigated this subject using direct band recovery records of mallards (Anas platyrhynchos) banded in autumn (1 Oct-15 Dec) 1981-83 in the Mississippi Alluvial Valley (MAV) [USA]. A condition index, weight (g)/wing length (mm), was calculated for each duck, and we tested whether condition of mallards at time of banding was related to their probability of recovery during the hunting season. In 3 years, 5,610 mallards were banded and there were 234 direct recoveries. Binary regression models were used to test the relationship between recovery probability and condition, and likelihood-ratio tests were conducted to determine the most suitable model. For mallards banded in autumn there was a negative relationship between physical condition and the probability of recovery: mallards in poor condition at the time of banding had a greater probability of being recovered during the hunting season. In general, this was true for all age and sex classes; however, the strongest relationship occurred for adult males.
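The binary-regression setup above can be sketched as a logistic model of recovery probability on the condition index. The coefficients below are hypothetical (the study's fitted values are not given in the abstract); the negative slope encodes the reported result that birds in poorer condition were more likely to be recovered.

```python
import math

def condition_index(weight_g, wing_mm):
    """Body-condition index used in the study: weight (g) / wing length (mm)."""
    return weight_g / wing_mm

def recovery_probability(index, b0, b1):
    """Logistic model of direct-recovery probability as a function of condition."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * index)))

# Hypothetical coefficients; b1 < 0 gives the reported negative relationship.
b0, b1 = 2.0, -1.5
lean = recovery_probability(condition_index(950, 270), b0, b1)   # poorer condition
heavy = recovery_probability(condition_index(1250, 270), b0, b1)  # better condition
```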
Quantitative assessment of building fire risk to life safety.
Guanquan, Chu; Jinhua, Sun
2008-06-01
This article presents a quantitative risk assessment framework for evaluating fire risk to life safety. Fire risk is divided into two parts: probability and corresponding consequence of every fire scenario. The time-dependent event tree technique is used to analyze probable fire scenarios based on the effect of fire protection systems on fire spread and smoke movement. To obtain the variation of occurrence probability with time, Markov chain is combined with a time-dependent event tree for stochastic analysis on the occurrence probability of fire scenarios. To obtain consequences of every fire scenario, some uncertainties are considered in the risk analysis process. When calculating the onset time to untenable conditions, a range of fires are designed based on different fire growth rates, after which uncertainty of onset time to untenable conditions can be characterized by probability distribution. When calculating occupant evacuation time, occupant premovement time is considered as a probability distribution. Consequences of a fire scenario can be evaluated according to probability distribution of evacuation time and onset time of untenable conditions. Then, fire risk to life safety can be evaluated based on occurrence probability and consequences of every fire scenario. To express the risk assessment method in detail, a commercial building is presented as a case study. A discussion compares the assessment result of the case study with fire statistics.
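The risk decomposition described above (probability times consequence, summed over the fire scenarios generated by an event tree of protection-system successes and failures) can be sketched as follows. The branch reliabilities and the toy consequence model are hypothetical, and the sketch omits the time-dependent Markov-chain refinement.

```python
def scenario_probabilities(branches):
    """Expand an event tree into scenario probabilities.

    `branches` lists success probabilities of independent fire protection
    systems (e.g. detection, sprinkler). Each scenario is a tuple of
    True/False outcomes; its probability is the product of the branch
    probabilities along its path.
    """
    scenarios = {(): 1.0}
    for p in branches:
        scenarios = {
            path + (works,): prob * (p if works else 1.0 - p)
            for path, prob in scenarios.items()
            for works in (True, False)
        }
    return scenarios

def total_risk(scenarios, consequence):
    """Risk = sum over scenarios of probability x consequence."""
    return sum(prob * consequence(path) for path, prob in scenarios.items())

# Two protection systems with hypothetical reliabilities 0.9 and 0.8; the
# toy consequence grows with the number of failed systems.
s = scenario_probabilities([0.9, 0.8])
risk = total_risk(s, lambda path: 0.0 if all(path) else 0.1 * path.count(False))
```

In the article's full framework the consequence of each scenario would itself be a distribution, obtained by comparing evacuation time with the onset time of untenable conditions.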
Dinov, Ivo D.; Kamino, Scott; Bhakhrani, Bilal; Christou, Nicolas
2014-01-01
Data analysis requires subtle probability reasoning to answer questions like What is the chance of event A occurring, given that event B was observed? This generic question arises in discussions of many intriguing scientific questions such as What is the probability that an adolescent weighs between 120 and 140 pounds given that they are of average height? and What is the probability of (monetary) inflation exceeding 4% and housing price index below 110? To address such problems, learning some applied, theoretical or cross-disciplinary probability concepts is necessary. Teaching such courses can be improved by utilizing modern information technology resources. Students’ understanding of multivariate distributions, conditional probabilities, correlation and causation can be significantly strengthened by employing interactive web-based science educational resources. Independent of the type of a probability course (e.g. majors, minors or service probability course, rigorous measure-theoretic, applied or statistics course) student motivation, learning experiences and knowledge retention may be enhanced by blending modern technological tools within the classical conceptual pedagogical models. We have designed, implemented and disseminated a portable open-source web-application for teaching multivariate distributions, marginal, joint and conditional probabilities using the special case of bivariate Normal distribution. A real adolescent height and weight dataset is used to demonstrate the classroom utilization of the new web-application to address problems of parameter estimation, univariate and multivariate inference. PMID:25419016
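The height/weight question quoted above reduces to the standard conditional distribution of a bivariate normal: Y | X = x is normal with mean my + rho*(sy/sx)*(x - mx) and variance sy^2*(1 - rho^2). The parameters below are hypothetical round numbers, not the paper's adolescent dataset.

```python
import math
from statistics import NormalDist

def conditional_interval_prob(y_lo, y_hi, x, mx, my, sx, sy, rho):
    """P(y_lo <= Y <= y_hi | X = x) under a bivariate normal (X, Y)."""
    mean = my + rho * (sy / sx) * (x - mx)       # conditional mean
    sd = sy * math.sqrt(1.0 - rho ** 2)          # conditional std. dev.
    d = NormalDist(mean, sd)
    return d.cdf(y_hi) - d.cdf(y_lo)

# Hypothetical: height ~ N(65 in, 3^2), weight ~ N(130 lb, 20^2), rho = 0.6.
# Probability of weighing 120-140 lb given average height (x = 65):
p = conditional_interval_prob(120, 140, 65, 65, 130, 3, 20, 0.6)
```

Conditioning on average height shrinks the spread of weight (sd drops from 20 to 16 here), so the interval probability is larger than the unconditional one.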
Metocean design parameter estimation for fixed platform based on copula functions
NASA Astrophysics Data System (ADS)
Zhai, Jinjin; Yin, Qilin; Dong, Sheng
2017-08-01
Considering the dependent relationship among wave height, wind speed, and current velocity, we construct novel trivariate joint probability distributions via Archimedean copula functions. Thirty years of wave height, wind speed, and current velocity data for the Bohai Sea are hindcast and sampled for a case study. Four kinds of distributions, namely, the Gumbel distribution, lognormal distribution, Weibull distribution, and Pearson Type III distribution, are candidate models for the marginal distributions of wave height, wind speed, and current velocity. The Pearson Type III distribution is selected as the optimal model. Bivariate and trivariate probability distributions of these environmental conditions are established based on four bivariate and trivariate Archimedean copulas, namely, the Clayton, Frank, Gumbel-Hougaard, and Ali-Mikhail-Haq copulas. These joint probability models can maximize marginal information and the dependence among the three variables. The design return values of these three variables can be obtained by three methods: univariate probability, conditional probability, and joint probability. The joint return periods of different load combinations are estimated by the proposed models. Platform responses (including base shear, overturning moment, and deck displacement) are further calculated. For the same return period, the design values of wave height, wind speed, and current velocity obtained by the conditional and joint probability models are much smaller than those by univariate probability. By accounting for the dependence among variables, the multivariate probability distributions provide design parameters close to the actual sea state for ocean platform design.
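The copula construction above can be sketched in miniature with the bivariate Clayton copula, C(u, v) = (u^{-θ} + v^{-θ} − 1)^{-1/θ}, which couples the marginal non-exceedance probabilities of two metocean variables. The dependence parameter and design levels below are hypothetical placeholders, not the fitted Bohai Sea values:

```python
def clayton_cdf(u, v, theta):
    """Bivariate Clayton copula C(u, v) = (u^-theta + v^-theta - 1)^(-1/theta), theta > 0."""
    return (u ** -theta + v ** -theta - 1.0) ** (-1.0 / theta)

def joint_and_return_period(u, v, theta, events_per_year=1.0):
    """Return period of BOTH variables exceeding their design levels:
    T = 1 / (events_per_year * P(U > u, V > v)),
    where P(U > u, V > v) = 1 - u - v + C(u, v)."""
    p_joint_exceed = 1.0 - u - v + clayton_cdf(u, v, theta)
    return 1.0 / (events_per_year * p_joint_exceed)

theta = 2.0   # hypothetical dependence parameter (fitted from data in practice)
u = v = 0.98  # 50-year marginal non-exceedance levels at one event per year
print(joint_and_return_period(u, v, theta))
```

With positive dependence, the joint "AND" return period lands between the marginal value (50 years) and the independence value (2500 years), which is why a copula-based design can be less conservative than combining univariate extremes.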
Global Transcriptomic Analysis Reveals the Mechanism of Phelipanche aegyptiaca Seed Germination
Yao, Zhaoqun; Tian, Fang; Cao, Xiaolei; Xu, Ying; Chen, Meixiu; Xiang, Benchun; Zhao, Sifeng
2016-01-01
Phelipanche aegyptiaca is one of the most destructive root parasitic plants of Orobanchaceae. This plant has significant impacts on crop yields worldwide. Conditioning and host root stimulants, in particular strigolactones, are needed for seed germination. However, no extensive study on this phenomenon has been conducted because of insufficient genomic information. Deep RNA sequencing, including de novo assembly and functional annotation, was performed on P. aegyptiaca germinating seeds. The assembled transcriptome was used to analyze transcriptional dynamics during seed germination. Key gene categories involved were identified. A total of 274,964 transcripts were determined, and 53,921 unigenes were annotated according to the NR, GO, COG, KOG, and KEGG databases. Overall, 5324 differentially expressed genes among dormant, conditioned, and GR24-treated seeds were identified. GO and KEGG enrichment analyses demonstrated numerous DEGs related to DNA, RNA, and protein repair and biosynthesis, as well as carbohydrate and energy metabolism. Moreover, ABA and ethylene were found to play important roles in this process. GR24 application resulted in dramatic changes in ABA- and ethylene-associated genes. Fluridone, a carotenoid biosynthesis inhibitor, alone could induce P. aegyptiaca seed germination. In addition, conditioning was probably not an indispensable stage for P. aegyptiaca, because the transcript levels of the MAX2 and KAI2 genes (related to strigolactone signaling) were not up-regulated by the conditioning treatment. PMID:27428962
Probability based models for estimation of wildfire risk
Haiganoush Preisler; D. R. Brillinger; R. E. Burgan; John Benoit
2004-01-01
We present a probability-based model for estimating fire risk. Risk is defined using three probabilities: the probability of fire occurrence; the conditional probability of a large fire given ignition; and the unconditional probability of a large fire. The model is based on grouped data at the 1 km²-day cell level. We fit a spatially and temporally explicit non-...
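The three probabilities in this risk definition are tied together by a simple identity: the unconditional probability of a large fire in a cell-day is the product of the occurrence probability and the conditional probability of a large fire given ignition. A toy sketch with made-up rates (not the fitted values of the paper):

```python
# Illustrative (made-up) per-cell-day rates for a 1 km^2-day grid cell
p_ignition = 0.002             # P(fire occurrence)
p_large_given_ignition = 0.05  # P(large fire | ignition)

# Chain rule: P(large fire) = P(ignition) * P(large fire | ignition)
p_large = p_ignition * p_large_given_ignition
print(p_large)  # 0.0001
```

Keeping the three quantities separate matters operationally: occurrence and conditional-growth probability respond to different covariates (e.g. human activity vs. fuel dryness), so they are modeled and mapped independently before being multiplied.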
Targeting the probability versus cost of feared outcomes in public speaking anxiety.
Nelson, Elizabeth A; Deacon, Brett J; Lickel, James J; Sy, Jennifer T
2010-04-01
Cognitive-behavioral theory suggests that social phobia is maintained, in part, by overestimates of the probability and cost of negative social events. Indeed, empirically supported cognitive-behavioral treatments directly target these cognitive biases through the use of in vivo exposure or behavioral experiments. While cognitive-behavioral theories and treatment protocols emphasize the importance of targeting probability and cost biases in the reduction of social anxiety, few studies have examined specific techniques for reducing probability and cost bias, and thus the relative efficacy of exposure to the probability versus cost of negative social events is unknown. In the present study, 37 undergraduates with high public speaking anxiety were randomly assigned to a single-session intervention designed to reduce either the perceived probability or the perceived cost of negative outcomes associated with public speaking. Compared to participants in the probability treatment condition, those in the cost treatment condition demonstrated significantly greater improvement on measures of public speaking anxiety and cost estimates for negative social events. The superior efficacy of the cost treatment condition was mediated by greater treatment-related changes in social cost estimates. The clinical implications of these findings are discussed. Published by Elsevier Ltd.
System for Measuring Conditional Amplitude, Phase, or Time Distributions of Pulsating Phenomena
Van Brunt, Richard J.; Cernyar, Eric W.
1992-01-01
A detailed description is given of an electronic stochastic analyzer for use with direct "real-time" measurements of the conditional distributions needed for a complete stochastic characterization of pulsating phenomena that can be represented as random point processes. The measurement system described here is designed to reveal and quantify effects of pulse-to-pulse or phase-to-phase memory propagation. The unraveling of memory effects is required so that the physical basis for observed statistical properties of pulsating phenomena can be understood. The individual unique circuit components that comprise the system, and the combinations of these components for various measurements, are thoroughly documented. The system has been applied to the measurement of pulsating partial discharges generated by applying alternating or constant voltage to a discharge gap. Examples are shown of data obtained for conditional and unconditional amplitude, time interval, and phase-of-occurrence distributions of partial-discharge pulses. The results unequivocally show the existence of significant memory effects as indicated, for example, by the observations that the most probable amplitudes and phases-of-occurrence of discharge pulses depend on the amplitudes and/or phases of the preceding pulses. Sources of error and fundamental limitations of the present measurement approach are analyzed. Possible extensions of the method are also discussed. PMID:28053450
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hammond, Glenn Edward; Song, Xuehang; Ye, Ming
A new approach is developed to delineate the spatial distribution of discrete facies (geological units that have unique distributions of hydraulic, physical, and/or chemical properties) conditioned not only on direct data (measurements directly related to facies properties, e.g., grain size distribution obtained from borehole samples) but also on indirect data (observations indirectly related to facies distribution, e.g., hydraulic head and tracer concentration). Our method integrates for the first time ensemble data assimilation with traditional transition probability-based geostatistics. The concept of level set is introduced to build shape parameterization that allows transformation between discrete facies indicators and continuous random variables. The spatial structure of different facies is simulated by indicator models using conditioning points selected adaptively during the iterative process of data assimilation. To evaluate the new method, a two-dimensional semi-synthetic example is designed to estimate the spatial distribution and permeability of two distinct facies from transient head data induced by pumping tests. The example demonstrates that our new method adequately captures the spatial pattern of facies distribution by imposing spatial continuity through conditioning points. The new method also reproduces the overall response in hydraulic head field with better accuracy compared to data assimilation with no constraints on spatial continuity on facies.
Shine, R; LeMaster, M P; Moore, I T; Olsson, M M; Mason, R T
2001-03-01
Huge breeding aggregations of red-sided garter snakes (Thamnophis sirtalis parietalis) at overwintering dens in Manitoba provide a unique opportunity to identify sources of mortality and to clarify factors that influence a snake's vulnerability to these factors. Comparisons of sexes, body sizes, and body condition of more than 1000 dead snakes versus live animals sampled at the same time reveal significant biases. Three primary sources of mortality were identified. Predation by crows, Corvus brachyrhynchos (590 snakes killed), was focussed mostly on small snakes of both sexes. Crows generally removed the snake's liver and left the carcass, but very small snakes were sometimes brought back to the nest. Suffocation beneath massive piles of other snakes within the den (301 dead animals) involved mostly small males and (to a lesser extent) large females; snakes in poor body condition were particularly vulnerable. Many emaciated snakes (n = 142, mostly females) also died without overt injuries, probably due to depleted energy reserves. These biases in vulnerability are readily interpretable from information on behavioral ecology of the snakes. For example, sex biases in mortality reflect differences in postemergence behavior and locomotor capacity, the greater attractiveness of larger females to males, and the high energy costs of reproduction for females.
Generic pure quantum states as steady states of quasi-local dissipative dynamics
NASA Astrophysics Data System (ADS)
Karuvade, Salini; Johnson, Peter D.; Ticozzi, Francesco; Viola, Lorenza
2018-04-01
We investigate whether a generic pure state on a multipartite quantum system can be the unique asymptotic steady state of locality-constrained purely dissipative Markovian dynamics. In the tripartite setting, we show that the problem is equivalent to characterizing the solution space of a set of linear equations and establish that the set of pure states obeying the above property has either measure zero or measure one, solely depending on the subsystems’ dimension. A complete analytical characterization is given when the central subsystem is a qubit. In the N-partite case, we provide conditions on the subsystems’ size and the nature of the locality constraint, under which random pure states cannot be quasi-locally stabilized generically. Also, allowing for the possibility to approximately stabilize entangled pure states that cannot be exact steady states in settings where stabilizability is generic, our results offer insights into the extent to which random pure states may arise as unique ground states of frustration-free parent Hamiltonians. We further argue that, to a high probability, pure quantum states sampled from a t-design enjoy the same stabilizability properties of Haar-random ones as long as suitable dimension constraints are obeyed and t is sufficiently large. Lastly, we demonstrate a connection between the tasks of quasi-local state stabilization and unique state reconstruction from local tomographic information, and provide a constructive procedure for determining a generic N-partite pure state based only on knowledge of the support of any two of the reduced density matrices of about half the parties, improving over existing results.
Ennis, Erin J; Foley, Joe P
2016-07-15
A stochastic approach was utilized to estimate the probability of a successful isocratic or gradient separation in conventional chromatography for numbers of sample components, peak capacities, and saturation factors ranging from 2 to 30, 20-300, and 0.017-1, respectively. The stochastic probabilities were obtained under conditions of (i) constant peak width ("gradient" conditions) and (ii) peak width increasing linearly with time ("isocratic/constant N" conditions). The isocratic and gradient probabilities obtained stochastically were compared with the probabilities predicted by Martin et al. [Anal. Chem., 58 (1986) 2200-2207] and Davis and Stoll [J. Chromatogr. A, (2014) 128-142]; for a given number of components and peak capacity the same trend is always observed: probability obtained with the isocratic stochastic approach
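The stochastic approach can be sketched as a Monte Carlo experiment: drop m component retention times uniformly into a unit window and count the run a success when every adjacent pair is at least 1/nc apart, where nc is the peak capacity (the constant-peak-width "gradient" case). This is a simplification of the paper's procedure, not its exact implementation:

```python
import random

def p_success(m, nc, trials=20000, seed=1):
    """Monte Carlo estimate of the probability that m randomly placed
    components are all mutually resolved at peak capacity nc
    (constant peak width, i.e. 'gradient' conditions)."""
    rng = random.Random(seed)
    min_gap = 1.0 / nc
    hits = 0
    for _ in range(trials):
        times = sorted(rng.random() for _ in range(m))
        if all(b - a >= min_gap for a, b in zip(times, times[1:])):
            hits += 1
    return hits / trials

# Success probability collapses as component count approaches peak capacity
print(p_success(5, 50), p_success(15, 50))
```

Even with m far below nc, the success probability drops steeply with m, which is the core statistical-overlap message: peak capacity must greatly exceed the number of components for a separation to succeed by chance placement.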
Study of recreational land and open space using Skylab imagery
NASA Technical Reports Server (NTRS)
Sattinger, I. J. (Principal Investigator)
1975-01-01
The author has identified the following significant results. An analysis of the statistical uniqueness of each of the signatures of the Gratiot-Saginaw State Game Area was made by computing a matrix of probabilities of misclassification for all possible signature pairs. Within each data set, the 35 signatures were then aggregated into a smaller set of composite signatures by combining groups of signatures having high probabilities of misclassification. Computer separation of forest density classes was poor with multispectral scanner data collected on 5 August 1973. Signatures from the scanner data were further analyzed to determine the ranking of spectral channels for computer separation of the scene classes. Probabilities of misclassification were computed for composite signatures using four separate combinations of data source and channel selection.
Offshore fatigue design turbulence
NASA Astrophysics Data System (ADS)
Larsen, Gunner C.
2001-07-01
Fatigue damage on wind turbines is mainly caused by stochastic loading originating from turbulence. While onshore sites display large differences in terrain topology, and thereby also in turbulence conditions, offshore sites are far more homogeneous, as the majority of them are likely to be associated with shallow water areas. However, despite this fact, specific recommendations on offshore turbulence intensities, applicable for fatigue design purposes, are lacking in the present IEC code. This article presents specific guidelines for such loading. These guidelines are based on the statistical analysis of a large number of wind data originating from two Danish shallow water offshore sites. The turbulence standard deviation depends on the mean wind speed, upstream conditions, measuring height and thermal convection. Defining a population of turbulence standard deviations, at a given measuring position, uniquely by the mean wind speed, variations in upstream conditions and atmospheric stability will appear as variability of the turbulence standard deviation. Distributions of such turbulence standard deviations, conditioned on the mean wind speed, are quantified by fitting the measured data to logarithmic Gaussian distributions. By combining a simple heuristic load model with the parametrized conditional probability density functions of the turbulence standard deviations, an empirical offshore design turbulence intensity is determined. For pure stochastic loading (as associated with standstill situations), the design turbulence intensity yields a fatigue damage equal to the average fatigue damage caused by the distributed turbulence intensity. If the stochastic loading is combined with a periodic deterministic loading (as in the normal operating situation), the proposed design turbulence intensity is shown to be conservative.
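One common way to collapse a distributed turbulence standard deviation into a single fatigue-equivalent design value (consistent in spirit with the heuristic load model described above, though the article's exact model may differ) is to take the m-th moment of the conditional lognormal distribution, where m is the Wöhler (S-N) exponent: for σ ~ Lognormal(μ, s²), (E[σ^m])^{1/m} = exp(μ + m·s²/2). The parameter values below are illustrative, not fitted offshore data:

```python
from math import exp, log

def design_sigma(median_sigma, s, m):
    """Fatigue-equivalent turbulence standard deviation for a lognormal
    population of 10-min sigmas: (E[sigma^m])**(1/m) = exp(mu + m*s^2/2),
    with mu = ln(median) and Woehler exponent m."""
    mu = log(median_sigma)
    return exp(mu + m * s * s / 2.0)

# Illustrative values for one mean-wind-speed bin (not from the article)
median_sigma = 1.0   # m/s, median of the conditional lognormal
s = 0.25             # standard deviation of ln(sigma)
for m in (3, 12):    # typical Woehler exponents: welded steel vs. glass fibre
    print(m, round(design_sigma(median_sigma, s, m), 3))
```

Because the design value grows with m, a single percentile of the turbulence distribution cannot serve all materials: the more fatigue-sensitive the component (larger m), the further into the upper tail the equivalent design turbulence must sit.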
Making Peace Pay: Post-Conflict Economic and Infrastructure Development in Kosovo and Iraq
probability that a post-conflict state will return to war. Additionally, consecutive presidential administrations and joint doctrine have declared...and Iraqi Freedom as historical case studies to demonstrate that the armed forces possess unique advantages, to include physical presence and
Preschoolers' speed of locating a target symbol under different color conditions.
Wilkinson, Krista M; Carlin, Michael; Jagaroo, Vinoth
2006-06-01
A pressing decision in AAC concerns the organization of aided visual symbols. One recent proposal suggested that basic principles of visual processing may be important determinants of how easily a symbol is found in an array, and that this, in turn, will influence more functional outcomes like symbol identification or use. This study examined the role of color on accuracy and speed of symbol location by 16 preschool children without disabilities. Participants searched for a target stimulus in an array of eight stimuli. In the same-color condition, the eight stimuli were all red; in the guided-search condition, four of the stimuli were red and four were yellow; in the unique-color condition, all stimuli were unique colors. Accuracy was higher and reaction time was faster when stimuli were unique colors than when they were all one color. Reaction time and accuracy did not differ under the guided-search and unique-color conditions. The implications for AAC are discussed.
Height probabilities in the Abelian sandpile model on the generalized finite Bethe lattice
NASA Astrophysics Data System (ADS)
Chen, Haiyan; Zhang, Fuji
2013-08-01
In this paper, we study the sandpile model on the generalized finite Bethe lattice with a particular boundary condition. Using a combinatorial method, we give the exact expressions for all single-site probabilities and some two-site joint probabilities. As a by-product, we prove that the height probabilities of bulk vertices are all the same for the Bethe lattice with certain given boundary condition, which was found from numerical evidence by Grassberger and Manna ["Some more sandpiles," J. Phys. (France) 51, 1077-1098 (1990), doi:10.1051/jphys:0199000510110107700], but without a proof.
Intergroup Contact and Outgroup Humanization: Is the Causal Relationship Uni- or Bidirectional?
Capozza, Dora; Di Bernardo, Gian Antonio; Falvo, Rossella
2017-01-01
The attribution of uniquely human characteristics to the outgroup may favor the search for contact with outgroup members and, vice versa, contact experiences may improve humanity attributions to the outgroup. To explore this bidirectional relationship, two studies were performed. In Study 1, humanity perceptions were manipulated using subliminal conditioning. Two experimental conditions were created. In the humanization condition, the unconditioned stimuli (US) were uniquely human words; in the dehumanization condition, the US were non-uniquely human and animal words. In both conditions, conditioned stimuli were typical outgroup faces. An approach/avoidance technique (the manikin task) was used to measure the willingness to have contact with outgroup members. Findings showed that in the humanization condition participants were faster in approaching than in avoiding outgroup members: closeness to the outgroup was preferred to distance. Latencies of approach and avoidance movements were not different in the dehumanization condition. In Study 2, contact was manipulated using the manikin task. One approach (contact) condition and two control conditions were created. The attribution of uniquely human traits to the outgroup was stronger in the contact than in the no-contact conditions. Furthermore, the effect of contact on humanity attributions was mediated by increased trust toward the outgroup. Thus, findings demonstrate the bidirectionality of the relationship between contact and humanity attributions. Practical implications of findings are discussed.
NASA Technical Reports Server (NTRS)
Garner, Gregory G.; Thompson, Anne M.
2013-01-01
An ensemble statistical post-processor (ESP) is developed for the National Air Quality Forecast Capability (NAQFC) to address the unique challenges of forecasting surface ozone in Baltimore, MD. Air quality and meteorological data were collected from the eight monitors that constitute the Baltimore forecast region. These data were used to build the ESP using a moving-block bootstrap, regression tree models, and extreme-value theory. The ESP was evaluated using a 10-fold cross-validation to avoid evaluation with the same data used in the development process. Results indicate that the ESP is conditionally biased, likely due to slight overfitting while training the regression tree models. When viewed from the perspective of a decision-maker, the ESP provides a wealth of additional information previously not available through the NAQFC alone. The user is provided the freedom to tailor the forecast to the decision at hand by using decision-specific probability thresholds that define a forecast for an ozone exceedance. Taking advantage of the ESP, the user not only receives an increase in value over the NAQFC, but also receives value for
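The decision-specific probability thresholds mentioned above are usually framed with the classical cost-loss model: a user who can mitigate at cost C against a potential loss L should act whenever the forecast exceedance probability exceeds C/L. A minimal sketch of that rule (the numbers are illustrative, and this is not the NAQFC/ESP code itself):

```python
def should_act(p_exceedance, cost, loss):
    """Classical cost-loss decision rule: mitigate when the expected loss
    avoided (p * loss) exceeds the cost of acting, i.e. p > cost/loss."""
    return p_exceedance > cost / loss

# Two users receive the same 30% ozone-exceedance probability,
# but their different stakes imply different action thresholds.
print(should_act(0.30, cost=10.0, loss=100.0))  # True:  threshold 0.10
print(should_act(0.30, cost=50.0, loss=100.0))  # False: threshold 0.50
```

This is exactly why a probabilistic post-processor adds value over a single deterministic forecast: one probability serves every user, each of whom converts it to a yes/no decision with their own C/L ratio.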
NASA Astrophysics Data System (ADS)
Egozi, Roey
2015-04-01
Wildfires are common in the Mediterranean region because of its pronounced dry season and long history of anthropogenic activity. Most post-wildfire studies focus on mountainous areas and thus address the hill-slope and its physical characteristics, e.g. morphology, length, angles, and aspect; its soil characteristics, e.g. type, infiltration rate, repellency; and its vegetative cover, e.g. planted trees vs. natural forest or native vs. exotic vegetation. In contrast, there is very limited literature on the ecological and hydro-geomorphic aspects of post-wildfire riparian vegetation, probably because the burned riparian area is negligible relative to the spread of the fire, which sometimes covers the whole watershed. The limited literature on the topic is surprising given that the riparian vegetation zone has been acknowledged as a unique and important habitat supporting rich biodiversity. Herein we report on a wildfire event that occurred on October 14th 2009 in a river section of Nahal Grar, Northern Negev Desert, Israel. Although the wildfire was limited in area (only 3 hectares), it extended over the channel alone from bank to bank and thus provides a unique case study of the complete burning of riparian vegetation, mainly dense stands of common reed (Phragmites australis). A detailed study of this event therefore provides an opportunity to tackle one of the basic questions: determining the rate of the natural restoration processes that act immediately after the wildfire event. This type of information is most valuable to professionals and stakeholders for better management of post-fire riparian zones. The results of the study suggest that under stable conditions, i.e. with no major flood events, the disturbance time was short, on the order of 200 days, owing to the almost immediate recovery of the riparian vegetation.
However, the re-growth of the riparian vegetation was not uniform but rather differential and more complex than reported in the literature. In addition, during that period no morphological changes were measured in the channel bed and banks; similarly, no changes were observed in base flow discharge, though slight changes were measured in water pH, probably due to the large quantities of ash on the river bed.
Conditional Probability Analysis: A Statistical Tool for Environmental Analysis.
The use and application of environmental conditional probability analysis (CPA) is relatively recent. The first presentation using CPA was made in 2002 at the New England Association of Environmental Biologists Annual Meeting in Newport, Rhode Island. CPA has been used since the...
From Inverse Problems in Mathematical Physiology to Quantitative Differential Diagnoses
Zenker, Sven; Rubin, Jonathan; Clermont, Gilles
2007-01-01
The improved capacity to acquire quantitative data in a clinical setting has generally failed to improve outcomes in acutely ill patients, suggesting a need for advances in computer-supported data interpretation and decision making. In particular, the application of mathematical models of experimentally elucidated physiological mechanisms could augment the interpretation of quantitative, patient-specific information and help to better target therapy. Yet, such models are typically complex and nonlinear, a reality that often precludes the identification of unique parameters and states of the model that best represent available data. Hypothesizing that this non-uniqueness can convey useful information, we implemented a simplified simulation of a common differential diagnostic process (hypotension in an acute care setting), using a combination of a mathematical model of the cardiovascular system, a stochastic measurement model, and Bayesian inference techniques to quantify parameter and state uncertainty. The output of this procedure is a probability density function on the space of model parameters and initial conditions for a particular patient, based on prior population information together with patient-specific clinical observations. We show that multimodal posterior probability density functions arise naturally, even when unimodal and uninformative priors are used. The peaks of these densities correspond to clinically relevant differential diagnoses and can, in the simplified simulation setting, be constrained to a single diagnosis by assimilating additional observations from dynamical interventions (e.g., fluid challenge). We conclude that the ill-posedness of the inverse problem in quantitative physiology is not merely a technical obstacle, but rather reflects clinical reality and, when addressed adequately in the solution process, provides a novel link between mathematically described physiological knowledge and the clinical concept of differential diagnoses. 
We outline possible steps toward translating this computational approach to the bedside, to supplement today's evidence-based medicine with a quantitatively founded model-based medicine that integrates mechanistic knowledge with patient-specific information. PMID:17997590
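The multimodality the authors describe can be reproduced in miniature: when an observation depends on a parameter only through θ², two parameter values explain the data equally well, and a grid-based Bayesian update under a flat prior yields a bimodal posterior whose two peaks play the role of competing differential diagnoses. A toy sketch (a stand-in forward model, not the paper's cardiovascular model):

```python
from math import exp

# Toy forward model: y = theta**2 + Gaussian noise (sd = 0.2).
# Observing y = 1.0 is explained equally well by theta = +1 and theta = -1,
# so a flat prior yields a bimodal posterior -- the 'differential diagnosis'.
y_obs, noise_sd = 1.0, 0.2
grid = [i / 100.0 for i in range(-200, 201)]   # theta in [-2, 2]

def likelihood(theta):
    r = (y_obs - theta ** 2) / noise_sd
    return exp(-0.5 * r * r)

post = [likelihood(t) for t in grid]           # flat prior: posterior ~ likelihood
z = sum(post)
post = [p / z for p in post]

# Locate interior local maxima of the posterior density
peaks = [grid[i] for i in range(1, len(post) - 1)
         if post[i] > post[i - 1] and post[i] > post[i + 1]]
print(peaks)  # two modes, at -1.0 and +1.0
```

An additional observation that breaks the symmetry (the analogue of a fluid challenge) would multiply in a second likelihood factor and suppress one of the two modes, mirroring how a dynamical intervention constrains the diagnosis to a single candidate.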
Türkowsky, Dominique; Esken, Jens; Goris, Tobias; Schubert, Torsten; Diekert, Gabriele; Jehmlich, Nico; von Bergen, Martin
2018-06-15
Organohalide respiration (OHR), comprising the reductive dehalogenation of halogenated organic compounds, is subject to a unique memory effect and long-term transcriptional downregulation of the involved genes in Sulfurospirillum multivorans. Gene expression ceases slowly over approximately 100 generations in the absence of tetrachloroethene (PCE). However, the molecular mechanisms of this regulation process are not understood. We show here that Sulfurospirillum halorespirans undergoes the same type of regulation when cultivated without chlorinated ethenes for a long period of time. In addition, we compared the proteomes of S. halorespirans cells cultivated in the presence of PCE with those of cells long- and short-term cultivated with nitrate as the sole electron acceptor. Important OHR-related proteins previously unidentified in S. multivorans include a histidine kinase, a putative quinol dehydrogenase membrane protein, and a PCE-induced porin. Since for some regulatory proteins a posttranslational regulation of activity by lysine acetylations is known, we also analyzed the acetylome of S. halorespirans, revealing that 32% of the proteome was acetylated in at least one condition. The data indicate that the response regulator and the histidine kinase of a two-component system most probably involved in induction of PCE respiration are highly acetylated during short-term cultivation with nitrate in the absence of PCE. The so far unique long-term downregulation of organohalide respiration is now identified in a second species suggesting a broader distribution of this regulatory phenomenon. An improved protein extraction method allowed the identification of proteins most probably involved in transcriptional regulation of OHR in Sulfurospirillum spp. Our data indicate that acetylations of regulatory proteins are involved in this extreme, sustained standby-mode of metabolic enzymes in the absence of a substrate. 
This first published acetylome of Epsilonproteobacteria might help to study other ecologically or medically important species of this clade. Copyright © 2018 Elsevier B.V. All rights reserved.
On defense strategies for system of systems using aggregated correlations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rao, Nageswara S.; Imam, Neena; Ma, Chris Y. T.
2017-04-01
We consider a System of Systems (SoS) wherein each system Si, i = 1, 2, ..., N, is composed of discrete cyber and physical components which can be attacked and reinforced. We characterize the disruptions using aggregate failure correlation functions given by the conditional failure probability of the SoS given the failure of an individual system. We formulate the problem of ensuring the survival of the SoS as a game between an attacker and a provider, each with a utility function composed of a survival probability term and a cost term, both expressed in terms of the number of components attacked and reinforced. The survival probabilities of systems satisfy simple product-form, first-order differential conditions, which simplify the Nash Equilibrium (NE) conditions. We derive the sensitivity functions that highlight the dependence of SoS survival probability at NE on cost terms, correlation functions, and individual system survival probabilities. We apply these results to a simplified model of distributed cloud computing infrastructure.
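The aggregate failure correlation function used above, g_i = P(SoS fails | system S_i fails), can be computed exactly for small N by enumeration. The sketch below assumes independent system failures and a hypothetical "SoS fails when at least k systems fail" rule; it illustrates the concept only and is not the paper's game formulation:

```python
from itertools import product

def sos_failure_correlations(fail_probs, k):
    """For independent systems with the given failure probabilities and an
    'SoS fails iff >= k systems fail' rule, return
    (P(SoS fails), [P(SoS fails | S_i fails) for each i]) by enumeration."""
    n = len(fail_probs)
    p_sos = 0.0
    p_joint = [0.0] * n                 # P(SoS fails AND S_i fails)
    for outcome in product((0, 1), repeat=n):   # 1 = that system failed
        w = 1.0
        for failed, p in zip(outcome, fail_probs):
            w *= p if failed else (1.0 - p)
        if sum(outcome) >= k:           # SoS-level failure rule
            p_sos += w
            for i, failed in enumerate(outcome):
                if failed:
                    p_joint[i] += w
    return p_sos, [pj / pf for pj, pf in zip(p_joint, fail_probs)]

p_sos, g = sos_failure_correlations([0.1, 0.2, 0.3], k=2)
print(round(p_sos, 4), [round(x, 3) for x in g])
```

Each g_i exceeds the unconditional failure probability, quantifying how much an individual system's failure raises the risk to the SoS as a whole, which is the information the defender's reinforcement strategy exploits.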
Single, Complete, Probability Spaces Consistent With EPR-Bohm-Bell Experimental Data
NASA Astrophysics Data System (ADS)
Avis, David; Fischer, Paul; Hilbert, Astrid; Khrennikov, Andrei
2009-03-01
We show that paradoxical consequences of violations of Bell's inequality are induced by the use of an unsuitable probabilistic description for the EPR-Bohm-Bell experiment. The conventional description (due to Bell) is based on a combination of statistical data collected for different settings of polarization beam splitters (PBSs). In fact, such data consist of conditional probabilities which only partially define a probability space. Ignoring this conditioning leads to apparent contradictions in the classical probabilistic model (due to Kolmogorov). We show how to make a completely consistent probabilistic model by taking into account the probabilities of selecting the settings of the PBSs. Our model both matches the experimental data and is consistent with classical probability theory.
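The construction described above amounts to the law of total probability: conditional outcome probabilities for each PBS setting pair are weighted by the probabilities of selecting those settings. A minimal numerical sketch with made-up values:

```python
# Minimal numerical sketch (all probabilities are illustrative, not data from
# an actual EPR-Bohm-Bell experiment): each setting pair (a, b) is selected
# with some probability, and the recorded data are conditional probabilities
# P(outcome | a, b). A single Kolmogorov space is obtained by weighting.

settings = {  # P(selecting setting pair) -- here uniform selection
    ("a1", "b1"): 0.25, ("a1", "b2"): 0.25,
    ("a2", "b1"): 0.25, ("a2", "b2"): 0.25,
}
cond = {  # P(++ outcome | setting pair), illustrative values
    ("a1", "b1"): 0.40, ("a1", "b2"): 0.10,
    ("a2", "b1"): 0.30, ("a2", "b2"): 0.20,
}

# Unconditional probability of the ++ outcome in the single probability space:
p_pp = sum(settings[s] * cond[s] for s in settings)
print(round(p_pp, 6))  # 0.25
```

Once the setting-selection probabilities are part of the model, all outcome probabilities live in one classical probability space, which is the point of the paper's construction.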
Logic, probability, and human reasoning.
Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P
2015-04-01
This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction. Copyright © 2015 Elsevier Ltd. All rights reserved.
10 CFR 100.10 - Factors to be considered when evaluating sites.
Code of Federal Regulations, 2013 CFR
2013-01-01
... reactor incorporates unique or unusual features having a significant bearing on the probability or consequences of accidental release of radioactive materials; (4) The safety features that are to be engineered... radioactive fission products. In addition, the site location and the engineered features included as...
10 CFR 100.10 - Factors to be considered when evaluating sites.
Code of Federal Regulations, 2012 CFR
2012-01-01
... reactor incorporates unique or unusual features having a significant bearing on the probability or consequences of accidental release of radioactive materials; (4) The safety features that are to be engineered... radioactive fission products. In addition, the site location and the engineered features included as...
10 CFR 100.10 - Factors to be considered when evaluating sites.
Code of Federal Regulations, 2014 CFR
2014-01-01
... reactor incorporates unique or unusual features having a significant bearing on the probability or consequences of accidental release of radioactive materials; (4) The safety features that are to be engineered... radioactive fission products. In addition, the site location and the engineered features included as...
NASA Technical Reports Server (NTRS)
Gracey, William; Jewel, Joseph W., Jr.; Carpenter, Gene T.
1960-01-01
The overall errors of the service altimeter installations of a variety of civil transport, military, and general-aviation airplanes have been experimentally determined during normal landing-approach and take-off operations. The average height above the runway at which the data were obtained was about 280 feet for the landings and about 440 feet for the take-offs. An analysis of the data obtained from 196 airplanes during 415 landing approaches and from 70 airplanes during 152 take-offs showed that: 1. The overall error of the altimeter installations in the landing- approach condition had a probable value (50 percent probability) of +/- 36 feet and a maximum probable value (99.7 percent probability) of +/- 159 feet with a bias of +10 feet. 2. The overall error in the take-off condition had a probable value of +/- 47 feet and a maximum probable value of +/- 207 feet with a bias of -33 feet. 3. The overall errors of the military airplanes were generally larger than those of the civil transports in both the landing-approach and take-off conditions. In the landing-approach condition the probable error and the maximum probable error of the military airplanes were +/- 43 and +/- 189 feet, respectively, with a bias of +15 feet, whereas those for the civil transports were +/- 22 and +/- 96 feet, respectively, with a bias of +1 foot. 4. The bias values of the error distributions (+10 feet for the landings and -33 feet for the take-offs) appear to represent a measure of the hysteresis characteristics (after effect and recovery) and friction of the instrument and the pressure lag of the tubing-instrument system.
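For a normal error distribution, the reported "probable value" (50 percent probability) and "maximum probable value" (99.7 percent probability) correspond to roughly 0.6745 and 3 standard deviations, respectively. A sketch, under the normality assumption, checking the landing-approach figures for consistency:

```python
def probable_error(sigma):
    # 50% of a normal distribution lies within +/- 0.6745 sigma of the mean
    return 0.6745 * sigma

def max_probable_error(sigma):
    # 99.7% of a normal distribution lies within +/- 3 sigma of the mean
    return 3.0 * sigma

# Back out sigma from the reported landing-approach probable value (+/- 36 ft),
# then predict the maximum probable value (normality is an assumption here):
sigma = 36.0 / 0.6745
print(round(max_probable_error(sigma)))  # 160, close to the reported 159 ft
```

The close agreement between the predicted 160 ft and the reported 159 ft suggests the landing-approach errors were close to normally distributed about their +10 ft bias.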
Revisiting Ramakrishnan's approach to relativity. [Velocity addition theorem uniqueness]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nandi, K.K.; Shankara, T.S.
The conditions under which the velocity addition theorem (VAT) is formulated by Ramakrishnan gave rise to doubts about the uniqueness of the theorem. These conditions are rediscussed with reference to their algebraic and experimental implications. 9 references.
DOT National Transportation Integrated Search
2009-10-13
This paper describes a probabilistic approach to estimate the conditional probability of release of hazardous materials from railroad tank cars during train accidents. Monte Carlo methods are used in developing a probabilistic model to simulate head ...
The Dependence Structure of Conditional Probabilities in a Contingency Table
ERIC Educational Resources Information Center
Joarder, Anwar H.; Al-Sabah, Walid S.
2002-01-01
Conditional probability and statistical independence can be better explained with contingency tables. In this note some special cases of 2 x 2 contingency tables are considered. In turn an interesting insight into statistical dependence as well as independence of events is obtained.
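A minimal sketch of the idea, with an illustrative 2 x 2 table of counts, computing a conditional probability and checking statistical independence (the counts are made up for the example):

```python
# 2x2 contingency table of counts (illustrative values):
# rows: event A / not-A; columns: event B / not-B.
table = [[30, 20],   # A and B,     A and not-B
         [30, 20]]   # not-A and B, not-A and not-B

n = sum(sum(row) for row in table)
p_a = sum(table[0]) / n                     # P(A)
p_b = (table[0][0] + table[1][0]) / n       # P(B)
p_ab = table[0][0] / n                      # P(A and B)
p_b_given_a = table[0][0] / sum(table[0])   # P(B | A) = P(A and B) / P(A)

print(p_b_given_a)                          # 0.6
print(abs(p_ab - p_a * p_b) < 1e-12)        # True: A and B are independent here
```

In this table the rows are proportional, so P(B | A) = P(B) and the events are independent; perturbing any single cell breaks the product rule and makes the events dependent.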
NASA Astrophysics Data System (ADS)
Kaur, Prabhmandeep; Jain, Virander Kumar; Kar, Subrat
2014-12-01
In this paper, we investigate the performance of a Free Space Optic (FSO) link considering the impairments caused by the presence of various weather conditions, such as very clear air, drizzle, haze, and fog, and by turbulence in the atmosphere. An analytic expression for the outage probability is derived using the gamma-gamma distribution for turbulence and accounting for the effect of weather conditions using the Beer-Lambert law. The effect of receiver diversity schemes using aperture averaging and array receivers on the outage probability is studied and compared. As the aperture diameter is increased, the outage probability decreases irrespective of the turbulence strength (weak, moderate, and strong) and weather conditions. Similar effects are observed when the number of direct detection receivers in the array is increased. However, it is seen that as the desired outage probability decreases, the array receiver becomes the preferred choice as compared to the receiver with aperture averaging.
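A Monte Carlo sketch of the setup described above: the Beer-Lambert law gives a deterministic weather attenuation, while gamma-gamma turbulence is sampled as the product of two unit-mean gamma variates. All parameter values (attenuation coefficients, threshold, turbulence parameters) are illustrative assumptions, not the paper's:

```python
import random

def outage_probability(alpha, beta, atten_db_per_km, link_km, thresh,
                       n=20000, seed=1):
    """Monte Carlo estimate of outage probability over an FSO link.

    Gamma-gamma turbulence is modeled as I = X * Y with X ~ Gamma(alpha),
    Y ~ Gamma(beta), both scaled to unit mean; Beer-Lambert attenuation is
    a deterministic multiplicative loss. An outage occurs when the received
    irradiance falls below `thresh`.
    """
    h_weather = 10 ** (-atten_db_per_km * link_km / 10)  # Beer-Lambert loss
    rng = random.Random(seed)
    outages = 0
    for _ in range(n):
        irr = rng.gammavariate(alpha, 1 / alpha) * rng.gammavariate(beta, 1 / beta)
        if irr * h_weather < thresh:
            outages += 1
    return outages / n

# Fog (heavier attenuation, assumed 20 dB/km) gives a larger outage
# probability than clear air (assumed 0.5 dB/km) at the same turbulence:
p_clear = outage_probability(4.0, 2.0, atten_db_per_km=0.5, link_km=1.0, thresh=0.1)
p_fog = outage_probability(4.0, 2.0, atten_db_per_km=20.0, link_km=1.0, thresh=0.1)
print(p_clear < p_fog)  # True
```

Aperture averaging and array receivers would enter this sketch as averaging of several independent irradiance draws before the threshold comparison, which shrinks the variance and hence the outage probability.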
Duality based direct resolution of unique profiles using zero concentration region information.
Tavakkoli, Elnaz; Rajkó, Róbert; Abdollahi, Hamid
2018-07-01
Self Modeling Curve Resolution (SMCR) is a class of techniques concerned with estimating the pure profiles underlying a set of measurements on chemical systems. In general, the estimated profiles are ambiguous (non-unique) unless some special conditions are fulfilled. Incorporating adequate information can reduce the so-called rotational ambiguity effectively and, in the most desirable cases, lead to a unique solution. Therefore, studies on the circumstances resulting in a unique solution are of particular importance. The conditions for a unique solution can in particular be studied on the basis of the duality principle. In a bilinear chemical (e.g., spectroscopic) data matrix, there is a natural duality between its row and column vector spaces under minimal constraints (non-negativity of concentrations and absorbances). In this article, we show the conditions for a unique solution according to the duality concept, using zero concentration region information. A simulated dataset of three components and an experimental system with synthetic mixtures containing the three amino acids tyrosine, phenylalanine, and tryptophan are analyzed. It is shown that in the presence of sufficient information, a reliable unique solution is obtained, which is valuable for analytical qualification and for quantitative verification analysis. Copyright © 2018 Elsevier B.V. All rights reserved.
In search of the Hohenberg-Kohn theorem
NASA Astrophysics Data System (ADS)
Lammert, Paul E.
2018-04-01
The Hohenberg-Kohn theorem, a cornerstone of electronic density functional theory, concerns uniqueness of external potentials yielding given ground densities of an N-body system. The problem is rigorously explored in a universe of three-dimensional Kato-class potentials, with emphasis on trade-offs between conditions on the density and conditions on the potential sufficient to ensure uniqueness. Sufficient conditions range from none on potentials coupled with everywhere strict positivity of the density to none on the density coupled with something a little weaker than local 3N/2-power integrability of the potential on a connected full-measure set. A second theme is localizability, that is, the possibility of uniqueness over subsets of R^3 under less stringent conditions.
Being and feeling unique: statistical deviance and psychological marginality.
Frable, D E
1993-03-01
Two studies tested the hypothesis that people with culturally stigmatized and concealable conditions (e.g., gays, epileptics, juvenile delinquents, and incest victims) would be more likely to feel unique than people with culturally valued or conspicuous conditions (e.g., the physically attractive, the intellectually gifted, the obese, and the facially scarred). In Study 1, culturally stigmatized individuals with concealable conditions were least likely to perceive consensus between their personal preferences and those of others. In Study 2, they were most likely to describe themselves as unique and to make these self-relevant decisions quickly. Marginality is a psychological reality, not just a statistical one, for those with stigmatized and concealable "master status" conditions.
ERIC Educational Resources Information Center
White, Susan W.; Bray, Bethany C.; Ollendick, Thomas H.
2012-01-01
Social Anxiety Disorder (SAD) and Autism Spectrum Disorders (ASD) are fairly common psychiatric conditions that impair the functioning of otherwise healthy young adults. Given that the two conditions frequently co-occur, measurement of the characteristics unique to each condition is critical. This study evaluated the structure and construct…
Conditional clustering of temporal expression profiles
Wang, Ling; Montano, Monty; Rarick, Matt; Sebastiani, Paola
2008-01-01
Background Many microarray experiments produce temporal profiles in different biological conditions, but common clustering techniques are not able to analyze the data conditional on the biological conditions. Results This article presents a novel technique to cluster data from time course microarray experiments performed across several experimental conditions. Our algorithm uses polynomial models to describe the gene expression patterns over time, a full Bayesian approach with proper conjugate priors to make the algorithm invariant to linear transformations, and an iterative procedure to identify genes that have a common temporal expression profile across two or more experimental conditions and genes that have a unique temporal profile in a specific condition. Conclusion We use simulated data to evaluate the effectiveness of this new algorithm in finding the correct number of clusters and in identifying genes with common and unique profiles. We also use the algorithm to characterize the response of human T cells to stimulation of antigen-receptor signaling, using gene expression temporal profiles measured in six different biological conditions, and we identify common and unique genes. These studies suggest that the methodology proposed here is useful in identifying and distinguishing uniquely stimulated genes from commonly stimulated genes in response to variable stimuli. Software for using this clustering method is available from the project home page. PMID:18334028
Reichert, Brian E.; Martin, J.; Kendall, William L.; Cattau, Christopher E.; Kitchens, Wiley M.
2010-01-01
Individuals in wild populations face risks associated with both intrinsic (i.e. aging) and external (i.e. environmental) sources of mortality. Condition-dependent mortality occurs when there is an interaction between such factors; however, few studies have clearly demonstrated condition-dependent mortality and some have even argued that condition-dependent mortality does not occur in wild avian populations. Using large sample sizes (2084 individuals, 3746 re-sights) of individual-based longitudinal data collected over a 33 year period (1976-2008) on multiple cohorts, we used a capture-mark-recapture framework to model age-dependent survival in the snail kite Rostrhamus sociabilis plumbeus population in Florida. Adding to the growing amount of evidence for actuarial senescence in wild populations, we found evidence of senescent declines in survival probabilities in adult kites. We also tested the hypothesis that older kites experienced condition-dependent mortality during a range-wide drought event (2000-2002). The results provide convincing evidence that the annual survival probability of senescent kites was disproportionately affected by the drought relative to the survival probability of prime-aged adults. To our knowledge, this is the first evidence of condition-dependent mortality to be demonstrated in a wild avian population, a finding which challenges recent conclusions drawn in the literature. Our study suggests that senescence and condition-dependent mortality can affect the demography of wild avian populations. Accounting for these sources of variation may be particularly important to appropriately compute estimates of population growth rate, and probabilities of quasi-extinctions.
BIODEGRADATION PROBABILITY PROGRAM (BIODEG)
The Biodegradation Probability Program (BIODEG) calculates the probability that a chemical under aerobic conditions with mixed cultures of microorganisms will biodegrade rapidly or slowly. It uses fragment constants developed using multiple linear and non-linear regressions and d...
Complicated grief associated with hurricane Katrina.
Shear, M Katherine; McLaughlin, Katie A; Ghesquiere, Angela; Gruber, Michael J; Sampson, Nancy A; Kessler, Ronald C
2011-08-01
Although losses are important consequences of disasters, few epidemiological studies of disasters have assessed complicated grief (CG) and none assessed CG associated with losses other than death of a loved one. Data come from the baseline survey of the Hurricane Katrina Community Advisory Group, a representative sample of 3,088 residents of the areas directly affected by Hurricane Katrina. A brief screen for CG was included containing four items consistent with the proposed DSM-V criteria for a diagnosis of bereavement-related adjustment disorder. Fifty-eight and a half percent of respondents reported a significant hurricane-related loss: Most-severe losses were 29.0% tangible, 9.5% interpersonal, 8.1% intangible, 4.2% work/financial, and 3.7% death of a loved one. Twenty-six point one percent of respondents with significant loss had possible CG and 7.0% moderate-to-severe CG. Death of a loved one was associated with the highest conditional probability of moderate-to-severe CG (18.5%, compared to 1.1-10.5% conditional probabilities for other losses), but accounted for only 16.5% of moderate-to-severe CG due to its comparatively low prevalence. Most moderate-to-severe CG was due to tangible (52.9%) or interpersonal (24.0%) losses. Significant predictors of CG were mostly unique to either bereavement (racial-ethnic minority status, social support) or other losses (prehurricane history of psychopathology, social competence). Nonbereavement losses accounted for the vast majority of hurricane-related possible CG despite risk of CG being much higher in response to bereavement than to other losses. This result argues for expansion of research on CG beyond bereavement and alerts clinicians to the need to address postdisaster grief associated with a wide range of losses. © 2011 Wiley-Liss, Inc.
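The finding that bereavement has the highest conditional probability of CG yet accounts for a minority of cases follows from the law of total probability. A sketch using the reported loss prevalences; the conditional probabilities other than bereavement's reported 18.5% are illustrative assumptions chosen inside the reported 1.1-10.5% range:

```python
# Share of CG cases attributable to each loss type:
# P(loss, CG) = P(loss) * P(CG | loss); share = P(loss, CG) / sum over losses.
# Prevalences are from the study; conditionals other than "death" are assumed.
prevalence = {"death": 0.037, "tangible": 0.290, "interpersonal": 0.095}
p_cg_given = {"death": 0.185, "tangible": 0.070, "interpersonal": 0.080}

joint = {k: prevalence[k] * p_cg_given[k] for k in prevalence}
total = sum(joint.values())
share_death = joint["death"] / total
print(round(share_death, 3))  # under 20% despite the highest conditional probability
```

A high conditional risk multiplied by a low prevalence yields a small joint probability, which is why tangible losses dominate the case count even though bereavement is the riskiest loss type.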
Complicated grief associated with Hurricane Katrina
Shear, M. Katherine; McLaughlin, Katie A.; Ghesquiere, Angela; Gruber, Michael J.; Sampson, Nancy A.; Kessler, Ronald C.
2011-01-01
Background Although losses are important consequences of disasters, few epidemiological studies of disasters have assessed complicated grief (CG) and none assessed CG associated with losses other than death of a loved one. Methods Data come from the baseline survey of the Hurricane Katrina Community Advisory Group (CAG), a representative sample of 3,088 residents of the areas directly affected by Hurricane Katrina. A brief screen for CG was included containing four items consistent with the proposed DSM 5 criteria for a diagnosis of bereavement-related adjustment disorder. Results 58.5% of respondents reported a significant hurricane-related loss: Most-severe losses were 29.0% tangible, 9.5% interpersonal, 8.1% intangible, 4.2% work-financial, and 3.7% death of a loved one. 26.1% of respondents with significant loss had possible CG and 7.0% moderate-severe CG. Death of a loved one was associated with the highest conditional probability of moderate-severe CG (18.5%, compared to 1.1–10.5% conditional probabilities for other losses) but accounted for only 16.5% of moderate-severe CG due to its comparatively low prevalence. Most moderate-severe CG was due to tangible (52.9%) or interpersonal (24.0%) losses. Significant predictors of CG were mostly unique to either bereavement (racial-ethnic minority status, social support) or other losses (pre-hurricane history of psychopathology, social competence). Conclusions Non-bereavement losses accounted for the vast majority of hurricane-related possible CG despite risk of CG being much higher in response to bereavement than to other losses. This result argues for expansion of research on CG beyond bereavement and alerts clinicians to the need to address post-disaster grief associated with a wide range of losses. PMID:21796740
2012-01-01
We derive the mean-field equations arising as the limit of a network of interacting spiking neurons, as the number of neurons goes to infinity. The neurons belong to a fixed number of populations and are represented either by the Hodgkin-Huxley model or by one of its simplified versions, the FitzHugh-Nagumo model. The synapses between neurons are either electrical or chemical. The network is assumed to be fully connected. The maximum conductances vary randomly. Under the condition that all neurons' initial conditions are drawn independently from the same law that depends only on the population they belong to, we prove that a propagation of chaos phenomenon takes place, namely that in the mean-field limit, any finite number of neurons become independent and, within each population, have the same probability distribution. This probability distribution is a solution of a set of implicit equations, either nonlinear stochastic differential equations resembling the McKean-Vlasov equations or non-local partial differential equations resembling the McKean-Vlasov-Fokker-Planck equations. We prove the well-posedness of the McKean-Vlasov equations, i.e. the existence and uniqueness of a solution. We also show the results of some numerical experiments that indicate that the mean-field equations are a good representation of the mean activity of a finite size network, even for modest sizes. These experiments also indicate that the McKean-Vlasov-Fokker-Planck equations may be a good way to understand the mean-field dynamics through, e.g. a bifurcation analysis. Mathematics Subject Classification (2000): 60F99, 60B10, 92B20, 82C32, 82C80, 35Q80. PMID:22657695
Social learning spreads knowledge about dangerous humans among American crows.
Cornell, Heather N; Marzluff, John M; Pecoraro, Shannon
2012-02-07
Individuals face evolutionary trade-offs between the acquisition of costly but accurate information gained firsthand and the use of inexpensive but possibly less reliable social information. American crows (Corvus brachyrhynchos) use both sources of information to learn the facial features of a dangerous person. We exposed wild crows to a novel 'dangerous face' by wearing a unique mask as we trapped, banded and released 7-15 birds at five study sites near Seattle, WA, USA. An immediate scolding response to the dangerous mask after trapping by previously captured crows demonstrates individual learning, while an immediate response by crows that were not captured probably represents conditioning to the trapping scene by the mob of birds that assembled during the capture. Later recognition of dangerous masks by lone crows that were never captured is consistent with horizontal social learning. Independent scolding by young crows, whose parents had conditioned them to scold the dangerous mask, demonstrates vertical social learning. Crows that directly experienced trapping later discriminated among dangerous and neutral masks more precisely than did crows that learned through social means. Learning enabled scolding to double in frequency and spread at least 1.2 km from the place of origin over a 5 year period at one site.
Employee choice of flexible spending account participation and health plan.
Hamilton, Barton H; Marton, James
2008-07-01
Despite the fact that flexible spending accounts (FSAs) are becoming an increasingly popular employer-provided health benefit, there has been very little empirical study of FSA use among employees at the individual level. This study contributes to the literature on FSAs using a unique data set that provides three years of employee-level-matched benefits data. Motivated by the theoretical model of FSA choice presented in Cardon and Showalter (J. Health Econ. 2001; 20(6):935-954), we examine the determinants of FSA participation and contribution levels using cross-sectional and random-effect two-part models. FSA participation and health plan choice are also modeled jointly in each year using conditional logit models. We find that, even after controlling for a number of other demographic characteristics, non-whites are less likely to participate in the FSA program, have lower contributions conditional on participation, and have a lower probability of switching to new lower cost share, higher premium plans when they were introduced. We also find evidence that choosing health plans with more expected out-of-pocket expenses is correlated with participation in the FSA program. Copyright (c) 2007 John Wiley & Sons, Ltd.
A mitochondrial mutator plasmid that causes senescence under dietary restricted conditions
Maas, Marc FPM; Hoekstra, Rolf F; Debets, Alfons JM
2007-01-01
Background Calorie or dietary restriction extends life span in a wide range of organisms including the filamentous fungus Podospora anserina. Under dietary restricted conditions, P. anserina isolates are several-fold longer lived. This is however not the case in isolates that carry one of the pAL2-1 homologous mitochondrial plasmids. Results We show that the pAL2-1 homologues act as 'insertional mutators' of the mitochondrial genome, which may explain their negative effect on life span extension. Sequencing revealed at least fourteen unique plasmid integration sites, of which twelve were located within the mitochondrial genome and two within copies of the plasmid itself. The plasmids were able to integrate in their entirety, via a non-homologous mode of recombination. Some of the integrated plasmid copies were truncated, which probably resulted from secondary, post-integrative, recombination processes. Integration sites were predominantly located within and surrounding the region containing the mitochondrial rDNA loci. Conclusion We propose a model for the mechanism of integration, based on innate modes of mtDNA recombination, and discuss its possible link with the plasmid's negative effect on dietary restriction mediated life span extension. PMID:17407571
Cyber-Physical Correlations for Infrastructure Resilience: A Game-Theoretic Approach
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rao, Nageswara S; He, Fei; Ma, Chris Y. T.
In several critical infrastructures, the cyber and physical parts are correlated, so that disruptions to one affect the other and hence the whole system. These correlations may be exploited to strategically launch component attacks, and hence must be accounted for to ensure the infrastructure's resilience, specified by its survival probability. We characterize the cyber-physical interactions at two levels: (i) the failure correlation function specifies the conditional survival probability of the cyber sub-infrastructure given the physical sub-infrastructure, as a function of their marginal probabilities, and (ii) the individual survival probabilities of both sub-infrastructures are characterized by first-order differential conditions. We formulate a resilience problem for infrastructures composed of discrete components as a game between the provider and attacker, wherein their utility functions consist of an infrastructure survival probability term and a cost term expressed in terms of the number of components attacked and reinforced. We derive Nash Equilibrium conditions and sensitivity functions that highlight the dependence of infrastructure resilience on the cost term, correlation function, and sub-infrastructure survival probabilities. These results generalize earlier ones based on linear failure correlation functions and independent component failures. We apply the results to models of cloud computing infrastructures and energy grids.
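A brute-force check of Nash equilibrium in a toy discrete version of such a provider-attacker game; the survival function and cost coefficients below are made-up assumptions, not the paper's model:

```python
# Toy provider-attacker game over discrete components. Each player picks how
# many components to reinforce (provider) or attack (attacker); utilities are
# a survival-probability term minus a per-component cost, as in the paper's
# general structure. All functional forms and numbers are illustrative.

def survival(reinforced, attacked):
    # toy survival probability: each unreinforced attacked component cuts it
    return max(0.0, 1.0 - 0.2 * max(0, attacked - reinforced))

def provider_utility(r, a):
    return survival(r, a) - 0.25 * r        # survival minus reinforcement cost

def attacker_utility(r, a):
    return (1.0 - survival(r, a)) - 0.18 * a  # disruption minus attack cost

strategies = range(4)  # 0..3 components reinforced / attacked

def is_nash(r, a):
    # neither player can gain by a unilateral deviation
    return all(provider_utility(r, a) >= provider_utility(x, a) for x in strategies) \
       and all(attacker_utility(r, a) >= attacker_utility(r, y) for y in strategies)

equilibria = [(r, a) for r in strategies for a in strategies if is_nash(r, a)]
print(equilibria)  # [(0, 3)]
```

With these particular costs, reinforcement is too expensive to be worthwhile for the provider, so the unique pure equilibrium has the attacker striking all components; changing the cost coefficients shifts the equilibrium, which is exactly the dependence the paper's sensitivity functions quantify.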
76 FR 770 - Proposed Information Collection; Comment Request; Monthly Wholesale Trade Survey
Federal Register 2010, 2011, 2012, 2013, 2014
2011-01-06
... reduces the time and cost of preparing mailout packages that contain unique variable data, while improving... developing productivity measurements. Estimates produced from the MWTS are based on a probability sample and..., excluding manufacturers' sales branches and offices. Estimated Number of Respondents: 4,500. Estimated Time...
Determinants of Students' Success at University
ERIC Educational Resources Information Center
Danilowicz-Gösele, Kamila; Lerche, Katharina; Meya, Johannes; Schwager, Robert
2017-01-01
This paper studies the determinants of academic success using a unique administrative data set of a German university. We show that high school grades are strongly associated with both graduation probabilities and final grades, whereas variables measuring social origin or income have only a smaller impact. Moreover, the link between high school…
Learning and Visualizing Modulation Discriminative Radio Signal Features
2016-09-01
implemented as a mapping of a sequence of in-phase quadrature (IQ) measurements generated by a software-defined radio to a probability distribution over modulation classes. 3.1 TRAINING SNR EVALUATION Training CNNs on RF data raises the unique question of determining an optimal training SNR, that
2008-12-01
between our current project and the historical projects. Therefore to refine the historical volatility estimate of the previously completed software... historical volatility estimates obtained in the form of beliefs and plausibility based on subjective probabilities that take into consideration unique
Graphene-based nanoparticles (NPs) are used extensively in industrial, consumer, and mechanical applications based on their unique structural properties. Due to increasing use of these NPs, environmental exposure to graphene oxide (GO) is probable. GO has been shown to compromise...
NASA Astrophysics Data System (ADS)
Nie, Xiaokai; Coca, Daniel
2018-01-01
The paper introduces a matrix-based approach to estimate the unique one-dimensional discrete-time dynamical system that generated a given sequence of probability density functions whilst subjected to an additive stochastic perturbation with known density.
Nie, Xiaokai; Coca, Daniel
2018-01-01
The paper introduces a matrix-based approach to estimate the unique one-dimensional discrete-time dynamical system that generated a given sequence of probability density functions whilst subjected to an additive stochastic perturbation with known density.
Evaluation of Differential DependencY (EDDY) is a statistical test for the differential dependency relationship of a set of genes between two given conditions. For each condition, possible dependency network structures are enumerated and their likelihoods are computed to represent a probability distribution of dependency networks. The difference between the probability distributions of dependency networks is computed between conditions, and its statistical significance is evaluated with random permutations of condition labels on the samples.
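The permutation step can be sketched generically: an observed divergence between conditions is compared against its distribution under random relabeling of the samples. In the sketch below, a simple difference of means stands in for EDDY's divergence between network-probability distributions; the data are illustrative:

```python
import random

def permutation_p_value(x, y, n_perm=2000, seed=0):
    """Permutation test: how often does a random relabeling of the pooled
    samples produce a between-group divergence at least as large as observed?"""
    rng = random.Random(seed)
    observed = abs(sum(x) / len(x) - sum(y) / len(y))
    pooled = x + y
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                      # random condition labels
        px, py = pooled[:len(x)], pooled[len(x):]
        if abs(sum(px) / len(px) - sum(py) / len(py)) >= observed:
            count += 1
    return (count + 1) / (n_perm + 1)            # add-one-smoothed p-value

# Two clearly separated conditions (illustrative expression values):
p = permutation_p_value([5.1, 4.8, 5.3, 5.0], [3.9, 4.1, 3.7, 4.0])
print(p < 0.05)  # True
```

EDDY replaces the difference of means with a divergence between enumerated dependency-network distributions, but the significance machinery is this same label-permutation scheme.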
Bayesian data analysis tools for atomic physics
NASA Astrophysics Data System (ADS)
Trassinelli, Martino
2017-10-01
We present an introduction to some concepts of Bayesian data analysis in the context of atomic physics. Starting from basic rules of probability, we present Bayes' theorem and its applications. In particular, we discuss how to calculate simple and joint probability distributions and the Bayesian evidence, a model-dependent quantity that allows one to assign probabilities to different hypotheses from the analysis of the same data set. To give some practical examples, these methods are applied to two concrete cases. In the first example, the presence or not of a satellite line in an atomic spectrum is investigated. In the second example, we determine the most probable model among a set of possible profiles from the analysis of a statistically poor spectrum. We also show how to calculate the probability distribution of the main spectral component without having to determine uniquely the spectrum modeling. For these two studies, we implement the program Nested_fit to calculate the different probability distributions and other related quantities. Nested_fit is a Fortran90/Python code developed during recent years for the analysis of atomic spectra. As indicated by the name, it is based on the nested sampling algorithm, which is presented in detail together with the program itself.
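The role of the Bayesian evidence in comparing hypotheses on the same data set can be sketched with a textbook coin example (not taken from the paper): M1 is a fair coin, M2 has an unknown bias with a uniform prior, and the evidence for M2 integrates the binomial likelihood over that prior:

```python
from math import comb

def evidence_fair(k, n):
    # P(data | M1): binomial likelihood of k heads in n tosses at p = 0.5
    return comb(n, k) * 0.5 ** n

def evidence_uniform(k, n):
    # P(data | M2) = integral over p of C(n,k) p^k (1-p)^(n-k) dp
    #              = C(n,k) * B(k+1, n-k+1) = 1 / (n + 1)
    return 1.0 / (n + 1)

k, n = 9, 10  # observed: 9 heads in 10 tosses (illustrative data)
bayes_factor = evidence_uniform(k, n) / evidence_fair(k, n)
print(bayes_factor > 1)  # True: the data favour the biased-coin model
```

The same evidence ratio, with likelihoods integrated over nuisance parameters, is what Nested_fit computes for competing spectral models; nested sampling is simply an efficient numerical estimator of those integrals.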
Vaccaro, J.J.; Olsen, T.D.
2007-01-01
Unique ID grid with a unique value per Hydrologic Response Unit (HRU) per basin in reference to the estimated ground-water recharge for current conditions in the Yakima Basin Aquifer System (USGS report SIR 2007-5007). Total of 78,144 unique values. This grid made it easy to provide estimates of monthly ground-water recharge for water years 1960-2001 in an electronic format for water managers, planners, and hydrologists that could be related back to a spatially referenced grid by the unique ID.
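One common way to build such an ID, sketched here, is to encode the basin and HRU numbers into a single integer; the per-basin capacity of 1000 is an assumption for illustration, not the scheme used in the report:

```python
# Encode a (basin, HRU) pair into one integer so gridded values can be joined
# back to tabular recharge estimates. Capacity of 1000 HRUs per basin is an
# illustrative assumption; it must exceed the real maximum HRU count.
HRUS_PER_BASIN = 1000

def unique_id(basin, hru):
    return basin * HRUS_PER_BASIN + hru

def decode(uid):
    return divmod(uid, HRUS_PER_BASIN)  # -> (basin, hru)

uid = unique_id(42, 317)
print(uid, decode(uid))  # 42317 (42, 317)
```

Because the encoding is invertible, a single integer raster column suffices to relate every grid cell back to its basin and HRU records.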
A study of parameter identification
NASA Technical Reports Server (NTRS)
Herget, C. J.; Patterson, R. E., III
1978-01-01
A set of definitions for deterministic parameter identifiability was proposed. Deterministic parameter identifiability properties are presented based on four system characteristics: direct parameter recoverability, properties of the system transfer function, properties of output distinguishability, and uniqueness properties of a quadratic cost functional. Stochastic parameter identifiability was defined in terms of the existence of an estimation sequence for the unknown parameters that is consistent in probability. Stochastic parameter identifiability properties are presented based on the following characteristics: convergence properties of the maximum likelihood estimate, properties of the joint probability density functions of the observations, and properties of the information matrix.
Kidnapping model: an extension of Selten's game.
Iqbal, Azhar; Masson, Virginie; Abbott, Derek
2017-12-01
Selten's game is a kidnapping model where the probability of capturing the kidnapper is independent of whether the hostage has been released or executed. Most often, in view of the elevated sensitivities involved, authorities put greater effort and resources into capturing the kidnapper if the hostage has been executed, in contrast with the case when a ransom is paid to secure the hostage's release. In this paper, we study the asymmetric game when the probability of capturing the kidnapper depends on whether the hostage has been executed or not and find a new uniquely determined perfect equilibrium point in Selten's game.
Adaptive tracking control for a class of stochastic switched systems
NASA Astrophysics Data System (ADS)
Zhang, Hui; Xia, Yuanqing
2018-02-01
In this paper, the problem of adaptive tracking is considered for a class of stochastic switched systems. As preliminaries, a criterion for global asymptotic practical stability in probability is first presented with the aid of the common Lyapunov function method. Based on this stability criterion, adaptive backstepping controllers are designed to guarantee that the closed-loop system has a unique global solution, which is globally asymptotically practically stable in probability, and that the tracking error in the fourth moment converges to an arbitrarily small neighbourhood of zero. Simulation examples are given to demonstrate the efficiency of the proposed schemes.
Conditional Probabilities and Collapse in Quantum Measurements
NASA Astrophysics Data System (ADS)
Laura, Roberto; Vanni, Leonardo
2008-09-01
We show that, by including both the system and the apparatus in the quantum description of the measurement process and using the concept of conditional probabilities, it is possible to deduce the statistical operator of the system after a measurement with a given result; this operator gives the probability distribution for all possible consecutive measurements on the system. This statistical operator, representing the state of the system after the first measurement, is in general not the same as the one that would be obtained using the collapse postulate.
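For comparison, the standard collapse prescription against which such a statistical operator is checked is the Lüders rule, rho' = P rho P / Tr(P rho P). The NumPy lines below work it out for a single qubit; this is a textbook calculation, not the authors' apparatus-included treatment.

```python
import numpy as np

# Qubit prepared in |+>; measure Z and obtain the +1 outcome.
plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho = np.outer(plus, plus.conj())

P0 = np.diag([1.0, 0.0])               # projector onto the Z = +1 outcome
prob0 = np.trace(P0 @ rho @ P0).real   # probability of that outcome
rho_post = (P0 @ rho @ P0) / prob0     # Lüders post-measurement state

# Conditional probability of a consecutive X = +1 measurement.
Px = np.outer(plus, plus.conj())
prob_x_given_0 = np.trace(Px @ rho_post @ Px).real
```

Here prob0 and prob_x_given_0 both come out to 1/2, the familiar result for complementary measurements on a qubit.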
Probabilities of good, marginal, and poor flying conditions for space shuttle ferry flights
NASA Technical Reports Server (NTRS)
Whiting, D. M.; Guttman, N. B.
1977-01-01
Empirical probabilities are provided for good, marginal, and poor flying weather for ferrying the Space Shuttle Orbiter from Edwards AFB, California, to Kennedy Space Center, Florida, and from Edwards AFB to Marshall Space Flight Center, Alabama. Results are given by month for each overall route plus segments of each route. The criteria for defining a day as good, marginal, or poor and the method of computing the relative frequencies and conditional probabilities for monthly reference periods are described.
Pre-Service Teachers' Conceptions of Probability
ERIC Educational Resources Information Center
Odafe, Victor U.
2011-01-01
Probability knowledge and skills are needed in science and in making daily decisions that are sometimes made under uncertain conditions. Hence, there is the need to ensure that the pre-service teachers of our children are well prepared to teach probability. Pre-service teachers' conceptions of probability are identified, and ways of helping them…
The Efficacy of Using Diagrams When Solving Probability Word Problems in College
ERIC Educational Resources Information Center
Beitzel, Brian D.; Staley, Richard K.
2015-01-01
Previous experiments have shown a deleterious effect of visual representations on college students' ability to solve total- and joint-probability word problems. The present experiments used conditional-probability problems, known to be more difficult than total- and joint-probability problems. The diagram group was instructed in how to use tree…
Relative Contributions of Three Descriptive Methods: Implications for Behavioral Assessment
ERIC Educational Resources Information Center
Pence, Sacha T.; Roscoe, Eileen M.; Bourret, Jason C.; Ahearn, William H.
2009-01-01
This study compared the outcomes of three descriptive analysis methods--the ABC method, the conditional probability method, and the conditional and background probability method--to each other and to the results obtained from functional analyses. Six individuals who had been diagnosed with developmental delays and exhibited problem behavior…
NASA Technical Reports Server (NTRS)
Whitnah, A. M.; Howes, D. B.
1971-01-01
Statistical information for the Apollo command module water landings is presented. This information includes the probability of occurrence of various impact conditions, a successful impact, and body X-axis loads of various magnitudes.
I show that a conditional probability analysis using a stressor-response model based on a logistic regression provides a useful approach for developing candidate water quality criteria from empirical data, such as the Maryland Biological Streams Survey (MBSS) data.
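A minimal sketch of such a stressor-response fit, with synthetic data standing in for the MBSS survey. The fit is plain gradient ascent on the log-likelihood rather than any particular statistics package, and all variable names and the 0.5 crossing criterion are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic monitoring data: a stressor gradient and a binary
# degraded / not-degraded biological response.
n = 400
stressor = rng.uniform(0, 10, n)
degraded = rng.random(n) < 1 / (1 + np.exp(-(stressor - 5)))

# Fit a logistic stressor-response model by gradient ascent on the
# log-likelihood (centred predictor for stable convergence).
xc = stressor - stressor.mean()
X = np.column_stack([np.ones(n), xc])
w = np.zeros(2)
for _ in range(20000):
    p = 1 / (1 + np.exp(-X @ w))
    w += 0.1 * X.T @ (degraded - p) / n

# Candidate criterion: the stressor level where P(degraded) crosses 0.5.
criterion = stressor.mean() - w[0] / w[1]
```

The fitted curve then gives, for any candidate criterion value, the conditional probability of a degraded biological condition at that stressor level.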
Spatial prediction models for the probable biological condition of streams and rivers in the USA
The National Rivers and Streams Assessment (NRSA) is a probability-based survey conducted by the US Environmental Protection Agency and its state and tribal partners. It provides information on the ecological condition of the rivers and streams in the conterminous USA, and the ex...
Random forest models for the probable biological condition of streams and rivers in the USA
The National Rivers and Streams Assessment (NRSA) is a probability based survey conducted by the US Environmental Protection Agency and its state and tribal partners. It provides information on the ecological condition of the rivers and streams in the conterminous USA, and the ex...
A conditional probability approach using monitoring data to develop geographic-specific water quality criteria for protection of aquatic life is presented. Typical methods to develop criteria using existing monitoring data are limited by two issues: (1) how to extrapolate to an...
A two-stage broadcast message propagation model in social networks
NASA Astrophysics Data System (ADS)
Wang, Dan; Cheng, Shun-Jun
2016-11-01
Message propagation in social networks is becoming a popular topic in complex networks. One type of message in social networks is the broadcast message: a message with a unique destination that is unknown to the publisher, such as a 'lost and found' notice. Its propagation always has two stages. Because of this feature, rumor propagation models and epidemic propagation models have difficulty describing this kind of propagation accurately. In this paper, an improved two-stage susceptible-infected-removed model is proposed. We introduce the concepts of the first forwarding probability and the second forwarding probability. Another part of our work is quantifying how several factors influence the chance of successful message transmission at each level, including the topology of the network, the receiving probability, the first-stage forwarding probability, the second-stage forwarding probability, and the length of the shortest path between the publisher and the destination. The proposed model has been simulated on real networks, and the results demonstrate the model's effectiveness.
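A stripped-down simulation conveys the flavour of the model. This sketch keeps only a single forwarding probability per informed node plus a receiving probability (it omits the paper's distinct second-stage forwarding probability), on a hypothetical ten-node ring with one chord.

```python
import random

random.seed(3)

def broadcast_success(adj, source, dest, p_receive, p_forward, trials=2000):
    """Estimate the chance that a broadcast message reaches its (unknown)
    destination when each informed node forwards once with probability
    p_forward and each contacted neighbour receives with probability
    p_receive."""
    hits = 0
    for _ in range(trials):
        informed = {source}
        frontier = [source]
        while frontier:
            nxt = []
            for u in frontier:
                if random.random() > p_forward:     # node declines to forward
                    continue
                for v in adj[u]:
                    if v not in informed and random.random() < p_receive:
                        informed.add(v)
                        nxt.append(v)
            frontier = nxt
        hits += dest in informed
    return hits / trials

# Toy network: a ring of 10 nodes with a chord between nodes 0 and 5.
adj = {i: [(i - 1) % 10, (i + 1) % 10] for i in range(10)}
adj[0].append(5)
adj[5].append(0)

high = broadcast_success(adj, source=0, dest=5, p_receive=0.9, p_forward=0.9)
low = broadcast_success(adj, source=0, dest=5, p_receive=0.9, p_forward=0.3)
```

Even on this toy graph, lowering the forwarding probability visibly reduces the chance that the message ever reaches its destination, which is the qualitative effect the full model quantifies level by level.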
Nash Equilibrium of Social-Learning Agents in a Restless Multiarmed Bandit Game.
Nakayama, Kazuaki; Hisakado, Masato; Mori, Shintaro
2017-05-16
We study a simple model of social-learning agents in a restless multiarmed bandit (rMAB). The bandit has one good arm that changes to a bad one with a certain probability. Each agent stochastically selects one of two methods with which to seek the good arm: random search (individual learning) or copying information from other agents (social learning). The fitness of an agent is the probability of knowing the good arm in the steady state of the agent system. In this model, we explicitly construct the unique Nash equilibrium state and show that the corresponding strategy for each agent is an evolutionarily stable strategy (ESS) in the sense of Thomas. It is shown that the fitness of an agent with the ESS is superior to that of an asocial learner when the success probability of social learning exceeds a threshold determined by the success probability of individual learning, the probability of change of state of the rMAB, and the number of agents. The ESS Nash equilibrium is a solution to Rogers' paradox.
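The steady-state fitness in such a model can be estimated by direct simulation. The sketch below is a loose caricature, not the paper's model: two arms, a fixed mixing rate between individual and social learning, and invented parameter values throughout.

```python
import random

random.seed(7)

# Toy restless bandit: one of two arms is "good"; the good arm swaps
# with probability q each step, which makes stale knowledge worthless.
q, p_ind, n_agents, steps = 0.02, 0.1, 50, 20000
copy_prob = 0.5          # fraction of effort spent on social learning

good_arm = 0
belief = [None] * n_agents   # belief[i]: which arm agent i thinks is good

def knows(i):
    return belief[i] == good_arm

know_count = 0
for t in range(steps):
    if random.random() < q:              # restless switch of the good arm
        good_arm = 1 - good_arm
    for i in range(n_agents):
        if random.random() < copy_prob:  # social learning: copy someone
            j = random.randrange(n_agents)
            if j != i and knows(j):
                belief[i] = good_arm
        elif random.random() < p_ind:    # individual learning: random search
            belief[i] = good_arm
    know_count += sum(knows(i) for i in range(n_agents))

# Fitness: long-run fraction of (agent, step) pairs that know the good arm.
fitness = know_count / (steps * n_agents)
```

Sweeping copy_prob in such a simulation is one way to see numerically how the payoff of social learning depends on how many others are already informed, the tension behind Rogers' paradox.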
Tucson's Santa Cruz River and the Arroyo Legacy
NASA Astrophysics Data System (ADS)
Betancourt, Julio Luis
1990-01-01
Between 1865 and 1915, arroyos developed in the southwestern United States across diverse hydrological, ecological and cultural settings. That they developed simultaneously has encouraged the search for a common cause, some phenomenon that was equally widespread and synchronous. There are few southwestern streams for which we have even a qualitative understanding of the timelines and processes involved in the initiation and extension of historic arroyos. Tucson's Santa Cruz River, often cited in the arroyo literature, offers a unique opportunity to chronicle the arroyo legacy and evaluate its causes. The present study reconstructs both the physical and cultural circumstances of channel entrenchment along the Santa Cruz River. Primary data include newspaper accounts, notes and plats of General Land Office surveys, eyewitness accounts, legal depositions, and repeat photography. On the Santa Cruz River, arroyo initiation and extension happened during relatively wet decades associated with frequent warm episodes in the tropical Pacific (El Nino conditions). Intensified El Nino activity during the period 1864-1891 may be symptomatic of long-term climatic change, perhaps indicative of global warming and destabilization of Pacific climate at the end of the Little Ice Age. During this period, all but one of the years registering more than three days with rain exceeding 2.54 cm (1 in) in Tucson were El Nino events. The one exception was the summer of 1890, when the central equatorial Pacific was relatively cold but prevailing low surface pressures and low-level winds nevertheless steered tropical moisture from the west coast of Mexico into southern Arizona. In the twentieth century, catastrophic channel widening was caused by floods during El Nino events in 1905, 1915, 1977 and 1983. The Santa Cruz River arroyo formed when climatic conditions heightened the probability of large floods in southern Arizona.
Inadequate engineering of ditches that resulted in abrupt changes in the longitudinal profile of the stream further augmented probabilities that any one of these floods would initiate an arroyo. In the future, changing flood probabilities with low-frequency climatic fluctuations and improved flow conveyance due to intensified land use and channel stabilization will further complicate management of the arroyo in an increasingly urbanized floodplain.
Essential health care among Mexican indigenous people in a universal coverage context.
Servan-Mori, Edson; Pelcastre-Villafuerte, Blanca; Heredia-Pi, Ileana; Montoya-Rodríguez, Arain
2014-01-01
To analyze the influence of indigenous condition on essential health care among Mexican children, older people, and women of reproductive age. The influence of indigenous condition on the probability of receiving medical care for acute respiratory infection (ARI) and acute diarrheal disease (ADD), on vaccination coverage, and on antenatal care (ANC) was analyzed using the 2012 National Health Survey and non-experimental matching methods. Indigenous condition does not per se influence vaccination coverage (in children under 1 year), the probability of receiving attention for ARIs and ADDs, or timely, frequent, quality ANC. Being indigenous and an older adult increases by 9% the probability of having a complete vaccination schedule. The unfavorable structural conditions in which Mexican indigenous people live constitute the persistent mechanism of their health vulnerability. Public policy should consider this level of intervention, so that intensive and focused health strategies contribute to improving their health condition and life.
What Are Probability Surveys used by the National Aquatic Resource Surveys?
The National Aquatic Resource Surveys (NARS) use probability-survey designs to assess the condition of the nation’s waters. In probability surveys (also known as sample-surveys or statistical surveys), sampling sites are selected randomly.
Modeling Dynamic Contrast-Enhanced MRI Data with a Constrained Local AIF.
Duan, Chong; Kallehauge, Jesper F; Pérez-Torres, Carlos J; Bretthorst, G Larry; Beeman, Scott C; Tanderup, Kari; Ackerman, Joseph J H; Garbow, Joel R
2018-02-01
This study aims to develop a constrained local arterial input function (cL-AIF) to improve quantitative analysis of dynamic contrast-enhanced (DCE)-magnetic resonance imaging (MRI) data by accounting for the contrast-agent bolus amplitude error in the voxel-specific AIF. Bayesian probability theory-based parameter estimation and model selection were used to compare tracer kinetic modeling employing either the measured remote-AIF (R-AIF, i.e., the traditional approach) or an inferred cL-AIF against both in silico DCE-MRI data and clinical, cervical cancer DCE-MRI data. When the data model included the cL-AIF, tracer kinetic parameters were correctly estimated from in silico data under contrast-to-noise conditions typical of clinical DCE-MRI experiments. Considering the clinical cervical cancer data, Bayesian model selection was performed for all tumor voxels of the 16 patients (35,602 voxels in total). Among those voxels, a tracer kinetic model that employed the voxel-specific cL-AIF was preferred (i.e., had a higher posterior probability) in 80 % of the voxels compared to the direct use of a single R-AIF. Maps of spatial variation in voxel-specific AIF bolus amplitude and arrival time for heterogeneous tissues, such as cervical cancer, are accessible with the cL-AIF approach. The cL-AIF method, which estimates unique local-AIF amplitude and arrival time for each voxel within the tissue of interest, provides better modeling of DCE-MRI data than the use of a single, measured R-AIF. The Bayesian-based data analysis described herein affords estimates of uncertainties for each model parameter, via posterior probability density functions, and voxel-wise comparison across methods/models, via model selection in data modeling.
Bayesian updating in a fault tree model for shipwreck risk assessment.
Landquist, H; Rosén, L; Lindhe, A; Norberg, T; Hassellöv, I-M
2017-07-15
Shipwrecks containing oil and other hazardous substances have been deteriorating on the seabeds of the world for many years and are threatening to pollute the marine environment. The status of the wrecks and the potential volume of harmful substances present in the wrecks are affected by a multitude of uncertainties. Each shipwreck poses a unique threat, the nature of which is determined by the structural status of the wreck and possible damage resulting from hazardous activities that could potentially cause a discharge. Decision support is required to ensure the efficiency of the prioritisation process and the allocation of resources required to carry out risk mitigation measures. Whilst risk assessments can provide the requisite decision support, comprehensive methods that take into account key uncertainties related to shipwrecks are limited. The aim of this paper was to develop a method for estimating the probability of discharge of hazardous substances from shipwrecks. The method is based on Bayesian updating of generic information on the hazards posed by different activities in the surroundings of the wreck, with information on site-specific and wreck-specific conditions in a fault tree model. Bayesian updating is performed using Monte Carlo simulations for estimating the probability of a discharge of hazardous substances and formal handling of intrinsic uncertainties. An example application involving two wrecks located off the Swedish coast is presented. Results show the estimated probability of opening, discharge and volume of the discharge for the two wrecks and illustrate the capability of the model to provide decision support. Together with consequence estimations of a discharge of hazardous substances, the suggested model enables comprehensive and probabilistic risk assessments of shipwrecks to be made. Copyright © 2017 Elsevier B.V. All rights reserved.
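The Bayesian-updated fault tree can be caricatured in a few lines. The Beta priors, the hypothetical inspection counts, and the two-basic-event OR gate below are invented for illustration; the paper's fault tree and updating scheme are far richer.

```python
import random

random.seed(4)

def beta_sample(a, b):
    # Sample Beta(a, b) via two gamma variates (standard identity).
    x = random.gammavariate(a, 1.0)
    return x / (x + random.gammavariate(b, 1.0))

# Generic prior for a hull-breach basic event: Beta(2, 18).
# Bayesian update with hypothetical wreck-specific inspection data:
# 1 observed failure in 10 checks -> posterior Beta(3, 27).
prior = (2, 18)
obs_fail, obs_ok = 1, 9
post = (prior[0] + obs_fail, prior[1] + obs_ok)

# Monte Carlo propagation through a minimal fault tree:
# top event (discharge) = breach OR valve failure (OR gate).
trials = 50_000
total = 0.0
for _ in range(trials):
    p_breach = beta_sample(*post)      # wreck-specific, updated
    p_valve = beta_sample(2, 8)        # generic, not updated
    total += 1 - (1 - p_breach) * (1 - p_valve)
p_discharge = total / trials
```

Because the uncertain basic-event probabilities are sampled rather than fixed, the Monte Carlo loop propagates the intrinsic uncertainty through the gate structure instead of collapsing it to point estimates.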
Jaguar interactions with pumas and prey at the northern edge of jaguars’ range
2017-01-01
We present the first study that evaluates jaguar-puma interactions in the arid lands of northern Mexico, where jaguars have their northernmost breeding population and both predators are persecuted for livestock depredation. We tested whether jaguars are the dominant species in this unique ecosystem, where: (1) pumas outnumber jaguars, (2) pumas are better adapted to arid environments, and (3) jaguars and pumas are of similar size. We analyzed four years of data with two approaches; a two species conditional occupancy model and an activity patterns analysis. We used camera location and prey presence as covariates for jaguar and puma detection and presence probabilities. We also explored overlap in activities of predators and prey. Where both species were detected, peccary presence was positively correlated with both jaguar and puma presence, whereas in areas where jaguars were detected but pumas were not, deer presence explained the probability of jaguar presence. We found that both predators were more likely to co-occur together than to be found independently, and so we rejected the hypothesis that jaguars were the dominant species in our study area. Predators were mainly nocturnal and their activity patterns overlapped by 60%. Jaguar, as compared with puma, overlapped more with deer and calves; puma overlapped with calves more than with other prey, suggesting a preference. We believe exploring predator relationships at different scales may help elucidate mechanisms that regulate their coexistence. PMID:28133569
Central limit theorem for recurrent random walks on a strip with bounded potential
NASA Astrophysics Data System (ADS)
Dolgopyat, D.; Goldsheid, I.
2018-07-01
We prove that the recurrent random walk (RW) in a random environment (RE) on a strip with bounded potential satisfies the central limit theorem (CLT). The key ingredients of the proof are the analysis of the invariant measure equation and the construction of a linearly growing martingale for walks in a bounded potential. Our main result implies a complete classification of recurrent i.i.d. RWRE on the strip: the walk either exhibits Sinai behaviour, in the sense that the walk position scaled by the square of the logarithm of time converges to a (random) limit (the Sinai law), or it satisfies the CLT. Another application of our main result is the CLT for quasiperiodic environments with Diophantine frequencies in the recurrent case. We complement this result by proving that in the transient case the CLT holds for all uniquely ergodic environments. We also investigate the algebraic structure of the environments satisfying the CLT. In particular, we show that there exists a collection of proper algebraic subvarieties in the space of transition probabilities such that: • if the RE is stationary and ergodic and the transition probabilities are concentrated on one of the subvarieties from our collection, then the CLT holds; • if the environment is i.i.d., then the above condition is also necessary for the CLT. All these results are valid for one-dimensional RWRE with bounded jumps as a particular case of the strip model.
ZERODUR - bending strength: review of achievements
NASA Astrophysics Data System (ADS)
Hartmann, Peter
2017-08-01
Increased demand for using the glass ceramic ZERODUR® under high mechanical loads has called for strength data based on larger statistical samples. Design calculations for a failure probability target value below 1:100 000 cannot be made reliable with parameters derived from 20-specimen samples. The data now available for a variety of surface conditions, ground with different grain sizes and acid-etched for full micro-crack removal, allow stresses four to ten times higher than before. The large sample revealed that breakage stresses of ground surfaces follow the three-parameter Weibull distribution instead of the two-parameter version. This is more reasonable considering that the micro cracks of such surfaces have a maximum depth, which is reflected in the existence of a threshold breakage stress below which the breakage probability is zero. This minimum strength allows calculating minimum lifetimes. Fatigue under load can be taken into account by using the stress corrosion coefficient for the actual environmental humidity. For fully etched surfaces, Weibull statistics fails: the precondition of the Weibull distribution, the existence of one unique failure mechanism, is no longer given. ZERODUR® with fully etched surfaces, free from damage introduced after etching, easily endures 100 MPa tensile stress. The possibility of using ZERODUR® for combined high-precision and high-stress applications was confirmed by the successful launch and continuing operation of LISA Pathfinder, the precursor experiment for the gravitational wave antenna satellite array eLISA.
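The three-parameter Weibull distribution mentioned above differs from the two-parameter version precisely by its threshold stress, below which the failure probability is exactly zero. A minimal sketch, with invented parameter values (not ZERODUR data):

```python
import math

def failure_probability(stress, s0=50.0, eta=120.0, m=5.0):
    """Three-parameter Weibull failure probability.

    s0  - threshold stress: F(s) = 0 for s <= s0 (minimum strength)
    eta - scale parameter
    m   - Weibull modulus (shape)
    All values here are illustrative, not measured material constants.
    """
    if stress <= s0:
        return 0.0
    return 1.0 - math.exp(-(((stress - s0) / eta) ** m))
```

The existence of the threshold s0 is what permits minimum-lifetime statements: any load kept below s0 (after correcting for fatigue via the stress corrosion coefficient) has, in this model, zero breakage probability.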
Self-reproduction in k-inflation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Helmer, Ferdinand; Winitzki, Sergei
2006-09-15
We study cosmological self-reproduction in models of inflation driven by a scalar field φ with a noncanonical kinetic term (k-inflation). We develop a general criterion for the existence of attractors and establish conditions selecting a class of k-inflation models that admit a unique attractor solution. We then consider quantum fluctuations on the attractor background. We show that the correlation length of the fluctuations is of order c_s H^-1, where c_s is the speed of sound. By computing the magnitude of field fluctuations, we determine the coefficients of Fokker-Planck equations describing the probability distribution of the spatially averaged field φ. The field fluctuations are generally large in the inflationary attractor regime; hence, eternal self-reproduction is a generic feature of k-inflation. This is established more formally by demonstrating the existence of stationary solutions of the relevant Fokker-Planck equations. We also show that there exists a (model-dependent) range φ_R < φ < φ_max within which large fluctuations are likely to drive the field towards the upper boundary φ = φ_max, where the semiclassical consideration breaks down. An exit from inflation into reheating without reaching φ_max will occur almost surely (with probability 1) only if the initial value of φ is below φ_R. In this way, strong self-reproduction effects constrain models of k-inflation.
On the Uniqueness Conditions and Bifurcation Criteria in Coupled Thermo-Elasto-Plasticity
NASA Astrophysics Data System (ADS)
Śloderbach, Z.
2017-02-01
The global and local conditions of uniqueness and the criteria excluding a possibility of bifurcation of the equilibrium state for small strains are derived. The conditions and criteria are derived by analyzing the problem of uniqueness of solution of the basic incremental boundary problem of coupled generalized thermo-elasto-plasticity. This paper is a continuation of some previous works by the author, but contains a new derivation of the global and local criteria excluding a possibility of bifurcation of the equilibrium state for a comparison body dependent on statically admissible fields of stress velocity. All the thermal elastoplastic coupling effects, non-associated laws of plastic flow, and the influence of plastic strains on the thermoplastic properties of a body were taken into account in this work. Thus, the mathematical problem considered here is not a self-conjugated problem. The paper contains four Appendices A, B, C and D, where the local necessary and sufficient conditions of uniqueness have been derived.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fan, Linlin; Wang, Hongrui; Wang, Cheng
2017-05-16
Drought risk analysis is essential for regional water resource management. In this study, the probabilistic relationship between precipitation and meteorological drought in Beijing, China, was calculated under three different precipitation conditions (precipitation equal to, greater than, or less than a threshold) based on copulas. The Standardized Precipitation Evapotranspiration Index (SPEI) was calculated from monthly total precipitation and monthly mean temperature data. The trends and variations in the SPEI were analysed using the Hilbert-Huang Transform (HHT) and Mann-Kendall (MK) trend tests with a running approach. The results of the HHT and MK tests indicated a significant decreasing trend in the SPEI. The copula-based conditional probability indicated that the probability of meteorological drought decreased as monthly precipitation increased and that 10 mm can be regarded as the threshold for triggering extreme drought. From a quantitative perspective, when R ≤ 10 mm, the probabilities of moderate drought, severe drought, and extreme drought were 22.1%, 18%, and 13.6%, respectively. This conditional probability distribution not only revealed the occurrence of meteorological drought in Beijing but also provided a quantitative way to analyse the probability of drought under different precipitation conditions. Furthermore, the results provide a useful reference for future drought prediction.
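A copula-based conditional probability of the kind described above can be sketched with a Gaussian copula; the paper does not specify this family, and the dependence strength and percentile thresholds below are invented. Simulation stands in for the closed-form conditional.

```python
import math
import random

random.seed(5)

# Gaussian copula: joint uniforms (U_precip, U_spei) whose dependence
# comes from a bivariate normal with correlation rho (assumed value).
rho = 0.7
n = 100_000
drought_u = 0.2   # "drought": SPEI below its 20th percentile
dry_u = 0.3       # "dry month": precipitation below its 30th percentile

in_dry = in_dry_and_drought = 0
for _ in range(n):
    z1 = random.gauss(0, 1)
    z2 = rho * z1 + math.sqrt(1 - rho**2) * random.gauss(0, 1)
    u1 = 0.5 * (1 + math.erf(z1 / math.sqrt(2)))   # precipitation percentile
    u2 = 0.5 * (1 + math.erf(z2 / math.sqrt(2)))   # SPEI percentile
    if u1 <= dry_u:
        in_dry += 1
        in_dry_and_drought += u2 <= drought_u

# Conditional probability of drought given low precipitation,
# P(SPEI percentile <= 0.2 | precip percentile <= 0.3).
p_drought_given_dry = in_dry_and_drought / in_dry
```

With positive dependence, the conditional probability comes out well above the unconditional 20%, which is the qualitative pattern the copula analysis quantifies per precipitation threshold.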
A procedure for landslide susceptibility zonation by the conditional analysis method
NASA Astrophysics Data System (ADS)
Clerici, Aldo; Perego, Susanna; Tellini, Claudio; Vescovi, Paolo
2002-12-01
Numerous methods have been proposed for landslide probability zonation of the landscape by means of a Geographic Information System (GIS). Among the multivariate methods, i.e. those which simultaneously take into account all the factors contributing to instability, the Conditional Analysis method applied to a subdivision of the territory into Unique Condition Units is particularly straightforward from a conceptual point of view and particularly suited to the use of a GIS. Working on the principle that future landslides are more likely to occur under those conditions which led to past instability, landslide susceptibility is defined by computing the landslide density for each combination of instability factors. The conceptual simplicity of this method, however, does not imply that it is simple to implement, as it requires rather complex operations and a high number of GIS commands. Moreover, in order to achieve satisfactory results, the procedure may have to be repeated several times, changing the factors or modifying the class subdivision. To solve this problem, we created a shell program which, by combining shell commands, commands of the GIS Geographic Resources Analysis Support System (GRASS), and gawk language commands, carries out the whole procedure automatically. This makes the construction of a Landslide Susceptibility Map easy and fast even for large areas, and even when a high spatial resolution is adopted, as shown by the application of the procedure to the Parma River basin in the Italian Northern Apennines.
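The density computation at the heart of the Conditional Analysis method fits in a few lines: group cells by their combination of instability factors (each distinct combination is a Unique Condition Unit) and take the landslide fraction in each group. The factor classes and cells below are invented for illustration.

```python
from collections import Counter

# Each cell of the territory carries a combination of instability
# factors (lithology, slope class, land use) and a landslide flag.
cells = [
    ("clay", "steep",  "bare", True),
    ("clay", "steep",  "bare", True),
    ("clay", "steep",  "wood", False),
    ("sand", "gentle", "wood", False),
    ("sand", "steep",  "bare", True),
    ("sand", "steep",  "bare", False),
    ("sand", "gentle", "wood", False),
    ("clay", "gentle", "wood", False),
]

total = Counter(c[:3] for c in cells)            # cells per condition unit
slides = Counter(c[:3] for c in cells if c[3])   # landslide cells per unit

# Susceptibility of a Unique Condition Unit = landslide density there.
susceptibility = {unit: slides[unit] / total[unit] for unit in total}
```

Mapping each cell back to the susceptibility of its condition unit is then a simple lookup, which is exactly what makes the method convenient inside a GIS.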
Game-Theoretic strategies for systems of components using product-form utilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rao, Nageswara S; Ma, Cheng-Yu; Hausken, K.
Many critical infrastructures are composed of multiple systems of components which are correlated, so that disruptions to one may propagate to others. We consider such infrastructures with correlations characterized in two ways: (i) an aggregate failure correlation function specifies the conditional failure probability of the infrastructure given the failure of an individual system, and (ii) a pairwise correlation function between two systems specifies the failure probability of one system given the failure of the other. We formulate a game for ensuring the resilience of the infrastructure, wherein the utility functions of the provider and attacker are products of an infrastructure survival probability term and a cost term, both expressed in terms of the numbers of system components attacked and reinforced. The survival probabilities of individual systems satisfy first-order differential conditions that lead to simple Nash Equilibrium conditions. We then derive sensitivity functions that highlight the dependence of infrastructure resilience on the cost terms, correlation functions, and individual system survival probabilities. We apply these results to simplified models of distributed cloud computing and energy grid infrastructures.
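The product-form structure can be illustrated with a toy best-response computation. The exponential survival model, cost constants, and component count below are all invented; the paper derives its equilibrium conditions analytically rather than by iteration.

```python
import math

N = 20                       # components available to each side (assumed)
cost_d, cost_a = 0.05, 0.07  # per-component cost rates (assumed)

def p_survive(reinforced, attacked):
    # Toy survival probability: rises with reinforcement, falls with attacks.
    return math.exp(-0.3 * attacked / (1 + reinforced))

def u_provider(r, a):
    # Product-form utility: survival probability term times cost term.
    return p_survive(r, a) * math.exp(-cost_d * r)

def u_attacker(r, a):
    return (1 - p_survive(r, a)) * math.exp(-cost_a * a)

def best_response(util, fixed, over_first):
    choices = range(N + 1)
    if over_first:
        return max(choices, key=lambda r: util(r, fixed))
    return max(choices, key=lambda a: util(fixed, a))

# Iterate best responses until a fixed point (a pure Nash equilibrium).
r, a = 0, 0
for _ in range(100):
    r_new = best_response(u_provider, a, True)
    a_new = best_response(u_attacker, r_new, False)
    if (r_new, a_new) == (r, a):
        break
    r, a = r_new, a_new
```

At the fixed point, neither side can improve its product-form utility by unilaterally changing its number of reinforced or attacked components.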
Anthropogenic warming has increased drought risk in California.
Diffenbaugh, Noah S; Swain, Daniel L; Touma, Danielle
2015-03-31
California is currently in the midst of a record-setting drought. The drought began in 2012 and now includes the lowest calendar-year and 12-mo precipitation, the highest annual temperature, and the most extreme drought indicators on record. The extremely warm and dry conditions have led to acute water shortages, groundwater overdraft, critically low streamflow, and enhanced wildfire risk. Analyzing historical climate observations from California, we find that precipitation deficits in California were more than twice as likely to yield drought years if they occurred when conditions were warm. We find that although there has not been a substantial change in the probability of either negative or moderately negative precipitation anomalies in recent decades, the occurrence of drought years has been greater in the past two decades than in the preceding century. In addition, the probability that precipitation deficits co-occur with warm conditions and the probability that precipitation deficits produce drought have both increased. Climate model experiments with and without anthropogenic forcings reveal that human activities have increased the probability that dry precipitation years are also warm. Further, a large ensemble of climate model realizations reveals that additional global warming over the next few decades is very likely to create ∼ 100% probability that any annual-scale dry period is also extremely warm. We therefore conclude that anthropogenic warming is increasing the probability of co-occurring warm-dry conditions like those that have created the acute human and ecosystem impacts associated with the "exceptional" 2012-2014 drought in California.
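The key quantity in this analysis, the probability that a precipitation deficit yields drought conditional on co-occurring warmth, can be estimated from any joint record of annual anomalies. The synthetic generator below hard-codes the very effect it then measures, so it only illustrates the conditional-probability bookkeeping, not the paper's finding; all thresholds and probabilities are invented.

```python
import random

random.seed(6)

# Synthetic annual anomalies: a year with a precipitation deficit is
# assumed (for illustration) to become a drought year more often when
# the deficit co-occurs with warm conditions.
years = 100_000
dry_warm = dry_cool = drought_dw = drought_dc = 0
for _ in range(years):
    precip = random.gauss(0, 1)
    temp = random.gauss(0, 1)
    if precip < -0.5:                            # precipitation deficit
        warm = temp > 0.5
        p_drought = 0.7 if warm else 0.4         # assumed warmth effect
        is_drought = random.random() < p_drought
        if warm:
            dry_warm += 1
            drought_dw += is_drought
        else:
            dry_cool += 1
            drought_dc += is_drought

p_given_warm_dry = drought_dw / dry_warm   # P(drought | deficit, warm)
p_given_cool_dry = drought_dc / dry_cool   # P(drought | deficit, cool)
```

Applied to observed records instead of synthetic ones, the same two conditional frequencies are what reveal whether warm-dry co-occurrence raises drought risk.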
Mathematical issues in eternal inflation
NASA Astrophysics Data System (ADS)
Singh Kohli, Ikjyot; Haslam, Michael C.
2015-04-01
In this paper, we consider the problem of the existence and uniqueness of solutions to the Einstein field equations for a spatially flat Friedmann-Lemaître-Robertson-Walker universe in the context of stochastic eternal inflation, where the stochastic mechanism is modelled by adding a stochastic forcing term representing Gaussian white noise to the Klein-Gordon equation. We show that under these considerations, the Klein-Gordon equation actually becomes a stochastic differential equation. Therefore, the existence and uniqueness of solutions to Einstein’s equations depend on whether the coefficients of this stochastic differential equation obey Lipschitz continuity conditions. We show that for any choice of V(φ ), the Einstein field equations are not globally well-posed, hence, any solution found to these equations is not guaranteed to be unique. Instead, the coefficients are at best locally Lipschitz continuous in the physical state space of the dynamical variables, which only exist up to a finite explosion time. We further perform Feller’s explosion test for an arbitrary power-law inflaton potential and prove that all solutions to the Einstein field equations explode in a finite time with probability one. This implies that the mechanism of stochastic inflation thus considered cannot be described to be eternal, since the very concept of eternal inflation implies that the process continues indefinitely. We therefore argue that stochastic inflation based on a stochastic forcing term would not produce an infinite number of universes in some multiverse ensemble. In general, since the Einstein field equations in both situations are not well-posed, we further conclude that the existence of a multiverse via the stochastic eternal inflation mechanism considered in this paper is still very much an open question that will require much deeper investigation.
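The stochastic mechanism can be made concrete. A hedged sketch in the standard stochastic-inflation form (the paper's exact noise normalization may differ): adding Gaussian white noise $\xi(t)$ to the Klein-Gordon equation for the inflaton $\phi$ in a flat FLRW background gives

```latex
% Stochastic Klein-Gordon equation (illustrative normalization)
\ddot{\phi} + 3H\dot{\phi} + V'(\phi) = \xi(t),
\qquad
\langle \xi(t)\,\xi(t') \rangle = \sigma^{2}\,\delta(t - t').
```

Written as a first-order system $dX_t = a(X_t)\,dt + b(X_t)\,dW_t$, a unique global strong solution is guaranteed when the coefficients $a$ and $b$ are globally Lipschitz; the paper's point is that for power-law $V(\phi)$ they are only locally Lipschitz, so solutions exist only up to a finite explosion time.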
Estimation of post-test probabilities by residents: Bayesian reasoning versus heuristics?
Hall, Stacey; Phang, Sen Han; Schaefer, Jeffrey P; Ghali, William; Wright, Bruce; McLaughlin, Kevin
2014-08-01
Although the process of diagnosing invariably begins with a heuristic, we encourage our learners to support their diagnoses by analytical cognitive processes, such as Bayesian reasoning, in an attempt to mitigate the effects of heuristics on diagnosing. There are, however, limited data on the use and impact of Bayesian reasoning on the accuracy of disease probability estimates. In this study our objective was to explore whether Internal Medicine residents use a Bayesian process to estimate disease probabilities by comparing their disease probability estimates to literature-derived Bayesian post-test probabilities. We gave 35 Internal Medicine residents four clinical vignettes in the form of a referral letter and asked them to estimate the post-test probability of the target condition in each case. We then compared these to literature-derived probabilities. For each vignette the estimated probability was significantly different from the literature-derived probability. For the two cases with low literature-derived probability our participants significantly overestimated the probability of these target conditions being the correct diagnosis, whereas for the two cases with high literature-derived probability the estimated probability was significantly lower than the calculated value. Our results suggest that residents generate inaccurate post-test probability estimates. Possible explanations for this include ineffective application of Bayesian reasoning, attribute substitution whereby a complex cognitive task is replaced by an easier one (e.g., a heuristic), or systematic rater bias, such as central tendency bias. Further studies are needed to identify the reasons for inaccuracy of disease probability estimates and to explore ways of improving accuracy.
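The literature-derived benchmark the residents were compared against is standard Bayesian post-test arithmetic: convert the pre-test probability to odds, multiply by the likelihood ratio of the test result, and convert back. A sketch with illustrative numbers (not the study's vignettes):

```python
def post_test_probability(pre_test_p, sensitivity, specificity, positive=True):
    """Bayesian update via likelihood ratios:
    post-test odds = pre-test odds * LR, then back to a probability."""
    if positive:
        lr = sensitivity / (1.0 - specificity)      # LR+ for a positive result
    else:
        lr = (1.0 - sensitivity) / specificity      # LR- for a negative result
    pre_odds = pre_test_p / (1.0 - pre_test_p)
    post_odds = pre_odds * lr
    return post_odds / (1.0 + post_odds)

# Example: pre-test probability 20%, test with 90% sensitivity, 80% specificity.
p_pos = post_test_probability(0.20, 0.90, 0.80, positive=True)   # about 0.53
p_neg = post_test_probability(0.20, 0.90, 0.80, positive=False)  # about 0.03
```

The pattern reported in the study, overestimating low post-test probabilities and underestimating high ones, is exactly what this arithmetic makes visible when estimates are compared against it.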
A Geostatistical Scaling Approach for the Generation of Non Gaussian Random Variables and Increments
NASA Astrophysics Data System (ADS)
Guadagnini, Alberto; Neuman, Shlomo P.; Riva, Monica; Panzeri, Marco
2016-04-01
We address manifestations of non-Gaussian statistical scaling displayed by many variables, Y, and their (spatial or temporal) increments. Evidence of such behavior includes symmetry of increment distributions at all separation distances (or lags) with sharp peaks and heavy tails which tend to decay asymptotically as lag increases. Variables reported to exhibit such distributions include quantities of direct relevance to hydrogeological sciences, e.g. porosity, log permeability, electrical resistivity, soil and sediment texture, sediment transport rate, rainfall, measured and simulated turbulent fluid velocity, and others. No model known to us captures all of the documented statistical scaling behaviors in a unique and consistent manner. We recently proposed a generalized sub-Gaussian model (GSG) which reconciles within a unique theoretical framework the probability distributions of a target variable and its increments. We presented an algorithm to generate unconditional random realizations of statistically isotropic or anisotropic GSG functions and illustrated it in two dimensions. In this context, we demonstrated the feasibility of estimating all key parameters of a GSG model underlying a single realization of Y by analyzing jointly spatial moments of Y data and corresponding increments. Here, we extend our GSG model to account for noisy measurements of Y at a discrete set of points in space (or time), present an algorithm to generate conditional realizations of the corresponding isotropic or anisotropic random field, and explore them on one- and two-dimensional synthetic test cases.
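The sub-Gaussian construction at the heart of such models can be sketched in one dimension: each realization is a correlated Gaussian series scaled by a single random subordinator U > 0, so the ensemble of values is a scale mixture of Gaussians with a sharp peak and heavy tails. The lognormal subordinator and AR(1) correlation below are simplifying assumptions, not the published GSG specification.

```python
import math, random

def gsg_pool(realizations=200, n=100, rho=0.9, seed=0):
    """Pool values from many one-dimensional sub-Gaussian realizations:
    Y = U * G, with G an AR(1) Gaussian series and U drawn once per
    realization (lognormal here, purely for illustration)."""
    rng = random.Random(seed)
    pool = []
    for _ in range(realizations):
        u = math.exp(rng.gauss(0.0, 0.7))   # subordinator for this realization
        g = 0.0
        for _ in range(n):
            g = rho * g + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
            pool.append(u * g)
    return pool

def excess_kurtosis(xs):
    m = sum(xs) / len(xs)
    v = sum((x - m) ** 2 for x in xs) / len(xs)
    k4 = sum((x - m) ** 4 for x in xs) / len(xs)
    return k4 / v ** 2 - 3.0

k = excess_kurtosis(gsg_pool())   # well above 0: heavier-tailed than Gaussian
```

Conditioning on U, each realization is Gaussian; the leptokurtosis appears at the ensemble level, which is why pooling across realizations is needed to see it.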
Secondary School Students' Reasoning about Conditional Probability, Samples, and Sampling Procedures
ERIC Educational Resources Information Center
Prodromou, Theodosia
2016-01-01
In the Australian mathematics curriculum, Year 12 students (aged 16-17) are asked to solve conditional probability problems that involve the representation of the problem situation with two-way tables or three-dimensional diagrams and consider sampling procedures that result in different correct answers. In a small exploratory study, we…
We show that a conditional probability analysis that utilizes a stressor-response model based on a logistic regression provides a useful approach for developing candidate water quality criteria from empirical data. The critical step in this approach is transforming the response ...
A conditional probability analysis (CPA) approach has been developed for identifying biological thresholds of impact for use in the development of geographic-specific water quality criteria for protection of aquatic life. This approach expresses the threshold as the likelihood ...
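The logistic stressor-response step in such an analysis can be sketched as follows; the coefficients are hypothetical, and a real criterion would be fit to paired stressor/response field data:

```python
import math

# Hypothetical logistic stressor-response coefficients (assumed, not fitted).
b0, b1 = -4.0, 0.8

def p_impair(stressor):
    """Conditional probability of biological impairment given a stressor level."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * stressor)))

def candidate_criterion(target_p=0.5):
    """Invert the logistic model: the stressor level at which the conditional
    probability of impairment reaches target_p (a candidate threshold)."""
    return (math.log(target_p / (1.0 - target_p)) - b0) / b1

x50 = candidate_criterion(0.5)   # with b0=-4, b1=0.8 this is 5.0
```

Expressing the threshold as "the stressor level where P(impairment | stressor) crosses a chosen likelihood" is the sense in which the CPA approach states criteria as conditional probabilities.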
Racial/Ethnic and County-level Disparity in Inpatient Utilization among Hawai'i Medicaid Population.
Siriwardhana, Chathura; Lim, Eunjung; Aggarwal, Lovedhi; Davis, James; Hixon, Allen; Chen, John J
2018-05-01
We investigated racial/ethnic and county-level disparities in inpatient utilization for 15 clinical conditions among Hawai'i's Medicaid population. The study was conducted using inpatient claims data from more than 200,000 Hawai'i Medicaid beneficiaries, reported in the year 2010. The analysis was performed by stratifying the Medicaid population into three age groups: the child and adolescent group (1-20 years), the adult group (21-64 years), and the elderly group (65 years and above). Among the differences found, Asians had a lower probability of inpatient admission than Whites for many disease categories, while Native Hawaiian/Pacific Islanders had higher probabilities than Whites, across all age groups. Pediatric and adult groups from Hawai'i County (Big Island) had lower probabilities of inpatient admission compared to Honolulu County (O'ahu) for most disease conditions, but higher probabilities were observed for several conditions in the elderly group. Notably, the elderly population residing in Kaua'i County (Kaua'i and Ni'ihau islands) had substantially increased odds of hospital admission for several disease conditions, compared to Honolulu.
Oberauer, Klaus; Awh, Edward; Sutterer, David W.
2016-01-01
We report five experiments examining whether associations in visual working memory are subject to proactive interference from long-term memory (LTM). Following a long-term learning phase in which participants learned the colors of 120 unique objects, a working memory (WM) test was administered in which participants recalled the precise colors of three concrete objects in an array. Each array in the WM test consisted of one old (previously learned) object with a new color (old-mismatch), one old object with its old color (old-match), and one new object. Experiments 1 to 3 showed that WM performance was better in the old-match condition than in the new condition, reflecting a beneficial contribution from long-term memory. In the old-mismatch condition, participants sometimes reported colors associated with the relevant shape in LTM, but the probability of successful recall was equivalent to that in the new condition. Thus, information from LTM only intruded in the absence of reportable information in WM. Experiment 4 tested for, and failed to find, proactive interference from the preceding trial in the WM test: Performance in the old-mismatch condition, presenting an object from the preceding trial with a new color, was equal to performance with new objects. Experiment 5 showed that long-term memory for object-color associations is subject to proactive interference. We conclude that the exchange of information between LTM and WM appears to be controlled by a gating mechanism that protects the contents of WM from proactive interference but admits LTM information when it is useful. PMID:27685018
Brinkløv, Signe; Elemans, Coen P. H.
2017-01-01
Oilbirds are active at night, foraging for fruits using keen olfaction and extremely light-sensitive eyes, and echolocate as they leave and return to their cavernous roosts. We recorded the echolocation behaviour of wild oilbirds using a multi-microphone array as they entered and exited their roosts under different natural light conditions. During echolocation, the birds produced click bursts (CBs) lasting less than 10 ms and consisting of a variable number (2–8) of clicks at 2–3 ms intervals. The CBs have a bandwidth of 7–23 kHz at −6 dB from signal peak frequency. We report on two unique characteristics of this avian echolocation system. First, oilbirds reduce both the energy and number of clicks in their CBs under conditions of clear, moonlit skies, compared with dark, moonless nights. Second, we document a frequency mismatch between the reported best frequency of oilbird hearing (approx. 2 kHz) and the bandwidth of their echolocation CBs. This unusual signal-to-sensory system mismatch probably reflects avian constraints on high-frequency hearing but may still allow oilbirds fine-scale, close-range detail resolution at the upper extreme (approx. 10 kHz) of their presumed hearing range. Alternatively, oilbirds, by an as-yet unknown mechanism, are able to hear frequencies higher than currently appreciated. PMID:28573036
Ramey, Andrew M.; Ely, Craig R.; Schmutz, Joel A.; Pearce, John M.; Heard, Darryl J.
2012-01-01
Tundra swans (Cygnus columbianus) are broadly distributed in North America, use a wide variety of habitats, and exhibit diverse migration strategies. We investigated patterns of hematozoa infection in three populations of tundra swans that breed in Alaska using satellite tracking to infer host movement and molecular techniques to assess the prevalence and genetic diversity of parasites. We evaluated whether migratory patterns and environmental conditions at breeding areas explain the prevalence of blood parasites in migratory birds by contrasting the fit of competing models formulated in an occupancy modeling framework and calculating the detection probability of the top model using Akaike Information Criterion (AIC). We described genetic diversity of blood parasites in each population of swans by calculating the number of unique parasite haplotypes observed. Blood parasite infection was significantly different between populations of Alaska tundra swans, with the highest estimated prevalence occurring among birds occupying breeding areas with lower mean daily wind speeds and higher daily summer temperatures. Models including covariates of wind speed and temperature during summer months at breeding grounds better predicted hematozoa prevalence than those that included annual migration distance or duration. Genetic diversity of blood parasites in populations of tundra swans appeared to be related to hematozoa prevalence. Our results suggest ecological conditions at breeding grounds may explain differences of hematozoa infection among populations of tundra swans that breed in Alaska. PMID:23049862
The probability of object-scene co-occurrence influences object identification processes.
Sauvé, Geneviève; Harmand, Mariane; Vanni, Léa; Brodeur, Mathieu B
2017-07-01
Contextual information allows the human brain to make predictions about the identity of objects that might be seen and irregularities between an object and its background slow down perception and identification processes. Bar and colleagues modeled the mechanisms underlying this beneficial effect suggesting that the brain stocks information about the statistical regularities of object and scene co-occurrence. Their model suggests that these recurring regularities could be conceptualized along a continuum in which the probability of seeing an object within a given scene can be high (probable condition), moderate (improbable condition) or null (impossible condition). In the present experiment, we propose to disentangle the electrophysiological correlates of these context effects by directly comparing object-scene pairs found along this continuum. We recorded the event-related potentials of 30 healthy participants (18-34 years old) and analyzed their brain activity in three time windows associated with context effects. We observed anterior negativities between 250 and 500 ms after object onset for the improbable and impossible conditions (improbable more negative than impossible) compared to the probable condition as well as a parieto-occipital positivity (improbable more positive than impossible). The brain may use different processing pathways to identify objects depending on whether the probability of co-occurrence with the scene is moderate (rely more on top-down effects) or null (rely more on bottom-up influences). The posterior positivity could index error monitoring aimed to ensure that no false information is integrated into mental representations of the world.
49 CFR 173.50 - Class 1-Definitions.
Code of Federal Regulations, 2010 CFR
2010-10-01
... insensitive that there is very little probability of initiation or of transition from burning to detonation under normal conditions of transport. 1 The probability of transition from burning to detonation is... contain only extremely insensitive detonating substances and which demonstrate a negligible probability of...
Probability Issues in without Replacement Sampling
ERIC Educational Resources Information Center
Joarder, A. H.; Al-Sabah, W. S.
2007-01-01
Sampling without replacement is an important aspect in teaching conditional probabilities in elementary statistics courses. Different methods proposed in different texts for calculating probabilities of events in this context are reviewed and their relative merits and limitations in applications are pinpointed. An alternative representation of…
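The kind of calculation at issue can be made concrete with the standard two-aces example, worked exactly with rational arithmetic (the article's own examples may differ):

```python
from fractions import Fraction

# Drawing two cards without replacement: by the chain rule,
# P(ace on 1st and ace on 2nd) = P(ace 1st) * P(ace 2nd | ace 1st).
p_first = Fraction(4, 52)
p_second_given_first = Fraction(3, 51)   # one ace and one card removed
p_both = p_first * p_second_given_first  # = 1/221

# With replacement the draws are independent: (4/52)^2 = 1/169.
p_both_replacement = Fraction(4, 52) ** 2
```

The contrast between the two answers (1/221 versus 1/169) is the usual teaching point: removing a card changes the conditional probability of the second draw.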
Duncan C. Lutes; Robert E. Keane; John F. Caratti; Carl H. Key; Nathan C. Benson
2006-01-01
This is probably the most critical phase of FIREMON sampling because this plot ID must be unique across all plots that will be entered in the FIREMON database. The plot identifier is made up of three parts: Registration Code, Project Code, and Plot Number. The FIREMON Analysis Tools program will allow summarization and comparison of plots only if...
Some Factor Analytic Approximations to Latent Class Structure.
ERIC Educational Resources Information Center
Dziuban, Charles D.; Denton, William T.
Three procedures, alpha, image, and uniqueness rescaling, were applied to a joint occurrence probability matrix. That matrix was the basis of a well-known latent class structure. The values of the recurring subscript elements were varied as follows: Case 1 - The known elements were input; Case 2 - The upper bounds to the recurring subscript…
A Simplified, General Approach to Simulating from Multivariate Copula Functions
Barry Goodwin
2012-01-01
Copulas have become an important analytic tool for characterizing multivariate distributions and dependence. One is often interested in simulating data from copula estimates. The process can be analytically and computationally complex and usually involves steps that are unique to a given parametric copula. We describe an alternative approach that uses probability...
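One generic route in the same "simplified, general" spirit, though not necessarily the author's algorithm, is the Gaussian copula: draw correlated standard normals, then push each coordinate through the normal CDF so the margins become uniform while the dependence survives.

```python
import math, random

def std_normal_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def sample_gaussian_copula(rho, n, seed=0):
    """Draw n pairs of uniforms whose dependence is a Gaussian copula
    with correlation parameter rho (bivariate case for simplicity)."""
    rng = random.Random(seed)
    out = []
    for _ in range(n):
        z1 = rng.gauss(0.0, 1.0)
        z2 = rho * z1 + math.sqrt(1.0 - rho * rho) * rng.gauss(0.0, 1.0)
        out.append((std_normal_cdf(z1), std_normal_cdf(z2)))
    return out

pairs = sample_gaussian_copula(0.8, 5000)
```

Applying the inverse CDFs of any desired margins to these uniforms yields a multivariate sample with those margins and this copula, which is the usual second step of copula-based simulation.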
A unique case of right-sided Poland syndrome with true dextrocardia and total situs inversus.
Atasoy, Halil I; Yavuz, Taner; Altunrende, Sevil; Guven, Melih; Kılıcgun, Ali; Polat, Omer; Yesiller, Erkan; Duzenli, Selma
2013-02-01
Poland syndrome has been reported to be associated with true dextrocardia, but not with true situs inversus. In this report, we describe the first patient in the medical literature with Poland syndrome and total situs inversus, and we attempt to highlight the syndrome's probable etiology and pathogenetic mechanisms in utero.
Early Effects of Radical Position Legality in Chinese: An ERP Study
ERIC Educational Resources Information Center
Yum, Yen Na; Su, I-Fan; Law, Sam-Po
2015-01-01
This study aimed to investigate the timecourse and neural underpinnings of the coding of radical positions in Chinese character reading. To isolate effects of radical positions, four types of pseudocharacters were created in which the constituent radicals appeared in positions varying in probability of occurrence, that is, Unique, Dominant,…
Public Education in the Philippine Islands. Bulletin, 1935, No. 9
ERIC Educational Resources Information Center
Cook, Katherine M.
1935-01-01
The initiation and development of public education in the Philippine Islands is unique in modern educational history. Probably only among a people with an enthusiastic belief in the significance of education, under the benevolent guidance of a nation with an equally enthusiastic confidence in its possibilities, could the educational experiment…
Bouchard, Kristofer E.; Ganguli, Surya; Brainard, Michael S.
2015-01-01
The majority of distinct sensory and motor events occur as temporally ordered sequences with rich probabilistic structure. Sequences can be characterized by the probability of transitioning from the current state to upcoming states (forward probability), as well as the probability of having transitioned to the current state from previous states (backward probability). Despite the prevalence of probabilistic sequencing of both sensory and motor events, the Hebbian mechanisms that mold synapses to reflect the statistics of experienced probabilistic sequences are not well understood. Here, we show through analytic calculations and numerical simulations that Hebbian plasticity (correlation, covariance, and STDP) with pre-synaptic competition can develop synaptic weights equal to the conditional forward transition probabilities present in the input sequence. In contrast, post-synaptic competition can develop synaptic weights proportional to the conditional backward probabilities of the same input sequence. We demonstrate that to stably reflect the conditional probability of a neuron's inputs and outputs, local Hebbian plasticity requires balance between competitive learning forces that promote synaptic differentiation and homogenizing learning forces that promote synaptic stabilization. The balance between these forces dictates a prior over the distribution of learned synaptic weights, strongly influencing both the rate at which structure emerges and the entropy of the final distribution of synaptic weights. Together, these results demonstrate a simple correspondence between the biophysical organization of neurons, the site of synaptic competition, and the temporal flow of information encoded in synaptic weights by Hebbian plasticity while highlighting the utility of balancing learning forces to accurately encode probability distributions, and prior expectations over such probability distributions. PMID:26257637
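The paper's central correspondence can be demonstrated in a toy form: Hebbian coincidence counting followed by pre-synaptic normalization (a crude stand-in for pre-synaptic competition, not the paper's full plasticity dynamics) recovers the forward transition probabilities of a Markov sequence.

```python
import random

# True forward transition probabilities of a two-state Markov chain.
P = {"A": {"A": 0.1, "B": 0.9},
     "B": {"A": 0.6, "B": 0.4}}

def simulate(P, steps, seed=1):
    """Generate a state sequence from the chain."""
    rng = random.Random(seed)
    state, seq = "A", ["A"]
    for _ in range(steps):
        state = rng.choices(list(P[state]), weights=list(P[state].values()))[0]
        seq.append(state)
    return seq

def hebbian_forward(seq):
    """Count pre->post coincidences, then normalize over each pre-synaptic
    source so its outgoing weights compete for a fixed budget."""
    W = {s: {t: 0.0 for t in "AB"} for s in "AB"}
    for pre, post in zip(seq, seq[1:]):
        W[pre][post] += 1.0                 # Hebbian coincidence detection
    for pre in W:
        total = sum(W[pre].values())        # pre-synaptic competition
        for post in W[pre]:
            W[pre][post] /= total
    return W

W = hebbian_forward(simulate(P, 20000))     # W[pre][post] ~ P(post | pre)
```

Normalizing over the post-synaptic target instead would, by the same counting argument, recover the backward conditional probabilities, mirroring the paper's pre- versus post-synaptic contrast.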
Modelling detection probabilities to evaluate management and control tools for an invasive species
Christy, M.T.; Yackel Adams, A.A.; Rodda, G.H.; Savidge, J.A.; Tyrrell, C.L.
2010-01-01
For most ecologists, detection probability (p) is a nuisance variable that must be modelled to estimate the state variable of interest (i.e. survival, abundance, or occupancy). However, in the realm of invasive species control, the rate of detection and removal is the rate-limiting step for management of this pervasive environmental problem. For strategic planning of an eradication (removal of every individual), one must identify the least likely individual to be removed, and determine the probability of removing it. To evaluate visual searching as a control tool for populations of the invasive brown treesnake Boiga irregularis, we designed a mark-recapture study to evaluate detection probability as a function of time, gender, size, body condition, recent detection history, residency status, searcher team and environmental covariates. We evaluated these factors using 654 captures resulting from visual detections of 117 snakes residing in a 5-ha semi-forested enclosure on Guam, fenced to prevent immigration and emigration of snakes but not their prey. Visual detection probability was low overall (= 0.07 per occasion) but reached 0.18 under optimal circumstances. Our results supported sex-specific differences in detectability that were a quadratic function of size, with both small and large females having lower detection probabilities than males of those sizes. There was strong evidence for individual periodic changes in detectability of a few days' duration, roughly doubling detection probability (comparing peak to non-elevated detections). Snakes in poor body condition had estimated mean detection probabilities greater than snakes with high body condition. Search teams with high average detection rates exhibited detection probabilities about twice that of search teams with low average detection rates. Surveys conducted with bright moonlight and strong wind gusts exhibited moderately decreased probabilities of detecting snakes. Synthesis and applications.
By emphasizing and modelling detection probabilities, we now know: (i) that eradication of this species by searching is possible, (ii) how much searching effort would be required, (iii) under what environmental conditions searching would be most efficient, and (iv) several factors that are likely to modulate this quantification when searching is applied to new areas. The same approach can be used for evaluation of any control technology or population monitoring programme. © 2009 The Authors. Journal compilation © 2009 British Ecological Society.
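A detection probability estimate translates directly into search-effort planning. Assuming independent surveys with a constant per-survey detection probability p for the least-detectable individual (the 0.07 and 0.18 below echo the abstract's range but are used here only as illustrative inputs):

```python
import math

def p_detect_at_least_once(p, n):
    """Probability an individual is detected in at least one of n surveys:
    1 - (1 - p)^n, under independence."""
    return 1.0 - (1.0 - p) ** n

def surveys_needed(p, confidence=0.95):
    """Smallest n with 1 - (1 - p)^n >= confidence."""
    return math.ceil(math.log(1.0 - confidence) / math.log(1.0 - p))

n_low = surveys_needed(0.07)    # low per-survey detectability
n_high = surveys_needed(0.18)   # detectability under optimal circumstances
```

This is the arithmetic behind points (i) and (ii) above: once p is estimated for the hardest-to-detect class of individuals, the survey count needed for a target removal confidence follows immediately.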
Bivariate categorical data analysis using normal linear conditional multinomial probability model.
Sun, Bingrui; Sutradhar, Brajendra
2015-02-10
Bivariate multinomial data such as the left and right eyes retinopathy status data are analyzed either by using a joint bivariate probability model or by exploiting certain odds ratio-based association models. However, the joint bivariate probability model yields marginal probabilities, which are complicated functions of marginal and association parameters for both variables, and the odds ratio-based association model treats the odds ratios involved in the joint probabilities as 'working' parameters, which are consequently estimated through certain arbitrary 'working' regression models. Also, this latter odds ratio-based model does not provide any easy interpretations of the correlations between two categorical variables. On the basis of pre-specified marginal probabilities, in this paper, we develop a bivariate normal type linear conditional multinomial probability model to understand the correlations between two categorical variables. The parameters involved in the model are consistently estimated using the optimal likelihood and generalized quasi-likelihood approaches. The proposed model and the inferences are illustrated through an intensive simulation study as well as an analysis of the well-known Wisconsin Diabetic Retinopathy status data. Copyright © 2014 John Wiley & Sons, Ltd.
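The generic construction such conditional models build on: specify the marginal of one variable and a conditional distribution of the other given it; their product is a valid joint that reproduces the pre-specified marginal. (Illustrative numbers, not the retinopathy data; the paper's linear conditional parameterization is richer than this.)

```python
# Marginal of Y1 over 2 categories, and the conditional P(Y2 = j | Y1 = k).
p_y1 = [0.3, 0.7]
p_y2_given_y1 = [[0.5, 0.5],    # row k gives the distribution of Y2 when Y1 = k
                 [0.2, 0.8]]

# Joint via the chain rule: P(Y1 = k, Y2 = j) = P(Y1 = k) * P(Y2 = j | Y1 = k).
joint = [[p_y1[k] * p_y2_given_y1[k][j] for j in range(2)] for k in range(2)]

# Summing the joint over Y1 recovers the induced marginal of Y2.
marg_y2 = [sum(joint[k][j] for k in range(2)) for j in range(2)]
# marg_y2 = [0.3*0.5 + 0.7*0.2, 0.3*0.5 + 0.7*0.8] = [0.29, 0.71]
```

Conditional specifications like this keep the marginals interpretable, which is the paper's complaint about the odds ratio-based alternatives.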
Approximation of Failure Probability Using Conditional Sampling
NASA Technical Reports Server (NTRS)
Giesy. Daniel P.; Crespo, Luis G.; Kenney, Sean P.
2008-01-01
In analyzing systems which depend on uncertain parameters, one technique is to partition the uncertain parameter domain into a failure set and its complement, and judge the quality of the system by estimating the probability of failure. If this is done by a sampling technique such as Monte Carlo and the probability of failure is small, accurate approximation can require so many sample points that the computational expense is prohibitive. Previous work of the authors has shown how to bound the failure event by sets of such simple geometry that their probabilities can be calculated analytically. In this paper, it is shown how to make use of these failure bounding sets and conditional sampling within them to substantially reduce the computational burden of approximating failure probability. It is also shown how the use of these sampling techniques improves the confidence intervals for the failure probability estimate for a given number of sample points and how they reduce the number of sample point analyses needed to achieve a given level of confidence.
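The variance-reduction idea can be sketched on a toy problem: the failure set is a small disk whose bounding box B has analytically known probability, so P(F) = P(F | B) · P(B) and only B needs to be sampled. (A made-up failure set; the paper's failure-bounding sets come from the system analysis itself.)

```python
import math, random

def estimate_failure_prob(n=100000, seed=0):
    """Conditional-sampling Monte Carlo: sample uniformly inside the bounding
    box B of the failure disk and scale by the analytic probability of B."""
    r, cx, cy = 0.1, 0.5, 0.5            # failure set: disk of radius r
    p_b = (2.0 * r) ** 2                 # P(B) for the bounding box, analytic
    rng = random.Random(seed)
    hits = 0
    for _ in range(n):                   # every sample lands inside B
        x = rng.uniform(cx - r, cx + r)
        y = rng.uniform(cy - r, cy + r)
        if (x - cx) ** 2 + (y - cy) ** 2 <= r * r:
            hits += 1
    return (hits / n) * p_b              # P(F | B) * P(B)

p_f = estimate_failure_prob()            # exact answer is pi * r^2 = pi * 0.01
```

A crude Monte Carlo over the whole unit square would spend about 96% of its samples outside B, contributing nothing; conditioning every sample on B is what tightens the confidence interval for a fixed budget.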
Pedigrees, Prizes, and Prisoners: The Misuse of Conditional Probability
ERIC Educational Resources Information Center
Carlton, Matthew A.
2005-01-01
We present and discuss three examples of misapplication of the notion of conditional probability. In each example, we present the problem along with a published and/or well-known incorrect--but seemingly plausible--solution. We then give a careful treatment of the correct solution, in large part to show how careful application of basic probability…
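A simulation of one classic puzzle in this family, the Monty Hall "prize" problem (a standard example of misapplied conditional probability; the article's chosen examples may differ):

```python
import random

def monty_hall(trials=100000, switch=True, seed=0):
    """Simulate the Monty Hall game. The host always opens a door that is
    neither the contestant's pick nor the prize, so switching wins exactly
    when the first pick was wrong."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(trials):
        prize = rng.randrange(3)
        pick = rng.randrange(3)
        if switch:
            wins += (pick != prize)   # switching wins iff initial pick missed
        else:
            wins += (pick == prize)   # staying wins iff initial pick was right
    return wins / trials

p_switch = monty_hall(switch=True)    # approaches 2/3
p_stay = monty_hall(switch=False)     # approaches 1/3
```

The seemingly plausible wrong answer (1/2 either way) comes from ignoring that the host's choice of door is itself conditioned on the prize location, which is exactly the kind of conditioning error the article dissects.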
Unsolved Problems in Evolutionary Theory
1967-01-01
finding the probability of survival of a single new mutant). Most natural populations probably satisfy these conditions, as is illustrated by the ... (y_1, ..., y_k) of small quantities adding to zero. Then, under suitable conditions on the function f(x),

(3)  x_i + y_{i,t+1} = f_i(x) + Σ_j y_{j,t} ∂f_i/∂x_j + O(|y_t|²).

It is clear that a sufficient condition for the point x to be locally stable is that all the roots of the matrix

(4)  (a_ij) = ∂f_i/∂x_j

should have moduli less than one.
Medicaid reimbursement, prenatal care and infant health.
Sonchak, Lyudmyla
2015-12-01
This paper evaluates the impact of state-level Medicaid reimbursement rates for obstetric care on prenatal care utilization across demographic groups. It also uses these rates as an instrumental variable to assess the importance of prenatal care on birth weight. The analysis is conducted using a unique dataset of Medicaid reimbursement rates and 2001-2010 Vital Statistics Natality data. Conditional on county fixed effects, the study finds a modest, but statistically significant positive relationship between Medicaid reimbursement rates and the number of prenatal visits obtained by pregnant women. Additionally, higher rates are associated with an increase in the probability of obtaining adequate care, as well as a reduction in the incidence of going without any prenatal care. However, the effect of an additional prenatal visit on birth weight is virtually zero for black disadvantaged mothers, while an additional visit yields a substantial increase in birth weight of over 20 g for white disadvantaged mothers. Copyright © 2015 Elsevier B.V. All rights reserved.
Peatmoss (Sphagnum) diversification associated with Miocene Northern Hemisphere climatic cooling?
Shaw, A Jonathan; Devos, Nicolas; Cox, Cymon J; Boles, Sandra B; Shaw, Blanka; Buchanan, Alex M; Cave, Lynette; Seppelt, Rodney
2010-06-01
Global climate changes sometimes spark biological radiations that can feed back to effect significant ecological impacts. Northern Hemisphere peatlands dominated by living and dead peatmosses (Sphagnum) harbor almost 30% of the global soil carbon pool and have functioned as a net carbon sink throughout the Holocene, and probably since the late Tertiary. Before that time, northern latitudes were dominated by tropical and temperate plant groups and ecosystems. Phylogenetic analyses of mosses (phylum Bryophyta) based on nucleotide sequences from the plastid, mitochondrial, and nuclear genomes indicate that most species of Sphagnum are of recent origin (ca. <20 Ma). Sphagnum species are not only well-adapted to boreal peatlands, they create the conditions that promote development of peatlands. The recent radiation that gave rise to extant diversity of peatmosses is temporally associated with Miocene climatic cooling in the Northern Hemisphere. The evolution of Sphagnum has had profound influences on global biogeochemistry because of the unique biochemical, physiological, and morphological features of these plants, both while alive and after death. 2010 Elsevier Inc. All rights reserved.
Penn, Elizabeth Maggie
2014-01-01
This article presents a new model for scoring alternatives from “contest” outcomes. The model is a generalization of the method of paired comparison to accommodate comparisons between arbitrarily sized sets of alternatives in which outcomes are any division of a fixed prize. Our approach is also applicable to contests between varying quantities of alternatives. We prove that under a reasonable condition on the comparability of alternatives, there exists a unique collection of scores that produces accurate estimates of the overall performance of each alternative and satisfies a well-known axiom regarding choice probabilities. We apply the method to several problems in which varying choice sets and continuous outcomes may create problems for standard scoring methods. These problems include measuring centrality in network data and the scoring of political candidates via a “feeling thermometer.” In the latter case, we also use the method to uncover and solve a potential difficulty with common methods of rescaling thermometer data to account for issues of interpersonal comparability. PMID:24748759
Heat conduction in periodic laminates with probabilistic distribution of material properties
NASA Astrophysics Data System (ADS)
Ostrowski, Piotr; Jędrysiak, Jarosław
2017-04-01
This contribution deals with a problem of heat conduction in a two-phase laminate made of micro-laminas distributed periodically along one direction. In general, Fourier's law describing heat conduction in the considered composite has highly oscillating and discontinuous coefficients. Therefore, the tolerance averaging technique (cf. Woźniak et al. in Thermomechanics of microheterogeneous solids and structures. Monografie - Politechnika Łódzka, Wydawnictwo Politechniki Łódzkiej, Łódź, 2008) is applied. Based on this technique, the averaged differential equations of a tolerance-asymptotic model are derived and solved analytically for given initial-boundary conditions. The second part of this contribution investigates the effect of the material-properties ratio ω of the two components on the total temperature field θ, under the assumption that the conductivities of the micro-laminas are not necessarily uniquely described. Numerical experiments (Monte Carlo simulation) are executed under the assumption that ω is a random variable with a fixed probability distribution. Finally, based on the obtained results, a crucial hypothesis is formulated.
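The Monte Carlo step can be illustrated with a much simpler surrogate than the tolerance-averaged model: for conduction across the laminas, the effective conductivity is the volume-weighted harmonic mean, and drawing the ratio ω = k2/k1 from a fixed distribution yields a distribution of effective conductivities. The uniform distribution and parameter values below are purely illustrative assumptions.

```python
import random

def series_conductivity(k1, k2, f1):
    """Effective conductivity across the laminas (harmonic / series mixing)."""
    f2 = 1.0 - f1
    return 1.0 / (f1 / k1 + f2 / k2)

def mc_keff_samples(k1, f1, n=10_000, seed=1):
    """Monte Carlo over the conductivity ratio omega = k2/k1, drawn here
    from a uniform distribution on [0.2, 0.8] (an illustrative choice)."""
    random.seed(seed)
    return [series_conductivity(k1, random.uniform(0.2, 0.8) * k1, f1)
            for _ in range(n)]

samples = mc_keff_samples(k1=1.0, f1=0.5)
mean_keff = sum(samples) / len(samples)
```

Because the harmonic mean is dominated by the poorer conductor, the resulting distribution of k_eff is skewed toward low values even for a symmetric distribution of ω.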
A TWO-STATE MIXED HIDDEN MARKOV MODEL FOR RISKY TEENAGE DRIVING BEHAVIOR
Jackson, John C.; Albert, Paul S.; Zhang, Zhiwei
2016-01-01
This paper proposes a joint model for longitudinal binary and count outcomes. We apply the model to a unique longitudinal study of teen driving where risky driving behavior and the occurrence of crashes or near crashes are measured prospectively over the first 18 months of licensure. Of scientific interest is relating the two processes and predicting crash and near crash outcomes. We propose a two-state mixed hidden Markov model whereby the hidden state characterizes the mean for the joint longitudinal crash/near crash outcomes and elevated g-force events which are a proxy for risky driving. Heterogeneity is introduced in both the conditional model for the count outcomes and the hidden process using a shared random effect. An estimation procedure is presented using the forward–backward algorithm along with adaptive Gaussian quadrature to perform numerical integration. The estimation procedure readily yields hidden state probabilities as well as providing for a broad class of predictors. PMID:27766124
Decision-support information system to manage mass casualty incidents at a level 1 trauma center.
Bar-El, Yaron; Tzafrir, Sara; Tzipori, Idan; Utitz, Liora; Halberthal, Michael; Beyar, Rafael; Reisner, Shimon
2013-12-01
Mass casualty incidents are probably the greatest challenge to a hospital. When such an event occurs, hospitals are required to instantly switch from their routine activity to conditions of great uncertainty and confront needs that exceed resources. We describe an information system that was uniquely designed for managing mass casualty events. The web-based system is activated when a mass casualty event is declared; it displays relevant operating procedures, checklists, and a log book. The system automatically or semiautomatically initiates phone calls and public address announcements. It collects real-time data from computerized clinical and administrative systems in the hospital, and presents them to the managing team in a clear graphic display. It also generates periodic reports and summaries of available or scarce resources that are sent to predefined recipients. When the system was tested in a nationwide exercise, it proved to be an invaluable tool for informed decision making in demanding and overwhelming situations such as mass casualty events.
Priming anticancer active specific immunotherapy with dendritic cells.
Mocellin, Simone
2005-06-01
Dendritic cells (DCs) probably represent the most powerful naturally occurring immunological adjuvant for anticancer vaccines. However, the initial enthusiasm for DC-based vaccines is being tempered by clinical results not meeting expectations. The partial failure of current vaccine formulations is explained by the extraordinary complexity of the immune system, which makes the task of exploiting the potential of such a biotherapeutic approach highly challenging. Clinical findings obtained in humans so far indicate that the immune system can be actively polarized against malignant cells by means of DC-based active specific immunotherapy, and that in some cases this is associated with tumor regression. This implies that under some unique circumstances, the naturally 'dormant' immune effectors can actually be employed as endogenous weapons against malignant cells. Only the thorough understanding of DC biology and tumor-host immune system interactions will allow researchers to reproduce, in a larger set of patients, the cellular/molecular conditions leading to an effective immune-mediated eradication of cancer.
Pricing risk and ambiguity: the effect of perspective taking.
Trautmann, Stefan T; Schmidt, Ulrich
2012-01-01
In the valuation of uncertain prospects, a difference is often observed between selling and buying perspectives. This paper distinguishes between risk (known probabilities) and ambiguity (unknown probabilities) in decisions under uncertainty and shows that the valuation disparity increases under ambiguity compared to risk. It is found that both the comparative versus noncomparative evaluation of risky and ambiguous prospects and the uniqueness of the valuation perspective (either seller or buyer) moderate this increase in the disparity under ambiguity. The finding is consistent with recent theoretical accounts of pricing under uncertainty. We discuss implications for market behaviour and for the ambiguity paradigm as a research tool.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cartarius, Holger; Moiseyev, Nimrod; Department of Physics and Minerva Center for Nonlinear Physics of Complex Systems, Technion-Israel Institute of Technology, Haifa, 32000
The unique time signature of the survival probability exactly at the exceptional-point parameters is studied here for the hydrogen atom in strong static magnetic and electric fields. We show that the survival probability S(t) = |⟨ψ(0)|ψ(t)⟩|² indeed decays exactly as |1 − at|² e^{−Γ_EP t/ℏ}, where Γ_EP is associated with the decay rate at the exceptional point and a is a complex constant depending solely on the initial wave packet, which populates exclusively the two almost degenerate states of the non-Hermitian Hamiltonian. This may open the possibility for a first experimental detection of exceptional points in a quantum system.
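The |1 − at|² e^{−Γt} signature can be reproduced with a minimal toy model (not the hydrogen-atom calculation of the paper): at an exceptional point the effective Hamiltonian is non-diagonalizable, and a 2×2 Jordan block already produces the linear-in-t prefactor. All parameter values below are arbitrary.

```python
import math

# Toy 2x2 non-Hermitian Hamiltonian at an exceptional point: a Jordan block
# with a single complex eigenvalue E - i*Gamma/2 (non-diagonalizable).
E, Gamma, g = 1.0, 0.4, 0.3
H = [[E - 0.5j * Gamma, g], [0.0, E - 0.5j * Gamma]]

def evolve(psi0, t, terms=60):
    """psi(t) = exp(-i H t) psi0 via a truncated Taylor series (pure Python 2x2)."""
    U = [[1.0 + 0j, 0j], [0j, 1.0 + 0j]]   # running sum for exp(-i H t)
    M = [[1.0 + 0j, 0j], [0j, 1.0 + 0j]]   # holds H^n
    coef = 1.0 + 0j                        # holds (-i t)^n / n!
    for n in range(1, terms):
        M = [[sum(M[i][k] * H[k][j] for k in range(2)) for j in range(2)]
             for i in range(2)]
        coef *= (-1j * t) / n
        U = [[U[i][j] + coef * M[i][j] for j in range(2)] for i in range(2)]
    return [U[0][0] * psi0[0] + U[0][1] * psi0[1],
            U[1][0] * psi0[0] + U[1][1] * psi0[1]]

def survival(psi0, t):
    """S(t) = |<psi(0)|psi(t)>|^2."""
    psit = evolve(psi0, t)
    amp = psi0[0].conjugate() * psit[0] + psi0[1].conjugate() * psit[1]
    return abs(amp) ** 2
```

Because the Jordan block satisfies exp(−iHt) = e^{−iEt−Γt/2} [[1, −igt], [0, 1]], a normalized initial state (α, β) gives S(t) = |1 − (i g ᾱβ) t|² e^{−Γt}, i.e. exactly the |1 − at|² e^{−Γt} form with a = i g ᾱβ.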
NASA Astrophysics Data System (ADS)
Chen, Huabin
2013-08-01
In this paper, the problems of existence, uniqueness, and attraction for the strong solution of stochastic age-structured population systems with diffusion and Poisson jumps are considered. Under a non-Lipschitz condition, with the Lipschitz condition included as a special case, existence and uniqueness for such systems are first proved by using the Burkholder-Davis-Gundy inequality (B-D-G inequality) and Itô's formula. Then, by using a novel inequality technique, some sufficient conditions ensuring the existence of the domain of attraction are established. As a by-product, the exponential stability in mean square of the strong solution for such systems is also discussed.
NASA Astrophysics Data System (ADS)
Khodayar, Samiro; Kalthoff, Norbert
2013-04-01
Among all severe convective weather situations, fall-season heavy rainfall represents the most threatening phenomenon in the western Mediterranean region. Devastating flash floods occur every year somewhere in eastern Spain, southern France, Italy, or North Africa and are responsible for a great proportion of the fatalities, property losses, and destruction of infrastructure caused by natural hazards. Investigations in the area have shown that most heavy rainfall events in this region can be attributed to mesoscale convective systems. The main goal of this investigation is to understand and identify the atmospheric conditions that favor the initiation and development of such systems. Insight into the involved processes and conditions will improve their predictability and help prevent some of the fatal consequences associated with these weather phenomena. HyMeX (Hydrological cycle in the Mediterranean eXperiment) provides a unique framework to investigate this issue. Making use of high-resolution seasonal simulations with the COSMO-CLM model, the mean atmospheric conditions of the fall season (September, October, and November) are investigated in different western Mediterranean regions: eastern Spain, southern France, northern Africa, and Italy. The precipitation distribution, its daily cycle, and its probability distribution function are evaluated to ascertain the similarities and differences between the regions of interest, as well as the spatial distribution of extreme events. Additionally, the regional differences in boundary-layer and mid-tropospheric conditions, atmospheric stability and inhibition, and low-level triggering are presented. Selected high-impact-weather HyMeX episodes are analyzed with special focus on the atmospheric pre-conditions leading to extreme weather situations. These pre-conditions are then compared to the mean seasonal conditions to identify possible anomalies that could favor the initiation and intensification of extreme precipitation events.
Kawashima, Tomoya; Matsumoto, Eriko
2016-03-23
Items in working memory guide visual attention toward a memory-matching object. Recent studies have shown that when searching for an object, this attentional guidance can be modulated by knowing the probability that the target will match an item in working memory. Here, we recorded the P3 and contralateral delay activity to investigate how top-down knowledge controls the processing of working memory items. Participants performed a memory task (recognition only) and a memory-or-search task (recognition or visual search) in which they were asked to maintain two colored oriented bars in working memory. For visual search, we manipulated the probability that the target had the same color as the memorized items (0, 50, or 100%). Participants knew these probabilities before the task. Target detection in the 100% match condition was faster than in the 50% match condition, indicating that participants used their knowledge of the probabilities. We found that the P3 amplitude in the 100% condition was larger than in the other conditions and that contralateral delay activity amplitude did not vary across conditions. These results suggest that more attention was allocated to the memory items when observers knew in advance that their color would likely match a target. This led to better search performance despite using qualitatively equal working memory representations.
ELIPGRID-PC: A PC program for calculating hot spot probabilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Davidson, J.R.
1994-10-01
ELIPGRID-PC, a new personal computer program, has been developed to provide easy access to Singer's 1972 ELIPGRID algorithm for hot-spot detection probabilities. Three features of the program are the ability to determine: (1) the grid size required for specified conditions, (2) the smallest hot spot that can be sampled with a given probability, and (3) the approximate grid size resulting from specified conditions and sampling cost. ELIPGRID-PC also provides probability-of-hit versus cost data for graphing with spreadsheets or graphics software. The program has been successfully tested using Singer's published ELIPGRID results. An apparent error in the original ELIPGRID code has been uncovered and an appropriate modification incorporated into the new program.
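The underlying geometry can be sketched for the simplest case. This is not Singer's ELIPGRID itself (which handles elliptical hot spots and triangular as well as rectangular grids); it is a minimal sketch for a circular hot spot of radius R on a square grid of spacing G, where the hit probability is πR²/G² whenever R ≤ G/2, checked against Monte Carlo.

```python
import math, random

def hit_prob_analytic(R, G):
    """P(a randomly located circular hot spot of radius R covers a grid point).
    Valid for R <= G/2, where the circle can cover at most one grid point."""
    assert R <= G / 2
    return math.pi * R * R / (G * G)

def hit_prob_mc(R, G, n=200_000, seed=7):
    """Monte Carlo check: drop the hot-spot center uniformly in one grid cell
    and test whether any of the cell's corner grid points lies within R."""
    random.seed(seed)
    hits = 0
    for _ in range(n):
        x, y = random.uniform(0, G), random.uniform(0, G)
        dx, dy = min(x, G - x), min(y, G - y)   # distance to nearest grid point
        if dx * dx + dy * dy <= R * R:
            hits += 1
    return hits / n
```

Inverting the analytic formula gives the flavor of feature (1): the grid spacing needed to detect a hot spot of radius R with probability p is G = R·sqrt(π/p), for p small enough that R ≤ G/2 still holds.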
Risk estimation using probability machines.
Dasgupta, Abhijit; Szymczak, Silke; Moore, Jason H; Bailey-Wilson, Joan E; Malley, James D
2014-03-01
Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a "risk machine", will share properties from the statistical machine that it is derived from. PMID:24581306
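The "probability machine" idea is that any consistent nonparametric regression of a binary outcome estimates P(Y=1 | X) directly, and counterfactual effect sizes follow by contrasting predictions at shifted predictor values. The paper uses random forests; the dependency-free sketch below substitutes k-nearest-neighbor regression (also consistent) on synthetic logistic data, so the learner, seed, and sample sizes are all assumptions.

```python
import math, random

random.seed(42)

# Synthetic data from a logistic model: P(Y=1 | x) = 1 / (1 + exp(-x)).
def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

data = []
for _ in range(5000):
    x = random.uniform(-3, 3)
    y = 1 if random.random() < logistic(x) else 0
    data.append((x, y))

def knn_probability(x0, k=200):
    """Nonparametric estimate of P(Y=1 | x = x0): average the binary
    outcomes of the k nearest neighbors (stand-in for a random forest)."""
    nearest = sorted(data, key=lambda xy: abs(xy[0] - x0))[:k]
    return sum(y for _, y in nearest) / k

def risk_difference(x0, delta=1.0, k=200):
    """Counterfactual effect size: change in estimated risk when x0 -> x0 + delta."""
    return knn_probability(x0 + delta, k) - knn_probability(x0, k)
```

Unlike an odds ratio, the risk difference reported here is on the probability scale, which is the effect-size style the probability-machine framework argues for.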
Sedinger, James S.; Schamber, Jason L.; Ward, David H.; Nicolai, Christopher A.; Conant, Bruce
2011-01-01
We used observations of individually marked female black brant geese (Branta bernicla nigricans; brant) at three wintering lagoons on the Pacific coast of Baja California—Laguna San Ignacio (LSI), Laguna Ojo de Liebre (LOL), and Bahía San Quintín (BSQ)—and the Tutakoke River breeding colony in Alaska to assess hypotheses about carryover effects on breeding and distribution of individuals among wintering areas. We estimated transition probabilities from wintering locations to breeding and nonbreeding by using multistratum robust-design capture-mark-recapture models. We also examined the effect of breeding on migration to wintering areas to assess the hypothesis that individuals in family groups occupied higher-quality wintering locations. We used 4,538 unique female brant in our analysis of the relationship between winter location and breeding probability. All competitive models of breeding probability contained additive effects of wintering location and the 1997–1998 El Niño–Southern Oscillation (ENSO) event on probability of breeding. Probability of breeding in non-ENSO years was 0.98 ± 0.02, 0.68 ± 0.04, and 0.91 ± 0.11 for females wintering at BSQ, LOL, and LSI, respectively. After the 1997–1998 ENSO event, breeding probability was between 2% (BSQ) and 38% (LOL) lower than in other years. Individuals that bred had the highest probability of migrating the next fall to the wintering area producing the highest probability of breeding.
Seismicity alert probabilities at Parkfield, California, revisited
Michael, A.J.; Jones, L.M.
1998-01-01
For a decade, the US Geological Survey has used the Parkfield Earthquake Prediction Experiment scenario document to estimate the probability that earthquakes observed on the San Andreas fault near Parkfield will turn out to be foreshocks followed by the expected magnitude six mainshock. During this time, we have learned much about the seismogenic process at Parkfield, about the long-term probability of the Parkfield mainshock, and about the estimation of these types of probabilities. The probabilities for potential foreshocks at Parkfield are reexamined and revised in light of these advances. As part of this process, we have confirmed both the rate of foreshocks before strike-slip earthquakes in the San Andreas physiographic province and the uniform distribution of foreshocks with magnitude proposed by earlier studies. Compared to the earlier assessment, these new estimates of the long-term probability of the Parkfield mainshock are lower, our estimate of the rate of background seismicity is higher, and we find that the assumption that foreshocks at Parkfield occur in a unique way is not statistically significant at the 95% confidence level. While the exact numbers vary depending on the assumptions that are made, the new alert probabilities are lower than previously estimated. Considering the various assumptions and the statistical uncertainties in the input parameters, we also compute a plausible range for the probabilities. The range is large, partly due to the extra knowledge that exists for the Parkfield segment, making us question the usefulness of these numbers.
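The alert probabilities described above combine the long-term mainshock rate, the fraction of such mainshocks preceded by foreshocks, and the background rate of similar candidate events. A generic two-hypothesis Bayes sketch of that combination follows; the function and all numbers are hypothetical illustrations, not the USGS Parkfield values.

```python
def alert_probability(mainshock_rate, foreshock_fraction, background_rate):
    """P(mainshock follows | candidate event observed), a generic Bayes
    computation over two sources of candidate events (rates in the same
    per-unit-time units):
      mainshock_rate     -- long-term rate of the target mainshock
      foreshock_fraction -- fraction of such mainshocks preceded by a
                            foreshock of the observed type
      background_rate    -- rate of similar events unrelated to a mainshock
    """
    foreshock_rate = mainshock_rate * foreshock_fraction
    return foreshock_rate / (foreshock_rate + background_rate)

# Hypothetical numbers for illustration only.
p = alert_probability(mainshock_rate=1 / 25, foreshock_fraction=0.5,
                      background_rate=0.5)
```

The structure makes the abstract's findings concrete: lowering the assumed mainshock rate or raising the background seismicity rate both push the alert probability down, which is exactly the direction of the revision the authors report.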
Sussman, Steve; Pokhrel, Pallav; Sun, Ping; Rohrbach, Louise A; Spruijt-Metz, Donna
2015-09-01
Recent work has studied addictions using a matrix measure, which taps multiple addictions through single responses for each type. This is the first longitudinal study using a matrix measure. We investigated the use of this approach among former alternative high school youth (average age = 19.8 years at baseline; longitudinal n = 538) at risk for addictions. Lifetime and last-30-day prevalence of one or more of 11 addictions reviewed in other work was the primary focus (i.e., cigarettes, alcohol, hard drugs, shopping, gambling, Internet, love, sex, eating, work, and exercise). These were examined at two time-points one year apart. Latent class and latent transition analyses (LCA and LTA) were conducted in Mplus. Prevalence rates were stable across the two time-points. As in the cross-sectional baseline analysis, the 2-class model (addiction class, non-addiction class) fit the data better at follow-up than models with more classes. Item-response or conditional probabilities for each addiction type did not differ between time-points. Accordingly, the LTA model constrained the conditional probabilities to be equal across the two time-points. In the addiction class, larger conditional probabilities (i.e., 0.40-0.49) were found for love, sex, exercise, and work addictions; medium conditional probabilities (i.e., 0.17-0.27) were found for cigarette, alcohol, other drug, eating, Internet, and shopping addictions; and a small conditional probability (0.06) was found for gambling. Persons in an addiction class tend to remain in this class over a one-year period.
Effects of sampling conditions on DNA-based estimates of American black bear abundance
Laufenberg, Jared S.; Van Manen, Frank T.; Clark, Joseph D.
2013-01-01
DNA-based capture-mark-recapture techniques are commonly used to estimate American black bear (Ursus americanus) population abundance (N). Although the technique is well established, many questions remain regarding study design. In particular, relationships among N, capture probability of heterogeneity mixtures A and B (pA and pB, respectively, or p, collectively), the proportion of each mixture (π), number of capture occasions (k), and probability of obtaining reliable estimates of N are not fully understood. We investigated these relationships using 1) an empirical dataset of DNA samples for which true N was unknown and 2) simulated datasets with known properties that represented a broader array of sampling conditions. For the empirical data analysis, we used the full closed population with heterogeneity data type in Program MARK to estimate N for a black bear population in Great Smoky Mountains National Park, Tennessee. We systematically reduced the number of those samples used in the analysis to evaluate the effect that changes in capture probabilities may have on parameter estimates. Model-averaged N for females and males were 161 (95% CI = 114–272) and 100 (95% CI = 74–167), respectively (pooled N = 261, 95% CI = 192–419), and the average weekly p was 0.09 for females and 0.12 for males. When we reduced the number of samples of the empirical data, support for heterogeneity models decreased. For the simulation analysis, we generated capture data with individual heterogeneity covering a range of sampling conditions commonly encountered in DNA-based capture-mark-recapture studies and examined the relationships between those conditions and accuracy (i.e., probability of obtaining an estimated N that is within 20% of true N), coverage (i.e., probability that 95% confidence interval includes true N), and precision (i.e., probability of obtaining a coefficient of variation ≤20%) of estimates using logistic regression. 
The capture probability for the larger of the two mixture proportions of the population (i.e., pA or pB, depending on the value of π) was most important for predicting accuracy and precision, whereas the capture probabilities of both mixture proportions (pA and pB) were important for explaining variation in coverage. Based on sampling conditions similar to parameter estimates from the empirical dataset (pA = 0.30, pB = 0.05, N = 250, π = 0.15, and k = 10), predicted accuracy and precision were low (60% and 53%, respectively), whereas coverage was high (94%). Increasing pB, the capture probability for the predominant but most difficult-to-capture proportion of the population, was most effective for improving accuracy under those conditions. However, manipulation of other parameters may be more effective under different conditions. In general, the probabilities of obtaining accurate and precise estimates were best when p ≥ 0.2. Our regression models can be used by managers to evaluate specific sampling scenarios and guide development of sampling frameworks or to assess reliability of DNA-based capture-mark-recapture studies.
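Under the two-mixture model above, the probability that an individual is detected at least once over k occasions is π(1 − (1 − pA)^k) + (1 − π)(1 − (1 − pB)^k). The sketch below evaluates that expression at the abstract's reference values and checks it by simulation (the simulation itself is an illustration, not the Program MARK analysis).

```python
import random

def p_ever_captured(pA, pB, pi, k):
    """P(detected at least once in k occasions) under a two-point
    capture-heterogeneity mixture: proportion pi has capture probability
    pA per occasion, proportion 1 - pi has pB."""
    return pi * (1 - (1 - pA) ** k) + (1 - pi) * (1 - (1 - pB) ** k)

def simulate_detected(N, pA, pB, pi, k, seed=3):
    """Count individuals (of N) detected at least once, by direct simulation."""
    random.seed(seed)
    detected = 0
    for _ in range(N):
        p = pA if random.random() < pi else pB
        if any(random.random() < p for _ in range(k)):
            detected += 1
    return detected

# Reference values from the abstract: pA = 0.30, pB = 0.05, pi = 0.15, k = 10.
expected = p_ever_captured(0.30, 0.05, 0.15, 10)
```

At these values roughly 49% of individuals are ever detected, which conveys why estimates of N are imprecise: half the population contributes no capture history at all.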
NASA Astrophysics Data System (ADS)
Warren, Paul H.; Shirley, David N.; Kallemeyn, Gregory W.
1986-09-01
Analysis of previously unstudied Apollo lithic fragments continues to yield surprising results. Among this year's samples is a small anorthosite fragment, 76504,18, the first pristine anorthosite found from Apollo 17. This unique lithology strongly resembles the main type of Apollo anorthosites (ferroan anorthosites), but 76504,18 has a far higher ratio (about 9) of high-Ca pyroxene to low-Ca pyroxene, higher Na in its plagioclase, higher contents of incompatible elements such as REE, and a higher Eu/Al ratio. Assuming that 76504,18 is a cumulate with less than 45% trapped liquid, its parent melt probably had a negative Eu anomaly. In all these respects, 76504,18 seems more likely than (other) ferroan anorthosites to be closely related to typical mare basalts. Apparently this anorthosite was among the latest to form by plagioclase flotation above a primordial magmasphere; typical mare basalt source regions probably accumulated at about the same time or even earlier. Another previously unstudied fragment, 14181c, is a VHK (very high potassium) basalt that is similar in most respects to typical ("aluminous") Apollo 14 mare basalt but has a K/La ratio of 1050. This lithology probably formed after a normal Apollo 14 mare basaltic melt partially assimilated granite. New data for siderophile elements in Apollo 14 mare basalts indicate that only the lowest of earlier data are trustworthy as being free of laboratory contamination.
Quantum-correlation breaking channels, quantum conditional probability and Perron-Frobenius theory
NASA Astrophysics Data System (ADS)
Chruściński, Dariusz
2013-03-01
Using the quantum analog of conditional probability and classical Bayes theorem we discuss some aspects of particular entanglement breaking channels: quantum-classical and classical-classical channels. Applying the quantum analog of Perron-Frobenius theorem we generalize the recent result of Korbicz et al. (2012) [8] on full and spectrum broadcasting from quantum-classical channels to arbitrary quantum channels.
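A quantum-classical channel is a measure-and-prepare map Λ(ρ) = Σ_k Tr(F_k ρ) |k⟩⟨k| for some POVM {F_k}, which is entanglement breaking. A minimal pure-Python sketch for a single qubit measured in the computational basis follows (the basis choice is an illustrative assumption; the abstract treats general channels).

```python
# 2x2 density matrices represented as nested lists (no external dependencies).

def trace(m):
    return m[0][0] + m[1][1]

def qc_channel(rho):
    """Quantum-classical channel for the computational-basis measurement:
    Lambda(rho) = sum_k Tr(P_k rho) |k><k|, so the output is diagonal,
    i.e. a classical probability distribution embedded as a density matrix."""
    p0 = rho[0][0]          # Tr(|0><0| rho) -- Born probability of outcome 0
    p1 = rho[1][1]          # Tr(|1><1| rho) -- Born probability of outcome 1
    return [[p0, 0.0], [0.0, p1]]

# Example: the |+><+| state has off-diagonal coherences...
plus = [[0.5, 0.5], [0.5, 0.5]]
out = qc_channel(plus)      # ...which the channel destroys: diag(1/2, 1/2)
```

The diagonal output is exactly the classical conditional-probability structure the abstract refers to: once the channel acts, the state is fully described by the outcome probabilities of the measurement.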
ERIC Educational Resources Information Center
Hoeken, Hans; Hustinx, Lettica
2009-01-01
Under certain conditions, statistical evidence is more persuasive than anecdotal evidence in supporting a claim about the probability that a certain event will occur. In three experiments, it is shown that the type of argument is an important condition in this respect. If the evidence is part of an argument by generalization, statistical evidence…
ERIC Educational Resources Information Center
Chow, Alan F.; Van Haneghan, James P.
2016-01-01
This study reports the results of a study examining how easily students are able to transfer frequency solutions to conditional probability problems to novel situations. University students studied either a problem solved using the traditional Bayes formula format or using a natural frequency (tree diagram) format. In addition, the example problem…
Estimating the probability of mountain pine beetle red-attack damage
Michael A Wulder; J. C. White; Barbara J Bentz; M. F. Alvarez; N. C. Coops
2006-01-01
Accurate spatial information on the location and extent of mountain pine beetle infestation is critical for the planning of mitigation and treatment activities. Areas of mixed forest and variable terrain present unique challenges for the detection and mapping of mountain pine beetle red-attack damage, as red-attack has a more heterogeneous distribution under these...
Tyler Prante; Jennifer A. Thacher; Daniel W. McCollum; Robert P. Berrens
2007-01-01
In part because of its emphasis on building social capital, the Collaborative Forest Restoration Program (CFRP) in New Mexico represents a unique experiment in public lands management. This study uses logit probability modeling to investigate what factors determined CFRP funding, which totaled $26 million between 2001 and 2006. Results reveal program preferences for...
ERIC Educational Resources Information Center
Griffin, Barbara; Hesketh, Beryl; Loh, Vanessa
2012-01-01
This study examines the construct of subjective life expectancy (SLE), or the estimation of one's probable age of death. Drawing on the tenets of socioemotional selectivity theory (Carstensen, Isaacowitz, & Charles, 1999), we propose that SLE provides individuals with their own unique mental model of remaining time that is likely to affect their…
ERIC Educational Resources Information Center
Tompkins, Loren D.; Mehring, Teresa
Scholastic and personal characteristics of students undertaking exit competency examinations are investigated. Research questions concerned: what the tests measure, whether the tests provide unique information or are duplicating other easily obtained measures, whether it is possible to increase the probability of student success by controlling…
NASA Astrophysics Data System (ADS)
Mori, Ryuhei
2016-11-01
Brassard et al. [Phys. Rev. Lett. 96, 250401 (2006), 10.1103/PhysRevLett.96.250401] showed that shared nonlocal boxes with a CHSH (Clauser, Horne, Shimony, and Holt) probability greater than (3+√6)/6 yield trivial communication complexity. There still exists a gap with the maximum CHSH probability (2+√2)/4 achievable by quantum mechanics. It is an interesting open question to determine the exact threshold for trivial communication complexity. Brassard et al.'s idea is based on recursive bias amplification by the three-input majority function. It was not obvious whether another choice of function would exhibit stronger bias amplification. We show that the three-input majority function is the unique optimal function, so that one cannot improve the threshold (3+√6)/6 by Brassard et al.'s bias amplification. In this work, protocols for computing the function used for the bias amplification are restricted to nonadaptive protocols or a particular adaptive protocol inspired by Pawłowski et al.'s protocol for information causality [Nature (London) 461, 1101 (2009), 10.1038/nature08400]. We first show an adaptive protocol inspired by Pawłowski et al.'s protocol, and then show that the adaptive protocol improves upon nonadaptive protocols. Finally, we show that the three-input majority function is the unique optimal function for the bias amplification if we apply the adaptive protocol to each step of the bias amplification.
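The recursive amplification step the abstract refers to can be sketched numerically. This is a minimal illustration, not code from the paper: if each box answers correctly with probability p = (1+ε)/2, a majority vote over three independent copies is correct with probability 3p² − 2p³, so any bias ε > 0 is driven toward 1 under iteration. (The significance of the (3+√6)/6 threshold lies in the full communication-complexity construction, which this sketch does not reproduce.)

```python
# Bias amplification by the three-input majority function (illustrative only).
# A box correct with probability p yields, after a majority-of-three vote,
# a correct answer with probability 3p^2 - 2p^3; equivalently the bias
# eps = 2p - 1 maps to (3*eps - eps**3) / 2.

def majority_step(p):
    """Success probability after a majority vote over three i.i.d. boxes."""
    return 3 * p**2 - 2 * p**3

def amplify(p, rounds):
    """Iterate the majority step; any p > 1/2 is driven toward 1."""
    for _ in range(rounds):
        p = majority_step(p)
    return p

if __name__ == "__main__":
    threshold = (3 + 6 ** 0.5) / 6   # ~0.9082, Brassard et al.'s bound
    tsirelson = (2 + 2 ** 0.5) / 4   # ~0.8536, quantum maximum
    print(amplify(threshold, 10))    # biases above 1/2 are amplified toward 1
    print(amplify(tsirelson, 10))
```

Note that the bare recursion amplifies any probability above 1/2; the open question concerns which starting probabilities suffice for the overall protocol, not the recursion itself.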
Condition-dependent reproductive effort in frogs infected by a widespread pathogen
Roznik, Elizabeth A.; Sapsford, Sarah J.; Pike, David A.; Schwarzkopf, Lin; Alford, Ross A.
2015-01-01
To minimize the negative effects of an infection on fitness, hosts can respond adaptively by altering their reproductive effort or by adjusting their timing of reproduction. We studied effects of the pathogenic fungus Batrachochytrium dendrobatidis on the probability of calling in a stream-breeding rainforest frog (Litoria rheocola). In uninfected frogs, calling probability was relatively constant across seasons and body conditions, but in infected frogs, calling probability differed among seasons (lowest in winter, highest in summer) and was strongly and positively related to body condition. Infected frogs in poor condition were up to 40% less likely to call than uninfected frogs, whereas infected frogs in good condition were up to 30% more likely to call than uninfected frogs. Our results suggest that frogs employed a pre-existing, plastic, life-history strategy in response to infection, which may have complex evolutionary implications. If infected males in good condition reproduce at rates equal to or greater than those of uninfected males, selection on factors affecting disease susceptibility may be minimal. However, because reproductive effort in infected males is positively related to body condition, there may be selection on mechanisms that limit the negative effects of infections on hosts. PMID:26063847
Smith, Erik A.; Sanocki, Chris A.; Lorenz, David L.; Jacobsen, Katrin E.
2017-12-27
Streamflow distribution maps for the Cannon River and St. Louis River drainage basins were developed by the U.S. Geological Survey, in cooperation with the Legislative-Citizen Commission on Minnesota Resources, to illustrate relative and cumulative streamflow distributions. The Cannon River was selected to provide baseline data to assess the effects of potential surficial sand mining, and the St. Louis River was selected to determine the effects of ongoing Mesabi Iron Range mining. Each drainage basin (Cannon, St. Louis) was subdivided into nested drainage basins: the Cannon River was subdivided into 152 nested drainage basins, and the St. Louis River was subdivided into 353 nested drainage basins. For each smaller drainage basin, the estimated volumes of groundwater discharge (as base flow) and surface runoff flowing into all surface-water features were displayed under the following conditions: (1) extreme low-flow conditions, comparable to an exceedance-probability quantile of 0.95; (2) low-flow conditions, comparable to an exceedance-probability quantile of 0.90; (3) a median condition, comparable to an exceedance-probability quantile of 0.50; and (4) a high-flow condition, comparable to an exceedance-probability quantile of 0.02. Streamflow distribution maps were developed using flow-duration curve exceedance-probability quantiles in conjunction with Soil-Water-Balance model outputs; both the flow-duration curve and Soil-Water-Balance models were built upon previously published U.S. Geological Survey reports. The selected streamflow distribution maps provide a proactive water management tool for State cooperators by illustrating flow rates during a range of hydraulic conditions.
Furthermore, after the nested drainage basins are highlighted in terms of surface-water flows, the streamflows can be evaluated in the context of meeting specific ecological flows under different flow regimes and potentially assist with decisions regarding groundwater and surface-water appropriations. Presented streamflow distribution maps are foundational work intended to support the development of additional streamflow distribution maps that include statistical constraints on the selected flow conditions.
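The exceedance-probability quantiles used above have a simple computational meaning: the quantile at exceedance probability q is the flow equaled or exceeded a fraction q of the time, i.e. the (1 − q) quantile of the flow record. A minimal sketch (not USGS code; the sample flows are hypothetical):

```python
# Exceedance-probability quantile: the flow exceeded a fraction q of the
# time is the (1 - q) non-exceedance quantile of the record.

def exceedance_quantile(flows, q):
    """Flow exceeded with probability q (0 < q < 1), by linear interpolation."""
    xs = sorted(flows)
    pos = (1 - q) * (len(xs) - 1)   # position of the (1 - q) quantile
    lo = int(pos)
    hi = min(lo + 1, len(xs) - 1)
    frac = pos - lo
    return xs[lo] * (1 - frac) + xs[hi] * frac

if __name__ == "__main__":
    flows = [5, 8, 12, 20, 35, 60, 110, 200, 400, 900]  # hypothetical daily flows
    for q in (0.95, 0.90, 0.50, 0.02):  # the four mapped flow conditions
        print(q, exceedance_quantile(flows, q))
```

Note how q = 0.95 (extreme low flow) picks a value near the bottom of the record, while q = 0.02 (high flow) picks a value near the top, matching the four conditions listed in the abstract.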
Towards an Artificial Space Object Taxonomy
NASA Astrophysics Data System (ADS)
Wilkins, M.; Schumacher, P.; Jah, M.; Pfeffer, A.
2013-09-01
Object recognition is the first step in positively identifying a resident space object (RSO), i.e. assigning an RSO to a category such as GPS satellite or space debris. Object identification is the process of deciding that two RSOs are in fact one and the same. Provided we have appropriately defined a satellite taxonomy that allows us to place a given RSO into a particular class of object without any ambiguity, one can assess the probability of assignment to a particular class by determining how well the object satisfies the unique criteria of belonging to that class. Ultimately, tree-based taxonomies delineate unique signatures by defining the minimum amount of information required to positively identify a RSO. Therefore, taxonomic trees can be used to depict hypotheses in a Bayesian object recognition and identification process. This work describes a new RSO taxonomy along with specific reasoning behind the choice of groupings. An alternative taxonomy was recently presented at the Sixth Conference on Space Debris in Darmstadt, Germany. [1] The best example of a taxonomy that enjoys almost universal scientific acceptance is the classical Linnaean biological taxonomy. A strength of Linnaean taxonomy is that it can be used to organize the different kinds of living organisms, simply and practically. Every species can be given a unique name. This uniqueness and stability are a result of the acceptance by biologists specializing in taxonomy, not merely of the binomial names themselves. Fundamentally, the taxonomy is governed by rules for the use of these names, and these are laid down in formal Nomenclature Codes. We seek to provide a similar formal nomenclature system for RSOs through a defined tree-based taxonomy structure. Each categorization, beginning with the most general or inclusive, at any level is called a taxon. 
Taxon names are defined by a type, which can be a specimen or a taxon of lower rank, and a diagnosis, a statement intended to supply characters that differentiate the taxon from others with which it is likely to be confused. Each taxon will have a set of uniquely distinguishing features that will allow one to place a given object into a specific group without any ambiguity. When a new object does not fall into a specific taxon that is already defined, the entire tree structure will need to be evaluated to determine if a new taxon should be created. Ultimately, an online learning process to facilitate tree growth would be desirable. One can assess the probability of assignment to a particular taxon by determining how well the object satisfies the unique criteria of belonging to that taxon. Therefore, we can use taxonomic trees in a Bayesian process to assign prior probabilities to each of our object recognition and identification hypotheses. We will show that this taxonomy is robust by demonstrating specific stressing classification examples. We will also demonstrate how to implement this taxonomy in Figaro, an open source probabilistic programming language.
Class dependency of fuzzy relational database using relational calculus and conditional probability
NASA Astrophysics Data System (ADS)
Deni Akbar, Mohammad; Mizoguchi, Yoshihiro; Adiwijaya
2018-03-01
In this paper, we propose a design of a fuzzy relational database that deals with conditional probability relations using fuzzy relational calculus. Previous research has examined equivalence classes in fuzzy databases using similarity or approximate relations, and it is an interesting topic to investigate fuzzy dependency using equivalence classes. Our goal is to introduce a formulation of a fuzzy relational database model using the relational calculus on the category of fuzzy relations. We also introduce general formulas of the relational calculus for database operations such as 'projection', 'selection', 'injection' and 'natural join'. Using the fuzzy relational calculus and conditional probabilities, we introduce the notions of equivalence class, redundancy, and dependency in the theory of fuzzy relational databases.
Reinforcement Learning for Constrained Energy Trading Games With Incomplete Information.
Wang, Huiwei; Huang, Tingwen; Liao, Xiaofeng; Abu-Rub, Haitham; Chen, Guo
2017-10-01
This paper considers the problem of designing adaptive learning algorithms to seek the Nash equilibrium (NE) of the constrained energy trading game among individually strategic players with incomplete information. In this game, each player uses a learning automaton scheme to generate an action probability distribution based on his/her private information for maximizing his/her own averaged utility. It is shown that if one of the admissible mixed strategies converges to the NE with probability one, then the averaged utility and trading quantity almost surely converge to their expected values, respectively. For the given discontinuous pricing function, the utility function has already been proved to be upper semicontinuous and payoff secure, which guarantees the existence of a mixed-strategy NE. By the strict diagonal concavity of the regularized Lagrange function, the uniqueness of the NE is also guaranteed. Finally, an adaptive learning algorithm is provided to generate the strategy probability distribution for seeking the mixed-strategy NE.
Methods, apparatus and system for notification of predictable memory failure
Cher, Chen-Yong; Andrade Costa, Carlos H.; Park, Yoonho; Rosenburg, Bryan S.; Ryu, Kyung D.
2017-01-03
A method for providing notification of a predictable memory failure includes the steps of: obtaining information regarding at least one condition associated with a memory; calculating a memory failure probability as a function of the obtained information; calculating a failure probability threshold; and generating a signal when the memory failure probability exceeds the failure probability threshold, the signal being indicative of a predicted future memory failure.
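The claimed steps can be sketched in a few lines. The condition variables (correctable-error count, temperature), the logistic scoring model, and the fixed threshold below are illustrative assumptions, not details from the patent:

```python
import math

# Hedged sketch of the claimed method: obtain memory-condition information,
# map it to a failure probability, compare against a threshold, and emit a
# signal. The scoring model and coefficients are invented for illustration.

def failure_probability(correctable_errors, temperature_c):
    """Toy logistic score over observed memory conditions (illustrative only)."""
    score = 0.02 * correctable_errors + 0.05 * max(0.0, temperature_c - 70.0)
    return 1.0 / (1.0 + math.exp(-(score - 3.0)))

def check_memory(correctable_errors, temperature_c, threshold=0.5):
    """Generate a signal when the failure probability exceeds the threshold."""
    p = failure_probability(correctable_errors, temperature_c)
    return ("PREDICTED_FAILURE", p) if p > threshold else ("OK", p)

if __name__ == "__main__":
    print(check_memory(10, 45))    # few errors, cool module: OK
    print(check_memory(400, 85))   # many errors, hot module: failure predicted
```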
NASA Astrophysics Data System (ADS)
Sato, Kenji; Yano, Makoto
2012-09-01
Unique existence of the absolutely continuous ergodic measure, or existence of ergodic chaos (in a strong sense), has been considered important in economics since it explains the mechanism underlying economic fluctuations. In the present study, a simple sufficient condition for ergodic chaos is proved and applied to economic models.
Code of Federal Regulations, 2011 CFR
2011-01-01
..., under reasonably probable water conditions, the flotation time and trim of the airplane will allow the... provision is shown by buoyancy and trim computations, appropriate allowances must be made for probable...
Short-term capture of the Earth-Moon system
NASA Astrophysics Data System (ADS)
Qi, Yi; de Ruiter, Anton
2018-06-01
In this paper, the short-term capture (STC) of an asteroid in the Earth-Moon system is proposed and investigated. First, the space condition of STC is analysed and five subsets of the feasible region are defined and discussed. Then, the time condition of STC is studied by parameter scanning in the Sun-Earth-Moon-asteroid restricted four-body problem. Numerical results indicate that there is a clear association between the distributions of the time probability of STC and the five subsets. Next, the influence of the Jacobi constant on STC is examined using the space and time probabilities of STC. Combining the space and time probabilities of STC, we propose a STC index to evaluate the probability of STC comprehensively. Finally, three potential STC asteroids are found and analysed.
Geissler, Kimberley; Stearns, Sally C; Becker, Charles; Thirumurthy, Harsha; Holmes, George M
2016-03-01
Substantial proportions of US residents in the USA-Mexico border region cross into Mexico for health care; increases in violence in northern Mexico may have affected this access. We quantified associations between violence in Mexico and decreases in access to care for border county residents. We also examined associations between border county residence and access. We used hospital inpatient data for Arizona, California and Texas (2005-10) to estimate associations between homicide rates and the probability of hospitalization for ambulatory care sensitive (ACS) conditions. Hospitalizations for ACS conditions were compared with homicide rates in Mexican municipalities matched by patient residence. A 1 SD increase in the homicide rate of the nearest Mexican municipality was associated with a 2.2 percentage point increase in the probability of being hospitalized for an ACS condition for border county patients. Residence in a border county was associated with a 1.3 percentage point decrease in the probability of being hospitalized for an ACS condition. Increased homicide rates in Mexico were associated with increased hospitalizations for ACS conditions in the USA, although residence in a border county was associated with decreased probability of being hospitalized for an ACS condition. Expanding access in the border region may mitigate these effects by providing alternative sources of care. © The Author 2015. Published by Oxford University Press on behalf of Faculty of Public Health. All rights reserved.
Social learning spreads knowledge about dangerous humans among American crows
Cornell, Heather N.; Marzluff, John M.; Pecoraro, Shannon
2012-01-01
Individuals face evolutionary trade-offs between the acquisition of costly but accurate information gained firsthand and the use of inexpensive but possibly less reliable social information. American crows (Corvus brachyrhynchos) use both sources of information to learn the facial features of a dangerous person. We exposed wild crows to a novel ‘dangerous face’ by wearing a unique mask as we trapped, banded and released 7–15 birds at five study sites near Seattle, WA, USA. An immediate scolding response to the dangerous mask after trapping by previously captured crows demonstrates individual learning, while an immediate response by crows that were not captured probably represents conditioning to the trapping scene by the mob of birds that assembled during the capture. Later recognition of dangerous masks by lone crows that were never captured is consistent with horizontal social learning. Independent scolding by young crows, whose parents had conditioned them to scold the dangerous mask, demonstrates vertical social learning. Crows that directly experienced trapping later discriminated among dangerous and neutral masks more precisely than did crows that learned through social means. Learning enabled scolding to double in frequency and spread at least 1.2 km from the place of origin over a 5 year period at one site. PMID:21715408
Brown, Elliot G
2002-01-01
To support signal generation a terminology should facilitate recognition of medical conditions by using terms which represent unique concepts, providing appropriate, homogeneous grouping of related terms. It should allow intuitive or mathematical identification of adverse events reaching a threshold frequency or with disproportionate incidence, permit identification of important events which are commonly drug-related, and support recognition of new syndromes. It is probable that the Medical Dictionary for Regulatory Activities (MedDRA) preferred terms (PTs) or high level terms (HLTs) will be used to represent adverse events for the purposes of signal generation. A comparison with 315 WHO Adverse Reaction Terminology (WHO-ART) PTs showed that for about 72% of WHO-ART PTs, there were one or two corresponding MedDRA PTs. However, there were instances where there were many MedDRA PTs corresponding to single WHO-ART PTs. In many cases, MedDRA HLTs grouped large numbers of PTs and sometimes there could be problems when a single HLT comprises PTs which represent very different medical concepts, or conditions which differ greatly in their clinical importance. Further studies are needed to compare the way in which identical data sets coded with MedDRA and with other terminologies actually function in generating and exploring signals using the same methods of detection and evaluation.
A Comparative Study of Automated Infrasound Detectors - PMCC and AFD with Analyst Review.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, Junghyun; Hayward, Chris; Zeiler, Cleat
Automated detections calculated by the progressive multi-channel correlation (PMCC) method (Cansi, 1995) and the adaptive F detector (AFD) (Arrowsmith et al., 2009) are compared to the signals identified by five independent analysts. Each detector was applied to a four-hour time sequence recorded by the Korean infrasound array CHNAR. This array was used because it is composed of both small (<100 m) and large (~1000 m) aperture element spacing. The four-hour time sequence contained a number of easily identified signals under noise conditions with average RMS amplitudes varying from 1.2 to 4.5 mPa (1 to 5 Hz), estimated with a running five-minute window. The effectiveness of the detectors was estimated for the small aperture, large aperture, small aperture combined with the large aperture, and full array. The full and combined arrays performed the best for AFD under all noise conditions, while the large aperture array had the poorest performance for both detectors. PMCC produced similar results to AFD under the lower noise conditions, but did not produce as dramatic an increase in detections using the full and combined arrays. Both automated detectors and the analysts produced a decrease in detections under the higher noise conditions. Comparing the detection probabilities with Estimated Receiver Operating Characteristic (EROC) curves, we found that the smaller value of consistency for PMCC and the larger p-value for AFD had the highest detection probability. These parameters produced greater changes in detection probability than estimates of the false alarm rate. The detection probability was impacted the most by noise level, with low noise (average RMS amplitude of 1.7 mPa) having an average detection probability of ~40% and high noise (average RMS amplitude of 2.9 mPa) an average detection probability of ~23%.
Time-dependent earthquake probabilities
Gomberg, J.; Belardinelli, M.E.; Cocco, M.; Reasenberg, P.
2005-01-01
We have attempted to provide a careful examination of a class of approaches for estimating the conditional probability of failure of a single large earthquake, particularly approaches that account for static stress perturbations to tectonic loading, as in the approaches of Stein et al. (1997) and Hardebeck (2004). We have developed a framework based on a simple, generalized rate change formulation and applied it to these two approaches to show how they relate to one another. We also have attempted to show the connection between models of seismicity rate changes applied to (1) populations of independent faults, as in background and aftershock seismicity, and (2) changes in estimates of the conditional probability of failure of a single fault, in which the notion of failure rate corresponds to successive failures of different members of a population of faults. The latter application requires specification of some probability distribution (probability density function, or PDF) that describes the population of potential recurrence times. This PDF may reflect our imperfect knowledge of when past earthquakes have occurred on a fault (epistemic uncertainty), the true natural variability in failure times, or some combination of both. We suggest two end-member conceptual single-fault models that may explain natural variability in recurrence times and suggest how they might be distinguished observationally. When viewed deterministically, these single-fault patch models differ significantly in their physical attributes, and when faults are immature, they differ in their responses to stress perturbations. Estimates of conditional failure probabilities effectively integrate over a range of possible deterministic fault models, usually with ranges that correspond to mature faults. Thus conditional failure probability estimates usually should not differ significantly for these models. Copyright 2005 by the American Geophysical Union.
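The conditional probability of failure referred to above follows the standard renewal-model calculation: given a recurrence-time distribution F, the probability of failure in the next Δt years, conditioned on t years having already elapsed without an event, is P = (F(t+Δt) − F(t)) / (1 − F(t)). A minimal sketch, assuming a lognormal recurrence model with made-up parameters:

```python
import math

# Conditional failure probability from a recurrence-time distribution:
#   P = (F(t + dt) - F(t)) / (1 - F(t))
# A lognormal recurrence-time PDF is assumed here purely for illustration.

def lognormal_cdf(t, mu, sigma):
    """CDF of a lognormal recurrence-time distribution (t > 0)."""
    return 0.5 * (1 + math.erf((math.log(t) - mu) / (sigma * math.sqrt(2))))

def conditional_failure_probability(t, dt, mu, sigma):
    """Probability of failure in (t, t + dt] given survival to time t."""
    F = lambda x: lognormal_cdf(x, mu, sigma)
    return (F(t + dt) - F(t)) / (1 - F(t))

if __name__ == "__main__":
    mu, sigma = math.log(150.0), 0.5   # median recurrence ~150 yr (assumed)
    # the 30-year conditional probability grows as elapsed time increases
    print(conditional_failure_probability(50.0, 30.0, mu, sigma))
    print(conditional_failure_probability(140.0, 30.0, mu, sigma))
```

Static stress perturbations of the kind discussed in the abstract enter such calculations by shifting t or modifying F; this sketch shows only the baseline conditional-probability step.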
Extreme river flow dependence in Northern Scotland
NASA Astrophysics Data System (ADS)
Villoria, M. Franco; Scott, M.; Hoey, T.; Fischbacher-Smith, D.
2012-04-01
Various methods for the spatial analysis of hydrologic data have been developed recently. Here we present results using the conditional probability approach proposed by Keef et al. [Appl. Stat. (2009): 58,601-18] to investigate spatial interdependence in extreme river flows in Scotland. This approach does not require the specification of a correlation function, being mostly suitable for relatively small geographical areas. The work is motivated by the Flood Risk Management Act (Scotland (2009)) which requires maps of flood risk that take account of spatial dependence in extreme river flow. The method is based on two conditional measures of spatial flood risk: firstly the conditional probability PC(p) that a set of sites Y = (Y 1,...,Y d) within a region C of interest exceed a flow threshold Qp at time t (or any lag of t), given that in the specified conditioning site X > Qp; and, secondly the expected number of sites within C that will exceed a flow Qp on average (given that X > Qp). The conditional probabilities are estimated using the conditional distribution of Y |X = x (for large x), which can be modeled using a semi-parametric approach (Heffernan and Tawn [Roy. Statist. Soc. Ser. B (2004): 66,497-546]). Once the model is fitted, pseudo-samples can be generated to estimate functionals of the joint tails of the distribution of (Y,X). Conditional return level plots were directly compared to traditional return level plots thus improving our understanding of the dependence structure of extreme river flow events. Confidence intervals were calculated using block bootstrapping methods (100 replicates). We report results from applying this approach to a set of four rivers (Dulnain, Lossie, Ewe and Ness) in Northern Scotland. These sites were chosen based on data quality, spatial location and catchment characteristics. The river Ness, being the largest (catchment size 1839.1km2) was chosen as the conditioning river. 
Both the Ewe (441.1 km2) and Ness catchments have predominantly impermeable bedrock, the Ewe's being particularly wet. The Lossie (216 km2) and Dulnain (272.2 km2) both contain significant areas of glacial deposits. River flow in the Dulnain is usually affected by snowmelt. In all cases, the conditional probability of each of the three rivers (Dulnain, Lossie, Ewe) decreases as the event in the conditioning river (Ness) becomes more extreme. The Ewe, despite being the furthest of the three sites from the Ness, shows the strongest dependence, with relatively high (>0.4) conditional probabilities even for very extreme events (>0.995). Although the Lossie is closer geographically to the Ness than the Ewe, it shows relatively low conditional probabilities and can be considered independent of the Ness for very extreme events (>0.990). The conditional probabilities seem to reflect the different catchment characteristics and dominant precipitation-generating events, with the Ewe being more similar to the Ness than the other two rivers. This interpretation suggests that the conditional method may yield improved estimates of extreme events, but the approach is time consuming. An alternative model that is easier to implement, using spatial quantile regression, is currently being investigated; it would also allow the introduction of further covariates, essential as the effects of climate change are incorporated into estimation procedures.
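A rough empirical analogue of the conditional measure P(Y > Qp | X > Qp) can be computed directly from paired flow records. This is only a naive counting estimate, not the semi-parametric Heffernan-Tawn model used in the study, and the data below are synthetic:

```python
# Empirical conditional exceedance: among the days on which the conditioning
# site X exceeds its p-quantile, count the fraction on which Y also exceeds
# its own p-quantile. Under independence this fraction would be about 1 - p.

def quantile(xs, p):
    """Linear-interpolation p-quantile of a sample."""
    xs = sorted(xs)
    pos = p * (len(xs) - 1)
    lo = int(pos)
    hi = min(lo + 1, len(xs) - 1)
    return xs[lo] + (pos - lo) * (xs[hi] - xs[lo])

def conditional_exceedance(x, y, p):
    """Estimate P(Y > Qp(Y) | X > Qp(X)) from paired observations."""
    qx, qy = quantile(x, p), quantile(y, p)
    days = [yi for xi, yi in zip(x, y) if xi > qx]
    if not days:
        return float("nan")
    return sum(yi > qy for yi in days) / len(days)

if __name__ == "__main__":
    import random
    random.seed(1)
    # synthetic, positively dependent daily flows
    x = [random.gammavariate(2, 50) for _ in range(2000)]
    y = [0.6 * xi + random.gammavariate(2, 20) for xi in x]
    print(conditional_exceedance(x, y, 0.95))  # well above the ~0.05 of independence
```

In the study's terms, a site like the Ewe would show a high value of this quantity even for large p, while a site independent of the Ness would show a value near 1 − p.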
Lost in search: (Mal-)adaptation to probabilistic decision environments in children and adults.
Betsch, Tilmann; Lehmann, Anne; Lindow, Stefanie; Lang, Anna; Schoemann, Martin
2016-02-01
Adaptive decision making in probabilistic environments requires individuals to use probabilities as weights in predecisional information searches and/or when making subsequent choices. Within a child-friendly computerized environment (Mousekids), we tracked 205 children's (105 children 5-6 years of age and 100 children 9-10 years of age) and 103 adults' (age range: 21-22 years) search behaviors and decisions under different probability dispersions (.17, .33, .83 vs. .50, .67, .83) and constraint conditions (instructions to limit search: yes vs. no). All age groups limited their depth of search when instructed to do so and when probability dispersion was high (range: .17-.83). Unlike adults, children failed to use probabilities as weights for their searches, which were largely not systematic. When examining choices, however, elementary school children (unlike preschoolers) systematically used probabilities as weights in their decisions. This suggests that an intuitive understanding of probabilities and the capacity to use them as weights during integration is not a sufficient condition for applying simple selective search strategies that place one's focus on weight distributions. PsycINFO Database Record (c) 2016 APA, all rights reserved.
The General Necessary Condition for the Validity of Dirac's Transition Perturbation Theory
NASA Technical Reports Server (NTRS)
Quang, Nguyen Vinh
1996-01-01
For the first time, the general necessary condition for the validity of Dirac's method is explicitly established from the natural requirements of the successive approximation. It is proved that the conception of 'the transition probability per unit time' is not valid. The 'super-platinium rules' for calculating the transition probability are derived for the case of an arbitrarily strong time-independent perturbation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Detrano, R.; Yiannikas, J.; Salcedo, E.E.
One hundred fifty-four patients referred for coronary arteriography were prospectively studied with stress electrocardiography, stress thallium scintigraphy, cine fluoroscopy (for coronary calcifications), and coronary angiography. Pretest probabilities of coronary disease were determined based on age, sex, and type of chest pain. These and pooled literature values for the conditional probabilities of test results based on disease state were used in Bayes theorem to calculate posttest probabilities of disease. The results of the three noninvasive tests were compared for statistical independence, a necessary condition for their simultaneous use in Bayes theorem. The test results were found to demonstrate pairwise independence in patients with and those without disease. Some dependencies that were observed between the test results and the clinical variables of age and sex were not sufficient to invalidate application of the theorem. Sixty-eight of the study patients had at least one major coronary artery obstruction of greater than 50%. When these patients were divided into low-, intermediate-, and high-probability subgroups according to their pretest probabilities, noninvasive test results analyzed by Bayesian probability analysis appropriately advanced 17 of them by at least one probability subgroup while only seven were moved backward. Of the 76 patients without disease, 34 were appropriately moved into a lower probability subgroup while 10 were incorrectly moved up. We conclude that posttest probabilities calculated from Bayes theorem more accurately classified patients with and without disease than did pretest probabilities, thus demonstrating the utility of the theorem in this application.
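The Bayesian updating described above can be sketched for a single dichotomous test. The sensitivity and specificity values below are illustrative placeholders, not the pooled literature values the study used; applying tests sequentially like this assumes the conditional independence the study verified:

```python
# Bayes theorem for one dichotomous test result: convert a pretest
# probability into a posttest probability using the test's sensitivity
# P(+|disease) and specificity P(-|no disease).

def posttest_probability(pretest, sensitivity, specificity, positive=True):
    """Posttest probability of disease after one test result."""
    if positive:
        num = sensitivity * pretest
        den = num + (1 - specificity) * (1 - pretest)
    else:
        num = (1 - sensitivity) * pretest
        den = num + specificity * (1 - pretest)
    return num / den

if __name__ == "__main__":
    p = 0.30  # pretest probability from age, sex, and type of chest pain
    p = posttest_probability(p, 0.70, 0.80, positive=True)   # stress ECG positive
    p = posttest_probability(p, 0.85, 0.90, positive=False)  # thallium scan negative
    print(round(p, 3))
```

With these illustrative numbers, a positive stress ECG raises 0.30 to 0.60, and a subsequent negative thallium scan lowers it back to 0.20, mirroring how the study moved patients between probability subgroups.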
Using optimal transport theory to estimate transition probabilities in metapopulation dynamics
Nichols, Jonathan M.; Spendelow, Jeffrey A.; Nichols, James D.
2017-01-01
This work considers the estimation of transition probabilities associated with populations moving among multiple spatial locations based on numbers of individuals at each location at two points in time. The problem is generally underdetermined as there exists an extremely large number of ways in which individuals can move from one set of locations to another. A unique solution therefore requires a constraint. The theory of optimal transport provides such a constraint in the form of a cost function, to be minimized in expectation over the space of possible transition matrices. We demonstrate the optimal transport approach on marked bird data and compare to the probabilities obtained via maximum likelihood estimation based on marked individuals. It is shown that by choosing the squared Euclidean distance as the cost, the estimated transition probabilities compare favorably to those obtained via maximum likelihood with marked individuals. Other implications of this cost are discussed, including the ability to accurately interpolate the population's spatial distribution at unobserved points in time and the more general relationship between the cost and minimum transport energy.
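For sites arranged on a line with the squared-Euclidean cost, the optimal transport plan between two count vectors is the monotone (north-west corner) coupling, and dividing each row of the plan by its total yields estimated transition probabilities. The sketch below is this 1-D special case with invented counts, not the paper's more general estimator:

```python
# Optimal transport between two count vectors over sorted 1-D sites.
# For a convex cost such as squared Euclidean distance, the monotone
# (north-west corner) coupling is optimal; row-normalizing the plan
# gives estimated transition probabilities between locations.

def monotone_transport(counts_t1, counts_t2):
    """Optimal plan between two equal-total count vectors on sorted 1-D sites."""
    assert sum(counts_t1) == sum(counts_t2), "totals must match"
    n, m = len(counts_t1), len(counts_t2)
    plan = [[0.0] * m for _ in range(n)]
    supply, demand = list(counts_t1), list(counts_t2)
    i = j = 0
    while i < n and j < m:
        moved = min(supply[i], demand[j])
        plan[i][j] += moved
        supply[i] -= moved
        demand[j] -= moved
        if supply[i] == 0:
            i += 1
        if demand[j] == 0:
            j += 1
    return plan

def transition_probabilities(plan):
    """Row-normalize a transport plan into transition probabilities."""
    return [[x / sum(row) if sum(row) else 0.0 for x in row] for row in plan]

if __name__ == "__main__":
    before = [30, 50, 20]  # hypothetical counts at three sites, time 1
    after = [10, 60, 30]   # counts at time 2
    for row in transition_probabilities(monotone_transport(before, after)):
        print([round(p, 2) for p in row])
```

The general (multi-dimensional) problem requires a linear-programming solver over all couplings with the chosen cost; the structure of the estimate, a cost-minimizing plan row-normalized into probabilities, is the same.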
Oberauer, Klaus; Awh, Edward; Sutterer, David W
2017-01-01
We report 4 experiments examining whether associations in visual working memory are subject to proactive interference from long-term memory (LTM). Following a long-term learning phase in which participants learned the colors of 120 unique objects, a working memory (WM) test was administered in which participants recalled the precise colors of 3 concrete objects in an array. Each array in the WM test consisted of 1 old (previously learned) object with a new color (old-mismatch), 1 old object with its old color (old-match), and 1 new object. Experiments 1 to 3 showed that WM performance was better in the old-match condition than in the new condition, reflecting a beneficial contribution from LTM. In the old-mismatch condition, participants sometimes reported colors associated with the relevant shape in LTM, but the probability of successful recall was equivalent to that in the new condition. Thus, information from LTM only intruded in the absence of reportable information in WM. Experiment 4 tested for, and failed to find, proactive interference from the preceding trial in the WM test: Performance in the old-mismatch condition, presenting an object from the preceding trial with a new color, was equal to performance with new objects. Experiment 5 showed that long-term memory for object-color associations is subject to proactive interference. We conclude that the exchange of information between LTM and WM appears to be controlled by a gating mechanism that protects the contents of WM from proactive interference but admits LTM information when it is useful. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
Influence of support conditions on vertical whole-body vibration of the seated human body.
M-Pranesh, Anand; Rakheja, Subhash; Demont, Richard
2010-01-01
The vibration transmission to the lumbar and thoracic segments of seated human subjects exposed to whole-body vibration of a vehicular nature has mostly been characterised without back and hand supports, which is not representative of general driving conditions. This non-invasive experimental study investigated the transmission of vertical seat vibration to selected vertebrae and the head along the vertical and fore-aft axes of twelve male human subjects seated on a rigid seat and exposed to random vertical excitation in the 0.5-20 Hz range. The measurements were performed under four different sitting postures involving combinations of back support conditions and hands positions, and three different magnitudes of vertical vibration (0.25, 0.5 and 1.0 m/s² rms acceleration). The results showed significant errors induced by sensor misalignment and skin effects, which required appropriate correction methodologies. The averaged corrected responses revealed that the back support attenuates vibration in the vertical axis at all the body locations while increasing the fore-aft transmissibility at C7 and T5. The hands position generally has a relatively smaller effect, showing some influence on C7 and L5 vibration. Sitting without a back support resulted in very low-magnitude fore-aft vibration at T5, which was substantially higher with a back support, suggestive of a probable change in the body's vibration mode. The effect of the back support was observed to be very small on the horizontal vibration of the lower thoracic and lumbar regions. The results suggest that distinctly different target body-segment biodynamic functions need to be defined for different support conditions in order to represent the unique contribution of each specific support condition. These datasets may then be useful for the development of biodynamic models.
Ghrelin in obesity, physiological and pharmacological considerations.
Álvarez-Castro, Paula; Pena, Lara; Cordido, Fernando
2013-04-01
The aim of this review is to summarize the physiological and pharmacological aspects of ghrelin. Obesity can be defined as an excess of body fat and is associated with significant disturbances in metabolic and endocrine function. Obesity has become a worldwide epidemic. In obesity there is decreased growth hormone (GH) secretion, and the altered somatotroph secretion in obesity is functional. Ghrelin is a peptide with a unique structure of 28 amino acids and an n-octanoyl ester at its third serine residue, which is essential for its potent stimulatory activity on somatotroph secretion. The pathophysiological mechanism responsible for GH hyposecretion in obesity is probably multifactorial, and there is probably a defect in ghrelin secretion. Ghrelin is the only known circulating orexigenic factor, and has been found to be reduced in obese humans. Ghrelin levels in blood decrease during periods of feeding. Due to its orexigenic and metabolic effects, ghrelin has a potential benefit in antagonizing protein breakdown and weight loss in catabolic conditions such as cancer cachexia, renal and cardiac disease, and age-related frailty. Theoretically, ghrelin receptor antagonists could be employed as anti-obesity drugs, blocking the orexigenic signal. By blocking constitutive receptor activity, inverse agonists of the ghrelin receptor may lower the set-point for hunger and could be used for the treatment of obesity. In summary, ghrelin secretion is reduced in obesity and could be partly responsible for GH hyposecretion; ghrelin antagonists or partial inverse agonists should therefore be considered for the treatment of obesity.
NASA Astrophysics Data System (ADS)
Zhai, L.
2017-12-01
Plant communities can be affected simultaneously by human activities and climate change, and quantifying and predicting this combined effect with an appropriate model framework validated by field data is complex, but very useful for conservation management. Plant communities in the Everglades provide a unique set of conditions for developing and validating such a framework, because they experience both intensive human activities (such as altered hydroperiod from drainage and restoration projects, nutrients from upstream agriculture, and prescribed fire) and climate changes (such as warming, changing precipitation patterns, and sea level rise). More importantly, previous research has focused on plant communities in the slough ecosystem (including ridge, slough, and their tree islands); very few studies consider the marl prairie ecosystem. Compared with the slough ecosystem, which remains flooded almost year-round, the marl prairie has a relatively shorter hydroperiod (inundated only during the wet season). Plant communities of the marl prairie may therefore be more strongly affected by hydroperiod change. In addition to hydroperiod, fire and nutrients also affect the plant communities in the marl prairie. To quantify the combined effects of water level, fire, and nutrients on the composition of these plant communities, we are developing a vegetation dynamics model based on a joint probability method. The model is being validated with field data on changes in vegetation assemblages along environmental gradients in the marl prairie. Our poster presents preliminary data from this project.
Computational Psychometrics for the Measurement of Collaborative Problem Solving Skills
Polyak, Stephen T.; von Davier, Alina A.; Peterschmidt, Kurt
2017-01-01
This paper describes a psychometrically-based approach to the measurement of collaborative problem solving skills, by mining and classifying behavioral data both in real-time and in post-game analyses. The data were collected from a sample of middle school children who interacted with a game-like, online simulation of collaborative problem solving tasks. In this simulation, a user is required to collaborate with a virtual agent to solve a series of tasks within a first-person maze environment. The tasks were developed following the psychometric principles of Evidence Centered Design (ECD) and are aligned with the Holistic Framework developed by ACT. The analyses presented in this paper are an application of an emerging discipline called computational psychometrics which is growing out of traditional psychometrics and incorporates techniques from educational data mining, machine learning and other computer/cognitive science fields. In the real-time analysis, our aim was to start with limited knowledge of skill mastery, and then demonstrate a form of continuous Bayesian evidence tracing that updates sub-skill level probabilities as new conversation flow event evidence is presented. This is performed using Bayes' rule and conversation item conditional probability tables. The items are polytomous and each response option has been tagged with a skill at a performance level. In our post-game analysis, our goal was to discover unique gameplay profiles by performing a cluster analysis of users' sub-skill performance scores based on their patterns of selected dialog responses.
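The Bayesian evidence-tracing step can be illustrated with a minimal sketch. The skill states, dialog options, and conditional probability table below are invented for illustration; they are not the assessment's actual item tables:

```python
# Prior belief over a single sub-skill (mastered vs. not mastered).
prior = {"mastered": 0.5, "not_mastered": 0.5}

# Hypothetical conditional probability table: P(selected option | skill state).
cpt = {
    "collaborative_reply": {"mastered": 0.7, "not_mastered": 0.3},
    "off_task_reply": {"mastered": 0.1, "not_mastered": 0.5},
    "neutral_reply": {"mastered": 0.2, "not_mastered": 0.2},
}

def update(belief, observation):
    """One Bayes-rule step: posterior is proportional to likelihood * prior."""
    unnorm = {s: cpt[observation][s] * p for s, p in belief.items()}
    z = sum(unnorm.values())
    return {s: u / z for s, u in unnorm.items()}

# Trace belief through a short sequence of observed dialog choices.
belief = prior
for obs in ["collaborative_reply", "collaborative_reply", "off_task_reply"]:
    belief = update(belief, obs)
```

Each observed conversation event re-weights the belief over mastery, which is the continuous-tracing behaviour the abstract describes.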
Deterioration and cost information for bridge management.
DOT National Transportation Integrated Search
2012-05-01
This study applies contract bid tabulations and element-level condition records to develop element-level actions, costs for actions, transition probabilities for models of deterioration of bridge elements, and transition probabilities for imp...
Robust parameter design for automatically controlled systems and nanostructure synthesis
NASA Astrophysics Data System (ADS)
Dasgupta, Tirthankar
2007-12-01
This research focuses on developing comprehensive frameworks for developing robust parameter design methodology for dynamic systems with automatic control and for synthesis of nanostructures. In many automatically controlled dynamic processes, the optimal feedback control law depends on the parameter design solution and vice versa and therefore an integrated approach is necessary. A parameter design methodology in the presence of feedback control is developed for processes of long duration under the assumption that experimental noise factors are uncorrelated over time. Systems that follow a pure-gain dynamic model are considered and the best proportional-integral and minimum mean squared error control strategies are developed by using robust parameter design. The proposed method is illustrated using a simulated example and a case study in a urea packing plant. This idea is also extended to cases with on-line noise factors. The possibility of integrating feedforward control with a minimum mean squared error feedback control scheme is explored. To meet the needs of large scale synthesis of nanostructures, it is critical to systematically find experimental conditions under which the desired nanostructures are synthesized reproducibly, at large quantity and with controlled morphology. The first part of the research in this area focuses on modeling and optimization of existing experimental data. Through a rigorous statistical analysis of experimental data, models linking the probabilities of obtaining specific morphologies to the process variables are developed. A new iterative algorithm for fitting a Multinomial GLM is proposed and used. The optimum process conditions, which maximize the above probabilities and make the synthesis process less sensitive to variations of process variables around set values, are derived from the fitted models using Monte-Carlo simulations. 
The second part of the research deals with the development of an experimental design methodology, tailor-made to address the unique phenomena associated with nanostructure synthesis. A sequential space-filling design called Sequential Minimum Energy Design (SMED) is proposed for exploring the best process conditions for the synthesis of nanowires. The SMED is a novel approach to generate sequential designs that are model independent, can quickly "carve out" regions with no observable nanostructure morphology, and allow for the exploration of complex response surfaces.
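The Multinomial GLM described above links process variables to morphology probabilities through a multinomial-logit (softmax) form. A minimal sketch with invented coefficients (not the fitted values from the study):

```python
import math

# Illustrative coefficients linking standardised process variables
# (temperature, pressure) to morphology log-odds; invented for this sketch.
coef = {
    "nanowire": [1.2, -0.5, 0.8],    # intercept, temperature, pressure
    "nanobelt": [0.3, 0.9, -0.4],
    "no_growth": [0.0, 0.0, 0.0],    # baseline category, fixed at zero
}

def morphology_probs(temperature, pressure):
    """Multinomial-logit (softmax) probabilities of each morphology class."""
    scores = {m: b[0] + b[1] * temperature + b[2] * pressure
              for m, b in coef.items()}
    mx = max(scores.values())                       # for numerical stability
    exps = {m: math.exp(s - mx) for m, s in scores.items()}
    z = sum(exps.values())
    return {m: e / z for m, e in exps.items()}

probs = morphology_probs(temperature=1.0, pressure=0.5)
```

Optimising such a fitted surface over (temperature, pressure), e.g. by Monte Carlo sampling around candidate set points, is the spirit of the robustness analysis the abstract describes.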
Sussman, Steve; Pokhrel, Pallav; Sun, Ping; Rohrbach, Louise A.; Spruijt-Metz, Donna
2015-01-01
Background and Aims: Recent work has studied addictions using a matrix measure, which taps multiple addictions through single responses for each type. This is the first longitudinal study using a matrix measure. Methods: We investigated the use of this approach among former alternative high school youth (average age = 19.8 years at baseline; longitudinal n = 538) at risk for addictions. Lifetime and last 30-day prevalence of one or more of 11 addictions reviewed in other work was the primary focus (i.e., cigarettes, alcohol, hard drugs, shopping, gambling, Internet, love, sex, eating, work, and exercise). These were examined at two time-points one year apart. Latent class and latent transition analyses (LCA and LTA) were conducted in Mplus. Results: Prevalence rates were stable across the two time-points. As in the cross-sectional baseline analysis, the 2-class model (addiction class, non-addiction class) fit the data better at follow-up than models with more classes. Item-response (conditional) probabilities for each addiction type did not differ between time-points. As a result, the LTA model constrained the conditional probabilities to be equal across the two time-points. In the addiction class, larger conditional probabilities (i.e., 0.40-0.49) were found for love, sex, exercise, and work addictions; medium conditional probabilities (i.e., 0.17-0.27) were found for cigarette, alcohol, other drug, eating, Internet, and shopping addictions; and a small conditional probability (0.06) was found for gambling. Discussion and Conclusions: Persons in an addiction class tend to remain in that class over a one-year period.
Modeling the effect of toe clipping on treefrog survival: Beyond the return rate
Waddle, J.H.; Rice, K.G.; Mazzotti, F.J.; Percival, H.F.
2008-01-01
Some studies have described a negative effect of toe clipping on return rates of marked anurans, but the return rate is limited in that it does not account for heterogeneity of capture probabilities. We used open population mark-recapture models to estimate both apparent survival (ϕ) and the recapture probability (p) of two treefrog species individually marked by clipping 2–4 toes. We used information-theoretic model selection to examine the effect of toe clipping on survival while accounting for variation in capture probability. The model selection results indicate strong support for an effect of toe clipping on survival of Green Treefrogs (Hyla cinerea) and only limited support for an effect of toe clipping on capture probability. We estimate there was a mean absolute decrease in survival of 5.02% and 11.16% for Green Treefrogs with three and four toes removed, respectively, compared to individuals with just two toes removed. Results for Squirrel Treefrogs (Hyla squirella) indicate little support for an effect of toe clipping on survival but may indicate some support for a negative effect on capture probability. We believe that the return rate alone should not be used to examine survival of marked animals because constant capture probability must be assumed, and our examples demonstrate how capture probability may vary over time and among groups. Mark-recapture models provide a method for estimating the effect of toe clipping on anuran survival in situations where unique marks are applied.
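The open-population mark-recapture models referred to above can be illustrated by the likelihood of a single capture history under the Cormack-Jolly-Seber (CJS) model, with apparent survival ϕ and recapture probability p. The parameter values below are toy numbers, not the paper's estimates:

```python
def cjs_history_prob(history, phi, p):
    """Likelihood of one capture history ('1' = captured, '0' = not) under
    the CJS model, conditioned on first capture.
    phi = apparent survival per interval, p = recapture probability."""
    T = len(history)
    first = history.index("1")
    last = max(i for i, c in enumerate(history) if c == "1")
    # chi[t]: probability an animal alive at occasion t is never seen again
    chi = [0.0] * (T + 1)
    chi[T] = 1.0
    for t in range(T - 1, 0, -1):
        chi[t] = (1 - phi) + phi * (1 - p) * chi[t + 1]
    prob = 1.0
    for t in range(first + 1, last + 1):
        prob *= phi * (p if history[t] == "1" else 1 - p)
    return prob * chi[last + 1]

# With phi = 0.8 and p = 0.5, the two possible continuations of a
# first capture on occasion 1 of 2 have probabilities that sum to one.
p11 = cjs_history_prob("11", 0.8, 0.5)   # survived and recaptured
p10 = cjs_history_prob("10", 0.8, 0.5)   # never seen again
```

Summing the log of such terms over all marked individuals gives the likelihood that is maximised to estimate ϕ and p separately, which is exactly what a raw return rate cannot do.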
The impact of land ownership, firefighting, and reserve status on fire probability in California
NASA Astrophysics Data System (ADS)
Starrs, Carlin Frances; Butsic, Van; Stephens, Connor; Stewart, William
2018-03-01
The extent of wildfires in the western United States is increasing, but how land ownership, firefighting, and reserve status influence fire probability is unclear. California serves as a unique natural experiment to estimate the impact of these factors, as ownership is split equally between federal and non-federal landowners; there is a relatively large proportion of reserved lands where extractive uses are prohibited and fire suppression is limited; and land ownership and firefighting responsibility are purposefully not always aligned. Panel Poisson regression techniques and pre-regression matching were used to model changes in annual fire probability from 1950 to 2015 on reserve and non-reserve lands on federal and non-federal ownerships across four vegetation types: forests, rangelands, shrublands, and forests without commercial species. Fire probability was found to have increased over time across all 32 categories. A marginal effects analysis showed that federal ownership and firefighting was associated with increased fire probability, and that the difference in fire probability on federal versus non-federal lands is increasing over time. Ownership, firefighting, and reserve status played roughly equal roles in determining fire probability, and were found to have much greater influence than average maximum temperature (°C) during summer months (June, July, August), average annual precipitation (cm), and average annual topsoil moisture content by volume, demonstrating the critical role these factors play in western fire regimes and the importance of including them in future analysis focused on understanding and predicting wildfire in the Western United States.
Synthesis of silver-titanium dioxide nanocomposites for antimicrobial applications
NASA Astrophysics Data System (ADS)
Yang, X. H.; Fu, H. T.; Wang, X. C.; Yang, J. L.; Jiang, X. C.; Yu, A. B.
2014-08-01
Silver-titanium dioxide (Ag-TiO2) nanostructures have attracted increasing attention because of their unique functional properties and potential applications in many areas such as photocatalysis, antibacterial agents, and self-cleaning coatings. In this study, Ag@TiO2 core-shell nanostructures and Ag-decorated TiO2 particles (TiO2@Ag), both ranging from 200 to 300 nm in size, have been synthesized by a facile yet efficient method. These two types of hybrid nanostructures, characterized by various advanced techniques (TEM, XRD, BET and others), exhibit unique functional properties, particularly antibacterial activity toward Gram-negative Escherichia coli as a case study. Specifically: (i) the TiO2@Ag nanoparticles are superior to the Ag@TiO2 core-shell ones in inhibiting bacterial growth under standard culture conditions (37 °C incubator), where silver may dominate the antibacterial performance; (ii) after UV irradiation treatment, the Ag@TiO2 core-shell nanoparticles exhibit better performance in killing grown bacteria than the TiO2@Ag ones, probably because the Ag cores facilitate charge separation for TiO2 and thus produce more hydroxyl radicals on the surface of the TiO2 particles; and (iii) without UV irradiation, both TiO2@Ag and Ag@TiO2 nanostructures show poor capabilities in killing mature bacteria. These findings would be useful for designing hybrid metal oxide nanocomposites with desirable functionalities for bioapplications such as sterilization, deodorization, and water purification.
Xu, Xiangming; Passey, Thomas; Wei, Feng; Saville, Robert; Harrison, Richard J.
2015-01-01
A phenomenon of yield decline due to weak plant growth in strawberry was recently observed in non-chemo-fumigated soils, which was not associated with the soil fungal pathogen Verticillium dahliae, the main target of fumigation. Amplicon-based metagenomics was used to profile soil microbiota in order to identify microbial organisms that may have caused the yield decline. A total of 36 soil samples were obtained in 2013 and 2014 from four sites for metagenomic studies; two of the four sites had a yield-decline problem, the other two did not. More than 2000 fungal or bacterial operational taxonomy units (OTUs) were found in these samples. Relative abundance of individual OTUs was statistically compared for differences between samples from sites with or without yield decline. A total of 721 individual comparisons were statistically significant, involving 366 unique bacterial and 44 unique fungal OTUs. Based on further selection criteria, we focused on 34 bacterial and 17 fungal OTUs and found that yield decline probably resulted from one or more of the following four factors: (1) low abundance of Bacillus and Pseudomonas populations, which are well known for their ability to suppress pathogen development and/or promote plant growth; (2) lack of the nematophagous fungus (Paecilomyces species); (3) a high level of two non-specific fungal root rot pathogens; and (4) wet soil conditions. This study demonstrated the usefulness of an amplicon-based metagenomics approach to profile soil microbiota and to detect differential abundance in microbes.
Efird, Jimmy T; Griffin, William F; Gudimella, Preeti; O'Neal, Wesley T; Davies, Stephen W; Crane, Patricia B; Anderson, Ethan J; Kindell, Linda C; Landrine, Hope; O'Neal, Jason B; Alwair, Hazaim; Kypson, Alan P; Nifong, Wiley L; Chitwood, W Randolph
2015-09-01
Conditional survival is defined as the probability of surviving an additional number of years beyond that already survived. The aim of this study was to compute conditional survival in patients who received a robotically assisted, minimally invasive mitral valve repair procedure (RMVP). Patients who received RMVP with annuloplasty band from May 2000 through April 2011 were included. A 5- and 10-year conditional survival model was computed using a multivariable product-limit method. Non-smoking men (≤65 years) who presented in sinus rhythm had a 96% probability of surviving at least 10 years if they survived their first year following surgery. In contrast, recent female smokers (>65 years) with preoperative atrial fibrillation only had an 11% probability of surviving beyond 10 years if alive after one year post-surgery. In the context of an increasingly managed healthcare environment, conditional survival provides useful information for patients needing to make important treatment decisions, physicians seeking to select patients most likely to benefit long-term following RMVP, and hospital administrators needing to comparatively assess the life-course economic value of high-tech surgical procedures.
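The idea of conditional survival can be sketched with a product-limit (Kaplan-Meier) estimate on invented data: the probability of surviving an additional x years given survival to year t is S(t + x) / S(t). The survival times below are toy numbers, not RMVP patient data:

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate; events: 1 = died, 0 = censored.
    Returns the survival curve as a list of (time, S(time)) steps."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    s = 1.0
    curve = []
    i = 0
    while i < len(data):
        t = data[i][0]
        tied = [e for tt, e in data if tt == t]
        d = sum(tied)                    # deaths at time t
        if d:
            s *= 1 - d / at_risk
            curve.append((t, s))
        at_risk -= len(tied)
        i += len(tied)
    return curve

def surv_at(curve, t):
    """Step-function lookup of S(t)."""
    s = 1.0
    for time, val in curve:
        if time <= t:
            s = val
    return s

def conditional_survival(curve, survived, extra):
    """P(T > survived + extra | T > survived) = S(survived+extra) / S(survived)."""
    return surv_at(curve, survived + extra) / surv_at(curve, survived)

curve = kaplan_meier(times=[1, 2, 3, 4, 5], events=[1, 0, 1, 0, 1])
cs = conditional_survival(curve, survived=1, extra=2)
```

The multivariable version in the abstract stratifies such product-limit estimates by covariates (age, sex, smoking, rhythm), but the conditioning step is the same ratio.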
Predictive models attribute effects on fish assemblages to toxicity and habitat alteration.
de Zwart, Dick; Dyer, Scott D; Posthuma, Leo; Hawkins, Charles P
2006-08-01
Biological assessments should both estimate the condition of a biological resource (magnitude of alteration) and provide environmental managers with a diagnosis of the potential causes of impairment. Although methods of quantifying condition are well developed, identifying and proportionately attributing impairment to probable causes remain problematic. Furthermore, analyses of both condition and cause have often been difficult to communicate. We developed an approach that (1) links fish, habitat, and chemistry data collected from hundreds of sites in Ohio (USA) streams, (2) assesses the biological condition at each site, (3) attributes impairment to multiple probable causes, and (4) provides the results of the analyses in simple-to-interpret pie charts. The data set was managed using a geographic information system. Biological condition was assessed using a RIVPACS (river invertebrate prediction and classification system)-like predictive model. The model provided probabilities of capture for 117 fish species based on the geographic location of sites and local habitat descriptors. Impaired biological condition was defined as the proportion of those native species predicted to occur at a site that were observed. The potential toxic effects of exposure to mixtures of contaminants were estimated using species sensitivity distributions and mixture toxicity principles. Generalized linear regression models described species abundance as a function of habitat characteristics. Statistically linking biological condition, habitat characteristics including mixture risks, and species abundance allowed us to evaluate the losses of species with environmental conditions. Results were mapped as simple effect and probable-cause pie charts (EPC pie diagrams), with pie sizes corresponding to magnitude of local impairment, and slice sizes to the relative probable contributions of different stressors. 
The types of models we used have been successfully applied in ecology and ecotoxicology, but they have not previously been used in concert to quantify impairment and its likely causes. Although data limitations constrained our ability to examine complex interactions between stressors and species, the direct relationships we detected likely represent conservative estimates of stressor contributions to local impairment. Future refinements of the general approach and specific methods described here should yield even more promising results.
Austin, Samuel H.; Nelms, David L.
2017-01-01
Climate change raises concern that risks of hydrological drought may be increasing. We estimate hydrological drought probabilities for rivers and streams in the United States (U.S.) using maximum likelihood logistic regression (MLLR). Streamflow data from winter months are used to estimate the chance of hydrological drought during summer months. Daily streamflow data collected from 9,144 stream gages from January 1, 1884 through January 9, 2014 provide hydrological drought streamflow probabilities for July, August, and September as functions of streamflows during October, November, December, January, and February, estimating outcomes 5-11 months ahead of their occurrence. Few drought prediction methods exploit temporal links among streamflows. We find MLLR modeling of drought streamflow probabilities exploits the explanatory power of temporally linked water flows. MLLR models with strong correct classification rates were produced for streams throughout the U.S. One ad hoc test of correct prediction rates of September 2013 hydrological droughts exceeded 90% correct classification. Some of the best-performing models coincide with areas of high concern including the West, the Midwest, Texas, the Southeast, and the Mid-Atlantic. Using hydrological drought MLLR probability estimates in a water management context can inform understanding of drought streamflow conditions, provide warning of future drought conditions, and aid water management decision making.
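The MLLR approach can be sketched in miniature: fit a logistic regression of summer drought occurrence on a winter streamflow predictor by maximum likelihood, here via plain gradient ascent on the log-likelihood. The data are invented and standardised, not USGS gage records:

```python
import math

# Hypothetical standardised winter flows and summer outcomes (1 = drought).
winter_flow = [-2.0, -1.5, -1.0, -0.5, 0.0, 0.5, 1.0, 1.5, 2.0]
drought = [1, 1, 1, 1, 0, 0, 0, 0, 0]

def fit_logistic(x, y, lr=0.1, steps=5000):
    """Maximum-likelihood logistic regression fit by gradient ascent."""
    b0 = b1 = 0.0
    for _ in range(steps):
        g0 = g1 = 0.0
        for xi, yi in zip(x, y):
            p = 1.0 / (1.0 + math.exp(-(b0 + b1 * xi)))
            g0 += yi - p           # d(loglik)/d b0
            g1 += (yi - p) * xi    # d(loglik)/d b1
        b0 += lr * g0 / len(x)
        b1 += lr * g1 / len(x)
    return b0, b1

b0, b1 = fit_logistic(winter_flow, drought)

def p_drought(x):
    """Estimated probability of summer drought given winter flow x."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * x)))
```

Low winter flow maps to high drought probability (negative slope), which is the temporal link the abstract exploits months ahead of the outcome.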
Zhang, Yun; Kasai, Katsuyuki; Watanabe, Masayoshi
2003-01-13
We give the intensity fluctuation joint probability of the twin-beam quantum state, which was generated with an optical parametric oscillator operating above threshold. Then we present what is, to our knowledge, the first measurement of the intensity-fluctuation conditional probability distributions of twin beams. The measured inference variance of the twin beams, 0.62 ± 0.02, is less than the standard quantum limit of unity, indicating inference with a precision better than that of separable states. The measured photocurrent variance exhibits a quantum correlation of as much as -4.9 ± 0.2 dB between the signal and the idler.
Quantifying Thin Mat Floating Marsh Strength and Interaction with Hydrodynamic Conditions
NASA Astrophysics Data System (ADS)
Collins, J. H., III; Sasser, C.; Willson, C. S.
2016-12-01
Louisiana possesses over 350,000 acres of unique floating vegetated systems known as floating marshes or flotants. Floating marshes make up 70% of the Terrebonne and Barataria basin wetlands and exist in several forms, mainly thick mat or thin mat. Salt-water intrusion, nutria grazing, and high-energy wave events are believed to be some contributing factors to the degradation of floating marshes; however, there has been little investigation into the hydrodynamic effects on their structural integrity. Due to their unique nature, floating marshes could be susceptible to changes in the hydrodynamic environment that may result from proposed river freshwater and sediment diversion projects introducing flow to areas that are typically somewhat isolated. This study aims to improve the understanding of how thin mat floating marshes respond to increased hydrodynamic stresses and, more specifically, how higher water velocities might increase the washout probability of this vegetation type. There are two major components of this research: 1) A thorough measurement of the material properties of the vegetative mats as a root-soil matrix composite material; and 2) An accurate numerical simulation of the hydrodynamics and forces imposed on the floating marsh mats by the flow. To achieve these goals, laboratory and field experiments were conducted using a customized device to measure the bulk properties of typical floating marshes. Additionally, Delft-3D FLOW and ANSYS FLUENT were used to simulate the flow around a series of simplified mat structures in order to estimate the hydrodynamic forcings on the mats. The hydrodynamic forcings are coupled with a material analysis, allowing for a thorough analysis of their interaction under various conditions. The 2-way Fluid Structure Interaction (F.S.I.) between the flow and the mat is achieved by coupling a Finite Element Analysis (F.E.A.) solver in ANSYS with FLUENT. 
The flow conditions necessary for the structural failure of the floating marshes are determined for a multitude of mat shapes and sizes, leading to a quantifiable critical velocity required for washout. Ultimately, through dimensional analysis, an equation for washout potential will be developed from the results, which could be used as a design guideline.
NASA Astrophysics Data System (ADS)
Pangle, L. A.; Cardoso, C.; Kim, M.; Lora, M.; Wang, Y.; Troch, P. A. A.; Harman, C. J.
2014-12-01
Water molecules traverse myriad flow paths and spend different lengths of time on or within the landscape before they are discharged into a stream channel. The transit-time distribution (TTD) is a probability distribution that represents the range and likelihood of transit times for water and conservative solutes within soils and catchments, and is useful for comparative analysis and prediction of solute transport into streams. The TTD has customarily been assumed to be time-invariant in practical applications, but is understood to vary due to unsteady flow rates, changes in water-balance partitioning, and shifting flow pathways. Recent theoretical advances have clarified how the distribution of transit times experienced by water and solutes within a stream channel at any moment in time is conditional on the specific series of precipitation events preceding that time. Observations resolving how TTDs vary during a specific sequence of precipitation events could be obtained by introducing unique and conservative tracers during each event and quantifying their distinct breakthrough curves in the stream. At present, the number of distinct and conservative tracers available for this purpose is insufficient. Harman and Kim [Harman, C.J. and Kim, M., 2014, Geophysical Research Letters, 41, 1567-1575] proposed a new experimental method—based on the establishment of periodic steady-state conditions—that allows multiple overlapping breakthrough curves of non-unique tracers to be decomposed, thus enabling analysis of the distinct TTDs associated with their specific times of introduction through precipitation. We present results from one of the first physical experiments to test this methodology. Our experiment involves a sloping lysimeter (10° slope) that contains one cubic meter of crushed basalt rock (loamy sand texture), an irrigation system adaptable to controlled tracer introductions, and instruments that enable total water balance monitoring. 
We imposed a repeated sequence of rainfall pulses and achieved periodic-steady-state conditions over 24 days. Using systematic introductions of deuterium- and chloride-enriched water, and the PERTH method, we resolve the time-conditional TTDs associated with tracer injections that occurred during specific intervals of the overall rainfall sequence.
Frequency analyses for recent regional floods in the United States
Melcher, Nick B.; Martinez, Patsy G.; ,
1996-01-01
During 1993-95, significant floods that resulted in record-high river stages, loss of life, and significant property damage occurred in the United States. The floods were caused by unique global weather patterns that produced large amounts of rain over large areas. Standard methods for flood-frequency analyses may not adequately consider the probability of recurrence of these global weather patterns.
A note on a simplified and general approach to simulating from multivariate copula functions
Barry K. Goodwin
2013-01-01
Copulas have become an important analytic tool for characterizing multivariate distributions and dependence. One is often interested in simulating data from copula estimates. The process can be analytically and computationally complex and usually involves steps that are unique to a given parametric copula. We describe an alternative approach that uses "Probability-...
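Since the abstract is truncated, the following is a hedged sketch of one standard route to copula simulation (a Gaussian copula via the probability integral transform), illustrative rather than necessarily the paper's method: draw correlated normals, push each margin through the normal CDF to obtain dependent uniforms, then through any inverse marginal CDF, here Exponential(1).

```python
import math
import numpy as np

rng = np.random.default_rng(0)
rho = 0.7                                  # assumed copula correlation
cov = np.array([[1.0, rho], [rho, 1.0]])
z = rng.multivariate_normal(np.zeros(2), cov, size=100_000)

# Probability integral transform: normal CDF maps each margin to Uniform(0,1)
norm_cdf = np.vectorize(lambda v: 0.5 * (1.0 + math.erf(v / math.sqrt(2.0))))
u = norm_cdf(z)                            # dependent uniform margins
x = -np.log(1.0 - u)                       # exponential margins, same copula
```

The dependence structure survives the marginal transformation, which is the point of the copula construction.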
ERIC Educational Resources Information Center
DeCicca, Philip; Smith, Justin D.
2011-01-01
We investigate short- and long-term effects of early childhood education using variation created by a unique policy experiment in British Columbia, Canada. Our findings imply starting Kindergarten one year late substantially reduces the probability of repeating the third grade, and meaningfully increases tenth grade math and reading scores.…
ERIC Educational Resources Information Center
Obata, Miki
2010-01-01
The goal of the dissertation is to determine aspects of the structure of the human language faculty, a cognitive system, specifically focusing on human syntactic systems (unique in the animal kingdom), which enable us to creatively produce an unlimited number of grammatical sentences (like the one you just read, probably never before written or…
A probability space for quantum models
NASA Astrophysics Data System (ADS)
Lemmens, L. F.
2017-06-01
A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions allows the maximum entropy method to be used to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, the choice of constraints, and the probability assignment made by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
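A toy maximum-entropy assignment in the spirit described above (the dice example is illustrative, not from the paper): assign probabilities to a six-sided die given only the constraint that the mean roll is 4.5. The maximum-entropy solution has the Gibbs form p_i ∝ exp(lam·i), with lam fixed by matching the constraint; here lam is found by bisection.

```python
import math

FACES = range(1, 7)

def mean_for(lam):
    """Mean roll under the Gibbs distribution p_i proportional to exp(lam*i)."""
    w = [math.exp(lam * i) for i in FACES]
    z = sum(w)
    return sum(i * wi for i, wi in zip(FACES, w)) / z

# Bisection: mean_for is monotone increasing in lam.
lo, hi = -5.0, 5.0
for _ in range(200):
    mid = 0.5 * (lo + hi)
    lo, hi = (mid, hi) if mean_for(mid) < 4.5 else (lo, mid)
lam = 0.5 * (lo + hi)

w = [math.exp(lam * i) for i in FACES]
p = [wi / sum(w) for wi in w]          # maximum-entropy probabilities
```

With a mean above 3.5, the assignment tilts toward the higher faces, exactly as the exponential form dictates.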
Rethinking the learning of belief network probabilities
DOE Office of Scientific and Technical Information (OSTI.GOV)
Musick, R.
Belief networks are a powerful tool for knowledge discovery that provide concise, understandable probabilistic models of data. There are methods grounded in probability theory to incrementally update the relationships described by the belief network when new information is seen, to perform complex inferences over any set of variables in the data, to incorporate domain expertise and prior knowledge into the model, and to automatically learn the model from data. This paper concentrates on part of the belief network induction problem, that of learning the quantitative structure (the conditional probabilities), given the qualitative structure. In particular, the current practice of rote learning the probabilities in belief networks can be significantly improved upon. We advance the idea of applying any learning algorithm to the task of conditional probability learning in belief networks, discuss potential benefits, and show results of applying neural networks and other algorithms to a medium-sized car insurance belief network. The results demonstrate from 10 to 100% improvements in model error rates over the current approaches.
Extreme weather and experience influence reproduction in an endangered bird
Reichert, Brian E.; Cattau, Christopher E.; Fletcher, Robert J.; Kendall, William L.; Kitchens, Wiley M.
2012-01-01
Using a 14-year time series spanning large variation in climatic conditions and the entirety of a population's breeding range, we estimated the effects of extreme weather conditions (drought) on the state-specific probabilities of breeding and survival of an endangered bird, the Florida Snail Kite (Rostrhamus sociabilis plumbeus). Our analysis accounted for uncertainty in breeding status assignment, a common source of uncertainty that is often ignored when states are based on field observations. Breeding probabilities in adult kites (>1 year of age) decreased during droughts, whereas the probability of breeding in young kites (1 year of age) tended to increase. Individuals attempting to breed showed no evidence of reduced future survival. Although population viability analyses of this species and other species often implicitly assume that all adults will attempt to breed, we find that breeding probabilities were significantly <1 for all 13 estimable years considered. Our results suggest that experience is an important factor determining whether or not individuals attempt to breed during harsh environmental conditions and that reproductive effort may be constrained by an individual's quality and/or despotic behavior among individuals attempting to breed.
NASA Technical Reports Server (NTRS)
Freeman, Frederick G.
1993-01-01
The increased use of automation in the cockpits of commercial planes has dramatically decreased the workload requirements of pilots, enabling them to function more efficiently and with a higher degree of safety. Unfortunately, advances in technology have led to an unexpected problem: the decreased demands on pilots have increased the probability of inducing 'hazardous states of awareness.' A hazardous state of awareness is defined as a decreased level of alertness or arousal which makes an individual less capable of reacting to unique or emergency types of situations. These states tend to be induced when an individual is not actively processing information. Under such conditions a person is likely to let his/her mind wander, either to internal states or to irrelevant external conditions. As a result, he/she is less capable of reacting quickly to emergency situations. Since emergencies are relatively rare, and since the highly automated cockpit requires progressively decreasing levels of engagement, the probability of being seduced into a lowered state of awareness is increasing. This further decreases the readiness of the pilot to react to unique circumstances such as system failures. The HEM Lab at NASA-Langley Research Center has been studying how these states of awareness are induced and what the physiological correlates of these different states are. Specifically, they have been interested in studying electroencephalographic (EEG) measures of different states of alertness to determine if such states can be identified and, hopefully, avoided. The project worked on this summer involved analyzing the EEG and event-related potential (ERP) data collected while subjects performed under two conditions. Each condition required subjects to perform a relatively boring vigilance task. The purpose of using these tasks was to induce a decreased state of awareness while still requiring the subject to process information.
Each task involved identifying an infrequently presented target stimulus. In addition to the task requirements, irrelevant tones were presented in the background. Research has shown that even though these stimuli are not attended, ERPs to them can still be elicited. The amplitude of the ERP waves has been shown to change as a function of a person's level of alertness. ERPs were also collected and analyzed for the target stimuli for each task. Brain maps were produced based on the ERP voltages for the different stimuli. In addition to the ERPs, a quantitative EEG (QEEG) was performed on the data using a fast Fourier technique to produce a power spectral analysis of the EEG. This analysis was conducted on the continuous EEG while the subjects were performing the tasks. Finally, a QEEG was performed on periods during the task when subjects indicated that they were in an altered state of awareness. During the tasks, subjects were asked to indicate by pressing a button when they realized their level of task awareness had changed. EEG epochs were collected for times just before and just after subjects made this response. The purpose of this final analysis was to determine whether or not subjective indices of level of awareness could be correlated with different patterns of EEG.
Unique Association of Rare Cardiovascular Disease in an Athlete With Ventricular Arrhythmias.
Santomauro, V; Contursi, M; Dellegrottaglie, S; Borsellino, G
2015-01-01
Ventricular arrhythmias are a leading cause of non-eligibility for competitive sport. The failure to detect a significant organic substrate in the initial stage of screening does not preclude the identification of structural pathologies in the follow-up using advanced imaging techniques. Here we report the case of a senior athlete judged not eligible because of an arrhythmia whose morphology was consistent with an origin in the left ventricle, in whom subsequent cardiac MR and thoracic CT imaging allowed the identification of a unique association between an area of myocardial damage, the probable site of origin of the arrhythmia, and a rare aortic malformation.
Social Interactions under Incomplete Information: Games, Equilibria, and Expectations
NASA Astrophysics Data System (ADS)
Yang, Chao
My dissertation research investigates interactions of agents' behaviors through social networks when some information is not shared publicly, focusing on solutions to a series of challenging problems in empirical research, including heterogeneous expectations and multiple equilibria. The first chapter, "Social Interactions under Incomplete Information with Heterogeneous Expectations", extends the current literature in social interactions by devising econometric models and estimation tools with private information in not only the idiosyncratic shocks but also some exogenous covariates. For example, when analyzing peer effects in class performances, it was previously assumed that all control variables, including individual IQ and SAT scores, are known to the whole class, which is unrealistic. This chapter allows such exogenous variables to be private information and models agents' behaviors as outcomes of a Bayesian Nash Equilibrium in an incomplete information game. The distribution of equilibrium outcomes can be described by the equilibrium conditional expectations, which is unique when the parameters are within a reasonable range according to the contraction mapping theorem in function spaces. The equilibrium conditional expectations are heterogeneous in both exogenous characteristics and the private information, which makes estimation in this model more demanding than in previous ones. This problem is solved in a computationally efficient way by combining the quadrature method and the nested fixed point maximum likelihood estimation. In Monte Carlo experiments, if some exogenous characteristics are private information and the model is estimated under the mis-specified hypothesis that they are known to the public, estimates will be biased. Applying this model to municipal public spending in North Carolina, significant negative correlations between contiguous municipalities are found, showing free-riding effects. 
The second chapter, "A Tobit Model with Social Interactions under Incomplete Information", is an application of the first chapter to censored outcomes, corresponding to the situation when agents' behaviors are subject to some binding restrictions. In an interesting empirical analysis of property tax rates set by North Carolina municipal governments, it is found that there is a significant positive correlation among nearby municipalities. Additionally, some private information about its own residents is used by a municipal government to predict others' tax rates, which enriches current empirical work on tax competition. The third chapter, "Social Interactions under Incomplete Information with Multiple Equilibria", extends the first chapter by investigating effective estimation methods when the condition for a unique equilibrium may not be satisfied. With multiple equilibria, the previous model is incomplete due to the unobservable equilibrium selection. Neither conventional likelihoods nor moment conditions can be used to estimate parameters without further specifications. Although there are some solutions to this issue in the current literature, they are based on strong assumptions, such as that agents with the same observable characteristics play the same strategy. This paper relaxes those assumptions and extends the all-solution method used to estimate discrete choice games to a setting with both discrete and continuous choices, bounded and unbounded outcomes, and a general form of incomplete information, where the existence of a pure strategy equilibrium has long been an open question. By the use of differential topology and functional analysis, it is found that when all exogenous characteristics are public information, there is a finite number of equilibria. With privately known exogenous characteristics, the equilibria can be represented by a compact set in a Banach space and be approximated by a finite set.
As a result, a finite-state probability mass function can be used to specify a probability measure for equilibrium selection, which completes the model. From Monte Carlo experiments about two types of binary choice models, it is found that assuming equilibrium uniqueness can bring in estimation biases when the true value of interaction intensity is large and there are multiple equilibria in the data generating process.
Probability, propensity and probability of propensities (and of probabilities)
NASA Astrophysics Data System (ADS)
D'Agostini, Giulio
2017-06-01
The process of doing Science in condition of uncertainty is illustrated with a toy experiment in which the inferential and the forecasting aspects are both present. The fundamental aspects of probabilistic reasoning, also relevant in real life applications, arise quite naturally and the resulting discussion among non-ideologized, free-minded people offers an opportunity for clarifications.
ERIC Educational Resources Information Center
Bayen, Ute J.; Kuhlmann, Beatrice G.
2011-01-01
The authors investigated conditions under which judgments in source-monitoring tasks are influenced by prior schematic knowledge. According to a probability-matching account of source guessing (Spaniol & Bayen, 2002), when people do not remember the source of information, they match source-guessing probabilities to the perceived contingency…
The role of probabilities in physics.
Le Bellac, Michel
2012-09-01
Although modern physics was born in the XVIIth century as a fully deterministic theory in the form of Newtonian mechanics, the use of probabilistic arguments turned out later on to be unavoidable. Three main situations can be distinguished. (1) When the number of degrees of freedom is very large, on the order of Avogadro's number, a detailed dynamical description is not possible, and in fact not useful: we do not care about the velocity of a particular molecule in a gas, all we need is the probability distribution of the velocities. This statistical description introduced by Maxwell and Boltzmann allows us to recover equilibrium thermodynamics, gives a microscopic interpretation of entropy and underlies our understanding of irreversibility. (2) Even when the number of degrees of freedom is small (but larger than three) sensitivity to initial conditions of chaotic dynamics makes determinism irrelevant in practice, because we cannot control the initial conditions with infinite accuracy. Although die tossing is in principle predictable, the approach to chaotic dynamics in some limit implies that our ignorance of initial conditions is translated into a probabilistic description: each face comes up with probability 1/6. (3) As is well-known, quantum mechanics is incompatible with determinism. However, quantum probabilities differ in an essential way from the probabilities introduced previously: it has been shown from the work of John Bell that quantum probabilities are intrinsic and cannot be given an ignorance interpretation based on a hypothetical deeper level of description. Copyright © 2012 Elsevier Ltd. All rights reserved.
2007-03-01
(Equation fragments garbled in extraction; Eq. (34) involves the modified Bessel function of zero order, the conditional variance, and the conditional probability density.) The probability of detection is the area under the signal-plus-noise curve above the detection threshold.
Material Logistic Support of the Hospital Ships
1986-12-01
(Extraction fragments from a supply table and table of contents: Codeine Sulfate Tablets 6505-00-132-6904; Isoniazid Tablets 6505-00-165-6545; Cephalexin Capsules 6505-00-165-6575; Rifampin Capsules 6505-00-400-2054; "4. Consumption Rate for Medical Consumable Item for Specific Condition under Scenario A"; "5. Contribution Factor for Bisacodyl Tablets for Scenario...") ...probability that patient condition 249 will require Bisacodyl. If the probability was twenty percent, then the amount of Bisacodyl needed would be two tablets.
Modeling Spatial Dependence of Rainfall Extremes Across Multiple Durations
NASA Astrophysics Data System (ADS)
Le, Phuong Dong; Leonard, Michael; Westra, Seth
2018-03-01
Determining the probability of a flood event in a catchment given that another flood has occurred in a nearby catchment is useful in the design of infrastructure such as road networks that have multiple river crossings. These conditional flood probabilities can be estimated by calculating conditional probabilities of extreme rainfall and then transforming rainfall to runoff through a hydrologic model. Each catchment's hydrological response times are unlikely to be the same, so in order to estimate these conditional probabilities one must consider the dependence of extreme rainfall both across space and across critical storm durations. To represent these types of dependence, this study proposes a new approach for combining extreme rainfall across different durations within a spatial extreme value model using max-stable process theory. This is achieved in a stepwise manner. The first step defines a set of common parameters for the marginal distributions across multiple durations. The parameters are then spatially interpolated to develop a spatial field. Storm-level dependence is represented through the max-stable process for rainfall extremes across different durations. The dependence model shows a reasonable fit between the observed pairwise extremal coefficients and the theoretical pairwise extremal coefficient function across all durations. The study demonstrates how the approach can be applied to develop conditional maps of the return period and return level across different durations.
Reasoning and choice in the Monty Hall Dilemma (MHD): implications for improving Bayesian reasoning
Tubau, Elisabet; Aguilar-Lleyda, David; Johnson, Eric D.
2015-01-01
The Monty Hall Dilemma (MHD) is a two-step decision problem involving counterintuitive conditional probabilities. The first choice is made among three equally probable options, whereas the second choice takes place after the elimination of one of the non-selected options which does not hide the prize. Differing from most Bayesian problems, statistical information in the MHD has to be inferred, either by learning outcome probabilities or by reasoning from the presented sequence of events. This often leads to suboptimal decisions and erroneous probability judgments. Specifically, decision makers commonly develop a wrong intuition that final probabilities are equally distributed, together with a preference for their first choice. Several studies have shown that repeated practice enhances sensitivity to the different reward probabilities, but does not facilitate correct Bayesian reasoning. However, modest improvements in probability judgments have been observed after guided explanations. To explain these dissociations, the present review focuses on two types of causes producing the observed biases: Emotional-based choice biases and cognitive limitations in understanding probabilistic information. Among the latter, we identify a crucial cause for the universal difficulty in overcoming the equiprobability illusion: Incomplete representation of prior and conditional probabilities. We conclude that repeated practice and/or high incentives can be effective for overcoming choice biases, but promoting an adequate partitioning of possibilities seems to be necessary for overcoming cognitive illusions and improving Bayesian reasoning. PMID:25873906
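The MHD's counterintuitive conditional probabilities are easy to check by simulation, which is also the "learning outcome probabilities" route the review discusses: staying wins 1/3 of the time, switching wins 2/3.

```python
import random

def play(switch, rng):
    """One round of the Monty Hall game; returns True if the player wins."""
    doors = [0, 1, 2]
    prize = rng.choice(doors)
    pick = rng.choice(doors)
    # The host opens a non-selected door that does not hide the prize.
    opened = rng.choice([d for d in doors if d != pick and d != prize])
    if switch:
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == prize

rng = random.Random(42)
n = 100_000
stay = sum(play(False, rng) for _ in range(n)) / n   # ~ 1/3
swap = sum(play(True, rng) for _ in range(n)) / n    # ~ 2/3
```

The equiprobability illusion described above corresponds to expecting both frequencies to converge to 1/2, which the simulation plainly contradicts.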
Thouand, Gérald; Durand, Marie-José; Maul, Armand; Gancet, Christian; Blok, Han
2011-01-01
The European REACH Regulation (Registration, Evaluation, Authorization of CHemical substances) implies, among other things, the evaluation of the biodegradability of chemical substances produced by industry. A large set of test methods is available, including detailed information on the appropriate conditions for testing. However, the inoculum used for these tests constitutes a “black box.” If biodegradation is achievable from the growth of a small group of specific microbial species with the substance as the only carbon source, the result of the test depends largely on the cell density of this group at “time zero.” If these species are relatively rare in an inoculum that is normally used, the likelihood of inoculating a test with sufficient specific cells becomes a matter of probability. Normally this probability increases with total cell density and with the diversity of species in the inoculum. Furthermore, the history of the inoculum, e.g., a possible pre-exposure to the test substance or similar substances, will have a significant influence on the probability. A high probability can be expected for substances that are widely used and regularly released into the environment, whereas a low probability can be expected for new xenobiotic substances that have not yet been released into the environment. Be that as it may, once the inoculum sample contains sufficient specific degraders, the performance of the biodegradation will follow a typical S-shaped growth curve, which depends on the specific growth rate under laboratory conditions, the so-called F/M ratio (ratio between food and biomass), and the formation of possible more or less toxic or recalcitrant metabolites. Normally regulators require the evaluation of the growth curve using a simple approach such as half-time. Unfortunately, probability and biodegradation half-time are very often confused. 
As the half-time values reflect laboratory conditions which are quite different from environmental conditions (after a substance is released), these values should not be used to quantify and predict environmental behavior. The probability value could be of much greater benefit for predictions under realistic conditions. The main issue in the evaluation of probability is that the result is not based on a single inoculum from an environmental sample, but on a variety of samples. These samples can be representative of regional or local areas, climate regions, water types, and history, e.g., pristine or polluted. The above concept has provided us with a new approach, namely “Probabio.” With this approach, persistence is not only regarded as a simple intrinsic property of a substance, but also as the capability of various environmental samples to degrade a substance under realistic exposure conditions and F/M ratio. PMID:21863143
DOE Office of Scientific and Technical Information (OSTI.GOV)
Korhonen, Marko; Lee, Eunghyun
2014-01-15
We treat the N-particle zero range process whose jumping rates satisfy a certain condition. This condition is required to use the Bethe ansatz and the resulting model is the q-boson model by Sasamoto and Wadati [“Exact results for one-dimensional totally asymmetric diffusion models,” J. Phys. A 31, 6057–6071 (1998)] or the q-totally asymmetric zero range process (TAZRP) by Borodin and Corwin [“Macdonald processes,” Probab. Theory Relat. Fields (to be published)]. We find the explicit formula of the transition probability of the q-TAZRP via the Bethe ansatz. By using the transition probability we find the probability distribution of the left-most particle's position at time t. To find this probability we derive a new identity corresponding to the identity for the asymmetric simple exclusion process by Tracy and Widom [“Integral formulas for the asymmetric simple exclusion process,” Commun. Math. Phys. 279, 815–844 (2008)]. For the initial state in which all particles occupy a single site, the probability distribution of the left-most particle's position at time t is represented by the contour integral of a determinant.
Sasaki, Koji; Kantarjian, Hagop M; Jain, Preetesh; Jabbour, Elias J; Ravandi, Farhad; Konopleva, Marina; Borthakur, Gautam; Takahashi, Koichi; Pemmaraju, Naveen; Daver, Naval; Pierce, Sherry A; O'Brien, Susan M; Cortes, Jorge E
2016-01-15
Tyrosine kinase inhibitors (TKIs) significantly improve survival in patients with chronic myeloid leukemia in chronic phase (CML-CP). Conditional probability provides survival information in patients who have already survived for a specific period of time after treatment. Cumulative response and survival data from 6 consecutive frontline TKI clinical trials were analyzed. Conditional probability was calculated for failure-free survival (FFS), transformation-free survival (TFS), event-free survival (EFS), and overall survival (OS) according to depth of response within 1 year of the initiation of TKIs, including complete cytogenetic response, major molecular response, and molecular response with a 4-log or 4.5-log reduction. A total of 483 patients with a median follow-up of 99.4 months from the initiation of treatment with TKIs were analyzed. Conditional probabilities of FFS, TFS, EFS, and OS for 1 additional year for patients alive after 12 months of therapy ranged from 92.0% to 99.1%, 98.5% to 100%, 96.2% to 99.6%, and 96.8% to 99.7%, respectively. Conditional FFS for 1 additional year did not improve with a deeper response each year. Conditional probabilities of TFS, EFS, and OS for 1 additional year were maintained at >95% during the period. In the era of TKIs, patients with chronic myeloid leukemia in chronic phase who survived for a certain number of years maintained excellent clinical outcomes in each age group. Cancer 2016;122:238-248. © 2015 American Cancer Society.
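The central quantity above, conditional survival, is the ratio of survival-curve values: given that a patient has already survived t years, the probability of surviving one more year is S(t + 1)/S(t). The survival curve below is made up for illustration, not taken from the trial data.

```python
def conditional_survival(surv, t, dt=1):
    """P(survive to t + dt | alive at t) = S(t + dt) / S(t)."""
    return surv[t + dt] / surv[t]

# Hypothetical overall-survival curve S(t) at yearly intervals:
S = {0: 1.00, 1: 0.97, 2: 0.95, 3: 0.94, 4: 0.93}

p = conditional_survival(S, 3)   # P(alive at year 4 | alive at year 3)
```

Because early events are excluded from the conditioning set, this ratio typically exceeds the unconditional S(4), which is why conditional estimates are more informative for patients already some years into treatment.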
Gómez Toledo, Verónica; Gutiérrez Farfán, Ileana; Verduzco-Mendoza, Antonio; Arch-Tirado, Emilio
Tinnitus is defined as the conscious perception of a sensation of sound that occurs in the absence of an external stimulus. This audiological symptom affects 7% to 19% of the adult population. The aim of this study is to describe the associated comorbidities present in patients with tinnitus using joint and conditional probability analysis. Patients of both genders, diagnosed with unilateral or bilateral tinnitus, aged between 20 and 45 years, who had a full computerised medical record, were selected. Study groups were formed on the basis of the following clinical aspects: 1) audiological findings; 2) vestibular findings; 3) comorbidities such as temporomandibular dysfunction, tubal dysfunction, and otosclerosis; and 4) triggering factors of tinnitus: noise exposure, respiratory tract infection, and use of ototoxic and/or other drugs. Of the patients with tinnitus, 27 (65%) reported hearing loss, 11 (26.19%) temporomandibular dysfunction, and 11 (26.19%) vestibular disorders. The joint probability analysis found that the probability that a patient with tinnitus had hearing loss was 27/42 = 0.65, and 20/42 = 0.47 for the bilateral type; the result was P(A ∩ B) = 30%. Bayes' theorem, P(Ai|B) = P(Ai ∩ B)/P(B), was used to calculate various probabilities. For patients with temporomandibular dysfunction and vestibular disorders, a posterior probability of P(Ai|B) = 31.44% was calculated. Consideration should be given to the joint and conditional probability approach as a tool for the study of different pathologies. Copyright © 2016 Academia Mexicana de Cirugía A.C. Publicado por Masson Doyma México S.A. All rights reserved.
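The joint/conditional calculations above in miniature, using the cohort counts from the abstract (n = 42; 27 with hearing loss); the conditioning marginal P(B) below is an illustrative stand-in, not a value reported in the paper.

```python
# Cohort counts reported in the abstract:
n = 42
p_hearing_loss = 27 / n            # ~ 0.65, as reported

# Joint and conditional probabilities (P(B) is an assumed example value):
p_joint = 0.30                     # P(A and B), as reported
p_b = 0.47                         # assumed marginal of the conditioning event
p_a_given_b = p_joint / p_b        # definition: P(A | B) = P(A and B) / P(B)
```

The same ratio, applied with each comorbidity in turn as the conditioning event, is how the posterior probabilities in the study are obtained.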
NASA Astrophysics Data System (ADS)
Zarola, Amit; Sil, Arjun
2018-04-01
This study presents forecasting of the time and magnitude of the next earthquake in northeast India, using four probability distribution models (Gamma, Lognormal, Weibull and Log-logistic) and an updated earthquake catalog of magnitude Mw ≥ 6.0 events that occurred from 1737 to 2015 in the study area. On the basis of the past seismicity of the region, two types of conditional probabilities have been estimated using the best-fit model and its parameters. The first is the probability that the seismic energy (e × 10^20 ergs) expected to be released in the future earthquake exceeds a certain level of seismic energy (E × 10^20 ergs). The second is the probability that the seismic energy expected to be released per year (a × 10^20 ergs/year) exceeds a certain level of seismic energy per year (A × 10^20 ergs/year). The log-likelihood functions (ln L) were also estimated for all four probability distribution models; a higher value of ln L indicates a better-fitting model, and a lower value a worse one. The time of the future earthquake is forecast by dividing the total seismic energy expected to be released in the future earthquake by the total seismic energy expected to be released per year. The epicentres of the recent 4 January 2016 Manipur earthquake (M 6.7), 13 April 2016 Myanmar earthquake (M 6.9) and 24 August 2016 Myanmar earthquake (M 6.8) are located in zones Z.12, Z.16 and Z.15, respectively, which are identified seismic source zones in the study area, indicating that the proposed techniques and models yield good forecasting accuracy.
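The conditional-probability form used above can be sketched with one of the paper's candidate models, the Weibull: given that released energy already exceeds E, the probability that it exceeds E + e is S(E + e)/S(E), where S is the Weibull survival function. The shape and scale values below are illustrative, not the paper's fitted parameters.

```python
import math

def weibull_sf(x, shape, scale):
    """Weibull survival function S(x) = exp(-(x/scale)**shape)."""
    return math.exp(-((x / scale) ** shape))

def cond_exceed(E, e, shape=1.4, scale=3.0):
    """P(X > E + e | X > E) under the Weibull model (parameters assumed)."""
    return weibull_sf(E + e, shape, scale) / weibull_sf(E, shape, scale)
```

With shape = 1 the Weibull reduces to the memoryless exponential, and the conditional exceedance probability no longer depends on E; shape ≠ 1 is what lets elapsed accumulation inform the forecast.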
White, Angela M.; Manley, Patricia N.; Tarbill, Gina; Richardson, T.L.; Russell, Robin E.; Safford, Hugh D.; Dobrowski, Solomon Z.
2015-01-01
Fire is a natural process and the dominant disturbance shaping plant and animal communities in many coniferous forests of the western US. Given that fire size and severity are predicted to increase in the future, it has become increasingly important to understand how wildlife responds to fire and post-fire management. The Angora Fire burned 1243 hectares of mixed conifer forest in South Lake Tahoe, California. We conducted avian point counts for the first 3 years following the fire in burned and unburned areas to investigate which habitat characteristics are most important for re-establishing or maintaining the native avian community in post-fire landscapes. We used a multi-species occurrence model to estimate how avian species are influenced by the density of live and dead trees and shrub cover. While accounting for variations in the detectability of species, our approach estimated the occurrence probabilities of all species detected including those that were rare or observed infrequently. Although all species encountered in this study were detected in burned areas, species-specific modeling results predicted that some species were strongly associated with specific post-fire conditions, such as a high density of dead trees, open-canopy conditions or high levels of shrub cover that occur at particular burn severities or at a particular time following fire. These results indicate that prescribed fire or managed wildfire which burns at low to moderate severity without at least some high-severity effects is both unlikely to result in the species assemblages that are unique to post-fire areas or to provide habitat for burn specialists. Additionally, the probability of occurrence for many species was associated with high levels of standing dead trees indicating that intensive post-fire harvest of these structures could negatively impact habitat of a considerable proportion of the avian community.
Dynamic Response of an Optomechanical System to a Stationary Random Excitation in the Time Domain
Palmer, Jeremy A.; Paez, Thomas L.
2011-01-01
Modern electro-optical instruments are typically designed with assemblies of optomechanical members that support optics such that alignment is maintained in service environments that include random vibration loads. This paper presents a nonlinear numerical analysis that calculates statistics for the peak lateral response of optics in an optomechanical sub-assembly subject to random excitation of the housing. The work is unique in that the prior art does not address peak response probability distribution for stationary random vibration in the time domain for a common lens-retainer-housing system with Coulomb damping. Analytical results are validated by using displacement response data from random vibration testing of representative prototype sub-assemblies. A comparison of predictions to experimental results yields reasonable agreement. The Type I Asymptotic form provides the cumulative distribution function for peak response probabilities. Probabilities are calculated for actual lens centration tolerances. The probability that peak response will not exceed the centration tolerance is greater than 80% for prototype configurations where the tolerance is high (on the order of 30 micrometers). Conversely, the probability is low for those where the tolerance is less than 20 micrometers. The analysis suggests a design paradigm based on the influence of lateral stiffness on the magnitude of the response.
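The Type I Asymptotic extreme-value form mentioned above is the Gumbel distribution. A minimal sketch of the tolerance-exceedance calculation, with entirely hypothetical location and scale parameters (the paper's fitted values are not given in the abstract):

```python
import math

def gumbel_cdf(x, mu, beta):
    """Type I Asymptotic (Gumbel) CDF: probability that the peak response
    magnitude does not exceed x."""
    return math.exp(-math.exp(-(x - mu) / beta))

# Hypothetical distribution parameters in micrometers, for illustration only.
mu, beta = 18.0, 4.0

# Probability that the peak lateral response stays within a centration tolerance:
p_30um = gumbel_cdf(30.0, mu, beta)  # loose tolerance (high probability)
p_15um = gumbel_cdf(15.0, mu, beta)  # tight tolerance (low probability)
```

With these invented parameters the loose 30 μm tolerance is met with high probability while the tight 15 μm tolerance is not, mirroring the qualitative trend the abstract reports.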
The gravitational law of social interaction
NASA Astrophysics Data System (ADS)
Levy, Moshe; Goldenberg, Jacob
2014-01-01
While a great deal is known about the topology of social networks, there is much less agreement about the geographical structure of these networks. The fundamental question in this context is: how does the probability of a social link between two individuals depend on the physical distance between them? While it is clear that the probability decreases with the distance, various studies have found different functional forms for this dependence. The exact form of the distance dependence has crucial implications for network searchability and dynamics: Kleinberg (2000) [15] shows that the small-world property holds if the probability of a social link is a power-law function of the distance with power -2, but not with any other power. We investigate the distance dependence of link probability empirically by analyzing four very different sets of data: Facebook links, data from the electronic version of the Small-World experiment, email messages, and data from detailed personal interviews. All four datasets reveal the same empirical regularity: the probability of a social link is proportional to the inverse of the square of the distance between the two individuals, analogously to the distance dependence of the gravitational force. Thus, it seems that social networks spontaneously converge to the exact unique distance dependence that ensures the Small-World property.
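The inverse-square regularity described above is typically recovered from data by fitting a power law on log-log axes. A minimal sketch with synthetic link probabilities generated from an exact inverse-square law (the 0.3 prefactor and the distances are arbitrary):

```python
import numpy as np

# Synthetic illustration: link probabilities that follow p(d) ~ d**-2 exactly,
# then the exponent is recovered by a least-squares fit of log p against log d.
d = np.array([1.0, 2.0, 5.0, 10.0, 50.0, 100.0])  # distances, arbitrary units
p = 0.3 * d ** -2.0                                # assumed link probabilities

# The slope of the log-log fit estimates the power-law exponent (-2 here).
slope, intercept = np.polyfit(np.log(d), np.log(p), 1)
```

On real network data the fitted slope would scatter around -2 rather than match it exactly; the point is only the estimation procedure.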
On the Existence and Uniqueness of JML Estimates for the Partial Credit Model
ERIC Educational Resources Information Center
Bertoli-Barsotti, Lucio
2005-01-01
A necessary and sufficient condition is given in this paper for the existence and uniqueness of the maximum likelihood (the so-called joint maximum likelihood) estimate of the parameters of the Partial Credit Model. This condition is stated in terms of a structural property of the pattern of the data matrix that can be easily verified on the basis…
Morelli, Federico; Benedetti, Yanina; Møller, Anders Pape; Liang, Wei; Carrascal, Luis M
2018-05-01
The evolutionary distinctiveness (ED) score is a measure of phylogenetic isolation that quantifies the evolutionary uniqueness of a species. Here, we compared the ED score of parasitic and non-parasitic cuckoo species world-wide, to understand whether parental care or parasitism represents the largest amount of phylogenetic uniqueness. Next, we focused only on 46 cuckoo species characterized by brood parasitism with a known number of host species, and we explored the associations among ED score, number of host species and breeding range size for these species. We assessed these associations using phylogenetic generalized least squares (PGLS) models, taking into account the phylogenetic signal. Parasitic cuckoo species were not more unique in terms of ED than non-parasitic species. However, we found a significant negative association between the evolutionary uniqueness and host range and a positive correlation between the number of host species and range size of parasitic cuckoos, probably suggesting a passive sampling of hosts by parasitic species as the breeding range broadens. The findings of this study showed that more generalist brood parasites occupied very different positions in a phylogenetic tree, suggesting that they have evolved independently within the Cuculiformes order. Finally, we demonstrated that specialist cuckoo species also represent the most evolutionarily unique species in the order of Cuculiformes. © 2018 The Authors. Journal of Animal Ecology © 2018 British Ecological Society.
The geologic story of Isle Royale National Park
Huber, N. King
1975-01-01
Isle Royale is an outstanding example of relatively undisturbed northwoods lake wilderness. But more than simple preservation of such an environment is involved in its inclusion in our National Park System. Its isolation from the mainland provides an almost untouched laboratory for research in the natural sciences, especially those studies whose very nature depends upon such isolation. One excellent example of such research is the intensive study of the predator-prey relationship of the timber wolf and moose, long sponsored by the National Park Service and Purdue University. In probably no other place in North America are the necessary ecological conditions for such a study so admirably fulfilled as on Isle Royale. The development of a natural laboratory with such conditions is ultimately dependent upon geologic processes and events that, although not unique in themselves, produced in their interplay a unique result: the island archipelago as we know it today, with its hills and valleys, swamps and bogs, the ecological framework of the plant and animal world. Even the most casual visitor can hardly fail to be struck by the fiordlike nature of many of the bays, the chains of fringing islands, the ridge-and-valley topography, and the linear nature of all these features. The distinctive topography of the archipelago is, of course, only the latest manifestation of geologic processes in operation since time immemorial. Fragments of geologic history going back over a billion years can be read from the rocks of the island, and with additional data from other parts of the Lake Superior region, we can fill in some of the story of Isle Royale. After more than a hundred years of study by man, the story is still incomplete. But then, geologic stories are seldom complete, and what we do know allows a deeper appreciation of one of our most naturally preserved parks and whets our curiosity about the missing fragments.
Harris, Justin A; Kwok, Dorothy W S
2018-01-01
During magazine approach conditioning, rats do not discriminate between a conditional stimulus (CS) that is consistently reinforced with food and a CS that is occasionally (partially) reinforced, as long as the CSs have the same overall reinforcement rate per second. This implies that rats are indifferent to the probability of reinforcement per trial. However, in the same rats, the per-trial reinforcement rate will affect subsequent extinction: responding extinguishes more rapidly for a CS that was consistently reinforced than for a partially reinforced CS. Here, we trained rats with consistently and partially reinforced CSs that were matched for overall reinforcement rate per second. We measured conditioned responding both during and immediately after the CSs. Differences in the per-trial probability of reinforcement did not affect the acquisition of responding during the CS but did affect subsequent extinction of that responding, and also affected the post-CS response rates during conditioning. Indeed, CSs with the same probability of reinforcement per trial evoked the same amount of post-CS responding even when they differed in overall reinforcement rate and thus evoked different amounts of responding during the CS. We conclude that reinforcement rate per second controls rats' acquisition of responding during the CS, but at the same time, rats also learn specifically about the probability of reinforcement per trial. The latter learning affects the rats' expectation of reinforcement as an outcome of the trial, which influences their ability to detect retrospectively that an opportunity for reinforcement was missed, and, in turn, drives extinction. (PsycINFO Database Record (c) 2018 APA, all rights reserved).
The negated conditional: a litmus test for the suppositional conditional?
Handley, Simon J; Evans, Jonathan St B T; Thompson, Valerie A
2006-05-01
Under the suppositional account of conditionals, when people think about a conditional assertion, "if p then q," they engage in a mental simulation in which they imagine p holds and evaluate the probability that q holds under this supposition. One implication of this account is that belief in a conditional equates to conditional probability [P(q|p)]. In this paper, the authors examine a further implication of this analysis with respect to the wide-scope negation of conditional assertions, "it is not the case that if p then q." Under the suppositional account, nothing categorically follows from the negation of a conditional, other than a second conditional, "if p then not-q." In contrast, according to the mental model theory, a negated conditional is consistent only with the determinate state of affairs, p and not-q. In 4 experiments, the authors compare the contrasting predictions that arise from each of these accounts. The findings are consistent with the suppositional theory but are incongruent with the mental model theory of conditionals.
Fram, Miranda S.; Belitz, Kenneth
2011-01-01
We use data from 1626 groundwater samples collected in California, primarily from public drinking water supply wells, to investigate the distribution of perchlorate in deep groundwater under natural conditions. The wells were sampled for the California Groundwater Ambient Monitoring and Assessment Priority Basin Project. We develop a logistic regression model for predicting probabilities of detecting perchlorate at concentrations greater than multiple threshold concentrations as a function of climate (represented by an aridity index) and potential anthropogenic contributions of perchlorate (quantified as an anthropogenic score, AS). AS is a composite categorical variable including terms for nitrate, pesticides, and volatile organic compounds. Incorporating water-quality parameters in AS permits identification of perturbation of natural occurrence patterns by flushing of natural perchlorate salts from unsaturated zones by irrigation recharge as well as addition of perchlorate from industrial and agricultural sources. The data and model results indicate low concentrations (0.1-0.5 μg/L) of perchlorate occur under natural conditions in groundwater across a wide range of climates, beyond the arid to semiarid climates in which they mostly have been previously reported. The probability of detecting perchlorate at concentrations greater than 0.1 μg/L under natural conditions ranges from 50-70% in semiarid to arid regions of California and the Southwestern United States to 5-15% in the wettest regions sampled (the Northern California coast). The probability of concentrations above 1 μg/L under natural conditions is low (generally <3%).
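The logistic-regression approach described above can be sketched in miniature. All data values below are invented for illustration (the study's 1626-sample dataset and fitted coefficients are not reproduced here), and the fit uses plain gradient descent rather than the authors' actual estimation software:

```python
import numpy as np

# Hypothetical training data: column 1 is an aridity index (lower = more arid),
# column 2 is an anthropogenic score (AS); y = 1 if perchlorate was detected
# above the 0.1 ug/L threshold.
X = np.array([[0.1, 0.0], [0.2, 0.0], [0.3, 1.0], [0.5, 0.0],
              [0.8, 0.0], [1.0, 1.0], [1.2, 0.0], [1.5, 0.0]])
y = np.array([1.0, 1.0, 1.0, 1.0, 0.0, 1.0, 0.0, 0.0])

# Fit logistic regression by gradient descent on the log-loss.
Xb = np.hstack([np.ones((len(X), 1)), X])      # prepend intercept column
w = np.zeros(3)
for _ in range(20000):
    p = 1.0 / (1.0 + np.exp(-Xb @ w))          # predicted probabilities
    w -= 0.1 * Xb.T @ (p - y) / len(y)         # gradient step

def detect_prob(aridity, anthro_score):
    """Modeled probability of detecting perchlorate above the threshold."""
    z = w[0] + w[1] * aridity + w[2] * anthro_score
    return 1.0 / (1.0 + np.exp(-z))

p_arid = detect_prob(0.15, 0.0)  # arid site, no anthropogenic signal
p_wet = detect_prob(1.40, 0.0)   # wet site, no anthropogenic signal
```

With this toy data the fitted model reproduces the qualitative pattern in the abstract: detection probability under natural conditions is high in arid climates and low in wet ones.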
ERIC Educational Resources Information Center
Dinov, Ivo D.; Kamino, Scott; Bhakhrani, Bilal; Christou, Nicolas
2013-01-01
Data analysis requires subtle probability reasoning to answer questions like "What is the chance of event A occurring, given that event B was observed?" This generic question arises in discussions of many intriguing scientific questions such as "What is the probability that an adolescent weighs between 120 and 140 pounds given that…
van Walraven, Carl
2017-04-01
Diagnostic codes used in administrative databases cause bias due to misclassification of patient disease status. It is unclear which methods minimize this bias. Serum creatinine measures were used to determine severe renal failure status in 50,074 hospitalized patients. The true prevalence of severe renal failure and its association with covariates were measured. These were compared to results for which renal failure status was determined using surrogate measures, including the following: (1) diagnostic codes; (2) categorization of probability estimates of renal failure determined from a previously validated model; or (3) bootstrap imputation of disease status using model-derived probability estimates. Bias in estimates of severe renal failure prevalence and its association with covariates was minimal when bootstrap methods were used to impute renal failure status from model-based probability estimates. In contrast, biases were extensive when renal failure status was determined using codes or methods in which model-based condition probability was categorized. Bias due to misclassification from inaccurate diagnostic codes can be minimized using bootstrap methods to impute condition status from multivariable model-derived probability estimates. Copyright © 2017 Elsevier Inc. All rights reserved.
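The bootstrap-imputation idea above can be sketched as repeated Bernoulli draws of disease status from each patient's model-derived probability, averaging the resulting prevalence estimates. The probabilities and iteration count below are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical model-derived probabilities of severe renal failure per patient.
p_model = np.array([0.02, 0.10, 0.85, 0.40, 0.95, 0.05, 0.60, 0.01])

# Bootstrap-style imputation: draw a 0/1 disease status for every patient from
# their individual probability, compute the prevalence of that imputed cohort,
# and average the prevalence across many imputations.
n_iter = 5000
prevalences = [rng.binomial(1, p_model).mean() for _ in range(n_iter)]
prevalence_estimate = float(np.mean(prevalences))
```

Unlike thresholding ("categorizing") the probabilities, this keeps uncertain patients fractionally counted, so the averaged prevalence converges to the mean of the model probabilities rather than to a biased cutoff-dependent value.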
Appraisal of geodynamic inversion results: a data mining approach
NASA Astrophysics Data System (ADS)
Baumann, T. S.
2016-11-01
Bayesian sampling based inversions require many thousands or even millions of forward models, depending on how nonlinear or non-unique the inverse problem is, and how many unknowns are involved. The result of such a probabilistic inversion is not a single `best-fit' model, but rather a probability distribution that is represented by the entire model ensemble. Often, a geophysical inverse problem is non-unique, and the corresponding posterior distribution is multimodal, meaning that the distribution consists of clusters with similar models that represent the observations equally well. In these cases, we would like to visualize the characteristic model properties within each of these clusters of models. However, even for a moderate number of inversion parameters, a manual appraisal for a large number of models is not feasible. This poses the question whether it is possible to extract end-member models that represent each of the best-fit regions including their uncertainties. Here, I show how a machine learning tool can be used to characterize end-member models, including their uncertainties, from a complete model ensemble that represents a posterior probability distribution. The model ensemble used here results from a nonlinear geodynamic inverse problem, where rheological properties of the lithosphere are constrained from multiple geophysical observations. It is demonstrated that by taking vertical cross-sections through the effective viscosity structure of each of the models, the entire model ensemble can be classified into four end-member model categories that have a similar effective viscosity structure. These classification results are helpful to explore the non-uniqueness of the inverse problem and can be used to compute representative data fits for each of the end-member models. Conversely, these insights also reveal how new observational constraints could reduce the non-uniqueness. 
The method is not limited to geodynamic applications and a generalized MATLAB code is provided to perform the appraisal analysis.
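The classification step described above (grouping viscosity cross-sections into end-member categories) can be illustrated with a minimal k-means sketch. The paper's tool is provided in MATLAB; this is a Python toy with two invented end-member profiles and noise levels chosen only so the clusters separate cleanly:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical ensemble: each row is a vertical log10-viscosity profile drawn
# from one of two distinct end-member structures plus observational noise.
weak = np.array([19.0, 20.0, 21.0, 22.0])
strong = np.array([21.0, 22.0, 23.0, 24.0])
ensemble = np.vstack([weak + 0.1 * rng.standard_normal((50, 4)),
                      strong + 0.1 * rng.standard_normal((50, 4))])

# Minimal 2-means clustering: assign each model to its nearest centroid,
# then move each centroid to the mean of its members, and repeat.
centroids = ensemble[[0, -1]].copy()  # initialize from two ensemble members
for _ in range(20):
    dists = np.linalg.norm(ensemble[:, None, :] - centroids[None, :, :], axis=2)
    labels = dists.argmin(axis=1)
    centroids = np.array([ensemble[labels == k].mean(axis=0) for k in (0, 1)])
```

Each cluster's centroid then plays the role of a representative end-member model, and the spread of its members characterizes the uncertainty within that mode of the posterior.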
NASA Astrophysics Data System (ADS)
Khodabakhshi, M.; Jafarpour, B.
2013-12-01
Characterization of complex geologic patterns that create preferential flow paths in certain reservoir systems requires higher-order geostatistical modeling techniques. Multipoint statistics (MPS) provides a flexible grid-based approach for simulating such complex geologic patterns from a conceptual prior model known as a training image (TI). In this approach, a stationary TI that encodes the higher-order spatial statistics of the expected geologic patterns is used to represent the shape and connectivity of the underlying lithofacies. While MPS is quite powerful for describing complex geologic facies connectivity, the nonlinear and complex relation between the flow data and facies distribution makes flow data conditioning quite challenging. We propose an adaptive technique for conditioning facies simulation from a prior TI to nonlinear flow data. Non-adaptive strategies for conditioning facies simulation to flow data can involve many forward flow model solutions that can be computationally very demanding. To improve the conditioning efficiency, we develop an adaptive sampling approach through a data feedback mechanism based on the sampling history. In this approach, after a short sampling burn-in period in which unconditional samples are generated and passed through an acceptance/rejection test, an ensemble of accepted samples is identified and used to generate a facies probability map. This facies probability map contains the common features of the accepted samples and provides conditioning information about facies occurrence in each grid block, which is used to guide the conditional facies simulation process. As the sampling progresses, the initial probability map is updated according to the collective information about the facies distribution in the chain of accepted samples to increase the acceptance rate and efficiency of the conditioning.
This conditioning process can be viewed as an optimization approach where each new sample is proposed based on the sampling history to improve the data mismatch objective function. We extend the application of this adaptive conditioning approach to the case where multiple training images are proposed to describe the geologic scenario in a given formation. We discuss the advantages and limitations of the proposed adaptive conditioning scheme and use numerical experiments from fluvial channel formations to demonstrate its applicability and performance compared to non-adaptive conditioning techniques.
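The facies probability map at the heart of the feedback mechanism above is simply a cellwise average over the accepted realizations. A toy sketch on a 2-by-2 grid with invented accepted samples:

```python
import numpy as np

# Hypothetical chain of accepted facies realizations (1 = channel facies,
# 0 = background) on a small 2D grid.
accepted = [np.array([[1, 0], [1, 1]]),
            np.array([[1, 0], [0, 1]]),
            np.array([[1, 1], [0, 1]])]

# The facies probability map is the running cellwise mean of accepted samples;
# cells where all samples agree get probability 1 (or 0) and strongly guide
# subsequent conditional simulations.
prob_map = np.mean(accepted, axis=0)
```

Appending each newly accepted sample and recomputing the mean updates the map as the chain progresses, which is what raises the acceptance rate over time.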
NASA Astrophysics Data System (ADS)
Smith, L. A.
2007-12-01
We question the relevance of climate-model based Bayesian (or other) probability statements for decision support and impact assessment on spatial scales less than continental and temporal averages less than seasonal. Scientific assessment of higher resolution space and time scale information is urgently needed, given the commercial availability of "products" at high spatiotemporal resolution, their provision by nationally funded agencies for use both in industry decision making and governmental policy support, and their presentation to the public as matters of fact. Specifically we seek to establish necessary conditions for probability forecasts (projections conditioned on a model structure and a forcing scenario) to be taken seriously as reflecting the probability of future real-world events. We illustrate how risk management can profitably employ imperfect models of complicated chaotic systems, following NASA's study of near-Earth PHOs (Potentially Hazardous Objects). Our climate models will never be perfect, nevertheless the space and time scales on which they provide decision-support relevant information is expected to improve with the models themselves. Our aim is to establish a set of baselines of internal consistency; these are merely necessary conditions (not sufficient conditions) that physics based state-of-the-art models are expected to pass if their output is to be judged decision-support relevant. Probabilistic Similarity is proposed as one goal which can be obtained even when our models are not empirically adequate. In short, probabilistic similarity requires that, given inputs similar to today's empirical observations and observational uncertainties, we expect future models to produce similar forecast distributions.
Expert opinion on the space and time scales on which we might reasonably expect probabilistic similarity may prove of much greater utility than expert elicitation of uncertainty in parameter values in a model that is not empirically adequate; this may help to explain the reluctance of experts to provide information on "parameter uncertainty." Probability statements about the real world are always conditioned on some information set; they may well be conditioned on "False" making them of little value to a rational decision maker. In other instances, they may be conditioned on physical assumptions not held by any of the modellers whose model output is being cast as a probability distribution. Our models will improve a great deal in the next decades, and our insight into the likely climate fifty years hence will improve: maintaining the credibility of the science and the coherence of science based decision support, as our models improve, require a clear statement of our current limitations. What evidence do we have that today's state-of-the-art models provide decision-relevant probability forecasts? What space and time scales do we currently have quantitative, decision-relevant information on for 2050? 2080?
Multiple positive solutions to a coupled systems of nonlinear fractional differential equations.
Shah, Kamal; Khan, Rahmat Ali
2016-01-01
In this article, we study existence, uniqueness and nonexistence of positive solution to a highly nonlinear coupled system of fractional order differential equations. Necessary and sufficient conditions for the existence and uniqueness of positive solution are developed by using Perov's fixed point theorem for the considered problem. Further, we also established sufficient conditions for existence of multiplicity results for positive solutions. Also, we developed some conditions under which the considered coupled system of fractional order differential equations has no positive solution. Appropriate examples are also provided which demonstrate our results.
Quantum key distribution without the wavefunction
NASA Astrophysics Data System (ADS)
Niestegge, Gerd
A well-known feature of quantum mechanics is the secure exchange of secret bit strings which can then be used as keys to encrypt messages transmitted over any classical communication channel. It is demonstrated that this quantum key distribution allows a much more general and abstract access than commonly thought. The results include some generalizations of the Hilbert space version of quantum key distribution, but are based upon a general nonclassical extension of conditional probability. A special state-independent conditional probability is identified as origin of the superior security of quantum key distribution; this is a purely algebraic property of the quantum logic and represents the transition probability between the outcomes of two consecutive quantum measurements.
What Health Issues or Conditions Affect Women Differently Than Men?
Probability effects on stimulus evaluation and response processes
NASA Technical Reports Server (NTRS)
Gehring, W. J.; Gratton, G.; Coles, M. G.; Donchin, E.
1992-01-01
This study investigated the effects of probability information on response preparation and stimulus evaluation. Eight subjects responded with one hand to the target letter H and with the other to the target letter S. The target letter was surrounded by noise letters that were either the same as or different from the target letter. In 2 conditions, the targets were preceded by a warning stimulus unrelated to the target letter. In 2 other conditions, a warning letter predicted that the same letter or the opposite letter would appear as the imperative stimulus with .80 probability. Correct reaction times were faster and error rates were lower when imperative stimuli confirmed the predictions of the warning stimulus. Probability information affected (a) the preparation of motor responses during the foreperiod, (b) the development of expectancies for a particular target letter, and (c) a process sensitive to the identities of letter stimuli but not to their locations.
Bayesian Inference for Source Reconstruction: A Real-World Application
2014-09-25
deliberately or accidentally. Two examples of operational monitoring sensor networks are the deployment of biological sensor arrays by the Department of...remarkable paper, Cox [16] demonstrated that probability theory, when interpreted as logic, is the only calculus that conforms to a consistent theory...of inference. This demonstration provides the firm logical basis for asserting that probability calculus is the unique quantitative theory of
NASA Technical Reports Server (NTRS)
Shih, C.-Y.; Nyquist, L. E.; Reese, Y.; Wiesmann, H.; Nazarov, M. A.; Taylor, L. A.
2002-01-01
The Sm-Nd isochron for lunar mare basalt meteorite Dhofar 287A yields T = 3.46 +/- 0.03 Ga and Nd = 0.6 +/- 0.3. Its Rb-Sr isotopic system is severely altered. The basalt is unique, probably coming from an enriched mantle source. Additional information is contained in the original extended abstract.
Early assessment of the trees of the Luquillo Mountains
Frank H. Wadsworth
2009-01-01
This is a description of the composition of a forest in Puerto Rico, almost unique in that it probably had not received human modification. It comes from a 1948 inventory of 4,570 hectares of what today is the El Yunque National Forest in the Luquillo Mountains. The inventory, preparatory to a forest management plan, was a search for a sustainable level of timber volume...
NASA Astrophysics Data System (ADS)
Min, Qing-xu; Zhu, Jun-zhen; Feng, Fu-zhou; Xu, Chao; Sun, Ji-wei
2017-06-01
In this paper, lock-in vibrothermography (LVT) is utilized for defect detection. Specifically, for a metal plate with an artificial fatigue crack, the temperature rise of the defective area is used for analyzing the influence of different test conditions, i.e. engagement force, excitation intensity, and modulation frequency. Multivariate nonlinear and logistic regression models are employed to estimate the POD (probability of detection) and POA (probability of alarm) of the fatigue crack, respectively. The resulting optimal selection of test conditions is presented. The study aims to provide an optimized method for selecting test conditions in the vibrothermography system, with enhanced detection ability.
Davis, Teri D; Campbell, Duncan G; Bonner, Laura M; Bolkan, Cory R; Lanto, Andrew; Chaney, Edmund F; Waltz, Thomas; Zivin, Kara; Yano, Elizabeth M; Rubenstein, Lisa V
Depression is the most prevalent mental health condition in primary care (PC). Yet as the Veterans Health Administration increases resources for PC/mental health integration, including integrated care for women, there is little detailed information about depression care needs, preferences, comorbidity, and access patterns among women veterans with depression followed in PC. We sampled patients regularly engaged with Veterans Health Administration PC. We screened 10,929 (10,580 men, 349 women) with the two-item Patient Health Questionnaire. Of the 2,186 patients who screened positive (2,092 men, 94 women), 2,017 men and 93 women completed the full Patient Health Questionnaire-9 depression screening tool. Ultimately, 46 women and 715 men with probable major depression were enrolled and completed a baseline telephone survey. We computed descriptive statistics to provide information about the depression care experiences of women veterans and to examine potential gender differences at baseline and at seven-month follow-up across study variables. Among those patients who agreed to screening, 20% of women (70 of 348) had probable major depression, versus only 12% of men (1,243 of 10,505). Of the women, 48% had concurrent probable posttraumatic stress disorder and 65% reported general anxiety. Women were more likely to receive adequate depression care than men (57% vs. 39%, respectively; p < .05); 46% of women and 39% of men reported depression symptom improvement at the 7-month follow-up. Women veterans were less likely than men to prefer care from a PC physician (p < .01) at baseline and were more likely than men to report mental health specialist care (p < .01) in the 6 months before baseline. PC/mental health integration planners should consider methods for accommodating women veterans' unique care needs and preferences for mental health care delivered by health care professionals other than physicians. Published by Elsevier Inc.
Szabó, Attila; Korponai, Kristóf; Kerepesi, Csaba; Somogyi, Boglárka; Vörös, Lajos; Bartha, Dániel; Márialigeti, Károly; Felföldi, Tamás
2017-05-01
Soda pans of the Pannonian steppe are unique environments regarding their physical and chemical characteristics: shallowness, high turbidity, intermittent character, alkaline pH, polyhumic organic carbon concentration, hypertrophic condition, moderately high salinity, sodium and carbonate ion dominance. The pans are highly productive environments with picophytoplankton predominance. Little is known about the planktonic bacterial communities inhabiting these aquatic habitats; therefore, amplicon sequencing and shotgun metagenomics were applied to reveal their composition and functional properties. Results showed a taxonomically complex bacterial community which was distinct from other soda lakes regarding its composition, e.g. the dominance of class Alphaproteobacteria was observed within phylum Proteobacteria. The shotgun metagenomic analysis revealed several functional gene components related to the harsh and at the same time hypertrophic environmental conditions, e.g. proteins involved in stress response, transport and hydrolase systems targeting phytoplankton-derived organic matter. This is the first detailed report on the indigenous planktonic bacterial communities coping with the multiple extreme conditions present in the unique soda pans of the Pannonian steppe.
Cost of Crashes Related to Road Conditions, United States, 2006
Zaloshnja, Eduard; Miller, Ted R.
2009-01-01
This is the first study to estimate the cost of crashes related to road conditions in the U.S. To model the probability that road conditions contributed to the involvement of a vehicle in the crash, we used 2000–03 Large Truck Crash Causation Study (LTCCS) data, the only dataset that provides detailed information whether road conditions contributed to crash occurrence. We applied the logistic regression results to a costed national crash dataset in order to calculate the probability that road conditions contributed to the involvement of a vehicle in each crash. In crashes where someone was moderately to seriously injured (AIS-2-6) in a vehicle that harmfully impacted a large tree or medium or large non-breakaway pole, or if the first harmful event was collision with a bridge, we changed the calculated probability of being road-related to 1. We used the state distribution of costs of fatal crashes where road conditions contributed to crash occurrence or severity to estimate the respective state distribution of non-fatal crash costs. The estimated comprehensive cost of traffic crashes where road conditions contributed to crash occurrence or severity was $217.5 billion in 2006. This represented 43.6% of the total comprehensive crash cost. The large share of crash costs related to road design and conditions underlines the importance of these factors in highway safety. Road conditions are largely controllable. Road maintenance and upgrading can prevent crashes and reduce injury severity. PMID:20184840
Coherent nature of the radiation emitted in delayed luminescence of leaves
Bajpai
1999-06-07
After exposure to light, a living system emits a photon signal of characteristic shape. The signal has a small decay region and a long tail region. The flux of photons in the decay region changes by 2 to 3 orders of magnitude, but remains almost constant in the tail region. The decaying part is attributed to delayed luminescence and the constant part to ultra-weak luminescence. Biophoton emission is the common name given to both kinds of luminescence, and the photons emitted are called biophotons. The decay of the biophoton signal is not exponential, which is suggestive of a coherent signal. We sought to establish the coherent nature by measuring the conditional probability of zero-photon detection in a small interval Delta. Our measurements establish the coherent nature of biophotons emitted by different leaves at various temperatures in the range 15-50 degrees C. Our setup could measure the conditional probability for Delta = 100 μs in only 100 ms, which enabled us to make this measurement in the decaying part of the signal. Various measurements were repeated 2000 times in contiguous intervals, which determined the dependence of the conditional probability on signal strength. The observed conditional probabilities at different signal strengths are in agreement with the predictions for coherent photons. The agreement is impressive in the discriminatory range of signal strengths, 0.1-5 counts per Delta, where the predictions for coherent and thermal photons differ substantially. We used values of Delta in the range 10 μs-10 ms to obtain a discriminatory signal strength in different regions of a decaying signal. These measurements establish the coherent nature of photons in all regions of a biophoton signal from 10 ms to 5 hr.
We checked the efficacy of our method by measuring the conditional probability of zero-photon detection in the radiation of a light-emitting diode along with a leaf, for Delta in the range 10 μs-100 μs. The conditional probability in the diode radiation differed from the prediction for coherent photons when the signal strength was less than 2.5 counts per Delta. Only the diode radiation exhibited photon bunching, at signal strengths around 0.05 counts per Delta. Copyright 1999 Academic Press.
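The discrimination between coherent and thermal light rests on textbook photon-counting statistics: a coherent (Poisson) field gives P(0) = exp(-n̄) for the probability of detecting zero photons in an interval with mean count n̄, while a thermal (Bose-Einstein) field gives P(0) = 1/(1 + n̄). A minimal sketch of the two predictions:

```python
import math

def p0_coherent(mean_counts):
    # Poisson statistics of a coherent field: P(0) = exp(-<n>)
    return math.exp(-mean_counts)

def p0_thermal(mean_counts):
    # Bose-Einstein statistics of a thermal field: P(0) = 1/(1 + <n>)
    return 1.0 / (1.0 + mean_counts)

# The two predictions differ most over the 0.1-5 counts-per-interval range
for n in (0.1, 1.0, 2.5, 5.0):
    print(n, p0_coherent(n), p0_thermal(n))
```

At n̄ = 2.5, for example, the thermal prediction (≈0.29) is several times the coherent one (≈0.08), which is why this range of signal strengths is discriminatory.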
Markov chains for testing redundant software
NASA Technical Reports Server (NTRS)
White, Allan L.; Sjogren, Jon A.
1988-01-01
A preliminary design for a validation experiment has been developed that addresses several problems unique to assuring the extremely high quality of multiple-version programs in process-control software. The procedure uses Markov chains to model the error states of the multiple version programs. The programs are observed during simulated process-control testing, and estimates are obtained for the transition probabilities between the states of the Markov chain. The experimental Markov chain model is then expanded into a reliability model that takes into account the inertia of the system being controlled. The reliability of the multiple version software is computed from this reliability model at a given confidence level using confidence intervals obtained for the transition probabilities during the experiment. An example demonstrating the method is provided.
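A minimal sketch of the transition-probability estimation step, assuming observed counts of state-to-state transitions and a simple normal-approximation confidence interval (the experiment's actual interval construction may differ):

```python
import math

def estimate_transitions(observed):
    """MLE transition probabilities and normal-approximation 95% CIs
    from observed[i][j] = count of observed i -> j transitions."""
    probs, cis = {}, {}
    for i, row in observed.items():
        n = sum(row.values())
        for j, count in row.items():
            p = count / n
            half = 1.96 * math.sqrt(p * (1.0 - p) / n)
            probs[(i, j)] = p
            cis[(i, j)] = (max(0.0, p - half), min(1.0, p + half))
    return probs, cis
```

The interval endpoints, rather than the point estimates, would then feed the reliability model to give a reliability bound at the chosen confidence level.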
REGULATION OF GEOGRAPHIC VARIABILITY IN HAPLOID:DIPLOID RATIOS OF BIPHASIC SEAWEED LIFE CYCLES(1).
da Silva Vieira, Vasco Manuel Nobre de Carvalho; Santos, Rui Orlando Pimenta
2012-08-01
The relative abundance of haploid and diploid individuals (H:D) in isomorphic marine algal biphasic cycles varies spatially, but only if vital rates of the haploid and diploid phases vary differently with environmental conditions (i.e. conditional differentiation between phases). Vital rates of isomorphic phases in particular environments may be determined by subtle morphological or physiological differences. Herein, we test numerically how geographic variability in H:D is regulated by conditional differentiation between isomorphic life phases and by the type of life strategy of populations (i.e. life cycles dominated by reproduction, survival or growth). Simulation conditions were selected using available data on H:D spatial variability in seaweeds. Conditional differentiation between ploidy phases had a small effect on H:D variability for species with life strategies that invest either in fertility or in growth. Conversely, species with life strategies that invest mainly in survival exhibited high variability in H:D through conditional differentiation in stasis (the probability of staying in the same size class), breakage (the probability of changing to a smaller size class) or growth (the probability of changing to a bigger size class). These results were consistent with observed geographic variability in H:D of natural marine algae populations. © 2012 Phycological Society of America.
Predicted sequence of cortical tau and amyloid-β deposition in Alzheimer disease spectrum.
Cho, Hanna; Lee, Hye Sun; Choi, Jae Yong; Lee, Jae Hoon; Ryu, Young Hoon; Lee, Myung Sik; Lyoo, Chul Hyoung
2018-04-17
We investigated the sequential order between tau and amyloid-β (Aβ) deposition in the Alzheimer disease spectrum using a conditional probability method. Two hundred twenty participants underwent 18F-flortaucipir and 18F-florbetaben positron emission tomography scans and neuropsychological tests. The presence of tau and Aβ in each region and impairment in each cognitive domain were determined by Z-score cutoffs. By comparing pairs of conditional probabilities, the sequential order of tau and Aβ deposition was determined. The probability for the presence of tau in the entorhinal cortex was higher than that of Aβ in all cortical regions, and in the medial temporal cortices, the probability for the presence of tau was higher than that of Aβ. Conversely, in the remaining neocortex above the inferior temporal cortex, the probability for the presence of Aβ was always higher than that of tau. Tau pathology in the entorhinal cortex may appear earlier than neocortical Aβ and may spread in the absence of Aβ within the neighboring medial temporal regions. However, Aβ may be required for massive tau deposition in distant cortical areas. Copyright © 2018 Elsevier Inc. All rights reserved.
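The ordering logic can be sketched with paired binary presence indicators: if P(A present | B present) exceeds P(B present | A present), A is inferred to be the earlier, more prevalent event. The indicator data below are illustrative, not the study's measurements:

```python
def conditional_prob(a, b):
    """P(event a present | event b present) from paired 0/1 indicators."""
    both = sum(1 for x, y in zip(a, b) if x and y)
    b_present = sum(b)
    return both / b_present if b_present else float("nan")

def earlier(a, b):
    """Infer that event a precedes event b when P(a|b) > P(b|a):
    whenever b is present, a is (almost) always already present."""
    return conditional_prob(a, b) > conditional_prob(b, a)
```

With hypothetical subjects in whom tau is present wherever Aβ is, but not conversely, `earlier(tau, abeta)` returns True.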
Lexicographic Probability, Conditional Probability, and Nonstandard Probability
2009-11-11
1994; Hammond 1999; Kohlberg and Reny 1997; Kreps and Wilson 1982; Myerson 1986; Selten 1965; Selten 1975]). It also arises in the analysis of...sets of measure 0): BBD considered three; Kohlberg and Reny [1997] considered two others. It turns out that these notions are perhaps best understood...number of characterizations of solution concepts depend on independence (see, for example, [Battigalli 1996; Kohlberg and Reny 1997; Battigalli and
NESTOR: A Computer-Based Medical Diagnostic Aid That Integrates Causal and Probabilistic Knowledge.
1984-11-01
individual conditional probabilities between one cause node and its effect node, but less common to know a joint conditional probability between a... (Author: Gregory F. Cooper; Contract ONR N00014-81-K-0004; Department of Computer Science, Stanford University, Stanford, CA 94305, USA)
NASA Astrophysics Data System (ADS)
Wang, Fuming; Hunsche, Stefan; Anunciado, Roy; Corradi, Antonio; Tien, Hung Yu; Tang, Peng; Wei, Junwei; Wang, Yongjun; Fang, Wei; Wong, Patrick; van Oosten, Anton; van Ingen Schenau, Koen; Slachter, Bram
2018-03-01
We present an experimental study of pattern variability and defectivity, based on a large data set with more than 112 million SEM measurements from an HMI high-throughput e-beam tool. The test case is a 10nm node SRAM via array patterned with a DUV immersion LELE process, where we see a variation in mean size and litho sensitivities between different unique via patterns that leads to seemingly qualitative differences in defectivity. The large available data volume enables further analysis to reliably distinguish global and local CDU variations, including a breakdown into local systematics and stochastics. A closer inspection of the tail end of the distributions and estimation of defect probabilities indicates that there is a common defect mechanism and defect threshold despite the observed differences in specific pattern characteristics. We expect that the analysis methodology can be applied to defect probability modeling as well as general process qualification in the future.
Berlow, Noah; Pal, Ranadip
2011-01-01
Genetic Regulatory Networks (GRNs) are frequently modeled as Markov Chains providing the transition probabilities of moving from one state of the network to another. The inverse problem of inference of the Markov Chain from noisy and limited experimental data is an ill-posed problem and often generates multiple model possibilities instead of a unique one. In this article, we address the issue of intervention in a genetic regulatory network represented by a family of Markov Chains. The purpose of intervention is to alter the steady-state probability distribution of the GRN, as the steady states are considered to be representative of the phenotypes. We consider robust stationary control policies with the best expected behavior. The extreme computational complexity involved in the search for robust stationary control policies is mitigated by using a sequential approach to control policy generation and by utilizing computationally efficient techniques for updating the stationary probability distribution of a Markov chain following a rank-one perturbation.
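Since intervention here targets the stationary distribution of the network's Markov chain, a minimal sketch of computing that distribution by power iteration may help fix ideas (the article's rank-one update machinery is not reproduced):

```python
def stationary_distribution(P, tol=1e-12, max_iter=100000):
    """Stationary distribution of a row-stochastic matrix P (list of rows)
    by power iteration: repeatedly apply pi <- pi @ P until convergence."""
    n = len(P)
    pi = [1.0 / n] * n
    for _ in range(max_iter):
        new = [sum(pi[i] * P[i][j] for i in range(n)) for j in range(n)]
        if max(abs(a - b) for a, b in zip(new, pi)) < tol:
            return new
        pi = new
    return pi
```

A control policy that perturbs one row of P changes this fixed point; efficient updating schemes avoid recomputing it from scratch after each such rank-one change.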
NASA Astrophysics Data System (ADS)
Fernandes, Kátia; Verchot, Louis; Baethgen, Walter; Gutierrez-Velez, Victor; Pinedo-Vasquez, Miguel; Martius, Christopher
2017-05-01
In Indonesia, drought-driven fires occur typically during the warm phase of the El Niño Southern Oscillation. This was the case for the events of 1997 and 2015, which resulted in months-long hazardous atmospheric pollution levels in Equatorial Asia and record greenhouse gas emissions. Nonetheless, anomalously active fire seasons have also been observed in non-drought years. In this work, we investigated the impact of temperature on fires and found that when the July-October (JASO) period is anomalously dry, the sensitivity of fires to temperature is modest. In contrast, under normal-to-wet conditions, fire probability increases sharply when JASO is anomalously warm. This describes a regime in which an active fire season is not limited to drought years. Greater susceptibility to fires in response to a warmer environment finds support in the high evapotranspiration rates observed in normal-to-wet and warm conditions in Indonesia. We also find that fire probability in wet JASOs would be considerably less sensitive to temperature were it not for the added effect of recent positive trends. Near-term regional climate projections reveal that, despite negligible changes in precipitation, a continuing warming trend will heighten fire probability over the next few decades, especially in non-drought years. Mild fire seasons currently observed in association with wet conditions and cool temperatures will become rare events in Indonesia.
Covariate-adjusted Spearman's rank correlation with probability-scale residuals.
Liu, Qi; Li, Chun; Wanga, Valentine; Shepherd, Bryan E
2018-06-01
It is desirable to adjust Spearman's rank correlation for covariates, yet existing approaches have limitations. For example, the traditionally defined partial Spearman's correlation does not have a sensible population parameter, and the conditional Spearman's correlation defined with copulas cannot be easily generalized to discrete variables. We define population parameters for both partial and conditional Spearman's correlation through concordance-discordance probabilities. The definitions are natural extensions of Spearman's rank correlation in the presence of covariates and are general for any orderable random variables. We show that they can be neatly expressed using probability-scale residuals (PSRs). This connection allows us to derive simple estimators. Our partial estimator for Spearman's correlation between X and Y adjusted for Z is the correlation of PSRs from models of X on Z and of Y on Z, which is analogous to the partial Pearson's correlation derived as the correlation of observed-minus-expected residuals. Our conditional estimator is the conditional correlation of PSRs. We describe estimation and inference, and highlight the use of semiparametric cumulative probability models, which allow preservation of the rank-based nature of Spearman's correlation. We conduct simulations to evaluate the performance of our estimators and compare them with other popular measures of association, demonstrating their robustness and efficiency. We illustrate our method in two applications, a biomarker study and a large survey. © 2017, The International Biometric Society.
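A minimal sketch of the partial estimator, assuming a normal linear model of each variable on the covariate so that the conditional CDF needed for the PSR, defined as P(X < x | Z) − P(X > x | Z) = 2F(x | Z) − 1, has a closed form. The paper's semiparametric cumulative probability models are not reproduced here; this is only the "correlation of PSRs" idea under a convenient parametric assumption.

```python
import math

def normal_cdf(z):
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def linear_fit(z, x):
    """Ordinary least squares of x on z; returns intercept, slope, residual sd."""
    n = len(z)
    mz, mx = sum(z) / n, sum(x) / n
    b = sum((zi - mz) * (xi - mx) for zi, xi in zip(z, x)) / \
        sum((zi - mz) ** 2 for zi in z)
    a = mx - b * mz
    resid = [xi - (a + b * zi) for zi, xi in zip(z, x)]
    return a, b, math.sqrt(sum(r * r for r in resid) / n)

def probability_scale_residuals(z, x):
    """PSR_i = 2 * F(x_i | z_i) - 1 under a fitted normal linear model."""
    a, b, sd = linear_fit(z, x)
    return [2.0 * normal_cdf((xi - (a + b * zi)) / sd) - 1.0
            for zi, xi in zip(z, x)]

def pearson(u, v):
    n = len(u)
    mu, mv = sum(u) / n, sum(v) / n
    num = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    den = math.sqrt(sum((a - mu) ** 2 for a in u) *
                    sum((b - mv) ** 2 for b in v))
    return num / den

def partial_spearman(x, y, z):
    """Partial Spearman of X and Y adjusted for Z: correlation of the PSRs
    from the model of X on Z and the model of Y on Z."""
    return pearson(probability_scale_residuals(z, x),
                   probability_scale_residuals(z, y))
```

This mirrors how the partial Pearson correlation is the correlation of observed-minus-expected residuals, with PSRs preserving the rank-based character.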
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chang, Shih-Jung
Dynamic strength of the High Flux Isotope Reactor (HFIR) vessel to resist hypothetical accidents is analyzed by using the method of fracture mechanics. Vessel critical stresses are estimated by applying dynamic pressure pulses of a range of magnitudes and pulse durations. The pulse-versus-time functions are assumed to be step functions. The probability of vessel fracture is then calculated by assuming a distribution of possible surface cracks of different crack depths. The probability distribution function for the crack depths is based on the form recommended by the Marshall report. The toughness of the vessel steel used in the analysis is based on the projected and embrittled value after 10 effective full power years from 1986. From the study by Cheverton, Merkle and Nanstad, the weakest point on the vessel for fracture evaluation is known to be located within the region surrounding the tangential beam tube HB3. The increase in the probability of fracture is obtained as an extension of the result from that report for the regular operating condition to include conditions of higher dynamic pressures due to accident loadings. The increase in the probability of vessel fracture is plotted for a range of hoop stresses to indicate the vessel strength against hypothetical accident conditions.
Wildfire risk in the wildland-urban interface: A simulation study in northwestern Wisconsin
Massada, Avi Bar; Radeloff, Volker C.; Stewart, Susan I.; Hawbaker, Todd J.
2009-01-01
The rapid growth of housing in and near the wildland–urban interface (WUI) increases wildfire risk to lives and structures. To reduce fire risk, it is necessary to identify WUI housing areas that are more susceptible to wildfire. This is challenging, because wildfire patterns depend on fire behavior and spread, which in turn depend on ignition locations, weather conditions, the spatial arrangement of fuels, and topography. The goal of our study was to assess wildfire risk to a 60,000 ha WUI area in northwestern Wisconsin while accounting for all of these factors. We conducted 6000 simulations with two dynamic fire models, Fire Area Simulator (FARSITE) and Minimum Travel Time (MTT), in order to map the spatial pattern of burn probabilities. Simulations were run under normal and extreme weather conditions to assess the effect of weather on fire spread, burn probability, and risk to structures. The resulting burn probability maps were intersected with maps of structure locations and land cover types. The simulations revealed clear hotspots of wildfire activity and a large range of wildfire risk to structures in the study area. As expected, the extreme weather conditions yielded higher burn probabilities over the entire landscape, as well as to different land cover classes and individual structures. Moreover, the spatial pattern of risk was significantly different between extreme and normal weather conditions. The results highlight the fact that extreme weather conditions not only produce higher fire risk than normal weather conditions, but also change the fine-scale locations of high risk areas in the landscape, which is of great importance for fire management in WUI areas. In addition, the choice of weather data may limit the potential for comparisons of risk maps for different areas and for extrapolating risk maps to future scenarios where weather conditions are unknown.
Our approach to modeling wildfire risk to structures can aid fire risk reduction management activities by identifying areas with elevated wildfire risk and those most vulnerable under extreme weather conditions.
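A burn probability map of this kind reduces to a per-cell frequency over simulation runs; a sketch (the grids and counts below are illustrative, not the study's 6000-run output):

```python
def burn_probability_map(burn_maps):
    """Per-cell burn probability: the fraction of simulation runs in which
    the cell burned. burn_maps is a list of equally shaped 0/1 grids,
    one grid per simulated fire season."""
    n = len(burn_maps)
    rows, cols = len(burn_maps[0]), len(burn_maps[0][0])
    return [[sum(m[r][c] for m in burn_maps) / n for c in range(cols)]
            for r in range(rows)]
```

Intersecting such a map with structure locations gives each structure's estimated burn probability under the simulated weather scenario.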
Sharma, Vinamra; Chaudhary, Anand Kumar
2014-01-01
To maintain health and to cure disease through Rasayana (rejuvenation) therapy alongside the main treatment is a unique approach of Ayurveda. The basic constituent unit of a living being is always a functional cell. Questions arise: from where is it generated, and how does it attain its final, specifically differentiated form? As age progresses, various changes occur at the level of every cell, and the cell adapts accordingly. The microenvironment for cell nourishment diminishes with age or as a disease condition persists. In this context, the Acharyas contributed and documented various facts and theories through their insight and wisdom. Hidden secrets in the basic principles of any medical system need to be explained in terms of contemporary knowledge, and contemporary research should remain open to explanations from different fields of ancient thought that may support these new doctrines. This review may help open the door to future research in the field of the reverse scientific approach of Ayurveda in the context of Dhatu Siddhanta (the theory of tissue formation and differentiation) and the theory of stem cells.
NASA Technical Reports Server (NTRS)
Wigton, W. H.; Vonsteen, D. H.
1974-01-01
The Statistical Reporting Service of the U.S. Department of Agriculture is evaluating ERTS-1 imagery as a potential tool for estimating crop acreage. A main data source for the estimates is obtained by enumerating small land parcels that have been randomly selected from the total U.S. land area. These small parcels are being used as ground observations in this investigation. The test sites are located in Missouri, Kansas, Idaho, and South Dakota. The major crops of interest are wheat, cotton, corn, soybeans, sugar beets, potatoes, oats, alfalfa, and grain sorghum. Some of the crops are unique to a given site while others are common in two or three states. This provides an opportunity to observe crops grown under different conditions. Results for the Missouri test site are presented. Results of temporal overlays, unequal prior probabilities, and sample classifiers are discussed. The amount of improvement that each technique contributes is shown in terms of overall performance. The results show that useful information for making crop acreage estimates can be obtained from ERTS-1 data.
COP21 climate negotiators' responses to climate model forecasts
NASA Astrophysics Data System (ADS)
Bosetti, Valentina; Weber, Elke; Berger, Loïc; Budescu, David V.; Liu, Ning; Tavoni, Massimo
2017-02-01
Policymakers involved in climate change negotiations are key users of climate science. It is therefore vital to understand how to communicate scientific information most effectively to this group. We tested how a unique sample of policymakers and negotiators at the Paris COP21 conference update their beliefs on year 2100 global mean temperature increases in response to a statistical summary of climate models' forecasts. We randomized the way information was provided across participants using three different formats similar to those used in Intergovernmental Panel on Climate Change reports. In spite of having received all available relevant scientific information, policymakers adopted such information very conservatively, assigning it less weight than their own prior beliefs. However, providing individual model estimates in addition to the statistical range was more effective in mitigating such inertia. The experiment was repeated with a population of European MBA students who, despite starting from similar priors, reported conditional probabilities closer to the provided models' forecasts than policymakers. There was also no effect of presentation format in the MBA sample. These results highlight the importance of testing visualization tools directly on the population of interest.
Coqueugniot, Hélène; Dutour, Olivier; Arensburg, Baruch; Duday, Henri; Vandermeersch, Bernard; Tillier, Anne-marie
2014-01-01
The Qafzeh site (Lower Galilee, Israel) has yielded the largest Levantine hominin collection from Middle Palaeolithic layers, dated to circa 90–100 kyrs BP or to marine isotope stage 5b–c. Within the hominin sample, Qafzeh 11, circa 12–13 yrs old at death, presents a skull lesion previously attributed to a healed trauma. Three-dimensional imaging methods allowed us to better explore this lesion, which appeared to be a depressed fracture of the frontal bone associated with brain damage. Furthermore, the endocranial volume, smaller than expected for dental age, supports the hypothesis of a growth delay due to traumatic brain injury. This trauma did not affect the typical human brain morphology pattern of the right frontal and left occipital petalia. It is highly probable that this young individual suffered from personality and neurological disorders directly related to focal cerebral damage. Interestingly, this young individual benefited from a funerary practice unique among the south-western Asian burials dated to the Middle Palaeolithic. PMID:25054798
Dynamics of a stochastic multi-strain SIS epidemic model driven by Lévy noise
NASA Astrophysics Data System (ADS)
Chen, Can; Kang, Yanmei
2017-01-01
A stochastic multi-strain SIS epidemic model is formulated by introducing Lévy noise into the disease transmission rate of each strain. First, we prove that the stochastic model admits a unique global positive solution, and, by the comparison theorem, we show that the solution remains within a positively invariant set almost surely. Next we investigate stochastic stability of the disease-free equilibrium, including stability in probability and pth moment asymptotic stability. Then sufficient conditions for persistence in the mean of the disease are established. Finally, based on an Euler scheme for Lévy-driven stochastic differential equations, numerical simulations for a stochastic two-strain model are carried out to verify the theoretical results. Moreover, numerical comparison results of the stochastic two-strain model and the deterministic version are also given. Lévy noise can cause the two strains to become extinct almost surely, even though there is a dominant strain that persists in the deterministic model. It can be concluded that the introduction of Lévy noise reduces the disease extinction threshold, which indicates that Lévy noise may suppress the disease outbreak.
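An Euler scheme of the kind used for the numerical experiments can be sketched for a single-strain SIS model; the parameter values and the deterministic jump size below are illustrative choices, not those of the paper, and the compound-Poisson jumps are approximated by at most one jump per step:

```python
import math
import random

def simulate_sis_levy(beta=0.4, gamma=0.2, sigma=0.1,
                      jump_rate=0.5, jump_scale=-0.05,
                      i0=0.1, t_end=50.0, dt=0.01, seed=1):
    """Euler scheme for a one-strain stochastic SIS model whose transmission
    term is perturbed by Brownian noise and compound-Poisson (Levy) jumps.
    I is the infected fraction; S = 1 - I. Returns I at time t_end."""
    random.seed(seed)
    i = i0
    for _ in range(int(t_end / dt)):
        s = 1.0 - i
        drift = beta * s * i - gamma * i
        diffusion = sigma * s * i * random.gauss(0.0, math.sqrt(dt))
        jump = 0.0
        if random.random() < jump_rate * dt:  # a jump arrives in [t, t + dt)
            jump = jump_scale * s * i         # illustrative fixed jump size
        # Clamp to keep the fraction in [0, 1] despite discretization error
        i = min(max(i + drift * dt + diffusion + jump, 0.0), 1.0)
    return i
```

With the noise switched off (sigma = 0, jump_rate = 0) the scheme recovers the deterministic SIS endemic equilibrium I* = 1 − gamma/beta; negative jumps push trajectories toward extinction, in line with the paper's observation that Lévy noise lowers the outbreak threshold.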
Strong metal-support interactions
NASA Technical Reports Server (NTRS)
Vannice, M. Albert
1987-01-01
It has been demonstrated that synergistic metal-support effects can occur which markedly enhance specific activity and alter selectivity in certain reactions. Because of the presence of such effects in certain reactions conducted under reducing conditions (that is, under H2 pressure), but not others, the creation of unique sites at the metal-support interface seems to be the best model at the present time to explain this behavior. The postulation of these sites, which are specific for a certain reactant such as CO, provides an effective explanation for the higher methanation rates that have been reported over some catalysts. The creation of these sites in the adlineation zone is facilitated by hydrogen spillover from the metal surface, and this same process can also enhance the reduction of many oxide supports. Although oxygen spillover is much less probable due to its higher heat of adsorption, it is much less well understood and the possibility of rate enhancements in CO oxidation caused by special interface sites cannot be discounted at the present time. Consequently, this seems to be an important area of future research.
Ray, James V; Thornton, Laura C; Frick, Paul J; Steinberg, Laurence; Cauffman, Elizabeth
2016-04-01
Both callous-unemotional (CU) traits and impulse control are known risk factors associated with delinquency and substance use. However, research is limited on how contextual factors such as neighborhood conditions influence the associations between these two dispositional factors and these two externalizing behaviors. The current study utilized latent class analysis (LCA) to identify unique classes of delinquency and substance use within an ethnically diverse sample (n = 1216) of justice-involved adolescents (ages 13 to 17) from three different sites. Neighborhood disorder, CU traits, and impulse control were all independently associated with membership in classes with more extensive histories of delinquency and substance use. The effects of CU traits and impulse control in distinguishing delinquent classes were invariant across levels of neighborhood disorder, whereas neighborhood disorder moderated the association between impulse control and substance use. Specifically, the probability of being in more severe substance-using classes for those low in impulse control was stronger in neighborhoods with fewer indicators of social and physical disorder.
Zheng, Wenjing; van der Laan, Mark
2017-01-01
In this paper, we study the effect of a time-varying exposure mediated by a time-varying intermediate variable. We consider general longitudinal settings, including survival outcomes. At a given time point, the exposure and mediator of interest are influenced by past covariates, mediators and exposures, and affect future covariates, mediators and exposures. Right censoring, if present, occurs in response to past history. To address the challenges in mediation analysis that are unique to these settings, we propose a formulation in terms of random interventions based on conditional distributions for the mediator. This formulation, in particular, allows for well-defined natural direct and indirect effects in the survival setting, and natural decomposition of the standard total effect. Upon establishing identifiability and the corresponding statistical estimands, we derive the efficient influence curves and establish their robustness properties. Applying Targeted Maximum Likelihood Estimation, we use these efficient influence curves to construct multiply robust and efficient estimators. We also present an inverse probability weighted estimator and a nested non-targeted substitution estimator for these parameters. PMID:29387520
NASA Astrophysics Data System (ADS)
Belkina, T. A.; Konyukhova, N. B.; Kurochkin, S. V.
2012-10-01
A singular boundary value problem for a second-order linear integrodifferential equation with Volterra and non-Volterra integral operators is formulated and analyzed. The equation is defined on ℝ+, has a weak singularity at zero and a strong singularity at infinity, and depends on several positive parameters. Under natural constraints on the coefficients of the equation, existence and uniqueness theorems for this problem with given limit boundary conditions at the singular points are proved, asymptotic representations of the solution are given, and an algorithm for its numerical determination is described. Numerical computations are performed and their interpretation is given. The problem arises in the study of the survival probability of an insurance company over infinite time (as a function of its initial surplus) in a dynamic insurance model that is a modification of the classical Cramer-Lundberg model, in which the premium rate is a stochastic process, under a certain investment strategy in the financial market. A comparative analysis of the results with those produced by the model with deterministic premiums is given.
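For comparison with the classical setting that the paper modifies, the ruin probability in a Cramer-Lundberg model can be estimated by straightforward Monte Carlo. The parameters below are illustrative: the surplus drifts up at a constant premium rate and drops by exponentially distributed claims arriving as a Poisson process, and ruin is checked over a finite horizon as a proxy for infinite time.

```python
import random

def ruin_probability(surplus0, premium_rate=1.1, claim_rate=1.0,
                     mean_claim=1.0, horizon=200.0, n_paths=2000, seed=7):
    """Monte Carlo estimate of P(ruin before `horizon`) in a classical
    Cramer-Lundberg model with deterministic premiums."""
    random.seed(seed)
    ruined = 0
    for _ in range(n_paths):
        u, t = surplus0, 0.0
        while t < horizon:
            wait = random.expovariate(claim_rate)      # time to next claim
            t += wait
            if t >= horizon:
                break
            u += premium_rate * wait                   # premiums accrue continuously
            u -= random.expovariate(1.0 / mean_claim)  # claim size
            if u < 0.0:                                # surplus only drops at claims
                ruined += 1
                break
    return ruined / n_paths
```

Since the surplus can only become negative at claim instants, checking after each claim is exact; a positive safety loading (premium_rate > claim_rate * mean_claim) makes the ruin probability decay with the initial surplus.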
NASA Astrophysics Data System (ADS)
Fosnight, Alyssa M.; Moran, Benjamin L.; Branco, Daniela R.; Thomas, Jessica R.; Medvedev, Ivan R.
2013-06-01
As many as 3000 chemicals are reported to be found in exhaled human breath. Many of these chemicals are linked to certain health conditions and environmental exposures. Present state-of-the-art techniques used for analysis of exhaled human breath include mass spectrometry based methods, infrared spectroscopic sensors, electrochemical sensors and semiconductor oxide based testers. Some of these techniques are commercially available but are somewhat limited in their specificity and exhibit a fairly high probability of false alarm. Here, we present the results of our most recent study, which demonstrated a novel application of a terahertz high-resolution spectroscopic technique to the analysis of exhaled human breath, focused on detection of ethanol in the exhaled breath of a person who had consumed an alcoholic drink. This technique possesses nearly "absolute" specificity, and we demonstrated its ability to uniquely identify ethanol, methanol, and acetone in human breath. This project is now complete, and we are looking to extend this method of chemical analysis of exhaled human breath to a broader range of chemicals in an attempt to demonstrate its potential for biomedical diagnostic purposes.
NASA Astrophysics Data System (ADS)
Litvak, Maxim
2017-04-01
For more than 4 years, the MSL Curiosity rover (landed in Gale crater in August 2012) has been traveling toward a sedimentary layered mound deposited with phyllosilicate and hematite hydrated minerals. Curiosity has already traversed more than 14 km and identified lacustrine deposits left by ancient lakes that filled the Gale area early in the history of Mars. Along the traverse, the Curiosity rover discovered unique signatures of how the Mars environment changed from ancient warm, wet, and probably habitable conditions to the modern cold and dry climate. We have summarized numerous measurements from the Dynamic Albedo of Neutrons (DAN) instrument on the Curiosity rover to overview variations of subsurface bound water distribution from wet to dry locations, compared them with other MSL measurements and with the possible distribution of hydrated minerals, and related them to the sequence of geological units traversed by Curiosity. We have also performed a joint analysis of water and chlorine distributions and compared bulk (down to 0.5 m depth) equivalent chlorine concentrations measured by DAN throughout the Gale area with APXS observations of corresponding local surface targets and drill fines.
Motivation and effort in individuals with social anhedonia
McCarthy, Julie M.; Treadway, Michael T.; Blanchard, Jack J.
2015-01-01
It has been proposed that anhedonia may, in part, reflect difficulties in reward processing and effortful decision-making. The current study aimed to replicate previous findings of effortful decision-making deficits associated with elevated anhedonia and expand upon these findings by investigating whether these decision-making deficits are specific to elevated social anhedonia or are also associated with elevated positive schizotypy characteristics. The current study compared controls (n = 40) to individuals elevated on social anhedonia (n = 30), and individuals elevated on perceptual aberration/magical ideation (n = 30) on the Effort Expenditure for Rewards Task (EEfRT). Across groups, participants chose a higher proportion of hard tasks with increasing probability of reward and reward magnitude, demonstrating sensitivity to probability and reward values. Contrary to our expectations, when the probability of reward was most uncertain (50% probability), at low and medium reward values, the social anhedonia group demonstrated more effortful decision-making than either individuals high in positive schizotypy or controls. The positive schizotypy group only differed from controls (making less effortful choices than controls) when reward probability was lowest (12%) and the magnitude of reward was the smallest. Our results suggest that social anhedonia is related to intact motivation and effort for monetary rewards, but that individuals with this characteristic display a unique and perhaps inefficient pattern of effort allocation when the probability of reward is most uncertain. Future research is needed to better understand effortful decision-making and the processing of reward across a range of individual difference characteristics. PMID:25888337
Sanfilippo, Paul G; Hewitt, Alex W; Mackey, David A
2017-04-01
To outline and detail the importance of conditional probability in clinical decision making, and to discuss the various diagnostic measures eye care practitioners should be aware of in order to improve the scope of their clinical practice. We conducted a review of the importance of conditional probability in diagnostic testing for the eye care practitioner. Eye care practitioners use diagnostic tests daily to assist in clinical decision making and to optimize patient care and management. These tests provide probabilistic information that can enable the clinician to increase (or decrease) their level of certainty about the presence of a particular condition. While an understanding of the characteristics of diagnostic tests is essential to facilitate proper interpretation of test results and disease risk, many practitioners either confuse or misinterpret these measures. In the interests of their patients, practitioners should be aware of the basic concepts associated with diagnostic testing and the simple mathematical rule that underpins them. Importantly, the practitioner needs to recognize that the prevalence of a disease in the population greatly determines the clinical value of a diagnostic test.
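The simple mathematical rule alluded to here is Bayes' theorem. A minimal sketch of how prevalence drives the post-test probability of disease, using hypothetical test characteristics (the sensitivity, specificity, and prevalence values below are illustrative, not from the review):

```python
def positive_predictive_value(sensitivity, specificity, prevalence):
    """Post-test probability of disease given a positive result (Bayes' theorem)."""
    true_pos = sensitivity * prevalence
    false_pos = (1.0 - specificity) * (1.0 - prevalence)
    return true_pos / (true_pos + false_pos)

# Same test, two prevalences: the post-test probability changes dramatically
ppv_rare   = positive_predictive_value(0.95, 0.95, 0.01)   # low-prevalence setting
ppv_common = positive_predictive_value(0.95, 0.95, 0.30)   # high-prevalence setting
```

With identical sensitivity and specificity, a positive result implies roughly a 16% probability of disease at 1% prevalence but roughly 89% at 30% prevalence, which is why prevalence determines a test's clinical value.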
DOE Office of Scientific and Technical Information (OSTI.GOV)
Manthe, Uwe, E-mail: uwe.manthe@uni-bielefeld.de; Ellerbrock, Roman, E-mail: roman.ellerbrock@uni-bielefeld.de
2016-05-28
A new approach for the quantum-state resolved analysis of polyatomic reactions is introduced. Based on the singular value decomposition of the S-matrix, energy-dependent natural reaction channels and natural reaction probabilities are defined. It is shown that the natural reaction probabilities are equal to the eigenvalues of the reaction probability operator [U. Manthe and W. H. Miller, J. Chem. Phys. 99, 3411 (1993)]. Consequently, the natural reaction channels can be interpreted as uniquely defined pathways through the transition state of the reaction. The analysis can efficiently be combined with reactive scattering calculations based on the propagation of thermal flux eigenstates. In contrast to a decomposition based straightforwardly on thermal flux eigenstates, it does not depend on the choice of the dividing surface separating reactants from products. The new approach is illustrated by studying a prototypical example, the H + CH₄ → H₂ + CH₃ reaction. The natural reaction probabilities and the contributions of the different vibrational states of the methyl product to the natural reaction channels are calculated and discussed. The relation between the thermal flux eigenstates and the natural reaction channels is studied in detail.
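The core linear-algebra step can be illustrated numerically. This is a toy sketch, not a scattering calculation: a random complex matrix stands in for the reactant-to-product S-matrix block, and the squared singular values are compared with the eigenvalues of S†S, the reaction probability operator in this simplified picture:

```python
import numpy as np

rng = np.random.default_rng(1)
# Toy reactant->product S-matrix block (hypothetical values, not unitary,
# so the "probabilities" here are only illustrative of the algebra)
S = 0.5 * (rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3)))

# Natural reaction channels/probabilities from the SVD: S = U diag(s) Vh
U, s, Vh = np.linalg.svd(S)
natural_probs = s**2          # descending order

# They coincide with the eigenvalues of S^dagger S
eigs = np.sort(np.linalg.eigvalsh(S.conj().T @ S))[::-1]
```

The columns of `Vh.conj().T` play the role of the natural reaction channels: uniquely defined input combinations that diagonalize the reaction probability.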
Quantum stochastic walks on networks for decision-making.
Martínez-Martínez, Ismael; Sánchez-Burillo, Eduardo
2016-03-31
Recent experiments report violations of the classical law of total probability and incompatibility of certain mental representations when humans process and react to information. Evidence shows promise of a more general quantum theory providing a better explanation of the dynamics and structure of real decision-making processes than classical probability theory. Inspired by this, we show how behavioral choice probabilities can arise as the unique stationary distribution of quantum stochastic walkers on the classical network defined from Luce's response probabilities. This work is relevant because (i) we provide a very general framework integrating the positive characteristics of both quantum and classical approaches previously in confrontation, and (ii) we define a cognitive network which can be used to bring other connectivist approaches to decision-making into the quantum stochastic realm. We model the decision-maker as an open system in contact with her surrounding environment, and the duration of the decision-making process turns out to also be a measure of the degree of interplay between the unitary and irreversible dynamics. Implementing quantum coherence on classical networks may be a door to better integrate human-like reasoning biases in stochastic models for decision-making.
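As a classical point of comparison (the paper's walkers are quantum stochastic), the stationary choice distribution of an ordinary random walk on a small network can be computed by power iteration over a column-stochastic transition matrix; the weights below are hypothetical:

```python
import numpy as np

# Column-stochastic transition matrix over 3 alternatives (hypothetical weights,
# standing in for a network built from Luce-type response probabilities)
T = np.array([[0.5, 0.2, 0.3],
              [0.3, 0.6, 0.3],
              [0.2, 0.2, 0.4]])

# Power iteration: repeatedly apply T until the distribution stops changing
p = np.full(3, 1.0 / 3.0)
for _ in range(1000):
    p = T @ p
# p now approximates the unique stationary choice distribution
```

For an irreducible, aperiodic chain this limit is unique regardless of the starting distribution, which is the classical analogue of the uniqueness claim in the abstract.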
Compact perturbative expressions for neutrino oscillations in matter
Denton, Peter B.; Minakata, Hisakazu; Parke, Stephen J.
2016-06-08
We further develop and extend a recent perturbative framework for neutrino oscillations in uniform matter density so that the resulting oscillation probabilities are accurate over the complete plane of matter potential versus baseline divided by neutrino energy. This extension also gives the exact oscillation probabilities in vacuum for all values of baseline divided by neutrino energy. The expansion parameter used is related to the ratio of the solar to the atmospheric $\Delta m^2$ scales, but with a unique choice of the atmospheric $\Delta m^2$ such that certain first-order effects are taken into account in the zeroth-order Hamiltonian. Using a mixing matrix formulation, this framework has the exceptional feature that the neutrino oscillation probability in matter has the same structure as in vacuum, to all orders in the expansion parameter. It also contains all orders in the matter potential and $\sin\theta_{13}$. It facilitates immediate physical interpretation of the analytic results, and makes the expressions for the neutrino oscillation probabilities extremely compact and very accurate even at zeroth order in our perturbative expansion. Furthermore, the first- and second-order results are also given, which improve the precision by approximately two or more orders of magnitude per perturbative order.
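For orientation only (this is the textbook two-flavor vacuum formula, not the paper's perturbative matter expressions), the appearance probability is P = sin^2(2θ) sin^2(1.267 Δm² L/E) with Δm² in eV², L in km, and E in GeV:

```python
import math

def p_oscillation(sin2_2theta, dm2_ev2, L_km, E_gev):
    """Two-flavor vacuum appearance probability.
    P = sin^2(2*theta) * sin^2(1.267 * dm2 [eV^2] * L [km] / E [GeV])
    """
    return sin2_2theta * math.sin(1.267 * dm2_ev2 * L_km / E_gev) ** 2

# Atmospheric-scale example values (illustrative, not fit results)
p = p_oscillation(0.99, 2.5e-3, 1300.0, 2.0)
```

The matter-potential corrections the paper develops modify both the effective mixing angle and the effective $\Delta m^2$ in this expression while preserving its vacuum-like structure.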
Computing under-ice discharge: A proof-of-concept using hydroacoustics and the Probability Concept
NASA Astrophysics Data System (ADS)
Fulton, John W.; Henneberg, Mark F.; Mills, Taylor J.; Kohn, Michael S.; Epstein, Brian; Hittle, Elizabeth A.; Damschen, William C.; Laveau, Christopher D.; Lambrecht, Jason M.; Farmer, William H.
2018-07-01
Under-ice discharge is estimated using open-water reference hydrographs; however, the ratings for ice-affected sites are generally qualified as poor. The U.S. Geological Survey (USGS), in collaboration with the Colorado Water Conservation Board, conducted a proof-of-concept to develop an alternative method for computing under-ice discharge using hydroacoustics and the Probability Concept. The study site was located south of Minturn, Colorado (CO), USA, and was selected because of (1) its proximity to the existing USGS streamgage 09064600 Eagle River near Minturn, CO, and (2) its ease of access for verifying discharge using a variety of conventional methods. From late September 2014 to early March 2015, hydraulic conditions varied from open water to under ice. These temporal changes led to variations in water depth and velocity. Hydroacoustics (tethered and uplooking acoustic Doppler current profilers and acoustic Doppler velocimeters) were deployed to measure the vertical-velocity profile at a singularly important vertical of the channel cross section. Because the velocity profile was non-standard and cannot be characterized using a Power Law or Log Law, velocity data were analyzed using the Probability Concept, which is a probabilistic formulation of the velocity distribution. The Probability Concept-derived discharge was compared to conventional methods including stage-discharge and index-velocity ratings and concurrent field measurements; each is complicated by the dynamics of ice formation, pressure influences on stage measurements, and variations in cross-sectional area due to ice formation. No particular discharge method was assigned as truth. Rather, one statistical metric (Kolmogorov-Smirnov; KS), agreement plots, and concurrent measurements provided a measure of comparability between the various methods.
Regardless of the method employed, comparisons between the methods revealed encouraging results depending on the flow conditions and the absence or presence of ice cover. For example, during lower discharges dominated by under-ice and transition (intermittent open-water and under-ice) conditions, the KS metric suggests there is not sufficient information to reject the null hypothesis and implies that the Probability Concept and index-velocity rating represent similar distributions. During high-flow, open-water conditions, the comparisons are less definitive; therefore, it is important that the appropriate analytical method and instrumentation be selected. Six conventional discharge measurements were collected concurrently with Probability Concept-derived discharges, with percent differences of -9.0%, -21%, -8.6%, 17.8%, 3.6%, and -2.3%. This proof-of-concept demonstrates that riverine discharges can be computed using the Probability Concept for a range of hydraulic extremes (variations in discharge, open-water and under-ice conditions) immediately after the siting phase is complete, which typically requires one day. Computing real-time discharges is particularly important at sites where (1) new streamgages are planned, (2) river hydraulics are complex, and (3) shifts in the stage-discharge rating are needed to correct the streamflow record. Use of the Probability Concept does not preclude the need to maintain a stage-area relation. Both the Probability Concept and index-velocity rating offer water-resource managers and decision makers alternatives for computing real-time discharge for open-water and under-ice conditions.
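The KS comparison used above can be sketched with a self-contained two-sample statistic, the maximum gap between the two empirical CDFs; the discharge series below are hypothetical, not the study's data:

```python
def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: max gap between empirical CDFs."""
    a, b = sorted(a), sorted(b)
    values = sorted(set(a) | set(b))
    cdf = lambda xs, v: sum(x <= v for x in xs) / len(xs)
    return max(abs(cdf(a, v) - cdf(b, v)) for v in values)

# Two hypothetical discharge series (m^3/s), standing in for the
# Probability Concept and index-velocity rating estimates
q_pc  = [2.1, 2.4, 2.2, 2.6, 2.3, 2.5]
q_ivr = [2.0, 2.5, 2.3, 2.7, 2.2, 2.4]
d = ks_statistic(q_pc, q_ivr)
```

A small statistic (relative to the critical value for the sample sizes) is what "not sufficient information to reject the null hypothesis" corresponds to: the two methods' discharge distributions are statistically indistinguishable.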
The Effect of Interruptions on Part 121 Air Carrier Operations
NASA Technical Reports Server (NTRS)
Damos, Diane L.
1998-01-01
The primary purpose of this study was to determine the relative priorities of various events and activities by examining the probability that a given activity was interrupted by a given event. The analysis will begin by providing frequency-of-interruption data by crew position (captain versus first officer) and event type. Any differences in the pattern of interruptions between the first officers and the captains will be explored and interpreted in terms of standard operating procedures. Subsequent data analyses will focus on comparing the frequency of interruptions for different types of activities and for the same activities under normal versus emergency conditions. Briefings and checklists will receive particular attention. The frequency with which specific activities are interrupted under multiple- versus single-task conditions also will be examined; because the majority of multiple-task data were obtained under laboratory conditions, LOFT-type tapes offer a unique opportunity to examine concurrent task performance under 'real-world' conditions. A second purpose of this study is to examine the effects of the interruptions on performance. More specifically, when possible, the time to resume specific activities will be compared to determine if pilots are slower to resume certain types of activities. Errors in resumption or failures to resume specific activities will be noted, and any patterns in these errors will be identified. Again, particular attention will be given to the effects of interruptions on the completion of checklists and briefings. Other types of errors and missed events (i.e., the crew should have responded to the event but did not) will be examined. Any methodology using interruptions to examine task prioritization must be able to identify when an interruption has occurred and describe the ongoing activities that were interrupted. Both of these methodological problems are discussed in detail in the following section.
A quantified dosing ALD reactor with in-situ diagnostics for surface chemistry studies
NASA Astrophysics Data System (ADS)
Larrabee, Thomas J.
A specialized atomic layer deposition (ALD) reactor has been constructed to serve as an instrument to simultaneously study the surface chemistry of the ALD process and perform ALD as is conventionally done in continuum flow of inert gas. This reactor is uniquely useful for gaining insight into the ALD process because of the combination of its precise, controllable, and quantified dosing/microdosing capability; its in-situ quadrupole mass spectrometer for gas composition analysis; its pair of highly sensitive in-situ quartz crystal microbalances (QCMs); and its complete spectrum of pressures and operating conditions --- from viscous to molecular flow regimes. Control of the dose is achieved independently of the conditions by allowing a reactant gas to fill a fixed volume at a measured pressure, held at a controlled temperature, and subsequently dosing it into the system by computer-controlled pneumatic valves. The absolute reactant exposure to the substrate and QCMs is unambiguously calculated from the molecular impingement flux, and its relationship to dose size is established, providing a straightforward means of intentionally reproducing specific exposures. The precursor sticking probability, the dynamics of molecular impingement, the dose size, and other operating variables are, for the first time, quantitatively related to surface reaction rates by mass balance. Extensive characterization of the QCM as a measurement tool for adsorption under realistic ALD conditions has been performed, emphasizing the state of the art and the importance of the required QCM system features. Finally, the importance of dose quantification and microdosing has been contextualized in view of the ALD literature, underscoring the significance of more precise condition specification in establishing a better basis for reactor and reactant comparison.
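The molecular impingement flux referred to above is conventionally given by the Hertz-Knudsen expression, Φ = P / sqrt(2π m k_B T). A sketch with hypothetical precursor values (the pressure, molar mass, and temperature below are illustrative):

```python
import math

def impingement_flux(pressure_pa, molar_mass_kg_mol, temp_k):
    """Hertz-Knudsen molecular impingement flux, in molecules m^-2 s^-1."""
    K_B = 1.380649e-23       # Boltzmann constant, J/K
    N_A = 6.02214076e23      # Avogadro constant, 1/mol
    m = molar_mass_kg_mol / N_A  # mass of one molecule, kg
    return pressure_pa / math.sqrt(2.0 * math.pi * m * K_B * temp_k)

# Example: a precursor with M ~ 0.072 kg/mol at 0.1 Pa and 300 K
flux = impingement_flux(0.1, 0.072, 300.0)
```

Multiplying this flux by the dose duration and the sticking probability gives the adsorbed amount, which is the mass-balance link to the QCM signal described in the abstract.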
Vickers, Timothy A.; Freier, Susan M.; Bui, Huynh-Hoa; Watt, Andrew; Crooke, Stanley T.
2014-01-01
A new strategy for identifying potent RNase H-dependent antisense oligonucleotides (ASOs) is presented. Our analysis of the human transcriptome revealed that a significant proportion of genes contain unique repeated sequences of 16 or more nucleotides in length. Activities of ASOs targeting these repeated sites in several representative genes were compared to those of ASOs targeting unique single sites in the same transcript. Antisense activity at repeated sites was also evaluated in a highly controlled minigene system. Targeting both native and minigene repeat sites resulted in significant increases in potency as compared to targeting of non-repeated sites. The increased potency at these sites is a result of increased frequency of ASO/RNA interactions which, in turn, increases the probability of a productive interaction between the ASO/RNA heteroduplex and human RNase H1 in the cell. These results suggest a new, highly efficient strategy for rapid identification of highly potent ASOs. PMID:25334092
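Finding such repeated target sites amounts to scanning a transcript for k-mers (k ≥ 16) that occur more than once. A minimal sketch on a toy sequence (the repeat below is hypothetical, not a real ASO target):

```python
from collections import Counter

def repeated_kmers(transcript, k=16):
    """Return the k-mers that occur more than once within a transcript."""
    counts = Counter(transcript[i:i + k] for i in range(len(transcript) - k + 1))
    return {kmer: n for kmer, n in counts.items() if n > 1}

# Toy transcript with an engineered 16-nt repeat (hypothetical sequence)
repeat = "ACGTACGTTGCATGCA"
seq = "GGGG" + repeat + "TTTT" + repeat + "CCCC"
hits = repeated_kmers(seq)
```

An ASO complementary to a k-mer in `hits` would have multiple binding sites per transcript, which is the mechanism the authors propose for the observed potency increase.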
A unique role of endogenous visual-spatial attention in rapid processing of multiple targets
Guzman, Emmanuel; Grabowecky, Marcia; Palafox, German; Suzuki, Satoru
2012-01-01
Visual spatial attention can be exogenously captured by a salient stimulus or can be endogenously allocated by voluntary effort. Whether these two attention modes serve distinctive functions is debated, but for processing of single targets the literature suggests superiority of exogenous attention (it is faster acting and serves more functions). We report that endogenous attention uniquely contributes to processing of multiple targets. For speeded visual discrimination, response times are faster for multiple redundant targets than for single targets due to probability summation and/or signal integration. This redundancy gain was unaffected when attention was exogenously diverted from the targets, but was completely eliminated when attention was endogenously diverted. This was not due to weaker manipulation of exogenous attention because our exogenous and endogenous cues similarly affected overall response times. Thus, whereas exogenous attention is superior for processing single targets, endogenous attention plays a unique role in allocating resources crucial for rapid concurrent processing of multiple targets. PMID:21517209
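The probability-summation account of the redundancy gain can be made concrete: if each of n independent targets is detected with probability p, the chance that at least one is detected is 1 - (1 - p)^n. A minimal sketch with hypothetical values:

```python
def prob_summation(p_single, n_targets):
    """Probability that at least one of n independent targets is detected,
    given each is detected with probability p_single."""
    return 1.0 - (1.0 - p_single) ** n_targets

p1 = 0.6                          # hypothetical single-target detection rate
p2 = prob_summation(p1, 2)        # redundant-target detection rate
```

The gain over the single-target rate (here 0.84 versus 0.60) is what disappears when endogenous attention is diverted, on the authors' account.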
Rationality, irrationality and escalating behavior in lowest unique bid auctions.
Radicchi, Filippo; Baronchelli, Andrea; Amaral, Luís A N
2012-01-01
Information technology has revolutionized the traditional structure of markets. The removal of geographical and time constraints has fostered the growth of online auction markets, which now include millions of economic agents worldwide and annual transaction volumes in the billions of dollars. Here, we analyze bid histories of a little studied type of online auctions--lowest unique bid auctions. Similarly to what has been reported for foraging animals searching for scarce food, we find that agents adopt Lévy flight search strategies in their exploration of "bid space". The Lévy regime, which is characterized by a power-law decaying probability distribution of step lengths, holds over nearly three orders of magnitude. We develop a quantitative model for lowest unique bid online auctions that reveals that agents use nearly optimal bidding strategies. However, agents participating in these auctions do not optimize their financial gain. Indeed, as long as there are many auction participants, a rational profit optimizing agent would choose not to participate in these auction markets.
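A power-law (Pareto) step-length distribution of the kind that characterizes the Lévy regime can be sampled by inverse-CDF transformation; the tail exponent below is a hypothetical illustration, not the value fitted in the study:

```python
import random

def levy_step(alpha=1.5, x_min=1.0):
    """Sample a step length from P(x) ~ x^{-(alpha+1)}, x >= x_min,
    via the inverse CDF of the Pareto distribution."""
    u = random.random()
    return x_min * (1.0 - u) ** (-1.0 / alpha)

random.seed(0)
steps = [levy_step() for _ in range(10_000)]
```

Unlike Gaussian steps, a heavy-tailed sample like this mixes many short hops with occasional very long jumps across "bid space", which is the signature reported in the bid histories.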
Mardanov, M J; Mahmudov, N I; Sharifov, Y A
2014-01-01
We study a boundary value problem for the system of nonlinear impulsive fractional differential equations of order α (0 < α ≤ 1) involving the two-point and integral boundary conditions. Some new results on existence and uniqueness of a solution are established by using fixed point theorems. Some illustrative examples are also presented. We extend previous results even in the integer case α = 1.
MIZUMACHI, ERI; MORI, AKIRA; OSAWA, NAOYA; AKIYAMA, REIKO; TOKUCHI, NAOKO
2006-01-01
• Background and Aims Plants have the ability to compensate for damage caused by herbivores. This is important to plant growth, because a plant cannot always avoid damage, even if it has developed defence mechanisms against herbivores. In previous work, we elucidated the herbivory-induced compensatory response of Quercus (at both the individual shoot and whole sapling levels) in both low- and high-nutrient conditions throughout one growing season. In this study, we determine how the compensatory growth of Quercus serrata saplings is achieved at different nutrient levels. • Methods Quercus serrata saplings were grown under controlled conditions. Length, number of leaves and percentage of leaf area lost on all extension units (EUs) were measured. • Key Results Both the probability of flushing and the length of subsequent EUs significantly increased with an increase in the length of the parent EU. The probability of flushing increased with an increase in leaf damage of the parent EU, but the length of subsequent EUs decreased. This indicates that EU growth is fundamentally regulated at the individual EU level. The probabilities of a second and third flush were significantly higher in plants in high-nutrient soil than those in low-nutrient soil. The subsequent EUs of damaged saplings were also significantly longer at high-nutrient conditions. • Conclusions An increase in the probability of flushes in response to herbivore damage is important for damaged saplings to produce new EUs; further, shortening the length of EUs helps to effectively reproduce foliage lost by herbivory. The probability of flushing also varied according to soil nutrient levels, suggesting that the compensatory growth of individual EUs in response to local damage levels is affected by the nutrients available to the whole sapling. PMID:16709576
Bernstein, Andrey; Wang, Cong; Dall'Anese, Emiliano; ...
2018-01-01
This paper considers unbalanced multiphase distribution systems with generic topology and different load models, and extends the Z-bus iterative load-flow algorithm based on a fixed-point interpretation of the AC load-flow equations. Explicit conditions for the existence and uniqueness of load-flow solutions are presented. These conditions also guarantee convergence of the load-flow algorithm to the unique solution. The proposed methodology is applicable to generic systems featuring (i) wye connections; (ii) ungrounded delta connections; (iii) a combination of wye-connected and delta-connected sources/loads; and (iv) a combination of line-to-line and line-to-grounded-neutral devices at the secondary of distribution transformers. Further, a sufficient condition for the non-singularity of the load-flow Jacobian is proposed. Finally, linear load-flow models are derived, and their approximation accuracy is analyzed. Theoretical results are corroborated through experiments on IEEE test feeders.
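The fixed-point view of the Z-bus iteration can be sketched in a simplified single-phase setting (the paper treats unbalanced multiphase systems; the feeder values below are hypothetical):

```python
import numpy as np

def zbus_fixed_point(Z, w, s, iters=50):
    """Single-phase Z-bus load-flow iteration: V <- w + Z @ conj(s / V).

    Z: bus impedance matrix over non-slack buses, w: no-load voltage vector,
    s: complex power injections (loads negative). A simplified illustration
    of the fixed-point map, not the paper's multiphase formulation.
    """
    v = w.copy()
    for _ in range(iters):
        v = w + Z @ np.conj(s / v)
    return v

# Toy 2-bus feeder in per-unit (hypothetical impedances and loads)
Z = np.array([[0.01 + 0.02j, 0.01 + 0.02j],
              [0.01 + 0.02j, 0.02 + 0.04j]])
w = np.array([1.0 + 0.0j, 1.0 + 0.0j])
s = np.array([-0.10 - 0.05j, -0.05 - 0.02j])
v = zbus_fixed_point(Z, w, s)
```

Under light loading the map is a contraction, so the iteration converges to the unique load-flow solution, which is exactly the situation the paper's existence/uniqueness conditions delimit.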
Probabilistic cluster labeling of imagery data
NASA Technical Reports Server (NTRS)
Chittineni, C. B. (Principal Investigator)
1980-01-01
The problem of obtaining the probabilities of class labels for the clusters using spectral and spatial information from a given set of labeled patterns and their neighbors is considered. A relationship is developed between class and clusters conditional densities in terms of probabilities of class labels for the clusters. Expressions are presented for updating the a posteriori probabilities of the classes of a pixel using information from its local neighborhood. Fixed-point iteration schemes are developed for obtaining the optimal probabilities of class labels for the clusters. These schemes utilize spatial information and also the probabilities of label imperfections. Experimental results from the processing of remotely sensed multispectral scanner imagery data are presented.
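The first step, relating cluster-conditional to class-conditional information, can be sketched by estimating P(class | cluster) from co-occurrence counts of labeled pixels; the labels below are hypothetical:

```python
import numpy as np

def cluster_label_probs(cluster_ids, class_ids, n_clusters, n_classes):
    """Estimate P(class | cluster) from co-occurrence counts of labeled pixels."""
    counts = np.zeros((n_clusters, n_classes))
    for c, k in zip(cluster_ids, class_ids):
        counts[c, k] += 1.0
    # Normalize each cluster's row into a probability distribution over classes
    return counts / counts.sum(axis=1, keepdims=True)

# Hypothetical labeled pixels: cluster assignment and true class per pixel
clusters = [0, 0, 0, 1, 1, 2]
classes  = [0, 0, 1, 1, 1, 0]
P = cluster_label_probs(clusters, classes, n_clusters=3, n_classes=2)
```

These row distributions are the quantities the paper's fixed-point schemes refine, using spatial neighborhoods and label-imperfection probabilities rather than raw counts alone.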
Bivariate extreme value distributions
NASA Technical Reports Server (NTRS)
Elshamy, M.
1992-01-01
In certain engineering applications, such as those occurring in the analyses of ascent structural loads for the Space Transportation System (STS), some of the load variables have a lower bound of zero. Thus, the need for practical models of bivariate extreme value probability distribution functions with lower limits was identified. We discuss the Gumbel models and present practical forms of bivariate extreme probability distributions of Weibull and Frechet types with two parameters. Bivariate extreme value probability distribution functions can be expressed in terms of the marginal extreme value distributions and a 'dependence' function subject to certain analytical conditions. Properties of such bivariate extreme distributions, sums and differences of paired extremals, as well as the corresponding forms of conditional distributions, are discussed. Practical estimation techniques are also given.
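One concrete instance of the dependence-function representation is the classical logistic (Gumbel) model. The sketch below uses unit Fréchet margins, a standard textbook form rather than the two-parameter Weibull/Fréchet forms developed in the report.

```python
import math

def bivariate_logistic_frechet(x, y, alpha):
    """Bivariate extreme-value CDF of the logistic type with unit Frechet
    margins F(z) = exp(-1/z), for x, y > 0 and alpha in (0, 1]:
        F(x, y) = exp(-(x**(-1/alpha) + y**(-1/alpha))**alpha).
    alpha = 1 gives independent margins; alpha -> 0 gives complete
    dependence."""
    v = (x ** (-1.0 / alpha) + y ** (-1.0 / alpha)) ** alpha
    return math.exp(-v)
```

Stronger dependence (smaller alpha) raises the joint CDF above the independent product of the margins.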
NASA Astrophysics Data System (ADS)
Lee, C. C.; Chen, W. S.
2018-04-01
The aim of this study is to examine the effects of Es-layer characteristics on spread-F generation in the nighttime midlatitude ionosphere. The Es-layer parameters and spread-F appearance over the 23rd solar cycle (1996-2008) were recorded by the Kokubunji ionosonde. The Es-layer parameters are foEs (critical frequency of the Es-layer), fbEs (blanketing frequency of the Es-layer), and Δf (≡foEs-fbEs). To explore the effects completely, the pre-midnight and post-midnight data are classified by season, solar activity, and geomagnetic condition. Results show that spread-F occurs more frequently post-midnight and in summer, and that the occurrence probabilities of spread-F are greater when solar activity is lower. For the occurrence probabilities of spread-F versus foEs and Δf under geomagnetically quiet conditions, the trend is increasing where the associated probabilities are significant, indicating that spread-F occurrence increases with increasing foEs and/or Δf. Further, the increasing trends demonstrate that polarization electric fields generated in the Es-layer would be conducive to spread-F generation through the electrodynamical coupling of the Es-layer and F-region. Moreover, this electrodynamical coupling is efficient not only under quiet conditions but also under disturbed conditions, since a significant increasing trend can also be found under disturbed conditions. Regarding the occurrence probabilities of spread-F versus fbEs, evident trends are not in the majority, implying that fbEs might not be a major factor in spread-F formation.
Identifying HIV care enrollees at-risk for cannabis use disorder.
Hartzler, Bryan; Carlini, Beatriz H; Newville, Howard; Crane, Heidi M; Eron, Joseph J; Geng, Elvin H; Mathews, W Christopher; Mayer, Kenneth H; Moore, Richard D; Mugavero, Michael J; Napravnik, Sonia; Rodriguez, Benigno; Donovan, Dennis M
2017-07-01
Increased scientific attention given to cannabis in the United States has particular relevance for its domestic HIV care population, given that evidence exists for both cannabis as a therapeutic agent and cannabis use disorder (CUD) as a barrier to antiretroviral medication adherence. It is critical to identify relative risk for CUD among demographic subgroups of HIV patients, as this will inform detection and intervention efforts. A Center for AIDS Research Network of Integrated Clinical Systems cohort (N = 10,652) of HIV-positive adults linked to care at seven United States sites was examined for this purpose. Based on a patient-report instrument with a validated diagnostic threshold for CUD, the prevalence of recent cannabis use and corresponding conditional probabilities for CUD were calculated for the aggregate sample and demographic subgroups. Generalized estimating equations then tested models directly examining patient demographic indices as predictors of CUD, while controlling for history and geography. Conditional probability of CUD among cannabis-using patients was 49%, with the highest conditional probabilities among demographic subgroups of young adults and those with non-specified sexual orientation (67-69%) and the lowest conditional probability among females and those 50+ years of age (42% apiece). Similarly, youthful age and male gender emerged as robust multivariate model predictors of CUD. In the context of increasingly lenient policies for use of cannabis as a therapeutic agent for chronic conditions like HIV/AIDS, current study findings offer needed direction in terms of specifying targeted patient groups in HIV care on whom resources for enhanced surveillance and intervention efforts will be most impactful.
Solving probability reasoning based on DNA strand displacement and probability modules.
Zhang, Qiang; Wang, Xiaobiao; Wang, Xiaojun; Zhou, Changjun
2017-12-01
In computational biology, DNA strand displacement technology is used to simulate the computation process and has shown strong computing ability. Most researchers use it to solve logic problems, but it is only rarely used in probabilistic reasoning. To process probabilistic reasoning, a conditional probability derivation model and a total probability model based on DNA strand displacement were established in this paper. The models were assessed through the game "read your mind" and shown to enable the application of probabilistic reasoning in genetic diagnosis. Copyright © 2017 Elsevier Ltd. All rights reserved.
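Numerically, the two probability modules correspond to the law of total probability and Bayes' rule. A minimal sketch, with made-up numbers standing in for the strand-displacement signal levels:

```python
# Two probability modules in plain arithmetic. The partition {A_i} and
# all numeric values are illustrative only.

def total_probability(p_b_given_a, p_a):
    """Law of total probability: P(B) = sum_i P(B|A_i) P(A_i)."""
    return sum(pb * pa for pb, pa in zip(p_b_given_a, p_a))

def conditional(p_b_given_a, p_a, i):
    """Bayes' rule: P(A_i|B) = P(B|A_i) P(A_i) / P(B)."""
    return p_b_given_a[i] * p_a[i] / total_probability(p_b_given_a, p_a)
```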
Comonotonic bounds on the survival probabilities in the Lee-Carter model for mortality projection
NASA Astrophysics Data System (ADS)
Denuit, Michel; Dhaene, Jan
2007-06-01
In the Lee-Carter framework, future survival probabilities are random variables with an intricate distribution function. In large homogeneous portfolios of life annuities, value-at-risk or conditional tail expectation of the total yearly payout of the company are approximately equal to the corresponding quantities involving random survival probabilities. This paper aims to derive some bounds in the increasing convex (or stop-loss) sense on these random survival probabilities. These bounds are obtained with the help of comonotonic upper and lower bounds on sums of correlated random variables.
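The comonotonic upper bound replaces a sum of dependent variables by a sum of quantile functions driven by a single uniform variable, which dominates the original sum in the increasing convex order. A Monte Carlo sketch with illustrative exponential margins (not the Lee-Carter survival probabilities themselves):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
u = rng.uniform(size=n)

def q_exp(p, lam):
    """Quantile function of an exponential(lam) margin."""
    return -np.log1p(-p) / lam

# Comonotonic sum: one common uniform drives every quantile function.
s_como = q_exp(u, 1.0) + q_exp(u, 2.0)
# Independent sum with the same margins, for comparison.
s_ind = q_exp(u, 1.0) + q_exp(rng.uniform(size=n), 2.0)
```

Both sums share the same margins and mean, but the comonotonic sum has the larger variance and larger stop-loss premiums, which is what makes it a conservative (increasing convex order) bound.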
Concentration and mindfulness meditations: unique forms of consciousness?
Dunn, B R; Hartigan, J A; Mikulas, W L
1999-09-01
Electroencephalographic (EEG) recordings from 19 scalp recording sites were used to differentiate between two posited unique forms of meditation, concentration and mindfulness, and a normal relaxation control condition. Analyses of all traditional frequency bandwidth data (i.e., delta, 1-3 Hz; theta, 4-7 Hz; alpha, 8-12 Hz; beta 1, 13-25 Hz; beta 2, 26-32 Hz) showed strong mean amplitude frequency differences between the two meditation conditions and relaxation over numerous cortical sites. Furthermore, significant differences were obtained between concentration and mindfulness states at all bandwidths. Taken together, our results suggest that concentration and mindfulness "meditations" may be unique forms of consciousness and are not merely degrees of a state of relaxation.
Estimating probabilities of reservoir storage for the upper Delaware River basin
Hirsch, Robert M.
1981-01-01
A technique for estimating conditional probabilities of reservoir system storage is described and applied to the upper Delaware River Basin. The results indicate that there is a 73 percent probability that the three major New York City reservoirs (Pepacton, Cannonsville, and Neversink) would be full by June 1, 1981, and only a 9 percent probability that storage would return to the 'drought warning' sector of the operations curve sometime in the next year. In contrast, if restrictions are lifted and there is an immediate return to normal operating policies, the probability of the reservoir system being full by June 1 is 37 percent and the probability that storage would return to the 'drought warning' sector in the next year is 30 percent. (USGS)
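The kind of "position analysis" behind such statements can be sketched as a Monte Carlo over inflow traces. Capacity, demand, and the inflow statistics below are invented, not the Delaware basin values.

```python
import numpy as np

rng = np.random.default_rng(1)

capacity, demand = 100.0, 6.0   # storage capacity and weekly release (toy units)
weeks = 20                      # horizon to the target date

def p_full(storage0, n_traces=5_000):
    """Estimated conditional probability that, starting from storage0,
    the system refills (hits capacity) at least once before the target
    date, under randomly sampled weekly inflows."""
    full = 0
    for _ in range(n_traces):
        s = storage0
        filled = False
        for _ in range(weeks):
            inflow = max(rng.normal(8.0, 4.0), 0.0)   # sampled weekly inflow
            s = min(max(s + inflow - demand, 0.0), capacity)
            filled = filled or s >= capacity
        full += filled
    return full / n_traces
```

Conditioning on the current storage is what makes the probabilities conditional: rerunning p_full from a different starting storage changes the estimate.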
Structure and biochemical functions of four simian virus 40 truncated large-T antigens.
Chaudry, F; Harvey, R; Smith, A E
1982-01-01
The structure of four abnormal T antigens which are present in different simian virus 40 (SV40)-transformed mouse cell lines was studied by tryptic peptide mapping, partial proteolysis fingerprinting, immunoprecipitation with monoclonal antibodies, and in vitro translation. The results obtained allowed us to deduce that these proteins, which have apparent molecular weights of 15,000, 22,000, 33,000, and 45,000, are truncated forms of large-T antigen extending by different amounts into the amino acid sequences unique to large-T. The proteins are all phosphorylated, probably at a site between amino acids 106 and 123. The mRNAs coding for the proteins probably contain the normal large-T splice but are shorter than the normal transcripts of the SV40 early region. The truncated large-Ts were tested for the ability to bind to double-stranded DNA-cellulose. This showed that the 33,000- and 45,000-molecular-weight polypeptides contained sequences sufficient for binding under the conditions used, whereas the 15,000- and 22,000-molecular-weight forms did not. Together with published data, this allows the tentative mapping of a region of SV40 large-T between amino acids 109 and 272 that is necessary and may be sufficient for the binding to double-stranded DNA-cellulose in vitro. None of the truncated large-T species formed a stable complex with the host cell protein referred to as nonviral T-antigen or p53, suggesting that the carboxy-terminal sequences of large-T are necessary for complex formation. PMID:6292504
Ohad, D G; Avrahami, A; Waner, T; David, L
2013-08-01
The Dogue de Bordeaux (DdB) breed has gone through several genetic 'bottle necks' and has a relatively small effective population size. Importing new stock into Israel has been limited, further narrowing the already restricted local gene-pool and increasing the chances of inherited defects. In 56 DdB dogs examined between 2003 and 2010, the authors sought to study the proportions of congenital subaortic stenosis (SAS) and tricuspid valve dysplasia (TVD). The aim was also to identify a probable mode of inheritance (MOI) using segregation and pedigree analyses of genealogical data available for 13 of 21 DdB dogs diagnosed with these conditions between 2004 and 2007. Among all breeds in the country, the proportion of TVD was highest in the DdB breed, which also displayed the second highest proportion of SAS. Echocardiographic measurements and selected physical examination findings from 26 normal DdB dogs, 18 DdB dogs with SAS, and 12 DdB dogs with TVD are reported. Based on pedigree and segregation analyses, the most probable MOI appeared to be autosomal recessive. Pedigree analyses helped to identify three ancestors that might have introduced these two congenital heart defects into the local DdB population. Excluding those three dogs and their progeny from future mating could therefore reduce the prevalence of these diseases in the DdB population in Israel. The unusual local breeding circumstances may offer a unique opportunity to identify associated SAS and TVD genes in the DdB, as well as in other dog breeds. Copyright © 2013 Elsevier Ltd. All rights reserved.
Chen, Xiao-hong; Motani, Ryosuke; Cheng, Long; Jiang, Da-yong; Rieppel, Olivier
2014-01-01
Parahupehsuchus longus is a new species of marine reptile from the Lower Triassic of Yuan'an County, Hubei Province, China. It is unique among vertebrates for having a body wall that is completely surrounded by a bony tube, about 50 cm long and 6.5 cm deep, comprising overlapping ribs and gastralia. This tube and bony ossicles on the back are best interpreted as anti-predatory features, suggesting that there was predation pressure upon marine tetrapods in the Early Triassic. There is at least one sauropterygian that is sufficiently large to feed on Parahupehsuchus in the Nanzhang-Yuan'an fauna, together with six more species of potential prey marine reptiles with various degrees of body protection. Modern predators of marine tetrapods belong to the highest trophic levels in the marine ecosystem but such predators did not always exist through geologic time. The indication of marine-tetrapod feeding in the Nanzhang-Yuan'an fauna suggests that such a trophic level emerged for the first time in the Early Triassic. The recovery from the end-Permian extinction probably proceeded faster than traditionally thought for marine predators. Parahupehsuchus has superficially turtle-like features, namely expanded ribs without intercostal space, very short transverse processes, and a dorsal outgrowth from the neural spine. However, these features are structurally different from their turtle counterparts. Phylogeny suggests that they are convergent with the condition in turtles, which have a fundamentally different body plan that involves the folding of the body wall. Expanded ribs without intercostal space evolved at least twice and probably even more among reptiles.
The Impact of Racism on the Sexual and Reproductive Health of African American Women
Prather, Cynthia; Fuller, Taleria R.; Marshall, Khiya J.; Jeffries, William L.
2016-01-01
African American women are disproportionately affected by multiple sexual and reproductive health conditions compared with women of other races/ethnicities. Research suggests that social determinants of health, including poverty, unemployment, and limited education, contribute to health disparities. However, racism is a probable underlying determinant of these social conditions. This article uses a socioecological model to describe racism and its impact on African American women’s sexual and reproductive health. Although similar models have been used for specific infectious and chronic diseases, they have not described how the historical underpinnings of racism affect current sexual and reproductive health outcomes among African American women. We propose a socioecological model that demonstrates how social determinants grounded in racism affect individual behaviors and interpersonal relationships, which may contribute to sexual and reproductive health outcomes. This model provides a perspective to understand how these unique contextual experiences are intertwined with the daily lived experiences of African American women and how they are potentially linked to poor sexual and reproductive health outcomes. The model also presents an opportunity to increase dialog and research among public health practitioners and encourages them to consider the role of these contextual experiences and supportive data when developing prevention interventions. Considerations address the provision of opportunities to promote health equity by reducing the effects of racism and improving African American women’s sexual and reproductive health. PMID:27227533
DOE Office of Scientific and Technical Information (OSTI.GOV)
Degos, R.; Duverne, J.; Picot, Ch.
1961-04-01
An unusual case of radiodermatitis in a 58-yr-old woman operated on for basocellular epithelioma of the inner aspect of the right eyelid is described. Postoperative radiation treatment was given as follows: 2000-r doses of unfiltered 50-kv x rays, three times, one week apart, for a total of 6000 r. After 15 days, slight ulceration of the irradiated portion and marked dermatitis of the surrounding skin area were noted. Local application of solution and ointments provided analgesia but did not cure the condition, which spread from the cheek to nose and chin, and was accompanied by seborrhea, over a period of 16 months. Bacterial examination revealed staphylococci, which promptly responded to antibiotic (erythromycin) treatment. A similarity was noted between this patient's condition and the more common reactions of children to scalp radiation treatment for ringworm. Other cases are cited to show occasional unusual side-effects of radiation treatment, and the possibility that the present case represents a unique type of skin response to radiation is discussed. Allergy alone, it is contended, probably does not explain the reaction observed, but a bacterial infection was involved in the reaction. It is concluded that this case and the two others mentioned indicate that other factors are of importance, the nature of which needs to be determined by further study. (BBB)
Sleep Disturbance and Emotion Dysregulation as Transdiagnostic Processes in a Comorbid Sample
Fairholme, Christopher P.; Nosen, Elizabeth L.; Nillni, Yael I.; Schumacher, Julie A.; Tull, Matthew T.; Coffey, Scott F.
2013-01-01
Sleep disturbance and emotion dysregulation have been identified as etiologic and maintaining factors for a range of psychopathology and separate literatures support their relationships to anxiety, depression, PTSD, and alcohol dependence (AD) symptom severity. Previous studies have examined these relationships in isolation, failing to account for the high rates of comorbidity among disorders. It is not yet known whether these processes uniquely predict symptom severity in each of these domains. Participants were 220 patients in residential substance abuse treatment, who had experienced a potentially traumatic event and exceeded screening cutoffs for probable PTSD and problematic alcohol use. Controlling for emotion dysregulation and the interrelationships among the outcome variables, insomnia was uniquely associated with anxiety (B = .27, p < .001), depression (B = .25, p < .001), PTSD (B = .22, p < .001), and AD (B = .17, p = .01) symptom severity. Similarly, controlling for insomnia, emotion dysregulation was uniquely associated with anxiety (B = .40, p < .001), depression (B = .47, p < .001), PTSD (B = .38, p < .001), and AD (B = .26, p < .001) symptom severity. Insomnia and emotion dysregulation appear to be transdiagnostic processes uniquely associated with symptom severity across a number of different domains and might be important treatment targets for individuals with PTSD and AD. PMID:23831496
ERIC Educational Resources Information Center
Hunter, Jennifer L.; Heath, Claudia J.
2017-01-01
This article uses a random digit dial probability sample (N = 328) to examine the relationship between credit card use behaviors and household well-being during a period of severe economic recession: The Great Recession. The ability to measure the role of credit card use during a period of recession provides unique insights to the study of credit…
Diverse Deposits in Melas Chasma
2015-07-29
This scene captured by NASA's Mars Reconnaissance Orbiter includes chaotic deposits with a wide range of colors. The deposits are distinctive, with both unique colors and small-scale textures such as fracture patterns. These are probably sedimentary rocks, transported and deposited in water or air. The original layers may have been jumbled in a landslide. Dark or reddish sand dunes cover some of the bedrock. http://photojournal.jpl.nasa.gov/catalog/PIA19860
Uranium disequilibrium in groundwater: An isotope dilution approach in hydrologic investigations
Osmond, J.K.; Rydell, H.S.; Kaufman, M.I.
1968-01-01
The distribution and environmental disequilibrium patterns of naturally occurring uranium isotopes (U234 and U238) in waters of the Floridan aquifer suggest that variations in the ratios of isotopic activity and concentrations can be used quantitatively to evaluate mixing proportions of waters from differing sources. Uranium is probably unique in its potential for this approach, which seems to have general usefulness in hydrologic investigations.
Uniqueness of large positive solutions
NASA Astrophysics Data System (ADS)
López-Gómez, Julián; Maire, Luis
2017-08-01
We establish the uniqueness of the positive solution of the singular problem (1.1) through some standard comparison techniques involving the maximum principle. Our proofs do not invoke the blow-up rates of the solutions, as in most of the specialized literature. We give two different types of results according to the geometrical properties of Ω and the regularity of ∂Ω. Even in the autonomous case, our theorems are extremely sharp extensions of all existing results. Precisely, when a(x) ≡ 1, it is shown that the monotonicity and superadditivity of f(u) with constant C ≥ 0 entail uniqueness; f is said to be superadditive with constant C ≥ 0 if f(a+b) ≥ f(a) + f(b) - C for all a, b ≥ 0. This condition, introduced by Marcus and Véron (J Evol Equ 3:637-652, 2004), weakens all previous sufficient conditions for uniqueness, as will become apparent in this paper.
NASA Astrophysics Data System (ADS)
Shi, Chenguang; Salous, Sana; Wang, Fei; Zhou, Jianjiang
2017-08-01
Distributed radar network systems have been shown to have many unique features. Due to their advantage of signal and spatial diversities, radar networks are attractive for target detection. In practice, the netted radars in radar networks are supposed to maximize their transmit power to achieve better detection performance, which may be in contradiction with low probability of intercept (LPI). Therefore, this paper investigates the problem of adaptive power allocation for radar networks in a cooperative game-theoretic framework such that the LPI performance can be improved. Taking into consideration both the transmit power constraints and the minimum signal to interference plus noise ratio (SINR) requirement of each radar, a cooperative Nash bargaining power allocation game based on LPI is formulated, whose objective is to minimize the total transmit power by optimizing the power allocation in radar networks. First, a novel SINR-based network utility function is defined and utilized as a metric to evaluate power allocation. Then, with the well-designed network utility function, the existence and uniqueness of the Nash bargaining solution are proved analytically. Finally, an iterative Nash bargaining algorithm is developed that converges quickly to a Pareto optimal equilibrium for the cooperative game. Numerical simulations and theoretic analysis are provided to evaluate the effectiveness of the proposed algorithm.
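The paper's Nash bargaining algorithm is not reproduced here, but the underlying idea of meeting each node's SINR requirement with minimal transmit power can be sketched with the classic SINR-target fixed point (a Foschini-Miljanic-style iteration, named plainly as a stand-in). Gains, targets, and noise below are illustrative.

```python
import numpy as np

# g[i, j] is the assumed channel gain from transmitter j to receiver i;
# gamma holds the minimum SINR requirements; noise is the receiver noise
# power. Iterating p_i <- (gamma_i / SINR_i(p)) * p_i drives every SINR
# to its target with the smallest feasible powers.
g = np.array([[1.0, 0.1, 0.1],
              [0.2, 1.0, 0.1],
              [0.1, 0.2, 1.0]])
gamma = np.array([2.0, 2.0, 2.0])
noise = 0.1

p = np.ones(3)
for _ in range(200):
    interference = g @ p - np.diag(g) * p + noise   # other-node power + noise
    sinr = np.diag(g) * p / interference
    p = gamma / sinr * p                            # scale toward SINR target
```

When the targets are feasible (spectral radius of the normalized-gain matrix below one), the iteration converges to the componentwise-minimal power vector meeting all SINR constraints.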
New evidence for mammaliaform ear evolution and feeding adaptation in a Jurassic ecosystem
NASA Astrophysics Data System (ADS)
Luo, Zhe-Xi; Meng, Qing-Jin; Grossnickle, David M.; Liu, Di; Neander, April I.; Zhang, Yu-Guang; Ji, Qiang
2017-08-01
Stem mammaliaforms are forerunners to modern mammals, and they achieved considerable ecomorphological diversity in their own right. Recent discoveries suggest that eleutherodontids, a subclade of Haramiyida, were more species-rich during the Jurassic period in Asia than previously recognized. Here we report a new Jurassic eleutherodontid mammaliaform with an unusual mosaic of highly specialized characteristics, and the results of phylogenetic analyses that support the hypothesis that haramiyidans are stem mammaliaforms. The new fossil shows fossilized skin membranes that are interpreted to be for gliding and a mandibular middle ear with a unique character combination previously unknown in mammaliaforms. Incisor replacement is prolonged until well after molars are fully erupted, a timing pattern unique to most other mammaliaforms. In situ molar occlusion and a functional analysis reveal a new mode of dental occlusion: dual mortar-pestle occlusion of opposing upper and lower molars, probably for dual crushing and grinding. This suggests that eleutherodontids are herbivorous, and probably specialized for granivory or feeding on soft plant tissues. The inferred dietary adaptation of eleutherodontid gliders represents a remarkable evolutionary convergence with herbivorous gliders in Theria. These Jurassic fossils represent volant, herbivorous stem mammaliaforms associated with pre-angiosperm plants that appear long before the later, iterative associations between angiosperm plants and volant herbivores in various therian clades.
Probabilities for time-dependent properties in classical and quantum mechanics
NASA Astrophysics Data System (ADS)
Losada, Marcelo; Vanni, Leonardo; Laura, Roberto
2013-05-01
We present a formalism which allows one to define probabilities for expressions that involve properties at different times for classical and quantum systems and we study its lattice structure. The formalism is based on the notion of time translation of properties. In the quantum case, the properties involved should satisfy compatibility conditions in order to obtain well-defined probabilities. The formalism is applied to describe the double-slit experiment.
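In the quantum case, such a two-time probability reduces to the Lüders rule plus a unitary time translation. A toy qubit example (the state, projectors, and rotation angle are all invented for illustration):

```python
import numpy as np

rho = np.array([[0.7, 0.0], [0.0, 0.3]], dtype=complex)   # initial state
P1 = np.array([[1, 0], [0, 0]], dtype=complex)            # projector at t1
P2 = np.array([[0.5, 0.5], [0.5, 0.5]], dtype=complex)    # projector at t2
theta = 0.4
U = np.array([[np.cos(theta), -np.sin(theta)],            # time translation
              [np.sin(theta),  np.cos(theta)]], dtype=complex)

p1 = np.trace(P1 @ rho).real                 # P(first outcome at t1)
rho_cond = P1 @ rho @ P1 / p1                # Lueders state transition
rho_t2 = U @ rho_cond @ U.conj().T           # translate to t2
p2_given_1 = np.trace(P2 @ rho_t2).real      # conditional probability
p_joint = p1 * p2_given_1                    # two-time joint probability
```

p_joint is the probability of obtaining the first outcome at t1 and then the second at t2; well-definedness of such expressions is exactly what the compatibility conditions guarantee.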
Low Probability of Intercept Waveforms via Intersymbol Dither Performance Under Multiple Conditions
2009-03-01
Front matter of AFIT thesis AFIT/GE/ENG/09-23 (including the standard disclaimer naming the United States Air Force, Department of Defense, and the United States Government). The waveforms apply a random intersymbol dither: a random variable D governs the distribution of dither values, with an associated probability density function. The thesis assesses the potential performance loss of a non-cooperative receiver compared to a cooperative receiver designed to account for ISI and multipath.
Low Probability of Intercept Waveforms via Intersymbol Dither Performance Under Multipath Conditions
2009-03-01
Front matter of AFIT thesis AFIT/GE/ENG/09-23 (including the standard disclaimer naming the United States Air Force, Department of Defense, and the United States Government). The waveforms apply a random intersymbol dither: a random variable D governs the distribution of dither values, with an associated probability density function. The thesis assesses the potential performance loss of a non-cooperative receiver compared to a cooperative receiver designed to account for ISI and multipath.
The Probability of Exceedance as a Nonparametric Person-Fit Statistic for Tests of Moderate Length
ERIC Educational Resources Information Center
Tendeiro, Jorge N.; Meijer, Rob R.
2013-01-01
To classify an item score pattern as not fitting a nonparametric item response theory (NIRT) model, the probability of exceedance (PE) of an observed response vector x can be determined as the sum of the probabilities of all response vectors that are, at most, as likely as x, conditional on the test's total score. Vector x is to be considered…
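For dichotomous items the PE computation can be written directly: condition on the total score, then sum the probabilities of all equally-scored response vectors that are at most as likely as the observed one. The item success probabilities below are illustrative, not estimated from data.

```python
from itertools import combinations
from math import prod

pi = [0.9, 0.7, 0.5, 0.3]   # assumed item success probabilities for one person

def vector_prob(x):
    """Probability of a 0/1 response vector under local independence."""
    return prod(p if xi else 1 - p for p, xi in zip(pi, x))

def pe(x):
    """Probability of exceedance of x, conditional on its total score."""
    s, n = sum(x), len(x)
    same_score = [tuple(1 if i in ones else 0 for i in range(n))
                  for ones in combinations(range(n), s)]
    total = sum(vector_prob(v) for v in same_score)
    px = vector_prob(x)
    return sum(vector_prob(v) for v in same_score if vector_prob(v) <= px) / total
```

A Guttman-consistent pattern gets PE = 1, while an inverted pattern gets a small PE and would be flagged as misfitting.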
Knock probability estimation through an in-cylinder temperature model with exogenous noise
NASA Astrophysics Data System (ADS)
Bares, P.; Selmanaj, D.; Guardiola, C.; Onder, C.
2018-01-01
This paper presents a new knock model which combines a deterministic knock model based on the in-cylinder temperature with an exogenous noise disturbing this temperature. The autoignition of the end-gas is modelled by an Arrhenius-like function and the knock probability is estimated by propagating a virtual error probability distribution. Results show that the random nature of knock can be explained by uncertainties in the in-cylinder temperature estimation. The model has only one parameter for calibration and thus can be easily adapted online. In order to reduce the measurement uncertainties associated with the air mass flow sensor, the trapped mass is derived from the in-cylinder pressure resonance, which improves the knock probability estimation and reduces the number of sensors needed for the model. A four-stroke SI engine was used for model validation. By varying the intake temperature, the engine speed, the injected fuel mass, and the spark advance, specific tests were conducted which furnished data with various knock intensities and probabilities. The new model is able to predict the knock probability within a sufficient range at various operating conditions. The trapped mass obtained by the acoustical model was compared in steady conditions with a fuel balance and a lambda sensor, and differences below 1% were found.
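The core mechanism (a deterministic Arrhenius criterion driven by a noisy temperature) can be sketched by sampling instead of the paper's analytical propagation. All constants below are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)

A, Ea_over_R = 1.13e4, 7000.0   # assumed Arrhenius pre-factor and activation temperature (K)
sigma_T = 15.0                  # std. dev. of the exogenous temperature noise (K)
threshold = 0.15                # knock-integral threshold (assumed)
window = 0.05                   # end-gas exposure window (s), lumped

def knock_probability(T_mean, n_samples=20_000):
    """Fraction of noisy end-gas temperature samples whose Arrhenius-like
    knock integral exceeds the threshold (Monte Carlo stand-in for the
    paper's propagated error distribution)."""
    T = rng.normal(T_mean, sigma_T, size=n_samples)
    ki = A * np.exp(-Ea_over_R / T) * window
    return float(np.mean(ki > threshold))
```

Because the criterion is deterministic in T, all apparent randomness of knock comes from the temperature noise, mirroring the paper's interpretation.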
The Effects of Framing, Reflection, Probability, and Payoff on Risk Preference in Choice Tasks.
Kühberger; Schulte-Mecklenbeck; Perner
1999-06-01
A meta-analysis of Asian-disease-like studies is presented to identify the factors which determine risk preference. First, the confounding of probability levels, payoffs, and framing conditions is clarified in a task analysis. Then the roles of framing, reflection, probability, and the type and size of payoff are evaluated in a meta-analysis. It is shown that bidirectional framing effects exist for gains and for losses. Presenting outcomes as gains tends to induce risk aversion, while presenting outcomes as losses tends to induce risk seeking. Risk preference is also shown to depend on the size of the payoffs, on the probability levels, and on the type of good at stake (money/property vs human lives). In general, higher payoffs lead to increasing risk aversion. Higher probabilities lead to increasing risk aversion for gains and to increasing risk seeking for losses. These findings are confirmed by a subsequent empirical test. Shortcomings of existing formal theories, such as prospect theory, cumulative prospect theory, venture theory, and Markowitz's utility theory, are identified. It is shown that it is not probabilities or payoffs, but the framing condition, which explains most variance. These findings are interpreted as showing that no linear combination of formally relevant predictors is sufficient to capture the essence of the framing phenomenon. Copyright 1999 Academic Press.
The impact of macroeconomic conditions on obesity in Canada.
Latif, Ehsan
2014-06-01
The paper used longitudinal Canadian data from the National Population Health Survey to estimate the impact of macroeconomic conditions, measured by the provincial unemployment rate, on individual obesity and BMI. To control for individual-specific unobserved heterogeneity, the study utilized conditional fixed-effects logit and fixed-effects models. The study found that the unemployment rate had a significant positive impact on the probability of being severely obese. The study also found that the unemployment rate significantly increased BMI. However, the study did not find any significant impact of the unemployment rate on the probability of being overweight or obese. Copyright © 2013 John Wiley & Sons, Ltd.
Spatial Probability Dynamically Modulates Visual Target Detection in Chickens
Sridharan, Devarajan; Ramamurthy, Deepa L.; Knudsen, Eric I.
2013-01-01
The natural world contains a rich and ever-changing landscape of sensory information. To survive, an organism must be able to flexibly and rapidly locate the most relevant sources of information at any time. Humans and non-human primates exploit regularities in the spatial distribution of relevant stimuli (targets) to improve detection at locations of high target probability. Is the ability to flexibly modify behavior based on visual experience unique to primates? Chickens (Gallus domesticus) were trained on a multiple alternative Go/NoGo task to detect a small, briefly-flashed dot (target) in each of the quadrants of the visual field. When targets were presented with equal probability (25%) in each quadrant, chickens exhibited a distinct advantage for detecting targets at lower, relative to upper, hemifield locations. Increasing the probability of presentation in the upper hemifield locations (to 80%) dramatically improved detection performance at these locations to be on par with lower hemifield performance. Finally, detection performance in the upper hemifield changed on a rapid timescale, improving with successive target detections, and declining with successive detections at the diagonally opposite location in the lower hemifield. These data indicate the action of a process that in chickens, as in primates, flexibly and dynamically modulates detection performance based on the spatial probabilities of sensory stimuli as well as on recent performance history. PMID:23734188
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hansen, Ryan H.; Timm, Andrea C.; Timm, Collin M.; ...
2016-05-06
The structure, function and evolving composition of microbial communities are deeply influenced by the physical and chemical architecture of the local microenvironment. The complexity of this parameter space in naturally occurring systems has made a clear understanding of the key drivers of community development elusive. Here, we examine the role of spatial confinement on community development using a microwell platform that allows for assembly and monitoring of unique microbial communities en masse. This platform was designed to contain microwells with varied size features in order to mimic various levels of spatial confinement found in natural systems. Microbial populations assembled in wells with incrementally smaller size features showed increasingly larger variations in inoculum levels. By exploiting this size dependence, large wells were used to assemble homogeneous initial populations of Pseudomonas aeruginosa, allowing for reproducible, directed growth trajectories. In contrast, smaller wells were used to assemble a heterogeneous range of initial populations, resulting in a variety of growth and decay trajectories. This allowed for parallel screening of single-member communities across different levels of confinement to identify initial conditions in which P. aeruginosa colonies have dramatically higher probabilities of survival. These results demonstrate a unique approach for manipulating the distribution of initial microbial populations assembled into controlled microenvironments to rapidly identify population and environmental parameters conducive or inhibitive to growth. Additionally, multi-member community assembly was characterized to demonstrate the power of this platform for studying the role of member abundance on microbial competition, mutualism and community succession.
Zisk, S.H.; Hodges, C.A.; Moore, H.J.; Shorthill, R.W.; Thompson, T.W.; Whitaker, E.A.; Wilhelms, D.E.
1977-01-01
The region including the Aristarchus Plateau and Montes Harbinger is probably the most geologically diverse of any area of comparable size on the Moon. This part of the northwest quadrant of the lunar near side includes unique dark mantling material; both the densest concentration and the largest of the sinuous rilles; apparent volcanic vents, sinks, and domes; mare materials of various ages and colors; one of the freshest large craters (Aristarchus) with ejecta having unique colors and albedos; and three other large craters in different states of flooding and degradation (Krieger, Herodotus, and Prinz). The three best-authenticated lunar transient phenomena were also observed here. This study is based principally on photographic and remote-sensing observations made from Earth and Apollo orbiting spacecraft. Results include (1) delineation of geologic map units and their stratigraphic relationships; (2) discussion of the complex interrelationships between materials of volcanic and impact origin, including the effects of excavation, redistribution and mixing of previously deposited materials by younger impact craters; (3) deduction of physical and chemical properties of certain of the geologic units, based both on the remote-sensing information and on extrapolation of Apollo data to this area; and (4) development of a detailed geologic history of the region, outlining the probable sequence of events that resulted in its present appearance. A primary concern of the investigation has been the anomalous red dark mantle on the Plateau. Based on an integration of Earth- and lunar-orbit-based data, this layer seems to consist of fine-grained, block-free material containing a relatively large fraction of orange glass. It is probably of pyroclastic origin, laid down at some time during the Imbrian period of mare flooding. © 1977 D. Reidel Publishing Company.
Jaspers, Ellen; Balsters, Joshua H; Kassraian Fard, Pegah; Mantini, Dante; Wenderoth, Nicole
2017-03-01
Over the last decade, structure-function relationships have begun to encompass networks of brain areas rather than individual structures. For example, corticostriatal circuits have been associated with sensorimotor, limbic, and cognitive information processing, and damage to these circuits has been shown to produce unique behavioral outcomes in Autism, Parkinson's Disease, Schizophrenia and healthy ageing. However, it remains an open question how abnormal or absent connectivity can be detected at the individual level. Here, we provide a method for clustering gross morphological structures into subregions with unique functional connectivity fingerprints, and generate network probability maps usable as a baseline to compare individual cases against. We used connectivity metrics derived from resting-state fMRI (N = 100), in conjunction with hierarchical clustering methods, to parcellate the striatum into functionally distinct clusters. We identified three highly reproducible striatal subregions, across both hemispheres and in an independent replication dataset (N = 100) (dice-similarity values 0.40-1.00). Each striatal seed region resulted in a highly reproducible distinct connectivity fingerprint: the putamen showed predominant connectivity with cortical and cerebellar sensorimotor and language processing areas; the ventromedial striatum cluster had a distinct limbic connectivity pattern; the caudate showed predominant connectivity with the thalamus, frontal and occipital areas, and the cerebellum. Our corticostriatal probability maps agree with existing connectivity data in humans and non-human primates, and showed a high degree of replication. We believe that these maps offer an efficient tool to further advance hypothesis driven research and provide important guidance when investigating deviant connectivity in neurological patient populations suffering from e.g., stroke or cerebral palsy. Hum Brain Mapp 38:1478-1491, 2017. © 2016 Wiley Periodicals, Inc. 
Familiarity with breeding habitat improves daily survival in colonial cliff swallows
BROWN, CHARLES R.; BROWN, MARY BOMBERGER; BRAZEAL, KATHLEEN R.
2008-01-01
One probable cost of dispersing to a new breeding habitat is unfamiliarity with local conditions such as the whereabouts of food or the habits of local predators, and consequently immigrants may have lower probabilities of survival than more experienced residents. Within a breeding season, estimated daily survival probabilities of cliff swallows (Petrochelidon pyrrhonota) at colonies in southwestern Nebraska were highest for birds that had always nested at the same site, followed by those for birds that had nested there in some (but not all) past years. Daily survival probabilities were lowest for birds that were naïve immigrants to a colony site and for yearling birds that were nesting for the first time. Birds with past experience at a colony site had monthly survival 8.6% greater than that of naïve immigrants. All colonies where experienced residents did better than immigrants were smaller than 750 nests in size, and in colonies greater than 750 nests, naïve immigrants paid no survival costs relative to experienced residents. Removal of nest ectoparasites by fumigation resulted in higher survival probabilities for all birds, on average, and diminished the differences between immigrants and past residents, probably by improving bird condition to the extent that effects of past experience were relatively less important and harder to detect. The greater survival of experienced residents could not be explained by condition or territory quality, suggesting that familiarity with a local area confers survival advantages during the breeding season for cliff swallows. Colonial nesting may help to moderate the cost of unfamiliarity with an area, likely through social transfer of information about food sources and enhanced vigilance in large groups. PMID:19802326
Dynamic prediction of patient outcomes during ongoing cardiopulmonary resuscitation.
Kim, Joonghee; Kim, Kyuseok; Callaway, Clifton W; Doh, Kibbeum; Choi, Jungho; Park, Jongdae; Jo, You Hwan; Lee, Jae Hyuk
2017-02-01
The probability of the return of spontaneous circulation (ROSC) and subsequent favourable outcomes changes dynamically during advanced cardiac life support (ACLS). We sought to model these changes using time-to-event analysis in out-of-hospital cardiac arrest (OHCA) patients. Adult (≥18 years old), non-traumatic OHCA patients without prehospital ROSC were included. Utstein variables and initial arterial blood gas measurements were used as predictors. The incidence rate of ROSC during the first 30min of ACLS in the emergency department (ED) was modelled using spline-based parametric survival analysis. Conditional probabilities of subsequent outcomes after ROSC (1-week and 1-month survival and 6-month neurologic recovery) were modelled using multivariable logistic regression. The ROSC and conditional probability models were then combined to estimate the likelihood of achieving ROSC and subsequent outcomes by providing k additional minutes of effort. A total of 727 patients were analyzed. The incidence rate of ROSC increased rapidly until the 10th minute of ED ACLS, and it subsequently decreased. The conditional probabilities of subsequent outcomes after ROSC were also dependent on the duration of resuscitation with odds ratios for 1-week and 1-month survival and neurologic recovery of 0.93 (95% CI: 0.90-0.96, p<0.001), 0.93 (0.88-0.97, p=0.001) and 0.93 (0.87-0.99, p=0.031) per 1-min increase, respectively. Calibration testing of the combined models showed good correlation between mean predicted probability and actual prevalence. The probability of ROSC and favourable subsequent outcomes changed according to a multiphasic pattern over the first 30min of ACLS, and modelling of the dynamic changes was feasible. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Garriguet, Didier
2016-04-01
Estimates of the prevalence of adherence to physical activity guidelines in the population are generally the result of averaging individual probabilities of adherence based on the number of days people meet the guidelines and the number of days they are assessed. Given this number of active and inactive days (days assessed minus days active), the conditional probability of meeting the guidelines that has been used in the past is a Beta(1 + active days, 1 + inactive days) distribution, assuming the probability p of a day being active is bounded by 0 and 1 and averages 50%. A change in the assumption about the distribution of p is required to better match the discrete nature of the data and to better assess the probability of adherence when the percentage of active days in the population differs from 50%. Using accelerometry data from the Canadian Health Measures Survey, the probability of adherence to physical activity guidelines is estimated using a conditional probability given the number of active and inactive days distributed as a Beta-binomial(n, α + active days, β + inactive days), assuming that p is randomly distributed as Beta(α, β), where the parameters α and β are estimated by maximum likelihood. The resulting Beta-binomial distribution is discrete. For children aged 6 or older, the probability of meeting physical activity guidelines 7 out of 7 days is similar to published estimates. For pre-schoolers, the Beta-binomial distribution yields higher estimates of adherence to the guidelines than the Beta distribution, in line with the probability of being active on any given day. In estimating the probability of adherence to physical activity guidelines, the Beta-binomial distribution has several advantages over the previously used Beta distribution. It is a discrete distribution and maximizes the richness of accelerometer data.
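The conditional Beta-binomial estimate described above can be sketched in a few lines. The prior parameters alpha and beta below are illustrative assumptions, not the maximum-likelihood values fitted to the survey data.

```python
# Sketch of the conditional Beta-binomial adherence estimate.
# alpha and beta are hypothetical; the study fits them by maximum likelihood.
from scipy.stats import betabinom

alpha, beta = 2.0, 2.0   # assumed prior Beta(alpha, beta) for p
active, assessed = 5, 7  # individual assessed on 7 days, active on 5
inactive = assessed - active

# Conditional distribution of active days in a 7-day week:
# Beta-binomial(7, alpha + active days, beta + inactive days)
p_meet_7_of_7 = betabinom.pmf(7, 7, alpha + active, beta + inactive)
print(f"P(active 7/7 days) = {p_meet_7_of_7:.3f}")
```

Because the Beta-binomial is discrete, the adherence probability is read directly off the probability mass function rather than integrated from a continuous Beta density.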
Aldars-García, Laila; Berman, María; Ortiz, Jordi; Ramos, Antonio J; Marín, Sonia
2018-06-01
The probability of growth and aflatoxin B1 (AFB1) production of 20 isolates of Aspergillus flavus was studied using a full factorial design with eight water activity levels (0.84-0.98 a_w) and six temperature levels (15-40 °C). Binary data obtained from growth studies were modelled using linear logistic regression analysis as a function of temperature, water activity and time for each isolate. In parallel, AFB1 was extracted at different times from newly formed colonies (up to 20 mm in diameter). Although a total of 950 AFB1 values over time for all conditions studied were recorded, they were not considered to be enough to build probability models over time, and therefore, only models at 30 days were built. The confidence intervals of the regression coefficients of the probability of growth models showed some differences among the 20 growth models. Further, to assess the growth/no-growth and AFB1/no-AFB1 production boundaries, 0.05 and 0.5 probabilities were plotted at 30 days for all of the isolates. The boundaries for growth and AFB1 showed that, in general, the conditions for growth were wider than those for AFB1 production. The probability of growth and AFB1 production seemed to be less variable among isolates than AFB1 accumulation. Apart from the AFB1 production probability models, using growth probability models for AFB1 probability predictions could be, although conservative, a suitable alternative. Predictive mycology should include a number of isolates to generate data to build predictive models and take into account the genetic diversity of the species and thus make predictions as similar as possible to real fungal food contamination. Copyright © 2017 Elsevier Ltd. All rights reserved.
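A growth/no-growth boundary of the kind plotted above can be illustrated with a linear logistic model in temperature and water activity. The data and coefficients below are synthetic assumptions, not the fitted isolate models from the study.

```python
# Illustrative growth/no-growth boundary from a linear logistic model.
# Training data are synthetic; real models use per-isolate growth records.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
T = rng.uniform(15, 40, 400)        # temperature, deg C
aw = rng.uniform(0.84, 0.98, 400)   # water activity
# Assumed rule: growth more likely at high temperature and high a_w
logit = 0.3 * (T - 25) + 120 * (aw - 0.90)
y = rng.random(400) < 1 / (1 + np.exp(-logit))

X = np.column_stack([T, aw])
model = LogisticRegression(C=100.0, max_iter=1000).fit(X, y)
b0, (b1, b2) = model.intercept_[0], model.coef_[0]

def aw_boundary(temp, p=0.5):
    """Water activity at which P(growth) = p for a given temperature."""
    return (np.log(p / (1 - p)) - b0 - b1 * temp) / b2

print(f"a_w at the p = 0.5 boundary, 30 degC: {aw_boundary(30.0):.3f}")
```

Plotting `aw_boundary(temp, 0.5)` and `aw_boundary(temp, 0.05)` over 15-40 °C reproduces the two probability contours used to delimit the growth region.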
Traveling waves in the discrete fast buffered bistable system.
Tsai, Je-Chiang; Sneyd, James
2007-11-01
We study the existence and uniqueness of traveling wave solutions of the discrete buffered bistable equation. Buffered excitable systems are used to model, among other things, the propagation of waves of increased calcium concentration, and discrete models are often used to describe the propagation of such waves across multiple cells. We derive necessary conditions for the existence of waves, and, under some restrictive technical assumptions, we derive sufficient conditions. When the wave exists it is unique and stable.
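For concreteness, the unbuffered discrete bistable (Nagumo) lattice underlying such models can be integrated directly; a front between the two stable states then propagates across cells. The coupling d, threshold a, and grid size below are illustrative choices, and the buffering terms studied in the paper are omitted.

```python
# Minimal discrete bistable (Nagumo) lattice, without buffers:
#   du_i/dt = d (u_{i+1} - 2 u_i + u_{i-1}) + u_i (1 - u_i)(u_i - a)
import numpy as np

N, d, a = 200, 1.0, 0.3   # lattice size, coupling, threshold (assumed)
u = np.zeros(N)
u[:N // 2] = 1.0          # initial front between the two stable states

dt = 0.05
for _ in range(4000):     # explicit Euler integration to t = 200
    lap = np.roll(u, -1) - 2 * u + np.roll(u, 1)
    lap[0] = u[1] - u[0]          # no-flux boundary at the ends
    lap[-1] = u[-2] - u[-1]
    u += dt * (d * lap + u * (1 - u) * (u - a))

front = int(np.argmax(u < 0.5))   # first cell still near the rest state
print("front position:", front)
```

With a < 1/2 the u = 1 state invades, so the front index exceeds its starting position N/2; shrinking d far enough produces the propagation failure characteristic of discrete media.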
75 FR 80866 - Credit Rating Standardization Study
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-23
... ratings using identical terms; standardizing the market stress conditions under which ratings are... probabilities and loss expectations under standardized conditions of economic stress; and standardizing credit... identical terms; (B) standardizing the market stress conditions under which ratings are evaluated; (C...
The Negated Conditional: A Litmus Test for the Suppositional Conditional?
ERIC Educational Resources Information Center
Handley, Simon J.; Evans, Jonathan St. B. T.; Thompson, Valerie A.
2006-01-01
Under the suppositional account of conditionals, when people think about a conditional assertion, "if p then q," they engage in a mental simulation in which they imagine p holds and evaluate the probability that q holds under this supposition. One implication of this account is that belief in a conditional equates to conditional probability…
Computer-aided diagnosis with potential application to rapid detection of disease outbreaks.
Burr, Tom; Koster, Frederick; Picard, Rick; Forslund, Dave; Wokoun, Doug; Joyce, Ed; Brillman, Judith; Froman, Phil; Lee, Jack
2007-04-15
Our objectives are to quickly interpret symptoms of emergency patients to identify likely syndromes and to improve population-wide disease outbreak detection. We constructed a database of 248 syndromes, each syndrome having an estimated probability of producing any of 85 symptoms, with some two-way, three-way, and five-way probabilities reflecting correlations among symptoms. Using these multi-way probabilities in conjunction with an iterative proportional fitting algorithm allows estimation of full conditional probabilities. Combining these conditional probabilities with misdiagnosis error rates and incidence rates via Bayes theorem, the probability of each syndrome is estimated. We tested a prototype of computer-aided differential diagnosis (CADDY) on simulated data and on more than 100 real cases, including West Nile Virus, Q fever, SARS, anthrax, plague, tularaemia and toxic shock cases. We conclude that: (1) it is important to determine whether the unrecorded positive status of a symptom means that the status is negative or that the status is unknown; (2) inclusion of misdiagnosis error rates produces more realistic results; (3) the naive Bayes classifier, which assumes all symptoms behave independently, is slightly outperformed by CADDY, which includes available multi-symptom information on correlations; as more information regarding symptom correlations becomes available, the advantage of CADDY over the naive Bayes classifier should increase; (4) overlooking low-probability, high-consequence events is less likely if the standard output summary is augmented with a list of rare syndromes that are consistent with observed symptoms, and (5) accumulating patient-level probabilities across a larger population can aid in biosurveillance for disease outbreaks. c 2007 John Wiley & Sons, Ltd.
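The naive Bayes baseline that CADDY is compared against can be sketched as follows: per-syndrome symptom probabilities are combined with incidence rates via Bayes' theorem, assuming symptom independence. The syndromes, symptoms, and all numbers below are invented for illustration.

```python
# Naive Bayes posterior over syndromes given observed symptoms.
# All probabilities are made-up illustrative values.
incidence = {"flu": 0.05, "q_fever": 0.001}   # assumed prior incidence rates
p_symptom = {  # assumed P(symptom present | syndrome)
    "flu":     {"fever": 0.90, "cough": 0.80, "rash": 0.05},
    "q_fever": {"fever": 0.95, "cough": 0.30, "rash": 0.10},
}

def posterior(observed):
    """P(syndrome | symptoms) under the independence assumption."""
    scores = {}
    for syndrome, prior in incidence.items():
        likelihood = prior
        for symptom, present in observed.items():
            p = p_symptom[syndrome][symptom]
            likelihood *= p if present else (1 - p)
        scores[syndrome] = likelihood
    total = sum(scores.values())
    return {s: v / total for s, v in scores.items()}

post = posterior({"fever": True, "cough": True, "rash": False})
print(post)
```

CADDY improves on this by replacing the independence assumption with multi-way symptom probabilities filled in by iterative proportional fitting, and by folding in misdiagnosis error rates.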
Realistic Clocks for a Universe Without Time
NASA Astrophysics Data System (ADS)
Bryan, K. L. H.; Medved, A. J. M.
2018-01-01
There are a number of problematic features within the current treatment of time in physical theories, including the "timelessness" of the Universe as encapsulated by the Wheeler-DeWitt equation. This paper considers one particular investigation into resolving this issue: a conditional probability interpretation that was first proposed by Page and Wootters. Those authors addressed the apparent timelessness by subdividing a faux Universe into two entangled parts, "the clock" and "the remainder of the Universe", and then synchronizing the effective dynamics of the two subsystems by way of conditional probabilities. The current treatment focuses on the possibility of using a (somewhat) realistic clock system; namely, a coherent-state description of a damped harmonic oscillator. This clock proves to be consistent with the conditional probability interpretation; in particular, a standard evolution operator is identified with the position of the clock playing the role of time for the rest of the Universe. Restrictions on the damping factor are determined and, perhaps contrary to expectations, the optimal choice of clock is not necessarily one of minimal damping.
Ashoub, Ahmed; Müller, Niels; Jiménez-Gómez, José M; Brüggemann, Wolfgang
2018-05-01
Under field conditions, drought and heat stress typically occur simultaneously, and their negative impact on agricultural production is expected to increase worldwide under the climate change scenario. In this study, we performed RNA-sequencing analysis on leaves of wild barley (Hordeum spontaneum) originating from the northern coastal region of Egypt following individual drought acclimation (DA) and heat shock (HS) treatments and their combination (CS, combined stresses) to distinguish the unique and shared differentially expressed genes (DEGs). Results indicated that the number of unique genes differentially expressed following the HS treatment exceeded the number of those expressed following DA. In addition, the number of genes uniquely differentially expressed in response to the CS treatment exceeded the number shared with the responses to the individual DA and HS treatments. These results indicate a better adaptation of the Mediterranean wild barley to drought conditions than to heat stress. They also show that the wild barley response to CS tends to be unique rather than shared. Annotation of the DEGs showed that metabolic processes were the biological function most influenced by the applied stresses. © 2017 Scandinavian Plant Physiology Society.
Personomics: The Missing Link in the Evolution from Precision Medicine to Personalized Medicine.
Ziegelstein, Roy C
2017-10-16
Clinical practice guidelines have been developed for many common conditions based on data from randomized controlled trials. When medicine is informed solely by clinical practice guidelines, however, the patient is not treated as an individual, but rather a member of a group. Precision medicine, as defined herein, characterizes unique biological characteristics of the individual or of specimens obtained from an individual to tailor diagnostics and therapeutics to a specific patient. These unique biological characteristics are defined by the tools of precision medicine: genomics, proteomics, metabolomics, epigenomics, pharmacogenomics, and other "-omics." Personalized medicine, as defined herein, uses additional information about the individual derived from knowing the patient as a person. These unique personal characteristics are defined by tools known as personomics which takes into account an individual's personality, preferences, values, goals, health beliefs, social support network, financial resources, and unique life circumstances that affect how and when a given health condition will manifest in that person and how that condition will respond to treatment. In this paradigm, precision medicine may be considered a necessary step in the evolution of medical care to personalized medicine, with personomics as the missing link.
Conditional Independence in Applied Probability.
ERIC Educational Resources Information Center
Pfeiffer, Paul E.
This material assumes the user has the background provided by a good undergraduate course in applied probability. It is felt that introductory courses in calculus, linear algebra, and perhaps some differential equations should provide the requisite experience and proficiency with mathematical concepts, notation, and argument. The document is…
NASA Astrophysics Data System (ADS)
Rong, Ying; Wen, Huiying
2018-05-01
In this paper, the appearing probability of a truck is introduced and an extended car-following model is presented to analyze traffic flow under a honking environment, based on the consideration of driver characteristics. The stability condition of the proposed model is obtained through linear stability analysis. In order to study the evolution properties of the traffic wave near the critical point, the mKdV equation is derived by the reductive perturbation method. The results show that traffic flow becomes more disordered as the appearing probability of a truck grows. Besides, the appearance of a leading truck affects not only the stability of traffic flow, but also the effect of other factors on traffic flow, such as the driver's reaction and the honk effect. Their effects on traffic flow are closely correlated with the appearing probability of a truck. Finally, numerical simulations under the periodic boundary condition are carried out to verify the proposed model, and they are consistent with the theoretical findings.
Asymptotic Equivalence of Probability Measures and Stochastic Processes
NASA Astrophysics Data System (ADS)
Touchette, Hugo
2018-03-01
Let P_n and Q_n be two probability measures representing two different probabilistic models of some system (e.g., an n-particle equilibrium system, a set of random graphs with n vertices, or a stochastic process evolving over a time n) and let M_n be a random variable representing a "macrostate" or "global observable" of that system. We provide sufficient conditions, based on the Radon-Nikodym derivative of P_n and Q_n, for the set of typical values of M_n obtained relative to P_n to be the same as the set of typical values obtained relative to Q_n in the limit n→ ∞. This extends to general probability measures and stochastic processes the well-known thermodynamic-limit equivalence of the microcanonical and canonical ensembles, related mathematically to the asymptotic equivalence of conditional and exponentially-tilted measures. In this more general sense, two probability measures that are asymptotically equivalent predict the same typical or macroscopic properties of the system they are meant to model.
Wysocki, Andrea; Kane, Robert L; Golberstein, Ezra; Dowd, Bryan; Lum, Terry; Shippee, Tetyana
2014-06-01
To compare the probability of experiencing a potentially preventable hospitalization (PPH) between older dual-eligible Medicaid home and community-based service (HCBS) users and nursing home residents. Three years of Medicaid and Medicare claims data (2003-2005) from seven states, linked to area characteristics from the Area Resource File. A primary diagnosis of an ambulatory care sensitive condition on the inpatient hospital claim was used to identify PPHs. We used inverse probability of treatment weighting to mitigate the potential selection of HCBS versus nursing home use. The most frequent conditions accounting for PPHs were the same among the HCBS users and nursing home residents and included congestive heart failure, pneumonia, chronic obstructive pulmonary disease, urinary tract infection, and dehydration. Compared to nursing home residents, elderly HCBS users had an increased probability of experiencing both a PPH and a non-PPH. HCBS users' increased probability of both potentially preventable and non-preventable hospitalizations suggests a need for more proactive integration of medical and long-term care. © Health Research and Educational Trust.
Hawkes-diffusion process and the conditional probability of defaults in the Eurozone
NASA Astrophysics Data System (ADS)
Kim, Jungmu; Park, Yuen Jung; Ryu, Doojin
2016-05-01
This study examines market information embedded in the European sovereign CDS (credit default swap) market by analyzing the sovereign CDSs of 13 Eurozone countries from January 1, 2008, to February 29, 2012, which includes the recent Eurozone debt crisis period. We design the conditional probability of defaults for the CDS prices based on the Hawkes-diffusion process and obtain the theoretical prices of CDS indexes. To estimate the model parameters, we calibrate the model prices to empirical prices obtained from individual sovereign CDS term structure data. The estimated parameters clearly explain both cross-sectional and time-series data. Our empirical results show that the probability of a huge loss event sharply increased during the Eurozone debt crisis, indicating a contagion effect. Even countries with strong and stable economies, such as Germany and France, suffered from the contagion effect. We also find that the probability of small events is sensitive to the state of the economy, spiking several times due to the global financial crisis and the Greek government debt crisis.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yilmaz, Şeyda, E-mail: seydayilmaz@ktu.edu.tr; Bayrak, Erdem, E-mail: erdmbyrk@gmail.com; Bayrak, Yusuf, E-mail: bayrak@ktu.edu.tr
In this study we examined and compared three different probabilistic distribution methods to determine the most suitable model for the probabilistic assessment of earthquake hazards. We analyzed a reliable homogeneous earthquake catalogue for the period 1900-2015 for magnitudes M ≥ 6.0 and estimated the probabilistic seismic hazard in the North Anatolian Fault zone (39°-41° N, 30°-40° E) using three distribution methods, namely the Weibull distribution, the Frechet distribution and the three-parameter Weibull distribution. The suitability of the distribution parameters was evaluated using the Kolmogorov-Smirnov (K-S) goodness-of-fit test. We also compared the estimated cumulative probability and the conditional probabilities of occurrence of earthquakes for different elapsed times using these three distribution methods. We used Easyfit and Matlab software to calculate the distribution parameters and plotted the conditional probability curves. We concluded that the Weibull distribution method was more suitable than the other distribution methods in this region.
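The conditional probability compared above is the renewal-model quantity P(event within a window | quiet for the elapsed time), computed from the inter-event distribution's CDF. The sketch below uses a two-parameter Weibull with illustrative shape and scale values, not the parameters fitted for the North Anatolian Fault zone.

```python
# Conditional earthquake occurrence probability under a Weibull
# inter-event time model. Shape/scale values are assumptions.
import math

def weibull_cdf(t, shape, scale):
    return 1.0 - math.exp(-((t / scale) ** shape))

def conditional_prob(elapsed, window, shape, scale):
    """P(event within `window` years | no event for `elapsed` years)."""
    f_t = weibull_cdf(elapsed, shape, scale)
    f_tw = weibull_cdf(elapsed + window, shape, scale)
    return (f_tw - f_t) / (1.0 - f_t)

shape, scale = 1.8, 40.0  # hypothetical fit for M >= 6.0 recurrence
for elapsed in (10, 30, 50):
    p = conditional_prob(elapsed, 10, shape, scale)
    print(f"elapsed {elapsed:>2} yr: P(event in next 10 yr) = {p:.2f}")
```

Because the assumed shape parameter exceeds 1, the hazard rate increases with time, so the conditional probability grows with elapsed time since the last event.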
Probability Forecasting Using Monte Carlo Simulation
NASA Astrophysics Data System (ADS)
Duncan, M.; Frisbee, J.; Wysack, J.
2014-09-01
Space Situational Awareness (SSA) is defined as the knowledge and characterization of all aspects of space. SSA is now a fundamental and critical component of space operations. Increased dependence on our space assets has in turn led to a greater need for accurate, near real-time knowledge of all space activities. With the growth of the orbital debris population, satellite operators are performing collision avoidance maneuvers more frequently. Frequent maneuver execution expends fuel and reduces the operational lifetime of the spacecraft. Thus new, more sophisticated collision-threat characterization methods are needed. The collision probability metric is used operationally to quantify the collision risk. The collision probability is typically calculated days into the future, so that high-risk and potentially high-risk conjunction events are identified early enough to develop an appropriate course of action. As the time horizon to the conjunction event is reduced, the collision probability changes. A significant change in the collision probability will change the satellite mission stakeholder's course of action. So constructing a method for estimating how the collision probability will evolve improves operations by providing satellite operators with a new piece of information, namely an estimate or 'forecast' of how the risk will change as time to the event is reduced. Collision probability forecasting is a predictive process where the future risk of a conjunction event is estimated. The method utilizes a Monte Carlo simulation that produces a likelihood distribution for a given collision threshold. Using known state and state uncertainty information, the simulation generates a set of possible trajectories for a given space object pair. Each new trajectory produces a unique event geometry at the time of close approach. Given state uncertainty information for both objects, a collision probability value can be computed for every trial.
This yields a collision probability distribution given known, predicted uncertainty. This paper presents the details of the collision probability forecasting method. We examine various conjunction event scenarios and numerically demonstrate the utility of this approach in typical event scenarios. We explore the utility of a probability-based track scenario simulation that models expected tracking data frequency as the tasking levels are increased. The resulting orbital uncertainty is subsequently used in the forecasting algorithm.
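The core Monte Carlo step can be sketched as follows. This is a simplified stand-in for the paper's method, not its implementation: it perturbs only the positions of the two objects at the time of close approach (the means, covariances, and hard-body radius below are hypothetical values) and counts the fraction of trials whose miss distance falls below the combined hard-body radius.

```python
import numpy as np

def mc_collision_probability(mean_a, cov_a, mean_b, cov_b,
                             hard_body_radius, n_trials, rng):
    """Monte Carlo collision probability at close approach: draw perturbed
    positions from each object's uncertainty and count trials whose miss
    distance is below the combined hard-body radius."""
    pos_a = rng.multivariate_normal(mean_a, cov_a, size=n_trials)
    pos_b = rng.multivariate_normal(mean_b, cov_b, size=n_trials)
    miss = np.linalg.norm(pos_a - pos_b, axis=1)
    return np.mean(miss < hard_body_radius)

rng = np.random.default_rng(1)
mean_a = np.zeros(3)                     # km, object A at nominal close approach
mean_b = np.array([0.05, 0.0, 0.0])      # km, 50 m predicted miss distance
cov = np.diag([0.02, 0.05, 0.01]) ** 2   # km^2, illustrative position covariance
p_c = mc_collision_probability(mean_a, cov, mean_b, cov,
                               hard_body_radius=0.01, n_trials=200_000, rng=rng)
```

Re-running this estimate with covariances shrunk to reflect anticipated tracking updates gives the kind of probability "forecast" the paper describes as the time horizon to the event shortens.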
Adverse Housing Conditions and Early-Onset Delinquency.
Jackson, Dylan B; Newsome, Jamie; Lynch, Kellie R
2017-09-01
Housing constitutes an important health resource for children. Research has revealed that, when housing conditions are unfavorable, they can interfere with child health, academic performance, and cognition. Little to no research, however, has considered whether adverse housing conditions and early-onset delinquency are significantly associated with one another. This study explores the associations between structural and non-structural housing conditions and delinquent involvement during childhood. Data from the Fragile Families and Child Wellbeing Study (FFCWS) were employed in this study. Each adverse housing condition was significantly associated with early-onset delinquency. Even so, disarray and deterioration were only significantly linked to early delinquent involvement in the presence of health/safety hazards. The predicted probability of early-onset delinquency among children exposed to housing risks in the presence of health/safety hazards was nearly three times as large as the predicted probability of early-onset delinquency among children exposed only to disarray and/or deterioration, and nearly four times as large as the predicted probability of early-onset delinquency among children exposed to none of the adverse housing conditions. The findings suggest that minimizing housing-related health/safety hazards among at-risk subsets of the population may help to alleviate other important public health concerns, particularly early-onset delinquency. Addressing household health/safety hazards may represent a fruitful avenue for public health programs aimed at the prevention of early-onset delinquency. © Society for Community Research and Action 2017.
Cytologic diagnosis: expression of probability by clinical pathologists.
Christopher, Mary M; Hotz, Christine S
2004-01-01
Clinical pathologists use descriptive terms or modifiers to express the probability or likelihood of a cytologic diagnosis. Words are imprecise in meaning, however, and may be used and interpreted differently by pathologists and clinicians. The goals of this study were to 1) assess the frequency of use of 18 modifiers, 2) determine the probability of a positive diagnosis implied by the modifiers, 3) identify preferred modifiers for different levels of probability, 4) ascertain the importance of factors that affect expression of diagnostic certainty, and 5) evaluate differences based on gender, employment, and experience. We surveyed 202 clinical pathologists who were board-certified by the American College of Veterinary Pathologists (Clinical Pathology). Surveys were distributed in October 2001 and returned by e-mail, fax, or surface mail over a 2-month period. Results were analyzed by parametric and nonparametric tests. Survey response rate was 47.5% (n = 96) and primarily included clinical pathologists at veterinary schools (n = 58) and diagnostic laboratories (n = 31). Eleven of 18 terms were used "often" or "sometimes" by ≥ 50% of respondents. Broad variability was found in the probability assigned to each term, especially those with median values of 75 to 90%. Preferred modifiers for 7 numerical probabilities ranging from 0 to 100% included 68 unique terms; however, a set of 10 terms was used by ≥ 50% of respondents. Cellularity and quality of the sample, experience of the pathologist, and implications of the diagnosis were the most important factors affecting the expression of probability. Because of wide discrepancy in the implied likelihood of a diagnosis using words, defined terminology and controlled vocabulary may be useful in improving communication and the quality of data in cytology reporting.
Integrated-Circuit Pseudorandom-Number Generator
NASA Technical Reports Server (NTRS)
Steelman, James E.; Beasley, Jeff; Aragon, Michael; Ramirez, Francisco; Summers, Kenneth L.; Knoebel, Arthur
1992-01-01
Integrated circuit produces 8-bit pseudorandom numbers from a specified probability distribution at a rate of 10 MHz. Using Boolean logic, the circuit implements a pseudorandom-number-generating algorithm. The circuit includes eight 12-bit pseudorandom-number generators whose outputs are uniformly distributed. 8-bit pseudorandom numbers satisfying a specified nonuniform probability distribution are generated by processing the uniformly distributed outputs of the eight 12-bit generators through a "pipeline" of D flip-flops, comparators, and memories implementing conditional probabilities on zeros and ones.
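The bit-by-bit conditioning idea behind the comparator pipeline can be sketched in software. This is a hedged illustration, not the circuit's actual algorithm: each output bit is chosen by comparing a uniform draw against P(bit = 1 | bits chosen so far) under a target distribution (the skewed `pmf` below is an arbitrary example), which is exactly the role the abstract assigns to the stored conditional probabilities on zeros and ones.

```python
import numpy as np

def sample_bitwise(pmf, rng):
    """Draw an 8-bit value from `pmf` one bit at a time, most significant
    bit first: each bit is set by comparing a uniform draw against
    P(bit = 1 | prefix of bits already chosen)."""
    value = 0
    lo, hi = 0, len(pmf)  # block of codes still consistent with the prefix
    for _ in range(8):
        mid = (lo + hi) // 2
        p_all = pmf[lo:hi].sum()
        # Conditional probability that the next bit is 1 given the prefix
        p_one = pmf[mid:hi].sum() / p_all if p_all > 0 else 0.5
        bit = 1 if rng.uniform() < p_one else 0
        value = (value << 1) | bit
        lo, hi = (mid, hi) if bit else (lo, mid)
    return value

rng = np.random.default_rng(7)
pmf = np.arange(256, dtype=float)
pmf /= pmf.sum()  # a skewed target distribution over 0..255
draws = [sample_bitwise(pmf, rng) for _ in range(5000)]
```

Each of the 8 loop iterations corresponds to one comparator stage; the uniform draws play the role of the 12-bit uniform generator outputs feeding the pipeline.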