Science.gov

Sample records for probability based significance

  1. Salt dependent premelting base pair opening probabilities of B and Z DNA Poly [d(G-C)] and significance for the B-Z transition

    PubMed Central

    Chen, Y. Z.; Prohofsky, E. W.

    1993-01-01

    We calculate room temperature thermal fluctuational base pair opening probabilities of B and Z DNA Poly[d(G-C)] at various salt concentrations and discuss the significance of thermal fluctuation in facilitating base pair disruption during B to Z transition. Our calculated base pair opening probability of the B DNA at lower salt concentrations and the probability of the Z DNA at high salt concentrations are in agreement with observations. The salt dependence of the probabilities indicates a B to Z transition at a salt concentration close to the observed concentration. PMID:19431893

  2. A review of a multifactorial probability based model for classification of BRCA1 and BRCA2 variants of uncertain significance (VUS)

    PubMed Central

    Lindor, Noralane M.; Guidugli, Lucia; Wang, Xianshu; Vallée, Maxime P.; Monteiro, Alvaro N.A.; Tavtigian, Sean; Goldgar, David E.; Couch, Fergus J.

    2011-01-01

    Clinical mutation screening of the BRCA1 and BRCA2 genes for the presence of germline inactivating mutations is used to identify individuals at elevated risk of breast and ovarian cancer. Variants identified during screening are usually classified as pathogenic (increased risk of cancer) or not pathogenic (no increased risk of cancer). However, a significant proportion of genetic tests yield variants of uncertain significance (VUS) that have undefined risk of cancer. Individuals carrying these VUS cannot benefit from individualized cancer risk assessment. Recently a quantitative “posterior probability model” for assessing the clinical relevance of VUS in BRCA1 or BRCA2 that integrates multiple forms of genetic evidence has been developed. Here we provide a detailed review of this model. We describe the components of the model and explain how these can be combined to calculate a posterior probability of pathogenicity for each VUS. We explain how the model can be applied to public data and provide Tables that list the VUS that have been classified as not pathogenic or pathogenic using this method. While we use BRCA1 and BRCA2 VUS as examples, the method can be used as a framework for classification of the pathogenicity of VUS in other cancer genes. PMID:21990134

  3. Significance of "high probability/low damage" versus "low probability/high damage" flood events

    NASA Astrophysics Data System (ADS)

    Merz, B.; Elmer, F.; Thieken, A. H.

    2009-06-01

    The need for an efficient use of limited resources fosters the application of risk-oriented design in flood mitigation. Flood defence measures reduce future damage. Traditionally, this benefit is quantified via the expected annual damage. We analyse the contribution of "high probability/low damage" floods versus the contribution of "low probability/high damage" events to the expected annual damage. For three case studies, i.e. actual flood situations in flood-prone communities in Germany, it is shown that the expected annual damage is dominated by "high probability/low damage" events. Extreme events play a minor role, even though they cause high damage. Using typical values for flood frequency behaviour, flood plain morphology, distribution of assets and vulnerability, it is shown that this also holds for the general case of river floods in Germany. This result is compared to the significance of extreme events in the public perception. "Low probability/high damage" events are more important in the societal view than is expressed by the expected annual damage. We conclude that the expected annual damage should be used with care since it is not in agreement with societal priorities. Further, risk aversion functions that penalise events with disastrous consequences are introduced in the appraisal of risk mitigation options. It is shown that risk aversion may have substantial implications for decision-making. Different flood mitigation decisions are probable when risk aversion is taken into account.
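
    The dominance claim above can be illustrated with a toy damage-probability curve: the expected annual damage (EAD) is the integral of damage over annual exceedance probability. The return periods and damage values below are hypothetical, not the paper's case-study data; a minimal numerical sketch:

```python
# Illustrative sketch (not the paper's data): expected annual damage (EAD) as the
# integral of damage over annual exceedance probability, approximated by the
# trapezoidal rule on a hypothetical damage-probability curve.
import numpy as np

# Hypothetical return periods (years) and associated event damages (million EUR).
return_periods = np.array([2, 5, 10, 25, 50, 100, 200, 500])
damages = np.array([1.0, 3.0, 6.0, 10.0, 15.0, 25.0, 40.0, 80.0])

p_exceed = 1.0 / return_periods            # annual exceedance probabilities
order = np.argsort(p_exceed)               # integrate over increasing probability
p, d = p_exceed[order], damages[order]

ead = np.trapz(d, p)                       # EAD = integral of D(P) dP
print(f"Expected annual damage: {ead:.2f} million/yr")

# Contribution of "high probability/low damage" events (return period <= 25 yr).
mask = 1.0 / p <= 25
ead_frequent = np.trapz(d[mask], p[mask])
print(f"Share from frequent events: {ead_frequent / ead:.1%}")
```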

  4. Probable detection of climatically significant change of the solar constant

    NASA Technical Reports Server (NTRS)

    Sofia, S.; Endal, A. S.

    1980-01-01

    It is suggested that the decrease in the solar radius inferred from solar eclipse observations made from 1715 to 1979 reflects a variation of the solar constant that may be of considerable climatic significance. A general, time-averaged relationship between changes in the solar constant and changes in the solar radius is derived based on a model of the contraction and expansion of the convective zone. A preliminary numerical calculation of radius changes due to changes in the mixing length of the solar envelope is presented which indicates that a decrease in solar radius of 0.5 arcsec, as observed in the last 264 years, would correspond to a decrease of 0.7% in the solar constant, a value of large climatic significance. Limitations of the observational method and the numerical approach are pointed out, and required additional theoretical and observational efforts are indicated.
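
    As a back-of-envelope check of the quoted figures, the relationship can be written as a linear scaling between the fractional changes. Taking a nominal solar angular radius of about 960 arcsec (an assumption not stated in the abstract), the quoted 0.5 arcsec and 0.7% figures imply a scaling factor of roughly:

```latex
\frac{\Delta S}{S} \approx W\,\frac{\Delta R}{R}, \qquad
\frac{\Delta R}{R} \approx \frac{0.5''}{960''} \approx 5.2\times 10^{-4}, \qquad
\frac{\Delta S}{S} \approx 0.007
\;\Longrightarrow\; W \approx \frac{0.007}{5.2\times 10^{-4}} \approx 13 .
```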

  5. Modulation Based on Probability Density Functions

    NASA Technical Reports Server (NTRS)

    Williams, Glenn L.

    2009-01-01

    A proposed method of modulating a sinusoidal carrier signal to convey digital information involves the use of histograms representing probability density functions (PDFs) that characterize samples of the signal waveform. The method is based partly on the observation that when a waveform is sampled (whether by analog or digital means) over a time interval at least as long as one half cycle of the waveform, the samples can be sorted by frequency of occurrence, thereby constructing a histogram representing a PDF of the waveform during that time interval.
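
    A minimal sketch of the sampling-and-histogram step described above; the carrier frequency, sampling rate, and bin count are hypothetical choices, not values from the proposal:

```python
# Minimal sketch of the histogram/PDF idea: sample a sinusoidal carrier over at
# least one half cycle (a full cycle here) and bin the sample amplitudes into a
# histogram, which approximates the waveform's PDF over that interval.
import numpy as np

f_carrier = 1_000.0            # carrier frequency in Hz (hypothetical)
f_sample = 100_000.0           # sampling rate in Hz (hypothetical)
duration = 1.0 / f_carrier     # one full cycle (>= the half cycle required)

t = np.arange(0.0, duration, 1.0 / f_sample)
samples = np.sin(2.0 * np.pi * f_carrier * t)

# Sort samples by frequency of occurrence via a histogram (an empirical PDF).
counts, edges = np.histogram(samples, bins=16, range=(-1.0, 1.0), density=True)
print(np.round(counts, 2))     # an unmodulated sinusoid yields the U-shaped arcsine-like PDF
```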

  6. Significance of stress transfer in time-dependent earthquake probability calculations

    USGS Publications Warehouse

    Parsons, T.

    2005-01-01

    A sudden change in stress is seen to modify earthquake rates, but should it also revise earthquake probability? The data used to derive input parameters permit an array of forecasts; so how large a static stress change is required to cause a statistically significant earthquake probability change? To answer that question, the effects of parameter and philosophical choices are examined through all phases of sample calculations. Drawing at random from distributions of recurrence-aperiodicity pairs identifies many that recreate long paleoseismic and historic earthquake catalogs. Probability density functions built from the recurrence-aperiodicity pairs give the range of possible earthquake forecasts under a point-process renewal model. Consequences of choices made in stress transfer calculations, such as different slip models, fault rake, dip, and friction, are tracked. For interactions among large faults, calculated peak stress changes may be localized, with most of the receiving fault area changed less than the mean. Thus, to avoid overstating probability change on segments, stress change values should be drawn from a distribution reflecting the spatial pattern rather than using the segment mean. Disparity resulting from interaction probability methodology is also examined. For a fault with a well-understood earthquake history, a minimum stress change to stressing rate ratio of 10:1 to 20:1 is required to significantly skew probabilities with >80-85% confidence. That ratio must be closer to 50:1 to exceed 90-95% confidence levels. Thus revision of earthquake probability is achievable when a perturbing event is very close to the fault in question or the tectonic stressing rate is low.
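
    For intuition only, here is a hedged sketch of a time-dependent renewal calculation in which a stress step is treated as a clock advance equal to the stress change divided by the stressing rate. The BPT (inverse-Gaussian) parameterization is standard, but all numerical values are hypothetical and this is not the paper's calculation:

```python
# Hedged sketch of a time-dependent renewal probability calculation with a
# stress-step "clock advance" (values are hypothetical, not the paper's).
from scipy.stats import invgauss

def bpt_conditional_prob(t_elapsed, window, mean_rt, alpha):
    """P(event in [t, t+window] | no event by t) for a BPT/inverse-Gaussian renewal model."""
    # scipy's invgauss(mu, scale): mean = mu * scale, shape parameter = scale.
    # A BPT model with mean recurrence mean_rt and aperiodicity alpha has
    # shape = mean_rt / alpha**2, so mu = alpha**2 and scale = mean_rt / alpha**2.
    dist = invgauss(alpha**2, scale=mean_rt / alpha**2)
    survive = 1.0 - dist.cdf(t_elapsed)
    return (dist.cdf(t_elapsed + window) - dist.cdf(t_elapsed)) / survive

mean_rt, alpha = 200.0, 0.5        # mean recurrence (yr) and aperiodicity (hypothetical)
t_elapsed, window = 150.0, 30.0    # years since last event, forecast window

p0 = bpt_conditional_prob(t_elapsed, window, mean_rt, alpha)

# A static stress step d_sigma on a fault loaded at rate d_sigma_dt is often
# approximated as advancing the renewal clock by d_sigma / d_sigma_dt years.
d_sigma, d_sigma_dt = 0.5, 0.05    # bars and bars/yr (hypothetical; ratio = 10)
p1 = bpt_conditional_prob(t_elapsed + d_sigma / d_sigma_dt, window, mean_rt, alpha)

print(f"30-yr probability: {p0:.3f} -> {p1:.3f} after the stress step")
```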

  7. Vehicle Detection Based on Probability Hypothesis Density Filter

    PubMed Central

    Zhang, Feihu; Knoll, Alois

    2016-01-01

    In the past decade, vehicle detection has been significantly improved. By utilizing cameras, vehicles can be detected in Regions of Interest (ROI) in complex environments. However, vision techniques often suffer from false positives and a limited field of view. In this paper, a LiDAR based vehicle detection approach is proposed using the Probability Hypothesis Density (PHD) filter. The proposed approach consists of two phases: the hypothesis generation phase to detect potential objects and the hypothesis verification phase to classify objects. The performance of the proposed approach is evaluated in complex scenarios and compared with the state-of-the-art. PMID:27070621

  8. Vehicle Detection Based on Probability Hypothesis Density Filter.

    PubMed

    Zhang, Feihu; Knoll, Alois

    2016-01-01

    In the past decade, vehicle detection has been significantly improved. By utilizing cameras, vehicles can be detected in Regions of Interest (ROI) in complex environments. However, vision techniques often suffer from false positives and a limited field of view. In this paper, a LiDAR based vehicle detection approach is proposed using the Probability Hypothesis Density (PHD) filter. The proposed approach consists of two phases: the hypothesis generation phase to detect potential objects and the hypothesis verification phase to classify objects. The performance of the proposed approach is evaluated in complex scenarios and compared with the state-of-the-art. PMID:27070621

  9. GNSS integer ambiguity validation based on posterior probability

    NASA Astrophysics Data System (ADS)

    Wu, Zemin; Bian, Shaofeng

    2015-10-01

    GNSS integer ambiguity validation has been considered a challenging task for decades. Several kinds of validation tests have been developed and widely used in recent years, but their theoretical basis remains a weakness. Ambiguity validation is, in theory, a hypothesis-testing problem. In the framework of Bayesian hypothesis testing, the posterior probability is the canonical standard on which a statistical decision should be based. In this contribution, (i) we derive the posterior probability of the fixed ambiguity based on the Bayesian principle and modify it for practical ambiguity validation. (ii) The optimal property of the posterior probability test is proved based on an extended Neyman-Pearson lemma. Since the validation failure rate is the issue users are most concerned about, (iii) we derive an upper bound on the failure rate of the posterior probability test, so the user can apply the test either with a fixed posterior probability or with a fixed failure rate. Simulated as well as real observed data are used for experimental validation. The results show that (i) the posterior probability test is the most effective among the R-ratio test, the difference test, the ellipsoidal integer aperture test and the posterior probability test, (ii) the posterior probability test is computationally efficient and (iii) the failure rate estimation for the posterior probability test is useful.
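
    A minimal sketch of the kind of quantity involved, assuming a Gaussian float solution: each integer candidate gets a weight proportional to exp(-0.5 * ||a_float - z||^2 in the metric of the float covariance), and the posterior of the best candidate is its normalized weight over an enumerated candidate set. This is an illustrative simplification, not the paper's exact derivation; all numbers are hypothetical:

```python
# Hedged sketch: posterior probability of an integer ambiguity candidate under a
# Gaussian float solution (illustrative; not the paper's exact formulation).
import numpy as np
from itertools import product

a_float = np.array([3.2, -1.8])                  # float ambiguity estimate (hypothetical)
Q = np.array([[0.09, 0.03], [0.03, 0.06]])       # its covariance matrix (hypothetical)
Q_inv = np.linalg.inv(Q)

# Enumerate integer candidates in a small box around the float solution.
grid = [range(int(np.floor(x)) - 2, int(np.ceil(x)) + 3) for x in a_float]
candidates = np.array(list(product(*grid)), dtype=float)

d = candidates - a_float
log_w = -0.5 * np.einsum("ij,jk,ik->i", d, Q_inv, d)    # log of unnormalized weights
w = np.exp(log_w - log_w.max())                          # stabilize before normalizing
posterior = w / w.sum()

best = np.argmax(posterior)
print("best candidate:", candidates[best].astype(int), "posterior =", round(posterior[best], 3))
# Accept the fix only if this posterior exceeds a chosen threshold
# (the fixed failure-rate variant described in the abstract is omitted here).
```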

  10. Significance probability mapping: an aid in the topographic analysis of brain electrical activity.

    PubMed

    Duffy, F H; Bartels, P H; Burchfiel, J L

    1981-05-01

    We illustrate the application of significance probability mapping (SPM) to the analysis of topographic maps of spectrally analyzed EEG and visual evoked potential (VEP) activity from patients with brain tumors, boys with dyslexia, and control subjects. When the VEP topographic plots of tumor patients were displayed as the number of standard deviations from a reference mean, more subjects were correctly identified than by inspection of the underlying raw data. When topographic plots of EEG alpha activity obtained while listening to speech or music were compared by t statistic to plots of resting alpha activity, regions of cortex presumably activated by speech or music were delineated. Different regions were defined in dyslexic boys and controls. We propose that SPM will prove valuable in the regional localization of normal and abnormal functions in other clinical situations. PMID:6165544
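
    A hedged sketch of the two SPM variants mentioned above, on synthetic data: a per-electrode z-score map against a reference group, and a per-electrode t-statistic map comparing two conditions. The montage size and all values are hypothetical:

```python
# Hedged sketch of significance probability mapping (SPM) on topographic data:
# (1) express a patient's map as z-scores relative to a reference group, and
# (2) compare two conditions channel-by-channel with a t statistic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_channels = 20                                              # electrodes (hypothetical montage)

reference = rng.normal(10.0, 2.0, size=(30, n_channels))     # 30 control subjects
patient = rng.normal(10.5, 2.0, size=n_channels)             # one patient's map

# (1) z-score map: number of standard deviations from the reference mean.
z_map = (patient - reference.mean(axis=0)) / reference.std(axis=0, ddof=1)

# (2) t-statistic map: e.g. alpha power while listening to music vs. at rest.
rest = rng.normal(10.0, 2.0, size=(30, n_channels))
music = rest + rng.normal(0.5, 1.0, size=(30, n_channels))
t_map, p_map = stats.ttest_rel(music, rest, axis=0)

print("max |z|:", round(float(np.abs(z_map).max()), 2),
      " channels with p < 0.05:", int((p_map < 0.05).sum()))
```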

  11. Probability-based nitrate contamination map of groundwater in Kinmen.

    PubMed

    Liu, Chen-Wuing; Wang, Yeuh-Bin; Jang, Cheng-Shin

    2013-12-01

    Groundwater supplies over 50% of the drinking water in Kinmen. Approximately 16.8% of groundwater samples in Kinmen exceed the drinking water quality standard (DWQS) for NO3(-)-N (10 mg/L). Residents drinking highly nitrate-polluted groundwater face a potential health risk. To formulate an effective water quality management plan and assure safe drinking water in Kinmen, a detailed spatial distribution of nitrate-N in groundwater is a prerequisite. The aim of this study is to develop an efficient scheme for evaluating the spatial distribution of nitrate-N in residential well water using a logistic regression (LR) model, and a probability-based nitrate-N contamination map of Kinmen is constructed. The LR model predicts the binary probability that groundwater nitrate-N concentrations exceed the DWQS from simple measurement variables used as independent variables, including sampling season, soil type, water table depth, pH, EC, DO, and Eh. The results reveal that three statistically significant explanatory variables, soil type, pH, and EC, are selected by the forward stepwise LR analysis. The total ratio of correct classification reaches 92.7%. The highest probabilities of nitrate-N contamination occur in the central zone, indicating that groundwater in the central zone should not be used for drinking purposes. Furthermore, a handy EC-pH-probability curve for nitrate-N exceeding the DWQS threshold was developed; this curve can be used for preliminary screening of nitrate-N contamination in Kinmen groundwater. This study recommends that the local agency implement best management practice strategies to control nonpoint nitrogen sources and carry out systematic monitoring of groundwater quality in residential wells in the high nitrate-N contamination zones.
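
    A minimal sketch of this type of LR exceedance model on synthetic data; the predictors (EC, pH) follow the abstract, but the data, coefficients, and example query values are illustrative assumptions:

```python
# Hedged sketch of a logistic-regression (LR) exceedance model, fit on synthetic
# data; variable names (EC, pH) follow the abstract, everything else is made up.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 300
ec = rng.normal(800.0, 250.0, n)        # electrical conductivity (uS/cm), synthetic
ph = rng.normal(6.8, 0.5, n)            # pH, synthetic

# Synthetic ground truth: exceedance of the 10 mg/L NO3-N standard is made more
# likely at high EC and low pH (a plausible but assumed relationship).
logit = 0.004 * (ec - 800.0) - 1.5 * (ph - 6.8) - 0.5
y = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

X = np.column_stack([ec, ph])
model = LogisticRegression().fit(X, y)

# Probability that a well with EC = 1200 uS/cm and pH = 6.4 exceeds the standard.
p_exceed = model.predict_proba([[1200.0, 6.4]])[0, 1]
print(f"P(NO3-N > 10 mg/L) = {p_exceed:.2f}")
```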

  12. PROBABILITY BASED CORROSION CONTROL FOR WASTE TANKS - PART II

    SciTech Connect

    Hoffman, E.; Edwards, T.

    2010-12-09

    As part of an ongoing study to evaluate the discontinuity in the corrosion controls at the SRS tank farm, a study was conducted this year to assess the minimum nitrite concentrations required for solutions below 1 molar nitrate (see Figure 1). Current controls on the tank farm solution chemistry are in place to prevent the initiation and propagation of pitting and stress corrosion cracking in the primary steel waste tanks. The controls are based upon a series of experiments performed with simulated solutions on materials used for construction of the tanks, namely ASTM A537 carbon steel (A537). During FY09, an experimental program was undertaken to investigate the risk associated with reducing the minimum molar nitrite concentration required to confidently inhibit pitting in dilute solutions (i.e., less than 1 molar nitrate). The experimental results and conclusions herein provide a statistical basis to quantify the probability of pitting for the tank wall exposed to various solutions with dilute concentrations of nitrate and nitrite. Understanding the probability of pitting will allow the facility to make tank-specific, risk-based decisions for chemistry control. Based on previous electrochemical testing, a statistical test matrix was developed to refine and solidify the application of the statistical mixture/amount model to corrosion of A537 steel. A mixture/amount model was identified based on statistical analysis of recent and historically collected electrochemical data. This model provides a more complex relationship between the nitrate and nitrite concentrations and the probability of pitting than is represented by the model underlying the current chemistry control program, and its use may provide a technical basis for the utilization of less nitrite to inhibit pitting at concentrations below 1 molar nitrate. FY09 results fit within the mixture/amount model, and further refine the nitrate regime in which the model is applicable. The combination of visual observations and cyclic

  13. Visualizing RNA base-pairing probabilities with RNAbow diagrams.

    PubMed

    Aalberts, Daniel P; Jannen, William K

    2013-04-01

    There are many effective ways to represent a minimum free energy RNA secondary structure that make it easy to locate its helices and loops. It is a greater challenge to visualize the thermal average probabilities of all folds in a partition function sum; dot plot representations are often puzzling. Therefore, we introduce the RNAbows visualization tool for RNA base pair probabilities. RNAbows represent base pair probabilities with line thickness and shading, yielding intuitive diagrams. RNAbows aid in disentangling incompatible structures, allow comparisons between clusters of folds, highlight differences between wild-type and mutant folds, and are also rather beautiful.
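
    The following is not the RNAbow tool itself, only a minimal arc-diagram sketch of the encoding it describes: one arc per base pair, with line width and shading proportional to the pairing probability. The sequence and probabilities are made up:

```python
# Minimal arc-diagram sketch in the spirit of RNAbows (not the actual tool):
# draw the sequence on a line and an arc for each base pair, with line width
# and opacity proportional to the pairing probability.
import matplotlib.pyplot as plt
from matplotlib.patches import Arc

seq = "GGGAAAUCCC"
pair_probs = {(0, 9): 0.95, (1, 8): 0.90, (2, 7): 0.80, (3, 6): 0.15}  # (i, j): P(i pairs with j)

fig, ax = plt.subplots(figsize=(6, 3))
for x, base in enumerate(seq):
    ax.text(x, -0.15, base, ha="center", va="top", family="monospace")

for (i, j), p in pair_probs.items():
    center, width = (i + j) / 2.0, j - i
    ax.add_patch(Arc((center, 0.0), width, width, theta1=0, theta2=180,
                     lw=4.0 * p, alpha=0.2 + 0.8 * p))

ax.set_xlim(-1, len(seq))
ax.set_ylim(-0.6, len(seq) / 2.0)
ax.axis("off")
plt.savefig("rnabow_sketch.png", dpi=150)
```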

  14. PROBABILITY BASED CORROSION CONTROL FOR LIQUID WASTE TANKS - PART III

    SciTech Connect

    Hoffman, E.; Edwards, T.

    2010-12-09

    The liquid waste chemistry control program is designed to reduce the pitting corrosion occurrence on tank walls. The chemistry control program has been implemented, in part, by applying engineering judgment safety factors to experimental data. However, the simple application of a general safety factor can result in use of excessive corrosion inhibiting agents. The required use of excess corrosion inhibitors can be costly for tank maintenance, waste processing, and in future tank closure. It is proposed that a probability-based approach can be used to quantify the risk associated with the chemistry control program. This approach can lead to the application of tank-specific chemistry control programs reducing overall costs associated with overly conservative use of inhibitor. Furthermore, when using nitrite as an inhibitor, the current chemistry control program is based on a linear model of increased aggressive species requiring increased protective species. This linear model was primarily supported by experimental data obtained from dilute solutions with nitrate concentrations less than 0.6 M, but is used to produce the current chemistry control program up to 1.0 M nitrate. Therefore, in the nitrate space between 0.6 and 1.0 M, the current control limit is based on assumptions that the linear model developed from data in the <0.6 M region is applicable in the 0.6-1.0 M region. Due to this assumption, further investigation of the nitrate region of 0.6 M to 1.0 M has potential for significant inhibitor reduction, while maintaining the same level of corrosion risk associated with the current chemistry control program. Ongoing studies have been conducted in FY07, FY08, FY09 and FY10 to evaluate the corrosion controls at the SRS tank farm and to assess the minimum nitrite concentrations to inhibit pitting in ASTM A537 carbon steel below 1.0 molar nitrate. The experimentation from FY08 suggested a non-linear model known as the mixture/amount model could be used to predict
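
    For illustration of the probability-based idea only (not the qualified SRS model), here is a generic logistic form in which pitting probability rises with nitrate and falls with nitrite; the functional form and coefficients are hypothetical:

```python
# Hedged sketch of a probability-based chemistry control idea: a logistic model
# for pitting probability as a function of nitrate (aggressive species) and
# nitrite (inhibitor) concentrations. Coefficients are hypothetical, not the
# qualified tank-farm model.
import numpy as np

def p_pit(nitrate_M, nitrite_M, b0=-2.0, b_no3=4.0, b_no2=-6.0):
    """Illustrative pitting probability for A537-steel-like behavior."""
    logit = b0 + b_no3 * nitrate_M + b_no2 * nitrite_M
    return 1.0 / (1.0 + np.exp(-logit))

for no3 in (0.4, 0.6, 0.8, 1.0):
    # smallest nitrite (on a 0.01 M grid) keeping pitting probability below 5%
    grid = np.arange(0.0, 1.5, 0.01)
    ok = grid[p_pit(no3, grid) < 0.05]
    print(f"nitrate {no3:.1f} M -> minimum nitrite = {ok[0]:.2f} M")
```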

  15. ProbOnto: ontology and knowledge base of probability distributions

    PubMed Central

    Swat, Maciej J.; Grenon, Pierre; Wimalaratne, Sarala

    2016-01-01

    Motivation: Probability distributions play a central role in mathematical and statistical modelling. The encoding, annotation and exchange of such models could be greatly simplified by a resource providing a common reference for the definition of probability distributions. Although some resources exist, no suitably detailed and complex ontology exists, nor any database allowing programmatic access. Results: ProbOnto is an ontology-based knowledge base of probability distributions, featuring more than 80 uni- and multivariate distributions with their defining functions, characteristics, relationships and re-parameterization formulas. It can be used for model annotation and facilitates the encoding of distribution-based models, related functions and quantities. Availability and Implementation: http://probonto.org Contact: mjswat@ebi.ac.uk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153608

  16. A novel frequent probability pattern mining algorithm based on circuit simulation method in uncertain biological networks

    PubMed Central

    2014-01-01

    Background Motif mining has always been a hot research topic in bioinformatics. Most current research on biological networks focuses on exact motif mining. However, due to inevitable experimental error and noisy data, biological network data represented by a probability model can better reflect authenticity and biological significance; it is therefore more biologically meaningful to discover probability motifs in uncertain biological networks. One of the key steps in probability motif mining is frequent pattern discovery, which is usually based on the possible world model and has a relatively high computational complexity. Methods In this paper, we present a novel method for detecting frequent probability patterns based on circuit simulation in uncertain biological networks. First, a partition-based efficient search is applied to non-tree-like subgraph mining, where the probability of occurrence in random networks is small. Then, an algorithm for probability isomorphism based on circuit simulation is proposed. The probability isomorphism combines the analysis of circuit topology structure with related physical properties of voltage in order to evaluate the probability isomorphism between probability subgraphs. The circuit-simulation-based probability isomorphism avoids the traditional possible world model. Finally, based on the probability subgraph isomorphism algorithm, a two-step hierarchical clustering method is used to cluster subgraphs and discover frequent probability patterns from the clusters. Results The experimental results on data sets of Protein-Protein Interaction (PPI) networks and the transcriptional regulatory networks of E. coli and S. cerevisiae show that the proposed method can efficiently discover the frequent probability subgraphs. The discovered subgraphs in our study contain all probability motifs reported in the experiments published in other related papers. Conclusions The algorithm of probability graph isomorphism
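
    For intuition about the quantity being mined (this is not the paper's circuit-simulation method): under an edge-independence assumption, the probability that one embedding of a pattern exists is the product of its edge probabilities, and expected support adds such terms over embeddings. The edge probabilities below are hypothetical:

```python
# Hedged intuition sketch (not the paper's circuit-simulation approach): in an
# uncertain network with independent edge probabilities, the probability that a
# particular embedding of a motif exists is the product of its edge probabilities.
from math import prod

# Hypothetical uncertain PPI edges: (u, v) -> existence probability
edge_prob = {("A", "B"): 0.9, ("B", "C"): 0.8, ("A", "C"): 0.7, ("C", "D"): 0.6}

def embedding_probability(edges):
    """P(all edges of one embedding exist), assuming edge independence."""
    return prod(edge_prob[e] for e in edges)

triangle = [("A", "B"), ("B", "C"), ("A", "C")]
print("P(triangle A-B-C exists) =", round(embedding_probability(triangle), 3))

# Expected support of a pattern sums embedding probabilities; here only two
# example embeddings of the 2-edge path pattern are added up for illustration.
partial_support = (embedding_probability([("A", "B"), ("B", "C")]) +
                   embedding_probability([("A", "C"), ("C", "D")]))
print("partial expected support of the 2-path pattern =", round(partial_support, 3))
```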

  17. Lake Superior Phytoplankton Characterization from the 2006 Probability Based Survey

    EPA Science Inventory

    We conducted a late summer probability based survey of Lake Superior in 2006 which consisted of 52 sites stratified across 3 depth zones. As part of this effort, we collected composite phytoplankton samples from the epilimnion and the fluorescence maxima (Fmax) at 29 of the site...

  18. Westward-derived conglomerates in Moenkopi formation of Southeastern California, and their probable tectonic significance

    SciTech Connect

    Walker, J.D.; Burchfiel, B.C.; Royden, L.H.

    1983-02-01

    The upper part of the Moenkopi Formation in the Northern Clark Mountains, Southeastern California, contains conglomerate beds whose clasts comprise igneous, metamorphic, and sedimentary rocks. Metamorphic clasts include foliated granite, meta-arkose, and quartzite, probably derived from older Precambrian basement and younger Precambrian clastic rocks. Volcanic clasts are altered plagioclase-bearing rocks, and sedimentary clasts were derived from Paleozoic miogeoclinal rocks. Paleocurrent data indicate that the clasts had a source to the southwest. An age of late Early or early Middle Triassic has been tentatively assigned to these conglomerates. These conglomerates indicate that Late Permian to Early Triassic deformational events in this part of the orogen affected rocks much farther east than has been previously recognized.

  19. Assessing magnitude probability distribution through physics-based rupture scenarios

    NASA Astrophysics Data System (ADS)

    Hok, Sébastien; Durand, Virginie; Bernard, Pascal; Scotti, Oona

    2016-04-01

    When faced with a complex network of faults in a seismic hazard assessment study, the first question raised is to what extent the fault network is connected and what is the probability that an earthquake ruptures a series of neighboring segments simultaneously. Physics-based dynamic rupture models can provide useful insight as to which rupture scenario is most probable, provided that an exhaustive exploration of the variability of the input parameters necessary for the dynamic rupture modeling is accounted for. Given the random nature of some parameters (e.g. hypocenter location) and the limitation of our knowledge, we used a logic-tree approach in order to build the different scenarios and to be able to associate them with a probability. The methodology is applied to the three main faults located along the southern coast of the West Corinth rift. Our logic tree takes into account different hypotheses for: fault geometry, location of hypocenter, seismic cycle position, and fracture energy on the fault plane. The variability of these parameters is discussed, and the different values tested are weighted accordingly. 64 scenarios resulting from 64 parameter combinations were included. Sensitivity studies were done to illustrate which parameters control the variability of the results. Given the weights of the input parameters, we evaluated the probability of obtaining a full network break to be 15%, while single-segment ruptures represent 50% of the scenarios. This rupture scenario probability distribution along the three faults of the West Corinth rift fault network can then be used as input to a seismic hazard calculation.
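
    A hedged sketch of the logic-tree bookkeeping described above: each scenario's probability is the product of its branch weights, and the probability of an outcome class is the sum over matching scenarios. The branch names, weights, and the toy outcome rule are hypothetical stand-ins for the dynamic rupture simulations:

```python
# Hedged logic-tree sketch: 2*2*2*2 = 16 scenarios, each weighted by the product
# of its branch weights; outcome-class probabilities sum the matching scenarios.
from itertools import product

# Hypothetical branches: (option, weight); weights on each branch sum to 1.
branches = {
    "geometry":        [("connected", 0.5), ("segmented", 0.5)],
    "hypocenter":      [("west", 0.5), ("east", 0.5)],
    "cycle_position":  [("early", 0.3), ("late", 0.7)],
    "fracture_energy": [("low", 0.5), ("high", 0.5)],
}

def rupture_extent(geometry, hypocenter, cycle_position, fracture_energy):
    """Toy stand-in for the outcome of a dynamic rupture simulation."""
    if geometry == "connected" and cycle_position == "late" and fracture_energy == "low":
        return "full_network"
    return "single_segment" if geometry == "segmented" else "partial_multi_segment"

probability = {}
for combo in product(*branches.values()):
    names = [name for name, _ in combo]
    weight = 1.0
    for _, w in combo:
        weight *= w                      # scenario weight = product of branch weights
    key = rupture_extent(*names)
    probability[key] = probability.get(key, 0.0) + weight

print(probability)                       # outcome-class probabilities sum to 1.0
```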

  20. Ethanol, not metabolized in brain, significantly reduces brain metabolism, probably via specific GABA(A) receptors

    PubMed Central

    Rae, Caroline D.; Davidson, Joanne E.; Maher, Anthony D.; Rowlands, Benjamin D.; Kashem, Mohammed A.; Nasrallah, Fatima A.; Rallapalli, Sundari K.; Cook, James M; Balcar, Vladimir J.

    2014-01-01

    Ethanol is a known neuromodulatory agent with reported actions at a range of neurotransmitter receptors. Here, we used an indirect approach, measuring the effect of alcohol on metabolism of [3-13C]pyruvate in the adult guinea pig brain cortical tissue slice and comparing the outcomes to those from a library of ligands active in the GABAergic system as well as studying the metabolic fate of [1,2-13C]ethanol. Ethanol (10, 30 and 60 mM) significantly reduced metabolic flux into all measured isotopomers and reduced all metabolic pool sizes. The metabolic profiles of these three concentrations of ethanol were similar and clustered with that of the α4β3δ positive allosteric modulator DS2 (4-Chloro-N-[2-(2-thienyl)imidazo[1,2a]-pyridin-3-yl]benzamide). Ethanol at a very low concentration (0.1 mM) produced a metabolic profile which clustered with those from inhibitors of GABA uptake, and ligands showing affinity for α5, and to a lesser extent, α1-containing GABA(A)R. There was no measurable metabolism of [1,2-13C]ethanol, with no significant incorporation of 13C from [1,2-13C]ethanol into any measured metabolite above natural abundance, although there were measurable effects on total metabolite sizes similar to those seen with unlabeled ethanol. The reduction in metabolism seen in the presence of ethanol is therefore likely to be due to its actions at neurotransmitter receptors, particularly α4β3δ receptors, and not because ethanol is substituting as a substrate or because of the effects of ethanol catabolites acetaldehyde or acetate. We suggest that the stimulatory effects of very low concentrations of ethanol are due to release of GABA via GAT1 and the subsequent interaction of this GABA with local α5-containing, and to a lesser extent, α1-containing GABA(A)R. PMID:24313287

  1. QKD-based quantum private query without a failure probability

    NASA Astrophysics Data System (ADS)

    Liu, Bin; Gao, Fei; Huang, Wei; Wen, QiaoYan

    2015-10-01

    In this paper, we present a quantum-key-distribution (QKD)-based quantum private query (QPQ) protocol utilizing single-photon signal of multiple optical pulses. It maintains the advantages of the QKD-based QPQ, i.e., easy to implement and loss tolerant. In addition, different from the situations in the previous QKD-based QPQ protocols, in our protocol, the number of the items an honest user will obtain is always one and the failure probability is always zero. This characteristic not only improves the stability (in the sense that, ignoring the noise and the attack, the protocol would always succeed), but also benefits the privacy of the database (since the database will no more reveal additional secrets to the honest users). Furthermore, for the user's privacy, the proposed protocol is cheat sensitive, and for security of the database, we obtain an upper bound for the leaked information of the database in theory.

  2. Computing probability masses in rule-based systems

    SciTech Connect

    Dillard, R.A.

    1982-09-01

    This report describes a method of computing confidences in rule-based inference systems by using the Dempster-Shafer theory. The theory is applicable to tactical decision problems which can be formulated in terms of sets of exhaustive and mutually exclusive propositions. Dempster's combining procedure, a generalization of Bayesian inference, can be used to combine probability mass assignments supplied by independent bodies of evidence. This report describes the use of Dempster's combining method and Shafer's representation framework in rule-based inference systems. It is shown that many kinds of data fusion problems can be represented in a way such that the constraints are met. Although computational problems remain to be solved, the theory should provide a versatile and consistent way of combining confidences for a large class of inferencing problems.
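
    A minimal implementation of Dempster's rule of combination over a small frame of discernment; the frame and the two mass assignments are hypothetical examples, not taken from the report:

```python
# Minimal implementation of Dempster's rule of combination for two basic
# probability (mass) assignments keyed by frozenset hypotheses.
from itertools import product

def combine(m1, m2):
    """Dempster's rule: combine two mass functions over the same frame."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb                 # mass falling on the empty set
    k = 1.0 - conflict                          # normalization constant
    return {h: w / k for h, w in combined.items()}

FRAME = frozenset({"friend", "hostile", "neutral"})
m_radar = {frozenset({"hostile"}): 0.6, FRAME: 0.4}                                    # evidence source 1
m_iff = {frozenset({"friend", "neutral"}): 0.3, frozenset({"hostile"}): 0.5, FRAME: 0.2}  # evidence source 2

for hypothesis, mass in combine(m_radar, m_iff).items():
    print(set(hypothesis), round(mass, 3))
```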

  3. Lung scans with significant perfusion defects limited to matching pleural effusions have a low probability of pulmonary embolism

    SciTech Connect

    Datz, F.L.; Bedont, R.A.; Taylor, A.

    1985-05-01

    Patients with a pleural effusion on chest x-ray often undergo a lung scan to exclude pulmonary embolism (PE). According to other studies, when the scan shows a perfusion defect equal in size to a radiographic abnormality on chest x-ray, the scan should be classified as indeterminate or intermediate probability for PE. However, since those studies dealt primarily with alveolar infiltrates rather than pleural effusions, the authors undertook a retrospective study to determine the probability of PE in patients with pleural effusion and a matching perfusion defect. The authors reviewed 451 scans and x-rays of patients studied for suspected PE. Of those, 53 had moderate or large perfusion defects secondary to pleural effusion without other significant (>25% of a segment) defects on the scan. Final diagnosis was confirmed by pulmonary angiography (16), thoracentesis (40), venography (11), other radiographic and laboratory studies, and clinical course. Of the 53 patients, only 2 had venous thrombotic disease: one patient had PE on pulmonary angiography, the other had thrombophlebitis on venography. The remainder of the patients had effusions due to congestive heart failure (12), malignancy (12), infection (7), trauma (7), collagen vascular disease (7), sympathetic effusion (3) and unknown etiology (3). The authors conclude that lung scans with significant perfusion defects limited to matching pleural effusions on chest x-ray have a low probability for PE.

  4. An algorithm based on negative probabilities for a separability criterion

    NASA Astrophysics Data System (ADS)

    de Ponte, M. A.; Mizrahi, S. S.; Moussa, M. H. Y.

    2015-09-01

    Here, we demonstrate that entangled states can be written in the separable form $\rho = \sum_k p_k\, \rho^{(k)}_1 \otimes \cdots \otimes \rho^{(k)}_N$ (subscripts 1 to N referring to the parts and the $p_k$ to the nonnegative probabilities), although some of the coefficients $p_k$ assume negative values, while others are larger than 1, such as to keep their sum equal to 1. We recognize this feature as a signature of non-separability or pseudoseparability. We systematize this kind of decomposition through an algorithm for the explicit separation of density matrices, and we apply it to illustrate the separation of some particular bipartite and tripartite states, including a multipartite one-parameter Werner-like state. We also work out an arbitrary bipartite state and show that in the particular case where this state reduces to an X-type density matrix, our algorithm leads to the separability conditions on the parameters, confirmed by the Peres-Horodecki partial transposition recipe. We finally propose a measure for quantifying the degree of entanglement based on these peculiar negative (and greater than one) probabilities.
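
    As a numerical companion to the Peres-Horodecki check cited above (the paper's own algorithm is not reproduced here), a short script that partially transposes a two-qubit Werner state and inspects the smallest eigenvalue:

```python
# Peres-Horodecki (PPT) check for the two-qubit Werner state
# rho = p |psi-><psi-| + (1 - p) I/4, which is entangled exactly when p > 1/3.
import numpy as np

def werner(p):
    psi_minus = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2.0)    # singlet state
    return p * np.outer(psi_minus, psi_minus) + (1.0 - p) * np.eye(4) / 4.0

def partial_transpose(rho):
    """Transpose the second qubit of a 2-qubit density matrix."""
    r = rho.reshape(2, 2, 2, 2)                   # indices (a, b, a', b')
    return r.transpose(0, 3, 2, 1).reshape(4, 4)  # swap b <-> b'

for p in (0.2, 1.0 / 3.0, 0.5, 0.9):
    min_eig = np.linalg.eigvalsh(partial_transpose(werner(p))).min()
    verdict = "entangled (NPT)" if min_eig < -1e-12 else "PPT"
    print(f"p = {p:.2f}  min eigenvalue of rho^T_B = {min_eig:+.3f}  {verdict}")
```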

  5. Partition function and base pairing probabilities of RNA heterodimers

    PubMed Central

    Bernhart, Stephan H; Tafer, Hakim; Mückstein, Ulrike; Flamm, Christoph; Stadler, Peter F; Hofacker, Ivo L

    2006-01-01

    Background RNA has been recognized as a key player in cellular regulation in recent years. In many cases, non-coding RNAs exert their function by binding to other nucleic acids, as in the case of microRNAs and snoRNAs. The specificity of these interactions derives from the stability of inter-molecular base pairing. The accurate computational treatment of RNA-RNA binding therefore lies at the heart of target prediction algorithms. Methods The standard dynamic programming algorithms for computing secondary structures of linear single-stranded RNA molecules are extended to the co-folding of two interacting RNAs. Results We present a program, RNAcofold, that computes the hybridization energy and base pairing pattern of a pair of interacting RNA molecules. In contrast to earlier approaches, complex internal structures in both RNAs are fully taken into account. RNAcofold supports the calculation of the minimum energy structure and of a complete set of suboptimal structures in an energy band above the ground state. Furthermore, it provides an extension of McCaskill's partition function algorithm to compute base pairing probabilities, realistic interaction energies, and equilibrium concentrations of duplex structures. Availability RNAcofold is distributed as part of the Vienna RNA Package. Contact Stephan H. Bernhart – berni@tbi.univie.ac.at PMID:16722605

  6. Econometric analysis of the changing effects in wind strength and significant wave height on the probability of casualty in shipping.

    PubMed

    Knapp, Sabine; Kumar, Shashi; Sakurada, Yuri; Shen, Jiajun

    2011-05-01

    This study uses econometric models to measure the effect of significant wave height and wind strength on the probability of casualty and tests whether these effects have changed. While both effects are particularly relevant for stability and strength calculations of vessels, the analysis is also helpful for the development of ship construction standards in general to counteract increased risk resulting from changing oceanographic conditions. The authors analyzed a unique dataset of 3.2 million observations from 20,729 individual vessels in the North Atlantic and Arctic regions gathered during the period 1979-2007. The results show that although there is a seasonal pattern in the probability of casualty, especially during the winter months, the effects of wind strength and significant wave height do not follow the same seasonal pattern. Additionally, over time, significant wave height shows an increasing effect in January, March, May and October, while wind strength shows a decreasing effect, especially in January, March and May. The models can be used to simulate relationships and help understand them. This is of particular interest to naval architects and ship designers as well as multilateral agencies such as the International Maritime Organization (IMO) that establish global standards in ship design and construction. PMID:21376925

  7. Naive Probability: Model-Based Estimates of Unique Events.

    PubMed

    Khemlani, Sangeet S; Lotstein, Max; Johnson-Laird, Philip N

    2015-08-01

    We describe a dual-process theory of how individuals estimate the probabilities of unique events, such as Hillary Clinton becoming U.S. President. It postulates that uncertainty is a guide to improbability. In its computer implementation, an intuitive system 1 simulates evidence in mental models and forms analog non-numerical representations of the magnitude of degrees of belief. This system has minimal computational power and combines evidence using a small repertoire of primitive operations. It resolves the uncertainty of divergent evidence for single events, for conjunctions of events, and for inclusive disjunctions of events, by taking a primitive average of non-numerical probabilities. It computes conditional probabilities in a tractable way, treating the given event as evidence that may be relevant to the probability of the dependent event. A deliberative system 2 maps the resulting representations into numerical probabilities. With access to working memory, it carries out arithmetical operations in combining numerical estimates. Experiments corroborated the theory's predictions. Participants concurred in estimates of real possibilities. They violated the complete joint probability distribution in the predicted ways, when they made estimates about conjunctions: P(A), P(B), P(A and B), disjunctions: P(A), P(B), P(A or B or both), and conditional probabilities P(A), P(B), P(B|A). They were faster to estimate the probabilities of compound propositions when they had already estimated the probabilities of each of their components. We discuss the implications of these results for theories of probabilistic reasoning.

  8. Naive Probability: Model-Based Estimates of Unique Events.

    PubMed

    Khemlani, Sangeet S; Lotstein, Max; Johnson-Laird, Philip N

    2015-08-01

    We describe a dual-process theory of how individuals estimate the probabilities of unique events, such as Hillary Clinton becoming U.S. President. It postulates that uncertainty is a guide to improbability. In its computer implementation, an intuitive system 1 simulates evidence in mental models and forms analog non-numerical representations of the magnitude of degrees of belief. This system has minimal computational power and combines evidence using a small repertoire of primitive operations. It resolves the uncertainty of divergent evidence for single events, for conjunctions of events, and for inclusive disjunctions of events, by taking a primitive average of non-numerical probabilities. It computes conditional probabilities in a tractable way, treating the given event as evidence that may be relevant to the probability of the dependent event. A deliberative system 2 maps the resulting representations into numerical probabilities. With access to working memory, it carries out arithmetical operations in combining numerical estimates. Experiments corroborated the theory's predictions. Participants concurred in estimates of real possibilities. They violated the complete joint probability distribution in the predicted ways, when they made estimates about conjunctions: P(A), P(B), P(A and B), disjunctions: P(A), P(B), P(A or B or both), and conditional probabilities P(A), P(B), P(B|A). They were faster to estimate the probabilities of compound propositions when they had already estimated the probabilities of each of their components. We discuss the implications of these results for theories of probabilistic reasoning. PMID:25363706

  9. Web-based experiments controlled by JavaScript: an example from probability learning.

    PubMed

    Birnbaum, Michael H; Wakcher, Sandra V

    2002-05-01

    JavaScript programs can be used to control Web experiments. This technique is illustrated by an experiment that tested the effects of advice on performance in the classic probability-learning paradigm. Previous research reported that people tested via the Web or in the lab tended to match the probabilities of their responses to the probabilities that those responses would be reinforced. The optimal strategy, however, is to consistently choose the more frequent event; probability matching produces suboptimal performance. We investigated manipulations we reasoned should improve performance. A horse race scenario in which participants predicted the winner in each of a series of races between two horses was compared with an abstract scenario used previously. Ten groups of learners received different amounts of advice, including all combinations of (1) explicit instructions concerning the optimal strategy, (2) explicit instructions concerning a monetary sum to maximize, and (3) accurate information concerning the probabilities of events. The results showed minimal effects of horse race versus abstract scenario. Both advice concerning the optimal strategy and probability information contributed significantly to performance in the task. This paper includes a brief tutorial on JavaScript, explaining with simple examples how to assemble a browser-based experiment.
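
    A quick simulation (written in Python here, although the paper's tutorial covers JavaScript) of why probability matching is suboptimal; the reinforcement probability of 0.7 is an illustrative value, not one taken from the experiment:

```python
# Illustration: with reinforcement probability 0.7, always predicting the more
# frequent event wins ~70% of trials, while probability matching wins only
# about 0.7*0.7 + 0.3*0.3 = 58% on average.
import numpy as np

rng = np.random.default_rng(42)
p, n_trials = 0.7, 100_000
outcomes = rng.random(n_trials) < p            # True when the frequent event occurs

maximize = np.ones(n_trials, dtype=bool)       # always predict the frequent event
match = rng.random(n_trials) < p               # predict it on ~70% of trials

print("maximizing accuracy:", (maximize == outcomes).mean())
print("matching accuracy:  ", (match == outcomes).mean())
```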

  10. Probability based hydrologic catchments of the Greenland Ice Sheet

    NASA Astrophysics Data System (ADS)

    Hudson, B. D.

    2015-12-01

    Greenland Ice Sheet melt water impacts ice sheet flow dynamics, fjord and coastal circulation, and sediment and biogeochemical fluxes. Melt water exiting the ice sheet also is a key term in its mass balance. Because of this, knowledge of the area of the ice sheet that contributes melt water to a given outlet (its hydrologic catchment) is important to many ice sheet studies and is especially critical to methods using river runoff to assess ice sheet mass balance. Yet uncertainty in delineating ice sheet hydrologic catchments is a problem that is rarely acknowledged. Ice sheet catchments are delineated as a function of both basal and surface topography. While surface topography is well known, basal topography is less certain because it is dependent on radar surveys. Here, I present a Monte Carlo based approach to delineating ice sheet catchments that quantifies the impact of uncertain basal topography. In this scheme, over many iterations I randomly vary the ice sheet bed elevation within published error bounds (using Morlighem et al., 2014 bed and bed error datasets). For each iteration of ice sheet bed elevation, I calculate the hydraulic potentiometric surface and route water over its path of 'steepest' descent to delineate the catchment. I then use all realizations of the catchment to arrive at a probability map of all major melt water outlets in Greenland. I often find that catchment size is uncertain, with small, random perturbations in basal topography leading to large variations in catchment size. While some catchments are well defined, others can double or halve in size within published basal topography error bars. While some uncertainty will likely always remain, this work points to locations where studies of ice sheet hydrology would be the most successful, allows reinterpretation of past results, and points to where future radar surveys would be most advantageous.
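
    A heavily simplified sketch of the Monte Carlo idea on a synthetic grid (not the Morlighem et al. data, and omitting most of the real routing details): perturb the bed within an error bound, form a Shreve-type hydraulic potential, follow steepest descent to an outlet, and count how often a chosen cell drains to each outlet:

```python
# Highly simplified sketch of the Monte Carlo catchment idea on synthetic grids.
import numpy as np

rng = np.random.default_rng(7)
ny, nx = 40, 40
x, y = np.meshgrid(np.linspace(0, 1, nx), np.linspace(0, 1, ny))
surface = 1500.0 * (1.0 - x) + 50.0 * y        # ice surface (m), synthetic
bed = 200.0 * np.sin(6 * y) * x                # bed elevation (m), synthetic
bed_error = 100.0                              # +/- bound on bed elevation (m)

RHO_I, RHO_W, G = 917.0, 1000.0, 9.81

def outlet_of(potential, start):
    """Follow steepest descent on the potential grid until a local minimum."""
    cell = start
    while True:
        i, j = cell
        nbrs = [(i + di, j + dj) for di in (-1, 0, 1) for dj in (-1, 0, 1)
                if 0 <= i + di < ny and 0 <= j + dj < nx]
        best = min(nbrs, key=lambda c: potential[c])
        if potential[best] >= potential[cell]:
            return cell                        # local minimum = outlet
        cell = best

start = (20, 30)                               # one interior cell to track
counts = {}
for _ in range(50):                            # Monte Carlo iterations
    b = bed + rng.uniform(-bed_error, bed_error, size=bed.shape)
    # Shreve-type hydraulic potential (Pa): ice overburden plus elevation head.
    phi = RHO_I * G * (surface - b) + RHO_W * G * b
    o = outlet_of(phi, start)
    counts[o] = counts.get(o, 0) + 1

print({k: v / 50 for k, v in sorted(counts.items(), key=lambda kv: -kv[1])})
```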

  11. Analysis of blocking probability for OFDM-based variable bandwidth optical network

    NASA Astrophysics Data System (ADS)

    Gong, Lei; Zhang, Jie; Zhao, Yongli; Lin, Xuefeng; Wu, Yuyao; Gu, Wanyi

    2011-12-01

    Orthogonal Frequency Division Multiplexing (OFDM) has recently been proposed as a modulation technique for optical networks. Because of its good spectral efficiency, flexibility, and tolerance to impairments, optical OFDM is much more flexible than traditional WDM systems, enabling elastic bandwidth transmission, and it represents a future trend in optical networking. In OFDM-based optical networks, the analysis of blocking probability is of great significance for network assessment. Current research on WDM networks is basically based on fixed bandwidth; in order to accommodate future services and the fast-changing development of optical networks, our study is based on variable-bandwidth OFDM-based optical networks. Building on existing theory and algorithms, we apply mathematical analysis and theoretical derivation to study the blocking probability of variable-bandwidth optical networks, and then build a model for the blocking probability.
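
    For context, the classic fixed-bandwidth baseline that variable-bandwidth analyses are compared against is the Erlang B blocking probability; a standard recursive evaluation is shown below (traffic values are illustrative, and this is not the paper's variable-bandwidth model):

```python
# Erlang B blocking probability via its standard recursion:
# B(E, 0) = 1,  B(E, m) = E*B(E, m-1) / (m + E*B(E, m-1)).
def erlang_b(offered_load_erlangs: float, channels: int) -> float:
    b = 1.0
    for m in range(1, channels + 1):
        b = offered_load_erlangs * b / (m + offered_load_erlangs * b)
    return b

for spectrum_slots in (8, 16, 32):             # e.g. groups of OFDM subcarriers
    print(spectrum_slots, "slots ->", round(erlang_b(10.0, spectrum_slots), 4))
```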

  12. Probability-based tampering detection scheme for digital images

    NASA Astrophysics Data System (ADS)

    Hsu, Ching-Sheng; Tu, Shu-Fen

    2010-05-01

    In recent years, digital watermarking technology has been widely used for property rights protection and integrity authentication of digital images. Image integrity authentication is usually done by a fragile watermarking scheme. When authenticating image integrity, one must extract the embedded authentication message from the image for comparison with the image feature to identify whether the image has been tampered with, and if so, locate the affected area. However, such authentication schemes may result in detection errors: the tampered area may be misjudged as not having been tampered with, or vice versa. Hence, methods that effectively reduce errors in tampering detection have become an important research topic. This study aims to integrate probability theory to improve image tampering detection accuracy and precision. The scheme includes two processes: the embedding of an image authentication message and tampering detection. In the tampering detection process, in addition to identifying whether the image has been tampered with and locating the tampered area through the authentication message embedded in the image, probability theory is employed to refine the previously obtained detection results and enhance authentication accuracy. The experimental results reveal that the proposed scheme performs well in terms of detection precision and authentication accuracy rate.

  13. A Selective Vision and Landmark based Approach to Improve the Efficiency of Position Probability Grid Localization

    NASA Astrophysics Data System (ADS)

    Loukianov, Andrey A.; Sugisaka, Masanori

    This paper presents a vision and landmark based approach to improve the efficiency of probability grid Markov localization for mobile robots. The proposed approach uses visual landmarks that can be detected by a rotating video camera on the robot. We assume that visual landmark positions in the map are known and that each landmark can be assigned to a certain landmark class. The method uses the classes of observed landmarks and their relative arrangement to select regions in the robot posture space where the location probability density function is to be updated. Subsequent computations are performed only in these selected update regions, thus the computational workload is significantly reduced. The probabilistic landmark-based localization method and details of the map and robot perception are discussed. A technique to compute the update regions and their parameters for selective computation is introduced. Simulation results are presented to show the effectiveness of the approach.
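
    A toy 1-D version of the selective-update idea, with hypothetical geometry and likelihoods: only grid cells inside the region selected by the observed landmark are reweighted, while the rest keep their prior values:

```python
# Minimal 1-D sketch of a selective position-probability-grid update (a toy
# version of the idea above): cells outside the region selected by the observed
# landmark keep their prior weight, so computation is spent only on that region.
import numpy as np

n_cells = 200                                # 1-D posture grid
belief = np.full(n_cells, 1.0 / n_cells)     # uniform prior

landmark_cell = 120                          # known map position of the observed landmark class
measured_range = 15                          # measured distance to the landmark, in cells (noisy)
sigma = 4.0

# Selective update: only cells whose predicted range is plausible get reweighted.
region = np.arange(landmark_cell - measured_range - 3 * int(sigma),
                   landmark_cell - measured_range + 3 * int(sigma) + 1)
predicted_range = landmark_cell - region
likelihood = np.exp(-0.5 * ((predicted_range - measured_range) / sigma) ** 2)

belief[region] *= likelihood                 # cells outside `region` are untouched
belief /= belief.sum()                       # renormalize

print("most probable cell:", int(belief.argmax()))
```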

  14. Accelerating rejection-based simulation of biochemical reactions with bounded acceptance probability

    NASA Astrophysics Data System (ADS)

    Thanh, Vo Hong; Priami, Corrado; Zunino, Roberto

    2016-06-01

    Stochastic simulation of large biochemical reaction networks is often computationally expensive due to disparate reaction rates and the high variability of the populations of chemical species. An approach to accelerate the simulation is to allow multiple reaction firings before performing an update, by assuming that reaction propensities change by a negligible amount during a time interval. Species with small populations involved in the firings of fast reactions significantly affect both the performance and accuracy of this simulation approach. It is even worse when these small-population species are involved in a large number of reactions. We present in this paper a new approximate algorithm to cope with this problem. It is based on bounding the acceptance probability of a reaction selected by the exact rejection-based simulation algorithm, which employs propensity bounds of reactions and the rejection-based mechanism to select the next reaction firings. The reaction is ensured to be selected to fire with an acceptance rate greater than a predefined probability, and the selection becomes exact if that probability is set to one. Our new algorithm improves the computational cost of selecting the next reaction firing and reduces the cost of updating the reaction propensities.
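
    A hedged sketch of the rejection-based selection step alone (not the full simulation loop, and not the paper's bounded-acceptance refinement): candidates are drawn in proportion to propensity upper bounds and accepted with probability a/ā, so exact propensities are evaluated only on demand. The reactions, rates, and bounds are hypothetical:

```python
# Hedged sketch of rejection-based reaction selection with propensity upper bounds.
import random

random.seed(3)

# Hypothetical reactions: exact propensity functions and precomputed upper bounds
# (the bounds are assumed valid while species counts stay within their intervals).
propensity = {
    "R1": lambda s: 0.1 * s["A"],
    "R2": lambda s: 0.002 * s["A"] * s["B"],
    "R3": lambda s: 5.0,
}
upper_bound = {"R1": 12.0, "R2": 25.0, "R3": 5.0}
state = {"A": 100, "B": 80}

def select_reaction(state):
    total_ub = sum(upper_bound.values())
    while True:                                       # rejection loop
        r = random.uniform(0.0, total_ub)
        for name, ub in upper_bound.items():          # pick a candidate proportional to its bound
            if r < ub:
                break
            r -= ub
        if random.random() <= propensity[name](state) / ub:
            return name                               # accepted: fire this reaction

print(select_reaction(state))
```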

  15. Aircraft detection based on probability model of structural elements

    NASA Astrophysics Data System (ADS)

    Chen, Long; Jiang, Zhiguo

    2014-11-01

    Detecting aircraft is important in the field of remote sensing. In past decades, researchers used various approaches to detect aircraft based on classifiers for whole aircraft. However, with the development of high-resolution images, the internal structure of aircraft should now also be taken into consideration. To address this issue, a novel aircraft detection method for satellite images based on a probabilistic topic model is presented. We model aircraft as connected structural elements rather than holistic features. The proposed method contains two major steps: 1) use a Cascade-Adaboost classifier to identify the structural elements of aircraft; 2) connect these structural elements into aircraft, where the relationships between elements are estimated by a hierarchical topic model. The model places strict spatial constraints on structural elements, which can identify differences between similar features. The experimental results demonstrate the effectiveness of the approach.

  16. Batch Mode Active Sampling based on Marginal Probability Distribution Matching.

    PubMed

    Chattopadhyay, Rita; Wang, Zheng; Fan, Wei; Davidson, Ian; Panchanathan, Sethuraman; Ye, Jieping

    2012-01-01

    Active Learning is a machine learning and data mining technique that selects the most informative samples for labeling and uses them as training data; it is especially useful when there are large amount of unlabeled data and labeling them is expensive. Recently, batch-mode active learning, where a set of samples are selected concurrently for labeling, based on their collective merit, has attracted a lot of attention. The objective of batch-mode active learning is to select a set of informative samples so that a classifier learned on these samples has good generalization performance on the unlabeled data. Most of the existing batch-mode active learning methodologies try to achieve this by selecting samples based on varied criteria. In this paper we propose a novel criterion which achieves good generalization performance of a classifier by specifically selecting a set of query samples that minimizes the difference in distribution between the labeled and the unlabeled data, after annotation. We explicitly measure this difference based on all candidate subsets of the unlabeled data and select the best subset. The proposed objective is an NP-hard integer programming optimization problem. We provide two optimization techniques to solve this problem. In the first one, the problem is transformed into a convex quadratic programming problem and in the second method the problem is transformed into a linear programming problem. Our empirical studies using publicly available UCI datasets and a biomedical image dataset demonstrate the effectiveness of the proposed approach in comparison with the state-of-the-art batch-mode active learning methods. We also present two extensions of the proposed approach, which incorporate uncertainty of the predicted labels of the unlabeled data and transfer learning in the proposed formulation. Our empirical studies on UCI datasets show that incorporation of uncertainty information improves performance at later iterations while our studies on 20

  17. The Effect of Simulation-Based Learning on Prospective Teachers' Inference Skills in Teaching Probability

    ERIC Educational Resources Information Center

    Koparan, Timur; Yilmaz, Gül Kaleli

    2015-01-01

    The effect of simulation-based probability teaching on the prospective teachers' inference skills has been examined with this research. In line with this purpose, it has been aimed to examine the design, implementation and efficiency of a learning environment for experimental probability. Activities were built on modeling, simulation and the…

  18. A Most Probable Point-Based Method for Reliability Analysis, Sensitivity Analysis and Design Optimization

    NASA Technical Reports Server (NTRS)

    Hou, Gene J.-W; Newman, Perry A. (Technical Monitor)

    2004-01-01

    A major step in a most probable point (MPP)-based method for reliability analysis is to determine the MPP. This is usually accomplished by using an optimization search algorithm. The minimum distance associated with the MPP provides a measurement of safety probability, which can be obtained by approximate probability integration methods such as FORM or SORM. The reliability sensitivity equations are derived first in this paper, based on the derivatives of the optimal solution. Examples are provided later to demonstrate the use of these derivatives for better reliability analysis and reliability-based design optimization (RBDO).
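
    A minimal FORM sketch for a toy limit state already expressed in standard normal space (the paper's sensitivity derivations are not reproduced): the MPP is the point on g(u) = 0 closest to the origin, and Pf ≈ Φ(−β) with β = ‖u_MPP‖. The limit-state function below is an illustrative assumption:

```python
# Minimal FORM sketch: find the most probable point (MPP) on a toy limit state
# by constrained optimization, then approximate Pf = Phi(-beta).
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def g(u):
    """Toy limit state in standard normal space: failure when g(u) <= 0."""
    return 3.0 - u[0] - 0.5 * u[1] ** 2

res = minimize(lambda u: np.dot(u, u),               # minimize squared distance to the origin
               x0=np.array([1.0, 1.0]),
               constraints=[{"type": "eq", "fun": g}],
               method="SLSQP")

u_mpp = res.x
beta = np.linalg.norm(u_mpp)
print("MPP:", np.round(u_mpp, 3), " beta =", round(beta, 3), " Pf =", round(norm.cdf(-beta), 5))
```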

  19. A Comparative Study of Probability Collectives Based Multi-agent Systems and Genetic Algorithms

    NASA Technical Reports Server (NTRS)

    Huang, Chien-Feng; Wolpert, David H.; Bieniawski, Stefan; Strauss, Charles E. M.

    2005-01-01

    We compare Genetic Algorithms (GA's) with Probability Collectives (PC), a new framework for distributed optimization and control. In contrast to GA's, PC-based methods do not update populations of solutions. Instead they update an explicitly parameterized probability distribution p over the space of solutions. That updating of p arises as the optimization of a functional of p. The functional is chosen so that any p that optimizes it should be peaked about good solutions. The PC approach works in both continuous and discrete problems. It does not suffer from the resolution limitation of the finite bit length encoding of parameters into GA alleles. It also has deep connections with both game theory and statistical physics. We review the PC approach using its motivation as the information theoretic formulation of bounded rationality for multi-agent systems. It is then compared with GA's on a diverse set of problems. To handle high dimensional surfaces, in the PC method investigated here p is restricted to a product distribution. Each distribution in that product is controlled by a separate agent. The test functions were selected for their difficulty using either traditional gradient descent or genetic algorithms. On those functions the PC-based approach significantly outperforms traditional GA's in rate of descent, resistance to trapping in false minima, and long-term optimization.

  20. Open cluster membership probability based on K-means clustering algorithm

    NASA Astrophysics Data System (ADS)

    El Aziz, Mohamed Abd; Selim, I. M.; Essam, A.

    2016-08-01

    In the field of galaxy imaging, the relative coordinate positions of each star with respect to all the other stars are adopted. The membership of a star cluster is therefore assessed by two basic criteria, one for geometric membership and the other for physical (photometric) membership. In this paper, we present a new method for the determination of open cluster membership based on the K-means clustering algorithm. This algorithm allows us to efficiently discriminate cluster members from the field stars. To validate the method we applied it to NGC 188 and NGC 2266, and membership stars in these clusters have been obtained. The color-magnitude diagram of the member stars is significantly clearer and shows a well-defined main sequence and a red giant branch in NGC 188, which allows us to better constrain the cluster members and estimate their physical parameters. The membership probabilities have been calculated and compared to those obtained by other methods. The results show that the K-means clustering algorithm can effectively select probable member stars in space without any assumption about the spatial distribution of stars in the cluster or field. Our results are in good agreement with those derived in previous works.
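
    A minimal sketch of K-means-based membership on synthetic star data (positions plus one colour index); the soft membership probability below is a common distance-based heuristic and not necessarily the paper's exact definition:

```python
# Minimal K-means membership sketch on synthetic star data (positions + colour).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(2)
cluster_stars = np.column_stack([rng.normal(0.0, 0.3, (150, 2)), rng.normal(0.8, 0.1, 150)])
field_stars = np.column_stack([rng.uniform(-3, 3, (300, 2)), rng.uniform(0.0, 2.0, 300)])
X = np.vstack([cluster_stars, field_stars])        # columns: x, y, colour (synthetic)

km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
d = km.transform(X)                                # distances to each cluster centre

# Soft membership: inverse-distance weights normalized across the two clusters.
w = 1.0 / (d + 1e-9)
membership = w / w.sum(axis=1, keepdims=True)

# In this synthetic setup the open cluster sits near (0, 0), so pick that centre.
cluster_id = int(np.argmin(np.linalg.norm(km.cluster_centers_[:, :2], axis=1)))
print("stars with membership probability > 0.8:", int((membership[:, cluster_id] > 0.8).sum()))
```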

  1. A Comprehensive Propagation Prediction Model Comprising Microfacet Based Scattering and Probability Based Coverage Optimization Algorithm

    PubMed Central

    Kausar, A. S. M. Zahid; Wo, Lau Chun

    2014-01-01

    Although ray tracing based propagation prediction models are popular for indoor radio wave propagation characterization, most of them do not provide an integrated approach for achieving the goal of optimum coverage, which is a key part of designing wireless networks. In this paper, an accelerated technique of three-dimensional ray tracing is presented, where rough surface scattering is included to make the ray tracing technique more accurate. Here, the rough surface scattering is represented by microfacets, which makes it possible to compute the scattered field in all possible directions. New optimization techniques, like dual quadrant skipping (DQS) and closest object finder (COF), are implemented for fast characterization of wireless communications and to make the ray tracing technique more efficient. A probability based coverage optimization algorithm is then combined with the ray tracing technique to form a compact solution for indoor propagation prediction. The proposed technique decreases the ray tracing time by omitting unnecessary objects from ray tracing using the DQS technique and by decreasing the ray-object intersection time using the COF technique. The coverage optimization algorithm, in turn, is based on probability theory and finds the minimum number of transmitters and their corresponding positions required to achieve optimal indoor wireless coverage. Both the space and time complexities of the proposed algorithm improve upon those of existing algorithms. For the verification of the proposed ray tracing technique and coverage algorithm, detailed simulation results for different scattering factors, different antenna types, and different operating frequencies are presented. Furthermore, the proposed technique is verified by experimental results. PMID:25202733
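
    As a rough illustration of the coverage-optimization idea (a greedy stand-in, not the authors' algorithm), the sketch below keeps adding candidate transmitter positions until every receiver point is covered with at least a target probability; in practice the per-transmitter coverage probabilities would come from the ray-traced predictions.

```python
import numpy as np

def min_transmitters(p_cov, target=0.9):
    """Greedy selection of transmitter sites.

    p_cov[t, r] = predicted probability that transmitter t covers receiver point r
    (e.g. derived from ray-traced received power). Returns the chosen transmitter indices.
    """
    n_tx, n_rx = p_cov.shape
    p_miss = np.ones(n_rx)            # probability that a point is still uncovered
    chosen = []
    while np.any(1.0 - p_miss < target) and len(chosen) < n_tx:
        remaining = [t for t in range(n_tx) if t not in chosen]
        # Pick the transmitter that adds the most newly covered probability mass,
        # assuming independent coverage events.
        gains = [np.sum(p_miss * p_cov[t]) for t in remaining]
        best = remaining[int(np.argmax(gains))]
        chosen.append(best)
        p_miss *= 1.0 - p_cov[best]
    return chosen

rng = np.random.default_rng(1)
print(min_transmitters(rng.uniform(0.0, 0.8, size=(12, 200))))
```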

  3. An efficient surrogate-based method for computing rare failure probability

    NASA Astrophysics Data System (ADS)

    Li, Jing; Li, Jinglai; Xiu, Dongbin

    2011-10-01

    In this paper, we present an efficient numerical method for evaluating rare failure probability. The method is based on a recently developed surrogate-based method from Li and Xiu [J. Li, D. Xiu, Evaluation of failure probability via surrogate models, J. Comput. Phys. 229 (2010) 8966-8980] for failure probability computation. The method by Li and Xiu is of hybrid nature, in the sense that samples of both the surrogate model and the true physical model are used, and its efficiency gain relies on using only very few samples of the true model. Here we extend the capability of the method to rare probability computation by using the idea of importance sampling (IS). In particular, we employ the cross-entropy (CE) method, which is an effective method to determine the biasing distribution in IS. We demonstrate that, by combining with the CE method, a surrogate-based IS algorithm can be constructed and is highly efficient for rare failure probability computation: it incurs much reduced simulation effort compared to the traditional CE-IS method. In many cases, the new method is capable of capturing failure probabilities as small as 10^-12 to 10^-6 with only a few hundred samples.
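
    A compact sketch of the cross-entropy importance-sampling idea the paper builds on, with a cheap analytic function standing in for the surrogate model; the limit state, the Gaussian biasing family, and the thresholds are illustrative assumptions.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
d, n, rho = 2, 5000, 0.1
g = lambda x: 6.0 - x.sum(axis=1)        # failure when g(x) <= 0; here a cheap surrogate

# Cross-entropy iterations: shift a Gaussian biasing density toward the failure region.
mu = np.zeros(d)
for _ in range(20):
    x = rng.normal(mu, 1.0, size=(n, d))
    y = g(x)
    gamma = max(np.quantile(y, rho), 0.0)  # intermediate level (elite quantile)
    mu = x[y <= gamma].mean(axis=0)        # CE update of the biasing mean
    if gamma == 0.0:
        break

# Importance-sampling estimate under the final biasing density.
x = rng.normal(mu, 1.0, size=(n, d))
w = np.exp(norm.logpdf(x).sum(axis=1) - norm.logpdf(x, loc=mu).sum(axis=1))
p_fail = np.mean((g(x) <= 0.0) * w)
print(f"estimated failure probability ~ {p_fail:.2e}")  # exact: Phi(-6/sqrt(2)) ~ 1.1e-5
```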

  4. Subjective-probability-based scenarios for uncertain input parameters: Stratospheric ozone depletion

    SciTech Connect

    Hammitt, J.K.

    1990-04-01

    Risk analysis often depends on complex, computer-based models to describe links between policies (e.g., required emission-control equipment) and consequences (e.g., probabilities of adverse health effects). Appropriate specification of many model aspects is uncertain, including details of the model structure; transport, reaction-rate, and other parameters; and application-specific inputs such as pollutant-release rates. Because these uncertainties preclude calculation of the precise consequences of a policy, it is important to characterize the plausible range of effects. In principle, a probability distribution function for the effects can be constructed using Monte Carlo analysis, but the combinatorics of multiple uncertainties and the often high cost of model runs quickly exhaust available resources. A method to choose sets of input conditions (scenarios) that efficiently represent knowledge about the joint probability distribution of inputs is presented and applied. A simple score function approximately relating inputs to a policy-relevant output, in this case, globally averaged stratospheric ozone depletion, is developed. The probability density function for the score-function value is analytically derived from a subjective joint probability density for the inputs. Scenarios are defined by selected quantiles of the score function. Using this method, scenarios can be systematically selected in terms of the approximate probability distribution function for the output of concern, and probability intervals for the joint effect of the inputs can be readily constructed.
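
    The scenario-selection step can be pictured in a few lines of Monte Carlo: draw inputs from their subjective joint distribution, evaluate an approximate score function, and pick the input sets that land at chosen quantiles of the score. The input distributions and score function below are placeholders, not Hammitt's ozone-depletion model.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Subjective (placeholder) input distributions: emission growth and reaction-rate factor.
emissions = rng.lognormal(mean=0.0, sigma=0.3, size=n)
rate_factor = rng.normal(loc=1.0, scale=0.1, size=n)

# Simple score function approximating the policy-relevant output (illustrative).
score = emissions * rate_factor

# Scenarios defined by selected quantiles of the score function.
quantiles = [0.05, 0.50, 0.95]
targets = np.quantile(score, quantiles)
idx = [int(np.argmin(np.abs(score - t))) for t in targets]
for q, i in zip(quantiles, idx):
    print(f"q={q:.2f}: emissions={emissions[i]:.3f}, rate={rate_factor[i]:.3f}, score={score[i]:.3f}")
```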

  5. PROBABILITY BASED CORROSION CONTROL FOR HIGH LEVEL WASTE TANKS: INTERIM REPORT

    SciTech Connect

    Hoffman, E; Karthik Subramanian, K

    2008-04-23

    Controls on the solution chemistry (minimum nitrite and hydroxide concentrations) are in place to prevent the initiation and propagation of pitting and stress corrosion cracking in high level waste (HLW) tanks. These controls are based upon a series of experiments performed on carbon steel coupons in simulated waste solutions. An experimental program was undertaken to investigate reducing the minimum molar nitrite concentration required to confidently inhibit pitting. A statistical basis to quantify the probability of pitting for the tank wall, when exposed to various dilute solutions, is being developed. Electrochemical and coupon testing are being performed within the framework of the statistical test matrix to determine the minimum necessary inhibitor concentrations and to develop a quantitative model to predict pitting propensity. A subset of the original statistical test matrix was used to develop an applied understanding of the corrosion response of the carbon steel in the various environments. The interim results suggest that there exists some critical nitrite concentration that sufficiently inhibits localized corrosion mechanisms due to nitrates/chlorides/sulfates, beyond which further nitrite additions are unnecessary. The combination of visual observation and cyclic potentiodynamic polarization scans indicates the potential for significant inhibitor reductions without consequence, specifically at nitrate concentrations near 1 M. The complete data sets will be used to determine the statistical basis to confidently inhibit pitting using nitrite inhibition with the current pH controls. Once complete, a revised chemistry control program will be devised based upon the probability of pitting specifically for dilute solutions, which will allow for tank-specific chemistry control implementation.

  6. Probability-Based Inference in a Domain of Proportional Reasoning Tasks.

    ERIC Educational Resources Information Center

    Beland, Anne; Mislevy, Robert J.

    Probability-based inference is described in the context of test theory for cognitive assessment. Its application is illustrated with an example concerning proportional reasoning. The statistical framework is that of inference networks. Ideas are demonstrated with data from a test of proportional reasoning based on work by G. Noelting (1980). The…

  7. Value and probability coding in a feedback-based learning task utilizing food rewards.

    PubMed

    Tricomi, Elizabeth; Lempert, Karolina M

    2015-01-01

    For the consequences of our actions to guide behavior, the brain must represent different types of outcome-related information. For example, an outcome can be construed as negative because an expected reward was not delivered or because an outcome of low value was delivered. Thus behavioral consequences can differ in terms of the information they provide about outcome probability and value. We investigated the role of the striatum in processing probability-based and value-based negative feedback by training participants to associate cues with food rewards and then employing a selective satiety procedure to devalue one food outcome. Using functional magnetic resonance imaging, we examined brain activity related to receipt of expected rewards, receipt of devalued outcomes, omission of expected rewards, omission of devalued outcomes, and expected omissions of an outcome. Nucleus accumbens activation was greater for rewarding outcomes than devalued outcomes, but activity in this region did not correlate with the probability of reward receipt. Activation of the right caudate and putamen, however, was largest in response to rewarding outcomes relative to expected omissions of reward. The dorsal striatum (caudate and putamen) at the time of feedback also showed a parametric increase correlating with the trialwise probability of reward receipt. Our results suggest that the ventral striatum is sensitive to the motivational relevance, or subjective value, of the outcome, while the dorsal striatum codes for a more complex signal that incorporates reward probability. Value and probability information may be integrated in the dorsal striatum, to facilitate action planning and allocation of effort.

  8. An algorithm for computing nucleic acid base-pairing probabilities including pseudoknots.

    PubMed

    Dirks, Robert M; Pierce, Niles A

    2004-07-30

    Given a nucleic acid sequence, a recent algorithm allows the calculation of the partition function over secondary structure space including a class of physically relevant pseudoknots. Here, we present a method for computing base-pairing probabilities starting from the output of this partition function algorithm. The approach relies on the calculation of recursion probabilities that are computed by backtracking through the partition function algorithm, applying a particular transformation at each step. This transformation is applicable to any partition function algorithm that follows the same basic dynamic programming paradigm. Base-pairing probabilities are useful for analyzing the equilibrium ensemble properties of natural and engineered nucleic acids, as demonstrated for a human telomerase RNA and a synthetic DNA nanostructure. PMID:15139042

  9. Flow Regime Based Climatologies of Lightning Probabilities for Spaceports and Airports

    NASA Technical Reports Server (NTRS)

    Bauman, William H., III; Sharp, David; Spratt, Scott; Lafosse, Richard A.

    2008-01-01

    The objective of this work was to provide forecasters with a tool to indicate the warm season climatological probability of one or more lightning strikes within a circle at a site within a specified time interval. This paper described the AMU work conducted in developing flow regime based climatologies of lightning probabilities for the SLF and seven airports in the NWS MLB CWA in east-central Florida. The paper also described the GUI developed by the AMU that is used to display the data for the operational forecasters. There were challenges working with gridded lightning data as well as the code that accompanied the gridded data. The AMU modified the provided code to be able to produce the climatologies of lightning probabilities based on eight flow regimes for 5-, 10-, 20-, and 30-n mi circles centered on eight sites in 1-, 3-, and 6-hour increments.

  10. How might Model-based Probabilities Extracted from Imperfect Models Guide Rational Decisions: The Case for non-probabilistic odds

    NASA Astrophysics Data System (ADS)

    Smith, Leonard A.

    2010-05-01

    This contribution examines whether or not probabilities based on imperfect models can be expected to yield probabilistic odds which are sustainable. Evidence is provided that suggests this is not the case. Even with very good models (good in a Root-Mean-Square sense), the risk of ruin of probabilistic odds is significantly higher than might be expected. Methods for constructing model-based non-probabilistic odds which are sustainable are discussed. The aim here is to be relevant to real world decision support, and so unrealistic assumptions of equal knowledge, equal compute power, or equal access to information are to be avoided. Finally, the use of non-probabilistic odds as a method for communicating deep uncertainty (uncertainty in a probability forecast itself) is discussed in the context of other methods, such as stating one's subjective probability that the models will prove inadequate in each particular instance (that is, the Probability of a "Big Surprise").

  11. HABITAT ASSESSMENT USING A RANDOM PROBABILITY BASED SAMPLING DESIGN: ESCAMBIA RIVER DELTA, FLORIDA

    EPA Science Inventory

    Smith, Lisa M., Darrin D. Dantin and Steve Jordan. In press. Habitat Assessment Using a Random Probability Based Sampling Design: Escambia River Delta, Florida (Abstract). To be presented at the SWS/GERS Fall Joint Society Meeting: Communication and Collaboration: Coastal Systems...

  12. Learning Probabilities in Computer Engineering by Using a Competency- and Problem-Based Approach

    ERIC Educational Resources Information Center

    Khoumsi, Ahmed; Hadjou, Brahim

    2005-01-01

    Our department has redesigned its electrical and computer engineering programs by adopting a learning methodology based on competence development, problem solving, and the realization of design projects. In this article, we show how this pedagogical approach has been successfully used for learning probabilities and their application to computer…

  13. The Role of Probability-Based Inference in an Intelligent Tutoring System.

    ERIC Educational Resources Information Center

    Mislevy, Robert J.; Gitomer, Drew H.

    Probability-based inference in complex networks of interdependent variables is an active topic in statistical research, spurred by such diverse applications as forecasting, pedigree analysis, troubleshooting, and medical diagnosis. This paper concerns the role of Bayesian inference networks for updating student models in intelligent tutoring…

  14. Performance of the Rayleigh task based on the posterior probability of tomographic reconstructions

    SciTech Connect

    Hanson, K.M.

    1991-01-01

    We seek the best possible performance of the Rayleigh task in which one must decide whether a perceived object is a pair of Gaussian-blurred points or a blurred line. Two Bayesian reconstruction algorithms are used, the first based on a Gaussian prior-probability distribution with a nonnegativity constraint and the second based on an entropic prior. In both cases, the reconstructions are found that maximize the posterior probability. We compare the performance of the Rayleigh task obtained with two decision variables, the logarithm of the posterior probability ratio and the change in the mean-squared deviation from the reconstruction. The method of evaluation is based on the results of a numerical testing procedure in which the stated discrimination task is carried out on reconstructions of a randomly generated sequence of images. The ability to perform the Rayleigh task is summarized in terms of a discrimination index that is derived from the area under the receiver-operating characteristic (ROC) curve. We find that the use of the posterior probability does not result in better performance of the Rayleigh task than the mean-squared deviation from the reconstruction. 10 refs., 6 figs.
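
    The ROC-derived discrimination index mentioned above is commonly computed as d_A = sqrt(2) * Phi^-1(AUC) under equal-variance Gaussian assumptions; below is a small sketch with synthetic decision-variable samples (the relation and data are illustrative, not taken from the report).

```python
import numpy as np
from scipy.stats import norm

def auc_from_samples(scores_signal, scores_noise):
    # Probability that a "two points" decision variable exceeds a "line" one
    # (Mann-Whitney estimate of the area under the ROC curve).
    diff = scores_signal[:, None] - scores_noise[None, :]
    return (np.sum(diff > 0) + 0.5 * np.sum(diff == 0)) / diff.size

rng = np.random.default_rng(0)
two_points = rng.normal(1.2, 1.0, 400)   # decision variable for "pair of points" images
line = rng.normal(0.0, 1.0, 400)         # decision variable for "blurred line" images

auc = auc_from_samples(two_points, line)
d_index = np.sqrt(2.0) * norm.ppf(auc)   # discrimination index under Gaussian assumptions
print(f"AUC = {auc:.3f}, d_A = {d_index:.2f}")
```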

  15. Probability-based differential normalized fluorescence bivariate analysis for the classification of tissue autofluorescence spectra.

    PubMed

    Wang, Gufeng; Platz, Charles P; Geng, M Lei

    2006-05-01

    Differential normalized fluorescence (DNF) is an efficient and effective method for the differentiation of normal and cancerous tissue fluorescence spectra. The diagnostic features are extracted from the difference between the averaged cancerous and averaged normal tissue spectra and used as indices in tissue classification. In this paper, a new method, probability-based DNF bivariate analysis, is introduced based on the univariate DNF method. Two differentiation features are used concurrently in the new method to achieve better classification accuracy. The probability of each sample belonging to a disease state is determined with Bayes decision theory. This probability approach classifies the tissue spectra according to disease states and provides uncertainty information on classification. With a data set of 57 colonic tissue sites, probability-based DNF bivariate analysis is demonstrated to improve the accuracy of cancer diagnosis. The bivariate DNF analysis only requires the collection of a few data points across the entire emission spectrum and has the potential of improving data acquisition speed in tissue imaging.
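
    A hedged sketch of the bivariate Bayes-decision step: model each class (normal vs. cancerous) by a 2-D Gaussian over the two DNF features and report the posterior probability of disease. The feature values, densities, and prior below are synthetic; in the paper they are estimated from tissue spectra.

```python
import numpy as np
from scipy.stats import multivariate_normal

# Class-conditional densities over the two DNF features (synthetic parameters).
normal_pdf = multivariate_normal(mean=[0.10, 0.05], cov=[[0.02, 0.0], [0.0, 0.01]])
cancer_pdf = multivariate_normal(mean=[0.25, 0.20], cov=[[0.03, 0.0], [0.0, 0.02]])
prior_cancer = 0.3

def posterior_cancer(x):
    """Bayes rule: P(cancer | features x) for a 2-vector of DNF indices."""
    pc = cancer_pdf.pdf(x) * prior_cancer
    pn = normal_pdf.pdf(x) * (1.0 - prior_cancer)
    return pc / (pc + pn)

x_new = np.array([0.22, 0.15])
p = posterior_cancer(x_new)
print(f"P(cancer | x) = {p:.2f} -> classified as {'cancerous' if p > 0.5 else 'normal'}")
```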

  16. Reconciliation of Decision-Making Heuristics Based on Decision Trees Topologies and Incomplete Fuzzy Probabilities Sets

    PubMed Central

    Doubravsky, Karel; Dohnal, Mirko

    2015-01-01

    Complex decision making tasks of different natures, e.g. economics, safety engineering, ecology and biology, are based on vague, sparse, partially inconsistent and subjective knowledge. Moreover, decision making economists / engineers are usually not willing to invest too much time into the study of complex formal theories. They require decisions which can be (re)checked by human-like common sense reasoning. One important problem related to realistic decision making tasks is incomplete data sets required by the chosen decision making algorithm. This paper presents a relatively simple algorithm showing how some missing IIIs (input information items) can be generated using mainly decision tree topologies and integrated into incomplete data sets. The algorithm is based on an easy to understand heuristic, e.g. that a longer decision tree sub-path is less probable. This heuristic can solve decision problems under total ignorance, i.e. when the decision tree topology is the only information available. In practice, however, isolated information items, e.g. some vaguely known probabilities (e.g. fuzzy probabilities), are usually available, which means that a realistic problem is analysed under partial ignorance. The proposed algorithm reconciles topology related heuristics and additional fuzzy sets using fuzzy linear programming. The case study, represented by a tree with six lotteries and one fuzzy probability, is presented in detail. PMID:26158662

  17. Estimating transition probabilities for stage-based population projection matrices using capture-recapture data

    USGS Publications Warehouse

    Nichols, J.D.; Sauer, J.R.; Pollock, K.H.; Hestbeck, J.B.

    1992-01-01

    In stage-based demography, animals are often categorized into size (or mass) classes, and size-based probabilities of surviving and changing mass classes must be estimated before demographic analyses can be conducted. In this paper, we develop two procedures for the estimation of mass transition probabilities from capture-recapture data. The first approach uses a multistate capture-recapture model that is parameterized directly with the transition probabilities of interest. Maximum likelihood estimates are then obtained numerically using program SURVIV. The second approach involves a modification of Pollock's robust design. Estimation proceeds by conditioning on animals caught in a particular class at time i, and then using closed models to estimate the number of these that are alive in other classes at i + 1. Both methods are illustrated by application to meadow vole, Microtus pennsylvanicus, capture-recapture data. The two methods produced reasonable estimates that were similar. Advantages of these two approaches include the directness of estimation, the absence of need for restrictive assumptions about the independence of survival and growth, the testability of assumptions, and the testability of related hypotheses of ecological interest (e.g., the hypothesis of temporal variation in transition probabilities).
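
    In the simplest complete-detection case the mass-class transition probabilities reduce to counting observed transitions; the multistate capture-recapture machinery in the paper generalizes this to imperfect detection. The class labels and capture data below are invented for illustration.

```python
import numpy as np

# Mass classes observed for each animal at occasions i and i+1 (0 = small, 1 = medium, 2 = large).
class_i  = np.array([0, 0, 1, 1, 1, 2, 2, 0, 1, 2])
class_i1 = np.array([0, 1, 1, 2, 1, 2, 2, 0, 2, 1])

n_classes = 3
counts = np.zeros((n_classes, n_classes))
for a, b in zip(class_i, class_i1):
    counts[a, b] += 1

# Row-normalize to get psi[a, b] = P(in class b at i+1 | in class a at i, alive).
psi = counts / counts.sum(axis=1, keepdims=True)
print(np.round(psi, 2))
```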

  18. Differential Survival in Europe and the United States: Estimates Based on Subjective Probabilities of Survival

    PubMed Central

    Delavande, Adeline; Rohwedder, Susann

    2013-01-01

    Cross-country comparisons of differential survival by socioeconomic status (SES) are useful in many domains. Yet, to date, such studies have been rare. Reliably estimating differential survival in a single country has been challenging because it requires rich panel data with a large sample size. Cross-country estimates have proven even more difficult because the measures of SES need to be comparable internationally. We present an alternative method for acquiring information on differential survival by SES. Rather than using observations of actual survival, we relate individuals’ subjective probabilities of survival to SES variables in cross section. To show that subjective survival probabilities are informative proxies for actual survival when estimating differential survival, we compare estimates of differential survival based on actual survival with estimates based on subjective probabilities of survival for the same sample. The results are remarkably similar. We then use this approach to compare differential survival by SES for 10 European countries and the United States. Wealthier people have higher survival probabilities than those who are less wealthy, but the strength of the association differs across countries. Nations with a smaller gradient appear to be Belgium, France, and Italy, while the United States, England, and Sweden appear to have a larger gradient. PMID:22042664

  19. The relationship study between image features and detection probability based on psychology experiments

    NASA Astrophysics Data System (ADS)

    Lin, Wei; Chen, Yu-hua; Wang, Ji-yuan; Gao, Hong-sheng; Wang, Ji-jun; Su, Rong-hua; Mao, Wei

    2011-04-01

    Detection probability is an important index for representing and estimating target viability, and it provides a basis for target recognition and decision-making. Obtaining detection probability in practice, however, requires a great deal of time and manpower, and because interpreters differ in practical knowledge and experience, the resulting data often vary widely. By studying the relationship between image features and perceived quantity through psychology experiments, a probability model has been established as follows. First, four image features that directly affect detection were extracted and quantified, and four feature similarity degrees between target and background were defined. Second, the relationship between each single image-feature similarity degree and perceived quantity was established based on psychological principles, and psychological experiments on target interpretation were designed, involving about five hundred interpreters and two hundred images. To reduce the correlation between image features, a large number of synthetic images were produced, each differing from its background in only a single feature: brightness, chromaticity, texture, or shape. By analyzing and fitting a large body of experimental data, the model quantities were determined. Finally, by applying statistical decision theory to the experimental results, the relationship between perceived quantity and target detection probability was found. Verified against a large number of target interpretations in practice, the model can provide the target detection probability quickly and objectively.

  20. Epistemic-based investigation of the probability of hazard scenarios using Bayesian network for the lifting operation of floating objects

    NASA Astrophysics Data System (ADS)

    Toroody, Ahmad Bahoo; Abaiee, Mohammad Mahdi; Gholamnia, Reza; Ketabdari, Mohammad Javad

    2016-09-01

    Owing to the increase in unprecedented accidents with new root causes in almost all operational areas, the importance of risk management has dramatically risen. Risk assessment, one of the most significant aspects of risk management, has a substantial impact on the system-safety level of organizations, industries, and operations. If the causes of all kinds of failure and the interactions between them are considered, effective risk assessment can be highly accurate. A combination of traditional risk assessment approaches and modern scientific probability methods can help in realizing better quantitative risk assessment methods. Most researchers face the problem of minimal field data with respect to the probability and frequency of each failure. Because of this limitation in the availability of epistemic knowledge, it is important to conduct epistemic estimations by applying the Bayesian theory for identifying plausible outcomes. In this paper, we propose an algorithm and demonstrate its application in a case study for a light-weight lifting operation in the Persian Gulf of Iran. First, we identify potential accident scenarios and present them in an event tree format. Next, excluding human error, we use the event tree to roughly estimate the prior probability of other hazard-promoting factors using a minimal amount of field data. We then use the Success Likelihood Index Method (SLIM) to calculate the probability of human error. On the basis of the proposed event tree, we use the Bayesian network of the provided scenarios to compensate for the lack of data. Finally, we determine the resulting probability of each event based on its evidence in the epistemic estimation format by building on two Bayesian network types: the probability of hazard promotion factors and the Bayesian theory. The study results indicate that despite the lack of available information on the operation of floating objects, a satisfactory result can be achieved using epistemic data.
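
    The event-tree part of the workflow can be illustrated in a few lines: each accident scenario's probability is the product of the branch probabilities along its path (initiating event, hazard-promoting factors, and a SLIM-derived human-error probability). The numbers below are made up for illustration and are not the case-study values.

```python
# Probability of each end-state of a small event tree (illustrative numbers).
p_initiating = 1e-2           # lifting-operation upset per lift
p_rigging_fail = 0.05         # hazard-promoting factor (prior estimated from sparse field data)
p_human_error = 0.12          # e.g. from a SLIM-style assessment

scenarios = {
    "safe recovery":          p_initiating * (1 - p_rigging_fail) * (1 - p_human_error),
    "dropped object only":    p_initiating * p_rigging_fail * (1 - p_human_error),
    "dropped + human error":  p_initiating * p_rigging_fail * p_human_error,
}
for name, p in scenarios.items():
    print(f"{name:>22s}: {p:.2e}")
```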

  2. a Probability-Based Statistical Method to Extract Water Body of TM Images with Missing Information

    NASA Astrophysics Data System (ADS)

    Lian, Shizhong; Chen, Jiangping; Luo, Minghai

    2016-06-01

    Water information cannot be accurately extracted from TM images in which true information is lost because of blocking clouds and missing data stripes. Water is continuously distributed under natural conditions; thus, this paper proposes a new method of water body extraction based on probability statistics to improve the accuracy of water information extraction from TM images with missing information. Different disturbances from clouds and missing data stripes are simulated. Water information is extracted using global histogram matching, local histogram matching, and the probability-based statistical method in the simulated images. Experiments show that a smaller Areal Error and a higher Boundary Recall can be obtained using this method compared with the conventional methods.

  3. A simple derivation and classification of common probability distributions based on information symmetry and measurement scale.

    PubMed

    Frank, S A; Smith, E

    2011-03-01

    Commonly observed patterns typically follow a few distinct families of probability distributions. Over one hundred years ago, Karl Pearson provided a systematic derivation and classification of the common continuous distributions. His approach was phenomenological: a differential equation that generated common distributions without any underlying conceptual basis for why common distributions have particular forms and what explains the familial relations. Pearson's system and its descendants remain the most popular systematic classification of probability distributions. Here, we unify the disparate forms of common distributions into a single system based on two meaningful and justifiable propositions. First, distributions follow maximum entropy subject to constraints, where maximum entropy is equivalent to minimum information. Second, different problems associate magnitude to information in different ways, an association we describe in terms of the relation between information invariance and measurement scale. Our framework relates the different continuous probability distributions through the variations in measurement scale that change each family of maximum entropy distributions into a distinct family. From our framework, future work in biology can consider the genesis of common patterns in a new and more general way. Particular biological processes set the relation between the information in observations and magnitude, the basis for information invariance, symmetry and measurement scale. The measurement scale, in turn, determines the most likely probability distributions and observed patterns associated with particular processes. This view presents a fundamentally derived alternative to the largely unproductive debates about neutrality in ecology and evolution. PMID:21265914

  4. A method of classification for multisource data in remote sensing based on interval-valued probabilities

    NASA Technical Reports Server (NTRS)

    Kim, Hakil; Swain, Philip H.

    1990-01-01

    An axiomatic approach to interval-valued (IV) probabilities is presented, where the IV probability is defined by a pair of set-theoretic functions which satisfy some pre-specified axioms. On the basis of this approach, the representation of statistical evidence and the combination of multiple bodies of evidence are emphasized. Although IV probabilities provide an innovative means for the representation and combination of evidential information, they make the decision process rather complicated and entail more intelligent strategies for making decisions. The development of decision rules over IV probabilities is discussed from the viewpoint of statistical pattern recognition. The proposed method, the so-called evidential reasoning method, is applied to the ground-cover classification of a multisource data set consisting of Multispectral Scanner (MSS) data, Synthetic Aperture Radar (SAR) data, and digital terrain data such as elevation, slope, and aspect. By treating the data sources separately, the method is able to capture both parametric and nonparametric information and to combine them. The method is then applied to two separate cases of classifying multiband data obtained by a single sensor. In each case a set of multiple sources is obtained by dividing the dimensionally huge data into smaller and more manageable pieces based on global statistical correlation information. By a divide-and-combine process, the method is able to utilize more features than the conventional maximum likelihood method.

  5. Probability based earthquake load and resistance factor design criteria for offshore platforms

    SciTech Connect

    Bea, R.G.

    1996-12-31

    This paper describes a probability-based reliability formulation to determine earthquake Load and Resistance Factor Design (LRFD) parameters for conventional, steel, pile-supported, tubular-membered platforms; it is proposed as a basis for earthquake design criteria and guidelines for offshore platforms intended to have worldwide applicability. The formulation is illustrated with application to platforms located in five areas: offshore California, Venezuela (Rio Caribe), the East Coast of Canada, the Caspian Sea (Azeri), and the Norwegian sector of the North Sea.

  6. Hit Expansion from Screening Data Based upon Conditional Probabilities of Activity Derived from SAR Matrices.

    PubMed

    Gupta-Ostermann, Disha; Balfer, Jenny; Bajorath, Jürgen

    2015-02-01

    A new methodology for activity prediction of compounds from SAR matrices is introduced that is based upon conditional probabilities of activity. The approach has low computational complexity, is primarily designed for hit expansion from biological screening data, and accurately predicts both active and inactive compounds. Its performance is comparable to state-of-the-art machine learning methods such as support vector machines or Bayesian classification. Matrix-based activity prediction of virtual compounds further extends the spectrum of computational methods for compound design.

  7. Flow Regime Based Climatologies of Lightning Probabilities for Spaceports and Airports

    NASA Technical Reports Server (NTRS)

    Bauman, William H., III; Volmer, Matthew; Sharp, David; Spratt, Scott; Lafosse, Richard A.

    2007-01-01

    Objective: provide forecasters with a "first guess" climatological lightning probability tool. The tool focuses on Space Shuttle landings and NWS TAFs; uses four circles around each site (5-, 10-, 20- and 30-n mi); and covers three time intervals (hourly, every 3 hr, and every 6 hr). It is based on (1) NLDN gridded data, (2) flow regime, and (3) the warm season months of May-Sep for the years 1989-2004. The gridded data and available code yield squares, not circles. Over 850 spreadsheets were converted into a manageable, user-friendly web-based GUI.

  9. Computing light statistics in heterogeneous media based on a mass weighted probability density function method.

    PubMed

    Jenny, Patrick; Mourad, Safer; Stamm, Tobias; Vöge, Markus; Simon, Klaus

    2007-08-01

    Based on the transport theory, we present a modeling approach to light scattering in turbid material. It uses an efficient and general statistical description of the material's scattering and absorption behavior. The model estimates the spatial distribution of intensity and the flow direction of radiation, both of which are required, e.g., for adaptable predictions of the appearance of colors in halftone prints. This is achieved by employing a computational particle method, which solves a model equation for the probability density function of photon positions and propagation directions. In this framework, each computational particle represents a finite probability of finding a photon in a corresponding state, including properties like wavelength. Model evaluations and verifications conclude the discussion.
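
    A stripped-down version of the computational-particle idea: propagate photon "particles" through a slab with given scattering and absorption coefficients and tally where they end up. The one-dimensional geometry, isotropic scattering, and coefficient values are simplifying assumptions, not the paper's model for halftone prints.

```python
import numpy as np

rng = np.random.default_rng(0)
mu_s, mu_a, thickness, n_photons = 5.0, 0.5, 1.0, 50_000  # per-mm coefficients, slab depth in mm
mu_t = mu_s + mu_a

reflected = transmitted = absorbed = 0
for _ in range(n_photons):
    z, cos_theta = 0.0, 1.0                            # start at the top surface, moving downwards
    while True:
        z += cos_theta * rng.exponential(1.0 / mu_t)   # free path to the next interaction
        if z < 0.0:
            reflected += 1; break
        if z > thickness:
            transmitted += 1; break
        if rng.random() < mu_a / mu_t:                 # interaction is an absorption
            absorbed += 1; break
        cos_theta = rng.uniform(-1.0, 1.0)             # isotropic scattering (1-D slab)

print(f"R={reflected/n_photons:.3f}  T={transmitted/n_photons:.3f}  A={absorbed/n_photons:.3f}")
```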

  10. Probability-Based Software for Grid Optimization: Improved Power System Operations Using Advanced Stochastic Optimization

    SciTech Connect

    2012-02-24

    GENI Project: Sandia National Laboratories is working with several commercial and university partners to develop software for market management systems (MMSs) that enable greater use of renewable energy sources throughout the grid. MMSs are used to securely and optimally determine which energy resources should be used to service energy demand across the country. Contributions of electricity to the grid from renewable energy sources such as wind and solar are intermittent, introducing complications for MMSs, which have trouble accommodating the multiple sources of price and supply uncertainties associated with bringing these new types of energy into the grid. Sandia’s software will bring a new, probability-based formulation to account for these uncertainties. By factoring in various probability scenarios for electricity production from renewable energy sources in real time, Sandia’s formula can reduce the risk of inefficient electricity transmission, save ratepayers money, conserve power, and support the future use of renewable energy.

  11. Robust rate-control for wavelet-based image coding via conditional probability models.

    PubMed

    Gaubatz, Matthew D; Hemami, Sheila S

    2007-03-01

    Real-time rate-control for wavelet image coding requires characterization of the rate required to code quantized wavelet data. An ideal robust solution can be used with any wavelet coder and any quantization scheme. A large number of wavelet quantization schemes (perceptual and otherwise) are based on scalar dead-zone quantization of wavelet coefficients. A key to performing rate-control is, thus, fast, accurate characterization of the relationship between rate and quantization step size, the R-Q curve. A solution is presented using two invocations of the coder that estimates the slope of each R-Q curve via probability modeling. The method is robust to choices of probability models, quantization schemes and wavelet coders. Because of extreme robustness to probability modeling, a fast approximation to spatially adaptive probability modeling can be used in the solution, as well. With respect to achieving a target rate, the proposed approach and associated fast approximation yield average percentage errors around 0.5% and 1.0% on images in the test set. By comparison, 2-coding-pass rho-domain modeling yields errors around 2.0%, and post-compression rate-distortion optimization yields average errors of around 1.0% at rates below 0.5 bits-per-pixel (bpp) that decrease down to about 0.5% at 1.0 bpp; both methods exhibit more competitive performance on the larger images. The proposed method and fast approximation approach are also similar in speed to the other state-of-the-art methods. In addition to possessing speed and accuracy, the proposed method does not require any training and can maintain precise control over wavelet step sizes, which adds flexibility to a wavelet-based image-coding system.

  12. Mice plan decision strategies based on previously learned time intervals, locations, and probabilities.

    PubMed

    Tosun, Tuğçe; Gür, Ezgi; Balcı, Fuat

    2016-01-19

    Animals can shape their timed behaviors based on experienced probabilistic relations in a nearly optimal fashion. On the other hand, it is not clear if they adopt these timed decisions by making computations based on previously learnt task parameters (time intervals, locations, and probabilities) or if they gradually develop their decisions based on trial and error. To address this question, we tested mice in the timed-switching task, which required them to anticipate when (after a short or long delay) and at which of the two delay locations a reward would be presented. The probability of short trials differed between test groups in two experiments. Critically, we first trained mice on relevant task parameters by signaling the active trial with a discriminative stimulus and delivered the corresponding reward after the associated delay without any response requirement (without inducing switching behavior). During the test phase, both options were presented simultaneously to characterize the emergence and temporal characteristics of the switching behavior. Mice exhibited timed-switching behavior starting from the first few test trials, and their performance remained stable throughout testing in the majority of the conditions. Furthermore, as the probability of the short trial increased, mice waited longer before switching from the short to long location (experiment 1). These behavioral adjustments were in directions predicted by reward maximization. These results suggest that rather than gradually adjusting their time-dependent choice behavior, mice abruptly adopted temporal decision strategies by directly integrating their previous knowledge of task parameters into their timed behavior, supporting the model-based representational account of temporal risk assessment. PMID:26733674

  13. Probability-confidence-kernel-based localized multiple kernel learning with lp norm.

    PubMed

    Han, Yina; Liu, Guizhong

    2012-06-01

    Localized multiple kernel learning (LMKL) is an attractive strategy for combining multiple heterogeneous features in terms of their discriminative power for each individual sample. However, models that fit excessively to a specific sample hinder the extension to unseen data, while a more general form is often insufficient for diverse locality characterization. Hence, both learning sample-specific local models for each training datum and extending the learned models to unseen test data should be equally addressed in designing an LMKL algorithm. In this paper, for an integrative solution, we propose a probability confidence kernel (PCK), which measures per-sample similarity with respect to a probabilistic-prediction-based class attribute: the class attribute similarity complements the spatial-similarity-based base kernels for more reasonable locality characterization, and the predefined form of the involved class probability density function facilitates the extension to the whole input space and ensures its statistical meaning. Incorporating PCK into a support-vector-machine-based LMKL framework, we propose a new PCK-LMKL with an arbitrary l(p)-norm constraint implied in the definition of PCKs, where both the parameters in PCK and the final classifier can be efficiently optimized in a joint manner. Evaluations of PCK-LMKL on both benchmark machine learning data sets (ten University of California Irvine (UCI) data sets) and challenging computer vision data sets (15-scene data set and Caltech-101 data set) have shown state-of-the-art performance.

  14. Identification of contaminant point source in surface waters based on backward location probability density function method

    NASA Astrophysics Data System (ADS)

    Cheng, Wei Ping; Jia, Yafei

    2010-04-01

    A backward location probability density function (BL-PDF) method capable of identifying location of point sources in surface waters is presented in this paper. The relation of forward location probability density function (FL-PDF) and backward location probability density, based on adjoint analysis, is validated using depth-averaged free-surface flow and mass transport models and several surface water test cases. The solutions of the backward location PDF transport equation agreed well to the forward location PDF computed using the pollutant concentration at the monitoring points. Using this relation and the distribution of the concentration detected at the monitoring points, an effective point source identification method is established. The numerical error of the backward location PDF simulation is found to be sensitive to the irregularity of the computational meshes, diffusivity, and velocity gradients. The performance of identification method is evaluated regarding the random error and number of observed values. In addition to hypothetical cases, a real case was studied to identify the source location where a dye tracer was instantaneously injected into a stream. The study indicated the proposed source identification method is effective, robust, and quite efficient in surface waters; the number of advection-diffusion equations needed to solve is equal to the number of observations.

  15. Comparative assessment of surface fluxes from different sources: a framework based on probability distributions

    NASA Astrophysics Data System (ADS)

    Gulev, S.

    2015-12-01

    Surface turbulent heat fluxes from modern-era and first-generation reanalyses (NCEP-DOE, ERA-Interim, MERRA, NCEP-CFSR, JRA) as well as from satellite products (SEAFLUX, IFREMER, HOAPS) were intercompared using a framework of probability distributions for sensible and latent heat fluxes. For the approximation of probability distributions and the estimation of extreme flux values, the Modified Fisher-Tippett (MFT) distribution has been used. Besides mean flux values, consideration is given to the comparative analysis of (i) parameters of the MFT probability density functions (scale and location), (ii) extreme flux values corresponding to high-order percentiles of fluxes (e.g., 99th and higher) and (iii) the fractional contribution of extreme surface flux events to the total surface turbulent fluxes integrated over months and seasons. The latter was estimated using both the fractional distribution derived from the MFT and empirical estimates based upon occurrence histograms. The strongest differences in the parameters of the probability distributions of surface fluxes and in extreme surface flux values between different reanalyses are found in the western boundary current extension regions and high latitudes, while the highest differences in the fractional contributions of surface fluxes may occur in mid-ocean regions, being closely associated with atmospheric synoptic dynamics. Generally, satellite surface flux products demonstrate relatively stronger extreme fluxes compared to reanalyses, even in the Northern Hemisphere midlatitudes where the data assimilation input in reanalyses is quite dense compared to the Southern Ocean regions. Our assessment also discriminated different reanalyses and satellite products with respect to their ability to quantify the role of extreme surface turbulent fluxes in forming ocean heat release in different regions.
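
    A minimal sketch of fitting a Fisher-Tippett-type distribution to flux samples and reading off a high percentile and the fractional contribution of extremes; scipy's generalized extreme value family is used here as a stand-in for the modified Fisher-Tippett (MFT) form, and the flux data are synthetic.

```python
import numpy as np
from scipy.stats import genextreme

rng = np.random.default_rng(0)
latent_heat_flux = rng.gamma(shape=2.0, scale=60.0, size=5000)  # synthetic daily fluxes, W m^-2

# Fit a generalized extreme value distribution and estimate a high-order percentile.
shape, loc, scale = genextreme.fit(latent_heat_flux)
p99 = genextreme.ppf(0.99, shape, loc=loc, scale=scale)

# Fractional contribution of fluxes above the 99th percentile to the total.
frac_extreme = latent_heat_flux[latent_heat_flux > p99].sum() / latent_heat_flux.sum()
print(f"99th percentile ~ {p99:.0f} W m^-2, extreme fraction ~ {frac_extreme:.1%}")
```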

  16. A multivariate copula-based framework for dealing with hazard scenarios and failure probabilities

    NASA Astrophysics Data System (ADS)

    Salvadori, G.; Durante, F.; De Michele, C.; Bernardi, M.; Petrella, L.

    2016-05-01

    This paper is of methodological nature, and deals with the foundations of Risk Assessment. Several international guidelines have recently recommended to select appropriate/relevant Hazard Scenarios in order to tame the consequences of (extreme) natural phenomena. In particular, the scenarios should be multivariate, i.e., they should take into account the fact that several variables, generally not independent, may be of interest. In this work, it is shown how a Hazard Scenario can be identified in terms of (i) a specific geometry and (ii) a suitable probability level. Several scenarios, as well as a Structural approach, are presented, and due comparisons are carried out. In addition, it is shown how the Hazard Scenario approach illustrated here is well suited to cope with the notion of Failure Probability, a tool traditionally used for design and risk assessment in engineering practice. All the results outlined throughout the work are based on the Copula Theory, which turns out to be a fundamental theoretical apparatus for doing multivariate risk assessment: formulas for the calculation of the probability of Hazard Scenarios in the general multidimensional case (d≥2) are derived, and worthy analytical relationships among the probabilities of occurrence of Hazard Scenarios are presented. In addition, the Extreme Value and Archimedean special cases are dealt with, relationships between dependence ordering and scenario levels are studied, and a counter-example concerning Tail Dependence is shown. Suitable indications for the practical application of the techniques outlined in the work are given, and two case studies illustrate the procedures discussed in the paper.
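
    As a small worked example of the copula formulas derived in general form in the paper, the probabilities of the "AND" and "OR" hazard scenarios under a Gumbel copula; the copula family, its parameter, and the marginal levels are assumptions for illustration.

```python
import numpy as np

def gumbel_copula(u, v, theta):
    """Gumbel copula C(u, v) = exp(-[(-ln u)^theta + (-ln v)^theta]^(1/theta)), theta >= 1."""
    return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

theta = 2.0            # dependence parameter (assumed)
u, v = 0.98, 0.95      # marginal non-exceedance probabilities of the two hazard variables

# AND scenario: P(U > u and V > v), via inclusion-exclusion on the copula.
p_and = 1.0 - u - v + gumbel_copula(u, v, theta)
# OR scenario: P(U > u or V > v).
p_or = 1.0 - gumbel_copula(u, v, theta)
print(f"P(AND) = {p_and:.4f}, P(OR) = {p_or:.4f}")
```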

  17. Forestry inventory based on multistage sampling with probability proportional to size

    NASA Technical Reports Server (NTRS)

    Lee, D. C. L.; Hernandez, P., Jr.; Shimabukuro, Y. E.

    1983-01-01

    A multistage sampling technique, with probability proportional to size, is developed for a forest volume inventory using remote sensing data. The LANDSAT data, Panchromatic aerial photographs, and field data are collected. Based on age and homogeneity, pine and eucalyptus classes are identified. Selection of tertiary sampling units is made through aerial photographs to minimize field work. The sampling errors for eucalyptus and pine ranged from 8.34 to 21.89 percent and from 7.18 to 8.60 percent, respectively.
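
    A tiny sketch of one stage of probability-proportional-to-size selection, where a stand's selection probability is proportional to its photo-estimated size; the sizes, the with-replacement design, and the Hansen-Hurwitz-style weights are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
stand_ids = np.arange(20)
sizes = rng.uniform(5.0, 80.0, size=20)          # e.g. stand areas (ha) from photo-interpretation

p = sizes / sizes.sum()                          # selection probability proportional to size
sample = rng.choice(stand_ids, size=5, replace=True, p=p)   # with-replacement PPS draw

# Hansen-Hurwitz-style expansion weights field measurements by 1 / (n * p_i).
weights = 1.0 / (5 * p[sample])
print(sample, np.round(weights, 1))
```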

  18. Finding significantly connected voxels based on histograms of connection strengths

    NASA Astrophysics Data System (ADS)

    Kasenburg, Niklas; Pedersen, Morten Vester; Darkner, Sune

    2016-03-01

    We explore a new approach for structural connectivity based segmentations of subcortical brain regions. Connectivity based segmentations are usually based on fibre connections from a seed region to predefined target regions. We present a method for finding significantly connected voxels based on the distribution of connection strengths. Paths from seed voxels to all voxels in a target region are obtained from a shortest-path tractography. For each seed voxel we approximate the distribution with a histogram of path scores. We hypothesise that the majority of estimated connections are false-positives and that their connection strength is distributed differently from true-positive connections. Therefore, an empirical null-distribution is defined for each target region as the average normalized histogram over all voxels in the seed region. Single histograms are then tested against the corresponding null-distribution and significance is determined using the false discovery rate (FDR). Segmentations are based on significantly connected voxels and their FDR. In this work we focus on the thalamus and the target regions were chosen by dividing the cortex into a prefrontal/temporal zone, motor zone, somatosensory zone and a parieto-occipital zone. The obtained segmentations consistently show a sparse number of significantly connected voxels that are located near the surface of the anterior thalamus over a population of 38 subjects.
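
    The per-voxel significance test can be sketched as: compare each seed voxel's histogram of path scores against the region-wide null histogram with a goodness-of-fit test, then control the false discovery rate with a Benjamini-Hochberg step-up. The chi-square test and the synthetic histograms below are stand-ins for the paper's exact procedure.

```python
import numpy as np
from scipy.stats import chisquare

rng = np.random.default_rng(0)
n_voxels, n_bins, n_paths = 500, 12, 200

# Null distribution: average normalized histogram over the seed region (synthetic, uniform here).
null_hist = np.ones(n_bins) / n_bins

# Per-voxel histograms of connection-strength scores; the first 50 voxels carry a shifted signal.
signal = np.linspace(1.0, 2.0, n_bins)
signal /= signal.sum()
pvals = np.empty(n_voxels)
for v in range(n_voxels):
    probs = signal if v < 50 else null_hist
    counts = rng.multinomial(n_paths, probs)
    pvals[v] = chisquare(counts, f_exp=null_hist * n_paths).pvalue

# Benjamini-Hochberg FDR at q = 0.05.
q = 0.05
order = np.argsort(pvals)
thresh = q * np.arange(1, n_voxels + 1) / n_voxels
passed = pvals[order] <= thresh
k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
significant = order[:k]
print(f"{len(significant)} voxels significantly connected at FDR {q}")
```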

  19. A classification scheme for edge-localized modes based on their probability distributions

    NASA Astrophysics Data System (ADS)

    Shabbir, A.; Hornung, G.; Verdoolaege, G.

    2016-11-01

    We present here an automated classification scheme which is particularly well suited to scenarios where the parameters have significant uncertainties or are stochastic quantities. To this end, the parameters are modeled with probability distributions in a metric space and classification is conducted using the notion of nearest neighbors. The presented framework is then applied to the classification of type I and type III edge-localized modes (ELMs) from a set of carbon-wall plasmas at JET. This provides a fast, standardized classification of ELM types which is expected to significantly reduce the effort of ELM experts in identifying ELM types. Further, the classification scheme is general and can be applied to various other plasma phenomena as well.
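
    A hedged sketch of classification with nearest neighbors in a space of probability distributions: represent each ELM time series by a histogram of a measured parameter and classify with k-NN under the Hellinger distance between histograms. The features, the distance choice, and the data are illustrative; the paper works with parametric probability models in a metric space.

```python
import numpy as np

def hellinger(p, q):
    """Hellinger distance between two discrete probability distributions."""
    return np.sqrt(0.5 * np.sum((np.sqrt(p) - np.sqrt(q)) ** 2))

def knn_predict(train_hists, train_labels, test_hist, k=3):
    d = np.array([hellinger(test_hist, h) for h in train_hists])
    nearest = np.argsort(d)[:k]
    labels, counts = np.unique(train_labels[nearest], return_counts=True)
    return labels[np.argmax(counts)]

rng = np.random.default_rng(0)
bins = np.linspace(0, 1, 21)

def make_hist(scale):                 # histogram of a noisy ELM parameter (synthetic)
    x = rng.beta(2.0 * scale, 2.0, size=300)
    h, _ = np.histogram(x, bins=bins)
    return h / h.sum()

train_hists = [make_hist(1.0) for _ in range(20)] + [make_hist(3.0) for _ in range(20)]
train_labels = np.array(["type III"] * 20 + ["type I"] * 20)
print(knn_predict(train_hists, train_labels, make_hist(2.8)))
```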

  20. A generative probability model of joint label fusion for multi-atlas based brain segmentation

    PubMed Central

    Wu, Guorong; Wang, Qian; Zhang, Daoqiang; Nie, Feiping; Huang, Heng; Shen, Dinggang

    2013-01-01

    Automated labeling of anatomical structures in medical images is very important in many neuroscience studies. Recently, patch-based labeling has been widely investigated to alleviate the possible mis-alignment when registering atlases to the target image. However, the weights used for label fusion from the registered atlases are generally computed independently and thus lack the capability of preventing the ambiguous atlas patches from contributing to the label fusion. More critically, these weights are often calculated based only on the simple patch similarity, thus not necessarily providing optimal solution for label fusion. To address these limitations, we propose a generative probability model to describe the procedure of label fusion in a multi-atlas scenario, for the goal of labeling each point in the target image by the best representative atlas patches that also have the largest labeling unanimity in labeling the underlying point correctly. Specifically, sparsity constraint is imposed upon label fusion weights, in order to select a small number of atlas patches that best represent the underlying target patch, thus reducing the risks of including the misleading atlas patches. The labeling unanimity among atlas patches is achieved by exploring their dependencies, where we model these dependencies as the joint probability of each pair of atlas patches in correctly predicting the labels, by analyzing the correlation of their morphological error patterns and also the labeling consensus among atlases. The patch dependencies will be further recursively updated based on the latest labeling results to correct the possible labeling errors, which falls to the Expectation Maximization (EM) framework. To demonstrate the labeling performance, we have comprehensively evaluated our patch-based labeling method on the whole brain parcellation and hippocampus segmentation. Promising labeling results have been achieved with comparison to the conventional patch-based labeling

  1. Partitioning incident radiation fluxes based on photon recollision probability in vegetation canopies

    NASA Astrophysics Data System (ADS)

    Mõttus, M.; Stenberg, P.

    2007-12-01

    Remote sensing of vegetation and modeling of canopy microclimate require information on the fractions of incident radiation reflected, transmitted and absorbed by a plant canopy. The photon recollision probability p allows easy calculation of the amount of radiation absorbed by a vegetation canopy and prediction of the spectral behavior of canopy scattering, i.e. the sum of canopy reflectance and transmittance. However, to divide the scattered radiation into reflected and transmitted fluxes, additional models are needed. To overcome this problem, we present a simple formula based on the photon recollision probability p to estimate the fraction of radiation scattered upwards by a canopy. The new semi-empirical method is tested with Monte Carlo simulations. A comparison with the analytical solution of the two-stream equation of radiative transfer in vegetation canopies is also provided. Our results indicate that the method is accurate for low to moderate leaf area index (LAI) values, and provides a reasonable approximation even at LAI=8. Finally, we present a new method to compute p using numerical radiative transfer models.
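
    For context, a commonly used spectral-invariant relation gives the fraction of intercepted radiation that is eventually scattered out of the canopy from p and the leaf single-scattering albedo; the sketch below applies that relation and then splits scattering with an assumed fixed upward fraction, purely to illustrate how such a partitioning would be used. The paper's own semi-empirical formula for the upward fraction is not reproduced here, so the fixed value is an assumption.

      def canopy_scattering(leaf_albedo, p):
          """Fraction of intercepted radiation eventually scattered out of the canopy,
          from the spectral-invariant relation omega_c = (1 - p) * omega_L / (1 - p * omega_L)."""
          return (1.0 - p) * leaf_albedo / (1.0 - p * leaf_albedo)

      def split_scattering(leaf_albedo, p, upward_fraction):
          """Partition canopy scattering into upward (reflected) and downward (transmitted)
          parts, given an externally supplied upward fraction."""
          s = canopy_scattering(leaf_albedo, p)
          return upward_fraction * s, (1.0 - upward_fraction) * s

      # Illustrative values: near-infrared leaf albedo ~0.9, p ~0.6, assumed upward fraction 0.55.
      refl, trans = split_scattering(0.9, 0.6, 0.55)
      print(round(refl, 3), round(trans, 3))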

  2. A Trust-Based Adaptive Probability Marking and Storage Traceback Scheme for WSNs

    PubMed Central

    Liu, Anfeng; Liu, Xiao; Long, Jun

    2016-01-01

    Security is a pivotal issue for wireless sensor networks (WSNs), which are emerging as a promising platform that enables a wide range of military, scientific, industrial and commercial applications. Traceback, a key cyber-forensics technology, can play an important role in tracing and locating a malicious source to guarantee cybersecurity. In this work a trust-based adaptive probability marking and storage (TAPMS) traceback scheme is proposed to enhance security for WSNs. In a TAPMS scheme, the marking probability is adaptively adjusted according to the security requirements of the network and can substantially reduce the number of marking tuples and improve network lifetime. More importantly, a high trust node is selected to store marking tuples, which can avoid the problem of marking information being lost. Experimental results show that the total number of marking tuples can be reduced in a TAPMS scheme, thus improving network lifetime. At the same time, since the marking tuples are stored in high trust nodes, storage reliability can be guaranteed, and the traceback time can be reduced by more than 80%. PMID:27043566
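
    A toy simulation of the general idea, not the TAPMS algorithm itself: each forwarding node marks a packet with a probability scaled by the required security level, and the marking tuple is stored at the most trusted node seen so far on the path. The adaptation rule, parameters and names here are assumptions made only to illustrate adaptive probabilistic marking with trust-based storage.

      import random

      def forward_packet(path, base_prob, security_level, trust):
          """Toy adaptive probabilistic marking with trust-based storage.
          path: node ids from source towards the sink.
          base_prob: baseline marking probability.
          security_level: factor in (0, 1]; higher security demand -> more marking.
          trust: dict of node id -> trust score in [0, 1]."""
          p_mark = min(1.0, base_prob * (1.0 + security_level))     # assumed adaptation rule
          marks = []
          for i, node in enumerate(path):
              if random.random() < p_mark:
                  storage = max(path[:i + 1], key=lambda n: trust[n])   # most trusted node so far
                  marks.append({"marker": node, "stored_at": storage})
          return marks

      random.seed(0)
      trust = {n: random.uniform(0.3, 1.0) for n in range(6)}
      print(forward_packet(list(range(6)), base_prob=0.15, security_level=0.8, trust=trust))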

  4. Probability density function characterization for aggregated large-scale wind power based on Weibull mixtures

    DOE PAGES

    Gomez-Lazaro, Emilio; Bueso, Maria C.; Kessler, Mathieu; Martin-Martinez, Sergio; Zhang, Jie; Hodge, Bri -Mathias; Molina-Garcia, Angel

    2016-02-02

    Here, the Weibull probability distribution has been widely applied to characterize wind speeds for wind energy resources. Wind power generation modeling is different, however, due in particular to power curve limitations, wind turbine control methods, and transmission system operation requirements. These differences are even greater for aggregated wind power generation in power systems with high wind penetration. Consequently, models based on a single Weibull component can provide poor characterizations of aggregated wind power generation. To this end, the present paper focuses on Weibull mixtures to characterize the probability density function (PDF) for aggregated wind power generation. PDFs of wind power data are first classified according to hourly and seasonal patterns. The selection of the number of components in the mixture is analyzed through two well-known criteria: the Akaike information criterion (AIC) and the Bayesian information criterion (BIC). Finally, the optimal number of Weibull components for maximum likelihood is explored for the defined patterns, including the estimated weight, scale, and shape parameters. Results show that multi-Weibull models are more suitable to characterize aggregated wind power data due to the impact of distributed generation, the variety of wind speed values and wind power curtailment.
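
    The number of mixture components is chosen with AIC/BIC as described; the snippet below shows only that selection step, taking as given the maximized log-likelihoods of candidate mixtures (a k-component Weibull mixture has 3k − 1 free parameters: k shapes, k scales and k − 1 weights). The fitting itself, e.g. by EM, is not shown, and the example numbers are made up.

      import numpy as np

      def select_n_components(log_likelihoods, n_samples):
          """Pick the mixture order minimizing AIC and BIC.
          log_likelihoods: dict {k: maximized log-likelihood of the k-component Weibull mixture}."""
          scores = {}
          for k, ll in log_likelihoods.items():
              n_params = 3 * k - 1                          # k shapes + k scales + (k - 1) weights
              aic = 2 * n_params - 2 * ll
              bic = n_params * np.log(n_samples) - 2 * ll
              scores[k] = (aic, bic)
          best_aic = min(scores, key=lambda k: scores[k][0])
          best_bic = min(scores, key=lambda k: scores[k][1])
          return best_aic, best_bic, scores

      # Illustrative log-likelihoods for 1-4 components fitted to 8760 hourly power values.
      ll = {1: -21500.0, 2: -21180.0, 3: -21150.0, 4: -21145.0}
      print(select_n_components(ll, n_samples=8760))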

  6. A Priori Knowledge and Probability Density Based Segmentation Method for Medical CT Image Sequences

    PubMed Central

    Tan, Hanqing; Yang, Benqiang

    2014-01-01

    This paper briefly introduces a novel segmentation strategy for CT image sequences. As the first step of our strategy, we extract a priori intensity statistical information from the object region, which is manually segmented by radiologists. Then we define a search scope for the object and calculate the probability density for each pixel in the scope using a voting mechanism. Moreover, we generate an optimal initial level set contour based on the a priori shape of the object in the previous slice. Finally, the modified distance regularized level set method utilizes boundary features and probability density to determine the final object contour. The main contributions of this paper are as follows: a priori knowledge is effectively used to guide the determination of objects, and a modified distance regularized level set method can accurately extract the actual contour of the object in a short time. The proposed method is compared to seven other state-of-the-art medical image segmentation methods on abdominal CT image sequence datasets. The evaluation results demonstrate that our method performs better and has potential for segmentation in CT image sequences. PMID:24967402

  7. Probability density based gradient projection method for inverse kinematics of a robotic human body model.

    PubMed

    Lura, Derek; Wernke, Matthew; Alqasemi, Redwan; Carey, Stephanie; Dubey, Rajiv

    2012-01-01

    This paper presents the probability density based gradient projection (GP) of the null space of the Jacobian for a 25 degree-of-freedom bilateral robotic human body model (RHBM). This method was used to predict the inverse kinematics of the RHBM and maximize the similarity between predicted inverse kinematic poses and recorded data of 10 subjects performing activities of daily living. The density function was created for discrete increments of the workspace. The number of increments in each direction (x, y, and z) was varied from 1 to 20. Performance of the method was evaluated by finding the root mean squared (RMS) difference between the predicted joint angles and the joint angles recorded from motion capture. The amount of data included in the creation of the probability density function was varied from 1 to 10 subjects, creating sets of subjects included in and excluded from the density function. The performance of the GP method for subjects included in and excluded from the density function was evaluated to test the robustness of the method. Accuracy of the GP method varied with the incremental division of the workspace: increasing the number of increments decreased the RMS error of the method, with the average RMS error for included subjects ranging from 7.7° to 3.7°. However, increasing the number of increments also decreased the robustness of the method.
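
    The gradient projection scheme itself is standard in redundant-manipulator kinematics: the joint velocity is the pseudoinverse solution plus a secondary objective gradient projected into the Jacobian null space. The sketch below shows one such update step, with the secondary objective standing in for the probability density term; the Jacobian and density gradient are illustrative placeholders, not the RHBM model.

      import numpy as np

      def gp_ik_step(J, dx, grad_h, k0=0.5):
          """One gradient-projection step:
          dq = J+ dx + k0 (I - J+ J) grad_h,
          where grad_h points towards more probable (more human-like) postures."""
          J_pinv = np.linalg.pinv(J)
          n = J.shape[1]
          null_proj = np.eye(n) - J_pinv @ J               # projector onto the Jacobian null space
          return J_pinv @ dx + k0 * null_proj @ grad_h

      # Illustrative 6x25 Jacobian for a 25-DOF model, desired end-effector twist dx,
      # and a placeholder gradient of the pose-probability objective.
      rng = np.random.default_rng(2)
      J = rng.normal(size=(6, 25))
      dx = np.array([0.01, 0.0, -0.02, 0.0, 0.0, 0.005])
      grad_h = 0.01 * rng.normal(size=25)
      dq = gp_ik_step(J, dx, grad_h)
      print(dq.shape)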

  8. Performance Demonstration Based Probability of Detection (POD) Curves for Fatigue Cracks in Piping

    SciTech Connect

    Gosselin, Stephen R.; Simonen, Fredric A.; Heasler, Patrick G.; Becker, F. L.; Doctor, Steven R.; Carter, R. G.

    2005-07-01

    This paper evaluates non-destructive examination (NDE) detection capabilities for fatigue cracks in piping. Industry performance demonstration initiative (PDI) data for fatigue crack detection were used to develop a matrix of statistically based probability of detection (POD) curves that consider various NDE performance factors. Seven primary performance factors were identified – Material, Crack Geometry/Type, NDE Examination Access, NDE Procedure, Examiner Qualification, Pipe Diameter, and Pipe Wall Thickness. A database of 16,181 NDE performance observations, with 18 fields associated with each observation, was created and used to develop statistically based POD curves for 42 stainless steel and 14 carbon steel performance cases. Subsequent comparisons of the POD fits for each of the cases showed that excellent NDE performance for fatigue cracks can be expected for ferritic materials. Very little difference was observed between the POD curves for the 14 carbon steel performance cases considered in this study, and NDE performance could therefore be represented by a single POD curve. For stainless steel, very good performance can also be expected for circumferential cracks located on the same side of the weld from which the NDE examination is made; POD depended primarily on component thickness. Three POD curves for stainless steel were prepared. Best-estimate curves and the associated 95% confidence bounds for POD versus through-wall depth, obtained by logistic regression of the digital data, are provided. Probabilistic fracture mechanics (PFM) calculations were performed to compare best estimate leak probabilities obtained from both the new performance-based POD curves and previous PFM models. This work was performed under joint funding by EPRI and the U.S. Department of Energy (DOE), Office of Nuclear Energy Science and Technology’s Nuclear Energy Plant Optimization (NEPO) program.
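
    POD curves of this kind are typically logistic regressions of hit/miss outcomes against (log) crack depth; the sketch below fits such a curve from synthetic hit/miss data and evaluates POD as a function of through-wall depth. The data, parameters and use of scikit-learn are illustrative assumptions, not the PDI analysis itself.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      # Synthetic hit/miss data: detection becomes more likely as through-wall depth grows.
      rng = np.random.default_rng(3)
      depth = rng.uniform(0.05, 0.6, 300)                      # crack depth, fraction of wall thickness
      p_true = 1.0 / (1.0 + np.exp(-(-4.0 + 12.0 * depth)))    # assumed underlying POD
      detected = rng.random(300) < p_true

      model = LogisticRegression()
      model.fit(np.log(depth).reshape(-1, 1), detected)        # POD(a) = logistic(b0 + b1 * ln a)

      a = np.array([0.1, 0.2, 0.4])
      pod = model.predict_proba(np.log(a).reshape(-1, 1))[:, 1]
      print(dict(zip(a, pod.round(2))))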

  9. Reliability, failure probability, and strength of resin-based materials for CAD/CAM restorations

    PubMed Central

    Lim, Kiatlin; Yap, Adrian U-Jin; Agarwalla, Shruti Vidhawan; Tan, Keson Beng-Choon; Rosa, Vinicius

    2016-01-01

    ABSTRACT Objective: This study investigated the Weibull parameters and 5% fracture probability of direct, indirect, and CAD/CAM composites. Material and Methods: Disc-shaped (12 mm diameter x 1 mm thick) specimens were prepared for a direct composite [Z100 (ZO), 3M-ESPE], an indirect laboratory composite [Ceramage (CM), Shofu], and two CAD/CAM composites [Lava Ultimate (LU), 3M ESPE; Vita Enamic (VE), Vita Zahnfabrik] (n=30 for each group). The specimens were polished and stored in distilled water for 24 hours at 37°C. Weibull parameters (m = Weibull modulus, σ0 = characteristic strength) and the flexural strength at 5% fracture probability (σ5%) were determined using a piston-on-three-balls device at 1 MPa/s in distilled water. Statistical analyses of biaxial flexural strength were performed by one-way ANOVA with Tukey's post hoc test (α=0.05) and by Pearson's correlation test. Results: Ranking of m was: VE (19.5), LU (14.5), CM (11.7), and ZO (9.6). Ranking of σ0 (MPa) was: LU (218.1), ZO (210.4), CM (209.0), and VE (126.5). σ5% (MPa) was 177.9 for LU, 163.2 for CM, 154.7 for ZO, and 108.7 for VE. There was no significant difference in m for ZO, CM, and LU. VE presented the highest m value, significantly higher than that of ZO. For σ0 and σ5%, ZO, CM, and LU were similar but higher than VE. Conclusion: The strength characteristics of CAD/CAM composites vary according to their composition and microstructure. VE presented the lowest strength and highest Weibull modulus among the materials. PMID:27812614
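
    For reference, once the Weibull modulus m and characteristic strength σ0 are known, the stress at a chosen fracture probability follows directly from the two-parameter Weibull distribution; a short sketch using the reported Lava Ultimate values as a worked example.

      import math

      def stress_at_failure_probability(m, sigma0, pf):
          """Two-parameter Weibull: Pf = 1 - exp(-(sigma/sigma0)^m), so
          sigma = sigma0 * (-ln(1 - Pf))^(1/m)."""
          return sigma0 * (-math.log(1.0 - pf)) ** (1.0 / m)

      # Lava Ultimate: m = 14.5, sigma0 = 218.1 MPa; the 5% value comes out near the
      # reported sigma_5% of 177.9 MPa (small differences arise from rounding of m and sigma0).
      print(round(stress_at_failure_probability(14.5, 218.1, 0.05), 1))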

  10. Location Prediction Based on Transition Probability Matrices Constructing from Sequential Rules for Spatial-Temporal K-Anonymity Dataset

    PubMed Central

    Zhang, Haitao; Chen, Zewei; Liu, Zhao; Zhu, Yunhong; Wu, Chenxue

    2016-01-01

    Spatial-temporal k-anonymity has become a mainstream approach among techniques for protection of users' privacy in location-based services (LBS) applications, and has been applied to several variants such as LBS snapshot queries and continuous queries. Analyzing large-scale spatial-temporal anonymity sets may benefit several LBS applications. In this paper, we propose two location prediction methods based on transition probability matrices constructed from sequential rules for spatial-temporal k-anonymity datasets. First, we define single-step sequential rules mined from sequential spatial-temporal k-anonymity datasets generated from continuous LBS queries for multiple users. We then construct transition probability matrices from the mined single-step sequential rules, and normalize the transition probabilities in the transition matrices. Next, we regard a mobility model for an LBS requester as a stationary stochastic process and compute the n-step transition probability matrices by raising the normalized transition probability matrices to the power n. Furthermore, we propose two location prediction methods: rough prediction and accurate prediction. The former obtains the probabilities of arriving at target locations along simple paths that include only current locations, target locations and transition steps. By iteratively combining the probabilities for simple paths with n steps and the probabilities for detailed paths with n-1 steps, the latter method calculates transition probabilities for detailed paths with n steps from current locations to target locations. Finally, we conduct extensive experiments, and the correctness and flexibility of our proposed algorithm have been verified. PMID:27508502
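
    The core computation described above, normalizing a transition matrix built from mined rules and raising it to the power n, is a few lines of linear algebra; a minimal sketch with a toy three-location matrix (the counts and location names are illustrative).

      import numpy as np

      def n_step_matrix(counts, n):
          """Row-normalize a matrix of transition counts and return its n-step power."""
          counts = np.asarray(counts, dtype=float)
          P = counts / counts.sum(axis=1, keepdims=True)   # one-step transition probabilities
          return np.linalg.matrix_power(P, n)

      # Toy counts of observed transitions between three anonymized regions A, B, C.
      counts = [[10, 5, 1],
                [2, 8, 6],
                [1, 3, 12]]
      P3 = n_step_matrix(counts, n=3)
      print(P3.round(3))        # P3[i, j]: probability of being at j three steps after i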

  12. Estimation of the failure probability during EGS stimulation based on borehole data

    NASA Astrophysics Data System (ADS)

    Meller, C.; Kohl, Th.; Gaucher, E.

    2012-04-01

    In recent times the search for alternative sources of energy has been fostered by the scarcity of fossil fuels. With its ability to permanently provide electricity or heat with little emission of CO2, geothermal energy will have an important share in the energy mix of the future. Within Europe, scientists have identified many locations with conditions suitable for Enhanced Geothermal System (EGS) projects. In order to provide sufficiently high reservoir permeability, EGS require borehole stimulations prior to installation of power plants (Gérard et al, 2006). Induced seismicity during water injection into EGS reservoirs is a factor that currently can neither be predicted nor controlled. Often, people living near EGS projects are frightened by the smaller earthquakes occurring during stimulation or injection. As this fear can lead to widespread disapproval of geothermal power plants, it is desirable to find a way to estimate the probability of fractures shearing when water is injected at a given pressure into a geothermal reservoir. This provides knowledge that enables prediction of the mechanical behavior of a reservoir in response to a change in pore pressure conditions. In the present study an approach for estimating the shearing probability based on statistical analyses of fracture distribution, orientation and clusters, together with their geological properties, is proposed. Based on geophysical logs of five wells in Soultz-sous-Forêts, France, and with the help of statistical tools, the Mohr criterion, geological and mineralogical properties of the host rock and the fracture fillings, correlations between the wells are analyzed. This is achieved with the self-written MATLAB code Fracdens, which enables us to statistically analyze the log files in different ways. With the application of a pore pressure change, the evolution of the critical pressure on the fractures can be determined. A special focus is on the clay fillings of the fractures and how they reduce
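
    A minimal sketch of the Mohr-Coulomb style criterion underlying such an estimate: for each fracture with resolved normal and shear stress, slip is predicted once the pore pressure exceeds a critical value, and the shearing probability can be approximated as the fraction of fractures that become critically stressed at a given injection pressure. The friction coefficient, cohesion and stress inputs below are illustrative assumptions, not values from the study.

      import numpy as np

      def critical_pore_pressure(sigma_n, tau, mu=0.6, cohesion=0.0):
          """Pore pressure at which a fracture slips under Mohr-Coulomb:
          tau >= cohesion + mu * (sigma_n - p)  =>  p_crit = sigma_n - (tau - cohesion) / mu."""
          return sigma_n - (tau - cohesion) / mu

      def shearing_fraction(sigma_n, tau, delta_p, p0=0.0, mu=0.6, cohesion=0.0):
          """Fraction of fractures predicted to shear when pore pressure rises from p0 by delta_p."""
          p_crit = critical_pore_pressure(np.asarray(sigma_n), np.asarray(tau), mu, cohesion)
          return float(np.mean(p0 + delta_p >= p_crit))

      # Illustrative resolved stresses (MPa) on a set of logged fractures.
      rng = np.random.default_rng(4)
      sigma_n = rng.uniform(40, 80, 500)
      tau = rng.uniform(10, 40, 500)
      print(shearing_fraction(sigma_n, tau, delta_p=10.0, p0=20.0))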

  13. Binomial probability distribution model-based protein identification algorithm for tandem mass spectrometry utilizing peak intensity information.

    PubMed

    Xiao, Chuan-Le; Chen, Xiao-Zhou; Du, Yang-Li; Sun, Xuesong; Zhang, Gong; He, Qing-Yu

    2013-01-01

    Mass spectrometry has become one of the most important technologies in proteomic analysis. Tandem mass spectrometry (LC-MS/MS) is a major tool for the analysis of peptide mixtures from protein samples. The key step of MS data processing is the identification of peptides from experimental spectra by searching public sequence databases. Although a number of algorithms to identify peptides from MS/MS data have been already proposed, e.g. Sequest, OMSSA, X!Tandem, Mascot, etc., they are mainly based on statistical models considering only peak-matches between experimental and theoretical spectra, but not peak intensity information. Moreover, different algorithms gave different results from the same MS data, implying their probable incompleteness and questionable reproducibility. We developed a novel peptide identification algorithm, ProVerB, based on a binomial probability distribution model of protein tandem mass spectrometry combined with a new scoring function, making full use of peak intensity information and, thus, enhancing the ability of identification. Compared with Mascot, Sequest, and SQID, ProVerB identified significantly more peptides from LC-MS/MS data sets than the current algorithms at 1% False Discovery Rate (FDR) and provided more confident peptide identifications. ProVerB is also compatible with various platforms and experimental data sets, showing its robustness and versatility. The open-source program ProVerB is available at http://bioinformatics.jnu.edu.cn/software/proverb/ .
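
    The binomial idea at the core of such scoring can be illustrated independently of ProVerB's full scoring function: if each theoretical fragment has a small probability p of matching an experimental peak by chance, the probability of seeing at least k matches out of n follows the binomial survival function. The numbers below are illustrative, not the algorithm's actual score.

      from scipy.stats import binom

      def match_p_value(n_fragments, n_matched, p_random):
          """Probability of observing at least n_matched peak matches by chance,
          assuming independent matches with per-fragment probability p_random."""
          return binom.sf(n_matched - 1, n_fragments, p_random)

      # Example: 40 theoretical fragments, 18 matched, 10% chance of a random match per fragment.
      print(match_p_value(40, 18, 0.10))   # very small -> strong evidence for the peptide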

  14. Influence of sampling intake position on suspended solid measurements in sewers: two probability/time-series-based approaches.

    PubMed

    Sandoval, Santiago; Bertrand-Krajewski, Jean-Luc

    2016-06-01

    Total suspended solid (TSS) measurements in urban drainage systems are required for several reasons. Aiming to assess uncertainties in the mean TSS concentration due to the influence of sampling intake vertical position and vertical concentration gradients in a sewer pipe, two methods are proposed: a simplified method based on a theoretical vertical concentration profile (SM) and a time series grouping method (TSM). SM is based on flow rate and water depth time series. TSM requires additional TSS time series as input data. All time series are from the Chassieu urban catchment in Lyon, France (time series from 2007 with 2-min time step, 89 rainfall events). The probability of measuring a TSS value lower than the mean TSS along the vertical cross section (TSS underestimation) is about 0.88 with SM and about 0.64 with TSM. TSM shows more realistic TSS underestimation values (about 39 %) than SM (about 269 %). Interquartile ranges (IQR) over the probability values indicate that SM is more uncertain (IQR = 0.08) than TSM (IQR = 0.02). Differences between the two methods are mainly due to simplifications in SM (absence of TSS measurements). SM assumes a significant asymmetry of the TSS concentration profile along the vertical axis in the cross section. This is compatible with the distribution of TSS measurements found in the TSM approach. The methods provide insights towards an indicator of the measurement performance and representativeness for a TSS sampling protocol. PMID:27178049

  15. Probability-based diagnostic imaging using hybrid features extracted from ultrasonic Lamb wave signals

    NASA Astrophysics Data System (ADS)

    Zhou, Chao; Su, Zhongqing; Cheng, Li

    2011-12-01

    The imaging technique based on guided waves has been a research focus in the field of damage detection over the years, aimed at intuitively highlighting structural damage in two- or three-dimensional images. The accuracy and efficiency of this technique substantially rely on the means of defining the field values at image pixels. In this study, a novel probability-based diagnostic imaging (PDI) approach was developed. Hybrid signal features (including temporal information, intensity of signal energy and signal correlation) were extracted from ultrasonic Lamb wave signals and integrated to retrofit the traditional way of defining field values. To acquire hybrid signal features, an active sensor network in line with pulse-echo and pitch-catch configurations was designed, supplemented with a novel concept of 'virtual sensing'. A hybrid image fusion scheme was developed to enhance the tolerance of the approach to measurement noise/uncertainties and erroneous perceptions from individual sensors. As applications, the approach was employed to identify representative damage scenarios including L-shape through-thickness crack (orientation-specific damage), polygonal damage (multi-edge damage) and multi-damage in structural plates. Results have corroborated that the developed PDI approach based on the use of hybrid signal features is capable of visualizing structural damage quantitatively, regardless of damage shape and number, by highlighting its individual edges in an easily interpretable binary image.

  16. A generic probability based algorithm to derive regional patterns of crops in time and space

    NASA Astrophysics Data System (ADS)

    Wattenbach, Martin; Oijen, Marcel v.; Leip, Adrian; Hutchings, Nick; Balkovic, Juraj; Smith, Pete

    2013-04-01

    Croplands are not only the key to human food supply, they also change the biophysical and biogeochemical properties of the land surface, leading to changes in the water cycle and energy partitioning, influencing soil erosion and substantially contributing to the amount of greenhouse gases entering the atmosphere. The effects of croplands on the environment depend on the type of crop and the associated management, which both are related to the site conditions, economic boundary settings as well as preferences of individual farmers. However, at a given point in time the pattern of crops in a landscape is not only determined by environmental and socioeconomic conditions but also by the compatibility with the crops which had been grown in the years before on the current field and its surrounding cropping area. The crop compatibility is driven by factors like pests and diseases, crop driven changes in soil structure and timing of cultivation steps. Given these effects of crops on the biogeochemical cycle and their interdependence with the mentioned boundary conditions, there is a demand in the regional and global modelling community to account for these regional patterns. Here we present a Bayesian crop distribution generator algorithm that is used to calculate the combined and conditional probability for a crop to appear in time and space using sparse and disparate information. The input information to define the most probable crop per year and grid cell is based on combined probabilities derived from a crop transition matrix representing good agricultural practice, crop specific soil suitability derived from the European soil database and statistical information about harvested area from the Eurostat database. The reported Eurostat crop area also provides the target proportion to be matched by the algorithm at the level of administrative units (Nomenclature des Unités Territoriales Statistiques - NUTS). The algorithm is applied to the EU27 to derive regional spatial and

  17. Experience-Based Probabilities Modulate Expectations in a Gender-Coded Artificial Language.

    PubMed

    Öttl, Anton; Behne, Dawn M

    2016-01-01

    The current study combines artificial language learning with visual world eyetracking to investigate acquisition of representations associating spoken words and visual referents using morphologically complex pseudowords. Pseudowords were constructed to consistently encode referential gender by means of suffixation for a set of imaginary figures that could be either male or female. During training, the frequency of exposure to pseudowords and their imaginary figure referents were manipulated such that a given word and its referent would be more likely to occur in either the masculine form or the feminine form, or both forms would be equally likely. Results show that these experience-based probabilities affect the formation of new representations to the extent that participants were faster at recognizing a referent whose gender was consistent with the induced expectation than a referent whose gender was inconsistent with this expectation. Disambiguating gender information available from the suffix did not mask the induced expectations. Eyetracking data provide additional evidence that such expectations surface during online lexical processing. Taken together, these findings indicate that experience-based information is accessible during the earliest stages of processing, and are consistent with the view that language comprehension depends on the activation of perceptual memory traces. PMID:27602009

  19. Global climate change model natural climate variation: Paleoclimate data base, probabilities and astronomic predictors

    SciTech Connect

    Kukla, G.; Gavin, J.

    1994-05-01

    This report was prepared at the Lamont-Doherty Geological Observatory of Columbia University at Palisades, New York, under subcontract to Pacific Northwest Laboratory (PNL); it is part of a larger project of global climate studies which supports site characterization work required for the selection of a potential high-level nuclear waste repository and forms part of the Performance Assessment Scientific Support (PASS) Program at PNL. The work under the PASS Program is currently focusing on the proposed site at Yucca Mountain, Nevada, and is under the overall direction of the Yucca Mountain Project Office, US Department of Energy, Las Vegas, Nevada. The final results of the PNL project will provide input to global atmospheric models designed to test specific climate scenarios, which will be used in the site specific modeling work of others. The primary purpose of the data bases compiled and of the astronomic predictive models is to aid in the estimation of the probabilities of future climate states. The results will be used by two other teams working on the global climate study under contract to PNL; they are located at the University of Maine in Orono, Maine, and the Applied Research Corporation in College Station, Texas. This report presents the results of the third year's work on the global climate change models and the data bases describing past climates.

  1. Effect-based interpretation of toxicity test data using probability and comparison with alternative methods of analysis

    SciTech Connect

    Gully, J.R.; Baird, R.B.; Markle, P.J.; Bottomley, J.P.

    2000-01-01

    A methodology is described that incorporates the intra- and intertest variability and the biological effect of bioassay data in evaluating the toxicity of single and multiple tests for regulatory decision-making purposes. The single- and multiple-test regulatory decision probabilities were determined from t values (n − 1 degrees of freedom, one-tailed) derived from the estimated biological effect and the associated standard error at the critical sample concentration. Single-test regulatory decision probabilities below the selected minimum regulatory decision probability identify individual tests as noncompliant. A multiple-test regulatory decision probability is determined by combining the regulatory decision probabilities of a series of single tests. A multiple-test regulatory decision probability below the multiple-test regulatory decision minimum identifies groups of tests in which the magnitude and persistence of the toxicity is sufficient to be considered noncompliant or to require enforcement action. Regulatory decision probabilities derived from the t distribution were compared with results based on standard and bioequivalence hypothesis tests using single- and multiple-concentration toxicity test data from an actual national pollutant discharge elimination system permit. This approach incorporated the precision of the effect estimate into regulatory decisions at a fixed level of effect. Also, probability-based interpretation of toxicity tests provides incentive to laboratories to produce, and permit holders to use, high-quality, precise data, particularly when multiple tests are used in regulatory decisions. These results are contrasted with standard and bioequivalence hypothesis tests, in which the intratest precision is a determining factor in setting the biological effect used for regulatory decisions.
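
    One plausible reading of the single-test calculation described above: the estimated biological effect at the critical sample concentration, divided by its standard error, gives a t value with n − 1 degrees of freedom, and a one-tailed tail probability is read from the t distribution. The exact definition of the decision probability and the rule for combining multiple tests follow the paper and are not reproduced here, so treat this as an illustrative sketch only.

      from scipy.stats import t

      def single_test_decision_probability(effect, std_error, n):
          """One-tailed probability from the t distribution (n - 1 degrees of freedom)
          for the estimated biological effect at the critical sample concentration."""
          t_value = effect / std_error
          return t.sf(t_value, df=n - 1)

      # Example: 25% effect with a standard error of 8%, estimated from n = 5 replicates.
      print(round(single_test_decision_probability(0.25, 0.08, n=5), 4))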

  2. GIS-based Probability Assessment of Natural Hazards in Forested Landscapes of Central and South-Eastern Europe

    NASA Astrophysics Data System (ADS)

    Lorz, C.; Fürst, C.; Galic, Z.; Matijasic, D.; Podrazky, V.; Potocic, N.; Simoncic, P.; Strauch, M.; Vacik, H.; Makeschin, F.

    2010-12-01

    We assessed the probability of three major natural hazards—windthrow, drought, and forest fire—which are major threats to the provision of forest goods and ecosystem services, for Central and South-Eastern European forests. In addition, we analyzed their spatial distribution and the implications for a future-oriented management of forested landscapes. For estimating the probability of windthrow, we used rooting depth and average wind speed. Probabilities of drought and fire were calculated from the climatic and total water balance during the growing season. As an approximation to climate change scenarios, we used a simplified approach with a general increase of pET by 20%. Monitoring data from the pan-European forest crown condition program and observed burnt areas and hot spots from the European Forest Fire Information System were used to test the plausibility of the probability maps. Regions with high probabilities of natural hazards are identified, and management strategies to minimize the probability of natural hazards are discussed. We suggest future research should focus on (i) estimating probabilities using process-based models (including sensitivity analysis), (ii) defining probability in terms of economic loss, (iii) including biotic hazards, (iv) using more detailed data sets on natural hazards, forest inventories and climate change scenarios, and (v) developing a framework of adaptive risk management.

  3. Statistical analysis of gait maturation in children based on probability density functions.

    PubMed

    Wu, Yunfeng; Zhong, Zhangting; Lu, Meng; He, Jia

    2011-01-01

    Analysis of gait patterns in children is useful for the study of maturation of locomotor control. In this paper, we utilized the Parzen-window method to estimate the probability density functions (PDFs) of the stride interval for 50 children. With the estimated PDFs, the statistical measures, i.e., averaged stride interval (ASI), variation of stride interval (VSI), PDF skewness (SK), and PDF kurtosis (KU), were computed for the gait maturation in three age groups (aged 3-5 years, 6-8 years, and 10-14 years) of young children. The results indicated that the ASI and VSI values are significantly different between the three age groups. The VSI decreases rapidly until 8 years of age, and then continues to decrease at a slower rate. The SK values of the PDFs for all of the three age groups are positive, which shows a slight imbalance in the stride interval distribution within each age group. In addition, the decrease of the KU values of the PDFs is age-dependent, which suggests the effects of the musculo-skeletal growth on the gait maturation in young children. PMID:22254641
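
    A small sketch of the analysis pipeline described above: a Parzen-window (Gaussian kernel) estimate of the stride-interval PDF, followed by the four summary measures. The bandwidth choice, the synthetic data and the reading of VSI as the sample standard deviation are assumptions made for illustration.

      import numpy as np
      from scipy.stats import gaussian_kde, skew, kurtosis

      def gait_pdf_measures(stride_intervals):
          """Parzen-window PDF estimate plus ASI, VSI, skewness and kurtosis of stride intervals."""
          kde = gaussian_kde(stride_intervals)             # Gaussian-kernel Parzen window
          grid = np.linspace(min(stride_intervals) - 0.2, max(stride_intervals) + 0.2, 400)
          pdf = kde(grid)
          asi = np.mean(stride_intervals)                  # averaged stride interval
          vsi = np.std(stride_intervals, ddof=1)           # variation of stride interval (assumed: std)
          return grid, pdf, asi, vsi, skew(stride_intervals), kurtosis(stride_intervals)

      # Synthetic stride-interval series (seconds) for one child.
      rng = np.random.default_rng(5)
      strides = rng.normal(1.05, 0.06, 300) + 0.02 * rng.standard_exponential(300)
      grid, pdf, asi, vsi, sk, ku = gait_pdf_measures(strides)
      print(round(asi, 3), round(vsi, 3), round(sk, 2), round(ku, 2))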

  4. A physically-based earthquake recurrence model for estimation of long-term earthquake probabilities

    USGS Publications Warehouse

    Ellsworth, William L.; Matthews, Mark V.; Nadeau, Robert M.; Nishenko, Stuart P.; Reasenberg, Paul A.; Simpson, Robert W.

    1999-01-01

    A physically-motivated model for earthquake recurrence based on the Brownian relaxation oscillator is introduced. The renewal process defining this point process model can be described by the steady rise of a state variable from the ground state to failure threshold as modulated by Brownian motion. Failure times in this model follow the Brownian passage time (BPT) distribution, which is specified by the mean time to failure, μ, and the aperiodicity of the mean, α (equivalent to the familiar coefficient of variation). Analysis of 37 series of recurrent earthquakes, M −0.7 to 9.2, suggests a provisional generic value of α = 0.5. For this value of α, the hazard function (instantaneous failure rate of survivors) exceeds the mean rate for times > μ/2, and is ~2/μ for all times > μ. Application of this model to the next M 6 earthquake on the San Andreas fault at Parkfield, California suggests that the annual probability of the earthquake is between 1:10 and 1:13.
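
    For reference, the BPT distribution with mean μ and aperiodicity α is the inverse-Gaussian distribution; the sketch below evaluates its density directly and the hazard function h(t) = f(t)/(1 − F(t)) by numerical integration. The values μ = 25 yr and α = 0.5 are used purely as an illustration, not as the Parkfield parameters.

      import numpy as np

      def bpt_pdf(t, mu, alpha):
          """Brownian passage time (inverse Gaussian) density with mean mu and aperiodicity alpha."""
          t = np.asarray(t, dtype=float)
          return np.sqrt(mu / (2.0 * np.pi * alpha**2 * t**3)) * \
              np.exp(-((t - mu) ** 2) / (2.0 * mu * alpha**2 * t))

      def bpt_hazard(t_grid, mu, alpha):
          """Hazard h(t) = f(t) / (1 - F(t)), with F obtained by numerical integration."""
          f = bpt_pdf(t_grid, mu, alpha)
          dt = np.diff(t_grid, prepend=t_grid[0])
          F = np.cumsum(f * dt)
          return f / np.clip(1.0 - F, 1e-12, None)

      t_grid = np.linspace(0.1, 100.0, 4000)
      h = bpt_hazard(t_grid, mu=25.0, alpha=0.5)
      idx = np.searchsorted(t_grid, [25.0, 50.0, 75.0])
      print((h[idx] * 25.0).round(2))   # hazard in units of 1/mu at t = mu, 2mu, 3mu; roughly 2, as noted above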

  5. Design-based texture feature fusion using Gabor filters and co-occurrence probabilities.

    PubMed

    Clausi, David A; Deng, Huang

    2005-07-01

    A design-based method to fuse Gabor filter and grey level co-occurrence probability (GLCP) features for improved texture recognition is presented. The fused feature set utilizes both the Gabor filter's capability of accurately capturing lower and mid-frequency texture information and the GLCP's capability of capturing texture information relevant to higher frequency components. Evaluation methods include comparing feature space separability and comparing image segmentation classification rates. The fused feature sets are demonstrated to produce higher feature space separations, as well as higher segmentation accuracies relative to the individual feature sets. Fused feature sets also outperform individual feature sets for noisy images, across different noise magnitudes. The curse of dimensionality is demonstrated not to affect segmentation using the proposed 48-dimensional fused feature set. Gabor magnitude responses produce higher segmentation accuracies than linearly normalized Gabor magnitude responses. Feature reduction using principal component analysis is acceptable for maintaining the segmentation performance, but feature reduction using the feature contrast method dramatically reduced the segmentation accuracy. Overall, the designed fused feature set is advocated as a means for improving texture segmentation performance.

  6. Probability based remaining capacity estimation using data-driven and neural network model

    NASA Astrophysics Data System (ADS)

    Wang, Yujie; Yang, Duo; Zhang, Xu; Chen, Zonghai

    2016-05-01

    Since large numbers of lithium-ion batteries are composed in packs and the batteries are complex electrochemical devices, their monitoring and safety concerns are key issues for the applications of battery technology. An accurate estimation of battery remaining capacity is crucial for optimization of the vehicle control, preventing the battery from over-charging and over-discharging and ensuring safety during its service life. The remaining capacity estimation of a battery includes the estimation of state-of-charge (SOC) and state-of-energy (SOE). In this work, a probability based adaptive estimator is presented to obtain accurate and reliable estimation results for both SOC and SOE. For the SOC estimation, an nth-order RC equivalent circuit model is employed in combination with an electrochemical model to obtain more accurate voltage prediction results. For the SOE estimation, a sliding window neural network model is proposed to investigate the relationship between the terminal voltage and the model inputs. To verify the accuracy and robustness of the proposed model and estimation algorithm, experiments under different dynamic operation current profiles are performed on commercial 1665130-type lithium-ion batteries. The results illustrate that accurate and robust estimation can be obtained by the proposed method.

  7. Moment-Based Probability Modeling and Extreme Response Estimation, The FITS Routine Version 1.2

    SciTech Connect

    MANUEL,LANCE; KASHEF,TINA; WINTERSTEIN,STEVEN R.

    1999-11-01

    This report documents the use of the FITS routine, which provides automated fits of various analytical, commonly used probability models from input data. It is intended to complement the previously distributed FITTING routine documented in RMS Report 14 (Winterstein et al., 1994), which implements relatively complex four-moment distribution models whose parameters are fit with numerical optimization routines. Although these four-moment fits can be quite useful and faithful to the observed data, their complexity can make them difficult to automate within standard fitting algorithms. In contrast, FITS provides more robust (lower moment) fits of simpler, more conventional distribution forms. For each database of interest, the routine estimates the distribution of annual maximum response based on the data values and the duration, T, over which they were recorded. To focus on the upper tails of interest, the user can also supply an arbitrary lower-bound threshold, χ_low, above which a shifted distribution model--exponential or Weibull--is fit.
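
    As a small illustration of the kind of tail fit described above (not the FITS code itself), the sketch below fits a shifted exponential to the exceedances above a user-supplied threshold χ_low by maximum likelihood and converts the event-level tail into an annual-maximum estimate, assuming independent exceedances; the threshold, data and recording duration are illustrative assumptions.

      import numpy as np

      def fit_shifted_exponential(data, x_low):
          """MLE of a shifted exponential fitted to exceedances above the threshold x_low."""
          data = np.asarray(data, dtype=float)
          exceed = data[data > x_low]
          rate = 1.0 / np.mean(exceed - x_low)          # MLE for the exponential rate
          return rate, exceed.size

      def annual_max_cdf(x, x_low, rate, n_exceed_per_year):
          """P(annual maximum <= x) for x >= x_low, treating exceedances as independent:
          P = exp(-nu * P(X > x | X > x_low))."""
          tail = np.exp(-rate * (np.asarray(x, dtype=float) - x_low))
          return np.exp(-n_exceed_per_year * tail)

      # Illustrative response data recorded over T = 5 years, with threshold x_low = 2.0.
      rng = np.random.default_rng(7)
      data = rng.gumbel(1.0, 0.5, size=5000)
      rate, n_exc = fit_shifted_exponential(data, x_low=2.0)
      nu = n_exc / 5.0                                   # exceedances per year
      print(round(1.0 - annual_max_cdf(3.0, 2.0, rate, nu), 4))   # annual exceedance probability at 3.0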

  8. BACTID: a microcomputer implementation of a PASCAL program for bacterial identification based on Bayesean probability.

    PubMed

    Jilly, B J

    1988-01-01

    A computer program (BACTID) is described which enables the identification of bacteria based on a priori data and Bayesian probability testing. The program is not limited to a specific format, has a short execution time, can be easily applied to a variety of situations, and can be run on almost any microcomputer system operating under either 8-bit CP/M or 16-bit MS-DOS or PC-DOS. Additionally, BACTID is not limited to one type of computer (hardware independent); is not limited by size of the computer's random access memory (RAM independent); can recognize various database matrices (format independent); is able to compensate for missing data; and allows for various methods of data entry. The efficacy of the program was checked against a commercially available test system and a 99.34% agreement was obtained. Also, the execution time for a 46 x 21 data matrix was as little as 3.5 seconds. These results show that microcomputer identification programs not only are viable alternatives to code-book registers, but also offer flexibility which is not found in commercial systems. PMID:3282771
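
    The Bayesian calculation behind such identification programs multiplies, for each taxon, the a priori probabilities of the observed test results and normalizes across taxa; the sketch below shows that computation for a toy matrix. The taxa, tests and probabilities are invented for illustration and are not the BACTID database.

      def identify(results, matrix):
          """results: dict test -> True/False observed outcome.
          matrix: dict taxon -> dict test -> P(test positive | taxon), the a priori data matrix.
          Returns normalized identification scores (posterior probabilities with flat priors)."""
          scores = {}
          for taxon, probs in matrix.items():
              likelihood = 1.0
              for test, outcome in results.items():
                  p_pos = probs[test]
                  likelihood *= p_pos if outcome else (1.0 - p_pos)
              scores[taxon] = likelihood
          total = sum(scores.values())
          return {taxon: s / total for taxon, s in scores.items()}

      # Toy a priori matrix: probability of a positive result for each test, per taxon.
      matrix = {
          "Taxon A": {"oxidase": 0.99, "lactose": 0.05, "motility": 0.90},
          "Taxon B": {"oxidase": 0.02, "lactose": 0.95, "motility": 0.80},
          "Taxon C": {"oxidase": 0.50, "lactose": 0.50, "motility": 0.10},
      }
      observed = {"oxidase": True, "lactose": False, "motility": True}
      print(identify(observed, matrix))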

  9. Inverse modeling of hydraulic tests in fractured crystalline rock based on a transition probability geostatistical approach

    NASA Astrophysics Data System (ADS)

    Blessent, Daniela; Therrien, René; Lemieux, Jean-Michel

    2011-12-01

    This paper presents numerical simulations of a series of hydraulic interference tests conducted in crystalline bedrock at Olkiluoto (Finland), a potential site for the disposal of the Finnish high-level nuclear waste. The tests are in a block of crystalline bedrock of about 0.03 km³ that contains low-transmissivity fractures. Fracture density, orientation, and fracture transmissivity are estimated from Posiva Flow Log (PFL) measurements in boreholes drilled in the rock block. On the basis of those data, a geostatistical approach relying on transition probability and Markov chain models is used to define a conceptual model based on stochastic fractured rock facies. Four facies are defined, from sparsely fractured bedrock to highly fractured bedrock. Using this conceptual model, three-dimensional groundwater flow is then simulated to reproduce interference pumping tests in either open or packed-off boreholes. Hydraulic conductivities of the fracture facies are estimated through automatic calibration using either hydraulic heads or both hydraulic heads and PFL flow rates as targets for calibration. The latter option produces a narrower confidence interval for the calibrated hydraulic conductivities, therefore reducing the associated uncertainty and demonstrating the usefulness of the measured PFL flow rates. Furthermore, the stochastic facies conceptual model is a suitable alternative to discrete fracture network models to simulate fluid flow in fractured geological media.

  10. SAR amplitude probability density function estimation based on a generalized Gaussian model.

    PubMed

    Moser, Gabriele; Zerubia, Josiane; Serpico, Sebastiano B

    2006-06-01

    In the context of remotely sensed data analysis, an important problem is the development of accurate models for the statistics of the pixel intensities. Focusing on synthetic aperture radar (SAR) data, this modeling process turns out to be a crucial task, for instance, for classification or for denoising purposes. In this paper, an innovative parametric estimation methodology for SAR amplitude data is proposed that adopts a generalized Gaussian (GG) model for the complex SAR backscattered signal. A closed-form expression for the corresponding amplitude probability density function (PDF) is derived and a specific parameter estimation algorithm is developed in order to deal with the proposed model. Specifically, the recently proposed "method-of-log-cumulants" (MoLC) is applied, which stems from the adoption of the Mellin transform (instead of the usual Fourier transform) in the computation of characteristic functions and from the corresponding generalization of the concepts of moment and cumulant. For the developed GG-based amplitude model, the resulting MoLC estimates turn out to be numerically feasible and are also analytically proved to be consistent. The proposed parametric approach was validated by using several real ERS-1, XSAR, E-SAR, and NASA/JPL airborne SAR images, and the experimental results prove that the method models the amplitude PDF better than several previously proposed parametric models for backscattering phenomena. PMID:16764268

  11. Induced Probabilities.

    ERIC Educational Resources Information Center

    Neel, John H.

    Induced probabilities have been largely ignored by educational researchers. Simply stated, if a new random variable is defined in terms of a first random variable, then the induced probability is the probability or density of the new random variable that can be found by summation or integration over the appropriate domains of the original random…

  12. A sequential nonparametric pattern classification algorithm based on the Wald SPRT. [Sequential Probability Ratio Test

    NASA Technical Reports Server (NTRS)

    Poage, J. L.

    1975-01-01

    A sequential nonparametric pattern classification procedure is presented. The method presented is an estimated version of the Wald sequential probability ratio test (SPRT). This method utilizes density function estimates, and the density estimate used is discussed, including a proof of convergence in probability of the estimate to the true density function. The classification procedure proposed makes use of the theory of order statistics, and estimates of the probabilities of misclassification are given. The procedure was tested on discriminating between two classes of Gaussian samples and on discriminating between two kinds of electroencephalogram (EEG) responses.
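
    For context, the underlying Wald SPRT accumulates a log-likelihood ratio sample by sample and stops when it crosses thresholds set by the desired error rates; a minimal parametric sketch for two Gaussian classes is given below (the paper's nonparametric version replaces the known densities with density estimates, which is not shown here).

      import math
      import random

      def sprt_gaussian(samples, mu0, mu1, sigma, alpha=0.05, beta=0.05):
          """Wald SPRT between H0: N(mu0, sigma^2) and H1: N(mu1, sigma^2).
          alpha, beta: target type I / type II error probabilities."""
          upper = math.log((1.0 - beta) / alpha)      # cross above -> decide H1
          lower = math.log(beta / (1.0 - alpha))      # cross below -> decide H0
          llr = 0.0
          for n, x in enumerate(samples, start=1):
              # log-likelihood ratio increment for Gaussians with common variance
              llr += ((x - mu0) ** 2 - (x - mu1) ** 2) / (2.0 * sigma ** 2)
              if llr >= upper:
                  return "H1", n
              if llr <= lower:
                  return "H0", n
          return "undecided", len(samples)

      random.seed(0)
      data = [random.gauss(1.0, 1.0) for _ in range(200)]   # generated under H1 (mean 1)
      print(sprt_gaussian(data, mu0=0.0, mu1=1.0, sigma=1.0))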

  13. On Probability Domains III

    NASA Astrophysics Data System (ADS)

    Frič, Roman; Papčo, Martin

    2015-12-01

    Domains of generalized probability have been introduced in order to provide a general construction of random events, observables and states. It is based on the notion of a cogenerator and the properties of product. We continue our previous study and show how some other quantum structures fit our categorical approach. We discuss how various epireflections implicitly used in the classical probability theory are related to the transition to fuzzy probability theory and describe the latter probability theory as a genuine categorical extension of the former. We show that the IF-probability can be studied via the fuzzy probability theory. We outline a "tensor modification" of the fuzzy probability theory.

  14. Experimental Probability in Elementary School

    ERIC Educational Resources Information Center

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  15. A Bayesian-probability-based method for assigning protein backbone dihedral angles based on chemical shifts and local sequences.

    PubMed

    Wang, Jun; Liu, Haiyan

    2007-01-01

    Chemical shifts contain substantial information about protein local conformations. We present a method to assign individual protein backbone dihedral angles into specific regions on the Ramachandran map based on the amino acid sequences and the chemical shifts of backbone atoms of tripeptide segments. The method uses a scoring function derived from the Bayesian probability for the central residue of a query tripeptide segment to have a particular conformation. The Ramachandran map is partitioned into representative regions at two levels of resolution. The lower resolution partitioning is equivalent to the conventional definitions of different secondary structure regions on the map. At the higher resolution level, the alpha and beta regions are further divided into subregions. Predictions are attempted at both levels of resolution. We compared our method with TALOS using the original TALOS database, and obtained comparable results. Although TALOS may produce the best results with currently available databases which are much enlarged, the Bayesian-probability-based approach can provide a quantitative measure for the reliability of predictions.

  16. Evidence-Based Medicine as a Tool for Undergraduate Probability and Statistics Education

    PubMed Central

    Masel, J.; Humphrey, P. T.; Blackburn, B.; Levine, J. A.

    2015-01-01

    Most students have difficulty reasoning about chance events, and misconceptions regarding probability can persist or even strengthen following traditional instruction. Many biostatistics classes sidestep this problem by prioritizing exploratory data analysis over probability. However, probability itself, in addition to statistics, is essential both to the biology curriculum and to informed decision making in daily life. One area in which probability is particularly important is medicine. Given the preponderance of pre-health students, in addition to more general interest in medicine, we capitalized on students’ intrinsic motivation in this area to teach both probability and statistics. We use the randomized controlled trial as the centerpiece of the course, because it exemplifies the most salient features of the scientific method, and the application of critical thinking to medicine. The other two pillars of the course are biomedical applications of Bayes’ theorem and science and society content. Backward design from these three overarching aims was used to select appropriate probability and statistics content, with a focus on eliciting and countering previously documented misconceptions in their medical context. Pretest/posttest assessments using the Quantitative Reasoning Quotient and Attitudes Toward Statistics instruments are positive, bucking several negative trends previously reported in statistics education. PMID:26582236

  19. In search of a statistical probability model for petroleum-resource assessment : a critique of the probabilistic significance of certain concepts and methods used in petroleum-resource assessment : to that end, a probabilistic model is sketched

    USGS Publications Warehouse

    Grossling, Bernardo F.

    1975-01-01

    Exploratory drilling is still in incipient or youthful stages in those areas of the world where the bulk of the potential petroleum resources is yet to be discovered. Methods of assessing resources from projections based on historical production and reserve data are limited to mature areas. For most of the world's petroleum-prospective areas, a more speculative situation calls for a critical review of resource-assessment methodology. The language of mathematical statistics is required to define more rigorously the appraisal of petroleum resources. Basically, two approaches have been used to appraise the amounts of undiscovered mineral resources in a geologic province: (1) projection models, which use statistical data on the past outcome of exploration and development in the province; and (2) estimation models of the overall resources of the province, which use certain known parameters of the province together with the outcome of exploration and development in analogous provinces. These two approaches often lead to widely different estimates. Some of the controversy that arises results from a confusion of the probabilistic significance of the quantities yielded by each of the two approaches. Also, inherent limitations of analytic projection models, such as those using the logistic and Gompertz functions, have often been ignored. The resource-assessment problem should be recast in terms that provide for consideration of the probability of existence of the resource and of the probability of discovery of a deposit. Then the two above-mentioned models occupy the two ends of the probability range. The new approach accounts for (1) what can be expected with reasonably high certainty by mere projections of what has been accomplished in the past; (2) the inherent biases of decision-makers and resource estimators; (3) upper bounds that can be set up as goals for exploration; and (4) the uncertainties in geologic conditions in a search for minerals. Actual outcomes can then

  20. United States streamflow probabilities based on forecasted La Nina, winter-spring 2000

    USGS Publications Warehouse

    Dettinger, M.D.; Cayan, D.R.; Redmond, K.T.

    1999-01-01

    Although for the last 5 months the Tahiti-Darwin Southern Oscillation Index (SOI) has hovered close to normal, the “equatorial” SOI has remained in the La Niña category and predictions are calling for La Niña conditions this winter. In view of these predictions of continuing La Niña and as a direct extension of previous studies of the relations between El Niño-Southern Oscillation (ENSO) conditions and streamflow in the United States (e.g., Redmond and Koch, 1991; Cayan and Webb, 1992; Redmond and Cayan, 1994; Dettinger et al., 1998; Garen, 1998; Cayan et al., 1999; Dettinger et al., in press), the probabilities that United States streamflows from December 1999 through July 2000 will be in the upper and lower thirds (terciles) of the historical records are estimated here. The processes that link ENSO to North American streamflow are discussed in detail in these diagnostic studies. Our justification for generating this forecast is threefold: (1) Cayan et al. (1999) recently have shown that ENSO influences on streamflow variations and extremes are proportionately larger than the corresponding precipitation teleconnections. (2) Redmond and Cayan (1994) and Dettinger et al. (in press) also have shown that the low-frequency evolution of ENSO conditions supports long-lead correlations between ENSO and streamflow in many rivers of the conterminous United States. (3) In many rivers, significant (weeks-to-months) delays between precipitation and the release to streams of snowmelt or ground-water discharge can support even longer term forecasts of streamflow than is possible for precipitation. The relatively slow, orderly evolution of El Niño-Southern Oscillation episodes, the accentuated dependence of streamflow upon ENSO, and the long lags between precipitation and flow encourage us to provide the following analysis as a simple prediction of this year’s river flows.
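
    As a toy illustration of the tercile-probability idea described above, the following Python sketch counts how often flows in past La Niña years fell in the upper or lower third of the record; the flow values and La Niña flags are made up, not the historical records used in the report.

      import numpy as np

      # Hypothetical December-July mean flows (one value per year) and La Nina flags;
      # illustrative only, not the historical records used in the report.
      flows = np.array([120., 95., 80., 150., 60., 110., 70., 130., 90., 100.,
                        85., 140., 65., 105., 75.])
      la_nina = np.array([0, 1, 1, 0, 1, 0, 1, 0, 0, 1,
                          1, 0, 1, 0, 1], dtype=bool)

      lower, upper = np.percentile(flows, [100 / 3, 200 / 3])  # tercile boundaries
      nina_flows = flows[la_nina]

      p_lower = np.mean(nina_flows <= lower)  # P(flow in lower third | La Nina year)
      p_upper = np.mean(nina_flows >= upper)  # P(flow in upper third | La Nina year)
      print(f"lower-third probability: {p_lower:.2f}, upper-third probability: {p_upper:.2f}")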

  1. Model assisted probability of detection for a guided waves based SHM technique

    NASA Astrophysics Data System (ADS)

    Memmolo, V.; Ricci, F.; Maio, L.; Boffa, N. D.; Monaco, E.

    2016-04-01

    Guided wave (GW) Structural Health Monitoring (SHM) makes it possible to assess the health of aerostructures thanks to its great sensitivity to the appearance of delaminations and/or debondings. Because of the several complexities affecting wave propagation in composites, an efficient GW SHM system requires effective quantification combined with a rigorous statistical evaluation procedure. The Probability of Detection (POD) approach is a commonly accepted method to quantify NDI results, and it can be effectively extended to an SHM context. However, it requires a very complex setup and many coupons. When a rigorous correlation with measurements is adopted, Model Assisted POD (MAPOD) is an efficient alternative to classic methods. This paper is concerned with the identification of small emerging delaminations in composite structural components. An ultrasonic GW tomography method focused on impact damage detection in composite plate-like structures, recently developed by the authors, is investigated, laying the basis for a more complex MAPOD analysis. Experimental tests carried out on a typical wing composite structure demonstrated the effectiveness of the modeling approach for detecting damage with the tomographic algorithm. Environmental disturbances, which affect signal waveforms and consequently damage detection, are considered by simulating mathematical noise in the modeling stage. A statistical method is used for an effective decision-making procedure. A Damage Index approach is implemented as the metric to interpret the signals collected from a distributed sensor network, and a subsequent graphic interpolation is carried out to reconstruct the damage appearance. A model validation and first reliability assessment results are provided, in view of quantifying and optimizing system performance as well.

  2. Value of genetic testing for hereditary colorectal cancer in a probability-based US online sample

    PubMed Central

    Knight, Sara J.; Mohamed, Ateesha F.; Marshall, Deborah A.; Ladabaum, Uri; Phillips, Kathryn A.; Walsh, Judith M. E.

    2015-01-01

    Background While choices about genetic testing are increasingly common for patients and families, and public opinion surveys suggest public interest in genomics, it is not known how adults from the general population value genetic testing for heritable conditions. We sought to understand in a US sample the relative value of the characteristics of genetic tests to identify risk of hereditary colorectal cancer, among the first genomic applications with evidence to support its translation to clinical settings. Methods A Web-enabled choice-format conjoint survey was conducted with adults age 50 and older from a probability-based US panel. Participants were asked to make a series of choices between two hypothetical blood tests that differed in risk of false negative test, privacy, and cost. Random parameters logit models were used to estimate preferences, the dollar value of genetic information, and intent to have genetic testing. Results A total of 355 individuals completed choice-format questions. Cost and privacy were more highly valued than reducing the chance of a false negative result. Most (97%, 95% Confidence Interval (CI): 95% to 99%) would have genetic testing to reduce the risk of dying from colorectal cancer in the best scenario (no false negatives, results disclosed to primary care physician). Only 41% (95% CI: 25% to 57%) would have genetic testing in the worst case (20% false negatives, results disclosed to insurance company). Conclusions Given the characteristics and levels included in the choice, if false negative test results are unlikely and results are shared with a primary care physician, the majority would have genetic testing. As genomic services become widely available, primary care professionals will need to be increasingly knowledgeable about genetic testing decisions. PMID:25589525

  3. A Scrabble Heuristic Based on Probability That Performs at Championship Level

    NASA Astrophysics Data System (ADS)

    Ramírez, Arturo; Acuña, Francisco González; Romero, Alejandro González; Alquézar, René; Hernández, Enric; Aguilar, Amador Roldán; Olmedo, Ian García

    The game of Scrabble, in its competitive form (one vs. one), has been tackled mostly by using Monte Carlo simulation. Recently [1], Probability Theory (Bayes’ theorem) was used to gain knowledge about the opponents’ tiles; this proved to be a good approach to further improve Computer Scrabble. We used probability to evaluate Scrabble leaves (rack residues); using this evaluation, a heuristic function that dictates a move can then be constructed. To calculate these probabilities it is necessary to have a lexicon, in our case a Spanish lexicon. To make proper investigations in the domain of Scrabble it is important to have the same lexicon as the one used by humans in official tournaments. We did a huge amount of work to build this free lexicon. In this paper a heuristic function that involves leave probabilities is given. We now have an engine, Heuri, that uses this heuristic, and we have been able to perform some experiments to test it. The tests include matches against highly expert players; the games played so far give us promising results. For instance, a match was recently played between the current World Scrabble Champion (in Spanish) and Heuri; Heuri defeated the World Champion 6-0! Heuri includes a move generator which, at the cost of using a lot of memory, is faster than using DAWG [2] or GADDAG [3]. A plan to build a stronger Heuri that combines heuristics using probabilities, opponent modeling and Monte Carlo simulation is also proposed.

  4. Methods for estimating annual exceedance probability discharges for streams in Arkansas, based on data through water year 2013

    USGS Publications Warehouse

    Wagner, Daniel M.; Krieger, Joshua D.; Veilleux, Andrea G.

    2016-08-04

    In 2013, the U.S. Geological Survey initiated a study to update regional skew, annual exceedance probability discharges, and regional regression equations used to estimate annual exceedance probability discharges for ungaged locations on streams in the study area with the use of recent geospatial data, new analytical methods, and available annual peak-discharge data through the 2013 water year. An analysis of regional skew using Bayesian weighted least-squares/Bayesian generalized-least squares regression was performed for Arkansas, Louisiana, and parts of Missouri and Oklahoma. The newly developed constant regional skew of -0.17 was used in the computation of annual exceedance probability discharges for 281 streamgages used in the regional regression analysis. Based on analysis of covariance, four flood regions were identified for use in the generation of regional regression models. Thirty-nine basin characteristics were considered as potential explanatory variables, and ordinary least-squares regression techniques were used to determine the optimum combinations of basin characteristics for each of the four regions. Basin characteristics in candidate models were evaluated based on multicollinearity with other basin characteristics (variance inflation factor < 2.5) and statistical significance at the 95-percent confidence level (p ≤ 0.05). Generalized least-squares regression was used to develop the final regression models for each flood region. Average standard errors of prediction of the generalized least-squares models ranged from 32.76 to 59.53 percent, with the largest range in flood region D. Pseudo coefficients of determination of the generalized least-squares models ranged from 90.29 to 97.28 percent, with the largest range also in flood region D. The regional regression equations apply only to locations on streams in Arkansas where annual peak discharges are not substantially affected by regulation, diversion, channelization, backwater, or urbanization

  5. Heightened odds of large earthquakes near Istanbul: an interaction-based probability calculation

    USGS Publications Warehouse

    Parsons, T.; Toda, S.; Stein, R.S.; Barka, A.; Dieterich, J.H.

    2000-01-01

    We calculate the probability of strong shaking in Istanbul, an urban center of 10 million people, from the description of earthquakes on the North Anatolian fault system in the Marmara Sea during the past 500 years, and we test the resulting catalog against the frequency of damage in Istanbul during the preceding millennium. Departing from current practice, we include the time-dependent effect of stress transferred by the 1999 moment magnitude M = 7.4 Izmit earthquake to faults nearer to Istanbul. We find a 62 ± 15% probability (one standard deviation) of strong shaking during the next 30 years and 32 ± 12% during the next decade.
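
    For orientation only, the Python sketch below converts an assumed long-term earthquake rate into 10- and 30-year Poisson probabilities. It is not the interaction-based, time-dependent calculation used in the study, which includes the stress transferred by the 1999 Izmit earthquake and therefore yields the higher values quoted above.

      import math

      # Illustrative only: an assumed long-term, time-averaged rate of large earthquakes
      # near Istanbul. The study's time-dependent model, which adds stress transferred by
      # the 1999 Izmit earthquake, yields higher probabilities than this Poisson baseline.
      rate_per_year = 1.0 / 150.0  # assumed mean recurrence interval of 150 years
      for window in (10, 30):
          p = 1.0 - math.exp(-rate_per_year * window)
          print(f"{window}-year Poisson probability: {p:.0%}")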

  6. Implicit Segmentation of a Stream of Syllables Based on Transitional Probabilities: An MEG Study

    ERIC Educational Resources Information Center

    Teinonen, Tuomas; Huotilainen, Minna

    2012-01-01

    Statistical segmentation of continuous speech, i.e., the ability to utilise transitional probabilities between syllables in order to detect word boundaries, is reflected in the brain's auditory event-related potentials (ERPs). The N1 and N400 ERP components are typically enhanced for word onsets compared to random syllables during active…

  7. How the Probability and Potential Clinical Significance of Pharmacokinetically Mediated Drug-Drug Interactions Are Assessed in Drug Development: Desvenlafaxine as an Example

    PubMed Central

    Nichols, Alice I.; Preskorn, Sheldon H.

    2015-01-01

    Objective: The avoidance of adverse drug-drug interactions (DDIs) is a high priority in terms of both the US Food and Drug Administration (FDA) and the individual prescriber. With this perspective in mind, this article illustrates the process for assessing the risk of a drug (example here being desvenlafaxine) causing or being the victim of DDIs, in accordance with FDA guidance. Data Sources/Study Selection: DDI studies for the serotonin-norepinephrine reuptake inhibitor desvenlafaxine conducted by the sponsor and published since 2009 are used as examples of the systematic way that the FDA requires drug developers to assess whether their new drug is either capable of causing clinically meaningful DDIs or being the victim of such DDIs. In total, 8 open-label studies tested the effects of steady-state treatment with desvenlafaxine (50–400 mg/d) on the pharmacokinetics of cytochrome (CYP) 2D6 and/or CYP 3A4 substrate drugs, or the effect of CYP 3A4 inhibition on desvenlafaxine pharmacokinetics. The potential for DDIs mediated by the P-glycoprotein (P-gp) transporter was assessed in in vitro studies using Caco-2 monolayers. Data Extraction: Changes in area under the plasma concentration-time curve (AUC; CYP studies) and efflux (P-gp studies) were reviewed for potential DDIs in accordance with FDA criteria. Results: Desvenlafaxine coadministration had minimal effect on CYP 2D6 and/or 3A4 substrates per FDA criteria. Changes in AUC indicated either no interaction (90% confidence intervals for the ratio of AUC geometric least-squares means [GM] within 80%–125%) or weak inhibition (AUC GM ratio 125% to < 200%). Coadministration with ketoconazole resulted in a weak interaction with desvenlafaxine (AUC GM ratio of 143%). Desvenlafaxine was not a substrate (efflux ratio < 2) or inhibitor (50% inhibitory drug concentration values > 250 μM) of P-gp. Conclusions: A 2-step process based on FDA guidance can be used first to determine whether a pharmacokinetically mediated

  8. Probability Theory

    NASA Astrophysics Data System (ADS)

    Jaynes, E. T.; Bretthorst, G. Larry

    2003-04-01

    Foreword; Preface; Part I. Principles and Elementary Applications: 1. Plausible reasoning; 2. The quantitative rules; 3. Elementary sampling theory; 4. Elementary hypothesis testing; 5. Queer uses for probability theory; 6. Elementary parameter estimation; 7. The central, Gaussian or normal distribution; 8. Sufficiency, ancillarity, and all that; 9. Repetitive experiments, probability and frequency; 10. Physics of 'random experiments'; Part II. Advanced Applications: 11. Discrete prior probabilities, the entropy principle; 12. Ignorance priors and transformation groups; 13. Decision theory: historical background; 14. Simple applications of decision theory; 15. Paradoxes of probability theory; 16. Orthodox methods: historical background; 17. Principles and pathology of orthodox statistics; 18. The Ap distribution and rule of succession; 19. Physical measurements; 20. Model comparison; 21. Outliers and robustness; 22. Introduction to communication theory; References; Appendix A. Other approaches to probability theory; Appendix B. Mathematical formalities and style; Appendix C. Convolutions and cumulants.

  9. Methods for estimating annual exceedance-probability discharges for streams in Iowa, based on data through water year 2010

    USGS Publications Warehouse

    Eash, David A.; Barnes, Kimberlee K.; Veilleux, Andrea G.

    2013-01-01

    A statewide study was performed to develop regional regression equations for estimating selected annual exceedance-probability statistics for ungaged stream sites in Iowa. The study area comprises streamgages located within Iowa and 50 miles beyond the State’s borders. Annual exceedance-probability estimates were computed for 518 streamgages by using the expected moments algorithm to fit a Pearson Type III distribution to the logarithms of annual peak discharges for each streamgage using annual peak-discharge data through 2010. The estimation of the selected statistics included a Bayesian weighted least-squares/generalized least-squares regression analysis to update regional skew coefficients for the 518 streamgages. Low-outlier and historic information were incorporated into the annual exceedance-probability analyses, and a generalized Grubbs-Beck test was used to detect multiple potentially influential low flows. Also, geographic information system software was used to measure 59 selected basin characteristics for each streamgage. Regional regression analysis, using generalized least-squares regression, was used to develop a set of equations for each flood region in Iowa for estimating discharges for ungaged stream sites with 50-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent annual exceedance probabilities, which are equivalent to annual flood-frequency recurrence intervals of 2, 5, 10, 25, 50, 100, 200, and 500 years, respectively. A total of 394 streamgages were included in the development of regional regression equations for three flood regions (regions 1, 2, and 3) that were defined for Iowa based on landform regions and soil regions. Average standard errors of prediction range from 31.8 to 45.2 percent for flood region 1, 19.4 to 46.8 percent for flood region 2, and 26.5 to 43.1 percent for flood region 3. The pseudo coefficients of determination for the generalized least-squares equations range from 90.8 to 96.2 percent for flood region 1, 91.5 to 97

  10. Evaluation of tsunami potential based on conditional probability for specific zones of the Pacific tsunamigenic rim

    NASA Astrophysics Data System (ADS)

    Koravos, George Ch.; Yadav, R. B. S.; Tsapanos, Theodoros M.

    2015-09-01

    The Pacific tsunamigenic rim is one of the most tsunamigenic regions of the world which has experienced large catastrophic tsunamis in the past, resulting in huge loss of lives and properties. In this study, probabilities of occurrence of large tsunamis with tsunami intensity (Soloviev-Imamura intensity scale) I ≥ 1.5, I ≥ 2.0, I ≥ 2.5, I ≥ 3.0, I ≥ 3.5 and I ≥ 4.0 have been calculated over the next 100 years in ten main tsunamigenic zones of the Pacific rim area using a homogeneous and complete tsunami catalogue covering the time period from 684 to 2011. In order to evaluate tsunami potential, we applied the conditional probability method in each zone by considering the inter-occurrence times between the successive tsunamis generated in the past that follow the lognormal distribution. Thus, we assessed the probability of the next generation of large tsunamis in each zone by considering the time of the last tsunami occurrence. The a-posteriori occurrence of the last large tsunami has also been assessed, assuming that the time of the last occurrence coincides with the time of the event prior to the last one. The estimated a-posteriori probabilities exhibit satisfactory results in most of the zones, revealing a promising technique and confirming the reliability of the tsunami data used. Furthermore, the tsunami potential in different tsunamigenic zones is also expressed in terms of spatial maps of conditional probabilities for two levels of tsunami intensities I ≥ 1.5 and I ≥ 2.5 during the next 10, 20, 50 and 100 years. Estimated results reveal that the conditional probabilities in the South America and Alaska-Aleutian zones for larger tsunami intensity I ≥ 2.5 are in the range of 92-93%, much larger than that for Japan (69%), for a time period of 100 years, suggesting that those are the most vulnerable tsunamigenic zones. The spatial maps provide a brief atlas of tsunami potential in the Pacific rim area.
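
    The conditional-probability step described above can be sketched in Python for a single zone, assuming a lognormal fit to inter-occurrence times; the shape and scale parameters below are placeholders, not the values estimated in the paper.

      from scipy.stats import lognorm

      # Assumed lognormal fit to inter-occurrence times (years) of large tsunamis in one
      # zone; the shape and scale values are placeholders, not the paper's estimates.
      dist = lognorm(s=0.9, scale=45.0)

      t_elapsed = 60.0  # years since the last large tsunami in the zone
      for dt in (10, 20, 50, 100):
          # P(next event within dt years | no event during the last t_elapsed years)
          p = (dist.cdf(t_elapsed + dt) - dist.cdf(t_elapsed)) / dist.sf(t_elapsed)
          print(f"next {dt:3d} years: {p:.2f}")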

  11. Effect of Reinforcement Probability and Prize Size on Cocaine and Heroin Abstinence in Prize-Based Contingency Management

    ERIC Educational Resources Information Center

    Ghitza, Udi E.; Epstein, David H.; Schmittner, John; Vahabzadeh, Massoud; Lin, Jia-Ling; Preston, Kenzie L.

    2008-01-01

    Although treatment outcome in prize-based contingency management has been shown to depend on reinforcement schedule, the optimal schedule is still unknown. Therefore, we conducted a retrospective analysis of data from a randomized clinical trial (Ghitza et al., 2007) to determine the effects of the probability of winning a prize (low vs. high) and…

  12. Computer-Based Graphical Displays for Enhancing Mental Animation and Improving Reasoning in Novice Learning of Probability

    ERIC Educational Resources Information Center

    Kaplan, Danielle E.; Wu, Erin Chia-ling

    2006-01-01

    Our research suggests static and animated graphics can lead to more animated thinking and more correct problem solving in computer-based probability learning. Pilot software modules were developed for graduate online statistics courses and representation research. A study with novice graduate student statisticians compared problem solving in five…

  13. MEASUREMENT OF CHILDREN'S EXPOSURE TO PESTICIDES: ANALYSIS OF URINARY METABOLITE LEVELS IN A PROBABILITY-BASED SAMPLE

    EPA Science Inventory

    The Minnesota Children's Pesticide Exposure Study is a probability-based sample of 102 children 3-13 years old who were monitored for commonly used pesticides. During the summer of 1997, first-morning-void urine samples (1-3 per child) were obtained for 88% of study children a...

  14. Lake Superior Zooplankton Biomass Predictions from LOPC Tow Surveys Compare Well with a Probability Based Net Survey

    EPA Science Inventory

    We conducted a probability-based sampling of Lake Superior in 2006 and compared the zooplankton biomass estimate with laser optical plankton counter (LOPC) predictions. The net survey consisted of 52 sites stratified across three depth zones (0-30, 30-150, >150 m). The LOPC tow...

  15. Development and Use of a Computer-Based Interactive Resource for Teaching and Learning Probability in Primary Classrooms

    ERIC Educational Resources Information Center

    Trigueros, Maria; Lozano, Maria Dolores; Lage, Ana Elisa

    2006-01-01

    "Enciclomedia" is a Mexican project for primary school teaching using computers in the classroom. Within this project, and following an enactivist theoretical perspective and methodology, we have designed a computer-based package called "Dados", which, together with teaching guides, is intended to support the teaching and learning of probability.…

  16. Guide waves-based multi-damage identification using a local probability-based diagnostic imaging method

    NASA Astrophysics Data System (ADS)

    Gao, Dongyue; Wu, Zhanjun; Yang, Lei; Zheng, Yuebin

    2016-04-01

    Multi-damage identification is an important and challenging task in the research of guide waves-based structural health monitoring. In this paper, a multi-damage identification method is presented using a guide waves-based local probability-based diagnostic imaging (PDI) method. The method includes a path damage judgment stage, a multi-damage judgment stage and a multi-damage imaging stage. First, damage imaging was performed by partition. The damage imaging regions are divided into beside damage signal paths. The difference in guide waves propagation characteristics between cross and beside damage paths is proposed by theoretical analysis of the guide wave signal feature. The time-of-flight difference of paths is used as a factor to distinguish between cross and beside damage paths. Then, a global PDI method (damage identification using all paths in the sensor network) is performed using the beside damage path network. If the global PDI damage zone crosses the beside damage path, it means that the discrete multi-damage model (such as a group of holes or cracks) has been misjudged as a continuum single-damage model (such as a single hole or crack) by the global PDI method. Subsequently, damage imaging regions are separated by beside damage path and local PDI (damage identification using paths in the damage imaging regions) is performed in each damage imaging region. Finally, multi-damage identification results are obtained by superimposing the local damage imaging results and the marked cross damage paths. The method is employed to inspect the multi-damage in an aluminum plate with a surface-mounted piezoelectric ceramic sensors network. The results show that the guide waves-based multi-damage identification method is capable of visualizing the presence, quantity and location of structural damage.

  17. Competency-based curricular design to encourage significant learning.

    PubMed

    Hurtubise, Larry; Roman, Brenda

    2014-07-01

    Most significant learning (SL) experiences produce long-lasting learning experiences that meaningfully change the learner's thinking, feeling, and/or behavior. Most significant teaching experiences involve strong connections with the learner and recognition that the learner felt changed by the teaching effort. L. Dee Fink in Creating Significant Learning Experiences: An Integrated Approach to Designing College Courses defines six kinds of learning goals: Foundational Knowledge, Application, Integration, Human Dimension, Caring, and Learning to Learn. SL occurs when learning experiences promote interaction between the different kinds of goals; for example, acquiring knowledge alone is not enough, but when paired with a learning experience, such as an effective patient experience as in Caring, then significant (and lasting) learning occurs. To promote SL, backward design principles that start with clearly defined learning goals and the context of the situation of the learner are particularly effective. Emphasis on defining assessment methods prior to developing teaching/learning activities is the key: this ensures that assessment (where the learner should be at the end of the educational activity/process) drives instruction and that assessment and learning/instruction are tightly linked so that assessment measures a defined outcome (competency) of the learner. Employing backward design and the AAMC's MedBiquitous standard vocabulary for medical education can help to ensure that curricular design and redesign efforts effectively enhance educational program quality and efficacy, leading to improved patient care. Such methods can promote successful careers in health care for learners through development of self-directed learning skills and active learning, in ways that help learners become fully committed to lifelong learning and continuous professional development. PMID:24981665

  19. The Significance of Trust in School-Based Collaborative Leadership

    ERIC Educational Resources Information Center

    Coleman, Andrew

    2012-01-01

    The expectation that schools should work in partnership to promote the achievement of children has arguably been the defining feature of school policy over the last decade. This rise in school-to-school partnerships and increased emphasis on multi-agency-based interventions for vulnerable children have seen the emergence of a new form of school…

  20. Men who have sex with men: a comparison of a probability sample survey and a community based study.

    PubMed

    Dodds, J P; Mercer, C H; Mercey, D E; Copas, A J; Johnson, A M

    2006-02-01

    We compared characteristics of men who have sex with men (MSM) in a probability sample survey with a community-based study in London. The majority of men in both surveys reported male sex partner(s) in the last year, but MSM recruited through the population-based survey had lower levels of HIV risk behaviour and reported fewer sexually transmitted infections and less HIV testing than those recruited from gay venues. Community samples are likely to overestimate levels of risk behaviour among all MSM.

  1. A satellite rainfall retrieval technique over northern Algeria based on the probability of rainfall intensities classification from MSG-SEVIRI

    NASA Astrophysics Data System (ADS)

    Lazri, Mourad; Ameur, Soltane

    2016-09-01

    In this paper, an algorithm based on the probability of rainfall intensities classification for rainfall estimation from Meteosat Second Generation/Spinning Enhanced Visible and Infrared Imager (MSG-SEVIRI) has been developed. The classification scheme uses various spectral parameters of SEVIRI that provide information about cloud top temperature and optical and microphysical cloud properties. The presented method is developed and trained for the north of Algeria. The calibration of the method is carried out using rain classification fields derived from radar as a reference for the rainy season from November 2006 to March 2007. Rainfall rates are assigned to rain areas previously identified and classified according to the precipitation formation processes. The comparisons between satellite-derived precipitation estimates and validation data show that the developed scheme performs reasonably well. Indeed, the correlation coefficient reaches a significant level (r = 0.87). The values of POD, POFD and FAR are 80%, 13% and 25%, respectively. Also, for a rainfall estimation of about 614 mm, the RMSD, Bias, MAD and PD are 102.06 mm, 2.18 mm, 68.07 mm and 12.58, respectively.
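
    The verification scores quoted above (POD, POFD, FAR) follow from a standard 2 x 2 contingency table; a minimal Python sketch with made-up counts is:

      # Contingency counts for rain / no-rain detection (made-up numbers chosen only to
      # give scores of the same order as those quoted in the abstract).
      hits, misses, false_alarms, correct_negatives = 400, 100, 130, 870

      pod = hits / (hits + misses)                              # probability of detection
      far = false_alarms / (hits + false_alarms)                # false alarm ratio
      pofd = false_alarms / (false_alarms + correct_negatives)  # prob. of false detection

      print(f"POD={pod:.0%}  FAR={far:.0%}  POFD={pofd:.0%}")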

  2. Ethanol, not detectably metabolized in brain, significantly reduces brain metabolism, probably via action at specific GABA(A) receptors and has measurable metabolic effects at very low concentrations.

    PubMed

    Rae, Caroline D; Davidson, Joanne E; Maher, Anthony D; Rowlands, Benjamin D; Kashem, Mohammed A; Nasrallah, Fatima A; Rallapalli, Sundari K; Cook, James M; Balcar, Vladimir J

    2014-04-01

    Ethanol is a known neuromodulatory agent with reported actions at a range of neurotransmitter receptors. Here, we measured the effect of alcohol on metabolism of [3-¹³C]pyruvate in the adult guinea pig brain cortical tissue slice and compared the outcomes to those from a library of ligands active in the GABAergic system, as well as studying the metabolic fate of [1,2-¹³C]ethanol. Analyses of metabolic profile clusters suggest that the significant reductions in metabolism induced by ethanol (10, 30 and 60 mM) are via action at neurotransmitter receptors, particularly α4β3δ receptors, whereas very low concentrations of ethanol may produce metabolic responses owing to release of GABA via GABA transporter 1 (GAT1) and the subsequent interaction of this GABA with local α5- or α1-containing GABA(A)R. There was no measurable metabolism of [1,2-¹³C]ethanol, with no significant incorporation of ¹³C from [1,2-¹³C]ethanol into any measured metabolite above natural abundance, although there were measurable effects on total metabolite sizes similar to those seen with unlabelled ethanol.

  3. Significance of hair-dye base-induced sensory irritation.

    PubMed

    Fujita, F; Azuma, T; Tajiri, M; Okamoto, H; Sano, M; Tominaga, M

    2010-06-01

    Oxidation hair-dyes, which are the principal hair-dyes, sometimes induce painful sensory irritation of the scalp caused by the combination of highly reactive substances, such as hydrogen peroxide and alkali agents. Although many cases of severe facial and scalp dermatitis have been reported following the use of hair-dyes, sensory irritation caused by contact of the hair-dye with the skin has not been reported clearly. In this study, we used a self-assessment questionnaire to measure the sensory irritation in various regions of the body caused by two model hair-dye bases that contained different amounts of alkali agents without dyes. Moreover, the occipital region was identified as an alternative scalp region for testing the sensory irritation caused by hair-dye bases. We used this region to evaluate the relationship of sensitivity with skin properties, such as trans-epidermal water loss (TEWL), stratum corneum water content, sebum amount, surface temperature, current perception threshold (CPT), catalase activities in tape-stripped skin and the sensory irritation score with the model hair-dye bases. The hair-dye-sensitive group showed higher TEWL, a lower sebum amount, a lower surface temperature and higher catalase activity than the insensitive group, a profile similar to that of damaged skin. These results suggest that sensory irritation caused by hair-dye could occur easily on a damaged, dry scalp, as reported previously for skin cosmetics.

  4. Mesh-Based Entry Vehicle and Explosive Debris Re-Contact Probability Modeling

    NASA Technical Reports Server (NTRS)

    McPherson, Mark A.; Mendeck, Gavin F.

    2011-01-01

    Quantifying the risk to a crewed vehicle arising from potential re-contact with fragments from an explosive breakup of any jettisoned spacecraft segments during entry has long been sought. However, great difficulty lies in efficiently capturing the potential locations of each fragment and their collective threat to the vehicle. The method presented in this paper addresses this problem by using a stochastic approach that discretizes simulated debris pieces into volumetric cells, and then assesses strike probabilities accordingly. Combining spatial debris density and relative velocity between the debris and the entry vehicle, the strike probability can be calculated from the integral of the debris flux inside each cell over time. Using this technique it is possible to assess the risk to an entry vehicle along an entire trajectory as it separates from the jettisoned segment. By decoupling the fragment trajectories from that of the entry vehicle, multiple potential separation maneuvers can then be evaluated rapidly to provide an assessment of the best strategy to mitigate the re-contact risk.
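
    A much-simplified Python sketch of the cell-based strike-probability idea follows; the densities, speeds, residence times, and vehicle area are hypothetical, and the actual method integrates full simulated debris fields along an entry trajectory.

      import math

      # Each tuple: (debris spatial density [fragments/m^3], relative speed [m/s],
      # time the vehicle spends in that cell [s]). All values are hypothetical.
      cells = [(1e-9, 300.0, 0.5), (5e-10, 320.0, 0.8), (2e-9, 280.0, 0.3)]

      vehicle_area = 12.0  # assumed presented cross-sectional area of the vehicle, m^2

      # Expected number of strikes: the flux (density * relative speed) integrated over
      # the exposed area and the time spent in each cell, summed cell by cell.
      expected_hits = sum(rho * v_rel * vehicle_area * dt for rho, v_rel, dt in cells)
      p_strike = 1.0 - math.exp(-expected_hits)  # assuming Poisson-distributed hits
      print(f"expected hits: {expected_hits:.2e}, strike probability: {p_strike:.2e}")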

  5. A Probability-Based Alerting Logic for Aircraft on Parallel Approach

    NASA Technical Reports Server (NTRS)

    Carpenter, Brenda D.; Kuchar, James K.

    1997-01-01

    This document discusses the development and evaluation of an airborne collision alerting logic for aircraft on closely-spaced approaches to parallel runways. A novel methodology is used that links alerts to collision probabilities: alerting thresholds are set such that when the probability of a collision exceeds an acceptable hazard level, an alert is issued. The logic was designed to limit the hazard level to that estimated for the Precision Runway Monitoring system: one accident in every one thousand blunders which trigger alerts. When the aircraft were constrained to be coaltitude, evaluations of a two-dimensional version of the alerting logic show that the achieved hazard level is approximately one accident in every 250 blunders. Problematic scenarios have been identified and corrections to the logic can be made. The evaluations also show that over eighty percent of all unnecessary alerts were issued during scenarios in which the miss distance would have been less than 1000 ft, indicating that the alerts may have been justified. Also, no unnecessary alerts were generated during normal approaches.
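
    The core idea, issuing an alert only when the estimated collision probability exceeds an acceptable hazard level, can be sketched as follows; the Gaussian miss-distance model, Monte Carlo estimator, and numbers are assumptions for illustration, not the logic's actual implementation.

      import numpy as np

      rng = np.random.default_rng(0)

      def collision_probability(pred_miss_m, sigma_m, collision_radius_m=150.0, n=100_000):
          """Crude Monte Carlo estimate: probability that the true miss distance falls
          within a collision radius, given an assumed Gaussian prediction error."""
          samples = rng.normal(pred_miss_m, sigma_m, size=n)
          return np.mean(np.abs(samples) < collision_radius_m)

      HAZARD_LEVEL = 1e-3  # assumed acceptable collision probability per blunder

      for miss in (2000.0, 600.0, 200.0):
          p = collision_probability(miss, sigma_m=250.0)
          print(f"predicted miss {miss:5.0f} m -> P(collision)={p:.4f}",
                "ALERT" if p > HAZARD_LEVEL else "no alert")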

  6. Model-Based Calculations of the Probability of a Country's Nuclear Proliferation Decisions

    SciTech Connect

    Li, Jun; Yim, Man-Sung; McNelis, David N.

    2007-07-01

    explain the occurrences of proliferation decisions. However, predicting major historical proliferation events using model-based predictions has been unreliable. A country's nuclear proliferation decisions are affected by three main factors: (1) technology; (2) finance; and (3) political motivation [1]. Technological capability is important, as nuclear weapons development needs special materials, a detonation mechanism, delivery capability, and the supporting human resources and knowledge base. Financial capability is likewise important, as the development of the technological capabilities requires a serious financial commitment. It would be difficult for any state with a gross national product (GNP) significantly less than about $100 billion to devote enough annual governmental funding to a nuclear weapon program to actually achieve positive results within a reasonable time frame (i.e., 10 years). At the same time, nuclear proliferation is not a matter determined by a mastery of technical details or overcoming financial constraints. Technology or finance is a necessary condition but not a sufficient condition for nuclear proliferation. At the most fundamental level, the proliferation decision by a state is controlled by its political motivation. To effectively address the issue of predicting proliferation events, all three of the factors must be included in the model. To the knowledge of the authors, none of the existing models considered the 'technology' variable as part of the modeling. This paper presents an attempt to develop a methodology for statistical modeling and prediction of a country's nuclear proliferation decisions. The approach is based on the combined use of data on a country's nuclear technical capability profile, economic development status, security environment factors, and internal political and cultural factors. All of the information utilized in the study was from open source literature. (authors)

  7. Incremental value of diagonal earlobe crease to the Diamond-Forrester classification in estimating the probability of significant coronary artery disease determined by computed tomographic angiography.

    PubMed

    Shmilovich, Haim; Cheng, Victor Y; Nakazato, Ryo; Smith, Thomas W; Otaki, Yuka; Nakanishi, Rine; Paz, William; Pimentel, Raymond T; Berman, Daniel S; Rajani, Ronak

    2014-12-01

    The Diamond-Forrester (DF) algorithm overestimates the likelihood of significant coronary artery disease (≥50% stenosis, CAD50). The aim of the present study was to evaluate whether the addition of a diagonal earlobe crease (DELC) enhances the predictive ability of DF to detect CAD50 by coronary computed tomographic angiography (CTA). We evaluated 430 patients referred for CTA for symptoms, cardiovascular risk factors, and CAD50 likelihood using DF. Observers blinded to CTA findings evaluated the presence of DELC. The diagnostic accuracy and relation of DF, DELC, and DF + DELC for predicting CAD50 in patients with chest pain were evaluated using receiver operating characteristic curve (area under the curve [AUC]) analyses and multivariate logistic regression analyses. In 199 patients with chest pain, the sensitivity and specificity for CAD50 were 96% and 20% for DF (AUC 0.59, p = 0.59), 91% and 32% for DELC (AUC 0.62, p = 0.03), and 91% and 41% for DF + DELC (AUC 0.66, p = 0.004). On multivariate analysis, DELC was the only independent predictor of CAD50 (odds ratio 3.6, 95% confidence interval 1 to 12.9, p = 0.048). DF + DELC increased the predictive ability to detect CAD50 above cardiovascular risk factors (odds ratio 5.6, 95% confidence interval 1.6 to 19.8, p = 0.007). In patients with chest pain, the presence of DELC is related to CAD50 beyond DF. A combined variable of DF + DELC provides better discriminatory ability for detecting CAD50 than either method alone.

  8. Multiple Regression Model Based Sequential Probability Ratio Test for Structural Change Detection of Time Series

    NASA Astrophysics Data System (ADS)

    Takeda, Katsunori; Hattori, Tetsuo; Kawano, Hiromichi

    In real-time analysis and forecasting of time series data, it is important to detect structural change as immediately, correctly, and simply as possible, and it is necessary to rebuild the next prediction model as soon as possible after the change point. For this kind of time series data analysis, multiple linear regression models are generally used. In this paper, we present two methods, the Sequential Probability Ratio Test (SPRT) and the Chow test, which is well known in economics, and describe experimental evaluations of their effectiveness in change detection using multiple regression models. Moreover, we extend the definition of the detected change point in the SPRT method and show the improvement in change detection accuracy.
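
    A minimal Python sketch of the SPRT idea applied to regression residuals is given below; it assumes Gaussian residuals with a hypothesized mean shift under change, and the thresholds, shift size, and restart rule are illustrative choices rather than the settings used in the paper.

      import numpy as np

      def sprt_change_point(residuals, sigma, shift, alpha=0.01, beta=0.01):
          """Sequential probability ratio test on regression residuals.
          H0: residuals ~ N(0, sigma^2); H1: residuals ~ N(shift, sigma^2).
          Returns the index at which H1 is accepted, or None."""
          upper = np.log((1 - beta) / alpha)  # accept H1: structural change
          lower = np.log(beta / (1 - alpha))  # accept H0 and restart the test
          llr = 0.0
          for i, e in enumerate(residuals):
              llr += (shift * e - 0.5 * shift ** 2) / sigma ** 2  # Gaussian log-likelihood ratio
              if llr >= upper:
                  return i
              if llr <= lower:
                  llr = 0.0  # restart
          return None

      rng = np.random.default_rng(1)
      # Synthetic residuals with a mean shift after index 60 (the "structural change").
      res = np.concatenate([rng.normal(0, 1, 60), rng.normal(2, 1, 40)])
      print("change detected at index:", sprt_change_point(res, sigma=1.0, shift=2.0))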

  9. Multiple Vehicle Cooperative Localization with Spatial Registration Based on a Probability Hypothesis Density Filter

    PubMed Central

    Zhang, Feihu; Buckl, Christian; Knoll, Alois

    2014-01-01

    This paper studies the problem of multiple vehicle cooperative localization with spatial registration in the formulation of the probability hypothesis density (PHD) filter. Assuming vehicles are equipped with proprioceptive and exteroceptive sensors (with biases) to cooperatively localize positions, a simultaneous solution for joint spatial registration and state estimation is proposed. For this, we rely on the sequential Monte Carlo implementation of the PHD filtering. Compared to other methods, the concept of multiple vehicle cooperative localization with spatial registration is first proposed under Random Finite Set Theory. In addition, the proposed solution also addresses the challenges for multiple vehicle cooperative localization, e.g., the communication bandwidth issue and data association uncertainty. The simulation result demonstrates its reliability and feasibility in large-scale environments. PMID:24406860

  10. Analysis of altered gait cycle duration in amyotrophic lateral sclerosis based on nonparametric probability density function estimation.

    PubMed

    Wu, Yunfeng; Shi, Lei

    2011-04-01

    Human locomotion is regulated by the central nervous system (CNS). The neurophysiological changes in the CNS due to amyotrophic lateral sclerosis (ALS) may cause altered gait cycle duration (stride interval) or other gait rhythm. This article used a statistical method to analyze the altered stride interval in patients with ALS. We first estimated the probability density functions (PDFs) of stride interval from the outlier-processed gait rhythm time series, by using the nonparametric Parzen-window approach. Based on the PDFs estimated, the mean of the left-foot stride interval and the modified Kullback-Leibler divergence (MKLD) can be computed to serve as dominant features. In the classification experiments, the least squares support vector machine (LS-SVM) with Gaussian kernels was applied to distinguish the stride patterns in ALS patients. According to the results obtained with the stride interval time series recorded from 16 healthy control subjects and 13 patients with ALS, the key findings of the present study are summarized as follows. (1) It is observed that the mean of stride interval computed based on the PDF for the left foot is correlated with that for the right foot in patients with ALS. (2) The MKLD parameter of the gait in ALS is significantly different from that in healthy controls. (3) The diagnostic performance of the nonlinear LS-SVM, evaluated by the leave-one-out cross-validation method, is superior to that obtained by linear discriminant analysis. The LS-SVM can effectively separate the stride patterns between the groups of healthy controls and ALS patients with an overall accuracy of 82.8% and an area of 0.869 under the receiver operating characteristic curve. PMID:21130016
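
    The Parzen-window density estimate and a symmetrized (here taken as one possible "modified") Kullback-Leibler divergence between two stride-interval distributions can be sketched in Python as follows, using synthetic data rather than the recorded gait series; the bandwidth choice and the paper's exact MKLD definition may differ.

      import numpy as np
      from scipy.stats import gaussian_kde

      rng = np.random.default_rng(2)
      # Synthetic stride-interval series (seconds); not the recorded gait data.
      control = rng.normal(1.05, 0.03, 300)
      patient = rng.normal(1.15, 0.07, 300)

      # Parzen-window (Gaussian kernel) density estimates.
      p = gaussian_kde(control)
      q = gaussian_kde(patient)

      grid = np.linspace(0.8, 1.5, 500)
      pg = p(grid) + 1e-12
      qg = q(grid) + 1e-12
      dx = grid[1] - grid[0]

      # Symmetrized KL divergence on a common grid (one possible "modified" KLD).
      kl_pq = np.sum(pg * np.log(pg / qg)) * dx
      kl_qp = np.sum(qg * np.log(qg / pg)) * dx
      print("symmetrized KL divergence:", 0.5 * (kl_pq + kl_qp))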

  12. Statistically significant data base of rock properties for geothermal use

    NASA Astrophysics Data System (ADS)

    Koch, A.; Jorand, R.; Clauser, C.

    2009-04-01

    The high risk of failure due to the unknown properties of the target rocks at depth is a major obstacle to the exploration of geothermal energy. In general, the ranges of thermal and hydraulic properties given in compilations of rock properties are too large to be useful for constraining properties at a specific site. To overcome this problem, we study the thermal and hydraulic rock properties of the main rock types in Germany using a statistical approach. An important aspect is the use of data from exploration wells that are largely untapped for the purpose of geothermal exploration. In the current project stage, we have been analyzing mostly Devonian and Carboniferous drill cores from 20 deep boreholes in the region of the Lower Rhine Embayment and the Ruhr area (western North Rhine Westphalia). In total, we selected 230 core samples with a length of up to 30 cm from the core archive of the State Geological Survey. The use of core scanning technology allowed the rapid measurement of thermal conductivity, sonic velocity, and gamma density under dry and water saturated conditions with high resolution for a large number of samples. In addition, we measured porosity, bulk density, and matrix density based on Archimedes' principle and pycnometer analysis. As first results, we present arithmetic means, medians, and standard deviations characterizing the petrophysical properties and their variability for specific lithostratigraphic units. Bi- and multimodal frequency distributions correspond to the occurrence of different lithologies such as shale, limestone, dolomite, sandstone, siltstone, marlstone, and quartz-schist. In a next step, the data set will be combined with logging data and complementary mineralogical analyses to derive the variation of thermal conductivity with depth. As a final result, this may be used to infer thermal conductivity for boreholes that were drilled in similar geological settings but lack appropriate core data.

  13. Probability-based particle detection that enables threshold-free and robust in vivo single-molecule tracking.

    PubMed

    Smith, Carlas S; Stallinga, Sjoerd; Lidke, Keith A; Rieger, Bernd; Grunwald, David

    2015-11-01

    Single-molecule detection in fluorescence nanoscopy has become a powerful tool in cell biology but can present vexing issues in image analysis, such as limited signal, unspecific background, empirically set thresholds, image filtering, and false-positive detection limiting overall detection efficiency. Here we present a framework in which expert knowledge and parameter tweaking are replaced with a probability-based hypothesis test. Our method delivers robust and threshold-free signal detection with a defined error estimate and improved detection of weaker signals. The probability value has consequences for downstream data analysis, such as weighing a series of detections and corresponding probabilities, Bayesian propagation of probability, or defining metrics in tracking applications. We show that the method outperforms all current approaches, yielding a detection efficiency of >70% and a false-positive detection rate of <5% under conditions down to 17 photons/pixel background and 180 photons/molecule signal, which is beneficial for any kind of photon-limited application. Examples include limited brightness and photostability, phototoxicity in live-cell single-molecule imaging, and use of new labels for nanoscopy. We present simulations, experimental data, and tracking of low-signal mRNAs in yeast cells.
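
    The following Python sketch conveys the flavor of replacing a user-set intensity threshold with a hypothesis test, by assigning each pixel a p-value under a Poisson background-only model; it is not the authors' actual test, and the image, emitter, and error-rate control are illustrative assumptions.

      import numpy as np
      from scipy.stats import poisson

      rng = np.random.default_rng(3)

      background = 17.0  # photons/pixel, matching the background level quoted above
      image = rng.poisson(background, size=(64, 64)).astype(float)
      image[32, 32] += 100  # toy emitter: ~100 signal photons landing in one pixel

      # p-value of each pixel under the background-only (null) hypothesis.
      pvals = poisson.sf(image - 1, mu=background)  # P(X >= observed | background only)

      alpha = 0.05 / image.size  # Bonferroni-style control of the false-positive rate
      detections = np.argwhere(pvals < alpha)
      print("candidate detections (row, col):", detections)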

  15. A generic probability based model to derive regional patterns of crops in time and space

    NASA Astrophysics Data System (ADS)

    Wattenbach, Martin; Luedtke, Stefan; Redweik, Richard; van Oijen, Marcel; Balkovic, Juraj; Reinds, Gert Jan

    2015-04-01

    Croplands are not only key to human food supply; they also change the biophysical and biogeochemical properties of the land surface, leading to changes in the water cycle and energy partitioning, influencing soil erosion, and contributing substantially to the amount of greenhouse gases entering the atmosphere. The effects of croplands on the environment depend on the type of crop and the associated management, both of which are related to the site conditions, economic boundary settings, and the preferences of individual farmers. The method described here is designed to predict the most probable crop to appear at a given location and time. The method uses statistical crop area information on NUTS2 level from EUROSTAT and the Common Agricultural Policy Regionalized Impact Model (CAPRI) as observations. These crops are then spatially disaggregated to the 1 x 1 km grid scale within the region, using the assumption that the probability of a crop appearing at a given location and a given year depends on a) the suitability of the land for the cultivation of the crop, derived from the MARS Crop Yield Forecast System (MCYFS), and b) expert knowledge of agricultural practices. The latter includes knowledge concerning the feasibility of one crop following another (e.g., a late-maturing crop might leave too little time for the establishment of a winter cereal crop) and the need to combat weed infestations or crop diseases. The model is implemented in R and PostGIS. The quality of the generated crop sequences per grid cell is evaluated on the basis of the statistics reported by the joint EU/CAPRI database. The assessment is given on NUTS2 level using per cent bias as a measure, with a threshold of 15% as minimum quality. The results clearly indicate that crops with a large relative share within the administrative unit are not as error prone as crops that occupy only minor parts of the unit. However, roughly 40% still show an absolute per cent bias above the 15% threshold. This
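
    The disaggregation step can be illustrated with a toy allocation in Python: regional crop shares are distributed over grid cells with probabilities proportional to an assumed suitability score. The real model is implemented in R and PostGIS and additionally conditions on crop-sequence rules and management knowledge; the shares and suitabilities below are invented.

      import numpy as np

      rng = np.random.default_rng(4)

      crops = ["wheat", "maize", "rapeseed"]
      regional_share = np.array([0.5, 0.3, 0.2])  # invented NUTS2-level crop shares

      n_cells = 10
      # Assumed suitability of each grid cell for each crop (rows: cells, columns: crops).
      suitability = rng.uniform(0.1, 1.0, size=(n_cells, len(crops)))

      # Per-cell crop probabilities: regional share weighted by local suitability.
      weights = suitability * regional_share
      probs = weights / weights.sum(axis=1, keepdims=True)

      # Draw one crop per cell for one year.
      allocation = [rng.choice(crops, p=probs[i]) for i in range(n_cells)]
      print(allocation)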

  16. Probability Distribution Extraction from TEC Estimates based on Kernel Density Estimation

    NASA Astrophysics Data System (ADS)

    Demir, Uygar; Toker, Cenk; Çenet, Duygu

    2016-07-01

    Statistical analysis of the ionosphere, specifically of the Total Electron Content (TEC), may reveal important information about its temporal and spatial characteristics. One of the core metrics that express the statistical properties of a stochastic process is its Probability Density Function (pdf). Furthermore, statistical parameters such as mean, variance and kurtosis, which can be derived from the pdf, may provide information about the spatial uniformity or clustering of the electron content. For example, the variance differentiates between a quiet ionosphere and a disturbed one, whereas kurtosis differentiates between a geomagnetic storm and an earthquake. Therefore, valuable information about the state of the ionosphere (and the natural phenomena that cause the disturbance) can be obtained by looking at these statistical parameters. In the literature, there are publications that try to fit the histogram of TEC estimates to well-known pdfs such as the Gaussian, the Exponential, etc. However, constraining a histogram to fit a function with a fixed shape increases the estimation error, and all information extracted from such a pdf carries this error. With such techniques, it is highly likely that artificial characteristics appear in the estimated pdf that are not present in the original data. In the present study, we use the Kernel Density Estimation (KDE) technique to estimate the pdf of the TEC. KDE is a non-parametric approach that does not impose a specific form on the TEC distribution. As a result, better pdf estimates that almost perfectly fit the observed TEC values can be obtained compared to the techniques mentioned above. KDE is particularly good at representing tail probabilities and outliers. We also calculate the mean, variance and kurtosis of the measured TEC values. The technique is applied to the ionosphere over Turkey, where the TEC values are estimated from the GNSS measurements of the TNPGN-Active (Turkish National Permanent
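
    A minimal sketch of the KDE workflow under stated assumptions: the TEC samples below are synthetic placeholders for GNSS-derived estimates, and the bandwidth is left to SciPy's default (Scott's rule), which the study may not have used.

        import numpy as np
        from scipy.stats import gaussian_kde, kurtosis

        rng = np.random.default_rng(1)
        tec = rng.gamma(shape=9.0, scale=2.5, size=5000)   # placeholder for measured TEC values (TECU)

        kde = gaussian_kde(tec)                            # non-parametric pdf, no fixed shape imposed
        grid = np.linspace(tec.min(), tec.max(), 512)
        pdf = kde(grid)                                    # estimated pdf evaluated on a grid

        print("mean     :", round(float(tec.mean()), 2))
        print("variance :", round(float(tec.var(ddof=1)), 2))
        print("kurtosis :", round(float(kurtosis(tec)), 2))   # heavy tails flag disturbed conditions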

  17. Scaling of strength and lifetime probability distributions of quasibrittle structures based on atomistic fracture mechanics.

    PubMed

    Bazant, Zdenek P; Le, Jia-Liang; Bazant, Martin Z

    2009-07-14

    The failure probability of engineering structures such as aircraft, bridges, dams, nuclear structures, and ships, as well as microelectronic components and medical implants, must be kept extremely low, typically <10^-6. The safety factors needed to ensure it have so far been assessed empirically. For perfectly ductile and perfectly brittle structures, the empirical approach is sufficient because the cumulative distribution function (cdf) of random material strength is known and fixed. However, such an approach is insufficient for structures consisting of quasibrittle materials, which are brittle materials with inhomogeneities that are not negligible compared with the structure size. The reason is that the strength cdf of quasibrittle structures varies from Gaussian to Weibullian as the structure size increases. In this article, a recently proposed theory for the strength cdf of quasibrittle structures is refined by deriving it from fracture mechanics of nanocracks propagating by small, activation-energy-controlled, random jumps through the atomic lattice. This refinement also provides a plausible physical justification of the power law for subcritical creep crack growth, hitherto considered empirical. The theory is further extended to predict the cdf of structural lifetime at constant load, which is shown to be size- and geometry-dependent. The size effects on structure strength and lifetime are shown to be related and the latter to be much stronger. The theory fits previously unexplained deviations of experimental strength and lifetime histograms from the Weibull distribution. Finally, a boundary layer method for numerical calculation of the cdf of structural strength and lifetime is outlined.

  18. Species-Level Deconvolution of Metagenome Assemblies with Hi-C–Based Contact Probability Maps

    PubMed Central

    Burton, Joshua N.; Liachko, Ivan; Dunham, Maitreya J.; Shendure, Jay

    2014-01-01

    Microbial communities consist of mixed populations of organisms, including unknown species in unknown abundances. These communities are often studied through metagenomic shotgun sequencing, but standard library construction methods remove long-range contiguity information; thus, shotgun sequencing and de novo assembly of a metagenome typically yield a collection of contigs that cannot readily be grouped by species. Methods for generating chromatin-level contact probability maps, e.g., as generated by the Hi-C method, provide a signal of contiguity that is completely intracellular and contains both intrachromosomal and interchromosomal information. Here, we demonstrate how this signal can be exploited to reconstruct the individual genomes of microbial species present within a mixed sample. We apply this approach to two synthetic metagenome samples, successfully clustering the genome content of fungal, bacterial, and archaeal species with more than 99% agreement with published reference genomes. We also show that the Hi-C signal can secondarily be used to create scaffolded genome assemblies of individual eukaryotic species present within the microbial community, with higher levels of contiguity than some of the species’ published reference genomes. PMID:24855317
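
    A toy illustration of the clustering idea only, not the authors' pipeline: contigs are grouped from a synthetic Hi-C contact matrix in which intracellular (same-species) contacts dominate, using plain hierarchical clustering on an inverse-contact distance. All sizes and rates are made up.

        import numpy as np
        from scipy.cluster.hierarchy import linkage, fcluster
        from scipy.spatial.distance import squareform

        rng = np.random.default_rng(2)
        n_per_species, n_species = 30, 3
        species = np.repeat(np.arange(n_species), n_per_species)        # true contig-to-species map

        # Synthetic contact matrix: dense links within a species, sparse links between species.
        contacts = rng.poisson(1.0, (90, 90)).astype(float)
        for s in range(n_species):
            idx = np.where(species == s)[0]
            contacts[np.ix_(idx, idx)] += rng.poisson(20.0, (n_per_species, n_per_species))
        contacts = (contacts + contacts.T) / 2.0

        dist = 1.0 / (contacts + 1.0)                                   # high contact -> small distance
        np.fill_diagonal(dist, 0.0)
        tree = linkage(squareform(dist, checks=False), method="average")
        clusters = fcluster(tree, t=n_species, criterion="maxclust")    # deconvolved species bins
        print(clusters)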

  19. Level Set Segmentation of Medical Images Based on Local Region Statistics and Maximum a Posteriori Probability

    PubMed Central

    Wang, Yi; Lei, Tao; Fan, Yangyu; Feng, Yan

    2013-01-01

    This paper presents a variational level set method for simultaneous segmentation and bias field estimation of medical images with intensity inhomogeneity. In our model, the statistics of image intensities belonging to each different tissue in local regions are characterized by Gaussian distributions with different means and variances. According to maximum a posteriori probability (MAP) and Bayes' rule, we first derive a local objective function for image intensities in a neighborhood around each pixel. Then this local objective function is integrated with respect to the neighborhood center over the entire image domain to give a global criterion. In level set framework, this global criterion defines an energy in terms of the level set functions that represent a partition of the image domain and a bias field that accounts for the intensity inhomogeneity of the image. Therefore, image segmentation and bias field estimation are simultaneously achieved via a level set evolution process. Experimental results for synthetic and real images show desirable performances of our method. PMID:24302974

  20. Development of new risk score for pre-test probability of obstructive coronary artery disease based on coronary CT angiography.

    PubMed

    Fujimoto, Shinichiro; Kondo, Takeshi; Yamamoto, Hideya; Yokoyama, Naoyuki; Tarutani, Yasuhiro; Takamura, Kazuhisa; Urabe, Yoji; Konno, Kumiko; Nishizaki, Yuji; Shinozaki, Tomohiro; Kihara, Yasuki; Daida, Hiroyuki; Isshiki, Takaaki; Takase, Shinichi

    2015-09-01

    Existing methods to calculate the pre-test probability of obstructive coronary artery disease (CAD) were established using selected high-risk patients referred to conventional coronary angiography. The purpose of this study is to develop and validate a new method for the pre-test probability of obstructive CAD using patients who underwent coronary CT angiography (CTA), which could be applicable to a wider patient population. Using 4137 consecutive patients with suspected CAD who underwent coronary CTA at our institution, a multivariate logistic regression model with clinical factors as covariates was used to calculate the pre-test probability (K-score) of obstructive CAD determined by coronary CTA. The K-score was compared with the Duke clinical score using the area under the curve (AUC) of the receiver-operating characteristic curve. External validation was performed in an independent sample of 319 patients. The final model included eight significant predictors: age, gender, coronary risk factors (hypertension, diabetes mellitus, dyslipidemia, smoking), history of cerebral infarction, and chest symptoms. The AUC of the K-score was significantly greater than that of the Duke clinical score for both the derivation (0.736 vs. 0.699) and validation (0.714 vs. 0.688) data sets. Among patients who underwent coronary CTA, the newly developed K-score showed better pre-test prediction of obstructive CAD than the Duke clinical score in a Japanese population.
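
    An illustrative sketch of the modelling step, not the study's fitted K-score: a logistic regression on eight hypothetical clinical covariates yields a per-patient pre-test probability, summarized by the AUC on the derivation data. The data below are simulated.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(3)
        n = 4137                                   # derivation cohort size quoted in the abstract
        X = rng.normal(size=(n, 8))                # stand-ins for age, gender, risk factors, history, symptoms
        logit = 0.8 * X[:, 0] + 0.6 * X[:, 1] - 0.5
        y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit)))   # obstructive CAD on coronary CTA (synthetic)

        model = LogisticRegression().fit(X, y)
        pretest = model.predict_proba(X)[:, 1]              # pre-test probability per patient
        print("derivation AUC:", round(roc_auc_score(y, pretest), 3))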

  1. DrugLogit: logistic discrimination between drugs and nondrugs including disease-specificity by assigning probabilities based on molecular properties.

    PubMed

    García-Sosa, Alfonso T; Oja, Mare; Hetényi, Csaba; Maran, Uko

    2012-08-27

    The increasing knowledge of both the structure and activity of compounds provides a good basis for enhancing the pharmacological characterization of chemical libraries. In addition, pharmacology can be seen as incorporating advances from molecular biology as well as the chemical sciences, with innovative insight provided by studying target-ligand data from a ligand molecular point of view. Predictions and profiling of libraries of drug candidates have previously focused mainly on certain cases of oral bioavailability. Inclusion of other administration routes and disease-specificity would improve the precision of drug profiling. In this work, recent data are extended, and a probability-based approach is introduced for quantitative and gradual classification of compounds into categories of drugs/nondrugs, as well as for disease- or organ-specificity. Using experimental data for over 1067 compounds and multivariate logistic regressions, the classification shows good performance in training and independent test cases. The regressions have high statistical significance in terms of the robustness of coefficients and the 95% confidence intervals provided by 1000-fold bootstrap resampling. Besides their good predictive power, the classification functions remain chemically interpretable, containing only one to five variables in total, and the physicochemical terms involved can be easily calculated. The present approach is useful for an improved description and filtering of compound libraries. It can also be applied sequentially or in combinations of filters, as well as adapted to particular use cases. The scores and equations may be able to suggest possible routes for compound or library modification. The data are made available for reuse by others, and the equations are freely accessible at http://hermes.chem.ut.ee/~alfx/druglogit.html.
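
    A hedged sketch of the bootstrap validation idea, not the published DrugLogit equations: logistic coefficients are refit on resampled compound sets to obtain 95% confidence intervals (200 resamples here for brevity rather than the paper's 1000; descriptors and labels are synthetic).

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(4)
        n, p = 1067, 4                                       # compounds x physicochemical descriptors
        X = rng.normal(size=(n, p))
        y = rng.binomial(1, 1.0 / (1.0 + np.exp(-(X[:, 0] - 0.5 * X[:, 2]))))   # drug / nondrug label

        coefs = []
        for _ in range(200):                                 # bootstrap resampling of compounds
            idx = rng.integers(0, n, n)
            coefs.append(LogisticRegression().fit(X[idx], y[idx]).coef_[0])
        lo, hi = np.percentile(coefs, [2.5, 97.5], axis=0)
        print("95% CI per descriptor:", np.round(lo, 2), np.round(hi, 2))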

  2. Calibrating perceived understanding and competency in probability concepts: A diagnosis of learning difficulties based on Rasch probabilistic model

    NASA Astrophysics Data System (ADS)

    Mahmud, Zamalia; Porter, Anne; Salikin, Masniyati; Ghani, Nor Azura Md

    2015-12-01

    Students' understanding of probability concepts has been investigated from various perspectives. Competency, on the other hand, is often measured separately in the form of a test. This study set out to show that perceived understanding and competency can be calibrated and assessed together using Rasch measurement tools. Forty-four students from the STAT131 Understanding Uncertainty and Variation course at the University of Wollongong, NSW, volunteered to participate in the study. Rasch measurement, which is based on a probabilistic model, is used to calibrate the responses from two survey instruments and to investigate the interactions between them. Data were captured from the e-learning platform Moodle, where students provided their responses through an online quiz. The study shows that the majority of students perceived little understanding of conditional and independent events prior to learning about them but tended to demonstrate a slightly higher competency level afterward. Based on the Rasch map, there is an indication of some increase in learning and knowledge of some probability concepts at the end of the two-week lessons on probability.

  3. Automatic Sleep Stage Determination by Multi-Valued Decision Making Based on Conditional Probability with Optimal Parameters

    NASA Astrophysics Data System (ADS)

    Wang, Bei; Sugi, Takenao; Wang, Xingyu; Nakamura, Masatoshi

    Data for human sleep studies may be affected by internal and external influences. The recorded sleep data contain complex and stochastic factors, which increase the difficulty of applying computerized sleep stage determination techniques in clinical practice. The aim of this study is to develop an automatic sleep stage determination system that is optimized for variable sleep data. The methodology comprises two modules: expert knowledge database construction and automatic sleep stage determination. Visual inspection by a qualified clinician is utilized to obtain the probability density functions of the parameters during the learning process of expert knowledge database construction. Parameter selection is introduced in order to make the algorithm flexible. Automatic sleep stage determination is then performed based on conditional probability. The results showed close agreement with visual inspection by the clinician. The developed system can meet customized requirements in hospitals and institutions.
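
    A simplified sketch of the idea rather than the authors' system: per-stage probability densities of a single hypothetical parameter are learned from expert-scored epochs, and a new epoch is assigned to the stage with the highest conditional probability.

        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(6)
        stages = ["Wake", "REM", "NREM"]
        # Expert-scored training epochs: one spectral parameter per epoch (hypothetical values).
        training = {"Wake": rng.normal(8.0, 1.0, 200),
                    "REM":  rng.normal(5.0, 1.0, 200),
                    "NREM": rng.normal(2.0, 1.0, 200)}
        kdes = {s: gaussian_kde(x) for s, x in training.items()}    # expert knowledge database
        priors = {s: 1.0 / len(stages) for s in stages}

        def determine_stage(value):
            """Return the most probable stage and the normalized stage probabilities."""
            post = {s: priors[s] * kdes[s](value)[0] for s in stages}
            total = sum(post.values())
            return max(post, key=post.get), {s: p / total for s, p in post.items()}

        print(determine_stage(4.6))   # expected to favor REM for this toy parameter value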

  4. A fully automatic three-step liver segmentation method on LDA-based probability maps for multiple contrast MR images.

    PubMed

    Gloger, Oliver; Kühn, Jens; Stanski, Adam; Völzke, Henry; Puls, Ralf

    2010-07-01

    Automatic 3D liver segmentation in magnetic resonance (MR) data sets has proven to be a very challenging task in the domain of medical image analysis. There exist numerous approaches for automatic 3D liver segmentation on computed tomography data sets that have influenced the segmentation of MR images. In contrast to previous approaches to liver segmentation in MR data sets, we use all available MR channel information of different weightings and formulate liver tissue and position probabilities in a probabilistic framework. We apply multiclass linear discriminant analysis as a fast and efficient dimensionality reduction technique and generate probability maps that are then used for segmentation. We develop a fully automatic three-step 3D segmentation approach based upon a modified region-growing approach and an additional thresholding technique. Finally, we incorporate characteristic prior knowledge to improve the segmentation results. This novel 3D segmentation approach is modularized and can be applied to normal and fat-accumulated liver tissue.
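
    A minimal sketch of the probability-map step under stated assumptions: per-voxel multi-channel intensities are fed to a linear discriminant model whose class probabilities form a liver probability map. The toy volume, channel count, and the use of scikit-learn's LDA are illustrative simplifications.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(5)
        channels = 3                                                   # e.g. different MR weightings
        train_X = np.vstack([rng.normal(0.0, 1.0, (500, channels)),    # background voxels
                             rng.normal(2.0, 1.0, (500, channels))])   # liver voxels
        train_y = np.repeat([0, 1], 500)

        lda = LinearDiscriminantAnalysis().fit(train_X, train_y)
        volume = rng.normal(1.0, 1.5, (16, 16, 16, channels))          # toy multi-channel MR volume
        prob_map = lda.predict_proba(volume.reshape(-1, channels))[:, 1].reshape(16, 16, 16)
        seed = np.unravel_index(prob_map.argmax(), prob_map.shape)     # e.g. a seed for region growing
        print(seed, round(float(prob_map.max()), 3))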

  5. A mechanical model for predicting the probability of osteoporotic hip fractures based in DXA measurements and finite element simulation

    PubMed Central

    2012-01-01

    Background: Osteoporotic hip fractures represent a major cause of disability, loss of quality of life and even mortality among the elderly population. Decisions on drug therapy are based on the assessment of risk factors for fracture from BMD measurements. The combination of biomechanical models with clinical studies could better estimate bone strength and support specialists in their decisions. Methods: A model to assess the probability of fracture, based on damage and fracture mechanics, has been developed that evaluates the mechanical magnitudes involved in the fracture process from clinical BMD measurements. The model is intended to simulate the degenerative process in the skeleton, with the consequent loss of bone mass and hence the decrease of its mechanical resistance, which enables fracture under different traumatisms. Clinical studies were chosen, both in non-treatment conditions and under drug therapy, and fitted to specific patients according to their actual BMD measurements. The predictive model is applied in an FE simulation of the proximal femur. The fracture zone is determined according to the loading scenario (sideways fall, impact, accidental loads, etc.), using the mechanical properties of bone obtained from the evolutionary model at the considered time. Results: BMD evolution in untreated patients and in those under different treatments was analyzed. Evolutionary curves of fracture probability were obtained from the evolution of mechanical damage. The evolutionary curve of the untreated group of patients presented a marked increase in fracture probability, while the curves of patients under drug treatment showed variably decreased risks depending on the therapy type. Conclusion: The FE model allowed detailed maps of damage and fracture probability to be obtained, identifying high-risk local zones at the femoral neck and the intertrochanteric and subtrochanteric areas, which are the typical locations of osteoporotic hip fractures. The

  6. Results from probability-based, simplified, off-shore Louisiana CSEM hydrocarbon reservoir modeling

    NASA Astrophysics Data System (ADS)

    Stalnaker, J. L.; Tinley, M.; Gueho, B.

    2009-12-01

    Perhaps the biggest impediment to the commercial application of controlled-source electromagnetic (CSEM) geophysics to marine hydrocarbon exploration is the inefficiency of modeling and data inversion. If an understanding of the typical (in a statistical sense) geometrical and electrical nature of a reservoir can be attained, then it is possible to derive therefrom a simplified yet accurate model of the electromagnetic interactions that produce a measured marine CSEM signal, leading ultimately to efficient modeling and inversion. We have compiled geometric and resistivity measurements from roughly 100 known, producing off-shore Louisiana Gulf of Mexico reservoirs. Recognizing that most reservoirs could be recreated roughly from a sectioned hemi-ellipsoid, we devised a unified, compact description of reservoir geometry. Each reservoir was initially fit to the ellipsoid by eye, though we plan in the future to perform a more rigorous least-squares fit. We created, using kernel density estimation, initial probabilistic descriptions of the reservoir parameter distributions, with the understanding that additional information would not fundamentally alter our results, but rather increase accuracy. From the probabilistic description, we designed an approximate model consisting of orthogonally oriented current segments distributed across the ellipsoid--enough to define the shape, yet few enough to be resolved during inversion. The moment and length of the currents are mapped to the geometry and resistivity of the ellipsoid. The probability density functions (pdfs) derived from the reservoir statistics serve as a workbench. We first use the pdfs in a Monte Carlo simulation designed to assess the detectability of off-shore Louisiana reservoirs using magnitude-versus-offset (MVO) anomalies. From the pdfs, many reservoir instances are generated (using rejection sampling) and each normalized MVO response is calculated. The response strength is summarized by numerically computing the MVO power, and that

  7. RECRUITING FOR A LONGITUDINAL STUDY OF CHILDREN'S HEALTH USING A HOUSEHOLD-BASED PROBABILITY SAMPLING APPROACH

    EPA Science Inventory

    The sampling design for the National Children's Study (NCS) calls for a population-based, multi-stage, clustered household sampling approach (visit our website for more information on the NCS: www.nationalchildrensstudy.gov). The full sample is designed to be representative of ...

  8. Translating CFC-based piston ages into probability density functions of ground-water age in karst

    USGS Publications Warehouse

    Long, A.J.; Putnam, L.D.

    2006-01-01

    Temporal age distributions are equivalent to probability density functions (PDFs) of transit time. The type and shape of a PDF provide important information related to ground-water mixing at the well or spring and the complex nature of flow networks in karst aquifers. Chlorofluorocarbon (CFC) concentrations measured for samples from 12 locations in the karstic Madison aquifer were used to evaluate the suitability of various PDF types for this aquifer. Parameters of PDFs could not be estimated within acceptable confidence intervals for any of the individual sites. Therefore, metrics derived from CFC-based apparent ages were used to evaluate results of PDF modeling in a more general approach. The ranges of these metrics were established as criteria against which families of PDFs could be evaluated for their applicability to different parts of the aquifer. Seven PDF types, including five unimodal and two bimodal models, were evaluated. Model results indicate that unimodal models may be applicable to areas close to conduits that have younger piston (i.e., apparent) ages and that bimodal models probably are applicable to areas farther from conduits that have older piston ages. The two components of a bimodal PDF are interpreted as representing conduit and diffuse flow, and transit times of as much as two decades may separate these PDF components. Areas near conduits may be dominated by conduit flow, whereas areas farther from conduits having bimodal distributions probably have good hydraulic connection to both diffuse and conduit flow. © 2006 Elsevier B.V. All rights reserved.
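
    An illustrative construction only (the component distributions are assumptions, not those fitted in the study): a bimodal transit-time pdf is built as a mixture of a quick conduit-flow component and a slower diffuse-flow component separated by roughly two decades, and the mean transit time of the mixture is computed.

        import numpy as np
        from scipy.stats import lognorm
        from scipy.integrate import trapezoid

        t = np.linspace(0.1, 60.0, 600)                        # transit time, years
        conduit = lognorm(s=0.5, scale=3.0)                    # young, conduit-dominated component
        diffuse = lognorm(s=0.4, scale=25.0)                   # older, diffuse-flow component
        w = 0.4                                                # conduit fraction at the sampled spring
        pdf = w * conduit.pdf(t) + (1.0 - w) * diffuse.pdf(t)  # bimodal transit-time pdf

        mean_transit = trapezoid(t * pdf, t)                   # mean age of the mixture
        print(round(float(mean_transit), 1), "years")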

  9. Improvement of HMM-based action classification by using state transition probability

    NASA Astrophysics Data System (ADS)

    Kitamura, Yuka; Aruga, Haruki; Hashimoto, Manabu

    2015-04-01

    We propose a method to classify multiple similar actions contained in human behaviors by considering a weakly constrained order of actions. The proposed method regards human behavior as a combination of action patterns whose order is weakly constrained. In this method, actions are classified using not only image features but also the consistency of transitions between one action and the next. By considering such action transitions, our method can recognize human behavior even when the image features of different actions are similar to each other. Based on this idea, we have effectively improved the previous HMM-based algorithm. Through experiments using test image sequences of human behavior recorded in a bathroom, we confirmed that the average classification success rate is 97%, which is about 53% higher than that of the previous method.
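
    A schematic sketch of the decoding step, with made-up action labels, transition weights, and per-frame likelihoods: Viterbi decoding combines appearance-based likelihoods with a weakly constrained action-transition matrix so that similar-looking actions are disambiguated by their typical order.

        import numpy as np

        actions = ["enter", "wash", "dry", "leave"]
        # Weakly constrained action-transition probabilities (rows sum to 1; values are illustrative).
        trans = np.array([[0.10, 0.80, 0.05, 0.05],
                          [0.00, 0.30, 0.60, 0.10],
                          [0.00, 0.05, 0.30, 0.65],
                          [0.00, 0.00, 0.00, 1.00]])
        # Per-frame likelihoods P(image features | action), e.g. from appearance classifiers.
        emis = np.array([[0.70, 0.20, 0.05, 0.05],
                         [0.20, 0.50, 0.20, 0.10],
                         [0.10, 0.40, 0.40, 0.10],
                         [0.05, 0.10, 0.60, 0.25]])

        n_frames, n_actions = emis.shape
        logtrans = np.log(trans + 1e-12)
        delta = np.log(emis[0]) + np.log(1.0 / n_actions)       # uniform prior over the first action
        back = np.zeros((n_frames, n_actions), dtype=int)
        for t in range(1, n_frames):
            scores = delta[:, None] + logtrans                  # score of each (previous, current) pair
            back[t] = scores.argmax(axis=0)
            delta = scores.max(axis=0) + np.log(emis[t])
        path = [int(delta.argmax())]
        for t in range(n_frames - 1, 0, -1):                    # backtrack the most probable sequence
            path.insert(0, int(back[t, path[0]]))
        print([actions[i] for i in path])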

  10. Probability-Based Design Criteria of the ASCE 7 Tsunami Loads and Effects Provisions (Invited)

    NASA Astrophysics Data System (ADS)

    Chock, G.

    2013-12-01

    Mitigation of tsunami risk requires a combination of emergency preparedness for evacuation and structural resilience of the critical facilities, infrastructure, and key resources necessary for immediate response and for economic and social recovery. Critical facilities include emergency response, medical, tsunami refuges and shelters, ports and harbors, lifelines, transportation, telecommunications, power, financial institutions, and major industrial/commercial facilities. The Tsunami Loads and Effects Subcommittee of the ASCE/SEI 7 Standards Committee is developing a proposed new Chapter 6 - Tsunami Loads and Effects for the 2016 edition of the ASCE 7 Standard. ASCE 7 provides the minimum design loads and requirements for structures subject to building codes such as the International Building Code utilized in the USA. In this paper we provide a review emphasizing the intent of these new code provisions and explain the design methodology. The ASCE 7 provisions for Tsunami Loads and Effects enable a set of analysis and design methodologies that are consistent with performance-based engineering based on probabilistic criteria. The ASCE 7 Tsunami Loads and Effects chapter will initially be applicable only to the states of Alaska, Washington, Oregon, California, and Hawaii. Ground shaking effects and subsidence from a preceding local offshore Maximum Considered Earthquake will also be considered prior to tsunami arrival for Alaska and the Pacific Northwest states governed by nearby offshore subduction earthquakes. For national tsunami design provisions to achieve a consistent reliability standard of structural performance for community resilience, a new generation of tsunami inundation hazard maps for design is required. The lesson of recent tsunamis is that historical records alone do not provide a sufficient measure of the potential heights of future tsunamis. Engineering design must consider the occurrence of events greater than

  11. Computing posterior probabilities for score-based alignments using ppALIGN.

    PubMed

    Wolfsheimer, Stefan; Hartmann, Alexander; Rabus, Ralf; Nuel, Gregory

    2012-01-01

    Score-based pairwise alignments are widely used in bioinformatics, in particular with molecular database search tools such as the BLAST family. Due to sophisticated heuristics, such algorithms are usually fast, but the underlying scoring model unfortunately lacks a statistical description of the reliability of the reported alignments. In particular, close to gaps and in low-score or low-complexity regions, a huge number of alternative alignments arise, which decreases the certainty of the alignment. ppALIGN is a software package that uses hidden Markov model techniques to compute the position-wise reliability of score-based pairwise alignments of DNA or protein sequences. The design of the model allows for a direct connection between the scoring function and the parameters of the probabilistic model. For this reason it is suitable for analyzing the outcomes of popular score-based aligners and search tools without having to choose a complicated set of parameters. By contrast, our program only requires the classical score parameters (the scoring function and gap costs). The package comes with a library written in C++, a standalone program for user-defined alignments (ppALIGN) and another program (ppBLAST) which can process a complete result set of BLAST. The main algorithms exhibit an essentially linear time complexity (in the alignment lengths), and they are hence suitable for on-line computations. We have also included alternative decoding algorithms to provide alternative alignments. ppALIGN is a fast program/library that helps detect and quantify questionable regions in pairwise alignments. Due to its structure and input/output interface, it can be connected to other post-processing tools. Empirically, we illustrate its usefulness in terms of correctly predicted reliable regions for sequences generated using the ROSE model for sequence evolution, and identify sensor-specific regions in the denitrifying betaproteobacterium Aromatoleum aromaticum. PMID

  12. Urban seismic risk assessment: statistical repair cost data and probable structural losses based on damage scenario—correlation analysis

    NASA Astrophysics Data System (ADS)

    Eleftheriadou, Anastasia K.; Baltzopoulou, Aikaterini D.; Karabinis, Athanasios I.

    2016-06-01

    The current seismic risk assessment is based on two discrete approaches, actual and probable, with the produced results validated against each other afterwards. In the first part of this research, the seismic risk is evaluated from the available data on the mean statistical repair/strengthening or replacement cost for the total number of damaged structures (180,427 buildings) after the 7/9/1999 Parnitha (Athens) earthquake. The actual evaluated seismic risk is then compared to the estimated probable structural losses, presented in the second part of the paper, based on a damage scenario for the same earthquake. The applied damage scenario is based on recently developed damage probability matrices (DPMs) from the Athens (Greece) damage database. The seismic risk estimation refers to 750,085 buildings situated in the extended urban region of Athens. The building exposure is categorized into five typical structural types and represents 18.80% of the entire building stock in Greece. The latter information is provided by the National Statistics Service of Greece (NSSG) according to the 2000-2001 census. The seismic input is characterized by the ratio a_g/a_o, where a_g is the regional peak ground acceleration (PGA), evaluated from the earlier estimated research macroseismic intensities, and a_o is the PGA according to the hazard map of the 2003 Greek Seismic Code. Finally, the collected financial data derived from the different national services responsible for post-earthquake crisis management, concerning the repair/strengthening or replacement costs and other categories of costs for the rehabilitation of earthquake victims (construction and operation of settlements for the earthquake homeless, rent support, demolitions, shorings), are used to determine the final total seismic risk factor.

  13. Power optimization of chemically driven heat engine based on first and second order reaction kinetic theory and probability theory

    NASA Astrophysics Data System (ADS)

    Zhang, Lei; Chen, Lingen; Sun, Fengrui

    2016-03-01

    The finite-time thermodynamic method based on probability analysis can describe the various performance parameters of thermodynamic systems more accurately. Based on the relation between the optimal efficiency and power output of a generalized Carnot heat engine with a finite high-temperature heat reservoir (heat source), an infinite low-temperature heat reservoir (heat sink), and heat transfer as the only irreversibility, this paper studies the power optimization of a chemically driven heat engine based on first- and second-order reaction kinetic theory. It puts forward a model of a coupled heat engine that can be run periodically and obtains the effects of the finite-time thermodynamic characteristics of the coupling between the chemical reaction and the heat engine on the power optimization. The results show that the first-order reaction kinetics model can use fuel more effectively and can provide the heat engine with a higher-temperature heat source to increase its power output. Moreover, the power fluctuation bounds of the chemically driven heat engine are obtained using the probability analysis method. The results may provide guidelines for the characteristic analysis and power optimization of chemically driven heat engines.

  14. EUPDF: An Eulerian-Based Monte Carlo Probability Density Function (PDF) Solver. User's Manual

    NASA Technical Reports Server (NTRS)

    Raju, M. S.

    1998-01-01

    EUPDF is an Eulerian-based Monte Carlo PDF solver developed for application with sprays, combustion, parallel computing and unstructured grids. It is designed to be massively parallel and could easily be coupled with any existing gas-phase flow and spray solvers. The solver accommodates the use of an unstructured mesh with mixed elements of either triangular, quadrilateral, and/or tetrahedral type. The manual provides the user with the coding required to couple the PDF code to any given flow code and a basic understanding of the EUPDF code structure as well as the models involved in the PDF formulation. The source code of EUPDF will be available with the release of the National Combustion Code (NCC) as a complete package.

  15. Computing elastic‐rebound‐motivated earthquake probabilities in unsegmented fault models: a new methodology supported by physics‐based simulators

    USGS Publications Warehouse

    Field, Edward H.

    2015-01-01

    A methodology is presented for computing elastic‐rebound‐based probabilities in an unsegmented fault or fault system, which involves computing along‐fault averages of renewal‐model parameters. The approach is less biased and more self‐consistent than a logical extension of that applied most recently for multisegment ruptures in California. It also enables the application of magnitude‐dependent aperiodicity values, which the previous approach does not. Monte Carlo simulations are used to analyze long‐term system behavior, which is generally found to be consistent with that of physics‐based earthquake simulators. Results cast doubt that recurrence‐interval distributions at points on faults look anything like traditionally applied renewal models, a fact that should be considered when interpreting paleoseismic data. We avoid such assumptions by changing the "probability of what" question (from offset at a point to the occurrence of a rupture, assuming it is the next event to occur). The new methodology is simple, although not perfect in terms of recovering long‐term rates in Monte Carlo simulations. It represents a reasonable, improved way to represent first‐order elastic‐rebound predictability, assuming it is there in the first place, and for a system that clearly exhibits other unmodeled complexities, such as aftershock triggering.
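
    A hedged illustration of the renewal-probability calculation (not the methodology's own code): a Brownian Passage Time model, parameterized by an along-fault average recurrence interval and an aperiodicity, gives the conditional probability of rupture in a forecast window given the open interval since the last event. All numbers are placeholders.

        from scipy.stats import invgauss

        def bpt_conditional_prob(mean_ri, aperiodicity, time_since_last, horizon):
            """P(rupture within `horizon` years | open interval of `time_since_last` years)."""
            scale = mean_ri / aperiodicity**2               # maps (mean, aperiodicity) to scipy's invgauss
            dist = invgauss(mu=aperiodicity**2, scale=scale)
            survival = dist.sf(time_since_last)
            return (dist.cdf(time_since_last + horizon) - dist.cdf(time_since_last)) / survival

        # Example: 200-yr average recurrence, aperiodicity 0.3, 150 yr open interval, 30-yr forecast.
        print(round(bpt_conditional_prob(200.0, 0.3, 150.0, 30.0), 3))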

  16. PON-mt-tRNA: a multifactorial probability-based method for classification of mitochondrial tRNA variations

    PubMed Central

    Niroula, Abhishek; Vihinen, Mauno

    2016-01-01

    Transfer RNAs (tRNAs) are essential for encoding the transcribed genetic information from DNA into proteins. Variations in the human tRNAs are involved in diverse clinical phenotypes. Interestingly, all pathogenic variations in tRNAs are located in mitochondrial tRNAs (mt-tRNAs). Therefore, it is crucial to identify pathogenic variations in mt-tRNAs for disease diagnosis and proper treatment. We collected mt-tRNA variations using a classification based on evidence from several sources and used the data to develop a multifactorial probability-based prediction method, PON-mt-tRNA, for classification of mt-tRNA single nucleotide substitutions. We integrated a machine learning-based predictor and an evidence-based likelihood ratio for pathogenicity using evidence of segregation, biochemistry and histochemistry to predict the posterior probability of pathogenicity of variants. The accuracy and Matthews correlation coefficient (MCC) of PON-mt-tRNA are 1.00 and 0.99, respectively. In the absence of evidence from segregation, biochemistry and histochemistry, PON-mt-tRNA classifies variations based on the machine learning method with an accuracy and MCC of 0.69 and 0.39, respectively. We classified all possible single nucleotide substitutions in all human mt-tRNAs using PON-mt-tRNA. The variations in the loops are more often tolerated compared to the variations in stems. The anticodon loop contains comparatively more predicted pathogenic variations than the other loops. PON-mt-tRNA is available at http://structure.bmc.lu.se/PON-mt-tRNA/. PMID:26843426
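
    A sketch of the general multifactorial idea under stated assumptions: a prior probability of pathogenicity (here a placeholder for the machine-learning prediction) is converted to odds, multiplied by independent evidence-based likelihood ratios (e.g., segregation, biochemistry, histochemistry), and converted back to a posterior probability. The numeric values are invented, not PON-mt-tRNA's calibrated ratios.

        def posterior_probability(prior, likelihood_ratios):
            """Combine a prior probability of pathogenicity with independent likelihood ratios."""
            odds = prior / (1.0 - prior)
            for lr in likelihood_ratios:          # e.g. segregation, biochemistry, histochemistry evidence
                odds *= lr
            return odds / (1.0 + odds)

        prior = 0.35                              # hypothetical machine-learning-based prior
        print(round(posterior_probability(prior, [8.0, 3.5, 0.9]), 3))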

  17. Estimation of the probability of exposure to metalworking fluids in a population-based case-control study

    PubMed Central

    Park, Dong-Uk; Colt, Joanne S.; Baris, Dalsu; Schwenn, Molly; Karagas, Margaret R.; Armenti, Karla R.; Johnson, Alison; Silverman, Debra T; Stewart, Patricia A

    2014-01-01

    We describe here an approach for estimating the probability that study subjects were exposed to metalworking fluids (MWFs) in a population-based case-control study of bladder cancer. Study subject reports on the frequency of machining and use of specific MWFs (straight, soluble, and synthetic/semi-synthetic) were used to estimate exposure probability when available. Those reports also were used to develop estimates for job groups, which were then applied to jobs without MWF reports. Estimates using both cases and controls and controls only were developed. The prevalence of machining varied substantially across job groups (10-90%), with the greatest percentage of jobs that machined being reported by machinists and tool and die workers. Reports of straight and soluble MWF use were fairly consistent across job groups (generally, 50-70%). Synthetic MWF use was lower (13-45%). There was little difference in reports by cases and controls vs. controls only. Approximately, 1% of the entire study population was assessed as definitely exposed to straight or soluble fluids in contrast to 0.2% definitely exposed to synthetic/semi-synthetics. A comparison between the reported use of the MWFs and the US production levels by decade found high correlations (r generally >0.7). Overall, the method described here is likely to have provided a systematic and reliable ranking that better reflects the variability of exposure to three types of MWFs than approaches applied in the past. PMID:25256317

  18. EFFECT OF CHLORIDE AND SULFATE CONCENTRATION ON PROBABILITY BASED CORROSION CONTROL FOR LIQUID WASTE TANKS - PART IV

    SciTech Connect

    Hoffman, E.

    2012-08-23

    A series of cyclic potentiodynamic polarization tests was performed on samples of A537 carbon steel in support of a probability-based approach to evaluate the effect of chloride and sulfate on corrosion susceptibility. Testing solutions were chosen to build on previous experimental results from FY07, FY08, FY09 and FY10 and to systematically evaluate the influence of the secondary aggressive species, chloride and sulfate. The FY11 results suggest that evaluating the combined effect of all aggressive species (nitrate, chloride, and sulfate) provides a consistent response for determining corrosion susceptibility. The results of this work emphasize the importance of concentration limits not only for nitrate but also for chloride and sulfate.

  19. Blocking Probability of a Preemption-Based Bandwidth-Allocation Scheme for Service Differentiation in OBS Networks

    NASA Astrophysics Data System (ADS)

    Phuritatkul, Jumpot; Ji, Yusheng; Zhang, Yongbing

    2006-08-01

    For the next-generation optical Internet, optical burst switching (OBS) is considered a promising solution to exploit the capacity provided by wavelength-division-multiplexing technology. In this paper, the authors analyze a preemption-based bandwidth-allocation (PBA) scheme for supporting service differentiation in OBS networks. They first present a mathematical analysis of the burst blocking probability (BBP) for a general probabilistic wavelength-preemption algorithm. The BBP of a newly arriving burst for a K-channel, N-class system is derived. They then apply this model to PBA. The results of the analytical loss model are investigated and compared with simulations. The simulation results validate the analytical model and show that the BBP can be controlled for different service classes with the PBA scheme.
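
    For orientation only, a baseline sketch rather than the paper's preemption analysis: the Erlang-B recursion gives the blocking probability of a single K-wavelength link without service classes or preemption, the quantity the PBA model refines with class-dependent preemption.

        def erlang_b(offered_load, channels):
            """Recursive Erlang-B blocking probability; stable for moderate channel counts."""
            b = 1.0
            for k in range(1, channels + 1):
                b = offered_load * b / (k + offered_load * b)
            return b

        for load in (20.0, 30.0, 40.0):           # offered load in Erlangs
            print(load, round(erlang_b(load, channels=32), 4))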

  20. An 8-channel neural spike processing IC with unsupervised closed-loop control based on spiking probability estimation.

    PubMed

    Wu, Tong; Yang, Zhi

    2014-01-01

    This paper presents a neural spike processing IC for simultaneous spike detection, alignment, and transmission on 8 recording channels with unsupervised closed-loop control. In this work, spikes are detected according to online estimated spiking probability maps, which reliably predict the possibility of spike occurrence. The closed-loop control is made possible by estimating firing rates based on the alignment results and turning channels on/off individually and automatically. The 8-channel neural spike processing IC, implemented in a 0.13 μm CMOS process, dissipates between 36 μW and 54.4 μW per channel at a 1.2 V supply. The chip also achieves a 380× data rate reduction on the tested in vivo data, allowing easy integration with wireless data transmission modules. PMID:25571180

  1. Small scale photo probability sampling and vegetation classification in southeast Arizona as an ecological base for resource inventory. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Johnson, J. R. (Principal Investigator)

    1974-01-01

    The author has identified the following significant results. The broad scale vegetation classification was developed for a 3,200 sq mile area in southeastern Arizona. The 31 vegetation types were derived from association tables which contained information taken at about 500 ground sites. The classification provided an information base that was suitable for use with small scale photography. A procedure was developed and tested for objectively comparing photo images. The procedure consisted of two parts, image groupability testing and image complexity testing. The Apollo and ERTS photos were compared for relative suitability as first stage stratification bases in two stage proportional probability sampling. High altitude photography was used in common at the second stage.

  2. Estimation of the probability of exposure to machining fluids in a population-based case-control study.

    PubMed

    Park, Dong-Uk; Colt, Joanne S; Baris, Dalsu; Schwenn, Molly; Karagas, Margaret R; Armenti, Karla R; Johnson, Alison; Silverman, Debra T; Stewart, Patricia A

    2014-01-01

    We describe an approach for estimating the probability that study subjects were exposed to metalworking fluids (MWFs) in a population-based case-control study of bladder cancer. Study subject reports on the frequency of machining and use of specific MWFs (straight, soluble, and synthetic/semi-synthetic) were used to estimate exposure probability when available. Those reports also were used to develop estimates for job groups, which were then applied to jobs without MWF reports. Estimates using both cases and controls and controls only were developed. The prevalence of machining varied substantially across job groups (0.1-0.9), with the greatest percentage of jobs that machined being reported by machinists and tool and die workers. Reports of straight and soluble MWF use were fairly consistent across job groups (generally 50-70%). Synthetic MWF use was lower (13-45%). There was little difference in reports by cases and controls vs. controls only. Approximately 1% of the entire study population was assessed as definitely exposed to straight or soluble fluids in contrast to 0.2% definitely exposed to synthetic/semi-synthetics. A comparison between the reported use of the MWFs and U.S. production levels found high correlations (r generally >0.7). Overall, the method described here is likely to have provided a systematic and reliable ranking that better reflects the variability of exposure to three types of MWFs than approaches applied in the past. [Supplementary materials are available for this article. Go to the publisher's online edition of Journal of Occupational and Environmental Hygiene for the following free supplemental resources: a list of keywords in the occupational histories that were used to link study subjects to the metalworking fluids (MWFs) modules; recommendations from the literature on selection of MWFs based on type of machining operation, the metal being machined and decade; popular additives to MWFs; the number and proportion of controls who

  3. GTNEUT: A code for the calculation of neutral particle transport in plasmas based on the Transmission and Escape Probability method

    NASA Astrophysics Data System (ADS)

    Mandrekas, John

    2004-08-01

    GTNEUT is a two-dimensional code for the calculation of the transport of neutral particles in fusion plasmas. It is based on the Transmission and Escape Probabilities (TEP) method and can be considered a computationally efficient alternative to traditional Monte Carlo methods. The code has been benchmarked extensively against Monte Carlo and has been used to model the distribution of neutrals in fusion experiments. Program summary. Title of program: GTNEUT. Catalogue identifier: ADTX. Program obtainable from: CPC Program Library, Queen's University of Belfast, N. Ireland. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/ADTX. Computer for which the program is designed and others on which it has been tested: the program was developed on a SUN Ultra 10 workstation and has been tested on other Unix workstations and PCs. Operating systems or monitors under which the program has been tested: Solaris 8, 9; HP-UX 11i; Linux Red Hat v8.0; Windows NT/2000/XP. Programming language used: Fortran 77. Memory required to execute with typical data: 6 219 388 bytes. No. of bits in a word: 32. No. of processors used: 1. Has the code been vectorized or parallelized?: No. No. of bytes in distributed program, including test data, etc.: 300 709. No. of lines in distributed program, including test data, etc.: 17 365. Distribution format: compressed tar gzip file. Keywords: neutral transport in plasmas, escape probability methods. Nature of physical problem: this code calculates the transport of neutral particles in thermonuclear plasmas in two-dimensional geometric configurations. Method of solution: the code is based on the Transmission and Escape Probability (TEP) methodology [1], which is part of the family of integral transport methods for neutral particles and neutrons. The resulting linear system of equations is solved by standard direct linear system solvers (sparse and non-sparse versions are included). Restrictions on the complexity of the problem: The current version of the code can

  5. A scan statistic for binary outcome based on hypergeometric probability model, with an application to detecting spatial clusters of Japanese encephalitis.

    PubMed

    Zhao, Xing; Zhou, Xiao-Hua; Feng, Zijian; Guo, Pengfei; He, Hongyan; Zhang, Tao; Duan, Lei; Li, Xiaosong

    2013-01-01

    As a useful tool for the geographical cluster detection of events, the spatial scan statistic is widely applied in many fields and plays an increasingly important role. The classic version of the spatial scan statistic for binary outcomes, developed by Kulldorff, is based on the Bernoulli or the Poisson probability model. In this paper, we apply the Hypergeometric probability model to construct the likelihood function under the null hypothesis. Compared with existing methods, the likelihood function under the null hypothesis provides an alternative, indirect way of identifying the potential cluster, with the test statistic being the extreme value of that likelihood function. As in Kulldorff's methods, we adopt a Monte Carlo test to assess significance. Both methods are applied to detect spatial clusters of Japanese encephalitis in Sichuan province, China, in 2009, and the detected clusters are identical. A simulation on independent benchmark data indicates that the test statistic based on the Hypergeometric model outperforms Kulldorff's statistics for clusters of high population density or large size; otherwise Kulldorff's statistics are superior.
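
    A simplified sketch of the inference step for a single, fixed candidate window (a full spatial scan searches over many windows): the window is scored with a hypergeometric log-likelihood under the null of no clustering, and significance is assessed by Monte Carlo randomization of the case locations. The counts below are invented.

        import numpy as np
        from scipy.stats import hypergeom

        rng = np.random.default_rng(7)
        pop, cases = 10000, 120                   # population and total cases in the study region
        window_pop, window_cases = 800, 28        # population and observed cases inside the window

        # Test statistic: hypergeometric log-likelihood of the window count under the null.
        observed = hypergeom.logpmf(window_cases, pop, cases, window_pop)
        # Monte Carlo: redistribute the cases at random and recompute the statistic.
        null = hypergeom.logpmf(rng.hypergeometric(cases, pop - cases, window_pop, size=999),
                                pop, cases, window_pop)
        p_value = (1 + np.sum(null <= observed)) / 1000.0   # extreme = unusually low null likelihood
        print(round(p_value, 3))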

  6. A multifactorial likelihood model for MMR gene variant classification incorporating probabilities based on sequence bioinformatics and tumor characteristics: a report from the Colon Cancer Family Registry.

    PubMed

    Thompson, Bryony A; Goldgar, David E; Paterson, Carol; Clendenning, Mark; Walters, Rhiannon; Arnold, Sven; Parsons, Michael T; Walsh, Michael D; Gallinger, Steven; Haile, Robert W; Hopper, John L; Jenkins, Mark A; Lemarchand, Loic; Lindor, Noralane M; Newcomb, Polly A; Thibodeau, Stephen N; Young, Joanne P; Buchanan, Daniel D; Tavtigian, Sean V; Spurdle, Amanda B

    2013-01-01

    Mismatch repair (MMR) gene sequence variants of uncertain clinical significance are often identified in suspected Lynch syndrome families, and this constitutes a challenge for both researchers and clinicians. Multifactorial likelihood model approaches provide a quantitative measure of MMR variant pathogenicity, but first require input of likelihood ratios (LRs) for different MMR variation-associated characteristics from appropriate, well-characterized reference datasets. Microsatellite instability (MSI) and somatic BRAF tumor data for unselected colorectal cancer probands of known pathogenic variant status were used to derive LRs for tumor characteristics using the Colon Cancer Family Registry (CFR) resource. These tumor LRs were combined with variant segregation within families, and estimates of prior probability of pathogenicity based on sequence conservation and position, to analyze 44 unclassified variants identified initially in Australasian Colon CFR families. In addition, in vitro splicing analyses were conducted on the subset of variants based on bioinformatic splicing predictions. The LR in favor of pathogenicity was estimated to be ~12-fold for a colorectal tumor with a BRAF mutation-negative MSI-H phenotype. For 31 of the 44 variants, the posterior probabilities of pathogenicity were such that altered clinical management would be indicated. Our findings provide a working multifactorial likelihood model for classification that carefully considers mode of ascertainment for gene testing.

  7. Guided waves based SHM systems for composites structural elements: statistical analyses finalized at probability of detection definition and assessment

    NASA Astrophysics Data System (ADS)

    Monaco, E.; Memmolo, V.; Ricci, F.; Boffa, N. D.; Maio, L.

    2015-03-01

    Maintenance approaches based on sensorised structures and Structural Health Monitoring (SHM) systems have represented one of the most promising innovations in the field of aerostructures for many years, especially where composite materials (fiber-reinforced resins) are concerned. Layered materials still suffer from drastic reductions of the maximum allowable stress values during the design phase, as well as from costly and recurrent inspections during the life-cycle phase, which prevent their structural and economic potential from being fully exploited in today's aircraft. These penalizing measures are necessary mainly to account for the presence of undetected hidden flaws within the layered sequence (delaminations) or in bonded areas (partial disbonding). In order to relax design and maintenance constraints, a system based on sensors permanently installed on the structure to detect and locate eventual flaws can be considered (an SHM system), once its effectiveness and reliability have been statistically demonstrated via a rigorous Probability of Detection (POD) function definition and evaluation. This paper presents an experimental approach with a statistical procedure for the evaluation of the detection threshold of a guided-waves-based SHM system oriented to delamination detection on a typical composite layered wing panel. The experimental tests are mostly oriented to characterizing the statistical distribution of measurements and damage metrics, as well as to characterizing the system detection capability using this approach. It is not possible to numerically substitute the part of the experimental tests aimed at POD where the noise in the system response is crucial. Results of the experiments are presented in the paper and analyzed.
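
    A hedged sketch of a standard probability-of-detection workflow (hit/miss logistic regression against flaw size), not the statistical procedure of the paper: the damage sizes and guided-wave detection outcomes below are synthetic, and a90 denotes the size detected with 90% probability.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(8)
        size_mm = rng.uniform(5, 40, 300)                        # delamination sizes (synthetic)
        p_true = 1.0 / (1.0 + np.exp(-(size_mm - 15.0) / 4.0))   # latent detection characteristic
        hit = rng.binomial(1, p_true)                            # guided-wave hit/miss outcomes

        pod = LogisticRegression().fit(size_mm.reshape(-1, 1), hit)
        grid = np.linspace(5, 40, 100).reshape(-1, 1)
        curve = pod.predict_proba(grid)[:, 1]                    # estimated POD curve
        a90 = grid[np.searchsorted(curve, 0.90)][0]              # size detected with 90% probability
        print(round(float(a90), 1), "mm")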

  8. Statistical analysis of the induced Basel 2006 earthquake sequence: introducing a probability-based monitoring approach for Enhanced Geothermal Systems

    NASA Astrophysics Data System (ADS)

    Bachmann, C. E.; Wiemer, S.; Woessner, J.; Hainzl, S.

    2011-08-01

    Geothermal energy is becoming an important clean energy source; however, the stimulation of a reservoir for an Enhanced Geothermal System (EGS) is associated with seismic risk due to induced seismicity. Seismicity occurring due to water injection at depth has to be well recorded and monitored. To mitigate the seismic risk of a damaging event, an appropriate alarm system needs to be in place for each individual experiment. In recent experiments, the so-called traffic-light alarm system, based on public response, local magnitude and peak ground velocity, was used. We aim to improve the pre-defined alarm system by introducing a probability-based approach; we retrospectively model the ongoing seismicity in real time with multiple statistical forecast models and then translate the forecasts into seismic hazard in terms of the probabilities of exceeding a ground-motion intensity level. One class of models accounts for the water injection rate, the main parameter that can be controlled by the operators during an experiment. By translating the models into time-varying probabilities of exceeding various intensity levels, we provide tools that are well understood by decision makers and can be used to set non-exceedance thresholds during a reservoir stimulation; this, however, remains an entrepreneurial or political decision of the responsible project coordinators. We introduce forecast models based on the data set of an EGS experiment in the city of Basel. Between 2006 December 2 and 8, approximately 11 500 m3 of water was injected into a 5-km-deep well at high pressures. A six-sensor borehole array was installed by the company Geothermal Explorers Limited (GEL) at depths between 300 and 2700 m around the well to monitor the induced seismicity. The network recorded approximately 11 200 events during the injection phase, more than 3500 of which were located. With the traffic-light system, actions were implemented after an ML 2.7 event, the water injection was
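
    An illustrative calculation only (the study combines several forecast model classes): given a forecast rate of induced events above a reference magnitude and a Gutenberg-Richter b-value, the Poisson probability of at least one event at or above a magnitude threshold in the next time window follows directly. The rate and b-value below are placeholders.

        import numpy as np

        def prob_exceedance(rate_m_ge_1, b_value, magnitude, window_days):
            """Poisson probability of >= 1 event with M >= `magnitude` within `window_days`."""
            rate_m = rate_m_ge_1 * 10 ** (-b_value * (magnitude - 1.0))   # GR scaling from the M>=1 rate
            return 1.0 - np.exp(-rate_m * window_days)

        print(round(prob_exceedance(rate_m_ge_1=50.0, b_value=1.5, magnitude=2.7, window_days=1.0), 3))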

  9. Abstract Models of Probability

    NASA Astrophysics Data System (ADS)

    Maximov, V. M.

    2001-12-01

    Probability theory presents a mathematical formalization of intuitive ideas of independent events and of probability as a measure of randomness. It is based on axioms 1-5 of A.N. Kolmogorov [1] and their generalizations [2]. Different formalized refinements have been proposed for such notions as events, independence, random value, etc. [2,3], whereas the measure of randomness, i.e. numbers from [0,1], remained unchanged. To be precise, we mention some attempts to generalize probability theory with negative probabilities [4]. On another side, physicists have tried to use negative and even complex values of probability to explain some paradoxes in quantum mechanics [5,6,7]. Only recently has the necessity of formalizing quantum mechanics and its foundations [8] led to the construction of p-adic probabilities [9,10,11], which essentially extended our concept of probability and randomness. Therefore, a natural question arises: how to describe algebraic structures whose elements can be used as a measure of randomness. As a consequence, a necessity arises to define the types of randomness corresponding to every such algebraic structure. Possibly, this leads to another concept of randomness that has a nature different from the combinatorial-metric conception of Kolmogorov. Apparently, a mismatch between the real type of randomness underlying some experimental data and the model of randomness used for data processing leads to paradoxes [12]. An algebraic structure whose elements can be used to estimate some randomness will be called a probability set Φ. Naturally, the elements of Φ are the probabilities.

  10. FINAL PROJECT REPORT DOE Early Career Principal Investigator Program Project Title: Developing New Mathematical Models for Multiphase Flows Based on a Fundamental Probability Density Function Approach

    SciTech Connect

    Shankar Subramaniam

    2009-04-01

    This final project report summarizes progress made towards the objectives described in the proposal entitled “Developing New Mathematical Models for Multiphase Flows Based on a Fundamental Probability Density Function Approach”. Substantial progress has been made in theory, modeling and numerical simulation of turbulent multiphase flows. The consistent mathematical framework based on probability density functions is described. New models are proposed for turbulent particle-laden flows and sprays.

  11. Applying Probability Theory for the Quality Assessment of a Wildfire Spread Prediction Framework Based on Genetic Algorithms

    PubMed Central

    Cencerrado, Andrés; Cortés, Ana; Margalef, Tomàs

    2013-01-01

    This work presents a framework for assessing how the existing constraints at the time of attending an ongoing forest fire affect simulation results, both in terms of quality (accuracy) obtained and the time needed to make a decision. In the wildfire spread simulation and prediction area, it is essential to properly exploit the computational power offered by new computing advances. For this purpose, we rely on a two-stage prediction process to enhance the quality of traditional predictions, taking advantage of parallel computing. This strategy is based on an adjustment stage which is carried out by a well-known evolutionary technique: Genetic Algorithms. The core of this framework is evaluated according to the probability theory principles. Thus, a strong statistical study is presented and oriented towards the characterization of such an adjustment technique in order to help the operation managers deal with the two aspects previously mentioned: time and quality. The experimental work in this paper is based on a region in Spain which is one of the most prone to forest fires: El Cap de Creus. PMID:24453898

  12. Applying probability theory for the quality assessment of a wildfire spread prediction framework based on genetic algorithms.

    PubMed

    Cencerrado, Andrés; Cortés, Ana; Margalef, Tomàs

    2013-01-01

    This work presents a framework for assessing how the existing constraints at the time of attending an ongoing forest fire affect simulation results, both in terms of quality (accuracy) obtained and the time needed to make a decision. In the wildfire spread simulation and prediction area, it is essential to properly exploit the computational power offered by new computing advances. For this purpose, we rely on a two-stage prediction process to enhance the quality of traditional predictions, taking advantage of parallel computing. This strategy is based on an adjustment stage which is carried out by a well-known evolutionary technique: Genetic Algorithms. The core of this framework is evaluated according to the probability theory principles. Thus, a strong statistical study is presented and oriented towards the characterization of such an adjustment technique in order to help the operation managers deal with the two aspects previously mentioned: time and quality. The experimental work in this paper is based on a region in Spain which is one of the most prone to forest fires: El Cap de Creus. PMID:24453898

  13. Situational Lightning Climatologies for Central Florida: Phase IV: Central Florida Flow Regime Based Climatologies of Lightning Probabilities

    NASA Technical Reports Server (NTRS)

    Bauman, William H., III

    2009-01-01

    The threat of lightning is a daily concern during the warm season in Florida. Research has revealed distinct spatial and temporal distributions of lightning occurrence that are strongly influenced by large-scale atmospheric flow regimes. Previously, the Applied Meteorology Unit (AMU) calculated gridded lightning climatologies based on seven flow regimes over Florida for 1-, 3- and 6-hr intervals in 5-, 10-, 20-, and 30-NM diameter range rings around the Shuttle Landing Facility (SLF) and eight other airfields in the National Weather Service in Melbourne (NWS MLB) county warning area (CWA). In this update to the work, the AMU recalculated the lightning climatologies using individual lightning strike data to improve the accuracy of the climatologies. The AMU included all data regardless of flow regime as one of the stratifications, added monthly stratifications, added three years of data to the period of record, and used modified flow regimes based on work from the AMU's Objective Lightning Probability Forecast Tool, Phase II. The AMU made these changes so the 5- and 10-NM radius range rings are consistent with the aviation forecast requirements at NWS MLB, while the 20- and 30-NM radius range rings at the SLF assist the Spaceflight Meteorology Group in making forecasts for weather Flight Rule violations during Shuttle landings. The AMU also updated the graphical user interface with the new data.
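
    At its core, such a flow-regime-stratified climatology is a set of conditional relative frequencies. The sketch below assumes a hypothetical daily table with the listed columns; it is not the AMU's actual data format or tool.

```python
import pandas as pd

def lightning_climatology(daily: pd.DataFrame) -> pd.Series:
    """Empirical probability of at least one lightning strike, stratified by
    flow regime, month, range ring and time interval.
    Expects one row per day/ring/interval with columns:
      'flow_regime', 'month', 'range_ring_nm', 'interval_hr', 'strikes' (count)."""
    daily = daily.assign(lightning=daily["strikes"] > 0)
    return (daily
            .groupby(["flow_regime", "month", "range_ring_nm", "interval_hr"])["lightning"]
            .mean())  # fraction of days with lightning = climatological probability
```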

  14. Effects of Parent-Administered, Home-Based, High-Probability Request Sequences on Compliance by Children with Developmental Disabilities

    ERIC Educational Resources Information Center

    Humm, Stephen P.; Blampied, Neville M.; Liberty, Kathleen A.

    2005-01-01

    In the high-probability request sequence (high-p) procedure, a requester presents a rapid sequence of requests a child is known to be likely to comply with, followed by a request to perform a response for which there is a low probability of compliance (low-p request). To extend previous research from institutional and research settings to home…

  15. Influence of anthropogenic activities on PAHs in sediments in a significant gulf of low-latitude developing regions, the Beibu Gulf, South China Sea: distribution, sources, inventory and probability risk.

    PubMed

    Li, Pingyang; Xue, Rui; Wang, Yinghui; Zhang, Ruijie; Zhang, Gan

    2015-01-15

    Fifteen polycyclic aromatic hydrocarbons (PAHs) in 41 surface sediment samples and a sediment core (50 cm) from the Beibu Gulf, a significant low-latitude developing gulf, were analyzed. PAH concentrations were 3.01-388 ng g(-1) (mean 95.5 ng g(-1)) in the surface sediments and 10.5-87.1 ng g(-1) (average 41.1 ng g(-1)) in the sediment core. Source apportionment indicated that the PAHs were generated from coke production and vehicular emissions (39.4%), coal and biomass combustion (35.8%), and petrogenic sources (24.8%). PAHs were mainly concentrated in the industrialized and urbanized regions and the harbor, and were transported by atmospheric deposition to the marine matrix. The mass inventory (1.57-2.62 t) and probability risk assessment showed that the sediments here serve as an important PAH reservoir but pose a low PAH risk. Unlike oil and natural gas in developed regions, coal combustion has remained a significant energy consumption pattern in this developing region for the past 30 years (56 ± 5%).

  16. A multiscale finite element model validation method of composite cable-stayed bridge based on Probability Box theory

    NASA Astrophysics Data System (ADS)

    Zhong, Rumian; Zong, Zhouhong; Niu, Jie; Liu, Qiqi; Zheng, Peijuan

    2016-05-01

    Modeling and simulation are routinely implemented to predict the behavior of complex structures. These tools powerfully unite theoretical foundations, numerical models and experimental data, all of which include associated uncertainties and errors. A new methodology for multi-scale finite element (FE) model validation is proposed in this paper. The method is based on a two-step updating procedure, a novel approach to obtaining the coupling parameters in the gluing sub-regions of a multi-scale FE model, and on Probability Box (P-box) theory, which provides lower and upper bounds for quantifying and propagating the uncertainty of structural parameters. The structural health monitoring data of Guanhe Bridge, a long-span composite cable-stayed bridge, and Monte Carlo simulation were used to verify the proposed method. The results show satisfactory accuracy: the overlap ratio index of each modal frequency is over 89% with small average absolute relative errors, and the CDF of the fitted normal distribution coincides well with the measured frequencies of Guanhe Bridge. The validated multiscale FE model may be further used in structural damage prognosis and safety prognosis.

  17. A biology-driven receptor model for daily pollen allergy risk in Korea based on Weibull probability density function

    NASA Astrophysics Data System (ADS)

    Kim, Kyu Rang; Kim, Mijin; Choe, Ho-Seong; Han, Mae Ja; Lee, Hye-Rim; Oh, Jae-Won; Kim, Baek-Jo

    2016-07-01

    Pollen is an important cause of respiratory allergic reactions. As individual sanitation has improved, allergy risk has increased, and this trend is expected to continue due to climate change. Atmospheric pollen concentration is highly influenced by weather conditions. Regression analysis and modeling of the relationships between airborne pollen concentrations and weather conditions were performed to analyze and forecast pollen conditions. Traditionally, daily pollen concentration has been estimated using regression models that describe the relationships between observed pollen concentrations and weather conditions. These models were able to forecast daily concentrations at the sites of observation, but lacked broader spatial applicability beyond those sites. To overcome this limitation, an integrated modeling scheme was developed that is designed to represent the underlying processes of pollen production and distribution. A maximum potential for airborne pollen is first determined using the Weibull probability density function. Then, daily pollen concentration is estimated using multiple regression models. Daily risk grade levels are determined based on the risk criteria used in Korea. The mean percentages of agreement between the observed and estimated levels were 81.4-88.2 % and 92.5-98.5 % for oak and Japanese hop pollens, respectively. The new models estimated daily pollen risk more accurately than the original statistical models because of the newly integrated biological response curves. They did not, however, simulate all of the peak concentrations and tended to overestimate the seasonal mean concentration. This issue could be resolved by adding more variables that affect the prevalence and internal maturity of pollen.
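
    For reference, the two-parameter Weibull probability density function used to cap the potential airborne pollen can be evaluated as below; the shape and scale values are purely illustrative, not the parameters fitted for the Korean sites.

```python
import numpy as np

def weibull_pdf(x, shape, scale):
    """Two-parameter Weibull density f(x) = (k/lam) * (x/lam)^(k-1) * exp(-(x/lam)^k) for x >= 0."""
    x = np.asarray(x, dtype=float)
    out = (shape / scale) * (x / scale) ** (shape - 1.0) * np.exp(-((x / scale) ** shape))
    return np.where(x >= 0.0, out, 0.0)

# Illustrative shape/scale only; day-of-season on the x-axis
days = np.arange(0, 60)
potential = weibull_pdf(days, shape=2.2, scale=25.0)
```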

  18. Location and release time identification of pollution point source in river networks based on the Backward Probability Method.

    PubMed

    Ghane, Alireza; Mazaheri, Mehdi; Mohammad Vali Samani, Jamal

    2016-09-15

    The pollution of rivers due to accidental spills is a major threat to the environment and human health. To protect river systems from accidental spills, it is essential to introduce a reliable tool for the identification process. The Backward Probability Method (BPM) is one of the most recommended tools, as it is able to provide information on the prior location and the release time of the pollution. This method was originally developed and employed in groundwater pollution source identification problems. One objective of this study is to apply this method to identifying the pollution source location and release time in surface waters, mainly in rivers. To accomplish this task, a numerical model is developed based on adjoint analysis. The developed model is then verified using an analytical solution and some real data. The second objective of this study is to extend the method to pollution source identification in river networks. In this regard, a hypothetical test case is considered. In the latter simulations, all of the suspected points are identified using only one backward simulation. The results demonstrated that any of the suspected points determined by the BPM could be a possible pollution source. The proposed approach is accurate and computationally efficient and does not need any simplification of the river geometry and flow. Due to this simplicity, it is highly recommended for practical purposes.

  19. Location and release time identification of pollution point source in river networks based on the Backward Probability Method.

    PubMed

    Ghane, Alireza; Mazaheri, Mehdi; Mohammad Vali Samani, Jamal

    2016-09-15

    The pollution of rivers due to accidental spills is a major threat to the environment and human health. To protect river systems from accidental spills, it is essential to introduce a reliable tool for the identification process. The Backward Probability Method (BPM) is one of the most recommended tools, as it is able to provide information on the prior location and the release time of the pollution. This method was originally developed and employed in groundwater pollution source identification problems. One objective of this study is to apply this method to identifying the pollution source location and release time in surface waters, mainly in rivers. To accomplish this task, a numerical model is developed based on adjoint analysis. The developed model is then verified using an analytical solution and some real data. The second objective of this study is to extend the method to pollution source identification in river networks. In this regard, a hypothetical test case is considered. In the latter simulations, all of the suspected points are identified using only one backward simulation. The results demonstrated that any of the suspected points determined by the BPM could be a possible pollution source. The proposed approach is accurate and computationally efficient and does not need any simplification of the river geometry and flow. Due to this simplicity, it is highly recommended for practical purposes. PMID:27219462

  20. Probability-based estimates of site-specific copper water quality criteria for the Chesapeake Bay, USA.

    PubMed

    Arnold, W Ray; Warren-Hicks, William J

    2007-01-01

    The object of this study was to estimate site- and region-specific dissolved copper criteria for a large embayment, the Chesapeake Bay, USA. The intent is to show the utility of 2 copper saltwater quality site-specific criteria estimation models and associated region-specific criteria selection methods. The criteria estimation models and selection methods are simple, efficient, and cost-effective tools for resource managers. The methods are proposed as potential substitutes for the US Environmental Protection Agency's water effect ratio methods. Dissolved organic carbon data and the copper criteria models were used to produce probability-based estimates of site-specific copper saltwater quality criteria. Site- and date-specific criteria estimations were made for 88 sites (n = 5,296) in the Chesapeake Bay. The average and range of estimated site-specific chronic dissolved copper criteria for the Chesapeake Bay were 7.5 and 5.3 to 16.9 microg Cu/L. The average and range of estimated site-specific acute dissolved copper criteria for the Chesapeake Bay were 11.7 and 8.3 to 26.4 microg Cu/L. The results suggest that applicable national and state copper criteria can increase in much of the Chesapeake Bay and remain protective. Virginia Department of Environmental Quality copper criteria near the mouth of the Chesapeake Bay, however, need to decrease to protect species of equal or greater sensitivity to that of the marine mussel, Mytilus sp.

  1. Maximum Magnitude and Probabilities of Induced Earthquakes in California Geothermal Fields: Applications for a Science-Based Decision Framework

    NASA Astrophysics Data System (ADS)

    Weiser, Deborah Anne

    Induced seismicity is occurring at increasing rates around the country. Brodsky and Lajoie (2013) and others have recognized anthropogenic quakes at a few geothermal fields in California. I use three techniques to assess if there are induced earthquakes in California geothermal fields; there are three sites with clear induced seismicity: Brawley, The Geysers, and Salton Sea. Moderate to strong evidence is found at Casa Diablo, Coso, East Mesa, and Susanville. Little to no evidence is found for Heber and Wendel. I develop a set of tools to reduce or cope with the risk imposed by these earthquakes, and also to address uncertainties through simulations. I test if an earthquake catalog may be bounded by an upper magnitude limit. I address whether the earthquake record during pumping time is consistent with the past earthquake record, or if injection can explain all or some of the earthquakes. I also present ways to assess the probability of future earthquake occurrence based on past records. I summarize current legislation for eight states where induced earthquakes are of concern. Unlike tectonic earthquakes, the hazard from induced earthquakes has the potential to be modified. I discuss direct and indirect mitigation practices. I present a framework with scientific and communication techniques for assessing uncertainty, ultimately allowing more informed decisions to be made.

  2. Tracking a large number of closely spaced objects based on the particle probability hypothesis density filter via optical sensor

    NASA Astrophysics Data System (ADS)

    Lin, Liangkui; Xu, Hui; An, Wei; Sheng, Weidong; Xu, Dan

    2011-11-01

    This paper presents a novel approach to tracking a large number of closely spaced objects (CSO) in image sequences that is based on the particle probability hypothesis density (PHD) filter and multiassignment data association. First, the particle PHD filter is adopted to eliminate most of the clutter and to estimate multitarget states. In the particle PHD filter, a noniterative multitarget estimation technique is introduced to reliably estimate multitarget states, and an improved birth particle sampling scheme is presented to effectively acquire targets among clutter. Then, an integrated track management method is proposed to realize multitarget track continuity. The core of the track management is the track-to-estimation multiassignment association, which relaxes the traditional one-to-one data association restriction due to the unresolved focal plane CSO measurements. Meanwhile, a unified technique of multiple consecutive misses for track deletion is used jointly to cope with the sensitivity of the PHD filter to missed detections, to further eliminate false alarms, and to initiate tracks for large numbers of CSO. Finally, results of two simulations and one experiment show that the proposed approach is feasible and efficient.

  3. Univariate Probability Distributions

    ERIC Educational Resources Information Center

    Leemis, Lawrence M.; Luckett, Daniel J.; Powell, Austin G.; Vermeer, Peter E.

    2012-01-01

    We describe a web-based interactive graphic that can be used as a resource in introductory classes in mathematical statistics. This interactive graphic presents 76 common univariate distributions and gives details on (a) various features of the distribution such as the functional form of the probability density function and cumulative distribution…

  4. Evaluation of the eruptive potential and probability in open conduit volcano (Mt Etna) based on soil CO2 flux measurements

    NASA Astrophysics Data System (ADS)

    De Gregorio, Sofia; Camarda, Marco

    2016-04-01

    The evaluation of the amount of magma that might potentially be erupted, i.e. the eruptive potential (EP), and of the probability of occurrence of an eruptive event, i.e. the eruptive probability (EPR), of an active volcano is one of the most compelling and challenging topics addressed by the volcanology community in recent years. The evaluation of the EP of an open conduit volcano is generally based on a constant magma supply rate deduced from long-term series of eruptive rates. This EP computation gives good results for long-term (centuries) evaluations, but is less effective when short-term (years or months) estimations are needed. In fact, the rate of magma supply can undergo changes both on long and short time scales. At steady conditions it can be supposed that the regular supply of magma determines an almost constant level of magma in the feeding system (FS), whereas episodic surpluses of magma input, with respect to the regular supply, can cause large variations in the magma level. It follows that the surplus of magma occasionally entering the FS represents a supply of material that sooner or later will be disposed of, i.e. it will be erupted. Accordingly, the amount of surplus magma entering the FS nearly corresponds to the amount of magma that must be erupted in order to restore the equilibrium. Further, the larger the amount of surplus magma stored in the system, the higher the energetic level of the system and its propensity to erupt, in other words its EPR. In light of the above considerations, we present here an innovative methodology to evaluate the EP based on the quantification of the surplus of magma, with respect to the regular supply, progressively intruded into the FS. To estimate the surplus of magma supply we used soil CO2 emission data measured monthly at 130 sites in two peripheral areas of Mt Etna Volcano. Indeed, as reported by many authors, soil CO2 emissions in these areas are linked to magma supply dynamics and, moreover, anomalous discharges of CO2 are ascribable to surplus of

  5. Application of the probability-based Maryland Biological Stream Survey to the state's assessment of water quality standards.

    PubMed

    Southerland, Mark T; Vølstad, Jon H; Weber, Edward D; Klauda, Ronald J; Poukish, Charles A; Rowe, Matthew C

    2009-03-01

    The Clean Water Act presents a daunting task for states by requiring them to assess and restore all their waters. Traditional monitoring has led to two beliefs: (1) ad hoc sampling (i.e., non-random) is adequate if enough sites are sampled and (2) more intensive sampling (e.g., collecting more organisms) at each site is always better. We analyzed the 1,500 Maryland Biological Stream Survey (MBSS) random sites sampled in 2000-2004 to describe the variability of Index of Biotic Integrity (IBI) scores at the site, reach, and watershed scales. Average variability for fish and benthic IBI scores increased with increasing spatial scale, demonstrating that single site IBI scores are not representative at watershed scales and therefore at best 25% of a state's stream length can be representatively sampled with non-random designs. We evaluated the effects on total taxa captured and IBI precision of sampling for twice as many benthic macroinvertebrates at 73 MBSS sites with replicate samples. When sampling costs were fixed, the precision of the IBI decreased as the number of sites had to be reduced by 15%. Only 1% more taxa were found overall when the 73 sites were combined. We concluded that (1) comprehensive assessment of a state's waters should be done using probability-based sampling that allows the condition across all reaches to be inferred statistically and (2) additional site sampling effort should not be incorporated into state biomonitoring when it will reduce the number of sites sampled to the point where overall assessment precision is lower. PMID:19067199

  6. Highly efficient codec based on significance-linked connected-component analysis of wavelet coefficients

    NASA Astrophysics Data System (ADS)

    Chai, Bing-Bing; Vass, Jozsef; Zhuang, Xinhua

    1997-04-01

    Recent success in wavelet coding is mainly attributed to the recognition of the importance of data organization. Several very competitive wavelet codecs have been developed, namely Shapiro's Embedded Zerotree Wavelets (EZW), Servetto et al.'s Morphological Representation of Wavelet Data (MRWD), and Said and Pearlman's Set Partitioning in Hierarchical Trees (SPIHT). In this paper, we propose a new image compression algorithm called Significance-Linked Connected Component Analysis (SLCCA) of wavelet coefficients. SLCCA exploits both within-subband clustering of significant coefficients and cross-subband dependency in significant fields. A so-called significance link between connected components is designed to reduce the positional overhead of MRWD. In addition, the magnitudes of the significant coefficients are encoded in bit-plane order to match the probability model of the adaptive arithmetic coder. Experiments show that SLCCA outperforms both EZW and MRWD, and is on par with SPIHT. Furthermore, it is observed that SLCCA generally has the best performance on images with a large portion of texture. When applied to fingerprint image compression, it outperforms the FBI's wavelet scalar quantization by about 1 dB.
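
    The within-subband clustering step at the heart of SLCCA can be illustrated by labeling connected components of significant coefficients; the sketch below covers only that ingredient with a hypothetical subband and threshold, not the cross-subband significance links or the arithmetic coder.

```python
import numpy as np
from scipy import ndimage

def significant_clusters(subband: np.ndarray, threshold: float):
    """Label 8-connected clusters of wavelet coefficients whose magnitude is significant."""
    mask = np.abs(subband) >= threshold
    structure = np.ones((3, 3), dtype=int)          # 8-connectivity
    labels, n_clusters = ndimage.label(mask, structure=structure)
    return labels, n_clusters

# Toy subband with two separated significant regions
subband = np.zeros((8, 8))
subband[1:3, 1:3] = 5.0
subband[5:7, 5:7] = -4.0
labels, n = significant_clusters(subband, threshold=3.0)   # n == 2
```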

  7. Prospect evaluation as a function of numeracy and probability denominator.

    PubMed

    Millroth, Philip; Juslin, Peter

    2015-05-01

    This study examines how numeracy and probability denominator (a direct-ratio probability, a relative frequency with denominator 100, a relative frequency with denominator 10,000) affect the evaluation of prospects in an expected-value based pricing task. We expected that numeracy would affect the results due to differences in the linearity of number perception and the susceptibility to denominator neglect with different probability formats. An analysis with functional measurement verified that participants integrated value and probability into an expected value. However, a significant interaction between numeracy and probability format and subsequent analyses of the parameters of cumulative prospect theory showed that the manipulation of probability denominator changed participants' psychophysical response to probability and value. Standard methods in decision research may thus confound people's genuine risk attitude with their numerical capacities and the probability format used.
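
    For reference, the expected-value pricing rule and the one-parameter cumulative prospect theory probability weighting function of Tversky and Kahneman (1992) used in such analyses take the following form; the gamma value shown is the commonly cited estimate, not a parameter fitted in this study.

```python
def expected_value(probability: float, value: float) -> float:
    """Normative price of a simple prospect (probability p of winning `value`, else nothing)."""
    return probability * value

def tk_weight(p: float, gamma: float = 0.61) -> float:
    """Tversky-Kahneman (1992) probability weighting: w(p) = p^g / (p^g + (1-p)^g)^(1/g)."""
    return p ** gamma / (p ** gamma + (1.0 - p) ** gamma) ** (1.0 / gamma)

# A 10% chance of 100 units: EV = 10, but the decision weight overweights the small probability
print(expected_value(0.10, 100.0), tk_weight(0.10))
```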

  8. Prospect evaluation as a function of numeracy and probability denominator.

    PubMed

    Millroth, Philip; Juslin, Peter

    2015-05-01

    This study examines how numeracy and probability denominator (a direct-ratio probability, a relative frequency with denominator 100, a relative frequency with denominator 10,000) affect the evaluation of prospects in an expected-value based pricing task. We expected that numeracy would affect the results due to differences in the linearity of number perception and the susceptibility to denominator neglect with different probability formats. An analysis with functional measurement verified that participants integrated value and probability into an expected value. However, a significant interaction between numeracy and probability format and subsequent analyses of the parameters of cumulative prospect theory showed that the manipulation of probability denominator changed participants' psychophysical response to probability and value. Standard methods in decision research may thus confound people's genuine risk attitude with their numerical capacities and the probability format used. PMID:25704578

  9. Implementation of pseudoreceptor-based pharmacophore queries in the prediction of probable protein targets: explorations in the protein structural profile of Zea mays.

    PubMed

    Kumar, Sivakumar Prasanth; Jha, Prakash C; Pandya, Himanshu A; Jasrai, Yogesh T

    2014-07-01

    Molecular docking plays an important role in protein target identification by prioritizing probable druggable proteins using docking energies. Due to the limitations of docking scoring schemes, there arises a need for structure-based approaches to gain confidence in theoretical binding affinities. In this direction, we present here a receptor (protein)-based approach to predict probable protein targets using a small molecule of interest. We adopted a reverse approach wherein the ligand pharmacophore features were used to decipher interaction-complementary amino acids of protein cavities (a pseudoreceptor) and expressed as queries to match the cavities or binding sites of the protein dataset. These pseudoreceptor-based pharmacophore queries were used to estimate total probabilities for each protein cavity, thereby representing the ligand binding efficiency of the protein. We applied this approach to predict 3 experimental protein targets among 28 Zea mays structural data using 3 co-crystallized ligands as inputs and compared its effectiveness with conventional docking results. We suggest that the combination of total probabilities and docking energies increases the confidence in prioritizing probable protein targets using docking methods. These prediction hypotheses were further supported by DrugScoreX (DSX) pair potential calculations and molecular dynamics simulations. PMID:24756543

  10. Significance testing of rules in rule-based models of human problem solving

    NASA Technical Reports Server (NTRS)

    Lewis, C. M.; Hammer, J. M.

    1986-01-01

    Rule-based models of human problem solving have typically not been tested for statistical significance. Three methods of testing rules - analysis of variance, randomization, and contingency tables - are presented. Advantages and disadvantages of the methods are also described.

  11. Precursor Analysis for Flight- and Ground-Based Anomaly Risk Significance Determination

    NASA Technical Reports Server (NTRS)

    Groen, Frank

    2010-01-01

    This slide presentation reviews the precursor analysis for flight and ground based anomaly risk significance. It includes information on accident precursor analysis, real models vs. models, and probabilistic analysis.

  12. Changes in Sexual Behavior and Attitudes Across Generations and Gender Among a Population-Based Probability Sample From an Urbanizing Province in Thailand.

    PubMed

    Techasrivichien, Teeranee; Darawuttimaprakorn, Niphon; Punpuing, Sureeporn; Musumari, Patou Masika; Lukhele, Bhekumusa Wellington; El-Saaidi, Christina; Suguimoto, S Pilar; Feldman, Mitchell D; Ono-Kihara, Masako; Kihara, Masahiro

    2016-02-01

    Thailand has undergone rapid modernization with implications for changes in sexual norms. We investigated sexual behavior and attitudes across generations and gender among a probability sample of the general population of Nonthaburi province located near Bangkok in 2012. A tablet-based survey was performed among 2,138 men and women aged 15-59 years identified through a three-stage, stratified, probability proportional to size, clustered sampling. Descriptive statistical analysis was carried out accounting for the effects of multistage sampling. Relationship of age and gender to sexual behavior and attitudes was analyzed by bivariate analysis followed by multivariate logistic regression analysis to adjust for possible confounding. Patterns of sexual behavior and attitudes varied substantially across generations and gender. We found strong evidence for a decline in the age of sexual initiation, a shift in the type of the first sexual partner, and a greater rate of acceptance of adolescent premarital sex among younger generations. The study highlighted profound changes among young women as evidenced by a higher number of lifetime sexual partners as compared to older women. In contrast to the significant gender gap in older generations, sexual profiles of Thai young women have evolved to resemble those of young men with attitudes gradually converging to similar sexual standards. Our data suggest that higher education, being never-married, and an urban lifestyle may have been associated with these changes. Our study found that Thai sexual norms are changing dramatically. It is vital to continue monitoring such changes, considering the potential impact on the HIV/STIs epidemic and unintended pregnancies.

  13. Co-activation Probability Estimation (CoPE): An approach for modeling functional co-activation architecture based on neuroimaging coordinates.

    PubMed

    Chu, Congying; Fan, Lingzhong; Eickhoff, Claudia R; Liu, Yong; Yang, Yong; Eickhoff, Simon B; Jiang, Tianzi

    2015-08-15

    Recent progress in functional neuroimaging has prompted studies of brain activation during various cognitive tasks. Coordinate-based meta-analysis has been utilized to discover the brain regions that are consistently activated across experiments. However, within-experiment co-activation relationships, which can reflect the underlying functional relationships between different brain regions, have not been widely studied. In particular, voxel-wise co-activation, which may be able to provide a detailed configuration of the co-activation network, still needs to be modeled. To estimate the voxel-wise co-activation pattern and deduce the co-activation network, a Co-activation Probability Estimation (CoPE) method was proposed to model within-experiment activations for the purpose of defining the co-activations. A permutation test was adopted as a significance test. Moreover, the co-activations were automatically separated into local and long-range ones, based on distance. The two types of co-activations describe distinct features: the first reflects convergent activations; the second represents co-activations between different brain regions. The validation of CoPE was based on five simulation tests and one real dataset derived from studies of working memory. Both the simulated and the real data demonstrated that CoPE was not only able to find local convergence but also significant long-range co-activation. In particular, CoPE was able to identify a 'core' co-activation network in the working memory dataset. As a data-driven method, the CoPE method can be used to mine underlying co-activation relationships across experiments in future studies.

  14. Fuzzy forecasting based on two-factors second-order fuzzy-trend logical relationship groups and the probabilities of trends of fuzzy logical relationships.

    PubMed

    Chen, Shyi-Ming; Chen, Shen-Wen

    2015-03-01

    In this paper, we present a new method for fuzzy forecasting based on two-factors second-order fuzzy-trend logical relationship groups and the probabilities of trends of fuzzy-trend logical relationships. Firstly, the proposed method fuzzifies the historical training data of the main factor and the secondary factor into fuzzy sets, respectively, to form two-factors second-order fuzzy logical relationships. Then, it groups the obtained two-factors second-order fuzzy logical relationships into two-factors second-order fuzzy-trend logical relationship groups. Then, it calculates the probability of the "down-trend," the probability of the "equal-trend" and the probability of the "up-trend" of the two-factors second-order fuzzy-trend logical relationships in each two-factors second-order fuzzy-trend logical relationship group, respectively. Finally, it performs the forecasting based on the probabilities of the down-trend, the equal-trend, and the up-trend of the two-factors second-order fuzzy-trend logical relationships in each two-factors second-order fuzzy-trend logical relationship group. We also apply the proposed method to forecast the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX) and the NTD/USD exchange rates. The experimental results show that the proposed method outperforms the existing methods.
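
    The trend-probability step of the method reduces to counting the relative frequencies of "down", "equal" and "up" transitions within each fuzzy-trend logical relationship group; the sketch below shows only that counting step for an assumed grouping, not the fuzzification or the final forecast.

```python
from collections import Counter

def trend_probabilities(groups):
    """groups: dict mapping a group id to the list of observed trends
    ('down' | 'equal' | 'up') of its fuzzy-trend logical relationships."""
    probs = {}
    for gid, trends in groups.items():
        counts, total = Counter(trends), len(trends)
        probs[gid] = {t: counts[t] / total for t in ("down", "equal", "up")}
    return probs

# Hypothetical group with five observed relationships
print(trend_probabilities({"G1": ["up", "up", "equal", "down", "up"]}))
```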

  15. MEASUREMENT OF MULTI-POLLUTANT AND MULTI-PATHWAY EXPOSURES IN A PROBABILITY-BASED SAMPLE OF CHILDREN: PRACTICAL STRATEGIES FOR EFFECTIVE FIELD STUDIES

    EPA Science Inventory

    The purpose of this manuscript is to describe the practical strategies developed for the implementation of the Minnesota Children's Pesticide Exposure Study (MNCPES), which is one of the first probability-based samples of multi-pathway and multi-pesticide exposures in children....

  16. The Relationship between School Quality and the Probability of Passing Standards-Based High-Stakes Performance Assessments. CSE Technical Report 644

    ERIC Educational Resources Information Center

    Goldschmidt, Pete; Martinez-Fernandez, Jose-Felipe

    2004-01-01

    We examine whether school quality affects passing the California High School Exit Exam (CAHSEE), which is a standards-based high-stakes performance assessment. We use 3-level hierarchical logistic and linear models to examine student probabilities of passing the CAHSEE to take advantage of the availability of student, teacher, and school level…

  17. Probability workshop to be better in probability topic

    NASA Astrophysics Data System (ADS)

    Asmat, Aszila; Ujang, Suriyati; Wahid, Sharifah Norhuda Syed

    2015-02-01

    The purpose of the present study was to examine whether statistics anxiety and attitudes towards the probability topic among students at the higher education level have an effect on their performance. 62 fourth-semester science students were given statistics anxiety questionnaires about their perception of the probability topic. The results indicated that students' performance in the probability topic is not related to anxiety level, meaning that a higher level of statistics anxiety does not cause a lower score in probability performance. The study also revealed that students who were motivated by the probability workshop showed a positive improvement in their performance in the probability topic compared with before the workshop. In addition, there is a significant difference in students' performance between genders, with better achievement among female students compared to male students. Thus, more initiatives in learning programs with different teaching approaches are needed to provide useful information for improving student learning outcomes in higher learning institutions.

  18. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES, AND ECOLOGICAL RISK ASSESSMENT

    EPA Science Inventory

    We show that probability-based environmental resource monitoring programs, such as U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Asscssment Program EMAP) can be analyzed with a conditional probability analysis (CPA) to conduct quantitative probabi...

  19. PROBABILITY SURVEYS , CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    EPA Science Inventory

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  20. Probability Surveys, Conditional Probability, and Ecological Risk Assessment

    EPA Science Inventory

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency’s (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  1. Waste Package Misload Probability

    SciTech Connect

    J.K. Knudsen

    2001-11-20

    The objective of this calculation is to determine the probability of occurrence of fuel assembly (FA) misloads (i.e., an FA placed in the wrong location) and of FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The categories are based on whether FAs were damaged or misloaded. The next step is to determine the total number of FAs involved in each event. Using this information, a probability of occurrence is calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a.
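
    In essence, the calculation reduces to an event rate per fuel-assembly movement; a minimal sketch follows, with a Jeffreys-prior variant that is more robust when very few events are observed. The counts are placeholders, not the Framatome ANP data.

```python
def misload_probability(n_misloads: int, n_movements: int, jeffreys: bool = False) -> float:
    """Point estimate of the per-movement misload probability.
    With jeffreys=True, a Beta(0.5, 0.5) prior gives the posterior-mean estimate
    (n + 0.5) / (N + 1), which behaves better for rare events."""
    if jeffreys:
        return (n_misloads + 0.5) / (n_movements + 1.0)
    return n_misloads / n_movements

# Placeholder counts only
print(misload_probability(3, 120000), misload_probability(3, 120000, jeffreys=True))
```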

  2. Probability 1/e

    ERIC Educational Resources Information Center

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.

  3. Nuclear spin of odd-odd α emitters based on the behavior of α -particle preformation probability

    NASA Astrophysics Data System (ADS)

    Ismail, M.; Adel, A.; Botros, M. M.

    2016-05-01

    The preformation probabilities of an α cluster inside radioactive parent nuclei are investigated for both odd-even and odd-odd nuclei. The calculations cover the isotopic chains from Ir to Ac in the mass regions 166 ≤ A ≤ 215 and 77 ≤ Z ≤ 89. The calculations are performed in the framework of the density-dependent cluster model. A realistic density-dependent nucleon-nucleon (NN) interaction with a finite-range exchange part is used to calculate the microscopic α-nucleus potential in the well-established double-folding model. The main effect of antisymmetrization under exchange of nucleons between the α and daughter nuclei has been included in the folding model through the finite-range exchange part of the NN interaction. The calculated potential is then used to find both the assault frequency and the penetration probability of the α particle by means of the Wentzel-Kramers-Brillouin approximation in combination with the Bohr-Sommerfeld quantization condition. The correlation of the α-particle preformation probability with the neutron and proton level sequences of the parent nucleus, as obtained in our previous work, is extended to odd-even and odd-odd nuclei to determine nuclear spins and parities. Two spin coupling rules, namely the strong and the weak rule, are used to determine the nuclear spin of odd-odd isotopes. This work can serve as a useful reference for future theoretical calculations of the undetermined nuclear spins of odd-odd nuclei.
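
    For orientation, the WKB (Gamow) penetration probability entering such calculations can be evaluated numerically from any tabulated α-daughter potential as sketched below; the potential would come from the double-folding model, which is not reproduced here, so the inputs are placeholders.

```python
import numpy as np

def wkb_penetration(r_fm, V_MeV, Q_MeV, mu_c2_MeV):
    """Barrier penetration P = exp(-2 * integral of sqrt(2*mu*(V - Q))/hbar dr) over the
    classically forbidden region (V > Q).
    r_fm: radii [fm]; V_MeV: alpha-daughter potential at r_fm [MeV];
    Q_MeV: decay energy [MeV]; mu_c2_MeV: reduced-mass energy mu*c^2 [MeV]."""
    hbar_c = 197.327  # MeV fm
    kappa = np.sqrt(2.0 * mu_c2_MeV * np.clip(V_MeV - Q_MeV, 0.0, None)) / hbar_c
    forbidden = V_MeV > Q_MeV
    r, k = r_fm[forbidden], kappa[forbidden]
    action = np.sum(0.5 * (k[:-1] + k[1:]) * np.diff(r))   # trapezoidal integration
    return float(np.exp(-2.0 * action))
```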

  4. CS-dependent response probability in an auditory masked-detection task: considerations based on models of Pavlovian conditioning.

    PubMed

    Mason, Christine R; Idrobo, Fabio; Early, Susan J; Abibi, Ayome; Zheng, Ling; Harrison, J Michael; Carney, Laurel H

    2003-05-01

    Experimental studies were performed using a Pavlovian-conditioned eyeblink response to measure detection of a variable-sound-level tone (T) in a fixed-sound-level masking noise (N) in rabbits. Results showed an increase in the asymptotic probability of conditioned responses (CRs) to the reinforced TN trials and a decrease in the asymptotic rate of eyeblink responses to the non-reinforced N presentations as a function of the sound level of the T. These observations are consistent with expected behaviour in an auditory masked detection task, but they are not consistent with predictions from a traditional application of the Rescorla-Wagner or Pearce models of associative learning. To implement these models, one typically considers only the actual stimuli and reinforcement on each trial. We found that by considering perceptual interactions and concepts from signal detection theory, these models could predict the CS dependence on the sound level of the T. In these alternative implementations, the animals' response probabilities were used as a guide in making assumptions about the "effective stimuli".

  5. Efficiency of using correlation function for estimation of probability of substance detection on the base of THz spectral dynamics

    NASA Astrophysics Data System (ADS)

    Trofimov, Vyacheslav A.; Peskov, Nikolay V.; Kirillov, Dmitry A.

    2012-10-01

    One of the problems arising in time-domain THz spectroscopy for security applications is developing criteria for assessing the probability of detection and identification of explosives and drugs. We analyze the efficiency of using the correlation function and another functional (more precisely, a spectral norm) for this purpose. These criteria are applied to the spectral-line dynamics. To increase the reliability of the assessment, we subtract the averaged value of the THz signal over the analysis window, which amounts to removing the constant component from this part of the signal. Because of this, we can increase the contrast of the assessment. We compare the application of the Fourier-Gabor transform with an unbounded (for example, Gaussian) window, which slides along the signal, for finding the spectral-line dynamics with the application of the Fourier transform in a short time interval (FTST), in which the Fourier transform is applied to parts of the signal, for the same aim. These methods are close to each other; nevertheless, they differ in the series of frequencies which they use. It is important for practice that the optimal window shape depends on the chosen method for obtaining the spectral dynamics. The detection probability increases if we can find a train of pulses with different frequencies which follow sequentially. We show that it is possible to obtain pure spectral-line dynamics even when the spectrum of the substance response to the THz pulse is distorted.
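
    The correlation criterion, applied after subtracting the mean of the analyzed segment, is essentially a normalized correlation between the measured spectral-line dynamics and those of a reference substance; a minimal sketch with hypothetical array inputs follows.

```python
import numpy as np

def correlation_criterion(measured: np.ndarray, reference: np.ndarray) -> float:
    """Normalized correlation between mean-subtracted spectral-line dynamics.
    Values near 1 support detection/identification of the reference substance."""
    m = measured - measured.mean()    # removing the mean increases the contrast of the assessment
    r = reference - reference.mean()
    return float(np.dot(m, r) / (np.linalg.norm(m) * np.linalg.norm(r)))
```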

  6. Minimizing the probable maximum flood

    SciTech Connect

    Woodbury, M.S.; Pansic, N. ); Eberlein, D.T. )

    1994-06-01

    This article examines Wisconsin Electric Power Company's efforts to determine an economical way to comply with Federal Energy Regulatory Commission requirements at two hydroelectric developments on the Michigamme River. Their efforts included refinement of the area's probable maximum flood model based, in part, on a newly developed probable maximum precipitation estimate.

  7. Estimation of the age-specific per-contact probability of Ebola virus transmission in Liberia using agent-based simulations

    NASA Astrophysics Data System (ADS)

    Siettos, Constantinos I.; Anastassopoulou, Cleo; Russo, Lucia; Grigoras, Christos; Mylonakis, Eleftherios

    2016-06-01

    Based on multiscale agent-based computations we estimated the per-contact probability of transmission by age of the Ebola virus disease (EVD) that swept through Liberia from May 2014 to March 2015. For the approximation of the epidemic dynamics we have developed a detailed agent-based model with small-world interactions between individuals categorized by age. For the estimation of the structure of the evolving contact network as well as the per-contact transmission probabilities by age group, we exploited the so-called Equation-Free framework. Model parameters were fitted to official case counts reported by the World Health Organization (WHO) as well as to recently published data on key epidemiological variables, such as the mean time to death, recovery and the case fatality rate.

  8. Lamb wave-based damage quantification and probability of detection modeling for fatigue life assessment of riveted lap joint

    NASA Astrophysics Data System (ADS)

    He, Jingjing; Wang, Dengjiang; Zhang, Weifang

    2015-03-01

    This paper presents an experimental and modeling study of damage detection and quantification in riveted lap joints. Embedded lead zirconate titanate piezoelectric (PZT) ceramic wafer-type sensors are employed to perform in-situ non-destructive testing during fatigue cyclical loading. A multi-feature integration method is developed to quantify the crack size using the signal features of correlation coefficient, amplitude change, and phase change. In addition, a probability of detection (POD) model is constructed to quantify the reliability of the developed sizing method. Using the developed crack size quantification method and the resulting POD curve, probabilistic fatigue life prediction can be performed to provide comprehensive information for decision-making. The effectiveness of the overall methodology is demonstrated and validated using several aircraft lap joint specimens from different manufacturers and under different loading conditions.
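
    A common way to construct such a POD curve is the â-versus-a log-linear model (as in MIL-HDBK-1823), in which POD(a) is the probability that the measured damage feature exceeds a decision threshold; the sketch below follows that generic recipe with made-up data and is not necessarily the authors' multi-feature model.

```python
import numpy as np
from scipy import stats

def fit_pod(crack_sizes, features, decision_threshold):
    """Fit log(feature) = b0 + b1*log(a) + noise and return POD(a)."""
    x, y = np.log(crack_sizes), np.log(features)
    fit = stats.linregress(x, y)
    sigma = np.std(y - (fit.intercept + fit.slope * x), ddof=2)   # residual scatter

    def pod(a):
        mu = fit.intercept + fit.slope * np.log(a)
        return 1.0 - stats.norm.cdf(np.log(decision_threshold), loc=mu, scale=sigma)

    return pod

# Hypothetical data: crack sizes in mm and a combined damage-index feature
a = np.array([0.5, 1.0, 1.5, 2.0, 3.0, 4.0])
feat = np.array([0.04, 0.09, 0.15, 0.22, 0.35, 0.50])
pod = fit_pod(a, feat, decision_threshold=0.10)
print(pod(1.0), pod(2.0))
```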

  9. Probability for Weather and Climate

    NASA Astrophysics Data System (ADS)

    Smith, L. A.

    2013-12-01

    Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of

  10. Laser Raman detection for oral cancer based on an adaptive Gaussian process classification method with posterior probabilities

    NASA Astrophysics Data System (ADS)

    Du, Zhanwei; Yang, Yongjian; Bai, Yuan; Wang, Lijun; Su, Le; Chen, Yong; Li, Xianchang; Zhou, Xiaodong; Jia, Jun; Shen, Aiguo; Hu, Jiming

    2013-03-01

    Existing methods for the early and differential diagnosis of oral cancer are limited because early symptoms are inconspicuous and imaging examination methods are imperfect. In this paper, classification models of oral adenocarcinoma, carcinoma tissues and a control group using just four features are established by utilizing the hybrid Gaussian process (HGP) classification algorithm, with the introduction of noise reduction and posterior probability mechanisms. HGP shows much better performance in the experimental results. During the experimental process, oral tissues were divided into three groups: adenocarcinoma (n = 87), carcinoma (n = 100) and the control group (n = 134). The spectral data for these groups were collected. The prospective application of the proposed HGP classification method improved the diagnostic sensitivity to 56.35% and the specificity to about 70.00%, and resulted in a Matthews correlation coefficient (MCC) of 0.36. It is shown that the utilization of HGP in LRS detection analysis for the diagnosis of oral cancer gives accurate results. The prospect of application is also satisfactory.
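
    A baseline version of posterior-probability-based Gaussian process classification on a handful of spectral features can be sketched with scikit-learn; the standard classifier below stands in for the authors' hybrid Gaussian process (HGP) with noise reduction, and the data are random placeholders.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessClassifier
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)
X = rng.normal(size=(90, 4))            # placeholder: 4 Raman-derived features per tissue sample
y = rng.integers(0, 3, size=90)         # placeholder labels: 0=control, 1=adenocarcinoma, 2=carcinoma

clf = GaussianProcessClassifier(kernel=1.0 * RBF(length_scale=1.0), random_state=0).fit(X, y)
posterior = clf.predict_proba(X[:5])    # class posterior probabilities used for the diagnosis call
print(posterior.round(2))
```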

  11. Inference for binomial probability based on dependent Bernoulli random variables with applications to meta-analysis and group level studies.

    PubMed

    Bakbergenuly, Ilyas; Kulinskaya, Elena; Morgenthaler, Stephan

    2016-07-01

    We study bias arising as a result of nonlinear transformations of random variables in random or mixed effects models and its effect on inference in group-level studies or in meta-analysis. The findings are illustrated on the example of overdispersed binomial distributions, where we demonstrate considerable biases arising from standard log-odds and arcsine transformations of the estimated probability p̂, both for single-group studies and in combining results from several groups or studies in meta-analysis. Our simulations confirm that these biases are linear in ρ, for small values of ρ, the intracluster correlation coefficient. These biases do not depend on the sample sizes or the number of studies K in a meta-analysis and result in abysmal coverage of the combined effect for large K. We also propose bias-correction for the arcsine transformation. Our simulations demonstrate that this bias-correction works well for small values of the intraclass correlation. The methods are applied to two examples of meta-analyses of prevalence. PMID:27192062
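
    The transformation bias discussed here can be reproduced in a few lines by simulating an overdispersed (beta-binomial) proportion with intracluster correlation ρ and comparing the mean arcsine-transformed estimate with the transform of the true probability; the parameter values are arbitrary illustrations.

```python
import numpy as np

rng = np.random.default_rng(42)
p, rho, n, reps = 0.10, 0.05, 50, 200_000

# Beta-binomial sampling whose intracluster correlation equals rho
a, b = p * (1 - rho) / rho, (1 - p) * (1 - rho) / rho
p_hat = rng.binomial(n, rng.beta(a, b, size=reps)) / n

bias = np.mean(np.arcsin(np.sqrt(p_hat))) - np.arcsin(np.sqrt(p))
print(f"arcsine-scale bias = {bias:.4f}")   # grows roughly linearly in rho for small rho
```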

  12. Inference for binomial probability based on dependent Bernoulli random variables with applications to meta‐analysis and group level studies

    PubMed Central

    Bakbergenuly, Ilyas; Morgenthaler, Stephan

    2016-01-01

    We study bias arising as a result of nonlinear transformations of random variables in random or mixed effects models and its effect on inference in group-level studies or in meta-analysis. The findings are illustrated on the example of overdispersed binomial distributions, where we demonstrate considerable biases arising from standard log-odds and arcsine transformations of the estimated probability p̂, both for single-group studies and in combining results from several groups or studies in meta-analysis. Our simulations confirm that these biases are linear in ρ, for small values of ρ, the intracluster correlation coefficient. These biases do not depend on the sample sizes or the number of studies K in a meta-analysis and result in abysmal coverage of the combined effect for large K. We also propose bias-correction for the arcsine transformation. Our simulations demonstrate that this bias-correction works well for small values of the intraclass correlation. The methods are applied to two examples of meta-analyses of prevalence. PMID:27192062

  13. Understanding text-based persuasion and support tactics of concerned significant others.

    PubMed

    van Stolk-Cooke, Katherine; Hayes, Marie; Baumel, Amit; Muench, Frederick

    2015-01-01

    The behavior of concerned significant others (CSOs) can have a measurable impact on the health and wellness of individuals attempting to meet behavioral and health goals, and research is needed to better understand the attributes of text-based CSO language when encouraging target significant others (TSOs) to achieve those goals. In an effort to inform the development of interventions for CSOs, this study examined the language content of brief text-based messages generated by CSOs to motivate TSOs to achieve a behavioral goal. CSOs generated brief text-based messages for TSOs for three scenarios: (1) to help TSOs achieve the goal, (2) in the event that the TSO is struggling to meet the goal, and (3) in the event that the TSO has given up on meeting the goal. Results indicate that there was a significant relationship between the tone and compassion of messages generated by CSOs, the CSOs' perceptions of TSO motivation, and their expectation of a grateful or annoyed reaction by the TSO to their feedback or support. Results underscore the importance of attending to patterns in language when CSOs communicate with TSOs about goal achievement or failure, and how certain variables in the CSOs' perceptions of their TSOs affect these characteristics. PMID:26312172

  15. The effects of inquiry-based science instruction training on teachers of students with significant disabilities

    NASA Astrophysics Data System (ADS)

    Courtade, Ginevra Rose

    Federal mandates (A Nation at Risk, 1983 and Project 2061: Science for all Americans, 1985) as well as the National Science Education Standards (NRC, 1996) call for science education for all students. Recent educational laws (IDEA, 1997; NCLB, 2002) require access to and assessment of the general curriculum, including science, for all students with disabilities. Although some research exists on teaching academics to students with significant disabilities, the research on teaching science is especially limited (Browder, Spooner, Ahlgrim-Delzell, Harris, & Wakeman, 2006; Browder, Wakeman, et al., 2006; Courtade, et al., 2006). The purpose of this investigation was to determine if training teachers of students with significant disabilities to teach science concepts using a guided inquiry-based method would change the way science was instructed in the classroom. Further objectives of this study were to determine if training the teachers would increase students' participation and achievement in science. The findings of this study demonstrated a functional relationship between the inquiry-based science instruction training and teachers' ability to instruct students with significant disabilities in science using inquiry-based science instruction. The findings of this study also indicated a functional relationship between the inquiry-based science instruction training and acquisition of student inquiry skills. Also, findings indicated an increase in the number of science content standards being addressed after the teachers received the training. Some students were also able to acquire new science terms after their teachers taught using inquiry-based instruction. Finally, social validity measures indicated a high degree of satisfaction with the intervention and its intended outcomes.

  16. The spline probability hypothesis density filter

    NASA Astrophysics Data System (ADS)

    Sithiravel, Rajiv; Tharmarasa, Ratnasingham; McDonald, Mike; Pelletier, Michel; Kirubarajan, Thiagalingam

    2012-06-01

    The Probability Hypothesis Density Filter (PHD) is a multitarget tracker for recursively estimating the number of targets and their state vectors from a set of observations. The PHD filter is capable of working well in scenarios with false alarms and missed detections. Two distinct PHD filter implementations are available in the literature: the Sequential Monte Carlo Probability Hypothesis Density (SMC-PHD) and the Gaussian Mixture Probability Hypothesis Density (GM-PHD) filters. The SMC-PHD filter uses particles to provide target state estimates, which can lead to a high computational load, whereas the GM-PHD filter does not use particles but is restricted to linear Gaussian mixture models. The SMC-PHD filter technique provides only weighted samples at discrete points in the state space instead of a continuous estimate of the probability density function of the system state and thus suffers from the well-known degeneracy problem. This paper proposes a B-Spline based Probability Hypothesis Density (S-PHD) filter, which has the capability to model any arbitrary probability density function. The resulting algorithm can handle linear, non-linear, Gaussian, and non-Gaussian models, and the S-PHD filter can also provide continuous estimates of the probability density function of the system state. In addition, by moving the knots dynamically, the S-PHD filter ensures that the splines cover only the region where the probability of the system state is significant, hence the high efficiency of the S-PHD filter is maintained at all times. Also, unlike the SMC-PHD filter, the S-PHD filter is immune to the degeneracy problem due to its continuous nature. The S-PHD filter derivations and simulations are provided in this paper.
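
    The central representational idea, a continuous density approximated by B-splines whose knots are concentrated where the probability mass is significant, can be sketched in a few lines with SciPy; this illustrates only the spline representation under an assumed toy density, not the S-PHD filter recursion itself.

    ```python
    # A minimal sketch of the representational idea behind the S-PHD filter: a
    # continuous density approximated by cubic B-splines, with knots concentrated
    # where the probability mass is significant.  This illustrates only the spline
    # representation, not the PHD filter recursion.
    import numpy as np
    from scipy.interpolate import make_interp_spline
    from scipy.stats import norm

    # "True" density: a two-component Gaussian mixture (illustrative choice).
    def density(x):
        return 0.6 * norm.pdf(x, -2.0, 0.5) + 0.4 * norm.pdf(x, 3.0, 1.0)

    # Place interpolation sites only where the density is non-negligible,
    # mimicking the dynamic-knot idea described in the abstract.
    grid = np.linspace(-8.0, 8.0, 400)
    significant = grid[density(grid) > 1e-3]
    sites = np.linspace(significant.min(), significant.max(), 25)

    spline = make_interp_spline(sites, density(sites), k=3)  # cubic B-spline

    x_eval = np.linspace(significant.min(), significant.max(), 1000)
    max_err = np.max(np.abs(spline(x_eval) - density(x_eval)))
    print(f"max absolute error of the 25-knot spline representation: {max_err:.2e}")
    ```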

  17. New Classification Method Based on Support-Significant Association Rules Algorithm

    NASA Astrophysics Data System (ADS)

    Li, Guoxin; Shi, Wen

    One of the most well-studied problems in data mining is mining for association rules. Research has also introduced association rule mining methods for classification tasks, and such classification methods can be applied to customer segmentation. Currently, most association rule mining methods are based on a support-confidence framework, in which rules satisfying both a minimum support and a minimum confidence are returned to the analyst as strong association rules. However, this type of association rule mining lacks a rigorous statistical guarantee and can even be misleading. This paper proposes a new classification model for customer segmentation based on an association rule mining algorithm. The new model rests on a support-significant association rule mining method, in which the confidence measure for an association rule is replaced by the rule's statistical significance, a better evaluation standard for association rules. An experiment on customer segmentation data from the UCI repository indicated the effectiveness of the new model.
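
    A minimal sketch of the general support-plus-significance idea (with a chi-square test standing in for whichever significance measure the authors use, which the abstract does not specify): a candidate rule A → B is retained only if it meets a minimum support and its antecedent-consequent association is statistically significant. The function name, thresholds, and toy transactions are illustrative.

    ```python
    # A minimal sketch of a "support + statistical significance" rule filter
    # (illustrative only; the paper's exact significance measure is not given
    # here).  A candidate rule A -> B is kept if it meets a minimum support and
    # if the A/B association is significant by a chi-square test on the 2x2 table.
    import numpy as np
    from scipy.stats import chi2_contingency

    def support_significant(transactions, antecedent, consequent,
                            min_support=0.1, alpha=0.01):
        """Return (keep, support, p_value) for the rule antecedent -> consequent."""
        has_a = np.array([antecedent <= t for t in transactions])
        has_b = np.array([consequent <= t for t in transactions])
        support = np.mean(has_a & has_b)
        # 2x2 contingency table of antecedent vs consequent occurrence.
        table = np.array([[np.sum(has_a & has_b), np.sum(has_a & ~has_b)],
                          [np.sum(~has_a & has_b), np.sum(~has_a & ~has_b)]])
        _, p_value, _, _ = chi2_contingency(table)
        keep = (support >= min_support) and (p_value < alpha)
        return keep, support, p_value

    # Toy transactions (sets of items); the rule {"bread"} -> {"butter"} is tested.
    transactions = [{"bread", "butter"}, {"bread", "butter", "milk"},
                    {"bread"}, {"milk"}, {"bread", "butter"}, {"milk", "eggs"},
                    {"bread", "butter", "eggs"}, {"milk", "bread"}]
    print(support_significant(transactions, {"bread"}, {"butter"}))
    ```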

  18. Prognostic significance of volume-based PET parameters in cancer patients.

    PubMed

    Moon, Seung Hwan; Hyun, Seung Hyup; Choi, Joon Young

    2013-01-01

    Accurate prediction of cancer prognosis before the start of treatment is important since these predictions often affect the choice of treatment. Prognosis is usually based on anatomical staging and other clinical factors. However, the conventional system is not sufficient to accurately and reliably determine prognosis. Metabolic parameters measured by (18)F-fluorodeoxyglucose (FDG) positron emission tomography (PET) have the potential to provide valuable information regarding prognosis and treatment response evaluation in cancer patients. Among these parameters, volume-based PET parameters such as metabolic tumor volume and total lesion glycolysis are especially promising. However, the measurement of these parameters is significantly affected by the imaging methodology and specific image characteristics, and a standard method for these parameters has not been established. This review introduces volume-based PET parameters as potential prognostic indicators, and highlights methodological considerations for measurement, potential implications, and prospects for further studies. PMID:23323025
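
    For concreteness, the two volume-based parameters named above are commonly computed as follows (a minimal sketch using their widely cited definitions; the SUV threshold and voxel size are illustrative assumptions, not recommendations from the review): the metabolic tumor volume is the volume of voxels whose SUV exceeds a threshold, and the total lesion glycolysis is that volume multiplied by its mean SUV.

    ```python
    # A minimal sketch of the two volume-based PET parameters named above, using
    # their commonly cited definitions (threshold choice and voxel size are
    # illustrative assumptions, not values from the review).
    import numpy as np

    def mtv_tlg(suv, voxel_volume_ml, threshold=2.5):
        """Metabolic tumor volume (ml) and total lesion glycolysis (SUV*ml)."""
        mask = suv >= threshold                      # voxels counted as metabolically active
        mtv = mask.sum() * voxel_volume_ml           # metabolic tumor volume
        tlg = mtv * (suv[mask].mean() if mask.any() else 0.0)  # TLG = MTV x mean SUV
        return mtv, tlg

    rng = np.random.default_rng(1)
    suv = rng.gamma(shape=2.0, scale=1.0, size=(40, 40, 20))   # synthetic SUV volume
    print(mtv_tlg(suv, voxel_volume_ml=0.064))                  # e.g. 4x4x4 mm voxels
    ```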

  19. One rhinophore probably provides sufficient sensory input for odour-based navigation by the nudibranch mollusc Tritonia diomedea.

    PubMed

    McCullagh, Gregory B; Bishop, Cory D; Wyeth, Russell C

    2014-12-01

    Tritonia diomedea (synonymous with Tritonia tetraquetra) navigates in turbulent odour plumes, crawling upstream towards prey and downstream to avoid predators. This is probably accomplished by odour-gated rheotaxis, but other possibilities have not been excluded. Our goal was to test whether T. diomedea uses odour-gated rheotaxis and to simultaneously determine which of the cephalic sensory organs (rhinophores and oral veil) are required for navigation. In a first experiment, slugs showed no coherent responses to streams of odour directed at single rhinophores. In a second experiment, navigation in prey and predator odour plumes was compared between animals with unilateral rhinophore lesions, denervated oral veils, or combined unilateral rhinophore lesions and denervated oral veils. In all treatments, animals navigated in a similar manner to that of control and sham-operated animals, indicating that a single rhinophore provides sufficient sensory input for navigation (assuming that a distributed flow measurement system would also be affected by the denervations). Amongst various potential navigational strategies, only odour-gated positive rheotaxis can produce the navigation tracks we observed in prey plumes while receiving input from a single sensor. Thus, we provide strong evidence that T. diomedea uses odour-gated rheotaxis in attractive odour plumes, with odours and flow detected by the rhinophores. In predator plumes, slugs turned downstream to varying degrees rather than orienting directly downstream for crawling, resulting in greater dispersion for negative rheotaxis in aversive plumes. These conclusions are the first explicit confirmation of odour-gated rheotaxis as a navigational strategy in gastropods and are also a foundation for exploring the neural circuits that mediate odour-gated rheotaxis. PMID:25324338

  20. Model assembly for estimating cell surviving fraction for both targeted and nontargeted effects based on microdosimetric probability densities.

    PubMed

    Sato, Tatsuhiko; Hamada, Nobuyuki

    2014-01-01

    We here propose a new model assembly for estimating the surviving fraction of cells irradiated with various types of ionizing radiation, considering both targeted and nontargeted effects in the same framework. The probability densities of specific energies at two scales, the cell nucleus and its substructure called a domain, were employed as the physical index for characterizing the radiation fields. In the model assembly, our previously established double stochastic microdosimetric kinetic (DSMK) model was used to express the targeted effect, whereas a newly developed model was used to express the nontargeted effect. The radioresistance caused by overexpression of the anti-apoptotic protein Bcl-2, known to occur frequently in human cancer, was also considered by introducing the concept of the adaptive response into the DSMK model. The accuracy of the model assembly was examined by comparing the computationally and experimentally determined surviving fractions of Bcl-2 cells (Bcl-2 overexpressing HeLa cells) and Neo cells (neomycin resistant gene-expressing HeLa cells) irradiated with microbeams or broadbeams of energetic heavy ions, as well as of WI-38 normal human fibroblasts irradiated with an X-ray microbeam. The model assembly reproduced the experimentally determined surviving fractions very well over a wide range of dose and linear energy transfer (LET) values. Our newly established model assembly is worth incorporating into treatment planning systems for heavy-ion therapy, brachytherapy, and boron neutron capture therapy, given the critical roles of frequent Bcl-2 overexpression and the nontargeted effect in estimating the therapeutic outcomes and harmful effects of such advanced therapeutic modalities.
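
    The DSMK assembly itself is not specified in the abstract, but the standard (non-stochastic) microdosimetric kinetic model on which it builds can be sketched as follows; the surviving-fraction formula S = exp[-(α0 + β·z1D)·D − β·D²] is the textbook MK form, and the parameter values below are illustrative assumptions only.

    ```python
    # A minimal sketch of the *standard* microdosimetric kinetic (MK) model
    # surviving fraction, S = exp(-(alpha0 + beta * z1D) * D - beta * D**2),
    # which underlies (but is not identical to) the DSMK assembly described
    # above.  Parameter values are illustrative assumptions only.
    import numpy as np

    def surviving_fraction(dose_gy, z1d_gy, alpha0=0.13, beta=0.05):
        """MK-model surviving fraction; z1d_gy is the dose-mean specific energy per event."""
        alpha = alpha0 + beta * z1d_gy   # LET/track-structure dependence enters through z1D
        return np.exp(-alpha * dose_gy - beta * dose_gy**2)

    doses = np.array([1.0, 2.0, 4.0, 8.0])
    for z1d in (0.2, 1.0, 3.0):          # larger z1D mimics higher-LET radiation
        print(f"z1D = {z1d:3.1f} Gy:", np.round(surviving_fraction(doses, z1d), 4))
    ```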

  1. The impacts of problem gambling on concerned significant others accessing web-based counselling.

    PubMed

    Dowling, Nicki A; Rodda, Simone N; Lubman, Dan I; Jackson, Alun C

    2014-08-01

    The 'concerned significant others' (CSOs) of people with problem gambling frequently seek professional support. However, there is surprisingly little research investigating the characteristics or help-seeking behaviour of these CSOs, particularly for web-based counselling. The aims of this study were to describe the characteristics of CSOs accessing the web-based counselling service (real time chat) offered by the Australian national gambling web-based counselling site, explore the most commonly reported CSO impacts using a new brief scale (the Problem Gambling Significant Other Impact Scale: PG-SOIS), and identify the factors associated with different types of CSO impact. The sample comprised all 366 CSOs accessing the service over a 21 month period. The findings revealed that the CSOs were most often the intimate partners of problem gamblers and that they were most often females aged under 30 years. All CSOs displayed a similar profile of impact, with emotional distress (97.5%) and impacts on the relationship (95.9%) reported to be the most commonly endorsed impacts, followed by impacts on social life (92.1%) and finances (91.3%). Impacts on employment (83.6%) and physical health (77.3%) were the least commonly endorsed. There were few significant differences in impacts between family members (children, partners, parents, and siblings), but friends consistently reported the lowest impact scores. Only prior counselling experience and Asian cultural background were consistently associated with higher CSO impacts. The findings can serve to inform the development of web-based interventions specifically designed for the CSOs of problem gamblers.

  2. SNP microarray-based 24 chromosome aneuploidy screening is significantly more consistent than FISH

    PubMed Central

    Treff, Nathan R.; Levy, Brynn; Su, Jing; Northrop, Lesley E.; Tao, Xin; Scott, Richard T.

    2010-01-01

    Many studies estimate that chromosomal mosaicism within the cleavage-stage human embryo is high. However, comparison of two unique methods of aneuploidy screening of blastomeres within the same embryo has not been conducted and may indicate whether mosaicism has been overestimated due to technical inconsistency rather than the biological phenomena. The present study investigates the prevalence of chromosomal abnormality and mosaicism found with two different single cell aneuploidy screening techniques. Thirteen arrested cleavage-stage embryos were studied. Each was biopsied into individual cells (n = 160). The cells from each embryo were randomized into two groups. Those destined for FISH-based aneuploidy screening (n = 75) were fixed, one cell per slide. Cells for SNP microarray-based aneuploidy screening (n = 85) were put into individual tubes. Microarray was significantly more reliable (96%) than FISH (83%) for providing an interpretable result (P = 0.004). Markedly different results were obtained when comparing microarray and FISH results from individual embryos. Mosaicism was significantly less commonly observed by microarray (31%) than by FISH (100%) (P = 0.0005). Although FISH evaluated fewer chromosomes per cell and fewer cells per embryo, FISH still displayed significantly more unique genetic diagnoses per embryo (3.2 ± 0.2) than microarray (1.3 ± 0.2) (P < 0.0001). This is the first prospective, randomized, blinded and paired comparison between microarray and FISH-based aneuploidy screening. SNP microarray-based 24 chromosome aneuploidy screening provides more complete and consistent results than FISH. These results also suggest that FISH technology may overestimate the contribution of mitotic error to the origin of aneuploidy at the cleavage stage of human embryogenesis. PMID:20484246

  3. Probability-based classifications for spatially characterizing the water temperatures and discharge rates of hot springs in the Tatun Volcanic Region, Taiwan.

    PubMed

    Jang, Cheng-Shin

    2015-05-01

    Accurately classifying the spatial features of the water temperatures and discharge rates of hot springs is crucial for environmental resources use and management. This study spatially characterized classifications of the water temperatures and discharge rates of hot springs in the Tatun Volcanic Region of Northern Taiwan by using indicator kriging (IK). The water temperatures and discharge rates of the springs were first assigned to high, moderate, and low categories according to the two thresholds of the proposed spring classification criteria. IK was then used to model the occurrence probabilities of the water temperatures and discharge rates of the springs and probabilistically determine their categories. Finally, nine combinations were acquired from the probability-based classifications for the spatial features of the water temperatures and discharge rates of the springs. Moreover, various combinations of spring water features were examined according to seven subzones of spring use in the study region. The research results reveal that probability-based classifications using IK provide practicable insights related to propagating the uncertainty of classifications according to the spatial features of the water temperatures and discharge rates of the springs. The springs in the Beitou (BT), Xingyi Road (XYR), Zhongshanlou (ZSL), and Lengshuikeng (LSK) subzones are suitable for supplying tourism hotels with a sufficient quantity of spring water because they have high or moderate discharge rates. Furthermore, natural hot springs in riverbeds and valleys should be developed in the Dingbeitou (DBT), ZSL, Xiayoukeng (XYK), and Macao (MC) subzones because of low discharge rates and low or moderate water temperatures. PMID:25917185
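
    A minimal sketch of the indicator kriging step described above (hand-rolled ordinary kriging of 0/1 indicators; the exponential variogram, its parameters, and the synthetic data are assumptions for illustration, not values fitted to the Tatun springs):

    ```python
    # A minimal sketch of indicator kriging (IK) for category probabilities:
    # observations are coded as 0/1 indicators against a threshold and ordinary
    # kriging of the indicators yields an exceedance probability at unsampled
    # locations.  The exponential variogram and its parameters are illustrative
    # assumptions, not fitted to the Tatun spring data.
    import numpy as np

    def variogram(h, sill=0.25, range_param=2.0, nugget=0.0):
        return nugget + sill * (1.0 - np.exp(-h / range_param))

    def indicator_krige(xy_obs, values, threshold, xy_target):
        ind = (values > threshold).astype(float)          # indicator transform
        n = len(ind)
        d_obs = np.linalg.norm(xy_obs[:, None, :] - xy_obs[None, :, :], axis=-1)
        # Ordinary kriging system with a Lagrange multiplier for unbiasedness.
        A = np.ones((n + 1, n + 1))
        A[:n, :n] = variogram(d_obs)
        A[n, n] = 0.0
        b = np.ones(n + 1)
        b[:n] = variogram(np.linalg.norm(xy_obs - xy_target, axis=1))
        w = np.linalg.solve(A, b)[:n]
        return float(np.clip(w @ ind, 0.0, 1.0))          # estimated P(value > threshold)

    rng = np.random.default_rng(2)
    xy = rng.uniform(0.0, 10.0, size=(30, 2))             # synthetic spring locations
    temps = 40.0 + 25.0 * rng.random(30)                  # synthetic water temperatures (degC)
    print("P(T > 60 degC) at (5, 5):",
          indicator_krige(xy, temps, threshold=60.0, xy_target=np.array([5.0, 5.0])))
    ```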

  5. Evaluating research for clinical significance: using critically appraised topics to enhance evidence-based neuropsychology.

    PubMed

    Bowden, Stephen C; Harrison, Elise J; Loring, David W

    2014-01-01

    Meehl's (1973, Psychodiagnosis: Selected papers. Minneapolis: University of Minnesota Press) distinction between statistical and clinical significance holds special relevance for evidence-based neuropsychological practice. Meehl argued that despite attaining statistical significance, many published findings have limited practical value since they do not inform clinical care. In the context of an ever-expanding clinical research literature, accessible methods to evaluate clinical impact are needed. The method of Critically Appraised Topics (Straus, Richardson, Glasziou, & Haynes, 2011, Evidence-based medicine: How to practice and teach EBM (4th ed.). Edinburgh: Elsevier Churchill-Livingstone) was developed to provide clinicians with a "toolkit" to facilitate implementation of evidence-based practice. We illustrate the Critically Appraised Topics method using a dementia screening example. We argue that the skills practiced through critical appraisal provide clinicians with methods to: (1) evaluate the clinical relevance of new or unfamiliar research findings with a focus on patient benefit, (2) help focus on research quality, and (3) incorporate evaluation of clinical impact into educational and professional development activities.

  6. Probability on a Budget.

    ERIC Educational Resources Information Center

    Ewbank, William A.; Ginther, John L.

    2002-01-01

    Describes how to use common dice numbered 1-6 for simple mathematical situations including probability. Presents a lesson using regular dice and specially marked dice to explore some of the concepts of probability. (KHR)

  7. Bortezomib-based triplets are associated with a high probability of dialysis independence and rapid renal recovery in newly diagnosed myeloma patients with severe renal failure or those requiring dialysis.

    PubMed

    Dimopoulos, Meletios A; Roussou, Maria; Gavriatopoulou, Maria; Psimenou, Erasmia; Eleutherakis-Papaiakovou, Evangelos; Migkou, Magdalini; Matsouka, Charis; Mparmparousi, Despoina; Gika, Dimitra; Kafantari, Eftychia; Ziogas, Dimitrios; Fotiou, Despoina; Panagiotidis, Ioannis; Terpos, Evangelos; Kastritis, Efstathios

    2016-05-01

    Renal failure (RF) is a common and severe complication of symptomatic myeloma, associated with significant morbidity and mortality. Such patients are commonly excluded from clinical trials. Bortezomib/dexamethasone (VD)-based regimens are the backbone of the treatment of newly diagnosed MM patients who present with severe RF, even those requiring dialysis. We analyzed the outcomes of 83 consecutive bortezomib-treated patients with severe RF (eGFR < 30 ml/min/1.73 m²), of whom 31 (37%) required dialysis. By IMWG renal response criteria, 54 (65%) patients achieved at least MRrenal, including CRrenal in 35% and PRrenal in 12%. Triplet combinations (i.e., VD plus a third agent) versus VD alone were associated with higher rates of renal responses (72 vs. 50%; P = 0.06). Fifteen of the 31 (48%) patients became dialysis independent within a median of 217 days (range 11-724). Triplets were associated with a higher probability of dialysis discontinuation (57 vs. 35%). Serum free light chain (sFLC) level ≥11,550 mg/L was associated with lower rates of major renal response, longer time to major renal response, and lower probability of, and longer time to, dialysis discontinuation. Rapid myeloma response (≥PR within the first month) was also associated with higher rates of renal response. Patients who became dialysis-independent had longer survival than those remaining on dialysis. In conclusion, VD-based triplets are associated with a significant probability of renal response and dialysis discontinuation, improving the survival of patients who became dialysis independent. Rapid disease response is important for renal recovery, and sFLCs are predictive of the probability of renal response and of the time required for it.

  8. Is quantum probability rational?

    PubMed

    Houston, Alasdair I; Wiesner, Karoline

    2013-06-01

    We concentrate on two aspects of the article by Pothos & Busemeyer (P&B): the relationship between classical and quantum probability and quantum probability as a basis for rational decisions. We argue that the mathematical relationship between classical and quantum probability is not quite what the authors claim. Furthermore, it might be premature to regard quantum probability as the best practical rational scheme for decision making.

  9. Predicted probabilities' relationship to inclusion probabilities.

    PubMed

    Fang, Di; Chong, Jenny; Wilson, Jeffrey R

    2015-05-01

    It has been shown that under a general multiplicative intercept model for risk, case-control (retrospective) data can be analyzed by maximum likelihood as if they had arisen prospectively, up to an unknown multiplicative constant, which depends on the relative sampling fraction. (1) With suitable auxiliary information, retrospective data can also be used to estimate response probabilities. (2) In other words, predictive probabilities obtained without adjustments from retrospective data will likely be different from those obtained from prospective data. We highlighted this using binary data from Medicare to determine the probability of readmission into the hospital within 30 days of discharge, which is particularly timely because Medicare has begun penalizing hospitals for certain readmissions. (3).
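
    The point can be illustrated with a small simulation (a sketch under assumed parameter values, not the authors' analysis): logistic regression fitted to case-control data recovers the covariate effect, but the intercept, and hence any unadjusted predicted probability, is offset by the log of the relative sampling fractions.

    ```python
    # A minimal simulation sketch of the point above: logistic regression fit to
    # case-control (outcome-dependent) samples recovers the slope, but the
    # intercept -- and hence unadjusted predicted probabilities -- is offset by
    # the log of the relative sampling fractions.  All numbers are illustrative.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(3)
    beta0, beta1 = -3.0, 1.0                       # true prospective model
    x = rng.normal(size=200_000)
    p = 1.0 / (1.0 + np.exp(-(beta0 + beta1 * x)))
    y = rng.binomial(1, p)

    # Case-control sampling: keep all cases, a 10% subsample of controls.
    keep = (y == 1) | (rng.random(y.size) < 0.10)
    x_cc, y_cc = x[keep], y[keep]

    fit = LogisticRegression(C=1e6, max_iter=1000).fit(x_cc[:, None], y_cc)  # ~unpenalized
    print("slope  (true 1.0):", round(float(fit.coef_[0, 0]), 3))
    print("intercept (true -3.0):", round(float(fit.intercept_[0]), 3))
    print("expected intercept shift log(1/0.10):", round(np.log(1 / 0.10), 3))
    ```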

  10. A Non-Parametric Surrogate-based Test of Significance for T-Wave Alternans Detection

    PubMed Central

    Nemati, Shamim; Abdala, Omar; Bazán, Violeta; Yim-Yeh, Susie; Malhotra, Atul; Clifford, Gari

    2010-01-01

    We present a non-parametric adaptive surrogate test that allows for the differentiation of statistically significant T-Wave Alternans (TWA) from alternating patterns that can be solely explained by the statistics of noise. The proposed test is based on estimating the distribution of noise induced alternating patterns in a beat sequence from a set of surrogate data derived from repeated reshuffling of the original beat sequence. Thus, in assessing the significance of the observed alternating patterns in the data no assumptions are made about the underlying noise distribution. In addition, since the distribution of noise-induced alternans magnitudes is calculated separately for each sequence of beats within the analysis window, the method is robust to data non-stationarities in both noise and TWA. The proposed surrogate method for rejecting noise was compared to the standard noise rejection methods used with the Spectral Method (SM) and the Modified Moving Average (MMA) techniques. Using a previously described realistic multi-lead model of TWA, and real physiological noise, we demonstrate the proposed approach reduces false TWA detections, while maintaining a lower missed TWA detection compared with all the other methods tested. A simple averaging-based TWA estimation algorithm was coupled with the surrogate significance testing and was evaluated on three public databases; the Normal Sinus Rhythm Database (NRSDB), the Chronic Heart Failure Database (CHFDB) and the Sudden Cardiac Death Database (SCDDB). Differences in TWA amplitudes between each database were evaluated at matched heart rate (HR) intervals from 40 to 120 beats per minute (BPM). Using the two-sample Kolmogorov-Smirnov test, we found that significant differences in TWA levels exist between each patient group at all decades of heart rates. The most marked difference was generally found at higher heart rates, and the new technique resulted in a larger margin of separability between patient populations than
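
    A generic, stripped-down sketch of the surrogate idea (not the paper's full multi-lead TWA pipeline): the observed alternans magnitude is compared with a null distribution obtained by reshuffling the beat order within the analysis window, so no parametric noise model is assumed.

    ```python
    # A minimal, generic sketch of the surrogate idea described above: the
    # observed alternans magnitude (mean even-odd beat difference) is compared
    # with a null distribution obtained by reshuffling beat order within the
    # analysis window.  This is not the paper's full multi-lead TWA pipeline.
    import numpy as np

    rng = np.random.default_rng(4)

    def alternans_magnitude(beats):
        """Mean absolute difference between the average even and odd beats."""
        return np.abs(beats[0::2].mean(axis=0) - beats[1::2].mean(axis=0)).mean()

    def surrogate_p_value(beats, n_surrogates=2000):
        observed = alternans_magnitude(beats)
        null = np.array([alternans_magnitude(rng.permutation(beats))
                         for _ in range(n_surrogates)])
        return observed, (np.sum(null >= observed) + 1) / (n_surrogates + 1)

    # Synthetic T-wave amplitude series: 128 beats x 60 samples, with a small
    # alternating component added on top of noise.
    n_beats, n_samples = 128, 60
    beats = rng.normal(0.0, 20.0, size=(n_beats, n_samples))                    # noise (uV)
    beats += 5.0 * np.outer((-1.0) ** np.arange(n_beats), np.ones(n_samples))   # TWA
    print("observed magnitude (uV), surrogate p-value:", surrogate_p_value(beats))
    ```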

  11. Racing To Understand Probability.

    ERIC Educational Resources Information Center

    Van Zoest, Laura R.; Walker, Rebecca K.

    1997-01-01

    Describes a series of lessons designed to supplement textbook instruction of probability by addressing the ideas of "equally likely,""not equally likely," and "fairness," as well as to introduce the difference between theoretical and experimental probability. Presents four lessons using The Wind Racer games to study probability. (ASK)

  12. Dependent Probability Spaces

    ERIC Educational Resources Information Center

    Edwards, William F.; Shiflett, Ray C.; Shultz, Harris

    2008-01-01

    The mathematical model used to describe independence between two events in probability has a non-intuitive consequence called dependent spaces. The paper begins with a very brief history of the development of probability, then defines dependent spaces, and reviews what is known about finite spaces with uniform probability. The study of finite…

  13. Searching with probabilities

    SciTech Connect

    Palay, A.J.

    1985-01-01

    This book examines how probability distributions can be used as a knowledge representation technique. It presents a mechanism that can be used to guide a selective search algorithm to solve a variety of tactical chess problems. Topics covered include probabilities and searching; the B* algorithm; chess probabilities in practice; examples; results; and future work.

  14. Ultrasonography-Based Thyroidal and Perithyroidal Anatomy and Its Clinical Significance

    PubMed Central

    Ha, Eun Ju; Lee, Jeong Hyun

    2015-01-01

    Ultrasonography (US)-guided procedures such as ethanol ablation, radiofrequency ablation, laser ablation, selective nerve block, and core needle biopsy have been widely applied in the diagnosis and management of thyroid and neck lesions. For a safe and effective US-guided procedure, knowledge of neck anatomy, particularly that of the nerves, vessels, and other critical structures, is essential. However, most previous reports evaluated neck anatomy based on cadavers, computed tomography, or magnetic resonance imaging rather than US. Therefore, the aim of this article was to elucidate US-based thyroidal and perithyroidal anatomy, as well as its clinical significance in the use of prevention techniques for complications during the US-guided procedures. Knowledge of these areas may be helpful for maximizing the efficacy and minimizing the complications of US-guided procedures for the thyroid and other neck lesions. PMID:26175574

  15. Chronic Arsenic Poisoning Probably Caused by Arsenic-Based Pesticides: Findings from an Investigation Study of a Household.

    PubMed

    Li, Yongfang; Ye, Feng; Wang, Anwei; Wang, Da; Yang, Boyi; Zheng, Quanmei; Sun, Guifan; Gao, Xinghua

    2016-01-16

    In addition to naturally occurring arsenic, man-made arsenic-based compounds are other sources of arsenic exposure. In 2013, our group identified 12 suspected arsenicosis patients in a household (32 living members). Of them, eight members were diagnosed with skin cancer. Interestingly, all of these patients had lived in the household prior to 1989. An investigation revealed that approximately 2 tons of arsenic-based pesticides had been previously placed near a well that had supplied drinking water to the family from 1973 to 1989. The current arsenic level in the well water was 620 μg/L. No other high arsenic wells were found near the family's residence. Based on these findings, it is possible to infer that the skin lesions exhibited by these family members were caused by long-term exposure to well water contaminated with arsenic-based pesticides. Additionally, biochemical analysis showed that the individuals exposed to arsenic had higher levels of aspartate aminotransferase and γ-glutamyl transpeptidase than those who were not exposed. These findings might indicate the presence of liver dysfunction in the arsenic-exposed individuals. This report elucidates the effects of arsenical compounds on the occurrence of high levels of arsenic in the environment and emphasizes the severe human health impact of arsenic exposure. PMID:26784217

  18. In All Probability, Probability is not All

    ERIC Educational Resources Information Center

    Helman, Danny

    2004-01-01

    The national lottery is often portrayed as a game of pure chance with no room for strategy. This misperception seems to stem from the application of probability instead of expectancy considerations, and can be utilized to introduce the statistical concept of expectation.

  19. Response of the San Andreas fault to the 1983 Coalinga-Nuñez earthquakes: an application of interaction-based probabilities for Parkfield

    USGS Publications Warehouse

    Toda, Shinji; Stein, Ross S.

    2002-01-01

    The Parkfield-Cholame section of the San Andreas fault, site of an unfulfilled earthquake forecast in 1985, is the best monitored section of the world's most closely watched fault. In 1983, the M = 6.5 Coalinga and M = 6.0 Nuñez events struck 25 km northeast of Parkfield. Seismicity rates climbed for 18 months along the creeping section of the San Andreas north of Parkfield and dropped for 6 years along the locked section to the south. Right-lateral creep also slowed or reversed from Parkfield south. Here we calculate that the Coalinga sequence increased the shear and Coulomb stress on the creeping section, causing the rate of small shocks to rise until the added stress was shed by additional slip. However, the 1983 events decreased the shear and Coulomb stress on the Parkfield segment, causing surface creep and seismicity rates to drop. We use these observations to cast the likelihood of a Parkfield earthquake into an interaction-based probability, which includes both the renewal of stress following the 1966 Parkfield earthquake and the stress transfer from the 1983 Coalinga events. We calculate that the 1983 shocks dropped the 10-year probability of a M ∼ 6 Parkfield earthquake by 22% (from 54 ± 22% to 42 ± 23%) and that the probability did not recover until about 1991, when seismicity and creep resumed. Our analysis may thus explain why the Parkfield earthquake did not strike in the 1980s, but not why it was absent in the 1990s. We calculate a 58 ± 17% probability of a M ∼ 6 Parkfield earthquake during 2001–2011.
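
    The two ingredients of such an interaction-based probability, a renewal probability conditioned on the open interval since the last event and a stress step converted to a clock advance or delay via Δt = ΔCFF / (stressing rate), can be sketched as follows; the Brownian Passage Time parameters, stress step, and stressing rate below are illustrative assumptions, not the values used in the study.

    ```python
    # A minimal sketch of an interaction-based probability of the kind described
    # above: a Brownian Passage Time (BPT) renewal probability conditioned on the
    # open interval since the last event, with a coseismic stress step converted
    # to a clock advance/delay via dt = dCFF / stressing_rate.  All parameter
    # values here are illustrative, not those of the Parkfield study.
    import numpy as np
    from scipy.stats import invgauss

    def bpt_conditional_prob(mean_ri, aperiodicity, t_elapsed, dt_forecast):
        """P(event in (t, t + dt) | no event by t) for a BPT renewal model."""
        shape = mean_ri / aperiodicity**2                     # inverse-Gaussian shape
        dist = invgauss(mu=aperiodicity**2, scale=shape)      # mean = mean_ri
        num = dist.cdf(t_elapsed + dt_forecast) - dist.cdf(t_elapsed)
        return num / dist.sf(t_elapsed)

    mean_ri, alpha = 25.0, 0.5          # mean recurrence (yr) and aperiodicity (assumed)
    t_since_last = 17.0                 # years since the last characteristic event
    clock_change = -0.5 / 0.15          # dCFF = -0.5 bar, stressing rate 0.15 bar/yr -> delay

    p_plain = bpt_conditional_prob(mean_ri, alpha, t_since_last, 10.0)
    p_interact = bpt_conditional_prob(mean_ri, alpha, t_since_last + clock_change, 10.0)
    print(f"10-yr probability without / with the stress step: {p_plain:.2f} / {p_interact:.2f}")
    ```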

  20. A fast method for computing high-significance disease association in large population-based studies.

    PubMed

    Kimmel, Gad; Shamir, Ron

    2006-09-01

    Because of rapid progress in genotyping techniques, many large-scale, genomewide disease-association studies are now under way. Typically, the disorders examined are multifactorial, and, therefore, researchers seeking association must consider interactions among loci and between loci and other factors. One of the challenges of large disease-association studies is obtaining accurate estimates of the significance of discovered associations. The linkage disequilibrium between SNPs makes the tests highly dependent, and dependency worsens when interactions are tested. The standard way of assigning significance (P value) is by a permutation test. Unfortunately, in large studies, it is prohibitively slow to compute low P values by this method. We present here a faster algorithm for accurately calculating low P values in case-control association studies. Unlike with several previous methods, we do not assume a specific distribution of the traits, given the genotypes. Our method is based on importance sampling and on accounting for the decay in linkage disequilibrium along the chromosome. The algorithm is dramatically faster than the standard permutation test. On data sets mimicking medium-to-large association studies, it speeds up computation by a factor of 5,000-100,000, sometimes reducing running times from years to minutes. Thus, our method significantly increases the problem-size range for which accurate, meaningful association results are attainable. PMID:16909386
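
    The core reason importance sampling makes very small P values tractable can be shown with a generic toy example (this illustrates the principle only, not the authors' algorithm, which additionally exploits the decay of linkage disequilibrium along the chromosome):

    ```python
    # A generic illustration (not the authors' algorithm) of why importance
    # sampling makes very small p-values tractable: to estimate P(Z > z) for a
    # standard normal test statistic, draws are taken from a proposal shifted
    # into the tail and reweighted by the likelihood ratio.
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(5)
    z, n = 5.0, 20_000                      # target threshold and number of draws

    # Naive Monte Carlo: with P(Z>5) ~ 2.9e-7, 20k draws almost surely give 0 hits.
    naive = np.mean(rng.normal(size=n) > z)

    # Importance sampling: propose from N(z, 1) and reweight by phi(x)/phi(x - z).
    x = rng.normal(loc=z, size=n)
    weights = norm.pdf(x) / norm.pdf(x, loc=z)
    is_estimate = np.mean(weights * (x > z))

    print(f"exact               : {norm.sf(z):.3e}")
    print(f"naive MC            : {naive:.3e}")
    print(f"importance sampling : {is_estimate:.3e}")
    ```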

  1. Deriving statistical significance maps for SVM based image classification and group comparisons.

    PubMed

    Gaonkar, Bilwaj; Davatzikos, Christos

    2012-01-01

    Population based pattern analysis and classification for quantifying structural and functional differences between diverse groups has been shown to be a powerful tool for the study of a number of diseases, and is quite commonly used especially in neuroimaging. The alternative to these pattern analysis methods, namely mass univariate methods such as voxel based analysis and all related methods, cannot detect multivariate patterns associated with group differences, and are not particularly suitable for developing individual-based diagnostic and prognostic biomarkers. A commonly used pattern analysis tool is the support vector machine (SVM). Unlike univariate statistical frameworks for morphometry, analytical tools for statistical inference are unavailable for the SVM. In this paper, we show that null distributions ordinarily obtained by permutation tests using SVMs can be analytically approximated from the data. The analytical computation takes a small fraction of the time it takes to do an actual permutation test, thereby rendering it possible to quickly create statistical significance maps derived from SVMs. Such maps are critical for understanding imaging patterns of group differences and interpreting which anatomical regions are important in determining the classifier's decision.

  2. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data

    NASA Astrophysics Data System (ADS)

    Gloger, Oliver; Tönnies, Klaus; Mensel, Birger; Völzke, Henry

    2015-11-01

    In epidemiological studies as well as in clinical practice, the amount of medical image data produced has increased strongly over the last decade. In this context, organ segmentation in MR volume data has gained increasing attention for medical applications. Especially in large-scale population-based studies, organ volumetry is highly relevant and requires exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automated methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework for a two-step probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are subsequently refined by several extended segmentation strategies. We present a three-class support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high-quality subject-specific parenchyma probability maps. Several refinement strategies, including a final shape-based 3D level set segmentation technique, are used in subsequent processing modules to segment the renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from the parenchymal volume, which is important for analyzing renal function. Volume errors and Dice coefficients show that our framework outperforms existing approaches.

  3. Probability Forecasting Using Monte Carlo Simulation

    NASA Astrophysics Data System (ADS)

    Duncan, M.; Frisbee, J.; Wysack, J.

    2014-09-01

    Space Situational Awareness (SSA) is defined as the knowledge and characterization of all aspects of space. SSA is now a fundamental and critical component of space operations. Increased dependence on our space assets has in turn led to a greater need for accurate, near real-time knowledge of all space activities. With the growth of the orbital debris population, satellite operators are performing collision avoidance maneuvers more frequently. Frequent maneuver execution expends fuel and reduces the operational lifetime of the spacecraft. Thus, new and more sophisticated methods of collision threat characterization must be implemented. The collision probability metric is used operationally to quantify the collision risk. The collision probability is typically calculated days into the future, so that high-risk and potentially high-risk conjunction events are identified early enough to develop an appropriate course of action. As the time horizon to the conjunction event is reduced, the collision probability changes. A significant change in the collision probability will change the satellite mission stakeholder's course of action. Constructing a method for estimating how the collision probability will evolve therefore improves operations by providing satellite operators with a new piece of information, namely an estimate or 'forecast' of how the risk will change as the time to the event is reduced. Collision probability forecasting is a predictive process in which the future risk of a conjunction event is estimated. The method utilizes a Monte Carlo simulation that produces a likelihood distribution for a given collision threshold. Using known state and state uncertainty information, the simulation generates a set of possible trajectories for a given space object pair. Each new trajectory produces a unique event geometry at the time of close approach. Given state uncertainty information for both objects, a collision probability value can be computed for every trial. This yields a
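
    The Monte Carlo ingredient described above can be sketched in a few lines (illustrative covariances, miss vector, and hard-body radius; not the operational tool): relative position at the time of closest approach is sampled from the combined position covariance, and the collision probability is the fraction of trials falling inside the combined hard-body radius.

    ```python
    # A minimal sketch of the Monte Carlo ingredient described above.  The
    # covariances, nominal miss vector, and hard-body radius are illustrative
    # assumptions, not operational values.
    import numpy as np

    rng = np.random.default_rng(6)

    def mc_collision_probability(rel_miss, cov_combined, hard_body_radius_m,
                                 n_trials=1_000_000):
        samples = rng.multivariate_normal(rel_miss, cov_combined, size=n_trials)
        return np.mean(np.linalg.norm(samples, axis=1) < hard_body_radius_m)

    rel_miss = np.array([120.0, -40.0, 30.0])                 # nominal miss vector (m)
    cov = np.diag([80.0**2, 60.0**2, 40.0**2])                # combined covariance (m^2)
    print("Pc ~", mc_collision_probability(rel_miss, cov, hard_body_radius_m=20.0))
    ```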

  4. Significantly enhanced robustness and electrochemical performance of flexible carbon nanotube-based supercapacitors by electrodepositing polypyrrole

    NASA Astrophysics Data System (ADS)

    Chen, Yanli; Du, Lianhuan; Yang, Peihua; Sun, Peng; Yu, Xiang; Mai, Wenjie

    2015-08-01

    Here, we report robust, flexible CNT-based supercapacitor (SC) electrodes fabricated by electrodepositing polypyrrole (PPy) on freestanding vacuum-filtered CNT film. These electrodes demonstrate significantly improved mechanical properties (with the ultimate tensile strength of 16 MPa), and greatly enhanced electrochemical performance (5.6 times larger areal capacitance). The major drawback of conductive polymer electrodes is the fast capacitance decay caused by structural breakdown, which decreases cycling stability but this is not observed in our case. All-solid-state SCs assembled with the robust CNT/PPy electrodes exhibit excellent flexibility, long lifetime (95% capacitance retention after 10,000 cycles) and high electrochemical performance (a total device volumetric capacitance of 4.9 F/cm3). Moreover, a flexible SC pack is demonstrated to light up 53 LEDs or drive a digital watch, indicating the broad potential application of our SCs for portable/wearable electronics.

  5. A Network-Based Method to Assess the Statistical Significance of Mild Co-Regulation Effects

    PubMed Central

    Horvát, Emőke-Ágnes; Zhang, Jitao David; Uhlmann, Stefan; Sahin, Özgür; Zweig, Katharina Anna

    2013-01-01

    Recent development of high-throughput, multiplexing technology has initiated projects that systematically investigate interactions between two types of components in biological networks, for instance transcription factors and promoter sequences, or microRNAs (miRNAs) and mRNAs. In terms of network biology, such screening approaches primarily attempt to elucidate relations between biological components of two distinct types, which can be represented as edges between nodes in a bipartite graph. However, it is often desirable not only to determine regulatory relationships between nodes of different types, but also to understand the connection patterns of nodes of the same type. Especially interesting is the co-occurrence of two nodes of the same type, i.e., the number of their common neighbours, which current high-throughput screening analysis fails to address. The co-occurrence gives the number of circumstances under which both of the biological components are influenced in the same way. Here we present SICORE, a novel network-based method to detect pairs of nodes with a statistically significant co-occurrence. We first show the stability of the proposed method on artificial data sets: when randomly adding and deleting observations we obtain reliable results even with noise exceeding the expected level in large-scale experiments. Subsequently, we illustrate the viability of the method based on the analysis of a proteomic screening data set to reveal regulatory patterns of human microRNAs targeting proteins in the EGFR-driven cell cycle signalling system. Since statistically significant co-occurrence may indicate functional synergy and the mechanisms underlying canalization, and thus hold promise in drug target identification and therapeutic development, we provide a platform-independent implementation of SICORE with a graphical user interface as a novel tool in the arsenal of high-throughput screening analysis. PMID:24039936

  6. Visualization of the significance of Receiver Operating Characteristics based on confidence ellipses

    NASA Astrophysics Data System (ADS)

    Sarlis, Nicholas V.; Christopoulos, Stavros-Richard G.

    2014-03-01

    The Receiver Operating Characteristics (ROC) is used for the evaluation of prediction methods in various disciplines like meteorology, geophysics, complex system physics, medicine etc. The estimation of the significance of a binary prediction method, however, remains a cumbersome task and is usually done by repeating the calculations by Monte Carlo. The FORTRAN code provided here simplifies this problem by evaluating the significance of binary predictions for a family of ellipses which are based on confidence ellipses and cover the whole ROC space. Catalogue identifier: AERY_v1_0 Program summary URL:http://cpc.cs.qub.ac.uk/summaries/AERY_v1_0.html Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 11511 No. of bytes in distributed program, including test data, etc.: 72906 Distribution format: tar.gz Programming language: FORTRAN. Computer: Any computer supporting a GNU FORTRAN compiler. Operating system: Linux, MacOS, Windows. RAM: 1Mbyte Classification: 4.13, 9, 14. Nature of problem: The Receiver Operating Characteristics (ROC) is used for the evaluation of prediction methods in various disciplines like meteorology, geophysics, complex system physics, medicine etc. The estimation of the significance of a binary prediction method, however, remains a cumbersome task and is usually done by repeating the calculations by Monte Carlo. The FORTRAN code provided here simplifies this problem by evaluating the significance of binary predictions for a family of ellipses which are based on confidence ellipses and cover the whole ROC space. Solution method: Using the statistics of random binary predictions for a given value of the predictor threshold ɛt, one can construct the corresponding confidence ellipses. The envelope of these corresponding confidence ellipses is estimated when

  7. Functional activity maps based on significance measures and Independent Component Analysis.

    PubMed

    Martínez-Murcia, F J; Górriz, J M; Ramírez, J; Puntonet, C G; Illán, I A

    2013-07-01

    The use of functional imaging has been proven very helpful for the process of diagnosis of neurodegenerative diseases, such as Alzheimer's Disease (AD). In many cases, the analysis of these images is performed by manual reorientation and visual interpretation. Therefore, new statistical techniques to perform a more quantitative analysis are needed. In this work, a new statistical approximation to the analysis of functional images, based on significance measures and Independent Component Analysis (ICA) is presented. After the images preprocessing, voxels that allow better separation of the two classes are extracted, using significance measures such as the Mann-Whitney-Wilcoxon U-Test (MWW) and Relative Entropy (RE). After this feature selection step, the voxels vector is modelled by means of ICA, extracting a few independent components which will be used as an input to the classifier. Naive Bayes and Support Vector Machine (SVM) classifiers are used in this work. The proposed system has been applied to two different databases. A 96-subjects Single Photon Emission Computed Tomography (SPECT) database from the "Virgen de las Nieves" Hospital in Granada, Spain, and a 196-subjects Positron Emission Tomography (PET) database from the Alzheimer's Disease Neuroimaging Initiative (ADNI). Values of accuracy up to 96.9% and 91.3% for SPECT and PET databases are achieved by the proposed system, which has yielded many benefits over methods proposed on recent works.
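
    A minimal sketch of the described pipeline, significance-based feature selection with the Mann-Whitney-Wilcoxon U-test, ICA compression, then SVM classification, run on synthetic data; the threshold, component count, and data are illustrative, and in a real analysis the selection step should be nested inside the cross-validation.

    ```python
    # A minimal sketch of the pipeline described above -- significance-based voxel
    # selection (Mann-Whitney U), ICA feature extraction, then SVM classification --
    # run on synthetic data.  Thresholds and component counts are illustrative.
    import numpy as np
    from scipy.stats import mannwhitneyu
    from sklearn.decomposition import FastICA
    from sklearn.model_selection import cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.svm import SVC

    rng = np.random.default_rng(7)
    n_subjects, n_voxels = 96, 2000
    X = rng.normal(size=(n_subjects, n_voxels))
    y = np.repeat([0, 1], n_subjects // 2)
    X[y == 1, :50] += 0.8                         # 50 "discriminative" voxels

    # 1. Voxel selection by Mann-Whitney-Wilcoxon U-test significance.
    #    (In practice this step should be nested inside the cross-validation.)
    p_vals = np.array([mannwhitneyu(X[y == 0, j], X[y == 1, j]).pvalue
                       for j in range(n_voxels)])
    selected = X[:, p_vals < 0.001]

    # 2. ICA compression of the selected voxels, 3. SVM classification.
    clf = make_pipeline(FastICA(n_components=8, random_state=0), SVC(kernel="linear"))
    print("selected voxels:", selected.shape[1])
    print("CV accuracy: %.2f" % cross_val_score(clf, selected, y, cv=5).mean())
    ```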

  8. Future challenges for vection research: definitions, functional significance, measures, and neural bases

    PubMed Central

    Palmisano, Stephen; Allison, Robert S.; Schira, Mark M.; Barry, Robert J.

    2015-01-01

    This paper discusses four major challenges facing modern vection research. Challenge 1 (Defining Vection) outlines the different ways that vection has been defined in the literature and discusses their theoretical and experimental ramifications. The term vection is most often used to refer to visual illusions of self-motion induced in stationary observers (by moving, or simulating the motion of, the surrounding environment). However, vection is increasingly being used to also refer to non-visual illusions of self-motion, visually mediated self-motion perceptions, and even general subjective experiences (i.e., “feelings”) of self-motion. The common thread in all of these definitions is the conscious subjective experience of self-motion. Thus, Challenge 2 (Significance of Vection) tackles the crucial issue of whether such conscious experiences actually serve functional roles during self-motion (e.g., in terms of controlling or guiding the self-motion). After more than 100 years of vection research there has been surprisingly little investigation into its functional significance. Challenge 3 (Vection Measures) discusses the difficulties with existing subjective self-report measures of vection (particularly in the context of contemporary research), and proposes several more objective measures of vection based on recent empirical findings. Finally, Challenge 4 (Neural Basis) reviews the recent neuroimaging literature examining the neural basis of vection and discusses the hurdles still facing these investigations. PMID:25774143

  9. Statistically significant contrasts between EMG waveforms revealed using wavelet-based functional ANOVA.

    PubMed

    McKay, J Lucas; Welch, Torrence D J; Vidakovic, Brani; Ting, Lena H

    2013-01-01

    We developed wavelet-based functional ANOVA (wfANOVA) as a novel approach for comparing neurophysiological signals that are functions of time. Temporal resolution is often sacrificed by analyzing such data in large time bins, increasing statistical power by reducing the number of comparisons. We performed ANOVA in the wavelet domain because differences between curves tend to be represented by a few temporally localized wavelets, which we transformed back to the time domain for visualization. We compared wfANOVA and ANOVA performed in the time domain (tANOVA) on both experimental electromyographic (EMG) signals from responses to perturbation during standing balance across changes in peak perturbation acceleration (3 levels) and velocity (4 levels) and on simulated data with known contrasts. In experimental EMG data, wfANOVA revealed the continuous shape and magnitude of significant differences over time without a priori selection of time bins. However, tANOVA revealed only the largest differences at discontinuous time points, resulting in features with later onsets and shorter durations than those identified using wfANOVA (P < 0.02). Furthermore, wfANOVA required significantly fewer (~1/4×; P < 0.015) significant F tests than tANOVA, resulting in post hoc tests with increased power. In simulated EMG data, wfANOVA identified known contrast curves with a high level of precision (r² = 0.94 ± 0.08) and performed better than tANOVA across noise levels (P << 0.01). Therefore, wfANOVA may be useful for revealing differences in the shape and magnitude of neurophysiological signals (e.g., EMG, firing rates) across multiple conditions with both high temporal resolution and high statistical power. PMID:23100136
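
    The wfANOVA idea can be sketched with PyWavelets and SciPy as follows (a simplified illustration that omits the paper's post hoc contrast modelling; the wavelet, decomposition level, and Bonferroni threshold are assumptions): each curve is transformed to the wavelet domain, an F-test is run per coefficient, non-significant coefficients are zeroed, and the result is inverted back to the time domain.

    ```python
    # A minimal sketch of the wfANOVA idea: transform each curve to the wavelet
    # domain (PyWavelets), run an ANOVA per wavelet coefficient, zero the
    # non-significant coefficients, and invert the transform to visualise the
    # significant contrast in the time domain.  This simplified version omits the
    # paper's post hoc contrast modelling; threshold and wavelet are assumptions.
    import numpy as np
    import pywt
    from scipy.stats import f_oneway

    rng = np.random.default_rng(8)
    n_per_group, n_time = 30, 128
    t = np.linspace(0.0, 1.0, n_time)
    bump = np.exp(-((t - 0.4) / 0.05) ** 2)                 # localised group difference

    groups = [rng.normal(size=(n_per_group, n_time)),
              rng.normal(size=(n_per_group, n_time)) + 1.5 * bump,
              rng.normal(size=(n_per_group, n_time)) + 3.0 * bump]

    def to_coeffs(x):
        return np.concatenate(pywt.wavedec(x, "db4", level=4))

    coeffs = [np.array([to_coeffs(curve) for curve in g]) for g in groups]
    _, p = f_oneway(*coeffs)                                 # one F-test per coefficient

    # Keep only significant coefficients (Bonferroni) of the group-3 mean curve.
    template = pywt.wavedec(groups[2].mean(axis=0), "db4", level=4)
    flat = np.concatenate(template)
    flat[p >= 0.05 / flat.size] = 0.0
    sizes = np.cumsum([len(c) for c in template])[:-1]
    significant_curve = pywt.waverec(np.split(flat, sizes), "db4")
    print("time points with a significant reconstructed effect:",
          int(np.sum(np.abs(significant_curve) > 1e-6)))
    ```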

  10. Estimating the probability of occurrence of earthquakes (M>6) in the Western part of the Corinth rift using fault-based and classical seismotectonic approaches.

    NASA Astrophysics Data System (ADS)

    Boiselet, Aurelien; Scotti, Oona; Lyon-Caen, Hélène

    2014-05-01

    -SISCOR Working Group. On the basis of this consensual logic tree, median probability of occurrences of M>=6 events were computed for the region of study. Time-dependent models (Brownian Passage time and Weibull probability distributions) were also explored. The probability of a M>=6.0 event is found to be greater in the western region compared to the eastern part of the Corinth rift, whether a fault-based or a classical seismotectonic approach is used. Percentile probability estimates are also provided to represent the range of uncertainties in the results. The percentile results show that, in general, probability estimates following the classical approach (based on the definition of seismotectonic source zones), cover the median values estimated following the fault-based approach. On the contrary, the fault-based approach in this region is still affected by a high degree of uncertainty, because of the poor constraints on the 3D geometries of the faults and the high uncertainties in their slip rates.

  11. Probability of satellite collision

    NASA Technical Reports Server (NTRS)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.

  12. A computer simulated phantom study of tomotherapy dose optimization based on probability density functions (PDF) and potential errors caused by low reproducibility of PDF

    SciTech Connect

    Sheng, Ke; Cai Jing; Brookeman, James; Molloy, Janelle; Christopher, John; Read, Paul

    2006-09-15

    Lung tumor motion trajectories measured by four-dimensional CT or dynamic MRI can be converted to a probability density function (PDF), which describes the probability of the tumor being at a certain position, for PDF-based treatment planning. Using this method in simulated sequential tomotherapy, we study the dose reduction in normal tissues and, more importantly, the effect of PDF reproducibility on the accuracy of dosimetry. For these purposes, realistic PDFs were obtained from two dynamic MRI scans of a healthy volunteer within a 2-week interval. The first PDF was accumulated from a 300 s scan and the second PDF was calculated from variable scan times ranging from 5 s (one breathing cycle) to 300 s. Optimized beam fluences based on the second PDF were delivered to the hypothetical gross target volume (GTV) of a lung phantom that moved following the first PDF. The reproducibility between the two PDFs varied from low (78%) to high (94.8%) as the second scan time increased from 5 s to 300 s. When a highly reproducible PDF was used in optimization, the dose coverage of the GTV was maintained; the phantom lung volume receiving 10%-20% of the prescription dose was reduced by 40%-50% and the mean phantom lung dose was reduced by 9.6%. However, optimization based on a PDF with low reproducibility resulted in a 50% underdosed GTV. The dosimetric error increased nearly exponentially as the PDF error increased. Therefore, although the dose to tissue surrounding the tumor can in theory be reduced by PDF-based treatment planning, the reliability and applicability of this method depend strongly on whether a reproducible PDF exists and is measurable. By correlating the dosimetric error with the PDF error, a useful guideline for PDF data acquisition and patient qualification for PDF-based planning can be derived.

  13. Incidence of late rectal bleeding in high-dose conformal radiotherapy of prostate cancer using equivalent uniform dose-based and dose-volume-based normal tissue complication probability models

    SciTech Connect

    Soehn, Matthias . E-mail: Matthias.Soehn@med.uni-tuebingen.de; Yan Di; Liang Jian; Meldolesi, Elisa; Vargas, Carlos; Alber, Markus

    2007-03-15

    Purpose: Accurate modeling of rectal complications based on dose-volume histogram (DVH) data is necessary to allow safe dose escalation in radiotherapy of prostate cancer. We applied different equivalent uniform dose (EUD)-based and dose-volume-based normal tissue complication probability (NTCP) models to rectal wall DVHs and follow-up data for 319 prostate cancer patients to identify the dosimetric factors most predictive for Grade ≥2 rectal bleeding. Methods and Materials: Data for 319 patients treated at the William Beaumont Hospital with three-dimensional conformal radiotherapy (3D-CRT) under an adaptive radiotherapy protocol were used for this study. The following models were considered: (1) Lyman model and (2) logit formula with DVH reduced to generalized EUD, (3) serial reconstruction unit (RU) model, (4) Poisson-EUD model, and (5) mean dose- and (6) cutoff dose-logistic regression models. The parameters and their confidence intervals were determined using maximum likelihood estimation. Results: Of the patients, 51 (16.0%) showed Grade 2 or higher bleeding. As assessed qualitatively and quantitatively, the Lyman- and Logit-EUD, serial RU, and Poisson-EUD models fitted the data very well. Rectal wall mean dose did not correlate with Grade 2 or higher bleeding. For the cutoff dose model, the volume receiving >73.7 Gy showed the most significant correlation with bleeding. However, this model fitted the data more poorly than the EUD-based models. Conclusions: Our study clearly confirms a volume effect for late rectal bleeding. This can be described very well by the EUD-like models, of which the serial RU and Poisson-EUD models can describe the data with only two parameters. Dose-volume-based cutoff-dose models performed worse.
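    The gEUD reduction and the probit/logit dose-response mappings named above are standard formulas; the short sketch below is a generic illustration, not the authors' implementation, and the toy DVH and parameter values are placeholders rather than the fitted values reported in the study.

```python
# Minimal sketch of EUD-based NTCP evaluation (not the authors' code).
# gEUD reduces a DVH to a single dose; Lyman (probit) and logit models
# then map gEUD to a complication probability. All numbers below are
# illustrative placeholders, not the fitted values from the study.
import numpy as np
from scipy.stats import norm

def geud(doses_gy, volumes, a):
    """Generalized EUD: (sum_i v_i * d_i^a)^(1/a) with v_i normalized."""
    v = np.asarray(volumes, dtype=float)
    v = v / v.sum()
    d = np.asarray(doses_gy, dtype=float)
    return (np.sum(v * d**a)) ** (1.0 / a)

def ntcp_lyman(eud, td50, m):
    """Lyman probit model on the EUD-reduced dose."""
    return norm.cdf((eud - td50) / (m * td50))

def ntcp_logit(eud, td50, gamma50):
    """Logit model: 1 / (1 + (TD50/EUD)^(4*gamma50))."""
    return 1.0 / (1.0 + (td50 / eud) ** (4.0 * gamma50))

# Toy differential DVH of a rectal wall (dose bins in Gy, relative volumes)
doses = np.arange(2.5, 80.0, 5.0)
vols = np.exp(-0.04 * doses)          # hypothetical DVH shape

eud = geud(doses, vols, a=8.0)        # large 'a' ~ serial-organ behaviour
print(f"gEUD = {eud:.1f} Gy")
print(f"Lyman NTCP = {ntcp_lyman(eud, td50=80.0, m=0.14):.3f}")
print(f"Logit NTCP = {ntcp_logit(eud, td50=80.0, gamma50=2.0):.3f}")
```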

  14. Probability distributions for magnetotellurics

    SciTech Connect

    Stodt, John A.

    1982-11-01

    Estimates of the magnetotelluric transfer functions can be viewed as ratios of two complex random variables. It is assumed that the numerator and denominator are governed approximately by a joint complex normal distribution. Under this assumption, probability distributions are obtained for the magnitude, squared magnitude, logarithm of the squared magnitude, and the phase of the estimates. Normal approximations to the distributions are obtained by calculating mean values and variances from error propagation, and the distributions are plotted with their normal approximations for different percentage errors in the numerator and denominator of the estimates, ranging from 10% to 75%. The distribution of the phase is approximated well by a normal distribution for the range of errors considered, while the distribution of the logarithm of the squared magnitude is approximated by a normal distribution for a much larger range of errors than is the distribution of the squared magnitude. The distribution of the squared magnitude is most sensitive to the presence of noise in the denominator of the estimate, in which case the true distribution deviates significantly from normal behavior as the percentage errors exceed 10%. In contrast, the normal approximation to the distribution of the logarithm of the magnitude is useful for errors as large as 75%.
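    As a rough numerical companion to the ratio-of-complex-normals setup described above, the sketch below simulates a simplified case with independent noise in the numerator and denominator and compares the sampled distribution of the log squared magnitude with a first-order error-propagation (normal) approximation; the specific values and the independence assumption are illustrative, not taken from the paper.

```python
# Simplified simulation: transfer-function estimate Z = N/D with complex
# normal noise in numerator and denominator, compared against a normal
# approximation of log|Z|^2 obtained from first-order error propagation.
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
pct_err = 0.30                       # 30% relative error in numerator and denominator

num0, den0 = 2.0 + 1.0j, 1.0 + 0.5j  # "true" numerator and denominator
sig_n = pct_err * abs(num0) / np.sqrt(2)
sig_d = pct_err * abs(den0) / np.sqrt(2)

num = num0 + sig_n * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
den = den0 + sig_d * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
log_mag2 = np.log(np.abs(num / den) ** 2)

# First-order error propagation for log|Z|^2 = log|N|^2 - log|D|^2
mean_approx = np.log(abs(num0) ** 2 / abs(den0) ** 2)
var_approx = (2 * sig_n / abs(num0)) ** 2 + (2 * sig_d / abs(den0)) ** 2

print(f"simulated  mean={log_mag2.mean():.3f}  var={log_mag2.var():.3f}")
print(f"propagated mean={mean_approx:.3f}  var={var_approx:.3f}")
```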

  15. Probability and amounts of yogurt intake are differently affected by sociodemographic, economic, and lifestyle factors in adults and the elderly-results from a population-based study.

    PubMed

    Possa, Gabriela; de Castro, Michelle Alessandra; Marchioni, Dirce Maria Lobo; Fisberg, Regina Mara; Fisberg, Mauro

    2015-08-01

    The aim of this population-based cross-sectional health survey (N = 532) was to investigate the factors associated with the probability and amounts of yogurt intake in Brazilian adults and the elderly. A structured questionnaire was used to obtain data on demographics, socioeconomic information, presence of morbidities and lifestyle and anthropometric characteristics. Food intake was evaluated using two nonconsecutive 24-hour dietary recalls and a Food Frequency Questionnaire. Approximately 60% of the subjects were classified as yogurt consumers. In the logistic regression model, yogurt intake was associated with smoking (odds ratio [OR], 1.98), female sex (OR, 2.12), and age 20 to 39 years (OR, 3.11). Per capita family income and being a nonsmoker were factors positively associated with the amount of yogurt consumption (coefficients, 0.61 and 3.73, respectively), whereas the level of education of the head of household was inversely associated (coefficient, 0.61). In this study, probability and amounts of yogurt intake are differently affected by demographic, socioeconomic, and lifestyle factors in adults and the elderly.

  16. Implementation of a web based universal exchange and inference language for medicine: Sparse data, probabilities and inference in data mining of clinical data repositories.

    PubMed

    Robson, Barry; Boray, Srinidhi

    2015-11-01

    We extend Q-UEL, our universal exchange language for interoperability and inference in healthcare and biomedicine, to the more traditional fields of public health surveys. These are the type associated with screening, epidemiological and cross-sectional studies, and cohort studies, in some cases similar to clinical trials. One challenge is that there is some degree of split between (a) frequentist notions of probability, that is, classical measures based only on the idea of counting and proportion and on the classical biostatistics used in the above conservative disciplines, and (b) more subjectivist notions of uncertainty, belief, reliability, or confidence often used in automated inference and decision support systems. Samples in the above kind of public health survey are typically small compared with our earlier "Big Data" mining efforts. An issue addressed here is how much impact on decisions sparse data should have. We describe a new Q-UEL compatible toolkit including a data analytics application DiracMiner that also delivers more standard biostatistical results, DiracBuilder that uses its output to build Hyperbolic Dirac Nets (HDN) for decision support, and HDNcoherer that ensures that probabilities are mutually consistent. Use is exemplified by participating in a real-world health-screening project, and also by deployment in an industrial platform called the BioIngine, a cognitive computing platform for health management.

  17. Size effects on the open probability of two-state ion channel system in cell membranes using microcanonical formalism based on gamma function

    NASA Astrophysics Data System (ADS)

    Erdem, Riza; Aydiner, Ekrem

    2016-08-01

    Ion channel systems are a class of proteins that reside in the membranes of all biological cells and form conduction pores that regulate the transport of ions into and out of cells. They can be investigated theoretically in the microcanonical formalism, since the number of accessible states can be easily evaluated by using the Stirling approximation to deal with factorials. In this work, we have used the gamma function (Γ(n)) to solve the two-state (open-close) channel model without any approximation. New values are calculated for the open probability (p0), and the relative error between our numerical results and the approximate ones obtained using Stirling's formula is presented. This error, (p0,app − p0)/p0, is significant for small channel systems.
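    The sketch below is not the paper's microcanonical derivation; it only illustrates the computational point, namely that evaluating factorials exactly through the gamma function (via gammaln) and through Stirling's formula gives visibly different two-state averages for small channel numbers N. The Boltzmann-weighted two-state model and the weight ratio x are illustrative assumptions.

```python
# Illustration: exact gamma-function factorials versus Stirling-approximated
# factorials in a two-state channel average; the Stirling error matters for
# small channel numbers N.
import numpy as np
from scipy.special import gammaln

def ln_fact_exact(n):
    return gammaln(np.asarray(n, dtype=float) + 1.0)

def ln_fact_stirling(n):
    n = np.asarray(n, dtype=float)
    # Stirling: ln n! ~ n ln n - n (with 0! treated as 1)
    return np.where(n > 0, n * np.log(np.maximum(n, 1.0)) - n, 0.0)

def open_probability(N, x, ln_fact):
    """<n_open>/N for N two-state channels with Boltzmann weight x per open
    channel; multiplicities W(n) = N!/(n!(N-n)!) evaluated via ln_fact."""
    n = np.arange(N + 1)
    ln_w = ln_fact(N) - ln_fact(n) - ln_fact(N - n)
    w = np.exp(ln_w + n * np.log(x))
    return np.sum(n * w) / (N * np.sum(w))

x = 0.8                                   # hypothetical open/closed weight ratio
for N in (5, 20, 100, 1000):
    p_exact = open_probability(N, x, ln_fact_exact)
    p_appr = open_probability(N, x, ln_fact_stirling)
    print(f"N={N:5d}  p0_exact={p_exact:.4f}  p0_stirling={p_appr:.4f}  "
          f"rel.err={(p_appr - p_exact) / p_exact:+.2%}")
```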

  18. Normal probability plots with confidence.

    PubMed

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means of judging whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal probability plot based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods.
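    A minimal sketch of the augmented plot is given below. For brevity it uses pointwise intervals derived from the Beta distribution of uniform order statistics; the paper's contribution is the construction of simultaneous 1-α intervals, which would require wider bands than those shown here.

```python
# Normal probability plot with (pointwise, simplified) interval bands.
import numpy as np
from scipy import stats
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
n, alpha_level = 50, 0.05
x = np.sort(rng.normal(loc=10.0, scale=2.0, size=n))

i = np.arange(1, n + 1)
p = (i - 0.375) / (n + 0.25)               # Blom plotting positions
q = stats.norm.ppf(p)                       # theoretical normal quantiles

# Pointwise band for the i-th order statistic: Beta(i, n-i+1) quantiles of
# uniform order statistics pushed through the fitted normal quantile function.
mu, sigma = x.mean(), x.std(ddof=1)
lo = mu + sigma * stats.norm.ppf(stats.beta.ppf(alpha_level / 2, i, n - i + 1))
hi = mu + sigma * stats.norm.ppf(stats.beta.ppf(1 - alpha_level / 2, i, n - i + 1))

plt.plot(q, x, "o", label="ordered sample")
plt.plot(q, mu + sigma * q, "-", label="fitted line")
plt.fill_between(q, lo, hi, alpha=0.2, label="pointwise 95% bands")
plt.xlabel("theoretical normal quantiles")
plt.ylabel("ordered observations")
plt.legend()
plt.show()
```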

  19. Impact of the use of an alcohol-based hand sanitizer in the home on reduction in probability of infection by respiratory and enteric viruses.

    PubMed

    Tamimi, A H; Maxwell, S; Edmonds, S L; Gerba, C P

    2015-11-01

    The goal of this study was to determine the reduction in risk of infection by viruses with the use of an alcohol-based hand sanitizer, used in addition to routine hand washing, in family members in households. A quantitative microbial risk model was used to determine the probability of infection from the concentration of virus on the hands. The model incorporated variation in hand size, frequency of touching orifices (nose, mouth, eyes), and percent transfer to the site of infection, as well as, dose-response for each virus. Data on the occurrence of virus on household members' hands from an intervention study using MS-2 coliphage was used to determine the reduction of viruses on the hands pre- and post-intervention. It was found that the risk of rhinovirus, rotavirus or norovirus infection after the intervention was reduced by 47-98% depending upon the initial concentration of virus on the hands. PMID:25825988

  20. Determining the Probability of Violating Upper-Level Wind Constraints for the Launch of Minuteman III Ballistic Missiles At Vandenberg Air Force Base

    NASA Technical Reports Server (NTRS)

    Shafer, Jaclyn A.; Brock, Tyler M.

    2013-01-01

    The 30th Operational Support Squadron Weather Flight (30 OSSWF) provides comprehensive weather services to the space program at Vandenberg Air Force Base (VAFB) in California. One of their responsibilities is to monitor upper-level winds to ensure safe launch operations of the Minuteman III ballistic missile. The 30 OSSWF requested the Applied Meteorology Unit (AMU) analyze VAFB sounding data to determine the probability of violating (PoV) upper-level thresholds for wind speed and shear constraints specific to this launch vehicle, and to develop a graphical user interface (GUI) that will calculate the PoV of each constraint on the day of launch. The AMU suggested also including forecast sounding data from the Rapid Refresh (RAP) model. This would provide further insight for the launch weather officers (LWOs) when determining if a wind constraint violation will occur over the next few hours, and help to improve the overall upper winds forecast on launch day.

  2. Probability with Roulette

    ERIC Educational Resources Information Center

    Marshall, Jennings B.

    2007-01-01

    This article describes how roulette can be used to teach basic concepts of probability. Various bets are used to illustrate the computation of expected value. A betting system shows variations in patterns that often appear in random events.
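    As a small companion to the article's discussion (my own illustration, not taken from the article), the expected value of two common American-roulette bets can be computed in a few lines:

```python
# Expected value of a roulette bet on an American wheel (38 pockets).
def expected_value(p_win, payout, stake=1.0):
    """E[profit] = p*payout - (1-p)*stake."""
    return p_win * payout - (1.0 - p_win) * stake

print("straight-up (pays 35:1):", expected_value(1 / 38, 35))   # about -0.0526
print("red (pays 1:1):        ", expected_value(18 / 38, 1))    # about -0.0526
```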

  3. Quantum computing and probability.

    PubMed

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  4. Launch Collision Probability

    NASA Technical Reports Server (NTRS)

    Bollenbacher, Gary; Guptill, James D.

    1999-01-01

    This report analyzes the probability of a launch vehicle colliding with one of the nearly 10,000 tracked objects orbiting the Earth, given that an object on a near-collision course with the launch vehicle has been identified. Knowledge of the probability of collision throughout the launch window can be used to avoid launching at times when the probability of collision is unacceptably high. The analysis in this report assumes that the positions of the orbiting objects and the launch vehicle can be predicted as a function of time and therefore that any tracked object which comes close to the launch vehicle can be identified. The analysis further assumes that the position uncertainty of the launch vehicle and the approaching space object can be described with position covariance matrices. With these and some additional simplifying assumptions, a closed-form solution is developed using two approaches. The solution shows that the probability of collision is a function of position uncertainties, the size of the two potentially colliding objects, and the nominal separation distance at the point of closest approach. The impact of the simplifying assumptions on the accuracy of the final result is assessed and the application of the results to the Cassini mission, launched in October 1997, is described. Other factors that affect the probability of collision are also discussed. Finally, the report offers alternative approaches that can be used to evaluate the probability of collision.
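    A generic version of the short-encounter collision-probability integral can be sketched as follows (an illustration under simplifying assumptions, not necessarily the report's exact closed-form derivation): the relative-position error density in the encounter plane is integrated over a disk whose radius is the sum of the two object radii. The miss distance, covariance, and radius used below are hypothetical.

```python
# Numerical 2-D collision-probability integral in the encounter plane.
import numpy as np

def collision_probability(miss_vec, cov, hard_body_radius, n=801):
    """miss_vec: nominal 2-D miss distance in the encounter plane [m]
    cov: 2x2 combined position covariance in that plane [m^2]
    hard_body_radius: sum of the two object radii [m]"""
    r = hard_body_radius
    xs = np.linspace(-r, r, n)
    ys = np.linspace(-r, r, n)
    xg, yg = np.meshgrid(xs, ys)
    inside = xg**2 + yg**2 <= r**2          # integration domain: the hard-body disk

    inv = np.linalg.inv(cov)
    det = np.linalg.det(cov)
    dx = xg - miss_vec[0]
    dy = yg - miss_vec[1]
    quad = inv[0, 0]*dx*dx + 2*inv[0, 1]*dx*dy + inv[1, 1]*dy*dy
    pdf = np.exp(-0.5 * quad) / (2 * np.pi * np.sqrt(det))

    cell = (xs[1] - xs[0]) * (ys[1] - ys[0])
    return float(np.sum(pdf[inside]) * cell)

# Hypothetical encounter: 120 m nominal miss, anisotropic 1-sigma errors.
cov = np.array([[80.0**2, 0.0], [0.0, 40.0**2]])
print(collision_probability(miss_vec=(120.0, 0.0), cov=cov, hard_body_radius=15.0))
```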

  5. Evaluation and significance of hyperchromatic crowded groups (HCG) in liquid-based paps

    PubMed Central

    Chivukula, Mamatha; Austin, R Marshall; Shidham, Vinod B

    2007-01-01

    Objective Hyperchromatic crowded groups (HCG), a term first introduced into the cytology literature by DeMay in 1995, are commonly observed in Pap tests and may rarely be associated with serious but difficult-to-interpret lesions. In this study, we specifically defined HCG as dark crowded cell groups with more than 15 cells that can be identified at 10× screening magnification. Methods We evaluated consecutive liquid-based (Surepath) Pap tests from 601 women (age 17–74 years, mean age 29.4 yrs) and observed HCG in 477 cases. In all 477 HCG cases, Pap tests were found to be satisfactory and to contain an endocervical sample. HCG were easily detectable at 10× screening magnification (size up to 400 µm, mean 239.5 µm) and ranged from 1 to 50 (mean 19.5) per Pap slide. Results HCG predominantly represented three-dimensional groups of endocervical cells with some nuclear overlap (379/477 – 79%), reactive endocervical cells with relatively prominent nucleoli and some nuclear crowding (29/477 – 6%), clusters of inflammatory cells (25/477 – 5.2%), parabasal cells (22/477 – 4.6%), and endometrial cells (1/477 – 0.2%). Epithelial cell abnormalities (ECA) were present in only 21 of 477 cases (4.6%). 18 of 21 women with HCG-associated ECA were less than 40 years old; only 3 were ≥40 years. HCG-associated final abnormal Pap test interpretations were as follows: ASCUS (6/21 – 28%), LSIL (12/21 – 57%), ASC-H (2/21 – 9.5%), and HSIL/CIN2-3 (3/21 – 14%). The association of HCG with ECA was statistically significant (p = 0.0174, chi-square test). In patients with ECA, biopsy results were available in 10 cases, and 4 cases of biopsy-proven CIN2/3 were detected. Among these four cases, HCG in the Pap tests, in retrospect, represented the lesional high grade cells in three cases (one HSIL case and two ASC-H cases). Interestingly, none of the 124 cases without HCG were found to have an epithelial cell abnormality. Conclusion We conclude: a. HCG are observed in a high

  6. Coronary CT angiography findings based on smoking status: Do ex-smokers and never-smokers share a low probability of developing coronary atherosclerosis?

    PubMed

    Yi, Minkyung; Chun, Eun Ju; Lee, Min Su; Lee, Jaebong; Choi, Sang Il

    2015-12-01

    The risk of coronary artery disease (CAD) in ex-smokers has not been elucidated, although smoking is considered one of the major risk factors for CAD. We investigate subclinical coronary atherosclerosis (SCA) in asymptomatic subjects with coronary computed tomography angiography (CCTA), according to smoking status, and determine whether ex-smokers share a low probability of developing CAD with never-smokers. We retrospectively enrolled 6930 self-referred asymptomatic adults who underwent both coronary artery calcium scoring (CACS) and CCTA. The prevalence and characteristics of SCA were assessed according to smoking status (never-, ex- and current smokers). After adjusting for variable risk factors, we used multivariate logistic regression to obtain adjusted odds ratios (AOR) for high CACS (>100), SCA (any plaque), significant stenosis (>50% luminal stenosis) and each plaque type (non-calcified, mixed and calcified plaque) among the three groups. The prevalence of SCA was highest in the ex-smokers (35.4%), and the prevalence of significant stenosis in ex-smokers (6.9%) was as high as in current smokers (6.4%). However, after adjusting for variable risk factors, SCA was significantly correlated with both ex-smokers (AOR 1.21) and current smokers (AOR 1.25), whereas significant stenosis was correlated only with current smokers (AOR 1.91). The association between SCA and ex-smokers is as strong as with current smokers, although significant stenosis is correlated only with current smokers; thus, not only quitting smoking but also never initiating smoking would be helpful in reducing the progression of SCA.

  7. Coronary Risk Assessment by Point-Based vs. Equation-Based Framingham Models: Significant Implications for Clinical Care

    PubMed Central

    Gordon, William J.; Polansky, Jesse M.; John Boscardin, W.; Fung, Kathy Z.

    2010-01-01

    BACKGROUND US cholesterol guidelines use original and simplified versions of the Framingham model to estimate future coronary risk and thereby classify patients into risk groups with different treatment strategies. We sought to compare risk estimates and risk group classification generated by the original, complex Framingham model and the simplified, point-based version. METHODS We assessed 2,543 subjects age 20–79 from the 2001–2006 National Health and Nutrition Examination Surveys (NHANES) for whom Adult Treatment Panel III (ATP-III) guidelines recommend formal risk stratification. For each subject, we calculated the 10-year risk of major coronary events using the original and point-based Framingham models, and then compared differences in these risk estimates and whether these differences would place subjects into different ATP-III risk groups (<10% risk, 10–20% risk, or >20% risk). Using standard procedures, all analyses were adjusted for survey weights, clustering, and stratification to make our results nationally representative. RESULTS Among 39 million eligible adults, the original Framingham model categorized 71% of subjects as having “moderate” risk (<10% risk of a major coronary event in the next 10 years), 22% as having “moderately high” (10–20%) risk, and 7% as having “high” (>20%) risk. Estimates of coronary risk by the original and point-based models often differed substantially. The point-based system classified 15% of adults (5.7 million) into different risk groups than the original model, with 10% (3.9 million) misclassified into higher risk groups and 5% (1.8 million) into lower risk groups, for a net impact of classifying 2.1 million adults into higher risk groups. These risk group misclassifications would impact guideline-recommended drug treatment strategies for 25–46% of affected subjects. Patterns of misclassifications varied significantly by gender, age, and underlying CHD risk. CONCLUSIONS Compared to the original

  8. Determining the Probability of Violating Upper-Level Wind Constraints for the Launch of Minuteman III Ballistic Missiles at Vandenberg Air Force Base

    NASA Technical Reports Server (NTRS)

    Shafer, Jaclyn A.; Brock, Tyler M.

    2012-01-01

    The 30th Operational Support Squadron Weather Flight (30 OSSWF) provides comprehensive weather services to the space program at Vandenberg Air Force Base (VAFB) in California. One of their responsibilities is to monitor upper-level winds to ensure safe launch operations of the Minuteman III ballistic missile. The 30 OSSWF tasked the Applied Meteorology Unit (AMU) to analyze VAFB sounding data with the goal of determining the probability of violating (PoV) their upper-level thresholds for wind speed and shear constraints specific to this launch vehicle, and to develop a tool that will calculate the PoV of each constraint on the day of launch. In order to calculate the probability of exceeding each constraint, the AMU collected and analyzed historical data from VAFB. The historical sounding data were retrieved from the National Oceanic and Atmospheric Administration Earth System Research Laboratory archive for the years 1994-2011 and then stratified into four sub-seasons: January-March, April-June, July-September, and October-December. The AMU determined the theoretical distributions that best fit the maximum wind speed and maximum wind shear datasets and applied this information when calculating the averages and standard deviations needed for the historical and real-time PoV calculations. In addition, the AMU included forecast sounding data from the Rapid Refresh model. This information provides further insight for the launch weather officers (LWOs) when determining if a wind constraint violation will occur over the next few hours on the day of launch. The AMU developed an interactive graphical user interface (GUI) in Microsoft Excel using Visual Basic for Applications. The GUI displays the critical sounding data easily and quickly for LWOs on day of launch. This tool will replace the existing one used by the 30 OSSWF, assist the LWOs in determining the probability of exceeding specific wind threshold values, and help to improve the overall upper winds forecast for
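    The exceedance calculation at the core of the PoV idea can be illustrated with a simple sketch (assumed data and an assumed Gumbel fit, purely for illustration; the AMU fitted several candidate distributions to the actual VAFB sounding archive):

```python
# Fit a distribution to historical maximum wind speeds for a sub-season and
# report the probability of exceeding a launch-commit threshold.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
# Hypothetical historical maxima (kt) for one sub-season; real data would come
# from the VAFB sounding archive.
max_wind = rng.gumbel(loc=55.0, scale=8.0, size=500)

# A Gumbel (extreme-value) fit is used here purely as an example.
loc, scale = stats.gumbel_r.fit(max_wind)
threshold_kt = 80.0
pov = 1.0 - stats.gumbel_r.cdf(threshold_kt, loc=loc, scale=scale)
print(f"P(max wind > {threshold_kt} kt) ~ {pov:.3f}")
```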

  9. 47 CFR 1.1623 - Probability calculation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... computed to no less than three significant digits. Probabilities will be truncated to the number of significant digits used in a particular lottery. (b) Divide the total number of applicants into 1.00 to... than .40, then multiply each such intermediate probability by the ratio of .40 to such sum. Divide...

  10. Normal tissue complication probability model parameter estimation for xerostomia in head and neck cancer patients based on scintigraphy and quality of life assessments

    PubMed Central

    2012-01-01

    Background With advances in modern radiotherapy (RT), many patients with head and neck (HN) cancer can be effectively cured. However, xerostomia is a common complication in patients after RT for HN cancer. The purpose of this study was to use the Lyman–Kutcher–Burman (LKB) model to derive parameters for the normal tissue complication probability (NTCP) for xerostomia based on scintigraphy assessments and quality of life (QoL) questionnaires. We performed validation tests of the Quantitative Analysis of Normal Tissue Effects in the Clinic (QUANTEC) guidelines against prospectively collected QoL and salivary scintigraphic data. Methods Thirty-one patients with HN cancer were enrolled. Salivary excretion factors (SEFs) measured by scintigraphy and QoL data from self-reported questionnaires were used for NTCP modeling to describe the incidence of grade 3+ xerostomia. The NTCP parameters estimated from the QoL and SEF datasets were compared. Model performance was assessed using Pearson’s chi-squared test, Nagelkerke’s R2, the area under the receiver operating characteristic curve, and the Hosmer–Lemeshow test. The negative predictive value (NPV) was checked for the rate of correctly predicting the lack of incidence. Pearson’s chi-squared test was used to test the goodness of fit and association. Results Using the LKB NTCP model and assuming n=1, the dose for uniform irradiation of the whole or partial volume of the parotid gland that results in 50% probability of a complication (TD50) and the slope of the dose–response curve (m) were determined from the QoL and SEF datasets, respectively. The NTCP-fitted parameters for local disease were TD50=43.6 Gy and m=0.18 with the SEF data, and TD50=44.1 Gy and m=0.11 with the QoL data. The rate of grade 3+ xerostomia for treatment plans meeting the QUANTEC guidelines was specifically predicted, with a NPV of 100%, using either the QoL or SEF dataset. Conclusions Our study shows the agreement between the NTCP
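    For reference, the LKB model with n = 1 reduces to a probit function of the organ mean dose; the sketch below evaluates it with the SEF-derived parameters quoted above (TD50 = 43.6 Gy, m = 0.18). It reproduces only the model equation, not the authors' fitting procedure, and the example doses are arbitrary.

```python
# LKB NTCP with n = 1 (whole-organ mean dose) using the SEF-fitted parameters
# reported in the abstract (TD50 = 43.6 Gy, m = 0.18).
from math import erf, sqrt

def lkb_ntcp(mean_dose_gy, td50, m):
    t = (mean_dose_gy - td50) / (m * td50)
    return 0.5 * (1.0 + erf(t / sqrt(2.0)))   # standard normal CDF

for d in (20.0, 30.0, 40.0, 50.0):
    print(f"parotid mean dose {d:.0f} Gy -> NTCP = {lkb_ntcp(d, 43.6, 0.18):.2f}")
```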

  11. Ultimate limits to error probabilities for ionospheric models based on solar geophysical indices and how these compare with the state of the art

    NASA Technical Reports Server (NTRS)

    Nisbet, J. S.; Stehle, C. G.

    1981-01-01

    An ideal model based on a given set of geophysical indices is defined as a model that provides a least squares fit to the data set as a function of the indices considered. Satellite measurements of electron content for three stations at different magnetic latitudes were used to provide such data sets which were each fitted to the geophysical indices. The magnitude of the difference between the measured value and the derived equation for the data set was used to estimate the probability of making an error greater than a given magnitude for such an ideal model. Atmospheric Explorer C data is used to examine the causes of the fluctuations and suggestions are made about how real improvements can be made in ionospheric forecasting ability. Joule heating inputs in the auroral electrojets are related to the AL and AU magnetic indices. Magnetic indices based on the time integral of the energy deposited in the electrojets are proposed for modeling processes affected by auroral zone heating.

  12. Children with Significant Hearing Loss: Learning to Listen, Talk, and Read--Evidence-Based Best Practices

    ERIC Educational Resources Information Center

    Martindale, Maura

    2007-01-01

    A considerable body of evidence obtained from studies of children who are deaf and who use cochlear implants has been useful in guiding practices that lead to higher levels of English language proficiency and age-appropriate literacy. Both (a) research conducted at implant centers and (b) educational programs with significant numbers of children…

  13. No Bridge Too High: Infants Decide Whether to Cross Based on the Probability of Falling not the Severity of the Potential Fall

    ERIC Educational Resources Information Center

    Kretch, Kari S.; Adolph, Karen E.

    2013-01-01

    Do infants, like adults, consider both the probability of falling and the severity of a potential fall when deciding whether to cross a bridge? Crawling and walking infants were encouraged to cross bridges varying in width over a small drop-off, a large drop-off, or no drop-off. Bridge width affects the probability of falling, whereas drop-off…

  14. Beyond No Significant Differences: A Closer Look at the Educational Impact of Computer-Based Instruction

    ERIC Educational Resources Information Center

    Mandernach, B. Jean

    2006-01-01

    There is a host of research examining the equivalence of alternative modes of technology-facilitated educational delivery (such as computer-based or online instruction) and traditional classroom instruction. While various studies have promoted each of these modalities for specific populations or topic areas, the bulk of research supports relative…

  15. Approximating Integrals Using Probability

    ERIC Educational Resources Information Center

    Maruszewski, Richard F., Jr.; Caudle, Kyle A.

    2005-01-01

    As part of a discussion on Monte Carlo methods, the authors outline how to use probability expectations to approximate the value of a definite integral. The purpose of this paper is to elaborate on this technique and then to show several examples using Visual Basic as a programming tool. It is an interesting method because it combines two branches of…
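    A short Python rendering of the technique (the article itself uses Visual Basic) is shown below: a definite integral is rewritten as an expectation and estimated by averaging over uniform samples.

```python
# Monte Carlo integration: integral_a^b f(x) dx = (b - a) * E[f(U)],
# with U ~ Uniform(a, b).
import numpy as np

def mc_integral(f, a, b, n=100_000, seed=0):
    rng = np.random.default_rng(seed)
    u = rng.uniform(a, b, size=n)
    return (b - a) * f(u).mean()

# Example: integral of x^2 on [0, 3] is exactly 9.
est = mc_integral(lambda x: x**2, 0.0, 3.0)
print(f"Monte Carlo estimate: {est:.4f} (exact 9)")
```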

  16. A Unifying Probability Example.

    ERIC Educational Resources Information Center

    Maruszewski, Richard F., Jr.

    2002-01-01

    Presents an example from probability and statistics that ties together several topics including the mean and variance of a discrete random variable, the binomial distribution and its particular mean and variance, the sum of independent random variables, the mean and variance of the sum, and the central limit theorem. Uses Excel to illustrate these…
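    A brief numerical companion to the topics listed above (my own illustration, not from the article): a binomial variable as a sum of Bernoulli trials, its mean and variance, and a central-limit-theorem check.

```python
# Binomial(n, p) as a sum of Bernoulli(p) variables, with mean n*p and
# variance n*p*(1-p); the standardized sum is approximately standard normal.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n, p, reps = 40, 0.3, 50_000

sums = rng.binomial(1, p, size=(reps, n)).sum(axis=1)   # sums of Bernoullis
print("sample mean/var :", sums.mean(), sums.var())
print("theory mean/var :", n * p, n * p * (1 - p))

z = (sums - n * p) / np.sqrt(n * p * (1 - p))
print("P(Z <= 1) empirical vs normal:", (z <= 1).mean(), stats.norm.cdf(1))
```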

  17. An exclusive human milk-based diet in extremely premature infants reduces the probability of remaining on total parenteral nutrition: A reanalysis of the data

    Technology Transfer Automated Retrieval System (TEKTRAN)

    We have previously shown that an exclusively human-milk-based diet is beneficial for extremely premature infants who are at risk for necrotizing enterocolitis (NEC). However, no significant difference in the other primary study endpoint, the length of time on total parenteral nutrition (TPN), was fo...

  18. Innovations in individual feature history management - The significance of feature-based temporal model

    USGS Publications Warehouse

    Choi, J.; Seong, J.C.; Kim, B.; Usery, E.L.

    2008-01-01

    A feature relies on three dimensions (space, theme, and time) for its representation. Even though spatiotemporal models have been proposed, they have principally focused on the spatial changes of a feature. In this paper, a feature-based temporal model is proposed to represent the changes of both space and theme independently. The proposed model modifies the ISO's temporal schema and adds a new explicit temporal relationship structure that stores the temporal topological relationship with the ISO's temporal primitives of a feature in order to keep track of feature history. The explicit temporal relationship can enhance query performance on feature history by removing topological comparison during the query process. Further, a prototype system has been developed to test the proposed feature-based temporal model by querying land parcel history in Athens, Georgia. The result of a temporal query on individual feature history shows the efficiency of the explicit temporal relationship structure. © Springer Science+Business Media, LLC 2007.

  19. On Probability Domains

    NASA Astrophysics Data System (ADS)

    Frič, Roman; Papčo, Martin

    2010-12-01

    Motivated by IF-probability theory (intuitionistic fuzzy), we study n-component probability domains in which each event represents a body of competing components and the range of a state represents a simplex Sn of n-tuples of possible rewards; the sum of the rewards is a number from [0,1]. For n=1 we get fuzzy events, for example a bold algebra, and the corresponding fuzzy probability theory can be developed within the category ID of D-posets (equivalently effect algebras) of fuzzy sets and sequentially continuous D-homomorphisms. For n=2 we get IF-events, i.e., pairs (μ, ν) of fuzzy sets μ, ν ∈ [0,1]^X such that μ(x) + ν(x) ≤ 1 for all x ∈ X, but we order our pairs (events) coordinatewise. Hence the structure of IF-events (where (μ1, ν1) ≤ (μ2, ν2) whenever μ1 ≤ μ2 and ν2 ≤ ν1) is different and, consequently, the resulting IF-probability theory models a different principle. The category ID is cogenerated by I = [0,1] (objects of ID are subobjects of powers I^X), has nice properties, and basic probabilistic notions and constructions are categorical. For example, states are morphisms. We introduce the category SnD cogenerated by Sn = {(x1, x2, ..., xn) ∈ I^n : x1 + x2 + ... + xn ≤ 1}, carrying the coordinatewise partial order, difference, and sequential convergence, and we show how basic probability notions can be defined within SnD.

  20. Significant Performance Enhancement in Asymmetric Supercapacitors based on Metal Oxides, Carbon nanotubes and Neutral Aqueous Electrolyte

    PubMed Central

    Singh, Arvinder; Chandra, Amreesh

    2015-01-01

    Amongst the materials being investigated for supercapacitor electrodes, carbon-based materials are the most widely studied. However, pure carbon materials suffer from inherent physical processes which limit the maximum specific energy and power that can be achieved in an energy storage device. Therefore, the use of carbon-based composites with suitable nano-materials is attaining prominence. The synergistic effect between the pseudocapacitive nanomaterials (high specific energy) and carbon (high specific power) is expected to deliver the desired improvements. We report the fabrication of a high-capacitance asymmetric supercapacitor based on electrodes of composites of SnO2 and V2O5 with multiwall carbon nanotubes and a neutral 0.5 M Li2SO4 aqueous electrolyte. The advantages of the fabricated asymmetric supercapacitors are compared with the results published in the literature. The widened operating voltage window is due to the higher over-potential of electrolyte decomposition and a large difference in the work functions of the used metal oxides. The charge-balanced device returns a specific capacitance of ~198 F g−1 with a corresponding specific energy of ~89 Wh kg−1 at 1 A g−1. The proposed composite systems have shown great potential for fabricating high performance supercapacitors. PMID:26494197

  1. Significant Performance Enhancement in Asymmetric Supercapacitors based on Metal Oxides, Carbon nanotubes and Neutral Aqueous Electrolyte

    NASA Astrophysics Data System (ADS)

    Singh, Arvinder; Chandra, Amreesh

    2015-10-01

    Amongst the materials being investigated for supercapacitor electrodes, carbon-based materials are the most widely studied. However, pure carbon materials suffer from inherent physical processes which limit the maximum specific energy and power that can be achieved in an energy storage device. Therefore, the use of carbon-based composites with suitable nano-materials is attaining prominence. The synergistic effect between the pseudocapacitive nanomaterials (high specific energy) and carbon (high specific power) is expected to deliver the desired improvements. We report the fabrication of a high-capacitance asymmetric supercapacitor based on electrodes of composites of SnO2 and V2O5 with multiwall carbon nanotubes and a neutral 0.5 M Li2SO4 aqueous electrolyte. The advantages of the fabricated asymmetric supercapacitors are compared with the results published in the literature. The widened operating voltage window is due to the higher over-potential of electrolyte decomposition and a large difference in the work functions of the used metal oxides. The charge-balanced device returns a specific capacitance of ~198 F g-1 with a corresponding specific energy of ~89 Wh kg-1 at 1 A g-1. The proposed composite systems have shown great potential for fabricating high performance supercapacitors.

  2. Automated Coronal Seismology: Curvelet Characterization of Probability Maps of Image Data with Oscillatory Signal

    NASA Astrophysics Data System (ADS)

    Young, C.; Ireland, J.

    2010-12-01

    Automated coronal seismology will require measurements of the structure that supports an oscillatory signal; for example, a measurement of the loop length of a transversely oscillating loop can be used to estimate the coronal magnetic field (Nakariakov & Ofman 2001). One of the results from the recently published Bayesian probability based automated oscillation detection algorithm (Ireland et al., 2010) is a probability map. This is an image of the probability that each pixel from a set of images contains an oscillatory signal. A map from a significant detection contains one or more clusters of high probability pixels dispersed amongst mostly pixels of low probability. These low probability pixels amount to noise, while the clusters of high probability are the desired signal. A visual inspection of the probability maps that contain significant signal reveals that the clusters of pixels contain structure that corresponds to physical regions in the original images, i.e., oscillating loops. A necessary step in using these oscillation probability maps is to extract and characterize these high probability regions. A natural choice for an appropriate representation of these structures, especially given their correspondence to real extended features such as loops, is the curvelet transform (Candes and Donoho, 1999; Candes et al., 2005). In this work we present a preliminary analysis of these probability maps using curvelets to isolate and characterize regions of high probability. The suitability of this technique for the pipeline processing of Solar Dynamics Observatory data is also discussed.

  3. Calibrating Subjective Probabilities Using Hierarchical Bayesian Models

    NASA Astrophysics Data System (ADS)

    Merkle, Edgar C.

    A body of psychological research has examined the correspondence between a judge's subjective probability of an event's outcome and the event's actual outcome. The research generally shows that subjective probabilities are noisy and do not match the "true" probabilities. However, subjective probabilities are still useful for forecasting purposes if they bear some relationship to true probabilities. The purpose of the current research is to exploit relationships between subjective probabilities and outcomes to create improved, model-based probabilities for forecasting. Once the model has been trained in situations where the outcome is known, it can then be used in forecasting situations where the outcome is unknown. These concepts are demonstrated using experimental psychology data, and potential applications are discussed.
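    A deliberately simplified, non-hierarchical sketch of the recalibration idea is given below: learn a mapping from stated probabilities to observed outcome frequencies on past events, then apply it to new forecasts. The chapter's approach is hierarchical Bayesian; plain logistic regression on the logit of the stated probability is used here only to convey the concept, and the simulated "overconfident judge" is an assumption.

```python
# Recalibrating subjective probabilities against known outcomes (simplified).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)

# Simulated training data: an overconfident judge whose stated probabilities
# are more extreme than the true event probabilities.
true_p = rng.uniform(0.05, 0.95, size=2000)
outcome = rng.binomial(1, true_p)
stated = 1 / (1 + np.exp(-2.0 * np.log(true_p / (1 - true_p))))

logit_stated = np.log(stated / (1 - stated)).reshape(-1, 1)
model = LogisticRegression().fit(logit_stated, outcome)

# Recalibrated probability for a new stated forecast of 0.9
new = np.array([[np.log(0.9 / 0.1)]])
print("recalibrated:", model.predict_proba(new)[0, 1])
```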

  4. Identification of Patient Benefit From Proton Therapy for Advanced Head and Neck Cancer Patients Based on Individual and Subgroup Normal Tissue Complication Probability Analysis

    SciTech Connect

    Jakobi, Annika; Bandurska-Luque, Anna; Stützer, Kristin; Haase, Robert; Löck, Steffen; Wack, Linda-Jacqueline; Mönnich, David; Thorwarth, Daniela; and others

    2015-08-01

    Purpose: The purpose of this study was to determine, by treatment plan comparison along with normal tissue complication probability (NTCP) modeling, whether a subpopulation of patients with head and neck squamous cell carcinoma (HNSCC) could be identified that would gain substantial benefit from proton therapy in terms of NTCP. Methods and Materials: For 45 HNSCC patients, intensity modulated radiation therapy (IMRT) was compared to intensity modulated proton therapy (IMPT). Physical dose distributions were evaluated as well as the resulting NTCP values, using modern models for acute mucositis, xerostomia, aspiration, dysphagia, laryngeal edema, and trismus. Patient subgroups were defined based on primary tumor location. Results: Generally, IMPT reduced the NTCP values while keeping similar target coverage for all patients. Subgroup analyses revealed a higher individual reduction of swallowing-related side effects by IMPT for patients with tumors in the upper head and neck area, whereas the risk reduction of acute mucositis was more pronounced in patients with tumors in the larynx region. More patients with tumors in the upper head and neck area had a reduction in NTCP of more than 10%. Conclusions: Subgrouping can help to identify patients who may benefit more than others from the use of IMPT and, thus, can be a useful tool for a preselection of patients in the clinic where there are limited PT resources. Because the individual benefit differs within a subgroup, the relative merits should additionally be evaluated by individual treatment plan comparisons.

  5. Nanosilver based anionic linear globular dendrimer with a special significant antiretroviral activity.

    PubMed

    Ardestani, Mehdi Shafiee; Fordoei, Alireza Salehi; Abdoli, Asghar; Ahangari Cohan, Reza; Bahramali, Golnaz; Sadat, Seyed Mehdi; Siadat, Seyed Davar; Moloudian, Hamid; Nassiri Koopaei, Nasser; Bolhasani, Azam; Rahimi, Pooneh; Hekmat, Soheila; Davari, Mehdi; Aghasadeghi, Mohammad Reza

    2015-05-01

    HIV infection causes a very complicated disease for which no vaccine has yet been approved, so the design and development of novel antiretroviral agents with specific applications of nanomedicine is a research subject of worldwide interest. In the current study, a novel silver complex with an anionic linear globular dendrimer was synthesized, characterized, and then assessed against the HIV replication pathway in vitro. The results showed a very good synthesis yield (up to 70%) for the nano-complex as well as a very potent, significant (P < 0.05) antiretroviral activity with non-severe toxic effects in comparison with Nevirapine as the standard drug in the positive control group. According to the present data, the silver-anionic linear globular dendrimer complex may have a promising future in inhibiting replication of HIV in clinical practice. PMID:25893388

  7. A Citation-Based Analysis and Review of Significant Papers on Timing and Time Perception.

    PubMed

    Teki, Sundeep

    2016-01-01

    Time is an important dimension of brain function, but little is yet known about the underlying cognitive principles and neurobiological mechanisms. The field of timing and time perception has witnessed tremendous growth and multidisciplinary interest in recent years with the advent of modern neuroimaging and neurophysiological approaches. In this article, I used a data mining approach to analyze the timing literature published by a select group of researchers (n = 202) during the period 2000-2015 and highlight important reviews as well as empirical articles that meet the criterion of a minimum of 100 citations. The qualifying articles (n = 150) are listed in a table along with key details such as number of citations, names of authors, year and journal of publication as well as a short summary of the findings of each study. The results of such a data-driven approach to literature review not only serve as a useful resource to any researcher interested in timing, but also provide a means to evaluate key papers that have significantly influenced the field and summarize recent progress and popular research trends in the field. Additionally, such analyses provide food for thought about future scientific directions and raise important questions about improving organizational structures to boost open science and progress in the field. I discuss exciting avenues for future research that have the potential to significantly advance our understanding of the neurobiology of timing, and propose the establishment of a new society, the Timing Research Forum, to promote open science and collaborative work within the highly diverse and multidisciplinary community of researchers in the field of timing and time perception. PMID:27471445

  8. A Citation-Based Analysis and Review of Significant Papers on Timing and Time Perception

    PubMed Central

    Teki, Sundeep

    2016-01-01

    Time is an important dimension of brain function, but little is yet known about the underlying cognitive principles and neurobiological mechanisms. The field of timing and time perception has witnessed tremendous growth and multidisciplinary interest in recent years with the advent of modern neuroimaging and neurophysiological approaches. In this article, I used a data mining approach to analyze the timing literature published by a select group of researchers (n = 202) during the period 2000–2015 and highlight important reviews as well as empirical articles that meet the criterion of a minimum of 100 citations. The qualifying articles (n = 150) are listed in a table along with key details such as number of citations, names of authors, year and journal of publication as well as a short summary of the findings of each study. The results of such a data-driven approach to literature review not only serve as a useful resource to any researcher interested in timing, but also provide a means to evaluate key papers that have significantly influenced the field and summarize recent progress and popular research trends in the field. Additionally, such analyses provide food for thought about future scientific directions and raise important questions about improving organizational structures to boost open science and progress in the field. I discuss exciting avenues for future research that have the potential to significantly advance our understanding of the neurobiology of timing, and propose the establishment of a new society, the Timing Research Forum, to promote open science and collaborative work within the highly diverse and multidisciplinary community of researchers in the field of timing and time perception. PMID:27471445

  9. Integrated statistical modelling of spatial landslide probability

    NASA Astrophysics Data System (ADS)

    Mergili, M.; Chu, H.-J.

    2015-09-01

    Statistical methods are commonly employed to estimate spatial probabilities of landslide release at the catchment or regional scale. Travel distances and impact areas are often computed by means of conceptual mass point models. The present work introduces a fully automated procedure extending and combining both concepts to compute an integrated spatial landslide probability: (i) The landslide inventory is subset into release and deposition zones. (ii) We employ a simple statistical approach to estimate the pixel-based landslide release probability. (iii) We use the cumulative probability density function of the angle of reach of the observed landslide pixels to assign an impact probability to each pixel. (iv) We introduce the zonal probability, i.e., the spatial probability that at least one landslide pixel occurs within a zone of defined size. We quantify this relationship by a set of empirical curves. (v) The integrated spatial landslide probability is defined as the maximum of the release probability and the product of the impact probability and the zonal release probability relevant for each pixel. We demonstrate the approach with a 637 km² study area in southern Taiwan, using an inventory of 1399 landslides triggered by Typhoon Morakot in 2009. We observe that (i) the average integrated spatial landslide probability over the entire study area corresponds reasonably well to the fraction of the observed landslide area; (ii) the model performs moderately well in predicting the observed spatial landslide distribution; (iii) the size of the release zone (or any other zone of spatial aggregation) influences the integrated spatial landslide probability to a much higher degree than the pixel-based release probability; (iv) removing the largest landslides from the analysis leads to an enhanced model performance.
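    Step (v) above is a simple per-pixel combination rule; the sketch below applies it to placeholder rasters (random arrays standing in for the release, impact, and zonal probability layers):

```python
# Integrated spatial landslide probability per pixel:
# max(release probability, impact probability * zonal release probability).
import numpy as np

rng = np.random.default_rng(4)
shape = (100, 100)

p_release = rng.uniform(0.0, 0.2, size=shape)   # step (ii): statistical release prob.
p_impact = rng.uniform(0.0, 1.0, size=shape)    # step (iii): from angle-of-reach CDF
p_zonal = rng.uniform(0.0, 0.6, size=shape)     # step (iv): >=1 release pixel in zone

p_integrated = np.maximum(p_release, p_impact * p_zonal)
print("mean integrated spatial landslide probability:", p_integrated.mean())
```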

  10. The Significant Frequency and Impact of Stealth (Nonviolent) Gender-Based Abuse Among College Women.

    PubMed

    Belknap, Joanne; Sharma, Nitika

    2014-05-29

    The prevalence, incidence, and impact of the gender-based abuse (GBA) of college women have been increasingly documented since the 1980s, with growing precision in the measurements and expanding identification of tactics. Although there is an obvious class bias in focusing on college women (compared to women of similar ages not attending college), it is important to address GBA among this population as they are at serious risk of sexual abuse (particularly incapacitated rape), intimate partner abuse (IPA), and stalking. This article addresses the stealth nature of the nonviolent GBAs of college women and how these abuses frequently operate under the radar of acknowledgment by society, the abusers, campus officials, the criminal legal system, and sometimes, the survivors.

  11. Methodology and Significance of Microsensor-based Oxygen Mapping in Plant Seeds – an Overview

    PubMed Central

    Rolletschek, Hardy; Stangelmayer, Achim; Borisjuk, Ljudmilla

    2009-01-01

    Oxygen deficiency is commonplace in seeds, and limits both their development and their germination. It is, therefore, of considerable relevance to crop production. While the underlying physiological basis of seed hypoxia has been known for some time, the lack of any experimental means of measuring the global or localized oxygen concentration within the seed has hampered further progress in this research area. The development of oxygen-sensitive microsensors now offers the capability to determine the localized oxygen status within a seed, and to study its dynamic adjustment both to changes in the ambient environment, and to the seed's developmental stage. This review illustrates the use of oxygen microsensors in seed research, and presents an overview of existing data with an emphasis on crop species. Oxygen maps, both static and dynamic, should serve to increase our basic understanding of seed physiology, as well as to facilitate upcoming breeding and biotechnology-based approaches for crop improvement. PMID:22412307

  12. Suffering and powerlessness: the significance of promoting participation in rights-based approaches to health.

    PubMed

    Yamin, Alicia Ely

    2009-01-01

    In a rights framework, participation is inextricably related to power. Through effective participation, we can challenge political and other forms of exclusion that prevent people from having power over the decisions and processes that affect their lives and health. Yet concepts of power are as contested as notions of participation. Thus, I argue here that, far from there being a formula for what participation means in a rights-based approach to health, the way in which we conceptualize the role of participation is closely linked to how we understand power and, in turn, the purpose and meaning of human rights themselves. I outline three ways of thinking about domination and participation-as-empowerment. In a liberal understanding of how power operates, there is an overarching concern for ensuring processes of participation that enable competing groups to express their voices on the proverbial level playing field, so that no one group may impose its will on the others. Critics of this approach assert that it ignores the power relations in which participatory processes are embedded, which determine which of the issues that affect health get decided--and which issues are never brought to the table because they are systematically blocked. If a second dimension of power entails deciding what gets decided, participatory approaches need to challenge the definition of what is "up for contention," or they risk merely legitimating social control. A third dimension of power entails securing compliance from oppressed groups by shaping their perceptions of their own interests. A human rights-based approach concerned with the effects of this form of domination on people's health calls for developing critical consciousness before there can be any truly "empowering" participation. I conclude by arguing that much is at stake in defining participation in a human rights framework to health, because in defining what we are calling for, we will determine how relevant human rights are

  13. Bayesian Probability Theory

    NASA Astrophysics Data System (ADS)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  14. Fractal probability laws.

    PubMed

    Eliazar, Iddo; Klafter, Joseph

    2008-06-01

    We explore six classes of fractal probability laws defined on the positive half-line: Weibull, Fréchet, Lévy, hyper Pareto, hyper beta, and hyper shot noise. Each of these classes admits a unique statistical power-law structure, and is uniquely associated with a certain operation of renormalization. All six classes turn out to be one-dimensional projections of underlying Poisson processes which, in turn, are the unique fixed points of Poissonian renormalizations. The first three classes correspond to linear Poissonian renormalizations and are intimately related to extreme value theory (Weibull, Fréchet) and to the central limit theorem (Lévy). The other three classes correspond to nonlinear Poissonian renormalizations. Pareto's law--commonly perceived as the "universal fractal probability distribution"--is merely a special case of the hyper Pareto class.

  15. Regional flood probabilities

    USGS Publications Warehouse

    Troutman, B.M.; Karlinger, M.R.

    2003-01-01

    The T-year annual maximum flood at a site is defined to be that streamflow that has probability 1/T of being exceeded in any given year, and for a group of sites the corresponding regional flood probability (RFP) is the probability that at least one site will experience a T-year flood in any given year. The RFP depends on the number of sites of interest and on the spatial correlation of flows among the sites. We present a Monte Carlo method for obtaining the RFP and demonstrate that spatial correlation estimates used in this method may be obtained with rank-transformed data and therefore that knowledge of the at-site peak flow distribution is not necessary. We examine the extent to which the estimates depend on specification of a parametric form for the spatial correlation function, which is known to be nonstationary for peak flows. It is shown in a simulation study that use of a stationary correlation function to compute RFPs yields satisfactory estimates for certain nonstationary processes. Application of asymptotic extreme value theory is examined, and a methodology for separating channel network and rainfall effects on RFPs is suggested. A case study is presented using peak flow data from the state of Washington. For 193 sites in the Puget Sound region it is estimated that a 100-year flood will occur on average every 4.5 years.
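    The Monte Carlo idea can be sketched as follows (a simplified stand-in that assumes an exchangeable Gaussian copula across sites rather than the estimated, possibly nonstationary correlation structure used in the paper):

```python
# Monte Carlo regional flood probability: chance that at least one of k sites
# exceeds its marginal T-year level in a given year, under an exchangeable
# Gaussian dependence structure with correlation rho.
import numpy as np
from scipy import stats

def regional_flood_probability(k_sites, T, rho, years=100_000, seed=5):
    cov = np.full((k_sites, k_sites), rho)
    np.fill_diagonal(cov, 1.0)
    z = np.random.default_rng(seed).multivariate_normal(np.zeros(k_sites), cov, size=years)
    threshold = stats.norm.ppf(1.0 - 1.0 / T)      # marginal T-year level
    return np.mean((z > threshold).any(axis=1))

for rho in (0.0, 0.3, 0.7):
    print(f"rho={rho:.1f}  RFP(50 sites, T=100) ~ {regional_flood_probability(50, 100, rho):.3f}")
```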

  16. An algorithm for finding biologically significant features in microarray data based on a priori manifold learning.

    PubMed

    Hira, Zena M; Trigeorgis, George; Gillies, Duncan F

    2014-01-01

    Microarray databases are a large source of genetic data, which, upon proper analysis, could enhance our understanding of biology and medicine. Many microarray experiments have been designed to investigate the genetic mechanisms of cancer, and analytical approaches have been applied in order to classify different types of cancer or distinguish between cancerous and non-cancerous tissue. However, microarrays are high-dimensional datasets with high levels of noise and this causes problems when using machine learning methods. A popular approach to this problem is to search for a set of features that will simplify the structure and to some degree remove the noise from the data. The most widely used approach to feature extraction is principal component analysis (PCA) which assumes a multivariate Gaussian model of the data. More recently, non-linear methods have been investigated. Among these, manifold learning algorithms, for example Isomap, aim to project the data from a higher dimensional space onto a lower dimension one. We have proposed a priori manifold learning for finding a manifold in which a representative set of microarray data is fused with relevant data taken from the KEGG pathway database. Once the manifold has been constructed the raw microarray data is projected onto it and clustering and classification can take place. In contrast to earlier fusion based methods, the prior knowledge from the KEGG databases is not used in, and does not bias the classification process--it merely acts as an aid to find the best space in which to search the data. In our experiments we have found that using our new manifold method gives better classification results than using either PCA or conventional Isomap. PMID:24595155
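
    A rough stand-in (not the authors' pipeline) for the projection-then-classification step, using scikit-learn's Isomap and a k-nearest-neighbour classifier on synthetic data; the a priori fusion with KEGG pathway information that defines the proposed method is not reproduced here, and all dimensions and neighbourhood sizes are arbitrary.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.manifold import Isomap
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for expression data: 300 samples x 2000 "genes".
X, y = make_classification(n_samples=300, n_features=2000, n_informative=40,
                           n_redundant=200, n_classes=2, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

# Project onto a low-dimensional manifold, then classify in that space.
model = make_pipeline(StandardScaler(),
                      Isomap(n_neighbors=10, n_components=10),
                      KNeighborsClassifier(n_neighbors=5))
model.fit(X_tr, y_tr)
print("held-out accuracy:", model.score(X_te, y_te))
```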

  17. Microarray Based Gene Expression Analysis of Murine Brown and Subcutaneous Adipose Tissue: Significance with Human

    PubMed Central

    Boparai, Ravneet K.; Kondepudi, Kanthi Kiran; Mantri, Shrikant; Bishnoi, Mahendra

    2015-01-01

    Background Two types of adipose tissues, white (WAT) and brown (BAT), are found in mammals. Increasingly novel strategies are being proposed for the treatment of obesity and its associated complications by altering the amount and/or activity of BAT using mouse models. Methodology/Principal Findings The present study was designed: (a) to investigate the differential expression of genes in LACA mice subcutaneous WAT (sWAT) and BAT using mouse DNA microarray, (b) to compare mouse differential gene expression with previously published human data, to understand any inter-species differences between the two, and (c) to make a comparative assessment with the C57BL/6 mouse strain. In mouse microarray studies, over 7003, 1176 and 401 probe sets showed more than two-fold, five-fold and ten-fold change respectively in differential expression between murine BAT and WAT. Microarray data was validated using quantitative RT-PCR of key genes showing high expression in BAT (Fabp3, Ucp1, Slc27a1) and sWAT (Ms4a1, H2-Ob, Bank1) or showing relatively low expression in BAT (Pgk1, Cox6b1) and sWAT (Slc20a1, Cd74). Multi-omic pathway analysis was employed to understand possible links between the organisms. When murine two-fold data was compared with published human BAT and sWAT data, 90 genes showed parallel differential expression in both mouse and human. Out of these 90 genes, 46 showed the same pattern of differential expression whereas the pattern was opposite for the remaining 44 genes. Based on our microarray results and their comparison with human data, we were able to identify genes (targets) (a) which can be studied in mouse model systems to extrapolate results to human and (b) where caution should be exercised before extrapolation of murine data to human. Conclusion Our study provides evidence for inter-species (mouse vs human) differences in differential gene expression between sWAT and BAT. Critical understanding of this data may help in development of novel ways to engineer one form of adipose

  18. Significant disparity in base and sugar damage in DNA resulting from neutron and electron irradiation

    PubMed Central

    Pang, Dalong; Nico, Jeffrey S.; Karam, Lisa; Timofeeva, Olga; Blakely, William F.; Dritschilo, Anatoly; Dizdaroglu, Miral; Jaruga, Pawel

    2014-01-01

    In this study, the effects of neutron and electron irradiation of aqueous DNA solutions were compared to characterize potential neutron signatures in DNA damage induction. Ionizing radiation generates numerous lesions in DNA, including base and sugar lesions, lesions involving base–sugar combinations (e.g. 8,5′-cyclopurine-2′-deoxynucleosides) and DNA–protein cross-links, as well as single- and double-strand breaks and clustered damage. The characteristics of damage depend on the linear energy transfer (LET) of the incident radiation. Here we investigated DNA damage in aqueous DNA solutions in 10 mmol/l phosphate buffer irradiated with doses of 0–80 Gy from low-LET electrons (10 Gy/min) and from high-LET (∼0.16 Gy/h) neutrons formed by spontaneous 252Cf decay fissions. 8-hydroxy-2′-deoxyguanosine (8-OH-dG), (5′R)-8,5′-cyclo-2′-deoxyadenosine (R-cdA) and (5′S)-8,5′-cyclo-2′-deoxyadenosine (S-cdA) were quantified using liquid chromatography–isotope-dilution tandem mass spectrometry to demonstrate a linear dose dependence for induction of 8-OH-dG by both types of radiation, although neutron irradiation was ∼50% less effective at a given dose compared with electron irradiation. Electron irradiation resulted in an exponential increase in S-cdA and R-cdA with dose, whereas neutron irradiation induced substantially less damage and the amount of damage increased only gradually with dose. Addition of 30 mmol/l 2-amino-2-(hydroxymethyl)-1,3-propanediol (TRIS), a free radical scavenger, to the DNA solution before irradiation reduced lesion induction to background levels for both types of radiation. These results provide insight into the mechanisms of DNA damage by high-LET 252Cf decay neutrons and low-LET electrons, leading to enhanced understanding of the potential biological effects of these types of irradiation. PMID:25034731

  19. Understanding Y haplotype matching probability.

    PubMed

    Brenner, Charles H

    2014-01-01

    The Y haplotype population-genetic terrain is better explored from a fresh perspective rather than by analogy with the more familiar autosomal ideas. For haplotype matching probabilities, versus for autosomal matching probabilities, explicit attention to modelling - such as how evolution got us where we are - is much more important while consideration of population frequency is much less so. This paper explores, extends, and explains some of the concepts of "Fundamental problem of forensic mathematics - the evidential strength of a rare haplotype match". That earlier paper presented and validated a "kappa method" formula for the evidential strength when a suspect matches a previously unseen haplotype (such as a Y-haplotype) at the crime scene. Mathematical implications of the kappa method are intuitive and reasonable. Suspicions to the contrary that have been raised rest on elementary errors. Critical to deriving the kappa method or any sensible evidential calculation is understanding that thinking about haplotype population frequency is a red herring; the pivotal question is one of matching probability. But confusion between the two is unfortunately institutionalized in much of the forensic world. Examples make clear why (matching) probability is not (population) frequency and why uncertainty intervals on matching probabilities are merely confused thinking. Forensic matching calculations should be based on a model, on stipulated premises. The model inevitably only approximates reality, and any error in the results comes only from error in the model, the inexactness of the approximation. Sampling variation does not measure that inexactness and hence is not helpful in explaining evidence and is in fact an impediment. Alternative haplotype matching probability approaches that various authors have considered are reviewed. Some are based on no model and cannot be taken seriously. For the others, some evaluation of the models is discussed. Recent evidence supports the adequacy of
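
    A loosely sketched reading of the kappa idea, offered only as orientation: assuming κ is taken to be the fraction of haplotypes observed exactly once in a reference database of size n, and that the matching probability for a previously unobserved haplotype is then of order (1 - κ)/n. The toy database and this simplified formula are illustrative and should not be read as Brenner's exact derivation.

```python
from collections import Counter

def kappa_match_probability(database):
    """Illustrative kappa-style estimate: kappa is the fraction of database
    haplotypes observed exactly once; the probability that a random person
    matches a haplotype never seen before is taken as roughly (1 - kappa) / n."""
    n = len(database)
    counts = Counter(database)
    singletons = sum(1 for c in counts.values() if c == 1)
    kappa = singletons / n
    return kappa, (1.0 - kappa) / n

# Toy database of Y-haplotypes (tuples of STR repeat numbers), purely illustrative.
db = [(13, 24, 16), (13, 24, 16), (14, 23, 15), (12, 25, 17), (15, 22, 16)]
kappa, p_match = kappa_match_probability(db)
print(f"kappa = {kappa:.2f}, match probability ~ {p_match:.3f}")
```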

  20. Bell Could Become the Copernicus of Probability

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei

    2016-07-01

    Our aim is to emphasize the role of mathematical models in physics, especially models of geometry and probability. We briefly compare developments of geometry and probability by pointing to similarities and differences: from Euclid to Lobachevsky and from Kolmogorov to Bell. In probability, Bell could play the same role as Lobachevsky in geometry. In fact, violation of Bell’s inequality can be treated as implying the impossibility to apply the classical probability model of Kolmogorov (1933) to quantum phenomena. Thus the quantum probabilistic model (based on Born’s rule) can be considered as the concrete example of the non-Kolmogorovian model of probability, similarly to the Lobachevskian model — the first example of the non-Euclidean model of geometry. This is the “probability model” interpretation of the violation of Bell’s inequality. We also criticize the standard interpretation—an attempt to add to rigorous mathematical probability models additional elements such as (non)locality and (un)realism. Finally, we compare embeddings of non-Euclidean geometries into the Euclidean space with embeddings of the non-Kolmogorovian probabilities (in particular, quantum probability) into the Kolmogorov probability space. As an example, we consider the CHSH-test.
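
    The CHSH example mentioned at the end of the abstract can be checked numerically: with singlet correlations E(a, b) = -cos(a - b) and the standard angle choices, the CHSH combination reaches 2√2 in magnitude, beyond the Kolmogorovian (local classical) bound of 2. The angle labels below are the conventional ones, not taken from the paper.

```python
import numpy as np

def E(a, b):
    """Quantum correlation of spin measurements at angles a, b on a singlet pair."""
    return -np.cos(a - b)

# Standard CHSH angle choices (in radians).
a, a_, b, b_ = 0.0, np.pi / 2, np.pi / 4, 3 * np.pi / 4

S = E(a, b) - E(a, b_) + E(a_, b) + E(a_, b_)
print("CHSH value |S| =", round(abs(S), 4))                 # 2*sqrt(2) ~ 2.8284
print("classical (Kolmogorovian local) bound:", 2)
print("Tsirelson bound 2*sqrt(2) =", round(2 * np.sqrt(2), 4))
```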

  1. Laboratory-Tutorial Activities for Teaching Probability

    ERIC Educational Resources Information Center

    Wittmann, Michael C.; Morgan, Jeffrey T.; Feeley, Roger E.

    2006-01-01

    We report on the development of students' ideas of probability and probability density in a University of Maine laboratory-based general education physics course called "Intuitive Quantum Physics". Students in the course are generally math phobic with unfavorable expectations about the nature of physics and their ability to do it. We describe a…

  2. Probability and Quantum Paradigms: the Interplay

    NASA Astrophysics Data System (ADS)

    Kracklauer, A. F.

    2007-12-01

    Since the introduction of Born's interpretation of quantum wave functions as yielding the probability density of presence, Quantum Theory and Probability have lived in a troubled symbiosis. Problems arise with this interpretation because quantum probabilities exhibit features alien to usual probabilities, namely non-Boolean structure and non-positive-definite phase space probability densities. This has inspired research into both elaborate formulations of Probability Theory and alternate interpretations for wave functions. Herein the latter tactic is taken, and a variant interpretation of wave functions based on photo-detection physics is proposed, and some empirical consequences are considered. Although incomplete in a few details, this variant is appealing in its reliance on well tested concepts and technology.

  3. Probability and Quantum Paradigms: the Interplay

    SciTech Connect

    Kracklauer, A. F.

    2007-12-03

    Since the introduction of Born's interpretation of quantum wave functions as yielding the probability density of presence, Quantum Theory and Probability have lived in a troubled symbiosis. Problems arise with this interpretation because quantum probabilities exhibit features alien to usual probabilities, namely non-Boolean structure and non-positive-definite phase space probability densities. This has inspired research into both elaborate formulations of Probability Theory and alternate interpretations for wave functions. Herein the latter tactic is taken, and a variant interpretation of wave functions based on photo-detection physics is proposed, and some empirical consequences are considered. Although incomplete in a few details, this variant is appealing in its reliance on well tested concepts and technology.

  4. Probabilities of future VEI ≥ 2 eruptions at the Central American Volcanic Arc: a statistical perspective based on the past centuries' eruption record

    NASA Astrophysics Data System (ADS)

    Dzierma, Yvonne; Wehrmann, Heidi

    2014-10-01

    A probabilistic eruption forecast is provided for seven historically active volcanoes along the Central American Volcanic Arc (CAVA), as a pivotal empirical contribution to multi-disciplinary volcanic hazards assessment. The eruption probabilities are determined with a Kaplan-Meier estimator of survival functions, and parametric time series models are applied to describe the historical eruption records. Aside from the volcanoes that are currently in a state of eruptive activity (Santa María, Fuego, and Arenal), the highest probabilities for eruptions of VEI ≥ 2 occur at Concepción and Cerro Negro in Nicaragua, which have a 70-85 % probability of erupting within the next 10 years. Poás and Irazú in Costa Rica show a medium to high eruption probability, followed by San Miguel (El Salvador), Rincón de la Vieja (Costa Rica), and Izalco (El Salvador; 24 % within the next 10 years).
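
    A minimal sketch of a Kaplan-Meier calculation of the kind described, using hypothetical repose times for a single volcano (the last interval right-censored at the present). The 10-year eruption probability is computed as 1 - S(t0 + 10)/S(t0), conditioning on the current repose length t0; none of the numbers correspond to the CAVA volcanoes.

```python
import numpy as np

def kaplan_meier(durations, observed):
    """Return event times and the Kaplan-Meier survival estimate S(t)."""
    durations = np.asarray(durations, float)
    observed = np.asarray(observed, bool)
    times = np.unique(durations[observed])
    surv, s = [], 1.0
    for t in times:
        at_risk = np.sum(durations >= t)
        events = np.sum((durations == t) & observed)
        s *= 1.0 - events / at_risk
        surv.append(s)
    return times, np.array(surv)

def survival_at(t, times, surv):
    """Step-function value of S at time t."""
    s = 1.0
    for ti, si in zip(times, surv):
        if ti <= t:
            s = si
    return s

# Hypothetical repose times (years) between VEI >= 2 eruptions at one volcano;
# the final interval is still open (right-censored at the present day).
repose = [3, 7, 2, 11, 5, 9, 4, 6, 14, 8, 2]
event = [True] * 10 + [False]          # last interval censored
t_now = repose[-1]                      # years since the most recent eruption

times, surv = kaplan_meier(repose, event)
p_10yr = 1.0 - survival_at(t_now + 10, times, surv) / survival_at(t_now, times, surv)
print(f"P(eruption within the next 10 years) ~ {p_10yr:.2f}")
```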

  5. Probability of brittle failure

    NASA Technical Reports Server (NTRS)

    Kim, A.; Bosnyak, C. P.; Chudnovsky, A.

    1991-01-01

    A methodology was developed for collecting statistically representative data for crack initiation and arrest from a small number of test specimens. An epoxy (based on bisphenol A diglycidyl ether and polyglycol extended diglycyl ether and cured with diethylene triamine) is selected as a model material. A compact tension specimen with displacement controlled loading is used to observe multiple crack initiations and arrests. The energy release rate at crack initiation is significantly higher than that at a crack arrest, as has been observed elsewhere. The difference between these energy release rates is found to depend on specimen size (scale effect), and is quantitatively related to the fracture surface morphology. The scale effect, similar to that in statistical strength theory, is usually attributed to the statistics of defects which control the fracture process. Triangular-shaped ripples (deltoids) are formed on the fracture surface during the slow subcritical crack growth, prior to the smooth mirror-like surface characteristic of fast cracks. The deltoids are complementary on the two crack faces, which excludes any inelastic deformation from consideration. The presence of defects is also suggested by the observed scale effect. However, there are no defects at the deltoid apexes detectable down to the 0.1 micron level.

  6. Earthquake probabilities: theoretical assessments and reality

    NASA Astrophysics Data System (ADS)

    Kossobokov, V. G.

    2013-12-01

    It is common knowledge that earthquakes are complex phenomena whose classification and sizing remain serious problems of contemporary seismology. In general, their frequency-magnitude distributions exhibit power-law scaling. This scaling differs significantly when different time and/or space domains are considered. At the scale of a particular earthquake rupture zone the frequency of similar-size events is usually estimated to be about once in several hundred years. Evidently, contemporary seismology does not possess enough reported instrumental data for any reliable quantification of an earthquake probability at a given location of an expected event. Regretfully, most of the state-of-the-art theoretical approaches to assessing the probability of seismic events are based on trivial (e.g. Poisson, periodic, etc) or, conversely, delicately designed (e.g. STEP, ETAS, etc) models of earthquake sequences. Some of these models are evidently erroneous, some can be rejected by the existing statistics, and some are hardly testable in our lifetime. Nevertheless, such probabilistic estimates, including seismic hazard assessment and earthquake forecasting, when used in practice eventually lead to scientifically groundless advice communicated to decision makers and to inappropriate decisions. As a result, the population of seismic regions continues to face unexpected risk and losses. The international project Global Earthquake Model (GEM) is on the wrong track if it continues to base seismic risk estimates on the standard, mainly probabilistic, methodology to assess seismic hazard. It is generally accepted that earthquakes are infrequent, low-probability events. However, they keep occurring at earthquake-prone areas with 100% certainty. Given the expectation of a seismic event once per hundred years, the daily probability of occurrence on a certain date may range from 0 to 100% depending on the choice of probability space (which is yet unknown and, therefore, made by a subjective lucky chance

  7. No bridge too high: Infants decide whether to cross based on the probability of falling not the severity of the potential fall

    PubMed Central

    Kretch, Kari S.; Adolph, Karen E.

    2013-01-01

    Do infants, like adults, consider both the probability of falling and the severity of a potential fall when deciding whether to cross a bridge? Crawling and walking infants were encouraged to cross bridges varying in width over a small drop-off, a large drop-off, or no drop-off. Bridge width affects the probability of falling, whereas drop-off height affects the severity of the potential fall. For both crawlers and walkers, decisions about crossing bridges depended only on the probability of falling: As bridge width decreased, attempts to cross decreased, and gait modifications and exploration increased, but behaviors did not differ between small and large drop-off conditions. Similarly, decisions about descent depended on the probability of falling: They backed or crawled into the small drop-off, but avoided the large drop-off. With no drop-off, infants ran straight across. Results indicate that experienced crawlers and walkers accurately perceive affordances for locomotion, but they do not yet consider the severity of a potential fall when making decisions for action. PMID:23587034

  8. Emptiness Formation Probability

    NASA Astrophysics Data System (ADS)

    Crawford, Nicholas; Ng, Stephen; Starr, Shannon

    2016-08-01

    We present rigorous upper and lower bounds on the emptiness formation probability for the ground state of a spin-1/2 Heisenberg XXZ quantum spin system. For a d-dimensional system we find a rate of decay of the order exp(-c L^(d+1)), where L is the sidelength of the box in which we ask for the emptiness formation event to occur. In the d = 1 case this confirms previous predictions made in the integrable systems community, though our bounds do not achieve the precision predicted by Bethe ansatz calculations. On the other hand, our bounds in the case d ≥ 2 are new. The main tools we use are reflection positivity and a rigorous path integral expansion, which is a variation on those previously introduced by Toth, Aizenman-Nachtergaele and Ueltschi.

  9. Alternative probability theories for cognitive psychology.

    PubMed

    Narens, Louis

    2014-01-01

    Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling.

  10. Acid-base titrations for polyacids: Significance of the pK sub a and parameters in the Kern equation

    NASA Technical Reports Server (NTRS)

    Meites, L.

    1978-01-01

    A new method is suggested for calculating the dissociation constants of polyvalent acids, especially polymeric acids. The most significant characteristics of the titration curves obtained when solutions of such acids are titrated potentiometrically with a standard base are demonstrated and identified in qualitative form.

  11. People's conditional probability judgments follow probability theory (plus noise).

    PubMed

    Costello, Fintan; Watts, Paul

    2016-09-01

    A common view in current psychology is that people estimate probabilities using various 'heuristics' or rules of thumb that do not follow the normative rules of probability theory. We present a model where people estimate conditional probabilities such as P(A|B) (the probability of A given that B has occurred) by a process that follows standard frequentist probability theory but is subject to random noise. This model accounts for various results from previous studies of conditional probability judgment. This model predicts that people's conditional probability judgments will agree with a series of fundamental identities in probability theory whose form cancels the effect of noise, while deviating from probability theory in other expressions whose form does not allow such cancellation. Two experiments strongly confirm these predictions, with people's estimates on average agreeing with probability theory for the noise-cancelling identities, but deviating from probability theory (in just the way predicted by the model) for other identities. This new model subsumes an earlier model of unconditional or 'direct' probability judgment which explains a number of systematic biases seen in direct probability judgment (Costello & Watts, 2014). This model may thus provide a fully general account of the mechanisms by which people estimate probabilities.
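
    A small simulation of one reading of the probability-theory-plus-noise account (parameters are illustrative, not the authors'): each judgment is a relative frequency read from memory samples whose indicators are flipped with probability d, which pushes individual estimates toward 0.5 yet leaves the addition-law combination P(A) + P(B) - P(A∧B) - P(A∨B) centred on zero, because the noise terms cancel.

```python
import numpy as np

rng = np.random.default_rng(2)
d, m, participants = 0.2, 100, 5_000   # flip noise, samples read per judgment, judges

# A joint world for events A and B (values chosen arbitrarily for illustration).
p = {"A": 0.30, "B": 0.50, "A_and_B": 0.20}
p["A_or_B"] = p["A"] + p["B"] - p["A_and_B"]

def noisy_estimate(true_p):
    """Read m memory samples of the event indicator, each flipped with prob d,
    and report the observed relative frequency (one estimate per participant)."""
    truth = rng.random((participants, m)) < true_p
    flips = rng.random((participants, m)) < d
    return np.mean(truth ^ flips, axis=1)

est = {k: noisy_estimate(v) for k, v in p.items()}

print("mean estimate of P(A):", est["A"].mean(),
      " vs (1-2d)p + d =", (1 - 2 * d) * p["A"] + d)
combo = est["A"] + est["B"] - est["A_and_B"] - est["A_or_B"]
print("mean of P(A)+P(B)-P(A&B)-P(AvB):", combo.mean(), " (probability theory says 0)")
```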

  12. People's conditional probability judgments follow probability theory (plus noise).

    PubMed

    Costello, Fintan; Watts, Paul

    2016-09-01

    A common view in current psychology is that people estimate probabilities using various 'heuristics' or rules of thumb that do not follow the normative rules of probability theory. We present a model where people estimate conditional probabilities such as P(A|B) (the probability of A given that B has occurred) by a process that follows standard frequentist probability theory but is subject to random noise. This model accounts for various results from previous studies of conditional probability judgment. This model predicts that people's conditional probability judgments will agree with a series of fundamental identities in probability theory whose form cancels the effect of noise, while deviating from probability theory in other expressions whose form does not allow such cancellation. Two experiments strongly confirm these predictions, with people's estimates on average agreeing with probability theory for the noise-cancelling identities, but deviating from probability theory (in just the way predicted by the model) for other identities. This new model subsumes an earlier model of unconditional or 'direct' probability judgment which explains a number of systematic biases seen in direct probability judgment (Costello & Watts, 2014). This model may thus provide a fully general account of the mechanisms by which people estimate probabilities. PMID:27570097

  13. The Animism Controversy Revisited: A Probability Analysis

    ERIC Educational Resources Information Center

    Smeets, Paul M.

    1973-01-01

    Considers methodological issues surrounding the Piaget-Huang controversy. A probability model, based on the difference between the expected and observed animistic and deanimistic responses is applied as an improved technique for the assessment of animism. (DP)

  14. Prestack inversion based on anisotropic Markov random field-maximum posterior probability inversion and its application to identify shale gas sweet spots

    NASA Astrophysics Data System (ADS)

    Wang, Kang-Ning; Sun, Zan-Dong; Dong, Ning

    2015-12-01

    Economic shale gas production requires hydraulic fracture stimulation to increase the formation permeability. Hydraulic fracturing strongly depends on geomechanical parameters such as Young's modulus and Poisson's ratio. Fracture-prone sweet spots can be predicted by prestack inversion, which is an ill-posed problem; thus, regularization is needed to obtain unique and stable solutions. To characterize gas-bearing shale sedimentary bodies, elastic parameter variations are regarded as an anisotropic Markov random field. Bayesian statistics are adopted to transform the prestack inversion into a maximum posterior probability problem. Two energy functions for the lateral and vertical directions are used to describe the distribution, and the expectation-maximization algorithm is used to estimate the hyperparameters of the prior probability of elastic parameters. Finally, the inversion yields clear geological boundaries, high vertical resolution, and reasonable lateral continuity using the conjugate gradient method to minimize the objective function. The noise robustness and imaging capability of the method were tested using synthetic and real data.

  15. Lectures on probability and statistics

    SciTech Connect

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another.

  16. Method for measurement of transition probabilities by laser-induced breakdown spectroscopy based on CSigma graphs-Application to Ca II spectral lines

    NASA Astrophysics Data System (ADS)

    Aguilera, J. A.; Aragón, C.; Manrique, J.

    2015-07-01

    We propose a method for determination of transition probabilities by laser-induced breakdown spectroscopy that avoids the error due to self-absorption. The method relies on CSigma graphs, a generalization of curves of growth which allows including several lines of various elements in the same ionization state. CSigma graphs are constructed including reference lines of an emitting species with well-known transition probabilities, together with the lines of interest, both in the same ionization state. The samples are fused glass disks prepared from small concentrations of compounds. When the method is applied, the concentration of the element of interest in the sample must be controlled to avoid the failure of the homogeneous plasma model. To test the method, the transition probabilities of 9 Ca II lines arising from the 4d, 5s, 5d and 6s configurations are measured using Fe II reference lines. The data for 5 of the studied lines, mainly from the 5d and 6s configurations, had not been measured previously.

  17. A timber inventory based upon manual and automated analysis of ERTS-1 and supporting aircraft data using multistage probability sampling. [Plumas National Forest, California

    NASA Technical Reports Server (NTRS)

    Nichols, J. D.; Gialdini, M.; Jaakkola, S.

    1974-01-01

    A quasi-operational study demonstrated that a timber inventory based on manual and automated analysis of ERTS-1 data, supporting aircraft data, and ground data could be made using multistage sampling techniques. The inventory proved to be a timely, cost-effective alternative to conventional timber inventory techniques. The timber volume on the Quincy Ranger District of the Plumas National Forest was estimated to be 2.44 billion board feet with a sampling error of 8.2 percent. Costs for the inventory procedure at 1.1 cents/acre compared favorably with the costs of a conventional inventory at 25 cents/acre. A point-by-point comparison of CALSCAN-classified ERTS data with human-interpreted low altitude photo plots indicated no significant differences in the overall classification accuracies.

  18. SU-E-T-580: On the Significance of Model Based Dosimetry for Breast and Head and Neck 192Ir HDR Brachytherapy

    SciTech Connect

    Peppa, V; Pappas, E; Pantelis, E; Papagiannis, P; Major, T; Polgar, C

    2015-06-15

    Purpose: To assess the dosimetric and radiobiological differences between TG43-based and model-based dosimetry in the treatment planning of 192Ir HDR brachytherapy for breast and head and neck cancer. Methods: Two cohorts of 57 Accelerated Partial Breast Irradiation (APBI) and 22 head and neck (H&N) patients with oral cavity carcinoma were studied. Dosimetry for the treatment plans was performed using the TG43 algorithm of the Oncentra Brachy v4.4 treatment planning system (TPS). Corresponding Monte Carlo (MC) simulations were performed using MCNP6 with input files automatically prepared by the BrachyGuide software tool from DICOM RT plan data. TG43 and MC data were compared in terms of % dose differences, Dose Volume Histograms (DVHs) and related indices of clinical interest for the Planning Target Volume (PTV) and the Organs-At-Risk (OARs). A radiobiological analysis was also performed using the Equivalent Uniform Dose (EUD), mean survival fraction (S) and Tumor Control Probability (TCP) for the PTV, and the Normal Tissue Complication Probability (NTCP) and the generalized EUD (gEUD) for the OARs. Significance testing of the observed differences was performed using the Wilcoxon paired sample test. Results: Differences between TG43 and MC DVH indices, associated with the increased corresponding local % dose differences observed, were statistically significant. This is mainly attributed to their consistency, however, since TG43 agrees closely with MC for the majority of DVH and radiobiological parameters in both patient cohorts. Differences varied considerably among patients only for the ipsilateral lung and ribs in the APBI cohort, with a strong correlation to target location. Conclusion: While the consistency and magnitude of differences in the majority of clinically relevant DVH indices imply that no change is needed in the treatment planning practice, individualized dosimetry improves accuracy and addresses instances of inter-patient variability observed. Research

  19. Multinomial mixture model with heterogeneous classification probabilities

    USGS Publications Warehouse

    Holland, M.D.; Gray, B.R.

    2011-01-01

    Royle and Link (Ecology 86(9):2505-2512, 2005) proposed an analytical method that allowed estimation of multinomial distribution parameters and classification probabilities from categorical data measured with error. While useful, we demonstrate algebraically and by simulations that this method yields biased multinomial parameter estimates when the probabilities of correct category classifications vary among sampling units. We address this shortcoming by treating these probabilities as logit-normal random variables within a Bayesian framework. We use Markov chain Monte Carlo to compute Bayes estimates from a simulated sample from the posterior distribution. Based on simulations, this elaborated Royle-Link model yields nearly unbiased estimates of multinomial and correct classification probabilities when classification probabilities are allowed to vary according to the normal distribution on the logit scale or according to the Beta distribution. The method is illustrated using categorical submersed aquatic vegetation data. © 2010 Springer Science+Business Media, LLC.

  20. Assessment of the probability of contaminating Mars

    NASA Technical Reports Server (NTRS)

    Judd, B. R.; North, D. W.; Pezier, J. P.

    1974-01-01

    New methodology is proposed to assess the probability that the planet Mars will be biologically contaminated by terrestrial microorganisms aboard a spacecraft. Present NASA methods are based on the Sagan-Coleman formula, which states that the probability of contamination is the product of the expected microbial release and a probability of growth. The proposed new methodology extends the Sagan-Coleman approach to permit utilization of detailed information on microbial characteristics, the lethality of release and transport mechanisms, and other information about the Martian environment. Three different types of microbial release are distinguished in the model for assessing the probability of contamination. The number of viable microbes released by each mechanism depends on the bio-burden in various locations on the spacecraft and on whether the spacecraft landing is accomplished according to plan. For each of the three release mechanisms a probability of growth is computed, using a model for transport into an environment suited to microbial growth.
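
    The Sagan-Coleman structure described above, extended additively over several release mechanisms, can be written down in a few lines; the mechanisms, bio-burdens, and growth probabilities below are hypothetical placeholders rather than values from the assessment.

```python
import math

# Hedged sketch of a Sagan-Coleman style estimate extended to several release
# mechanisms, as the proposed methodology distinguishes.  All numbers are
# hypothetical placeholders, not mission values.
release_mechanisms = {
    # mechanism: (expected viable microbes released, probability of growth per microbe)
    "nominal_landing":  (1e2, 1e-9),
    "hard_impact":      (1e5, 1e-7),
    "surface_erosion":  (1e3, 1e-8),
}

# Expected number of "growth events"; for small values the probability of at
# least one such event is approximately 1 - exp(-expected).
expected_growth = sum(n * pg for n, pg in release_mechanisms.values())
p_contamination = 1.0 - math.exp(-expected_growth)

print(f"expected growth events ~ {expected_growth:.2e}")
print(f"P(contamination)       ~ {p_contamination:.2e}")
```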

  1. A Tale of Two Probabilities

    ERIC Educational Resources Information Center

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.
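
    The record does not spell out the two scenarios, so the sketch below simulates the two standard readings of the second-child problem: learning only that at least one child is a girl gives 1/3 for two girls, while meeting one particular child who turns out to be a girl gives 1/2.

```python
import random

random.seed(0)
N = 200_000
families = [(random.choice("BG"), random.choice("BG")) for _ in range(N)]

# Scenario 1: we learn only that at least one child is a girl.
cond1 = [f for f in families if "G" in f]
p1 = sum(f == ("G", "G") for f in cond1) / len(cond1)

# Scenario 2: we meet one specific (randomly chosen) child and she is a girl.
cond2 = []
for f in families:
    seen = random.choice((0, 1))
    if f[seen] == "G":
        cond2.append(f)
p2 = sum(f == ("G", "G") for f in cond2) / len(cond2)

print(f"P(both girls | at least one girl)          ~ {p1:.3f}  (exactly 1/3)")
print(f"P(both girls | the child we met is a girl) ~ {p2:.3f}  (exactly 1/2)")
```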

  2. UQ for Decision Making: How (at least five) Kinds of Probability Might Come Into Play

    NASA Astrophysics Data System (ADS)

    Smith, L. A.

    2013-12-01

    In 1959 IJ Good published the discussion "Kinds of Probability" in Science. Good identified (at least) five kinds. The need for (at least) a sixth kind of probability when quantifying uncertainty in the context of climate science is discussed. This discussion brings out the differences between weather-like forecasting tasks and climate-like tasks, with a focus on the effective use both of science and of modelling in support of decision making. Good also introduced the idea of a "Dynamic probability", a probability one expects to change without any additional empirical evidence; the probabilities assigned by a chess-playing program when it is only halfway through its analysis being an example. This case is contrasted with the case of "Mature probabilities", where a forecast algorithm (or model) has converged on its asymptotic probabilities and the question hinges on whether or not those probabilities are expected to change significantly before the event in question occurs, even in the absence of new empirical evidence. If so, then how might one report and deploy such immature probabilities in scientific support of decision-making rationally? Mature Probability is suggested as a useful sixth kind; although Good would doubtless argue that we can get by with just one, effective communication with decision makers may be enhanced by speaking as if the others existed. This again highlights the distinction between weather-like contexts and climate-like contexts. In the former context one has access to a relevant climatology (a relevant, arguably informative distribution prior to any model simulations); in the latter context that information is not available, although one can fall back on the scientific basis upon which the model itself rests, and estimate the probability that the model output is in fact misinformative. This subjective "probability of a big surprise" is one way to communicate the probability of model-based information holding in practice, the probability that the

  3. The Probability of Causal Conditionals

    ERIC Educational Resources Information Center

    Over, David E.; Hadjichristidis, Constantinos; Evans, Jonathan St. B. T.; Handley, Simon J.; Sloman, Steven A.

    2007-01-01

    Conditionals in natural language are central to reasoning and decision making. A theoretical proposal called the Ramsey test implies the conditional probability hypothesis: that the subjective probability of a natural language conditional, P(if p then q), is the conditional subjective probability, P(q|p). We report three experiments on…

  4. Intensity-Modulated Radiotherapy Results in Significant Decrease in Clinical Toxicities Compared With Conventional Wedge-Based Breast Radiotherapy

    SciTech Connect

    Harsolia, Asif; Kestin, Larry; Grills, Inga; Wallace, Michelle; Jolly, Shruti; Jones, Cortney; Lala, Moinaktar; Martinez, Alvaro; Schell, Scott; Vicini, Frank A. . E-mail: fvicini@beaumont.edu

    2007-08-01

    Purpose: We have previously demonstrated that intensity-modulated radiotherapy (IMRT) with a static multileaf collimator process results in a more homogeneous dose distribution compared with conventional wedge-based whole breast irradiation (WBI). In the present analysis, we reviewed the acute and chronic toxicity of this IMRT approach compared with conventional wedge-based treatment. Methods and Materials: A total of 172 patients with Stage 0-IIB breast cancer were treated with lumpectomy followed by WBI. All patients underwent treatment planning computed tomography and received WBI (median dose, 45 Gy) followed by a boost to 61 Gy. Of the 172 patients, 93 (54%) were treated with IMRT, and the 79 patients (46%) treated with wedge-based RT in a consecutive fashion immediately before this cohort served as the control group. The median follow-up was 4.7 years. Results: A significant reduction in acute Grade 2 or worse dermatitis, edema, and hyperpigmentation was seen with IMRT compared with wedges. A trend was found toward reduced acute Grade 3 or greater dermatitis (6% vs. 1%, p = 0.09) in favor of IMRT. Chronic Grade 2 or worse breast edema was significantly reduced with IMRT compared with conventional wedges. No difference was found in cosmesis scores between the two groups. In patients with larger breasts (≥1,600 cm³, n = 64), IMRT resulted in reduced acute (Grade 2 or greater) breast edema (0% vs. 36%, p < 0.001) and hyperpigmentation (3% vs. 41%, p = 0.001) and chronic (Grade 2 or greater) long-term edema (3% vs. 30%, p = 0.007). Conclusion: The use of IMRT in the treatment of the whole breast results in a significant decrease in acute dermatitis, edema, and hyperpigmentation and a reduction in the development of chronic breast edema compared with conventional wedge-based RT.

  5. Using probability-based spatial estimation of the river pollution index to assess urban water recreational quality in the Tamsui River watershed.

    PubMed

    Jang, Cheng-Shin

    2016-01-01

    The Tamsui River watershed situated in Northern Taiwan provides a variety of water recreational opportunities such as riverbank park activities, fishing, cruising, rowing, sailing, and swimming. However, river water quality strongly affects water recreational quality. Moreover, the health of recreationists who are partially or fully exposed to polluted river water may be jeopardized. A river pollution index (RPI) composed of dissolved oxygen, biochemical oxygen demand, suspended solids, and ammonia nitrogen is typically used to gauge the river water quality and regulate the water body use in Taiwan. The purpose of this study was to probabilistically determine the RPI categories in the Tamsui River watershed and to assess the urban water recreational quality on the basis of the estimated RPI categories. First, according to various RPI categories, one-dimensional indicator kriging (IK) was adopted to estimate the occurrence probabilities of the RPI categories. The maximum occurrence probability among the categories was then employed to determine the most suitable RPI category. Finally, the most serious categories and seasonal variations of RPI were adopted to evaluate the quality of current water recreational opportunities in the Tamsui River watershed. The results revealed that the midstream and downstream sections of the Tamsui River and its tributaries with poor river water quality afford low water recreational quality, and water recreationists should avoid full or limited exposure to these bodies of water. However, the upstream sections of the Tamsui River watershed with high river water quality are suitable for all water recreational activities.

  6. Using probability-based spatial estimation of the river pollution index to assess urban water recreational quality in the Tamsui River watershed.

    PubMed

    Jang, Cheng-Shin

    2016-01-01

    The Tamsui River watershed situated in Northern Taiwan provides a variety of water recreational opportunities such as riverbank park activities, fishing, cruising, rowing, sailing, and swimming. However, river water quality strongly affects water recreational quality. Moreover, the health of recreationists who are partially or fully exposed to polluted river water may be jeopardized. A river pollution index (RPI) composed of dissolved oxygen, biochemical oxygen demand, suspended solids, and ammonia nitrogen is typically used to gauge the river water quality and regulate the water body use in Taiwan. The purpose of this study was to probabilistically determine the RPI categories in the Tamsui River watershed and to assess the urban water recreational quality on the basis of the estimated RPI categories. First, according to various RPI categories, one-dimensional indicator kriging (IK) was adopted to estimate the occurrence probabilities of the RPI categories. The maximum occurrence probability among the categories was then employed to determine the most suitable RPI category. Finally, the most serious categories and seasonal variations of RPI were adopted to evaluate the quality of current water recreational opportunities in the Tamsui River watershed. The results revealed that the midstream and downstream sections of the Tamsui River and its tributaries with poor river water quality afford low water recreational quality, and water recreationists should avoid full or limited exposure to these bodies of water. However, the upstream sections of the Tamsui River watershed with high river water quality are suitable for all water recreational activities. PMID:26676412

  7. Significantly elevated dielectric permittivity of Si-based semiconductor/polymer 2-2 composites induced by high polarity polymers

    NASA Astrophysics Data System (ADS)

    Feng, Yefeng; Gong, Honghong; Xie, Yunchuan; Wei, Xiaoyong; Zhang, Zhicheng

    2016-02-01

    To disclose the essential influence of polymer polarity on the dielectric properties of polymer composites filled with semiconductive fillers, a series of Si-based semiconductor/polymer 2-2 composites in a series model was fabricated. The dielectric permittivity of the composites is highly dependent on the polarity of the polymer layers as well as the electron mobility in the Si-based semiconductive sheets. The huge dielectric permittivity achieved in Si-based semiconductive sheets after being coated with high polarity polymer layers is inferred to originate from the strong induction of high polarity polymers. The increased mobility of the electrons in Si-based semiconductive sheets coated by high polarity polymer layers should be responsible for the significantly enhanced dielectric properties of composites. This can be readily achieved by either increasing the polarity of the polymer layers or reducing the percolative electric field of the Si-based semiconductive sheets. The most promising 2-2 dielectric composite was found to be made of α-SiC with strong electron mobility and poly(vinyl alcohol) (PVA) with high polarity, and its highest permittivity was obtained as 372 at 100 Hz although the permittivity of α-SiC and PVA is 3-5 and 15, respectively. This work may help in the fabrication of high dielectric constant (high-k) composites by tailoring the induction effect of high polarity polymers to semiconductors.
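
    For context, the classical series (2-2 layered) mixing rule using the permittivities quoted in the abstract predicts an effective permittivity of only about 6, far below the reported 372, which is what makes the induction/percolation interpretation necessary; the 50/50 volume fractions assumed below are hypothetical.

```python
def series_permittivity(layers):
    """Effective permittivity of a 2-2 stack with the field normal to the layers:
    1/eps_eff = sum_i f_i / eps_i, where the f_i are volume fractions summing to 1."""
    return 1.0 / sum(f / eps for f, eps in layers)

# Hypothetical 50/50 stack of alpha-SiC (eps ~ 4, midpoint of the quoted 3-5)
# and PVA (eps ~ 15), using the permittivities quoted in the abstract.
layers = [(0.5, 4.0), (0.5, 15.0)]
print(f"series-model prediction: {series_permittivity(layers):.1f}")   # ~ 6.3
print("reported composite permittivity at 100 Hz: 372")
```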

  8. The cognitive substrate of subjective probability.

    PubMed

    Nilsson, Håkan; Olsson, Henrik; Juslin, Peter

    2005-07-01

    The prominent cognitive theories of probability judgment were primarily developed to explain cognitive biases rather than to account for the cognitive processes in probability judgment. In this article the authors compare 3 major theories of the processes and representations in probability judgment: the representativeness heuristic, implemented as prototype similarity, relative likelihood, or evidential support accumulation (ESAM; D. J. Koehler, C. M. White, & R. Grondin, 2003); cue-based relative frequency; and exemplar memory, implemented by probabilities from exemplars (PROBEX; P. Juslin & M. Persson, 2002). Three experiments with different task structures consistently demonstrate that exemplar memory is the best account of the data whereas the results are inconsistent with extant formulations of the representativeness heuristic and cue-based relative frequency. PMID:16060768

  9. Associativity and normative credal probability.

    PubMed

    Snow, P

    2002-01-01

    Cox's Theorem is a widely cited motivation for probabilistic models of uncertain belief. The theorem relates the associativity of the logical connectives to that of the arithmetic operations of probability. Recent questions about the correctness of Cox's Theorem have been resolved, but there are new questions about one functional equation used by Cox in 1946. This equation is missing from his later work. Advances in knowledge since 1946 and changes in Cox's research interests explain the equation's disappearance. Other associativity-based motivations avoid functional equations altogether, and so may be more transparently applied to finite domains and discrete beliefs. A discrete counterpart of Cox's Theorem can be assembled from results that have been in the literature since 1959. PMID:18238098

  10. Determination of the compound nucleus survival probability Psurv for various "hot" fusion reactions based on the dynamical cluster-decay model

    NASA Astrophysics Data System (ADS)

    Chopra, Sahila; Kaur, Arshdeep; Gupta, Raj K.

    2015-03-01

    After our recent successful attempt to define and determine the compound nucleus (CN) fusion/formation probability PCN within the dynamical cluster-decay model (DCM), we introduce and estimate here for the first time the survival probability Psurv of the CN against fission, again within the DCM. Calculated as a dynamical fragmentation process, Psurv is defined as the ratio of the evaporation residue (ER) cross section σER to the sum of σER and the fusion-fission (ff) cross section σff, i.e., to the CN formation cross section σCN, where each contributing fragmentation cross section is determined in terms of its formation and barrier penetration probabilities P0 and P. In the DCM, deformations up to hexadecapole and "compact" orientations for both in-plane (coplanar) and out-of-plane (noncoplanar) configurations are allowed. Some 16 "hot" fusion reactions, forming a CN of mass number ACN ∼ 100 to superheavy nuclei, are analyzed for various different nuclear interaction potentials, and the variation of Psurv with CN excitation energy E*, fissility parameter χ, CN mass ACN, and Coulomb parameter Z1Z2 is investigated. Interesting results are that three groups, namely, weakly fissioning, radioactive, and strongly fissioning superheavy nuclei, are identified with Psurv, respectively, ∼1, ∼10-6, and ∼10-10. For the weakly fissioning group (100

  11. Seismicity alert probabilities at Parkfield, California, revisited

    USGS Publications Warehouse

    Michael, A.J.; Jones, L.M.

    1998-01-01

    For a decade, the US Geological Survey has used the Parkfield Earthquake Prediction Experiment scenario document to estimate the probability that earthquakes observed on the San Andreas fault near Parkfield will turn out to be foreshocks followed by the expected magnitude six mainshock. During this time, we have learned much about the seismogenic process at Parkfield, about the long-term probability of the Parkfield mainshock, and about the estimation of these types of probabilities. The probabilities for potential foreshocks at Parkfield are reexamined and revised in light of these advances. As part of this process, we have confirmed both the rate of foreshocks before strike-slip earthquakes in the San Andreas physiographic province and the uniform distribution of foreshocks with magnitude proposed by earlier studies. Compared to the earlier assessment, these new estimates of the long-term probability of the Parkfield mainshock are lower, our estimate of the rate of background seismicity is higher, and we find that the assumption that foreshocks at Parkfield occur in a unique way is not statistically significant at the 95% confidence level. While the exact numbers vary depending on the assumptions that are made, the new alert probabilities are lower than previously estimated. Considering the various assumptions and the statistical uncertainties in the input parameters, we also compute a plausible range for the probabilities. The range is large, partly due to the extra knowledge that exists for the Parkfield segment, making us question the usefulness of these numbers.

  12. Study on the Identification of Radix Bupleuri from Its Unofficial Varieties Based on Discrete Wavelet Transformation Feature Extraction of ATR-FTIR Spectroscopy Combined with Probability Neural Network

    PubMed Central

    Jin, Wenying; Wan, Chayan; Cheng, Cungui

    2015-01-01

    Attenuated total reflection-Fourier transform infrared spectroscopy (ATR-FTIR) was employed to acquire the infrared spectra of Radix Bupleuri and its unofficial varieties: the root of Bupleurum smithii Wolff and the root of Bupleurum bicaule Helm. The infrared spectra of these species were analyzed using Fourier self-deconvolution (FSD), the discrete wavelet transform (DWT), and a probability neural network (PNN). With FSD, there were conspicuous differences in the infrared absorption peak intensities between Radix Bupleuri and its unofficial varieties, but it was hard to distinguish the root of Bupleurum smithii Wolff from the root of Bupleurum bicaule. These differences were shown more clearly when the DWT was used. The results show that the DWT technique makes it easier to identify Radix Bupleuri and to distinguish it from its unofficial varieties, the root of Bupleurum smithii Wolff and the root of Bupleurum bicaule. PMID:25784938
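
    A hedged sketch of DWT-feature extraction plus a PNN-style classification, using PyWavelets and a simple Parzen-window classifier on synthetic spectra; the band positions, wavelet choice ('db4'), smoothing width, and class labels are stand-ins and do not reproduce the authors' ATR-FTIR processing.

```python
import numpy as np
import pywt

rng = np.random.default_rng(3)

def synthetic_spectrum(kind, n_points=512):
    """Toy stand-in for an ATR-FTIR spectrum: a few Gaussian 'absorption bands'
    whose positions depend on the class, plus baseline noise."""
    x = np.linspace(0, 1, n_points)
    centers = {"bupleuri": (0.2, 0.55), "smithii": (0.25, 0.6), "bicaule": (0.22, 0.65)}[kind]
    y = sum(np.exp(-((x - c) ** 2) / 0.002) for c in centers)
    return y + 0.05 * rng.standard_normal(n_points)

def dwt_features(spectrum, wavelet="db4", level=4):
    """Concatenate approximation and detail coefficients of a discrete wavelet transform."""
    return np.concatenate(pywt.wavedec(spectrum, wavelet, level=level))

def pnn_classify(x, train_X, train_y, sigma=2.0):
    """Minimal probabilistic-neural-network-style (Parzen window) classifier."""
    scores = {}
    for label in set(train_y):
        pts = train_X[[i for i, l in enumerate(train_y) if l == label]]
        d2 = np.sum((pts - x) ** 2, axis=1)
        scores[label] = np.mean(np.exp(-d2 / (2 * sigma ** 2)))
    return max(scores, key=scores.get)

labels = ["bupleuri", "smithii", "bicaule"]
train_y = [l for l in labels for _ in range(20)]
train_X = np.array([dwt_features(synthetic_spectrum(l)) for l in train_y])

test = dwt_features(synthetic_spectrum("smithii"))
print("predicted class:", pnn_classify(test, train_X, train_y))
```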

  13. Significant life experience: Exploring the lifelong influence of place-based environmental and science education on program participants

    NASA Astrophysics Data System (ADS)

    Colvin, Corrie Ruth

    Current research provides a limited understanding of the lifelong influence of nonformal place-based environmental and science education programs on past participants. This study looks to address this gap, exploring the ways in which these learning environments have contributed to environmental identity and stewardship. Using Dorothy Holland's approach to social practice theory's understanding of identity formation, this study employed narrative interviews and a close-ended survey to understand past participants' experience over time. Participants from two place-based environmental education programs and one science-inquiry program were asked to share their reflections on their program experience and the influence they attribute to that experience. Among all participants, the elements of hands-on learning, supportive instructors, and engaging learning environments remained salient over time. Participants of nature-based programs demonstrated that these programs in particular were formative in contributing to an environmental stewardship identity. Social practice theory can serve as a helpful theoretical framework for significant life experience research, which has largely been missing from this body of research. This study also holds implications for the fields of place-based environmental education, conservation psychology, and sustainability planning, all of which look to understand and increase environmentally sustainable practices.

  14. Liquefaction probability curves for surficial geologic deposits

    USGS Publications Warehouse

    Holzer, Thomas L.; Noce, Thomas E.; Bennett, Michael J.

    2011-01-01

    Liquefaction probability curves that predict the probability of surface manifestations of earthquake-induced liquefaction are developed for 14 different types of surficial geologic units. The units consist of alluvial fan, beach ridge, river delta topset and foreset beds, eolian dune, point bar, flood basin, natural river and alluvial fan levees, abandoned river channel, deep-water lake, lagoonal, sandy artificial fill, and valley train deposits. Probability is conditioned on earthquake magnitude and peak ground acceleration. Curves are developed for water table depths of 1.5 and 5.0 m. Probabilities are derived from complementary cumulative frequency distributions of the liquefaction potential index (LPI) that were computed from 927 cone penetration tests. For natural deposits with a water table at 1.5 m and subjected to a M7.5 earthquake with peak ground acceleration (PGA)  =  0.25g, probabilities range from 0.5 for beach ridge, point bar, and deltaic deposits. The curves also were used to assign ranges of liquefaction probabilities to the susceptibility categories proposed previously for different geologic deposits. For the earthquake described here, probabilities for susceptibility categories have ranges of 0–0.08 for low, 0.09–0.30 for moderate, 0.31–0.62 for high, and 0.63–1.00 for very high. Retrospective predictions of liquefaction during historical earthquakes based on the curves compare favorably to observations.

  15. Significance tests and weighted values for AFLP similarities, based on Arabidopsis in silico AFLP fragment length distributions.

    PubMed Central

    Koopman, Wim J M; Gort, Gerrit

    2004-01-01

    Many AFLP studies include relatively unrelated genotypes that contribute noise to data sets instead of signal. We developed: (1) estimates of expected AFLP similarities between unrelated genotypes, (2) significance tests for AFLP similarities, enabling the detection of unrelated genotypes, and (3) weighted similarity coefficients, including band position information. Detection of unrelated genotypes and use of weighted similarity coefficients will make the analysis of AFLP data sets more informative and more reliable. Test statistics and weighted coefficients were developed for total numbers of shared bands and for Dice, Jaccard, Nei and Li, and simple matching (dis)similarity coefficients. Theoretical and in silico AFLP fragment length distributions (FLDs) were examined as a basis for the tests. The in silico AFLP FLD based on the Arabidopsis thaliana genome sequence was the most appropriate for angiosperms. The G + C content of the selective nucleotides in the in silico AFLP procedure significantly influenced the FLD. Therefore, separate test statistics were calculated for AFLP procedures with high, average, and low G + C contents in the selective nucleotides. The test statistics are generally applicable for angiosperms with a G + C content of approximately 35-40%, but represent conservative estimates for genotypes with higher G + C contents. For the latter, test statistics based on a rice genome sequence are more appropriate. PMID:15342529

  16. Propensity, Probability, and Quantum Theory

    NASA Astrophysics Data System (ADS)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  17. A comprehensive laboratory-based program for classification of variants of uncertain significance in hereditary cancer genes.

    PubMed

    Eggington, J M; Bowles, K R; Moyes, K; Manley, S; Esterling, L; Sizemore, S; Rosenthal, E; Theisen, A; Saam, J; Arnell, C; Pruss, D; Bennett, J; Burbidge, L A; Roa, B; Wenstrup, R J

    2014-09-01

    Genetic testing has the potential to guide the prevention and treatment of disease in a variety of settings, and recent technical advances have greatly increased our ability to acquire large amounts of genetic data. The interpretation of this data remains challenging, as the clinical significance of genetic variation detected in the laboratory is not always clear. Although regulatory agencies and professional societies provide some guidance regarding the classification, reporting, and long-term follow-up of variants, few protocols for the implementation of these guidelines have been described. Because the primary aim of clinical testing is to provide results to inform medical management, a variant classification program that offers timely, accurate, confident and cost-effective interpretation of variants should be an integral component of the laboratory process. Here we describe the components of our laboratory's current variant classification program (VCP), based on 20 years of experience and over one million samples tested, using the BRCA1/2 genes as a model. Our VCP has lowered the percentage of tests in which one or more BRCA1/2 variants of uncertain significance (VUSs) are detected to 2.1% in the absence of a pathogenic mutation, demonstrating how the coordinated application of resources toward classification and reclassification significantly impacts the clinical utility of testing.

  18. MSPI False Indication Probability Simulations

    SciTech Connect

    Dana Kelly; Kurt Vedros; Robert Youngblood

    2011-03-01

This paper examines false indication probabilities in the context of the Mitigating System Performance Index (MSPI), in order to investigate the pros and cons of different approaches to resolving two coupled issues: (1) sensitivity to the prior distribution used in calculating the Bayesian-corrected unreliability contribution to the MSPI, and (2) whether (in a particular plant configuration) to model the fuel oil transfer pump (FOTP) as a separate component, or integrally to its emergency diesel generator (EDG). False indication probabilities were calculated for the following situations: (1) all component reliability parameters at their baseline values, so that the true indication is green, meaning that an indication of white or above would be false positive; (2) one or more components degraded to the extent that the true indication would be (mid) white, and “false” would be green (negative) or yellow (negative) or red (negative). In key respects, this was the approach taken in NUREG-1753. The prior distributions examined were the constrained noninformative (CNI) prior used currently by the MSPI, a mixture of conjugate priors, the Jeffreys noninformative prior, a nonconjugate log(istic)-normal prior, and the minimally informative prior investigated in (Kelly et al., 2010). The mid-white performance state was set at ΔCDF = √10 × 10⁻⁶/yr. For each simulated time history, a check is made of whether the calculated ΔCDF is above or below 10⁻⁶/yr. If the parameters were at their baseline values, and ΔCDF > 10⁻⁶/yr, this is counted as a false positive. Conversely, if one or all of the parameters are set to values corresponding to ΔCDF > 10⁻⁶/yr but that time history’s ΔCDF < 10⁻⁶/yr, this is counted as a false negative indication. The false indication (positive or negative) probability is then estimated as the number of false positive or negative counts divided by the number of time histories (100,000). Results are presented for a set of base case parameter values
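
    The core counting step is straightforward to sketch. The following minimal Python example uses a placeholder lognormal noise model and an illustrative 10⁻⁶/yr threshold; it is not the NUREG-1753 procedure or the paper's plant model, only the false-positive/false-negative bookkeeping:

      import numpy as np

      rng = np.random.default_rng(0)
      N_HISTORIES = 100_000
      THRESHOLD = 1e-6  # illustrative green/white ΔCDF threshold (per yr)

      def simulated_delta_cdf(true_mean, n):
          # Placeholder for the plant model: noisy per-history estimate of ΔCDF.
          return rng.lognormal(mean=np.log(true_mean), sigma=0.5, size=n)

      # Baseline parameters: true indication is green, so estimates above the
      # threshold are false positives.
      est = simulated_delta_cdf(true_mean=3e-7, n=N_HISTORIES)
      p_false_positive = np.mean(est > THRESHOLD)

      # Degraded parameters: true indication is white, so estimates below the
      # threshold are false negatives.
      est = simulated_delta_cdf(true_mean=3e-6, n=N_HISTORIES)
      p_false_negative = np.mean(est < THRESHOLD)

      print(p_false_positive, p_false_negative)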

  19. A clip-based protocol for breast boost radiotherapy provides clear target visualisation and demonstrates significant volume reduction over time

    PubMed Central

    Lewis, Lorraine; Cox, Jennifer; Morgia, Marita; Atyeo, John; Lamoury, Gillian

    2015-01-01

    Introduction The clinical target volume (CTV) for early stage breast cancer is difficult to clearly identify on planning computed tomography (CT) scans. Surgical clips inserted around the tumour bed should help to identify the CTV, particularly if the seroma has been reabsorbed, and enable tracking of CTV changes over time. Methods A surgical clip-based CTV delineation protocol was introduced. CTV visibility and its post-operative shrinkage pattern were assessed. The subjects were 27 early stage breast cancer patients receiving post-operative radiotherapy alone and 15 receiving post-operative chemotherapy followed by radiotherapy. The radiotherapy alone (RT/alone) group received a CT scan at median 25 days post-operatively (CT1rt) and another at 40 Gy, median 68 days (CT2rt). The chemotherapy/RT group (chemo/RT) received a CT scan at median 18 days post-operatively (CT1ch), a planning CT scan at median 126 days (CT2ch), and another at 40 Gy (CT3ch). Results There was no significant difference (P = 0.08) between the initial mean CTV for each cohort. The RT/alone cohort showed significant CTV volume reduction of 38.4% (P = 0.01) at 40 Gy. The Chemo/RT cohort had significantly reduced volumes between CT1ch: median 54 cm3 (4–118) and CT2ch: median 16 cm3, (2–99), (P = 0.01), but no significant volume reduction thereafter. Conclusion Surgical clips enable localisation of the post-surgical seroma for radiotherapy targeting. Most seroma shrinkage occurs early, enabling CT treatment planning to take place at 7 weeks, which is within the 9 weeks recommended to limit disease recurrence. PMID:26451239

  20. A clip-based protocol for breast boost radiotherapy provides clear target visualisation and demonstrates significant volume reduction over time

    SciTech Connect

    Lewis, Lorraine; Cox, Jennifer; Morgia, Marita; Atyeo, John; Lamoury, Gillian

    2015-09-15

The clinical target volume (CTV) for early stage breast cancer is difficult to clearly identify on planning computed tomography (CT) scans. Surgical clips inserted around the tumour bed should help to identify the CTV, particularly if the seroma has been reabsorbed, and enable tracking of CTV changes over time. A surgical clip-based CTV delineation protocol was introduced. CTV visibility and its post-operative shrinkage pattern were assessed. The subjects were 27 early stage breast cancer patients receiving post-operative radiotherapy alone and 15 receiving post-operative chemotherapy followed by radiotherapy. The radiotherapy alone (RT/alone) group received a CT scan at median 25 days post-operatively (CT1rt) and another at 40 Gy, median 68 days (CT2rt). The chemotherapy/RT group (chemo/RT) received a CT scan at median 18 days post-operatively (CT1ch), a planning CT scan at median 126 days (CT2ch), and another at 40 Gy (CT3ch). There was no significant difference (P = 0.08) between the initial mean CTV for each cohort. The RT/alone cohort showed significant CTV volume reduction of 38.4% (P = 0.01) at 40 Gy. The Chemo/RT cohort had significantly reduced volumes between CT1ch: median 54 cm³ (4–118) and CT2ch: median 16 cm³, (2–99), (P = 0.01), but no significant volume reduction thereafter. Surgical clips enable localisation of the post-surgical seroma for radiotherapy targeting. Most seroma shrinkage occurs early, enabling CT treatment planning to take place at 7 weeks, which is within the 9 weeks recommended to limit disease recurrence.

  1. Association between probable postnatal depression and increased infant mortality and morbidity: findings from the DON population-based cohort study in rural Ghana

    PubMed Central

    Weobong, Benedict; ten Asbroek, Augustinus H A; Soremekun, Seyi; Gram, Lu; Amenga-Etego, Seeba; Danso, Samuel; Owusu-Agyei, Seth; Prince, Martin; Kirkwood, Betty R

    2015-01-01

    Objectives To assess the impact of probable depression in the immediate postnatal period on subsequent infant mortality and morbidity. Design Cohort study nested within 4 weekly surveillance of all women of reproductive age to identify pregnancies and collect data on births and deaths. Setting Rural/periurban communities within the Kintampo Health Research Centre study area of the Brong-Ahafo Region of Ghana. Participants 16 560 mothers who had a live singleton birth reported between 24 March 2008 and 11 July 2009, who were screened for probable postnatal depression (pPND) between 4 and 12 weeks post partum (some of whom had also had depression assessed at pregnancy), and whose infants survived to this point. Primary/secondary outcome measures All-cause early infant mortality expressed per 1000 infant-months of follow-up from the time of postnatal assessment to 6 months of age. The secondary outcomes were (1) all-cause infant mortality from the time of postnatal assessment to 12 months of age and (2) reported infant morbidity from the time of the postnatal assessment to 12 months of age. Results 130 infant deaths were recorded and singletons were followed for 67 457.4 infant-months from the time of their mothers’ postnatal depression assessment. pPND was associated with an almost threefold increased risk of mortality up to 6 months (adjusted rate ratio (RR), 2.86 (1.58 to 5.19); p=0.001). The RR up to 12 months was 1.88 (1.09 to 3.24; p=0.023). pPND was also associated with increased risk of infant morbidity. Conclusions There is new evidence for the association between maternal pPND and infant mortality in low-income and middle-income countries. Implementation of the WHO's Mental Health Gap Action Programme (mhGAP) to scale up packages of care integrated with maternal health is encouraged as an important adjunct to child survival efforts. PMID:26316646
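
    For orientation, the headline exposure-adjusted rate implied by these totals is a one-line calculation (numbers taken from the abstract):

      deaths = 130
      infant_months = 67457.4
      rate_per_1000 = 1000 * deaths / infant_months
      print(round(rate_per_1000, 2))  # ~1.93 deaths per 1000 infant-months of follow-up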

  2. Stimulus probability effects in absolute identification.

    PubMed

    Kent, Christopher; Lamberts, Koen

    2016-05-01

This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of presentation probability on both proportion correct and response times. The effects were moderated by the ubiquitous stimulus position effect. The accuracy and response time data were predicted by an exemplar-based model of perceptual cognition (Kent & Lamberts, 2005). The bow in discriminability was also attenuated when presentation probability for middle items was relatively high, an effect that will constrain future model development. The study provides evidence for item-specific learning in absolute identification. Implications for other theories of absolute identification are discussed.

  3. Dynamic mean field theory for lattice gas models of fluids confined in porous materials: Higher order theory based on the Bethe-Peierls and path probability method approximations

    SciTech Connect

    Edison, John R.; Monson, Peter A.

    2014-07-14

Recently we have developed a dynamic mean field theory (DMFT) for lattice gas models of fluids in porous materials [P. A. Monson, J. Chem. Phys. 128(8), 084701 (2008)]. The theory can be used to describe the relaxation processes in the approach to equilibrium or metastable states for fluids in pores and is especially useful for studying systems exhibiting adsorption/desorption hysteresis. In this paper we discuss the extension of the theory to higher order by means of the path probability method (PPM) of Kikuchi and co-workers. We show that this leads to a treatment of the dynamics that is consistent with thermodynamics coming from the Bethe-Peierls or Quasi-Chemical approximation for the equilibrium or metastable equilibrium states of the lattice model. We compare the results from the PPM with those from DMFT and from dynamic Monte Carlo simulations. We find that the predictions from the PPM are qualitatively similar to those from DMFT but give somewhat improved quantitative accuracy, in part due to the superior treatment of the underlying thermodynamics. This comes at the cost of greater computational expense associated with the larger number of equations that must be solved.

  4. Comparisons of estimates of annual exceedance-probability discharges for small drainage basins in Iowa, based on data through water year 2013

    USGS Publications Warehouse

    Eash, David A.

    2015-01-01

An examination was conducted to understand why the 1987 single-variable RREs seem to provide better accuracy and less bias than either of the 2013 multi- or single-variable RREs. A comparison of 1-percent annual exceedance-probability regression lines for hydrologic regions 1-4 from the 1987 single-variable RREs and for flood regions 1-3 from the 2013 single-variable RREs indicates that the 1987 single-variable regional-regression lines generally have steeper slopes and lower discharges when compared to 2013 single-variable regional-regression lines for corresponding areas of Iowa. The combination of the definition of hydrologic regions, the lower discharges, and the steeper slopes of regression lines associated with the 1987 single-variable RREs seems to provide better accuracy and less bias when compared to the 2013 multi- or single-variable RREs; better accuracy and less bias were found particularly for drainage areas less than 2 mi², and also for some drainage areas between 2 and 20 mi². The 2013 multi- and single-variable RREs are considered to provide better accuracy and less bias for larger drainage areas. Results of this study indicate that additional research is needed to address the curvilinear relation between drainage area and AEPDs for areas of Iowa.

  5. The relationship between species detection probability and local extinction probability

    USGS Publications Warehouse

    Alpizar-Jara, R.; Nichols, J.D.; Hines, J.E.; Sauer, J.R.; Pollock, K.H.; Rosenberry, C.S.

    2004-01-01

    In community-level ecological studies, generally not all species present in sampled areas are detected. Many authors have proposed the use of estimation methods that allow detection probabilities that are <1 and that are heterogeneous among species. These methods can also be used to estimate community-dynamic parameters such as species local extinction probability and turnover rates (Nichols et al. Ecol Appl 8:1213-1225; Conserv Biol 12:1390-1398). Here, we present an ad hoc approach to estimating community-level vital rates in the presence of joint heterogeneity of detection probabilities and vital rates. The method consists of partitioning the number of species into two groups using the detection frequencies and then estimating vital rates (e.g., local extinction probabilities) for each group. Estimators from each group are combined in a weighted estimator of vital rates that accounts for the effect of heterogeneity. Using data from the North American Breeding Bird Survey, we computed such estimates and tested the hypothesis that detection probabilities and local extinction probabilities were negatively related. Our analyses support the hypothesis that species detection probability covaries negatively with local probability of extinction and turnover rates. A simulation study was conducted to assess the performance of vital parameter estimators as well as other estimators relevant to questions about heterogeneity, such as coefficient of variation of detection probabilities and proportion of species in each group. Both the weighted estimator suggested in this paper and the original unweighted estimator for local extinction probability performed fairly well and provided no basis for preferring one to the other.
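
    A minimal sketch of the kind of two-group weighted combination described above; the grouping rule, weights, and numbers below are hypothetical, not the authors' exact estimator:

      import numpy as np

      # Hypothetical per-species detection frequencies and estimated local
      # extinction probabilities from a monitoring program.
      detections = np.array([1, 2, 9, 12, 3, 15, 7, 1])
      ext_prob = np.array([0.40, 0.35, 0.10, 0.08, 0.30, 0.05, 0.15, 0.45])

      low = detections <= 3  # "hard to detect" group (assumed cut-off)
      high = ~low

      # Group-specific estimates combined with weights proportional to group size.
      w_low = low.mean()
      w_high = high.mean()
      weighted_extinction = w_low * ext_prob[low].mean() + w_high * ext_prob[high].mean()
      print(round(weighted_extinction, 3))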

  6. Dimension Reduction via Unsupervised Learning Yields Significant Computational Improvements for Support Vector Machine Based Protein Family Classification.

    SciTech Connect

    Webb-Robertson, Bobbie-Jo M.; Matzke, Melissa M.; Oehmen, Christopher S.

    2009-02-26

Reducing the dimension of vectors used in training support vector machines (SVMs) results in a proportional speedup in training time. For large-scale problems this can make the difference between tractable and intractable training tasks. However, it is critical that classifiers trained on reduced datasets perform as reliably as their counterparts trained on high-dimensional data. We assessed principal component analysis (PCA) and sequential projection pursuit (SPP) as dimension reduction strategies in the biology application of classifying proteins into well-defined functional ‘families’ (SVM-based protein family classification) by their impact on run-time, sensitivity and selectivity. Homology vectors of 4352 elements were reduced to approximately 2% of the original data size without significantly affecting accuracy using PCA and SPP, while leading to approximately a 28-fold speedup in run-time.
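
    A minimal scikit-learn sketch of the general workflow (PCA reduction to roughly 2% of the dimensions, then an SVM); the data are synthetic, and SPP is not shown since it is not part of scikit-learn:

      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.pipeline import make_pipeline
      from sklearn.model_selection import cross_val_score
      from sklearn.svm import SVC

      rng = np.random.default_rng(0)
      X = rng.normal(size=(500, 4352))      # synthetic "homology vectors"
      y = rng.integers(0, 2, size=500)      # synthetic family labels

      # Reduce 4352 features to ~2% of the original size, then train a linear SVM.
      model = make_pipeline(PCA(n_components=87), SVC(kernel="linear"))
      print(cross_val_score(model, X, y, cv=3).mean())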

  7. Weighted Feature Significance: A Simple, Interpretable Model of Compound Toxicity Based on the Statistical Enrichment of Structural Features

    PubMed Central

    Huang, Ruili; Southall, Noel; Xia, Menghang; Cho, Ming-Hsuang; Jadhav, Ajit; Nguyen, Dac-Trung; Inglese, James; Tice, Raymond R.; Austin, Christopher P.

    2009-01-01

    In support of the U.S. Tox21 program, we have developed a simple and chemically intuitive model we call weighted feature significance (WFS) to predict the toxicological activity of compounds, based on the statistical enrichment of structural features in toxic compounds. We trained and tested the model on the following: (1) data from quantitative high–throughput screening cytotoxicity and caspase activation assays conducted at the National Institutes of Health Chemical Genomics Center, (2) data from Salmonella typhimurium reverse mutagenicity assays conducted by the U.S. National Toxicology Program, and (3) hepatotoxicity data published in the Registry of Toxic Effects of Chemical Substances. Enrichments of structural features in toxic compounds are evaluated for their statistical significance and compiled into a simple additive model of toxicity and then used to score new compounds for potential toxicity. The predictive power of the model for cytotoxicity was validated using an independent set of compounds from the U.S. Environmental Protection Agency tested also at the National Institutes of Health Chemical Genomics Center. We compared the performance of our WFS approach with classical classification methods such as Naive Bayesian clustering and support vector machines. In most test cases, WFS showed similar or slightly better predictive power, especially in the prediction of hepatotoxic compounds, where WFS appeared to have the best performance among the three methods. The new algorithm has the important advantages of simplicity, power, interpretability, and ease of implementation. PMID:19805409
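
    A toy sketch of an enrichment-weighted additive score in the spirit of WFS; the Fisher exact test and the -log10(p) weighting below are simplifying assumptions, not the published model:

      import numpy as np
      from scipy.stats import fisher_exact

      rng = np.random.default_rng(1)
      n_compounds, n_features = 1000, 50
      X = rng.integers(0, 2, size=(n_compounds, n_features))  # feature presence/absence
      toxic = rng.integers(0, 2, size=n_compounds).astype(bool)

      # Weight each structural feature by its enrichment among toxic compounds.
      weights = np.zeros(n_features)
      for j in range(n_features):
          present = X[:, j] == 1
          table = [[np.sum(present & toxic), np.sum(present & ~toxic)],
                   [np.sum(~present & toxic), np.sum(~present & ~toxic)]]
          _, p = fisher_exact(table, alternative="greater")
          weights[j] = -np.log10(p)

      # Additive toxicity score for a new compound: sum of the weights of its features.
      new_compound = rng.integers(0, 2, size=n_features)
      print(float(new_compound @ weights))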

  8. What if the Electrical Conductivity of Graphene Is Significantly Deteriorated for the Graphene-Semiconductor Composite-Based Photocatalysis?

    PubMed

    Weng, Bo; Xu, Yi-Jun

    2015-12-23

The extraordinary electrical conductivity of graphene has been widely regarded as the bible in literature to explain the activity enhancement of graphene-semiconductor composite photocatalysts. However, from the viewpoint of an entire composite-based artificial photosynthetic system, the significant matter of photocatalytic performance of graphene-semiconductor composite system is not just a simple and only issue of excellent electrical conductivity of graphene. Herein, the intentional design of melamine resin monomers functionalized three-dimensional (3D) graphene (denoted as MRGO) with significantly deteriorated electrical conductivity enables us to independently focus on studying the geometry effect of MRGO on the photocatalytic performance of graphene-semiconductor composite. By coupling semiconductor CdS with graphene, including MRGO and reduced graphene oxide (RGO), it was found that the CdS-MRGO composites exhibit much higher visible light photoactivity than CdS-RGO composites although the electrical conductivity of MRGO is remarkably much lower than that of RGO. The comparison characterizations evidence that such photoactivity enhancement is predominantly attributed to the restacking-inhibited 3D architectural morphology of MRGO, by which the synergistic effects of boosted separation and transportation of photogenerated charge carriers and increased adsorption capacity can be achieved. Our work highlights that the significant matter of photocatalytic performance of graphene-semiconductor composite is not a simple issue on how to harness the electrical conductivity of graphene but the rational ensemble design of graphene-semiconductor composite, which includes the integrative optimization of geometrical and electrical factors of individual component and the interface composition. PMID:26624808

  9. What if the Electrical Conductivity of Graphene Is Significantly Deteriorated for the Graphene-Semiconductor Composite-Based Photocatalysis?

    PubMed

    Weng, Bo; Xu, Yi-Jun

    2015-12-23

The extraordinary electrical conductivity of graphene has been widely regarded as the bible in literature to explain the activity enhancement of graphene-semiconductor composite photocatalysts. However, from the viewpoint of an entire composite-based artificial photosynthetic system, the significant matter of photocatalytic performance of graphene-semiconductor composite system is not just a simple and only issue of excellent electrical conductivity of graphene. Herein, the intentional design of melamine resin monomers functionalized three-dimensional (3D) graphene (denoted as MRGO) with significantly deteriorated electrical conductivity enables us to independently focus on studying the geometry effect of MRGO on the photocatalytic performance of graphene-semiconductor composite. By coupling semiconductor CdS with graphene, including MRGO and reduced graphene oxide (RGO), it was found that the CdS-MRGO composites exhibit much higher visible light photoactivity than CdS-RGO composites although the electrical conductivity of MRGO is remarkably much lower than that of RGO. The comparison characterizations evidence that such photoactivity enhancement is predominantly attributed to the restacking-inhibited 3D architectural morphology of MRGO, by which the synergistic effects of boosted separation and transportation of photogenerated charge carriers and increased adsorption capacity can be achieved. Our work highlights that the significant matter of photocatalytic performance of graphene-semiconductor composite is not a simple issue on how to harness the electrical conductivity of graphene but the rational ensemble design of graphene-semiconductor composite, which includes the integrative optimization of geometrical and electrical factors of individual component and the interface composition.

  10. Combined Statistical Analyses of Peptide Intensities and Peptide Occurrences Improves Identification of Significant Peptides from MS-based Proteomics Data

    SciTech Connect

    Webb-Robertson, Bobbie-Jo M.; McCue, Lee Ann; Waters, Katrina M.; Matzke, Melissa M.; Jacobs, Jon M.; Metz, Thomas O.; Varnum, Susan M.; Pounds, Joel G.

    2010-11-01

    Liquid chromatography-mass spectrometry-based (LC-MS) proteomics uses peak intensities of proteolytic peptides to infer the differential abundance of peptides/proteins. However, substantial run-to-run variability in peptide intensities and observations (presence/absence) of peptides makes data analysis quite challenging. The missing abundance values in LC-MS proteomics data are difficult to address with traditional imputation-based approaches because the mechanisms by which data are missing are unknown a priori. Data can be missing due to random mechanisms such as experimental error, or non-random mechanisms such as a true biological effect. We present a statistical approach that uses a test of independence known as a G-test to test the null hypothesis of independence between the number of missing values and the experimental groups. We pair the G-test results evaluating independence of missing data (IMD) with a standard analysis of variance (ANOVA) that uses only means and variances computed from the observed data. Each peptide is therefore represented by two statistical confidence metrics, one for qualitative differential observation and one for quantitative differential intensity. We use two simulated and two real LC-MS datasets to demonstrate the robustness and sensitivity of the ANOVA-IMD approach for assigning confidence to peptides with significant differential abundance among experimental groups.
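
    A minimal sketch of the two paired tests for a single peptide, using scipy's log-likelihood-ratio option for the G-test; the data are synthetic and the missingness pattern is hard-coded for illustration:

      import numpy as np
      from scipy.stats import chi2_contingency, f_oneway

      # Synthetic peptide intensities in two groups of 10 runs (NaN = not observed).
      rng = np.random.default_rng(0)
      g1 = rng.normal(20, 1, 10); g1[:1] = np.nan   # 1 missing value in group 1
      g2 = rng.normal(21, 1, 10); g2[:6] = np.nan   # 6 missing values in group 2

      # G-test of independence between missingness and group (qualitative evidence).
      table = [[np.isnan(g1).sum(), (~np.isnan(g1)).sum()],
               [np.isnan(g2).sum(), (~np.isnan(g2)).sum()]]
      g_stat, p_missing, _, _ = chi2_contingency(table, lambda_="log-likelihood")

      # ANOVA on the observed intensities only (quantitative evidence).
      f_stat, p_intensity = f_oneway(g1[~np.isnan(g1)], g2[~np.isnan(g2)])
      print(p_missing, p_intensity)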

  11. Increasing the probability of long-range (1 month) SPI index forecasts based on SL-AV model for the Russian territory.

    NASA Astrophysics Data System (ADS)

    Utkuzova, Dilyara; Khan, Valentina; Donner, Reik

    2016-04-01

Long-range precipitation forecasts can be produced with a numerical weather prediction model, but the skill of the raw model output is often modest, so post-processing methods are typically applied to produce the long-range precipitation forecast. For this purpose the SPI index was used; it first has to be evaluated with statistical techniques. Different parameters of the SPI frequency distribution and long-term tendencies were calculated, as well as spatial characteristics indicating drought and wetness propagation. The analysis shows that in recent years the intensity of drought and wetness extremes over Russia has tended to increase, with fewer droughts in the northern regions. Drought propagation over the European territory of Russia (EPR) is decreasing in June and August and increasing in July; the wetness tendencies are the opposite. For the Asian territory of Russia (APR), drought propagation is significantly increasing in July along with a decreasing wetness trend. A synoptic analysis was then conducted to describe wet and drought events. Synoptic conditions favorable for the formation of wet and drought extremes were identified by comparing synoptic charts with the spatial patterns of SPI. For this analysis, episodes of extremely wet conditions (6 episodes for the APR and 7 for the EPR) and drought (6 episodes for the APR and 6 for the EPR) were classified using A. Katz' typology of weather regimes. For the European part of Russia, extreme DROUGHT events are linked to the weather type named "MIXED"; for the Asian part of Russia, to the type "CENTRAL". For the European part of Russia, extreme WET events are associated with the "CENTRAL" type. During WET extreme events linked to the "EASTERN" classification type, the planetary frontal zone is displaced southward by approximately 5-25 degrees relative to its normal climatological position. The SPI field (data was
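
    For reference, a 1-month SPI value is obtained by fitting a distribution (commonly a gamma) to the precipitation climatology and mapping each month through the standard normal quantile function. A minimal sketch with a synthetic record; this is the generic SPI recipe, not the SL-AV post-processing itself:

      import numpy as np
      from scipy.stats import gamma, norm

      rng = np.random.default_rng(0)
      monthly_precip = rng.gamma(shape=2.0, scale=30.0, size=360)  # synthetic 30-yr record

      # Fit a gamma distribution to the climatology, then transform to standard normal.
      a, loc, scale = gamma.fit(monthly_precip, floc=0)
      cdf = gamma.cdf(monthly_precip, a, loc=loc, scale=scale)
      spi = norm.ppf(cdf)  # SPI < -1 indicates drought, SPI > 1 indicates wet conditions
      print(spi[:12].round(2))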

  12. Probability Interpretation of Quantum Mechanics.

    ERIC Educational Resources Information Center

    Newton, Roger G.

    1980-01-01

    This paper draws attention to the frequency meaning of the probability concept and its implications for quantum mechanics. It emphasizes that the very meaning of probability implies the ensemble interpretation of both pure and mixed states. As a result some of the "paradoxical" aspects of quantum mechanics lose their counterintuitive character.…

  13. The Probabilities of Conditionals Revisited

    ERIC Educational Resources Information Center

    Douven, Igor; Verbrugge, Sara

    2013-01-01

According to what is now commonly referred to as "the Equation" in the literature on indicative conditionals, the probability of any indicative conditional equals the conditional probability of its consequent given its antecedent. Philosophers widely agree in their assessment that the triviality arguments of…

  14. Site occupancy models with heterogeneous detection probabilities

    USGS Publications Warehouse

    Royle, J. Andrew

    2006-01-01

Models for estimating the probability of occurrence of a species in the presence of imperfect detection are important in many ecological disciplines. In these "site occupancy" models, the possibility of heterogeneity in detection probabilities among sites must be considered because variation in abundance (and other factors) among sampled sites induces variation in detection probability (p). In this article, I develop occurrence probability models that allow for heterogeneous detection probabilities by considering several common classes of mixture distributions for p. For any mixing distribution, the likelihood has the general form of a zero-inflated binomial mixture for which inference based upon integrated likelihood is straightforward. A recent paper by Link (2003, Biometrics 59, 1123–1130) demonstrates that in closed population models used for estimating population size, different classes of mixture distributions are indistinguishable from data, yet can produce very different inferences about population size. I demonstrate that this problem can also arise in models for estimating site occupancy in the presence of heterogeneous detection probabilities. The implications of this are discussed in the context of an application to avian survey data and the development of animal monitoring programs.
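
    A minimal sketch of a zero-inflated binomial likelihood with a two-class finite mixture on the detection probability p; the data, mixture form, and starting values are illustrative, not taken from the paper:

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import binom

      # Detection histories: number of detections out of J visits at each site.
      J = 5
      y = np.array([0, 0, 1, 3, 0, 2, 0, 5, 0, 4, 0, 0, 1, 0, 3])

      def neg_log_lik(theta):
          psi, pi1, p1, p2 = 1 / (1 + np.exp(-theta))  # map parameters to (0, 1)
          # Two detection classes: class 1 with probability pi1, class 2 otherwise.
          lik_occupied = pi1 * binom.pmf(y, J, p1) + (1 - pi1) * binom.pmf(y, J, p2)
          lik = psi * lik_occupied + (1 - psi) * (y == 0)  # zero inflation: unoccupied sites
          return -np.sum(np.log(lik))

      fit = minimize(neg_log_lik, x0=np.zeros(4), method="Nelder-Mead")
      print(1 / (1 + np.exp(-fit.x)))  # estimates of psi, pi1, p1, p2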

  15. Pig Data and Bayesian Inference on Multinomial Probabilities

    ERIC Educational Resources Information Center

    Kern, John C.

    2006-01-01

    Bayesian inference on multinomial probabilities is conducted based on data collected from the game Pass the Pigs[R]. Prior information on these probabilities is readily available from the instruction manual, and is easily incorporated in a Dirichlet prior. Posterior analysis of the scoring probabilities quantifies the discrepancy between empirical…
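
    A minimal sketch of the Dirichlet-multinomial update involved; the outcome labels, prior pseudo-counts, and observed rolls below are invented, not the manual's values:

      import numpy as np

      outcomes = ["side", "razorback", "trotter", "snouter", "leaning jowler"]
      prior = np.array([35.0, 30.0, 20.0, 10.0, 5.0])  # Dirichlet prior pseudo-counts
      counts = np.array([40, 28, 22, 8, 2])            # observed roll outcomes

      posterior = prior + counts                       # Dirichlet posterior parameters
      posterior_mean = posterior / posterior.sum()
      print(dict(zip(outcomes, posterior_mean.round(3))))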

  16. Parametric probability distributions for anomalous change detection

    SciTech Connect

    Theiler, James P; Foy, Bernard R; Wohlberg, Brendt E; Scovel, James C

    2010-01-01

    The problem of anomalous change detection arises when two (or possibly more) images are taken of the same scene, but at different times. The aim is to discount the 'pervasive differences' that occur thoughout the imagery, due to the inevitably different conditions under which the images were taken (caused, for instance, by differences in illumination, atmospheric conditions, sensor calibration, or misregistration), and to focus instead on the 'anomalous changes' that actually take place in the scene. In general, anomalous change detection algorithms attempt to model these normal or pervasive differences, based on data taken directly from the imagery, and then identify as anomalous those pixels for which the model does not hold. For many algorithms, these models are expressed in terms of probability distributions, and there is a class of such algorithms that assume the distributions are Gaussian. By considering a broader class of distributions, however, a new class of anomalous change detection algorithms can be developed. We consider several parametric families of such distributions, derive the associated change detection algorithms, and compare the performance with standard algorithms that are based on Gaussian distributions. We find that it is often possible to significantly outperform these standard algorithms, even using relatively simple non-Gaussian models.

  17. Probability mapping of scarred myocardium using texture and intensity features in CMR images

    PubMed Central

    2013-01-01

    Background The myocardium exhibits heterogeneous nature due to scarring after Myocardial Infarction (MI). In Cardiac Magnetic Resonance (CMR) imaging, Late Gadolinium (LG) contrast agent enhances the intensity of scarred area in the myocardium. Methods In this paper, we propose a probability mapping technique using Texture and Intensity features to describe heterogeneous nature of the scarred myocardium in Cardiac Magnetic Resonance (CMR) images after Myocardial Infarction (MI). Scarred tissue and non-scarred tissue are represented with high and low probabilities, respectively. Intermediate values possibly indicate areas where the scarred and healthy tissues are interwoven. The probability map of scarred myocardium is calculated by using a probability function based on Bayes rule. Any set of features can be used in the probability function. Results In the present study, we demonstrate the use of two different types of features. One is based on the mean intensity of pixel and the other on underlying texture information of the scarred and non-scarred myocardium. Examples of probability maps computed using the mean intensity of pixel and the underlying texture information are presented. We hypothesize that the probability mapping of myocardium offers alternate visualization, possibly showing the details with physiological significance difficult to detect visually in the original CMR image. Conclusion The probability mapping obtained from the two features provides a way to define different cardiac segments which offer a way to identify areas in the myocardium of diagnostic importance (like core and border areas in scarred myocardium). PMID:24053280
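
    A minimal sketch of a per-pixel posterior probability of scar from a single intensity feature via Bayes' rule; the Gaussian class models, their parameters, and the prior are assumptions for illustration only:

      import numpy as np
      from scipy.stats import norm

      scar = norm(loc=180, scale=25)     # assumed intensity model for LG-enhanced scar
      healthy = norm(loc=90, scale=20)   # assumed intensity model for healthy myocardium
      prior_scar = 0.3                   # assumed prior probability of scar

      def scar_probability(intensity):
          num = scar.pdf(intensity) * prior_scar
          den = num + healthy.pdf(intensity) * (1 - prior_scar)
          return num / den

      pixels = np.array([70, 110, 140, 200])
      print(scar_probability(pixels).round(3))  # values near 1 indicate probable scar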

  18. A commercial PCV2a-based vaccine significantly reduces PCV2b transmission in experimental conditions.

    PubMed

    Rose, N; Andraud, M; Bigault, L; Jestin, A; Grasland, B

    2016-07-19

Transmission characteristics of PCV2 have been compared between vaccinated and non-vaccinated pigs in experimental conditions. Twenty-four Specific Pathogen Free (SPF) piglets, vaccinated against PCV2 at 3 weeks of age (PCV2a recombinant CAP protein-based vaccine), were inoculated at 15 days post-vaccination with a PCV2b inoculum (6⋅10⁵ TCID50), and put in contact with 24 vaccinated SPF piglets during 42 days post-inoculation. Those piglets were divided among six replicates of a contact trial, each involving 4 inoculated piglets mingled with 4 susceptible SPF piglets. Two replicates of a similar contact trial were made with non-vaccinated pigs. Non-vaccinated animals received a placebo at vaccination time and were inoculated the same way and at the same time as the vaccinated group. All the animals were monitored twice weekly using quantitative real-time PCR and ELISA for serology until 42 days post-inoculation. The frequency of infection and the PCV2 genome load in sera of the vaccinated pigs were significantly reduced compared to the non-vaccinated animals. The duration of infectiousness was significantly different between the vaccinated and non-vaccinated groups (16.6 days [14.7; 18.4] and 26.6 days [22.9; 30.4], respectively). The transmission rate was also considerably decreased in vaccinated pigs (β=0.09 [0.05-0.14] compared to β=0.19 [0.11-0.32] in non-vaccinated pigs). This led to an estimated reproduction ratio of 1.5 [95% CI 0.8-2.2] in vaccinated animals versus 5.1 [95% CI 2.5-8.2] in non-vaccinated pigs when merging data from this experiment with previous trials carried out under the same conditions. PMID:27318416

  19. A commercial PCV2a-based vaccine significantly reduces PCV2b transmission in experimental conditions.

    PubMed

    Rose, N; Andraud, M; Bigault, L; Jestin, A; Grasland, B

    2016-07-19

Transmission characteristics of PCV2 have been compared between vaccinated and non-vaccinated pigs in experimental conditions. Twenty-four Specific Pathogen Free (SPF) piglets, vaccinated against PCV2 at 3 weeks of age (PCV2a recombinant CAP protein-based vaccine), were inoculated at 15 days post-vaccination with a PCV2b inoculum (6⋅10⁵ TCID50), and put in contact with 24 vaccinated SPF piglets during 42 days post-inoculation. Those piglets were divided among six replicates of a contact trial, each involving 4 inoculated piglets mingled with 4 susceptible SPF piglets. Two replicates of a similar contact trial were made with non-vaccinated pigs. Non-vaccinated animals received a placebo at vaccination time and were inoculated the same way and at the same time as the vaccinated group. All the animals were monitored twice weekly using quantitative real-time PCR and ELISA for serology until 42 days post-inoculation. The frequency of infection and the PCV2 genome load in sera of the vaccinated pigs were significantly reduced compared to the non-vaccinated animals. The duration of infectiousness was significantly different between the vaccinated and non-vaccinated groups (16.6 days [14.7; 18.4] and 26.6 days [22.9; 30.4], respectively). The transmission rate was also considerably decreased in vaccinated pigs (β=0.09 [0.05-0.14] compared to β=0.19 [0.11-0.32] in non-vaccinated pigs). This led to an estimated reproduction ratio of 1.5 [95% CI 0.8-2.2] in vaccinated animals versus 5.1 [95% CI 2.5-8.2] in non-vaccinated pigs when merging data from this experiment with previous trials carried out under the same conditions.

  20. Holographic probabilities in eternal inflation.

    PubMed

    Bousso, Raphael

    2006-11-10

    In the global description of eternal inflation, probabilities for vacua are notoriously ambiguous. The local point of view is preferred by holography and naturally picks out a simple probability measure. It is insensitive to large expansion factors or lifetimes and so resolves a recently noted paradox. Any cosmological measure must be complemented with the probability for observers to emerge in a given vacuum. In lieu of anthropic criteria, I propose to estimate this by the entropy that can be produced in a local patch. This allows for prior-free predictions.

  1. Pathway-based network analysis of myeloma tumors: monoclonal gammopathy of unknown significance, smoldering multiple myeloma, and multiple myeloma.

    PubMed

    Dong, L; Chen, C Y; Ning, B; Xu, D L; Gao, J H; Wang, L L; Yan, S Y; Cheng, S

    2015-01-01

Although many studies have been carried out on monoclonal gammopathy of unknown significance (MGUS), smoldering multiple myeloma (SMM), and multiple myeloma (MM), their classification and underlying pathogenesis are far from elucidated. To discover the relationships among MGUS, SMM, and MM at the transcriptome level, differentially expressed genes in MGUS, SMM, and MM were identified by the rank product method, and then co-expression networks were constructed by integrating the data. Finally, a pathway-network was constructed based on Kyoto Encyclopedia of Genes and Genomes pathway enrichment analysis, and the relationships between the pathways were identified. The results indicated that there were 55, 78, and 138 pathways involved in the myeloma tumor developmental stages of MGUS, SMM, and MM, respectively. The biological processes identified therein were found to have a close relationship with the immune system. Processes and pathways related to the abnormal activity of DNA and RNA were also present in SMM and MM. Six common pathways were found in the whole process of myeloma tumor development. Nine pathways were shown to participate in the progression of MGUS to SMM, and prostate cancer was the sole pathway that was involved only in MGUS and MM. Pathway-network analysis might provide a new indicator for the developmental stage diagnosis of myeloma tumors. PMID:26345890

  2. Reactive Intermediates: Molecular and MS-Based Approaches to Assess the Functional Significance of Chemical:Protein Adducts1

    PubMed Central

    Monks, Terrence J.; Lau, Serrine S.

    2014-01-01

    Biologically reactive intermediates formed as endogenous products of various metabolic processes are considered important factors in a variety of human diseases, including Parkinson’s disease and other neurological disorders, diabetes and complications thereof, and other inflammatory-associated diseases. Chemical-induced toxicities are also frequently mediated via the bioactivation of relatively stable organic molecules to reactive electrophilic metabolites. Indeed, chemical-induced toxicities have long been known to be associated with the ability of electrophilic metabolites to react with a variety of targets within the cell, including their covalent adduction to nucleophilic residues in proteins, and nucleotides within DNA. Although we possess considerable knowledge of the various biochemical mechanisms by which chemicals undergo metabolic bioactivation, we understand far less about the processes that couple bioactivation to toxicity. Identifying specific sites within a protein that are targets for adduction can provide the initial information necessary to determine whether such adventitious post-translational modifications significantly alter either protein structure and/or function. To address this problem we have developed MS-based approaches to identify specific amino acid targets of electrophile adduction (electrophile-binding motifs), coupled with molecular modeling of such adducts, to determine the potential structural and functional consequences. Where appropriate, functional assays are subsequently conducted to assess protein function. PMID:23222993

  3. How LO Can You GO? Using the Dice-Based Golf Game GOLO to Illustrate Inferences on Proportions and Discrete Probability Distributions

    ERIC Educational Resources Information Center

    Stephenson, Paul; Richardson, Mary; Gabrosek, John; Reischman, Diann

    2009-01-01

    This paper describes an interactive activity that revolves around the dice-based golf game GOLO. The GOLO game can be purchased at various retail locations or online at igolo.com. In addition, the game may be played online free of charge at igolo.com. The activity is completed in four parts. The four parts can be used in a sequence or they can be…

  4. Automatic Item Generation of Probability Word Problems

    ERIC Educational Resources Information Center

    Holling, Heinz; Bertling, Jonas P.; Zeuch, Nina

    2009-01-01

    Mathematical word problems represent a common item format for assessing student competencies. Automatic item generation (AIG) is an effective way of constructing many items with predictable difficulties, based on a set of predefined task parameters. The current study presents a framework for the automatic generation of probability word problems…

  5. Logic, probability, and human reasoning.

    PubMed

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction.

  6. Dinosaurs, Dinosaur Eggs, and Probability.

    ERIC Educational Resources Information Center

    Teppo, Anne R.; Hodgson, Ted

    2001-01-01

    Outlines several recommendations for teaching probability in the secondary school. Offers an activity that employs simulation by hand and using a programmable calculator in which geometry, analytical geometry, and discrete mathematics are explored. (KHR)

  7. The Probabilities of Unique Events

    PubMed Central

    Khemlani, Sangeet S.; Lotstein, Max; Johnson-Laird, Phil

    2012-01-01

    Many theorists argue that the probabilities of unique events, even real possibilities such as President Obama's re-election, are meaningless. As a consequence, psychologists have seldom investigated them. We propose a new theory (implemented in a computer program) in which such estimates depend on an intuitive non-numerical system capable only of simple procedures, and a deliberative system that maps intuitions into numbers. The theory predicts that estimates of the probabilities of conjunctions should often tend to split the difference between the probabilities of the two conjuncts. We report two experiments showing that individuals commit such violations of the probability calculus, and corroborating other predictions of the theory, e.g., individuals err in the same way even when they make non-numerical verbal estimates, such as that an event is highly improbable. PMID:23056224

  8. Joint probabilities and quantum cognition

    SciTech Connect

    Acacio de Barros, J.

    2012-12-18

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  9. Joint probabilities and quantum cognition

    NASA Astrophysics Data System (ADS)

    de Barros, J. Acacio

    2012-12-01

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  10. Probability of real-time detection versus probability of infection for aerosolized biowarfare agents: a model study.

    PubMed

    Sabelnikov, Alexander; Zhukov, Vladimir; Kempf, Ruth

    2006-05-15

Real-time biosensors are expected to provide significant help in emergency response management should a terrorist attack with the use of biowarfare (BW) agents occur. In spite of recent and spectacular progress in the field of biosensors, several core questions still remain unaddressed. For instance, how sensitive should a sensor be? To what levels of infection would the different sensitivity limits correspond? How do the probabilities of identification correspond to the probabilities of infection by an agent? In this paper, an attempt was made to address these questions. A simple probability model was generated for the calculation of risks of infection of humans exposed to different doses of infectious agents and of the probability of their simultaneous real-time detection/identification by a model biosensor and its network. A model biosensor was defined as a single device that included an aerosol sampler and a device for identification by any known (or conceived) method. A network of biosensors was defined as a set of several single biosensors that operated in a similar way and dealt with the same amount of an agent. Neither the particular deployment of sensors within the network, nor the spatial and temporal distribution of agent aerosols due to wind, ventilation, humidity, temperature, etc., was considered by the model. Three model biosensors based on PCR, antibody/antigen, and MS techniques were used for simulation. A wide range of their metric parameters, encompassing those of commercially available and laboratory biosensors as well as those of future, theoretically conceivable devices, was used for several hundred simulations. Based on the analysis of the obtained results, it is concluded that small concentrations of aerosolized agents that are still able to provide significant risks of infection, especially for highly infectious agents (e.g. for smallpox these risks are 1, 8, and 37 infected out of 1000 exposed, depending on the viability of the virus preparation), will
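
    A heavily simplified sketch of the dose-to-risk step such a model has to make; the exponential dose-response form and the parameter r are assumptions, not the authors' model:

      import numpy as np

      def p_infection(dose, r=0.01):
          # Exponential dose-response: probability that at least one of `dose`
          # inhaled organisms initiates infection, each independently with probability r.
          return 1.0 - np.exp(-r * dose)

      doses = np.array([1, 10, 100, 1000])
      print(p_infection(doses).round(3))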

  11. Steering in spin tomographic probability representation

    NASA Astrophysics Data System (ADS)

    Man'ko, V. I.; Markovich, L. A.

    2016-09-01

The steering property, known for the two-qubit state in terms of specific inequalities for the correlation function, is translated to the state of a qudit with spin j = 3/2. Since most steering detection inequalities are based on correlation functions, we introduce analogs of such functions for single qudit systems. The tomographic probability representation for the qudit states is applied. The connection between the correlation function in the two-qubit system and in the single qudit is presented in an integral form, with an intertwining kernel calculated explicitly in tomographic probability terms.

  12. Nonstationary envelope process and first excursion probability.

    NASA Technical Reports Server (NTRS)

    Yang, J.-N.

    1972-01-01

The definition of the stationary random envelope proposed by Cramer and Leadbetter is extended to the envelope of a nonstationary random process possessing evolutionary power spectral densities. The density function, the joint density function, the moment function, and the level-crossing rate of the nonstationary envelope process are derived. Based on the envelope statistics, approximate solutions to the first excursion probability of nonstationary random processes are obtained. In particular, applications of the first excursion probability to earthquake engineering problems are demonstrated in detail.

  13. Tsunami probability in the Caribbean Region

    USGS Publications Warehouse

    Parsons, T.; Geist, E.L.

    2008-01-01

We calculated tsunami runup probability (in excess of 0.5 m) at coastal sites throughout the Caribbean region. We applied a Poissonian probability model because of the variety of uncorrelated tsunami sources in the region. Coastlines were discretized into 20 km by 20 km cells, and the mean tsunami runup rate was determined for each cell. The remarkable ~500-year empirical record compiled by O'Loughlin and Lander (2003) was used to calculate an empirical tsunami probability map, the first of three constructed for this study. However, it is unclear whether the 500-year record is complete, so we conducted a seismic moment-balance exercise using a finite-element model of the Caribbean-North American plate boundaries and the earthquake catalog, and found that moment could be balanced if the seismic coupling coefficient is c = 0.32. Modeled moment release was therefore used to generate synthetic earthquake sequences to calculate 50 tsunami runup scenarios for 500-year periods. We made a second probability map from numerically-calculated runup rates in each cell. Differences between the first two probability maps based on empirical and numerical-modeled rates suggest that each captured different aspects of tsunami generation; the empirical model may be deficient in primary plate-boundary events, whereas numerical model rates lack backarc fault and landslide sources. We thus prepared a third probability map using Bayesian likelihood functions derived from the empirical and numerical rate models and their attendant uncertainty to weight a range of rates at each 20 km by 20 km coastal cell. Our best-estimate map gives a range of 30-year runup probability from 0–30% regionally. © Birkhäuser 2008.
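
    The Poissonian step from a mean runup rate to an exposure-window probability is P = 1 − exp(−λT). A minimal sketch with illustrative per-cell rates (not the study's values):

      import numpy as np

      rate_per_yr = np.array([1e-4, 5e-4, 2e-3, 1e-2])  # illustrative per-cell runup rates
      T = 30.0                                          # exposure window in years

      p_30yr = 1.0 - np.exp(-rate_per_yr * T)           # probability of at least one runup
      print(p_30yr.round(3))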

  14. Genomics-based Approach and Prognostic Stratification Significance of Gene Mutations in Intermediate-risk Acute Myeloid Leukemia

    PubMed Central

    Wang, Bian-Hong; Li, Yong-Hui; Yu, Li

    2015-01-01

    Objective: Intermediate-risk acute myeloid leukemia (IR-AML), which accounts for a substantial number of AML cases, is highly heterogeneous. We systematically summarize the latest research progress on the significance of gene mutations for prognostic stratification of IR-AML. Data Sources: We conducted a systemic search from the PubMed database up to October, 2014 using various search terms and their combinations including IR-AML, gene mutations, mutational analysis, prognosis, risk stratification, next generation sequencing (NGS). Study Selection: Clinical or basic research articles on NGS and the prognosis of gene mutations in IR-AML were included. Results: The advent of the era of whole-genome sequencing has led to the discovery of an increasing number of molecular genetics aberrations that involved in leukemogenesis, and some of them have been used for prognostic risk stratification. Several studies have consistently identified that some gene mutations have prognostic relevance, however, there are still many controversies for some genes because of lacking sufficient evidence. In addition, tumor cells harbor hundreds of mutated genes and multiple mutations often coexist, therefore, single mutational analysis is not sufficient to make accurate prognostic predictions. The comprehensive analysis of multiple mutations based on sophisticated genomic technologies has raised increasing interest in recent years. Conclusions: NGS represents a pioneering and helpful approach to prognostic risk stratification of IR-AML patients. Further large-scale studies for comprehensive molecular analysis are needed to provide guidance and a theoretical basis for IR-AML prognostic stratification and clinical management. PMID:26315090

  15. A significant carbon sink in temperate forests in Beijing: based on 20-year field measurements in three stands.

    PubMed

    Zhu, JianXiao; Hu, XueYang; Yao, Hui; Liu, GuoHua; Ji, ChenJun; Fang, JingYun

    2015-11-01

Numerous efforts have been made to characterize forest carbon (C) cycles and stocks in various ecosystems. However, long-term observation on each component of the forest C cycle is still lacking. We measured C stocks and fluxes in three permanent temperate forest plots (birch, oak and pine forest) during 2011–2014, and calculated the changes of the components of the C cycle related to the measurements during 1992–1994 at Mt. Dongling, Beijing, China. Forest net primary production in birch, oak, and pine plots was 5.32, 4.53, and 6.73 Mg C ha⁻¹ a⁻¹, respectively. Corresponding net ecosystem production was 0.12, 0.43, and 3.53 Mg C ha⁻¹ a⁻¹. The C stocks and fluxes in 2011–2014 were significantly larger than those in 1992–1994 in which the biomass C densities in birch, oak, and pine plots increased from 50.0, 37.7, and 54.0 Mg C ha⁻¹ in 1994 to 101.5, 77.3, and 110.9 Mg C ha⁻¹ in 2014; soil organic C densities increased from 207.0, 239.1, and 231.7 Mg C ha⁻¹ to 214.8, 241.7, and 238.4 Mg C ha⁻¹; and soil heterotrophic respiration increased from 2.78, 3.49, and 1.81 Mg C ha⁻¹ a⁻¹ to 5.20, 4.10, and 3.20 Mg C ha⁻¹ a⁻¹. These results suggest that the mountainous temperate forest ecosystems in Beijing have served as a carbon sink in the last two decades. These observations of C stocks and fluxes provided field-based data for a long-term study of C cycling in temperate forest ecosystems.

  16. Detection probability of EBPSK-MODEM system

    NASA Astrophysics Data System (ADS)

    Yao, Yu; Wu, Lenan

    2016-07-01

Since the impacting filter-based receiver is able to transform phase modulation into an amplitude peak, a simple threshold decision can detect the Extended Binary Phase Shift Keying (EBPSK) modulated ranging signal in a noisy environment. In this paper, an analysis of the EBPSK-MODEM system output gives the probability density function for EBPSK modulated signals plus noise. The equation of detection probability (pd) for fluctuating and non-fluctuating targets is deduced. A comparison of the pd for the EBPSK-MODEM system and a pulse radar receiver is also made, and some results are plotted. Moreover, the probability curves of such a system with several modulation parameters are analysed. When the modulation parameter is not smaller than 6, the detection performance of the EBPSK-MODEM system exceeds that of a traditional radar system. In addition to theoretical considerations, computer simulations are provided to illustrate the performance.

  17. Local Directed Percolation Probability in Two Dimensions

    NASA Astrophysics Data System (ADS)

    Inui, Norio; Konno, Norio; Komatsu, Genichi; Kameoka, Koichi

    1998-01-01

Using the series expansion method and Monte Carlo simulation, we study the directed percolation probability on the square lattice V_n^0 = {(x, y) ∈ Z²: x + y even, 0 ≤ y ≤ n, −y ≤ x ≤ y}. We calculate the local percolation probability P_n^l, defined as the connection probability between the origin and the site (0, n). The critical behavior of P_∞^l is clearly different from that of the global percolation probability P_∞^g, which is characterized by a critical exponent β_g. An analysis based on the Padé approximants shows β_l = 2β_g. In addition, we find that the series expansion of P_{2n}^l can be expressed as a function of P_n^g.

  18. The role of probabilities in physics.

    PubMed

    Le Bellac, Michel

    2012-09-01

    Although modern physics was born in the XVIIth century as a fully deterministic theory in the form of Newtonian mechanics, the use of probabilistic arguments turned out later on to be unavoidable. Three main situations can be distinguished. (1) When the number of degrees of freedom is very large, on the order of Avogadro's number, a detailed dynamical description is not possible, and in fact not useful: we do not care about the velocity of a particular molecule in a gas; all we need is the probability distribution of the velocities. This statistical description, introduced by Maxwell and Boltzmann, allows us to recover equilibrium thermodynamics, gives a microscopic interpretation of entropy and underlies our understanding of irreversibility. (2) Even when the number of degrees of freedom is small (but larger than three), the sensitivity of chaotic dynamics to initial conditions makes determinism irrelevant in practice, because we cannot control the initial conditions with infinite accuracy. Although die tossing is in principle predictable, the approach to chaotic dynamics in some limit implies that our ignorance of initial conditions is translated into a probabilistic description: each face comes up with probability 1/6. (3) As is well known, quantum mechanics is incompatible with determinism. However, quantum probabilities differ in an essential way from the probabilities introduced previously: it has been shown, from the work of John Bell, that quantum probabilities are intrinsic and cannot be given an ignorance interpretation based on a hypothetical deeper level of description.

  19. Joint probability distributions for projection probabilities of random orthonormal states

    NASA Astrophysics Data System (ADS)

    Alonso, L.; Gorin, T.

    2016-04-01

    The quantum chaos conjecture applied to a finite dimensional quantum system implies that such a system has eigenstates that show similar statistical properties as the column vectors of random orthogonal or unitary matrices. Here, we consider the different probabilities for obtaining a specific outcome in a projective measurement, provided the system is in one of its eigenstates. We then give analytic expressions for the joint probability density for these probabilities, with respect to the ensemble of random matrices. In the case of the unitary group, our results can be applied, also, to the phenomenon of universal conductance fluctuations, where the same mathematical quantities describe partial conductances in a two-terminal mesoscopic scattering problem with a finite number of modes in each terminal.

  20. Computational methods for probability of instability calculations

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Burnside, O. H.

    1990-01-01

    This paper summarizes the development of the methods and a computer program to compute the probability of instability of a dynamic system that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria, based upon the roots of the characteristic equation or Routh-Hurwitz test functions, are investigated. Computational methods based on system reliability analysis methods and importance sampling concepts are proposed to perform efficient probabilistic analysis. Numerical examples are provided to demonstrate the methods.
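
    To make the idea concrete, here is a minimal Monte Carlo sketch of a probability-of-instability calculation for a single second-order equation m*x'' + c*x' + k*x = 0, for which the Routh-Hurwitz criterion reduces to requiring c > 0 and k > 0 when m > 0. The parameter distributions are illustrative assumptions, and the sketch omits the system-reliability and importance-sampling machinery discussed in the paper.

    ```python
    import numpy as np

    def probability_of_instability(n_samples=200_000, seed=0):
        """Crude Monte Carlo estimate of P(instability) for m*x'' + c*x' + k*x = 0.
        With m > 0, the Routh-Hurwitz conditions reduce to c > 0 and k > 0, so the
        system is unstable whenever either sampled coefficient is non-positive."""
        rng = np.random.default_rng(seed)
        c = rng.normal(loc=0.8, scale=0.4, size=n_samples)   # uncertain damping
        k = rng.normal(loc=5.0, scale=1.0, size=n_samples)   # uncertain stiffness
        unstable = (c <= 0.0) | (k <= 0.0)
        return float(unstable.mean())

    if __name__ == "__main__":
        print(f"P(instability) ~ {probability_of_instability():.4f}")   # ~0.023 here
    ```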

  1. FISH-Based Analysis of Clonally Derived CHO Cell Populations Reveals High Probability for Transgene Integration in a Terminal Region of Chromosome 1 (1q13)

    PubMed Central

    Li, Shengwei; Gao, Xiaoping; Peng, Rui; Zhang, Sheng; Fu, Wei

    2016-01-01

    A basic goal in the development of recombinant proteins is the generation of cell lines that express the desired protein stably over many generations. Here, we constructed engineered Chinese hamster ovary cell lines (CHO-S) with a pCHO-hVR1 vector that carried an extracellular domain of a VEGF receptor (VR) fusion gene. Forty-five clones with high hVR1 expression were selected for karyotype analysis. Using fluorescence in situ hybridization (FISH) and G-banding, we found that pCHO-hVR1 was integrated into three chromosomes, including chromosomes 1, Z3 and Z4. Four clones were selected to evaluate their productivity under non-fed, non-optimized shake flask conditions. The results showed that clones 1 and 2, with integration sites on chromosome 1, revealed high levels of hVR1 product (approximately 800 mg/L in shake flasks), whereas clones 3 and 4, with integration sites on chromosomes Z3 or Z4, had lower levels of hVR1 product. Furthermore, clones 1 and 2 maintained stable productivity over a continuous period of 80 generations, whereas clones 3 and 4 showed significant declines in productivity in the presence of selection pressure. Finally, pCHO-hVR1 localized to the same region at chromosome 1q13, the telomere region of normal chromosome 1. These results demonstrate that integration of the exogenous hVR1 gene on chromosome 1, band q13, may create a high protein-producing CHO-S cell line, suggesting that chromosome 1q13 may contain a useful target site for high expression of exogenous proteins. Integration at the chromosome 1q13 target site may avoid the gene silencing and position effects associated with random integration, facilitating exogenous gene expression in CHO-S cells. PMID:27684722

  2. The Objective Borderline method (OBM): a probability-based model for setting up an objective pass/fail cut-off score in medical programme assessments.

    PubMed

    Shulruf, Boaz; Turner, Rolf; Poole, Phillippa; Wilkinson, Tim

    2013-05-01

    The decision to pass or fail a medical student is a 'high stakes' one. The aim of this study is to introduce and demonstrate the feasibility and practicality of a new objective standard-setting method for determining the pass/fail cut-off score from borderline grades. Three methods for setting pass/fail cut-off scores were compared: the Regression Method, the Borderline Group Method, and the new Objective Borderline Method (OBM). Using Year 5 students' OSCE results from one medical school, we established the pass/fail cut-off scores by the three methods above. The comparison indicated that the pass/fail cut-off scores generated by the OBM were similar to those generated by the more established methods (0.840 ≤ r ≤ 0.998; p < .0001). Based on theoretical and empirical analysis, we suggest that the OBM has advantages over existing methods in that it combines objectivity, realism and a robust empirical basis and, no less importantly, is simple to use.
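
    For context, the Borderline Group Method that serves as one of the comparison standards can be sketched in a few lines: the cut-off is a central value (commonly the mean or median) of the checklist scores of candidates whose examiners gave a 'borderline' global rating. The data below are hypothetical.

    ```python
    from statistics import median

    # Hypothetical OSCE station results: (checklist score, examiner global rating).
    results = [
        (78, "pass"), (55, "borderline"), (62, "borderline"), (40, "fail"),
        (70, "pass"), (58, "borderline"), (45, "fail"), (66, "pass"),
        (52, "borderline"), (60, "borderline"),
    ]

    # Borderline Group Method: the pass/fail cut-off is a central value of the
    # scores of candidates rated "borderline".
    borderline_scores = [score for score, rating in results if rating == "borderline"]
    print("Borderline-group cut-off:", median(borderline_scores))   # 58 for this toy data
    ```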

  3. The probability of finding suitable directed donors.

    PubMed

    Kanter, M; Selvin, S; Myhre, B A

    1989-02-01

    A series of tables based on mathematical calculations is given as a guideline for the number of directed donors needed by members of various ethnic/racial groups to provide a desired number of units of blood with a selected probability of achieving this result. From these tables, certain conclusions can be drawn. Unrelated donors who do not know their blood type are an inefficient source of directed donors. Rh-negative patients are unlikely to obtain enough directed-donor units from either related or unrelated donors with confidence unless these donors know their blood type. In general, siblings, parents, and offspring are the most efficient directed donors from the standpoint of compatibility. Cousins, uncles, aunts, nieces, and nephews are not much more likely to be compatible than unrelated donors are. It is easier to obtain suitable directed-donor units among Hispanics than among whites, blacks, or Asians, due to their skewed blood group frequencies. In general, using O-negative directed donors for Rh-positive recipients does not significantly increase the likelihood of finding suitable donors.
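
    The tables described rest on elementary binomial reasoning: if each donor of a given relationship and ethnic group is compatible with probability p, the chance of obtaining at least the required number of units from n donors follows directly. A small sketch of that calculation (the p values below are placeholders, not the article's tabulated frequencies):

    ```python
    from math import comb

    def prob_enough_units(n_donors, units_needed, p_compatible):
        """P(at least `units_needed` of `n_donors` donors are compatible),
        treating donors as independent with compatibility probability p."""
        return sum(comb(n_donors, k) * p_compatible**k * (1 - p_compatible)**(n_donors - k)
                   for k in range(units_needed, n_donors + 1))

    def donors_required(units_needed, p_compatible, confidence=0.90):
        """Smallest number of donors giving at least `confidence` probability of
        providing the required number of compatible units."""
        n = units_needed
        while prob_enough_units(n, units_needed, p_compatible) < confidence:
            n += 1
        return n

    if __name__ == "__main__":
        for p in (0.35, 0.50, 0.80):               # placeholder compatibility rates
            print(f"p = {p:.2f}: {donors_required(2, p)} donors needed "
                  f"for 2 units at 90% confidence")
    ```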

  4. Imprecise probabilities in engineering analyses

    NASA Astrophysics Data System (ADS)

    Beer, Michael; Ferson, Scott; Kreinovich, Vladik

    2013-05-01

    Probabilistic uncertainty and imprecision in structural parameters and in environmental conditions and loads are challenging phenomena in engineering analyses. They require appropriate mathematical modeling and quantification to obtain realistic results when predicting the behavior and reliability of engineering structures and systems. But the modeling and quantification is complicated by the characteristics of the available information, which involves, for example, sparse data, poor measurements and subjective information. This raises the question whether the available information is sufficient for probabilistic modeling or rather suggests a set-theoretical approach. The framework of imprecise probabilities provides a mathematical basis to deal with these problems which involve both probabilistic and non-probabilistic information. A common feature of the various concepts of imprecise probabilities is the consideration of an entire set of probabilistic models in one analysis. The theoretical differences between the concepts mainly concern the mathematical description of the set of probabilistic models and the connection to the probabilistic models involved. This paper provides an overview on developments which involve imprecise probabilities for the solution of engineering problems. Evidence theory, probability bounds analysis with p-boxes, and fuzzy probabilities are discussed with emphasis on their key features and on their relationships to one another. This paper was especially prepared for this special issue and reflects, in various ways, the thinking and presentation preferences of the authors, who are also the guest editors for this special issue.

  5. Children's understanding of posterior probability.

    PubMed

    Girotto, Vittorio; Gonzalez, Michel

    2008-01-01

    Do young children have a basic intuition of posterior probability? Do they update their decisions and judgments in the light of new evidence? We hypothesized that they can do so extensionally, by considering and counting the various ways in which an event may or may not occur. The results reported in this paper showed that from the age of five, children's decisions under uncertainty (Study 1) and judgments about random outcomes (Study 2) are correctly affected by posterior information. From the same age, children correctly revise their decisions in situations in which they face a single, uncertain event, produced by an intentional agent (Study 3). The finding that young children have some understanding of posterior probability supports the theory of naive extensional reasoning, and contravenes some pessimistic views of probabilistic reasoning, in particular the evolutionary claim that the human mind cannot deal with single-case probability. PMID:17391661

  6. Interference of probabilities in dynamics

    SciTech Connect

    Zak, Michail

    2014-08-15

    A new class of dynamical systems with a preset type of interference of probabilities is introduced. It is obtained from the extension of the Madelung equation by replacing the quantum potential with a specially selected feedback from the Liouville equation. It has been proved that these systems are different from both Newtonian and quantum systems, but they can be useful for modeling spontaneous collective novelty phenomena when emerging outputs are qualitatively different from the weighted sum of individual inputs. Formation of language and fast decision-making process as potential applications of the probability interference is discussed.

  7. Knowledge typology for imprecise probabilities.

    SciTech Connect

    Wilson, G. D.; Zucker, L. J.

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  8. Non-Patient-Based Clinical Licensure Examination for Dentistry in Minnesota: Significance of Decision and Description of Process.

    PubMed

    Mills, Eric A

    2016-06-01

    In recent years in the United States, there has been heightened interest in offering clinical licensure examination (CLE) alternatives to the live patient-based method in dentistry. Fueled by ethical concerns of faculty members at the University of Minnesota School of Dentistry, the state of Minnesota's Board of Dentistry approved a motion in 2009 to provide two CLE options to the school's future predoctoral graduates: a patient-based one, administered by the Central Regional Dental Testing Service, and a non-patient-based one administered by the National Dental Examining Board of Canada (NDEB). The validity of the NDEB written exam and objective structured clinical exam (OSCE) has been verified in a multi-year study. Via five-option, one-best-answer, multiple-choice questions in the written exam and extended match questions with up to 15 answer options in the station-based OSCE, competent candidates are distinguished from those who are incompetent in their didactic knowledge and clinical critical thinking and judgment across all dental disciplines. The action had the additional effects of furthering participation of Minnesota Board of Dentistry members in the University of Minnesota School of Dentistry's competency-based curriculum, of involving the school's faculty in NDEB item development workshops, and, beginning in 2018, of no longer permitting the patient-based CLE option on site. The aim of this article is to describe how this change came about and its effects. PMID:27251345

  9. Stretching Probability Explorations with Geoboards

    ERIC Educational Resources Information Center

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  10. GPS: Geometry, Probability, and Statistics

    ERIC Educational Resources Information Center

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  11. Some Surprising Probabilities from Bingo.

    ERIC Educational Resources Information Center

    Mercer, Joseph O.

    1993-01-01

    Investigates the probability of winning the largest prize at Bingo through a series of five simpler problems. Investigations are conducted with the aid of either BASIC computer programs, spreadsheets, or a computer algebra system such as Mathematica. Provides sample data tables to illustrate findings. (MDH)

  12. Probability Simulation in Middle School.

    ERIC Educational Resources Information Center

    Lappan, Glenda; Winter, M. J.

    1980-01-01

    Two simulations designed to teach probability to middle-school age pupils are presented. The first simulates one-on-one foul shooting in basketball; the second deals with collecting a set of six cereal box prizes by buying boxes containing one toy each. (MP)

  13. Comments on quantum probability theory.

    PubMed

    Sloman, Steven

    2014-01-01

    Quantum probability theory (QP) is the best formal representation available of the most common form of judgment involving attribute comparison (inside judgment). People are capable, however, of judgments that involve proportions over sets of instances (outside judgment). Here, the theory does not do so well. I discuss the theory both in terms of descriptive adequacy and normative appropriateness.

  14. Rethinking the learning of belief network probabilities

    SciTech Connect

    Musick, R.

    1996-03-01

    Belief networks are a powerful tool for knowledge discovery that provide concise, understandable probabilistic models of data. There are methods grounded in probability theory to incrementally update the relationships described by the belief network when new information is seen, to perform complex inferences over any set of variables in the data, to incorporate domain expertise and prior knowledge into the model, and to automatically learn the model from data. This paper concentrates on part of the belief network induction problem, that of learning the quantitative structure (the conditional probabilities), given the qualitative structure. In particular, the current practice of rote learning the probabilities in belief networks can be significantly improved upon. We advance the idea of applying any learning algorithm to the task of conditional probability learning in belief networks, discuss potential benefits, and show results of applying neural networks and other algorithms to a medium sized car insurance belief network. The results demonstrate improvements of 10 to 100% in model error rates over current approaches.
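
    The 'rote learning' baseline that the paper sets out to improve on amounts to counting: given the qualitative structure, each conditional probability table is estimated from relative frequencies in the data (optionally smoothed). A minimal sketch with hypothetical variable names (the car-insurance network itself is not reproduced here):

    ```python
    from collections import Counter

    # Hypothetical records; each assigns a value to every variable in the network.
    data = [
        {"age": "young", "risk": "high", "claim": True},
        {"age": "young", "risk": "high", "claim": True},
        {"age": "young", "risk": "low", "claim": False},
        {"age": "old", "risk": "low", "claim": False},
        {"age": "old", "risk": "high", "claim": True},
        {"age": "old", "risk": "low", "claim": True},
    ]

    def learn_cpt(records, child, parents, child_values, alpha=1.0):
        """Estimate P(child | parents) by relative frequency with Laplace
        smoothing alpha -- the simple 'rote' approach the paper contrasts with
        neural-network and other learning algorithms."""
        joint = Counter((tuple(r[p] for p in parents), r[child]) for r in records)
        totals = Counter(tuple(r[p] for p in parents) for r in records)
        return {pa: {v: (joint[(pa, v)] + alpha) / (totals[pa] + alpha * len(child_values))
                     for v in child_values}
                for pa in totals}

    print(learn_cpt(data, child="claim", parents=("age", "risk"),
                    child_values=(True, False)))
    ```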

  15. The influence of initial beliefs on judgments of probability.

    PubMed

    Yu, Erica C; Lagnado, David A

    2012-01-01

    This study aims to investigate whether experimentally induced prior beliefs affect processing of evidence including the updating of beliefs under uncertainty about the unknown probabilities of outcomes and the structural, outcome-generating nature of the environment. Participants played a gambling task in the form of computer-simulated slot machines and were given information about the slot machines' possible outcomes without their associated probabilities. One group was induced with a prior belief about the outcome space that matched the space of actual outcomes to be sampled; the other group was induced with a skewed prior belief that included the actual outcomes and also fictional higher outcomes. In reality, however, all participants sampled evidence from the same underlying outcome distribution, regardless of priors given. Before and during sampling, participants expressed their beliefs about the outcome distribution (values and probabilities). Evaluation of those subjective probability distributions suggests that all participants' judgments converged toward the observed outcome distribution. However, despite observing no supporting evidence for fictional outcomes, a significant proportion of participants in the skewed priors condition expected them in the future. A probe of the participants' understanding of the underlying outcome-generating processes indicated that participants' judgments were based on the information given in the induced priors and consequently, a significant proportion of participants in the skewed condition believed the slot machines were not games of chance while participants in the control condition believed the machines generated outcomes at random. Beyond Bayesian or heuristic belief updating, priors not only contribute to belief revision but also affect one's deeper understanding of the environment. PMID:23060843

  16. Team-Based Learning, Faculty Research, and Grant Writing Bring Significant Learning Experiences to an Undergraduate Biochemistry Laboratory Course

    ERIC Educational Resources Information Center

    Evans, Hedeel Guy; Heyl, Deborah L.; Liggit, Peggy

    2016-01-01

    This biochemistry laboratory course was designed to provide significant learning experiences to expose students to different ways of succeeding as scientists in academia and foster development and improvement of their potential and competency as the next generation of investigators. To meet these goals, the laboratory course employs three…

  17. [Subjective probability of reward receipt and the magnitude effect in probability discounting].

    PubMed

    Isomura, Mieko; Aoyama, Kenjiro

    2008-06-01

    Previous research suggested that larger probabilistic rewards were discounted more steeply than smaller probabilistic rewards (the magnitude effect). This research tests the hypothesis that the magnitude effect reflects the extent to which individuals distrust the stated probability of receiving different amounts of rewards. The participants were 105 college students. Probability discounting of two different amounts of rewards (5 000 yen and 100 000 yen) and the subjective probability of reward receipt of the different amounts (5 000 yen, 100 000 yen and 1 000 000 yen) were measured. The probabilistic 100 000 yen was discounted more steeply than the probabilistic 5 000 yen. The subjective probability of reward receipt was higher in the 5 000 yen than in the 100 000 yen condition. The proportion of subjective probability of receiving 5 000 yen to that of receiving 100 000 yen was significantly correlated with the proportion of degree of probability discounting for 5 000 yen to that for 100 000 yen. These results were consistent with the hypothesis stated above.
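
    A commonly used single-parameter model for this kind of data (not necessarily the exact analysis applied in the study) discounts a probabilistic reward by the odds against receiving it, V = A / (1 + h*theta) with theta = (1 - p) / p; the magnitude effect then appears as a larger h for the larger amount. The parameter values below are illustrative only.

    ```python
    def discounted_value(amount, p, h):
        """Hyperbolic probability discounting: V = A / (1 + h * theta), where
        theta = (1 - p) / p is the odds against receipt and h sets how steeply
        the probabilistic reward is discounted."""
        theta = (1.0 - p) / p
        return amount / (1.0 + h * theta)

    # Illustrative parameters: a larger h for the larger reward reproduces the
    # magnitude effect described above.
    for amount, h in ((5_000, 1.0), (100_000, 2.5)):
        v = discounted_value(amount, p=0.5, h=h)
        print(f"{amount:>7} yen at p = 0.5 -> subjective value {v:,.0f} yen "
              f"({v / amount:.0%} of nominal)")
    ```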

  18. Probability of detection calculations using MATLAB

    NASA Astrophysics Data System (ADS)

    Wei, Yung-Chung

    1993-06-01

    A set of highly efficient computer programs based on Marcum and Swerling's analysis of radar detection has been written in MATLAB to evaluate the probability of detection. The programs use accurate methods, unlike the detectability method, which is based on approximation. This thesis also outlines radar detection theory and target models as background. The goal of this effort is to provide a set of efficient computer programs for student use and as a teaching aid. The programs are designed to be user friendly and to run on personal computers.
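
    As a point of reference for the kind of calculation such programs perform (this is a standard textbook result, not a transcription of the thesis's MATLAB code), the single-pulse detection probability of a Swerling I fluctuating target has the closed form Pd = Pfa^(1/(1+SNR)):

    ```python
    def pd_swerling1(pfa, snr_db):
        """Single-pulse detection probability for a Swerling I target:
        Pd = Pfa ** (1 / (1 + SNR)), with the SNR as a linear power ratio."""
        snr = 10 ** (snr_db / 10)
        return pfa ** (1.0 / (1.0 + snr))

    if __name__ == "__main__":
        pfa = 1e-6
        for snr_db in (5, 10, 15, 20):
            print(f"SNR = {snr_db:2d} dB, Pfa = {pfa:.0e} -> "
                  f"Pd = {pd_swerling1(pfa, snr_db):.3f}")
    ```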

  19. Development of an antigen-based rapid diagnostic test for the identification of blowfly (Calliphoridae) species of forensic significance.

    PubMed

    McDonagh, Laura; Thornton, Chris; Wallman, James F; Stevens, Jamie R

    2009-06-01

    In this study we examine the limitations of currently used sequence-based approaches to blowfly (Calliphoridae) identification and evaluate the utility of an immunological approach to discriminate between blowfly species of forensic importance. By investigating antigenic similarity and dissimilarity between the first instar larval stages of four forensically important blowfly species, we have been able to identify immunoreactive proteins of potential use in the development of species-specific immuno-diagnostic tests. Here we outline our protein-based approach to species determination, and describe how it may be adapted to develop rapid diagnostic assays for the 'on-site' identification of blowfly species.

  20. An individual-based simulation of pneumonic plague transmission following an outbreak and the significance of intervention compliance.

    PubMed

    Williams, Andrew D C; Hall, Ian M; Rubin, G James; Amlôt, Richard; Leach, Steve

    2011-06-01

    The existence of primary pneumonic plague outbreaks raises concerns over the use of the causative bacteria as an aerosol-based bioweapon. We employed an individual-based model, parameterised using published personal contact information, to assess the severity of a deliberate release in a discrete community, under the influence of two proposed intervention strategies. We observed that the severity of the resulting epidemic is determined by the degree of personal compliance with said strategies, implying that prior preparedness activities are essential in order that public awareness and willingness to seek treatment is achieved quickly. PMID:21624780

  1. Sedimentology and facies of a Mississippi River meander belt: a fluvial model based on a significant fluvial system

    SciTech Connect

    Pryor, W.A.; Jordan, D.W.

    1988-02-01

    The meander belt of the Mississippi River, in southeastern Missouri, consists of four distinct facies groups: (1) river channel/point bar, (2) chute, (3) crevasse splay/levee, and (4) abandoned channel fill. A depositional model and vertical sequences have been developed from drill cores, vibra-cores, trenches, fathometer surveys, and mapping of these principal facies. This model and the vertical sequences compare very well to ancient sequences because the Mississippi River is a large, significant river with ancient analogs.

  2. Significant Life Experience: Exploring the Lifelong Influence of Place-Based Environmental and Science Education on Program Participants

    ERIC Educational Resources Information Center

    Colvin, Corrie Ruth

    2013-01-01

    Current research provides a limited understanding of the life long influence of nonformal place-based environmental and science education programs on past participants. This study looks to address this gap, exploring the ways in which these learning environments have contributed to environmental identity and stewardship. Using Dorothy Holland's…

  3. Richness-Productivity Relationships Between Trophic Levels in a Detritus-Based System: Significance of Abundance and Trophic Linkage.

    EPA Science Inventory

    Most theoretical and empirical studies of productivity–species richness relationships fail to consider linkages among trophic levels. We quantified productivity–richness relationships in detritus-based, water-filled tree-hole communities for two trophic levels: invertebrate consu...

  4. Sirolimus-based therapy following early cyclosporine withdrawal provides significantly improved renal histology and function at 3 years.

    PubMed

    Mota, Alfredo; Arias, Manuel; Taskinen, Eero I; Paavonen, Timo; Brault, Yves; Legendre, Christophe; Claesson, Kerstin; Castagneto, Marco; Campistol, Josep M; Hutchison, Brian; Burke, James T; Yilmaz, Sedar; Häyry, Pekka; Neylan, John F

    2004-06-01

    Graft function and histology are predictive of renal transplant survival. The Rapamune Maintenance Regimen study demonstrated that early cyclosporine (CsA) withdrawal from a sirolimus (SRL)-CsA-steroid (ST) regimen improved renal function and blood pressure. We report the protocol-mandated biopsy findings from that study. Renal transplant patients (n = 430) receiving SRL-CsA-ST were randomized at 3 months after transplantation to remain on SRL-CsA-ST, or to have CsA withdrawn (SRL-ST group). Protocol-mandated biopsies were performed at engraftment and at 12 and 36 months. Two pathologists blindly evaluated 484 biopsies to obtain the Chronic Allograft Damage Index (CADI) scores. At 36 months, among patients with serial biopsies (n = 63), the mean CADI score was significantly lower with SRL-ST (4.70 vs. 3.20, p = 0.003), as was the mean tubular atrophy score (0.77 vs. 0.32, p < 0.001). All six components of the CADI score were numerically lower in the SRL-ST group; moreover, the inflammation and tubular atrophy scores decreased significantly in the SRL-ST group between 12 and 36 months. The calculated glomerular filtration rate at 36 months was significantly better in the CsA-withdrawal group (54.8 vs. 68.2 mL/min, p = 0.009). In conclusion, withdrawing CsA from the SRL-CsA-ST regimen resulted in improved renal histology and function.

  5. Music-evoked incidental happiness modulates probability weighting during risky lottery choices

    PubMed Central

    Schulreich, Stefan; Heussen, Yana G.; Gerhardt, Holger; Mohr, Peter N. C.; Binkofski, Ferdinand C.; Koelsch, Stefan; Heekeren, Hauke R.

    2014-01-01

    We often make decisions with uncertain consequences. The outcomes of the choices we make are usually not perfectly predictable but probabilistic, and the probabilities can be known or unknown. Probability judgments, i.e., the assessment of unknown probabilities, can be influenced by evoked emotional states. This suggests that also the weighting of known probabilities in decision making under risk might be influenced by incidental emotions, i.e., emotions unrelated to the judgments and decisions at issue. Probability weighting describes the transformation of probabilities into subjective decision weights for outcomes and is one of the central components of cumulative prospect theory (CPT) that determine risk attitudes. We hypothesized that music-evoked emotions would modulate risk attitudes in the gain domain and in particular probability weighting. Our experiment featured a within-subject design consisting of four conditions in separate sessions. In each condition, the 41 participants listened to a different kind of music—happy, sad, or no music, or sequences of random tones—and performed a repeated pairwise lottery choice task. We found that participants chose the riskier lotteries significantly more often in the “happy” than in the “sad” and “random tones” conditions. Via structural regressions based on CPT, we found that the observed changes in participants' choices can be attributed to changes in the elevation parameter of the probability weighting function: in the “happy” condition, participants showed significantly higher decision weights associated with the larger payoffs than in the “sad” and “random tones” conditions. Moreover, elevation correlated positively with self-reported music-evoked happiness. Thus, our experimental results provide evidence in favor of a causal effect of incidental happiness on risk attitudes that can be explained by changes in probability weighting. PMID:24432007
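
    One widely used two-parameter weighting function with a separate elevation parameter is the linear-in-log-odds (Goldstein-Einhorn) form, w(p) = delta * p^gamma / (delta * p^gamma + (1 - p)^gamma). Whether this is the exact CPT specification fitted in the study is not stated here, so the sketch and its parameter values are purely illustrative of how a higher elevation raises decision weights.

    ```python
    def prob_weight(p, delta, gamma):
        """Linear-in-log-odds (Goldstein-Einhorn) probability weighting:
        w(p) = delta * p**gamma / (delta * p**gamma + (1 - p)**gamma).
        `delta` controls elevation (overall optimism), `gamma` controls curvature."""
        num = delta * p ** gamma
        return num / (num + (1.0 - p) ** gamma)

    # Illustrative parameters: a higher elevation in the "happy" condition raises
    # the weights attached to winning probabilities across the board.
    for label, delta in (("sad", 0.7), ("happy", 1.1)):
        weights = {p: round(prob_weight(p, delta, gamma=0.6), 3) for p in (0.1, 0.5, 0.9)}
        print(label, weights)
    ```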

  6. Music-evoked incidental happiness modulates probability weighting during risky lottery choices.

    PubMed

    Schulreich, Stefan; Heussen, Yana G; Gerhardt, Holger; Mohr, Peter N C; Binkofski, Ferdinand C; Koelsch, Stefan; Heekeren, Hauke R

    2014-01-01

    We often make decisions with uncertain consequences. The outcomes of the choices we make are usually not perfectly predictable but probabilistic, and the probabilities can be known or unknown. Probability judgments, i.e., the assessment of unknown probabilities, can be influenced by evoked emotional states. This suggests that also the weighting of known probabilities in decision making under risk might be influenced by incidental emotions, i.e., emotions unrelated to the judgments and decisions at issue. Probability weighting describes the transformation of probabilities into subjective decision weights for outcomes and is one of the central components of cumulative prospect theory (CPT) that determine risk attitudes. We hypothesized that music-evoked emotions would modulate risk attitudes in the gain domain and in particular probability weighting. Our experiment featured a within-subject design consisting of four conditions in separate sessions. In each condition, the 41 participants listened to a different kind of music-happy, sad, or no music, or sequences of random tones-and performed a repeated pairwise lottery choice task. We found that participants chose the riskier lotteries significantly more often in the "happy" than in the "sad" and "random tones" conditions. Via structural regressions based on CPT, we found that the observed changes in participants' choices can be attributed to changes in the elevation parameter of the probability weighting function: in the "happy" condition, participants showed significantly higher decision weights associated with the larger payoffs than in the "sad" and "random tones" conditions. Moreover, elevation correlated positively with self-reported music-evoked happiness. Thus, our experimental results provide evidence in favor of a causal effect of incidental happiness on risk attitudes that can be explained by changes in probability weighting.

  7. Electrochemical evidences and consequences of significant differences in ions diffusion rate in polyacrylate-based ion-selective membranes.

    PubMed

    Woźnica, Emilia; Mieczkowski, Józef; Michalska, Agata

    2011-11-21

    The origin and effect of surface accumulation of primary ions within the ion-selective poly(n-butyl acrylate)-based membrane, obtained by thermal polymerization, is discussed. Using a new method, based on the relation between the shape of a potentiometric plot and preconditioning time, the diffusion of copper ions in the membrane was found to be slow (the diffusion coefficient estimated to be close to 10(-11) cm(2) s(-1)), especially when compared to the diffusion of the ion-exchanger counter ion, sodium (a diffusion coefficient above 10(-9) cm(2) s(-1)). The higher mobility of sodium ions relative to the copper-ionophore complex results in a pronounced ion-exchanger role, leading to undesirable sensitivity to sodium or potassium ions. PMID:21957488

  8. Significance of Rumex vesicarius as anticancer remedy against hepatocellular carcinoma: a proposal based on experimental animal studies.

    PubMed

    Shahat, Abdelaaty A; Alsaid, Mansour S; Kotob, Soheir E; Ahmed, Hanaa H

    2015-01-01

    Rumex vesicarius is an edible herb distributed in Egypt and Saudi Arabia. The whole plant has significant value in folk medicine and it has been used to alleviate several diseases. Hepatocellular carcinoma (HCC), the major primary malignant tumor of the liver, is one of the most life-threatening human cancers. The goal of the current study was to explore the potent role of Rumex vesicarius extract against HCC induced in rats. Thirty adult male albino rats were divided into 3 groups: (I): Healthy animals received orally 0.9% normal saline and served as negative control group, (II): HCC group in which rats were orally administered N-nitrosodiethylamine NDEA, (III): HCC group treated orally with R. vesicarius extract in a dose of 400 mg/kg b.wt daily for two months. ALT and AST, ALP and γ-GT activities were estimated. CEA, AFP, AFU, GPC-3, Gp-73 and VEGF levels were quantified. Histopathological examination of liver tissue sections was also carried out. The results of the current study showed that the treatment of the HCC group with R. vesicarius extract reversed the significant increase in liver enzymes activity, CEA, AFP, AFU, glypican 3, golgi 73 and VEGF levels in serum as compared to HCC-untreated counterparts. In addition, the favorable impact of R. vesicarius treatment was evidenced by the marked improvement in the histopathological features of the liver of the treated group. In conclusion, the present experimental setting provided evidence for the significance of R. vesicarius as anticancer candidate with a promising anticancer potential against HCC. The powerful hepatoprotective properties, the potent antiangiogenic activity and the effective antiproliferative capacity are responsible for the anticancer effect of this plant. PMID:26028090

  9. Significance of Rumex vesicarius as anticancer remedy against hepatocellular carcinoma: a proposal based on experimental animal studies.

    PubMed

    Shahat, Abdelaaty A; Alsaid, Mansour S; Kotob, Soheir E; Ahmed, Hanaa H

    2015-01-01

    Rumex vesicarius is an edible herb distributed in Egypt and Saudi Arabia. The whole plant has significant value in folk medicine and it has been used to alleviate several diseases. Hepatocellular carcinoma (HCC), the major primary malignant tumor of the liver, is one of the most life-threatening human cancers. The goal of the current study was to explore the potent role of Rumex vesicarius extract against HCC induced in rats. Thirty adult male albino rats were divided into 3 groups: (I): Healthy animals received orally 0.9% normal saline and served as negative control group, (II): HCC group in which rats were orally administered N-nitrosodiethylamine NDEA, (III): HCC group treated orally with R. vesicarius extract in a dose of 400 mg/kg b.wt daily for two months. ALT and AST, ALP and γ-GT activities were estimated. CEA, AFP, AFU, GPC-3, Gp-73 and VEGF levels were quantified. Histopathological examination of liver tissue sections was also carried out. The results of the current study showed that the treatment of the HCC group with R. vesicarius extract reversed the significant increase in liver enzymes activity, CEA, AFP, AFU, glypican 3, golgi 73 and VEGF levels in serum as compared to HCC-untreated counterparts. In addition, the favorable impact of R. vesicarius treatment was evidenced by the marked improvement in the histopathological features of the liver of the treated group. In conclusion, the present experimental setting provided evidence for the significance of R. vesicarius as anticancer candidate with a promising anticancer potential against HCC. The powerful hepatoprotective properties, the potent antiangiogenic activity and the effective antiproliferative capacity are responsible for the anticancer effect of this plant.

  10. Limits on the significant mass-loss scenario based on the globular clusters of the Fornax dwarf spheroidal galaxy

    NASA Astrophysics Data System (ADS)

    Khalaj, P.; Baumgardt, H.

    2016-03-01

    Many of the scenarios proposed to explain the origin of chemically peculiar stars in globular clusters (GCs) require significant mass loss (≥95 per cent) to explain the observed fraction of such stars. In the GCs of the Fornax dwarf galaxy, significant mass loss could be a problem. Larsen et al. showed that there is a large ratio of GCs to metal-poor field stars in Fornax and about 20-25 per cent of all the stars with [Fe/H] < -2 belong to the four metal-poor GCs. This imposes an upper limit of ˜80 per cent mass loss that could have happened in Fornax GCs. In this paper, we propose a solution to this problem by suggesting that stars can leave the Fornax galaxy. We use a series of N-body simulations to determine the limit of mass loss from Fornax as a function of the initial orbital radii of GCs and the speed with which stars leave Fornax GCs. We consider a set of cored and cuspy density profiles for Fornax. Our results show that with a cuspy model for Fornax, the fraction of stars that leave the galaxy can be as high as ˜90 per cent, when the initial orbital radii of GCs are R = 2-3 kpc and the initial speed of stars is v > 20 km s-1. We show that such large velocities can be achieved by mass loss induced by gas expulsion but not mass loss induced by stellar evolution. Our results imply that one cannot interpret the metallicity distribution of Fornax field stars as evidence against significant mass loss in Fornax GCs, if mass loss is due to gas expulsion.

  11. Knot probabilities in random diagrams

    NASA Astrophysics Data System (ADS)

    Cantarella, Jason; Chapman, Harrison; Mastin, Matt

    2016-10-01

    We consider a natural model of random knotting—choose a knot diagram at random from the finite set of diagrams with n crossings. We tabulate diagrams with 10 and fewer crossings and classify the diagrams by knot type, allowing us to compute exact probabilities for knots in this model. As expected, most diagrams with 10 and fewer crossings are unknots (about 78% of the roughly 1.6 billion 10 crossing diagrams). For these crossing numbers, the unknot fraction is mostly explained by the prevalence of ‘tree-like’ diagrams which are unknots for any assignment of over/under information at crossings. The data shows a roughly linear relationship between the log of knot type probability and the log of the frequency rank of the knot type, analogous to Zipf’s law for word frequency. The complete tabulation and all knot frequencies are included as supplementary data.

  12. Probability distributions for multimeric systems.

    PubMed

    Albert, Jaroslav; Rooman, Marianne

    2016-01-01

    We propose a fast and accurate method of obtaining the equilibrium mono-modal joint probability distributions for multimeric systems. The method requires only two assumptions: that the copy number of every molecular species may be treated as continuous, and that the probability density functions (pdfs) are well approximated by multivariate skew normal distributions (MSND). Starting from the master equation, we convert the problem into a set of equations for the statistical moments, which are then expressed in terms of the parameters intrinsic to the MSND. Using an optimization package in Mathematica, we minimize a Euclidean distance function comprising the sum of squared differences between the left- and right-hand sides of these equations. Comparison of results obtained via our method with those rendered by the Gillespie algorithm demonstrates our method to be highly accurate as well as efficient.

  13. Probability, Information and Statistical Physics

    NASA Astrophysics Data System (ADS)

    Kuzemsky, A. L.

    2016-03-01

    In this short survey review we discuss foundational issues of the probabilistic approach to information theory and statistical mechanics from a unified standpoint. Emphasis is on the inter-relations between theories. The basic aim is tutorial: to give a basic introduction to the analysis and application of probabilistic concepts in describing various aspects of complexity and stochasticity. We consider probability as a foundational concept in statistical mechanics and review selected advances in the theoretical understanding of the interrelation of probability, information and statistical description with regard to basic notions of the statistical mechanics of complex systems. It also includes a synthesis of past and present research and a survey of methodology. The purpose of this terse overview is to discuss and partially describe the probabilistic methods and approaches used in statistical mechanics, with the aim of making these ideas easier to understand and apply.

  14. Detection probabilities in fuel cycle oriented safeguards

    SciTech Connect

    Canty, J.J.; Stein, G.; Avenhaus, R. )

    1987-01-01

    An intensified discussion of evaluation criteria for International Atomic Energy Agency (IAEA) safeguards effectiveness is currently under way. Considerations basic to the establishment of such criteria are derived from the model agreement INFCIRC/153 and include threshold amounts, strategic significance, conversion times, required assurances, cost-effectiveness, and nonintrusiveness. In addition to these aspects, the extent to which fuel cycle characteristics are taken into account in safeguards implementations (Article 81c of INFCIRC/153) will be reflected in the criteria. The effectiveness of safeguards implemented under given manpower constraints is evaluated. As the significant quantity and timeliness criteria have established themselves within the safeguards community, these are taken as fixed. Detection probabilities, on the other hand, still provide a certain degree of freedom in interpretation. The problem of randomization of inspection activities across a fuel cycle, or portions thereof, is formalized as a two-person zero-sum game, the payoff function of which is the detection probability achieved by the inspectorate. It is argued, from the point of view of risk of detection, that fuel cycle-independent, minimally accepted threshold criteria for such detection probabilities cannot and should not be applied.

  15. The carpenter fork bed, a new - and older - Black-shale unit at the base of the New Albany shale in central Kentucky: Characterization and significance

    USGS Publications Warehouse

    Barnett, S.F.; Ettensohn, F.R.; Norby, R.D.

    1996-01-01

    Black shales previously interpreted to be Late Devonian cave-fill or slide deposits are shown to be much older Middle Devonian black shales only preserved locally in Middle Devonian grabens and structural lows in central Kentucky. This newly recognized - and older -black-shale unit occurs at the base of the New Albany Shale and is named the Carpenter Fork Bed of the Portwood Member of the New Albany Shale after its only known exposure on Carpenter Fork in Boyle County, central Kentucky; two other occurrences are known from core holes in east-central Kentucky. Based on stratigraphic position and conodont biostratigraphy, the unit is Middle Devonian (Givetian: probably Middle to Upper P. varcus Zone) in age and occurs at a position represented by an unconformity atop the Middle Devonian Boyle Dolostone and its equivalents elsewhere on the outcrop belt. Based on its presence as isolated clasts in the overlying Duffin Bed of the Portwood Member, the former distribution of the unit was probably much more widespread - perhaps occurring throughout western parts of the Rome trough. Carpenter Fork black shales apparently represent an episode of subsidence or sea-level rise coincident with inception of the third tectophase of the Acadian orogeny. Deposition, however, was soon interrupted by reactivation of several fault zones in central Kentucky, perhaps in response to bulge migration accompanying start of the tectophase. As a result, much of central Kentucky was uplifted and tilted, and the Carpenter Fork Bed was largely eroded from the top of the Boyle, except in a few structural lows like the Carpenter Fork graben where a nearly complete record of Middle to early Late Devonian deposition is preserved.

  16. Annonaceous acetogenins (ACGs) nanosuspensions based on a self-assembly stabilizer and the significantly improved anti-tumor efficacy.

    PubMed

    Hong, Jingyi; Li, Yanhong; Xiao, Yao; Li, Yijing; Guo, Yifei; Kuang, Haixue; Wang, Xiangtao

    2016-09-01

    Annonaceous acetogenins (ACGs) have exhibited antitumor activity against various cancers. However, these substances' poor solubility has limited clinical applications. In this study, hydroxypropyl-beta-cyclodextrin (HP-β-CD) and soybean lecithin (SPC) were self-assembled into an amphiphilic complex. ACGs nanosuspensions (ACGs-NSps) were prepared with a mean particle size of 144.4 nm, a zeta potential of -22.9 mV and a high drug payload of 46.17% using this complex as a stabilizer. The ACGs-NSps demonstrated sustained release in vitro and good stability in plasma as well as simulated gastrointestinal fluid, and met the demands of both intravenous injection and oral administration. The ACGs-NSps demonstrated significantly increased cytotoxicity against HeLa and HepG2 cancer cell lines compared to ACGs in solution (in vitro cytotoxicity assay). An in vivo study with H22 tumor-bearing mice demonstrated that nanosuspensions significantly improved ACGs' antitumor activity. When orally administered, ACGs-NSps achieved a similar tumor inhibition rate at 1/10th the dose of ACGs in an oil solution (47.94% vs. 49.74%, p > 0.05). Improved therapeutic efficacy was further achieved when the ACGs-NSps were intravenously injected into mice (70.31%). With the help of nanosuspension technology, ACGs may be an effective antitumor drug for clinical use.

  17. Annonaceous acetogenins (ACGs) nanosuspensions based on a self-assembly stabilizer and the significantly improved anti-tumor efficacy.

    PubMed

    Hong, Jingyi; Li, Yanhong; Xiao, Yao; Li, Yijing; Guo, Yifei; Kuang, Haixue; Wang, Xiangtao

    2016-09-01

    Annonaceous acetogenins (ACGs) have exhibited antitumor activity against various cancers. However, these substances' poor solubility has limited clinical applications. In this study, hydroxypropyl-beta-cyclodextrin (HP-β-CD) and soybean lecithin (SPC) were self-assembled into an amphiphilic complex. ACGs nanosuspensions (ACGs-NSps) were prepared with a mean particle size of 144.4 nm, a zeta potential of -22.9 mV and a high drug payload of 46.17% using this complex as a stabilizer. The ACGs-NSps demonstrated sustained release in vitro and good stability in plasma as well as simulated gastrointestinal fluid, and met the demands of both intravenous injection and oral administration. The ACGs-NSps demonstrated significantly increased cytotoxicity against HeLa and HepG2 cancer cell lines compared to ACGs in solution (in vitro cytotoxicity assay). An in vivo study with H22 tumor-bearing mice demonstrated that nanosuspensions significantly improved ACGs' antitumor activity. When orally administered, ACGs-NSps achieved a similar tumor inhibition rate at 1/10th the dose of ACGs in an oil solution (47.94% vs. 49.74%, p > 0.05). Improved therapeutic efficacy was further achieved when the ACGs-NSps were intravenously injected into mice (70.31%). With the help of nanosuspension technology, ACGs may be an effective antitumor drug for clinical use. PMID:27209384

  18. The biological significance of color constancy: an agent-based model with bees foraging from flowers under varied illumination.

    PubMed

    Faruq, Samia; McOwan, Peter W; Chittka, Lars

    2013-08-20

    The perceived color of an object depends on its spectral reflectance and the spectral composition of the illuminant. Thus, when the illumination changes, the light reflected from the object also varies. This would result in a different color sensation if no color constancy mechanism were in place, that is, the ability to form consistent representations of colors across various illuminants and background scenes. We explore the quantitative benefits of various color constancy algorithms in an agent-based model of foraging bees, where agents select flower color based on reward. Each simulation is based on 100 "meadows" with five randomly selected flower species with empirically determined spectral reflectance properties, and each flower species is associated with a realistic distribution of nectar rewards. Simulated foraging bees memorize the colors of flowers that they have experienced as most rewarding, and their task is to discriminate against other flower colors with lower rewards, even in the face of changing illumination conditions. We compared the performance of von Kries, White Patch, and Gray World constancy models with that of (hypothetical) bees with perfect color constancy and of color-blind bees. A bee equipped with trichromatic color vision but no color constancy performed only ∼20% better than a color-blind bee (relative to a maximum improvement of 100% for perfect color constancy), whereas the most powerful recovery of reflectance in the face of changing illumination was generated by a combination of von Kries photoreceptor adaptation and a White Patch calibration (∼30% improvement relative to a bee without color constancy). However, none of the tested algorithms generated perfect color constancy.
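
    The constancy algorithms compared in the model have simple signal-processing cores. The sketch below shows generic Gray World, White Patch, and von Kries operations on a toy three-channel image; the agent-based model itself works with bee photoreceptor spectra and flower reflectances rather than RGB arrays, so treat this only as an illustration of the three corrections.

    ```python
    import numpy as np

    def gray_world(image):
        """Gray World: rescale each channel so that the image mean is achromatic."""
        means = image.reshape(-1, image.shape[-1]).mean(axis=0)
        return image * (means.mean() / means)

    def white_patch(image):
        """White Patch: rescale each channel by its maximum response, assuming the
        brightest patch in the scene is white under the current illuminant."""
        maxima = image.reshape(-1, image.shape[-1]).max(axis=0)
        return image / maxima

    def von_kries(image, illuminant_response, reference_response):
        """von Kries adaptation: scale each receptor channel by the ratio of its
        response under a reference illuminant to its response under the current one."""
        gains = np.asarray(reference_response) / np.asarray(illuminant_response)
        return image * gains

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        reflectances = rng.uniform(0.1, 1.0, size=(4, 4, 3))   # toy scene
        illuminant = np.array([1.2, 1.0, 0.7])                  # reddish illuminant
        observed = reflectances * illuminant
        print("gray-world means:", gray_world(observed).reshape(-1, 3).mean(axis=0))
        print("von Kries recovery error:",
              np.abs(von_kries(observed, illuminant, [1, 1, 1]) - reflectances).max())
    ```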

  19. Information Hiding for G.711 Speech Based on Substitution of Least Significant Bits and Estimation of Tolerable Distortion

    NASA Astrophysics Data System (ADS)

    Ito, Akinori; Abe, Shun'ichiro; Suzuki, Yôiti

    In this paper, we propose a novel data hiding technique for G.711-coded speech based on the LSB substitution method. The novel feature of the proposed method is that a low-bitrate encoder, G.726 ADPCM, is used as a reference for deciding how many bits can be embedded in a sample. Experiments showed that the method outperformed the simple LSB substitution method and the selective embedding method proposed by Aoki. We achieved 4 kbit/s embedding with almost no subjective degradation of speech quality, and 10 kbit/s while maintaining good quality.
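
    The core LSB-substitution operation is easy to state; the sketch below embeds a fixed number of payload bits into the least significant bits of each 8-bit code word. The proposed method instead decides the per-sample capacity from a G.726 ADPCM reference encoding of the same signal, which is not reproduced here.

    ```python
    def embed_lsb(samples, payload_bits, bits_per_sample=1):
        """Replace the `bits_per_sample` least significant bits of each 8-bit
        code word with payload bits (zero-padded when the payload runs out)."""
        keep_mask = 0xFF ^ ((1 << bits_per_sample) - 1)
        stego, i = [], 0
        for s in samples:
            chunk = 0
            for _ in range(bits_per_sample):
                chunk = (chunk << 1) | (payload_bits[i] if i < len(payload_bits) else 0)
                i += 1
            stego.append((s & keep_mask) | chunk)
        return stego

    def extract_lsb(samples, n_bits, bits_per_sample=1):
        """Read back the first `n_bits` embedded bits."""
        bits = []
        for s in samples:
            for k in range(bits_per_sample - 1, -1, -1):
                bits.append((s >> k) & 1)
        return bits[:n_bits]

    if __name__ == "__main__":
        cover = [0x55, 0xA3, 0x10, 0xFE]          # toy 8-bit G.711 code words
        payload = [1, 0, 1, 1]
        stego = embed_lsb(cover, payload)
        assert extract_lsb(stego, len(payload)) == payload
        print([hex(s) for s in stego])             # ['0x55', '0xa2', '0x11', '0xff']
    ```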

  20. Applications and statistical properties of minimum significant difference-based criterion testing in a toxicity testing program

    SciTech Connect

    Wang, Q.; Denton, D.L.; Shukla, R.

    2000-01-01

    As a follow-up to the recommendations of the September 1995 SETAC Pellston Workshop on Whole Effluent Toxicity (WET) on test methods and appropriate endpoints, this paper discusses the applications and statistical properties of using a statistical criterion of minimum significant difference (MSD). The authors examined the upper limits of acceptable MSDs as an acceptance criterion in the case of normally distributed data. The implications of this approach are examined in terms of the false negative rate as well as the false positive rate. Results indicated that the proposed approach has reasonable statistical properties. Reproductive data from short-term chronic WET tests with Ceriodaphnia dubia were used to demonstrate the applications of the proposed approach. The data were collected by the North Carolina Department of Environment, Health, and Natural Resources (Raleigh, NC, USA) as part of their National Pollutant Discharge Elimination System program.
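
    The MSD itself is the smallest difference from the control mean that the hypothesis test would declare significant, computed from the pooled error mean square. A hedged sketch for equal replication using a plain one-sided t critical value is given below; WET guidance typically uses Dunnett's critical value, which is somewhat larger when several concentrations are compared, and the reproduction counts here are made up.

    ```python
    from math import sqrt
    from statistics import mean
    from scipy import stats

    def minimum_significant_difference(control, treatments, alpha=0.05):
        """MSD = t * sqrt(2 * MSE / n) for a control-versus-treatment comparison
        with equal replication, where MSE is the pooled within-group mean square.
        A plain one-sided t critical value is used here instead of Dunnett's."""
        groups = [control] + list(treatments)
        n = len(control)                                    # assumes equal replication
        df_error = sum(len(g) - 1 for g in groups)
        mse = sum(sum((x - mean(g)) ** 2 for x in g) for g in groups) / df_error
        t_crit = stats.t.ppf(1.0 - alpha, df_error)
        return t_crit * sqrt(2.0 * mse / n)

    # Hypothetical Ceriodaphnia dubia reproduction data (neonates per female).
    control = [28, 30, 26, 31, 29]
    effluent = [22, 25, 20, 24, 23]
    msd = minimum_significant_difference(control, [effluent])
    print(f"MSD = {msd:.2f}; observed difference = {mean(control) - mean(effluent):.2f}")
    ```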

  1. [Prospects for the design of new therapeutically significant protease inhibitors based on knottins and sunflower seed trypsin inhibitor (SFTI 1)].

    PubMed

    Kuznetsova, S S; Kolesanova, E F; Talanova, A V; Veselovsky, A V

    2016-05-01

    Plant seed knottins, mainly from the Cucurbitaceae family, and sunflower seed trypsin inhibitor (SFTI 1) are the smallest canonical peptide inhibitors of serine proteases. Their highly efficient inhibition of various serine proteases, structural rigidity combined with tolerance of limited amino acid sequence variation, high chemical stability, lack of toxicity, and amenability to production by either chemical synthesis or heterologous expression systems make these inhibitors attractive templates for the design of new compounds to regulate therapeutically significant serine protease activities. Hence the design of such compounds is a promising research field. The review considers the structural characteristics of these inhibitors, their properties, methods of preparation, and the design of new analogs. Examples are given of the successful use of natural serine protease inhibitors belonging to the knottin family and SFTI 1 as templates for the design of highly specific inhibitors of particular proteases. PMID:27562989

  2. Hydrothermal Fe cycling and deep ocean organic carbon scavenging: Model-based evidence for significant POC supply to seafloor sediments

    NASA Astrophysics Data System (ADS)

    German, C. R.; Legendre, L. L.; Sander, S. G.; Niquil, N.; Luther, G. W.; Bharati, L.; Han, X.; Le Bris, N.

    2015-06-01

    Submarine hydrothermal venting has recently been identified to have the potential to impact ocean biogeochemistry at the global scale. This is the case because processes active in hydrothermal plumes are so vigorous that the residence time of the ocean, with respect to cycling through hydrothermal plumes, is comparable to that of deep ocean mixing caused by thermohaline circulation. Recently, it has been argued that seafloor venting may provide a significant source of bio-essential Fe to the oceans as the result of a close coupling between Fe and organic carbon in hydrothermal plumes. But a complementary question remains to be addressed: does this same intimate Fe-Corg association in hydrothermal plumes cause any related impact to the global C cycle? To address this, SCOR-InterRidge Working Group 135 developed a modeling approach to synthesize site-specific field data from the East Pacific Rise 9°50′ N hydrothermal field, where the range of requisite data sets is most complete, and combine those inputs with global estimates for dissolved Fe inputs from venting to the oceans to establish a coherent model with which to investigate hydrothermal Corg cycling. The results place new constraints on submarine Fe vent fluxes worldwide, including an indication that the majority of Fe supplied to hydrothermal plumes should come from entrainment of diffuse flow. While this same entrainment is not predicted to enhance the supply of dissolved organic carbon to hydrothermal plumes by more than ∼10% over background values, what the model does indicate is that scavenging of carbon in association with Fe-rich hydrothermal plume particles should play a significant role in the delivery of particulate organic carbon to deep ocean sediments, worldwide.

  3. Airborne/Space-Based Doppler Lidar Wind Sounders Sampling the PBL and Other Regions of Significant Beta and U Inhomogeneities

    NASA Technical Reports Server (NTRS)

    Emmitt, Dave

    1998-01-01

    This final report covers the period from April 1994 through March 1998. The proposed research was organized under four main tasks. Those tasks were: (1) Investigate the vertical and horizontal velocity structures within and adjacent to thin and subvisual cirrus; (2) Investigate the lowest 1 km of the PBL and develop algorithms for processing pulsed Doppler lidar data obtained from single shots into regions of significant inhomogeneities in Beta and U; (3) Participate in OSSEs including those designed to establish shot density requirements for meso-gamma scale phenomena with quasi-persistent locations (e.g., jets, leewaves, tropical storms); and (4) Participate in the planning and execution of an airborne mission to measure winds with a pulsed CO2 Doppler lidar. Over the four year period of this research contract, work on all four tasks has yielded significant results which have led to 38 professional presentations (conferences and publications) and have been folded into the science justification for an approved NASA space mission, SPARCLE (SPAce Readiness Coherent Lidar Experiment), in 2001. Also this research has, through Task 4, led to a funded proposal to work directly on a NASA field campaign, CAMEX III, in which an airborne Doppler wind lidar will be used to investigate the cloud-free circulations near tropical storms. Monthly progress reports required under this contract are on file. This final report will highlight major accomplishments, including some that were not foreseen in the original proposal. The presentation of this final report includes this written document as well as material that is better presented via the internet (web pages). There is heavy reference to appended papers and documents. Thus, the main body of the report will serve to summarize the key efforts and findings.

  4. Objective Probability and Quantum Fuzziness

    NASA Astrophysics Data System (ADS)

    Mohrhoff, U.

    2009-02-01

    This paper offers a critique of the Bayesian interpretation of quantum mechanics with particular focus on a paper by Caves, Fuchs, and Schack containing a critique of the “objective preparations view” or OPV. It also aims to carry the discussion beyond the hardened positions of Bayesians and proponents of the OPV. Several claims made by Caves et al. are rebutted, including the claim that different pure states may legitimately be assigned to the same system at the same time, and the claim that the quantum nature of a preparation device cannot legitimately be ignored. Both Bayesians and proponents of the OPV regard the time dependence of a quantum state as the continuous dependence on time of an evolving state of some kind. This leads to a false dilemma: quantum states are either objective states of nature or subjective states of belief. In reality they are neither. The present paper views the aforesaid dependence as a dependence on the time of the measurement to whose possible outcomes the quantum state serves to assign probabilities. This makes it possible to recognize the full implications of the only testable feature of the theory, viz., the probabilities it assigns to measurement outcomes. Most important among these are the objective fuzziness of all relative positions and momenta and the consequent incomplete spatiotemporal differentiation of the physical world. The latter makes it possible to draw a clear distinction between the macroscopic and the microscopic. This in turn makes it possible to understand the special status of measurements in all standard formulations of the theory. Whereas Bayesians have written contemptuously about the “folly” of conjoining “objective” to “probability,” there are various reasons why quantum-mechanical probabilities can be considered objective, not least the fact that they are needed to quantify an objective fuzziness. But this cannot be appreciated without giving thought to the makeup of the world, which

  5. Lipid-based nanosystems for CD44 targeting in cancer treatment: recent significant advances, ongoing challenges and unmet needs.

    PubMed

    Nascimento, Thais Leite; Hillaireau, Hervé; Vergnaud, Juliette; Fattal, Elias

    2016-07-01

    Extensive experimental evidence demonstrates the important role of the hyaluronic acid (HA)-CD44 interaction in cell proliferation and migration, inflammation and tumor growth. Taking advantage of this interaction, HA-modified nanocarriers have been investigated for targeting CD44-overexpressing cells with the purpose of delivering drugs to cancer or inflammatory cells. The effect of such modification on targeting efficacy is influenced by several factors. In this review, we focus on the impact of HA modification on the characteristics of lipid-based nanoparticles. We try to understand how these modifications influence particle physicochemical properties, interaction with CD44 receptors, intracellular trafficking pathways, toxicity, complement/macrophage activation and pharmacokinetics. Our aim is to provide insight into tailoring particle modification by HA in order to design more efficient CD44-targeting lipid nanocarriers.

  6. [Prevention and treatment of the complications of polycystic ovarian syndrome--the significance of evidence-based, interdisciplinary management].

    PubMed

    Gődény, Sándor; Csenteri, Orsolya Karola

    2015-12-13

    Polycystic ovary syndrome is the most common hormonal and metabolic disorder likely to affect women. The syndrome is often associated with obesity and hyperinsulinemia and adversely affects endocrine, metabolic, and cardiovascular health. The complex features of the syndrome require an interdisciplinary approach to treatment, in which cooperation among the paediatrician, internist, gynaecologist, endocrinologist, dermatologist, psychologist and oncologist is essential. Prevention and treatment should be based on the best available evidence and should include physical examination and laboratory tests for hormones, serum insulin, glucose and lipids; in addition, patients' preferences should be considered. To maximise the health gain in polycystic ovarian syndrome, adequate, effective, efficient and safe treatment is necessary. This article summarises the highest available evidence provided by meta-analyses and systematic reviews of the prevention of metabolic and cardiovascular complications of the syndrome, and discusses the relevant evidence published in the literature. PMID:26639643

  7. Significant enhancement of Sm3+ photoreduction in halide nanophase precipitated AlF3-based glasses under femtosecond laser irradiation.

    PubMed

    Jiao, Qing; Yu, Xue; Yang, Zhengwen; Zhou, Dacheng; Qiu, Jianbei

    2013-06-01

    The electronegativity effect on the efficient photoreduction of Sm3+ to Sm2+ in Br-modified fluoroaluminate glasses was investigated after femtosecond laser (fs) irradiation. Sm2+ luminescence was strongly observed in the samples with higher Br content, and based on the TEM and DSC measurements, BaBr2 nanophases were precipitated from the glass matrix in the laser-focused areas. From the EDS spectra, it was found that Sm3+ can be selectively incorporated into the BaBr2 nanophases. The additional electrons provided by the nanophase facilitated the Sm3+ reduction during irradiation. Since the photoreduction efficiency of Sm3+ in the Br-modified glass is evidently higher than that in the Cl-modified glasses, the effect of halide ion electronegativity on Sm3+ photoreduction was identified and the relevant mechanism is discussed.

  9. VOLCANIC RISK ASSESSMENT - PROBABILITY AND CONSEQUENCES

    SciTech Connect

    G.A. Valentine; F.V. Perry; S. Dartevelle

    2005-08-26

    Risk is the product of the probability and consequences of an event. Both of these must be based upon sound science that integrates field data, experiments, and modeling, but must also be useful to decision makers who likely do not understand all aspects of the underlying science. We review a decision framework used in many fields such as performance assessment for hazardous and/or radioactive waste disposal sites that can serve to guide the volcanological community towards integrated risk assessment. In this framework the underlying scientific understanding of processes that affect probability and consequences drive the decision-level results, but in turn these results can drive focused research in areas that cause the greatest level of uncertainty at the decision level. We review two examples of the determination of volcanic event probability: (1) probability of a new volcano forming at the proposed Yucca Mountain radioactive waste repository, and (2) probability that a subsurface repository in Japan would be affected by the nearby formation of a new stratovolcano. We also provide examples of work on consequences of explosive eruptions, within the framework mentioned above. These include field-based studies aimed at providing data for "closure" of wall rock erosion terms in a conduit flow model, predictions of dynamic pressure and other variables related to damage by pyroclastic flow into underground structures, and vulnerability criteria for structures subjected to conditions of explosive eruption. Process models (e.g., multiphase flow) are important for testing the validity or relative importance of possible scenarios in a volcanic risk assessment. We show how time-dependent multiphase modeling of explosive "eruption" of basaltic magma into an open tunnel (drift) at the Yucca Mountain repository provides insight into proposed scenarios that include the development of secondary pathways to the Earth's surface. Addressing volcanic risk within a decision
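
    As a side note to the framing above, the risk aggregation itself is simple arithmetic: annual risk is an event's annual probability times its consequence, summed over scenarios. The sketch below shows only that bookkeeping; the scenario names, probabilities and consequence values are hypothetical placeholders, not figures from the cited assessments.

      # Minimal sketch of the risk = probability x consequences framing.
      # All numbers are hypothetical placeholders.
      scenarios = {
          # name: (annual probability of the event, consequence if it occurs)
          "new_volcano_intersects_repository": (1.0e-8, 1.0e9),
          "stratovolcano_forms_near_site":     (1.0e-7, 1.0e8),
      }

      for name, (p, c) in scenarios.items():
          print(f"{name}: annual risk = {p * c:.3e}")

      total_risk = sum(p * c for p, c in scenarios.values())
      print(f"total annual risk = {total_risk:.3e}")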

  10. Dietary variety increases the probability of nutrient adequacy among adults.

    PubMed

    Foote, Janet A; Murphy, Suzanne P; Wilkens, Lynne R; Basiotis, P Peter; Carlson, Andrea

    2004-07-01

    Despite guidance to consume a variety of foods, the role of dietary variety in ensuring nutrient adequacy is unclear. The aim of this study was to determine whether a commodity-based measure of dietary variety was associated with the probability of nutrient adequacy after adjusting for energy and food group intakes. Subjects were 4969 men and 4800 women ≥19 y old who participated in the Continuing Survey of Food Intakes for Individuals 1994-1996. Using 24-h recall data, the mean probability of adequacy across 15 nutrients was calculated using the Dietary Reference Intakes. Dietary variety was defined using a commodity-based method similar to that used for the Healthy Eating Index (HEI). Associations were examined in gender-specific multivariate regression models. Energy intake was a strong predictor of the mean probability of adequacy in models controlled for age, BMI, education level, and ethnicity (model R² = 0.60 and 0.54 for men and women, respectively). Adding the number of servings from each of the 5 Food Guide Pyramid (FGP) groups to the models significantly improved the model fit (R² = 0.69 and 0.66 for men and women). Adding dietary variety again significantly improved the model fit for both men and women (R² = 0.73 and 0.70, respectively). Variety counts within the dairy and grain groups were most strongly associated with improved nutrient adequacy. Dietary variety as defined by the HEI contributes an additional component of dietary quality that is not captured by FGP servings or energy intake. PMID:15226469
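
    The nested-model structure described above (energy alone, then energy plus Food Guide Pyramid servings, then those plus a variety count, comparing R² at each step) can be sketched in a few lines. The data below are synthetic stand-ins for the CSFII variables, so only the modelling structure, not the reported R² values, is reproduced.

      # Nested regressions for the mean probability of adequacy (MPA), mirroring
      # the analysis structure above on synthetic data.
      import numpy as np
      from sklearn.linear_model import LinearRegression

      rng = np.random.default_rng(0)
      n = 1000
      energy = rng.normal(2200, 500, n)                  # kcal/day
      servings = rng.poisson(2.0, (n, 5)).astype(float)  # 5 Food Guide Pyramid groups
      variety = rng.integers(1, 15, n).astype(float)     # commodity-based variety count

      # Synthetic outcome loosely tied to all predictors, bounded to [0, 1].
      mpa = 0.0002 * energy + 0.02 * servings.sum(axis=1) + 0.01 * variety
      mpa = np.clip(mpa + rng.normal(0, 0.05, n), 0, 1)

      def r2(X, y):
          return LinearRegression().fit(X, y).score(X, y)

      X1 = energy[:, None]
      X2 = np.column_stack([energy, servings])
      X3 = np.column_stack([energy, servings, variety])
      print("energy only           R2 =", round(r2(X1, mpa), 3))
      print("+ food group servings R2 =", round(r2(X2, mpa), 3))
      print("+ dietary variety     R2 =", round(r2(X3, mpa), 3))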

  11. Prognostic Significance of Preoperative Circulating Monocyte Count in Patients With Breast Cancer: Based on a Large Cohort Study.

    PubMed

    Wen, Jiahuai; Ye, Feng; Huang, Xiaojia; Li, Shuaijie; Yang, Lu; Xiao, Xiangsheng; Xie, Xiaoming

    2015-12-01

    Growing evidence shows that the inflammation response plays an important role in cancer development and progression, and absolute lymphocyte count (ALC), absolute monocyte count (AMC), and lymphocyte to monocyte ratio (LMR) have been used as parameters of systemic inflammation in several tumors. In this study, we evaluated the prognostic significance of preoperative ALC, AMC and LMR in breast cancer; 2000 patients treated between January 2002 and December 2008 at Sun Yat-Sen University Cancer Center were enrolled. Patients were grouped by the cut-off value determined from receiver operating characteristic (ROC) curve analysis. Kaplan-Meier analysis showed that patients with elevated AMC levels (>0.48 × 10^9/L) had shorter overall survival (OS, P < 0.001). In multivariate analysis, preoperative AMC was identified as an independent prognostic parameter for OS in breast cancer patients (hazard ratio = 1.374, 95% confidence interval: 1.045-1.807). Subgroup analyses revealed that AMC was an unfavorable prognostic factor in stage II-III breast cancer patients and in the Luminal B, human epidermal growth factor receptor-2 overexpressing, and triple-negative subtypes (all P < 0.05). Additionally, the prognostic value of ALC and LMR could not be proven in the current study. Preoperative AMC may serve as an easily available and low-priced parameter to predict the outcomes of breast cancer.
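
    The cut-off selection step described above (grouping patients by a threshold chosen from the ROC curve) can be illustrated with a short sketch. The monocyte counts and outcomes below are simulated, and the Youden index is used as the selection rule, which is one common choice; the 0.48 × 10^9/L value in the abstract comes from the actual cohort, not from this sketch.

      # ROC-based choice of an AMC cut-off (Youden index) on synthetic data.
      import numpy as np
      from sklearn.metrics import roc_curve

      rng = np.random.default_rng(1)
      n = 2000
      died = rng.binomial(1, 0.25, n)                           # 1 = death during follow-up
      amc = rng.normal(0.40 + 0.10 * died, 0.12, n).clip(0.05)  # monocytes, 10^9/L

      fpr, tpr, thresholds = roc_curve(died, amc)
      best = np.argmax(tpr - fpr)                               # Youden index J = sens + spec - 1
      print(f"optimal AMC cut-off ~ {thresholds[best]:.2f} x 10^9/L")
      print(f"sensitivity = {tpr[best]:.2f}, specificity = {1 - fpr[best]:.2f}")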

  12. Food Classification Systems Based on Food Processing: Significance and Implications for Policies and Actions: A Systematic Literature Review and Assessment.

    PubMed

    Moubarac, Jean-Claude; Parra, Diana C; Cannon, Geoffrey; Monteiro, Carlos A

    2014-06-01

    This paper is the first to make a systematic review and assessment of the literature that attempts methodically to incorporate food processing into classification of diets. The review identified 1276 papers, of which 110 were screened and 21 studied, derived from five classification systems. This paper analyses and assesses the five systems, one of which has been devised and developed by a research team that includes co-authors of this paper. The quality of the five systems is assessed and scored according to how specific, coherent, clear, comprehensive and workable they are. Their relevance to food, nutrition and health, and their use in various settings, is described. The paper shows that the significance of industrial food processing in shaping global food systems and supplies and thus dietary patterns worldwide, and its role in the pandemic of overweight and obesity, remains overlooked and underestimated. Once food processing is systematically incorporated into food classifications, they will be more useful in assessing and monitoring dietary patterns. Food classification systems that emphasize industrial food processing, and that define and distinguish relevant different types of processing, will improve understanding of how to prevent and control overweight, obesity and related chronic non-communicable diseases, and also malnutrition. They will also be a firmer basis for rational policies and effective actions designed to protect and improve public health at all levels from global to local.

  13. Approaches to Evaluating Probability of Collision Uncertainty

    NASA Technical Reports Server (NTRS)

    Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    While the two-dimensional probability of collision (Pc) calculation has served as the main input to conjunction analysis risk assessment for over a decade, it has done so mostly as a point estimate, with relatively little effort made to produce confidence intervals on the Pc value based on the uncertainties in the inputs. The present effort seeks to carry these uncertainties through the calculation in order to generate a probability density of Pc results rather than a single average value. Methods for assessing uncertainty in the primary and secondary objects' physical sizes and state estimate covariances, as well as a resampling approach to reveal the natural variability in the calculation, are presented; and an initial proposal for operationally useful display and interpretation of these data for a particular conjunction is given.
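
    The resampling idea described above can be sketched as follows: draw the uncertain inputs (here, the combined hard-body radius and a covariance scale factor) from assumed distributions, recompute Pc for each draw, and summarize the resulting spread. In this sketch Pc itself is estimated by Monte Carlo over the combined in-plane position covariance rather than by the usual two-dimensional quadrature, and the miss distance, covariance and object sizes are invented.

      # Propagating input uncertainty through a 2-D Pc calculation by resampling.
      import numpy as np

      rng = np.random.default_rng(2)

      def pc_monte_carlo(miss, cov, hbr, n=50_000):
          """P(relative in-plane position lies within the combined hard-body radius)."""
          samples = rng.multivariate_normal(miss, cov, n)
          return np.mean(np.hypot(samples[:, 0], samples[:, 1]) < hbr)

      nominal_miss = np.array([150.0, 80.0])      # metres, encounter-plane components
      nominal_cov = np.array([[90.0**2, 0.0],
                              [0.0, 40.0**2]])    # combined position covariance, m^2
      nominal_hbr = 20.0                          # combined hard-body radius, m

      pcs = []
      for _ in range(300):
          hbr = max(rng.normal(nominal_hbr, 3.0), 1.0)   # uncertain object sizes
          scale = rng.lognormal(0.0, 0.3)                # covariance realism factor
          pcs.append(pc_monte_carlo(nominal_miss, scale * nominal_cov, hbr))

      pcs = np.array(pcs)
      print(f"point estimate Pc = {pc_monte_carlo(nominal_miss, nominal_cov, nominal_hbr):.2e}")
      print(f"resampled Pc: median = {np.median(pcs):.2e}, "
            f"5th-95th percentile = [{np.percentile(pcs, 5):.2e}, {np.percentile(pcs, 95):.2e}]")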

  14. Estimation of transition probabilities of credit ratings

    NASA Astrophysics Data System (ADS)

    Peng, Gan Chew; Hin, Pooi Ah

    2015-12-01

    The present research is based on the quarterly credit ratings of ten companies over 15 years taken from the database of the Taiwan Economic Journal. The components of the vector m_i = (m_i1, m_i2, ..., m_i10) denote the credit ratings of the ten companies in the i-th quarter. The vector m_i+1 in the next quarter is modelled as dependent on the vector m_i via a conditional distribution derived from a 20-dimensional power-normal mixture distribution. The transition probability P_kl(i, j) of getting m_i+1,j = l given that m_i,j = k is then computed from the conditional distribution. It is found that the variation of the transition probability P_kl(i, j) as i varies gives an indication of the possible transition of the credit rating of the j-th company in the near future.
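
    The paper derives the transition probabilities from a fitted power-normal mixture; as a point of comparison, the sketch below shows only the simpler count-based (empirical) estimator of a quarterly transition matrix from rating sequences. The ratings are coded 0-3 and the sequences are simulated, so the numbers are purely illustrative.

      # Count-based estimate of a quarterly credit-rating transition matrix.
      import numpy as np

      rng = np.random.default_rng(3)
      n_ratings = 4
      true_P = np.array([[0.90, 0.08, 0.02, 0.00],
                         [0.05, 0.85, 0.08, 0.02],
                         [0.01, 0.10, 0.80, 0.09],
                         [0.00, 0.02, 0.10, 0.88]])

      def simulate(n_quarters, start):
          states = [start]
          for _ in range(n_quarters - 1):
              states.append(rng.choice(n_ratings, p=true_P[states[-1]]))
          return states

      # Ten companies observed quarterly over 15 years (60 quarters), as in the study.
      sequences = [simulate(60, rng.integers(n_ratings)) for _ in range(10)]

      counts = np.zeros((n_ratings, n_ratings))
      for seq in sequences:
          for k, l in zip(seq[:-1], seq[1:]):
              counts[k, l] += 1

      row_sums = counts.sum(axis=1, keepdims=True)
      P_hat = np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)
      print(np.round(P_hat, 2))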

  15. Conflict Probability Estimation for Free Flight

    NASA Technical Reports Server (NTRS)

    Paielli, Russell A.; Erzberger, Heinz

    1996-01-01

    The safety and efficiency of free flight will benefit from automated conflict prediction and resolution advisories. Conflict prediction is based on trajectory prediction, however, and becomes less certain the farther in advance the prediction is made. An estimate is therefore needed of the probability that a conflict will occur, given a pair of predicted trajectories and their levels of uncertainty. A method is developed in this paper to estimate that conflict probability. The trajectory prediction errors are modeled as normally distributed, and the two error covariances for an aircraft pair are combined into a single equivalent covariance of the relative position. A coordinate transformation is then used to derive an analytical solution. Numerical examples and Monte Carlo validation are presented.
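
    The covariance-combination step described above is easy to sketch: the two trajectory-prediction error covariances are summed into a covariance of the relative position, and the conflict probability is the chance that the relative position falls inside the protected-zone radius. The paper obtains this analytically after a coordinate transformation; the sketch below estimates it by Monte Carlo instead, with invented numbers.

      # Conflict probability from a combined relative-position covariance (Monte Carlo).
      import numpy as np

      rng = np.random.default_rng(4)

      predicted_separation = np.array([6.0, 3.0])   # nautical miles, horizontal components
      cov_a = np.diag([1.2**2, 0.8**2])             # aircraft A prediction-error covariance
      cov_b = np.diag([1.5**2, 1.0**2])             # aircraft B prediction-error covariance
      cov_rel = cov_a + cov_b                       # single equivalent covariance of relative position
      protected_radius = 5.0                        # nmi horizontal separation standard

      samples = rng.multivariate_normal(predicted_separation, cov_rel, 1_000_000)
      conflict_prob = np.mean(np.hypot(samples[:, 0], samples[:, 1]) < protected_radius)
      print(f"estimated conflict probability = {conflict_prob:.3f}")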

  16. A quantum probability perspective on borderline vagueness.

    PubMed

    Blutner, Reinhard; Pothos, Emmanuel M; Bruza, Peter

    2013-10-01

    The term "vagueness" describes a property of natural concepts, which normally have fuzzy boundaries, admit borderline cases, and are susceptible to Zeno's sorites paradox. We will discuss the psychology of vagueness, especially experiments investigating the judgment of borderline cases and contradictions. In the theoretical part, we will propose a probabilistic model that describes the quantitative characteristics of the experimental finding and extends Alxatib's and Pelletier's () theoretical analysis. The model is based on a Hopfield network for predicting truth values. Powerful as this classical perspective is, we show that it falls short of providing an adequate coverage of the relevant empirical results. In the final part, we will argue that a substantial modification of the analysis put forward by Alxatib and Pelletier and its probabilistic pendant is needed. The proposed modification replaces the standard notion of probabilities by quantum probabilities. The crucial phenomenon of borderline contradictions can be explained then as a quantum interference phenomenon. PMID:24039093

  17. Cheating Probabilities on Multiple Choice Tests

    NASA Astrophysics Data System (ADS)

    Rizzuto, Gaspard T.; Walters, Fred

    1997-10-01

    This paper is strictly based on mathematical statistics and as such does not depend on prior performance; it assumes the probability of each choice to be identical. In a real-life situation, the probability of two students having identical responses becomes larger the better the students are. However, the mathematical model is developed for all responses, both correct and incorrect, and provides a baseline for evaluation. David Harpp and coworkers (2, 3) at McGill University have evaluated ratios of exact errors in common (EEIC) to errors in common (EIC) and differences (D). In pairings where the ratio EEIC/EIC was greater than 0.75, the odds against the pair's answer pattern being random were unusually high. EEIC/D ratios at values >1.0 indicate that pairs of these students were seated adjacent to one another and copied from one another. The original papers should be examined for details.
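
    The counts behind the Harpp-style ratios can be sketched directly, following the usual definitions: an error in common (EIC) is an item both students answered incorrectly, an exact error in common (EEIC) is an item both answered incorrectly with the same choice, and D counts items where the two responses differ. The answer strings and key below are invented.

      # EEIC, EIC and D counts for a pair of answer sheets (invented data).
      def cheating_counts(answers1, answers2, key):
          eic = eeic = d = 0
          for a1, a2, k in zip(answers1, answers2, key):
              if a1 != k and a2 != k:      # error in common: both wrong on this item
                  eic += 1
                  if a1 == a2:             # exact error in common: same wrong choice
                      eeic += 1
              if a1 != a2:                 # difference between the two response strings
                  d += 1
          return eic, eeic, d

      key      = "ABCDABCDABCDABCDABCD"
      student1 = "ABCDABCDABCAABCBABCB"
      student2 = "ABCDABCDABCAABCCABCD"

      eic, eeic, d = cheating_counts(student1, student2, key)
      print(f"EIC={eic}, EEIC={eeic}, D={d}")
      print("EEIC/EIC =", eeic / eic if eic else float("nan"))
      print("EEIC/D   =", eeic / d if d else float("inf"))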

  19. ED-based screening programs for hepatitis C (HCV) highlight significant opportunity to identify patients, prevent downstream costs/complications.

    PubMed

    2014-01-01

    New data suggest there is a huge opportunity for EDs to identify patients with the hepatitis C virus (HCV) and link them into care before downstream complications lead to higher medical costs and adverse outcomes. Early results from a pilot study at the University of Alabama Medical Center in Birmingham show that at least 12% of the targeted baby boomer population being screened for HCV in the ED is testing positive for HCV, with confirmatory tests showing that about 9% of the screened population is infected with the disease. Both the Centers for Disease Control in Atlanta and the US Preventive Services Task Force recommend one-time HCV screening for patients who were born between 1945 and 1965. Public health experts say 75% of HCV infections occur in patients born during the baby boomer years, and that roughly half of them are unaware of their HCV status. Researchers at UAB report that so many patients are testing positive for HCV that demand for care can quickly overwhelm the health system if new primary care/specialty resources are not identified. Administrators of ED-based HCV screening programs in both Birmingham and Houston note that EDs with existing screening programs for HIV should have the easiest time implementing HCV screening. They also stress that patients are more accepting of HCV screening, and that the counseling process is easier. PMID:24432549

  20. The Significance of Lewis Acid Sites for the Selective Catalytic Reduction of Nitric Oxide on Vanadium-Based Catalysts.

    PubMed

    Marberger, Adrian; Ferri, Davide; Elsener, Martin; Kröcher, Oliver

    2016-09-19

    The long-debated reaction mechanisms of the selective catalytic reduction (SCR) of nitric oxide with ammonia (NH3) on vanadium-based catalysts rely on the involvement of Brønsted or Lewis acid sites. This issue has been clearly elucidated using a combination of transient perturbations of the catalyst environment with operando time-resolved spectroscopy to obtain unique molecular-level insights. Nitric oxide reacts predominantly with NH3 coordinated to Lewis sites on vanadia on tungsta-titania (V2O5-WO3-TiO2), while Brønsted sites are not involved in the catalytic cycle. The Lewis site is a mono-oxo vanadyl group that reduces only in the presence of both nitric oxide and NH3. We were also able to verify the formation of the nitrosamide (NH2NO) intermediate, which forms in tandem with vanadium reduction, and thus the entire mechanism of SCR. Our experimental approach, demonstrated in the specific case of SCR, promises to progress the understanding of chemical reactions of technological relevance. PMID:27553251

  2. The Black Hole Formation Probability

    NASA Astrophysics Data System (ADS)

    Clausen, Drew R.; Piro, Anthony; Ott, Christian D.

    2015-01-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. Using the observed BH mass distribution from Galactic X-ray binaries, we investigate the probability that a star will make a BH as a function of its ZAMS mass. Although the shape of the black hole formation probability function is poorly constrained by current measurements, we believe that this framework is an important new step toward better understanding BH formation. We also consider some of the implications of this probability distribution, from its impact on the chemical enrichment from massive stars, to its connection with the structure of the core at the time of collapse, to the birth kicks that black holes receive. A probabilistic description of BH formation will be a useful input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  3. Investigation of the Chromosome Regions with Significant Affinity for the Nuclear Envelope in Fruit Fly – A Model Based Approach

    PubMed Central

    Kinney, Nicholas Allen; Sharakhov, Igor V.; Onufriev, Alexey V.

    2014-01-01

    Three-dimensional nuclear architecture is important for genome function, but is still poorly understood. In particular, little is known about the role of the "boundary conditions" – points of attachment between chromosomes and the nuclear envelope. We describe a method for modeling the 3D organization of the interphase nucleus, and its application to analysis of chromosome-nuclear envelope (Chr-NE) attachments of polytene (giant) chromosomes in Drosophila melanogaster salivary glands. The model represents chromosomes as self-avoiding polymer chains confined within the nucleus; parameters of the model are taken directly from experiment, and no fitting parameters are introduced. Methods are developed to objectively quantify chromosome territories and intertwining, which are discussed in the context of corresponding experimental observations. In particular, a mathematically rigorous definition of a territory based on the convex hull is proposed. The self-avoiding polymer model is used to re-analyze previous experimental data; the analysis suggests 33 Chr-NE attachments in addition to the 15 already explored Chr-NE attachments. Most of these new Chr-NE attachments correspond to intercalary heterochromatin – gene-poor, dark-staining, late-replicating regions of the genome; however, three correspond to euchromatin – gene-rich, light-staining, early-replicating regions of the genome. The analysis also suggests 5 regions of anti-contact, characterized by aversion for the NE; only two of these correspond to euchromatin. This composition of chromatin suggests that heterochromatin may not be necessary or sufficient for the formation of a Chr-NE attachment. To the extent that the proposed model represents reality, the confinement of the polytene chromosomes in a spherical nucleus alone does not favor the positioning of specific chromosome regions at the NE as seen in experiment; consequently, the 15 experimentally known Chr-NE attachment positions do not appear to
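
    The convex-hull territory measure mentioned above can be sketched as follows: take the 3D coordinates of one model chromosome's monomers, build their convex hull, and report the hull volume (here also as a fraction of a spherical nuclear volume). The random-walk conformation below merely stands in for a polymer configuration from the actual model.

      # Convex-hull "territory" of a single model chromosome conformation.
      import numpy as np
      from scipy.spatial import ConvexHull

      rng = np.random.default_rng(5)
      nucleus_radius = 1.0

      # Random-walk stand-in for one confined chromosome conformation.
      steps = rng.normal(scale=0.03, size=(2000, 3))
      coords = np.cumsum(steps, axis=0)
      coords *= 0.6 * nucleus_radius / np.linalg.norm(coords, axis=1).max()  # keep inside nucleus

      hull = ConvexHull(coords)
      nuclear_volume = 4.0 / 3.0 * np.pi * nucleus_radius**3
      print(f"territory (convex hull) volume = {hull.volume:.3f}")
      print(f"fraction of nuclear volume     = {hull.volume / nuclear_volume:.2%}")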

  4. Probability, statistics, and computational science.

    PubMed

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.

  5. Probability theory, not the very guide of life.

    PubMed

    Juslin, Peter; Nilsson, Håkan; Winman, Anders

    2009-10-01

    Probability theory has long been taken as the self-evident norm against which to evaluate inductive reasoning, and classical demonstrations of violations of this norm include the conjunction error and base-rate neglect. Many of these phenomena require multiplicative probability integration, whereas people seem more inclined to linear additive integration, in part, at least, because of well-known capacity constraints on controlled thought. In this article, the authors show with computer simulations that when based on approximate knowledge of probabilities, as is routinely the case in natural environments, linear additive integration can yield as accurate estimates, and as good average decision returns, as estimates based on probability theory. It is proposed that in natural environments people have little opportunity or incentive to induce the normative rules of probability theory and, given their cognitive constraints, linear additive integration may often offer superior bounded rationality. PMID:19839686

  6. Bacteria survival probability in bactericidal filter paper.

    PubMed

    Mansur-Azzam, Nura; Hosseinidoust, Zeinab; Woo, Su Gyeong; Vyhnalkova, Renata; Eisenberg, Adi; van de Ven, Theo G M

    2014-05-01

    Bactericidal filter papers offer the simplicity of gravity filtration to simultaneously eradicate microbial contaminants and particulates. We previously detailed the development of biocidal block copolymer micelles that could be immobilized on a filter paper to actively eradicate bacteria. Despite the many advantages offered by this system, its widespread use is hindered by its unknown mechanism of action, which can result in non-reproducible outcomes. In this work, we sought to investigate the mechanism by which a certain percentage of Escherichia coli cells survived when passing through the bactericidal filter paper. Through the process of elimination, the possibility that the bacterial survival probability was controlled by the initial bacterial load or the existence of resistant sub-populations of E. coli was dismissed. It was observed that increasing the thickness or the number of layers of the filter significantly decreased bacterial survival probability for the biocidal filter paper but did not affect the efficiency of the blank filter paper (no biocide). The survival probability of bacteria passing through the antibacterial filter paper appeared to depend strongly on the number of collisions between each bacterium and the biocide-loaded micelles. It was thus hypothesized that during each collision a certain number of biocide molecules were directly transferred from the hydrophobic core of the micelle to the bacterial lipid bilayer membrane. Therefore, each bacterium must encounter a certain number of collisions to take up enough biocide to kill the cell, and cells that do not undergo the threshold number of collisions are expected to survive.
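
    The collision-threshold hypothesis above lends itself to a small quantitative sketch: if the number of bacterium-micelle collisions during passage is taken to be Poisson with a mean proportional to the number of filter layers, survival is the probability of experiencing fewer than the lethal number of collisions. The Poisson assumption, the mean collisions per layer and the threshold are all illustrative choices, not values from the study.

      # Survival probability under a Poisson collision-threshold model (illustrative).
      from scipy.stats import poisson

      collisions_per_layer = 4.0   # mean collisions with biocide-loaded micelles per layer
      lethal_threshold = 12        # collisions needed to transfer a lethal biocide dose

      for layers in (1, 2, 3, 4):
          mean_collisions = collisions_per_layer * layers
          survival = poisson.cdf(lethal_threshold - 1, mean_collisions)
          print(f"{layers} layer(s): survival probability = {survival:.3f}")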

  7. Preservice Elementary Teachers and the Fundamentals of Probability

    ERIC Educational Resources Information Center

    Dollard, Clark

    2011-01-01

    This study examined how preservice elementary teachers think about situations involving probability. Twenty-four preservice elementary teachers who had not yet studied probability as part of their preservice elementary mathematics coursework were interviewed using a task-based interview. The participants' responses showed a wide variety of…

  8. Public Attitudes toward Stuttering in Turkey: Probability versus Convenience Sampling

    ERIC Educational Resources Information Center

    Ozdemir, R. Sertan; St. Louis, Kenneth O.; Topbas, Seyhun

    2011-01-01

    Purpose: A Turkish translation of the "Public Opinion Survey of Human Attributes-Stuttering" ("POSHA-S") was used to compare probability versus convenience sampling to measure public attitudes toward stuttering. Method: A convenience sample of adults in Eskisehir, Turkey was compared with two replicates of a school-based, probability cluster…

  9. Probability Theory, Not the Very Guide of Life

    ERIC Educational Resources Information Center

    Juslin, Peter; Nilsson, Hakan; Winman, Anders

    2009-01-01

    Probability theory has long been taken as the self-evident norm against which to evaluate inductive reasoning, and classical demonstrations of violations of this norm include the conjunction error and base-rate neglect. Many of these phenomena require multiplicative probability integration, whereas people seem more inclined to linear additive…

  10. Dynamics of opinion formation with strengthen selection probability

    NASA Astrophysics Data System (ADS)

    Zhang, Haifeng; Jin, Zhen; Wang, Binghong

    2014-04-01

    The local majority rule is widely accepted as a paradigmatic model of opinion formation. In this paper, we study a model of opinion formation in which the opinion update rule is based not on the majority rule or a linear selection probability but on a strengthened selection probability controlled by an adjustable parameter β. In particular, our proposed probability function can approximately recover the two extreme cases (the linear probability function and the majority rule) or interpolate between them for different values of β. By studying this model on different kinds of networks, including regular networks and complex networks, we find that there exists an optimal value of β giving the most efficient convergence to consensus regardless of the topology of the network. This work reveals that, compared with the majority rule and linear selection probability, the strengthened selection probability may be a more appropriate model for understanding the formation of opinions in society.
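
    The abstract does not spell out the probability function itself, so the sketch below assumes one common interpolation: the chance of adopting opinion i is proportional to n_i raised to the power β, where n_i is the number of neighbours holding opinion i. With β = 1 this reduces to linear (proportional) selection, and large β approaches the local majority rule; the specific functional form is this sketch's assumption, not necessarily the paper's.

      # Assumed strengthened selection probability: P(adopt i) proportional to n_i**beta.
      import numpy as np

      def adoption_probabilities(neighbour_counts, beta):
          counts = np.asarray(neighbour_counts, dtype=float)
          weights = counts**beta
          return weights / weights.sum()

      neighbours = [5, 3]   # neighbours holding opinion A and opinion B
      for beta in (0.5, 1.0, 2.0, 8.0):
          p = adoption_probabilities(neighbours, beta)
          print(f"beta={beta:4.1f}: P(adopt A)={p[0]:.3f}, P(adopt B)={p[1]:.3f}")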

  11. Persistence probabilities for stream populations.

    PubMed

    Samia, Yasmine; Lutscher, Frithjof

    2012-07-01

    Individuals in streams and rivers are constantly at risk of being washed downstream and thereby lost to their population. The possibility of diffusion-mediated persistence of populations in advective environments has been the focus of a multitude of recent modeling efforts. Most of these recent models are deterministic, and they predict the existence of a critical advection velocity, above which a population cannot persist. In this work, we present a stochastic approach to the persistence problem in streams and rivers. We use the dominant eigenvalue of the advection-diffusion operator to transition from a spatially explicit description to a spatially implicit birth-death process, in which individual washout from the domain appears as an additional death term. We find that the deterministic persistence threshold is replaced by a smooth transition from almost sure persistence to extinction as advection velocity increases. More interestingly, we explore how temporal variation in flow rate and other parameters affect the persistence probability. In line with general expectations, we find that temporal variation often decreases the persistence probability, and we focus on a few examples of how variation can increase population persistence.

  12. A Quantum Probability Model of Causal Reasoning

    PubMed Central

    Trueblood, Jennifer S.; Busemeyer, Jerome R.

    2012-01-01

    People can often outperform statistical methods and machine learning algorithms in situations that involve making inferences about the relationship between causes and effects. While people are remarkably good at causal reasoning in many situations, there are several instances where they deviate from expected responses. This paper examines three situations where judgments related to causal inference problems produce unexpected results and describes a quantum inference model based on the axiomatic principles of quantum probability theory that can explain these effects. Two of the three phenomena arise from the comparison of predictive judgments (i.e., the conditional probability of an effect given a cause) with diagnostic judgments (i.e., the conditional probability of a cause given an effect). The third phenomenon is a new finding examining order effects in predictive causal judgments. The quantum inference model uses the notion of incompatibility among different causes to account for all three phenomena. Psychologically, the model assumes that individuals adopt different points of view when thinking about different causes. The model provides good fits to the data and offers a coherent account for all three causal reasoning effects thus proving to be a viable new candidate for modeling human judgment. PMID:22593747

  13. Augmenting Transition Probabilities for Neutral Atomic Nitrogen

    NASA Technical Reports Server (NTRS)

    Terrazas-Salines, Imelda; Park, Chul; Strawa, Anthony W.; Hartman, G. Joseph (Technical Monitor)

    1996-01-01

    The transition probability values for a number of neutral atomic nitrogen (NI) lines in the visible wavelength range are determined in order to augment those given in the National Bureau of Standards Tables. These values are determined from experimentation as well as by using the published results of other investigators. The experimental determination of the lines in the 410 to 430 nm range was made from the observation of the emission from the arc column of an arc-heated wind tunnel. The transition probability values of these NI lines are determined to an accuracy of +/- 30% by comparison of their measured intensities with those of the atomic oxygen (OI) multiplet at around 615 nm. The temperature of the emitting medium is determined both using a multiple-layer model, based on a theoretical model of the flow in the arc column, and an empirical single-layer model. The results show that the two models lead to the same values of transition probabilities for the NI lines.

  14. Quantum probabilities from quantum entanglement: experimentally unpacking the Born rule

    NASA Astrophysics Data System (ADS)

    Harris, Jérémie; Bouchard, Frédéric; Santamato, Enrico; Zurek, Wojciech H.; Boyd, Robert W.; Karimi, Ebrahim

    2016-05-01

    The Born rule, a foundational axiom used to deduce probabilities of events from wavefunctions, is indispensable in the everyday practice of quantum physics. It is also key in the quest to reconcile the ostensibly inconsistent laws of the quantum and classical realms, as it confers physical significance to reduced density matrices, the essential tools of decoherence theory. Following Bohr’s Copenhagen interpretation, textbooks postulate the Born rule outright. However, recent attempts to derive it from other quantum principles have been successful, holding promise for simplifying and clarifying the quantum foundational bedrock. A major family of derivations is based on envariance, a recently discovered symmetry of entangled quantum states. Here, we identify and experimentally test three premises central to these envariance-based derivations, thus demonstrating, in the microworld, the symmetries from which the Born rule is derived. Further, we demonstrate envariance in a purely local quantum system, showing its independence from relativistic causality.

  15. Mapping genes with longitudinal phenotypes via Bayesian posterior probabilities.

    PubMed

    Musolf, Anthony; Nato, Alejandro Q; Londono, Douglas; Zhou, Lisheng; Matise, Tara C; Gordon, Derek

    2014-01-01

    Most association studies focus on disease risk, with less attention paid to disease progression or severity. These phenotypes require longitudinal data. This paper presents a new method for analyzing longitudinal data to map genes in both population-based and family-based studies. Using simulated systolic blood pressure measurements obtained from Genetic Analysis Workshop 18, we cluster the phenotype data into trajectory subgroups. We then use the Bayesian posterior probability of being in the high subgroup as a quantitative trait in an association analysis with genotype data. This method maintains high power (>80%) in locating genes known to affect the simulated phenotype for most specified significance levels (α). We believe that this method can be useful to aid in the discovery of genes that affect severity or progression of disease. PMID:25519410
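
    The two-step idea described above can be sketched with a generic mixture model: cluster the longitudinal blood-pressure trajectories, then take each subject's posterior probability of membership in the "high" trajectory subgroup as the quantitative trait. The sketch uses a Gaussian mixture on synthetic trajectories and omits the downstream genetic association test; the actual clustering method used in the paper may differ.

      # Posterior probability of the "high" trajectory subgroup as a quantitative trait.
      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(6)
      n_subjects, n_visits = 300, 4

      # Two latent trajectory groups: "normal" and "high/rising" systolic blood pressure.
      group = rng.binomial(1, 0.3, n_subjects)
      visits = np.arange(n_visits)
      baseline = np.where(group == 1, 140.0, 120.0)[:, None]
      slope = np.where(group == 1, 3.0, 0.5)[:, None]
      sbp = baseline + slope * visits + rng.normal(0, 6, (n_subjects, n_visits))

      gmm = GaussianMixture(n_components=2, random_state=0).fit(sbp)
      posterior = gmm.predict_proba(sbp)

      # Label the component with the larger mean SBP as the "high" subgroup.
      high_component = int(np.argmax(gmm.means_.mean(axis=1)))
      trait = posterior[:, high_component]
      print("posterior P(high subgroup), first five subjects:", np.round(trait[:5], 3))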

  16. Quantum probabilities from quantum entanglement: experimentally unpacking the Born rule

    DOE PAGESBeta

    Harris, Jérémie; Bouchard, Frédéric; Santamato, Enrico; Zurek, Wojciech H.; Boyd, Robert W.; Karimi, Ebrahim

    2016-05-11

    The Born rule, a foundational axiom used to deduce probabilities of events from wavefunctions, is indispensable in the everyday practice of quantum physics. It is also key in the quest to reconcile the ostensibly inconsistent laws of the quantum and classical realms, as it confers physical significance to reduced density matrices, the essential tools of decoherence theory. Following Bohr's Copenhagen interpretation, textbooks postulate the Born rule outright. But, recent attempts to derive it from other quantum principles have been successful, holding promise for simplifying and clarifying the quantum foundational bedrock. Moreover, a major family of derivations is based on envariance, a recently discovered symmetry of entangled quantum states. Here, we identify and experimentally test three premises central to these envariance-based derivations, thus demonstrating, in the microworld, the symmetries from which the Born rule is derived. Furthermore, we demonstrate envariance in a purely local quantum system, showing its independence from relativistic causality.

  17. Lévy laws in free probability

    PubMed Central

    Barndorff-Nielsen, Ole E.; Thorbjørnsen, Steen

    2002-01-01

    This article and its sequel outline recent developments in the theory of infinite divisibility and Lévy processes in free probability, a subject area belonging to noncommutative (or quantum) probability. The present paper discusses the classes of infinitely divisible probability measures in classical and free probability, respectively, via a study of the Bercovici–Pata bijection between these classes. PMID:12473744

  18. Imprecise probability for non-commuting observables

    NASA Astrophysics Data System (ADS)

    Allahverdyan, Armen E.

    2015-08-01

    It is known that non-commuting observables in quantum mechanics do not have a joint probability. This statement refers to the precise (additive) probability model. I show that the joint distribution of any non-commuting pair of variables can be quantified via upper and lower probabilities, i.e. the joint probability is described by an interval instead of a number (imprecise probability). I propose transparent axioms from which the upper and lower probability operators follow. The imprecise probability depends on the non-commuting observables, is linear over the state (density matrix) and reverts to the usual expression for commuting observables.

  19. Snell Envelope with Small Probability Criteria

    SciTech Connect

    Del Moral, Pierre; Hu, Peng; Oudjane, Nadia

    2012-12-15

    We present a new algorithm to compute the Snell envelope in the specific case where the criterion to optimize is associated with a small probability or a rare event. This new approach combines the Stochastic Mesh approach of Broadie and Glasserman with a particle approximation scheme based on a specific change of measure designed to concentrate the computational effort in regions pointed out by the criterion. The theoretical analysis of this new algorithm provides non-asymptotic convergence estimates. Finally, the numerical tests confirm the practical interest of this approach.

  20. Recent Advances in Model-Assisted Probability of Detection

    NASA Technical Reports Server (NTRS)

    Thompson, R. Bruce; Brasche, Lisa J.; Lindgren, Eric; Swindell, Paul; Winfree, William P.

    2009-01-01

    The increased role played by probability of detection (POD) in structural integrity programs, combined with the significant time and cost associated with the purely empirical determination of POD, provides motivation for alternate means to estimate this important metric of NDE techniques. One approach to make the process of POD estimation more efficient is to complement limited empirical experiments with information from physics-based models of the inspection process or controlled laboratory experiments. The Model-Assisted Probability of Detection (MAPOD) Working Group was formed by the Air Force Research Laboratory, the FAA Technical Center, and NASA to explore these possibilities. Since the 2004 inception of the MAPOD Working Group, 11 meetings have been held in conjunction with major NDE conferences. This paper will review the accomplishments of this group, which includes over 90 members from around the world. Included will be a discussion of strategies developed to combine physics-based and empirical understanding, draft protocols that have been developed to guide application of the strategies, and demonstrations that have been or are being carried out in a number of countries. The talk will conclude with a discussion of future directions, which will include documentation of benefits via case studies, development of formal protocols for engineering practice, as well as a number of specific technical issues.
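
    For context on the purely empirical side that MAPOD aims to supplement, the sketch below fits a hit/miss logistic model of detection versus flaw size and reads off an a90-style size (90% detection probability). It uses synthetic inspection outcomes and a generic logistic fit; it does not implement any of the MAPOD protocols discussed above.

      # Empirical hit/miss POD curve from synthetic inspection data.
      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(7)
      flaw_size = rng.uniform(0.2, 3.0, 400)                       # mm
      true_pod = 1.0 / (1.0 + np.exp(-(flaw_size - 1.2) / 0.25))   # hidden "true" POD curve
      detected = rng.binomial(1, true_pod)                         # hit/miss outcomes

      model = LogisticRegression().fit(flaw_size[:, None], detected)
      slope, intercept = model.coef_[0, 0], model.intercept_[0]
      a90 = (np.log(0.9 / 0.1) - intercept) / slope                # size with 90% estimated POD
      print(f"estimated a90 ~ {a90:.2f} mm")
      for size in (0.5, 1.0, 1.5, 2.0):
          pod = model.predict_proba([[size]])[0, 1]
          print(f"flaw size {size:.1f} mm: estimated POD = {pod:.2f}")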

  1. Reduction of discretization error for ray tracing of MOC through a correction on collision probabilities

    SciTech Connect

    Tabuchi, M.; Tatsumi, M.; Yamamoto, A.; Endo, T.

    2013-07-01

    A new correction model for the ray tracing of the method of characteristics is proposed in order to reduce discretization error. Ray tracing parameters such as azimuthal angle division, polar angle division and ray separation are considered in this study. In the method of characteristics, region-average scalar fluxes can be implicitly expressed by collision probabilities, although these collision probabilities are not directly treated in the ordinary calculation scheme. From this viewpoint, the difference between a coarse ray tracing condition and a detailed one can be interpreted as a difference in the estimation of collision probabilities. In other words, the discretization error for ray tracing can be recognized as a consequence of inaccurate collision probabilities caused by coarse ray tracing. This discussion suggests that an accurate region-average scalar flux can be obtained through an appropriate correction on collision probabilities. In this paper, a correction model on collision probabilities is theoretically derived based on the neutron balance equation, and its validity is confirmed through typical single-assembly calculations. The effectiveness of the present correction method is also discussed. It is confirmed that the discretization error for ray tracing can be significantly reduced by the present correction method in a multi-assembly calculation, even though the correction factor is estimated in single-assembly geometry. (authors)

  2. Probability forecast of the suspended sediment concentration using copula

    NASA Astrophysics Data System (ADS)

    Yu, Kun-xia; Li, Peng; Li, Zhanbin

    2016-04-01

    An approach for probability forecasting of suspended sediment loads is presented in our research. The probability forecast model is established based on the joint probability distribution of water discharge and suspended sediment concentration. The conditional distribution function of suspended sediment concentration given water discharge is evaluated once the joint probability distribution between water discharge and suspended sediment concentration is constructed, and the probability forecast of suspended sediment concentration is implemented in terms of this conditional probability function. The approach is exemplified using annual data sets of ten watersheds in the middle Yellow River, which is characterized by heavy sediment loads. The three-parameter Gamma distribution is employed to fit the marginal distributions of annual water discharge and annual suspended sediment concentration, and the Gumbel copula describes the dependence structure between annual water discharge and annual suspended sediment concentration well. Annual suspended sediment concentrations estimated from the conditional distribution function with a forecast probability of 50 percent agree better with the observed suspended sediment concentration values than the traditional sediment rating curve method given the water discharge values. The overwhelming majority of observed suspended sediment concentration points lie between the forecast probabilities of 5 percent and 95 percent, which can be considered the lower and upper uncertainty bounds of the predicted observation, respectively. The results indicate that probability forecasting on the basis of the conditional distribution function is a potential alternative for the estimation of suspended sediment and other hydrological variables.
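
    The conditional-forecast step described above can be sketched concretely: with gamma marginals for discharge and concentration and a Gumbel copula for their dependence, the conditional CDF of concentration given discharge is the partial derivative of the copula with respect to the discharge argument, and a forecast at any probability level is obtained by inverting it numerically. The marginal and copula parameters below are invented rather than fitted to the Yellow River data.

      # Conditional probability forecast of sediment concentration given discharge
      # using a Gumbel copula with gamma marginals (illustrative parameters).
      import numpy as np
      from scipy.stats import gamma
      from scipy.optimize import brentq

      theta = 2.5                        # Gumbel copula dependence parameter
      F_q = gamma(a=3.0, scale=200.0)    # marginal of annual discharge
      F_s = gamma(a=2.0, scale=15.0)     # marginal of annual sediment concentration

      def conditional_cdf(v, u, theta):
          """C(v | u) = dC(u, v)/du for the Gumbel copula."""
          x, y = -np.log(u), -np.log(v)
          A = x**theta + y**theta
          return np.exp(-A**(1.0 / theta)) * A**(1.0 / theta - 1.0) * x**(theta - 1.0) / u

      def forecast_concentration(q, prob, theta):
          """Concentration whose conditional non-exceedance probability given Q = q equals prob."""
          u = F_q.cdf(q)
          v = brentq(lambda vv: conditional_cdf(vv, u, theta) - prob, 1e-9, 1.0 - 1e-9)
          return F_s.ppf(v)

      q_obs = 800.0
      for p in (0.05, 0.50, 0.95):
          s = forecast_concentration(q_obs, p, theta)
          print(f"forecast probability {p:.2f}: concentration = {s:.1f}")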

  3. Evolution probabilities and phylogenetic distance of dinucleotides.

    PubMed

    Michel, Christian J

    2007-11-21

    We develop here an analytical evolution model based on a 16 × 16 dinucleotide mutation matrix with six substitution parameters associated with the three types of substitutions in the two dinucleotide sites. It generalizes previous models based on 4 × 4 nucleotide mutation matrices. It determines, at some time t, the exact occurrence probabilities of dinucleotides mutating randomly according to these six substitution parameters. Furthermore, several properties and two applications of this model allow the derivation of 16 analytical evolutionary solutions for dinucleotides and also a dinucleotide phylogenetic distance. Finally, based on this mathematical model, the SED (Stochastic Evolution of Dinucleotides) web server has been developed for deriving analytical evolutionary solutions of dinucleotides.
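
    As a sketch of how such a 16 × 16 matrix can be assembled, and under the simplifying assumption that the two sites mutate independently, one may take a Kronecker sum of two 4 × 4 three-parameter (Kimura-style) rate matrices, which together carry six substitution parameters, and obtain dinucleotide occurrence probabilities at time t from the matrix exponential. The construction and the rate values below are illustrative assumptions, not the paper's exact parameterization.

      # Dinucleotide occurrence probabilities from a Kronecker-sum rate matrix.
      import numpy as np
      from scipy.linalg import expm

      BASES = "ACGT"

      def kimura3st_rates(a, b, c):
          """4x4 rate matrix with three substitution types per site."""
          pairs = {("A", "G"): a, ("C", "T"): a,   # transitions
                   ("A", "C"): b, ("G", "T"): b,   # transversions, type 1
                   ("A", "T"): c, ("C", "G"): c}   # transversions, type 2
          Q = np.zeros((4, 4))
          for (x, y), rate in pairs.items():
              i, j = BASES.index(x), BASES.index(y)
              Q[i, j] = Q[j, i] = rate
          np.fill_diagonal(Q, -Q.sum(axis=1))
          return Q

      Q1 = kimura3st_rates(0.6, 0.2, 0.1)          # three parameters for site 1
      Q2 = kimura3st_rates(0.4, 0.3, 0.1)          # three parameters for site 2
      I4 = np.eye(4)
      Q = np.kron(Q1, I4) + np.kron(I4, Q2)        # 16 x 16 dinucleotide rate matrix

      t = 0.8
      P = expm(Q * t)                              # dinucleotide transition probabilities over time t
      start = np.zeros(16)
      start[BASES.index("G") * 4 + BASES.index("C")] = 1.0   # population starts as "GC"
      probs = start @ P
      dinucleotides = [x + y for x in BASES for y in BASES]
      top = sorted(zip(dinucleotides, probs), key=lambda kv: -kv[1])[:4]
      print("most probable dinucleotides at t = 0.8:", [(d, round(float(p), 3)) for d, p in top])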

  4. Uncertainty analysis for Probable Maximum Precipitation estimates

    NASA Astrophysics Data System (ADS)

    Micovic, Zoran; Schaefer, Melvin G.; Taylor, George H.

    2015-02-01

    An analysis of uncertainty associated with Probable Maximum Precipitation (PMP) estimates is presented. The focus of the study is firmly on PMP estimates derived through meteorological analyses and not on statistically derived PMPs. Theoretical PMP cannot be computed directly and operational PMP estimates are developed through a stepwise procedure using a significant degree of subjective professional judgment. This paper presents a methodology for portraying the uncertain nature of PMP estimation by analyzing individual steps within the PMP derivation procedure whereby for each parameter requiring judgment, a set of possible values is specified and accompanied by expected probabilities. The resulting range of possible PMP values can be compared with the previously derived operational single-value PMP, providing measures of the conservatism and variability of the original estimate. To our knowledge, this is the first uncertainty analysis conducted for a PMP derived through meteorological analyses. The methodology was tested on the La Joie Dam watershed in British Columbia. The results indicate that the commonly used single-value PMP estimate could be more than 40% higher when possible changes in various meteorological variables used to derive the PMP are considered. The findings of this study imply that PMP estimates should always be characterized as a range of values recognizing the significant uncertainties involved in PMP estimation. In fact, we do not know at this time whether precipitation is actually upper-bounded, and if precipitation is upper-bounded, how closely PMP estimates approach the theoretical limit.
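
    The stepwise treatment of judgment described above can be sketched by giving each uncertain step a small set of candidate adjustment factors with expected probabilities and enumerating all combinations into a distribution of PMP values. The storm depth, the factors and their probabilities below are hypothetical, and the multiplicative combination is a simplification of an actual PMP derivation.

      # Enumerating judgment alternatives into a distribution of PMP estimates.
      from itertools import product

      base_storm_depth_mm = 400.0        # transposed design-storm depth (hypothetical)
      operational_pmp_mm = 620.0         # previously adopted single-value PMP (hypothetical)

      # Each uncertain derivation step: list of (multiplier, probability) alternatives.
      steps = {
          "moisture_maximization": [(1.40, 0.5), (1.50, 0.3), (1.60, 0.2)],
          "storm_transposition":   [(0.95, 0.4), (1.00, 0.4), (1.05, 0.2)],
          "orographic_adjustment": [(1.00, 0.3), (1.10, 0.5), (1.20, 0.2)],
      }

      outcomes = []
      for combo in product(*steps.values()):
          multiplier = probability = 1.0
          for factor, prob in combo:
              multiplier *= factor
              probability *= prob
          outcomes.append((base_storm_depth_mm * multiplier, probability))

      outcomes.sort()
      mean_pmp = sum(value * prob for value, prob in outcomes)
      print(f"possible PMP values: {outcomes[0][0]:.0f} to {outcomes[-1][0]:.0f} mm")
      print(f"probability-weighted mean PMP = {mean_pmp:.0f} mm "
            f"(operational single value = {operational_pmp_mm:.0f} mm)")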

  5. The probability for a Pap test to be abnormal is directly proportional to HPV viral load: results from a Swiss study comparing HPV testing and liquid-based cytology to detect cervical cancer precursors in 13,842 women.

    PubMed

    Bigras, G; de Marval, F

    2005-09-01

    In a study involving 13,842 women and 113 gynaecologists, liquid-based cytology and HPV testing for detecting cervical cancer were compared. A total of 1334 women were found to be positive for one or both tests and were invited for colposcopy with biopsy. A total of 1031 satisfactory biopsies on 1031 women were thereafter collected using a systematic biopsy protocol, which was random in the colposcopically normal-appearing cervix or directed in the abnormal one. In all, 502 women with negative tests were also biopsied. A total of 82 histologic high-grade squamous intraepithelial lesion (HSIL) were reported in biopsies, all from the group with one or both tests positive. Sensitivity and specificity to detect histologic HSIL were 59 and 97% for cytology, and 97 and 92% for HPV. In total, 14% of reviewed negative cytological preparations associated with histologic HSIL contained no morphologically abnormal cells despite a positive HPV test. This suggested a theoretical limit for cytology sensitivity. HPV viral load analysis of the 1143 HPV-positive samples showed a direct relationship between abnormal Pap test frequency and HPV viral load. Thus, not only does the HPV testing have a greater sensitivity than cytology but the probability of the latter being positive can also be defined as a function of the associated HPV viral load. PMID:16136031

  6. The probability for a Pap test to be abnormal is directly proportional to HPV viral load: results from a Swiss study comparing HPV testing and liquid-based cytology to detect cervical cancer precursors in 13 842 women

    PubMed Central

    Bigras, G; de Marval, F

    2005-01-01

    In a study involving 13 842 women and 113 gynaecologists, liquid-based cytology and HPV testing for detecting cervical cancer were compared. A total of 1334 women were found to be positive for one or both tests and were invited for colposcopy with biopsy. A total of 1031 satisfactory biopsies on 1031 women were thereafter collected using a systematic biopsy protocol, which was random in the colposcopically normal-appearing cervix or directed in the abnormal one. In all, 502 women with negative tests were also biopsied. A total of 82 histologic high-grade squamous intraepithelial lesion (HSIL) were reported in biopsies, all from the group with one or both tests positive. Sensitivity and specificity to detect histologic HSIL were 59 and 97% for cytology, and 97 and 92% for HPV. In total, 14% of reviewed negative cytological preparations associated with histologic HSIL contained no morphologically abnormal cells despite a positive HPV test. This suggested a theoretical limit for cytology sensitivity. HPV viral load analysis of the 1143 HPV-positive samples showed a direct relationship between abnormal Pap test frequency and HPV viral load. Thus, not only does the HPV testing have a greater sensitivity than cytology but the probability of the latter being positive can also be defined as a function of the associated HPV viral load. PMID:16136031

  7. Instability of Wave Trains and Wave Probabilities

    NASA Astrophysics Data System (ADS)

    Babanin, Alexander

    2013-04-01

    Centre for Ocean Engineering, Science and Technology, Swinburne University of Technology, Melbourne, Australia, ababanin@swin.edu.au Design criteria in ocean engineering, whether for a one-in-50-years or a one-in-5000-years event, are hardly ever based on measurements; rather, they rely on statistical distributions of the relevant metocean properties. Of utmost interest is the tail of the distribution, that is, rare events such as the highest waves, which have low probability. Engineers have long realised that the superposition of linear waves with a narrow-banded spectrum, as described by the Rayleigh distribution, underestimates the probability of extreme wave heights and crests, which is a critical shortcoming as far as engineering design is concerned. Ongoing theoretical and experimental efforts have been under way for decades to address this issue. The typical approach is to treat all possible waves in the ocean, or at a particular location, as a single ensemble for which some comprehensive solution can be obtained. Oceanographic knowledge, however, now indicates that no single, unified comprehensive solution is available. We would expect the probability distributions of wave height to depend on (a) whether the waves are at the spectral peak or at the tail; (b) the wave spectrum and mean steepness in the wave field; (c) the directional distribution of the peak waves; (d) whether the waves are in deep water, intermediate depth or shallow water; (e) wave breaking; and (f) the wind, particularly if it is very strong, and the currents if they have suitable horizontal gradients. Probability distributions in the different circumstances defined by these groups of conditions should be different, and combining them introduces inevitable scatter. The scatter and the accuracy will not improve by increasing the bulk data quality and quantity, and it hides the actual distribution of extremes. The groups have to be separated and their probability
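
    For reference, the sketch below evaluates the standard narrow-band (Rayleigh) wave-height exceedance probability that the abstract says underestimates extremes, P(H > h) = exp(-2 (h/Hs)^2), where Hs is the significant wave height. The Hs value is a hypothetical example.

```python
# Narrow-band (linear, Rayleigh) exceedance probability for individual wave heights.
import math

def rayleigh_exceedance(h, hs):
    """Probability that an individual wave height exceeds h, narrow-band linear theory."""
    return math.exp(-2.0 * (h / hs) ** 2)

hs = 4.0  # hypothetical significant wave height [m]
for factor in (1.0, 1.5, 2.0):  # 2*Hs is a common "freak wave" threshold
    h = factor * hs
    print(f"P(H > {factor:.1f} Hs) = {rayleigh_exceedance(h, hs):.2e}")
```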

  8. Towards Defining Probability Forecasts of Likely Climate Change

    NASA Astrophysics Data System (ADS)

    Smith, L. A.; Allen, M. R.; Stainforth, D. A.

    2004-12-01

    There is a strong desire for probabilistic forecasts of climate change, both for policy making and risk management and out of scientific interest. The extent to which this desire can be satisfied scientifically is unclear. The aim of this paper is to explore (i) current methods for extracting probability forecasts and (ii) alternative deliverables that have a firm scientific basis given the current limitations of the state of the art. Even "physics-based" models contain empirically determined parameters and parameterizations. While it is straightforward to make ensembles over initial conditions, parameter values and even several model structures, the interpretation of the resulting ensemble of simulations requires some care. Methods for extracting probability forecasts from ensembles of model simulations will be discussed in terms of their relevance and internal consistency, a particular example being provided by Murphy et al (Nature, 2004). This approach will be compared and contrasted with one proposed by climateprediction.net (Stainforth et al, Nature, in review), which strives to produce policy-relevant information when no coherent probability forecast can be extracted from the limited ensembles available in practice. The role of the Bayesian paradigm will be considered in both cases. Extracting probability distributions in the context of climate change requires consideration of a discrete sample drawn from a high-dimensional space. The analysis of small ensembles requires further assumptions of linearity and smoothness which must be verified explicitly; even when "large" ensembles are to hand, the analysis of the collection of simulations requires combining mutually exclusive runs, sampling a restricted region of the parameter space, under a set of models with related shortcomings. Historical observations serve to lift some of these difficulties, but attempts to fold observations into the analysis (say, in terms of weighting sets of simulations differently
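
    As a deliberately simplified illustration of the issue discussed above (and not the method of Murphy et al or climateprediction.net), the sketch below turns a small ensemble of simulated warming values into an exceedance probability and shows how observational weighting of the members changes the answer. The ensemble values and weights are invented for the example.

```python
# Hedged illustration: an "exceedance probability" from a small, non-random ensemble,
# with and without observationally motivated member weights (both invented here).
import numpy as np

warming = np.array([1.8, 2.1, 2.4, 2.9, 3.5, 4.2, 5.0])   # hypothetical ensemble [K]
weights = np.array([0.5, 0.9, 1.0, 1.0, 0.8, 0.4, 0.1])   # hypothetical skill weights

def exceedance(samples, threshold, w=None):
    """Weighted fraction of ensemble members exceeding a threshold."""
    w = np.ones_like(samples) if w is None else np.asarray(w, dtype=float)
    return float(w[samples > threshold].sum() / w.sum())

print("unweighted P(warming > 3 K):", round(exceedance(warming, 3.0), 2))
print("weighted   P(warming > 3 K):", round(exceedance(warming, 3.0, weights), 2))
```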

  9. Landslide Probability Assessment by the Derived Distributions Technique

    NASA Astrophysics Data System (ADS)

    Muñoz, E.; Ochoa, A.; Martínez, H.

    2012-12-01

    Landslides are potentially disastrous events that bring human and economic losses, especially in cities where accelerated and unorganized growth leads to settlements on steep and potentially unstable areas. Among the main causes of landslides are geological, geomorphological, geotechnical, climatological and hydrological conditions and anthropic intervention. This paper studies landslides triggered by rain, commonly known as "soil-slip", which are characterized by a shallow failure surface (typically between 1 and 1.5 m deep) parallel to the slope face and by being triggered by intense and/or sustained periods of rain. This type of landslide is caused by changes in pore pressure produced by a decrease in suction when a wetting front enters the soil, as a consequence of the infiltration initiated by rain and governed by the hydraulic characteristics of the soil. Failure occurs when this front reaches a critical depth and the shear strength of the soil is no longer sufficient to guarantee the stability of the mass. Critical rainfall thresholds in combination with a slope stability model are widely used for assessing landslide probability. In this paper we present a model for estimating the occurrence of landslides based on the derived distributions technique. Since the work of Eagleson in the 1970s, the derived distributions technique has been widely used in hydrology to estimate the probability of occurrence of extreme flows. The model estimates the probability density function (pdf) of the Factor of Safety (FOS) from the statistical behavior of the rainfall process and some slope parameters. The stochastic character of the rainfall is transformed by means of a deterministic failure model into the FOS pdf. Exceedance probability and return period estimation is then straightforward. The rainfall process is modeled as a Rectangular Pulses Poisson Process (RPPP) with independent exponential pdfs for the mean intensity and duration of the storms. The Philip infiltration model
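
    The sketch below illustrates the derived-distribution idea under strong simplifications: storms with exponential intensity and duration (as in the RPPP assumption) are pushed through a crude infiltration and infinite-slope model to obtain a Monte Carlo FOS distribution and a per-storm failure probability. It is not the authors' model (in particular it replaces the Philip infiltration model with a simple wetting-front rule), and all soil, slope and storm parameters are invented.

```python
# Hedged sketch of a derived distribution for the factor of safety (FOS).
import numpy as np

rng = np.random.default_rng(0)
n_storms = 200_000

# Hypothetical storm statistics (exponential, as in the RPPP assumption).
intensity = rng.exponential(scale=10.0, size=n_storms)   # mm/h
duration  = rng.exponential(scale=6.0,  size=n_storms)   # h

# Hypothetical slope/soil parameters for an infinite-slope analysis.
z, beta = 1.2, np.radians(30)          # slip depth [m], slope angle
gamma, gamma_w = 18.0, 9.81            # soil and water unit weights [kN/m3]
c, phi = 2.0, np.radians(32)           # cohesion [kPa], friction angle
fillable_porosity = 0.15

# Crude stand-in for the infiltration model: wetting-front depth from total storm depth.
z_w = np.minimum((intensity * duration / 1000.0) / fillable_porosity, z)
u = gamma_w * z_w * np.cos(beta) ** 2                      # pore pressure at the slip surface

fos = (c + (gamma * z * np.cos(beta) ** 2 - u) * np.tan(phi)) / (gamma * z * np.sin(beta) * np.cos(beta))
print(f"P(FOS < 1 | storm) ~= {np.mean(fos < 1.0):.3f}")
```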

  10. Fusion probability in heavy nuclei

    NASA Astrophysics Data System (ADS)

    Banerjee, Tathagata; Nath, S.; Pal, Santanu

    2015-03-01

    Background: Fusion between two massive nuclei is a very complex process and is characterized by three stages: (a) capture inside the potential barrier, (b) formation of an equilibrated compound nucleus (CN), and (c) statistical decay of the CN leading to a cold evaporation residue (ER) or fission. The second stage is the least understood of the three and is the most crucial in predicting the yield of superheavy elements (SHE) formed in complete fusion reactions. Purpose: A systematic study of the average fusion probability, ⟨PCN⟩, is undertaken to obtain a better understanding of its dependence on various reaction parameters. The study may also help to clearly demarcate the onset of non-CN fission (NCNF), which causes the fusion probability, PCN, to deviate from unity. Method: ER excitation functions for 52 reactions leading to CN in the mass region 170-220, which are available in the literature, have been compared with statistical model (SM) calculations. Capture cross sections have been obtained from a coupled-channels code. In the SM, shell corrections in both the level density and the fission barrier have been included. ⟨PCN⟩ for these reactions has been extracted by comparing experimental and theoretical ER excitation functions in the energy range ~5%-35% above the potential barrier, where known effects of nuclear structure are insignificant. Results: ⟨PCN⟩ has been shown to vary with entrance channel mass asymmetry, η (or charge product, ZpZt), as well as with fissility of the CN, χCN. No parameter has been found to be adequate as a single scaling variable to determine ⟨PCN⟩. Approximate boundaries have been obtained from where ⟨PCN⟩ starts deviating from unity. Conclusions: This study quite clearly reveals the limits of applicability of the SM in interpreting experimental observables from fusion reactions involving two massive nuclei. Deviation of ⟨PCN⟩ from unity marks the beginning of the domain of dynamical models of fusion. Availability of precise ER cross
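
    A hedged sketch of the extraction step described above: if the statistical model is evaluated with PCN = 1, the ratio of measured to calculated ER cross sections at each energy estimates the fusion probability, and averaging over the chosen energy window gives ⟨PCN⟩. The cross-section values below are invented placeholders, not data from the paper.

```python
# Toy extraction of <P_CN> as the energy-averaged ratio of experimental to
# statistical-model (P_CN = 1) evaporation-residue cross sections.
import numpy as np

e_over_barrier = np.array([1.05, 1.10, 1.20, 1.30, 1.35])   # E / V_B
sigma_er_exp   = np.array([12.0, 30.0, 55.0, 70.0, 74.0])   # mb, hypothetical measurements
sigma_er_sm    = np.array([20.0, 48.0, 80.0, 98.0, 103.0])  # mb, SM prediction with P_CN = 1

p_cn = sigma_er_exp / sigma_er_sm
print("P_CN(E):", np.round(p_cn, 2))
print("<P_CN> =", round(float(p_cn.mean()), 2))
```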

  11. Determination of riverbank erosion probability using Locally Weighted Logistic Regression

    NASA Astrophysics Data System (ADS)

    Ioannidou, Elena; Flori, Aikaterini; Varouchakis, Emmanouil A.; Giannakis, Georgios; Vozinaki, Anthi Eirini K.; Karatzas, George P.; Nikolaidis, Nikolaos

    2015-04-01

    Riverbank erosion is a natural geomorphologic process that affects the fluvial environment. The most important issue concerning riverbank erosion is the identification of vulnerable locations. An alternative to the usual hydrodynamic models for predicting vulnerable locations is to quantify the probability of erosion occurrence. This can be achieved by identifying the underlying relations between riverbank erosion and the geomorphological or hydrological variables that prevent or stimulate erosion. Thus, riverbank erosion can be modelled by a regression on independent variables that are considered to affect the erosion process. The impact of such variables may vary spatially; therefore, a non-stationary regression model is preferred over a stationary equivalent. Locally Weighted Regression (LWR) is proposed as a suitable choice. This method can be extended to predict the binary presence or absence of erosion from a series of local independent variables by using the logistic regression model; the combined approach is referred to as Locally Weighted Logistic Regression (LWLR). Logistic regression is a type of regression analysis used for predicting the outcome of a categorical dependent variable (e.g. a binary response) based on one or more predictor variables, and combining it with LWR assigns spatial weights to the local observations, allowing the model parameters to vary over space in order to reflect spatial heterogeneity. The probabilities of the possible outcomes are modelled as a function of the independent variables using a logistic function, which converts the dependent variable to probability scores and predicts success or failure of a given binary variable (e.g. erosion presence or absence) for any value of the independent variables. The
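
    A minimal sketch of the LWLR idea described above is given below: for a prediction site, observations are weighted by spatial proximity with a Gaussian kernel and an ordinary logistic regression is fitted with those weights. The data, covariates and bandwidth are synthetic placeholders, not the study's.

```python
# Locally weighted logistic regression (LWLR) sketch on synthetic riverbank data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 200
coords = rng.uniform(0, 10, size=(n, 2))              # bank locations (x, y) [km]
X = rng.normal(size=(n, 3))                           # e.g. bank slope, soil cohesion, discharge proxy
true_beta = np.array([1.5, -1.0, 0.8])
p = 1 / (1 + np.exp(-(X @ true_beta)))
y = rng.binomial(1, p)                                # 1 = erosion observed, 0 = not

def lwlr_predict(site_xy, X, y, coords, bandwidth=2.0):
    """Erosion probability at site_xy from a distance-weighted logistic fit."""
    d = np.linalg.norm(coords - site_xy, axis=1)
    w = np.exp(-(d / bandwidth) ** 2)                 # Gaussian spatial kernel
    model = LogisticRegression().fit(X, y, sample_weight=w)
    x_site = X[np.argmin(d)]                          # covariates of the nearest observation
    return model.predict_proba(x_site.reshape(1, -1))[0, 1]

print(f"P(erosion) near (5, 5): {lwlr_predict(np.array([5.0, 5.0]), X, y, coords):.2f}")
```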

  12. Impaired probability estimation and decision-making in pathological gambling poker players.

    PubMed

    Linnet, Jakob; Frøslev, Mette; Ramsgaard, Stine; Gebauer, Line; Mouridsen, Kim; Wohlert, Victoria

    2012-03-01

    Poker has gained tremendous popularity in recent years, increasing the risk for some individuals to develop pathological gambling. Here, we investigated cognitive biases in a computerized two-player poker task against a fictive opponent, among 12 pathological gambling poker players (PGP), 10 experienced poker players (ExP), and 11 inexperienced poker players (InP). Players were compared on probability estimation and decision-making with the hypothesis that ExP would have significantly lower cognitive biases than PGP and InP, and that the groups could be differentiated based on their cognitive bias styles. The results showed that ExP had a significantly lower average error margin in probability estimation than PGP and InP, and that PGP played hands with lower winning probability than ExP. Binomial logistic regression showed perfect differentiation (100%) between ExP and PGP, and 90.5% classification accuracy between ExP and InP. Multinomial logistic regression showed an overall classification accuracy of 23 out of 33 (69.7%) between the three groups. The classification accuracy of ExP was higher than that of PGP and InP due to the similarities in probability estimation and decision-making between PGP and InP. These impairments in probability estimation and decision-making of PGP may have implications for assessment and treatment of cognitive biases in pathological gambling poker players.
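
    For illustration only, the sketch below fits a multinomial logistic regression on two cognitive-bias features (probability-estimation error and win probability of played hands) and reports an in-sample classification accuracy, mirroring the kind of analysis described above. The feature values are simulated, not the study's data.

```python
# Toy multinomial logistic regression separating three simulated player groups.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(7)
# Hypothetical group means: (probability-estimation error, win probability of played hands, n)
groups = {"ExP": (0.05, 0.55, 10), "PGP": (0.15, 0.40, 12), "InP": (0.14, 0.45, 11)}
X, y = [], []
for label, (err_mean, winp_mean, n) in groups.items():
    X.append(np.column_stack([rng.normal(err_mean, 0.03, n), rng.normal(winp_mean, 0.05, n)]))
    y += [label] * n
X = np.vstack(X)

clf = LogisticRegression(max_iter=1000).fit(X, y)
acc = clf.score(X, y)
print(f"in-sample classification accuracy: {acc:.2f} ({int(acc * len(y))}/{len(y)})")
```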

  13. Technology-enhanced Interactive Teaching of Marginal, Joint and Conditional Probabilities: The Special Case of Bivariate Normal Distribution.

    PubMed

    Dinov, Ivo D; Kamino, Scott; Bhakhrani, Bilal; Christou, Nicolas

    2013-01-01

    Data analysis requires subtle probability reasoning to answer questions like "What is the chance of event A occurring, given that event B was observed?" This generic question arises in discussions of many intriguing scientific questions such as "What is the probability that an adolescent weighs between 120 and 140 pounds given that they are of average height?" and "What is the probability of (monetary) inflation exceeding 4% and the housing price index being below 110?" To address such problems, learning some applied, theoretical or cross-disciplinary probability concepts is necessary. Teaching such courses can be improved by utilizing modern information technology resources. Students' understanding of multivariate distributions, conditional probabilities, correlation and causation can be significantly strengthened by employing interactive web-based science educational resources. Independent of the type of probability course (e.g. a majors, minors or service probability course; a rigorous measure-theoretic, applied or statistics course), student motivation, learning experiences and knowledge retention may be enhanced by blending modern technological tools with classical conceptual pedagogical models. We have designed, implemented and disseminated a portable open-source web application for teaching multivariate distributions, marginal, joint and conditional probabilities using the special case of the bivariate Normal distribution. A real adolescent height and weight dataset is used to demonstrate the classroom use of the new web application to address problems of parameter estimation, univariate and multivariate inference.
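
    The height/weight question quoted above can be worked directly from the bivariate normal conditional distribution, W | H = h ~ N(mu_W + rho * sigma_W / sigma_H * (h - mu_H), sigma_W^2 * (1 - rho^2)). The parameter values below are hypothetical placeholders, not the application's dataset.

```python
# Conditional probability under a bivariate normal height-weight model.
from scipy.stats import norm

mu_h, sigma_h = 65.0, 3.5     # height [in], hypothetical
mu_w, sigma_w = 125.0, 15.0   # weight [lb], hypothetical
rho = 0.6                     # height-weight correlation, hypothetical

h = mu_h  # "given that they are of average height"
cond_mean = mu_w + rho * sigma_w / sigma_h * (h - mu_h)
cond_sd = sigma_w * (1 - rho ** 2) ** 0.5

p = norm.cdf(140, cond_mean, cond_sd) - norm.cdf(120, cond_mean, cond_sd)
print(f"P(120 < weight < 140 | height = {h} in) = {p:.2f}")
```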

  14. Trait Mindfulness, Reasons For Living and General Symptom Severity as Predictors of Suicide Probability in Males with Substance Abuse or Dependence

    PubMed Central

    Mohammadkhani, Parvaneh; Azadmehr, Hedieh; Mobramm, Ardeshir; Naseri, Esmaeil

    2015-01-01

    Objective: The aim of this study was to evaluate suicide probability in Iranian males with substance abuse or dependence disorder and to investigate predictors of suicide probability based on trait mindfulness, reasons for living and severity of general psychiatric symptoms. Method: Participants were 324 individuals with substance abuse or dependence in an outpatient setting and a prison. The Reasons for Living Questionnaire, the Mindfulness Attention Awareness Scale and the Suicide Probability Scale were used as instruments. The sample was selected using a convenience sampling method. Data were analyzed using SPSS and AMOS. Results: The life-time prevalence of suicide attempt was 35% in the outpatient setting and 42% in the prison setting. Suicide probability in the prison setting was significantly higher than in the outpatient setting (p<0.001). The severity of general symptoms strongly correlated with suicide probability. Trait mindfulness, but not reasons-for-living beliefs, had a mediating effect on the relationship between the severity of general symptoms and suicide probability. Fear of social disapproval, survival and coping beliefs, and child-related concerns significantly predicted suicide probability (p<0.001). Discussion: It could be suggested that trait mindfulness was more effective in reducing suicide probability than beliefs about reasons for living in individuals with substance abuse or dependence disorders. The severity of general symptoms should be regarded as an important risk factor for suicide probability. PMID:26005482

  15. Objective probability-like things with and without objective indeterminism

    NASA Astrophysics Data System (ADS)

    Szabó, László E.

    I shall argue that there is no such property of an event as its "probability". This is why standard interpretations cannot give a sound definition in empirical terms of what "probability" is, and this is why empirical sciences like physics can manage without such a definition. "Probability" is a collective term, the meaning of which varies from context to context: it means different dimensionless [0, 1]-valued physical quantities characterising the different particular situations. In other words, probability is a reducible concept, supervening on physical quantities characterising the state of affairs corresponding to the event in question. On the other hand, however, these "probability-like" physical quantities correspond to objective features of the physical world, and are objectively related to measurable quantities like relative frequencies of physical events based on finite samples, no matter whether the world is objectively deterministic or indeterministic.

  16. THE BLACK HOLE FORMATION PROBABILITY

    SciTech Connect

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a star of a given zero-age main sequence (ZAMS) mass (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that a probabilistic description of NS versus BH formation may instead be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.
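
    As an illustration of how such a probabilistic description could be used, the sketch below adopts a purely hypothetical parametric form for P_BH(M_ZAMS) (a smooth step around ~20 solar masses) and draws BH/NS outcomes for a toy stellar population. The functional form and numbers are invented, not the paper's constrained result.

```python
# Toy probabilistic BH-formation draw for a Salpeter-like population of massive stars.
import numpy as np

rng = np.random.default_rng(3)

def p_bh(m_zams, m_half=20.0, width=3.0):
    """Hypothetical probability that a star of the given ZAMS mass [M_sun] forms a BH."""
    return 1.0 / (1.0 + np.exp(-(m_zams - m_half) / width))

# Inverse-CDF sampling of ZAMS masses from a Salpeter-like IMF between 8 and 100 M_sun.
alpha, m_lo, m_hi = 2.35, 8.0, 100.0
u = rng.uniform(size=100_000)
masses = (m_lo ** (1 - alpha) + u * (m_hi ** (1 - alpha) - m_lo ** (1 - alpha))) ** (1 / (1 - alpha))

makes_bh = rng.uniform(size=masses.size) < p_bh(masses)
print(f"fraction of 8-100 M_sun stars forming BHs (toy model): {makes_bh.mean():.2f}")
```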

  17. The Black Hole Formation Probability

    NASA Astrophysics Data System (ADS)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a star of a given zero-age main sequence (ZAMS) mass (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that a probabilistic description of NS versus BH formation may instead be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  18. Probability distributions for explaining hydrological losses in South Australian catchments

    NASA Astrophysics Data System (ADS)

    Gamage, S. H. P. W.; Hewa, G. A.; Beecham, S.

    2013-11-01

    Accurate estimation of hydrological losses is required for making vital decisions in design applications that are based on design rainfall models and rainfall-runoff models. The use of single representative values of hydrological losses, despite their wide variability, is common practice, especially in Australian studies. This practice leads to issues such as over- or underestimation of design floods. The probability distribution method is potentially a better technique for describing losses. However, a lack of understanding of how losses are distributed can limit the use of this technique. This paper aims to identify a probability distribution function that can successfully describe the hydrological losses of a catchment of interest. The paper explains the systematic process of identifying probability distribution functions, the problems faced during the distribution fitting process and a new generalised method for testing the adequacy of fitted distributions. The goodness-of-fit of the fitted distributions is examined using the Anderson-Darling test and the Q-Q plot method, and the errors associated with quantile estimation are quantified by estimating the bias and mean square error (MSE). A two-parameter gamma distribution was identified as one that successfully describes initial loss (IL) data for the selected catchments. Further, non-parametric standardised distributions that describe both IL and continuing loss data are also identified. This paper will provide a significant contribution to the Australian Rainfall and Runoff (ARR) guidelines that are currently being updated, by improving understanding of hydrological losses in South Australian catchments. More importantly, this study provides new knowledge on how IL in a catchment is characterised.
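
    A hedged sketch of the distribution-fitting step is shown below: a two-parameter gamma distribution (location fixed at zero) is fitted to an initial-loss record and checked for goodness of fit. The loss values are synthetic, and a Kolmogorov-Smirnov test is used here as a simpler stand-in for the Anderson-Darling test and Q-Q plots used in the study.

```python
# Fit a two-parameter gamma distribution to (synthetic) initial-loss data and check the fit.
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
initial_loss_mm = rng.gamma(shape=2.0, scale=8.0, size=60)   # synthetic IL record [mm]

# Two-parameter gamma: fix the location at zero so only shape and scale are estimated.
shape, loc, scale = stats.gamma.fit(initial_loss_mm, floc=0)

# Note: using fitted parameters makes this KS p-value somewhat optimistic.
ks = stats.kstest(initial_loss_mm, "gamma", args=(shape, loc, scale))

print(f"fitted gamma: shape = {shape:.2f}, scale = {scale:.2f} mm")
print(f"KS statistic = {ks.statistic:.3f}, p-value = {ks.pvalue:.2f}")
```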

  19. Statistical Validation of Normal Tissue Complication Probability Models

    SciTech Connect

    Xu Chengjian; Schaaf, Arjen van der; Veld, Aart A. van't; Langendijk, Johannes A.; Schilstra, Cornelis

    2012-09-01

    Purpose: To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. Methods and Materials: A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Results: Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Conclusion: Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use.
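
    In the spirit of the validation described above, the sketch below runs a permutation test for an L1-penalized (LASSO-type) logistic NTCP model: the model is refitted on randomly permuted outcome labels and the real model's AUC is compared with the resulting null distribution. The data are simulated, and for brevity the AUC here is in-sample rather than double cross-validated as in the paper.

```python
# Permutation test for the apparent performance of an L1-penalized logistic NTCP model.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(5)
n, p = 150, 10
X = rng.normal(size=(n, p))                       # e.g. dose-volume and clinical predictors
y = rng.binomial(1, 1 / (1 + np.exp(-(1.2 * X[:, 0] - 0.8 * X[:, 1]))))   # xerostomia yes/no

def fit_auc(X, y):
    model = LogisticRegression(penalty="l1", solver="liblinear", C=0.5).fit(X, y)
    return roc_auc_score(y, model.decision_function(X))

observed = fit_auc(X, y)
null = np.array([fit_auc(X, rng.permutation(y)) for _ in range(200)])
p_value = (1 + np.sum(null >= observed)) / (1 + null.size)
print(f"apparent AUC = {observed:.2f}, permutation p-value = {p_value:.3f}")
```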

  20. Quantification of effective exoelectrogens by most probable number (MPN) in a microbial fuel cell.

    PubMed

    Heidrich, Elizabeth S; Curtis, Thomas P; Woodcock, Stephen; Dolfing, Jan

    2016-10-01

    The objective of this work was to quantify the number of exoelectrogens in wastewater capable of producing current in a microbial fuel cell by adapting the classical most probable number (MPN) methodology, using current production as the end point. Inoculating a series of microbial fuel cells with various dilutions of domestic wastewater, with acetate as the test substrate, yielded an apparent number of exoelectrogens of 17 per ml. Using current as a proxy for activity, the apparent exoelectrogen growth rate was 0.03 h⁻¹. With starch or wastewater as more complex test substrates, similar apparent growth rates were obtained, but the apparent MPN-based numbers of exoelectrogens in wastewater were significantly lower, probably because, in contrast to acetate, complex substrates require complex food chains to deliver the electrons to the electrodes. Consequently, the apparent MPN is a function of the combined probabilities of members of the food chain being present.
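
    A hedged sketch of the MPN calculation behind such an estimate is shown below: the maximum-likelihood cell density is obtained from the pattern of positive (current-producing) and negative replicates across a dilution series. The replicate counts and volumes are invented examples, not the study's data.

```python
# Maximum-likelihood most-probable-number (MPN) estimate from a dilution series.
import numpy as np
from scipy.optimize import minimize_scalar

# (sample volume per reactor [ml], number of reactors, number producing current) - hypothetical
dilution_series = [(10.0, 3, 3), (1.0, 3, 3), (0.1, 3, 2), (0.01, 3, 0)]

def neg_log_likelihood(density):
    """Negative log-likelihood of the positive/negative pattern given exoelectrogens per ml."""
    ll = 0.0
    for volume, n_total, n_pos in dilution_series:
        p_pos = 1.0 - np.exp(-density * volume)      # P(at least one exoelectrogen in reactor)
        ll += n_pos * np.log(p_pos) + (n_total - n_pos) * np.log(1.0 - p_pos)
    return -ll

mpn = minimize_scalar(neg_log_likelihood, bounds=(1e-3, 100.0), method="bounded").x
print(f"MPN estimate ~= {mpn:.1f} exoelectrogens per ml")
```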