Science.gov

Sample records for probability based significance

  1. Modulation Based on Probability Density Functions

    NASA Technical Reports Server (NTRS)

    Williams, Glenn L.

    2009-01-01

    A proposed method of modulating a sinusoidal carrier signal to convey digital information involves the use of histograms representing probability density functions (PDFs) that characterize samples of the signal waveform. The method is based partly on the observation that when a waveform is sampled (whether by analog or digital means) over a time interval at least as long as one half cycle of the waveform, the samples can be sorted by frequency of occurrence, thereby constructing a histogram representing a PDF of the waveform during that time interval.
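
    A minimal numerical sketch of the observation above (sampling rate, carrier frequency, and bin count are assumed for illustration only, not taken from the NTRS report): sampling a sinusoid over at least one half cycle and sorting the samples by frequency of occurrence yields a histogram that approximates the waveform's PDF.

      import numpy as np

      # Sample one full cycle (>= the half cycle the abstract requires) of a
      # sinusoidal carrier and build a histogram approximating its PDF.
      fs = 10_000.0   # sampling rate in Hz (assumed)
      f0 = 100.0      # carrier frequency in Hz (assumed)
      t = np.arange(0.0, 1.0 / f0, 1.0 / fs)
      samples = np.sin(2.0 * np.pi * f0 * t)

      # Sorting samples into amplitude bins gives the empirical PDF; for a pure
      # sinusoid it is the familiar arcsine shape, peaked near the extremes.
      pdf, edges = np.histogram(samples, bins=20, range=(-1.0, 1.0), density=True)
      for lo, hi, p in zip(edges[:-1], edges[1:], pdf):
          print(f"[{lo:+.2f}, {hi:+.2f}): {p:.3f}")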

  2. The statistical significance of error probability as determined from decoding simulations for long codes

    NASA Technical Reports Server (NTRS)

    Massey, J. L.

    1976-01-01

    The very low error probability obtained with long error-correcting codes results in a very small number of observed errors in simulation studies of practical size and renders the usual confidence interval techniques inapplicable to the observed error probability. A natural extension of the notion of a 'confidence interval' is made and applied to such determinations of error probability by simulation. An example is included to show the surprisingly great significance of as few as two decoding errors in a very large number of decoding trials.
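
    As a hedged illustration of why so few observed errors can be so informative (this is the standard exact binomial bound, not necessarily Massey's specific extension of the confidence-interval notion), the sketch below computes an upper confidence bound on the true error probability from a small number of decoding errors.

      from scipy.stats import beta

      def error_prob_upper_bound(errors: int, trials: int, confidence: float = 0.95) -> float:
          # Exact (Clopper-Pearson) upper confidence bound on the true error
          # probability given `errors` observed in `trials` decoding simulations.
          if errors >= trials:
              return 1.0
          return beta.ppf(confidence, errors + 1, trials - errors)

      # Two decoding errors in ten million trials already pin the error
      # probability below roughly 6.3e-7 at 95% confidence.
      print(error_prob_upper_bound(2, 10_000_000))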

  3. Direct Updating of an RNA Base-Pairing Probability Matrix with Marginal Probability Constraints

    PubMed Central

    2012-01-01

    A base-pairing probability matrix (BPPM) stores the probabilities for every possible base pair in an RNA sequence and has been used in many algorithms in RNA informatics (e.g., RNA secondary structure prediction and motif search). In this study, we propose a novel algorithm to perform iterative updates of a given BPPM, satisfying marginal probability constraints that are (approximately) given by recently developed biochemical experiments, such as SHAPE, PAR, and FragSeq. The method is easily implemented and is applicable to common models for RNA secondary structures, such as energy-based or machine-learning–based models. In this article, we focus mainly on the details of the algorithms, although preliminary computational experiments will also be presented. PMID:23210474
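
    The sketch below is a generic iterative rescaling of a BPPM toward given per-base pairing marginals; it only illustrates the flavor of the constraint problem and is not the authors' update algorithm. The matrix and target values are toy numbers.

      import numpy as np

      def rescale_bppm(bppm, target_marginals, n_iter=100):
          # Generic sketch (not the paper's algorithm): iteratively rescale a
          # symmetric base-pairing probability matrix so that each row sum (the
          # probability that base i is paired) approaches an experimentally
          # derived target, e.g. from SHAPE-type reactivities.
          p = bppm.copy()
          for _ in range(n_iter):
              row_sums = p.sum(axis=1)
              scale = np.where(row_sums > 0, target_marginals / row_sums, 0.0)
              # Geometric mean of row/column factors keeps the matrix symmetric.
              p *= np.sqrt(np.outer(scale, scale))
              np.clip(p, 0.0, 1.0, out=p)
          return p

      # Toy 4-nucleotide example with assumed numbers.
      bppm = np.array([[0.0, 0.0, 0.0, 0.6],
                       [0.0, 0.0, 0.5, 0.0],
                       [0.0, 0.5, 0.0, 0.0],
                       [0.6, 0.0, 0.0, 0.0]])
      targets = np.array([0.9, 0.2, 0.2, 0.9])
      print(rescale_bppm(bppm, targets).round(3))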

  4. Vehicle Detection Based on Probability Hypothesis Density Filter

    PubMed Central

    Zhang, Feihu; Knoll, Alois

    2016-01-01

    In the past decade, vehicle detection has improved significantly. Using cameras, vehicles can be detected within Regions of Interest (ROI) in complex environments. However, vision techniques often suffer from false positives and a limited field of view. In this paper, a LiDAR-based vehicle detection approach using the Probability Hypothesis Density (PHD) filter is proposed. The proposed approach consists of two phases: a hypothesis generation phase to detect potential objects and a hypothesis verification phase to classify objects. The performance of the proposed approach is evaluated in complex scenarios and compared with the state of the art. PMID:27070621

  5. Vehicle Detection Based on Probability Hypothesis Density Filter.

    PubMed

    Zhang, Feihu; Knoll, Alois

    2016-01-01

    In the past decade, vehicle detection has improved significantly. Using cameras, vehicles can be detected within Regions of Interest (ROI) in complex environments. However, vision techniques often suffer from false positives and a limited field of view. In this paper, a LiDAR-based vehicle detection approach using the Probability Hypothesis Density (PHD) filter is proposed. The proposed approach consists of two phases: a hypothesis generation phase to detect potential objects and a hypothesis verification phase to classify objects. The performance of the proposed approach is evaluated in complex scenarios and compared with the state of the art. PMID:27070621

  6. Significance probability mapping: an aid in the topographic analysis of brain electrical activity.

    PubMed

    Duffy, F H; Bartels, P H; Burchfiel, J L

    1981-05-01

    We illustrate the application of significance probability mapping (SPM) to the analysis of topographic maps of spectrally analyzed EEG and visual evoked potential (VEP) activity from patients with brain tumors, boys with dyslexia, and control subjects. When the VEP topographic plots of tumor patients were displayed as the number of standard deviations from a reference mean, more subjects were correctly identified than by inspection of the underlying raw data. When topographic plots of EEG alpha activity obtained while listening to speech or music were compared by t statistic to plots of resting alpha activity, regions of cortex presumably activated by speech or music were delineated. Different regions were defined in dyslexic boys and controls. We propose that SPM will prove valuable in the regional localization of normal and abnormal functions in other clinical situations. PMID:6165544
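
    A minimal sketch of the SPM idea with synthetic numbers (array shapes and values are assumed, not from the study): each electrode's value is expressed as the number of control-group standard deviations from the control mean, and large deviations flag candidate abnormal regions.

      import numpy as np

      rng = np.random.default_rng(0)
      n_controls, n_electrodes = 20, 16
      controls = rng.normal(10.0, 2.0, size=(n_controls, n_electrodes))  # e.g. alpha power
      patient = rng.normal(10.0, 2.0, size=n_electrodes)
      patient[5] += 8.0                                                   # one deviant region

      # Significance probability map: deviation from the reference mean in SD units.
      z_map = (patient - controls.mean(axis=0)) / controls.std(axis=0, ddof=1)
      print(np.round(z_map, 2))   # values beyond ~2-3 SDs stand out topographically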

  7. Probability-based nitrate contamination map of groundwater in Kinmen.

    PubMed

    Liu, Chen-Wuing; Wang, Yeuh-Bin; Jang, Cheng-Shin

    2013-12-01

    Groundwater supplies over 50% of the drinking water in Kinmen. Approximately 16.8% of groundwater samples in Kinmen exceed the drinking water quality standard (DWQS) for NO3(-)-N (10 mg/L). Residents who drink groundwater with high nitrate levels face a potential health risk. To formulate an effective water quality management plan and ensure safe drinking water in Kinmen, a detailed spatial distribution of nitrate-N in groundwater is a prerequisite. The aim of this study is to develop an efficient scheme for evaluating the spatial distribution of nitrate-N in residential well water using a logistic regression (LR) model, and a probability-based nitrate-N contamination map of Kinmen is constructed. The LR model predicts the binary probability that groundwater nitrate-N concentrations exceed the DWQS from simple measurement variables, including sampling season, soil type, water table depth, pH, EC, DO, and Eh. The results reveal that three statistically significant explanatory variables (soil type, pH, and EC) are selected by the forward stepwise LR analysis. The overall rate of correct classification reaches 92.7%. The highest probabilities of nitrate-N contamination occur in the central zone, indicating that groundwater in the central zone should not be used for drinking purposes. Furthermore, a handy EC-pH-probability curve for nitrate-N exceeding the DWQS threshold was developed, which can be used for preliminary screening of nitrate-N contamination in Kinmen groundwater. This study recommends that the local agency implement best management practice strategies to control nonpoint nitrogen sources and carry out systematic monitoring of groundwater quality in residential wells in the high nitrate-N contamination zones. PMID:23892715
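
    The sketch below shows the kind of logistic-regression exceedance model the abstract describes, using the variable names it mentions (soil type, pH, EC) but entirely synthetic data rather than the Kinmen measurements.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(1)
      n = 200
      X = np.column_stack([
          rng.integers(0, 3, n),        # soil type, encoded 0/1/2
          rng.normal(7.0, 0.5, n),      # pH
          rng.normal(800.0, 200.0, n),  # EC (uS/cm)
      ])
      # Synthetic labels: exceedance made more likely at high EC and low pH.
      logit = 0.004 * (X[:, 2] - 800.0) - 1.5 * (X[:, 1] - 7.0) + 0.3 * X[:, 0]
      y = (rng.random(n) < 1.0 / (1.0 + np.exp(-logit))).astype(int)

      model = LogisticRegression(max_iter=1000).fit(X, y)
      new_well = np.array([[1, 6.8, 1100.0]])   # hypothetical well measurements
      print("P(nitrate-N > 10 mg/L) =", round(model.predict_proba(new_well)[0, 1], 3))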

  8. PROBABILITY BASED CORROSION CONTROL FOR WASTE TANKS - PART II

    SciTech Connect

    Hoffman, E.; Edwards, T.

    2010-12-09

    As part of an ongoing study to evaluate the discontinuity in the corrosion controls at the SRS tank farm, a study was conducted this year to assess the minimum nitrite concentrations required at nitrate concentrations below 1 molar (see Figure 1). Current controls on the tank farm solution chemistry are in place to prevent the initiation and propagation of pitting and stress corrosion cracking in the primary steel waste tanks. The controls are based upon a series of experiments performed with simulated solutions on materials used for construction of the tanks, namely ASTM A537 carbon steel (A537). During FY09, an experimental program was undertaken to investigate the risk associated with reducing the minimum molar nitrite concentration required to confidently inhibit pitting in dilute solutions (i.e., less than 1 molar nitrate). The experimental results and conclusions herein provide a statistical basis to quantify the probability of pitting for the tank wall exposed to various solutions with dilute concentrations of nitrate and nitrite. Understanding the probability of pitting will allow the facility to make tank-specific, risk-based decisions for chemistry control. Based on previous electrochemical testing, a statistical test matrix was developed to refine and solidify the application of the statistical mixture/amount model to corrosion of A537 steel. A mixture/amount model was identified based on statistical analysis of recent and historically collected electrochemical data. This model provides a more complex relationship between the nitrate and nitrite concentrations and the probability of pitting than is represented by the model underlying the current chemistry control program, and its use may provide a technical basis for using less nitrite to inhibit pitting at concentrations below 1 molar nitrate. FY09 results fit within the mixture/amount model and further refine the nitrate regime in which the model is applicable. The combination of visual observations and cyclic

  9. Probability model for molecular recognition in biological receptor repertoires: significance to the olfactory system.

    PubMed

    Lancet, D; Sadovsky, E; Seidemann, E

    1993-04-15

    A generalized phenomenological model is presented for stereospecific recognition between biological receptors and their ligands. We ask what is the distribution of binding constants psi(K) between an arbitrary ligand and members of a large receptor repertoire, such as immunoglobulins or olfactory receptors. For binding surfaces with B potential subsites and S different types of subsite configurations, the number of successful elementary interactions obeys a binomial distribution. The discrete probability function psi(K) is then derived with assumptions on alpha, the free energy contribution per elementary interaction. The functional form of psi(K) may be universal, although the parameter values could vary for different ligand types. An estimate of the parameter values of psi(K) for iodovanillin, an analog of odorants and immunological haptens, is obtained by equilibrium dialysis experiments with nonimmune antibodies. Based on a simple relationship, predicted by the model, between the size of a receptor repertoire and its average maximal affinity toward an arbitrary ligand, the size of the olfactory receptor repertoire (Nolf) is calculated as 300-1000, in very good agreement with recent molecular biological studies. A very similar estimate, Nolf = 500, is independently derived by relating a theoretical distribution of maxima for psi(K) with published human olfactory threshold variations. The present model also has implications for the question of olfactory coding and for the analysis of specific anosmias, genetic deficits in perceiving particular odorants. More generally, the proposed model provides a better understanding of ligand specificity in biological receptors and could help in understanding their evolution. PMID:8475121

  10. Probability model for molecular recognition in biological receptor repertoires: significance to the olfactory system.

    PubMed Central

    Lancet, D; Sadovsky, E; Seidemann, E

    1993-01-01

    A generalized phenomenological model is presented for stereospecific recognition between biological receptors and their ligands. We ask what is the distribution of binding constants psi(K) between an arbitrary ligand and members of a large receptor repertoire, such as immunoglobulins or olfactory receptors. For binding surfaces with B potential subsites and S different types of subsite configurations, the number of successful elementary interactions obeys a binomial distribution. The discrete probability function psi(K) is then derived with assumptions on alpha, the free energy contribution per elementary interaction. The functional form of psi(K) may be universal, although the parameter values could vary for different ligand types. An estimate of the parameter values of psi(K) for iodovanillin, an analog of odorants and immunological haptens, is obtained by equilibrium dialysis experiments with nonimmune antibodies. Based on a simple relationship, predicted by the model, between the size of a receptor repertoire and its average maximal affinity toward an arbitrary ligand, the size of the olfactory receptor repertoire (Nolf) is calculated as 300-1000, in very good agreement with recent molecular biological studies. A very similar estimate, Nolf = 500, is independently derived by relating a theoretical distribution of maxima for psi(K) with published human olfactory threshold variations. The present model also has implications for the question of olfactory coding and for the analysis of specific anosmias, genetic deficits in perceiving particular odorants. More generally, the proposed model provides a better understanding of ligand specificity in biological receptors and could help in understanding their evolution. PMID:8475121
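
    Purely as an illustration of the binomial ingredient described above (the values of B, S, and alpha are assumed, not the paper's estimates), the sketch below tabulates a discrete psi over match counts and the corresponding binding constants.

      import numpy as np
      from scipy.stats import binom

      # With B potential subsites and S subsite types, the number of matching
      # elementary interactions m is taken as Binomial(B, 1/S); each match is
      # assumed to contribute a fixed free energy, so K grows exponentially in m.
      B, S = 10, 4
      alpha = 1.5                      # assumed free-energy gain per interaction, in kT
      m = np.arange(B + 1)
      psi = binom.pmf(m, B, 1.0 / S)   # discrete probability function over match counts
      K = np.exp(alpha * m)            # corresponding binding constants (arbitrary units)

      for mi, Ki, pi in zip(m, K, psi):
          print(f"m={mi:2d}  K={Ki:10.1f}  psi={pi:.4f}")
      # The expected best affinity in a repertoire of N receptors grows with N,
      # which is the relationship the authors invert to estimate repertoire size.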

  11. PROBABILITY BASED CORROSION CONTROL FOR LIQUID WASTE TANKS - PART III

    SciTech Connect

    Hoffman, E.; Edwards, T.

    2010-12-09

    The liquid waste chemistry control program is designed to reduce the occurrence of pitting corrosion on tank walls. The chemistry control program has been implemented, in part, by applying engineering judgment safety factors to experimental data. However, the simple application of a general safety factor can result in the use of excessive corrosion-inhibiting agents. The required use of excess corrosion inhibitors can be costly for tank maintenance, waste processing, and future tank closure. It is proposed that a probability-based approach can be used to quantify the risk associated with the chemistry control program. This approach can lead to the application of tank-specific chemistry control programs, reducing overall costs associated with overly conservative use of inhibitor. Furthermore, when using nitrite as an inhibitor, the current chemistry control program is based on a linear model of increased aggressive species requiring increased protective species. This linear model was primarily supported by experimental data obtained from dilute solutions with nitrate concentrations less than 0.6 M, but is used to produce the current chemistry control program up to 1.0 M nitrate. Therefore, in the nitrate space between 0.6 and 1.0 M, the current control limit is based on the assumption that the linear model developed from data in the <0.6 M region is applicable in the 0.6-1.0 M region. Due to this assumption, further investigation of the nitrate region of 0.6 M to 1.0 M has potential for significant inhibitor reduction, while maintaining the same level of corrosion risk associated with the current chemistry control program. Ongoing studies have been conducted in FY07, FY08, FY09 and FY10 to evaluate the corrosion controls at the SRS tank farm and to assess the minimum nitrite concentrations needed to inhibit pitting in ASTM A537 carbon steel below 1.0 molar nitrate. The experimentation from FY08 suggested a non-linear model known as the mixture/amount model could be used to predict

  12. ProbOnto: ontology and knowledge base of probability distributions

    PubMed Central

    Swat, Maciej J.; Grenon, Pierre; Wimalaratne, Sarala

    2016-01-01

    Motivation: Probability distributions play a central role in mathematical and statistical modelling. The encoding, annotation and exchange of such models could be greatly simplified by a resource providing a common reference for the definition of probability distributions. Although some resources exist, no suitably detailed ontology or database allowing programmatic access is available. Results: ProbOnto is an ontology-based knowledge base of probability distributions, featuring more than 80 uni- and multivariate distributions with their defining functions, characteristics, relationships and re-parameterization formulas. It can be used for model annotation and facilitates the encoding of distribution-based models, related functions and quantities. Availability and Implementation: http://probonto.org Contact: mjswat@ebi.ac.uk Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27153608

  13. Lake Superior Phytoplankton Characterization from the 2006 Probability Based Survey

    EPA Science Inventory

    We conducted a late summer probability based survey of Lake Superior in 2006 which consisted of 52 sites stratified across 3 depth zones. As part of this effort, we collected composite phytoplankton samples from the epilimnion and the fluorescence maxima (Fmax) at 29 of the site...

  14. Westward-derived conglomerates in Moenkopi formation of Southeastern California, and their probable tectonic significance

    SciTech Connect

    Walker, J.D.; Burchfiel, B.C.; Royden, L.H.

    1983-02-01

    The upper part of the Moenkopi Formation in the Northern Clark Mountains, Southeastern California, contains conglomerate beds whose clasts comprise igneous, metamorphic, and sedimentary rocks. Metamorphic clasts include foliated granite, meta-arkose, and quartzite, probably derived from older Precambrian basement and younger Precambrian clastic rocks. Volcanic clasts are altered plagioclase-bearing rocks, and sedimentary clasts were derived from Paleozoic miogeoclinal rocks. Paleocurrent data indicate that the clasts had a source to the southwest. An age of late Early or early Middle Triassic has been tentatively assigned to these conglomerates. These conglomerates indicate that Late Permian to Early Triassic deformational events in this part of the orogen affected rocks much farther east than has been previously recognized.

  15. Assessing magnitude probability distribution through physics-based rupture scenarios

    NASA Astrophysics Data System (ADS)

    Hok, Sébastien; Durand, Virginie; Bernard, Pascal; Scotti, Oona

    2016-04-01

    When faced with a complex network of faults in a seismic hazard assessment study, the first question raised is to what extent the fault network is connected and what is the probability that an earthquake ruptures a series of neighboring segments simultaneously. Physics-based dynamic rupture models can provide useful insight as to which rupture scenario is most probable, provided that an exhaustive exploration of the variability of the input parameters necessary for the dynamic rupture modeling is accounted for. Given the random nature of some parameters (e.g. hypocenter location) and the limitation of our knowledge, we used a logic-tree approach in order to build the different scenarios and to be able to associate them with a probability. The methodology is applied to the three main faults located along the southern coast of the West Corinth rift. Our logic tree takes into account different hypotheses for: fault geometry, location of hypocenter, seismic cycle position, and fracture energy on the fault plane. The variability of these parameters is discussed, and the different values tested are weighted accordingly. 64 scenarios resulting from 64 parameter combinations were included. Sensitivity studies were done to illustrate which parameters control the variability of the results. Given the weights of the input parameters, we evaluated the probability of obtaining a full network break to be 15%, while single segment ruptures represent 50% of the scenarios. This rupture scenario probability distribution along the three faults of the West Corinth rift fault network can then be used as input to a seismic hazard calculation.
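
    A minimal sketch of the logic-tree bookkeeping described above: each rupture scenario is one combination of branch choices, and its probability is the product of the branch weights. Branch names and weights below are placeholders, not the values used for the West Corinth rift.

      from itertools import product

      logic_tree = {
          "geometry":        {"planar": 0.5, "listric": 0.5},
          "hypocenter":      {"west": 0.25, "center": 0.5, "east": 0.25},
          "cycle_position":  {"early": 0.5, "late": 0.5},
          "fracture_energy": {"low": 0.5, "high": 0.5},
      }

      scenarios = []
      for combo in product(*(branches.items() for branches in logic_tree.values())):
          names = tuple(name for name, _ in combo)
          weight = 1.0
          for _, w in combo:
              weight *= w
          scenarios.append((names, weight))

      # Branch weights multiply to a proper probability distribution over scenarios.
      assert abs(sum(w for _, w in scenarios) - 1.0) < 1e-12
      # The probability of, e.g., a full-network rupture is then the summed weight
      # of the scenarios whose dynamic simulation produced that outcome.
      print(len(scenarios), "scenarios;", scenarios[0])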

  16. Investigation of factors of probable significance in the pathogenesis of pneumonic pasteurellosis in cattle.

    PubMed Central

    Thomson, R G; Chander, S; Savan, M; Fox, M L

    1975-01-01

    Six groups of ten beef calves six to eight months of age were shipped from western Canada and observed untreated for one week after arrival. The following parameters were measured daily: body temperature, plasma fibrinogen, nasal bacterial mean colony counts of Pasteurella hemolytica and Pasteurella multocida, total and differential leukocyte counts, and packed cell volumes; serum and nasal antibody titres to P. hemolytica and parainfluenza-3 virus were measured twice during the week. The lungs from 44 of the calves were obtained at post mortem and given a numerical score based on the degree of pneumonia present. Animals were designated SICK and WELL according to body temperature and plasma fibrinogen. The SICK animals had higher nasal mean colony counts of P. hemolytica than the WELL animals. The SICK animals had lower levels of serum antibody to P. hemolytica than the WELL animals on day 1 but had a greater rise in titre over the week. Both groups were similar with regard to serum antibody to parainfluenza-3 virus, and there was little change in these titres. The SICK animals had a much greater degree of pneumonia than the WELL animals. The values of some of the parameters were combined with data from previously studied animals in order to provide a comparison of SICK and WELL with larger numbers of animals. PMID:164992

  17. QKD-based quantum private query without a failure probability

    NASA Astrophysics Data System (ADS)

    Liu, Bin; Gao, Fei; Huang, Wei; Wen, QiaoYan

    2015-10-01

    In this paper, we present a quantum-key-distribution (QKD)-based quantum private query (QPQ) protocol utilizing single-photon signals of multiple optical pulses. It maintains the advantages of QKD-based QPQ, i.e., it is easy to implement and loss tolerant. In addition, unlike previous QKD-based QPQ protocols, in our protocol the number of items an honest user obtains is always one and the failure probability is always zero. This characteristic not only improves the stability (in the sense that, ignoring noise and attacks, the protocol always succeeds), but also benefits the privacy of the database (since the database no longer reveals additional secrets to honest users). Furthermore, for the user's privacy, the proposed protocol is cheat sensitive, and for the security of the database, we obtain a theoretical upper bound on the information leaked by the database.

  18. Lung scans with significant perfusion defects limited to matching pleural effusions have a low probability of pulmonary embolism

    SciTech Connect

    Datz, F.L.; Bedont, R.A.; Taylor, A.

    1985-05-01

    Patients with a pleural effusion on chest x-ray often undergo a lung scan to exclude pulmonary embolism (PE). According to other studies, when the scan shows a perfusion defect equal in size to a radiographic abnormality on chest x-ray, the scan should be classified as indeterminate or intermediate probability for PE. However, since those studies dealt primarily with alveolar infiltrates rather than pleural effusions, the authors undertook a retrospective study to determine the probability of PE in patients with pleural effusion and a matching perfusion defect. The authors reviewed 451 scans and x-rays of patients studied for suspected PE. Of those, 53 had moderate or large perfusion defects secondary to pleural effusion without other significant (>25% of a segment) defects on the scan. Final diagnosis was confirmed by pulmonary angiography (16), thoracentesis (40), venography (11), other radiographic and laboratory studies, and clinical course. Of the 53 patients, only 2 had venous thrombotic disease: one had PE on pulmonary angiography, and the other had thrombophlebitis on venography. The remainder of the patients had effusions due to congestive heart failure (12), malignancy (12), infection (7), trauma (7), collagen vascular disease (7), sympathetic effusion (3) and unknown etiology (3). The authors conclude that lung scans with significant perfusion defects limited to matching pleural effusions on chest x-ray have a low probability for PE.

  19. The conditional risk probability-based seawall height design method

    NASA Astrophysics Data System (ADS)

    Yang, Xing; Hu, Xiaodong; Li, Zhiqing

    2015-11-01

    The required seawall height is usually determined from a combination of wind speed (or wave height) and still water level for a specified return period, e.g., the 50-year return period wind speed and the 50-year return period still water level. In reality, the two variables may be only partially correlated, which can lead to over-design (and excess cost) of seawall structures. The return period used for seawall design depends on the economy, society and natural environment of the region, which means a specified risk level of overtopping or damage of a seawall structure is usually allowed. The aim of this paper is to present a conditional risk probability-based seawall height design method which incorporates the correlation of the two variables. For purposes of demonstration, wind speeds and water levels collected from Jiangsu, China are analyzed. The results show this method can improve the accuracy of seawall height design.
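
    The sketch below illustrates the motivation with assumed numbers (not the Jiangsu data): under a Gaussian copula, the probability that wind speed and still water level both exceed their 50-year values depends strongly on their correlation, which is exactly what an independent-variable design ignores.

      import numpy as np
      from scipy.stats import multivariate_normal, norm

      p_marginal = 1.0 / 50.0            # annual exceedance probability of each 50-yr value
      z = norm.ppf(1.0 - p_marginal)     # corresponding standard-normal threshold

      def joint_exceedance(rho):
          cov = [[1.0, rho], [rho, 1.0]]
          # P(Z1 > z, Z2 > z) equals the bivariate normal CDF evaluated at (-z, -z)
          # by symmetry of the centered Gaussian.
          return multivariate_normal(mean=[0.0, 0.0], cov=cov).cdf([-z, -z])

      print("independent:", p_marginal**2)
      print("rho = 0.0  :", joint_exceedance(0.0))
      print("rho = 0.6  :", joint_exceedance(0.6))   # partial correlation raises joint risk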

  20. Gesture Recognition Based on the Probability Distribution of Arm Trajectories

    NASA Astrophysics Data System (ADS)

    Wan, Khairunizam; Sawada, Hideyuki

    The use of human motions for the interaction between humans and computers is becoming an attractive alternative to verbal media, especially through the visual interpretation of human body motion. In particular, hand gestures serve as a non-verbal medium through which humans communicate with and control machines. This paper introduces a 3D motion measurement of the human upper body for the purpose of gesture recognition, which is based on the probability distribution of arm trajectories. In this study, by examining the characteristics of the arm trajectories given by a signer, motion features are selected and classified by using a fuzzy technique. Experimental results show that the features extracted from arm trajectories work effectively for the recognition of dynamic human gestures and perform well in classifying various gesture patterns.

  1. Image-based camera motion estimation using prior probabilities

    NASA Astrophysics Data System (ADS)

    Sargent, Dusty; Park, Sun Young; Spofford, Inbar; Vosburgh, Kirby

    2011-03-01

    Image-based camera motion estimation from video or still images is a difficult problem in the field of computer vision. Many algorithms have been proposed for estimating intrinsic camera parameters, detecting and matching features between images, calculating extrinsic camera parameters based on those features, and optimizing the recovered parameters with nonlinear methods. These steps in the camera motion inference process all face challenges in practical applications: locating distinctive features can be difficult in many types of scenes given the limited capabilities of current feature detectors, camera motion inference can easily fail in the presence of noise and outliers in the matched features, and the error surfaces in optimization typically contain many suboptimal local minima. The problems faced by these techniques are compounded when they are applied to medical video captured by an endoscope, which presents further challenges such as non-rigid scenery and severe barrel distortion of the images. In this paper, we study these problems and propose the use of prior probabilities to stabilize camera motion estimation for the application of computing endoscope motion sequences in colonoscopy. Colonoscopy presents a special case for camera motion estimation in which it is possible to characterize typical motion sequences of the endoscope. As the endoscope is restricted to move within a roughly tube-shaped structure, forward/backward motion is expected, with only small amounts of rotation and horizontal movement. We formulate a probabilistic model of endoscope motion by maneuvering an endoscope and attached magnetic tracker through a synthetic colon model and fitting a distribution to the observed motion of the magnetic tracker. This model enables us to estimate the probability of the current endoscope motion given previously observed motion in the sequence. We add these prior probabilities into the camera motion calculation as an additional penalty term in RANSAC

  2. Naive Probability: Model-Based Estimates of Unique Events.

    PubMed

    Khemlani, Sangeet S; Lotstein, Max; Johnson-Laird, Philip N

    2015-08-01

    We describe a dual-process theory of how individuals estimate the probabilities of unique events, such as Hillary Clinton becoming U.S. President. It postulates that uncertainty is a guide to improbability. In its computer implementation, an intuitive system 1 simulates evidence in mental models and forms analog non-numerical representations of the magnitude of degrees of belief. This system has minimal computational power and combines evidence using a small repertoire of primitive operations. It resolves the uncertainty of divergent evidence for single events, for conjunctions of events, and for inclusive disjunctions of events, by taking a primitive average of non-numerical probabilities. It computes conditional probabilities in a tractable way, treating the given event as evidence that may be relevant to the probability of the dependent event. A deliberative system 2 maps the resulting representations into numerical probabilities. With access to working memory, it carries out arithmetical operations in combining numerical estimates. Experiments corroborated the theory's predictions. Participants concurred in estimates of real possibilities. They violated the complete joint probability distribution in the predicted ways, when they made estimates about conjunctions: P(A), P(B), P(A and B), disjunctions: P(A), P(B), P(A or B or both), and conditional probabilities P(A), P(B), P(B|A). They were faster to estimate the probabilities of compound propositions when they had already estimated the probabilities of each of their components. We discuss the implications of these results for theories of probabilistic reasoning. PMID:25363706

  3. A probability-based formula for calculating interobserver agreement

    PubMed Central

    Yelton, Ann R.; Wildman, Beth G.; Erickson, Marilyn T.

    1977-01-01

    Estimates of observer agreement are necessary to assess the acceptability of interval data. A common method for assessing observer agreement, per cent agreement, has several major weaknesses: it varies as a function of the frequency of the behavior recorded and of the inclusion or exclusion of agreements on nonoccurrences, and agreements that might be expected to occur by chance are not taken into account. An alternative method for assessing observer agreement, which determines the exact probability that the obtained number of agreements or better would have occurred by chance, is presented and explained. Agreements on both occurrences and nonoccurrences of behavior are considered in the calculation of this probability. PMID:16795541
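
    One concrete way to compute an exact chance probability of this kind is sketched below; it is a hedged illustration using a hypergeometric null (each observer's occurrence total held fixed, placements randomized), not necessarily the formula from the article. The numbers in the usage line are made up.

      from math import ceil
      from scipy.stats import hypergeom

      def chance_agreement_pvalue(n_intervals, occ_a, occ_b, agreements_observed):
          # Total agreements = (N - occ_a - occ_b) + 2 * overlap, where "overlap" is
          # the number of intervals both observers scored as an occurrence. Under
          # random placement, overlap is hypergeometric, so the tail probability of
          # the required overlap is the chance of doing at least this well by luck.
          needed_overlap = ceil((agreements_observed - n_intervals + occ_a + occ_b) / 2)
          return hypergeom.sf(needed_overlap - 1, n_intervals, occ_a, occ_b)

      # 60 intervals, observers scored 20 and 22 occurrences, 54 total agreements.
      print(chance_agreement_pvalue(60, 20, 22, 54))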

  4. Success Probability Analysis for Shuttle Based Microgravity Experiments

    NASA Technical Reports Server (NTRS)

    Liou, Ying-Hsin Andrew

    1996-01-01

    Presented in this report are the results of a data analysis of shuttle-based microgravity flight experiments. Potential factors were identified in the previous grant period, and in this period 26 factors were selected for data analysis. In this project, a degree-of-success rating was developed and used as the performance measure. 293 of the 391 experiments in the Lewis Research Center Microgravity Database were assigned degrees of success. Frequency analysis and analysis of variance were conducted to determine the significance of the factors that affect experiment success.

  5. A Selective Vision and Landmark based Approach to Improve the Efficiency of Position Probability Grid Localization

    NASA Astrophysics Data System (ADS)

    Loukianov, Andrey A.; Sugisaka, Masanori

    This paper presents a vision and landmark based approach to improve the efficiency of probability grid Markov localization for mobile robots. The proposed approach uses visual landmarks that can be detected by a rotating video camera on the robot. We assume that visual landmark positions in the map are known and that each landmark can be assigned to a certain landmark class. The method uses classes of observed landmarks and their relative arrangement to select regions in the robot posture space where the location probability density function is to be updated. Subsequent computations are performed only in these selected update regions thus the computational workload is significantly reduced. Probabilistic landmark-based localization method, details of the map and robot perception are discussed. A technique to compute the update regions and their parameters for selective computation is introduced. Simulation results are presented to show the effectiveness of the approach.

  6. Accelerating rejection-based simulation of biochemical reactions with bounded acceptance probability.

    PubMed

    Thanh, Vo Hong; Priami, Corrado; Zunino, Roberto

    2016-06-14

    Stochastic simulation of large biochemical reaction networks is often computationally expensive due to the disparate reaction rates and the high variability of the populations of chemical species. One approach to accelerate the simulation is to allow multiple reaction firings before performing an update, by assuming that reaction propensities change by only a negligible amount during a time interval. Species with small populations involved in the firings of fast reactions significantly affect both the performance and the accuracy of this simulation approach, and the problem is even worse when these small-population species participate in a large number of reactions. We present in this paper a new approximate algorithm to cope with this problem. It is based on bounding the acceptance probability of a reaction selected by the exact rejection-based simulation algorithm, which employs propensity bounds of reactions and the rejection-based mechanism to select next reaction firings. The reaction is guaranteed to be selected to fire with an acceptance rate greater than a predefined probability, and the selection becomes exact if that probability is set to one. Our new algorithm improves the computational cost of selecting the next reaction firing and reduces the cost of updating reaction propensities. PMID:27305997

  7. Accelerating rejection-based simulation of biochemical reactions with bounded acceptance probability

    NASA Astrophysics Data System (ADS)

    Thanh, Vo Hong; Priami, Corrado; Zunino, Roberto

    2016-06-01

    Stochastic simulation of large biochemical reaction networks is often computationally expensive due to the disparate reaction rates and the high variability of the populations of chemical species. One approach to accelerate the simulation is to allow multiple reaction firings before performing an update, by assuming that reaction propensities change by only a negligible amount during a time interval. Species with small populations involved in the firings of fast reactions significantly affect both the performance and the accuracy of this simulation approach, and the problem is even worse when these small-population species participate in a large number of reactions. We present in this paper a new approximate algorithm to cope with this problem. It is based on bounding the acceptance probability of a reaction selected by the exact rejection-based simulation algorithm, which employs propensity bounds of reactions and the rejection-based mechanism to select next reaction firings. The reaction is guaranteed to be selected to fire with an acceptance rate greater than a predefined probability, and the selection becomes exact if that probability is set to one. Our new algorithm improves the computational cost of selecting the next reaction firing and reduces the cost of updating reaction propensities.
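
    The sketch below shows only the exact rejection-based selection step that the approximate algorithm above builds on: a candidate reaction is drawn in proportion to an upper bound on its propensity and accepted with probability equal to the true propensity divided by that bound. The propensity values are toy numbers.

      import random

      def select_reaction(propensity_bounds, true_propensity, rng=random.random):
          total_bound = sum(propensity_bounds)
          while True:
              # Pick a candidate proportionally to its bound.
              r = rng() * total_bound
              acc = 0.0
              for j, b in enumerate(propensity_bounds):
                  acc += b
                  if r <= acc:
                      candidate = j
                      break
              # Accept with probability a_j / bound_j; the true propensity is only
              # evaluated for the candidate, which is the source of the speedup.
              if rng() * propensity_bounds[candidate] <= true_propensity(candidate):
                  return candidate

      # Toy usage with assumed propensities fluctuating below their bounds.
      bounds = [2.0, 5.0, 1.0]
      actual = [1.7, 4.2, 0.9]
      counts = [0, 0, 0]
      for _ in range(10_000):
          counts[select_reaction(bounds, lambda j: actual[j])] += 1
      print(counts)   # roughly proportional to the actual propensities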

  8. Quantitative Determination of the Probability of Multiple-Motor Transport in Bead-Based Assays.

    PubMed

    Li, Qiaochu; King, Stephen J; Gopinathan, Ajay; Xu, Jing

    2016-06-21

    With their longest dimension typically being less than 100 nm, molecular motors are significantly below the optical-resolution limit. Despite substantial advances in fluorescence-based imaging methodologies, labeling with beads remains critical for optical-trapping-based investigations of molecular motors. A key experimental challenge in bead-based assays is that the number of motors on a bead is not well defined. Particularly for single-molecule investigations, the probability of single- versus multiple-motor events has not been experimentally investigated. Here, we used bead travel distance as an indicator of multiple-motor transport and determined the lower-bound probability of bead transport by two or more motors. We limited the ATP concentration to increase our detection sensitivity for multiple- versus single-kinesin transport. Surprisingly, for all but the lowest motor number examined, our measurements exceeded estimations of a previous model by ≥2-fold. To bridge this apparent gap between theory and experiment, we derived a closed-form expression for the probability of bead transport by multiple motors, and constrained the only free parameter in this model using our experimental measurements. Our data indicate that kinesin extends to ∼57 nm during bead transport, suggesting that kinesin exploits its conformational flexibility to interact with microtubules at highly curved interfaces such as those present for vesicle transport in cells. To our knowledge, our findings provide the first experimentally constrained guide for estimating the probability of multiple-motor transport in optical trapping studies. The experimental approach utilized here (limiting ATP concentration) may be generally applicable to studies in which molecular motors are labeled with cargos that are artificial or are purified from cellular extracts. PMID:27332130

  9. Inferring rare disease risk variants based on exact probabilities of sharing by multiple affected relatives

    PubMed Central

    Bureau, Alexandre; Younkin, Samuel G.; Parker, Margaret M.; Bailey-Wilson, Joan E.; Marazita, Mary L.; Murray, Jeffrey C.; Mangold, Elisabeth; Albacha-Hejazi, Hasan; Beaty, Terri H.; Ruczinski, Ingo

    2014-01-01

    Motivation: Family-based designs are regaining popularity for genomic sequencing studies because they provide a way to test cosegregation with disease of variants that are too rare in the population to be tested individually in a conventional case–control study. Results: Where only a few affected subjects per family are sequenced, the probability that any variant would be shared by all affected relatives, given that it occurred in any one family member, provides evidence against the null hypothesis of a complete absence of linkage and association. A P-value can be obtained as the sum of the probabilities of sharing events as (or more) extreme in one or more families. We generalize an existing closed-form expression for exact sharing probabilities to more than two relatives per family. When pedigree founders are related, we show that an approximation of sharing probabilities based on empirical estimates of kinship among founders obtained from genome-wide marker data is accurate for low levels of kinship. We also propose a more generally applicable approach based on Monte Carlo simulations. We applied this method to a study of 55 multiplex families with apparent non-syndromic forms of oral clefts from four distinct populations, with whole exome sequences available for two or three affected members per family. The rare single nucleotide variant rs149253049 in ADAMTS9 shared by affected relatives in three Indian families achieved significance after correcting for multiple comparisons (p = 2 × 10^-6). Availability and implementation: Source code and binaries of the R package RVsharing are freely available for download at http://cran.r-project.org/web/packages/RVsharing/index.html. Contact: alexandre.bureau@msp.ulaval.ca or ingo@jhu.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24740360

  10. Probability method for Cerenkov luminescence tomography based on conformance error minimization

    PubMed Central

    Ding, Xintao; Wang, Kun; Jie, Biao; Luo, Yonglong; Hu, Zhenhua; Tian, Jie

    2014-01-01

    Cerenkov luminescence tomography (CLT) was developed to reconstruct a three-dimensional (3D) distribution of radioactive probes inside a living animal. Reconstruction methods are generally performed within a unique framework by searching for the optimum solution. However, the ill-posed nature of the inverse problem usually makes the reconstruction non-robust. In addition, the reconstructed result may not match reality, since the difference between the highest and lowest uptakes of the resulting radiotracers may be considerably large, and the biological significance is therefore lost. In this paper, based on the minimization of a conformance error, a probability method is proposed that consists of qualitative and quantitative modules. The proposed method first pinpoints the organ that contains the light source. Next, we developed a 0-1 linear optimization subject to a space constraint to model the CLT inverse problem, which was transformed into a forward problem by employing a region growing method to solve the optimization. After running through all of the elements used to grow the sources, a source sequence was obtained. Finally, the probability of each discrete node being the light source inside the organ was reconstructed. One numerical study and two in vivo experiments were conducted to verify the performance of the proposed algorithm, and comparisons were carried out using the hp-finite element method (hp-FEM). The results suggested that our proposed probability method was more robust and reasonable than hp-FEM. PMID:25071951

  11. Probability method for Cerenkov luminescence tomography based on conformance error minimization.

    PubMed

    Ding, Xintao; Wang, Kun; Jie, Biao; Luo, Yonglong; Hu, Zhenhua; Tian, Jie

    2014-07-01

    Cerenkov luminescence tomography (CLT) was developed to reconstruct a three-dimensional (3D) distribution of radioactive probes inside a living animal. Reconstruction methods are generally performed within a unique framework by searching for the optimum solution. However, the ill-posed nature of the inverse problem usually makes the reconstruction non-robust. In addition, the reconstructed result may not match reality, since the difference between the highest and lowest uptakes of the resulting radiotracers may be considerably large, and the biological significance is therefore lost. In this paper, based on the minimization of a conformance error, a probability method is proposed that consists of qualitative and quantitative modules. The proposed method first pinpoints the organ that contains the light source. Next, we developed a 0-1 linear optimization subject to a space constraint to model the CLT inverse problem, which was transformed into a forward problem by employing a region growing method to solve the optimization. After running through all of the elements used to grow the sources, a source sequence was obtained. Finally, the probability of each discrete node being the light source inside the organ was reconstructed. One numerical study and two in vivo experiments were conducted to verify the performance of the proposed algorithm, and comparisons were carried out using the hp-finite element method (hp-FEM). The results suggested that our proposed probability method was more robust and reasonable than hp-FEM. PMID:25071951

  12. Probability-Based Determination Methods for Service Waiting in Service-Oriented Computing Environments

    NASA Astrophysics Data System (ADS)

    Zeng, Sen; Huang, Shuangxi; Liu, Yang

    Cooperative business process (CBP)-based service-oriented enterprise networks (SOEN) are emerging with the significant advances of enterprise integration and service-oriented architecture. Performance prediction and optimization for CBP-based SOEN is very complex. To meet these challenges, one key point is to reduce the number of physical services an abstract service must wait for. This paper introduces a probability-based determination method (PBDM) for an abstract service's waiting number, M_i, and time span, τ_i, for its physical services. M_i and τ_i are determined from the physical services' arrival pattern and the distribution functions of their overall performance. In PBDM, the probability that a physical service with the best overall performance value arrives is set to a pre-defined reliability. PBDM makes thorough use of the information in the physical services' arrival pattern and performance distribution functions, which improves the computational efficiency of scheme design and performance optimization for collaborative business processes in service-oriented computing environments.
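
    A hedged sketch of the kind of calculation PBDM performs (the arrival model and all numbers below are assumed, not the paper's formulas): choose the smallest waiting number M, or time span τ, such that a sufficiently good physical service arrives with at least a pre-defined reliability.

      import math

      def waiting_number(p_good, reliability):
          # P(at least one good service among M arrivals) = 1 - (1 - p_good)^M >= reliability
          return math.ceil(math.log(1.0 - reliability) / math.log(1.0 - p_good))

      def time_span(lam, p_good, reliability):
          # Good arrivals form a Poisson stream of rate lam * p_good; wait until
          # P(no good arrival in tau) = exp(-lam * p_good * tau) <= 1 - reliability.
          return -math.log(1.0 - reliability) / (lam * p_good)

      print(waiting_number(p_good=0.2, reliability=0.95))      # -> 14 candidate services
      print(time_span(lam=3.0, p_good=0.2, reliability=0.95))  # -> ~5.0 time units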

  13. The Effect of Simulation-Based Learning on Prospective Teachers' Inference Skills in Teaching Probability

    ERIC Educational Resources Information Center

    Koparan, Timur; Yilmaz, Gül Kaleli

    2015-01-01

    This research examines the effect of simulation-based probability teaching on prospective teachers' inference skills. In line with this purpose, it aims to examine the design, implementation and efficiency of a learning environment for experimental probability. Activities were built on modeling, simulation and the…

  14. A Most Probable Point-Based Method for Reliability Analysis, Sensitivity Analysis and Design Optimization

    NASA Technical Reports Server (NTRS)

    Hou, Gene J.-W.; Gumbert, Clyde R.; Newman, Perry A.

    2004-01-01

    A major step in a most probable point (MPP)-based method for reliability analysis is to determine the MPP. This is usually accomplished by using an optimization search algorithm. The optimal solutions associated with the MPP provide measurements related to safety probability. This study focuses on two commonly used approximate probability integration methods; i.e., the Reliability Index Approach (RIA) and the Performance Measurement Approach (PMA). Their reliability sensitivity equations are first derived in this paper, based on the derivatives of their respective optimal solutions. Examples are then provided to demonstrate the use of these derivatives for better reliability analysis and Reliability-Based Design Optimization (RBDO).

  15. A Most Probable Point-Based Method for Reliability Analysis, Sensitivity Analysis and Design Optimization

    NASA Technical Reports Server (NTRS)

    Hou, Gene J.-W; Newman, Perry A. (Technical Monitor)

    2004-01-01

    A major step in a most probable point (MPP)-based method for reliability analysis is to determine the MPP. This is usually accomplished by using an optimization search algorithm. The minimum distance associated with the MPP provides a measurement of safety probability, which can be obtained by approximate probability integration methods such as FORM or SORM. The reliability sensitivity equations are derived first in this paper, based on the derivatives of the optimal solution. Examples are provided later to demonstrate the use of these derivatives for better reliability analysis and reliability-based design optimization (RBDO).
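
    A minimal FORM-style sketch of the MPP search described in these two records, with an assumed toy limit-state function: the MPP is the point on g(u) = 0 closest to the origin in standard normal space, its distance is the reliability index beta, and the failure probability is approximated by Phi(-beta).

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import norm

      def g(u):
          # Assumed toy limit state: failure when the sum of two standardized loads exceeds 4.
          return 4.0 - (u[0] + u[1])

      # Find the MPP by minimizing the squared distance to the origin on g(u) = 0.
      res = minimize(lambda u: np.dot(u, u),
                     x0=np.array([1.0, 1.0]),
                     constraints={"type": "eq", "fun": g})
      mpp = res.x
      beta = np.linalg.norm(mpp)
      print("MPP:", mpp.round(3), "beta:", round(beta, 3), "Pf ~", norm.cdf(-beta))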

  16. A Comparative Study of Probability Collectives Based Multi-agent Systems and Genetic Algorithms

    NASA Technical Reports Server (NTRS)

    Huang, Chien-Feng; Wolpert, David H.; Bieniawski, Stefan; Strauss, Charles E. M.

    2005-01-01

    We compare Genetic Algorithms (GA's) with Probability Collectives (PC), a new framework for distributed optimization and control. In contrast to GA's, PC-based methods do not update populations of solutions. Instead they update an explicitly parameterized probability distribution p over the space of solutions. That updating of p arises as the optimization of a functional of p. The functional is chosen so that any p that optimizes it should be peaked about good solutions. The PC approach works in both continuous and discrete problems. It does not suffer from the resolution limitation of the finite bit length encoding of parameters into GA alleles. It also has deep connections with both game theory and statistical physics. We review the PC approach using its motivation as the information theoretic formulation of bounded rationality for multi-agent systems. It is then compared with GA's on a diverse set of problems. To handle high dimensional surfaces, in the PC method investigated here p is restricted to a product distribution. Each distribution in that product is controlled by a separate agent. The test functions were selected for their difficulty using either traditional gradient descent or genetic algorithms. On those functions the PC-based approach significantly outperforms traditional GA's in rate of descent, resistance to trapping in false minima, and long-term optimization.

  17. Reducing the Probability of Incidents Through Behavior-Based Safety -- An Anomaly or Not?

    SciTech Connect

    Turek, John A

    2002-07-23

    Reducing the probability of incidents through Behavior-Based Safety-an anomaly or not? Can a Behavior-Based Safety (BBS) process reduce the probability of an employee sustaining a work-related injury or illness? This presentation describes the actions taken to implement a sustainable BBS process and evaluates its effectiveness. The BBS process at the Stanford Linear Accelerator Center used a pilot population of national laboratory employees to: Achieve employee and management support; Reduce the probability of employees' sustaining work-related injuries and illnesses; and Provide support for additional funding to expand within the laboratory.

  18. Reducing the Probability of Incidents Through Behavior-Based Safety -- An Anomaly or Not?

    SciTech Connect

    Turek, John A

    2002-07-23

    Reducing the probability of incidents through Behavior-Based Safety--an anomaly or not? Can a Behavior-Based Safety (BBS) process reduce the probability of an employee sustaining a work-related injury or illness? This presentation describes the actions taken to implement a sustainable BBS process and evaluates its effectiveness. The BBS process at the Stanford Linear Accelerator Center used a pilot population of national laboratory employees to: Achieve employee and management support; Reduce the probability of employees' sustaining work-related injuries and illnesses; and Provide support for additional funding to expand within the laboratory.

  19. Open cluster membership probability based on K-means clustering algorithm

    NASA Astrophysics Data System (ADS)

    El Aziz, Mohamed Abd; Selim, I. M.; Essam, A.

    2016-05-01

    In astronomical images, the relative coordinate positions of each star with respect to all the other stars are used. The membership of a star cluster is therefore assessed by two basic criteria, one for geometric membership and the other for physical (photometric) membership. In this paper, we present a new method for the determination of open cluster membership based on the K-means clustering algorithm. This algorithm allows us to efficiently discriminate cluster members from the field stars. To validate the method we applied it to NGC 188 and NGC 2266, and membership stars in these clusters have been obtained. The color-magnitude diagram of the member stars is significantly clearer and shows a well-defined main sequence and a red giant branch in NGC 188, which allows us to better constrain the cluster members and estimate their physical parameters. The membership probabilities have been calculated and compared to those obtained by other methods. The results show that the K-means clustering algorithm can effectively select probable member stars in space without any assumption about the spatial distribution of stars in the cluster or field. Our results are in good agreement with those derived in previous works.
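
    The sketch below applies K-means to synthetic star positions (not the NGC 188 or NGC 2266 data) to show how a compact cluster component separates from uniformly scattered field stars; the distance to the cluster centroid is then turned into a rough, purely illustrative membership weight.

      import numpy as np
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(42)
      cluster_stars = rng.normal(loc=[0.0, 0.0], scale=0.3, size=(150, 2))
      field_stars = rng.uniform(low=-3.0, high=3.0, size=(400, 2))
      X = np.vstack([cluster_stars, field_stars])

      km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
      # Identify the tighter component as the open cluster.
      spread = [X[km.labels_ == k].std() for k in range(2)]
      cluster_id = int(np.argmin(spread))

      # Gaussian-weighted distance to the cluster centroid as a crude membership weight.
      d = np.linalg.norm(X - km.cluster_centers_[cluster_id], axis=1)
      membership_weight = np.exp(-0.5 * (d / d[km.labels_ == cluster_id].std()) ** 2)
      print("stars assigned to the cluster component:", int((km.labels_ == cluster_id).sum()))
      print("example membership weights:", membership_weight[:3].round(2))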

  20. A comprehensive propagation prediction model comprising microfacet based scattering and probability based coverage optimization algorithm.

    PubMed

    Kausar, A S M Zahid; Reza, Ahmed Wasif; Wo, Lau Chun; Ramiah, Harikrishnan

    2014-01-01

    Although ray-tracing-based propagation prediction models are popular for indoor radio wave propagation characterization, most of them do not provide an integrated approach for achieving optimum coverage, which is a key part of designing a wireless network. In this paper, an accelerated three-dimensional ray tracing technique is presented in which rough surface scattering is included to make the ray tracing more accurate. Here, the rough surface scattering is represented by microfacets, which makes it possible to compute the scattered field in all possible directions. New optimization techniques, namely dual quadrant skipping (DQS) and closest object finder (COF), are implemented to speed up the characterization of wireless communications and make the ray tracing technique more efficient. In addition, a probability-based coverage optimization algorithm is combined with the ray tracing technique to form a compact solution for indoor propagation prediction. The proposed technique decreases the ray tracing time by omitting unnecessary objects using the DQS technique and by decreasing the ray-object intersection time using the COF technique. The coverage optimization algorithm, in turn, is based on probability theory and finds the minimum number of transmitters and their corresponding positions required to achieve optimal indoor wireless coverage. Both the space and time complexities of the proposed algorithm improve on those of existing algorithms. For verification of the proposed ray tracing technique and coverage algorithm, detailed simulation results for different scattering factors, different antenna types, and different operating frequencies are presented. Furthermore, the proposed technique is verified by experimental results. PMID:25202733

  1. An efficient surrogate-based method for computing rare failure probability

    NASA Astrophysics Data System (ADS)

    Li, Jing; Li, Jinglai; Xiu, Dongbin

    2011-10-01

    In this paper, we present an efficient numerical method for evaluating rare failure probabilities. The method builds on a recently developed surrogate-based method from Li and Xiu [J. Li, D. Xiu, Evaluation of failure probability via surrogate models, J. Comput. Phys. 229 (2010) 8966-8980] for failure probability computation. The method by Li and Xiu is of hybrid nature, in the sense that samples of both the surrogate model and the true physical model are used, and its efficiency gain relies on using only very few samples of the true model. Here we extend the capability of the method to rare probability computation by using the idea of importance sampling (IS). In particular, we employ the cross-entropy (CE) method, which is an effective way to determine the biasing distribution in IS. We demonstrate that, by combining with the CE method, a surrogate-based IS algorithm can be constructed that is highly efficient for rare failure probability computation, incurring much reduced simulation effort compared to the traditional CE-IS method. In many cases, the new method is capable of capturing failure probabilities as small as 10^-12 to 10^-6 with only several hundred samples.
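
    The sketch below shows only the cross-entropy importance-sampling ingredient on a toy one-dimensional problem; the paper's efficiency comes from pairing this with a surrogate model, which is omitted here. All numbers are assumed for illustration.

      import numpy as np
      from scipy.stats import norm

      # Toy limit state: failure when a standard normal X exceeds 5 (Pf ~ 2.9e-7).
      rng = np.random.default_rng(0)
      threshold, n, elite_frac = 5.0, 2_000, 0.1

      mu = 0.0                                   # biasing mean, adapted by CE iterations
      for _ in range(10):
          x = rng.normal(mu, 1.0, n)
          level = np.quantile(x, 1.0 - elite_frac)   # adaptive intermediate level
          level = min(level, threshold)
          mu = x[x >= level].mean()                  # CE update of the biasing mean
          if level >= threshold:
              break

      # Final importance-sampling estimate with the tilted proposal N(mu, 1).
      x = rng.normal(mu, 1.0, n)
      weights = norm.pdf(x) / norm.pdf(x, loc=mu)    # density ratio p(x)/q(x)
      pf_hat = np.mean((x >= threshold) * weights)
      print("estimated Pf:", pf_hat, "   exact:", norm.sf(threshold))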

  2. PROBABILITY BASED CORROSION CONTROL FOR HIGH LEVEL WASTE TANKS: INTERIM REPORT

    SciTech Connect

    Hoffman, E; Karthik Subramanian, K

    2008-04-23

    Controls on the solution chemistry (minimum nitrite and hydroxide concentrations) are in place to prevent the initiation and propagation of pitting and stress corrosion cracking in high level waste (HLW) tanks. These controls are based upon a series of experiments performed on carbon steel coupons in simulated waste solutions. An experimental program was undertaken to investigate reducing the minimum molar nitrite concentration required to confidently inhibit pitting. A statistical basis to quantify the probability of pitting for the tank wall, when exposed to various dilute solutions, is being developed. Electrochemical and coupon testing are being performed within the framework of the statistical test matrix to determine the minimum necessary inhibitor concentrations and develop a quantitative model to predict pitting propensity. A subset of the original statistical test matrix was used to develop an applied understanding of the corrosion response of the carbon steel in the various environments. The interim results suggest that there exists some critical nitrite concentration that sufficiently inhibits localized corrosion mechanisms due to nitrates/chlorides/sulfates, beyond which further nitrite additions are unnecessary. The combination of visual observation and the cyclic potentiodynamic polarization scans indicates the potential for significant inhibitor reductions without consequence, specifically at nitrate concentrations near 1 M. The complete data sets will be used to determine the statistical basis to confidently inhibit pitting using nitrite inhibition with the current pH controls. Once complete, a revised chemistry control program will be devised based upon the probability of pitting specifically for dilute solutions, which will allow for tank-specific chemistry control implementation.

  3. Probability Elicitation Under Severe Time Pressure: A Rank-Based Method.

    PubMed

    Jaspersen, Johannes G; Montibeller, Gilberto

    2015-07-01

    Probability elicitation protocols are used to assess and incorporate subjective probabilities in risk and decision analysis. While most of these protocols use methods that have focused on the precision of the elicited probabilities, the speed of the elicitation process has often been neglected. However, speed is also important, particularly when experts need to examine a large number of events on a recurrent basis. Furthermore, most existing elicitation methods are numerical in nature, but there are various reasons why an expert would refuse to give such precise ratio-scale estimates, even if highly numerate. This may occur, for instance, when there is lack of sufficient hard evidence, when assessing very uncertain events (such as emergent threats), or when dealing with politicized topics (such as terrorism or disease outbreaks). In this article, we adopt an ordinal ranking approach from multicriteria decision analysis to provide a fast and nonnumerical probability elicitation process. Probabilities are subsequently approximated from the ranking by an algorithm based on the principle of maximum entropy, a rule compatible with the ordinal information provided by the expert. The method can elicit probabilities for a wide range of different event types, including new ways of eliciting probabilities for stochastically independent events and low-probability events. We use a Monte Carlo simulation to test the accuracy of the approximated probabilities and try the method in practice, applying it to a real-world risk analysis recently conducted for DEFRA (the U.K. Department for Environment, Food and Rural Affairs): the prioritization of animal health threats. PMID:25850859

  4. Flow Regime Based Climatologies of Lightning Probabilities for Spaceports and Airports

    NASA Technical Reports Server (NTRS)

    Bauman, William H., III; Sharp, David; Spratt, Scott; Lafosse, Richard A.

    2008-01-01

    The objective of this work was to provide forecasters with a tool to indicate the warm season climatological probability of one or more lightning strikes within a circle at a site within a specified time interval. This paper described the AMU work conducted in developing flow regime based climatologies of lightning probabilities for the SLF and seven airports in the NWS MLB CWA in east-central Florida. The paper also described the GUI developed by the AMU that is used to display the data for the operational forecasters. There were challenges working with gridded lightning data as well as the code that accompanied the gridded data. The AMU modified the provided code to be able to produce the climatologies of lightning probabilities based on eight flow regimes for 5-, 10-, 20-, and 30-n mi circles centered on eight sites in 1-, 3-, and 6-hour increments.
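
    The climatological probabilities themselves reduce to conditional relative frequencies. The sketch below, with a hypothetical table of daily flow-regime labels and strike indicators (column names and values are invented, not the AMU data), shows the counting step.

```python
import pandas as pd

# Hypothetical daily records: flow-regime label and whether one or more strikes
# occurred within a 20-n mi circle of a site during a 3-hour window.
records = pd.DataFrame({
    "flow_regime": ["SW", "SW", "SE", "SE", "NW", "SW", "SE", "NW"],
    "strike":      [1,    0,    1,    1,    0,    1,    0,    0],
})

# Climatological probability of >= 1 strike, conditioned on the flow regime.
climo = records.groupby("flow_regime")["strike"].agg(["mean", "count"])
climo = climo.rename(columns={"mean": "P(lightning)", "count": "n_days"})
print(climo)
```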

  5. How might Model-based Probabilities Extracted from Imperfect Models Guide Rational Decisions: The Case for non-probabilistic odds

    NASA Astrophysics Data System (ADS)

    Smith, Leonard A.

    2010-05-01

    …whether or not probabilities based on imperfect models can be expected to yield probabilistic odds which are sustainable. Evidence is provided that suggests this is not the case. Even with very good models (good in a Root-Mean-Square sense), the risk of ruin of probabilistic odds is significantly higher than might be expected. Methods for constructing model-based non-probabilistic odds which are sustainable are discussed. The aim here is to be relevant to real-world decision support, and so unrealistic assumptions of equal knowledge, equal compute power, or equal access to information are to be avoided. Finally, the use of non-probabilistic odds as a method for communicating deep uncertainty (uncertainty in a probability forecast itself) is discussed in the context of other methods, such as stating one's subjective probability that the models will prove inadequate in each particular instance (that is, the Probability of a "Big Surprise").

  6. The Role of Probability-Based Inference in an Intelligent Tutoring System.

    ERIC Educational Resources Information Center

    Mislevy, Robert J.; Gitomer, Drew H.

    Probability-based inference in complex networks of interdependent variables is an active topic in statistical research, spurred by such diverse applications as forecasting, pedigree analysis, troubleshooting, and medical diagnosis. This paper concerns the role of Bayesian inference networks for updating student models in intelligent tutoring…

  7. HABITAT ASSESSMENT USING A RANDOM PROBABILITY BASED SAMPLING DESIGN: ESCAMBIA RIVER DELTA, FLORIDA

    EPA Science Inventory

    Smith, Lisa M., Darrin D. Dantin and Steve Jordan. In press. Habitat Assessment Using a Random Probability Based Sampling Design: Escambia River Delta, Florida (Abstract). To be presented at the SWS/GERS Fall Joint Society Meeting: Communication and Collaboration: Coastal Systems...

  8. Learning Probabilities in Computer Engineering by Using a Competency- and Problem-Based Approach

    ERIC Educational Resources Information Center

    Khoumsi, Ahmed; Hadjou, Brahim

    2005-01-01

    Our department has redesigned its electrical and computer engineering programs by adopting a learning methodology based on competence development, problem solving, and the realization of design projects. In this article, we show how this pedagogical approach has been successfully used for learning probabilities and their application to computer…

  9. Estimating the Upper Limit of Lifetime Probability Distribution, Based on Data of Japanese Centenarians.

    PubMed

    Hanayama, Nobutane; Sibuya, Masaaki

    2016-08-01

    In modern biology, theories of aging fall mainly into two groups: damage theories and programed theories. If programed theories are true, the probability that human beings live beyond a specific age will be zero. In contrast, if damage theories are true, such an age does not exist, and any longevity record will eventually be broken. In this article, to examine the real state of affairs, a special type of binomial model based on the generalized Pareto distribution has been applied to data on Japanese centenarians. From the results, it is concluded that the upper limit of the lifetime probability distribution in the Japanese population is estimated to be 123 years. PMID:26362439
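
    The key quantity is the finite right endpoint implied by a generalized Pareto fit with negative shape parameter: for threshold u, scale sigma and shape xi < 0, the endpoint is u - sigma / xi. The sketch below uses synthetic excesses over a 100-year threshold, not the Japanese centenarian data, to illustrate the calculation.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(1)

# Synthetic ages at death of "centenarians": threshold + GPD-distributed excess.
threshold = 100.0
excess = genpareto.rvs(c=-0.2, scale=5.0, size=500, random_state=rng)
ages = threshold + excess

# Fit a GPD to the excesses over the threshold (location fixed at 0).
c_hat, _, scale_hat = genpareto.fit(ages - threshold, floc=0.0)

if c_hat < 0:
    upper_limit = threshold - scale_hat / c_hat   # finite right endpoint
    print(f"shape = {c_hat:.3f}, estimated upper limit of lifetime ≈ {upper_limit:.1f} years")
else:
    print(f"shape = {c_hat:.3f} >= 0: the fit implies no finite upper limit")
```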

  10. Performance of the Rayleigh task based on the posterior probability of tomographic reconstructions

    SciTech Connect

    Hanson, K.M.

    1991-01-01

    We seek the best possible performance of the Rayleigh task in which one must decide whether a perceived object is a pair of Gaussian-blurred points or a blurred line. Two Bayesian reconstruction algorithms are used, the first based on a Gaussian prior-probability distribution with a nonnegativity constraint and the second based on an entropic prior. In both cases, the reconstructions are found that maximize the posterior probability. We compare the performance of the Rayleigh task obtained with two decision variables, the logarithm of the posterior probability ratio and the change in the mean-squared deviation from the reconstruction. The method of evaluation is based on the results of a numerical testing procedure in which the stated discrimination task is carried out on reconstructions of a randomly generated sequence of images. The ability to perform the Rayleigh task is summarized in terms of a discrimination index that is derived from the area under the receiver-operating characteristic (ROC) curve. We find that the use of the posterior probability does not result in better performance of the Rayleigh task than the mean-squared deviation from the reconstruction. 10 refs., 6 figs.
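
    The discrimination index summarizing the ROC area can be computed directly from samples of any decision variable. The sketch below uses synthetic Gaussian decision-variable values (not the reconstruction outputs of the paper) and the standard relation d_A = sqrt(2) * Phi^-1(AUC) for an equal-variance Gaussian observer.

```python
import numpy as np
from scipy.stats import norm, mannwhitneyu

rng = np.random.default_rng(2)

# Hypothetical decision-variable values (e.g. log posterior-probability ratios)
# from reconstructions of "two points" versus "one line" objects.
two_points = rng.normal(1.0, 1.0, 200)
one_line = rng.normal(0.0, 1.0, 200)

# Area under the ROC curve via the Mann-Whitney U statistic.
u_stat, _ = mannwhitneyu(two_points, one_line, alternative="greater")
auc = u_stat / (len(two_points) * len(one_line))

# Discrimination index derived from the ROC area (equal-variance Gaussian model).
d_a = np.sqrt(2.0) * norm.ppf(auc)
print(f"AUC = {auc:.3f}, discrimination index = {d_a:.3f}")
```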

  11. Value and probability coding in a feedback-based learning task utilizing food rewards

    PubMed Central

    Lempert, Karolina M.

    2014-01-01

    For the consequences of our actions to guide behavior, the brain must represent different types of outcome-related information. For example, an outcome can be construed as negative because an expected reward was not delivered or because an outcome of low value was delivered. Thus behavioral consequences can differ in terms of the information they provide about outcome probability and value. We investigated the role of the striatum in processing probability-based and value-based negative feedback by training participants to associate cues with food rewards and then employing a selective satiety procedure to devalue one food outcome. Using functional magnetic resonance imaging, we examined brain activity related to receipt of expected rewards, receipt of devalued outcomes, omission of expected rewards, omission of devalued outcomes, and expected omissions of an outcome. Nucleus accumbens activation was greater for rewarding outcomes than devalued outcomes, but activity in this region did not correlate with the probability of reward receipt. Activation of the right caudate and putamen, however, was largest in response to rewarding outcomes relative to expected omissions of reward. The dorsal striatum (caudate and putamen) at the time of feedback also showed a parametric increase correlating with the trialwise probability of reward receipt. Our results suggest that the ventral striatum is sensitive to the motivational relevance, or subjective value, of the outcome, while the dorsal striatum codes for a more complex signal that incorporates reward probability. Value and probability information may be integrated in the dorsal striatum, to facilitate action planning and allocation of effort. PMID:25339705

  12. Differential Survival in Europe and the United States: Estimates Based on Subjective Probabilities of Survival

    PubMed Central

    Delavande, Adeline; Rohwedder, Susann

    2013-01-01

    Cross-country comparisons of differential survival by socioeconomic status (SES) are useful in many domains. Yet, to date, such studies have been rare. Reliably estimating differential survival in a single country has been challenging because it requires rich panel data with a large sample size. Cross-country estimates have proven even more difficult because the measures of SES need to be comparable internationally. We present an alternative method for acquiring information on differential survival by SES. Rather than using observations of actual survival, we relate individuals’ subjective probabilities of survival to SES variables in cross section. To show that subjective survival probabilities are informative proxies for actual survival when estimating differential survival, we compare estimates of differential survival based on actual survival with estimates based on subjective probabilities of survival for the same sample. The results are remarkably similar. We then use this approach to compare differential survival by SES for 10 European countries and the United States. Wealthier people have higher survival probabilities than those who are less wealthy, but the strength of the association differs across countries. Nations with a smaller gradient appear to be Belgium, France, and Italy, while the United States, England, and Sweden appear to have a larger gradient. PMID:22042664

  13. Reconciliation of Decision-Making Heuristics Based on Decision Trees Topologies and Incomplete Fuzzy Probabilities Sets

    PubMed Central

    Doubravsky, Karel; Dohnal, Mirko

    2015-01-01

    Complex decision-making tasks of different natures, e.g. economics, safety engineering, ecology and biology, are based on vague, sparse, partially inconsistent and subjective knowledge. Moreover, decision-making economists/engineers are usually not willing to invest too much time into the study of complex formal theories. They require decisions which can be (re)checked by human-like common sense reasoning. One important problem related to realistic decision-making tasks is the incomplete data sets required by the chosen decision-making algorithm. This paper presents a relatively simple algorithm by which some missing III (input information items) can be generated, using mainly decision tree topologies, and integrated into incomplete data sets. The algorithm is based on easy-to-understand heuristics, e.g. that a longer decision tree sub-path is less probable. This heuristic can solve decision problems under total ignorance, i.e. when the decision tree topology is the only information available. In practice, however, isolated information items, e.g. some vaguely known probabilities (fuzzy probabilities), are usually available, which means that a realistic problem is analysed under partial ignorance. The proposed algorithm reconciles the topology-related heuristics and the additional fuzzy sets using fuzzy linear programming. The case study, represented by a tree with six lotteries and one fuzzy probability, is presented in detail. PMID:26158662

  14. Reconciliation of Decision-Making Heuristics Based on Decision Trees Topologies and Incomplete Fuzzy Probabilities Sets.

    PubMed

    Doubravsky, Karel; Dohnal, Mirko

    2015-01-01

    Complex decision-making tasks of different natures, e.g. economics, safety engineering, ecology and biology, are based on vague, sparse, partially inconsistent and subjective knowledge. Moreover, decision-making economists/engineers are usually not willing to invest too much time into the study of complex formal theories. They require decisions which can be (re)checked by human-like common sense reasoning. One important problem related to realistic decision-making tasks is the incomplete data sets required by the chosen decision-making algorithm. This paper presents a relatively simple algorithm by which some missing III (input information items) can be generated, using mainly decision tree topologies, and integrated into incomplete data sets. The algorithm is based on easy-to-understand heuristics, e.g. that a longer decision tree sub-path is less probable. This heuristic can solve decision problems under total ignorance, i.e. when the decision tree topology is the only information available. In practice, however, isolated information items, e.g. some vaguely known probabilities (fuzzy probabilities), are usually available, which means that a realistic problem is analysed under partial ignorance. The proposed algorithm reconciles the topology-related heuristics and the additional fuzzy sets using fuzzy linear programming. The case study, represented by a tree with six lotteries and one fuzzy probability, is presented in detail. PMID:26158662

  15. Estimating transition probabilities for stage-based population projection matrices using capture-recapture data

    USGS Publications Warehouse

    Nichols, J.D.; Sauer, J.R.; Pollock, K.H.; Hestbeck, J.B.

    1992-01-01

    In stage-based demography, animals are often categorized into size (or mass) classes, and size-based probabilities of surviving and changing mass classes must be estimated before demographic analyses can be conducted. In this paper, we develop two procedures for the estimation of mass transition probabilities from capture-recapture data. The first approach uses a multistate capture-recapture model that is parameterized directly with the transition probabilities of interest. Maximum likelihood estimates are then obtained numerically using program SURVIV. The second approach involves a modification of Pollock's robust design. Estimation proceeds by conditioning on animals caught in a particular class at time i, and then using closed models to estimate the number of these that are alive in other classes at i + 1. Both methods are illustrated by application to meadow vole, Microtus pennsylvanicus, capture-recapture data. The two methods produced reasonable estimates that were similar. Advantages of these two approaches include the directness of estimation, the absence of need for restrictive assumptions about the independence of survival and growth, the testability of assumptions, and the testability of related hypotheses of ecological interest (e.g., the hypothesis of temporal variation in transition probabilities).
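
    When every marked animal is recaptured, the maximum-likelihood transition probabilities are just row-normalized transition counts; the multistate and robust-design models in the paper exist precisely to relax that perfect-detection assumption. The counts below are invented for illustration.

```python
import numpy as np

# Hypothetical transition counts between three mass classes
# (rows: class at time i, columns: class at time i + 1).
counts = np.array([[40, 12,  3],
                   [ 8, 55, 10],
                   [ 1,  9, 47]], dtype=float)

# Row-wise maximum-likelihood estimates of the transition probabilities,
# valid only under the (unrealistic) assumption of perfect detection.
psi_hat = counts / counts.sum(axis=1, keepdims=True)
print(np.round(psi_hat, 3))
```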

  16. Epistemic-based investigation of the probability of hazard scenarios using Bayesian network for the lifting operation of floating objects

    NASA Astrophysics Data System (ADS)

    Toroody, Ahmad Bahoo; Abaiee, Mohammad Mahdi; Gholamnia, Reza; Ketabdari, Mohammad Javad

    2016-07-01

    Owing to the increase in unprecedented accidents with new root causes in almost all operational areas, the importance of risk management has dramatically risen. Risk assessment, one of the most significant aspects of risk management, has a substantial impact on the system-safety level of organizations, industries, and operations. If the causes of all kinds of failure and the interactions between them are considered, effective risk assessment can be highly accurate. A combination of traditional risk assessment approaches and modern scientific probability methods can help in realizing better quantitative risk assessment methods. Most researchers face the problem of minimal field data with respect to the probability and frequency of each failure. Because of this limitation in the availability of epistemic knowledge, it is important to conduct epistemic estimations by applying the Bayesian theory for identifying plausible outcomes. In this paper, we propose an algorithm and demonstrate its application in a case study for a light-weight lifting operation in the Persian Gulf of Iran. First, we identify potential accident scenarios and present them in an event tree format. Next, excluding human error, we use the event tree to roughly estimate the prior probability of other hazard-promoting factors using a minimal amount of field data. We then use the Success Likelihood Index Method (SLIM) to calculate the probability of human error. On the basis of the proposed event tree, we use the Bayesian network of the provided scenarios to compensate for the lack of data. Finally, we determine the resulting probability of each event from its evidence, in an epistemic estimation format, by combining the probabilities of the hazard-promoting factors with Bayesian theory. The study results indicate that despite the lack of available information on the operation of floating objects, a satisfactory result can be achieved using epistemic data.
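
    Before any Bayesian updating, an event tree simply multiplies branch probabilities along each path. The sketch below shows that step for a hypothetical lifting scenario; the branch probabilities, including the SLIM-style human error value, are invented, and the paper's Bayesian network would further update them with evidence.

```python
# Hypothetical event tree for a lifting operation: an initiating event
# followed by a human-action branch and a mechanical barrier.
p_initiating = 0.05      # e.g. sudden load swing per lift (assumed)
p_human_error = 0.12     # from a SLIM-style assessment (assumed)
p_barrier_fail = 0.08    # mechanical safeguard failure (assumed)

scenarios = {
    "controlled recovery": p_initiating * (1 - p_human_error),
    "near miss (barrier holds)": p_initiating * p_human_error * (1 - p_barrier_fail),
    "dropped object": p_initiating * p_human_error * p_barrier_fail,
}
for name, p in scenarios.items():
    print(f"{name:28s} {p:.5f}")
```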

  17. The relationship study between image features and detection probability based on psychology experiments

    NASA Astrophysics Data System (ADS)

    Lin, Wei; Chen, Yu-hua; Wang, Ji-yuan; Gao, Hong-sheng; Wang, Ji-jun; Su, Rong-hua; Mao, Wei

    2011-04-01

    Detection probability is an important index for representing and estimating target viability, and it provides a basis for target recognition and decision-making. However, obtaining detection probability in practice requires a great deal of time and manpower, and because interpreters differ in practical knowledge and experience, the data obtained often vary widely. By studying the relationship between image features and perception quantity through psychology experiments, a probability model has been established; the process is as follows. First, four image features that directly affect detection were extracted and quantified, and four feature similarity degrees between target and background were defined. Second, the relationship between each single image feature similarity degree and perception quantity was established based on psychological principles, and psychological target-interpretation experiments were designed involving about five hundred interpreters and two hundred images. To reduce the correlation between image features, a large number of artificially synthesized images were produced, including images differing only in brightness, only in chromaticity, only in texture, and only in shape. By analyzing and fitting a large amount of experimental data, the model quantities were determined. Finally, by applying statistical decision theory to the experimental results, the relationship between perception quantity and target detection probability was found. Verified against a large number of practical target interpretations, the model can provide target detection probability quickly and objectively.

  18. a Probability-Based Statistical Method to Extract Water Body of TM Images with Missing Information

    NASA Astrophysics Data System (ADS)

    Lian, Shizhong; Chen, Jiangping; Luo, Minghai

    2016-06-01

    Water information cannot be accurately extracted from TM images in which true information is lost because of cloud cover and missing data stripes. Water is continuously distributed under natural conditions; thus, this paper proposes a new method of water body extraction based on probability statistics to improve the accuracy of water information extraction from TM images with missing information. Different kinds of disturbing information from clouds and missing data stripes are simulated. Water information is then extracted from the simulated images using global histogram matching, local histogram matching, and the probability-based statistical method. Experiments show that a smaller Areal Error and a higher Boundary Recall can be obtained using this method compared with the conventional methods.

  19. Finding significantly connected voxels based on histograms of connection strengths

    NASA Astrophysics Data System (ADS)

    Kasenburg, Niklas; Pedersen, Morten Vester; Darkner, Sune

    2016-03-01

    We explore a new approach for structural connectivity based segmentations of subcortical brain regions. Connectivity based segmentations are usually based on fibre connections from a seed region to predefined target regions. We present a method for finding significantly connected voxels based on the distribution of connection strengths. Paths from seed voxels to all voxels in a target region are obtained from a shortest-path tractography. For each seed voxel we approximate the distribution with a histogram of path scores. We hypothesise that the majority of estimated connections are false-positives and that their connection strength is distributed differently from true-positive connections. Therefore, an empirical null-distribution is defined for each target region as the average normalized histogram over all voxels in the seed region. Single histograms are then tested against the corresponding null-distribution and significance is determined using the false discovery rate (FDR). Segmentations are based on significantly connected voxels and their FDR. In this work we focus on the thalamus and the target regions were chosen by dividing the cortex into a prefrontal/temporal zone, motor zone, somatosensory zone and a parieto-occipital zone. The obtained segmentations consistently show a sparse number of significantly connected voxels that are located near the surface of the anterior thalamus over a population of 38 subjects.
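
    The statistical core is a per-voxel test against an empirical null followed by false-discovery-rate control. The sketch below substitutes a simple resampling test on mean path scores for the paper's histogram test and uses synthetic scores; the Benjamini-Hochberg step is standard.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic path scores: rows = seed voxels, columns = paths to a target region.
scores = rng.beta(1.0, 8.0, size=(200, 50))
scores[:20] = rng.beta(3.0, 5.0, size=(20, 50))   # a few truly connected voxels

# Empirical null: pool scores over all seed voxels (a stand-in for the paper's
# averaged normalized histogram).
null_scores = scores.ravel()

# Per-voxel p-value: chance under the null of a mean score at least this large.
n_perm = 2000
null_means = rng.choice(null_scores, size=(n_perm, scores.shape[1])).mean(axis=1)
pvals = np.array([(null_means >= m).mean() for m in scores.mean(axis=1)])

# Benjamini-Hochberg FDR control at q = 0.05.
q = 0.05
order = np.argsort(pvals)
thresh = q * np.arange(1, len(pvals) + 1) / len(pvals)
passed = pvals[order] <= thresh
k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
significant = np.zeros(len(pvals), dtype=bool)
significant[order[:k]] = True
print(f"{significant.sum()} voxels significantly connected at FDR {q}")
```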

  20. A method of classification for multisource data in remote sensing based on interval-valued probabilities

    NASA Technical Reports Server (NTRS)

    Kim, Hakil; Swain, Philip H.

    1990-01-01

    An axiomatic approach to interval-valued (IV) probabilities is presented, where the IV probability is defined by a pair of set-theoretic functions which satisfy some pre-specified axioms. On the basis of this approach, the representation of statistical evidence and the combination of multiple bodies of evidence are emphasized. Although IV probabilities provide an innovative means for the representation and combination of evidential information, they make the decision process rather complicated. This entails more intelligent strategies for making decisions. The development of decision rules over IV probabilities is discussed from the viewpoint of statistical pattern recognition. The proposed method, the so-called evidential reasoning method, is applied to the ground-cover classification of a multisource data set consisting of Multispectral Scanner (MSS) data, Synthetic Aperture Radar (SAR) data, and digital terrain data such as elevation, slope, and aspect. By treating the data sources separately, the method is able to capture both parametric and nonparametric information and to combine them. Then the method is applied to two separate cases of classifying multiband data obtained by a single sensor. In each case a set of multiple sources is obtained by dividing the dimensionally huge data into smaller and more manageable pieces based on the global statistical correlation information. By a divide-and-combine process, the method is able to utilize more features than the conventional maximum likelihood method.

  1. A simple derivation and classification of common probability distributions based on information symmetry and measurement scale.

    PubMed

    Frank, S A; Smith, E

    2011-03-01

    Commonly observed patterns typically follow a few distinct families of probability distributions. Over one hundred years ago, Karl Pearson provided a systematic derivation and classification of the common continuous distributions. His approach was phenomenological: a differential equation that generated common distributions without any underlying conceptual basis for why common distributions have particular forms and what explains the familial relations. Pearson's system and its descendants remain the most popular systematic classification of probability distributions. Here, we unify the disparate forms of common distributions into a single system based on two meaningful and justifiable propositions. First, distributions follow maximum entropy subject to constraints, where maximum entropy is equivalent to minimum information. Second, different problems associate magnitude to information in different ways, an association we describe in terms of the relation between information invariance and measurement scale. Our framework relates the different continuous probability distributions through the variations in measurement scale that change each family of maximum entropy distributions into a distinct family. From our framework, future work in biology can consider the genesis of common patterns in a new and more general way. Particular biological processes set the relation between the information in observations and magnitude, the basis for information invariance, symmetry and measurement scale. The measurement scale, in turn, determines the most likely probability distributions and observed patterns associated with particular processes. This view presents a fundamentally derived alternative to the largely unproductive debates about neutrality in ecology and evolution. PMID:21265914

  2. Assessment of probability density function based on POD reduced-order model for ensemble-based data assimilation

    NASA Astrophysics Data System (ADS)

    Kikuchi, Ryota; Misaka, Takashi; Obayashi, Shigeru

    2015-10-01

    An integrated method of a proper orthogonal decomposition based reduced-order model (ROM) and data assimilation is proposed for the real-time prediction of an unsteady flow field. In this paper, a particle filter (PF) and an ensemble Kalman filter (EnKF) are compared for data assimilation and the difference in the predicted flow fields is evaluated focusing on the probability density function (PDF) of the model variables. The proposed method is demonstrated using identical twin experiments of an unsteady flow field around a circular cylinder at the Reynolds number of 1000. The PF and EnKF are employed to estimate temporal coefficients of the ROM based on the observed velocity components in the wake of the circular cylinder. The prediction accuracy of ROM-PF is significantly better than that of ROM-EnKF due to the flexibility of PF for representing a PDF compared to EnKF. Furthermore, the proposed method reproduces the unsteady flow field several orders of magnitude faster than the reference numerical simulation based on the Navier-Stokes equations.
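
    The particle-filter half of the comparison reduces to a predict-weight-resample loop on the ROM's temporal coefficients. The sketch below runs a bootstrap particle filter on a single toy coefficient with an invented one-step model and synthetic observations; it is not the POD-ROM of the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

def rom_step(a, dt=0.05):
    # Toy stand-in for advancing one temporal POD coefficient by one step.
    return a + dt * (-0.2 * a + np.sin(a))

n_particles, obs_sigma = 500, 0.1
particles = rng.normal(0.0, 1.0, n_particles)   # prior ensemble of the coefficient
truth = 0.8

for _ in range(50):
    truth = rom_step(truth)
    y_obs = truth + rng.normal(0.0, obs_sigma)   # synthetic wake measurement

    # Predict: propagate each particle through the model plus process noise.
    particles = rom_step(particles) + rng.normal(0.0, 0.02, n_particles)

    # Weight: Gaussian likelihood of the observation given each particle.
    w = np.exp(-0.5 * ((y_obs - particles) / obs_sigma) ** 2)
    w /= w.sum()

    # Resample (multinomial for brevity; systematic resampling is preferable).
    particles = rng.choice(particles, size=n_particles, p=w)

print(f"truth = {truth:.3f}, posterior mean = {particles.mean():.3f}")
```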

  3. Probability based earthquake load and resistance factor design criteria for offshore platforms

    SciTech Connect

    Bea, R.G.

    1996-12-31

    This paper describes a probability-based reliability formulation to determine earthquake Load and Resistance Factor Design (LRFD) parameters for conventional steel, pile-supported, tubular-membered platforms, proposed as a basis for earthquake design criteria and guidelines for offshore platforms intended to have worldwide applicability. The formulation is illustrated with applications to platforms located in five areas: offshore California, Venezuela (Rio Caribe), the East Coast of Canada, the Caspian Sea (Azeri), and the Norwegian sector of the North Sea.

  4. Chaos optimization algorithms based on chaotic maps with different probability distribution and search speed for global optimization

    NASA Astrophysics Data System (ADS)

    Yang, Dixiong; Liu, Zhenjun; Zhou, Jilei

    2014-04-01

    Chaos optimization algorithms (COAs) usually utilize a chaotic map, such as the Logistic map, to generate pseudo-random numbers mapped as the design variables for global optimization. Many existing studies have indicated that COA can escape from local minima more easily than classical stochastic optimization algorithms. This paper reveals the inherent mechanism behind the high efficiency and superior performance of COA, from the new perspective of both the probability distribution property and the search speed of chaotic sequences generated by different chaotic maps. The statistical property and search speed of chaotic sequences are represented by the probability density function (PDF) and the Lyapunov exponent, respectively. Meanwhile, the computational performances of hybrid chaos-BFGS algorithms based on eight one-dimensional chaotic maps with different PDFs and Lyapunov exponents are compared, in which BFGS is a quasi-Newton method for local optimization. Moreover, several multimodal benchmark examples illustrate that the probability distribution property and search speed of chaotic sequences from different chaotic maps significantly affect the global searching capability and optimization efficiency of COA. To achieve high efficiency of COA, it is recommended to adopt an appropriate chaotic map that generates the desired chaotic sequences with uniform or nearly uniform probability distribution and a large Lyapunov exponent.
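
    The two quantities the paper compares, the invariant probability density and the Lyapunov exponent, are easy to estimate from a single orbit. The sketch below does so for the Logistic map with r = 4, whose exact Lyapunov exponent is ln 2; only this one map is shown, not the eight used in the paper.

```python
import numpy as np

def logistic_orbit(x0, n):
    # Chaotic Logistic map x_{k+1} = 4 x_k (1 - x_k) on (0, 1).
    x = np.empty(n)
    x[0] = x0
    for k in range(n - 1):
        x[k + 1] = 4.0 * x[k] * (1.0 - x[k])
    return x

x = logistic_orbit(0.3121, 100000)

# Empirical probability density; the invariant density 1 / (pi * sqrt(x(1 - x)))
# is strongly peaked near 0 and 1, i.e. far from uniform.
hist, _ = np.histogram(x, bins=20, range=(0.0, 1.0), density=True)
print(np.round(hist, 2))

# Lyapunov exponent: average of log|f'(x_k)| with f'(x) = 4 - 8x.
lyap = np.mean(np.log(np.abs(4.0 - 8.0 * x)))
print(f"Lyapunov exponent ≈ {lyap:.3f} (theory: ln 2 ≈ 0.693)")
```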

  5. A Comparison Of The Mycin Model For Reasoning Under Uncertainty To A Probability Based Model

    NASA Astrophysics Data System (ADS)

    Neapolitan, Richard E.

    1986-03-01

    Rule-based expert systems are those in which a certain number of IF-THEN rules are assumed to hold. Based on the verity of some assertions, the rules deduce new conclusions. In many cases, neither the rules nor the assertions are known with certainty. The system must then be able to obtain a measure of partial belief in the conclusion based upon measures of partial belief in the assertions and the rule. A problem arises when two or more rules (items of evidence) argue for the same conclusion. As has been proven, certain assumptions concerning the independence of the two items of evidence are necessary before the certainties can be combined. In the current paper, it is shown how the well-known MYCIN model combines the certainties from two items of evidence. The validity of the model is then proven based on the model's assumptions of independence of evidence. The assumptions are that the evidence must be independent in the whole space, in the space of the conclusion, and in the space of the complement of the conclusion. Next, a probability-based model is described and compared to the MYCIN model. It is proven that the probabilistic assumptions for this model are weaker (independence is necessary only in the space of the conclusion and the space of the complement of the conclusion), and therefore more appealing. An example is given to show how the added assumption in the MYCIN model is, in fact, the most restrictive assumption. It is also proven that, when two rules argue for the same conclusion, the combinatoric method in a MYCIN version of the probability-based model yields a higher combined certainty than that in the MYCIN model. It is finally concluded that the probability-based model, in light of the comparison, is the better choice.
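
    For two items of evidence supporting the same conclusion, the MYCIN combining function and a naive-Bayes-style odds combination can be put side by side in a few lines. The prior and likelihood ratios below are illustrative numbers, not a calibrated translation of any particular certainty factor.

```python
def mycin_combine(cf1, cf2):
    """MYCIN combining function for two certainty factors (both non-negative)
    supporting the same conclusion."""
    return cf1 + cf2 * (1.0 - cf1)

def bayes_combine(prior, lr1, lr2):
    """Probability-based combination: multiply likelihood ratios on the odds
    scale, assuming independence of the evidence given the conclusion and
    given its complement."""
    odds = prior / (1.0 - prior) * lr1 * lr2
    return odds / (1.0 + odds)

print(mycin_combine(0.6, 0.6))        # 0.84
print(bayes_combine(0.5, 4.0, 4.0))   # about 0.94
```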

  6. Questioning the Relevance of Model-Based Probability Statements on Extreme Weather and Future Climate

    NASA Astrophysics Data System (ADS)

    Smith, L. A.

    2007-12-01

    We question the relevance of climate-model based Bayesian (or other) probability statements for decision support and impact assessment on spatial scales less than continental and temporal averages less than seasonal. Scientific assessment of higher resolution space and time scale information is urgently needed, given the commercial availability of "products" at high spatiotemporal resolution, their provision by nationally funded agencies for use both in industry decision making and governmental policy support, and their presentation to the public as matters of fact. Specifically, we seek to establish necessary conditions for probability forecasts (projections conditioned on a model structure and a forcing scenario) to be taken seriously as reflecting the probability of future real-world events. We illustrate how risk management can profitably employ imperfect models of complicated chaotic systems, following NASA's study of near-Earth PHOs (Potentially Hazardous Objects). Our climate models will never be perfect; nevertheless, the space and time scales on which they provide decision-support relevant information are expected to improve with the models themselves. Our aim is to establish a set of baselines of internal consistency; these are merely necessary conditions (not sufficient conditions) that physics-based state-of-the-art models are expected to pass if their output is to be judged decision-support relevant. Probabilistic Similarity is proposed as one goal which can be obtained even when our models are not empirically adequate. In short, probabilistic similarity requires that, given inputs similar to today's empirical observations and observational uncertainties, we expect future models to produce similar forecast distributions. Expert opinion on the space and time scales on which we might reasonably expect probabilistic similarity may prove of much greater utility than expert elicitation of uncertainty in parameter values in a model that is not empirically

  7. A New Self-Constrained Inversion Method of Potential Fields Based on Probability Tomography

    NASA Astrophysics Data System (ADS)

    Sun, S.; Chen, C.; WANG, H.; Wang, Q.

    2014-12-01

    The self-constrained inversion method of potential fields uses a priori information self-extracted from the potential field data. Differing from external a priori information, the self-extracted information are generally parameters derived exclusively from the analysis of the gravity and magnetic data (Paoletti et al., 2013). Here we develop a new self-constrained inversion method based on probability tomography. Probability tomography does not need any a priori information, nor large inversion matrix operations. Moreover, its result can describe the sources entirely and clearly, especially when their distribution is complex and irregular. Therefore, we attempt to use the a priori information extracted from the probability tomography results to constrain the inversion for physical properties. Magnetic anomaly data were taken as an example in this work. The probability tomography result of the magnetic total field anomaly (ΔΤ) shows a smoother distribution than the anomalous source and cannot display the source edges exactly. However, the gradients of ΔΤ have higher resolution than ΔΤ in their respective directions, and this characteristic is also present in their probability tomography results. So we use some rules to combine the probability tomography results of ∂ΔΤ⁄∂x, ∂ΔΤ⁄∂y and ∂ΔΤ⁄∂z into a new result which is used for extracting a priori information, and then incorporate the information into the model objective function as spatial weighting functions to invert the final magnetic susceptibility. Synthetic magnetic examples, inverted with and without the a priori information extracted from the probability tomography results, were compared; the results show that the former are more concentrated and resolve the source body edges with higher resolution. This method is finally applied in an iron mine in China with field-measured ΔΤ data and performs well. References: Paoletti, V., Ialongo, S., Florio, G., Fedi, M

  8. Flow Regime Based Climatologies of Lightning Probabilities for Spaceports and Airports

    NASA Technical Reports Server (NTRS)

    Bauman, William H., III; Volmer, Matthew; Sharp, David; Spratt, Scott; Lafosse, Richard A.

    2007-01-01

    Objective: provide forecasters with a "first guess" climatological lightning probability tool. (1) Focus on Space Shuttle landings and NWS TAFs. (2) Four circles around sites: 5-, 10-, 20- and 30-n mi. (3) Three time intervals: hourly, every 3 hr and every 6 hr. It is based on: (1) NLDN gridded data, (2) flow regime, and (3) the warm season months of May-Sep for the years 1989-2004. The gridded data and available code yield squares, not circles. Over 850 spreadsheets were converted into a manageable, user-friendly web-based GUI.

  9. Probability-Based Software for Grid Optimization: Improved Power System Operations Using Advanced Stochastic Optimization

    SciTech Connect

    2012-02-24

    GENI Project: Sandia National Laboratories is working with several commercial and university partners to develop software for market management systems (MMSs) that enable greater use of renewable energy sources throughout the grid. MMSs are used to securely and optimally determine which energy resources should be used to service energy demand across the country. Contributions of electricity to the grid from renewable energy sources such as wind and solar are intermittent, introducing complications for MMSs, which have trouble accommodating the multiple sources of price and supply uncertainties associated with bringing these new types of energy into the grid. Sandia’s software will bring a new, probability-based formulation to account for these uncertainties. By factoring in various probability scenarios for electricity production from renewable energy sources in real time, Sandia’s formula can reduce the risk of inefficient electricity transmission, save ratepayers money, conserve power, and support the future use of renewable energy.

  10. Identification of contaminant point source in surface waters based on backward location probability density function method

    NASA Astrophysics Data System (ADS)

    Cheng, Wei Ping; Jia, Yafei

    2010-04-01

    A backward location probability density function (BL-PDF) method capable of identifying the location of point sources in surface waters is presented in this paper. The relation of the forward location probability density function (FL-PDF) and the backward location probability density, based on adjoint analysis, is validated using depth-averaged free-surface flow and mass transport models and several surface water test cases. The solutions of the backward location PDF transport equation agreed well with the forward location PDF computed using the pollutant concentration at the monitoring points. Using this relation and the distribution of the concentration detected at the monitoring points, an effective point source identification method is established. The numerical error of the backward location PDF simulation is found to be sensitive to the irregularity of the computational meshes, the diffusivity, and the velocity gradients. The performance of the identification method is evaluated with regard to the random error and the number of observed values. In addition to hypothetical cases, a real case was studied to identify the source location where a dye tracer was instantaneously injected into a stream. The study indicated that the proposed source identification method is effective, robust, and quite efficient in surface waters; the number of advection-diffusion equations that need to be solved is equal to the number of observations.

  11. A multivariate copula-based framework for dealing with hazard scenarios and failure probabilities

    NASA Astrophysics Data System (ADS)

    Salvadori, G.; Durante, F.; De Michele, C.; Bernardi, M.; Petrella, L.

    2016-05-01

    This paper is of methodological nature, and deals with the foundations of Risk Assessment. Several international guidelines have recently recommended to select appropriate/relevant Hazard Scenarios in order to tame the consequences of (extreme) natural phenomena. In particular, the scenarios should be multivariate, i.e., they should take into account the fact that several variables, generally not independent, may be of interest. In this work, it is shown how a Hazard Scenario can be identified in terms of (i) a specific geometry and (ii) a suitable probability level. Several scenarios, as well as a Structural approach, are presented, and due comparisons are carried out. In addition, it is shown how the Hazard Scenario approach illustrated here is well suited to cope with the notion of Failure Probability, a tool traditionally used for design and risk assessment in engineering practice. All the results outlined throughout the work are based on the Copula Theory, which turns out to be a fundamental theoretical apparatus for doing multivariate risk assessment: formulas for the calculation of the probability of Hazard Scenarios in the general multidimensional case (d≥2) are derived, and worthy analytical relationships among the probabilities of occurrence of Hazard Scenarios are presented. In addition, the Extreme Value and Archimedean special cases are dealt with, relationships between dependence ordering and scenario levels are studied, and a counter-example concerning Tail Dependence is shown. Suitable indications for the practical application of the techniques outlined in the work are given, and two case studies illustrate the procedures discussed in the paper.
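
    For two variables, the probability of the "AND" hazard scenario (both exceed their marginal design quantiles) follows directly from the copula: P(U > u, V > v) = 1 - u - v + C(u, v). The sketch below evaluates it for a Gumbel-Hougaard copula; the quantile level and dependence parameters are illustrative, not taken from the paper's case studies.

```python
import math

def gumbel_copula(u, v, theta):
    """Gumbel-Hougaard copula C(u, v); theta >= 1, theta = 1 is independence."""
    return math.exp(-((-math.log(u)) ** theta + (-math.log(v)) ** theta) ** (1.0 / theta))

def and_scenario_prob(u, v, theta):
    """P(U > u and V > v): both variables exceed their marginal quantiles."""
    return 1.0 - u - v + gumbel_copula(u, v, theta)

# Example: both variables (say surge and rainfall) exceed their 0.99 quantiles.
u = v = 0.99
for theta in (1.0, 2.0, 5.0):
    p = and_scenario_prob(u, v, theta)
    print(f"theta = {theta}: P(AND) = {p:.5f}, joint return period ≈ {1.0 / p:.0f}")
```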

  12. Study of fusion probabilities with halo nuclei using different proximity based potentials

    NASA Astrophysics Data System (ADS)

    Kumari, Raj

    2013-11-01

    We study fusion of halo nuclei with heavy targets using proximity based potentials due to Aage Winther (AW) 95, Bass 80 and Proximity 2010. In order to consider the extended matter distribution of halo nuclei, the nuclear radii borrowed from cross-section measurements are included in these potentials. Our study reveals that the barrier heights are effectively reduced and fusion cross sections are appreciably enhanced by including the extended radii of these nuclei. We also find that the extended sizes of halos contribute towards enhancement of fusion probabilities in the case of proton halo nuclei, but contribute to transfer or break-up processes rather than fusion yield in the case of neutron halo nuclei.

  13. Forestry inventory based on multistage sampling with probability proportional to size

    NASA Technical Reports Server (NTRS)

    Lee, D. C. L.; Hernandez, P., Jr.; Shimabukuro, Y. E.

    1983-01-01

    A multistage sampling technique, with probability proportional to size, is developed for a forest volume inventory using remote sensing data. The LANDSAT data, Panchromatic aerial photographs, and field data are collected. Based on age and homogeneity, pine and eucalyptus classes are identified. Selection of tertiary sampling units is made through aerial photographs to minimize field work. The sampling errors for eucalyptus and pine ranged from 8.34 to 21.89 percent and from 7.18 to 8.60 percent, respectively.
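
    A minimal sketch of the selection step, assuming stand areas from the LANDSAT classification serve as the size measure: primary units are drawn with probability proportional to size, then equal-probability subsampling follows within each selected unit. The areas and plot counts are invented.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical primary sampling units (forest stands) with size measures,
# e.g. stand areas in hectares estimated from the LANDSAT classification.
stand_area = np.array([120.0, 45.0, 300.0, 80.0, 210.0, 60.0])

# Stage 1: draw primary units with probability proportional to size
# (with replacement, for simplicity).
p = stand_area / stand_area.sum()
primary = rng.choice(len(stand_area), size=3, replace=True, p=p)

# Stage 2: within each selected stand, draw photo plots with equal probability;
# field plots would follow as a further stage.
for unit in primary:
    plots = rng.choice(20, size=4, replace=False)   # 20 candidate plots per stand
    print(f"stand {unit}: photo plots {sorted(plots.tolist())}")
```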

  14. Probability-based damage detection using model updating with efficient uncertainty propagation

    NASA Astrophysics Data System (ADS)

    Xu, Yalan; Qian, Yu; Chen, Jianjun; Song, Gangbing

    2015-08-01

    Model updating method has received increasing attention in damage detection of structures based on measured modal parameters. In this article, a probability-based damage detection procedure is presented, in which the random factor method for non-homogeneous random field is developed and used as the forward propagation to analytically evaluate covariance matrices in each iteration step of stochastic model updating. An improved optimization algorithm is introduced to guarantee the convergence and reduce the computational effort, in which the design variables are restricted in search region by region truncation of each iteration step. The developed algorithm is illustrated by a simulated 25-bar planar truss structure and the results have been compared and verified with those obtained from Monte Carlo simulation. In order to assess the influences of uncertainty sources on the results of model updating and damage detection of structures, a comparative study is also given under different cases of uncertainties, that is, structural uncertainty only, measurement uncertainty only and combination of the two. The simulation results show the proposed method can perform well in stochastic model updating and probability-based damage detection of structures with less computational effort.

  15. 3D model retrieval using probability density-based shape descriptors.

    PubMed

    Akgül, Ceyhun Burak; Sankur, Bülent; Yemez, Yücel; Schmitt, Francis

    2009-06-01

    We address content-based retrieval of complete 3D object models by a probabilistic generative description of local shape properties. The proposed shape description framework characterizes a 3D object with sampled multivariate probability density functions of its local surface features. This density-based descriptor can be efficiently computed via kernel density estimation (KDE) coupled with fast Gauss transform. The non-parametric KDE technique allows reliable characterization of a diverse set of shapes and yields descriptors which remain relatively insensitive to small shape perturbations and mesh resolution. Density-based characterization also induces a permutation property which can be used to guarantee invariance at the shape matching stage. As proven by extensive retrieval experiments on several 3D databases, our framework provides state-of-the-art discrimination over a broad and heterogeneous set of shape categories. PMID:19372614
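
    The descriptor idea, a feature PDF estimated by KDE and sampled on a fixed grid, can be sketched in one dimension with synthetic features; the paper's descriptors are multivariate and use a fast Gauss transform rather than plain kernel density estimation.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(6)

# Hypothetical scalar local-surface feature samples from two meshes.
features_a = rng.normal(0.20, 0.10, 1500)
features_b = rng.normal(0.25, 0.12, 1500)

grid = np.linspace(-0.2, 0.8, 64)   # common evaluation grid

def descriptor(features):
    """Density-based shape descriptor: the feature PDF sampled on a fixed grid."""
    pdf = gaussian_kde(features)(grid)
    return pdf / pdf.sum()           # normalize so descriptors are comparable

da, db = descriptor(features_a), descriptor(features_b)
print(f"L1 dissimilarity between the two shapes: {np.abs(da - db).sum():.4f}")
```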

  16. Probability-based least square support vector regression metamodeling technique for crashworthiness optimization problems

    NASA Astrophysics Data System (ADS)

    Wang, Hu; Li, Enying; Li, G. Y.

    2011-03-01

    This paper presents a crashworthiness design optimization method based on a metamodeling technique. Crashworthiness optimization is a highly nonlinear and large-scale problem that involves various nonlinearities, such as geometry, material, and contact, and needs a large number of expensive evaluations. In order to obtain a robust approximation efficiently, a probability-based least squares support vector regression is suggested to construct metamodels by considering structural risk minimization. Further, to save computational cost, an intelligent sampling strategy is applied to generate sample points at the design of experiments (DOE) stage. In this paper, a cylinder and a full vehicle frontal collision are involved. The results demonstrate that the proposed metamodel-based optimization is efficient and effective in solving crashworthiness design optimization problems.

  17. Probability voting and SVM-based vehicle detection in complex background airborne traffic video

    NASA Astrophysics Data System (ADS)

    Lei, Bo; Li, Qingquan; Zhang, Zhijie; Wang, Chensheng

    2012-11-01

    This paper introduces a novel vehicle detection method that combines probability-voting-based hypothesis generation (HG) with SVM-based hypothesis verification (HV), specialized for airborne traffic video with complex backgrounds. In the HG stage, a statistics-based road area extraction method is applied and the lane marks are eliminated. The remaining areas are clustered, and the Canny algorithm is then used to detect edges in the clustered areas. A voting strategy is designed to detect rectangular objects in the scene. In the HV stage, every candidate vehicle area is rotated to align the vehicle along the vertical direction, and its vertical and horizontal gradients are calculated. An SVM is adopted to classify vehicles and non-vehicles. The proposed method has been applied to several traffic scenes, and the experimental results show that it is effective and accurate for vehicle detection.

  18. Differentiable, multi-dimensional, knowledge-based energy terms for torsion angle probabilities and propensities.

    PubMed

    Amir, El-Ad David; Kalisman, Nir; Keasar, Chen

    2008-07-01

    Rotatable torsion angles are the major degrees of freedom in proteins. Adjacent angles are highly correlated and energy terms that rely on these correlations are intensively used in molecular modeling. However, the utility of torsion based terms is not yet fully exploited. Many of these terms do not capture the full scale of the correlations. Other terms, which rely on lookup tables, cannot be used in the context of force-driven algorithms because they are not fully differentiable. This study aims to extend the usability of torsion terms by presenting a set of high-dimensional and fully-differentiable energy terms that are derived from high-resolution structures. The set includes terms that describe backbone conformational probabilities and propensities, side-chain rotamer probabilities, and an elaborate term that couples all the torsion angles within the same residue. The terms are constructed by cubic spline interpolation with periodic boundary conditions that enable full differentiability and high computational efficiency. We show that the spline implementation does not compromise the accuracy of the original database statistics. We further show that the side-chain relevant terms are compatible with established rotamer probabilities. Despite their very local characteristics, the new terms are often able to identify native and native-like structures within decoy sets. Finally, force-based minimization of NMR structures with the new terms improves their torsion angle statistics with minor structural distortion (0.5 Å RMSD on average). The new terms are freely available in the MESHI molecular modeling package. The spline coefficients are also available as a documented MATLAB file. PMID:18186478
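
    The technical ingredient is a periodic cubic spline through tabulated -log(probability) values, which yields a smooth energy and an analytic derivative across the +/-180 degree seam. The one-dimensional knot table below is invented; the published terms are multi-dimensional and fitted to structure-database statistics.

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Hypothetical knowledge-based energy for a single torsion angle: tabulated
# -log(probability) values on a 10-degree grid (illustrative numbers only).
phi_knots = np.linspace(-180.0, 180.0, 37)
energy_knots = -np.log(0.25 + 0.10 * np.cos(np.radians(2.0 * phi_knots)))
energy_knots[-1] = energy_knots[0]          # required for periodic boundary conditions

# Periodic boundary conditions keep the term and its derivative continuous
# across phi = +/-180, so it can drive force-based minimization.
spline = CubicSpline(phi_knots, energy_knots, bc_type="periodic")
dE_dphi = spline.derivative()

phi = 63.7
print(f"E({phi}) = {float(spline(phi)):.4f}, dE/dphi = {float(dE_dphi(phi)):.6f}")
```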

  19. Incorporating seasonality into event-based joint probability methods for predicting flood frequency: A hybrid causative event approach

    NASA Astrophysics Data System (ADS)

    Li, Jing; Thyer, Mark; Lambert, Martin; Kuzera, George; Metcalfe, Andrew

    2016-02-01

    Flood extremes are driven by highly variable and complex climatic and hydrological processes. Observational evidence has identified that seasonality of climate variables has a major impact on flood peaks. However, event-based joint probability approaches for predicting the flood frequency distribution (FFD), which are commonly used in practice, do not commonly incorporate climate seasonality. This study presents an advance in event-based joint probability approaches by incorporating seasonality using the hybrid causative events (HCE) approach. The HCE was chosen because it uses the true causative events of the floods of interest and is able to combine the accuracy of continuous simulation with the computational efficiency of event-based approaches. The incorporation of seasonality is evaluated using a virtual catchment approach at eight sites over a wide range of Australian climate zones, including tropical, temperate, Mediterranean and desert climates (virtual catchment data for the eight sites is freely available via digital repository). The seasonal HCE provided accurate predictions of the FFD at all sites. In contrast, the non-seasonal HCE significantly over-predicted the FFD at some sites. The need to include seasonality was influenced by the magnitude of the seasonal variation in soil moisture and its coherence with the seasonal variation in extreme rainfall. For sites with a low seasonal variation in soil moisture, the non-seasonal HCE provided reliable estimates of the FFD. For the remaining sites, it was found difficult to predict a priori whether ignoring seasonality provided a reliable estimate of the FFD, hence it is recommended that the seasonal HCE always be used. The practical implications of this study are that the HCE approach with seasonality is an accurate and efficient event-based joint probability approach to derive the flood frequency distribution across a wide range of climatologies.

  20. Protein single-model quality assessment by feature-based probability density functions.

    PubMed

    Cao, Renzhi; Cheng, Jianlin

    2016-01-01

    Protein quality assessment (QA) has played an important role in protein structure prediction. We developed a novel single-model quality assessment method-Qprob. Qprob calculates the absolute error for each protein feature value against the true quality scores (i.e. GDT-TS scores) of protein structural models, and uses them to estimate its probability density distribution for quality assessment. Qprob has been blindly tested on the 11th Critical Assessment of Techniques for Protein Structure Prediction (CASP11) as MULTICOM-NOVEL server. The official CASP result shows that Qprob ranks as one of the top single-model QA methods. In addition, Qprob makes contributions to our protein tertiary structure predictor MULTICOM, which is officially ranked 3rd out of 143 predictors. The good performance shows that Qprob is good at assessing the quality of models of hard targets. These results demonstrate that this new probability density distribution based method is effective for protein single-model quality assessment and is useful for protein structure prediction. The webserver of Qprob is available at: http://calla.rnet.missouri.edu/qprob/. The software is now freely available in the web server of Qprob. PMID:27041353

  1. Probability density function characterization for aggregated large-scale wind power based on Weibull mixtures

    DOE PAGES Beta

    Gomez-Lazaro, Emilio; Bueso, Maria C.; Kessler, Mathieu; Martin-Martinez, Sergio; Zhang, Jie; Hodge, Bri -Mathias; Molina-Garcia, Angel

    2016-02-02

    Here, the Weibull probability distribution has been widely applied to characterize wind speeds for wind energy resources. Wind power generation modeling is different, however, due in particular to power curve limitations, wind turbine control methods, and transmission system operation requirements. These differences are even greater for aggregated wind power generation in power systems with high wind penetration. Consequently, models based on one Weibull component can provide poor characterizations for aggregated wind power generation. With this aim, the present paper focuses on discussing Weibull mixtures to characterize the probability density function (PDF) for aggregated wind power generation. PDFs of wind power data are firstly classified according to hourly and seasonal patterns. The selection of the number of components in the mixture is analyzed through two well-known criteria: the Akaike information criterion (AIC) and the Bayesian information criterion (BIC). Finally, the optimal number of Weibull components for maximum likelihood is explored for the defined patterns, including the estimated weight, scale, and shape parameters. Results show that multi-Weibull models are more suitable to characterize aggregated wind power data due to the impact of distributed generation, the variety of wind speed values, and wind power curtailment.
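
    A small sketch of the model-selection step: fit a one-component Weibull and a two-component Weibull mixture to synthetic normalized power data and compare AIC and BIC. The mixture is fitted by direct likelihood maximization rather than EM, and the data are generated, not the paper's measured wind power series.

```python
import numpy as np
from scipy.stats import weibull_min
from scipy.optimize import minimize

rng = np.random.default_rng(7)

# Synthetic stand-in for normalized aggregated wind-power data.
data = np.concatenate([
    weibull_min.rvs(1.6, scale=0.25, size=1500, random_state=rng),
    weibull_min.rvs(3.0, scale=0.70, size=1000, random_state=rng),
])

def neg_loglik(params):
    w, c1, s1, c2, s2 = params
    pdf = (w * weibull_min.pdf(data, c1, scale=s1)
           + (1.0 - w) * weibull_min.pdf(data, c2, scale=s2))
    return -np.sum(np.log(pdf + 1e-300))

# One-component Weibull fit (maximum likelihood, location fixed at 0).
c, _, s = weibull_min.fit(data, floc=0.0)
ll1 = np.sum(weibull_min.logpdf(data, c, scale=s))
aic1, bic1 = 2 * 2 - 2 * ll1, 2 * np.log(len(data)) - 2 * ll1

# Two-component mixture fit by direct likelihood maximization.
res = minimize(neg_loglik, x0=[0.5, 1.5, 0.3, 2.5, 0.6], method="L-BFGS-B",
               bounds=[(0.05, 0.95), (0.5, 10), (0.01, 5), (0.5, 10), (0.01, 5)])
ll2 = -res.fun
aic2, bic2 = 2 * 5 - 2 * ll2, 5 * np.log(len(data)) - 2 * ll2

print(f"1 component:  AIC = {aic1:.1f}, BIC = {bic1:.1f}")
print(f"2 components: AIC = {aic2:.1f}, BIC = {bic2:.1f}  (lower is better)")
```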

  2. Protein single-model quality assessment by feature-based probability density functions

    PubMed Central

    Cao, Renzhi; Cheng, Jianlin

    2016-01-01

    Protein quality assessment (QA) has played an important role in protein structure prediction. We developed a novel single-model quality assessment method–Qprob. Qprob calculates the absolute error for each protein feature value against the true quality scores (i.e. GDT-TS scores) of protein structural models, and uses them to estimate its probability density distribution for quality assessment. Qprob has been blindly tested on the 11th Critical Assessment of Techniques for Protein Structure Prediction (CASP11) as MULTICOM-NOVEL server. The official CASP result shows that Qprob ranks as one of the top single-model QA methods. In addition, Qprob makes contributions to our protein tertiary structure predictor MULTICOM, which is officially ranked 3rd out of 143 predictors. The good performance shows that Qprob is good at assessing the quality of models of hard targets. These results demonstrate that this new probability density distribution based method is effective for protein single-model quality assessment and is useful for protein structure prediction. The webserver of Qprob is available at: http://calla.rnet.missouri.edu/qprob/. The software is now freely available in the web server of Qprob. PMID:27041353

  3. A Trust-Based Adaptive Probability Marking and Storage Traceback Scheme for WSNs.

    PubMed

    Liu, Anfeng; Liu, Xiao; Long, Jun

    2016-01-01

    Security is a pivotal issue for wireless sensor networks (WSNs), which are emerging as a promising platform that enables a wide range of military, scientific, industrial and commercial applications. Traceback, a key cyber-forensics technology, can play an important role in tracing and locating a malicious source to guarantee cybersecurity. In this work a trust-based adaptive probability marking and storage (TAPMS) traceback scheme is proposed to enhance security for WSNs. In a TAPMS scheme, the marking probability is adaptively adjusted according to the security requirements of the network and can substantially reduce the number of marking tuples and improve network lifetime. More importantly, a high trust node is selected to store marking tuples, which can avoid the problem of marking information being lost. Experimental results show that the total number of marking tuples can be reduced in a TAPMS scheme, thus improving network lifetime. At the same time, since the marking tuples are stored in high trust nodes, storage reliability can be guaranteed, and the traceback time can be reduced by more than 80%. PMID:27043566
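
    A minimal sketch of the two ideas, a marking probability that adapts to the security requirement and node energy, and storage of marking tuples at the highest-trust node on the path, is given below in Python; the adaptation rule, field names and parameter values are assumptions for illustration and are not the formulas of the TAPMS paper.

        import random

        # Illustrative node records: trust and residual energy both in [0, 1].
        nodes = [{"id": i, "trust": random.random(), "energy": random.random()} for i in range(20)]

        def marking_probability(node, p_base=0.05, p_max=0.30, security_level=0.5):
            """Raise the marking probability when security demand is high and the node
            has energy to spare (assumed rule, for illustration only)."""
            return min(p_max, p_base + security_level * node["energy"] * (p_max - p_base))

        def forward_packet(packet, path):
            """Probabilistic marking along a path; tuples are stored at the
            highest-trust node seen on the path instead of being carried end to end."""
            storage_node = max(path, key=lambda n: n["trust"])
            for hop, node in enumerate(path):
                if random.random() < marking_probability(node):
                    storage_node.setdefault("tuples", []).append((packet, node["id"], hop))
            return storage_node

        store = forward_packet("pkt-42", random.sample(nodes, 6))
        print("marking tuples stored at node", store["id"], "->", store.get("tuples", []))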

  4. A Trust-Based Adaptive Probability Marking and Storage Traceback Scheme for WSNs

    PubMed Central

    Liu, Anfeng; Liu, Xiao; Long, Jun

    2016-01-01

    Security is a pivotal issue for wireless sensor networks (WSNs), which are emerging as a promising platform that enables a wide range of military, scientific, industrial and commercial applications. Traceback, a key cyber-forensics technology, can play an important role in tracing and locating a malicious source to guarantee cybersecurity. In this work a trust-based adaptive probability marking and storage (TAPMS) traceback scheme is proposed to enhance security for WSNs. In a TAPMS scheme, the marking probability is adaptively adjusted according to the security requirements of the network and can substantially reduce the number of marking tuples and improve network lifetime. More importantly, a high trust node is selected to store marking tuples, which can avoid the problem of marking information being lost. Experimental results show that the total number of marking tuples can be reduced in a TAPMS scheme, thus improving network lifetime. At the same time, since the marking tuples are stored in high trust nodes, storage reliability can be guaranteed, and the traceback time can be reduced by more than 80%. PMID:27043566

  5. Monte Carlo based protocol for cell survival and tumour control probability in BNCT

    NASA Astrophysics Data System (ADS)

    Ye, Sung-Joon

    1999-02-01

    A mathematical model to calculate the theoretical cell survival probability (nominally, the cell survival fraction) is developed to evaluate preclinical treatment conditions for boron neutron capture therapy (BNCT). A treatment condition is characterized by the neutron beam spectra, single or bilateral exposure, and the choice of boron carrier drug (boronophenylalanine (BPA) or boron sulfhydryl hydride (BSH)). The cell survival probability defined from Poisson statistics is expressed with the cell-killing yield, the (n,α) reaction density, and the tolerable neutron fluence. The radiation transport calculation from the neutron source to tumours is carried out using Monte Carlo methods: (i) reactor-based BNCT facility modelling to yield the neutron beam library at an irradiation port; (ii) dosimetry to limit the neutron fluence below a tolerance dose (10.5 Gy-Eq); (iii) calculation of the (n,α) reaction density in tumours. A shallow surface tumour could be effectively treated by single exposure producing an average cell survival probability of - for probable ranges of the cell-killing yield for the two drugs, while a deep tumour will require bilateral exposure to achieve comparable cell kills at depth. With very pure epithermal beams eliminating thermal, low epithermal and fast neutrons, the cell survival can be decreased by factors of 2-10 compared with
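
    The Poisson step can be illustrated with a short calculation: if lethal events per cell are Poisson-distributed with mean equal to a cell-killing yield per (n,α) reaction times the number of reactions per cell, survival is the probability of zero events. The numbers below are invented for illustration and are not the paper's values, and the real model couples this to the Monte Carlo dose and fluence limits.

        import numpy as np

        cell_killing_yield = 2.0e-3                               # lethal events per (n,alpha) reaction (assumed)
        reactions_per_cell = np.array([500.0, 1500.0, 3000.0])    # (n,alpha) reactions per cell (assumed)

        mean_lethal_events = cell_killing_yield * reactions_per_cell
        survival_probability = np.exp(-mean_lethal_events)        # Poisson P(0 lethal events)

        for r, s in zip(reactions_per_cell, survival_probability):
            print(f"{r:6.0f} reactions/cell -> survival {s:.3e}")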

  6. Location Prediction Based on Transition Probability Matrices Constructing from Sequential Rules for Spatial-Temporal K-Anonymity Dataset

    PubMed Central

    Liu, Zhao; Zhu, Yunhong; Wu, Chenxue

    2016-01-01

    Spatial-temporal k-anonymity has become a mainstream approach among techniques for protection of users’ privacy in location-based services (LBS) applications, and has been applied to several variants such as LBS snapshot queries and continuous queries. Analyzing large-scale spatial-temporal anonymity sets may benefit several LBS applications. In this paper, we propose two location prediction methods based on transition probability matrices constructed from sequential rules for spatial-temporal k-anonymity datasets. First, we define single-step sequential rules mined from sequential spatial-temporal k-anonymity datasets generated from continuous LBS queries for multiple users. We then construct transition probability matrices from mined single-step sequential rules, and normalize the transition probabilities in the transition matrices. Next, we regard a mobility model for an LBS requester as a stationary stochastic process and compute the n-step transition probability matrices by raising the normalized transition probability matrices to the power n. Furthermore, we propose two location prediction methods: rough prediction and accurate prediction. The former obtains the probabilities of arriving at target locations along simple paths that include only current locations, target locations and transition steps. By iteratively combining the probabilities for simple paths with n steps and the probabilities for detailed paths with n-1 steps, the latter method calculates transition probabilities for detailed paths with n steps from current locations to target locations. Finally, we conduct extensive experiments, and the correctness and flexibility of our proposed algorithms have been verified. PMID:27508502

  7. Location Prediction Based on Transition Probability Matrices Constructing from Sequential Rules for Spatial-Temporal K-Anonymity Dataset.

    PubMed

    Zhang, Haitao; Chen, Zewei; Liu, Zhao; Zhu, Yunhong; Wu, Chenxue

    2016-01-01

    Spatial-temporal k-anonymity has become a mainstream approach among techniques for protection of users' privacy in location-based services (LBS) applications, and has been applied to several variants such as LBS snapshot queries and continuous queries. Analyzing large-scale spatial-temporal anonymity sets may benefit several LBS applications. In this paper, we propose two location prediction methods based on transition probability matrices constructed from sequential rules for spatial-temporal k-anonymity datasets. First, we define single-step sequential rules mined from sequential spatial-temporal k-anonymity datasets generated from continuous LBS queries for multiple users. We then construct transition probability matrices from mined single-step sequential rules, and normalize the transition probabilities in the transition matrices. Next, we regard a mobility model for an LBS requester as a stationary stochastic process and compute the n-step transition probability matrices by raising the normalized transition probability matrices to the power n. Furthermore, we propose two location prediction methods: rough prediction and accurate prediction. The former obtains the probabilities of arriving at target locations along simple paths that include only current locations, target locations and transition steps. By iteratively combining the probabilities for simple paths with n steps and the probabilities for detailed paths with n-1 steps, the latter method calculates transition probabilities for detailed paths with n steps from current locations to target locations. Finally, we conduct extensive experiments, and the correctness and flexibility of our proposed algorithms have been verified. PMID:27508502
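
    The "rough prediction" step reduces to a standard Markov chain calculation: build a single-step transition matrix from the mined rules, row-normalize it, and raise it to the power n for n-step arrival probabilities. The sketch below (Python/NumPy) uses toy location sequences in place of rules mined from a real k-anonymity dataset.

        import numpy as np

        # Toy location-ID sequences standing in for mined single-step sequential rules.
        sequences = [[0, 1, 2, 2, 3], [0, 2, 3, 1], [1, 2, 3, 3, 0], [2, 3, 0, 1]]

        n_loc = 4
        counts = np.zeros((n_loc, n_loc))
        for seq in sequences:
            for a, b in zip(seq[:-1], seq[1:]):
                counts[a, b] += 1

        # Row-normalized single-step transition probability matrix.
        P = counts / counts.sum(axis=1, keepdims=True)

        # n-step transition probabilities for a stationary mobility model: P**n.
        n = 3
        P_n = np.linalg.matrix_power(P, n)

        current, target = 0, 3
        print(f"P(reach location {target} from {current} in {n} steps) = {P_n[current, target]:.3f}")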

  8. A unified optical damage criterion based on the probability density distribution of detector signals

    NASA Astrophysics Data System (ADS)

    Somoskoi, T.; Vass, Cs.; Mero, M.; Mingesz, R.; Bozoki, Z.; Osvay, K.

    2013-11-01

    Various methods and procedures have been developed so far to test laser induced optical damage. The question naturally arises, that what are the respective sensitivities of these diverse methods. To make a suitable comparison, both the processing of the measured primary signal has to be at least similar to the various methods, and one needs to establish a proper damage criterion, which has to be universally applicable for every method. We defined damage criteria based on the probability density distribution of the obtained detector signals. This was determined by the kernel density estimation procedure. We have tested the entire evaluation procedure in four well-known detection techniques: direct observation of the sample by optical microscopy; monitoring of the change in the light scattering power of the target surface and the detection of the generated photoacoustic waves both in the bulk of the sample and in the surrounding air.
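
    One way such a criterion can work, sketched below in Python with SciPy on synthetic detector readings, is to estimate the probability density of signals from shots known not to damage the sample with a kernel density estimate and to flag a shot as damage when its signal falls where that density is very low; the specific threshold rule here is an assumption for illustration, not the authors' definition.

        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(2)

        # Reference detector signals from shots that did not damage the sample (synthetic).
        reference = rng.normal(1.0, 0.05, size=300)
        kde = gaussian_kde(reference)

        def is_damage(signal, tail_quantile=0.001):
            """Flag damage when the estimated density at the signal is lower than the
            density at the extreme quantiles of the reference set (assumed criterion)."""
            threshold = min(kde(np.quantile(reference, tail_quantile))[0],
                            kde(np.quantile(reference, 1 - tail_quantile))[0])
            return kde(signal)[0] < threshold

        print(is_damage(1.02))   # inside the bulk of the reference signals -> False
        print(is_damage(1.60))   # far outside the reference distribution   -> True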

  9. The estimation of neurotransmitter release probability in feedforward neuronal network based on adaptive synchronization

    NASA Astrophysics Data System (ADS)

    Xue, Ming; Wang, Jiang; Jia, Chenhui; Yu, Haitao; Deng, Bin; Wei, Xile; Che, Yanqiu

    2013-03-01

    In this paper, we proposed a new approach to estimate unknown parameters and topology of a neuronal network based on the adaptive synchronization control scheme. A virtual neuronal network is constructed as an observer to track the membrane potential of the corresponding neurons in the original network. When they achieve synchronization, the unknown parameters and topology of the original network are obtained. The method is applied to estimate the real-time status of the connection in the feedforward network and the neurotransmitter release probability of unreliable synapses is obtained by statistic computation. Numerical simulations are also performed to demonstrate the effectiveness of the proposed adaptive controller. The obtained results may have important implications in system identification in neural science.

  10. Estimation of the failure probability during EGS stimulation based on borehole data

    NASA Astrophysics Data System (ADS)

    Meller, C.; Kohl, Th.; Gaucher, E.

    2012-04-01

    In recent times the search for alternative sources of energy has been fostered by the scarcity of fossil fuels. With its ability to permanently provide electricity or heat with little emission of CO2, geothermal energy will have an important share in the energy mix of the future. Within Europe, scientists identified many locations with conditions suitable for Enhanced Geothermal System (EGS) projects. In order to provide sufficiently high reservoir permeability, EGS require borehole stimulations prior to installation of power plants (Gérard et al, 2006). Induced seismicity during water injection into reservoirs EGS systems is a factor that currently cannot be predicted nor controlled. Often, people living near EGS projects are frightened by smaller earthquakes occurring during stimulation or injection. As this fear can lead to widespread disapproval of geothermal power plants, it is appreciable to find a way to estimate the probability of fractures to shear when injecting water with a distinct pressure into a geothermal reservoir. This provides knowledge, which enables to predict the mechanical behavior of a reservoir in response to a change in pore pressure conditions. In the present study an approach for estimation of the shearing probability based on statistical analyses of fracture distribution, orientation and clusters, together with their geological properties is proposed. Based on geophysical logs of five wells in Soultz-sous-Forêts, France, and with the help of statistical tools, the Mohr criterion, geological and mineralogical properties of the host rock and the fracture fillings, correlations between the wells are analyzed. This is achieved with the self-written MATLAB-code Fracdens, which enables us to statistically analyze the log files in different ways. With the application of a pore pressure change, the evolution of the critical pressure on the fractures can be determined. A special focus is on the clay fillings of the fractures and how they reduce

  11. Binomial probability distribution model-based protein identification algorithm for tandem mass spectrometry utilizing peak intensity information.

    PubMed

    Xiao, Chuan-Le; Chen, Xiao-Zhou; Du, Yang-Li; Sun, Xuesong; Zhang, Gong; He, Qing-Yu

    2013-01-01

    Mass spectrometry has become one of the most important technologies in proteomic analysis. Tandem mass spectrometry (LC-MS/MS) is a major tool for the analysis of peptide mixtures from protein samples. The key step of MS data processing is the identification of peptides from experimental spectra by searching public sequence databases. Although a number of algorithms to identify peptides from MS/MS data have already been proposed, e.g. Sequest, OMSSA, X!Tandem, Mascot, etc., they are mainly based on statistical models considering only peak matches between experimental and theoretical spectra, but not peak intensity information. Moreover, different algorithms give different results from the same MS data, implying their probable incompleteness and questionable reproducibility. We developed a novel peptide identification algorithm, ProVerB, based on a binomial probability distribution model of protein tandem mass spectrometry combined with a new scoring function, making full use of peak intensity information and, thus, enhancing the ability of identification. Compared with Mascot, Sequest, and SQID, ProVerB identified significantly more peptides from LC-MS/MS data sets than the current algorithms at 1% False Discovery Rate (FDR) and provided more confident peptide identifications. ProVerB is also compatible with various platforms and experimental data sets, showing its robustness and versatility. The open-source program ProVerB is available at http://bioinformatics.jnu.edu.cn/software/proverb/. PMID:23163785
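
    The binomial idea can be sketched as follows: if each predicted fragment peak has probability p of matching an experimental peak by chance, the chance of at least k matches out of n follows a binomial tail, and its negative log can serve as a match score. The value of p and the two candidate peptides below are invented for illustration, and the intensity weighting that is central to ProVerB is omitted, so this is not the published algorithm.

        import math
        from scipy.stats import binom

        def binomial_match_score(n_theoretical, n_matched, p_random):
            """-log10 of the binomial probability of observing at least n_matched
            peak matches by chance among n_theoretical predicted fragment peaks."""
            log_tail = binom.logsf(n_matched - 1, n_theoretical, p_random)  # ln P(X >= n_matched)
            return -log_tail / math.log(10)

        p_random = 0.04   # assumed chance of a random m/z match within tolerance
        print("candidate A:", round(binomial_match_score(20, 12, p_random), 1))  # 12 of 20 matched
        print("candidate B:", round(binomial_match_score(18, 5, p_random), 1))   # 5 of 18 matched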

  12. Influence of sampling intake position on suspended solid measurements in sewers: two probability/time-series-based approaches.

    PubMed

    Sandoval, Santiago; Bertrand-Krajewski, Jean-Luc

    2016-06-01

    Total suspended solid (TSS) measurements in urban drainage systems are required for several reasons. Aiming to assess uncertainties in the mean TSS concentration due to the influence of sampling intake vertical position and vertical concentration gradients in a sewer pipe, two methods are proposed: a simplified method based on a theoretical vertical concentration profile (SM) and a time series grouping method (TSM). SM is based on flow rate and water depth time series. TSM requires additional TSS time series as input data. All time series are from the Chassieu urban catchment in Lyon, France (time series from 2007 with 2-min time step, 89 rainfall events). The probability of measuring a TSS value lower than the mean TSS along the vertical cross section (TSS underestimation) is about 0.88 with SM and about 0.64 with TSM. TSM shows more realistic TSS underestimation values (about 39 %) than SM (about 269 %). Interquartile ranges (IQR) over the probability values indicate that SM is more uncertain (IQR = 0.08) than TSM (IQR = 0.02). Differences between the two methods are mainly due to simplifications in SM (absence of TSS measurements). SM assumes a significant asymmetry of the TSS concentration profile along the vertical axis in the cross section. This is compatible with the distribution of TSS measurements found in the TSM approach. The methods provide insights towards an indicator of the measurement performance and representativeness for a TSS sampling protocol. PMID:27178049

  13. A generic probability based algorithm to derive regional patterns of crops in time and space

    NASA Astrophysics Data System (ADS)

    Wattenbach, Martin; Oijen, Marcel v.; Leip, Adrian; Hutchings, Nick; Balkovic, Juraj; Smith, Pete

    2013-04-01

    Croplands are not only key to human food supply; they also change the biophysical and biogeochemical properties of the land surface, leading to changes in the water cycle and energy partitioning, influencing soil erosion, and contributing substantially to the amount of greenhouse gases entering the atmosphere. The effects of croplands on the environment depend on the type of crop and the associated management, both of which are related to the site conditions, economic boundary settings, and preferences of individual farmers. However, at a given point in time the pattern of crops in a landscape is not only determined by environmental and socioeconomic conditions but also by the compatibility with the crops which had been grown in the years before on the current field and its surrounding cropping area. Crop compatibility is driven by factors like pests and diseases, crop-driven changes in soil structure, and the timing of cultivation steps. Given these effects of crops on the biogeochemical cycle and their interdependence with the mentioned boundary conditions, there is a demand in the regional and global modelling community to account for these regional patterns. Here we present a Bayesian crop distribution generator algorithm that is used to calculate the combined and conditional probability for a crop to appear in time and space using sparse and disparate information. The input information to define the most probable crop per year and grid cell is based on combined probabilities derived from a crop transition matrix representing good agricultural practice, crop-specific soil suitability derived from the European soil database, and statistical information about harvested area from the Eurostat database. The reported Eurostat crop area also provides the target proportion to be matched by the algorithm at the level of administrative units (Nomenclature des Unités Territoriales Statistiques - NUTS). The algorithm is applied to the EU27 to derive regional spatial and
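
    In the spirit of the combination step (not the published algorithm, whose iterative matching to NUTS-level area statistics is omitted here), the sketch below multiplies a rotation-based transition probability, a crop-specific soil suitability weight and a regional area share into a conditional crop probability for one grid cell; all numbers and names are illustrative assumptions.

        import numpy as np

        crops = ["wheat", "maize", "rapeseed"]

        # Assumed inputs for one grid cell; row = last year's crop (good-practice rotation).
        transition = np.array([[0.2, 0.5, 0.3],
                               [0.6, 0.1, 0.3],
                               [0.7, 0.2, 0.1]])
        soil_suitability = np.array([0.9, 0.4, 0.8])     # per crop, from a soil database
        regional_share = np.array([0.55, 0.30, 0.15])    # harvested-area share in the NUTS unit

        def crop_probabilities(previous_crop):
            """Combine the three information sources into a conditional probability per crop."""
            prior = transition[crops.index(previous_crop)]
            combined = prior * soil_suitability * regional_share
            return combined / combined.sum()

        print(dict(zip(crops, np.round(crop_probabilities("wheat"), 3))))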

  14. Use of probability based sampling of water quality indicators in supporting water quality criteria development - 2/28/08

    EPA Science Inventory

    We examine the proposition that water quality indicator data collected from large scale, probability based assessments of coastal condition such as the US Environmental Protection Agency National Coastal Assessment (NCA) can be used to support water quality criteria development f...

  15. Experience-Based Probabilities Modulate Expectations in a Gender-Coded Artificial Language.

    PubMed

    Öttl, Anton; Behne, Dawn M

    2016-01-01

    The current study combines artificial language learning with visual world eyetracking to investigate acquisition of representations associating spoken words and visual referents using morphologically complex pseudowords. Pseudowords were constructed to consistently encode referential gender by means of suffixation for a set of imaginary figures that could be either male or female. During training, the frequency of exposure to pseudowords and their imaginary figure referents were manipulated such that a given word and its referent would be more likely to occur in either the masculine form or the feminine form, or both forms would be equally likely. Results show that these experience-based probabilities affect the formation of new representations to the extent that participants were faster at recognizing a referent whose gender was consistent with the induced expectation than a referent whose gender was inconsistent with this expectation. Disambiguating gender information available from the suffix did not mask the induced expectations. Eyetracking data provide additional evidence that such expectations surface during online lexical processing. Taken together, these findings indicate that experience-based information is accessible during the earliest stages of processing, and are consistent with the view that language comprehension depends on the activation of perceptual memory traces. PMID:27602009

  16. Experience-Based Probabilities Modulate Expectations in a Gender-Coded Artificial Language

    PubMed Central

    Öttl, Anton; Behne, Dawn M.

    2016-01-01

    The current study combines artificial language learning with visual world eyetracking to investigate acquisition of representations associating spoken words and visual referents using morphologically complex pseudowords. Pseudowords were constructed to consistently encode referential gender by means of suffixation for a set of imaginary figures that could be either male or female. During training, the frequency of exposure to pseudowords and their imaginary figure referents were manipulated such that a given word and its referent would be more likely to occur in either the masculine form or the feminine form, or both forms would be equally likely. Results show that these experience-based probabilities affect the formation of new representations to the extent that participants were faster at recognizing a referent whose gender was consistent with the induced expectation than a referent whose gender was inconsistent with this expectation. Disambiguating gender information available from the suffix did not mask the induced expectations. Eyetracking data provide additional evidence that such expectations surface during online lexical processing. Taken together, these findings indicate that experience-based information is accessible during the earliest stages of processing, and are consistent with the view that language comprehension depends on the activation of perceptual memory traces. PMID:27602009

  17. Global climate change model natural climate variation: Paleoclimate data base, probabilities and astronomic predictors

    SciTech Connect

    Kukla, G.; Gavin, J.

    1994-05-01

    This report was prepared at the Lamont-Doherty Geological Observatory of Columbia University at Palisades, New York, under subcontract to Pacific Northwest Laboratory (PNL). It is part of a larger project of global climate studies which supports site characterization work required for the selection of a potential high-level nuclear waste repository and forms part of the Performance Assessment Scientific Support (PASS) Program at PNL. The work under the PASS Program is currently focusing on the proposed site at Yucca Mountain, Nevada, and is under the overall direction of the Yucca Mountain Project Office, US Department of Energy, Las Vegas, Nevada. The final results of the PNL project will provide input to global atmospheric models designed to test specific climate scenarios which will be used in the site-specific modeling work of others. The primary purpose of the data bases compiled and of the astronomic predictive models is to aid in the estimation of the probabilities of future climate states. The results will be used by two other teams working on the global climate study under contract to PNL, located at the University of Maine in Orono, Maine, and the Applied Research Corporation in College Station, Texas. This report presents the results of the third year's work on the global climate change models and the data bases describing past climates.

  18. Probability distribution function-based classification of structural MRI for the detection of Alzheimer's disease.

    PubMed

    Beheshti, I; Demirel, H

    2015-09-01

    High-dimensional classification methods have been a major target of machine learning for the automatic classification of patients who suffer from Alzheimer's disease (AD). One major issue of automatic classification is the feature-selection method from high-dimensional data. In this paper, a novel approach for statistical feature reduction and selection in high-dimensional magnetic resonance imaging (MRI) data based on the probability distribution function (PDF) is introduced. To develop an automatic computer-aided diagnosis (CAD) technique, this research explores the statistical patterns extracted from structural MRI (sMRI) data on four systematic levels. First, global and local differences of gray matter in patients with AD compared to healthy controls (HCs) using the voxel-based morphometric (VBM) technique with 3-Tesla 3D T1-weighted MRI are investigated. Second, feature extraction based on the voxel clusters detected by VBM on sMRI and voxel values as volume of interest (VOI) is used. Third, a novel statistical feature-selection process is employed, utilizing the PDF of the VOI to represent statistical patterns of the respective high-dimensional sMRI sample. Finally, the proposed feature-selection method for early detection of AD with support vector machine (SVM) classifiers compared to other standard feature selection methods, such as partial least squares (PLS) techniques, is assessed. The performance of the proposed technique is evaluated using 130 AD and 130 HC MRI data from the ADNI dataset with 10-fold cross-validation. The results show that the PDF-based feature selection approach is a reliable technique that is highly competitive with respect to the state-of-the-art techniques in classifying AD from high-dimensional sMRI samples. PMID:26226415
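
    The shape of such a pipeline, summarizing each subject's volume-of-interest voxel values by a histogram-based PDF and feeding the bins to an SVM with 10-fold cross-validation, is sketched below on synthetic data (Python with scikit-learn); the VBM step, the real ADNI images and the paper's exact PDF construction are all replaced by assumptions.

        import numpy as np
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(3)

        def voi_pdf(voxels, bins=32, value_range=(0.0, 1.0)):
            """Histogram-based PDF of the voxel values in a volume of interest."""
            hist, _ = np.histogram(voxels, bins=bins, range=value_range, density=True)
            return hist

        # Synthetic stand-in for VOI gray-matter values: 60 "patients" vs 60 "controls".
        patients = [voi_pdf(rng.beta(2.0, 3.0, size=2000)) for _ in range(60)]
        controls = [voi_pdf(rng.beta(2.3, 3.0, size=2000)) for _ in range(60)]

        X = np.vstack(patients + controls)
        y = np.array([1] * 60 + [0] * 60)

        scores = cross_val_score(SVC(kernel="linear", C=1.0), X, y, cv=10)
        print("10-fold accuracy: %.2f +/- %.2f" % (scores.mean(), scores.std()))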

  19. Moment-Based Probability Modeling and Extreme Response Estimation, The FITS Routine Version 1.2

    SciTech Connect

    MANUEL,LANCE; KASHEF,TINA; WINTERSTEIN,STEVEN R.

    1999-11-01

    This report documents the use of the FITS routine, which provides automated fits of various analytical, commonly used probability models from input data. It is intended to complement the previously distributed FITTING routine documented in RMS Report 14 (Winterstein et al., 1994), which implements relatively complex four-moment distribution models whose parameters are fit with numerical optimization routines. Although these four-moment fits can be quite useful and faithful to the observed data, their complexity can make them difficult to automate within standard fitting algorithms. In contrast, FITS provides more robust (lower moment) fits of simpler, more conventional distribution forms. For each database of interest, the routine estimates the distribution of annual maximum response based on the data values and the duration, T, over which they were recorded. To focus on the upper tails of interest, the user can also supply an arbitrary lower-bound threshold, χ_low, above which a shifted distribution model (exponential or Weibull) is fit.
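
    A generic sketch of this kind of threshold fit (not the FORTRAN routine itself) is shown below in Python: keep the exceedances above χ_low, fit a shifted Weibull to them, and convert the observed exceedance rate over the duration T into an annual-maximum exceedance probability under a Poisson assumption; the synthetic record and the chosen levels are illustrative.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(4)

        # Synthetic response record: hourly values over a duration of T = 1 year.
        T_years = 1.0
        response = rng.gumbel(loc=10.0, scale=2.0, size=8760)

        chi_low = np.quantile(response, 0.95)              # user-supplied lower-bound threshold
        exceed = response[response > chi_low] - chi_low    # shifted data above the threshold

        # Shifted Weibull fit to the exceedances (location fixed at zero).
        shape, _, scale = stats.weibull_min.fit(exceed, floc=0)
        rate_per_year = len(exceed) / T_years

        def prob_annual_max_exceeds(x):
            """P(annual maximum > x), treating exceedances as a Poisson process."""
            p_single = stats.weibull_min.sf(x - chi_low, shape, scale=scale)
            return 1.0 - np.exp(-rate_per_year * p_single)

        for level in (28.0, 32.0, 36.0):
            print(f"P(annual max > {level:.0f}) = {prob_annual_max_exceeds(level):.3f}")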

  20. A physically-based earthquake recurrence model for estimation of long-term earthquake probabilities

    USGS Publications Warehouse

    Ellsworth, William L.; Matthews, Mark V.; Nadeau, Robert M.; Nishenko, Stuart P.; Reasenberg, Paul A.; Simpson, Robert W.

    1999-01-01

    A physically-motivated model for earthquake recurrence based on the Brownian relaxation oscillator is introduced. The renewal process defining this point process model can be described by the steady rise of a state variable from the ground state to failure threshold as modulated by Brownian motion. Failure times in this model follow the Brownian passage time (BPT) distribution, which is specified by the mean time to failure, μ, and the aperiodicity of the mean, α (equivalent to the familiar coefficient of variation). Analysis of 37 series of recurrent earthquakes, M -0.7 to 9.2, suggests a provisional generic value of α = 0.5. For this value of α, the hazard function (instantaneous failure rate of survivors) exceeds the mean rate for times > μ⁄2, and is approximately 2⁄μ for all times > μ. Application of this model to the next M 6 earthquake on the San Andreas fault at Parkfield, California suggests that the annual probability of the earthquake is between 1:10 and 1:13.
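
    The BPT distribution is the inverse Gaussian with mean μ and shape parameter μ/α², so a time-dependent probability of the next event can be sketched in a few lines (Python with SciPy). The μ, α and elapsed-time values below are illustrative assumptions, not the Parkfield estimates.

        import numpy as np
        from scipy.stats import invgauss

        mu, alpha = 140.0, 0.5               # mean recurrence (years) and aperiodicity (assumed)
        lam = mu / alpha**2                  # inverse-Gaussian shape parameter
        bpt = invgauss(mu / lam, scale=lam)  # BPT == inverse Gaussian with mean mu, shape lam

        def conditional_prob(t_elapsed, window=30.0):
            """P(event within the next `window` years | no event in the first t_elapsed years)."""
            return (bpt.cdf(t_elapsed + window) - bpt.cdf(t_elapsed)) / bpt.sf(t_elapsed)

        for t in (50.0, 100.0, 150.0):
            print(f"elapsed {t:5.0f} yr -> P(event within 30 yr) = {conditional_prob(t):.2f}")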

  1. Micro-object motion tracking based on the probability hypothesis density particle tracker.

    PubMed

    Shi, Chunmei; Zhao, Lingling; Wang, Junjie; Zhang, Chiping; Su, Xiaohong; Ma, Peijun

    2016-04-01

    Tracking micro-objects in the noisy microscopy image sequences is important for the analysis of dynamic processes in biological objects. In this paper, an automated tracking framework is proposed to extract the trajectories of micro-objects. This framework uses a probability hypothesis density particle filtering (PF-PHD) tracker to implement a recursive state estimation and trajectories association. In order to increase the efficiency of this approach, an elliptical target model is presented to describe the micro-objects using shape parameters instead of point-like targets which may cause inaccurate tracking. A novel likelihood function, not only covering the spatiotemporal distance but also dealing with geometric shape function based on the Mahalanobis norm, is proposed to improve the accuracy of particle weight in the update process of the PF-PHD tracker. Using this framework, a larger number of tracks are obtained. The experiments are performed on simulated data of microtubule movements and real mouse stem cells. We compare the PF-PHD tracker with the nearest neighbor method and the multiple hypothesis tracking method. Our PF-PHD tracker can simultaneously track hundreds of micro-objects in the microscopy image sequence. PMID:26084407

  2. Visualization and probability-based scoring of structural variants within repetitive sequences

    PubMed Central

    Halper-Stromberg, Eitan; Steranka, Jared; Burns, Kathleen H.; Sabunciyan, Sarven; Irizarry, Rafael A.

    2014-01-01

    Motivation: Repetitive sequences account for approximately half of the human genome. Accurately ascertaining sequences in these regions with next generation sequencers is challenging, and requires a different set of analytical techniques than for reads originating from unique sequences. Complicating the matter are repetitive regions subject to programmed rearrangements, as is the case with the antigen-binding domains in the Immunoglobulin (Ig) and T-cell receptor (TCR) loci. Results: We developed a probability-based score and visualization method to aid in distinguishing true structural variants from alignment artifacts. We demonstrate the usefulness of this method in its ability to separate real structural variants from false positives generated with existing upstream analysis tools. We validated our approach using both target-capture and whole-genome experiments. Capture sequencing reads were generated from primary lymphoid tumors, cancer cell lines and an EBV-transformed lymphoblast cell line over the Ig and TCR loci. Whole-genome sequencing reads were from a lymphoblastoid cell-line. Availability: We implement our method as an R package available at https://github.com/Eitan177/targetSeqView. Code to reproduce the figures and results are also available. Contact: ehalper2@jhmi.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24501098

  3. Inverse modeling of hydraulic tests in fractured crystalline rock based on a transition probability geostatistical approach

    NASA Astrophysics Data System (ADS)

    Blessent, Daniela; Therrien, René; Lemieux, Jean-Michel

    2011-12-01

    This paper presents numerical simulations of a series of hydraulic interference tests conducted in crystalline bedrock at Olkiluoto (Finland), a potential site for the disposal of the Finnish high-level nuclear waste. The tests are in a block of crystalline bedrock of about 0.03 km3 that contains low-transmissivity fractures. Fracture density, orientation, and fracture transmissivity are estimated from Posiva Flow Log (PFL) measurements in boreholes drilled in the rock block. On the basis of those data, a geostatistical approach relying on a transitional probability and Markov chain models is used to define a conceptual model based on stochastic fractured rock facies. Four facies are defined, from sparsely fractured bedrock to highly fractured bedrock. Using this conceptual model, three-dimensional groundwater flow is then simulated to reproduce interference pumping tests in either open or packed-off boreholes. Hydraulic conductivities of the fracture facies are estimated through automatic calibration using either hydraulic heads or both hydraulic heads and PFL flow rates as targets for calibration. The latter option produces a narrower confidence interval for the calibrated hydraulic conductivities, therefore reducing the associated uncertainty and demonstrating the usefulness of the measured PFL flow rates. Furthermore, the stochastic facies conceptual model is a suitable alternative to discrete fracture network models to simulate fluid flow in fractured geological media.

  4. Probability based remaining capacity estimation using data-driven and neural network model

    NASA Astrophysics Data System (ADS)

    Wang, Yujie; Yang, Duo; Zhang, Xu; Chen, Zonghai

    2016-05-01

    Since large numbers of lithium-ion batteries are assembled into packs and the batteries are complex electrochemical devices, their monitoring and safety are key issues for the application of battery technology. An accurate estimation of battery remaining capacity is crucial for optimization of the vehicle control, preventing the battery from over-charging and over-discharging and ensuring safety during its service life. The remaining capacity estimation of a battery includes the estimation of state-of-charge (SOC) and state-of-energy (SOE). In this work, a probability based adaptive estimator is presented to obtain accurate and reliable estimation results for both SOC and SOE. For the SOC estimation, an nth-order RC equivalent circuit model is employed, combined with an electrochemical model, to obtain more accurate voltage prediction results. For the SOE estimation, a sliding window neural network model is proposed to investigate the relationship between the terminal voltage and the model inputs. To verify the accuracy and robustness of the proposed model and estimation algorithm, experiments under different dynamic operation current profiles are performed on commercial 1665130-type lithium-ion batteries. The results illustrate that accurate and robust estimation can be obtained by the proposed method.

  5. SAR amplitude probability density function estimation based on a generalized Gaussian model.

    PubMed

    Moser, Gabriele; Zerubia, Josiane; Serpico, Sebastiano B

    2006-06-01

    In the context of remotely sensed data analysis, an important problem is the development of accurate models for the statistics of the pixel intensities. Focusing on synthetic aperture radar (SAR) data, this modeling process turns out to be a crucial task, for instance, for classification or for denoising purposes. In this paper, an innovative parametric estimation methodology for SAR amplitude data is proposed that adopts a generalized Gaussian (GG) model for the complex SAR backscattered signal. A closed-form expression for the corresponding amplitude probability density function (PDF) is derived and a specific parameter estimation algorithm is developed in order to deal with the proposed model. Specifically, the recently proposed "method-of-log-cumulants" (MoLC) is applied, which stems from the adoption of the Mellin transform (instead of the usual Fourier transform) in the computation of characteristic functions and from the corresponding generalization of the concepts of moment and cumulant. For the developed GG-based amplitude model, the resulting MoLC estimates turn out to be numerically feasible and are also analytically proved to be consistent. The proposed parametric approach was validated by using several real ERS-1, XSAR, E-SAR, and NASA/JPL airborne SAR images, and the experimental results prove that the method models the amplitude PDF better than several previously proposed parametric models for backscattering phenomena. PMID:16764268

  6. In search of a statistical probability model for petroleum-resource assessment : a critique of the probabilistic significance of certain concepts and methods used in petroleum-resource assessment : to that end, a probabilistic model is sketched

    USGS Publications Warehouse

    Grossling, Bernardo F.

    1975-01-01

    Exploratory drilling is still in incipient or youthful stages in those areas of the world where the bulk of the potential petroleum resources is yet to be discovered. Methods of assessing resources from projections based on historical production and reserve data are limited to mature areas. For most of the world's petroleum-prospective areas, a more speculative situation calls for a critical review of resource-assessment methodology. The language of mathematical statistics is required to define more rigorously the appraisal of petroleum resources. Basically, two approaches have been used to appraise the amounts of undiscovered mineral resources in a geologic province: (1) projection models, which use statistical data on the past outcome of exploration and development in the province; and (2) estimation models of the overall resources of the province, which use certain known parameters of the province together with the outcome of exploration and development in analogous provinces. These two approaches often lead to widely different estimates. Some of the controversy that arises results from a confusion of the probabilistic significance of the quantities yielded by each of the two approaches. Also, inherent limitations of analytic projection models, such as those using the logistic and Gompertz functions, have often been ignored. The resource-assessment problem should be recast in terms that provide for consideration of the probability of existence of the resource and of the probability of discovery of a deposit. Then the two above-mentioned models occupy the two ends of the probability range. The new approach accounts for (1) what can be expected with reasonably high certainty by mere projections of what has been accomplished in the past; (2) the inherent biases of decision-makers and resource estimators; (3) upper bounds that can be set up as goals for exploration; and (4) the uncertainties in geologic conditions in a search for minerals. Actual outcomes can then

  7. Reliability analysis of idealized tunnel support system using probability-based methods with case studies

    NASA Astrophysics Data System (ADS)

    Gharouni-Nik, Morteza; Naeimi, Meysam; Ahadi, Sodayf; Alimoradi, Zahra

    2014-06-01

    In order to determine the overall safety of a tunnel support lining, a reliability-based approach is presented in this paper. Support elements in jointed rock tunnels are provided to control the ground movement caused by stress redistribution during the tunnel drive. The main support elements that contribute to the stability of the tunnel structure are identified in order to assess various aspects of reliability and sustainability in the system. The selection of efficient support methods for rock tunneling is a key factor in reducing the number of problems during construction and keeping the project cost and time within the limited budget and planned schedule. This paper introduces a smart approach by which decision-makers will be able to find the overall reliability of the tunnel support system before selecting the final scheme of the lining system. Engineering reliability, a branch of statistics and probability, is applied to the field, and much effort has been made to use it in tunneling by investigating the reliability of the lining support system for the tunnel structure. Reliability analysis for evaluating tunnel support performance is therefore the main idea used in this research. Decomposition approaches are used to produce the system block diagram and determine the failure probability of the whole system. The effectiveness of the proposed reliability model of the tunnel lining, together with the recommended approaches, is examined using several case studies, and the final value of reliability is obtained for different design scenarios. Assuming a linear correlation between safety factors and reliability parameters, the values of isolated reliabilities are determined for different structural components of the tunnel support system. In order to determine individual safety factors, finite element modeling is employed for different structural subsystems and the results of numerical analyses are obtained in

  8. Ignition probabilities of wildland fuels based on simulated lightning discharges. Forest Service research paper

    SciTech Connect

    Latham, D.J.; Schlieter, J.A.

    1989-09-01

    Ignition of wildland fine fuels by lightning was simulated with an electric arc discharge in the laboratory. The results showed that fuel parameters such as depth, moisture content, bulk density, and mineral content can be combined with the duration of the simulated continuing current to give ignition probabilities. The fuel state parameters of importance and the ignition probabilities were determined using logistic regression. Graphs, tables, formulas, and a FORTRAN computer program are given for field use.
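
    A logistic-regression ignition model of this general kind can be sketched on synthetic trials (Python with scikit-learn); the predictor set, the coefficients used to generate labels and the example fuel bed are assumptions for illustration, not the fitted values reported in the paper.

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(5)
        n = 400

        # Synthetic laboratory trials: fuel moisture (%), fuel depth (cm),
        # and simulated continuing-current duration (ms).
        moisture = rng.uniform(2, 30, n)
        depth = rng.uniform(1, 8, n)
        duration = rng.uniform(20, 300, n)

        # Assumed "true" relationship, used only to generate ignition outcomes for the sketch.
        logit = -2.0 - 0.25 * moisture + 0.4 * depth + 0.015 * duration
        ignited = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

        X = np.column_stack([moisture, depth, duration])
        model = LogisticRegression(max_iter=1000).fit(X, ignited)

        # Predicted ignition probability for one fuel bed / discharge combination.
        p = model.predict_proba([[8.0, 4.0, 150.0]])[0, 1]
        print(f"estimated ignition probability: {p:.2f}")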

  9. Evidence-Based Medicine as a Tool for Undergraduate Probability and Statistics Education

    PubMed Central

    Masel, J.; Humphrey, P. T.; Blackburn, B.; Levine, J. A.

    2015-01-01

    Most students have difficulty reasoning about chance events, and misconceptions regarding probability can persist or even strengthen following traditional instruction. Many biostatistics classes sidestep this problem by prioritizing exploratory data analysis over probability. However, probability itself, in addition to statistics, is essential both to the biology curriculum and to informed decision making in daily life. One area in which probability is particularly important is medicine. Given the preponderance of pre-health students, in addition to more general interest in medicine, we capitalized on students’ intrinsic motivation in this area to teach both probability and statistics. We use the randomized controlled trial as the centerpiece of the course, because it exemplifies the most salient features of the scientific method, and the application of critical thinking to medicine. The other two pillars of the course are biomedical applications of Bayes’ theorem and science and society content. Backward design from these three overarching aims was used to select appropriate probability and statistics content, with a focus on eliciting and countering previously documented misconceptions in their medical context. Pretest/posttest assessments using the Quantitative Reasoning Quotient and Attitudes Toward Statistics instruments are positive, bucking several negative trends previously reported in statistics education. PMID:26582236

  10. Evidence-Based Medicine as a Tool for Undergraduate Probability and Statistics Education.

    PubMed

    Masel, J; Humphrey, P T; Blackburn, B; Levine, J A

    2015-01-01

    Most students have difficulty reasoning about chance events, and misconceptions regarding probability can persist or even strengthen following traditional instruction. Many biostatistics classes sidestep this problem by prioritizing exploratory data analysis over probability. However, probability itself, in addition to statistics, is essential both to the biology curriculum and to informed decision making in daily life. One area in which probability is particularly important is medicine. Given the preponderance of pre-health students, in addition to more general interest in medicine, we capitalized on students' intrinsic motivation in this area to teach both probability and statistics. We use the randomized controlled trial as the centerpiece of the course, because it exemplifies the most salient features of the scientific method, and the application of critical thinking to medicine. The other two pillars of the course are biomedical applications of Bayes' theorem and science and society content. Backward design from these three overarching aims was used to select appropriate probability and statistics content, with a focus on eliciting and countering previously documented misconceptions in their medical context. Pretest/posttest assessments using the Quantitative Reasoning Quotient and Attitudes Toward Statistics instruments are positive, bucking several negative trends previously reported in statistics education. PMID:26582236
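
    A worked example of the kind of biomedical Bayes' theorem application such a course uses: converting a pretest probability plus a test's sensitivity and specificity into a posttest probability. The numbers are invented for illustration.

        def posttest_probability(pretest, sensitivity, specificity, positive_result=True):
            """Bayes' theorem for a binary diagnostic test."""
            if positive_result:
                p_result_given_disease = sensitivity
                p_result_given_healthy = 1.0 - specificity
            else:
                p_result_given_disease = 1.0 - sensitivity
                p_result_given_healthy = specificity
            numerator = p_result_given_disease * pretest
            return numerator / (numerator + p_result_given_healthy * (1.0 - pretest))

        # 2% pretest probability, 90% sensitivity, 95% specificity, positive result:
        # the posttest probability is only about 0.27, a common source of misconception.
        print(round(posttest_probability(0.02, 0.90, 0.95), 3))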

  11. On Probability Domains III

    NASA Astrophysics Data System (ADS)

    Frič, Roman; Papčo, Martin

    2015-12-01

    Domains of generalized probability have been introduced in order to provide a general construction of random events, observables and states. It is based on the notion of a cogenerator and the properties of product. We continue our previous study and show how some other quantum structures fit our categorical approach. We discuss how various epireflections implicitly used in the classical probability theory are related to the transition to fuzzy probability theory and describe the latter probability theory as a genuine categorical extension of the former. We show that the IF-probability can be studied via the fuzzy probability theory. We outline a "tensor modification" of the fuzzy probability theory.

  12. Experimental Probability in Elementary School

    ERIC Educational Resources Information Center

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  13. Competency-based curricular design to encourage significant learning.

    PubMed

    Hurtubise, Larry; Roman, Brenda

    2014-07-01

    Most significant learning (SL) experiences produce long-lasting learning experiences that meaningfully change the learner's thinking, feeling, and/or behavior. Most significant teaching experiences involve strong connections with the learner and recognition that the learner felt changed by the teaching effort. L. Dee Fink, in Creating Significant Learning Experiences: An Integrated Approach to Designing College Courses, defines six kinds of learning goals: Foundational Knowledge, Application, Integration, Human Dimension, Caring, and Learning to Learn. SL occurs when learning experiences promote interaction between the different kinds of goals; for example, acquiring knowledge alone is not enough, but when it is paired with a learning experience, such as an effective patient experience (Caring), significant and lasting learning occurs. To promote SL, backward design principles that start with clearly defined learning goals and the context of the situation of the learner are particularly effective. Emphasis on defining assessment methods prior to developing teaching/learning activities is the key: this ensures that assessment (where the learner should be at the end of the educational activity/process) drives instruction and that assessment and learning/instruction are tightly linked so that assessment measures a defined outcome (competency) of the learner. Employing backward design and the AAMC's MedBiquitous standard vocabulary for medical education can help to ensure that curricular design and redesign efforts effectively enhance educational program quality and efficacy, leading to improved patient care. Such methods can promote successful careers in health care for learners through development of self-directed learning skills and active learning, in ways that help learners become fully committed to lifelong learning and continuous professional development. PMID:24981665

  14. The Significance of Trust in School-Based Collaborative Leadership

    ERIC Educational Resources Information Center

    Coleman, Andrew

    2012-01-01

    The expectation that schools should work in partnership to promote the achievement of children has arguably been the defining feature of school policy over the last decade. This rise in school-to-school partnerships and increased emphasis on multi-agency-based interventions for vulnerable children have seen the emergence of a new form of school…

  15. Model assisted probability of detection for a guided waves based SHM technique

    NASA Astrophysics Data System (ADS)

    Memmolo, V.; Ricci, F.; Maio, L.; Boffa, N. D.; Monaco, E.

    2016-04-01

    Guided wave (GW) Structural Health Monitoring (SHM) allows the health of aerostructures to be assessed thanks to its great sensitivity to the appearance of delaminations and/or debondings. Due to the several complexities affecting wave propagation in composites, an efficient GW SHM system requires effective quantification associated with a rigorous statistical evaluation procedure. The Probability of Detection (POD) approach is a commonly accepted measurement method to quantify NDI results, and it can be effectively extended to an SHM context. However, it requires a very complex setup arrangement and many coupons. When a rigorous correlation with measurements is adopted, Model Assisted POD (MAPOD) is an efficient alternative to classic methods. This paper is concerned with the identification of small emerging delaminations in composite structural components. An ultrasonic GW tomography method for impact damage detection in composite plate-like structures, recently developed by the authors, is investigated, providing the basis for a more complex MAPOD analysis. Experimental tests carried out on a typical wing composite structure demonstrated the effectiveness of the modeling approach for detecting damage with the tomographic algorithm. Environmental disturbances, which affect signal waveforms and consequently damage detection, are considered by simulating mathematical noise in the modeling stage. A statistical method is used for an effective decision-making procedure. A Damage Index approach is implemented as the metric to interpret the signals collected from a distributed sensor network, and a subsequent graphic interpolation is carried out to reconstruct the damage appearance. A model validation and first reliability assessment results are provided, in view of system performance quantification and its optimization as well.

  16. Value of genetic testing for hereditary colorectal cancer in a probability-based US online sample

    PubMed Central

    Knight, Sara J.; Mohamed, Ateesha F.; Marshall, Deborah A.; Ladabaum, Uri; Phillips, Kathryn A.; Walsh, Judith M. E.

    2015-01-01

    Background: While choices about genetic testing are increasingly common for patients and families, and public opinion surveys suggest public interest in genomics, it is not known how adults from the general population value genetic testing for heritable conditions. We sought to understand in a US sample the relative value of the characteristics of genetic tests to identify risk of hereditary colorectal cancer, among the first genomic applications with evidence to support its translation to clinical settings. Methods: A Web-enabled choice-format conjoint survey was conducted with adults age 50 and older from a probability-based US panel. Participants were asked to make a series of choices between two hypothetical blood tests that differed in risk of false negative test, privacy, and cost. Random parameters logit models were used to estimate preferences, the dollar value of genetic information, and intent to have genetic testing. Results: A total of 355 individuals completed choice-format questions. Cost and privacy were more highly valued than reducing the chance of a false negative result. Most (97%, 95% Confidence Interval (CI): 95% to 99%) would have genetic testing to reduce the risk of dying from colorectal cancer in the best scenario (no false negatives, results disclosed to primary care physician). Only 41% (95% CI: 25% to 57%) would have genetic testing in the worst case (20% false negatives, results disclosed to insurance company). Conclusions: Given the characteristics and levels included in the choice, if false negative test results are unlikely and results are shared with a primary care physician, the majority would have genetic testing. As genomic services become widely available, primary care professionals will need to be increasingly knowledgeable about genetic testing decisions. PMID:25589525

  17. Significance of hair-dye base-induced sensory irritation.

    PubMed

    Fujita, F; Azuma, T; Tajiri, M; Okamoto, H; Sano, M; Tominaga, M

    2010-06-01

    Oxidation hair-dyes, which are the principal hair-dyes, sometimes induce painful sensory irritation of the scalp caused by the combination of highly reactive substances, such as hydrogen peroxide and alkali agents. Although many cases of severe facial and scalp dermatitis have been reported following the use of hair-dyes, sensory irritation caused by contact of the hair-dye with the skin has not been reported clearly. In this study, we used a self-assessment questionnaire to measure the sensory irritation in various regions of the body caused by two model hair-dye bases that contained different amounts of alkali agents without dyes. Moreover, the occipital region was found as an alternative region of the scalp to test for sensory irritation of the hair-dye bases. We used this region to evaluate the relationship of sensitivity with skin properties, such as trans-epidermal water loss (TEWL), stratum corneum water content, sebum amount, surface temperature, current perception threshold (CPT), catalase activities in tape-stripped skin and sensory irritation score with the model hair-dye bases. The hair-dye sensitive group showed higher TEWL, a lower sebum amount, a lower surface temperature and higher catalase activity than the insensitive group, and was similar to that of damaged skin. These results suggest that sensory irritation caused by hair-dye could occur easily on the damaged dry scalp, as that caused by skin cosmetics reported previously. PMID:20557579

  18. Methods for estimating annual exceedance probability discharges for streams in Arkansas, based on data through water year 2013

    USGS Publications Warehouse

    Wagner, Daniel M.; Krieger, Joshua D.; Veilleux, Andrea G.

    2016-01-01

    In 2013, the U.S. Geological Survey initiated a study to update regional skew, annual exceedance probability discharges, and regional regression equations used to estimate annual exceedance probability discharges for ungaged locations on streams in the study area with the use of recent geospatial data, new analytical methods, and available annual peak-discharge data through the 2013 water year. An analysis of regional skew using Bayesian weighted least-squares/Bayesian generalized-least squares regression was performed for Arkansas, Louisiana, and parts of Missouri and Oklahoma. The newly developed constant regional skew of -0.17 was used in the computation of annual exceedance probability discharges for 281 streamgages used in the regional regression analysis. Based on analysis of covariance, four flood regions were identified for use in the generation of regional regression models. Thirty-nine basin characteristics were considered as potential explanatory variables, and ordinary least-squares regression techniques were used to determine the optimum combinations of basin characteristics for each of the four regions. Basin characteristics in candidate models were evaluated based on multicollinearity with other basin characteristics (variance inflation factor < 2.5) and statistical significance at the 95-percent confidence level (p ≤ 0.05). Generalized least-squares regression was used to develop the final regression models for each flood region. Average standard errors of prediction of the generalized least-squares models ranged from 32.76 to 59.53 percent, with the largest range in flood region D. Pseudo coefficients of determination of the generalized least-squares models ranged from 90.29 to 97.28 percent, with the largest range also in flood region D. The regional regression equations apply only to locations on streams in Arkansas where annual peak discharges are not substantially affected by regulation, diversion, channelization, backwater, or urbanization

  19. A Scrabble Heuristic Based on Probability That Performs at Championship Level

    NASA Astrophysics Data System (ADS)

    Ramírez, Arturo; Acuña, Francisco González; Romero, Alejandro González; Alquézar, René; Hernández, Enric; Aguilar, Amador Roldán; Olmedo, Ian García

    The game of Scrabble, in its competitive form (one vs. one), has been tackled mostly by using Monte Carlo simulation. Recently [1], Probability Theory (Bayes’ theorem) was used to gain knowledge about the opponents’ tiles; this proved to be a good approach to further improve Computer Scrabble. We used probability to evaluate Scrabble leaves (rack residues); then, using this evaluation, a heuristic function that dictates a move can be constructed. To calculate these probabilities it is necessary to have a lexicon, in our case a Spanish lexicon. To make proper investigations in the domain of Scrabble it is important to have the same lexicon as the one used by humans in official tournaments. We did a huge amount of work to build this free lexicon. In this paper a heuristic function that involves leave probabilities is given. We now have an engine, Heuri, that uses this heuristic, and we have been able to perform some experiments to test it. The tests include matches against highly expert players; the games played so far give us promising results. For instance, recently a match between the current World Scrabble Champion (in Spanish) and Heuri was played. Heuri defeated the World Champion 6-0! Heuri includes a move generator which, using a lot of memory, is faster than using DAWG [2] or GADDAG [3]. Another plan to build a stronger Heuri that combines heuristics using probabilities, opponent modeling and Monte Carlo simulation is also proposed.

  20. How the Probability and Potential Clinical Significance of Pharmacokinetically Mediated Drug-Drug Interactions Are Assessed in Drug Development: Desvenlafaxine as an Example

    PubMed Central

    Nichols, Alice I.; Preskorn, Sheldon H.

    2015-01-01

    Objective: The avoidance of adverse drug-drug interactions (DDIs) is a high priority in terms of both the US Food and Drug Administration (FDA) and the individual prescriber. With this perspective in mind, this article illustrates the process for assessing the risk of a drug (example here being desvenlafaxine) causing or being the victim of DDIs, in accordance with FDA guidance. Data Sources/Study Selection: DDI studies for the serotonin-norepinephrine reuptake inhibitor desvenlafaxine conducted by the sponsor and published since 2009 are used as examples of the systematic way that the FDA requires drug developers to assess whether their new drug is either capable of causing clinically meaningful DDIs or being the victim of such DDIs. In total, 8 open-label studies tested the effects of steady-state treatment with desvenlafaxine (50–400 mg/d) on the pharmacokinetics of cytochrome (CYP) 2D6 and/or CYP 3A4 substrate drugs, or the effect of CYP 3A4 inhibition on desvenlafaxine pharmacokinetics. The potential for DDIs mediated by the P-glycoprotein (P-gp) transporter was assessed in in vitro studies using Caco-2 monolayers. Data Extraction: Changes in area under the plasma concentration-time curve (AUC; CYP studies) and efflux (P-gp studies) were reviewed for potential DDIs in accordance with FDA criteria. Results: Desvenlafaxine coadministration had minimal effect on CYP 2D6 and/or 3A4 substrates per FDA criteria. Changes in AUC indicated either no interaction (90% confidence intervals for the ratio of AUC geometric least-squares means [GM] within 80%–125%) or weak inhibition (AUC GM ratio 125% to < 200%). Coadministration with ketoconazole resulted in a weak interaction with desvenlafaxine (AUC GM ratio of 143%). Desvenlafaxine was not a substrate (efflux ratio < 2) or inhibitor (50% inhibitory drug concentration values > 250 μM) of P-gp. Conclusions: A 2-step process based on FDA guidance can be used first to determine whether a pharmacokinetically mediated

  1. Heightened odds of large earthquakes near Istanbul: an interaction-based probability calculation

    USGS Publications Warehouse

    Parsons, T.; Toda, S.; Stein, R.S.; Barka, A.; Dieterich, J.H.

    2000-01-01

    We calculate the probability of strong shaking in Istanbul, an urban center of 10 million people, from the description of earthquakes on the North Anatolian fault system in the Marmara Sea during the past 500 years and test the resulting catalog against the frequency of damage in Istanbul during the preceding millennium. Departing from current practice, we include the time-dependent effect of stress transferred by the 1999 moment magnitude M = 7.4 Izmit earthquake to faults nearer to Istanbul. We find a 62 ± 15% probability (one standard deviation) of strong shaking during the next 30 years and 32 ± 12% during the next decade.
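
    The aggregation step behind such estimates can be sketched as follows: once each nearby fault segment carries a rupture probability (already conditioned on renewal behaviour and stress transfer), the probability of at least one damaging event follows by combination. The segment probabilities below are illustrative placeholders, not the study's values, and independence between segments is assumed.

```python
import math

# Hypothetical 30-year rupture probabilities for individual fault segments
# capable of producing strong shaking in Istanbul (illustrative values only).
p_segment_30yr = [0.25, 0.18, 0.12, 0.08]

# Probability of at least one such rupture, assuming independence between
# segments once stress-transfer effects are folded into each p_i.
p_any = 1.0 - math.prod(1.0 - p for p in p_segment_30yr)

# Equivalent Poisson rate and the corresponding 10-year probability.
rate = -math.log(1.0 - p_any) / 30.0
p_10yr = 1.0 - math.exp(-rate * 10.0)
print(f"30-yr: {p_any:.2f}, 10-yr: {p_10yr:.2f}")
```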

  2. RDX-based nanocomposite microparticles for significantly reduced shock sensitivity.

    PubMed

    Qiu, Hongwei; Stepanov, Victor; Di Stasio, Anthony R; Chou, Tsengming; Lee, Woo Y

    2011-01-15

    Cyclotrimethylenetrinitramine (RDX)-based nanocomposite microparticles were produced by a simple, yet novel spray drying method. The microparticles were characterized by scanning electron microscopy (SEM), transmission electron microscopy (TEM), X-ray diffraction (XRD) and high performance liquid chromatography (HPLC), which showed that they consist of small RDX crystals (∼0.1-1 μm) uniformly and discretely dispersed in a binder. The microparticles were subsequently pressed to produce dense energetic materials which exhibited a markedly lower shock sensitivity. The low sensitivity was attributed to the small crystal size as well as the small void size (∼250 nm). The method developed in this work may be suitable for the preparation of a wide range of insensitive explosive compositions. PMID:20940087

  3. Detection of significant pathways in osteoporosis based on graph clustering.

    PubMed

    Xiao, Haijun; Shan, Liancheng; Zhu, Haiming; Xue, Feng

    2012-12-01

    Osteoporosis is the most common and serious skeletal disorder among the elderly, characterized by a low bone mineral density (BMD). Low bone mass in the elderly is highly dependent on their peak bone mass (PBM) as young adults. Circulating monocytes serve as early progenitors of osteoclasts and produce significant molecules for bone metabolism. An improved understanding of the biology and genetics of osteoclast differentiation at the pathway level is likely to be beneficial for the development of novel targeted approaches for osteoporosis. The objective of this study was to explore gene expression profiles comprehensively by grouping individual differentially expressed genes (DEGs) into gene sets and pathways using the graph clustering approach and Gene Ontology (GO) term enrichment analysis. The results indicated that the DEGs between high and low PBM samples were grouped into nine gene sets. The genes in clusters 1 and 8 (including GBP1, STAT1, CXCL10 and EIF2AK2) may be associated with osteoclast differentiation by the immune system response. The genes in clusters 2, 7 and 9 (including SOCS3, SOD2, ATF3, ADM, EGR2 and BCL2A1) may be associated with osteoclast differentiation by responses to various stimuli. This study provides a number of candidate genes that warrant further investigation, including DDX60, HERC5, RSAD2, SIGLEC1, CMPK2, MX1, SEPING1, EPSTI1, C9orf72, PHLDA2, PFKFB3, PLEKHG2, ANKRD28, IL1RN and RNF19B. PMID:22992777

  4. Implicit Segmentation of a Stream of Syllables Based on Transitional Probabilities: An MEG Study

    ERIC Educational Resources Information Center

    Teinonen, Tuomas; Huotilainen, Minna

    2012-01-01

    Statistical segmentation of continuous speech, i.e., the ability to utilise transitional probabilities between syllables in order to detect word boundaries, is reflected in the brain's auditory event-related potentials (ERPs). The N1 and N400 ERP components are typically enhanced for word onsets compared to random syllables during active…

  5. Methods for estimating annual exceedance-probability discharges for streams in Iowa, based on data through water year 2010

    USGS Publications Warehouse

    Eash, David A.; Barnes, Kimberlee K.; Veilleux, Andrea G.

    2013-01-01

    A statewide study was performed to develop regional regression equations for estimating selected annual exceedance-probability statistics for ungaged stream sites in Iowa. The study area comprises streamgages located within Iowa and 50 miles beyond the State’s borders. Annual exceedance-probability estimates were computed for 518 streamgages by using the expected moments algorithm to fit a Pearson Type III distribution to the logarithms of annual peak discharges for each streamgage using annual peak-discharge data through 2010. The estimation of the selected statistics included a Bayesian weighted least-squares/generalized least-squares regression analysis to update regional skew coefficients for the 518 streamgages. Low-outlier and historic information were incorporated into the annual exceedance-probability analyses, and a generalized Grubbs-Beck test was used to detect multiple potentially influential low flows. Also, geographic information system software was used to measure 59 selected basin characteristics for each streamgage. Regional regression analysis, using generalized least-squares regression, was used to develop a set of equations for each flood region in Iowa for estimating discharges for ungaged stream sites with 50-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent annual exceedance probabilities, which are equivalent to annual flood-frequency recurrence intervals of 2, 5, 10, 25, 50, 100, 200, and 500 years, respectively. A total of 394 streamgages were included in the development of regional regression equations for three flood regions (regions 1, 2, and 3) that were defined for Iowa based on landform regions and soil regions. Average standard errors of prediction range from 31.8 to 45.2 percent for flood region 1, 19.4 to 46.8 percent for flood region 2, and 26.5 to 43.1 percent for flood region 3. The pseudo coefficients of determination for the generalized least-squares equations range from 90.8 to 96.2 percent for flood region 1, 91.5 to 97

  6. Estimation of flood inundation probabilities using global hazard indexes based on hydrodynamic variables

    NASA Astrophysics Data System (ADS)

    Aronica, Giuseppe Tito; Candela, Angela; Fabio, Pamela; Santoro, Mario

    In this paper a new procedure to derive flood hazard maps incorporating uncertainty concepts is presented. The layout of the procedure can be summarized as follows: (1) stochastic input of flood hydrographs modelled through direct Monte-Carlo simulation based on recorded flood data. Generation of flood peaks and flow volumes is obtained via copulas, which describe and model the correlation between these two variables independently of the marginal laws involved. The shape of the hydrograph is generated on the basis of historical significant flood events, via cluster analysis; (2) modelling of flood propagation using a hyperbolic finite element model based on the DSV equations; (3) definition of global hazard indexes based on hydro-dynamic variables (i.e., water depth and flow velocities). The GLUE methodology has been applied in order to account for parameter uncertainty. The procedure has been tested on a flood prone area located in the southern part of Sicily, Italy. Three hazard maps have been obtained and then compared.
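
    Step (1) of the procedure can be illustrated with a minimal sketch: correlated flood peaks and volumes are drawn through a copula and then mapped to their marginal laws. A Gaussian copula and Gumbel/lognormal marginals with arbitrary parameters are assumed here purely for illustration; the study fits the copula and marginals to the recorded flood data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

def sample_peak_volume(n, rho=0.7):
    """Gaussian-copula sampling of flood peak (m^3/s) and volume (hm^3).
    Copula family, correlation and marginal parameters are illustrative."""
    # Correlated standard normals -> uniforms via the normal CDF (the copula).
    cov = np.array([[1.0, rho], [rho, 1.0]])
    z = rng.multivariate_normal(mean=[0.0, 0.0], cov=cov, size=n)
    u = stats.norm.cdf(z)
    # Marginal laws (hypothetical): Gumbel for peaks, lognormal for volumes.
    peaks = stats.gumbel_r.ppf(u[:, 0], loc=300.0, scale=120.0)
    volumes = stats.lognorm.ppf(u[:, 1], s=0.5, scale=25.0)
    return peaks, volumes

peaks, volumes = sample_peak_volume(1000)
print(np.corrcoef(peaks, volumes)[0, 1])
```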

  7. Social Science and the Bayesian Probability Explanation Model

    NASA Astrophysics Data System (ADS)

    Yin, Jie; Zhao, Lei

    2014-03-01

    C. G. Hempel, one of the logical empiricists, built his probability explanation model on the empiricist view of probability; this model encountered many difficulties in scientific explanation that Hempel found hard to defend. The Bayesian probability model, which rests on the subjectivist view of probability, instead offers an explanation in terms of subjective probability. On the one hand, this model establishes the epistemological status of the subject in social science; on the other hand, it provides a feasible model for social scientific explanation, which has important methodological significance.

  8. Lake Superior Zooplankton Biomass Predictions from LOPC Tow Surveys Compare Well with a Probability Based Net Survey

    EPA Science Inventory

    We conducted a probability-based sampling of Lake Superior in 2006 and compared the zooplankton biomass estimate with laser optical plankton counter (LOPC) predictions. The net survey consisted of 52 sites stratified across three depth zones (0-30, 30-150, >150 m). The LOPC tow...

  9. MEASUREMENT OF CHILDREN'S EXPOSURE TO PESTICIDES: ANALYSIS OF URINARY METABOLITE LEVELS IN A PROBABILITY-BASED SAMPLE

    EPA Science Inventory

    The Minnesota Children's Pesticide Exposure Study is a probability-based sample of 102 children 3-13 years old who were monitored for commonly used pesticides. During the summer of 1997, first-morning-void urine samples (1-3 per child) were obtained for 88% of study children a...

  10. Computer-Based Graphical Displays for Enhancing Mental Animation and Improving Reasoning in Novice Learning of Probability

    ERIC Educational Resources Information Center

    Kaplan, Danielle E.; Wu, Erin Chia-ling

    2006-01-01

    Our research suggests static and animated graphics can lead to more animated thinking and more correct problem solving in computer-based probability learning. Pilot software modules were developed for graduate online statistics courses and representation research. A study with novice graduate student statisticians compared problem solving in five…

  11. Effect of Reinforcement Probability and Prize Size on Cocaine and Heroin Abstinence in Prize-Based Contingency Management

    ERIC Educational Resources Information Center

    Ghitza, Udi E.; Epstein, David H.; Schmittner, John; Vahabzadeh, Massoud; Lin, Jia-Ling; Preston, Kenzie L.

    2008-01-01

    Although treatment outcome in prize-based contingency management has been shown to depend on reinforcement schedule, the optimal schedule is still unknown. Therefore, we conducted a retrospective analysis of data from a randomized clinical trial (Ghitza et al., 2007) to determine the effects of the probability of winning a prize (low vs. high) and…

  12. A Clarification on the Response Probability Criterion RP67 for Standard Settings Based on Bookmark and Item Mapping

    ERIC Educational Resources Information Center

    Huynh, Huynh

    2006-01-01

    By analyzing the Fisher information allotted to the correct response of a Rasch binary item, Huynh (1994) established the response probability criterion 0.67 (RP67) for standard settings based on bookmarks and item mapping. The purpose of this note is to help clarify the conceptual and psychometric framework of the RP criterion.

  13. Guided waves-based multi-damage identification using a local probability-based diagnostic imaging method

    NASA Astrophysics Data System (ADS)

    Gao, Dongyue; Wu, Zhanjun; Yang, Lei; Zheng, Yuebin

    2016-04-01

    Multi-damage identification is an important and challenging task in the research of guided waves-based structural health monitoring. In this paper, a multi-damage identification method is presented using a guided waves-based local probability-based diagnostic imaging (PDI) method. The method includes a path damage judgment stage, a multi-damage judgment stage and a multi-damage imaging stage. First, damage imaging is performed by partition: the damage imaging regions are divided according to the beside-damage signal paths. The difference in guided wave propagation characteristics between cross-damage and beside-damage paths is established by theoretical analysis of the guided wave signal features, and the time-of-flight difference of paths is used as a factor to distinguish between cross-damage and beside-damage paths. Then, a global PDI method (damage identification using all paths in the sensor network) is performed using the beside-damage path network. If the global PDI damage zone crosses a beside-damage path, it means that a discrete multi-damage case (such as a group of holes or cracks) has been misjudged as a continuum single-damage case (such as a single hole or crack) by the global PDI method. Subsequently, damage imaging regions are separated by the beside-damage paths and local PDI (damage identification using paths in the damage imaging regions) is performed in each damage imaging region. Finally, multi-damage identification results are obtained by superimposing the local damage imaging results and the marked cross-damage paths. The method is employed to inspect multi-damage in an aluminum plate with a surface-mounted piezoelectric ceramic sensor network. The results show that the guided waves-based multi-damage identification method is capable of visualizing the presence, quantity and location of structural damage.
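
    The PDI step itself can be sketched in a few lines: each sensing path contributes its damage index to the image through a spatial weighting function concentrated around the transmitter-receiver line. The elliptical weighting with scaling parameter beta used below is a common textbook form and an assumption here, not necessarily the authors' exact formulation.

```python
import numpy as np

def pdi_map(paths, damage_index, grid_x, grid_y, beta=1.05):
    """Basic PDI: accumulate each path's damage index over an elliptical
    influence zone around the transmitter-receiver line.

    paths        : list of ((xt, yt), (xr, yr)) sensor pairs
    damage_index : per-path scalar DI in [0, 1]
    """
    X, Y = np.meshgrid(grid_x, grid_y)
    prob = np.zeros_like(X, dtype=float)
    for ((xt, yt), (xr, yr)), di in zip(paths, damage_index):
        d_path = np.hypot(xr - xt, yr - yt)
        # Relative distance: 1 on the direct path, >1 elsewhere.
        R = (np.hypot(X - xt, Y - yt) + np.hypot(X - xr, Y - yr)) / d_path
        R = np.minimum(R, beta)
        prob += di * (beta - R) / (beta - 1.0)   # linearly decaying weight
    return prob

# Example: two crossing sensing paths on a 100 mm x 100 mm plate (illustrative).
gx = gy = np.linspace(0.0, 100.0, 101)
paths = [((10.0, 10.0), (90.0, 90.0)), ((10.0, 90.0), (90.0, 10.0))]
img = pdi_map(paths, damage_index=[0.8, 0.3], grid_x=gx, grid_y=gy)
print(img.max())
```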

  14. Probability-based non-local means filter for speckle noise suppression in optical coherence tomography images.

    PubMed

    Yu, Hancheng; Gao, Jianlin; Li, Aiting

    2016-03-01

    In this Letter, a probability-based non-local means filter is proposed for speckle reduction in optical coherence tomography (OCT). Originally developed for additive white Gaussian noise, the non-local means filter is not suitable for multiplicative speckle noise suppression. This Letter presents a two-stage non-local means algorithm using the uncorrupted probability of each pixel to effectively reduce speckle noise in OCT. Experiments on real OCT images demonstrate that the proposed filter is competitive with other state-of-the-art speckle removal techniques and able to accurately preserve edges and structural details with small computational cost. PMID:26974099
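
    A much-simplified sketch of a probability-weighted non-local means pass is given below. The per-pixel "uncorrupted probability" heuristic (deviation from a local median) and the parameter values are assumptions for illustration; the Letter's two-stage estimator is more elaborate.

```python
import numpy as np
from scipy.ndimage import median_filter

def probability_weighted_nlm(img, patch=3, search=7, h=0.1):
    """Simplified two-stage sketch: (1) estimate an 'uncorrupted' probability
    per pixel from its deviation from a local median (a hypothetical heuristic),
    (2) run non-local means with each candidate pixel's weight scaled by that
    probability. h is tuned for intensities scaled to [0, 1]."""
    H, W = img.shape
    pad = patch // 2
    padded = np.pad(img, pad, mode='reflect')

    med = median_filter(img, size=patch)
    dev = np.abs(img - med)
    prob = np.exp(-(dev / (dev.std() + 1e-12)) ** 2)   # stage 1: probability map

    out = np.zeros_like(img)
    s = search // 2
    for i in range(H):
        for j in range(W):
            p_ref = padded[i:i + patch, j:j + patch]
            num = den = 0.0
            for di in range(-s, s + 1):
                for dj in range(-s, s + 1):
                    ii, jj = i + di, j + dj
                    if 0 <= ii < H and 0 <= jj < W:
                        p_cmp = padded[ii:ii + patch, jj:jj + patch]
                        d2 = np.mean((p_ref - p_cmp) ** 2)
                        w = np.exp(-d2 / h ** 2) * prob[ii, jj]   # stage 2: weighting
                        num += w * img[ii, jj]
                        den += w
            out[i, j] = num / den
    return out

img = np.clip(np.random.default_rng(0).normal(0.5, 0.05, (32, 32)), 0.0, 1.0)
print(probability_weighted_nlm(img).mean())
```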

  15. Statistical analysis of blocking probability and fragmentation based on Markov modeling of elastic spectrum allocation on fiber link

    NASA Astrophysics Data System (ADS)

    Rosa, A. N. F.; Wiatr, P.; Cavdar, C.; Carvalho, S. V.; Costa, J. C. W. A.; Wosinska, L.

    2015-11-01

    In Elastic Optical Networks (EONs), spectrum fragmentation refers to the existence of non-aligned, small-sized blocks of free subcarrier slots in the optical spectrum. Several metrics have been proposed in order to quantify the level of spectrum fragmentation. Approximation methods might be used for estimating average blocking probability and some fragmentation measures, but are so far unable to accurately evaluate the influence of different sizes of connection requests and do not allow in-depth investigation of blocking events and their relation to fragmentation. The analytical study of the effect of fragmentation on requests' blocking probability is still under-explored. In this work, we introduce new definitions for blocking that differentiate between the reasons for the blocking events. We developed a framework based on Markov modeling to calculate steady-state probabilities for the different blocking events and to analyze fragmentation related problems in elastic optical links under dynamic traffic conditions. This framework can also be used for evaluation of different definitions of fragmentation in terms of their relation to the blocking probability. We investigate how different allocation request sizes contribute to fragmentation and blocking probability. Moreover, we show to what extent blocking events due to an insufficient amount of available resources become inevitable and, by comparison with the amount of blocking events due to fragmented spectrum, we draw conclusions on the possible gains one can achieve by system defragmentation. We also show how efficient spectrum allocation policies really are in reducing the part of fragmentation that in particular leads to actual blocking events. Simulation experiments are carried out showing a good match with our analytical results for blocking probability in a small-scale scenario. Simulated blocking probabilities for the different blocking events are provided for a larger scale elastic optical link.
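
    The flavour of such a Markov framework can be reproduced on a toy link: enumerate the reachable spectrum-occupancy states under a first-fit allocation policy, solve for the steady-state probabilities, and split blocking into a resource-shortage part and a fragmentation part. The link size, request mix, rates and the first-fit policy below are illustrative assumptions, not the paper's model.

```python
import numpy as np

S = 6                       # slots on the link (toy size)
LAM = {1: 0.6, 2: 0.5}      # arrival rate per request size (illustrative)
MU = 1.0                    # departure rate per established connection

def free_runs(state):
    """Starts and lengths of maximal free-slot runs for a state of (start, size) allocations."""
    occupied = [False] * S
    for s0, k in state:
        for s in range(s0, s0 + k):
            occupied[s] = True
    runs, start = [], None
    for i, occ in enumerate(occupied + [True]):
        if not occ and start is None:
            start = i
        elif occ and start is not None:
            runs.append((start, i - start))
            start = None
    return runs

def first_fit(state, k):
    for start, length in free_runs(state):
        if length >= k:
            return start
    return None

# Enumerate all reachable states by graph search from the empty link.
states, frontier = {tuple()}, [tuple()]
while frontier:
    st = frontier.pop()
    nexts = []
    for k in LAM:                                   # arrivals (first-fit)
        pos = first_fit(st, k)
        if pos is not None:
            nexts.append(tuple(sorted(st + ((pos, k),))))
    for alloc in st:                                # departures
        nexts.append(tuple(sorted(a for a in st if a != alloc)))
    for nx in nexts:
        if nx not in states:
            states.add(nx)
            frontier.append(nx)
states = sorted(states)
idx = {st: i for i, st in enumerate(states)}

# Generator matrix and its stationary distribution (solve pi Q = 0, sum pi = 1).
Q = np.zeros((len(states), len(states)))
for st in states:
    i = idx[st]
    for k, lam in LAM.items():
        pos = first_fit(st, k)
        if pos is not None:
            Q[i, idx[tuple(sorted(st + ((pos, k),)))]] += lam
    for alloc in st:
        Q[i, idx[tuple(sorted(a for a in st if a != alloc))]] += MU
    Q[i, i] = -Q[i].sum()
A = Q.T.copy()
A[-1, :] = 1.0
b = np.zeros(len(states)); b[-1] = 1.0
pi = np.linalg.solve(A, b)

# Blocking seen by arrivals (PASTA), split by cause.
for k in LAM:
    p_block = sum(p for st, p in zip(states, pi) if first_fit(st, k) is None)
    p_resource = sum(p for st, p in zip(states, pi)
                     if sum(r[1] for r in free_runs(st)) < k)
    print(f"size {k}: total blocking {p_block:.4f}, "
          f"fragmentation share {p_block - p_resource:.4f}")
```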

  16. A satellite rainfall retrieval technique over northern Algeria based on the probability of rainfall intensities classification from MSG-SEVIRI

    NASA Astrophysics Data System (ADS)

    Lazri, Mourad; Ameur, Soltane

    2016-09-01

    In this paper, an algorithm based on the probability of rainfall intensities classification for rainfall estimation from Meteosat Second Generation/Spinning Enhanced Visible and Infrared Imager (MSG-SEVIRI) has been developed. The classification scheme uses various spectral parameters of SEVIRI that provide information about cloud top temperature and optical and microphysical cloud properties. The presented method is developed and trained for the north of Algeria. The calibration of the method is carried out using, as a reference, rain classification fields derived from radar for the rainy season from November 2006 to March 2007. Rainfall rates are assigned to rain areas previously identified and classified according to the precipitation formation processes. The comparisons between satellite-derived precipitation estimates and validation data show that the developed scheme performs reasonably well. Indeed, the correlation coefficient is significant (r = 0.87). The values of POD, POFD and FAR are 80%, 13% and 25%, respectively. Also, for a rainfall estimate of about 614 mm, the RMSD, bias, MAD and PD are 102.06 mm, 2.18 mm, 68.07 mm and 12.58, respectively.
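
    For reference, the categorical scores quoted above follow directly from a rain/no-rain contingency table against the radar reference; the counts used below are illustrative only.

```python
def verification_scores(hits, misses, false_alarms, correct_negatives):
    """Categorical scores for a rain / no-rain contingency table."""
    pod = hits / (hits + misses)                              # probability of detection
    pofd = false_alarms / (false_alarms + correct_negatives)  # probability of false detection
    far = false_alarms / (hits + false_alarms)                # false alarm ratio
    return pod, pofd, far

# Illustrative counts only (not the paper's validation data).
print(verification_scores(hits=800, misses=200, false_alarms=267, correct_negatives=1800))
```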

  17. An RWA algorithm for OBS networks based on iterative local optimization of total blocking probability

    NASA Astrophysics Data System (ADS)

    Yoshikawa, Tomohiro; Nagashima, Hidetaka; Hasegawa, Hiroshi; Sato, Ken-ichi

    2007-11-01

    We propose a routing and wavelength assignment algorithm for Optical Burst Switching (OBS) networks that utilizes centralized control. First, a method that can estimate the expected total blocking time in the network is presented. Then the proposed algorithm minimizes the estimated blocking time by simple iterative local optimization in terms of the traffic demand between each pair of nodes. We demonstrate that the proposed algorithm attains a much smaller blocking probability than conventional distributed control algorithms. It is also shown that, with the introduction of optical buffers and burst retransmission, the proposed method realizes low burst loss rates (<10^-6) acceptable for most applications.

  18. Why Probability?

    ERIC Educational Resources Information Center

    Weatherly, Myra S.

    1984-01-01

    Instruction in mathematical probability to enhance higher levels of critical and creative thinking with gifted students is described. Among thinking skills developed by such an approach are analysis, synthesis, evaluation, fluency, and complexity. (CL)

  19. A Probability-Based Alerting Logic for Aircraft on Parallel Approach

    NASA Technical Reports Server (NTRS)

    Carpenter, Brenda D.; Kuchar, James K.

    1997-01-01

    This document discusses the development and evaluation of an airborne collision alerting logic for aircraft on closely-spaced approaches to parallel runways. A novel methodology is used that links alerts to collision probabilities: alerting thresholds are set such that when the probability of a collision exceeds an acceptable hazard level an alert is issued. The logic was designed to limit the hazard level to that estimated for the Precision Runway Monitoring system: one accident in every one thousand blunders which trigger alerts. When the aircraft were constrained to be coaltitude, evaluations of a two-dimensional version of the alerting logic show that the achieved hazard level is approximately one accident in every 250 blunders. Problematic scenarios have been identified and corrections to the logic can be made. The evaluations also show that over eighty percent of all unnecessary alerts were issued during scenarios in which the miss distance would have been less than 1000 ft, indicating that the alerts may have been justified. Also, no unnecessary alerts were generated during normal approaches.
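
    The core alerting rule can be sketched simply: estimate the probability of collision from the predicted miss distance and its uncertainty, and issue an alert when that probability exceeds the acceptable hazard level. The Gaussian miss-distance model, the collision radius and the numbers below are assumptions for illustration, not the logic's actual formulation.

```python
from scipy import stats

def collision_probability(predicted_miss_ft, sigma_ft, collision_radius_ft=500.0):
    """P(|actual miss| < collision radius) for a Gaussian miss-distance model."""
    lo = stats.norm.cdf(-collision_radius_ft, loc=predicted_miss_ft, scale=sigma_ft)
    hi = stats.norm.cdf(collision_radius_ft, loc=predicted_miss_ft, scale=sigma_ft)
    return hi - lo

ACCEPTABLE_HAZARD = 1e-3   # one accident per thousand alert-triggering blunders (illustrative)

p = collision_probability(predicted_miss_ft=2500.0, sigma_ft=800.0)
print(p, "ALERT" if p > ACCEPTABLE_HAZARD else "no alert")
```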

  20. Mesh-Based Entry Vehicle and Explosive Debris Re-Contact Probability Modeling

    NASA Technical Reports Server (NTRS)

    McPherson, Mark A.; Mendeck, Gavin F.

    2011-01-01

    Quantifying the risk to a crewed vehicle arising from potential re-contact with fragments from an explosive breakup of any jettisoned spacecraft segments during entry has long been a goal. However, great difficulty lies in efficiently capturing the potential locations of each fragment and their collective threat to the vehicle. The method presented in this paper addresses this problem by using a stochastic approach that discretizes simulated debris pieces into volumetric cells, and then assesses strike probabilities accordingly. Combining spatial debris density and relative velocity between the debris and the entry vehicle, the strike probability can be calculated from the integral of the debris flux inside each cell over time. Using this technique it is possible to assess the risk to an entry vehicle along an entire trajectory as it separates from the jettisoned segment. By decoupling the fragment trajectories from that of the entry vehicle, multiple potential separation maneuvers can then be evaluated rapidly to provide an assessment of the best strategy to mitigate the re-contact risk.
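
    The cell-based strike probability described above can be sketched as follows: the expected number of strikes is the debris flux (spatial density times relative speed times projected area) summed over the traversed cells and their residence times, and the strike probability then follows from a Poisson assumption. All input values below are illustrative.

```python
import numpy as np

def strike_probability(density, rel_speed, dt, area):
    """P(at least one strike) from per-cell debris flux along the trajectory.

    density   : debris spatial density in each traversed cell (1/m^3)
    rel_speed : debris-vehicle relative speed in each cell (m/s)
    dt        : time spent in each cell (s)
    area      : vehicle projected area (m^2)
    """
    expected_hits = np.sum(np.asarray(density) * np.asarray(rel_speed) * dt * area)
    return 1.0 - np.exp(-expected_hits)

# Illustrative cells along one candidate separation trajectory.
print(strike_probability(density=[1e-9, 4e-9, 2e-9],
                         rel_speed=[120.0, 90.0, 60.0],
                         dt=0.5, area=12.0))
```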

  1. Quantifying predictive uncertainty of streamflow forecasts based on a Bayesian joint probability model

    NASA Astrophysics Data System (ADS)

    Zhao, Tongtiegang; Wang, Q. J.; Bennett, James C.; Robertson, David E.; Shao, Quanxi; Zhao, Jianshi

    2015-09-01

    Uncertainty is inherent in streamflow forecasts and is an important determinant of the utility of forecasts for water resources management. However, predictions by deterministic models provide only single values without uncertainty attached. This study presents a method for using a Bayesian joint probability (BJP) model to post-process deterministic streamflow forecasts by quantifying predictive uncertainty. The BJP model is comprised of a log-sinh transformation that normalises hydrological data, and a bi-variate Gaussian distribution that characterises the dependence relationship. The parameters of the transformation and the distribution are estimated through Bayesian inference with a Monte Carlo Markov chain (MCMC) algorithm. The BJP model produces, from a raw deterministic forecast, an ensemble of values to represent forecast uncertainty. The model is applied to raw deterministic forecasts of inflows to the Three Gorges Reservoir in China as a case study. The heteroscedasticity and non-Gaussianity of forecast uncertainty are effectively addressed. The ensemble spread accounts for the forecast uncertainty and leads to considerable improvement in terms of the continuous ranked probability score. The forecasts become less accurate as lead time increases, and the ensemble spread provides reliable information on the forecast uncertainty. We conclude that the BJP model is a useful tool to quantify predictive uncertainty in post-processing deterministic streamflow forecasts.
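
    With fitted parameters in hand, the post-processing step reduces to a transform, a conditional Gaussian draw and a back-transform. The sketch below follows the commonly published log-sinh form; all parameter values are illustrative placeholders rather than the MCMC-estimated posteriors.

```python
import numpy as np

rng = np.random.default_rng(7)

def log_sinh(y, a, b):
    """Log-sinh transformation z = (1/b) * log(sinh(a + b*y))."""
    return np.log(np.sinh(a + b * y)) / b

def inv_log_sinh(z, a, b):
    return (np.arcsinh(np.exp(b * z)) - a) / b

def bjp_ensemble(raw_forecast, n=1000,
                 a=0.01, b=0.02,                    # transformation parameters (illustrative)
                 mu_f=200.0, mu_o=200.0,            # transformed-space means
                 sd_f=40.0, sd_o=45.0, rho=0.85):   # std devs and correlation
    """Ensemble of observations conditional on one raw deterministic forecast."""
    zf = log_sinh(raw_forecast, a, b)
    cond_mean = mu_o + rho * sd_o / sd_f * (zf - mu_f)
    cond_sd = sd_o * np.sqrt(1.0 - rho ** 2)
    zo = rng.normal(cond_mean, cond_sd, size=n)
    return inv_log_sinh(zo, a, b)

ens = bjp_ensemble(raw_forecast=250.0)   # e.g. inflow in m^3/s
print(np.percentile(ens, [10, 50, 90]))
```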

  2. Model-Based Calculations of the Probability of a Country's Nuclear Proliferation Decisions

    SciTech Connect

    Li, Jun; Yim, Man-Sung; McNelis, David N.

    2007-07-01

    explain the occurrences of proliferation decisions. However, predicting major historical proliferation events using model-based predictions has been unreliable. Nuclear proliferation decisions by a country are affected by three main factors: (1) technology; (2) finance; and (3) political motivation [1]. Technological capability is important as nuclear weapons development needs special materials, a detonation mechanism, delivery capability, and the supporting human resources and knowledge base. Financial capability is likewise important as the development of the technological capabilities requires a serious financial commitment. It would be difficult for any state with a gross national product (GNP) significantly less than about $100 billion to devote enough annual governmental funding to a nuclear weapon program to actually achieve positive results within a reasonable time frame (i.e., 10 years). At the same time, nuclear proliferation is not a matter determined by a mastery of technical details or overcoming financial constraints. Technology or finance is a necessary condition but not a sufficient condition for nuclear proliferation. At the most fundamental level, the proliferation decision by a state is controlled by its political motivation. To effectively address the issue of predicting proliferation events, all three of the factors must be included in the model. To the knowledge of the authors, none of the existing models considered the 'technology' variable as part of the modeling. This paper presents an attempt to develop a methodology for statistical modeling and predicting a country's nuclear proliferation decisions. The approach is based on the combined use of data on a country's nuclear technical capability profiles, economic development status, security environment factors and internal political and cultural factors. All of the information utilized in the study was from open source literature. (authors)

  3. Probability-Based Evaluation of Peptide and Protein Identifications from Tandem Mass Spectrometry and SEQUEST Analysis: The Human Proteome

    SciTech Connect

    Qian, Weijun; Liu, Tao; Monroe, Matthew E.; Strittmatter, Eric F.; Jacobs, Jon M.; Kangas, Lars J.; Petritis, Konstantinos; Camp, David G.; Smith, Richard D.

    2005-01-01

    Large scale protein identifications from highly complex protein mixtures have recently been achieved using multidimensional liquid chromatography coupled with tandem mass spectrometry (LC/LC-MS/MS) and subsequent database searching with algorithms such as SEQUEST. Here, we describe a probability-based evaluation of false positive rates associated with peptide identifications from three different human proteome samples. Peptides from human plasma, human mammary epithelial cell (HMEC) lysate, and human hepatocyte (Huh)-7.5 cell lysate were separated by strong cation exchange (SCX) chromatography coupled offline with reversed-phase capillary LC-MS/MS analyses. The MS/MS spectra were first analyzed by SEQUEST, searching independently against both normal and sequence-reversed human protein databases, and the false positive rates of peptide identifications for the three proteome samples were then analyzed and compared. The observed false positive rates of peptide identifications for human plasma were significantly higher than those for the human cell lines when identical filtering criteria were used, which suggests that the false positive rates are highly dependent on sample characteristics, particularly the number of proteins found within the detectable dynamic range. Two new sets of filtering criteria are proposed for human plasma and human cell lines, respectively, to provide an overall confidence of >95% for peptide identifications. The new criteria were compared, using a normalized elution time (NET) criterion (Petritis et al. Anal. Chem. 2003, 75, 1039-48), with previously published criteria (Washburn et al. Nat. Biotechnol. 2001, 19, 242-7). The results demonstrate that the present criteria provide significantly higher levels of confidence for peptide identifications.
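
    The reversed-database logic behind such false positive rates can be sketched as a simple ratio: peptides passing the score filter in the decoy (reversed) search estimate the random matches among those passing in the forward search. The score lists and cutoff below are illustrative; the paper's actual filtering criteria are more elaborate (e.g., they incorporate a normalized elution time criterion).

```python
def decoy_false_positive_rate(forward_scores, decoy_scores, score_cutoff):
    """Estimate the false positive rate of peptide IDs passing a score filter,
    assuming decoy (sequence-reversed) hits model random matches."""
    n_forward = sum(s >= score_cutoff for s in forward_scores)
    n_decoy = sum(s >= score_cutoff for s in decoy_scores)
    return 0.0 if n_forward == 0 else n_decoy / n_forward

# Illustrative score lists (e.g., SEQUEST XCorr values).
fwd = [3.1, 2.8, 2.2, 1.9, 3.6, 2.5, 1.4]
rev = [1.2, 2.1, 1.0, 0.8, 1.7, 2.3, 0.9]
print(decoy_false_positive_rate(fwd, rev, score_cutoff=2.0))
```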

  4. Prediction of nucleic acid binding probability in proteins: a neighboring residue network based score.

    PubMed

    Miao, Zhichao; Westhof, Eric

    2015-06-23

    We describe a general binding score for predicting the nucleic acid binding probability in proteins. The score is directly derived from physicochemical and evolutionary features and integrates a residue neighboring network approach. Our process achieves stable and high accuracies on both DNA- and RNA-binding proteins and illustrates how the main driving forces for nucleic acid binding are common. Because of the effective integration of the synergetic effects of the network of neighboring residues and the fact that the prediction yields a hierarchical scoring on the protein surface, energy funnels for nucleic acid binding appear on protein surfaces, pointing to the dynamic process occurring in the binding of nucleic acids to proteins. PMID:25940624

  5. Prediction of nucleic acid binding probability in proteins: a neighboring residue network based score

    PubMed Central

    Miao, Zhichao; Westhof, Eric

    2015-01-01

    We describe a general binding score for predicting the nucleic acid binding probability in proteins. The score is directly derived from physicochemical and evolutionary features and integrates a residue neighboring network approach. Our process achieves stable and high accuracies on both DNA- and RNA-binding proteins and illustrates how the main driving forces for nucleic acid binding are common. Because of the effective integration of the synergetic effects of the network of neighboring residues and the fact that the prediction yields a hierarchical scoring on the protein surface, energy funnels for nucleic acid binding appear on protein surfaces, pointing to the dynamic process occurring in the binding of nucleic acids to proteins. PMID:25940624

  6. Multiple Vehicle Cooperative Localization with Spatial Registration Based on a Probability Hypothesis Density Filter

    PubMed Central

    Zhang, Feihu; Buckl, Christian; Knoll, Alois

    2014-01-01

    This paper studies the problem of multiple vehicle cooperative localization with spatial registration in the formulation of the probability hypothesis density (PHD) filter. Assuming vehicles are equipped with proprioceptive and exteroceptive sensors (with biases) to cooperatively localize positions, a simultaneous solution for joint spatial registration and state estimation is proposed. For this, we rely on the sequential Monte Carlo implementation of the PHD filtering. Compared to other methods, the concept of multiple vehicle cooperative localization with spatial registration is first proposed under Random Finite Set Theory. In addition, the proposed solution also addresses the challenges for multiple vehicle cooperative localization, e.g., the communication bandwidth issue and data association uncertainty. The simulation result demonstrates its reliability and feasibility in large-scale environments. PMID:24406860

  7. A Regression-based Approach to Assessing Stream Nitrogen Impairment Probabilities

    NASA Astrophysics Data System (ADS)

    McMahon, G.; Qian, S.; Roessler, C.

    2002-05-01

    A recently completed National Research Council study of the Total Maximum Daily Load (TMDL) program of the U.S. Environmental Protection Agency recommends an increased use of models to assess the conditions of waters for which nutrient load limits may need to be developed. Models can synthesize data to fill gaps associated with limited monitoring networks and estimate impairment probabilities for contaminants of interest. The U.S. Geological Survey, as part of the National Water-Quality Assessment Program and in cooperation with the North Carolina Division of Water Quality, has developed a nonlinear regression model to estimate impairment probabilities for all river segments, or reaches, in North Carolina's Neuse River. In this study, a reach is considered impaired if the annual mean concentration of total nitrogen is greater than 1.5 milligrams per liter (mg/L), a concentration associated with stream eutrophication. A SPARROW (Spatially Referenced Regressions on Watershed attributes) total nitrogen model was calibrated using data from three large basins in eastern North Carolina, including the Neuse River. The model specifies that in-stream nitrogen flux is a function of a nonlinear relation of nitrogen sources, including point sources, atmospheric deposition, inputs from agricultural and developed land, and terrestrial and aquatic nutrient processing. Because data are managed in a geographic information system, the SPARROW model uses information that can be derived from the stream reach network about the spatial relations among nitrogen fluxes, sources, landscape characteristics, and stream characteristics. This presentation describes a process for estimating the proportion (and 90-percent confidence interval) of Neuse River reaches with a total nitrogen concentration less than 1.5 mg/L and discusses the incorporation of prediction errors into the analysis.

  8. Analysis of extreme top event frequency percentiles based on fast probability integration

    SciTech Connect

    Staple, B.; Haskin, F.E.

    1993-10-01

    In risk assessments, a primary objective is to determine the frequency with which a collection of initiating and basic events, E_e, leads to some undesired top event, T. Uncertainties in the occurrence rates, x_t, assigned to the initiating and basic events cause uncertainty in the top event frequency, z_T. The quantification of the uncertainty in z_T is an essential part of risk assessment called uncertainty analysis. In the past, it has been difficult to evaluate the extreme percentiles of output variables like z_T. Analytic methods such as the method of moments do not provide estimates of output percentiles and the Monte Carlo (MC) method can be used to estimate extreme output percentiles only by resorting to large sample sizes. A promising alternative to these methods is the family of fast probability integration (FPI) methods. These methods approximate the integrals of multi-variate functions, representing percentiles of interest, without recourse to multi-dimensional numerical integration. FPI methods give precise results and have been demonstrated to be more efficient than MC methods for estimating extreme output percentiles. FPI allows the analyst to choose extreme percentiles of interest and perform sensitivity analyses in those regions. Such analyses can provide valuable insights as to the events driving the top event frequency response in extreme probability regions. In this paper, FPI methods are adapted a) to precisely estimate extreme top event frequency percentiles and b) to allow the quantification of sensitivity measures at these extreme percentiles. In addition, the relative precision and efficiency of alternative methods for treating lognormally distributed inputs is investigated. The methodology is applied to the top event frequency expression for the dominant accident sequence from a risk assessment of the Grand Gulf nuclear power plant.

  9. Measurement of children's exposure to pesticides: analysis of urinary metabolite levels in a probability-based sample.

    PubMed Central

    Adgate, J L; Barr, D B; Clayton, C A; Eberly, L E; Freeman, N C; Lioy, P J; Needham, L L; Pellizzari, E D; Quackenboss, J J; Roy, A; Sexton, K

    2001-01-01

    The Minnesota Children's Pesticide Exposure Study is a probability-based sample of 102 children 3-13 years old who were monitored for commonly used pesticides. During the summer of 1997, first-morning-void urine samples (1-3 per child) were obtained for 88% of study children and analyzed for metabolites of insecticides and herbicides: carbamates and related compounds (1-NAP), atrazine (AM), malathion (MDA), and chlorpyrifos and related compounds (TCPy). TCPy was present in 93% of the samples, whereas 1-NAP, MDA, and AM were detected in 45%, 37%, and 2% of samples, respectively. Measured intrachild means ranged from 1.4 microg/L for MDA to 9.2 microg/L for TCPy, and there was considerable intrachild variability. For children providing three urine samples, geometric mean TCPy levels were greater than the detection limit in 98% of the samples, and nearly half the children had geometric mean 1-NAP and MDA levels greater than the detection limit. Interchild variability was significantly greater than intrachild variability for 1-NAP (p = 0.0037) and TCPy (p < 0.0001). The four metabolites measured were not correlated within urine samples, and children's metabolite levels did not vary systematically by sex, age, race, household income, or putative household pesticide use. On a log scale, mean TCPy levels were significantly higher in urban than in nonurban children (7.2 vs. 4.7 microg/L; p = 0.036). Weighted population mean concentrations were 3.9 [standard error (SE) = 0.7; 95% confidence interval (CI), 2.5, 5.3] microg/L for 1-NAP, 1.7 (SE = 0.3; 95% CI, 1.1, 2.3) microg/L for MDA, and 9.6 (SE = 0.9; 95% CI, 7.8, 11) microg/L for TCPy. The weighted population results estimate the overall mean and variability of metabolite levels for more than 84,000 children in the census tracts sampled. Levels of 1-NAP were lower than reported adult reference range concentrations, whereas TCPy concentrations were substantially higher. Concentrations of MDA were detected more frequently

  10. Probability-based particle detection that enables threshold-free and robust in vivo single-molecule tracking

    PubMed Central

    Smith, Carlas S.; Stallinga, Sjoerd; Lidke, Keith A.; Rieger, Bernd; Grunwald, David

    2015-01-01

    Single-molecule detection in fluorescence nanoscopy has become a powerful tool in cell biology but can present vexing issues in image analysis, such as limited signal, unspecific background, empirically set thresholds, image filtering, and false-positive detection limiting overall detection efficiency. Here we present a framework in which expert knowledge and parameter tweaking are replaced with a probability-based hypothesis test. Our method delivers robust and threshold-free signal detection with a defined error estimate and improved detection of weaker signals. The probability value has consequences for downstream data analysis, such as weighing a series of detections and corresponding probabilities, Bayesian propagation of probability, or defining metrics in tracking applications. We show that the method outperforms all current approaches, yielding a detection efficiency of >70% and a false-positive detection rate of <5% under conditions down to 17 photons/pixel background and 180 photons/molecule signal, which is beneficial for any kind of photon-limited application. Examples include limited brightness and photostability, phototoxicity in live-cell single-molecule imaging, and use of new labels for nanoscopy. We present simulations, experimental data, and tracking of low-signal mRNAs in yeast cells. PMID:26424801
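
    A heavily simplified stand-in for such a probability-based test is sketched below: counts in a PSF-sized inner window are compared against the background rate estimated from the surrounding pixels, and the resulting Poisson p-value replaces an empirical intensity threshold. The window sizes and synthetic data are assumptions; the paper's actual hypothesis test is more sophisticated.

```python
import numpy as np
from scipy import stats

def detection_p_value(roi, inner_radius=2):
    """Poisson test of 'signal present' in the ROI centre (simplified sketch).

    roi : 2-D array of photon counts; pixels outside the inner disc estimate
    the background. Returns P(inner counts this large or larger | background
    alone), a threshold-free detection score."""
    h, w = roi.shape
    yy, xx = np.mgrid[0:h, 0:w]
    r = np.hypot(yy - (h - 1) / 2, xx - (w - 1) / 2)
    inner = r <= inner_radius
    bg_rate = roi[~inner].mean()                 # background photons per pixel
    observed = roi[inner].sum()
    expected = bg_rate * inner.sum()
    return stats.poisson.sf(observed - 1, expected)   # P(X >= observed | background)

rng = np.random.default_rng(0)
frame = rng.poisson(17, size=(9, 9)).astype(float)   # 17 photons/pixel background
frame[3:6, 3:6] += 20.0                              # weak synthetic emitter in the centre
print(detection_p_value(frame))
```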

  11. A generic probability based model to derive regional patterns of crops in time and space

    NASA Astrophysics Data System (ADS)

    Wattenbach, Martin; Luedtke, Stefan; Redweik, Richard; van Oijen, Marcel; Balkovic, Juraj; Reinds, Gert Jan

    2015-04-01

    Croplands are not only the key to the human food supply; they also change the biophysical and biogeochemical properties of the land surface, leading to changes in the water cycle and energy partitioning, they influence soil erosion and they substantially contribute to the amount of greenhouse gases entering the atmosphere. The effects of croplands on the environment depend on the type of crop and the associated management, both of which are related to the site conditions, economic boundary settings and the preferences of individual farmers. The method described here is designed to predict the most probable crop to appear at a given location and time. The method uses statistical crop area information on NUTS2 level from EUROSTAT and the Common Agricultural Policy Regionalized Impact Model (CAPRI) as observations. These crops are then spatially disaggregated to the 1 x 1 km grid scale within the region, using the assumption that the probability of a crop appearing at a given location and in a given year depends on a) the suitability of the land for the cultivation of the crop derived from the MARS Crop Yield Forecast System (MCYFS) and b) expert knowledge of agricultural practices. The latter includes knowledge concerning the feasibility of one crop following another (e.g. a late-maturing crop might leave too little time for the establishment of a winter cereal crop) and the need to combat weed infestations or crop diseases. The model is implemented in R and PostGIS. The quality of the generated crop sequences per grid cell is evaluated on the basis of the given statistics reported by the joint EU/CAPRI database. The assessment is given on NUTS2 level using per cent bias as a measure with a threshold of 15% as minimum quality. The results clearly indicate that crops with a large relative share within the administrative unit are not as error prone as crops that occupy only minor parts of the unit. However, still roughly 40% show an absolute per cent bias above the 15% threshold. This

  12. Probability Distribution Extraction from TEC Estimates based on Kernel Density Estimation

    NASA Astrophysics Data System (ADS)

    Demir, Uygar; Toker, Cenk; Çenet, Duygu

    2016-07-01

    Statistical analysis of the ionosphere, specifically the Total Electron Content (TEC), may reveal important information about its temporal and spatial characteristics. One of the core metrics that express the statistical properties of a stochastic process is its Probability Density Function (pdf). Furthermore, statistical parameters such as mean, variance and kurtosis, which can be derived from the pdf, may provide information about the spatial uniformity or clustering of the electron content. For example, the variance differentiates between a quiet ionosphere and a disturbed one, whereas kurtosis differentiates between a geomagnetic storm and an earthquake. Therefore, valuable information about the state of the ionosphere (and the natural phenomena that cause the disturbance) can be obtained by looking at the statistical parameters. In the literature, there are publications which try to fit the histogram of TEC estimates to some well-known pdfs such as Gaussian, Exponential, etc. However, constraining a histogram to fit a function with a fixed shape will increase estimation error, and all the information extracted from such a pdf will continue to contain this error. In such techniques, it is highly likely that artificial characteristics will appear in the estimated pdf which are not present in the original data. In the present study, we use the Kernel Density Estimation (KDE) technique to estimate the pdf of the TEC. KDE is a non-parametric approach which does not impose a specific form on the TEC. As a result, better pdf estimates that almost perfectly fit the observed TEC values can be obtained as compared to the techniques mentioned above. KDE is particularly good at representing the tail probabilities and outliers. We also calculate the mean, variance and kurtosis of the measured TEC values. The technique is applied to the ionosphere over Turkey where the TEC values are estimated from GNSS measurements from the TNPGN-Active (Turkish National Permanent
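
    The KDE step and the derived moments can be sketched with scipy's Gaussian kernel estimator; the synthetic values below merely stand in for GNSS-derived TEC estimates.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Synthetic stand-in for a day's vertical TEC estimates (TECU).
tec = np.concatenate([rng.normal(18, 3, 800), rng.normal(35, 5, 120)])

# Non-parametric pdf estimate (bandwidth via Scott's rule by default).
kde = stats.gaussian_kde(tec)
grid = np.linspace(tec.min() - 5, tec.max() + 5, 400)
pdf = kde(grid)

# Statistical parameters discussed in the abstract.
mean, var = tec.mean(), tec.var()
kurt = stats.kurtosis(tec)          # excess kurtosis
print(f"mean={mean:.1f} TECU, var={var:.1f}, kurtosis={kurt:.2f}, "
      f"pdf integrates to {np.trapz(pdf, grid):.3f}")
```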

  13. Development of new risk score for pre-test probability of obstructive coronary artery disease based on coronary CT angiography.

    PubMed

    Fujimoto, Shinichiro; Kondo, Takeshi; Yamamoto, Hideya; Yokoyama, Naoyuki; Tarutani, Yasuhiro; Takamura, Kazuhisa; Urabe, Yoji; Konno, Kumiko; Nishizaki, Yuji; Shinozaki, Tomohiro; Kihara, Yasuki; Daida, Hiroyuki; Isshiki, Takaaki; Takase, Shinichi

    2015-09-01

    Existing methods to calculate pre-test probability of obstructive coronary artery disease (CAD) have been established using selected high-risk patients who were referred to conventional coronary angiography. The purpose of this study is to develop and validate our new method for pre-test probability of obstructive CAD using patients who underwent coronary CT angiography (CTA), which could be applicable to a wider range of patient population. Using consecutive 4137 patients with suspected CAD who underwent coronary CTA at our institution, a multivariate logistic regression model including clinical factors as covariates calculated the pre-test probability (K-score) of obstructive CAD determined by coronary CTA. The K-score was compared with the Duke clinical score using the area under the curve (AUC) for the receiver-operating characteristic curve. External validation was performed by an independent sample of 319 patients. The final model included eight significant predictors: age, gender, coronary risk factor (hypertension, diabetes mellitus, dyslipidemia, smoking), history of cerebral infarction, and chest symptom. The AUC of the K-score was significantly greater than that of the Duke clinical score for both derivation (0.736 vs. 0.699) and validation (0.714 vs. 0.688) data sets. Among patients who underwent coronary CTA, newly developed K-score had better pre-test prediction ability of obstructive CAD compared to Duke clinical score in Japanese population. PMID:24770610
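
    Applying such a score amounts to evaluating a logistic model over the listed predictors. The sketch below mirrors the predictor set named in the abstract, but the coefficients are placeholders, not the published K-score weights.

```python
import math

# Placeholder coefficients for illustration only -- not the published K-score.
COEF = {
    "intercept": -4.0, "age_per_year": 0.05, "male": 0.9,
    "hypertension": 0.4, "diabetes": 0.6, "dyslipidemia": 0.4,
    "smoking": 0.5, "prior_cerebral_infarction": 0.7, "typical_chest_pain": 1.2,
}

def pretest_probability(age, male, hypertension, diabetes, dyslipidemia,
                        smoking, prior_cerebral_infarction, typical_chest_pain):
    """Logistic model: probability of obstructive CAD before coronary CTA."""
    x = (COEF["intercept"] + COEF["age_per_year"] * age + COEF["male"] * male
         + COEF["hypertension"] * hypertension + COEF["diabetes"] * diabetes
         + COEF["dyslipidemia"] * dyslipidemia + COEF["smoking"] * smoking
         + COEF["prior_cerebral_infarction"] * prior_cerebral_infarction
         + COEF["typical_chest_pain"] * typical_chest_pain)
    return 1.0 / (1.0 + math.exp(-x))

print(pretest_probability(age=62, male=1, hypertension=1, diabetes=0,
                          dyslipidemia=1, smoking=0,
                          prior_cerebral_infarction=0, typical_chest_pain=1))
```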

  14. Species-Level Deconvolution of Metagenome Assemblies with Hi-C–Based Contact Probability Maps

    PubMed Central

    Burton, Joshua N.; Liachko, Ivan; Dunham, Maitreya J.; Shendure, Jay

    2014-01-01

    Microbial communities consist of mixed populations of organisms, including unknown species in unknown abundances. These communities are often studied through metagenomic shotgun sequencing, but standard library construction methods remove long-range contiguity information; thus, shotgun sequencing and de novo assembly of a metagenome typically yield a collection of contigs that cannot readily be grouped by species. Methods for generating chromatin-level contact probability maps, e.g., as generated by the Hi-C method, provide a signal of contiguity that is completely intracellular and contains both intrachromosomal and interchromosomal information. Here, we demonstrate how this signal can be exploited to reconstruct the individual genomes of microbial species present within a mixed sample. We apply this approach to two synthetic metagenome samples, successfully clustering the genome content of fungal, bacterial, and archaeal species with more than 99% agreement with published reference genomes. We also show that the Hi-C signal can secondarily be used to create scaffolded genome assemblies of individual eukaryotic species present within the microbial community, with higher levels of contiguity than some of the species’ published reference genomes. PMID:24855317

  15. Filled Pause Refinement Based on the Pronunciation Probability for Lecture Speech

    PubMed Central

    Long, Yan-Hua; Ye, Hong

    2015-01-01

    Nowadays, although automatic speech recognition has become quite proficient in recognizing or transcribing well-prepared fluent speech, the transcription of speech that contains many disfluencies remains problematic, such as spontaneous conversational and lecture speech. Filled pauses (FPs) are the most frequently occurring disfluencies in this type of speech. Most recent studies have shown that FPs are widely believed to increase the error rates for state-of-the-art speech transcription, primarily because most FPs are not well annotated or provided in training data transcriptions and because of the similarities in acoustic characteristics between FPs and some common non-content words. To enhance the speech transcription system, we propose a new automatic refinement approach to detect FPs in British English lecture speech transcription. This approach combines the pronunciation probabilities for each word in the dictionary and acoustic language model scores for FP refinement through a modified speech recognition forced-alignment framework. We evaluate the proposed approach on the Reith Lectures speech transcription task, in which only imperfect training transcriptions are available. Successful results are achieved for both the development and evaluation datasets. Acoustic models trained on different styles of speech genres have been investigated with respect to FP refinement. To further validate the effectiveness of the proposed approach, speech transcription performance has also been examined using systems built on training data transcriptions with and without FP refinement. PMID:25860959

  16. Scaling of strength and lifetime probability distributions of quasibrittle structures based on atomistic fracture mechanics

    PubMed Central

    Bažant, Zdeněk P.; Le, Jia-Liang; Bazant, Martin Z.

    2009-01-01

    The failure probability of engineering structures such as aircraft, bridges, dams, nuclear structures, and ships, as well as microelectronic components and medical implants, must be kept extremely low, typically <10−6. The safety factors needed to ensure it have so far been assessed empirically. For perfectly ductile and perfectly brittle structures, the empirical approach is sufficient because the cumulative distribution function (cdf) of random material strength is known and fixed. However, such an approach is insufficient for structures consisting of quasibrittle materials, which are brittle materials with inhomogeneities that are not negligible compared with the structure size. The reason is that the strength cdf of quasibrittle structure varies from Gaussian to Weibullian as the structure size increases. In this article, a recently proposed theory for the strength cdf of quasibrittle structure is refined by deriving it from fracture mechanics of nanocracks propagating by small, activation-energy-controlled, random jumps through the atomic lattice. This refinement also provides a plausible physical justification of the power law for subcritical creep crack growth, hitherto considered empirical. The theory is further extended to predict the cdf of structural lifetime at constant load, which is shown to be size- and geometry-dependent. The size effects on structure strength and lifetime are shown to be related and the latter to be much stronger. The theory fits previously unexplained deviations of experimental strength and lifetime histograms from the Weibull distribution. Finally, a boundary layer method for numerical calculation of the cdf of structural strength and lifetime is outlined. PMID:19561294

  17. Towards smart prosthetic hand: Adaptive probability based skeletal muscle fatigue model.

    PubMed

    Kumar, Parmod; Sebastian, Anish; Potluri, Chandrasekhar; Urfer, Alex; Naidu, D; Schoen, Marco P

    2010-01-01

    Skeletal muscle force can be estimated using surface electromyographic (sEMG) signals. Usually, the surface location for the sensors is near the respective muscle motor unit points. Skeletal muscles generate a spatial EMG signal, which causes cross talk between different sEMG signal sensors. In this study, an array of three sEMG sensors is used to capture the information of muscle dynamics in terms of sEMG signals. The recorded sEMG signals are filtered utilizing optimized nonlinear half-Gaussian Bayesian filter parameters, and the muscle force signal is filtered using a Chebyshev type-II filter. The filter optimization is accomplished using Genetic Algorithms. Three discrete time state-space muscle fatigue models are obtained using system identification and modal transformation for three sets of sensors for a single motor unit. The outputs of these three muscle fatigue models are fused with a probabilistic Kullback Information Criterion (KIC) for model selection. The final fused output is estimated with an adaptive probability of KIC, which provides improved force estimates. PMID:21095927

  18. Generating Within-Plant Spatial Distributions of an Insect Herbivore Based on Aggregation Patterns and Per-Node Infestation Probabilities.

    PubMed

    Rincon, Diego F; Hoy, Casey W; Cañas, Luis A

    2015-04-01

    Most predator-prey models extrapolate functional responses from small-scale experiments assuming spatially uniform within-plant predator-prey interactions. However, some predators focus their search in certain plant regions, and herbivores tend to select leaves to balance their nutrient uptake and exposure to plant defenses. Individual-based models that account for heterogeneous within-plant predator-prey interactions can be used to scale-up functional responses, but they would require the generation of explicit prey spatial distributions within-plant architecture models. The silverleaf whitefly, Bemisia tabaci biotype B (Gennadius) (Hemiptera: Aleyrodidae), is a significant pest of tomato crops worldwide that exhibits highly aggregated populations at several spatial scales, including within the plant. As part of an analytical framework to understand predator-silverleaf whitefly interactions, the objective of this research was to develop an algorithm to generate explicit spatial counts of silverleaf whitefly nymphs within tomato plants. The algorithm requires the plant size and the number of silverleaf whitefly individuals to distribute as inputs, and includes models that describe infestation probabilities per leaf nodal position and the aggregation pattern of the silverleaf whitefly within tomato plants and leaves. The output is a simulated number of silverleaf whitefly individuals for each leaf and leaflet on one or more plants. Parameter estimation was performed using nymph counts per leaflet censused from 30 artificially infested tomato plants. Validation revealed a substantial agreement between algorithm outputs and independent data that included the distribution of counts of both eggs and nymphs. This algorithm can be used in simulation models that explore the effect of local heterogeneity on whitefly-predator dynamics. PMID:26313173
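
    A toy version of such a generator is sketched below: individuals are allocated across leaf nodal positions with probabilities that decay with node position and are perturbed by gamma-distributed node effects to induce aggregation. The functional forms and parameter values are assumptions for illustration, not the fitted per-node infestation and aggregation models.

```python
import numpy as np

rng = np.random.default_rng(1)

def distribute_nymphs(n_total, n_nodes, alpha=1.2, beta=0.15, k=0.8):
    """Toy allocation of n_total nymphs across leaf nodal positions.

    Hypothetical components (not the authors' fitted models):
    - per-node infestation weight decays with nodal position;
    - gamma-distributed node effects add clumping, giving a negative-
      binomial-like aggregated pattern.
    """
    nodes = np.arange(1, n_nodes + 1)
    base = alpha * np.exp(-beta * nodes)                     # relative weight per node
    clump = rng.gamma(shape=k, scale=1.0 / k, size=n_nodes)  # aggregation effects
    p = base * clump
    p /= p.sum()
    return rng.multinomial(n_total, p)                       # counts per leaf node

print(distribute_nymphs(n_total=200, n_nodes=15))
```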

  19. Calibrating perceived understanding and competency in probability concepts: A diagnosis of learning difficulties based on Rasch probabilistic model

    NASA Astrophysics Data System (ADS)

    Mahmud, Zamalia; Porter, Anne; Salikin, Masniyati; Ghani, Nor Azura Md

    2015-12-01

    Students' understanding of probability concepts has been investigated from various different perspectives. Competency, on the other hand, is often measured separately in the form of a test structure. This study set out to show that perceived understanding and competency can be calibrated and assessed together using Rasch measurement tools. Forty-four students from the STAT131 Understanding Uncertainty and Variation course at the University of Wollongong, NSW, volunteered to participate in the study. Rasch measurement, which is based on a probabilistic model, is used to calibrate the responses from two survey instruments and investigate the interactions between them. Data were captured from the e-learning platform Moodle where students provided their responses through an online quiz. The study shows that the majority of students perceived little understanding of conditional and independent events prior to learning about them but tended to demonstrate a slightly higher competency level afterward. Based on the Rasch map, there is indication of some increase in learning and knowledge about some probability concepts at the end of the two weeks of lessons on probability concepts.
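
    For readers unfamiliar with the underlying model, the dichotomous Rasch model places persons and items on a common logit scale, with the response probability determined by their difference; a minimal sketch follows (the numbers are illustrative).

```python
import math

def rasch_probability(person_ability, item_difficulty):
    """P(correct / endorsed response) under the dichotomous Rasch model."""
    return 1.0 / (1.0 + math.exp(-(person_ability - item_difficulty)))

# A person one logit above an item's difficulty responds correctly about 73%
# of the time; at equal locations the probability is 0.5.
print(rasch_probability(0.5, -0.5), rasch_probability(0.0, 0.0))
```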

  20. Results from probability-based, simplified, off-shore Louisiana CSEM hydrocarbon reservoir modeling

    NASA Astrophysics Data System (ADS)

    Stalnaker, J. L.; Tinley, M.; Gueho, B.

    2009-12-01

    Perhaps the biggest impediment to the commercial application of controlled-source electromagnetic (CSEM) geophysics to marine hydrocarbon exploration is the inefficiency of modeling and data inversion. If an understanding of the typical (in a statistical sense) geometrical and electrical nature of a reservoir can be attained, then it is possible to derive therefrom a simplified yet accurate model of the electromagnetic interactions that produce a measured marine CSEM signal, leading ultimately to efficient modeling and inversion. We have compiled geometric and resistivity measurements from roughly 100 known, producing off-shore Louisiana Gulf of Mexico reservoirs. Recognizing that most reservoirs could be recreated roughly from a sectioned hemi-ellipsoid, we devised a unified, compact reservoir geometry description. Each reservoir was initially fit to the ellipsoid by eye, though we plan in the future to perform a more rigorous least-squares fit. We created, using kernel density estimation, initial probabilistic descriptions of reservoir parameter distributions, with the understanding that additional information would not fundamentally alter our results, but rather increase accuracy. From the probabilistic description, we designed an approximate model consisting of orthogonally oriented current segments distributed across the ellipsoid--enough to define the shape, yet few enough to be resolved during inversion. The moment and length of the currents are mapped to the geometry and resistivity of the ellipsoid. The probability density functions (pdfs) derived from reservoir statistics serve as a workbench. We first use the pdfs in a Monte Carlo simulation designed to assess the detectability of off-shore Louisiana reservoirs using magnitude versus offset (MVO) anomalies. From the pdfs, many reservoir instances are generated (using rejection sampling) and each normalized MVO response is calculated. The response strength is summarized by numerically computing MVO power, and that
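
    The rejection-sampling step can be sketched for a single reservoir parameter as follows; the placeholder thickness data and the Gaussian kernel density estimate stand in for the compiled Gulf of Mexico statistics and are not the authors' distributions.

        import numpy as np
        from scipy.stats import gaussian_kde

        rng = np.random.default_rng(1)

        # Placeholder "measured" reservoir thicknesses (m); stands in for the compiled data.
        thickness_obs = rng.lognormal(mean=3.0, sigma=0.4, size=100)
        kde = gaussian_kde(thickness_obs)

        def rejection_sample(kde, lo, hi, n, rng):
            # Draw n samples from the KDE-estimated pdf on [lo, hi] by rejection.
            grid = np.linspace(lo, hi, 512)
            fmax = kde(grid).max()          # envelope height
            out = []
            while len(out) < n:
                x = rng.uniform(lo, hi)
                if rng.uniform(0.0, fmax) < kde(x)[0]:
                    out.append(x)
            return np.array(out)

        samples = rejection_sample(kde, thickness_obs.min(), thickness_obs.max(), 1000, rng)
        print(samples.mean())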

  1. An EEG-Based Fuzzy Probability Model for Early Diagnosis of Alzheimer's Disease.

    PubMed

    Chiang, Hsiu-Sen; Pao, Shun-Chi

    2016-05-01

    Alzheimer's disease is a degenerative brain disease that results in cardinal memory deterioration and significant cognitive impairments. The early treatment of Alzheimer's disease can significantly reduce deterioration. Early diagnosis is difficult, and early symptoms are frequently overlooked. While much of the literature focuses on disease detection, the use of electroencephalography (EEG) in Alzheimer's diagnosis has received relatively little attention. This study combines the fuzzy and associative Petri net methodologies to develop a model for the effective and objective detection of Alzheimer's disease. Differences in EEG patterns between normal subjects and Alzheimer patients are used to establish prediction criteria for Alzheimer's disease, potentially providing physicians with a reference for early diagnosis, allowing for early action to delay the disease progression. PMID:27059738

  2. RECRUITING FOR A LONGITUDINAL STUDY OF CHILDREN'S HEALTH USING A HOUSEHOLD-BASED PROBABILITY SAMPLING APPROACH

    EPA Science Inventory

    The sampling design for the National Children's Study (NCS) calls for a population-based, multi-stage, clustered household sampling approach (visit our website for more information on the NCS: www.nationalchildrensstudy.gov). The full sample is designed to be representative of ...

  3. Translating CFC-based piston ages into probability density functions of ground-water age in karst

    USGS Publications Warehouse

    Long, A.J.; Putnam, L.D.

    2006-01-01

    Temporal age distributions are equivalent to probability density functions (PDFs) of transit time. The type and shape of a PDF provide important information related to ground-water mixing at the well or spring and the complex nature of flow networks in karst aquifers. Chlorofluorocarbon (CFC) concentrations measured for samples from 12 locations in the karstic Madison aquifer were used to evaluate the suitability of various PDF types for this aquifer. Parameters of PDFs could not be estimated within acceptable confidence intervals for any of the individual sites. Therefore, metrics derived from CFC-based apparent ages were used to evaluate results of PDF modeling in a more general approach. The ranges of these metrics were established as criteria against which families of PDFs could be evaluated for their applicability to different parts of the aquifer. Seven PDF types, including five unimodal and two bimodal models, were evaluated. Model results indicate that unimodal models may be applicable to areas close to conduits that have younger piston (i.e., apparent) ages and that bimodal models probably are applicable to areas farther from conduits that have older piston ages. The two components of a bimodal PDF are interpreted as representing conduit and diffuse flow, and transit times of as much as two decades may separate these PDF components. Areas near conduits may be dominated by conduit flow, whereas areas farther from conduits having bimodal distributions probably have good hydraulic connection to both diffuse and conduit flow. © 2006 Elsevier B.V. All rights reserved.
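
    A purely illustrative form of the bimodal case is a two-component mixture in which a young conduit-flow mode and an older diffuse-flow mode are separated by roughly two decades; the lognormal shapes, weights, and parameters below are assumptions, not the fitted Madison aquifer PDFs.

        import numpy as np
        from scipy.stats import lognorm

        def bimodal_transit_pdf(t, w_conduit=0.4, mu_c=1.0, s_c=0.5, mu_d=3.2, s_d=0.4):
            # Mixture of two lognormal transit-time components (in years):
            # a young conduit-flow mode and an older diffuse-flow mode.
            conduit = lognorm.pdf(t, s=s_c, scale=np.exp(mu_c))
            diffuse = lognorm.pdf(t, s=s_d, scale=np.exp(mu_d))
            return w_conduit * conduit + (1.0 - w_conduit) * diffuse

        t = np.linspace(0.1, 60.0, 300)   # transit time in years
        pdf = bimodal_transit_pdf(t)
        print(t[np.argmax(pdf)])          # location of the dominant mode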

  4. Improvement of HMM-based action classification by using state transition probability

    NASA Astrophysics Data System (ADS)

    Kitamura, Yuka; Aruga, Haruki; Hashimoto, Manabu

    2015-04-01

    We propose a method to classify multiple similar actions contained in human behaviors by considering a weakly constrained order of "actions". The proposed method regards human behavior as a combination of "action" patterns whose order is weakly constrained. In this method, actions are classified using not only image features but also the consistency of transitions between an action and the next action. By considering such action transitions, our method can recognize human behavior even if the image features of different actions are similar to each other. Based on this idea, we have improved the previous HMM-based algorithm effectively. Through experiments using test image sequences of human behavior in a bathroom, we have confirmed that the average classification success rate is 97 %, which is about 53 % higher than that of the previous method.
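
    To make the role of the transition probabilities concrete, a minimal sketch can combine per-segment action likelihoods with an action-to-action transition matrix through standard Viterbi decoding; the action labels, likelihoods, and transition values are invented for illustration and are not the trained model from the paper.

        import numpy as np

        actions = ["enter", "wash", "dry", "leave"]          # hypothetical action set
        # Weakly constrained order: transitions are mostly forward, but not strict.
        trans = np.array([[0.10, 0.70, 0.10, 0.10],
                          [0.05, 0.20, 0.65, 0.10],
                          [0.05, 0.10, 0.20, 0.65],
                          [0.10, 0.10, 0.10, 0.70]])

        def viterbi(obs_loglik, log_trans):
            # obs_loglik: (T, K) per-segment log-likelihoods for K actions.
            T, K = obs_loglik.shape
            delta = np.full((T, K), -np.inf)
            back = np.zeros((T, K), dtype=int)
            delta[0] = obs_loglik[0]
            for t in range(1, T):
                scores = delta[t - 1][:, None] + log_trans   # (K, K) path scores
                back[t] = scores.argmax(axis=0)
                delta[t] = scores.max(axis=0) + obs_loglik[t]
            path = [int(delta[-1].argmax())]
            for t in range(T - 1, 0, -1):
                path.append(int(back[t][path[-1]]))
            return path[::-1]

        # Segments whose image features are ambiguous between "wash" and "dry":
        obs = np.log(np.array([[0.70, 0.10, 0.10, 0.10],
                               [0.10, 0.45, 0.40, 0.05],
                               [0.10, 0.40, 0.45, 0.05],
                               [0.10, 0.10, 0.10, 0.70]]))
        print([actions[i] for i in viterbi(obs, np.log(trans))])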

  5. Probability-Based Design Criteria of the ASCE 7 Tsunami Loads and Effects Provisions (Invited)

    NASA Astrophysics Data System (ADS)

    Chock, G.

    2013-12-01

    Mitigation of tsunami risk requires a combination of emergency preparedness for evacuation in addition to providing structural resilience of critical facilities, infrastructure, and key resources necessary for immediate response and economic and social recovery. Critical facilities would include emergency response, medical, tsunami refuges and shelters, ports and harbors, lifelines, transportation, telecommunications, power, financial institutions, and major industrial/commercial facilities. The Tsunami Loads and Effects Subcommittee of the ASCE/SEI 7 Standards Committee is developing a proposed new Chapter 6 - Tsunami Loads and Effects for the 2016 edition of the ASCE 7 Standard. ASCE 7 provides the minimum design loads and requirements for structures subject to building codes such as the International Building Code utilized in the USA. In this paper we provide a review emphasizing the intent of these new code provisions and explain the design methodology. The ASCE 7 provisions for Tsunami Loads and Effects enable a set of analysis and design methodologies that are consistent with performance-based engineering based on probabilistic criteria. The ASCE 7 Tsunami Loads and Effects chapter will be initially applicable only to the states of Alaska, Washington, Oregon, California, and Hawaii. Ground shaking effects and subsidence from a preceding local offshore Maximum Considered Earthquake will also be considered prior to tsunami arrival for Alaska and states in the Pacific Northwest regions governed by nearby offshore subduction earthquakes. For national tsunami design provisions to achieve a consistent reliability standard of structural performance for community resilience, a new generation of tsunami inundation hazard maps for design is required. The lesson of recent tsunamis is that historical records alone do not provide a sufficient measure of the potential heights of future tsunamis. Engineering design must consider the occurrence of events greater than

  6. Computing posterior probabilities for score-based alignments using ppALIGN.

    PubMed

    Wolfsheimer, Stefan; Hartmann, Alexander; Rabus, Ralf; Nuel, Gregory

    2012-01-01

    Score-based pairwise alignments are widely used in bioinformatics, in particular with molecular database search tools such as the BLAST family. Due to sophisticated heuristics, such algorithms are usually fast, but the underlying scoring model unfortunately lacks a statistical description of the reliability of the reported alignments. In particular, close to gaps and in low-score or low-complexity regions, a huge number of alternative alignments arise, which results in a decrease of the certainty of the alignment. ppALIGN is a software package that uses hidden Markov model techniques to compute position-wise reliability of score-based pairwise alignments of DNA or protein sequences. The design of the model allows for a direct connection between the scoring function and the parameters of the probabilistic model. For this reason it is suitable for analyzing the outcomes of popular score-based aligners and search tools without having to choose a complicated set of parameters. By contrast, our program only requires the classical score parameters (the scoring function and gap costs). The package comes along with a library written in C++, a standalone program for user-defined alignments (ppALIGN) and another program (ppBLAST) which can process a complete result set of BLAST. The main algorithms essentially exhibit a linear time complexity (in the alignment lengths), and they are hence suitable for on-line computations. We have also included alternative decoding algorithms to provide alternative alignments. ppALIGN is a fast program/library that helps detect and quantify questionable regions in pairwise alignments. Due to its structure and input/output interface, it can be connected to other post-processing tools. Empirically, we illustrate its usefulness in terms of correctly predicted reliable regions for sequences generated using the ROSE model for sequence evolution, and identify sensor-specific regions in the denitrifying betaproteobacterium Aromatoleum aromaticum. PMID

  7. Urban seismic risk assessment: statistical repair cost data and probable structural losses based on damage scenario—correlation analysis

    NASA Astrophysics Data System (ADS)

    Eleftheriadou, Anastasia K.; Baltzopoulou, Aikaterini D.; Karabinis, Athanasios I.

    2016-06-01

    The current seismic risk assessment is based on two discrete approaches, actual and probable, with the produced results validated afterwards. In the first part of this research, the seismic risk is evaluated from the available data regarding the mean statistical repair/strengthening or replacement cost for the total number of damaged structures (180,427 buildings) after the 7/9/1999 Parnitha (Athens) earthquake. The actual evaluated seismic risk is afterwards compared to the estimated probable structural losses, presented in the second part of the paper, based on a damage scenario for the referring earthquake. The applied damage scenario is based on recently developed damage probability matrices (DPMs) from the Athens (Greece) damage database. The seismic risk estimation refers to 750,085 buildings situated in the extended urban region of Athens. The building exposure is categorized into five typical structural types and represents 18.80 % of the entire building stock in Greece. The latter information is provided by the National Statistics Service of Greece (NSSG) according to the 2000-2001 census. The seismic input is characterized by the ratio a_g/a_o, where a_g is the regional peak ground acceleration (PGA), which is evaluated from the earlier estimated research macroseismic intensities, and a_o is the PGA according to the hazard map of the 2003 Greek Seismic Code. Finally, the collected financial data derived from different National Services responsible for the post-earthquake crisis management, concerning the repair/strengthening or replacement costs or other categories of costs for the rehabilitation of earthquake victims (construction and function of settlements for the earthquake homeless, rent supports, demolitions, shorings), are used to determine the final total seismic risk factor.

  8. Moment-based evidence for simple rational-valued Hilbert-Schmidt generic 2 × 2 separability probabilities

    NASA Astrophysics Data System (ADS)

    Slater, Paul B.; Dunkl, Charles F.

    2012-03-01

    Employing the Hilbert-Schmidt measure, we explicitly compute and analyze a number of determinantal product (bivariate) moments |ρ|^k |ρ^PT|^n, k, n = 0, 1, 2, 3, …, with PT denoting the partial transpose, for both generic (9-dimensional) two-rebit (α = 1/2) and generic (15-dimensional) two-qubit (α = 1) density matrices ρ. The results are then incorporated into a general formula, parameterized by k, n and α, with the case α = 2 presumptively corresponding to generic (27-dimensional) quaternionic systems. Holding the Dyson-index-like parameter α fixed, the induced univariate moments (|ρ||ρ^PT|)^n and |ρ^PT|^n are inputted into a Legendre-polynomial-based (least-squares) probability-distribution reconstruction algorithm of Provost (2005 Mathematica J. 9 727), yielding α-specific separability-probability estimates. Since, as the number of inputted moments grows, estimates based on the variable |ρ||ρ^PT| strongly decrease, while ones employing |ρ^PT| strongly increase (and converge faster), the gaps between upper and lower estimates diminish, yielding sharper and sharper bounds. Remarkably, for α = 2, with the use of 2325 moments, a separability-probability lower bound 0.999999987 times as large as 26/323 ≈ 0.0804954 is found. For α = 1, based on 2415 moments, a lower bound results that is 0.999997066 times as large as 8/33 ≈ 0.242424, a (simpler still) fractional value that had previously been conjectured (Slater 2007 J. Phys. A: Math. Theor. 40 14279). Furthermore, for α = 1/2, employing 3310 moments, the lower bound is 0.999955 times as large as 29/64 = 0.453125, a rational value previously considered (Slater 2010 J. Phys. A: Math. Theor. 43 195302).

  10. Coupled level set segmentation using a point-based statistical shape model relying on correspondence probabilities

    NASA Astrophysics Data System (ADS)

    Hufnagel, Heike; Ehrhardt, Jan; Pennec, Xavier; Schmidt-Richberg, Alexander; Handels, Heinz

    2010-03-01

    In this article, we propose a unified statistical framework for image segmentation with shape prior information. The approach combines an explicitly parameterized point-based probabilistic statistical shape model (SSM) with a segmentation contour which is implicitly represented by the zero level set of a higher-dimensional surface. These two aspects are unified in a Maximum a Posteriori (MAP) estimation where the level set is evolved to converge towards the boundary of the organ to be segmented based on the image information, while taking into account the prior given by the SSM information. The optimization of the energy functional obtained by the MAP formulation leads to an alternate update of the level set and an update of the fitting of the SSM. We then adapt the probabilistic SSM for multi-shape modeling and extend the approach to multiple-structure segmentation by introducing a level set function for each structure. During segmentation, the evolution of the different level set functions is coupled by the multi-shape SSM. First experimental evaluations indicate that our method is well suited for the segmentation of topologically complex, non-spherical and multiple-structure shapes. We demonstrate the effectiveness of the method by experiments on kidney segmentation as well as on hip joint segmentation in CT images.

  11. Power optimization of chemically driven heat engine based on first and second order reaction kinetic theory and probability theory

    NASA Astrophysics Data System (ADS)

    Zhang, Lei; Chen, Lingen; Sun, Fengrui

    2016-03-01

    The finite-time thermodynamic method based on probability analysis can more accurately describe various performance parameters of thermodynamic systems. Based on the relation between optimal efficiency and power output of a generalized Carnot heat engine with a finite high-temperature heat reservoir (heat source) and an infinite low-temperature heat reservoir (heat sink), whose only irreversibility is heat transfer, this paper studies the problem of power optimization of a chemically driven heat engine based on first- and second-order reaction kinetic theory. It puts forward a model of the coupled heat engine which can be run periodically, and obtains the effects of the finite-time thermodynamic characteristics of the coupling between the chemical reaction and the heat engine on the power optimization. The results show that the first-order reaction kinetics model can use fuel more effectively and can provide the heat engine with a higher-temperature heat source to increase its power output. Moreover, the power fluctuation bounds of the chemically driven heat engine are obtained by using the probability analysis method. The results may provide some guidelines for the characteristic analysis and power optimization of chemically driven heat engines.

  12. EUPDF: An Eulerian-Based Monte Carlo Probability Density Function (PDF) Solver. User's Manual

    NASA Technical Reports Server (NTRS)

    Raju, M. S.

    1998-01-01

    EUPDF is an Eulerian-based Monte Carlo PDF solver developed for application with sprays, combustion, parallel computing, and unstructured grids. It is designed to be massively parallel and could easily be coupled with any existing gas-phase flow and spray solvers. The solver accommodates the use of an unstructured mesh with mixed elements of triangular, quadrilateral, and/or tetrahedral type. The manual provides the user with the coding required to couple the PDF code to any given flow code and a basic understanding of the EUPDF code structure as well as the models involved in the PDF formulation. The source code of EUPDF will be available with the release of the National Combustion Code (NCC) as a complete package.

  13. CP-TDMA: Coloring- and Probability-Based TDMA Scheduling for Wireless Ad Hoc Networks

    NASA Astrophysics Data System (ADS)

    Zhang, Xuedan; Hong, Jun; Zhang, Lin; Shan, Xiuming; Li, Victor O. K.

    This paper addresses the issue of transmission scheduling in wireless ad hoc networks. We propose a Time Division Multiple Access (TDMA) scheduling scheme based on edge coloring and probabilistic assignment, called CP-TDMA. We categorize the conflicts suffered by wireless links into two types, explicit conflicts and implicit conflicts, and utilize two different strategies to deal with them. Explicit conflicts are avoided completely by a simple distributed edge-coloring algorithm, μ-M, and implicit conflicts are minimized by applying probabilistic time slot assignments to links. We evaluate CP-TDMA analytically and numerically, and find that CP-TDMA, which requires only local information, exhibits better performance than previous work.
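
    Very loosely, the two strategies can be sketched with a centralized greedy edge coloring standing in for the distributed μ-M algorithm, plus random extra-slot assignments for the probabilistic part; the topology and the probability value are invented.

        import random
        from collections import defaultdict

        random.seed(0)

        # Hypothetical ad hoc topology: links as (node, node) pairs.
        links = [(0, 1), (1, 2), (2, 3), (3, 0), (1, 3), (0, 2)]

        def greedy_edge_coloring(links):
            # Assign each link the smallest slot not used by links sharing a node,
            # so explicit (one-hop) conflicts are avoided completely.
            slot_of, used_at_node = {}, defaultdict(set)
            for u, v in links:
                slot = 0
                while slot in used_at_node[u] | used_at_node[v]:
                    slot += 1
                slot_of[(u, v)] = slot
                used_at_node[u].add(slot)
                used_at_node[v].add(slot)
            return slot_of

        def probabilistic_extra_slots(slot_of, n_slots, p=0.3):
            # Each link may also transmit in other slots with probability p,
            # trading a small chance of implicit (two-hop) conflicts for throughput.
            return {link: [s for s in range(n_slots) if s != own and random.random() < p]
                    for link, own in slot_of.items()}

        colors = greedy_edge_coloring(links)
        n_slots = max(colors.values()) + 1
        print(colors)
        print(probabilistic_extra_slots(colors, n_slots))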

  14. APL: An angle probability list to improve knowledge-based metaheuristics for the three-dimensional protein structure prediction.

    PubMed

    Borguesan, Bruno; Barbachan e Silva, Mariel; Grisci, Bruno; Inostroza-Ponta, Mario; Dorn, Márcio

    2015-12-01

    Tertiary protein structure prediction is one of the most challenging problems in structural bioinformatics. Despite the advances in algorithm development and computational strategies, predicting the folded structure of a protein only from its amino acid sequence remains an unsolved problem. We present a new computational approach to predict the native-like three-dimensional structure of proteins. Conformational preferences of amino acid residues and secondary structure information were obtained from protein templates stored in the Protein Data Bank and represented as an Angle Probability List. Two knowledge-based prediction methods based on Genetic Algorithms and Particle Swarm Optimization were developed using this information. The proposed method has been tested with twenty-six case studies selected to validate our approach with different classes of proteins and folding patterns. Stereochemical and structural analyses were performed for each predicted three-dimensional structure. The results achieved suggest that the Angle Probability List can improve the effectiveness of metaheuristics used to predict the three-dimensional structure of protein molecules by reducing the conformational search space. PMID:26495908
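
    The general idea of an Angle Probability List can be sketched as a normalized histogram of (phi, psi) torsion angles for one residue/secondary-structure class that is then sampled to bias the conformational search; the input angles below are random placeholders rather than Protein Data Bank statistics.

        import numpy as np

        rng = np.random.default_rng(2)
        BINS = 36  # 10-degree bins over [-180, 180)

        def build_apl(phi, psi):
            # Normalized 2-D histogram of (phi, psi) pairs -> sampling probabilities.
            hist, _, _ = np.histogram2d(phi, psi, bins=BINS,
                                        range=[[-180, 180], [-180, 180]])
            return hist / hist.sum()

        def sample_apl(apl, rng):
            # Draw one (phi, psi) pair from the probability list (bin centers).
            idx = rng.choice(apl.size, p=apl.ravel())
            i, j = np.unravel_index(idx, apl.shape)
            centers = -180 + 10 * (np.array([i, j]) + 0.5)
            return centers[0], centers[1]

        # Placeholder angles imitating an alpha-helical preference for one residue class.
        phi_obs = rng.normal(-63, 8, size=5000)
        psi_obs = rng.normal(-42, 8, size=5000)
        apl = build_apl(phi_obs, psi_obs)
        print(sample_apl(apl, rng))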

  15. Evaluation of welding damage in welded tubular steel structures using guided waves and a probability-based imaging approach

    NASA Astrophysics Data System (ADS)

    Lu, Xi; Lu, Mingyu; Zhou, Li-Min; Su, Zhongqing; Cheng, Li; Ye, Lin; Meng, Guang

    2011-01-01

    Welded tubular steel structures (WTSSs) are widely used in various engineering sectors, serving as major frameworks for many mechanical systems. There has been increasing awareness of introducing effective damage identification and up-to-the-minute health surveillance to WTSSs, so as to enhance structural reliability and integrity. In this study, the propagation of guided waves (GWs) in a WTSS of rectangular cross-section, a true-scale model of a train bogie frame segment, was investigated using the finite element method (FEM) and experimental analysis with the purpose of evaluating welding damage in the WTSS. An active piezoelectric sensor network was designed and surface-bonded on the WTSS to activate and collect GWs. Characteristics of GWs at different excitation frequencies were explored. A signal feature, termed 'time of maximal difference' (ToMD) in this study, was extracted from captured GW signals, based on which a concept, damage presence probability (DPP), was established. With ToMD and DPP, a probability-based damage imaging approach was developed. To enhance the robustness of the approach to measurement noise and uncertainties, a two-level image fusion scheme was further proposed. As validation, the approach was employed to predict the presence and location of slot-like damage in the welding zone of a WTSS. Identification results have demonstrated the effectiveness of the developed approach for identifying damage in WTSSs and its potential for real-time health monitoring of WTSSs.
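
    A generic probability-based imaging step of this kind can be sketched by letting each actuator-sensor path spread its damage metric over nearby grid cells (here with a simple elliptical weighting) and summing the contributions; the transducer layout, metric values, and weighting are illustrative assumptions, not the ToMD/DPP definitions or the fusion scheme of the study.

        import numpy as np

        def path_weight(grid_x, grid_y, a, s, beta=1.05):
            # Elliptical weighting: cells whose distance-sum to actuator a and
            # sensor s is close to the direct path length get the largest weight.
            d = (np.hypot(grid_x - a[0], grid_y - a[1]) +
                 np.hypot(grid_x - s[0], grid_y - s[1])) / np.hypot(*(np.array(s) - a))
            return np.clip((beta - d) / (beta - 1.0), 0.0, 1.0)

        # Hypothetical 4-transducer square array on a 200 x 200 mm region.
        sensors = [(0, 0), (200, 0), (200, 200), (0, 200)]
        metrics = {(0, 1): 0.1, (0, 2): 0.8, (1, 3): 0.7,
                   (2, 3): 0.1, (0, 3): 0.2, (1, 2): 0.75}

        x = np.linspace(0, 200, 101)
        gx, gy = np.meshgrid(x, x)
        image = np.zeros_like(gx)
        for (i, j), m in metrics.items():
            image += m * path_weight(gx, gy, np.array(sensors[i]), np.array(sensors[j]))

        peak = np.unravel_index(image.argmax(), image.shape)
        print("estimated damage location (mm):", gx[peak], gy[peak])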

  16. Computing elastic‐rebound‐motivated earthquake probabilities in unsegmented fault models: a new methodology supported by physics‐based simulators

    USGS Publications Warehouse

    Field, Edward H.

    2015-01-01

    A methodology is presented for computing elastic‐rebound‐based probabilities in an unsegmented fault or fault system, which involves computing along‐fault averages of renewal‐model parameters. The approach is less biased and more self‐consistent than a logical extension of that applied most recently for multisegment ruptures in California. It also enables the application of magnitude‐dependent aperiodicity values, which the previous approach does not. Monte Carlo simulations are used to analyze long‐term system behavior, which is generally found to be consistent with that of physics‐based earthquake simulators. Results cast doubt that recurrence‐interval distributions at points on faults look anything like traditionally applied renewal models, a fact that should be considered when interpreting paleoseismic data. We avoid such assumptions by changing the "probability of what" question (from offset at a point to the occurrence of a rupture, assuming it is the next event to occur). The new methodology is simple, although not perfect in terms of recovering long‐term rates in Monte Carlo simulations. It represents a reasonable, improved way to represent first‐order elastic‐rebound predictability, assuming it is there in the first place, and for a system that clearly exhibits other unmodeled complexities, such as aftershock triggering.

  17. Flood hazard probability mapping method

    NASA Astrophysics Data System (ADS)

    Kalantari, Zahra; Lyon, Steve; Folkeson, Lennart

    2015-04-01

    In Sweden, spatially explicit approaches have been applied in various disciplines, such as landslide modelling based on soil type data and flood risk modelling for large rivers. Regarding flood mapping, most previous studies have focused on complex hydrological modelling on a small scale, whereas just a few studies have used a robust GIS-based approach integrating most physical catchment descriptor (PCD) aspects on a larger scale. The aim of the present study was to develop a methodology for predicting the spatial probability of flooding on a general large scale. Factors such as topography, land use, soil data and other PCDs were analysed in terms of their relative importance for flood generation. The specific objective was to test the methodology using statistical methods to identify factors having a significant role in controlling flooding. A second objective was to generate an index quantifying a flood probability value for each cell, based on different weighted factors, in order to provide a more accurate analysis of potential high flood hazards than can be obtained using just a single variable. The ability of indicator covariance to capture flooding probability was determined for different watersheds in central Sweden. Using data from this initial investigation, a method to extract spatial data for multiple catchments and to produce soft data for statistical analysis was developed. It allowed flood probability to be predicted from spatially sparse data without compromising the significant hydrological features on the landscape. By using PCD data, realistic representations of high-probability flood regions were made, regardless of the magnitude of rain events. This in turn allowed objective quantification of the probability of floods at the field scale for future model development and watershed management.
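
    The weighted-index idea can be sketched by normalizing each physical catchment descriptor to [0, 1] per cell and combining the layers with weights into a relative flood probability index; the descriptor names and weights below are placeholders, not the factors identified as significant in the study.

        import numpy as np

        rng = np.random.default_rng(3)

        def normalize(layer):
            # Rescale a raster layer to [0, 1].
            lo, hi = layer.min(), layer.max()
            return (layer - lo) / (hi - lo)

        # Placeholder 100 x 100 raster layers standing in for PCDs.
        layers = {
            "topographic_wetness": rng.gamma(2.0, 2.0, (100, 100)),
            "impervious_fraction": rng.beta(2.0, 5.0, (100, 100)),
            "soil_clay_content":   rng.uniform(0.0, 1.0, (100, 100)),
        }
        weights = {"topographic_wetness": 0.5,
                   "impervious_fraction": 0.3,
                   "soil_clay_content": 0.2}

        index = sum(w * normalize(layers[name]) for name, w in weights.items())
        print("flood index range:", index.min().round(3), index.max().round(3))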

  18. Time-dependent fracture probability of bilayer, lithium-disilicate-based glass-ceramic molar crowns as a function of core/veneer thickness ratio and load orientation

    PubMed Central

    Anusavice, Kenneth J.; Jadaan, Osama M.; Esquivel–Upshaw, Josephine

    2013-01-01

    Recent reports on bilayer ceramic crown prostheses suggest that fractures of the veneering ceramic represent the most common reason for prosthesis failure. Objective: The aims of this study were to test the hypotheses that: (1) an increase in the core ceramic/veneer ceramic thickness ratio for a crown thickness of 1.6 mm reduces the time-dependent fracture probability (Pf) of bilayer crowns with a lithium-disilicate-based glass-ceramic core, and (2) oblique loading, within the central fossa, increases Pf for 1.6-mm-thick crowns compared with vertical loading. Materials and methods: Time-dependent fracture probabilities were calculated for 1.6-mm-thick, veneered lithium-disilicate-based glass-ceramic molar crowns as a function of core/veneer thickness ratio and load orientation in the central fossa area. Time-dependent fracture probability analyses were computed by CARES/Life software and finite element analysis, using dynamic fatigue strength data for monolithic discs of a lithium-disilicate glass-ceramic core (Empress 2) and ceramic veneer (Empress 2 Veneer Ceramic). Results: Predicted fracture probabilities (Pf) for centrally loaded 1.6-mm-thick bilayer crowns over periods of 1, 5, and 10 years are 1.2%, 2.7%, and 3.5%, respectively, for a core/veneer thickness ratio of 1.0 (0.8 mm/0.8 mm), and 2.5%, 5.1%, and 7.0%, respectively, for a core/veneer thickness ratio of 0.33 (0.4 mm/1.2 mm). Conclusion: CARES/Life results support the proposed crown design and load orientation hypotheses. Significance: The application of dynamic fatigue data, finite element stress analysis, and CARES/Life analysis represents an optimal approach to optimize fixed dental prosthesis designs produced from dental ceramics and to predict time-dependent fracture probabilities of ceramic-based fixed dental prostheses that can minimize the risk for clinical failures. PMID:24060349

  19. Estimation of the probability of exposure to metalworking fluids in a population-based case-control study

    PubMed Central

    Park, Dong-Uk; Colt, Joanne S.; Baris, Dalsu; Schwenn, Molly; Karagas, Margaret R.; Armenti, Karla R.; Johnson, Alison; Silverman, Debra T; Stewart, Patricia A

    2014-01-01

    We describe here an approach for estimating the probability that study subjects were exposed to metalworking fluids (MWFs) in a population-based case-control study of bladder cancer. Study subject reports on the frequency of machining and use of specific MWFs (straight, soluble, and synthetic/semi-synthetic) were used to estimate exposure probability when available. Those reports also were used to develop estimates for job groups, which were then applied to jobs without MWF reports. Estimates using both cases and controls and controls only were developed. The prevalence of machining varied substantially across job groups (10-90%), with the greatest percentage of jobs that machined being reported by machinists and tool and die workers. Reports of straight and soluble MWF use were fairly consistent across job groups (generally, 50-70%). Synthetic MWF use was lower (13-45%). There was little difference in reports by cases and controls vs. controls only. Approximately 1% of the entire study population was assessed as definitely exposed to straight or soluble fluids, in contrast to 0.2% definitely exposed to synthetic/semi-synthetics. A comparison between the reported use of the MWFs and the US production levels by decade found high correlations (r generally >0.7). Overall, the method described here is likely to have provided a systematic and reliable ranking that better reflects the variability of exposure to three types of MWFs than approaches applied in the past. PMID:25256317

  20. Small scale photo probability sampling and vegetation classification in southeast Arizona as an ecological base for resource inventory. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Johnson, J. R. (Principal Investigator)

    1974-01-01

    The author has identified the following significant results. The broad scale vegetation classification was developed for a 3,200 sq mile area in southeastern Arizona. The 31 vegetation types were derived from association tables which contained information taken at about 500 ground sites. The classification provided an information base that was suitable for use with small scale photography. A procedure was developed and tested for objectively comparing photo images. The procedure consisted of two parts, image groupability testing and image complexity testing. The Apollo and ERTS photos were compared for relative suitability as first stage stratification bases in two stage proportional probability sampling. High altitude photography was used in common at the second stage.

  1. EFFECT OF CHLORIDE AND SULFATE CONCENTRATION ON PROBABILITY BASED CORROSION CONTROL FOR LIQUID WASTE TANKS- PART IV

    SciTech Connect

    Hoffman, E.

    2012-08-23

    A series of cyclic potentiodynamic polarization tests was performed on samples of A537 carbon steel in support of a probability-based approach to evaluate the effect of chloride and sulfate on corrosion susceptibility. Testing solutions were chosen to build on previous experimental results from FY07, FY08, FY09 and FY10 to systematically evaluate the influence of the secondary aggressive species, chloride and sulfate. The FY11 results suggest that evaluating the combined effect of all aggressive species, nitrate, chloride, and sulfate, provides a consistent response for determining corrosion susceptibility. The results of this work emphasize the importance of not only nitrate concentration limits but also chloride and sulfate concentration limits.

  2. Estimation of the probability of exposure to machining fluids in a population-based case-control study.

    PubMed

    Park, Dong-Uk; Colt, Joanne S; Baris, Dalsu; Schwenn, Molly; Karagas, Margaret R; Armenti, Karla R; Johnson, Alison; Silverman, Debra T; Stewart, Patricia A

    2014-01-01

    We describe an approach for estimating the probability that study subjects were exposed to metalworking fluids (MWFs) in a population-based case-control study of bladder cancer. Study subject reports on the frequency of machining and use of specific MWFs (straight, soluble, and synthetic/semi-synthetic) were used to estimate exposure probability when available. Those reports also were used to develop estimates for job groups, which were then applied to jobs without MWF reports. Estimates using both cases and controls and controls only were developed. The prevalence of machining varied substantially across job groups (0.1-0.9), with the greatest percentage of jobs that machined being reported by machinists and tool and die workers. Reports of straight and soluble MWF use were fairly consistent across job groups (generally 50-70%). Synthetic MWF use was lower (13-45%). There was little difference in reports by cases and controls vs. controls only. Approximately 1% of the entire study population was assessed as definitely exposed to straight or soluble fluids, in contrast to 0.2% definitely exposed to synthetic/semi-synthetics. A comparison between the reported use of the MWFs and U.S. production levels found high correlations (r generally >0.7). Overall, the method described here is likely to have provided a systematic and reliable ranking that better reflects the variability of exposure to three types of MWFs than approaches applied in the past. [Supplementary materials are available for this article. Go to the publisher's online edition of Journal of Occupational and Environmental Hygiene for the following free supplemental resources: a list of keywords in the occupational histories that were used to link study subjects to the metalworking fluids (MWFs) modules; recommendations from the literature on selection of MWFs based on type of machining operation, the metal being machined and decade; popular additives to MWFs; the number and proportion of controls who

  3. Chromatographic elution process design space development for the purification of saponins in Panax notoginseng extract using a probability-based approach.

    PubMed

    Chen, Teng; Gong, Xingchu; Chen, Huali; Zhang, Ying; Qu, Haibin

    2016-01-01

    A Monte Carlo method was used to develop the design space of a chromatographic elution process for the purification of saponins in Panax notoginseng extract. During this process, saponin recovery ratios, saponin purity, and elution productivity are defined as the process critical quality attributes, and ethanol concentration, elution rate, and elution volume are identified as the critical process parameters. Quadratic equations between the process critical quality attributes and critical process parameters were established using response surface methodology. A probability-based design space was then computed by calculating the prediction errors using Monte Carlo simulations. The influences of the calculation parameters on the computation results were investigated. The optimized calculation condition was as follows: a calculation step length of 0.02, 10 000 simulations, and a significance level of 0.15 for adding or removing terms in a stepwise regression. The recommended normal operation region is located at an ethanol concentration of 65.0-70.0%, an elution rate of 1.7-2.0 bed volumes (BV)/h and an elution volume of 3.0-3.6 BV. Verification experiments were carried out and the experimental values were in good agreement with the predicted values. The present method is promising for developing probability-based design spaces for other botanical drug manufacturing processes. PMID:26549198
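
    The probability-based design space computation can be sketched as follows: fitted quadratic response surfaces predict each critical quality attribute from the process parameters, Monte Carlo draws of the prediction error are added, and the probability that all attributes meet their limits is evaluated over a grid of parameter settings. The coefficients, error standard deviations, and specification limits below are invented placeholders, not the models fitted in the study.

        import numpy as np

        rng = np.random.default_rng(4)

        def predict_purity(ethanol, rate):
            # Hypothetical quadratic response surface for saponin purity (%).
            return 60 + 0.4 * ethanol - 3.0 * rate - 0.004 * (ethanol - 67) ** 2

        def predict_recovery(ethanol, rate):
            # Hypothetical quadratic response surface for saponin recovery (%).
            return 95 - 0.15 * (ethanol - 66) ** 2 - 2.0 * (rate - 1.8) ** 2

        def prob_in_spec(ethanol, rate, n_sim=10_000, purity_min=70.0, recovery_min=85.0):
            # Monte Carlo probability that both CQAs meet their limits, given
            # normally distributed prediction errors (standard deviations assumed).
            purity = predict_purity(ethanol, rate) + rng.normal(0, 2.0, n_sim)
            recovery = predict_recovery(ethanol, rate) + rng.normal(0, 1.5, n_sim)
            return np.mean((purity >= purity_min) & (recovery >= recovery_min))

        for ethanol in (60.0, 65.0, 70.0):
            for rate in (1.5, 1.8, 2.1):
                print(ethanol, rate, round(prob_in_spec(ethanol, rate), 3))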

  4. Guided waves based SHM systems for composites structural elements: statistical analyses finalized at probability of detection definition and assessment

    NASA Astrophysics Data System (ADS)

    Monaco, E.; Memmolo, V.; Ricci, F.; Boffa, N. D.; Maio, L.

    2015-03-01

    Maintenance approaches based on sensorized structures and Structural Health Monitoring (SHM) systems have for many years represented one of the most promising innovations in the field of aerostructures, particularly when composite materials (fiber-reinforced resins) are considered. Layered materials still suffer from drastic reductions of maximum allowable stress values during the design phase, as well as from costly and recurrent inspections during the life cycle, which prevent their structural and economic potential from being fully exploited in today's aircraft. These penalizing measures are necessary mainly to account for the presence of undetected hidden flaws within the layered sequence (delaminations) or in bonded areas (partial disbonding); to relax design and maintenance constraints, a system based on sensors permanently installed on the structure to detect and locate possible flaws (an SHM system) can be considered once its effectiveness and reliability have been statistically demonstrated via a rigorous Probability of Detection (POD) function definition and evaluation. This paper presents an experimental approach with a statistical procedure for evaluating the detection threshold of a guided-wave-based SHM system oriented to delamination detection on a typical composite layered wing panel. The experimental tests are mostly oriented toward characterizing the statistical distribution of measurements and damage metrics, as well as the system detection capability, using this approach. It is not possible to numerically substitute the part of the experimental tests aimed at POD in which the noise in the system response is crucial. Results of the experiments are presented in the paper and analyzed.
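
    A probability of detection curve of the kind referred to here is commonly summarized by a log-logistic hit/miss model; the sketch below generates placeholder hit/miss outcomes and fits the curve by ordinary least squares (a rough stand-in for a formal maximum-likelihood hit/miss analysis), so all sizes and parameters are assumptions.

        import numpy as np
        from scipy.optimize import curve_fit

        rng = np.random.default_rng(5)

        def pod_curve(size, a0, b0):
            # Log-logistic hit/miss POD model: P(detect | damage size).
            return 1.0 / (1.0 + np.exp(-(np.log(size) - a0) / b0))

        # Placeholder hit/miss outcomes for delaminations of known size (mm).
        sizes = np.repeat(np.array([5, 10, 15, 20, 25, 30]), 20).astype(float)
        true_p = pod_curve(sizes, a0=np.log(15.0), b0=0.35)
        hits = rng.binomial(1, true_p)

        params, _ = curve_fit(pod_curve, sizes, hits, p0=[np.log(15.0), 0.5])
        a90 = np.exp(params[0] + params[1] * np.log(9.0))   # size with POD = 0.90
        print("estimated a90 (mm):", round(a90, 1))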

  5. Statistical analysis of the induced Basel 2006 earthquake sequence: introducing a probability-based monitoring approach for Enhanced Geothermal Systems

    NASA Astrophysics Data System (ADS)

    Bachmann, C. E.; Wiemer, S.; Woessner, J.; Hainzl, S.

    2011-08-01

    Geothermal energy is becoming an important clean energy source; however, the stimulation of a reservoir for an Enhanced Geothermal System (EGS) is associated with seismic risk due to induced seismicity. Seismicity occurring due to the water injection at depth has to be well recorded and monitored. To mitigate the seismic risk of a damaging event, an appropriate alarm system needs to be in place for each individual experiment. In recent experiments, the so-called traffic-light alarm system, based on public response, local magnitude and peak ground velocity, was used. We aim to improve the pre-defined alarm system by introducing a probability-based approach; we retrospectively model the ongoing seismicity in real time with multiple statistical forecast models and then translate the forecast to seismic hazard in terms of probabilities of exceeding a ground motion intensity level. One class of models accounts for the water injection rate, the main parameter that can be controlled by the operators during an experiment. By translating the models into time-varying probabilities of exceeding various intensity levels, we provide tools which are well understood by the decision makers and can be used to determine non-exceedance thresholds during a reservoir stimulation; this, however, remains an entrepreneurial or political decision of the responsible project coordinators. We introduce forecast models based on the data set of an EGS experiment in the city of Basel. Between 2006 December 2 and 8, approximately 11 500 m3 of water was injected into a 5-km-deep well at high pressures. A six-sensor borehole array was installed by the company Geothermal Explorers Limited (GEL) at depths between 300 and 2700 m around the well to monitor the induced seismicity. The network recorded approximately 11 200 events during the injection phase, more than 3500 of which were located. With the traffic-light system, actions were implemented after an ML 2.7 event; the water injection was
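
    The translation from a forecast seismicity rate to an exceedance probability can be sketched, under a Poisson occurrence model with a Gutenberg-Richter magnitude distribution, as follows; the rate, b-value, and thresholds are placeholders rather than the Basel forecast models.

        import numpy as np

        def prob_exceed(rate_above_m0, m0, m_threshold, b_value, dt_hours):
            # P(at least one event with M >= m_threshold in the next dt_hours),
            # given a forecast rate of events with M >= m0 (Poisson + Gutenberg-Richter).
            rate_threshold = rate_above_m0 * 10 ** (-b_value * (m_threshold - m0))
            return 1.0 - np.exp(-rate_threshold * dt_hours)

        # E.g. a hypothetical forecast of 40 events/hour with M >= 0.5 during injection:
        for m in (2.0, 2.5, 3.0):
            print(m, round(prob_exceed(40.0, 0.5, m, b_value=1.2, dt_hours=6.0), 4))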

  6. FINAL PROJECT REPORT DOE Early Career Principal Investigator Program Project Title: Developing New Mathematical Models for Multiphase Flows Based on a Fundamental Probability Density Function Approach

    SciTech Connect

    Shankar Subramaniam

    2009-04-01

    This final project report summarizes progress made towards the objectives described in the proposal entitled “Developing New Mathematical Models for Multiphase Flows Based on a Fundamental Probability Density Function Approach”. Substantial progress has been made in theory, modeling and numerical simulation of turbulent multiphase flows. The consistent mathematical framework based on probability density functions is described. New models are proposed for turbulent particle-laden flows and sprays.

  7. Highly efficient codec based on significance-linked connected-component analysis of wavelet coefficients

    NASA Astrophysics Data System (ADS)

    Chai, Bing-Bing; Vass, Jozsef; Zhuang, Xinhua

    1997-04-01

    Recent success in wavelet coding is mainly attributed to the recognition of the importance of data organization. Several very competitive wavelet codecs have been developed, namely Shapiro's Embedded Zerotree Wavelets (EZW), Servetto et al.'s Morphological Representation of Wavelet Data (MRWD), and Said and Pearlman's Set Partitioning in Hierarchical Trees (SPIHT). In this paper, we propose a new image compression algorithm called Significance-Linked Connected Component Analysis (SLCCA) of wavelet coefficients. SLCCA exploits both within-subband clustering of significant coefficients and cross-subband dependency in significant fields. A so-called significance link between connected components is designed to reduce the positional overhead of MRWD. In addition, the significant coefficients' magnitudes are encoded in bit-plane order to match the probability model of the adaptive arithmetic coder. Experiments show that SLCCA outperforms both EZW and MRWD, and is tied with SPIHT. Furthermore, it is observed that SLCCA generally has the best performance on images with a large portion of texture. When applied to fingerprint image compression, it outperforms the FBI's wavelet scalar quantization by about 1 dB.

  8. Applying Probability Theory for the Quality Assessment of a Wildfire Spread Prediction Framework Based on Genetic Algorithms

    PubMed Central

    Cencerrado, Andrés; Cortés, Ana; Margalef, Tomàs

    2013-01-01

    This work presents a framework for assessing how the constraints existing at the time of attending an ongoing forest fire affect simulation results, both in terms of the quality (accuracy) obtained and the time needed to make a decision. In the wildfire spread simulation and prediction area, it is essential to properly exploit the computational power offered by new computing advances. For this purpose, we rely on a two-stage prediction process to enhance the quality of traditional predictions, taking advantage of parallel computing. This strategy is based on an adjustment stage which is carried out by a well-known evolutionary technique: Genetic Algorithms. The core of this framework is evaluated according to probability theory principles. Thus, a thorough statistical study is presented, oriented towards the characterization of this adjustment technique in order to help operation managers deal with the two aspects previously mentioned: time and quality. The experimental work in this paper is based on a region in Spain which is among the most prone to forest fires: El Cap de Creus. PMID:24453898

  9. Situational Lightning Climatologies for Central Florida: Phase IV: Central Florida Flow Regime Based Climatologies of Lightning Probabilities

    NASA Technical Reports Server (NTRS)

    Bauman, William H., III

    2009-01-01

    The threat of lightning is a daily concern during the warm season in Florida. Research has revealed distinct spatial and temporal distributions of lightning occurrence that are strongly influenced by large-scale atmospheric flow regimes. Previously, the Applied Meteorology Unit (AMU) calculated gridded lightning climatologies based on seven flow regimes over Florida for 1-, 3- and 6-hr intervals in 5-, 10-, 20-, and 30-NM diameter range rings around the Shuttle Landing Facility (SLF) and eight other airfields in the National Weather Service in Melbourne (NWS MLB) county warning area (CWA). In this update to the work, the AMU recalculated the lightning climatologies using individual lightning strike data to improve their accuracy. The AMU included all data regardless of flow regime as one of the stratifications, added monthly stratifications, added three years of data to the period of record, and used modified flow regimes based on work from the AMU's Objective Lightning Probability Forecast Tool, Phase II. The AMU made changes so the 5- and 10-NM radius range rings are consistent with the aviation forecast requirements at NWS MLB, while the 20- and 30-NM radius range rings at the SLF assist the Spaceflight Meteorology Group in making forecasts for weather Flight Rule violations during Shuttle landings. The AMU also updated the graphical user interface with the new data.

  10. Influence of anthropogenic activities on PAHs in sediments in a significant gulf of low-latitude developing regions, the Beibu Gulf, South China Sea: distribution, sources, inventory and probability risk.

    PubMed

    Li, Pingyang; Xue, Rui; Wang, Yinghui; Zhang, Ruijie; Zhang, Gan

    2015-01-15

    Fifteen polycyclic aromatic hydrocarbons (PAHs) in 41 surface sediment samples and a sediment core (50 cm) from the Beibu Gulf, a significant low-latitude developing gulf, were analyzed. PAH concentrations were 3.01-388 ng g(-1) (mean 95.5 ng g(-1)) in the surface sediments and 10.5-87.1 ng g(-1) (average 41.1 ng g(-1)) in the sediment core. Source apportionment indicated that the PAHs were generated from coke production and vehicular emissions (39.4%), coal and biomass combustion (35.8%), and petrogenic sources (24.8%). PAHs were mainly concentrated in the industrialized and urbanized regions and the harbor, and were transported by atmospheric deposition to the marine matrix. The mass inventory (1.57-2.62 t) and probability risk assessment showed that the sediments here serve as an important PAH reservoir but pose a low risk. Unlike in developed regions, where oil and natural gas dominate, coal combustion has remained a significant energy consumption pattern in this developing region over the past 30 years (56 ± 5%). PMID:25467868

  11. Significance testing of rules in rule-based models of human problem solving

    NASA Technical Reports Server (NTRS)

    Lewis, C. M.; Hammer, J. M.

    1986-01-01

    Rule-based models of human problem solving have typically not been tested for statistical significance. Three methods of testing rules - analysis of variance, randomization, and contingency tables - are presented. Advantages and disadvantages of the methods are also described.

  12. Precursor Analysis for Flight- and Ground-Based Anomaly Risk Significance Determination

    NASA Technical Reports Server (NTRS)

    Groen, Frank

    2010-01-01

    This slide presentation reviews the precursor analysis for flight and ground based anomaly risk significance. It includes information on accident precursor analysis, real models vs. models, and probabilistic analysis.

  13. Location and release time identification of pollution point source in river networks based on the Backward Probability Method.

    PubMed

    Ghane, Alireza; Mazaheri, Mehdi; Mohammad Vali Samani, Jamal

    2016-09-15

    The pollution of rivers due to accidental spills is a major threat to the environment and human health. To protect river systems from accidental spills, it is essential to introduce a reliable tool for the identification process. The Backward Probability Method (BPM) is one of the most recommended tools, as it is able to provide information related to the prior location and the release time of the pollution. This method was originally developed and employed in groundwater pollution source identification problems. One of the objectives of this study is to apply this method to identifying the pollution source location and release time in surface waters, mainly in rivers. To accomplish this task, a numerical model is developed based on adjoint analysis. The developed model is then verified using an analytical solution and some real data. The second objective of this study is to extend the method to pollution source identification in river networks. In this regard, a hypothetical test case is considered. In the subsequent simulations, all of the suspected points are identified using only one backward simulation. The results demonstrated that all suspected points determined by the BPM could be possible pollution sources. The proposed approach is accurate and computationally efficient and does not need any simplification of the river geometry and flow. Due to this simplicity, it is highly recommended for practical purposes. PMID:27219462

  14. A multiscale finite element model validation method of composite cable-stayed bridge based on Probability Box theory

    NASA Astrophysics Data System (ADS)

    Zhong, Rumian; Zong, Zhouhong; Niu, Jie; Liu, Qiqi; Zheng, Peijuan

    2016-05-01

    Modeling and simulation are routinely implemented to predict the behavior of complex structures. These tools powerfully unite theoretical foundations, numerical models and experimental data, which include associated uncertainties and errors. A new methodology for multi-scale finite element (FE) model validation is proposed in this paper. The method is based on a two-step updating method, a novel approach to obtain coupling parameters in the gluing sub-regions of a multi-scale FE model, and upon Probability Box (P-box) theory, which can provide a lower and upper bound for the purpose of quantifying and transmitting the uncertainty of structural parameters. The structural health monitoring data of Guanhe Bridge, a long-span composite cable-stayed bridge, and Monte Carlo simulation were used to verify the proposed method. The results show satisfactory accuracy: the overlap ratio index of each modal frequency is over 89% (without the average absolute value of relative errors), and the CDF of the normal distribution coincides well with the measured frequencies of Guanhe Bridge. The validated multiscale FE model may be further used in structural damage prognosis and safety prognosis.

  15. Maximum Magnitude and Probabilities of Induced Earthquakes in California Geothermal Fields: Applications for a Science-Based Decision Framework

    NASA Astrophysics Data System (ADS)

    Weiser, Deborah Anne

    Induced seismicity is occurring at increasing rates around the country. Brodsky and Lajoie (2013) and others have recognized anthropogenic quakes at a few geothermal fields in California. I use three techniques to assess if there are induced earthquakes in California geothermal fields; there are three sites with clear induced seismicity: Brawley, The Geysers, and Salton Sea. Moderate to strong evidence is found at Casa Diablo, Coso, East Mesa, and Susanville. Little to no evidence is found for Heber and Wendel. I develop a set of tools to reduce or cope with the risk imposed by these earthquakes, and also to address uncertainties through simulations. I test if an earthquake catalog may be bounded by an upper magnitude limit. I address whether the earthquake record during pumping time is consistent with the past earthquake record, or if injection can explain all or some of the earthquakes. I also present ways to assess the probability of future earthquake occurrence based on past records. I summarize current legislation for eight states where induced earthquakes are of concern. Unlike tectonic earthquakes, the hazard from induced earthquakes has the potential to be modified. I discuss direct and indirect mitigation practices. I present a framework with scientific and communication techniques for assessing uncertainty, ultimately allowing more informed decisions to be made.

  16. A biology-driven receptor model for daily pollen allergy risk in Korea based on Weibull probability density function

    NASA Astrophysics Data System (ADS)

    Kim, Kyu Rang; Kim, Mijin; Choe, Ho-Seong; Han, Mae Ja; Lee, Hye-Rim; Oh, Jae-Won; Kim, Baek-Jo

    2016-07-01

    Pollen is an important cause of respiratory allergic reactions. As individual sanitation has improved, allergy risk has increased, and this trend is expected to continue due to climate change. Atmospheric pollen concentration is highly influenced by weather conditions. Regression analysis and modeling of the relationships between airborne pollen concentrations and weather conditions were performed to analyze and forecast pollen conditions. Traditionally, daily pollen concentration has been estimated using regression models that describe the relationships between observed pollen concentrations and weather conditions. These models were able to forecast daily concentrations at the sites of observation, but lacked broader spatial applicability beyond those sites. To overcome this limitation, an integrated modeling scheme was developed that is designed to represent the underlying processes of pollen production and distribution. A maximum potential for airborne pollen is first determined using the Weibull probability density function. Then, daily pollen concentration is estimated using multiple regression models. Daily risk grade levels are determined based on the risk criteria used in Korea. The mean percentages of agreement between the observed and estimated levels were 81.4-88.2 % and 92.5-98.5 % for oak and Japanese hop pollens, respectively. The new models estimated daily pollen risk more accurately than the original statistical models because of the newly integrated biological response curves. Although the new models overestimated the seasonal mean concentrations, they did not simulate all of the peak concentrations. This issue could be resolved by adding more variables that affect the prevalence and internal maturity of pollens.
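
    The Weibull-envelope idea can be sketched as a seasonal Weibull probability density that sets each day's maximum potential concentration, modulated by a toy weather regression term; the shape, scale, seasonal total, and regression coefficients are placeholder assumptions, not the fitted Korean models.

        import numpy as np
        from scipy.stats import weibull_min

        def max_potential(day, season_start, shape=2.2, scale=30.0, total=8000.0):
            # Seasonal envelope: Weibull pdf over days since season start,
            # scaled by a hypothetical seasonal pollen total.
            t = day - season_start
            return total * weibull_min.pdf(t, c=shape, scale=scale)

        def daily_concentration(day, season_start, temp_anomaly, wind):
            # Potential concentration reduced or boosted by a toy weather regression.
            weather_factor = np.clip(0.5 + 0.08 * temp_anomaly - 0.05 * wind, 0.0, 1.0)
            return max_potential(day, season_start) * weather_factor

        for day in (95, 105, 115, 130):   # days of year; season assumed to start at day 90
            print(day, round(daily_concentration(day, 90, temp_anomaly=2.0, wind=3.0), 1))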

  17. Tracking a large number of closely spaced objects based on the particle probability hypothesis density filter via optical sensor

    NASA Astrophysics Data System (ADS)

    Lin, Liangkui; Xu, Hui; An, Wei; Sheng, Weidong; Xu, Dan

    2011-11-01

    This paper presents a novel approach to tracking a large number of closely spaced objects (CSO) in image sequences that is based on the particle probability hypothesis density (PHD) filter and multiassignment data association. First, the particle PHD filter is adopted to eliminate most of the clutter and to estimate multitarget states. In the particle PHD filter, a noniterative multitarget estimation technique is introduced to reliably estimate multitarget states, and an improved birth particle sampling scheme is presented to effectively acquire targets among clutter. Then, an integrated track management method is proposed to realize multitarget track continuity. The core of the track management is the track-to-estimation multiassignment association, which relaxes the traditional one-to-one data association restriction due to the unresolved focal plane CSO measurements. Meanwhile, a unified technique of multiple consecutive misses for track deletion is used jointly to cope with the sensitivity of the PHD filter to missed detections and to further eliminate false alarms, as well as to initiate tracks of large numbers of CSO. Finally, results of two simulations and one experiment show that the proposed approach is feasible and efficient.

  18. Premelting base pair opening probability and drug binding constant of a daunomycin-poly d(GCAT).poly d(ATGC) complex.

    PubMed Central

    Chen, Y Z; Prohofsky, E W

    1994-01-01

    We calculate the room-temperature thermal fluctuational base pair opening probability of a daunomycin-poly d(GCAT).poly d(ATGC) complex. This system is constructed at an atomic level of detail based on x-ray analysis of a crystal structure. The base pair opening probabilities are calculated from a modified self-consistent phonon approach of anharmonic lattice dynamics theory. We find that daunomycin binding substantially enhances the thermal stability of one of the base pairs adjacent to the drug because of strong hydrogen bonding between the drug and the base. The possible effect of this enhanced stability on the drug inhibition of DNA transcription and replication is discussed. We also calculate the probability of drug dissociation from the helix based on the self-consistent calculation of the probability of disruption of the drug-base H-bonds and the unstacking probability of the drug. The calculations can be used to determine the equilibrium drug binding constant, which is found to be in good agreement with observations on similar daunomycin-DNA systems. PMID:8011914

  19. Comparison of two ways for representation of the forecast probability density function in ensemble-based sequential data assimilation

    NASA Astrophysics Data System (ADS)

    Nakano, Shinya

    2013-04-01

    In ensemble-based sequential data assimilation, the probability density function (PDF) at each time step is represented by ensemble members. These ensemble members are usually assumed to be Monte Carlo samples drawn from the PDF, and the probability density is associated with the concentration of the ensemble members. On the basis of the Monte Carlo approximation, the forecast ensemble, which is obtained by applying the dynamical model to each ensemble member, provides an approximation of the forecast PDF through the Chapman-Kolmogorov integral. In practical cases, however, the ensemble size is limited by the available computational resources, and it is typically much smaller than the system dimension. In such situations, the Monte Carlo approximation does not work well. When the ensemble size is less than the system dimension, the ensemble forms a simplex in a subspace. The simplex cannot represent the third- or higher-order moments of the PDF; it can represent only the Gaussian features of the PDF. As noted by Wang et al. (2004), the forecast ensemble, which is obtained by applying the dynamical model to each member of the simplex ensemble, provides an approximation of the mean and covariance of the forecast PDF in which the Taylor expansion of the dynamical model up to second order is considered, except that uncertainties which cannot be represented by the ensemble members are ignored. Since the third- and higher-order nonlinearities are discarded, the forecast ensemble would introduce some bias into the forecast. Using a small nonlinear model, the Lorenz 63 model, we also performed a state estimation experiment with both the simplex representation and the Monte Carlo representation, which correspond to the limited-size ensemble case and the large-size ensemble case, respectively. If we use the simplex representation, it is found that the estimates tend to have some bias which is likely to be caused by the nonlinearity of the system rather
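
    A small numerical sketch of the comparison discussed above, assuming the standard Lorenz 63 parameters: a 4-member ensemble (standing in for the simplex case, since its size does not exceed the system dimension plus one) and a 2,000-member Monte Carlo ensemble are propagated from the same initial Gaussian, and the difference between their forecast means serves as a rough proxy for the bias described in the record. The integration scheme and numbers are illustrative, not those of the original experiment.

      import numpy as np

      def lorenz63(x, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
          """Lorenz 63 tendencies, vectorized over an ensemble of shape (N, 3)."""
          dx = np.empty_like(x)
          dx[..., 0] = sigma * (x[..., 1] - x[..., 0])
          dx[..., 1] = x[..., 0] * (rho - x[..., 2]) - x[..., 1]
          dx[..., 2] = x[..., 0] * x[..., 1] - beta * x[..., 2]
          return dx

      def propagate(states, dt=0.01, steps=200):
          """4th-order Runge-Kutta integration of every ensemble member."""
          x = states.copy()
          for _ in range(steps):
              k1 = lorenz63(x)
              k2 = lorenz63(x + 0.5 * dt * k1)
              k3 = lorenz63(x + 0.5 * dt * k2)
              k4 = lorenz63(x + dt * k3)
              x += dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4)
          return x

      rng = np.random.default_rng(0)
      mean0, cov0 = np.array([1.0, 1.0, 20.0]), 0.5 * np.eye(3)

      monte_carlo = rng.multivariate_normal(mean0, cov0, size=2000)   # "large ensemble"
      simplex = rng.multivariate_normal(mean0, cov0, size=4)          # size <= dimension + 1

      mc_mean = propagate(monte_carlo).mean(axis=0)
      sx_mean = propagate(simplex).mean(axis=0)
      print("forecast-mean difference (proxy for simplex bias):", sx_mean - mc_mean)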

  20. Spatial scales and probability based sampling in determining levels of benthic community degradation in the Chesapeake Bay.

    PubMed

    Dauer, Daniel M; Llansó, Roberto J

    2003-01-01

    The extent of degradation of benthic communities of the Chesapeake Bay was determined by applying a previously developed benthic index of biotic integrity at three spatial scales. Allocation of sampling was probability-based, allowing areal estimates of degradation with known confidence intervals. The three spatial scales were: (1) the tidal Chesapeake Bay; (2) the Elizabeth River watershed; and (3) two small tidal creeks within the Southern Branch of the Elizabeth River that are part of a sediment contaminant remediation effort. The areas covered varied from 10⁻¹ to 10⁴ km² and all were sampled in 1999. The Chesapeake Bay was divided into ten strata, the Elizabeth River into five strata, and each of the two tidal creeks was a single stratum. The determination of the number and size of strata was based upon consideration of both managerially useful units for restoration and limitations of funding. Within each stratum 25 random locations were sampled for benthic community condition. In 1999 the percent of the benthos with poor benthic community condition for the entire Chesapeake Bay was 47% and varied from 20% at the mouth of the Bay to 72% in the Potomac River. The estimated area of benthos with poor benthic community condition for the Elizabeth River was 64% and varied from 52-92%. Both small tidal creeks had estimates of 76% poor benthic community condition. These kinds of estimates allow environmental managers to better direct restoration efforts and evaluate progress towards restoration. Patterns of benthic community condition at smaller spatial scales may not be correctly inferred from larger spatial scales. Comparisons of patterns in benthic community condition across spatial scales, and between combinations of strata, must be cautiously interpreted. PMID:12620014
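
    The areal estimates with known confidence intervals described above follow from standard stratified-sampling formulas; a compact sketch with invented strata areas and counts of degraded sites (25 random locations per stratum, as in the record):

      import numpy as np

      # Hypothetical strata: (area in km^2, number of degraded sites out of n = 25)
      strata = [(2500.0, 18), (1800.0, 12), (900.0, 5), (3200.0, 20)]
      n = 25

      areas = np.array([a for a, _ in strata])
      p_hat = np.array([k / n for _, k in strata])      # per-stratum proportion degraded
      w = areas / areas.sum()                           # area weights

      p_total = np.sum(w * p_hat)                       # area-weighted percent degraded
      # Variance of the stratified estimator (finite-population correction ignored)
      var_total = np.sum(w**2 * p_hat * (1 - p_hat) / (n - 1))
      ci = 1.96 * np.sqrt(var_total)

      print(f"degraded area: {100 * p_total:.1f}% +/- {100 * ci:.1f}% (95% CI)")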

  1. Small-Area Estimation of the Probability of Toxocariasis in New York City Based on Sociodemographic Neighborhood Composition

    PubMed Central

    Walsh, Michael G.; Haseeb, M. A.

    2014-01-01

    Toxocariasis is increasingly recognized as an important neglected infection of poverty (NIP) in developed countries, and may constitute the most important NIP in the United States (US) given its association with chronic sequelae such as asthma and poor cognitive development. Its potential public health burden notwithstanding, toxocariasis surveillance is minimal throughout the US and so the true burden of disease remains uncertain in many areas. The Third National Health and Nutrition Examination Survey conducted a representative serologic survey of toxocariasis to estimate the prevalence of infection in diverse US subpopulations across different regions of the country. Using the NHANES III surveillance data, the current study applied the predicted probabilities of toxocariasis to the sociodemographic composition of New York census tracts to estimate the local probability of infection across the city. The predicted probability of toxocariasis ranged from 6% among US-born Latino women with a university education to 57% among immigrant men with less than a high school education. The predicted probability of toxocariasis exhibited marked spatial variation across the city, with particularly high infection probabilities in large sections of Queens, and smaller, more concentrated areas of Brooklyn and northern Manhattan. This investigation is the first attempt at small-area estimation of the probability surface of toxocariasis in a major US city. While this study does not define toxocariasis risk directly, it does provide a much needed tool to aid the development of toxocariasis surveillance in New York City. PMID:24918785
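
    The small-area step amounts to averaging subgroup-specific predicted probabilities over each tract's sociodemographic composition. A toy illustration, reusing the 6% and 57% subgroup figures quoted above and an invented tract composition (the subgroup labels are hypothetical simplifications of the NHANES III covariates):

      # Hypothetical NHANES-style predicted probabilities of seropositivity by subgroup
      p_by_group = {"us_born_college": 0.06, "us_born_no_hs": 0.25,
                    "immigrant_college": 0.18, "immigrant_no_hs": 0.57}

      # Hypothetical tract composition: proportion of residents in each subgroup
      tract = {"us_born_college": 0.30, "us_born_no_hs": 0.20,
               "immigrant_college": 0.15, "immigrant_no_hs": 0.35}

      # Tract-level probability = composition-weighted average of subgroup probabilities
      p_tract = sum(tract[g] * p_by_group[g] for g in tract)
      print(f"estimated tract-level probability of toxocariasis: {p_tract:.2f}")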

  2. Evaluation of the eruptive potential and probability in open conduit volcano (Mt Etna) based on soil CO2 flux measurements

    NASA Astrophysics Data System (ADS)

    De Gregorio, Sofia; Camarda, Marco

    2016-04-01

    The evaluation of the amount of magma that might potentially be erupted, i.e. the eruptive potential (EP), and of the probability of occurrence of an eruptive event, i.e. the eruptive probability (EPR), of an active volcano is one of the most compelling and challenging topics addressed by the volcanology community in recent years. The evaluation of the EP of an open conduit volcano is generally based on a constant magma supply rate deduced from long-term series of eruptive rates. This computation of the EP gives good results for long-term (centuries) evaluations, but is less effective when short-term (years or months) estimates are needed. In fact, the rate of magma supply can change on both long and short time scales. Under steady conditions it can be assumed that the regular supply of magma maintains an almost constant level of magma in the feeding system (FS), whereas episodic surpluses of magma input, relative to the regular supply, can cause large variations in the magma level. It follows that the surplus of magma occasionally entering the FS represents a supply of material that sooner or later will be disposed of, i.e. it will be emitted. The amount of surplus magma entering the FS therefore corresponds approximately to the amount of magma that must be erupted in order to restore equilibrium. Furthermore, the larger the amount of surplus magma stored in the system, the higher the energetic level of the system and its propensity to erupt, in other words its EPR. In light of the above considerations, we present here an innovative methodology to evaluate the EP based on the quantification of the surplus of magma, with respect to the regular supply, progressively intruded into the FS. To estimate the surplus of magma supply we used soil CO2 emission data measured monthly at 130 sites in two peripheral areas of Mt Etna Volcano. Indeed, as reported by many authors, soil CO2 emissions in these areas are linked to magma supply dynamics and, moreover, anomalous discharges of CO2 are ascribable to surplus of

  3. Univariate Probability Distributions

    ERIC Educational Resources Information Center

    Leemis, Lawrence M.; Luckett, Daniel J.; Powell, Austin G.; Vermeer, Peter E.

    2012-01-01

    We describe a web-based interactive graphic that can be used as a resource in introductory classes in mathematical statistics. This interactive graphic presents 76 common univariate distributions and gives details on (a) various features of the distribution such as the functional form of the probability density function and cumulative distribution…

  4. A GPU accelerated, discrete time random walk model for simulating reactive transport in porous media using colocation probability function based reaction methods

    NASA Astrophysics Data System (ADS)

    Barnard, J. M.; Augarde, C. E.

    2012-12-01

    The simulation of reactions in flow through unsaturated porous media is a more complicated process when using particle tracking based models than in continuum based models. In the former, particles are reacted on an individual particle-to-particle basis using either deterministic or probabilistic methods. This means that particle tracking methods, especially when simulations of reactions are included, are computationally intensive, as the reaction simulations require tens of thousands of nearest neighbour searches per time step. Despite this, particle tracking methods merit further study due to their ability to eliminate numerical dispersion and to simulate anomalous transport and incomplete mixing of reactive solutes. A new model has been developed using discrete time random walk particle tracking methods to simulate reactive mass transport in porous media, which includes a variation of the colocation probability function based reaction simulation methods presented by Benson & Meerschaert (2008). Model development has also included code acceleration via graphics processing units (GPUs). The nature of particle tracking methods means that they are well suited to parallelization using GPUs. The architecture of GPUs is single instruction - multiple data (SIMD). This means that only one operation can be performed at any one time but can be performed on multiple data simultaneously. This allows for significant speed gains where long loops of independent operations are performed. Computationally expensive code elements, such as the nearest neighbour searches required by the reaction simulation, are therefore prime targets for GPU acceleration.

  5. Physics-based Broadband Ground Motion Simulations for Probable M>7.0 earthquakes in the Marmara Sea Region (Turkey)

    NASA Astrophysics Data System (ADS)

    Akinci, Aybige; Aochi, Hideo; Herrero, Andre; Pischiutta, Marta; Karanikas, Dimitris

    2016-04-01

    The city of Istanbul is characterized by one of the highest levels of seismic risk in Europe and the Mediterranean region. An important source of the increased risk in Istanbul is the remarkable probability of the occurrence of a large earthquake, which stands at about 65% during the coming years due to the existing seismic gap and the post-1999 earthquake stress transfer at the western portion of the North Anatolian Fault Zone (NAFZ). In this study, we have simulated hybrid broadband time histories from two selected scenario earthquakes having magnitude M>7.0 in the Marmara Sea, within 10-20 km of Istanbul, believed to have generated the devastating 1509 event in the region. The physics-based rupture scenarios, which may be an indication of potential future events, are adopted to estimate the ground motion characteristics and their variability in the region. Two simulation techniques (a full 3D wave propagation method to generate low-frequency seismograms, <~1 Hz, and a stochastic technique to simulate high-frequency seismograms, >1 Hz) are used to compute more realistic time series associated with scenario earthquakes having magnitudes Mw >7.0 in the Marmara Sea Region. A dynamic rupture is generated and computed with a boundary integral equation method and the propagation in the medium is realized through a finite difference approach (Aochi and Ulrich, 2015). The high-frequency radiation is computed using a stochastic finite-fault model approach based on a dynamic corner frequency (Motazedian and Atkinson, 2005; Boore, 2009). The results from the two simulation techniques are then merged by performing a weighted summation at intermediate frequencies to calculate broadband synthetic time series. The hybrid broadband ground motions computed with the proposed approach are validated by comparing peak ground acceleration (PGA), peak ground velocity (PGV), and spectral acceleration (SA) with recently proposed ground motion prediction equations (GMPE) in the region. Our
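
    A schematic of the merging step only (not of the dynamic-rupture or stochastic simulations themselves): two synthetic traces standing in for the low-frequency (<~1 Hz) and high-frequency (>1 Hz) components are combined with complementary cosine-taper weights around an assumed 1 Hz crossover. The crossover frequency, taper width, and signals are placeholders, not the study's actual matching scheme.

      import numpy as np

      dt, n = 0.01, 4096                      # 100 Hz sampling, ~41 s record
      t = np.arange(n) * dt
      rng = np.random.default_rng(1)

      # Stand-ins for the deterministic (<~1 Hz) and stochastic (>1 Hz) synthetics
      low = np.sin(2 * np.pi * 0.3 * t) * np.exp(-0.05 * t)
      high = rng.standard_normal(n) * np.exp(-0.1 * t)

      freqs = np.fft.rfftfreq(n, dt)
      f_match, width = 1.0, 0.5               # assumed crossover frequency and taper half-width
      x = np.clip((freqs - (f_match - width)) / (2 * width), 0.0, 1.0)
      w_low = 0.5 * (1 + np.cos(np.pi * x))   # 1 below the crossover band, 0 above it
      w_high = 1.0 - w_low                    # complementary weight

      broadband = np.fft.irfft(w_low * np.fft.rfft(low) + w_high * np.fft.rfft(high), n)
      print("PGA proxy of merged record:", np.max(np.abs(broadband)))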

  6. Changes in Sexual Behavior and Attitudes Across Generations and Gender Among a Population-Based Probability Sample From an Urbanizing Province in Thailand.

    PubMed

    Techasrivichien, Teeranee; Darawuttimaprakorn, Niphon; Punpuing, Sureeporn; Musumari, Patou Masika; Lukhele, Bhekumusa Wellington; El-Saaidi, Christina; Suguimoto, S Pilar; Feldman, Mitchell D; Ono-Kihara, Masako; Kihara, Masahiro

    2016-02-01

    Thailand has undergone rapid modernization with implications for changes in sexual norms. We investigated sexual behavior and attitudes across generations and gender among a probability sample of the general population of Nonthaburi province located near Bangkok in 2012. A tablet-based survey was performed among 2,138 men and women aged 15-59 years identified through a three-stage, stratified, probability proportional to size, clustered sampling. Descriptive statistical analysis was carried out accounting for the effects of multistage sampling. Relationship of age and gender to sexual behavior and attitudes was analyzed by bivariate analysis followed by multivariate logistic regression analysis to adjust for possible confounding. Patterns of sexual behavior and attitudes varied substantially across generations and gender. We found strong evidence for a decline in the age of sexual initiation, a shift in the type of the first sexual partner, and a greater rate of acceptance of adolescent premarital sex among younger generations. The study highlighted profound changes among young women as evidenced by a higher number of lifetime sexual partners as compared to older women. In contrast to the significant gender gap in older generations, sexual profiles of Thai young women have evolved to resemble those of young men with attitudes gradually converging to similar sexual standards. Our data suggest that higher education, being never-married, and an urban lifestyle may have been associated with these changes. Our study found that Thai sexual norms are changing dramatically. It is vital to continue monitoring such changes, considering the potential impact on the HIV/STIs epidemic and unintended pregnancies. PMID:25403321

  7. A Galerkin-based formulation of the probability density evolution method for general stochastic finite element systems

    NASA Astrophysics Data System (ADS)

    Papadopoulos, Vissarion; Kalogeris, Ioannis

    2016-05-01

    The present paper proposes a Galerkin finite element projection scheme for the solution of the partial differential equations (PDEs) involved in the probability density evolution method, for the linear and nonlinear static analysis of stochastic systems. According to the principle of preservation of probability, the probability density evolution of a stochastic system is expressed by its corresponding Fokker-Planck (FP) stochastic partial differential equation. Direct integration of the FP equation is feasible only for simple systems with a small number of degrees of freedom, due to analytical and/or numerical intractability. However, by rewriting the FP equation conditioned on the random event description, a generalized density evolution equation (GDEE) can be obtained, which can be reduced to a one-dimensional PDE. Two Galerkin finite element schemes are proposed for the numerical solution of the resulting PDEs, namely a time-marching discontinuous Galerkin scheme and the Streamline-Upwind/Petrov-Galerkin (SUPG) scheme. In addition, a reformulation of the classical GDEE is proposed, which implements the principle of probability preservation in space instead of time, making this approach suitable for the stochastic analysis of finite element systems. The advantages of the FE Galerkin methods, and in particular the SUPG, over finite difference schemes such as the modified Lax-Wendroff, which is the most frequently used method for the solution of the GDEE, are illustrated with numerical examples and explored further.
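
    For reference, the GDEE mentioned above is commonly written as the following one-dimensional hyperbolic PDE in the probability-density-evolution literature (Z is the response quantity of interest, Θ the vector of random parameters, and the marginal density of Z is recovered by integrating over Θ):

      \frac{\partial p_{Z\Theta}(z,\theta,t)}{\partial t}
        + \dot{Z}(\theta,t)\,\frac{\partial p_{Z\Theta}(z,\theta,t)}{\partial z} = 0,
      \qquad
      p_Z(z,t) = \int_{\Omega_\Theta} p_{Z\Theta}(z,\theta,t)\,\mathrm{d}\theta .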

  8. Liver Stiffness Measurement-Based Scoring System for Significant Inflammation Related to Chronic Hepatitis B

    PubMed Central

    Hong, Mei-Zhu; Zhang, Ru-Mian; Chen, Guo-Liang; Huang, Wen-Qi; Min, Feng; Chen, Tian; Xu, Jin-Chao; Pan, Jin-Shui

    2014-01-01

    Objectives: Liver biopsy is indispensable because liver stiffness measurement alone cannot provide information on intrahepatic inflammation. However, the presence of fibrosis highly correlates with inflammation. We constructed a noninvasive model to determine significant inflammation in chronic hepatitis B patients by using liver stiffness measurement and serum markers. Methods: The training set included chronic hepatitis B patients (n = 327), and the validation set included 106 patients; liver biopsies were performed, liver histology was scored, and serum markers were investigated. All patients underwent liver stiffness measurement. Results: An inflammation activity scoring system for significant inflammation was constructed. In the training set, the area under the curve, sensitivity, and specificity of the fibrosis-based activity score were 0.964, 91.9%, and 90.8% in the HBeAg(+) patients and 0.978, 85.0%, and 94.0% in the HBeAg(−) patients, respectively. In the validation set, the area under the curve, sensitivity, and specificity of the fibrosis-based activity score were 0.971, 90.5%, and 92.5% in the HBeAg(+) patients and 0.977, 95.2%, and 95.8% in the HBeAg(−) patients. The liver stiffness measurement-based activity score was comparable to that of the fibrosis-based activity score in both HBeAg(+) and HBeAg(−) patients for recognizing significant inflammation (G ≥ 3). Conclusions: Significant inflammation can be accurately predicted by this novel method. The liver stiffness measurement-based scoring system can be used without the aid of computers and provides a noninvasive alternative for the prediction of chronic hepatitis B-related significant inflammation. PMID:25360742

  9. Co-activation Probability Estimation (CoPE): An approach for modeling functional co-activation architecture based on neuroimaging coordinates.

    PubMed

    Chu, Congying; Fan, Lingzhong; Eickhoff, Claudia R; Liu, Yong; Yang, Yong; Eickhoff, Simon B; Jiang, Tianzi

    2015-08-15

    Recent progress in functional neuroimaging has prompted studies of brain activation during various cognitive tasks. Coordinate-based meta-analysis has been utilized to discover the brain regions that are consistently activated across experiments. However, within-experiment co-activation relationships, which can reflect the underlying functional relationships between different brain regions, have not been widely studied. In particular, voxel-wise co-activation, which may be able to provide a detailed configuration of the co-activation network, still needs to be modeled. To estimate the voxel-wise co-activation pattern and deduce the co-activation network, a Co-activation Probability Estimation (CoPE) method was proposed to model within-experiment activations for the purpose of defining the co-activations. A permutation test was adopted as a significance test. Moreover, the co-activations were automatically separated into local and long-range ones, based on distance. The two types of co-activations describe distinct features: the first reflects convergent activations; the second represents co-activations between different brain regions. The validation of CoPE was based on five simulation tests and one real dataset derived from studies of working memory. Both the simulated and the real data demonstrated that CoPE was not only able to find local convergence but also significant long-range co-activation. In particular, CoPE was able to identify a 'core' co-activation network in the working memory dataset. As a data-driven method, the CoPE method can be used to mine underlying co-activation relationships across experiments in future studies. PMID:26037052

  10. Formulation and Application of a Physically-Based Rupture Probability Model for Large Earthquakes on Subduction Zones: A Case Study of Earthquakes on Nazca Plate

    NASA Astrophysics Data System (ADS)

    Mahdyiar, M.; Galgana, G.; Shen-Tu, B.; Klein, E.; Pontbriand, C. W.

    2014-12-01

    Most time dependent rupture probability (TDRP) models are basically designed for a single-mode rupture, i.e. a single characteristic earthquake on a fault. However, most subduction zones rupture in complex patterns that create overlapping earthquakes of different magnitudes. Additionally, the limited historic earthquake data does not provide sufficient information to estimate reliable mean recurrence intervals for earthquakes. This makes it difficult to identify a single characteristic earthquake for TDRP analysis. Physical models based on geodetic data have been successfully used to obtain information on the state of coupling and slip deficit rates for subduction zones. Coupling information provides valuable insight into the complexity of subduction zone rupture processes. In this study we present a TDRP model that is formulated based on subduction zone slip deficit rate distribution. A subduction zone is represented by an integrated network of cells. Each cell ruptures multiple times from numerous earthquakes that have overlapping rupture areas. The rate of rupture for each cell is calculated using a moment balance concept that is calibrated based on historic earthquake data. The information in conjunction with estimates of coseismic slip from past earthquakes is used to formulate time dependent rupture probability models for cells. Earthquakes on the subduction zone and their rupture probabilities are calculated by integrating different combinations of cells. The resulting rupture probability estimates are fully consistent with the state of coupling of the subduction zone and the regional and local earthquake history as the model takes into account the impact of all large (M>7.5) earthquakes on the subduction zone. The granular rupture model as developed in this study allows estimating rupture probabilities for large earthquakes other than just a single characteristic magnitude earthquake. This provides a general framework for formulating physically-based

  11. Probability of sea level rise

    SciTech Connect

    Titus, J.G.; Narayanan, V.K.

    1995-10-01

    The report develops probability-based projections that can be added to local tide-gage trends to estimate future sea level at particular locations. It uses the same models employed by previous assessments of sea level rise. The key coefficients in those models are based on subjective probability distributions supplied by a cross-section of climatologists, oceanographers, and glaciologists.

  12. MEASUREMENT OF MULTI-POLLUTANT AND MULTI-PATHWAY EXPOSURES IN A PROBABILITY-BASED SAMPLE OF CHILDREN: PRACTICAL STRATEGIES FOR EFFECTIVE FIELD STUDIES

    EPA Science Inventory

    The purpose of this manuscript is to describe the practical strategies developed for the implementation of the Minnesota Children's Pesticide Exposure Study (MNCPES), which is one of the first probability-based samples of multi-pathway and multi-pesticide exposures in children....

  13. The Relationship between School Quality and the Probability of Passing Standards-Based High-Stakes Performance Assessments. CSE Technical Report 644

    ERIC Educational Resources Information Center

    Goldschmidt, Pete; Martinez-Fernandez, Jose-Felipe

    2004-01-01

    We examine whether school quality affects passing the California High School Exit Exam (CAHSEE), which is a standards-based high-stakes performance assessment. We use 3-level hierarchical logistic and linear models to examine student probabilities of passing the CAHSEE to take advantage of the availability of student, teacher, and school level…

  14. Evaluation of Postharvest-Processed Oysters by Using PCR-Based Most-Probable-Number Enumeration of Vibrio vulnificus Bacteria▿

    PubMed Central

    Wright, Anita C.; Garrido, Victor; Debuex, Georgia; Farrell-Evans, Melissa; Mudbidri, Archana A.; Otwell, W. Steven

    2007-01-01

    Postharvest processing (PHP) is used to reduce levels of Vibrio vulnificus in oysters, but process validation is labor-intensive and expensive. Therefore, quantitative PCR was evaluated as a rapid confirmation method for most-probable-number enumeration (QPCR-MPN) of V. vulnificus bacteria in PHP oysters. QPCR-MPN showed excellent correlation (R2 = 0.97) with standard MPN and increased assay sensitivity and efficiency. PMID:17905883

  15. Adapting the posterior probability of diagnosis index to enhance evidence-based screening: an application to ADHD in primary care.

    PubMed

    Lindhiem, Oliver; Yu, Lan; Grasso, Damion J; Kolko, David J; Youngstrom, Eric A

    2015-04-01

    This study adapts the Posterior Probability of Diagnosis (PPOD) Index for use with screening data. The original PPOD Index, designed for use in the context of comprehensive diagnostic assessments, is overconfident when applied to screening data. To correct for this overconfidence, we describe a simple method for adjusting the PPOD Index to improve its calibration when used for screening. Specifically, we compare the adjusted PPOD Index to the original index and naïve Bayes probability estimates on two dimensions of accuracy, discrimination and calibration, using a clinical sample of children and adolescents (N = 321) whose caregivers completed the Vanderbilt Assessment Scale to screen for attention-deficit/hyperactivity disorder and who subsequently completed a comprehensive diagnostic assessment. Results indicated that the adjusted PPOD Index, original PPOD Index, and naïve Bayes probability estimates are comparable using traditional measures of accuracy (sensitivity, specificity, and area under the curve), but the adjusted PPOD Index showed superior calibration. We discuss the importance of calibration for screening and diagnostic support tools when applied to individual patients. PMID:25000935
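
    The general move from a positive screen to a posterior probability can be sketched with Bayes' rule; the shrinkage step below is a generic stand-in for the calibration adjustment described in the record, not the authors' actual formula, and the base rate, sensitivity, specificity, and weight are illustrative.

      def posterior_positive(base_rate, sensitivity, specificity):
          """P(disorder | positive screen) from Bayes' rule."""
          tp = sensitivity * base_rate
          fp = (1.0 - specificity) * (1.0 - base_rate)
          return tp / (tp + fp)

      def shrunk_posterior(base_rate, sensitivity, specificity, weight=0.7):
          """Generic calibration adjustment: shrink the naive posterior toward the base rate."""
          p = posterior_positive(base_rate, sensitivity, specificity)
          return weight * p + (1.0 - weight) * base_rate

      # Illustrative ADHD screening numbers (not taken from the study)
      print(posterior_positive(0.30, 0.85, 0.75))   # naive posterior, ~0.59
      print(shrunk_posterior(0.30, 0.85, 0.75))     # posterior pulled toward the prior, ~0.51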

  16. Understanding text-based persuasion and support tactics of concerned significant others

    PubMed Central

    van Stolk-Cooke, Katherine; Hayes, Marie; Baumel, Amit

    2015-01-01

    The behavior of concerned significant others (CSOs) can have a measurable impact on the health and wellness of individuals attempting to meet behavioral and health goals, and research is needed to better understand the attributes of text-based CSO language when encouraging target significant others (TSOs) to achieve those goals. In an effort to inform the development of interventions for CSOs, this study examined the language content of brief text-based messages generated by CSOs to motivate TSOs to achieve a behavioral goal. CSOs generated brief text-based messages for TSOs for three scenarios: (1) to help TSOs achieve the goal, (2) in the event that the TSO is struggling to meet the goal, and (3) in the event that the TSO has given up on meeting the goal. Results indicate that there was a significant relationship between the tone and compassion of messages generated by CSOs, the CSOs’ perceptions of TSO motivation, and their expectation of a grateful or annoyed reaction by the TSO to their feedback or support. Results underscore the importance of attending to patterns in language when CSOs communicate with TSOs about goal achievement or failure, and how certain variables in the CSOs’ perceptions of their TSOs affect these characteristics. PMID:26312172

  17. The effects of inquiry-based science instruction training on teachers of students with significant disabilities

    NASA Astrophysics Data System (ADS)

    Courtade, Ginevra Rose

    Federal mandates (A Nation at Risk, 1983 and Project 2061: Science for all Americans, 1985) as well as the National Science Education Standards (NRC, 1996) call for science education for all students. Recent educational laws (IDEA, 1997; NCLB, 2002) require access to and assessment of the general curriculum, including science, for all students with disabilities. Although some research exists on teaching academics to students with significant disabilities, the research on teaching science is especially limited (Browder, Spooner, Ahlgrim-Delzell, Harris, & Wakeman, 2006; Browder, Wakeman, et al., 2006; Courtade, et al., 2006). The purpose of this investigation was to determine if training teachers of students with significant disabilities to teach science concepts using a guided inquiry-based method would change the way science was instructed in the classroom. Further objectives of this study were to determine if training the teachers would increase students' participation and achievement in science. The findings of this study demonstrated a functional relationship between the inquiry-based science instruction training and teacher's ability to instruct students with significant disabilities in science using inquiry-based science instruction. The findings of this study also indicated a functional relationship between the inquiry-based science instruction training and acquisition of student inquiry skills. Also, findings indicated an increase in the number of science content standards being addressed after the teachers received the training. Some students were also able to acquire new science terms after their teachers taught using inquiry-based instruction. Finally, social validity measures indicated a high degree of satisfaction with the intervention and its intended outcomes.

  18. Prognostic significance of volume-based PET parameters in cancer patients.

    PubMed

    Moon, Seung Hwan; Hyun, Seung Hyup; Choi, Joon Young

    2013-01-01

    Accurate prediction of cancer prognosis before the start of treatment is important since these predictions often affect the choice of treatment. Prognosis is usually based on anatomical staging and other clinical factors. However, the conventional system is not sufficient to accurately and reliably determine prognosis. Metabolic parameters measured by (18)F-fluorodeoxyglucose (FDG) positron emission tomography (PET) have the potential to provide valuable information regarding prognosis and treatment response evaluation in cancer patients. Among these parameters, volume-based PET parameters such as metabolic tumor volume and total lesion glycolysis are especially promising. However, the measurement of these parameters is significantly affected by the imaging methodology and specific image characteristics, and a standard method for these parameters has not been established. This review introduces volume-based PET parameters as potential prognostic indicators, and highlights methodological considerations for measurement, potential implications, and prospects for further studies. PMID:23323025

  19. Probability workshop to be better in probability topic

    NASA Astrophysics Data System (ADS)

    Asmat, Aszila; Ujang, Suriyati; Wahid, Sharifah Norhuda Syed

    2015-02-01

    The purpose of the present study was to examine whether statistics anxiety and attitudes towards the probability topic among students in higher education have an effect on their performance. 62 fourth-semester science students were given statistics anxiety questionnaires about their perception of the probability topic. Results indicated that students' performance in the probability topic is not related to anxiety level, meaning that a higher level of statistics anxiety does not cause a lower score in probability topic performance. The study also revealed that motivated students who attended the probability workshop showed a positive improvement in their probability topic performance compared with before the workshop. In addition, there is a significant difference in students' performance between genders, with better achievement among female students compared to male students. Thus, more initiatives in learning programs with different teaching approaches are needed to provide useful information for improving student learning outcomes in higher learning institutions.

  20. The impacts of problem gambling on concerned significant others accessing web-based counselling.

    PubMed

    Dowling, Nicki A; Rodda, Simone N; Lubman, Dan I; Jackson, Alun C

    2014-08-01

    The 'concerned significant others' (CSOs) of people with problem gambling frequently seek professional support. However, there is surprisingly little research investigating the characteristics or help-seeking behaviour of these CSOs, particularly for web-based counselling. The aims of this study were to describe the characteristics of CSOs accessing the web-based counselling service (real time chat) offered by the Australian national gambling web-based counselling site, explore the most commonly reported CSO impacts using a new brief scale (the Problem Gambling Significant Other Impact Scale: PG-SOIS), and identify the factors associated with different types of CSO impact. The sample comprised all 366 CSOs accessing the service over a 21 month period. The findings revealed that the CSOs were most often the intimate partners of problem gamblers and that they were most often females aged under 30 years. All CSOs displayed a similar profile of impact, with emotional distress (97.5%) and impacts on the relationship (95.9%) reported to be the most commonly endorsed impacts, followed by impacts on social life (92.1%) and finances (91.3%). Impacts on employment (83.6%) and physical health (77.3%) were the least commonly endorsed. There were few significant differences in impacts between family members (children, partners, parents, and siblings), but friends consistently reported the lowest impact scores. Only prior counselling experience and Asian cultural background were consistently associated with higher CSO impacts. The findings can serve to inform the development of web-based interventions specifically designed for the CSOs of problem gamblers. PMID:24813552

  1. Nuclear spin of odd-odd α emitters based on the behavior of α -particle preformation probability

    NASA Astrophysics Data System (ADS)

    Ismail, M.; Adel, A.; Botros, M. M.

    2016-05-01

    The preformation probabilities of an α cluster inside radioactive parent nuclei for both odd-even and odd-odd nuclei are investigated. The calculations cover the isotopic chains from Ir to Ac in the mass regions 166 ≤ A ≤ 215 and 77 ≤ Z ≤ 89. The calculations are employed in the framework of the density-dependent cluster model. A realistic density-dependent nucleon-nucleon (NN) interaction with a finite-range exchange part is used to calculate the microscopic α-nucleus potential in the well-established double-folding model. The main effect of antisymmetrization under exchange of nucleons between the α and daughter nuclei has been included in the folding model through the finite-range exchange part of the NN interaction. The calculated potential is then implemented to find both the assault frequency and the penetration probability of the α particle by means of the Wentzel-Kramers-Brillouin approximation in combination with the Bohr-Sommerfeld quantization condition. The correlation of the α-particle preformation probability and the neutron and proton level sequences of the parent nucleus as obtained in our previous work is extended to odd-even and odd-odd nuclei to determine the nuclear spin and parities. Two spin coupling rules are used, namely, strong and weak rules, to determine the nuclear spin for odd-odd isotopes. This work can be a useful reference for theoretical calculation of undetermined nuclear spin of odd-odd nuclei in the future.
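
    The quantities combined in the record follow the standard semiclassical decay expressions: the WKB penetrability P between the classical turning points r1 and r2, the assault frequency ν, and the preformation probability P_α give the decay constant λ and half-life as

      P \;=\; \exp\!\left[-\frac{2}{\hbar}\int_{r_1}^{r_2}\sqrt{2\mu\,\bigl(V(r)-Q_\alpha\bigr)}\;\mathrm{d}r\right],
      \qquad
      \lambda \;=\; P_\alpha\,\nu\,P,
      \qquad
      T_{1/2} \;=\; \frac{\ln 2}{\lambda}.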

  2. Group mindfulness-based therapy significantly improves sexual desire in women.

    PubMed

    Brotto, Lori A; Basson, Rosemary

    2014-06-01

    At least a third of women across reproductive ages experience low sexual desire and impaired arousal. There is increasing evidence that mindfulness, defined as non-judgmental present moment awareness, may improve women's sexual functioning. The goal of this study was to test the effectiveness of mindfulness-based therapy, either immediately or after a 3-month waiting period, in women seeking treatment for low sexual desire and arousal. Women participated in four 90-min group sessions that included mindfulness meditation, cognitive therapy, and education. A total of 117 women were assigned to either the immediate treatment (n = 68, mean age 40.8 yrs) or delayed treatment (n = 49, mean age 42.2 yrs) group, in which women had two pre-treatment baseline assessments followed by treatment. A total of 95 women completed assessments through to the 6-month follow-up period. Compared to the delayed treatment control group, treatment significantly improved sexual desire, sexual arousal, lubrication, sexual satisfaction, and overall sexual functioning. Sex-related distress significantly decreased in both conditions, regardless of treatment, as did orgasmic difficulties and depressive symptoms. Increases in mindfulness and a reduction in depressive symptoms predicted improvements in sexual desire. Mindfulness-based group therapy significantly improved sexual desire and other indices of sexual response, and should be considered in the treatment of women's sexual dysfunction. PMID:24814472

  3. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES, AND ECOLOGICAL RISK ASSESSMENT

    EPA Science Inventory

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program (EMAP), can be analyzed with a conditional probability analysis (CPA) to conduct quantitative probabi...

  4. Probability Surveys, Conditional Probability, and Ecological Risk Assessment

    EPA Science Inventory

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency’s (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  5. Evaluating research for clinical significance: using critically appraised topics to enhance evidence-based neuropsychology.

    PubMed

    Bowden, Stephen C; Harrison, Elise J; Loring, David W

    2014-01-01

    Meehl's (1973, Psychodiagnosis: Selected papers. Minneapolis: University of Minnesota Press) distinction between statistical and clinical significance holds special relevance for evidence-based neuropsychological practice. Meehl argued that despite attaining statistical significance, many published findings have limited practical value since they do not inform clinical care. In the context of an ever-expanding clinical research literature, accessible methods to evaluate clinical impact are needed. The method of Critically Appraised Topics (Straus, Richardson, Glasziou, & Haynes, 2011, Evidence-based medicine: How to practice and teach EBM (4th ed.). Edinburgh: Elsevier Churchill-Livingstone) was developed to provide clinicians with a "toolkit" to facilitate implementation of evidence-based practice. We illustrate the Critically Appraised Topics method using a dementia screening example. We argue that the skills practiced through critical appraisal provide clinicians with methods to: (1) evaluate the clinical relevance of new or unfamiliar research findings with a focus on patient benefit, (2) help focus on research quality, and (3) incorporate evaluation of clinical impact into educational and professional development activities. PMID:23463942

  6. Efficiency of using correlation function for estimation of probability of substance detection on the base of THz spectral dynamics

    NASA Astrophysics Data System (ADS)

    Trofimov, Vyacheslav A.; Peskov, Nikolay V.; Kirillov, Dmitry A.

    2012-10-01

    One of the problems arising in time-domain THz spectroscopy for security applications is developing criteria for assessing the probability of detection and identification of explosives and drugs. We analyze the efficiency of using the correlation function and another functional (more exactly, a spectral norm) for this purpose. These criteria are applied to the dynamics of spectral lines. To increase the reliability of the assessment we subtract the average value of the THz signal over the analysis interval, i.e. we remove the constant component from this part of the signal; this increases the contrast of the assessment. We compare the application of the Fourier-Gabor transform with an unbounded (for example, Gaussian) window sliding along the signal for finding the spectral line dynamics with the application of the Fourier transform in a short time interval (FTST), in which the Fourier transform is applied to parts of the signal, for the same purpose. These methods are close to each other; nevertheless, they differ in the series of frequencies that they use. It is important for practice that the optimal window shape depends on the method chosen for obtaining the spectral dynamics. The detection probability is enhanced if we can find a train of pulses with different frequencies that follow sequentially. We show that it is possible to obtain clean spectral line dynamics even when the spectrum of the substance response to the THz pulse is distorted.

  7. Generalized method for probability-based peptide and protein identification from tandem mass spectrometry data and sequence database searching.

    PubMed

    Ramos-Fernández, Antonio; Paradela, Alberto; Navajas, Rosana; Albar, Juan Pablo

    2008-09-01

    Tandem mass spectrometry-based proteomics is currently in great demand of computational methods that facilitate the elimination of likely false positives in peptide and protein identification. In the last few years, a number of new peptide identification programs have been described, but scores or other significance measures reported by these programs cannot always be directly translated into an easy to interpret error rate measurement such as the false discovery rate. In this work we used generalized lambda distributions to model frequency distributions of database search scores computed by MASCOT, X!TANDEM with k-score plug-in, OMSSA, and InsPecT. From these distributions, we could successfully estimate p values and false discovery rates with high accuracy. From the set of peptide assignments reported by any of these engines, we also defined a generic protein scoring scheme that enabled accurate estimation of protein-level p values by simulation of random score distributions that was also found to yield good estimates of protein-level false discovery rate. The performance of these methods was evaluated by searching four freely available data sets ranging from 40,000 to 285,000 MS/MS spectra. PMID:18515861
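
    A sketch of the generic pipeline described above (null score model → p-values → false discovery rate). SciPy does not ship a generalized lambda distribution, so a Gumbel null is fitted here purely as a stand-in for the GLD fit, and Benjamini-Hochberg stands in for whatever FDR estimator the authors used; all scores are simulated.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(2)
      null_scores = rng.gumbel(loc=20.0, scale=5.0, size=5000)       # e.g. decoy/random-match scores
      observed = np.concatenate([rng.gumbel(20.0, 5.0, 900),          # mostly incorrect matches...
                                 rng.normal(55.0, 6.0, 100)])         # ...plus some true identifications

      # Fit a parametric model to the null score distribution (stand-in for the GLD fit)
      loc, scale = stats.gumbel_r.fit(null_scores)
      p_values = stats.gumbel_r.sf(observed, loc=loc, scale=scale)    # right-tail p-values

      def bh_threshold(p, q=0.01):
          """Benjamini-Hochberg: largest p_(k) with p_(k) <= k/m * q."""
          p_sorted = np.sort(p)
          m = len(p)
          below = np.nonzero(p_sorted <= q * np.arange(1, m + 1) / m)[0]
          return p_sorted[below[-1]] if below.size else 0.0

      thr = bh_threshold(p_values, q=0.01)
      print("accepted identifications at 1% FDR:", int(np.sum(p_values <= thr)))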

  8. Estimation of the age-specific per-contact probability of Ebola virus transmission in Liberia using agent-based simulations

    NASA Astrophysics Data System (ADS)

    Siettos, Constantinos I.; Anastassopoulou, Cleo; Russo, Lucia; Grigoras, Christos; Mylonakis, Eleftherios

    2016-06-01

    Based on multiscale agent-based computations we estimated the per-contact probability of transmission by age of the Ebola virus disease (EVD) that swept through Liberia from May 2014 to March 2015. For the approximation of the epidemic dynamics we have developed a detailed agent-based model with small-world interactions between individuals categorized by age. For the estimation of the structure of the evolving contact network as well as the per-contact transmission probabilities by age group we exploited the so called Equation-Free framework. Model parameters were fitted to official case counts reported by the World Health Organization (WHO) as well as to recently published data of key epidemiological variables, such as the mean time to death, recovery and the case fatality rate.

  9. Waste Package Misload Probability

    SciTech Connect

    J.K. Knudsen

    2001-11-20

    The objective of this calculation is to calculate the probability of occurrence for fuel assembly (FA) misloads (i.e., FA placed in the wrong location) and FA damage during FA movements. The scope of this calculation is provided by the information obtained from the Framatome ANP 2001a report. The first step in this calculation is to categorize each fuel-handling event that occurred at nuclear power plants. The different categories are based on FAs being damaged or misloaded. The next step is to determine the total number of FAs involved in the event. Using this information, a probability of occurrence is calculated for FA misload and FA damage events. This calculation is an expansion of preliminary work performed by Framatome ANP 2001a.
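
    The calculation described reduces to a count of events divided by the number of demands, with an upper confidence bound that remains meaningful when few or no events are observed. A generic sketch with invented counts (not the Framatome ANP data):

      from scipy import stats

      events, moves = 3, 250_000          # hypothetical misload events and FA movements

      p_point = events / moves            # point estimate of per-move misload probability

      # One-sided 95% upper bound from the chi-square/Poisson relation (works for events = 0 too)
      p_upper = stats.chi2.ppf(0.95, 2 * (events + 1)) / (2 * moves)

      print(f"point estimate: {p_point:.2e} per move, 95% upper bound: {p_upper:.2e}")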

  10. A Non-Parametric Surrogate-based Test of Significance for T-Wave Alternans Detection

    PubMed Central

    Nemati, Shamim; Abdala, Omar; Bazán, Violeta; Yim-Yeh, Susie; Malhotra, Atul; Clifford, Gari

    2010-01-01

    We present a non-parametric adaptive surrogate test that allows for the differentiation of statistically significant T-Wave Alternans (TWA) from alternating patterns that can be solely explained by the statistics of noise. The proposed test is based on estimating the distribution of noise induced alternating patterns in a beat sequence from a set of surrogate data derived from repeated reshuffling of the original beat sequence. Thus, in assessing the significance of the observed alternating patterns in the data no assumptions are made about the underlying noise distribution. In addition, since the distribution of noise-induced alternans magnitudes is calculated separately for each sequence of beats within the analysis window, the method is robust to data non-stationarities in both noise and TWA. The proposed surrogate method for rejecting noise was compared to the standard noise rejection methods used with the Spectral Method (SM) and the Modified Moving Average (MMA) techniques. Using a previously described realistic multi-lead model of TWA, and real physiological noise, we demonstrate the proposed approach reduces false TWA detections, while maintaining a lower missed TWA detection compared with all the other methods tested. A simple averaging-based TWA estimation algorithm was coupled with the surrogate significance testing and was evaluated on three public databases; the Normal Sinus Rhythm Database (NRSDB), the Chronic Heart Failure Database (CHFDB) and the Sudden Cardiac Death Database (SCDDB). Differences in TWA amplitudes between each database were evaluated at matched heart rate (HR) intervals from 40 to 120 beats per minute (BPM). Using the two-sample Kolmogorov-Smirnov test, we found that significant differences in TWA levels exist between each patient group at all decades of heart rates. The most marked difference was generally found at higher heart rates, and the new technique resulted in a larger margin of separability between patient populations than
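
    A minimal version of the surrogate idea, using a simple even/odd-beat amplitude difference as the alternans statistic: the observed statistic is compared with its distribution over beat-reshuffled surrogates, so no assumption about the noise distribution is needed. The statistic, sample values, and threshold level are illustrative, not the paper's estimator.

      import numpy as np

      def alternans_magnitude(beats):
          """Toy TWA statistic: |mean(even-indexed T-wave amplitudes) - mean(odd-indexed)|."""
          return abs(beats[0::2].mean() - beats[1::2].mean())

      def surrogate_test(beats, n_surrogates=2000, alpha=0.05, rng=None):
          if rng is None:
              rng = np.random.default_rng(0)
          observed = alternans_magnitude(beats)
          surrogate = np.array([alternans_magnitude(rng.permutation(beats))
                                for _ in range(n_surrogates)])
          threshold = np.quantile(surrogate, 1.0 - alpha)
          return observed, threshold, observed > threshold

      rng = np.random.default_rng(3)
      amplitudes = 500 + 5 * rng.standard_normal(128)   # uV, noise only
      amplitudes[0::2] += 4.0                           # add a 4-uV alternating component
      print(surrogate_test(amplitudes, rng=rng))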

  11. Probability 1/e

    ERIC Educational Resources Information Center

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.
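
    One classic example of the kind the article discusses is the probability that a random permutation has no fixed point (a derangement), which tends to 1/e; a quick Monte Carlo check:

      import math
      import random

      def no_fixed_point(n):
          perm = list(range(n))
          random.shuffle(perm)
          return all(perm[i] != i for i in range(n))

      n, trials = 52, 100_000
      estimate = sum(no_fixed_point(n) for _ in range(trials)) / trials
      print(f"P(no fixed point) ~ {estimate:.4f}, 1/e = {1 / math.e:.4f}")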

  12. Non-front-fanged colubroid snakes: a current evidence-based analysis of medical significance.

    PubMed

    Weinstein, Scott A; White, Julian; Keyler, Daniel E; Warrell, David A

    2013-07-01

    Non-front-fanged colubroid snakes (NFFC; formerly and artificially taxonomically assembled as "colubrids") comprise about 70% of extant snake species and include several taxa now known to cause lethal or life-threatening envenoming in humans. Although the medical risks of bites by only a handful of species have been documented, a growing number of NFFC are implicated in medically significant bites. The majority of these snakes have oral products (Duvernoy's secretions, or venoms) with unknown biomedical properties and their potential for causing harm in humans is unknown. Increasingly, multiple NFFC species are entering the commercial snake trade, posing an uncertain risk. Published case reports describing NFFC bites were assessed for evidence-based value, clinical detail and verified species identification. These data were subjected to meta-analysis and a hazard index was generated for select taxa. Cases on which we consulted or personally treated were included and subjected to the same assessment criteria. Cases involving approximately 120 species met the selection criteria, and a small subset designated Hazard Level 1 (most hazardous) contained 5 species with lethal potential. Recommended management of these cases included antivenom for 3 species (Dispholidus typus, Rhabdophis tigrinus, Rhabdophis subminiatus), whereas others in this subset without commercially available antivenoms (Thelotornis spp.) were treated with plasma/erythrocyte replacement therapy and supportive care. Heparin, antifibrinolytics and/or plasmapheresis/exchange transfusion have been used in the management of some Hazard Level 1 envenomings, but evidence-based analysis positively contraindicates the use of any of these interventions. Hazard Level 2/3 species were involved in cases containing mixed quality data that implicated these taxa (e.g. Boiga irregularis, Philodryas olfersii, Malpolon monspessulanus) with bites that caused rare systemic effects. Recommended management may include use of

  13. Ultrasonography-Based Thyroidal and Perithyroidal Anatomy and Its Clinical Significance.

    PubMed

    Ha, Eun Ju; Baek, Jung Hwan; Lee, Jeong Hyun

    2015-01-01

    Ultrasonography (US)-guided procedures such as ethanol ablation, radiofrequency ablation, laser ablation, selective nerve block, and core needle biopsy have been widely applied in the diagnosis and management of thyroid and neck lesions. For a safe and effective US-guided procedure, knowledge of neck anatomy, particularly that of the nerves, vessels, and other critical structures, is essential. However, most previous reports evaluated neck anatomy based on cadavers, computed tomography, or magnetic resonance imaging rather than US. Therefore, the aim of this article was to elucidate US-based thyroidal and perithyroidal anatomy, as well as its clinical significance in the use of prevention techniques for complications during the US-guided procedures. Knowledge of these areas may be helpful for maximizing the efficacy and minimizing the complications of US-guided procedures for the thyroid and other neck lesions. PMID:26175574

  14. Ultrasonography-Based Thyroidal and Perithyroidal Anatomy and Its Clinical Significance

    PubMed Central

    Ha, Eun Ju; Lee, Jeong Hyun

    2015-01-01

    Ultrasonography (US)-guided procedures such as ethanol ablation, radiofrequency ablation, laser ablation, selective nerve block, and core needle biopsy have been widely applied in the diagnosis and management of thyroid and neck lesions. For a safe and effective US-guided procedure, knowledge of neck anatomy, particularly that of the nerves, vessels, and other critical structures, is essential. However, most previous reports evaluated neck anatomy based on cadavers, computed tomography, or magnetic resonance imaging rather than US. Therefore, the aim of this article was to elucidate US-based thyroidal and perithyroidal anatomy, as well as its clinical significance in the use of prevention techniques for complications during the US-guided procedures. Knowledge of these areas may be helpful for maximizing the efficacy and minimizing the complications of US-guided procedures for the thyroid and other neck lesions. PMID:26175574

  15. Significance of platelet count and platelet-based models for hepatocellular carcinoma recurrence

    PubMed Central

    Pang, Qing; Zhang, Jing-Yao; Xu, Xin-Sen; Song, Si-Dong; Qu, Kai; Chen, Wei; Zhou, Yan-Yan; Miao, Run-Chen; Liu, Su-Shun; Dong, Ya-Feng; Liu, Chang

    2015-01-01

    AIM: To explore the effects of platelet count (PLT) and 11 platelet-based indices on postoperative recurrence of hepatocellular carcinoma (HCC). METHODS: We retrospectively analyzed 172 HCC patients who were treated by partial hepatectomy. Preoperative data, including laboratory biochemical results, were used to calculate the 11 indices included in the analysis. We performed receiver operating characteristic curve analysis to determine the optimal cut-off values for predicting recurrence. Cumulative rates of HCC recurrence were calculated using Kaplan-Meier survival curves and differences were analyzed by log-rank tests. Multivariate analyses were performed to identify independent predictors of recurrence, early recurrence (within one year after surgery), and late recurrence in HCC. To obtain better prognostic models, PLT-based indices were analyzed separately after being expressed as binary and continuous variables. Two platelet-unrelated, validated HCC prognostic models were included in the analyses as reference indices. Additional analyses were performed after patients were stratified based on hepatitis B virus infection status, cirrhosis, and tumor size to investigate the significance of platelets in different subgroups. RESULTS: In the study cohort, 44.2% (76/172) of patients experienced HCC recurrence, and 50.6% (87/172) died during a median follow-up time of 46 mo. PLT and five of the 11 platelet-related models were significant predisposing factors for recurrence (P < 0.05). Multivariate analysis indicated that, among the clinical parameters, presence of ascites, PLT ≥ 148 × 10⁹/L, alkaline phosphatase ≥ 116 U/L, and tumor size ≥ 5 cm were independently associated with a higher risk of HCC recurrence (P < 0.05). Independent and significant models included the aspartate aminotransferase/PLT index, fibrosis index based on the four factors, fibro-quotient, aspartate aminotransferase/PLT/γ-glutamyl transpeptidase/alpha-fetoprotein index, and the PLT

  16. Minimizing the probable maximum flood

    SciTech Connect

    Woodbury, M.S.; Pansic, N. ); Eberlein, D.T. )

    1994-06-01

    This article examines Wisconsin Electric Power Company's efforts to determine an economical way to comply with Federal Energy Regulatory Commission requirements at two hydroelectric developments on the Michigamme River. Their efforts included refinement of the area's probable maximum flood model based, in part, on a newly developed probable maximum precipitation estimate.

  17. Probability for Weather and Climate

    NASA Astrophysics Data System (ADS)

    Smith, L. A.

    2013-12-01

    Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of

  18. Inference for binomial probability based on dependent Bernoulli random variables with applications to meta-analysis and group level studies.

    PubMed

    Bakbergenuly, Ilyas; Kulinskaya, Elena; Morgenthaler, Stephan

    2016-07-01

    We study bias arising as a result of nonlinear transformations of random variables in random or mixed effects models and its effect on inference in group-level studies or in meta-analysis. The findings are illustrated on the example of overdispersed binomial distributions, where we demonstrate considerable biases arising from standard log-odds and arcsine transformations of the estimated probability p̂, both for single-group studies and in combining results from several groups or studies in meta-analysis. Our simulations confirm that these biases are linear in ρ, for small values of ρ, the intracluster correlation coefficient. These biases do not depend on the sample sizes or the number of studies K in a meta-analysis and result in abysmal coverage of the combined effect for large K. We also propose bias-correction for the arcsine transformation. Our simulations demonstrate that this bias-correction works well for small values of the intraclass correlation. The methods are applied to two examples of meta-analyses of prevalence. PMID:27192062
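
    To make the reported bias concrete, the following minimal sketch (not the authors' code; parameter values are assumed) simulates overdispersed binomial data through a beta-binomial model with intracluster correlation rho and measures the bias of the arcsine-transformed estimate:

        # Sketch: bias of arcsin(sqrt(p_hat)) under intracluster correlation rho,
        # using a beta-binomial model for overdispersed counts.
        import numpy as np

        rng = np.random.default_rng(0)

        def arcsine_bias(p=0.2, rho=0.05, n=50, reps=200_000):
            # Beta parameters giving mean p and intracluster correlation rho.
            a, b = p * (1 - rho) / rho, (1 - p) * (1 - rho) / rho
            cluster_p = rng.beta(a, b, size=reps)
            p_hat = rng.binomial(n, cluster_p) / n
            return np.mean(np.arcsin(np.sqrt(p_hat))) - np.arcsin(np.sqrt(p))

        for rho in (0.01, 0.05, 0.10):
            print(f"rho={rho:.2f}  bias={arcsine_bias(rho=rho):+.4f}")

    For small rho the printed bias grows roughly linearly with rho, consistent with the linearity the abstract describes.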

  19. Laser Raman detection for oral cancer based on an adaptive Gaussian process classification method with posterior probabilities

    NASA Astrophysics Data System (ADS)

    Du, Zhanwei; Yang, Yongjian; Bai, Yuan; Wang, Lijun; Su, Le; Chen, Yong; Li, Xianchang; Zhou, Xiaodong; Jia, Jun; Shen, Aiguo; Hu, Jiming

    2013-03-01

    The existing methods for early and differential diagnosis of oral cancer are limited due to the unapparent early symptoms and the imperfect imaging examination methods. In this paper, the classification models of oral adenocarcinoma, carcinoma tissues and a control group with just four features are established by utilizing the hybrid Gaussian process (HGP) classification algorithm, with the introduction of the mechanisms of noise reduction and posterior probability. HGP shows much better performance in the experimental results. During the experimental process, oral tissues were divided into three groups, adenocarcinoma (n = 87), carcinoma (n = 100) and the control group (n = 134). The spectral data for these groups were collected. The prospective application of the proposed HGP classification method improved the diagnostic sensitivity to 56.35% and the specificity to about 70.00%, and resulted in a Matthews correlation coefficient (MCC) of 0.36. It is proved that the utilization of HGP in LRS detection analysis for the diagnosis of oral cancer gives accurate results. The prospect of application is also satisfactory.
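
    The Matthews correlation coefficient quoted above can be computed directly from a confusion matrix; the counts below are arbitrary placeholders, not the study's data:

        # Matthews correlation coefficient from a 2x2 confusion matrix.
        import math

        def mcc(tp, fp, tn, fn):
            denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
            return (tp * tn - fp * fn) / denom if denom else 0.0

        print(round(mcc(tp=49, fp=40, tn=94, fn=38), 2))   # placeholder counts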

  20. Inference for binomial probability based on dependent Bernoulli random variables with applications to meta‐analysis and group level studies

    PubMed Central

    Bakbergenuly, Ilyas; Morgenthaler, Stephan

    2016-01-01

    We study bias arising as a result of nonlinear transformations of random variables in random or mixed effects models and its effect on inference in group-level studies or in meta-analysis. The findings are illustrated on the example of overdispersed binomial distributions, where we demonstrate considerable biases arising from standard log-odds and arcsine transformations of the estimated probability p̂, both for single-group studies and in combining results from several groups or studies in meta-analysis. Our simulations confirm that these biases are linear in ρ, for small values of ρ, the intracluster correlation coefficient. These biases do not depend on the sample sizes or the number of studies K in a meta-analysis and result in abysmal coverage of the combined effect for large K. We also propose bias-correction for the arcsine transformation. Our simulations demonstrate that this bias-correction works well for small values of the intraclass correlation. The methods are applied to two examples of meta-analyses of prevalence. PMID:27192062

  1. Significantly enhanced robustness and electrochemical performance of flexible carbon nanotube-based supercapacitors by electrodepositing polypyrrole

    NASA Astrophysics Data System (ADS)

    Chen, Yanli; Du, Lianhuan; Yang, Peihua; Sun, Peng; Yu, Xiang; Mai, Wenjie

    2015-08-01

    Here, we report robust, flexible CNT-based supercapacitor (SC) electrodes fabricated by electrodepositing polypyrrole (PPy) on freestanding vacuum-filtered CNT film. These electrodes demonstrate significantly improved mechanical properties (with an ultimate tensile strength of 16 MPa) and greatly enhanced electrochemical performance (5.6 times larger areal capacitance). The major drawback of conductive polymer electrodes is the fast capacitance decay caused by structural breakdown, which reduces cycling stability; however, this decay is not observed in our case. All-solid-state SCs assembled with the robust CNT/PPy electrodes exhibit excellent flexibility, long lifetime (95% capacitance retention after 10,000 cycles) and high electrochemical performance (a total device volumetric capacitance of 4.9 F/cm3). Moreover, a flexible SC pack is demonstrated to light up 53 LEDs or drive a digital watch, indicating the broad potential application of our SCs for portable/wearable electronics.

  2. A Network-Based Method to Assess the Statistical Significance of Mild Co-Regulation Effects

    PubMed Central

    Horvát, Emőke-Ágnes; Zhang, Jitao David; Uhlmann, Stefan; Sahin, Özgür; Zweig, Katharina Anna

    2013-01-01

    Recent development of high-throughput, multiplexing technology has initiated projects that systematically investigate interactions between two types of components in biological networks, for instance transcription factors and promoter sequences, or microRNAs (miRNAs) and mRNAs. In terms of network biology, such screening approaches primarily attempt to elucidate relations between biological components of two distinct types, which can be represented as edges between nodes in a bipartite graph. However, it is often desirable not only to determine regulatory relationships between nodes of different types, but also to understand the connection patterns of nodes of the same type. Especially interesting is the co-occurrence of two nodes of the same type, i.e., the number of their common neighbours, which current high-throughput screening analysis fails to address. The co-occurrence gives the number of circumstances under which both of the biological components are influenced in the same way. Here we present SICORE, a novel network-based method to detect pairs of nodes with a statistically significant co-occurrence. We first show the stability of the proposed method on artificial data sets: when randomly adding and deleting observations we obtain reliable results even with noise exceeding the expected level in large-scale experiments. Subsequently, we illustrate the viability of the method based on the analysis of a proteomic screening data set to reveal regulatory patterns of human microRNAs targeting proteins in the EGFR-driven cell cycle signalling system. Since statistically significant co-occurrence may indicate functional synergy and the mechanisms underlying canalization, and thus hold promise in drug target identification and therapeutic development, we provide a platform-independent implementation of SICORE with a graphical user interface as a novel tool in the arsenal of high-throughput screening analysis. PMID:24039936
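
    The co-occurrence quantity at the heart of the method above is simply the number of common neighbours of two same-type nodes in the bipartite graph; the significance test itself (SICORE) is not reproduced in this minimal illustration with invented miRNA-target sets:

        # Counting common neighbours for pairs of same-type nodes in a bipartite graph.
        from itertools import combinations

        targets = {                     # hypothetical miRNA -> targeted-protein sets
            "miR-a": {"p1", "p2", "p3"},
            "miR-b": {"p2", "p3", "p4"},
            "miR-c": {"p5"},
        }

        for m1, m2 in combinations(targets, 2):
            common = targets[m1] & targets[m2]
            print(f"{m1} & {m2}: {len(common)} common neighbours {sorted(common)}")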

  3. Visualization of the significance of Receiver Operating Characteristics based on confidence ellipses

    NASA Astrophysics Data System (ADS)

    Sarlis, Nicholas V.; Christopoulos, Stavros-Richard G.

    2014-03-01

    The Receiver Operating Characteristics (ROC) is used for the evaluation of prediction methods in various disciplines like meteorology, geophysics, complex system physics, medicine etc. The estimation of the significance of a binary prediction method, however, remains a cumbersome task and is usually done by repeating the calculations by Monte Carlo. The FORTRAN code provided here simplifies this problem by evaluating the significance of binary predictions for a family of ellipses which are based on confidence ellipses and cover the whole ROC space. Catalogue identifier: AERY_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AERY_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 11511 No. of bytes in distributed program, including test data, etc.: 72906 Distribution format: tar.gz Programming language: FORTRAN. Computer: Any computer supporting a GNU FORTRAN compiler. Operating system: Linux, MacOS, Windows. RAM: 1 Mbyte Classification: 4.13, 9, 14. Nature of problem: The Receiver Operating Characteristics (ROC) is used for the evaluation of prediction methods in various disciplines like meteorology, geophysics, complex system physics, medicine etc. The estimation of the significance of a binary prediction method, however, remains a cumbersome task and is usually done by repeating the calculations by Monte Carlo. The FORTRAN code provided here simplifies this problem by evaluating the significance of binary predictions for a family of ellipses which are based on confidence ellipses and cover the whole ROC space. Solution method: Using the statistics of random binary predictions for a given value of the predictor threshold ε_t, one can construct the corresponding confidence ellipses. The envelope of these corresponding confidence ellipses is estimated when
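
    For orientation, the Monte Carlo baseline mentioned above can be sketched as follows (synthetic counts; this illustrates the repeated-randomization approach the code is designed to replace, not the confidence-ellipse construction itself):

        # Monte Carlo estimate of how often random binary predictions reach a given ROC point.
        import numpy as np

        rng = np.random.default_rng(1)

        def random_roc_points(n_pos, n_neg, n_alarms, trials=100_000):
            # Alarms placed at random: the number of hits is hypergeometric.
            hits = rng.hypergeometric(n_pos, n_neg, n_alarms, size=trials)
            return (n_alarms - hits) / n_neg, hits / n_pos   # (false-alarm rate, hit rate)

        F, H = random_roc_points(n_pos=20, n_neg=180, n_alarms=40)
        F_obs, H_obs = 0.15, 0.60                            # an observed ROC point (made up)
        print("P(random prediction does at least as well) =", np.mean((H >= H_obs) & (F <= F_obs)))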

  4. Functional activity maps based on significance measures and Independent Component Analysis.

    PubMed

    Martínez-Murcia, F J; Górriz, J M; Ramírez, J; Puntonet, C G; Illán, I A

    2013-07-01

    The use of functional imaging has been proven very helpful for the process of diagnosis of neurodegenerative diseases, such as Alzheimer's Disease (AD). In many cases, the analysis of these images is performed by manual reorientation and visual interpretation. Therefore, new statistical techniques to perform a more quantitative analysis are needed. In this work, a new statistical approximation to the analysis of functional images, based on significance measures and Independent Component Analysis (ICA) is presented. After the images preprocessing, voxels that allow better separation of the two classes are extracted, using significance measures such as the Mann-Whitney-Wilcoxon U-Test (MWW) and Relative Entropy (RE). After this feature selection step, the voxels vector is modelled by means of ICA, extracting a few independent components which will be used as an input to the classifier. Naive Bayes and Support Vector Machine (SVM) classifiers are used in this work. The proposed system has been applied to two different databases. A 96-subjects Single Photon Emission Computed Tomography (SPECT) database from the "Virgen de las Nieves" Hospital in Granada, Spain, and a 196-subjects Positron Emission Tomography (PET) database from the Alzheimer's Disease Neuroimaging Initiative (ADNI). Values of accuracy up to 96.9% and 91.3% for SPECT and PET databases are achieved by the proposed system, which has yielded many benefits over methods proposed on recent works. PMID:23660005
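
    A hedged sketch of the kind of pipeline described above (Mann-Whitney-Wilcoxon voxel selection, ICA compression, then an SVM), using synthetic data in place of SPECT/PET images and assumed thresholds and component counts:

        # Feature selection by MWW U-test, ICA dimensionality reduction, SVM classification.
        import numpy as np
        from scipy.stats import mannwhitneyu
        from sklearn.decomposition import FastICA
        from sklearn.svm import SVC
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(2)
        X = rng.normal(size=(96, 500))           # 96 subjects x 500 "voxels" (synthetic)
        y = rng.integers(0, 2, size=96)          # 0 = control, 1 = patient (synthetic)
        X[y == 1, :50] += 0.8                    # make the first 50 voxels informative

        # 1) Keep voxels whose MWW p-value falls below a chosen threshold.
        pvals = np.array([mannwhitneyu(X[y == 0, j], X[y == 1, j]).pvalue
                          for j in range(X.shape[1])])
        X_sel = X[:, pvals < 0.01]

        # 2) Compress the selected voxels into a few independent components.
        X_ica = FastICA(n_components=5, random_state=0).fit_transform(X_sel)

        # 3) Classify and report cross-validated accuracy.
        print(cross_val_score(SVC(kernel="linear"), X_ica, y, cv=5).mean())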

  5. Future challenges for vection research: definitions, functional significance, measures, and neural bases

    PubMed Central

    Palmisano, Stephen; Allison, Robert S.; Schira, Mark M.; Barry, Robert J.

    2015-01-01

    This paper discusses four major challenges facing modern vection research. Challenge 1 (Defining Vection) outlines the different ways that vection has been defined in the literature and discusses their theoretical and experimental ramifications. The term vection is most often used to refer to visual illusions of self-motion induced in stationary observers (by moving, or simulating the motion of, the surrounding environment). However, vection is increasingly being used to also refer to non-visual illusions of self-motion, visually mediated self-motion perceptions, and even general subjective experiences (i.e., “feelings”) of self-motion. The common thread in all of these definitions is the conscious subjective experience of self-motion. Thus, Challenge 2 (Significance of Vection) tackles the crucial issue of whether such conscious experiences actually serve functional roles during self-motion (e.g., in terms of controlling or guiding the self-motion). After more than 100 years of vection research there has been surprisingly little investigation into its functional significance. Challenge 3 (Vection Measures) discusses the difficulties with existing subjective self-report measures of vection (particularly in the context of contemporary research), and proposes several more objective measures of vection based on recent empirical findings. Finally, Challenge 4 (Neural Basis) reviews the recent neuroimaging literature examining the neural basis of vection and discusses the hurdles still facing these investigations. PMID:25774143

  6. Mass spectrometry-based protein identification with accurate statistical significance assignment

    PubMed Central

    Alves, Gelio; Yu, Yi-Kuo

    2015-01-01

    Motivation: Assigning statistical significance accurately has become increasingly important as metadata of many types, often assembled in hierarchies, are constructed and combined for further biological analyses. Statistical inaccuracy of metadata at any level may propagate to downstream analyses, undermining the validity of scientific conclusions thus drawn. From the perspective of mass spectrometry-based proteomics, even though accurate statistics for peptide identification can now be achieved, accurate protein level statistics remain challenging. Results: We have constructed a protein ID method that combines peptide evidences of a candidate protein based on a rigorous formula derived earlier; in this formula the database P-value of every peptide is weighted, prior to the final combination, according to the number of proteins it maps to. We have also shown that this protein ID method provides accurate protein level E-value, eliminating the need of using empirical post-processing methods for type-I error control. Using a known protein mixture, we find that this protein ID method, when combined with the Sorić formula, yields accurate values for the proportion of false discoveries. In terms of retrieval efficacy, the results from our method are comparable with other methods tested. Availability and implementation: The source code, implemented in C++ on a linux system, is available for download at ftp://ftp.ncbi.nlm.nih.gov/pub/qmbp/qmbp_ms/RAId/RAId_Linux_64Bit. Contact: yyu@ncbi.nlm.nih.gov Supplementary information: Supplementary data are available at Bioinformatics online. PMID:25362092

  7. Model Assembly for Estimating Cell Surviving Fraction for Both Targeted and Nontargeted Effects Based on Microdosimetric Probability Densities

    PubMed Central

    Sato, Tatsuhiko; Hamada, Nobuyuki

    2014-01-01

    We here propose a new model assembly for estimating the surviving fraction of cells irradiated with various types of ionizing radiation, considering both targeted and nontargeted effects in the same framework. The probability densities of specific energies in two scales, which are the cell nucleus and its substructure called a domain, were employed as the physical index for characterizing the radiation fields. In the model assembly, our previously established double stochastic microdosimetric kinetic (DSMK) model was used to express the targeted effect, whereas a newly developed model was used to express the nontargeted effect. The radioresistance caused by overexpression of anti-apoptotic protein Bcl-2 known to frequently occur in human cancer was also considered by introducing the concept of the adaptive response in the DSMK model. The accuracy of the model assembly was examined by comparing the computationally and experimentally determined surviving fraction of Bcl-2 cells (Bcl-2 overexpressing HeLa cells) and Neo cells (neomycin resistant gene-expressing HeLa cells) irradiated with microbeam or broadbeam of energetic heavy ions, as well as the WI-38 normal human fibroblasts irradiated with X-ray microbeam. The model assembly reproduced very well the experimentally determined surviving fraction over a wide range of dose and linear energy transfer (LET) values. Our newly established model assembly will be worth being incorporated into treatment planning systems for heavy-ion therapy, brachytherapy, and boron neutron capture therapy, given critical roles of the frequent Bcl-2 overexpression and the nontargeted effect in estimating therapeutic outcomes and harmful effects of such advanced therapeutic modalities. PMID:25426641

  8. Model assembly for estimating cell surviving fraction for both targeted and nontargeted effects based on microdosimetric probability densities.

    PubMed

    Sato, Tatsuhiko; Hamada, Nobuyuki

    2014-01-01

    We here propose a new model assembly for estimating the surviving fraction of cells irradiated with various types of ionizing radiation, considering both targeted and nontargeted effects in the same framework. The probability densities of specific energies in two scales, which are the cell nucleus and its substructure called a domain, were employed as the physical index for characterizing the radiation fields. In the model assembly, our previously established double stochastic microdosimetric kinetic (DSMK) model was used to express the targeted effect, whereas a newly developed model was used to express the nontargeted effect. The radioresistance caused by overexpression of anti-apoptotic protein Bcl-2 known to frequently occur in human cancer was also considered by introducing the concept of the adaptive response in the DSMK model. The accuracy of the model assembly was examined by comparing the computationally and experimentally determined surviving fraction of Bcl-2 cells (Bcl-2 overexpressing HeLa cells) and Neo cells (neomycin resistant gene-expressing HeLa cells) irradiated with microbeam or broadbeam of energetic heavy ions, as well as the WI-38 normal human fibroblasts irradiated with X-ray microbeam. The model assembly reproduced very well the experimentally determined surviving fraction over a wide range of dose and linear energy transfer (LET) values. Our newly established model assembly will be worth being incorporated into treatment planning systems for heavy-ion therapy, brachytherapy, and boron neutron capture therapy, given critical roles of the frequent Bcl-2 overexpression and the nontargeted effect in estimating therapeutic outcomes and harmful effects of such advanced therapeutic modalities. PMID:25426641

  9. Probability mapping of contaminants

    SciTech Connect

    Rautman, C.A.; Kaplan, P.G.; McGraw, M.A.; Istok, J.D.; Sigda, J.M.

    1994-04-01

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. The probability mapping approach illustrated in this paper appears to offer site operators a reasonable, quantitative methodology for many environmental remediation decisions and allows evaluation of the risk associated with those decisions. For example, output from this approach can be used in quantitative, cost-based decision models for evaluating possible site characterization and/or remediation plans, resulting in selection of the risk-adjusted, least-cost alternative. The methodology is completely general, and the techniques are applicable to a wide variety of environmental restoration projects. The probability-mapping approach is illustrated by application to a contaminated site at the former DOE Feed Materials Production Center near Fernald, Ohio. Soil geochemical data, collected as part of the Uranium-in-Soils Integrated Demonstration Project, have been used to construct a number of geostatistical simulations of potential contamination for parcels approximately the size of a selective remediation unit (the 3-m width of a bulldozer blade). Each such simulation accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination (potential clean-up or personnel-hazard thresholds).
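
    The post-processing step described above reduces, in essence, to an exceedance-frequency calculation over a stack of equally likely simulations; a minimal sketch with synthetic simulations and a hypothetical clean-up threshold:

        # Per-cell probability of exceeding a threshold, from a stack of simulations.
        import numpy as np

        rng = np.random.default_rng(3)
        simulations = rng.lognormal(mean=3.0, sigma=0.8, size=(200, 50, 50))    # 200 synthetic maps, ppm

        threshold = 35.0                                        # hypothetical clean-up level
        prob_exceed = (simulations > threshold).mean(axis=0)    # per-cell exceedance probability

        flagged = prob_exceed > 0.10                            # flag cells above a 10% risk tolerance
        print(f"{flagged.mean():.1%} of cells flagged for possible remediation")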

  10. The spline probability hypothesis density filter

    NASA Astrophysics Data System (ADS)

    Sithiravel, Rajiv; Tharmarasa, Ratnasingham; McDonald, Mike; Pelletier, Michel; Kirubarajan, Thiagalingam

    2012-06-01

    The Probability Hypothesis Density Filter (PHD) is a multitarget tracker for recursively estimating the number of targets and their state vectors from a set of observations. The PHD filter is capable of working well in scenarios with false alarms and missed detections. Two distinct PHD filter implementations are available in the literature: the Sequential Monte Carlo Probability Hypothesis Density (SMC-PHD) and the Gaussian Mixture Probability Hypothesis Density (GM-PHD) filters. The SMC-PHD filter uses particles to provide target state estimates, which can lead to a high computational load, whereas the GM-PHD filter does not use particles, but is restricted to linear Gaussian mixture models. The SMC-PHD filter technique provides only weighted samples at discrete points in the state space instead of a continuous estimate of the probability density function of the system state and thus suffers from the well-known degeneracy problem. This paper proposes a B-Spline based Probability Hypothesis Density (S-PHD) filter, which has the capability to model any arbitrary probability density function. The resulting algorithm can handle linear, non-linear, Gaussian, and non-Gaussian models and the S-PHD filter can also provide continuous estimates of the probability density function of the system state. In addition, by moving the knots dynamically, the S-PHD filter ensures that the splines cover only the region where the probability of the system state is significant, hence the high efficiency of the S-PHD filter is maintained at all times. Also, unlike the SMC-PHD filter, the S-PHD filter is immune to the degeneracy problem due to its continuous nature. The S-PHD filter derivations and simulations are provided in this paper.

  11. Probability-based classifications for spatially characterizing the water temperatures and discharge rates of hot springs in the Tatun Volcanic Region, Taiwan.

    PubMed

    Jang, Cheng-Shin

    2015-05-01

    Accurately classifying the spatial features of the water temperatures and discharge rates of hot springs is crucial for environmental resources use and management. This study spatially characterized classifications of the water temperatures and discharge rates of hot springs in the Tatun Volcanic Region of Northern Taiwan by using indicator kriging (IK). The water temperatures and discharge rates of the springs were first assigned to high, moderate, and low categories according to the two thresholds of the proposed spring classification criteria. IK was then used to model the occurrence probabilities of the water temperatures and discharge rates of the springs and probabilistically determine their categories. Finally, nine combinations were acquired from the probability-based classifications for the spatial features of the water temperatures and discharge rates of the springs. Moreover, various combinations of spring water features were examined according to seven subzones of spring use in the study region. The research results reveal that probability-based classifications using IK provide practicable insights related to propagating the uncertainty of classifications according to the spatial features of the water temperatures and discharge rates of the springs. The springs in the Beitou (BT), Xingyi Road (XYR), Zhongshanlou (ZSL), and Lengshuikeng (LSK) subzones are suitable for supplying tourism hotels with a sufficient quantity of spring water because they have high or moderate discharge rates. Furthermore, natural hot springs in riverbeds and valleys should be developed in the Dingbeitou (DBT), ZSL, Xiayoukeng (XYK), and Macao (MC) subzones because of low discharge rates and low or moderate water temperatures. PMID:25917185
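
    The classification logic rests on indicator coding against the two thresholds, after which the kriged indicators are read as cumulative probabilities. A minimal sketch of that coding step with invented temperatures and hypothetical class cuts (the kriging itself is omitted, so the probabilities below degenerate to 0/1):

        # Indicator coding and category assignment preceding indicator kriging (IK).
        import numpy as np

        temps = np.array([42.0, 55.5, 61.2, 38.9, 70.3, 48.7])   # degC, synthetic
        t_low, t_high = 45.0, 60.0                                # hypothetical class cuts

        ind_low = (temps <= t_low).astype(float)      # I_1(x) = 1 if value <= lower cut
        ind_high = (temps <= t_high).astype(float)    # I_2(x) = 1 if value <= upper cut

        # After kriging, the interpolated indicators estimate cumulative probabilities:
        #   P(low) = I_1,  P(moderate) = I_2 - I_1,  P(high) = 1 - I_2
        p_low, p_mod, p_high = ind_low, ind_high - ind_low, 1.0 - ind_high
        category = np.select([p_low >= np.maximum(p_mod, p_high), p_mod >= p_high],
                             ["low", "moderate"], default="high")
        print(list(category))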

  12. Function-Based Discovery of Significant Transcriptional Temporal Patterns in Insulin Stimulated Muscle Cells

    PubMed Central

    Di Camillo, Barbara; Irving, Brian A.; Schimke, Jill; Sanavia, Tiziana; Toffolo, Gianna; Cobelli, Claudio; Nair, K. Sreekumaran

    2012-01-01

    Background Insulin action on protein synthesis (translation of transcripts) and post-translational modifications, especially those involving reversible modifications such as phosphorylation of various signaling proteins, is extensively studied, but the insulin effect on the transcription of genes, especially transcriptional temporal patterns, remains to be fully defined. Methodology/Principal Findings To identify significant transcriptional temporal patterns we utilized primary differentiated rat skeletal muscle myotubes which were treated with insulin and samples were collected every 20 min for 8 hours. Pooled samples at every hour were analyzed by a gene array approach to measure transcript levels. The patterns of transcript levels were analyzed based on a novel method that integrates selection, clustering, and functional annotation to find the main temporal patterns associated with functional groups of differentially expressed genes. 326 genes were found to be differentially expressed in response to in vitro insulin administration in skeletal muscle myotubes. Approximately 20% of the genes that were differentially expressed were identified as belonging to the insulin signaling pathway. Characteristic transcriptional temporal patterns include: (a) a slow and gradual decrease in gene expression, (b) a gradual increase in gene expression, reaching a peak at about 5 hours and then a plateau or an initial decrease, and other variable patterns of increase in gene expression over time. Conclusion/Significance The new method allows identifying characteristic dynamic responses to insulin stimulus, common to a number of genes and associated with the same functional group. The results demonstrate that insulin treatment elicited different clusters of gene transcript profiles, supporting a temporal regulation of gene expression by insulin in skeletal muscle cells. PMID:22396763

  13. Probability and Relative Frequency

    NASA Astrophysics Data System (ADS)

    Drieschner, Michael

    2016-01-01

    The concept of probability seems to have been inexplicable since its invention in the seventeenth century. In its use in science, probability is closely related with relative frequency. So the task seems to be interpreting that relation. In this paper, we start with predicted relative frequency and show that its structure is the same as that of probability. I propose to call that the `prediction interpretation' of probability. The consequences of that definition are discussed. The "ladder"-structure of the probability calculus is analyzed. The expectation of the relative frequency is shown to be equal to the predicted relative frequency. Probability is shown to be the most general empirically testable prediction.
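
    The stated equality between the expectation of the relative frequency and the predicted probability reduces, under the usual identification of n exchangeable trials X_1, ..., X_n with P(X_i = 1) = p, to linearity of expectation:

        % one-line check of the identity, under the standard assumption stated above
        \mathbb{E}\!\left[\frac{1}{n}\sum_{i=1}^{n} X_i\right]
          = \frac{1}{n}\sum_{i=1}^{n}\mathbb{E}[X_i]
          = \frac{1}{n}\, n p = p .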

  14. Chronic Arsenic Poisoning Probably Caused by Arsenic-Based Pesticides: Findings from an Investigation Study of a Household

    PubMed Central

    Li, Yongfang; Ye, Feng; Wang, Anwei; Wang, Da; Yang, Boyi; Zheng, Quanmei; Sun, Guifan; Gao, Xinghua

    2016-01-01

    In addition to naturally occurring arsenic, man-made arsenic-based compounds are other sources of arsenic exposure. In 2013, our group identified 12 suspected arsenicosis patients in a household (32 living members). Of them, eight members were diagnosed with skin cancer. Interestingly, all of these patients had lived in the household prior to 1989. An investigation revealed that approximately 2 tons of arsenic-based pesticides had been previously placed near a well that had supplied drinking water to the family from 1973 to 1989. The current arsenic level in the well water was 620 μg/L. No other high arsenic wells were found near the family’s residence. Based on these findings, it is possible to infer that the skin lesions exhibited by these family members were caused by long-term exposure to well water contaminated with arsenic-based pesticides. Additionally, biochemical analysis showed that the individuals exposed to arsenic had higher levels of aspartate aminotransferase and γ-glutamyl transpeptidase than those who were not exposed. These findings might indicate the presence of liver dysfunction in the arsenic-exposed individuals. This report elucidates the effects of arsenical compounds on the occurrence of high levels of arsenic in the environment and emphasizes the severe human health impact of arsenic exposure. PMID:26784217

  15. Explosion probability of unexploded ordnance: expert beliefs.

    PubMed

    MacDonald, Jacqueline Anne; Small, Mitchell J; Morgan, M G

    2008-08-01

    This article reports on a study to quantify expert beliefs about the explosion probability of unexploded ordnance (UXO). Some 1,976 sites at closed military bases in the United States are contaminated with UXO and are slated for cleanup, at an estimated cost of $15-140 billion. Because no available technology can guarantee 100% removal of UXO, information about explosion probability is needed to assess the residual risks of civilian reuse of closed military bases and to make decisions about how much to invest in cleanup. This study elicited probability distributions for the chance of UXO explosion from 25 experts in explosive ordnance disposal, all of whom have had field experience in UXO identification and deactivation. The study considered six different scenarios: three different types of UXO handled in two different ways (one involving children and the other involving construction workers). We also asked the experts to rank by sensitivity to explosion 20 different kinds of UXO found at a case study site at Fort Ord, California. We found that the experts do not agree about the probability of UXO explosion, with significant differences among experts in their mean estimates of explosion probabilities and in the amount of uncertainty that they express in their estimates. In three of the six scenarios, the divergence was so great that the average of all the expert probability distributions was statistically indistinguishable from a uniform (0, 1) distribution, suggesting that the sum of expert opinion provides no information at all about the explosion risk. The experts' opinions on the relative sensitivity to explosion of the 20 UXO items also diverged. The average correlation between rankings of any pair of experts was 0.41, which, statistically, is barely significant (p = 0.049) at the 95% confidence level. Thus, one expert's rankings provide little predictive information about another's rankings. The lack of consensus among experts suggests that empirical studies

  16. Near-infrared Raman spectroscopy for in-vivo diagnosis of cervical dysplasia: a probability-based multi-class diagnostic algorithm

    NASA Astrophysics Data System (ADS)

    Majumder, Shovan K.; Kanter, Elizabeth; Robichaux Viehoever, Amy; Jones, Howard; Mahadevan-Jansen, Anita

    2007-02-01

    We report the development of a probability-based multi-class diagnostic algorithm to simultaneously distinguish high-grade dysplasia from low-grade dysplasia, squamous metaplasia as well as normal human cervical tissues using near-infrared Raman spectra acquired in-vivo from the cervix of patients at the Vanderbilt University Medical Center. Extraction of diagnostic features from the Raman spectra uses the recently formulated theory of nonlinear Maximum Representation and Discrimination Feature (MRDF), and classification into respective tissue categories is based on the theory of Sparse Multinomial Logistic Regression (SMLR), a recent Bayesian machine-learning framework of statistical pattern recognition. The algorithm based on MRDF and SMLR was found to provide very good diagnostic performance with a predictive accuracy of ~90% based on leave-one-out cross validation in classifying the tissue Raman spectra into the four different classes, using histology as the "gold standard". The inherently multi-class nature of the algorithm facilitates a rapid and simultaneous classification of tissue spectra into various tissue categories without the need to train and heuristically combine multiple binary classifiers. Further, the probabilistic framework of the algorithm makes it possible to predict the posterior probability of class membership in discriminating the different tissue types.
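
    As a generic stand-in for the probabilistic output described above (this is ordinary multinomial logistic regression, not the paper's MRDF + SMLR pipeline, and the features are synthetic), the posterior class-membership probabilities for a four-class tissue problem look like this:

        # Posterior class probabilities from a multinomial logistic classifier.
        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(5)
        labels = ["normal", "metaplasia", "low-grade", "high-grade"]
        X = np.vstack([rng.normal(loc=i * 0.7, size=(40, 6)) for i in range(4)])  # synthetic features
        y = np.repeat(np.arange(4), 40)

        clf = LogisticRegression(max_iter=2000).fit(X, y)
        posterior = clf.predict_proba(X[:1])[0]           # probabilities sum to 1
        for name, p in zip(labels, posterior):
            print(f"P({name:>10s} | spectrum) = {p:.2f}")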

  17. Fully Automated Renal Tissue Volumetry in MR Volume Data Using Prior-Shape-Based Segmentation in Subject-Specific Probability Maps.

    PubMed

    Gloger, Oliver; Tönnies, Klaus; Laqua, Rene; Völzke, Henry

    2015-10-01

    Organ segmentation in magnetic resonance (MR) volume data is of increasing interest in epidemiological studies and clinical practice. Especially in large-scale population-based studies, organ volumetry is highly relevant requiring exact organ segmentation. Since manual segmentation is time consuming and prone to reader variability, large-scale studies need automatic methods to perform organ segmentation. In this paper, we present an automated framework for renal tissue segmentation that computes renal parenchyma, cortex, and medulla volumetry in native MR volume data without any user interaction. We introduce a novel strategy of subject-specific probability map computation for renal tissue types, which takes inter- and intra-MR-intensity variability into account. Several kinds of tissue-related 2-D and 3-D prior-shape knowledge are incorporated in modularized framework parts to segment renal parenchyma in a final level set segmentation strategy. Subject-specific probabilities for medulla and cortex tissue are applied in a fuzzy clustering technique to delineate cortex and medulla tissue inside segmented parenchyma regions. The novel subject-specific computation approach provides clearly improved tissue probability map quality than existing methods. Comparing to existing methods, the framework provides improved results for parenchyma segmentation. Furthermore, cortex and medulla segmentation qualities are very promising but cannot be compared to existing methods since state-of-the art methods for automated cortex and medulla segmentation in native MR volume data are still missing. PMID:25915954

  18. Evolution and Probability.

    ERIC Educational Resources Information Center

    Bailey, David H.

    2000-01-01

    Some of the most impressive-sounding criticisms of the conventional theory of biological evolution involve probability. Presents a few examples of how probability should and should not be used in discussing evolution. (ASK)

  19. BIODEGRADATION PROBABILITY PROGRAM (BIODEG)

    EPA Science Inventory

    The Biodegradation Probability Program (BIODEG) calculates the probability that a chemical under aerobic conditions with mixed cultures of microorganisms will biodegrade rapidly or slowly. It uses fragment constants developed using multiple linear and non-linear regressions and d...

  20. Probability on a Budget.

    ERIC Educational Resources Information Center

    Ewbank, William A.; Ginther, John L.

    2002-01-01

    Describes how to use common dice numbered 1-6 for simple mathematical situations including probability. Presents a lesson using regular dice and specially marked dice to explore some of the concepts of probability. (KHR)

  1. Response of the San Andreas fault to the 1983 Coalinga-Nuñez earthquakes: an application of interaction-based probabilities for Parkfield

    USGS Publications Warehouse

    Toda, Shinji; Stein, Ross S.

    2002-01-01

    The Parkfield-Cholame section of the San Andreas fault, site of an unfulfilled earthquake forecast in 1985, is the best monitored section of the world's most closely watched fault. In 1983, the M = 6.5 Coalinga and M = 6.0 Nuñez events struck 25 km northeast of Parkfield. Seismicity rates climbed for 18 months along the creeping section of the San Andreas north of Parkfield and dropped for 6 years along the locked section to the south. Right-lateral creep also slowed or reversed from Parkfield south. Here we calculate that the Coalinga sequence increased the shear and Coulomb stress on the creeping section, causing the rate of small shocks to rise until the added stress was shed by additional slip. However, the 1983 events decreased the shear and Coulomb stress on the Parkfield segment, causing surface creep and seismicity rates to drop. We use these observations to cast the likelihood of a Parkfield earthquake into an interaction-based probability, which includes both the renewal of stress following the 1966 Parkfield earthquake and the stress transfer from the 1983 Coalinga events. We calculate that the 1983 shocks dropped the 10-year probability of a M ∼ 6 Parkfield earthquake by 22% (from 54 ± 22% to 42 ± 23%) and that the probability did not recover until about 1991, when seismicity and creep resumed. Our analysis may thus explain why the Parkfield earthquake did not strike in the 1980s, but not why it was absent in the 1990s. We calculate a 58 ± 17% probability of a M ∼ 6 Parkfield earthquake during 2001–2011.

  2. Dependent Probability Spaces

    ERIC Educational Resources Information Center

    Edwards, William F.; Shiflett, Ray C.; Shultz, Harris

    2008-01-01

    The mathematical model used to describe independence between two events in probability has a non-intuitive consequence called dependent spaces. The paper begins with a very brief history of the development of probability, then defines dependent spaces, and reviews what is known about finite spaces with uniform probability. The study of finite…

  3. Searching with probabilities

    SciTech Connect

    Palay, A.J.

    1985-01-01

    This book examines how probability distributions can be used as a knowledge representation technique. It presents a mechanism that can be used to guide a selective search algorithm to solve a variety of tactical chess problems. Topics covered include probabilities and searching the B algorithm and chess probabilities - in practice, examples, results, and future work.

  4. A latent class analysis of adolescent adverse life events based on a Danish national youth probability sample.

    PubMed

    Shevlin, Mark; Elklit, Ask

    2008-01-01

    The aim of this study was to determine if there are meaningful clusters of individuals with similar experiences of adverse life events in a nationally representative sample of Danish adolescents. Latent class analysis (LCA) was used to identify such clusters or latent classes. In addition, the relationships between the latent classes and living arrangements and diagnosis of post-traumatic stress disorder (PTSD) were estimated. A four-class solution was found to be the best description of multiple adverse life events, and the classes were labelled "Low Risk", "Intermediate Risk", "Pregnancy" and "High Risk". Compared with the Low Risk class, the other classes were found to be significantly more likely to have a diagnosis PTSD and live with only one parent. This paper demonstrated how trauma research can focus on the individual as the unit of analysis rather than traumatic events. PMID:18609032

  5. Bayesian modeling and inference for diagnostic accuracy and probability of disease based on multiple diagnostic biomarkers with and without a perfect reference standard.

    PubMed

    Jafarzadeh, S Reza; Johnson, Wesley O; Gardner, Ian A

    2016-03-15

    The area under the receiver operating characteristic (ROC) curve (AUC) is used as a performance metric for quantitative tests. Although multiple biomarkers may be available for diagnostic or screening purposes, diagnostic accuracy is often assessed individually rather than in combination. In this paper, we consider the interesting problem of combining multiple biomarkers for use in a single diagnostic criterion with the goal of improving the diagnostic accuracy above that of an individual biomarker. The diagnostic criterion created from multiple biomarkers is based on the predictive probability of disease, conditional on given multiple biomarker outcomes. If the computed predictive probability exceeds a specified cutoff, the corresponding subject is allocated as 'diseased'. This defines a standard diagnostic criterion that has its own ROC curve, namely, the combined ROC (cROC). The AUC metric for cROC, namely, the combined AUC (cAUC), is used to compare the predictive criterion based on multiple biomarkers to one based on fewer biomarkers. A multivariate random-effects model is proposed for modeling multiple normally distributed dependent scores. Bayesian methods for estimating ROC curves and corresponding (marginal) AUCs are developed when a perfect reference standard is not available. In addition, cAUCs are computed to compare the accuracy of different combinations of biomarkers for diagnosis. The methods are evaluated using simulations and are applied to data for Johne's disease (paratuberculosis) in cattle. PMID:26415924
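
    A much simplified, non-Bayesian stand-in for the combination idea (synthetic biomarkers, plain logistic regression instead of the paper's multivariate random-effects model, and a perfect reference standard assumed) shows how the combined predictive probability yields a cAUC to compare against single-biomarker AUCs:

        # Combine two biomarkers into a predictive probability and compare AUCs.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(4)
        disease = rng.integers(0, 2, size=500)
        b1 = rng.normal(loc=0.8 * disease, scale=1.0)     # synthetic biomarker 1
        b2 = rng.normal(loc=0.6 * disease, scale=1.0)     # synthetic biomarker 2
        X = np.column_stack([b1, b2])

        pred_prob = LogisticRegression().fit(X, disease).predict_proba(X)[:, 1]
        print("AUC biomarker 1:", round(roc_auc_score(disease, b1), 3))
        print("AUC biomarker 2:", round(roc_auc_score(disease, b2), 3))
        print("combined AUC   :", round(roc_auc_score(disease, pred_prob), 3))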

  6. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data

    NASA Astrophysics Data System (ADS)

    Gloger, Oliver; Tönnies, Klaus; Mensel, Birger; Völzke, Henry

    2015-11-01

    In epidemiological studies as well as in clinical practice the amount of produced medical image data strongly increased in the last decade. In this context organ segmentation in MR volume data gained increasing attention for medical applications. Especially in large-scale population-based studies organ volumetry is highly relevant requiring exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automatized methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework of a two-stepped probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are refined subsequently by using several, extended segmentation strategies. We present a three class-based support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high quality subject-specific parenchyma probability maps. Several refinement strategies including a final shape-based 3D level set segmentation technique are used in subsequent processing modules to segment renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from parenchymal volume, which is important to analyze renal functions. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches.

  7. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data.

    PubMed

    Gloger, Oliver; Tönnies, Klaus; Mensel, Birger; Völzke, Henry

    2015-11-21

    In epidemiological studies as well as in clinical practice the amount of produced medical image data strongly increased in the last decade. In this context organ segmentation in MR volume data gained increasing attention for medical applications. Especially in large-scale population-based studies organ volumetry is highly relevant requiring exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automatized methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework of a two-stepped probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are refined subsequently by using several, extended segmentation strategies. We present a three class-based support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high quality subject-specific parenchyma probability maps. Several refinement strategies including a final shape-based 3D level set segmentation technique are used in subsequent processing modules to segment renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from parenchymal volume, which is important to analyze renal functions. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches. PMID:26509325

  8. In All Probability, Probability is not All

    ERIC Educational Resources Information Center

    Helman, Danny

    2004-01-01

    The national lottery is often portrayed as a game of pure chance with no room for strategy. This misperception seems to stem from the application of probability instead of expectancy considerations, and can be utilized to introduce the statistical concept of expectation.

  9. Estimating the probability of occurrence of earthquakes (M>6) in the Western part of the Corinth rift using fault-based and classical seismotectonic approaches.

    NASA Astrophysics Data System (ADS)

    Boiselet, Aurelien; Scotti, Oona; Lyon-Caen, Hélène

    2014-05-01

    -SISCOR Working Group. On the basis of this consensual logic tree, median probability of occurrences of M>=6 events were computed for the region of study. Time-dependent models (Brownian Passage time and Weibull probability distributions) were also explored. The probability of a M>=6.0 event is found to be greater in the western region compared to the eastern part of the Corinth rift, whether a fault-based or a classical seismotectonic approach is used. Percentile probability estimates are also provided to represent the range of uncertainties in the results. The percentile results show that, in general, probability estimates following the classical approach (based on the definition of seismotectonic source zones), cover the median values estimated following the fault-based approach. On the contrary, the fault-based approach in this region is still affected by a high degree of uncertainty, because of the poor constraints on the 3D geometries of the faults and the high uncertainties in their slip rates.
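
    For readers unfamiliar with time-dependent renewal models of the kind explored above, the conditional probability they produce has the generic form P = [F(t + dt) - F(t)] / [1 - F(t)]. The sketch below uses a Weibull distribution with invented recurrence parameters, not the working group's values:

        # Conditional probability of an event in the next dt years under a Weibull renewal model.
        import math

        def weibull_cdf(t, mean_recurrence, shape):
            scale = mean_recurrence / math.gamma(1.0 + 1.0 / shape)
            return 1.0 - math.exp(-((t / scale) ** shape))

        def conditional_prob(t_elapsed, dt, mean_recurrence=250.0, shape=2.0):
            F0 = weibull_cdf(t_elapsed, mean_recurrence, shape)
            F1 = weibull_cdf(t_elapsed + dt, mean_recurrence, shape)
            return (F1 - F0) / (1.0 - F0)

        for t in (50, 150, 250):
            print(f"elapsed {t:3d} yr -> 30-yr probability {conditional_prob(t, 30):.2%}")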

  10. A Posteriori Transit Probabilities

    NASA Astrophysics Data System (ADS)

    Stevens, Daniel J.; Gaudi, B. Scott

    2013-08-01

    Given the radial velocity (RV) detection of an unseen companion, it is often of interest to estimate the probability that the companion also transits the primary star. Typically, one assumes a uniform distribution for the cosine of the inclination angle i of the companion's orbit. This yields the familiar estimate for the prior transit probability of ~R_*/a, given the primary radius R_* and orbital semimajor axis a, and assuming small companions and a circular orbit. However, the posterior transit probability depends not only on the prior probability distribution of i but also on the prior probability distribution of the companion mass M_c, given a measurement of the product of the two (the minimum mass M_c sin i) from an RV signal. In general, the posterior can be larger or smaller than the prior transit probability. We derive analytic expressions for the posterior transit probability assuming a power-law form for the distribution of true masses, dΓ/dM_c ∝ M_c^α, for integer values -3 <= α <= 3. We show that for low transit probabilities, these probabilities reduce to a constant multiplicative factor f_α of the corresponding prior transit probability, where f_α in general depends on α and an assumed upper limit on the true mass. The prior and posterior probabilities are equal for α = -1. The posterior transit probability is ~1.5 times larger than the prior for α = -3 and is ~4/π times larger for α = -2, but is less than the prior for α >= 0, and can be arbitrarily small for α > 1. We also calculate the posterior transit probability in different mass regimes for two physically-motivated mass distributions of companions around Sun-like stars. We find that for Jupiter-mass planets, the posterior transit probability is roughly equal to the prior probability, whereas the posterior is likely higher for Super-Earths and Neptunes (10 M_⊕ - 30 M_⊕) and Super-Jupiters (3 M_Jup - 10 M_Jup), owing to the predicted steep rise in the mass function toward smaller
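
    The headline numbers above can be reproduced mechanically: the prior transit probability is just R_*/a, and the quoted factors f_α rescale it for the listed power-law indices. The stellar and orbital parameters in this sketch are illustrative:

        # Prior transit probability R_*/a and the posterior implied by the quoted f_alpha factors.
        import math

        R_SUN_M = 6.957e8          # solar radius [m]
        AU_M = 1.496e11            # astronomical unit [m]

        R_star = 1.0 * R_SUN_M     # Sun-like primary (assumed)
        a = 0.05 * AU_M            # hypothetical close-in companion

        prior = R_star / a                                  # ~ R_*/a for a circular orbit
        factors = {-3: 1.5, -2: 4.0 / math.pi, -1: 1.0}     # f_alpha values quoted in the abstract

        print(f"prior transit probability ~ {prior:.3f}")
        for alpha, f in factors.items():
            print(f"alpha = {alpha:+d}: posterior ~ {f * prior:.3f}")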

  11. Children with Significant Hearing Loss: Learning to Listen, Talk, and Read--Evidence-Based Best Practices

    ERIC Educational Resources Information Center

    Martindale, Maura

    2007-01-01

    A considerable body of evidence obtained from studies of children who are deaf and who use cochlear implants has been useful in guiding practices that lead to higher levels of English language proficiency and age-appropriate literacy. Both (a) research conducted at implant centers and (b) educational programs with significant numbers of children…

  12. Automatic Identification and Storage of Significant Points in a Computer-Based Presentation

    ERIC Educational Resources Information Center

    Dickson, Paul; Adrion, W. Richards; Hanson, Allen

    2007-01-01

    We describe an automatic classroom capture system that detects and records significant (stable) points in lectures by sampling and analyzing a sequence of screen capture frames from a PC used for presentations, application demonstrations, etc. The system uses visual inspection techniques to scan the screen capture stream to identify points to…

  13. The Welsh study of mothers and babies: protocol for a population-based cohort study to investigate the clinical significance of defined ultrasound findings of uncertain significance

    PubMed Central

    2014-01-01

    Background Improvement in ultrasound imaging has led to the identification of subtle non-structural markers during the 18 – 20 week fetal anomaly scan, such as echogenic bowel, mild cerebral ventriculomegaly, renal pelvicalyceal dilatation, and nuchal thickening. These markers are estimated to occur in between 0.6% and 4.3% of pregnancies. Their clinical significance, for pregnancy outcomes or childhood morbidity, is largely unknown. The aim of this study is to estimate the prevalence of seven markers in the general obstetric population and establish a cohort of children for longer terms follow-up to assess the clinical significance of these markers. Methods/Design All women receiving antenatal care within six of seven Welsh Health Boards who had an 18 to 20 week ultrasound scan in Welsh NHS Trusts between July 2008 and March 2011 were eligible for inclusion. Data were collected on seven markers (echogenic bowel, cerebral ventriculomegaly, renal pelvicalyceal dilatation, nuchal thickening, cardiac echogenic foci, choroid plexus cysts, and short femur) at the time of 18 – 20 week fetal anomaly scan. Ultrasound records were linked to routinely collected data on pregnancy outcomes (work completed during 2012 and 2013). Images were stored and reviewed by an expert panel. The prevalence of each marker (reported and validated) will be estimated. A projected sample size of 23,000 will allow the prevalence of each marker to be estimated with the following precision: a marker with 0.50% prevalence to within 0.10%; a marker with 1.00% prevalence to within 0.13%; and a marker with 4.50% prevalence to within 0.27%. The relative risk of major congenital abnormalities, stillbirths, pre-term birth and small for gestational age, given the presence of a validated marker, will be reported. Discussion This is a large, prospective study designed to estimate the prevalence of markers in a population-based cohort of pregnant women and to investigate associations with adverse

  14. Implementation of a web based universal exchange and inference language for medicine: Sparse data, probabilities and inference in data mining of clinical data repositories.

    PubMed

    Robson, Barry; Boray, Srinidhi

    2015-11-01

    We extend Q-UEL, our universal exchange language for interoperability and inference in healthcare and biomedicine, to the more traditional fields of public health surveys. These are the type associated with screening, epidemiological and cross-sectional studies, and cohort studies in some cases similar to clinical trials. There is the challenge that there is some degree of split between frequentist notions of probability as (a) classical measures based only on the idea of counting and proportion and on classical biostatistics as used in the above conservative disciplines, and (b) more subjectivist notions of uncertainty, belief, reliability, or confidence often used in automated inference and decision support systems. Samples in the above kind of public health survey are typically small compared with our earlier "Big Data" mining efforts. An issue addressed here is how much impact on decisions should sparse data have. We describe a new Q-UEL compatible toolkit including a data analytics application DiracMiner that also delivers more standard biostatistical results, DiracBuilder that uses its output to build Hyperbolic Dirac Nets (HDN) for decision support, and HDNcoherer that ensures that probabilities are mutually consistent. Use is exemplified by participating in a real word health-screening project, and also by deployment in a industrial platform called the BioIngine, a cognitive computing platform for health management. PMID:26386548

  15. A low false negative filter for detecting rare bird species from short video segments using a probable observation data set-based EKF method.

    PubMed

    Song, Dezhen; Xu, Yiliang

    2010-09-01

    We report a new filter to assist the search for rare bird species. Since a rare bird only appears in front of a camera with very low occurrence (e.g., less than ten times per year) and for a very short duration (e.g., less than a fraction of a second), our algorithm must have a very low false negative rate. We verify the bird body-axis information extracted from the short video segment against known bird flight dynamics. Since a regular extended Kalman filter (EKF) cannot converge due to high measurement error and limited data, we develop a novel probable observation data set (PODS)-based EKF method. The new PODS-EKF searches the measurement error range for all probable observation data that ensure the convergence of the corresponding EKF in a short time frame. The algorithm has been extensively tested using both simulated inputs and real video data of four representative bird species. In the physical experiments, our algorithm has been tested on rock pigeons and red-tailed hawks with 119 motion sequences. The area under the ROC curve is 95.0%. During the one-year search of ivory-billed woodpeckers, the system reduced the raw video data of 29.41 TB to only 146.7 MB (reduction rate 99.9995%). PMID:20388596
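
    The PODS search over the measurement-error range is specific to the paper and is not reproduced here. For orientation, the sketch below shows only the textbook linear Kalman predict/update step that an extended Kalman filter builds on, for a hypothetical 1-D constant-velocity target with assumed noise levels; it is a baseline illustration, not the PODS-EKF itself.

    ```python
    import numpy as np

    def kf_step(x, P, z, dt=1.0, q=0.01, r=4.0):
        """One predict/update cycle of a linear Kalman filter for a 1-D
        constant-velocity state [position, velocity]; q and r are assumed
        process and measurement noise levels."""
        F = np.array([[1.0, dt], [0.0, 1.0]])   # state transition
        H = np.array([[1.0, 0.0]])              # only position is measured
        Q = q * np.eye(2)                       # process noise covariance
        R = np.array([[r]])                     # measurement noise covariance
        # Predict
        x = F @ x
        P = F @ P @ F.T + Q
        # Update
        y = z - H @ x                           # innovation
        S = H @ P @ H.T + R
        K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
        x = x + K @ y
        P = (np.eye(2) - K @ H) @ P
        return x, P

    x, P = np.array([0.0, 0.0]), np.eye(2) * 10.0
    for z in [1.1, 2.3, 2.9, 4.2, 5.1]:         # noisy position measurements
        x, P = kf_step(x, P, np.array([z]))
    print(x)                                    # estimated [position, velocity]
    ```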

  16. Beyond No Significant Differences: A Closer Look at the Educational Impact of Computer-Based Instruction

    ERIC Educational Resources Information Center

    Mandernach, B. Jean

    2006-01-01

    There is a host of research examining the equivalence of alternative modes of technology-facilitated educational delivery (such as computer-based or online instruction) and traditional classroom instruction. While various studies have promoted each of these modalities for specific populations or topic areas, the bulk of research supports relative…

  17. Innovations in individual feature history management - The significance of feature-based temporal model

    USGS Publications Warehouse

    Choi, J.; Seong, J.C.; Kim, B.; Usery, E.L.

    2008-01-01

    A feature relies on three dimensions (space, theme, and time) for its representation. Even though spatiotemporal models have been proposed, they have principally focused on the spatial changes of a feature. In this paper, a feature-based temporal model is proposed to represent the changes of both space and theme independently. The proposed model modifies the ISO's temporal schema and adds a new explicit temporal relationship structure that stores temporal topological relationships together with the ISO's temporal primitives of a feature in order to keep track of feature history. The explicit temporal relationship can enhance query performance on feature history by removing topological comparison during the query process. Further, a prototype system has been developed to test the proposed feature-based temporal model by querying land parcel history in Athens, Georgia. The result of a temporal query on individual feature history shows the efficiency of the explicit temporal relationship structure. © Springer Science+Business Media, LLC 2007.

  18. Significant Performance Enhancement in Asymmetric Supercapacitors based on Metal Oxides, Carbon nanotubes and Neutral Aqueous Electrolyte

    PubMed Central

    Singh, Arvinder; Chandra, Amreesh

    2015-01-01

    Amongst the materials being investigated for supercapacitor electrodes, carbon-based materials are the most widely studied. However, pure carbon materials suffer from inherent physical processes which limit the maximum specific energy and power that can be achieved in an energy storage device. Therefore, use of carbon-based composites with suitable nano-materials is attaining prominence. The synergistic effect between the pseudocapacitive nanomaterials (high specific energy) and carbon (high specific power) is expected to deliver the desired improvements. We report the fabrication of a high-capacitance asymmetric supercapacitor based on electrodes of composites of SnO2 and V2O5 with multiwall carbon nanotubes and a neutral 0.5 M Li2SO4 aqueous electrolyte. The advantages of the fabricated asymmetric supercapacitors are compared with the results published in the literature. The widened operating voltage window is due to the higher over-potential of electrolyte decomposition and a large difference in the work functions of the used metal oxides. The charge-balanced device returns a specific capacitance of ~198 F g−1 with a corresponding specific energy of ~89 Wh kg−1 at 1 A g−1. The proposed composite systems have shown great potential in fabricating high performance supercapacitors. PMID:26494197

  19. Significant Performance Enhancement in Asymmetric Supercapacitors based on Metal Oxides, Carbon nanotubes and Neutral Aqueous Electrolyte

    NASA Astrophysics Data System (ADS)

    Singh, Arvinder; Chandra, Amreesh

    2015-10-01

    Amongst the materials being investigated for supercapacitor electrodes, carbon-based materials are the most widely studied. However, pure carbon materials suffer from inherent physical processes which limit the maximum specific energy and power that can be achieved in an energy storage device. Therefore, use of carbon-based composites with suitable nano-materials is attaining prominence. The synergistic effect between the pseudocapacitive nanomaterials (high specific energy) and carbon (high specific power) is expected to deliver the desired improvements. We report the fabrication of a high-capacitance asymmetric supercapacitor based on electrodes of composites of SnO2 and V2O5 with multiwall carbon nanotubes and a neutral 0.5 M Li2SO4 aqueous electrolyte. The advantages of the fabricated asymmetric supercapacitors are compared with the results published in the literature. The widened operating voltage window is due to the higher over-potential of electrolyte decomposition and a large difference in the work functions of the used metal oxides. The charge-balanced device returns a specific capacitance of ~198 F g-1 with a corresponding specific energy of ~89 Wh kg-1 at 1 A g-1. The proposed composite systems have shown great potential in fabricating high performance supercapacitors.

  20. Determining the Probability of Violating Upper-Level Wind Constraints for the Launch of Minuteman III Ballistic Missiles at Vandenberg Air Force Base

    NASA Technical Reports Server (NTRS)

    Shafer, Jaclyn A.; Brock, Tyler M.

    2013-01-01

    The 30th Operational Support Squadron Weather Flight (30 OSSWF) provides comprehensive weather services to the space program at Vandenberg Air Force Base (VAFB) in California. One of their responsibilities is to monitor upper-level winds to ensure safe launch operations of the Minuteman III ballistic missile. The 30 OSSWF requested the Applied Meteorology Unit (AMU) analyze VAFB sounding data to determine the probability of violating (PoV) upper-level thresholds for wind speed and shear constraints specific to this launch vehicle, and to develop a graphical user interface (GUI) that will calculate the PoV of each constraint on the day of launch. The AMU suggested also including forecast sounding data from the Rapid Refresh (RAP) model. This would provide further insight for the launch weather officers (LWOs) when determining if a wind constraint violation will occur over the next few hours, and help to improve the overall upper winds forecast on launch day.

  1. A Citation-Based Analysis and Review of Significant Papers on Timing and Time Perception

    PubMed Central

    Teki, Sundeep

    2016-01-01

    Time is an important dimension of brain function, but little is yet known about the underlying cognitive principles and neurobiological mechanisms. The field of timing and time perception has witnessed tremendous growth and multidisciplinary interest in recent years with the advent of modern neuroimaging and neurophysiological approaches. In this article, I used a data mining approach to analyze the timing literature published by a select group of researchers (n = 202) during the period 2000–2015 and highlight important reviews as well as empirical articles that meet the criterion of a minimum of 100 citations. The qualifying articles (n = 150) are listed in a table along with key details such as number of citations, names of authors, year and journal of publication as well as a short summary of the findings of each study. The results of such a data-driven approach to literature review not only serve as a useful resource to any researcher interested in timing, but also provide a means to evaluate key papers that have significantly influenced the field and summarize recent progress and popular research trends in the field. Additionally, such analyses provide food for thought about future scientific directions and raise important questions about improving organizational structures to boost open science and progress in the field. I discuss exciting avenues for future research that have the potential to significantly advance our understanding of the neurobiology of timing, and propose the establishment of a new society, the Timing Research Forum, to promote open science and collaborative work within the highly diverse and multidisciplinary community of researchers in the field of timing and time perception. PMID:27471445

  2. A Citation-Based Analysis and Review of Significant Papers on Timing and Time Perception.

    PubMed

    Teki, Sundeep

    2016-01-01

    Time is an important dimension of brain function, but little is yet known about the underlying cognitive principles and neurobiological mechanisms. The field of timing and time perception has witnessed tremendous growth and multidisciplinary interest in recent years with the advent of modern neuroimaging and neurophysiological approaches. In this article, I used a data mining approach to analyze the timing literature published by a select group of researchers (n = 202) during the period 2000-2015 and highlight important reviews as well as empirical articles that meet the criterion of a minimum of 100 citations. The qualifying articles (n = 150) are listed in a table along with key details such as number of citations, names of authors, year and journal of publication as well as a short summary of the findings of each study. The results of such a data-driven approach to literature review not only serve as a useful resource to any researcher interested in timing, but also provide a means to evaluate key papers that have significantly influenced the field and summarize recent progress and popular research trends in the field. Additionally, such analyses provide food for thought about future scientific directions and raise important questions about improving organizational structures to boost open science and progress in the field. I discuss exciting avenues for future research that have the potential to significantly advance our understanding of the neurobiology of timing, and propose the establishment of a new society, the Timing Research Forum, to promote open science and collaborative work within the highly diverse and multidisciplinary community of researchers in the field of timing and time perception. PMID:27471445

  3. Single-case probabilities

    NASA Astrophysics Data System (ADS)

    Miller, David

    1991-12-01

    The propensity interpretation of probability, bred by Popper in 1957 (K. R. Popper, in Observation and Interpretation in the Philosophy of Physics, S. Körner, ed. (Butterworth, London, 1957, and Dover, New York, 1962), p. 65; reprinted in Popper Selections, D. W. Miller, ed. (Princeton University Press, Princeton, 1985), p. 199) from pure frequency stock, is the only extant objectivist account that provides any proper understanding of single-case probabilities as well as of probabilities in ensembles and in the long run. In Sec. 1 of this paper I recall salient points of the frequency interpretations of von Mises and of Popper himself, and in Sec. 2 I filter out from Popper's numerous expositions of the propensity interpretation its most interesting and fertile strain. I then go on to assess it. First I defend it, in Sec. 3, against recent criticisms (P. Humphreys, Philos. Rev. 94, 557 (1985); P. Milne, Erkenntnis 25, 129 (1986)) to the effect that conditional [or relative] probabilities, unlike absolute probabilities, can only rarely be made sense of as propensities. I then challenge its predominance, in Sec. 4, by outlining a rival theory: an irreproachably objectivist theory of probability, fully applicable to the single case, that interprets physical probabilities as instantaneous frequencies.

  4. The significance of zero space vector placement for carrier based PWM schemes

    SciTech Connect

    Holmes, D.G.

    1995-12-31

    Pulse Width Modulation has been one of the most intensively investigated areas of power electronics for many years now, and the number and combination of permutations seem to be endless. However, a general hierarchical consensus appears to have emerged from this work which ranks space vector modulation techniques, regular sampled modulation and sine-triangle modulation strategies in decreasing order of merit based on harmonic performance. However, what has not been clearly identified is why space vector modulation should lead to a reduced harmonic current ripple compared to regular sampled modulation, especially since it is straightforward to show that they produce identical low frequency fundamental components. This paper addresses this issue, by showing how it is the placement of the zero space vector component within the carrier interval that determines the harmonic performance of the modulation strategy, rather than any intrinsic differences between the various methods of calculating the switching instances.

  5. Methodology and Significance of Microsensor-based Oxygen Mapping in Plant Seeds – an Overview

    PubMed Central

    Rolletschek, Hardy; Stangelmayer, Achim; Borisjuk, Ljudmilla

    2009-01-01

    Oxygen deficiency is commonplace in seeds, and limits both their development and their germination. It is, therefore, of considerable relevance to crop production. While the underlying physiological basis of seed hypoxia has been known for some time, the lack of any experimental means of measuring the global or localized oxygen concentration within the seed has hampered further progress in this research area. The development of oxygen-sensitive microsensors now offers the capability to determine the localized oxygen status within a seed, and to study its dynamic adjustment both to changes in the ambient environment, and to the seed's developmental stage. This review illustrates the use of oxygen microsensors in seed research, and presents an overview of existing data with an emphasis on crop species. Oxygen maps, both static and dynamic, should serve to increase our basic understanding of seed physiology, as well as to facilitate upcoming breeding and biotechnology-based approaches for crop improvement. PMID:22412307

  6. The Significant Frequency and Impact of Stealth (Nonviolent) Gender-Based Abuse Among College Women.

    PubMed

    Belknap, Joanne; Sharma, Nitika

    2014-05-29

    The prevalence, incidence, and impact of the gender-based abuse (GBA) of college women have been increasingly documented since the 1980s, with growing precision in the measurements and expanding identification of tactics. Although there is an obvious class bias in focusing on college women (compared to women of similar ages not attending college), it is important to address GBA among this population as they are at serious risk of sexual abuse (particularly incapacitated rape), intimate partner abuse (IPA), and stalking. This article addresses the stealth nature of the nonviolent GBAs of college women and how these abuses frequently operate under the radar of acknowledgment by society, the abusers, campus officials, the criminal legal system, and sometimes, the survivors. PMID:24874993

  7. Significant disparity in base and sugar damage in DNA resulting from neutron and electron irradiation.

    PubMed

    Pang, Dalong; Nico, Jeffrey S; Karam, Lisa; Timofeeva, Olga; Blakely, William F; Dritschilo, Anatoly; Dizdaroglu, Miral; Jaruga, Pawel

    2014-11-01

    In this study, a comparison of the effects of neutron and electron irradiation of aqueous DNA solutions was investigated to characterize potential neutron signatures in DNA damage induction. Ionizing radiation generates numerous lesions in DNA, including base and sugar lesions, lesions involving base-sugar combinations (e.g. 8,5'-cyclopurine-2'-deoxynucleosides) and DNA-protein cross-links, as well as single- and double-strand breaks and clustered damage. The characteristics of damage depend on the linear energy transfer (LET) of the incident radiation. Here we investigated DNA damage using aqueous DNA solutions in 10 mmol/l phosphate buffer from 0-80 Gy by low-LET electrons (10 Gy/min) and the specific high-LET (∼0.16 Gy/h) neutrons formed by spontaneous (252)Cf decay fissions. 8-hydroxy-2'-deoxyguanosine (8-OH-dG), (5'R)-8,5'-cyclo-2'-deoxyadenosine (R-cdA) and (5'S)-8,5'-cyclo-2'-deoxyadenosine (S-cdA) were quantified using liquid chromatography-isotope-dilution tandem mass spectrometry to demonstrate a linear dose dependence for induction of 8-OH-dG by both types of radiation, although neutron irradiation was ∼50% less effective at a given dose compared with electron irradiation. Electron irradiation resulted in an exponential increase in S-cdA and R-cdA with dose, whereas neutron irradiation induced substantially less damage and the amount of damage increased only gradually with dose. Addition of 30 mmol/l 2-amino-2-(hydroxymethyl)-1,3-propanediol (TRIS), a free radical scavenger, to the DNA solution before irradiation reduced lesion induction to background levels for both types of radiation. These results provide insight into the mechanisms of DNA damage by high-LET (252)Cf decay neutrons and low-LET electrons, leading to enhanced understanding of the potential biological effects of these types of irradiation. PMID:25034731

  8. Control range: a controllability-based index for node significance in directed networks

    NASA Astrophysics Data System (ADS)

    Wang, Bingbo; Gao, Lin; Gao, Yong

    2012-04-01

    While a large number of methods for module detection have been developed for undirected networks, it is difficult to adapt them to handle directed networks due to the lack of consensus criteria for measuring the node significance in a directed network. In this paper, we propose a novel structural index, the control range, motivated by recent studies on the structural controllability of large-scale directed networks. The control range of a node quantifies the size of the subnetwork that the node can effectively control. A related index, called the control range similarity, is also introduced to measure the structural similarity between two nodes. When applying the index of control range to several real-world and synthetic directed networks, it is observed that the control range of the nodes is mainly influenced by the network's degree distribution and that nodes with a low degree may have a high control range. We use the index of control range similarity to detect and analyze functional modules in glossary networks and the enzyme-centric network of homo sapiens. Our results, as compared with other approaches to module detection such as modularity optimization algorithm, dynamic algorithm and clique percolation method, indicate that the proposed indices are effective and practical in depicting structural and modular characteristics of sparse directed networks.

  9. Significance of Heme-Based Respiration in Meat Spoilage Caused by Leuconostoc gasicomitatum

    PubMed Central

    Johansson, Per; Kostiainen, Olli; Nieminen, Timo; Schmidt, Georg; Somervuo, Panu; Mohsina, Marzia; Vanninen, Paula; Auvinen, Petri; Björkroth, Johanna

    2013-01-01

    Leuconostoc gasicomitatum is a psychrotrophic lactic acid bacterium (LAB) which causes spoilage in cold-stored modified-atmosphere-packaged (MAP) meat products. In addition to the fermentative metabolism, L. gasicomitatum is able to respire when exogenous heme and oxygen are available. In this study, we investigated the respiration effects on growth rate, biomass, gene expression, and volatile organic compound (VOC) production in laboratory media and pork loin. The meat samples were evaluated by a sensory panel every second or third day for 29 days. We observed that functional respiration increased the growth (rate and yield) of L. gasicomitatum in laboratory media with added heme and in situ meat with endogenous heme. Respiration increased enormously (up to 2,600-fold) the accumulation of acetoin and diacetyl, which are buttery off-odor compounds in meat. Our transcriptome analyses showed that the gene expression patterns were quite similar, irrespective of whether respiration was turned off by excluding heme from the medium or mutating the cydB gene, which is essential in the respiratory chain. The respiration-based growth of L. gasicomitatum in meat was obtained in terms of population development and subsequent development of sensory characteristics. Respiration is thus a key factor explaining why L. gasicomitatum is so well adapted in high-oxygen packed meat. PMID:23204416

  10. Women who know their place: sex-based differences in spatial abilities and their evolutionary significance.

    PubMed

    Burke, Ariane; Kandler, Anne; Good, David

    2012-06-01

    Differences between men and women in the performance of tests designed to measure spatial abilities are explained by evolutionary psychologists in terms of adaptive design. The Hunter-Gatherer Theory of Spatial Ability suggests that the adoption of a hunter-gatherer lifestyle (assuming a sexual division of labor) created differential selective pressure on the development of spatial skills in men and women and, therefore, cognitive differences between the sexes. Here, we examine a basic spatial skill, wayfinding (the ability to plan routes and navigate a landscape), in men and women in a natural, real-world setting as a means of testing the proposition that sex-based differences in spatial ability exist outside of the laboratory. Our results indicate that when physical differences are accounted for, men and women with equivalent experience perform equally well at complex navigation tasks in a real-world setting. We conclude that experience, gendered patterns of activity, and self-assessment are contributing factors in producing previously reported differences in spatial ability. PMID:22648664

  11. Determining the Probability of Violating Upper-Level Wind Constraints for the Launch of Minuteman III Ballistic Missiles at Vandenberg Air Force Base

    NASA Technical Reports Server (NTRS)

    Shafer, Jaclyn A.; Brock, Tyler M.

    2012-01-01

    The 30th Operational Support Squadron Weather Flight (30 OSSWF) provides comprehensive weather services to the space program at Vandenberg Air Force Base (VAFB) in California. One of their responsibilities is to monitor upper-level winds to ensure safe launch operations of the Minuteman III ballistic missile. The 30 OSSWF tasked the Applied Meteorology Unit (AMU) to analyze VAFB sounding data with the goal of determining the probability of violating (PoV) their upper-level thresholds for wind speed and shear constraints specific to this launch vehicle, and to develop a tool that will calculate the PoV of each constraint on the day of launch. In order to calculate the probability of exceeding each constraint, the AMU collected and analyzed historical data from VAFB. The historical sounding data were retrieved from the National Oceanic and Atmospheric Administration Earth System Research Laboratory archive for the years 1994-2011 and then stratified into four sub-seasons: January-March, April-June, July-September, and October-December. The AMU determined the theoretical distributions that best fit the maximum wind speed and maximum wind shear datasets and applied this information when calculating the averages and standard deviations needed for the historical and real-time PoV calculations. In addition, the AMU included forecast sounding data from the Rapid Refresh model. This information provides further insight for the launch weather officers (LWOs) when determining if a wind constraint violation will occur over the next few hours on the day of launch. The AMU developed an interactive graphical user interface (GUI) in Microsoft Excel using Visual Basic for Applications. The GUI displays the critical sounding data easily and quickly for LWOs on day of launch. This tool will replace the existing one used by the 30 OSSWF, assist the LWOs in determining the probability of exceeding specific wind threshold values, and help to improve the overall upper winds forecast for
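
    The abstract does not publish the fitted distribution parameters or the Minuteman III wind thresholds, so the sketch below only illustrates the final step: once a theoretical distribution has been fit to the seasonal maximum wind speed, the probability of violation is the upper-tail probability beyond the constraint threshold. A Gaussian fit and all numbers are assumptions for illustration.

    ```python
    from statistics import NormalDist

    # Hypothetical values only: the report does not publish the fitted parameters
    # or the vehicle's wind-speed threshold.
    mean_wind = 35.0   # fitted mean of seasonal maximum wind speed (m/s), assumed
    std_wind = 8.0     # fitted standard deviation (m/s), assumed
    threshold = 50.0   # constraint threshold (m/s), assumed

    # Probability of violation = P(maximum wind speed exceeds the threshold).
    pov = 1.0 - NormalDist(mu=mean_wind, sigma=std_wind).cdf(threshold)
    print(f"Probability of violation: {pov:.1%}")
    ```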

  12. Human Papillomavirus prevalence and probable first effects of vaccination in 20 to 25 year-old women in Germany: a population-based cross-sectional study via home-based self-sampling

    PubMed Central

    2014-01-01

    Background Estimates of Human Papillomavirus (HPV) prevalence in a population prior to and after HPV vaccine introduction are essential to evaluate the short-term impact of vaccination. Methods Between 2010 and 2012 we conducted a population-based cross-sectional study in Germany to determine HPV prevalence, genotype distribution and risk factors for HPV-infection in women aged 20-25 years. Women were recruited by a two-step cluster sampling approach. A home-based self-collection of cervicovaginal lavages was used. Specimens were analysed using a general primer GP5+/GP6+-based polymerase chain reaction and genotyped for 18 high-risk and 6 low-risk HPV- strains by Luminex-based multiplexed genotyping. Results Among 787 included women, 512 were not vaccinated against HPV. In the non-vaccinated population, HPV prevalence of any type was 38.1%, with HPV 16 (19.5%) being the most prevalent genotype. Prevalence of any high-risk type was 34.4%, and in 17.4% of all women, more than one genotype was identified. A higher number of lifetime sexual partners and low educational status were independently associated with HPV-infection. In 223 vaccinated women, prevalence of HPV 16/18 was significantly lower compared to non-vaccinated women (13.9% vs. 22.5%, p = 0.007). When stratifying by age groups, this difference was only significant in women aged 20-21 years, who at time of vaccination were on average younger and had less previous sexual contacts than women aged 22-25 years. Conclusion We demonstrate a high prevalence of high-risk HPV genotypes in non-vaccinated women living in Germany that can be potentially prevented by vaccination. Probable first vaccination effects on the HPV prevalence were observed in women who were vaccinated at younger age. This finding reinforces the recommendation to vaccinate girls in early adolescence. PMID:24552260

  13. Probability with Roulette

    ERIC Educational Resources Information Center

    Marshall, Jennings B.

    2007-01-01

    This article describes how roulette can be used to teach basic concepts of probability. Various bets are used to illustrate the computation of expected value. A betting system shows variations in patterns that often appear in random events.
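
    As a companion to the abstract, the sketch below shows the kind of expected-value computation the article describes, assuming the American double-zero wheel (38 pockets); the article's own examples and betting system are not reproduced.

    ```python
    from fractions import Fraction

    def expected_value(ways_to_win, payout, pockets=38):
        """Expected net gain of a 1-unit bet that wins `ways_to_win` of the
        `pockets` equally likely outcomes and pays `payout` to 1."""
        p_win = Fraction(ways_to_win, pockets)
        return p_win * payout - (1 - p_win) * 1

    print(expected_value(1, 35))        # straight-up bet, pays 35:1  -> -1/19
    print(expected_value(18, 1))        # red/black, pays 1:1         -> -1/19
    print(float(expected_value(1, 35))) # about -0.053 units lost per unit wagered
    ```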

  14. Microarray Based Gene Expression Analysis of Murine Brown and Subcutaneous Adipose Tissue: Significance with Human

    PubMed Central

    Boparai, Ravneet K.; Kondepudi, Kanthi Kiran; Mantri, Shrikant; Bishnoi, Mahendra

    2015-01-01

    Background Two types of adipose tissues, white (WAT) and brown (BAT), are found in mammals. Increasingly novel strategies are being proposed for the treatment of obesity and its associated complications by altering amount and/or activity of BAT using mouse models. Methodology/Principal Findings The present study was designed to: (a) investigate the differential expression of genes in LACA mice subcutaneous WAT (sWAT) and BAT using mouse DNA microarray, (b) compare mouse differential gene expression with previously published human data, to understand any inter-species differences between the two, and (c) make a comparative assessment with the C57BL/6 mouse strain. In mouse microarray studies, over 7003, 1176 and 401 probe sets showed more than two-fold, five-fold and ten-fold change respectively in differential expression between murine BAT and WAT. Microarray data were validated using quantitative RT-PCR of key genes showing high expression in BAT (Fabp3, Ucp1, Slc27a1) and sWAT (Ms4a1, H2-Ob, Bank1) or showing relatively low expression in BAT (Pgk1, Cox6b1) and sWAT (Slc20a1, Cd74). Multi-omic pathway analysis was employed to understand possible links between the organisms. When murine two-fold data were compared with published human BAT and sWAT data, 90 genes showed parallel differential expression in both mouse and human. Out of these 90 genes, 46 showed the same pattern of differential expression whereas the pattern was opposite for the remaining 44 genes. Based on our microarray results and their comparison with human data, we were able to identify genes (targets) (a) which can be studied in mouse model systems to extrapolate results to human (b) where caution should be exercised before extrapolation of murine data to human. Conclusion Our study provides evidence for inter species (mouse vs human) differences in differential gene expression between sWAT and BAT. Critical understanding of this data may help in development of novel ways to engineer one form of adipose

  15. Significant disparity in base and sugar damage in DNA resulting from neutron and electron irradiation

    PubMed Central

    Pang, Dalong; Nico, Jeffrey S.; Karam, Lisa; Timofeeva, Olga; Blakely, William F.; Dritschilo, Anatoly; Dizdaroglu, Miral; Jaruga, Pawel

    2014-01-01

    In this study, a comparison of the effects of neutron and electron irradiation of aqueous DNA solutions was investigated to characterize potential neutron signatures in DNA damage induction. Ionizing radiation generates numerous lesions in DNA, including base and sugar lesions, lesions involving base–sugar combinations (e.g. 8,5′-cyclopurine-2′-deoxynucleosides) and DNA–protein cross-links, as well as single- and double-strand breaks and clustered damage. The characteristics of damage depend on the linear energy transfer (LET) of the incident radiation. Here we investigated DNA damage using aqueous DNA solutions in 10 mmol/l phosphate buffer from 0–80 Gy by low-LET electrons (10 Gy/min) and the specific high-LET (∼0.16 Gy/h) neutrons formed by spontaneous 252Cf decay fissions. 8-hydroxy-2′-deoxyguanosine (8-OH-dG), (5′R)-8,5′-cyclo-2′-deoxyadenosine (R-cdA) and (5′S)-8,5′-cyclo-2′-deoxyadenosine (S-cdA) were quantified using liquid chromatography–isotope-dilution tandem mass spectrometry to demonstrate a linear dose dependence for induction of 8-OH-dG by both types of radiation, although neutron irradiation was ∼50% less effective at a given dose compared with electron irradiation. Electron irradiation resulted in an exponential increase in S-cdA and R-cdA with dose, whereas neutron irradiation induced substantially less damage and the amount of damage increased only gradually with dose. Addition of 30 mmol/l 2-amino-2-(hydroxymethyl)-1,3-propanediol (TRIS), a free radical scavenger, to the DNA solution before irradiation reduced lesion induction to background levels for both types of radiation. These results provide insight into the mechanisms of DNA damage by high-LET 252Cf decay neutrons and low-LET electrons, leading to enhanced understanding of the potential biological effects of these types of irradiation. PMID:25034731

  16. Launch Collision Probability

    NASA Technical Reports Server (NTRS)

    Bollenbacher, Gary; Guptill, James D.

    1999-01-01

    This report analyzes the probability of a launch vehicle colliding with one of the nearly 10,000 tracked objects orbiting the Earth, given that an object on a near-collision course with the launch vehicle has been identified. Knowledge of the probability of collision throughout the launch window can be used to avoid launching at times when the probability of collision is unacceptably high. The analysis in this report assumes that the positions of the orbiting objects and the launch vehicle can be predicted as a function of time and therefore that any tracked object which comes close to the launch vehicle can be identified. The analysis further assumes that the position uncertainty of the launch vehicle and the approaching space object can be described with position covariance matrices. With these and some additional simplifying assumptions, a closed-form solution is developed using two approaches. The solution shows that the probability of collision is a function of position uncertainties, the size of the two potentially colliding objects, and the nominal separation distance at the point of closest approach. The impact of the simplifying assumptions on the accuracy of the final result is assessed and the application of the results to the Cassini mission, launched in October 1997, is described. Other factors that affect the probability of collision are also discussed. Finally, the report offers alternative approaches that can be used to evaluate the probability of collision.
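
    The report's closed-form solution is not reproduced here. The sketch below is a numerical stand-in under common simplifying assumptions: the combined position uncertainty is treated as a two-dimensional Gaussian in the encounter plane, the two objects are replaced by a single hard-body circle, and the probability of collision is the Gaussian probability mass inside that circle. All numbers are illustrative.

    ```python
    import numpy as np

    def collision_probability(miss, sigma_x, sigma_y, hard_body_radius, n=400):
        """Probability that the relative position falls inside a disk of radius
        hard_body_radius centred at the origin, when the relative position is
        Gaussian with mean `miss` and independent sigmas in the encounter plane.
        Simple grid integration; a sketch, not the report's closed form."""
        r = hard_body_radius
        xs = np.linspace(-r, r, n)
        ys = np.linspace(-r, r, n)
        X, Y = np.meshgrid(xs, ys)
        inside = X**2 + Y**2 <= r**2
        pdf = (np.exp(-((X - miss[0])**2) / (2 * sigma_x**2)
                      - ((Y - miss[1])**2) / (2 * sigma_y**2))
               / (2 * np.pi * sigma_x * sigma_y))
        cell = (xs[1] - xs[0]) * (ys[1] - ys[0])
        return float((pdf * inside).sum() * cell)

    # Illustrative values (metres): 1 km nominal miss distance, 300 m and 200 m
    # position sigmas, 10 m combined hard-body radius.
    print(collision_probability((1000.0, 0.0), 300.0, 200.0, 10.0))
    ```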

  17. Normal tissue complication probability model parameter estimation for xerostomia in head and neck cancer patients based on scintigraphy and quality of life assessments

    PubMed Central

    2012-01-01

    Background With advances in modern radiotherapy (RT), many patients with head and neck (HN) cancer can be effectively cured. However, xerostomia is a common complication in patients after RT for HN cancer. The purpose of this study was to use the Lyman–Kutcher–Burman (LKB) model to derive parameters for the normal tissue complication probability (NTCP) for xerostomia based on scintigraphy assessments and quality of life (QoL) questionnaires. We performed validation tests of the Quantitative Analysis of Normal Tissue Effects in the Clinic (QUANTEC) guidelines against prospectively collected QoL and salivary scintigraphic data. Methods Thirty-one patients with HN cancer were enrolled. Salivary excretion factors (SEFs) measured by scintigraphy and QoL data from self-reported questionnaires were used for NTCP modeling to describe the incidence of grade 3+ xerostomia. The NTCP parameters estimated from the QoL and SEF datasets were compared. Model performance was assessed using Pearson’s chi-squared test, Nagelkerke’s R2, the area under the receiver operating characteristic curve, and the Hosmer–Lemeshow test. The negative predictive value (NPV) was checked for the rate of correctly predicting the lack of incidence. Pearson’s chi-squared test was used to test the goodness of fit and association. Results Using the LKB NTCP model and assuming n=1, the dose for uniform irradiation of the whole or partial volume of the parotid gland that results in 50% probability of a complication (TD50) and the slope of the dose–response curve (m) were determined from the QoL and SEF datasets, respectively. The NTCP-fitted parameters for local disease were TD50=43.6 Gy and m=0.18 with the SEF data, and TD50=44.1 Gy and m=0.11 with the QoL data. The rate of grade 3+ xerostomia for treatment plans meeting the QUANTEC guidelines was specifically predicted, with a NPV of 100%, using either the QoL or SEF dataset. Conclusions Our study shows the agreement between the NTCP
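
    For readers unfamiliar with the model, the sketch below evaluates the standard Lyman-Kutcher-Burman dose response with the SEF-based parameters reported in the abstract (TD50 = 43.6 Gy, m = 0.18, n = 1, so the gEUD reduces to the mean parotid dose). The DVH numbers are hypothetical, and the paper's fitting procedure itself is not reproduced.

    ```python
    from statistics import NormalDist

    def geud(doses, volumes, n=1.0):
        """Generalized equivalent uniform dose; with n = 1 this is the mean dose."""
        total = sum(volumes)
        return sum(v / total * d ** (1.0 / n) for d, v in zip(doses, volumes)) ** n

    def lkb_ntcp(eud, td50, m):
        """Lyman-Kutcher-Burman NTCP: standard normal CDF of (EUD - TD50)/(m * TD50)."""
        return NormalDist().cdf((eud - td50) / (m * td50))

    td50, m = 43.6, 0.18   # SEF-based parameters reported in the abstract (n = 1)

    # Hypothetical parotid DVH bins (dose in Gy, relative volume), for illustration.
    doses = [10.0, 25.0, 40.0, 55.0]
    volumes = [0.4, 0.3, 0.2, 0.1]
    eud = geud(doses, volumes, n=1.0)
    print(f"mean parotid dose = {eud:.1f} Gy, NTCP = {lkb_ntcp(eud, td50, m):.1%}")
    ```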

  18. GIS-based estimation of the winter storm damage probability in forests: a case study from Baden-Wuerttemberg (Southwest Germany).

    PubMed

    Schindler, Dirk; Grebhan, Karin; Albrecht, Axel; Schönborn, Jochen; Kohnle, Ulrich

    2012-01-01

    Data on storm damage attributed to the two high-impact winter storms 'Wiebke' (28 February 1990) and 'Lothar' (26 December 1999) were used for GIS-based estimation and mapping (in a 50 × 50 m resolution grid) of the winter storm damage probability (P(DAM)) for the forests of the German federal state of Baden-Wuerttemberg (Southwest Germany). The P(DAM)-calculation was based on weights of evidence (WofE) methodology. A combination of information on forest type, geology, soil type, soil moisture regime, and topographic exposure, as well as maximum gust wind speed field was used to compute P(DAM) across the entire study area. Given the condition that maximum gust wind speed during the two storm events exceeded 35 m s(-1), the highest P(DAM) values computed were primarily where coniferous forest grows in severely exposed areas on temporarily moist soils on bunter sandstone formations. Such areas are found mainly in the mountainous ranges of the northern Black Forest, the eastern Forest of Odes, in the Virngrund area, and in the southwestern Alpine Foothills. PMID:21207068
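
    The study's grid-cell counts are not given in the abstract, so the sketch below only illustrates the weights-of-evidence quantities involved: for one binary evidence layer, the positive and negative weights are log ratios of the conditional probabilities of observing the evidence given damage and given no damage, and their difference is the contrast. The counts are invented for illustration.

    ```python
    import math

    def weights_of_evidence(n_b_d, n_b_nd, n_nb_d, n_nb_nd):
        """Positive/negative weights and contrast for one binary evidence layer.
        n_b_d:   cells with evidence present and damage present
        n_b_nd:  cells with evidence present and no damage
        n_nb_d:  cells with evidence absent and damage present
        n_nb_nd: cells with evidence absent and no damage"""
        p_b_d = n_b_d / (n_b_d + n_nb_d)       # P(evidence | damage)
        p_b_nd = n_b_nd / (n_b_nd + n_nb_nd)   # P(evidence | no damage)
        w_plus = math.log(p_b_d / p_b_nd)
        w_minus = math.log((1 - p_b_d) / (1 - p_b_nd))
        return w_plus, w_minus, w_plus - w_minus

    # Hypothetical counts for a single layer (e.g. "coniferous forest"); the
    # study's actual counts are not published in the abstract.
    w_plus, w_minus, contrast = weights_of_evidence(800, 4000, 200, 15000)
    print(f"W+ = {w_plus:.2f}, W- = {w_minus:.2f}, contrast = {contrast:.2f}")
    ```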

  19. Measurement of multi-pollutant and multi-pathway exposures in a probability-based sample of children: practical strategies for effective field studies.

    PubMed

    Adgate, J L; Clayton, C A; Quackenboss, J J; Thomas, K W; Whitmore, R W; Pellizzari, E D; Lioy, P J; Shubat, P; Stroebel, C; Freeman, N C; Sexton, K

    2000-01-01

    The purpose of this manuscript is to describe the practical strategies developed for the implementation of the Minnesota Children's Pesticide Exposure Study (MNCPES), which is one of the first probability-based samples of multi-pathway and multi-pesticide exposures in children. The primary objective of MNCPES was to characterize children's exposure to selected pesticides through a combination of questionnaires, personal exposure measurements (i.e., air, duplicate diet, hand rinse), and complementary monitoring of biological samples (i.e., pesticide metabolites in urine), environmental samples (i.e., residential indoor/outdoor air, drinking water, dust on residential surfaces, soil), and children's activity patterns. A cross-sectional design employing a stratified random sample was used to identify homes with age-eligible children and screen residences to facilitate oversampling of households with higher potential exposures. Numerous techniques were employed in the study, including in-person contact by locally based interviewers, brief and highly focused home visits, graduated subject incentives, and training of parents and children to assist in sample collection. It is not feasible to quantify increases in rates of subject recruitment, retention, or compliance that resulted from the techniques employed in this study. Nevertheless, results indicate that the total package of implemented procedures was instrumental in obtaining a high percentage of valid samples for targeted households and environmental media. PMID:11138657

  20. No Bridge Too High: Infants Decide Whether to Cross Based on the Probability of Falling not the Severity of the Potential Fall

    ERIC Educational Resources Information Center

    Kretch, Kari S.; Adolph, Karen E.

    2013-01-01

    Do infants, like adults, consider both the probability of falling and the severity of a potential fall when deciding whether to cross a bridge? Crawling and walking infants were encouraged to cross bridges varying in width over a small drop-off, a large drop-off, or no drop-off. Bridge width affects the probability of falling, whereas drop-off…

  1. An exclusive human milk-based diet in extremely premature infants reduces the probability of remaining on total parenteral nutrition: A reanalysis of the data

    Technology Transfer Automated Retrieval System (TEKTRAN)

    We have previously shown that an exclusively human-milk-based diet is beneficial for extremely premature infants who are at risk for necrotizing enterocolitis (NEC). However, no significant difference in the other primary study endpoint, the length of time on total parenteral nutrition (TPN), was fo...

  2. The significance of major connectors and denture base mucosal contacts on the functional strain patterns of maxillary removable partial dentures.

    PubMed

    Fernandes, C P; Glantz, P O

    1998-06-01

    The purpose of this study was to investigate the biomechanical significance of major connectors and base mucosal contacts on the mechanical behaviour of maxillary removable partial dentures in vivo. Six subjects wearing maxillary dentures retained by conical crowns were selected for the study. Reflective photoelasticity and strain gauges were used to monitor the development of strain/stress during functional loading. Loading tests were performed initially with a denture design including a palatal major connector and denture bases and then repeated after removal of the major connectors and denture base alveolar mucosa contacts. The palatal major connector and the denture base mucosal contacts contribute significantly to the rigidity and stability of removable partial dentures retained by conical crowns. PMID:9927921

  3. Acceptance, values, and probability.

    PubMed

    Steel, Daniel

    2015-10-01

    This essay makes a case for regarding personal probabilities used in Bayesian analyses of confirmation as objects of acceptance and rejection. That in turn entails that personal probabilities are subject to the argument from inductive risk, which aims to show non-epistemic values can legitimately influence scientific decisions about which hypotheses to accept. In a Bayesian context, the argument from inductive risk suggests that value judgments can influence decisions about which probability models to accept for likelihoods and priors. As a consequence, if the argument from inductive risk is sound, then non-epistemic values can affect not only the level of evidence deemed necessary to accept a hypothesis but also degrees of confirmation themselves. PMID:26386533

  4. Identification of Patient Benefit From Proton Therapy for Advanced Head and Neck Cancer Patients Based on Individual and Subgroup Normal Tissue Complication Probability Analysis

    SciTech Connect

    Jakobi, Annika; Bandurska-Luque, Anna; Stützer, Kristin; Haase, Robert; Löck, Steffen; Wack, Linda-Jacqueline; Mönnich, David; Thorwarth, Daniela; and others

    2015-08-01

    Purpose: The purpose of this study was to determine, by treatment plan comparison along with normal tissue complication probability (NTCP) modeling, whether a subpopulation of patients with head and neck squamous cell carcinoma (HNSCC) could be identified that would gain substantial benefit from proton therapy in terms of NTCP. Methods and Materials: For 45 HNSCC patients, intensity modulated radiation therapy (IMRT) was compared to intensity modulated proton therapy (IMPT). Physical dose distributions were evaluated as well as the resulting NTCP values, using modern models for acute mucositis, xerostomia, aspiration, dysphagia, laryngeal edema, and trismus. Patient subgroups were defined based on primary tumor location. Results: Generally, IMPT reduced the NTCP values while keeping similar target coverage for all patients. Subgroup analyses revealed a higher individual reduction of swallowing-related side effects by IMPT for patients with tumors in the upper head and neck area, whereas the risk reduction of acute mucositis was more pronounced in patients with tumors in the larynx region. More patients with tumors in the upper head and neck area had a reduction in NTCP of more than 10%. Conclusions: Subgrouping can help to identify patients who may benefit more than others from the use of IMPT and, thus, can be a useful tool for a preselection of patients in the clinic where there are limited PT resources. Because the individual benefit differs within a subgroup, the relative merits should additionally be evaluated by individual treatment plan comparisons.

  5. Disparities in Adverse Childhood Experiences among Sexual Minority and Heterosexual Adults: Results from a Multi-State Probability-Based Sample

    PubMed Central

    Andersen, Judith P; Blosnich, John

    2013-01-01

    Background Adverse childhood experiences (e.g., physical, sexual and emotional abuse, neglect, exposure to domestic violence, parental discord, familial mental illness, incarceration and substance abuse) constitute a major public health problem in the United States. The Adverse Childhood Experiences (ACE) scale is a standardized measure that captures multiple developmental risk factors beyond sexual, physical and emotional abuse. Lesbian, gay, and bisexual (i.e., sexual minority) individuals may experience disproportionately higher prevalence of adverse childhood experiences. Purpose To examine, using the ACE scale, prevalence of childhood physical, emotional, and sexual abuse and childhood household dysfunction among sexual minority and heterosexual adults. Methods Analyses were conducted using a probability-based sample of data pooled from three U.S. states’ Behavioral Risk Factor Surveillance System (BRFSS) surveys (Maine, Washington, Wisconsin) that administered the ACE scale and collected information on sexual identity (n = 22,071). Results Compared with heterosexual respondents, gay/lesbian and bisexual individuals experienced increased odds of six of eight and seven of eight adverse childhood experiences, respectively. Sexual minority persons had higher rates of adverse childhood experiences (IRR = 1.66 gay/lesbian; 1.58 bisexual) compared to their heterosexual peers. Conclusions Sexual minority individuals have increased exposure to multiple developmental risk factors beyond physical, sexual and emotional abuse. We recommend the use of the Adverse Childhood Experiences scale in future research examining health disparities among this minority population. PMID:23372755

  6. Acid-base titrations for polyacids: Significance of the pKa and parameters in the Kern equation

    NASA Technical Reports Server (NTRS)

    Meites, L.

    1978-01-01

    A new method is suggested for calculating the dissociation constants of polyvalent acids, especially polymeric acids. The most significant characteristics of the titration curves obtained when solutions of such acids are titrated potentiometrically with a standard base are demonstrated and identified in qualitative form.

  7. Approximating Integrals Using Probability

    ERIC Educational Resources Information Center

    Maruszewski, Richard F., Jr.; Caudle, Kyle A.

    2005-01-01

    This article is part of a discussion on Monte Carlo methods and outlines how to use probability expectations to approximate the value of a definite integral. The purpose of this paper is to elaborate on this technique and then to show several examples using Visual Basic as a programming tool. It is an interesting method because it combines two branches of…
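
    The article works in Visual Basic; the sketch below restates the same idea in Python: the integral of f over [a, b] equals (b - a) times the expected value of f evaluated at a uniform random point of [a, b], so a sample mean approximates the integral.

    ```python
    import random

    def mc_integral(f, a, b, n=100_000):
        """Approximate the definite integral of f over [a, b] as
        (b - a) * E[f(U)] with U ~ Uniform(a, b), estimated by a sample mean."""
        total = sum(f(random.uniform(a, b)) for _ in range(n))
        return (b - a) * total / n

    # Example: the integral of x**2 over [0, 1] is exactly 1/3.
    print(mc_integral(lambda x: x * x, 0.0, 1.0))
    ```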

  8. Varga: On Probability.

    ERIC Educational Resources Information Center

    Varga, Tamas

    This booklet resulted from a 1980 visit by the author, a Hungarian mathematics educator, to the Teachers' Center Project at Southern Illinois University at Edwardsville. Included are activities and problems that make probability concepts accessible to young children. The topics considered are: two probability games; choosing two beads; matching…

  9. Application of Quantum Probability

    NASA Astrophysics Data System (ADS)

    Bohdalová, Mária; Kalina, Martin; Nánásiová, Ol'ga

    2009-03-01

    This is the first attempt to smooth time series using estimators that apply quantum probability with causality (non-commutative s-maps on an orthomodular lattice). In this context, this means that we use a non-symmetric covariance matrix in the construction of our estimator.

  10. Most-probable-number loop-mediated isothermal amplification-based procedure enhanced with K antigen-specific immunomagnetic separation for quantifying tdh(+) Vibrio parahaemolyticus in molluscan Shellfish.

    PubMed

    Tanaka, Natsuko; Iwade, Yoshito; Yamazaki, Wataru; Gondaira, Fumio; Vuddhakul, Varaporn; Nakaguchi, Yoshitsugu; Nishibuchi, Mitsuaki

    2014-07-01

    Although thermostable direct hemolysin-producing (tdh(+)) Vibrio parahaemolyticus is the leading cause of seafood-borne gastroenteritis, the enumeration of tdh(+) V. parahaemolyticus remains challenging due to its low densities in the environment. In this study, we developed a most-probable-number (MPN)-based procedure designated A-IS(1)-LAMP, in which an immunomagnetic separation (IMS) technique targeting as many as 69 established K antigens and a loop-mediated isothermal amplification (LAMP) assay targeting the thermostable direct hemolysin (tdh) gene were applied in an MPN format. Our IMS employed PickPen, an eight-channel intrasolution magnetic particle separation device, which enabled a straightforward microtiter plate-based IMS procedure (designated as PickPen-IMS). The ability of the procedure to quantify a wide range of tdh(+) V. parahaemolyticus levels was evaluated by testing shellfish samples in Japan and southern Thailand, where shellfish products are known to contain relatively low and high levels of total V. parahaemolyticus, respectively. The Japanese and Thai shellfish samples showed, respectively, relatively low (< 3 to 11 MPN/10 g) and considerably higher (930 to 110,000 MPN/10 g) levels of tdh(+) V. parahaemolyticus, raising concern about the safety of Thai shellfish products sold to domestic consumers at local morning markets. LAMP showed similar or higher performance than conventional PCR in the detection and quantification of a wide range of tdh(+) V. parahaemolyticus levels in shellfish products. Whereas a positive effect of PickPen-IMS was not observed in MPN determination, PickPen-IMS was able to concentrate tdh(+) V. parahaemolyticus 32-fold on average from the Japanese shellfish samples at an individual tube level, suggesting a possibility of using PickPen-IMS as an optional tool for specific shellfish samples. The A-IS(1)-LAMP procedure can be used by any health authority in the world to measure the tdh(+) V. parahaemolyticus levels in
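
    The A-IS(1)-LAMP workflow itself is not reproduced here; the sketch below only shows the statistical core of any MPN determination: assuming Poisson-distributed cells, the most probable number is the concentration that maximizes the likelihood of the observed pattern of positive and negative tubes across dilutions. The tube pattern is illustrative, and a coarse grid search stands in for the standard MPN tables or software.

    ```python
    import math

    def mpn(volumes_ml, n_tubes, n_positive):
        """Most probable number (organisms per ml) maximizing the Poisson
        likelihood of the observed positive/negative tube pattern, found by a
        log-spaced grid search between 0.001 and 1000 per ml."""
        def log_likelihood(conc):
            ll = 0.0
            for v, n, p in zip(volumes_ml, n_tubes, n_positive):
                p_pos = 1.0 - math.exp(-conc * v)   # P(tube positive | conc)
                if p > 0:
                    ll += p * math.log(p_pos)
                ll += (n - p) * (-conc * v)         # log P(tube negative) = -conc * v
            return ll
        grid = [10 ** (k / 200.0) for k in range(-600, 601)]
        return max(grid, key=log_likelihood)

    # Classic 3-tube example: 10, 1 and 0.1 ml inocula with 3, 2 and 1 positive
    # tubes respectively (illustrative pattern); prints roughly 1.5 per ml.
    print(round(mpn([10.0, 1.0, 0.1], [3, 3, 3], [3, 2, 1]), 2))
    ```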

  11. Dynamic SEP event probability forecasts

    NASA Astrophysics Data System (ADS)

    Kahler, S. W.; Ling, A.

    2015-10-01

    The forecasting of solar energetic particle (SEP) event probabilities at Earth has been based primarily on the estimates of magnetic free energy in active regions and on the observations of peak fluxes and fluences of large (≥ M2) solar X-ray flares. These forecasts are typically issued for the next 24 h or with no definite expiration time, which can be deficient for time-critical operations when no SEP event appears following a large X-ray flare. It is therefore important to decrease the event probability forecast with time as a SEP event fails to appear. We use the NOAA listing of major (≥10 pfu) SEP events from 1976 to 2014 to plot the delay times from X-ray peaks to SEP threshold onsets as a function of solar source longitude. An algorithm is derived to decrease the SEP event probabilities with time when no event is observed to reach the 10 pfu threshold. In addition, we use known SEP event size distributions to modify probability forecasts when SEP intensity increases occur below the 10 pfu event threshold. An algorithm to provide a dynamic SEP event forecast, Pd, for both situations of SEP intensities following a large flare is derived.
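
    The abstract does not spell out the functional form of Pd, so the sketch below shows just one plausible way to realize a forecast that decays as time passes with no SEP onset: treat the historical X-ray-peak-to-onset delays as an empirical survival curve and update an initial probability by Bayes' rule. This is an assumed construction for illustration, not necessarily the authors' algorithm.

    ```python
    def updated_event_probability(p0, elapsed_hours, onset_delays):
        """Bayesian update of an SEP event probability forecast given that no
        onset has been observed `elapsed_hours` after the flare.  `onset_delays`
        is a list of historical X-ray-peak-to-SEP-onset delays in hours."""
        if not onset_delays:
            return p0
        survival = sum(d > elapsed_hours for d in onset_delays) / len(onset_delays)
        return p0 * survival / (p0 * survival + (1.0 - p0))

    # Illustrative historical onset delays (hours) and an initial forecast of 40%.
    delays = [2, 3, 4, 5, 6, 8, 10, 14, 20, 30]
    for t in (0, 4, 12, 24):
        print(t, round(updated_event_probability(0.40, t, delays), 3))
    ```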

  12. Experience Matters: Information Acquisition Optimizes Probability Gain

    PubMed Central

    Nelson, Jonathan D.; McKenzie, Craig R.M.; Cottrell, Garrison W.; Sejnowski, Terrence J.

    2010-01-01

    Deciding which piece of information to acquire or attend to is fundamental to perception, categorization, medical diagnosis, and scientific inference. Four statistical theories of the value of information—information gain, Kullback-Leibler distance, probability gain (error minimization), and impact—are equally consistent with extant data on human information acquisition. Three experiments, designed via computer optimization to be maximally informative, tested which of these theories best describes human information search. Experiment 1, which used natural sampling and experience-based learning to convey environmental probabilities, found that probability gain explained subjects’ information search better than the other statistical theories or the probability-of-certainty heuristic. Experiments 1 and 2 found that subjects behaved differently when the standard method of verbally presented summary statistics (rather than experience-based learning) was used to convey environmental probabilities. Experiment 3 found that subjects’ preference for probability gain is robust, suggesting that the other models contribute little to subjects’ search behavior. PMID:20525915

  13. Experience matters: information acquisition optimizes probability gain.

    PubMed

    Nelson, Jonathan D; McKenzie, Craig R M; Cottrell, Garrison W; Sejnowski, Terrence J

    2010-07-01

    Deciding which piece of information to acquire or attend to is fundamental to perception, categorization, medical diagnosis, and scientific inference. Four statistical theories of the value of information-information gain, Kullback-Leibler distance, probability gain (error minimization), and impact-are equally consistent with extant data on human information acquisition. Three experiments, designed via computer optimization to be maximally informative, tested which of these theories best describes human information search. Experiment 1, which used natural sampling and experience-based learning to convey environmental probabilities, found that probability gain explained subjects' information search better than the other statistical theories or the probability-of-certainty heuristic. Experiments 1 and 2 found that subjects behaved differently when the standard method of verbally presented summary statistics (rather than experience-based learning) was used to convey environmental probabilities. Experiment 3 found that subjects' preference for probability gain is robust, suggesting that the other models contribute little to subjects' search behavior. PMID:20525915
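
    For concreteness, the sketch below evaluates the probability-gain criterion described in the two preceding records: the expected improvement in the probability of a correct guess after asking a query, pg(Q) = sum over outcomes q of P(q) * max_c P(c | q), minus max_c P(c). The two-category, two-outcome numbers are invented but internally consistent.

    ```python
    def probability_gain(prior, posteriors, query_probs):
        """pg(Q) = sum_q P(q) * max_c P(c | q)  -  max_c P(c)
        prior:       {category: P(c)}
        posteriors:  {outcome: {category: P(c | outcome)}}
        query_probs: {outcome: P(outcome)}"""
        expected_max = sum(p_q * max(posteriors[q].values())
                           for q, p_q in query_probs.items())
        return expected_max - max(prior.values())

    # Toy example with consistent probabilities: P(A) = 0.25*0.4 + 1.0*0.6 = 0.7.
    prior = {"A": 0.7, "B": 0.3}
    query_probs = {"yes": 0.4, "no": 0.6}
    posteriors = {"yes": {"A": 0.25, "B": 0.75},
                  "no":  {"A": 1.00, "B": 0.00}}
    print(probability_gain(prior, posteriors, query_probs))  # 0.4*0.75 + 0.6*1.0 - 0.7 = 0.2
    ```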

  14. MiRPara: a SVM-based software tool for prediction of most probable microRNA coding regions in genome scale sequences

    PubMed Central

    2011-01-01

    Background MicroRNAs are a family of ~22 nt small RNAs that can regulate gene expression at the post-transcriptional level. Identification of these molecules and their targets can aid understanding of regulatory processes. Recently, HTS has become a common identification method but there are two major limitations associated with the technique. Firstly, the method has low efficiency, with typically less than 1 in 10,000 sequences representing miRNA reads and secondly the method preferentially targets highly expressed miRNAs. If sequences are available, computational methods can provide a screening step to investigate the value of an HTS study and aid interpretation of results. However, current methods can only predict miRNAs for short fragments and have usually been trained against small datasets which don't always reflect the diversity of these molecules. Results We have developed a software tool, miRPara, that predicts most probable mature miRNA coding regions from genome scale sequences in a species specific manner. We classified sequences from miRBase into animal, plant and overall categories and used a support vector machine to train three models based on an initial set of 77 parameters related to the physical properties of the pre-miRNA and its miRNAs. By applying parameter filtering we found a subset of ~25 parameters produced higher prediction ability compared to the full set. Our software achieves an accuracy of up to 80% against experimentally verified mature miRNAs, making it one of the most accurate methods available. Conclusions miRPara is an effective tool for locating miRNAs coding regions in genome sequences and can be used as a screening step prior to HTS experiments. It is available at http://www.whiov.ac.cn/bioinformatics/mirpara PMID:21504621

  15. SU-E-T-580: On the Significance of Model Based Dosimetry for Breast and Head and Neck 192Ir HDR Brachytherapy

    SciTech Connect

    Peppa, V; Pappas, E; Pantelis, E; Papagiannis, P; Major, T; Polgar, C

    2015-06-15

    Purpose: To assess the dosimetric and radiobiological differences between TG43-based and model-based dosimetry in the treatment planning of {sup 192}Ir HDR brachytherapy for breast and head and neck cancer. Methods: Two cohorts of 57 Accelerated Partial Breast Irradiation (APBI) and 22 head and neck (H&N) patients with oral cavity carcinoma were studied. Dosimetry for the treatment plans was performed using the TG43 algorithm of the Oncentra Brachy v4.4 treatment planning system (TPS). Corresponding Monte Carlo (MC) simulations were performed using MCNP6 with input files automatically prepared by the BrachyGuide software tool from DICOM RT plan data. TG43 and MC data were compared in terms of % dose differences, Dose Volume Histograms (DVHs) and related indices of clinical interest for the Planning Target Volume (PTV) and the Organs-At-Risk (OARs). A radiobiological analysis was also performed using the Equivalent Uniform Dose (EUD), mean survival fraction (S) and Tumor Control Probability (TCP) for the PTV, and the Normal Tissue Complication Probability (NTCP) and the generalized EUD (gEUD) for the OARs. Significance testing of the observed differences was performed using the Wilcoxon paired sample test. Results: Differences between TG43 and MC DVH indices, associated with the increased corresponding local % dose differences observed, were statistically significant. This is mainly attributed to their consistency, however, since TG43 agrees closely with MC for the majority of DVH and radiobiological parameters in both patient cohorts. Differences varied considerably among patients only for the ipsilateral lung and ribs in the APBI cohort, with a strong correlation to target location. Conclusion: While the consistency and magnitude of differences in the majority of clinically relevant DVH indices imply that no change is needed in the treatment planning practice, individualized dosimetry improves accuracy and addresses instances of inter-patient variability observed. Research

  16. Probabilities of future VEI ≥ 2 eruptions at the Central American Volcanic Arc: a statistical perspective based on the past centuries' eruption record

    NASA Astrophysics Data System (ADS)

    Dzierma, Yvonne; Wehrmann, Heidi

    2014-10-01

    A probabilistic eruption forecast is provided for seven historically active volcanoes along the Central American Volcanic Arc (CAVA), as a pivotal empirical contribution to multi-disciplinary volcanic hazards assessment. The eruption probabilities are determined with a Kaplan-Meier estimator of survival functions, and parametric time series models are applied to describe the historical eruption records. Aside from the volcanoes that are currently in a state of eruptive activity (Santa María, Fuego, and Arenal), the highest probabilities for eruptions of VEI ≥ 2 occur at Concepción and Cerro Negro in Nicaragua, which have eruption probabilities of 70-85 % within the next 10 years. Poás and Irazú in Costa Rica show a medium to high eruption probability, followed by San Miguel (El Salvador), Rincón de la Vieja (Costa Rica), and Izalco (El Salvador; 24 % within the next 10 years).
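    The survival-function machinery behind such estimates can be illustrated with a minimal sketch. The snippet below implements a basic Kaplan-Meier estimator over repose intervals and converts it into a 10-year eruption probability; the repose-time values, the treatment of the still-open interval as censored, and all names are illustrative assumptions, not the CAVA catalogue or the exact procedure of the study.

```python
import numpy as np

def kaplan_meier(durations, observed):
    """Kaplan-Meier survival estimate; returns event times and S(t) at those times."""
    durations = np.asarray(durations, dtype=float)
    observed = np.asarray(observed, dtype=bool)
    order = np.argsort(durations)
    durations, observed = durations[order], observed[order]
    at_risk = len(durations)
    times, surv, s = [], [], 1.0
    for t, event in zip(durations, observed):
        if event:                      # an eruption closes this repose interval
            s *= 1.0 - 1.0 / at_risk
            times.append(t)
            surv.append(s)
        at_risk -= 1                   # censored or not, the interval leaves the risk set
    return np.array(times), np.array(surv)

# Illustrative repose times (years between successive VEI >= 2 eruptions);
# the final, still-open interval is censored (observed = 0).
repose = [3.1, 5.4, 2.2, 8.0, 4.7, 6.3, 12.5]
event = [1, 1, 1, 1, 1, 1, 0]

times, surv = kaplan_meier(repose, event)
s10 = surv[times <= 10.0][-1] if np.any(times <= 10.0) else 1.0
print(f"Estimated probability of a VEI >= 2 eruption within 10 years: {1.0 - s10:.2f}")
```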

  17. Assessing the statistical significance of periodogram peaks

    NASA Astrophysics Data System (ADS)

    Baluev, R. V.

    2008-04-01

    The least-squares (or Lomb-Scargle) periodogram is a powerful tool that is routinely used in many branches of astronomy to search for periodicities in observational data. The problem of assessing the statistical significance of candidate periodicities for a number of periodograms is considered. Based on results in extreme value theory, improved analytic estimations of false alarm probabilities are given. These include an upper limit to the false alarm probability (or a lower limit to the significance). The estimations are tested numerically in order to establish regions of their practical applicability.
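    As a quick illustration of how such analytic false-alarm estimates are used in practice, the sketch below evaluates a Baluev-style upper limit for the highest peak of a Lomb-Scargle periodogram. It assumes astropy's LombScargle implementation (method="baluev") and a synthetic, irregularly sampled time series; it is not the author's own code.

```python
import numpy as np
from astropy.timeseries import LombScargle

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 100.0, 200))            # irregular sampling times
y = 0.3 * np.sin(2 * np.pi * t / 7.5) + rng.normal(0.0, 1.0, t.size)

ls = LombScargle(t, y)
frequency, power = ls.autopower()

# Upper limit to the false alarm probability (lower limit to the significance)
fap = ls.false_alarm_probability(power.max(), method="baluev")
print(f"Strongest peak at period {1 / frequency[np.argmax(power)]:.2f}, FAP <= {fap:.3g}")
```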

  18. Bell Could Become the Copernicus of Probability

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei

    2016-07-01

    Our aim is to emphasize the role of mathematical models in physics, especially models of geometry and probability. We briefly compare developments of geometry and probability by pointing to similarities and differences: from Euclid to Lobachevsky and from Kolmogorov to Bell. In probability, Bell could play the same role as Lobachevsky in geometry. In fact, violation of Bell’s inequality can be treated as implying the impossibility to apply the classical probability model of Kolmogorov (1933) to quantum phenomena. Thus the quantum probabilistic model (based on Born’s rule) can be considered as the concrete example of the non-Kolmogorovian model of probability, similarly to the Lobachevskian model — the first example of the non-Euclidean model of geometry. This is the “probability model” interpretation of the violation of Bell’s inequality. We also criticize the standard interpretation—an attempt to add to rigorous mathematical probability models additional elements such as (non)locality and (un)realism. Finally, we compare embeddings of non-Euclidean geometries into the Euclidean space with embeddings of the non-Kolmogorovian probabilities (in particular, quantum probability) into the Kolmogorov probability space. As an example, we consider the CHSH-test.

  19. Predicting accurate probabilities with a ranking loss

    PubMed Central

    Menon, Aditya Krishna; Jiang, Xiaoqian J; Vembu, Shankar; Elkan, Charles; Ohno-Machado, Lucila

    2013-01-01

    In many real-world applications of machine learning classifiers, it is essential to predict the probability of an example belonging to a particular class. This paper proposes a simple technique for predicting probabilities based on optimizing a ranking loss, followed by isotonic regression. This semi-parametric technique offers both good ranking and regression performance, and models a richer set of probability distributions than statistical workhorses such as logistic regression. We provide experimental results that show the effectiveness of this technique on real-world applications of probability prediction. PMID:25285328
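    A minimal sketch of the two-stage idea (learn a good ranking first, then calibrate) is given below. A LinearSVC decision function stands in for the paper's ranking-loss optimizer, and the data are synthetic; only the isotonic-regression calibration step follows the description above directly.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.isotonic import IsotonicRegression
from sklearn.model_selection import train_test_split
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_fit, X_cal, y_fit, y_cal = train_test_split(X, y, test_size=0.5, random_state=0)

ranker = LinearSVC(C=1.0, max_iter=10000).fit(X_fit, y_fit)  # stage 1: ranking-oriented scorer
scores = ranker.decision_function(X_cal)

iso = IsotonicRegression(out_of_bounds="clip")               # stage 2: monotone map to [0, 1]
iso.fit(scores, y_cal)

probs = iso.predict(ranker.decision_function(X_cal[:5]))
print(np.round(probs, 3))                                    # calibrated class-1 probabilities
```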

  20. Laboratory-Tutorial Activities for Teaching Probability

    ERIC Educational Resources Information Center

    Wittmann, Michael C.; Morgan, Jeffrey T.; Feeley, Roger E.

    2006-01-01

    We report on the development of students' ideas of probability and probability density in a University of Maine laboratory-based general education physics course called "Intuitive Quantum Physics". Students in the course are generally math phobic with unfavorable expectations about the nature of physics and their ability to do it. We describe a…

  1. Measurement Uncertainty and Probability

    NASA Astrophysics Data System (ADS)

    Willink, Robin

    2013-02-01

    Part I. Principles: 1. Introduction; 2. Foundational ideas in measurement; 3. Components of error or uncertainty; 4. Foundational ideas in probability and statistics; 5. The randomization of systematic errors; 6. Beyond the standard confidence interval; Part II. Evaluation of Uncertainty: 7. Final preparation; 8. Evaluation using the linear approximation; 9. Evaluation without the linear approximations; 10. Uncertainty information fit for purpose; Part III. Related Topics: 11. Measurement of vectors and functions; 12. Why take part in a measurement comparison?; 13. Other philosophies; 14. An assessment of objective Bayesian methods; 15. A guide to the expression of uncertainty in measurement; 16. Measurement near a limit - an insoluble problem?; References; Index.

  2. Intensity-Modulated Radiotherapy Results in Significant Decrease in Clinical Toxicities Compared With Conventional Wedge-Based Breast Radiotherapy

    SciTech Connect

    Harsolia, Asif; Kestin, Larry; Grills, Inga; Wallace, Michelle; Jolly, Shruti; Jones, Cortney; Lala, Moinaktar; Martinez, Alvaro; Schell, Scott; Vicini, Frank A. E-mail: fvicini@beaumont.edu

    2007-08-01

    Purpose: We have previously demonstrated that intensity-modulated radiotherapy (IMRT) with a static multileaf collimator process results in a more homogenous dose distribution compared with conventional wedge-based whole breast irradiation (WBI). In the present analysis, we reviewed the acute and chronic toxicity of this IMRT approach compared with conventional wedge-based treatment. Methods and Materials: A total of 172 patients with Stage 0-IIB breast cancer were treated with lumpectomy followed by WBI. All patients underwent treatment planning computed tomography and received WBI (median dose, 45 Gy) followed by a boost to 61 Gy. Of the 172 patients, 93 (54%) were treated with IMRT, and the 79 patients (46%) treated with wedge-based RT in a consecutive fashion immediately before this cohort served as the control group. The median follow-up was 4.7 years. Results: A significant reduction in acute Grade 2 or worse dermatitis, edema, and hyperpigmentation was seen with IMRT compared with wedges. A trend was found toward reduced acute Grade 3 or greater dermatitis (6% vs. 1%, p = 0.09) in favor of IMRT. Chronic Grade 2 or worse breast edema was significantly reduced with IMRT compared with conventional wedges. No difference was found in cosmesis scores between the two groups. In patients with larger breasts (≥1,600 cm³, n = 64), IMRT resulted in reduced acute (Grade 2 or greater) breast edema (0% vs. 36%, p <0.001) and hyperpigmentation (3% vs. 41%, p = 0.001) and chronic (Grade 2 or greater) long-term edema (3% vs. 30%, p = 0.007). Conclusion: The use of IMRT in the treatment of the whole breast results in a significant decrease in acute dermatitis, edema, and hyperpigmentation and a reduction in the development of chronic breast edema compared with conventional wedge-based RT.

  3. Probability and Quantum Paradigms: the Interplay

    SciTech Connect

    Kracklauer, A. F.

    2007-12-03

    Since the introduction of Born's interpretation of quantum wave functions as yielding the probability density of presence, Quantum Theory and Probability have lived in a troubled symbiosis. Problems arise with this interpretation because quantum probabilities exhibit features alien to usual probabilities, namely non-Boolean structure and non-positive-definite phase space probability densities. This has inspired research into both elaborate formulations of Probability Theory and alternate interpretations for wave functions. Herein the latter tactic is taken, and a variant interpretation of wave functions based on photodetection physics is proposed; some empirical consequences are considered. Although incomplete in a few details, this variant is appealing in its reliance on well-tested concepts and technology.

  4. Probability and Quantum Paradigms: the Interplay

    NASA Astrophysics Data System (ADS)

    Kracklauer, A. F.

    2007-12-01

    Since the introduction of Born's interpretation of quantum wave functions as yielding the probability density of presence, Quantum Theory and Probability have lived in a troubled symbiosis. Problems arise with this interpretation because quantum probabilities exhibit features alien to usual probabilities, namely non-Boolean structure and non-positive-definite phase space probability densities. This has inspired research into both elaborate formulations of Probability Theory and alternate interpretations for wave functions. Herein the latter tactic is taken, and a variant interpretation of wave functions based on photodetection physics is proposed; some empirical consequences are considered. Although incomplete in a few details, this variant is appealing in its reliance on well-tested concepts and technology.

  5. Earthquake probabilities: theoretical assessments and reality

    NASA Astrophysics Data System (ADS)

    Kossobokov, V. G.

    2013-12-01

    It is common knowledge that earthquakes are complex phenomena whose classification and sizing remain serious problems of contemporary seismology. In general, their frequency-magnitude distributions exhibit power-law scaling. This scaling differs significantly when different time and/or space domains are considered. At the scale of a particular earthquake rupture zone, the frequency of similar-size events is usually estimated to be about once in several hundred years. Evidently, contemporary seismology does not possess enough reported instrumental data for any reliable quantification of earthquake probability at a given place of an expected event. Regretfully, most state-of-the-art theoretical approaches to assessing the probability of seismic events are based on trivial (e.g. Poisson, periodic, etc) or, conversely, delicately designed (e.g. STEP, ETAS, etc) models of earthquake sequences. Some of these models are evidently erroneous, some can be rejected by the existing statistics, and some are hardly testable in our lifetime. Nevertheless, such probabilistic counts, including seismic hazard assessment and earthquake forecasting, when used in practice eventually lead to scientifically groundless advice communicated to decision makers and to inappropriate decisions. As a result, the population of seismic regions continues to face unexpected risk and losses. The international project Global Earthquake Model (GEM) is on the wrong track if it continues to base seismic risk estimates on the standard, mainly probabilistic, methodology to assess seismic hazard. It is generally accepted that earthquakes are infrequent, low-probability events. However, they keep occurring in earthquake-prone areas with 100% certainty. Given the expectation of a seismic event once per hundred years, the daily probability of occurrence on a certain date may range from 0 to 100% depending on the choice of probability space (which is yet unknown and, therefore, made by a subjective lucky chance

  6. Significantly elevated dielectric permittivity of Si-based semiconductor/polymer 2-2 composites induced by high polarity polymers

    NASA Astrophysics Data System (ADS)

    Feng, Yefeng; Gong, Honghong; Xie, Yunchuan; Wei, Xiaoyong; Zhang, Zhicheng

    2016-02-01

    To disclose the essential influence of polymer polarity on the dielectric properties of polymer composites filled with semiconductive fillers, a series of Si-based semiconductor/polymer 2-2 composites in a series model was fabricated. The dielectric permittivity of the composites is highly dependent on the polarity of the polymer layers as well as the electron mobility in the Si-based semiconductive sheets. The huge dielectric permittivity achieved in Si-based semiconductive sheets after being coated with high polarity polymer layers is inferred to originate from the strong induction of high polarity polymers. The increased mobility of the electrons in Si-based semiconductive sheets coated by high polarity polymer layers should be responsible for the significantly enhanced dielectric properties of the composites. This could be facilely achieved by either increasing the polarity of the polymer layers or reducing the percolative electric field of the Si-based semiconductive sheets. The most promising 2-2 dielectric composite was found to be made of α-SiC with strong electron mobility and poly(vinyl alcohol) (PVA) with high polarity, and its highest permittivity reached 372 at 100 Hz, although the permittivities of α-SiC and PVA are only 3-5 and 15, respectively. This work may help in the fabrication of high dielectric constant (high-k) composites by tailoring the induction effect of high polarity polymers on semiconductors.

  7. Probability of brittle failure

    NASA Technical Reports Server (NTRS)

    Kim, A.; Bosnyak, C. P.; Chudnovsky, A.

    1991-01-01

    A methodology was developed for collecting statistically representative data for crack initiation and arrest from a small number of test specimens. An epoxy (based on bisphenol A diglycidyl ether and polyglycol extended diglycyl ether and cured with diethylene triamine) is selected as a model material. A compact tension specimen with displacement-controlled loading is used to observe multiple crack initiations and arrests. The energy release rate at crack initiation is significantly higher than that at a crack arrest, as has been observed elsewhere. The difference between these energy release rates is found to depend on specimen size (scale effect), and is quantitatively related to the fracture surface morphology. The scale effect, similar to that in statistical strength theory, is usually attributed to the statistics of defects which control the fracture process. Triangular shaped ripples (deltoids) are formed on the fracture surface during slow subcritical crack growth, prior to the smooth mirror-like surface characteristic of fast cracks. The deltoids are complementary on the two crack faces, which excludes any inelastic deformation from consideration. The presence of defects is also suggested by the observed scale effect. However, there are no defects at the deltoid apexes detectable down to the 0.1 micron level.

  8. Slider—maximum use of probability information for alignment of short sequence reads and SNP detection

    PubMed Central

    Malhis, Nawar; Butterfield, Yaron S. N.; Ester, Martin; Jones, Steven J. M.

    2009-01-01

    Motivation: A plethora of alignment tools have been created that are designed to best fit different types of alignment conditions. While some of these are made for aligning Illumina Sequence Analyzer reads, none of these are fully utilizing its probability (prb) output. In this article, we will introduce a new alignment approach (Slider) that reduces the alignment problem space by utilizing each read base's probabilities given in the prb files. Results: Compared with other aligners, Slider has higher alignment accuracy and efficiency. In addition, given that Slider matches bases with probabilities other than the most probable, it significantly reduces the percentage of base mismatches. The result is that its SNP predictions are more accurate than other SNP prediction approaches used today that start from the most probable sequence, including those using base quality. Contact: nmalhis@bcgsc.ca Supplementary information and availability: http://www.bcgsc.ca/platform/bioinfo/software/slider PMID:18974170

  9. Significant Scales in Community Structure

    PubMed Central

    Traag, V. A.; Krings, G.; Van Dooren, P.

    2013-01-01

    Many complex networks show signs of modular structure, uncovered by community detection. Although many methods succeed in revealing various partitions, it remains difficult to detect at what scale some partition is significant. This problem shows foremost in multi-resolution methods. We here introduce an efficient method for scanning for resolutions in one such method. Additionally, we introduce the notion of “significance” of a partition, based on subgraph probabilities. Significance is independent of the exact method used, so could also be applied in other methods, and can be interpreted as the gain in encoding a graph by making use of a partition. Using significance, we can determine “good” resolution parameters, which we demonstrate on benchmark networks. Moreover, optimizing significance itself also shows excellent performance. We demonstrate our method on voting data from the European Parliament. Our analysis suggests the European Parliament has become increasingly ideologically divided and that nationality plays no role. PMID:24121597

  10. Analytic estimation of statistical significance maps for support vector machine based multi-variate image analysis and classification

    PubMed Central

    Gaonkar, Bilwaj; Davatzikos, Christos

    2013-01-01

    Multivariate pattern analysis (MVPA) methods such as support vector machines (SVMs) have been increasingly applied to fMRI and sMRI analyses, enabling the detection of distinctive imaging patterns. However, identifying brain regions that significantly contribute to the classification/group separation requires computationally expensive permutation testing. In this paper we show that the results of SVM-permutation testing can be analytically approximated. This approximation leads to more than a thousand fold speed up of the permutation testing procedure, thereby rendering it feasible to perform such tests on standard computers. The speed up achieved makes SVM based group difference analysis competitive with standard univariate group difference analysis methods. PMID:23583748
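    For context, the sketch below shows the brute-force permutation-testing baseline that such analytic approximations are designed to replace: refitting a linear SVM on permuted labels and comparing each observed weight against its null distribution. The dataset size, number of permutations, and use of scikit-learn's LinearSVC are illustrative assumptions, not the paper's setup.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import LinearSVC

X, y = make_classification(n_samples=100, n_features=50, random_state=0)
w_obs = np.abs(LinearSVC(C=1.0, max_iter=10000).fit(X, y).coef_.ravel())

rng = np.random.default_rng(0)
n_perm = 500
null = np.empty((n_perm, X.shape[1]))
for i in range(n_perm):                                  # the computationally expensive step
    y_perm = rng.permutation(y)
    null[i] = np.abs(LinearSVC(C=1.0, max_iter=10000).fit(X, y_perm).coef_.ravel())

# One-sided permutation p-value per feature (voxel)
p = (1 + (null >= w_obs).sum(axis=0)) / (n_perm + 1)
print("features with p < 0.05:", np.where(p < 0.05)[0])
```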

  11. Organic photovoltaic based on copper phthalocyanine with high open circuit voltage and significant current and voltage stability

    NASA Astrophysics Data System (ADS)

    Hamam, Khalil; Al-Amar, Mohammad; Burns, Clement

    2012-10-01

    Organic semiconductors are under investigation as a possible material to create low cost solar cells. We fabricated photovoltaic devices consisting of copper phthalocyanine (CuPc) modified with a sulfonated group / perylene-3,4,9,10-tetracarboxylic dianhydride (PTCDA) / bathocuproine (BCP). A large open circuit voltage (VOC) of 0.74 V was recorded, superior to cells based on CuPc/PTCDA (VOC = 0.55 V). Our solar cells exhibit little change in their voltage and current for more than 7 months, superior to many organic solar cells, which degrade significantly over days or weeks.

  12. Prevalence of urogenital Chlamydia trachomatis increases significantly with level of urbanisation and suggests targeted screening approaches: results from the first national population based study in the Netherlands

    PubMed Central

    van Bergen, J; Gotz, H; Richardus, J; Hoebe, C; Broer, J; Coenen, A

    2005-01-01

    Objectives: Chlamydia trachomatis (Chlamydia) is the most prevalent sexually transmitted bacterial infection and can cause considerable reproductive morbidity in women. Chlamydia screening programmes have been considered but policy recommendations are hampered by the lack of population based data. This paper describes the prevalence of Chlamydia in 15–29 year old women and men in rural and urban areas, as determined through systematic population based screening organised by the Municipal Public Health Services (MHS), and discusses the implications of this screening strategy for routine implementation. Methods: Stratified national probability survey according to "area address density" (AAD). 21 000 randomly selected women and men in four regions, aged 15–29 years received a home sampling kit. Urine samples were returned by mail and tested by polymerase chain reaction (PCR). Treatment was via the general practitioner, STI clinic, or MHS clinic. Results: 41% (8383) responded by sending in urine and questionnaire. 11% (2227) returned a refusal card. Non-responders included both higher and lower risk categories. Chlamydia prevalence was significantly lower in rural areas (0.6%, 95% CI 0.1 to 1.1) compared with very highly urbanised areas (3.2%, 95% CI 2.4 to 4.0). Overall prevalence was 2.0% (95% CI 1.7 to 2.3): 2.5% (95% CI 2.0 to 3.0%) in women and 1.5% (95% CI 1.1 to 1.8) in men. Of all cases 91% were treated. Infection was associated with degree of urbanisation, ethnicity, number of sex partners, and symptoms. Conclusion: This large, population based study found very low prevalence in rural populations, suggesting that nationwide systematic screening is not indicated in the Netherlands and that targeted approaches are a better option. Further analysis of risk profiles will contribute to determine how selective screening can be done. PMID:15681716

  13. Significant life experience: Exploring the lifelong influence of place-based environmental and science education on program participants

    NASA Astrophysics Data System (ADS)

    Colvin, Corrie Ruth

    Current research provides a limited understanding of the lifelong influence of nonformal place-based environmental and science education programs on past participants. This study looks to address this gap, exploring the ways in which these learning environments have contributed to environmental identity and stewardship. Using Dorothy Holland's approach to social practice theory's understanding of identity formation, this study employed narrative interviews and a close-ended survey to understand past participants' experience over time. Participants from two place-based environmental education programs and one science-inquiry program were asked to share their reflections on their program experience and the influence they attribute to that experience. Among all participants, the elements of hands-on learning, supportive instructors, and engaging learning environments remained salient over time. Participants of nature-based programs demonstrated that these programs in particular were formative in contributing to an environmental stewardship identity. Social practice theory can serve as a helpful theoretical framework for significant life experience research, from which it has largely been missing. This study also holds implications for the fields of place-based environmental education, conservation psychology, and sustainability planning, all of which look to understand and increase environmentally sustainable practices.

  14. Emptiness Formation Probability

    NASA Astrophysics Data System (ADS)

    Crawford, Nicholas; Ng, Stephen; Starr, Shannon

    2016-08-01

    We present rigorous upper and lower bounds on the emptiness formation probability for the ground state of a spin-1/2 Heisenberg XXZ quantum spin system. For a d-dimensional system we find a rate of decay of the order exp(-c L^{d+1}), where L is the sidelength of the box in which we ask for the emptiness formation event to occur. In the d = 1 case this confirms previous predictions made in the integrable systems community, though our bounds do not achieve the precision predicted by Bethe ansatz calculations. On the other hand, our bounds in the case d ≥ 2 are new. The main tools we use are reflection positivity and a rigorous path integral expansion, which is a variation on those previously introduced by Toth, Aizenman-Nachtergaele and Ueltschi.

  15. Chemical potential and entropy in monodisperse and polydisperse hard-sphere fluids using Widom's particle insertion method and a pore size distribution-based insertion probability.

    PubMed

    Baranau, Vasili; Tallarek, Ulrich

    2016-06-01

    We estimate the excess chemical potential Δμ and excess entropy per particle Δs of computer-generated, monodisperse and polydisperse, frictionless hard-sphere fluids. For this purpose, we utilize the Widom particle insertion method, which for hard-sphere systems relates Δμ to the probability to successfully (without intersections) insert a particle into a system. This insertion probability is evaluated directly for each configuration of hard spheres by extrapolating to infinity the pore radii (nearest-surface) distribution and integrating its tail. The estimates of Δμ and Δs are compared to (and comply well with) predictions from the Boublík-Mansoori-Carnahan-Starling-Leland equation of state. For polydisperse spheres, we employ log-normal particle radii distributions with polydispersities δ = 0.1, 0.2, and 0.3. PMID:27276959
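    The core Widom relation used here, βΔμ = -ln P_insert, can be sketched in a few lines. The toy configuration below is a random (non-equilibrated) arrangement of monodisperse spheres rather than the computer-generated fluids of the paper, and a direct trial-insertion loop replaces the pore-size-distribution extrapolation described above; box size, diameter, and counts are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
L, sigma, n_spheres = 10.0, 1.0, 150                  # box edge, diameter, sphere count
centers = rng.uniform(0.0, L, size=(n_spheres, 3))

def overlaps(point, centers, sigma, L):
    """True if a test sphere at `point` overlaps any existing sphere (periodic box)."""
    d = np.abs(centers - point)
    d = np.minimum(d, L - d)                          # minimum-image convention
    return np.any(np.sum(d * d, axis=1) < sigma ** 2)

n_trials = 50_000
hits = sum(not overlaps(rng.uniform(0.0, L, 3), centers, sigma, L)
           for _ in range(n_trials))
p_insert = hits / n_trials
print(f"P_insert = {p_insert:.4f},  beta*delta_mu = {-np.log(p_insert):.3f}")
```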

  16. Prestack inversion based on anisotropic Markov random field-maximum posterior probability inversion and its application to identify shale gas sweet spots

    NASA Astrophysics Data System (ADS)

    Wang, Kang-Ning; Sun, Zan-Dong; Dong, Ning

    2015-12-01

    Economic shale gas production requires hydraulic fracture stimulation to increase the formation permeability. Hydraulic fracturing strongly depends on geomechanical parameters such as Young's modulus and Poisson's ratio. Fracture-prone sweet spots can be predicted by prestack inversion, which is an ill-posed problem; thus, regularization is needed to obtain unique and stable solutions. To characterize gas-bearing shale sedimentary bodies, elastic parameter variations are regarded as an anisotropic Markov random field. Bayesian statistics are adopted for transforming prestack inversion to the maximum posterior probability. Two energy functions for the lateral and vertical directions are used to describe the distribution, and the expectation-maximization algorithm is used to estimate the hyperparameters of the prior probability of the elastic parameters. Finally, the inversion yields clear geological boundaries, high vertical resolution, and reasonable lateral continuity, using the conjugate gradient method to minimize the objective function. The noise robustness and imaging ability of the method were tested using synthetic and real data.

  17. Significance tests and weighted values for AFLP similarities, based on Arabidopsis in silico AFLP fragment length distributions.

    PubMed Central

    Koopman, Wim J M; Gort, Gerrit

    2004-01-01

    Many AFLP studies include relatively unrelated genotypes that contribute noise to data sets instead of signal. We developed: (1) estimates of expected AFLP similarities between unrelated genotypes, (2) significance tests for AFLP similarities, enabling the detection of unrelated genotypes, and (3) weighted similarity coefficients, including band position information. Detection of unrelated genotypes and use of weighted similarity coefficients will make the analysis of AFLP data sets more informative and more reliable. Test statistics and weighted coefficients were developed for total numbers of shared bands and for Dice, Jaccard, Nei and Li, and simple matching (dis)similarity coefficients. Theoretical and in silico AFLP fragment length distributions (FLDs) were examined as a basis for the tests. The in silico AFLP FLD based on the Arabidopsis thaliana genome sequence was the most appropriate for angiosperms. The G + C content of the selective nucleotides in the in silico AFLP procedure significantly influenced the FLD. Therefore, separate test statistics were calculated for AFLP procedures with high, average, and low G + C contents in the selective nucleotides. The test statistics are generally applicable for angiosperms with a G + C content of approximately 35-40%, but represent conservative estimates for genotypes with higher G + C contents. For the latter, test statistics based on a rice genome sequence are more appropriate. PMID:15342529

  18. Identifying significant covariates for anti-HIV treatment response: mechanism-based differential equation models and empirical semiparametric regression models.

    PubMed

    Huang, Yangxin; Liang, Hua; Wu, Hulin

    2008-10-15

    In this paper, the mechanism-based ordinary differential equation (ODE) model and the flexible semiparametric regression model are employed to identify the significant covariates for antiretroviral response in AIDS clinical trials. We consider the treatment effect as a function of three factors (or covariates) including pharmacokinetics, drug adherence and susceptibility. Both clinical and simulated data examples are given to illustrate these two different kinds of modeling approaches. We found that the ODE model is more powerful to model the mechanism-based nonlinear relationship between treatment effects and virological response biomarkers. The ODE model is also better in identifying the significant factors for virological response, although it is slightly liberal and there is a trend to include more factors (or covariates) in the model. The semiparametric mixed-effects regression model is very flexible to fit the virological response data, but it is too liberal to identify correct factors for the virological response; sometimes it may miss the correct factors. The ODE model is also biologically justifiable and good for predictions and simulations for various biological scenarios. The limitations of the ODE models include the high cost of computation and the requirement of biological assumptions that sometimes may not be easy to validate. The methodologies reviewed in this paper are also generally applicable to studies of other viruses such as hepatitis B virus or hepatitis C virus. PMID:18407583
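    As an illustration of the mechanism-based ODE approach, the sketch below integrates a generic target-cell-limited HIV model with a time-varying drug-efficacy term standing in for the combined effect of pharmacokinetics, adherence, and susceptibility. The equations, parameter values, and names are textbook-style assumptions, not those of the cited study.

```python
import numpy as np
from scipy.integrate import solve_ivp

lam, d, k, delta, N_burst, c = 1e4, 0.01, 8e-7, 0.7, 100.0, 13.0

def efficacy(t):
    """Treatment effect in [0, 1); in practice driven by PK, adherence, susceptibility."""
    return 0.85

def hiv_ode(t, y):
    T, I, V = y                       # uninfected cells, infected cells, free virus
    eps = efficacy(t)
    dT = lam - d * T - (1 - eps) * k * T * V
    dI = (1 - eps) * k * T * V - delta * I
    dV = N_burst * delta * I - c * V
    return [dT, dI, dV]

sol = solve_ivp(hiv_ode, (0.0, 56.0), [1e6, 1e3, 1e5], method="LSODA",
                t_eval=np.linspace(0.0, 56.0, 8))
print(np.round(np.log10(sol.y[2]), 2))   # log10 viral load over 8 weeks
```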

  19. A timber inventory based upon manual and automated analysis of ERTS-1 and supporting aircraft data using multistage probability sampling. [Plumas National Forest, California]

    NASA Technical Reports Server (NTRS)

    Nichols, J. D.; Gialdini, M.; Jaakkola, S.

    1974-01-01

    A quasi-operational study demonstrated that a timber inventory based on manual and automated analysis of ERTS-1 data, supporting aircraft data, and ground data could be made using multistage sampling techniques. The inventory proved to be a timely, cost-effective alternative to conventional timber inventory techniques. The timber volume on the Quincy Ranger District of the Plumas National Forest was estimated to be 2.44 billion board feet with a sampling error of 8.2 percent. The cost of the inventory procedure, at 1.1 cents/acre, compared favorably with the cost of a conventional inventory at 25 cents/acre. A point-by-point comparison of CALSCAN-classified ERTS data with human-interpreted low altitude photo plots indicated no significant differences in the overall classification accuracies.

  20. Predictors of clinically significant weight loss and participant retention in an insurance-sponsored community-based weight management program.

    PubMed

    Abildso, Christiaan G; Zizzi, Sam; Fitzpatrick, Sean J

    2013-07-01

    Health insurance providers are a logical partner in providing third-party payment for behavioral weight loss programming, but little evidence of predictors of improved outcomes or retention in large, insurance-sponsored lifestyle programming is available. The purpose was to determine predictors of weight loss and retention in an insurance-sponsored, community-based weight management program. Current and former participants (N = 2,106) were recruited to complete a program evaluation survey. Respondents' survey and objective outcome data (n = 766) were analyzed using logistic regression procedures to understand the factors predictive of clinically-significant (5%) weight loss and program retention (>6 months). Clinically significant weight loss was predicted by completing more than 6 months of the program, positive ratings of staff interaction, and social support from friends on success. Ratings of positive impact of site hours of operation, nurse calls, and availability of safe places to be active and feeling comfortable at the site were predictive of program retention. Modifiable intervention, social factors, and site-level factors were predictive of clinically significant weight loss and program retention, providing fodder for further study and dissemination to current providers and to a broader network of health promotion professionals. PMID:23075503

  1. Adapting the Posterior Probability of Diagnosis (PPOD) Index to Enhance Evidence-Based Screening: An Application to ADHD in Primary Care

    PubMed Central

    Lindhiem, Oliver; Yu, Lan; Grasso, Damion J.; Kolko, David J.; Youngstrom, Eric A.

    2014-01-01

    This study adapts the Posterior Probability of Diagnosis (PPOD) Index for use with screening data. The original PPOD Index, designed for use in the context of comprehensive diagnostic assessments, is overconfident when applied to screening data. To correct for this overconfidence, we describe a simple method for adjusting the PPOD Index to improve its calibration when used for screening. Specifically, we compare the adjusted PPOD Index to the original index and Naïve Bayes probability estimates on two dimensions of accuracy, discrimination and calibration, using a clinical sample of children and adolescents (N = 321) whose caregivers completed the Vanderbilt Assessment Scale to screen for Attention-Deficit/Hyperactivity Disorder (ADHD) and who subsequently completed a comprehensive diagnostic assessment. Results indicated that the adjusted PPOD Index, original PPOD Index, and Naïve Bayes probability estimates are comparable using traditional measures of accuracy (sensitivity, specificity, AUC) but the adjusted PPOD Index showed superior calibration. We discuss the importance of calibration for screening and diagnostic support tools when applied to individual patients. PMID:25000935

  2. A clip-based protocol for breast boost radiotherapy provides clear target visualisation and demonstrates significant volume reduction over time

    SciTech Connect

    Lewis, Lorraine; Cox, Jennifer; Morgia, Marita; Atyeo, John; Lamoury, Gillian

    2015-09-15

    The clinical target volume (CTV) for early stage breast cancer is difficult to clearly identify on planning computed tomography (CT) scans. Surgical clips inserted around the tumour bed should help to identify the CTV, particularly if the seroma has been reabsorbed, and enable tracking of CTV changes over time. A surgical clip-based CTV delineation protocol was introduced. CTV visibility and its post-operative shrinkage pattern were assessed. The subjects were 27 early stage breast cancer patients receiving post-operative radiotherapy alone and 15 receiving post-operative chemotherapy followed by radiotherapy. The radiotherapy alone (RT/alone) group received a CT scan at median 25 days post-operatively (CT1rt) and another at 40 Gy, median 68 days (CT2rt). The chemotherapy/RT group (chemo/RT) received a CT scan at median 18 days post-operatively (CT1ch), a planning CT scan at median 126 days (CT2ch), and another at 40 Gy (CT3ch). There was no significant difference (P = 0.08) between the initial mean CTV for each cohort. The RT/alone cohort showed significant CTV volume reduction of 38.4% (P = 0.01) at 40 Gy. The Chemo/RT cohort had significantly reduced volumes between CT1ch: median 54 cm³ (4–118) and CT2ch: median 16 cm³ (2–99) (P = 0.01), but no significant volume reduction thereafter. Surgical clips enable localisation of the post-surgical seroma for radiotherapy targeting. Most seroma shrinkage occurs early, enabling CT treatment planning to take place at 7 weeks, which is within the 9 weeks recommended to limit disease recurrence.

  3. Lectures on probability and statistics

    SciTech Connect

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another.
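    The a priori dice calculation mentioned above can be made concrete with a tiny enumeration; the particular event (a sum of 7 with two fair dice) is an illustrative choice.

```python
from itertools import product
from fractions import Fraction

# Every ordered outcome of two fair dice is equally likely, so the probability of
# any specified event is the count of favourable outcomes over 36.
outcomes = list(product(range(1, 7), repeat=2))
p_sum_7 = Fraction(sum(a + b == 7 for a, b in outcomes), len(outcomes))
print(p_sum_7)        # 1/6
```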

  4. Dimension Reduction via Unsupervised Learning Yields Significant Computational Improvements for Support Vector Machine Based Protein Family Classification.

    SciTech Connect

    Webb-Robertson, Bobbie-Jo M.; Matzke, Melissa M.; Oehmen, Christopher S.

    2009-02-26

    Reducing the dimension of vectors used in training support vector machines (SVMs) results in a proportional speedup in training time. For large-scale problems this can make the difference between tractable and intractable training tasks. However, it is critical that classifiers trained on reduced datasets perform as reliably as their counterparts trained on high-dimensional data. We assessed principal component analysis (PCA) and sequential projection pursuit (SPP) as dimension reduction strategies in the biology application of classifying proteins into well-defined functional ‘families’ (SVM-based protein family classification) by their impact on run-time, sensitivity and selectivity. Homology vectors of 4352 elements were reduced to approximately 2% of the original data size without significantly affecting accuracy using PCA and SPP, while leading to approximately a 28-fold speedup in run-time.
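    A minimal sketch of the PCA-based variant of this strategy is shown below: project the high-dimensional vectors onto a small number of principal components and time the SVM training before and after. The synthetic data, the 90-component target (roughly 2% of 4352), and the scikit-learn estimators are stand-ins for the paper's setup; SPP is not sketched.

```python
import time
import numpy as np
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.svm import SVC

X, y = make_classification(n_samples=800, n_features=4352, n_informative=60,
                           random_state=0)

t0 = time.perf_counter()
SVC(kernel="linear").fit(X, y)                        # full-dimensional training
full_time = time.perf_counter() - t0

X_red = PCA(n_components=90).fit_transform(X)         # ~2% of the original dimension
t0 = time.perf_counter()
SVC(kernel="linear").fit(X_red, y)                    # reduced-dimensional training
print(f"speedup ~ {full_time / (time.perf_counter() - t0):.1f}x")
```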

  5. Assessment of the probability of contaminating Mars

    NASA Technical Reports Server (NTRS)

    Judd, B. R.; North, D. W.; Pezier, J. P.

    1974-01-01

    New methodology is proposed to assess the probability that the planet Mars will be biologically contaminated by terrestrial microorganisms aboard a spacecraft. Present NASA methods are based on the Sagan-Coleman formula, which states that the probability of contamination is the product of the expected microbial release and a probability of growth. The proposed new methodology extends the Sagan-Coleman approach to permit utilization of detailed information on microbial characteristics, the lethality of release and transport mechanisms, and of other information about the Martian environment. Three different types of microbial release are distinguished in the model for assessing the probability of contamination. The number of viable microbes released by each mechanism depends on the bio-burden in various locations on the spacecraft and on whether the spacecraft landing is accomplished according to plan. For each of the three release mechanisms a probability of growth is computed, using a model for transport into an environment suited to microbial growth.

  6. Multinomial mixture model with heterogeneous classification probabilities

    USGS Publications Warehouse

    Holland, M.D.; Gray, B.R.

    2011-01-01

    Royle and Link (Ecology 86(9):2505-2512, 2005) proposed an analytical method that allowed estimation of multinomial distribution parameters and classification probabilities from categorical data measured with error. While useful, we demonstrate algebraically and by simulations that this method yields biased multinomial parameter estimates when the probabilities of correct category classifications vary among sampling units. We address this shortcoming by treating these probabilities as logit-normal random variables within a Bayesian framework. We use Markov chain Monte Carlo to compute Bayes estimates from a simulated sample from the posterior distribution. Based on simulations, this elaborated Royle-Link model yields nearly unbiased estimates of multinomial parameters and correct classification probabilities when classification probabilities are allowed to vary according to the normal distribution on the logit scale or according to the Beta distribution. The method is illustrated using categorical submersed aquatic vegetation data. © 2010 Springer Science+Business Media, LLC.
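    The heterogeneity problem described above can be reproduced with a short simulation: unit-specific correct-classification probabilities drawn as logit-normal random variables distort the pooled category proportions. The parameter values and the uniform-misclassification assumption below are illustrative, and the Bayesian estimation step itself is not sketched.

```python
import numpy as np

rng = np.random.default_rng(0)
true_props = np.array([0.5, 0.3, 0.2])     # multinomial parameters of interest
n_units, n_per_unit = 40, 100
mu_logit, sd_logit = 2.0, 0.8              # logit-scale mean/sd of correct classification

counts = np.zeros((n_units, 3), dtype=int)
for u in range(n_units):
    # Unit-specific correct-classification probability (logit-normal)
    p_correct = 1.0 / (1.0 + np.exp(-rng.normal(mu_logit, sd_logit)))
    truth = rng.choice(3, size=n_per_unit, p=true_props)
    # Misclassified observations land uniformly on one of the other two categories
    wrong = rng.random(n_per_unit) > p_correct
    observed = np.where(wrong, (truth + rng.integers(1, 3, n_per_unit)) % 3, truth)
    counts[u] = np.bincount(observed, minlength=3)

# A naive pooled estimate ignores the heterogeneity and is biased for true_props
print(np.round(counts.sum(axis=0) / counts.sum(), 3))
```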

  7. Weighted Feature Significance: A Simple, Interpretable Model of Compound Toxicity Based on the Statistical Enrichment of Structural Features

    PubMed Central

    Huang, Ruili; Southall, Noel; Xia, Menghang; Cho, Ming-Hsuang; Jadhav, Ajit; Nguyen, Dac-Trung; Inglese, James; Tice, Raymond R.; Austin, Christopher P.

    2009-01-01

    In support of the U.S. Tox21 program, we have developed a simple and chemically intuitive model we call weighted feature significance (WFS) to predict the toxicological activity of compounds, based on the statistical enrichment of structural features in toxic compounds. We trained and tested the model on the following: (1) data from quantitative high–throughput screening cytotoxicity and caspase activation assays conducted at the National Institutes of Health Chemical Genomics Center, (2) data from Salmonella typhimurium reverse mutagenicity assays conducted by the U.S. National Toxicology Program, and (3) hepatotoxicity data published in the Registry of Toxic Effects of Chemical Substances. Enrichments of structural features in toxic compounds are evaluated for their statistical significance and compiled into a simple additive model of toxicity and then used to score new compounds for potential toxicity. The predictive power of the model for cytotoxicity was validated using an independent set of compounds from the U.S. Environmental Protection Agency tested also at the National Institutes of Health Chemical Genomics Center. We compared the performance of our WFS approach with classical classification methods such as Naive Bayesian clustering and support vector machines. In most test cases, WFS showed similar or slightly better predictive power, especially in the prediction of hepatotoxic compounds, where WFS appeared to have the best performance among the three methods. The new algorithm has the important advantages of simplicity, power, interpretability, and ease of implementation. PMID:19805409
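    A rough sketch of the enrichment-then-additive-scoring idea is given below, using Fisher's exact test as the enrichment statistic and a sum of -log10 p-values over significant features as the compound score. The feature matrix is random placeholder data and the thresholds are arbitrary; the published WFS model differs in its details.

```python
import numpy as np
from scipy.stats import fisher_exact

rng = np.random.default_rng(0)
n_compounds, n_features = 400, 60
X = rng.integers(0, 2, size=(n_compounds, n_features))   # structural feature present/absent
toxic = rng.integers(0, 2, size=n_compounds).astype(bool)

weights = np.zeros(n_features)
for j in range(n_features):
    a = np.sum(X[toxic, j] == 1);  b = np.sum(X[toxic, j] == 0)
    c = np.sum(X[~toxic, j] == 1); d = np.sum(X[~toxic, j] == 0)
    odds, p = fisher_exact([[a, b], [c, d]], alternative="greater")  # enriched in toxic?
    if p < 0.05:                                          # keep significantly enriched features
        weights[j] = -np.log10(p)

new_compound = rng.integers(0, 2, size=n_features)
print("toxicity score:", float(new_compound @ weights))   # simple additive score
```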

  8. What if the Electrical Conductivity of Graphene Is Significantly Deteriorated for the Graphene-Semiconductor Composite-Based Photocatalysis?

    PubMed

    Weng, Bo; Xu, Yi-Jun

    2015-12-23

    The extraordinary electrical conductivity of graphene has been widely treated in the literature as the standard explanation for the activity enhancement of graphene-semiconductor composite photocatalysts. However, from the viewpoint of an entire composite-based artificial photosynthetic system, the photocatalytic performance of a graphene-semiconductor composite is not simply a matter of the excellent electrical conductivity of graphene. Herein, the intentional design of melamine resin monomer-functionalized three-dimensional (3D) graphene (denoted as MRGO) with significantly deteriorated electrical conductivity enables us to independently focus on studying the geometry effect of MRGO on the photocatalytic performance of a graphene-semiconductor composite. By coupling the semiconductor CdS with graphene, including MRGO and reduced graphene oxide (RGO), it was found that the CdS-MRGO composites exhibit much higher visible light photoactivity than the CdS-RGO composites, although the electrical conductivity of MRGO is much lower than that of RGO. The comparison characterizations evidence that this photoactivity enhancement is predominantly attributed to the restacking-inhibited 3D architectural morphology of MRGO, by which the synergistic effects of boosted separation and transportation of photogenerated charge carriers and increased adsorption capacity can be achieved. Our work highlights that the photocatalytic performance of a graphene-semiconductor composite is not a simple issue of how to harness the electrical conductivity of graphene but rather one of the rational ensemble design of the composite, which includes the integrative optimization of the geometrical and electrical factors of the individual components and the interface composition. PMID:26624808

  9. Combined Statistical Analyses of Peptide Intensities and Peptide Occurrences Improves Identification of Significant Peptides from MS-based Proteomics Data

    SciTech Connect

    Webb-Robertson, Bobbie-Jo M.; McCue, Lee Ann; Waters, Katrina M.; Matzke, Melissa M.; Jacobs, Jon M.; Metz, Thomas O.; Varnum, Susan M.; Pounds, Joel G.

    2010-11-01

    Liquid chromatography-mass spectrometry-based (LC-MS) proteomics uses peak intensities of proteolytic peptides to infer the differential abundance of peptides/proteins. However, substantial run-to-run variability in peptide intensities and observations (presence/absence) of peptides makes data analysis quite challenging. The missing abundance values in LC-MS proteomics data are difficult to address with traditional imputation-based approaches because the mechanisms by which data are missing are unknown a priori. Data can be missing due to random mechanisms such as experimental error, or non-random mechanisms such as a true biological effect. We present a statistical approach that uses a test of independence known as a G-test to test the null hypothesis of independence between the number of missing values and the experimental groups. We pair the G-test results evaluating independence of missing data (IMD) with a standard analysis of variance (ANOVA) that uses only means and variances computed from the observed data. Each peptide is therefore represented by two statistical confidence metrics, one for qualitative differential observation and one for quantitative differential intensity. We use two simulated and two real LC-MS datasets to demonstrate the robustness and sensitivity of the ANOVA-IMD approach for assigning confidence to peptides with significant differential abundance among experimental groups.
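    For a single peptide, the paired qualitative/quantitative testing described above can be sketched as follows, using scipy's log-likelihood-ratio option of chi2_contingency as the G-test and a one-way ANOVA on the observed intensities. The intensity values are illustrative, with NaN marking missing observations; this is not the authors' implementation.

```python
import numpy as np
from scipy.stats import chi2_contingency, f_oneway

group_a = np.array([21.3, 20.9, np.nan, 21.6, np.nan, 21.1])
group_b = np.array([22.4, 22.8, 22.1, np.nan, 22.6, 22.9])

# Qualitative test (IMD): counts of observed vs. missing values per group
table = [[np.sum(~np.isnan(group_a)), np.sum(np.isnan(group_a))],
         [np.sum(~np.isnan(group_b)), np.sum(np.isnan(group_b))]]
g_stat, p_imd, _, _ = chi2_contingency(table, lambda_="log-likelihood")

# Quantitative test: ANOVA on the observed intensities only
f_stat, p_anova = f_oneway(group_a[~np.isnan(group_a)], group_b[~np.isnan(group_b)])

print(f"G-test p = {p_imd:.3f}, ANOVA p = {p_anova:.3f}")
```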

  10. UQ for Decision Making: How (at least five) Kinds of Probability Might Come Into Play

    NASA Astrophysics Data System (ADS)

    Smith, L. A.

    2013-12-01

    In 1959 I. J. Good published the discussion "Kinds of Probability" in Science. Good identified (at least) five kinds. The need for (at least) a sixth kind of probability when quantifying uncertainty in the context of climate science is discussed. This discussion brings out the differences between weather-like forecasting tasks and climate-like tasks, with a focus on the effective use both of science and of modelling in support of decision making. Good also introduced the idea of a "Dynamic probability": a probability one expects to change without any additional empirical evidence, the probabilities assigned by a chess-playing program when it is only halfway through its analysis being an example. This case is contrasted with that of "Mature probabilities", where a forecast algorithm (or model) has converged on its asymptotic probabilities and the question hinges on whether or not those probabilities are expected to change significantly before the event in question occurs, even in the absence of new empirical evidence. If so, then how might one report and deploy such immature probabilities rationally in scientific support of decision-making? Mature Probability is suggested as a useful sixth kind; although Good would doubtless argue that we can get by with just one, effective communication with decision makers may be enhanced by speaking as if the others existed. This again highlights the distinction between weather-like contexts and climate-like contexts. In the former context one has access to a relevant climatology (a relevant, arguably informative distribution prior to any model simulations); in the latter context that information is not available, although one can fall back on the scientific basis upon which the model itself rests and estimate the probability that the model output is in fact misinformative. This subjective "probability of a big surprise" is one way to communicate the probability of model-based information holding in practice, the probability that the

  11. Using probability-based spatial estimation of the river pollution index to assess urban water recreational quality in the Tamsui River watershed.

    PubMed

    Jang, Cheng-Shin

    2016-01-01

    The Tamsui River watershed situated in Northern Taiwan provides a variety of water recreational opportunities such as riverbank park activities, fishing, cruising, rowing, sailing, and swimming. However, river water quality strongly affects water recreational quality. Moreover, the health of recreationists who are partially or fully exposed to polluted river water may be jeopardized. A river pollution index (RPI) composed of dissolved oxygen, biochemical oxygen demand, suspended solids, and ammonia nitrogen is typically used to gauge the river water quality and regulate the water body use in Taiwan. The purpose of this study was to probabilistically determine the RPI categories in the Tamsui River watershed and to assess the urban water recreational quality on the basis of the estimated RPI categories. First, according to various RPI categories, one-dimensional indicator kriging (IK) was adopted to estimate the occurrence probabilities of the RPI categories. The maximum occurrence probability among the categories was then employed to determine the most suitable RPI category. Finally, the most serious categories and seasonal variations of RPI were adopted to evaluate the quality of current water recreational opportunities in the Tamsui River watershed. The results revealed that the midstream and downstream sections of the Tamsui River and its tributaries with poor river water quality afford low water recreational quality, and water recreationists should avoid full or limited exposure to these bodies of water. However, the upstream sections of the Tamsui River watershed with high river water quality are suitable for all water recreational activities. PMID:26676412

  12. Significant Treasures.

    ERIC Educational Resources Information Center

    Andrews, Ian A.

    1999-01-01

    Provides a crossword puzzle with an answer key corresponding to the book entitled "Significant Treasures/Tresors Parlants" that is filled with color and black-and-white prints of paintings and artifacts from 131 museums and art galleries as a sampling of the 2,200 such Canadian institutions. (CMK)

  13. A Tale of Two Probabilities

    ERIC Educational Resources Information Center

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  14. The Probability of Causal Conditionals

    ERIC Educational Resources Information Center

    Over, David E.; Hadjichristidis, Constantinos; Evans, Jonathan St. B. T.; Handley, Simon J.; Sloman, Steven A.

    2007-01-01

    Conditionals in natural language are central to reasoning and decision making. A theoretical proposal called the Ramsey test implies the conditional probability hypothesis: that the subjective probability of a natural language conditional, P(if p then q), is the conditional subjective probability, P(q [such that] p). We report three experiments on…

  15. Determination of the compound nucleus survival probability Psurv for various "hot" fusion reactions based on the dynamical cluster-decay model

    NASA Astrophysics Data System (ADS)

    Chopra, Sahila; Kaur, Arshdeep; Gupta, Raj K.

    2015-03-01

    Following our recent successful attempt to define and determine the compound nucleus (CN) fusion/formation probability PCN within the dynamical cluster-decay model (DCM), we introduce and estimate here, for the first time, the survival probability Psurv of the CN against fission, again within the DCM. Calculated as a dynamical fragmentation process, Psurv is defined as the ratio of the evaporation residue (ER) cross section σER to the sum of σER and the fusion-fission (ff) cross section σff, i.e., the CN formation cross section σCN, where each contributing fragmentation cross section is determined in terms of its formation and barrier penetration probabilities P0 and P. In the DCM, deformations up to hexadecapole and "compact" orientations for both in-plane (coplanar) and out-of-plane (noncoplanar) configurations are allowed. Some 16 "hot" fusion reactions, forming CN of mass number ACN ~ 100 up to superheavy nuclei, are analyzed for various different nuclear interaction potentials, and the dependence of Psurv on CN excitation energy E*, fissility parameter χ, CN mass ACN, and Coulomb parameter Z1Z2 is investigated. An interesting result is that three groups, namely weakly fissioning, radioactive, and strongly fissioning superheavy nuclei, are identified with Psurv ~ 1, ~10⁻⁶, and ~10⁻¹⁰, respectively. For the weakly fissioning group (100
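    Restating the definition used above in equation form (a clarifying restatement of the abstract, with σCN the CN formation cross section):

```latex
P_{\mathrm{surv}}
  = \frac{\sigma_{\mathrm{ER}}}{\sigma_{\mathrm{ER}} + \sigma_{\mathrm{ff}}}
  = \frac{\sigma_{\mathrm{ER}}}{\sigma_{\mathrm{CN}}},
\qquad
\sigma_{\mathrm{CN}} = \sigma_{\mathrm{ER}} + \sigma_{\mathrm{ff}},
```

    with each contributing fragmentation cross section built, as stated above, from its formation probability P0 and barrier penetration probability P.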

  16. A commercial PCV2a-based vaccine significantly reduces PCV2b transmission in experimental conditions.

    PubMed

    Rose, N; Andraud, M; Bigault, L; Jestin, A; Grasland, B

    2016-07-19

    Transmission characteristics of PCV2 have been compared between vaccinated and non-vaccinated pigs in experimental conditions. Twenty-four Specific Pathogen Free (SPF) piglets, vaccinated against PCV2 at 3 weeks of age (PCV2a recombinant CAP protein-based vaccine), were inoculated at 15 days post-vaccination with a PCV2b inoculum (6×10⁵ TCID50), and put in contact with 24 vaccinated SPF piglets for 42 days post-inoculation. Those piglets were distributed into six replicates of a contact trial, each involving 4 inoculated piglets mingled with 4 susceptible SPF piglets. Two replicates of a similar contact trial were made with non-vaccinated pigs. Non-vaccinated animals received a placebo at vaccination time and were inoculated in the same way and at the same time as the vaccinated group. All the animals were monitored twice weekly using quantitative real-time PCR and ELISA serology until 42 days post-inoculation. The frequency of infection and the PCV2 genome load in sera of the vaccinated pigs were significantly reduced compared to the non-vaccinated animals. The duration of infectiousness was significantly different between vaccinated and non-vaccinated groups (16.6 days [14.7; 18.4] and 26.6 days [22.9; 30.4], respectively). The transmission rate was also considerably decreased in vaccinated pigs (β = 0.09 [0.05-0.14] compared to β = 0.19 [0.11-0.32] in non-vaccinated pigs). This led to an estimated reproduction ratio of 1.5 [95% CI 0.8-2.2] in vaccinated animals versus 5.1 [95% CI 2.5-8.2] in non-vaccinated pigs when merging data of this experiment with previous trials carried out under the same conditions. PMID:27318416

  17. Community-based tsetse fly control significantly reduces fly density and trypanosomosis prevalence in Metekel Zone, Northwest, Ethiopia.

    PubMed

    Girmay, Gebrerufael; Arega, Bezna; Tesfaye, Dawit; Berkvens, Dirk; Muleta, Gadisa; Asefa, Getnet

    2016-03-01

    African animal trypanosomosis is a great obstacle to livestock production, and tsetse flies play a major role in its transmission. Metekel zone is among the tsetse-infested areas. Community-based tsetse fly and trypanosomosis control using targets was conducted from June 2011 to May 2012 in Metekel zone, Ethiopia, to decrease trypanosomosis prevalence and tsetse fly density. Cloth screen targets were developed, impregnated with 0.1 % deltamethrin, and deployed alongside rivers by the research team together with the community animal health workers. Monthly parasitological and entomological data were collected, processed, and compared with similar data collected before control. The overall average tsetse fly (Glossina tachinoides) density decreased from 1.13 to 0.18 fly/trap/day after control. The density decreased in all sites, with no significant difference among the sites. However, larger decrements, by factors of more than 12 and 6, were observed in the dry and late dry seasons, respectively. The reduction in the overall apparent prevalence of trypanosomosis caused by Trypanosoma congolense, Trypanosoma brucei, and Trypanosoma vivax, from 12.14 % before to 3.61 % after control, coincides with the tsetse fly reduction. In all the study sites, a significant reduction was observed between the periods before and after control. The largest decrement was observed in the late dry season, when the apparent prevalence was reduced from 7.89 to 1.17 % before and after control, respectively. As this approach is simple, cost-effective, and appropriate for riverine tsetse species, we recommend that it be scaled up to other similar places. PMID:26885985

  18. [Clinical significance of proteinuria and renal function: findings from a population-based cohort, the Takahata study].

    PubMed

    Konta, Tsuneo

    2013-07-01

    Proteinuria/albuminuria and renal insufficiency are major components of chronic kidney disease (CKD) and are strongly associated with end-stage renal disease, cardiovascular events, and premature death. To clarify the prevalence of these renal disorders and their association with mortality in the Japanese population, we conducted a community-based longitudinal study. This study included 3,445 registered Japanese subjects, with a 7-year follow-up. Proteinuria/albuminuria was evaluated using dipstick strips and the urinary protein/albumin creatinine ratio (PCR/ACR). Glomerular filtration rate (GFR) was estimated using the equation for Japanese subjects. The prevalence of dipstick proteinuria, proteinuria (PCR ≥ 0.15 g/gCr), albuminuria (ACR ≥ 30 mg/gCr), and renal insufficiency (estimated GFR < 60 ml/min/1.73 m²) was 5%, 8%, 15%, and 7%, respectively. The overlap between urinary abnormality and renal insufficiency was small. The prevalence of proteinuria/albuminuria increased with blood pressure, 24-hour urinary sodium excretion, HbA1c, and the number of components of metabolic syndrome. Kaplan-Meier analysis showed that all-cause mortality increased with urinary albumin excretion, and subjects with albuminuria showed a significantly higher mortality rate than those without albuminuria. Cox proportional hazards analysis adjusting for possible confounders showed that albuminuria was an independent risk factor for all-cause and cardiovascular mortality. In conclusion, proteinuria/albuminuria and renal insufficiency are prevalent and independently associated with mortality in the Japanese general population. The detection of renal disorders at the earliest opportunity is important to prevent premature death. PMID:24205706

  19. A significant carbon sink in temperate forests in Beijing: based on 20-year field measurements in three stands

    NASA Astrophysics Data System (ADS)

    Zhu, J.

    2015-12-01

    Numerous efforts have been made to characterize forest carbon (C) cycles and stocks. However, long-term, observation-based quantification of each component of the forest C cycle and its change is still lacking. We measured C stocks and fluxes in three permanent temperate forest plots (birch, oak and pine) during 2011-2014 and calculated the changes in the C cycle components relative to measurements made during 1992-1994 at Mt. Dongling, Beijing, China. Our results showed that forest net primary production in the birch, oak and pine plots was 5.32, 4.53 and 6.73 Mg C ha-1 yr-1, respectively. The corresponding net ecosystem production was 0.12, 0.43 and 3.53 Mg C ha-1 yr-1. The C stocks and fluxes in 2011-2014 were significantly larger than those in 1992-1994: the biomass C densities in the birch, oak and pine plots increased from 50.0, 37.7 and 54.0 Mg C ha-1 in 1994 to 101.5, 77.3 and 110.9 Mg C ha-1 in 2014; soil organic C densities increased from 207.0, 239.1 and 231.7 Mg C ha-1 to 214.8, 241.7 and 238.4 Mg C ha-1; and soil heterotrophic respiration increased from 2.78, 3.49 and 1.81 Mg C ha-1 yr-1 to 5.20, 4.10 and 3.20 Mg C ha-1 yr-1. These results suggest that these mountainous temperate forest ecosystems have acted as a carbon sink over the past two decades. These observations of C densities and fluxes provide field-based data for the long-term study of C cycling in temperate forest ecosystems.
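
    The reported net ecosystem production values are consistent with the identity NEP = NPP − Rh (net primary production minus soil heterotrophic respiration). A minimal sketch of that arithmetic check, using only the 2011-2014 values quoted above:

        # NEP = NPP - Rh, using the 2011-2014 values quoted in the abstract
        # (all fluxes in Mg C ha-1 yr-1).
        plots = {
            "birch": {"npp": 5.32, "rh": 5.20},
            "oak":   {"npp": 4.53, "rh": 4.10},
            "pine":  {"npp": 6.73, "rh": 3.20},
        }
        for name, p in plots.items():
            nep = p["npp"] - p["rh"]
            print(f"{name}: NEP = {nep:.2f}")
        # birch: NEP = 0.12, oak: NEP = 0.43, pine: NEP = 3.53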

  20. Quantum probability and many worlds

    NASA Astrophysics Data System (ADS)

    Hemmo, Meir; Pitowsky, Itamar

    We discuss the meaning of probabilities in the many worlds interpretation of quantum mechanics. We start by presenting very briefly the many worlds theory, how the problem of probability arises, and some unsuccessful attempts to solve it in the past. Then we criticize a recent attempt by Deutsch to derive the quantum mechanical probabilities from the non-probabilistic parts of quantum mechanics and classical decision theory. We further argue that the Born probability does not make sense even as an additional probability rule in the many worlds theory. Our conclusion is that the many worlds theory fails to account for the probabilistic statements of standard (collapse) quantum mechanics.

  1. Symptomatic Chiari malformation in adults: a new classification based on magnetic resonance imaging with clinical and prognostic significance.

    PubMed

    Pillay, P K; Awad, I A; Little, J R; Hahn, J F

    1991-05-01

    Thirty-five consecutive adults with Chiari malformation and progressive symptoms underwent surgical treatment at a single institution over a 3-year period. All patients underwent magnetic resonance imaging scan before and after surgery. Images of the craniovertebral junction confirmed tonsillar herniation in all cases and allowed the definition of two anatomically distinct categories of the Chiari malformation in this age group. Twenty of the 35 patients had concomitant syringomyelia and were classified as Type A. The remaining 15 patients had evidence of frank herniation of the brain stem below the foramen magnum without evidence of syringomyelia and were labeled Type B. Type A patients had a predominant central cord symptomatology; Type B patients exhibited signs and symptoms of brain stem or cerebellar compression. The principal surgical procedure consisted of decompression of the foramen magnum, opening of the fourth ventricular outlet, and plugging of the obex. Significant improvement in preoperative symptoms and signs was observed in 9 of the 20 patients (45%) with syringomyelia (Type A), as compared to 13 of the 15 patients (87%) without syringomyelia (Type B). Postoperative reduction in syrinx volume was observed in 11 of the 20 patients with syringomyelia, including all 9 patients with excellent results. Magnetic resonance imaging has allowed a classification of the Chiari malformation in adults based on objective anatomic criteria, with clinical and prognostic relevance. The presence of syringomyelia implies a less favorable response to surgical intervention. PMID:1876240

  2. Seismicity alert probabilities at Parkfield, California, revisited

    USGS Publications Warehouse

    Michael, A.J.; Jones, L.M.

    1998-01-01

    For a decade, the US Geological Survey has used the Parkfield Earthquake Prediction Experiment scenario document to estimate the probability that earthquakes observed on the San Andreas fault near Parkfield will turn out to be foreshocks followed by the expected magnitude six mainshock. During this time, we have learned much about the seismogenic process at Parkfield, about the long-term probability of the Parkfield mainshock, and about the estimation of these types of probabilities. The probabilities for potential foreshocks at Parkfield are reexamined and revised in light of these advances. As part of this process, we have confirmed both the rate of foreshocks before strike-slip earthquakes in the San Andreas physiographic province and the uniform distribution of foreshocks with magnitude proposed by earlier studies. Compared to the earlier assessment, these new estimates of the long-term probability of the Parkfield mainshock are lower, our estimate of the rate of background seismicity is higher, and we find that the assumption that foreshocks at Parkfield occur in a unique way is not statistically significant at the 95% confidence level. While the exact numbers vary depending on the assumptions that are made, the new alert probabilities are lower than previously estimated. Considering the various assumptions and the statistical uncertainties in the input parameters, we also compute a plausible range for the probabilities. The range is large, partly due to the extra knowledge that exists for the Parkfield segment, making us question the usefulness of these numbers.
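
    Conceptually, an alert probability of this kind combines the long-term mainshock probability, the chance that a mainshock is preceded by a qualifying foreshock, and the background rate of similar earthquakes through Bayes' rule. The sketch below shows only that schematic combination; every number is a hypothetical placeholder, and it does not reproduce the methodology or the estimates of this study.

        # Illustrative Bayes-rule combination for a foreshock-based alert
        # probability. All numbers below are hypothetical placeholders, not
        # values from the Parkfield study.

        p_mainshock = 0.05        # assumed long-term probability of the mainshock in the alert window
        p_fore_given_main = 0.5   # assumed chance the mainshock is preceded by a qualifying foreshock
        rate_background = 0.2     # assumed probability of a similar background event in the same window

        # P(mainshock follows | candidate event observed)
        numerator = p_fore_given_main * p_mainshock
        denominator = numerator + rate_background * (1.0 - p_mainshock)
        alert_probability = numerator / denominator
        print(f"alert probability ~ {alert_probability:.2f}")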

  3. Associativity and normative credal probability.

    PubMed

    Snow, P

    2002-01-01

    Cox's Theorem is a widely cited motivation for probabilistic models of uncertain belief. The theorem relates the associativity of the logical connectives to that of the arithmetic operations of probability. Recent questions about the correctness of Cox's Theorem have been resolved, but there are new questions about one functional equation used by Cox in 1946. This equation is missing from his later work. Advances in knowledge since 1946 and changes in Cox's research interests explain the equation's disappearance. Other associativity-based motivations avoid functional equations altogether, and so may be more transparently applied to finite domains and discrete beliefs. A discrete counterpart of Cox's Theorem can be assembled from results that have been in the literature since 1959. PMID:18238098

  4. Characteristic length of the knotting probability revisited

    NASA Astrophysics Data System (ADS)

    Uehara, Erica; Deguchi, Tetsuo

    2015-09-01

    We present a self-avoiding polygon (SAP) model for circular DNA in which the radius of impermeable cylindrical segments corresponds to the screening length of double-stranded DNA surrounded by counter ions. For the model we evaluate the probability for a generated SAP with N segments having a given knot K through simulation. We call it the knotting probability of a knot K with N segments for the SAP model. We show that when N is large the most significant factor in the knotting probability is given by the exponentially decaying part exp(-N/N_K), where the estimates of the parameter N_K are consistent with the same value for all the different knots we investigated. We thus call it the characteristic length of the knotting probability. We give formulae expressing the characteristic length as a function of the cylindrical radius r_ex, i.e. the screening length of double-stranded DNA.
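
    When the decay exp(-N/N_K) dominates, the characteristic length can be estimated from the slope of log P_K(N) against N. A minimal sketch of that estimate, using synthetic knotting probabilities generated with a known N_K purely for illustration:

        import numpy as np

        # Synthetic knotting probabilities following P_K(N) ~ C * exp(-N / N_K);
        # the values below are made up for illustration (true N_K = 300).
        N = np.array([400, 600, 800, 1000, 1200], dtype=float)
        P = 0.5 * np.exp(-N / 300.0)

        # For large N the dominant factor is exp(-N / N_K), so a straight-line
        # fit of log P against N recovers -1 / N_K as the slope.
        slope, intercept = np.polyfit(N, np.log(P), 1)
        print(f"estimated characteristic length N_K ~ {-1.0 / slope:.0f}")  # ~300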

  5. Liquefaction probability curves for surficial geologic deposits

    USGS Publications Warehouse

    Holzer, Thomas L.; Noce, Thomas E.; Bennett, Michael J.

    2011-01-01

    Liquefaction probability curves that predict the probability of surface manifestations of earthquake-induced liquefaction are developed for 14 different types of surficial geologic units. The units consist of alluvial fan, beach ridge, river delta topset and foreset beds, eolian dune, point bar, flood basin, natural river and alluvial fan levees, abandoned river channel, deep-water lake, lagoonal, sandy artificial fill, and valley train deposits. Probability is conditioned on earthquake magnitude and peak ground acceleration. Curves are developed for water table depths of 1.5 and 5.0 m. Probabilities are derived from complementary cumulative frequency distributions of the liquefaction potential index (LPI) that were computed from 927 cone penetration tests. For natural deposits with a water table at 1.5 m and subjected to a M7.5 earthquake with peak ground acceleration (PGA) = 0.25g, probabilities reach 0.5 for beach ridge, point bar, and deltaic deposits. The curves also were used to assign ranges of liquefaction probabilities to the susceptibility categories proposed previously for different geologic deposits. For the earthquake described here, probabilities for susceptibility categories have ranges of 0–0.08 for low, 0.09–0.30 for moderate, 0.31–0.62 for high, and 0.63–1.00 for very high. Retrospective predictions of liquefaction during historical earthquakes based on the curves compare favorably to observations.

  6. Predicting loss exceedance probabilities for US hurricane landfalls

    NASA Astrophysics Data System (ADS)

    Murnane, R.

    2003-04-01

    The extreme winds, rains, and floods produced by landfalling hurricanes kill and injure many people and cause severe economic losses. Many business, planning, and emergency management decisions are based on the probability of hurricane landfall and associated emergency management considerations; however, the expected total economic and insured losses also are important. Insured losses generally are assumed to be half the total economic loss from hurricanes in the United States. Here I describe a simple model that can be used to estimate deterministic and probabilistic exceedance probabilities for insured losses associated with landfalling hurricanes along the US coastline. The model combines wind speed exceedance probabilities with loss records from historical hurricanes striking land. The wind speed exceedance probabilities are based on the HURDAT best track data and use the storm’s maximum sustained wind just prior to landfall. The loss records are normalized to present-day values using a risk model and account for historical changes in inflation, population, housing stock, and other factors. Analysis of the correlation between normalized losses and a number of storm-related parameters suggests that the most relevant, statistically-significant predictor for insured loss is the storm’s maximum sustained wind at landfall. Insured loss exceedance probabilities thus are estimated using a linear relationship between the log of the maximum sustained winds and normalized insured loss. Model estimates for insured losses from Hurricanes Isidore (US$45 million) and Lili (US$275 million) compare well with loss estimates from more sophisticated risk models and recorded losses. The model can also be used to estimate how exceedance probabilities for insured loss vary as a function of the North Atlantic Oscillation and the El Niño-Southern Oscillation.
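
    A minimal sketch of the loss relationship described above, fitting normalized insured loss against the log of the landfall wind speed; the storm records below are made-up placeholders, and the sketch omits the loss normalization and the combination with wind speed exceedance probabilities.

        import numpy as np

        # Hypothetical (made-up) historical records: maximum sustained wind at
        # landfall (knots) and normalized insured loss (billion US$). These are
        # placeholders, not data from the study.
        wind_kt = np.array([75, 90, 105, 120, 135], dtype=float)
        loss_bn = np.array([0.3, 1.0, 3.5, 9.0, 22.0])

        # Following the abstract, fit a linear relationship between the log of
        # the landfall wind speed and normalized insured loss.
        slope, intercept = np.polyfit(np.log(wind_kt), loss_bn, 1)

        def predicted_loss(wind):
            """Point estimate of normalized insured loss for a given landfall wind."""
            return slope * np.log(wind) + intercept

        print(f"predicted loss for a 110-kt landfall: {predicted_loss(110.0):.1f} bn US$")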

  7. Immunogenic Cell Death Induced by Ginsenoside Rg3: Significance in Dendritic Cell-based Anti-tumor Immunotherapy

    PubMed Central

    Son, Keum-joo; Choi, Ki ryung; Lee, Seog Jae

    2016-01-01

    Cancer is one of the leading causes of morbidity and mortality worldwide; therefore there is a need to discover new therapeutic modules with improved efficacy and safety. Immune (cell) therapy is a promising therapeutic strategy for the treatment of intractable cancers. The effectiveness of certain chemotherapeutics in inducing immunogenic tumor cell death, thus promoting cancer eradication, has been reported. Ginsenoside Rg3 is a ginseng saponin that has antitumor and immunomodulatory activity. In this study, we treated tumor cells with Rg3 to verify the significance of inducing immunogenic tumor cell death in antitumor therapy, especially in DC-based immunotherapy. Rg3 killed both immunogenic (B16F10 melanoma) and non-immunogenic (LLC: Lewis lung carcinoma) tumor cells by inducing apoptosis. Surface expression of immunogenic death markers, including calreticulin and heat shock proteins, and the transcription of relevant genes were increased in tumor cells dying after Rg3 treatment. Increased calreticulin expression was directly related to the uptake of dying tumor cells by dendritic cells (DCs): the proportion of CRT+ CD11c+ cells was increased in the Rg3-treated group. Interestingly, tumor cells dying by immunogenic cell death secreted IFN-γ, an effector molecule for antitumor activity in T cells. Along with the Rg3-induced suppression of pro-angiogenic (TNF-α) and immunosuppressive cytokine (TGF-β) secretion, IFN-γ production from the Rg3-treated tumor cells may also point to Rg3 as an effective component of anticancer immunotherapy. The data suggest that Rg3 induces immunogenic tumor cell death through its cytotoxic effect and its ability to promote DC function, indicating that Rg3 may be an effective immunotherapeutic strategy. PMID:26937234

  8. A significant carbon sink in temperate forests in Beijing: based on 20-year field measurements in three stands.

    PubMed

    Zhu, JianXiao; Hu, XueYang; Yao, Hui; Liu, GuoHua; Ji, ChenJun; Fang, JingYun

    2015-11-01

    Numerous efforts have been made to characterize forest carbon (C) cycles and stocks in various ecosystems. However, long-term observation of each component of the forest C cycle is still lacking. We measured C stocks and fluxes in three permanent temperate forest plots (birch, oak and pine forest) during 2011–2014, and calculated the changes of the components of the C cycle relative to measurements made during 1992–1994 at Mt. Dongling, Beijing, China. Forest net primary production in the birch, oak, and pine plots was 5.32, 4.53, and 6.73 Mg C ha-1 a-1, respectively. Corresponding net ecosystem production was 0.12, 0.43, and 3.53 Mg C ha-1 a-1. The C stocks and fluxes in 2011–2014 were significantly larger than those in 1992–1994: the biomass C densities in the birch, oak, and pine plots increased from 50.0, 37.7, and 54.0 Mg C ha-1 in 1994 to 101.5, 77.3, and 110.9 Mg C ha-1 in 2014; soil organic C densities increased from 207.0, 239.1, and 231.7 Mg C ha-1 to 214.8, 241.7, and 238.4 Mg C ha-1; and soil heterotrophic respiration increased from 2.78, 3.49, and 1.81 Mg C ha-1 a-1 to 5.20, 4.10, and 3.20 Mg C ha-1 a-1. These results suggest that the mountainous temperate forest ecosystems in Beijing have served as a carbon sink over the last two decades. These observations of C stocks and fluxes provided field-based data for a long-term study of C cycling in temperate forest ecosystems. PMID:26501378

  9. Propensity, Probability, and Quantum Theory

    NASA Astrophysics Data System (ADS)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  10. Dynamic mean field theory for lattice gas models of fluids confined in porous materials: Higher order theory based on the Bethe-Peierls and path probability method approximations

    SciTech Connect

    Edison, John R.; Monson, Peter A.

    2014-07-14

    Recently we have developed a dynamic mean field theory (DMFT) for lattice gas models of fluids in porous materials [P. A. Monson, J. Chem. Phys. 128(8), 084701 (2008)]. The theory can be used to describe the relaxation processes in the approach to equilibrium or metastable states for fluids in pores and is especially useful for studying systems exhibiting adsorption/desorption hysteresis. In this paper we discuss the extension of the theory to higher order by means of the path probability method (PPM) of Kikuchi and co-workers. We show that this leads to a treatment of the dynamics that is consistent with thermodynamics coming from the Bethe-Peierls or Quasi-Chemical approximation for the equilibrium or metastable equilibrium states of the lattice model. We compare the results from the PPM with those from DMFT and from dynamic Monte Carlo simulations. We find that the predictions from PPM are qualitatively similar to those from DMFT but give somewhat improved quantitative accuracy, in part due to the superior treatment of the underlying thermodynamics. This comes at the cost of greater computational expense associated with the larger number of equations that must be solved.

  11. Comparisons of estimates of annual exceedance-probability discharges for small drainage basins in Iowa, based on data through water year 2013

    USGS Publications Warehouse

    Eash, David A.

    2015-01-01

    An examination was conducted to understand why the 1987 single-variable RREs seem to provide better accuracy and less bias than either of the 2013 multi- or single-variable RREs. A comparison of 1-percent annual exceedance-probability regression lines for hydrologic regions 1-4 from the 1987 single-variable RREs and for flood regions 1-3 from the 2013 single-variable RREs indicates that the 1987 single-variable regional-regression lines generally have steeper slopes and lower discharges when compared to the 2013 single-variable regional-regression lines for corresponding areas of Iowa. The combination of the definition of hydrologic regions, the lower discharges, and the steeper slopes of the regression lines associated with the 1987 single-variable RREs seems to provide better accuracy and less bias than the 2013 multi- or single-variable RREs, particularly for drainage areas less than 2 mi2 and also for some drainage areas between 2 and 20 mi2. The 2013 multi- and single-variable RREs are considered to provide better accuracy and less bias for larger drainage areas. Results of this study indicate that additional research is needed to address the curvilinear relation between drainage area and AEPDs for areas of Iowa.

  12. Stimulus probability effects in absolute identification.

    PubMed

    Kent, Christopher; Lamberts, Koen

    2016-05-01

    This study investigated the effect of stimulus presentation probability on accuracy and response times in an absolute identification task. Three schedules of presentation were used to investigate the interaction between presentation probability and stimulus position within the set. Data from individual participants indicated strong effects of presentation probability on both proportion correct and response times. The effects were moderated by the ubiquitous stimulus position effect. The accuracy and response time data were predicted by an exemplar-based model of perceptual cognition (Kent & Lamberts, 2005). The bow in discriminability was also attenuated when presentation probability for middle items was relatively high, an effect that will constrain future model development. The study provides evidence for item-specific learning in absolute identification. Implications for other theories of absolute identification are discussed. PMID:26478959

  13. Neutron initiation probability in fast burst reactor

    SciTech Connect

    Liu, X.; Du, J.; Xie, Q.; Fan, X.

    2012-07-01

    Based on the probability balance of neutron random events in a multiplying system, the four random processes of neutrons in a prompt supercritical state are described, and the equation for the neutron initiation probability W(r,E,Ω,t) is deduced. Under the assumptions of a static, slightly prompt supercritical system and the two-factorial approximation, the formula for the average initiation probability of 'one' neutron is derived, which agrees with the result obtained from the point model. A Monte Carlo simulation using the point model is applied to Godiva-II and CFBR-II, and the simulated one-neutron initiation probability is consistent with the theory: the initiation probabilities of the Godiva-II and CFBR-II burst reactors are 0.00032 and 0.00027, respectively, under ordinary burst operation. (authors)

  14. MSPI False Indication Probability Simulations

    SciTech Connect

    Dana Kelly; Kurt Vedros; Robert Youngblood

    2011-03-01

    This paper examines false indication probabilities in the context of the Mitigating System Performance Index (MSPI), in order to investigate the pros and cons of different approaches to resolving two coupled issues: (1) sensitivity to the prior distribution used in calculating the Bayesian-corrected unreliability contribution to the MSPI, and (2) whether (in a particular plant configuration) to model the fuel oil transfer pump (FOTP) as a separate component, or integrally to its emergency diesel generator (EDG). False indication probabilities were calculated for the following situations: (1) all component reliability parameters at their baseline values, so that the true indication is green, meaning that an indication of white or above would be a false positive; (2) one or more components degraded to the extent that the true indication would be (mid) white, and “false” would be green (negative) or yellow (negative) or red (negative). In key respects, this was the approach taken in NUREG-1753. The prior distributions examined were the constrained noninformative (CNI) prior used currently by the MSPI, a mixture of conjugate priors, the Jeffreys noninformative prior, a nonconjugate log(istic)-normal prior, and the minimally informative prior investigated in (Kelly et al., 2010). The mid-white performance state was set at ΔCDF = 1 × 10⁻⁶/yr. For each simulated time history, a check is made of whether the calculated ΔCDF is above or below 10⁻⁶/yr. If the parameters were at their baseline values and ΔCDF > 10⁻⁶/yr, this is counted as a false positive. Conversely, if one or all of the parameters are set to values corresponding to ΔCDF > 10⁻⁶/yr but that time history’s ΔCDF < 10⁻⁶/yr, this is counted as a false negative indication. The false indication (positive or negative) probability is then estimated as the number of false positive or negative counts divided by the number of time histories (100,000). Results are presented for a set of base case parameter values

  15. Increasing the probability of long-range (1 month) SPI index forecasts based on SL-AV model for the Russian territory.

    NASA Astrophysics Data System (ADS)

    Utkuzova, Dilyara; Khan, Valentina; Donner, Reik

    2016-04-01

    Long-range precipitation predictions can be made with a numerical weather prediction model, but the skill of the raw model output is often low, so post-processing methods are typically used to produce the long-range precipitation forecast. For this purpose the SPI index was used. The SPI index was first examined with statistical techniques: different parameters of the SPI frequency distribution and long-term tendencies were calculated, as well as spatial characteristics indicating drought and wetness propagation. The analysis shows that in recent years the intensity of drought and wetness extremes over Russia has tended to increase, with fewer droughts in the northern regions. Drought propagation over the European territory of Russia is decreasing in June and August and increasing in July; the wetness tendencies are the opposite. For the Asian territory of Russia, drought propagation is significantly increasing in July along with a decreasing wetness trend. A synoptic analysis was then conducted to describe wet and drought events. Synoptic conditions favorable for the formation of wet and drought extremes were identified by comparing synoptic charts with the spatial patterns of SPI. For the synoptic analysis, episodes of extremely wet conditions (6 episodes for the Asian part of Russia, APR, and 7 for the European part of Russia, EPR) and drought (6 episodes for the APR and 6 for the EPR) were classified using A. Katz's typology of weather regimes. For the European part of Russia, extreme drought events are linked to the weather type named "MIXED"; for the Asian part of Russia, to the type "CENTRAL". For the European part of Russia, extreme wet events are associated with the "CENTRAL" type. During wet extreme events linked to the «EASTERN» classification type, the planetary frontal zone is displaced southward by approximately 5-25 degrees relative to its normal climatological position. The SPI field (data was
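
    For context, a 1-month SPI value is commonly obtained by fitting a gamma distribution to accumulated precipitation for a given calendar month and mapping the resulting cumulative probabilities to standard normal quantiles. The sketch below shows that transformation with scipy, using randomly generated totals as a stand-in for observed or model-forecast precipitation and ignoring the usual special handling of zero-precipitation months.

        import numpy as np
        from scipy import stats

        # Stand-in monthly precipitation totals (mm) for one calendar month
        # across many years; replace with observed or model-forecast values.
        rng = np.random.default_rng(0)
        precip = rng.gamma(shape=2.0, scale=30.0, size=50)

        # Fit a gamma distribution to the totals (location fixed at 0),
        # convert each total to a cumulative probability, then map that
        # probability to a standard normal quantile: this quantile is the SPI.
        shape, _, scale = stats.gamma.fit(precip, floc=0)
        cdf = stats.gamma.cdf(precip, shape, loc=0, scale=scale)
        spi = stats.norm.ppf(cdf)

        print(f"SPI for the most recent month: {spi[-1]:.2f}")
        # SPI < -1 indicates drought conditions; SPI > 1 indicates wet conditions.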

  16. The relationship between species detection probability and local extinction probability

    USGS Publications Warehouse

    Alpizar-Jara, R.; Nichols, J.D.; Hines, J.E.; Sauer, J.R.; Pollock, K.H.; Rosenberry, C.S.

    2004-01-01

    In community-level ecological studies, generally not all species present in sampled areas are detected. Many authors have proposed the use of estimation methods that allow detection probabilities that are <1 and that are heterogeneous among species. These methods can also be used to estimate community-dynamic parameters such as species local extinction probability and turnover rates (Nichols et al. Ecol Appl 8:1213-1225; Conserv Biol 12:1390-1398). Here, we present an ad hoc approach to estimating community-level vital rates in the presence of joint heterogeneity of detection probabilities and vital rates. The method consists of partitioning the number of species into two groups using the detection frequencies and then estimating vital rates (e.g., local extinction probabilities) for each group. Estimators from each group are combined in a weighted estimator of vital rates that accounts for the effect of heterogeneity. Using data from the North American Breeding Bird Survey, we computed such estimates and tested the hypothesis that detection probabilities and local extinction probabilities were negatively related. Our analyses support the hypothesis that species detection probability covaries negatively with local probability of extinction and turnover rates. A simulation study was conducted to assess the performance of vital parameter estimators as well as other estimators relevant to questions about heterogeneity, such as coefficient of variation of detection probabilities and proportion of species in each group. Both the weighted estimator suggested in this paper and the original unweighted estimator for local extinction probability performed fairly well and provided no basis for preferring one to the other.
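
    The combination step described above reduces to a weighted average of group-specific estimates. A minimal sketch of that step is given below; the species counts and group-level extinction probabilities are hypothetical placeholders, and the grouping rule and group-level estimators of the actual study are not reproduced.

        # Combine group-specific estimates of a vital rate (e.g., local
        # extinction probability) into a single weighted estimate. The group
        # estimates and species counts below are hypothetical placeholders.

        groups = [
            {"n_species": 30, "extinction_prob": 0.10},  # frequently detected species
            {"n_species": 20, "extinction_prob": 0.25},  # rarely detected species
        ]

        total = sum(g["n_species"] for g in groups)
        weighted_extinction = sum(
            g["n_species"] / total * g["extinction_prob"] for g in groups
        )
        print(f"weighted local extinction probability ~ {weighted_extinction:.3f}")  # 0.16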

  17. The cumulative reaction probability as eigenvalue problem

    NASA Astrophysics Data System (ADS)

    Manthe, Uwe; Miller, William H.

    1993-09-01

    It is shown that the cumulative reaction probability for a chemical reaction can be expressed (absolutely rigorously) as N(E) = Σ_k p_k(E), where {p_k} are the eigenvalues of a certain Hermitian matrix (or operator). The eigenvalues {p_k} all lie between 0 and 1 and thus have the interpretation as probabilities, eigenreaction probabilities which may be thought of as the rigorous generalization of the transmission coefficients for the various states of the activated complex in transition state theory. The eigenreaction probabilities {p_k} can be determined by diagonalizing a matrix that is directly available from the Hamiltonian matrix itself. It is also shown how a very efficient iterative method can be used to determine the eigenreaction probabilities for problems that are too large for a direct diagonalization to be possible. The number of iterations required is much smaller than that of previous methods, approximately the number of eigenreaction probabilities that are significantly different from zero. All of these new ideas are illustrated by application to three model problems: transmission through a one-dimensional (Eckart potential) barrier, the collinear H+H2→H2+H reaction, and the three-dimensional version of this reaction for total angular momentum J=0.
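
    The central statement is that N(E) is the sum of eigenvalues of a Hermitian reaction-probability operator, each eigenvalue lying between 0 and 1. The sketch below illustrates only that numerical structure, with a small made-up Hermitian matrix standing in for the operator that the method constructs from the Hamiltonian and flux operators.

        import numpy as np

        # Stand-in Hermitian matrix playing the role of the cumulative reaction
        # probability operator at a fixed energy E; in the actual method this
        # matrix is built from the Hamiltonian and flux operators.
        rng = np.random.default_rng(1)
        A = rng.normal(size=(4, 4))
        P = A @ A.T                                    # symmetric positive semi-definite
        P = P / (np.linalg.eigvalsh(P).max() + 1e-9)   # rescale so eigenvalues lie in [0, 1]

        eigenreaction_probs = np.linalg.eigvalsh(P)    # the p_k(E)
        N_E = eigenreaction_probs.sum()                # N(E) = sum_k p_k(E)

        print("eigenreaction probabilities:", np.round(eigenreaction_probs, 3))
        print(f"cumulative reaction probability N(E) = {N_E:.3f}")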

  18. Application-dependent Probability Distributions for Offshore Wind Speeds

    NASA Astrophysics Data System (ADS)

    Morgan, E. C.; Lackner, M.; Vogel, R. M.; Baise, L. G.

    2010-12-01

    The higher wind speeds of the offshore environment make it an attractive setting for future wind farms. With sparser field measurements, the theoretical probability distribution of short-term wind speeds becomes more important in estimating values such as average power output and fatigue load. While previous studies typically compare the accuracy of probability distributions using R2, we show that validation based on this metric is not consistent with validation based on engineering parameters of interest, namely turbine power output and extreme wind speed. Thus, in order to make the most accurate estimates possible, the probability distribution that an engineer picks to characterize wind speeds should depend on the design parameter of interest. We introduce the Kappa and Wakeby probability distribution functions to wind speed modeling, and show that these two distributions, along with the Biweibull distribution, fit wind speed samples better than the more widely accepted Weibull and Rayleigh distributions based on R2. Additionally, out of the 14 probability distributions we examine, the Kappa and Wakeby give the most accurate and least biased estimates of turbine power output. The fact that the 2-parameter Lognormal distribution estimates extreme wind speeds (i.e. fits the upper tail of wind speed distributions) with least error indicates that no single distribution performs satisfactorily for all applications. Our use of a large dataset composed of 178 buoys (totaling ~72 million 10-minute wind speed observations) makes these findings highly significant, both in terms of large sample size and broad geographical distribution across various wind regimes. Figure caption: boxplots of R2 from the fit of each of the 14 distributions to the 178 buoy wind speed samples; distributions are ranked from left to right by ascending median R2, with the Biweibull having the closest median to 1.
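
    The kind of comparison described above can be sketched as follows: fit candidate distributions to a wind speed sample, then score them both with a simple R2-type measure and with an engineering quantity such as the mean cubed wind speed (a crude proxy for available power). The sketch uses synthetic data and only two candidate distributions; it does not reproduce the paper's 14-distribution analysis or its power-output model.

        import numpy as np
        from scipy import stats

        # Synthetic 10-minute mean wind speeds (m/s), a stand-in for buoy data.
        rng = np.random.default_rng(2)
        speeds = stats.weibull_min.rvs(c=2.2, scale=9.0, size=5000, random_state=rng)

        candidates = {
            "Weibull":  stats.weibull_min,
            "Rayleigh": stats.rayleigh,
        }

        for name, dist in candidates.items():
            params = dist.fit(speeds, floc=0)          # fix location at 0 m/s
            # Simple goodness-of-fit: R2 between empirical and fitted quantiles.
            probs = (np.arange(1, len(speeds) + 1) - 0.5) / len(speeds)
            fitted_q = dist.ppf(probs, *params)
            empirical_q = np.sort(speeds)
            ss_res = np.sum((empirical_q - fitted_q) ** 2)
            ss_tot = np.sum((empirical_q - empirical_q.mean()) ** 2)
            r2 = 1.0 - ss_res / ss_tot
            # Mean cubed wind speed as a crude proxy for available power.
            mean_v3 = np.mean(dist.rvs(*params, size=20000, random_state=rng) ** 3)
            print(f"{name}: R2 = {r2:.4f}, mean v^3 = {mean_v3:.0f} m^3/s^3")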

  19. Practical algorithmic probability: an image inpainting example

    NASA Astrophysics Data System (ADS)

    Potapov, Alexey; Scherbakov, Oleg; Zhdanov, Innokentii

    2013-12-01

    Possibility of practical application of algorithmic probability is analyzed on an example of the image inpainting problem, which precisely corresponds to the prediction problem. Such consideration is fruitful both for the theory of universal prediction and for practical image inpainting methods. Efficient application of algorithmic probability implies that its computation is essentially optimized for some specific data representation. In this paper, we considered one image representation, namely the spectral representation, for which an image inpainting algorithm is proposed based on the spectrum entropy criterion. This algorithm showed promising results in spite of the very simple representation. The same approach can be used for introducing an ALP-based criterion for more powerful image representations.

  20. Probability mapping of scarred myocardium using texture and intensity features in CMR images

    PubMed Central

    2013-01-01

    Background The myocardium exhibits a heterogeneous nature due to scarring after Myocardial Infarction (MI). In Cardiac Magnetic Resonance (CMR) imaging, Late Gadolinium (LG) contrast agent enhances the intensity of the scarred area in the myocardium. Methods In this paper, we propose a probability mapping technique using texture and intensity features to describe the heterogeneous nature of the scarred myocardium in CMR images after MI. Scarred tissue and non-scarred tissue are represented with high and low probabilities, respectively. Intermediate values possibly indicate areas where the scarred and healthy tissues are interwoven. The probability map of scarred myocardium is calculated by using a probability function based on Bayes' rule. Any set of features can be used in the probability function. Results In the present study, we demonstrate the use of two different types of features. One is based on the mean intensity of the pixel and the other on the underlying texture information of the scarred and non-scarred myocardium. Examples of probability maps computed using the mean pixel intensity and the underlying texture information are presented. We hypothesize that the probability mapping of the myocardium offers an alternate visualization, possibly showing details of physiological significance that are difficult to detect visually in the original CMR image. Conclusion The probability mapping obtained from the two features provides a way to define different cardiac segments, which offers a way to identify areas in the myocardium of diagnostic importance (such as core and border areas in scarred myocardium). PMID:24053280
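
    A per-pixel probability map of this kind can be written directly from Bayes' rule once class likelihoods are chosen. The sketch below assumes Gaussian likelihoods for a single intensity feature, with hypothetical class parameters and prior; the paper's combination of texture and intensity features is not reproduced here.

        import numpy as np
        from scipy import stats

        # Hypothetical class models for one feature (pixel intensity): Gaussian
        # likelihoods for scarred and non-scarred myocardium, plus a class prior.
        mu_scar, sigma_scar = 180.0, 25.0
        mu_normal, sigma_normal = 90.0, 20.0
        prior_scar = 0.3

        def scar_probability(intensity):
            """P(scar | intensity) via Bayes' rule with Gaussian likelihoods."""
            like_scar = stats.norm.pdf(intensity, mu_scar, sigma_scar)
            like_normal = stats.norm.pdf(intensity, mu_normal, sigma_normal)
            num = like_scar * prior_scar
            den = num + like_normal * (1.0 - prior_scar)
            return num / den

        # Apply to a small fake image patch; values near 1 suggest scar, values
        # near 0 suggest healthy tissue, and intermediate values suggest mixtures.
        patch = np.array([[85.0, 120.0, 170.0],
                          [95.0, 140.0, 200.0]])
        print(np.round(scar_probability(patch), 2))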

  1. The Probabilities of Conditionals Revisited

    ERIC Educational Resources Information Center

    Douven, Igor; Verbrugge, Sara

    2013-01-01

    According to what is now commonly referred to as "the Equation" in the literature on indicative conditionals, the probability of any indicative conditional equals the probability of its consequent given its antecedent. Philosophers widely agree in their assessment that the triviality arguments of…

  2. Decision analysis with approximate probabilities

    NASA Technical Reports Server (NTRS)

    Whalen, Thomas

    1992-01-01

    This paper concerns decisions under uncertainty in which the probabilities of the states of nature are only approximately known. Decision problems involving three states of nature are studied. This is due to the fact that some key issues do not arise in two-state problems, while probability spaces with more than three states of nature are essentially impossible to graph. The primary focus is on two levels of probabilistic information. In one level, the three probabilities are separately rounded to the nearest tenth. This can lead to sets of rounded probabilities which add up to 0.9, 1.0, or 1.1. In the other level, probabilities are rounded to the nearest tenth in such a way that the rounded probabilities are forced to sum to 1.0. For comparison, six additional levels of probabilistic information, previously analyzed, were also included in the present analysis. A simulation experiment compared four criteria for decisionmaking using linearly constrained probabilities (Maximin, Midpoint, Standard Laplace, and Extended Laplace) under the eight different levels of information about probability. The Extended Laplace criterion, which uses a second order maximum entropy principle, performed best overall.

  3. Computation of Most Probable Numbers

    PubMed Central

    Russek, Estelle; Colwell, Rita R.

    1983-01-01

    A rapid computational method for maximum likelihood estimation of most-probable-number values, incorporating a modified Newton-Raphson method, is presented. The method offers a much greater reliability for the most-probable-number estimate of total viable bacteria, i.e., those capable of growth in laboratory media. PMID:6870242
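
    For reference, maximum likelihood MPN estimation amounts to solving the score equation of the dilution-series likelihood for the organism density. The sketch below applies plain Newton's method to that equation (not the authors' modified Newton-Raphson implementation); the dilution volumes and tube counts are hypothetical.

        import numpy as np

        # Hypothetical dilution series: inoculum volume per tube (mL), number of
        # tubes, and number of positive (growth) tubes at each dilution.
        volumes   = np.array([0.1, 0.01, 0.001])
        n_tubes   = np.array([5, 5, 5])
        positives = np.array([5, 3, 1])

        def score(lam):
            """Derivative of the MPN log-likelihood with respect to density lam."""
            e = np.exp(-lam * volumes)
            return np.sum(positives * volumes * e / (1.0 - e) - (n_tubes - positives) * volumes)

        def score_derivative(lam):
            e = np.exp(-lam * volumes)
            return -np.sum(positives * volumes**2 * e / (1.0 - e) ** 2)

        lam = 10.0  # starting guess (organisms per mL)
        for _ in range(50):
            step = score(lam) / score_derivative(lam)
            lam -= step
            if abs(step) < 1e-9:
                break

        # Compare with roughly 110 per mL from standard MPN tables for a 5-3-1 result.
        print(f"most probable number ~ {lam:.0f} organisms per mL")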

  4. Site occupancy models with heterogeneous detection probabilities

    USGS Publications Warehouse

    Royle, J. Andrew

    2006-01-01

    Models for estimating the probability of occurrence of a species in the presence of imperfect detection are important in many ecological disciplines. In these "site occupancy" models, the possibility of heterogeneity in detection probabilities among sites must be considered because variation in abundance (and other factors) among sampled sites induces variation in detection probability (p). In this article, I develop occurrence probability models that allow for heterogeneous detection probabilities by considering several common classes of mixture distributions for p. For any mixing distribution, the likelihood has the general form of a zero-inflated binomial mixture for which inference based upon integrated likelihood is straightforward. A recent paper by Link (2003, Biometrics 59, 1123-1130) demonstrates that in closed population models used for estimating population size, different classes of mixture distributions are indistinguishable from data, yet can produce very different inferences about population size. I demonstrate that this problem can also arise in models for estimating site occupancy in the presence of heterogeneous detection probabilities. The implications of this are discussed in the context of an application to avian survey data and the development of animal monitoring programs.
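
    The integrated likelihood described above can be written down directly when the mixing distribution for p has finite support. A minimal sketch, assuming a simple two-point mixture and hypothetical detection histories (this illustrates the likelihood form only, not the models fitted in the paper):

        import numpy as np
        from scipy import stats

        # Detection counts y_i out of J repeat visits at each of several sites
        # (hypothetical data), with a two-point mixture for detection probability p.
        J = 5
        y = np.array([0, 0, 1, 3, 0, 2, 4, 0])

        def log_likelihood(psi, p_values, weights):
            """Zero-inflated binomial mixture log-likelihood for site occupancy.

            psi      : probability a site is occupied
            p_values : support points of the detection-probability mixture
            weights  : mixture weights (sum to 1)
            """
            # P(y | occupied), marginalized over the mixture on p
            binom_mix = np.sum(
                [w * stats.binom.pmf(y, J, p) for p, w in zip(p_values, weights)], axis=0
            )
            site_like = psi * binom_mix + (1.0 - psi) * (y == 0)
            return np.sum(np.log(site_like))

        print(log_likelihood(psi=0.7, p_values=[0.2, 0.6], weights=[0.5, 0.5]))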

  5. Parametric probability distributions for anomalous change detection

    SciTech Connect

    Theiler, James P; Foy, Bernard R; Wohlberg, Brendt E; Scovel, James C

    2010-01-01

    The problem of anomalous change detection arises when two (or possibly more) images are taken of the same scene, but at different times. The aim is to discount the 'pervasive differences' that occur throughout the imagery, due to the inevitably different conditions under which the images were taken (caused, for instance, by differences in illumination, atmospheric conditions, sensor calibration, or misregistration), and to focus instead on the 'anomalous changes' that actually take place in the scene. In general, anomalous change detection algorithms attempt to model these normal or pervasive differences, based on data taken directly from the imagery, and then identify as anomalous those pixels for which the model does not hold. For many algorithms, these models are expressed in terms of probability distributions, and there is a class of such algorithms that assume the distributions are Gaussian. By considering a broader class of distributions, however, a new class of anomalous change detection algorithms can be developed. We consider several parametric families of such distributions, derive the associated change detection algorithms, and compare the performance with standard algorithms that are based on Gaussian distributions. We find that it is often possible to significantly outperform these standard algorithms, even using relatively simple non-Gaussian models.

  6. Pig Data and Bayesian Inference on Multinomial Probabilities

    ERIC Educational Resources Information Center

    Kern, John C.

    2006-01-01

    Bayesian inference on multinomial probabilities is conducted based on data collected from the game Pass the Pigs[R]. Prior information on these probabilities is readily available from the instruction manual, and is easily incorporated in a Dirichlet prior. Posterior analysis of the scoring probabilities quantifies the discrepancy between empirical…
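
    Because the Dirichlet prior is conjugate to the multinomial likelihood, the posterior is obtained by adding the observed outcome counts to the prior parameters. A minimal sketch with illustrative outcome categories, prior weights, and roll counts (not the game's published data):

        import numpy as np

        # Illustrative outcome categories for a single roll, prior Dirichlet
        # weights (e.g., informed by an instruction manual), and observed counts.
        outcomes    = ["side", "razorback", "trotter", "snouter", "leaning jowler"]
        prior_alpha = np.array([60.0, 22.0, 9.0, 3.0, 1.0])
        counts      = np.array([118, 45, 20, 8, 2])

        # Dirichlet prior + multinomial counts -> Dirichlet posterior.
        posterior_alpha = prior_alpha + counts
        posterior_mean = posterior_alpha / posterior_alpha.sum()

        for name, m in zip(outcomes, posterior_mean):
            print(f"P({name}) ~ {m:.3f}")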

  7. How LO Can You GO? Using the Dice-Based Golf Game GOLO to Illustrate Inferences on Proportions and Discrete Probability Distributions

    ERIC Educational Resources Information Center

    Stephenson, Paul; Richardson, Mary; Gabrosek, John; Reischman, Diann

    2009-01-01

    This paper describes an interactive activity that revolves around the dice-based golf game GOLO. The GOLO game can be purchased at various retail locations or online at igolo.com. In addition, the game may be played online free of charge at igolo.com. The activity is completed in four parts. The four parts can be used in a sequence or they can be…

  8. Non-Patient-Based Clinical Licensure Examination for Dentistry in Minnesota: Significance of Decision and Description of Process.

    PubMed

    Mills, Eric A

    2016-06-01

    In recent years in the United States, there has been heightened interest in offering clinical licensure examination (CLE) alternatives to the live patient-based method in dentistry. Fueled by ethical concerns of faculty members at the University of Minnesota School of Dentistry, the state of Minnesota's Board of Dentistry approved a motion in 2009 to provide two CLE options to the school's future predoctoral graduates: a patient-based one, administered by the Central Regional Dental Testing Service, and a non-patient-based one administered by the National Dental Examining Board of Canada (NDEB). The validity of the NDEB written exam and objective structured clinical exam (OSCE) has been verified in a multi-year study. Via five-option, one-best-answer, multiple-choice questions in the written exam and extended match questions with up to 15 answer options in the station-based OSCE, competent candidates are distinguished from those who are incompetent in their didactic knowledge and clinical critical thinking and judgment across all dental disciplines. The action had the additional effects of furthering participation of Minnesota Board of Dentistry members in the University of Minnesota School of Dentistry's competency-based curriculum, of involving the school's faculty in NDEB item development workshops, and, beginning in 2018, of no longer permitting the patient-based CLE option on site. The aim of this article is to describe how this change came about and its effects. PMID:27251345

  9. Direct probability mapping of contaminants

    SciTech Connect

    Rautman, C.A.

    1993-09-17

    Exhaustive characterization of a contaminated site is a physical and practical impossibility. Descriptions of the nature, extent, and level of contamination, as well as decisions regarding proposed remediation activities, must be made in a state of uncertainty based upon limited physical sampling. Geostatistical simulation provides powerful tools for investigating contaminant levels, and in particular, for identifying and using the spatial interrelationships among a set of isolated sample values. This additional information can be used to assess the likelihood of encountering contamination at unsampled locations and to evaluate the risk associated with decisions to remediate or not to remediate specific regions within a site. Past operation of the DOE Feed Materials Production Center has contaminated a site near Fernald, Ohio, with natural uranium. Soil geochemical data have been collected as part of the Uranium-in-Soils Integrated Demonstration Project. These data have been used to construct a number of stochastic images of potential contamination for parcels approximately the size of a selective remediation unit. Each such image accurately reflects the actual measured sample values, and reproduces the univariate statistics and spatial character of the extant data. Post-processing of a large number of these equally likely, statistically similar images produces maps directly showing the probability of exceeding specified levels of contamination. Evaluation of the geostatistical simulations can yield maps representing the expected magnitude of the contamination for various regions and other information that may be important in determining a suitable remediation process or in sizing equipment to accomplish the restoration.
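
    The post-processing step described above amounts to counting, cell by cell, how many of the equally likely realizations exceed the contamination threshold. A minimal sketch, with randomly generated fields standing in for conditional geostatistical simulations and a hypothetical action level:

        import numpy as np

        # Stand-in ensemble: 500 equally likely realizations of contaminant
        # concentration on a 50 x 50 grid (random values used here in place of
        # conditional geostatistical simulations).
        rng = np.random.default_rng(3)
        realizations = rng.lognormal(mean=3.0, sigma=0.8, size=(500, 50, 50))

        threshold = 35.0  # remediation action level (hypothetical units)

        # Cell-by-cell probability of exceeding the threshold, plus the expected
        # concentration map, both obtained by post-processing the ensemble.
        prob_exceed = (realizations > threshold).mean(axis=0)
        expected_map = realizations.mean(axis=0)

        print("max exceedance probability:", prob_exceed.max().round(2))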

  10. Development of an antigen-based rapid diagnostic test for the identification of blowfly (Calliphoridae) species of forensic significance.

    PubMed

    McDonagh, Laura; Thornton, Chris; Wallman, James F; Stevens, Jamie R

    2009-06-01

    In this study we examine the limitations of currently used sequence-based approaches to blowfly (Calliphoridae) identification and evaluate the utility of an immunological approach to discriminate between blowfly species of forensic importance. By investigating antigenic similarity and dissimilarity between the first instar larval stages of four forensically important blowfly species, we have been able to identify immunoreactive proteins of potential use in the development of species-specific immuno-diagnostic tests. Here we outline our protein-based approach to species determination, and describe how it may be adapted to develop rapid diagnostic assays for the 'on-site' identification of blowfly species. PMID:19414163

  11. Minimal entropy probability paths between genome families.

    PubMed

    Ahlbrandt, Calvin; Benson, Gary; Casey, William

    2004-05-01

    We develop a metric for probability distributions with applications to biological sequence analysis. Our distance metric is obtained by minimizing a functional defined on the class of paths over probability measures on N categories. The underlying mathematical theory is connected to a constrained problem in the calculus of variations. The solution presented is a numerical solution, which approximates the true solution in a set of cases called rich paths, where none of the components of the path is zero. The functional to be minimized is motivated by entropy considerations, reflecting the idea that nature might efficiently carry out mutations of genome sequences in such a way that the increase in entropy involved in transformation is as small as possible. We characterize sequences by frequency profiles or probability vectors; in the case of DNA, N is 4 and the components of the probability vector are the frequencies of occurrence of each of the bases A, C, G and T. Given two probability vectors a and b, we define a distance function as the infimum of path integrals of the entropy function H(p) over all admissible paths p(t), 0 ≤ t ≤ 1, with p(t) a probability vector such that p(0) = a and p(1) = b. If the probability paths p(t) are parameterized as y(s) in terms of arc length s and the optimal path is smooth with arc length L, then smooth and "rich" optimal probability paths may be numerically estimated by a hybrid method: iterating Newton's method on solutions of a two-point boundary value problem, with unknown distance L between the abscissas, for the Euler-Lagrange equations resulting from a multiplier rule for the constrained optimization problem, together with linear regression to improve the arc length estimate L. Matlab code for these numerical methods is provided, which works only for "rich" optimal probability vectors. These methods motivate a definition of an elementary distance function which is easier and faster to calculate, works on non

  12. Significant Life Experience: Exploring the Lifelong Influence of Place-Based Environmental and Science Education on Program Participants

    ERIC Educational Resources Information Center

    Colvin, Corrie Ruth

    2013-01-01

    Current research provides a limited understanding of the life long influence of nonformal place-based environmental and science education programs on past participants. This study looks to address this gap, exploring the ways in which these learning environments have contributed to environmental identity and stewardship. Using Dorothy Holland's…

  13. Richness-Productivity Relationships Between Trophic Levels in a Detritus-Based System: Significance of Abundance and Trophic Linkage.

    EPA Science Inventory

    Most theoretical and empirical studies of productivity–species richness relationships fail to consider linkages among trophic levels. We quantified productivity–richness relationships in detritus-based, water-filled tree-hole communities for two trophic levels: invertebrate consu...

  14. Problems in the Application of Thermal-based Energy Balance Models with Significant Sub-Pixel Variability

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A thermal infrared (TIR)-based two-source (soil + vegetation) energy balance (TSEB) model has been applied and validated using remotely sensed imagery at a variety of resolutions. TSEB model validation issues, potential model errors and inability to discriminate fluxes for diverse land cover types ...

  15. Automatic Item Generation of Probability Word Problems

    ERIC Educational Resources Information Center

    Holling, Heinz; Bertling, Jonas P.; Zeuch, Nina

    2009-01-01

    Mathematical word problems represent a common item format for assessing student competencies. Automatic item generation (AIG) is an effective way of constructing many items with predictable difficulties, based on a set of predefined task parameters. The current study presents a framework for the automatic generation of probability word problems…

  16. Limits on the significant mass-loss scenario based on the globular clusters of the Fornax dwarf spheroidal galaxy

    NASA Astrophysics Data System (ADS)

    Khalaj, P.; Baumgardt, H.

    2016-03-01

    Many of the scenarios proposed to explain the origin of chemically peculiar stars in globular clusters (GCs) require significant mass loss (≥95 per cent) to explain the observed fraction of such stars. In the GCs of the Fornax dwarf galaxy, significant mass loss could be a problem. Larsen et al. showed that there is a large ratio of GCs to metal-poor field stars in Fornax and about 20-25 per cent of all the stars with [Fe/H] < -2 belong to the four metal-poor GCs. This imposes an upper limit of ˜80 per cent mass loss that could have happened in Fornax GCs. In this paper, we propose a solution to this problem by suggesting that stars can leave the Fornax galaxy. We use a series of N-body simulations to determine the limit of mass loss from Fornax as a function of the initial orbital radii of GCs and the speed with which stars leave Fornax GCs. We consider a set of cored and cuspy density profiles for Fornax. Our results show that with a cuspy model for Fornax, the fraction of stars that leave the galaxy can be as high as ˜90 per cent, when the initial orbital radii of GCs are R = 2-3 kpc and the initial speed of stars is v > 20 km s-1. We show that such large velocities can be achieved by mass loss induced by gas expulsion but not mass loss induced by stellar evolution. Our results imply that one cannot interpret the metallicity distribution of Fornax field stars as evidence against significant mass loss in Fornax GCs, if mass loss is due to gas expulsion.

  17. Electrochemical evidences and consequences of significant differences in ions diffusion rate in polyacrylate-based ion-selective membranes.

    PubMed

    Woźnica, Emilia; Mieczkowski, Józef; Michalska, Agata

    2011-11-21

    The origin and effect of surface accumulation of primary ions within an ion-selective poly(n-butyl acrylate)-based membrane, obtained by thermal polymerization, is discussed. Using a new method based on the relation between the shape of a potentiometric plot and the preconditioning time, the diffusion of copper ions in the membrane was found to be slow (the diffusion coefficient estimated to be close to 10⁻¹¹ cm² s⁻¹), especially when compared to the diffusion of the ion-exchanger counter ions, sodium cations (a diffusion coefficient above 10⁻⁹ cm² s⁻¹). The higher mobility of sodium ions relative to the copper-ionophore complex makes the ion-exchanger role apparent, leading to undesirable sensitivity to sodium or potassium ions. PMID:21957488

  18. The probabilities of unique events.

    PubMed

    Khemlani, Sangeet S; Lotstein, Max; Johnson-Laird, Phil

    2012-01-01

    Many theorists argue that the probabilities of unique events, even real possibilities such as President Obama's re-election, are meaningless. As a consequence, psychologists have seldom investigated them. We propose a new theory (implemented in a computer program) in which such estimates depend on an intuitive non-numerical system capable only of simple procedures, and a deliberative system that maps intuitions into numbers. The theory predicts that estimates of the probabilities of conjunctions should often tend to split the difference between the probabilities of the two conjuncts. We report two experiments showing that individuals commit such violations of the probability calculus, and corroborating other predictions of the theory, e.g., individuals err in the same way even when they make non-numerical verbal estimates, such as that an event is highly improbable. PMID:23056224

  19. Transition probabilities of Br II

    NASA Technical Reports Server (NTRS)

    Bengtson, R. D.; Miller, M. H.

    1976-01-01

    Absolute transition probabilities of the three most prominent visible Br II lines are measured in emission. Results compare well with Coulomb approximations and with line strengths extrapolated from trends in homologous atoms.

  20. The Probabilities of Unique Events

    PubMed Central

    Khemlani, Sangeet S.; Lotstein, Max; Johnson-Laird, Phil

    2012-01-01

    Many theorists argue that the probabilities of unique events, even real possibilities such as President Obama's re-election, are meaningless. As a consequence, psychologists have seldom investigated them. We propose a new theory (implemented in a computer program) in which such estimates depend on an intuitive non-numerical system capable only of simple procedures, and a deliberative system that maps intuitions into numbers. The theory predicts that estimates of the probabilities of conjunctions should often tend to split the difference between the probabilities of the two conjuncts. We report two experiments showing that individuals commit such violations of the probability calculus, and corroborating other predictions of the theory, e.g., individuals err in the same way even when they make non-numerical verbal estimates, such as that an event is highly improbable. PMID:23056224

  1. VESPA: False positive probabilities calculator

    NASA Astrophysics Data System (ADS)

    Morton, Timothy D.

    2015-03-01

    Validation of Exoplanet Signals using a Probabilistic Algorithm (VESPA) calculates false positive probabilities and statistically validates transiting exoplanets. Written in Python, it uses isochrones [ascl:1503.010] and the package simpledist.

  2. Dinosaurs, Dinosaur Eggs, and Probability.

    ERIC Educational Resources Information Center

    Teppo, Anne R.; Hodgson, Ted

    2001-01-01

    Outlines several recommendations for teaching probability in the secondary school. Offers an activity that employs simulation by hand and using a programmable calculator in which geometry, analytical geometry, and discrete mathematics are explored. (KHR)

  3. Steering in spin tomographic probability representation

    NASA Astrophysics Data System (ADS)

    Man'ko, V. I.; Markovich, L. A.

    2016-09-01

    The steering property known for the two-qubit state in terms of specific inequalities for the correlation function is translated for the state of a qudit with spin j = 3/2. Since most steering detection inequalities are based on correlation functions, we introduce analogs of such functions for single-qudit systems. The tomographic probability representation for the qudit states is applied. The connection between the correlation function in the two-qubit system and the single qudit is presented in an integral form with an intertwining kernel calculated explicitly in tomographic probability terms.

  4. Joint probabilities and quantum cognition

    SciTech Connect

    Acacio de Barros, J.

    2012-12-18

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  5. Joint probabilities and quantum cognition

    NASA Astrophysics Data System (ADS)

    de Barros, J. Acacio

    2012-12-01

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  6. Evaluation of microbial release probabilities

    NASA Technical Reports Server (NTRS)

    1972-01-01

    Work undertaken to improve the estimation of the probability of release of microorganisms from unmanned Martian landing spacecraft is summarized. An analytical model is described for the development of numerical values for release parameters and release mechanisms applicable to flight missions are defined. Laboratory test data are used to evolve parameter values for use by flight projects in estimating numerical values for release probabilities. The analysis treats microbial burden located on spacecraft surfaces, between mated surfaces, and encapsulated within materials.

  7. Annonaceous acetogenins (ACGs) nanosuspensions based on a self-assembly stabilizer and the significantly improved anti-tumor efficacy.

    PubMed

    Hong, Jingyi; Li, Yanhong; Xiao, Yao; Li, Yijing; Guo, Yifei; Kuang, Haixue; Wang, Xiangtao

    2016-09-01

    Annonaceous acetogenins (ACGs) have exhibited antitumor activity against various cancers. However, these substances' poor solubility has limited clinical applications. In this study, hydroxypropyl-beta-cyclodextrin (HP-β-CD) and soybean lecithin (SPC) were self-assembled into an amphiphilic complex. ACGs nanosuspensions (ACGs-NSps) were prepared with a mean particle size of 144.4 nm, a zeta potential of -22.9 mV and a high drug payload of 46.17% using this complex as stabilizer. The ACGs-NSps demonstrated sustained release in vitro and good stability in plasma as well as simulated gastrointestinal fluid, and met the demand of both intravenous injection and oral administration. The ACGs-NSps demonstrated significantly increased cytotoxicity against HeLa and HepG2 cancer cell lines compared to ACGs in solution (in vitro cytotoxicity assay). An in vivo study with H22 tumor-bearing mice demonstrated that nanosuspensions significantly improved ACGs' antitumor activity. When orally administered, ACGs-NSps achieved a similar tumor inhibition rate at 1/10th the dose of ACGs in an oil solution (47.94% vs. 49.74%, p > 0.05). Improved therapeutic efficacy was further achieved when the ACGs-NSps were intravenously injected into mice (70.31%). With the help of nanosuspension technology, ACGs may become an effective antitumor drug for clinical use. PMID:27209384

  8. Group cognitive behavioural treatment of youth anxiety in community based clinical practice: Clinical significance and benchmarking against efficacy.

    PubMed

    Jónsson, H; Thastum, M; Arendt, K; Juul-Sørensen, M

    2015-10-01

    The efficacy of a group cognitive behavioural therapy (CBT) programme (Cool Kids) for youth anxiety has been demonstrated at university clinics in Australia and Denmark, and similar CBT programmes have been found effective within community settings in other countries. However, most effectiveness studies of CBT for youth anxiety have either used a mixture of CBT guidelines or translated protocols not previously tested in an efficacy trial. This study used a benchmarking strategy to compare outcomes from the same CBT programme used at a university research clinic (N=87) and community centres (N=82). There was a significant reduction in both clinical and self-report measures of youth anxiety over time, with medium to large effect sizes within both samples. Treatment effects on self-report measures of youth anxiety were significantly larger within the university sample, while changes in clinical measures of youth anxiety were similar in the two samples. Overall, these findings suggest that an efficacious CBT group treatment programme developed within research contexts is transportable to community centres. Despite being effective within the community, the results indicate that the treatment may lose some of its efficacy when disseminated to the community. PMID:26283461

  9. The biological significance of color constancy: an agent-based model with bees foraging from flowers under varied illumination.

    PubMed

    Faruq, Samia; McOwan, Peter W; Chittka, Lars

    2013-01-01

    The perceived color of an object depends on its spectral reflectance and the spectral composition of the illuminant. Thus when the illumination changes, the light reflected from the object also varies. This would result in a different color sensation if no color constancy mechanism is put in place-that is, the ability to form consistent representation of colors across various illuminants and background scenes. We explore the quantitative benefits of various color constancy algorithms in an agent-based model of foraging bees, where agents select flower color based on reward. Each simulation is based on 100 "meadows" with five randomly selected flower species with empirically determined spectral reflectance properties, and each flower species is associated with realistic distributions of nectar rewards. Simulated foraging bees memorize the colors of flowers that they have experienced as most rewarding, and their task is to discriminate against other flower colors with lower rewards, even in the face of changing illumination conditions. We compared the performance of von Kries, White Patch, and Gray World constancy models with (hypothetical) bees with perfect color constancy, and color-blind bees. A bee equipped with trichromatic color vision but no color constancy performed only ∼20% better than a color-blind bee (relative to a maximum improvement at 100% for perfect color constancy), whereas the most powerful recovery of reflectance in the face of changing illumination was generated by a combination of von Kries photoreceptor adaptation and a White Patch calibration (∼30% improvement relative to a bee without color constancy). However, none of the tested algorithms generated perfect color constancy. PMID:23962735
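
    To make the adaptation step concrete, the sketch below illustrates von Kries-style photoreceptor adaptation for a generic trichromatic receiver: each receptor signal is divided by the signal the adapting background elicits in the same channel, so relative excitations stay roughly stable when the illuminant changes. The spectra, sensitivities and channel names are invented placeholders, not the empirical bee data used in the paper.

        import numpy as np

        def receptor_signals(reflectance, illuminant, sensitivities):
            """Quantum catch of each receptor class: sum of reflectance * illuminant * sensitivity."""
            return np.array([np.sum(reflectance * illuminant * s) for s in sensitivities])

        def von_kries(signals, adapting_signals):
            """von Kries adaptation: rescale each channel by the catch elicited by the adapting background."""
            return signals / adapting_signals

        # Toy spectra sampled at four wavelengths (placeholders, not empirical bee data)
        sensitivities = np.array([[0.9, 0.3, 0.05, 0.0],   # UV-like channel
                                  [0.1, 0.8, 0.40, 0.05],  # blue-like channel
                                  [0.0, 0.2, 0.70, 0.90]]) # green-like channel
        flower = np.array([0.2, 0.6, 0.5, 0.3])            # flower reflectance
        background = np.array([0.3, 0.3, 0.3, 0.3])        # adapting background reflectance

        for name, illum in [("daylight-like", np.array([1.0, 1.0, 1.0, 1.0])),
                            ("greenish shade", np.array([0.4, 0.7, 1.0, 1.2]))]:
            raw = receptor_signals(flower, illum, sensitivities)
            adapted = von_kries(raw, receptor_signals(background, illum, sensitivities))
            print(name, "raw:", np.round(raw / raw.sum(), 3), "adapted:", np.round(adapted, 3))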

  10. Significant acceleration of 2D-3D registration-based fusion of ultrasound and x-ray images by mesh-based DRR rendering

    NASA Astrophysics Data System (ADS)

    Kaiser, Markus; John, Matthias; Borsdorf, Anja; Mountney, Peter; Ionasec, Razvan; Nöttling, Alois; Kiefer, Philipp; Seeburger, Jörg; Neumuth, Thomas

    2013-03-01

    For transcatheter-based minimally invasive procedures in structural heart disease ultrasound and X-ray are the two enabling imaging modalities. A live fusion of both real-time modalities can potentially improve the workflow and the catheter navigation by combining the excellent instrument imaging of X-ray with the high-quality soft tissue imaging of ultrasound. A recently published approach to fuse X-ray fluoroscopy with trans-esophageal echo (TEE) registers the ultrasound probe to X-ray images by a 2D-3D registration method which inherently provides a registration of ultrasound images to X-ray images. In this paper, we significantly accelerate the 2D-3D registration method in this context. The main novelty is to generate the projection images (DRR) of the 3D object not via volume ray-casting but instead via a fast rendering of triangular meshes. This is possible, because in the setting for TEE/X-ray fusion the 3D geometry of the ultrasound probe is known in advance and their main components can be described by triangular meshes. We show that the new approach can achieve a speedup factor up to 65 and does not affect the registration accuracy when used in conjunction with the gradient correlation similarity measure. The improvement is independent of the underlying registration optimizer. Based on the results, a TEE/X-ray fusion could be performed with a higher frame rate and a shorter time lag towards real-time registration performance. The approach could potentially accelerate other applications of 2D-3D registrations, e.g. the registration of implant models with X-ray images.
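
    As a rough illustration of the similarity measure named in the abstract, the snippet below computes a gradient correlation score between a rendered DRR and an X-ray image as the average normalized cross-correlation of their image gradients. It is a minimal numpy sketch under that interpretation, not the authors' implementation or their mesh-based renderer.

        import numpy as np

        def ncc(a, b):
            """Normalized cross-correlation of two arrays."""
            a = a - a.mean()
            b = b - b.mean()
            denom = np.sqrt((a * a).sum() * (b * b).sum())
            return (a * b).sum() / denom if denom > 0 else 0.0

        def gradient_correlation(drr, xray):
            """Average NCC of the row- and column-direction gradients of the two images."""
            gy1, gx1 = np.gradient(drr.astype(float))
            gy2, gx2 = np.gradient(xray.astype(float))
            return 0.5 * (ncc(gy1, gy2) + ncc(gx1, gx2))

        # Toy usage: a misaligned copy scores lower than a perfectly aligned one
        img = np.zeros((64, 64))
        img[20:40, 25:45] = 1.0
        print(gradient_correlation(img, img))                      # close to 1.0
        print(gradient_correlation(img, np.roll(img, 5, axis=1)))  # noticeably smaller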

  11. Detection probability of EBPSK-MODEM system

    NASA Astrophysics Data System (ADS)

    Yao, Yu; Wu, Lenan

    2016-07-01

    Since the impacting filter-based receiver is able to transform phase modulation into an amplitude peak, a simple threshold decision can detect the Extended Binary Phase Shift Keying (EBPSK) modulated ranging signal in a noisy environment. In this paper, an analysis of the EBPSK-MODEM system output gives the probability density function for EBPSK modulated signals plus noise. The equation of detection probability (pd) for fluctuating and non-fluctuating targets has been deduced. Also, a comparison of the pd for the EBPSK-MODEM system and a pulse radar receiver is made, and some results are plotted. Moreover, the probability curves of such a system with several modulation parameters are analysed. When the modulation parameter is at least 6, the detection performance of the EBPSK-MODEM system exceeds that of a traditional radar system. In addition to theoretical considerations, computer simulations are provided to illustrate the performance.
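
    The following Monte Carlo sketch conveys the kind of quantity being derived: the probability that an amplitude peak plus complex Gaussian noise crosses a fixed threshold, for a constant-amplitude (non-fluctuating) and a Rayleigh-fluctuating toy target. The SNR values, threshold and fluctuation model are illustrative assumptions, not the paper's closed-form equations.

        import numpy as np

        rng = np.random.default_rng(0)

        def detection_probability(snr_db, threshold, fluctuating=False, n_trials=200_000):
            """Monte Carlo Pd of an envelope threshold detector in unit-power complex Gaussian noise.
            Non-fluctuating: constant amplitude; fluctuating: Rayleigh amplitude with the same
            mean power (a Swerling-like toy model, not the paper's closed-form result)."""
            noise = (rng.normal(size=n_trials) + 1j * rng.normal(size=n_trials)) / np.sqrt(2)
            power = 10 ** (snr_db / 10)
            if fluctuating:
                amplitude = rng.rayleigh(scale=np.sqrt(power / 2), size=n_trials)
            else:
                amplitude = np.sqrt(power)
            envelope = np.abs(amplitude + noise)   # amplitude peak plus noise
            return np.mean(envelope > threshold)

        for snr in (3, 6, 9, 12):
            print(snr, "dB:",
                  round(detection_probability(snr, threshold=2.0), 3),
                  round(detection_probability(snr, threshold=2.0, fluctuating=True), 3))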

  12. Local Directed Percolation Probability in Two Dimensions

    NASA Astrophysics Data System (ADS)

    Inui, Norio; Konno, Norio; Komatsu, Genichi; Kameoka, Koichi

    1998-01-01

    Using the series expansion method and Monte Carlo simulation, we study the directed percolation probability on the square lattice V_n^0 = {(x,y) ∈ Z² : x+y even, 0 ≤ y ≤ n, -y ≤ x ≤ y}. We calculate the local percolation probability P_n^l, defined as the connection probability between the origin and the site (0,n). The critical behavior of P_∞^l is clearly different from that of the global percolation probability P_∞^g, which is characterized by a critical exponent β^g. An analysis based on the Padé approximants shows β^l = 2β^g. In addition, we find that the series expansion of P_{2n}^l can be expressed as a function of P_n^g.

  13. [Prospects for the design of new therapeutically significant protease inhibitors based on knottins and sunflower seed trypsin inhibitor (SFTI 1)].

    PubMed

    Kuznetsova, S S; Kolesanova, E F; Talanova, A V; Veselovsky, A V

    2016-05-01

    Plant seed knottins, mainly from the Cucurbitaceae family, and sunflower seed trypsin inhibitor (SFTI 1) are the lowest-molecular-weight canonical peptide inhibitors of serine proteases. Their highly efficient inhibition of various serine proteases, structural rigidity combined with the possibility of limited variation of amino acid sequences, high chemical stability, lack of toxic properties, and the option of production by either chemical synthesis or heterologous expression systems make these inhibitors attractive templates for the design of new compounds that regulate therapeutically significant serine protease activities. Hence the design of such compounds represents a promising research field. The review considers structural characteristics of these inhibitors, their properties, methods of preparation and the design of new analogs. Examples of successful employment of natural serine protease inhibitors belonging to the knottin family and SFTI 1 as templates for the design of highly specific inhibitors of certain proteases are given. PMID:27562989

  14. Hydrothermal Fe cycling and deep ocean organic carbon scavenging: Model-based evidence for significant POC supply to seafloor sediments

    NASA Astrophysics Data System (ADS)

    German, C. R.; Legendre, L. L.; Sander, S. G.; Niquil, N.; Luther, G. W.; Bharati, L.; Han, X.; Le Bris, N.

    2015-06-01

    Submarine hydrothermal venting has recently been identified to have the potential to impact ocean biogeochemistry at the global scale. This is the case because processes active in hydrothermal plumes are so vigorous that the residence time of the ocean, with respect to cycling through hydrothermal plumes, is comparable to that of deep ocean mixing caused by thermohaline circulation. Recently, it has been argued that seafloor venting may provide a significant source of bio-essential Fe to the oceans as the result of a close coupling between Fe and organic carbon in hydrothermal plumes. But a complementary question remains to be addressed: does this same intimate Fe-Corg association in hydrothermal plumes cause any related impact to the global C cycle? To address this, SCOR-InterRidge Working Group 135 developed a modeling approach to synthesize site-specific field data from the East Pacific Rise 9°50‧ N hydrothermal field, where the range of requisite data sets is most complete, and combine those inputs with global estimates for dissolved Fe inputs from venting to the oceans to establish a coherent model with which to investigate hydrothermal Corg cycling. The results place new constraints on submarine Fe vent fluxes worldwide, including an indication that the majority of Fe supplied to hydrothermal plumes should come from entrainment of diffuse flow. While this same entrainment is not predicted to enhance the supply of dissolved organic carbon to hydrothermal plumes by more than ∼10% over background values, what the model does indicate is that scavenging of carbon in association with Fe-rich hydrothermal plume particles should play a significant role in the delivery of particulate organic carbon to deep ocean sediments, worldwide.

  15. Airborne/Space-Based Doppler Lidar Wind Sounders Sampling the PBL and Other Regions of Significant Beta and U Inhomogeneities

    NASA Technical Reports Server (NTRS)

    Emmitt, Dave

    1998-01-01

    This final report covers the period from April 1994 through March 1998. The proposed research was organized under four main tasks. Those tasks were: (1) Investigate the vertical and horizontal velocity structures within and adjacent to thin and subvisual cirrus; (2) Investigate the lowest 1 km of the PBL and develop algorithms for processing pulsed Doppler lidar data obtained from single shots into regions of significant inhomogeneities in Beta and U; (3) Participate in OSSEs including those designed to establish shot density requirements for meso-gamma scale phenomena with quasi-persistent locations (e.g., jets, leewaves, tropical storms); and (4) Participate in the planning and execution of an airborne mission to measure winds with a pulsed CO2 Doppler lidar. Over the four year period of this research contract, work on all four tasks has yielded significant results which have led to 38 professional presentations (conferences and publications) and have been folded into the science justification for an approved NASA space mission, SPARCLE (SPAce Readiness Coherent Lidar Experiment), in 2001. Also this research has, through Task 4, led to a funded proposal to work directly on a NASA field campaign, CAMEX III, in which an airborne Doppler wind lidar will be used to investigate the cloud-free circulations near tropical storms. Monthly progress reports required under this contract are on file. This final report will highlight major accomplishments, including some that were not foreseen in the original proposal. The presentation of this final report includes this written document as well as material that is better presented via the internet (web pages). There is heavy reference to appended papers and documents. Thus, the main body of the report will serve to summarize the key efforts and findings.

  16. The role of probabilities in physics.

    PubMed

    Le Bellac, Michel

    2012-09-01

    Although modern physics was born in the XVIIth century as a fully deterministic theory in the form of Newtonian mechanics, the use of probabilistic arguments turned out later on to be unavoidable. Three main situations can be distinguished. (1) When the number of degrees of freedom is very large, on the order of Avogadro's number, a detailed dynamical description is not possible, and in fact not useful: we do not care about the velocity of a particular molecule in a gas, all we need is the probability distribution of the velocities. This statistical description introduced by Maxwell and Boltzmann allows us to recover equilibrium thermodynamics, gives a microscopic interpretation of entropy and underlies our understanding of irreversibility. (2) Even when the number of degrees of freedom is small (but larger than three) sensitivity to initial conditions of chaotic dynamics makes determinism irrelevant in practice, because we cannot control the initial conditions with infinite accuracy. Although die tossing is in principle predictable, the approach to chaotic dynamics in some limit implies that our ignorance of initial conditions is translated into a probabilistic description: each face comes up with probability 1/6. (3) As is well-known, quantum mechanics is incompatible with determinism. However, quantum probabilities differ in an essential way from the probabilities introduced previously: it has been shown from the work of John Bell that quantum probabilities are intrinsic and cannot be given an ignorance interpretation based on a hypothetical deeper level of description. PMID:22609725

  17. Laboratory-tutorial activities for teaching probability

    NASA Astrophysics Data System (ADS)

    Wittmann, Michael C.; Morgan, Jeffrey T.; Feeley, Roger E.

    2006-12-01

    We report on the development of students’ ideas of probability and probability density in a University of Maine laboratory-based general education physics course called Intuitive Quantum Physics. Students in the course are generally math phobic with unfavorable expectations about the nature of physics and their ability to do it. We describe a set of activities used to teach concepts of probability and probability density. Rudimentary knowledge of mechanics is needed for one activity, but otherwise the material requires no additional preparation. Extensions of the activities include relating probability density to potential energy graphs for certain “touchstone” examples. Students have difficulties learning the target concepts, such as comparing the ratio of time in a region to total time in all regions. Instead, they often focus on edge effects, pattern match to previously studied situations, reason about necessary but incomplete macroscopic elements of the system, use the gambler’s fallacy, and use expectations about ensemble results rather than expectation values to predict future events. We map the development of their thinking to provide examples of problems rather than evidence of a curriculum’s success.

  18. Computational methods for probability of instability calculations

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Burnside, O. H.

    1990-01-01

    This paper summarizes the development of the methods and a computer program to compute the probability of instability of a dynamic system that can be represented by a system of second-order ordinary linear differential equations. Two instability criteria, based upon the roots of the characteristic equation or Routh-Hurwitz test functions, are investigated. Computational methods based on system reliability analysis methods and importance sampling concepts are proposed to perform efficient probabilistic analysis. Numerical examples are provided to demonstrate the methods.
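
    A minimal sketch of the underlying idea, assuming a single-degree-of-freedom system with uncertain damping and stiffness: the Routh-Hurwitz conditions for m*s^2 + c*s + k reduce to c > 0 and k > 0, and direct Monte Carlo sampling of the uncertain parameters estimates the probability that they are violated. The parameter distributions are invented for illustration; the paper itself relies on more efficient reliability-analysis and importance-sampling methods rather than brute-force sampling.

        import numpy as np

        rng = np.random.default_rng(1)

        def probability_of_instability(n_samples=200_000):
            """Direct Monte Carlo estimate for m*x'' + c*x' + k*x = 0 with m = 1.
            By the Routh-Hurwitz criterion the characteristic equation s^2 + c*s + k = 0
            is asymptotically stable iff c > 0 and k > 0."""
            # Illustrative uncertain parameters (not from the paper): damping can dip below
            # zero, stiffness is lognormal and therefore always positive.
            c = rng.normal(loc=0.05, scale=0.04, size=n_samples)
            k = rng.lognormal(mean=0.0, sigma=0.1, size=n_samples)
            unstable = ~((c > 0) & (k > 0))
            return unstable.mean()

        print("estimated probability of instability:", probability_of_instability())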

  19. Joint probability distributions for projection probabilities of random orthonormal states

    NASA Astrophysics Data System (ADS)

    Alonso, L.; Gorin, T.

    2016-04-01

    The quantum chaos conjecture applied to a finite dimensional quantum system implies that such a system has eigenstates that show similar statistical properties as the column vectors of random orthogonal or unitary matrices. Here, we consider the different probabilities for obtaining a specific outcome in a projective measurement, provided the system is in one of its eigenstates. We then give analytic expressions for the joint probability density for these probabilities, with respect to the ensemble of random matrices. In the case of the unitary group, our results can be applied, also, to the phenomenon of universal conductance fluctuations, where the same mathematical quantities describe partial conductances in a two-terminal mesoscopic scattering problem with a finite number of modes in each terminal.

  20. [Prevention and treatment of the complications of polycystic ovarian syndrome--the significance of evidence-based, interdisciplinary management].

    PubMed

    Gődény, Sándor; Csenteri, Orsolya Karola

    2015-12-13

    Polycystic ovary syndrome is the most common hormonal and metabolic disorder affecting women. The syndrome is often associated with obesity and hyperinsulinemia, and it adversely affects endocrine, metabolic, and cardiovascular health. The complex nature of the syndrome requires an interdisciplinary approach to treatment, in which cooperation among the paediatrician, internist, gynaecologist, endocrinologist, dermatologist, psychologist and oncologist is essential. Prevention and treatment should be based on the best available evidence. This should include physical examination and laboratory tests for hormones, serum insulin, glucose and lipids, and patients' preferences should also be considered. To maximise the health gain in polycystic ovary syndrome, adequate, effective, efficient and safe treatment is necessary. This article summarises the highest available evidence provided by meta-analyses and systematic reviews of the prevention of metabolic and cardiovascular complications of the syndrome, and discusses the relevant evidence published in the literature. PMID:26639643

  1. ED-based screening programs for hepatitis C (HCV) highlight significant opportunity to identify patients, prevent downstream costs/complications.

    PubMed

    2014-01-01

    New data suggest there is a huge opportunity for EDs to identify patients with the hepatitis C virus (HCV) and link them into care before downstream complications lead to higher medical costs and adverse outcomes. Early results from a pilot study at the University of Alabama Medical Center in Birmingham show that at least 12% of the targeted baby boomer population being screened for HCV in the ED is testing positive for HCV, with confirmatory tests showing that about 9% of the screened population is infected with the disease. Both the Centers for Disease Control in Atlanta and the US Preventive Services Task Force recommend one-time HCV screening for patients who were born between 1945 and 1965. Public health experts say 75% of HCV infections occur in patients born during the baby boomer years, and that roughly half of them are unaware of their HCV status. Researchers at UAB report that so many patients are testing positive for HCV that demand for care can quickly overwhelm the health system if new primary care/specialty resources are not identified. Administrators of ED-based HCV screening programs in both Birmingham and Houston note that EDs with existing screening programs for HIV should have the easiest time implementing HCV screening. They also stress that patients are more accepting of HCV screening, and that the counseling process is easier. PMID:24432549

  2. The Significance of Lewis Acid Sites for the Selective Catalytic Reduction of Nitric Oxide on Vanadium-Based Catalysts.

    PubMed

    Marberger, Adrian; Ferri, Davide; Elsener, Martin; Kröcher, Oliver

    2016-09-19

    The long debated reaction mechanisms of the selective catalytic reduction (SCR) of nitric oxide with ammonia (NH3) on vanadium-based catalysts rely on the involvement of Brønsted or Lewis acid sites. This issue has been clearly elucidated using a combination of transient perturbations of the catalyst environment with operando time-resolved spectroscopy to obtain unique molecular level insights. Nitric oxide reacts predominantly with NH3 coordinated to Lewis sites on vanadia on tungsta-titania (V2O5-WO3-TiO2), while Brønsted sites are not involved in the catalytic cycle. The Lewis site is a mono-oxo vanadyl group that reduces only in the presence of both nitric oxide and NH3. We were also able to verify the formation of the nitrosamide (NH2NO) intermediate, which forms in tandem with vanadium reduction, and thus the entire mechanism of SCR. Our experimental approach, demonstrated in the specific case of SCR, promises to progress the understanding of chemical reactions of technological relevance. PMID:27553251

  3. Imprecise probabilities in engineering analyses

    NASA Astrophysics Data System (ADS)

    Beer, Michael; Ferson, Scott; Kreinovich, Vladik

    2013-05-01

    Probabilistic uncertainty and imprecision in structural parameters and in environmental conditions and loads are challenging phenomena in engineering analyses. They require appropriate mathematical modeling and quantification to obtain realistic results when predicting the behavior and reliability of engineering structures and systems. But the modeling and quantification is complicated by the characteristics of the available information, which involves, for example, sparse data, poor measurements and subjective information. This raises the question whether the available information is sufficient for probabilistic modeling or rather suggests a set-theoretical approach. The framework of imprecise probabilities provides a mathematical basis to deal with these problems which involve both probabilistic and non-probabilistic information. A common feature of the various concepts of imprecise probabilities is the consideration of an entire set of probabilistic models in one analysis. The theoretical differences between the concepts mainly concern the mathematical description of the set of probabilistic models and the connection to the probabilistic models involved. This paper provides an overview on developments which involve imprecise probabilities for the solution of engineering problems. Evidence theory, probability bounds analysis with p-boxes, and fuzzy probabilities are discussed with emphasis on their key features and on their relationships to one another. This paper was especially prepared for this special issue and reflects, in various ways, the thinking and presentation preferences of the authors, who are also the guest editors for this special issue.
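
    A toy numerical example of one of the concepts discussed, the probability box: if a distribution parameter is only known to lie in an interval, each admissible value produces a different CDF, and the pointwise envelope of those CDFs bounds the probability of any event. The normal model and the interval below are assumptions chosen purely for illustration.

        import numpy as np
        from scipy.stats import norm

        # Quantity X is modeled as normal with known std 1.0, but its mean is only known
        # to lie in the interval [4.0, 5.0]; every admissible mean gives a candidate CDF.
        means = np.linspace(4.0, 5.0, 21)
        x = np.linspace(0.0, 10.0, 501)
        cdfs = np.array([norm.cdf(x, loc=m, scale=1.0) for m in means])

        lower_cdf = cdfs.min(axis=0)   # lower envelope of the candidate CDFs
        upper_cdf = cdfs.max(axis=0)   # upper envelope

        # The event probability becomes an interval rather than a single number
        idx = np.searchsorted(x, 6.0)
        print("P(X <= 6) lies in [{:.3f}, {:.3f}]".format(lower_cdf[idx], upper_cdf[idx]))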

  4. Measure and probability in cosmology

    NASA Astrophysics Data System (ADS)

    Schiffrin, Joshua S.; Wald, Robert M.

    2012-07-01

    General relativity has a Hamiltonian formulation, which formally provides a canonical (Liouville) measure on the space of solutions. In ordinary statistical physics, the Liouville measure is used to compute probabilities of macrostates, and it would seem natural to use the similar measure arising in general relativity to compute probabilities in cosmology, such as the probability that the Universe underwent an era of inflation. Indeed, a number of authors have used the restriction of this measure to the space of homogeneous and isotropic universes with scalar field matter (minisuperspace)—namely, the Gibbons-Hawking-Stewart measure—to make arguments about the likelihood of inflation. We argue here that there are at least four major difficulties with using the measure of general relativity to make probability arguments in cosmology: (1) Equilibration does not occur on cosmological length scales. (2) Even in the minisuperspace case, the measure of phase space is infinite and the computation of probabilities depends very strongly on how the infinity is regulated. (3) The inhomogeneous degrees of freedom must be taken into account (we illustrate how) even if one is interested only in universes that are very nearly homogeneous. The measure depends upon how the infinite number of degrees of freedom are truncated, and how one defines “nearly homogeneous.” (4) In a Universe where the second law of thermodynamics holds, one cannot make use of our knowledge of the present state of the Universe to retrodict the likelihood of past conditions.

  5. Airborne pollen-climate relationship based on discriminant analysis in Nam Co, Central Tibet and its palaeoenvironmental significance

    NASA Astrophysics Data System (ADS)

    Lyu, X.; Zhu, L.; Ma, Q.; Li, Q.

    2014-12-01

    Based on airborne pollen data collected with a Burkard pollen trap, discriminant analysis was conducted to evaluate the relationship between two different atmospheric circulation systems, the Asian summer monsoon (ASM) and the Westerlies, in the Nam Co basin, central Tibet. The whole year's samples could be classified into two groups using cluster analysis: one group from May to September and the other from October to April of the following year, corresponding to the monsoon period and the non-monsoon period, respectively. The classification represents two different atmospheric circulation systems: the ASM in the monsoon period and the Westerlies in the non-monsoon period. Discriminant analysis was then performed. First, the whole year's samples were divided into two a priori groups: group A, the monsoon period (May-Sep.), and group B, the non-monsoon period (Oct.-Apr.). Percentage data for the major pollen taxa were then used to establish the discriminant functions, and the samples were classified into predicted groups. The results show that 78.6% of the samples were grouped correctly under cross-validation. Thus, airborne pollen assemblages can distinguish the two climate systems: monsoon period and non-monsoon period. According to the discriminant scores, the group centroids of group A and group B were negative and positive, respectively. We therefore used the discriminant score as a new monsoon index (PDI, Pollen Discriminant Index); small PDI values represent an enhanced summer monsoon climate. Using this result, we calculated the PDI for the Nam Co NCL core; the PDI values are consistent with Dryness (a moisture indicator) and the A/Cy ratio (a temperature indicator).
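
    A sketch of how such a discriminant-score index could be computed is given below, using synthetic pollen-percentage data and scikit-learn's linear discriminant analysis; the signed discriminant score plays the role of the PDI, negative toward the monsoon group and positive toward the non-monsoon group. The taxa, sample sizes and compositions are invented, not the Nam Co trap data.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(2)

        # Synthetic pollen-percentage matrix: rows are trap samples, columns are major taxa.
        # Group A (monsoon, label 0) and group B (non-monsoon, label 1) differ in composition.
        group_a = rng.dirichlet([4, 3, 2, 1, 1, 1], size=30) * 100
        group_b = rng.dirichlet([1, 1, 2, 3, 3, 2], size=30) * 100
        X = np.vstack([group_a, group_b])
        y = np.array([0] * 30 + [1] * 30)

        lda = LinearDiscriminantAnalysis().fit(X, y)
        print("training accuracy (the paper reports 78.6% under cross-validation):", lda.score(X, y))

        # Signed discriminant score as a "PDI": negative values lean toward the monsoon group
        pdi = lda.decision_function(X)
        print("mean PDI, group A (monsoon):", round(pdi[y == 0].mean(), 2))
        print("mean PDI, group B (non-monsoon):", round(pdi[y == 1].mean(), 2))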

  6. Investigation of the Chromosome Regions with Significant Affinity for the Nuclear Envelope in Fruit Fly – A Model Based Approach

    PubMed Central

    Kinney, Nicholas Allen; Sharakhov, Igor V.; Onufriev, Alexey V.

    2014-01-01

    Three dimensional nuclear architecture is important for genome function, but is still poorly understood. In particular, little is known about the role of the “boundary conditions” – points of attachment between chromosomes and the nuclear envelope. We describe a method for modeling the 3D organization of the interphase nucleus, and its application to analysis of chromosome-nuclear envelope (Chr-NE) attachments of polytene (giant) chromosomes in Drosophila melanogaster salivary glands. The model represents chromosomes as self-avoiding polymer chains confined within the nucleus; parameters of the model are taken directly from experiment, no fitting parameters are introduced. Methods are developed to objectively quantify chromosome territories and intertwining, which are discussed in the context of corresponding experimental observations. In particular, a mathematically rigorous definition of a territory based on convex hull is proposed. The self-avoiding polymer model is used to re-analyze previous experimental data; the analysis suggests 33 additional Chr-NE attachments in addition to the 15 already explored Chr-NE attachments. Most of these new Chr-NE attachments correspond to intercalary heterochromatin – gene poor, dark staining, late replicating regions of the genome; however, three correspond to euchromatin – gene rich, light staining, early replicating regions of the genome. The analysis also suggests 5 regions of anti-contact, characterized by aversion for the NE, only two of these correspond to euchromatin. This composition of chromatin suggests that heterochromatin may not be necessary or sufficient for the formation of a Chr-NE attachment. To the extent that the proposed model represents reality, the confinement of the polytene chromosomes in a spherical nucleus alone does not favor the positioning of specific chromosome regions at the NE as seen in experiment; consequently, the 15 experimentally known Chr-NE attachment positions do not appear to

  7. Monte Carlo simulation of scenario probability distributions

    SciTech Connect

    Glaser, R.

    1996-10-23

    Suppose a scenario of interest can be represented as a series of events. A final result R may be viewed then as the intersection of three events, A, B, and C. The probability of the result P(R) in this case is the product P(R) = P(A) P(B | A) P(C | A ∩ B). An expert may be reluctant to estimate P(R) as a whole yet agree to supply his notions of the component probabilities in the form of prior distributions. Each component prior distribution may be viewed as the stochastic characterization of the expert's uncertainty regarding the true value of the component probability. Mathematically, the component probabilities are treated as independent random variables and P(R) as their product; the induced prior distribution for P(R) is determined which characterizes the expert's uncertainty regarding P(R). It may be both convenient and adequate to approximate the desired distribution by Monte Carlo simulation. Software has been written for this task that allows a variety of component priors that experts with good engineering judgment might feel comfortable with. The priors are mostly based on so-called likelihood classes. The software permits an expert to choose for a given component event probability one of six types of prior distributions, and the expert specifies the parameter value(s) for that prior. Each prior is unimodal. The expert essentially decides where the mode is, how the probability is distributed in the vicinity of the mode, and how rapidly it attenuates away. Limiting and degenerate applications allow the expert to be vague or precise.
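
    The described calculation is easy to reproduce in outline: draw each component probability from its own prior, form the product, and summarize the induced distribution of P(R). The beta priors below are illustrative stand-ins, not the six prior types offered by the software.

        import numpy as np

        rng = np.random.default_rng(3)
        n = 100_000

        # Illustrative expert priors for the component probabilities (beta stand-ins,
        # not the six prior types offered by the software described above)
        p_a = rng.beta(2, 8, n)            # P(A)
        p_b_given_a = rng.beta(5, 5, n)    # P(B | A)
        p_c_given_ab = rng.beta(1, 9, n)   # P(C | A ∩ B)

        p_r = p_a * p_b_given_a * p_c_given_ab   # induced samples of P(R)

        print("mean of P(R):", round(p_r.mean(), 4))
        print("90% interval:", np.round(np.percentile(p_r, [5, 95]), 4))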

  8. Interference of probabilities in dynamics

    SciTech Connect

    Zak, Michail

    2014-08-15

    A new class of dynamical systems with a preset type of interference of probabilities is introduced. It is obtained from the extension of the Madelung equation by replacing the quantum potential with a specially selected feedback from the Liouville equation. It has been proved that these systems are different from both Newtonian and quantum systems, but they can be useful for modeling spontaneous collective novelty phenomena when emerging outputs are qualitatively different from the weighted sum of individual inputs. Formation of language and fast decision-making process as potential applications of the probability interference is discussed.

  9. Probability as a Physical Motive

    NASA Astrophysics Data System (ADS)

    Martin, Peter

    2007-06-01

    Recent theoretical progress in nonequilibrium thermodynamics, linking the physical principle of Maximum Entropy Production (“MEP”) to the information-theoretical “MaxEnt” principle of scientific inference, together with conjectures from theoretical physics that there may be no fundamental causal laws but only probabilities for physical processes, and from evolutionary theory that biological systems expand “the adjacent possible” as rapidly as possible, all lend credence to the proposition that probability should be recognized as a fundamental physical motive. It is further proposed that spatial order and temporal order are two aspects of the same thing, and that this is the essence of the second law of thermodynamics.

  10. Knowledge typology for imprecise probabilities.

    SciTech Connect

    Wilson, G. D.; Zucker, L. J.

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  11. Rethinking the learning of belief network probabilities

    SciTech Connect

    Musick, R.

    1996-03-01

    Belief networks are a powerful tool for knowledge discovery that provide concise, understandable probabilistic models of data. There are methods grounded in probability theory to incrementally update the relationships described by the belief network when new information is seen, to perform complex inferences over any set of variables in the data, to incorporate domain expertise and prior knowledge into the model, and to automatically learn the model from data. This paper concentrates on part of the belief network induction problem, that of learning the quantitative structure (the conditional probabilities), given the qualitative structure. In particular, the current practice of rote learning the probabilities in belief networks can be significantly improved upon. We advance the idea of applying any learning algorithm to the task of conditional probability learning in belief networks, discuss potential benefits, and show results of applying neural networks and other algorithms to a medium sized car insurance belief network. The results demonstrate from 10 to 100% improvements in model error rates over the current approaches.
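
    For reference, the baseline "rote" approach the paper seeks to improve on amounts to estimating each conditional probability table from co-occurrence counts, usually with a small smoothing term so unseen parent configurations do not receive zero probability. The sketch below shows that baseline on a toy, hypothetical data set; it is not the paper's neural-network alternative.

        from collections import Counter, defaultdict

        def learn_cpt(records, child, parents, child_values, alpha=1.0):
            """Conditional probability table P(child | parents) from co-occurrence counts,
            with add-alpha (Laplace) smoothing so unseen values keep nonzero probability."""
            counts = defaultdict(Counter)
            for r in records:
                counts[tuple(r[p] for p in parents)][r[child]] += 1
            cpt = {}
            for key, ctr in counts.items():
                total = sum(ctr.values()) + alpha * len(child_values)
                cpt[key] = {v: (ctr[v] + alpha) / total for v in child_values}
            return cpt

        # Toy, hypothetical records (not the car insurance network used in the paper)
        data = [
            {"age": "young", "risk": "high"}, {"age": "young", "risk": "high"},
            {"age": "young", "risk": "low"},  {"age": "old", "risk": "low"},
            {"age": "old", "risk": "low"},    {"age": "old", "risk": "high"},
        ]
        print(learn_cpt(data, child="risk", parents=["age"], child_values=["low", "high"]))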

  12. Stretching Probability Explorations with Geoboards

    ERIC Educational Resources Information Center

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  13. ESTIMATION OF AGE TRANSITION PROBABILITIES.

    ERIC Educational Resources Information Center

    ZINTER, JUDITH R.

    THIS NOTE DESCRIBES THE PROCEDURES USED IN DETERMINING DYNAMOD II AGE TRANSITION MATRICES. A SEPARATE MATRIX FOR EACH SEX-RACE GROUP IS DEVELOPED. THESE MATRICES WILL BE USED AS AN AID IN ESTIMATING THE TRANSITION PROBABILITIES IN THE LARGER DYNAMOD II MATRIX RELATING AGE TO OCCUPATIONAL CATEGORIES. THREE STEPS WERE USED IN THE PROCEDURE--(1)…

  14. Probability Simulation in Middle School.

    ERIC Educational Resources Information Center

    Lappan, Glenda; Winter, M. J.

    1980-01-01

    Two simulations designed to teach probability to middle-school age pupils are presented. The first simulates the one-on-one foul shot situation in basketball; the second deals with collecting a set of six cereal box prizes by buying boxes containing one toy each. (MP)

  15. Some Surprising Probabilities from Bingo.

    ERIC Educational Resources Information Center

    Mercer, Joseph O.

    1993-01-01

    Investigates the probability of winning the largest prize at Bingo through a series of five simpler problems. Investigations are conducted with the aid of either BASIC computer programs, spreadsheets, or a computer algebra system such as Mathematica. Provides sample data tables to illustrate findings. (MDH)

  16. GPS: Geometry, Probability, and Statistics

    ERIC Educational Resources Information Center

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  17. Conditional Independence in Applied Probability.

    ERIC Educational Resources Information Center

    Pfeiffer, Paul E.

    This material assumes the user has the background provided by a good undergraduate course in applied probability. It is felt that introductory courses in calculus, linear algebra, and perhaps some differential equations should provide the requisite experience and proficiency with mathematical concepts, notation, and argument. The document is…

  18. Music-evoked incidental happiness modulates probability weighting during risky lottery choices

    PubMed Central

    Schulreich, Stefan; Heussen, Yana G.; Gerhardt, Holger; Mohr, Peter N. C.; Binkofski, Ferdinand C.; Koelsch, Stefan; Heekeren, Hauke R.

    2014-01-01

    We often make decisions with uncertain consequences. The outcomes of the choices we make are usually not perfectly predictable but probabilistic, and the probabilities can be known or unknown. Probability judgments, i.e., the assessment of unknown probabilities, can be influenced by evoked emotional states. This suggests that also the weighting of known probabilities in decision making under risk might be influenced by incidental emotions, i.e., emotions unrelated to the judgments and decisions at issue. Probability weighting describes the transformation of probabilities into subjective decision weights for outcomes and is one of the central components of cumulative prospect theory (CPT) that determine risk attitudes. We hypothesized that music-evoked emotions would modulate risk attitudes in the gain domain and in particular probability weighting. Our experiment featured a within-subject design consisting of four conditions in separate sessions. In each condition, the 41 participants listened to a different kind of music—happy, sad, or no music, or sequences of random tones—and performed a repeated pairwise lottery choice task. We found that participants chose the riskier lotteries significantly more often in the “happy” than in the “sad” and “random tones” conditions. Via structural regressions based on CPT, we found that the observed changes in participants' choices can be attributed to changes in the elevation parameter of the probability weighting function: in the “happy” condition, participants showed significantly higher decision weights associated with the larger payoffs than in the “sad” and “random tones” conditions. Moreover, elevation correlated positively with self-reported music-evoked happiness. Thus, our experimental results provide evidence in favor of a causal effect of incidental happiness on risk attitudes that can be explained by changes in probability weighting. PMID:24432007

  19. Music-evoked incidental happiness modulates probability weighting during risky lottery choices.

    PubMed

    Schulreich, Stefan; Heussen, Yana G; Gerhardt, Holger; Mohr, Peter N C; Binkofski, Ferdinand C; Koelsch, Stefan; Heekeren, Hauke R

    2014-01-01

    We often make decisions with uncertain consequences. The outcomes of the choices we make are usually not perfectly predictable but probabilistic, and the probabilities can be known or unknown. Probability judgments, i.e., the assessment of unknown probabilities, can be influenced by evoked emotional states. This suggests that also the weighting of known probabilities in decision making under risk might be influenced by incidental emotions, i.e., emotions unrelated to the judgments and decisions at issue. Probability weighting describes the transformation of probabilities into subjective decision weights for outcomes and is one of the central components of cumulative prospect theory (CPT) that determine risk attitudes. We hypothesized that music-evoked emotions would modulate risk attitudes in the gain domain and in particular probability weighting. Our experiment featured a within-subject design consisting of four conditions in separate sessions. In each condition, the 41 participants listened to a different kind of music-happy, sad, or no music, or sequences of random tones-and performed a repeated pairwise lottery choice task. We found that participants chose the riskier lotteries significantly more often in the "happy" than in the "sad" and "random tones" conditions. Via structural regressions based on CPT, we found that the observed changes in participants' choices can be attributed to changes in the elevation parameter of the probability weighting function: in the "happy" condition, participants showed significantly higher decision weights associated with the larger payoffs than in the "sad" and "random tones" conditions. Moreover, elevation correlated positively with self-reported music-evoked happiness. Thus, our experimental results provide evidence in favor of a causal effect of incidental happiness on risk attitudes that can be explained by changes in probability weighting. PMID:24432007
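
    To illustrate the elevation effect reported above, the sketch below evaluates a common two-parameter probability weighting function (a linear-in-log-odds form, with curvature gamma and elevation delta) at a few probabilities; raising delta lifts the whole curve, i.e., larger decision weights for given probabilities. The parameter values are invented for illustration and are not the fitted values from the experiment.

        import numpy as np

        def weight(p, gamma, delta):
            """Linear-in-log-odds weighting: w(p) = d*p^g / (d*p^g + (1-p)^g).
            gamma controls curvature, delta controls elevation."""
            num = delta * p ** gamma
            return num / (num + (1.0 - p) ** gamma)

        p = np.array([0.05, 0.25, 0.50, 0.75, 0.95])
        baseline = weight(p, gamma=0.6, delta=0.8)   # illustrative "sad / random tones" fit
        elevated = weight(p, gamma=0.6, delta=1.1)   # higher elevation, as in the "happy" condition
        for pi, w0, w1 in zip(p, baseline, elevated):
            print(f"p = {pi:.2f}:  w_baseline = {w0:.3f}  w_elevated = {w1:.3f}")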

  20. Significant impact of 2D graphene nanosheets on large volume change tin-based anodes in lithium-ion batteries: A review

    NASA Astrophysics Data System (ADS)

    Zhao, Yang; Li, Xifei; Yan, Bo; Li, Dejun; Lawes, Stephen; Sun, Xueliang

    2015-01-01

    Sn-based materials have attracted much attention as anodes in lithium ion batteries (LIBs) due to their low cost, high theoretical capacities, and high energy density. However, their practical applications are limited by the poor cyclability originating from the huge volume changes. Graphene nanosheets (GNSs), novel two-dimensional carbon sheets of single-atom thickness and among the thinnest known materials, significantly address the challenges of Sn-based anodes by acting as excellent buffering materials, and they have attracted great research interest in LIBs. In this review, various nanocomposites of GNSs/Sn-based anodes are summarized in detail, including binary and ternary composites. The significant impact of 2D GNSs on the volume change of Sn-based anodes during cycling is discussed, along with their preparation methods, properties and enhanced LIB performance.

  1. Detection probabilities in fuel cycle oriented safeguards

    SciTech Connect

    Canty, J.J.; Stein, G.; Avenhaus, R.

    1987-01-01

    An intensified discussion of evaluation criteria for International Atomic Energy Agency (IAEA) safeguards effectiveness is currently under way. Considerations basic to the establishment of such criteria are derived from the model agreement INFCIRC/153 and include threshold amounts, strategic significance, conversion times, required assurances, cost-effectiveness, and nonintrusiveness. In addition to these aspects, the extent to which fuel cycle characteristics are taken into account in safeguards implementations (Article 81c of INFCIRC/153) will be reflected in the criteria. The effectiveness of safeguards implemented under given manpower constraints is evaluated. As the significant quantity and timeliness criteria have established themselves within the safeguards community, these are taken as fixed. Detection probabilities, on the other hand, still provide a certain degree of freedom in interpretation. The problem of randomization of inspection activities across a fuel cycle, or portions thereof, is formalized as a two-person zero-sum game, the payoff function of which is the detection probability achieved by the inspectorate. It is argued, from the point of view of risk of detection, that fuel cycle-independent, minimally accepted threshold criteria for such detection probabilities cannot and should not be applied.
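
    The game-theoretic formulation can be sketched numerically: with a payoff matrix whose entries are detection probabilities for each pairing of inspection strategy and diversion location, the inspectorate's optimal randomization and the guaranteed detection probability (the game value) follow from a standard linear program. The payoff numbers below are invented for illustration.

        import numpy as np
        from scipy.optimize import linprog

        # Detection-probability payoff D[i, j]: inspector allocation i vs diversion at facility j
        D = np.array([[0.9, 0.2, 0.3],
                      [0.3, 0.8, 0.2],
                      [0.2, 0.3, 0.7]])
        n, m = D.shape

        # Variables z = (x_1..x_n, v): maximize v subject to D^T x >= v, sum(x) = 1, x >= 0
        c = np.zeros(n + 1)
        c[-1] = -1.0                                          # linprog minimizes, so minimize -v
        A_ub = np.hstack([-D.T, np.ones((m, 1))])             # v - sum_i D[i, j] * x_i <= 0 for each j
        b_ub = np.zeros(m)
        A_eq = np.hstack([np.ones((1, n)), np.zeros((1, 1))])
        b_eq = np.array([1.0])
        bounds = [(0, None)] * n + [(None, None)]

        res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=bounds)
        print("inspector's mixed strategy:", np.round(res.x[:n], 3))
        print("guaranteed detection probability (game value):", round(res.x[-1], 3))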

  2. Probability densities in strong turbulence

    NASA Astrophysics Data System (ADS)

    Yakhot, Victor

    2006-03-01

    In this work we, using Mellin’s transform combined with the Gaussian large-scale boundary condition, calculate probability densities (PDFs) of velocity increments P(δu,r), velocity derivatives P(u,r) and the PDF of the fluctuating dissipation scales Q(η,Re), where Re is the large-scale Reynolds number. The resulting expressions strongly deviate from the Log-normal PDF P(δu,r) often quoted in the literature. It is shown that the probability density of the small-scale velocity fluctuations includes information about the large (integral) scale dynamics which is responsible for the deviation of P(δu,r) from P(δu,r). An expression for the function D(h) of the multifractal theory, free from spurious logarithms recently discussed in [U. Frisch, M. Martins Afonso, A. Mazzino, V. Yakhot, J. Fluid Mech. 542 (2005) 97] is also obtained.

  3. Probability, Information and Statistical Physics

    NASA Astrophysics Data System (ADS)

    Kuzemsky, A. L.

    2016-03-01

    In this short survey review we discuss foundational issues of the probabilistic approach to information theory and statistical mechanics from a unified standpoint. Emphasis is on the inter-relations between theories. The basic aim is tutorial, i.e., to provide a basic introduction to the analysis and applications of probabilistic concepts to the description of various aspects of complexity and stochasticity. We consider probability as a foundational concept in statistical mechanics and review selected advances in the theoretical understanding of the interrelation of probability, information and statistical description with regard to basic notions of the statistical mechanics of complex systems. It also includes a synthesis of past and present research and a survey of methodology. The purpose of this terse overview is to discuss and partially describe those probabilistic methods and approaches that are used in statistical mechanics, with the aim of making these ideas easier to understand and to apply.

  4. Probability for primordial black holes

    NASA Astrophysics Data System (ADS)

    Bousso, R.; Hawking, S. W.

    1995-11-01

    We consider two quantum cosmological models with a massive scalar field: an ordinary Friedmann universe and a universe containing primordial black holes. For both models we discuss the complex solutions to the Euclidean Einstein equations. Using the probability measure obtained from the Hartle-Hawking no-boundary proposal we find that the only unsuppressed black holes start at the Planck size but can grow with the horizon scale during the roll down of the scalar field to the minimum.

  5. Relative transition probabilities of cobalt

    NASA Technical Reports Server (NTRS)

    Roig, R. A.; Miller, M. H.

    1974-01-01

    Results of determinations of neutral-cobalt transition probabilities measured relative to Co I 4150.43 A and Co II 4145.15 A, using a gas-driven shock tube as the spectroscopic light source. Results are presented for 139 Co I lines in the range from 3940 to 6640 A and 11 Co II lines in the range from 3840 to 4730 A, which are estimated to have reliabilities ranging from 8 to 50%.

  6. Dietary variety increases the probability of nutrient adequacy among adults.

    PubMed

    Foote, Janet A; Murphy, Suzanne P; Wilkens, Lynne R; Basiotis, P Peter; Carlson, Andrea

    2004-07-01

    Despite guidance to consume a variety of foods, the role of dietary variety in ensuring nutrient adequacy is unclear. The aim of this study was to determine whether a commodity-based measure of dietary variety was associated with the probability of nutrient adequacy after adjusting for energy and food group intakes. Subjects were 4969 men and 4800 women ≥19 y old who participated in the Continuing Survey of Food Intakes for Individuals 1994-1996. Using 24-h recall data, the mean probability of adequacy across 15 nutrients was calculated using the Dietary Reference Intakes. Dietary variety was defined using a commodity-based method similar to that used for the Healthy Eating Index (HEI). Associations were examined in gender-specific multivariate regression models. Energy intake was a strong predictor of the mean probability of adequacy in models controlled for age, BMI, education level, and ethnicity (model R² = 0.60 and 0.54 for men and women, respectively). Adding the number of servings from each of the 5 Food Guide Pyramid (FGP) groups to the models significantly improved the model fit (R² = 0.69 and 0.66 for men and women). Adding dietary variety again significantly improved the model fit for both men and women (R² = 0.73 and 0.70, respectively). Variety counts within the dairy and grain groups were most strongly associated with improved nutrient adequacy. Dietary variety as defined by the HEI contributes an additional component of dietary quality that is not captured by FGP servings or energy intake. PMID:15226469

  7. On the Role of Prior Probability in Adiabatic Quantum Algorithms

    NASA Astrophysics Data System (ADS)

    Sun, Jie; Lu, Songfeng; Yang, Liping

    2016-03-01

    In this paper, we study the role of the prior probability in the efficiency of the quantum local adiabatic search algorithm. The following aspects of the prior probability are found here: firstly, only the probabilities of the marked states affect the running time of the adiabatic evolution; secondly, the prior probability can be used to improve the efficiency of the adiabatic algorithm; thirdly, as in the usual quantum adiabatic evolution, when there are multiple solution states and the number of marked elements is much smaller than the size of the assigned set that contains them, the running time can be significantly larger than in the case where the assigned set contains only the marked states.

  8. VOLCANIC RISK ASSESSMENT - PROBABILITY AND CONSEQUENCES

    SciTech Connect

    G.A. Valentine; F.V. Perry; S. Dartevelle

    2005-08-26

    Risk is the product of the probability and consequences of an event. Both of these must be based upon sound science that integrates field data, experiments, and modeling, but must also be useful to decision makers who likely do not understand all aspects of the underlying science. We review a decision framework used in many fields such as performance assessment for hazardous and/or radioactive waste disposal sites that can serve to guide the volcanological community towards integrated risk assessment. In this framework the underlying scientific understanding of processes that affect probability and consequences drive the decision-level results, but in turn these results can drive focused research in areas that cause the greatest level of uncertainty at the decision level. We review two examples of the determination of volcanic event probability: (1) probability of a new volcano forming at the proposed Yucca Mountain radioactive waste repository, and (2) probability that a subsurface repository in Japan would be affected by the nearby formation of a new stratovolcano. We also provide examples of work on consequences of explosive eruptions, within the framework mentioned above. These include field-based studies aimed at providing data for "closure" of wall rock erosion terms in a conduit flow model, predictions of dynamic pressure and other variables related to damage by pyroclastic flow into underground structures, and vulnerability criteria for structures subjected to conditions of explosive eruption. Process models (e.g., multiphase flow) are important for testing the validity or relative importance of possible scenarios in a volcanic risk assessment. We show how time-dependent multiphase modeling of explosive "eruption" of basaltic magma into an open tunnel (drift) at the Yucca Mountain repository provides insight into proposed scenarios that include the development of secondary pathways to the Earth's surface. Addressing volcanic risk within a decision

  9. From data to probability densities without histograms

    NASA Astrophysics Data System (ADS)

    Berg, Bernd A.; Harris, Robert C.

    2008-09-01

    When one deals with data drawn from continuous variables, a histogram is often inadequate to display their probability density. It deals inefficiently with statistical noise, and bin sizes are free parameters. In contrast to that, the empirical cumulative distribution function (obtained after sorting the data) is parameter free. But it is a step function, so that its differentiation does not give a smooth probability density. Based on Fourier series expansion and Kolmogorov tests, we introduce a simple method, which overcomes this problem. Error bars on the estimated probability density are calculated using a jackknife method. We give several examples and provide computer code reproducing them. You may want to look at the corresponding figures 4 to 9 first. Program summary: Program title: cdf_to_pd Catalogue identifier: AEBC_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEBC_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html No. of lines in distributed program, including test data, etc.: 2758 No. of bytes in distributed program, including test data, etc.: 18 594 Distribution format: tar.gz Programming language: Fortran 77 Computer: Any capable of compiling and executing Fortran code Operating system: Any capable of compiling and executing Fortran code Classification: 4.14, 9 Nature of problem: When one deals with data drawn from continuous variables, a histogram is often inadequate to display the probability density. It deals inefficiently with statistical noise, and bin sizes are free parameters. In contrast to that, the empirical cumulative distribution function (obtained after sorting the data) is parameter free. But it is a step function, so that its differentiation does not give a smooth probability density. Solution method: Based on Fourier series expansion and Kolmogorov tests, we introduce a simple method, which
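
    The core idea of the method, smoothing the empirical cumulative distribution function with a truncated Fourier series and differentiating the series analytically, can be illustrated with a short sketch. The published Fortran program (cdf_to_pd) selects the number of Fourier terms with Kolmogorov tests and attaches jackknife error bars; the Python sketch below fixes the number of terms by hand and omits the error analysis, so it only mirrors the basic construction rather than reproducing the program.

    ```python
    import numpy as np

    def fourier_pdf(data, n_terms=8, grid=400):
        """Smooth PDF estimate from the empirical CDF via a Fourier sine series.

        Loosely follows the idea of the record: map the data to [0, 1],
        expand F(u) - u in sin(pi*k*u), and differentiate analytically.
        n_terms is fixed here; the published method selects it with
        Kolmogorov tests.
        """
        x = np.sort(np.asarray(data, dtype=float))
        lo, hi = x[0], x[-1]
        u = (x - lo) / (hi - lo)                      # data mapped to [0, 1]
        t = np.linspace(0.0, 1.0, grid)
        ecdf = np.searchsorted(u, t, side="right") / len(u)
        r = ecdf - t                                  # deviation from uniform CDF
        ks = np.arange(1, n_terms + 1)
        # Crude quadrature for b_k = 2 * integral_0^1 r(t) sin(pi k t) dt.
        b = np.array([2.0 * np.mean(r * np.sin(np.pi * k * t)) for k in ks])
        # f(u) = 1 + sum_k b_k * pi * k * cos(pi k u); rescale back to x units.
        pdf_u = 1.0 + (b * np.pi * ks) @ np.cos(np.pi * np.outer(ks, t))
        return lo + t * (hi - lo), np.clip(pdf_u, 0.0, None) / (hi - lo)

    # Example: a smooth density estimate for standard normal samples.
    xs, pdf = fourier_pdf(np.random.default_rng(0).normal(size=2000))
    ```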

  10. Conflict Probability Estimation for Free Flight

    NASA Technical Reports Server (NTRS)

    Paielli, Russell A.; Erzberger, Heinz

    1996-01-01

    The safety and efficiency of free flight will benefit from automated conflict prediction and resolution advisories. Conflict prediction is based on trajectory prediction, however, and becomes less certain the farther in advance the prediction is made. An estimate is therefore needed of the probability that a conflict will occur, given a pair of predicted trajectories and their levels of uncertainty. A method is developed in this paper to estimate that conflict probability. The trajectory prediction errors are modeled as normally distributed, and the two error covariances for an aircraft pair are combined into a single equivalent covariance of the relative position. A coordinate transformation is then used to derive an analytical solution. Numerical examples and Monte Carlo validation are presented.
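
    Under the Gaussian error model described above, the conflict probability can also be approximated numerically. The sketch below is not the analytical coordinate-transformation solution of the paper; it simply combines the two covariances into a relative-position covariance and estimates, by Monte Carlo, the probability that the predicted miss distance falls below a separation threshold (the threshold and covariance values are illustrative assumptions).

    ```python
    import numpy as np

    def conflict_probability(rel_pos, cov_a, cov_b, sep=9.26, n=200_000, seed=1):
        """Monte Carlo estimate of horizontal conflict probability.

        rel_pos : predicted relative position (km) of aircraft B w.r.t. A
                  at the time of closest approach (2-vector).
        cov_a, cov_b : 2x2 prediction-error covariances (km^2); errors are
                  assumed independent, so they add for the relative position.
        sep     : separation threshold in km (9.26 km ~ 5 nmi, an assumption).
        """
        rng = np.random.default_rng(seed)
        cov_rel = np.asarray(cov_a) + np.asarray(cov_b)
        samples = rng.multivariate_normal(rel_pos, cov_rel, size=n)
        return np.mean(np.linalg.norm(samples, axis=1) < sep)

    # Example: 12 km predicted miss distance, anisotropic along/cross-track errors.
    p = conflict_probability([12.0, 0.0],
                             [[25.0, 0.0], [0.0, 4.0]],
                             [[16.0, 0.0], [0.0, 4.0]])
    ```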

  11. Approaches to Evaluating Probability of Collision Uncertainty

    NASA Technical Reports Server (NTRS)

    Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    While the two-dimensional probability of collision (Pc) calculation has served as the main input to conjunction analysis risk assessment for over a decade, it has done this mostly as a point estimate, with relatively little effort made to produce confidence intervals on the Pc value based on the uncertainties in the inputs. The present effort seeks to carry these uncertainties through the calculation in order to generate a probability density of Pc results rather than a single average value. Methods for assessing uncertainty in the primary and secondary objects' physical sizes and state estimate covariances, as well as a resampling approach to reveal the natural variability in the calculation, are presented; and an initial proposal for operationally-useful display and interpretation of these data for a particular conjunction is given.

  12. Estimation of transition probabilities of credit ratings

    NASA Astrophysics Data System (ADS)

    Peng, Gan Chew; Hin, Pooi Ah

    2015-12-01

    The present research is based on the quarterly credit ratings of ten companies over 15 years taken from the database of the Taiwan Economic Journal. The components of the vector mi = (mi1, mi2, ..., mi10) may first be used to denote the credit ratings of the ten companies in the i-th quarter. The vector mi+1 in the next quarter is modelled to be dependent on the vector mi via a conditional distribution which is derived from a 20-dimensional power-normal mixture distribution. The transition probability Pkl(i, j) for getting mi+1,j = l given that mi,j = k is then computed from the conditional distribution. It is found that the variation of the transition probability Pkl(i, j) as i varies is able to give an indication of the possible transition of the credit rating of the j-th company in the near future.
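
    The record derives transition probabilities from a fitted 20-dimensional power-normal mixture, which is beyond a short sketch. The simpler count-based estimator below only shows what a quarter-to-quarter rating transition matrix looks like when estimated directly from an observed rating sequence; the 5-level rating scale and the toy sequence are assumptions for illustration.

    ```python
    import numpy as np

    def transition_matrix(ratings, n_states):
        """Count-based estimate of P(next = l | current = k) from one
        company's quarterly rating sequence, coded as integers 0..n_states-1."""
        counts = np.zeros((n_states, n_states))
        for k, l in zip(ratings[:-1], ratings[1:]):
            counts[k, l] += 1
        rows = counts.sum(axis=1, keepdims=True)
        return np.divide(counts, rows, out=np.zeros_like(counts), where=rows > 0)

    # Example: a toy 15-year (60-quarter) rating path on a 5-level scale.
    rng = np.random.default_rng(0)
    seq = np.clip(np.cumsum(rng.integers(-1, 2, size=60)) + 2, 0, 4)
    P = transition_matrix(seq, n_states=5)
    ```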

  13. Preservice Elementary Teachers and the Fundamentals of Probability

    ERIC Educational Resources Information Center

    Dollard, Clark

    2011-01-01

    This study examined how preservice elementary teachers think about situations involving probability. Twenty-four preservice elementary teachers who had not yet studied probability as part of their preservice elementary mathematics coursework were interviewed using a task-based interview. The participants' responses showed a wide variety of…

  14. Public Attitudes toward Stuttering in Turkey: Probability versus Convenience Sampling

    ERIC Educational Resources Information Center

    Ozdemir, R. Sertan; St. Louis, Kenneth O.; Topbas, Seyhun

    2011-01-01

    Purpose: A Turkish translation of the "Public Opinion Survey of Human Attributes-Stuttering" ("POSHA-S") was used to compare probability versus convenience sampling to measure public attitudes toward stuttering. Method: A convenience sample of adults in Eskisehir, Turkey was compared with two replicates of a school-based, probability cluster…

  15. Probability Theory, Not the Very Guide of Life

    ERIC Educational Resources Information Center

    Juslin, Peter; Nilsson, Hakan; Winman, Anders

    2009-01-01

    Probability theory has long been taken as the self-evident norm against which to evaluate inductive reasoning, and classical demonstrations of violations of this norm include the conjunction error and base-rate neglect. Many of these phenomena require multiplicative probability integration, whereas people seem more inclined to linear additive…

  16. Bacteria survival probability in bactericidal filter paper.

    PubMed

    Mansur-Azzam, Nura; Hosseinidoust, Zeinab; Woo, Su Gyeong; Vyhnalkova, Renata; Eisenberg, Adi; van de Ven, Theo G M

    2014-05-01

    Bactericidal filter papers offer the simplicity of gravity filtration to simultaneously eradicate microbial contaminants and particulates. We previously detailed the development of biocidal block copolymer micelles that could be immobilized on a filter paper to actively eradicate bacteria. Despite the many advantages offered by this system, its widespread use is hindered by its unknown mechanism of action which can result in non-reproducible outcomes. In this work, we sought to investigate the mechanism by which a certain percentage of Escherichia coli cells survived when passing through the bactericidal filter paper. Through the process of elimination, the possibility that the bacterial survival probability was controlled by the initial bacterial load or the existence of resistant sub-populations of E. coli was dismissed. It was observed that increasing the thickness or the number of layers of the filter significantly decreased bacterial survival probability for the biocidal filter paper but did not affect the efficiency of the blank filter paper (no biocide). The survival probability of bacteria passing through the antibacterial filter paper appeared to depend strongly on the number of collision between each bacterium and the biocide-loaded micelles. It was thus hypothesized that during each collision a certain number of biocide molecules were directly transferred from the hydrophobic core of the micelle to the bacterial lipid bilayer membrane. Therefore, each bacterium must encounter a certain number of collisions to take up enough biocide to kill the cell and cells that do not undergo the threshold number of collisions are expected to survive. PMID:24681395
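
    The collision-threshold hypothesis at the end of this abstract lends itself to a simple quantitative illustration: if the number of bacterium-micelle collisions is roughly Poisson with a mean proportional to filter thickness, the survival probability is the probability of falling short of the lethal collision count. The collisions-per-layer and threshold values below are made-up illustrative numbers, not parameters estimated in the study.

    ```python
    from scipy.stats import poisson

    def survival_probability(layers, collisions_per_layer=3.0, lethal_threshold=10):
        """P(survive) = P(N < lethal_threshold) with N ~ Poisson(layers * rate).

        Illustrates the hypothesized mechanism: a cell survives only if it
        undergoes fewer than the threshold number of biocide-transferring
        collisions while passing through the filter.
        """
        lam = layers * collisions_per_layer
        return poisson.cdf(lethal_threshold - 1, lam)

    # Survival drops sharply as filter layers (and hence collisions) increase.
    for n_layers in (1, 2, 3, 4):
        print(n_layers, round(survival_probability(n_layers), 4))
    ```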

  17. Probability of Detection Demonstration Transferability

    NASA Technical Reports Server (NTRS)

    Parker, Bradford H.

    2008-01-01

    The ongoing Mars Science Laboratory (MSL) Propellant Tank Penetrant Nondestructive Evaluation (NDE) Probability of Detection (POD) Assessment (NESC activity) has surfaced several issues associated with liquid penetrant POD demonstration testing. This presentation lists factors that may influence the transferability of POD demonstration tests. Initial testing will address the liquid penetrant inspection technique. Some of the factors to be considered in this task are crack aspect ratio, the extent of the crack opening, the material and the distance between the inspection surface and the inspector's eye.

  18. Probability, statistics, and computational science.

    PubMed

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters. PMID:22407706

  19. Mapping genes with longitudinal phenotypes via Bayesian posterior probabilities.

    PubMed

    Musolf, Anthony; Nato, Alejandro Q; Londono, Douglas; Zhou, Lisheng; Matise, Tara C; Gordon, Derek

    2014-01-01

    Most association studies focus on disease risk, with less attention paid to disease progression or severity. These phenotypes require longitudinal data. This paper presents a new method for analyzing longitudinal data to map genes in both population-based and family-based studies. Using simulated systolic blood pressure measurements obtained from Genetic Analysis Workshop 18, we cluster the phenotype data into trajectory subgroups. We then use the Bayesian posterior probability of being in the high subgroup as a quantitative trait in an association analysis with genotype data. This method maintains high power (>80%) in locating genes known to affect the simulated phenotype for most specified significance levels (α). We believe that this method can be useful to aid in the discovery of genes that affect severity or progression of disease. PMID:25519410
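
    As a rough sketch of the pipeline described here (cluster longitudinal phenotypes, then use the posterior probability of membership in the high-trajectory subgroup as a quantitative trait), the code below fits a two-component Gaussian mixture to simple per-subject trajectory summaries and regresses the resulting posterior probability on a genotype. The summary features, the mixture model, and the single-SNP linear test are simplifying assumptions, not the workshop's actual analysis.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture
    from scipy import stats

    rng = np.random.default_rng(2)

    # Toy longitudinal phenotype: per-subject intercept and slope of blood pressure.
    n = 500
    genotype = rng.integers(0, 3, size=n)                  # 0/1/2 minor-allele counts
    slope = 0.5 + 0.3 * genotype + rng.normal(0, 0.4, n)   # genotype shifts trajectory
    intercept = rng.normal(120, 10, n)
    features = np.column_stack([intercept, slope])

    # Two trajectory subgroups; posterior probability of the "high" component.
    gmm = GaussianMixture(n_components=2, random_state=0).fit(features)
    high = np.argmax(gmm.means_[:, 1])                     # component with steeper slope
    trait = gmm.predict_proba(features)[:, high]

    # Association test: regress the posterior-probability trait on genotype.
    res = stats.linregress(genotype, trait)
    print(f"beta={res.slope:.3f}, p={res.pvalue:.2e}")
    ```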

  20. Quantum probabilities from quantum entanglement: experimentally unpacking the Born rule

    DOE PAGESBeta

    Harris, Jérémie; Bouchard, Frédéric; Santamato, Enrico; Zurek, Wojciech H.; Boyd, Robert W.; Karimi, Ebrahim

    2016-05-01

    The Born rule, a foundational axiom used to deduce probabilities of events from wavefunctions, is indispensable in the everyday practice of quantum physics. It is also key in the quest to reconcile the ostensibly inconsistent laws of the quantum and classical realms, as it confers physical significance to reduced density matrices, the essential tools of decoherence theory. Following Bohr's Copenhagen interpretation, textbooks postulate the Born rule outright. But recent attempts to derive it from other quantum principles have been successful, holding promise for simplifying and clarifying the quantum foundational bedrock. Moreover, a major family of derivations is based on envariance, a recently discovered symmetry of entangled quantum states. Here, we identify and experimentally test three premises central to these envariance-based derivations, thus demonstrating, in the microworld, the symmetries from which the Born rule is derived. Furthermore, we demonstrate envariance in a purely local quantum system, showing its independence from relativistic causality.

  1. Quantum probabilities from quantum entanglement: experimentally unpacking the Born rule

    NASA Astrophysics Data System (ADS)

    Harris, Jérémie; Bouchard, Frédéric; Santamato, Enrico; Zurek, Wojciech H.; Boyd, Robert W.; Karimi, Ebrahim

    2016-05-01

    The Born rule, a foundational axiom used to deduce probabilities of events from wavefunctions, is indispensable in the everyday practice of quantum physics. It is also key in the quest to reconcile the ostensibly inconsistent laws of the quantum and classical realms, as it confers physical significance to reduced density matrices, the essential tools of decoherence theory. Following Bohr’s Copenhagen interpretation, textbooks postulate the Born rule outright. However, recent attempts to derive it from other quantum principles have been successful, holding promise for simplifying and clarifying the quantum foundational bedrock. A major family of derivations is based on envariance, a recently discovered symmetry of entangled quantum states. Here, we identify and experimentally test three premises central to these envariance-based derivations, thus demonstrating, in the microworld, the symmetries from which the Born rule is derived. Further, we demonstrate envariance in a purely local quantum system, showing its independence from relativistic causality.

  2. Evolution of EF-hand calcium-modulated proteins. III. Exon sequences confirm most dendrograms based on protein sequences: calmodulin dendrograms show significant lack of parallelism

    NASA Technical Reports Server (NTRS)

    Nakayama, S.; Kretsinger, R. H.

    1993-01-01

    In the first report in this series we presented dendrograms based on 152 individual proteins of the EF-hand family. In the second we used sequences from 228 proteins, containing 835 domains, and showed that eight of the 29 subfamilies are congruent and that the EF-hand domains of the remaining 21 subfamilies have diverse evolutionary histories. In this study we have computed dendrograms within and among the EF-hand subfamilies using the encoding DNA sequences. In most instances the dendrograms based on protein and on DNA sequences are very similar. Significant differences between protein and DNA trees for calmodulin remain unexplained. In our fourth report we evaluate the sequences and the distribution of introns within the EF-hand family and conclude that exon shuffling did not play a significant role in its evolution.

  3. Measure and Probability in Cosmology

    NASA Astrophysics Data System (ADS)

    Schiffrin, Joshua; Wald, Robert

    2012-03-01

    General relativity has a Hamiltonian formulation, which formally provides a canonical (Liouville) measure on the space of solutions. A number of authors have used the restriction of this measure to the space of homogeneous and isotropic universes with scalar field matter (minisuperspace)---namely, the Gibbons-Hawking-Stewart measure---to make arguments about the likelihood of inflation. We argue here that there are at least four major difficulties with using the measure of general relativity to make probability arguments in cosmology: (1) Equilibration does not occur on cosmological length scales. (2) Even in the minisuperspace case, the measure of phase space is infinite and the computation of probabilities depends very strongly on how the infinity is regulated. (3) The inhomogeneous degrees of freedom must be taken into account even if one is interested only in universes that are very nearly homogeneous. The measure depends upon how the infinite number of degrees of freedom are truncated, and how one defines ``nearly homogeneous''. (4) In a universe where the second law of thermodynamics holds, one cannot make use of our knowledge of the present state of the universe to ``retrodict'' the likelihood of past conditions.

  4. Surface-Based fMRI-Driven Diffusion Tractography in the Presence of Significant Brain Pathology: A Study Linking Structure and Function in Cerebral Palsy

    PubMed Central

    Reid, Lee B.; Cunnington, Ross; Boyd, Roslyn N.; Rose, Stephen E.

    2016-01-01

    Diffusion MRI (dMRI) tractography analyses are difficult to perform in the presence of brain pathology. Automated methods that rely on cortical parcellation for structural connectivity studies often fail, while manually defining regions is extremely time consuming and can introduce human error. Both methods also make assumptions about structure-function relationships that may not hold after cortical reorganisation. Seeding tractography with functional-MRI (fMRI) activation is an emerging method that reduces these confounds, but inherent smoothing of fMRI signal may result in the inclusion of irrelevant pathways. This paper describes a novel fMRI-seeded dMRI-analysis pipeline based on surface-meshes that reduces these issues and utilises machine-learning to generate task specific white matter pathways, minimising the requirement for manually-drawn ROIs. We directly compared this new strategy to a standard voxelwise fMRI-dMRI approach, by investigating correlations between clinical scores and dMRI metrics of thalamocortical and corticomotor tracts in 31 children with unilateral cerebral palsy. The surface-based approach successfully processed more participants (87%) than the voxel-based approach (65%), and provided significantly more-coherent tractography. Significant correlations between dMRI metrics and five clinical scores of function were found for the more superior regions of these tracts. These significant correlations were stronger and more frequently found with the surface-based method (15/20 investigated were significant; R2 = 0.43–0.73) than the voxelwise analysis (2 sig. correlations; 0.38 & 0.49). More restricted fMRI signal, better-constrained tractography, and the novel track-classification method all appeared to contribute toward these differences. PMID:27487011

  5. Surface-Based fMRI-Driven Diffusion Tractography in the Presence of Significant Brain Pathology: A Study Linking Structure and Function in Cerebral Palsy.

    PubMed

    Reid, Lee B; Cunnington, Ross; Boyd, Roslyn N; Rose, Stephen E

    2016-01-01

    Diffusion MRI (dMRI) tractography analyses are difficult to perform in the presence of brain pathology. Automated methods that rely on cortical parcellation for structural connectivity studies often fail, while manually defining regions is extremely time consuming and can introduce human error. Both methods also make assumptions about structure-function relationships that may not hold after cortical reorganisation. Seeding tractography with functional-MRI (fMRI) activation is an emerging method that reduces these confounds, but inherent smoothing of fMRI signal may result in the inclusion of irrelevant pathways. This paper describes a novel fMRI-seeded dMRI-analysis pipeline based on surface-meshes that reduces these issues and utilises machine-learning to generate task specific white matter pathways, minimising the requirement for manually-drawn ROIs. We directly compared this new strategy to a standard voxelwise fMRI-dMRI approach, by investigating correlations between clinical scores and dMRI metrics of thalamocortical and corticomotor tracts in 31 children with unilateral cerebral palsy. The surface-based approach successfully processed more participants (87%) than the voxel-based approach (65%), and provided significantly more-coherent tractography. Significant correlations between dMRI metrics and five clinical scores of function were found for the more superior regions of these tracts. These significant correlations were stronger and more frequently found with the surface-based method (15/20 investigated were significant; R2 = 0.43-0.73) than the voxelwise analysis (2 sig. correlations; 0.38 & 0.49). More restricted fMRI signal, better-constrained tractography, and the novel track-classification method all appeared to contribute toward these differences. PMID:27487011

  6. Classical probabilities for Majorana and Weyl spinors

    SciTech Connect

    Wetterich, C.

    2011-08-15

    Highlights: (1) Map of classical statistical Ising model to fermionic quantum field theory. (2) Lattice-regularized real Grassmann functional integral for single Weyl spinor. (3) Emerging complex structure characteristic for quantum physics. (4) A classical statistical ensemble describes a quantum theory. Abstract: We construct a map between the quantum field theory of free Weyl or Majorana fermions and the probability distribution of a classical statistical ensemble for Ising spins or discrete bits. More precisely, a Grassmann functional integral based on a real Grassmann algebra specifies the time evolution of the real wave function q_τ(t) for the Ising states τ. The time-dependent probability distribution of a generalized Ising model obtains as p_τ(t) = q_τ²(t). The functional integral employs a lattice regularization for single Weyl or Majorana spinors. We further introduce the complex structure characteristic for quantum mechanics. Probability distributions of the Ising model which correspond to one or many propagating fermions are discussed explicitly. Expectation values of observables can be computed equivalently in the classical statistical Ising model or in the quantum field theory for fermions.

  7. A Quantum Probability Model of Causal Reasoning

    PubMed Central

    Trueblood, Jennifer S.; Busemeyer, Jerome R.

    2012-01-01

    People can often outperform statistical methods and machine learning algorithms in situations that involve making inferences about the relationship between causes and effects. While people are remarkably good at causal reasoning in many situations, there are several instances where they deviate from expected responses. This paper examines three situations where judgments related to causal inference problems produce unexpected results and describes a quantum inference model based on the axiomatic principles of quantum probability theory that can explain these effects. Two of the three phenomena arise from the comparison of predictive judgments (i.e., the conditional probability of an effect given a cause) with diagnostic judgments (i.e., the conditional probability of a cause given an effect). The third phenomenon is a new finding examining order effects in predictive causal judgments. The quantum inference model uses the notion of incompatibility among different causes to account for all three phenomena. Psychologically, the model assumes that individuals adopt different points of view when thinking about different causes. The model provides good fits to the data and offers a coherent account for all three causal reasoning effects thus proving to be a viable new candidate for modeling human judgment. PMID:22593747

  8. Augmenting Transition Probabilities for Neutral Atomic Nitrogen

    NASA Technical Reports Server (NTRS)

    Terrazas-Salines, Imelda; Park, Chul; Strawa, Anthony W.; Hartman, G. Joseph (Technical Monitor)

    1996-01-01

    The transition probability values for a number of neutral atomic nitrogen (NI) lines in the visible wavelength range are determined in order to augment those given in the National Bureau of Standards Tables. These values are determined from experimentation as well as by using the published results of other investigators. The experimental determination of the lines in the 410 to 430 nm range was made from the observation of the emission from the arc column of an arc-heated wind tunnel. The transition probability values of these NI lines are determined to an accuracy of +/- 30% by comparison of their measured intensities with those of the atomic oxygen (OI) multiplet at around 615 nm. The temperature of the emitting medium is determined both using a multiple-layer model, based on a theoretical model of the flow in the arc column, and an empirical single-layer model. The results show that the two models lead to the same values of transition probabilities for the NI lines.

  9. Evidence-Based Practices Are Not Reformulated Best Practices: A Response to Martindale's "Children with Significant Hearing Loss: Learning to Listen, Talk, and Read--Evidence-Based Best Practices"

    ERIC Educational Resources Information Center

    Schirmer, Barbara R.; Williams, Cheri

    2008-01-01

    "Communication Disorders Quarterly's" special series on evidence-based practices and, specifically, Martindale's article on evidence-based practices in learning to listen, talk, and read among children with significant hearing loss appear to confuse best practices with evidence-based practices and, perhaps more serious, offer little evidence for…

  10. Investigation of Flood Inundation Probability in Taiwan

    NASA Astrophysics Data System (ADS)

    Wang, Chia-Ho; Lai, Yen-Wei; Chang, Tsang-Jung

    2010-05-01

    Taiwan lies in the path of typhoons from the northeast Pacific Ocean and is situated in a tropical-subtropical transition zone. As a result, rainfall is abundant all year round, especially in summer and autumn. Flood inundation analysis in Taiwan involves many uncertainties in hydrological, hydraulic and land-surface topography characteristics, which can change the resulting inundation. According to the 7th work item of Article 22 of the Disaster Prevention and Protection Act in Taiwan, to keep flood disasters from worsening, the investigation and analysis of disaster potentials, hazard degree and scenario simulation must proceed with scientific approaches. However, conventional flood potential analysis uses a deterministic approach to define flood inundation without considering data uncertainties. This research incorporates data uncertainty into flood inundation maps to show the flood probability of each grid cell, which can serve as a basis for emergency evacuation when typhoons approach and extremely torrential rain begins. The Hebauyu watershed of Chiayi County is selected as the demonstration area. Owing to the uncertainties of the data used, a sensitivity analysis is first conducted using Latin Hypercube sampling (LHS). The LHS data sets are then input into an integrated numerical model, developed herein to assess flood inundation hazards in coastal lowlands, based on an extension of a 1-D river routing model and a 2-D inundation routing model. Finally, the probability of flood inundation is calculated across the simulations, and flood inundation probability maps are obtained. These probability maps can serve as an alternative to the older flood potential maps when planning new hydraulic infrastructure in the future.
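
    A minimal sketch of the uncertainty-propagation step (Latin Hypercube sampling of uncertain inputs, followed by counting how often each grid cell floods) is shown below. The inundation model itself is a stand-in placeholder, and the parameter names, bounds and depth threshold are assumptions; only the LHS sampling and the per-cell probability bookkeeping reflect the procedure described in the abstract.

    ```python
    import numpy as np
    from scipy.stats import qmc

    def inundation_depth(params, n_cells=100):
        """Placeholder for the 1-D river routing / 2-D inundation model:
        returns a water depth per grid cell for one parameter set."""
        roughness, rainfall, levee_height = params
        base = rainfall * 0.02 / roughness
        return np.maximum(base - levee_height + np.linspace(0.5, -0.5, n_cells), 0.0)

    # Latin Hypercube sample of the uncertain inputs (bounds are assumptions).
    sampler = qmc.LatinHypercube(d=3, seed=0)
    unit = sampler.random(n=200)
    params = qmc.scale(unit, l_bounds=[0.02, 100.0, 0.5], u_bounds=[0.08, 600.0, 2.0])

    # Flood probability per cell = fraction of LHS runs with depth above 0.3 m.
    depths = np.array([inundation_depth(p) for p in params])
    flood_probability = (depths > 0.3).mean(axis=0)
    ```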

  11. Recent Advances in Model-Assisted Probability of Detection

    NASA Technical Reports Server (NTRS)

    Thompson, R. Bruce; Brasche, Lisa J.; Lindgren, Eric; Swindell, Paul; Winfree, William P.

    2009-01-01

    The increased role played by probability of detection (POD) in structural integrity programs, combined with the significant time and cost associated with the purely empirical determination of POD, provides motivation for alternate means to estimate this important metric of NDE techniques. One approach to make the process of POD estimation more efficient is to complement limited empirical experiments with information from physics-based models of the inspection process or controlled laboratory experiments. The Model-Assisted Probability of Detection (MAPOD) Working Group was formed by the Air Force Research Laboratory, the FAA Technical Center, and NASA to explore these possibilities. Since the 2004 inception of the MAPOD Working Group, 11 meetings have been held in conjunction with major NDE conferences. This paper will review the accomplishments of this group, which includes over 90 members from around the world. Included will be a discussion of strategies developed to combine physics-based and empirical understanding, draft protocols that have been developed to guide application of the strategies, and demonstrations that have been or are being carried out in a number of countries. The talk will conclude with a discussion of future directions, which will include documentation of benefits via case studies, development of formal protocols for engineering practice, as well as a number of specific technical issues.

  12. Is probability matching smart? Associations between probabilistic choices and cognitive ability.

    PubMed

    Stanovich, Keith E

    2003-03-01

    In three experiments involving over 1,500 university students (n = 1,557) and two different probabilistic choice tasks, we found that the utility-maximizing strategy of choosing the most probable alternative was not the majority response. In a story problem version of a probabilistic choice task in which participants chose from among five different strategies, the maximizing response and the probability-matching response were each selected by a similar number of students (roughly 35% of the sample selected each). In a more continuous, or trial-by-trial, task, the utility-maximizing response was chosen by only one half as many students as the probability-matching response. More important, in both versions of the task, the participants preferring the utility-maximizing response were significantly higher in cognitive ability than were the participants showing a probability-matching tendency. Critiques of the traditional interpretation of probability matching as nonoptimal may well help explain why some humans are drawn to the nonmaximizing behavior of probability matching, but the traditional heuristics and biases interpretation can most easily accommodate the finding that participants high in computational ability are more likely to carry out the rule-based cognitive procedures that lead to maximizing behavior. PMID:12749466
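
    The sense in which maximizing beats matching is easy to make concrete. If one outcome occurs with probability p and the other with 1 − p, always predicting the more likely outcome is correct with probability max(p, 1 − p), whereas matching one's prediction frequencies to the outcome frequencies is correct with probability p² + (1 − p)². The short calculation below, using an assumed p = 0.7, illustrates the gap.

    ```python
    def expected_accuracy(p):
        """Long-run accuracy of the two strategies for a binary outcome
        that occurs with probability p on each independent trial."""
        maximizing = max(p, 1 - p)            # always predict the likelier outcome
        matching = p**2 + (1 - p)**2          # predict each outcome at its base rate
        return maximizing, matching

    print(expected_accuracy(0.7))   # (0.7, 0.58): matching gives up 12 points
    ```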

  13. Reduction of discretization error for ray tracing of MOC through a correction on collision probabilities

    SciTech Connect

    Tabuchi, M.; Tatsumi, M.; Yamamoto, A.; Endo, T.

    2013-07-01

    A new correction model for ray tracing of the method of characteristics is proposed in order to reduce discretization error. Ray tracing parameters such as azimuthal angle division, polar angle division and ray separation are considered in this study. In the method of characteristics, region average scalar fluxes can be implicitly expressed by collision probabilities, although these collision probabilities are not directly treated in the ordinary calculation scheme. From this viewpoint, the difference between a coarse ray tracing condition and a detailed one can be interpreted as the difference in the estimation of collision probabilities. In other words, the discretization error for ray tracing can be recognized as a consequence of inaccurate collision probabilities caused by coarse ray tracing. This discussion suggests that accurate region average scalar flux can be obtained through an appropriate correction on collision probabilities. In this paper, a correction model on collision probabilities is theoretically derived based on the neutron balance equation, and its validity is confirmed through typical single assembly calculations. The effectiveness of the present correction method is also discussed in this paper. It is confirmed that discretization error for ray tracing can be significantly reduced by the present correction method in a multi-assembly calculation, though the correction factor is estimated in single assembly geometry. (authors)

  14. Cognitive behavior therapy-based psychoeducational groups for adults with ADHD and their significant others (PEGASUS): an open clinical feasibility trial.

    PubMed

    Hirvikoski, T; Waaler, E; Lindström, T; Bölte, S; Jokinen, J

    2015-03-01

    The aim of this pilot study was to investigate the feasibility and effectiveness of a new psychoeducative intervention program (PEGASUS) for adults with ADHD and their significant others in a psychiatric outpatient context. At three outpatient psychiatric clinics, adults with ADHD and their significant others took part in PEGASUS, a psychoeducational program based on theories from cognitive behavioral therapy, neuropsychology, and cross-disciplinary evidence regarding ADHD. In total, 108 adults were allocated to treatment (51 with ADHD and their 57 significant others). Feasibility was evaluated regarding suitability of the intervention at a psychiatric outpatient clinic and treatment completion. Preliminary efficacy was evaluated per protocol from baseline to post-intervention (n = 41 adults with ADHD and 40 significant others). In a feasibility analysis, the intervention was judged to be a suitable treatment option for 94.5 % of all individuals with a primary diagnosis of ADHD at an outpatient psychiatric clinic. In total, 43 out of 51 allocated individuals with ADHD (84.3 %) completed the intervention. The corresponding figures for their significant others were 42 out of 57 (73.7 %). Knowledge about ADHD increased, and both the quality of relationships and psychological well-being improved from baseline to post-intervention in all participants. The significant others reported a reduction in the subjective burden of care, such as worry and guilt. The objective burden of care (such as financial problems) did not change. The findings support the potential value of psychoeducation for adults with ADHD and their significant others. An ongoing randomized controlled trial will generate further evidence concerning the PEGASUS program. PMID:24863143

  15. Imprecise probability for non-commuting observables

    NASA Astrophysics Data System (ADS)

    Allahverdyan, Armen E.

    2015-08-01

    It is known that non-commuting observables in quantum mechanics do not have a joint probability. This statement refers to the precise (additive) probability model. I show that the joint distribution of any non-commuting pair of variables can be quantified via upper and lower probabilities, i.e. the joint probability is described by an interval instead of a number (imprecise probability). I propose transparent axioms from which the upper and lower probability operators follow. The imprecise probability depends on the non-commuting observables, is linear over the state (density matrix) and reverts to the usual expression for commuting observables.

  16. Snell Envelope with Small Probability Criteria

    SciTech Connect

    Del Moral, Pierre Hu, Peng; Oudjane, Nadia

    2012-12-15

    We present a new algorithm to compute the Snell envelope in the specific case where the criterion to optimize is associated with a small probability or a rare event. This new approach combines the Stochastic Mesh approach of Broadie and Glasserman with a particle approximation scheme based on a specific change of measure designed to concentrate the computational effort in regions pointed out by the criterion. The theoretical analysis of this new algorithm provides non-asymptotic convergence estimates. Finally, the numerical tests confirm the practical interest of this approach.

  17. The Significance of Emotions and Professional Relations for Accommodating a Web-Based Ulcer Record and Improving Home-Based Care

    PubMed Central

    Ekeland, Anne G.

    2015-01-01

    Evidence of technological performance, medical improvements and economic effectiveness is generally considered sufficient for judging advances in healthcare. In this paper, I aim to add knowledge about the ways human emotions and professional relations play roles in the processes of accommodating new technologies for quality improvements. A newly-implemented, web-based ulcer record service for patients with chronic skin ulcers constitutes the case. After one year, only a few home care nurses were using the service, interacting with a specialist team. The result was disappointing, but the few users were enthusiastic. An explorative, qualitative study was initiated to understand the users, the processes that accounted for use and how improvements were enacted. In the paper, I expose the emotional aspects of the record accommodation by analyzing the ways emotions were translated in the process and how they influenced the improvements. I contend that use came about through a heterogeneous assemblage of ethical engagement and compassionate emotions stemming from frustration, combined with technological affordances and relations between different professionals. Certain aspects of the improvements are exposed. These are discussed as: (1) reconciliations between the medical facts and rational judgments, on one side, and the emotional and subjective values for judging quality, on the other; and (2) mediation between standardized and personalized care. The healing of ulcers was combined with a sense of purpose and wellbeing to validate improvements. Emotions were strongly involved, and the power of evaluative emotions and professional relations should be further explored to add to the understanding of innovation processes and to validate quality improvements. PMID:27417745

  18. Towards Defining Probability Forecasts of Likely Climate Change

    NASA Astrophysics Data System (ADS)

    Smith, L. A.; Allen, M. R.; Stainforth, D. A.

    2004-12-01

    There is a strong desire for probabilistic forecasts of climate change, both for policy making and risk management, as well as scientific interest. The extent to which this desire can be satisfied scientifically is unclear. The aim of this paper is to explore (i) current methods for extracting probability forecasts and (ii) alternative deliverables which have a firm scientific basis given the current limitations to the state of the art. Even ``physics-based" models contain empirically determined parameters and parameterizations. While it is straightforward to make `ensembles' over initial conditions, parameter values and even several model structures, the interpretation of the resulting ensemble of simulations requires some care. Methods for extracting probability forecasts from ensembles of model simulations will be discussed in terms of their relevance and internal consistency, a particular example being provided by Murphy et al (Nature, 2004). This approach will be compared and contrasted with one proposed by climateprediction.net (Stainforth et al, Nature, in review), which strives to produce policy relevant information when no coherent probability forecast can be extracted from the limited ensembles available in practice. The role of the Bayesian paradigm will be considered in both cases. Extracting probability distributions in the context of climate change requires consideration of a discrete sample drawn from a high dimensional space. The analysis of small ensembles requires further assumptions of linearity and smoothness which must be verified explicitly; even when ``large" ensembles are to hand, the analysis of the collection of simulations requires combining mutually exclusive runs, sampling a restricted region of the parameter space, under a set of models with related shortcomings. Historical observations serve to lift some of these difficulties, but attempts to fold observations into the analysis (say, in terms of weighting sets of simulations differently

  19. Landslide Probability Assessment by the Derived Distributions Technique

    NASA Astrophysics Data System (ADS)

    Muñoz, E.; Ochoa, A.; Martínez, H.

    2012-12-01

    Landslides are potentially disastrous events that bring along human and economic losses, especially in cities where accelerated and unorganized growth leads to settlements on steep and potentially unstable areas. Among the main causes of landslides are geological, geomorphological, geotechnical, climatological and hydrological conditions and anthropic intervention. This paper studies landslides detonated by rain, commonly known as "soil-slip", which are characterized by a superficial failure surface (typically between 1 and 1.5 m deep) parallel to the slope face and by being triggered by intense and/or sustained periods of rain. This type of landslide is caused by changes in the pore pressure produced by a decrease in the suction when a humid front enters, as a consequence of the infiltration initiated by rain and ruled by the hydraulic characteristics of the soil. Failure occurs when this front reaches a critical depth and the shear strength of the soil is not enough to guarantee the stability of the mass. Critical rainfall thresholds in combination with a slope stability model are widely used for assessing landslide probability. In this paper we present a model for the estimation of the occurrence of landslides based on the derived distributions technique. Since the works of Eagleson in the 1970s the derived distributions technique has been widely used in hydrology to estimate the probability of occurrence of extreme flows. The model estimates the probability density function (pdf) of the Factor of Safety (FOS) from the statistical behavior of the rainfall process and some slope parameters. The stochastic character of the rainfall is transformed by means of a deterministic failure model into the FOS pdf. Exceedance probability and return period estimation are then straightforward. The rainfall process is modeled as a Rectangular Pulses Poisson Process (RPPP) with independent exponential pdfs for the mean intensity and duration of the storms. The Philip infiltration model

  20. Trait Mindfulness, Reasons For Living and General Symptom Severity as Predictors of Suicide Probability in Males with Substance Abuse or Dependence

    PubMed Central

    Mohammadkhani, Parvaneh; Azadmehr, Hedieh; Mobramm, Ardeshir; Naseri, Esmaeil

    2015-01-01

    Objective: The aim of this study was to evaluate suicide probability in Iranian males with substance abuse or dependence disorder and to investigate the predictors of suicide probability based on trait mindfulness, reasons for living and severity of general psychiatric symptoms. Method: Participants were 324 individuals with substance abuse or dependence in an outpatient setting and prison. The Reasons for Living questionnaire, Mindfulness Attention Awareness Scale and Suicide Probability Scale were used as instruments. The sample was selected based on a convenience sampling method. Data were analyzed using SPSS and AMOS. Results: The life-time prevalence of suicide attempt in the outpatient setting was 35% and it was 42% in the prison setting. Suicide probability in the prison setting was significantly higher than in the outpatient setting (p<0.001). The severity of general symptoms strongly correlated with suicide probability. Trait mindfulness, not reasons for living beliefs, had a mediating effect in the relationship between the severity of general symptoms and suicide probability. Fear of social disapproval, survival and coping beliefs and child-related concerns significantly predicted suicide probability (p<0.001). Discussion: It could be suggested that trait mindfulness was more effective in preventing suicide probability than beliefs about reasons for living in individuals with substance abuse or dependence disorders. The severity of general symptoms should be regarded as an important risk factor for suicide probability. PMID:26005482

  1. Determination of riverbank erosion probability using Locally Weighted Logistic Regression

    NASA Astrophysics Data System (ADS)

    Ioannidou, Elena; Flori, Aikaterini; Varouchakis, Emmanouil A.; Giannakis, Georgios; Vozinaki, Anthi Eirini K.; Karatzas, George P.; Nikolaidis, Nikolaos

    2015-04-01

    Riverbank erosion is a natural geomorphologic process that affects the fluvial environment. The most important issue concerning riverbank erosion is the identification of the vulnerable locations. An alternative to the usual hydrodynamic models to predict vulnerable locations is to quantify the probability of erosion occurrence. This can be achieved by identifying the underlying relations between riverbank erosion and the geomorphological or hydrological variables that prevent or stimulate erosion. Thus, riverbank erosion can be determined by a regression model using independent variables that are considered to affect the erosion process. The impact of such variables may vary spatially, therefore, a non-stationary regression model is preferred instead of a stationary equivalent. Locally Weighted Regression (LWR) is proposed as a suitable choice. This method can be extended to predict the binary presence or absence of erosion based on a series of independent local variables by using the logistic regression model. It is referred to as Locally Weighted Logistic Regression (LWLR). Logistic regression is a type of regression analysis used for predicting the outcome of a categorical dependent variable (e.g. binary response) based on one or more predictor variables. The method can be combined with LWR to assign weights to local independent variables of the dependent one. LWR allows model parameters to vary over space in order to reflect spatial heterogeneity. The probabilities of the possible outcomes are modelled as a function of the independent variables using a logistic function. Logistic regression measures the relationship between a categorical dependent variable and, usually, one or several continuous independent variables by converting the dependent variable to probability scores. Then, a logistic regression is formed, which predicts success or failure of a given binary variable (e.g. erosion presence or absence) for any value of the independent variables. The
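
    A minimal sketch of locally weighted logistic regression is given below: for each prediction location, observations are weighted by a Gaussian kernel of their distance to that location and a standard logistic regression is fitted with those weights. It is only a generic illustration of the LWLR idea with placeholder features, bandwidth and synthetic data, not the calibration actually used for the riverbank-erosion model.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def lwlr_predict(X, y, coords, query_coord, query_x, bandwidth=500.0):
        """Predict erosion probability at one location with locally weighted
        logistic regression.

        X, y        : predictor matrix and binary erosion labels at observed sites
        coords      : site coordinates (n, 2), used only for the spatial weights
        query_coord : coordinate of the prediction site
        query_x     : predictor values at the prediction site
        bandwidth   : Gaussian kernel bandwidth (same units as coords; assumed)
        """
        d = np.linalg.norm(coords - np.asarray(query_coord), axis=1)
        w = np.exp(-0.5 * (d / bandwidth) ** 2)            # spatial kernel weights
        model = LogisticRegression().fit(X, y, sample_weight=w)
        return model.predict_proba(np.atleast_2d(query_x))[0, 1]

    # Example with synthetic data: two generic predictors (e.g. slope, bank height).
    rng = np.random.default_rng(3)
    coords = rng.uniform(0, 5000, size=(300, 2))
    X = rng.normal(size=(300, 2))
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=300) > 0).astype(int)
    p_erosion = lwlr_predict(X, y, coords, query_coord=[2500, 2500], query_x=[0.8, -0.2])
    ```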

  2. Fusion probability in heavy nuclei

    NASA Astrophysics Data System (ADS)

    Banerjee, Tathagata; Nath, S.; Pal, Santanu

    2015-03-01

    Background: Fusion between two massive nuclei is a very complex process and is characterized by three stages: (a) capture inside the potential barrier, (b) formation of an equilibrated compound nucleus (CN), and (c) statistical decay of the CN leading to a cold evaporation residue (ER) or fission. The second stage is the least understood of the three and is the most crucial in predicting yield of superheavy elements (SHE) formed in complete fusion reactions. Purpose: A systematic study of average fusion probability, ⟨PCN⟩, is undertaken to obtain a better understanding of its dependence on various reaction parameters. The study may also help to clearly demarcate onset of non-CN fission (NCNF), which causes fusion probability, PCN, to deviate from unity. Method: ER excitation functions for 52 reactions leading to CN in the mass region 170-220, which are available in the literature, have been compared with statistical model (SM) calculations. Capture cross sections have been obtained from a coupled-channels code. In the SM, shell corrections in both the level density and the fission barrier have been included. ⟨PCN⟩ for these reactions has been extracted by comparing experimental and theoretical ER excitation functions in the energy range ~5%-35% above the potential barrier, where known effects of nuclear structure are insignificant. Results: ⟨PCN⟩ has been shown to vary with entrance channel mass asymmetry, η (or charge product, ZpZt), as well as with fissility of the CN, χCN. No parameter has been found to be adequate as a single scaling variable to determine ⟨PCN⟩. Approximate boundaries have been obtained from where ⟨PCN⟩ starts deviating from unity. Conclusions: This study quite clearly reveals the limits of applicability of the SM in interpreting experimental observables from fusion reactions involving two massive nuclei. Deviation of ⟨PCN⟩ from unity marks the beginning of the domain of dynamical models of fusion. Availability of precise ER cross

  3. Exploring the Overestimation of Conjunctive Probabilities

    PubMed Central

    Nilsson, Håkan; Rieskamp, Jörg; Jenny, Mirjam A.

    2013-01-01

    People often overestimate probabilities of conjunctive events. The authors explored whether the accuracy of conjunctive probability estimates can be improved by increased experience with relevant constituent events and by using memory aids. The first experiment showed that increased experience with constituent events increased the correlation between the estimated and the objective conjunctive probabilities, but that it did not reduce overestimation of conjunctive probabilities. The second experiment showed that reducing cognitive load with memory aids for the constituent probabilities led to improved estimates of the conjunctive probabilities and to decreased overestimation of conjunctive probabilities. To explain the cognitive process underlying people’s probability estimates, the configural weighted average model was tested against the normative multiplicative model. The configural weighted average model generates conjunctive probabilities that systematically overestimate objective probabilities although the generated probabilities still correlate strongly with the objective probabilities. For the majority of participants this model was better than the multiplicative model in predicting the probability estimates. However, when memory aids were provided, the predictive accuracy of the multiplicative model increased. In sum, memory tools can improve people’s conjunctive probability estimates. PMID:23460026
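
    The contrast tested in the paper, a normative multiplicative rule against an averaging-type rule, can be shown in a few lines. The weight in the averaging rule below is an arbitrary illustrative value, not a fitted configural weight from the study; the point is only that a weighted average of two constituent probabilities exceeds their product whenever both are below one, reproducing the overestimation pattern.

    ```python
    def multiplicative(p_a, p_b):
        """Normative conjunction probability for independent constituents."""
        return p_a * p_b

    def weighted_average(p_a, p_b, w_low=0.6):
        """Averaging-type conjunction estimate: a weighted mean of the two
        constituents, with the lower one given weight w_low (illustrative value)."""
        lo, hi = min(p_a, p_b), max(p_a, p_b)
        return w_low * lo + (1 - w_low) * hi

    for pa, pb in [(0.9, 0.4), (0.6, 0.5), (0.3, 0.2)]:
        print(pa, pb, multiplicative(pa, pb), round(weighted_average(pa, pb), 3))
    ```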

  4. A corpus of consonant-vowel-consonant (CVC) real words and nonwords: Comparison of phonotactic probability, neighborhood density, and consonant age-of-acquisition

    PubMed Central

    Storkel, Holly L.

    2013-01-01

    A corpus of 5,765 consonant-vowel-consonant (CVC) sequences was compiled, and phonotactic probability and neighborhood density based on both child and adult corpora were computed. This corpus of CVCs, provided as supplementary materials, was analyzed to address the following questions: (1) Do computations based on a child corpus differ from those based on an adult corpus? (2) Do phonotactic probability and/or neighborhood density of real words differ from that of nonwords? (3) Do phonotactic probability and/or neighborhood density differ across CVCs varying in consonant age-of-acquisition? Results showed significant differences in phonotactic probability and neighborhood density for child versus adult corpora, replicating prior findings. The impact of this difference on future studies will depend on the level of precision needed in specifying probability and density. In addition, significant and large differences in phonotactic probability and neighborhood density were detected between real words and nonwords, which may present methodological challenges for future research. Lastly, CVCs composed of earlier acquired sounds differed significantly in probability and density from CVCs composed of later acquired sounds, although this effect was relatively small and less likely to present significant methodological challenges to future studies. PMID:23307574

  5. Trajectory versus probability density entropy.

    PubMed

    Bologna, M; Grigolini, P; Karagiorgis, M; Rosa, A

    2001-07-01

    We show that the widely accepted conviction that a connection can be established between the probability density entropy and the Kolmogorov-Sinai (KS) entropy is questionable. We adopt the definition of density entropy as a functional of a distribution density whose time evolution is determined by a transport equation, conceived as the only prescription to use for the calculation. Although the transport equation is built up for the purpose of affording a picture equivalent to that stemming from trajectory dynamics, no direct use of trajectory time evolution is allowed, once the transport equation is defined. With this definition in mind we prove that the detection of a time regime of increase of the density entropy with a rate identical to the KS entropy is possible only in a limited number of cases. The proposals made by some authors to establish a connection between the two entropies in general, violate our definition of density entropy and imply the concept of trajectory, which is foreign to that of density entropy. PMID:11461383

  6. Trajectory versus probability density entropy

    NASA Astrophysics Data System (ADS)

    Bologna, Mauro; Grigolini, Paolo; Karagiorgis, Markos; Rosa, Angelo

    2001-07-01

    We show that the widely accepted conviction that a connection can be established between the probability density entropy and the Kolmogorov-Sinai (KS) entropy is questionable. We adopt the definition of density entropy as a functional of a distribution density whose time evolution is determined by a transport equation, conceived as the only prescription to use for the calculation. Although the transport equation is built up for the purpose of affording a picture equivalent to that stemming from trajectory dynamics, no direct use of trajectory time evolution is allowed, once the transport equation is defined. With this definition in mind we prove that the detection of a time regime of increase of the density entropy with a rate identical to the KS entropy is possible only in a limited number of cases. The proposals made by some authors to establish a connection between the two entropies in general, violate our definition of density entropy and imply the concept of trajectory, which is foreign to that of density entropy.

  7. Probability distributions of turbulent energy.

    PubMed

    Momeni, Mahdi; Müller, Wolf-Christian

    2008-05-01

    Probability density functions (PDFs) of scale-dependent energy fluctuations, P[δE(l)], are studied in high-resolution direct numerical simulations of Navier-Stokes and incompressible magnetohydrodynamic (MHD) turbulence. MHD flows with and without a strong mean magnetic field are considered. For all three systems it is found that the PDFs of inertial range energy fluctuations exhibit self-similarity and monoscaling in agreement with recent solar-wind measurements [Hnat, Geophys. Res. Lett. 29, 86 (2002)]. Furthermore, the energy PDFs exhibit similarity over all scales of the turbulent system, showing no substantial qualitative change of shape as the scale of the fluctuations varies. This is in contrast to the well-known behavior of PDFs of turbulent velocity fluctuations. In all three cases under consideration the P[δE(l)] resemble Lévy-type gamma distributions of the approximate form Δ⁻¹ exp(−|δE|/Δ) |δE|^(−γ). The observed gamma distributions exhibit a scale-dependent width Δ(l) and a system-dependent γ. The monoscaling property reflects the inertial-range scaling of the Elsässer-field fluctuations due to the lacking Galilei invariance of δE. The appearance of Lévy distributions is made plausible by a simple model of energy transfer. PMID:18643170

  8. A Preoperative Assessment of Significant Coronary Stenosis Based on a Semiquantitative Analysis of Coronary Artery Calcification on Noncontrast Computed Tomography in Aortic Stenosis Patients Undergoing Aortic Valve Replacement

    PubMed Central

    Hwang, Ji-Won; Kim, Sung Mok; Park, Sung-Ji; Cho, Eun Jeong; Lee, Sans-Chol; Choe, Yeon Hyeon; Park, Seung Woo

    2016-01-01

    Abstract Invasive coronary angiography (ICA) is the recommended assessment for coronary artery disease in patients undergoing elective aortic valve replacement (AVR). Noncontrast computed tomography (CT) is useful for evaluating lung lesions and calcifications at the cannulation site of the ascending aorta. The purpose of this study was to evaluate the role of noncontrast CT in the visual assessment of coronary artery calcification (CAC) in patients undergoing AVR. We retrospectively identified patients with significant aortic stenosis (AS) who were referred for AVR between January 2006 and December 2013. Among these, we included 386 patients (53.6% males, 69.2 ± 8.4 years) who underwent both noncontrast CT and ICA. Significant coronary artery stenosis (CAS) in the ICA was defined as luminal stenosis ≥70%. The 4 main coronary arteries were visually assessed on noncontrast CT and were scored based on the Weston score as follows: 0, no visually detected calcium; 1, a single high-density pixel detected; 3, calcium was dense enough to create a blooming artifact; and 2, calcium in between 1 and 3. Four groups were reclassified by the sum of the Weston scores from each vessel, as follows: noncalcification (0); mild calcification (1–4); moderate calcification (5–8); and severe calcification (9–12). Receiver-operating characteristic (ROC) analysis was generated to identify the cutoff Weston score values for predicting significant CAS. Diagnostic estimates were calculated based on these cutoffs. In the ICA analysis, 62 of the 386 patients (16.1%) had significant CAS. All patients were divided into 4 groups. The noncalcification group had 97 subjects (Weston score 0), the mild degree group had 100 (2.6 ± 1.0), the moderate calcification group had 114 (6.6 ± 1.1), and the severe calcification group had 75 (10.7 ± 1.1). The prevalence of significant CAS in the noncalcification, mild, moderate, and severe groups was 1% (1/97), 5% (5/100), 24% (27

  9. A Preoperative Assessment of Significant Coronary Stenosis Based on a Semiquantitative Analysis of Coronary Artery Calcification on Noncontrast Computed Tomography in Aortic Stenosis Patients Undergoing Aortic Valve Replacement.

    PubMed

    Hwang, Ji-Won; Kim, Sung Mok; Park, Sung-Ji; Cho, Eun Jeong; Lee, Sang-Chol; Choe, Yeon Hyeon; Park, Seung Woo

    2016-03-01

    Invasive coronary angiography (ICA) is the recommended assessment for coronary artery disease in patients undergoing elective aortic valve replacement (AVR). Noncontrast computed tomography (CT) is useful for evaluating lung lesions and calcifications at the cannulation site of the ascending aorta. The purpose of this study was to evaluate the role of noncontrast CT in the visual assessment of coronary artery calcification (CAC) in patients undergoing AVR. We retrospectively identified patients with significant aortic stenosis (AS) who were referred for AVR between January 2006 and December 2013. Among these, we included 386 patients (53.6% males, 69.2 ± 8.4 years) who underwent both noncontrast CT and ICA. Significant coronary artery stenosis (CAS) in the ICA was defined as luminal stenosis ≥70%. The 4 main coronary arteries were visually assessed on noncontrast CT and were scored based on the Weston score as follows: 0, no visually detected calcium; 1, a single high-density pixel detected; 3, calcium was dense enough to create a blooming artifact; and 2, calcium in between 1 and 3. Four groups were reclassified by the sum of the Weston scores from each vessel, as follows: noncalcification (0); mild calcification (1-4); moderate calcification (5-8); and severe calcification (9-12). Receiver-operating characteristic (ROC) analysis was generated to identify the cutoff Weston score values for predicting significant CAS. Diagnostic estimates were calculated based on these cutoffs. In the ICA analysis, 62 of the 386 patients (16.1%) had significant CAS. All patients were divided into 4 groups. The noncalcification group had 97 subjects (Weston score 0), the mild degree group had 100 (2.6 ± 1.0), the moderate calcification group had 114 (6.6 ± 1.1), and the severe calcification group had 75 (10.7 ± 1.1). The prevalence of significant CAS in the noncalcification, mild, moderate, and severe groups was 1% (1/97), 5% (5/100), 24% (27/114), and 39% (29
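
    For concreteness, the scoring and grouping rule described above can be written in a few lines. The sketch below is a minimal illustration of the stated bookkeeping (per-vessel Weston scores of 0-3 summed over the 4 main coronary arteries, then binned into the four calcification groups), not the authors' code; the function name is hypothetical.

```python
def weston_group(vessel_scores):
    """Map four per-vessel Weston scores (each 0-3) to the calcification group
    defined in the abstract: summed score of 0, 1-4, 5-8, or 9-12."""
    if len(vessel_scores) != 4 or any(s not in (0, 1, 2, 3) for s in vessel_scores):
        raise ValueError("expected four per-vessel scores, each in the range 0-3")
    total = sum(vessel_scores)
    if total == 0:
        return "noncalcification"
    if total <= 4:
        return "mild"
    if total <= 8:
        return "moderate"
    return "severe"

print(weston_group([0, 0, 0, 0]))   # noncalcification
print(weston_group([1, 0, 2, 1]))   # mild (sum 4)
print(weston_group([3, 3, 2, 3]))   # severe (sum 11)
```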

  10. Quantification of effective exoelectrogens by most probable number (MPN) in a microbial fuel cell.

    PubMed

    Heidrich, Elizabeth S; Curtis, Thomas P; Woodcock, Stephen; Dolfing, Jan

    2016-10-01

    The objective of this work was to quantify the number of exoelectrogens in wastewater capable of producing current in a microbial fuel cell by adapting the classical most probable number (MPN) methodology, using current production as the end point. Inoculating a series of microbial fuel cells with various dilutions of domestic wastewater and with acetate as the test substrate yielded an apparent number of exoelectrogens of 17 per ml. Using current as a proxy for activity, the apparent exoelectrogen growth rate was 0.03 h(-1). With starch or wastewater as more complex test substrates, similar apparent growth rates were obtained, but the apparent MPN-based numbers of exoelectrogens in wastewater were significantly lower, probably because, in contrast to acetate, complex substrates require complex food chains to deliver the electrons to the electrodes. Consequently, the apparent MPN is a function of the combined probabilities of the members of the food chain being present. PMID:27347794
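
    The MPN calculation behind a figure like "17 per ml" can be illustrated with a small maximum-likelihood sketch. This is a generic textbook MPN estimator adapted to a presence/absence readout (current produced or not), not the authors' code; the dilution volumes, replicate counts, and positive counts in the example are hypothetical.

```python
import numpy as np

def mpn_per_ml(volumes_ml, n_reactors, n_positive, grid=np.logspace(-2, 4, 20000)):
    """Maximum-likelihood MPN (organisms per ml) from a dilution series.

    Each reactor is 'positive' if at least one exoelectrogen was present, which
    under Poisson sampling happens with probability 1 - exp(-lambda * volume).
    """
    volumes_ml = np.asarray(volumes_ml, dtype=float)
    n_reactors = np.asarray(n_reactors, dtype=int)
    n_positive = np.asarray(n_positive, dtype=int)
    lam = grid[:, None]                                   # candidate densities per ml
    p_pos = 1.0 - np.exp(-lam * volumes_ml)               # P(reactor scores positive)
    loglik = (n_positive * np.log(p_pos + 1e-300)
              + (n_reactors - n_positive) * (-lam * volumes_ml)).sum(axis=1)
    return grid[np.argmax(loglik)]

# Hypothetical series: 1, 0.1 and 0.01 ml of wastewater per reactor, 3 reactors
# per dilution, each scored by whether current was produced.
print(round(mpn_per_ml([1.0, 0.1, 0.01], [3, 3, 3], [3, 2, 0]), 1))
```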

  11. Bayesian classification of polarimetric SAR images using adaptive a priori probabilities

    NASA Technical Reports Server (NTRS)

    Van Zyl, J. J.; Burnette, C. F.

    1992-01-01

    The problem of classifying Earth terrain by observed polarimetric scattering properties is tackled with an iterative Bayesian scheme that uses a priori probabilities adaptively. The first classification is based on fixed, and not necessarily equal, a priori probabilities, and successive iterations change the a priori probabilities adaptively. The approach is applied to an SAR image in which a single water body covers 10 percent of the image area. The classification accuracies for ocean, urban, vegetated, and total area increase, and the percentage of reclassified pixels decreases greatly as the iteration number increases. The iterative scheme is found to improve the a posteriori classification accuracy of maximum likelihood classifiers by iteratively exploiting the local homogeneity in polarimetric SAR images. A few iterations can improve the classification accuracy significantly without sacrificing key high-frequency detail or edges in the image.
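
    The iteration described above can be sketched generically. The code below is an assumption-laden illustration of such a scheme (equal first-pass priors, maximum a posteriori labels, priors re-estimated from class frequencies in a local window), not the authors' implementation; the array shapes and window size are illustrative choices.

```python
import numpy as np

def iterate_adaptive_priors(likelihoods, n_iter=5, window=3):
    """Iterative Bayesian classification with locally adaptive priors.

    likelihoods: array (H, W, K) giving p(observation | class k) for each pixel.
    Returns the (H, W) array of class labels after n_iter iterations.
    """
    h, w, k = likelihoods.shape
    priors = np.full((h, w, k), 1.0 / k)          # fixed, equal first-pass priors
    pad = window // 2
    for _ in range(n_iter):
        posterior = priors * likelihoods
        posterior /= posterior.sum(axis=2, keepdims=True)
        labels = posterior.argmax(axis=2)         # maximum a posteriori labels
        one_hot = np.eye(k)[labels]               # (H, W, K) indicator of the labels
        padded = np.pad(one_hot, ((pad, pad), (pad, pad), (0, 0)), mode="edge")
        counts = np.zeros_like(one_hot)
        for dy in range(window):                  # count labels in each local window
            for dx in range(window):
                counts += padded[dy:dy + h, dx:dx + w]
        priors = counts / counts.sum(axis=2, keepdims=True)   # adaptive priors
    return labels

# Tiny synthetic example: two classes with random likelihoods.
rng = np.random.default_rng(0)
print(iterate_adaptive_priors(rng.random((8, 8, 2))))
```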

  12. Solar Flare Probability depending on Sunspot Characteristics and Their Changes

    NASA Astrophysics Data System (ADS)

    Lee, J.; Hong, S.; Kim, J.; Kim, Y.; Lee, J.; Moon, Y.; Lee, D.

    2012-12-01

    Solar flare prediction has been at the core of space weather research, and a number of different approaches have been developed since the THEO system (McIntosh, 1990) was introduced. However, many space weather operation centers, i.e., the International Space Environment Service's Regional Warning Centers, still rely on traditional flare prediction methods like THEO. THEO uses the McIntosh classification as its knowledge base for flare prediction, and a human forecaster also incorporates rules of thumb based on spot growth, magnetic topology inferred from sunspot structure, and previous flare activity. The method is somewhat subjective, because the forecast decision depends on the expertise of the operator, and it has not been evaluated statistically. In this study, we have investigated solar flare probability as a function of several sunspot characteristics (McIntosh classification, Mt. Wilson magnetic classification, sunspot area, and previous flare activity) and their changes over the preceding three days. For this, we used the NOAA sunspot and flare catalogs from August 1996 to February 2011. A new index, WFP (Weighted Flare Probability), which combines solar flare strength with its historical probability, is introduced to quantify the effective contribution of flare activity. We found several interesting results, as follows. First, the WFP index increases not only when the sunspot magnetic complexity increases but also, in almost the same proportion, when the complexity decreases. Second, the index also increases when the sunspot area either grows or shrinks. Since sunspot area is a good proxy for magnetic flux, this may be evidence that a change in magnetic flux (flux emergence or cancellation) can trigger a flare. Third, active regions with a significant history of flare activity are much more active than those without. We are applying a multi-dimensional regression method to these data and automating the THEO process. We have a
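
    The abstract does not give the exact definition of WFP, so the sketch below is only a hypothetical illustration of how a "weighted flare probability" could combine flare strength with historical class probabilities. The GOES-style class weights (C=1, M=10, X=100) and the per-class probabilities are assumptions for illustration, not values from the study.

```python
# Hypothetical strength weights, using the relative GOES peak-flux scale.
CLASS_WEIGHT = {"C": 1.0, "M": 10.0, "X": 100.0}

def weighted_flare_probability(class_probs):
    """Sum of historical flare-class probabilities, weighted by flare strength."""
    return sum(CLASS_WEIGHT[c] * p for c, p in class_probs.items())

# Hypothetical historical probabilities for one sunspot classification state:
print(weighted_flare_probability({"C": 0.30, "M": 0.08, "X": 0.01}))  # 2.1
```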

  13. External validation of the HIT Expert Probability (HEP) score.

    PubMed

    Joseph, Lee; Gomes, Marcelo P V; Al Solaiman, Firas; St John, Julie; Ozaki, Asuka; Raju, Manjunath; Dhariwal, Manoj; Kim, Esther S H

    2015-03-01

    The diagnosis of heparin-induced thrombocytopenia (HIT) can be challenging. The HIT Expert Probability (HEP) score has recently been proposed to aid in the diagnosis of HIT. We sought to externally and prospectively validate the HEP score. We prospectively assessed the pre-test probability of HIT for 51 consecutive patients referred to our Consultative Service for evaluation of possible HIT between August 1, 2012 and February 1, 2013. Two Vascular Medicine fellows independently applied the 4T and HEP scores for each patient. Two independent HIT expert adjudicators rendered a diagnosis of HIT likely or unlikely. The median (interquartile range) 4T and HEP scores were 4.5 (3.0, 6.0) and 5 (3.0, 8.5), respectively. There was no significant difference between the areas under the receiver-operating characteristic curves of the 4T and HEP scores against the gold standard of confirmed HIT [defined as a positive serotonin release assay and a positive anti-PF4/heparin ELISA] (0.74 vs 0.73, p = 0.97). A HEP score ≥ 2 was 100% sensitive and 16% specific for determining the presence of confirmed HIT, while a 4T score > 3 was 93% sensitive and 35% specific. In conclusion, the HEP and 4T scores are excellent screening pre-test probability models for HIT; however, in this prospective validation study, their test characteristics for the diagnosis of HIT based on confirmatory laboratory testing and expert opinion were similar. Given the complexity of the HEP scoring model compared with that of the 4T score, further validation of the HEP score is warranted prior to widespread clinical acceptance. PMID:25588983
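
    The quoted test characteristics follow from a standard contingency-table calculation. The sketch below (not the study's analysis code) computes the sensitivity and specificity of a score cutoff against confirmed HIT as the gold standard; the example scores and adjudications are hypothetical.

```python
def sens_spec(scores, confirmed_hit, cutoff):
    """Sensitivity and specificity of the rule 'score >= cutoff' for confirmed HIT."""
    tp = sum(s >= cutoff and hit for s, hit in zip(scores, confirmed_hit))
    fn = sum(s < cutoff and hit for s, hit in zip(scores, confirmed_hit))
    tn = sum(s < cutoff and not hit for s, hit in zip(scores, confirmed_hit))
    fp = sum(s >= cutoff and not hit for s, hit in zip(scores, confirmed_hit))
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical HEP scores and adjudicated outcomes, with the cutoff of 2 from the text:
scores = [1, 3, 5, 8, 2, 0, 9, 4]
confirmed = [False, False, True, True, False, False, True, False]
print(sens_spec(scores, confirmed, cutoff=2))   # (1.0, 0.4) for this toy data
```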

  14. Prognostic significance of PRAME expression based on immunohistochemistry for diffuse large B-cell lymphoma patients treated with R-CHOP therapy.

    PubMed

    Mitsuhashi, Kenjiro; Masuda, Akihiro; Wang, Yan-Hua; Shiseki, Masayuki; Motoji, Toshiko

    2014-07-01

    The preferentially expressed antigen of melanoma (PRAME), a tumor-associated antigen, is considered a prognostic marker for various human malignancies. The prognostic significance of PRAME expression for diffuse large B-cell lymphoma (DLBCL) patients treated with rituximab-containing chemotherapy has not been evaluated to date, and the ability of immunohistochemistry (IHC) to detect PRAME expression in these patients has not yet been studied, although IHC is simple to perform in clinical practice. We evaluated the prognostic significance of PRAME expression based on IHC analysis in 160 DLBCL patients treated with R-CHOP therapy. There was a significant association between higher PRAME expression and shorter progression-free survival (PFS), and a trend toward shorter overall survival (OS) in patients with higher PRAME expression compared with those with lower expression (5-year PFS, 48.1 vs. 61.1%; 5-year OS, 65.6 vs. 79.1%). Patients with high PRAME expression also tended to have lower chemotherapeutic responses. Thus, IHC is useful for detecting and assessing PRAME expression in DLBCL. Further, we found a positive correlation between IHC and quantitative real-time RT-PCR measurements of PRAME expression. Our findings indicate that IHC assessment of PRAME expression can be a novel prognostic marker in DLBCL patients treated with R-CHOP therapy. PMID:24820636

  15. Imprecise probability assessment of tipping points in the climate system.

    PubMed

    Kriegler, Elmar; Hall, Jim W; Held, Hermann; Dawson, Richard; Schellnhuber, Hans Joachim

    2009-03-31

    Major restructuring of the Atlantic meridional overturning circulation, the Greenland and West Antarctic ice sheets, the Amazon rainforest, and ENSO is a source of concern for climate policy. We have elicited subjective probability intervals for the occurrence of such major changes under global warming from 43 scientists. Although the expert estimates highlight large uncertainty, they allocate significant probability to some of the events listed above. We deduce conservative lower bounds for the probability of triggering at least 1 of those events of 0.16 for medium (2-4 degrees C) and 0.56 for high (above 4 degrees C) global mean temperature change relative to year 2000 levels. PMID:19289827
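
    As a purely numerical illustration of why the probability of "at least one" tipping event can be substantial even when each individual probability is modest, the sketch below assumes independent events with point probabilities. This is not the paper's imprecise-probability aggregation, and the example values are hypothetical.

```python
import math

def prob_at_least_one(event_probs):
    """P(at least one event) = 1 - prod(1 - p_i), assuming independent events."""
    return 1.0 - math.prod(1.0 - p for p in event_probs)

# Hypothetical individual probabilities for five tipping elements:
print(round(prob_at_least_one([0.05, 0.05, 0.03, 0.02, 0.02]), 3))
```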

  16. THE BLACK HOLE FORMATION PROBABILITY

    SciTech Connect

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.

  17. The Black Hole Formation Probability

    NASA Astrophysics Data System (ADS)

    Clausen, Drew; Piro, Anthony L.; Ott, Christian D.

    2015-02-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that instead a probabilistic description of NS versus BH formation may be better suited to account for the current uncertainties in understanding how massive stars die. We present an initial exploration of the probability that a star will make a BH as a function of its ZAMS mass, P_BH(M_ZAMS). Although we find that it is difficult to derive a unique P_BH(M_ZAMS) using current measurements of both the BH mass distribution and the degree of chemical enrichment by massive stars, we demonstrate how P_BH(M_ZAMS) changes with these various observational and theoretical uncertainties. We anticipate that future studies of Galactic BHs and theoretical studies of core collapse will refine P_BH(M_ZAMS) and argue that this framework is an important new step toward better understanding BH formation. A probabilistic description of BH formation will be useful as input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.
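
    A probabilistic prescription of this kind is straightforward to plug into population synthesis. The sketch below uses a purely hypothetical logistic parameterization of P_BH(M_ZAMS) (the paper does not commit to a specific functional form) to draw remnant types for a toy sample of massive stars; the mass scale and steepness parameters are assumptions.

```python
import numpy as np

def p_bh(m_zams, m_half=25.0, steepness=0.3):
    """Hypothetical logistic probability that a star of given ZAMS mass forms a BH."""
    return 1.0 / (1.0 + np.exp(-steepness * (m_zams - m_half)))

rng = np.random.default_rng(42)
masses = rng.uniform(8.0, 60.0, size=10)             # toy massive-star ZAMS masses (Msun)
remnants = np.where(rng.random(masses.size) < p_bh(masses), "BH", "NS")
for m, r in zip(masses, remnants):
    print(f"{m:5.1f} Msun -> {r}")
```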

  18. The case for lower probabilities as measures of uncertainty

    SciTech Connect

    Tonn, B.; Wagner, C. (Dept. of Mathematics)

    1991-01-01

    This paper presents the case for using lower probabilities as measures of uncertainty in expert systems. A debate has raged within the artificial intelligence community for years about how to represent uncertainty in expert systems. Several camps have emerged. One camp has focused on developing alternatives to probability theory, such as certainty factors, fuzzy sets, and endorsements. A second camp has focused on retrofitting classical, additive probability, for example, by developing a cautious approach to probabilistic reasoning and interpreting probability within a possible-worlds framework. This paper falls into a third camp, which encompasses generalizations of probability theory. The most discussed generalization is Dempster-Shafer Theory (DST), which is based on the combined work of Dempster and Shafer. Lower probabilities are a substantial generalization of DST. This paper has two parts. The first presents the definitions of lower probabilities, DST, and additive probability, and includes a discussion of capacities, the most general type of uncertainty measure; its purpose is to show the differences among these uncertainty measures.
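
    A small worked example helps fix the definitions. Dempster-Shafer belief and plausibility functions are one familiar special case of lower and upper probabilities: for a mass function over subsets of a frame, Bel(A) <= P(A) <= Pl(A). The code below is an illustrative toy, not taken from the paper.

```python
def bel(mass, event):
    """Belief: total mass of focal sets contained in the event (a lower probability)."""
    return sum(m for focal, m in mass.items() if set(focal) <= set(event))

def pl(mass, event):
    """Plausibility: total mass of focal sets intersecting the event (an upper probability)."""
    return sum(m for focal, m in mass.items() if set(focal) & set(event))

# Mass function over the frame {a, b, c} with three focal sets.
mass = {frozenset("a"): 0.4, frozenset("bc"): 0.3, frozenset("abc"): 0.3}
print(bel(mass, "ab"), pl(mass, "ab"))   # 0.4 and 1.0: bounds on P({a, b})
```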

  19. Multiple-event probability in general-relativistic quantum mechanics

    SciTech Connect

    Hellmann, Frank; Mondragon, Mauricio; Perez, Alejandro; Rovelli, Carlo

    2007-04-15

    We discuss the definition of quantum probability in the context of 'timeless' general-relativistic quantum mechanics. In particular, we study the probability of sequences of events, or multievent probability. In conventional quantum mechanics this can be obtained by means of the 'wave function collapse' algorithm. We first point out certain difficulties with some natural definitions of multievent probability, including the conditional probability widely considered in the literature. We then observe that multievent probability can be reduced to single-event probability by taking into account the quantum nature of the measuring apparatus. In fact, by exploiting the von Neumann freedom to move the quantum/classical boundary, one can always trade a sequence of noncommuting quantum measurements at different times for an ensemble of simultaneous commuting measurements on the joint system+apparatus. This observation permits a formulation of quantum theory based only on single-event probability, in which the results of the wave function collapse algorithm can nevertheless be recovered. The discussion also bears on the nature of the quantum collapse.
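
    For reference, the conventional 'wave function collapse' recipe for a two-event probability that the paper takes as its starting point can be written in a few lines: P(a then b) = ||P_b U P_a psi||^2 for projectors P_a, P_b and a unitary U between the measurements. The qubit example below illustrates that recipe only, not the paper's timeless reformulation.

```python
import numpy as np

def two_event_probability(psi, proj_a, unitary, proj_b):
    """Probability of outcome a followed by outcome b via projection and evolution."""
    return float(np.linalg.norm(proj_b @ unitary @ proj_a @ psi) ** 2)

# Qubit example: measure |0> first, evolve by a Hadamard, then measure |1>.
psi = np.array([1.0, 1.0]) / np.sqrt(2.0)
p0 = np.array([[1.0, 0.0], [0.0, 0.0]])              # projector onto |0>
p1 = np.array([[0.0, 0.0], [0.0, 1.0]])              # projector onto |1>
hadamard = np.array([[1.0, 1.0], [1.0, -1.0]]) / np.sqrt(2.0)
print(two_event_probability(psi, p0, hadamard, p1))  # 0.25
```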

  20. The Probability Distribution for a Biased Spinner

    ERIC Educational Resources Information Center

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)
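
    One simple version of the calculation the article points to: if the spinner's sides subtend unequal central angles, each landing probability is that side's angle divided by 360 degrees. The sketch below is an illustrative assumption about the spinner's geometry, not the article's own example.

```python
def spinner_probabilities(angles_deg):
    """Landing probabilities for spinner sides with the given central angles."""
    total = sum(angles_deg)
    if abs(total - 360.0) > 1e-9:
        raise ValueError("central angles must sum to 360 degrees")
    return [a / total for a in angles_deg]

print(spinner_probabilities([90.0, 120.0, 60.0, 90.0]))   # [0.25, 0.333..., 0.166..., 0.25]
```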