Science.gov

Sample records for probability based significance

  1. Significance of "high probability/low damage" versus "low probability/high damage" flood events

    NASA Astrophysics Data System (ADS)

    Merz, B.; Elmer, F.; Thieken, A. H.

    2009-06-01

    The need for an efficient use of limited resources fosters the application of risk-oriented design in flood mitigation. Flood defence measures reduce future damage. Traditionally, this benefit is quantified via the expected annual damage. We analyse the contribution of "high probability/low damage" floods versus the contribution of "low probability/high damage" events to the expected annual damage. For three case studies, i.e. actual flood situations in flood-prone communities in Germany, it is shown that the expected annual damage is dominated by "high probability/low damage" events. Extreme events play a minor role, even though they cause high damage. Using typical values for flood frequency behaviour, flood plain morphology, distribution of assets and vulnerability, it is shown that this also holds for the general case of river floods in Germany. This result is compared to the significance of extreme events in the public perception. "Low probability/high damage" events are more important in the societal view than is expressed by the expected annual damage. We conclude that the expected annual damage should be used with care since it is not in agreement with societal priorities. Further, risk aversion functions that penalise events with disastrous consequences are introduced in the appraisal of risk mitigation options. It is shown that risk aversion may have substantial implications for decision-making. Different flood mitigation decisions are likely when risk aversion is taken into account.
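    As a rough illustration of the quantity discussed above, the sketch below computes an expected annual damage as the area under a damage versus annual-exceedance-probability curve and the share contributed by frequent, low-damage events. The return periods and damage values are invented for illustration and are not taken from the case studies.

    ```python
    import numpy as np

    def trapezoid(y, x):
        """Trapezoidal-rule integral of y over x (x ascending)."""
        return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

    # Hypothetical return periods (years) and flood damages (million EUR) for one community.
    return_periods = np.array([2.0, 5.0, 10.0, 25.0, 50.0, 100.0, 200.0, 500.0])
    damages        = np.array([0.5, 1.2, 2.5, 5.0, 8.0, 12.0, 18.0, 30.0])

    # Annual exceedance probability, sorted ascending for integration.
    p_exc = 1.0 / return_periods
    order = np.argsort(p_exc)
    p_sorted, d_sorted = p_exc[order], damages[order]

    # Expected annual damage = area under the damage vs. exceedance-probability curve.
    ead_total = trapezoid(d_sorted, p_sorted)

    # Contribution of "high probability/low damage" events (return period <= 25 years).
    frequent = (1.0 / p_sorted) <= 25.0
    ead_frequent = trapezoid(d_sorted[frequent], p_sorted[frequent])

    print(f"EAD = {ead_total:.2f} M EUR; share from frequent events = {ead_frequent / ead_total:.0%}")
    ```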

  2. Significance probability mapping: the final touch in t-statistic mapping.

    PubMed

    Hassainia, F; Petit, D; Montplaisir, J

    1994-01-01

    Significance Probability Mapping (SPM), based on Student's t-statistic, is widely used for comparing mean brain topography maps of two groups. The map resulting from this process represents the distribution of t-values over the entire scalp. However, t-values by themselves cannot reveal whether or not group differences are significant. Significance levels associated with a few t-values are therefore commonly indicated on map legends to give the reader an idea of the significance levels of t-values. Nevertheless, a precise significance level topography cannot be achieved with these few significance values. We introduce a new kind of map which directly displays significance level topography in order to relieve the reader from converting multiple t-values to their corresponding significance probabilities, and to obtain a good quantification and a better localization of regions with significant differences between groups. As an illustration of this type of map, we present a comparison of EEG activity in Alzheimer's patients and age-matched control subjects for both wakefulness and REM sleep.
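    The conversion the record describes, from a map of t-values to a map of significance probabilities, amounts to evaluating the two-tailed p-value of each t-value at the appropriate degrees of freedom. A minimal sketch follows; the t-values and group sizes are illustrative assumptions, not data from the paper.

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical t-values at a handful of electrodes comparing two groups.
    t_values = np.array([0.8, 2.1, 3.4, -2.6, 1.2])
    n1, n2 = 20, 20                 # assumed group sizes
    df = n1 + n2 - 2

    # Two-tailed significance probability at each electrode: the quantity an
    # SPM-style map displays directly instead of raw t-values.
    p_values = 2.0 * stats.t.sf(np.abs(t_values), df)
    for t, p in zip(t_values, p_values):
        print(f"t = {t:+.2f}  ->  p = {p:.4f}")
    ```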

  3. Modulation Based on Probability Density Functions

    NASA Technical Reports Server (NTRS)

    Williams, Glenn L.

    2009-01-01

    A proposed method of modulating a sinusoidal carrier signal to convey digital information involves the use of histograms representing probability density functions (PDFs) that characterize samples of the signal waveform. The method is based partly on the observation that when a waveform is sampled (whether by analog or digital means) over a time interval at least as long as one half cycle of the waveform, the samples can be sorted by frequency of occurrence, thereby constructing a histogram representing a PDF of the waveform during that time interval.
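    The central observation, that binning waveform samples over at least a half cycle yields an empirical PDF of the waveform, can be demonstrated with a few lines of code. The sketch below samples one cycle of a sinusoid and builds the histogram; the sampling rate, carrier frequency, and bin count are arbitrary choices, and no claim is made about the paper's actual modulation scheme.

    ```python
    import numpy as np

    # Sample one full cycle of a unit-amplitude carrier (setup is illustrative).
    fs, f0 = 10_000.0, 100.0                 # sample rate and carrier frequency, Hz
    t = np.arange(0.0, 1.0 / f0, 1.0 / fs)   # one carrier period
    samples = np.sin(2 * np.pi * f0 * t)

    # Histogram of sample amplitudes: an empirical PDF of the waveform over the
    # interval. For a pure sinusoid this approaches the arcsine distribution,
    # peaked near the +/-1 extremes; a modulator could shape the waveform so the
    # histogram itself carries the digital symbol.
    hist, edges = np.histogram(samples, bins=10, range=(-1.0, 1.0), density=True)
    for lo, hi, h in zip(edges[:-1], edges[1:], hist):
        print(f"[{lo:+.1f}, {hi:+.1f}): {h:.2f}")
    ```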

  4. Significance of stress transfer in time-dependent earthquake probability calculations

    USGS Publications Warehouse

    Parsons, T.

    2005-01-01

    A sudden change in stress is seen to modify earthquake rates, but should it also revise earthquake probability? The data used to derive input parameters permit an array of forecasts; so how large a static stress change is required to cause a statistically significant change in earthquake probability? To answer that question, the effects of parameter and philosophical choices are examined through all phases of sample calculations. Drawing at random from distributions of recurrence-aperiodicity pairs identifies many that recreate long paleoseismic and historic earthquake catalogs. Probability density functions built from the recurrence-aperiodicity pairs give the range of possible earthquake forecasts under a point-process renewal model. Consequences of choices made in stress transfer calculations, such as different slip models, fault rake, dip, and friction, are tracked. For interactions among large faults, calculated peak stress changes may be localized, with most of the receiving fault area changed less than the mean. Thus, to avoid overstating probability change on segments, stress change values should be drawn from a distribution reflecting the spatial pattern rather than using the segment mean. Disparity resulting from interaction probability methodology is also examined. For a fault with a well-understood earthquake history, a minimum stress change to stressing rate ratio of 10:1 to 20:1 is required to significantly skew probabilities with >80-85% confidence. That ratio must be closer to 50:1 to exceed 90-95% confidence levels. Thus, revision to earthquake probability is achievable when a perturbing event is very close to the fault in question or the tectonic stressing rate is low.
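    As a rough illustration of the type of calculation involved, the sketch below evaluates the conditional probability of an earthquake in a forecast window under a lognormal renewal model, before and after a static stress step treated as a "clock advance" of stress change divided by tectonic stressing rate. This is a common simplification, not the paper's exact procedure, and all parameter values are invented.

    ```python
    from math import exp, log, sqrt
    from scipy.stats import lognorm

    def cond_prob(elapsed, horizon, mean_ri, aperiodicity):
        """P(event within `horizon` years | no event for `elapsed` years) under a
        lognormal renewal model with the given mean recurrence interval and
        aperiodicity (coefficient of variation)."""
        sigma = sqrt(log(1.0 + aperiodicity ** 2))
        mu = log(mean_ri) - 0.5 * sigma ** 2
        F = lambda t: lognorm.cdf(t, s=sigma, scale=exp(mu))
        return (F(elapsed + horizon) - F(elapsed)) / (1.0 - F(elapsed))

    mean_ri, alpha = 200.0, 0.5        # mean recurrence (yr) and aperiodicity (hypothetical)
    elapsed, horizon = 150.0, 30.0     # years since last event, forecast window

    p_before = cond_prob(elapsed, horizon, mean_ri, alpha)

    # Clock-advance approximation: shift elapsed time by stress change / stressing rate.
    stress_change, stressing_rate = 0.5, 0.05   # MPa and MPa/yr (hypothetical)
    p_after = cond_prob(elapsed + stress_change / stressing_rate, horizon, mean_ri, alpha)
    print(f"30-yr probability: {p_before:.3f} -> {p_after:.3f} after the stress step")
    ```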

  5. Clinical significance of interval changes in breast lesions initially categorized as probably benign on breast ultrasound

    PubMed Central

    Jang, Ja Yoon; Kim, Sun Mi; Kim, Jin Hwan; Jang, Mijung; La Yun, Bo; Lee, Jong Yoon; Lee, Soo Hyun; Kim, Bohyoung

    2017-01-01

    Abstract: The aims of this study were to determine the malignancy rate of probably benign lesions that show an interval change on follow-up ultrasound and to evaluate the differences seen on imaging between benign and malignant lesions initially categorized as probably benign but with interval change on follow-up breast ultrasound. We retrospectively reviewed 11,323 lesions from ultrasound-guided core-biopsies performed between June 2004 and December 2014 and identified 289 lesions (266 patients) with an interval change from probably benign (Breast Imaging Reporting and Data System [BI-RADS] category 3) in the previous 2 years. Malignancy rates were compared according to the ultrasound findings and the characteristics of the interval changes, including changes in morphology and/or diameter. The malignancy rate for probably benign lesions that showed an interval change on follow-up ultrasound was 6.9% (20/289). The malignancy rate was higher for clustered cysts (33.3%) and irregular or noncircumscribed masses (12.7%) than for circumscribed oval masses (5%) or complicated cysts (5%) seen on initial ultrasound (P = 0.043). Fifty-five percent of the malignancies were found to be ductal carcinoma in situ and there was 1 case of lymph node metastasis among the patients with invasive disease in whom biopsy was delayed by 6 to 15 months. The extent of invasiveness was greater in missed cases. There was a significant difference in the maximal diameter change between the 20 malignant lesions and the 269 benign lesions (4.0 mm vs 2.7 mm, P = 0.002). The cutoff value for maximal diameter change per initial diameter was 39.0% for predicting malignancy (sensitivity 95%, specificity 53.5%). The malignancy rate for morphologically changed lesions was significantly higher than for morphologically stable lesions (13.6% vs 4.9%; P = 0.024). The 6.9% of probably benign lesions that showed an interval change and were ultimately found to be malignant were mostly DCIS.

  6. Vehicle Detection Based on Probability Hypothesis Density Filter

    PubMed Central

    Zhang, Feihu; Knoll, Alois

    2016-01-01

    In the past decade, vehicle detection has improved significantly. By utilizing cameras, vehicles can be detected in Regions of Interest (ROI) in complex environments. However, vision techniques often suffer from false positives and a limited field of view. In this paper, a LiDAR-based vehicle detection approach is proposed using the Probability Hypothesis Density (PHD) filter. The proposed approach consists of two phases: the hypothesis generation phase to detect potential objects and the hypothesis verification phase to classify objects. The performance of the proposed approach is evaluated in complex scenarios and compared with the state-of-the-art. PMID:27070621

  7. GNSS integer ambiguity validation based on posterior probability

    NASA Astrophysics Data System (ADS)

    Wu, Zemin; Bian, Shaofeng

    2015-10-01

    GNSS integer ambiguity validation has been considered a challenging task for decades. Several kinds of validation tests have been developed and are widely used, but their theoretical basis remains a weakness. Ambiguity validation is theoretically an issue of hypothesis testing. In the framework of Bayesian hypothesis testing, the posterior probability is the canonical standard on which statistical decisions should be based. In this contribution, (i) we derive the posterior probability of the fixed ambiguity based on the Bayesian principle and modify it for practical ambiguity validation. (ii) The optimal property of the posterior probability test is proved based on an extended Neyman-Pearson lemma. Since the failure rate is the issue users are most concerned about, (iii) we derive the upper bound of the failure rate for the posterior probability test, so the user can apply the posterior probability test either with a fixed posterior probability or with a fixed failure rate. Simulated as well as real observed data are used for experimental validation. The results show that (i) the posterior probability test is the most effective among the R-ratio test, difference test, ellipsoidal integer aperture test and posterior probability test, (ii) the posterior probability test is computationally efficient and (iii) the failure rate estimation for the posterior probability test is useful.
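    A minimal sketch of the underlying idea, under simplifying assumptions (Gaussian float solution, flat prior over an enumerated candidate set), is shown below: each integer candidate is weighted by exp(-1/2 (a_hat - z)' Q^-1 (a_hat - z)), normalized, and the fix is accepted only if the best candidate's posterior probability clears a threshold. The numbers and the 0.99 threshold are illustrative, not the paper's derivation.

    ```python
    import itertools
    import numpy as np

    # Float ambiguity estimate and its variance-covariance matrix (illustrative values).
    a_hat = np.array([3.27, -1.62])
    Q = np.array([[0.040, 0.015],
                  [0.015, 0.030]])
    Q_inv = np.linalg.inv(Q)

    # Enumerate integer candidates in a small box around the float solution and
    # compute posterior probabilities under a Gaussian likelihood and flat prior.
    base = np.round(a_hat).astype(int)
    candidates = [base + np.array(d) for d in itertools.product((-1, 0, 1), repeat=2)]
    weights = np.array([np.exp(-0.5 * (a_hat - z) @ Q_inv @ (a_hat - z)) for z in candidates])
    posterior = weights / weights.sum()

    best = int(np.argmax(posterior))
    print("best candidate:", candidates[best], "posterior:", round(posterior[best], 3))
    # Accept the fixed solution only if its posterior probability clears a threshold.
    print("accept" if posterior[best] > 0.99 else "keep float solution")
    ```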

  8. PROBABILITY BASED CORROSION CONTROL FOR WASTE TANKS - PART II

    SciTech Connect

    Hoffman, E.; Edwards, T.

    2010-12-09

    As part of an ongoing study to evaluate the discontinuity in the corrosion controls at the SRS tank farm, a study was conducted this year to assess the minimum nitrite concentrations required to inhibit pitting in solutions below 1 molar nitrate (see Figure 1). Current controls on the tank farm solution chemistry are in place to prevent the initiation and propagation of pitting and stress corrosion cracking in the primary steel waste tanks. The controls are based upon a series of experiments performed with simulated solutions on materials used for construction of the tanks, namely ASTM A537 carbon steel (A537). During FY09, an experimental program was undertaken to investigate the risk associated with reducing the minimum molar nitrite concentration required to confidently inhibit pitting in dilute solutions (i.e., less than 1 molar nitrate). The experimental results and conclusions herein provide a statistical basis to quantify the probability of pitting for the tank wall exposed to various solutions with dilute concentrations of nitrate and nitrite. Understanding the probability for pitting will allow the facility to make tank-specific risk-based decisions for chemistry control. Based on previous electrochemical testing, a statistical test matrix was developed to refine and solidify the application of the statistical mixture/amount model to corrosion of A537 steel. A mixture/amount model was identified based on statistical analysis of recent and historically collected electrochemical data. This model provides a more complex relationship between the nitrate and nitrite concentrations and the probability of pitting than is represented by the model underlying the current chemistry control program, and its use may provide a technical basis for the utilization of less nitrite to inhibit pitting at concentrations below 1 molar nitrate. FY09 results fit within the mixture/amount model, and further refine the nitrate regime in which the model is applicable. The combination of visual observations and cyclic

  9. PROBABILITY BASED CORROSION CONTROL FOR LIQUID WASTE TANKS - PART III

    SciTech Connect

    Hoffman, E.; Edwards, T.

    2010-12-09

    The liquid waste chemistry control program is designed to reduce the pitting corrosion occurrence on tank walls. The chemistry control program has been implemented, in part, by applying engineering judgment safety factors to experimental data. However, the simple application of a general safety factor can result in use of excessive corrosion inhibiting agents. The required use of excess corrosion inhibitors can be costly for tank maintenance, waste processing, and in future tank closure. It is proposed that a probability-based approach can be used to quantify the risk associated with the chemistry control program. This approach can lead to the application of tank-specific chemistry control programs reducing overall costs associated with overly conservative use of inhibitor. Furthermore, when using nitrite as an inhibitor, the current chemistry control program is based on a linear model of increased aggressive species requiring increased protective species. This linear model was primarily supported by experimental data obtained from dilute solutions with nitrate concentrations less than 0.6 M, but is used to produce the current chemistry control program up to 1.0 M nitrate. Therefore, in the nitrate space between 0.6 and 1.0 M, the current control limit is based on assumptions that the linear model developed from data in the <0.6 M region is applicable in the 0.6-1.0 M region. Due to this assumption, further investigation of the nitrate region of 0.6 M to 1.0 M has potential for significant inhibitor reduction, while maintaining the same level of corrosion risk associated with the current chemistry control program. Ongoing studies have been conducted in FY07, FY08, FY09 and FY10 to evaluate the corrosion controls at the SRS tank farm and to assess the minimum nitrite concentrations to inhibit pitting in ASTM A537 carbon steel below 1.0 molar nitrate. The experimentation from FY08 suggested a non-linear model known as the mixture/amount model could be used to predict
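    To make the probability-based idea in these two records concrete, the sketch below uses a purely hypothetical logistic model of pitting probability as a function of nitrate and nitrite concentrations and searches for the smallest nitrite concentration that keeps the predicted probability below a target risk level. The functional form and every coefficient are invented for illustration; this is not the SRS mixture/amount model or its fitted parameters.

    ```python
    import numpy as np

    def pitting_probability(nitrate_m, nitrite_m, beta=(-2.0, 4.5, -6.0)):
        """Hypothetical logistic model of pitting probability versus nitrate
        (aggressive) and nitrite (inhibiting) molar concentrations.
        Coefficients are invented, not fitted to any experimental data."""
        b0, b_no3, b_no2 = beta
        eta = b0 + b_no3 * nitrate_m + b_no2 * nitrite_m
        return 1.0 / (1.0 + np.exp(-eta))

    # Smallest nitrite concentration keeping pitting probability below a target
    # risk level for a given nitrate concentration.
    nitrate, target = 0.8, 0.05
    grid = np.linspace(0.0, 1.0, 1001)
    ok = grid[pitting_probability(nitrate, grid) <= target]
    print(f"minimum nitrite at {nitrate} M nitrate: {ok[0]:.3f} M" if ok.size else "no limit found")
    ```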

  10. Scene text detection based on probability map and hierarchical model

    NASA Astrophysics Data System (ADS)

    Zhou, Gang; Liu, Yuehu

    2012-06-01

    Scene text detection is an important step for the text-based information extraction system. This problem is challenging due to the variations of size, unknown colors, and background complexity. We present a novel algorithm to robustly detect text in scene images. To segment text candidate connected components (CC) from images, a text probability map consisting of the text position and scale information is estimated by a text region detector. To filter out the non-text CCs, a hierarchical model consisting of two classifiers in cascade is utilized. The first stage of the model estimates text probabilities with unary component features. The second stage classifier is trained with both probability features and similarity features. Since the proposed method is learning-based, there are very few manual parameters required. Experimental results on the public benchmark ICDAR dataset show that our algorithm outperforms other state-of-the-art methods.

  11. Nonprobability and probability-based sampling strategies in sexual science.

    PubMed

    Catania, Joseph A; Dolcini, M Margaret; Orellana, Roberto; Narayanan, Vasudah

    2015-01-01

    With few exceptions, much of sexual science builds upon data from opportunistic nonprobability samples of limited generalizability. Although probability-based studies are considered the gold standard in terms of generalizability, they are costly to apply to many of the hard-to-reach populations of interest to sexologists. The present article discusses recent conclusions by sampling experts that have relevance to sexual science that advocates for nonprobability methods. In this regard, we provide an overview of Internet sampling as a useful, cost-efficient, nonprobability sampling method of value to sex researchers conducting modeling work or clinical trials. We also argue that probability-based sampling methods may be more readily applied in sex research with hard-to-reach populations than is typically thought. In this context, we provide three case studies that utilize qualitative and quantitative techniques directed at reducing limitations in applying probability-based sampling to hard-to-reach populations: indigenous Peruvians, African American youth, and urban men who have sex with men (MSM). Recommendations are made with regard to presampling studies, adaptive and disproportionate sampling methods, and strategies that may be utilized in evaluating nonprobability and probability-based sampling methods.

  12. Lake Superior Phytoplankton Characterization from the 2006 Probability Based Survey

    EPA Science Inventory

    We conducted a late summer probability based survey of Lake Superior in 2006 which consisted of 52 sites stratified across 3 depth zones. As part of this effort, we collected composite phytoplankton samples from the epilimnion and the fluorescence maxima (Fmax) at 29 of the site...

  13. Westward-derived conglomerates in Moenkopi formation of Southeastern California, and their probable tectonic significance

    SciTech Connect

    Walker, J.D.; Burchfiel, B.C.; Royden, L.H.

    1983-02-01

    The upper part of the Moenkopi Formation in the Northern Clark Mountains, Southeastern California, contains conglomerate beds whose clasts comprise igneous, metamorphic, and sedimentary rocks. Metamorphic clasts include foliated granite, meta-arkose, and quartzite, probably derived from older Precambrian basement and younger Precambrian clastic rocks. Volcanic clasts are altered plagioclase-bearing rocks, and sedimentary clasts were derived from Paleozoic miogeoclinal rocks. Paleocurrent data indicate that the clasts had a source to the southwest. An age of late Early or early Middle Triassic has been tentatively assigned to these conglomerates. These conglomerates indicate that Late Permian to Early Triassic deformational events in this part of the orogen affected rocks much farther east than has been previously recognized.

  14. Quantum probability ranking principle for ligand-based virtual screening.

    PubMed

    Al-Dabbagh, Mohammed Mumtaz; Salim, Naomie; Himmat, Mubarak; Ahmed, Ali; Saeed, Faisal

    2017-04-01

    Chemical libraries contain thousands of compounds that need screening, which increases the need for computational methods that can rank or prioritize compounds. The tools of virtual screening are widely exploited to enhance the cost effectiveness of lead drug discovery programs by ranking chemical compound databases in decreasing probability of biological activity based upon the probability ranking principle (PRP). In this paper, we developed a novel ranking approach for molecular compounds inspired by quantum mechanics, called the quantum probability ranking principle (QPRP). The QPRP ranking criteria draw an analogy between physical experiments and the molecular structure ranking process for 2D fingerprints in ligand-based virtual screening (LBVS). The development of the QPRP criteria in LBVS employs quantum concepts at three levels: first, at the representation level, the model develops a new framework of molecular representation by connecting molecular compounds with a mathematical quantum space; second, the similarity between chemical libraries and reference structures is estimated using a quantum-based similarity searching method; finally, the molecules are ranked using the QPRP approach. Simulated virtual screening experiments with MDL Drug Data Report (MDDR) data sets showed that QPRP outperformed the classical probability ranking principle (PRP) for molecular chemical compounds.

  15. Quantum probability ranking principle for ligand-based virtual screening

    NASA Astrophysics Data System (ADS)

    Al-Dabbagh, Mohammed Mumtaz; Salim, Naomie; Himmat, Mubarak; Ahmed, Ali; Saeed, Faisal

    2017-02-01

    Chemical libraries contain thousands of compounds that need screening, which increases the need for computational methods that can rank or prioritize compounds. The tools of virtual screening are widely exploited to enhance the cost effectiveness of lead drug discovery programs by ranking chemical compound databases in decreasing probability of biological activity based upon the probability ranking principle (PRP). In this paper, we developed a novel ranking approach for molecular compounds inspired by quantum mechanics, called the quantum probability ranking principle (QPRP). The QPRP ranking criteria draw an analogy between physical experiments and the molecular structure ranking process for 2D fingerprints in ligand-based virtual screening (LBVS). The development of the QPRP criteria in LBVS employs quantum concepts at three levels: first, at the representation level, the model develops a new framework of molecular representation by connecting molecular compounds with a mathematical quantum space; second, the similarity between chemical libraries and reference structures is estimated using a quantum-based similarity searching method; finally, the molecules are ranked using the QPRP approach. Simulated virtual screening experiments with MDL Drug Data Report (MDDR) data sets showed that QPRP outperformed the classical probability ranking principle (PRP) for molecular chemical compounds.

  16. Ethanol, not metabolized in brain, significantly reduces brain metabolism, probably via specific GABA(A) receptors

    PubMed Central

    Rae, Caroline D.; Davidson, Joanne E.; Maher, Anthony D.; Rowlands, Benjamin D.; Kashem, Mohammed A.; Nasrallah, Fatima A.; Rallapalli, Sundari K.; Cook, James M; Balcar, Vladimir J.

    2014-01-01

    Ethanol is a known neuromodulatory agent with reported actions at a range of neurotransmitter receptors. Here, we used an indirect approach, measuring the effect of alcohol on metabolism of [3-13C]pyruvate in the adult guinea pig brain cortical tissue slice and comparing the outcomes to those from a library of ligands active in the GABAergic system as well as studying the metabolic fate of [1,2-13C]ethanol. Ethanol (10, 30 and 60 mM) significantly reduced metabolic flux into all measured isotopomers and reduced all metabolic pool sizes. The metabolic profiles of these three concentrations of ethanol were similar and clustered with that of the α4β3δ positive allosteric modulator DS2 (4-Chloro-N-[2-(2-thienyl)imidazo[1,2a]-pyridin-3-yl]benzamide). Ethanol at a very low concentration (0.1 mM) produced a metabolic profile which clustered with those from inhibitors of GABA uptake, and ligands showing affinity for α5, and to a lesser extent, α1-containing GABA(A)R. There was no measurable metabolism of [1,2-13C]ethanol, with no significant incorporation of 13C from [1,2-13C]ethanol into any measured metabolite above natural abundance, although there were measurable effects on total metabolite sizes similar to those seen with unlabeled ethanol. The reduction in metabolism seen in the presence of ethanol is therefore likely to be due to its actions at neurotransmitter receptors, particularly α4β3δ receptors, and not because ethanol is substituting as a substrate or because of the effects of ethanol catabolites acetaldehyde or acetate. We suggest that the stimulatory effects of very low concentrations of ethanol are due to release of GABA via GAT1 and the subsequent interaction of this GABA with local α5-containing, and to a lesser extent, α1-containing GABA(A)R. PMID:24313287

  17. Diffusion-based population statistics using tract probability maps.

    PubMed

    Wassermann, Demian; Kanterakis, Efstathios; Gur, Ruben C; Deriche, Rachid; Verma, Ragini

    2010-01-01

    We present a novel technique for the tract-based statistical analysis of diffusion imaging data. In our technique, we represent each white matter (WM) tract as a tract probability map (TPM): a function mapping a point to its probability of belonging to the tract. We start by automatically clustering the tracts identified in the brain via tractography into TPMs using a novel Gaussian process framework. Then, each tract is modeled by the skeleton of its TPM, a medial representation with a tubular or sheet-like geometry. The appropriate geometry for each tract is implicitly inferred from the data instead of being selected a priori, as is done by current tract-specific approaches. The TPM representation makes it possible to average diffusion imaging based features along directions locally perpendicular to the skeleton of each WM tract, increasing the sensitivity and specificity of statistical analyses on the WM. Our framework therefore facilitates the automated analysis of WM tract bundles, and enables the quantification and visualization of tract-based statistical differences between groups. We have demonstrated the applicability of our framework by studying WM differences between 34 schizophrenia patients and 24 healthy controls.

  18. QKD-based quantum private query without a failure probability

    NASA Astrophysics Data System (ADS)

    Liu, Bin; Gao, Fei; Huang, Wei; Wen, QiaoYan

    2015-10-01

    In this paper, we present a quantum-key-distribution (QKD)-based quantum private query (QPQ) protocol utilizing single-photon signal of multiple optical pulses. It maintains the advantages of the QKD-based QPQ, i.e., easy to implement and loss tolerant. In addition, different from the situations in the previous QKD-based QPQ protocols, in our protocol, the number of the items an honest user will obtain is always one and the failure probability is always zero. This characteristic not only improves the stability (in the sense that, ignoring the noise and the attack, the protocol would always succeed), but also benefits the privacy of the database (since the database will no longer reveal additional secrets to the honest users). Furthermore, for the user's privacy, the proposed protocol is cheat sensitive, and for the security of the database, we obtain an upper bound for the leaked information of the database in theory.

  19. Using Inverse Probability Bootstrap Sampling to Eliminate Sample Induced Bias in Model Based Analysis of Unequal Probability Samples.

    PubMed

    Nahorniak, Matthew; Larsen, David P; Volk, Carol; Jordan, Chris E

    2015-01-01

    In ecology, as in other research fields, efficient sampling for population estimation often drives sample designs toward unequal probability sampling, such as in stratified sampling. Design based statistical analysis tools are appropriate for seamless integration of sample design into the statistical analysis. However, it is also common and necessary, after a sampling design has been implemented, to use datasets to address questions that, in many cases, were not considered during the sampling design phase. Questions may arise requiring the use of model based statistical tools such as multiple regression, quantile regression, or regression tree analysis. However, such model based tools may require, for ensuring unbiased estimation, data from simple random samples, which can be problematic when analyzing data from unequal probability designs. Despite numerous method specific tools available to properly account for sampling design, too often in the analysis of ecological data, sample design is ignored and consequences are not properly considered. We demonstrate here that violation of this assumption can lead to biased parameter estimates in ecological research. In addition to the set of tools available for researchers to properly account for sampling design in model based analysis, we introduce inverse probability bootstrapping (IPB). Inverse probability bootstrapping is an easily implemented method for obtaining equal probability re-samples from a probability sample, from which unbiased model based estimates can be made. We demonstrate the potential for bias in model-based analyses that ignore sample inclusion probabilities, and the effectiveness of IPB sampling in eliminating this bias, using both simulated and actual ecological data. For illustration, we considered three model based analysis tools--linear regression, quantile regression, and boosted regression tree analysis. In all models, using both simulated and actual ecological data, we found inferences to be
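    A minimal sketch of the mechanics described here, under invented simulated data, is shown below: resample the observed data with replacement using weights proportional to the inverse inclusion probabilities, then fit the model-based tool (here ordinary linear regression) on each resample. The data-generating setup and sample sizes are assumptions for illustration only.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)

    # Simulated sample from a design with two strata and unequal inclusion probabilities.
    n = 500
    stratum = rng.integers(0, 2, size=n)                 # 0 or 1
    incl_prob = np.where(stratum == 1, 0.8, 0.2)         # design inclusion probabilities
    x = rng.normal(size=n) + stratum                     # covariate correlated with stratum
    y = 2.0 + 1.5 * x + rng.normal(scale=0.5, size=n)

    # Inverse probability bootstrap: resample with replacement, with weights
    # proportional to 1 / inclusion probability, so each resample behaves like an
    # equal-probability sample; fit the model on every resample.
    w = 1.0 / incl_prob
    w /= w.sum()
    slopes = []
    for _ in range(200):
        idx = rng.choice(n, size=n, replace=True, p=w)
        slope, intercept = np.polyfit(x[idx], y[idx], 1)
        slopes.append(slope)

    print(f"IPB slope estimate: {np.mean(slopes):.3f} +/- {np.std(slopes):.3f}")
    ```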

  20. Using Inverse Probability Bootstrap Sampling to Eliminate Sample Induced Bias in Model Based Analysis of Unequal Probability Samples

    PubMed Central

    Nahorniak, Matthew

    2015-01-01

    In ecology, as in other research fields, efficient sampling for population estimation often drives sample designs toward unequal probability sampling, such as in stratified sampling. Design based statistical analysis tools are appropriate for seamless integration of sample design into the statistical analysis. However, it is also common and necessary, after a sampling design has been implemented, to use datasets to address questions that, in many cases, were not considered during the sampling design phase. Questions may arise requiring the use of model based statistical tools such as multiple regression, quantile regression, or regression tree analysis. However, such model based tools may require, for ensuring unbiased estimation, data from simple random samples, which can be problematic when analyzing data from unequal probability designs. Despite numerous method specific tools available to properly account for sampling design, too often in the analysis of ecological data, sample design is ignored and consequences are not properly considered. We demonstrate here that violation of this assumption can lead to biased parameter estimates in ecological research. In addition to the set of tools available for researchers to properly account for sampling design in model based analysis, we introduce inverse probability bootstrapping (IPB). Inverse probability bootstrapping is an easily implemented method for obtaining equal probability re-samples from a probability sample, from which unbiased model based estimates can be made. We demonstrate the potential for bias in model-based analyses that ignore sample inclusion probabilities, and the effectiveness of IPB sampling in eliminating this bias, using both simulated and actual ecological data. For illustration, we considered three model based analysis tools—linear regression, quantile regression, and boosted regression tree analysis. In all models, using both simulated and actual ecological data, we found inferences to be

  1. Lung scans with significant perfusion defects limited to matching pleural effusions have a low probability of pulmonary embolism

    SciTech Connect

    Datz, F.L.; Bedont, R.A.; Taylor, A.

    1985-05-01

    Patients with a pleural effusion on chest x-ray often undergo a lung scan to exclude pulmonary embolism (PE). According to other studies, when the scan shows a perfusion defect equal in size to a radiographic abnormality on chest x-ray, the scan should be classified as indeterminate or intermediate probability for PE. However, since those studies dealt primarily with alveolar infiltrates rather than pleural effusions, the authors undertook a retrospective study to determine the probability of PE in patients with pleural effusion and a matching perfusion defect. The authors reviewed 451 scans and x-rays of patients studied for suspected PE. Of those, 53 had moderate or large perfusion defects secondary to pleural effusion without other significant (>25% of a segment) defects on the scan. Final diagnosis was confirmed by pulmonary angiography (16), thoracentesis (40), venography (11), other radiographic and laboratory studies, and clinical course. Of the 53 patients, only 2 patients had venous thrombotic disease. One patient had PE on pulmonary angiography, the other patient had thrombophlebitis on venography. The remainder of the patients had effusions due to congestive heart failure (12), malignancy (12), infection (7), trauma (7), collagen vascular disease (7), sympathetic effusion (3) and unknown etiology (3). The authors conclude that lung scans with significant perfusion defects limited to matching pleural effusions on chest x-ray have a low probability for PE.

  2. Gesture Recognition Based on the Probability Distribution of Arm Trajectories

    NASA Astrophysics Data System (ADS)

    Wan, Khairunizam; Sawada, Hideyuki

    The use of human motions for the interaction between humans and computers is becoming an attractive alternative to verbal media, especially through the visual interpretation of human body motion. In particular, hand gestures serve as a non-verbal medium for humans to communicate with machines. This paper introduces a 3D motion measurement of the human upper body for the purpose of gesture recognition, based on the probability distribution of arm trajectories. In this study, by examining the characteristics of the arm trajectories given by a signer, motion features are selected and classified by using a fuzzy technique. Experimental results show that the use of the features extracted from arm trajectories works effectively for the recognition of dynamic human gestures, and gives good performance in classifying various gesture patterns.

  3. Novel Use of Derived Genotype Probabilities to Discover Significant Dominance Effects for Milk Production Traits in Dairy Cattle

    PubMed Central

    Boysen, Teide-Jens; Heuer, Claas; Tetens, Jens; Reinhardt, Fritz; Thaller, Georg

    2013-01-01

    The estimation of dominance effects requires the availability of direct phenotypes, i.e., genotypes and phenotypes in the same individuals. In dairy cattle, classical QTL mapping approaches are, however, relying on genotyped sires and daughter-based phenotypes like breeding values. Thus, dominance effects cannot be estimated. The number of dairy bulls genotyped for dense genome-wide marker panels is steadily increasing in the context of genomic selection schemes. The availability of genotyped cows is, however, limited. Within the current study, the genotypes of male ancestors were applied to the calculation of genotype probabilities in cows. Together with the cows’ phenotypes, these probabilities were used to estimate dominance effects on a genome-wide scale. The impact of sample size, the depth of pedigree used in deriving genotype probabilities, the linkage disequilibrium between QTL and marker, the fraction of variance explained by the QTL, and the degree of dominance on the power to detect dominance were analyzed in simulation studies. The effect of relatedness among animals on the specificity of detection was addressed. Furthermore, the approach was applied to a real data set comprising 470,000 Holstein cows. To account for relatedness between animals a mixed-model two-step approach was used to adjust phenotypes based on an additive genetic relationship matrix. Thereby, considerable dominance effects were identified for important milk production traits. The approach might serve as a powerful tool to dissect the genetic architecture of performance and functional traits in dairy cattle. PMID:23222654

  4. Econometric analysis of the changing effects in wind strength and significant wave height on the probability of casualty in shipping.

    PubMed

    Knapp, Sabine; Kumar, Shashi; Sakurada, Yuri; Shen, Jiajun

    2011-05-01

    This study uses econometric models to measure the effect of significant wave height and wind strength on the probability of casualty and tests whether these effects changed. While both effects are particularly relevant for stability and strength calculations of vessels, the results are also helpful for the development of ship construction standards in general to counteract increased risk resulting from changing oceanographic conditions. The authors analyzed a unique dataset of 3.2 million observations from 20,729 individual vessels in the North Atlantic and Arctic regions gathered during the period 1979-2007. The results show that although there is a seasonal pattern in the probability of casualty, especially during the winter months, the effects of wind strength and significant wave height do not follow the same seasonal pattern. Additionally, over time, significant wave height shows an increasing effect in January, March, May and October while wind strength shows a decreasing effect, especially in January, March and May. The models can be used to simulate and help understand these relationships. This is of particular interest to naval architects and ship designers as well as multilateral agencies such as the International Maritime Organization (IMO) that establish global standards in ship design and construction.

  5. Naive Probability: Model-Based Estimates of Unique Events.

    PubMed

    Khemlani, Sangeet S; Lotstein, Max; Johnson-Laird, Philip N

    2015-08-01

    We describe a dual-process theory of how individuals estimate the probabilities of unique events, such as Hillary Clinton becoming U.S. President. It postulates that uncertainty is a guide to improbability. In its computer implementation, an intuitive system 1 simulates evidence in mental models and forms analog non-numerical representations of the magnitude of degrees of belief. This system has minimal computational power and combines evidence using a small repertoire of primitive operations. It resolves the uncertainty of divergent evidence for single events, for conjunctions of events, and for inclusive disjunctions of events, by taking a primitive average of non-numerical probabilities. It computes conditional probabilities in a tractable way, treating the given event as evidence that may be relevant to the probability of the dependent event. A deliberative system 2 maps the resulting representations into numerical probabilities. With access to working memory, it carries out arithmetical operations in combining numerical estimates. Experiments corroborated the theory's predictions. Participants concurred in estimates of real possibilities. They violated the complete joint probability distribution in the predicted ways, when they made estimates about conjunctions: P(A), P(B), P(A and B), disjunctions: P(A), P(B), P(A or B or both), and conditional probabilities P(A), P(B), P(B|A). They were faster to estimate the probabilities of compound propositions when they had already estimated the probabilities of each of their components. We discuss the implications of these results for theories of probabilistic reasoning.

  6. Web-based experiments controlled by JavaScript: an example from probability learning.

    PubMed

    Birnbaum, Michael H; Wakcher, Sandra V

    2002-05-01

    JavaScript programs can be used to control Web experiments. This technique is illustrated by an experiment that tested the effects of advice on performance in the classic probability-learning paradigm. Previous research reported that people tested via the Web or in the lab tended to match the probabilities of their responses to the probabilities that those responses would be reinforced. The optimal strategy, however, is to consistently choose the more frequent event; probability matching produces suboptimal performance. We investigated manipulations we reasoned should improve performance. A horse race scenario in which participants predicted the winner in each of a series of races between two horses was compared with an abstract scenario used previously. Ten groups of learners received different amounts of advice, including all combinations of (1) explicit instructions concerning the optimal strategy, (2) explicit instructions concerning a monetary sum to maximize, and (3) accurate information concerning the probabilities of events. The results showed minimal effects of horse race versus abstract scenario. Both advice concerning the optimal strategy and probability information contributed significantly to performance in the task. This paper includes a brief tutorial on JavaScript, explaining with simple examples how to assemble a browser-based experiment.
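    The claim that probability matching is suboptimal follows from a short expected-accuracy calculation, sketched below for a binary prediction task (standard result, not code from the paper; the probabilities chosen are arbitrary examples).

    ```python
    # Expected proportion correct when one outcome occurs with probability p.
    def matching_accuracy(p):
        # Probability matching: predict each outcome with the rate it occurs.
        return p * p + (1 - p) * (1 - p)

    def maximizing_accuracy(p):
        # Optimal strategy: always predict the more frequent outcome.
        return max(p, 1 - p)

    for p in (0.6, 0.7, 0.8, 0.9):
        print(f"p={p:.1f}: matching {matching_accuracy(p):.2f} vs maximizing {maximizing_accuracy(p):.2f}")
    ```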

  7. Success Probability Analysis for Shuttle Based Microgravity Experiments

    NASA Technical Reports Server (NTRS)

    Liou, Ying-Hsin Andrew

    1996-01-01

    Presented in this report are the results of data analysis of shuttle-based microgravity flight experiments. Potential factors were identified in the previous grant period, and in this period 26 factors were selected for data analysis. In this project, the degree of success was developed and used as the performance measure. 293 of the 391 experiments in Lewis Research Center Microgravity Database were assigned degrees of success. The frequency analysis and the analysis of variance were conducted to determine the significance of the factors that effect the experiment success.

  8. Proficiency Scaling Based on Conditional Probability Functions for Attributes

    DTIC Science & Technology

    1993-10-01

    4.1 Non-parametric regression estimates as probability functions for attributes Non- parametric estimation of the unknown density function f from a plot...as construction of confidence intervals for PFAs and further improvement of non- parametric estimation methods are not discussed in this paper. The... parametric estimation of PFAs will be illustrated with the attribute mastery patterns of SAT M Section 4. In the next section, analysis results will be

  9. Probability-based stability robustness assessment of controlled structures

    SciTech Connect

    Field, R.V. Jr.; Voulgaris, P.G.; Bergman, L.A.

    1996-01-01

    Model uncertainty, if ignored, can seriously degrade the performance of an otherwise well-designed control system. If the level of this uncertainty is extreme, the system may even be driven to instability. In the context of structural control, performance degradation and instability imply excessive vibration or even structural failure. Robust control has typically been applied to the issue of model uncertainty through worst-case analyses. These traditional methods include the use of the structured singular value, as applied to the small gain condition, to provide estimates of controller robustness. However, this emphasis on the worst-case scenario has not allowed a probabilistic understanding of robust control. In this paper an attempt to view controller robustness as a probability measure is presented. The probability of failure due to parametric uncertainty is estimated using first-order reliability methods (FORM). It is demonstrated that this method can provide quite accurate results on the probability of failure of actively controlled structures. Moreover, a comparison of this method to a suitably modified structured singular value robustness analysis in a probabilistic framework is performed. It is shown that FORM is the superior analysis technique when applied to a controlled three degree-of-freedom structure. In addition, the robustness qualities of various active control design schemes such as LQR, H2, H∞, and μ-synthesis are discussed in order to provide some design guidelines.
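    For readers unfamiliar with FORM, the core estimate is that the probability of failure is approximately Φ(-β), where β is the distance from the origin to the most probable point on the limit-state surface in standard normal space. The sketch below uses a linear limit-state function with invented coefficients, for which the FORM estimate happens to be exact; it is not the controlled-structure model from the paper.

    ```python
    import numpy as np
    from scipy.stats import norm

    # Linear limit-state g(u) = a0 + a . u in standard normal space (illustrative
    # coefficients); failure is the event g(u) < 0.
    a0 = 3.0
    a = np.array([1.0, -2.0, 0.5])

    # Reliability index: distance from the origin to the failure surface g(u) = 0,
    # i.e. to the most probable point (MPP).
    beta = a0 / np.linalg.norm(a)
    p_failure = norm.cdf(-beta)

    print(f"beta = {beta:.3f}, FORM probability of failure = {p_failure:.4f}")
    ```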

  10. [Base excess. Parameter with exceptional clinical significance].

    PubMed

    Schaffartzik, W

    2007-05-01

    The base excess of blood (BE) plays an important role in the description of the acid-base status of a patient and is gaining in clinical interest. Apart from the Quick test, the age, the injury severity score and the Glasgow coma scale, the BE is becoming more and more important to identify, e.g., the risk of mortality for patients with multiple injuries. According to Zander, the BE is calculated using the pH, pCO(2), haemoglobin concentration and the oxygen saturation of haemoglobin (sO(2)). The use of sO(2) allows the blood gas analyser to determine only one value of BE, independent of the type of blood sample analyzed: arterial, mixed venous or venous. The BE and measurement of the lactate concentration (cLac) play an important role in diagnosing critically ill patients. In general, the change in BE corresponds to the change in cLac. If ΔBE is smaller than ΔcLac, the reason could be therapy with HCO(3)(-) but also infusion solutions containing lactate. Physicians are very familiar with the term BE; therefore, knowledge about an alkalizing or acidifying effect of an infusion solution would be very helpful in the treatment of patients, especially critically ill patients. Unfortunately, at present the description of an infusion solution with respect to BE has not yet been accepted by the manufacturers.

  11. The Effect of Simulation-Based Learning on Prospective Teachers' Inference Skills in Teaching Probability

    ERIC Educational Resources Information Center

    Koparan, Timur; Yilmaz, Gül Kaleli

    2015-01-01

    This research examined the effect of simulation-based probability teaching on prospective teachers' inference skills. In line with this purpose, the study aimed to examine the design, implementation and efficiency of a learning environment for experimental probability. Activities were built on modeling, simulation and the…
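    A generic classroom-style simulation of experimental probability, of the kind such a learning environment might use, is sketched below; it is not the study's actual software, just a dice-rolling comparison of experimental and theoretical probability.

    ```python
    import random

    # Roll two dice many times and compare the experimental probability of a sum
    # of 7 with the theoretical value 6/36.
    trials = 10_000
    hits = sum(1 for _ in range(trials)
               if random.randint(1, 6) + random.randint(1, 6) == 7)

    print(f"experimental: {hits / trials:.3f}   theoretical: {6 / 36:.3f}")
    ```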

  12. Developing a probability-based model of aquifer vulnerability in an agricultural region

    NASA Astrophysics Data System (ADS)

    Chen, Shih-Kai; Jang, Cheng-Shin; Peng, Yi-Huei

    2013-04-01

    Summary: Hydrogeological settings of aquifers strongly influence regional groundwater movement and pollution processes. Establishing a map of aquifer vulnerability is considerably critical for planning a scheme of groundwater quality protection. This study developed a novel probability-based DRASTIC model of aquifer vulnerability in the Choushui River alluvial fan, Taiwan, using indicator kriging, and determined various risk categories of contamination potential based on estimated vulnerability indexes. Categories and ratings of the six parameters in the probability-based DRASTIC model were probabilistically characterized according to the parameter classification methods of selecting a maximum estimation probability and calculating an expected value. Moreover, the probability-based estimation and assessment gave an excellent insight into propagating the uncertainty of parameters due to limited observation data. To examine the pollutant prediction capacity of the developed probability-based DRASTIC model, medium, high, and very high risk categories of contamination potential were compared with observed nitrate-N exceeding 0.5 mg/L, indicating anthropogenic groundwater pollution. The analyzed results reveal that the developed probability-based DRASTIC model is capable of predicting high nitrate-N groundwater pollution and characterizing the parameter uncertainty via the probability estimation processes.

  13. A Most Probable Point-Based Method for Reliability Analysis, Sensitivity Analysis and Design Optimization

    NASA Technical Reports Server (NTRS)

    Hou, Gene J.-W; Newman, Perry A. (Technical Monitor)

    2004-01-01

    A major step in a most probable point (MPP)-based method for reliability analysis is to determine the MPP. This is usually accomplished by using an optimization search algorithm. The minimum distance associated with the MPP provides a measurement of safety probability, which can be obtained by approximate probability integration methods such as FORM or SORM. The reliability sensitivity equations are derived first in this paper, based on the derivatives of the optimal solution. Examples are provided later to demonstrate the use of these derivatives for better reliability analysis and reliability-based design optimization (RBDO).

  14. A Most Probable Point-Based Method for Reliability Analysis, Sensitivity Analysis and Design Optimization

    NASA Technical Reports Server (NTRS)

    Hou, Gene J.-W.; Gumbert, Clyde R.; Newman, Perry A.

    2004-01-01

    A major step in a most probable point (MPP)-based method for reliability analysis is to determine the MPP. This is usually accomplished by using an optimization search algorithm. The optimal solutions associated with the MPP provide measurements related to safety probability. This study focuses on two commonly used approximate probability integration methods; i.e., the Reliability Index Approach (RIA) and the Performance Measurement Approach (PMA). Their reliability sensitivity equations are first derived in this paper, based on the derivatives of their respective optimal solutions. Examples are then provided to demonstrate the use of these derivatives for better reliability analysis and Reliability-Based Design Optimization (RBDO).

  15. Probability of foliar injury for Acer sp. based on foliar fluoride concentrations.

    PubMed

    McDonough, Andrew M; Dixon, Murray J; Terry, Debbie T; Todd, Aaron K; Luciani, Michael A; Williamson, Michele L; Roszak, Danuta S; Farias, Kim A

    2016-12-01

    Fluoride is considered one of the most phytotoxic elements to plants, and indicative fluoride injury has been associated with a wide range of foliar fluoride concentrations. The aim of this study was to determine the probability of indicative foliar fluoride injury based on Acer sp. foliar fluoride concentrations using a logistic regression model. Foliage from Acer negundo, Acer saccharinum, Acer saccharum and Acer platanoides was collected along a distance gradient from three separate brick manufacturing facilities in southern Ontario as part of a long-term monitoring programme between 1995 and 2014. Hydrogen fluoride is the major emission source associated with the manufacturing facilities, resulting in highly elevated foliar fluoride close to the facilities that decreases with distance. Consistent with other studies, indicative fluoride injury was observed over a wide range of foliar concentrations (9.9-480.0 μg F(-) g(-1)). The logistic regression model was statistically significant for the Acer sp. group, A. negundo and A. saccharinum, with A. negundo being the most sensitive species in the group. In addition, A. saccharum and A. platanoides were not statistically significant within the model. We are unaware of published foliar fluoride values for Acer sp. within Canada, and this research provides policy makers and scientists with probabilities of indicative foliar injury for common urban Acer sp. trees that can help guide decisions about emissions controls. Further research should focus on the mechanisms driving indicative fluoride injury over wide-ranging foliar fluoride concentrations and help determine foliar fluoride thresholds for damage.
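    A minimal sketch of the general approach, fitting a logistic regression of injury presence on foliar fluoride concentration and reading off predicted injury probabilities, is shown below. The concentrations and injury labels are invented to mirror the wide range reported, not the study's data, and the log10 transform is an assumption for illustration.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Hypothetical foliar fluoride concentrations (ug F/g) and observed injury (0/1).
    conc = np.array([5, 12, 20, 35, 60, 90, 150, 220, 300, 480], dtype=float)
    injury = np.array([0, 0, 0, 1, 0, 1, 1, 1, 1, 1])

    # Fit P(injury) as a logistic function of log10 concentration.
    model = LogisticRegression().fit(np.log10(conc).reshape(-1, 1), injury)

    # Predicted probability of indicative injury at a few concentrations.
    for c in (10, 50, 100, 400):
        p = model.predict_proba([[np.log10(c)]])[0, 1]
        print(f"{c:4d} ug/g -> P(injury) = {p:.2f}")
    ```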

  16. A Comparative Study of Probability Collectives Based Multi-agent Systems and Genetic Algorithms

    NASA Technical Reports Server (NTRS)

    Huang, Chien-Feng; Wolpert, David H.; Bieniawski, Stefan; Strauss, Charles E. M.

    2005-01-01

    We compare Genetic Algorithms (GA's) with Probability Collectives (PC), a new framework for distributed optimization and control. In contrast to GA's, PC-based methods do not update populations of solutions. Instead they update an explicitly parameterized probability distribution p over the space of solutions. That updating of p arises as the optimization of a functional of p. The functional is chosen so that any p that optimizes it should be peaked about good solutions. The PC approach works in both continuous and discrete problems. It does not suffer from the resolution limitation of the finite bit length encoding of parameters into GA alleles. It also has deep connections with both game theory and statistical physics. We review the PC approach using its motivation as the information theoretic formulation of bounded rationality for multi-agent systems. It is then compared with GA's on a diverse set of problems. To handle high dimensional surfaces, in the PC method investigated here p is restricted to a product distribution. Each distribution in that product is controlled by a separate agent. The test functions were selected for their difficulty using either traditional gradient descent or genetic algorithms. On those functions the PC-based approach significantly outperforms traditional GA's in rate of descent, trapping in false minima, and long-term optimization.

  17. Dynamic Forecasting Conditional Probability of Bombing Attacks Based on Time-Series and Intervention Analysis.

    PubMed

    Li, Shuying; Zhuang, Jun; Shen, Shifei

    2016-08-23

    In recent years, various types of terrorist attacks occurred, causing worldwide catastrophes. According to the Global Terrorism Database (GTD), among all attack tactics, bombing attacks happened most frequently, followed by armed assaults. In this article, a model for analyzing and forecasting the conditional probability of bombing attacks (CPBAs) based on time-series methods is developed. In addition, intervention analysis is used to analyze the sudden increase in the time-series process. The results show that the CPBA increased dramatically at the end of 2011. During that time, the CPBA increased by 16.0% in a two-month period to reach the peak value, but still stays 9.0% greater than the predicted level after the temporary effect gradually decays. By contrast, no significant fluctuation can be found in the conditional probability process of armed assault. It can be inferred that some social unrest, such as America's troop withdrawal from Afghanistan and Iraq, could have led to the increase of the CPBA in Afghanistan, Iraq, and Pakistan. The integrated time-series and intervention model is used to forecast the monthly CPBA in 2014 and through 2064. The average relative error compared with the real data in 2014 is 3.5%. The model is also applied to the total number of attacks recorded by the GTD between 2004 and 2014.

  18. Bag of Events: An Efficient Probability-Based Feature Extraction Method for AER Image Sensors.

    PubMed

    Peng, Xi; Zhao, Bo; Yan, Rui; Tang, Huajin; Yi, Zhang

    2016-03-18

    Address event representation (AER) image sensors represent the visual information as a sequence of events that denotes the luminance changes of the scene. In this paper, we introduce a feature extraction method for AER image sensors based on the probability theory, namely, bag of events (BOE). The proposed approach represents each object as the joint probability distribution of the concurrent events, and each event corresponds to a unique activated pixel of the AER sensor. The advantages of BOE include: 1) it is a statistical learning method and has a good interpretability in mathematics; 2) BOE can significantly reduce the effort to tune parameters for different data sets, because it only has one hyperparameter and is robust to the value of the parameter; 3) BOE is an online learning algorithm, which does not require the training data to be collected in advance; 4) BOE can achieve competitive results in real time for feature extraction (>275 frames/s and >120,000 events/s); and 5) the implementation complexity of BOE only involves some basic operations, e.g., addition and multiplication. This guarantees the hardware friendliness of our method. The experimental results on three popular AER databases (i.e., MNIST-dynamic vision sensor, Poker Card, and Posture) show that our method is remarkably faster than two recently proposed AER categorization systems while preserving a good classification accuracy.
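    The core feature described here, a normalized histogram over activated pixel addresses that serves as the joint probability distribution of the concurrent events, can be sketched in a few lines. The tiny event stream and sensor size below are invented for illustration; this is not the paper's full BOE pipeline.

    ```python
    import numpy as np

    # An AER event stream as a list of activated pixel addresses (x, y) on a
    # small 4x4 "sensor"; the stream is invented for illustration.
    events = [(0, 1), (0, 1), (2, 3), (1, 1), (0, 1), (2, 3), (3, 0)]

    width = height = 4
    counts = np.zeros((height, width))
    for x, y in events:
        counts[y, x] += 1

    # "Bag of events" feature: the empirical joint probability of the concurrent
    # events, i.e. a normalized histogram over pixel addresses.
    boe = counts / counts.sum()
    print(boe)
    ```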

  19. Studying the effects of fuel treatment based on burn probability on a boreal forest landscape.

    PubMed

    Liu, Zhihua; Yang, Jian; He, Hong S

    2013-01-30

    Fuel treatment is assumed to be a primary tactic to mitigate intense and damaging wildfires. However, how to place treatment units across a landscape and assess its effectiveness is difficult for landscape-scale fuel management planning. In this study, we used a spatially explicit simulation model (LANDIS) to conduct wildfire risk assessments and optimize the placement of fuel treatments at the landscape scale. We first calculated a baseline burn probability map from empirical data (fuel, topography, weather, and fire ignition and size data) to assess fire risk. We then prioritized landscape-scale fuel treatment based on maps of burn probability and fuel loads (calculated from the interactions among tree composition, stand age, and disturbance history), and compared their effects on reducing fire risk. The burn probability map described the likelihood of burning on a given location; the fuel load map described the probability that a high fuel load will accumulate on a given location. Fuel treatment based on the burn probability map specified that stands with high burn probability be treated first, while fuel treatment based on the fuel load map specified that stands with high fuel loads be treated first. Our results indicated that fuel treatment based on burn probability greatly reduced the burned area and number of fires of different intensities. Fuel treatment based on burn probability also produced more dispersed and smaller high-risk fire patches and therefore can improve efficiency of subsequent fire suppression. The strength of our approach is that more model components (e.g., succession, fuel, and harvest) can be linked into LANDIS to map the spatially explicit wildfire risk and its dynamics to fuel management, vegetation dynamics, and harvesting.

  20. CD99 polymorphisms significantly influence the probability to develop Ewing sarcoma in earlier age and patient disease progression

    PubMed Central

    Martinelli, Marcella; Parra, Alessandro; Scapoli, Luca; Sanctis, Paola De; Chiadini, Valentina; Hattinger, Claudia; Picci, Piero

    2016-01-01

    Ewing sarcoma (EWS), the second most common primary bone tumor in pediatric age, is known for its paucity of recurrent somatic abnormalities. Apart from the chimeric oncoprotein that derives from the fusion of EWS and FLI genes, recent genome-wide association studies have identified susceptibility variants near the EGR2 gene that regulate DNA binding of EWS-FLI. However, to induce transformation, EWS-FLI requires the presence of additional molecular events, including the expression of CD99, a cell surface molecule with critical relevance for the pathogenesis of EWS. High expression of CD99 is a common and distinctive feature of EWS cells, and it has largely been used for the differential diagnosis of the disease. The present study first links CD99 germline genetic variants to the susceptibility of EWS development and its progression. In particular, a panel of 25 single nucleotide polymorphisms has been genotyped in a case-control study. The CD99 rs311059 T variant was found to be significantly associated [P value = 0.0029; ORhet = 3.9 (95% CI 1.5-9.8) and ORhom = 5.3 (95% CI 1.2-23.7)] with EWS onset in patients less than 14 years old, while the CD99 rs312257-T was observed to be associated [P value = 0.0265; ORhet = 3.5 (95% CI 1.3-9.9)] with a reduced risk of relapse. Besides confirming the importance of CD99, our findings indicate that polymorphic variations in this gene may affect either development or progression of EWS, leading to further understanding of this cancer and development of better diagnostics/prognostics for children and adolescents with this devastating disease. PMID:27792997

  1. Evaluation of gene importance in microarray data based upon probability of selection

    PubMed Central

    Fu, Li M; Fu-Liu, Casey S

    2005-01-01

Background Microarray devices permit a genome-scale evaluation of gene function. This technology has catalyzed biomedical research and development in recent years. As many important diseases can be traced down to the gene level, a long-standing research problem is to identify specific gene expression patterns linking to metabolic characteristics that contribute to disease development and progression. The microarray approach offers an expedited solution to this problem. However, recognizing disease-related gene expression patterns embedded in the microarray data remains a challenging issue. In selecting a small set of biologically significant genes for classifier design, the high data dimensionality inherent in this problem creates a substantial amount of uncertainty. Results Here we present a model for probability analysis of selected genes in order to determine their importance. Our contribution is that we show how to derive the P value of each selected gene in multiple gene selection trials based on different combinations of data samples and how to conduct a reliability analysis accordingly. The importance of a gene is indicated by its associated P value in that a smaller value implies higher information content from information theory. On the microarray data concerning the subtype classification of small round blue cell tumors, we demonstrate that the method is capable of finding the smallest set of genes (19 genes) with optimal classification performance, compared with results reported in the literature. Conclusion In classifier design based on microarray data, the probability value derived from gene selection based on multiple combinations of data samples enables an effective mechanism for reducing the tendency of fitting local data particularities. PMID:15784140

  2. A comprehensive propagation prediction model comprising microfacet based scattering and probability based coverage optimization algorithm.

    PubMed

    Kausar, A S M Zahid; Reza, Ahmed Wasif; Wo, Lau Chun; Ramiah, Harikrishnan

    2014-01-01

Although ray tracing based propagation prediction models are popular for indoor radio wave propagation characterization, most of them do not provide an integrated approach for achieving the goal of optimum coverage, which is a key part of designing a wireless network. In this paper, an accelerated technique of three-dimensional ray tracing is presented, in which rough surface scattering is included to make the ray tracing technique more accurate. Here, the rough surface scattering is represented by microfacets, which makes it possible to compute the scattered field in all possible directions. New optimization techniques, such as dual quadrant skipping (DQS) and closest object finder (COF), are implemented for fast characterization of wireless communications and for making the ray tracing technique more efficient. A probability based coverage optimization algorithm is then combined with the ray tracing technique to form a compact solution for indoor propagation prediction. The proposed technique decreases the ray tracing time by omitting unnecessary objects using the DQS technique and by decreasing the ray-object intersection time using the COF technique. The coverage optimization algorithm is based on probability theory and finds the minimum number of transmitters and their corresponding positions required to achieve optimal indoor wireless coverage. The proposed algorithm improves on existing algorithms in both space and time complexity. For the verification of the proposed ray tracing technique and coverage algorithm, detailed simulation results for different scattering factors, different antenna types, and different operating frequencies are presented. Furthermore, the proposed technique is verified by the experimental results.

  3. PROBABILITY BASED CORROSION CONTROL FOR HIGH LEVEL WASTE TANKS: INTERIM REPORT

    SciTech Connect

    Hoffman, E; Karthik Subramanian, K

    2008-04-23

    Controls on the solution chemistry (minimum nitrite and hydroxide concentrations) are in place to prevent the initiation and propagation of pitting and stress corrosion cracking in high level waste (HLW) tanks. These controls are based upon a series of experiments performed on carbon steel coupons in simulated waste solutions. An experimental program was undertaken to investigate reducing the minimum molar nitrite concentration required to confidently inhibit pitting. A statistical basis to quantify the probability of pitting for the tank wall, when exposed to various dilute solutions, is being developed. Electrochemical and coupon testing are being performed within the framework of the statistical test matrix to determine the minimum necessary inhibitor concentrations and develop a quantitative model to predict pitting propensity. A subset of the original statistical test matrix was used to develop an applied understanding of the corrosion response of the carbon steel in the various environments. The interim results suggest that there exists some critical nitrite concentration that sufficiently inhibits against localized corrosion mechanisms due to nitrates/chlorides/sulfates, beyond which further nitrite additions are unnecessary. The combination of visual observation and the cyclic potentiodynamic polarization scans indicate the potential for significant inhibitor reductions without consequence specifically at nitrate concentrations near 1 M. The complete data sets will be used to determine the statistical basis to confidently inhibit against pitting using nitrite inhibition with the current pH controls. Once complete, a revised chemistry control program will be devised based upon the probability of pitting specifically for dilute solutions which will allow for tank specific chemistry control implementation.

  4. Value and probability coding in a feedback-based learning task utilizing food rewards.

    PubMed

    Tricomi, Elizabeth; Lempert, Karolina M

    2015-01-01

    For the consequences of our actions to guide behavior, the brain must represent different types of outcome-related information. For example, an outcome can be construed as negative because an expected reward was not delivered or because an outcome of low value was delivered. Thus behavioral consequences can differ in terms of the information they provide about outcome probability and value. We investigated the role of the striatum in processing probability-based and value-based negative feedback by training participants to associate cues with food rewards and then employing a selective satiety procedure to devalue one food outcome. Using functional magnetic resonance imaging, we examined brain activity related to receipt of expected rewards, receipt of devalued outcomes, omission of expected rewards, omission of devalued outcomes, and expected omissions of an outcome. Nucleus accumbens activation was greater for rewarding outcomes than devalued outcomes, but activity in this region did not correlate with the probability of reward receipt. Activation of the right caudate and putamen, however, was largest in response to rewarding outcomes relative to expected omissions of reward. The dorsal striatum (caudate and putamen) at the time of feedback also showed a parametric increase correlating with the trialwise probability of reward receipt. Our results suggest that the ventral striatum is sensitive to the motivational relevance, or subjective value, of the outcome, while the dorsal striatum codes for a more complex signal that incorporates reward probability. Value and probability information may be integrated in the dorsal striatum, to facilitate action planning and allocation of effort.

  5. Design of an activity landscape view taking compound-based feature probabilities into account.

    PubMed

    Zhang, Bijun; Vogt, Martin; Bajorath, Jürgen

    2014-09-01

    Activity landscapes (ALs) of compound data sets are rationalized as graphical representations that integrate similarity and potency relationships between active compounds. ALs enable the visualization of structure-activity relationship (SAR) information and are thus computational tools of interest for medicinal chemistry. For AL generation, similarity and potency relationships are typically evaluated in a pairwise manner and major AL features are assessed at the level of compound pairs. In this study, we add a conditional probability formalism to AL design that makes it possible to quantify the probability of individual compounds to contribute to characteristic AL features. Making this information graphically accessible in a molecular network-based AL representation is shown to further increase AL information content and helps to quickly focus on SAR-informative compound subsets. This feature probability-based AL variant extends the current spectrum of AL representations for medicinal chemistry applications.

  6. Flow Regime Based Climatologies of Lightning Probabilities for Spaceports and Airports

    NASA Technical Reports Server (NTRS)

    Bauman, William H., III; Sharp, David; Spratt, Scott; Lafosse, Richard A.

    2008-01-01

    The objective of this work was to provide forecasters with a tool to indicate the warm season climatological probability of one or more lightning strikes within a circle at a site within a specified time interval. This paper described the AMU work conducted in developing flow regime based climatologies of lightning probabilities for the SLF and seven airports in the NWS MLB CWA in east-central Florida. The paper also described the GUI developed by the AMU that is used to display the data for the operational forecasters. There were challenges working with gridded lightning data as well as the code that accompanied the gridded data. The AMU modified the provided code to be able to produce the climatologies of lightning probabilities based on eight flow regimes for 5-, 10-, 20-, and 30-n mi circles centered on eight sites in 1-, 3-, and 6-hour increments.

  7. Design of an activity landscape view taking compound-based feature probabilities into account

    NASA Astrophysics Data System (ADS)

    Zhang, Bijun; Vogt, Martin; Bajorath, Jürgen

    2014-09-01

    Activity landscapes (ALs) of compound data sets are rationalized as graphical representations that integrate similarity and potency relationships between active compounds. ALs enable the visualization of structure-activity relationship (SAR) information and are thus computational tools of interest for medicinal chemistry. For AL generation, similarity and potency relationships are typically evaluated in a pairwise manner and major AL features are assessed at the level of compound pairs. In this study, we add a conditional probability formalism to AL design that makes it possible to quantify the probability of individual compounds to contribute to characteristic AL features. Making this information graphically accessible in a molecular network-based AL representation is shown to further increase AL information content and helps to quickly focus on SAR-informative compound subsets. This feature probability-based AL variant extends the current spectrum of AL representations for medicinal chemistry applications.

  8. The Role of Probability-Based Inference in an Intelligent Tutoring System.

    ERIC Educational Resources Information Center

    Mislevy, Robert J.; Gitomer, Drew H.

    Probability-based inference in complex networks of interdependent variables is an active topic in statistical research, spurred by such diverse applications as forecasting, pedigree analysis, troubleshooting, and medical diagnosis. This paper concerns the role of Bayesian inference networks for updating student models in intelligent tutoring…

  9. HABITAT ASSESSMENT USING A RANDOM PROBABILITY BASED SAMPLING DESIGN: ESCAMBIA RIVER DELTA, FLORIDA

    EPA Science Inventory

    Smith, Lisa M., Darrin D. Dantin and Steve Jordan. In press. Habitat Assessment Using a Random Probability Based Sampling Design: Escambia River Delta, Florida (Abstract). To be presented at the SWS/GERS Fall Joint Society Meeting: Communication and Collaboration: Coastal Systems...

  10. Learning Probabilities in Computer Engineering by Using a Competency- and Problem-Based Approach

    ERIC Educational Resources Information Center

    Khoumsi, Ahmed; Hadjou, Brahim

    2005-01-01

    Our department has redesigned its electrical and computer engineering programs by adopting a learning methodology based on competence development, problem solving, and the realization of design projects. In this article, we show how this pedagogical approach has been successfully used for learning probabilities and their application to computer…

  11. Teaching Probability to Pre-Service Teachers with Argumentation Based Science Learning Approach

    ERIC Educational Resources Information Center

    Can, Ömer Sinan; Isleyen, Tevfik

    2016-01-01

    The aim of this study is to explore the effects of the argumentation based science learning (ABSL) approach on the teaching probability to pre-service teachers. The sample of the study included 41 students studying at the Department of Elementary School Mathematics Education in a public university during the 2014-2015 academic years. The study is…

  12. Duality-based calculations for transition probabilities in stochastic chemical reactions

    NASA Astrophysics Data System (ADS)

    Ohkubo, Jun

    2017-02-01

An idea for evaluating transition probabilities in chemical reaction systems is proposed, which is efficient for repeated calculations with various rate constants. The idea is based on duality relations; instead of direct time evolutions of the original reaction system, the dual process is dealt with. Usually, if one changes the rate constants of the original reaction system, the direct time evolutions must be performed again using the new rate constants. On the other hand, a single solution of an extended dual process can be reused to calculate the transition probabilities for various rate constant cases. The idea is demonstrated in a parameter estimation problem for the Lotka-Volterra system.

  13. Sequential elution liquid chromatography can significantly increase the probability of a successful separation by simultaneously increasing the peak capacity and reducing the separation disorder.

    PubMed

    Socia, Adam; Foley, Joe P

    2014-01-10

    This paper demonstrates that sequential elution liquid chromatography (SE-LC), an approach in which two or more elution modes are employed in series for the separation of two or more groups of compounds, can be used to separate not only weak acids (or weak bases) from neutral compounds, but weak acids and weak bases from neutral compounds (and each other) by the sequential application of either of two types of an extended pH gradient prior to a solvent gradient. It also details a comparison, based on peak capacity and separation disorder, of the probability of success of this approach with the unimodal elution approach taken by conventional column liquid chromatography. For an HPLC peak capacity of 120 and samples of moderate complexity (e.g., 12 components), the probability of success (Rs≥1) increases from 37.9% (HPLC) to 85.8% (SE-LC). Different columns were evaluated for their utility for SE-LC using the following criteria: (1) the prediction of the elution order of the groups based on the degree of ionization of the compounds; and (2) the closeness of the peak shape to the ideal Gaussian distribution. The best columns overall were the Zorbax SB-AQ and Waters XBridge Shield columns, as they provided both between-class and within-class separations of all compounds, as well as the lowest degree of tailing of 4-ethylaniline using the pH 2 to pH 8 gradient.

  14. Conditional probability distribution (CPD) method in temperature based death time estimation: Error propagation analysis.

    PubMed

    Hubig, Michael; Muggenthaler, Holger; Mall, Gita

    2014-05-01

Bayesian estimation applied to temperature based death time estimation was recently introduced as the conditional probability distribution or CPD-method by Biermann and Potente. The CPD-method is useful if there is external information that sets the boundaries of the true death time interval (victim last seen alive and found dead). CPD allows computation of probabilities for small time intervals of interest (e.g. no-alibi intervals of suspects) within the large true death time interval. In light of the importance of the CPD for conviction or acquittal of suspects, the present study identifies a potential error source. Deviations in death time estimates will cause errors in the CPD-computed probabilities. We derive formulae to quantify the CPD error as a function of input error. Moreover, we observed a paradox: in cases in which the small no-alibi time interval is located at the boundary of the true death time interval, adjacent to the erroneous death time estimate, the CPD-computed probabilities for that small no-alibi interval increase with increasing input deviation; otherwise the CPD-computed probabilities decrease. We therefore advise against using CPD if there is an indication of an error or a contra-empirical deviation in the death time estimates, especially if the death time estimates fall outside the true death time interval, even if the 95%-confidence intervals of the estimate still overlap the true death time interval.
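
    The conditioning idea can be sketched as follows (Python; the Gaussian form, parameter names, and numbers are illustrative assumptions rather than values from the cited papers): a death-time estimate centred on t_est is truncated to the externally known interval [t_min, t_max], and the probability mass of a small sub-interval [a, b] is taken relative to that truncated distribution.

      from scipy.stats import norm

      def cpd_interval_probability(t_est, sd, t_min, t_max, a, b):
          # Probability of death in [a, b], conditioned on the true death time
          # lying in [t_min, t_max] (last seen alive / found dead).
          denom = norm.cdf(t_max, t_est, sd) - norm.cdf(t_min, t_est, sd)
          num = norm.cdf(b, t_est, sd) - norm.cdf(a, t_est, sd)
          return num / denom if denom > 0 else float('nan')

      # Estimate 10 h +/- 2 h, true interval [6 h, 16 h], no-alibi window [8 h, 9 h]
      p = cpd_interval_probability(10.0, 2.0, 6.0, 16.0, 8.0, 9.0)

    Shifting t_est in this sketch and watching p change is a quick way to reproduce the error-propagation behaviour discussed in the abstract.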

  15. Performance of the Rayleigh task based on the posterior probability of tomographic reconstructions

    SciTech Connect

    Hanson, K.M.

    1991-01-01

    We seek the best possible performance of the Rayleigh task in which one must decide whether a perceived object is a pair of Gaussian-blurred points or a blurred line. Two Bayesian reconstruction algorithms are used, the first based on a Gaussian prior-probability distribution with a nonnegativity constraint and the second based on an entropic prior. In both cases, the reconstructions are found that maximize the posterior probability. We compare the performance of the Rayleigh task obtained with two decision variables, the logarithm of the posterior probability ratio and the change in the mean-squared deviation from the reconstruction. The method of evaluation is based on the results of a numerical testing procedure in which the stated discrimination task is carried out on reconstructions of a randomly generated sequence of images. The ability to perform the Rayleigh task is summarized in terms of a discrimination index that is derived from the area under the receiver-operating characteristic (ROC) curve. We find that the use of the posterior probability does not result in better performance of the Rayleigh task than the mean-squared deviation from the reconstruction. 10 refs., 6 figs.
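
    One common way to turn an ROC area into a discrimination index (assumed here for illustration; the paper's exact definition may differ) is d_a = sqrt(2) * Phi^{-1}(AUC), which is exact for equal-variance Gaussian decision variables:

      from scipy.stats import norm

      def discrimination_index(auc):
          # Map the area under the ROC curve to a detectability index
          # under an equal-variance Gaussian assumption.
          return 2 ** 0.5 * norm.ppf(auc)

      d_a = discrimination_index(0.85)   # about 1.47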

  16. Value and probability coding in a feedback-based learning task utilizing food rewards

    PubMed Central

    Lempert, Karolina M.

    2014-01-01

    For the consequences of our actions to guide behavior, the brain must represent different types of outcome-related information. For example, an outcome can be construed as negative because an expected reward was not delivered or because an outcome of low value was delivered. Thus behavioral consequences can differ in terms of the information they provide about outcome probability and value. We investigated the role of the striatum in processing probability-based and value-based negative feedback by training participants to associate cues with food rewards and then employing a selective satiety procedure to devalue one food outcome. Using functional magnetic resonance imaging, we examined brain activity related to receipt of expected rewards, receipt of devalued outcomes, omission of expected rewards, omission of devalued outcomes, and expected omissions of an outcome. Nucleus accumbens activation was greater for rewarding outcomes than devalued outcomes, but activity in this region did not correlate with the probability of reward receipt. Activation of the right caudate and putamen, however, was largest in response to rewarding outcomes relative to expected omissions of reward. The dorsal striatum (caudate and putamen) at the time of feedback also showed a parametric increase correlating with the trialwise probability of reward receipt. Our results suggest that the ventral striatum is sensitive to the motivational relevance, or subjective value, of the outcome, while the dorsal striatum codes for a more complex signal that incorporates reward probability. Value and probability information may be integrated in the dorsal striatum, to facilitate action planning and allocation of effort. PMID:25339705

  17. Uncertainty analysis based on probability bounds (p-box) approach in probabilistic safety assessment.

    PubMed

    Karanki, Durga Rao; Kushwaha, Hari Shankar; Verma, Ajit Kumar; Ajit, Srividya

    2009-05-01

    A wide range of uncertainties will be introduced inevitably during the process of performing a safety assessment of engineering systems. The impact of all these uncertainties must be addressed if the analysis is to serve as a tool in the decision-making process. Uncertainties present in the components (input parameters of model or basic events) of model output are propagated to quantify its impact in the final results. There are several methods available in the literature, namely, method of moments, discrete probability analysis, Monte Carlo simulation, fuzzy arithmetic, and Dempster-Shafer theory. All the methods are different in terms of characterizing at the component level and also in propagating to the system level. All these methods have different desirable and undesirable features, making them more or less useful in different situations. In the probabilistic framework, which is most widely used, probability distribution is used to characterize uncertainty. However, in situations in which one cannot specify (1) parameter values for input distributions, (2) precise probability distributions (shape), and (3) dependencies between input parameters, these methods have limitations and are found to be not effective. In order to address some of these limitations, the article presents uncertainty analysis in the context of level-1 probabilistic safety assessment (PSA) based on a probability bounds (PB) approach. PB analysis combines probability theory and interval arithmetic to produce probability boxes (p-boxes), structures that allow the comprehensive propagation through calculation in a rigorous way. A practical case study is also carried out with the developed code based on the PB approach and compared with the two-phase Monte Carlo simulation results.

  18. Estimating transition probabilities for stage-based population projection matrices using capture-recapture data

    USGS Publications Warehouse

    Nichols, J.D.; Sauer, J.R.; Pollock, K.H.; Hestbeck, J.B.

    1992-01-01

In stage-based demography, animals are often categorized into size (or mass) classes, and size-based probabilities of surviving and changing mass classes must be estimated before demographic analyses can be conducted. In this paper, we develop two procedures for the estimation of mass transition probabilities from capture-recapture data. The first approach uses a multistate capture-recapture model that is parameterized directly with the transition probabilities of interest. Maximum likelihood estimates are then obtained numerically using program SURVIV. The second approach involves a modification of Pollock's robust design. Estimation proceeds by conditioning on animals caught in a particular class at time i, and then using closed models to estimate the number of these that are alive in other classes at i + 1. Both methods are illustrated by application to meadow vole, Microtus pennsylvanicus, capture-recapture data. The two methods produced reasonable estimates that were similar. Advantages of these two approaches include the directness of estimation, the absence of a need for restrictive assumptions about the independence of survival and growth, the testability of assumptions, and the testability of related hypotheses of ecological interest (e.g., the hypothesis of temporal variation in transition probabilities).
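
    As a heavily simplified illustration of the quantity being estimated (ignoring survival and imperfect detection, which the multistate and robust-design approaches handle explicitly), transition probabilities can be read as row-normalized counts of animals observed in both periods; the function and data below are hypothetical, not program SURVIV:

      import numpy as np

      def naive_transition_matrix(classes_t, classes_t1, n_classes):
          # Row-normalized counts of mass-class transitions for animals
          # seen in consecutive periods (toy estimator).
          counts = np.zeros((n_classes, n_classes))
          for a, b in zip(classes_t, classes_t1):
              counts[a, b] += 1
          row_sums = counts.sum(axis=1, keepdims=True)
          return np.divide(counts, row_sums,
                           out=np.zeros_like(counts), where=row_sums > 0)

      # Five voles in mass classes 0/1/2 at times i and i + 1
      psi = naive_transition_matrix([0, 0, 1, 2, 1], [0, 1, 1, 2, 2], 3)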

  19. Reconciliation of Decision-Making Heuristics Based on Decision Trees Topologies and Incomplete Fuzzy Probabilities Sets

    PubMed Central

    Doubravsky, Karel; Dohnal, Mirko

    2015-01-01

Complex decision making tasks of different natures, e.g. economics, safety engineering, ecology and biology, are based on vague, sparse, partially inconsistent and subjective knowledge. Moreover, decision-making economists and engineers are usually not willing to invest too much time into the study of complex formal theories. They require decisions that can be (re)checked by human-like common sense reasoning. One important problem related to realistic decision making tasks is incomplete data sets required by the chosen decision making algorithm. This paper presents a relatively simple algorithm showing how some missing input information items (III) can be generated, mainly using decision tree topologies, and integrated into incomplete data sets. The algorithm is based on easy-to-understand heuristics, e.g. that a longer decision tree sub-path is less probable. This heuristic can solve decision problems under total ignorance, i.e. when the decision tree topology is the only information available. In practice, however, isolated information items, e.g. some vaguely known probabilities (such as fuzzy probabilities), are usually available. This means that a realistic problem is analysed under partial ignorance. The proposed algorithm reconciles topology related heuristics and additional fuzzy sets using fuzzy linear programming. The case study, represented by a tree with six lotteries and one fuzzy probability, is presented in detail. PMID:26158662

  20. Differential Survival in Europe and the United States: Estimates Based on Subjective Probabilities of Survival

    PubMed Central

    Delavande, Adeline; Rohwedder, Susann

    2013-01-01

    Cross-country comparisons of differential survival by socioeconomic status (SES) are useful in many domains. Yet, to date, such studies have been rare. Reliably estimating differential survival in a single country has been challenging because it requires rich panel data with a large sample size. Cross-country estimates have proven even more difficult because the measures of SES need to be comparable internationally. We present an alternative method for acquiring information on differential survival by SES. Rather than using observations of actual survival, we relate individuals’ subjective probabilities of survival to SES variables in cross section. To show that subjective survival probabilities are informative proxies for actual survival when estimating differential survival, we compare estimates of differential survival based on actual survival with estimates based on subjective probabilities of survival for the same sample. The results are remarkably similar. We then use this approach to compare differential survival by SES for 10 European countries and the United States. Wealthier people have higher survival probabilities than those who are less wealthy, but the strength of the association differs across countries. Nations with a smaller gradient appear to be Belgium, France, and Italy, while the United States, England, and Sweden appear to have a larger gradient. PMID:22042664

  1. Reconciliation of Decision-Making Heuristics Based on Decision Trees Topologies and Incomplete Fuzzy Probabilities Sets.

    PubMed

    Doubravsky, Karel; Dohnal, Mirko

    2015-01-01

Complex decision making tasks of different natures, e.g. economics, safety engineering, ecology and biology, are based on vague, sparse, partially inconsistent and subjective knowledge. Moreover, decision-making economists and engineers are usually not willing to invest too much time into the study of complex formal theories. They require decisions that can be (re)checked by human-like common sense reasoning. One important problem related to realistic decision making tasks is incomplete data sets required by the chosen decision making algorithm. This paper presents a relatively simple algorithm showing how some missing input information items (III) can be generated, mainly using decision tree topologies, and integrated into incomplete data sets. The algorithm is based on easy-to-understand heuristics, e.g. that a longer decision tree sub-path is less probable. This heuristic can solve decision problems under total ignorance, i.e. when the decision tree topology is the only information available. In practice, however, isolated information items, e.g. some vaguely known probabilities (such as fuzzy probabilities), are usually available. This means that a realistic problem is analysed under partial ignorance. The proposed algorithm reconciles topology related heuristics and additional fuzzy sets using fuzzy linear programming. The case study, represented by a tree with six lotteries and one fuzzy probability, is presented in detail.

  2. The relationship study between image features and detection probability based on psychology experiments

    NASA Astrophysics Data System (ADS)

    Lin, Wei; Chen, Yu-hua; Wang, Ji-yuan; Gao, Hong-sheng; Wang, Ji-jun; Su, Rong-hua; Mao, Wei

    2011-04-01

Detection probability is an important index for representing and estimating target viability, and it provides a basis for target recognition and decision-making. In practice, however, obtaining detection probabilities requires a great deal of time and manpower, and the data obtained often differ widely because interpreters vary in practical knowledge and experience. By studying the relationship between image features and perception quantity through psychology experiments, a probability model was established as follows. First, four image features that directly affect detection were extracted and quantified, and four feature similarity degrees between target and background were defined. Second, the relationship between each single image feature similarity degree and perception quantity was set up on the basis of psychological principles, and psychological experiments on target interpretation were designed, involving about five hundred interpreters and two hundred images. To reduce the correlation among image features, a large number of synthetic images were produced, including images differing from the background in only brightness, only chromaticity, only texture, or only shape. The model quantities were then determined by analyzing and fitting the large body of experimental data. Finally, by applying statistical decision theory to the experimental results, the relationship between perception quantity and target detection probability was found. Verification against a large number of target interpretations in practice shows that the model can provide target detection probabilities quickly and objectively.

  3. Epistemic-based investigation of the probability of hazard scenarios using Bayesian network for the lifting operation of floating objects

    NASA Astrophysics Data System (ADS)

    Toroody, Ahmad Bahoo; Abaiee, Mohammad Mahdi; Gholamnia, Reza; Ketabdari, Mohammad Javad

    2016-09-01

    Owing to the increase in unprecedented accidents with new root causes in almost all operational areas, the importance of risk management has dramatically risen. Risk assessment, one of the most significant aspects of risk management, has a substantial impact on the system-safety level of organizations, industries, and operations. If the causes of all kinds of failure and the interactions between them are considered, effective risk assessment can be highly accurate. A combination of traditional risk assessment approaches and modern scientific probability methods can help in realizing better quantitative risk assessment methods. Most researchers face the problem of minimal field data with respect to the probability and frequency of each failure. Because of this limitation in the availability of epistemic knowledge, it is important to conduct epistemic estimations by applying the Bayesian theory for identifying plausible outcomes. In this paper, we propose an algorithm and demonstrate its application in a case study for a light-weight lifting operation in the Persian Gulf of Iran. First, we identify potential accident scenarios and present them in an event tree format. Next, excluding human error, we use the event tree to roughly estimate the prior probability of other hazard-promoting factors using a minimal amount of field data. We then use the Success Likelihood Index Method (SLIM) to calculate the probability of human error. On the basis of the proposed event tree, we use the Bayesian network of the provided scenarios to compensate for the lack of data. Finally, we determine the resulting probability of each event based on its evidence in the epistemic estimation format by building on two Bayesian network types: the probability of hazard promotion factors and the Bayesian theory. The study results indicate that despite the lack of available information on the operation of floating objects, a satisfactory result can be achieved using epistemic data.

  4. TOPICAL REVIEW: Mechanistically based probability modelling, life prediction and reliability assessment

    NASA Astrophysics Data System (ADS)

    Wei, Robert P.; Harlow, D. Gary

    2005-01-01

    Life prediction and reliability assessment are essential components for the life-cycle engineering and management (LCEM) of modern engineered systems. These systems can range from microelectronic and bio-medical devices to large machinery and structures. To be effective, the underlying approach to LCEM must be transformed to embody mechanistically based probability modelling, vis-à-vis the more traditional experientially based statistical modelling, for predicting damage evolution and distribution. In this paper, the probability and statistical approaches are compared and differentiated. The process of model development on the basis of mechanistic understanding derived from critical experiments is illustrated through selected examples. The efficacy of this approach is illustrated through an example of the evolution and distribution of corrosion and corrosion fatigue damage in aluminium alloys in relation to aircraft that had been in long-term service.

  5. A method of classification for multisource data in remote sensing based on interval-valued probabilities

    NASA Technical Reports Server (NTRS)

    Kim, Hakil; Swain, Philip H.

    1990-01-01

An axiomatic approach to interval-valued (IV) probabilities is presented, where the IV probability is defined by a pair of set-theoretic functions which satisfy some pre-specified axioms. On the basis of this approach, the representation of statistical evidence and the combination of multiple bodies of evidence are emphasized. Although IV probabilities provide an innovative means for the representation and combination of evidential information, they make the decision process rather complicated and entail more intelligent strategies for making decisions. The development of decision rules over IV probabilities is discussed from the viewpoint of statistical pattern recognition. The proposed method, the so-called evidential reasoning method, is applied to the ground-cover classification of a multisource data set consisting of Multispectral Scanner (MSS) data, Synthetic Aperture Radar (SAR) data, and digital terrain data such as elevation, slope, and aspect. By treating the data sources separately, the method is able to capture both parametric and nonparametric information and to combine them. Then the method is applied to two separate cases of classifying multiband data obtained by a single sensor. In each case a set of multiple sources is obtained by dividing the dimensionally huge data into smaller and more manageable pieces based on the global statistical correlation information. By a divide-and-combine process, the method is able to utilize more features than the conventional maximum likelihood method.

  6. The development of posterior probability models in risk-based integrity modeling.

    PubMed

    Thodi, Premkumar N; Khan, Faisal I; Haddara, Mahmoud R

    2010-03-01

    There is a need for accurate modeling of mechanisms causing material degradation of equipment in process installation, to ensure safety and reliability of the equipment. Degradation mechanisms are stochastic processes. They can be best described using risk-based approaches. Risk-based integrity assessment quantifies the level of risk to which the individual components are subjected and provides means to mitigate them in a safe and cost-effective manner. The uncertainty and variability in structural degradations can be best modeled by probability distributions. Prior probability models provide initial description of the degradation mechanisms. As more inspection data become available, these prior probability models can be revised to obtain posterior probability models, which represent the current system and can be used to predict future failures. In this article, a rejection sampling-based Metropolis-Hastings (M-H) algorithm is used to develop posterior distributions. The M-H algorithm is a Markov chain Monte Carlo algorithm used to generate a sequence of posterior samples without actually knowing the normalizing constant. Ignoring the transient samples in the generated Markov chain, the steady state samples are rejected or accepted based on an acceptance criterion. To validate the estimated parameters of posterior models, analytical Laplace approximation method is used to compute the integrals involved in the posterior function. Results of the M-H algorithm and Laplace approximations are compared with conjugate pair estimations of known prior and likelihood combinations. The M-H algorithm provides better results and hence it is used for posterior development of the selected priors for corrosion and cracking.
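
    A minimal random-walk Metropolis-Hastings sketch of the sampling step described above (Python; the prior, likelihood, and tuning values are illustrative assumptions, not the article's degradation models):

      import numpy as np

      def metropolis_hastings(log_post, x0, n_samples, proposal_sd=0.1, burn_in=1000):
          # Random-walk Metropolis-Hastings: propose, accept or reject, and
          # discard the transient (burn-in) part of the Markov chain.
          rng = np.random.default_rng(0)
          x, lp = x0, log_post(x0)
          samples = []
          for i in range(n_samples + burn_in):
              cand = x + rng.normal(0.0, proposal_sd)
              lp_cand = log_post(cand)
              if np.log(rng.uniform()) < lp_cand - lp:
                  x, lp = cand, lp_cand
              if i >= burn_in:
                  samples.append(x)
          return np.array(samples)

      # Toy posterior: standard normal prior on a degradation-rate parameter
      # and one Gaussian "inspection" observation at 1.2 (purely illustrative).
      log_post = lambda r: -0.5 * r**2 - 0.5 * ((1.2 - r) / 0.3) ** 2
      draws = metropolis_hastings(log_post, x0=0.0, n_samples=5000)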

  7. Probability Prediction of a Nation’s Internal Conflict Based on Instability

    DTIC Science & Technology

    2008-06-01

Master's thesis by Shian-kuen Wann, Naval Postgraduate School, Monterey, CA. (Only the report documentation form was recovered for this record; no abstract text is available.)

  8. Probability based earthquake load and resistance factor design criteria for offshore platforms

    SciTech Connect

    Bea, R.G.

    1996-12-31

    This paper describes a probability reliability based formulation to determine earthquake Load and Resistance Factor Design (LRFD) parameters for conventional, steel, pile supported, tubular membered platforms that is proposed as a basis for earthquake design criteria and guidelines for offshore platforms that are intended to have worldwide applicability. The formulation is illustrated with application to platforms located in five areas: offshore California, Venezuela (Rio Caribe), the East Coast of Canada, in the Caspian Sea (Azeri), and the Norwegian sector of the North Sea.

  9. Acceptance Control Charts with Stipulated Error Probabilities Based on Poisson Count Data

    DTIC Science & Technology

    1973-01-01

Scheaffer, Richard L.; Leavenworth, Richard S. Department of Industrial and Systems Engineering, University of Florida, Gainesville. (Only report documentation fragments were recovered for this record; the abstract breaks off after "An acceptance control charting ...".)

  10. Chaos optimization algorithms based on chaotic maps with different probability distribution and search speed for global optimization

    NASA Astrophysics Data System (ADS)

    Yang, Dixiong; Liu, Zhenjun; Zhou, Jilei

    2014-04-01

Chaos optimization algorithms (COAs) usually utilize a chaotic map such as the Logistic map to generate pseudo-random numbers that are mapped to the design variables for global optimization. Many existing studies have indicated that COAs can escape from local minima more easily than classical stochastic optimization algorithms. This paper reveals the inherent mechanism behind the high efficiency and superior performance of COAs from a new perspective: both the probability distribution property and the search speed of the chaotic sequences generated by different chaotic maps. The statistical property and search speed of chaotic sequences are represented by the probability density function (PDF) and the Lyapunov exponent, respectively. Meanwhile, the computational performances of hybrid chaos-BFGS algorithms based on eight one-dimensional chaotic maps with different PDFs and Lyapunov exponents are compared, where BFGS is a quasi-Newton method for local optimization. Moreover, several multimodal benchmark examples illustrate that the probability distribution property and search speed of chaotic sequences from different chaotic maps significantly affect the global searching capability and optimization efficiency of COA. To achieve high efficiency, it is recommended that COA adopt an appropriate chaotic map generating chaotic sequences with a uniform or nearly uniform probability distribution and a large Lyapunov exponent.
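
    The chaotic number generation that drives a COA can be sketched with the Logistic map named in the abstract (the mapping to design variables is simplified here, and the coupling with BFGS is omitted):

      import numpy as np

      def logistic_sequence(n, x0=0.7, mu=4.0):
          # Logistic map x_{k+1} = mu * x_k * (1 - x_k); with mu = 4 it yields
          # a chaotic sequence in (0, 1) whose PDF peaks near 0 and 1.
          xs = np.empty(n)
          x = x0
          for k in range(n):
              x = mu * x * (1.0 - x)
              xs[k] = x
          return xs

      def to_design_variable(xs, lower, upper):
          # Map chaotic numbers in (0, 1) onto the search interval.
          return lower + (upper - lower) * xs

      # 1000 candidate values of one design variable in [-5, 5]
      candidates = to_design_variable(logistic_sequence(1000), -5.0, 5.0)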

  11. Assessment of probability density function based on POD reduced-order model for ensemble-based data assimilation

    NASA Astrophysics Data System (ADS)

    Kikuchi, Ryota; Misaka, Takashi; Obayashi, Shigeru

    2015-10-01

An integrated method of a proper orthogonal decomposition based reduced-order model (ROM) and data assimilation is proposed for the real-time prediction of an unsteady flow field. In this paper, a particle filter (PF) and an ensemble Kalman filter (EnKF) are compared for data assimilation and the difference in the predicted flow fields is evaluated focusing on the probability density function (PDF) of the model variables. The proposed method is demonstrated using identical twin experiments of an unsteady flow field around a circular cylinder at a Reynolds number of 1000. The PF and EnKF are employed to estimate temporal coefficients of the ROM based on the observed velocity components in the wake of the circular cylinder. The prediction accuracy of ROM-PF is significantly better than that of ROM-EnKF due to the flexibility of PF for representing a PDF compared to EnKF. Furthermore, the proposed method reproduces the unsteady flow field several orders of magnitude faster than the reference numerical simulation based on the Navier-Stokes equations.

  12. Target detection in complex scene of SAR image based on existence probability

    NASA Astrophysics Data System (ADS)

    Liu, Shuo; Cao, Zongjie; Wu, Honggang; Pi, Yiming; Yang, Haiyi

    2016-12-01

    This study proposes a target detection approach based on the target existence probability in complex scenes of a synthetic aperture radar image. Superpixels are the basic unit throughout the approach and are labelled into each classified scene by a texture feature. The original and predicted saliency depth values for each scene are derived through self-information of all the labelled superpixels in each scene. Thereafter, the target existence probability is estimated based on the comparison of two saliency depth values. Lastly, an improved visual attention algorithm, in which the scenes of the saliency map are endowed with different weights related to the existence probabilities, derives the target detection result. This algorithm enhances the attention for the scene that contains the target. Hence, the proposed approach is self-adapting for complex scenes and the algorithm is substantially suitable for different detection missions as well (e.g. vehicle, ship or aircraft detection in the related scenes of road, harbour or airport, respectively). Experimental results on various data show the effectiveness of the proposed method.

  13. Probability-Based Software for Grid Optimization: Improved Power System Operations Using Advanced Stochastic Optimization

    SciTech Connect

    2012-02-24

    GENI Project: Sandia National Laboratories is working with several commercial and university partners to develop software for market management systems (MMSs) that enable greater use of renewable energy sources throughout the grid. MMSs are used to securely and optimally determine which energy resources should be used to service energy demand across the country. Contributions of electricity to the grid from renewable energy sources such as wind and solar are intermittent, introducing complications for MMSs, which have trouble accommodating the multiple sources of price and supply uncertainties associated with bringing these new types of energy into the grid. Sandia’s software will bring a new, probability-based formulation to account for these uncertainties. By factoring in various probability scenarios for electricity production from renewable energy sources in real time, Sandia’s formula can reduce the risk of inefficient electricity transmission, save ratepayers money, conserve power, and support the future use of renewable energy.

  14. Confidence Probability versus Detection Probability

    SciTech Connect

    Axelrod, M

    2005-08-18

In a discovery sampling activity the auditor seeks to vet an inventory by measuring (or inspecting) a random sample of items from the inventory. When the auditor finds every sample item in compliance, he must then make a confidence statement about the whole inventory. For example, the auditor might say: ''We believe that this inventory of 100 items contains no more than 5 defectives with 95% confidence.'' Note this is a retrospective statement in that it asserts something about the inventory after the sample was selected and measured. Contrast this to the prospective statement: ''We will detect the existence of more than 5 defective items in this inventory with 95% probability.'' The former uses confidence probability while the latter uses detection probability. For a given sample size, the two probabilities need not be equal; indeed, they could differ significantly. Both these probabilities critically depend on the auditor's prior belief about the number of defectives in the inventory and how he defines non-compliance. In other words, the answer strongly depends on how the question is framed.
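
    The prospective (detection) probability in the example can be checked with a small hypergeometric calculation (illustrative only; the abstract's treatment of compliance definitions and prior beliefs is richer):

      from math import comb

      def detection_probability(N, D, n):
          # Probability that a random sample of n items from N, containing D
          # defectives, includes at least one defective (hypergeometric).
          return 1.0 - comb(N - D, n) / comb(N, n)

      # Smallest sample that detects D = 6 defectives ("more than 5") among
      # N = 100 items with at least 95% probability.
      N, D = 100, 6
      n_needed = next(k for k in range(1, N + 1)
                      if detection_probability(N, D, k) >= 0.95)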

  15. Hierarchical modeling of contingency-based source monitoring: a test of the probability-matching account.

    PubMed

    Arnold, Nina R; Bayen, Ute J; Kuhlmann, Beatrice G; Vaterrodt, Bianca

    2013-04-01

    According to the probability-matching account of source guessing (Spaniol & Bayen, Journal of Experimental Psychology: Learning, Memory, and Cognition 28:631-651, 2002), when people do not remember the source of an item in a source-monitoring task, they match the source-guessing probabilities to the perceived contingencies between sources and item types. In a source-monitoring experiment, half of the items presented by each of two sources were consistent with schematic expectations about this source, whereas the other half of the items were consistent with schematic expectations about the other source. Participants' source schemas were activated either at the time of encoding or just before the source-monitoring test. After test, the participants judged the contingency of the item type and source. Individual parameter estimates of source guessing were obtained via beta-multinomial processing tree modeling (beta-MPT; Smith & Batchelder, Journal of Mathematical Psychology 54:167-183, 2010). We found a significant correlation between the perceived contingency and source guessing, as well as a correlation between the deviation of the guessing bias from the true contingency and source memory when participants did not receive the schema information until retrieval. These findings support the probability-matching account.
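
    A toy simulation of the guessing rule at the heart of the account (not the beta-MPT measurement model used in the study) looks like this:

      import numpy as np

      def source_guess(perceived_contingency, rng):
          # When source memory fails, guess source A with probability equal to
          # the perceived contingency between source A and this item type.
          return "A" if rng.uniform() < perceived_contingency else "B"

      rng = np.random.default_rng(42)
      guesses = [source_guess(0.75, rng) for _ in range(10000)]
      share_a = guesses.count("A") / len(guesses)   # converges to about 0.75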

  16. Prediction of protein secondary structure using probability based features and a hybrid system.

    PubMed

    Ghanty, Pradip; Pal, Nikhil R; Mudi, Rajani K

    2013-10-01

    In this paper, we propose some co-occurrence probability-based features for prediction of protein secondary structure. The features are extracted using occurrence/nonoccurrence of secondary structures in the protein sequences. We explore two types of features: position-specific (based on position of amino acid on fragments of protein sequences) as well as position-independent (independent of amino acid position on fragments of protein sequences). We use a hybrid system, NEUROSVM, consisting of neural networks and support vector machines for classification of secondary structures. We propose two schemes NSVMps and NSVM for protein secondary structure prediction. The NSVMps uses position-specific probability-based features and NEUROSVM classifier whereas NSVM uses the same classifier with position-independent probability-based features. The proposed method falls in the single-sequence category of methods because it does not use any sequence profile information such as position specific scoring matrices (PSSM) derived from PSI-BLAST. Two widely used datasets RS126 and CB513 are used in the experiments. The results obtained using the proposed features and NEUROSVM classifier are better than most of the existing single-sequence prediction methods. Most importantly, the results using NSVMps that are obtained using lower dimensional features, are comparable to those by other existing methods. The NSVMps and NSVM are finally tested on target proteins of the critical assessment of protein structure prediction experiment-9 (CASP9). A larger dataset is used to compare the performance of the proposed methods with that of two recent single-sequence prediction methods. We also investigate the impact of presence of different amino acid residues (in protein sequences) that are responsible for the formation of different secondary structures.
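
    The flavour of a position-specific co-occurrence probability feature can be sketched as follows (window handling, smoothing, and the exact feature definition in the paper differ; the function and toy data are illustrative):

      from collections import defaultdict

      def cooccurrence_probabilities(fragments, structures):
          # For each (position, amino acid) pair in fixed-length fragments,
          # estimate the probability of each secondary-structure class of the
          # fragment's central residue from observed counts.
          counts = defaultdict(lambda: defaultdict(int))
          totals = defaultdict(int)
          for frag, ss in zip(fragments, structures):
              for pos, aa in enumerate(frag):
                  counts[(pos, aa)][ss] += 1
                  totals[(pos, aa)] += 1
          return {key: {s: c / totals[key] for s, c in ss_counts.items()}
                  for key, ss_counts in counts.items()}

      # Toy fragments labelled with the central residue's structure (H/E/C)
      probs = cooccurrence_probabilities(["ACD", "AGD", "ACE"], ["H", "H", "C"])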

  17. Mice plan decision strategies based on previously learned time intervals, locations, and probabilities.

    PubMed

    Tosun, Tuğçe; Gür, Ezgi; Balcı, Fuat

    2016-01-19

    Animals can shape their timed behaviors based on experienced probabilistic relations in a nearly optimal fashion. On the other hand, it is not clear if they adopt these timed decisions by making computations based on previously learnt task parameters (time intervals, locations, and probabilities) or if they gradually develop their decisions based on trial and error. To address this question, we tested mice in the timed-switching task, which required them to anticipate when (after a short or long delay) and at which of the two delay locations a reward would be presented. The probability of short trials differed between test groups in two experiments. Critically, we first trained mice on relevant task parameters by signaling the active trial with a discriminative stimulus and delivered the corresponding reward after the associated delay without any response requirement (without inducing switching behavior). During the test phase, both options were presented simultaneously to characterize the emergence and temporal characteristics of the switching behavior. Mice exhibited timed-switching behavior starting from the first few test trials, and their performance remained stable throughout testing in the majority of the conditions. Furthermore, as the probability of the short trial increased, mice waited longer before switching from the short to long location (experiment 1). These behavioral adjustments were in directions predicted by reward maximization. These results suggest that rather than gradually adjusting their time-dependent choice behavior, mice abruptly adopted temporal decision strategies by directly integrating their previous knowledge of task parameters into their timed behavior, supporting the model-based representational account of temporal risk assessment.

  18. Mice plan decision strategies based on previously learned time intervals, locations, and probabilities

    PubMed Central

    Tosun, Tuğçe; Gür, Ezgi; Balcı, Fuat

    2016-01-01

    Animals can shape their timed behaviors based on experienced probabilistic relations in a nearly optimal fashion. On the other hand, it is not clear if they adopt these timed decisions by making computations based on previously learnt task parameters (time intervals, locations, and probabilities) or if they gradually develop their decisions based on trial and error. To address this question, we tested mice in the timed-switching task, which required them to anticipate when (after a short or long delay) and at which of the two delay locations a reward would be presented. The probability of short trials differed between test groups in two experiments. Critically, we first trained mice on relevant task parameters by signaling the active trial with a discriminative stimulus and delivered the corresponding reward after the associated delay without any response requirement (without inducing switching behavior). During the test phase, both options were presented simultaneously to characterize the emergence and temporal characteristics of the switching behavior. Mice exhibited timed-switching behavior starting from the first few test trials, and their performance remained stable throughout testing in the majority of the conditions. Furthermore, as the probability of the short trial increased, mice waited longer before switching from the short to long location (experiment 1). These behavioral adjustments were in directions predicted by reward maximization. These results suggest that rather than gradually adjusting their time-dependent choice behavior, mice abruptly adopted temporal decision strategies by directly integrating their previous knowledge of task parameters into their timed behavior, supporting the model-based representational account of temporal risk assessment. PMID:26733674

  19. The high order dispersion analysis based on first-passage-time probability in financial markets

    NASA Astrophysics Data System (ADS)

    Liu, Chenggong; Shang, Pengjian; Feng, Guochen

    2017-04-01

    The study of first-passage-time (FPT) events in financial time series has attracted broad research interest recently, as it can provide reference for risk management and investment. In this paper, a new measure, high order dispersion (HOD), is developed based on FPT probability to explore financial time series. The tick-by-tick data of three Chinese stock markets and three American stock markets are investigated. We classify the financial markets successfully by analyzing the scaling properties of the FPT probabilities of the six stock markets and employing the HOD method to compare the differences among FPT decay curves. By applying the HOD method, it can be concluded that long-range correlation and a fat-tailed broad probability density function, together with their coupling with nonlinearity, mainly lead to the multifractality of financial time series. Furthermore, we use the fluctuation function of multifractal detrended fluctuation analysis (MF-DFA) to distinguish markets and obtain results consistent with the HOD method, whereas the HOD method is capable of differentiating the stock markets effectively within the same region. We believe that such explorations are relevant for a better understanding of financial market mechanisms.
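
    A minimal sketch of the first-passage-time statistic that such analyses start from (the HOD construction itself is not reproduced here); the function name first_passage_times and the random-walk data are hypothetical:

        import numpy as np

        def first_passage_times(prices, threshold):
            # For each starting index, count the steps until the absolute
            # price change first reaches the threshold (completed passages only).
            fpts = []
            for i in range(len(prices) - 1):
                moves = np.abs(prices[i + 1:] - prices[i])
                hits = np.nonzero(moves >= threshold)[0]
                if hits.size:
                    fpts.append(hits[0] + 1)
            return np.asarray(fpts)

        rng = np.random.default_rng(0)
        prices = np.cumsum(rng.normal(0.0, 0.1, 5000))            # hypothetical random-walk prices
        fpt = first_passage_times(prices, threshold=0.5)
        probs, edges = np.histogram(fpt, bins=50, density=True)   # empirical FPT probability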

  20. Tips for Teachers of Evidence-based Medicine: Clinical Prediction Rules (CPRs) and Estimating Pretest Probability

    PubMed Central

    McGinn, Thomas; Jervis, Ramiro; Wisnivesky, Juan; Keitz, Sheri

    2008-01-01

    Background Clinical prediction rules (CPR) are tools that clinicians can use to predict the most likely diagnosis, prognosis, or response to treatment in a patient based on individual characteristics. CPRs attempt to standardize, simplify, and increase the accuracy of clinicians’ diagnostic and prognostic assessments. The teaching tips series is designed to give teachers advice and materials they can use to attain specific educational objectives. Educational Objectives In this article, we present 3 teaching tips aimed at helping clinical learners use clinical prediction rules and more accurately assess pretest probability in everyday practice. The first tip is designed to demonstrate variability in physician estimation of pretest probability. The second tip demonstrates how the estimate of pretest probability influences the interpretation of diagnostic tests and patient management. The third tip exposes learners to various examples and different types of Clinical Prediction Rules (CPR) and how to apply them in practice. Pilot Testing We field tested all 3 tips with 16 learners, a mix of interns and senior residents. Teacher preparatory time was approximately 2 hours. The field test utilized a board and a data projector; 3 handouts were prepared. The tips were felt to be clear and the educational objectives reached. Potential teaching pitfalls were identified. Conclusion Teaching with these tips will help physicians appreciate the importance of applying evidence to their everyday decisions. In 2 or 3 short teaching sessions, clinicians can also become familiar with the use of CPRs in applying evidence consistently in everyday practice. PMID:18491194
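
    As a worked illustration of the second tip (how the pretest estimate changes the meaning of a test result), the odds-likelihood form of Bayes' theorem can be applied; the likelihood ratio and pretest values below are hypothetical:

        def posttest_probability(pretest_prob, likelihood_ratio):
            # Convert pretest probability to post-test probability via odds.
            pretest_odds = pretest_prob / (1.0 - pretest_prob)
            posttest_odds = pretest_odds * likelihood_ratio
            return posttest_odds / (1.0 + posttest_odds)

        # The same positive test (LR+ = 10) applied at two pretest estimates:
        for pretest in (0.10, 0.50):
            print(pretest, round(posttest_probability(pretest, 10.0), 2))
        # 0.1 -> 0.53, 0.5 -> 0.91: the interpretation depends strongly on the pretest estimate.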

  1. A multivariate copula-based framework for dealing with hazard scenarios and failure probabilities

    NASA Astrophysics Data System (ADS)

    Salvadori, G.; Durante, F.; De Michele, C.; Bernardi, M.; Petrella, L.

    2016-05-01

    This paper is of a methodological nature, and deals with the foundations of Risk Assessment. Several international guidelines have recently recommended selecting appropriate/relevant Hazard Scenarios in order to tame the consequences of (extreme) natural phenomena. In particular, the scenarios should be multivariate, i.e., they should take into account the fact that several variables, generally not independent, may be of interest. In this work, it is shown how a Hazard Scenario can be identified in terms of (i) a specific geometry and (ii) a suitable probability level. Several scenarios, as well as a Structural approach, are presented, and due comparisons are carried out. In addition, it is shown how the Hazard Scenario approach illustrated here is well suited to cope with the notion of Failure Probability, a tool traditionally used for design and risk assessment in engineering practice. All the results outlined throughout the work are based on the Copula Theory, which turns out to be a fundamental theoretical apparatus for doing multivariate risk assessment: formulas for the calculation of the probability of Hazard Scenarios in the general multidimensional case (d≥2) are derived, and worthy analytical relationships among the probabilities of occurrence of Hazard Scenarios are presented. In addition, the Extreme Value and Archimedean special cases are dealt with, relationships between dependence ordering and scenario levels are studied, and a counter-example concerning Tail Dependence is shown. Suitable indications for the practical application of the techniques outlined in the work are given, and two case studies illustrate the procedures discussed in the paper.
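
    A minimal sketch of the kind of calculation involved, assuming a bivariate Gumbel (Extreme Value/Archimedean) copula and an 'AND' hazard scenario in which both variables exceed their design quantiles; the dependence parameter theta is hypothetical:

        import math

        def gumbel_copula(u, v, theta):
            # Bivariate Gumbel copula C(u, v), dependence parameter theta >= 1.
            return math.exp(-(((-math.log(u)) ** theta + (-math.log(v)) ** theta) ** (1.0 / theta)))

        def and_scenario_probability(u, v, theta):
            # P(U > u and V > v) by inclusion-exclusion on the copula.
            return 1.0 - u - v + gumbel_copula(u, v, theta)

        # Both margins at their 0.99 quantile, theta = 2 (hypothetical):
        print(and_scenario_probability(0.99, 0.99, theta=2.0))   # ~0.0059, versus 0.0001 under independence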

  2. Probability of ventricular fibrillation: allometric model based on the ST deviation

    PubMed Central

    2011-01-01

    Background Allometry, in general biology, measures the relative growth of a part in relation to the whole living organism. Using reported clinical data, we apply this concept for evaluating the probability of ventricular fibrillation based on the electrocardiographic ST-segment deviation values. Methods Data collected by previous reports were used to fit an allometric model in order to estimate ventricular fibrillation probability. Patients presenting either with death, myocardial infarction or unstable angina were included to calculate such probability as, VFp = δ + β (ST), for three different ST deviations. The coefficients δ and β were obtained as the best fit to the clinical data extended over observational periods of 1, 6, 12 and 48 months from occurrence of the first reported chest pain accompanied by ST deviation. Results By application of the above equation in log-log representation, the fitting procedure produced the following overall coefficients: Average β = 0.46, with a maximum = 0.62 and a minimum = 0.42; Average δ = 1.28, with a maximum = 1.79 and a minimum = 0.92. For a 2 mm ST-deviation, the full range of predicted ventricular fibrillation probability extended from about 13% at 1 month up to 86% at 4 years after the original cardiac event. Conclusions These results, at least preliminarily, appear acceptable and still call for full clinical test. The model seems promising, especially if other parameters were taken into account, such as blood cardiac enzyme concentrations, ischemic or infarcted epicardial areas or ejection fraction. It is concluded, considering these results and a few references found in the literature, that the allometric model shows good predictive practical value to aid medical decisions. PMID:21226961
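
    A minimal numeric sketch, assuming one plausible log-log reading of the reported relation, log10(VFp, in %) = δ + β·log10(ST, in mm), evaluated with the reported average coefficients; this is an illustration only, not the authors' exact implementation:

        import math

        def vf_probability_percent(st_mm, delta=1.28, beta=0.46):
            # Allometric (power-law) estimate of ventricular fibrillation probability (%),
            # assuming log10(VFp) = delta + beta * log10(ST).
            return 10.0 ** (delta + beta * math.log10(st_mm))

        print(round(vf_probability_percent(2.0), 1))   # ~26.2 % with the averaged coefficients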

  3. Forestry inventory based on multistage sampling with probability proportional to size

    NASA Technical Reports Server (NTRS)

    Lee, D. C. L.; Hernandez, P., Jr.; Shimabukuro, Y. E.

    1983-01-01

    A multistage sampling technique, with probability proportional to size, is developed for a forest volume inventory using remote sensing data. LANDSAT data, panchromatic aerial photographs, and field data are collected. Based on age and homogeneity, pine and eucalyptus classes are identified. Selection of tertiary sampling units is made through aerial photographs to minimize field work. The sampling errors for eucalyptus and pine ranged from 8.34 to 21.89 percent and from 7.18 to 8.60 percent, respectively.
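
    A minimal sketch of a single probability-proportional-to-size draw, assuming with-replacement selection; the unit names and sizes are hypothetical:

        import numpy as np

        rng = np.random.default_rng(42)
        units = np.array(["stand_A", "stand_B", "stand_C", "stand_D"])
        sizes = np.array([120.0, 45.0, 300.0, 60.0])    # hypothetical stand areas (ha)
        p = sizes / sizes.sum()                          # selection probability proportional to size

        primary_sample = rng.choice(units, size=2, replace=True, p=p)
        # Secondary and tertiary stages would repeat the same PPS draw within each selected unit.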

  4. Differentiated protection services with failure probability guarantee for workflow-based applications

    NASA Astrophysics Data System (ADS)

    Zhong, Yaoquan; Guo, Wei; Jin, Yaohui; Sun, Weiqiang; Hu, Weisheng

    2010-12-01

    A cost-effective and service-differentiated provisioning strategy is very desirable to service providers so that they can offer users satisfactory services, while optimizing network resource allocation. Providing differentiated protection services to connections for surviving link failures has been extensively studied in recent years. However, differentiated protection services for workflow-based applications, which consist of many interdependent tasks, have scarcely been studied. This paper investigates the problem of providing differentiated services for workflow-based applications in optical grids. In this paper, we develop three differentiated protection service provisioning strategies which provide a security-level guarantee and network-resource optimization for workflow-based applications. The simulation demonstrates that these heuristic algorithms provide protection cost-effectively while satisfying the applications' failure probability requirements.

  5. A generative probability model of joint label fusion for multi-atlas based brain segmentation.

    PubMed

    Wu, Guorong; Wang, Qian; Zhang, Daoqiang; Nie, Feiping; Huang, Heng; Shen, Dinggang

    2014-08-01

    Automated labeling of anatomical structures in medical images is very important in many neuroscience studies. Recently, patch-based labeling has been widely investigated to alleviate the possible mis-alignment when registering atlases to the target image. However, the weights used for label fusion from the registered atlases are generally computed independently and thus lack the capability of preventing the ambiguous atlas patches from contributing to the label fusion. More critically, these weights are often calculated based only on the simple patch similarity, thus not necessarily providing an optimal solution for label fusion. To address these limitations, we propose a generative probability model to describe the procedure of label fusion in a multi-atlas scenario, with the goal of labeling each point in the target image by the most representative atlas patches that also have the largest unanimity in labeling the underlying point correctly. Specifically, a sparsity constraint is imposed upon the label fusion weights, in order to select a small number of atlas patches that best represent the underlying target patch, thus reducing the risk of including misleading atlas patches. The labeling unanimity among atlas patches is achieved by exploring their dependencies, where we model these dependencies as the joint probability of each pair of atlas patches in correctly predicting the labels, by analyzing the correlation of their morphological error patterns and also the labeling consensus among atlases. The patch dependencies are further recursively updated based on the latest labeling results to correct the possible labeling errors, which falls into the Expectation Maximization (EM) framework. To demonstrate the labeling performance, we have comprehensively evaluated our patch-based labeling method on whole brain parcellation and hippocampus segmentation. Promising labeling results have been achieved in comparison to the conventional patch-based labeling

  6. Finding significantly connected voxels based on histograms of connection strengths

    NASA Astrophysics Data System (ADS)

    Kasenburg, Niklas; Pedersen, Morten Vester; Darkner, Sune

    2016-03-01

    We explore a new approach for structural connectivity based segmentations of subcortical brain regions. Connectivity based segmentations are usually based on fibre connections from a seed region to predefined target regions. We present a method for finding significantly connected voxels based on the distribution of connection strengths. Paths from seed voxels to all voxels in a target region are obtained from a shortest-path tractography. For each seed voxel we approximate the distribution with a histogram of path scores. We hypothesise that the majority of estimated connections are false-positives and that their connection strength is distributed differently from true-positive connections. Therefore, an empirical null-distribution is defined for each target region as the average normalized histogram over all voxels in the seed region. Single histograms are then tested against the corresponding null-distribution and significance is determined using the false discovery rate (FDR). Segmentations are based on significantly connected voxels and their FDR. In this work we focus on the thalamus and the target regions were chosen by dividing the cortex into a prefrontal/temporal zone, motor zone, somatosensory zone and a parieto-occipital zone. The obtained segmentations consistently show a sparse number of significantly connected voxels that are located near the surface of the anterior thalamus over a population of 38 subjects.
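
    A minimal sketch of the final testing step, assuming per-voxel p-values come from comparing each voxel's normalized histogram of path scores against the region-average (null) histogram with a chi-square statistic (a stand-in for the paper's exact test), followed by Benjamini-Hochberg FDR control:

        import numpy as np
        from scipy.stats import chisquare

        def voxel_pvalues(histograms, null_hist, n_paths):
            # Compare each voxel's normalized histogram against the null histogram;
            # assumes all null-histogram bins are non-empty.
            return np.asarray([chisquare(h * n_paths, f_exp=null_hist * n_paths).pvalue
                               for h in histograms])

        def benjamini_hochberg(pvals, q=0.05):
            # Boolean mask of voxels declared significant at FDR level q (step-up procedure).
            order = np.argsort(pvals)
            ranked = pvals[order]
            m = len(pvals)
            passed = ranked <= q * np.arange(1, m + 1) / m
            mask = np.zeros(m, dtype=bool)
            if passed.any():
                mask[order[: np.max(np.nonzero(passed)[0]) + 1]] = True
            return mask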

  7. Design of Probabilistic Boolean Networks Based on Network Structure and Steady-State Probabilities.

    PubMed

    Kobayashi, Koichi; Hiraishi, Kunihiko

    2016-06-06

    In this brief, we consider the problem of finding a probabilistic Boolean network (PBN) based on a network structure and desired steady-state properties. In systems biology and synthetic biology, such problems are important inverse problems. Using a matrix-based representation of PBNs, a solution method for this problem is proposed. The problem of finding a BN has been studied previously. In the problem of finding a PBN, we must calculate not only the Boolean functions, but also the probabilities of selecting a Boolean function and the number of candidates of the Boolean functions. Hence, the problem of finding a PBN is more difficult than that of finding a BN. The effectiveness of the proposed method is demonstrated by numerical examples.

  8. Tuning the tunneling probability by mechanical stress in Schottky barrier based reconfigurable nanowire transistors

    NASA Astrophysics Data System (ADS)

    Baldauf, Tim; Heinzig, André; Trommer, Jens; Mikolajick, Thomas; Weber, Walter Michael

    2017-02-01

    Mechanical stress is an established and important tool of the semiconductor industry to improve the performance of modern transistors. It is well understood for the enhancement of carrier mobility but rather unexplored for the control of the tunneling probability in injection-dominated research devices based on tunneling phenomena, such as tunnel FETs, resonant tunnel FETs and reconfigurable Schottky FETs. In this work, the effect of stress on the tunneling probability and overall transistor characteristics is studied by three-dimensional device simulations, using the example of reconfigurable silicon nanowire Schottky barrier transistors with two independently gated Schottky junctions. To this end, four different stress sources are investigated. The effects of mechanical stress on the average effective tunneling mass and on the multi-valley band structure are considered by applying the deformation potential theory. The transfer characteristics of strained transistors in n- and p-configuration and the corresponding charge carrier tunneling are analyzed with respect to the current ratio between electron and hole conduction. For the implementation of these devices into complementary circuits, the mandatory current ratio of unity can be achieved by appropriate mechanical stress, introduced either by nanowire oxidation or by the application of a stressed top layer.

  9. A Trust-Based Adaptive Probability Marking and Storage Traceback Scheme for WSNs.

    PubMed

    Liu, Anfeng; Liu, Xiao; Long, Jun

    2016-03-30

    Security is a pivotal issue for wireless sensor networks (WSNs), which are emerging as a promising platform that enables a wide range of military, scientific, industrial and commercial applications. Traceback, a key cyber-forensics technology, can play an important role in tracing and locating a malicious source to guarantee cybersecurity. In this work a trust-based adaptive probability marking and storage (TAPMS) traceback scheme is proposed to enhance security for WSNs. In a TAPMS scheme, the marking probability is adaptively adjusted according to the security requirements of the network and can substantially reduce the number of marking tuples and improve network lifetime. More importantly, a high trust node is selected to store marking tuples, which can avoid the problem of marking information being lost. Experimental results show that the total number of marking tuples can be reduced in a TAPMS scheme, thus improving network lifetime. At the same time, since the marking tuples are stored in high trust nodes, storage reliability can be guaranteed, and the traceback time can be reduced by more than 80%.

  10. Protein single-model quality assessment by feature-based probability density functions.

    PubMed

    Cao, Renzhi; Cheng, Jianlin

    2016-04-04

    Protein quality assessment (QA) has played an important role in protein structure prediction. We developed a novel single-model quality assessment method, Qprob. Qprob calculates the absolute error for each protein feature value against the true quality scores (i.e. GDT-TS scores) of protein structural models, and uses them to estimate its probability density distribution for quality assessment. Qprob has been blindly tested in the 11th Critical Assessment of Techniques for Protein Structure Prediction (CASP11) as the MULTICOM-NOVEL server. The official CASP result shows that Qprob ranks as one of the top single-model QA methods. In addition, Qprob makes contributions to our protein tertiary structure predictor MULTICOM, which is officially ranked 3rd out of 143 predictors. This performance shows that Qprob is good at assessing the quality of models of hard targets. These results demonstrate that this new probability density distribution based method is effective for protein single-model quality assessment and is useful for protein structure prediction. The webserver of Qprob is available at: http://calla.rnet.missouri.edu/qprob/. The software is now freely available in the web server of Qprob.

  11. Probability density function characterization for aggregated large-scale wind power based on Weibull mixtures

    DOE PAGES

    Gomez-Lazaro, Emilio; Bueso, Maria C.; Kessler, Mathieu; ...

    2016-02-02

    Here, the Weibull probability distribution has been widely applied to characterize wind speeds for wind energy resources. Wind power generation modeling is different, however, due in particular to power curve limitations, wind turbine control methods, and transmission system operation requirements. These differences are even greater for aggregated wind power generation in power systems with high wind penetration. Consequently, models based on one-Weibull component can provide poor characterizations for aggregated wind power generation. With this aim, the present paper focuses on discussing Weibull mixtures to characterize the probability density function (PDF) for aggregated wind power generation. PDFs of wind power data are firstly classified attending to hourly and seasonal patterns. The selection of the number of components in the mixture is analyzed through two well-known different criteria: the Akaike information criterion (AIC) and the Bayesian information criterion (BIC). Finally, the optimal number of Weibull components for maximum likelihood is explored for the defined patterns, including the estimated weight, scale, and shape parameters. Results show that multi-Weibull models are more suitable to characterize aggregated wind power data due to the impact of distributed generation, variety of wind speed values and wind power curtailment.
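
    A small sketch of the model-selection step, assuming the maximized log-likelihood of each candidate k-component Weibull mixture is already available (such a mixture has 3k - 1 free parameters: a shape and a scale per component plus k - 1 weights); the log-likelihood values are hypothetical:

        import numpy as np

        def aic_bic(log_likelihood, n_components, n_samples):
            # Information criteria for a k-component Weibull mixture.
            k_params = 3 * n_components - 1
            aic = 2 * k_params - 2 * log_likelihood
            bic = k_params * np.log(n_samples) - 2 * log_likelihood
            return aic, bic

        loglik = {1: -21050.0, 2: -20110.0, 3: -20085.0, 4: -20080.0}   # hypothetical fits, 8760 hourly values
        scores = {k: aic_bic(ll, k, 8760) for k, ll in loglik.items()}
        best_by_bic = min(scores, key=lambda k: scores[k][1])            # smallest BIC wins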

  12. A Trust-Based Adaptive Probability Marking and Storage Traceback Scheme for WSNs

    PubMed Central

    Liu, Anfeng; Liu, Xiao; Long, Jun

    2016-01-01

    Security is a pivotal issue for wireless sensor networks (WSNs), which are emerging as a promising platform that enables a wide range of military, scientific, industrial and commercial applications. Traceback, a key cyber-forensics technology, can play an important role in tracing and locating a malicious source to guarantee cybersecurity. In this work a trust-based adaptive probability marking and storage (TAPMS) traceback scheme is proposed to enhance security for WSNs. In a TAPMS scheme, the marking probability is adaptively adjusted according to the security requirements of the network and can substantially reduce the number of marking tuples and improve network lifetime. More importantly, a high trust node is selected to store marking tuples, which can avoid the problem of marking information being lost. Experimental results show that the total number of marking tuples can be reduced in a TAPMS scheme, thus improving network lifetime. At the same time, since the marking tuples are stored in high trust nodes, storage reliability can be guaranteed, and the traceback time can be reduced by more than 80%. PMID:27043566

  13. Partitioning incident radiation fluxes based on photon recollision probability in vegetation canopies

    NASA Astrophysics Data System (ADS)

    Mõttus, M.; Stenberg, P.

    2007-12-01

    Remote sensing of vegetation and modeling of canopy microclimate require information on the fractions of incident radiation reflected, transmitted and absorbed by a plant canopy. The photon recollision probability p allows easy calculation of the amount of radiation absorbed by a vegetation canopy and prediction of the spectral behavior of canopy scattering, i.e. the sum of canopy reflectance and transmittance. However, to divide the scattered radiation into reflected and transmitted fluxes, additional models are needed. To overcome this problem, we present a simple formula based on the photon recollision probability p to estimate the fraction of radiation scattered upwards by a canopy. The new semi-empirical method is tested with Monte Carlo simulations. A comparison with the analytical solution of the two-stream equation of radiative transfer in vegetation canopies is also provided. Our results indicate that the method is accurate for low to moderate leaf area index (LAI) values, and provides a reasonable approximation even at LAI=8. Finally, we present a new method to compute p using numerical radiative transfer models.
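
    A minimal sketch of the absorption/scattering side of such calculations, assuming the commonly used spectral-invariant relation linking canopy scattering to the leaf single-scattering albedo and the recollision probability p (the paper's new formula for the upward-scattered fraction is not reproduced here):

        def canopy_scattering(leaf_albedo, p):
            # Fraction of intercepted radiation scattered by the canopy
            # (reflectance + transmittance) under the spectral-invariant relation.
            return leaf_albedo * (1.0 - p) / (1.0 - p * leaf_albedo)

        def canopy_absorption(leaf_albedo, p):
            return 1.0 - canopy_scattering(leaf_albedo, p)

        # Hypothetical near-infrared leaf albedo 0.9 and recollision probability 0.6:
        print(round(canopy_scattering(0.9, 0.6), 3))   # ~0.783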

  14. Probability density function characterization for aggregated large-scale wind power based on Weibull mixtures

    SciTech Connect

    Gomez-Lazaro, Emilio; Bueso, Maria C.; Kessler, Mathieu; Martin-Martinez, Sergio; Zhang, Jie; Hodge, Bri -Mathias; Molina-Garcia, Angel

    2016-02-02

    Here, the Weibull probability distribution has been widely applied to characterize wind speeds for wind energy resources. Wind power generation modeling is different, however, due in particular to power curve limitations, wind turbine control methods, and transmission system operation requirements. These differences are even greater for aggregated wind power generation in power systems with high wind penetration. Consequently, models based on one-Weibull component can provide poor characterizations for aggregated wind power generation. With this aim, the present paper focuses on discussing Weibull mixtures to characterize the probability density function (PDF) for aggregated wind power generation. PDFs of wind power data are firstly classified attending to hourly and seasonal patterns. The selection of the number of components in the mixture is analyzed through two well-known different criteria: the Akaike information criterion (AIC) and the Bayesian information criterion (BIC). Finally, the optimal number of Weibull components for maximum likelihood is explored for the defined patterns, including the estimated weight, scale, and shape parameters. Results show that multi-Weibull models are more suitable to characterize aggregated wind power data due to the impact of distributed generation, variety of wind speed values and wind power curtailment.

  15. Rationalizing Hybrid Earthquake Probabilities

    NASA Astrophysics Data System (ADS)

    Gomberg, J.; Reasenberg, P.; Beeler, N.; Cocco, M.; Belardinelli, M.

    2003-12-01

    An approach to including stress transfer and frictional effects in estimates of the probability of failure of a single fault affected by a nearby earthquake has been suggested in Stein et al. (1997). This `hybrid' approach combines conditional probabilities, which depend on the time elapsed since the last earthquake on the affected fault, with Poissonian probabilities that account for friction and depend only on the time since the perturbing earthquake. The latter are based on the seismicity rate change model developed by Dieterich (1994) to explain the temporal behavior of aftershock sequences in terms of rate-state frictional processes. The model assumes an infinite population of nucleation sites that are near failure at the time of the perturbing earthquake. In the hybrid approach, assuming the Dieterich model can lead to significant transient increases in failure probability. We explore some of the implications of applying the Dieterich model to a single fault and its impact on the hybrid probabilities. We present two interpretations that we believe can rationalize the use of the hybrid approach. In the first, a statistical distribution representing uncertainties in elapsed and/or mean recurrence time on the fault serves as a proxy for Dieterich's population of nucleation sites. In the second, we imagine a population of nucleation patches distributed over the fault with a distribution of maturities. In both cases we find that the probability depends on the time since the last earthquake. In particular, the size of the transient probability increase may only be significant for faults already close to failure. Neglecting the maturity of a fault may lead to overestimated rate and probability increases.

  16. Location Prediction Based on Transition Probability Matrices Constructing from Sequential Rules for Spatial-Temporal K-Anonymity Dataset

    PubMed Central

    Liu, Zhao; Zhu, Yunhong; Wu, Chenxue

    2016-01-01

    Spatial-temporal k-anonymity has become a mainstream approach among techniques for protection of users’ privacy in location-based services (LBS) applications, and has been applied to several variants such as LBS snapshot queries and continuous queries. Analyzing large-scale spatial-temporal anonymity sets may benefit several LBS applications. In this paper, we propose two location prediction methods based on transition probability matrices constructed from sequential rules for spatial-temporal k-anonymity datasets. First, we define single-step sequential rules mined from sequential spatial-temporal k-anonymity datasets generated from continuous LBS queries for multiple users. We then construct transition probability matrices from mined single-step sequential rules, and normalize the transition probabilities in the transition matrices. Next, we regard a mobility model for an LBS requester as a stationary stochastic process and compute the n-step transition probability matrices by raising the normalized transition probability matrices to the power n. Furthermore, we propose two location prediction methods: rough prediction and accurate prediction. The former yields the probabilities of arriving at target locations along simple paths that include only current locations, target locations and transition steps. By iteratively combining the probabilities for simple paths with n steps and the probabilities for detailed paths with n-1 steps, the latter method calculates transition probabilities for detailed paths with n steps from current locations to target locations. Finally, we conduct extensive experiments, and the correctness and flexibility of our proposed algorithm have been verified. PMID:27508502
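
    A minimal sketch of the matrix step behind the rough-prediction method: row-normalize the single-step transition counts and raise the matrix to the power n (Chapman-Kolmogorov); the counts below are hypothetical:

        import numpy as np

        counts = np.array([[20.0,  5.0,  5.0],    # hypothetical single-step transition counts
                           [ 2.0, 30.0,  8.0],    # between three anonymized location cells
                           [ 4.0,  6.0, 10.0]])

        P = counts / counts.sum(axis=1, keepdims=True)   # normalized transition probabilities
        P3 = np.linalg.matrix_power(P, 3)                # 3-step transition probabilities

        current_location = 0
        print(P3[current_location])   # probability of reaching each target location in 3 steps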

  17. Location Prediction Based on Transition Probability Matrices Constructing from Sequential Rules for Spatial-Temporal K-Anonymity Dataset.

    PubMed

    Zhang, Haitao; Chen, Zewei; Liu, Zhao; Zhu, Yunhong; Wu, Chenxue

    2016-01-01

    Spatial-temporal k-anonymity has become a mainstream approach among techniques for protection of users' privacy in location-based services (LBS) applications, and has been applied to several variants such as LBS snapshot queries and continuous queries. Analyzing large-scale spatial-temporal anonymity sets may benefit several LBS applications. In this paper, we propose two location prediction methods based on transition probability matrices constructed from sequential rules for spatial-temporal k-anonymity datasets. First, we define single-step sequential rules mined from sequential spatial-temporal k-anonymity datasets generated from continuous LBS queries for multiple users. We then construct transition probability matrices from mined single-step sequential rules, and normalize the transition probabilities in the transition matrices. Next, we regard a mobility model for an LBS requester as a stationary stochastic process and compute the n-step transition probability matrices by raising the normalized transition probability matrices to the power n. Furthermore, we propose two location prediction methods: rough prediction and accurate prediction. The former yields the probabilities of arriving at target locations along simple paths that include only current locations, target locations and transition steps. By iteratively combining the probabilities for simple paths with n steps and the probabilities for detailed paths with n-1 steps, the latter method calculates transition probabilities for detailed paths with n steps from current locations to target locations. Finally, we conduct extensive experiments, and the correctness and flexibility of our proposed algorithm have been verified.

  18. Reliability, failure probability, and strength of resin-based materials for CAD/CAM restorations

    PubMed Central

    Lim, Kiatlin; Yap, Adrian U-Jin; Agarwalla, Shruti Vidhawan; Tan, Keson Beng-Choon; Rosa, Vinicius

    2016-01-01

    ABSTRACT Objective: This study investigated the Weibull parameters and 5% fracture probability of direct, indirect, and CAD/CAM composites. Material and Methods: Disc-shaped (12 mm diameter x 1 mm thick) specimens were prepared for a direct composite [Z100 (ZO), 3M-ESPE], an indirect laboratory composite [Ceramage (CM), Shofu], and two CAD/CAM composites [Lava Ultimate (LU), 3M ESPE; Vita Enamic (VE), Vita Zahnfabrik] restorations (n=30 for each group). The specimens were polished and stored in distilled water for 24 hours at 37°C. Weibull parameters (m = Weibull modulus, σ0 = characteristic strength) and flexural strength for 5% fracture probability (σ5%) were determined using a piston-on-three-balls device at 1 MPa/s in distilled water. Statistical analyses of biaxial flexural strength were performed either by one-way ANOVA with Tukey's post hoc test (α=0.05) or by Pearson's correlation test. Results: Ranking of m was: VE (19.5), LU (14.5), CM (11.7), and ZO (9.6). Ranking of σ0 (MPa) was: LU (218.1), ZO (210.4), CM (209.0), and VE (126.5). σ5% (MPa) was 177.9 for LU, 163.2 for CM, 154.7 for ZO, and 108.7 for VE. There was no significant difference in the m for ZO, CM, and LU. VE presented the highest m value, significantly higher than ZO. For σ0 and σ5%, ZO, CM, and LU were similar but higher than VE. Conclusion: The strength characteristics of CAD/CAM composites vary according to their composition and microstructure. VE presented the lowest strength and highest Weibull modulus among the materials. PMID:27812614
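
    As a check on how the reported quantities relate, the two-parameter Weibull model gives the stress at a chosen fracture probability F as σ_F = σ0·(-ln(1 - F))^(1/m); evaluating it with the reported LU parameters reproduces the reported σ5% to within rounding:

        import math

        def weibull_strength_at_probability(sigma0, m, failure_prob):
            # Stress at which a two-parameter Weibull model predicts the given fracture probability.
            return sigma0 * (-math.log(1.0 - failure_prob)) ** (1.0 / m)

        print(round(weibull_strength_at_probability(218.1, 14.5, 0.05), 1))   # ~177.7 MPa for LU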

  19. A software for the estimation of binding parameters of biochemical equilibria based on statistical probability model.

    PubMed

    Fisicaro, E; Braibanti, A; Sambasiva Rao, R; Compari, C; Ghiozzi, A; Nageswara Rao, G

    1998-04-01

    An algorithm is proposed for the estimation of binding parameters for the interaction of biologically important macromolecules with smaller ones from electrometric titration data. The mathematical model is based on the representation of equilibria in terms of probability concepts of statistical molecular thermodynamics. The refinement of equilibrium concentrations of the components and estimation of binding parameters (log site constant and cooperativity factor) is performed using singular value decomposition, a chemometric technique which overcomes the general obstacles due to near singularity. The present software is validated with a number of biochemical systems of varying number of sites and cooperativity factors. The effect of random errors of realistic magnitude in experimental data is studied using the simulated primary data for some typical systems. The safe area within which approximate binding parameters ensure convergence has been reported for the non-self starting optimization algorithms.

  20. Fall risk probability estimation based on supervised feature learning using public fall datasets.

    PubMed

    Koshmak, Gregory A; Linden, Maria; Loutfi, Amy

    2016-08-01

    Risk of falling is considered among the major threats for the elderly population and has therefore started to play an important role in modern healthcare. With the recent development of sensor technology, the number of studies dedicated to reliable fall detection systems has increased drastically. However, there is still a lack of a universal approach regarding the evaluation of the developed algorithms. In the following study we make an attempt to find publicly available fall datasets and analyze similarities among them using supervised learning. After performing a similarity assessment based on multidimensional scaling, we indicate the most representative feature vector corresponding to each specific dataset. This vector, obtained from real-life data, is subsequently deployed to estimate fall risk probabilities for a statistical fall detection model. Finally, we conclude with some observations regarding the similarity assessment results and provide suggestions towards an efficient approach for the evaluation of fall detection studies.

  1. Estimation of the failure probability during EGS stimulation based on borehole data

    NASA Astrophysics Data System (ADS)

    Meller, C.; Kohl, Th.; Gaucher, E.

    2012-04-01

    In recent times the search for alternative sources of energy has been fostered by the scarcity of fossil fuels. With its ability to permanently provide electricity or heat with little emission of CO2, geothermal energy will have an important share in the energy mix of the future. Within Europe, scientists have identified many locations with conditions suitable for Enhanced Geothermal System (EGS) projects. In order to provide sufficiently high reservoir permeability, EGS require borehole stimulations prior to the installation of power plants (Gérard et al., 2006). Induced seismicity during water injection into EGS reservoirs is a factor that currently can be neither predicted nor controlled. Often, people living near EGS projects are frightened by smaller earthquakes occurring during stimulation or injection. As this fear can lead to widespread disapproval of geothermal power plants, it is desirable to find a way to estimate the probability of fractures shearing when water is injected at a given pressure into a geothermal reservoir. This provides knowledge that enables prediction of the mechanical behavior of a reservoir in response to a change in pore pressure conditions. In the present study an approach for estimating the shearing probability based on statistical analyses of fracture distribution, orientation and clusters, together with their geological properties, is proposed. Based on geophysical logs of five wells in Soultz-sous-Forêts, France, and with the help of statistical tools, the Mohr criterion, and geological and mineralogical properties of the host rock and the fracture fillings, correlations between the wells are analyzed. This is achieved with the self-written MATLAB code Fracdens, which enables us to statistically analyze the log files in different ways. With the application of a pore pressure change, the evolution of the critical pressure on the fractures can be determined. A special focus is on the clay fillings of the fractures and how they reduce

  2. Binomial probability distribution model-based protein identification algorithm for tandem mass spectrometry utilizing peak intensity information.

    PubMed

    Xiao, Chuan-Le; Chen, Xiao-Zhou; Du, Yang-Li; Sun, Xuesong; Zhang, Gong; He, Qing-Yu

    2013-01-04

    Mass spectrometry has become one of the most important technologies in proteomic analysis. Tandem mass spectrometry (LC-MS/MS) is a major tool for the analysis of peptide mixtures from protein samples. The key step of MS data processing is the identification of peptides from experimental spectra by searching public sequence databases. Although a number of algorithms to identify peptides from MS/MS data have been already proposed, e.g. Sequest, OMSSA, X!Tandem, Mascot, etc., they are mainly based on statistical models considering only peak-matches between experimental and theoretical spectra, but not peak intensity information. Moreover, different algorithms gave different results from the same MS data, implying their probable incompleteness and questionable reproducibility. We developed a novel peptide identification algorithm, ProVerB, based on a binomial probability distribution model of protein tandem mass spectrometry combined with a new scoring function, making full use of peak intensity information and, thus, enhancing the ability of identification. Compared with Mascot, Sequest, and SQID, ProVerB identified significantly more peptides from LC-MS/MS data sets than the current algorithms at 1% False Discovery Rate (FDR) and provided more confident peptide identifications. ProVerB is also compatible with various platforms and experimental data sets, showing its robustness and versatility. The open-source program ProVerB is available at http://bioinformatics.jnu.edu.cn/software/proverb/ .
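
    A minimal sketch of the kind of binomial scoring such a model implies (not ProVerB's actual scoring function): the chance probability of matching at least k of n theoretical fragment peaks, given a per-peak random-match probability p, converted to a -log10 score; all numbers are hypothetical:

        import math
        from scipy.stats import binom

        def binomial_match_score(n_theoretical_peaks, n_matched_peaks, p_random_match):
            # -log10 of the probability of matching at least this many peaks by chance.
            p_at_least_k = binom.sf(n_matched_peaks - 1, n_theoretical_peaks, p_random_match)
            return -math.log10(p_at_least_k)

        print(round(binomial_match_score(30, 12, 0.05), 1))   # 12 of 30 peaks matched, 5% random-match chance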

  3. Estimates of EPSP amplitude based on changes in motoneuron discharge rate and probability.

    PubMed

    Powers, Randall K; Türker, K S

    2010-10-01

    When motor units are discharging tonically, transient excitatory synaptic inputs produce an increase in the probability of spike occurrence and also increase the instantaneous discharge rate. Several researchers have proposed that these induced changes in discharge rate and probability can be used to estimate the amplitude of the underlying excitatory post-synaptic potential (EPSP). We tested two different methods of estimating EPSP amplitude by comparing the amplitude of simulated EPSPs with their effects on the discharge of rat hypoglossal motoneurons recorded in an in vitro brainstem slice preparation. The first estimation method (simplified-trajectory method) is based on the assumptions that the membrane potential trajectory between spikes can be approximated by a 10 mV post-spike hyperpolarization followed by a linear rise to the next spike and that EPSPs sum linearly with this trajectory. We hypothesized that this estimation method would not be accurate due to interspike variations in membrane conductance and firing threshold that are not included in the model and that an alternative method based on estimating the effective distance to threshold would provide more accurate estimates of EPSP amplitude. This second method (distance-to-threshold method) uses interspike interval statistics to estimate the effective distance to threshold throughout the interspike interval and incorporates this distance-to-threshold trajectory into a threshold-crossing model. We found that the first method systematically overestimated the amplitude of small (<5 mV) EPSPs and underestimated the amplitude of large (>5 mV) EPSPs. For large EPSPs, the degree of underestimation increased with increasing background discharge rate. Estimates based on the second method were more accurate for small EPSPs than those based on the first method, but estimation errors were still large for large EPSPs. These errors were likely due to two factors: (1) the distance to threshold can only be

  4. A generic probability based algorithm to derive regional patterns of crops in time and space

    NASA Astrophysics Data System (ADS)

    Wattenbach, Martin; Oijen, Marcel v.; Leip, Adrian; Hutchings, Nick; Balkovic, Juraj; Smith, Pete

    2013-04-01

    Croplands are not only the key to human food supply; they also change the biophysical and biogeochemical properties of the land surface, leading to changes in the water cycle and energy partitioning, influencing soil erosion, and substantially contributing to the amount of greenhouse gases entering the atmosphere. The effects of croplands on the environment depend on the type of crop and the associated management, both of which are related to the site conditions and economic boundary settings as well as to the preferences of individual farmers. However, at a given point in time the pattern of crops in a landscape is not only determined by environmental and socioeconomic conditions but also by the compatibility with the crops grown in previous years on the same field and in its surrounding cropping area. Crop compatibility is driven by factors like pests and diseases, crop-driven changes in soil structure and the timing of cultivation steps. Given these effects of crops on the biogeochemical cycle and their interdependence with the mentioned boundary conditions, there is a demand in the regional and global modelling community to account for these regional patterns. Here we present a Bayesian crop distribution generator algorithm that is used to calculate the combined and conditional probability for a crop to appear in time and space using sparse and disparate information. The input information used to define the most probable crop per year and grid cell is based on combined probabilities derived from a crop transition matrix representing good agricultural practice, crop-specific soil suitability derived from the European soil database, and statistical information about harvested area from the Eurostat database. The reported Eurostat crop area also provides the target proportion to be matched by the algorithm at the level of administrative units (Nomenclature des Unités Territoriales Statistiques - NUTS). The algorithm is applied for the EU27 to derive regional spatial and
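
    A minimal per-cell sketch of the probability combination such a generator relies on, assuming the combined weight is simply the product of the transition probability from last year's crop, the soil-suitability score and the regional area share, then normalized and sampled; all names and numbers are hypothetical:

        import numpy as np

        crops = ["wheat", "maize", "rapeseed"]
        transition = {"wheat": np.array([0.2, 0.5, 0.3])}   # P(next crop | previous crop = wheat)
        suitability = np.array([0.9, 0.6, 0.8])             # soil suitability of this grid cell
        area_share = np.array([0.5, 0.3, 0.2])              # target shares from regional statistics

        weights = transition["wheat"] * suitability * area_share
        probs = weights / weights.sum()                      # combined, normalized probability

        rng = np.random.default_rng(1)
        next_crop = rng.choice(crops, p=probs)               # sampled crop for this cell and year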

  5. Stability metrics for multi-source biomedical data based on simplicial projections from probability distribution distances.

    PubMed

    Sáez, Carlos; Robles, Montserrat; García-Gómez, Juan M

    2017-02-01

    Biomedical data may be composed of individuals generated from distinct, meaningful sources. Due to possible contextual biases in the processes that generate data, there may exist an undesirable and unexpected variability among the probability distribution functions (PDFs) of the source subsamples, which, when uncontrolled, may lead to inaccurate or unreproducible research results. Classical statistical methods may have difficulty uncovering such variabilities when dealing with multi-modal, multi-type, multi-variate data. This work proposes two metrics for the analysis of stability among multiple data sources, robust to the aforementioned conditions, and defined in the context of data quality assessment. Specifically, a global probabilistic deviation metric and a source probabilistic outlyingness metric are proposed. The first provides a bounded degree of the global multi-source variability, designed as an estimator equivalent to the notion of normalized standard deviation of PDFs. The second provides a bounded degree of the dissimilarity of each source to a latent central distribution. The metrics are based on the projection of a simplex geometrical structure constructed from the Jensen-Shannon distances among the sources' PDFs. The metrics have been evaluated and demonstrated their correct behaviour on a simulated benchmark and with real multi-source biomedical data using the UCI Heart Disease data set. The biomedical data quality assessment based on the proposed stability metrics may improve the efficiency and effectiveness of biomedical data exploitation and research.
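
    A minimal sketch of the distance computation the metrics start from: pairwise Jensen-Shannon distances between the source PDFs (the simplex construction and the two metrics themselves are not reproduced here); the discretized histograms are hypothetical:

        import numpy as np
        from scipy.spatial.distance import jensenshannon

        # Hypothetical discretized PDFs (histograms over the same bins) for three data sources:
        sources = np.array([[0.10, 0.30, 0.40, 0.20],
                            [0.05, 0.25, 0.45, 0.25],
                            [0.40, 0.30, 0.20, 0.10]])

        n = len(sources)
        D = np.zeros((n, n))
        for i in range(n):
            for j in range(i + 1, n):
                D[i, j] = D[j, i] = jensenshannon(sources[i], sources[j], base=2)

        print(D)   # pairwise dissimilarities feeding the simplex-based stability metrics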

  6. Probability-based prediction of activity in multiple arm muscles: implications for functional electrical stimulation.

    PubMed

    Anderson, Chad V; Fuglevand, Andrew J

    2008-07-01

    Functional electrical stimulation (FES) involves artificial activation of muscles with implanted electrodes to restore motor function in paralyzed individuals. The range of motor behaviors that can be generated by FES, however, is limited to a small set of preprogrammed movements such as hand grasp and release. A broader range of movements has not been implemented because of the substantial difficulty associated with identifying the patterns of muscle stimulation needed to elicit specified movements. To overcome this limitation in controlling FES systems, we used probabilistic methods to estimate the levels of muscle activity in the human arm during a wide range of free movements based on kinematic information of the upper limb. Conditional probability distributions were generated based on hand kinematics and associated surface electromyographic (EMG) signals from 12 arm muscles recorded during a training task involving random movements of the arm in one subject. These distributions were then used to predict in four other subjects the patterns of muscle activity associated with eight different movement tasks. On average, about 40% of the variance in the actual EMG signals could be accounted for in the predicted EMG signals. These results suggest that probabilistic methods ultimately might be used to predict the patterns of muscle stimulation needed to produce a wide array of desired movements in paralyzed individuals with FES.

  7. Global climate change model natural climate variation: Paleoclimate data base, probabilities and astronomic predictors

    SciTech Connect

    Kukla, G.; Gavin, J.

    1994-05-01

    This report was prepared at the Lamont-Doherty Geological Observatory of Columbia University at Palisades, New York, under subcontract to Pacific Northwest Laboratory (PNL); it is part of a larger project of global climate studies which supports site characterization work required for the selection of a potential high-level nuclear waste repository and forms part of the Performance Assessment Scientific Support (PASS) Program at PNL. The work under the PASS Program is currently focusing on the proposed site at Yucca Mountain, Nevada, and is under the overall direction of the Yucca Mountain Project Office, US Department of Energy, Las Vegas, Nevada. The final results of the PNL project will provide input to global atmospheric models designed to test specific climate scenarios which will be used in the site-specific modeling work of others. The primary purpose of the data bases compiled and of the astronomic predictive models is to aid in the estimation of the probabilities of future climate states. The results will be used by two other teams working on the global climate study under contract to PNL. They are located at the University of Maine in Orono, Maine, and the Applied Research Corporation in College Station, Texas. This report presents the results of the third year's work on the global climate change models and the data bases describing past climates.

  8. Molecular Characterization of Trichomonas vaginalis Strains Based on Identifying Their Probable Variations in Asymptomatic Patients

    PubMed Central

    SPOTIN, Adel; EGHTEDAR, Sanaz TAGHIZADEH; SHAHBAZI, Abbas; SALEHPOUR, Asghar; SARAFRAZ, Seddigheh; SHARIATZADEH, Seyyed Ali; MAHAMI-OSKOUEI, Mahmoud

    2016-01-01

    Background: The aim of this study was to identify Trichomonas vaginalis strains/haplotypes based on identifying their probable variations in asymptomatic patients referred to Tabriz health centers, northwestern Iran. Methods: Samples were taken from 50 women suspected of T. vaginalis infection in northwestern Iran. The obtained samples were smeared and cultured. Fifty DNA samples were extracted, amplified and identified by nested polymerase chain reaction and PCR-RFLP of the actin gene using two endonuclease enzymes: MseI and RsaI. To reconfirm the results, the amplicons of the actin gene were directly sequenced in order to identify the strains/haplotypes. Results: PCR-RFLP patterns, sequencing and phylogenetic analyses definitively revealed the presence of the G (n=22; 73.4%) and E (n=8; 26.6%) strains. Multiple alignment findings for genotype G showed five haplotypes and two amino acid substitutions in codons 192 and 211, although no remarkable unique haplotype was found in genotype E. Conclusion: The accurate identification of T. vaginalis strains, based on discrimination of their unknown haplotypes, particularly those that affect protein translation, should be considered in assessing parasite status, drug resistance, mixed infection with HIV and the monitoring of asymptomatic trichomoniasis in the region. PMID:28127362

  9. Some considerations on the definition of risk based on concepts of systems theory and probability.

    PubMed

    Andretta, Massimo

    2014-07-01

    The concept of risk has been applied in many modern science and technology fields. Despite its successes in many applicative fields, there is still not a well-established vision and universally accepted definition of the principles and fundamental concepts of the risk assessment discipline. As emphasized recently, the risk fields suffer from a lack of clarity on their scientific bases that can define, in a unique theoretical framework, the general concepts in the different areas of application. The aim of this article is to make suggestions for another perspective of risk definition that could be applied and, in a certain sense, generalize some of the previously known definitions (at least in the fields of technical and scientific applications). By drawing on my experience of risk assessment in different applicative situations (particularly in the risk estimation for major industrial accidents, and in the health and ecological risk assessment for contaminated sites), I would like to revise some general and foundational concepts of risk analysis in as consistent a manner as possible from the axiomatic/deductive point of view. My proposal is based on the fundamental concepts of the systems theory and of the probability. In this way, I try to frame, in a single, broad, and general theoretical context some fundamental concepts and principles applicable in many different fields of risk assessment. I hope that this article will contribute to the revitalization and stimulation of useful discussions and new insights into the key issues and theoretical foundations of risk assessment disciplines.

  10. Experience-Based Probabilities Modulate Expectations in a Gender-Coded Artificial Language

    PubMed Central

    Öttl, Anton; Behne, Dawn M.

    2016-01-01

    The current study combines artificial language learning with visual world eyetracking to investigate acquisition of representations associating spoken words and visual referents using morphologically complex pseudowords. Pseudowords were constructed to consistently encode referential gender by means of suffixation for a set of imaginary figures that could be either male or female. During training, the frequency of exposure to pseudowords and their imaginary figure referents were manipulated such that a given word and its referent would be more likely to occur in either the masculine form or the feminine form, or both forms would be equally likely. Results show that these experience-based probabilities affect the formation of new representations to the extent that participants were faster at recognizing a referent whose gender was consistent with the induced expectation than a referent whose gender was inconsistent with this expectation. Disambiguating gender information available from the suffix did not mask the induced expectations. Eyetracking data provide additional evidence that such expectations surface during online lexical processing. Taken together, these findings indicate that experience-based information is accessible during the earliest stages of processing, and are consistent with the view that language comprehension depends on the activation of perceptual memory traces. PMID:27602009

  11. Effect-based interpretation of toxicity test data using probability and comparison with alternative methods of analysis

    SciTech Connect

    Gully, J.R.; Baird, R.B.; Markle, P.J.; Bottomley, J.P.

    2000-01-01

    A methodology is described that incorporates the intra- and intertest variability and the biological effect of bioassay data in evaluating the toxicity of single and multiple tests for regulatory decision-making purposes. The single- and multiple-test regulatory decision probabilities were determined from t values (n − 1, one-tailed) derived from the estimated biological effect and the associated standard error at the critical sample concentration. Single-test regulatory decision probabilities below the selected minimum regulatory decision probability identify individual tests as noncompliant. A multiple-test regulatory decision probability is determined by combining the regulatory decision probability of a series of single tests. A multiple-test regulatory decision probability below the multiple-test regulatory decision minimum identifies groups of tests in which the magnitude and persistence of the toxicity is sufficient to be considered noncompliant or to require enforcement action. Regulatory decision probabilities derived from the t distribution were compared with results based on standard and bioequivalence hypothesis tests using single- and multiple-concentration toxicity test data from an actual national pollutant discharge permit. The probability-based approach incorporated the precision of the effect estimate into regulatory decisions at a fixed level of effect. Also, probability-based interpretation of toxicity tests provides incentive to laboratories to produce, and permit holders to use, high-quality, precise data, particularly when multiple tests are used in regulatory decisions. These results are contrasted with standard and bioequivalence hypothesis tests in which the intratest precision is a determining factor in setting the biological effect used for regulatory decisions.
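
    A minimal sketch of the single-test calculation as described, with the one-tailed probability taken from a t value formed from the estimated biological effect and its standard error (the multiple-test combination rule is not reproduced here); the numbers are hypothetical:

        from scipy import stats

        def single_test_decision_probability(effect_estimate, standard_error, n_replicates):
            # One-tailed probability (n - 1 df) that the biological effect exceeds zero.
            t_value = effect_estimate / standard_error
            return stats.t.cdf(t_value, df=n_replicates - 1)

        # Hypothetical test: 25% effect with a 10% standard error, 4 replicates:
        print(round(single_test_decision_probability(0.25, 0.10, 4), 2))   # ~0.96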

  12. A VLSI Architecture for Output Probability Computations of HMM-Based Recognition Systems with Store-Based Block Parallel Processing

    NASA Astrophysics Data System (ADS)

    Nakamura, Kazuhiro; Yamamoto, Masatoshi; Takagi, Kazuyoshi; Takagi, Naofumi

    In this paper, a fast and memory-efficient VLSI architecture for output probability computations of continuous Hidden Markov Models (HMMs) is presented. These computations are the most time-consuming part of HMM-based recognition systems. High-speed VLSI architectures with small registers and low-power dissipation are required for the development of mobile embedded systems with capable human interfaces. We demonstrate store-based block parallel processing (StoreBPP) for output probability computations and present a VLSI architecture that supports it. When the number of HMM states is adequate for accurate recognition, compared with conventional stream-based block parallel processing (StreamBPP) architectures, the proposed architecture requires fewer registers and processing elements and less processing time. The processing elements used in the StreamBPP architecture are identical to those used in the StoreBPP architecture. From a VLSI architectural viewpoint, a comparison shows the efficiency of the proposed architecture through efficient use of registers for storing input feature vectors and intermediate results during computation.
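
    For reference, the quantity being accelerated is, in its usual continuous-HMM form, the log output probability of a feature vector under a state's Gaussian mixture; a minimal diagonal-covariance sketch with hypothetical parameters (the StoreBPP register scheduling itself is a hardware matter and is not shown):

        import numpy as np
        from scipy.special import logsumexp

        def log_output_probability(x, weights, means, variances):
            # log b_j(x) for one HMM state modeled by a diagonal-covariance Gaussian mixture.
            log_gauss = -0.5 * (np.log(2.0 * np.pi * variances) + (x - means) ** 2 / variances).sum(axis=1)
            return logsumexp(np.log(weights) + log_gauss)

        x = np.array([0.2, -0.1, 0.4])                            # hypothetical feature vector
        weights = np.array([0.6, 0.4])
        means = np.array([[0.0, 0.0, 0.5], [0.3, -0.2, 0.1]])
        variances = np.array([[1.0, 1.0, 1.0], [0.5, 0.5, 0.5]])
        print(log_output_probability(x, weights, means, variances))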

  13. Experimental Probability in Elementary School

    ERIC Educational Resources Information Center

    Andrew, Lane

    2009-01-01

    Concepts in probability can be more readily understood if students are first exposed to probability via experiment. Performing probability experiments encourages students to develop understandings of probability grounded in real events, as opposed to merely computing answers based on formulae.

  14. Probability based remaining capacity estimation using data-driven and neural network model

    NASA Astrophysics Data System (ADS)

    Wang, Yujie; Yang, Duo; Zhang, Xu; Chen, Zonghai

    2016-05-01

    Since lithium-ion batteries are assembled into packs in large numbers and are complex electrochemical devices, their monitoring and safety concerns are key issues for the application of battery technology. An accurate estimation of battery remaining capacity is crucial for optimizing vehicle control, preventing the battery from over-charging and over-discharging, and ensuring safety during its service life. The remaining capacity estimation of a battery includes the estimation of state-of-charge (SOC) and state-of-energy (SOE). In this work, a probability based adaptive estimator is presented to obtain accurate and reliable estimation results for both SOC and SOE. For the SOC estimation, an n-th order RC equivalent circuit model is combined with an electrochemical model to obtain more accurate voltage prediction results. For the SOE estimation, a sliding window neural network model is proposed to investigate the relationship between the terminal voltage and the model inputs. To verify the accuracy and robustness of the proposed model and estimation algorithm, experiments under different dynamic operation current profiles are performed on commercial 1665130-type lithium-ion batteries. The results illustrate that accurate and robust estimation can be obtained by the proposed method.
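
    The Python sketch below shows the kind of voltage prediction an RC equivalent circuit model provides, using a single RC branch and a constant open-circuit voltage for brevity; the paper's estimator uses an n-th order RC model combined with an electrochemical model, and every parameter value here is invented.

    import numpy as np

    def simulate_terminal_voltage(current, dt, ocv, r0, r1, c1):
        """First-order RC model: v_t = OCV - R0*i - v_RC (discharge current positive).
        OCV is held constant here; in practice it depends on the state of charge."""
        v_rc = 0.0
        v_t = np.empty(len(current))
        alpha = np.exp(-dt / (r1 * c1))
        for k, i_k in enumerate(current):
            v_rc = alpha * v_rc + r1 * (1.0 - alpha) * i_k   # polarization voltage
            v_t[k] = ocv - r0 * i_k - v_rc
        return v_t

    i_profile = np.r_[np.full(50, 2.0), np.zeros(50)]        # 2 A pulse, then rest
    print(simulate_terminal_voltage(i_profile, dt=1.0, ocv=3.7,
                                    r0=0.05, r1=0.02, c1=2000.0)[:5])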

  15. A new probability distribution model of turbulent irradiance based on Born perturbation theory

    NASA Astrophysics Data System (ADS)

    Wang, Hongxing; Liu, Min; Hu, Hao; Wang, Qian; Liu, Xiguo

    2010-10-01

    The subject of the PDF (Probability Density Function) of the irradiance fluctuations in a turbulent atmosphere is still unsettled. Theory reliably describes the behavior in the weak turbulence regime, but theoretical descriptions in the strong and whole turbulence regimes are still controversial. Based on Born perturbation theory, the physical manifestations and correlations of three typical PDF models (Rice-Nakagami, exponential-Bessel and negative-exponential distribution) were theoretically analyzed. It is shown that these models can be derived by separately making circular-Gaussian, strong-turbulence and strong-turbulence-circular-Gaussian approximations in Born perturbation theory, which refutes the viewpoint that the Rice-Nakagami model is only applicable in the extremely weak turbulence regime and provides theoretical arguments for choosing rational models in practical applications. In addition, a common shortcoming of the three models is that they are all approximations. A new model, called the Maclaurin-spread distribution, is proposed without any approximation except for assuming the correlation coefficient to be zero; the new model is therefore considered to reflect Born perturbation theory exactly. Simulated results prove the accuracy of this new model.

  16. Moment-Based Probability Modeling and Extreme Response Estimation, The FITS Routine Version 1.2

    SciTech Connect

    MANUEL,LANCE; KASHEF,TINA; WINTERSTEIN,STEVEN R.

    1999-11-01

    This report documents the use of the FITS routine, which provides automated fits of various analytical, commonly used probability models from input data. It is intended to complement the previously distributed FITTING routine documented in RMS Report 14 (Winterstein et al., 1994), which implements relatively complex four-moment distribution models whose parameters are fit with numerical optimization routines. Although these four-moment fits can be quite useful and faithful to the observed data, their complexity can make them difficult to automate within standard fitting algorithms. In contrast, FITS provides more robust (lower moment) fits of simpler, more conventional distribution forms. For each database of interest, the routine estimates the distribution of annual maximum response based on the data values and the duration, T, over which they were recorded. To focus on the upper tails of interest, the user can also supply an arbitrary lower-bound threshold, χ_low, above which a shifted distribution model--exponential or Weibull--is fit.
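
    The Python sketch below mimics the kind of fit described above, not the FITS code itself: a shifted Weibull is fitted to exceedances above a lower-bound threshold, and the distribution of the annual maximum is then estimated from the record duration T under an assumed Poisson exceedance model. The data and threshold choice are invented.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    data = 3.0 * rng.weibull(1.5, size=2000)     # stand-in response data
    T_years = 10.0                                # duration of the record
    x_low = np.quantile(data, 0.90)               # user-supplied lower-bound threshold

    exceed = data[data > x_low] - x_low
    shape, loc, scale = stats.weibull_min.fit(exceed, floc=0.0)
    rate = exceed.size / T_years                  # mean exceedances per year

    def annual_max_cdf(x):
        """P(annual maximum <= x), assuming Poisson-distributed exceedances of x_low."""
        tail = stats.weibull_min.sf(np.maximum(x - x_low, 0.0), shape, loc, scale)
        return np.exp(-rate * tail)

    print(annual_max_cdf(np.array([x_low, x_low + 2.0, x_low + 5.0])))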

  17. SAR amplitude probability density function estimation based on a generalized Gaussian model.

    PubMed

    Moser, Gabriele; Zerubia, Josiane; Serpico, Sebastiano B

    2006-06-01

    In the context of remotely sensed data analysis, an important problem is the development of accurate models for the statistics of the pixel intensities. Focusing on synthetic aperture radar (SAR) data, this modeling process turns out to be a crucial task, for instance, for classification or for denoising purposes. In this paper, an innovative parametric estimation methodology for SAR amplitude data is proposed that adopts a generalized Gaussian (GG) model for the complex SAR backscattered signal. A closed-form expression for the corresponding amplitude probability density function (PDF) is derived and a specific parameter estimation algorithm is developed in order to deal with the proposed model. Specifically, the recently proposed "method-of-log-cumulants" (MoLC) is applied, which stems from the adoption of the Mellin transform (instead of the usual Fourier transform) in the computation of characteristic functions and from the corresponding generalization of the concepts of moment and cumulant. For the developed GG-based amplitude model, the resulting MoLC estimates turn out to be numerically feasible and are also analytically proved to be consistent. The proposed parametric approach was validated by using several real ERS-1, XSAR, E-SAR, and NASA/JPL airborne SAR images, and the experimental results prove that the method models the amplitude PDF better than several previously proposed parametric models for backscattering phenomena.
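
    As a pointer to what the method of log-cumulants involves, the Python sketch below computes the first three sample log-cumulants of amplitude data; matching these statistics to the analytical log-cumulants of the GG-based model is model-specific and omitted here, and the Rayleigh samples are only a stand-in for real SAR amplitudes.

    import numpy as np

    def sample_log_cumulants(amplitudes):
        log_a = np.log(amplitudes)
        k1 = log_a.mean()                        # first log-cumulant
        k2 = np.mean((log_a - k1) ** 2)          # second log-cumulant
        k3 = np.mean((log_a - k1) ** 3)          # third log-cumulant
        return k1, k2, k3

    rng = np.random.default_rng(2)
    print(sample_log_cumulants(rng.rayleigh(scale=1.0, size=10_000)))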

  18. Inverse modeling of hydraulic tests in fractured crystalline rock based on a transition probability geostatistical approach

    NASA Astrophysics Data System (ADS)

    Blessent, Daniela; Therrien, René; Lemieux, Jean-Michel

    2011-12-01

    This paper presents numerical simulations of a series of hydraulic interference tests conducted in crystalline bedrock at Olkiluoto (Finland), a potential site for the disposal of the Finnish high-level nuclear waste. The tests are in a block of crystalline bedrock of about 0.03 km3 that contains low-transmissivity fractures. Fracture density, orientation, and fracture transmissivity are estimated from Posiva Flow Log (PFL) measurements in boreholes drilled in the rock block. On the basis of those data, a geostatistical approach relying on transition probability and Markov chain models is used to define a conceptual model based on stochastic fractured rock facies. Four facies are defined, from sparsely fractured bedrock to highly fractured bedrock. Using this conceptual model, three-dimensional groundwater flow is then simulated to reproduce interference pumping tests in either open or packed-off boreholes. Hydraulic conductivities of the fracture facies are estimated through automatic calibration using either hydraulic heads or both hydraulic heads and PFL flow rates as targets for calibration. The latter option produces a narrower confidence interval for the calibrated hydraulic conductivities, therefore reducing the associated uncertainty and demonstrating the usefulness of the measured PFL flow rates. Furthermore, the stochastic facies conceptual model is a suitable alternative to discrete fracture network models to simulate fluid flow in fractured geological media.
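
    As a toy illustration of the transition-probability idea (one dimension only, with invented facies names and probabilities, not the study's 3-D geostatistical model), the Python sketch below simulates a column of fracture facies from a Markov-chain transition matrix.

    import numpy as np

    facies = ["sparsely fractured", "moderately fractured", "fractured", "highly fractured"]
    P = np.array([[0.85, 0.10, 0.04, 0.01],      # hypothetical cell-to-cell
                  [0.15, 0.70, 0.10, 0.05],      # transition probabilities
                  [0.05, 0.15, 0.70, 0.10],
                  [0.02, 0.08, 0.20, 0.70]])

    rng = np.random.default_rng(3)
    state, column = 0, []
    for _ in range(20):                          # 20 cells along a synthetic borehole
        column.append(facies[state])
        state = rng.choice(4, p=P[state])
    print(column)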

  19. An imprecise probability approach for squeal instability analysis based on evidence theory

    NASA Astrophysics Data System (ADS)

    Lü, Hui; Shangguan, Wen-Bin; Yu, Dejie

    2017-01-01

    An imprecise probability approach based on evidence theory is proposed for squeal instability analysis of uncertain disc brakes in this paper. First, the squeal instability of the finite element (FE) model of a disc brake is investigated and its dominant unstable eigenvalue is detected by running two typical numerical simulations, i.e., complex eigenvalue analysis (CEA) and transient dynamical analysis. Next, the uncertainty mainly caused by contact and friction is taken into account and some key parameters of the brake are described as uncertain parameters. These uncertain parameters are usually associated with imprecise data such as incomplete or conflicting information. Finally, a squeal instability analysis model considering imprecise uncertainty is established by integrating evidence theory, Taylor expansion, subinterval analysis and a surrogate model. In the proposed analysis model, the uncertain parameters with imprecise data are treated as evidence variables, and the belief measure and plausibility measure are employed to evaluate system squeal instability. The effectiveness of the proposed approach is demonstrated by numerical examples, and some interesting observations and conclusions are summarized from the analyses and discussions. The proposed approach is generally limited to squeal problems with a modest number of investigated parameters. It can be considered as a potential method for squeal instability analysis, acting as a first step toward reducing the squeal noise of uncertain brakes with imprecise information.
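
    The Python sketch below shows only the basic evidence-theory bookkeeping mentioned above: given focal elements (intervals) with basic probability assignments for some stability-related quantity, it returns the belief and plausibility that the quantity exceeds a threshold. The numbers and the choice of quantity are invented.

    def belief_plausibility(focal_elements, bpas, threshold):
        """Belief: mass of focal elements fully above the threshold.
        Plausibility: mass of focal elements that intersect the region above it."""
        bel = sum(m for (lo, hi), m in zip(focal_elements, bpas) if lo > threshold)
        pl = sum(m for (lo, hi), m in zip(focal_elements, bpas) if hi > threshold)
        return bel, pl

    focal = [(-2.0, 0.5), (0.2, 1.5), (1.0, 3.0)]   # e.g. real part of an eigenvalue
    bpa = [0.5, 0.3, 0.2]
    bel, pl = belief_plausibility(focal, bpa, threshold=0.0)
    print(f"Bel(unstable) = {bel:.2f}, Pl(unstable) = {pl:.2f}")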

  20. Reliability analysis of idealized tunnel support system using probability-based methods with case studies

    NASA Astrophysics Data System (ADS)

    Gharouni-Nik, Morteza; Naeimi, Meysam; Ahadi, Sodayf; Alimoradi, Zahra

    2014-06-01

    In order to determine the overall safety of a tunnel support lining, a reliability-based approach is presented in this paper. Support elements in jointed rock tunnels are provided to control the ground movement caused by stress redistribution during the tunnel drive. The main support elements that contribute to the stability of the tunnel structure are identified in order to address various aspects of reliability and sustainability in the system. The selection of efficient support methods for rock tunneling is a key factor in reducing the number of problems during construction and in keeping the project cost and time within the limited budget and planned schedule. This paper introduces a smart approach by which decision-makers will be able to find the overall reliability of a tunnel support system before selecting the final scheme of the lining system. Engineering reliability, a branch of statistics and probability, is applied here to investigate the reliability of the lining support system for the tunnel structure; reliability analysis for evaluating tunnel support performance is therefore the main idea of this research. Decomposition approaches are used to produce the system block diagram and to determine the failure probability of the whole system. The effectiveness of the proposed reliability model of the tunnel lining, together with the recommended approaches, is examined using several case studies, and the final value of reliability is obtained for different design scenarios. Considering the idea of a linear correlation between safety factors and reliability parameters, the values of isolated reliabilities are determined for different structural components of the tunnel support system. In order to determine individual safety factors, finite element modeling is employed for different structural subsystems and the results of numerical analyses are obtained in
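
    The Python sketch below is a generic block-diagram calculation in the spirit of the decomposition approach mentioned above, not the paper's model: component reliabilities are combined in series and in parallel, assuming independence, to give a system failure probability. The component names and values are invented.

    def series(reliabilities):
        out = 1.0
        for r in reliabilities:
            out *= r                     # a series chain survives only if every element does
        return out

    def parallel(reliabilities):
        out = 1.0
        for r in reliabilities:
            out *= (1.0 - r)             # a parallel block fails only if all elements fail
        return 1.0 - out

    r_bolts, r_shotcrete, r_steel_sets = 0.95, 0.90, 0.92
    r_system = series([r_bolts, parallel([r_shotcrete, r_steel_sets])])
    print(f"system reliability = {r_system:.4f}, failure probability = {1 - r_system:.4f}")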

  1. A sequential nonparametric pattern classification algorithm based on the Wald SPRT. [Sequential Probability Ratio Test

    NASA Technical Reports Server (NTRS)

    Poage, J. L.

    1975-01-01

    A sequential nonparametric pattern classification procedure is presented. The method presented is an estimated version of the Wald sequential probability ratio test (SPRT). This method utilizes density function estimates, and the density estimate used is discussed, including a proof of convergence in probability of the estimate to the true density function. The classification procedure proposed makes use of the theory of order statistics, and estimates of the probabilities of misclassification are given. The procedure was tested on discriminating between two classes of Gaussian samples and on discriminating between two kinds of electroencephalogram (EEG) responses.
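
    The Python sketch below implements the standard Wald SPRT decision rule that underlies the classifier; Gaussian class densities stand in for the nonparametric density estimates of the paper so the example stays self-contained, and the error rates and parameters are invented.

    import numpy as np
    from scipy import stats

    def sprt(samples, pdf0, pdf1, alpha=0.05, beta=0.05):
        upper = np.log((1 - beta) / alpha)       # accept class 1 when crossed
        lower = np.log(beta / (1 - alpha))       # accept class 0 when crossed
        llr = 0.0
        for n, x in enumerate(samples, start=1):
            llr += np.log(pdf1(x)) - np.log(pdf0(x))
            if llr >= upper:
                return "class 1", n
            if llr <= lower:
                return "class 0", n
        return "undecided", len(samples)

    rng = np.random.default_rng(4)
    obs = rng.normal(loc=0.8, size=100)          # data actually drawn near class 1
    print(sprt(obs, stats.norm(0, 1).pdf, stats.norm(1, 1).pdf))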

  2. METAPHOR: a machine-learning-based method for the probability density estimation of photometric redshifts

    NASA Astrophysics Data System (ADS)

    Cavuoti, S.; Amaro, V.; Brescia, M.; Vellucci, C.; Tortora, C.; Longo, G.

    2017-02-01

    A variety of fundamental astrophysical science topics require the determination of very accurate photometric redshifts (photo-z). A plethora of methods have been developed, based either on template model fitting or on empirical explorations of the photometric parameter space. Machine-learning-based techniques are not explicitly dependent on physical priors and are able to produce accurate photo-z estimates within the photometric ranges derived from the spectroscopic training set. These estimates, however, are not easy to characterize in terms of a photo-z probability density function (PDF), due to the fact that the analytical relation mapping the photometric parameters on to the redshift space is virtually unknown. We present METAPHOR (Machine-learning Estimation Tool for Accurate PHOtometric Redshifts), a method designed to provide a reliable PDF of the error distribution for empirical techniques. The method is implemented as a modular workflow, whose internal engine for photo-z estimation makes use of the MLPQNA neural network (Multi Layer Perceptron with Quasi Newton learning rule), with the possibility to easily replace the specific machine-learning model chosen to predict photo-z. We present a summary of results on SDSS-DR9 galaxy data, used also to perform a direct comparison with PDFs obtained by the LE PHARE spectral energy distribution template fitting. We show that METAPHOR is capable of estimating the precision and reliability of photometric redshifts obtained with three different self-adaptive techniques, i.e. MLPQNA, Random Forest and the standard K-Nearest Neighbors models.

  3. In search of a statistical probability model for petroleum-resource assessment : a critique of the probabilistic significance of certain concepts and methods used in petroleum-resource assessment : to that end, a probabilistic model is sketched

    USGS Publications Warehouse

    Grossling, Bernardo F.

    1975-01-01

    Exploratory drilling is still in incipient or youthful stages in those areas of the world where the bulk of the potential petroleum resources is yet to be discovered. Methods of assessing resources from projections based on historical production and reserve data are limited to mature areas. For most of the world's petroleum-prospective areas, a more speculative situation calls for a critical review of resource-assessment methodology. The language of mathematical statistics is required to define more rigorously the appraisal of petroleum resources. Basically, two approaches have been used to appraise the amounts of undiscovered mineral resources in a geologic province: (1) projection models, which use statistical data on the past outcome of exploration and development in the province; and (2) estimation models of the overall resources of the province, which use certain known parameters of the province together with the outcome of exploration and development in analogous provinces. These two approaches often lead to widely different estimates. Some of the controversy that arises results from a confusion of the probabilistic significance of the quantities yielded by each of the two approaches. Also, inherent limitations of analytic projection models--such as those using the logistic and Gompertz functions--have often been ignored. The resource-assessment problem should be recast in terms that provide for consideration of the probability of existence of the resource and of the probability of discovery of a deposit. Then the two above-mentioned models occupy the two ends of the probability range. The new approach accounts for (1) what can be expected with reasonably high certainty by mere projections of what has been accomplished in the past; (2) the inherent biases of decision-makers and resource estimators; (3) upper bounds that can be set up as goals for exploration; and (4) the uncertainties in geologic conditions in a search for minerals. Actual outcomes can then

  4. The Significance of Acid/Base Properties in Drug Discovery

    PubMed Central

    Manallack, David T.; Prankerd, Richard J.; Yuriev, Elizabeth; Oprea, Tudor I.; Chalmers, David K.

    2013-01-01

    While drug discovery scientists take heed of various guidelines concerning drug-like character, the influence of acid/base properties often remains under-scrutinised. Ionisation constants (pKa values) are fundamental to the variability of the biopharmaceutical characteristics of drugs and to underlying parameters such as logD and solubility. pKa values affect physicochemical properties such as aqueous solubility, which in turn influences drug formulation approaches. More importantly, absorption, distribution, metabolism, excretion and toxicity (ADMET) are profoundly affected by the charge state of compounds under varying pH conditions. Consideration of pKa values in conjunction with other molecular properties is of great significance and has the potential to be used to further improve the efficiency of drug discovery. Given the recent low annual output of new drugs from pharmaceutical companies, this review will provide a timely reminder of an important molecular property that influences clinical success. PMID:23099561
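
    As a reminder of why pKa drives charge state, the Python sketch below applies the Henderson-Hasselbalch relation to give the ionized fraction of an acidic or basic group at a given pH; the example pKa values are generic illustrations, not data from the review.

    def ionized_fraction(pka, ph, kind="acid"):
        if kind == "acid":                      # fraction present as the anion A-
            return 1.0 / (1.0 + 10 ** (pka - ph))
        return 1.0 / (1.0 + 10 ** (ph - pka))   # base: fraction present as BH+

    print(f"carboxylic acid, pKa 4.5 at pH 7.4: {ionized_fraction(4.5, 7.4):.3f} ionized")
    print(f"basic amine, pKa 9.5 at pH 7.4: {ionized_fraction(9.5, 7.4, 'base'):.3f} ionized")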

  5. Variable selection in large margin classifier-based probability estimation with high-dimensional predictors.

    PubMed

    Shin, Seung Jun; Wu, Yichao

    2014-07-01

    This is a discussion of the papers: "Probability estimation with machine learning methods for dichotomous and multicategory outcome: Theory" by Jochen Kruppa, Yufeng Liu, Gérard Biau, Michael Kohler, Inke R. König, James D. Malley, and Andreas Ziegler; and "Probability estimation with machine learning methods for dichotomous and multicategory outcome: Applications" by Jochen Kruppa, Yufeng Liu, Hans-Christian Diener, Theresa Holste, Christian Weimar, Inke R. König, and Andreas Ziegler.

  6. Evidence-Based Medicine as a Tool for Undergraduate Probability and Statistics Education

    PubMed Central

    Masel, J.; Humphrey, P. T.; Blackburn, B.; Levine, J. A.

    2015-01-01

    Most students have difficulty reasoning about chance events, and misconceptions regarding probability can persist or even strengthen following traditional instruction. Many biostatistics classes sidestep this problem by prioritizing exploratory data analysis over probability. However, probability itself, in addition to statistics, is essential both to the biology curriculum and to informed decision making in daily life. One area in which probability is particularly important is medicine. Given the preponderance of pre-health students, in addition to more general interest in medicine, we capitalized on students’ intrinsic motivation in this area to teach both probability and statistics. We use the randomized controlled trial as the centerpiece of the course, because it exemplifies the most salient features of the scientific method, and the application of critical thinking to medicine. The other two pillars of the course are biomedical applications of Bayes’ theorem and science and society content. Backward design from these three overarching aims was used to select appropriate probability and statistics content, with a focus on eliciting and countering previously documented misconceptions in their medical context. Pretest/posttest assessments using the Quantitative Reasoning Quotient and Attitudes Toward Statistics instruments are positive, bucking several negative trends previously reported in statistics education. PMID:26582236
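
    A small worked example of the kind of biomedical Bayes'-theorem application such a course can use, sketched in Python: the post-test probability of disease after a positive result, computed from prevalence, sensitivity, and specificity (all numbers invented).

    def positive_predictive_value(prevalence, sensitivity, specificity):
        true_pos = prevalence * sensitivity
        false_pos = (1 - prevalence) * (1 - specificity)
        return true_pos / (true_pos + false_pos)

    # Even a good test gives a low post-test probability when the disease is rare:
    print(round(positive_predictive_value(0.01, 0.95, 0.90), 3))   # ~0.088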

  7. Evidence-Based Medicine as a Tool for Undergraduate Probability and Statistics Education.

    PubMed

    Masel, J; Humphrey, P T; Blackburn, B; Levine, J A

    2015-01-01

    Most students have difficulty reasoning about chance events, and misconceptions regarding probability can persist or even strengthen following traditional instruction. Many biostatistics classes sidestep this problem by prioritizing exploratory data analysis over probability. However, probability itself, in addition to statistics, is essential both to the biology curriculum and to informed decision making in daily life. One area in which probability is particularly important is medicine. Given the preponderance of pre-health students, in addition to more general interest in medicine, we capitalized on students' intrinsic motivation in this area to teach both probability and statistics. We use the randomized controlled trial as the centerpiece of the course, because it exemplifies the most salient features of the scientific method, and the application of critical thinking to medicine. The other two pillars of the course are biomedical applications of Bayes' theorem and science and society content. Backward design from these three overarching aims was used to select appropriate probability and statistics content, with a focus on eliciting and countering previously documented misconceptions in their medical context. Pretest/posttest assessments using the Quantitative Reasoning Quotient and Attitudes Toward Statistics instruments are positive, bucking several negative trends previously reported in statistics education.

  8. Probability and possibility-based representations of uncertainty in fault tree analysis.

    PubMed

    Flage, Roger; Baraldi, Piero; Zio, Enrico; Aven, Terje

    2013-01-01

    Expert knowledge is an important source of input to risk analysis. In practice, experts might be reluctant to characterize their knowledge and the related (epistemic) uncertainty using precise probabilities. The theory of possibility allows for imprecision in probability assignments. The associated possibilistic representation of epistemic uncertainty can be combined with, and transformed into, a probabilistic representation; in this article, we show this with reference to a simple fault tree analysis. We apply an integrated (hybrid) probabilistic-possibilistic computational framework for the joint propagation of the epistemic uncertainty on the values of the (limiting relative frequency) probabilities of the basic events of the fault tree, and we use possibility-probability (probability-possibility) transformations for propagating the epistemic uncertainty within purely probabilistic and possibilistic settings. The results of the different approaches (hybrid, probabilistic, and possibilistic) are compared with respect to the representation of uncertainty about the top event (limiting relative frequency) probability. Both the rationale underpinning the approaches and the computational efforts they require are critically examined. We conclude that the approaches relevant in a given setting depend on the purpose of the risk analysis, and that further research is required to make the possibilistic approaches operational in a risk analysis context.
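
    The Python sketch below is a deliberately simplified stand-in for the hybrid framework: basic-event probabilities given as intervals (a crude surrogate for alpha-cuts of possibility distributions) are propagated through a small coherent fault tree, TOP = (E1 AND E2) OR E3, by plugging in the interval endpoints. The tree, the independence assumption, and all numbers are illustrative assumptions.

    def top_event(p1, p2, p3):
        p_and = p1 * p2                         # AND gate, independent events
        return p_and + p3 - p_and * p3          # OR gate, independent events

    bounds = {"E1": (1e-3, 5e-3), "E2": (2e-3, 8e-3), "E3": (1e-4, 4e-4)}
    low = top_event(*(b[0] for b in bounds.values()))    # coherent tree: monotone,
    high = top_event(*(b[1] for b in bounds.values()))   # so endpoints give the bounds
    print(f"top-event probability in [{low:.2e}, {high:.2e}]")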

  9. Sample size planning for phase II trials based on success probabilities for phase III.

    PubMed

    Götte, Heiko; Schüler, Armin; Kirchner, Marietta; Kieser, Meinhard

    2015-01-01

    In recent years, high failure rates in phase III trials were observed. One of the main reasons is overoptimistic assumptions for the planning of phase III resulting from limited phase II information and/or unawareness of realistic success probabilities. We present an approach for planning a phase II trial in a time-to-event setting that considers the whole phase II/III clinical development programme. We derive stopping boundaries after phase II that minimise the number of events under side conditions for the conditional probabilities of correct go/no-go decision after phase II as well as the conditional success probabilities for phase III. In addition, we give general recommendations for the choice of phase II sample size. Our simulations show that unconditional probabilities of go/no-go decision as well as the unconditional success probabilities for phase III are influenced by the number of events observed in phase II. However, choosing more than 150 events in phase II seems not necessary as the impact on these probabilities then becomes quite small. We recommend considering aspects like the number of compounds in phase II and the resources available when determining the sample size. The lower the number of compounds and the lower the resources are for phase III, the higher the investment for phase II should be.

  10. United States streamflow probabilities based on forecasted La Nina, winter-spring 2000

    USGS Publications Warehouse

    Dettinger, M.D.; Cayan, D.R.; Redmond, K.T.

    1999-01-01

    Although for the last 5 months the Tahiti-Darwin Southern Oscillation Index (SOI) has hovered close to normal, the “equatorial” SOI has remained in the La Niña category and predictions are calling for La Niña conditions this winter. In view of these predictions of continuing La Niña and as a direct extension of previous studies of the relations between El Niño-Southern Oscillation (ENSO) conditions and streamflow in the United States (e.g., Redmond and Koch, 1991; Cayan and Webb, 1992; Redmond and Cayan, 1994; Dettinger et al., 1998; Garen, 1998; Cayan et al., 1999; Dettinger et al., in press), the probabilities that United States streamflows from December 1999 through July 2000 will be in upper and lower thirds (terciles) of the historical records are estimated here. The processes that link ENSO to North American streamflow are discussed in detail in these diagnostics studies. Our justification for generating this forecast is threefold: (1) Cayan et al. (1999) recently have shown that ENSO influences on streamflow variations and extremes are proportionately larger than the corresponding precipitation teleconnections. (2) Redmond and Cayan (1994) and Dettinger et al. (in press) also have shown that the low-frequency evolution of ENSO conditions support long-lead correlations between ENSO and streamflow in many rivers of the conterminous United States. (3) In many rivers, significant (weeks-to-months) delays between precipitation and the release to streams of snowmelt or ground-water discharge can support even longer term forecasts of streamflow than is possible for precipitation. The relatively slow, orderly evolution of El Niño-Southern Oscillation episodes, the accentuated dependence of streamflow upon ENSO, and the long lags between precipitation and flow encourage us to provide the following analysis as a simple prediction of this year’s river flows.

  11. Model assisted probability of detection for a guided waves based SHM technique

    NASA Astrophysics Data System (ADS)

    Memmolo, V.; Ricci, F.; Maio, L.; Boffa, N. D.; Monaco, E.

    2016-04-01

    Guided wave (GW) Structural Health Monitoring (SHM) allows the health of aerostructures to be assessed thanks to its great sensitivity to the appearance of delaminations and/or debondings. Due to the several complexities affecting wave propagation in composites, an efficient GW SHM system requires effective quantification associated with a rigorous statistical evaluation procedure. The Probability of Detection (POD) approach is a commonly accepted measurement method to quantify NDI results, and it can be effectively extended to an SHM context. However, it requires a very complex setup arrangement and many coupons. When a rigorous correlation with measurements is adopted, Model Assisted POD (MAPOD) is an efficient alternative to classic methods. This paper is concerned with the identification of small emerging delaminations in composite structural components. An ultrasonic GW tomography method focused on impact damage detection in composite plate-like structures, recently developed by the authors, is investigated, establishing the basis for a more complex MAPOD analysis. Experimental tests carried out on a typical wing composite structure demonstrated the effectiveness of the modeling approach in detecting damage with the tomographic algorithm. Environmental disturbances, which affect signal waveforms and consequently damage detection, are considered by simulating mathematical noise in the modeling stage. A statistical method is used for an effective decision-making procedure. A Damage Index approach is implemented as the metric to interpret the signals collected from a distributed sensor network, and a subsequent graphic interpolation is carried out to reconstruct the damage appearance. Model validation and first reliability assessment results are provided, in view of quantifying system performance and optimizing it.
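
    As background on the POD quantification itself (generic hit/miss POD fitting, not the authors' MAPOD workflow), the Python sketch below fits a log-logistic POD curve to simulated detection outcomes versus damage size and reports the size detected 90% of the time; all data and parameters are synthetic.

    import numpy as np
    from scipy.optimize import minimize

    rng = np.random.default_rng(5)
    sizes = rng.uniform(1.0, 20.0, size=200)                     # damage sizes (mm)
    true_pod = 1 / (1 + np.exp(-(-3.0 + 2.0 * np.log(sizes))))
    hits = rng.random(200) < true_pod                             # simulated hit/miss data

    def neg_log_likelihood(params):
        b0, b1 = params
        p = np.clip(1 / (1 + np.exp(-(b0 + b1 * np.log(sizes)))), 1e-9, 1 - 1e-9)
        return -np.sum(hits * np.log(p) + (~hits) * np.log(1 - p))

    b0, b1 = minimize(neg_log_likelihood, x0=[0.0, 1.0]).x
    a90 = np.exp((np.log(0.9 / 0.1) - b0) / b1)                   # POD(a90) = 0.90
    print(f"a90 ~ {a90:.1f} mm")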

  12. Methods for estimating annual exceedance probability discharges for streams in Arkansas, based on data through water year 2013

    USGS Publications Warehouse

    Wagner, Daniel M.; Krieger, Joshua D.; Veilleux, Andrea G.

    2016-08-04

    In 2013, the U.S. Geological Survey initiated a study to update regional skew, annual exceedance probability discharges, and regional regression equations used to estimate annual exceedance probability discharges for ungaged locations on streams in the study area with the use of recent geospatial data, new analytical methods, and available annual peak-discharge data through the 2013 water year. An analysis of regional skew using Bayesian weighted least-squares/Bayesian generalized-least squares regression was performed for Arkansas, Louisiana, and parts of Missouri and Oklahoma. The newly developed constant regional skew of -0.17 was used in the computation of annual exceedance probability discharges for 281 streamgages used in the regional regression analysis. Based on analysis of covariance, four flood regions were identified for use in the generation of regional regression models. Thirty-nine basin characteristics were considered as potential explanatory variables, and ordinary least-squares regression techniques were used to determine the optimum combinations of basin characteristics for each of the four regions. Basin characteristics in candidate models were evaluated based on multicollinearity with other basin characteristics (variance inflation factor < 2.5) and statistical significance at the 95-percent confidence level (p ≤ 0.05). Generalized least-squares regression was used to develop the final regression models for each flood region. Average standard errors of prediction of the generalized least-squares models ranged from 32.76 to 59.53 percent, with the largest range in flood region D. Pseudo coefficients of determination of the generalized least-squares models ranged from 90.29 to 97.28 percent, with the largest range also in flood region D. The regional regression equations apply only to locations on streams in Arkansas where annual peak discharges are not substantially affected by regulation, diversion, channelization, backwater, or urbanization
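
    The Python sketch below illustrates only the multicollinearity screening step quoted above: it computes variance inflation factors for candidate basin characteristics so that variables with VIF >= 2.5 can be dropped before regression. The synthetic data and the plain least-squares implementation are assumptions, not the study's GLS procedure.

    import numpy as np

    def variance_inflation_factors(X):
        """VIF_j = 1 / (1 - R_j^2), regressing column j on the remaining columns."""
        vifs = []
        for j in range(X.shape[1]):
            y, Z = X[:, j], np.delete(X, j, axis=1)
            Z1 = np.column_stack([np.ones(len(Z)), Z])
            resid = y - Z1 @ np.linalg.lstsq(Z1, y, rcond=None)[0]
            vifs.append(1.0 / (resid.var() / y.var()))
        return np.array(vifs)

    rng = np.random.default_rng(6)
    area = rng.lognormal(3.0, 1.0, 300)
    slope = rng.lognormal(0.0, 0.5, 300)
    length = area ** 0.6 * rng.lognormal(0.0, 0.1, 300)           # correlated with area
    print(variance_inflation_factors(np.log(np.column_stack([area, slope, length]))))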

  13. Change of flood risk under climate change based on Discharge Probability Index in Japan

    NASA Astrophysics Data System (ADS)

    Nitta, T.; Yoshimura, K.; Kanae, S.; Oki, T.

    2010-12-01

    Water-related disasters under climate change have recently gained considerable interest, and there have been many studies referring to flood risk at the global scale (e.g. Milly et al., 2002; Hirabayashi et al., 2008). In order to build adaptive capacity, however, regional impact evaluation is needed. We thus focus on the flood risk over Japan in the present study. The output from the Regional Climate Model 20 (RCM20), which was developed by the Meteorological Research Institute, was used. The data were first compared with observed data based on the Automated Meteorological Data Acquisition System and ground weather observations, and the model biases were corrected using the ratio and difference of the 20-year mean values. The bias-corrected RCM20 atmospheric data were then used to force a land surface model and a river routing model (Yoshimura et al., 2007; Ngo-Duc, T. et al. 2007) to simulate river discharge during 1981-2000, 2031-2050, and 2081-2100. Simulated river discharge was converted to the Discharge Probability Index (DPI), which was proposed by Yoshimura et al. based on a statistical approach. The bias and uncertainty of the models are already taken into account in the concept of DPI, so that DPI serves as a good indicator of flood risk. We estimated the statistical parameters for DPI using the river discharge for 1981-2000, with the assumption that the parameters stay the same in the different climate periods. We then evaluated the occurrence of flood events corresponding to DPI categories in each 20-year period and averaged them over 9 regions. The results indicate that low-DPI flood events (return period of 2 years) will become more frequent in 2031-2050 and high-DPI flood events (return period of 200 years) will become more frequent in 2081-2100 compared with the period 1981-2000, even though average precipitation will be larger during 2031-2050 than during 2081-2100 in most regions. This reflects the increased extreme precipitation during 2081-2100.

  14. Heightened odds of large earthquakes near istanbul: An interaction-based probability calculation

    PubMed

    Parsons; Toda; Stein; Barka; Dieterich

    2000-04-28

    We calculate the probability of strong shaking in Istanbul, an urban center of 10 million people, from the description of earthquakes on the North Anatolian fault system in the Marmara Sea during the past 500 years and test the resulting catalog against the frequency of damage in Istanbul during the preceding millennium. Departing from current practice, we include the time-dependent effect of stress transferred by the 1999 moment magnitude M = 7.4 Izmit earthquake to faults nearer to Istanbul. We find a 62 +/- 15% probability (one standard deviation) of strong shaking during the next 30 years and 32 +/- 12% during the next decade.

  15. Heightened odds of large earthquakes near Istanbul: an interaction-based probability calculation

    USGS Publications Warehouse

    Parsons, T.; Toda, S.; Stein, R.S.; Barka, A.; Dieterich, J.H.

    2000-01-01

    We calculate the probability of strong shaking in Istanbul, an urban center of 10 million people, from the description of earthquakes on the North Anatolian fault system in the Marmara Sea during the past 500 years and test the resulting catalog against the frequency of damage in Istanbul during the preceding millennium. Departing from current practice, we include the time-dependent effect of stress transferred by the 1999 moment magnitude M = 7.4 Izmit earthquake to faults nearer to Istanbul. We find a 62 ± 15% probability (one standard deviation) of strong shaking during the next 30 years and 32 ± 12% during the next decade.

  16. Probability Theory

    NASA Astrophysics Data System (ADS)

    Jaynes, E. T.; Bretthorst, G. Larry

    2003-04-01

    Foreword; Preface; Part I. Principles and Elementary Applications: 1. Plausible reasoning; 2. The quantitative rules; 3. Elementary sampling theory; 4. Elementary hypothesis testing; 5. Queer uses for probability theory; 6. Elementary parameter estimation; 7. The central, Gaussian or normal distribution; 8. Sufficiency, ancillarity, and all that; 9. Repetitive experiments, probability and frequency; 10. Physics of 'random experiments'; Part II. Advanced Applications: 11. Discrete prior probabilities, the entropy principle; 12. Ignorance priors and transformation groups; 13. Decision theory: historical background; 14. Simple applications of decision theory; 15. Paradoxes of probability theory; 16. Orthodox methods: historical background; 17. Principles and pathology of orthodox statistics; 18. The Ap distribution and rule of succession; 19. Physical measurements; 20. Model comparison; 21. Outliers and robustness; 22. Introduction to communication theory; References; Appendix A. Other approaches to probability theory; Appendix B. Mathematical formalities and style; Appendix C. Convolutions and cumulants.

  17. How the Probability and Potential Clinical Significance of Pharmacokinetically Mediated Drug-Drug Interactions Are Assessed in Drug Development: Desvenlafaxine as an Example

    PubMed Central

    Nichols, Alice I.; Preskorn, Sheldon H.

    2015-01-01

    Objective: The avoidance of adverse drug-drug interactions (DDIs) is a high priority in terms of both the US Food and Drug Administration (FDA) and the individual prescriber. With this perspective in mind, this article illustrates the process for assessing the risk of a drug (example here being desvenlafaxine) causing or being the victim of DDIs, in accordance with FDA guidance. Data Sources/Study Selection: DDI studies for the serotonin-norepinephrine reuptake inhibitor desvenlafaxine conducted by the sponsor and published since 2009 are used as examples of the systematic way that the FDA requires drug developers to assess whether their new drug is either capable of causing clinically meaningful DDIs or being the victim of such DDIs. In total, 8 open-label studies tested the effects of steady-state treatment with desvenlafaxine (50–400 mg/d) on the pharmacokinetics of cytochrome (CYP) 2D6 and/or CYP 3A4 substrate drugs, or the effect of CYP 3A4 inhibition on desvenlafaxine pharmacokinetics. The potential for DDIs mediated by the P-glycoprotein (P-gp) transporter was assessed in in vitro studies using Caco-2 monolayers. Data Extraction: Changes in area under the plasma concentration-time curve (AUC; CYP studies) and efflux (P-gp studies) were reviewed for potential DDIs in accordance with FDA criteria. Results: Desvenlafaxine coadministration had minimal effect on CYP 2D6 and/or 3A4 substrates per FDA criteria. Changes in AUC indicated either no interaction (90% confidence intervals for the ratio of AUC geometric least-squares means [GM] within 80%–125%) or weak inhibition (AUC GM ratio 125% to < 200%). Coadministration with ketoconazole resulted in a weak interaction with desvenlafaxine (AUC GM ratio of 143%). Desvenlafaxine was not a substrate (efflux ratio < 2) or inhibitor (50% inhibitory drug concentration values > 250 μM) of P-gp. Conclusions: A 2-step process based on FDA guidance can be used first to determine whether a pharmacokinetically mediated
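
    The Python helper below simply encodes the classification thresholds quoted in this summary (no interaction if the 90% CI of the AUC geometric-mean ratio lies within 80%-125%; weak inhibition if the ratio is 125% to < 200%). It is a sketch, not an official FDA tool, and the confidence-interval endpoints in the example are invented.

    def classify_interaction(gm_ratio_pct, ci_low_pct, ci_high_pct):
        if 80.0 <= ci_low_pct and ci_high_pct <= 125.0:
            return "no interaction"
        if 125.0 <= gm_ratio_pct < 200.0:
            return "weak inhibition"
        return "further evaluation needed"      # stronger or inconclusive interaction

    # AUC GM ratio of 143% (as reported for desvenlafaxine with ketoconazole);
    # the 90% CI endpoints here are invented for illustration.
    print(classify_interaction(143.0, 130.0, 158.0))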

  18. Estimating Promotion Probabilities of Navy Officers Based on Individual’s Attributes and Other Global Effects

    DTIC Science & Technology

    2012-09-01

    The approach incorporates macroeconomic and policy-level information. In the first step, the conditional probabilities of staying in or leaving the Navy are estimated... The model accommodates time-dependent information, cohort information, and censoring problems with the data, as well as incorporating macroeconomic and policy-level information.

  19. Implicit Segmentation of a Stream of Syllables Based on Transitional Probabilities: An MEG Study

    ERIC Educational Resources Information Center

    Teinonen, Tuomas; Huotilainen, Minna

    2012-01-01

    Statistical segmentation of continuous speech, i.e., the ability to utilise transitional probabilities between syllables in order to detect word boundaries, is reflected in the brain's auditory event-related potentials (ERPs). The N1 and N400 ERP components are typically enhanced for word onsets compared to random syllables during active…

  20. Sample Size Determination for Estimation of Sensor Detection Probabilities Based on a Test Variable

    DTIC Science & Technology

    2007-06-01

    Subject terms: Sample Size, Binomial Proportion, Confidence Interval, Coverage Probability, Experimental... The report reviews confidence interval methods for the binomial proportion, including the Wald confidence interval and the Wilson score confidence interval.
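
    For reference, the Python sketch below computes the two intervals named in the report, the Wald and Wilson score 95% confidence intervals for a binomial detection probability; the example counts are invented.

    import math

    def wald_interval(x, n, z=1.96):
        p = x / n
        half = z * math.sqrt(p * (1 - p) / n)
        return max(0.0, p - half), min(1.0, p + half)

    def wilson_interval(x, n, z=1.96):
        p = x / n
        denom = 1 + z ** 2 / n
        center = (p + z ** 2 / (2 * n)) / denom
        half = z * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2)) / denom
        return center - half, center + half

    print(wald_interval(18, 20))      # detections in 18 of 20 trials
    print(wilson_interval(18, 20))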

  1. A peptide-spectrum scoring system based on ion alignment, intensity, and pair probabilities.

    PubMed

    Risk, Brian A; Edwards, Nathan J; Giddings, Morgan C

    2013-09-06

    Peppy, the proteogenomic/proteomic search software, employs a novel method for assessing the match quality between an MS/MS spectrum and a theorized peptide sequence. The scoring system uses three score factors calculated with binomial probabilities: the probability that a fragment ion will randomly align with a peptide ion, the probability that the aligning ions will be selected from subsets of the most intense peaks, and the probability that the intensities of fragment ions identified as y-ions are greater than those of their counterpart b-ions. The scores produced by the method act as global confidence scores, which facilitate the accurate comparison of results and the estimation of false discovery rates. Peppy has been integrated into the meta-search engine PepArML to produce meaningful comparisons with Mascot, MSGF+, OMSSA, X!Tandem, k-Score and s-Score. For two of the four data sets examined with the PepArML analysis, Peppy exceeded the accuracy performance of the other scoring systems. Peppy is available for download at http://geneffects.com/peppy .
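
    The Python sketch below illustrates only the first of the three score factors described above: the binomial probability that at least k of n theoretical fragment ions align with spectrum peaks by chance, given a per-ion random-match probability. The numbers are invented, and the intensity and y/b-pair factors are omitted.

    from scipy import stats

    def random_alignment_probability(k_matched, n_ions, p_random_match):
        return stats.binom.sf(k_matched - 1, n_ions, p_random_match)   # P(X >= k)

    # 12 of 20 theoretical ions matched, 5% chance of a random per-ion alignment
    print(f"{random_alignment_probability(12, 20, 0.05):.2e}")   # smaller -> better match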

  2. Methods for estimating annual exceedance-probability discharges for streams in Iowa, based on data through water year 2010

    USGS Publications Warehouse

    Eash, David A.; Barnes, Kimberlee K.; Veilleux, Andrea G.

    2013-01-01

    A statewide study was performed to develop regional regression equations for estimating selected annual exceedance-probability statistics for ungaged stream sites in Iowa. The study area comprises streamgages located within Iowa and 50 miles beyond the State’s borders. Annual exceedance-probability estimates were computed for 518 streamgages by using the expected moments algorithm to fit a Pearson Type III distribution to the logarithms of annual peak discharges for each streamgage using annual peak-discharge data through 2010. The estimation of the selected statistics included a Bayesian weighted least-squares/generalized least-squares regression analysis to update regional skew coefficients for the 518 streamgages. Low-outlier and historic information were incorporated into the annual exceedance-probability analyses, and a generalized Grubbs-Beck test was used to detect multiple potentially influential low flows. Also, geographic information system software was used to measure 59 selected basin characteristics for each streamgage. Regional regression analysis, using generalized least-squares regression, was used to develop a set of equations for each flood region in Iowa for estimating discharges for ungaged stream sites with 50-, 20-, 10-, 4-, 2-, 1-, 0.5-, and 0.2-percent annual exceedance probabilities, which are equivalent to annual flood-frequency recurrence intervals of 2, 5, 10, 25, 50, 100, 200, and 500 years, respectively. A total of 394 streamgages were included in the development of regional regression equations for three flood regions (regions 1, 2, and 3) that were defined for Iowa based on landform regions and soil regions. Average standard errors of prediction range from 31.8 to 45.2 percent for flood region 1, 19.4 to 46.8 percent for flood region 2, and 26.5 to 43.1 percent for flood region 3. The pseudo coefficients of determination for the generalized least-squares equations range from 90.8 to 96.2 percent for flood region 1, 91.5 to 97

  3. Evaluation of tsunami potential based on conditional probability for specific zones of the Pacific tsunamigenic rim

    NASA Astrophysics Data System (ADS)

    Koravos, George Ch.; Yadav, R. B. S.; Tsapanos, Theodoros M.

    2015-09-01

    The Pacific tsunamigenic rim is one of the most tsunamigenic regions of the world, which has experienced large catastrophic tsunamis in the past, resulting in huge loss of lives and properties. In this study, probabilities of occurrence of large tsunamis with tsunami intensity (Soloviev-Imamura intensity scale) I ≥ 1.5, I ≥ 2.0, I ≥ 2.5, I ≥ 3.0, I ≥ 3.5 and I ≥ 4.0 have been calculated over the next 100 years in ten main tsunamigenic zones of the Pacific rim area using a homogeneous and complete tsunami catalogue covering the time period from 684 to 2011. In order to evaluate tsunami potential, we applied the conditional probability method in each zone by considering the inter-occurrence times between the successive tsunamis generated in the past, which follow the lognormal distribution. Thus, we assessed the probability of the next generation of large tsunamis in each zone by considering the time of the last tsunami occurrence. The a-posteriori occurrence of the last large tsunami has also been assessed, assuming that the time of the last occurrence coincides with the time of the event prior to the last one. The estimated a-posteriori probabilities exhibit satisfactory results in most of the zones, revealing a promising technique and confirming the reliability of the tsunami data used. Furthermore, the tsunami potential in different tsunamigenic zones is also expressed in terms of spatial maps of conditional probabilities for two levels of tsunami intensity, I ≥ 1.5 and I ≥ 2.5, during the next 10, 20, 50 and 100 years. Estimated results reveal that the conditional probabilities in the South America and Alaska-Aleutian zones for the larger tsunami intensity I ≥ 2.5 are in the range of 92-93%, much larger than in the Japan zone (69%), for a time period of 100 years, suggesting that these are the most vulnerable tsunamigenic zones. The spatial maps provide a brief atlas of tsunami potential in the Pacific rim area.
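
    The conditional-probability step described above can be written compactly; the Python sketch below evaluates P(T <= t + dt | T > t) for lognormal inter-occurrence times, with all parameter values invented.

    import numpy as np
    from scipy import stats

    def conditional_probability(t_elapsed, dt, mu_log, sigma_log):
        """P(next event within dt years | t_elapsed years since the last one)."""
        F = stats.lognorm(s=sigma_log, scale=np.exp(mu_log)).cdf
        return (F(t_elapsed + dt) - F(t_elapsed)) / (1.0 - F(t_elapsed))

    print(conditional_probability(t_elapsed=40.0, dt=100.0, mu_log=4.0, sigma_log=0.8))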

  4. Pure perceptual-based learning of second-, third-, and fourth-order sequential probabilities.

    PubMed

    Remillard, Gilbert

    2011-07-01

    There is evidence that sequence learning in the traditional serial reaction time task (SRTT), where target location is the response dimension, and sequence learning in the perceptual SRTT, where target location is not the response dimension, are handled by different mechanisms. The ability of the latter mechanism to learn sequential contingencies that can be learned by the former mechanism was examined. Prior research has established that people can learn second-, third-, and fourth-order probabilities in the traditional SRTT. The present study reveals that people can learn such probabilities in the perceptual SRTT. This suggests that the two mechanisms may have similar architectures. A possible neural basis of the two mechanisms is discussed.

  5. MEASUREMENT OF CHILDREN'S EXPOSURE TO PESTICIDES: ANALYSIS OF URINARY METABOLITE LEVELS IN A PROBABILITY-BASED SAMPLE

    EPA Science Inventory

    The Minnesota Children's Pesticide Exposure Study is a probability-based sample of 102 children 3-13 years old who were monitored for commonly used pesticides. During the summer of 1997, first-morning-void urine samples (1-3 per child) were obtained for 88% of study children a...

  6. Effect of Reinforcement Probability and Prize Size on Cocaine and Heroin Abstinence in Prize-Based Contingency Management

    ERIC Educational Resources Information Center

    Ghitza, Udi E.; Epstein, David H.; Schmittner, John; Vahabzadeh, Massoud; Lin, Jia-Ling; Preston, Kenzie L.

    2008-01-01

    Although treatment outcome in prize-based contingency management has been shown to depend on reinforcement schedule, the optimal schedule is still unknown. Therefore, we conducted a retrospective analysis of data from a randomized clinical trial (Ghitza et al., 2007) to determine the effects of the probability of winning a prize (low vs. high) and…

  7. Lake Superior Zooplankton Biomass Predictions from LOPC Tow Surveys Compare Well with a Probability Based Net Survey

    EPA Science Inventory

    We conducted a probability-based sampling of Lake Superior in 2006 and compared the zooplankton biomass estimate with laser optical plankton counter (LOPC) predictions. The net survey consisted of 52 sites stratified across three depth zones (0-30, 30-150, >150 m). The LOPC tow...

  8. Computer-Based Graphical Displays for Enhancing Mental Animation and Improving Reasoning in Novice Learning of Probability

    ERIC Educational Resources Information Center

    Kaplan, Danielle E.; Wu, Erin Chia-ling

    2006-01-01

    Our research suggests static and animated graphics can lead to more animated thinking and more correct problem solving in computer-based probability learning. Pilot software modules were developed for graduate online statistics courses and representation research. A study with novice graduate student statisticians compared problem solving in five…

  9. Quantization of probability distributions under norm-based distortion measures II: Self-similar distributions

    NASA Astrophysics Data System (ADS)

    Delattre, Sylvain; Graf, Siegfried; Luschgy, Harald; Pages, Gilles

    2006-06-01

    For a probability measure P on ℝ^d and n ∈ ℕ, consider e_n = inf ∫ min_{a ∈ α} V(‖x − a‖) dP(x), where the infimum is taken over all subsets α of ℝ^d with card(α) ≤ n and V is a nondecreasing function. Under certain conditions on V, we derive the precise n-asymptotics of e_n for self-similar distributions P and we find the asymptotic performance of optimal quantizers using weighted empirical measures.

  10. Effect of reinforcement probability and prize size on cocaine and heroin abstinence in prize-based contingency management.

    PubMed

    Ghitza, Udi E; Epstein, David H; Schmittner, John; Vahabzadeh, Massoud; Lin, Jia-Ling; Preston, Kenzie L

    2008-01-01

    Although treatment outcome in prize-based contingency management has been shown to depend on reinforcement schedule, the optimal schedule is still unknown. Therefore, we conducted a retrospective analysis of data from a randomized clinical trial (Ghitza et al., 2007) to determine the effects of the probability of winning a prize (low vs. high) and the size of the prize won (small, large, or jumbo) on likelihood of abstinence until the next urine-collection day for heroin and cocaine users (N=116) in methadone maintenance. Higher probability of winning, but not the size of individual prizes, was associated with a greater percentage of cocaine-negative, but not opiate-negative, urines.

  11. Probability-based non-local means filter for speckle noise suppression in optical coherence tomography images.

    PubMed

    Yu, Hancheng; Gao, Jianlin; Li, Aiting

    2016-03-01

    In this Letter, a probability-based non-local means filter is proposed for speckle reduction in optical coherence tomography (OCT). Originally developed for additive white Gaussian noise, the non-local means filter is not suitable for multiplicative speckle noise suppression. This Letter presents a two-stage non-local means algorithm using the uncorrupted probability of each pixel to effectively reduce speckle noise in OCT. Experiments on real OCT images demonstrate that the proposed filter is competitive with other state-of-the-art speckle removal techniques and able to accurately preserve edges and structural details with small computational cost.

  12. Statistical analysis of blocking probability and fragmentation based on Markov modeling of elastic spectrum allocation on fiber link

    NASA Astrophysics Data System (ADS)

    Rosa, A. N. F.; Wiatr, P.; Cavdar, C.; Carvalho, S. V.; Costa, J. C. W. A.; Wosinska, L.

    2015-11-01

    In an Elastic Optical Network (EON), spectrum fragmentation refers to the existence of non-aligned, small-sized blocks of free subcarrier slots in the optical spectrum. Several metrics have been proposed in order to quantify the level of spectrum fragmentation. Approximation methods might be used for estimating average blocking probability and some fragmentation measures, but are so far unable to accurately evaluate the influence of different sizes of connection requests and do not allow in-depth investigation of blocking events and their relation to fragmentation. The analytical study of the effect of fragmentation on requests' blocking probability is still under-explored. In this work, we introduce new definitions for blocking that differentiate between the reasons for the blocking events. We developed a framework based on Markov modeling to calculate steady-state probabilities for the different blocking events and to analyze fragmentation-related problems in elastic optical links under dynamic traffic conditions. This framework can also be used for evaluation of different definitions of fragmentation in terms of their relation to the blocking probability. We investigate how different allocation request sizes contribute to fragmentation and blocking probability. Moreover, we show to which extent blocking events due to an insufficient amount of available resources become inevitable and, by comparison with the amount of blocking events due to fragmented spectrum, we draw conclusions on the possible gains one can achieve by system defragmentation. We also show how efficient spectrum allocation policies really are in reducing the part of fragmentation that in particular leads to actual blocking events. Simulation experiments are carried out showing a good match with our analytical results for blocking probability in a small-scale scenario. Simulated blocking probabilities for the different blocking events are provided for a larger-scale elastic optical link.

  13. Lexicographic Probability, Conditional Probability, and Nonstandard Probability

    DTIC Science & Technology

    2009-11-11

    the following conditions: CP1. µ(U |U) = 1 if U ∈ F′. CP2. µ(V1 ∪ V2 |U) = µ(V1 |U) + µ(V2 |U) if V1 ∩ V2 = ∅, U ∈ F′, and V1, V2 ∈ F. CP3. µ(V |U) = µ(V |X) × µ(X |U) if V ⊆ X ⊆ U, U, X ∈ F′, V ∈ F. Note that it follows from CP1 and CP2 that µ(· |U) is a probability measure on (W, F) (and, in... CP2 hold. This is easily seen to determine µ. Moreover, µ vacuously satisfies CP3, since there do not exist distinct sets U and X in F′ such that U

  14. Guide waves-based multi-damage identification using a local probability-based diagnostic imaging method

    NASA Astrophysics Data System (ADS)

    Gao, Dongyue; Wu, Zhanjun; Yang, Lei; Zheng, Yuebin

    2016-04-01

    Multi-damage identification is an important and challenging task in the research of guide waves-based structural health monitoring. In this paper, a multi-damage identification method is presented using a guide waves-based local probability-based diagnostic imaging (PDI) method. The method includes a path damage judgment stage, a multi-damage judgment stage and a multi-damage imaging stage. First, damage imaging is performed by partition. The damage imaging regions are divided into beside damage signal paths. The difference in guided wave propagation characteristics between cross damage and beside damage paths is established by theoretical analysis of the guided wave signal features. The time-of-flight difference of paths is used as a factor to distinguish between cross damage and beside damage paths. Then, a global PDI method (damage identification using all paths in the sensor network) is performed using the beside damage path network. If the global PDI damage zone crosses the beside damage path, it means that the discrete multi-damage model (such as a group of holes or cracks) has been misjudged as a continuum single-damage model (such as a single hole or crack) by the global PDI method. Subsequently, damage imaging regions are separated by beside damage paths and local PDI (damage identification using paths in the damage imaging regions) is performed in each damage imaging region. Finally, multi-damage identification results are obtained by superimposing the local damage imaging results and the marked cross damage paths. The method is employed to inspect multiple damage sites in an aluminum plate with a surface-mounted piezoelectric ceramic sensor network. The results show that the guide waves-based multi-damage identification method is capable of visualizing the presence, quantity and location of structural damage.

  15. Ethanol, not detectably metabolized in brain, significantly reduces brain metabolism, probably via action at specific GABA(A) receptors and has measureable metabolic effects at very low concentrations.

    PubMed

    Rae, Caroline D; Davidson, Joanne E; Maher, Anthony D; Rowlands, Benjamin D; Kashem, Mohammed A; Nasrallah, Fatima A; Rallapalli, Sundari K; Cook, James M; Balcar, Vladimir J

    2014-04-01

    Ethanol is a known neuromodulatory agent with reported actions at a range of neurotransmitter receptors. Here, we measured the effect of alcohol on metabolism of [3-¹³C]pyruvate in the adult Guinea pig brain cortical tissue slice and compared the outcomes to those from a library of ligands active in the GABAergic system, as well as studying the metabolic fate of [1,2-¹³C]ethanol. Analyses of metabolic profile clusters suggest that the significant reductions in metabolism induced by ethanol (10, 30 and 60 mM) are via action at neurotransmitter receptors, particularly α4β3δ receptors, whereas very low concentrations of ethanol may produce metabolic responses owing to release of GABA via GABA transporter 1 (GAT1) and the subsequent interaction of this GABA with local α5- or α1-containing GABA(A)R. There was no measurable metabolism of [1,2-¹³C]ethanol, with no significant incorporation of ¹³C from [1,2-¹³C]ethanol into any measured metabolite above natural abundance, although there were measurable effects on total metabolite sizes similar to those seen with unlabelled ethanol.

  16. A satellite rainfall retrieval technique over northern Algeria based on the probability of rainfall intensities classification from MSG-SEVIRI

    NASA Astrophysics Data System (ADS)

    Lazri, Mourad; Ameur, Soltane

    2016-09-01

    In this paper, an algorithm for rainfall estimation from Meteosat Second Generation/Spinning Enhanced Visible and Infrared Imager (MSG-SEVIRI) data, based on the classification of rainfall intensity probabilities, has been developed. The classification scheme uses various spectral parameters of SEVIRI that provide information about cloud top temperature and optical and microphysical cloud properties. The presented method is developed and trained for the north of Algeria. The calibration of the method is carried out using, as a reference, rain classification fields derived from radar for the rainy season from November 2006 to March 2007. Rainfall rates are assigned to rain areas previously identified and classified according to the precipitation formation processes. The comparisons between satellite-derived precipitation estimates and validation data show that the developed scheme performs reasonably well. Indeed, the correlation coefficient is significant (r = 0.87). The values of POD, POFD and FAR are 80%, 13% and 25%, respectively. Also, for a rainfall estimate of about 614 mm, the RMSD, Bias, MAD and PD are 102.06 mm, 2.18 mm, 68.07 mm and 12.58, respectively.
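
    The categorical scores quoted above follow from a standard 2x2 contingency table; the sketch below shows the usual definitions of POD, POFD and FAR. The counts are invented for illustration and are not the study's verification data.

```python
# Categorical verification scores as commonly defined from a 2x2 contingency
# table (hits, misses, false alarms, correct negatives).  The counts below are
# invented for illustration; they are not the values from the study.
hits, misses, false_alarms, correct_negatives = 800, 200, 270, 1730

pod = hits / (hits + misses)                               # probability of detection
pofd = false_alarms / (false_alarms + correct_negatives)   # probability of false detection
far = false_alarms / (hits + false_alarms)                 # false alarm ratio

print(f"POD  = {pod:.2%}")
print(f"POFD = {pofd:.2%}")
print(f"FAR  = {far:.2%}")
```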

  17. Prediction of nucleic acid binding probability in proteins: a neighboring residue network based score.

    PubMed

    Miao, Zhichao; Westhof, Eric

    2015-06-23

    We describe a general binding score for predicting the nucleic acid binding probability in proteins. The score is directly derived from physicochemical and evolutionary features and integrates a residue neighboring network approach. Our process achieves stable and high accuracies on both DNA- and RNA-binding proteins and illustrates how the main driving forces for nucleic acid binding are common. Because of the effective integration of the synergetic effects of the network of neighboring residues and the fact that the prediction yields a hierarchical scoring on the protein surface, energy funnels for nucleic acid binding appear on protein surfaces, pointing to the dynamic process occurring in the binding of nucleic acids to proteins.

  18. The Significance of Trust in School-Based Collaborative Leadership

    ERIC Educational Resources Information Center

    Coleman, Andrew

    2012-01-01

    The expectation that schools should work in partnership to promote the achievement of children has arguably been the defining feature of school policy over the last decade. This rise in school-to-school partnerships and increased emphasis on multi-agency-based interventions for vulnerable children have seen the emergence of a new form of school…

  19. Competency-based curricular design to encourage significant learning.

    PubMed

    Hurtubise, Larry; Roman, Brenda

    2014-07-01

    Most significant learning (SL) experiences produce long-lasting learning experiences that meaningfully change the learner's thinking, feeling, and/or behavior. Most significant teaching experiences involve strong connections with the learner and recognition that the learner felt changed by the teaching effort. L. Dee Fink, in Creating Significant Learning Experiences: An Integrated Approach to Designing College Courses, defines six kinds of learning goals: Foundational Knowledge, Application, Integration, Human Dimension, Caring, and Learning to Learn. SL occurs when learning experiences promote interaction between the different kinds of goals; for example, acquiring knowledge alone is not enough, but when it is paired with a learning experience, such as an effective patient experience as in Caring, then significant (and lasting) learning occurs. To promote SL, backward design principles that start with clearly defined learning goals and the context of the situation of the learner are particularly effective. Emphasis on defining assessment methods prior to developing teaching/learning activities is the key: this ensures that assessment (where the learner should be at the end of the educational activity/process) drives instruction and that assessment and learning/instruction are tightly linked so that assessment measures a defined outcome (competency) of the learner. Employing backward design and the AAMC's MedBiquitous standard vocabulary for medical education can help to ensure that curricular design and redesign efforts effectively enhance educational program quality and efficacy, leading to improved patient care. Such methods can promote successful careers in health care for learners through development of self-directed learning skills and active learning, in ways that help learners become fully committed to lifelong learning and continuous professional development.

  20. Avoidance based on shock intensity reduction with no change in shock probability.

    PubMed

    Bersh, P J; Alloy, L B

    1978-11-01

    Rats were trained on a free-operant avoidance procedure in which shock intensity was controlled by interresponse time. Shocks were random at a density of about 10 shocks per minute. Shock probability was response independent. As long as interresponse times remained less than the limit in effect, any shocks received were at the lower of two intensities (0.75 mA). Whenever interresponse times exceeded the limit, any shocks received were at the higher intensity (1.6 mA). The initial limit of 15 seconds was decreased in 3-second steps to either 6 or 3 seconds. All animals lever pressed to avoid higher intensity shock. As the interresponse time limit was reduced, the response rate during the lower intensity shock and the proportion of brief interresponse times increased. Substantial warmup effects were evident, particularly at the shorter interresponse-time limits. Shock intensity reduction without change in shock probability was effective in the acquisition and maintenance of avoidance responding, as well as in differentiation of interresponse times. This research suggests limitations on the generality of a safety signal interpretation of avoidance conditioning.

  1. Mesh-Based Entry Vehicle and Explosive Debris Re-Contact Probability Modeling

    NASA Technical Reports Server (NTRS)

    McPherson, Mark A.; Mendeck, Gavin F.

    2011-01-01

    Quantifying the risk to a crewed vehicle arising from potential re-contact with fragments from an explosive breakup of any jettisoned spacecraft segments during entry has long been sought. However, great difficulty lies in efficiently capturing the potential locations of each fragment and their collective threat to the vehicle. The method presented in this paper addresses this problem by using a stochastic approach that discretizes simulated debris pieces into volumetric cells, and then assesses strike probabilities accordingly. Combining spatial debris density and relative velocity between the debris and the entry vehicle, the strike probability can be calculated from the integral of the debris flux inside each cell over time. Using this technique, it is possible to assess the risk to an entry vehicle along an entire trajectory as it separates from the jettisoned segment. By decoupling the fragment trajectories from that of the entry vehicle, multiple potential separation maneuvers can then be evaluated rapidly to provide an assessment of the best strategy to mitigate the re-contact risk.
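
    The flux-integral idea can be sketched in a few lines: for each cell and time step, multiply the local debris number density by the debris-vehicle relative speed and the vehicle's projected area, accumulate the expected number of strikes, and convert to a strike probability under a Poisson assumption. Everything below (cell data, area, time step) is a hypothetical placeholder, not the NASA model.

```python
import numpy as np

# Discretized flux-integral sketch: expected strikes = sum over cells and time
# of (debris number density) * (relative speed) * (projected area) * dt, and
# P(strike) = 1 - exp(-expected strikes) under a Poisson assumption.
# All numbers below are hypothetical.
rng = np.random.default_rng(0)
n_cells, n_steps = 50, 100
dt = 0.1                                              # s, time step
area = 12.0                                           # m^2, vehicle projected area
density = rng.uniform(0, 1e-6, (n_steps, n_cells))    # fragments per m^3
rel_speed = rng.uniform(50, 300, (n_steps, n_cells))  # m/s, debris vs. vehicle
in_cell = rng.random((n_steps, n_cells)) < 0.02       # vehicle occupies this cell?

expected_strikes = np.sum(density * rel_speed * area * dt * in_cell)
p_strike = 1.0 - np.exp(-expected_strikes)
print(f"expected strikes = {expected_strikes:.3e}, P(strike) = {p_strike:.3e}")
```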

  2. A Probability-Based Alerting Logic for Aircraft on Parallel Approach

    NASA Technical Reports Server (NTRS)

    Carpenter, Brenda D.; Kuchar, James K.

    1997-01-01

    This document discusses the development and evaluation of an airborne collision alerting logic for aircraft on closely-spaced approaches to parallel runways. A novel methodology is used that links alerts to collision probabilities: alerting thresholds are set such that when the probability of a collision exceeds an acceptable hazard level, an alert is issued. The logic was designed to limit the hazard level to that estimated for the Precision Runway Monitoring system: one accident in every one thousand blunders which trigger alerts. When the aircraft were constrained to be co-altitude, evaluations of a two-dimensional version of the alerting logic show that the achieved hazard level is approximately one accident in every 250 blunders. Problematic scenarios have been identified and corrections to the logic can be made. The evaluations also show that over eighty percent of all unnecessary alerts were issued during scenarios in which the miss distance would have been less than 1000 ft, indicating that the alerts may have been justified. Also, no unnecessary alerts were generated during normal approaches.
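
    A minimal version of such an alerting rule can be written as a Monte Carlo estimate of collision probability followed by a threshold test. The simple kinematic model, the uncertainty values and the threshold below are placeholders for illustration, not the logic evaluated in the report.

```python
import numpy as np

# Monte Carlo sketch of a probability-based alerting rule: propagate the
# relative separation to a blundering intruder with uncertain closure rate,
# estimate P(miss distance < collision radius), and alert when that
# probability exceeds an acceptable hazard level.  All numbers are placeholders.
rng = np.random.default_rng(1)

def collision_probability(sep0_ft, closure_fps, sigma_fps, horizon_s,
                          collision_radius_ft=500.0, n=20000):
    closure = rng.normal(closure_fps, sigma_fps, n)         # sampled closure rates
    min_sep = sep0_ft - np.clip(closure, 0, None) * horizon_s
    return np.mean(min_sep < collision_radius_ft)

P_THRESHOLD = 1e-3            # acceptable hazard level (placeholder)
p = collision_probability(sep0_ft=3400, closure_fps=60, sigma_fps=25, horizon_s=30)
print(f"P(collision) ~ {p:.4f} ->", "ALERT" if p > P_THRESHOLD else "no alert")
```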

  3. Routing and wavelength allocation algorithm based on the weighted attack probability in software-defined optical networks

    NASA Astrophysics Data System (ADS)

    Zhao, Yongli; Zhang, Jie

    2017-02-01

    A routing and wavelength assignment (RWA) algorithm against high-power jamming based on software-defined optical networks (SDONs) is proposed. The SDON architecture is designed with power monitors at each node, which can collect abnormal power information from each port and wavelength. Based on the abnormal power information, a metric, the weighted attack probability (WAP), can be calculated. A WAP-based RWA algorithm (WAP-RWA) is proposed that considers the WAP values of each link and node along the selected lightpath. Numerical results show that the WAP-RWA algorithm achieves better performance in terms of blocking probability and resource utilization than the attack-aware dedicated path protection (AA-DPP) RWA (AA-DPP-RWA) algorithm, while providing protection comparable to that of the AA-DPP-RWA algorithm.
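
    The path-selection step can be illustrated with a toy weighted-attack-probability computation: each link receives a weight standing in for its monitored abnormal-power information, a candidate path's WAP is aggregated from its links, and the least-exposed path that still has a free wavelength is chosen. The topology, weights and aggregation rule are illustrative assumptions, not the published algorithm.

```python
# Toy WAP-style path selection.  Link weights stand in for attack probabilities
# derived from abnormal-power monitoring; a path's WAP is aggregated as
# 1 - prod(1 - w_link), and the candidate path with the lowest WAP that still
# has a common free wavelength is selected.  Topology and numbers are hypothetical.
link_wap = {("A", "B"): 0.05, ("B", "C"): 0.20, ("A", "D"): 0.02,
            ("D", "C"): 0.04, ("B", "D"): 0.10}
free_wavelengths = {("A", "B"): {1, 2}, ("B", "C"): {2}, ("A", "D"): {1, 3},
                    ("D", "C"): {1, 3}, ("B", "D"): {3}}
candidate_paths = [[("A", "B"), ("B", "C")], [("A", "D"), ("D", "C")]]

def path_wap(path):
    p_safe = 1.0
    for link in path:
        p_safe *= 1.0 - link_wap[link]
    return 1.0 - p_safe

def common_wavelength(path):
    common = set.intersection(*(free_wavelengths[l] for l in path))
    return min(common) if common else None

best = None
for path in sorted(candidate_paths, key=path_wap):   # least-exposed path first
    wl = common_wavelength(path)
    if wl is not None:                               # wavelength-continuity check
        best = (path, wl, round(path_wap(path), 4))
        break
print("selected:", best if best else "blocked")
```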

  4. Accuracies of the empirical theories of the escape probability based on Eigen model and Braun model compared with the exact extension of Onsager theory.

    PubMed

    Wojcik, Mariusz; Tachiya, M

    2009-03-14

    This paper deals with the exact extension of the original Onsager theory of the escape probability to the case of finite recombination rate at nonzero reaction radius. The empirical theories based on the Eigen model and the Braun model, which are applicable in the absence and presence of an external electric field, respectively, are based on a wrong assumption that both recombination and separation processes in geminate recombination follow exponential kinetics. The accuracies of the empirical theories are examined against the exact extension of the Onsager theory. The Eigen model gives the escape probability in the absence of an electric field, which is different by a factor of 3 from the exact one. We have shown that this difference can be removed by operationally redefining the volume occupied by the dissociating partner before dissociation, which appears in the Eigen model as a parameter. The Braun model gives the escape probability in the presence of an electric field, which is significantly different from the exact one over the whole range of electric fields. Appropriate modification of the original Braun model removes the discrepancy at zero or low electric fields, but it does not affect the discrepancy at high electric fields. In all the above theories it is assumed that recombination takes place only at the reaction radius. The escape probability in the case when recombination takes place over a range of distances is also calculated and compared with that in the case of recombination only at the reaction radius.

  5. Significance of hair-dye base-induced sensory irritation.

    PubMed

    Fujita, F; Azuma, T; Tajiri, M; Okamoto, H; Sano, M; Tominaga, M

    2010-06-01

    Oxidation hair-dyes, which are the principal hair-dyes, sometimes induce painful sensory irritation of the scalp caused by the combination of highly reactive substances, such as hydrogen peroxide and alkali agents. Although many cases of severe facial and scalp dermatitis have been reported following the use of hair-dyes, sensory irritation caused by contact of the hair-dye with the skin has not been reported clearly. In this study, we used a self-assessment questionnaire to measure the sensory irritation in various regions of the body caused by two model hair-dye bases that contained different amounts of alkali agents without dyes. Moreover, the occipital region was found to be an alternative region to the scalp for testing the sensory irritation of hair-dye bases. We used this region to evaluate the relationship of sensitivity with skin properties, such as trans-epidermal water loss (TEWL), stratum corneum water content, sebum amount, surface temperature, current perception threshold (CPT), catalase activities in tape-stripped skin and the sensory irritation score with the model hair-dye bases. The hair-dye-sensitive group showed higher TEWL, a lower sebum amount, a lower surface temperature and higher catalase activity than the insensitive group, and was similar in this respect to damaged skin. These results suggest that sensory irritation caused by hair-dye could occur easily on a damaged, dry scalp, as with the irritation caused by skin cosmetics reported previously.

  6. Model-Based Calculations of the Probability of a Country's Nuclear Proliferation Decisions

    SciTech Connect

    Li, Jun; Yim, Man-Sung; McNelis, David N.

    2007-07-01

    explain the occurrences of proliferation decisions. However, predicting major historical proliferation events using model-based predictions has been unreliable. Nuclear proliferation decisions by a country are affected by three main factors: (1) technology; (2) finance; and (3) political motivation [1]. Technological capability is important as nuclear weapons development needs special materials, a detonation mechanism, delivery capability, and the supporting human resources and knowledge base. Financial capability is likewise important as the development of the technological capabilities requires a serious financial commitment. It would be difficult for any state with a gross national product (GNP) significantly less than about $100 billion to devote enough annual governmental funding to a nuclear weapon program to actually achieve positive results within a reasonable time frame (i.e., 10 years). At the same time, nuclear proliferation is not a matter determined by a mastery of technical details or overcoming financial constraints. Technology or finance is a necessary condition but not a sufficient condition for nuclear proliferation. At the most fundamental level, the proliferation decision by a state is controlled by its political motivation. To effectively address the issue of predicting proliferation events, all three of the factors must be included in the model. To the knowledge of the authors, none of the existing models considered the 'technology' variable as part of the modeling. This paper presents an attempt to develop a methodology for statistically modeling and predicting a country's nuclear proliferation decisions. The approach is based on the combined use of data on a country's nuclear technical capability profile, economic development status, security environment factors, and internal political and cultural factors. All of the information utilized in the study was from open source literature. (authors)

  7. Implicit segmentation of a stream of syllables based on transitional probabilities: an MEG study.

    PubMed

    Teinonen, Tuomas; Huotilainen, Minna

    2012-02-01

    Statistical segmentation of continuous speech, i.e., the ability to utilise transitional probabilities between syllables in order to detect word boundaries, is reflected in the brain's auditory event-related potentials (ERPs). The N1 and N400 ERP components are typically enhanced for word onsets compared to random syllables during active listening. We used magnetoencephalography (MEG) to record event-related fields (ERFs) simultaneously with ERPs to syllables in a continuous sequence consisting of ten repeating tri-syllabic pseudowords and unexpected syllables presented between these pseudowords. We found the responses to differ between the syllables within the pseudowords and between the expected and unexpected syllables, reflecting an implicit process extracting the statistical characteristics of the sequence and monitoring for unexpected syllables.

  8. Corticostriatal connectivity fingerprints: Probability maps based on resting-state functional connectivity.

    PubMed

    Jaspers, Ellen; Balsters, Joshua H; Kassraian Fard, Pegah; Mantini, Dante; Wenderoth, Nicole

    2017-03-01

    Over the last decade, structure-function relationships have begun to encompass networks of brain areas rather than individual structures. For example, corticostriatal circuits have been associated with sensorimotor, limbic, and cognitive information processing, and damage to these circuits has been shown to produce unique behavioral outcomes in Autism, Parkinson's Disease, Schizophrenia and healthy ageing. However, it remains an open question how abnormal or absent connectivity can be detected at the individual level. Here, we provide a method for clustering gross morphological structures into subregions with unique functional connectivity fingerprints, and generate network probability maps usable as a baseline to compare individual cases against. We used connectivity metrics derived from resting-state fMRI (N = 100), in conjunction with hierarchical clustering methods, to parcellate the striatum into functionally distinct clusters. We identified three highly reproducible striatal subregions, across both hemispheres and in an independent replication dataset (N = 100) (dice-similarity values 0.40-1.00). Each striatal seed region resulted in a highly reproducible distinct connectivity fingerprint: the putamen showed predominant connectivity with cortical and cerebellar sensorimotor and language processing areas; the ventromedial striatum cluster had a distinct limbic connectivity pattern; the caudate showed predominant connectivity with the thalamus, frontal and occipital areas, and the cerebellum. Our corticostriatal probability maps agree with existing connectivity data in humans and non-human primates, and showed a high degree of replication. We believe that these maps offer an efficient tool to further advance hypothesis driven research and provide important guidance when investigating deviant connectivity in neurological patient populations suffering from e.g., stroke or cerebral palsy. Hum Brain Mapp 38:1478-1491, 2017. © 2016 Wiley Periodicals, Inc.
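
    The parcellation step, clustering voxels by the similarity of their connectivity fingerprints, can be sketched with standard hierarchical clustering tools. The synthetic fingerprints, the Ward linkage and the choice of three clusters below are illustrative assumptions rather than the study's actual pipeline.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Sketch: cluster striatal "voxels" by the similarity of their functional
# connectivity fingerprints (correlation-like profiles against cortical targets).
# The fingerprints are synthetic; three latent profiles are planted so the
# clustering has something to recover.
rng = np.random.default_rng(42)
n_targets = 20
profiles = rng.normal(0, 1, (3, n_targets))              # 3 latent fingerprints
labels_true = rng.integers(0, 3, 300)                     # 300 voxels
fingerprints = profiles[labels_true] + rng.normal(0, 0.5, (300, n_targets))

# Ward linkage on the fingerprint matrix, cut into 3 clusters.
Z = linkage(fingerprints, method="ward")
labels = fcluster(Z, t=3, criterion="maxclust")

print("cluster sizes:", np.bincount(labels)[1:])
```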

  9. Probability-based particle detection that enables threshold-free and robust in vivo single-molecule tracking

    PubMed Central

    Smith, Carlas S.; Stallinga, Sjoerd; Lidke, Keith A.; Rieger, Bernd; Grunwald, David

    2015-01-01

    Single-molecule detection in fluorescence nanoscopy has become a powerful tool in cell biology but can present vexing issues in image analysis, such as limited signal, unspecific background, empirically set thresholds, image filtering, and false-positive detection limiting overall detection efficiency. Here we present a framework in which expert knowledge and parameter tweaking are replaced with a probability-based hypothesis test. Our method delivers robust and threshold-free signal detection with a defined error estimate and improved detection of weaker signals. The probability value has consequences for downstream data analysis, such as weighing a series of detections and corresponding probabilities, Bayesian propagation of probability, or defining metrics in tracking applications. We show that the method outperforms all current approaches, yielding a detection efficiency of >70% and a false-positive detection rate of <5% under conditions down to 17 photons/pixel background and 180 photons/molecule signal, which is beneficial for any kind of photon-limited application. Examples include limited brightness and photostability, phototoxicity in live-cell single-molecule imaging, and use of new labels for nanoscopy. We present simulations, experimental data, and tracking of low-signal mRNAs in yeast cells. PMID:26424801

  10. Measurement of children's exposure to pesticides: analysis of urinary metabolite levels in a probability-based sample.

    PubMed Central

    Adgate, J L; Barr, D B; Clayton, C A; Eberly, L E; Freeman, N C; Lioy, P J; Needham, L L; Pellizzari, E D; Quackenboss, J J; Roy, A; Sexton, K

    2001-01-01

    The Minnesota Children's Pesticide Exposure Study is a probability-based sample of 102 children 3-13 years old who were monitored for commonly used pesticides. During the summer of 1997, first-morning-void urine samples (1-3 per child) were obtained for 88% of study children and analyzed for metabolites of insecticides and herbicides: carbamates and related compounds (1-NAP), atrazine (AM), malathion (MDA), and chlorpyrifos and related compounds (TCPy). TCPy was present in 93% of the samples, whereas 1-NAP, MDA, and AM were detected in 45%, 37%, and 2% of samples, respectively. Measured intrachild means ranged from 1.4 microg/L for MDA to 9.2 microg/L for TCPy, and there was considerable intrachild variability. For children providing three urine samples, geometric mean TCPy levels were greater than the detection limit in 98% of the samples, and nearly half the children had geometric mean 1-NAP and MDA levels greater than the detection limit. Interchild variability was significantly greater than intrachild variability for 1-NAP (p = 0.0037) and TCPy (p < 0.0001). The four metabolites measured were not correlated within urine samples, and children's metabolite levels did not vary systematically by sex, age, race, household income, or putative household pesticide use. On a log scale, mean TCPy levels were significantly higher in urban than in nonurban children (7.2 vs. 4.7 microg/L; p = 0.036). Weighted population mean concentrations were 3.9 [standard error (SE) = 0.7; 95% confidence interval (CI), 2.5, 5.3] microg/L for 1-NAP, 1.7 (SE = 0.3; 95% CI, 1.1, 2.3) microg/L for MDA, and 9.6 (SE = 0.9; 95% CI, 7.8, 11) microg/L for TCPy. The weighted population results estimate the overall mean and variability of metabolite levels for more than 84,000 children in the census tracts sampled. Levels of 1-NAP were lower than reported adult reference range concentrations, whereas TCPy concentrations were substantially higher. Concentrations of MDA were detected more frequently

  11. Microstructure-Sensitive Extreme Value Probabilities for High Cycle Fatigue of Ni-Base Superalloy IN100 (Preprint)

    DTIC Science & Technology

    2009-03-01

    transition fatigue regimes; however, microplasticity (i.e., heterogeneous plasticity at the scale of microstructure) is relevant to understanding fatigue... and Socie [57] considered the effect of microplasticity... considers the local stress state as affected by intergranular interactions and microplasticity. For the calculations given below, the volumes over which

  12. Probability Distribution Extraction from TEC Estimates based on Kernel Density Estimation

    NASA Astrophysics Data System (ADS)

    Demir, Uygar; Toker, Cenk; Çenet, Duygu

    2016-07-01

    Statistical analysis of the ionosphere, specifically the Total Electron Content (TEC), may reveal important information about its temporal and spatial characteristics. One of the core metrics that express the statistical properties of a stochastic process is its Probability Density Function (pdf). Furthermore, statistical parameters such as mean, variance and kurtosis, which can be derived from the pdf, may provide information about the spatial uniformity or clustering of the electron content. For example, the variance differentiates between a quiet ionosphere and a disturbed one, whereas kurtosis differentiates between a geomagnetic storm and an earthquake. Therefore, valuable information about the state of the ionosphere (and the natural phenomena that cause the disturbance) can be obtained by looking at the statistical parameters. In the literature, there are publications which try to fit the histogram of TEC estimates to some well-known pdfs, such as Gaussian, exponential, etc. However, constraining a histogram to fit a function with a fixed shape will increase the estimation error, and all the information extracted from such a pdf will continue to contain this error. With such techniques, it is highly likely that artificial characteristics not present in the original data will appear in the estimated pdf. In the present study, we use the Kernel Density Estimation (KDE) technique to estimate the pdf of the TEC. KDE is a non-parametric approach which does not impose a specific form on the TEC. As a result, better pdf estimates that almost perfectly fit the observed TEC values can be obtained compared to the techniques mentioned above. KDE is particularly good at representing the tail probabilities and outliers. We also calculate the mean, variance and kurtosis of the measured TEC values. The technique is applied to the ionosphere over Turkey, where the TEC values are estimated from the GNSS measurements from the TNPGN-Active (Turkish National Permanent
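
    The KDE step itself is standard; a minimal sketch using scipy.stats.gaussian_kde is shown below on a synthetic, bimodal "TEC" sample (invented for illustration, not TNPGN-Active data), together with the mean, variance and kurtosis mentioned above.

```python
import numpy as np
from scipy.stats import gaussian_kde, kurtosis

# Non-parametric pdf estimate of TEC values via Kernel Density Estimation.
# The "TEC" sample below is synthetic (a bimodal mixture), purely to show the
# mechanics; real estimates would come from GNSS-derived TEC.
rng = np.random.default_rng(7)
tec = np.concatenate([rng.normal(15, 3, 4000), rng.normal(35, 5, 1000)])  # TECU

kde = gaussian_kde(tec)                 # Gaussian kernels, rule-of-thumb bandwidth
grid = np.linspace(tec.min(), tec.max(), 200)
pdf = kde(grid)                         # estimated density on the grid

print(f"mean = {tec.mean():.2f} TECU, variance = {tec.var():.2f}")
print(f"excess kurtosis = {kurtosis(tec):.2f}")
print(f"pdf integrates to ~ {np.trapz(pdf, grid):.3f}")
```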

  13. A generic probability based model to derive regional patterns of crops in time and space

    NASA Astrophysics Data System (ADS)

    Wattenbach, Martin; Luedtke, Stefan; Redweik, Richard; van Oijen, Marcel; Balkovic, Juraj; Reinds, Gert Jan

    2015-04-01

    Croplands are not only the key to the human food supply; they also change the biophysical and biogeochemical properties of the land surface, leading to changes in the water cycle and energy partitioning, influencing soil erosion, and contributing substantially to the amount of greenhouse gases entering the atmosphere. The effects of croplands on the environment depend on the type of crop and the associated management, both of which are related to the site conditions, economic boundary settings and preferences of individual farmers. The method described here is designed to predict the most probable crop to appear at a given location and time. The method uses statistical crop area information on NUTS2 level from EUROSTAT and the Common Agricultural Policy Regionalized Impact Model (CAPRI) as observations. These crops are then spatially disaggregated to the 1 x 1 km grid scale within the region, using the assumption that the probability of a crop appearing at a given location and in a given year depends on (a) the suitability of the land for the cultivation of the crop, derived from the MARS Crop Yield Forecast System (MCYFS), and (b) expert knowledge of agricultural practices. The latter includes knowledge concerning the feasibility of one crop following another (e.g. a late-maturing crop might leave too little time for the establishment of a winter cereal crop) and the need to combat weed infestations or crop diseases. The model is implemented in R and PostGIS. The quality of the generated crop sequences per grid cell is evaluated on the basis of the statistics reported by the joint EU/CAPRI database. The assessment is given on NUTS2 level using per cent bias as a measure, with a threshold of 15% as minimum quality. The results clearly indicate that crops with a large relative share within the administrative unit are not as error-prone as crops that occupy only minor parts of the unit. However, roughly 40% still show an absolute per cent bias above the 15% threshold. This

  14. A Probability-Based Algorithm Using Image Sensors to Track the LED in a Vehicle Visible Light Communication System

    PubMed Central

    Huynh, Phat; Do, Trong-Hop; Yoo, Myungsik

    2017-01-01

    This paper proposes a probability-based algorithm to track the LED in vehicle visible light communication systems using a camera. In this system, the transmitters are the vehicles’ front and rear LED lights. The receivers are high speed cameras that take a series of images of the LEDs. The data embedded in the light is extracted by first detecting the position of the LEDs in these images. Traditionally, LEDs are detected according to pixel intensity. However, when the vehicle is moving, motion blur occurs in the LED images, making it difficult to detect the LEDs. Particularly at high speeds, some frames are blurred at a high degree, which makes it impossible to detect the LED as well as extract the information embedded in these frames. The proposed algorithm relies not only on the pixel intensity, but also on the optical flow of the LEDs and on statistical information obtained from previous frames. Based on this information, the conditional probability that a pixel belongs to a LED is calculated. Then, the position of LED is determined based on this probability. To verify the suitability of the proposed algorithm, simulations are conducted by considering the incidents that can happen in a real-world situation, including a change in the position of the LEDs at each frame, as well as motion blur due to the vehicle speed. PMID:28208637

  15. A Probability-Based Algorithm Using Image Sensors to Track the LED in a Vehicle Visible Light Communication System.

    PubMed

    Huynh, Phat; Do, Trong-Hop; Yoo, Myungsik

    2017-02-10

    This paper proposes a probability-based algorithm to track the LED in vehicle visible light communication systems using a camera. In this system, the transmitters are the vehicles' front and rear LED lights. The receivers are high speed cameras that take a series of images of the LEDs. The data embedded in the light is extracted by first detecting the position of the LEDs in these images. Traditionally, LEDs are detected according to pixel intensity. However, when the vehicle is moving, motion blur occurs in the LED images, making it difficult to detect the LEDs. Particularly at high speeds, some frames are blurred to a high degree, which makes it impossible to detect the LED as well as to extract the information embedded in these frames. The proposed algorithm relies not only on the pixel intensity, but also on the optical flow of the LEDs and on statistical information obtained from previous frames. Based on this information, the conditional probability that a pixel belongs to an LED is calculated. Then, the position of the LED is determined based on this probability. To verify the suitability of the proposed algorithm, simulations are conducted by considering the incidents that can happen in a real-world situation, including a change in the position of the LEDs at each frame, as well as motion blur due to the vehicle speed.
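
    The cue-combination idea can be sketched as follows: a per-pixel score is formed from the pixel intensity, the agreement of the local optical flow with the motion predicted from previous frames, and a spatial prior around the previous detection, and the scores are normalized into a probability map. The multiplicative fusion rule and all parameter values are illustrative assumptions, not the published algorithm.

```python
import numpy as np

# Toy per-pixel LED probability combining three cues: (a) pixel intensity,
# (b) agreement of the local optical flow with the LED's predicted motion, and
# (c) a spatial prior around the previously detected LED position.
# The fusion rule (product of normalized cues) and all parameters are
# illustrative assumptions.
rng = np.random.default_rng(3)
H, W = 60, 80
intensity = rng.uniform(0, 0.2, (H, W))
intensity[30:34, 40:44] += 0.8                      # bright (possibly blurred) LED

prev_pos = np.array([31.0, 41.0])                   # LED position in previous frame
pred_motion = np.array([1.0, 1.0])                  # predicted displacement (pixels)
flow = np.tile(pred_motion, (H, W, 1)) + rng.normal(0, 0.3, (H, W, 2))

yy, xx = np.mgrid[0:H, 0:W]
pred_pos = prev_pos + pred_motion
prior = np.exp(-((yy - pred_pos[0]) ** 2 + (xx - pred_pos[1]) ** 2) / (2 * 5.0 ** 2))

flow_err = np.linalg.norm(flow - pred_motion, axis=2)
flow_score = np.exp(-flow_err ** 2 / (2 * 0.5 ** 2))

score = intensity * flow_score * prior
prob = score / score.sum()                          # normalize to a probability map
led = np.unravel_index(np.argmax(prob), prob.shape)
print("most probable LED pixel:", led)
```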

  16. Towards smart prosthetic hand: Adaptive probability based skeletal muscle fatigue model.

    PubMed

    Kumar, Parmod; Sebastian, Anish; Potluri, Chandrasekhar; Urfer, Alex; Naidu, D; Schoen, Marco P

    2010-01-01

    Skeletal muscle force can be estimated using surface electromyographic (sEMG) signals. Usually, the surface location for the sensors is near the respective muscle motor unit points. Skeletal muscles generate a spatial EMG signal, which causes cross-talk between different sEMG signal sensors. In this study, an array of three sEMG sensors is used to capture the information of muscle dynamics in terms of sEMG signals. The recorded sEMG signals are filtered using optimized nonlinear Half-Gaussian Bayesian filter parameters, and the muscle force signal is filtered using a Chebyshev type-II filter. The filter optimization is accomplished using Genetic Algorithms. Three discrete-time state-space muscle fatigue models are obtained using system identification and modal transformation for the three sensor sets for a single motor unit. The outputs of these three muscle fatigue models are fused with a probabilistic Kullback Information Criterion (KIC) for model selection. The final fused output is estimated with an adaptive probability of KIC, which provides improved force estimates.

  17. Dictionary-based probability density function estimation for high-resolution SAR data

    NASA Astrophysics Data System (ADS)

    Krylov, Vladimir; Moser, Gabriele; Serpico, Sebastiano B.; Zerubia, Josiane

    2009-02-01

    In the context of remotely sensed data analysis, a crucial problem is the need to develop accurate models for the statistics of pixel intensities. In this work, we develop a parametric finite mixture model for the statistics of pixel intensities in high-resolution synthetic aperture radar (SAR) images. This method is an extension of a previously existing method for lower-resolution images. The method integrates the stochastic expectation maximization (SEM) scheme and the method of log-cumulants (MoLC) with an automatic technique to select, for each mixture component, an optimal parametric model taken from a predefined dictionary of parametric probability density functions (pdfs). The proposed dictionary consists of eight state-of-the-art SAR-specific pdfs: Nakagami, log-normal, generalized Gaussian Rayleigh, heavy-tailed Rayleigh, Weibull, K-root, Fisher and generalized Gamma. The designed scheme is endowed with a novel initialization procedure and an algorithm to automatically estimate the optimal number of mixture components. The experimental results with a set of several high-resolution COSMO-SkyMed images demonstrate the high accuracy of the designed algorithm, both from the viewpoint of a visual comparison of the histograms and from the viewpoint of quantitative accuracy measures such as the correlation coefficient (above 99.5%). The method proves to be effective on all the considered images, remaining accurate for multimodal and highly heterogeneous scenes.

  18. Filled pause refinement based on the pronunciation probability for lecture speech.

    PubMed

    Long, Yan-Hua; Ye, Hong

    2015-01-01

    Nowadays, although automatic speech recognition has become quite proficient in recognizing or transcribing well-prepared fluent speech, the transcription of speech that contains many disfluencies remains problematic, such as spontaneous conversational and lecture speech. Filled pauses (FPs) are the most frequently occurring disfluencies in this type of speech. Most recent studies have shown that FPs are widely believed to increase the error rates for state-of-the-art speech transcription, primarily because most FPs are not well annotated or provided in training data transcriptions and because of the similarities in acoustic characteristics between FPs and some common non-content words. To enhance the speech transcription system, we propose a new automatic refinement approach to detect FPs in British English lecture speech transcription. This approach combines the pronunciation probabilities for each word in the dictionary and acoustic language model scores for FP refinement through a modified speech recognition forced-alignment framework. We evaluate the proposed approach on the Reith Lectures speech transcription task, in which only imperfect training transcriptions are available. Successful results are achieved for both the development and evaluation datasets. Acoustic models trained on different styles of speech genres have been investigated with respect to FP refinement. To further validate the effectiveness of the proposed approach, speech transcription performance has also been examined using systems built on training data transcriptions with and without FP refinement.

  19. Development of new risk score for pre-test probability of obstructive coronary artery disease based on coronary CT angiography.

    PubMed

    Fujimoto, Shinichiro; Kondo, Takeshi; Yamamoto, Hideya; Yokoyama, Naoyuki; Tarutani, Yasuhiro; Takamura, Kazuhisa; Urabe, Yoji; Konno, Kumiko; Nishizaki, Yuji; Shinozaki, Tomohiro; Kihara, Yasuki; Daida, Hiroyuki; Isshiki, Takaaki; Takase, Shinichi

    2015-09-01

    Existing methods to calculate the pre-test probability of obstructive coronary artery disease (CAD) were established using selected high-risk patients referred to conventional coronary angiography. The purpose of this study is to develop and validate a new method for the pre-test probability of obstructive CAD using patients who underwent coronary CT angiography (CTA), which could be applicable to a wider patient population. Using 4137 consecutive patients with suspected CAD who underwent coronary CTA at our institution, a multivariate logistic regression model including clinical factors as covariates calculated the pre-test probability (K-score) of obstructive CAD determined by coronary CTA. The K-score was compared with the Duke clinical score using the area under the curve (AUC) for the receiver-operating characteristic curve. External validation was performed with an independent sample of 319 patients. The final model included eight significant predictors: age, gender, coronary risk factors (hypertension, diabetes mellitus, dyslipidemia, smoking), history of cerebral infarction, and chest symptoms. The AUC of the K-score was significantly greater than that of the Duke clinical score for both the derivation (0.736 vs. 0.699) and validation (0.714 vs. 0.688) data sets. Among patients who underwent coronary CTA, the newly developed K-score had better pre-test predictive ability for obstructive CAD than the Duke clinical score in a Japanese population.
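
    A score of this kind has the standard logistic form: the pre-test probability is the logistic transform of a weighted sum of the clinical predictors. The sketch below uses the predictor list from the abstract but entirely hypothetical coefficients; it is not the published K-score.

```python
import math

# Pre-test probability from a logistic regression model:
#   P(obstructive CAD) = 1 / (1 + exp(-(b0 + sum_i b_i * x_i))).
# The predictor list mirrors the abstract (age, sex, risk factors, history of
# cerebral infarction, chest symptoms), but every coefficient here is a
# hypothetical placeholder, not the published K-score.
coef = {"intercept": -6.0, "age_per_year": 0.06, "male": 0.9, "hypertension": 0.4,
        "diabetes": 0.6, "dyslipidemia": 0.4, "smoking": 0.5,
        "cerebral_infarction": 0.7, "typical_chest_pain": 1.2}

def pretest_probability(patient):
    z = coef["intercept"] + coef["age_per_year"] * patient["age"]
    z += sum(coef[k] for k, v in patient.items() if k != "age" and v)
    return 1.0 / (1.0 + math.exp(-z))

patient = {"age": 64, "male": 1, "hypertension": 1, "diabetes": 0, "dyslipidemia": 1,
           "smoking": 0, "cerebral_infarction": 0, "typical_chest_pain": 1}
print(f"pre-test probability ~ {pretest_probability(patient):.1%}")
```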

  20. Calibrating perceived understanding and competency in probability concepts: A diagnosis of learning difficulties based on Rasch probabilistic model

    NASA Astrophysics Data System (ADS)

    Mahmud, Zamalia; Porter, Anne; Salikin, Masniyati; Ghani, Nor Azura Md

    2015-12-01

    Students' understanding of probability concepts has been investigated from various perspectives. Competency, on the other hand, is often measured separately in the form of a test structure. This study set out to show that perceived understanding and competency can be calibrated and assessed together using Rasch measurement tools. Forty-four students from the STAT131 Understanding Uncertainty and Variation course at the University of Wollongong, NSW, volunteered to participate in the study. Rasch measurement, which is based on a probabilistic model, is used to calibrate the responses from two survey instruments and investigate the interactions between them. Data were captured from the e-learning platform Moodle, where students provided their responses through an online quiz. The study shows that the majority of the students perceived little understanding of conditional and independent events prior to learning about them but tended to demonstrate a slightly higher competency level afterward. Based on the Rasch map, there is an indication of some increase in learning and knowledge about some probability concepts at the end of the two-week lessons on probability.
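
    The underlying dichotomous Rasch model gives the probability of a correct (or endorsed) response as a logistic function of the difference between person ability and item difficulty, P = exp(theta - delta) / (1 + exp(theta - delta)). The sketch below evaluates this for made-up abilities and difficulties; the numbers are not calibration results from the study.

```python
import math

# Dichotomous Rasch model: P(X = 1 | theta, delta) = exp(theta - delta)
#                                                    / (1 + exp(theta - delta)),
# where theta is person ability and delta is item difficulty (both in logits).
# Abilities and difficulties below are invented for illustration.
def rasch_probability(theta, delta):
    return 1.0 / (1.0 + math.exp(-(theta - delta)))

abilities = {"student_A": -0.5, "student_B": 0.8}                      # logits
difficulties = {"independent events": -0.2, "conditional probability": 1.1}

for person, theta in abilities.items():
    for item, delta in difficulties.items():
        p = rasch_probability(theta, delta)
        print(f"{person} vs '{item}': P(success) = {p:.2f}")
```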

  1. Automatic Sleep Stage Determination by Multi-Valued Decision Making Based on Conditional Probability with Optimal Parameters

    NASA Astrophysics Data System (ADS)

    Wang, Bei; Sugi, Takenao; Wang, Xingyu; Nakamura, Masatoshi

    Data for human sleep studies may be affected by internal and external influences. The recorded sleep data contain complex and stochastic factors, which increase the difficulty of applying computerized sleep stage determination techniques in clinical practice. The aim of this study is to develop an automatic sleep stage determination system which is optimized for variable sleep data. The main methodology includes two modules: expert knowledge database construction and automatic sleep stage determination. Visual inspection by a qualified clinician is utilized to obtain the probability density function of parameters during the learning process of expert knowledge database construction. Parameter selection is introduced in order to make the algorithm flexible. Automatic sleep stage determination is then carried out based on conditional probability. The results showed close agreement with visual inspection by the clinician. The developed system can meet the customized requirements in hospitals and institutions.

  2. A Framework for the Statistical Analysis of Probability of Mission Success Based on Bayesian Theory

    DTIC Science & Technology

    2014-06-01

    Mission Success Prediction Capability (MSPC) is a Model-Based Systems Engineering (MBSE) approach to mission planning, used for the analysis of complex systems of precision strike (air-to-surface) weapons. This report focuses on holistically analysing complex systems, more specifically those of

  3. A mechanical model for predicting the probability of osteoporotic hip fractures based in DXA measurements and finite element simulation

    PubMed Central

    2012-01-01

    Background Osteoporotic hip fractures represent a major cause of disability, loss of quality of life and even mortality among the elderly population. Decisions on drug therapy are based on the assessment of risk factors for fracture from BMD measurements. The combination of biomechanical models with clinical studies could better estimate bone strength and support the specialists in their decisions. Methods A model to assess the probability of fracture, based on Damage and Fracture Mechanics, has been developed, evaluating the mechanical magnitudes involved in the fracture process from clinical BMD measurements. The model is intended to simulate the degenerative process in the skeleton, with the consequent loss of bone mass and hence the decrease of its mechanical resistance, which enables fracture under different traumas. Clinical studies were chosen, both under non-treatment conditions and under drug therapy, and fitted to specific patients according to their actual BMD measurements. The predictive model is applied in an FE simulation of the proximal femur. The fracture zone is determined according to the loading scenario (sideways fall, impact, accidental loads, etc.), using the mechanical properties of bone obtained from the evolutionary model corresponding to the considered time. Results BMD evolution in untreated patients and in those under different treatments was analyzed. Evolutionary curves of fracture probability were obtained from the evolution of mechanical damage. The evolutionary curve of the untreated group of patients presented a marked increase of the fracture probability, while the curves of patients under drug treatment showed variably decreased risks, depending on the therapy type. Conclusion The FE model allowed detailed maps of damage and fracture probability to be obtained, identifying high-risk local zones at the femoral neck and in the intertrochanteric and subtrochanteric areas, which are the typical locations of osteoporotic hip fractures. The

  4. Results from probability-based, simplified, off-shore Louisiana CSEM hydrocarbon reservoir modeling

    NASA Astrophysics Data System (ADS)

    Stalnaker, J. L.; Tinley, M.; Gueho, B.

    2009-12-01

    Perhaps the biggest impediment to the commercial application of controlled-source electromagnetic (CSEM) geophysics in marine hydrocarbon exploration is the inefficiency of modeling and data inversion. If an understanding of the typical (in a statistical sense) geometrical and electrical nature of a reservoir can be attained, then it is possible to derive therefrom a simplified yet accurate model of the electromagnetic interactions that produce a measured marine CSEM signal, leading ultimately to efficient modeling and inversion. We have compiled geometric and resistivity measurements from roughly 100 known, producing off-shore Louisiana Gulf of Mexico reservoirs. Recognizing that most reservoirs could be recreated roughly from a sectioned hemi-ellipsoid, we devised a unified, compact reservoir geometry description. Each reservoir was initially fit to the ellipsoid by eye, though we plan in the future to perform a more rigorous least-squares fit. We created, using kernel density estimation, initial probabilistic descriptions of reservoir parameter distributions, with the understanding that additional information would not fundamentally alter our results, but rather increase accuracy. From the probabilistic description, we designed an approximate model consisting of orthogonally oriented current segments distributed across the ellipsoid--enough to define the shape, yet few enough to be resolved during inversion. The moment and length of the currents are mapped to geometry and resistivity of the ellipsoid. The probability density functions (pdfs) derived from reservoir statistics serve as a workbench. We first use the pdfs in a Monte Carlo simulation designed to assess the detectability of off-shore Louisiana reservoirs using magnitude versus offset (MVO) anomalies. From the pdfs, many reservoir instances are generated (using rejection sampling) and each normalized MVO response is calculated. The response strength is summarized by numerically computing MVO power, and that
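
    The rejection-sampling step can be sketched generically: fit a kernel density to observed values of a reservoir parameter, then accept or reject uniform proposals against that density. The "observed" thicknesses below are synthetic placeholders, not the compiled Gulf of Mexico statistics.

```python
import numpy as np
from scipy.stats import gaussian_kde

# Rejection sampling of a reservoir parameter (e.g., thickness in metres) from
# a kernel-density estimate of its distribution.  The "observed" thicknesses
# are synthetic placeholders, not the compiled Gulf of Mexico data.
rng = np.random.default_rng(11)
observed_thickness = rng.lognormal(mean=3.0, sigma=0.4, size=100)   # ~20 m median
pdf = gaussian_kde(observed_thickness)

lo, hi = observed_thickness.min(), observed_thickness.max()
grid = np.linspace(lo, hi, 500)
m = pdf(grid).max() * 1.05                 # envelope constant for rejection sampling

samples = []
while len(samples) < 1000:
    x = rng.uniform(lo, hi)
    if rng.uniform(0, m) < pdf(x)[0]:      # accept with probability pdf(x)/m
        samples.append(x)

samples = np.array(samples)
print(f"sampled median thickness ~ {np.median(samples):.1f} m")
```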

  5. RECRUITING FOR A LONGITUDINAL STUDY OF CHILDREN'S HEALTH USING A HOUSEHOLD-BASED PROBABILITY SAMPLING APPROACH

    EPA Science Inventory

    The sampling design for the National Children's Study (NCS) calls for a population-based, multi-stage, clustered household sampling approach (visit our website for more information on the NCS: www.nationalchildrensstudy.gov). The full sample is designed to be representative of ...

  6. An EEG-Based Fuzzy Probability Model for Early Diagnosis of Alzheimer's Disease.

    PubMed

    Chiang, Hsiu-Sen; Pao, Shun-Chi

    2016-05-01

    Alzheimer's disease is a degenerative brain disease that results in cardinal memory deterioration and significant cognitive impairments. The early treatment of Alzheimer's disease can significantly reduce deterioration. Early diagnosis is difficult, and early symptoms are frequently overlooked. While much of the literature focuses on disease detection, the use of electroencephalography (EEG) in Alzheimer's diagnosis has received relatively little attention. This study combines the fuzzy and associative Petri net methodologies to develop a model for the effective and objective detection of Alzheimer's disease. Differences in EEG patterns between normal subjects and Alzheimer patients are used to establish prediction criteria for Alzheimer's disease, potentially providing physicians with a reference for early diagnosis, allowing for early action to delay the disease progression.

  7. Performance of methods for estimating the effect of covariates on group membership probabilities in group-based trajectory models.

    PubMed

    Davies, Christopher E; Giles, Lynne C; Glonek, Gary Fv

    2017-01-01

    One purpose of a longitudinal study is to gain insight into how characteristics at earlier points in time can impact subsequent outcomes. Typically, the outcome variable varies over time and the data for each individual can be used to form a discrete path of measurements, that is, a trajectory. Group-based trajectory modelling methods seek to identify subgroups of individuals within a population with trajectories that are more similar to each other than to trajectories in distinct groups. An approach to modelling the influence of covariates measured at earlier time points in the group-based setting is to consider models wherein these covariates affect the group membership probabilities. Models in which prior covariates impact the trajectories directly are also possible but are not considered here. In the present study, we compared six different methods for estimating the effect of covariates on the group membership probabilities, which have different approaches to accounting for the uncertainty in the group membership assignment. We found that when investigating the effect of one or several covariates on a group-based trajectory model, the full likelihood approach minimized the bias in the estimate of the covariate effect. In this '1-step' approach, the estimation of the effect of covariates and the trajectory model are carried out simultaneously. Of the '3-step' approaches, where the effect of the covariates is assessed subsequent to the estimation of the group-based trajectory model, only Vermunt's improved 3-step method resulted in bias of a similar size to the full likelihood approach. The remaining methods considered resulted in considerably higher bias in the covariate effect estimates and should not be used. In addition to the bias empirically demonstrated for the probability regression approach, we have shown analytically that it is biased in general.

  8. Translating CFC-based piston ages into probability density functions of ground-water age in karst

    USGS Publications Warehouse

    Long, A.J.; Putnam, L.D.

    2006-01-01

    Temporal age distributions are equivalent to probability density functions (PDFs) of transit time. The type and shape of a PDF provides important information related to ground-water mixing at the well or spring and the complex nature of flow networks in karst aquifers. Chlorofluorocarbon (CFC) concentrations measured for samples from 12 locations in the karstic Madison aquifer were used to evaluate the suitability of various PDF types for this aquifer. Parameters of PDFs could not be estimated within acceptable confidence intervals for any of the individual sites. Therefore, metrics derived from CFC-based apparent ages were used to evaluate results of PDF modeling in a more general approach. The ranges of these metrics were established as criteria against which families of PDFs could be evaluated for their applicability to different parts of the aquifer. Seven PDF types, including five unimodal and two bimodal models, were evaluated. Model results indicate that unimodal models may be applicable to areas close to conduits that have younger piston (i.e., apparent) ages and that bimodal models probably are applicable to areas farther from conduits that have older piston ages. The two components of a bimodal PDF are interpreted as representing conduit and diffuse flow, and transit times of as much as two decades may separate these PDF components. Areas near conduits may be dominated by conduit flow, whereas areas farther from conduits having bimodal distributions probably have good hydraulic connection to both diffuse and conduit flow. ?? 2006 Elsevier B.V. All rights reserved.

  9. Projectile Two-dimensional Coordinate Measurement Method Based on Optical Fiber Coding Fire and its Coordinate Distribution Probability

    NASA Astrophysics Data System (ADS)

    Li, Hanshan; Lei, Zhiyong

    2013-01-01

    To improve projectile coordinate measurement precision in fire measurement systems, this paper introduces the optical fiber coding fire measurement method and its principle, sets up its measurement model, and analyzes coordinate errors using the differential method. To study the projectile coordinate position distribution, the mathematical statistics hypothesis method is used to analyze its distribution law; the firing dispersion and the probability of the projectile hitting the object center are also examined. The results show that, at the given significance level, an exponential distribution is a relatively reasonable description of the projectile position distribution. Experiments and calculations show that the optical fiber coding fire measurement method is sound and feasible and can yield accurate projectile coordinate positions.

  10. Transgender Population Size in the United States: a Meta-Regression of Population-Based Probability Samples

    PubMed Central

    Sevelius, Jae M.

    2017-01-01

    Background. Transgender individuals have a gender identity that differs from the sex they were assigned at birth. The population size of transgender individuals in the United States is not well-known, in part because official records, including the US Census, do not include data on gender identity. Population surveys today more often collect transgender-inclusive gender-identity data, and secular trends in culture and the media have created a somewhat more favorable environment for transgender people. Objectives. To estimate the current population size of transgender individuals in the United States and evaluate any trend over time. Search methods. In June and July 2016, we searched PubMed, Cumulative Index to Nursing and Allied Health Literature, and Web of Science for national surveys, as well as “gray” literature, through an Internet search. We limited the search to 2006 through 2016. Selection criteria. We selected population-based surveys that used probability sampling and included self-reported transgender-identity data. Data collection and analysis. We used random-effects meta-analysis to pool eligible surveys and used meta-regression to address our hypothesis that the transgender population size estimate would increase over time. We used subsample and leave-one-out analysis to assess for bias. Main results. Our meta-regression model, based on 12 surveys covering 2007 to 2015, explained 62.5% of model heterogeneity, with a significant effect for each unit increase in survey year (F = 17.122; df = 1,10; b = 0.026%; P = .002). Extrapolating these results to 2016 suggested a current US population size of 390 adults per 100 000, or almost 1 million adults nationally. This estimate may be more indicative for younger adults, who represented more than 50% of the respondents in our analysis. Authors’ conclusions. Future national surveys are likely to observe higher numbers of transgender people. The large variety in questions used to ask

  11. Probability-Based Design Criteria of the ASCE 7 Tsunami Loads and Effects Provisions (Invited)

    NASA Astrophysics Data System (ADS)

    Chock, G.

    2013-12-01

    Mitigation of tsunami risk requires a combination of emergency preparedness for evacuation and structural resilience of critical facilities, infrastructure, and key resources necessary for immediate response and economic and social recovery. Critical facilities would include emergency response, medical, tsunami refuges and shelters, ports and harbors, lifelines, transportation, telecommunications, power, financial institutions, and major industrial/commercial facilities. The Tsunami Loads and Effects Subcommittee of the ASCE/SEI 7 Standards Committee is developing a proposed new Chapter 6 - Tsunami Loads and Effects for the 2016 edition of the ASCE 7 Standard. ASCE 7 provides the minimum design loads and requirements for structures subject to building codes such as the International Building Code utilized in the USA. In this paper we provide a review emphasizing the intent of these new code provisions and explain the design methodology. The ASCE 7 provisions for Tsunami Loads and Effects enable a set of analysis and design methodologies that are consistent with performance-based engineering based on probabilistic criteria. The ASCE 7 Tsunami Loads and Effects chapter will be initially applicable only to the states of Alaska, Washington, Oregon, California, and Hawaii. Ground shaking effects and subsidence from a preceding local offshore Maximum Considered Earthquake will also be considered prior to tsunami arrival for Alaska and states in the Pacific Northwest regions governed by nearby offshore subduction earthquakes. For national tsunami design provisions to achieve a consistent reliability standard of structural performance for community resilience, a new generation of tsunami inundation hazard maps for design is required. The lesson of recent tsunamis is that historical records alone do not provide a sufficient measure of the potential heights of future tsunamis. Engineering design must consider the occurrence of events greater than

  12. Computing posterior probabilities for score-based alignments using ppALIGN.

    PubMed

    Wolfsheimer, Stefan; Hartmann, Alexander; Rabus, Ralf; Nuel, Gregory

    2012-05-16

    Score-based pairwise alignments are widely used in bioinformatics, in particular with molecular database search tools such as the BLAST family. Due to sophisticated heuristics, such algorithms are usually fast, but the underlying scoring model unfortunately lacks a statistical description of the reliability of the reported alignments. In particular, close to gaps and in low-score or low-complexity regions, a huge number of alternative alignments arise, which decreases the certainty of the alignment. ppALIGN is a software package that uses hidden Markov model techniques to compute the position-wise reliability of score-based pairwise alignments of DNA or protein sequences. The design of the model allows for a direct connection between the scoring function and the parameters of the probabilistic model. For this reason it is suitable for analyzing the outcomes of popular score-based aligners and search tools without having to choose a complicated set of parameters. By contrast, our program only requires the classical score parameters (the scoring function and gap costs). The package comes with a library written in C++, a standalone program for user-defined alignments (ppALIGN), and another program (ppBLAST) that can process a complete BLAST result set. The main algorithms essentially exhibit a linear time complexity (in the alignment lengths), and they are hence suitable for on-line computations. We have also included alternative decoding algorithms to provide alternative alignments. ppALIGN is a fast program/library that helps detect and quantify questionable regions in pairwise alignments. Owing to its structure and input/output interface, it can be connected to other post-processing tools. Empirically, we illustrate its usefulness in terms of correctly predicted reliable regions for sequences generated using the ROSE model for sequence evolution, and identify sensor-specific regions in the denitrifying betaproteobacterium Aromatoleum aromaticum.
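
    As a minimal illustration of position-wise reliability from posterior state probabilities, the sketch below runs the standard forward-backward algorithm on a toy two-state HMM. It is a generic example, not ppALIGN's alignment model, and all states, parameters, and observations are invented.

        import numpy as np

        def forward_backward(obs, pi, A, B):
            """Posterior state probabilities P(state_t | obs) via scaled forward-backward."""
            T, S = len(obs), len(pi)
            alpha = np.zeros((T, S))      # scaled forward messages
            beta = np.zeros((T, S))       # scaled backward messages
            scale = np.zeros(T)

            alpha[0] = pi * B[:, obs[0]]
            scale[0] = alpha[0].sum()
            alpha[0] /= scale[0]
            for t in range(1, T):
                alpha[t] = (alpha[t - 1] @ A) * B[:, obs[t]]
                scale[t] = alpha[t].sum()
                alpha[t] /= scale[t]

            beta[-1] = 1.0
            for t in range(T - 2, -1, -1):
                beta[t] = (A @ (B[:, obs[t + 1]] * beta[t + 1])) / scale[t + 1]

            gamma = alpha * beta          # position-wise posteriors
            return gamma / gamma.sum(axis=1, keepdims=True)

        # Toy model: state 0 is "match-like", state 1 is "gap/low-complexity-like".
        pi = np.array([0.5, 0.5])
        A = np.array([[0.9, 0.1],
                      [0.2, 0.8]])
        B = np.array([[0.7, 0.3],         # P(symbol | state 0)
                      [0.2, 0.8]])        # P(symbol | state 1)
        print(forward_backward([0, 0, 1, 1, 0], pi, A, B))  # each row sums to 1

    Positions whose posterior mass is spread roughly evenly across states would be flagged as low-reliability regions.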

  13. Urban seismic risk assessment: statistical repair cost data and probable structural losses based on damage scenario—correlation analysis

    NASA Astrophysics Data System (ADS)

    Eleftheriadou, Anastasia K.; Baltzopoulou, Aikaterini D.; Karabinis, Athanasios I.

    2016-06-01

    The current seismic risk assessment is based on two discrete approaches, actual and probable, with the produced results validated afterwards. In the first part of this research, the seismic risk is evaluated from the available data regarding the mean statistical repair/strengthening or replacement cost for the total number of damaged structures (180,427 buildings) after the 7/9/1999 Parnitha (Athens) earthquake. The actual evaluated seismic risk is then compared to the estimated probable structural losses, which are presented in the second part of the paper, based on a damage scenario for the same earthquake. The applied damage scenario is based on recently developed damage probability matrices (DPMs) from the Athens (Greece) damage database. The seismic risk estimation refers to 750,085 buildings situated in the extended urban region of Athens. The building exposure is categorized into five typical structural types and represents 18.80 % of the entire building stock in Greece. The latter information is provided by the National Statistics Service of Greece (NSSG) according to the 2000-2001 census. The seismic input is characterized by the ratio a_g/a_o, where a_g is the regional peak ground acceleration (PGA), evaluated from the earlier estimated macroseismic intensities, and a_o is the PGA according to the hazard map of the 2003 Greek Seismic Code. Finally, the financial data collected from the different National Services responsible for post-earthquake crisis management, concerning the repair/strengthening or replacement costs and other categories of costs for the rehabilitation of earthquake victims (construction and operation of settlements for the earthquake homeless, rent support, demolitions, shorings), are used to determine the final total seismic risk factor.

  14. Don’t make cache too complex: A simple probability-based cache management scheme for SSDs

    PubMed Central

    Cho, Sangyeun; Choi, Jongmoo

    2017-01-01

    Solid-state drives (SSDs) have recently become a common storage component in computer systems, and they are fueled by continued bit cost reductions achieved with smaller feature sizes and multiple-level cell technologies. However, as the flash memory stores more bits per cell, the performance and reliability of the flash memory degrade substantially. To solve this problem, a fast non-volatile memory (NVM)-based cache has been employed within SSDs to reduce the long latency required to write data. Absorbing small writes in a fast NVM cache can also reduce the number of flash memory erase operations. To maximize the benefits of an NVM cache, it is important to increase the NVM cache utilization. In this paper, we propose and study ProCache, a simple NVM cache management scheme that makes cache-entrance decisions based on random probability testing. Our scheme is motivated by the observation that frequently written hot data will eventually enter the cache with a high probability, and that infrequently accessed cold data will not enter the cache easily. Owing to its simplicity, ProCache is easy to implement at a substantially smaller cost than similar previously studied techniques. We evaluate ProCache and conclude that it achieves performance comparable to that of a more complex reference-counter-based cache-management scheme. PMID:28358897
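
    As a minimal sketch of the random-probability cache-entrance idea described above (a generic illustration, not the paper's ProCache implementation; the cache size, admission probability, and LRU eviction policy are assumed here):

        import random
        from collections import OrderedDict

        CACHE_SIZE = 4
        ADMIT_PROB = 0.25          # assumed admission probability per write

        cache = OrderedDict()      # LRU order: oldest entry first

        def write_block(lba, data):
            if lba in cache:                       # already cached: update and refresh LRU position
                cache.move_to_end(lba)
                cache[lba] = data
                return "nvm-hit"
            if random.random() < ADMIT_PROB:       # coin flip decides cache entrance
                if len(cache) >= CACHE_SIZE:
                    cache.popitem(last=False)      # evict the least recently used entry
                cache[lba] = data
                return "nvm-admit"
            return "flash-direct"                  # bypass the NVM cache

        for i in range(30):                        # a hot block written 30 times
            write_block(lba=42, data=f"rev{i}")
        print(42 in cache)                         # True with probability about 0.9998

    Because each write is an independent trial, a block written k times enters the cache with probability 1 - (1 - ADMIT_PROB)**k, so hot data is admitted almost surely while cold data usually bypasses the NVM cache.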

  15. EUPDF: An Eulerian-Based Monte Carlo Probability Density Function (PDF) Solver. User's Manual

    NASA Technical Reports Server (NTRS)

    Raju, M. S.

    1998-01-01

    EUPDF is an Eulerian-based Monte Carlo PDF solver developed for application with sprays, combustion, parallel computing, and unstructured grids. It is designed to be massively parallel and could easily be coupled with any existing gas-phase flow and spray solvers. The solver accommodates the use of an unstructured mesh with mixed elements of triangular, quadrilateral, and/or tetrahedral type. The manual provides the user with the coding required to couple the PDF code to any given flow code and a basic understanding of the EUPDF code structure as well as the models involved in the PDF formulation. The source code of EUPDF will be available with the release of the National Combustion Code (NCC) as a complete package.

  16. Research on particle swarm optimization algorithm based on optimal movement probability

    NASA Astrophysics Data System (ADS)

    Ma, Jianhong; Zhang, Han; He, Baofeng

    2017-01-01

    The particle swarm optimization (PSO) algorithm can improve control precision and has great application value in fields such as neural network training and fuzzy system control. When the traditional particle swarm algorithm is used to train feed-forward neural networks, its search efficiency is low and it easily falls into local convergence. An improved particle swarm optimization algorithm based on error back-propagation gradient descent is therefore proposed. Particles are ranked by fitness so that the optimization problem is considered as a whole, and error back-propagation gradient descent is used to train the BP neural network. Each particle updates its velocity and position according to its individual best and the global best; by making particles learn more from the social (global) best and less from their individual best, the algorithm prevents particles from falling into local optima, while the gradient information accelerates the local search of the PSO and improves its search efficiency. Simulation results show that the algorithm converges rapidly toward the global optimal solution in the initial stage and keeps approaching it thereafter; it has a faster convergence speed and better search performance for the same running time, and in particular improves the search efficiency in the later stages.
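
    The sketch below illustrates the general idea of combining a standard PSO velocity/position update with a gradient-descent nudge. It is a hedged toy example on a sphere function, not the authors' algorithm or parameter settings.

        import numpy as np

        def pso_gradient(f, grad_f, dim=2, n_particles=20, iters=200,
                         w=0.7, c1=1.5, c2=1.5, eta=0.01, seed=0):
            rng = np.random.default_rng(seed)
            x = rng.uniform(-5, 5, (n_particles, dim))
            v = np.zeros((n_particles, dim))
            pbest = x.copy()
            pbest_val = np.array([f(p) for p in x])
            gbest = pbest[pbest_val.argmin()].copy()

            for _ in range(iters):
                r1, r2 = rng.random((n_particles, dim)), rng.random((n_particles, dim))
                v = (w * v
                     + c1 * r1 * (pbest - x)          # cognitive: pull toward each particle's own best
                     + c2 * r2 * (gbest - x))         # social: pull toward the global best
                x = x + v - eta * np.array([grad_f(p) for p in x])  # gradient-descent nudge
                vals = np.array([f(p) for p in x])
                improved = vals < pbest_val
                pbest[improved], pbest_val[improved] = x[improved], vals[improved]
                gbest = pbest[pbest_val.argmin()].copy()
            return gbest, pbest_val.min()

        sphere = lambda p: float(np.sum(p ** 2))
        sphere_grad = lambda p: 2 * p
        print(pso_gradient(sphere, sphere_grad))      # converges near the origin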

  17. Computing elastic‐rebound‐motivated earthquake probabilities in unsegmented fault models: a new methodology supported by physics‐based simulators

    USGS Publications Warehouse

    Field, Edward H.

    2015-01-01

    A methodology is presented for computing elastic‐rebound‐based probabilities in an unsegmented fault or fault system, which involves computing along‐fault averages of renewal‐model parameters. The approach is less biased and more self‐consistent than a logical extension of that applied most recently for multisegment ruptures in California. It also enables the application of magnitude‐dependent aperiodicity values, which the previous approach does not. Monte Carlo simulations are used to analyze long‐term system behavior, which is generally found to be consistent with that of physics‐based earthquake simulators. Results cast doubt that recurrence‐interval distributions at points on faults look anything like traditionally applied renewal models, a fact that should be considered when interpreting paleoseismic data. We avoid such assumptions by changing the "probability of what" question (from offset at a point to the occurrence of a rupture, assuming it is the next event to occur). The new methodology is simple, although not perfect in terms of recovering long‐term rates in Monte Carlo simulations. It represents a reasonable, improved way to represent first‐order elastic‐rebound predictability, assuming it is there in the first place, and for a system that clearly exhibits other unmodeled complexities, such as aftershock triggering.

  18. APL: An angle probability list to improve knowledge-based metaheuristics for the three-dimensional protein structure prediction.

    PubMed

    Borguesan, Bruno; Barbachan e Silva, Mariel; Grisci, Bruno; Inostroza-Ponta, Mario; Dorn, Márcio

    2015-12-01

    Tertiary protein structure prediction is one of the most challenging problems in structural bioinformatics. Despite the advances in algorithm development and computational strategies, predicting the folded structure of a protein from its amino acid sequence alone remains an unsolved problem. We present a new computational approach to predict the native-like three-dimensional structure of proteins. Conformational preferences of amino acid residues and secondary structure information were obtained from protein templates stored in the Protein Data Bank and represented as an Angle Probability List. Two knowledge-based prediction methods, based on Genetic Algorithms and Particle Swarm Optimization, were developed using this information. The proposed method has been tested with twenty-six case studies selected to validate our approach with different classes of proteins and folding patterns. Stereochemical and structural analyses were performed for each predicted three-dimensional structure. The results suggest that the Angle Probability List can improve the effectiveness of metaheuristics used to predict the three-dimensional structure of protein molecules by reducing the conformational search space.
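
    A minimal sketch of the Angle Probability List idea: a discretized (phi, psi) histogram per residue/secondary-structure pair, normalized and sampled to propose torsion angles. The toy observations below are invented, whereas a real APL would be built from Protein Data Bank templates.

        import numpy as np

        BIN = 20  # degrees per histogram bin

        def build_apl(observations):
            """observations: dict[(aa, ss)] -> list of (phi, psi) angles in degrees."""
            apl = {}
            edges = np.arange(-180, 181, BIN)
            for key, angles in observations.items():
                phi, psi = np.array(angles).T
                hist, _, _ = np.histogram2d(phi, psi, bins=[edges, edges])
                apl[key] = hist / hist.sum()              # empirical probability table
            return apl, edges

        def sample_angles(apl, edges, key, rng):
            probs = apl[key].ravel()
            idx = rng.choice(probs.size, p=probs)          # pick a bin by its probability
            i, j = divmod(idx, apl[key].shape[1])
            phi = rng.uniform(edges[i], edges[i + 1])      # draw uniformly inside the bin
            psi = rng.uniform(edges[j], edges[j + 1])
            return phi, psi

        rng = np.random.default_rng(1)
        toy = {("ALA", "H"): [(-60, -45), (-63, -42), (-58, -47), (-65, -40)]}
        apl, edges = build_apl(toy)
        print(sample_angles(apl, edges, ("ALA", "H"), rng))  # alpha-helix-like angles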

  19. Automatic estimation of sleep level for nap based on conditional probability of sleep stages and an exponential smoothing method.

    PubMed

    Wang, Bei; Wang, Xingyu; Zhang, Tao; Nakamura, Masatoshi

    2013-01-01

    An automatic sleep level estimation method was developed for monitoring and regulating daytime nap sleep. The recorded nap data are separated into continuous 5-second segments, and features are extracted from the EEG, EOG, and EMG. A sleep level parameter is defined and estimated based on the conditional probability of sleep stages, and an exponential smoothing method is applied to the estimated sleep level. Twelve healthy subjects, with an average age of 22 years, participated in the experimental work. Compared with sleep stage determination, the presented sleep level estimation method showed better performance for nap sleep interpretation. Real-time monitoring and regulation of naps is realizable based on the developed technique.
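
    A minimal sketch of the two steps described above, under assumed numbers: a sleep-level parameter computed from conditional stage probabilities for each 5-second segment, followed by simple exponential smoothing. The stage weights and smoothing constant are illustrative, not the authors' parameters.

        import numpy as np

        STAGE_DEPTH = {"Wake": 0.0, "Stage1": 1.0, "Stage2": 2.0}   # assumed depth mapping

        def sleep_level(stage_probs):
            """stage_probs: dict stage -> conditional probability for one 5-s segment."""
            return sum(STAGE_DEPTH[s] * p for s, p in stage_probs.items())

        def exponential_smoothing(levels, alpha=0.3):
            smoothed = [levels[0]]
            for x in levels[1:]:
                smoothed.append(alpha * x + (1 - alpha) * smoothed[-1])
            return np.array(smoothed)

        segments = [
            {"Wake": 0.8, "Stage1": 0.2, "Stage2": 0.0},
            {"Wake": 0.4, "Stage1": 0.5, "Stage2": 0.1},
            {"Wake": 0.1, "Stage1": 0.6, "Stage2": 0.3},
            {"Wake": 0.0, "Stage1": 0.3, "Stage2": 0.7},
        ]
        raw = [sleep_level(s) for s in segments]
        print(np.round(exponential_smoothing(raw), 2))   # a gradually deepening, smoothed level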

  20. Uncertainty analysis of a structural-acoustic problem using imprecise probabilities based on p-box representations

    NASA Astrophysics Data System (ADS)

    Chen, Ning; Yu, Dejie; Xia, Baizhan; Beer, Michael

    2016-12-01

    Imprecise probabilities can capture epistemic uncertainty, which reflects limited available knowledge so that a precise probabilistic model cannot be established. In this paper, the parameters of a structural-acoustic problem are represented with the aid of p-boxes to capture epistemic uncertainty in the model. To perform the necessary analysis of the structural-acoustic problem with p-boxes, a first-order matrix decomposition perturbation method (FMDPM) for interval analysis is proposed, and an efficient interval Monte Carlo method based on FMDPM is derived. In the implementation of this efficient interval Monte Carlo method, constant matrices are first obtained through an uncertain-parameter extraction on the basis of the matrix decomposition technique. These constant matrices are then employed to perform multiple interval analyses using the first-order perturbation method. A numerical example is provided to illustrate the feasibility and effectiveness of the presented approach.
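
    A generic sketch of an interval Monte Carlo pass with a p-box input (here, a normal variable whose mean is known only to an interval), so that each draw yields a response interval and the collected intervals bound the response CDF. The one-parameter toy model and the numbers are illustrative; this is not the structural-acoustic FMDPM formulation.

        import numpy as np

        rng = np.random.default_rng(0)
        N = 10_000

        # p-box input: stiffness ~ Normal(mu, sigma) with mu known only as an interval
        mu_lo, mu_hi, sigma = 0.9, 1.1, 0.05

        def response(k):
            return 1.0 / k            # monotone toy model: displacement under unit load

        lo_samples, hi_samples = [], []
        for _ in range(N):
            z = rng.standard_normal()                      # shared aleatory draw
            k_lo, k_hi = mu_lo + sigma * z, mu_hi + sigma * z
            r_lo, r_hi = sorted((response(k_lo), response(k_hi)))
            lo_samples.append(r_lo)
            hi_samples.append(r_hi)

        # Bounds on P(response <= 1.05) from the interval endpoints.
        p_lower = np.mean(np.array(hi_samples) <= 1.05)    # guaranteed-at-least probability
        p_upper = np.mean(np.array(lo_samples) <= 1.05)    # possible-at-most probability
        print(p_lower, p_upper)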

  1. B* Probability Based Search

    DTIC Science & Technology

    1994-06-27

    the check Qb3, which forces the exchange of queens, leads to a win (it does). If it does not, then white would be foolish to give away his existing... [The remainder of the indexed excerpt is garbled OCR of a chess game score between B* Hitech and Hitech 5.6, 1994.]

  2. Estimation of the probability of exposure to metalworking fluids in a population-based case-control study

    PubMed Central

    Park, Dong-Uk; Colt, Joanne S.; Baris, Dalsu; Schwenn, Molly; Karagas, Margaret R.; Armenti, Karla R.; Johnson, Alison; Silverman, Debra T; Stewart, Patricia A

    2014-01-01

    We describe here an approach for estimating the probability that study subjects were exposed to metalworking fluids (MWFs) in a population-based case-control study of bladder cancer. Study subject reports on the frequency of machining and use of specific MWFs (straight, soluble, and synthetic/semi-synthetic) were used to estimate exposure probability when available. Those reports also were used to develop estimates for job groups, which were then applied to jobs without MWF reports. Estimates were developed using both cases and controls, and using controls only. The prevalence of machining varied substantially across job groups (10-90%), with the greatest percentage of jobs that machined being reported by machinists and tool and die workers. Reports of straight and soluble MWF use were fairly consistent across job groups (generally 50-70%). Synthetic MWF use was lower (13-45%). There was little difference in reports by cases and controls vs. controls only. Approximately 1% of the entire study population was assessed as definitely exposed to straight or soluble fluids, in contrast to 0.2% definitely exposed to synthetic/semi-synthetics. A comparison between the reported use of the MWFs and US production levels by decade found high correlations (r generally >0.7). Overall, the method described here is likely to have provided a systematic and reliable ranking that better reflects the variability of exposure to three types of MWFs than approaches applied in the past. PMID:25256317

  3. A Web-based interface to calculate phonotactic probability for words and nonwords in Modern Standard Arabic.

    PubMed

    Aljasser, Faisal; Vitevitch, Michael S

    2017-03-24

    A number of databases (Storkel Behavior Research Methods, 45, 1159-1167, 2013) and online calculators (Vitevitch & Luce Behavior Research Methods, Instruments, and Computers, 36, 481-487, 2004) have been developed to provide statistical information about various aspects of language, and these have proven to be invaluable assets to researchers, clinicians, and instructors in the language sciences. The number of such resources for English is quite large and continues to grow, whereas the number of such resources for other languages is much smaller. This article describes the development of a Web-based interface to calculate phonotactic probability in Modern Standard Arabic (MSA). A full description of how the calculator can be used is provided. It can be freely accessed at http://phonotactic.drupal.ku.edu/.

  4. EFFECT OF CHLORIDE AND SULFATE CONCENTRATION ON PROBABILITY BASED CORROSION CONTROL FOR LIQUID WASTE TANKS- PART IV

    SciTech Connect

    Hoffman, E.

    2012-08-23

    A series of cyclic potentiodynamic polarization tests was performed on samples of A537 carbon steel in support of a probability-based approach to evaluate the effect of chloride and sulfate on corrosion susceptibility. Testing solutions were chosen to build on previous experimental results from FY07, FY08, FY09, and FY10 to systematically evaluate the influence of the secondary aggressive species, chloride and sulfate. The FY11 results suggest that evaluating the combined effect of all aggressive species (nitrate, chloride, and sulfate) provides a consistent response for determining corrosion susceptibility. The results of this work emphasize the importance of concentration limits not only for nitrate but also for chloride and sulfate.

  5. Estimation of the probability of exposure to machining fluids in a population-based case-control study.

    PubMed

    Park, Dong-Uk; Colt, Joanne S; Baris, Dalsu; Schwenn, Molly; Karagas, Margaret R; Armenti, Karla R; Johnson, Alison; Silverman, Debra T; Stewart, Patricia A

    2014-01-01

    We describe an approach for estimating the probability that study subjects were exposed to metalworking fluids (MWFs) in a population-based case-control study of bladder cancer. Study subject reports on the frequency of machining and use of specific MWFs (straight, soluble, and synthetic/semi-synthetic) were used to estimate exposure probability when available. Those reports also were used to develop estimates for job groups, which were then applied to jobs without MWF reports. Estimates were developed using both cases and controls, and using controls only. The prevalence of machining varied substantially across job groups (proportions of 0.1 to >0.9), with the greatest percentage of jobs that machined being reported by machinists and tool and die workers. Reports of straight and soluble MWF use were fairly consistent across job groups (generally 50-70%). Synthetic MWF use was lower (13-45%). There was little difference in reports by cases and controls vs. controls only. Approximately 1% of the entire study population was assessed as definitely exposed to straight or soluble fluids, in contrast to 0.2% definitely exposed to synthetic/semi-synthetics. A comparison between the reported use of the MWFs and U.S. production levels found high correlations (r generally >0.7). Overall, the method described here is likely to have provided a systematic and reliable ranking that better reflects the variability of exposure to three types of MWFs than approaches applied in the past. [Supplementary materials are available for this article. Go to the publisher's online edition of Journal of Occupational and Environmental Hygiene for the following free supplemental resources: a list of keywords in the occupational histories that were used to link study subjects to the metalworking fluids (MWFs) modules; recommendations from the literature on selection of MWFs based on type of machining operation, the metal being machined and decade; popular additives to MWFs; the number and proportion of controls who

  6. Small scale photo probability sampling and vegetation classification in southeast Arizona as an ecological base for resource inventory. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Johnson, J. R. (Principal Investigator)

    1974-01-01

    The author has identified the following significant results. The broad-scale vegetation classification was developed for a 3,200 sq mile area in southeastern Arizona. The 31 vegetation types were derived from association tables which contained information taken at about 500 ground sites. The classification provided an information base that was suitable for use with small-scale photography. A procedure was developed and tested for objectively comparing photo images. The procedure consisted of two parts, image groupability testing and image complexity testing. The Apollo and ERTS photos were compared for relative suitability as first-stage stratification bases in two-stage proportional probability sampling. High-altitude photography was used in common at the second stage.

  7. SU-E-T-144: Bayesian Inference of Local Relapse Data Using a Poisson-Based Tumour Control Probability Model

    SciTech Connect

    La Russa, D

    2015-06-15

    Purpose: The purpose of this project is to develop a robust method of parameter estimation for a Poisson-based TCP model using Bayesian inference. Methods: Bayesian inference was performed using the PyMC3 probabilistic programming framework written in Python. A Poisson-based TCP regression model that accounts for clonogen proliferation was fit to observed rates of local relapse as a function of equivalent dose in 2 Gy fractions for a population of 623 stage-I non-small-cell lung cancer patients. The Slice Markov Chain Monte Carlo sampling algorithm was used to sample the posterior distributions, and was initiated using the maximum of the posterior distributions found by optimization. The calculation of TCP with each sample step required integration over the free parameter α, which was performed using an adaptive 24-point Gauss-Legendre quadrature. Convergence was verified via inspection of the trace plot and posterior distribution for each of the fit parameters, as well as with comparisons of the most probable parameter values with their respective maximum likelihood estimates. Results: Posterior distributions for α, the standard deviation of α (σ), the average tumour cell-doubling time (Td), and the repopulation delay time (Tk), were generated assuming α/β = 10 Gy, and a fixed clonogen density of 10^7 cm^-3. Posterior predictive plots generated from samples from these posterior distributions are in excellent agreement with the observed rates of local relapse used in the Bayesian inference. The most probable values of the model parameters also agree well with maximum likelihood estimates. Conclusion: A robust method of performing Bayesian inference of TCP data using a complex TCP model has been established.
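
    A deliberately simplified, numpy-only sketch of Bayesian inference for a Poisson TCP model, TCP = exp(-N0 * exp(-alpha * D)), with a single radiosensitivity parameter, a random-walk Metropolis sampler, and invented dose-response data. It is not the PyMC3/Slice-sampler model with proliferation and alpha heterogeneity described above.

        import numpy as np

        rng = np.random.default_rng(0)

        doses = np.array([40.0, 50.0, 60.0, 70.0, 80.0])     # EQD2 in Gy (illustrative bins)
        n_pat = np.array([50, 60, 60, 70, 40])               # patients per dose bin (invented)
        n_ctrl = np.array([1, 3, 52, 69, 40])                # locally controlled (invented)
        N0 = 1e7                                             # assumed clonogen number

        def log_post(alpha):
            if alpha <= 0:
                return -np.inf
            tcp = np.exp(-N0 * np.exp(-alpha * doses))
            tcp = np.clip(tcp, 1e-12, 1 - 1e-12)
            loglik = np.sum(n_ctrl * np.log(tcp) + (n_pat - n_ctrl) * np.log(1 - tcp))
            logprior = -0.5 * ((alpha - 0.3) / 0.1) ** 2     # weak Normal(0.3, 0.1) prior
            return loglik + logprior

        samples, alpha = [], 0.3
        for _ in range(20_000):
            prop = alpha + 0.01 * rng.standard_normal()      # symmetric random-walk proposal
            if np.log(rng.random()) < log_post(prop) - log_post(alpha):
                alpha = prop
            samples.append(alpha)

        post = np.array(samples[5_000:])                     # drop burn-in
        print(post.mean(), np.percentile(post, [2.5, 97.5]))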

  8. Chromatographic elution process design space development for the purification of saponins in Panax notoginseng extract using a probability-based approach.

    PubMed

    Chen, Teng; Gong, Xingchu; Chen, Huali; Zhang, Ying; Qu, Haibin

    2016-01-01

    A Monte Carlo method was used to develop the design space of a chromatographic elution process for the purification of saponins in Panax notoginseng extract. For this process, saponin recovery ratios, saponin purity, and elution productivity are determined as process critical quality attributes, and ethanol concentration, elution rate, and elution volume are identified as critical process parameters. Quadratic equations relating the process critical quality attributes to the critical process parameters were established using response surface methodology. The probability-based design space was then computed by calculating the prediction errors using Monte Carlo simulations. The influences of the calculation parameters on the computation results were investigated. The optimized calculation condition was as follows: a calculation step length of 0.02, 10,000 simulations, and a significance level of 0.15 for adding or removing terms in a stepwise regression. The recommended normal operation region is located at an ethanol concentration of 65.0-70.0%, an elution rate of 1.7-2.0 bed volumes (BV)/h, and an elution volume of 3.0-3.6 BV. Verification experiments were carried out and the experimental values were in good agreement with the predicted values. The present method is promising for developing probability-based design spaces for other botanical drug manufacturing processes.
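
    A generic sketch of a probability-based design space computation of the kind described: a fitted quadratic response model predicts a quality attribute, Monte Carlo simulation adds the prediction error, and grid points where P(attribute meets its specification) exceeds a threshold form the design space. All coefficients, specifications, and the error standard deviation below are invented.

        import numpy as np

        rng = np.random.default_rng(0)

        def purity_model(ethanol, elution_vol):
            """Illustrative quadratic response surface for saponin purity (%)."""
            return 90.0 - 0.15 * (ethanol - 68.0) ** 2 - 8.0 * (elution_vol - 3.3) ** 2

        PRED_SD = 1.5        # assumed prediction-error standard deviation (%)
        SPEC = 80.0          # assumed purity specification (%)
        N_SIM = 10_000

        ethanol_grid = np.arange(60.0, 75.5, 0.5)       # % ethanol
        volume_grid = np.arange(2.0, 4.6, 0.2)          # bed volumes

        design_space = []
        for e in ethanol_grid:
            for v in volume_grid:
                sims = purity_model(e, v) + PRED_SD * rng.standard_normal(N_SIM)
                prob = np.mean(sims >= SPEC)
                if prob >= 0.90:                        # probability threshold for acceptance
                    design_space.append((e, round(v, 1), round(prob, 3)))

        print(f"{len(design_space)} grid points qualify; examples: {design_space[:3]}")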

  9. Guided waves based SHM systems for composites structural elements: statistical analyses finalized at probability of detection definition and assessment

    NASA Astrophysics Data System (ADS)

    Monaco, E.; Memmolo, V.; Ricci, F.; Boffa, N. D.; Maio, L.

    2015-03-01

    Maintenance approaches based on sensorised structures and Structural Health Monitoring (SHM) systems have been among the most promising innovations in the field of aerostructures for many years, especially when composite materials (fiber-reinforced resins) are considered. Layered materials still suffer from drastic reductions of the maximum allowable stress values during the design phase, as well as from costly and recurrent inspections during the life cycle, which prevent their structural and economic potential from being fully exploited in today's aircraft. These penalizing measures are necessary mainly to account for the presence of undetected hidden flaws within the layered sequence (delaminations) or in bonded areas (partial disbonding). To relax design and maintenance constraints, a system based on sensors permanently installed on the structure to detect and locate eventual flaws can be considered (an SHM system), once its effectiveness and reliability have been statistically demonstrated via a rigorous definition and evaluation of the Probability of Detection (POD) function. This paper presents an experimental approach with a statistical procedure for evaluating the detection threshold of a guided-wave-based SHM system oriented to delamination detection on a typical composite layered wing panel. The experimental tests are mostly oriented to characterizing the statistical distribution of measurements and damage metrics, as well as the detection capability of the system using this approach. Numerical simulation cannot substitute for the part of the experimental tests aimed at POD, where the noise in the system response is crucial. Results of the experiments are presented and analyzed in the paper.

  10. Statistical analysis of the induced Basel 2006 earthquake sequence: introducing a probability-based monitoring approach for Enhanced Geothermal Systems

    NASA Astrophysics Data System (ADS)

    Bachmann, C. E.; Wiemer, S.; Woessner, J.; Hainzl, S.

    2011-08-01

    Geothermal energy is becoming an important clean energy source; however, the stimulation of a reservoir for an Enhanced Geothermal System (EGS) is associated with seismic risk due to induced seismicity. Seismicity occurring due to the water injection at depth has to be well recorded and monitored. To mitigate the seismic risk of a damaging event, an appropriate alarm system needs to be in place for each individual experiment. In recent experiments, the so-called traffic-light alarm system, based on public response, local magnitude and peak ground velocity, was used. We aim to improve the pre-defined alarm system by introducing a probability-based approach; we retrospectively model the ongoing seismicity in real time with multiple statistical forecast models and then translate the forecasts into seismic hazard in terms of probabilities of exceeding a ground motion intensity level. One class of models accounts for the water injection rate, the main parameter that can be controlled by the operators during an experiment. By translating the models into time-varying probabilities of exceeding various intensity levels, we provide tools which are well understood by decision makers and can be used to determine non-exceedance thresholds during a reservoir stimulation; setting these thresholds, however, remains an entrepreneurial or political decision of the responsible project coordinators. We introduce forecast models based on the data set of an EGS experiment in the city of Basel. Between 2006 December 2 and 8, approximately 11 500 m3 of water was injected into a 5-km-deep well at high pressures. A six-sensor borehole array was installed by the company Geothermal Explorers Limited (GEL) at depths between 300 and 2700 m around the well to monitor the induced seismicity. The network recorded approximately 11 200 events during the injection phase, more than 3500 of which were located. With the traffic-light system, actions were implemented after an ML 2.7 event, and the water injection was

  11. FINAL PROJECT REPORT DOE Early Career Principal Investigator Program Project Title: Developing New Mathematical Models for Multiphase Flows Based on a Fundamental Probability Density Function Approach

    SciTech Connect

    Shankar Subramaniam

    2009-04-01

    This final project report summarizes progress made towards the objectives described in the proposal entitled “Developing New Mathematical Models for Multiphase Flows Based on a Fundamental Probability Density Function Approach”. Substantial progress has been made in theory, modeling and numerical simulation of turbulent multiphase flows. The consistent mathematical framework based on probability density functions is described. New models are proposed for turbulent particle-laden flows and sprays.

  12. Applying probability theory for the quality assessment of a wildfire spread prediction framework based on genetic algorithms.

    PubMed

    Cencerrado, Andrés; Cortés, Ana; Margalef, Tomàs

    2013-01-01

    This work presents a framework for assessing how the existing constraints at the time of attending an ongoing forest fire affect simulation results, both in terms of quality (accuracy) obtained and the time needed to make a decision. In the wildfire spread simulation and prediction area, it is essential to properly exploit the computational power offered by new computing advances. For this purpose, we rely on a two-stage prediction process to enhance the quality of traditional predictions, taking advantage of parallel computing. This strategy is based on an adjustment stage which is carried out by a well-known evolutionary technique: Genetic Algorithms. The core of this framework is evaluated according to the probability theory principles. Thus, a strong statistical study is presented and oriented towards the characterization of such an adjustment technique in order to help the operation managers deal with the two aspects previously mentioned: time and quality. The experimental work in this paper is based on a region in Spain which is one of the most prone to forest fires: El Cap de Creus.

  13. Applying Probability Theory for the Quality Assessment of a Wildfire Spread Prediction Framework Based on Genetic Algorithms

    PubMed Central

    Cencerrado, Andrés; Cortés, Ana; Margalef, Tomàs

    2013-01-01

    This work presents a framework for assessing how the existing constraints at the time of attending an ongoing forest fire affect simulation results, both in terms of quality (accuracy) obtained and the time needed to make a decision. In the wildfire spread simulation and prediction area, it is essential to properly exploit the computational power offered by new computing advances. For this purpose, we rely on a two-stage prediction process to enhance the quality of traditional predictions, taking advantage of parallel computing. This strategy is based on an adjustment stage which is carried out by a well-known evolutionary technique: Genetic Algorithms. The core of this framework is evaluated according to the probability theory principles. Thus, a strong statistical study is presented and oriented towards the characterization of such an adjustment technique in order to help the operation managers deal with the two aspects previously mentioned: time and quality. The experimental work in this paper is based on a region in Spain which is one of the most prone to forest fires: El Cap de Creus. PMID:24453898

  14. Situational Lightning Climatologies for Central Florida: Phase IV: Central Florida Flow Regime Based Climatologies of Lightning Probabilities

    NASA Technical Reports Server (NTRS)

    Bauman, William H., III

    2009-01-01

    The threat of lightning is a daily concern during the warm season in Florida. Research has revealed distinct spatial and temporal distributions of lightning occurrence that are strongly influenced by large-scale atmospheric flow regimes. Previously, the Applied Meteorology Unit (AMU) calculated gridded lightning climatologies based on seven flow regimes over Florida for 1-, 3- and 6-hr intervals in 5-, 10-, 20-, and 30-NM diameter range rings around the Shuttle Landing Facility (SLF) and eight other airfields in the National Weather Service in Melbourne (NWS MLB) county warning area (CWA). In this update to the work, the AMU recalculated the lightning climatologies using individual lightning strike data to improve the accuracy of the climatologies. The AMU included all data regardless of flow regime as one of the stratifications, added monthly stratifications, added three years of data to the period of record, and used modified flow regimes based on work from the AMU's Objective Lightning Probability Forecast Tool, Phase II. The AMU made changes so the 5- and 10-NM radius range rings are consistent with the aviation forecast requirements at NWS MLB, while the 20- and 30-NM radius range rings at the SLF assist the Spaceflight Meteorology Group in making forecasts for weather Flight Rule violations during Shuttle landings. The AMU also updated the graphical user interface with the new data.

  15. Premelting base pair opening probability and drug binding constant of a daunomycin-poly d(GCAT).poly d(ATGC) complex.

    PubMed Central

    Chen, Y Z; Prohofsky, E W

    1994-01-01

    We calculate the room temperature thermal fluctuational base pair opening probability of a daunomycin-poly d(GCAT).poly d(ATGC) complex. This system is constructed at an atomic level of detail based on x-ray analysis of a crystal structure. The base pair opening probabilities are calculated from a modified self-consistent phonon approach of anharmonic lattice dynamics theory. We find that daunomycin binding substantially enhances the thermal stability of one of the base pairs adjacent to the drug because of strong hydrogen bonding between the drug and the base. The possible effect of this enhanced stability on the drug inhibition of DNA transcription and replication is discussed. We also calculate the probability of drug dissociation from the helix based on the self-consistent calculation of the probability of the disruption of drug-base H-bonds and the unstacking probability of the drug. The calculations can be used to determine the equilibrium drug binding constant, which is found to be in good agreement with observations on similar daunomycin-DNA systems. PMID:8011914

  16. A biology-driven receptor model for daily pollen allergy risk in Korea based on Weibull probability density function

    NASA Astrophysics Data System (ADS)

    Kim, Kyu Rang; Kim, Mijin; Choe, Ho-Seong; Han, Mae Ja; Lee, Hye-Rim; Oh, Jae-Won; Kim, Baek-Jo

    2016-07-01

    Pollen is an important cause of respiratory allergic reactions. As individual sanitation has improved, allergy risk has increased, and this trend is expected to continue due to climate change. Atmospheric pollen concentration is highly influenced by weather conditions. Regression analysis and modeling of the relationships between airborne pollen concentrations and weather conditions were performed to analyze and forecast pollen conditions. Traditionally, daily pollen concentration has been estimated using regression models that describe the relationships between observed pollen concentrations and weather conditions. These models were able to forecast daily concentrations at the sites of observation, but lacked broader spatial applicability beyond those sites. To overcome this limitation, an integrated modeling scheme was developed that is designed to represent the underlying processes of pollen production and distribution. A maximum potential for airborne pollen is first determined using the Weibull probability density function. Then, daily pollen concentration is estimated using multiple regression models. Daily risk grade levels are determined based on the risk criteria used in Korea. The mean percentages of agreement between the observed and estimated levels were 81.4-88.2 % and 92.5-98.5 % for oak and Japanese hop pollens, respectively. The new models estimated daily pollen risk more accurately than the original statistical models because of the newly integrated biological response curves. Although they overestimated seasonal mean concentration, they did not simulate all of the peak concentrations. This issue would be resolved by adding more variables that affect the prevalence and internal maturity of pollens.
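
    A minimal sketch of the first modeling step described above: a Weibull probability density function used as the seasonal envelope of maximum potential airborne pollen, which then caps a daily regression estimate. The shape, scale, and peak values are assumed for illustration.

        import numpy as np

        def weibull_pdf(t, shape, scale):
            """Weibull PDF: f(t) = (k/lam) * (t/lam)**(k-1) * exp(-(t/lam)**k)."""
            t = np.asarray(t, dtype=float)
            f = np.zeros_like(t)
            pos = t > 0
            z = t[pos] / scale
            f[pos] = (shape / scale) * z ** (shape - 1) * np.exp(-(z ** shape))
            return f

        SHAPE, SCALE = 2.5, 20.0          # assumed season shape (days since season start)
        PEAK_POLLEN = 600.0               # assumed peak potential (grains per cubic metre)

        days = np.arange(0, 61)
        envelope = weibull_pdf(days, SHAPE, SCALE)
        envelope = PEAK_POLLEN * envelope / envelope.max()   # scale PDF to the maximum potential

        daily_regression_estimate = 450.0                    # hypothetical weather-regression output
        day = 10
        capped = min(daily_regression_estimate, envelope[day])
        print(round(float(envelope[day]), 1), round(float(capped), 1))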

  17. Developing intensity duration frequency curves based on scaling theory using linear probability weighted moments: A case study from India

    NASA Astrophysics Data System (ADS)

    Bairwa, Arvind Kumar; Khosa, Rakesh; Maheswaran, R.

    2016-11-01

    In this study, the presence of multi-scale behaviour in the rainfall IDF relationship has been established using Linear Probability Weighted Moments (LPWMs) for selected stations in India. Simple, non-central moments (SMs) have seen widespread use in similar scaling studies, but these statistical attributes are known to mask the 'true' scaling pattern and consequently lead to inappropriate inferences. There is general agreement amongst researchers that conventional higher-order moments amplify the extreme observations and drastically affect the scaling exponents. An additional advantage of LPWMs over SMs is that they exist even when the standard moments do not. As an alternative, this study presents a comparison with results based on the robust LPWMs, which have revealed, in sharp contrast with the conventional moments, a definite multi-scaling behaviour at all four rainfall observation stations, selected from different climatic zones. The multi-scale IDF curves derived using LPWMs show good agreement with observations, and it is accordingly concluded that LPWMs provide a more reliable tool for investigating scaling in sequences of observed rainfall corresponding to various durations.

  18. Maximum Magnitude and Probabilities of Induced Earthquakes in California Geothermal Fields: Applications for a Science-Based Decision Framework

    NASA Astrophysics Data System (ADS)

    Weiser, Deborah Anne

    Induced seismicity is occurring at increasing rates around the country. Brodsky and Lajoie (2013) and others have recognized anthropogenic quakes at a few geothermal fields in California. I use three techniques to assess if there are induced earthquakes in California geothermal fields; there are three sites with clear induced seismicity: Brawley, The Geysers, and Salton Sea. Moderate to strong evidence is found at Casa Diablo, Coso, East Mesa, and Susanville. Little to no evidence is found for Heber and Wendel. I develop a set of tools to reduce or cope with the risk imposed by these earthquakes, and also to address uncertainties through simulations. I test if an earthquake catalog may be bounded by an upper magnitude limit. I address whether the earthquake record during pumping time is consistent with the past earthquake record, or if injection can explain all or some of the earthquakes. I also present ways to assess the probability of future earthquake occurrence based on past records. I summarize current legislation for eight states where induced earthquakes are of concern. Unlike tectonic earthquakes, the hazard from induced earthquakes has the potential to be modified. I discuss direct and indirect mitigation practices. I present a framework with scientific and communication techniques for assessing uncertainty, ultimately allowing more informed decisions to be made.

  19. A biology-driven receptor model for daily pollen allergy risk in Korea based on Weibull probability density function

    NASA Astrophysics Data System (ADS)

    Kim, Kyu Rang; Kim, Mijin; Choe, Ho-Seong; Han, Mae Ja; Lee, Hye-Rim; Oh, Jae-Won; Kim, Baek-Jo

    2017-02-01

    Pollen is an important cause of respiratory allergic reactions. As individual sanitation has improved, allergy risk has increased, and this trend is expected to continue due to climate change. Atmospheric pollen concentration is highly influenced by weather conditions. Regression analysis and modeling of the relationships between airborne pollen concentrations and weather conditions were performed to analyze and forecast pollen conditions. Traditionally, daily pollen concentration has been estimated using regression models that describe the relationships between observed pollen concentrations and weather conditions. These models were able to forecast daily concentrations at the sites of observation, but lacked broader spatial applicability beyond those sites. To overcome this limitation, an integrated modeling scheme was developed that is designed to represent the underlying processes of pollen production and distribution. A maximum potential for airborne pollen is first determined using the Weibull probability density function. Then, daily pollen concentration is estimated using multiple regression models. Daily risk grade levels are determined based on the risk criteria used in Korea. The mean percentages of agreement between the observed and estimated levels were 81.4-88.2 % and 92.5-98.5 % for oak and Japanese hop pollens, respectively. The new models estimated daily pollen risk more accurately than the original statistical models because of the newly integrated biological response curves. Although they overestimated seasonal mean concentration, they did not simulate all of the peak concentrations. This issue would be resolved by adding more variables that affect the prevalence and internal maturity of pollens.

  20. Location and release time identification of pollution point source in river networks based on the Backward Probability Method.

    PubMed

    Ghane, Alireza; Mazaheri, Mehdi; Mohammad Vali Samani, Jamal

    2016-09-15

    The pollution of rivers due to accidental spills is a major threat to the environment and to human health. To protect river systems from accidental spills, it is essential to introduce a reliable tool for the identification process. The Backward Probability Method (BPM) is one of the most recommended tools, as it is able to provide information related to the prior location and the release time of the pollution. This method was originally developed and employed in groundwater pollution source identification problems. One of the objectives of this study is to apply this method to identifying the pollution source location and release time in surface waters, mainly in rivers. To accomplish this task, a numerical model is developed based on the adjoint analysis. The developed model is then verified using an analytical solution and some real data. The second objective of this study is to extend the method to pollution source identification in river networks. In this regard, a hypothetical test case is considered. In these simulations, all of the suspected points are identified using only one backward simulation. The results demonstrated that every suspected point determined by the BPM could be a possible pollution source. The proposed approach is accurate and computationally efficient and does not need any simplification of the river geometry or flow. Due to this simplicity, it is highly recommended for practical purposes.

  1. A multiscale finite element model validation method of composite cable-stayed bridge based on Probability Box theory

    NASA Astrophysics Data System (ADS)

    Zhong, Rumian; Zong, Zhouhong; Niu, Jie; Liu, Qiqi; Zheng, Peijuan

    2016-05-01

    Modeling and simulation are routinely implemented to predict the behavior of complex structures. These tools powerfully unite theoretical foundations, numerical models, and experimental data, which include the associated uncertainties and errors. A new methodology for multi-scale finite element (FE) model validation is proposed in this paper. The method is based on a two-step updating method, a novel approach for obtaining the coupling parameters in the gluing sub-regions of a multi-scale FE model, and on Probability Box (P-box) theory, which can provide lower and upper bounds for quantifying and transmitting the uncertainty of structural parameters. The structural health monitoring data of Guanhe Bridge, a long-span composite cable-stayed bridge, and Monte Carlo simulation were used to verify the proposed method. The results show satisfactory accuracy, as the overlap ratio index of each modal frequency is over 89% without the average absolute value of relative errors, and the CDF of the normal distribution coincides well with the measured frequencies of Guanhe Bridge. The validated multiscale FE model may be further used in structural damage prognosis and safety prognosis.

  2. Time-dependent earthquake probabilities

    USGS Publications Warehouse

    Gomberg, J.; Belardinelli, M.E.; Cocco, M.; Reasenberg, P.

    2005-01-01

    We have attempted to provide a careful examination of a class of approaches for estimating the conditional probability of failure of a single large earthquake, particularly approaches that account for static stress perturbations to tectonic loading, as in the approaches of Stein et al. (1997) and Hardebeck (2004). We have done this in a framework based on a simple, generalized rate-change formulation and applied it to these two approaches to show how they relate to one another. We also have attempted to show the connection between models of seismicity rate changes applied to (1) populations of independent faults, as in background and aftershock seismicity, and (2) changes in estimates of the conditional probability of failure of a single fault, for which the notion of failure rate corresponds to successive failures of different members of a population of faults. The latter application requires specification of some probability distribution (probability density function, or PDF) that describes some population of potential recurrence times. This PDF may reflect our imperfect knowledge of when past earthquakes have occurred on a fault (epistemic uncertainty), the true natural variability in failure times, or some combination of both. We suggest two end-member conceptual single-fault models that may explain natural variability in recurrence times and suggest how they might be distinguished observationally. When viewed deterministically, these single-fault patch models differ significantly in their physical attributes, and when faults are immature, they differ in their responses to stress perturbations. Estimates of conditional failure probabilities effectively integrate over a range of possible deterministic fault models, usually with ranges that correspond to mature faults. Thus conditional failure probability estimates usually should not differ significantly for these models. Copyright 2005 by the American Geophysical Union.

  3. Univariate Probability Distributions

    ERIC Educational Resources Information Center

    Leemis, Lawrence M.; Luckett, Daniel J.; Powell, Austin G.; Vermeer, Peter E.

    2012-01-01

    We describe a web-based interactive graphic that can be used as a resource in introductory classes in mathematical statistics. This interactive graphic presents 76 common univariate distributions and gives details on (a) various features of the distribution such as the functional form of the probability density function and cumulative distribution…

  4. Estimating tail probabilities

    SciTech Connect

    Carr, D.B.; Tolley, H.D.

    1982-12-01

    This paper investigates procedures for univariate nonparametric estimation of tail probabilities. Extrapolated values for tail probabilities beyond the data are also obtained based on the shape of the density in the tail. Several estimators which use exponential weighting are described. These are compared in a Monte Carlo study to nonweighted estimators, to the empirical cdf, to an integrated kernel, to a Fourier series estimate, to a penalized likelihood estimate and a maximum likelihood estimate. Selected weighted estimators are shown to compare favorably to many of these standard estimators for the sampling distributions investigated.
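
    In a similar spirit, the sketch below estimates a tail probability empirically up to a threshold and extrapolates beyond the data by fitting an exponential to the exceedances; this is a generic illustration, not one of the specific weighted estimators compared in the report.

        import numpy as np

        rng = np.random.default_rng(0)
        x = rng.exponential(scale=2.0, size=500)      # sample whose true tail is exp(-t/2)

        u = np.quantile(x, 0.90)                      # tail threshold
        exceedances = x[x > u] - u
        beta_hat = exceedances.mean()                 # MLE of the exponential tail scale
        p_u = np.mean(x > u)                          # empirical P(X > u)

        def tail_prob(t):
            """Estimated P(X > t) for t >= u: empirical mass beyond u times exponential decay."""
            return p_u * np.exp(-(t - u) / beta_hat)

        t = 12.0                                      # well beyond most of the data
        print(tail_prob(t), np.exp(-t / 2.0))         # extrapolated estimate vs. true tail probability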

  5. A VLSI Architecture with Multiple Fast Store-Based Block Parallel Processing for Output Probability and Likelihood Score Computations in HMM-Based Isolated Word Recognition

    NASA Astrophysics Data System (ADS)

    Nakamura, Kazuhiro; Shimazaki, Ryo; Yamamoto, Masatoshi; Takagi, Kazuyoshi; Takagi, Naofumi

    This paper presents a memory-efficient VLSI architecture for output probability computations (OPCs) of continuous hidden Markov models (HMMs) and likelihood score computations (LSCs). These computations are the most time consuming part of HMM-based isolated word recognition systems. We demonstrate multiple fast store-based block parallel processing (MultipleFastStoreBPP) for OPCs and LSCs and present a VLSI architecture that supports it. Compared with conventional fast store-based block parallel processing (FastStoreBPP) and stream-based block parallel processing (StreamBPP) architectures, the proposed architecture requires fewer registers and less processing time. The processing elements (PEs) used in the FastStoreBPP and StreamBPP architectures are identical to those used in the MultipleFastStoreBPP architecture. From a VLSI architectural viewpoint, a comparison shows that the proposed architecture is an improvement over the others, through efficient use of PEs and registers for storing input feature vectors.

  6. Threatened species and the potential loss of phylogenetic diversity: conservation scenarios based on estimated extinction probabilities and phylogenetic risk analysis.

    PubMed

    Faith, Daniel P

    2008-12-01

    New species conservation strategies, including the EDGE of Existence (EDGE) program, have expanded threatened species assessments by integrating information about species' phylogenetic distinctiveness. Distinctiveness has been measured through simple scores that assign shared credit among species for evolutionary heritage represented by the deeper phylogenetic branches. A species with a high score combined with a high extinction probability receives high priority for conservation efforts. Simple hypothetical scenarios for phylogenetic trees and extinction probabilities demonstrate how such scoring approaches can provide inefficient priorities for conservation. An existing probabilistic framework derived from the phylogenetic diversity measure (PD) properly captures the idea of shared responsibility for the persistence of evolutionary history. It avoids static scores, takes into account the status of close relatives through their extinction probabilities, and allows for the necessary updating of priorities in light of changes in species threat status. A hypothetical phylogenetic tree illustrates how changes in extinction probabilities of one or more species translate into changes in expected PD. The probabilistic PD framework provided a range of strategies that moved beyond expected PD to better consider worst-case PD losses. In another example, risk aversion gave higher priority to a conservation program that provided a smaller, but less risky, gain in expected PD. The EDGE program could continue to promote a list of top species conservation priorities through application of probabilistic PD and simple estimates of current extinction probability. The list might be a dynamic one, with all the priority scores updated as extinction probabilities change. Results of recent studies suggest that estimation of extinction probabilities derived from the red list criteria linked to changes in species range sizes may provide estimated probabilities for many different species
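
    A minimal sketch of the probabilistic PD calculation described above: expected PD is the sum over branches of branch length times the probability that at least one descendant species survives (one minus the product of the descendants' extinction probabilities). The toy tree and extinction probabilities are invented.

        from math import prod

        # Each branch: (length, set of species descended from it)
        branches = [
            (5.0, {"A", "B", "C", "D"}),   # deep branch shared by all four species
            (3.0, {"A", "B"}),
            (3.0, {"C", "D"}),
            (1.0, {"A"}), (1.0, {"B"}), (1.0, {"C"}), (1.0, {"D"}),
        ]

        p_ext = {"A": 0.9, "B": 0.8, "C": 0.1, "D": 0.2}   # extinction probabilities

        def expected_pd(branches, p_ext):
            return sum(length * (1.0 - prod(p_ext[s] for s in species))
                       for length, species in branches)

        baseline = expected_pd(branches, p_ext)
        # Updating one species' threat status simply re-evaluates the same sum:
        saved = dict(p_ext, A=0.3)                          # hypothetical conservation action on A
        print(round(baseline, 3), round(expected_pd(branches, saved), 3))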

  7. Small-area estimation of the probability of toxocariasis in New York City based on sociodemographic neighborhood composition.

    PubMed

    Walsh, Michael G; Haseeb, M A

    2014-01-01

    Toxocariasis is increasingly recognized as an important neglected infection of poverty (NIP) in developed countries, and may constitute the most important NIP in the United States (US) given its association with chronic sequelae such as asthma and poor cognitive development. Its potential public health burden notwithstanding, toxocariasis surveillance is minimal throughout the US and so the true burden of disease remains uncertain in many areas. The Third National Health and Nutrition Examination Survey conducted a representative serologic survey of toxocariasis to estimate the prevalence of infection in diverse US subpopulations across different regions of the country. Using the NHANES III surveillance data, the current study applied the predicted probabilities of toxocariasis to the sociodemographic composition of New York census tracts to estimate the local probability of infection across the city. The predicted probability of toxocariasis ranged from 6% among US-born Latino women with a university education to 57% among immigrant men with less than a high school education. The predicted probability of toxocariasis exhibited marked spatial variation across the city, with particularly high infection probabilities in large sections of Queens, and smaller, more concentrated areas of Brooklyn and northern Manhattan. This investigation is the first attempt at small-area estimation of the probability surface of toxocariasis in a major US city. While this study does not define toxocariasis risk directly, it does provide a much needed tool to aid the development of toxocariasis surveillance in New York City.

  8. Evaluation of the eruptive potential and probability in open conduit volcano (Mt Etna) based on soil CO2 flux measurements

    NASA Astrophysics Data System (ADS)

    De Gregorio, Sofia; Camarda, Marco

    2016-04-01

    The evaluation of the amount of magma that might potentially be erupted, i.e. the eruptive potential (EP), and of the probability of occurrence of an eruptive event, i.e. the eruptive probability (EPR), of an active volcano is one of the most compelling and challenging topics addressed by the volcanology community in recent years. The evaluation of the EP of an open-conduit volcano is generally based on a constant magma supply rate deduced from long-term series of eruptive rates. This EP computation gives good results for long-term (centuries) evaluations, but is less effective when short-term (years or months) estimations are needed. In fact, the rate of magma supply can change over both the long term and the short term. At steady conditions it can be assumed that the regular supply of magma maintains an almost constant level of magma in the feeding system (FS), whereas episodic surpluses of magma input, with respect to the regular supply, can cause large variations in the magma level. It follows that the surplus of magma occasionally entering the FS represents a supply of material that sooner or later will be disposed of, i.e. it will be emitted. Accordingly, the amount of surplus magma entering the FS roughly corresponds to the amount of magma that must be erupted in order to restore equilibrium. Furthermore, the larger the amount of surplus magma stored in the system, the higher the energetic level of the system and its propensity to erupt, in other words its EPR. In light of these considerations, we present here an innovative methodology to evaluate the EP based on the quantification of the surplus of magma, with respect to the regular supply, progressively intruded into the FS. To estimate the surplus of magma supply we used soil CO2 emission data measured monthly at 130 sites in two peripheral areas of Mt Etna volcano. Indeed, as reported by many authors, soil CO2 emissions in these areas are linked to magma supply dynamics, and anomalous discharges of CO2 are ascribable to surplus of

  9. [EEG background activity in patients with dementia of the Alzheimer type--with special reference to analysis by t-statistic significance probability mapping (SPM) in Alzheimer's disease and senile dementia].

    PubMed

    Miyauchi, T; Hagimoto, H; Saito, T; Endo, K; Ishii, M; Yamaguchi, T; Kajiwara, A; Matsushita, M

    1989-01-01

    EEG power amplitude and power ratio data obtained from 15 patients (3 men and 12 women) with Alzheimer's disease (AD) and 8 patients (2 men and 6 women) with senile dementia of Alzheimer type (SDAT) were compared with similar data from 40 age- and sex-matched normal controls. Compared with the healthy controls, both patient groups demonstrated increased EEG background slowing, and the slowing was more pronounced in AD than in SDAT. Moreover, each group showed characteristic findings on EEG topography and t-statistic significance probability mapping (SPM). The differences between AD patients and their controls showed marked slowing with reductions in alpha 2, beta 1 and beta 2 activity. The SPMs of the power ratio in the theta and alpha 2 bands showed the most prominent significance in the right posterior-temporal region, whereas the delta and beta bands did so in the frontal region. Severe AD showed only frontal delta slowing compared with mild AD. The differences between SDAT patients and their controls showed only mild slowing in the delta and theta bands. The SPM of power amplitude showed occipital slowing, whereas the SPM of the power ratio showed slowing in the frontal region. Taken together, these topographic findings were considered to denote a diffuse slowing tendency. In summary, these results suggest that in AD, cortical damage, reflected in EEG slowing with reductions of the alpha 2 and beta bands, develops rapidly and is followed by subcortical (non-specific thalamic) changes reflected in frontal delta activity on SPM. In SDAT, by contrast, diffuse cortico-subcortical damage with diffuse slowing on EEG topography develops gradually.

  10. Prospect evaluation as a function of numeracy and probability denominator.

    PubMed

    Millroth, Philip; Juslin, Peter

    2015-05-01

    This study examines how numeracy and probability denominator (a direct-ratio probability, a relative frequency with denominator 100, a relative frequency with denominator 10,000) affect the evaluation of prospects in an expected-value based pricing task. We expected that numeracy would affect the results due to differences in the linearity of number perception and the susceptibility to denominator neglect with different probability formats. An analysis with functional measurement verified that participants integrated value and probability into an expected value. However, a significant interaction between numeracy and probability format and subsequent analyses of the parameters of cumulative prospect theory showed that the manipulation of probability denominator changed participants' psychophysical response to probability and value. Standard methods in decision research may thus confound people's genuine risk attitude with their numerical capacities and the probability format used.
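
    For readers unfamiliar with the cumulative prospect theory parameters mentioned above, the following sketch shows the standard one-parameter probability weighting function of Tversky and Kahneman (1992); the gamma values are illustrative and are not the estimates reported in this study.

```python
import numpy as np

def cpt_weight(p, gamma):
    """Tversky-Kahneman (1992) probability weighting function w(p)."""
    return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

p = np.linspace(0.01, 0.99, 5)
for gamma in (1.0, 0.61):          # 1.0 = linear weighting; 0.61 is illustrative
    # The subjective value of a simple prospect (x, p) is then w(p) * v(x).
    print(gamma, np.round(cpt_weight(p, gamma), 3))
```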

  11. Physics-based Broadband Ground Motion Simulations for Probable M>7.0 earthquakes in the Marmara Sea Region (Turkey)

    NASA Astrophysics Data System (ADS)

    Akinci, Aybige; Aochi, Hideo; Herrero, Andre; Pischiutta, Marta; Karanikas, Dimitris

    2016-04-01

    The city of Istanbul is characterized by one of the highest levels of seismic risk in Europe and the Mediterranean region. An important source of the increased risk in Istanbul is the high probability of occurrence of a large earthquake, which stands at about 65% in the coming years due to the existing seismic gap and the post-1999 earthquake stress transfer on the western portion of the North Anatolian Fault Zone (NAFZ). In this study, we have simulated hybrid broadband time histories for two selected scenario earthquakes of magnitude M>7.0 in the Marmara Sea, within 10-20 km of Istanbul, in the area believed to have generated the devastating 1509 event in the region. The physics-based rupture scenarios, which may be an indication of potential future events, are adopted to estimate the ground motion characteristics and their variability in the region. Two simulation techniques (a full 3D wave propagation method to generate low-frequency seismograms, below about 1 Hz, and a stochastic technique to simulate high-frequency seismograms, above 1 Hz) are used to compute realistic time series associated with scenario earthquakes of magnitude Mw>7.0 in the Marmara Sea region. A dynamic rupture is generated and computed with a boundary integral equation method, and the propagation in the medium is realized through a finite difference approach (Aochi and Ulrich, 2015). The high-frequency radiation is computed using a stochastic finite-fault model approach based on a dynamic corner frequency (Motazedian and Atkinson, 2005; Boore, 2009). The results from the two simulation techniques are then merged by performing a weighted summation at intermediate frequencies to calculate broadband synthetic time series. The hybrid broadband ground motions computed with the proposed approach are validated by comparing peak ground acceleration (PGA), peak ground velocity (PGV), and spectral acceleration (SA) with recently proposed ground motion prediction equations (GMPEs) for the region. Our
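
    The merging step described above, combining a low-frequency deterministic trace with a high-frequency stochastic trace around a roughly 1 Hz crossover, can be sketched generically as below; this is a minimal low-pass/high-pass summation, not the authors' exact weighting scheme, and the sampling rate and filter order are assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def hybrid_merge(lf_trace, hf_trace, fs, crossover_hz=1.0, order=4):
    """Combine a low-frequency (deterministic) and a high-frequency (stochastic)
    synthetic seismogram around a crossover frequency. Generic sketch only."""
    b_lo, a_lo = butter(order, crossover_hz, btype="low", fs=fs)
    b_hi, a_hi = butter(order, crossover_hz, btype="high", fs=fs)
    return filtfilt(b_lo, a_lo, lf_trace) + filtfilt(b_hi, a_hi, hf_trace)

fs = 100.0                                   # samples per second (assumed)
t = np.arange(0, 60, 1 / fs)
lf = np.sin(2 * np.pi * 0.3 * t)             # placeholder low-frequency synthetic
hf = 0.2 * np.random.randn(t.size)           # placeholder high-frequency synthetic
broadband = hybrid_merge(lf, hf, fs)
```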

  12. Highly efficient codec based on significance-linked connected-component analysis of wavelet coefficients

    NASA Astrophysics Data System (ADS)

    Chai, Bing-Bing; Vass, Jozsef; Zhuang, Xinhua

    1997-04-01

    Recent success in wavelet coding is mainly attributed to the recognition of the importance of data organization. Several very competitive wavelet codecs have been developed, namely Shapiro's Embedded Zerotree Wavelets (EZW), Servetto et al.'s Morphological Representation of Wavelet Data (MRWD), and Said and Pearlman's Set Partitioning in Hierarchical Trees (SPIHT). In this paper, we propose a new image compression algorithm called Significance-Linked Connected Component Analysis (SLCCA) of wavelet coefficients. SLCCA exploits both within-subband clustering of significant coefficients and cross-subband dependency in significant fields. A so-called significance link between connected components is designed to reduce the positional overhead of MRWD. In addition, the magnitudes of the significant coefficients are encoded in bit-plane order to match the probability model of the adaptive arithmetic coder. Experiments show that SLCCA outperforms both EZW and MRWD, and is on par with SPIHT. Furthermore, SLCCA generally performs best on images with a large proportion of texture. When applied to fingerprint image compression, it outperforms the FBI's wavelet scalar quantization by about 1 dB.
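
    The bit-plane ordering mentioned above can be illustrated with a minimal sketch that emits coefficient magnitudes one bit plane at a time, from the most significant plane down; the adaptive arithmetic coder that would consume these bits is omitted, and the coefficient values are made up.

```python
import numpy as np

def bit_planes(magnitudes):
    """Yield (plane_index, bits) from the most significant bit plane down,
    for non-negative integer wavelet-coefficient magnitudes."""
    mags = np.asarray(magnitudes, dtype=np.uint32)
    top = int(mags.max()).bit_length() - 1 if mags.max() > 0 else 0
    for plane in range(top, -1, -1):
        yield plane, (mags >> plane) & 1

coeff_magnitudes = [13, 2, 7, 0, 21]      # illustrative significant coefficients
for plane, bits in bit_planes(coeff_magnitudes):
    print(f"plane {plane}: {bits.tolist()}")
```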

  13. Industrial Base: Significance of DoD’s Foreign Dependence

    DTIC Science & Technology

    1991-01-01

    defense industrial base: the U.S. defense industrial base information system and revised DOD guidance for assessing foreign dependence throughout the... [DOD's Foreign Dependence Is Unknown] ...data bases and models is cited as a problem hindering effective industrial base planning. Determining if... [Assess the Significance of Foreign Dependence on the DOD] ...procedures to include early consideration of foreign sourcing and dependency issues. ...DOD efforts

  14. A Galerkin-based formulation of the probability density evolution method for general stochastic finite element systems

    NASA Astrophysics Data System (ADS)

    Papadopoulos, Vissarion; Kalogeris, Ioannis

    2016-05-01

    The present paper proposes a Galerkin finite element projection scheme for the solution of the partial differential equations (PDEs) involved in the probability density evolution method, for the linear and nonlinear static analysis of stochastic systems. According to the principle of preservation of probability, the probability density evolution of a stochastic system is expressed by its corresponding Fokker-Planck (FP) stochastic partial differential equation. Direct integration of the FP equation is feasible only for simple systems with a small number of degrees of freedom, due to analytical and/or numerical intractability. However, by rewriting the FP equation conditioned on the random event description, a generalized density evolution equation (GDEE) can be obtained, which can be reduced to a one-dimensional PDE. Two Galerkin finite element schemes are proposed for the numerical solution of the resulting PDEs, namely a time-marching discontinuous Galerkin scheme and the Streamline-Upwind/Petrov-Galerkin (SUPG) scheme. In addition, a reformulation of the classical GDEE is proposed, which implements the principle of probability preservation in space instead of time, making this approach suitable for the stochastic analysis of finite element systems. The advantages of the FE Galerkin methods, and in particular the SUPG scheme, over finite difference schemes such as the modified Lax-Wendroff scheme, which is the most frequently used method for the solution of the GDEE, are illustrated with numerical examples and explored further.
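
    Since the abstract uses the modified Lax-Wendroff finite difference scheme as the baseline for solving the one-dimensional GDEE (an advection-type equation), a minimal Lax-Wendroff update for u_t + a u_x = 0 is sketched below for orientation; the constant velocity, periodic boundary, and discretization values are assumptions, and this is not the paper's SUPG or discontinuous Galerkin scheme.

```python
import numpy as np

def lax_wendroff_step(u, c):
    """One Lax-Wendroff update for u_t + a u_x = 0 on a periodic grid.
    c = a*dt/dx is the Courant number (|c| <= 1 for stability)."""
    up = np.roll(u, -1)   # u_{j+1}
    um = np.roll(u, 1)    # u_{j-1}
    return u - 0.5 * c * (up - um) + 0.5 * c**2 * (up - 2 * u + um)

nx, a, dx, dt = 200, 1.0, 1.0 / 200, 0.8 / 200   # illustrative discretization
x = np.linspace(0.0, 1.0, nx, endpoint=False)
u = np.exp(-200 * (x - 0.3) ** 2)                # initial density lump
c = a * dt / dx                                  # Courant number = 0.8
for _ in range(100):
    u = lax_wendroff_step(u, c)
```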

  15. Changes in Sexual Behavior and Attitudes Across Generations and Gender Among a Population-Based Probability Sample From an Urbanizing Province in Thailand.

    PubMed

    Techasrivichien, Teeranee; Darawuttimaprakorn, Niphon; Punpuing, Sureeporn; Musumari, Patou Masika; Lukhele, Bhekumusa Wellington; El-Saaidi, Christina; Suguimoto, S Pilar; Feldman, Mitchell D; Ono-Kihara, Masako; Kihara, Masahiro

    2016-02-01

    Thailand has undergone rapid modernization with implications for changes in sexual norms. We investigated sexual behavior and attitudes across generations and gender among a probability sample of the general population of Nonthaburi province located near Bangkok in 2012. A tablet-based survey was performed among 2,138 men and women aged 15-59 years identified through a three-stage, stratified, probability proportional to size, clustered sampling. Descriptive statistical analysis was carried out accounting for the effects of multistage sampling. Relationship of age and gender to sexual behavior and attitudes was analyzed by bivariate analysis followed by multivariate logistic regression analysis to adjust for possible confounding. Patterns of sexual behavior and attitudes varied substantially across generations and gender. We found strong evidence for a decline in the age of sexual initiation, a shift in the type of the first sexual partner, and a greater rate of acceptance of adolescent premarital sex among younger generations. The study highlighted profound changes among young women as evidenced by a higher number of lifetime sexual partners as compared to older women. In contrast to the significant gender gap in older generations, sexual profiles of Thai young women have evolved to resemble those of young men with attitudes gradually converging to similar sexual standards. Our data suggest that higher education, being never-married, and an urban lifestyle may have been associated with these changes. Our study found that Thai sexual norms are changing dramatically. It is vital to continue monitoring such changes, considering the potential impact on the HIV/STIs epidemic and unintended pregnancies.

  16. A Block Compressive Sensing Based Scalable Encryption Framework for Protecting Significant Image Regions

    NASA Astrophysics Data System (ADS)

    Zhang, Yushu; Zhou, Jiantao; Chen, Fei; Zhang, Leo Yu; Xiao, Di; Chen, Bin; Liao, Xiaofeng

    The existing Block Compressive Sensing (BCS) based image ciphers adopt the same sampling rate for all blocks, which may lead to the undesirable result that, after subsampling, significant blocks lose some more-useful information while insignificant blocks still retain some less-useful information. Motivated by this observation, we propose a scalable encryption framework (SEF) based on BCS together with a Sobel edge detector and cascade chaotic maps. Our work is first dedicated to the design of two new fusion techniques: chaos-based structurally random matrices, and chaos-based random convolution and subsampling. The basic idea is to divide an image into blocks of equal size and then assess their respective significance with the help of the Sobel edge detector. Chaos-based structurally random matrices are applied to the significant blocks, whereas chaos-based random convolution and subsampling are responsible for the remaining insignificant ones. In comparison with existing BCS-based image ciphers, the SEF applies lightweight subsampling and strong sensitivity-oriented encryption to the significant blocks, and heavy subsampling and lightweight robustness-oriented encryption to the insignificant ones, in parallel, thus better protecting significant image regions.
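
    The block-significance diagnosis step can be sketched as follows: compute a Sobel gradient magnitude and flag the blocks with the highest mean edge energy as significant. The block size and the fraction of blocks treated as significant are illustrative assumptions, not the paper's rule.

```python
import numpy as np
from scipy.ndimage import sobel

def classify_blocks(image, block=16, significant_fraction=0.3):
    """Label each block x block tile as significant (True) or not, ranked by
    Sobel edge energy. The 30% cutoff is illustrative, not the paper's rule."""
    gx, gy = sobel(image, axis=0), sobel(image, axis=1)
    energy = np.hypot(gx, gy)
    h, w = image.shape[0] // block, image.shape[1] // block
    scores = np.array([[energy[i*block:(i+1)*block, j*block:(j+1)*block].mean()
                        for j in range(w)] for i in range(h)])
    cutoff = np.quantile(scores, 1.0 - significant_fraction)
    return scores >= cutoff

rng = np.random.default_rng(0)
img = rng.random((128, 128))          # placeholder image
significant = classify_blocks(img)
```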

  17. Searching with Probabilities

    DTIC Science & Technology

    1983-07-26

    DeGroot, Morris H. Probability and Statistics. Addison-Wesley Publishing Company, Reading, Massachusetts, 1975. [Gillogly 78] Gillogly, J.J. Performance...distribution [DeGroot 75] has just begun. The beta distribution has several features that might make it a more reasonable choice. As with the normal-based...1982. [Cooley 65] Cooley, J.M. and Tukey, J.W. An algorithm for the machine calculation of complex Fourier series. Math. Comp. 19, 1965. [DeGroot 75

  18. Low probability of intercept-based adaptive radar waveform optimization in signal-dependent clutter for joint radar and cellular communication systems

    NASA Astrophysics Data System (ADS)

    Shi, Chenguang; Salous, Sana; Wang, Fei; Zhou, Jianjiang

    2016-12-01

    In this paper, we investigate the problem of low probability of intercept (LPI)-based adaptive radar waveform optimization in signal-dependent clutter for joint radar and cellular communication systems, where the radar system optimizes the transmitted waveform such that the interference caused to the cellular communication systems is strictly controlled. Assuming that precise knowledge of the target spectra, the power spectral densities (PSDs) of the signal-dependent clutter, the propagation losses of the corresponding channels, and the communication signals is available to the radar, three different LPI-based criteria for radar waveform optimization are proposed to minimize the total transmitted power of the radar system by optimizing the multicarrier radar waveform with a predefined signal-to-interference-plus-noise ratio (SINR) constraint and a minimum required capacity for the cellular communication systems. These criteria differ in the way the communication signals scattered off the target are considered in the radar waveform design: (1) as useful energy, (2) as interference, or (3) ignored altogether. The resulting problems are solved analytically and their solutions represent the optimum power allocation for each subcarrier in the multicarrier radar waveform. We show with numerical results that the LPI performance of the radar system can be significantly improved by exploiting the echoes scattered off the target due to cellular communication signals received at the radar receiver.

  19. Low probability of intercept-based adaptive radar waveform optimization in signal-dependent clutter for joint radar and cellular communication systems.

    PubMed

    Shi, Chenguang; Salous, Sana; Wang, Fei; Zhou, Jianjiang

    2016-01-01

    In this paper, we investigate the problem of low probability of intercept (LPI)-based adaptive radar waveform optimization in signal-dependent clutter for joint radar and cellular communication systems, where the radar system optimizes the transmitted waveform such that the interference caused to the cellular communication systems is strictly controlled. Assuming that precise knowledge of the target spectra, the power spectral densities (PSDs) of the signal-dependent clutter, the propagation losses of the corresponding channels, and the communication signals is available to the radar, three different LPI-based criteria for radar waveform optimization are proposed to minimize the total transmitted power of the radar system by optimizing the multicarrier radar waveform with a predefined signal-to-interference-plus-noise ratio (SINR) constraint and a minimum required capacity for the cellular communication systems. These criteria differ in the way the communication signals scattered off the target are considered in the radar waveform design: (1) as useful energy, (2) as interference, or (3) ignored altogether. The resulting problems are solved analytically and their solutions represent the optimum power allocation for each subcarrier in the multicarrier radar waveform. We show with numerical results that the LPI performance of the radar system can be significantly improved by exploiting the echoes scattered off the target due to cellular communication signals received at the radar receiver.

  20. Precursor Analysis for Flight- and Ground-Based Anomaly Risk Significance Determination

    NASA Technical Reports Server (NTRS)

    Groen, Frank

    2010-01-01

    This slide presentation reviews the precursor analysis for flight and ground based anomaly risk significance. It includes information on accident precursor analysis, real models vs. models, and probabilistic analysis.

  1. Significance testing of rules in rule-based models of human problem solving

    NASA Technical Reports Server (NTRS)

    Lewis, C. M.; Hammer, J. M.

    1986-01-01

    Rule-based models of human problem solving have typically not been tested for statistical significance. Three methods of testing rules - analysis of variance, randomization, and contingency tables - are presented. Advantages and disadvantages of the methods are also described.

  2. Wave height possibility distribution characteristics of significant wave height in China Sea based on multi-satellite grid data

    NASA Astrophysics Data System (ADS)

    Han, W.; Yang, J.

    2016-11-01

    This paper discusses the probability distribution characteristics of significant wave height (SWH) in the China Sea based on multi-satellite gridded data. The gridded SWH data merge corrected altimeter data from six satellites (TOPEX/Poseidon, Jason-1/2, ENVISAT, Cryosat-2, HY-2A) into a global SWH grid for 2000-2015 using the Inverse Distance Weighting method. We compare the wave height probability distributions of two schemes, where scheme two includes all six satellites and scheme one includes the other five satellites without HY-2A, over two wave height intervals, the first being [0, 25) m and the second [4, 25) m. We find that the two schemes have very similar probability distributions and probability trends, differing only in the interval [0.4, 1.8) m, and that the probability in this interval accounts for over 70% of the total. Focusing on scheme two, we find that the interval of greatest wave height probability is [0.6, 3) m, and that the probability of the SWH exceeding 4 m is less than 0.18%.
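
    The Inverse Distance Weighting step used to build the grid can be sketched minimally as below; the power parameter and the toy sample locations are assumptions, not the values used in the paper.

```python
import numpy as np

def idw(sample_xy, sample_swh, grid_xy, power=2.0, eps=1e-12):
    """Inverse Distance Weighting of scattered SWH samples onto grid nodes.
    power=2 is a common default, not necessarily the value used in the paper."""
    d = np.linalg.norm(grid_xy[:, None, :] - sample_xy[None, :, :], axis=2)
    w = 1.0 / (d + eps) ** power
    return (w * sample_swh).sum(axis=1) / w.sum(axis=1)

samples = np.array([[121.0, 25.0], [122.5, 24.0], [120.5, 23.5]])  # lon, lat (toy)
swh = np.array([1.2, 2.4, 0.9])                                     # metres (toy)
grid = np.array([[121.5, 24.5], [122.0, 25.0]])
print(idw(samples, swh, grid))
```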

  3. Co-activation Probability Estimation (CoPE): An approach for modeling functional co-activation architecture based on neuroimaging coordinates

    PubMed Central

    Chu, Congying; Fan, Lingzhong; Eickhoff, Claudia R.; Liu, Yong; Yang, Yong; Eickhoff, Simon B.; Jiang, Tianzi

    2016-01-01

    Recent progress in functional neuroimaging has prompted studies of brain activation during various cognitive tasks. Coordinate-based meta-analysis has been utilized to discover the brain regions that are consistently activated across experiments. However, within-experiment co-activation relationships, which can reflect the underlying functional relationships between different brain regions, have not been widely studied. In particular, voxel-wise co-activation, which may be able to provide a detailed configuration of the co-activation network, still needs to be modeled. To estimate the voxel-wise co-activation pattern and deduce the co-activation network, a Co-activation Probability Estimation (CoPE) method was proposed to model within-experiment activations for the purpose of defining the co-activations. A permutation test was adopted as a significance test. Moreover, the co-activations were automatically separated into local and long-range ones, based on distance. The two types of co-activations describe distinct features: the first reflects convergent activations; the second represents co-activations between different brain regions. The validation of CoPE was based on five simulation tests and one real dataset derived from studies of working memory. Both the simulated and the real data demonstrated that CoPE was not only able to find local convergence but also significant long-range co-activation. In particular, CoPE was able to identify a ‘core’ co-activation network in the working memory dataset. As a data-driven method, the CoPE method can be used to mine underlying co-activation relationships across experiments in future studies. PMID:26037052
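
    The permutation test adopted as the significance test can be illustrated generically: compare an observed co-activation statistic against a null distribution obtained by shuffling one of the activation patterns. The toy statistic below is only a stand-in for CoPE's actual voxel-wise model.

```python
import numpy as np

def permutation_p_value(observed_stat, null_samples):
    """One-sided permutation p-value with the standard +1 correction."""
    null_samples = np.asarray(null_samples)
    return (1 + np.sum(null_samples >= observed_stat)) / (1 + null_samples.size)

rng = np.random.default_rng(1)
# Toy example: how often do two binary "activation" vectors co-occur?
a = rng.random(500) < 0.2
b = rng.random(500) < 0.2
observed = np.sum(a & b)
null = [np.sum(rng.permutation(a) & b) for _ in range(2000)]
print(permutation_p_value(observed, null))
```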

  4. Probability theory-based SNP association study method for identifying susceptibility loci and genetic disease models in human case-control data.

    PubMed

    Yuan, Xiguo; Zhang, Junying; Wang, Yue

    2010-12-01

    One of the most challenging problems in studying human common complex diseases is to search for both strong and weak susceptibility single-nucleotide polymorphisms (SNPs) and to identify forms of genetic disease models. Currently, a number of methods have been proposed for this purpose. Many of them have not been validated through application to various genome datasets, so their performance in real practice is unclear. In this paper, we present a novel SNP association study method based on probability theory, called ProbSNP. The method first detects SNPs by evaluating their joint probabilities in combination with disease status and selects those with the lowest joint probabilities as susceptibility SNPs, and then identifies forms of genetic disease models by testing multiple-locus interactions among the selected SNPs. The joint probabilities of combined SNPs are estimated by establishing Gaussian probability density functions, in which the related parameters (i.e., mean value and standard deviation) are evaluated based on allele and haplotype frequencies. Finally, we test and validate the method using various genome datasets. We find that ProbSNP has shown remarkable success in applications to both simulated genome data and real genome-wide data.

  5. Fuzzy forecasting based on two-factors second-order fuzzy-trend logical relationship groups and the probabilities of trends of fuzzy logical relationships.

    PubMed

    Chen, Shyi-Ming; Chen, Shen-Wen

    2015-03-01

    In this paper, we present a new method for fuzzy forecasting based on two-factors second-order fuzzy-trend logical relationship groups and the probabilities of trends of fuzzy-trend logical relationships. Firstly, the proposed method fuzzifies the historical training data of the main factor and the secondary factor into fuzzy sets, respectively, to form two-factors second-order fuzzy logical relationships. Then, it groups the obtained two-factors second-order fuzzy logical relationships into two-factors second-order fuzzy-trend logical relationship groups. Then, it calculates the probability of the "down-trend," the probability of the "equal-trend" and the probability of the "up-trend" of the two-factors second-order fuzzy-trend logical relationships in each two-factors second-order fuzzy-trend logical relationship group, respectively. Finally, it performs the forecasting based on the probabilities of the down-trend, the equal-trend, and the up-trend of the two-factors second-order fuzzy-trend logical relationships in each two-factors second-order fuzzy-trend logical relationship group. We also apply the proposed method to forecast the Taiwan Stock Exchange Capitalization Weighted Stock Index (TAIEX) and the NTD/USD exchange rates. The experimental results show that the proposed method outperforms the existing methods.
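
    The trend-probability step reduces to relative frequencies within each fuzzy-trend logical relationship group, as in the minimal sketch below; the group keys and observed trends are made-up placeholders.

```python
from collections import Counter

# Illustrative mapping: group key -> observed trends in the training data.
groups = {
    "A1,B2 -> A2,B2": ["up", "up", "equal", "down", "up"],
    "A3,B1 -> A2,B1": ["down", "down", "equal"],
}

trend_probabilities = {}
for key, trends in groups.items():
    counts = Counter(trends)
    total = sum(counts.values())
    # Probability of each trend within the group = its relative frequency.
    trend_probabilities[key] = {t: counts[t] / total for t in ("down", "equal", "up")}

print(trend_probabilities)
```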

  6. Probability workshop to be better in probability topic

    NASA Astrophysics Data System (ADS)

    Asmat, Aszila; Ujang, Suriyati; Wahid, Sharifah Norhuda Syed

    2015-02-01

    The purpose of the present study was to examine whether statistics anxiety and attitudes towards the probability topic among students at the higher education level have an effect on their performance. Sixty-two fourth-semester science students were given statistics anxiety questionnaires about their perception of the probability topic. Results indicated that students' performance in the probability topic is not related to anxiety level; that is, a higher level of statistics anxiety does not lead to a lower score in the probability topic. The study also revealed that motivated students benefited from the probability workshop, with their performance in the probability topic showing a positive improvement compared with before the workshop. In addition, there was a significant difference in students' performance between genders, with better achievement among female students than male students. Thus, more initiatives in learning programs with different teaching approaches are needed to provide useful information for improving student learning outcomes in higher learning institutions.

  7. Adapting the posterior probability of diagnosis index to enhance evidence-based screening: an application to ADHD in primary care.

    PubMed

    Lindhiem, Oliver; Yu, Lan; Grasso, Damion J; Kolko, David J; Youngstrom, Eric A

    2015-04-01

    This study adapts the Posterior Probability of Diagnosis (PPOD) Index for use with screening data. The original PPOD Index, designed for use in the context of comprehensive diagnostic assessments, is overconfident when applied to screening data. To correct for this overconfidence, we describe a simple method for adjusting the PPOD Index to improve its calibration when used for screening. Specifically, we compare the adjusted PPOD Index to the original index and naïve Bayes probability estimates on two dimensions of accuracy, discrimination and calibration, using a clinical sample of children and adolescents (N = 321) whose caregivers completed the Vanderbilt Assessment Scale to screen for attention-deficit/hyperactivity disorder and who subsequently completed a comprehensive diagnostic assessment. Results indicated that the adjusted PPOD Index, original PPOD Index, and naïve Bayes probability estimates are comparable using traditional measures of accuracy (sensitivity, specificity, and area under the curve), but the adjusted PPOD Index showed superior calibration. We discuss the importance of calibration for screening and diagnostic support tools when applied to individual patients.
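
    As a point of reference for the naive Bayes probability estimates mentioned above, the sketch below applies Bayes' rule to a dichotomous screening result using a base-rate prior, sensitivity, and specificity; the numbers are illustrative and this is not the PPOD Index or its adjustment.

```python
def screen_posterior(prior, sensitivity, specificity, screen_positive=True):
    """Posterior probability of diagnosis after a dichotomous screen,
    via Bayes' rule (naive Bayes-style benchmark, not the PPOD Index itself)."""
    if screen_positive:
        num = sensitivity * prior
        den = num + (1 - specificity) * (1 - prior)
    else:
        num = (1 - sensitivity) * prior
        den = num + specificity * (1 - prior)
    return num / den

# Illustrative values only.
print(screen_posterior(prior=0.30, sensitivity=0.85, specificity=0.75))
```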

  8. Probability Surveys, Conditional Probability, and Ecological Risk Assessment

    EPA Science Inventory

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency’s (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  9. PROBABILITY SURVEYS, CONDITIONAL PROBABILITIES, AND ECOLOGICAL RISK ASSESSMENT

    EPA Science Inventory

    We show that probability-based environmental resource monitoring programs, such as U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program (EMAP) can be analyzed with a conditional probability analysis (CPA) to conduct quantitative probabi...

  10. PROBABILITY SURVEYS , CONDITIONAL PROBABILITIES AND ECOLOGICAL RISK ASSESSMENT

    EPA Science Inventory

    We show that probability-based environmental resource monitoring programs, such as the U.S. Environmental Protection Agency's (U.S. EPA) Environmental Monitoring and Assessment Program, and conditional probability analysis can serve as a basis for estimating ecological risk over ...

  11. Capture probabilities for secondary resonances

    NASA Technical Reports Server (NTRS)

    Malhotra, Renu

    1990-01-01

    A perturbed pendulum model is used to analyze secondary resonances, and it is shown that a self-similarity between secondary and primary resonances exists. Henrard's (1982) theory is used to obtain formulas for the capture probability into secondary resonances. The tidal evolution of Miranda and Umbriel is considered as an example, and significant probabilities of capture into secondary resonances are found.

  12. Probability 1/e

    ERIC Educational Resources Information Center

    Koo, Reginald; Jones, Martin L.

    2011-01-01

    Quite a number of interesting problems in probability feature an event with probability equal to 1/e. This article discusses three such problems and attempts to explain why this probability occurs with such frequency.

  13. Modeling of the Dissociative Adsorption Probability of the H2-Pt(111) System Based on Molecular Dynamics

    NASA Astrophysics Data System (ADS)

    Koido, Tetsuya; Tomarikawa, Ko; Yonemura, Shigeru; Tokumasu, Takashi

    2011-05-01

    Molecular Dynamics (MD) was used to simulate the dissociative adsorption of a hydrogen molecule on the Pt(111) surface, considering the movement of the surface atoms and gas molecules. The Embedded Atom Method (EAM) was applied to represent the interaction potential. The parameters of the EAM potential were determined such that the values of the dissociation barrier at different sites estimated by the EAM potential agreed with those of DFT calculations. A number of MD simulations of gas molecules impinging on a Pt(111) surface were carried out, randomly changing the initial orientation, incident azimuth angle, and impingement position on the surface, with fixed initial translational energy, initial rotational energy, and incident polar angle. The number of collisions in which the gas molecule dissociated was counted to compute the dissociation probability. The dissociation probability was analyzed and expressed as a mathematical function of the initial conditions of the impinging molecule, namely the translational energy, rotational energy, and incident polar angle. Furthermore, the utility of the model was verified by comparing its results with raw MD simulation results of molecular beam experiments.
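
    The dissociation probability itself is simply the fraction of impinging trajectories that dissociate for a given set of initial conditions; a minimal post-processing sketch, with placeholder outcomes instead of real MD results, is shown below.

```python
import numpy as np

def dissociation_probability(dissociated_flags):
    """Fraction of impinging trajectories that dissociated, with a
    normal-approximation standard error (illustrative post-processing only)."""
    flags = np.asarray(dissociated_flags, dtype=float)
    p = flags.mean()
    se = np.sqrt(p * (1 - p) / flags.size)
    return p, se

rng = np.random.default_rng(2)
flags = rng.random(1000) < 0.35          # placeholder MD outcomes for one condition bin
print(dissociation_probability(flags))
```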

  14. Estimation of the age-specific per-contact probability of Ebola virus transmission in Liberia using agent-based simulations

    NASA Astrophysics Data System (ADS)

    Siettos, Constantinos I.; Anastassopoulou, Cleo; Russo, Lucia; Grigoras, Christos; Mylonakis, Eleftherios

    2016-06-01

    Based on multiscale agent-based computations we estimated the per-contact probability of transmission by age of the Ebola virus disease (EVD) that swept through Liberia from May 2014 to March 2015. For the approximation of the epidemic dynamics we have developed a detailed agent-based model with small-world interactions between individuals categorized by age. For the estimation of the structure of the evolving contact network as well as the per-contact transmission probabilities by age group we exploited the so called Equation-Free framework. Model parameters were fitted to official case counts reported by the World Health Organization (WHO) as well as to recently published data of key epidemiological variables, such as the mean time to death, recovery and the case fatality rate.

  15. Probability for Weather and Climate

    NASA Astrophysics Data System (ADS)

    Smith, L. A.

    2013-12-01

    Over the last 60 years, the availability of large-scale electronic computers has stimulated rapid and significant advances both in meteorology and in our understanding of the Earth System as a whole. The speed of these advances was due, in large part, to the sudden ability to explore nonlinear systems of equations. The computer allows the meteorologist to carry a physical argument to its conclusion; the time scales of weather phenomena then allow the refinement of physical theory, numerical approximation or both in light of new observations. Prior to this extension, as Charney noted, the practicing meteorologist could ignore the results of theory with good conscience. Today, neither the practicing meteorologist nor the practicing climatologist can do so, but to what extent, and in what contexts, should they place the insights of theory above quantitative simulation? And in what circumstances can one confidently estimate the probability of events in the world from model-based simulations? Despite solid advances of theory and insight made possible by the computer, the fidelity of our models of climate differs in kind from the fidelity of models of weather. While all prediction is extrapolation in time, weather resembles interpolation in state space, while climate change is fundamentally an extrapolation. The trichotomy of simulation, observation and theory which has proven essential in meteorology will remain incomplete in climate science. Operationally, the roles of probability, indeed the kinds of probability one has access to, are different in operational weather forecasting and climate services. Significant barriers to forming probability forecasts (which can be used rationally as probabilities) are identified. Monte Carlo ensembles can explore sensitivity, diversity, and (sometimes) the likely impact of measurement uncertainty and structural model error. The aims of different ensemble strategies, and fundamental differences in ensemble design to support of

  16. Young Children's Probability of Dying Before and After Their Mother's Death: A Rural South African Population-Based Surveillance Study

    PubMed Central

    Clark, Samuel J.; Kahn, Kathleen; Houle, Brian; Arteche, Adriane; Collinson, Mark A.; Tollman, Stephen M.; Stein, Alan

    2013-01-01

    Background There is evidence that a young child's risk of dying increases following the mother's death, but little is known about the risk when the mother becomes very ill prior to her death. We hypothesized that children would be more likely to die during the period several months before their mother's death, as well as for several months after her death. Therefore we investigated the relationship between young children's likelihood of dying and the timing of their mother's death and, in particular, the existence of a critical period of increased risk. Methods and Findings Data from a health and socio-demographic surveillance system in rural South Africa were collected on children 0–5 y of age from 1 January 1994 to 31 December 2008. Discrete time survival analysis was used to estimate children's probability of dying before and after their mother's death, accounting for moderators. 1,244 children (3% of sample) died from 1994 to 2008. The probability of child death began to rise 6–11 mo prior to the mother's death and increased markedly during the 2 mo immediately before the month of her death (odds ratio [OR] 7.1 [95% CI 3.9–12.7]), in the month of her death (OR 12.6 [6.2–25.3]), and during the 2 mo following her death (OR 7.0 [3.2–15.6]). This increase in the probability of dying was more pronounced for children whose mothers died of AIDS or tuberculosis compared to other causes of death, but the pattern remained for causes unrelated to AIDS/tuberculosis. Infants aged 0–6 mo at the time of their mother's death were nine times more likely to die than children aged 2–5 y. The limitations of the study included the lack of knowledge about precisely when a very ill mother will die, a lack of information about child nutrition and care, and the diagnosis of AIDS deaths by verbal autopsy rather than serostatus. Conclusions Young children in lower income settings are more likely to die not only after their mother's death but also in the months before, when

  17. Laser Raman detection for oral cancer based on an adaptive Gaussian process classification method with posterior probabilities

    NASA Astrophysics Data System (ADS)

    Du, Zhanwei; Yang, Yongjian; Bai, Yuan; Wang, Lijun; Su, Le; Chen, Yong; Li, Xianchang; Zhou, Xiaodong; Jia, Jun; Shen, Aiguo; Hu, Jiming

    2013-03-01

    The existing methods for early and differential diagnosis of oral cancer are limited due to the unapparent early symptoms and the imperfect imaging examination methods. In this paper, classification models of oral adenocarcinoma, carcinoma tissues and a control group, using just four features, are established with the hybrid Gaussian process (HGP) classification algorithm, which introduces mechanisms for noise reduction and posterior probability estimation. HGP shows much better performance in the experimental results. During the experimental process, oral tissues were divided into three groups: adenocarcinoma (n = 87), carcinoma (n = 100) and the control group (n = 134). The spectral data for these groups were collected. The prospective application of the proposed HGP classification method improved the diagnostic sensitivity to 56.35% and the specificity to about 70.00%, and resulted in a Matthews correlation coefficient (MCC) of 0.36. These results indicate that the use of HGP in laser Raman spectroscopy (LRS) analysis for the diagnosis of oral cancer gives accurate results. The prospect of application is also satisfactory.

  18. Probability and Relative Frequency

    NASA Astrophysics Data System (ADS)

    Drieschner, Michael

    2016-01-01

    The concept of probability seems to have been inexplicable since its invention in the seventeenth century. In its use in science, probability is closely related with relative frequency. So the task seems to be interpreting that relation. In this paper, we start with predicted relative frequency and show that its structure is the same as that of probability. I propose to call that the `prediction interpretation' of probability. The consequences of that definition are discussed. The "ladder"-structure of the probability calculus is analyzed. The expectation of the relative frequency is shown to be equal to the predicted relative frequency. Probability is shown to be the most general empirically testable prediction.

  19. Measuring local context as context-word probabilities.

    PubMed

    Hahn, Lance W

    2012-06-01

    Context enables readers to quickly recognize a related word but disturbs recognition of unrelated words. The relatedness of a final word to a sentence context has been estimated as the probability (cloze probability) that a participant will complete a sentence with a word. In four studies, I show that it is possible to estimate local context-word relatedness based on common language usage. Conditional probabilities were calculated for sentences with published cloze probabilities. Four-word contexts produced conditional probabilities significantly correlated with cloze probabilities, but usage statistics were unavailable for some sentence contexts. The present studies demonstrate that a composite context measure based on conditional probabilities for one- to four-word contexts and the presence of a final period represents all of the sentences and maintains significant correlations (.25, .52, .53) with cloze probabilities. Finally, the article provides evidence for the effectiveness of this measure by showing that local context varies in ways that are similar to the N400 effect and that are consistent with a role for local context in reading. The Supplemental materials include local context measures for three cloze probability data sets.
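
    A count-based conditional probability of a final word given its preceding context, of the kind the composite measure is built from, can be sketched on a toy corpus as follows; the corpus and the absence of smoothing are simplifying assumptions.

```python
corpus = ("the dog chased the cat . the dog ate the bone . "
          "the cat chased the mouse .").split()          # toy corpus

def conditional_probability(context, word, tokens):
    """P(word | context) estimated from n-gram counts in `tokens`."""
    n = len(context)
    context_count = sum(1 for i in range(len(tokens) - n)
                        if tuple(tokens[i:i + n]) == tuple(context))
    joint_count = sum(1 for i in range(len(tokens) - n)
                      if tuple(tokens[i:i + n]) == tuple(context)
                      and tokens[i + n] == word)
    return joint_count / context_count if context_count else 0.0

print(conditional_probability(("the",), "dog", corpus))   # 2 of 6 "the" contexts -> 0.33
```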

  20. Monoclonal gammopathy of undetermined significance and risk of infections: a population-based study.

    PubMed

    Kristinsson, Sigurdur Y; Tang, Min; Pfeiffer, Ruth M; Björkholm, Magnus; Goldin, Lynn R; Blimark, Cecilie; Mellqvist, Ulf-Henrik; Wahlin, Anders; Turesson, Ingemar; Landgren, Ola

    2012-06-01

    No comprehensive evaluation has been made to assess the risk of viral and bacterial infections among patients with monoclonal gammopathy of undetermined significance. Using population-based data from Sweden, we estimated risk of infections among 5,326 monoclonal gammopathy of undetermined significance patients compared to 20,161 matched controls. Patients with monoclonal gammopathy of undetermined significance had a 2-fold increased risk (P<0.05) of developing any infection at 5- and 10-year follow up. More specifically, patients with monoclonal gammopathy of undetermined significance had an increased risk (P<0.05) of bacterial (pneumonia, osteomyelitis, septicemia, pyelonephritis, cellulitis, endocarditis, and meningitis), and viral (influenza and herpes zoster) infections. Patients with monoclonal gammopathy of undetermined significance with M-protein concentrations over 2.5 g/dL at diagnosis had highest risks of infections. However, the risk was also increased (P<0.05) among those with concentrations below 0.5 g/dL. Patients with monoclonal gammopathy of undetermined significance who developed infections had no excess risk of developing multiple myeloma, Waldenström macroglobulinemia or related malignancy. Our findings provide novel insights into the mechanisms behind infections in patients with plasma cell dyscrasias, and may have clinical implications.

  1. Teacher and Student Based Instructions on Probability Achievement Outcomes and Attitudes of Secondary School Students in Bungoma North, Kenya

    ERIC Educational Resources Information Center

    Pale, Joseph W.

    2016-01-01

    Teacher-based instruction is the usual method used by most teachers in high school. Traditionally, teachers direct the learning and students work individually, assuming a receptive role in their education. The student-based learning approach is the instructional use of small groups of students working together to accomplish shared goals to increase…

  2. One rhinophore probably provides sufficient sensory input for odour-based navigation by the nudibranch mollusc Tritonia diomedea.

    PubMed

    McCullagh, Gregory B; Bishop, Cory D; Wyeth, Russell C

    2014-12-01

    Tritonia diomedea (synonymous with Tritonia tetraquetra) navigates in turbulent odour plumes, crawling upstream towards prey and downstream to avoid predators. This is probably accomplished by odour-gated rheotaxis, but other possibilities have not been excluded. Our goal was to test whether T. diomedea uses odour-gated rheotaxis and to simultaneously determine which of the cephalic sensory organs (rhinophores and oral veil) are required for navigation. In a first experiment, slugs showed no coherent responses to streams of odour directed at single rhinophores. In a second experiment, navigation in prey and predator odour plumes was compared between animals with unilateral rhinophore lesions, denervated oral veils, or combined unilateral rhinophore lesions and denervated oral veils. In all treatments, animals navigated in a similar manner to that of control and sham-operated animals, indicating that a single rhinophore provides sufficient sensory input for navigation (assuming that a distributed flow measurement system would also be affected by the denervations). Amongst various potential navigational strategies, only odour-gated positive rheotaxis can produce the navigation tracks we observed in prey plumes while receiving input from a single sensor. Thus, we provide strong evidence that T. diomedea uses odour-gated rheotaxis in attractive odour plumes, with odours and flow detected by the rhinophores. In predator plumes, slugs turned downstream to varying degrees rather than orienting directly downstream for crawling, resulting in greater dispersion for negative rheotaxis in aversive plumes. These conclusions are the first explicit confirmation of odour-gated rheotaxis as a navigational strategy in gastropods and are also a foundation for exploring the neural circuits that mediate odour-gated rheotaxis.

  3. Curative Surgical Resection of Adrenocortical Carcinoma: Determining Long-term Outcome Based on Conditional Disease-free Probability

    PubMed Central

    Prescott, Jason D.; Tran, Thuy B.; Postlewait, Lauren M.; Maithel, Shishir K.; Wang, Tracy S.; Glenn, Jason A.; Hatzaras, Ioannis; Shenoy, Rivfka; Phay, John E.; Keplinger, Kara; Fields, Ryan C.; Jin, Linda X.; Weber, Sharon M.; Salem, Ahmed; Sicklick, Jason K.; Gad, Shady; Yopp, Adam C.; Mansour, John C.; Duh, Quan-Yang; Seiser, Natalie; Solorzano, Carmen C.; Kiernan, Colleen M.; Votanopoulos, Konstantinos I.; Levine, Edward A.; Poultsides, George A.; Pawlik, Timothy M.

    2016-01-01

    Objective To evaluate conditional disease-free survival (CDFS) for patients who underwent curative intent surgery for adrenocortical carcinoma (ACC). Background ACC is a rare but aggressive tumor. Survival estimates are usually reported as survival from the time of surgery. CDFS estimates may be more clinically relevant by accounting for the changing likelihood of disease-free survival (DFS) according to time elapsed after surgery. Methods CDFS was assessed using a multi-institutional cohort of patients. Cox proportional hazards models were used to evaluate factors associated with DFS. Three-year CDFS (CDFS3) estimates at “x” year after surgery were calculated as follows: CDFS3=DFS(x+3)/DFS(x). Results One hundred ninety-two patients were included in the study cohort; median patient age was 52 years. On presentation, 36% of patients had a functional tumor and median size was 11.5 cm. Most patients underwent R0 resection (75%) and 9% had N1 disease. Overall 1-, 3-, and 5-year DFS was 59%, 34%, and 22%, respectively. Using CDFS estimates, the probability of remaining disease free for an additional 3 years given that the patient had survived without disease at 1, 3, and 5 years, was 43%, 53%, and 70%, respectively. Patients with less favorable prognosis at baseline demonstrated the greatest increase in CDFS3 over time (eg, capsular invasion: 28%–88%, Δ60% vs no capsular invasion: 51%–87%, Δ36%). Conclusions DFS estimates for patients with ACC improved dramatically over time, in particular among patients with initial worse prognoses. CDFS estimates may provide more clinically relevant information about the changing likelihood of DFS over time. PMID:28009746
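
    The conditional survival formula quoted above, CDFS3(x) = DFS(x + 3)/DFS(x), can be applied directly to any disease-free survival curve; the sketch below uses a hypothetical DFS curve, not the study's actual estimates.

```python
def cdfs3(dfs, x):
    """Three-year conditional DFS given disease-free survival to year x:
    CDFS3(x) = DFS(x + 3) / DFS(x)."""
    return dfs[x + 3] / dfs[x]

# Hypothetical DFS curve by year (year 0 = surgery); not the study's actual curve.
dfs = {0: 1.00, 1: 0.59, 2: 0.44, 3: 0.34, 4: 0.27, 5: 0.22, 6: 0.18, 7: 0.16, 8: 0.15}
for x in (1, 3, 5):
    print(f"CDFS3 at year {x}: {cdfs3(dfs, x):.0%}")
```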

  4. The effects of inquiry-based science instruction training on teachers of students with significant disabilities

    NASA Astrophysics Data System (ADS)

    Courtade, Ginevra Rose

    Federal mandates (A Nation at Risk, 1983 and Project 2061: Science for all Americans, 1985) as well as the National Science Education Standards (NRC, 1996) call for science education for all students. Recent educational laws (IDEA, 1997; NCLB, 2002) require access to and assessment of the general curriculum, including science, for all students with disabilities. Although some research exists on teaching academics to students with significant disabilities, the research on teaching science is especially limited (Browder, Spooner, Ahlgrim-Delzell, Harris, & Wakeman, 2006; Browder, Wakeman, et al., 2006; Courtade, et al., 2006). The purpose of this investigation was to determine if training teachers of students with significant disabilities to teach science concepts using a guided inquiry-based method would change the way science was instructed in the classroom. Further objectives of this study were to determine if training the teachers would increase students' participation and achievement in science. The findings of this study demonstrated a functional relationship between the inquiry-based science instruction training and teacher's ability to instruct students with significant disabilities in science using inquiry-based science instruction. The findings of this study also indicated a functional relationship between the inquiry-based science instruction training and acquisition of student inquiry skills. Also, findings indicated an increase in the number of science content standards being addressed after the teachers received the training. Some students were also able to acquire new science terms after their teachers taught using inquiry-based instruction. Finally, social validity measures indicated a high degree of satisfaction with the intervention and its intended outcomes.

  5. What Are Probability Surveys?

    EPA Pesticide Factsheets

    The National Aquatic Resource Surveys (NARS) use probability-survey designs to assess the condition of the nation’s waters. In probability surveys (also known as sample-surveys or statistical surveys), sampling sites are selected randomly.

  6. Probability-based classifications for spatially characterizing the water temperatures and discharge rates of hot springs in the Tatun Volcanic Region, Taiwan.

    PubMed

    Jang, Cheng-Shin

    2015-05-01

    Accurately classifying the spatial features of the water temperatures and discharge rates of hot springs is crucial for environmental resources use and management. This study spatially characterized classifications of the water temperatures and discharge rates of hot springs in the Tatun Volcanic Region of Northern Taiwan by using indicator kriging (IK). The water temperatures and discharge rates of the springs were first assigned to high, moderate, and low categories according to the two thresholds of the proposed spring classification criteria. IK was then used to model the occurrence probabilities of the water temperatures and discharge rates of the springs and probabilistically determine their categories. Finally, nine combinations were acquired from the probability-based classifications for the spatial features of the water temperatures and discharge rates of the springs. Moreover, various combinations of spring water features were examined according to seven subzones of spring use in the study region. The research results reveal that probability-based classifications using IK provide practicable insights related to propagating the uncertainty of classifications according to the spatial features of the water temperatures and discharge rates of the springs. The springs in the Beitou (BT), Xingyi Road (XYR), Zhongshanlou (ZSL), and Lengshuikeng (LSK) subzones are suitable for supplying tourism hotels with a sufficient quantity of spring water because they have high or moderate discharge rates. Furthermore, natural hot springs in riverbeds and valleys should be developed in the Dingbeitou (DBT), ZSL, Xiayoukeng (XYK), and Macao (MC) subzones because of low discharge rates and low or moderate water temperatures.

  7. New Classification Method Based on Support-Significant Association Rules Algorithm

    NASA Astrophysics Data System (ADS)

    Li, Guoxin; Shi, Wen

    One of the most well-studied problems in data mining is mining for association rules. Research has also introduced association rule mining methods for classification tasks, and such classification methods can be applied to customer segmentation. Currently, most association rule mining methods are based on a support-confidence structure, where rules satisfying both minimum support and minimum confidence are returned to the analyst as strong association rules. However, this type of association rule mining method lacks a rigorous statistical guarantee and can sometimes be misleading. A new classification model for customer segmentation, based on an association rule mining algorithm, is proposed in this paper. The new model is based on the support-significance association rule mining method, in which the confidence measure for an association rule is replaced by the significance of the association rule, a better evaluation standard for association rules. Experiments on customer segmentation data from the UCI repository indicated the effectiveness of the new model.
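
    One simple way to replace confidence with a statistical significance measure, in the spirit of the support-significance structure described above, is to test the antecedent-consequent association with a chi-square test on the 2x2 contingency table; the sketch below is illustrative and not necessarily the paper's exact significance criterion.

```python
import numpy as np
from scipy.stats import chi2_contingency

def rule_support_and_significance(antecedent, consequent):
    """Support of the rule A -> C and a chi-square p-value for the association,
    used here in place of confidence (illustrative, not the paper's exact test)."""
    a = np.asarray(antecedent, dtype=bool)
    c = np.asarray(consequent, dtype=bool)
    support = np.mean(a & c)
    table = np.array([[np.sum(a & c), np.sum(a & ~c)],
                      [np.sum(~a & c), np.sum(~a & ~c)]])
    chi2, p_value, _, _ = chi2_contingency(table)
    return support, p_value

rng = np.random.default_rng(3)
buys_bread = rng.random(500) < 0.4                                   # toy transactions
buys_butter = (buys_bread & (rng.random(500) < 0.7)) | (rng.random(500) < 0.1)
print(rule_support_and_significance(buys_bread, buys_butter))
```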

  8. Dependent Probability Spaces

    ERIC Educational Resources Information Center

    Edwards, William F.; Shiflett, Ray C.; Shultz, Harris

    2008-01-01

    The mathematical model used to describe independence between two events in probability has a non-intuitive consequence called dependent spaces. The paper begins with a very brief history of the development of probability, then defines dependent spaces, and reviews what is known about finite spaces with uniform probability. The study of finite…

  9. Explosion probability of unexploded ordnance: expert beliefs.

    PubMed

    MacDonald, Jacqueline Anne; Small, Mitchell J; Morgan, M G

    2008-08-01

    This article reports on a study to quantify expert beliefs about the explosion probability of unexploded ordnance (UXO). Some 1,976 sites at closed military bases in the United States are contaminated with UXO and are slated for cleanup, at an estimated cost of $15-140 billion. Because no available technology can guarantee 100% removal of UXO, information about explosion probability is needed to assess the residual risks of civilian reuse of closed military bases and to make decisions about how much to invest in cleanup. This study elicited probability distributions for the chance of UXO explosion from 25 experts in explosive ordnance disposal, all of whom have had field experience in UXO identification and deactivation. The study considered six different scenarios: three different types of UXO handled in two different ways (one involving children and the other involving construction workers). We also asked the experts to rank by sensitivity to explosion 20 different kinds of UXO found at a case study site at Fort Ord, California. We found that the experts do not agree about the probability of UXO explosion, with significant differences among experts in their mean estimates of explosion probabilities and in the amount of uncertainty that they express in their estimates. In three of the six scenarios, the divergence was so great that the average of all the expert probability distributions was statistically indistinguishable from a uniform (0, 1) distribution-suggesting that the sum of expert opinion provides no information at all about the explosion risk. The experts' opinions on the relative sensitivity to explosion of the 20 UXO items also diverged. The average correlation between rankings of any pair of experts was 0.41, which, statistically, is barely significant (p= 0.049) at the 95% confidence level. Thus, one expert's rankings provide little predictive information about another's rankings. The lack of consensus among experts suggests that empirical studies

  10. The impacts of problem gambling on concerned significant others accessing web-based counselling.

    PubMed

    Dowling, Nicki A; Rodda, Simone N; Lubman, Dan I; Jackson, Alun C

    2014-08-01

    The 'concerned significant others' (CSOs) of people with problem gambling frequently seek professional support. However, there is surprisingly little research investigating the characteristics or help-seeking behaviour of these CSOs, particularly for web-based counselling. The aims of this study were to describe the characteristics of CSOs accessing the web-based counselling service (real time chat) offered by the Australian national gambling web-based counselling site, explore the most commonly reported CSO impacts using a new brief scale (the Problem Gambling Significant Other Impact Scale: PG-SOIS), and identify the factors associated with different types of CSO impact. The sample comprised all 366 CSOs accessing the service over a 21 month period. The findings revealed that the CSOs were most often the intimate partners of problem gamblers and that they were most often females aged under 30 years. All CSOs displayed a similar profile of impact, with emotional distress (97.5%) and impacts on the relationship (95.9%) reported to be the most commonly endorsed impacts, followed by impacts on social life (92.1%) and finances (91.3%). Impacts on employment (83.6%) and physical health (77.3%) were the least commonly endorsed. There were few significant differences in impacts between family members (children, partners, parents, and siblings), but friends consistently reported the lowest impact scores. Only prior counselling experience and Asian cultural background were consistently associated with higher CSO impacts. The findings can serve to inform the development of web-based interventions specifically designed for the CSOs of problem gamblers.

  11. Dynamical Simulation of Probabilities

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    1996-01-01

    It has been demonstrated that classical probabilities, and in particular a probabilistic Turing machine, can be simulated by combining chaos and non-Lipschitz dynamics, without the use of any man-made devices (such as random number generators). Self-organizing properties of systems coupling simulated and calculated probabilities, and their link to quantum computations, are discussed. Special attention is focused on coupled stochastic processes, defined in terms of conditional probabilities, for which a joint probability does not exist. Simulations of quantum probabilities are also discussed.

  12. Evaluating research for clinical significance: using critically appraised topics to enhance evidence-based neuropsychology.

    PubMed

    Bowden, Stephen C; Harrison, Elise J; Loring, David W

    2014-01-01

    Meehl's (1973, Psychodiagnosis: Selected papers. Minneapolis: University of Minnesota Press) distinction between statistical and clinical significance holds special relevance for evidence-based neuropsychological practice. Meehl argued that despite attaining statistical significance, many published findings have limited practical value since they do not inform clinical care. In the context of an ever-expanding clinical research literature, accessible methods to evaluate clinical impact are needed. The method of Critically Appraised Topics (Straus, Richardson, Glasziou, & Haynes, 2011, Evidence-based medicine: How to practice and teach EBM (4th ed.). Edinburgh: Elsevier Churchill-Livingstone) was developed to provide clinicians with a "toolkit" to facilitate implementation of evidence-based practice. We illustrate the Critically Appraised Topics method using a dementia screening example. We argue that the skills practiced through critical appraisal provide clinicians with methods to: (1) evaluate the clinical relevance of new or unfamiliar research findings with a focus on patient benefit, (2) help focus on research quality, and (3) incorporate evaluation of clinical impact into educational and professional development activities.

  13. Chronic Arsenic Poisoning Probably Caused by Arsenic-Based Pesticides: Findings from an Investigation Study of a Household

    PubMed Central

    Li, Yongfang; Ye, Feng; Wang, Anwei; Wang, Da; Yang, Boyi; Zheng, Quanmei; Sun, Guifan; Gao, Xinghua

    2016-01-01

    In addition to naturally occurring arsenic, man-made arsenic-based compounds are other sources of arsenic exposure. In 2013, our group identified 12 suspected arsenicosis patients in a household (32 living members). Of them, eight members were diagnosed with skin cancer. Interestingly, all of these patients had lived in the household prior to 1989. An investigation revealed that approximately 2 tons of arsenic-based pesticides had been previously placed near a well that had supplied drinking water to the family from 1973 to 1989. The current arsenic level in the well water was 620 μg/L. No other high arsenic wells were found near the family’s residence. Based on these findings, it is possible to infer that the skin lesions exhibited by these family members were caused by long-term exposure to well water contaminated with arsenic-based pesticides. Additionally, biochemical analysis showed that the individuals exposed to arsenic had higher levels of aspartate aminotransferase and γ-glutamyl transpeptidase than those who were not exposed. These findings might indicate the presence of liver dysfunction in the arsenic-exposed individuals. This report elucidates the effects of arsenical compounds on the occurrence of high levels of arsenic in the environment and emphasizes the severe human health impact of arsenic exposure. PMID:26784217

  14. Chronic Arsenic Poisoning Probably Caused by Arsenic-Based Pesticides: Findings from an Investigation Study of a Household.

    PubMed

    Li, Yongfang; Ye, Feng; Wang, Anwei; Wang, Da; Yang, Boyi; Zheng, Quanmei; Sun, Guifan; Gao, Xinghua

    2016-01-16

    In addition to naturally occurring arsenic, man-made arsenic-based compounds are other sources of arsenic exposure. In 2013, our group identified 12 suspected arsenicosis patients in a household (32 living members). Of them, eight members were diagnosed with skin cancer. Interestingly, all of these patients had lived in the household prior to 1989. An investigation revealed that approximately 2 tons of arsenic-based pesticides had been previously placed near a well that had supplied drinking water to the family from 1973 to 1989. The current arsenic level in the well water was 620 μg/L. No other high arsenic wells were found near the family's residence. Based on these findings, it is possible to infer that the skin lesions exhibited by these family members were caused by long-term exposure to well water contaminated with arsenic-based pesticides. Additionally, biochemical analysis showed that the individuals exposed to arsenic had higher levels of aspartate aminotransferase and γ-glutamyl transpeptidase than those who were not exposed. These findings might indicate the presence of liver dysfunction in the arsenic-exposed individuals. This report elucidates the effects of arsenical compounds on the occurrence of high levels of arsenic in the environment and emphasizes the severe human health impact of arsenic exposure.

  15. Development and Use of a Computer-Based Interactive Resource for Teaching and Learning Probability in Primary Classrooms

    ERIC Educational Resources Information Center

    Trigueros, Maria; Lozano, Maria Dolores; Lage, Ana Elisa

    2006-01-01

    "Enciclomedia" is a Mexican project for primary school teaching using computers in the classroom. Within this project, and following an enactivist theoretical perspective and methodology, we have designed a computer-based package called "Dados", which, together with teaching guides, is intended to support the teaching and…

  16. Cluster membership probability: polarimetric approach

    NASA Astrophysics Data System (ADS)

    Medhi, Biman J.; Tamura, Motohide

    2013-04-01

    Interstellar polarimetric data of the six open clusters Hogg 15, NGC 6611, NGC 5606, NGC 6231, NGC 5749 and NGC 6250 have been used to estimate the membership probability for the stars within them. For proper-motion member stars, the membership probability estimated using the polarimetric data is in good agreement with the proper-motion cluster membership probability. However, for proper-motion non-member stars, the membership probability estimated by the polarimetric method is in total disagreement with the proper-motion cluster membership probability. The inconsistencies in the determined memberships may be because of the fundamental differences between the two methods of determination: one is based on stellar proper motion in space and the other is based on selective extinction of the stellar output by the asymmetric aligned dust grains present in the interstellar medium. The results and analysis suggest that the scatter of the Stokes vectors q (per cent) and u (per cent) for the proper-motion member stars depends on the interstellar and intracluster differential reddening in the open cluster. It is found that this method could be used to estimate the cluster membership probability if we have additional polarimetric and photometric information for a star to identify it as a probable member/non-member of a particular cluster, such as the maximum wavelength value (λmax), the unit weight error of the fit (σ1), the dispersion in the polarimetric position angles (ɛ̄), reddening (E(B - V)) or the differential intracluster reddening (ΔE(B - V)). This method could also be used to estimate the membership probability of known member stars having no membership probability as well as to resolve disagreements about membership among different proper-motion surveys.

  17. Probability versus representativeness in infancy: can infants use naïve physics to adjust population base rates in probabilistic inference?

    PubMed

    Denison, Stephanie; Trikutam, Pallavi; Xu, Fei

    2014-08-01

    A rich tradition in developmental psychology explores physical reasoning in infancy. However, no research to date has investigated whether infants can reason about physical objects that behave probabilistically, rather than deterministically. Physical events are often quite variable, in that similar-looking objects can be placed in similar contexts with different outcomes. Can infants rapidly acquire probabilistic physical knowledge, such as that some leaves fall and some glasses break, simply by observing the statistical regularity with which objects behave, and apply that knowledge in subsequent reasoning? We taught 11-month-old infants physical constraints on objects and asked them to reason about the probability of different outcomes when objects were drawn from a large distribution. Infants could have reasoned either by using the perceptual similarity between the samples and larger distributions or by applying physical rules to adjust base rates and estimate the probabilities. Infants learned the physical constraints quickly and used them to estimate probabilities, rather than relying on similarity, a version of the representativeness heuristic. These results indicate that infants can rapidly and flexibly acquire physical knowledge about objects following very brief exposure and apply it in subsequent reasoning.

  18. Probabilistic prediction of hydrologic drought using a conditional probability approach based on the meta-Gaussian model

    NASA Astrophysics Data System (ADS)

    Hao, Zengchao; Hao, Fanghua; Singh, Vijay P.; Sun, Alexander Y.; Xia, Youlong

    2016-11-01

    Prediction of drought plays an important role in drought preparedness and mitigation, especially because of large impacts of drought and increasing demand for water resources. An important aspect for improving drought prediction skills is the identification of drought predictability sources. In general, a drought originates from a precipitation deficit and thus the antecedent meteorological drought may provide predictive information for other types of drought. In this study, a hydrological drought (represented by the Standardized Runoff Index (SRI)) prediction method is proposed based on the meta-Gaussian model, taking into account persistence and the antecedent meteorological drought condition (represented by the Standardized Precipitation Index (SPI)). Considering the inherent nature of standardized drought indices, the meta-Gaussian model arises as a suitable model for constructing the joint distribution of multiple drought indices. Accordingly, the conditional distribution of hydrological drought can be derived analytically, which enables the probabilistic prediction of hydrological drought in the target period and uncertainty quantification. Based on monthly precipitation and surface runoff of climate divisions of Texas, U.S., 1-month and 2-month lead predictions of hydrological drought are illustrated and compared to the prediction from Ensemble Streamflow Prediction (ESP). Results, based on 10 climate divisions in Texas, show that the proposed meta-Gaussian model provides useful drought prediction information with performance depending on regions and seasons.
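
    The core conditional-probability step can be sketched numerically. Under a Gaussian copula with standardized indices, SRI given SPI is normal with mean ρ·SPI and variance 1 − ρ²; the correlation, threshold, and SPI value below are illustrative assumptions, not values from the study, and a full application would also condition on the current SRI to capture persistence.

        # Minimal sketch of the conditional-probability idea behind a meta-Gaussian
        # (Gaussian-copula) drought prediction.  All numbers are assumptions.
        import numpy as np
        from scipy.stats import norm

        rho = 0.6          # assumed lag correlation between SPI now and SRI next month
        spi_now = -1.5     # observed meteorological drought condition
        threshold = -0.8   # hydrological drought threshold on the SRI scale

        cond_mean = rho * spi_now
        cond_std = np.sqrt(1.0 - rho**2)

        p_hydro_drought = norm.cdf(threshold, loc=cond_mean, scale=cond_std)
        print(f"P(SRI < {threshold} | SPI = {spi_now}) = {p_hydro_drought:.2f}")

        # Conditioning on the current SRI as well (persistence) extends the same
        # formula to a multivariate normal conditional distribution.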

  19. Response of the San Andreas fault to the 1983 Coalinga-Nuñez earthquakes: an application of interaction-based probabilities for Parkfield

    USGS Publications Warehouse

    Toda, Shinji; Stein, Ross S.

    2002-01-01

    The Parkfield-Cholame section of the San Andreas fault, site of an unfulfilled earthquake forecast in 1985, is the best monitored section of the world's most closely watched fault. In 1983, the M = 6.5 Coalinga and M = 6.0 Nuñez events struck 25 km northeast of Parkfield. Seismicity rates climbed for 18 months along the creeping section of the San Andreas north of Parkfield and dropped for 6 years along the locked section to the south. Right-lateral creep also slowed or reversed from Parkfield south. Here we calculate that the Coalinga sequence increased the shear and Coulomb stress on the creeping section, causing the rate of small shocks to rise until the added stress was shed by additional slip. However, the 1983 events decreased the shear and Coulomb stress on the Parkfield segment, causing surface creep and seismicity rates to drop. We use these observations to cast the likelihood of a Parkfield earthquake into an interaction-based probability, which includes both the renewal of stress following the 1966 Parkfield earthquake and the stress transfer from the 1983 Coalinga events. We calculate that the 1983 shocks dropped the 10-year probability of a M ∼ 6 Parkfield earthquake by 22% (from 54 ± 22% to 42 ± 23%) and that the probability did not recover until about 1991, when seismicity and creep resumed. Our analysis may thus explain why the Parkfield earthquake did not strike in the 1980s, but not why it was absent in the 1990s. We calculate a 58 ± 17% probability of a M ∼ 6 Parkfield earthquake during 2001–2011.

  20. Fully automatized renal parenchyma volumetry using a support vector machine based recognition system for subject-specific probability map generation in native MR volume data

    NASA Astrophysics Data System (ADS)

    Gloger, Oliver; Tönnies, Klaus; Mensel, Birger; Völzke, Henry

    2015-11-01

    In epidemiological studies as well as in clinical practice the amount of produced medical image data strongly increased in the last decade. In this context organ segmentation in MR volume data gained increasing attention for medical applications. Especially in large-scale population-based studies organ volumetry is highly relevant requiring exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automatized methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modularized framework of a two-stepped probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are refined subsequently by using several, extended segmentation strategies. We present a three class-based support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high quality subject-specific parenchyma probability maps. Several refinement strategies including a final shape-based 3D level set segmentation technique are used in subsequent processing modules to segment renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from parenchymal volume, which is important to analyze renal functions. Volume errors and Dice coefficients show that our presented framework outperforms existing approaches.
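
    As a rough sketch of two ingredients named above, the hedged Python example below computes translation- and scale-invariant Fourier descriptors for synthetic closed contours and feeds them to a support vector machine. The shapes, feature length, and SVM settings are illustrative assumptions, not the published framework, which works on real parenchyma contours within a three-class recognition system.

        # Hedged sketch: Fourier descriptors as shape features for an SVM classifier.
        import numpy as np
        from sklearn.svm import SVC

        def fourier_descriptors(contour_xy, n_coeffs=8):
            """Translation/scale-invariant magnitudes of low-order Fourier coefficients."""
            z = contour_xy[:, 0] + 1j * contour_xy[:, 1]
            coeffs = np.fft.fft(z)
            coeffs[0] = 0.0                      # drop DC term -> translation invariance
            mags = np.abs(coeffs)
            mags = mags / (mags[1] + 1e-12)      # normalise by first harmonic -> scale invariance
            return mags[2:n_coeffs + 2]          # keep the next few harmonics as features

        def synthetic_contour(kind, rng, n=128):
            t = np.linspace(0, 2 * np.pi, n, endpoint=False)
            if kind == 0:      # circle-like
                r = np.ones(n)
            elif kind == 1:    # elongated ellipse-like
                r = 1.0 / np.sqrt(np.cos(t) ** 2 + (np.sin(t) / 2.5) ** 2)
            else:              # lobed shape
                r = 1.0 + 0.3 * np.cos(3 * t)
            r = r * (1.0 + 0.02 * rng.standard_normal(n))   # small shape noise
            return np.column_stack([r * np.cos(t), r * np.sin(t)])

        rng = np.random.default_rng(1)
        X, y = [], []
        for label in (0, 1, 2):
            for _ in range(30):
                X.append(fourier_descriptors(synthetic_contour(label, rng)))
                y.append(label)

        clf = SVC(kernel="rbf", probability=True).fit(np.array(X), np.array(y))
        probe = fourier_descriptors(synthetic_contour(1, rng))
        print("class probabilities:", clf.predict_proba(probe.reshape(1, -1)).round(2))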

  1. Bayesian modeling and inference for diagnostic accuracy and probability of disease based on multiple diagnostic biomarkers with and without a perfect reference standard.

    PubMed

    Jafarzadeh, S Reza; Johnson, Wesley O; Gardner, Ian A

    2016-03-15

    The area under the receiver operating characteristic (ROC) curve (AUC) is used as a performance metric for quantitative tests. Although multiple biomarkers may be available for diagnostic or screening purposes, diagnostic accuracy is often assessed individually rather than in combination. In this paper, we consider the interesting problem of combining multiple biomarkers for use in a single diagnostic criterion with the goal of improving the diagnostic accuracy above that of an individual biomarker. The diagnostic criterion created from multiple biomarkers is based on the predictive probability of disease, conditional on given multiple biomarker outcomes. If the computed predictive probability exceeds a specified cutoff, the corresponding subject is allocated as 'diseased'. This defines a standard diagnostic criterion that has its own ROC curve, namely, the combined ROC (cROC). The AUC metric for cROC, namely, the combined AUC (cAUC), is used to compare the predictive criterion based on multiple biomarkers to one based on fewer biomarkers. A multivariate random-effects model is proposed for modeling multiple normally distributed dependent scores. Bayesian methods for estimating ROC curves and corresponding (marginal) AUCs are developed when a perfect reference standard is not available. In addition, cAUCs are computed to compare the accuracy of different combinations of biomarkers for diagnosis. The methods are evaluated using simulations and are applied to data for Johne's disease (paratuberculosis) in cattle.
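
    A simplified sketch of the combination idea (not the paper's Bayesian multivariate random-effects model, and assuming a perfect reference standard): turn two biomarkers into a single predictive probability of disease and compare the resulting combined AUC (cAUC) with the AUC of one biomarker alone. All data below are simulated.

        # Illustrative sketch: combined predictive probability and cAUC comparison.
        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.metrics import roc_auc_score

        rng = np.random.default_rng(2)
        n = 500
        disease = rng.binomial(1, 0.3, size=n)

        # Two correlated, normally distributed biomarker scores with disease-shifted means.
        b1 = rng.normal(loc=0.8 * disease, scale=1.0)
        b2 = 0.5 * b1 + rng.normal(loc=0.6 * disease, scale=1.0)
        X = np.column_stack([b1, b2])

        auc_single = roc_auc_score(disease, b1)

        # A logistic model stands in for the predictive probability of disease.
        combined_prob = LogisticRegression().fit(X, disease).predict_proba(X)[:, 1]
        auc_combined = roc_auc_score(disease, combined_prob)

        print(f"AUC (biomarker 1 alone): {auc_single:.3f}")
        print(f"cAUC (combined predictive probability): {auc_combined:.3f}")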

  2. Probability and radical behaviorism

    PubMed Central

    Espinosa, James M.

    1992-01-01

    The concept of probability appears to be very important in the radical behaviorism of Skinner. Yet, it seems that this probability has not been accurately defined and is still ambiguous. I give a strict, relative frequency interpretation of probability and its applicability to the data from the science of behavior as supplied by cumulative records. Two examples of stochastic processes are given that may model the data from cumulative records that result under conditions of continuous reinforcement and extinction, respectively. PMID:22478114

  3. The Sensitivity of Adolescent School-Based Hearing Screens Is Significantly Improved by Adding High Frequencies.

    PubMed

    Sekhar, Deepa L; Zalewski, Thomas R; Beiler, Jessica S; Czarnecki, Beth; Barr, Ashley L; King, Tonya S; Paul, Ian M

    2016-12-01

    High frequency hearing loss (HFHL), often related to hazardous noise, affects one in six U.S. adolescents. Yet, only 20 states include school-based hearing screens for adolescents. Only six states test multiple high frequencies. Study objectives were to (1) compare the sensitivity of state school-based hearing screens for adolescents to gold standard sound-treated booth testing and (2) consider the effect of adding multiple high frequencies and two-step screening on sensitivity/specificity. Of 134 eleventh-grade participants (2013-2014), 43 of the 134 (32%) did not pass sound-treated booth testing, and 27 of the 43 (63%) had HFHL. Sensitivity/specificity of the most common protocol (1,000, 2,000, 4,000 Hz at 20 dB HL) for these hearing losses was 25.6% (95% confidence interval [CI] = [13.5, 41.2]) and 85.7% (95% CI [76.8, 92.2]), respectively. A protocol including 500, 1,000, 2,000, 4,000, 6,000 Hz at 20 dB HL significantly improved sensitivity to 76.7% (95% CI [61.4, 88.2]), p < .001. Two-step screening maintained specificity (84.6%, 95% CI [75.5, 91.3]). Adolescent school-based hearing screen sensitivity improves with high frequencies.

  4. Statistics and Probability

    NASA Astrophysics Data System (ADS)

    Laktineh, Imad

    2010-04-01

    This course constitutes a brief introduction to probability applications in high energy physics. First, the mathematical tools related to the different probability concepts are introduced. The probability distributions which are commonly used in high energy physics and their characteristics are then shown and commented on. The central limit theorem and its consequences are analysed. Finally, some numerical methods used to produce different kinds of probability distributions are presented. The full article (17 p.) corresponding to this lecture is written in French and is provided in the proceedings of the book SOS 2008.

  5. Probability of satellite collision

    NASA Technical Reports Server (NTRS)

    Mccarter, J. W.

    1972-01-01

    A method is presented for computing the probability of a collision between a particular artificial earth satellite and any one of the total population of earth satellites. The collision hazard incurred by the proposed modular Space Station is assessed using the technique presented. The results of a parametric study to determine what type of satellite orbits produce the greatest contribution to the total collision probability are presented. Collision probability for the Space Station is given as a function of Space Station altitude and inclination. Collision probability was also parameterized over miss distance and mission duration.

  6. PROBABILITY AND STATISTICS.

    DTIC Science & Technology

    STATISTICAL ANALYSIS, REPORTS), (*PROBABILITY, REPORTS), INFORMATION THEORY, DIFFERENTIAL EQUATIONS, STATISTICAL PROCESSES, STOCHASTIC PROCESSES, MULTIVARIATE ANALYSIS, DISTRIBUTION THEORY, DECISION THEORY, MEASURE THEORY, OPTIMIZATION

  7. Estimating the probability of occurrence of earthquakes (M>6) in the Western part of the Corinth rift using fault-based and classical seismotectonic approaches.

    NASA Astrophysics Data System (ADS)

    Boiselet, Aurelien; Scotti, Oona; Lyon-Caen, Hélène

    2014-05-01

    -SISCOR Working Group. On the basis of this consensual logic tree, median probability of occurrences of M>=6 events were computed for the region of study. Time-dependent models (Brownian Passage time and Weibull probability distributions) were also explored. The probability of a M>=6.0 event is found to be greater in the western region compared to the eastern part of the Corinth rift, whether a fault-based or a classical seismotectonic approach is used. Percentile probability estimates are also provided to represent the range of uncertainties in the results. The percentile results show that, in general, probability estimates following the classical approach (based on the definition of seismotectonic source zones), cover the median values estimated following the fault-based approach. On the contrary, the fault-based approach in this region is still affected by a high degree of uncertainty, because of the poor constraints on the 3D geometries of the faults and the high uncertainties in their slip rates.

  8. Significant performance enhancement in continuous wave terahertz photomixers based on fractal structures

    NASA Astrophysics Data System (ADS)

    Jafari, H.; Heidarzadeh, H.; Rostami, A.; Rostami, G.; Dolatyari, M.

    2017-01-01

    A photoconductive fractal antenna significantly improves the performance of photomixing-based continuous wave (CW) terahertz (THz) systems. An analysis has been carried out for the generation of CW-THz radiation by the photomixer photoconductive antenna technique. To increase the active area for generation, and hence the THz radiation power, we used interdigitated electrodes coupled with a fractal tree antenna. In this paper, both the semiconductor and electromagnetic problems are considered. Photomixer devices with Thue-Morse fractal tree antennas in two configurations (narrow and wide) are discussed. This new approach gives better performance, especially in increasing the THz output power of photomixer devices, compared with conventional structures. In addition, applying the interdigitated electrodes considerably improved the THz photocurrent, producing THz radiation power several times higher than that of photomixers with a simple gap.

  9. A network-based method to assess the statistical significance of mild co-regulation effects.

    PubMed

    Horvát, Emőke-Ágnes; Zhang, Jitao David; Uhlmann, Stefan; Sahin, Özgür; Zweig, Katharina Anna

    2013-01-01

    Recent development of high-throughput, multiplexing technology has initiated projects that systematically investigate interactions between two types of components in biological networks, for instance transcription factors and promoter sequences, or microRNAs (miRNAs) and mRNAs. In terms of network biology, such screening approaches primarily attempt to elucidate relations between biological components of two distinct types, which can be represented as edges between nodes in a bipartite graph. However, it is often desirable not only to determine regulatory relationships between nodes of different types, but also to understand the connection patterns of nodes of the same type. Especially interesting is the co-occurrence of two nodes of the same type, i.e., the number of their common neighbours, which current high-throughput screening analysis fails to address. The co-occurrence gives the number of circumstances under which both of the biological components are influenced in the same way. Here we present SICORE, a novel network-based method to detect pairs of nodes with a statistically significant co-occurrence. We first show the stability of the proposed method on artificial data sets: when randomly adding and deleting observations we obtain reliable results even with noise exceeding the expected level in large-scale experiments. Subsequently, we illustrate the viability of the method based on the analysis of a proteomic screening data set to reveal regulatory patterns of human microRNAs targeting proteins in the EGFR-driven cell cycle signalling system. Since statistically significant co-occurrence may indicate functional synergy and the mechanisms underlying canalization, and thus hold promise in drug target identification and therapeutic development, we provide a platform-independent implementation of SICORE with a graphical user interface as a novel tool in the arsenal of high-throughput screening analysis.
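
    A minimal sketch of the co-occurrence question, under assumptions that differ from SICORE's own null model: count the common neighbours of two same-type nodes in a bipartite graph and attach a p-value with a simple hypergeometric test that conditions on the two nodes' degrees. The miRNA and protein names below are invented.

        # Hedged sketch: common-neighbour count plus a hypergeometric p-value.
        from scipy.stats import hypergeom

        # Hypothetical bipartite data: miRNA -> set of target proteins (labels invented).
        targets = {
            "miR-A": {"EGFR", "ERBB2", "CDK2", "CCNE1", "MYC"},
            "miR-B": {"EGFR", "CDK2", "CCNE1", "TP53"},
        }
        n_proteins = 50          # size of the screened protein panel

        a, b = targets["miR-A"], targets["miR-B"]
        co_occurrence = len(a & b)

        # P(X >= observed) when drawing |b| proteins at random from the panel,
        # of which |a| are "marked" as miR-A targets.
        p_value = hypergeom.sf(co_occurrence - 1, n_proteins, len(a), len(b))
        print(f"common neighbours: {co_occurrence}, hypergeometric p = {p_value:.3g}")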

  10. Visualization of the significance of Receiver Operating Characteristics based on confidence ellipses

    NASA Astrophysics Data System (ADS)

    Sarlis, Nicholas V.; Christopoulos, Stavros-Richard G.

    2014-03-01

    The Receiver Operating Characteristics (ROC) is used for the evaluation of prediction methods in various disciplines like meteorology, geophysics, complex system physics, medicine etc. The estimation of the significance of a binary prediction method, however, remains a cumbersome task and is usually done by repeating the calculations by Monte Carlo. The FORTRAN code provided here simplifies this problem by evaluating the significance of binary predictions for a family of ellipses which are based on confidence ellipses and cover the whole ROC space.
    Catalogue identifier: AERY_v1_0
    Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AERY_v1_0.html
    Program obtainable from: CPC Program Library, Queen’s University, Belfast, N. Ireland
    Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html
    No. of lines in distributed program, including test data, etc.: 11511
    No. of bytes in distributed program, including test data, etc.: 72906
    Distribution format: tar.gz
    Programming language: FORTRAN
    Computer: Any computer supporting a GNU FORTRAN compiler
    Operating system: Linux, MacOS, Windows
    RAM: 1 Mbyte
    Classification: 4.13, 9, 14
    Nature of problem: The Receiver Operating Characteristics (ROC) is used for the evaluation of prediction methods in various disciplines like meteorology, geophysics, complex system physics, medicine etc. The estimation of the significance of a binary prediction method, however, remains a cumbersome task and is usually done by repeating the calculations by Monte Carlo. The FORTRAN code provided here simplifies this problem by evaluating the significance of binary predictions for a family of ellipses which are based on confidence ellipses and cover the whole ROC space.
    Solution method: Using the statistics of random binary predictions for a given value of the predictor threshold ɛt, one can construct the corresponding confidence ellipses. The envelope of these confidence ellipses is estimated when
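
    The question the code addresses can also be approached by brute force. The hedged Python sketch below simulates random binary predictors that flag the same fraction of cases as the method under test and asks how often they match the observed skill; the analytic confidence-ellipse construction of the distributed FORTRAN code is not reproduced, and the counts are invented.

        # Monte Carlo alternative: is an observed (FPR, TPR) point better than chance?
        import numpy as np

        rng = np.random.default_rng(3)
        P, N = 40, 160                     # positives and negatives in the catalogue
        tp_obs, fp_obs = 28, 30            # observed hits and false alarms
        frac_flagged = (tp_obs + fp_obs) / (P + N)

        def random_prediction():
            flags = rng.random(P + N) < frac_flagged
            tp = flags[:P].sum()           # first P cases are the true positives
            fp = flags[P:].sum()
            return tp / P - fp / N         # Youden-type skill of the random predictor

        skill_obs = tp_obs / P - fp_obs / N
        skills = np.array([random_prediction() for _ in range(20000)])
        p_value = (skills >= skill_obs).mean()
        print(f"observed skill = {skill_obs:.2f}, Monte Carlo p = {p_value:.4f}")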

  11. Future challenges for vection research: definitions, functional significance, measures, and neural bases

    PubMed Central

    Palmisano, Stephen; Allison, Robert S.; Schira, Mark M.; Barry, Robert J.

    2015-01-01

    This paper discusses four major challenges facing modern vection research. Challenge 1 (Defining Vection) outlines the different ways that vection has been defined in the literature and discusses their theoretical and experimental ramifications. The term vection is most often used to refer to visual illusions of self-motion induced in stationary observers (by moving, or simulating the motion of, the surrounding environment). However, vection is increasingly being used to also refer to non-visual illusions of self-motion, visually mediated self-motion perceptions, and even general subjective experiences (i.e., “feelings”) of self-motion. The common thread in all of these definitions is the conscious subjective experience of self-motion. Thus, Challenge 2 (Significance of Vection) tackles the crucial issue of whether such conscious experiences actually serve functional roles during self-motion (e.g., in terms of controlling or guiding the self-motion). After more than 100 years of vection research there has been surprisingly little investigation into its functional significance. Challenge 3 (Vection Measures) discusses the difficulties with existing subjective self-report measures of vection (particularly in the context of contemporary research), and proposes several more objective measures of vection based on recent empirical findings. Finally, Challenge 4 (Neural Basis) reviews the recent neuroimaging literature examining the neural basis of vection and discusses the hurdles still facing these investigations. PMID:25774143

  12. Probability and Statistics.

    ERIC Educational Resources Information Center

    Barnes, Bernis, Ed.; And Others

    This teacher's guide to probability and statistics contains three major sections. The first section on elementary combinatorial principles includes activities, student problems, and suggested teaching procedures for the multiplication principle, permutations, and combinations. Section two develops an intuitive approach to probability through…

  13. Teachers' Understandings of Probability

    ERIC Educational Resources Information Center

    Liu, Yan; Thompson, Patrick

    2007-01-01

    Probability is an important idea with a remarkably wide range of applications. However, psychological and instructional studies conducted in the last two decades have consistently documented poor understanding of probability among different populations across different settings. The purpose of this study is to develop a theoretical framework for…

  14. A probability-based sampling approach for the analysis of drug seizures composed of multiple containers of either cocaine, heroin, or Cannabis.

    PubMed

    Mario, John R

    2010-04-15

    A probability-based analytical sampling approach for seized containers of cocaine, Cannabis, or heroin, to answer questions of both content weight and identity, is described. It utilizes the Student's t distribution and, because the studied populations lack normality, the power of the Central Limit Theorem with samples of size 20 to calculate the mean net weights of multiple-item drug seizures. Populations studied ranged between 50 and 1200 units. Identity determination is based on chemical testing and sampling using the hypergeometric distribution, implemented in a program macro created by the European Network of Forensic Science Institutes (ENFSI) Drugs Working Group. Formal random item selection is effected through use of an Excel-generated list of random numbers. Included, because of their impact on actual practice, are discussions of admissibility, sufficiency of proof, method validation, and harmony with the guidelines of international standardizing bodies.
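
    The two calculations described above can be sketched as follows. The weights, population size, target proportion, and confidence level are invented for illustration, and the hypergeometric plan is only in the spirit of the ENFSI macro, not a copy of it.

        # Hedged sketch: t-based mean weight interval and a hypergeometric sampling plan.
        import numpy as np
        from scipy.stats import t, hypergeom

        rng = np.random.default_rng(4)
        weights = rng.normal(loc=1.02, scale=0.08, size=20)   # sampled net weights (kg)
        n = len(weights)
        mean, se = weights.mean(), weights.std(ddof=1) / np.sqrt(n)
        half_width = t.ppf(0.975, df=n - 1) * se
        print(f"mean net weight: {mean:.3f} kg +/- {half_width:.3f} kg (95% CI)")

        # Smallest sample size m such that, if fewer than 90% of the N containers held
        # drugs, the chance that all m sampled items test positive is at most 5%.
        N, target, alpha = 300, 0.90, 0.05
        K = int(np.ceil(target * N)) - 1       # worst case just below the 90% claim
        m = next(m for m in range(1, N + 1) if hypergeom.pmf(m, N, K, m) <= alpha)
        print(f"test {m} of {N} items; if all are positive, >=90% positive with 95% confidence")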

  15. Significance of Bias Correction in Drought Frequency and Scenario Analysis Based on Climate Models

    NASA Astrophysics Data System (ADS)

    Aryal, Y.; Zhu, J.

    2015-12-01

    Assessment of future drought characteristics is difficult as climate models usually have bias in simulating precipitation frequency and intensity. To overcome this limitation, output from climate models needs to be bias corrected based on the specific purpose of application. In this study, we examine the significance of bias correction in the context of drought frequency and scenario analysis using output from climate models. In particular, we investigate the performance of three widely used bias correction techniques: (1) monthly bias correction (MBC), (2) nested bias correction (NBC), and (3) equidistance quantile mapping (EQM). The effect of bias correction on future scenarios of drought frequency is also analyzed. The characteristics of drought are investigated in terms of frequency and severity in nine representative locations in different climatic regions across the United States using regional climate model (RCM) output from the North American Regional Climate Change Assessment Program (NARCCAP). The Standardized Precipitation Index (SPI) is used as the means to compare and forecast drought characteristics at different timescales. Systematic biases in the RCM precipitation output are corrected against the National Centers for Environmental Prediction (NCEP) North American Regional Reanalysis (NARR) data. The results demonstrate that bias correction significantly decreases the RCM errors in reproducing drought frequency derived from the NARR data. Preserving mean and standard deviation is essential for climate models in drought frequency analysis. RCM biases have both regional and timescale dependence. Different timescales of input precipitation in the bias corrections show similar results. Drought frequency obtained from the RCM future (2040-2070) scenarios is compared with that from the historical simulations. The changes in drought characteristics occur in all climatic regions. The relative changes in drought frequency in future scenario in relation to
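
    As an illustration of the general idea behind quantile-based bias correction (a simplified empirical quantile mapping, not necessarily the exact EQM variant evaluated in the study), the sketch below replaces each model precipitation value by the observed value at the same empirical quantile, using synthetic data.

        # Minimal empirical quantile-mapping sketch with synthetic precipitation data.
        import numpy as np

        rng = np.random.default_rng(5)
        obs = rng.gamma(shape=2.0, scale=20.0, size=3000)        # "observed" monthly precip
        model = rng.gamma(shape=1.5, scale=35.0, size=3000)      # biased model precip

        def quantile_map(values, model_ref, obs_ref):
            """Map 'values' through the model CDF and back through the observed quantiles."""
            model_sorted = np.sort(model_ref)
            obs_sorted = np.sort(obs_ref)
            # empirical CDF of each value under the model climatology
            cdf = np.searchsorted(model_sorted, values, side="right") / len(model_sorted)
            cdf = np.clip(cdf, 1e-6, 1 - 1e-6)
            return np.quantile(obs_sorted, cdf)

        corrected = quantile_map(model, model, obs)
        print("mean  obs / model / corrected:",
              round(obs.mean(), 1), round(model.mean(), 1), round(corrected.mean(), 1))
        print("std   obs / model / corrected:",
              round(obs.std(), 1), round(model.std(), 1), round(corrected.std(), 1))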

  16. Probability and amounts of yogurt intake are differently affected by sociodemographic, economic, and lifestyle factors in adults and the elderly-results from a population-based study.

    PubMed

    Possa, Gabriela; de Castro, Michelle Alessandra; Marchioni, Dirce Maria Lobo; Fisberg, Regina Mara; Fisberg, Mauro

    2015-08-01

    The aim of this population-based cross-sectional health survey (N = 532) was to investigate the factors associated with the probability and amounts of yogurt intake in Brazilian adults and the elderly. A structured questionnaire was used to obtain data on demographics, socioeconomic information, presence of morbidities and lifestyle and anthropometric characteristics. Food intake was evaluated using two nonconsecutive 24-hour dietary recalls and a Food Frequency Questionnaire. Approximately 60% of the subjects were classified as yogurt consumers. In the logistic regression model, yogurt intake was associated with smoking (odds ratio [OR], 1.98), female sex (OR, 2.12), and age 20 to 39 years (OR, 3.11). Per capita family income and being a nonsmoker were factors positively associated with the amount of yogurt consumption (coefficients, 0.61 and 3.73, respectively), whereas the level of education of the head of household was inversely associated (coefficient, 0.61). In this study, probability and amounts of yogurt intake are differently affected by demographic, socioeconomic, and lifestyle factors in adults and the elderly.

  17. A low false negative filter for detecting rare bird species from short video segments using a probable observation data set-based EKF method.

    PubMed

    Song, Dezhen; Xu, Yiliang

    2010-09-01

    We report a new filter to assist the search for rare bird species. Since a rare bird only appears in front of a camera with very low occurrence (e.g., less than ten times per year) for very short duration (e.g., less than a fraction of a second), our algorithm must have a very low false negative rate. We verify the bird body axis information with the known bird flying dynamics from the short video segment. Since a regular extended Kalman filter (EKF) cannot converge due to high measurement error and limited data, we develop a novel probable observation data set (PODS)-based EKF method. The new PODS-EKF searches the measurement error range for all probable observation data that ensures the convergence of the corresponding EKF in short time frame. The algorithm has been extensively tested using both simulated inputs and real video data of four representative bird species. In the physical experiments, our algorithm has been tested on rock pigeons and red-tailed hawks with 119 motion sequences. The area under the ROC curve is 95.0%. During the one-year search of ivory-billed woodpeckers, the system reduces the raw video data of 29.41 TB to only 146.7 MB (reduction rate 99.9995%).

  18. Implementation of a web based universal exchange and inference language for medicine: Sparse data, probabilities and inference in data mining of clinical data repositories.

    PubMed

    Robson, Barry; Boray, Srinidhi

    2015-11-01

    We extend Q-UEL, our universal exchange language for interoperability and inference in healthcare and biomedicine, to the more traditional fields of public health surveys. These are the types associated with screening, epidemiological and cross-sectional studies, and cohort studies, in some cases similar to clinical trials. One challenge is that there is some degree of split between (a) frequentist notions of probability as classical measures based only on the idea of counting and proportion, and on classical biostatistics as used in the above conservative disciplines, and (b) more subjectivist notions of uncertainty, belief, reliability, or confidence often used in automated inference and decision support systems. Samples in the above kind of public health survey are typically small compared with our earlier "Big Data" mining efforts. An issue addressed here is how much impact sparse data should have on decisions. We describe a new Q-UEL compatible toolkit including a data analytics application DiracMiner that also delivers more standard biostatistical results, DiracBuilder that uses its output to build Hyperbolic Dirac Nets (HDN) for decision support, and HDNcoherer that ensures that probabilities are mutually consistent. Use is exemplified by participation in a real-world health-screening project, and also by deployment in an industrial platform called the BioIngine, a cognitive computing platform for health management.

  19. Normal probability plots with confidence.

    PubMed

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal-probability-plot-based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods.
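
    The flavour of the augmented plot can be sketched with a conservative construction: each order statistic of a Uniform(0, 1) sample follows a Beta distribution, so Bonferroni-adjusted pointwise intervals transformed through the normal quantile function give a band that holds simultaneously for all points (more conservative than the intervals derived in the paper). The sample below is simulated and assumed to be standardized.

        # Conservative simultaneous band for a normal probability plot (sketch only).
        import numpy as np
        from scipy.stats import beta, norm

        rng = np.random.default_rng(6)
        n = 30
        x = np.sort(rng.normal(size=n))                 # sample to be assessed (standardized)

        alpha = 0.05
        i = np.arange(1, n + 1)
        # U_(i) ~ Beta(i, n - i + 1); Bonferroni-adjusted pointwise level alpha/n
        lo = norm.ppf(beta.ppf(alpha / (2 * n), i, n - i + 1))
        hi = norm.ppf(beta.ppf(1 - alpha / (2 * n), i, n - i + 1))

        theoretical = norm.ppf((i - 0.5) / n)           # usual plotting positions
        inside = np.all((x >= lo) & (x <= hi))
        print("all plotted points inside the simultaneous band:", inside)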

  20. Size effects on the open probability of two-state ion channel system in cell membranes using microcanonical formalism based on gamma function

    NASA Astrophysics Data System (ADS)

    Erdem, Riza; Aydiner, Ekrem

    2016-08-01

    Ion channel systems are a class of proteins that reside in the membranes of all biological cells and form conduction pores that regulate the transport of ions into and out of cells. They can be investigated theoretically in the microcanonical formalism, since the number of accessible states can be easily evaluated by using the Stirling approximation to deal with factorials. In this work, we have used the gamma function (Γ(n)) to solve the two-state or open-close channel model without any approximation. New values are calculated for the open probability (p0), and the relative error between our numerical results and the approximate ones obtained using the Stirling formula is presented. This error, (p0,app − p0)/p0, is significant for small channel systems.
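
    The practical point, that the Stirling approximation is noticeably off for small systems while the gamma function is exact, is easy to verify numerically; the sketch below compares ln(n!) computed via the log-gamma function with the simplest Stirling form for a few arbitrary system sizes.

        # Exact ln(n!) via the gamma function versus the simplest Stirling approximation.
        import math

        def ln_factorial_exact(n):
            return math.lgamma(n + 1)          # ln(n!) from the gamma function

        def ln_factorial_stirling(n):
            return n * math.log(n) - n         # simplest Stirling form

        for n in (5, 10, 50, 1000):
            exact = ln_factorial_exact(n)
            approx = ln_factorial_stirling(n)
            rel_err = (approx - exact) / exact
            print(f"n={n:5d}  ln n! exact={exact:10.3f}  Stirling={approx:10.3f}  "
                  f"relative error={rel_err:+.3%}")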

  1. On probability-possibility transformations

    NASA Technical Reports Server (NTRS)

    Klir, George J.; Parviz, Behzad

    1992-01-01

    Several probability-possibility transformations are compared in terms of the closeness of preserving second-order properties. The comparison is based on experimental results obtained by computer simulation. Two second-order properties are involved in this study: noninteraction of two distributions and projections of a joint distribution.

  2. Guide star probabilities

    NASA Technical Reports Server (NTRS)

    Soneira, R. M.; Bahcall, J. N.

    1981-01-01

    Probabilities are calculated for acquiring suitable guide stars (GS) with the fine guidance system (FGS) of the space telescope. A number of the considerations and techniques described are also relevant for other space astronomy missions. The constraints of the FGS are reviewed. The available data on bright star densities are summarized and a previous error in the literature is corrected. Separate analytic and Monte Carlo calculations of the probabilities are described. A simulation of space telescope pointing is carried out using the Weistrop north galactic pole catalog of bright stars. Sufficient information is presented so that the probabilities of acquisition can be estimated as a function of position in the sky. The probability of acquiring suitable guide stars is greatly increased if the FGS can allow an appreciable difference between the (bright) primary GS limiting magnitude and the (fainter) secondary GS limiting magnitude.

  3. Quantum computing and probability.

    PubMed

    Ferry, David K

    2009-11-25

    Over the past two decades, quantum computing has become a popular and promising approach to trying to solve computationally difficult problems. Missing in many descriptions of quantum computing is just how probability enters into the process. Here, we discuss some simple examples of how uncertainty and probability enter, and how this and the ideas of quantum computing challenge our interpretations of quantum mechanics. It is found that this uncertainty can lead to intrinsic decoherence, and this raises challenges for error correction.

  4. Asteroidal collision probabilities

    NASA Astrophysics Data System (ADS)

    Bottke, W. F.; Greenberg, R.

    1993-05-01

    Several past calculations of collision probabilities between pairs of bodies on independent orbits have yielded inconsistent results. We review the methodologies and identify their various problems. Greenberg's (1982) collision probability formalism (now with a corrected symmetry assumption) is equivalent to Wetherill's (1967) approach, except that it includes a way to avoid singularities near apsides. That method shows that the procedure by Namiki and Binzel (1991) was accurate for those cases where singularities did not arise.

  5. Determining the Probability of Violating Upper-Level Wind Constraints for the Launch of Minuteman III Ballistic Missiles at Vandenberg Air Force Base

    NASA Technical Reports Server (NTRS)

    Shafer, Jaclyn A.; Brock, Tyler M.

    2013-01-01

    The 30th Operational Support Squadron Weather Flight (30 OSSWF) provides comprehensive weather services to the space program at Vandenberg Air Force Base (VAFB) in California. One of their responsibilities is to monitor upper-level winds to ensure safe launch operations of the Minuteman III ballistic missile. The 30 OSSWF requested that the Applied Meteorology Unit (AMU) analyze VAFB sounding data to determine the probability of violating (PoV) upper-level thresholds for wind speed and shear constraints specific to this launch vehicle, and to develop a graphical user interface (GUI) that will calculate the PoV of each constraint on the day of launch. The AMU also suggested including forecast sounding data from the Rapid Refresh (RAP) model. This would provide further insight for the launch weather officers (LWOs) when determining if a wind constraint violation will occur over the next few hours, and help to improve the overall upper winds forecast on launch day.

  6. Probabilities in implicit learning.

    PubMed

    Tseng, Philip; Hsu, Tzu-Yu; Tzeng, Ovid J L; Hung, Daisy L; Juan, Chi-Hung

    2011-01-01

    The visual system possesses a remarkable ability in learning regularities from the environment. In the case of contextual cuing, predictive visual contexts such as spatial configurations are implicitly learned, retained, and used to facilitate visual search, all without one's subjective awareness and conscious effort. Here we investigated whether implicit learning and its facilitatory effects are sensitive to the statistical property of such implicit knowledge. In other words, are highly probable events learned better than less probable ones even when such learning is implicit? We systematically varied the frequencies of context repetition to alter the degrees of learning. Our results showed that search efficiency increased consistently as contextual probabilities increased. Thus, the visual contexts, along with their probability of occurrences, were both picked up by the visual system. Furthermore, even when the total number of exposures was held constant between each probability, the highest probability still enjoyed a greater cuing effect, suggesting that the temporal aspect of implicit learning is also an important factor to consider in addition to the effect of mere frequency. Together, these findings suggest that implicit learning, although bypassing observers' conscious encoding and retrieval effort, behaves much like explicit learning in the sense that its facilitatory effect also varies as a function of its associative strengths.

  7. Launch Collision Probability

    NASA Technical Reports Server (NTRS)

    Bollenbacher, Gary; Guptill, James D.

    1999-01-01

    This report analyzes the probability of a launch vehicle colliding with one of the nearly 10,000 tracked objects orbiting the Earth, given that an object on a near-collision course with the launch vehicle has been identified. Knowledge of the probability of collision throughout the launch window can be used to avoid launching at times when the probability of collision is unacceptably high. The analysis in this report assumes that the positions of the orbiting objects and the launch vehicle can be predicted as a function of time and therefore that any tracked object which comes close to the launch vehicle can be identified. The analysis further assumes that the position uncertainty of the launch vehicle and the approaching space object can be described with position covariance matrices. With these and some additional simplifying assumptions, a closed-form solution is developed using two approaches. The solution shows that the probability of collision is a function of position uncertainties, the size of the two potentially colliding objects, and the nominal separation distance at the point of closest approach. The impact of the simplifying assumptions on the accuracy of the final result is assessed, and the application of the results to the Cassini mission, launched in October 1997, is described. Other factors that affect the probability of collision are also discussed. Finally, the report offers alternative approaches that can be used to evaluate the probability of collision.
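
    A hedged sketch of the standard short-encounter calculation that this kind of analysis builds on (not necessarily the report's exact closed form): integrate the combined position-uncertainty Gaussian, assumed diagonal in the encounter plane, over a disk whose radius is the combined hard-body radius, centred at the nominal miss vector. All numbers are illustrative.

        # Short-encounter collision probability by direct numerical integration.
        import numpy as np
        from scipy.integrate import dblquad

        sigma_x, sigma_y = 120.0, 300.0     # combined position uncertainty (m), encounter plane
        miss_x, miss_y = 200.0, 450.0       # nominal miss vector at closest approach (m)
        hard_body_radius = 15.0             # combined object radius (m)

        def gaussian(x, y):
            return (np.exp(-0.5 * ((x / sigma_x) ** 2 + (y / sigma_y) ** 2))
                    / (2.0 * np.pi * sigma_x * sigma_y))

        # Integrate the density over the disk centred at the miss vector (polar coordinates).
        pc, _ = dblquad(
            lambda r, th: gaussian(miss_x + r * np.cos(th), miss_y + r * np.sin(th)) * r,
            0.0, 2.0 * np.pi,                # theta limits (outer variable)
            0.0, hard_body_radius,           # r limits (inner variable)
        )
        print(f"probability of collision: {pc:.2e}")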

  8. The perception of probability.

    PubMed

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making.

  9. Determining the Probability of Violating Upper-Level Wind Constraints for the Launch of Minuteman III Ballistic Missiles at Vandenberg Air Force Base

    NASA Technical Reports Server (NTRS)

    Shafer, Jaclyn A.; Brock, Tyler M.

    2012-01-01

    The 30th Operational Support Squadron Weather Flight (30 OSSWF) provides comprehensive weather services to the space program at Vandenberg Air Force Base (VAFB) in California. One of their responsibilities is to monitor upper-level winds to ensure safe launch operations of the Minuteman III ballistic missile. The 30 OSSWF tasked the Applied Meteorology Unit (AMU) to analyze VAFB sounding data with the goal of determining the probability of violating (PoV) their upper-level thresholds for wind speed and shear constraints specific to this launch vehicle, and to develop a tool that will calculate the PoV of each constraint on the day of launch. In order to calculate the probability of exceeding each constraint, the AMU collected and analyzed historical data from VAFB. The historical sounding data were retrieved from the National Oceanic and Atmospheric Administration Earth System Research Laboratory archive for the years 1994-2011 and then stratified into four sub-seasons: January-March, April-June, July-September, and October-December. The AMU determined the theoretical distributions that best fit the maximum wind speed and maximum wind shear datasets and applied this information when calculating the averages and standard deviations needed for the historical and real-time PoV calculations. In addition, the AMU included forecast sounding data from the Rapid Refresh model. This information provides further insight for the launch weather officers (LWOs) when determining if a wind constraint violation will occur over the next few hours on the day of launch. The AMU developed an interactive graphical user interface (GUI) in Microsoft Excel using Visual Basic for Applications. The GUI displays the critical sounding data easily and quickly for LWOs on day of launch. This tool will replace the existing one used by the 30 OSSWF, assist the LWOs in determining the probability of exceeding specific wind threshold values, and help to improve the overall upper winds forecast for
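
    The PoV idea can be sketched in a few lines; the distribution family and all numbers below are assumptions for illustration, not the AMU's actual fits: fit a distribution to historical sub-season maximum wind speeds and report the probability of exceeding a hypothetical launch-commit threshold.

        # Illustrative PoV-style calculation with an assumed distribution family.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        # Stand-in for historical sub-season maximum wind speeds (m/s), 1994-2011.
        historical_max_wind = rng.gumbel(loc=38.0, scale=6.0, size=200)

        threshold = 55.0                               # hypothetical wind-speed constraint (m/s)
        loc, scale = stats.gumbel_r.fit(historical_max_wind)
        pov = stats.gumbel_r.sf(threshold, loc=loc, scale=scale)
        print(f"probability of violating the {threshold} m/s constraint: {pov:.1%}")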

  10. The Sensitivity of Adolescent School-Based Hearing Screens Is Significantly Improved by Adding High Frequencies

    ERIC Educational Resources Information Center

    Sekhar, Deepa L.; Zalewski, Thomas R.; Beiler, Jessica S.; Czarnecki, Beth; Barr, Ashley L.; King, Tonya S.; Paul, Ian M.

    2016-01-01

    High frequency hearing loss (HFHL), often related to hazardous noise, affects one in six U.S. adolescents. Yet, only 20 states include school-based hearing screens for adolescents. Only six states test multiple high frequencies. Study objectives were to (1) compare the sensitivity of state school-based hearing screens for adolescents to gold…

  11. Ultimate limits to error probabilities for ionospheric models based on solar geophysical indices and how these compare with the state of the art

    NASA Technical Reports Server (NTRS)

    Nisbet, J. S.; Stehle, C. G.

    1981-01-01

    An ideal model based on a given set of geophysical indices is defined as a model that provides a least squares fit to the data set as a function of the indices considered. Satellite measurements of electron content for three stations at different magnetic latitudes were used to provide such data sets which were each fitted to the geophysical indices. The magnitude of the difference between the measured value and the derived equation for the data set was used to estimate the probability of making an error greater than a given magnitude for such an ideal model. Atmospheric Explorer C data is used to examine the causes of the fluctuations and suggestions are made about how real improvements can be made in ionospheric forecasting ability. Joule heating inputs in the auroral electrojets are related to the AL and AU magnetic indices. Magnetic indices based on the time integral of the energy deposited in the electrojets are proposed for modeling processes affected by auroral zone heating.

  12. Incorporating a Process-Based Land Use Variable into Species-Distribution Modelling and an Estimated Probability of Species Occurrence into a Land Change Model: A Case of Albania

    NASA Astrophysics Data System (ADS)

    Laze, Kuenda

    2016-08-01

    Modelling of land use may be improved by incorporating the results of species distribution modelling, and species distribution modelling may be upgraded if a process-based variable of forest cover change, or the accessibility of forest from human settlements, is included. This work presents the results of spatially explicit analyses of the changes in forest cover from 2000 to 2007 using the method of Geographically Weighted Regression (GWR), and of the species distribution for the protected species Lynx lynx martinoi and Ursus arctos using Generalized Linear Models (GLMs). The methodological approach is to search separately for a parsimonious model for forest cover change and for species distribution over the entire territory of Albania. The findings of this work show that modelling of land change and of species distribution is indeed value-added, as indicated by higher values in the model selection based on the corrected Akaike Information Criterion. These results provide evidence of the effects of process-based variables on species distribution modelling and on the performance of species distribution modelling, as well as giving an example of the incorporation of an estimated probability of species occurrence into land change modelling.

  13. GIS-based estimation of the winter storm damage probability in forests: a case study from Baden-Wuerttemberg (Southwest Germany).

    PubMed

    Schindler, Dirk; Grebhan, Karin; Albrecht, Axel; Schönborn, Jochen; Kohnle, Ulrich

    2012-01-01

    Data on storm damage attributed to the two high-impact winter storms 'Wiebke' (28 February 1990) and 'Lothar' (26 December 1999) were used for GIS-based estimation and mapping (in a 50 × 50 m resolution grid) of the winter storm damage probability (P(DAM)) for the forests of the German federal state of Baden-Wuerttemberg (Southwest Germany). The P(DAM)-calculation was based on weights of evidence (WofE) methodology. A combination of information on forest type, geology, soil type, soil moisture regime, and topographic exposure, as well as maximum gust wind speed field was used to compute P(DAM) across the entire study area. Given the condition that maximum gust wind speed during the two storm events exceeded 35 m s(-1), the highest P(DAM) values computed were primarily where coniferous forest grows in severely exposed areas on temporarily moist soils on bunter sandstone formations. Such areas are found mainly in the mountainous ranges of the northern Black Forest, the eastern Forest of Odes, in the Virngrund area, and in the southwestern Alpine Foothills.
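
    The weights-of-evidence bookkeeping behind such a damage-probability map can be sketched for a single binary evidence layer; the cell counts below are invented, and a real application combines several layers under a conditional-independence assumption.

        # Weights-of-evidence sketch for one binary evidence layer (counts invented).
        import math

        # cells:             damaged   undamaged
        n_evidence_present = (1200,     8800)      # evidence layer present
        n_evidence_absent  = (400,     29600)      # evidence layer absent

        d_pres, u_pres = n_evidence_present
        d_abs, u_abs = n_evidence_absent
        d_tot, u_tot = d_pres + d_abs, u_pres + u_abs

        w_plus = math.log((d_pres / d_tot) / (u_pres / u_tot))    # weight where evidence present
        w_minus = math.log((d_abs / d_tot) / (u_abs / u_tot))     # weight where evidence absent
        contrast = w_plus - w_minus

        prior_logit = math.log(d_tot / u_tot)
        posterior_logit = prior_logit + w_plus        # for a cell where the evidence is present
        p_damage = 1.0 / (1.0 + math.exp(-posterior_logit))
        print(f"W+ = {w_plus:.2f}, W- = {w_minus:.2f}, contrast = {contrast:.2f}")
        print(f"P(damage | evidence present) = {p_damage:.3f}")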

  14. A Unifying Probability Example.

    ERIC Educational Resources Information Center

    Maruszewski, Richard F., Jr.

    2002-01-01

    Presents an example from probability and statistics that ties together several topics including the mean and variance of a discrete random variable, the binomial distribution and its particular mean and variance, the sum of independent random variables, the mean and variance of the sum, and the central limit theorem. Uses Excel to illustrate these…

  15. Varga: On Probability.

    ERIC Educational Resources Information Center

    Varga, Tamas

    This booklet resulted from a 1980 visit by the author, a Hungarian mathematics educator, to the Teachers' Center Project at Southern Illinois University at Edwardsville. Included are activities and problems that make probability concepts accessible to young children. The topics considered are: two probability games; choosing two beads; matching…

  16. Approximating Integrals Using Probability

    ERIC Educational Resources Information Center

    Maruszewski, Richard F., Jr.; Caudle, Kyle A.

    2005-01-01

    This article is part of a discussion on Monte Carlo methods that outlines how to use probability expectations to approximate the value of a definite integral. The purpose of this paper is to elaborate on this technique and then to show several examples using Visual Basic as a programming tool. It is an interesting method because it combines two branches of…
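
    A minimal Python sketch of the expectation idea described in this record (the article itself uses Visual Basic; the function name and the test integrand here are illustrative assumptions):

        import random

        def mc_integral(f, a, b, n=100_000):
            """Approximate the integral of f over [a, b] as (b - a) * E[f(U)]
            for U uniform on [a, b]."""
            total = 0.0
            for _ in range(n):
                total += f(random.uniform(a, b))
            return (b - a) * total / n

        # Example: the integral of x**2 over [0, 1] is 1/3.
        print(mc_integral(lambda x: x * x, 0.0, 1.0))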

  17. No Bridge Too High: Infants Decide Whether to Cross Based on the Probability of Falling not the Severity of the Potential Fall

    ERIC Educational Resources Information Center

    Kretch, Kari S.; Adolph, Karen E.

    2013-01-01

    Do infants, like adults, consider both the probability of falling and the severity of a potential fall when deciding whether to cross a bridge? Crawling and walking infants were encouraged to cross bridges varying in width over a small drop-off, a large drop-off, or no drop-off. Bridge width affects the probability of falling, whereas drop-off…

  18. Children with Significant Hearing Loss: Learning to Listen, Talk, and Read--Evidence-Based Best Practices

    ERIC Educational Resources Information Center

    Martindale, Maura

    2007-01-01

    A considerable body of evidence obtained from studies of children who are deaf and who use cochlear implants has been useful in guiding practices that lead to higher levels of English language proficiency and age-appropriate literacy. Both (a) research conducted at implant centers and (b) educational programs with significant numbers of children…

  19. Automatic Identification and Storage of Significant Points in a Computer-Based Presentation

    ERIC Educational Resources Information Center

    Dickson, Paul; Adrion, W. Richards; Hanson, Allen

    2007-01-01

    We describe an automatic classroom capture system that detects and records significant (stable) points in lectures by sampling and analyzing a sequence of screen capture frames from a PC used for presentations, application demonstrations, etc. The system uses visual inspection techniques to scan the screen capture stream to identify points to…

  20. The significant impact of education, poverty, and race on Internet-based research participant engagement

    PubMed Central

    Hartz, Sarah M.; Quan, Tiffany; Ibiebele, Abiye; Fisher, Sherri L.; Olfson, Emily; Salyer, Patricia; Bierut, Laura J.

    2017-01-01

    Purpose: Internet-based technologies are increasingly being used for research studies. However, it is not known whether Internet-based approaches will effectively engage participants from diverse racial and socioeconomic backgrounds. Methods: A total of 967 participants were recruited and offered genetic ancestry results. We evaluated viewing Internet-based genetic ancestry results among participants who expressed high interest in obtaining the results. Results: Of the participants, 64% stated that they were very or extremely interested in their genetic ancestry results. Among interested participants, individuals with a high school diploma (n = 473) viewed their results 19% of the time relative to 4% of the 145 participants without a diploma (P < 0.0001). Similarly, 22% of participants with household income above the federal poverty level (n = 286) viewed their results relative to 10% of the 314 participants living below the federal poverty level (P < 0.0001). Among interested participants both with a high school degree and living above the poverty level, self-identified Caucasians were more likely to view results than self-identified African Americans (P < 0.0001), and females were more likely to view results than males (P = 0.0007). Conclusion: In an underserved population, engagement in Internet-based research was low despite high reported interest. This suggests that explicit strategies should be developed to increase diversity in Internet-based research. Genet Med 19 2, 240–243. PMID:27467456

  1. An exclusive human milk-based diet in extremely premature infants reduces the probability of remaining on total parenteral nutrition: A reanalysis of the data

    Technology Transfer Automated Retrieval System (TEKTRAN)

    We have previously shown that an exclusively human-milk-based diet is beneficial for extremely premature infants who are at risk for necrotizing enterocolitis (NEC). However, no significant difference in the other primary study endpoint, the length of time on total parenteral nutrition (TPN), was fo...

  2. Empirical estimation of genome-wide significance thresholds based on the 1000 Genomes Project data set

    PubMed Central

    Kanai, Masahiro; Tanaka, Toshihiro; Okada, Yukinori

    2016-01-01

    To assess the statistical significance of associations between variants and traits, genome-wide association studies (GWAS) should employ an appropriate threshold that accounts for the massive burden of multiple testing in the study. Although most studies in the current literature commonly set a genome-wide significance threshold at the level of P=5.0 × 10−8, the adequacy of this value for respective populations has not been fully investigated. To empirically estimate thresholds for different ancestral populations, we conducted GWAS simulations using the 1000 Genomes Phase 3 data set for Africans (AFR), Europeans (EUR), Admixed Americans (AMR), East Asians (EAS) and South Asians (SAS). The estimated empirical genome-wide significance thresholds were Psig=3.24 × 10−8 (AFR), 9.26 × 10−8 (EUR), 1.83 × 10−7 (AMR), 1.61 × 10−7 (EAS) and 9.46 × 10−8 (SAS). We additionally conducted trans-ethnic meta-analyses across all populations (ALL) and all populations except for AFR (ΔAFR), which yielded Psig=3.25 × 10−8 (ALL) and 4.20 × 10−8 (ΔAFR). Our results indicate that the current threshold (P=5.0 × 10−8) is overly stringent for all ancestral populations except for Africans; however, we should employ a more stringent threshold when conducting a meta-analysis, regardless of the presence of African samples. PMID:27305981

  3. Empirical estimation of genome-wide significance thresholds based on the 1000 Genomes Project data set.

    PubMed

    Kanai, Masahiro; Tanaka, Toshihiro; Okada, Yukinori

    2016-10-01

    To assess the statistical significance of associations between variants and traits, genome-wide association studies (GWAS) should employ an appropriate threshold that accounts for the massive burden of multiple testing in the study. Although most studies in the current literature commonly set a genome-wide significance threshold at the level of P=5.0 × 10−8, the adequacy of this value for respective populations has not been fully investigated. To empirically estimate thresholds for different ancestral populations, we conducted GWAS simulations using the 1000 Genomes Phase 3 data set for Africans (AFR), Europeans (EUR), Admixed Americans (AMR), East Asians (EAS) and South Asians (SAS). The estimated empirical genome-wide significance thresholds were Psig=3.24 × 10−8 (AFR), 9.26 × 10−8 (EUR), 1.83 × 10−7 (AMR), 1.61 × 10−7 (EAS) and 9.46 × 10−8 (SAS). We additionally conducted trans-ethnic meta-analyses across all populations (ALL) and all populations except for AFR (ΔAFR), which yielded Psig=3.25 × 10−8 (ALL) and 4.20 × 10−8 (ΔAFR). Our results indicate that the current threshold (P=5.0 × 10−8) is overly stringent for all ancestral populations except for Africans; however, we should employ a more stringent threshold when conducting a meta-analysis, regardless of the presence of African samples.
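
    One hedged way to read the thresholds reported above is as 0.05 divided by an effective number of independent tests per population; the short sketch below only back-computes that effective number and is not the simulation procedure used in the study:

        # Reading each reported threshold as P_sig = 0.05 / M_eff, where M_eff is
        # an effective number of independent tests (an interpretive assumption).
        thresholds = {
            "AFR": 3.24e-8, "EUR": 9.26e-8, "AMR": 1.83e-7,
            "EAS": 1.61e-7, "SAS": 9.46e-8,
        }
        for population, p_sig in thresholds.items():
            m_eff = 0.05 / p_sig
            print(f"{population}: ~{m_eff:,.0f} effective independent tests")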

  4. Identification of Patient Benefit From Proton Therapy for Advanced Head and Neck Cancer Patients Based on Individual and Subgroup Normal Tissue Complication Probability Analysis

    SciTech Connect

    Jakobi, Annika; Bandurska-Luque, Anna; Stützer, Kristin; Haase, Robert; Löck, Steffen; Wack, Linda-Jacqueline; Mönnich, David; Thorwarth, Daniela; and others

    2015-08-01

    Purpose: The purpose of this study was to determine, by treatment plan comparison along with normal tissue complication probability (NTCP) modeling, whether a subpopulation of patients with head and neck squamous cell carcinoma (HNSCC) could be identified that would gain substantial benefit from proton therapy in terms of NTCP. Methods and Materials: For 45 HNSCC patients, intensity modulated radiation therapy (IMRT) was compared to intensity modulated proton therapy (IMPT). Physical dose distributions were evaluated as well as the resulting NTCP values, using modern models for acute mucositis, xerostomia, aspiration, dysphagia, laryngeal edema, and trismus. Patient subgroups were defined based on primary tumor location. Results: Generally, IMPT reduced the NTCP values while keeping similar target coverage for all patients. Subgroup analyses revealed a higher individual reduction of swallowing-related side effects by IMPT for patients with tumors in the upper head and neck area, whereas the risk reduction of acute mucositis was more pronounced in patients with tumors in the larynx region. More patients with tumors in the upper head and neck area had a reduction in NTCP of more than 10%. Conclusions: Subgrouping can help to identify patients who may benefit more than others from the use of IMPT and, thus, can be a useful tool for a preselection of patients in the clinic where there are limited PT resources. Because the individual benefit differs within a subgroup, the relative merits should additionally be evaluated by individual treatment plan comparisons.
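
    The study relies on modern endpoint-specific NTCP models; as a generic illustration only, the sketch below computes an NTCP value from a dose-volume histogram with a Lyman-Kutcher-Burman-type model. The DVH and the parameters TD50, m and n are placeholders, not values from the paper:

        import numpy as np
        from scipy.stats import norm

        def g_eud(doses, volumes, n):
            """Generalized equivalent uniform dose of a DVH (doses in Gy,
            volumes as fractions that sum to 1)."""
            return float((volumes * doses ** (1.0 / n)).sum() ** n)

        def ntcp_lkb(doses, volumes, td50, m, n):
            """Lyman-Kutcher-Burman NTCP = Phi((gEUD - TD50) / (m * TD50))."""
            t = (g_eud(doses, volumes, n) - td50) / (m * td50)
            return float(norm.cdf(t))

        # Placeholder DVH and parameters, for illustration only.
        doses = np.array([10.0, 25.0, 40.0, 55.0])
        volumes = np.array([0.4, 0.3, 0.2, 0.1])
        print(ntcp_lkb(doses, volumes, td50=45.0, m=0.35, n=1.0))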

  5. Cell survival fraction estimation based on the probability densities of domain and cell nucleus specific energies using improved microdosimetric kinetic models.

    PubMed

    Sato, Tatsuhiko; Furusawa, Yoshiya

    2012-10-01

    Estimation of the survival fractions of cells irradiated with various particles over a wide linear energy transfer (LET) range is of great importance in the treatment planning of charged-particle therapy. Two computational models were developed for estimating survival fractions based on the concept of the microdosimetric kinetic model. They were designated as the double-stochastic microdosimetric kinetic and stochastic microdosimetric kinetic models. The former model takes into account the stochastic natures of both domain and cell nucleus specific energies, whereas the latter model represents the stochastic nature of domain specific energy by its approximated mean value and variance to reduce the computational time. The probability densities of the domain and cell nucleus specific energies are the fundamental quantities for expressing survival fractions in these models. These densities are calculated using the microdosimetric and LET-estimator functions implemented in the Particle and Heavy Ion Transport code System (PHITS) in combination with the convolution or database method. Both the double-stochastic microdosimetric kinetic and stochastic microdosimetric kinetic models can reproduce the measured survival fractions for high-LET and high-dose irradiations, whereas a previously proposed microdosimetric kinetic model predicts lower values for these fractions, mainly due to intrinsic ignorance of the stochastic nature of cell nucleus specific energies in the calculation. The models we developed should contribute to a better understanding of the mechanism of cell inactivation, as well as improve the accuracy of treatment planning of charged-particle therapy.
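
    A minimal illustration (not the PHITS-based models themselves) of why the stochastic nature of specific energy matters: averaging a linear-quadratic survival response over an assumed gamma-distributed specific energy gives a larger survival fraction than evaluating the response at the mean specific energy. All parameter values are illustrative assumptions:

        import numpy as np

        rng = np.random.default_rng(0)
        alpha, beta = 0.2, 0.05        # illustrative LQ coefficients (Gy^-1, Gy^-2)
        mean_z, var_z = 4.0, 8.0       # assumed mean and variance of specific energy (Gy)

        # Gamma-distributed specific energy matching the assumed mean and variance.
        shape, scale = mean_z ** 2 / var_z, var_z / mean_z
        z = rng.gamma(shape, scale, size=200_000)

        surv_stochastic = np.exp(-(alpha * z + beta * z ** 2)).mean()
        surv_mean_only = np.exp(-(alpha * mean_z + beta * mean_z ** 2))
        print(surv_stochastic, surv_mean_only)   # the stochastic average is larger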

  6. Validation of three BRCA1/2 mutation-carrier probability models Myriad, BRCAPRO and BOADICEA in a population-based series of 183 German families.

    PubMed

    Schneegans, S M; Rosenberger, A; Engel, U; Sander, M; Emons, G; Shoukier, M

    2012-06-01

    Many studies have evaluated the performance of risk assessment models for BRCA1/2 mutation carrier probabilities in different populations, but to our knowledge very few studies have been conducted in the German population so far. In the present study, we validated the performance of three risk calculation models, namely BRCAPRO, Myriad and BOADICEA, in 183 German families who had undergone molecular testing for mutations in BRCA1 and BRCA2 with an indication based on clinical criteria regarding their family history of cancer. The sensitivity and specificity at the conventional threshold of 10% as well as at a threshold of 20% were evaluated. The ability to discriminate between carriers and non-carriers was judged by the area under the receiver operating characteristic curve. We further focused on the performance characteristics of these models in patients carrying large genomic rearrangements, a subtype of mutations which is currently gaining increasing importance. BRCAPRO and BOADICEA performed almost equally well in our patient population, but we found a lack of agreement with Myriad. The results obtained from this study were consistent with previously published results from other populations and racial/ethnic groups. We suggest using model-specific decision thresholds instead of the recommended universal value of 10%. We further suggest integrating the CaGene5 software package, which includes BRCAPRO and Myriad, into the genetic counselling of German families with suspected inherited breast and ovarian cancer because of the good performance of BRCAPRO and the substantial ease of use of this software.

  7. Significance of periodogram peaks

    NASA Astrophysics Data System (ADS)

    Süveges, Maria; Guy, Leanne; Zucker, Shay

    2016-10-01

    Three versions of significance measures or False Alarm Probabilities (FAPs) for periodogram peaks are presented and compared for sinusoidal and box-like signals, with specific application on large-scale surveys in mind.

  8. Significance of four MRD markers in MRD-based treatment strategy for childhood acute lymphoblastic leukemia.

    PubMed

    Sawada, Akihisa; Sakata, Naoki; Kishimoto, Tomoko; Higuchi, Banryoku; Koyama, Maho; Kondo, Osamu; Sato, Emiko; Okamura, Takayuki; Yasui, Masahiro; Inoue, Masami; Yoshioka, Akira; Kawa, Keisei

    2009-12-01

    Newly diagnosed children with ALL (n=32) were treated on a protocol incorporating minimal residual disease (MRD)-based treatment decisions. MRD was monitored at 4 time points by semi-quantitative PCR detection of antigen receptor gene rearrangement, flow cytometry, quantitative RT-PCR detection of chimeric gene transcripts and overexpressed WT1 mRNA. Four patients positive for MRD at week 5 were treated with an intensified regimen. Median follow-up was 5.0 years (range 3.8-6.6 years) with a 4-year event-free survival rate of 93.8 ± 4.3%. This MRD-based treatment strategy seems to be highly successful and may improve the outcomes of children with ALL. A large study is warranted.

  9. Innovations in individual feature history management - The significance of feature-based temporal model

    USGS Publications Warehouse

    Choi, J.; Seong, J.C.; Kim, B.; Usery, E.L.

    2008-01-01

    A feature relies on three dimensions (space, theme, and time) for its representation. Even though spatiotemporal models have been proposed, they have principally focused on the spatial changes of a feature. In this paper, a feature-based temporal model is proposed to represent the changes of both space and theme independently. The proposed model modifies the ISO's temporal schema and adds a new explicit temporal relationship structure that stores temporal topological relationships along with the ISO's temporal primitives of a feature in order to keep track of feature history. The explicit temporal relationship can enhance query performance on feature history by removing topological comparison during query processing. Further, a prototype system has been developed to test the proposed feature-based temporal model by querying land parcel history in Athens, Georgia. The results of temporal queries on individual feature history show the efficiency of the explicit temporal relationship structure. © Springer Science+Business Media, LLC 2007.

  10. Bayesian Probability Theory

    NASA Astrophysics Data System (ADS)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  11. Superpositions of probability distributions

    NASA Astrophysics Data System (ADS)

    Jizba, Petr; Kleinert, Hagen

    2008-09-01

    Probability distributions which can be obtained from superpositions of Gaussian distributions of different variances v = σ² play a favored role in quantum theory and financial markets. Such superpositions need not necessarily obey the Chapman-Kolmogorov semigroup relation for Markovian processes because they may introduce memory effects. We derive the general form of the smearing distributions in v which do not destroy the semigroup property. The smearing technique has two immediate applications. It permits simplifying the system of Kramers-Moyal equations for smeared and unsmeared conditional probabilities, and can be conveniently implemented in the path integral calculus. In many cases, the superposition of path integrals can be evaluated much more easily than the initial path integral. Three simple examples are presented, and it is shown how the technique is extended to quantum mechanics.
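
    A standard concrete example of such superpositions, offered here only as an illustration: a Gaussian whose variance is drawn from an inverse-gamma distribution yields a heavy-tailed Student-t. The sketch below checks this numerically; the degrees of freedom and sample size are arbitrary choices:

        import numpy as np

        rng = np.random.default_rng(1)
        nu, n = 3.0, 500_000           # target Student-t degrees of freedom, sample size

        # Draw a variance for each sample from an inverse-gamma(nu/2, nu/2) ...
        v = 1.0 / rng.gamma(shape=nu / 2.0, scale=2.0 / nu, size=n)
        # ... then a zero-mean Gaussian with that variance.
        x = rng.normal(0.0, np.sqrt(v))

        # Tail probabilities match those of a direct Student-t sample.
        t = rng.standard_t(nu, size=n)
        print(np.mean(np.abs(x) > 3), np.mean(np.abs(t) > 3))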

  12. Experience matters: information acquisition optimizes probability gain.

    PubMed

    Nelson, Jonathan D; McKenzie, Craig R M; Cottrell, Garrison W; Sejnowski, Terrence J

    2010-07-01

    Deciding which piece of information to acquire or attend to is fundamental to perception, categorization, medical diagnosis, and scientific inference. Four statistical theories of the value of information, namely information gain, Kullback-Leibler distance, probability gain (error minimization), and impact, are equally consistent with extant data on human information acquisition. Three experiments, designed via computer optimization to be maximally informative, tested which of these theories best describes human information search. Experiment 1, which used natural sampling and experience-based learning to convey environmental probabilities, found that probability gain explained subjects' information search better than the other statistical theories or the probability-of-certainty heuristic. Experiments 1 and 2 found that subjects behaved differently when the standard method of verbally presented summary statistics (rather than experience-based learning) was used to convey environmental probabilities. Experiment 3 found that subjects' preference for probability gain is robust, suggesting that the other models contribute little to subjects' search behavior.
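
    A hedged sketch of how probability gain and information gain can be computed for a single binary query, using assumed environmental probabilities rather than the experimentally optimized environments of the study:

        import math

        def entropy(ps):
            return -sum(p * math.log2(p) for p in ps if p > 0)

        def query_value(prior, likelihoods):
            """prior: P(c) over categories; likelihoods: P(answer = yes | c).
            Returns (probability gain, information gain) of asking the query."""
            p_yes = sum(p * l for p, l in zip(prior, likelihoods))
            post_yes = [p * l / p_yes for p, l in zip(prior, likelihoods)]
            post_no = [p * (1 - l) / (1 - p_yes) for p, l in zip(prior, likelihoods)]

            # Probability gain: expected improvement in accuracy of a best guess.
            p_gain = p_yes * max(post_yes) + (1 - p_yes) * max(post_no) - max(prior)
            # Information gain: expected reduction in Shannon entropy.
            i_gain = entropy(prior) - (p_yes * entropy(post_yes)
                                       + (1 - p_yes) * entropy(post_no))
            return p_gain, i_gain

        # Assumed environment: two categories and one binary feature.
        print(query_value(prior=[0.7, 0.3], likelihoods=[0.9, 0.3]))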

  13. On the Metropolis-Hastings acceptance probability to add or drop a quantitative trait locus in Markov chain Monte Carlo-based Bayesian analyses.

    PubMed Central

    Jannink, Jean-Luc; Fernando, Rohan L

    2004-01-01

    The Metropolis-Hastings algorithm used in analyses that estimate the number of QTL segregating in a mapping population requires the calculation of an acceptance probability to add or drop a QTL from the model. Expressions for this acceptance probability need to recognize that sets of QTL are unordered such that the number of equivalent sets increases with the factorial of the QTL number. Here, we show how accounting for this fact affects the acceptance probability and review expressions found in the literature. PMID:15020452

  14. Significant Performance Enhancement in Asymmetric Supercapacitors based on Metal Oxides, Carbon nanotubes and Neutral Aqueous Electrolyte

    PubMed Central

    Singh, Arvinder; Chandra, Amreesh

    2015-01-01

    Amongst the materials being investigated for supercapacitor electrodes, carbon-based materials are the most widely studied. However, pure carbon materials suffer from inherent physical processes which limit the maximum specific energy and power that can be achieved in an energy storage device. Therefore, the use of carbon-based composites with suitable nano-materials is attaining prominence. The synergistic effect between the pseudocapacitive nanomaterials (high specific energy) and carbon (high specific power) is expected to deliver the desired improvements. We report the fabrication of a high-capacitance asymmetric supercapacitor based on electrodes of composites of SnO2 and V2O5 with multiwall carbon nanotubes and a neutral 0.5 M Li2SO4 aqueous electrolyte. The advantages of the fabricated asymmetric supercapacitors are compared with the results published in the literature. The widened operating voltage window is due to the higher over-potential of electrolyte decomposition and a large difference in the work functions of the used metal oxides. The charge-balanced device delivers a specific capacitance of ~198 F g−1 with a corresponding specific energy of ~89 Wh kg−1 at 1 A g−1. The proposed composite systems have shown great potential for fabricating high performance supercapacitors.

  15. Significant Performance Enhancement in Asymmetric Supercapacitors based on Metal Oxides, Carbon nanotubes and Neutral Aqueous Electrolyte.

    PubMed

    Singh, Arvinder; Chandra, Amreesh

    2015-10-23

    Amongst the materials being investigated for supercapacitor electrodes, carbon-based materials are the most widely studied. However, pure carbon materials suffer from inherent physical processes which limit the maximum specific energy and power that can be achieved in an energy storage device. Therefore, the use of carbon-based composites with suitable nano-materials is attaining prominence. The synergistic effect between the pseudocapacitive nanomaterials (high specific energy) and carbon (high specific power) is expected to deliver the desired improvements. We report the fabrication of a high-capacitance asymmetric supercapacitor based on electrodes of composites of SnO2 and V2O5 with multiwall carbon nanotubes and a neutral 0.5 M Li2SO4 aqueous electrolyte. The advantages of the fabricated asymmetric supercapacitors are compared with the results published in the literature. The widened operating voltage window is due to the higher over-potential of electrolyte decomposition and a large difference in the work functions of the used metal oxides. The charge-balanced device delivers a specific capacitance of ~198 F g−1 with a corresponding specific energy of ~89 Wh kg−1 at 1 A g−1. The proposed composite systems have shown great potential for fabricating high performance supercapacitors.
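
    As a quick consistency check of the figures quoted in this record, and assuming the textbook cell-level relation E = ½CV² (an assumption, since the authors' exact calculation is not given here), the reported specific capacitance and specific energy imply an operating window of roughly 1.8 V:

        # Consistency check assuming E = 1/2 * C * V^2 at the cell level.
        # 1 Wh/kg corresponds to 3.6 J/g, so E[Wh/kg] = 0.5 * C[F/g] * V^2 / 3.6.
        C = 198.0   # F/g, as reported
        E = 89.0    # Wh/kg, as reported
        V = (2.0 * E * 3.6 / C) ** 0.5
        print(f"implied operating voltage ~ {V:.2f} V")   # roughly 1.8 V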

  16. Efficient Probability Sequences

    DTIC Science & Technology

    2014-08-18

    Ungar (2014), to produce a distinct forecasting system. The system consists of the method for eliciting individual subjective forecasts together with...E. Stone, and L. H. Ungar (2014). Two reasons to make aggregated probability forecasts more extreme. Decision Analysis 11 (2), 133–145. Bickel, J. E...Letters 91 (3), 425–429. Mellers, B., L. Ungar , J. Baron, J. Ramos, B. Gurcay, K. Fincher, S. E. Scott, D. Moore, P. Atanasov, S. A. Swift, et al. (2014

  17. Regional flood probabilities

    USGS Publications Warehouse

    Troutman, B.M.; Karlinger, M.R.

    2003-01-01

    The T-year annual maximum flood at a site is defined to be the streamflow that has probability 1/T of being exceeded in any given year, and for a group of sites the corresponding regional flood probability (RFP) is the probability that at least one site will experience a T-year flood in any given year. The RFP depends on the number of sites of interest and on the spatial correlation of flows among the sites. We present a Monte Carlo method for obtaining the RFP and demonstrate that spatial correlation estimates used in this method may be obtained with rank-transformed data and therefore that knowledge of the at-site peak flow distribution is not necessary. We examine the extent to which the estimates depend on specification of a parametric form for the spatial correlation function, which is known to be nonstationary for peak flows. It is shown in a simulation study that use of a stationary correlation function to compute RFPs yields satisfactory estimates for certain nonstationary processes. Application of asymptotic extreme value theory is examined, and a methodology for separating channel network and rainfall effects on RFPs is suggested. A case study is presented using peak flow data from the state of Washington. For 193 sites in the Puget Sound region it is estimated that a 100-year flood will occur on average every 4.5 years.
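
    A hedged sketch of the Monte Carlo idea for the RFP, using an equicorrelated Gaussian copula as a deliberately simplified stand-in for the spatial correlation of annual maxima; the site count and return period follow the case study, while the correlation value and sample size are arbitrary assumptions:

        import numpy as np
        from scipy.stats import norm

        rng = np.random.default_rng(2)
        n_sites, T, rho, n_years = 193, 100, 0.3, 50_000   # rho and n_years are arbitrary

        # Equicorrelated Gaussian copula for the annual maxima (a stationary,
        # constant correlation is assumed purely for illustration).
        common = rng.standard_normal((n_years, 1))
        idiosyncratic = rng.standard_normal((n_years, n_sites))
        z = np.sqrt(rho) * common + np.sqrt(1.0 - rho) * idiosyncratic

        # A site experiences a T-year flood when its annual maximum exceeds the
        # (1 - 1/T) quantile, i.e. when z exceeds the corresponding normal quantile.
        threshold = norm.ppf(1.0 - 1.0 / T)
        rfp = np.mean((z > threshold).any(axis=1))
        print(rfp, 1.0 - (1.0 - 1.0 / T) ** n_sites)   # correlated vs independent sites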

  18. Most-probable-number loop-mediated isothermal amplification-based procedure enhanced with K antigen-specific immunomagnetic separation for quantifying tdh(+) Vibrio parahaemolyticus in molluscan Shellfish.

    PubMed

    Tanaka, Natsuko; Iwade, Yoshito; Yamazaki, Wataru; Gondaira, Fumio; Vuddhakul, Varaporn; Nakaguchi, Yoshitsugu; Nishibuchi, Mitsuaki

    2014-07-01

    Although thermostable direct hemolysin-producing (tdh(+)) Vibrio parahaemolyticus is the leading cause of seafood-borne gastroenteritis, the enumeration of tdh(+) V. parahaemolyticus remains challenging due to its low densities in the environment. In this study, we developed a most-probable-number (MPN)-based procedure designated A-IS(1)-LAMP, in which an immunomagnetic separation (IMS) technique targeting as many as 69 established K antigens and a loop-mediated isothermal amplification (LAMP) assay targeting the thermostable direct hemolysin (tdh) gene were applied in an MPN format. Our IMS employed PickPen, an eight-channel intrasolution magnetic particle separation device, which enabled a straightforward microtiter plate-based IMS procedure (designated as PickPen-IMS). The ability of the procedure to quantify a wide range of tdh(+) V. parahaemolyticus levels was evaluated by testing shellfish samples in Japan and southern Thailand, where shellfish products are known to contain relatively low and high levels of total V. parahaemolyticus, respectively. The Japanese and Thai shellfish samples showed, respectively, relatively low (< 3 to 11 MPN/10 g) and considerably higher (930 to 110,000 MPN/10 g) levels of tdh(+) V. parahaemolyticus, raising concern about the safety of Thai shellfish products sold to domestic consumers at local morning markets. LAMP showed similar or higher performance than conventional PCR in the detection and quantification of a wide range of tdh(+) V. parahaemolyticus levels in shellfish products. Whereas a positive effect of PickPen-IMS was not observed in MPN determination, PickPen-IMS was able to concentrate tdh(+) V. parahaemolyticus 32-fold on average from the Japanese shellfish samples at an individual tube level, suggesting a possibility of using PickPen-IMS as an optional tool for specific shellfish samples. The A-IS(1)-LAMP procedure can be used by any health authority in the world to measure the tdh(+) V. parahaemolyticus levels in
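
    For context only, a generic most-probable-number calculation (not the specific A-IS(1)-LAMP protocol): the MPN is the maximum-likelihood concentration given the positive/negative tube counts across dilutions. The dilution series below is hypothetical:

        import numpy as np
        from scipy.optimize import minimize_scalar

        # Hypothetical three-tube dilution series: sample mass per tube (g),
        # tubes inoculated, and tubes positive at each dilution.
        volumes = np.array([1.0, 0.1, 0.01])
        n_tubes = np.array([3, 3, 3])
        positives = np.array([3, 1, 0])

        def neg_log_lik(log_lam):
            lam = np.exp(log_lam)                  # organisms per gram
            p_pos = 1.0 - np.exp(-lam * volumes)   # P(tube positive) at each dilution
            p_pos = np.clip(p_pos, 1e-12, 1 - 1e-12)
            return -np.sum(positives * np.log(p_pos)
                           + (n_tubes - positives) * np.log(1.0 - p_pos))

        result = minimize_scalar(neg_log_lik, bounds=(-5.0, 10.0), method="bounded")
        print(f"MPN ~ {np.exp(result.x):.1f} per gram")   # about 4.3/g for this pattern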

  19. Nanosilver based anionic linear globular dendrimer with a special significant antiretroviral activity.

    PubMed

    Ardestani, Mehdi Shafiee; Fordoei, Alireza Salehi; Abdoli, Asghar; Ahangari Cohan, Reza; Bahramali, Golnaz; Sadat, Seyed Mehdi; Siadat, Seyed Davar; Moloudian, Hamid; Nassiri Koopaei, Nasser; Bolhasani, Azam; Rahimi, Pooneh; Hekmat, Soheila; Davari, Mehdi; Aghasadeghi, Mohammad Reza

    2015-05-01

    HIV commonly causes a very complicated disease for which no vaccine is recognized, so the design and development of novel antiretroviral agents, particularly through nanomedicine, is a subject of global research interest. In the current study, a novel structure of silver complexes with an anionic linear globular dendrimer was synthesized, characterized and then assessed against the HIV replication pathway in vitro. The results showed a very good synthesis yield (up to 70%) for the nano-complex, as well as very potent, significant (P < 0.05) antiretroviral activity with only non-severe toxic effects in comparison with Nevirapine as the standard drug in the positive control group. According to the present data, silver-anionic linear globular dendrimer complexes may have a promising future for inhibiting HIV replication in clinical practice.

  20. A Citation-Based Analysis and Review of Significant Papers on Timing and Time Perception

    PubMed Central

    Teki, Sundeep

    2016-01-01

    Time is an important dimension of brain function, but little is yet known about the underlying cognitive principles and neurobiological mechanisms. The field of timing and time perception has witnessed tremendous growth and multidisciplinary interest in recent years with the advent of modern neuroimaging and neurophysiological approaches. In this article, I used a data mining approach to analyze the timing literature published by a select group of researchers (n = 202) during the period 2000–2015 and highlight important reviews as well as empirical articles that meet the criterion of a minimum of 100 citations. The qualifying articles (n = 150) are listed in a table along with key details such as number of citations, names of authors, year and journal of publication as well as a short summary of the findings of each study. The results of such a data-driven approach to literature review not only serve as a useful resource to any researcher interested in timing, but also provide a means to evaluate key papers that have significantly influenced the field and to summarize recent progress and popular research trends in the field. Additionally, such analyses provide food for thought about future scientific directions and raise important questions about improving organizational structures to boost open science and progress in the field. I discuss exciting avenues for future research that have the potential to significantly advance our understanding of the neurobiology of timing, and propose the establishment of a new society, the Timing Research Forum, to promote open science and collaborative work within the highly diverse and multidisciplinary community of researchers in the field of timing and time perception. PMID:27471445

  1. Prognostic significance of tumor subtypes in male breast cancer: a population-based study.

    PubMed

    Leone, José Pablo; Leone, Julieta; Zwenger, Ariel Osvaldo; Iturbe, Julián; Vallejo, Carlos Teodoro; Leone, Bernardo Amadeo

    2015-08-01

    Substantial controversy exists about the prognostic role of tumor subtypes in male breast cancer (MaBC). The aim of this study was to analyze the characteristics of each tumor subtype in MaBC and its association with prognosis compared with other factors. We evaluated MaBC patients between 2010 and 2012 with known estrogen receptor, progesterone receptor [together hormone receptor (HR)] status, and human epidermal growth factor receptor 2 (HER2) status reported to the Surveillance, Epidemiology, and End Results program. Patients were classified as: HR-positive/HER2-negative, HR-positive/HER2-positive, HR-negative/HER2-positive, and triple-negative (TN). Univariate and multivariate analyses determined the effect of each variable on overall survival (OS). We included 960 patients. Patient distribution was 84.9 % HR-positive/HER2-negative, 11.6 % HR-positive/HER2-positive, 0.6 % HR-negative/HER2-positive, and 2.9 % TN. TN patients were younger, had higher grade, presented with more advanced stage, were more likely to have mastectomy, and to die of breast cancer (all P < 0.05). Univariate analysis showed that HER2 positivity was associated with shorter OS (hazard ratio 1.90, P = 0.031) and TN patients had worse prognosis (hazard ratio 5.10, P = 0.0004). In multivariate analysis, older patients (hazard ratio 3.10, P = 0.032), those with stage IV (hazard ratio 16.27, P < 0.001) and those with TN tumors (hazard ratio 4.61, P = 0.002) had significantly worse OS. We observed significant differences in patient characteristics according to tumor subtype. HER2-positive and TN represented a small proportion of cases. In addition to age and stage, tumor subtype has clear influence on OS in MaBC.

  2. A Citation-Based Analysis and Review of Significant Papers on Timing and Time Perception.

    PubMed

    Teki, Sundeep

    2016-01-01

    Time is an important dimension of brain function, but little is yet known about the underlying cognitive principles and neurobiological mechanisms. The field of timing and time perception has witnessed tremendous growth and multidisciplinary interest in recent years with the advent of modern neuroimaging and neurophysiological approaches. In this article, I used a data mining approach to analyze the timing literature published by a select group of researchers (n = 202) during the period 2000-2015 and highlight important reviews as well as empirical articles that meet the criterion of a minimum of 100 citations. The qualifying articles (n = 150) are listed in a table along with key details such as number of citations, names of authors, year and journal of publication as well as a short summary of the findings of each study. The results of such a data-driven approach to literature review not only serve as a useful resource to any researcher interested in timing, but also provide a means to evaluate key papers that have significantly influenced the field and to summarize recent progress and popular research trends in the field. Additionally, such analyses provide food for thought about future scientific directions and raise important questions about improving organizational structures to boost open science and progress in the field. I discuss exciting avenues for future research that have the potential to significantly advance our understanding of the neurobiology of timing, and propose the establishment of a new society, the Timing Research Forum, to promote open science and collaborative work within the highly diverse and multidisciplinary community of researchers in the field of timing and time perception.

  3. Understanding Y haplotype matching probability.

    PubMed

    Brenner, Charles H

    2014-01-01

    The Y haplotype population-genetic terrain is better explored from a fresh perspective rather than by analogy with the more familiar autosomal ideas. For haplotype matching probabilities, versus for autosomal matching probabilities, explicit attention to modelling - such as how evolution got us where we are - is much more important while consideration of population frequency is much less so. This paper explores, extends, and explains some of the concepts of "Fundamental problem of forensic mathematics - the evidential strength of a rare haplotype match". That earlier paper presented and validated a "kappa method" formula for the evidential strength when a suspect matches a previously unseen haplotype (such as a Y-haplotype) at the crime scene. Mathematical implications of the kappa method are intuitive and reasonable. Suspicions to the contrary that have been raised rest on elementary errors. Critical to deriving the kappa method or any sensible evidential calculation is understanding that thinking about haplotype population frequency is a red herring; the pivotal question is one of matching probability. But confusion between the two is unfortunately institutionalized in much of the forensic world. Examples make clear why (matching) probability is not (population) frequency and why uncertainty intervals on matching probabilities are merely confused thinking. Forensic matching calculations should be based on a model, on stipulated premises. The model inevitably only approximates reality, and any error in the results comes only from error in the model, the inexactness of the approximation. Sampling variation does not measure that inexactness and hence is not helpful in explaining evidence and is in fact an impediment. Alternative haplotype matching probability approaches that various authors have considered are reviewed. Some are based on no model and cannot be taken seriously. For the others, some evaluation of the models is discussed. Recent evidence supports the adequacy of

  4. Control range: a controllability-based index for node significance in directed networks

    NASA Astrophysics Data System (ADS)

    Wang, Bingbo; Gao, Lin; Gao, Yong

    2012-04-01

    While a large number of methods for module detection have been developed for undirected networks, it is difficult to adapt them to handle directed networks due to the lack of consensus criteria for measuring the node significance in a directed network. In this paper, we propose a novel structural index, the control range, motivated by recent studies on the structural controllability of large-scale directed networks. The control range of a node quantifies the size of the subnetwork that the node can effectively control. A related index, called the control range similarity, is also introduced to measure the structural similarity between two nodes. When applying the index of control range to several real-world and synthetic directed networks, it is observed that the control range of the nodes is mainly influenced by the network's degree distribution and that nodes with a low degree may have a high control range. We use the index of control range similarity to detect and analyze functional modules in glossary networks and the enzyme-centric network of homo sapiens. Our results, as compared with other approaches to module detection such as modularity optimization algorithm, dynamic algorithm and clique percolation method, indicate that the proposed indices are effective and practical in depicting structural and modular characteristics of sparse directed networks.

  5. Heavy metal contamination of freshwater prawn (Macrobrachium rosenbergii) and prawn feed in Bangladesh: A market-based study to highlight probable health risks.

    PubMed

    Rabiul Islam, G M; Habib, Mohammad Ruzlan; Waid, Jillian L; Rahman, M Safiur; Kabir, J; Akter, S; Jolly, Y N

    2017-03-01

    An assessment of the dietary risk of heavy metal exposure is important since diet is the main source of human exposure. This study aimed to estimate the degree of contamination and assess the probable health risk in the prawn food chain. In prawn feed, metal concentrations were detected in the following order: Hg > Co > Pb > Cd. The concentrations of heavy metals in prawn were highest for Co and lowest for Cd. Trace amounts of As and Cr were detected in the analyzed samples. Target hazard quotients for heavy metals were >1 for Pb, Cd, Hg, and Co in adults, and were high for Co and Hg in children, indicating significant health risks upon dietary exposure. All the prawn samples contained nine-fold and fourteen-fold higher concentrations than the maximum acceptable levels for Pb and Hg, respectively (0.5 mg kg−1; WHO/FAO). Human health risk due to Co exposure is quite alarming, as the level of exposure was found to be very high. In the prawn samples intended for human consumption, the hazard index (HI) was highest in the samples obtained from Bagerhat (3.25 in flesh and 3.26 in skin), followed by the samples obtained from Satkhira (2.84 in flesh and 3.10 in skin) and Dhaka City Corporation (2.81 in flesh and 3.42 in skin); this indicates a potential risk from consumption of prawn obtained from Southeast Bangladesh. This is particularly problematic as this area accounts for the majority of the country's prawn production and export.
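
    A sketch of how target hazard quotients and the hazard index are commonly computed (a US EPA-style THQ formula is assumed); every concentration, intake rate and reference dose below is a placeholder, not a value from this study:

        # Target hazard quotient, THQ = (EF * ED * FIR * C) / (RfD * BW * AT),
        # and hazard index HI = sum of THQs; all values below are placeholders.
        EF, ED, FIR, BW = 365.0, 70.0, 55.5, 60.0   # days/yr, yr, g/day, kg
        AT = 365.0 * 70.0                           # averaging time, days

        conc = {"Pb": 4.5, "Cd": 0.05, "Hg": 7.0, "Co": 0.9}            # mg/kg, hypothetical
        rfd = {"Pb": 3.5e-3, "Cd": 1.0e-3, "Hg": 1.0e-4, "Co": 3.0e-4}  # mg/kg-bw/day, assumed

        thq = {m: (EF * ED * FIR * conc[m] * 1e-3) / (rfd[m] * BW * AT)
               for m in conc}                       # 1e-3 converts g of prawn to kg
        print(thq, "HI =", round(sum(thq.values()), 2))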

  6. The community-based Health Extension Program significantly improved contraceptive utilization in West Gojjam Zone, Ethiopia

    PubMed Central

    Yitayal, Mezgebu; Berhane, Yemane; Worku, Alemayehu; Kebede, Yigzaw

    2014-01-01

    Background Ethiopia has implemented a nationwide primary health program at grassroots level (known as the Health Extension Program) since 2003 to increase public access to basic health services. This study was conducted to assess whether households that fully implemented the Health Extension Program have improved current contraceptive use. Methods A cross-sectional community-based survey was conducted to collect data from 1,320 mothers using a structured questionnaire. A multivariate logistic regression was used to identify the predictors of current contraceptive utilization. A propensity score analysis was used to determine the contribution of the Health Extension Program “model households” on current contraceptive utilization. Result Mothers from households which fully benefited from the Health Extension Program (“model households”) were 3.97 (adjusted odds ratio, 3.97; 95% confidence interval, 3.01–5.23) times more likely to use contraceptives compared with mothers from non-model households. Model household status contributed to 29.3% (t=7.08) of the increase in current contraceptive utilization. Conclusion The Health Extension Program when implemented fully could help to increase the utilization of contraceptives in the rural community and improve family planning. PMID:24868165

  7. Retrieve Tether Survival Probability

    DTIC Science & Technology

    2007-11-02

    cuts of the tether by meteorites and orbital debris , is calculated to be 99.934% for the planned experiment duration of six months or less. This is...due to the unlikely event of a strike by a large piece of orbital debris greater than 1 meter in size cutting all the lines of the tether at once. The...probability of the tether surviving multiple cuts by meteoroid and orbital debris impactors smaller than 5 cm in diameter is 99.9993% at six months

  8. An algorithm for finding biologically significant features in microarray data based on a priori manifold learning.

    PubMed

    Hira, Zena M; Trigeorgis, George; Gillies, Duncan F

    2014-01-01

    Microarray databases are a large source of genetic data, which, upon proper analysis, could enhance our understanding of biology and medicine. Many microarray experiments have been designed to investigate the genetic mechanisms of cancer, and analytical approaches have been applied in order to classify different types of cancer or distinguish between cancerous and non-cancerous tissue. However, microarrays are high-dimensional datasets with high levels of noise and this causes problems when using machine learning methods. A popular approach to this problem is to search for a set of features that will simplify the structure and to some degree remove the noise from the data. The most widely used approach to feature extraction is principal component analysis (PCA), which assumes a multivariate Gaussian model of the data. More recently, non-linear methods have been investigated. Among these, manifold learning algorithms, for example Isomap, aim to project the data from a higher-dimensional space onto a lower-dimensional one. We have proposed a priori manifold learning for finding a manifold in which a representative set of microarray data is fused with relevant data taken from the KEGG pathway database. Once the manifold has been constructed, the raw microarray data are projected onto it and clustering and classification can take place. In contrast to earlier fusion-based methods, the prior knowledge from the KEGG database is not used in, and does not bias, the classification process; it merely acts as an aid to find the best space in which to search the data. In our experiments we have found that using our new manifold method gives better classification results than using either PCA or conventional Isomap.
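
    For orientation only, a basic Isomap-plus-clustering sketch in Python (standard Isomap, not the a priori fusion method proposed in the paper), on synthetic data standing in for microarray expression values:

        import numpy as np
        from sklearn.manifold import Isomap
        from sklearn.cluster import KMeans

        # Synthetic stand-in for microarray data: 60 samples x 500 genes.
        rng = np.random.default_rng(3)
        X = np.vstack([rng.normal(0.0, 1.0, (30, 500)),
                       rng.normal(0.8, 1.0, (30, 500))])

        # Project onto a low-dimensional manifold, then cluster in that space.
        embedding = Isomap(n_neighbors=10, n_components=3).fit_transform(X)
        labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(embedding)
        print(labels)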

  9. Robustness of CDK2 in Triggering Cellular Senescence based on Probability of DNA-damaged Cells Passing G1/S Checkpoint

    NASA Astrophysics Data System (ADS)

    Ling, Hong; Samarasinghe, Sandhya; Kulasiri, Don

    2011-06-01

    Recent experiments have shown that cellular senescence, a mechanism employed by cells for thwarting cell proliferation, plays an important role in protecting cells against cancer; therefore, a deeper understanding of cellular senescence can lead to effective cancer treatment. Inhibition of CDK2 is thought to be the critical trigger for cellular senescence. In this study, we first implement a mathematical model of G1/S transition involving the DNA-damage pathway and show that cellular senescence can be achieved by lowering CDK2. The robustness of CDK2 in triggering cellular senescence is determined from the probability (β) of DNA-damaged cells passing G1/S checkpoint for normal CDK2 and CDK2-deficient situations based on different thresholds of the peak time of two important biomarkers, CycE and E2F. The comparison of the values of β under the normal CDK2 and lower CDK2 levels reveals that reducing CDK2 levels can decrease the percentage of damaged cells passing G1/S checkpoint; more importantly, 50% reduction of CDK2 achieves 65% reduction in the percentage of damaged cells passing the G1/S checkpoint. These results point out that the developed model can highlight the possibility of lowering the bar for cellular senescence by reducing CDK2 levels. The results of investigation of β for the different thresholds of the peak times of other biomarkers show that β is insensitive to these perturbations of the peak time indicating that CDK2 activity is robust in lowering the senescence bar for low and high levels of DNA-damage. Furthermore, a mathematical formulation of robustness indicates that the robustness of CDK2-triggered senescence increases with decreasing levels of CDK2, and is slightly greater for low-level DNA damage condition.

  10. WPE: A Mathematical Microworld for Learning Probability

    ERIC Educational Resources Information Center

    Kiew, Su Ding; Sam, Hong Kian

    2006-01-01

    In this study, the researchers developed the Web-based Probability Explorer (WPE), a mathematical microworld and investigated the effectiveness of the microworld's constructivist learning environment in enhancing the learning of probability and improving students' attitudes toward mathematics. This study also determined the students' satisfaction…

  11. Laboratory-Tutorial Activities for Teaching Probability

    ERIC Educational Resources Information Center

    Wittmann, Michael C.; Morgan, Jeffrey T.; Feeley, Roger E.

    2006-01-01

    We report on the development of students' ideas of probability and probability density in a University of Maine laboratory-based general education physics course called "Intuitive Quantum Physics". Students in the course are generally math phobic with unfavorable expectations about the nature of physics and their ability to do it. We…

  12. Microarray Based Gene Expression Analysis of Murine Brown and Subcutaneous Adipose Tissue: Significance with Human

    PubMed Central

    Boparai, Ravneet K.; Kondepudi, Kanthi Kiran; Mantri, Shrikant; Bishnoi, Mahendra

    2015-01-01

    Background Two types of adipose tissues, white (WAT) and brown (BAT), are found in mammals. Increasingly novel strategies are being proposed for the treatment of obesity and its associated complications by altering the amount and/or activity of BAT using mouse models. Methodology/Principal Findings The present study was designed to: (a) investigate the differential expression of genes in LACA mice subcutaneous WAT (sWAT) and BAT using mouse DNA microarray, (b) compare mouse differential gene expression with previously published human data, to understand any inter-species differences between the two, and (c) make a comparative assessment with the C57BL/6 mouse strain. In mouse microarray studies, over 7003, 1176 and 401 probe sets showed more than two-fold, five-fold and ten-fold changes, respectively, in differential expression between murine BAT and WAT. Microarray data were validated using quantitative RT-PCR of key genes showing high expression in BAT (Fabp3, Ucp1, Slc27a1) and sWAT (Ms4a1, H2-Ob, Bank1) or showing relatively low expression in BAT (Pgk1, Cox6b1) and sWAT (Slc20a1, Cd74). Multi-omic pathway analysis was employed to understand possible links between the organisms. When murine two-fold data were compared with published human BAT and sWAT data, 90 genes showed parallel differential expression in both mouse and human. Out of these 90 genes, 46 showed the same pattern of differential expression whereas the pattern was opposite for the remaining 44 genes. Based on our microarray results and their comparison with human data, we were able to identify genes (targets) (a) which can be studied in mouse model systems to extrapolate results to human and (b) where caution should be exercised before extrapolation of murine data to human. Conclusion Our study provides evidence for inter-species (mouse vs human) differences in differential gene expression between sWAT and BAT. Critical understanding of these data may help in development of novel ways to engineer one form of adipose

  13. Significant disparity in base and sugar damage in DNA resulting from neutron and electron irradiation

    PubMed Central

    Pang, Dalong; Nico, Jeffrey S.; Karam, Lisa; Timofeeva, Olga; Blakely, William F.; Dritschilo, Anatoly; Dizdaroglu, Miral; Jaruga, Pawel

    2014-01-01

    In this study, a comparison of the effects of neutron and electron irradiation of aqueous DNA solutions was investigated to characterize potential neutron signatures in DNA damage induction. Ionizing radiation generates numerous lesions in DNA, including base and sugar lesions, lesions involving base–sugar combinations (e.g. 8,5′-cyclopurine-2′-deoxynucleosides) and DNA–protein cross-links, as well as single- and double-strand breaks and clustered damage. The characteristics of damage depend on the linear energy transfer (LET) of the incident radiation. Here we investigated DNA damage using aqueous DNA solutions in 10 mmol/l phosphate buffer from 0–80 Gy by low-LET electrons (10 Gy/min) and the specific high-LET (∼0.16 Gy/h) neutrons formed by spontaneous 252Cf decay fissions. 8-hydroxy-2′-deoxyguanosine (8-OH-dG), (5′R)-8,5′-cyclo-2′-deoxyadenosine (R-cdA) and (5′S)-8,5′-cyclo-2′-deoxyadenosine (S-cdA) were quantified using liquid chromatography–isotope-dilution tandem mass spectrometry to demonstrate a linear dose dependence for induction of 8-OH-dG by both types of radiation, although neutron irradiation was ∼50% less effective at a given dose compared with electron irradiation. Electron irradiation resulted in an exponential increase in S-cdA and R-cdA with dose, whereas neutron irradiation induced substantially less damage and the amount of damage increased only gradually with dose. Addition of 30 mmol/l 2-amino-2-(hydroxymethyl)-1,3-propanediol (TRIS), a free radical scavenger, to the DNA solution before irradiation reduced lesion induction to background levels for both types of radiation. These results provide insight into the mechanisms of DNA damage by high-LET 252Cf decay neutrons and low-LET electrons, leading to enhanced understanding of the potential biological effects of these types of irradiation. PMID:25034731

  14. Error probability performance of unbalanced QPSK receivers

    NASA Technical Reports Server (NTRS)

    Simon, M. K.

    1978-01-01

    A simple technique for calculating the error probability performance and associated noisy reference loss of practical unbalanced QPSK receivers is presented. The approach is based on expanding the error probability conditioned on the loop phase error in a power series in the loop phase error and then, keeping only the first few terms of this series, averaging this conditional error probability over the probability density function of the loop phase error. Doing so results in an expression for the average error probability which is in the form of a leading term representing the ideal (perfect synchronization references) performance plus a term proportional to the mean-squared crosstalk. Thus, the additional error probability due to noisy synchronization references occurs as an additive term proportional to the mean-squared phase jitter directly associated with the receiver's tracking loop. Similar arguments are advanced to give closed-form results for the noisy reference loss itself.
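
    A simplified numerical illustration of the averaging step described above, reduced to the BPSK case without crosstalk (so it is not the full unbalanced-QPSK analysis): the conditional error probability Q(sqrt(2*gamma)*cos(phi)) is averaged over a Tikhonov phase-error density and compared with the perfect-reference result:

        import numpy as np
        from scipy.integrate import quad
        from scipy.special import erfc, i0

        def q_func(x):
            return 0.5 * erfc(x / np.sqrt(2.0))

        def avg_error_prob(gamma, rho):
            """Average P(E | phi) = Q(sqrt(2*gamma)*cos(phi)) over a Tikhonov
            phase-error density with loop SNR rho (simplified BPSK case)."""
            def integrand(phi):
                p_phi = np.exp(rho * np.cos(phi)) / (2.0 * np.pi * i0(rho))
                return q_func(np.sqrt(2.0 * gamma) * np.cos(phi)) * p_phi
            value, _ = quad(integrand, -np.pi, np.pi)
            return value

        gamma = 10.0 ** (7.0 / 10.0)   # Eb/N0 of 7 dB, an arbitrary operating point
        print(avg_error_prob(gamma, rho=20.0), q_func(np.sqrt(2.0 * gamma)))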

  15. Probability and Quantum Paradigms: the Interplay

    SciTech Connect

    Kracklauer, A. F.

    2007-12-03

    Since the introduction of Born's interpretation of quantum wave functions as yielding the probability density of presence, Quantum Theory and Probability have lived in a troubled symbiosis. Problems arise with this interpretation because quantum probabilities exhibit features alien to usual probabilities, namely non Boolean structure and non positive-definite phase space probability densities. This has inspired research into both elaborate formulations of Probability Theory and alternate interpretations for wave functions. Herein the latter tactic is taken and a suggested variant interpretation of wave functions based on photo detection physics proposed, and some empirical consequences are considered. Although incomplete in a few details, this variant is appealing in its reliance on well tested concepts and technology.

  16. Probability of brittle failure

    NASA Technical Reports Server (NTRS)

    Kim, A.; Bosnyak, C. P.; Chudnovsky, A.

    1991-01-01

    A methodology was developed for collecting statistically representative data for crack initiation and arrest from a small number of test specimens. An epoxy (based on bisphenol A diglycidyl ether and polyglycol extended diglycyl ether and cured with diethylene triamine) is selected as a model material. A compact tension specimen with displacement controlled loading is used to observe multiple crack initiation and arrests. The energy release rate at crack initiation is significantly higher than that at a crack arrest, as has been observed elsewhere. The difference between these energy release rates is found to depend on specimen size (scale effect), and is quantitatively related to the fracture surface morphology. The scale effect, similar to that in statistical strength theory, is usually attributed to the statistics of defects which control the fracture process. Triangular-shaped ripples (deltoids) are formed on the fracture surface during the slow subcritical crack growth, prior to the smooth mirror-like surface characteristic of fast cracks. The deltoids are complementary on the two crack faces, which excludes any inelastic deformation from consideration. Presence of defects is also suggested by the observed scale effect. However, there are no defects at the deltoid apexes detectable down to the 0.1 micron level.
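
    A minimal weakest-link (Weibull) sketch of the statistical scale effect mentioned above, with placeholder material parameters; it only illustrates that, for a defect-controlled process, the failure probability at a given stress grows with specimen volume:

        import numpy as np

        def weibull_failure_prob(stress, volume, sigma0=50.0, v0=1.0, m=10.0):
            """Weakest-link failure probability 1 - exp(-(V/V0) * (sigma/sigma0)^m);
            sigma0, v0 and m are placeholder material parameters."""
            return 1.0 - np.exp(-(volume / v0) * (stress / sigma0) ** m)

        for volume in (1.0, 10.0, 100.0):
            print(volume, weibull_failure_prob(stress=40.0, volume=volume))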

  17. Earthquake probabilities: theoretical assessments and reality

    NASA Astrophysics Data System (ADS)

    Kossobokov, V. G.

    2013-12-01

    It is common knowledge that earthquakes are complex phenomena whose classification and sizing remain serious problems of contemporary seismology. In general, their frequency-magnitude distributions exhibit power law scaling. This scaling differs significantly when different time and/or space domains are considered. At the scale of a particular earthquake rupture zone the frequency of similar size events is usually estimated to be about once in several hundred years. Evidently, contemporary seismology does not possess enough reported instrumental data for any reliable quantification of an earthquake probability at a given place of expected event. Regrettably, most of the state-of-the-art theoretical approaches to assessing the probability of seismic events are based on trivial (e.g. Poisson, periodic, etc) or, conversely, delicately-designed (e.g. STEP, ETAS, etc) models of earthquake sequences. Some of these models are evidently erroneous, some can be rejected by the existing statistics, and some are hardly testable in our life-time. Nevertheless, such probabilistic counts, including seismic hazard assessment and earthquake forecasting, when used in practice eventually mislead decision makers with scientifically groundless advice and lead to inappropriate decisions. As a result, the population of seismic regions continues facing unexpected risk and losses. The international project Global Earthquake Model (GEM) is on the wrong track if it continues to base seismic risk estimates on the standard, mainly probabilistic, methodology to assess seismic hazard. It is generally accepted that earthquakes are infrequent, low-probability events. However, they keep occurring at earthquake-prone areas with 100% certainty. Given the expectation of a seismic event once per hundred years, the daily probability of occurrence on a certain date may range from 0 to 100% depending on a choice of probability space (which is yet unknown and, therefore, made by a subjective lucky chance

  18. Probabilities of future VEI ≥ 2 eruptions at the Central American Volcanic Arc: a statistical perspective based on the past centuries' eruption record

    NASA Astrophysics Data System (ADS)

    Dzierma, Yvonne; Wehrmann, Heidi

    2014-10-01

    A probabilistic eruption forecast is provided for seven historically active volcanoes along the Central American Volcanic Arc (CAVA), as a pivotal empirical contribution to multi-disciplinary volcanic hazards assessment. The eruption probabilities are determined with a Kaplan-Meier estimator of survival functions, and parametric time series models are applied to describe the historical eruption records. Aside from the volcanoes that are currently in a state of eruptive activity (Santa María, Fuego, and Arenal), the highest probabilities for eruptions of VEI ≥ 2 occur at Concepción and Cerro Negro in Nicaragua, which have a 70-85 % probability of erupting within the next 10 years. Poás and Irazú in Costa Rica show a medium to high eruption probability, followed by San Miguel (El Salvador), Rincón de la Vieja (Costa Rica), and Izalco (El Salvador; 24 % within the next 10 years).
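
    A minimal sketch of the survival-function idea described above, assuming hypothetical repose intervals (years between successive VEI >= 2 eruptions) rather than the actual CAVA record: a hand-rolled Kaplan-Meier estimate of the survival function S(t) gives the probability of at least one eruption within 10 years as 1 - S(10).

        import numpy as np

        def kaplan_meier(durations, observed):
            """Kaplan-Meier estimate of the survival function S(t).

            durations: repose intervals in years (time to the next eruption, or time
                       since the last eruption for the still-open, censored interval).
            observed:  1 if the interval ended in an eruption, 0 if censored.
            """
            order = np.argsort(durations)
            durations = np.asarray(durations, dtype=float)[order]
            observed = np.asarray(observed, dtype=int)[order]
            at_risk, s = len(durations), 1.0
            times, surv = [], []
            for t, d in zip(durations, observed):
                if d == 1:                      # an eruption ended this interval
                    s *= (at_risk - 1) / at_risk
                    times.append(t)
                    surv.append(s)
                at_risk -= 1                    # censored or not, one fewer at risk
            return np.array(times), np.array(surv)

        # Hypothetical repose intervals (years); the last interval is still open (censored).
        repose = [3, 7, 12, 2, 9, 15, 4, 6]
        event  = [1, 1, 1, 1, 1, 1, 1, 0]

        times, surv = kaplan_meier(repose, event)
        s10 = surv[times <= 10][-1] if np.any(times <= 10) else 1.0
        print(f"Estimated probability of a VEI >= 2 eruption within 10 years: {1 - s10:.2f}")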

  19. A Manual for Encoding Probability Distributions.

    DTIC Science & Technology

    1978-09-01

    summary of the most significant information contained in the report. If the report contains a significant bibliography or literature survey, mention it...probability distribution. Some terms in the literature that are used synonymously to Encoding: Assessment, Assignment (used for single events in this...sessions conducted as parts of practical decision analyses as well as on experimental evidence in the literature. Probability encoding can be applied

  20. Alternative probability theories for cognitive psychology.

    PubMed

    Narens, Louis

    2014-01-01

    Various proposals for generalizing event spaces for probability functions have been put forth in the mathematical, scientific, and philosophic literatures. In cognitive psychology such generalizations are used for explaining puzzling results in decision theory and for modeling the influence of context effects. This commentary discusses proposals for generalizing probability theory to event spaces that are not necessarily boolean algebras. Two prominent examples are quantum probability theory, which is based on the set of closed subspaces of a Hilbert space, and topological probability theory, which is based on the set of open sets of a topology. Both have been applied to a variety of cognitive situations. This commentary focuses on how event space properties can influence probability concepts and impact cognitive modeling.

  1. People's conditional probability judgments follow probability theory (plus noise).

    PubMed

    Costello, Fintan; Watts, Paul

    2016-09-01

    A common view in current psychology is that people estimate probabilities using various 'heuristics' or rules of thumb that do not follow the normative rules of probability theory. We present a model where people estimate conditional probabilities such as P(A|B) (the probability of A given that B has occurred) by a process that follows standard frequentist probability theory but is subject to random noise. This model accounts for various results from previous studies of conditional probability judgment. This model predicts that people's conditional probability judgments will agree with a series of fundamental identities in probability theory whose form cancels the effect of noise, while deviating from probability theory in other expressions whose form does not allow such cancellation. Two experiments strongly confirm these predictions, with people's estimates on average agreeing with probability theory for the noise-cancelling identities, but deviating from probability theory (in just the way predicted by the model) for other identities. This new model subsumes an earlier model of unconditional or 'direct' probability judgment which explains a number of systematic biases seen in direct probability judgment (Costello & Watts, 2014). This model may thus provide a fully general account of the mechanisms by which people estimate probabilities.
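
    An illustrative simulation of the "probability theory plus noise" idea, under the simple assumption that each judged probability is a frequency estimate in which every consulted item can be misread with probability d (the noise model of the earlier direct-judgment account the abstract cites). Under that assumption, an expression such as P(A) + P(B) - P(A and B) - P(A or B) has expected value zero even with noise, while any single judged probability is biased toward 0.5. The noise level and event probabilities below are arbitrary.

        import numpy as np

        rng = np.random.default_rng(0)
        d, n_items, n_trials = 0.15, 50, 20000   # assumed noise rate, items per judgment, repetitions

        p_a, p_b = 0.30, 0.60                    # true probabilities; A and B independent here
        p_ab = p_a * p_b
        p_aorb = p_a + p_b - p_ab

        def noisy_estimate(p_true):
            """Frequency estimate of p_true in which each item is misread with probability d."""
            truth = rng.random((n_trials, n_items)) < p_true
            flip = rng.random((n_trials, n_items)) < d
            return np.where(flip, ~truth, truth).mean(axis=1)

        est_a, est_b = noisy_estimate(p_a), noisy_estimate(p_b)
        est_ab, est_aorb = noisy_estimate(p_ab), noisy_estimate(p_aorb)

        print("mean judged P(A):", est_a.mean().round(3), "(true value", p_a, "- biased toward 0.5)")
        print("mean of P(A)+P(B)-P(A&B)-P(AvB):",
              (est_a + est_b - est_ab - est_aorb).mean().round(3), "(noise cancels, ~0)")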

  2. No bridge too high: infants decide whether to cross based on the probability of falling not the severity of the potential fall.

    PubMed

    Kretch, Kari S; Adolph, Karen E

    2013-05-01

    Do infants, like adults, consider both the probability of falling and the severity of a potential fall when deciding whether to cross a bridge? Crawling and walking infants were encouraged to cross bridges varying in width over a small drop-off, a large drop-off, or no drop-off. Bridge width affects the probability of falling, whereas drop-off height affects the severity of the potential fall. For both crawlers and walkers, decisions about crossing bridges depended only on the probability of falling: As bridge width decreased, attempts to cross decreased, and gait modifications and exploration increased, but behaviors did not differ between small and large drop-off conditions. Similarly, decisions about descent depended on the probability of falling: Infants backed or crawled into the small drop-off, but avoided the large drop-off. With no drop-off, infants ran straight across. Results indicate that experienced crawlers and walkers accurately perceive affordances for locomotion, but they do not yet consider the severity of a potential fall when making decisions for action.

  3. Finding of No Significant Impact (FONSI) for Construction of a Base Civil Engineer Complex at Travis Air Force Base, California

    DTIC Science & Technology

    2002-01-26

    Region AST Aboveground storage tank AT/FP Anti-Terrorism/Force Protection ATW Air Transport Wing BA Biological Assessment BAAQMD Bay Area...Basin San Francisco Bay Area Air Basin BCE Base Civil Engineering bgs Below ground surface BMP Best management practice BO Biological...Sulfur hexafluoride SFBAAB San Francisco Bay Area Air Basin SHPO State Historic Preservation Office SIP state implementation plan SO2

  4. Naive Probability: A Mental Model Theory of Extensional Reasoning.

    ERIC Educational Resources Information Center

    Johnson-Laird, P. N.; Legrenzi, Paolo; Girotto, Vittorio; Legrenzi, Maria Sonino; Caverni, Jean-Paul

    1999-01-01

    Outlines a theory of naive probability in which individuals who are unfamiliar with the probability calculus can infer the probabilities of events in an "extensional" way. The theory accommodates reasoning based on numerical premises, and explains how naive reasoners can infer posterior probabilities without relying on Bayes's theorem.…

  5. Acid-base titrations for polyacids: Significance of the pKa and parameters in the Kern equation

    NASA Technical Reports Server (NTRS)

    Meites, L.

    1978-01-01

    A new method is suggested for calculating the dissociation constants of polyvalent acids, especially polymeric acids. In qualitative form, the most significant characteristics of the titration curves obtained when solutions of such acids are titrated potentiometrically with a standard base are demonstrated and identified.

  6. Probability state modeling theory.

    PubMed

    Bagwell, C Bruce; Hunsberger, Benjamin C; Herbert, Donald J; Munson, Mark E; Hill, Beth L; Bray, Chris M; Preffer, Frederic I

    2015-07-01

    As the technology of cytometry matures, there is mounting pressure to address two major issues with data analyses. The first issue is to develop new analysis methods for high-dimensional data that can directly reveal and quantify important characteristics associated with complex cellular biology. The other issue is to replace subjective and inaccurate gating with automated methods that objectively define subpopulations and account for population overlap due to measurement uncertainty. Probability state modeling (PSM) is a technique that addresses both of these issues. The theory and important algorithms associated with PSM are presented along with simple examples and general strategies for autonomous analyses. PSM is leveraged to better understand B-cell ontogeny in bone marrow in a companion Cytometry Part B manuscript. Three short relevant videos are available in the online supporting information for both of these papers. PSM avoids the dimensionality barrier normally associated with high-dimensionality modeling by using broadened quantile functions instead of frequency functions to represent the modulation of cellular epitopes as cells differentiate. Since modeling programs ultimately minimize or maximize one or more objective functions, they are particularly amenable to automation and, therefore, represent a viable alternative to subjective and inaccurate gating approaches.

  7. Lectures on probability and statistics

    SciTech Connect

    Yost, G.P.

    1984-09-01

    These notes are based on a set of statistics lectures delivered at Imperial College to the first-year postgraduate students in High Energy Physics. They are designed for the professional experimental scientist. We begin with the fundamentals of probability theory, in which one makes statements about the set of possible outcomes of an experiment, based upon a complete a priori understanding of the experiment. For example, in a roll of a set of (fair) dice, one understands a priori that any given side of each die is equally likely to turn up. From that, we can calculate the probability of any specified outcome. We finish with the inverse problem, statistics. Here, one begins with a set of actual data (e.g., the outcomes of a number of rolls of the dice), and attempts to make inferences about the state of nature which gave those data (e.g., the likelihood of seeing any given side of any given die turn up). This is a much more difficult problem, of course, and one's solutions often turn out to be unsatisfactory in one respect or another.
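
    As a concrete instance of the a priori calculation mentioned above, the probability of any specified outcome of two fair dice follows from enumerating the 36 equally likely ordered pairs. The short enumeration below is illustrative and not taken from the lectures.

        from itertools import product
        from fractions import Fraction

        # All 36 equally likely ordered outcomes of rolling two fair six-sided dice.
        outcomes = list(product(range(1, 7), repeat=2))

        def prob(event):
            """Probability of an event, given as a predicate over an ordered pair (die1, die2)."""
            favourable = sum(1 for o in outcomes if event(o))
            return Fraction(favourable, len(outcomes))

        print("P(sum = 7)        =", prob(lambda o: o[0] + o[1] == 7))   # 1/6
        print("P(at least one 6) =", prob(lambda o: 6 in o))             # 11/36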

  8. Prestack inversion based on anisotropic Markov random field-maximum posterior probability inversion and its application to identify shale gas sweet spots

    NASA Astrophysics Data System (ADS)

    Wang, Kang-Ning; Sun, Zan-Dong; Dong, Ning

    2015-12-01

    Economic shale gas production requires hydraulic fracture stimulation to increase the formation permeability. Hydraulic fracturing strongly depends on geomechanical parameters such as Young's modulus and Poisson's ratio. Fracture-prone sweet spots can be predicted by prestack inversion, which is an ill-posed problem; thus, regularization is needed to obtain unique and stable solutions. To characterize gas-bearing shale sedimentary bodies, elastic parameter variations are regarded as an anisotropic Markov random field. Bayesian statistics are adopted to transform the prestack inversion into a maximum posterior probability problem. Two energy functions for the lateral and vertical directions are used to describe the distribution, and the expectation-maximization algorithm is used to estimate the hyperparameters of the prior probability of the elastic parameters. Finally, the inversion yields clear geological boundaries, high vertical resolution, and reasonable lateral continuity, using the conjugate gradient method to minimize the objective function. The noise robustness and imaging ability of the method were tested using synthetic and real data.
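
    The anisotropic-regularization idea can be illustrated with a much simplified toy problem: a least-squares data misfit plus separate vertical and lateral smoothness penalties (playing the role of the two Markov random field energy functions), minimized with a conjugate gradient routine. This is only a structural analogue of the workflow in the abstract; the forward operator, weights, and data below are invented, and no wavelet physics or EM hyperparameter estimation is included.

        import numpy as np
        from scipy.optimize import minimize

        rng = np.random.default_rng(1)
        nz, nx = 12, 10                        # toy model grid (depth samples x traces)
        m_true = np.zeros((nz, nx))
        m_true[5:8, :] = 1.0                   # a laterally continuous "layer"

        G = rng.normal(size=(nz, nz))          # invented linear forward operator (per trace)
        d_obs = G @ m_true + 0.05 * rng.normal(size=(nz, nx))

        lam_v, lam_h = 0.5, 2.0                # vertical / lateral regularization weights

        def objective(m_flat):
            m = m_flat.reshape(nz, nx)
            misfit = np.sum((G @ m - d_obs) ** 2)
            vert = np.sum(np.diff(m, axis=0) ** 2)    # vertical roughness penalty
            horiz = np.sum(np.diff(m, axis=1) ** 2)   # lateral roughness penalty
            return misfit + lam_v * vert + lam_h * horiz

        res = minimize(objective, np.zeros(nz * nx), method="CG")
        m_est = res.x.reshape(nz, nx)
        print("relative model error:", np.linalg.norm(m_est - m_true) / np.linalg.norm(m_true))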

  9. Cluster Based Reaction Probabilities for Boron with Oxygen, Hydrogen, Water, Nitrogen, Nitrous Oxide, Carbon Dioxide, Carbon Monoxide, Methane, Tetrafluoromethane, and Silane

    DTIC Science & Technology

    1989-10-28

    measured for reactions of boron cluster ions with the gases in question. We present both total reaction probabilities and also the branching fractions...Water, Nitrogen, Nitrous Oxide, Carbon Dioxide, Carbon Monoxide, Methane, Tetrafluoromethane, and Silane Paul A. Hintz, Stephen A. Ruatta, and Scott...detailed study of boron cluster ion reaction dynamics, we have tried to present our cross section measurements in a form most useful to combustion

  10. Assessment of the probability of contaminating Mars

    NASA Technical Reports Server (NTRS)

    Judd, B. R.; North, D. W.; Pezier, J. P.

    1974-01-01

    New methodology is proposed to assess the probability that the planet Mars will be biologically contaminated by terrestrial microorganisms aboard a spacecraft. Present NASA methods are based on the Sagan-Coleman formula, which states that the probability of contamination is the product of the expected microbial release and a probability of growth. The proposed new methodology extends the Sagan-Coleman approach to permit utilization of detailed information on microbial characteristics, on the lethality of release and transport mechanisms, and of other information about the Martian environment. Three different types of microbial release are distinguished in the model for assessing the probability of contamination. The number of viable microbes released by each mechanism depends on the bio-burden in various locations on the spacecraft and on whether the spacecraft landing is accomplished according to plan. For each of the three release mechanisms a probability of growth is computed, using a model for transport into an environment suited to microbial growth.
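
    A worked illustration in the spirit of the extended Sagan-Coleman formula described above: summing, over release mechanisms, the expected number of viable microbes released times a growth probability gives the expected number of growing organisms; under a Poisson-style reading this sum approximates the contamination probability when it is small. All numbers and mechanism names below are hypothetical placeholders, not values from the study.

        import math

        # Expected viable microbes released and probability of growth, per release mechanism
        # (hypothetical values for illustration only).
        mechanisms = {
            "nominal surface operations": {"expected_release": 1e3, "p_growth": 1e-8},
            "aeolian transport":          {"expected_release": 1e2, "p_growth": 1e-7},
            "off-nominal hard landing":   {"expected_release": 1e5, "p_growth": 1e-6},
        }

        expected_growth = sum(m["expected_release"] * m["p_growth"] for m in mechanisms.values())
        p_contamination = 1.0 - math.exp(-expected_growth)   # ~ expected_growth when small

        print(f"expected number of growing organisms: {expected_growth:.2e}")
        print(f"approximate probability of contamination: {p_contamination:.2e}")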

  11. Multinomial mixture model with heterogeneous classification probabilities

    USGS Publications Warehouse

    Holland, M.D.; Gray, B.R.

    2011-01-01

    Royle and Link (Ecology 86(9):2505-2512, 2005) proposed an analytical method that allowed estimation of multinomial distribution parameters and classification probabilities from categorical data measured with error. While useful, we demonstrate algebraically and by simulations that this method yields biased multinomial parameter estimates when the probabilities of correct category classifications vary among sampling units. We address this shortcoming by treating these probabilities as logit-normal random variables within a Bayesian framework. We use Markov chain Monte Carlo to compute Bayes estimates from a simulated sample from the posterior distribution. Based on simulations, this elaborated Royle-Link model yields nearly unbiased estimates of the multinomial and correct-classification probabilities when classification probabilities are allowed to vary according to the normal distribution on the logit scale or according to the Beta distribution. The method is illustrated using categorical submersed aquatic vegetation data. © 2010 Springer Science+Business Media, LLC.
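
    A small simulation of the data-generating process that the elaborated model targets, under assumed values: each sampling unit gets its own correct-classification probability drawn logit-normally, and the observed counts mix correct and erroneous classifications, which biases the naive multinomial estimate. Fitting the Bayesian model itself (e.g., by Markov chain Monte Carlo) is beyond this sketch; only two categories are used for brevity.

        import numpy as np

        rng = np.random.default_rng(42)
        n_units, n_per_unit = 200, 30      # sampling units and items classified per unit
        pi_true = 0.7                      # true proportion of category 1 (of two categories)
        mu, sigma = 1.5, 0.8               # assumed logit-scale mean/sd of classification probability

        # Heterogeneous correct-classification probability per sampling unit (logit-normal).
        theta = 1.0 / (1.0 + np.exp(-rng.normal(mu, sigma, size=n_units)))

        true_cat1 = rng.binomial(n_per_unit, pi_true, size=n_units)
        true_cat2 = n_per_unit - true_cat1

        # Observed category-1 count: correctly classified category-1 items plus misclassified category-2 items.
        obs_cat1 = rng.binomial(true_cat1, theta) + rng.binomial(true_cat2, 1.0 - theta)

        naive = obs_cat1.sum() / (n_units * n_per_unit)
        print("true proportion:", pi_true, "| naive estimate ignoring misclassification:", round(naive, 3))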

  12. A timber inventory based upon manual and automated analysis of ERTS-1 and supporting aircraft data using multistage probability sampling. [Plumas National Forest, California]

    NASA Technical Reports Server (NTRS)

    Nichols, J. D.; Gialdini, M.; Jaakkola, S.

    1974-01-01

    A quasi-operational study demonstrated that a timber inventory based on manual and automated analysis of ERTS-1 data, supporting aircraft data, and ground data could be made using multistage sampling techniques. The inventory proved to be a timely, cost-effective alternative to conventional timber inventory techniques. The timber volume on the Quincy Ranger District of the Plumas National Forest was estimated to be 2.44 billion board feet with a sampling error of 8.2 percent. The cost of the inventory procedure, at 1.1 cents/acre, compared favorably with the cost of a conventional inventory at 25 cents/acre. A point-by-point comparison of CALSCAN-classified ERTS data with human-interpreted low-altitude photo plots indicated no significant differences in the overall classification accuracies.

  13. A Tale of Two Probabilities

    ERIC Educational Resources Information Center

    Falk, Ruma; Kendig, Keith

    2013-01-01

    Two contestants debate the notorious probability problem of the sex of the second child. The conclusions boil down to explication of the underlying scenarios and assumptions. Basic principles of probability theory are highlighted.

  14. The probability distribution of intense daily precipitation

    NASA Astrophysics Data System (ADS)

    Cavanaugh, Nicholas R.; Gershunov, Alexander; Panorska, Anna K.; Kozubowski, Tomasz J.

    2015-03-01

    The probability tail structure of daily precipitation at over 22,000 weather stations globally is examined in order to identify the physically and mathematically consistent distribution type for modeling the probability of intense daily precipitation and extremes. Results indicate that when aggregating data annually, most locations are to be considered heavy-tailed with statistical significance. When aggregating data by season, it becomes evident that the thickness of the probability tail is related to the variability in precipitation-causing events and thus that the fundamental cause of precipitation volatility is weather diversity. These results have both theoretical and practical implications for the modeling of high-frequency climate variability worldwide.

  15. SU-E-T-580: On the Significance of Model Based Dosimetry for Breast and Head and Neck 192Ir HDR Brachytherapy

    SciTech Connect

    Peppa, V; Pappas, E; Pantelis, E; Papagiannis, P; Major, T; Polgar, C

    2015-06-15

    Purpose: To assess the dosimetric and radiobiological differences between TG43-based and model-based dosimetry in the treatment planning of 192Ir HDR brachytherapy for breast and head and neck cancer. Methods: Two cohorts of 57 Accelerated Partial Breast Irradiation (APBI) and 22 head and neck (H&N) patients with oral cavity carcinoma were studied. Dosimetry for the treatment plans was performed using the TG43 algorithm of the Oncentra Brachy v4.4 treatment planning system (TPS). Corresponding Monte Carlo (MC) simulations were performed using MCNP6 with input files automatically prepared by the BrachyGuide software tool from DICOM RT plan data. TG43 and MC data were compared in terms of % dose differences, Dose Volume Histograms (DVHs), and related indices of clinical interest for the Planning Target Volume (PTV) and the Organs-At-Risk (OARs). A radiobiological analysis was also performed using the Equivalent Uniform Dose (EUD), mean survival fraction (S), and Tumor Control Probability (TCP) for the PTV, and the Normal Tissue Complication Probability (NTCP) and the generalized EUD (gEUD) for the OARs. Significance testing of the observed differences was performed using the Wilcoxon paired-sample test. Results: Differences between TG43 and MC DVH indices, associated with the increased corresponding local % dose differences observed, were statistically significant. This is mainly attributed to their consistency, however, since TG43 agrees closely with MC for the majority of DVH and radiobiological parameters in both patient cohorts. Differences varied considerably among patients only for the ipsilateral lung and ribs in the APBI cohort, with a strong correlation to target location. Conclusion: While the consistency and magnitude of differences in the majority of clinically relevant DVH indices imply that no change is needed in the treatment planning practice, individualized dosimetry improves accuracy and addresses instances of inter-patient variability observed. Research

  16. Coherent Assessment of Subjective Probability

    DTIC Science & Technology

    1981-03-01

    known results of de Finetti (1937, 1972, 1974), Smith (1961), and Savage (1971) and some recent results of Lindley (1980) concerning the use of...provides the motivation for de Finetti's definition of subjective probabilities as coherent bet prices. From the definition of the probability measure...subjective probability, the probability laws which are traditionally stated as axioms or definitions are obtained instead as theorems. (De Finetti F -7

  17. Modality, probability, and mental models.

    PubMed

    Hinterecker, Thomas; Knauff, Markus; Johnson-Laird, P N

    2016-10-01

    We report 3 experiments investigating novel sorts of inference, such as: A or B or both. Therefore, possibly (A and B). Where the contents were sensible assertions, for example, Space tourism will achieve widespread popularity in the next 50 years or advances in material science will lead to the development of antigravity materials in the next 50 years, or both. Most participants accepted the inferences as valid, though they are invalid in modal logic and in probabilistic logic too. But, the theory of mental models predicts that individuals should accept them. In contrast, inferences of this sort—A or B but not both. Therefore, A or B or both—are both logically valid and probabilistically valid. Yet, as the model theory also predicts, most reasoners rejected them. The participants’ estimates of probabilities showed that their inferences tended not to be based on probabilistic validity, but that they did rate acceptable conclusions as more probable than unacceptable conclusions. We discuss the implications of the results for current theories of reasoning.

  18. Probability, clinical decision making and hypothesis testing

    PubMed Central

    Banerjee, A.; Jadhav, S. L.; Bhawalkar, J. S.

    2009-01-01

    Few clinicians grasp the true concept of probability expressed in the ‘P value.’ For most, a statistically significant P value is the end of the search for truth. In fact, the opposite is the case. The present paper attempts to put the P value in proper perspective by explaining different types of probabilities, their role in clinical decision making, medical research and hypothesis testing. PMID:21234167

  19. UQ for Decision Making: How (at least five) Kinds of Probability Might Come Into Play

    NASA Astrophysics Data System (ADS)

    Smith, L. A.

    2013-12-01

    In 1959 IJ Good published the discussion "Kinds of Probability" in Science. Good identified (at least) five kinds. The need for (at least) a sixth kind of probability when quantifying uncertainty in the context of climate science is discussed. This discussion brings out the differences between weather-like forecasting tasks and climate-like tasks, with a focus on the effective use both of science and of modelling in support of decision making. Good also introduced the idea of a "Dynamic probability", a probability one expects to change without any additional empirical evidence; the probabilities assigned by a chess-playing program when it is only half through its analysis being an example. This case is contrasted with the case of "Mature probabilities", where a forecast algorithm (or model) has converged on its asymptotic probabilities and the question hinges on whether or not those probabilities are expected to change significantly before the event in question occurs, even in the absence of new empirical evidence. If so, then how might one report and deploy such immature probabilities in scientific support of decision-making rationally? Mature probability is suggested as a useful sixth kind; although Good would doubtless argue that we can get by with just one, effective communication with decision makers may be enhanced by speaking as if the others existed. This again highlights the distinction between weather-like contexts and climate-like contexts. In the former context one has access to a relevant climatology (a relevant, arguably informative distribution prior to any model simulations); in the latter context that information is not available, although one can fall back on the scientific basis upon which the model itself rests, and estimate the probability that the model output is in fact misinformative. This subjective "probability of a big surprise" is one way to communicate the probability of model-based information holding in practice, the probability that the

  20. Finding of No Significant Impact: Replacement of Chemical Cleaning Line Tinker Air Force Base Oklahoma City, Oklahoma

    DTIC Science & Technology

    2012-05-01

    economy, and there would be no long-term impacts on local socioeconomic conditions. Page 4-13 February 2012 Environmental Assessment FINAL...FINDING OF NO SIGNIFICANT IMPACT: REPLACEMENT OF CHEMICAL CLEANING LINE TINKER AIR FORCE BASE OKLAHOMA CITY, OKLAHOMA An Environmental Assessment...entitled Environmental Impact Analysis Process (EIAP) and codified at 32 CFR 989. The EA is incorporated by reference into this finding. DESCRIPTION

  1. Measurement of the neutral B meson decay mixing frequency using a new probability based self-tagging algorithm applied to inclusive lepton events from proton-antiproton collisions at center-of-mass energy = 1.8 TeV

    NASA Astrophysics Data System (ADS)

    Shah, Tushar

    We present a measurement of the Bd mixing frequency performed in an inclusive lepton sample, B --> l + X. A secondary vertex identifies a B meson decay, and a high-pT lepton determines the flavor at the time of decay. We use a self-tagging algorithm (exploiting the correlation between the charge of particles produced along with a B meson and its flavor) to determine the B flavor at the time of production. Confusion of B daughter particles with charge-flavor correlated particles can cause significant degradation of the flavor-tagging performance. Monte Carlo based probability distributions over kinematic and geometric properties of tracks are used to distinguish between potential self-tagging candidates and unidentified B meson daughters. We measure Δmd = 0.42 ± 0.09 (stat) ± 0.03 (sys) ps^-1.

  2. Mentorship: increasing retention probabilities.

    PubMed

    Leners, Debra Woodard; Wilson, Vicki W; Connor, Peggy; Fenton, Joanne

    2006-11-01

    Retaining nurses is a significant workforce issue. Experienced nurses in particular are getting harder to retain within hospitals and the discipline at large. One solution to boost retention is to give serious attention to professional socialization activities through contemporary nurse mentorship experiences. The authors contend that contemporary mentoring programmes, targeting developmental quality of life issues of the expert nurse, would appreciably benefit retention programmes within the hospital environment.

  3. Probabilities of transversions and transitions.

    PubMed

    Vol'kenshtein, M V

    1976-01-01

    The values of the mean relative probabilities of transversions and transitions have been refined on the basis of the data collected by Jukes and found to be equal to 0.34 and 0.66, respectively. Evolutionary factors increase the probability of transversions to 0.44. The relative probabilities of individual substitutions have been determined, and a detailed classification of the nonsense mutations has been given. Such mutations are especially probable in the UGG (Trp) codon. The highest probability of AG, GA transitions correlates with the lowest mean change in the hydrophobic nature of the amino acids coded.

  4. An introductory analysis of satellite collision probabilities

    NASA Astrophysics Data System (ADS)

    Carlton-Wippern, Kitt C.

    This paper addresses a probabilistic approach to assessing the probability of a satellite collision occurring, based on relative trajectory analyses and probability density functions representing the satellites' position/momentum vectors. The paper is divided into 2 parts: Static and Dynamic Collision Probabilities. In the Static Collision Probability section, the basic phenomenon under study is: given the mean positions and associated position probability density functions for the two objects, calculate the probability that the two objects collide (defined as being within some distance of each other). The paper presents the classic Laplace problem of the probability of arrival, using standard uniform distribution functions. This problem is then extrapolated to show how 'arrival' can be classified as 'collision', how the arrival space geometries map to collision space geometries, and how arbitrary position density functions can then be included and integrated into the analysis. In the Dynamic Collision Probability section, the nature of collisions based upon both trajectory and energy considerations is discussed, and it is shown that energy states alone cannot be used to completely describe whether or not a collision occurs. This fact invalidates some earlier work on the subject and demonstrates why Liouville's theorem cannot be used in general to describe the constant density of the position/momentum space in which a collision may occur. Future position probability density functions are then shown to be the convolution of the current position and momentum density functions (linear analysis), and the paper further demonstrates the dependency of the future position density functions on time. Strategies for assessing the collision probabilities for two point masses with uncertainties in position and momentum at some given time, and these integrated with some arbitrary impact volume schema, are then discussed. This presentation concludes with the formulation of a high level design
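
    A minimal Monte Carlo version of the static problem described above, assuming Gaussian position uncertainties for both objects and a combined hard-body radius: the collision probability is the probability that the relative position falls within that radius. Means, covariances, and the radius are arbitrary illustrative values.

        import numpy as np

        rng = np.random.default_rng(7)

        # Mean positions (km) and position covariances (km^2) of the two objects at the time of interest.
        mu1, mu2 = np.array([0.0, 0.0, 0.0]), np.array([0.4, 0.1, -0.2])
        cov1 = np.diag([0.05, 0.20, 0.05]) ** 2
        cov2 = np.diag([0.10, 0.10, 0.10]) ** 2

        hard_body_radius = 0.05          # km; combined radius defining a "collision"
        n_samples = 1_000_000

        # The relative position is Gaussian with mean mu2 - mu1 and covariance cov1 + cov2.
        rel = rng.multivariate_normal(mu2 - mu1, cov1 + cov2, size=n_samples)
        p_collision = np.mean(np.linalg.norm(rel, axis=1) < hard_body_radius)

        print(f"Monte Carlo static collision probability: {p_collision:.2e}")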

  5. Structural and Sequence Similarity Makes a Significant Impact on Machine-Learning-Based Scoring Functions for Protein-Ligand Interactions.

    PubMed

    Li, Yang; Yang, Jianyi

    2017-04-05

    The prediction of protein-ligand binding affinity has recently been improved remarkably by machine-learning-based scoring functions. For example, using a set of simple descriptors representing the atomic distance counts, the RF-Score improves the Pearson correlation coefficient to about 0.8 on the core set of the PDBbind 2007 database, which is significantly higher than the performance of any conventional scoring function on the same benchmark. A few studies have discussed the performance of machine-learning-based methods, but the reason for this improvement remains unclear. In this study, by systematically controlling the structural and sequence similarity between the training and test proteins of the PDBbind benchmark, we demonstrate that protein structural and sequence similarity makes a significant impact on machine-learning-based methods. After removal of training proteins that are highly similar to the test proteins, as identified by structure alignment and sequence alignment, machine-learning-based methods trained on the new training sets no longer outperform the conventional scoring functions. In contrast, the performance of conventional functions like X-Score is relatively stable no matter what training data are used to fit the weights of its energy terms.

  6. An Alignment-Free Algorithm in Comparing the Similarity of Protein Sequences Based on Pseudo-Markov Transition Probabilities among Amino Acids.

    PubMed

    Li, Yushuang; Song, Tian; Yang, Jiasheng; Zhang, Yi; Yang, Jialiang

    2016-01-01

    In this paper, we have proposed a novel alignment-free method for comparing the similarity of protein sequences. We first encode a protein sequence into a 440 dimensional feature vector consisting of a 400 dimensional Pseudo-Markov transition probability vector among the 20 amino acids, a 20 dimensional content ratio vector, and a 20 dimensional position ratio vector of the amino acids in the sequence. By evaluating the Euclidean distances among the representing vectors, we compare the similarity of protein sequences. We then apply this method to the ND5 dataset consisting of the ND5 protein sequences of 9 species, and the F10 and G11 datasets representing two of the xylanases containing glycoside hydrolase families, i.e., families 10 and 11. As a result, our method achieves a correlation coefficient of 0.962 with the canonical protein sequence aligner ClustalW in the ND5 dataset, much higher than those of other 5 popular alignment-free methods. In addition, we successfully separate the xylanases sequences in the F10 family and the G11 family and illustrate that the F10 family is more heat stable than the G11 family, consistent with a few previous studies. Moreover, we prove mathematically an identity equation involving the Pseudo-Markov transition probability vector and the amino acids content ratio vector.
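
    A sketch of the 440-dimensional encoding described above, under a straightforward reading of its three parts: a 20 x 20 matrix of transition probabilities between consecutive amino acids, the 20 content ratios, and the 20 position ratios (taken here as the mean relative position of each amino acid). The exact normalizations used by the authors may differ, and the toy sequences are invented.

        import numpy as np

        AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
        INDEX = {a: i for i, a in enumerate(AMINO_ACIDS)}

        def encode(seq):
            """440-dim vector: 400 transition probabilities + 20 content ratios + 20 position ratios."""
            seq = [a for a in seq.upper() if a in INDEX]
            n = len(seq)

            trans = np.zeros((20, 20))
            for a, b in zip(seq, seq[1:]):
                trans[INDEX[a], INDEX[b]] += 1
            rows = trans.sum(axis=1, keepdims=True)
            trans = np.divide(trans, rows, out=np.zeros_like(trans), where=rows > 0)

            content = np.zeros(20)
            position = np.zeros(20)
            for i, a in enumerate(seq):
                content[INDEX[a]] += 1
                position[INDEX[a]] += (i + 1) / n          # relative position in the sequence
            position = np.divide(position, content, out=np.zeros_like(position), where=content > 0)
            content /= n

            return np.concatenate([trans.ravel(), content, position])

        # Similarity of two toy sequences as the Euclidean distance between their feature vectors.
        v1 = encode("MKTAYIAKQRQISFVKSHFSRQLEERLGLIEVQ")
        v2 = encode("MKTAYIAKQRQISFVKSHFARQLEERLGLIEAQ")
        print("Euclidean distance:", round(float(np.linalg.norm(v1 - v2)), 4))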

  7. An Alignment-Free Algorithm in Comparing the Similarity of Protein Sequences Based on Pseudo-Markov Transition Probabilities among Amino Acids

    PubMed Central

    Li, Yushuang; Yang, Jiasheng; Zhang, Yi

    2016-01-01

    In this paper, we have proposed a novel alignment-free method for comparing the similarity of protein sequences. We first encode a protein sequence into a 440 dimensional feature vector consisting of a 400 dimensional Pseudo-Markov transition probability vector among the 20 amino acids, a 20 dimensional content ratio vector, and a 20 dimensional position ratio vector of the amino acids in the sequence. By evaluating the Euclidean distances among the representing vectors, we compare the similarity of protein sequences. We then apply this method to the ND5 dataset consisting of the ND5 protein sequences of 9 species, and the F10 and G11 datasets representing two of the xylanases containing glycoside hydrolase families, i.e., families 10 and 11. As a result, our method achieves a correlation coefficient of 0.962 with the canonical protein sequence aligner ClustalW in the ND5 dataset, much higher than those of other 5 popular alignment-free methods. In addition, we successfully separate the xylanases sequences in the F10 family and the G11 family and illustrate that the F10 family is more heat stable than the G11 family, consistent with a few previous studies. Moreover, we prove mathematically an identity equation involving the Pseudo-Markov transition probability vector and the amino acids content ratio vector. PMID:27918587

  8. Associativity and normative credal probability.

    PubMed

    Snow, P

    2002-01-01

    Cox's Theorem is a widely cited motivation for probabilistic models of uncertain belief. The theorem relates the associativity of the logical connectives to that of the arithmetic operations of probability. Recent questions about the correctness of Cox's Theorem have been resolved, but there are new questions about one functional equation used by Cox in 1946. This equation is missing from his later work. Advances in knowledge since 1946 and changes in Cox's research interests explain the equation's disappearance. Other associativity-based motivations avoid functional equations altogether, and so may be more transparently applied to finite domains and discrete beliefs. A discrete counterpart of Cox's Theorem can be assembled from results that have been in the literature since 1959.

  9. High Probabilities of Planet Detection during Microlensing Events.

    NASA Astrophysics Data System (ADS)

    Peale, S. J.

    2000-10-01

    probabilities offer the promise of gaining statistics rapidly on the frequency of planets in long period orbits, and thereby encourage the expansion of ground based microlensing searches for planets with enhanced capabilities. A ground based microlensing search for planets complements the highly successful radial velocity searches and expanding transit searches by being most sensitive to distant, long period planets, whereas both radial velocity and transit searches are most sensitive to close, massive planets. Existing and proposed astrometric searches are also most sensitive to distant planets, but only with a data time span that is a significant fraction of the orbit period.

  10. Significant Scales in Community Structure

    PubMed Central

    Traag, V. A.; Krings, G.; Van Dooren, P.

    2013-01-01

    Many complex networks show signs of modular structure, uncovered by community detection. Although many methods succeed in revealing various partitions, it remains difficult to detect at what scale some partition is significant. This problem shows foremost in multi-resolution methods. We here introduce an efficient method for scanning for resolutions in one such method. Additionally, we introduce the notion of “significance” of a partition, based on subgraph probabilities. Significance is independent of the exact method used, so could also be applied in other methods, and can be interpreted as the gain in encoding a graph by making use of a partition. Using significance, we can determine “good” resolution parameters, which we demonstrate on benchmark networks. Moreover, optimizing significance itself also shows excellent performance. We demonstrate our method on voting data from the European Parliament. Our analysis suggests the European Parliament has become increasingly ideologically divided and that nationality plays no role. PMID:24121597

  11. Seismicity alert probabilities at Parkfield, California, revisited

    USGS Publications Warehouse

    Michael, A.J.; Jones, L.M.

    1998-01-01

    For a decade, the US Geological Survey has used the Parkfield Earthquake Prediction Experiment scenario document to estimate the probability that earthquakes observed on the San Andreas fault near Parkfield will turn out to be foreshocks followed by the expected magnitude six mainshock. During this time, we have learned much about the seismogenic process at Parkfield, about the long-term probability of the Parkfield mainshock, and about the estimation of these types of probabilities. The probabilities for potential foreshocks at Parkfield are reexamined and revised in light of these advances. As part of this process, we have confirmed both the rate of foreshocks before strike-slip earthquakes in the San Andreas physiographic province and the uniform distribution of foreshocks with magnitude proposed by earlier studies. Compared to the earlier assessment, these new estimates of the long-term probability of the Parkfield mainshock are lower, our estimate of the rate of background seismicity is higher, and we find that the assumption that foreshocks at Parkfield occur in a unique way is not statistically significant at the 95% confidence level. While the exact numbers vary depending on the assumptions that are made, the new alert probabilities are lower than previously estimated. Considering the various assumptions and the statistical uncertainties in the input parameters, we also compute a plausible range for the probabilities. The range is large, partly due to the extra knowledge that exists for the Parkfield segment, making us question the usefulness of these numbers.

  12. Intensity-Modulated Radiotherapy Results in Significant Decrease in Clinical Toxicities Compared With Conventional Wedge-Based Breast Radiotherapy

    SciTech Connect

    Harsolia, Asif; Kestin, Larry; Grills, Inga; Wallace, Michelle; Jolly, Shruti; Jones, Cortney; Lala, Moinaktar; Martinez, Alvaro; Schell, Scott; Vicini, Frank A. . E-mail: fvicini@beaumont.edu

    2007-08-01

    Purpose: We have previously demonstrated that intensity-modulated radiotherapy (IMRT) with a static multileaf collimator process results in a more homogenous dose distribution compared with conventional wedge-based whole breast irradiation (WBI). In the present analysis, we reviewed the acute and chronic toxicity of this IMRT approach compared with conventional wedge-based treatment. Methods and Materials: A total of 172 patients with Stage 0-IIB breast cancer were treated with lumpectomy followed by WBI. All patients underwent treatment planning computed tomography and received WBI (median dose, 45 Gy) followed by a boost to 61 Gy. Of the 172 patients, 93 (54%) were treated with IMRT, and the 79 patients (46%) treated with wedge-based RT in a consecutive fashion immediately before this cohort served as the control group. The median follow-up was 4.7 years. Results: A significant reduction in acute Grade 2 or worse dermatitis, edema, and hyperpigmentation was seen with IMRT compared with wedges. A trend was found toward reduced acute Grade 3 or greater dermatitis (6% vs. 1%, p = 0.09) in favor of IMRT. Chronic Grade 2 or worse breast edema was significantly reduced with IMRT compared with conventional wedges. No difference was found in cosmesis scores between the two groups. In patients with larger breasts (≥1,600 cm³, n = 64), IMRT resulted in reduced acute (Grade 2 or greater) breast edema (0% vs. 36%, p <0.001) and hyperpigmentation (3% vs. 41%, p = 0.001) and chronic (Grade 2 or greater) long-term edema (3% vs. 30%, p = 0.007). Conclusion: The use of IMRT in the treatment of the whole breast results in a significant decrease in acute dermatitis, edema, and hyperpigmentation and a reduction in the development of chronic breast edema compared with conventional wedge-based RT.

  13. Determination of the compound nucleus survival probability Psurv for various "hot" fusion reactions based on the dynamical cluster-decay model

    NASA Astrophysics Data System (ADS)

    Chopra, Sahila; Kaur, Arshdeep; Gupta, Raj K.

    2015-03-01

    After a recent successful attempt to define and determine the compound nucleus (CN) fusion/formation probability PCN within the dynamical cluster-decay model (DCM), we introduce and estimate here for the first time the survival probability Psurv of the CN against fission, again within the DCM. Calculated as a dynamical fragmentation process, Psurv is defined as the ratio of the evaporation residue (ER) cross section σER to the sum of σER and the fusion-fission (ff) cross section σff, i.e., the CN formation cross section σCN, where each contributing fragmentation cross section is determined in terms of its formation and barrier penetration probabilities P0 and P. In the DCM, deformations up to hexadecapole and "compact" orientations for both in-plane (coplanar) and out-of-plane (noncoplanar) configurations are allowed. Some 16 "hot" fusion reactions, forming a CN of mass number ACN ~ 100 up to superheavy nuclei, are analyzed for various nuclear interaction potentials, and the variation of Psurv with CN excitation energy E*, fissility parameter χ, CN mass ACN, and Coulomb parameter Z1Z2 is investigated. An interesting result is that three groups, namely weakly fissioning, radioactive, and strongly fissioning superheavy nuclei, are identified with Psurv of, respectively, ~1, ~10^-6, and ~10^-10. For the weakly fissioning group (100
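
    Restating the definition given above in LaTeX notation (no new physics, just the ratio of cross sections):

        P_{\mathrm{surv}} = \frac{\sigma_{\mathrm{ER}}}{\sigma_{\mathrm{ER}} + \sigma_{\mathrm{ff}}}
                          = \frac{\sigma_{\mathrm{ER}}}{\sigma_{\mathrm{CN}}},
        \qquad \sigma_{\mathrm{CN}} = \sigma_{\mathrm{ER}} + \sigma_{\mathrm{ff}}.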

  14. Liquefaction probability curves for surficial geologic deposits

    USGS Publications Warehouse

    Holzer, Thomas L.; Noce, Thomas E.; Bennett, Michael J.

    2011-01-01

    Liquefaction probability curves that predict the probability of surface manifestations of earthquake-induced liquefaction are developed for 14 different types of surficial geologic units. The units consist of alluvial fan, beach ridge, river delta topset and foreset beds, eolian dune, point bar, flood basin, natural river and alluvial fan levees, abandoned river channel, deep-water lake, lagoonal, sandy artificial fill, and valley train deposits. Probability is conditioned on earthquake magnitude and peak ground acceleration. Curves are developed for water table depths of 1.5 and 5.0 m. Probabilities are derived from complementary cumulative frequency distributions of the liquefaction potential index (LPI) that were computed from 927 cone penetration tests. For natural deposits with a water table at 1.5 m and subjected to a M7.5 earthquake with peak ground acceleration (PGA) = 0.25g, probabilities reach 0.5 for beach ridge, point bar, and deltaic deposits. The curves also were used to assign ranges of liquefaction probabilities to the susceptibility categories proposed previously for different geologic deposits. For the earthquake described here, probabilities for susceptibility categories have ranges of 0–0.08 for low, 0.09–0.30 for moderate, 0.31–0.62 for high, and 0.63–1.00 for very high. Retrospective predictions of liquefaction during historical earthquakes based on the curves compare favorably to observations.
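
    A minimal sketch of how one point on such a curve could be read off CPT-derived data, assuming (a common criterion, though not stated in the abstract) that surface manifestation corresponds to LPI >= 5: the conditional probability is the complementary cumulative frequency of LPI at that threshold, computed for one geologic unit under a fixed magnitude, PGA, and water-table depth. The LPI values below are synthetic.

        import numpy as np

        rng = np.random.default_rng(3)

        # Synthetic LPI values for one surficial geologic unit, all computed for the same
        # scenario (e.g., M7.5, PGA = 0.25 g, water table at 1.5 m).
        lpi = rng.lognormal(mean=1.0, sigma=0.9, size=300)

        threshold = 5.0                       # assumed LPI level associated with surface manifestation
        p_liq = np.mean(lpi >= threshold)     # complementary cumulative frequency at the threshold

        print(f"Estimated probability of surface manifestation of liquefaction: {p_liq:.2f}")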

  15. 28 CFR 2.101 - Probable cause hearing and determination.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... who have given information upon which revocation may be based) at a postponed probable cause hearing... attendance, unless good cause is found for not allowing confrontation. Whenever a probable cause hearing...

  16. Propensity, Probability, and Quantum Theory

    NASA Astrophysics Data System (ADS)

    Ballentine, Leslie E.

    2016-08-01

    Quantum mechanics and probability theory share one peculiarity. Both have well established mathematical formalisms, yet both are subject to controversy about the meaning and interpretation of their basic concepts. Since probability plays a fundamental role in QM, the conceptual problems of one theory can affect the other. We first classify the interpretations of probability into three major classes: (a) inferential probability, (b) ensemble probability, and (c) propensity. Class (a) is the basis of inductive logic; (b) deals with the frequencies of events in repeatable experiments; (c) describes a form of causality that is weaker than determinism. An important, but neglected, paper by P. Humphreys demonstrated that propensity must differ mathematically, as well as conceptually, from probability, but he did not develop a theory of propensity. Such a theory is developed in this paper. Propensity theory shares many, but not all, of the axioms of probability theory. As a consequence, propensity supports the Law of Large Numbers from probability theory, but does not support Bayes' theorem. Although there are particular problems within QM to which any of the classes of probability may be applied, it is argued that the intrinsic quantum probabilities (calculated from a state vector or density matrix) are most naturally interpreted as quantum propensities. This does not alter the familiar statistical interpretation of QM. But the interpretation of quantum states as representing knowledge is untenable. Examples show that a density matrix fails to represent knowledge.

  17. Study on the Identification of Radix Bupleuri from Its Unofficial Varieties Based on Discrete Wavelet Transformation Feature Extraction of ATR-FTIR Spectroscopy Combined with Probability Neural Network.

    PubMed

    Jin, Wenying; Wan, Chayan; Cheng, Cungui

    2015-01-01

    Attenuated total reflection-Fourier transform infrared spectroscopy (ATR-FTIR) was employed to acquire the infrared spectra of Radix Bupleuri and its unofficial varieties: the root of Bupleurum smithii Wolff and the root of Bupleurum bicaule Helm. The infrared spectra and the Fourier self-deconvolution (FSD), discrete wavelet transform (DWT), and probability neural network (PNN) spectra of these species were analyzed. With the FSD method, there were conspicuous differences in the infrared absorption peak intensities between Radix Bupleuri and its unofficial varieties, but it is hard to tell the difference between the root of Bupleurum smithii Wolff and the root of Bupleurum bicaule. The differences are shown more clearly when the DWT is used. The results show that the DWT technique makes it easier to identify Radix Bupleuri from its unofficial varieties, the root of Bupleurum smithii Wolff and the root of Bupleurum bicaule.
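
    A sketch of the spectral preprocessing step, using the PyWavelets package for the discrete wavelet transform. The probabilistic neural network itself is not shown (a nearest-neighbour rule stands in for the classifier here), and the wavelet, decomposition level, and synthetic "spectra" are all assumptions rather than the paper's settings.

        import numpy as np
        import pywt   # PyWavelets

        def dwt_features(spectrum, wavelet="db4", level=4):
            """Concatenate approximation and detail coefficients of a 1-D spectrum."""
            return np.concatenate(pywt.wavedec(spectrum, wavelet, level=level))

        rng = np.random.default_rng(0)
        x = np.linspace(0, 20, 1024)
        # Synthetic stand-ins for ATR-FTIR spectra of two classes (official vs. unofficial variety).
        class_a = [np.sin(x) + 0.05 * rng.normal(size=x.size) for _ in range(10)]
        class_b = [np.sin(1.1 * x) + 0.05 * rng.normal(size=x.size) for _ in range(10)]

        X = np.array([dwt_features(s) for s in class_a + class_b])
        y = np.array([0] * 10 + [1] * 10)

        # Nearest-neighbour stand-in for the PNN: classify a new synthetic spectrum.
        query = dwt_features(np.sin(1.1 * x) + 0.05 * rng.normal(size=x.size))
        pred = y[np.argmin(np.linalg.norm(X - query, axis=1))]
        print("predicted class:", pred)   # expected: 1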

  18. Characteristic length of the knotting probability revisited

    NASA Astrophysics Data System (ADS)

    Uehara, Erica; Deguchi, Tetsuo

    2015-09-01

    We present a self-avoiding polygon (SAP) model for circular DNA in which the radius of impermeable cylindrical segments corresponds to the screening length of double-stranded DNA surrounded by counter ions. For the model we evaluate the probability for a generated SAP with N segments having a given knot K through simulation. We call it the knotting probability of a knot K with N segments for the SAP model. We show that when N is large the most significant factor in the knotting probability is given by the exponentially decaying part exp(-N/NK), where the estimates of parameter NK are consistent with the same value for all the different knots we investigated. We thus call it the characteristic length of the knotting probability. We give formulae expressing the characteristic length as a function of the cylindrical radius rex, i.e. the screening length of double-stranded DNA.

  19. MSPI False Indication Probability Simulations

    SciTech Connect

    Dana Kelly; Kurt Vedros; Robert Youngblood

    2011-03-01

    This paper examines false indication probabilities in the context of the Mitigating System Performance Index (MSPI), in order to investigate the pros and cons of different approaches to resolving two coupled issues: (1) sensitivity to the prior distribution used in calculating the Bayesian-corrected unreliability contribution to the MSPI, and (2) whether (in a particular plant configuration) to model the fuel oil transfer pump (FOTP) as a separate component, or integrally to its emergency diesel generator (EDG). False indication probabilities were calculated for the following situations: (1) all component reliability parameters at their baseline values, so that the true indication is green, meaning that an indication of white or above would be false positive; (2) one or more components degraded to the extent that the true indication would be (mid) white, and “false” would be green (negative) or yellow (negative) or red (negative). In key respects, this was the approach taken in NUREG-1753. The prior distributions examined were the constrained noninformative (CNI) prior used currently by the MSPI, a mixture of conjugate priors, the Jeffreys noninformative prior, a nonconjugate log(istic)-normal prior, and the minimally informative prior investigated in (Kelly et al., 2010). The mid-white performance state was set at ΔCDF = 10 × 10^-6/yr. For each simulated time history, a check is made of whether the calculated ΔCDF is above or below 10^-6/yr. If the parameters were at their baseline values, and ΔCDF > 10^-6/yr, this is counted as a false positive. Conversely, if one or all of the parameters are set to values corresponding to ΔCDF > 10^-6/yr but that time history’s ΔCDF < 10^-6/yr, this is counted as a false negative indication. The false indication (positive or negative) probability is then estimated as the number of false positive or negative counts divided by the number of time histories (100,000). Results are presented for a set of base case parameter values
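
    A structural sketch of the simulation loop described above, with an invented toy model standing in for the plant PRA: each simulated time history draws failure data, a Bayesian-style posterior mean of unreliability is formed, a toy delta-CDF is computed, and false positives are counted against the 10^-6/yr threshold. The prior, the demand count, and the factor mapping unreliability to delta-CDF are placeholders, not the MSPI formulas or NUREG values.

        import numpy as np

        rng = np.random.default_rng(11)
        n_histories = 100_000
        n_demands = 200                   # demands on the monitored component per monitoring window
        p_fail_true = 1.0e-3              # baseline failure probability (true indication: green)
        a_prior, b_prior = 0.5, 200.0     # toy beta prior standing in for the CNI prior

        threshold = 1.0e-6                # delta-CDF threshold (per yr) separating green from white
        cdf_sensitivity = 5.0e-4          # invented factor mapping an unreliability change to delta-CDF

        failures = rng.binomial(n_demands, p_fail_true, size=n_histories)
        p_post = (a_prior + failures) / (a_prior + b_prior + n_demands)   # posterior mean unreliability
        delta_cdf = cdf_sensitivity * (p_post - p_fail_true)

        false_positive = np.mean(delta_cdf > threshold)
        print(f"estimated false positive (white when truly green) probability: {false_positive:.3f}")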

  20. Predicting loss exceedance probabilities for US hurricane landfalls

    NASA Astrophysics Data System (ADS)

    Murnane, R.

    2003-04-01

    The extreme winds, rains, and floods produced by landfalling hurricanes kill and injure many people and cause severe economic losses. Many business, planning, and emergency management decisions are based on the probability of hurricane landfall and associated emergency management considerations; however, the expected total economic and insured losses also are important. Insured losses generally are assumed to be half the total economic loss from hurricanes in the United States. Here I describe a simple model that can be used to estimate deterministic and probabilistic exceedance probabilities for insured losses associated with landfalling hurricanes along the US coastline. The model combines wind speed exceedance probabilities with loss records from historical hurricanes striking land. The wind speed exceedance probabilities are based on the HURDAT best track data and use the storm’s maximum sustained wind just prior to landfall. The loss records are normalized to present-day values using a risk model and account for historical changes in inflation, population, housing stock, and other factors. Analysis of the correlation between normalized losses and a number of storm-related parameters suggests that the most relevant, statistically-significant predictor for insured loss is the storm’s maximum sustained wind at landfall. Insured loss exceedance probabilities thus are estimated using a linear relationship between the log of the maximum sustained winds and normalized insured loss. Model estimates for insured losses from Hurricanes Isidore (US$45 million) and Lili (US$275 million) compare well with loss estimates from more sophisticated risk models and recorded losses. The model can also be used to estimate how exceedance probabilities for insured loss vary as a function of the North Atlantic Oscillation and the El Niño-Southern Oscillation.
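
    A sketch of the final step described above: fit log(normalized insured loss) linearly against maximum sustained wind at landfall, then combine the fit with wind-speed exceedance probabilities to obtain a loss exceedance probability. The wind/loss pairs and the exceedance curve below are fabricated placeholders, not HURDAT or normalized loss records.

        import numpy as np

        # Placeholder (landfall wind in kt, normalized insured loss in $bn) pairs.
        wind = np.array([65, 75, 85, 90, 100, 110, 120, 130, 140, 150])
        loss = np.array([0.05, 0.1, 0.3, 0.5, 1.2, 2.5, 6.0, 12.0, 25.0, 60.0])

        # Fit log10(loss) = a + b * wind.
        b, a = np.polyfit(wind, np.log10(loss), 1)

        # Placeholder annual exceedance probabilities for landfall wind speed.
        wind_grid = np.array([65, 85, 105, 125, 145])
        p_exceed_wind = np.array([0.90, 0.55, 0.25, 0.08, 0.01])

        # Annual probability that insured loss exceeds $5bn: invert the fit for the wind speed
        # whose predicted loss is $5bn, then read the wind exceedance curve at that speed.
        v_5bn = (np.log10(5.0) - a) / b
        p_loss_gt_5bn = np.interp(v_5bn, wind_grid, p_exceed_wind)
        print(f"wind giving ~$5bn loss: {v_5bn:.0f} kt; annual exceedance probability: {p_loss_gt_5bn:.2f}")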

  1. Significant life experience: Exploring the lifelong influence of place-based environmental and science education on program participants

    NASA Astrophysics Data System (ADS)

    Colvin, Corrie Ruth

    Current research provides a limited understanding of the lifelong influence of nonformal place-based environmental and science education programs on past participants. This study looks to address this gap, exploring the ways in which these learning environments have contributed to environmental identity and stewardship. Using Dorothy Holland's approach to social practice theory's understanding of identity formation, this study employed narrative interviews and a close-ended survey to understand past participants' experience over time. Participants from two place-based environmental education programs and one science-inquiry program were asked to share their reflections on their program experience and the influence they attribute to that experience. Among all participants, the elements of hands-on learning, supportive instructors, and engaging learning environments remained salient over time. Participants of nature-based programs demonstrated that these programs in particular were formative in contributing to an environmental stewardship identity. Social practice theory can serve as a helpful theoretical framework for significant life experience research, which has largely been missing from this body of research. This study also holds implications for the fields of place-based environmental education, conservation psychology, and sustainability planning, all of which look to understand and increase environmentally sustainable practices.

  2. The Probabilities of Unique Events

    DTIC Science & Technology

    2012-08-30

    probabilities into quantum mechanics, and some psychologists have argued that they have a role to play in accounting for errors in judgment [30]. But, in...Discussion The mechanisms underlying naive estimates of the probabilities of unique events are largely inaccessible to consciousness, but they...Can quantum probability provide a new direction for cognitive modeling? Behavioral and Brain Sciences (in press). 31. Paolacci G, Chandler J

  3. Information Processing Using Quantum Probability

    NASA Astrophysics Data System (ADS)

    Behera, Laxmidhar

    2006-11-01

    This paper presents an information processing paradigm that introduces the collective response of multiple agents (computational units) while increasing manifold the level of intelligence associated with the information processing. It is shown that if the potential field of the Schroedinger wave equation is modulated using a self-organized learning scheme, then the probability density function associated with the stochastic data is transferred to the probability amplitude function, which is the response of the Schroedinger wave equation. This approach illustrates that information processing of data with stochastic behavior can be done efficiently using quantum probability instead of classical probability. The proposed scheme has been demonstrated through two applications: denoising and adaptive control.

  4. Identifying significant covariates for anti-HIV treatment response: mechanism-based differential equation models and empirical semiparametric regression models.

    PubMed

    Huang, Yangxin; Liang, Hua; Wu, Hulin

    2008-10-15

    In this paper, the mechanism-based ordinary differential equation (ODE) model and the flexible semiparametric regression model are employed to identify the significant covariates for antiretroviral response in AIDS clinical trials. We consider the treatment effect as a function of three factors (or covariates) including pharmacokinetics, drug adherence and susceptibility. Both clinical and simulated data examples are given to illustrate these two different kinds of modeling approaches. We found that the ODE model is more powerful to model the mechanism-based nonlinear relationship between treatment effects and virological response biomarkers. The ODE model is also better in identifying the significant factors for virological response, although it is slightly liberal and there is a trend to include more factors (or covariates) in the model. The semiparametric mixed-effects regression model is very flexible to fit the virological response data, but it is too liberal to identify correct factors for the virological response; sometimes it may miss the correct factors. The ODE model is also biologically justifiable and good for predictions and simulations for various biological scenarios. The limitations of the ODE models include the high cost of computation and the requirement of biological assumptions that sometimes may not be easy to validate. The methodologies reviewed in this paper are also generally applicable to studies of other viruses such as hepatitis B virus or hepatitis C virus.
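
    A sketch of a mechanism-based ODE of the general kind contrasted above: a standard target-cell/infected-cell/virus dynamic system with a time-varying drug efficacy term that could be driven by adherence. This is a textbook-style model, not the specific system of the paper, and all parameter values are illustrative.

        import numpy as np
        from scipy.integrate import solve_ivp

        # Standard viral-dynamics parameters (illustrative values only).
        lam, d_T, k, delta, p, c = 1e4, 0.01, 8e-7, 0.7, 100.0, 3.0

        def efficacy(t):
            """Time-varying drug efficacy: lower during an assumed non-adherent period."""
            return 0.3 if 20 <= t <= 30 else 0.85

        def hiv_ode(t, y):
            T, I, V = y                       # target cells, infected cells, free virus
            eps = efficacy(t)
            dT = lam - d_T * T - (1 - eps) * k * T * V
            dI = (1 - eps) * k * T * V - delta * I
            dV = p * I - c * V
            return [dT, dI, dV]

        sol = solve_ivp(hiv_ode, (0, 100), [1e6, 1e2, 1e3], max_step=0.1)
        print("viral load at day 100:", float(sol.y[2, -1]))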

  5. The relationship between species detection probability and local extinction probability

    USGS Publications Warehouse

    Alpizar-Jara, R.; Nichols, J.D.; Hines, J.E.; Sauer, J.R.; Pollock, K.H.; Rosenberry, C.S.

    2004-01-01

    In community-level ecological studies, generally not all species present in sampled areas are detected. Many authors have proposed the use of estimation methods that allow detection probabilities that are < 1 and that are heterogeneous among species. These methods can also be used to estimate community-dynamic parameters such as species local extinction probability and turnover rates (Nichols et al. Ecol Appl 8:1213-1225; Conserv Biol 12:1390-1398). Here, we present an ad hoc approach to estimating community-level vital rates in the presence of joint heterogeneity of detection probabilities and vital rates. The method consists of partitioning the number of species into two groups using the detection frequencies and then estimating vital rates (e.g., local extinction probabilities) for each group. Estimators from each group are combined in a weighted estimator of vital rates that accounts for the effect of heterogeneity. Using data from the North American Breeding Bird Survey, we computed such estimates and tested the hypothesis that detection probabilities and local extinction probabilities were negatively related. Our analyses support the hypothesis that species detection probability covaries negatively with local probability of extinction and turnover rates. A simulation study was conducted to assess the performance of vital parameter estimators as well as other estimators relevant to questions about heterogeneity, such as coefficient of variation of detection probabilities and proportion of species in each group. Both the weighted estimator suggested in this paper and the original unweighted estimator for local extinction probability performed fairly well and provided no basis for preferring one to the other.
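
    A hedged sketch of the partition-and-reweight idea described above: species are split by detection frequency, a naive extinction estimate is formed within each group, and the group estimates are combined with weights given by the proportion of species in each group. The grouping threshold, the per-group estimator, and the synthetic data are placeholders, not the estimator actually used in the paper.

```python
# Sketch of an ad hoc weighted estimator of local extinction probability:
# species are split into low- and high-detection groups, extinction is
# estimated within each group, and the group estimates are combined,
# weighted by the proportion of species in each group.
# Grouping threshold and per-group estimates are illustrative only.
import numpy as np

rng = np.random.default_rng(1)
detections_t1 = rng.integers(0, 6, size=200)   # detection frequencies, period 1
detected_t2 = rng.random(200) < 0.7            # detected again in period 2?

low = detections_t1 <= 1                       # placeholder grouping rule
groups = {"low": low, "high": ~low}

weighted_ext = 0.0
for name, mask in groups.items():
    n = mask.sum()
    ext_hat = 1.0 - detected_t2[mask].mean()   # naive group extinction estimate
    weight = n / len(detections_t1)            # proportion of species in group
    weighted_ext += weight * ext_hat
    print(f"{name}: n={n}, extinction estimate={ext_hat:.3f}")

print("weighted local extinction estimate:", round(weighted_ext, 3))
```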

  7. Determination of ecological significance based on geostatistical assessment: a case study from the Slovak Natura 2000 protected area

    NASA Astrophysics Data System (ADS)

    Klaučo, Michal; Gregorová, Bohuslava; Stankov, Uglješa; Marković, Vladimir; Lemenkova, Polina

    2013-03-01

    The Sitno Natura 2000 Site covers an area of 935.56 hectares. The Sitno region is significant due to the number of rare and endangered species of plants, and as a result is considered a location of great importance to the maintenance of floral gene pools. The study area suffers human impacts in the form of tourism. The main purpose of this study is to measure landscape elements, determine the ecological significance of habitats within the Sitno area, and, from these data, organize the study area into conservation zones. The results of this landscape quantification are numerical values that can be used to interpret the quality of ongoing ecological processes within individual landscape types. Interpretation of these quantified data can be used to determine the ecological significance of landscapes in other study areas. This research examines the habitats of Natura 2000 Sites with a set of landscape metrics for habitat area, size, density, and shape: number of patches (NP), patch density (PD), mean patch size (MPS), patch size standard deviation (PSSD), and mean shape index (MSI). The classification of land cover patches is based on the Annex Code system.

  8. An evidence-based approach to establish the functional and clinical significance of CNVs in intellectual and developmental disabilities

    PubMed Central

    Kaminsky, Erin B.; Kaul, Vineith; Paschall, Justin; Church, Deanna M.; Bunke, Brian; Kunig, Dawn; Moreno-De-Luca, Daniel; Moreno-De-Luca, Andres; Mulle, Jennifer G.; Warren, Stephen T.; Richard, Gabriele; Compton, John G.; Fuller, Amy E.; Gliem, Troy J.; Huang, Shuwen; Collinson, Morag N.; Beal, Sarah J.; Ackley, Todd; Pickering, Diane L.; Golden, Denae M.; Aston, Emily; Whitby, Heidi; Shetty, Shashirekha; Rossi, Michael R.; Rudd, M. Katharine; South, Sarah T.; Brothman, Arthur R.; Sanger, Warren G.; Iyer, Ramaswamy K.; Crolla, John A.; Thorland, Erik C.; Aradhya, Swaroop; Ledbetter, David H.; Martin, Christa L.

    2013-01-01

    Purpose: Copy number variants (CNVs) have emerged as a major cause of human disease such as autism and intellectual disabilities. Because CNVs are common in normal individuals, determining the functional and clinical significance of rare CNVs in patients remains challenging. The adoption of whole-genome chromosomal microarray analysis (CMA) as a first-tier diagnostic test for individuals with unexplained developmental disabilities provides a unique opportunity to obtain large CNV datasets generated through routine patient care. Methods: A consortium of diagnostic laboratories was established [the International Standards for Cytogenomic Arrays (ISCA) consortium] to share CNV and phenotypic data in a central, public database. We present the largest CNV case-control study to date, comprising 15,749 ISCA cases and 10,118 published controls, focusing our initial analysis on recurrent deletions and duplications involving 14 CNV regions. Results: Compared to controls, fourteen deletions and seven duplications were significantly overrepresented in cases, supporting a clinical classification of these CNVs as pathogenic. Conclusion: Given the rapid expansion of clinical CMA testing, very large datasets will be available to determine the functional significance of increasingly rare CNVs. These data will provide an evidence-based guide to clinicians across many disciplines involved in the diagnosis, management, and care of these patients and their families. PMID:21844811

  9. Comparisons of estimates of annual exceedance-probability discharges for small drainage basins in Iowa, based on data through water year 2013

    USGS Publications Warehouse

    Eash, David A.

    2015-01-01

    An examination was conducted to understand why the 1987 single-variable RREs seem to provide better accuracy and less bias than either of the 2013 multi- or single-variable RREs. A comparison of 1-percent annual exceedance-probability regression lines for hydrologic regions 1-4 from the 1987 single-variable RREs and for flood regions 1-3 from the 2013 single-variable RREs indicates that the 1987 single-variable regional-regression lines generally have steeper slopes and lower discharges when compared to 2013 single-variable regional-regression lines for corresponding areas of Iowa. The combination of the definition of hydrologic regions, the lower discharges, and the steeper slopes of regression lines associated with the 1987 single-variable RREs seems to provide better accuracy and less bias when compared to the 2013 multi- or single-variable RREs; better accuracy and less bias were determined particularly for drainage areas less than 2 mi², and also for some drainage areas between 2 and 20 mi². The 2013 multi- and single-variable RREs are considered to provide better accuracy and less bias for larger drainage areas. Results of this study indicate that additional research is needed to address the curvilinear relation between drainage area and annual exceedance-probability discharges (AEPDs) for areas of Iowa.

  10. Dynamic mean field theory for lattice gas models of fluids confined in porous materials: higher order theory based on the Bethe-Peierls and path probability method approximations.

    PubMed

    Edison, John R; Monson, Peter A

    2014-07-14

    Recently we have developed a dynamic mean field theory (DMFT) for lattice gas models of fluids in porous materials [P. A. Monson, J. Chem. Phys. 128(8), 084701 (2008)]. The theory can be used to describe the relaxation processes in the approach to equilibrium or metastable states for fluids in pores and is especially useful for studying system exhibiting adsorption/desorption hysteresis. In this paper we discuss the extension of the theory to higher order by means of the path probability method (PPM) of Kikuchi and co-workers. We show that this leads to a treatment of the dynamics that is consistent with thermodynamics coming from the Bethe-Peierls or Quasi-Chemical approximation for the equilibrium or metastable equilibrium states of the lattice model. We compare the results from the PPM with those from DMFT and from dynamic Monte Carlo simulations. We find that the predictions from PPM are qualitatively similar to those from DMFT but give somewhat improved quantitative accuracy, in part due to the superior treatment of the underlying thermodynamics. This comes at the cost of greater computational expense associated with the larger number of equations that must be solved.

  11. Dynamic mean field theory for lattice gas models of fluids confined in porous materials: Higher order theory based on the Bethe-Peierls and path probability method approximations

    SciTech Connect

    Edison, John R.; Monson, Peter A.

    2014-07-14

    Recently we have developed a dynamic mean field theory (DMFT) for lattice gas models of fluids in porous materials [P. A. Monson, J. Chem. Phys. 128(8), 084701 (2008)]. The theory can be used to describe the relaxation processes in the approach to equilibrium or metastable states for fluids in pores and is especially useful for studying system exhibiting adsorption/desorption hysteresis. In this paper we discuss the extension of the theory to higher order by means of the path probability method (PPM) of Kikuchi and co-workers. We show that this leads to a treatment of the dynamics that is consistent with thermodynamics coming from the Bethe-Peierls or Quasi-Chemical approximation for the equilibrium or metastable equilibrium states of the lattice model. We compare the results from the PPM with those from DMFT and from dynamic Monte Carlo simulations. We find that the predictions from PPM are qualitatively similar to those from DMFT but give somewhat improved quantitative accuracy, in part due to the superior treatment of the underlying thermodynamics. This comes at the cost of greater computational expense associated with the larger number of equations that must be solved.

  12. A clip-based protocol for breast boost radiotherapy provides clear target visualisation and demonstrates significant volume reduction over time

    SciTech Connect

    Lewis, Lorraine; Cox, Jennifer; Morgia, Marita; Atyeo, John; Lamoury, Gillian

    2015-09-15

    The clinical target volume (CTV) for early stage breast cancer is difficult to clearly identify on planning computed tomography (CT) scans. Surgical clips inserted around the tumour bed should help to identify the CTV, particularly if the seroma has been reabsorbed, and enable tracking of CTV changes over time. A surgical clip-based CTV delineation protocol was introduced. CTV visibility and its post-operative shrinkage pattern were assessed. The subjects were 27 early stage breast cancer patients receiving post-operative radiotherapy alone and 15 receiving post-operative chemotherapy followed by radiotherapy. The radiotherapy alone (RT/alone) group received a CT scan at median 25 days post-operatively (CT1rt) and another at 40 Gy, median 68 days (CT2rt). The chemotherapy/RT group (chemo/RT) received a CT scan at median 18 days post-operatively (CT1ch), a planning CT scan at median 126 days (CT2ch), and another at 40 Gy (CT3ch). There was no significant difference (P = 0.08) between the initial mean CTV for each cohort. The RT/alone cohort showed significant CTV volume reduction of 38.4% (P = 0.01) at 40 Gy. The chemo/RT cohort had significantly reduced volumes between CT1ch: median 54 cm³ (4-118) and CT2ch: median 16 cm³ (2-99) (P = 0.01), but no significant volume reduction thereafter. Surgical clips enable localisation of the post-surgical seroma for radiotherapy targeting. Most seroma shrinkage occurs early, enabling CT treatment planning to take place at 7 weeks, which is within the 9 weeks recommended to limit disease recurrence.

  13. Site occupancy models with heterogeneous detection probabilities

    USGS Publications Warehouse

    Royle, J. Andrew

    2006-01-01

    Models for estimating the probability of occurrence of a species in the presence of imperfect detection are important in many ecological disciplines. In these 'site occupancy' models, the possibility of heterogeneity in detection probabilities among sites must be considered because variation in abundance (and other factors) among sampled sites induces variation in detection probability (p). In this article, I develop occurrence probability models that allow for heterogeneous detection probabilities by considering several common classes of mixture distributions for p. For any mixing distribution, the likelihood has the general form of a zero-inflated binomial mixture for which inference based upon integrated likelihood is straightforward. A recent paper by Link (2003, Biometrics 59, 1123-1130) demonstrates that in closed population models used for estimating population size, different classes of mixture distributions are indistinguishable from data, yet can produce very different inferences about population size. I demonstrate that this problem can also arise in models for estimating site occupancy in the presence of heterogeneous detection probabilities. The implications of this are discussed in the context of an application to avian survey data and the development of animal monitoring programs.
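
    A minimal numerical sketch of the integrated-likelihood idea: occupancy probability psi with a zero-inflated binomial for the detection history at each site, where the detection probability p is integrated over a Beta mixing distribution (one plausible member of the common classes of mixture distributions mentioned above). Data and parameter values are invented for illustration.

```python
# Sketch: zero-inflated binomial mixture likelihood for site occupancy
# with heterogeneous detection. p is integrated over a Beta(a, b) mixing
# distribution; psi is the occupancy probability. Parameter values and
# data are illustrative.
import numpy as np
from scipy import integrate, stats

def site_likelihood(y, J, psi, a, b):
    # y detections out of J visits at one site.
    def integrand(p):
        return stats.binom.pmf(y, J, p) * stats.beta.pdf(p, a, b)
    marginal, _ = integrate.quad(integrand, 0.0, 1.0)
    if y == 0:
        return (1.0 - psi) + psi * marginal   # zero-inflation term
    return psi * marginal

counts = [0, 0, 1, 3, 0, 2]                   # toy detection histories, J = 5 visits
loglik = sum(np.log(site_likelihood(y, 5, psi=0.6, a=2.0, b=5.0)) for y in counts)
print("log-likelihood:", round(loglik, 4))
```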

  14. Improvement of the quantitation method for the tdh+ Vibrio parahaemolyticus in molluscan shellfish based on most-probable-number, immunomagnetic separation, and loop-mediated isothermal amplification

    PubMed Central

    Escalante-Maldonado, Oscar; Kayali, Ahmad Y.; Yamazaki, Wataru; Vuddhakul, Varaporn; Nakaguchi, Yoshitsugu; Nishibuchi, Mitsuaki

    2015-01-01

    Vibrio parahaemolyticus is a marine microorganism that can cause seafood-borne gastroenteritis in humans. The infection can be spread and has become a pandemic through the international trade of contaminated seafood. Strains carrying the tdh gene encoding the thermostable direct hemolysin (TDH) and/or the trh gene encoding the TDH-related hemolysin (TRH) are considered to be pathogenic with the former gene being the most frequently found in clinical strains. However, their distribution frequency in environmental isolates is below 1%. Thus, very sensitive methods are required for detection and quantitation of tdh+ strains in seafood. We previously reported a method to detect and quantify tdh+ V. parahaemolyticus in seafood. This method consists of three components: the most-probable-number (MPN), the immunomagnetic separation (IMS) targeting all established K antigens, and the loop-mediated isothermal amplification (LAMP) targeting the tdh gene. However, this method faces regional issues in tropical zones of the world. Technicians have difficulties in securing dependable reagents in high-temperature climates where we found MPN underestimation in samples having tdh+ strains as well as other microorganisms present at high concentrations. In the present study, we solved the underestimation problem associated with the salt polymyxin broth enrichment for the MPN component and with the immunomagnetic bead-target association for the IMS component. We also improved the supply and maintenance of the dependable reagents by introducing a dried reagent system to the LAMP component. The modified method is specific, sensitive, quick and easy and applicable regardless of the concentrations of tdh+ V. parahaemolyticus. Therefore, we conclude this modified method is useful in world tropical, sub-tropical, and temperate zones. PMID:25914681
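
    The sketch below illustrates only the most-probable-number (MPN) component of the combined MPN-IMS-LAMP method: a standard maximum-likelihood MPN estimate from a three-dilution tube series. The dilution volumes and tube outcomes are invented for illustration.

```python
# Sketch of the most-probable-number (MPN) calculation only (one component
# of the combined MPN-IMS-LAMP method). Standard ML estimate: find the
# concentration lambda maximizing the product of Poisson-derived
# positive/negative tube probabilities. Dilution volumes and tube counts
# are illustrative.
import numpy as np
from scipy.optimize import minimize_scalar

volumes = np.array([0.1, 0.01, 0.001])   # mL of sample per tube at each dilution
tubes = np.array([3, 3, 3])              # tubes per dilution
positives = np.array([3, 1, 0])          # observed positive tubes

def neg_loglik(log_lam):
    lam = np.exp(log_lam)                # organisms per mL
    p_pos = 1.0 - np.exp(-lam * volumes)
    p_pos = np.clip(p_pos, 1e-12, 1 - 1e-12)
    ll = positives * np.log(p_pos) + (tubes - positives) * np.log(1 - p_pos)
    return -ll.sum()

res = minimize_scalar(neg_loglik, bounds=(-5, 10), method="bounded")
print("MPN estimate (per mL):", round(float(np.exp(res.x)), 2))
```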

  15. Improvement of the quantitation method for the tdh (+) Vibrio parahaemolyticus in molluscan shellfish based on most-probable-number, immunomagnetic separation, and loop-mediated isothermal amplification.

    PubMed

    Escalante-Maldonado, Oscar; Kayali, Ahmad Y; Yamazaki, Wataru; Vuddhakul, Varaporn; Nakaguchi, Yoshitsugu; Nishibuchi, Mitsuaki

    2015-01-01

    Vibrio parahaemolyticus is a marine microorganism that can cause seafood-borne gastroenteritis in humans. The infection can be spread and has become a pandemic through the international trade of contaminated seafood. Strains carrying the tdh gene encoding the thermostable direct hemolysin (TDH) and/or the trh gene encoding the TDH-related hemolysin (TRH) are considered to be pathogenic with the former gene being the most frequently found in clinical strains. However, their distribution frequency in environmental isolates is below 1%. Thus, very sensitive methods are required for detection and quantitation of tdh (+) strains in seafood. We previously reported a method to detect and quantify tdh (+) V. parahaemolyticus in seafood. This method consists of three components: the most-probable-number (MPN), the immunomagnetic separation (IMS) targeting all established K antigens, and the loop-mediated isothermal amplification (LAMP) targeting the tdh gene. However, this method faces regional issues in tropical zones of the world. Technicians have difficulties in securing dependable reagents in high-temperature climates where we found MPN underestimation in samples having tdh (+) strains as well as other microorganisms present at high concentrations. In the present study, we solved the underestimation problem associated with the salt polymyxin broth enrichment for the MPN component and with the immunomagnetic bead-target association for the IMS component. We also improved the supply and maintenance of the dependable reagents by introducing a dried reagent system to the LAMP component. The modified method is specific, sensitive, quick and easy and applicable regardless of the concentrations of tdh (+) V. parahaemolyticus. Therefore, we conclude this modified method is useful in world tropical, sub-tropical, and temperate zones.

  16. A short note on probability in clinical medicine.

    PubMed

    Upshur, Ross E G

    2013-06-01

    Probability claims are ubiquitous in clinical medicine, yet exactly how clinical events relate to interpretations of probability has not been well explored. This brief essay examines the major interpretations of probability and how these interpretations may account for the probabilistic nature of clinical events. It is argued that there are significant problems with the unquestioned application of interpretations of probability to clinical events. The essay concludes by suggesting other avenues to understand uncertainty in clinical medicine.

  17. Increasing the probability of long-range (1 month) SPI index forecasts based on SL-AV model for the Russian territory.

    NASA Astrophysics Data System (ADS)

    Utkuzova, Dilyara; Khan, Valentina; Donner, Reik

    2016-04-01

    Long-range precipitation predictions can be produced with a numerical weather prediction model, but the skill of the raw model output is often limited, so post-processing methods are typically used to produce the long-range precipitation forecast. For this purpose the SPI index was used. The SPI index was first evaluated using statistical techniques: different parameters of the SPI frequency distribution and long-term tendencies were calculated, as well as spatial characteristics indicating drought and wetness propagation. Results of the analysis demonstrate that in recent years there has been a tendency toward increasing intensity of drought and wetness extremes over Russia. There are fewer droughts in the northern regions. The drought propagation for the European territory of Russia is decreasing in June and August, and increasing in July. The situation is opposite for the wetness tendencies. For the Asian territory of Russia, the drought propagation is significantly increasing in July along with a decreasing wetness trend. Synoptic analysis was then conducted to describe wet and drought events. Synoptic conditions favorable for the formation of wet and drought extremes were identified by comparing synoptic charts with the spatial patterns of SPI. For the synoptic analysis, episodes of extremely wet (6 episodes for the APR and 7 episodes for the EPR) and drought (6 episodes for the APR and 6 for the EPR) events were classified using A. Katz' typology of weather regimes. For the European part of Russia, extreme drought events are linked to the weather type named "MIXED"; for the Asian part of Russia, to the type "CENTRAL". For the European part of Russia, extreme wet events are associated with the "CENTRAL" type. There is a displacement of the planetary frontal zone southward by approximately 5-25 degrees relative to its normal climatological position during wet extreme events linked to the "EASTERN" classification type. The SPI field (data was
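
    For readers unfamiliar with the SPI, the sketch below shows the standard construction of the index (fit a gamma distribution to precipitation totals for a given calendar period, then map the fitted cumulative probabilities to standard-normal quantiles) on synthetic data; it does not reproduce the SL-AV post-processing itself, and it omits the usual special handling of zero-precipitation cases.

```python
# Sketch of the standard SPI construction used as a post-processing index:
# fit a gamma distribution to monthly precipitation totals and map the
# fitted cumulative probabilities to standard-normal quantiles.
# Synthetic data; the SL-AV model output itself is not reproduced here.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
precip = rng.gamma(shape=2.0, scale=30.0, size=40)      # 40 years of July totals, mm

shape, loc, scale = stats.gamma.fit(precip, floc=0)     # fit with location fixed at 0
cdf = stats.gamma.cdf(precip, shape, loc=loc, scale=scale)
spi = stats.norm.ppf(cdf)                               # SPI value for each year

print("driest year SPI:", round(spi.min(), 2), "wettest year SPI:", round(spi.max(), 2))
```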

  18. Dimension Reduction via Unsupervised Learning Yields Significant Computational Improvements for Support Vector Machine Based Protein Family Classification.

    SciTech Connect

    Webb-Robertson, Bobbie-Jo M.; Matzke, Melissa M.; Oehmen, Christopher S.

    2009-02-26

    Reducing the dimension of vectors used in training support vector machines (SVMs) results in a proportional speedup in training time. For large-scale problems this can make the difference between tractable and intractable training tasks. However, it is critical that classifiers trained on reduced datasets perform as reliably as their counterparts trained on high-dimensional data. We assessed principal component analysis (PCA) and sequential projection pursuit (SPP) as dimension reduction strategies in the biological application of classifying proteins into well-defined functional 'families' (SVM-based protein family classification) by their impact on run-time, sensitivity and selectivity. Homology vectors of 4352 elements were reduced to approximately 2% of the original data size without significantly affecting accuracy using PCA and SPP, while leading to an approximately 28-fold speedup in run-time.
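
    A generic sketch of the PCA-before-SVM strategy evaluated above, using scikit-learn on synthetic data; the sequential projection pursuit variant and the 4352-element homology vectors from the study are not reproduced here.

```python
# Sketch of PCA-based dimension reduction ahead of SVM training.
# Synthetic data stands in for the homology vectors; sequential
# projection pursuit (SPP) is not shown.
from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

X, y = make_classification(n_samples=500, n_features=1000, n_informative=30,
                           random_state=0)

full = SVC(kernel="linear")
reduced = make_pipeline(PCA(n_components=20), SVC(kernel="linear"))

print("full-dimension accuracy:", cross_val_score(full, X, y, cv=5).mean().round(3))
print("PCA-reduced accuracy:   ", cross_val_score(reduced, X, y, cv=5).mean().round(3))
```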

  19. [Significance of clinical laboratory accreditation based on ISO 15189, and recent trend of international clinical laboratory accreditation program].

    PubMed

    Kawai, Tadashi

    2014-06-01

    ISO 15189 was first published in 2003, its second edition in 2007, and its third edition in 2012 by the ISO. Since 2003, through the approval of ILAC, ISO 15189 has been used for the accreditation of clinical/medical laboratories throughout the world, and approximately 5,000 clinical laboratories have now been accredited. In Japan, the JAB, in cooperation with the JCCLS, introduced the clinical laboratory accreditation program based on ISO 15189 in 2005, and 70 labs had been accredited by January 2014. It has been purely voluntary, without any governmental or regulatory involvement so far. However, it has been gradually accepted to be significant for maintaining quality management and for the improvement of clinical laboratory efficiency. The program will expand widely throughout the world in order to accomplish "one-stop testing" among clinical laboratories, at least regarding frequently-used routine laboratory tests.

  20. Pig Data and Bayesian Inference on Multinomial Probabilities

    ERIC Educational Resources Information Center

    Kern, John C.

    2006-01-01

    Bayesian inference on multinomial probabilities is conducted based on data collected from the game Pass the Pigs[R]. Prior information on these probabilities is readily available from the instruction manual, and is easily incorporated in a Dirichlet prior. Posterior analysis of the scoring probabilities quantifies the discrepancy between empirical…
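
    A hedged sketch of the conjugate Dirichlet-multinomial update underlying this kind of analysis; the prior pseudo-counts and observed rolls below are invented for illustration and are not the article's Pass the Pigs data.

```python
# Sketch of conjugate Bayesian updating for multinomial scoring probabilities:
# Dirichlet prior + observed category counts -> Dirichlet posterior.
# Prior pseudo-counts and observed rolls are illustrative, not the
# article's Pass the Pigs data.
import numpy as np

categories = ["side", "razorback", "trotter", "snouter", "leaning jowler"]
prior = np.array([60.0, 22.0, 9.0, 3.0, 1.0])     # hypothetical Dirichlet prior counts
observed = np.array([130, 41, 20, 6, 3])          # hypothetical observed rolls

posterior = prior + observed
post_mean = posterior / posterior.sum()

for c, m in zip(categories, post_mean):
    print(f"{c:15s} posterior mean probability: {m:.3f}")
```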

  1. Disease Associations With Monoclonal Gammopathy of Undetermined Significance: A Population-Based Study of 17,398 Patients

    PubMed Central

    Bida, John P.; Kyle, Robert A.; Therneau, Terry M.; Melton, L. Joseph; Plevak, Matthew F.; Larson, Dirk R.; Dispenzieri, Angela; Katzmann, Jerry A.; Rajkumar, S. Vincent

    2009-01-01

    OBJECTIVE: To systematically study the association of monoclonal gammopathy of undetermined significance (MGUS) with all diseases in a population-based cohort of 17,398 patients, all of whom were uniformly tested for the presence or absence of MGUS. PATIENTS AND METHODS: Serum samples were obtained from 77% (21,463) of the 28,038 enumerated residents in Olmsted County, Minnesota. Informed consent was obtained from patients to study 17,398 samples. Among 17,398 samples tested, 605 cases of MGUS and 16,793 negative controls were identified. The computerized Mayo Medical Index was used to obtain information on all diagnoses entered between January 1, 1975, and May 31, 2006, for a total of 422,663 person-years of observations. To identify and confirm previously reported associations, these diagnostic codes were analyzed using stratified Poisson regression, adjusting for age, sex, and total person-years of observation. RESULTS: We confirmed a significant association in 14 (19%) of 75 previously reported disease associations with MGUS, including vertebral and hip fractures and osteoporosis. Systematic analysis of all 16,062 diagnostic disease codes found additional previously unreported associations, including mycobacterium infection and superficial thrombophlebitis. CONCLUSION: These results have major implications both for confirmed associations and for 61 diseases in which the association with MGUS is likely coincidental. PMID:19648385

  2. Weighted Feature Significance: A Simple, Interpretable Model of Compound Toxicity Based on the Statistical Enrichment of Structural Features

    PubMed Central

    Huang, Ruili; Southall, Noel; Xia, Menghang; Cho, Ming-Hsuang; Jadhav, Ajit; Nguyen, Dac-Trung; Inglese, James; Tice, Raymond R.; Austin, Christopher P.

    2009-01-01

    In support of the U.S. Tox21 program, we have developed a simple and chemically intuitive model we call weighted feature significance (WFS) to predict the toxicological activity of compounds, based on the statistical enrichment of structural features in toxic compounds. We trained and tested the model on the following: (1) data from quantitative high–throughput screening cytotoxicity and caspase activation assays conducted at the National Institutes of Health Chemical Genomics Center, (2) data from Salmonella typhimurium reverse mutagenicity assays conducted by the U.S. National Toxicology Program, and (3) hepatotoxicity data published in the Registry of Toxic Effects of Chemical Substances. Enrichments of structural features in toxic compounds are evaluated for their statistical significance and compiled into a simple additive model of toxicity and then used to score new compounds for potential toxicity. The predictive power of the model for cytotoxicity was validated using an independent set of compounds from the U.S. Environmental Protection Agency tested also at the National Institutes of Health Chemical Genomics Center. We compared the performance of our WFS approach with classical classification methods such as Naive Bayesian clustering and support vector machines. In most test cases, WFS showed similar or slightly better predictive power, especially in the prediction of hepatotoxic compounds, where WFS appeared to have the best performance among the three methods. The new algorithm has the important advantages of simplicity, power, interpretability, and ease of implementation. PMID:19805409
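
    The sketch below illustrates the general idea of an additive, enrichment-based score in the spirit of weighted feature significance: each structural feature receives a weight from its statistically significant enrichment among toxic training compounds (here a Fisher exact test and a log odds-ratio weight, both stand-ins), and a new compound is scored by summing the weights of the features it contains. The exact WFS weighting and significance rules are not reproduced.

```python
# Toy sketch of an additive, enrichment-based toxicity score: weight each
# structural feature by its (significant) enrichment among toxic training
# compounds, then score a new compound by summing the weights of the
# features it contains. Counts and the weighting rule are illustrative.
import math
from scipy.stats import fisher_exact

n_toxic, n_safe = 100, 400
# toy training data: feature -> (count in toxic set, count in non-toxic set)
feature_counts = {"nitroso": (30, 10), "epoxide": (25, 20), "phenol": (40, 160)}

weights = {}
for feat, (tox, safe) in feature_counts.items():
    table = [[tox, n_toxic - tox], [safe, n_safe - safe]]
    odds, pval = fisher_exact(table)
    if pval < 0.05 and odds > 1:               # keep significantly enriched features
        weights[feat] = math.log(odds)
    print(f"{feat}: odds ratio={odds:.2f}, p={pval:.3g}")

def score(compound_features):
    return sum(weights.get(f, 0.0) for f in compound_features)

print("score for a compound with nitroso + phenol:", round(score({"nitroso", "phenol"}), 3))
```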

  3. Significance of atypia in conventional Papanicolaou smears and liquid-based cytology: a follow-up study.

    PubMed

    Schledermann, D; Ejersbo, D; Hoelund, B

    2004-06-01

    The diagnosis of atypical squamous epithelial cells, borderline nuclear changes, is associated with some controversy, as it encompasses benign, reactive, as well as possible neoplastic conditions. The aim of this study was to evaluate the follow-up diagnoses of cytological atypia in conventional Papanicolaou smears (CP) and liquid-based samples by the ThinPrep Pap Test (TP). A total of 1607 CP smears from 1 January 2000 to 31 December 2000 and 798 TP samples from 1 January 2002 to 31 December 2002 diagnosed as atypia were included. The results show that the detection rate of atypia in cervical cytological samples was reduced by 41.3% (P < 0.001) in TP compared with CP. Cytological and histological follow-up data showed the presence of neoplastic lesions in 34.7% of patients screened by TP versus 22.3% of patients screened by CP, corresponding to a 55.6% increase in TP (P < 0.001). Follow-up diagnosis of mild dysplasia was seen more than twice as often in TP than in CP (12.8% versus 5.0%, P < 0.001). The prevalence of moderate and severe dysplasia was significantly increased with 26.7% in TP compared with CP (21.9% versus 17.2%, P < 0.01). In conclusion, the ThinPrep Pap Test yielded a significant decrease in atypia rates compared with the conventional Papanicolaou test. In subsequent follow-up the percentage of neoplastic lesions was significantly increased in the ThinPrep Pap Test samples.

  4. Parametric probability distributions for anomalous change detection

    SciTech Connect

    Theiler, James P; Foy, Bernard R; Wohlberg, Brendt E; Scovel, James C

    2010-01-01

    The problem of anomalous change detection arises when two (or possibly more) images are taken of the same scene, but at different times. The aim is to discount the 'pervasive differences' that occur throughout the imagery, due to the inevitably different conditions under which the images were taken (caused, for instance, by differences in illumination, atmospheric conditions, sensor calibration, or misregistration), and to focus instead on the 'anomalous changes' that actually take place in the scene. In general, anomalous change detection algorithms attempt to model these normal or pervasive differences, based on data taken directly from the imagery, and then identify as anomalous those pixels for which the model does not hold. For many algorithms, these models are expressed in terms of probability distributions, and there is a class of such algorithms that assume the distributions are Gaussian. By considering a broader class of distributions, however, a new class of anomalous change detection algorithms can be developed. We consider several parametric families of such distributions, derive the associated change detection algorithms, and compare the performance with standard algorithms that are based on Gaussian distributions. We find that it is often possible to significantly outperform these standard algorithms, even using relatively simple non-Gaussian models.

  5. Probability detection mechanisms and motor learning.

    PubMed

    Lungu, O V; Wächter, T; Liu, T; Willingham, D T; Ashe, J

    2004-11-01

    The automatic detection of patterns or regularities in the environment is central to certain forms of motor learning, which are largely procedural and implicit. The rules underlying the detection and use of probabilistic information in the perceptual-motor domain are largely unknown. We conducted two experiments involving a motor learning task with direct and crossed mapping of motor responses in which probabilities were present at the stimulus set level, the response set level, and at the level of stimulus-response (S-R) mapping. We manipulated only one level at a time, while controlling for the other two. The results show that probabilities were detected only when present at the S-R mapping and motor levels, but not at the perceptual one (experiment 1), unless the perceptual features have a dimensional overlap with the S-R mapping rule (experiment 2). The effects of probability detection were mostly facilitatory at the S-R mapping, both facilitatory and inhibitory at the perceptual level, and predominantly inhibitory at the response-set level. The facilitatory effects were based on learning the absolute frequencies first and transitional probabilities later (for the S-R mapping rule) or both types of information at the same time (for perceptual level), whereas the inhibitory effects were based on learning first the transitional probabilities. Our data suggest that both absolute frequencies and transitional probabilities are used in motor learning, but in different temporal orders, according to the probabilistic properties of the environment. The results support the idea that separate neural circuits may be involved in detecting absolute frequencies as compared to transitional probabilities.
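
    A short sketch of the two kinds of probabilistic information contrasted above, computed from a toy event sequence: absolute frequencies of each event and first-order transitional probabilities between consecutive events.

```python
# Sketch: the two kinds of probabilistic information contrasted above --
# absolute frequencies of each event and first-order transitional
# probabilities between consecutive events -- for a toy event sequence.
from collections import Counter

import numpy as np

rng = np.random.default_rng(3)
events = rng.choice(["A", "B", "C"], size=500, p=[0.5, 0.3, 0.2]).tolist()

abs_freq = {k: v / len(events) for k, v in Counter(events).items()}

pair_counts = Counter(zip(events[:-1], events[1:]))
first_counts = Counter(events[:-1])
trans_prob = {pair: c / first_counts[pair[0]] for pair, c in pair_counts.items()}

print("absolute frequencies:", {k: round(v, 2) for k, v in abs_freq.items()})
print("P(B | A):", round(trans_prob.get(("A", "B"), 0.0), 2))
```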

  6. Definition of the Neutrosophic Probability

    NASA Astrophysics Data System (ADS)

    Smarandache, Florentin

    2014-03-01

    Neutrosophic probability (or likelihood) [1995] is a particular case of the neutrosophic measure. It is an estimation that an event (different from indeterminacy) occurs, together with an estimation that some indeterminacy may occur, and an estimation that the event does not occur. The classical probability deals with fair dice, coins, roulettes, spinners, decks of cards, and random walks, while neutrosophic probability deals with unfair, imperfect such objects and processes. For example, if we toss a regular die on an irregular surface which has cracks, then it is possible for the die to get stuck on one of its edges or vertices in a crack (indeterminate outcome). The sample space is in this case {1, 2, 3, 4, 5, 6, indeterminacy}, so the probability of getting, for example, a 1 is less than 1/6, since there are seven outcomes. The neutrosophic probability is a generalization of the classical probability because, when the chance of indeterminacy of a stochastic process is zero, these two probabilities coincide. The neutrosophic probability that an event A occurs is NP(A) = (ch(A), ch(indetA), ch(notA)) = (T, I, F), where T, I, F are subsets of [0,1]: T is the chance that A occurs, ch(A); I is the indeterminate chance related to A, ch(indetA); and F is the chance that A does not occur, ch(notA). So NP is a generalization of the imprecise probability as well. If T, I, and F are crisp numbers then ⁻0 ≤ T + I + F ≤ 3⁺. We used the same notations (T, I, F) as in neutrosophic logic and set.
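
    A tiny sketch of the (T, I, F) triple for the cracked-surface die example above; the chance of an indeterminate outcome (0.10) is an assumed illustration.

```python
# Sketch of the neutrosophic triple NP(A) = (T, I, F) for the cracked-surface
# die example: T = chance the face shows 1, I = chance of an indeterminate
# outcome (die stuck in a crack), F = chance of any other face.
# The indeterminacy chance of 0.10 is an assumed illustration.
chance_indeterminate = 0.10
T = (1 - chance_indeterminate) / 6          # event A: the die shows 1
I = chance_indeterminate
F = (1 - chance_indeterminate) * 5 / 6      # A does not occur

print(f"NP(roll a 1) = (T={T:.3f}, I={I:.3f}, F={F:.3f}), sum = {T + I + F:.3f}")
print("classical limit (no indeterminacy):", round(1 / 6, 3))
```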

  7. Failure probability under parameter uncertainty.

    PubMed

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications.
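
    A Monte Carlo sketch of the effect described above for a lognormal risk factor: the control threshold is set to the estimated 99th percentile from a small sample, and the realized exceedance frequency of new losses is compared with the nominal 1%. This illustrates the inflation of failure probability under parameter uncertainty; it is not the paper's exact analytical result.

```python
# Monte Carlo sketch of how parameter uncertainty inflates failure
# probability: the threshold is set to the 99th percentile of a lognormal
# fitted to a small sample, and the realized exceedance frequency of new
# losses is compared with the nominal 1%. Values are illustrative.
import numpy as np

rng = np.random.default_rng(7)
mu, sigma, nominal = 0.0, 1.0, 0.01
n_data, n_trials = 50, 20_000

failures = 0
for _ in range(n_trials):
    sample = rng.lognormal(mu, sigma, n_data)
    log_s = np.log(sample)
    mu_hat, sigma_hat = log_s.mean(), log_s.std(ddof=1)
    threshold = np.exp(mu_hat + sigma_hat * 2.3263)   # estimated 99th percentile
    failures += rng.lognormal(mu, sigma) > threshold  # one new loss vs threshold

print("nominal failure probability:", nominal)
print("realized failure frequency: ", round(failures / n_trials, 4))
```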

  8. Adaptive Collaborative Gaussian Mixture Probability Hypothesis Density Filter for Multi-Target Tracking.

    PubMed

    Yang, Feng; Wang, Yongqi; Chen, Hao; Zhang, Pengyan; Liang, Yan

    2016-10-11

    In this paper, an adaptive collaborative Gaussian Mixture Probability Hypothesis Density (ACo-GMPHD) filter is proposed for multi-target tracking with automatic track extraction. Based on the evolutionary difference between the persistent targets and the birth targets, the measurements are adaptively partitioned into two parts, persistent and birth measurement sets, for updating the persistent and birth target PHDs, respectively. Furthermore, a collaboration mechanism among multiple probability hypothesis densities (PHDs) is established, whereby tracks can be automatically extracted. Simulation results reveal that the proposed filter yields considerable computational savings in processing requirements and significant improvement in tracking accuracy.

  9. Holographic Probabilities in Eternal Inflation

    NASA Astrophysics Data System (ADS)

    Bousso, Raphael

    2006-11-01

    In the global description of eternal inflation, probabilities for vacua are notoriously ambiguous. The local point of view is preferred by holography and naturally picks out a simple probability measure. It is insensitive to large expansion factors or lifetimes and so resolves a recently noted paradox. Any cosmological measure must be complemented with the probability for observers to emerge in a given vacuum. In lieu of anthropic criteria, I propose to estimate this by the entropy that can be produced in a local patch. This allows for prior-free predictions.

  10. Minimal entropy probability paths between genome families.

    PubMed

    Ahlbrandt, Calvin; Benson, Gary; Casey, William

    2004-05-01

    We develop a metric for probability distributions with applications to biological sequence analysis. Our distance metric is obtained by minimizing a functional defined on the class of paths over probability measures on N categories. The underlying mathematical theory is connected to a constrained problem in the calculus of variations. The solution presented is a numerical solution, which approximates the true solution in a set of cases called rich paths, where none of the components of the path is zero. The functional to be minimized is motivated by entropy considerations, reflecting the idea that nature might efficiently carry out mutations of genome sequences in such a way that the increase in entropy involved in transformation is as small as possible. We characterize sequences by frequency profiles or probability vectors; in the case of DNA, N is 4 and the components of the probability vector are the frequencies of occurrence of each of the bases A, C, G and T. Given two probability vectors a and b, we define a distance function as the infimum of path integrals of the entropy function H(p) over all admissible paths p(t), 0 ≤ t ≤ 1, with p(t) a probability vector such that p(0) = a and p(1) = b. If the probability paths p(t) are parameterized as y(s) in terms of arc length s and the optimal path is smooth with arc length L, then smooth and "rich" optimal probability paths may be numerically estimated by a hybrid method: iterating Newton's method on solutions of a two-point boundary value problem, with unknown distance L between the abscissas, for the Euler-Lagrange equations resulting from a multiplier rule for the constrained optimization problem, together with linear regression to improve the arc length estimate L. Matlab code for these numerical methods is provided, which works only for "rich" optimal probability vectors. These methods motivate a definition of an elementary distance function which is easier and faster to calculate, works on non

  11. Automatic Item Generation of Probability Word Problems

    ERIC Educational Resources Information Center

    Holling, Heinz; Bertling, Jonas P.; Zeuch, Nina

    2009-01-01

    Mathematical word problems represent a common item format for assessing student competencies. Automatic item generation (AIG) is an effective way of constructing many items with predictable difficulties, based on a set of predefined task parameters. The current study presents a framework for the automatic generation of probability word problems…

  12. Tumor Budding in Colorectal Carcinoma: Confirmation of Prognostic Significance and Histologic Cutoff in a Population-based Cohort

    PubMed Central

    Graham, Rondell P.; Vierkant, Robert A.; Tillmans, Lori S.; Wang, Alice H.; Laird, Peter W; Weisenberger, Daniel J.; Lynch, Charles F.; French, Amy J.; Slager, Susan L.; Raissian, Yassaman; Garcia, Joaquin J.; Kerr, Sarah E.; Lee, Hee Eun; Thibodeau, Stephen N.; Cerhan, James R.; Limburg, Paul J.; Smyrk, Thomas C.

    2015-01-01

    Tumor budding in colorectal carcinoma has been associated with poor outcome in multiple studies, but the absence of an established histologic cutoff for “high” tumor budding, heterogeneity in study populations and varying methods for assessing tumor budding have hindered widespread incorporation of this parameter in clinical reports. We used an established scoring system in a population-based cohort to determine a histologic cutoff for “high” tumor budding and confirm its prognostic significance. We retrieved hematoxylin and eosin-stained sections from 553 incident colorectal carcinoma cases. Each case was previously characterized for select molecular alterations and survival data. Interobserver agreement was assessed between two GI pathologists and a group of four general surgical pathologists. High budding (≥10 tumor buds in a 20× objective field) was present in 32% of cases, low budding in 46% and no budding in 22%. High tumor budding was associated with advanced pathologic stage (p<0.001), microsatellite stability (p=0.005), KRAS mutation (p=0.010) and, on multivariate analysis, with a greater than two times risk of cancer-specific death (HR=2.57 (1.27, 5.19)). After multivariate adjustment, via penalized smoothing splines, we found increasing tumor bud counts from 5 upward to be associated with an increasingly shortened cancer-specific survival. By this method, a tumor bud count of 10 corresponded to approximately 2.5 times the risk of cancer-specific death. The interobserver agreement was good, with weighted kappa of 0.70 for two GI pathologists over 121 random cases and 0.72 between all six pathologists for 20 random cases. Using an established method to assess budding on routine histologic stains, we have shown a cutoff of 10 for high tumor budding is independently associated with a significantly worse prognosis. The reproducibility data provide support for the routine widespread implementation of tumor budding in clinical reports. PMID:26200097

  13. Logic, probability, and human reasoning.

    PubMed

    Johnson-Laird, P N; Khemlani, Sangeet S; Goodwin, Geoffrey P

    2015-04-01

    This review addresses the long-standing puzzle of how logic and probability fit together in human reasoning. Many cognitive scientists argue that conventional logic cannot underlie deductions, because it never requires valid conclusions to be withdrawn - not even if they are false; it treats conditional assertions implausibly; and it yields many vapid, although valid, conclusions. A new paradigm of probability logic allows conclusions to be withdrawn and treats conditionals more plausibly, although it does not address the problem of vapidity. The theory of mental models solves all of these problems. It explains how people reason about probabilities and postulates that the machinery for reasoning is itself probabilistic. Recent investigations accordingly suggest a way to integrate probability and deduction.

  14. Dinosaurs, Dinosaur Eggs, and Probability.

    ERIC Educational Resources Information Center

    Teppo, Anne R.; Hodgson, Ted

    2001-01-01

    Outlines several recommendations for teaching probability in the secondary school. Offers an activity that employs simulation by hand and using a programmable calculator in which geometry, analytical geometry, and discrete mathematics are explored. (KHR)

  15. Joint probabilities and quantum cognition

    NASA Astrophysics Data System (ADS)

    de Barros, J. Acacio

    2012-12-01

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  16. How LO Can You GO? Using the Dice-Based Golf Game GOLO to Illustrate Inferences on Proportions and Discrete Probability Distributions

    ERIC Educational Resources Information Center

    Stephenson, Paul; Richardson, Mary; Gabrosek, John; Reischman, Diann

    2009-01-01

    This paper describes an interactive activity that revolves around the dice-based golf game GOLO. The GOLO game can be purchased at various retail locations or online at igolo.com. In addition, the game may be played online free of charge at igolo.com. The activity is completed in four parts. The four parts can be used in a sequence or they can be…

  17. Intrinsic Probability of a Multifractal Set

    NASA Astrophysics Data System (ADS)

    Hosokawa, Iwao

    1991-12-01

    It is shown that a self-similar measure isotropically distributed in a d-dimensional set should have its own intermittency exponents equivalent to its own generalized dimensions (in the sense of Hentschel and Procaccia), and that the intermittency exponents are completely designated by an intrinsic probability which governs the spatial distribution of the measure. Based on this, it is proven that the intrinsic probability uniquely determines the spatial distribution of the scaling index α of the measure as well as the so-called f-α spectrum of the multifractal set.

  18. Steering in spin tomographic probability representation

    NASA Astrophysics Data System (ADS)

    Man'ko, V. I.; Markovich, L. A.

    2016-09-01

    The steering property known for the two-qubit state in terms of specific inequalities for the correlation function is translated to the state of a qudit with spin j = 3/2. Since most steering detection inequalities are based on correlation functions, we introduce analogs of such functions for single qudit systems. The tomographic probability representation for the qudit states is applied. The connection between the correlation function in the two-qubit system and the single qudit is presented in an integral form with an intertwining kernel calculated explicitly in tomographic probability terms.

  19. Probability of real-time detection versus probability of infection for aerosolized biowarfare agents: a model study.

    PubMed

    Sabelnikov, Alexander; Zhukov, Vladimir; Kempf, Ruth

    2006-05-15

    Real-time biosensors are expected to provide significant help in emergency response management should a terrorist attack with the use of biowarfare (BW) agents occur. In spite of recent and spectacular progress in the field of biosensors, several core questions still remain unaddressed. For instance, how sensitive should a sensor be? To what levels of infection would the different sensitivity limits correspond? How do the probabilities of identification correspond to the probabilities of infection by an agent? In this paper, an attempt was made to address these questions. A simple probability model was generated for the calculation of risks of infection of humans exposed to different doses of infectious agents and of the probability of their simultaneous real-time detection/identification by a model biosensor and its network. A model biosensor was defined as a single device that included an aerosol sampler and a device for identification by any known (or conceived) method. A network of biosensors was defined as a set of several single biosensors that operated in a similar way and dealt with the same amount of an agent. Neither the particular deployment of sensors within the network, nor the spatial and temporal distribution of agent aerosols due to wind, ventilation, humidity, temperature, etc., was considered by the model. Three model biosensors based on PCR, antibody/antigen, and MS techniques were used for simulation. A wide range of their metric parameters, encompassing those of commercially available and laboratory biosensors and those of future, theoretically conceivable devices, was used for several hundred simulations. Based on the analysis of the obtained results, it is concluded that small concentrations of aerosolized agents that are still able to provide significant risks of infection, especially for highly infectious agents (e.g., for smallpox those risks are 1, 8, and 37 infected out of 1000 exposed, depending on the viability of the virus preparation), will

  20. Tsunami probability in the Caribbean Region

    USGS Publications Warehouse

    Parsons, T.; Geist, E.L.

    2008-01-01

    We calculated tsunami runup probability (in excess of 0.5 m) at coastal sites throughout the Caribbean region. We applied a Poissonian probability model because of the variety of uncorrelated tsunami sources in the region. Coastlines were discretized into 20 km by 20 km cells, and the mean tsunami runup rate was determined for each cell. The remarkable ~500-year empirical record compiled by O'Loughlin and Lander (2003) was used to calculate an empirical tsunami probability map, the first of three constructed for this study. However, it is unclear whether the 500-year record is complete, so we conducted a seismic moment-balance exercise using a finite-element model of the Caribbean-North American plate boundaries and the earthquake catalog, and found that moment could be balanced if the seismic coupling coefficient is c = 0.32. Modeled moment release was therefore used to generate synthetic earthquake sequences to calculate 50 tsunami runup scenarios for 500-year periods. We made a second probability map from numerically calculated runup rates in each cell. Differences between the first two probability maps based on empirical and numerically modeled rates suggest that each captured different aspects of tsunami generation; the empirical model may be deficient in primary plate-boundary events, whereas numerical model rates lack backarc fault and landslide sources. We thus prepared a third probability map using Bayesian likelihood functions derived from the empirical and numerical rate models and their attendant uncertainty to weight a range of rates at each 20 km by 20 km coastal cell. Our best-estimate map gives a range of 30-year runup probability from 0-30% regionally. © Birkhäuser 2008.
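
    A sketch of the Poissonian step only: converting an assumed mean rate of runup events exceeding 0.5 m in a coastal cell into a 30-year exceedance probability via P = 1 − exp(−rate · T). The rates are invented placeholders, not values from the study.

```python
# Sketch of the Poissonian step only: convert an assumed mean rate of
# runup events (> 0.5 m) per coastal cell into a 30-year exceedance
# probability, P = 1 - exp(-rate * T). Rates are invented placeholders,
# not values from the study.
import math

rates_per_year = {"cell_A": 1 / 100.0, "cell_B": 1 / 500.0, "cell_C": 1 / 2000.0}
T = 30.0

for cell, lam in rates_per_year.items():
    p = 1.0 - math.exp(-lam * T)
    print(f"{cell}: 30-yr runup probability = {p:.1%}")
```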

  1. Pathway-based network analysis of myeloma tumors: monoclonal gammopathy of unknown significance, smoldering multiple myeloma, and multiple myeloma.

    PubMed

    Dong, L; Chen, C Y; Ning, B; Xu, D L; Gao, J H; Wang, L L; Yan, S Y; Cheng, S

    2015-08-14

    Although many studies have been carried out on monoclonal gammopathy of unknown significance (MGUS), smoldering multiple myeloma (SMM), and multiple myeloma (MM), their classification and underlying pathogenesis are far from elucidated. To discover the relationships among MGUS, SMM, and MM at the transcriptome level, differentially expressed genes in MGUS, SMM, and MM were identified by the rank product method, and then co-expression networks were constructed by integrating the data. Finally, a pathway-network was constructed based on Kyoto Encyclopedia of Genes and Genomes pathway enrichment analysis, and the relationships between the pathways were identified. The results indicated that there were 55, 78, and 138 pathways involved in the myeloma tumor developmental stages of MGUS, SMM, and MM, respectively. The biological processes identified therein were found to have a close relationship with the immune system. Processes and pathways related to the abnormal activity of DNA and RNA were also present in SMM and MM. Six common pathways were found in the whole process of myeloma tumor development. Nine pathways were shown to participate in the progression of MGUS to SMM, and prostate cancer was the sole pathway that was involved only in MGUS and MM. Pathway-network analysis might provide a new indicator for the developmental stage diagnosis of myeloma tumors.

  2. Boron Doping of Multiwalled Carbon Nanotubes Significantly Enhances Hole Extraction in Carbon-Based Perovskite Solar Cells.

    PubMed

    Zheng, Xiaoli; Chen, Haining; Li, Qiang; Yang, Yinglong; Wei, Zhanhua; Bai, Yang; Qiu, Yongcai; Zhou, Dan; Wong, Kam Sing; Yang, Shihe

    2017-03-15

    Compared to the conventional perovskite solar cells (PSCs) containing hole-transport materials (HTM), carbon-material-based HTM-free PSCs (C-PSCs) have often suffered from inferior power conversion efficiencies (PCEs), arising at least partially from the inefficient hole extraction at the perovskite-carbon interface. Here, we show that boron (B)-doped multiwalled carbon nanotube (B-MWNT) electrodes are superior in enabling enhanced hole extraction and transport by increasing the work function, carrier concentration, and conductivity of MWNTs. The C-PSCs prepared using the B-MWNTs as the counter electrodes to extract and transport hole carriers have achieved remarkably higher performances than those with the undoped MWNTs, with the resulting PCE being considerably improved from 10.70% (average of 9.58%) to 14.60% (average of 13.70%). Significantly, these cells show negligible hysteretic behavior. Moreover, by coating a thin layer of insulating aluminum oxide (Al2O3) on the mesoporous TiO2 film as a physical barrier to substantially reduce the charge losses, the PCE has been further pushed to 15.23% (average 14.20%). Finally, the impressive durability and stability of the prepared C-PSCs were also verified under various conditions, including long-term air exposure, heat treatment, and high humidity.

  3. Reactive Intermediates: Molecular and MS-Based Approaches to Assess the Functional Significance of Chemical:Protein Adducts

    PubMed Central

    Monks, Terrence J.; Lau, Serrine S.

    2014-01-01

    Biologically reactive intermediates formed as endogenous products of various metabolic processes are considered important factors in a variety of human diseases, including Parkinson’s disease and other neurological disorders, diabetes and complications thereof, and other inflammatory-associated diseases. Chemical-induced toxicities are also frequently mediated via the bioactivation of relatively stable organic molecules to reactive electrophilic metabolites. Indeed, chemical-induced toxicities have long been known to be associated with the ability of electrophilic metabolites to react with a variety of targets within the cell, including their covalent adduction to nucleophilic residues in proteins, and nucleotides within DNA. Although we possess considerable knowledge of the various biochemical mechanisms by which chemicals undergo metabolic bioactivation, we understand far less about the processes that couple bioactivation to toxicity. Identifying specific sites within a protein that are targets for adduction can provide the initial information necessary to determine whether such adventitious post-translational modifications significantly alter either protein structure and/or function. To address this problem we have developed MS-based approaches to identify specific amino acid targets of electrophile adduction (electrophile-binding motifs), coupled with molecular modeling of such adducts, to determine the potential structural and functional consequences. Where appropriate, functional assays are subsequently conducted to assess protein function. PMID:23222993

  4. Modifications to the AOAC use-dilution test for quaternary ammonium compound-based disinfectants that significantly improve method reliability.

    PubMed

    Arlea, Crystal; King, Sharon; Bennie, Barbara; Kemp, Kere; Mertz, Erin; Staub, Richard

    2008-01-01

    The AOAC use-dilution test (UDT) for bactericidal disinfectant efficacy (Method 964.02) has often been criticized for its extreme variability in test results, particularly for quaternary ammonium compound (QAC)-based disinfectants against Pseudomonas aeruginosa. While efforts are under way to develop a new and better test method for hospital disinfectant products that is globally acceptable, U.S. manufacturers and formulators of QAC products must continue in the interim to measure their product performance against the current UDT method. Therefore, continued variability in the UDT places an unnecessary and unfair burden on U.S. QAC product manufacturers to ensure that their products perform against a test method that is, at best, unreliable. This article reports on evaluations conducted to identify key sources of UDT method variability and to find ways to mitigate their impact on test outcomes. The results of testing across 4 laboratories, involving over 6015 carriers, determined that operator error was a key factor in test variability. This variability was significantly reduced by the inclusion of a simple culture dilution step. The findings from this study suggest possible refinements to the current AOAC UDT method that would improve its overall ruggedness and reliability and optimize recovery of cells from the carrier surface, thereby further improving the accuracy and reproducibility of counts and test outcomes until a replacement method is implemented.

  5. Probability, arrow of time and decoherence

    NASA Astrophysics Data System (ADS)

    Bacciagaluppi, Guido

    This paper relates both to the metaphysics of probability and to the physics of time asymmetry. Using the formalism of decoherent histories, it investigates whether intuitions about intrinsic time directedness that are often associated with probability can be justified in the context of no-collapse approaches to quantum mechanics. The standard (two-vector) approach to time symmetry in the decoherent histories literature is criticised, and an alternative approach is proposed, based on two decoherence conditions ('forwards' and 'backwards') within the one-vector formalism. In turn, considerations of forwards and backwards decoherence and of decoherence and recoherence suggest that a time-directed interpretation of probabilities, if adopted, should be both contingent and perspectival.

  6. Local Directed Percolation Probability in Two Dimensions

    NASA Astrophysics Data System (ADS)

    Inui, Norio; Konno, Norio; Komatsu, Genichi; Kameoka, Koichi

    1998-01-01

    Using the series expansion method and Monte Carlo simulation, we study the directed percolation probability on the square lattice $V_n^0 = \{(x, y) \in \mathbb{Z}^2 : x + y \text{ even},\; 0 \le y \le n,\; -y \le x \le y\}$. We calculate the local percolation probability $P_n^l$, defined as the connection probability between the origin and the site $(0, n)$. The critical behavior of $P_\infty^l$ is clearly different from that of the global percolation probability $P_\infty^g$, which is characterized by a critical exponent $\beta^g$. An analysis based on Padé approximants shows $\beta^l = 2\beta^g$. In addition, we find that the series expansion of $P_{2n}^l$ can be expressed as a function of $P_n^g$.

  7. Choice strategies in multiple-cue probability learning.

    PubMed

    White, Chris M; Koehler, Derek J

    2007-07-01

    Choice strategies for selecting among outcomes in multiple-cue probability learning were investigated using a simulated medical diagnosis task. Expected choice probabilities (the proportion of times each outcome was selected given each cue pattern) under alternative choice strategies were constructed from corresponding observed judged probabilities (of each outcome given each cue pattern) and compared with observed choice probabilities. Most of the participants were inferred to have responded by using a deterministic strategy, in which the outcome with the higher judged probability is consistently chosen, rather than a probabilistic strategy, in which an outcome is chosen with a probability equal to its judged probability. Extended practice in the learning environment did not affect choice strategy selection, contrary to reports from previous studies, results of which may instead be attributable to changes with practice in the variability and extremity of the perceived probabilities on which the choices were based.
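
    The contrast between the two strategies is easy to state in code. The sketch below is not from the study; the outcome labels and judged probabilities are hypothetical, and it only illustrates a deterministic (maximizing) rule versus a probability-matching rule applied to the same judged probabilities.

        import random

        def deterministic_choice(judged):
            # Always select the outcome with the highest judged probability.
            return max(judged, key=judged.get)

        def probabilistic_choice(judged):
            # Select each outcome with probability equal to its judged probability
            # (probability matching).
            outcomes, weights = zip(*judged.items())
            return random.choices(outcomes, weights=weights, k=1)[0]

        judged = {"disease A": 0.7, "disease B": 0.3}   # hypothetical judged probabilities
        picks = [probabilistic_choice(judged) for _ in range(1000)]
        print(deterministic_choice(judged), picks.count("disease A") / 1000)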

  8. The role of probabilities in physics.

    PubMed

    Le Bellac, Michel

    2012-09-01

    Although modern physics was born in the XVIIth century as a fully deterministic theory in the form of Newtonian mechanics, the use of probabilistic arguments turned out later on to be unavoidable. Three main situations can be distinguished. (1) When the number of degrees of freedom is very large, on the order of Avogadro's number, a detailed dynamical description is not possible, and in fact not useful: we do not care about the velocity of a particular molecule in a gas, all we need is the probability distribution of the velocities. This statistical description introduced by Maxwell and Boltzmann allows us to recover equilibrium thermodynamics, gives a microscopic interpretation of entropy and underlies our understanding of irreversibility. (2) Even when the number of degrees of freedom is small (but larger than three) sensitivity to initial conditions of chaotic dynamics makes determinism irrelevant in practice, because we cannot control the initial conditions with infinite accuracy. Although die tossing is in principle predictable, the approach to chaotic dynamics in some limit implies that our ignorance of initial conditions is translated into a probabilistic description: each face comes up with probability 1/6. (3) As is well-known, quantum mechanics is incompatible with determinism. However, quantum probabilities differ in an essential way from the probabilities introduced previously: it has been shown from the work of John Bell that quantum probabilities are intrinsic and cannot be given an ignorance interpretation based on a hypothetical deeper level of description.

  9. Estimating the exceedance probability of extreme rainfalls up to the probable maximum precipitation

    NASA Astrophysics Data System (ADS)

    Nathan, Rory; Jordan, Phillip; Scorah, Matthew; Lang, Simon; Kuczera, George; Schaefer, Melvin; Weinmann, Erwin

    2016-12-01

    If risk-based criteria are used in the design of high hazard structures (such as dam spillways and nuclear power stations), then it is necessary to estimate the annual exceedance probability (AEP) of extreme rainfalls up to and including the Probable Maximum Precipitation (PMP). This paper describes the development and application of two largely independent methods to estimate the frequencies of such extreme rainfalls. One method is based on stochastic storm transposition (SST), which combines the "arrival" and "transposition" probabilities of an extreme storm using the total probability theorem. The second method, based on "stochastic storm regression" (SSR), combines frequency curves of point rainfalls with regression estimates of local and transposed areal rainfalls; rainfall maxima are generated by stochastically sampling the independent variates, where the required exceedance probabilities are obtained using the total probability theorem. The methods are applied to two large catchments (with areas of 3550 km² and 15,280 km²) located in inland southern Australia. Both methods were found to provide similar estimates of the frequency of extreme areal rainfalls for the two study catchments. The best estimates of the AEP of the PMP for the smaller and larger of the catchments were found to be 10⁻⁷ and 10⁻⁶, respectively, but the uncertainty of these estimates spans one to two orders of magnitude. Additionally, the SST method was applied to a range of locations within a meteorologically homogeneous region to investigate the nature of the relationship between the AEP of PMP and catchment area.
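
    As a generic illustration of the combination step described above (not the authors' exact formulation; $S_i$ and $r$ are assumed notation), the total probability theorem combines the conditional exceedance probability for each sampled storm with the probability of that storm arriving over, and being transposed onto, the catchment:

        \[
        P(R > r) = \sum_{i} P(R > r \mid S_i)\, P(S_i),
        \]

    where $S_i$ indexes the sampled storm arrival-and-transposition events and $P(R > r \mid S_i)$ is the probability that storm $i$, once transposed, produces a catchment rainfall exceeding $r$.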

  10. A significant carbon sink in temperate forests in Beijing: based on 20-year field measurements in three stands.

    PubMed

    Zhu, JianXiao; Hu, XueYang; Yao, Hui; Liu, GuoHua; Ji, ChenJun; Fang, JingYun

    2015-11-01

    Numerous efforts have been made to characterize forest carbon (C) cycles and stocks in various ecosystems. However, long-term observation on each component of the forest C cycle is still lacking. We measured C stocks and fluxes in three permanent temperate forest plots (birch, oak and pine forest) during 2011–2014, and calculated the changes in the components of the C cycle relative to measurements made during 1992–1994 at Mt. Dongling, Beijing, China. Forest net primary production in the birch, oak, and pine plots was 5.32, 4.53, and 6.73 Mg C ha⁻¹ a⁻¹, respectively. Corresponding net ecosystem production was 0.12, 0.43, and 3.53 Mg C ha⁻¹ a⁻¹. The C stocks and fluxes in 2011–2014 were significantly larger than those in 1992–1994: the biomass C densities in the birch, oak, and pine plots increased from 50.0, 37.7, and 54.0 Mg C ha⁻¹ in 1994 to 101.5, 77.3, and 110.9 Mg C ha⁻¹ in 2014; soil organic C densities increased from 207.0, 239.1, and 231.7 Mg C ha⁻¹ to 214.8, 241.7, and 238.4 Mg C ha⁻¹; and soil heterotrophic respiration increased from 2.78, 3.49, and 1.81 Mg C ha⁻¹ a⁻¹ to 5.20, 4.10, and 3.20 Mg C ha⁻¹ a⁻¹. These results suggest that the mountainous temperate forest ecosystems in Beijing have served as a carbon sink over the last two decades. These observations of C stocks and fluxes provide field-based data for the long-term study of C cycling in temperate forest ecosystems.

  11. Immunogenic Cell Death Induced by Ginsenoside Rg3: Significance in Dendritic Cell-based Anti-tumor Immunotherapy.

    PubMed

    Son, Keum-Joo; Choi, Ki Ryung; Lee, Seog Jae; Lee, Hyunah

    2016-02-01

    Cancer is one of the leading causes of morbidity and mortality worldwide; there is therefore a need to discover new therapeutic modules with improved efficacy and safety. Immune (cell) therapy is a promising strategy for the treatment of intractable cancers. The effectiveness of certain chemotherapeutics in inducing immunogenic tumor cell death, thus promoting cancer eradication, has been reported. Ginsenoside Rg3 is a ginseng saponin that has antitumor and immunomodulatory activity. In this study, we treated tumor cells with Rg3 to verify the significance of inducing immunogenic tumor cell death in antitumor therapy, especially in DC-based immunotherapy. Rg3 killed both immunogenic (B16F10 melanoma cells) and non-immunogenic (LLC: Lewis lung carcinoma cells) tumor cells by inducing apoptosis. Surface expression of immunogenic death markers, including calreticulin and heat shock proteins, and the transcription of the relevant genes were increased in Rg3-dying tumor cells. Increased calreticulin expression was directly related to the uptake of dying tumor cells by dendritic cells (DCs): the proportion of CRT(+) CD11c(+) cells was increased in the Rg3-treated group. Interestingly, tumor cells dying by immunogenic cell death secreted IFN-γ, an effector molecule for antitumor activity in T cells. Along with the Rg3-induced suppression of pro-angiogenic (TNF-α) and immunosuppressive cytokine (TGF-β) secretion, IFN-γ production from the Rg3-treated tumor cells may also point to Rg3 as an effective anticancer immunotherapeutic. The data clearly suggest that Rg3 induced immunogenic tumor cell death through its cytotoxic effect and its ability to induce DC function, indicating that Rg3 may be an effective component of immunotherapeutic strategies.

  12. Significance and Suppression of Redundant IL17 Responses in Acute Allograft Rejection by Bioinformatics Based Drug Repositioning of Fenofibrate

    PubMed Central

    Okamura, Homare; Hsieh, Szu-Chuan; Gong, Yongquan; Sarwal, Minnie M.

    2013-01-01

    Despite advanced immunosuppression, redundancy in the molecular diversity of acute rejection (AR) often results in incomplete resolution of the injury response. We present a bioinformatics-based approach for identifying these redundant molecular pathways in AR and a drug-repositioning approach to suppress them using FDA-approved drugs currently available for non-transplant indications. Two independent microarray data sets from human renal allograft biopsies (n = 101), from patients on immunosuppression targeted mainly at Th1/IFN-γ immune responses, with and without AR, were profiled. Using gene-set analysis across 3305 biological pathways, significant enrichment was found for the IL17 pathway in AR in both data sets. Recent evidence suggests the IL17 pathway is an important escape mechanism when Th1/IFN-γ mediated responses are suppressed. As current immunosuppressants do not specifically target the IL17 axis, 7200 molecular compounds were interrogated for FDA-approved drugs with specific inhibition of this axis. A combined IL17/IFN-γ suppressive role was predicted for the antilipidemic drug fenofibrate. To assess the immunoregulatory action of fenofibrate, we conducted in vitro treatment of anti-CD3/CD28-stimulated human peripheral blood mononuclear cells (PBMC); as predicted, fenofibrate reduced IL17 and IFN-γ gene expression in stimulated PBMC. In vivo fenofibrate treatment of an experimental rodent model of cardiac AR reduced infiltration of total leukocytes, reduced expression of IL17/IFN-γ and their pathway-related genes in allografts and recipients’ spleens, and extended graft survival by 21 days (p<0.007). In conclusion, this study provides important proof of concept that meta-analyses of genomic data and drug databases can provide new insights into the redundancy of the rejection response and presents an economical methodology to reposition FDA-approved drugs in organ transplantation. PMID:23437201

  13. FISH-Based Analysis of Clonally Derived CHO Cell Populations Reveals High Probability for Transgene Integration in a Terminal Region of Chromosome 1 (1q13)

    PubMed Central

    Li, Shengwei; Gao, Xiaoping; Peng, Rui; Zhang, Sheng; Fu, Wei

    2016-01-01

    A basic goal in the development of recombinant proteins is the generation of cell lines that express the desired protein stably over many generations. Here, we constructed engineered Chinese hamster ovary (CHO-S) cell lines with a pCHO-hVR1 vector that carried an extracellular domain of a VEGF receptor (VR) fusion gene. Forty-five clones with high hVR1 expression were selected for karyotype analysis. Using fluorescence in situ hybridization (FISH) and G-banding, we found that pCHO-hVR1 was integrated into three chromosomes: chromosomes 1, Z3 and Z4. Four clones were selected to evaluate their productivity under non-fed, non-optimized shake-flask conditions. The results showed that clones 1 and 2, with integration sites on chromosome 1, produced high levels of hVR1 (approximately 800 mg/L in shake flasks), whereas clones 3 and 4, with integration sites on chromosomes Z3 or Z4, produced lower levels of hVR1. Furthermore, clones 1 and 2 maintained stable productivity over a continuous period of 80 generations, whereas clones 3 and 4 showed significant declines in productivity in the presence of selection pressure. Finally, pCHO-hVR1 localized to the same region at chromosome 1q13, the telomere region of normal chromosome 1. These results demonstrate that integration of the exogenous hVR1 gene on chromosome 1, band q13, may create a high protein-producing CHO-S cell line, suggesting that chromosome 1q13 may contain a useful target site for high expression of exogenous proteins. Integration at the chromosome 1q13 target site may avoid the problems of random integration that cause gene silencing, and may also overcome position effects, facilitating exogenous gene expression in CHO-S cells. PMID:27684722

  14. Detonation probabilities of high explosives

    SciTech Connect

    Eisenhawer, S.W.; Bott, T.F.; Bement, T.R.

    1995-07-01

    The probability of a high explosive violent reaction (HEVR) following various events is an extremely important aspect of estimating accident-sequence frequency for nuclear weapons dismantlement. In this paper, we describe the development of response curves for insults to PBX 9404, a conventional high-performance explosive used in US weapons. The insults during dismantlement include drops of high explosive (HE), strikes of tools and components on HE, and abrasion of the explosive. In the case of drops, we combine available test data on HEVRs and the results of flooring certification tests to estimate the HEVR probability. For other insults, it was necessary to use expert opinion. We describe the expert solicitation process and the methods used to consolidate the responses. The HEVR probabilities obtained from both approaches are compared.

  15. A Probability Distribution over Latent Causes, in the Orbitofrontal Cortex

    PubMed Central

    Niv, Yael; Norman, Kenneth A.

    2016-01-01

    The orbitofrontal cortex (OFC) has been implicated in both the representation of “state,” in studies of reinforcement learning and decision making, and also in the representation of “schemas,” in studies of episodic memory. Both of these cognitive constructs require a similar inference about the underlying situation or “latent cause” that generates our observations at any given time. The statistically optimal solution to this inference problem is to use Bayes' rule to compute a posterior probability distribution over latent causes. To test whether such a posterior probability distribution is represented in the OFC, we tasked human participants with inferring a probability distribution over four possible latent causes, based on their observations. Using fMRI pattern similarity analyses, we found that BOLD activity in the OFC is best explained as representing the (log-transformed) posterior distribution over latent causes. Furthermore, this pattern explained OFC activity better than other task-relevant alternatives, such as the most probable latent cause, the most recent observation, or the uncertainty over latent causes. SIGNIFICANCE STATEMENT Our world is governed by hidden (latent) causes that we cannot observe, but which generate the observations we see. A range of high-level cognitive processes require inference of a probability distribution (or “belief distribution”) over the possible latent causes that might be generating our current observations. This is true for reinforcement learning and decision making (where the latent cause comprises the true “state” of the task), and for episodic memory (where memories are believed to be organized by the inferred situation or “schema”). Using fMRI, we show that this belief distribution over latent causes is encoded in patterns of brain activity in the orbitofrontal cortex, an area that has been separately implicated in the representations of both states and schemas. PMID:27466328
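
    For readers unfamiliar with the inference step, here is a minimal sketch of computing a posterior over four latent causes with Bayes' rule; the generative model, prior, and numbers are invented for illustration and are not the task used in the study.

        import numpy as np

        # Hypothetical generative model: four latent causes, each with its own
        # probability of producing each of three possible observations.
        prior = np.array([0.25, 0.25, 0.25, 0.25])        # P(cause)
        likelihood = np.array([[0.7, 0.2, 0.1],           # P(obs | cause 0)
                               [0.1, 0.8, 0.1],           # P(obs | cause 1)
                               [0.2, 0.2, 0.6],           # P(obs | cause 2)
                               [0.4, 0.4, 0.2]])          # P(obs | cause 3)

        def posterior(observations):
            # Bayes' rule: multiply the prior by the likelihood of each observation,
            # then renormalize so the posterior sums to one.
            post = prior.copy()
            for obs in observations:
                post *= likelihood[:, obs]
                post /= post.sum()
            return post

        print(posterior([0, 0, 1]))           # posterior over the four latent causes
        print(np.log(posterior([0, 0, 1])))   # log-transformed, as in the pattern analyses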

  16. Interference of probabilities in dynamics

    SciTech Connect

    Zak, Michail

    2014-08-15

    A new class of dynamical systems with a preset type of interference of probabilities is introduced. It is obtained from the extension of the Madelung equation by replacing the quantum potential with a specially selected feedback from the Liouville equation. It has been proved that these systems are different from both Newtonian and quantum systems, but they can be useful for modeling spontaneous collective novelty phenomena when emerging outputs are qualitatively different from the weighted sum of individual inputs. Formation of language and fast decision-making process as potential applications of the probability interference is discussed.

  17. Knowledge typology for imprecise probabilities.

    SciTech Connect

    Wilson, G. D.; Zucker, L. J.

    2002-01-01

    When characterizing the reliability of a complex system there are often gaps in the data available for specific subsystems or other factors influencing total system reliability. At Los Alamos National Laboratory we employ ethnographic methods to elicit expert knowledge when traditional data is scarce. Typically, we elicit expert knowledge in probabilistic terms. This paper will explore how we might approach elicitation if methods other than probability (i.e., Dempster-Shafer, or fuzzy sets) prove more useful for quantifying certain types of expert knowledge. Specifically, we will consider if experts have different types of knowledge that may be better characterized in ways other than standard probability theory.

  18. Stretching Probability Explorations with Geoboards

    ERIC Educational Resources Information Center

    Wheeler, Ann; Champion, Joe

    2016-01-01

    Students are faced with many transitions in their middle school mathematics classes. To build knowledge, skills, and confidence in the key areas of algebra and geometry, students often need to practice using numbers and polygons in a variety of contexts. Teachers also want students to explore ideas from probability and statistics. Teachers know…

  19. Risk estimation using probability machines

    PubMed Central

    2014-01-01

    Background Logistic regression has been the de facto, and often the only, model used in the description and analysis of relationships between a binary outcome and observed features. It is widely used to obtain the conditional probabilities of the outcome given predictors, as well as predictor effect size estimates using conditional odds ratios. Results We show how statistical learning machines for binary outcomes, provably consistent for the nonparametric regression problem, can be used to provide both consistent conditional probability estimation and conditional effect size estimates. Effect size estimates from learning machines leverage our understanding of counterfactual arguments central to the interpretation of such estimates. We show that, if the data generating model is logistic, we can recover accurate probability predictions and effect size estimates with nearly the same efficiency as a correct logistic model, both for main effects and interactions. We also propose a method using learning machines to scan for possible interaction effects quickly and efficiently. Simulations using random forest probability machines are presented. Conclusions The models we propose make no assumptions about the data structure, and capture the patterns in the data by just specifying the predictors involved and not any particular model structure. So they do not run the same risks of model mis-specification and the resultant estimation biases as a logistic model. This methodology, which we call a “risk machine”, will share properties from the statistical machine that it is derived from. PMID:24581306
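
    A minimal sketch of the general idea, assuming a random forest as the "probability machine" and a counterfactual-style risk difference as the effect size; the data-generating model, variable names, and tuning settings below are invented for illustration, not taken from the paper.

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)
        n = 5000
        x1 = rng.integers(0, 2, n)        # binary exposure of interest
        x2 = rng.normal(size=n)           # continuous covariate
        p = 1 / (1 + np.exp(-(-1.0 + 1.2 * x1 + 0.8 * x2)))   # logistic data-generating model
        y = rng.binomial(1, p)
        X = np.column_stack([x1, x2])

        rf = RandomForestClassifier(n_estimators=500, min_samples_leaf=25, random_state=0)
        rf.fit(X, y)

        # Conditional probability estimates from the probability machine.
        p_hat = rf.predict_proba(X)[:, 1]

        # Counterfactual-style effect size for x1: predicted risk with x1 set to 1
        # versus 0 for every subject, averaged over the sample.
        risk1 = rf.predict_proba(np.column_stack([np.ones(n), x2]))[:, 1]
        risk0 = rf.predict_proba(np.column_stack([np.zeros(n), x2]))[:, 1]
        print("average risk difference for x1:", (risk1 - risk0).mean())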

  20. Children's Understanding of Posterior Probability

    ERIC Educational Resources Information Center

    Girotto, Vittorio; Gonzalez, Michael

    2008-01-01

    Do young children have a basic intuition of posterior probability? Do they update their decisions and judgments in the light of new evidence? We hypothesized that they can do so extensionally, by considering and counting the various ways in which an event may or may not occur. The results reported in this paper showed that from the age of five,…

  1. Comments on quantum probability theory.

    PubMed

    Sloman, Steven

    2014-01-01

    Quantum probability theory (QP) is the best formal representation available of the most common form of judgment involving attribute comparison (inside judgment). People are capable, however, of judgments that involve proportions over sets of instances (outside judgment). Here, the theory does not do so well. I discuss the theory both in terms of descriptive adequacy and normative appropriateness.

  2. Probability Simulation in Middle School.

    ERIC Educational Resources Information Center

    Lappan, Glenda; Winter, M. J.

    1980-01-01

    Two simulations designed to teach probability to middle-school age pupils are presented. The first simulates the one-on-one foul shot simulation in basketball; the second deals with collecting a set of six cereal box prizes by buying boxes containing one toy each. (MP)

  3. GPS: Geometry, Probability, and Statistics

    ERIC Educational Resources Information Center

    Field, Mike

    2012-01-01

    It might be said that for most occupations there is now less of a need for mathematics than there was say fifty years ago. But, the author argues, geometry, probability, and statistics constitute essential knowledge for everyone. Maybe not the geometry of Euclid, but certainly geometrical ways of thinking that might enable us to describe the world…

  4. Rethinking the learning of belief network probabilities

    SciTech Connect

    Musick, R.

    1996-03-01

    Belief networks are a powerful tool for knowledge discovery that provide concise, understandable probabilistic models of data. There are methods grounded in probability theory to incrementally update the relationships described by the belief network when new information is seen, to perform complex inferences over any set of variables in the data, to incorporate domain expertise and prior knowledge into the model, and to automatically learn the model from data. This paper concentrates on part of the belief network induction problem, that of learning the quantitative structure (the conditional probabilities), given the qualitative structure. In particular, the current practice of rote learning the probabilities in belief networks can be significantly improved upon. We advance the idea of applying any learning algorithm to the task of conditional probability learning in belief networks, discuss potential benefits, and show results of applying neural networks and other algorithms to a medium sized car insurance belief network. The results demonstrate from 10 to 100% improvements in model error rates over the current approaches.
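
    To make the quantitative-structure task concrete, here is a minimal sketch of estimating a conditional probability table for one node given a fixed parent set; the binary toy variables are invented (not the car-insurance network from the paper), and add-one smoothing stands in for the plain rote counting the paper seeks to improve on.

        from collections import Counter
        from itertools import product

        # Toy data: each record is (parent1, parent2, child), all binary.
        data = [(0, 0, 0), (0, 0, 0), (0, 1, 1), (1, 0, 1),
                (1, 1, 1), (1, 1, 1), (0, 1, 0), (1, 0, 0)]
        counts = Counter(data)

        def cpt(child_values=(0, 1), parent_values=(0, 1)):
            # P(child | parent1, parent2) with add-one (Laplace) smoothing.
            table = {}
            for p1, p2 in product(parent_values, repeat=2):
                total = sum(counts[(p1, p2, c)] for c in child_values) + len(child_values)
                table[(p1, p2)] = {c: (counts[(p1, p2, c)] + 1) / total for c in child_values}
            return table

        for parents, dist in cpt().items():
            print(parents, dist)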

  5. Probability-Based Inference in Cognitive Diagnosis

    DTIC Science & Technology

    1994-02-01

    of variables in the student model. In Siegler’s study, this corresponds to determining how a child with a given set of strategies at her disposal... programs are commercially available to carry out the number-crunching aspect. We used Andersen, Jensen, Olesen, and Jensen’s (1989) HUGIN program and Noetic... studying how they are typically acquired (e.g., in mechanics, Clement, 1982; in ratio and proportional reasoning, Karplus, Pulos, & Stage, 1983), and

  6. Development of Assays for Detecting Significant Prostate Cancer Based on Molecular Alterations Associated with Cancer in Non-Neoplastic Prostate Tissue

    DTIC Science & Technology

    2015-10-01

    Award Number: W81XWH-11-1-0744. TITLE: Development of Assays for Detecting Significant Prostate Cancer Based on Molecular Alterations Associated with Cancer in Non-Neoplastic Prostate Tissue. ABSTRACT: The goal of this project is to develop biopsy-based assays to

  7. Finding of No Significant Impact for Porposed Replacement of Senior Officers Quarters Project, McConnell Air Force Base, Kansas

    DTIC Science & Technology

    2006-08-08

    AF Air Force; AFB Air Force Base; AFI Air Force Instruction; AFM Air Force Manual; AFRC Air Force Reserve Command; AICUZ Air Installation... shallow unconfined zones, and again in the deeper Wellington shale. Groundwater occurs in two water-bearing units at McConnell AFB. The shallow... east side of the base, the Wellington formation, a Permian silty shale, is highly weathered at the surface to a depth of about 40 feet. The Wellington

  8. Music-evoked incidental happiness modulates probability weighting during risky lottery choices

    PubMed Central

    Schulreich, Stefan; Heussen, Yana G.; Gerhardt, Holger; Mohr, Peter N. C.; Binkofski, Ferdinand C.; Koelsch, Stefan; Heekeren, Hauke R.

    2014-01-01

    We often make decisions with uncertain consequences. The outcomes of the choices we make are usually not perfectly predictable but probabilistic, and the probabilities can be known or unknown. Probability judgments, i.e., the assessment of unknown probabilities, can be influenced by evoked emotional states. This suggests that also the weighting of known probabilities in decision making under risk might be influenced by incidental emotions, i.e., emotions unrelated to the judgments and decisions at issue. Probability weighting describes the transformation of probabilities into subjective decision weights for outcomes and is one of the central components of cumulative prospect theory (CPT) that determine risk attitudes. We hypothesized that music-evoked emotions would modulate risk attitudes in the gain domain and in particular probability weighting. Our experiment featured a within-subject design consisting of four conditions in separate sessions. In each condition, the 41 participants listened to a different kind of music—happy, sad, or no music, or sequences of random tones—and performed a repeated pairwise lottery choice task. We found that participants chose the riskier lotteries significantly more often in the “happy” than in the “sad” and “random tones” conditions. Via structural regressions based on CPT, we found that the observed changes in participants' choices can be attributed to changes in the elevation parameter of the probability weighting function: in the “happy” condition, participants showed significantly higher decision weights associated with the larger payoffs than in the “sad” and “random tones” conditions. Moreover, elevation correlated positively with self-reported music-evoked happiness. Thus, our experimental results provide evidence in favor of a causal effect of incidental happiness on risk attitudes that can be explained by changes in probability weighting. PMID:24432007
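
    For readers who want to see what an "elevation parameter" does, the sketch below uses the two-parameter linear-in-log-odds weighting function, one common CPT parameterization in which delta controls elevation and gamma controls curvature; it is not necessarily the exact functional form fitted in the study, and the parameter values are invented.

        import numpy as np

        def weight(p, delta, gamma):
            # Linear-in-log-odds probability weighting: w(p) = d*p^g / (d*p^g + (1-p)^g).
            return (delta * p**gamma) / (delta * p**gamma + (1 - p)**gamma)

        p = np.linspace(0.01, 0.99, 5)
        print(weight(p, delta=0.8, gamma=0.6))   # lower elevation (smaller decision weights)
        print(weight(p, delta=1.1, gamma=0.6))   # higher elevation (larger decision weights)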

  9. Music-evoked incidental happiness modulates probability weighting during risky lottery choices.

    PubMed

    Schulreich, Stefan; Heussen, Yana G; Gerhardt, Holger; Mohr, Peter N C; Binkofski, Ferdinand C; Koelsch, Stefan; Heekeren, Hauke R

    2014-01-07

    We often make decisions with uncertain consequences. The outcomes of the choices we make are usually not perfectly predictable but probabilistic, and the probabilities can be known or unknown. Probability judgments, i.e., the assessment of unknown probabilities, can be influenced by evoked emotional states. This suggests that also the weighting of known probabilities in decision making under risk might be influenced by incidental emotions, i.e., emotions unrelated to the judgments and decisions at issue. Probability weighting describes the transformation of probabilities into subjective decision weights for outcomes and is one of the central components of cumulative prospect theory (CPT) that determine risk attitudes. We hypothesized that music-evoked emotions would modulate risk attitudes in the gain domain and in particular probability weighting. Our experiment featured a within-subject design consisting of four conditions in separate sessions. In each condition, the 41 participants listened to a different kind of music-happy, sad, or no music, or sequences of random tones-and performed a repeated pairwise lottery choice task. We found that participants chose the riskier lotteries significantly more often in the "happy" than in the "sad" and "random tones" conditions. Via structural regressions based on CPT, we found that the observed changes in participants' choices can be attributed to changes in the elevation parameter of the probability weighting function: in the "happy" condition, participants showed significantly higher decision weights associated with the larger payoffs than in the "sad" and "random tones" conditions. Moreover, elevation correlated positively with self-reported music-evoked happiness. Thus, our experimental results provide evidence in favor of a causal effect of incidental happiness on risk attitudes that can be explained by changes in probability weighting.

  10. Team-Based Learning, Faculty Research, and Grant Writing Bring Significant Learning Experiences to an Undergraduate Biochemistry Laboratory Course

    ERIC Educational Resources Information Center

    Evans, Hedeel Guy; Heyl, Deborah L.; Liggit, Peggy

    2016-01-01

    This biochemistry laboratory course was designed to provide significant learning experiences to expose students to different ways of succeeding as scientists in academia and foster development and improvement of their potential and competency as the next generation of investigators. To meet these goals, the laboratory course employs three…

  11. Reduced reward-related probability learning in schizophrenia patients.

    PubMed

    Yılmaz, Alpaslan; Simsek, Fatma; Gonul, Ali Saffet

    2012-01-01

    Although it is known that individuals with schizophrenia demonstrate marked impairment in reinforcement learning, the details of this impairment are not known. The aim of this study was to test the hypothesis that reward-related probability learning is altered in schizophrenia patients. Twenty-five clinically stable schizophrenia patients and 25 age- and gender-matched controls participated in the study. A simple gambling paradigm was used in which five different cues were associated with different reward probabilities (50%, 67%, and 100%). Participants were asked to make their best guess about the reward probability of each cue. Compared with controls, patients had significant impairment in learning contingencies on the basis of reward-related feedback. The correlation analyses revealed that the impairment of patients partially correlated with the severity of negative symptoms as measured on the Positive and Negative Syndrome Scale but that it was not related to antipsychotic dose. In conclusion, the present study showed that the schizophrenia patients had impaired reward-based learning and that this was independent from their medication status.

  12. Probability summation--a critique.

    PubMed

    Laming, Donald

    2013-03-01

    This Discussion Paper seeks to kill off probability summation, specifically the high-threshold assumption, as an explanatory idea in visual science. In combination with a Weibull function with an exponent of about 4, probability summation can accommodate, to within the limits of experimental error, the shape of the detectability function for contrast, the reduction in threshold that results from the combination of widely separated grating components, summation with respect to duration at threshold, and some instances, but not all, of spatial summation. But it has repeated difficulty with stimuli below threshold, because it denies the availability of input from such stimuli. All the phenomena listed above, and many more, can be accommodated equally accurately by signal-detection theory combined with an accelerated nonlinear transform of small, near-threshold contrasts. This is illustrated with a transform that is a fourth power for the smallest contrasts but tends to linear above threshold. Moreover, this particular transform can be derived from elementary properties of sensory neurons. Probability summation cannot be regarded as a special case of a more general theory, because it depends essentially on the 19th-century notion of a high fixed threshold. It is simply an obstruction to further progress.
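
    For reference, the high-threshold probability-summation account being criticised is usually written roughly as follows (a generic statement with assumed notation, not a formula quoted from the paper): each channel $i$ detects a contrast $c$ with Weibull probability

        \[
        p_i(c) = 1 - \exp\!\left[-\left(\frac{c}{\alpha_i}\right)^{\beta}\right], \qquad
        P_{\text{detect}}(c) = 1 - \prod_i \bigl(1 - p_i(c)\bigr),
        \]

    with $\beta \approx 4$ for contrast detection. Because each $p_i$ is effectively zero below threshold, the combined probability gains nothing from sub-threshold stimuli, which is the difficulty the paper presses.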

  13. Finding of No Significant Impact: Environmental Assessment Construction of Hangar Addition Building 820 Tinker Air Force Base, Oklahoma

    DTIC Science & Technology

    2012-03-06

    Hazardous materials are used by military personnel and on-base contractors throughout the base. The location of hazardous materials, procedures and... implemented by the DoD to identify and evaluate areas and constituents of concern of toxic and hazardous material disposal and spill sites. Once the areas... to hazardous materials and wastes, including ERP sites. 4.2.6.1 Hazardous Materials 4.2.6.1.1 Preferred Alternative Construction Impacts: The

  14. Richness-Productivity Relationships Between Trophic Levels in a Detritus-Based System: Significance of Abundance and Trophic Linkage.

    EPA Science Inventory

    Most theoretical and empirical studies of productivity–species richness relationships fail to consider linkages among trophic levels. We quantified productivity–richness relationships in detritus-based, water-filled tree-hole communities for two trophic levels: invertebrate consu...

  15. Significant Life Experience: Exploring the Lifelong Influence of Place-Based Environmental and Science Education on Program Participants

    ERIC Educational Resources Information Center

    Colvin, Corrie Ruth

    2013-01-01

    Current research provides a limited understanding of the life long influence of nonformal place-based environmental and science education programs on past participants. This study looks to address this gap, exploring the ways in which these learning environments have contributed to environmental identity and stewardship. Using Dorothy Holland's…

  16. Significant light induced ozone loss on biomass burning aerosol: Evidence from chemistry-transport modeling based on new laboratory studies

    NASA Astrophysics Data System (ADS)

    Konovalov, I. B.; Beekmann, M.; D'Anna, B.; George, C.

    2012-09-01

    Recent laboratory studies have indicated that a photo-induced heterogeneous reaction of ozone on the surface of aerosol containing humic-like substances (HULIS) has the potential to affect the ozone budget in biomass burning plumes. To evaluate the atmospheric significance of such heterogeneous light-induced ozone loss, this process was taken into account in a simulation of the extreme air pollution episode in the Moscow region during the 2010 mega-fire event in western Russia. Results of the numerical experiments performed with the CHIMERE chemistry transport model indicate that photo-induced removal of ozone could lead to a significant (reaching several tens of percent) episodic decrease of the ozone concentration. The simulations also show that while wildfires provide a reactive surface for the considered reaction, they strongly inhibit the photo-induced heterogeneous ozone loss by attenuating actinic fluxes through the “shielding” aerosol effect. The present results call for additional experimental and modelling studies.

  17. Objective Probability and Quantum Fuzziness

    NASA Astrophysics Data System (ADS)

    Mohrhoff, U.

    2009-02-01

    This paper offers a critique of the Bayesian interpretation of quantum mechanics with particular focus on a paper by Caves, Fuchs, and Schack containing a critique of the “objective preparations view” or OPV. It also aims to carry the discussion beyond the hardened positions of Bayesians and proponents of the OPV. Several claims made by Caves et al. are rebutted, including the claim that different pure states may legitimately be assigned to the same system at the same time, and the claim that the quantum nature of a preparation device cannot legitimately be ignored. Both Bayesians and proponents of the OPV regard the time dependence of a quantum state as the continuous dependence on time of an evolving state of some kind. This leads to a false dilemma: quantum states are either objective states of nature or subjective states of belief. In reality they are neither. The present paper views the aforesaid dependence as a dependence on the time of the measurement to whose possible outcomes the quantum state serves to assign probabilities. This makes it possible to recognize the full implications of the only testable feature of the theory, viz., the probabilities it assigns to measurement outcomes. Most important among these are the objective fuzziness of all relative positions and momenta and the consequent incomplete spatiotemporal differentiation of the physical world. The latter makes it possible to draw a clear distinction between the macroscopic and the microscopic. This in turn makes it possible to understand the special status of measurements in all standard formulations of the theory. Whereas Bayesians have written contemptuously about the “folly” of conjoining “objective” to “probability,” there are various reasons why quantum-mechanical probabilities can be considered objective, not least the fact that they are needed to quantify an objective fuzziness. But this cannot be appreciated without giving thought to the makeup of the world, which

  18. Clinical Significance of Two Real-Time PCR Assays for Chronic Hepatitis C Patients Receiving Protease Inhibitor-Based Therapy

    PubMed Central

    Inoue, Takako; Hmwe, Su Su; Shimada, Noritomo; Kato, Keizo; Ide, Tatsuya; Torimura, Takuji; Kumada, Takashi; Toyoda, Hidenori; Tsubota, Akihito; Takaguchi, Koichi; Wakita, Takaji; Tanaka, Yasuhito

    2017-01-01

    The aim of this study was to determine the efficacy of two hepatitis C virus (HCV) real-time PCR assays, the COBAS AmpliPrep/COBAS TaqMan HCV test (CAP/CTM) and the Abbott RealTime HCV test (ART), for predicting the clinical outcomes of patients infected with HCV who received telaprevir (TVR)-based triple therapy or daclatasvir/asunaprevir (DCV/ASV) dual therapy. The rapid virological response rates in patients receiving TVR-based triple therapy were 92% (23/25) and 40% (10/25) for CAP/CTM and ART, respectively. The false omission rate (FOR) of ART was 93.3% (14/15), indicating that CAP/CTM could accurately predict clinical outcome in the early phase. In an independent examination of 20 patients receiving TVR-based triple therapy who developed viral breakthrough or relapse, the times to HCV disappearance by ART were longer than by CAP/CTM, whereas the times to HCV reappearance were similar. In an independent experiment of WHO standard HCV RNA serially diluted in serum containing TVR, the analytical sensitivities of CAP/CTM and ART were similar. However, cell cultures transfected with HCV and grown in medium containing TVR demonstrated that ART detected HCV RNA for a longer time than CAP/CTM. Similar results were found for 42 patients receiving DCV/ASV dual therapy. The FOR of ART was 73.3% (11/15) at week 8 after initiation of therapy, indicating that ART at week 8 could not accurately predict the clinical outcome. In conclusion, although CAP/CTM and ART detected HCV RNA with comparable analytical sensitivity, CAP/CTM might be preferable for predicting the clinical outcomes of patients receiving protease inhibitor-based therapy. PMID:28118381

  19. Construction of Open Burning Facility Moody Air Force Base, Georgia Environmental Assessment and Finding of No Significant Impact

    DTIC Science & Technology

    2009-01-01

    Force Manual (AFMAN) 91-201 restricts access within the explosive clear zone around the MSA to mission-essential personnel, which would exclude... the combustion of coal and oil by steel mills, pulp and paper mills, and from non-ferrous smelters. High concentrations of SO2 can aggravate... Assessment -- Open Burning Facility 29 4. Industrial—265 acres • Base Civil Engineering shops • Munitions storage • Petroleum, oil, and

  20. VOLCANIC RISK ASSESSMENT - PROBABILITY AND CONSEQUENCES

    SciTech Connect

    G.A. Valentine; F.V. Perry; S. Dartevelle

    2005-08-26

    Risk is the product of the probability and consequences of an event. Both of these must be based upon sound science that integrates field data, experiments, and modeling, but must also be useful to decision makers who likely do not understand all aspects of the underlying science. We review a decision framework used in many fields such as performance assessment for hazardous and/or radioactive waste disposal sites that can serve to guide the volcanological community towards integrated risk assessment. In this framework the underlying scientific understanding of processes that affect probability and consequences drive the decision-level results, but in turn these results can drive focused research in areas that cause the greatest level of uncertainty at the decision level. We review two examples of the determination of volcanic event probability: (1) probability of a new volcano forming at the proposed Yucca Mountain radioactive waste repository, and (2) probability that a subsurface repository in Japan would be affected by the nearby formation of a new stratovolcano. We also provide examples of work on consequences of explosive eruptions, within the framework mentioned above. These include field-based studies aimed at providing data for "closure" of wall-rock erosion terms in a conduit flow model, predictions of dynamic pressure and other variables related to damage by pyroclastic flow into underground structures, and vulnerability criteria for structures subjected to conditions of explosive eruption. Process models (e.g., multiphase flow) are important for testing the validity or relative importance of possible scenarios in a volcanic risk assessment. We show how time-dependent multiphase modeling of explosive "eruption" of basaltic magma into an open tunnel (drift) at the Yucca Mountain repository provides insight into proposed scenarios that include the development of secondary pathways to the Earth's surface. Addressing volcanic risk within a decision

  1. The carpenter fork bed, a new - and older - Black-shale unit at the base of the New Albany shale in central Kentucky: Characterization and significance

    USGS Publications Warehouse

    Barnett, S.F.; Ettensohn, F.R.; Norby, R.D.

    1996-01-01

    Black shales previously interpreted to be Late Devonian cave-fill or slide deposits are shown to be much older Middle Devonian black shales only preserved locally in Middle Devonian grabens and structural lows in central Kentucky. This newly recognized - and older -black-shale unit occurs at the base of the New Albany Shale and is named the Carpenter Fork Bed of the Portwood Member of the New Albany Shale after its only known exposure on Carpenter Fork in Boyle County, central Kentucky; two other occurrences are known from core holes in east-central Kentucky. Based on stratigraphic position and conodont biostratigraphy, the unit is Middle Devonian (Givetian: probably Middle to Upper P. varcus Zone) in age and occurs at a position represented by an unconformity atop the Middle Devonian Boyle Dolostone and its equivalents elsewhere on the outcrop belt. Based on its presence as isolated clasts in the overlying Duffin Bed of the Portwood Member, the former distribution of the unit was probably much more widespread - perhaps occurring throughout western parts of the Rome trough. Carpenter Fork black shales apparently represent an episode of subsidence or sea-level rise coincident with inception of the third tectophase of the Acadian orogeny. Deposition, however, was soon interrupted by reactivation of several fault zones in central Kentucky, perhaps in response to bulge migration accompanying start of the tectophase. As a result, much of central Kentucky was uplifted and tilted, and the Carpenter Fork Bed was largely eroded from the top of the Boyle, except in a few structural lows like the Carpenter Fork graben where a nearly complete record of Middle to early Late Devonian deposition is preserved.

  2. Limits on the significant mass-loss scenario based on the globular clusters of the Fornax dwarf spheroidal galaxy

    NASA Astrophysics Data System (ADS)

    Khalaj, P.; Baumgardt, H.

    2016-03-01

    Many of the scenarios proposed to explain the origin of chemically peculiar stars in globular clusters (GCs) require significant mass loss (≥95 per cent) to explain the observed fraction of such stars. In the GCs of the Fornax dwarf galaxy, significant mass loss could be a problem. Larsen et al. showed that there is a large ratio of GCs to metal-poor field stars in Fornax and about 20-25 per cent of all the stars with [Fe/H] < -2 belong to the four metal-poor GCs. This imposes an upper limit of ~80 per cent mass loss that could have happened in Fornax GCs. In this paper, we propose a solution to this problem by suggesting that stars can leave the Fornax galaxy. We use a series of N-body simulations to determine the limit of mass loss from Fornax as a function of the initial orbital radii of GCs and the speed with which stars leave Fornax GCs. We consider a set of cored and cuspy density profiles for Fornax. Our results show that with a cuspy model for Fornax, the fraction of stars that leave the galaxy can be as high as ~90 per cent, when the initial orbital radii of GCs are R = 2-3 kpc and the initial speed of stars is v > 20 km s⁻¹. We show that such large velocities can be achieved by mass loss induced by gas expulsion but not mass loss induced by stellar evolution. Our results imply that one cannot interpret the metallicity distribution of Fornax field stars as evidence against significant mass loss in Fornax GCs, if mass loss is due to gas expulsion.

  3. Significance of Rumex vesicarius as anticancer remedy against hepatocellular carcinoma: a proposal-based on experimental animal studies.

    PubMed

    Shahat, Abdelaaty A; Alsaid, Mansour S; Kotob, Soheir E; Ahmed, Hanaa H

    2015-01-01

    Rumex vesicarius is an edible herb distributed in Egypt and Saudi Arabia. The whole plant has significant value in folk medicine and has been used to alleviate several diseases. Hepatocellular carcinoma (HCC), the major primary malignant tumor of the liver, is one of the most life-threatening human cancers. The goal of the current study was to explore the potential role of Rumex vesicarius extract against HCC induced in rats. Thirty adult male albino rats were divided into 3 groups: (I) healthy animals that received 0.9% normal saline orally and served as the negative control group; (II) an HCC group in which rats were orally administered N-nitrosodiethylamine (NDEA); and (III) an HCC group treated orally with R. vesicarius extract at a dose of 400 mg/kg b.wt daily for two months. ALT, AST, ALP and γ-GT activities were estimated. CEA, AFP, AFU, GPC-3, GP-73 and VEGF levels were quantified. Histopathological examination of liver tissue sections was also carried out. The results showed that treatment of the HCC group with R. vesicarius extract reversed the significant increases in liver enzyme activities and in serum CEA, AFP, AFU, glypican-3, GP-73 and VEGF levels compared with the untreated HCC counterparts. In addition, the favorable impact of R. vesicarius treatment was evidenced by the marked improvement in the histopathological features of the liver of the treated group. In conclusion, the present experimental setting provided evidence for the significance of R. vesicarius as an anticancer candidate with promising potential against HCC. The powerful hepatoprotective properties, the potent antiangiogenic activity and the effective antiproliferative capacity are responsible for the anticancer effect of this plant.

  4. On the Role of Prior Probability in Adiabatic Quantum Algorithms

    NASA Astrophysics Data System (ADS)

    Sun, Jie; Lu, Songfeng; Yang, Liping

    2016-03-01

    In this paper, we study the role of prior probability in the efficiency of the quantum local adiabatic search algorithm. The following aspects of prior probability are found here: firstly, only the probabilities of the marked states affect the running time of the adiabatic evolution; secondly, the prior probability can be used to improve the efficiency of the adiabatic algorithm; thirdly, as in the usual quantum adiabatic evolution, the running time in the case of multiple solution states, where the number of marked elements is much smaller than the size of the assigned set that contains them, can be significantly larger than in the case where the assigned set contains only the marked states.

  5. Cheating Probabilities on Multiple Choice Tests

    NASA Astrophysics Data System (ADS)

    Rizzuto, Gaspard T.; Walters, Fred

    1997-10-01

    This paper is strictly based on mathematical statistics and as such does not depend on prior performance; it assumes the probability of each choice to be identical. In a real-life situation, the probability of two students having identical responses becomes larger the better the students are. However, the mathematical model is developed for all responses, both correct and incorrect, and provides a baseline for evaluation. David Harpp and coworkers (2, 3) at McGill University have evaluated ratios of exact errors in common (EEIC) to errors in common (EIC) and to differences (D). In pairings where the ratio EEIC/EIC was greater than 0.75, the pair had unusually high odds against their answer pattern being random. EEIC/D ratios at values >1.0 indicate pairs of students who were seated adjacent to one another and copied from one another. The original papers should be examined for details.
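
    A minimal sketch of the screening counts behind those ratios; the answer key and response strings are hypothetical, and the full statistical model in the paper is more involved than this counting step.

        def copy_indices(answers_a, answers_b, key):
            # EIC: errors in common (both wrong on the same item, any response).
            # EEIC: exact errors in common (both wrong with the same response).
            # D: items on which the two answer strings differ.
            eic = sum(1 for a, b, k in zip(answers_a, answers_b, key) if a != k and b != k)
            eeic = sum(1 for a, b, k in zip(answers_a, answers_b, key) if a == b and a != k)
            d = sum(1 for a, b in zip(answers_a, answers_b) if a != b)
            return (eeic / eic if eic else 0.0, eeic / d if d else float("inf"))

        key = "ABCDABCDAB"    # hypothetical answer key
        s1 = "ABCDABCCAB"     # hypothetical responses, student 1
        s2 = "ABCDABCCAB"     # hypothetical responses, student 2
        print(copy_indices(s1, s2, key))   # (EEIC/EIC, EEIC/D)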

  6. Approaches to Evaluating Probability of Collision Uncertainty

    NASA Technical Reports Server (NTRS)

    Hejduk, Matthew D.; Johnson, Lauren C.

    2016-01-01

    While the two-dimensional probability of collision (Pc) calculation has served as the main input to conjunction analysis risk assessment for over a decade, it has done this mostly as a point estimate, with relatively little effort made to produce confidence intervals on the Pc value based on the uncertainties in the inputs. The present effort seeks to try to carry these uncertainties through the calculation in order to generate a probability density of Pc results rather than a single average value. Methods for assessing uncertainty in the primary and secondary objects' physical sizes and state estimate covariances, as well as a resampling approach to reveal the natural variability in the calculation, are presented; and an initial proposal for operationally-useful display and interpretation of these data for a particular conjunction is given.

  7. A quantum probability perspective on borderline vagueness.

    PubMed

    Blutner, Reinhard; Pothos, Emmanuel M; Bruza, Peter

    2013-10-01

    The term "vagueness" describes a property of natural concepts, which normally have fuzzy boundaries, admit borderline cases, and are susceptible to Zeno's sorites paradox. We will discuss the psychology of vagueness, especially experiments investigating the judgment of borderline cases and contradictions. In the theoretical part, we will propose a probabilistic model that describes the quantitative characteristics of the experimental finding and extends Alxatib's and Pelletier's () theoretical analysis. The model is based on a Hopfield network for predicting truth values. Powerful as this classical perspective is, we show that it falls short of providing an adequate coverage of the relevant empirical results. In the final part, we will argue that a substantial modification of the analysis put forward by Alxatib and Pelletier and its probabilistic pendant is needed. The proposed modification replaces the standard notion of probabilities by quantum probabilities. The crucial phenomenon of borderline contradictions can be explained then as a quantum interference phenomenon.

  8. Probability of identity by descent in metapopulations.

    PubMed Central

    Kaj, I; Lascoux, M

    1999-01-01

    Equilibrium probabilities of identity by descent (IBD), for pairs of genes within individuals, for genes between individuals within subpopulations, and for genes between subpopulations are calculated in metapopulation models with fixed or varying colony sizes. A continuous-time analog to the Moran model was used in either case. For fixed-colony size both propagule and migrant pool models were considered. The varying population size model is based on a birth-death-immigration (BDI) process, to which migration between colonies is added. Wright's F statistics are calculated and compared to previous results. Adding between-island migration to the BDI model can have an important effect on the equilibrium probabilities of IBD and on Wright's index. PMID:10388835

  9. Conflict Probability Estimation for Free Flight

    NASA Technical Reports Server (NTRS)

    Paielli, Russell A.; Erzberger, Heinz

    1996-01-01

    The safety and efficiency of free flight will benefit from automated conflict prediction and resolution advisories. Conflict prediction is based on trajectory prediction, which becomes less certain the farther in advance the prediction is made. An estimate is therefore needed of the probability that a conflict will occur, given a pair of predicted trajectories and their levels of uncertainty. A method is developed in this paper to estimate that conflict probability. The trajectory prediction errors are modeled as normally distributed, and the two error covariances for an aircraft pair are combined into a single equivalent covariance of the relative position. A coordinate transformation is then used to derive an analytical solution. Numerical examples and Monte Carlo validation are presented.
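
    A minimal numerical sketch of the covariance-combination step; the paper derives an analytical solution after a coordinate transformation, whereas here a Monte Carlo estimate stands in for that closed form, and the separation standard, relative-position estimate, and covariances are hypothetical.

        import numpy as np

        rng = np.random.default_rng(1)

        # Predicted relative position (nmi) of aircraft 2 with respect to aircraft 1,
        # and the two trajectory-prediction error covariances (nmi^2).
        rel_mean = np.array([4.0, 1.0])
        cov1 = np.array([[1.0, 0.2], [0.2, 0.5]])
        cov2 = np.array([[0.8, -0.1], [-0.1, 0.6]])

        # For independent, normally distributed errors, the relative-position error
        # has the combined (equivalent) covariance cov1 + cov2.
        cov_rel = cov1 + cov2

        # Monte Carlo estimate of the probability that separation falls below 5 nmi.
        samples = rng.multivariate_normal(rel_mean, cov_rel, size=200_000)
        p_conflict = np.mean(np.linalg.norm(samples, axis=1) < 5.0)
        print("estimated conflict probability:", p_conflict)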

  10. Group cognitive behavioural treatment of youth anxiety in community based clinical practice: Clinical significance and benchmarking against efficacy.

    PubMed

    Jónsson, H; Thastum, M; Arendt, K; Juul-Sørensen, M

    2015-10-01

    The efficacy of a group cognitive behavioural therapy (CBT) programme (Cool Kids) for youth anxiety has been demonstrated at university clinics in Australia and Denmark, and similar CBT programmes have been found effective within community settings in other countries. However, most effectiveness studies of CBT for youth anxiety have either used a mixture of CBT guidelines or translated protocols not previously tested in an efficacy trial. This study used a benchmarking strategy to compare outcomes from the same CBT programme used at a university research clinic (N=87) and community centres (N=82). There was a significant reduction on both clinical and self-report measures of youth anxiety over time, with medium to large effect sizes within both samples. Treatment effects on self-report measures of youth anxiety were significantly larger within the university sample, while changes in clinical measures of youth anxiety were similar in the two samples. Overall, these findings suggest that an efficacious CBT group treatment programme developed within research contexts is transportable to community centres. Although the treatment was effective within the community, the results indicate that it may lose some of its efficacy when disseminated from research clinics to community settings.

  11. Annonaceous acetogenins (ACGs) nanosuspensions based on a self-assembly stabilizer and the significantly improved anti-tumor efficacy.

    PubMed

    Hong, Jingyi; Li, Yanhong; Xiao, Yao; Li, Yijing; Guo, Yifei; Kuang, Haixue; Wang, Xiangtao

    2016-09-01

    Annonaceous acetogenins (ACGs) have exhibited antitumor activity against various cancers. However, these substances' poor solubility has limited clinical applications. In this study, hydroxypropyl-beta-cyclodextrin (HP-β-CD) and soybean lecithin (SPC) were self-assembled into an amphiphilic complex. ACGs nanosuspensions (ACGs-NSps) were prepared with a mean particle size of 144.4 nm, a zeta potential of -22.9 mV and a high drug payload of 46.17%, using this complex as a stabilizer. The ACGs-NSps demonstrated sustained release in vitro and good stability in plasma as well as in simulated gastrointestinal fluid, and met the demands of both intravenous injection and oral administration. The ACGs-NSps demonstrated significantly increased cytotoxicity against HeLa and HepG2 cancer cell lines compared to ACGs in solution (in vitro cytotoxicity assay). An in vivo study with H22 tumor-bearing mice demonstrated that the nanosuspensions significantly improved the antitumor activity of ACGs. When orally administered, ACGs-NSps achieved a similar tumor inhibition rate at 1/10th the dose of ACGs in an oil solution (47.94% vs. 49.74%, p>0.05). Improved therapeutic efficacy was further achieved when the ACGs-NSps were intravenously injected into mice (70.31%). With the help of nanosuspension technology, ACGs may become an effective antitumor drug for clinical use.

  12. Development of Assays for Detecting Significant Prostate Cancer Based on Molecular Alterations Associated with Cancer in Non-Neoplastic Prostate Tissue

    DTIC Science & Technology

    2012-10-01

    prostate cancer ." Am J Pathol 181(1): 34-42. Li, M. and L. A. Cannizzaro (1999). "Identical clonal origin of synchronous and metachronous low-grade...significant prostate cancer based on molecular alterations associated with cancer in non-neoplastic prostate tissue PRINCIPAL INVESTIGATOR...significant prostate cancer based on molecular alterations associated with cancer in non-neoplastic prostate tissue 5a. CONTRACT NUMBER 5b. GRANT

  13. The Black Hole Formation Probability

    NASA Astrophysics Data System (ADS)

    Clausen, Drew R.; Piro, Anthony; Ott, Christian D.

    2015-01-01

    A longstanding question in stellar evolution is which massive stars produce black holes (BHs) rather than neutron stars (NSs) upon death. It has been common practice to assume that a given zero-age main sequence (ZAMS) mass star (and perhaps a given metallicity) simply produces either an NS or a BH, but this fails to account for a myriad of other variables that may affect this outcome, such as spin, binarity, or even stochastic differences in the stellar structure near core collapse. We argue that a probabilistic description of NS versus BH formation may instead be better suited to account for the current uncertainties in understanding how massive stars die. Using the observed BH mass distribution from Galactic X-ray binaries, we investigate the probability that a star will make a BH as a function of its ZAMS mass. Although the shape of the black hole formation probability function is poorly constrained by current measurements, we believe that this framework is an important new step toward better understanding BH formation. We also consider some of the implications of this probability distribution, from its impact on the chemical enrichment from massive stars, to its connection with the structure of the core at the time of collapse, to the birth kicks that black holes receive. A probabilistic description of BH formation will be a useful input for future population synthesis studies that are interested in the formation of X-ray binaries, the nature and event rate of gravitational wave sources, and answering questions about chemical enrichment.
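
    As a toy illustration of a probabilistic BH-versus-NS description (not the probability function constrained in the paper), one can posit a smooth BH formation probability as a function of ZAMS mass and weight it by an initial mass function to obtain an overall BH formation fraction; every parameter below is an assumption chosen purely for illustration.

```python
import numpy as np

def p_bh(m_zams, m_mid=21.0, width=3.0):
    """Toy logistic BH-formation probability versus ZAMS mass (solar masses).
    The midpoint and width are illustrative, not fitted values."""
    return 1.0 / (1.0 + np.exp(-(m_zams - m_mid) / width))

# IMF-weighted BH formation fraction over an assumed 8-120 Msun progenitor
# range, using a Salpeter slope for the initial mass function.
m = np.linspace(8.0, 120.0, 2000)      # uniform grid, so dm cancels in the ratio
imf = m ** -2.35
f_bh = np.sum(p_bh(m) * imf) / np.sum(imf)
print(f"IMF-weighted BH formation fraction: {f_bh:.2f}")
```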

  14. Applications and statistical properties of minimum significant difference-based criterion testing in a toxicity testing program

    SciTech Connect

    Wang, Q.; Denton, D.L.; Shukla, R.

    2000-01-01

    As a follow-up to the recommendations of the September 1995 SETAC Pellston Workshop on Whole Effluent Toxicity (WET) regarding test methods and appropriate endpoints, this paper discusses the applications and statistical properties of using a statistical criterion based on the minimum significant difference (MSD). The authors examined upper limits of acceptable MSDs as an acceptance criterion in the case of normally distributed data. The implications of this approach are examined in terms of the false negative rate as well as the false positive rate. Results indicated that the proposed approach has reasonable statistical properties. Reproductive data from short-term chronic WET tests with Ceriodaphnia dubia were used to demonstrate the applications of the proposed approach. The data were collected by the North Carolina Department of Environment, Health, and Natural Resources (Raleigh, NC, USA) as part of their National Pollutant Discharge Elimination System program.
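
    A brief sketch of how an MSD can be computed for a two-sample comparison with a pooled error mean square, and then expressed as a percentage of the control mean for use as an acceptance criterion. The replicate counts, significance level, and reproduction data below are hypothetical and only illustrate the mechanics.

```python
import numpy as np
from scipy import stats

def msd_two_sample(mse, n_control, n_treatment, alpha=0.05, df=None):
    """Minimum significant difference for a one-sided two-sample comparison
    using a pooled error mean square (MSE)."""
    if df is None:
        df = n_control + n_treatment - 2
    t_crit = stats.t.ppf(1 - alpha, df)
    return t_crit * np.sqrt(mse * (1 / n_control + 1 / n_treatment))

# Hypothetical Ceriodaphnia dubia reproduction data (neonates per female).
control = np.array([24, 27, 22, 26, 25, 28, 23, 26, 25, 24])
treatment = np.array([21, 23, 20, 24, 22, 25, 19, 23, 22, 21])
mse = (control.var(ddof=1) * (len(control) - 1)
       + treatment.var(ddof=1) * (len(treatment) - 1)) / (len(control) + len(treatment) - 2)

msd = msd_two_sample(mse, len(control), len(treatment))
print(f"MSD = {msd:.2f} neonates; as % of control mean: {100 * msd / control.mean():.1f}%")
```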

  15. Airborne/Space-Based Doppler Lidar Wind Sounders Sampling the PBL and Other Regions of Significant Beta and U Inhomogeneities

    NASA Technical Reports Server (NTRS)

    Emmitt, Dave

    1998-01-01

    This final report covers the period from April 1994 through March 1998. The proposed research was organized under four main tasks: (1) investigate the vertical and horizontal velocity structures within and adjacent to thin and subvisual cirrus; (2) investigate the lowest 1 km of the PBL and develop algorithms for processing pulsed Doppler lidar data obtained from single shots into regions of significant inhomogeneities in Beta and U; (3) participate in OSSEs, including those designed to establish shot density requirements for meso-gamma scale phenomena with quasi-persistent locations (e.g., jets, lee waves, tropical storms); and (4) participate in the planning and execution of an airborne mission to measure winds with a pulsed CO2 Doppler lidar. Over the four-year period of this research contract, work on all four tasks has yielded significant results, which have led to 38 professional presentations (conferences and publications) and have been folded into the science justification for an approved NASA space mission, SPARCLE (SPAce Readiness Coherent Lidar Experiment), in 2001. Through Task 4, this research has also led to a funded proposal to work directly on a NASA field campaign, CAMEX III, in which an airborne Doppler wind lidar will be used to investigate the cloud-free circulations near tropical storms. Monthly progress reports required under this contract are on file. This final report highlights major accomplishments, including some that were not foreseen in the original proposal. The presentation of this final report includes this written document as well as material that is better presented via the internet (web pages). There is heavy reference to appended papers and documents; the main body of the report serves to summarize the key efforts and findings.