Sample records for predictive probability distributions

  1. Evaluation of the Three Parameter Weibull Distribution Function for Predicting Fracture Probability in Composite Materials

    DTIC Science & Technology

    1978-03-01

    for the risk of rupture for a unidirectionally laminated composite subjected to pure bending. This equation can be simplified further by use of... EVALUATION OF THE THREE PARAMETER WEIBULL DISTRIBUTION FUNCTION FOR PREDICTING FRACTURE PROBABILITY IN COMPOSITE MATERIALS. Thesis, AFIT/GAE...
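
    The snippet above is truncated, but the three-parameter Weibull form it refers to is standard. A minimal sketch, with all parameter values invented for illustration:

      import numpy as np

      # Three-parameter Weibull fracture probability:
      #   P_f(s) = 1 - exp(-((s - s_u)/s_0)**m) for s > s_u,
      # where s_u is the threshold (location) stress, s_0 the scale, and m the
      # Weibull modulus (shape). All values below are invented.
      s_u, s_0, m = 300.0, 450.0, 8.0            # MPa, MPa, dimensionless
      stress = np.array([350.0, 600.0, 800.0])   # applied stress, MPa
      p_f = 1.0 - np.exp(-((stress - s_u)/s_0)**m)
      print(p_f)   # fracture probability rises steeply once s exceeds s_u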

  2. Predicting the probability of slip in gait: methodology and distribution study.

    PubMed

    Gragg, Jared; Yang, James

    2016-01-01

    The likelihood of a slip is related to the available and required friction for a certain activity, here gait. Classical slip and fall analysis presumed that a walking surface was safe if the difference between the mean available and required friction coefficients exceeded a certain threshold. Previous research was dedicated to reformulating the classical slip and fall theory to include the stochastic variation of the available and required friction when predicting the probability of slip in gait. However, when predicting the probability of a slip, previous researchers have either ignored the variation in the required friction or assumed the available and required friction to be normally distributed. Also, there are no published results that actually give the probability of slip for various combinations of required and available frictions. This study proposes a modification to the equation for predicting the probability of slip, reducing the previous equation from a double-integral to a more convenient single-integral form. Also, a simple numerical integration technique is provided to predict the probability of slip in gait: the trapezoidal method. The effect of the random variable distributions on the probability of slip is also studied. It is shown that both the required and available friction distributions cannot automatically be assumed as being normally distributed. The proposed methods allow for any combination of distributions for the available and required friction, and numerical results are compared to analytical solutions for an error analysis. The trapezoidal method is shown to be highly accurate and efficient. The probability of slip is also shown to be sensitive to the input distributions of the required and available friction. Lastly, a critical value for the probability of slip is proposed based on the number of steps taken by an average person in a single day.
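
    A minimal sketch of the single-integral form described above, P(slip) = P(available < required), i.e. the integral of f_req(x)*F_avail(x) dx, evaluated with the trapezoidal rule. The distributions and parameters are invented; the paper's point is that neither needs to be normal.

      import numpy as np
      from scipy import stats
      from scipy.integrate import trapezoid

      # Invented friction distributions (deliberately not both normal)
      required = stats.lognorm(s=0.25, scale=0.20)   # required friction
      available = stats.norm(loc=0.50, scale=0.10)   # available friction

      # P(slip) = P(available < required) = integral of f_req(x)*F_avail(x) dx
      x = np.linspace(0.0, 1.5, 3001)
      p_slip = trapezoid(required.pdf(x) * available.cdf(x), x)
      print(f"P(slip per step) ~ {p_slip:.2e}")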

  3. Development and application of an empirical probability distribution for the prediction error of re-entry body maximum dynamic pressure

    NASA Technical Reports Server (NTRS)

    Lanzi, R. James; Vincent, Brett T.

    1993-01-01

    The relationship between actual and predicted re-entry maximum dynamic pressure is characterized using a probability density function and a cumulative distribution function derived from sounding rocket flight data. This paper explores the properties of this distribution and demonstrates applications of these data with observed sounding rocket re-entry body damage characteristics to assess probabilities of sustaining various levels of heating damage. The results from this paper effectively bridge the gap existing in sounding rocket re-entry analysis between the known damage level/flight environment relationships and the predicted flight environment.

  4. SU-F-T-450: The Investigation of Radiotherapy Quality Assurance and Automatic Treatment Planning Based On the Kernel Density Estimation Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fan, J; Fan, J; Hu, W

    Purpose: To develop a fast automatic algorithm based on the two dimensional kernel density estimation (2D KDE) to predict the dose-volume histogram (DVH) which can be employed for the investigation of radiotherapy quality assurance and automatic treatment planning. Methods: We propose a machine learning method that uses previous treatment plans to predict the DVH. The key to the approach is the framing of the DVH in a probabilistic setting. The training consists of estimating, from the patients in the training set, the joint probability distribution of the dose and the predictive features. The joint distribution provides an estimation of the conditional probability of the dose given the values of the predictive features. For the new patient, the prediction consists of estimating the distribution of the predictive features and marginalizing the conditional probability from the training over this. Integrating the resulting probability distribution for the dose yields an estimation of the DVH. The 2D KDE is implemented to predict the joint probability distribution of the training set and the distribution of the predictive features for the new patient. Two variables, the signed minimal distance from each OAR (organ at risk) voxel to the target boundary and its opening angle with respect to the origin of the voxel coordinate system, are considered as the predictive features to represent the OAR-target spatial relationship. The feasibility of our method has been demonstrated with rectum, breast, and head-and-neck cancer cases by comparing the predicted DVHs with the planned ones. Results: Consistent results were found between the two DVHs for each cancer type, and the average relative point-wise difference is about 5%, within the clinically acceptable range. Conclusion: According to the results of this study, our method can be used to predict a clinically acceptable DVH and to evaluate the quality and consistency of treatment planning.

  5. Characterising RNA secondary structure space using information entropy

    PubMed Central

    2013-01-01

    Comparative methods for RNA secondary structure prediction use evolutionary information from RNA alignments to increase prediction accuracy. The model is often described in terms of stochastic context-free grammars (SCFGs), which generate a probability distribution over secondary structures. It is, however, unclear how this probability distribution changes as a function of the input alignment. As prediction programs typically only return a single secondary structure, better characterisation of the underlying probability space of RNA secondary structures is of great interest. In this work, we show how to efficiently compute the information entropy of the probability distribution over RNA secondary structures produced for RNA alignments by a phylo-SCFG, and implement it for the PPfold model. We also discuss interpretations and applications of this quantity, including how it can clarify reasons for low prediction reliability scores. PPfold and its source code are available from http://birc.au.dk/software/ppfold/. PMID:23368905
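
    A toy illustration of the quantity computed above: the Shannon entropy of a probability distribution over candidate secondary structures. The five probabilities are invented; PPfold derives the actual distribution from a phylo-SCFG applied to the input alignment.

      import numpy as np

      p = np.array([0.55, 0.20, 0.15, 0.07, 0.03])  # invented structure posteriors
      entropy = -np.sum(p * np.log2(p))             # Shannon entropy in bits
      print(f"H = {entropy:.2f} bits")  # near 0: one dominant structure;
                                        # large H: low prediction reliability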

  6. Predicting bottlenose dolphin distribution along Liguria coast (northwestern Mediterranean Sea) through different modeling techniques and indirect predictors.

    PubMed

    Marini, C; Fossa, F; Paoli, C; Bellingeri, M; Gnone, G; Vassallo, P

    2015-03-01

    Habitat modeling is an important tool to investigate the quality of the habitat for a species within a certain area, to predict species distribution and to understand the ecological processes behind it. Many species have been investigated by means of habitat modeling techniques mainly to address effective management and protection policies and cetaceans play an important role in this context. The bottlenose dolphin (Tursiops truncatus) has been investigated with habitat modeling techniques since 1997. The objectives of this work were to predict the distribution of bottlenose dolphin in a coastal area through the use of static morphological features and to compare the prediction performances of three different modeling techniques: Generalized Linear Model (GLM), Generalized Additive Model (GAM) and Random Forest (RF). Four static variables were tested: depth, bottom slope, distance from 100 m bathymetric contour and distance from coast. RF proved to be both the most accurate and the most precise modeling technique, with very high predicted probabilities in presence cells (mean predicted probability of 90.4%) and with 66.7% of presence cells having a predicted probability between 90% and 100%. The bottlenose dolphin distribution obtained with RF allowed the identification of specific areas with particularly high presence probability along the coastal zone; the recognition of these core areas may be the starting point to develop effective management practices to improve T. truncatus protection. Copyright © 2014 Elsevier Ltd. All rights reserved.
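
    A hedged sketch of the Random Forest presence-probability step with the paper's four static predictors. The survey data are not reproduced here, so presences come from an invented rule purely to make the example run.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      rng = np.random.default_rng(0)
      n = 400
      X = np.column_stack([
          rng.uniform(0, 500, n),   # depth (m)
          rng.uniform(0, 15, n),    # bottom slope (degrees)
          rng.uniform(0, 20, n),    # distance from the 100 m contour (km)
          rng.uniform(0, 30, n),    # distance from the coast (km)
      ])
      y = (X[:, 0] < 150) & (X[:, 3] < 10)   # toy presence/absence labels

      rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
      print(rf.predict_proba(X[:5])[:, 1])   # predicted presence probabilities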

  7. Use of the negative binomial-truncated Poisson distribution in thunderstorm prediction

    NASA Technical Reports Server (NTRS)

    Cohen, A. C.

    1971-01-01

    A probability model is presented for the distribution of thunderstorms over a small area given that thunderstorm events (1 or more thunderstorms) are occurring over a larger area. The model incorporates the negative binomial and truncated Poisson distributions. Probability tables for Cape Kennedy for spring, summer, and fall months and seasons are presented. The computer program used to compute these probabilities is appended.
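
    A sketch of the two building blocks named above: a zero-truncated Poisson for the small-area thunderstorm count given that an event (1 or more thunderstorms) is occurring, and a negative binomial for counts over the larger area. All parameter values are invented.

      from scipy import stats

      lam = 1.8          # Poisson mean before truncation (invented)
      n, p = 4.0, 0.55   # negative binomial parameters (invented)

      def zero_truncated_poisson_pmf(k, lam):
          # P(K = k | K >= 1) for K ~ Poisson(lam)
          return stats.poisson.pmf(k, lam) / (1.0 - stats.poisson.pmf(0, lam))

      print([round(zero_truncated_poisson_pmf(k, lam), 4) for k in range(1, 6)])
      print(stats.nbinom.pmf(range(5), n, p).round(4))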

  8. Dose-volume histogram prediction using density estimation.

    PubMed

    Skarpman Munter, Johanna; Sjölund, Jens

    2015-09-07

    Knowledge of what dose-volume histograms can be expected for a previously unseen patient could increase consistency and quality in radiotherapy treatment planning. We propose a machine learning method that uses previous treatment plans to predict such dose-volume histograms. The key to the approach is the framing of dose-volume histograms in a probabilistic setting. The training consists of estimating, from the patients in the training set, the joint probability distribution of some predictive features and the dose. The joint distribution immediately provides an estimate of the conditional probability of the dose given the values of the predictive features. The prediction consists of estimating, from the new patient, the distribution of the predictive features and marginalizing the conditional probability from the training over this. Integrating the resulting probability distribution for the dose yields an estimate of the dose-volume histogram. To illustrate how the proposed method relates to previously proposed methods, we use the signed distance to the target boundary as a single predictive feature. As a proof-of-concept, we predicted dose-volume histograms for the brainstems of 22 acoustic schwannoma patients treated with stereotactic radiosurgery, and for the lungs of 9 lung cancer patients treated with stereotactic body radiation therapy. Comparing with two previous attempts at dose-volume histogram prediction, we find that, given the same input data, the predictions are similar. In summary, we propose a method for dose-volume histogram prediction that exploits the intrinsic probabilistic properties of dose-volume histograms. We argue that the proposed method makes up for some deficiencies in previously proposed methods, thereby potentially increasing ease of use, flexibility and ability to perform well with small amounts of training data.
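
    A condensed sketch of this pipeline with the signed distance to the target boundary as the single predictive feature. All data are synthetic; only the structure of the computation follows the paper.

      import numpy as np
      from scipy.stats import gaussian_kde
      from scipy.integrate import cumulative_trapezoid

      rng = np.random.default_rng(0)

      # Synthetic training pairs (distance to target, dose) pooled over the
      # OAR voxels of previous plans.
      dist = rng.uniform(0.0, 40.0, 5000)                                   # mm
      dose = np.clip(60*np.exp(-dist/8) + rng.normal(0, 2, 5000), 0, None)  # Gy

      joint = gaussian_kde(np.vstack([dist, dose]))  # joint p(distance, dose)
      feat = gaussian_kde(dist)                      # marginal p(distance)

      new_dist = rng.uniform(0.0, 40.0, 1500)  # new patient's feature values

      # Marginalize p(dose | distance) over the new patient's feature
      # distribution, then normalize to a density on a dose grid.
      dgrid = np.linspace(0.0, 70.0, 141)
      pd = np.zeros_like(dgrid)
      for x in new_dist[::15]:                 # subsample to keep this quick
          pd += joint(np.vstack([np.full_like(dgrid, x), dgrid])) / feat(x)
      pd /= pd.sum() * (dgrid[1] - dgrid[0])

      # DVH(d): predicted fraction of OAR volume receiving at least dose d.
      dvh = 1.0 - cumulative_trapezoid(pd, dgrid, initial=0.0)
      print(f"predicted V20 ~ {dvh[dgrid.searchsorted(20.0)]:.2f}")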

  9. Gas Hydrate Formation Probability Distributions: The Effect of Shear and Comparisons with Nucleation Theory.

    PubMed

    May, Eric F; Lim, Vincent W; Metaxas, Peter J; Du, Jianwei; Stanwix, Paul L; Rowland, Darren; Johns, Michael L; Haandrikman, Gert; Crosby, Daniel; Aman, Zachary M

    2018-03-13

    Gas hydrate formation is a stochastic phenomenon of considerable significance for any risk-based approach to flow assurance in the oil and gas industry. In principle, well-established results from nucleation theory offer the prospect of predictive models for hydrate formation probability in industrial production systems. In practice, however, heuristics are relied on when estimating formation risk for a given flowline subcooling or when quantifying kinetic hydrate inhibitor (KHI) performance. Here, we present statistically significant measurements of formation probability distributions for natural gas hydrate systems under shear, which are quantitatively compared with theoretical predictions. Distributions with over 100 points were generated using low-mass, Peltier-cooled pressure cells, cycled in temperature between 40 and -5 °C at up to 2 K·min^-1, and analyzed with robust algorithms that automatically identify hydrate formation and initial growth rates from dynamic pressure data. The application of shear had a significant influence on the measured distributions: at 700 rpm, mass-transfer limitations were minimal, as demonstrated by the kinetic growth rates observed. The formation probability distributions measured at this shear rate had mean subcoolings consistent with theoretical predictions and steel-hydrate-water contact angles of 14-26°. However, the experimental distributions were substantially wider than predicted, suggesting that phenomena acting on macroscopic length scales are responsible for much of the observed stochastic formation. Performance tests of a KHI provided new insights into how such chemicals can reduce the risk of hydrate blockage in flowlines. Our data demonstrate that the KHI not only reduces the probability of formation (by both shifting and sharpening the distribution) but also reduces hydrate growth rates by a factor of 2.

  10. Predictions of malaria vector distribution in Belize based on multispectral satellite data.

    PubMed

    Roberts, D R; Paris, J F; Manguin, S; Harbach, R E; Woodruff, R; Rejmankova, E; Polanco, J; Wullschleger, B; Legters, L J

    1996-03-01

    Use of multispectral satellite data to predict arthropod-borne disease trouble spots is dependent on clear understandings of environmental factors that determine the presence of disease vectors. A blind test of remote sensing-based predictions for the spatial distribution of a malaria vector, Anopheles pseudopunctipennis, was conducted as a follow-up to two years of studies on vector-environmental relationships in Belize. Four of eight sites that were predicted to be high probability locations for presence of An. pseudopunctipennis were positive and all low probability sites (0 of 12) were negative. The absence of An. pseudopunctipennis at four high probability locations probably reflects the low densities that seem to characterize field populations of this species, i.e., the population densities were below the threshold of our sampling effort. Another important malaria vector, An. darlingi, was also present at all high probability sites and absent at all low probability sites. Anopheles darlingi, like An. pseudopunctipennis, is a riverine species. Prior to these collections at ecologically defined locations, this species was last detected in Belize in 1946.

  11. Predictions of malaria vector distribution in Belize based on multispectral satellite data

    NASA Technical Reports Server (NTRS)

    Roberts, D. R.; Paris, J. F.; Manguin, S.; Harbach, R. E.; Woodruff, R.; Rejmankova, E.; Polanco, J.; Wullschleger, B.; Legters, L. J.

    1996-01-01

    Use of multispectral satellite data to predict arthropod-borne disease trouble spots is dependent on clear understandings of environmental factors that determine the presence of disease vectors. A blind test of remote sensing-based predictions for the spatial distribution of a malaria vector, Anopheles pseudopunctipennis, was conducted as a follow-up to two years of studies on vector-environmental relationships in Belize. Four of eight sites that were predicted to be high probability locations for presence of An. pseudopunctipennis were positive and all low probability sites (0 of 12) were negative. The absence of An. pseudopunctipennis at four high probability locations probably reflects the low densities that seem to characterize field populations of this species, i.e., the population densities were below the threshold of our sampling effort. Another important malaria vector, An. darlingi, was also present at all high probability sites and absent at all low probability sites. Anopheles darlingi, like An. pseudopunctipennis, is a riverine species. Prior to these collections at ecologically defined locations, this species was last detected in Belize in 1946.

  12. The utility of Bayesian predictive probabilities for interim monitoring of clinical trials

    PubMed Central

    Connor, Jason T.; Ayers, Gregory D; Alvarez, JoAnn

    2014-01-01

    Background: Bayesian predictive probabilities can be used for interim monitoring of clinical trials to estimate the probability of observing a statistically significant treatment effect if the trial were to continue to its predefined maximum sample size. Purpose: We explore settings in which Bayesian predictive probabilities are advantageous for interim monitoring compared to Bayesian posterior probabilities, p-values, conditional power, or group sequential methods. Results: For interim analyses that address prediction hypotheses, such as futility monitoring and efficacy monitoring with lagged outcomes, only predictive probabilities properly account for the amount of data remaining to be observed in a clinical trial and have the flexibility to incorporate additional information via auxiliary variables. Limitations: Computational burdens limit the feasibility of predictive probabilities in many clinical trial settings. The specification of prior distributions brings additional challenges for regulatory approval. Conclusions: The use of Bayesian predictive probabilities enables the choice of logical interim stopping rules that closely align with the clinical decision-making process. PMID:24872363
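
    A minimal sketch of a Bayesian predictive probability for futility monitoring in a single-arm binomial trial; the prior, interim data, and decision rule are all hypothetical.

      from scipy import stats

      a0, b0 = 1, 1                      # Beta prior
      n_max, n_obs, x_obs = 60, 30, 12   # planned size, interim size, responders
      p0, alpha = 0.30, 0.05             # null response rate, one-sided level

      a, b = a0 + x_obs, b0 + n_obs - x_obs   # interim posterior Beta(a, b)
      n_rem = n_max - n_obs

      pp = 0.0
      for y in range(n_rem + 1):
          # beta-binomial predictive probability of y further responders
          py = stats.betabinom.pmf(y, n_rem, a, b)
          # would the final data reject H0: p <= p0 by an exact binomial test?
          final = stats.binomtest(x_obs + y, n_max, p0, alternative="greater")
          pp += py * (final.pvalue < alpha)
      print(f"predictive probability of final success: {pp:.3f}")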

  13. Mathematical Model to estimate the wind power using four-parameter Burr distribution

    NASA Astrophysics Data System (ADS)

    Liu, Sanming; Wang, Zhijie; Pan, Zhaoxu

    2018-03-01

    When the true probability distribution of wind speed at a given location needs to be described, the four-parameter Burr distribution is more suitable than other distributions. This paper introduces its important properties and characteristics. The application of the four-parameter Burr distribution to wind speed prediction is also discussed, and an expression for the probability distribution of the output power of a wind turbine is derived.
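
    A hedged sketch of how such an expression can be used numerically: scipy's Burr XII distribution with location and scale gives a four-parameter family, which is pushed through a simple turbine power curve. All parameters, including the power-curve constants, are invented.

      import numpy as np
      from scipy import stats
      from scipy.integrate import quad

      c, d, loc, scale = 4.0, 1.2, 0.0, 8.0    # invented Burr XII parameters
      wind = stats.burr12(c, d, loc=loc, scale=scale)

      def power_kw(v, v_in=3.0, v_rated=12.0, v_out=25.0, p_rated=2000.0):
          # cubic ramp between cut-in and rated, zero outside [v_in, v_out]
          if v < v_in or v > v_out:
              return 0.0
          return p_rated * min((v / v_rated)**3, 1.0)

      expected_kw, _ = quad(lambda v: power_kw(v) * wind.pdf(v), 0.0, 40.0,
                            points=[3.0, 12.0, 25.0])
      print(f"expected output ~ {expected_kw:.0f} kW")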

  14. Reward skewness coding in the insula independent of probability and loss

    PubMed Central

    Tobler, Philippe N.

    2011-01-01

    Rewards in the natural environment are rarely predicted with complete certainty. Uncertainty relating to future rewards has typically been defined as the variance of the potential outcomes. However, the asymmetry of predicted reward distributions, known as skewness, constitutes a distinct but neuroscientifically underexplored risk term that may also have an impact on preference. By changing only reward magnitudes, we study skewness processing in equiprobable ternary lotteries involving only gains and constant probabilities, thus excluding probability distortion or loss aversion as mechanisms for skewness preference formation. We show that individual preferences are sensitive to not only the mean and variance but also to the skewness of predicted reward distributions. Using neuroimaging, we show that the insula, a structure previously implicated in the processing of reward-related uncertainty, responds to the skewness of predicted reward distributions. Some insula responses increased in a monotonic fashion with skewness (irrespective of individual skewness preferences), whereas others were similarly elevated to both negative and positive as opposed to no reward skew. These data support the notion that the asymmetry of reward distributions is processed in the brain and, taken together with replicated findings of mean coding in the striatum and variance coding in the cingulate, suggest that the brain codes distinct aspects of reward distributions in a distributed fashion. PMID:21849610
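
    A worked example of the lottery design described above: two equiprobable ternary lotteries (gains only, magnitudes invented) matched in mean and variance but opposite in skewness.

      import numpy as np
      from scipy.stats import skew

      pos = np.array([10.0, 10.0, 40.0])   # rare large gain: positive skew
      neg = np.array([0.0, 30.0, 30.0])    # rare small gain: negative skew
      for name, lot in (("positive", pos), ("negative", neg)):
          print(f"{name}: mean={lot.mean():.0f}, var={lot.var():.0f}, "
                f"skew={skew(lot):+.2f}")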

  15. Modeling potential distribution of Oligoryzomys longicaudatus, the Andes virus (Genus: Hantavirus) reservoir, in Argentina.

    PubMed

    Andreo, Verónica; Glass, Gregory; Shields, Timothy; Provensal, Cecilia; Polop, Jaime

    2011-09-01

    We constructed a model to predict the potential distribution of Oligoryzomys longicaudatus, the reservoir of Andes virus (Genus: Hantavirus), in Argentina. We developed an extensive database of occurrence records from published studies and our own surveys and compared two methods to model the probability of O. longicaudatus presence; logistic regression and the MaxEnt algorithm. The environmental variables used were tree, grass and bare soil cover from MODIS imagery and, altitude and 19 bioclimatic variables from the WorldClim database. The models' performances were evaluated and compared by both threshold-dependent and threshold-independent measures. The best models included tree and grass cover, mean diurnal temperature range, and precipitation of the warmest and coldest seasons. The potential distribution maps for O. longicaudatus predicted the highest occurrence probabilities along the Andes range, from 32°S and narrowing southwards. They also predicted high probabilities for the south-central area of Argentina, reaching the Atlantic coast. The Hantavirus Pulmonary Syndrome cases coincided with mean occurrence probabilities of 95 and 77% for logistic and MaxEnt models, respectively. HPS transmission zones in Argentine Patagonia matched the areas with the highest probability of presence. Therefore, colilargo presence probability may provide an approximate risk of transmission and act as an early tool to guide control and prevention plans.

  16. On the use of posterior predictive probabilities and prediction uncertainty to tailor informative sampling for parasitological surveillance in livestock.

    PubMed

    Musella, Vincenzo; Rinaldi, Laura; Lagazio, Corrado; Cringoli, Giuseppe; Biggeri, Annibale; Catelan, Dolores

    2014-09-15

    Model-based geostatistics and Bayesian approaches are appropriate in the context of Veterinary Epidemiology when point data have been collected by valid study designs. The aim is to predict a continuous infection risk surface. Little work has been done on the use of predictive infection probabilities at farm unit level. In this paper we show how to use predictive infection probability and related uncertainty from a Bayesian kriging model to draw informative samples from the 8794 geo-referenced sheep farms of the Campania region (southern Italy). Parasitological data come from a first cross-sectional survey carried out to study the spatial distribution of selected helminths in sheep farms. A grid sampling was performed to select the farms for coprological examinations. Faecal samples were collected for 121 sheep farms and the presence of 21 different helminths was investigated using the FLOTAC technique. The 21 responses are very different in terms of geographical distribution and prevalence of infection. The observed prevalence ranges from 0.83% to 96.69%. The distributions of the posterior predictive probabilities for all the 21 parasites are very heterogeneous. We show how the results of the Bayesian kriging model can be used to plan a second wave survey. Several alternatives can be chosen depending on the purposes of the second survey: weighting by posterior predictive probabilities, by their uncertainty, or by a combination of both. The proposed Bayesian kriging model is simple, and the proposed sampling strategy represents a useful tool to address targeted infection control treatments and surveillance campaigns. It is easily extendable to other fields of research. Copyright © 2014 Elsevier B.V. All rights reserved.

  17. Effect of Individual Component Life Distribution on Engine Life Prediction

    NASA Technical Reports Server (NTRS)

    Zaretsky, Erwin V.; Hendricks, Robert C.; Soditus, Sherry M.

    2003-01-01

    The effect of individual engine component life distributions on engine life prediction was determined. A Weibull-based life and reliability analysis of the NASA Energy Efficient Engine was conducted. The engine's life at a 95 and 99.9 percent probability of survival was determined based upon the engine manufacturer's original life calculations and assumed values of each component's cumulative life distribution as represented by a Weibull slope. The lives of the high-pressure turbine (HPT) disks and blades were also evaluated individually and as a system in a similar manner. Knowing the statistical cumulative distribution of each engine component with reasonable engineering certainty is a condition precedent to predicting the life and reliability of an entire engine. The life of a system at a given reliability will be less than the lowest-lived component in the system at the same reliability (probability of survival). Where the Weibull slopes of all the engine components are equal, the Weibull slope had a minimal effect on engine L0.1 life prediction. However, at a probability of survival of 95 percent (L5 life), life decreased with increasing Weibull slope.
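
    A minimal sketch of the underlying reliability arithmetic: with Weibull components in series, system survival is the product of component survivals, and the L5 life is the time at which system survival drops to 95 percent. The characteristic lives and slopes below are invented, not the NASA Energy Efficient Engine values.

      import numpy as np
      from scipy.optimize import brentq

      # (characteristic life eta in hours, Weibull slope beta) per component
      components = [(9000.0, 2.0), (12000.0, 1.5), (15000.0, 3.0)]

      def system_survival(t):
          return np.prod([np.exp(-(t/eta)**beta) for eta, beta in components])

      # L5 life: time at which 5 percent of engines are expected to have failed
      l5 = brentq(lambda t: system_survival(t) - 0.95, 1.0, 20000.0)
      print(f"system L5 life ~ {l5:.0f} h")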

  18. A probabilistic approach to photovoltaic generator performance prediction

    NASA Astrophysics Data System (ADS)

    Khallat, M. A.; Rahman, S.

    1986-09-01

    A method for predicting the performance of a photovoltaic (PV) generator based on long term climatological data and expected cell performance is described. The equations for cell model formulation are provided. Use of the statistical model for characterizing the insolation level is discussed. The insolation data is fitted to appropriate probability distribution functions (Weibull, beta, normal). The probability distribution functions are utilized to evaluate the capacity factors of PV panels or arrays. An example is presented revealing the applicability of the procedure.
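
    A hedged sketch of the capacity-factor step: assume a fitted Weibull insolation distribution and integrate a simple linear cell model against it. All numbers are invented.

      import numpy as np
      from scipy import stats
      from scipy.integrate import quad

      k, scale = 1.8, 550.0           # assumed Weibull insolation fit (W/m^2)
      g_rated = 1000.0                # insolation that yields rated output
      insolation = stats.weibull_min(k, scale=scale)

      power_frac = lambda g: min(g / g_rated, 1.0)   # linear up to rating
      cf, _ = quad(lambda g: power_frac(g) * insolation.pdf(g), 0.0, 3000.0,
                   points=[g_rated])
      print(f"daylight capacity factor ~ {cf:.2f}")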

  19. Predictive probability methods for interim monitoring in clinical trials with longitudinal outcomes.

    PubMed

    Zhou, Ming; Tang, Qi; Lang, Lixin; Xing, Jun; Tatsuoka, Kay

    2018-04-17

    In clinical research and development, interim monitoring is critical for better decision-making and minimizing the risk of exposing patients to possibly ineffective therapies. For interim futility or efficacy monitoring, predictive probability methods are widely adopted in practice. These methods have been well studied for univariate variables. However, for longitudinal studies, predictive probability methods using univariate information from only completers may not be the most efficient, and data from ongoing subjects can be utilized to improve efficiency. On the other hand, leveraging information from ongoing subjects could allow an interim analysis to be conducted once a sufficient number of subjects reach an earlier time point. For longitudinal outcomes, we derive closed-form formulas for predictive probabilities, including the Bayesian predictive probability, predictive power, and conditional power, and also give closed-form solutions for the predictive probability of success in a future trial and the predictive probability of success of the best dose. When predictive probabilities are used for interim monitoring, we study their distributions and discuss their analytical cutoff values or stopping boundaries that have desired operating characteristics. We show that predictive probabilities utilizing all longitudinal information are more efficient for interim monitoring than those using information from completers only. To illustrate their practical application for longitudinal data, we analyze 2 real data examples from clinical trials. Copyright © 2018 John Wiley & Sons, Ltd.

  20. Void probability as a function of the void's shape and scale-invariant models [in studies of spatial galactic distribution]

    NASA Technical Reports Server (NTRS)

    Elizalde, E.; Gaztanaga, E.

    1992-01-01

    The dependence of counts in cells on the shape of the cell for the large scale galaxy distribution is studied. A very concrete prediction can be made concerning the void distribution for scale-invariant models. The prediction is tested on a sample of the CfA catalog, and good agreement is found. It is observed that the probability that a cell is occupied is higher for certain elongated cells. A phenomenological scale-invariant model for the observed distribution of the counts in cells, an extension of the negative binomial distribution, is presented in order to illustrate how this dependence can be quantitatively determined. An original, intuitive derivation of this model is presented.
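
    The extension mentioned above builds on the negative binomial count model, for which the void probability (the probability of an empty cell) is simply the probability mass at zero. A small numeric illustration with invented parameters:

      from scipy import stats

      nbar, k = 2.5, 1.3   # invented mean count and clustering parameter
      p_void = stats.nbinom.pmf(0, k, k/(k + nbar))
      print(f"P(void) = {p_void:.3f}")   # equals (k/(k + nbar))**k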

  1. Void probability as a function of the void's shape and scale-invariant models

    NASA Technical Reports Server (NTRS)

    Elizalde, E.; Gaztanaga, E.

    1991-01-01

    The dependence of counts in cells on the shape of the cell for the large scale galaxy distribution is studied. A very concrete prediction can be made concerning the void distribution for scale-invariant models. The prediction is tested on a sample of the CfA catalog, and good agreement is found. It is observed that the probability that a cell is occupied is higher for certain elongated cells. A phenomenological scale-invariant model for the observed distribution of the counts in cells, an extension of the negative binomial distribution, is presented in order to illustrate how this dependence can be quantitatively determined. An original, intuitive derivation of this model is presented.

  2. Conflict Probability Estimation for Free Flight

    NASA Technical Reports Server (NTRS)

    Paielli, Russell A.; Erzberger, Heinz

    1996-01-01

    The safety and efficiency of free flight will benefit from automated conflict prediction and resolution advisories. Conflict prediction is based on trajectory prediction, however, and becomes less certain the farther in advance the prediction is made. An estimate is therefore needed of the probability that a conflict will occur, given a pair of predicted trajectories and their levels of uncertainty. A method is developed in this paper to estimate that conflict probability. The trajectory prediction errors are modeled as normally distributed, and the two error covariances for an aircraft pair are combined into a single equivalent covariance of the relative position. A coordinate transformation is then used to derive an analytical solution. Numerical examples and Monte Carlo validation are presented.
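
    A sketch of the combined-covariance idea, checked here by Monte Carlo rather than the paper's analytical solution. All covariances and positions are invented.

      import numpy as np

      rng = np.random.default_rng(1)

      # Prediction-error covariances of the two aircraft add to give the
      # covariance of the relative position (nm^2, along/cross-track).
      cov_rel = np.diag([1.0, 0.3]) + np.diag([1.4, 0.4])
      mean_rel = np.array([4.0, 1.0])   # predicted relative position (nm)
      sep = 5.0                         # required horizontal separation (nm)

      samples = rng.multivariate_normal(mean_rel, cov_rel, 200_000)
      p_conflict = np.mean(np.linalg.norm(samples, axis=1) < sep)
      print(f"P(conflict) ~ {p_conflict:.3f}")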

  3. Radial particle distributions in PARMILA simulation beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boicourt, G.P.

    1984-03-01

    The estimation of beam spill in particle accelerators is becoming of greater importance as higher current designs are being funded. To date, no numerical method for predicting beam spill has been available. In this paper, we present an approach to the loss-estimation problem that uses probability distributions fitted to particle-simulation beams. The properties of the PARMILA code's radial particle distribution are discussed, and a broad class of probability distributions are examined to check their ability to fit it. The possibility that the PARMILA distribution is a mixture is discussed, and a fitting distribution consisting of a mixture of two generalized gamma distributions is found. An efficient algorithm to accomplish the fit is presented. Examples of the relative prediction of beam spill are given. 26 references, 18 figures, 1 table.
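
    A hedged sketch of how such a fitted mixture extrapolates the halo tail: the spill fraction beyond an aperture radius is the weighted survival function of the two generalized gamma components. All parameters are invented, not fitted PARMILA values.

      from scipy import stats

      core = stats.gengamma(a=2.0, c=1.5, scale=1.0)   # core component
      halo = stats.gengamma(a=1.2, c=1.0, scale=3.0)   # halo component
      w = 0.95                                         # core weight

      r_aperture = 8.0   # aperture radius, in units of the core scale
      p_spill = w * core.sf(r_aperture) + (1 - w) * halo.sf(r_aperture)
      print(f"predicted fractional beam spill ~ {p_spill:.2e}")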

  4. Predicting the cosmological constant with the scale-factor cutoff measure

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Simone, Andrea; Guth, Alan H.; Salem, Michael P.

    2008-09-15

    It is well known that anthropic selection from a landscape with a flat prior distribution of cosmological constant Λ gives a reasonable fit to observation. However, a realistic model of the multiverse has a physical volume that diverges with time, and the predicted distribution of Λ depends on how the spacetime volume is regulated. A very promising method of regulation uses a scale-factor cutoff, which avoids a number of serious problems that arise in other approaches. In particular, the scale-factor cutoff avoids the 'youngness problem' (high probability of living in a much younger universe) and the 'Q and G catastrophes' (high probability for the primordial density contrast Q and gravitational constant G to have extremely large or small values). We apply the scale-factor cutoff measure to the probability distribution of Λ, considering both positive and negative values. The results are in good agreement with observation. In particular, the scale-factor cutoff strongly suppresses the probability for values of Λ that are more than about 10 times the observed value. We also discuss qualitatively the prediction for the density parameter Ω, indicating that with this measure there is a possibility of detectable negative curvature.

  5. Predicting probability of occurrence and factors affecting distribution and abundance of three Ozark endemic crayfish species at multiple spatial scales

    USGS Publications Warehouse

    Nolen, Matthew S.; Magoulick, Daniel D.; DiStefano, Robert J.; Imhoff, Emily M.; Wagner, Brian K.

    2014-01-01

    We found that a range of environmental variables were important in predicting crayfish distribution and abundance at multiple spatial scales, and that their importance was species-, response variable-, and scale-dependent. We would encourage others to examine the influence of spatial scale on species distribution and abundance patterns.

  6. Study of sea-surface slope distribution and its effect on radar backscatter based on Global Precipitation Measurement Ku-band precipitation radar measurements

    NASA Astrophysics Data System (ADS)

    Yan, Qiushuang; Zhang, Jie; Fan, Chenqing; Wang, Jing; Meng, Junmin

    2018-01-01

    The collocated normalized radar backscattering cross-section measurements from the Global Precipitation Measurement (GPM) Ku-band precipitation radar (KuPR) and the winds from moored buoys are used to study the effect of different sea-surface slope probability density functions (PDFs), including the Gaussian PDF, the Gram-Charlier PDF, and the Liu PDF, on the geometrical optics (GO) model predictions of the radar backscatter at low incidence angles (0 deg to 18 deg) at different sea states. First, the peakedness coefficient in the Liu distribution is determined using the collocations at the normal incidence angle, and the results indicate that the peakedness coefficient is a nonlinear function of the wind speed. Then, the performances of the modified Liu distribution (i.e., the Liu distribution using the obtained peakedness coefficient estimate), the Gaussian distribution, and the Gram-Charlier distribution are analyzed. The results show that the GO model predictions with the modified Liu distribution agree best with the KuPR measurements, followed by the predictions with the Gaussian distribution, while the predictions with the Gram-Charlier distribution show larger differences when the total or the slick-filtered, rather than the radar-filtered, probability density is included in the distribution. The best-performing distribution changes with both incidence angle and wind speed.
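
    For reference, one standard form of the GO prediction for an isotropic Gaussian slope PDF is sigma0(theta) = |R|^2 / (2 s^2 cos^4 theta) * exp(-tan^2 theta / (2 s^2)), with s^2 the per-component filtered slope variance. A quick evaluation with invented constants:

      import numpy as np

      R2 = 0.62    # assumed effective |R(0)|^2 at Ku band
      s2 = 0.025   # assumed per-component filtered slope variance
      theta = np.radians(np.arange(0, 19, 3))   # 0 to 18 deg
      sigma0 = R2/(2*s2*np.cos(theta)**4) * np.exp(-np.tan(theta)**2/(2*s2))
      print(10*np.log10(sigma0))   # dB; decreases away from nadir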

  7. Universal characteristics of fractal fluctuations in prime number distribution

    NASA Astrophysics Data System (ADS)

    Selvam, A. M.

    2014-11-01

    The frequency of occurrence of prime numbers at unit number spacing intervals exhibits self-similar fractal fluctuations concomitant with inverse power law form for power spectrum generic to dynamical systems in nature such as fluid flows, stock market fluctuations and population dynamics. The physics of long-range correlations exhibited by fractals is not yet identified. A recently developed general systems theory visualizes the eddy continuum underlying fractals to result from the growth of large eddies as the integrated mean of enclosed small scale eddies, thereby generating a hierarchy of eddy circulations or an inter-connected network with associated long-range correlations. The model predictions are as follows: (1) The probability distribution and power spectrum of fractals follow the same inverse power law which is a function of the golden mean. The predicted inverse power law distribution is very close to the statistical normal distribution for fluctuations within two standard deviations from the mean of the distribution. (2) Fractals signify quantum-like chaos since variance spectrum represents probability density distribution, a characteristic of quantum systems such as electron or photon. (3) Fractal fluctuations of frequency distribution of prime numbers signify spontaneous organization of underlying continuum number field into the ordered pattern of the quasiperiodic Penrose tiling pattern. The model predictions are in agreement with the probability distributions and power spectra for different sets of frequency of occurrence of prime numbers at unit number interval for successive 1000 numbers. Prime numbers in the first 10 million numbers were used for the study.

  8. Neighbor-Dependent Ramachandran Probability Distributions of Amino Acids Developed from a Hierarchical Dirichlet Process Model

    PubMed Central

    Mitra, Rajib; Jordan, Michael I.; Dunbrack, Roland L.

    2010-01-01

    Distributions of the backbone dihedral angles of proteins have been studied for over 40 years. While many statistical analyses have been presented, only a handful of probability densities are publicly available for use in structure validation and structure prediction methods. The available distributions differ in a number of important ways, which determine their usefulness for various purposes. These include: 1) input data size and criteria for structure inclusion (resolution, R-factor, etc.); 2) filtering of suspect conformations and outliers using B-factors or other features; 3) secondary structure of input data (e.g., whether helix and sheet are included; whether beta turns are included); 4) the method used for determining probability densities ranging from simple histograms to modern nonparametric density estimation; and 5) whether they include nearest neighbor effects on the distribution of conformations in different regions of the Ramachandran map. In this work, Ramachandran probability distributions are presented for residues in protein loops from a high-resolution data set with filtering based on calculated electron densities. Distributions for all 20 amino acids (with cis and trans proline treated separately) have been determined, as well as 420 left-neighbor and 420 right-neighbor dependent distributions. The neighbor-independent and neighbor-dependent probability densities have been accurately estimated using Bayesian nonparametric statistical analysis based on the Dirichlet process. In particular, we used hierarchical Dirichlet process priors, which allow sharing of information between densities for a particular residue type and different neighbor residue types. The resulting distributions are tested in a loop modeling benchmark with the program Rosetta, and are shown to improve protein loop conformation prediction significantly. The distributions are available at http://dunbrack.fccc.edu/hdp. PMID:20442867

  9. Redundancy and reduction: Speakers manage syntactic information density

    PubMed Central

    Florian Jaeger, T.

    2010-01-01

    A principle of efficient language production based on information theoretic considerations is proposed: Uniform Information Density predicts that language production is affected by a preference to distribute information uniformly across the linguistic signal. This prediction is tested against data from syntactic reduction. A single multilevel logit model analysis of naturally distributed data from a corpus of spontaneous speech is used to assess the effect of information density on complementizer that-mentioning, while simultaneously evaluating the predictions of several influential alternative accounts: availability, ambiguity avoidance, and dependency processing accounts. Information density emerges as an important predictor of speakers’ preferences during production. As information is defined in terms of probabilities, it follows that production is probability-sensitive, in that speakers’ preferences are affected by the contextual probability of syntactic structures. The merits of a corpus-based approach to the study of language production are discussed as well. PMID:20434141

  10. We'll Meet Again: Revealing Distributional and Temporal Patterns of Social Contact

    PubMed Central

    Pachur, Thorsten; Schooler, Lael J.; Stevens, Jeffrey R.

    2014-01-01

    What are the dynamics and regularities underlying social contact, and how can contact with the people in one's social network be predicted? In order to characterize distributional and temporal patterns underlying contact probability, we asked 40 participants to keep a diary of their social contacts for 100 consecutive days. Using a memory framework previously used to study environmental regularities, we predicted that the probability of future contact would follow in systematic ways from the frequency, recency, and spacing of previous contact. The distribution of contact probability across the members of a person's social network was highly skewed, following an exponential function. As predicted, it emerged that future contact scaled linearly with frequency of past contact, proportionally to a power function with recency of past contact, and differentially according to the spacing of past contact. These relations emerged across different contact media and irrespective of whether the participant initiated or received contact. We discuss how the identification of these regularities might inspire more realistic analyses of behavior in social networks (e.g., attitude formation, cooperation). PMID:24475073

  11. Prediction of future asset prices

    NASA Astrophysics Data System (ADS)

    Seong, Ng Yew; Hin, Pooi Ah; Ching, Soo Huei

    2014-12-01

    This paper attempts to incorporate trading volume as an additional predictor for predicting asset prices. Denoting r(t) as the vector consisting of the time-t values of the trading volume and price of a given asset, we model the time-(t+1) asset price as dependent on the present and l-1 past values r(t), r(t-1), ..., r(t-l+1) via a conditional distribution which is derived from a (2l+1)-dimensional power-normal distribution. A prediction interval based on the 100(α/2)% and 100(1-α/2)% points of the conditional distribution is then obtained. By examining the average lengths of the prediction intervals found by using the composite indices of the Malaysia stock market for the period 2008 to 2013, we found that the value 2 appears to be a good choice for l. With the omission of the trading volume from the vector r(t), the corresponding prediction interval exhibits a slightly longer average length, showing that it might be desirable to keep trading volume as a predictor. From the above conditional distribution, the probability that the time-(t+1) asset price will be larger than the time-t asset price is next computed. When the probability differs from 0 (or 1) by less than 0.03, the observed time-(t+1) change in price tends to be negative (or positive). Thus the above probability has good potential as a market indicator in technical analysis.
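
    A sketch of the conditioning step with l = 2, substituting a multivariate normal for the paper's power-normal distribution so the conditional mean and variance have closed forms. All data are simulated.

      import numpy as np

      rng = np.random.default_rng(2)

      T = 600
      vol = rng.lognormal(0.0, 0.3, T)                          # toy volumes
      price = np.cumsum(0.2*np.log(vol) + rng.normal(0, 1, T))  # toy prices

      # Rows: [vol_t, price_t, vol_{t-1}, price_{t-1}, price_{t+1}]
      Z = np.column_stack([vol[2:-1], price[2:-1],
                           vol[1:-2], price[1:-2], price[3:]])
      mu, S = Z.mean(axis=0), np.cov(Z, rowvar=False)

      # Gaussian conditioning: price_{t+1} | predictors ~ N(m, v)
      w = np.linalg.solve(S[:4, :4], S[:4, 4])
      x_now = Z[-1, :4]                 # latest observed predictors
      m = mu[4] + w @ (x_now - mu[:4])
      v = S[4, 4] - S[:4, 4] @ w
      lo, hi = m - 1.96*np.sqrt(v), m + 1.96*np.sqrt(v)
      print(f"95% prediction interval for the next price: [{lo:.2f}, {hi:.2f}]")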

  12. Latin hypercube approach to estimate uncertainty in ground water vulnerability

    USGS Publications Warehouse

    Gurdak, J.J.; McCray, J.E.; Thyne, G.; Qi, S.L.

    2007-01-01

    A methodology is proposed to quantify prediction uncertainty associated with ground water vulnerability models that were developed through an approach that coupled multivariate logistic regression with a geographic information system (GIS). This method uses Latin hypercube sampling (LHS) to illustrate the propagation of input error and estimate uncertainty associated with the logistic regression predictions of ground water vulnerability. Central to the proposed method is the assumption that prediction uncertainty in ground water vulnerability models is a function of input error propagation from uncertainty in the estimated logistic regression model coefficients (model error) and the values of explanatory variables represented in the GIS (data error). Input probability distributions that represent both model and data error sources of uncertainty were simultaneously sampled using a Latin hypercube approach with logistic regression calculations of probability of elevated nonpoint source contaminants in ground water. The resulting probability distribution represents the prediction intervals and associated uncertainty of the ground water vulnerability predictions. The method is illustrated through a ground water vulnerability assessment of the High Plains regional aquifer. Results of the LHS simulations reveal significant prediction uncertainties that vary spatially across the regional aquifer. Additionally, the proposed method enables a spatial deconstruction of the prediction uncertainty that can lead to improved prediction of ground water vulnerability. © 2007 National Ground Water Association.
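
    A compact sketch of the LHS propagation described above: coefficient (model error) and input (data error) distributions are sampled jointly with a Latin hypercube and pushed through the logistic prediction. All means and standard errors are invented.

      import numpy as np
      from scipy.stats import norm, qmc

      beta_mean = np.array([-2.0, 0.8, 1.1])   # intercept + two coefficients
      beta_se = np.array([0.5, 0.2, 0.3])      # model error
      x_mean = np.array([1.0, 0.4])            # explanatory variables
      x_se = np.array([0.1, 0.05])             # data error

      u = qmc.LatinHypercube(d=5, seed=0).random(2000)   # LHS in [0, 1]^5
      beta = norm.ppf(u[:, :3], beta_mean, beta_se)
      x = norm.ppf(u[:, 3:], x_mean, x_se)

      logit = beta[:, 0] + np.sum(beta[:, 1:] * x, axis=1)
      p = 1.0 / (1.0 + np.exp(-logit))
      print(np.percentile(p, [2.5, 50.0, 97.5]))   # vulnerability interval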

  13. A comparison of numerical solutions of partial differential equations with probabilistic and possibilistic parameters for the quantification of uncertainty in subsurface solute transport.

    PubMed

    Zhang, Kejiang; Achari, Gopal; Li, Hua

    2009-11-03

    Traditionally, uncertainties in parameters are represented as probability distributions and incorporated into groundwater flow and contaminant transport models. With the advent of newer uncertainty theories, it is now understood that stochastic methods cannot properly represent nonrandom uncertainties. In the groundwater flow and contaminant transport equations, uncertainty in some parameters may be random, whereas that in others may be nonrandom. The objective of this paper is to develop a fuzzy-stochastic partial differential equation (FSPDE) model to simulate conditions where both random and nonrandom uncertainties are involved in groundwater flow and solute transport. Three potential solution techniques, namely (a) transforming a probability distribution to a possibility distribution (Method I), so that the FSPDE becomes a fuzzy partial differential equation (FPDE), (b) transforming a possibility distribution to a probability distribution (Method II), so that the FSPDE becomes a stochastic partial differential equation (SPDE), and (c) combining Monte Carlo methods with FPDE solution techniques (Method III), are proposed and compared. The effects of these three methods on the predictive results are investigated using two case studies. The results show that the predictions obtained from Method II are a specific case of those obtained from Method I. When an exact probabilistic result is needed, Method II is suggested. As the loss or gain of information during a probability-possibility (or vice versa) transformation cannot be quantified, its influence on the predictive results is not known. Thus, Method III should probably be preferred for risk assessments.

  14. Unit-Sphere Anisotropic Multiaxial Stochastic-Strength Model Probability Density Distribution for the Orientation of Critical Flaws

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel

    2013-01-01

    Models that predict the failure probability of monolithic glass and ceramic components under multiaxial loading have been developed by authors such as Batdorf, Evans, and Matsuo. These "unit-sphere" failure models assume that the strength-controlling flaws are randomly oriented, noninteracting planar microcracks of specified geometry but of variable size. This report develops a formulation to describe the probability density distribution of the orientation of critical strength-controlling flaws that results from an applied load. This distribution is a function of the multiaxial stress state, the shear sensitivity of the flaws, the Weibull modulus, and the strength anisotropy. Examples are provided showing the predicted response on the unit sphere for various stress states for isotropic and transversely isotropic (anisotropic) materials, including the most probable orientation of critical flaws for offset uniaxial loads with strength anisotropy. The author anticipates that this information could be used to determine anisotropic stiffness degradation or anisotropic damage evolution for individual brittle (or quasi-brittle) composite material constituents within finite element or micromechanics-based software.

  15. Probability density function of non-reactive solute concentration in heterogeneous porous formations

    Treesearch

    Alberto Bellin; Daniele Tonina

    2007-01-01

    Available models of solute transport in heterogeneous formations lack in providing complete characterization of the predicted concentration. This is a serious drawback especially in risk analysis where confidence intervals and probability of exceeding threshold values are required. Our contribution to fill this gap of knowledge is a probability distribution model for...

  16. Use of the Weibull function to predict future diameter distributions from current plot data

    Treesearch

    Quang V. Cao

    2012-01-01

    The Weibull function has been widely used to characterize diameter distributions in forest stands. The future diameter distribution of a forest stand can be predicted by use of a Weibull probability density function from current inventory data for that stand. The parameter recovery approach has been used to “recover” the Weibull parameters from diameter moments or...
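
    The abstract above is truncated, but moment-based "parameter recovery" for a two-parameter Weibull is standard: solve for the shape from the projected coefficient of variation, then the scale from the mean. The projected stand moments below are invented.

      import numpy as np
      from scipy.special import gamma
      from scipy.optimize import brentq

      mean_d, var_d = 22.0, 30.0   # projected mean (cm) and variance

      cv2 = var_d / mean_d**2
      f = lambda b: gamma(1 + 2/b)/gamma(1 + 1/b)**2 - 1 - cv2
      shape = brentq(f, 0.5, 50.0)              # recover the Weibull shape
      scale = mean_d / gamma(1 + 1/shape)       # then the scale from the mean
      print(f"Weibull shape = {shape:.2f}, scale = {scale:.2f} cm")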

  17. Second look at the spread of epidemics on networks

    NASA Astrophysics Data System (ADS)

    Kenah, Eben; Robins, James M.

    2007-09-01

    In an important paper, Newman [Phys. Rev. E 66, 016128 (2002)] claimed that a general network-based stochastic Susceptible-Infectious-Removed (SIR) epidemic model is isomorphic to a bond percolation model, where the bonds are the edges of the contact network and the bond occupation probability is equal to the marginal probability of transmission from an infected node to a susceptible neighbor. In this paper, we show that this isomorphism is incorrect and define a semidirected random network we call the epidemic percolation network that is exactly isomorphic to the SIR epidemic model in any finite population. In the limit of a large population, (i) the distribution of (self-limited) outbreak sizes is identical to the size distribution of (small) out-components, (ii) the epidemic threshold corresponds to the phase transition where a giant strongly connected component appears, (iii) the probability of a large epidemic is equal to the probability that an initial infection occurs in the giant in-component, and (iv) the relative final size of an epidemic is equal to the proportion of the network contained in the giant out-component. For the SIR model considered by Newman, we show that the epidemic percolation network predicts the same mean outbreak size below the epidemic threshold, the same epidemic threshold, and the same final size of an epidemic as the bond percolation model. However, the bond percolation model fails to predict the correct outbreak size distribution and probability of an epidemic when there is a nondegenerate infectious period distribution. We confirm our findings by comparing predictions from percolation networks and bond percolation models to the results of simulations. In the Appendix, we show that an isomorphism to an epidemic percolation network can be defined for any time-homogeneous stochastic SIR model.

  18. Randomized path optimization for the mitigated counter detection of UAVs

    DTIC Science & Technology

    2017-06-01

    using Bayesian filtering. The KL divergence is used to compare the probability density of aircraft termination to a normal distribution around the true terminal...algorithm's success. A recursive Bayesian filtering scheme is used to assimilate noisy measurements of the UAV's position to predict its terminal location. We

  19. Prediction of fatty acid-binding residues on protein surfaces with three-dimensional probability distributions of interacting atoms.

    PubMed

    Mahalingam, Rajasekaran; Peng, Hung-Pin; Yang, An-Suei

    2014-08-01

    Protein-fatty acid interaction is vital for many cellular processes, and understanding this interaction is important for functional annotation as well as drug discovery. In this work, we present a method for predicting the fatty acid (FA)-binding residues by using three-dimensional probability density distributions of interacting atoms of FAs on protein surfaces which are derived from the known protein-FA complex structures. A machine learning algorithm was established to learn the characteristic patterns of the probability density maps specific to the FA-binding sites. The predictor was trained with five-fold cross validation on a non-redundant training set and then evaluated with an independent test set as well as on a holo-apo pairs dataset. The results showed good accuracy in predicting the FA-binding residues. Further, the predictor developed in this study is implemented as an online server which is freely accessible at the following website, http://ismblab.genomics.sinica.edu.tw/. Copyright © 2014 Elsevier B.V. All rights reserved.

  20. Species Distribution Models and Ecological Suitability Analysis for Potential Tick Vectors of Lyme Disease in Mexico

    PubMed Central

    Illoldi-Rangel, Patricia; Rivaldi, Chissa-Louise; Sissel, Blake; Trout Fryxell, Rebecca; Gordillo-Pérez, Guadalupe; Rodríguez-Moreno, Angel; Williamson, Phillip; Montiel-Parra, Griselda; Sánchez-Cordero, Víctor; Sarkar, Sahotra

    2012-01-01

    Species distribution models were constructed for ten Ixodes species and Amblyomma cajennense for a region including Mexico and Texas. The model was based on a maximum entropy algorithm that used environmental layers to predict the relative probability of presence for each taxon. For Mexico, species geographic ranges were predicted by restricting the models to cells which have a higher probability than the lowest probability of the cells in which a presence record was located. There was spatial nonconcordance between the distributions of Amblyomma cajennense and the Ixodes group with the former restricted to lowlands and mainly the eastern coast of Mexico and the latter to montane regions with lower temperature. The risk of Lyme disease is, therefore, mainly present in the highlands where some Ixodes species are known vectors; if Amblyomma cajennense turns out to be a competent vector, the area of risk also extends to the lowlands and the east coast. PMID:22518171

  1. Species distribution models and ecological suitability analysis for potential tick vectors of Lyme disease in Mexico.

    PubMed

    Illoldi-Rangel, Patricia; Rivaldi, Chissa-Louise; Sissel, Blake; Trout Fryxell, Rebecca; Gordillo-Pérez, Guadalupe; Rodríguez-Moreno, Angel; Williamson, Phillip; Montiel-Parra, Griselda; Sánchez-Cordero, Víctor; Sarkar, Sahotra

    2012-01-01

    Species distribution models were constructed for ten Ixodes species and Amblyomma cajennense for a region including Mexico and Texas. The model was based on a maximum entropy algorithm that used environmental layers to predict the relative probability of presence for each taxon. For Mexico, species geographic ranges were predicted by restricting the models to cells which have a higher probability than the lowest probability of the cells in which a presence record was located. There was spatial nonconcordance between the distributions of Amblyomma cajennense and the Ixodes group with the former restricted to lowlands and mainly the eastern coast of Mexico and the latter to montane regions with lower temperature. The risk of Lyme disease is, therefore, mainly present in the highlands where some Ixodes species are known vectors; if Amblyomma cajennense turns out to be a competent vector, the area of risk also extends to the lowlands and the east coast.

  2. Real-Time Safety Monitoring and Prediction for the National Airspace System

    NASA Technical Reports Server (NTRS)

    Roychoudhury, Indranil

    2016-01-01

    As new operational paradigms and additional aircraft are introduced into the National Airspace System (NAS), maintaining safety in such a rapidly growing environment becomes more challenging. It is therefore desirable to have both an overview of the current safety of the airspace at different levels of granularity, as well as an understanding of how the state of safety will evolve into the future given the anticipated flight plans, weather forecasts, predicted health of assets in the airspace, and so on. To this end, we have developed a Real-Time Safety Monitoring (RTSM) framework that first estimates the state of the NAS using dynamic models. Then, given the state estimate and a probability distribution of future inputs to the NAS, the framework predicts the evolution of the NAS, i.e., the future state, and analyzes these future states to predict the occurrence of unsafe events. The entire probability distribution of airspace safety metrics is computed, not just point estimates, without significant assumptions regarding the distribution type and/or parameters. We demonstrate our overall approach by predicting the occurrence of some unsafe events and show how these predictions evolve in time as flight operations progress.
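
    A common way to realize such a prediction step is Monte Carlo sampling of the future-input distribution. The sketch below is a generic illustration of that idea, not the RTSM implementation; the scalar "safety metric", its dynamics, and the unsafe threshold are all hypothetical:

        import numpy as np

        rng = np.random.default_rng(0)

        def step(state, inp):
            # Hypothetical one-step dynamics of a safety metric
            # (e.g., aircraft separation); stands in for the NAS model.
            return 0.9 * state + 0.1 * inp

        n_samples, horizon, unsafe_threshold = 10_000, 24, 3.0
        state = np.full(n_samples, 5.0)           # current state estimate
        unsafe = np.zeros(n_samples, dtype=bool)

        for t in range(horizon):
            inputs = rng.normal(2.0, 1.5, n_samples)  # sampled future inputs
            state = step(state, inputs)
            unsafe |= state < unsafe_threshold    # metric drops below threshold

        # Empirical distribution of the metric, and probability of the unsafe event
        print(np.percentile(state, [5, 50, 95]), unsafe.mean())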

  3. Force Density Function Relationships in 2-D Granular Media

    NASA Technical Reports Server (NTRS)

    Youngquist, Robert C.; Metzger, Philip T.; Kilts, Kelly N.

    2004-01-01

    An integral transform relationship is developed to convert between two important probability density functions (distributions) used in the study of contact forces in granular physics. Developing this transform has now made it possible to compare and relate various theoretical approaches with one another and with the experimental data despite the fact that one may predict the Cartesian probability density and another the force magnitude probability density. Also, the transforms identify which functional forms are relevant to describe the probability density observed in nature, and so the modified Bessel function of the second kind has been identified as the relevant form for the Cartesian probability density corresponding to exponential forms in the force magnitude distribution. Furthermore, it is shown that this transform pair supplies a sufficient mathematical framework to describe the evolution of the force magnitude distribution under shearing. Apart from the choice of several coefficients, whose evolution of values must be explained in the physics, this framework successfully reproduces the features of the distribution that are taken to be an indicator of jamming and unjamming in a granular packing. Key words. Granular Physics, Probability Density Functions, Fourier Transforms
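
    The stated correspondence can be checked numerically. If the force-magnitude density is P(F) = exp(-F), the isotropic joint density over the plane is rho(F) = exp(-F)/(2*pi*F), and marginalizing over one Cartesian component gives g(fx) = K0(|fx|)/pi, with K0 the modified Bessel function of the second kind. A quadrature sketch under these assumptions (unit mean force):

        import numpy as np
        from scipy.integrate import quad
        from scipy.special import k0

        def cartesian_density(fx):
            # Marginal density of one Cartesian force component when the
            # magnitude density is P(F) = exp(-F), i.e. the isotropic joint
            # density is rho(F) = exp(-F) / (2*pi*F).
            integrand = lambda fy: np.exp(-np.hypot(fx, fy)) / np.hypot(fx, fy)
            val, _ = quad(integrand, 0.0, np.inf)
            return val / np.pi  # symmetry in fy contributes the factor 2/(2*pi)

        for fx in [0.5, 1.0, 2.0]:
            print(fx, cartesian_density(fx), k0(fx) / np.pi)  # columns agree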

  4. Building a generalized distributed system model

    NASA Technical Reports Server (NTRS)

    Mukkamala, Ravi; Foudriat, E. C.

    1991-01-01

    A modeling tool for both analysis and design of distributed systems is discussed. Since many research institutions have access to networks of workstations, the researchers decided to build a tool running on top of the workstations to function as a prototype as well as a distributed simulator for a computing system. The effects of system modeling on performance prediction in distributed systems and the effect of static locking and deadlocks on the performance predictions of distributed transactions are also discussed. Although the probability of deadlock is quite small, its effects on performance can be significant.

  5. Using the weighted area under the net benefit curve for decision curve analysis.

    PubMed

    Talluri, Rajesh; Shete, Sanjay

    2016-07-18

    Risk prediction models have been proposed for various diseases and are being improved as new predictors are identified. A major challenge is to determine whether the newly discovered predictors improve risk prediction. Decision curve analysis has been proposed as an alternative to the area under the curve and the net reclassification index for evaluating the performance of prediction models in clinical scenarios. The decision curve computed using the net benefit can evaluate the predictive performance of risk models at a given threshold probability or over a range of threshold probabilities. However, when the decision curves for two competing models cross in the range of interest, it is difficult to identify the best model, as there is no readily available summary measure for evaluating predictive performance. The key deterrent to using simple measures such as the area under the net benefit curve is the assumption that the threshold probabilities are uniformly distributed among patients. We propose a novel measure for performing decision curve analysis. The approach estimates the distribution of threshold probabilities without the need for additional data. Using the estimated distribution of threshold probabilities, the weighted area under the net benefit curve serves as the summary measure to compare risk prediction models in a range of interest. We compared three approaches: the standard method, the area under the net benefit curve, and the weighted area under the net benefit curve. Type 1 error and power comparisons demonstrate that the weighted area under the net benefit curve has higher power than the other methods. Several simulation studies are presented to demonstrate the improvement in model comparison using the weighted area under the net benefit curve compared to the standard method. The proposed measure improves decision curve analysis by using the weighted area under the curve and thereby improves the power of decision curve analysis to compare risk prediction models in a clinical scenario.
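
    For concreteness, the net benefit at a threshold probability pt is NB(pt) = TP/n - (FP/n) * pt/(1 - pt), and the weighted area replaces the implicit uniform weight over thresholds with a density w(pt). The sketch below uses a Beta density as a stand-in weight; it is not the estimator proposed in the paper, which learns the threshold distribution from the data at hand:

        import numpy as np
        from scipy.stats import beta

        def net_benefit(y, p_hat, pt):
            # y: 0/1 outcomes, p_hat: predicted risks, pt: threshold probability
            treat = p_hat >= pt
            tp = np.sum(treat & (y == 1)) / len(y)
            fp = np.sum(treat & (y == 0)) / len(y)
            return tp - fp * pt / (1.0 - pt)

        def weighted_anbc(y, p_hat, lo=0.05, hi=0.50, weight=None, n_grid=200):
            pts = np.linspace(lo, hi, n_grid)
            nb = np.array([net_benefit(y, p_hat, pt) for pt in pts])
            w = np.ones_like(pts) if weight is None else weight(pts)
            w /= np.trapz(w, pts)           # normalize the weight over the range
            return np.trapz(nb * w, pts)

        rng = np.random.default_rng(1)
        p_hat = rng.uniform(0, 1, 500)
        y = rng.binomial(1, p_hat)          # perfectly calibrated toy model
        print(weighted_anbc(y, p_hat))                         # uniform weight
        print(weighted_anbc(y, p_hat, weight=beta(2, 5).pdf))  # assumed Beta weight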

  6. Advances in modeling trait-based plant community assembly.

    PubMed

    Laughlin, Daniel C; Laughlin, David E

    2013-10-01

    In this review, we examine two new trait-based models of community assembly that predict the relative abundance of species from a regional species pool. The models use fundamentally different mathematical approaches and the predictions can differ considerably. Maxent obtains the most even probability distribution subject to community-weighted mean trait constraints. Traitspace predicts low probabilities for any species whose trait distribution does not pass through the environmental filter. Neither model maximizes functional diversity because of the emphasis on environmental filtering over limiting similarity. Traitspace can test for the effects of limiting similarity by explicitly incorporating intraspecific trait variation. The range of solutions in both models could be used to define the range of natural variability of community composition in restoration projects. Copyright © 2013 Elsevier Ltd. All rights reserved.

  7. One-dimensional soil temperature assimilation experiment based on unscented particle filter and Common Land Model

    NASA Astrophysics Data System (ADS)

    Fu, Xiao Lei; Jin, Bao Ming; Jiang, Xiao Lei; Chen, Cheng

    2018-06-01

    Data assimilation is an efficient way to improve simulation/prediction accuracy in many fields of the geosciences, especially in meteorological and hydrological applications. This study takes the unscented particle filter (UPF) as an example and tests its performance under two probability distributions for the observation error, Gaussian and uniform, in two assimilation-frequency experiments: (1) assimilating hourly in situ soil surface temperature, and (2) assimilating the original Moderate Resolution Imaging Spectroradiometer (MODIS) Land Surface Temperature (LST) once per day. The numerical experiments show that the filter performs better as the assimilation frequency increases. In addition, the UPF is effective at improving the simulation/prediction accuracy of soil variables (e.g., soil temperature), though it is not sensitive to the assumed probability distribution of the observation error in soil temperature assimilation.
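
    As a rough illustration of the assimilation idea (not the UPF itself, which adds an unscented Kalman proposal step), a bootstrap particle filter propagates an ensemble of model states and re-weights it whenever an observation arrives; the observation-error distribution enters through the weighting step. A sketch under toy assumptions:

        import numpy as np

        rng = np.random.default_rng(0)
        particles = rng.normal(290.0, 2.0, 1000)  # prior soil temperatures (K)

        def model_step(x):
            # Toy land-surface "model": relax toward 288 K plus process noise.
            return x + 0.1 * (288.0 - x) + rng.normal(0.0, 0.2, x.shape)

        def assimilate(particles, obs, obs_std):
            # Weight particles by the observation-error likelihood (Gaussian
            # here; a uniform error model would just change this density).
            w = np.exp(-0.5 * ((obs - particles) / obs_std) ** 2)
            w /= w.sum()
            idx = rng.choice(len(particles), size=len(particles), p=w)
            return particles[idx]            # resampled posterior ensemble

        for hour, obs in enumerate([289.4, 289.0, 288.7]):  # hourly observations
            particles = model_step(particles)
            particles = assimilate(particles, obs, obs_std=0.5)
            print(hour, particles.mean(), particles.std())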

  8. Reliability of windstorm predictions in the ECMWF ensemble prediction system

    NASA Astrophysics Data System (ADS)

    Becker, Nico; Ulbrich, Uwe

    2016-04-01

    Windstorms caused by extratropical cyclones are one of the most dangerous natural hazards in the European region. Therefore, reliable predictions of such storm events are needed. Case studies have shown that ensemble prediction systems (EPS) are able to provide useful information about windstorms between two and five days prior to the event. In this work, ensemble predictions with the European Centre for Medium-Range Weather Forecasts (ECMWF) EPS are evaluated in a four year period. Within the 50 ensemble members, which are initialized every 12 hours and are run for 10 days, windstorms are identified and tracked in time and space. By using a clustering approach, different predictions of the same storm are identified in the different ensemble members and compared to reanalysis data. The occurrence probability of the predicted storms is estimated by fitting a bivariate normal distribution to the storm track positions. Our results show, for example, that predicted storm clusters with occurrence probabilities of more than 50% have a matching observed storm in 80% of all cases at a lead time of two days. The predicted occurrence probabilities are reliable up to 3 days lead time. At longer lead times the occurrence probabilities are overestimated by the EPS.
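
    Fitting a bivariate normal to the clustered track positions, and checking a predicted cluster against an observed storm, can be sketched as follows; the positions are made up, and the Mahalanobis-distance matching criterion is an illustrative assumption rather than the matching rule used in the study:

        import numpy as np
        from scipy.stats import chi2, multivariate_normal

        # Hypothetical storm positions (lon, lat) from members of one cluster
        tracks = np.array([[5.1, 52.0], [5.8, 52.4], [4.9, 51.7],
                           [6.2, 52.9], [5.5, 52.2], [5.0, 52.1]])

        mu = tracks.mean(axis=0)
        cov = np.cov(tracks, rowvar=False)
        dist = multivariate_normal(mean=mu, cov=cov)

        # Occurrence probability: fraction of the 50 members predicting the storm.
        p_occurrence = len(tracks) / 50

        # Does an observed (reanalysis) storm match the predicted cluster?
        obs = np.array([5.4, 52.3])
        d2 = (obs - mu) @ np.linalg.solve(cov, obs - mu)  # squared Mahalanobis dist.
        matches = d2 <= chi2(df=2).ppf(0.90)              # inside the 90% ellipse?
        print(p_occurrence, dist.pdf(obs), matches)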

  9. A functional model of sensemaking in a neurocognitive architecture.

    PubMed

    Lebiere, Christian; Pirolli, Peter; Thomson, Robert; Paik, Jaehyon; Rutledge-Taylor, Matthew; Staszewski, James; Anderson, John R

    2013-01-01

    Sensemaking is the active process of constructing a meaningful representation (i.e., making sense) of some complex aspect of the world. In relation to intelligence analysis, sensemaking is the act of finding and interpreting relevant facts amongst the sea of incoming reports, images, and intelligence. We present a cognitive model of core information-foraging and hypothesis-updating sensemaking processes applied to complex spatial probability estimation and decision-making tasks. While the model was developed in a hybrid symbolic-statistical cognitive architecture, its correspondence to neural frameworks in terms of both structure and mechanisms provided a direct bridge between rational and neural levels of description. Compared against data from two participant groups, the model correctly predicted both the presence and degree of four biases: confirmation, anchoring and adjustment, representativeness, and probability matching. It also favorably predicted human performance in generating probability distributions across categories, assigning resources based on these distributions, and selecting relevant features given a prior probability distribution. This model provides a constrained theoretical framework describing cognitive biases as arising from three interacting factors: the structure of the task environment, the mechanisms and limitations of the cognitive architecture, and the use of strategies to adapt to the dual constraints of cognition and the environment.

  10. A Functional Model of Sensemaking in a Neurocognitive Architecture

    PubMed Central

    Lebiere, Christian; Paik, Jaehyon; Rutledge-Taylor, Matthew; Staszewski, James; Anderson, John R.

    2013-01-01

    Sensemaking is the active process of constructing a meaningful representation (i.e., making sense) of some complex aspect of the world. In relation to intelligence analysis, sensemaking is the act of finding and interpreting relevant facts amongst the sea of incoming reports, images, and intelligence. We present a cognitive model of core information-foraging and hypothesis-updating sensemaking processes applied to complex spatial probability estimation and decision-making tasks. While the model was developed in a hybrid symbolic-statistical cognitive architecture, its correspondence to neural frameworks in terms of both structure and mechanisms provided a direct bridge between rational and neural levels of description. Compared against data from two participant groups, the model correctly predicted both the presence and degree of four biases: confirmation, anchoring and adjustment, representativeness, and probability matching. It also favorably predicted human performance in generating probability distributions across categories, assigning resources based on these distributions, and selecting relevant features given a prior probability distribution. This model provides a constrained theoretical framework describing cognitive biases as arising from three interacting factors: the structure of the task environment, the mechanisms and limitations of the cognitive architecture, and the use of strategies to adapt to the dual constraints of cognition and the environment. PMID:24302930

  11. Classic maximum entropy recovery of the average joint distribution of apparent FRET efficiency and fluorescence photons for single-molecule burst measurements.

    PubMed

    DeVore, Matthew S; Gull, Stephen F; Johnson, Carey K

    2012-04-05

    We describe a method for analysis of single-molecule Förster resonance energy transfer (FRET) burst measurements using classic maximum entropy. Classic maximum entropy determines the Bayesian inference for the joint probability describing the total fluorescence photons and the apparent FRET efficiency. The method was tested with simulated data and then with DNA labeled with fluorescent dyes. The most probable joint distribution can be marginalized to obtain both the overall distribution of fluorescence photons and the apparent FRET efficiency distribution. This method proves to be ideal for determining the distance distribution of FRET-labeled biomolecules, and it successfully predicts the shape of the recovered distributions.

  12. Modeling the probability distribution of peak discharge for infiltrating hillslopes

    NASA Astrophysics Data System (ADS)

    Baiamonte, Giorgio; Singh, Vijay P.

    2017-07-01

    Hillslope response plays a fundamental role in the prediction of peak discharge at the basin outlet. The peak discharge for the critical duration of rainfall and its probability distribution are needed for designing urban infrastructure facilities. This study derives the probability distribution, denoted as the GABS model, by coupling three models: (1) the Green-Ampt model for computing infiltration, (2) the kinematic wave model for computing the discharge hydrograph from the hillslope, and (3) the intensity-duration-frequency (IDF) model for computing design rainfall intensity. The Hortonian mechanism for runoff generation is employed for computing the surface runoff hydrograph. Since the antecedent soil moisture condition (ASMC) significantly affects the rate of infiltration, its effect on the probability distribution of peak discharge is investigated. Application to a watershed in Sicily, Italy, shows that the expected effect of ASMC on increasing the maximum discharge diminishes as probability increases. Only for low probabilities is the critical duration of rainfall influenced by ASMC, whereas its effect on the peak discharge appears small at any probability. For a given set of parameters, the derived probability distribution of peak discharge is fitted well by the gamma distribution. Finally, the GABS model was applied to a small watershed to test whether rational runoff coefficient tables for the rational method can be arranged in advance, and the peak discharges obtained by the GABS model were compared with those measured in an experimental flume for a loamy-sand soil.
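
    The infiltration component rests on the classical Green-Ampt relation: the infiltration rate is f = K*(1 + psi*dtheta/F), and cumulative infiltration F under ponded conditions satisfies F - psi*dtheta*ln(1 + F/(psi*dtheta)) = K*t. A root-finding sketch with illustrative parameter values (ASMC enters through the moisture deficit dtheta):

        import numpy as np
        from scipy.optimize import brentq

        K = 1.1        # saturated hydraulic conductivity (cm/h)
        psi = 11.0     # wetting-front suction head (cm)
        dtheta = 0.25  # moisture deficit; smaller when antecedent moisture is high

        def cumulative_infiltration(t):
            # Solve F - psi*dtheta*ln(1 + F/(psi*dtheta)) = K*t for F.
            g = lambda F: F - psi * dtheta * np.log1p(F / (psi * dtheta)) - K * t
            return brentq(g, 1e-9, 1e4)

        for t in [0.5, 1.0, 2.0]:                    # hours since ponding
            F = cumulative_infiltration(t)
            f = K * (1.0 + psi * dtheta / F)         # infiltration rate (cm/h)
            print(t, round(F, 3), round(f, 3))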

  13. The influence of coarse-scale environmental features on current and predicted future distributions of narrow-range endemic crayfish populations

    USGS Publications Warehouse

    Dyer, Joseph J.; Brewer, Shannon K.; Worthington, Thomas A.; Bergey, Elizabeth A.

    2013-01-01

    1. A major limitation to effective management of narrow-range crayfish populations is the paucity of information on the spatial distribution of crayfish species and a general understanding of the interacting environmental variables that drive current and future potential distributional patterns. 2. Maximum Entropy Species Distribution Modeling Software (MaxEnt) was used to predict the current and future potential distributions of four endemic crayfish species in the Ouachita Mountains. Current distributions were modelled using climate, geology, soils, land use, landform and flow variables thought to be important to lotic crayfish. Potential changes in the distribution were forecast by using models trained on current conditions and projecting onto the landscape predicted under climate-change scenarios. 3. The modelled distribution of the four species closely resembled the perceived distribution of each species but also predicted populations in streams and catchments where they had not previously been collected. Soils, elevation and winter precipitation and temperature most strongly related to current distributions and represented 65–87% of the predictive power of the models. Model accuracy was high for all models, and model predictions of new populations were verified through additional field sampling. 4. Current models created using two spatial resolutions (1 and 4.5 km2) showed that fine-resolution data more accurately represented current distributions. For three of the four species, the 1-km2 resolution models resulted in more conservative predictions. However, the modelled distributional extent of Orconectes leptogonopodus was similar regardless of data resolution. Field validations indicated 1-km2 resolution models were more accurate than 4.5-km2 resolution models. 5. Future projected (4.5-km2 resolution) model distributions indicated three of the four endemic species would have truncated ranges with low occurrence probabilities under the low-emission scenario, whereas two of four species would be severely restricted in range under moderate–high emissions. Discrepancies in the two emission scenarios probably relate to the exclusion of behavioural adaptations from species-distribution models. 6. These model predictions illustrate possible impacts of climate change on narrow-range endemic crayfish populations. The predictions do not account for biotic interactions, migration, local habitat conditions or species adaptation. However, we identified the constraining landscape features acting on these populations that provide a framework for addressing habitat needs at a fine scale and developing targeted and systematic monitoring programmes.

  14. Hypothesis testing and earthquake prediction.

    PubMed

    Jackson, D D

    1996-04-30

    Requirements for testing include advance specification of the conditional rate density (probability per unit time, area, and magnitude) or, alternatively, probabilities for specified intervals of time, space, and magnitude. Here I consider testing fully specified hypotheses, with no parameter adjustments or arbitrary decisions allowed during the test period. Because it may take decades to validate prediction methods, it is worthwhile to formulate testable hypotheses carefully in advance. Earthquake prediction generally implies that the probability will be temporarily higher than normal. Such a statement requires knowledge of "normal behavior"--that is, it requires a null hypothesis. Hypotheses can be tested in three ways: (i) by comparing the number of actual earthquakes to the number predicted, (ii) by comparing the likelihood score of actual earthquakes to the predicted distribution, and (iii) by comparing the likelihood ratio to that of a null hypothesis. The first two tests are purely self-consistency tests, while the third is a direct comparison of two hypotheses. Predictions made without a statement of probability are very difficult to test, and any test must be based on the ratio of earthquakes in and out of the forecast regions.

  15. Hypothesis testing and earthquake prediction.

    PubMed Central

    Jackson, D D

    1996-01-01

    Requirements for testing include advance specification of the conditional rate density (probability per unit time, area, and magnitude) or, alternatively, probabilities for specified intervals of time, space, and magnitude. Here I consider testing fully specified hypotheses, with no parameter adjustments or arbitrary decisions allowed during the test period. Because it may take decades to validate prediction methods, it is worthwhile to formulate testable hypotheses carefully in advance. Earthquake prediction generally implies that the probability will be temporarily higher than normal. Such a statement requires knowledge of "normal behavior"--that is, it requires a null hypothesis. Hypotheses can be tested in three ways: (i) by comparing the number of actual earthquakes to the number predicted, (ii) by comparing the likelihood score of actual earthquakes to the predicted distribution, and (iii) by comparing the likelihood ratio to that of a null hypothesis. The first two tests are purely self-consistency tests, while the third is a direct comparison of two hypotheses. Predictions made without a statement of probability are very difficult to test, and any test must be based on the ratio of earthquakes in and out of the forecast regions. PMID:11607663

  16. Bayesian predictive power: choice of prior and some recommendations for its use as probability of success in drug development.

    PubMed

    Rufibach, Kaspar; Burger, Hans Ulrich; Abt, Markus

    2016-09-01

    Bayesian predictive power, the expectation of the power function with respect to a prior distribution for the true underlying effect size, is routinely used in drug development to quantify the probability of success of a clinical trial. Choosing the prior is crucial for the properties and interpretability of Bayesian predictive power. We review recommendations on the choice of prior for Bayesian predictive power and explore its features as a function of the prior. The density of power values induced by a given prior is derived analytically and its shape characterized. We find that for a typical clinical trial scenario, this density has a u-shape very similar, but not equal, to a β-distribution. Alternative priors are discussed, and practical recommendations to assess the sensitivity of Bayesian predictive power to its input parameters are provided. Copyright © 2016 John Wiley & Sons, Ltd.
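
    For a two-arm trial with known variance, Bayesian predictive power is simply the frequentist power curve averaged over the prior on the true effect. A minimal numerical sketch, with illustrative trial and prior parameters (a normal prior standing in, e.g., for a phase II posterior):

        import numpy as np
        from scipy.integrate import quad
        from scipy.stats import norm

        alpha = 0.025                          # one-sided significance level
        n_per_arm, sigma = 100, 1.0
        se = sigma * np.sqrt(2.0 / n_per_arm)  # SE of the estimated effect
        z_crit = norm.ppf(1.0 - alpha)

        def power(delta):
            # Frequentist power of the z-test at true effect size delta.
            return norm.sf(z_crit - delta / se)

        prior = norm(loc=0.3, scale=0.15)      # prior on the true effect size

        bpp, _ = quad(lambda d: power(d) * prior.pdf(d), -2.0, 2.0)
        print("Bayesian predictive power:", round(bpp, 3))

        # Evaluating power() on prior draws instead reveals the (often u-shaped)
        # density of power values that the abstract characterizes.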

  17. Wildfire ignition-distribution modelling: a comparative study in the Huron-Manistee National Forest, Michigan, USA

    Treesearch

    Avi Bar Massada; Alexandra D. Syphard; Susan I. Stewart; Volker C. Radeloff

    2012-01-01

    Wildfire ignition distribution models are powerful tools for predicting the probability of ignitions across broad areas, and identifying their drivers. Several approaches have been used for ignition-distribution modelling, yet the performance of different model types has not been compared. This is unfortunate, given that conceptually similar species-distribution models...

  18. The 6dFGS Peculiar Velocity Field

    NASA Astrophysics Data System (ADS)

    Springob, Chris M.; Magoulas, C.; Colless, M.; Mould, J.; Erdogdu, P.; Jones, D. H.; Lucey, J.; Campbell, L.; Merson, A.; Jarrett, T.

    2012-01-01

    The 6dF Galaxy Survey (6dFGS) is an all-southern-sky galaxy survey, including 125,000 redshifts and a Fundamental Plane (FP) subsample of 10,000 peculiar velocities, making it the largest peculiar velocity sample to date. We have fit the FP by maximum likelihood to a tri-variate Gaussian. We subsequently compute a Bayesian probability distribution over every possible peculiar velocity for each of the 10,000 galaxies, derived from the tri-variate Gaussian probability density distribution, accounting for our selection effects and measurement errors. We construct a predicted peculiar velocity field from the 2MASS redshift survey and compare our observed 6dFGS velocity field to the predicted field. We discuss the resulting agreement between the observed and predicted fields, and the implications for measurements of the bias parameter and bulk flow.

  19. Tornado damage risk assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reinhold, T.A.; Ellingwood, B.

    1982-09-01

    Several proposed models were evaluated for predicting tornado wind speed probabilities at nuclear plant sites as part of a program to develop statistical data on tornadoes needed for probability-based load combination analysis. A unified model was developed which synthesized the desired aspects of tornado occurrence and damage potential. The sensitivity of wind speed probability estimates to various tornado modeling assumptions is examined, and the probability distributions of tornado wind speed that are needed for load combination studies are presented.

  20. Geotechnical parameter spatial distribution stochastic analysis based on multi-precision information assimilation

    NASA Astrophysics Data System (ADS)

    Wang, C.; Rubin, Y.

    2014-12-01

    The spatial distribution of the compression modulus Es, an important geotechnical parameter, contributes considerably to understanding the underlying geological processes and to adequately assessing the mechanical effects of Es on the differential settlement of large continuous structure foundations. Such analyses should be derived using an assimilating approach that combines in-situ static cone penetration tests (CPT) with borehole experiments. To achieve this, the Es distribution of a silty clay stratum in region A of the China Expo Center (Shanghai) is studied using the Bayesian maximum entropy method. This method rigorously and efficiently integrates geotechnical investigations of different precisions and their sources of uncertainty. Individual CPT soundings were modeled as probability density curves by maximum entropy theory. The spatial prior multivariate probability density function (PDF) and the likelihood PDF relating the CPT positions to the potential value at a prediction point were built from borehole experiments; then, by numerical integration over the CPT probability density curves, the posterior probability density curve at the prediction point was calculated within the Bayesian reverse interpolation framework. The results were compared with those of Gaussian sequential stochastic simulation, and the differences between normally distributed single CPT soundings and simulated probability density curves based on maximum entropy theory were also discussed. It is shown that the study of Es spatial distributions can be improved by properly incorporating CPT sampling variation into the interpolation process, and that more informative estimates are generated by considering CPT uncertainty at the estimation points. The calculation illustrates the significance of stochastic Es characterization of a stratum and identifies limitations associated with inadequate geostatistical interpolation techniques. These characterization results provide a multi-precision information assimilation method for other geotechnical parameters.

  1. Classic Maximum Entropy Recovery of the Average Joint Distribution of Apparent FRET Efficiency and Fluorescence Photons for Single-molecule Burst Measurements

    PubMed Central

    DeVore, Matthew S.; Gull, Stephen F.; Johnson, Carey K.

    2012-01-01

    We describe a method for analysis of single-molecule Förster resonance energy transfer (FRET) burst measurements using classic maximum entropy. Classic maximum entropy determines the Bayesian inference for the joint probability describing the total fluorescence photons and the apparent FRET efficiency. The method was tested with simulated data and then with DNA labeled with fluorescent dyes. The most probable joint distribution can be marginalized to obtain both the overall distribution of fluorescence photons and the apparent FRET efficiency distribution. This method proves to be ideal for determining the distance distribution of FRET-labeled biomolecules, and it successfully predicts the shape of the recovered distributions. PMID:22338694

  2. Spatial Distribution and Conservation of Speckled Hind and Warsaw Grouper in the Atlantic Ocean off the Southeastern U.S.

    PubMed Central

    Farmer, Nicholas A.; Karnauskas, Mandy

    2013-01-01

    There is broad interest in the development of efficient marine protected areas (MPAs) to reduce bycatch and end overfishing of speckled hind (Epinephelus drummondhayi) and warsaw grouper (Hyporthodus nigritus) in the Atlantic Ocean off the southeastern U.S. We assimilated decades of data from many fishery-dependent, fishery-independent, and anecdotal sources to describe the spatial distribution of these data-limited stocks. A spatial classification model was developed to categorize depth-grids based on the distribution of speckled hind and warsaw grouper point observations and identified benthic habitats. Logistic regression analysis was used to develop a quantitative model to predict the spatial distribution of speckled hind and warsaw grouper as a function of depth, latitude, and habitat. Models, controlling for sampling gear effects, were selected based on AIC and 10-fold cross validation. The best-fitting model for warsaw grouper included latitude and depth to explain 10.8% of the variability in probability of detection, with a false prediction rate of 28–33%. The best-fitting model for speckled hind, per cross-validation, included latitude and depth to explain 36.8% of the variability in probability of detection, with a false prediction rate of 25–27%. The best-fitting speckled hind model, per AIC, also included habitat, but had false prediction rates up to 36%. Speckled hind and warsaw grouper habitats followed a shelf-edge hardbottom ridge from North Carolina to southeast Florida, with speckled hind more common to the north and warsaw grouper more common to the south. The proportion of habitat classifications and model-estimated stock contained within established and proposed MPAs was computed. Existing MPAs covered 10% of probable shelf-edge habitats for speckled hind and warsaw grouper, protecting 3–8% of speckled hind and 8% of warsaw grouper stocks. Proposed MPAs could add 24% more probable shelf-edge habitat, and protect an additional 14–29% of speckled hind and 20% of warsaw grouper stocks. PMID:24260126

  3. Probability distributions of the electroencephalogram envelope of preterm infants.

    PubMed

    Saji, Ryoya; Hirasawa, Kyoko; Ito, Masako; Kusuda, Satoshi; Konishi, Yukuo; Taga, Gentaro

    2015-06-01

    To determine the stationary characteristics of electroencephalogram (EEG) envelopes for prematurely born (preterm) infants and investigate the intrinsic characteristics of early brain development in preterm infants. Twenty neurologically normal sets of EEGs recorded in infants with a post-conceptional age (PCA) range of 26-44 weeks (mean 37.5 ± 5.0 weeks) were analyzed. Hilbert transform was applied to extract the envelope. We determined the suitable probability distribution of the envelope and performed a statistical analysis. It was found that (i) the probability distributions for preterm EEG envelopes were best fitted by lognormal distributions at 38 weeks PCA or less, and by gamma distributions at 44 weeks PCA; (ii) the scale parameter of the lognormal distribution had positive correlations with PCA as well as a strong negative correlation with the percentage of low-voltage activity; (iii) the shape parameter of the lognormal distribution had significant positive correlations with PCA; (iv) the statistics of mode showed significant linear relationships with PCA, and, therefore, it was considered a useful index in PCA prediction. These statistics, including the scale parameter of the lognormal distribution and the skewness and mode derived from a suitable probability distribution, may be good indexes for estimating stationary nature in developing brain activity in preterm infants. The stationary characteristics, such as discontinuity, asymmetry, and unimodality, of preterm EEGs are well indicated by the statistics estimated from the probability distribution of the preterm EEG envelopes. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
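
    The envelope-and-fit pipeline follows a standard pattern: take the analytic signal via the Hilbert transform, use its magnitude as the envelope, and compare candidate distributions by log-likelihood. A sketch on a synthetic signal (the toy EEG and the zero-location constraint are assumptions, not details of the study):

        import numpy as np
        from scipy.signal import hilbert
        from scipy.stats import gamma, lognorm

        rng = np.random.default_rng(0)
        fs = 250                                  # sampling rate (Hz)
        t = np.arange(0, 60, 1 / fs)
        eeg = rng.standard_normal(t.size) * (1.2 + np.sin(2 * np.pi * 0.05 * t))

        envelope = np.abs(hilbert(eeg))           # magnitude of the analytic signal

        for name, dist in [("lognormal", lognorm), ("gamma", gamma)]:
            params = dist.fit(envelope, floc=0)   # fix the location at zero
            ll = np.sum(dist.logpdf(envelope, *params))
            print(name, "log-likelihood:", round(ll, 1))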

  4. Distribution of submerged aquatic vegetation in the St. Louis River estuary: Maps and models

    EPA Science Inventory

    In late summer of 2011 and 2012 we used echo-sounding gear to map the distribution of submerged aquatic vegetation (SAV) in the St. Louis River Estuary (SLRE). From these data we produced maps of SAV distribution and we created logistic models to predict the probability of occurr...

  5. The probability distribution of side-chain conformations in [Leu] and [Met]enkephalin determines the potency and selectivity to mu and delta opiate receptors.

    PubMed

    Nielsen, Bjørn G; Jensen, Morten Ø; Bohr, Henrik G

    2003-01-01

    The structure of enkephalin, a small neuropeptide with five amino acids, has been simulated on computers using molecular dynamics. Such simulations exhibit a few stable conformations, which also have been identified experimentally. The simulations provide the possibility to perform cluster analysis in the space defined by potentially pharmacophoric measures such as dihedral angles, side-chain orientation, etc. By analyzing the statistics of the resulting clusters, the probability distribution of the side-chain conformations may be determined. These probabilities allow us to predict the selectivity of [Leu]enkephalin and [Met]enkephalin to the known mu- and delta-type opiate receptors to which they bind as agonists. Other plausible consequences of these probability distributions are discussed in relation to the way in which they may influence the dynamics of the synapse. Copyright 2003 Wiley Periodicals, Inc. Biopolymers (Pept Sci) 71: 577-592, 2003

  6. Time-dependent landslide probability mapping

    USGS Publications Warehouse

    Campbell, Russell H.; Bernknopf, Richard L.; ,

    1993-01-01

    Case studies where time of failure is known for rainfall-triggered debris flows can be used to estimate the parameters of a hazard model in which the probability of failure is a function of time. As an example, a time-dependent function for the conditional probability of a soil slip is estimated from independent variables representing hillside morphology, approximations of material properties, and the duration and rate of rainfall. If probabilities are calculated in a GIS (geographic information system) environment, the spatial distribution of the result for any given hour can be displayed on a map. Although the probability levels in this example are uncalibrated, the method offers a potential for evaluating different physical models and different earth-science variables by comparing the map distribution of predicted probabilities with inventory maps for different areas and different storms. If linked with spatial and temporal socio-economic variables, this method could be used for short-term risk assessment.

  7. Nematode Damage Functions: The Problems of Experimental and Sampling Error

    PubMed Central

    Ferris, H.

    1984-01-01

    The development and use of pest damage functions involves measurement and experimental errors associated with cultural, environmental, and distributional factors. Damage predictions are more valuable if considered with associated probability. Collapsing population densities into a geometric series of population classes allows a pseudo-replication removal of experimental and sampling error in damage function development. Recognition of the nature of sampling error for aggregated populations allows assessment of probability associated with the population estimate. The product of the probabilities incorporated in the damage function and in the population estimate provides a basis for risk analysis of the yield loss prediction and the ensuing management decision. PMID:19295865

  8. Predicted Distribution of Visceral Leishmaniasis Vectors (Diptera: Psychodidae; Phlebotominae) in Iran: A Niche Model Study.

    PubMed

    Hanafi-Bojd, A A; Rassi, Y; Yaghoobi-Ershadi, M R; Haghdoost, A A; Akhavan, A A; Charrahy, Z; Karimi, A

    2015-12-01

    Visceral leishmaniasis (VL) is an important vector-borne disease in Iran. To date, Leishmania infantum has been detected in five species of sand flies in the country: Phlebotomus kandelakii, Phlebotomus major s.l., Phlebotomus perfiliewi, Phlebotomus alexandri and Phlebotomus tobbi. Phlebotomus keshishiani has also been found infected with Leishmania parasites. This study aimed to predict the probable niches and distribution of the vectors of visceral leishmaniasis in Iran. Data on the spatial distribution of sand flies were obtained from the Iranian sand fly database; sample points came from faunistic studies conducted during 1995-2013. MaxEnt software was used to predict the appropriate ecological niches for the given species, using climatic and topographical data. Distribution maps were prepared and classified in ArcGIS to locate the main ecological niches of the vectors and hot spots for VL transmission in Iran. Phlebotomus kandelakii, Ph. major s.l. and Ph. alexandri appear to have played the most important role in VL transmission in Iran, so this study focuses on them. The MaxEnt models for the probability of distribution of the studied sand flies showed a high contribution of climatological and topographical variables in predicting the potential distribution of the three vector species. Isothermality was the environmental variable with the highest gain when used in isolation for Ph. kandelakii and Ph. major s.l., while for Ph. alexandri the most effective variable was precipitation of the coldest quarter. This study presents the first prediction of the distribution of the sand fly vectors of VL in Iran. The predicted distributions matched the disease-endemic areas of the country, while some currently unaffected areas were found to have transmission potential. More comprehensive studies of the ecology and vector competence of VL vectors in the country are recommended. © 2015 Blackwell Verlag GmbH.

  9. Stochastic analysis of particle movement over a dune bed

    USGS Publications Warehouse

    Lee, Baum K.; Jobson, Harvey E.

    1977-01-01

    Stochastic models are available that can be used to predict the transport and dispersion of bed-material sediment particles in an alluvial channel. These models are based on the proposition that the movement of a single bed-material sediment particle consists of a series of steps of random length separated by rest periods of random duration and, therefore, application of the models requires a knowledge of the probability distributions of the step lengths, the rest periods, the elevation of particle deposition, and the elevation of particle erosion. The procedure was tested by determining distributions from bed profiles formed in a large laboratory flume with a coarse sand as the bed material. The elevation of particle deposition and the elevation of particle erosion can be considered to be identically distributed, and their distribution can be described by either a 'truncated Gaussian' or a 'triangular' density function. The conditional probability distribution of the rest period given the elevation of particle deposition closely followed the two-parameter gamma distribution. The conditional probability distribution of the step length given the elevation of particle erosion and the elevation of particle deposition also closely followed the two-parameter gamma density function. For a given flow, the scale and shape parameters describing the gamma probability distributions can be expressed as functions of bed elevation. (Woodard-USGS)

  10. Estimates of the low-level wind shear and turbulence in the vicinity of Kennedy International Airport on 24 June 1975

    NASA Technical Reports Server (NTRS)

    Lewellen, W. S.; Williamson, G. G.

    1976-01-01

    A study was conducted to estimate the types of wind and turbulence distributions that may have existed at the time of the crash of Eastern Airlines Flight 66 while attempting to land. A number of different wind and turbulence profiles are predicted for the site and date of the crash. The morning and mid-afternoon predictions agree reasonably well in magnitude and direction with the values reported by the weather observer. Although precise predictions cannot be made during the passage of the thunderstorm that coincided with the time of the accident, a number of profiles that might exist under or in the vicinity of a thunderstorm are presented. The most probable profile predicts the mean headwind shear over a 100-m (300-foot) altitude change and the average fluctuations about the mean headwind distribution. This combination of means and fluctuations leads to a reasonable probability that the instantaneous headwind shear equaled the maximum value reported in the flight recorder data.

  11. Probabilistic forecasting for extreme NO2 pollution episodes.

    PubMed

    Aznarte, José L

    2017-10-01

    In this study, we investigate the suitability of quantile regression for predicting extreme concentrations of NO2. In contrast to the usual point forecasting, where a single value is forecast for each horizon, probabilistic forecasting through quantile regression allows the full probability distribution to be predicted, which in turn allows models to be built specifically for the tails of that distribution. Using data from the city of Madrid, including NO2 concentrations as well as meteorological measurements, we build models that predict extreme NO2 concentrations, outperforming point-forecasting alternatives, and we show that the predictions are accurate, reliable and sharp. Besides, we study the relative importance of the independent variables involved, and show how the variables important for the median quantile differ from those important for the upper quantiles. Furthermore, we present a method to compute the probability of exceedance of thresholds, which is a simple and comprehensible way to present probabilistic forecasts while maximizing their usefulness. Copyright © 2017 Elsevier Ltd. All rights reserved.
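
    Quantile regression of this general kind can be sketched with gradient boosting under the pinball loss: one model per quantile yields the predictive distribution, and a threshold-exceedance probability follows by interpolating the fitted quantile function. The data, features, and threshold below are made up, and quantile crossing is ignored for simplicity:

        import numpy as np
        from sklearn.ensemble import GradientBoostingRegressor

        rng = np.random.default_rng(0)
        X = rng.normal(size=(2000, 3))       # e.g., wind speed, temperature, hour
        y = 40 + 25 * X[:, 0] + rng.gumbel(0, 12, 2000)  # toy NO2 levels (ug/m3)

        quantiles = [0.05, 0.25, 0.50, 0.75, 0.95, 0.99]
        models = {q: GradientBoostingRegressor(loss="quantile", alpha=q).fit(X, y)
                  for q in quantiles}

        x_new = np.array([[1.5, 0.0, 0.0]])
        preds = np.array([models[q].predict(x_new)[0] for q in quantiles])

        # P(NO2 > threshold) = 1 - F(threshold), with F read off the fitted
        # quantile function by interpolation (assumes non-crossing quantiles).
        threshold = 90.0
        p_exceed = 1.0 - np.interp(threshold, preds, quantiles)
        print(preds.round(1), "P(NO2 >", threshold, ") ~", round(p_exceed, 3))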

  12. High-precision simulation of the height distribution for the KPZ equation

    NASA Astrophysics Data System (ADS)

    Hartmann, Alexander K.; Le Doussal, Pierre; Majumdar, Satya N.; Rosso, Alberto; Schehr, Gregory

    2018-03-01

    The one-point distribution of the height for the continuum Kardar-Parisi-Zhang (KPZ) equation is determined numerically using the mapping to the directed polymer in a random potential at high temperature. Using an importance sampling approach, the distribution is obtained over a large range of values, down to a probability density as small as 10^-1000 in the tails. Both short and long times are investigated and compared with recent analytical predictions for the large-deviation forms of the probability of rare fluctuations. At short times the agreement with the analytical expression is spectacular. We observe that the far left and right tails, with exponents 5/2 and 3/2, respectively, are preserved also in the region of long times. We present some evidence for the predicted non-trivial crossover in the left tail from the 5/2 tail exponent to the cubic tail of the Tracy-Widom distribution, although the details of the full scaling form remain beyond reach.

  13. Predictability in cellular automata.

    PubMed

    Agapie, Alexandru; Andreica, Anca; Chira, Camelia; Giuclea, Marius

    2014-01-01

    Modelled as finite homogeneous Markov chains, probabilistic cellular automata with local transition probabilities in (0, 1) always possess a stationary distribution. This result alone is not very helpful when it comes to predicting the final configuration; one needs also a formula connecting the probabilities in the stationary distribution to some intrinsic feature of the lattice configuration. Previous results on asynchronous cellular automata have shown that such a feature really exists. It is the number of zero-one borders within the automaton's binary configuration. An exponential formula in the number of zero-one borders has been proved for the 1-D, 2-D and 3-D asynchronous automata with neighborhood three, five and seven, respectively. We perform computer experiments on a synchronous cellular automaton to check whether the empirical distribution also obeys that theoretical formula. The numerical results indicate a perfect fit for neighborhood three and five, which opens the way for a rigorous proof of the formula in this new, synchronous case.
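
    An experiment of the kind described can be sketched as follows: simulate a synchronous probabilistic majority rule on a small ring, tally how often each configuration is visited, and check whether the log of the average per-configuration probability is roughly linear in the number of zero-one borders. The rule and parameters are illustrative assumptions, not those of the paper:

        import numpy as np
        from collections import Counter

        rng = np.random.default_rng(0)
        N, steps, q = 10, 200_000, 0.8  # ring of 10 cells; majority taken w.p. q

        def step(x):
            # Synchronous rule on neighborhood three: each cell adopts the
            # local majority with probability q, the minority otherwise.
            maj = ((np.roll(x, 1) + x + np.roll(x, -1)) >= 2).astype(np.int8)
            take = rng.random(N) < q
            return np.where(take, maj, 1 - maj).astype(np.int8)

        def borders(x):
            return int(np.sum(x != np.roll(x, 1)))  # number of zero-one borders

        x = rng.integers(0, 2, N, dtype=np.int8)
        visits = Counter()
        for _ in range(steps):
            x = step(x)
            visits[x.tobytes()] += 1

        # Average empirical probability per visited configuration, by border count.
        mass, n_cfg = Counter(), Counter()
        for key, n in visits.items():
            b = borders(np.frombuffer(key, dtype=np.int8))
            mass[b] += n
            n_cfg[b] += 1

        for b in sorted(mass):
            p = mass[b] / n_cfg[b] / steps
            print(b, round(np.log(p), 2))  # roughly linear in b if the formula holds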

  14. Modeling nonbreeding distributions of shorebirds and waterfowl in response to climate change

    USGS Publications Warehouse

    Reese, Gordon; Skagen, Susan K.

    2017-01-01

    To identify areas on the landscape that may contribute to a robust network of conservation areas, we modeled the probabilities of occurrence of several en route migratory shorebirds and wintering waterfowl in the southern Great Plains of North America, including responses to changing climate. We predominantly used data from the eBird citizen-science project to model probabilities of occurrence relative to land-use patterns, spatial distribution of wetlands, and climate. We projected models to potential future climate conditions using five representative general circulation models of the Coupled Model Intercomparison Project 5 (CMIP5). We used Random Forests to model probabilities of occurrence and compared the time periods 1981–2010 (hindcast) and 2041–2070 (forecast) in “model space.” Projected changes in shorebird probabilities of occurrence varied with species-specific general distribution pattern, migration distance, and spatial extent. Species using the western and northern portion of the study area exhibited the greatest likelihoods of decline, whereas species with more easterly occurrences, mostly long-distance migrants, had the greatest projected increases in probability of occurrence. At an ecoregional extent, differences in probabilities of shorebird occurrence ranged from −0.015 to 0.045 when averaged across climate models, with the largest increases occurring early in migration. Spatial shifts are predicted for several shorebird species. Probabilities of occurrence of wintering Mallards and Northern Pintail are predicted to increase by 0.046 and 0.061, respectively, with northward shifts projected for both species. When incorporated into partner land management decision tools, results at ecoregional extents can be used to identify wetland complexes with the greatest potential to support birds in the nonbreeding season under a wide range of future climate scenarios.
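
    The modeling-and-projection step pairs a probabilistic classifier with current and future covariates; the "model space" comparison is then a difference of predicted occurrence probabilities. A minimal Random Forest sketch with made-up covariates (not the eBird data or CMIP5 layers):

        import numpy as np
        from sklearn.ensemble import RandomForestClassifier

        rng = np.random.default_rng(0)

        # Covariates per cell: land use, wetland density, temperature, precipitation
        X_current = rng.normal(size=(5000, 4))
        p_true = 1 / (1 + np.exp(-X_current[:, 2]))   # toy occurrence process
        present = rng.binomial(1, p_true)

        rf = RandomForestClassifier(n_estimators=500, random_state=0)
        rf.fit(X_current, present)

        # Project onto future covariates: here, simply warmer cells.
        X_future = X_current.copy()
        X_future[:, 2] += 1.0

        p_now = rf.predict_proba(X_current)[:, 1]
        p_fut = rf.predict_proba(X_future)[:, 1]
        print("mean change in occurrence probability:",
              round(float((p_fut - p_now).mean()), 3))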

  15. Technical report. The application of probability-generating functions to linear-quadratic radiation survival curves.

    PubMed

    Kendal, W S

    2000-04-01

    To illustrate how probability-generating functions (PGFs) can be employed to derive a simple probabilistic model for clonogenic survival after exposure to ionizing irradiation. Both repairable and irreparable radiation damage to DNA were assumed to occur by independent (Poisson) processes, at intensities proportional to the irradiation dose. Also, repairable damage was assumed to be either repaired or further (lethally) injured according to a third (Bernoulli) process, with the probability of lethal conversion being directly proportional to dose. Using the algebra of PGFs, these three processes were combined to yield a composite PGF that described the distribution of lethal DNA lesions in irradiated cells. The composite PGF characterized a Poisson distribution with mean alphaD + betaD^2, where D was dose and alpha and beta were radiobiological constants. This distribution yielded the conventional linear-quadratic survival equation. To test the composite model, the derived distribution was used to predict the frequencies of multiple chromosomal aberrations in irradiated human lymphocytes. The predictions agreed well with observation. This probabilistic model was consistent with single-hit mechanisms, but it was not consistent with binary misrepair mechanisms. A stochastic model for radiation survival has been constructed from elementary PGFs that exactly yields the linear-quadratic relationship. This approach can be used to investigate other simple probabilistic survival models.
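
    The composition described here can be written out in a few lines of PGF algebra; the rate constants a1, a2 and the conversion slope c are illustrative names. Irreparable lesions are Poisson with mean a1*D; repairable lesions are Poisson with mean a2*D, each lethally converted with probability c*D (Bernoulli thinning):

        \begin{align*}
        G_{\mathrm{irr}}(s) &= e^{a_1 D (s-1)}
          && \text{(irreparable lesions, Poisson)}\\
        G_{\mathrm{rep}}(s) &= e^{a_2 D \left[(1 - cD + cDs) - 1\right]}
                             = e^{a_2 c D^2 (s-1)}
          && \text{(Bernoulli-thinned repairable lesions)}\\
        G(s) &= G_{\mathrm{irr}}(s)\, G_{\mathrm{rep}}(s)
              = e^{(a_1 D + a_2 c D^2)(s-1)}
              \equiv e^{(\alpha D + \beta D^2)(s-1)},\\
        S(D) &= \Pr(N=0) = G(0) = e^{-\alpha D - \beta D^2}.
        \end{align*}

    The lethal-lesion count is therefore Poisson with mean alphaD + betaD^2, and survival, the zero-lesion probability, recovers the linear-quadratic equation.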

  16. Probability and the changing shape of response distributions for orientation.

    PubMed

    Anderson, Britt

    2014-11-18

    Spatial attention and feature-based attention are regarded as two independent mechanisms for biasing the processing of sensory stimuli. Feature attention is held to be a spatially invariant mechanism that advantages a single feature per sensory dimension. In contrast to the prediction of location independence, I found that participants were able to report the orientation of a briefly presented visual grating better for targets defined by high probability conjunctions of features and locations even when orientations and locations were individually uniform. The advantage for high-probability conjunctions was accompanied by changes in the shape of the response distributions. High-probability conjunctions had error distributions that were not normally distributed but demonstrated increased kurtosis. The increase in kurtosis could be explained as a change in the variances of the component tuning functions that comprise a population mixture. By changing the mixture distribution of orientation-tuned neurons, it is possible to change the shape of the discrimination function. This prompts the suggestion that attention may not "increase" the quality of perceptual processing in an absolute sense but rather prioritizes some stimuli over others. This results in an increased number of highly accurate responses to probable targets and, simultaneously, an increase in the number of very inaccurate responses. © 2014 ARVO.

  17. Distribution of rain height over subtropical region: Durban, South Africa for satellite communication systems

    NASA Astrophysics Data System (ADS)

    Olurotimi, E. O.; Sokoya, O.; Ojo, J. S.; Owolawi, P. A.

    2018-03-01

    Rain height is one of the significant parameters for predicting rain attenuation on Earth-space telecommunication links, especially those operating at frequencies above 10 GHz. This study examines the three-parameter Dagum distribution of rain height over Durban, South Africa. Five years of data were used to study the monthly, seasonal, and annual variations using distribution parameters estimated by maximum likelihood. The performance of the distribution was assessed using statistical goodness-of-fit measures. The three-parameter Dagum distribution proves appropriate for modeling rain height over Durban, with a root mean square error of 0.26. The shape and scale parameters of the distribution show wide variation. The 0.01% probability-of-exceedance level indicates a high probability of rain attenuation at higher frequencies.
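
    The three-parameter Dagum CDF is F(x) = (1 + (x/b)^(-a))^(-p), with shape parameters a, p and scale b. A maximum-likelihood fit of this form can be sketched as follows, with synthetic data standing in for the Durban rain-height measurements:

        import numpy as np
        from scipy.optimize import minimize

        def dagum_logpdf(x, a, b, p):
            # log of f(x) = (a*p/x) * (x/b)**(a*p) / (1 + (x/b)**a)**(p + 1)
            z = x / b
            return (np.log(a * p) - np.log(x) + a * p * np.log(z)
                    - (p + 1) * np.log1p(z ** a))

        def neg_loglik(theta, x):
            a, b, p = theta
            if min(a, b, p) <= 0:
                return np.inf
            return -np.sum(dagum_logpdf(x, a, b, p))

        rng = np.random.default_rng(0)
        u = rng.uniform(size=2000)
        a0, b0, p0 = 6.0, 4.5, 0.8                      # "true" parameters
        x = b0 * (u ** (-1 / p0) - 1) ** (-1 / a0)      # inverse-CDF sampling

        res = minimize(neg_loglik, x0=[2.0, np.median(x), 1.0],
                       args=(x,), method="Nelder-Mead")
        a, b, p = res.x
        print("MLE (a, b, p):", res.x.round(2))
        print("P(rain height > 5):", 1 - (1 + (5.0 / b) ** -a) ** -p)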

  18. Predicting future changes in Muskegon River Watershed game fish distributions under future land cover alteration and climate change scenarios

    USGS Publications Warehouse

    Steen, Paul J.; Wiley, Michael J.; Schaeffer, Jeffrey S.

    2010-01-01

    Future alterations in land cover and climate are likely to cause substantial changes in the ranges of fish species. Predictive distribution models are an important tool for assessing the probability that these changes will cause increases or decreases in or the extirpation of species. Classification tree models that predict the probability of game fish presence were applied to the streams of the Muskegon River watershed, Michigan. The models were used to study three potential future scenarios: (1) land cover change only, (2) land cover change and a 3°C increase in air temperature by 2100, and (3) land cover change and a 5°C increase in air temperature by 2100. The analysis indicated that the expected change in air temperature and subsequent change in water temperatures would result in the decline of coldwater fish in the Muskegon watershed by the end of the 21st century while cool- and warmwater species would significantly increase their ranges. The greatest decline detected was a 90% reduction in the probability that brook trout Salvelinus fontinalis would occur in Bigelow Creek. The greatest increase was a 276% increase in the probability that northern pike Esox lucius would occur in the Middle Branch River. Changes in land cover are expected to cause large changes in a few fish species, such as walleye Sander vitreus and Chinook salmon Oncorhynchus tshawytscha, but not to drive major changes in species composition. Managers can alter stream environmental conditions to maximize the probability that species will reside in particular stream reaches through application of the classification tree models. Such models represent a good way to predict future changes, as they give quantitative estimates of the n-dimensional niches for particular species.

  19. Universal Inverse Power-Law Distribution for Fractal Fluctuations in Dynamical Systems: Applications for Predictability of Inter-Annual Variability of Indian and USA Region Rainfall

    NASA Astrophysics Data System (ADS)

    Selvam, A. M.

    2017-01-01

    Dynamical systems in nature exhibit self-similar fractal space-time fluctuations on all scales, indicating long-range correlations; therefore, the statistical normal distribution, with its implicit assumptions of independence, fixed mean and standard deviation, cannot be used for the description and quantification of fractal data sets. The author has developed a general systems theory based on classical statistical physics for fractal fluctuations which predicts the following. (1) The fractal fluctuations signify an underlying eddy continuum, the larger eddies being the integrated mean of enclosed smaller-scale fluctuations. (2) The probability distribution of eddy amplitudes and the variance (square of eddy amplitude) spectrum of fractal fluctuations follow the universal Boltzmann inverse power law expressed as a function of the golden mean. (3) Fractal fluctuations are signatures of quantum-like chaos, since the additive amplitudes of eddies when squared represent probability densities analogous to the sub-atomic dynamics of quantum systems such as the photon or electron. (4) The model-predicted distribution is very close to the statistical normal distribution for moderate events within two standard deviations from the mean but exhibits a fat long tail associated with hazardous extreme events. Continuous periodogram power spectral analyses of available GHCN annual total rainfall time series for the period 1900-2008 for Indian and USA stations show that the power spectra and the corresponding probability distributions follow the model-predicted universal inverse power law form, signifying an eddy continuum structure underlying the observed inter-annual variability of rainfall. On a global scale, man-made greenhouse gas related atmospheric warming would result in intensification of natural climate variability, seen immediately in high frequency fluctuations such as QBO and ENSO and on even shorter timescales. Model concepts and results of analyses are discussed with reference to possible prediction of climate change. Model concepts, if correct, unambiguously rule out linear trends in climate. Climate change will only be manifested as an increase or decrease in the natural variability. However, more stringent tests of model concepts and predictions are required before application to such an important issue as climate change. Observations and simulations with climate models show that precipitation extremes intensify in response to a warming climate (O'Gorman in Curr Clim Change Rep 1:49-59, 2015).

  20. Predictive modelling of habitat selection by marine predators with respect to the abundance and depth distribution of pelagic prey.

    PubMed

    Boyd, Charlotte; Castillo, Ramiro; Hunt, George L; Punt, André E; VanBlaricom, Glenn R; Weimerskirch, Henri; Bertrand, Sophie

    2015-11-01

    Understanding the ecological processes that underpin species distribution patterns is a fundamental goal in spatial ecology. However, developing predictive models of habitat use is challenging for species that forage in marine environments, as both predators and prey are often highly mobile and difficult to monitor. Consequently, few studies have developed resource selection functions for marine predators based directly on the abundance and distribution of their prey. We analysed contemporaneous data on the diving locations of two seabird species, the shallow-diving Peruvian Booby (Sula variegata) and deeper diving Guanay Cormorant (Phalacrocorax bougainvilliorum), and the abundance and depth distribution of their main prey, Peruvian anchoveta (Engraulis ringens). Based on this unique data set, we developed resource selection functions to test the hypothesis that the probability of seabird diving behaviour at a given location is a function of the relative abundance of prey in the upper water column. For both species, we show that the probability of diving behaviour is mostly explained by the distribution of prey at shallow depths. While the probability of diving behaviour increases sharply with prey abundance at relatively low levels of abundance, support for including abundance in addition to the depth distribution of prey is weak, suggesting that prey abundance was not a major factor determining the location of diving behaviour during the study period. The study thus highlights the importance of the depth distribution of prey for two species of seabird with different diving capabilities. The results complement previous research that points towards the importance of oceanographic processes that enhance the accessibility of prey to seabirds. The implications are that locations where prey is predictably found at accessible depths may be more important for surface foragers, such as seabirds, than locations where prey is predictably abundant. Analysis of the relative importance of abundance and accessibility is essential for the design and evaluation of effective management responses to reduced prey availability for seabirds and other top predators in marine systems. © 2015 The Authors. Journal of Animal Ecology © 2015 British Ecological Society.

  1. Analytical performance evaluation of SAR ATR with inaccurate or estimated models

    NASA Astrophysics Data System (ADS)

    DeVore, Michael D.

    2004-09-01

    Hypothesis testing algorithms for automatic target recognition (ATR) are often formulated in terms of some assumed distribution family. The parameter values corresponding to a particular target class together with the distribution family constitute a model for the target's signature. In practice such models exhibit inaccuracy because of incorrect assumptions about the distribution family and/or because of errors in the assumed parameter values, which are often determined experimentally. Model inaccuracy can have a significant impact on performance predictions for target recognition systems. Such inaccuracy often causes model-based predictions that ignore the difference between assumed and actual distributions to be overly optimistic. This paper reports on research to quantify the effect of inaccurate models on performance prediction and to estimate the effect using only trained parameters. We demonstrate that for large observation vectors the class-conditional probabilities of error can be expressed as a simple function of the difference between two relative entropies. These relative entropies quantify the discrepancies between the actual and assumed distributions and can be used to express the difference between actual and predicted error rates. Focusing on the problem of ATR from synthetic aperture radar (SAR) imagery, we present estimators of the probabilities of error in both ideal and plug-in tests expressed in terms of the trained model parameters. These estimators are defined in terms of unbiased estimates for the first two moments of the sample statistic. We present an analytical treatment of these results and include demonstrations from simulated radar data.
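
    For Gaussian signature models, relative entropies of the kind the paper keys on have a closed form; the snippet below computes them for invented parameter values as a hedged illustration of how assumed-versus-actual discrepancies enter, not the paper's exact estimator.

```python
import numpy as np

def kl_gauss(mu0, s0, mu1, s1):
    """KL divergence D(N(mu0, s0^2) || N(mu1, s1^2)), in nats."""
    return np.log(s1 / s0) + (s0**2 + (mu0 - mu1) ** 2) / (2 * s1**2) - 0.5

# Actual target signature vs. (slightly wrong) trained models for two classes.
actual = (0.0, 1.0)
assumed_c1 = (0.1, 1.2)   # hypothetical trained parameters, class 1
assumed_c2 = (2.0, 1.0)   # hypothetical trained parameters, class 2

# The large-sample result keys on differences of relative entropies like these.
d1 = kl_gauss(*actual, *assumed_c1)
d2 = kl_gauss(*actual, *assumed_c2)
print(f"D(actual||c1) = {d1:.3f}, D(actual||c2) = {d2:.3f}, difference = {d1 - d2:.3f}")
```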

  2. Steady-state distributions of probability fluxes on complex networks

    NASA Astrophysics Data System (ADS)

    Chełminiak, Przemysław; Kurzyński, Michał

    2017-02-01

    We consider a simple model of Markovian stochastic dynamics on complex networks to examine the statistical properties of probability fluxes. An additional transition, hereafter called a gate, powered by an external constant force, breaks detailed balance in the network. We argue, using a theoretical approach and numerical simulations, that the stationary distributions of the probability fluxes emerging under such conditions converge to a Gaussian distribution whose standard deviation, by virtue of the stationary fluctuation theorem, depends directly on the square root of the mean flux. In turn, the nonlinear relation between the mean flux and the external force, which is the key result of the present study, allows us to calculate the two parameters that entirely characterize the Gaussian distribution of the probability fluxes both close to and far from the equilibrium state. Other effects that modify these parameters, such as the addition of shortcuts to the tree-like network, the extension and configuration of the gate, and changes in the network size, are studied by means of computer simulations and discussed in terms of the rigorous theoretical predictions.
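
    A minimal sketch of the setup, under strong simplifying assumptions (a three-state cycle standing in for a complex network, with one biased "gate" edge): windowed net crossings of the gate give an empirical flux distribution whose spread can be compared against the square root of the mean.

```python
import numpy as np

# Toy three-state cycle with a driven "gate" edge (0 -> 1 boosted), breaking detailed balance.
rng = np.random.default_rng(2)
P = np.array([[0.2, 0.6, 0.2],
              [0.2, 0.2, 0.6],
              [0.6, 0.2, 0.2]])

state, window, net, fluxes = 0, 500, 0, []
for step in range(200_000):
    nxt = rng.choice(3, p=P[state])
    if (state, nxt) == (0, 1):
        net += 1                      # forward crossing of the gate
    elif (state, nxt) == (1, 0):
        net -= 1                      # backward crossing
    state = nxt
    if (step + 1) % window == 0:
        fluxes.append(net)
        net = 0

fluxes = np.asarray(fluxes, dtype=float)
# The fluctuation-theorem relation ties the std to sqrt(mean flux);
# in this toy chain the two should be of comparable magnitude.
print(f"mean flux/window: {fluxes.mean():.1f}, std: {fluxes.std():.1f}, "
      f"sqrt(mean): {np.sqrt(fluxes.mean()):.1f}")
```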

  3. Protein single-model quality assessment by feature-based probability density functions.

    PubMed

    Cao, Renzhi; Cheng, Jianlin

    2016-04-04

    Protein quality assessment (QA) plays an important role in protein structure prediction. We developed a novel single-model quality assessment method, Qprob. Qprob calculates the absolute error of each protein feature value against the true quality score (i.e. the GDT-TS score) of protein structural models and uses these errors to estimate a probability density distribution for quality assessment. Qprob was blindly tested in the 11th Critical Assessment of Techniques for Protein Structure Prediction (CASP11) as the MULTICOM-NOVEL server. The official CASP results show that Qprob ranks as one of the top single-model QA methods. In addition, Qprob contributes to our protein tertiary structure predictor MULTICOM, which was officially ranked 3rd out of 143 predictors. This good performance shows that Qprob is effective at assessing the quality of models of hard targets. These results demonstrate that this new probability-density-distribution-based method is effective for protein single-model quality assessment and useful for protein structure prediction. The Qprob web server and software are freely available at: http://calla.rnet.missouri.edu/qprob/.
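
    The core idea, one estimated error density per feature combined into a quality signal, can be sketched as below; the feature names, kernel density estimates, and naive log-sum combination are all assumptions, not Qprob's actual formulation.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(3)

# Hypothetical training data: per-feature absolute errors vs. true GDT-TS, one KDE per feature.
train_errors = {f: np.abs(rng.normal(0, 0.1 + 0.05 * i, 1000)) for i, f in
                enumerate(["energy", "secondary_structure", "solvent_accessibility"])}
kdes = {f: gaussian_kde(e) for f, e in train_errors.items()}

# Score a new model: combine per-feature densities (a naive log-sum, an assumption).
new_model_errors = {"energy": 0.08, "secondary_structure": 0.15, "solvent_accessibility": 0.05}
score = sum(np.log(kdes[f](x)[0]) for f, x in new_model_errors.items())
print(f"combined log-density quality signal: {score:.3f}")
```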

  4. Probability of lensing magnification by cosmologically distributed galaxies

    NASA Technical Reports Server (NTRS)

    Pei, Yichuan C.

    1993-01-01

    We present analytical formulae for computing the magnification probability caused by cosmologically distributed galaxies. The galaxies are assumed to be singular, truncated isothermal spheres, with neither evolution nor clustering in redshift. We find that, for a fixed total mass, extended galaxies produce a broader magnification probability distribution and hence are less efficient as gravitational lenses than compact galaxies. The high-magnification tail caused by large galaxies is well approximated by an A^(-3) form, while the tail produced by small galaxies is slightly shallower. The mean magnification as a function of redshift is, however, found to be independent of the size of the lensing galaxies. In terms of flux conservation, our formulae for the isothermal galaxy model predict a mean magnification that agrees to within a few percent with the Dyer-Roeder model of a clumpy universe.

  5. Prediction and typicality in multiverse cosmology

    NASA Astrophysics Data System (ADS)

    Azhar, Feraz

    2014-02-01

    In the absence of a fundamental theory that precisely predicts values for observable parameters, anthropic reasoning attempts to constrain probability distributions over those parameters in order to facilitate the extraction of testable predictions. The utility of this approach has been vigorously debated of late, particularly in light of theories that claim we live in a multiverse, where parameters may take differing values in regions lying outside our observable horizon. Within this cosmological framework, we investigate the efficacy of top-down anthropic reasoning based on the weak anthropic principle. We argue, contrary to recent claims, that it is not clear one can either dispense with notions of typicality altogether or simply presume typicality when comparing the resulting probability distributions with observations. We show, in a concrete top-down setting related to dark matter, that assumptions about typicality can dramatically affect predictions, thereby providing a guide to how errors in reasoning regarding typicality translate into errors in the assessment of predictive power. We conjecture that this dependence on typicality is an integral feature of anthropic reasoning in broader cosmological contexts, and argue in favour of the explicit inclusion of measures of typicality in schemes invoking anthropic reasoning, with a view to extracting predictions from multiverse scenarios.

  6. Stochastic Analysis of Orbital Lifetimes of Spacecraft

    NASA Technical Reports Server (NTRS)

    Sasamoto, Washito; Goodliff, Kandyce; Cornelius, David

    2008-01-01

    A document discusses (1) a Monte-Carlo-based methodology for probabilistic prediction and analysis of orbital lifetimes of spacecraft and (2) Orbital Lifetime Monte Carlo (OLMC), a Fortran computer program consisting of a previously developed long-term orbit propagator integrated with a Monte Carlo engine. OLMC enables modeling of variances of key physical parameters that affect orbital lifetimes through the use of probability distributions. These parameters include altitude, speed, and flight-path angle at insertion into orbit; solar flux; and launch delays. The products of OLMC are predicted lifetimes (durations above specified minimum altitudes) for a user-specified number of cases. Histograms generated from such predictions can be used to determine the probabilities that spacecraft will satisfy lifetime requirements. The document discusses uncertainties that affect modeling of orbital lifetimes. Issues of repeatability, smoothness of distributions, and code run time are considered for the purpose of establishing values of code-specific parameters and the number of Monte Carlo runs. Results from test cases are interpreted as demonstrating that solar-flux predictions are the primary sources of variation in predicted lifetimes; therefore, it is concluded, multiple sets of predictions should be utilized to fully characterize the lifetime range of a spacecraft.
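
    The Monte Carlo skeleton of such a study is straightforward to sketch; the input distributions and the toy lifetime function below are purely illustrative stand-ins for the OLMC propagator.

```python
import numpy as np

rng = np.random.default_rng(4)
n_runs = 10_000

# Illustrative input distributions; real OLMC cases would use mission-specific values.
altitude = rng.normal(400.0, 5.0, n_runs)                  # insertion altitude, km
speed_err = rng.normal(0.0, 2.0, n_runs)                   # insertion speed error, m/s
solar_flux = rng.lognormal(np.log(120.0), 0.25, n_runs)    # F10.7 proxy

def toy_lifetime(alt, dv, flux):
    """Stand-in lifetime model (years); NOT the OLMC propagator."""
    return np.maximum(0.1, (alt - 300.0) / 20.0 + dv / 50.0 - (flux - 120.0) / 60.0)

lifetimes = toy_lifetime(altitude, speed_err, solar_flux)
requirement = 3.0  # years
print(f"P(lifetime >= {requirement} yr) ~ {(lifetimes >= requirement).mean():.3f}")
```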

  7. A Bayesian predictive two-stage design for phase II clinical trials.

    PubMed

    Sambucini, Valeria

    2008-04-15

    In this paper, we propose a Bayesian two-stage design for phase II clinical trials, which represents a predictive version of the single threshold design (STD) recently introduced by Tan and Machin. The STD two-stage sample sizes are determined by specifying a minimum threshold for the posterior probability that the true response rate exceeds a pre-specified target value, and by assuming that the observed response rate is slightly higher than the target. Unlike the STD, we do not refer to a fixed experimental outcome, but take into account the uncertainty about future data. In both stages, the design aims to control the probability of obtaining a large posterior probability that the true response rate exceeds the target value. Such a probability is expressed in terms of prior predictive distributions of the data. The performance of the design is based on the distinction between analysis and design priors, recently introduced in the literature. The properties of the method are studied as the design parameters vary.
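
    The prior predictive machinery can be sketched with a conjugate beta prior on the response rate: for each stage-1 outcome the posterior exceedance probability is checked against the threshold, and qualifying outcomes are weighted by their beta-binomial prior predictive probability. All numbers below (prior, sample size, target, threshold) are invented.

```python
from scipy.stats import beta, betabinom

a, b = 2, 3               # hypothetical design prior on the response rate
n1 = 20                   # stage-1 sample size
target, threshold = 0.20, 0.90

# For each possible stage-1 outcome y, posterior P(p > target | y) under the beta prior.
success = 0.0
for y in range(n1 + 1):
    post_prob = 1 - beta.cdf(target, a + y, b + n1 - y)
    if post_prob >= threshold:
        # Weight by the prior predictive probability of observing y (beta-binomial).
        success += betabinom.pmf(y, n1, a, b)

print(f"prior predictive probability of a 'go' at stage 1: {success:.3f}")
```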

  8. Towards a theoretical determination of the geographical probability distribution of meteoroid impacts on Earth

    NASA Astrophysics Data System (ADS)

    Zuluaga, Jorge I.; Sucerquia, Mario

    2018-06-01

    The Tunguska and Chelyabinsk impact events occurred inside a geographical area covering only 3.4 per cent of the Earth's surface. Although two events hardly constitute a statistically significant demonstration of a geographical pattern of impacts, their spatial coincidence is at least tantalizing. To understand whether this concurrence reflects an underlying geographical and/or temporal pattern, we must aim at predicting the spatio-temporal distribution of meteoroid impacts on Earth. For this purpose we designed, implemented, and tested a novel numerical technique, `Gravitational Ray Tracing' (GRT), to compute the relative impact probability (RIP) on the surface of any planet. GRT is inspired by the ray-casting techniques used to render realistic images of complex 3D scenes. In this paper we describe the method and the results of testing it at the times of large impact events. Our findings suggest a non-trivial pattern of impact probabilities at any given time on the Earth. Locations at 60-90° from the apex are more prone to impacts, especially at midnight. Counterintuitively, sites close to the apex direction have the lowest RIP, while at the antapex the RIP is slightly larger than average. We present preliminary maps of RIP at the times of the Tunguska and Chelyabinsk events and find no evidence of a spatial or temporal pattern, suggesting that their coincidence was fortuitous. We apply the GRT method to compute theoretical RIP at the locations and times of 394 large fireballs. Although the predicted spatio-temporal impact distribution matches the observed events only marginally, we successfully predict their impact speed distribution.

  9. Predictive onboard flow control for packet switching satellites

    NASA Technical Reports Server (NTRS)

    Bobinsky, Eric A.

    1992-01-01

    We outline two alternative approaches to predicting the onset of congestion in a packet switching satellite, and argue that predictive, rather than reactive, flow control is necessary for the efficient operation of such a system. The first method is based on standard statistical techniques which are used to periodically calculate a probability of near-term congestion from arrival rate statistics. If this probability exceeds a preset threshold, the satellite would transmit a rate-reduction signal to all active ground stations. The second method would utilize a neural network to periodically predict the occurrence of buffer overflow based on input data which would include, in addition to arrival rates, the distributions of packet lengths, source addresses, and destination addresses.
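
    The first, statistics-based approach might look like the hedged sketch below: a Gaussian (central-limit) approximation to the probability that arrivals in the next window exceed available capacity. The window counts and capacity are invented, and the actual onboard scheme may differ.

```python
import math

def congestion_probability(recent_counts, capacity):
    """Gaussian (central-limit) approximation to P(arrivals in next window > capacity)."""
    n = len(recent_counts)
    mean = sum(recent_counts) / n
    var = sum((c - mean) ** 2 for c in recent_counts) / max(n - 1, 1)
    if var == 0:
        return 0.0 if mean <= capacity else 1.0
    z = (capacity - mean) / math.sqrt(var)
    return 0.5 * math.erfc(z / math.sqrt(2))   # upper-tail probability beyond capacity

counts = [112, 98, 120, 105, 131, 117]   # hypothetical per-window packet arrivals
print(f"P(near-term congestion) ~ {congestion_probability(counts, capacity=150):.3f}")
```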

  10. Predicted deep-sea coral habitat suitability for the U.S. West coast.

    PubMed

    Guinotte, John M; Davies, Andrew J

    2014-01-01

    Regional scale habitat suitability models provide finer scale resolution and more focused predictions of where organisms may occur. Previous modelling approaches have focused primarily on local and/or global scales, while regional scale models have been relatively few. In this study, regional scale predictive habitat models are presented for deep-sea corals for the U.S. West Coast (California, Oregon and Washington). Model results are intended to aid in future research or mapping efforts, to assess potential coral habitat suitability both within and outside existing bottom trawl closures (i.e. Essential Fish Habitat (EFH)), and to identify suitable habitat within U.S. National Marine Sanctuaries (NMS). Deep-sea coral habitat suitability was modelled at 500 m×500 m spatial resolution using a range of physical, chemical and environmental variables known or thought to influence the distribution of deep-sea corals. Using a spatial partitioning cross-validation approach, maximum entropy models identified slope, temperature, salinity and depth as important predictors for most deep-sea coral taxa. Large areas of highly suitable deep-sea coral habitat were predicted both within and outside of existing bottom trawl closures and NMS boundaries. Predictions of habitat suitability over regional scales are not currently able to identify coral areas with pinpoint accuracy and probably overpredict actual coral distribution due to model limitations and unincorporated variables (i.e. data on the distribution of hard substrate) that are known to limit coral distribution. Predicted habitat results should be used in conjunction with multibeam bathymetry, geological mapping and other tools to guide future research efforts to areas with the highest probability of harboring deep-sea corals. Field validation of predicted habitat is needed to quantify model accuracy, particularly in areas that have not been sampled.

  12. Probability of success for phase III after exploratory biomarker analysis in phase II.

    PubMed

    Götte, Heiko; Kirchner, Marietta; Sailer, Martin Oliver

    2017-05-01

    The probability of success, or average power, describes the potential of a future trial by weighting the power with a probability distribution of the treatment effect. The treatment effect estimate from a previous trial can be used to define such a distribution. During the development of targeted therapies, it is common practice to look for predictive biomarkers. The consequence is that the trial population for phase III is often selected on the basis of the most extreme result from phase II biomarker subgroup analyses. In such a case, there is a tendency to overestimate the treatment effect. We investigate whether the overestimation of the treatment effect estimate from phase II is transformed into a positive bias for the probability of success for phase III. We simulate a phase II/III development program for targeted therapies. This simulation allows us to investigate selection probabilities and to compare the estimated with the true probability of success. We consider the estimated probability of success with and without subgroup selection. Depending on the true treatment effects, there is a negative bias without selection because of the weighting by the phase II distribution. In comparison, selection increases the estimated probability of success. Thus, selection does not lead to a bias in the probability of success if underestimation due to the phase II distribution and overestimation due to selection cancel each other out. We recommend performing similar simulations in practice to obtain the necessary information about the risks and chances associated with such subgroup selection designs. Copyright © 2017 John Wiley & Sons, Ltd.
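
    The basic probability-of-success computation, power averaged over the phase II sampling distribution of the effect, can be sketched as follows; the effect estimate, standard errors, and one-sided test are illustrative assumptions, not the authors' simulation design.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)

# Hypothetical phase II result: estimated effect and its standard error.
delta_hat, se2 = 0.3, 0.15
n3_per_arm, alpha = 200, 0.025
se3 = np.sqrt(2 / n3_per_arm)

# Probability of success: power of the phase III test averaged over the
# phase II sampling distribution of the treatment effect.
draws = rng.normal(delta_hat, se2, 100_000)
power = norm.sf(norm.isf(alpha) - draws / se3)
print(f"probability of success ~ {power.mean():.3f}")
```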

  13. Computing exact bundle compliance control charts via probability generating functions.

    PubMed

    Chen, Binchao; Matis, Timothy; Benneyan, James

    2016-06-01

    Compliance with evidence-based practices, individually and in 'bundles', remains an important focus of healthcare quality improvement for many clinical conditions. The exact probability distribution of composite bundle compliance measures used to develop corresponding control charts and other statistical tests is based on a fairly large convolution whose direct calculation can be computationally prohibitive. Various series expansions and other approximation approaches have been proposed, each with computational and accuracy tradeoffs, especially in the tails. This same probability distribution also arises in other important healthcare applications, such as risk-adjusted outcomes and bed demand prediction, with the same computational difficulties. As an alternative, we use probability generating functions to rapidly obtain exact results and illustrate the improved accuracy and detection over other methods. Numerical testing across a wide range of applications demonstrates the computational efficiency and accuracy of this approach.
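
    The PGF idea is easy to demonstrate: each component-compliance indicator has PGF (1 - p) + p*z, so the exact distribution of the number of compliant components is obtained by multiplying these polynomials (a convolution). The per-component probabilities below are invented.

```python
import numpy as np

# Per-component compliance probabilities for one 'bundle' (hypothetical values).
p = [0.95, 0.90, 0.85, 0.97]

# PGF of each indicator is (1 - p_i) + p_i*z; multiplying polynomials convolves exactly.
pmf = np.array([1.0])
for pi in p:
    pmf = np.convolve(pmf, [1 - pi, pi])

for k, prob in enumerate(pmf):
    print(f"P(exactly {k} of {len(p)} elements compliant) = {prob:.6f}")
print(f"P(full bundle compliance) = {pmf[-1]:.6f}")
```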

  14. Effects of ignition location models on the burn patterns of simulated wildfires

    USGS Publications Warehouse

    Bar-Massada, A.; Syphard, A.D.; Hawbaker, T.J.; Stewart, S.I.; Radeloff, V.C.

    2011-01-01

    Fire simulation studies that use models such as FARSITE often assume that ignition locations are distributed randomly, because spatially explicit information about actual ignition locations is difficult to obtain. However, many studies show that the spatial distribution of ignition locations, whether human-caused or natural, is non-random. Thus, predictions from fire simulations based on random ignitions may be unrealistic. However, the extent to which the assumption of ignition location affects the predictions of fire simulation models has never been systematically explored. Our goal was to assess the difference in fire simulations that are based on random versus non-random ignition location patterns. We conducted four sets of 6000 FARSITE simulations for the Santa Monica Mountains in California to quantify the influence of random and non-random ignition locations and normal and extreme weather conditions on fire size distributions and spatial patterns of burn probability. Under extreme weather conditions, fires were significantly larger for non-random ignitions compared to random ignitions (mean area of 344.5 ha and 230.1 ha, respectively), but burn probability maps were highly correlated (r = 0.83). Under normal weather, random ignitions produced significantly larger fires than non-random ignitions (17.5 ha and 13.3 ha, respectively), and the spatial correlations between burn probability maps were not high (r = 0.54), though the difference in the average burn probability was small. The results of the study suggest that the location of ignitions used in fire simulation models may substantially influence the spatial predictions of fire spread patterns. However, the spatial bias introduced by using a random ignition location model may be minimized if the fire simulations are conducted under extreme weather conditions when fire spread is greatest. © 2010 Elsevier Ltd.

  15. Relationship between radiation-induced aberrations in individual chromosomes and their DNA content: effects of interaction distance

    NASA Technical Reports Server (NTRS)

    Wu, H.; Durante, M.; Lucas, J. N.

    2001-01-01

    PURPOSE: To study the effect of the interaction distance on the frequency of inter- and intrachromosome exchanges in individual chromosomes with respect to their DNA content. Assumptions: Chromosome exchanges are formed by misrejoining of two DNA double-strand breaks (DSB) induced within an interaction distance, d. It is assumed that chromosomes in the G0/G1 phase of the cell cycle occupy a spherical domain in a cell nucleus, with no spatial overlap between individual chromosome domains. RESULTS: Formulae are derived for the probability of formation of inter-, as well as intra-, chromosome exchanges relating to the DNA content of the chromosome for a given interaction distance. For interaction distances <1 μm, the relative frequency of interchromosome exchanges predicted by the present model is similar to that by Cigarran et al. (1998), based on the assumption that the probability of interchromosome exchanges is proportional to the "surface area" of the chromosome territory. The "surface area" assumption is shown to be a limiting case of d → 0 in the present model. The present model also predicts that the probability of intrachromosome exchanges occurring in individual chromosomes is proportional to their DNA content, with correction terms. CONCLUSION: When the interaction distance is small, the "surface area" distribution for chromosome participation in interchromosome exchanges is expected. However, the present model shows that for an interaction distance as large as 1 μm, the predicted probability of interchromosome exchange formation is still close to the surface area distribution. Therefore, this distribution does not necessarily rule out the formation of complex chromosomal aberrations by long-range misrejoining of DSB.

  16. Dual assimilation of satellite soil moisture to improve flood prediction in ungauged catchments

    USDA-ARS?s Scientific Manuscript database

    This paper explores the use of active and passive satellite soil moisture products for improving stream flow prediction within 4 large (>5,000 km²) semi-arid catchments. We use the probability distributed model (PDM) under a data-scarce scenario and aim at correcting two key controlling factors in th...

  17. Neutrino mass priors for cosmology from random matrices

    NASA Astrophysics Data System (ADS)

    Long, Andrew J.; Raveri, Marco; Hu, Wayne; Dodelson, Scott

    2018-02-01

    Cosmological measurements of structure are placing increasingly strong constraints on the sum of the neutrino masses, Σ mν, through Bayesian inference. Because these constraints depend on the choice for the prior probability π (Σ mν), we argue that this prior should be motivated by fundamental physical principles rather than the ad hoc choices that are common in the literature. The first step in this direction is to specify the prior directly at the level of the neutrino mass matrix Mν, since this is the parameter appearing in the Lagrangian of the particle physics theory. Thus by specifying a probability distribution over Mν, and by including the known squared mass splittings, we predict a theoretical probability distribution over Σ mν that we interpret as a Bayesian prior probability π (Σ mν). Assuming a basis-invariant probability distribution on Mν, also known as the anarchy hypothesis, we find that π (Σ mν) peaks close to the smallest Σ mν allowed by the measured mass splittings, roughly 0.06 eV (0.1 eV) for normal (inverted) ordering, due to the phenomenon of eigenvalue repulsion in random matrices. We consider three models for neutrino mass generation: Dirac, Majorana, and Majorana via the seesaw mechanism; differences in the predicted priors π (Σ mν) allow for the possibility of having indications about the physical origin of neutrino masses once sufficient experimental sensitivity is achieved. We present fitting functions for π (Σ mν), which provide a simple means for applying these priors to cosmological constraints on the neutrino masses or marginalizing over their impact on other cosmological parameters.
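
    The eigenvalue-repulsion effect behind the peak of π(Σmν) can be illustrated by sampling a Gaussian ensemble of complex symmetric (Majorana-like) mass matrices; the ensemble, overall mass scale, and the omission of the mass-splitting constraint are simplifying assumptions relative to the paper.

```python
import numpy as np

rng = np.random.default_rng(6)
n_samples, scale = 20_000, 0.03   # eV; the overall scale is an arbitrary choice here

sums = np.empty(n_samples)
for i in range(n_samples):
    z = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
    m = scale * (z + z.T) / 2                           # complex symmetric: Majorana-like matrix
    sums[i] = np.linalg.svd(m, compute_uv=False).sum()  # singular values play the role of masses

hist, edges = np.histogram(sums, bins=50, density=True)
peak = 0.5 * (edges[np.argmax(hist)] + edges[np.argmax(hist) + 1])
print(f"toy prior over the summed masses peaks near {peak:.3f} eV")
```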

  18. Universal inverse power-law distribution for temperature and rainfall in the UK region

    NASA Astrophysics Data System (ADS)

    Selvam, A. M.

    2014-06-01

    Meteorological parameters, such as temperature, rainfall, pressure, etc., exhibit self-similar space-time fractal fluctuations generic to dynamical systems in nature, such as fluid flows, spread of forest fires, earthquakes, etc. The power spectra of fractal fluctuations display an inverse power-law form signifying long-range correlations. A general systems theory model predicts a universal inverse power-law form incorporating the golden mean for the fractal fluctuations. The model-predicted distribution was compared with the observed distribution of fractal fluctuations of all size scales (small, large and extreme values) in the historic month-wise temperature (maximum and minimum) and total rainfall for the four stations Oxford, Armagh, Durham and Stornoway in the UK region, for data periods ranging from 92 years to 160 years. For each parameter, two cumulative probability distributions, namely cmax and cmin, starting from the maximum and minimum data values respectively, were used. The results of the study show that (i) temperature distributions (maximum and minimum) follow the model-predicted distribution, except for the Stornoway minimum temperature cmin; (ii) the rainfall distribution for cmin follows the model-predicted distribution for all four stations; (iii) the rainfall distribution for cmax follows the model-predicted distribution for the two stations Armagh and Stornoway. The present study suggests that fractal fluctuations result from the superimposition of eddy continuum fluctuations.

  19. Estimating probable flaw distributions in PWR steam generator tubes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorman, J.A.; Turner, A.P.L.

    1997-02-01

    This paper describes methods for estimating the number and size distributions of flaws of various types in PWR steam generator tubes. These estimates are needed when calculating the probable primary-to-secondary leakage through steam generator tubes under postulated accidents such as severe core accidents and steam line breaks. The paper describes methods for two types of predictions: (1) the numbers of tubes with detectable flaws of various types as a function of time, and (2) the distributions in size of these flaws. Results are provided for hypothetical severely affected, moderately affected and lightly affected units. Discussion is provided regarding uncertainties and assumptions in the data and analyses.

  20. Predictability in Cellular Automata

    PubMed Central

    Agapie, Alexandru; Andreica, Anca; Chira, Camelia; Giuclea, Marius

    2014-01-01

    Modelled as finite homogeneous Markov chains, probabilistic cellular automata with local transition probabilities in (0, 1) always possess a stationary distribution. This result alone is not very helpful when it comes to predicting the final configuration; one needs also a formula connecting the probabilities in the stationary distribution to some intrinsic feature of the lattice configuration. Previous results on asynchronous cellular automata have shown that such a feature really exists: the number of zero-one borders within the automaton's binary configuration. An exponential formula in the number of zero-one borders has been proved for the 1-D, 2-D and 3-D asynchronous automata with neighbourhood three, five and seven, respectively. We perform computer experiments on a synchronous cellular automaton to check whether the empirical distribution also obeys that theoretical formula. The numerical results indicate a perfect fit for neighbourhood three and five, which opens the way for a rigorous proof of the formula in this new, synchronous case. PMID:25271778
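
    The lattice feature the formula depends on, the number of zero-one borders, is simple to compute; the exponential weight shown below (with an unspecified constant β) is a hedged paraphrase of the cited stationary form, not a derivation.

```python
import numpy as np

def zero_one_borders(config):
    """Count adjacent 0-1 (or 1-0) pairs in a 1-D binary configuration with cyclic boundary."""
    c = np.asarray(config)
    return int(np.sum(c != np.roll(c, 1)))

# Hedged sketch of the claimed stationary form: weight(config) proportional to
# exp(-beta * borders), with beta an automaton-dependent constant (not derived here).
beta = 1.0
for config in ([0, 0, 0, 0], [0, 1, 0, 1], [0, 0, 1, 1]):
    b = zero_one_borders(config)
    print(config, "borders:", b, "relative weight:", np.exp(-beta * b))
```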

  1. Launch Vehicle Debris Models and Crew Vehicle Ascent Abort Risk

    NASA Technical Reports Server (NTRS)

    Gee, Ken; Lawrence, Scott

    2013-01-01

    For manned space launch systems, a reliable abort system is required to reduce the risks associated with a launch vehicle failure during ascent. Understanding the risks associated with failure environments can be achieved through the use of physics-based models of those environments. The debris field produced by destruction of the launch vehicle is one such environment. To better analyze the risk posed by debris, a physics-based model for generating launch vehicle debris catalogs has been developed. The model predicts the mass distribution of the debris field based on formulae developed from analysis of explosions. Imparted velocity distributions are computed using a shock-physics code to model the explosions within the launch vehicle. A comparison of the debris catalog with an existing catalog for the Shuttle external tank shows good agreement in the debris characteristics and the predicted debris strike probability. The model is used to analyze the effects of the number of debris pieces and the velocity distributions on the strike probability and risk.

  2. The 2004 Parkfield, CA Earthquake: A Teachable Moment for Exploring Earthquake Processes, Probability, and Earthquake Prediction

    NASA Astrophysics Data System (ADS)

    Kafka, A.; Barnett, M.; Ebel, J.; Bellegarde, H.; Campbell, L.

    2004-12-01

    The occurrence of the 2004 Parkfield earthquake provided a unique "teachable moment" for students in our science course for teacher education majors. The course uses seismology as a medium for teaching a wide variety of science topics appropriate for future teachers. The 2004 Parkfield earthquake occurred just 15 minutes after our students completed a lab on earthquake processes and earthquake prediction. That lab included a discussion of the Parkfield Earthquake Prediction Experiment as a motivation for the exercises they were working on that day. Furthermore, this earthquake was recorded on an AS1 seismograph right in their lab, just minutes after the students left. About an hour after we recorded the earthquake, the students were able to see their own seismogram of the event in the lecture part of the course, which provided an excellent teachable moment for a lecture/discussion on how the occurrence of the 2004 Parkfield earthquake might affect seismologists' ideas about earthquake prediction. The specific lab exercise that the students were working on just before we recorded this earthquake was a "sliding block" experiment that simulates earthquakes in the classroom. The experimental apparatus includes a flat board on top of which are blocks of wood attached to a bungee cord and a string wrapped around a hand crank. Plate motion is modeled by slowly turning the crank, and earthquakes are modeled as events in which the block slips ("blockquakes"). We scaled the earthquake data and the blockquake data (using how much the string moved as a proxy for time) so that we could compare blockquakes and earthquakes. This provided an opportunity to use interevent-time histograms to teach about earthquake processes, probability, and earthquake prediction, and to compare earthquake sequences with blockquake sequences. We were able to show the students, using data obtained directly from their own lab, how global earthquake data fit a Poisson exponential distribution better than do the blockquake and Parkfield data. This provided opportunities for discussing the difference between Poisson and normal distributions, how those differences affect our estimation of future earthquake probabilities, the importance of both the mean and the standard deviation in predicting future behavior from a sequence of events, and how conditional probability is used to help seismologists predict future earthquakes given a known or theoretical distribution of past earthquakes.
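
    The closing classroom point, that the assumed interevent-time distribution changes conditional probability estimates, can be made concrete with a small comparison; the mean interval, elapsed time, and normal-distribution spread below are invented illustration values.

```python
from scipy.stats import expon, norm

mean_interval, elapsed, horizon = 22.0, 30.0, 10.0   # years; illustrative values only

for name, dist in [("exponential", expon(scale=mean_interval)),
                   ("normal", norm(loc=mean_interval, scale=6.0))]:
    # P(event in the next `horizon` years | `elapsed` years have passed without one).
    # The exponential (Poisson) answer is memoryless; the normal answer grows sharply
    # once the elapsed time exceeds the mean interval.
    p = (dist.cdf(elapsed + horizon) - dist.cdf(elapsed)) / dist.sf(elapsed)
    print(f"{name:11s}: P = {p:.3f}")
```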

  3. The force distribution probability function for simple fluids by density functional theory.

    PubMed

    Rickayzen, G; Heyes, D M

    2013-02-28

    Classical density functional theory (DFT) is used to derive a formula for the probability density distribution function, P(F), and the probability distribution function, W(F), for simple fluids, where F is the net force on a particle. The final formula gives P(F) ∝ exp(−AF²), where A depends on the fluid density, the temperature, and the Fourier transform of the pair potential. The form of the DFT theory used is only applicable to bounded-potential fluids. When combined with the hypernetted chain closure of the Ornstein-Zernike equation, the DFT theory for W(F) agrees with molecular dynamics computer simulations for the Gaussian and bounded soft-sphere fluids at high density. The Gaussian form for P(F) is still accurate at lower densities (but not too low a density) for the two potentials, but with a smaller value for the constant, A, than that predicted by the DFT theory.

  4. Statistical analysis and modeling of intermittent transport events in the tokamak scrape-off layer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Johan, E-mail: anderson.johan@gmail.com; Halpern, Federico D.; Ricci, Paolo

    The turbulence observed in the scrape-off layer of a tokamak is often characterized by intermittent events of a bursty nature, a feature which raises concerns about the prediction of heat loads on the physical boundaries of the device. It thus appears necessary to delve into the statistical properties of turbulent physical fields such as density, electrostatic potential, and temperature, focusing on the mathematical expression of the tails of the probability distribution functions. The method followed here is to generate statistical information from time traces of the plasma density stemming from Braginskii-type fluid simulations and check this against a first-principles theoretical model. The analysis of the numerical simulations indicates that the probability distribution function of the intermittent process contains strong exponential tails, as predicted by the analytical theory.

  5. A study on the predictability of the transition day from the dry to the rainy season over South Korea

    NASA Astrophysics Data System (ADS)

    Lee, Sang-Min; Nam, Ji-Eun; Choi, Hee-Wook; Ha, Jong-Chul; Lee, Yong Hee; Kim, Yeon-Hee; Kang, Hyun-Suk; Cho, ChunHo

    2016-08-01

    This study evaluated the prediction accuracies of THORPEX (THe Observing system Research and Predictability EXperiment) Interactive Grand Global Ensemble (TIGGE) data at six operational forecast centers, using the root-mean-square difference (RMSD) and the Brier score (BS), from April to July 2012. It also tested the precipitation predictability of ensemble prediction systems (EPSs) for the onset of the summer rainy season, which coincided with the withdrawal day of the spring drought over South Korea on 29 June 2012, using ensemble mean precipitation, ensemble probability precipitation, 10-day lag ensemble forecasts (ensemble mean and probability precipitation), and the effective drought index (EDI). The RMSD analysis of atmospheric variables (geopotential height at 500 hPa, temperature at 850 hPa, sea-level pressure and specific humidity at 850 hPa) showed that the prediction accuracies of the EPSs at the Meteorological Service of Canada (CMC) and the China Meteorological Administration (CMA) were poor, while those at the European Centre for Medium-Range Weather Forecasts (ECMWF) and the Korea Meteorological Administration (KMA) were good. ECMWF and KMA also showed better results than the other EPSs for predicting precipitation in the BS distributions. The onset of the summer rainy season could be predicted using ensemble-mean precipitation from a 4-day lead time at all forecast centers. In addition, the spatial distributions of predicted precipitation of the EPSs at KMA and the Met Office of the United Kingdom (UKMO) were similar to those of observed precipitation, showing good predictability. The precipitation probability forecasts of the EPSs at CMA, the National Centers for Environmental Prediction (NCEP), and UKMO (ECMWF and KMA) at 1-day lead time produced over-forecasting (under-forecasting) in the reliability diagram, and all forecasts at 2-4-day lead times showed under-forecasting. The precipitation on the onset day of the summer rainy season could also be predicted from a 4-day lead time to the initial time by using the 10-day lag ensemble mean and probability forecasts. Additionally, the predictability of the withdrawal day of the spring drought, ended by precipitation on the onset day of the summer rainy season, was evaluated using the Effective Drought Index (EDI) calculated from ensemble mean precipitation forecasts and spreads at five EPSs.
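
    The Brier score used for the probabilistic verification is simply the mean squared difference between forecast probabilities and binary outcomes, as in this toy example (invented values):

```python
import numpy as np

forecast_prob = np.array([0.9, 0.2, 0.7, 0.4, 0.8])   # hypothetical rain probabilities
observed = np.array([1, 0, 1, 1, 0])                   # 1 = rain occurred

brier = np.mean((forecast_prob - observed) ** 2)
print(f"Brier score = {brier:.3f} (0 is perfect; lower is better)")
```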

  6. On computational Gestalt detection thresholds.

    PubMed

    Grompone von Gioi, Rafael; Jakubowicz, Jérémie

    2009-01-01

    The aim of this paper is to show some recent developments of computational Gestalt theory, as pioneered by Desolneux, Moisan and Morel. The new results allow much more accurate prediction of the detection thresholds. This step is unavoidable if one wants to analyze visual detection thresholds in the light of computational Gestalt theory. The paper first recalls the main elements of computational Gestalt theory. It points out a precision issue in this theory, essentially due to the use of discrete probability distributions. It then proposes to overcome this issue by using continuous probability distributions, and illustrates the approach on the meaningful-alignment detector of Desolneux et al.
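
    In the meaningful-alignment setting, detection thresholds come from a number-of-false-alarms (NFA) computation built on a binomial tail; the sketch below follows that general recipe with illustrative numbers, and the N^4 test count and precision p = 1/16 are conventional assumptions rather than values taken from this paper.

```python
from scipy.stats import binom

def nfa_alignment(N, n, k, p=1 / 16):
    """Hedged sketch of a meaningful-alignment test (Desolneux et al. style):
    NFA = (number of tested segments, ~N^4 in an N x N image)
          * P(at least k of n points aligned by chance)."""
    return (N ** 4) * binom.sf(k - 1, n, p)

# A segment of 40 sample points with 20 aligned orientations in a 512 x 512 image.
print(f"NFA = {nfa_alignment(512, 40, 20):.3e}  (detected as meaningful if NFA < 1)")
```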

  7. Predicting Rib Fracture Risk With Whole-Body Finite Element Models: Development and Preliminary Evaluation of a Probabilistic Analytical Framework

    PubMed Central

    Forman, Jason L.; Kent, Richard W.; Mroz, Krystoffer; Pipkorn, Bengt; Bostrom, Ola; Segui-Gomez, Maria

    2012-01-01

    This study sought to develop a strain-based probabilistic method to predict rib fracture risk with whole-body finite element (FE) models, and to describe a method to combine the results with collision exposure information to predict injury risk and potential intervention effectiveness in the field. An age-adjusted ultimate strain distribution was used to estimate local rib fracture probabilities within an FE model. These local probabilities were combined to predict injury risk and severity within the whole ribcage. The ultimate strain distribution was developed from a literature dataset of 133 tests. Frontal collision simulations were performed with the THUMS (Total HUman Model for Safety) model with four levels of delta-V and two restraints: a standard 3-point belt and a progressive 3.5–7 kN force-limited, pretensioned (FL+PT) belt. The results of three simulations (29 km/h standard, 48 km/h standard, and 48 km/h FL+PT) were compared to matched cadaver sled tests. The numbers of fractures predicted for the comparison cases were consistent with those observed experimentally. Combining these results with field exposure information (ΔV, NASS-CDS 1992–2002) suggests an 8.9% probability of incurring AIS3+ rib fractures for a 60-year-old restrained by a standard belt in a tow-away frontal collision with this vehicle and occupant configuration, compared to 4.6% for the FL+PT belt. This is the first study to describe a probabilistic framework to predict rib fracture risk based on strains observed in human-body FE models. Using this analytical framework, future efforts may incorporate additional subject or collision factors for multi-variable probabilistic injury prediction. PMID:23169122
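
    Combining local fracture probabilities into a whole-ribcage risk is, under an independence assumption, a Poisson-binomial computation; the per-site probabilities and the "3 or more fractures" severity criterion below are invented illustrations, not the study's actual aggregation.

```python
import numpy as np

# Hypothetical per-location fracture probabilities from strain vs. an ultimate-strain distribution.
p_local = np.array([0.05, 0.12, 0.30, 0.08, 0.22, 0.15])

# Exact distribution of the number of fractures for independent sites (Poisson-binomial),
# built by convolving per-site Bernoulli PMFs; independence is a simplifying assumption.
pmf = np.array([1.0])
for p in p_local:
    pmf = np.convolve(pmf, [1 - p, p])

p_3plus = pmf[3:].sum()
print(f"P(>= 3 fractures), a stand-in severity criterion: {p_3plus:.3f}")
```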

  8. Estimating alarm thresholds and the number of components in mixture distributions

    NASA Astrophysics Data System (ADS)

    Burr, Tom; Hamada, Michael S.

    2012-09-01

    Mixtures of probability distributions arise in many nuclear assay and forensic applications, including nuclear weapon detection, neutron multiplicity counting, and in solution monitoring (SM) for nuclear safeguards. SM data is increasingly used to enhance nuclear safeguards in aqueous reprocessing facilities having plutonium in solution form in many tanks. This paper provides background for mixture probability distributions and then focuses on mixtures arising in SM data. SM data can be analyzed by evaluating transfer-mode residuals defined as tank-to-tank transfer differences, and wait-mode residuals defined as changes during non-transfer modes. A previous paper investigated impacts on transfer-mode and wait-mode residuals of event marking errors which arise when the estimated start and/or stop times of tank events such as transfers are somewhat different from the true start and/or stop times. Event marking errors contribute to non-Gaussian behavior and larger variation than predicted on the basis of individual tank calibration studies. This paper illustrates evidence for mixture probability distributions arising from such event marking errors and from effects such as condensation or evaporation during non-transfer modes, and pump carryover during transfer modes. A quantitative assessment of the sample size required to adequately characterize a mixture probability distribution arising in any context is included.
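
    A two-component Gaussian mixture fitted by EM is the standard first tool for data like these; the sketch below hand-rolls EM on synthetic "residuals" (all parameters invented) rather than reproducing the paper's analysis.

```python
import numpy as np

rng = np.random.default_rng(7)
# Synthetic "wait-mode residuals": a main Gaussian plus a small contaminating component,
# mimicking event-marking errors (all parameters invented).
x = np.concatenate([rng.normal(0.0, 1.0, 900), rng.normal(3.0, 0.5, 100)])

# Plain two-component EM; initial guesses are arbitrary.
w, mu, sd = np.array([0.5, 0.5]), np.array([-1.0, 1.0]), np.array([1.0, 1.0])
for _ in range(200):
    # E-step: responsibilities of each component for each point.
    dens = w * np.exp(-0.5 * ((x[:, None] - mu) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
    r = dens / dens.sum(axis=1, keepdims=True)
    # M-step: update weights, means, and standard deviations.
    nk = r.sum(axis=0)
    w, mu = nk / len(x), (r * x[:, None]).sum(axis=0) / nk
    sd = np.sqrt((r * (x[:, None] - mu) ** 2).sum(axis=0) / nk)

print("weights:", np.round(w, 3), "means:", np.round(mu, 3), "sds:", np.round(sd, 3))
```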

  9. Forecasting of the selected features of Poaceae (R. Br.) Barnh., Artemisia L. and Ambrosia L. pollen season in Szczecin, north-western Poland, using Gumbel's distribution.

    PubMed

    Puc, Małgorzata; Wolski, Tomasz

    2013-01-01

    The allergenic pollen content of the atmosphere varies according to climate, biogeography and vegetation. Minimisation of pollen allergy symptoms is related to the possibility of avoiding large doses of the allergen. Measurements performed in Szczecin over a period of 13 years (2000-2012 inclusive) permitted prediction of theoretical maximum concentrations of pollen grains and their probability for the pollen seasons of Poaceae, Artemisia and Ambrosia. Moreover, the probabilities were determined of a given date being the beginning of the pollen season, the date of the maximum pollen count, the Seasonal Pollen Index value and the number of days with pollen counts above threshold values. Aerobiological monitoring was conducted using a Hirst volumetric trap (Lanzoni VPPS). Linear trends with coefficients of determination (R²) were calculated. A model for long-term forecasting was constructed using a method based on Gumbel's distribution. A statistically significant negative correlation was determined between the duration of the pollen seasons of Poaceae and Artemisia and the Seasonal Pollen Index value. Seasonal total pollen counts of Artemisia and Ambrosia showed a strong and statistically significant decreasing tendency. On the basis of Gumbel's distribution, a model was proposed for Szczecin allowing prediction of the probabilities of the maximum pollen count values that can appear once in, e.g., 5, 10 or 100 years. Short pollen seasons are characterised by a higher intensity of pollination than long ones. Prediction of the maximum pollen count values, the dates of the pollen season beginning, and the number of days with pollen counts above the threshold, on the basis of Gumbel's distribution, is expected to lead to improvement in the prophylaxis and therapy of persons allergic to pollen.
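
    Return-level prediction from Gumbel's distribution can be sketched directly with scipy; the synthetic annual maxima below stand in for the 13 observed seasons, and the location/scale values are invented.

```python
import numpy as np
from scipy.stats import gumbel_r

rng = np.random.default_rng(8)
# Stand-in for 13 observed annual maximum daily pollen concentrations (grains/m^3).
annual_max = rng.gumbel(loc=120.0, scale=35.0, size=13)

loc, scale = gumbel_r.fit(annual_max)
for years in (5, 10, 100):
    # Return level: the maximum expected to be exceeded once in `years` years on average.
    level = gumbel_r.isf(1.0 / years, loc, scale)
    print(f"1-in-{years}-year maximum pollen count ~ {level:.0f} grains/m^3")
```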

  10. Assessing the potential for improving S2S forecast skill through multimodel ensembling

    NASA Astrophysics Data System (ADS)

    Vigaud, N.; Robertson, A. W.; Tippett, M. K.; Wang, L.; Bell, M. J.

    2016-12-01

    Non-linear logistic regression is well suited to probability forecasting and has been successfully applied in the past to ensemble weather and climate predictions, providing access to the full probability distribution without any Gaussian assumption. However, little work has been done at sub-monthly lead times, where relatively small re-forecast ensemble sizes and record lengths represent new challenges for which post-processing avenues have yet to be investigated. A promising approach consists of extending the definition of non-linear logistic regression by including the quantile of the forecast distribution as one of the predictors. So-called Extended Logistic Regression (ELR), which enables mutually consistent individual threshold probabilities, is here applied to ECMWF, CFSv2 and CMA re-forecasts from the S2S database in order to produce rainfall probabilities at weekly resolution. The ELR model is trained on seasonally varying tercile categories computed for lead times of 1 to 4 weeks. It is then tested in a cross-validated manner, i.e. allowing real-time predictability applications, to produce rainfall tercile probabilities from individual weekly hindcasts that are finally combined by equal pooling. Results are discussed over a broader North American region, where individual and MME forecasts generated out to 4 weeks lead time are characterized by good probabilistic reliability but low sharpness, exhibiting systematically more skill in winter than in summer.
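
    An ELR sketch, under simplifying assumptions: the predictand is P(obs ≤ q), modeled as a logistic function of the ensemble mean and a transform of the threshold q (a square root is a common choice), fitted by maximum likelihood across tercile thresholds on synthetic data.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(9)
n = 400
ens_mean = rng.gamma(2.0, 5.0, n)                        # hypothetical ensemble-mean weekly rain
obs = rng.gamma(2.0, 5.0, n) * (0.5 + ens_mean / (2 * ens_mean.mean()))
thresholds = np.quantile(obs, [1 / 3, 2 / 3])            # tercile boundaries

def nll(params):
    a, b, c = params
    total = 0.0
    for q in thresholds:
        # ELR: P(obs <= q | x) = logistic(a + b*x + c*sqrt(q)); a single equation
        # across thresholds keeps the threshold probabilities mutually consistent.
        eta = a + b * ens_mean + c * np.sqrt(q)
        p = np.clip(1.0 / (1.0 + np.exp(-eta)), 1e-12, 1 - 1e-12)
        y = obs <= q
        total -= np.sum(np.where(y, np.log(p), np.log1p(-p)))
    return total

fit = minimize(nll, x0=[0.0, -0.1, 0.5], method="Nelder-Mead")
print("fitted (a, b, c):", np.round(fit.x, 3))
```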

  11. A superstatistical model of metastasis and cancer survival

    NASA Astrophysics Data System (ADS)

    Leon Chen, L.; Beck, Christian

    2008-05-01

    We introduce a superstatistical model for the progression statistics of malignant cancer cells. The metastatic cascade is modeled as a complex nonequilibrium system with several macroscopic pathways and inverse-chi-square distributed parameters of the underlying Poisson processes. The predictions of the model are in excellent agreement with observed survival-time probability distributions of breast cancer patients.
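
    The superstatistical construction, an exponential survival law averaged over a distribution of rate parameters, can be sketched numerically; a scaled inverse chi-square is an inverse-gamma distribution, and the shape/scale values below are illustrative, not fitted to patient data.

```python
import numpy as np
from scipy import integrate, stats

# The rate parameter lam of the exponential survival law is itself random; a scaled
# inverse chi-square is an inverse-gamma, used here with illustrative shape/scale.
lam_dist = stats.invgamma(a=3.0, scale=2.0)

def survival(t):
    # Marginal (superstatistical) survival: S(t) = E_lam[ exp(-lam * t) ].
    val, _ = integrate.quad(lambda lam: np.exp(-lam * t) * lam_dist.pdf(lam), 0, np.inf)
    return val

for t in (0.5, 1.0, 2.0, 5.0):
    print(f"S({t}) = {survival(t):.3f}")
```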

  12. Modelling the distributions and spatial coincidence of bluetongue vectors Culicoides imicola and the Culicoides obsoletus group throughout the Iberian peninsula.

    PubMed

    Calvete, C; Estrada, R; Miranda, M A; Borrás, D; Calvo, J H; Lucientes, J

    2008-06-01

    Data obtained by a Spanish national surveillance programme in 2005 were used to develop climatic models for predicting the distribution of the bluetongue virus (BTV) vectors Culicoides imicola Kieffer (Diptera: Ceratopogonidae) and the Culicoides obsoletus group Meigen throughout the Iberian peninsula. Models were generated using logistic regression to predict the probability of species occurrence at an 8-km spatial resolution. Predictor variables included the annual mean values and seasonalities of a remotely sensed normalized difference vegetation index (NDVI), a sun index, interpolated precipitation and temperature. Using an information-theoretic paradigm based on Akaike's criterion, a set of best models accounting for 95% of model selection certainty was selected and used to generate an average predictive model for each vector. The predictive performances (i.e. the discrimination capacity and calibration) of the average models were evaluated by both internal and external validation. External validation was achieved by comparing average model predictions with surveillance programme data obtained in 2004 and 2006. The discriminatory capacity of both models was found to be reasonably high. The estimated areas under the receiver operating characteristic (ROC) curve (AUC) were 0.78 and 0.70 for the C. imicola and C. obsoletus group models, respectively, in external validation, and 0.81 and 0.75, respectively, in internal validation. The predictions of both models were in close agreement with the observed distribution patterns of both vectors. Both models, however, showed a systematic bias in their predicted probability of occurrence: observed occurrence was systematically overestimated for C. imicola and underestimated for the C. obsoletus group. The average models were used to determine the areas of spatial coincidence of the two vectors. Although their spatial distributions were highly complementary, areas of spatial coincidence were identified, mainly in Portugal and in the southwest of peninsular Spain. In a hypothetical scenario in which both Culicoides members had similar vectorial capacity for a BTV strain, these areas should be considered of special epidemiological concern, because any epizootic event could be intensified by consecutive vector activity sustained by both species throughout the year; consequently, the probability of BTV spreading to the remaining areas occupied by both vectors might also be higher.
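
    The Akaike-weight bookkeeping behind a "95% model selection certainty" set is short to sketch; the AIC values below are invented.

```python
import numpy as np

# Akaike weights for a candidate model set (AIC values are invented).
aic = np.array([1012.4, 1013.1, 1015.8, 1021.3])
delta = aic - aic.min()
w = np.exp(-0.5 * delta) / np.exp(-0.5 * delta).sum()

# Smallest set of models whose cumulative weight reaches 95% "model selection certainty".
order = np.argsort(-w)
cum = np.cumsum(w[order])
best_set = order[: np.searchsorted(cum, 0.95) + 1]
print("Akaike weights:", np.round(w, 3), "| 95% confidence set:", best_set)
```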

  13. Effects of life-history requirements on the distribution of a threatened reptile.

    PubMed

    Thompson, Denise M; Ligon, Day B; Patton, Jason C; Papeş, Monica

    2017-04-01

    Survival and reproduction are the two primary life-history traits essential for species' persistence; however, the environmental conditions that support each of these traits may not be the same. Despite this, reproductive requirements are seldom considered when estimating species' potential distributions. We sought to examine potentially limiting environmental factors influencing the distribution of an oviparous reptile of conservation concern with respect to the species' survival and reproduction, and to assess the implications of the species' predicted climatic constraints for current conservation practices. We used ecological niche modeling to predict the probability of environmental suitability for the alligator snapping turtle (Macrochelys temminckii). We built an annual climate model to examine survival and a nesting climate model to examine reproduction. We combined incubation temperature requirements, products of modeled soil temperature data, and our estimated distributions to determine whether embryonic development constrained the northern distribution of the species. Low annual precipitation constrained the western distribution of alligator snapping turtles, whereas the northern distribution was constrained by thermal requirements during embryonic development. Only a portion of the geographic range predicted to have a high probability of suitability for alligator snapping turtle survival was estimated to be capable of supporting successful embryonic development. Historic occurrence records suggest adult alligator snapping turtles can survive in regions with colder climes than those associated with consistent and successful production of offspring. Estimated egg-incubation requirements indicated that current reintroductions at the northern edge of the species' range are within reproductively viable environmental conditions. Our results highlight the importance of considering both survival and reproduction when estimating species' ecological niches, the implications for conservation plans, and the benefits of incorporating physiological data when evaluating species' distributions. © 2016 Society for Conservation Biology.

  14. Modeling late rectal toxicities based on a parameterized representation of the 3D dose distribution

    NASA Astrophysics Data System (ADS)

    Buettner, Florian; Gulliford, Sarah L.; Webb, Steve; Partridge, Mike

    2011-04-01

    Many models exist for predicting toxicities based on dose-volume histograms (DVHs) or dose-surface histograms (DSHs). This approach has several drawbacks: first, the reduction of the dose distribution to a histogram results in the loss of spatial information; second, the bins of the histograms are highly correlated with each other. Furthermore, some of the complex nonlinear models proposed in the past lack a direct physical interpretation and the ability to predict probabilities rather than binary outcomes. We propose a parameterized representation of the 3D distribution of the dose to the rectal wall which explicitly includes geometrical information in the form of the eccentricity of the dose distribution as well as its lateral and longitudinal extent. We use a nonlinear kernel-based probabilistic model to predict late rectal toxicity based on the parameterized dose distribution and assess its predictive power using data from the MRC RT01 trial (ISCTRN 47772397). The endpoints under consideration were rectal bleeding, loose stools, and a global toxicity score. We extract simple rules identifying 3D dose patterns related to a specifically low risk of complication. Normal tissue complication probability (NTCP) models based on parameterized representations of geometrical and volumetric measures resulted in areas under the curve (AUCs) of 0.66, 0.63 and 0.67 for predicting rectal bleeding, loose stools and global toxicity, respectively. In comparison, NTCP models based on standard DVHs performed worse and resulted in AUCs of 0.59 for all three endpoints. In conclusion, we have presented low-dimensional, interpretable and nonlinear NTCP models based on the parameterized representation of the dose to the rectal wall. These models had a higher predictive power than models based on standard DVHs, and their low dimensionality allowed for the identification of 3D dose patterns related to a low risk of complication.

  15. Prediction of Carbohydrate Binding Sites on Protein Surfaces with 3-Dimensional Probability Density Distributions of Interacting Atoms

    PubMed Central

    Tsai, Keng-Chang; Jian, Jhih-Wei; Yang, Ei-Wen; Hsu, Po-Chiang; Peng, Hung-Pin; Chen, Ching-Tai; Chen, Jun-Bo; Chang, Jeng-Yih; Hsu, Wen-Lian; Yang, An-Suei

    2012-01-01

    Non-covalent protein-carbohydrate interactions mediate molecular targeting in many biological processes. Prediction of non-covalent carbohydrate binding sites on protein surfaces not only provides insights into the functions of the query proteins; information on key carbohydrate-binding residues could also suggest site-directed mutagenesis experiments, help design therapeutics targeting carbohydrate-binding proteins, and provide guidance in engineering protein-carbohydrate interactions. In this work, we show that non-covalent carbohydrate binding sites on protein surfaces can be predicted with relatively high accuracy when the query protein structures are known. The prediction capabilities were based on a novel encoding scheme of the three-dimensional probability density maps describing the distributions of 36 non-covalent interacting atom types around protein surfaces. One machine learning model was trained for each of the 30 protein atom types. The machine learning algorithms predicted tentative carbohydrate binding sites on query proteins by recognizing the characteristic interacting atom distribution patterns specific to carbohydrate binding sites in known protein structures. The prediction results for all protein atom types were integrated into surface patches as tentative carbohydrate binding sites based on a normalized prediction confidence level. The prediction capabilities of the predictors were benchmarked by a 10-fold cross validation on 497 non-redundant proteins with known carbohydrate binding sites. The predictors were further tested on an independent test set with 108 proteins. The residue-based Matthews correlation coefficient (MCC) for the independent test was 0.45, with prediction precision and sensitivity (or recall) of 0.45 and 0.49, respectively. In addition, 111 unbound carbohydrate-binding protein structures, for which the structures were determined in the absence of the carbohydrate ligands, were predicted with the trained predictors. The overall prediction MCC was 0.49. Independent tests on anti-carbohydrate antibodies showed that the carbohydrate antigen binding sites were predicted with comparable accuracy. These results demonstrate that the predictors are among the best in carbohydrate binding site predictions to date. PMID:22848404
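
    The residue-based MCC reported here is computed from the confusion matrix in the usual way; the counts below are invented to be roughly consistent with the quoted precision (0.45) and recall (0.49).

```python
import math

def mcc(tp, fp, tn, fn):
    """Matthews correlation coefficient from a residue-level confusion matrix."""
    num = tp * tn - fp * fn
    den = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    return num / den if den else 0.0

# Invented counts roughly consistent with precision 0.45 and recall 0.49.
print(f"MCC = {mcc(tp=490, fp=600, tn=9000, fn=510):.2f}")
```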

  16. Neutrino mass priors for cosmology from random matrices

    DOE PAGES

    Long, Andrew J.; Raveri, Marco; Hu, Wayne; ...

    2018-02-13

    Cosmological measurements of structure are placing increasingly strong constraints on the sum of the neutrino masses, Σm_ν, through Bayesian inference. Because these constraints depend on the choice for the prior probability π(Σm_ν), we argue that this prior should be motivated by fundamental physical principles rather than the ad hoc choices that are common in the literature. The first step in this direction is to specify the prior directly at the level of the neutrino mass matrix M_ν, since this is the parameter appearing in the Lagrangian of the particle physics theory. Thus by specifying a probability distribution over M_ν, and by including the known squared mass splittings, we predict a theoretical probability distribution over Σm_ν that we interpret as a Bayesian prior probability π(Σm_ν). Assuming a basis-invariant probability distribution on M_ν, also known as the anarchy hypothesis, we find that π(Σm_ν) peaks close to the smallest Σm_ν allowed by the measured mass splittings, roughly 0.06 eV (0.1 eV) for normal (inverted) ordering, due to the phenomenon of eigenvalue repulsion in random matrices. We consider three models for neutrino mass generation: Dirac, Majorana, and Majorana via the seesaw mechanism; differences in the predicted priors π(Σm_ν) allow for the possibility of having indications about the physical origin of neutrino masses once sufficient experimental sensitivity is achieved. In conclusion, we present fitting functions for π(Σm_ν), which provide a simple means for applying these priors to cosmological constraints on the neutrino masses or marginalizing over their impact on other cosmological parameters.
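
    A minimal Monte Carlo sketch of the anarchy-hypothesis construction, under simplifying assumptions: a real symmetric Gaussian ensemble stands in for the basis-invariant measure, eigenvalue magnitudes stand in for masses, and the conditioning on measured mass splittings is omitted. The mass scale is illustrative.

    ```python
    # Hedged sketch: random-matrix prior on the neutrino mass sum. The paper
    # treats Dirac/Majorana/seesaw ensembles separately; a real symmetric
    # ensemble is used here only as a simple stand-in.
    import numpy as np

    rng = np.random.default_rng(1)
    n_samples, scale = 20_000, 0.03  # mass scale in eV, illustrative

    sums = np.empty(n_samples)
    for i in range(n_samples):
        a = rng.normal(scale=scale, size=(3, 3))
        m = (a + a.T) / 2.0                            # random symmetric matrix
        sums[i] = np.abs(np.linalg.eigvalsh(m)).sum()  # sum of mass eigenvalues

    # Eigenvalue repulsion disfavors degenerate spectra, so the implied prior
    # pi(sum m) piles up toward small mass sums.
    hist, edges = np.histogram(sums, bins=50)
    print("mode of sum(m) ~", round(edges[hist.argmax()], 3), "eV")
    ```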

  18. Anthropic prediction for a large multi-jump landscape

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schwartz-Perlov, Delia, E-mail: delia@perlov.com

    2008-10-15

    The assumption of a flat prior distribution plays a critical role in the anthropic prediction of the cosmological constant. In a previous paper we analytically calculated the distribution for the cosmological constant, including the prior and anthropic selection effects, in a large toy 'single-jump' landscape model. We showed that it is possible for the fractal prior distribution that we found to behave as an effectively flat distribution in a wide class of landscapes, but only if the single-jump size is large enough. We extend this work here by investigating a large (N ~ 10^500) toy 'multi-jump' landscape model. The jump sizes range over three orders of magnitude and an overall free parameter c determines the absolute size of the jumps. We will show that for 'large' c the distribution of probabilities of vacua in the anthropic range is effectively flat, and thus the successful anthropic prediction is validated. However, we argue that for small c, the distribution may not be smooth.

  19. Spectra of conditionalization and typicality in the multiverse

    NASA Astrophysics Data System (ADS)

    Azhar, Feraz

    2016-02-01

    An approach to testing theories describing a multiverse that has gained interest of late involves comparing theory-generated probability distributions over observables with their experimentally measured values. It is likely that such distributions, were we indeed able to calculate them unambiguously, will assign low probabilities to any such experimental measurements. An alternative to thereby rejecting these theories is to conditionalize the distributions involved by restricting attention to domains of the multiverse in which we might arise. In order to elicit a crisp prediction, however, one needs to make a further assumption about how typical we are of the chosen domains. In this paper, we investigate interactions between the spectra of available assumptions regarding both conditionalization and typicality, and draw out the effects of these interactions in a concrete setting; namely, on predictions of the total number of species that contribute significantly to dark matter. In particular, for each conditionalization scheme studied, we analyze how correlations between densities of different dark matter species affect the prediction, and explicate the effects of assumptions regarding typicality. We find that the effects of correlations can depend on the conditionalization scheme, and that in each case atypicality can significantly change the prediction. In doing so, we demonstrate the existence of overlaps in the predictions of different "frameworks" consisting of conjunctions of theory, conditionalization scheme and typicality assumption. This conclusion highlights the acute challenges involved in using such tests to identify a preferred framework that aims to describe our observational situation in a multiverse.

  20. Sensitivity Analysis of the Bone Fracture Risk Model

    NASA Technical Reports Server (NTRS)

    Lewandowski, Beth; Myers, Jerry; Sibonga, Jean Diane

    2017-01-01

    Introduction: The probability of bone fracture during and after spaceflight is quantified to aid in mission planning, to determine required astronaut fitness standards and training requirements, and to inform countermeasure research and design. Probability is quantified with a probabilistic modeling approach where distributions of model parameter values, instead of single deterministic values, capture the parameter variability within the astronaut population, and fracture predictions are probability distributions with a mean value and an associated uncertainty. Because of this uncertainty, the model in its current state cannot discern an effect of countermeasures on fracture probability, for example between use and non-use of bisphosphonates or between spaceflight exercise performed with the Advanced Resistive Exercise Device (ARED) or on devices prior to installation of ARED on the International Space Station. This is thought to be due to the inability to measure key contributors to bone strength, for example geometry and volumetric distributions of bone mass, with areal bone mineral density (BMD) measurement techniques. To further the applicability of the model, we performed a parameter sensitivity study aimed at identifying the parameter uncertainties that most affect the model forecasts, in order to determine which areas of the model need enhancement to reduce uncertainty. Methods: The bone fracture risk model (BFxRM), originally published by Nelson et al., is a probabilistic model that can assess the risk of astronaut bone fracture. This is accomplished by utilizing biomechanical models to assess the applied loads; utilizing models of spaceflight BMD loss in at-risk skeletal locations; quantifying bone strength through a relationship between areal BMD and bone failure load; and relating fracture risk index (FRI), the ratio of applied load to bone strength, to fracture probability. There are many factors associated with these calculations, including environmental factors, factors associated with the fall event, mass and anthropometric values of the astronaut, BMD characteristics, characteristics of the relationship between BMD and bone strength, and bone fracture characteristics. The uncertainty in these factors is captured through the use of parameter distributions, and the fracture predictions are probability distributions with a mean value and an associated uncertainty. To determine parameter sensitivity, a correlation coefficient is found between the sample set of each model parameter and the calculated fracture probabilities. Each parameter's contribution to the variance is found by squaring the correlation coefficients, dividing by the sum of the squared correlation coefficients, and multiplying by 100. Results: Sensitivity analyses of BFxRM simulations of preflight, 0 days post-flight and 365 days post-flight falls onto the hip revealed a subset of the twelve factors within the model which cause the most variation in the fracture predictions. These factors include the spring constant used in the hip biomechanical model, the midpoint FRI parameter within the equation used to convert FRI to fracture probability, and preflight BMD values. Future work: Plans are underway to update the BFxRM by incorporating bone strength information from finite element models (FEM) into the bone strength portion of the BFxRM. Also, FEM bone strength information along with fracture outcome data will be incorporated into the FRI-to-fracture-probability relationship.
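
    The sensitivity calculation described in the Methods can be sketched directly: correlate each parameter's samples with the output, square, and normalize. The toy model and parameter values below are illustrative stand-ins, not the BFxRM itself.

    ```python
    # Hedged sketch: percent contribution to variance via squared correlations.
    import numpy as np

    rng = np.random.default_rng(2)
    n = 10_000
    params = {
        "spring_constant": rng.normal(1.0, 0.2, n),
        "midpoint_FRI": rng.normal(1.2, 0.1, n),
        "preflight_BMD": rng.normal(0.9, 0.05, n),
    }
    # Toy fracture-probability output driven by the sampled parameters.
    p_fx = (0.5 * params["spring_constant"]
            + 1.5 * params["midpoint_FRI"]
            + 0.8 * params["preflight_BMD"]
            + rng.normal(0.0, 0.1, n))

    # Square each parameter-output correlation, normalize, express as percent.
    r2 = {k: np.corrcoef(v, p_fx)[0, 1] ** 2 for k, v in params.items()}
    total = sum(r2.values())
    for k, v in r2.items():
        print(f"{k}: {100.0 * v / total:.1f}% of variance")
    ```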

  1. Expected Utility Distributions for Flexible, Contingent Execution

    NASA Technical Reports Server (NTRS)

    Bresina, John L.; Washington, Richard

    2000-01-01

    This paper presents a method for using expected utility distributions in the execution of flexible, contingent plans. A utility distribution maps the possible start times of an action to the expected utility of the plan suffix starting with that action. The contingent plan encodes a tree of possible courses of action and includes flexible temporal constraints and resource constraints. When execution reaches a branch point, the eligible option with the highest expected utility at that point in time is selected. The utility distributions make this selection sensitive to the runtime context, yet still efficient. Our approach uses predictions of action duration uncertainty as well as expectations of resource usage and availability to determine when an action can execute and with what probability. Execution windows and probabilities inevitably change as execution proceeds, but such changes do not invalidate the cached utility distributions; thus, dynamic updating of utility information is minimized.
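
    A minimal sketch of the branch-point selection rule, assuming a piecewise-linear encoding of the cached utility distributions; the paper's actual representation may differ, and the option names are invented.

    ```python
    # Hedged sketch: select the branch option with the highest expected utility
    # at the current time, using cached (start time -> utility) mappings.
    import numpy as np

    def expected_utility(times, utils, t):
        """Interpolate a cached utility distribution at start time t."""
        return np.interp(t, times, utils)

    # Cached utility distributions: start time (s) -> expected suffix utility.
    options = {
        "drive_to_rock": (np.array([0., 60., 120.]), np.array([10., 8., 2.])),
        "recharge":      (np.array([0., 60., 120.]), np.array([6., 6., 5.])),
    }

    t_now = 90.0
    best = max(options, key=lambda o: expected_utility(*options[o], t_now))
    print("selected option:", best)  # utilities shift with time, not the cache
    ```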

  2. Does prediction error drive one-shot declarative learning?

    PubMed

    Greve, Andrea; Cooper, Elisa; Kaula, Alexander; Anderson, Michael C; Henson, Richard

    2017-06-01

    The role of prediction error (PE) in driving learning is well-established in fields such as classical and instrumental conditioning, reward learning and procedural memory; however, its role in human one-shot declarative encoding is less clear. According to one recent hypothesis, PE reflects the divergence between two probability distributions: one reflecting the prior probability (from previous experiences) and the other reflecting the sensory evidence (from the current experience). Assuming unimodal probability distributions, PE can be manipulated in three ways: (1) the distance between the mode of the prior and evidence, (2) the precision of the prior, and (3) the precision of the evidence. We tested these three manipulations across five experiments, in terms of people's ability to encode a single presentation of a scene-item pairing as a function of previous exposures to that scene and/or item. Memory was probed by presenting the scene together with three choices for the previously paired item, in which the two foil items were from other pairings within the same condition as the target item. In Experiment 1, we manipulated the evidence to be either consistent or inconsistent with prior expectations, predicting PE to be larger, and hence memory better, when the new pairing was inconsistent. In Experiments 2a-c, we manipulated the precision of the priors, predicting better memory for a new pairing when the (inconsistent) priors were more precise. In Experiment 3, we manipulated both visual noise and prior exposure for unfamiliar faces, before pairing them with scenes, predicting better memory when the sensory evidence was more precise. In all experiments, the PE hypotheses were supported. We discuss alternative explanations of individual experiments, and conclude the Predictive Interactive Multiple Memory Signals (PIMMS) framework provides the most parsimonious account of the full pattern of results.
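
    The divergence-based notion of PE can be made concrete with Gaussians. A minimal sketch, assuming a Gaussian prior and Gaussian evidence and using KL divergence as the measure; both are illustrative choices, not necessarily the paper's:

    ```python
    # Hedged sketch: PE as divergence between prior and evidence, showing the
    # three manipulations listed above (mode distance, prior precision,
    # evidence precision).
    import math

    def kl_gauss(mu_p, var_p, mu_q, var_q):
        """KL(prior || evidence) for two univariate Gaussians."""
        return 0.5 * (math.log(var_q / var_p)
                      + (var_p + (mu_p - mu_q) ** 2) / var_q - 1.0)

    base = kl_gauss(0.0, 1.0, 1.0, 1.0)
    far_mode = kl_gauss(0.0, 1.0, 3.0, 1.0)        # (1) larger mode distance
    precise_prior = kl_gauss(0.0, 0.25, 1.0, 1.0)  # (2) more precise prior
    precise_evid = kl_gauss(0.0, 1.0, 1.0, 0.25)   # (3) more precise evidence
    print(base, far_mode, precise_prior, precise_evid)
    # Each manipulation increases the divergence, i.e. the prediction error.
    ```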

  3. Novel Data on the Ecology of Cochranella mache (Anura: Centrolenidae) and the Importance of Protected Areas for This Critically Endangered Glassfrog in the Neotropics

    PubMed Central

    Ortega-Andrade, H. Mauricio; Rojas-Soto, Octavio; Paucar, Christian

    2013-01-01

    We studied a population of the endangered glassfrog Cochranella mache at Bilsa Biological Station, northwestern Ecuador, from 2008 to 2009. We present information on annual abundance patterns, behavioral ecology, habitat use and a species distribution model built with MaxEnt. We evaluate the importance of the National System of Protected Areas (SNAP) in Colombia and Ecuador under scenarios of climate change and habitat loss. We predicted a restricted environmental suitability area of 48,509 km2 to 65,147 km2 along western Ecuador and adjacent Colombia; ∼8% of the potential distribution occurs within SNAP. We examined four aspects of C. mache ecology: (1) ecological data suggest a strong correlation between relative abundance and rainfall, with a high probability of observing frogs through the rainy months (February–May); (2) habitat use and the species distribution model suggest that this canopy dweller is restricted to small streams and rivulets in primary and old secondary forest in evergreen lowland and piedmont forest of western Ecuador, with predicted suitability areas in adjacent southern Colombia; (3) the SNAP of Colombia and Ecuador harbors a minimal portion of the predicted distribution (<10%); and (4) synergistic effects of habitat loss and climate change reduce the suitability areas for this endangered frog by about 95% along its distributional range in Protected Areas. The resulting model allows the recognition of areas in which to undertake conservation efforts and plan future field surveys, as well as forecasting regions with high probability of C. mache occurrence in western Ecuador and southern Colombia. Further research is required to assess population tendencies and habitat fragmentation, and to target survey zones to accelerate the discovery of unknown populations in unexplored areas with high suitability. We recommend that Cochranella mache be re-categorized as a “Critically Endangered” species in national and global status, according to criteria and sub-criteria A4, B1ab(i,ii,iii,iv),E. PMID:24339973

  4. Beaked Whale Habitat Characterization and Prediction

    DTIC Science & Technology

    2005-09-30

    trying to develop a better understanding of beaked whale distribution. For long-range planning, the static habitat prediction maps provide a broad... whale presence ranged from 79.3% to 100.0% for the static models and 85.7% to 94.5% for the dynamic models. Beaked whale habitat prediction has been...submerged for such long periods of time that there is a high probability that they will never surface within the visual range of observers aboard a

  5. Using type IV Pearson distribution to calculate the probabilities of underrun and overrun of lists of multiple cases.

    PubMed

    Wang, Jihan; Yang, Kai

    2014-07-01

    An efficient operating room needs both little underutilised and overutilised time to achieve optimal cost efficiency. The probabilities of underrun and overrun of lists of cases can be estimated by a well defined duration distribution of the lists. To propose a method of predicting the probabilities of underrun and overrun of lists of cases using Type IV Pearson distribution to support case scheduling. Six years of data were collected. The first 5 years of data were used to fit distributions and estimate parameters. The data from the last year were used as testing data to validate the proposed methods. The percentiles of the duration distribution of lists of cases were calculated by Type IV Pearson distribution and t-distribution. Monte Carlo simulation was conducted to verify the accuracy of percentiles defined by the proposed methods. Operating rooms in John D. Dingell VA Medical Center, United States, from January 2005 to December 2011. Differences between the proportion of lists of cases that were completed within the percentiles of the proposed duration distribution of the lists and the corresponding percentiles. Compared with the t-distribution, the proposed new distribution is 8.31% (0.38) more accurate on average and 14.16% (0.19) more accurate in calculating the probabilities at the 10th and 90th percentiles of the distribution, which is a major concern of operating room schedulers. The absolute deviations between the percentiles defined by Type IV Pearson distribution and those from Monte Carlo simulation varied from 0.20 min (0.01) to 0.43 min (0.03). Values are mean (SEM). Operating room schedulers can rely on the most recent 10 cases with the same combination of surgeon and procedure(s) for distribution parameter estimation to plan lists of cases. The proposed Type IV Pearson distribution is more accurate than t-distribution to estimate the probabilities of underrun and overrun of lists of cases. However, as not all the individual case durations followed log-normal distributions, there was some deviation from the true duration distribution of the lists.
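
    A generic sketch of the underlying question: given a distribution for the total duration of a list of cases, what are the underrun and overrun probabilities? The sketch below uses Monte Carlo over assumed log-normal case durations rather than the paper's Type IV Pearson fit, and all parameters are invented.

    ```python
    # Hedged sketch: underrun/overrun probabilities for a list of cases,
    # assuming log-normal individual case durations (a generic stand-in).
    import numpy as np

    rng = np.random.default_rng(3)
    # Log-normal duration parameters (log-minutes) for a 3-case list, assumed.
    mus = [np.log(90.0), np.log(60.0), np.log(45.0)]
    sigmas = [0.30, 0.25, 0.35]
    allocated = 240.0  # minutes of scheduled operating-room time

    totals = sum(rng.lognormal(m, s, 100_000) for m, s in zip(mus, sigmas))
    print("P(underrun) =", np.mean(totals < allocated))
    print("P(overrun)  =", np.mean(totals > allocated))
    print("10th/90th percentiles:", np.percentile(totals, [10, 90]))
    ```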

  6. A predictive approach to selecting the size of a clinical trial, based on subjective clinical opinion.

    PubMed

    Spiegelhalter, D J; Freedman, L S

    1986-01-01

    The 'textbook' approach to determining sample size in a clinical trial has some fundamental weaknesses which we discuss. We describe a new predictive method which takes account of prior clinical opinion about the treatment difference. The method adopts the point of clinical equivalence (determined by interviewing the clinical participants) as the null hypothesis. Decision rules at the end of the study are based on whether the interval estimate of the treatment difference (classical or Bayesian) includes the null hypothesis. The prior distribution is used to predict the probabilities of making the decisions to use one or other treatment or to reserve final judgement. It is recommended that sample size be chosen to control the predicted probability of the last of these decisions. An example is given from a multi-centre trial of superficial bladder cancer.
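
    A minimal sketch of the predictive calculation, assuming a normal prior on the treatment difference and a normally distributed trial estimate (all numbers illustrative): the predicted probability of "reserving judgement" is the chance that the final interval estimate still contains the point of clinical equivalence.

    ```python
    # Hedged sketch: predicted probability of reserving final judgement as a
    # function of sample size, under an assumed clinical prior.
    import numpy as np

    rng = np.random.default_rng(4)

    def p_reserve_judgement(n_per_arm, prior_mean=0.2, prior_sd=0.15,
                            equivalence=0.0, outcome_sd=1.0, sims=50_000):
        se = outcome_sd * np.sqrt(2.0 / n_per_arm)      # SE of mean difference
        delta = rng.normal(prior_mean, prior_sd, sims)  # prior draws
        est = rng.normal(delta, se)                     # simulated estimates
        # Judgement is reserved when the 95% CI contains the equivalence point.
        return np.mean(np.abs(est - equivalence) < 1.96 * se)

    for n in (50, 200, 800):
        print(n, "per arm -> P(reserve judgement) =", p_reserve_judgement(n))
    ```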

  7. Predictive Game Theory

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.

    2005-01-01

    Probability theory governs the outcome of a game; there is a distribution over mixed strategies, not a single "equilibrium". To predict a single mixed strategy one must use a loss function (external to the game's players). This provides a quantification of any strategy's rationality. We prove that rationality falls as the cost of computation rises (for players who have not previously interacted). All of this extends to games with varying numbers of players.

  8. Position Error Covariance Matrix Validation and Correction

    NASA Technical Reports Server (NTRS)

    Frisbee, Joe, Jr.

    2016-01-01

    In order to calculate operationally accurate collision probabilities, the position error covariance matrices predicted at times of closest approach must be sufficiently accurate representations of the position uncertainties. This presentation will discuss why the Gaussian distribution is a reasonable expectation for the position uncertainty and how this assumed distribution type is used in the validation and correction of position error covariance matrices.

  9. Radar prediction of absolute rain fade distributions for earth-satellite paths and general methods for extrapolation of fade statistics to other locations

    NASA Technical Reports Server (NTRS)

    Goldhirsh, J.

    1982-01-01

    The first absolute rain fade distribution method described establishes absolute fade statistics at a given site by means of a sampled radar data base. The second method extrapolates absolute fade statistics from one location to another, given simultaneously measured fade and rain rate statistics at the former. Both methods employ similar conditional fade statistic concepts and long term rain rate distributions. Probability deviations in the 2-19% range, with an 11% average, were obtained upon comparison of measured and predicted levels at given attenuations. The extrapolation of fade distributions to other locations at 28 GHz showed very good agreement with measured data at three sites located in the continental temperate region.

  10. Epidemic extinction paths in complex networks

    NASA Astrophysics Data System (ADS)

    Hindes, Jason; Schwartz, Ira B.

    2017-05-01

    We study the extinction of long-lived epidemics on finite complex networks induced by intrinsic noise. Applying analytical techniques to the stochastic susceptible-infected-susceptible model, we predict the distribution of large fluctuations, the most probable or optimal path through a network that leads to a disease-free state from an endemic state, and the average extinction time in general configurations. Our predictions agree with Monte Carlo simulations on several networks, including synthetic weighted and degree-distributed networks with degree correlations, and an empirical high school contact network. In addition, our approach quantifies characteristic scaling patterns for the optimal path and distribution of large fluctuations, both near and away from the epidemic threshold, in networks with heterogeneous eigenvector centrality and degree distributions.
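
    A minimal sketch of the stochastic model whose extinction the paper analyzes: a Gillespie simulation of SIS dynamics in a well-mixed population, a stand-in for the networked case treated with optimal-path techniques. Parameters are illustrative.

    ```python
    # Hedged sketch: noise-induced extinction times of the stochastic SIS model
    # from the endemic state, via a Gillespie simulation.
    import numpy as np

    rng = np.random.default_rng(5)

    def sis_extinction_time(N=20, beta=2.0, gamma=1.0, i0=10, t_max=1e5):
        """Time until the infected count first hits zero (extinction)."""
        i, t = i0, 0.0
        while i > 0 and t < t_max:
            rate_inf = beta * i * (N - i) / N   # infection rate
            rate_rec = gamma * i                # recovery rate
            total = rate_inf + rate_rec
            t += rng.exponential(1.0 / total)   # time to next event
            i += 1 if rng.uniform() < rate_inf / total else -1
        return t

    times = [sis_extinction_time() for _ in range(100)]
    print("mean extinction time:", np.mean(times))
    ```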

  12. Time-specific ecological niche modeling predicts spatial dynamics of vector insects and human dengue cases.

    PubMed

    Peterson, A Townsend; Martínez-Campos, Carmen; Nakazawa, Yoshinori; Martínez-Meyer, Enrique

    2005-09-01

    Numerous human diseases (malaria, dengue, yellow fever and leishmaniasis, to name a few) are transmitted by insect vectors with brief life cycles and biting activity that varies in both space and time. Although the general geographic distributions of these epidemiologically important species are known, the spatiotemporal variation in their emergence and activity remains poorly understood. We used ecological niche modeling via a genetic algorithm to produce time-specific predictive models of monthly distributions of Aedes aegypti in Mexico in 1995. Significant predictions of monthly mosquito activity and distributions indicate that predicting spatiotemporal dynamics of disease vector species is feasible; significant coincidence with human cases of dengue indicate that these dynamics probably translate directly into transmission of dengue virus to humans. This approach provides new potential for optimizing use of resources for disease prevention and remediation via automated forecasting of disease transmission risk.

  13. Metabolic networks evolve towards states of maximum entropy production.

    PubMed

    Unrean, Pornkamol; Srienc, Friedrich

    2011-11-01

    A metabolic network can be described by a set of elementary modes or pathways representing discrete metabolic states that support cell function. We have recently shown that in the most likely metabolic state the usage probability of individual elementary modes is distributed according to the Boltzmann distribution law while complying with the principle of maximum entropy production. To demonstrate that a metabolic network evolves towards such a state, we have carried out adaptive evolution experiments with Thermoanaerobacterium saccharolyticum operating with a reduced metabolic functionality based on a reduced set of elementary modes. In such a reduced metabolic network, metabolic fluxes can be conveniently computed from the measured metabolite secretion pattern. Over a time span of 300 generations the specific growth rate of the strain continuously increased, together with a continuous increase in the rate of entropy production. We show that the rate of entropy production asymptotically approaches the maximum entropy production rate predicted for the state in which the usage probability of individual elementary modes is distributed according to the Boltzmann distribution. Therefore, the outcome of evolution of a complex biological system can be predicted in highly quantitative terms using basic statistical mechanical principles. Copyright © 2011 Elsevier Inc. All rights reserved.

  14. Using Prediction Markets to Generate Probability Density Functions for Climate Change Risk Assessment

    NASA Astrophysics Data System (ADS)

    Boslough, M.

    2011-12-01

    Climate-related uncertainty is traditionally presented as an error bar, but it is becoming increasingly common to express it in terms of a probability density function (PDF). PDFs are a necessary component of probabilistic risk assessments, for which simple "best estimate" values are insufficient. Many groups have generated PDFs for climate sensitivity using a variety of methods. These PDFs are broadly consistent, but vary significantly in their details. One axiom of the verification and validation community is, "codes don't make predictions, people make predictions." This is a statement of the fact that subject domain experts generate results using assumptions within a range of epistemic uncertainty and interpret them according to their expert opinion. Different experts with different methods will arrive at different PDFs. For effective decision support, a single consensus PDF would be useful. We suggest that market methods can be used to aggregate an ensemble of opinions into a single distribution that expresses the consensus. Prediction markets have been shown to be highly successful at forecasting the outcome of events ranging from elections to box office returns. In prediction markets, traders can take a position on whether some future event will or will not occur. These positions are expressed as contracts that are traded in a double-auction market that aggregates price, which can be interpreted as a consensus probability that the event will take place. Since climate sensitivity cannot directly be measured, it cannot be predicted. However, the changes in global mean surface temperature are a direct consequence of climate sensitivity, changes in forcing, and internal variability. Viable prediction markets require an undisputed event outcome on a specific date. Climate-related markets exist on Intrade.com, an online trading exchange. One such contract is titled "Global Temperature Anomaly for Dec 2011 to be greater than 0.65 Degrees C." Settlement is based on global temperature anomaly data published by NASA GISS. Typical climate contracts predict the probability of a specified future temperature, but not the probability density or best estimate. One way to generate a probability distribution would be to create a family of contracts over a range of specified temperatures and interpret the price of each contract as its exceedance probability. The resulting plot of probability vs. anomaly is the market-based cumulative distribution function. The best estimate can be determined by interpolation, and the market-based uncertainty estimate can be based on the spread. One requirement for an effective prediction market is liquidity. Climate contracts are currently considered somewhat of a novelty and often lack sufficient liquidity, but climate change has the potential to generate both tremendous losses for some (e.g. agricultural collapse and extreme weather events) and wealth for others (access to natural resources and trading routes). Use of climate markets by large stakeholders has the potential to generate the liquidity necessary to make them viable. Sandia is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U.S. DoE's NNSA under contract DE-AC04-94AL85000.
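
    A minimal sketch of the proposed construction, with invented contract prices: interpret each threshold contract's price as an exceedance probability, assemble the market-implied CDF, and interpolate a best estimate and spread.

    ```python
    # Hedged sketch: building a market-implied distribution from a family of
    # threshold contracts. Prices below are invented, not market data.
    import numpy as np

    # Contract thresholds (deg C anomaly) and traded prices (0-1), illustrative.
    thresholds = np.array([0.45, 0.55, 0.65, 0.75, 0.85])
    prices = np.array([0.95, 0.80, 0.50, 0.20, 0.05])  # P(anomaly > threshold)

    cdf = 1.0 - prices                         # market-implied CDF values
    median = np.interp(0.5, cdf, thresholds)   # best estimate (50th percentile)
    spread = (np.interp(0.84, cdf, thresholds)
              - np.interp(0.16, cdf, thresholds)) / 2.0  # ~1-sigma equivalent
    pdf = np.diff(cdf) / np.diff(thresholds)   # finite-difference density
    print("best estimate:", median, "+/-", spread)
    print("piecewise density:", pdf)
    ```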

  15. In favor of general probability distributions: lateral prefrontal and insular cortices respond to stimulus inherent, but irrelevant differences.

    PubMed

    Mestres-Missé, Anna; Trampel, Robert; Turner, Robert; Kotz, Sonja A

    2016-04-01

    A key aspect of optimal behavior is the ability to predict what will come next. To achieve this, we must have a fairly good idea of the probability of occurrence of possible outcomes. This is based both on prior knowledge about a particular or similar situation and on immediately relevant new information. One question that arises is: when considering converging prior probability and external evidence, is the most probable outcome selected or does the brain represent degrees of uncertainty, even highly improbable ones? Using functional magnetic resonance imaging, the current study explored these possibilities by contrasting words that differ in their probability of occurrence, namely, unbalanced ambiguous words and unambiguous words. Unbalanced ambiguous words have a strong frequency-based bias towards one meaning, while unambiguous words have only one meaning. The current results reveal larger activation in lateral prefrontal and insular cortices in response to dominant ambiguous compared to unambiguous words even when prior and contextual information biases one interpretation only. These results suggest a probability distribution, whereby all outcomes and their associated probabilities of occurrence--even if very low--are represented and maintained.

  16. Propensity scores-potential outcomes framework to incorporate severity probabilities in the highway safety manual crash prediction algorithm.

    PubMed

    Sasidharan, Lekshmi; Donnell, Eric T

    2014-10-01

    Accurate estimation of the expected number of crashes at different severity levels for entities with and without countermeasures plays a vital role in selecting countermeasures in the framework of the safety management process. The current practice is to use the American Association of State Highway and Transportation Officials' Highway Safety Manual crash prediction algorithms, which combine safety performance functions and crash modification factors, to estimate the effects of safety countermeasures on different highway and street facility types. Many of these crash prediction algorithms are based solely on crash frequency, or assume that severity outcomes are unchanged when planning for, or implementing, safety countermeasures. Failing to account for the uncertainty associated with crash severity outcomes, and assuming crash severity distributions remain unchanged in safety performance evaluations, limits the utility of the Highway Safety Manual crash prediction algorithms in assessing the effect of safety countermeasures on crash severity. This study demonstrates the application of a propensity scores-potential outcomes framework to estimate the probability distribution for the occurrence of different crash severity levels by accounting for the uncertainties associated with them. The probability of fatal and severe injury crash occurrence at lighted and unlighted intersections is estimated in this paper using data from Minnesota. The results show that the expected probability of occurrence of fatal and severe injury crashes at a lighted intersection was 1 in 35 crashes, and the estimated risk ratio indicates that the corresponding probability at an unlighted intersection was 1.14 times higher than at lighted intersections. The results from the potential outcomes-propensity scores framework are compared to results obtained from traditional binary logit models, without application of propensity scores matching. Traditional binary logit analysis suggests that the probability of occurrence of severe injury crashes is higher at lighted intersections compared to unlighted intersections, which contradicts the findings obtained from the propensity scores-potential outcomes framework. This finding underscores the importance of having comparable treated and untreated entities in traffic safety countermeasure evaluations. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. Fit to predict? Eco-informatics for predicting the catchability of a pelagic fish in near real time.

    PubMed

    Scales, Kylie L; Hazen, Elliott L; Maxwell, Sara M; Dewar, Heidi; Kohin, Suzanne; Jacox, Michael G; Edwards, Christopher A; Briscoe, Dana K; Crowder, Larry B; Lewison, Rebecca L; Bograd, Steven J

    2017-12-01

    The ocean is a dynamic environment inhabited by a diverse array of highly migratory species, many of which are under direct exploitation in targeted fisheries. The timescales of variability in the marine realm coupled with the extreme mobility of ocean-wandering species such as tuna and billfish complicates fisheries management. Developing eco-informatics solutions that allow for near real-time prediction of the distributions of highly mobile marine species is an important step towards the maturation of dynamic ocean management and ecological forecasting. Using 25 yr (1990-2014) of NOAA fisheries' observer data from the California drift gillnet fishery, we model relative probability of occurrence (presence-absence) and catchability (total catch per gillnet set) of broadbill swordfish Xiphias gladius in the California Current System. Using freely available environmental data sets and open source software, we explore the physical drivers of regional swordfish distribution. Comparing models built upon remotely sensed data sets with those built upon a data-assimilative configuration of the Regional Ocean Modelling System (ROMS), we explore trade-offs in model construction, and address how physical data can affect predictive performance and operational capacity. Swordfish catchability was found to be highest in deeper waters (>1,500 m) with surface temperatures in the 14-20°C range, isothermal layer depth (ILD) of 20-40 m, positive sea surface height (SSH) anomalies, and during the new moon (<20% lunar illumination). We observed a greater influence of mesoscale variability (SSH, wind speed, isothermal layer depth, eddy kinetic energy) in driving swordfish catchability (total catch) than was evident in predicting the relative probability of presence (presence-absence), confirming the utility of generating spatiotemporally dynamic predictions. Data-assimilative ROMS circumvent the limitations of satellite remote sensing in providing physical data fields for species distribution models (e.g., cloud cover, variable resolution, subsurface data), and facilitate broad-scale prediction of dynamic species distributions in near real time. © 2017 by the Ecological Society of America.

  18. Probabilistic performance estimators for computational chemistry methods: The empirical cumulative distribution function of absolute errors

    NASA Astrophysics Data System (ADS)

    Pernot, Pascal; Savin, Andreas

    2018-06-01

    Benchmarking studies in computational chemistry use reference datasets to assess the accuracy of a method through error statistics. The commonly used error statistics, such as the mean signed and mean unsigned errors, do not inform end-users on the expected amplitude of prediction errors attached to these methods. We show that, the distributions of model errors being neither normal nor zero-centered, these error statistics cannot be used to infer prediction error probabilities. To overcome this limitation, we advocate for the use of more informative statistics, based on the empirical cumulative distribution function of unsigned errors, namely, (1) the probability for a new calculation to have an absolute error below a chosen threshold and (2) the maximal amplitude of errors one can expect with a chosen high confidence level. Those statistics are also shown to be well suited for benchmarking and ranking studies. Moreover, the standard error on all benchmarking statistics depends on the size of the reference dataset. Systematic publication of these standard errors would be very helpful to assess the statistical reliability of benchmarking conclusions.
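
    The two advocated statistics follow directly from the empirical distribution of absolute errors. A minimal sketch with synthetic error values standing in for a benchmark dataset:

    ```python
    # Hedged sketch: ECDF-based performance statistics for a method's errors.
    import numpy as np

    rng = np.random.default_rng(6)
    errors = np.abs(rng.normal(0.5, 2.0, 500))  # |model - reference|, synthetic

    threshold = 1.0   # chosen accuracy target (same units as the errors)
    p_below = np.mean(errors < threshold)       # (1) P(|error| < threshold)
    q95 = np.quantile(errors, 0.95)             # (2) error bound at 95% level
    print(f"P(|error| < {threshold}) = {p_below:.2f}")
    print(f"95% of predictions have |error| <= {q95:.2f}")
    ```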

  19. Estimation and prediction of maximum daily rainfall at Sagar Island using best fit probability models

    NASA Astrophysics Data System (ADS)

    Mandal, S.; Choudhury, B. U.

    2015-07-01

    Sagar Island, set on the continental shelf of the Bay of Bengal, is one of the deltas most vulnerable to extreme rainfall-driven climatic hazards. Information on the probability of occurrence of maximum daily rainfall will be useful in devising risk management for sustaining the rainfed agrarian economy vis-a-vis food and livelihood security. Using six probability distribution models and long-term (1982-2010) daily rainfall data, we studied the probability of occurrence of annual, seasonal and monthly maximum daily rainfall (MDR) in the island. To select the best-fit distribution models for the annual, seasonal and monthly time series, based on maximum rank with minimum value of the test statistics, three statistical goodness-of-fit tests, viz. the Kolmogorov-Smirnov test (K-S), the Anderson-Darling test (A²) and the Chi-square test (χ²), were employed. The best-fit probability distribution was identified from the highest overall score obtained from the three goodness-of-fit tests. Results revealed that the normal probability distribution was best fitted for annual, post-monsoon and summer season MDR, while the Lognormal, Weibull and Pearson 5 distributions were best fitted for the pre-monsoon, monsoon and winter seasons, respectively. The estimated annual MDR were 50, 69, 86, 106 and 114 mm for return periods of 2, 5, 10, 20 and 25 years, respectively. The probabilities of an annual MDR of >50, >100, >150, >200 and >250 mm were estimated as 99, 85, 40, 12 and 3 % levels of exceedance, respectively. The monsoon, summer and winter seasons exhibited comparatively higher probabilities (78 to 85 %) for MDR of >100 mm and moderate probabilities (37 to 46 %) for >150 mm. For different recurrence intervals, the percent probability of MDR varied widely across intra- and inter-annual periods. In the island, rainfall anomaly can pose a climatic threat to the sustainability of agricultural production and thus needs adequate adaptation and mitigation measures.
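
    A minimal sketch of the fit-test-predict workflow, assuming a synthetic rainfall series and using only a Kolmogorov-Smirnov test; the study also used Anderson-Darling and Chi-square tests and more candidate distributions.

    ```python
    # Hedged sketch: fit candidate distributions to annual maximum daily
    # rainfall, rank by KS statistic, and read off return levels.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    mdr = rng.normal(70.0, 25.0, 29).clip(min=5.0)  # synthetic annual MDR (mm)

    candidates = {"normal": stats.norm, "lognormal": stats.lognorm,
                  "weibull": stats.weibull_min}
    for name, dist in candidates.items():
        params = dist.fit(mdr)
        ks = stats.kstest(mdr, dist.cdf, args=params)
        print(name, "KS statistic:", round(ks.statistic, 3))

    # Return levels from the best-fit (here: normal) distribution.
    mu, sd = stats.norm.fit(mdr)
    for T in (2, 5, 10, 20, 25):
        print(f"{T}-yr return level: {stats.norm.ppf(1 - 1/T, mu, sd):.0f} mm")
    ```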

  20. Multinomial Logistic Regression Predicted Probability Map To Visualize The Influence Of Socio-Economic Factors On Breast Cancer Occurrence in Southern Karnataka

    NASA Astrophysics Data System (ADS)

    Madhu, B.; Ashok, N. C.; Balasubramanian, S.

    2014-11-01

    Multinomial logistic regression analysis was used to develop a statistical model that can predict the probability of breast cancer occurrence in Southern Karnataka, using breast cancer occurrence data from 2007-2011. Independent socio-economic variables describing breast cancer occurrence, such as age, education, occupation, parity, type of family, health insurance coverage, residential locality and socioeconomic status, were obtained for each case. The models were developed as follows: i) spatial visualization of the urban-rural distribution of breast cancer cases obtained from the Bharat Hospital and Institute of Oncology; ii) socio-economic risk factors describing the breast cancer occurrences were compiled for each case; these data were then analysed using multinomial logistic regression analysis in the SPSS statistical software, relations between the occurrence of breast cancer and socio-economic status and the influence of other socio-economic variables were evaluated, and multinomial logistic regression models were constructed; iii) the model that best predicted the occurrence of breast cancer was identified. This multivariate logistic regression model was entered into a geographic information system, and maps showing the predicted probability of breast cancer occurrence in Southern Karnataka were created. This study demonstrates that multinomial logistic regression is a valuable tool for developing models that predict the probability of breast cancer occurrence in Southern Karnataka.
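
    A minimal sketch of the modeling step, with invented covariates and outcome classes: the study used SPSS, while here scikit-learn's logistic regression (multinomial under the default lbfgs solver) produces the per-class predicted probabilities that a GIS layer would then visualize.

    ```python
    # Hedged sketch: multinomial logistic regression yielding predicted
    # probabilities from socio-economic covariates. Data are synthetic.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(8)
    n = 500
    # Illustrative covariates: age, parity, socioeconomic score.
    X = np.column_stack([rng.integers(25, 75, n),
                         rng.integers(0, 5, n),
                         rng.normal(0.0, 1.0, n)])
    # Synthetic 3-class outcome driven by the socioeconomic score.
    y = np.digitize(X[:, 2] + rng.normal(0.0, 0.5, n), [-0.5, 0.5])

    model = LogisticRegression(max_iter=1000)  # lbfgs fits a multinomial model
    model.fit(X, y)
    probs = model.predict_proba(X[:3])   # per-class predicted probabilities
    print(np.round(probs, 3))            # these are what the map visualizes
    ```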

  1. Prediction of Malaysian monthly GDP

    NASA Astrophysics Data System (ADS)

    Hin, Pooi Ah; Ching, Soo Huei; Yeing, Pan Wei

    2015-12-01

    The paper attempts to use a method based on the multivariate power-normal distribution to predict Malaysian Gross Domestic Product (GDP) for the next month. Letting r(t) be the vector consisting of the month-t values of m selected macroeconomic variables and GDP, we model the month-(t+1) GDP to be dependent on the present and l-1 past values r(t), r(t-1), …, r(t-l+1) via a conditional distribution which is derived from an [(m+1)l+1]-dimensional power-normal distribution. The 100(α/2)% and 100(1-α/2)% points of the conditional distribution may be used to form an out-of-sample prediction interval. This interval, together with the mean of the conditional distribution, may be used to predict the month-(t+1) GDP. The mean absolute percentage error (MAPE), estimated coverage probability and average length of the prediction interval are used as the criteria for selecting the suitable lag value l-1 and the subset from a pool of 17 macroeconomic variables. It is found that the relatively better models are those for which 2 ≤ l ≤ 3, involving one or two of the macroeconomic variables given by Market Indicative Yield, Oil Prices, Exchange Rate and Import Trade.

  2. Relating Tropical Cyclone Track Forecast Error Distributions with Measurements of Forecast Uncertainty

    DTIC Science & Technology

    2016-03-01

    cyclone THORPEX The Observing System Research and Predictability Experiment TIGGE THORPEX Interactive Grand Global Ensemble TS tropical storm ...forecast possible, but also relay the level of uncertainty unique to a given storm. This will better inform decision makers to help protect all assets at...for any given storm. Thus, the probabilities may increase or decrease (and the probability swath may widen or narrow) to provide a more

  3. Weibull-Based Design Methodology for Rotating Aircraft Engine Structures

    NASA Technical Reports Server (NTRS)

    Zaretsky, Erwin; Hendricks, Robert C.; Soditus, Sherry

    2002-01-01

    The NASA Energy Efficient Engine (E³ Engine) is used as the basis of a Weibull-based life and reliability analysis. Each component's life, and thus the engine's life, is defined by high-cycle fatigue (HCF) or low-cycle fatigue (LCF). Knowing the cumulative life distribution of each of the components making up the engine, as represented by a Weibull slope, is a prerequisite to predicting the life and reliability of the entire engine. As the engine Weibull slope increases, the predicted lives decrease. The predicted engine lives L_5 (95% probability of survival) of approximately 17,000 and 32,000 hr do correlate with current engine maintenance practices without and with refurbishment, respectively. The individual high pressure turbine (HPT) blade lives necessary to obtain a blade system life L_0.1 (99.9% probability of survival) of 9000 hr for Weibull slopes of 3, 6 and 9 are 47,391, 20,652 and 15,658 hr, respectively. For a design life of the HPT disks having probable points of failure equal to or greater than 36,000 hr at a probability of survival of 99.9%, the predicted disk system life L_0.1 can vary from 9,408 to 24,911 hr.
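
    The quoted blade lives follow from series-system Weibull arithmetic: for a Weibull slope beta, life at survival probability S scales as (ln(1/S))^(1/beta), so a row of n identical blades needs each blade's L_0.1 life to be n^(1/beta) times the required system L_0.1 life. A minimal sketch, where the blade count n = 146 is inferred here from the quoted lives rather than stated in the abstract:

    ```python
    # Hedged sketch: required individual blade life from a series-system
    # Weibull scaling, L_blade = L_sys * n**(1/beta).
    l_sys = 9000.0     # required blade-system L_0.1 life, hr
    n_blades = 146     # inferred: reproduces 47,391 / 20,652 / 15,658 hr
    for beta in (3, 6, 9):
        l_blade = l_sys * n_blades ** (1.0 / beta)
        print(f"beta={beta}: individual blade L_0.1 ~ {l_blade:,.0f} hr")
    ```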

  4. Occupancy modeling of autonomously recorded vocalizations to predict distribution of rallids in tidal wetlands

    USGS Publications Warehouse

    Stiffler, Lydia L.; Anderson, James T.; Katzner, Todd

    2018-01-01

    Conservation and management for a species requires reliable information on its status, distribution, and habitat use. We identified occupancy and distributions of king (Rallus elegans) and clapper (R. crepitans) rail populations in marsh complexes along the Pamunkey and Mattaponi Rivers in Virginia, USA by modeling data on vocalizations recorded from autonomous recording units (ARUs). Occupancy probability for both species combined was 0.64 (95% CI: 0.53, 0.75) in marshes along the Pamunkey and 0.59 (0.45, 0.72) in marshes along the Mattaponi. Occupancy probability along the Pamunkey was strongly influenced by salinity, increasing logistically by a factor of 1.62 (0.6, 2.65) per parts per thousand of salinity. In contrast, there was not a strong salinity gradient on the Mattaponi and therefore vegetative community structure determined occupancy probability on that river. Estimated detection probability across both marshes was 0.63 (0.62, 0.65), but detection rates decreased as the season progressed. Monitoring wildlife within wetlands presents unique challenges for conservation managers. Our findings provide insight not only into how rails responded to environmental variation but also into the general utility of ARUs for occupancy modeling of the distribution and habitat associations of rails within tidal marsh systems.

  5. Correlation between discrete probability and reaction front propagation rate in heterogeneous mixtures

    NASA Astrophysics Data System (ADS)

    Naine, Tarun Bharath; Gundawar, Manoj Kumar

    2017-09-01

    We demonstrate a very powerful correlation between the discrete probability of distances of neighboring cells and the thermal wave propagation rate for a system of cells spread on a one-dimensional chain. A gamma distribution is employed to model the distances of neighboring cells. In the absence of an analytical solution, and with the differences in ignition times of adjacent reaction cells following non-Markovian statistics, the solution for the thermal wave propagation rate for a one-dimensional system with randomly distributed cells is invariably obtained by numerical simulations. However, such simulations, which are based on Monte Carlo methods, require several iterations of calculations for different realizations of the distribution of adjacent cells. For several one-dimensional systems differing in the value of the shaping parameter of the gamma distribution, we show that the average reaction front propagation rates obtained by a discrete probability between two limits show excellent agreement with those obtained numerically. With the upper limit at 1.3, the lower limit depends on the non-dimensional ignition temperature. Additionally, this approach also facilitates the prediction of burning limits of heterogeneous thermal mixtures. The proposed method completely eliminates the need for laborious, time-intensive numerical calculations: the thermal wave propagation rates can now be calculated based only on the macroscopic entity of discrete probability.

  6. Earthquake prediction: the interaction of public policy and science.

    PubMed Central

    Jones, L M

    1996-01-01

    Earthquake prediction research has searched for both informational phenomena, those that provide information about earthquake hazards useful to the public, and causal phenomena, causally related to the physical processes governing failure on a fault, to improve our understanding of those processes. Neither informational nor causal phenomena are a subset of the other. I propose a classification of potential earthquake predictors of informational, causal, and predictive phenomena, where predictors are causal phenomena that provide more accurate assessments of the earthquake hazard than can be gotten from assuming a random distribution. Achieving higher, more accurate probabilities than a random distribution requires much more information about the precursor than just that it is causally related to the earthquake. PMID:11607656

  7. Integrity of Ceramic Parts Predicted When Loads and Temperatures Fluctuate Over Time

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.

    2004-01-01

    Brittle materials are being used, and being considered for use, for a wide variety of high performance applications that operate in harsh environments, including static and rotating turbine parts for unmanned aerial vehicles, auxiliary power units, and distributed power generation. Other applications include thermal protection systems, dental prosthetics, fuel cells, oxygen transport membranes, radomes, and microelectromechanical systems (MEMS). In order for these high-technology ceramics to be used successfully for structural applications that push the envelope of materials capabilities, design engineers must consider that brittle materials are designed and analyzed differently than metallic materials. Unlike ductile metals, brittle materials display a stochastic strength response because of the combination of low fracture toughness and the random nature of the size, orientation, and distribution of inherent microscopic flaws. This plus the fact that the strength of a component under load may degrade over time because of slow crack growth means that a probabilistic-based life-prediction methodology must be used when the tradeoffs of failure probability, performance, and useful life are being optimized. The CARES/Life code (which was developed at the NASA Glenn Research Center) predicts the probability of ceramic components failing from spontaneous catastrophic rupture when these components are subjected to multiaxial loading and slow crack growth conditions. Enhancements to CARES/Life now allow for the component survival probability to be calculated when loading and temperature vary over time.

  8. Twenty-five years of change in southern African passerine diversity: nonclimatic factors of change.

    PubMed

    Péron, Guillaume; Altwegg, Res

    2015-09-01

    We analysed more than 25 years of change in passerine bird distribution in South Africa, Swaziland and Lesotho, to show that species distributions can be influenced by processes that are at least in part independent of the local strength and direction of climate change: land use and ecological succession. We used occupancy models that separate species' detection from species' occupancy probability, fitted to citizen science data from both phases of the Southern African Bird Atlas Project (1987-1996 and 2007-2013). Temporal trends in species' occupancy probability were interpreted in terms of local extinction/colonization, and temporal trends in detection probability were interpreted in terms of change in abundance. We found for the first time at this scale that, as predicted in the context of bush encroachment, closed-savannah specialists increased where open-savannah specialists decreased. In addition, the trend in the abundance of species a priori thought to be favoured by agricultural conversion was negatively correlated with human population density, which is in line with hypotheses explaining the decline in farmland birds in the Northern Hemisphere. In addition to climate, vegetation cover and the intensity and time since agricultural conversion constitute important predictors of biodiversity changes in the region. Their inclusion will improve the reliability of predictive models of species distribution. © 2015 John Wiley & Sons Ltd.

  9. Developing and Validating a Survival Prediction Model for NSCLC Patients Through Distributed Learning Across 3 Countries.

    PubMed

    Jochems, Arthur; Deist, Timo M; El Naqa, Issam; Kessler, Marc; Mayo, Chuck; Reeves, Jackson; Jolly, Shruti; Matuszak, Martha; Ten Haken, Randall; van Soest, Johan; Oberije, Cary; Faivre-Finn, Corinne; Price, Gareth; de Ruysscher, Dirk; Lambin, Philippe; Dekker, Andre

    2017-10-01

    Tools for survival prediction for non-small cell lung cancer (NSCLC) patients treated with chemoradiation or radiation therapy are of limited quality. In this work, we developed a predictive model of survival at 2 years. The model is based on a large volume of historical patient data and serves as a proof of concept to demonstrate the distributed learning approach. Clinical data from 698 lung cancer patients, treated with curative intent with chemoradiation or radiation therapy alone, were collected and stored at 2 different cancer institutes (559 patients at Maastro clinic, Netherlands, and 139 at Michigan University, United States). The model was further validated on 196 patients originating from The Christie (United Kingdom). A Bayesian network model was adapted for distributed learning (the animation can be viewed at https://www.youtube.com/watch?v=ZDJFOxpwqEA). Two-year posttreatment survival was chosen as the endpoint. The Maastro clinic cohort data are publicly available at https://www.cancerdata.org/publication/developing-and-validating-survival-prediction-model-nsclc-patients-through-distributed, and the developed models can be found at www.predictcancer.org. Variables included in the final model were T and N category, age, performance status, and total tumor dose. The model has an area under the curve (AUC) of 0.66 on the external validation set and an AUC of 0.62 on a 5-fold cross validation. A model based on the T and N category performed with an AUC of 0.47 on the validation set, significantly worse than our model (P<.001). Learning the model in a centralized or distributed fashion yields a minor difference in the probabilities of the conditional probability tables (0.6%); the discriminative performance of the models on the validation set is similar (P=.26). Distributed learning from federated databases allows learning of predictive models on data originating from multiple institutions while avoiding many of the data-sharing barriers. We believe that distributed learning is the future of sharing data in health care. Copyright © 2017 The Author(s). Published by Elsevier Inc. All rights reserved.

  10. UXO Burial Prediction Fidelity: A Summary

    DTIC Science & Technology

    2017-07-01

    should not be construed as representing the official position of either the Department of Defense or the sponsoring organization. For More Information ...equilibrium. Any complete picture of munition evolution in sediment would need to account for these effects. More relevant to the present topic: these...of adds uncertainty to predictions of munition fate, and assessments of risk probabilities would need to account for the statistical distribution of

  11. Tree mortality estimates and species distribution probabilities in southeastern United States forests

    Treesearch

    Martin A. Spetich; Zhaofei Fan; Zhen Sui; Michael Crosby; Hong S. He; Stephen R. Shifley; Theodor D. Leininger; W. Keith Moser

    2017-01-01

    Stresses to trees under a changing climate can lead to changes in forest tree survival, mortality and distribution.  For instance, a study examining the effects of human-induced climate change on forest biodiversity by Hansen and others (2001) predicted a 32% reduction in loblolly–shortleaf pine habitat across the eastern United States.  However, they also...

  12. Scale relativity and hierarchical structuring of planetary systems

    NASA Astrophysics Data System (ADS)

    Galopeau, P. H. M.; Nottale, L.; da Rocha, D.; Tran Minh, N.

    2003-04-01

    The theory of scale relativity, applied to macroscopic gravitational systems like planetary systems, allows one to predict quantization laws for several key parameters characterizing those systems (distance between planets and central star, obliquity, eccentricity...), which are organized in a hierarchical way. In the framework of the scale relativity approach, one demonstrates that the motion (at relatively large time-scales) of the bodies in planetary systems, described in terms of fractal geodesic trajectories, is governed by a Schrödinger-like equation. Preferential orbits are predicted in terms of probability density peaks with semi-major axes given by a_n = GMn^2/w^2 (M is the mass of the central star and w is a velocity close to 144 km s^-1 in the case of our inner solar system and of the presently observed exoplanets). The velocity of the planet orbiting at this distance satisfies the relation v_n = w/n. Moreover, the mass distribution of the planets in our solar system can be accounted for in this model. These predictions are in good agreement with the observed values of the actual orbital parameters. Furthermore, the exoplanets which have recently been discovered around nearby stars also follow the same law, in terms of the same constant, in a highly significant statistical way. The theory of scale relativity also predicts structures for the obliquities and inclinations of the planets and satellites: the probability density of their distribution between 0 and π is expected to display peaks at particular angles θ_k = kπ/n. A statistical agreement is obtained for our solar system with n=7. Another prediction concerns the distribution of the planets' eccentricities e. The theory foresees a quantization law e = k/n, where k is an integer and n is the quantum number that characterizes the semi-major axes. The presently known exoplanet eccentricities are compatible with this theoretical prediction. Finally, although all these planetary systems may look very different from our solar system, they actually present universal structures comparable to ours, so that there is a high probability of discovering exoplanets with orbital characteristics very similar to the Earth's.
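
    The quoted law is easy to evaluate numerically. The sketch below plugs standard constants into a_n = GMn^2/w^2 with w = 144 km/s and M the solar mass; for n = 3-6 the predicted radii fall near the orbits of Mercury through Mars, which is the kind of agreement the abstract reports:

    ```python
    G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
    M_SUN = 1.989e30   # solar mass, kg
    W = 144e3          # quantization velocity for the inner solar system, m/s
    AU = 1.496e11      # astronomical unit, m

    for n in range(1, 7):
        a_n = G * M_SUN * n**2 / W**2   # predicted semi-major axis
        v_n = W / n                     # predicted orbital velocity
        print(f"n={n}: a = {a_n / AU:.3f} AU, v = {v_n / 1e3:.1f} km/s")
    ```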

  13. Probability distribution functions for unit hydrographs with optimization using genetic algorithm

    NASA Astrophysics Data System (ADS)

    Ghorbani, Mohammad Ali; Singh, Vijay P.; Sivakumar, Bellie; H. Kashani, Mahsa; Atre, Atul Arvind; Asadi, Hakimeh

    2017-05-01

    A unit hydrograph (UH) of a watershed may be viewed as the unit pulse response function of a linear system. In recent years, the use of probability distribution functions (pdfs) for determining a UH has received much attention. In this study, a nonlinear optimization model is developed to transmute a UH into a pdf. The potential of six popular pdfs, namely the two-parameter gamma, two-parameter Gumbel, two-parameter log-normal, two-parameter normal, three-parameter Pearson, and two-parameter Weibull distributions, is tested on data from the Lighvan catchment in Iran. The probability distribution parameters are determined using the nonlinear least squares optimization method in two ways: (1) optimization by programming in Mathematica; and (2) optimization by applying a genetic algorithm. The results are compared with those obtained by the traditional linear least squares method. The results show comparable capability and performance of the two nonlinear methods. The gamma and Pearson distributions are the most successful models in preserving the rising and recession limbs of the unit hydrographs. The log-normal distribution has a high ability in predicting both the peak flow and time to peak of the unit hydrograph. The nonlinear optimization method does not outperform the linear least squares method in determining the UH (especially for excess rainfall of one pulse), but is comparable.
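
    As a minimal illustration of transmuting a UH into a pdf (the ordinates below are hypothetical; the Lighvan data are not reproduced here), a two-parameter gamma shape can be fitted by nonlinear least squares, analogous to the Mathematica-based optimization described; scaling the pdf by a volume parameter is an added assumption so that the fitted curve matches the hydrograph's units:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.stats import gamma

    # Hypothetical unit hydrograph ordinates: time (h) and discharge.
    t = np.arange(1, 13, dtype=float)
    q = np.array([0.5, 2.1, 4.8, 6.0, 5.2, 3.9, 2.7, 1.8, 1.1, 0.7, 0.4, 0.2])

    def gamma_uh(t, shape, scale, volume):
        """Two-parameter gamma pdf scaled by the runoff volume."""
        return volume * gamma.pdf(t, a=shape, scale=scale)

    params, _ = curve_fit(gamma_uh, t, q, p0=[2.0, 2.0, q.sum()],
                          bounds=(0, np.inf))
    print("fitted shape, scale, volume:", params)
    ```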

  14. Theoretical Analysis of Rain Attenuation Probability

    NASA Astrophysics Data System (ADS)

    Roy, Surendra Kr.; Jha, Santosh Kr.; Jha, Lallan

    2007-07-01

    Satellite communication technologies are now highly developed and high quality, distance-independent services have expanded over a very wide area. As for the system design of the Hokkaido integrated telecommunications (HIT) network, it must first overcome outages of satellite links due to rain attenuation in Ka frequency bands. In this paper a theoretical analysis of rain attenuation probability on a slant path has been made. The proposed formula is based on the Weibull distribution and incorporates recent ITU-R recommendations concerning the necessary rain-rate and rain-height inputs. The error behaviour of the model was tested against the rain attenuation prediction model recommended by ITU-R over a large number of experiments at different probability levels. Compared to the ITU-R model, the novel slant-path rain attenuation prediction model exhibits similar behaviour at low time percentages and a better root-mean-square error performance for probability levels above 0.02%. The set of presented models has the advantage of low implementation complexity and is considered useful for educational and back-of-the-envelope computations.
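
    A minimal sketch of the Weibull exceedance calculation implied above (the scale and shape values are hypothetical; in the paper they follow from ITU-R rain-rate and rain-height inputs):

    ```python
    import numpy as np

    scale, shape = 3.0, 0.8   # hypothetical Weibull parameters (dB, dimensionless)

    def exceedance(a_dB):
        """P(attenuation > a_dB) under a Weibull attenuation model."""
        return np.exp(-(a_dB / scale) ** shape)

    # Invert the survival function for the attenuation exceeded 0.02% of the time.
    p = 0.0002
    a_p = scale * (-np.log(p)) ** (1 / shape)
    print(f"P(A > 10 dB) = {exceedance(10.0):.4%}")
    print(f"attenuation exceeded {p:.2%} of the time: {a_p:.1f} dB")
    ```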

  15. Deducing the multi-trader population driving a financial market

    NASA Astrophysics Data System (ADS)

    Gupta, Nachi; Hauser, Raphael; Johnson, Neil

    2005-12-01

    We have previously laid out a basic framework for predicting financial movements and pockets of predictability by tracking the distribution of a multi-trader population playing on an artificial financial market model. This work explores extensions to this basic framework. We allow for more intelligent agents with a richer strategy set, and we no longer constrain the distribution over these agents to a probability space. We then introduce a fusion scheme which accounts for multiple runs of randomly chosen sets of possible agent types. We also discuss a mechanism for bias removal on the estimates.

  16. The role of presumed probability density functions in the simulation of nonpremixed turbulent combustion

    NASA Astrophysics Data System (ADS)

    Coclite, A.; Pascazio, G.; De Palma, P.; Cutrone, L.

    2016-07-01

    Flamelet-Progress-Variable (FPV) combustion models allow the evaluation of all thermochemical quantities in a reacting flow by computing only the mixture fraction Z and a progress variable C. When using such a method to predict turbulent combustion in conjunction with a turbulence model, a probability density function (PDF) is required to evaluate statistical averages (e.g., Favre averages) of chemical quantities. The choice of the PDF is a compromise between computational cost and accuracy. The aim of this paper is to investigate the influence of the PDF choice and its modeling aspects on the prediction of turbulent combustion. Three different models are considered: the standard one, based on the choice of a β-distribution for Z and a Dirac-distribution for C; a model employing a β-distribution for both Z and C; and a third model obtained using a β-distribution for Z and the statistically most likely distribution (SMLD) for C. The standard model, although widely used, does not take into account the interaction between turbulence and chemical kinetics, nor the dependence of the progress variable on its variance in addition to its mean. The SMLD approach establishes a systematic framework to incorporate information from an arbitrary number of moments, thus providing an improvement over conventionally employed presumed PDF closure models. The rationale behind the choice of the three PDFs is described in some detail, and the prediction capability of the corresponding models is tested against well-known test cases, namely the Sandia flames and H2-air supersonic combustion.
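
    The role of a presumed PDF can be made concrete with a small sketch: given the mean and variance of Z, moment-match a β-distribution and average a chemical quantity against it. The source-term function below is a hypothetical stand-in for a flamelet-table lookup:

    ```python
    import numpy as np
    from scipy.integrate import quad
    from scipy.stats import beta as beta_dist

    def presumed_beta_average(f, mean, var):
        """Average f(Z) over a beta PDF matched to the given mean/variance of Z."""
        k = mean * (1 - mean) / var - 1   # requires var < mean * (1 - mean)
        a, b = mean * k, (1 - mean) * k
        val, _ = quad(lambda z: f(z) * beta_dist.pdf(z, a, b), 0, 1)
        return val

    # Hypothetical stand-in for a tabulated source term omega(Z).
    omega = lambda z: np.exp(-1 / max(z * (1 - z), 1e-12))
    print(presumed_beta_average(omega, mean=0.3, var=0.02))
    ```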

  17. Neural substrates of updating the prediction through prediction error during decision making.

    PubMed

    Wang, Ying; Ma, Ning; He, Xiaosong; Li, Nan; Wei, Zhengde; Yang, Lizhuang; Zha, Rujing; Han, Long; Li, Xiaoming; Zhang, Daren; Liu, Ying; Zhang, Xiaochu

    2017-08-15

    Learning of prediction error (PE), including reward PE and risk PE, is crucial for updating the prediction in reinforcement learning (RL). Neurobiological and computational models of RL have reported extensive brain activations related to PE. However, the occurrence of PE does not necessarily predict updating the prediction, e.g., in a probability-known event. Therefore, the brain regions specifically engaged in updating the prediction remain unknown. Here, we conducted two functional magnetic resonance imaging (fMRI) experiments, the probability-unknown Iowa Gambling Task (IGT) and the probability-known risk decision task (RDT). Behavioral analyses confirmed that PEs occurred in both tasks but were only used for updating the prediction in the IGT. By comparing PE-related brain activations between the two tasks, we found that the rostral anterior cingulate cortex/ventral medial prefrontal cortex (rACC/vmPFC) and the posterior cingulate cortex (PCC) activated only during the IGT and were related to both reward and risk PE. Moreover, the responses in the rACC/vmPFC and the PCC were modulated by uncertainty and were associated with reward prediction-related brain regions. Electric brain stimulation over these regions lowered the performance in the IGT but not in the RDT. Our findings of a distributed neural circuit of PE processing suggest that the rACC/vmPFC and the PCC play a key role in updating the prediction through PE processing during decision making. Copyright © 2017 Elsevier Inc. All rights reserved.

  18. Large eddy simulation of turbulent premixed combustion using tabulated detailed chemistry and presumed probability density function

    NASA Astrophysics Data System (ADS)

    Zhang, Hongda; Han, Chao; Ye, Taohong; Ren, Zhuyin

    2016-03-01

    A method of chemistry tabulation combined with a presumed probability density function (PDF) is applied to simulate piloted premixed jet burner flames with high Karlovitz number using large eddy simulation. Thermo-chemistry states are tabulated by the combination of an auto-ignition and an extended auto-ignition model. To evaluate the capability of the proposed tabulation method to represent the thermo-chemistry states under different fresh-gas temperatures, an a priori study is conducted by performing idealised transient one-dimensional premixed flame simulations. A presumed PDF is used to account for the interaction of turbulence and flame, with a beta PDF modelling the reaction progress variable distribution. Two presumed PDF models, a Dirichlet distribution and independent beta distributions, are applied to represent the interaction between the two mixture fractions associated with the three inlet streams. Comparisons of statistical results show that both presumed PDF models for the two mixture fractions are capable of predicting temperature and major species profiles; however, they have a significant effect on the predictions of intermediate species. An analysis of the thermo-chemical state-space representation of the sub-grid scale (SGS) combustion model is performed by comparing correlations between the carbon monoxide mass fraction and temperature. The SGS combustion model based on the proposed chemistry tabulation can reasonably capture the peak value and trend of intermediate species. Aspects regarding model extensions to adequately predict the peak location of intermediate species are discussed.

  19. Multinomial Logistic Regression & Bootstrapping for Bayesian Estimation of Vertical Facies Prediction in Heterogeneous Sandstone Reservoirs

    NASA Astrophysics Data System (ADS)

    Al-Mudhafar, W. J.

    2013-12-01

    Precise prediction of rock facies leads to adequate reservoir characterization by improving the porosity-permeability relationships used to estimate properties in non-cored intervals. It also helps to accurately identify the spatial facies distribution, supporting an accurate reservoir model for optimal future reservoir performance. In this paper, facies estimation has been done through multinomial logistic regression (MLR) with respect to the well logs and core data in a well in the upper sandstone formation of the South Rumaila oil field. The independent variables are gamma ray, formation density, water saturation, shale volume, log porosity, core porosity, and core permeability. First, a robust sequential imputation algorithm has been used to impute the missing data. This algorithm starts from a complete subset of the dataset and estimates sequentially the missing values in an incomplete observation by minimizing the determinant of the covariance of the augmented data matrix. Then, the observation is added to the complete data matrix and the algorithm continues with the next observation with missing values. The MLR has been chosen to estimate the maximum likelihood and minimize the standard error for the nonlinear relationships between the facies and the core and log data. The MLR is used to predict the probabilities of the different possible facies given each independent variable by constructing a linear predictor function having a set of weights that are linearly combined with the independent variables using a dot product. A beta distribution of facies has been considered as prior knowledge, and the resulting predicted probability (posterior) has been estimated from the MLR based on Bayes' theorem, which relates the predicted probability (posterior) to the conditional probability and the prior knowledge. To assess the statistical accuracy of the model, the bootstrap is carried out to estimate the extra-sample prediction error by randomly drawing datasets with replacement from the training data. Each sample has the same size as the original training set, and the procedure can be repeated N times to produce N bootstrap datasets to which the model is re-fitted, decreasing the squared difference between the estimated and observed categorical variables (facies) and thereby the degree of uncertainty.
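
    A compact sketch of the MLR-plus-bootstrap workflow (synthetic stand-in data; the actual predictors are the well logs and core measurements listed above):

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 3))          # stand-in log/core predictors
    y = rng.integers(0, 3, size=200)       # three hypothetical facies classes

    model = LogisticRegression(max_iter=1000).fit(X, y)
    print("facies probabilities (first 3 samples):\n", model.predict_proba(X[:3]))

    # Bootstrap: refit on resampled datasets of the original size and
    # use the spread of misclassification rates as an error estimate.
    errors = []
    for _ in range(100):
        idx = rng.integers(0, len(X), size=len(X))
        boot = LogisticRegression(max_iter=1000).fit(X[idx], y[idx])
        errors.append(np.mean(boot.predict(X) != y))
    print("bootstrap misclassification estimate:", np.mean(errors))
    ```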

  20. Under the hood of statistical learning: A statistical MMN reflects the magnitude of transitional probabilities in auditory sequences.

    PubMed

    Koelsch, Stefan; Busch, Tobias; Jentschke, Sebastian; Rohrmeier, Martin

    2016-02-02

    Within the framework of statistical learning, many behavioural studies investigated the processing of unpredicted events. However, surprisingly few neurophysiological studies are available on this topic, and no statistical learning experiment has investigated electroencephalographic (EEG) correlates of processing events with different transition probabilities. We carried out an EEG study with a novel variant of the established statistical learning paradigm. Timbres were presented in isochronous sequences of triplets. The first two sounds of all triplets were equiprobable, while the third sound occurred with either low (10%), intermediate (30%), or high (60%) probability. Thus, the occurrence probability of the third item of each triplet (given the first two items) was varied. Compared to high-probability triplet endings, endings with low and intermediate probability elicited an early anterior negativity that had an onset around 100 ms and was maximal at around 180 ms. This effect was larger for events with low than for events with intermediate probability. Our results reveal that, when predictions are based on statistical learning, events that do not match a prediction evoke an early anterior negativity, with the amplitude of this mismatch response being inversely related to the probability of such events. Thus, we report a statistical mismatch negativity (sMMN) that reflects statistical learning of transitional probability distributions that go beyond auditory sensory memory capabilities.

  1. Robust optimization based upon statistical theory.

    PubMed

    Sobotta, B; Söhn, M; Alber, M

    2010-08-01

    Organ movement is still the biggest challenge in cancer treatment despite advances in online imaging. Due to the resulting geometric uncertainties, the delivered dose cannot be predicted precisely at treatment planning time. Consequently, all associated dose metrics (e.g., EUD and maxDose) are random variables with a patient-specific probability distribution. The method that the authors propose makes these distributions the basis of the optimization and evaluation process. The authors start from a model of motion derived from patient-specific imaging. On a multitude of geometry instances sampled from this model, a dose metric is evaluated. The resulting pdf of this dose metric is termed outcome distribution. The approach optimizes the shape of the outcome distribution based on its mean and variance. This is in contrast to the conventional optimization of a nominal value (e.g., PTV EUD) computed on a single geometry instance. The mean and variance allow for an estimate of the expected treatment outcome along with the residual uncertainty. Besides being applicable to the target, the proposed method also seamlessly includes the organs at risk (OARs). The likelihood that a given value of a metric is reached in the treatment is predicted quantitatively. This information reveals potential hazards that may occur during the course of the treatment, thus helping the expert to find the right balance between the risk of insufficient normal tissue sparing and the risk of insufficient tumor control. By feeding this information to the optimizer, outcome distributions can be obtained where the probability of exceeding a given OAR maximum and that of falling short of a given target goal can be minimized simultaneously. The method is applicable to any source of residual motion uncertainty in treatment delivery. Any model that quantifies organ movement and deformation in terms of probability distributions can be used as basis for the algorithm. Thus, it can generate dose distributions that are robust against interfraction and intrafraction motion alike, effectively removing the need for indiscriminate safety margins.
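
    The shift from a nominal value to an outcome distribution can be sketched with a deliberately simplified one-parameter motion model; the dose-metric function is a hypothetical stand-in for an EUD evaluation on one geometry instance:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def dose_metric(shift_mm):
        """Hypothetical EUD stand-in that degrades with organ displacement."""
        return 60.0 - 2.0 * abs(shift_mm)   # Gy

    # Sample geometry instances from a patient-specific motion model (normal here).
    shifts = rng.normal(loc=0.0, scale=3.0, size=5000)
    outcomes = np.array([dose_metric(s) for s in shifts])  # the outcome distribution

    print(f"expected EUD: {outcomes.mean():.2f} Gy, spread: {outcomes.std():.2f} Gy")
    print(f"P(EUD < 55 Gy) = {np.mean(outcomes < 55.0):.3f}")
    ```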

  2. Universal noise and Efimov physics

    NASA Astrophysics Data System (ADS)

    Nicholson, Amy N.

    2016-03-01

    Probability distributions for correlation functions of particles interacting via random-valued fields are discussed as a novel tool for determining the spectrum of a theory. In particular, this method is used to determine the energies of universal N-body clusters tied to Efimov trimers, for even N, by investigating the distribution of a correlation function of two particles at unitarity. Using numerical evidence that this distribution is log-normal, an analytical prediction for the N-dependence of the N-body binding energies is made.

  3. Measures for a multidimensional multiverse

    NASA Astrophysics Data System (ADS)

    Chung, Hyeyoun

    2015-04-01

    We explore the phenomenological implications of generalizing the causal patch and fat geodesic measures to a multidimensional multiverse, where the vacua can have differing numbers of large dimensions. We consider a simple model in which the vacua are nucleated from a D-dimensional parent spacetime through dynamical compactification of the extra dimensions, and compute the geometric contribution to the probability distribution of observations within the multiverse for each measure. We then study how the shape of this probability distribution depends on the time scales for the existence of observers, for vacuum domination, and for curvature domination (t_obs, t_Λ, and t_c, respectively). In this work we restrict ourselves to bubbles with positive cosmological constant, Λ. We find that in the case of the causal patch cutoff, when the bubble universes have p+1 large spatial dimensions with p ≥ 2, the shape of the probability distribution is such that we obtain the coincidence of time scales t_obs ~ t_Λ ~ t_c. Moreover, the size of the cosmological constant is related to the size of the landscape. However, the exact shape of the probability distribution is different in the case p = 2, compared to p ≥ 3. In the case of the fat geodesic measure, the result is even more robust: the shape of the probability distribution is the same for all p ≥ 2, and we once again obtain the coincidence t_obs ~ t_Λ ~ t_c. These results require only very mild conditions on the prior probability of the distribution of vacua in the landscape. Our work shows that the observed double coincidence of time scales is a robust prediction even when the multiverse is generalized to be multidimensional; that this coincidence is not a consequence of our particular Universe being (3+1)-dimensional; and that this observable cannot be used to preferentially select one measure over another in a multidimensional multiverse.

  4. Volume-weighted measure for eternal inflation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Winitzki, Sergei

    2008-08-15

    I propose a new volume-weighted probability measure for cosmological 'multiverse' scenarios involving eternal inflation. The 'reheating-volume (RV) cutoff' calculates the distribution of observable quantities on a portion of the reheating hypersurface that is conditioned to be finite. The RV measure is gauge-invariant, does not suffer from the 'youngness paradox', and is independent of initial conditions at the beginning of inflation. In slow-roll inflationary models with a scalar inflaton, the RV-regulated probability distributions can be obtained by solving nonlinear diffusion equations. I discuss possible applications of the new measure to 'landscape' scenarios with bubble nucleation. As an illustration, I compute the predictions of the RV measure in a simple toy landscape.

  5. Population pharmacokinetics and maximum a posteriori probability Bayesian estimator of abacavir: application of individualized therapy in HIV-infected infants and toddlers

    PubMed Central

    Zhao, Wei; Cella, Massimo; Della Pasqua, Oscar; Burger, David; Jacqz-Aigrain, Evelyne

    2012-01-01

    AIMS To develop a population pharmacokinetic model for abacavir in HIV-infected infants and toddlers, which will be used to describe both once and twice daily pharmacokinetic profiles, identify covariates that explain variability and propose optimal time points to optimize the area under the concentration–time curve (AUC) targeted dosage and individualize therapy. METHODS The pharmacokinetics of abacavir was described with plasma concentrations from 23 patients using nonlinear mixed-effects modelling (NONMEM) software. A two-compartment model with first-order absorption and elimination was developed. The final model was validated using bootstrap, visual predictive check and normalized prediction distribution errors. The Bayesian estimator was validated using the cross-validation and simulation–estimation method. RESULTS The typical population pharmacokinetic parameters and relative standard errors (RSE) were apparent systemic clearance (CL) 13.4 l h−1 (RSE 6.3%), apparent central volume of distribution 4.94 l (RSE 28.7%), apparent peripheral volume of distribution 8.12 l (RSE 14.2%), apparent intercompartment clearance 1.25 l h−1 (RSE 16.9%) and absorption rate constant 0.758 h−1 (RSE 5.8%). The covariate analysis identified weight as the individual factor influencing the apparent oral clearance: CL = 13.4 × (weight/12)^1.14. The maximum a posteriori probability Bayesian estimator, based on three concentrations measured at 0, 1 or 2, and 3 h after drug intake, allowed prediction of the individual AUC0–t. CONCLUSIONS The population pharmacokinetic model developed for abacavir in HIV-infected infants and toddlers accurately described both once and twice daily pharmacokinetic profiles. The maximum a posteriori probability Bayesian estimator of AUC0–t was developed from the final model and can be used routinely to optimize individual dosing. PMID:21988586
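
    The reported covariate model can be applied directly; this small sketch simply evaluates CL = 13.4 × (weight/12)^1.14 from the abstract for a few body weights:

    ```python
    def abacavir_clearance(weight_kg):
        """Typical apparent oral clearance (l/h) from the covariate model."""
        return 13.4 * (weight_kg / 12) ** 1.14

    for w in (6, 12, 20):
        print(f"{w} kg -> CL = {abacavir_clearance(w):.1f} l/h")
    ```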

  6. Uncertainty Quantification in Remaining Useful Life of Aerospace Components using State Space Models and Inverse FORM

    NASA Technical Reports Server (NTRS)

    Sankararaman, Shankar; Goebel, Kai

    2013-01-01

    This paper investigates the use of the inverse first-order reliability method (inverse-FORM) to quantify the uncertainty in the remaining useful life (RUL) of aerospace components. The prediction of remaining useful life is an integral part of system health prognosis, and directly helps in online health monitoring and decision-making. However, the prediction of remaining useful life is affected by several sources of uncertainty, and therefore it is necessary to quantify the uncertainty in the remaining useful life prediction. While system parameter uncertainty and physical variability can be easily included in inverse-FORM, this paper extends the methodology to include (1) future loading uncertainty; (2) process noise; and (3) uncertainty in the state estimate. The inverse-FORM method has been used in this paper to (1) quickly obtain probability bounds on the remaining useful life prediction; and (2) calculate the entire probability distribution of the remaining useful life prediction, with the results verified against Monte Carlo sampling. The proposed methodology is illustrated using a numerical example.

  7. Probability Forecasting Using Monte Carlo Simulation

    NASA Astrophysics Data System (ADS)

    Duncan, M.; Frisbee, J.; Wysack, J.

    2014-09-01

    Space Situational Awareness (SSA) is defined as the knowledge and characterization of all aspects of space. SSA is now a fundamental and critical component of space operations. Increased dependence on our space assets has in turn led to a greater need for accurate, near real-time knowledge of all space activities. With the growth of the orbital debris population, satellite operators are performing collision avoidance maneuvers more frequently. Frequent maneuver execution expends fuel and reduces the operational lifetime of the spacecraft. Thus new, more sophisticated collision threat characterization methods must be implemented. The collision probability metric is used operationally to quantify the collision risk. The collision probability is typically calculated days into the future, so that high risk and potential high risk conjunction events are identified early enough to develop an appropriate course of action. As the time horizon to the conjunction event is reduced, the collision probability changes. A significant change in the collision probability will change the satellite mission stakeholder's course of action. Constructing a method for estimating how the collision probability will evolve therefore improves operations by providing satellite operators with a new piece of information, namely an estimate or 'forecast' of how the risk will change as the time to the event is reduced. Collision probability forecasting is a predictive process in which the future risk of a conjunction event is estimated. The method utilizes a Monte Carlo simulation that produces a likelihood distribution for a given collision threshold. Using known state and state uncertainty information, the simulation generates a set of possible trajectories for a given space object pair. Each new trajectory produces a unique event geometry at the time of close approach. Given state uncertainty information for both objects, a collision probability value can be computed for every trial. This yields a collision probability distribution given known, predicted uncertainty. This paper presents the details of the collision probability forecasting method. We examine various conjunction event scenarios and numerically demonstrate the utility of this approach in typical event scenarios. We explore the utility of a probability-based track scenario simulation that models expected tracking data frequency as the tasking levels are increased. The resulting orbital uncertainty is subsequently used in the forecasting algorithm.
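
    A toy version of the forecasting idea is sketched below with a one-dimensional miss distance and hypothetical numbers; an operational implementation would propagate full three-dimensional states and covariances for both objects:

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    nominal_miss = 400.0   # m, predicted miss distance at close approach
    sigma_now = 300.0      # m, combined position uncertainty today
    threshold = 200.0      # m, combined hard-body radius

    def collision_probability(sigma, n=200_000):
        """Fraction of sampled trajectories with miss distance below threshold."""
        misses = rng.normal(nominal_miss, sigma, size=n)
        return np.mean(np.abs(misses) < threshold)

    # As tracking shrinks the uncertainty toward the event, forecast how Pc evolves.
    for frac in (1.0, 0.5, 0.25):
        s = sigma_now * frac
        print(f"sigma = {s:6.1f} m -> Pc = {collision_probability(s):.5f}")
    ```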

  8. Predicting Ligand Binding Sites on Protein Surfaces by 3-Dimensional Probability Density Distributions of Interacting Atoms

    PubMed Central

    Jian, Jhih-Wei; Elumalai, Pavadai; Pitti, Thejkiran; Wu, Chih Yuan; Tsai, Keng-Chang; Chang, Jeng-Yih; Peng, Hung-Pin; Yang, An-Suei

    2016-01-01

    Predicting ligand binding sites (LBSs) on protein structures, which are obtained either from experimental or computational methods, is a useful first step in functional annotation or structure-based drug design for the protein structures. In this work, the structure-based machine learning algorithm ISMBLab-LIG was developed to predict LBSs on protein surfaces with input attributes derived from the three-dimensional probability density maps of interacting atoms, which were reconstructed on the query protein surfaces and were relatively insensitive to local conformational variations of the tentative ligand binding sites. The prediction accuracy of the ISMBLab-LIG predictors is comparable to that of the best LBS predictors benchmarked on several well-established testing datasets. More importantly, the ISMBLab-LIG algorithm has substantial tolerance to the prediction uncertainties of computationally derived protein structure models. As such, the method is particularly useful for predicting LBSs not only on experimental protein structures without known LBS templates in the database but also on computationally predicted model protein structures with structural uncertainties in the tentative ligand binding sites. PMID:27513851

  9. The Effect of General Statistical Fiber Misalignment on Predicted Damage Initiation in Composites

    NASA Technical Reports Server (NTRS)

    Bednarcyk, Brett A.; Aboudi, Jacob; Arnold, Steven M.

    2014-01-01

    A micromechanical method is employed for the prediction of unidirectional composites in which the fiber orientation can possess various statistical misalignment distributions. The method relies on the probability-weighted averaging of the appropriate concentration tensor, which is established by the micromechanical procedure. This approach provides access to the local field quantities throughout the constituents, from which initiation of damage in the composite can be predicted. In contrast, a typical macromechanical procedure can determine the effective composite elastic properties in the presence of statistical fiber misalignment, but cannot provide the local fields. Fully random fiber distribution is presented as a special case using the proposed micromechanical method. Results are given that illustrate the effects of various amounts of fiber misalignment in terms of the standard deviations of in-plane and out-of-plane misalignment angles, where normal distributions have been employed. Damage initiation envelopes, local fields, effective moduli, and strengths are predicted for polymer and ceramic matrix composites with given normal distributions of misalignment angles, as well as fully random fiber orientation.
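
    The probability-weighted averaging step can be sketched as follows. The rotated-modulus function is a hypothetical stand-in for the concentration-tensor-based micromechanics evaluation; the normal misalignment distribution follows the abstract:

    ```python
    import numpy as np

    def axial_modulus(theta):
        """Hypothetical stand-in for the micromechanics result at misalignment theta."""
        return 150e9 * np.cos(theta) ** 4 + 10e9 * np.sin(theta) ** 4  # Pa

    def misalignment_average(std_deg, n=100_000, seed=0):
        """Probability-weighted average: sample angles from the assumed normal
        misalignment distribution and average the resulting property."""
        rng = np.random.default_rng(seed)
        theta = rng.normal(0.0, np.radians(std_deg), size=n)
        return axial_modulus(theta).mean()

    for sd in (1, 3, 5, 10):
        print(f"std = {sd:2d} deg -> E_eff = {misalignment_average(sd) / 1e9:.1f} GPa")
    ```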

  10. Impacts of Climate Change on the Global Invasion Potential of the African Clawed Frog Xenopus laevis

    PubMed Central

    Ihlow, Flora; Courant, Julien; Secondi, Jean; Herrel, Anthony; Rebelo, Rui; Measey, G. John; Lillo, Francesco; De Villiers, F. André; Vogt, Solveig; De Busschere, Charlotte; Backeljau, Thierry; Rödder, Dennis

    2016-01-01

    By altering or eliminating delicate ecological relationships, non-indigenous species are considered a major threat to biodiversity, as well as a driver of environmental change. Global climate change affects ecosystems and ecological communities, leading to changes in the phenology, geographic ranges, or population abundance of several species. Thus, predicting the impacts of global climate change on the current and future distribution of invasive species is an important subject in macroecological studies. The African clawed frog (Xenopus laevis), native to South Africa, possesses a strong invasion potential and populations have become established in numerous countries across four continents. The global invasion potential of X. laevis was assessed using correlative species distribution models (SDMs). SDMs were computed based on a comprehensive set of occurrence records covering South Africa, North America, South America and Europe and a set of nine environmental predictors. Models were built using both a maximum entropy model and an ensemble approach integrating eight algorithms. The future occurrence probabilities for X. laevis were subsequently computed using bioclimatic variables for 2070 following four different IPCC scenarios. Despite minor differences between the statistical approaches, both SDMs predict the future potential distribution of X. laevis, on a global scale, to decrease across all climate change scenarios. On a continental scale, both SDMs predict decreasing potential distributions in the species’ native range in South Africa, as well as in the invaded areas in North and South America, and in Australia where the species has not been introduced. In contrast, both SDMs predict the potential range size to expand in Europe. Our results suggest that all probability classes will be equally affected by climate change. New regional conditions may promote new invasions or the spread of established invasive populations, especially in France and Great Britain. PMID:27248830

  12. Statistical Issues for Uncontrolled Reentry Hazards Empirical Tests of the Predicted Footprint for Uncontrolled Satellite Reentry Hazards

    NASA Technical Reports Server (NTRS)

    Matney, Mark

    2011-01-01

    A number of statistical tools have been developed over the years for assessing the risk of reentering objects to human populations. These tools make use of the characteristics (e.g., mass, material, shape, size) of debris that are predicted by aerothermal models to survive reentry. The statistical tools use this information to compute the probability that one or more of the surviving debris might hit a person on the ground and cause one or more casualties. The statistical portion of the analysis relies on a number of assumptions about how the debris footprint and the human population are distributed in latitude and longitude, and how to use that information to arrive at realistic risk numbers. Because this information is used in making policy and engineering decisions, it is important that these assumptions be tested using empirical data. This study uses the latest database of known uncontrolled reentry locations measured by the United States Department of Defense. The predicted ground footprint distributions of these objects are based on the theory that their orbits behave basically like simple Kepler orbits. However, there are a number of factors in the final stages of reentry - including the effects of gravitational harmonics, the effects of the Earth's equatorial bulge on the atmosphere, and the rotation of the Earth and atmosphere - that could cause them to diverge from simple Kepler orbit behavior and possibly change the probability of reentering over a given location. In this paper, the measured latitude and longitude distributions of these objects are directly compared with the predicted distributions, providing a fundamental empirical test of the model assumptions.

  13. Fundamental niche prediction of the pathogenic yeasts Cryptococcus neoformans and Cryptococcus gattii in Europe.

    PubMed

    Cogliati, Massimo; Puccianti, Erika; Montagna, Maria T; De Donno, Antonella; Susever, Serdar; Ergin, Cagri; Velegraki, Aristea; Ellabib, Mohamed S; Nardoni, Simona; Macci, Cristina; Trovato, Laura; Dipineto, Ludovico; Rickerts, Volker; Akcaglar, Sevim; Mlinaric-Missoni, Emilija; Bertout, Sebastien; Vencà, Ana C F; Sampaio, Ana C; Criseo, Giuseppe; Ranque, Stéphane; Çerikçioğlu, Nilgün; Marchese, Anna; Vezzulli, Luigi; Ilkit, Macit; Desnos-Ollivier, Marie; Pasquale, Vincenzo; Polacheck, Itzhack; Scopa, Antonio; Meyer, Wieland; Ferreira-Paim, Kennio; Hagen, Ferry; Boekhout, Teun; Dromer, Françoise; Varma, Ashok; Kwon-Chung, Kyung J; Inácio, Joäo; Colom, Maria F

    2017-10-01

    Fundamental niche prediction of Cryptococcus neoformans and Cryptococcus gattii in Europe is an important tool to understand where these pathogenic yeasts have a high probability of surviving in the environment and therefore to identify the areas with a high risk of infection. In this study, occurrence data for C. neoformans and C. gattii were analysed with MaxEnt software against several bioclimatic variables as well as soil characteristics and land use. The results showed that C. gattii distribution can be predicted with high probability along the Mediterranean coast. The analysis of variables showed that its distribution is limited by low temperatures during the coldest season and by heavy precipitation in the driest season. C. neoformans var. grubii is able to colonize the same areas as C. gattii but is more tolerant of cold winter temperatures and summer precipitation. In contrast, the C. neoformans var. neoformans map was completely different. The best conditions for its survival were found in sub-continental areas and not along the Mediterranean coasts. In conclusion, we produced for the first time detailed prediction maps of the species and varieties of the C. neoformans and C. gattii species complex in Europe and the Mediterranean area. © 2017 Society for Applied Microbiology and John Wiley & Sons Ltd.

  14. Inference as Prediction

    ERIC Educational Resources Information Center

    Watson, Jane

    2007-01-01

    Inference, or decision making, is seen in curriculum documents as the final step in a statistical investigation. For a formal statistical enquiry this may be associated with sophisticated tests involving probability distributions. For young students without the mathematical background to perform such tests, it is still possible to draw informal…

  15. Statistical analysis of PM₁₀ concentrations at different locations in Malaysia.

    PubMed

    Sansuddin, Nurulilyana; Ramli, Nor Azam; Yahaya, Ahmad Shukri; Yusof, Noor Faizah Fitri Md; Ghazali, Nurul Adyani; Madhoun, Wesam Ahmed Al

    2011-09-01

    Malaysia has experienced several haze events since the 1980s as a consequence of the transboundary movement of air pollutants emitted from forest fires and open burning activities. Hazy episodes can also result from local activities and be categorized as "localized haze". General probability distributions (i.e., gamma and log-normal) were chosen to analyze the PM(10) concentration data at two different types of locations in Malaysia: industrial (Johor Bahru and Nilai) and residential (Kota Kinabalu and Kuantan). These areas were chosen based on their frequently high PM(10) concentration readings. The best model for each area was chosen based on performance indicator values. The best-fitting distributions provided the probability of exceedance and the return period for the actual and predicted concentrations, relative to the threshold limit given by the Malaysian Ambient Air Quality Guidelines (24-h average of 150 μg/m(3)) for PM(10) concentrations. The short-term prediction of PM(10) exceedances over 14 days was obtained using an autoregressive model.
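
    A minimal sketch of the exceedance and return-period computation (synthetic concentrations stand in for the monitoring data):

    ```python
    import numpy as np
    from scipy.stats import gamma, lognorm

    rng = np.random.default_rng(3)
    pm10 = rng.lognormal(mean=3.8, sigma=0.5, size=365)  # synthetic daily PM10, ug/m^3
    threshold = 150.0                                    # 24-h guideline value

    for name, dist in [("gamma", gamma), ("log-normal", lognorm)]:
        params = dist.fit(pm10, floc=0)          # fit with location fixed at zero
        p_exceed = dist.sf(threshold, *params)   # probability of exceedance
        print(f"{name}: P(PM10 > {threshold:.0f}) = {p_exceed:.4f}, "
              f"return period = {1 / p_exceed:.0f} days")
    ```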

  16. Stochastic modelling of animal movement.

    PubMed

    Smouse, Peter E; Focardi, Stefano; Moorcroft, Paul R; Kie, John G; Forester, James D; Morales, Juan M

    2010-07-27

    Modern animal movement modelling derives from two traditions. Lagrangian models, based on random walk behaviour, are useful for multi-step trajectories of single animals. Continuous Eulerian models describe expected behaviour, averaged over stochastic realizations, and are usefully applied to ensembles of individuals. We illustrate three modern research arenas. (i) Models of home-range formation describe the process of an animal 'settling down', accomplished by including one or more focal points that attract the animal's movements. (ii) Memory-based models are used to predict how accumulated experience translates into biased movement choices, employing reinforced random walk behaviour, with previous visitation increasing or decreasing the probability of repetition. (iii) Lévy movement involves a step-length distribution that is over-dispersed, relative to standard probability distributions, and adaptive in exploring new environments or searching for rare targets. Each of these modelling arenas implies more detail in the movement pattern than general models of movement can accommodate, but realistic empiric evaluation of their predictions requires dense locational data, both in time and space, only available with modern GPS telemetry.

  17. Mixture EMOS model for calibrating ensemble forecasts of wind speed.

    PubMed

    Baran, S; Lerch, S

    2016-03-01

    Ensemble model output statistics (EMOS) is a statistical tool for post-processing forecast ensembles of weather variables obtained from multiple runs of numerical weather prediction models in order to produce calibrated predictive probability density functions. The EMOS predictive probability density function is given by a parametric distribution with parameters depending on the ensemble forecasts. We propose an EMOS model for calibrating wind speed forecasts based on weighted mixtures of truncated normal (TN) and log-normal (LN) distributions, where model parameters and component weights are estimated by optimizing the values of proper scoring rules over a rolling training period. The new model is tested on wind speed forecasts of the 50-member European Centre for Medium-range Weather Forecasts ensemble, the 11-member Aire Limitée Adaptation dynamique Développement International-Hungary Ensemble Prediction System ensemble of the Hungarian Meteorological Service, and the eight-member University of Washington mesoscale ensemble, and its predictive performance is compared with that of various benchmark EMOS models based on single parametric families and combinations thereof. The results indicate improved calibration of probabilistic forecasts and improved accuracy of point forecasts in comparison with the raw ensemble and climatological forecasts. The mixture EMOS model significantly outperforms the TN and LN EMOS methods; moreover, it provides better calibrated forecasts than the TN-LN combination model and offers increased flexibility while avoiding covariate selection problems. © 2016 The Authors Environmetrics Published by John Wiley & Sons Ltd.
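
    The mixture predictive density itself takes only a few lines. In a real EMOS fit the location and scale parameters would be affine functions of the ensemble member forecasts and the weight would be estimated by optimizing a proper scoring rule; both steps are omitted in this sketch:

    ```python
    import numpy as np
    from scipy.stats import truncnorm, lognorm

    def mixture_pdf(x, w, mu_tn, sigma_tn, mu_ln, sigma_ln):
        """Weight w on a zero-truncated normal, 1 - w on a log-normal."""
        a = (0.0 - mu_tn) / sigma_tn   # truncation point at zero wind speed
        tn = truncnorm.pdf(x, a, np.inf, loc=mu_tn, scale=sigma_tn)
        ln = lognorm.pdf(x, s=sigma_ln, scale=np.exp(mu_ln))
        return w * tn + (1 - w) * ln

    x = np.linspace(0.1, 25, 6)   # wind speeds, m/s
    print(mixture_pdf(x, w=0.6, mu_tn=8.0, sigma_tn=3.0, mu_ln=2.0, sigma_ln=0.4))
    ```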

  18. A Reliability Estimation in Modeling Watershed Runoff With Uncertainties

    NASA Astrophysics Data System (ADS)

    Melching, Charles S.; Yen, Ben Chie; Wenzel, Harry G., Jr.

    1990-10-01

    The reliability of simulation results produced by watershed runoff models is a function of uncertainties in nature, data, model parameters, and model structure. A framework is presented here for using a reliability analysis method (such as first-order second-moment techniques or Monte Carlo simulation) to evaluate the combined effect of the uncertainties on the reliability of output hydrographs from hydrologic models. For a given event the prediction reliability can be expressed in terms of the probability distribution of the estimated hydrologic variable. The peak discharge probability for a watershed in Illinois using the HEC-1 watershed model is given as an example. The study of the reliability of predictions from watershed models provides useful information on the stochastic nature of output from deterministic models subject to uncertainties and identifies the relative contribution of the various uncertainties to unreliability of model predictions.

  19. Modeling workplace contact networks: The effects of organizational structure, architecture, and reporting errors on epidemic predictions.

    PubMed

    Potter, Gail E; Smieszek, Timo; Sailer, Kerstin

    2015-09-01

    Face-to-face social contacts are potentially important transmission routes for acute respiratory infections, and understanding the contact network can improve our ability to predict, contain, and control epidemics. Although workplaces are important settings for infectious disease transmission, few studies have collected workplace contact data and estimated workplace contact networks. We use contact diaries, architectural distance measures, and institutional structures to estimate social contact networks within a Swiss research institute. Some contact reports were inconsistent, indicating reporting errors. We adjust for this with a latent variable model, jointly estimating the true (unobserved) network of contacts and duration-specific reporting probabilities. We find that contact probability decreases with distance, and that research group membership, role, and shared projects are strongly predictive of contact patterns. Estimated reporting probabilities were low only for 0-5 min contacts. Adjusting for reporting error changed the estimate of the duration distribution, but did not change the estimates of covariate effects and had little effect on epidemic predictions. Our epidemic simulation study indicates that inclusion of network structure based on architectural and organizational structure data can improve the accuracy of epidemic forecasting models.

  1. Naive Probability: Model-Based Estimates of Unique Events.

    PubMed

    Khemlani, Sangeet S; Lotstein, Max; Johnson-Laird, Philip N

    2015-08-01

    We describe a dual-process theory of how individuals estimate the probabilities of unique events, such as Hillary Clinton becoming U.S. President. It postulates that uncertainty is a guide to improbability. In its computer implementation, an intuitive system 1 simulates evidence in mental models and forms analog non-numerical representations of the magnitude of degrees of belief. This system has minimal computational power and combines evidence using a small repertoire of primitive operations. It resolves the uncertainty of divergent evidence for single events, for conjunctions of events, and for inclusive disjunctions of events, by taking a primitive average of non-numerical probabilities. It computes conditional probabilities in a tractable way, treating the given event as evidence that may be relevant to the probability of the dependent event. A deliberative system 2 maps the resulting representations into numerical probabilities. With access to working memory, it carries out arithmetical operations in combining numerical estimates. Experiments corroborated the theory's predictions. Participants concurred in estimates of real possibilities. They violated the complete joint probability distribution in the predicted ways, when they made estimates about conjunctions: P(A), P(B), P(A and B), disjunctions: P(A), P(B), P(A or B or both), and conditional probabilities P(A), P(B), P(B|A). They were faster to estimate the probabilities of compound propositions when they had already estimated the probabilities of each of their components. We discuss the implications of these results for theories of probabilistic reasoning. © 2014 Cognitive Science Society, Inc.

  2. Probability density function of non-reactive solute concentration in heterogeneous porous formations.

    PubMed

    Bellin, Alberto; Tonina, Daniele

    2007-10-30

    Available models of solute transport in heterogeneous formations do not provide a complete characterization of the predicted concentration. This is a serious drawback especially in risk analysis, where confidence intervals and probabilities of exceeding threshold values are required. Our contribution to fill this gap of knowledge is a probability distribution model for the local concentration of conservative tracers migrating in heterogeneous aquifers. Our model accounts for dilution, mechanical mixing within the sampling volume and spreading due to formation heterogeneity. It is developed by modeling local concentration dynamics with an Ito Stochastic Differential Equation (SDE) that under the hypothesis of statistical stationarity leads to the Beta probability distribution function (pdf) for the solute concentration. This model shows large flexibility in capturing the smoothing effect of the sampling volume and the associated reduction of the probability of exceeding large concentrations. Furthermore, it is fully characterized by the first two moments of the solute concentration, and these are the same pieces of information required for standard geostatistical techniques employing Normal or Log-Normal distributions. Additionally, we show that in the absence of pore-scale dispersion and for point concentrations the pdf model converges to the binary distribution of [Dagan, G., 1982. Stochastic modeling of groundwater flow by unconditional and conditional probabilities, 2, The solute transport. Water Resour. Res. 18 (4), 835-848.], while it approaches the Normal distribution for sampling volumes much larger than the characteristic scale of the aquifer heterogeneity. Furthermore, we demonstrate that the same model, with the spatial moments replacing the statistical moments, can be applied to estimate the proportion of the plume volume where solute concentrations are above or below critical thresholds. Application of this model to point and vertically averaged bromide concentrations from the first Cape Cod tracer test and to a set of numerical simulations confirms the above findings and shows for the first time the superiority of the Beta model to both the Normal and Log-Normal models in interpreting field data. Furthermore, we show that assuming a priori that local concentrations are normally or log-normally distributed may result in a severe underestimate of the probability of exceeding large concentrations.
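
    The practical payoff, an exceedance probability from just the first two concentration moments, can be sketched as follows (moments and threshold hypothetical, concentration scaled to [0, 1]); matching Normal and Log-Normal models to the same moments shows how strongly the tail depends on the distributional assumption:

    ```python
    import numpy as np
    from scipy.stats import beta, norm, lognorm

    m, v, c_star = 0.1, 0.004, 0.3   # mean, variance, threshold (hypothetical)

    # Beta shape parameters from moment matching (requires v < m * (1 - m)).
    k = m * (1 - m) / v - 1
    a, b = m * k, (1 - m) * k

    s2 = np.log(1 + v / m**2)        # log-normal matched to the same moments
    print("beta      :", beta.sf(c_star, a, b))
    print("normal    :", norm.sf(c_star, loc=m, scale=np.sqrt(v)))
    print("log-normal:", lognorm.sf(c_star, s=np.sqrt(s2), scale=m * np.exp(-s2 / 2)))
    ```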

  3. A robust method to forecast volcanic ash clouds

    USGS Publications Warehouse

    Denlinger, Roger P.; Pavolonis, Mike; Sieglaff, Justin

    2012-01-01

    Ash clouds emanating from volcanic eruption columns often form trails of ash extending thousands of kilometers through the Earth's atmosphere, disrupting air traffic and posing a significant hazard to air travel. To mitigate such hazards, the community charged with reducing flight risk must accurately assess risk of ash ingestion for any flight path and provide robust forecasts of volcanic ash dispersal. In response to this need, a number of different transport models have been developed for this purpose and applied to recent eruptions, providing a means to assess uncertainty in forecasts. Here we provide a framework for optimal forecasts and their uncertainties given any model and any observational data. This involves random sampling of the probability distributions of input (source) parameters to a transport model and iteratively running the model with different inputs, each time assessing the predictions that the model makes about ash dispersal by direct comparison with satellite data. The results of these comparisons are embodied in a likelihood function whose maximum corresponds to the minimum misfit between model output and observations. Bayes theorem is then used to determine a normalized posterior probability distribution and from that a forecast of future uncertainty in ash dispersal. The nature of ash clouds in heterogeneous wind fields creates a strong maximum likelihood estimate in which most of the probability is localized to narrow ranges of model source parameters. This property is used here to accelerate probability assessment, producing a method to rapidly generate a prediction of future ash concentrations and their distribution based upon assimilation of satellite data as well as model and data uncertainties. Applying this method to the recent eruption of Eyjafjallajökull in Iceland, we show that the 3 and 6 h forecasts of ash cloud location probability encompassed the location of observed satellite-determined ash cloud loads, providing an efficient means to assess all of the hazards associated with these ash clouds.

  4. ERMiT: Estimating Post-Fire Erosion in Probabilistic Terms

    NASA Astrophysics Data System (ADS)

    Pierson, F. B.; Robichaud, P. R.; Elliot, W. J.; Hall, D. E.; Moffet, C. A.

    2006-12-01

    Mitigating the impact of post-wildfire runoff and erosion on life, property, and natural resources has cost the United States government tens of millions of dollars over the past decade. The decision of where, when, and how to apply the most effective mitigation treatments requires land managers to assess the risk of damaging runoff and erosion events occurring after a fire. The Erosion Risk Management Tool (ERMiT) is a web-based application that estimates erosion in probabilistic terms on burned and recovering forest, range, and chaparral lands. Unlike most erosion prediction models, ERMiT does not provide 'average annual erosion rates'; rather, it provides a distribution of erosion rates with the likelihood of their occurrence. ERMiT combines rain event variability with spatial and temporal variabilities of hillslope burn severity, soil properties, and ground cover to estimate Water Erosion Prediction Project (WEPP) model input parameter values. Based on 20 to 40 individual WEPP runs, ERMiT produces a distribution of rain event erosion rates with a probability of occurrence for each of five post-fire years. Over the 5 years of modeled recovery, the occurrence probability of the less erodible soil parameters is increased and the occurrence probability of the more erodible soil parameters is decreased. In addition, the occurrence probabilities and the four spatial arrangements of burn severity (arrangements of overland flow elements (OFEs)) are shifted toward lower burn severity with each year of recovery. These yearly adjustments are based on field measurements made through post-fire recovery periods. ERMiT also provides rain event erosion rate distributions for hillslopes that have been treated with seeding, straw mulch, straw wattles and contour-felled log erosion barriers. Such output can help managers make erosion mitigation treatment decisions based on the probability of high sediment yields occurring, the value of resources at risk for damage, cost, and other management considerations.
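
    Output of this kind, event erosion rates paired with occurrence probabilities, lends itself directly to exceedance-probability queries; the sketch below uses invented numbers purely to show the calculation.

    ```python
    import numpy as np

    # Hypothetical ERMiT-style output for one post-fire year: each entry pairs a
    # rain-event sediment yield (t/ha) with its probability of occurrence.
    rates = np.array([0.1, 0.5, 2.0, 8.0, 20.0])        # sediment yield, t/ha
    probs = np.array([0.50, 0.25, 0.15, 0.07, 0.03])    # occurrence probability

    def exceedance_probability(threshold):
        """Probability that the event sediment yield exceeds `threshold`."""
        return probs[rates > threshold].sum()

    print(exceedance_probability(5.0))   # e.g. P(yield > 5 t/ha) = 0.10
    ```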

  5. Probabilistic Prediction of Lifetimes of Ceramic Parts

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel N.; Gyekenyesi, John P.; Jadaan, Osama M.; Palfi, Tamas; Powers, Lynn; Reh, Stefan; Baker, Eric H.

    2006-01-01

    ANSYS/CARES/PDS is a software system that combines the ANSYS Probabilistic Design System (PDS) software with a modified version of the Ceramics Analysis and Reliability Evaluation of Structures Life (CARES/Life) Version 6.0 software. [A prior version of CARES/Life was reported in Program for Evaluation of Reliability of Ceramic Parts (LEW-16018), NASA Tech Briefs, Vol. 20, No. 3 (March 1996), page 28.] CARES/Life models effects of stochastic strength, slow crack growth, and stress distribution on the overall reliability of a ceramic component. The essence of the enhancement in CARES/Life 6.0 is the capability to predict the probability of failure using results from transient finite-element analysis. ANSYS PDS models the effects of uncertainty in material properties, dimensions, and loading on the stress distribution and deformation. ANSYS/CARES/PDS accounts for the effects of probabilistic strength, probabilistic loads, probabilistic material properties, and probabilistic tolerances on the lifetime and reliability of the component. Even failure probability becomes a stochastic quantity that can be tracked as a response variable. ANSYS/CARES/PDS enables tracking of all stochastic quantities in the design space, thereby enabling more precise probabilistic prediction of lifetimes of ceramic components.
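
    CARES/Life itself is far more elaborate (multiaxial stresses, slow crack growth, transient loads), but the weakest-link Weibull idea at its core fits in a few lines. The two-parameter volume-flaw form, the element data, and the parameter values below are illustrative assumptions, not the CARES/Life implementation.

    ```python
    import numpy as np

    def weibull_failure_probability(stresses, volumes, sigma0, m):
        """Weakest-link failure probability for a component discretized into
        finite elements with roughly uniform stress and volume:

            Pf = 1 - exp( -sum_i V_i * (sigma_i / sigma0)^m )
        """
        stresses = np.clip(np.asarray(stresses, float), 0.0, None)  # ignore compression
        volumes = np.asarray(volumes, float)
        risk = np.sum(volumes * (stresses / sigma0) ** m)
        return 1.0 - np.exp(-risk)

    # Hypothetical element stresses (MPa) and volumes (mm^3) from an FEA model
    pf = weibull_failure_probability([220, 310, 180], [2.0, 1.5, 3.0],
                                     sigma0=500.0, m=10.0)
    print(pf)
    ```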

  6. Cost effectiveness of nutrition support in the prevention of pressure ulcer in hospitals.

    PubMed

    Banks, M D; Graves, N; Bauer, J D; Ash, S

    2013-01-01

    This study estimates the economic outcomes of a nutrition intervention to at-risk patients compared with standard care in the prevention of pressure ulcer. Statistical models were developed to predict 'cases of pressure ulcer avoided', 'number of bed days gained' and 'change to economic costs' in public hospitals in 2002-2003 in Queensland, Australia. Input parameters were specified and appropriate probability distributions fitted for: number of discharges per annum; incidence rate for pressure ulcer; independent effect of pressure ulcer on length of stay; cost of a bed day; change in risk in developing a pressure ulcer associated with nutrition support; annual cost of the provision of a nutrition support intervention for at-risk patients. A total of 1000 random re-samples were made and the results expressed as output probability distributions. The model predicts a mean 2896 (s.d. 632) cases of pressure ulcer avoided; 12,397 (s.d. 4491) bed days released and a corresponding mean economic cost saving of €2,869,526 (s.d. €2,078,715) with a nutrition support intervention, compared with standard care. Nutrition intervention is predicted to be a cost-effective approach in the prevention of pressure ulcer in at-risk patients.
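
    A sketch of the resampling scheme follows. Every input distribution and parameter value here is hypothetical (the study fits its own to Queensland hospital data); only the structure, sampling the inputs, propagating them, and summarizing the output distributions, mirrors the abstract.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 1000   # number of random re-samples, as in the study

    # Hypothetical input distributions
    discharges = rng.normal(500_000, 20_000, n)        # discharges per annum
    incidence = rng.beta(2, 48, n)                     # pressure ulcer incidence
    risk_reduction = rng.normal(0.25, 0.05, n)         # effect of nutrition support
    extra_los = rng.gamma(2.0, 2.0, n)                 # extra bed days per case
    bed_day_cost = rng.normal(600.0, 50.0, n)          # cost of a bed day
    intervention_cost = rng.normal(2_000_000, 100_000, n)

    cases_avoided = discharges * incidence * risk_reduction
    bed_days_gained = cases_avoided * extra_los
    net_saving = bed_days_gained * bed_day_cost - intervention_cost

    for name, x in [("cases avoided", cases_avoided),
                    ("bed days gained", bed_days_gained),
                    ("net saving", net_saving)]:
        print(f"{name}: mean {x.mean():,.0f} (s.d. {x.std():,.0f})")
    ```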

  7. Distribution of arsenic and copper in sediment pore water: an ecological risk assessment case study for offshore drilling waste discharges.

    PubMed

    Sadiq, Rehan; Husain, Tahir; Veitch, Brian; Bose, Neil

    2003-12-01

    Due to the hydrophobic nature of synthetic based fluids (SBFs), drilling cuttings are not very dispersive in the water column and settle down close to the disposal site. Arsenic and copper are two important toxic heavy metals, among others, found in the drilling waste. In this article, the concentrations of heavy metals are determined using a steady state "aquivalence-based" fate model in a probabilistic mode. Monte Carlo simulations are employed to determine pore water concentrations. A hypothetical case study is used to determine the water quality impacts for two discharge options: 4% and 10% attached SBFs, which correspond to the best available technology option and the current discharge practice in the U.S. offshore. The exposure concentration (CE) is a predicted environmental concentration, which is adjusted for exposure probability and bioavailable fraction of heavy metals. The response of the ecosystem (RE) is defined by developing an empirical distribution function of predicted no-effect concentration. The pollutants' pore water concentrations within the radius of 750 m are estimated and cumulative distributions of risk quotient (RQ=CE/RE) are developed to determine the probability of RQ greater than 1.
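
    The final step reduces to a short Monte Carlo calculation; the lognormal choices and parameter values below are assumptions for illustration, not the paper's fitted distributions.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n = 10_000

    # Hypothetical exposure concentration CE and ecosystem response RE (ug/L)
    ce = rng.lognormal(mean=0.0, sigma=0.8, size=n)   # adjusted pore water exposure
    re = rng.lognormal(mean=0.5, sigma=0.6, size=n)   # predicted no-effect levels

    rq = ce / re                     # risk quotient RQ = CE/RE
    print(f"P(RQ > 1) = {np.mean(rq > 1.0):.3f}")
    ```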

  8. Rain attenuation measurements: Variability and data quality assessment

    NASA Technical Reports Server (NTRS)

    Crane, Robert K.

    1989-01-01

    Year to year variations in the cumulative distributions of rain rate or rain attenuation are evident in any of the published measurements for a single propagation path that span a period of several years of observation. These variations must be described by models for the prediction of rain attenuation statistics. Now that a large measurement data base has been assembled by the International Radio Consultative Committee, the information needed to assess variability is available. On the basis of 252 sample cumulative distribution functions for the occurrence of attenuation by rain, the expected year to year variation in attenuation at a fixed probability level in the 0.1 to 0.001 percent of a year range is estimated to be 27 percent. The expected deviation from an attenuation model prediction for a single year of observations is estimated to exceed 33 percent when any of the available global rain climate models are employed to estimate the rain rate statistics. The probability distribution for the variation in attenuation or rain rate at a fixed fraction of a year is lognormal. The lognormal behavior of the variate was used to compile the statistics for variability.

  9. Comparing hard and soft prior bounds in geophysical inverse problems

    NASA Technical Reports Server (NTRS)

    Backus, George E.

    1988-01-01

    In linear inversion of a finite-dimensional data vector y to estimate a finite-dimensional prediction vector z, prior information about X_E is essential if y is to supply useful limits for z. The one exception occurs when all the prediction functionals are linear combinations of the data functionals. Two forms of prior information are compared: a soft bound on X_E is a probability distribution p_x on X which describes the observer's opinion about where X_E is likely to be in X; a hard bound on X_E is an inequality Q_x(X_E, X_E) ≤ 1, where Q_x is a positive definite quadratic form on X. A hard bound Q_x can be softened to many different probability distributions p_x, but all these p_x's carry much new information about X_E which is absent from Q_x, and some information which contradicts Q_x. Both stochastic inversion (SI) and Bayesian inference (BI) estimate z from y and a soft prior bound p_x. If that probability distribution was obtained by softening a hard prior bound Q_x, rather than by objective statistical inference independent of y, then p_x contains so much unsupported new information absent from Q_x that conclusions about z obtained with SI or BI would seem to be suspect.

  10. Comparing hard and soft prior bounds in geophysical inverse problems

    NASA Technical Reports Server (NTRS)

    Backus, George E.

    1987-01-01

    In linear inversion of a finite-dimensional data vector y to estimate a finite-dimensional prediction vector z, prior information about X_E is essential if y is to supply useful limits for z. The one exception occurs when all the prediction functionals are linear combinations of the data functionals. Two forms of prior information are compared: a soft bound on X_E is a probability distribution p_x on X which describes the observer's opinion about where X_E is likely to be in X; a hard bound on X_E is an inequality Q_x(X_E, X_E) ≤ 1, where Q_x is a positive definite quadratic form on X. A hard bound Q_x can be softened to many different probability distributions p_x, but all these p_x's carry much new information about X_E which is absent from Q_x, and some information which contradicts Q_x. Both stochastic inversion (SI) and Bayesian inference (BI) estimate z from y and a soft prior bound p_x. If that probability distribution was obtained by softening a hard prior bound Q_x, rather than by objective statistical inference independent of y, then p_x contains so much unsupported new information absent from Q_x that conclusions about z obtained with SI or BI would seem to be suspect.

  11. Modeling Aircraft Position and Conservatively Calculating Airspace Violations for an Autonomous Collision Awareness System for Unmanned Aerial Systems

    NASA Astrophysics Data System (ADS)

    Ueunten, Kevin K.

    With the scheduled 30 September 2015 integration of Unmanned Aerial Systems (UAS) into the national airspace, the Federal Aviation Administration (FAA) is concerned with UAS capabilities to sense and avoid conflicts. Since the operator is outside the cockpit, the proposed collision awareness plugin (CAPlugin), based on probability and error propagation, conservatively predicts potential conflicts with other aircraft and airspaces, thus increasing the operator's situational awareness. The conflict predictions are calculated using a forward state estimator (FSE) and a conflict calculator. Predicting an aircraft's position, modeled as a mixed Gaussian distribution, is the FSE's responsibility. Furthermore, the FSE supports aircraft engaged in the following three flight modes: free flight, flight path following and orbits. The conflict calculator uses the FSE result to calculate the conflict probability between an aircraft and airspace or another aircraft. Finally, the CAPlugin determines the highest conflict probability and warns the operator. In addition to discussing the FSE free flight, FSE orbit and the airspace conflict calculator, this thesis describes how each algorithm is implemented and tested. Lastly, two simulations demonstrate the CAPlugin's capabilities.
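
    A conflict probability against a circular airspace can be estimated from a mixed Gaussian position model by Monte Carlo sampling, as sketched below; the mixture parameters and airspace geometry are invented, and the thesis's actual FSE and conflict calculator are more involved.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    # Hypothetical two-component Gaussian mixture for a predicted 2-D position (km)
    weights = np.array([0.7, 0.3])
    means = np.array([[0.0, 0.0], [1.5, 0.5]])
    covs = np.array([np.diag([0.2, 0.2]), np.diag([0.5, 0.3])])

    def airspace_violation_probability(center, radius, n=20_000):
        """Monte Carlo estimate of P(predicted position lies inside a circle)."""
        comp = rng.choice(len(weights), size=n, p=weights)  # mixture component
        pts = np.array([rng.multivariate_normal(means[k], covs[k]) for k in comp])
        return np.mean(np.linalg.norm(pts - center, axis=1) < radius)

    print(airspace_violation_probability(center=np.array([1.0, 0.0]), radius=0.5))
    ```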

  12. The evolution of trade-offs: geographic variation in call duration and flight ability in the sand cricket, Gryllus firmus.

    PubMed

    Roff, D A; Crnokrak, P; Fairbairn, D J

    2003-07-01

    Quantitative genetic theory assumes that trade-offs are best represented by bivariate normal distributions. This theory predicts that selection will shift the trade-off function itself and not just move the mean trait values along a fixed trade-off line, as is generally assumed in optimality models. As a consequence, quantitative genetic theory predicts that the trade-off function will vary among populations in which at least one of the component traits itself varies. This prediction is tested using the trade-off between call duration and flight capability, as indexed by the mass of the dorsolateral flight muscles, in the macropterous morph of the sand cricket. We use four different populations of crickets that vary in the proportion of macropterous males (Lab = 33%, Florida = 29%, Bermuda = 72%, South Carolina = 80%). We find, as predicted, that there is significant variation in the intercept of the trade-off function but not the slope, supporting the hypothesis that trade-off functions are better represented as bivariate normal distributions rather than single lines. We also test the prediction from a quantitative genetical model of the evolution of wing dimorphism that the mean call duration of macropterous males will increase with the percentage of macropterous males in the population. This prediction is also supported. Finally, we estimate the probability of a macropterous male attracting a female, P, as a function of the relative time spent calling (P = time spent calling by the macropterous male / total time spent calling by both the micropterous and macropterous males). We find that in the Lab and Florida populations the probability of a female selecting the macropterous male is equal to P, indicating that preference is due simply to relative call duration. But in the Bermuda and South Carolina populations the probability of a female selecting a macropterous male is less than P, indicating a preference for the micropterous male even after differences in call duration are accounted for.

  13. Prediction of slant path rain attenuation statistics at various locations

    NASA Technical Reports Server (NTRS)

    Goldhirsh, J.

    1977-01-01

    The paper describes a method for predicting slant path attenuation statistics at arbitrary locations for variable frequencies and path elevation angles. The method involves the use of median reflectivity factor-height profiles measured with radar as well as the use of long-term point rain rate data and assumed or measured drop size distributions. The attenuation coefficient due to cloud liquid water in the presence of rain is also considered. Absolute probability fade distributions are compared for eight cases: Maryland (15 GHz), Texas (30 GHz), Slough, England (19 and 37 GHz), Fayetteville, North Carolina (13 and 18 GHz), and Cambridge, Massachusetts (13 and 18 GHz).

  14. Predicting the Distribution of Vibrio spp. in the Chesapeake Bay: A Vibrio cholerae Case Study

    PubMed Central

    Magny, Guillaume Constantin de; Long, Wen; Brown, Christopher W.; Hood, Raleigh R.; Huq, Anwar; Murtugudde, Raghu; Colwell, Rita R.

    2010-01-01

    Vibrio cholerae, the causative agent of cholera, is a naturally occurring inhabitant of the Chesapeake Bay and serves as a predictor for other clinically important vibrios, including Vibrio parahaemolyticus and Vibrio vulnificus. A system was constructed to predict the likelihood of the presence of V. cholerae in surface waters of the Chesapeake Bay, with the goal of providing forecasts of the occurrence of this and related pathogenic Vibrio spp. Prediction was achieved by driving an available multivariate empirical habitat model estimating the probability of V. cholerae within a range of temperatures and salinities in the Bay, with hydrodynamically generated predictions of ambient temperature and salinity. The experimental predictions provided both an improved understanding of the in situ variability of V. cholerae, including identification of potential hotspots of occurrence, and usefulness as an early warning system. With further development of the system, prediction of the probability of the occurrence of related pathogenic vibrios in the Chesapeake Bay, notably V. parahaemolyticus and V. vulnificus, will be possible, as well as its transport to any geographical location where sufficient relevant data are available. PMID:20145974

  15. Bayesian model averaging using particle filtering and Gaussian mixture modeling: Theory, concepts, and simulation experiments

    NASA Astrophysics Data System (ADS)

    Rings, Joerg; Vrugt, Jasper A.; Schoups, Gerrit; Huisman, Johan A.; Vereecken, Harry

    2012-05-01

    Bayesian model averaging (BMA) is a standard method for combining predictive distributions from different models. In recent years, this method has enjoyed widespread application and use in many fields of study to improve the spread-skill relationship of forecast ensembles. The BMA predictive probability density function (pdf) of any quantity of interest is a weighted average of pdfs centered around the individual (possibly bias-corrected) forecasts, where the weights are equal to posterior probabilities of the models generating the forecasts, and reflect the individual models' skill over a training (calibration) period. The original BMA approach presented by Raftery et al. (2005) assumes that the conditional pdf of each individual model is adequately described with a rather standard Gaussian or Gamma statistical distribution, possibly with a heteroscedastic variance. Here we analyze the advantages of using BMA with a flexible representation of the conditional pdf. A joint particle filtering and Gaussian mixture modeling framework is presented to derive analytically, as closely and consistently as possible, the evolving forecast density (conditional pdf) of each constituent ensemble member. The median forecasts and evolving conditional pdfs of the constituent models are subsequently combined using BMA to derive one overall predictive distribution. This paper introduces the theory and concepts of this new ensemble postprocessing method, and demonstrates its usefulness and applicability by numerical simulation of the rainfall-runoff transformation using discharge data from three different catchments in the contiguous United States. The revised BMA method achieves significantly lower prediction errors than the original default BMA method (due to filtering) with predictive uncertainty intervals that are substantially smaller but still statistically coherent (due to the use of a time-variant conditional pdf).
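
    The BMA predictive density itself is compact: a weighted average of conditional pdfs centered on the (bias-corrected) individual forecasts. The sketch below uses Gaussian conditional pdfs as in the original method; the revised method would substitute the particle-filter/Gaussian-mixture estimate for each component. All numbers are illustrative.

    ```python
    import numpy as np
    from scipy import stats

    def bma_pdf(y, forecasts, weights, sigmas):
        """BMA predictive density: sum_k w_k * N(y | f_k, sigma_k^2)."""
        pdfs = [stats.norm.pdf(y, loc=f, scale=s) for f, s in zip(forecasts, sigmas)]
        return np.dot(weights, pdfs)

    # Hypothetical three-model discharge forecast (m^3/s)
    y = np.linspace(0.0, 50.0, 501)
    density = bma_pdf(y, forecasts=[12.0, 15.0, 20.0],
                      weights=[0.5, 0.3, 0.2], sigmas=[2.0, 3.0, 5.0])
    print(density.sum() * (y[1] - y[0]))   # integrates to ~1
    ```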

  16. A Predictive Analysis of the Department of Defense Distribution System Utilizing Random Forests

    DTIC Science & Technology

    2016-06-01

    resources capable of meeting both customer and individual resource constraints and goals while also maximizing the global benefit to the supply...and probability rules to determine the optimal red wine distribution network for an Italian-based wine producer. The decision support model for...combinations of factors that will result in delivery of the highest quality wines . The model’s first stage inputs basic logistics information to look

  17. Habitat availability and gene flow influence diverging local population trajectories under scenarios of climate change: a place-based approach.

    PubMed

    Schwalm, Donelle; Epps, Clinton W; Rodhouse, Thomas J; Monahan, William B; Castillo, Jessica A; Ray, Chris; Jeffress, Mackenzie R

    2016-04-01

    Ecological niche theory holds that species distributions are shaped by a large and complex suite of interacting factors. Species distribution models (SDMs) are increasingly used to describe species' niches and predict the effects of future environmental change, including climate change. Currently, SDMs often fail to capture the complexity of species' niches, resulting in predictions that are generally limited to climate-occupancy interactions. Here, we explore the potential impact of climate change on the American pika using a replicated place-based approach that incorporates climate, gene flow, habitat configuration, and microhabitat complexity into SDMs. Using contemporary presence-absence data from occupancy surveys, genetic data to infer connectivity between habitat patches, and 21 environmental niche variables, we built separate SDMs for pika populations inhabiting eight US National Park Service units representing the habitat and climatic breadth of the species across the western United States. We then predicted occurrence probability under current (1981-2010) and three future time periods (out to 2100). Occurrence probabilities and the relative importance of predictor variables varied widely among study areas, revealing important local-scale differences in the realized niche of the American pika. This variation resulted in diverse and - in some cases - highly divergent future potential occupancy patterns for pikas, ranging from complete extirpation in some study areas to stable occupancy patterns in others. Habitat composition and connectivity, which are rarely incorporated in SDM projections, were influential in predicting pika occupancy in all study areas and frequently outranked climate variables. Our findings illustrate the importance of a place-based approach to species distribution modeling that includes fine-scale factors when assessing current and future climate impacts on species' distributions, especially when predictions are intended to manage and conserve species of concern within individual protected areas. © 2015 John Wiley & Sons Ltd.

  18. Combining Probability Distributions of Wind Waves and Sea Level Variations to Assess Return Periods of Coastal Floods

    NASA Astrophysics Data System (ADS)

    Leijala, U.; Bjorkqvist, J. V.; Pellikka, H.; Johansson, M. M.; Kahma, K. K.

    2017-12-01

    Predicting the behaviour of the joint effect of sea level and wind waves is of great significance due to the major impact of flooding events in densely populated coastal regions. As mean sea level rises, the effect of sea level variations accompanied by the waves will be even more harmful in the future. The main challenge when evaluating the effect of waves and sea level variations is that long time series of both variables rarely exist. Wave statistics are also highly location-dependent, thus requiring wave buoy measurements and/or high-resolution wave modelling. As an initial approximation of the joint effect, the variables may be treated as independent random variables, to obtain the probability distribution of their sum. We present results of a case study based on three probability distributions: 1) wave run-up constructed from individual wave buoy measurements, 2) short-term sea level variability based on tide gauge data, and 3) mean sea level projections based on up-to-date regional scenarios. The wave measurements were conducted during 2012-2014 on the coast of the city of Helsinki, located in the Gulf of Finland in the Baltic Sea. The short-term sea level distribution contains the last 30 years (1986-2015) of hourly data from the Helsinki tide gauge, and the mean sea level projections are scenarios adjusted for the Gulf of Finland. Additionally, we present a sensitivity test based on six different theoretical wave height distributions representing different wave behaviour in relation to sea level variations. As these wave distributions are merged with one common sea level distribution, we can study how the different shapes of the wave height distribution affect the distribution of the sum, and which one of the components dominates under different wave conditions. As an outcome of the method, we obtain a probability distribution of the maximum elevation of the continuous water mass, which provides a flexible tool for evaluating different risk levels in the current and future climate.
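
    Under the independence approximation, the density of the sum is the convolution of the component densities. The sketch below discretizes two stand-in densities on a common grid and convolves them; the shapes and parameters are invented, not the Helsinki data.

    ```python
    import numpy as np

    dx = 0.01                                   # grid spacing, metres
    x = np.arange(0.0, 3.0, dx)

    # Hypothetical stand-ins for wave run-up and short-term sea level (m)
    runup_pdf = np.exp(-0.5 * ((x - 0.8) / 0.3) ** 2)
    sea_pdf = np.exp(-0.5 * ((x - 0.5) / 0.2) ** 2)
    runup_pdf /= runup_pdf.sum() * dx           # normalize to unit area
    sea_pdf /= sea_pdf.sum() * dx

    # Density of the sum of independent variables = convolution of densities
    total_pdf = np.convolve(runup_pdf, sea_pdf) * dx
    x_total = np.arange(total_pdf.size) * dx

    # Exceedance probability of the maximum water-mass elevation
    print(f"P(total > 2.0 m) = {total_pdf[x_total > 2.0].sum() * dx:.4f}")
    ```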

  19. Evaluation of an Ensemble Dispersion Calculation.

    NASA Astrophysics Data System (ADS)

    Draxler, Roland R.

    2003-02-01

    A Lagrangian transport and dispersion model was modified to generate multiple simulations from a single meteorological dataset. Each member of the simulation was computed by assuming a ±1-gridpoint shift in the horizontal direction and a ±250-m shift in the vertical direction of the particle position, with respect to the meteorological data. The configuration resulted in 27 ensemble members. Each member was assumed to have an equal probability. The model was tested by creating an ensemble of daily average air concentrations for 3 months at 75 measurement locations over the eastern half of the United States during the Across North America Tracer Experiment (ANATEX). Two generic graphical displays were developed to summarize the ensemble prediction and the resulting concentration probabilities for a specific event: a probability-exceed plot and a concentration-probability plot. Although a cumulative distribution of the ensemble probabilities compared favorably with the measurement data, the resulting distribution was not uniform. This result was attributed to release height sensitivity. The trajectory ensemble approach accounts for about 41%-47% of the variance in the measurement data. This residual uncertainty is caused by other model and data errors that are not included in the ensemble design.
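
    The 27-member construction follows directly from the three shift options per axis, as in this minimal sketch:

    ```python
    import itertools

    # Every combination of a -1, 0, or +1 gridpoint shift in each horizontal
    # direction and a -250, 0, or +250 m shift in the vertical gives 3^3 members.
    offsets = list(itertools.product((-1, 0, 1), (-1, 0, 1), (-250.0, 0.0, 250.0)))
    assert len(offsets) == 27

    weight = 1.0 / len(offsets)   # each member assumed equally probable
    ```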

  20. Future changes in South American biomass distributions, biome distributions and plant trait spectra are dependent on applied atmospheric forcings.

    NASA Astrophysics Data System (ADS)

    Langan, Liam; Scheiter, Simon; Higgins, Steven

    2017-04-01

    It remains poorly understood why the position of the forest-savanna biome boundary, in a domain defined by precipitation and temperature, differs in South America, Africa and Australia. Process based Dynamic Global Vegetation Models (DGVMs) are a valuable tool to investigate the determinants of vegetation distributions; however, many DGVMs fail to predict the spatial distribution, or indeed the presence, of the South American savanna biome. Evidence suggests fire plays a significant role in mediating forest-savanna biome boundaries; however, fire alone appears to be insufficient to predict these boundaries in South America. We hypothesize that interactions between precipitation, constraints on tree rooting depth and fire affect the probability of savanna occurrence and the position of the savanna-forest boundary. We tested our hypotheses at tropical forest and savanna sites in Brazil and Venezuela using a novel DGVM, aDGVM2, which allows plant trait spectra, constrained by trade-offs between traits, to evolve in response to abiotic and biotic conditions. Plant hydraulics is represented by the cohesion-tension theory, which allowed us to explore how soil and plant hydraulics control biome distributions and plant traits. The resulting community trait distributions are emergent properties of model dynamics. We showed that across much of South America the biome state is not determined by climate alone. Interactions between tree rooting depth, fire and precipitation affected the probability of observing a given biome state and the emergent traits of plant communities. Simulations where plant rooting depth varied in space provided the best match to satellite derived biomass estimates and generated biome distributions that reproduced contemporary biome maps well. Future projections showed that biomass distributions, biome distributions and plant trait spectra will change; however, the magnitude of these changes is highly dependent on the applied atmospheric forcings.

  1. Estimating true human and animal host source contribution in quantitative microbial source tracking using the Monte Carlo method.

    PubMed

    Wang, Dan; Silkie, Sarah S; Nelson, Kara L; Wuertz, Stefan

    2010-09-01

    Cultivation- and library-independent, quantitative PCR-based methods have become the method of choice in microbial source tracking. However, these qPCR assays are not 100% specific and sensitive for the target sequence in their respective hosts' genome. The factors that can lead to false positive and false negative information in qPCR results are well defined. It is highly desirable to have a way of removing such false information to estimate the true concentration of host-specific genetic markers and help guide the interpretation of environmental monitoring studies. Here we propose a statistical model based on the Law of Total Probability to predict the true concentration of these markers. The distributions of the probabilities of obtaining false information are estimated from representative fecal samples of known origin. Measurement error is derived from the sample precision error of replicated qPCR reactions. Then, the Monte Carlo method is applied to sample from these distributions of probabilities and measurement error. The set of equations given by the Law of Total Probability allows one to calculate the distribution of true concentrations, from which their expected value, confidence interval and other statistical characteristics can be easily evaluated. The output distributions of predicted true concentrations can then be used as input to watershed-wide total maximum daily load determinations, quantitative microbial risk assessment and other environmental models. This model was validated by both statistical simulations and real world samples. It was able to correct the intrinsic false information associated with qPCR assays and output the distribution of true concentrations of Bacteroidales for each animal host group. Model performance was strongly affected by the precision error. It could perform reliably and precisely when the standard deviation of the precision error was small (≤ 0.1). Further improvement on the precision of sample processing and qPCR reaction would greatly improve the performance of the model. This methodology, built upon Bacteroidales assays, is readily transferable to any other microbial source indicator where a universal assay for fecal sources of that indicator exists. Copyright © 2010 Elsevier Ltd. All rights reserved.
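
    The flavor of the Monte Carlo step can be conveyed with a toy version: sample the false-information terms and the precision error from their distributions and propagate them to a distribution of true concentrations. The simple linear observation equation and all parameter values below are assumptions for illustration; the paper's Law of Total Probability equations are more detailed.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n = 10_000

    c_obs = 3.2   # measured marker concentration (log10 copies per 100 mL)

    # Distributions estimated from reference fecal samples of known origin
    sensitivity = rng.beta(45, 5, n)          # P(true marker signal recovered)
    false_signal = rng.normal(0.3, 0.1, n)    # contribution of false positives
    meas_error = rng.normal(0.0, 0.1, n)      # replicate qPCR precision error

    # Toy observation model: observed = sensitivity * true + false_signal + error,
    # rearranged to recover the true concentration
    c_true = (c_obs - false_signal - meas_error) / sensitivity

    lo, hi = np.percentile(c_true, [2.5, 97.5])
    print(f"mean {c_true.mean():.2f}, 95% interval ({lo:.2f}, {hi:.2f})")
    ```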

  2. Texture metric that predicts target detection performance

    NASA Astrophysics Data System (ADS)

    Culpepper, Joanne B.

    2015-12-01

    Two texture metrics based on gray level co-occurrence error (GLCE) are used to predict probability of detection and mean search time. The two texture metrics are local clutter metrics and are based on the statistics of GLCE probability distributions. The degree of correlation between various clutter metrics and the target detection performance for the nine military vehicles in complex natural scenes found in the Search_2 dataset is presented. Comparison is also made with four other common clutter metrics found in the literature: root sum of squares, Doyle, statistical variance, and target structure similarity. The experimental results show that the GLCE energy metric is a better predictor of target detection performance when searching for targets in natural scenes than the other clutter metrics studied.

  3. Past, present and future distributions of an Iberian Endemic, Lepus granatensis: ecological and evolutionary clues from species distribution models.

    PubMed

    Acevedo, Pelayo; Melo-Ferreira, José; Real, Raimundo; Alves, Paulo Célio

    2012-01-01

    The application of species distribution models (SDMs) in ecology and conservation biology is increasing and assuming an important role, mainly because they can be used to hindcast past and predict current and future species distributions. However, the accuracy of SDMs depends on the quality of the data and on appropriate theoretical frameworks. In this study, comprehensive data on the current distribution of the Iberian hare (Lepus granatensis) were used to i) determine the species' ecogeographical constraints, ii) hindcast a climatic model for the last glacial maximum (LGM), relating it to inferences derived from molecular studies, and iii) calibrate a model to assess the species' future distribution trends (up to 2080). Our results showed that the climatic factor (in its pure effect and when it is combined with the land-cover factor) is the most important descriptor of the current distribution of the Iberian hare. In addition, the model's output was a reliable index of the local probability of species occurrence, which is a valuable tool to guide species management decisions and conservation planning. Climatic potential obtained for the LGM was combined with molecular data and the results suggest that several glacial refugia may have existed for the species within the major Iberian refugium. Finally, a high probability of occurrence of the Iberian hare in the current species range and a northward expansion were predicted for the future. Given its current environmental envelope and evolutionary history, we discuss the macroecology of the Iberian hare and its sensitivity to climate change.

  4. Past, Present and Future Distributions of an Iberian Endemic, Lepus granatensis: Ecological and Evolutionary Clues from Species Distribution Models

    PubMed Central

    Acevedo, Pelayo; Melo-Ferreira, José; Real, Raimundo; Alves, Paulo Célio

    2012-01-01

    The application of species distribution models (SDMs) in ecology and conservation biology is increasing and assuming an important role, mainly because they can be used to hindcast past and predict current and future species distributions. However, the accuracy of SDMs depends on the quality of the data and on appropriate theoretical frameworks. In this study, comprehensive data on the current distribution of the Iberian hare (Lepus granatensis) were used to i) determine the species’ ecogeographical constraints, ii) hindcast a climatic model for the last glacial maximum (LGM), relating it to inferences derived from molecular studies, and iii) calibrate a model to assess the species’ future distribution trends (up to 2080). Our results showed that the climatic factor (in its pure effect and when it is combined with the land-cover factor) is the most important descriptor of the current distribution of the Iberian hare. In addition, the model’s output was a reliable index of the local probability of species occurrence, which is a valuable tool to guide species management decisions and conservation planning. Climatic potential obtained for the LGM was combined with molecular data and the results suggest that several glacial refugia may have existed for the species within the major Iberian refugium. Finally, a high probability of occurrence of the Iberian hare in the current species range and a northward expansion were predicted for the future. Given its current environmental envelope and evolutionary history, we discuss the macroecology of the Iberian hare and its sensitivity to climate change. PMID:23272115

  5. Applications of the first digit law to measure correlations.

    PubMed

    Gramm, R; Yost, J; Su, Q; Grobe, R

    2017-04-01

    The quasiempirical Benford law predicts that the distribution of the first significant digit of random numbers obtained from mixed probability distributions is surprisingly meaningful and reveals some universal behavior. We generalize this finding to examine the joint first-digit probability of a pair of random numbers and show that correlations undetectable by the usual covariance-based measure can be identified in the statistics of the corresponding first digits. We illustrate this new measure by analyzing the correlations and anticorrelations of the positions of two interacting particles in their quantum mechanical ground state. This suggests that by using this measure, the presence or absence of correlations can be determined even if only the first digit of noisy experimental data can be measured accurately.

  6. Risk Assessment of Bone Fracture During Space Exploration Missions to the Moon and Mars

    NASA Technical Reports Server (NTRS)

    Lewandowski, Beth E.; Myers, Jerry G.; Nelson, Emily S.; Licatta, Angelo; Griffin, Devon

    2007-01-01

    The possibility of a traumatic bone fracture in space is a concern due to the observed decrease in astronaut bone mineral density (BMD) during spaceflight and because of the physical demands of the mission. The Bone Fracture Risk Module (BFxRM) was developed to quantify the probability of fracture at the femoral neck and lumbar spine during space exploration missions. The BFxRM is scenario-based, providing predictions for specific activities or events during a particular space mission. The key elements of the BFxRM are the mission parameters, the biomechanical loading models, the bone loss and fracture models and the incidence rate of the activity or event. Uncertainties in the model parameters arise due to variations within the population and unknowns associated with the effects of the space environment. Consequently, parameter distributions were used in Monte Carlo simulations to obtain an estimate of fracture probability under real mission scenarios. The model predicts an increase in the probability of fracture as the mission length increases and fracture is more likely in the higher gravitational field of Mars than on the moon. The resulting probability predictions and sensitivity analyses of the BFxRM can be used as an engineering tool for mission operation and resource planning in order to mitigate the risk of bone fracture in space.

  7. Risk Assessment of Bone Fracture During Space Exploration Missions to the Moon and Mars

    NASA Technical Reports Server (NTRS)

    Lewandowski, Beth E.; Myers, Jerry G.; Nelson, Emily S.; Griffin, Devon

    2008-01-01

    The possibility of a traumatic bone fracture in space is a concern due to the observed decrease in astronaut bone mineral density (BMD) during spaceflight and because of the physical demands of the mission. The Bone Fracture Risk Module (BFxRM) was developed to quantify the probability of fracture at the femoral neck and lumbar spine during space exploration missions. The BFxRM is scenario-based, providing predictions for specific activities or events during a particular space mission. The key elements of the BFxRM are the mission parameters, the biomechanical loading models, the bone loss and fracture models and the incidence rate of the activity or event. Uncertainties in the model parameters arise due to variations within the population and unknowns associated with the effects of the space environment. Consequently, parameter distributions were used in Monte Carlo simulations to obtain an estimate of fracture probability under real mission scenarios. The model predicts an increase in the probability of fracture as the mission length increases and fracture is more likely in the higher gravitational field of Mars than on the moon. The resulting probability predictions and sensitivity analyses of the BFxRM can be used as an engineering tool for mission operation and resource planning in order to mitigate the risk of bone fracture in space.

  8. Failure-probability driven dose painting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vogelius, Ivan R.; Håkansson, Katrin; Due, Anne K.

    Purpose: To demonstrate a data-driven dose-painting strategy based on the spatial distribution of recurrences in previously treated patients. The result is a quantitative way to define a dose prescription function, optimizing the predicted local control at constant treatment intensity. A dose planning study using the optimized dose prescription in 20 patients is performed. Methods: Patients treated at our center have five tumor subvolumes delineated, from the center of the tumor (PET positive volume) outward. The spatial distribution of 48 failures in patients with complete clinical response after (chemo)radiation is used to derive a model for tumor control probability (TCP). The total TCP is fixed to the clinically observed 70% actuarial TCP at five years. Additionally, the authors match the distribution of failures between the five subvolumes to the observed distribution. The steepness of the dose–response is extracted from the literature and the authors assume 30% and 20% risk of subclinical involvement in the elective volumes. The result is a five-compartment dose response model matching the observed distribution of failures. The model is used to optimize the distribution of dose in individual patients, while keeping the treatment intensity constant and the maximum prescribed dose below 85 Gy. Results: The vast majority of failures occur centrally despite the small volumes of the central regions. Thus, optimizing the dose prescription yields higher doses to the central target volumes and lower doses to the elective volumes. The dose planning study shows that the modified prescription is clinically feasible. The optimized TCP is 89% (range: 82%–91%) as compared to the observed TCP of 70%. Conclusions: The observed distribution of locoregional failures was used to derive an objective, data-driven dose prescription function. The optimized dose is predicted to result in a substantial increase in local control without increasing the predicted risk of toxicity.

  9. Identification, prediction, and mitigation of sinkhole hazards in evaporite karst areas

    USGS Publications Warehouse

    Gutierrez, F.; Cooper, A.H.; Johnson, K.S.

    2008-01-01

    Sinkholes usually have a higher probability of occurrence and a greater genetic diversity in evaporite terrains than in carbonate karst areas. This is because evaporites have a higher solubility and, commonly, a lower mechanical strength. Subsidence damage resulting from evaporite dissolution generates substantial losses throughout the world, but the causes are only well understood in a few areas. To deal with these hazards, a phased approach is needed for sinkhole identification, investigation, prediction, and mitigation. Identification techniques include field surveys and geomorphological mapping combined with accounts from local people and historical sources. Detailed sinkhole maps can be constructed from sequential historical maps, recent topographical maps, and digital elevation models (DEMs) complemented with building-damage surveying, remote sensing, and high-resolution geodetic surveys. On a more detailed level, information from exposed paleosubsidence features (paleokarst), speleological explorations, geophysical investigations, trenching, dating techniques, and boreholes may help in investigating dissolution and subsidence features. Information on the hydrogeological pathways including caves, springs, and swallow holes is particularly important, especially when corroborated by tracer tests. These diverse data sources make a valuable database: the karst inventory. From this dataset, sinkhole susceptibility zonations (relative probability) may be produced based on the spatial distribution of the features and good knowledge of the local geology. Sinkhole distribution can be investigated by spatial distribution analysis techniques including studies of preferential elongation, alignment, and nearest neighbor analysis. More objective susceptibility models may be obtained by analyzing the statistical relationships between the known sinkholes and the conditioning factors. Chronological information on sinkhole formation is required to estimate the probability of occurrence of sinkholes (number of sinkholes/km2 per year). Such spatial and temporal predictions, frequently derived from limited records and based on the assumption that past sinkhole activity may be extrapolated to the future, are non-corroborated hypotheses. Validation methods allow us to assess the predictive capability of the susceptibility maps and to transform them into probability maps. Avoiding the most hazardous areas by preventive planning is the safest strategy for development in sinkhole-prone areas. Corrective measures could be applied to reduce the dissolution activity and subsidence processes. A more practical solution for safe development is to reduce the vulnerability of the structures by using subsidence-proof designs. © 2007 Springer-Verlag.

  10. On the properties of stochastic intermittency in rainfall processes.

    PubMed

    Molini, A; La Barbera, P; Lanza, L G

    2002-01-01

    In this work we propose a mixed approach to the modelling of rainfall events, based on the analysis of geometrical and statistical properties of rain intermittency in time, combined with the predictive power derived from the analysis of the distribution of no-rain periods and from the binary decomposition of the rain signal. Some recent hypotheses on the nature of rain intermittency are also reviewed. In particular, the internal intermittent structure of a high resolution pluviometric time series covering one decade and recorded at the tipping bucket station of the University of Genova is analysed, by separating the internal intermittency of rainfall events from the inter-arrival process through a simple geometrical filtering procedure. In this way it is possible to associate no-rain intervals with a probability distribution by virtue of both their position within the event and their percentage. From this analysis, an invariant probability distribution for the no-rain periods within the events is obtained at different aggregation levels, and its satisfactory agreement with a typical extreme value distribution is shown.

  11. Applications of Genomic Selection in Breeding Wheat for Rust Resistance.

    PubMed

    Ornella, Leonardo; González-Camacho, Juan Manuel; Dreisigacker, Susanne; Crossa, Jose

    2017-01-01

    Many methods have been developed to predict untested phenotypes in schemes commonly used in genomic selection (GS) breeding. The use of GS for predicting disease resistance has its own particularities: (a) most populations show additivity in quantitative adult plant resistance (APR); (b) resistance needs effective combinations of major and minor genes; and (c) phenotype is commonly expressed in ordinal categorical traits, whereas most parametric applications assume that the response variable is continuous and normally distributed. Machine learning methods (MLM) can take advantage of examples (data) that capture characteristics of interest from an unknown underlying probability distribution (i.e., they are data-driven). We introduce some state-of-the-art MLM capable of predicting rust resistance in wheat. We also present two parametric R packages for the reader to be able to compare.

  12. Directional data analysis under the general projected normal distribution

    PubMed Central

    Wang, Fangpo; Gelfand, Alan E.

    2013-01-01

    The projected normal distribution is an under-utilized model for explaining directional data. In particular, the general version provides flexibility, e.g., asymmetry and possible bimodality along with convenient regression specification. Here, we clarify the properties of this general class. We also develop fully Bayesian hierarchical models for analyzing circular data using this class. We show how they can be fit using MCMC methods with suitable latent variables. We show how posterior inference for distributional features such as the angular mean direction and concentration can be implemented as well as how prediction within the regression setting can be handled. With regard to model comparison, we argue for an out-of-sample approach using both a predictive likelihood scoring loss criterion and a cumulative rank probability score criterion. PMID:24046539

  13. Life prediction and mechanical reliability of NT551 silicon nitride

    NASA Astrophysics Data System (ADS)

    Andrews, Mark Jay

    The inert strength and fatigue performance of a diesel engine exhaust valve made from silicon nitride (Si3N4) ceramic were assessed. The Si3N4 characterized in this study was manufactured by Saint Gobain/Norton Industrial Ceramics and was designated as NT551. The evaluation was made utilizing a probabilistic life prediction algorithm that combined censored test specimen strength data with a Weibull distribution function and the stress field of the ceramic valve obtained from finite element analysis. The major assumptions of the life prediction algorithm are that the bulk ceramic material is isotropic and homogeneous and that the strength-limiting flaws are uniformly distributed. The results from mechanical testing indicated that NT551 was not a homogeneous ceramic and that its strength was a function of temperature, loading rate, and machining orientation. Fractographic analysis identified four different failure modes; two were identified as inhomogeneities that were located throughout the bulk of NT551 and were due to processing operations. The fractographic analysis concluded that the strength degradation of NT551 observed under the temperature and loading rate test parameters was due to a change of state that occurred in its secondary phase. Pristine and engine-tested valves made from NT551 were loaded to failure and the inert strengths were obtained. Fractographic analysis of the valves identified the same four failure mechanisms as found with the test specimens. The fatigue performance and the inert strength of the Si3N4 valves were assessed from censored and uncensored test specimen strength data, respectively. The inert strength failure probability predictions were compared to the inert strength of the Si3N4 valves. The inert strength failure probability predictions were more conservative than the strength of the valves. The lack of correlation between predicted and actual valve strength was due to the nonuniform distribution of inhomogeneities present in NT551. For the same reasons, the predicted and actual fatigue performance did not correlate well. The results of this study should not be considered a limitation of the life prediction algorithm; rather, they emphasize the requirement that ceramics be homogeneous and strength-limiting flaws uniformly distributed as a prerequisite for accurate life prediction and reliability analyses.

  14. A nonparametric multiple imputation approach for missing categorical data.

    PubMed

    Zhou, Muhan; He, Yulei; Yu, Mandi; Hsu, Chiu-Hsieh

    2017-06-06

    Incomplete categorical variables with more than two categories are common in public health data. However, most of the existing missing-data methods do not use the information from nonresponse (missingness) probabilities. We propose a nearest-neighbour multiple imputation approach to impute a missing at random categorical outcome and to estimate the proportion of each category. The donor set for imputation is formed by measuring distances between each missing value with other non-missing values. The distance function is calculated based on a predictive score, which is derived from two working models: one fits a multinomial logistic regression for predicting the missing categorical outcome (the outcome model) and the other fits a logistic regression for predicting missingness probabilities (the missingness model). A weighting scheme is used to accommodate contributions from two working models when generating the predictive score. A missing value is imputed by randomly selecting one of the non-missing values with the smallest distances. We conduct a simulation to evaluate the performance of the proposed method and compare it with several alternative methods. A real-data application is also presented. The simulation study suggests that the proposed method performs well when missingness probabilities are not extreme under some misspecifications of the working models. However, the calibration estimator, which is also based on two working models, can be highly unstable when missingness probabilities for some observations are extremely high. In this scenario, the proposed method produces more stable and better estimates. In addition, proper weights need to be chosen to balance the contributions from the two working models and achieve optimal results for the proposed method. We conclude that the proposed multiple imputation method is a reasonable approach to dealing with missing categorical outcome data with more than two levels for assessing the distribution of the outcome. In terms of the choices for the working models, we suggest a multinomial logistic regression for predicting the missing outcome and a binary logistic regression for predicting the missingness probability.
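
    A compact sketch of one imputation pass follows, under several simplifying assumptions: the two working models are fit with scikit-learn, the multinomial outcome model is summarized as an expected category code so that a scalar distance can be formed, and the weighting and donor-set rules are reduced to their simplest form.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(5)

    def nn_impute_once(X, y, k=5, w=0.5):
        """One imputation draw for a categorical outcome y (np.nan = missing).

        Distances between cases are measured on a predictive score blending
        two working models: an outcome model and a missingness model.
        """
        miss = np.isnan(y)
        # Working model 1: multinomial logistic regression for the outcome,
        # summarized as an expected category code so the score is scalar.
        m1 = LogisticRegression(max_iter=1000).fit(X[~miss], y[~miss].astype(int))
        s_outcome = m1.predict_proba(X) @ m1.classes_.astype(float)
        # Working model 2: binary logistic regression for missingness.
        m2 = LogisticRegression(max_iter=1000).fit(X, miss.astype(int))
        s_miss = m2.predict_proba(X)[:, 1]
        # Weighted predictive score; w balances the two working models.
        score = w * s_outcome / s_outcome.std() + (1 - w) * s_miss / s_miss.std()
        y_imp = y.copy()
        donors = np.flatnonzero(~miss)
        for i in np.flatnonzero(miss):
            nearest = donors[np.argsort(np.abs(score[donors] - score[i]))[:k]]
            y_imp[i] = y[rng.choice(nearest)]   # random draw from the donor set
        return y_imp

    # Synthetic example: 3-category outcome with ~20% missingness
    X = rng.normal(size=(200, 2))
    y = rng.integers(0, 3, size=200).astype(float)
    y[rng.random(200) < 0.2] = np.nan
    y_completed = nn_impute_once(X, y)
    ```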

  15. Modeling pore corrosion in normally open gold- plated copper connectors.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Battaile, Corbett Chandler; Moffat, Harry K.; Sun, Amy Cha-Tien

    2008-09-01

    The goal of this study is to model the electrical response of gold plated copper electrical contacts exposed to a mixed flowing gas stream consisting of air containing 10 ppb H{sub 2}S at 30 C and a relative humidity of 70%. This environment accelerates the attack normally observed in a light industrial environment (essentially a simplified version of the Battelle Class 2 environment). Corrosion rates were quantified by measuring the corrosion site density, size distribution, and the macroscopic electrical resistance of the aged surface as a function of exposure time. A pore corrosion numerical model was used to predict bothmore » the growth of copper sulfide corrosion product which blooms through defects in the gold layer and the resulting electrical contact resistance of the aged surface. Assumptions about the distribution of defects in the noble metal plating and the mechanism for how corrosion blooms affect electrical contact resistance were needed to complete the numerical model. Comparisons are made to the experimentally observed number density of corrosion sites, the size distribution of corrosion product blooms, and the cumulative probability distribution of the electrical contact resistance. Experimentally, the bloom site density increases as a function of time, whereas the bloom size distribution remains relatively independent of time. These two effects are included in the numerical model by adding a corrosion initiation probability proportional to the surface area along with a probability for bloom-growth extinction proportional to the corrosion product bloom volume. The cumulative probability distribution of electrical resistance becomes skewed as exposure time increases. While the electrical contact resistance increases as a function of time for a fraction of the bloom population, the median value remains relatively unchanged. In order to model this behavior, the resistance calculated for large blooms has been weighted more heavily.« less

  16. Modeling stream fish distributions using interval-censored detection times.

    PubMed

    Ferreira, Mário; Filipe, Ana Filipa; Bardos, David C; Magalhães, Maria Filomena; Beja, Pedro

    2016-08-01

    Controlling for imperfect detection is important for developing species distribution models (SDMs). Occupancy-detection models based on the time needed to detect a species can be used to address this problem, but this is hindered when times to detection are not known precisely. Here, we extend the time-to-detection model to deal with detections recorded in time intervals and illustrate the method using a case study on stream fish distribution modeling. We collected electrofishing samples of six fish species across a Mediterranean watershed in Northeast Portugal. Based on a Bayesian hierarchical framework, we modeled the probability of water presence in stream channels, and the probability of species occupancy conditional on water presence, in relation to environmental and spatial variables. We also modeled time-to-first detection conditional on occupancy in relation to local factors, using modified interval-censored exponential survival models. Posterior distributions of occupancy probabilities derived from the models were used to produce species distribution maps. Simulations indicated that the modified time-to-detection model provided unbiased parameter estimates despite interval-censoring. There was a tendency for spatial variation in detection rates to be primarily influenced by depth and, to a lesser extent, stream width. Species occupancies were consistently affected by stream order, elevation, and annual precipitation. Bayesian P-values and AUCs indicated that all models had adequate fit and high discrimination ability, respectively. Mapping of predicted occupancy probabilities showed widespread distribution by most species, but uncertainty was generally higher in tributaries and upper reaches. The interval-censored time-to-detection model provides a practical solution to model occupancy-detection when detections are recorded in time intervals. This modeling framework is useful for developing SDMs while controlling for variation in detection rates, as it uses simple data that can be readily collected by field ecologists.
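
    The interval-censoring idea rests on the likelihood contribution of a detection known only to fall in the interval (a, b]: under an exponential detection-time model with rate lambda it equals exp(-lambda*a) - exp(-lambda*b). A minimal sketch, with the occupancy mixture and hierarchical structure omitted:

    ```python
    import numpy as np

    def interval_detection_loglik(lam, intervals, censored_at):
        """Log-likelihood of time-to-first-detection data recorded in intervals.

        intervals   : (a, b) pairs; first detection fell somewhere in (a, b],
                      contributing exp(-lam*a) - exp(-lam*b)
        censored_at : survey durations with no detection, contributing exp(-lam*T)
        """
        ll = sum(np.log(np.exp(-lam * a) - np.exp(-lam * b)) for a, b in intervals)
        ll += np.sum(-lam * np.asarray(censored_at))
        return ll

    # Hypothetical data: detections in minutes (0,5] and (5,10]; two non-detections
    print(interval_detection_loglik(0.1, [(0.0, 5.0), (5.0, 10.0)], [15.0, 15.0]))
    ```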

  17. The Effect of Velocity Correlation on the Spatial Evolution of Breakthrough Curves in Heterogeneous Media

    NASA Astrophysics Data System (ADS)

    Massoudieh, A.; Dentz, M.; Le Borgne, T.

    2017-12-01

    In heterogeneous media, the velocity distribution and the spatial correlation structure of velocity for solute particles determine the breakthrough curves and how they evolve as one moves away from the solute source. The ability to predict such evolution can help relate the spatio-statistical hydraulic properties of the media to the transport behavior and travel time distributions. While commonly used non-local transport models such as anomalous dispersion and classical continuous time random walk (CTRW) models can reproduce breakthrough curves successfully by adjusting their parameter values, they lack the ability to relate model parameters to the spatio-statistical properties of the media, which in turn limits their transferability. In the research to be presented, we express the concentration or flux of solutes as a distribution over their velocity. We then derive an integrodifferential equation that governs the evolution of the particle distribution over velocity at given times and locations for a particle ensemble, based on a presumed velocity correlation structure and an ergodic cross-sectional velocity distribution. In this way, the spatial evolution of breakthrough curves away from the source is predicted based on the cross-sectional velocity distribution and the connectivity, which is expressed by the velocity transition probability density. The transition probability is specified via a copula function that can construct a joint distribution with a given correlation and given marginal velocities. Using this approach, we analyze how the breakthrough curves depend on the velocity distribution and correlation properties. The model shows how the solute transport behavior evolves from ballistic transport at small spatial scales to Fickian dispersion at length scales large relative to the velocity correlation length.
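    As a toy illustration of specifying correlated velocity transitions through a copula with a prescribed marginal, the sketch below uses a Gaussian copula with lag-one correlation `rho` and an assumed lognormal cross-sectional velocity distribution; both choices are ours, not necessarily those of the authors.

```python
import numpy as np
from scipy import stats

def sample_velocity_series(n_steps, rho, v_dist, seed=0):
    """Correlated velocity series via a Gaussian copula.

    rho    : lag-one correlation of the latent Gaussian process
    v_dist : frozen scipy.stats marginal velocity distribution
             (standing in for the ergodic cross-sectional distribution)
    """
    rng = np.random.default_rng(seed)
    z = np.empty(n_steps)
    z[0] = rng.normal()
    for k in range(1, n_steps):
        # AR(1) in Gaussian space preserves the standard normal marginal
        z[k] = rho * z[k - 1] + np.sqrt(1.0 - rho**2) * rng.normal()
    u = stats.norm.cdf(z)     # uniform scores
    return v_dist.ppf(u)      # map to the target velocity marginal

v = sample_velocity_series(10_000, rho=0.9, v_dist=stats.lognorm(s=1.0))
```

    Because the marginal is imposed exactly while `rho` controls the persistence of velocities, such a sampler separates the two ingredients the abstract emphasizes: the cross-sectional velocity distribution and the connectivity.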

  18. A hydroclimatological approach to predicting regional landslide probability using Landlab

    NASA Astrophysics Data System (ADS)

    Strauch, Ronda; Istanbulluoglu, Erkan; Nudurupati, Sai Siddhartha; Bandaragoda, Christina; Gasparini, Nicole M.; Tucker, Gregory E.

    2018-02-01

    We develop a hydroclimatological approach to the modeling of regional shallow landslide initiation that integrates spatial and temporal dimensions of parameter uncertainty to estimate an annual probability of landslide initiation based on Monte Carlo simulations. The physically based model couples the infinite-slope stability model with a steady-state subsurface flow representation and operates in a digital elevation model. Spatially distributed gridded data for soil properties and vegetation classification are used for parameter estimation of probability distributions that characterize model input uncertainty. Hydrologic forcing to the model is through annual maximum daily recharge to subsurface flow obtained from a macroscale hydrologic model. We demonstrate the model in a steep mountainous region in northern Washington, USA, over 2700 km². The influence of soil depth on the probability of landslide initiation is investigated through comparisons among model output produced using three different soil depth scenarios reflecting the uncertainty of soil depth and its potential long-term variability. We found elevation-dependent patterns in probability of landslide initiation that showed the stabilizing effects of forests at low elevations, an increased landslide probability with forest decline at mid-elevations (1400 to 2400 m), and soil limitation and steep topographic controls at high alpine elevations and in post-glacial landscapes. These dominant controls manifest themselves in a bimodal distribution of spatial annual landslide probability. Model testing with limited observations revealed similarly moderate model confidence for the three hazard maps, suggesting suitable use as relative hazard products. The model is available as a component in Landlab, an open-source, Python-based landscape earth systems modeling environment, and is designed to be easily reproduced utilizing HydroShare cyberinfrastructure.
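    The probabilistic core of such an approach can be sketched in a few lines: draw soil and hydrologic parameters from assumed distributions, evaluate the infinite-slope factor of safety, and report the fraction of realizations that fail. The distributions and the wetness treatment below are illustrative placeholders, not the Landlab component's actual parameterization.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical parameter distributions (illustrative, not calibrated)
theta = np.radians(35.0)                         # slope angle
H     = rng.uniform(0.5, 2.0, n)                 # soil depth [m]
phi   = np.radians(rng.uniform(28.0, 40.0, n))   # friction angle
c     = rng.uniform(2e3, 10e3, n)                # cohesion [Pa]
m     = np.clip(rng.normal(0.6, 0.2, n), 0, 1)   # relative wetness
gamma, gamma_w = 18e3, 9.81e3                    # unit weights [N/m^3]

# Infinite-slope factor of safety
fs = (c + (gamma * H * np.cos(theta)**2) * (1 - m * gamma_w / gamma)
      * np.tan(phi)) / (gamma * H * np.sin(theta) * np.cos(theta))

# Fraction of Monte Carlo realizations with FS < 1 approximates the
# annual probability of landslide initiation for this grid cell
p_failure = np.mean(fs < 1.0)
print(f"P(FS < 1) = {p_failure:.3f}")
```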

  19. Variable selection models for genomic selection using whole-genome sequence data and singular value decomposition.

    PubMed

    Meuwissen, Theo H E; Indahl, Ulf G; Ødegård, Jørgen

    2017-12-27

    Non-linear Bayesian genomic prediction models such as BayesA/B/C/R involve iteration and mostly Markov chain Monte Carlo (MCMC) algorithms, which are computationally expensive, especially when whole-genome sequence (WGS) data are analyzed. Singular value decomposition (SVD) of the genotype matrix can facilitate genomic prediction in large datasets, and can be used to estimate marker effects and their prediction error variances (PEV) in a computationally efficient manner. Here, we developed, implemented, and evaluated a direct, non-iterative method for the estimation of marker effects for the BayesC genomic prediction model. The BayesC model assumes a priori that markers have normally distributed effects with a given probability and no effect with the complementary probability. Marker effects and their PEV are estimated by using SVD, and the posterior probability of each marker having a non-zero effect is calculated. These posterior probabilities are used to obtain marker-specific effect variances, which are subsequently used to approximate BayesC estimates of marker effects in a linear model. A computer simulation study was conducted to compare alternative genomic prediction methods, where a single reference generation was used to estimate marker effects, which were subsequently used for 10 generations of forward prediction, for which accuracies were evaluated. SVD-based posterior probabilities of markers having non-zero effects were generally lower than MCMC-based posterior probabilities, but for some regions the opposite occurred, resulting in clear signals for QTL-rich regions. The accuracies of breeding values estimated using SVD- and MCMC-based BayesC analyses were similar across the 10 generations of forward prediction. For an intermediate number of generations (2 to 5) of forward prediction, accuracies obtained with the BayesC model tended to be slightly higher than accuracies obtained using the best linear unbiased prediction of SNP effects (SNP-BLUP model). When reducing marker density from WGS data to 30 K, SNP-BLUP tended to yield the highest accuracies, at least in the short term. Based on SVD of the genotype matrix, we developed a direct method for the calculation of BayesC estimates of marker effects. Although SVD- and MCMC-based marker effects differed slightly, their prediction accuracies were similar. Assuming that the SVD of the marker genotype matrix is already performed for other reasons (e.g. for SNP-BLUP), computation times for the BayesC predictions were comparable to those of SNP-BLUP.
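    The computational building block, estimating marker effects through the SVD of the genotype matrix, can be sketched as a ridge-type (SNP-BLUP) solve; the full BayesC approximation with posterior inclusion probabilities and marker-specific variances is omitted here, and `lam` is an assumed variance ratio, not a value from the paper.

```python
import numpy as np

def snp_blup_svd(X, y, lam):
    """Ridge/SNP-BLUP marker effects via SVD of the genotype matrix.

    X   : (n_individuals, n_markers) centered genotype matrix
    y   : (n_individuals,) phenotypes
    lam : ridge parameter (residual variance / marker-effect variance)
    """
    U, d, Vt = np.linalg.svd(X, full_matrices=False)
    shrink = d / (d**2 + lam)           # per-component shrinkage
    return Vt.T @ (shrink * (U.T @ y))  # marker effects

# Toy usage with simulated genotypes (0/1/2 allele counts) and a sparse signal
rng = np.random.default_rng(0)
X = rng.integers(0, 3, size=(200, 1000)).astype(float)
X -= X.mean(axis=0)
true_b = np.zeros(1000)
true_b[rng.choice(1000, 20, replace=False)] = rng.normal(size=20)
y = X @ true_b + rng.normal(size=200)
beta_hat = snp_blup_svd(X, y, lam=100.0)
```

    The key efficiency point is that once `U`, `d`, `Vt` are computed, re-solving for new shrinkage weights (as the BayesC approximation requires) needs no further factorization.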

  20. Self-imposed length limits in recreational fisheries

    USGS Publications Warehouse

    Chizinski, Christopher J.; Martin, Dustin R.; Hurley, Keith L.; Pope, Kevin L.

    2014-01-01

    A primary motivating factor in the decision to harvest a fish among consumptive-orientated anglers is the size of the fish. There is likely a cost-benefit trade-off for harvest of individual fish that is size and species dependent, which should produce a logistic-type response of fish fate (release or harvest) as a function of fish size and species. We define the self-imposed length limit as the length at which a captured fish had a 50% probability of being harvested, which was selected because it marks the length of the fish where the probability of harvest becomes greater than the probability of release. We assessed the influences of fish size, catch per unit effort, size distribution of caught fish, and creel limit on the self-imposed length limits for bluegill Lepomis macrochirus, channel catfish Ictalurus punctatus, black crappie Pomoxis nigromaculatus and white crappie Pomoxis annularis combined, white bass Morone chrysops, and yellow perch Perca flavescens at six lakes in Nebraska, USA. As we predicted, the probability of harvest increased with increasing size for all species harvested, which supported the concept of a size-dependent trade-off in costs and benefits of harvesting individual fish. It was also clear that the probability of harvest was not simply defined by fish length, but rather was likely influenced to various degrees by interactions between species, catch rate, size distribution, creel-limit regulation and fish size. A greater understanding of harvest decisions within the context of the perceived likelihood that a creel limit will be realized by a given angler party, which is a function of fish availability, harvest regulation and angler skill and orientation, is needed to predict the influence that anglers have on fish communities and to allow managers to sustainably manage exploited fish populations in recreational fisheries.
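    The self-imposed length limit is just the 50% point of a fitted logistic curve, so a minimal sketch is a logistic regression of fish fate on length with L50 = -intercept/slope. The data below are simulated stand-ins, not the Nebraska creel data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical creel records: fish length [mm] and fate (1 = harvested)
rng = np.random.default_rng(1)
length = rng.uniform(100, 400, 500)
p_true = 1 / (1 + np.exp(-(length - 250) / 25))  # assumed true response
harvested = rng.random(500) < p_true

# Essentially unregularized logistic fit of fate on length
model = LogisticRegression(C=1e6).fit(length.reshape(-1, 1), harvested)
b1 = model.coef_[0, 0]
b0 = model.intercept_[0]
L50 = -b0 / b1  # length at 50% harvest probability
print(f"Self-imposed length limit (L50): {L50:.0f} mm")
```

    Interactions with species, catch rate, or creel limit, as studied in the paper, would enter as additional covariates in the same fit.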

  1. Convergence of Transition Probability Matrix in CLV Markov Models

    NASA Astrophysics Data System (ADS)

    Permana, D.; Pasaribu, U. S.; Indratno, S. W.; Suprayogi, S.

    2018-04-01

    A transition probability matrix is an arrangement of the transition probabilities from one state to another in a Markov chain model (MCM). One interesting aspect of an MCM is its behavior far into the future, which derives from a property of the n-step transition probability matrix: its convergence as n tends to infinity. Mathematically, finding the convergence of the transition probability matrix means finding the limit of the matrix raised to the power n as n tends to infinity. The convergent form of the transition probability matrix is of particular interest because it brings the matrix to its stationary form, which is useful for predicting the probabilities of transitions between states in the future. The method usually used to find the convergence of a transition probability matrix is the limiting-distribution process. In this paper, the convergence of the transition probability matrix is instead obtained using a simple concept of linear algebra: diagonalizing the matrix. This method has a higher level of complexity because it requires carrying out the diagonalization, but it has the advantage of yielding a general closed form for the n-th power of the transition probability matrix, which is useful for examining the transition matrix before it reaches stationarity. Example cases are taken from a customer lifetime value (CLV) model using an MCM, called the CLV-Markov model. Several models are examined through their transition probability matrices to find their convergent forms. The result is that the convergence of the transition probability matrix obtained through diagonalization agrees with the convergence obtained with the commonly used limiting-distribution method.
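    A minimal numerical version of the diagonalization approach follows: compute P^n in closed form from the eigendecomposition and watch the rows converge to the stationary distribution. The 3-state matrix is invented for illustration, not taken from the CLV examples.

```python
import numpy as np

P = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.5, 0.2],
              [0.2, 0.3, 0.5]])  # illustrative row-stochastic matrix

# Diagonalize: P = Q diag(w) Q^{-1}, hence P^n = Q diag(w^n) Q^{-1}
w, Q = np.linalg.eig(P)
Qinv = np.linalg.inv(Q)

def P_power(n):
    """Closed form for the n-step transition matrix."""
    return np.real((Q * w**n) @ Qinv)

# As n grows, every row converges to the stationary distribution
print(P_power(5))   # pre-stationary n-step matrix
print(P_power(50))  # rows effectively identical: the stationary form
```

    The closed form makes the pre-stationary behavior visible for any n, which is exactly the advantage over the limiting-distribution method that the abstract notes.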

  2. Unified nano-mechanics based probabilistic theory of quasibrittle and brittle structures: I. Strength, static crack growth, lifetime and scaling

    NASA Astrophysics Data System (ADS)

    Le, Jia-Liang; Bažant, Zdeněk P.; Bazant, Martin Z.

    2011-07-01

    Engineering structures must be designed for an extremely low failure probability such as 10⁻⁶, which is beyond the means of direct verification by histogram testing. This is not a problem for brittle or ductile materials because the type of probability distribution of structural strength is fixed and known, making it possible to predict the tail probabilities from the mean and variance. It is a problem, though, for quasibrittle materials for which the type of strength distribution transitions from Gaussian to Weibullian as the structure size increases. These are heterogeneous materials with brittle constituents, characterized by material inhomogeneities that are not negligible compared to the structure size. Examples include concrete, fiber composites, coarse-grained or toughened ceramics, rocks, sea ice, rigid foams and bone, as well as many materials used in nano- and microscale devices. This study presents a unified theory of strength and lifetime for such materials, based on activation energy controlled random jumps of the nano-crack front, and on the nano-macro multiscale transition of tail probabilities. Part I of this study deals with the case of monotonic and sustained (or creep) loading, and Part II with fatigue (or cyclic) loading. On the scale of the representative volume element of material, the probability distribution of strength has a Gaussian core onto which a remote Weibull tail is grafted at a failure probability of the order of 10⁻³. With increasing structure size, the Weibull tail penetrates into the Gaussian core. The probability distribution of static (creep) lifetime is related to the strength distribution by the power law for the static crack growth rate, for which a physical justification is given. The present theory yields a simple relation between the exponent of this law and the Weibull moduli for strength and lifetime. The benefit is that the lifetime distribution can be predicted from short-time tests of the mean size effect on strength and tests of the power law for the crack growth rate. The theory is shown to match closely numerous test data on strength and static lifetime of ceramics and concrete, and explains why their histograms deviate systematically from the straight line in Weibull scale. Although the present unified theory is built on several previous advances, new contributions are here made to address: (i) a crack in a disordered nano-structure (such as that of hydrated Portland cement), (ii) tail probability of a fiber bundle (or parallel coupling) model with softening elements, (iii) convergence of this model to the Gaussian distribution, (iv) the stress-life curve under constant load, and (v) a detailed random walk analysis of crack front jumps in an atomic lattice. The nonlocal behavior is captured in the present theory through the finiteness of the number of links in the weakest-link model, which explains why the mean size effect coincides with that of the previously formulated nonlocal Weibull theory. Brittle structures correspond to the large-size limit of the present theory. An important practical conclusion is that the safety factors for strength and tolerable minimum lifetime for large quasibrittle structures (e.g., concrete structures and composite airframes or ship hulls, as well as various micro-devices) should be calculated as a function of structure size and geometry.

  3. [Research on engine remaining useful life prediction based on oil spectrum analysis and particle filtering].

    PubMed

    Sun, Lei; Jia, Yun-xian; Cai, Li-ying; Lin, Guo-yu; Zhao, Jin-song

    2013-09-01

    Spectrometric oil analysis (SOA) is an important technique for machine state monitoring, fault diagnosis and prognosis, and SOA-based remaining useful life (RUL) prediction has the advantage of identifying the optimal maintenance strategy for a machine system. Because of the complexity of machine systems, their health-state degradation processes cannot be simply characterized by linear models, while particle filtering (PF) possesses obvious advantages over traditional Kalman filtering for dealing with nonlinear and non-Gaussian systems. The PF approach was therefore applied to state forecasting by SOA, and an RUL prediction technique based on SOA and the PF algorithm is proposed. In the prediction model, the prior probability distribution is obtained from the estimate of the system's posterior probability, and a multi-step-ahead prediction model based on the PF algorithm is established. Finally, practical SOA data from an engine were analyzed and forecasted by the above method, and the forecasting result was compared with that of the traditional Kalman filtering method. The result fully shows the superiority and effectiveness of the proposed method.
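    A minimal bootstrap particle filter conveys the multi-step-ahead forecasting step; the linear-drift degradation model, noise levels, and drift rate below are illustrative assumptions, not the paper's engine model.

```python
import numpy as np

rng = np.random.default_rng(2)

def particle_filter_forecast(obs, n_particles=1000, steps_ahead=5,
                             q=0.05, r=0.1, drift=0.02):
    """Bootstrap particle filter with an assumed random-walk-plus-drift
    degradation model x_t = x_{t-1} + drift + noise, observed as
    y_t = x_t + noise. q, r: process/observation noise std."""
    x = rng.normal(obs[0], r, n_particles)            # initialize particles
    for y in obs[1:]:
        x = x + drift + rng.normal(0, q, n_particles)     # propagate
        w = np.exp(-0.5 * ((y - x) / r) ** 2)             # likelihood weights
        w /= w.sum()
        x = rng.choice(x, n_particles, p=w)               # resample
    for _ in range(steps_ahead):                          # forecast: propagate
        x = x + drift + rng.normal(0, q, n_particles)     # without updating
    return x  # forecast distribution of the monitored wear metric

obs = 0.02 * np.arange(30) + rng.normal(0, 0.1, 30)   # synthetic SOA series
forecast = particle_filter_forecast(obs)
print(forecast.mean(), np.percentile(forecast, [5, 95]))
```

    The RUL estimate then follows from the first forecast step at which the particle cloud crosses a wear threshold; that thresholding step is omitted here.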

  4. Trending in Probability of Collision Measurements via a Bayesian Zero-Inflated Beta Mixed Model

    NASA Technical Reports Server (NTRS)

    Vallejo, Jonathon; Hejduk, Matt; Stamey, James

    2015-01-01

    We investigate the performance of a generalized linear mixed model in predicting the Probabilities of Collision (Pc) for conjunction events. Specifically, we apply this model to the log(sub 10) transformation of these probabilities and argue that this transformation yields values that can be considered bounded in practice. Additionally, this bounded random variable, after scaling, is zero-inflated. Consequently, we model these values using the zero-inflated Beta distribution, and utilize the Bayesian paradigm and the mixed model framework to borrow information from past and current events. This provides a natural way to model the data and provides a basis for answering questions of interest, such as what is the likelihood of observing a probability of collision equal to the effective value of zero on a subsequent observation.
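    The likelihood at the heart of such a model is easy to write down; a minimal sketch of the zero-inflated Beta log-likelihood (without the mixed-effects structure or Bayesian priors) follows, with all parameterization choices ours.

```python
import numpy as np
from scipy import stats
from scipy.optimize import minimize

def zib_loglik(params, y):
    """Zero-inflated Beta log-likelihood for scaled values y in [0, 1)
    with a point mass at zero.

    params : (logit of zero-inflation pi, log alpha, log beta)
    """
    pi = 1.0 / (1.0 + np.exp(-params[0]))   # zero-inflation probability
    a, b = np.exp(params[1]), np.exp(params[2])
    zero = (y == 0)
    ll = np.empty_like(y, dtype=float)
    ll[zero] = np.log(pi)                                    # structural zeros
    ll[~zero] = np.log1p(-pi) + stats.beta.logpdf(y[~zero], a, b)
    return ll.sum()

# Toy data: 30% zeros, Beta(2, 8) otherwise; fit by maximum likelihood
rng = np.random.default_rng(0)
y = np.where(rng.random(500) < 0.3, 0.0, rng.beta(2.0, 8.0, 500))
res = minimize(lambda p: -zib_loglik(p, y), x0=[0.0, 0.0, 0.0],
               method="Nelder-Mead")
```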

  5. Site-to-Source Finite Fault Distance Probability Distribution in Probabilistic Seismic Hazard and the Relationship Between Minimum Distances

    NASA Astrophysics Data System (ADS)

    Ortega, R.; Gutierrez, E.; Carciumaru, D. D.; Huesca-Perez, E.

    2017-12-01

    We present a method to compute the conditional and unconditional probability density function (PDF) of the finite fault distance distribution (FFDD). Two cases are described: lines and areas. The case of lines has a simple analytical solution, while in the case of areas, the geometrical probability of a fault based on the strike, dip, and fault segment vertices is obtained using the projection of spheres onto a piecewise rectangular surface. The cumulative distribution is computed by measuring the projection of a sphere of radius r in an effective area, using an algorithm that estimates the area of a circle within a rectangle. In addition, we introduce the finite fault distance metric: the distance at which the maximum stress release occurs within the fault plane, generating the peak ground motion. The appropriate ground motion prediction equations (GMPEs) for PSHA can then be applied. The conditional probability of distance given magnitude is also presented using different scaling laws. A simple model with the centroid held constant at the geometrical mean is discussed; in this model, hazard is reduced at the edges because the effective size is reduced. There is currently a trend toward using extended-source distances in PSHA; however, it is not possible to separate the fault geometry from the GMPE. With this new approach, it is possible to add fault rupture models, separating geometrical and propagation effects.

  6. Cluster-based control of a separating flow over a smoothly contoured ramp

    NASA Astrophysics Data System (ADS)

    Kaiser, Eurika; Noack, Bernd R.; Spohn, Andreas; Cattafesta, Louis N.; Morzyński, Marek

    2017-12-01

    The ability to manipulate and control fluid flows is of great importance in many scientific and engineering applications. The proposed closed-loop control framework addresses a key issue of model-based control: The actuation effect often results from slow dynamics of strongly nonlinear interactions which the flow reveals at timescales much longer than the prediction horizon of any model. Hence, we employ a probabilistic approach based on a cluster-based discretization of the Liouville equation for the evolution of the probability distribution. The proposed methodology frames high-dimensional, nonlinear dynamics into low-dimensional, probabilistic, linear dynamics which considerably simplifies the optimal control problem while preserving nonlinear actuation mechanisms. The data-driven approach builds upon a state space discretization using a clustering algorithm which groups kinematically similar flow states into a low number of clusters. The temporal evolution of the probability distribution on this set of clusters is then described by a control-dependent Markov model. This Markov model can be used as predictor for the ergodic probability distribution for a particular control law. This probability distribution approximates the long-term behavior of the original system on which basis the optimal control law is determined. We examine how the approach can be used to improve the open-loop actuation in a separating flow dominated by Kelvin-Helmholtz shedding. For this purpose, the feature space, in which the model is learned, and the admissible control inputs are tailored to strongly oscillatory flows.
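    The data-driven pipeline, clustering snapshots and then counting transitions, reduces to a few lines; the sketch below uses k-means and a row-normalized count matrix, with the control dependence and the Liouville-equation machinery omitted, and random features standing in for real flow snapshots.

```python
import numpy as np
from sklearn.cluster import KMeans

def cluster_markov(snapshots, n_clusters=10):
    """Cluster time-ordered flow snapshots and estimate the cluster
    transition matrix of the resulting Markov model."""
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(snapshots)
    P = np.zeros((n_clusters, n_clusters))
    for i, j in zip(labels[:-1], labels[1:]):
        P[i, j] += 1.0                                    # count transitions
    P /= np.maximum(P.sum(axis=1, keepdims=True), 1.0)    # row-normalize
    return labels, P

rng = np.random.default_rng(0)
snapshots = rng.normal(size=(5000, 8))  # stand-in for kinematic flow features
labels, P = cluster_markov(snapshots)

# Ergodic distribution: left eigenvector of P with eigenvalue 1,
# the quantity the control law is optimized against in the paper
w, V = np.linalg.eig(P.T)
pi = np.real(V[:, np.argmin(np.abs(w - 1.0))])
pi /= pi.sum()
```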

  7. Supervised learning of probability distributions by neural networks

    NASA Technical Reports Server (NTRS)

    Baum, Eric B.; Wilczek, Frank

    1988-01-01

    Supervised learning algorithms for feedforward neural networks are investigated analytically. The back-propagation algorithm described by Werbos (1974), Parker (1985), and Rumelhart et al. (1986) is generalized by redefining the values of the input and output neurons as probabilities. The synaptic weights are then varied to follow gradients in the logarithm of likelihood rather than in the error. This modification is shown to provide a more rigorous theoretical basis for the algorithm and to permit more accurate predictions. A typical application involving a medical-diagnosis expert system is discussed.

  8. Using beta binomials to estimate classification uncertainty for ensemble models.

    PubMed

    Clark, Robert D; Liang, Wenkel; Lee, Adam C; Lawless, Michael S; Fraczkiewicz, Robert; Waldman, Marvin

    2014-01-01

    Quantitative structure-activity relationship (QSAR) models have enormous potential for reducing drug discovery and development costs as well as the need for animal testing. Great strides have been made in estimating their overall reliability, but to fully realize that potential, researchers and regulators need to know how confident they can be in individual predictions. Submodels in an ensemble model which have been trained on different subsets of a shared training pool represent multiple samples of the model space, and the degree of agreement among them contains information on the reliability of ensemble predictions. For artificial neural network ensembles (ANNEs) using two different methods for determining ensemble classification, one using vote tallies and the other averaging individual network outputs, we have found that the distribution of predictions across positive vote tallies can be reasonably well-modeled as a beta binomial distribution, as can the distribution of errors. Together, these two distributions can be used to estimate the probability that a given predictive classification will be in error. Large data sets comprising logP, Ames mutagenicity, and CYP2D6 inhibition data are used to illustrate and validate the method. The distributions of predictions and errors for the training pool accurately predicted the distribution of predictions and errors for large external validation sets, even when the number of positive and negative examples in the training pool were not balanced. Moreover, the likelihood of a given compound being prospectively misclassified as a function of the degree of consensus between networks in the ensemble could in most cases be estimated accurately from the fitted beta binomial distributions for the training pool. Confidence in an individual predictive classification by an ensemble model can be accurately assessed by examining the distributions of predictions and errors as a function of the degree of agreement among the constituent submodels. Further, ensemble uncertainty estimation can often be improved by adjusting the voting or classification threshold based on the parameters of the error distribution. Finally, the profiles for models whose predictive uncertainty estimates are not reliable provide clues to that effect without the need for comparison to an external test set.
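    Fitting a beta binomial to positive vote tallies is a small maximum-likelihood problem; the sketch below uses `scipy.stats.betabinom` with synthetic tallies in place of real ensemble votes, and the ensemble size and shape parameters are invented.

```python
import numpy as np
from scipy import stats
from scipy.optimize import minimize

def fit_beta_binomial(votes, n_models):
    """Maximum-likelihood beta-binomial fit to positive vote tallies
    from an ensemble of n_models submodels."""
    def nll(params):
        a, b = np.exp(params)  # keep alpha, beta positive via log scale
        return -stats.betabinom.logpmf(votes, n_models, a, b).sum()
    res = minimize(nll, x0=[0.0, 0.0], method="Nelder-Mead")
    return np.exp(res.x)

# Hypothetical tallies: 30 networks voting "positive" per compound
votes = stats.betabinom.rvs(30, 2.0, 5.0, size=500,
                            random_state=np.random.default_rng(3))
alpha, beta = fit_beta_binomial(votes, 30)
```

    With a second beta-binomial fitted to the error counts, the two distributions together give the per-tally misclassification probability the abstract describes.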

  9. The investigation of the lateral interaction effects on traffic flow behavior under open boundaries

    NASA Astrophysics Data System (ADS)

    Bouadi, M.; Jetto, K.; Benyoussef, A.; El Kenz, A.

    2017-11-01

    In this paper, a traffic flow system with open boundaries is studied, taking into account lateral interaction with spatial defects. For a random distribution of defects, if the vehicle velocities are weakly correlated, the traffic phases can be predicted by considering the corresponding inflow and outflow functions. Conversely, if the vehicle velocities are strongly correlated, a phase segregation appears inside the system's bulk, which induces the appearance of a maximum current. Such velocity correlation depends mainly on the defect densities and the probabilities of lateral deceleration. However, for a compact distribution of defects, the traffic phases are predictable by using the inflow at the entrance of the system, the inflow entering the defect zone, and the outflow function.

  10. An improved probabilistic approach for linking progenitor and descendant galaxy populations using comoving number density

    NASA Astrophysics Data System (ADS)

    Wellons, Sarah; Torrey, Paul

    2017-06-01

    Galaxy populations at different cosmic epochs are often linked by cumulative comoving number density in observational studies. Many theoretical works, however, have shown that the cumulative number densities of tracked galaxy populations not only evolve in bulk, but also spread out over time. We present a method for linking progenitor and descendant galaxy populations which takes both of these effects into account. We define probability distribution functions that capture the evolution and dispersion of galaxy populations in number density space, and use these functions to assign galaxies at redshift z_f probabilities of being progenitors/descendants of a galaxy population at another redshift z_0. These probabilities are used as weights for calculating distributions of physical progenitor/descendant properties such as stellar mass, star formation rate or velocity dispersion. We demonstrate that this probabilistic method provides more accurate predictions for the evolution of physical properties than the assumption of either a constant number density or an evolving number density in a bin of fixed width by comparing predictions against galaxy populations directly tracked through a cosmological simulation. We find that the constant number density method performs least well at recovering galaxy properties, the evolving number density method slightly better and the probabilistic method best of all. The improvement is present for predictions of stellar mass as well as inferred quantities such as star formation rate and velocity dispersion. We demonstrate that this method can also be applied robustly and easily to observational data, and provide a code package for doing so.

  11. Integrated seismic stochastic inversion and multi-attributes to delineate reservoir distribution: Case study MZ fields, Central Sumatra Basin

    NASA Astrophysics Data System (ADS)

    Haris, A.; Novriyani, M.; Suparno, S.; Hidayat, R.; Riyanto, A.

    2017-07-01

    This study presents the integration of seismic stochastic inversion and multi-attribute analysis for delineating the reservoir distribution, in terms of lithology and porosity, in the formation within the depth interval between the Top Sihapas and the Top Pematang. The method used is stochastic inversion integrated with seismic multi-attributes by applying a Probabilistic Neural Network (PNN). Stochastic methods are used to predict the probability mapping of sandstone, with the impedance varied over 50 realizations to produce a good probability estimate. Stochastic seismic inversion analysis is more interpretive because it directly gives the value of the property. Our experiment shows that the acoustic impedance (AI) from stochastic inversion provides more diverse uncertainty, so that the probability values will be close to the actual values. The produced AI is then used as an input for a multi-attribute analysis, which is used to predict the gamma-ray, density and porosity logs. To select the attributes to be used, a stepwise regression algorithm is applied; the resulting attributes are used in the PNN process. The PNN method is chosen because it has the best correlation among neural network methods. Finally, we interpret the products of the multi-attribute analysis, in the form of a pseudo-gamma-ray volume, a density volume and a pseudo-porosity volume, to delineate the reservoir distribution. Our interpretation shows that the structural trap is identified in the southeastern part of the study area, along the anticline.

  12. The coherence problem with the Unified Neutral Theory of biodiversity

    Treesearch

    James S. Clark

    2012-01-01

    The Unified Neutral Theory of Biodiversity (UNTB), proposed as an alternative to niche theory, has been viewed as a theory that species coexist without niche differences, without fitness differences, or with equal probability of success. Support is claimed when models lacking species differences predict highly aggregated metrics, such as species abundance distributions...

  13. Sequential experimental design based generalised ANOVA

    NASA Astrophysics Data System (ADS)

    Chakraborty, Souvik; Chowdhury, Rajib

    2016-07-01

    Over the last decade, surrogate modelling technique has gained wide popularity in the field of uncertainty quantification, optimization, model exploration and sensitivity analysis. This approach relies on experimental design to generate training points and regression/interpolation for generating the surrogate. In this work, it is argued that conventional experimental design may render a surrogate model inefficient. In order to address this issue, this paper presents a novel distribution adaptive sequential experimental design (DA-SED). The proposed DA-SED has been coupled with a variant of generalised analysis of variance (G-ANOVA), developed by representing the component function using the generalised polynomial chaos expansion. Moreover, generalised analytical expressions for calculating the first two statistical moments of the response, which are utilized in predicting the probability of failure, have also been developed. The proposed approach has been utilized in predicting probability of failure of three structural mechanics problems. It is observed that the proposed approach yields accurate and computationally efficient estimate of the failure probability.

  14. Sequential experimental design based generalised ANOVA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chakraborty, Souvik, E-mail: csouvik41@gmail.com; Chowdhury, Rajib, E-mail: rajibfce@iitr.ac.in

    Over the last decade, surrogate modelling technique has gained wide popularity in the field of uncertainty quantification, optimization, model exploration and sensitivity analysis. This approach relies on experimental design to generate training points and regression/interpolation for generating the surrogate. In this work, it is argued that conventional experimental design may render a surrogate model inefficient. In order to address this issue, this paper presents a novel distribution adaptive sequential experimental design (DA-SED). The proposed DA-SED has been coupled with a variant of generalised analysis of variance (G-ANOVA), developed by representing the component function using the generalised polynomial chaos expansion. Moreover, generalised analytical expressions for calculating the first two statistical moments of the response, which are utilized in predicting the probability of failure, have also been developed. The proposed approach has been utilized in predicting probability of failure of three structural mechanics problems. It is observed that the proposed approach yields accurate and computationally efficient estimate of the failure probability.

  15. A hybrid model for predicting carbon monoxide from vehicular exhausts in urban environments

    NASA Astrophysics Data System (ADS)

    Gokhale, Sharad; Khare, Mukesh

    Several deterministic air quality models evaluate and predict the frequently occurring pollutant concentrations well but, in general, are incapable of predicting the 'extreme' concentrations. In contrast, statistical distribution models overcome this limitation of the deterministic models and predict the 'extreme' concentrations. However, environmental damage is caused both by the extremes and by the sustained average concentrations of pollutants. Hence, a model should predict not only the 'extreme' ranges but also the 'middle' ranges of pollutant concentrations, i.e. the entire range. Hybrid modelling is one technique that estimates/predicts the 'entire range' of the distribution of pollutant concentrations by combining deterministic models with suitable statistical distribution models (Jakeman et al., 1988). In the present paper, a hybrid model has been developed to predict the carbon monoxide (CO) concentration distributions at one of the traffic intersections, the Income Tax Office (ITO), in Delhi, where the traffic is heterogeneous in nature, consisting of light vehicles, heavy vehicles, three-wheelers (auto rickshaws) and two-wheelers (scooters, motorcycles, etc.), and the meteorology is 'tropical'. The model combines the general finite line source model (GFLSM) as its deterministic component and the log-logistic distribution (LLD) model as its statistical component. The hybrid (GFLSM-LLD) model is then applied at the ITO intersection. The results show that the hybrid model predictions match the observed CO concentration data within the 5-99 percentile range. The model is further validated at a different street location, the Sirifort roadway. The validation results show that the model predicts CO concentrations fairly well (d = 0.91) in the 10-95 percentile range. A regulatory compliance analysis is also developed to estimate the probability that hourly CO concentrations exceed the National Ambient Air Quality Standards (NAAQS) of India.

  16. Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system structural components

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.

    1987-01-01

    The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.

  17. Probabilistic Structural Analysis Methods for select space propulsion system structural components (PSAM)

    NASA Technical Reports Server (NTRS)

    Cruse, T. A.; Burnside, O. H.; Wu, Y.-T.; Polch, E. Z.; Dias, J. B.

    1988-01-01

    The objective is the development of several modular structural analysis packages capable of predicting the probabilistic response distribution for key structural variables such as maximum stress, natural frequencies, transient response, etc. The structural analysis packages are to include stochastic modeling of loads, material properties, geometry (tolerances), and boundary conditions. The solution is to be in terms of the cumulative probability of exceedance distribution (CDF) and confidence bounds. Two methods of probability modeling are to be included as well as three types of structural models - probabilistic finite-element method (PFEM); probabilistic approximate analysis methods (PAAM); and probabilistic boundary element methods (PBEM). The purpose in doing probabilistic structural analysis is to provide the designer with a more realistic ability to assess the importance of uncertainty in the response of a high performance structure. Probabilistic Structural Analysis Method (PSAM) tools will estimate structural safety and reliability, while providing the engineer with information on the confidence that should be given to the predicted behavior. Perhaps most critically, the PSAM results will directly provide information on the sensitivity of the design response to those variables which are seen to be uncertain.

  18. Surface drift prediction in the Adriatic Sea using hyper-ensemble statistics on atmospheric, ocean and wave models: Uncertainties and probability distribution areas

    USGS Publications Warehouse

    Rixen, M.; Ferreira-Coelho, E.; Signell, R.

    2008-01-01

    Despite numerous and regular improvements in underlying models, surface drift prediction in the ocean remains a challenging task because of our yet limited understanding of all processes involved. Hence, deterministic approaches to the problem are often limited by empirical assumptions on underlying physics. Multi-model hyper-ensemble forecasts, which exploit the power of an optimal local combination of available information including ocean, atmospheric and wave models, may show superior forecasting skills when compared to individual models because they allow for local correction and/or bias removal. In this work, we explore in greater detail the potential and limitations of the hyper-ensemble method in the Adriatic Sea, using a comprehensive surface drifter database. The performance of the hyper-ensembles and the individual models are discussed by analyzing associated uncertainties and probability distribution maps. Results suggest that the stochastic method may reduce position errors significantly for 12 to 72 h forecasts and hence compete with pure deterministic approaches. © 2007 NATO Undersea Research Centre (NURC).
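    In spirit, the hyper-ensemble reduces to a locally trained linear combination of individual model forecasts; a least-squares sketch over a past training window is shown below. The locality, weighting, and preprocessing details of the actual method are not reproduced, and all array shapes are assumptions.

```python
import numpy as np

def hyper_ensemble_weights(past_models, past_obs):
    """Least-squares weights (plus a bias term) for a linear
    combination of individual model forecasts.

    past_models : (n_times, n_models) forecasts of, e.g., drift speed
    past_obs    : (n_times,) matching observations over a training window
    """
    A = np.column_stack([past_models, np.ones(len(past_obs))])
    w, *_ = np.linalg.lstsq(A, past_obs, rcond=None)
    return w

def hyper_ensemble_forecast(models_now, w):
    """Combine the current individual forecasts with the learned weights."""
    return np.dot(w[:-1], models_now) + w[-1]

# Toy usage: three imperfect 'models' of a common signal
rng = np.random.default_rng(0)
truth = np.sin(np.linspace(0, 6, 200))
models = np.column_stack([truth + rng.normal(0, s, 200) for s in (0.1, 0.2, 0.3)])
w = hyper_ensemble_weights(models[:150], truth[:150])
pred = hyper_ensemble_forecast(models[150], w)
```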

  19. Non-extensive Statistics to the Cosmological Lithium Problem

    NASA Astrophysics Data System (ADS)

    Hou, S. Q.; He, J. J.; Parikh, A.; Kahl, D.; Bertulani, C. A.; Kajino, T.; Mathews, G. J.; Zhao, G.

    2017-01-01

    Big Bang nucleosynthesis (BBN) theory predicts the abundances of the light elements D, 3He, 4He, and 7Li produced in the early universe. The primordial abundances of D and 4He inferred from observational data are in good agreement with predictions; however, BBN theory overestimates the primordial 7Li abundance by about a factor of three. This is the so-called "cosmological lithium problem." Solutions to this problem using conventional astrophysics and nuclear physics have not been successful over the past few decades, probably indicating the presence of new physics during the era of BBN. We have investigated the impact on BBN predictions of adopting a generalized distribution to describe the velocities of nucleons in the framework of Tsallis non-extensive statistics. This generalized velocity distribution is characterized by a parameter q, and reduces to the usually assumed Maxwell-Boltzmann distribution for q = 1. We find excellent agreement between predicted and observed primordial abundances of D, 4He, and 7Li for 1.069 ≤ q ≤ 1.082, suggesting a possible new solution to the cosmological lithium problem.

  20. Statistical description of non-Gaussian samples in the F2 layer of the ionosphere during heliogeophysical disturbances

    NASA Astrophysics Data System (ADS)

    Sergeenko, N. P.

    2017-11-01

    An adequate statistical method should be developed in order to probabilistically predict the range of ionospheric parameters. This problem is solved in this paper. The time series of the F2-layer critical frequency, foF2(t), were subjected to statistical processing. For the obtained samples {δfoF2}, statistical distributions and invariants up to the fourth order are calculated. The analysis shows that the distributions differ from the Gaussian law during disturbances. At sufficiently small probability levels, there are arbitrarily large deviations from the normal-process model. Therefore, an attempt is made to describe the statistical samples {δfoF2} based on the Poisson model. For the studied samples, an exponential characteristic function is selected under the assumption that the time series are a superposition of deterministic and random processes. Using the Fourier transform, the characteristic function is transformed into a nonholomorphic, excessive-asymmetric probability density function. The statistical distributions of the samples {δfoF2} calculated for the disturbed periods are compared with the obtained model distribution function. According to the Kolmogorov criterion, the probabilities of coincidence of the a posteriori distributions with the theoretical ones are P ≈ 0.7-0.9. The analysis makes it possible to conclude that a model based on a Poisson random process is applicable to the statistical description and probabilistic estimation of the variations {δfoF2} during heliogeophysical disturbances.

  1. Can we expect to predict climate if we cannot shadow weather?

    NASA Astrophysics Data System (ADS)

    Smith, Leonard

    2010-05-01

    What limits our ability to predict (or project) useful statistics of future climate? And how might we quantify those limits? In the early 1960s, Ed Lorenz illustrated one constraint on point forecasts of the weather (chaos) while noting another (model imperfections). In the mid-sixties he went on to discuss climate prediction, noting that chaos, per se, need not limit accurate forecasts of averages and the distributions that define climate. In short, chaos might place draconian limits on what we can say about a particular summer day in 2010 (or 2040), but it need not limit our ability to make accurate and informative statements about the weather over this summer as a whole, or climate distributions of the 2040s. If not chaos, what limits our ability to produce decision-relevant probability distribution functions (PDFs)? Is this just a question of technology (raw computer power) and uncertain boundary conditions (emission scenarios)? Arguably, current model simulations of the Earth's climate are limited by model inadequacy: not that the initial or boundary conditions are unknown but that state-of-the-art models would not yield decision-relevant probability distributions even if they were known. Or, to place this statement in an empirically falsifiable format: in 2100, when the boundary conditions are known and computer power is (hopefully) sufficient to allow exhaustive exploration of today's state-of-the-art models, we will find that today's models do not admit a trajectory consistent with our knowledge of the state of the earth in 2009 that would prove of decision-support relevance at, say, 25 km, hourly resolution. In short: today's models cannot shadow the weather of this century even after the fact. Restating this conjecture in a more positive frame: a 2100 historian of science will be able to determine the highest space and time scales on which 2009 models could have (i) produced trajectories plausibly consistent with the (by then) observed twenty-first century and (ii) produced probability distributions useful as such for decision support. As it will be some time until such conjectures can be refuted, how might we best advise decision makers of the detail (specifically, the space and time resolution of a quantity of interest as a function of lead time) at which it is rational to interpret model-based PDFs as decision-relevant probability distributions? Given the nonlinearities already incorporated in our models, how far into the future can one expect a simulation to get the temperature "right" given that the simulation has precipitation badly "wrong"? When can biases in local temperature which melt model ice no longer be dismissed and neglected by presenting model anomalies? At what lead times will feedbacks due to model inadequacies cause the 2007 model simulations to drift away from what today's basic science (and 2100 computer power) would suggest? How might one justify quantitative claims regarding "extreme events" (or NUMB weather)? Models are unlikely to forecast things they cannot shadow, or at least track. There is no constraint on rational scientists to take model distributions as their subjective probabilities unless they believe the model is empirically adequate. How then are we to use today's simulations to inform today's decisions? Two approaches are considered. The first augments the model-based PDF with an explicit subjective probability of a "Big Surprise".
The second is to look not for a PDF but, following Solvency II, consider the risk from any event that cannot be ruled out at, say, the one in 200 level. The fact that neither approach provides the simplicity and apparent confidence of interpreting model-based PDFs as if they were objective probabilities does not contradict the claim that either might lead to better decision-making.

  2. Sexual differentiation in the distribution potential of northern jaguars (Panthera onca)

    USGS Publications Warehouse

    Boydston, Erin E.; Lopez Gonzalez, Carlos A.

    2005-01-01

    We estimated the potential geographic distribution of jaguars in the southwestern United States and northwestern Mexico by modeling the jaguar ecological niche from occurrence records. We modeled the distributions of males and females separately, assuming that records of females probably represented established home ranges while male records likely included dispersal movements. The predicted distribution for males was larger than that for females. Eastern Sonora appeared capable of supporting male and female jaguars, with potential range expansion into southeastern Arizona. New Mexico and Chihuahua contained environmental characteristics primarily limited to the male niche and thus may be areas into which males occasionally disperse.

  3. The orientation distribution of tunneling-related quantities

    NASA Astrophysics Data System (ADS)

    Seif, W. M.; Refaie, A. I.; Botros, M. M.

    2018-03-01

    In nuclear tunneling processes involving deformed nuclei, most of the tunneling-related quantities depend on the relative orientations of the participating nuclei. In the presence of different multipole deformations, we study the variation of a few relevant quantities for the α-decay and sub-barrier fusion processes as a function of the orientation degree of freedom. The knocking frequency and the penetration probability are evaluated within the Wentzel-Kramers-Brillouin approximation. The interaction potential is calculated with a Skyrme-type nucleon-nucleon interaction. We found that the width of the potential pocket, the Coulomb barrier radius, the penetration probability, the α-decay width, and the fusion cross-section consistently follow the orientation-angle variation of the radius of the deformed nucleus. The orientation distribution patterns of the pocket width, the barrier radius, the logarithms of the penetrability, the decay width, and the fusion cross-section are found to be highly analogous to the pattern of the deformed-nucleus radius. The orientation-angle distributions of the internal pocket depth, the Coulomb barrier height and width, as well as the knocking frequency, inversely follow the variation of the deformed-nucleus radius. The predicted orientation behaviors will be of special interest in predicting the optimum orientations for tunneling processes.

  4. Assessing the status and trend of bat populations across broad geographic regions with dynamic distribution models

    USGS Publications Warehouse

    Rodhouse, Thomas J.; Ormsbee, Patricia C.; Irvine, Kathryn M.; Vierling, Lee A.; Szewczak, Joseph M.; Vierling, Kerri T.

    2012-01-01

    Despite its common status, M. lucifugus was only detected during ∼50% of the surveys in occupied sample units. The overall naïve estimate for the proportion of the study region occupied by the species was 0.69, but after accounting for imperfect detection, this increased to ∼0.90. Our models provide evidence of an association between NPP and forest cover and M. lucifugus distribution, with implications for the projected effects of accelerated climate change in the region, which include net aridification as snowpack and stream flows decline. Annual turnover, the probability that an occupied sample unit was a newly occupied one, was estimated to be low (∼0.04-0.14), resulting in a flat trend estimated with relatively high precision (SD = 0.04). We mapped the variation in predicted occurrence probabilities and the corresponding prediction uncertainty along the productivity gradient. Our results provide a much-needed baseline against which future anticipated declines in M. lucifugus occurrence can be measured. The dynamic distribution modeling approach has broad applicability to regional bat monitoring efforts now underway in several countries, and we suggest ways to improve and expand our grid-based monitoring program to gain robust insights into bat population status and trend across large portions of North America.

  5. Direct test of the Gaussian auxiliary field ansatz in nonconserved order parameter phase ordering dynamics

    NASA Astrophysics Data System (ADS)

    Yeung, Chuck

    2018-06-01

    The assumption that the local order parameter is related to an underlying spatially smooth auxiliary field, u(r⃗, t), is a common feature in theoretical approaches to non-conserved order parameter phase separation dynamics. In particular, the ansatz that u(r⃗, t) is a Gaussian random field leads to predictions for the decay of the autocorrelation function which are consistent with observations, but distinct from predictions using alternative theoretical approaches. In this paper, the auxiliary field is obtained directly from simulations of the time-dependent Ginzburg-Landau equation in two and three dimensions. The results show that u(r⃗, t) is equivalent to the distance to the nearest interface. In two dimensions, the probability distribution, P(u), is well approximated as Gaussian except for small values of u/L(t), where L(t) is the characteristic length scale of the patterns. The behavior of P(u) in three dimensions is more complicated; the non-Gaussian region for small u/L(t) is much larger than that in two dimensions, but the tails of P(u) begin to approach a Gaussian form at intermediate times. However, at later times, the tails of the probability distribution appear to decay faster than a Gaussian distribution.
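    Since the paper identifies u(r⃗, t) with the distance to the nearest interface, one can extract it from an order-parameter snapshot with a signed Euclidean distance transform, as in the sketch below; a smoothed random field stands in for an actual TDGL solution.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt, gaussian_filter

def auxiliary_field(phi):
    """Signed distance to the nearest interface (the phi = 0 level set),
    standing in for the auxiliary field u(r, t)."""
    inside = phi > 0
    d_pos = distance_transform_edt(inside)    # distance within phi > 0
    d_neg = distance_transform_edt(~inside)   # distance within phi < 0
    return np.where(inside, d_pos, -d_neg)

# Toy snapshot: smoothed Gaussian noise in place of a TDGL field
rng = np.random.default_rng(4)
phi = gaussian_filter(rng.normal(size=(128, 128)), sigma=8)
u = auxiliary_field(phi)
# A histogram of u, scaled by the pattern length scale, approximates P(u)
```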

  6. Balancing exploration and exploitation in population-based sampling improves fragment-based de novo protein structure prediction.

    PubMed

    Simoncini, David; Schiex, Thomas; Zhang, Kam Y J

    2017-05-01

    Conformational search space exploration remains a major bottleneck for protein structure prediction methods. Population-based meta-heuristics typically enable the possibility to control the search dynamics and to tune the balance between local energy minimization and search space exploration. EdaFold is a fragment-based approach that can guide search by periodically updating the probability distribution over the fragment libraries used during model assembly. We implement the EdaFold algorithm as a Rosetta protocol and provide two different probability update policies: a cluster-based variation (EdaRose_c) and an energy-based one (EdaRose_en). We analyze the search dynamics of our new Rosetta protocols and show that EdaRose_c is able to provide predictions with lower Cα RMSD to the native structure than EdaRose_en and the Rosetta AbInitio Relax protocol. Our software is freely available as a C++ patch for the Rosetta suite and can be downloaded from http://www.riken.jp/zhangiru/software/. Our protocols can easily be extended in order to create alternative probability update policies and generate new search dynamics. Proteins 2017; 85:852-858. © 2017 Wiley Periodicals, Inc.

  7. Scalar utility theory and proportional processing: what does it actually imply?

    PubMed Central

    Rosenström, Tom; Wiesner, Karoline; Houston, Alasdair I

    2017-01-01

    Scalar Utility Theory (SUT) is a model used to predict animal and human choice behaviour in the context of reward amount, delay to reward, and variability in these quantities (risk preferences). This article reviews and extends SUT, deriving novel predictions. We show that, contrary to what has been implied in the literature, (1) SUT can predict both risk averse and risk prone behaviour for both reward amounts and delays to reward depending on experimental parameters, (2) SUT implies violations of several concepts of rational behaviour (e.g. it violates strong stochastic transitivity and its equivalents, and leads to probability matching) and (3) SUT can predict, but does not always predict, a linear relationship between risk sensitivity in choices and coefficient of variation in the decision-making experiment. SUT derives from Scalar Expectancy Theory which models uncertainty in behavioural timing using a normal distribution. We show that the above conclusions also hold for other distributions, such as the inverse Gaussian distribution derived from drift-diffusion models. A straightforward way to test the key assumptions of SUT is suggested and possible extensions, future prospects and mechanistic underpinnings are discussed. PMID:27288541

  8. Scalar utility theory and proportional processing: What does it actually imply?

    PubMed

    Rosenström, Tom; Wiesner, Karoline; Houston, Alasdair I

    2016-09-07

    Scalar Utility Theory (SUT) is a model used to predict animal and human choice behaviour in the context of reward amount, delay to reward, and variability in these quantities (risk preferences). This article reviews and extends SUT, deriving novel predictions. We show that, contrary to what has been implied in the literature, (1) SUT can predict both risk averse and risk prone behaviour for both reward amounts and delays to reward depending on experimental parameters, (2) SUT implies violations of several concepts of rational behaviour (e.g. it violates strong stochastic transitivity and its equivalents, and leads to probability matching) and (3) SUT can predict, but does not always predict, a linear relationship between risk sensitivity in choices and coefficient of variation in the decision-making experiment. SUT derives from Scalar Expectancy Theory which models uncertainty in behavioural timing using a normal distribution. We show that the above conclusions also hold for other distributions, such as the inverse Gaussian distribution derived from drift-diffusion models. A straightforward way to test the key assumptions of SUT is suggested and possible extensions, future prospects and mechanistic underpinnings are discussed. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. A multimodal detection model of dolphins to estimate abundance validated by field experiments.

    PubMed

    Akamatsu, Tomonari; Ura, Tamaki; Sugimatsu, Harumi; Bahl, Rajendar; Behera, Sandeep; Panda, Sudarsan; Khan, Muntaz; Kar, S K; Kar, C S; Kimura, Satoko; Sasaki-Yamamoto, Yukiko

    2013-09-01

    Abundance estimation of marine mammals requires matching detections of an animal or a group of animals by two independent means. A multimodal detection model using visual and acoustic cues (surfacing and phonation) that enables abundance estimation of dolphins is proposed. The method does not require a specific time window to match the cues of both means for applying the mark-recapture method. The proposed model was evaluated using data obtained in field observations of Ganges River dolphins and Irrawaddy dolphins, as examples of dispersed and condensed distributions of animals, respectively. The acoustic detection probability was approximately 80%, 20% higher than that of visual detection for both species, regardless of the distribution of the animals in the present study sites. The abundance estimates of Ganges River dolphins and Irrawaddy dolphins agreed fairly well with the numbers reported in previous monitoring studies. The detection probability for single animals was smaller than that for larger cluster sizes, as predicted by the model and confirmed by field data. However, dense groups of Irrawaddy dolphins showed differences in the cluster sizes observed by visual and acoustic methods. The lower detection probability of single clusters of this species seemed to be caused by its clumped distribution.
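    A classical two-sample mark-recapture calculation conveys the core idea of matching independent visual and acoustic detections; the Chapman estimator below is a textbook version, not necessarily the exact model of the paper, and the counts are invented for illustration.

```python
def chapman_abundance(n_visual, n_acoustic, n_both):
    """Chapman's bias-corrected Lincoln-Petersen estimator for abundance
    from two independent detection methods.

    n_visual   : number of clusters detected visually
    n_acoustic : number detected acoustically
    n_both     : number detected (matched) by both means
    """
    N_hat = (n_visual + 1) * (n_acoustic + 1) / (n_both + 1) - 1
    p_visual = n_both / n_acoustic    # detection probability, visual method
    p_acoustic = n_both / n_visual    # detection probability, acoustic method
    return N_hat, p_visual, p_acoustic

# Illustrative counts consistent with ~80% acoustic / ~60% visual detection
print(chapman_abundance(n_visual=30, n_acoustic=40, n_both=24))
# -> abundance near 50, p_visual = 0.6, p_acoustic = 0.8
```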

  10. Transient statistics in stabilizing periodic orbits

    NASA Astrophysics Data System (ADS)

    Meucci, R.; Gadomski, W.; Ciofini, M.; Arecchi, F. T.

    1995-11-01

    The statistics of chaotic and periodic transient time intervals preceding the stabilization of a given periodic orbit have been experimentally studied in a CO2 laser with modulated losses, subjected to a small subharmonic perturbation. As predicted by the theory, an exponential tail has been found in the probability distribution of chaotic transients. Furthermore, a fine periodic structure in the distributions of the periodic transients, resulting from the interaction of the control signal and the local structure of the chaotic attractor, has been revealed.

  11. A Bayesian approach to modeling 2D gravity data using polygon states

    NASA Astrophysics Data System (ADS)

    Titus, W. J.; Titus, S.; Davis, J. R.

    2015-12-01

    We present a Bayesian Markov chain Monte Carlo (MCMC) method for the 2D gravity inversion of a localized subsurface object with constant density contrast. Our models have four parameters: the density contrast, the number of vertices in a polygonal approximation of the object, an upper bound on the ratio of the perimeter squared to the area, and the vertices of a polygon container that bounds the object. Reasonable parameter values can be estimated prior to inversion using a forward model and geologic information. In addition, we assume that the field data have a common random uncertainty that lies between two bounds but no systematic uncertainty. Finally, we assume that there is no uncertainty in the spatial locations of the measurement stations. For any set of model parameters, we use MCMC methods to generate an approximate probability distribution of polygons for the object. We then compute various probability distributions for the object, including the variance between the observed and predicted fields (an important quantity in the MCMC method), the area, the center of area, and the occupancy probability (the probability that a spatial point lies within the object). In addition, we compare probabilities of different models using parallel tempering, a technique which also mitigates trapping in local optima that can occur in certain model geometries. We apply our method to several synthetic data sets generated from objects of varying shape and location. We also analyze a natural data set collected across the Rio Grande Gorge Bridge in New Mexico, where the object (i.e. the air below the bridge) is known and the canyon is approximately 2D. Although there are many ways to view the results, the occupancy probability proves quite powerful. We also find that the choice of the container is important. In particular, large containers should be avoided, because the more closely a container confines the object, the better the predictions match the properties of the object.

  12. Bootstrap Prediction Intervals in Non-Parametric Regression with Applications to Anomaly Detection

    NASA Technical Reports Server (NTRS)

    Kumar, Sricharan; Srivistava, Ashok N.

    2012-01-01

    Prediction intervals provide a measure of the probable interval in which the outputs of a regression model can be expected to occur. Subsequently, these prediction intervals can be used to determine if the observed output is anomalous or not, conditioned on the input. In this paper, a procedure for determining prediction intervals for outputs of nonparametric regression models using bootstrap methods is proposed. Bootstrap methods allow for a non-parametric approach to computing prediction intervals with no specific assumptions about the sampling distribution of the noise or the data. The asymptotic fidelity of the proposed prediction intervals is theoretically proved, and the validity of the bootstrap-based prediction intervals is then illustrated via simulations. Finally, the bootstrap prediction intervals are applied to the problem of anomaly detection on aviation data.
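
    A residual-bootstrap version of this idea can be sketched in a few lines. The smoother, data, and interval level below are illustrative assumptions, not the paper's procedure:

```python
import numpy as np

rng = np.random.default_rng(0)

def knn_smoother(x_train, y_train, x_eval, k=15):
    """Simple nonparametric regressor: k-nearest-neighbour mean."""
    y_hat = np.empty_like(x_eval)
    for i, x0 in enumerate(x_eval):
        idx = np.argsort(np.abs(x_train - x0))[:k]
        y_hat[i] = y_train[idx].mean()
    return y_hat

# Synthetic data standing in for the aviation measurements.
x = np.sort(rng.uniform(0, 10, 300))
y = np.sin(x) + rng.normal(0, 0.3, x.size)

fit = knn_smoother(x, y, x)
resid = y - fit

# Residual bootstrap: refit on resampled pseudo-data, add a resampled noise
# term to each bootstrap prediction, and read off empirical quantiles.
B, x0 = 500, np.array([5.0])
preds = np.empty(B)
for b in range(B):
    y_star = fit + rng.choice(resid, size=resid.size, replace=True)
    preds[b] = knn_smoother(x, y_star, x0)[0] + rng.choice(resid)
lo, hi = np.percentile(preds, [2.5, 97.5])   # ~95% prediction interval
# A new observation at x0 falling outside [lo, hi] would be flagged anomalous.
print(lo, hi)
```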

  13. Dynamic Response of an Optomechanical System to a Stationary Random Excitation in the Time Domain

    DOE PAGES

    Palmer, Jeremy A.; Paez, Thomas L.

    2011-01-01

    Modern electro-optical instruments are typically designed with assemblies of optomechanical members that support optics such that alignment is maintained in service environments that include random vibration loads. This paper presents a nonlinear numerical analysis that calculates statistics for the peak lateral response of optics in an optomechanical sub-assembly subject to random excitation of the housing. The work is unique in that the prior art does not address peak response probability distribution for stationary random vibration in the time domain for a common lens-retainer-housing system with Coulomb damping. Analytical results are validated by using displacement response data from random vibration testing of representative prototype sub-assemblies. A comparison of predictions to experimental results yields reasonable agreement. The Type I Asymptotic form provides the cumulative distribution function for peak response probabilities. Probabilities are calculated for actual lens centration tolerances. The probability that peak response will not exceed the centration tolerance is greater than 80% for prototype configurations where the tolerance is high (on the order of 30 micrometers). Conversely, the probability is low for those where the tolerance is less than 20 micrometers. The analysis suggests a design paradigm based on the influence of lateral stiffness on the magnitude of the response.
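
    To illustrate how the Type I Asymptotic (Gumbel) form yields such probabilities, the sketch below evaluates P(peak response <= tolerance); the location and scale values are invented stand-ins, not the paper's fitted parameters:

```python
import math

def gumbel_cdf(x, mu, beta):
    """Type I Asymptotic (Gumbel) CDF for the peak response."""
    return math.exp(-math.exp(-(x - mu) / beta))

# Hypothetical location/scale for the peak lateral response (micrometers).
mu, beta = 18.0, 4.0
for tol_um in (20.0, 30.0):
    p = gumbel_cdf(tol_um, mu, beta)
    # With these toy parameters the 30 um tolerance is met with p > 0.8,
    # while the 20 um tolerance is not, mirroring the qualitative finding.
    print(f"P(peak response <= {tol_um} um) = {p:.2f}")
```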

  14. The discrete Laplace exponential family and estimation of Y-STR haplotype frequencies.

    PubMed

    Andersen, Mikkel Meyer; Eriksen, Poul Svante; Morling, Niels

    2013-07-21

    Estimating haplotype frequencies is important in e.g. forensic genetics, where the frequencies are needed to calculate the likelihood ratio for the evidential weight of a DNA profile found at a crime scene. Estimation is naturally based on a population model, motivating the investigation of the Fisher-Wright model of evolution for haploid lineage DNA markers. An exponential family (a class of probability distributions that is well understood in probability theory such that inference is easily made by using existing software) called the 'discrete Laplace distribution' is described. We illustrate how well the discrete Laplace distribution approximates a more complicated distribution that arises by investigating the well-known population genetic Fisher-Wright model of evolution by a single-step mutation process. We show how the discrete Laplace distribution can be used to estimate haplotype frequencies for haploid lineage DNA markers (such as Y-chromosomal short tandem repeats), which in turn can be used to assess the evidential weight of a DNA profile found at a crime scene. This is done by making inference in a mixture of multivariate, marginally independent, discrete Laplace distributions, using the EM algorithm to estimate the probabilities of membership of a set of unobserved subpopulations. The discrete Laplace distribution can be used to estimate haplotype frequencies with lower prediction error than other existing estimators, and the calculations can be performed on a normal computer. The method is implemented in the freely available open source software R, which is supported on Linux, MacOS and MS Windows. Copyright © 2013 Elsevier Ltd. All rights reserved.
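
    The distribution itself is simple to write down. A minimal sketch of its pmf and the per-locus factorization (the parameter value is illustrative; the EM-fitted mixture in the paper is more involved):

```python
import numpy as np

def discrete_laplace_pmf(k, p):
    """Discrete Laplace pmf on the integers: P(K=k) = (1-p)/(1+p) * p**|k|."""
    return (1 - p) / (1 + p) * p ** np.abs(k)

# A haplotype's allele at one Y-STR locus, expressed as the repeat-count
# distance k from a (sub)population central haplotype:
k = np.arange(-5, 6)
pmf = discrete_laplace_pmf(k, p=0.3)
print(pmf.sum())  # ~0.999: mass concentrates near the central haplotype

# A multi-locus haplotype frequency is the product of per-locus pmfs under
# the marginal-independence assumption used in the paper.
print(discrete_laplace_pmf(1, 0.3) * discrete_laplace_pmf(-2, 0.4))
```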

  15. Probabilistic Tsunami Hazard Assessment along Nankai Trough (1) An assessment based on the information of the forthcoming earthquake that Earthquake Research Committee(2013) evaluated

    NASA Astrophysics Data System (ADS)

    Hirata, K.; Fujiwara, H.; Nakamura, H.; Osada, M.; Morikawa, N.; Kawai, S.; Ohsumi, T.; Aoi, S.; Yamamoto, N.; Matsuyama, H.; Toyama, N.; Kito, T.; Murashima, Y.; Murata, Y.; Inoue, T.; Saito, R.; Takayama, J.; Akiyama, S.; Korenaga, M.; Abe, Y.; Hashimoto, N.

    2015-12-01

    The Earthquake Research Committee (ERC)/HERP, Government of Japan (2013) revised their long-term evaluation of the forthcoming large earthquake along the Nankai Trough; the next earthquake is estimated to be of M8 to M9 class, and the probability (P30) that it will occur within the next 30 years (from Jan. 1, 2013) is 60% to 70%. In this study, we assess tsunami hazards (maximum coastal tsunami heights) in the near future from the next earthquake along the Nankai Trough, in terms of a probabilistic approach, on the basis of ERC(2013)'s report. The probabilistic tsunami hazard assessment that we applied is as follows: (1) Characterized earthquake fault models (CEFMs) are constructed on each of the 15 hypothetical source areas (HSA) that ERC(2013) defined. The characterization rule follows Toyama et al. (2015, JpGU). As a result, we obtained a total of 1441 CEFMs. (2) We calculate tsunamis due to the CEFMs by solving nonlinear, finite-amplitude, long-wave equations with advection and bottom friction terms by a finite-difference method. Run-up computation on land is included. (3) A time-predictable model predicts that the recurrence interval of the present seismic cycle is T=88.2 years (ERC, 2013). We fix P30 = 67% by applying a renewal process based on the BPT distribution with T and alpha=0.24 as its aperiodicity. (4) We divide the probability P30 into P30(i) for the i-th subgroup, consisting of the earthquakes occurring in each of the 15 HSA, by following a probability re-distribution concept (ERC, 2014). Each earthquake (CEFM) in the i-th subgroup is then assigned a probability P30(i)/N, where N is the number of CEFMs in that sub-group. Note that this re-distribution of the probability is only tentative, because present seismology cannot provide knowledge deep enough to constrain it; an epistemic logic-tree approach may be required in the future. (5) We synthesize tsunami hazard curves at every evaluation point on the coasts by integrating the information about the 30-year occurrence probabilities P30(i) for all earthquakes (CEFMs) and the calculated maximum coastal tsunami heights. In the synthesis, aleatory uncertainties relating to incompleteness of the governing equations, CEFM modeling, bathymetry and topography data, etc., are modeled assuming a log-normal probability distribution. Examples of tsunami hazard curves will be presented.
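
    Step (3) can be reproduced from the published parameters alone. The sketch below evaluates the 30-year conditional probability from a BPT renewal model, treating the elapsed time since the last event as an illustrative input:

```python
from scipy.stats import invgauss

# BPT(T, alpha) is an inverse-Gaussian law with mean T and shape T/alpha**2.
# scipy's invgauss(mu, scale) has mean mu*scale and shape `scale`, hence:
T, alpha = 88.2, 0.24          # recurrence interval and aperiodicity (ERC, 2013)
bpt = invgauss(mu=alpha**2, scale=T / alpha**2)

t_elapsed = 66.0               # years elapsed in the cycle; illustrative value
num = bpt.cdf(t_elapsed + 30.0) - bpt.cdf(t_elapsed)
den = bpt.sf(t_elapsed)        # probability of having survived to t_elapsed
print(f"P30 = {num / den:.2f}")   # comparable to the ~0.67 fixed in the study
```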

  16. Transport of chromium and selenium in the suboxic zone of a shallow aquifer: Influence of redox and adsorption reactions

    USGS Publications Warehouse

    Kent, D.B.; Davis, J.A.; Anderson, L.C.D.; Rea, B.A.; Waite, T.D.

    1994-01-01

    Breakthrough of Cr(VI) (chromate), Se(VI) (selenate), and O2 (dissolved oxygen) was observed in tracer tests conducted in a shallow, sand and gravel aquifer with mildly reducing conditions. Loss of Cr, probably due to reduction of Cr(VI) to Cr(III) and irreversible sorption of Cr(III), occurred along with slight retardation of Cr(VI), owing to reversible sorption. Reduction of Se(VI) and O2 was thermodynamically feasible but did not occur, indicating conditions were unfavorable to microbial reduction. Cr(VI) reduction by constituents of aquifer sediments did not achieve local equilibrium during transport. The reduction rate was probably limited by incomplete contact between Cr(VI) transported along predominant flow paths and reductants located in regions within aquifer sediments of comparatively low permeability. Scatter in the amount of Cr reduction calculated from individual breakthrough curves at identical distances downgradient probably resulted from heterogeneities in the distribution of reductants in the sediments. Predictive modeling of the transport and fate of redox-sensitive solutes cannot be based strictly on thermodynamic considerations; knowledge of reaction rates is critical. Potentially important mass transfer rate limitations between solutes and reactants in sediments, as well as heterogeneities in the distribution of redox properties in aquifers, complicate determination of limiting rates for use in predictive simulations of the transport of redox-sensitive contaminants in groundwater.

  17. On the Prediction of Ground Motion

    NASA Astrophysics Data System (ADS)

    Lavallee, D.; Schmedes, J.; Archuleta, R. J.

    2012-12-01

    Using a slip-weakening dynamic model of rupture, we generated earthquake scenarios that provided the spatio-temporal evolution of the slip on the fault and the radiated field at the free surface. We observed scenarios where the rupture propagates at a supershear speed on some parts of the fault while remaining subshear on other parts. For some scenarios with nearly identical initial conditions, the rupture speed was always subshear. For both types of scenarios (a mixture of supershear and subshear speeds, and subshear only), we compute the peak ground accelerations (PGA) at points regularly distributed over the Earth's surface. We then calculate the probability density functions (PDF) of the PGA. For both types of scenarios, the PDF curves are asymmetrically shaped and asymptotically attenuated according to a power law. This behavior of the PDF is similar to that observed for the PDF curves of PGA recorded during earthquakes. The main difference between scenarios with a supershear rupture speed and scenarios with only subshear rupture speed is the range of PGA values. Based on these results, we investigate three issues fundamental to the prediction of ground motion. It is important to recognize that ground motions recorded during an earthquake sample a small fraction of the radiation field. It is not obvious that such sampling will capture the largest ground motion generated during an earthquake, nor that the number of stations is large enough to properly infer the statistical properties associated with the radiation field. To quantify the effect of under- (or low) sampling of the radiation field, we design three experiments. For a scenario where the rupture speed is only subshear, we construct multiple sets of observations. Each set comprises 100 PGA values randomly selected from all of the PGAs calculated at the Earth's surface. In the first experiment, we evaluate how the distributions of PGA in the sets compare with the distribution of all the PGA, using different statistical tests (e.g. chi-square). This experiment quantifies the likelihood that a random set of PGA can be used to infer the statistical properties of all the PGA. In the second experiment, we fit the PDF of the PGA of every set with probability laws used in the literature to describe the PDF of recorded PGA: the lognormal law, the generalized extreme value law, and the Levy law. For each set, the probability laws are then used to compute the probability of observing a PGA value that would cause "moderate to heavy" potential damage according to the Instrumental Intensity scale developed by the USGS. For each probability law, we compare predictions based on the set with the prediction estimated from all the PGA. This experiment quantifies the reliability and uncertainty in predicting an outcome due to under-sampling of the radiation field. The third experiment uses the same sets and repeats the two investigations above, but this time compares them with a scenario where the rupture has a supershear speed over part of the fault. The objective here is to assess the additional uncertainty in predicting PGA and damage resulting from ruptures that have supershear speeds.
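
    The second experiment reduces to fitting candidate laws to a 100-value PGA set and reading off an exceedance probability. A sketch with synthetic PGAs standing in for the scenario data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Stand-in for one "observation set": 100 PGA values (in g) drawn from a
# heavy-tailed law; the scenario PGAs themselves are not reproduced here.
pga_set = stats.lognorm.rvs(s=0.9, scale=0.08, size=100, random_state=rng)

# Fit two of the candidate laws named in the abstract to the 100-sample set.
ln_params  = stats.lognorm.fit(pga_set, floc=0)
gev_params = stats.genextreme.fit(pga_set)

# USGS Instrumental Intensity VIII ("moderate/heavy" potential damage) is
# commonly tabulated as beginning near 0.34 g peak acceleration.
threshold = 0.34
print("lognormal P(PGA > 0.34 g):", stats.lognorm.sf(threshold, *ln_params))
print("GEV       P(PGA > 0.34 g):", stats.genextreme.sf(threshold, *gev_params))
# Repeating this over many random 100-PGA sets quantifies the spread in the
# predicted exceedance probability caused by undersampling the radiation field.
```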

  18. Novel Radiobiological Gamma Index for Evaluation of 3-Dimensional Predicted Dose Distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sumida, Iori, E-mail: sumida@radonc.med.osaka-u.ac.jp; Yamaguchi, Hajime; Kizaki, Hisao

    2015-07-15

    Purpose: To propose a gamma index-based dose evaluation index that integrates the radiobiological parameters of tumor control probability (TCP) and normal tissue complication probability (NTCP). Methods and Materials: Fifteen prostate and head and neck (H&N) cancer patients received intensity modulated radiation therapy. Before treatment, patient-specific quality assurance was conducted via beam-by-beam analysis, and beam-specific dose error distributions were generated. The predicted 3-dimensional (3D) dose distribution was calculated by back-projection of the relative dose error distribution per beam. A 3D gamma analysis of different organs (prostate: clinical [CTV] and planned target volumes [PTV], rectum, bladder, femoral heads; H&N: gross tumor volume [GTV], CTV, spinal cord, brain stem, both parotids) was performed using predicted and planned dose distributions under 2%/2 mm tolerance, and the physical gamma passing rate was calculated. TCP and NTCP values were calculated for voxels with physical gamma indices (PGI) >1. We propose a new radiobiological gamma index (RGI) to quantify the radiobiological effects of TCP and NTCP and calculate radiobiological gamma passing rates. Results: The mean RGI gamma passing rates for prostate cases were significantly different compared with those of PGI (P<.03–.001). The mean RGI gamma passing rates for H&N cases (except for GTV) were significantly different compared with those of PGI (P<.001). Differences in gamma passing rates between PGI and RGI were due to dose differences between the planned and predicted dose distributions. The radiobiological gamma distribution was visualized to identify areas where the dose was radiobiologically important. Conclusions: RGI was proposed to integrate radiobiological effects into PGI. This index would assist physicians and medical physicists not only in physical evaluations of treatment delivery accuracy, but also in clinical evaluations of the predicted dose distribution.
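
    The physical gamma computation that RGI builds on can be sketched in one dimension (the paper works in 3D with the same 2%/2 mm tolerance); the dose profiles below are toys:

```python
import numpy as np

def gamma_1d(dose_ref, dose_eval, x, dd=0.02, dta=2.0):
    """Simplified 1-D global gamma index: dose-difference criterion dd
    (fraction of max dose) and distance-to-agreement dta (mm)."""
    d_max = dose_ref.max()
    gamma = np.empty_like(dose_ref)
    for i, (xi, di) in enumerate(zip(x, dose_ref)):
        dose_term = (dose_eval - di) / (dd * d_max)
        dist_term = (x - xi) / dta
        gamma[i] = np.sqrt(dose_term**2 + dist_term**2).min()
    return gamma

x = np.arange(0.0, 100.0, 1.0)                      # positions in mm
planned   = np.exp(-((x - 50) / 18.0) ** 2)         # toy planned profile
predicted = 1.02 * np.exp(-((x - 51) / 18.0) ** 2)  # toy predicted profile

g = gamma_1d(planned, predicted, x)
print("physical gamma passing rate:", np.mean(g <= 1.0))
# The proposed RGI reweights failing voxels (gamma > 1) by their TCP/NTCP
# impact instead of treating all dose errors as equally important.
```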

  19. Simulated big sagebrush regeneration supports predicted changes at the trailing and leading edges of distribution shifts

    USGS Publications Warehouse

    Schlaepfer, Daniel R.; Taylor, Kyle A.; Pennington, Victoria E.; Nelson, Kellen N.; Martin, Trace E.; Rottler, Caitlin M.; Lauenroth, William K.; Bradford, John B.

    2015-01-01

    Many semi-arid plant communities in western North America are dominated by big sagebrush. These ecosystems are being reduced in extent and quality due to economic development, invasive species, and climate change. These pervasive modifications have generated concern about the long-term viability of sagebrush habitat and sagebrush-obligate wildlife species (notably greater sage-grouse), highlighting the need for better understanding of the future big sagebrush distribution, particularly at the species' range margins. These leading and trailing edges of potential climate-driven sagebrush distribution shifts are likely to be the areas most sensitive to climate change. We used a process-based regeneration model for big sagebrush, which simulates potential germination and seedling survival in response to climatic and edaphic conditions, and tested expectations about current and future regeneration responses at trailing and leading edges that were previously identified using traditional species distribution models. Our results confirmed expectations of increased probability of regeneration at the leading edge and decreased probability of regeneration at the trailing edge below current levels. Our simulations indicated that soil water dynamics at the leading edge became more similar to the typical seasonal ecohydrological conditions observed within the current range of big sagebrush ecosystems. At the trailing edge, an increased winter and spring dryness represented a departure from conditions typically supportive of big sagebrush. Our results highlighted that minimum and maximum daily temperatures as well as soil water recharge and summer dry periods are important constraints for big sagebrush regeneration. Overall, our results confirmed previous predictions, i.e., we see consistent changes in areas identified as trailing and leading edges; however, we also identified potential local refugia within the trailing edge, mostly at sites at higher elevation. Decreasing regeneration probability at the trailing edge underscores the potential futility of efforts to preserve and/or restore big sagebrush in these areas. Conversely, increasing regeneration probability at the leading edge suggests a growing potential for conflicts in management goals between maintaining existing grasslands by preventing sagebrush expansion versus accepting a shift in plant community composition to sagebrush dominance.

  20. Potential effects of climate change on geographic distribution of the Tertiary relict tree species Davidia involucrata in China

    PubMed Central

    Tang, Cindy Q.; Dong, Yi-Fei; Herrando-Moraira, Sonia; Matsui, Tetsuya; Ohashi, Haruka; He, Long-Yuan; Nakao, Katsuhiro; Tanaka, Nobuyuki; Tomita, Mizuki; Li, Xiao-Shuang; Yan, Hai-Zhong; Peng, Ming-Chun; Hu, Jun; Yang, Ruo-Han; Li, Wang-Jun; Yan, Kai; Hou, Xiuli; Zhang, Zhi-Ying; López-Pujol, Jordi

    2017-01-01

    This study, using species distribution modeling (involving a new approach that allows for uncertainty), predicts the distribution of climatically suitable areas prevailing during the mid-Holocene, the Last Glacial Maximum (LGM), and at present, and estimates the potential formation of new habitats in 2070 of the endangered and rare Tertiary relict tree Davidia involucrata Baill. The results regarding the mid-Holocene and the LGM demonstrate that south-central and southwestern China have been long-term stable refugia, and that the current distribution is limited to the prehistoric refugia. Under six possible future climate scenarios, only some parts of the current range of D. involucrata in the mid-high mountains of south-central and southwestern China would be maintained, while some shift west into higher mountains would occur. Our results show that the predicted suitable area offering high probability (0.5‒1) averages only 29.2% across the models predicted for the future (2070), making D. involucrata highly vulnerable. We assess and propose priority protected areas in light of climate change. The information provided will also be relevant in planning conservation of other paleoendemic species having ecological traits and distribution ranges comparable to those of D. involucrata. PMID:28272437

  1. Distribution of cavity trees in midwestern old-growth and second-growth forests

    Treesearch

    Zhaofei Fan; Stephen R. Shifley; Martin A. Spetich; Frank R. Thompson; David R. Larsen

    2003-01-01

    We used classification and regression tree analysis to determine the primary variables associated with the occurrence of cavity trees and the hierarchical structure among those variables. We applied that information to develop logistic models predicting cavity tree probability as a function of diameter, species group, and decay class. Inventories of cavity abundance in...

  2. Distribution of cavity trees in midwestern old-growth and second-growth forests

    Treesearch

    Zhaofei Fan; Stephen R. Shifley; Martin A. Spetich; Frank R. Thompson, III; David R. Larsen

    2003-01-01

    We used classification and regression tree analysis to determine the primary variables associated with the occurrence of cavity trees and the hierarchical structure among those variables. We applied that information to develop logistic models predicting cavity tree probability as a function of diameter, species group, and decay class. Inventories of cavity abundance in...

  3. Bragg peak prediction from quantitative proton computed tomography using different path estimates

    PubMed Central

    Wang, Dongxu; Mackie, T Rockwell

    2015-01-01

    This paper characterizes the performance of the straight-line path (SLP) and cubic spline path (CSP) as path estimates used in reconstruction of proton computed tomography (pCT). The GEANT4 Monte Carlo simulation toolkit is employed to simulate the imaging phantom and proton projections. SLP, CSP and the most-probable path (MPP) are constructed based on the entrance and exit information of each proton. The physical deviations of SLP, CSP and MPP from the real path are calculated. Using a conditional proton path probability map, the relative probabilities of SLP, CSP and MPP are calculated and compared. The depth dose and Bragg peak are predicted on the pCT images reconstructed using SLP, CSP, and MPP and compared with the simulation result. The root-mean-square physical deviations and the cumulative distribution of the physical deviations show that the performance of CSP is comparable to that of MPP, while SLP is slightly inferior. About 90% of the SLP pixels and 99% of the CSP pixels lie in the 99% relative probability envelope of the MPP. Even at an imaging dose of ~0.1 mGy the proton Bragg peak for a given incoming energy can be predicted on the pCT image reconstructed using SLP, CSP, or MPP with 1 mm accuracy. This study shows that SLP and CSP, like MPP, are adequate path estimates for pCT reconstruction, and therefore can be chosen as the path estimation method for pCT reconstruction, which can aid the treatment planning and range prediction of proton radiation therapy. PMID:21212472

  4. Computer Simulation Results for the Two-Point Probability Function of Composite Media

    NASA Astrophysics Data System (ADS)

    Smith, P.; Torquato, S.

    1988-05-01

    Computer simulation results are reported for the two-point matrix probability function S2 of two-phase random media composed of disks distributed with an arbitrary degree of impenetrability λ. The novel technique employed to sample S2(r) (which gives the probability of finding the endpoints of a line segment of length r in the matrix) is very accurate and has a fast execution time. Results for the limiting cases λ = 0 (fully penetrable disks) and λ = 1 (hard disks), respectively, compare very favorably with theoretical predictions made by Torquato and Beasley and by Torquato and Lado. Results are also reported for several values of λ that lie between these two extremes: cases which heretofore have not been examined.
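
    The sampling idea is compact enough to sketch for the fully penetrable (λ = 0) case; the box size, density, and sample counts below are arbitrary, and edge effects are neglected:

```python
import numpy as np

rng = np.random.default_rng(2)

# Fully penetrable disks (lambda = 0): Poisson-distributed centres in a box.
L, radius, n_disks = 20.0, 0.5, 300
centres = rng.uniform(0, L, size=(n_disks, 2))

def in_matrix(pts):
    """True where a point lies outside every disk (i.e. in the matrix phase)."""
    d2 = ((pts[:, None, :] - centres[None, :, :]) ** 2).sum(-1)
    return (d2 > radius**2).all(axis=1)

def S2(r, n_samples=10_000):
    """P(both endpoints of a random segment of length r lie in the matrix)."""
    p1 = rng.uniform(0, L, size=(n_samples, 2))
    theta = rng.uniform(0, 2 * np.pi, n_samples)
    p2 = (p1 + r * np.column_stack((np.cos(theta), np.sin(theta)))) % L
    return np.mean(in_matrix(p1) & in_matrix(p2))

for r in (0.0, 0.5, 1.0, 2.0):
    print(r, S2(r))
# S2(0) equals the matrix area fraction; for penetrable disks it should be
# close to exp(-rho * pi * radius**2) with rho = n_disks / L**2.
```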

  5. Analytical approach to an integrate-and-fire model with spike-triggered adaptation

    NASA Astrophysics Data System (ADS)

    Schwalger, Tilo; Lindner, Benjamin

    2015-12-01

    The calculation of the steady-state probability density for multidimensional stochastic systems that do not obey detailed balance is a difficult problem. Here we present the analytical derivation of the stationary joint and various marginal probability densities for a stochastic neuron model with adaptation current. Our approach assumes weak noise but is valid for arbitrary adaptation strength and time scale. The theory predicts several effects of adaptation on the statistics of the membrane potential of a tonically firing neuron: (i) a membrane potential distribution with a convex shape, (ii) a strongly increased probability of hyperpolarized membrane potentials induced by strong and fast adaptation, and (iii) a maximized variability associated with the adaptation current at a finite adaptation time scale.

  6. Quantifying Extrinsic Noise in Gene Expression Using the Maximum Entropy Framework

    PubMed Central

    Dixit, Purushottam D.

    2013-01-01

    We present a maximum entropy framework to separate intrinsic and extrinsic contributions to noisy gene expression solely from the profile of expression. We express the experimentally accessible probability distribution of the copy number of the gene product (mRNA or protein) by accounting for possible variations in extrinsic factors. The distribution of extrinsic factors is estimated using the maximum entropy principle. Our results show that extrinsic factors qualitatively and quantitatively affect the probability distribution of the gene product. We work out, in detail, the transcription of mRNA from a constitutively expressed promoter in Escherichia coli. We suggest that the variation in extrinsic factors may account for the observed wider-than-Poisson distribution of mRNA copy numbers. We successfully test our framework on a numerical simulation of a simple gene expression scheme that accounts for the variation in extrinsic factors. We also make falsifiable predictions, some of which are tested on previous experiments in E. coli whereas others need verification. Application of the presented framework to more complex situations is also discussed. PMID:23790383

  7. Quantifying extrinsic noise in gene expression using the maximum entropy framework.

    PubMed

    Dixit, Purushottam D

    2013-06-18

    We present a maximum entropy framework to separate intrinsic and extrinsic contributions to noisy gene expression solely from the profile of expression. We express the experimentally accessible probability distribution of the copy number of the gene product (mRNA or protein) by accounting for possible variations in extrinsic factors. The distribution of extrinsic factors is estimated using the maximum entropy principle. Our results show that extrinsic factors qualitatively and quantitatively affect the probability distribution of the gene product. We work out, in detail, the transcription of mRNA from a constitutively expressed promoter in Escherichia coli. We suggest that the variation in extrinsic factors may account for the observed wider-than-Poisson distribution of mRNA copy numbers. We successfully test our framework on a numerical simulation of a simple gene expression scheme that accounts for the variation in extrinsic factors. We also make falsifiable predictions, some of which are tested on previous experiments in E. coli whereas others need verification. Application of the presented framework to more complex situations is also discussed. Copyright © 2013 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  8. Monte Carlo decision curve analysis using aggregate data.

    PubMed

    Hozo, Iztok; Tsalatsanis, Athanasios; Djulbegovic, Benjamin

    2017-02-01

    Decision curve analysis (DCA) is an increasingly used method for evaluating diagnostic tests and predictive models, but its application requires individual patient data. The Monte Carlo (MC) method can be used to simulate probabilities and outcomes of individual patients and offers an attractive option for the application of DCA. We constructed an MC decision model to simulate individual probabilities of outcomes of interest. These probabilities were contrasted against the threshold probability at which a decision-maker is indifferent between key management strategies: treat all, treat none, or use the predictive model to guide treatment. We compared the results of DCA with MC simulated data against the results of DCA based on actual individual patient data for three decision models published in the literature: (i) statins for primary prevention of cardiovascular disease, (ii) hospice referral for terminally ill patients and (iii) prostate cancer surgery. The results of MC DCA and patient data DCA were identical. To the extent that patient data DCA were used to inform decisions about statin use, referral to hospice or prostate surgery, the results indicate that MC DCA could have also been used. As long as the aggregate parameters on the distribution of the probability of outcomes and treatment effects are accurately described in the published reports, MC DCA will generate results indistinguishable from individual patient data DCA. We provide a simple, easy-to-use model, which can facilitate wider use of DCA and better evaluation of diagnostic tests and predictive models that rely only on aggregate data reported in the literature. © 2017 Stichting European Society for Clinical Investigation Journal Foundation.
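
    The net-benefit bookkeeping behind DCA is short. A sketch with Monte Carlo stand-ins for individual patients, simulated from aggregate parameters (a Beta risk distribution here, chosen only for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)

# Monte Carlo stand-ins for individual patients: simulated event risks drawn
# from an aggregate description (mean risk ~0.2 with some spread).
n = 10_000
p_model = rng.beta(2, 8, n)              # model-predicted risks
outcome = rng.random(n) < p_model        # simulated true outcomes

def net_benefit(pred_risk, outcome, pt):
    """Net benefit of 'treat if predicted risk >= pt' at threshold pt."""
    treat = pred_risk >= pt
    tp = np.mean(treat & outcome)        # true-positive fraction
    fp = np.mean(treat & ~outcome)       # false-positive fraction
    return tp - fp * pt / (1 - pt)

prev = outcome.mean()
for pt in (0.1, 0.2, 0.3):
    nb_model = net_benefit(p_model, outcome, pt)
    nb_all = prev - (1 - prev) * pt / (1 - pt)   # treat-all strategy
    print(pt, round(nb_model, 4), round(nb_all, 4), 0.0)  # model / all / none
```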

  9. The potential effect of global warming on the geographic and seasonal distribution of Phlebotomus papatasi in Southwest Asia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cross, E.R.; Hyams, K.C.

    1996-07-01

    The distribution of Phlebotomus papatasi in Southwest Asia is thought to be highly dependent on temperature and relative humidity. A discriminant analysis model based on weather data and reported vector surveys was developed to predict the seasonal and geographic distribution of P. papatasi in this region. To simulate global warming, temperature values for 115 weather stations were increased by 1°C, 3°C, and 5°C, and the outcome variable coded as unknown in the model. Probability of occurrence values were then predicted for each location with a weather station. Stations with positive probability of occurrence values for May, June, July, and August were considered locations where two or more life cycles of P. papatasi could occur and which could support endemic transmission of leishmaniasis and sandfly fever. Among 115 weather stations, 71 (62%) would be considered endemic with current temperature conditions; 14 (12%) additional stations could become endemic with an increase of 1°C; 17 (15%) more with a 3°C increase; and 12 (10%) more (all but one station) with a 5°C increase. In addition to increased geographic distribution, seasonality of disease transmission could be extended throughout 12 months of the year in 7 (6%) locations with at least a 3°C rise in temperature and in 29 (25%) locations with a 5°C rise.

  10. A fully traits-based approach to modeling global vegetation distribution.

    PubMed

    van Bodegom, Peter M; Douma, Jacob C; Verheijen, Lieneke M

    2014-09-23

    Dynamic Global Vegetation Models (DGVMs) are indispensable for our understanding of climate change impacts. The application of traits in DGVMs is increasingly refined. However, a comprehensive analysis of the direct impacts of trait variation on global vegetation distribution does not yet exist. Here, we present such an analysis as proof of principle. We run regressions of trait observations for leaf mass per area, stem-specific density, and seed mass from a global database against multiple environmental drivers, making use of findings of global trait convergence. This analysis explained up to 52% of the global variation in traits. Global trait maps, generated by coupling the regression equations to gridded soil and climate maps, showed up to orders of magnitude variation in trait values. Subsequently, nine vegetation types were characterized by the trait combinations that they possess using Gaussian mixture density functions. The trait maps were input to these functions to determine global occurrence probabilities for each vegetation type. We prepared vegetation maps, assuming that the most probable (and thus, most suited) vegetation type at each location will be realized. This fully traits-based vegetation map predicted 42% of the observed vegetation distribution correctly. Our results indicate that a major proportion of the predictive ability of DGVMs with respect to vegetation distribution can be attained by three traits alone if traits like stem-specific density and seed mass are included. We envision that our traits-based approach, our observation-driven trait maps, and our vegetation maps may inspire a new generation of powerful traits-based DGVMs.

  11. Modeling of fiber orientation in viscous fluid flow with application to self-compacting concrete

    NASA Astrophysics Data System (ADS)

    Kolařík, Filip; Patzák, Bořek

    2013-10-01

    In recent years, unconventional concrete reinforcement has grown in popularity. Fiber reinforcement in particular is widely used in high performance concretes such as "Self Compacting Concrete" (SCC). The design of advanced tailor-made structures made of SCC can take advantage of anisotropic orientation of the fibers. Tools for predicting fiber orientation can contribute to the design of tailor-made structures and allow the development of casting procedures that achieve the desired fiber distribution and orientation. This paper deals with the development and implementation of a suitable tool for predicting fiber orientation in a fluid, based on knowledge of the velocity field. A statistical approach is employed: fiber orientation is described by a probability distribution of the fiber angle.

  12. Interspecies scaling: predicting volumes, mean residence time and elimination half-life. Some suggestions.

    PubMed

    Mahmood, I

    1998-05-01

    Extrapolation of animal data to assess pharmacokinetic parameters in man is an important tool in drug development. Clearance, volume of distribution and elimination half-life are the three most frequently extrapolated pharmacokinetic parameters. Extensive work has been done to improve the predictive performance of allometric scaling for clearance. In general there is good correlation between body weight and volume, hence volume in man can be predicted with reasonable accuracy from animal data. Besides the volume of distribution in the central compartment (Vc), two other volume terms, the volume of distribution by area (Vbeta) and the volume of distribution at steady state (VdSS), are also extrapolated from animals to man. This report compares the predictive performance of allometric scaling for Vc, Vbeta and VdSS in man from animal data. The relationship between elimination half-life (t(1/2)) and body weight across species results in poor correlation, most probably because of the hybrid nature of this parameter. To predict half-life in man from animal data, an indirect method (CL=VK, where CL is clearance, V is volume and K is the elimination rate constant) has been proposed. This report proposes another indirect method, which uses the mean residence time (MRT). After establishing that MRT can be predicted across species, we used it to predict half-life via the equation MRT = 1.44 x t(1/2). The results of the study indicate that Vc is predicted more accurately than Vbeta and VdSS in man. It should be emphasized that for first-time dosing in man, Vc is a more important pharmacokinetic parameter than Vbeta or VdSS. Furthermore, MRT can be predicted reasonably well for man and can be used for prediction of half-life.
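
    Both proposed steps, an allometric fit of MRT against body weight followed by MRT = 1.44 x t(1/2), are easy to sketch; the animal values below are invented for illustration:

```python
import numpy as np

# Hypothetical animal data: body weight W (kg) and mean residence time (h)
# for mouse, rat, rabbit, and dog.
W   = np.array([0.02, 0.25, 2.5, 12.0])
MRT = np.array([0.9, 1.6, 3.8, 7.5])

# Allometric fit MRT = a * W**b via log-log least squares.
b, log_a = np.polyfit(np.log(W), np.log(MRT), 1)
a = np.exp(log_a)

W_human = 70.0
mrt_human = a * W_human**b
t_half_human = mrt_human / 1.44            # from MRT = 1.44 * t(1/2)
print(f"predicted human MRT ~ {mrt_human:.1f} h, t1/2 ~ {t_half_human:.1f} h")
```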

  13. Distribution and habitat use of red panda in the Chitwan-Annapurna Landscape of Nepal

    PubMed Central

    Sherpa, Peema; Thapa, Gokarna Jung; Kokh, Manish; Lama, Sonam Tashi; Khanal, Kapil; Thapa, Arjun; Jnawali, Shant Raj

    2017-01-01

    In Nepal, the red panda (Ailurus fulgens) has been sparsely studied, although its range covers a wide area. The present study was carried out in the previously untapped Chitwan-Annapurna Landscape (CHAL) situated in central Nepal with an aim to explore current distributional status and identify key habitat use. Extensive field surveys conducted in 10 red panda range districts were used to estimate species distribution by presence-absence occupancy modeling and to predict distribution by presence-only modeling. The presence of red pandas was recorded in five districts: Rasuwa, Nuwakot, Myagdi, Baglung and Dhading. The predictive distribution model indicated that 1,904.44 km2 of potential red panda habitat is available in CHAL, with the protected area covering nearly 41% of the total habitat. The habitat suitability analysis based on the probability of occurrence showed only 16.58% (A = 315.81 km2) of the total potential habitat is highly suitable. Red panda occupancy was estimated to be around 0.0667, indicating nearly 7% (218 km2) of the total habitat is occupied, with an average detection probability of 0.4482±0.377. Based on the habitat use analysis, altogether eight variables including elevation, slope, aspect, proximity to water sources, bamboo abundance, height, cover, and seasonal precipitation were observed to have significant roles in the distribution of red pandas. In addition, 25 tree species were documented from red panda sign plots out of 165 species recorded in the survey area. The most common was Betula utilis, followed by Rhododendron spp. and Abies spectabilis. The extirpation of red pandas in previously reported areas indicates a need for immediate action for the long-term conservation of this species in CHAL. PMID:29020020

  14. Distribution and habitat use of red panda in the Chitwan-Annapurna Landscape of Nepal.

    PubMed

    Bista, Damber; Shrestha, Saroj; Sherpa, Peema; Thapa, Gokarna Jung; Kokh, Manish; Lama, Sonam Tashi; Khanal, Kapil; Thapa, Arjun; Jnawali, Shant Raj

    2017-01-01

    In Nepal, the red panda (Ailurus fulgens) has been sparsely studied, although its range covers a wide area. The present study was carried out in the previously untapped Chitwan-Annapurna Landscape (CHAL) situated in central Nepal with an aim to explore current distributional status and identify key habitat use. Extensive field surveys conducted in 10 red panda range districts were used to estimate species distribution by presence-absence occupancy modeling and to predict distribution by presence-only modeling. The presence of red pandas was recorded in five districts: Rasuwa, Nuwakot, Myagdi, Baglung and Dhading. The predictive distribution model indicated that 1,904.44 km2 of potential red panda habitat is available in CHAL, with the protected area covering nearly 41% of the total habitat. The habitat suitability analysis based on the probability of occurrence showed only 16.58% (A = 315.81 km2) of the total potential habitat is highly suitable. Red panda occupancy was estimated to be around 0.0667, indicating nearly 7% (218 km2) of the total habitat is occupied, with an average detection probability of 0.4482±0.377. Based on the habitat use analysis, altogether eight variables including elevation, slope, aspect, proximity to water sources, bamboo abundance, height, cover, and seasonal precipitation were observed to have significant roles in the distribution of red pandas. In addition, 25 tree species were documented from red panda sign plots out of 165 species recorded in the survey area. The most common was Betula utilis, followed by Rhododendron spp. and Abies spectabilis. The extirpation of red pandas in previously reported areas indicates a need for immediate action for the long-term conservation of this species in CHAL.

  15. Cell-size distribution in epithelial tissue formation and homeostasis

    PubMed Central

    Primo, Luca; Celani, Antonio

    2017-01-01

    How cell growth and proliferation are orchestrated in living tissues to achieve a given biological function is a central problem in biology. During development, tissue regeneration and homeostasis, cell proliferation must be coordinated by spatial cues in order for cells to attain the correct size and shape. Biological tissues also feature a notable homogeneity of cell size, which, in specific cases, represents a physiological need. Here, we study the temporal evolution of the cell-size distribution by applying the theory of kinetic fragmentation to tissue development and homeostasis. Our theory predicts a self-similar probability density function (PDF) of cell size and explains how division times and redistribution ensure cell-size homogeneity across the tissue. Theoretical predictions and numerical simulations of confluent non-homeostatic tissue cultures show that the cell-size distribution is self-similar. Our experimental data confirm predictions and reveal that, as assumed in the theory, cell division times scale like a power-law of the cell size. We find that in homeostatic conditions there is a stationary distribution with lognormal tails, consistent with our experimental data. Our theoretical predictions and numerical simulations show that the shape of the PDF depends on how the space inherited by apoptotic cells is redistributed and that apoptotic cell rates might also depend on size. PMID:28330988

  16. Cell-size distribution in epithelial tissue formation and homeostasis.

    PubMed

    Puliafito, Alberto; Primo, Luca; Celani, Antonio

    2017-03-01

    How cell growth and proliferation are orchestrated in living tissues to achieve a given biological function is a central problem in biology. During development, tissue regeneration and homeostasis, cell proliferation must be coordinated by spatial cues in order for cells to attain the correct size and shape. Biological tissues also feature a notable homogeneity of cell size, which, in specific cases, represents a physiological need. Here, we study the temporal evolution of the cell-size distribution by applying the theory of kinetic fragmentation to tissue development and homeostasis. Our theory predicts a self-similar probability density function (PDF) of cell size and explains how division times and redistribution ensure cell-size homogeneity across the tissue. Theoretical predictions and numerical simulations of confluent non-homeostatic tissue cultures show that the cell-size distribution is self-similar. Our experimental data confirm predictions and reveal that, as assumed in the theory, cell division times scale like a power-law of the cell size. We find that in homeostatic conditions there is a stationary distribution with lognormal tails, consistent with our experimental data. Our theoretical predictions and numerical simulations show that the shape of the PDF depends on how the space inherited by apoptotic cells is redistributed and that apoptotic cell rates might also depend on size. © 2017 The Author(s).

  17. Predicting the geographical distribution of two invasive termite species from occurrence data.

    PubMed

    Tonini, Francesco; Divino, Fabio; Lasinio, Giovanna Jona; Hochmair, Hartwig H; Scheffrahn, Rudolf H

    2014-10-01

    Predicting the potential habitat of species under both current and future climate change scenarios is crucial for monitoring invasive species and understanding a species' response to different environmental conditions. Frequently, the only data available on a species is the location of its occurrence (presence-only data). Using occurrence records only, two models were used to predict the geographical distribution of two destructive invasive termite species, Coptotermes gestroi (Wasmann) and Coptotermes formosanus Shiraki. The first model uses a Bayesian linear logistic regression approach adjusted for presence-only data, while the second is the widely used maximum entropy approach (Maxent). Results show that the predicted distributions of both C. gestroi and C. formosanus are strongly linked to urban development. The impact of future scenarios such as climate warming and population growth on the biotic distribution of both termite species was also assessed. Future climate warming seems to affect their projected probability of presence to a lesser extent than population growth. The Bayesian logistic approach outperformed Maxent consistently in all models according to evaluation criteria such as model sensitivity and ecological realism. Further studies are suggested for an explicit treatment of residual spatial autocorrelation and a more comprehensive comparison between the two statistical approaches.

  18. Resource selection models are useful in predicting fine-scale distributions of black-footed ferrets in prairie dog colonies

    USGS Publications Warehouse

    Eads, David A.; Jachowski, David S.; Biggins, Dean E.; Livieri, Travis M.; Matchett, Marc R.; Millspaugh, Joshua J.

    2012-01-01

    Wildlife-habitat relationships are often conceptualized as resource selection functions (RSFs)—models increasingly used to estimate species distributions and prioritize habitat conservation. We evaluated the predictive capabilities of 2 black-footed ferret (Mustela nigripes) RSFs developed on a 452-ha colony of black-tailed prairie dogs (Cynomys ludovicianus) in the Conata Basin, South Dakota. We used the RSFs to project the relative probability of occurrence of ferrets throughout an adjacent 227-ha colony. We evaluated performance of the RSFs using ferret space use data collected via postbreeding spotlight surveys June–October 2005–2006. In home ranges and core areas, ferrets selected the predicted "very high" and "high" occurrence categories of both RSFs. Count metrics also suggested selection of these categories; for each model in each year, approximately 81% of ferret locations occurred in areas of very high or high predicted occurrence. These results suggest usefulness of the RSFs in estimating the distribution of ferrets throughout a black-tailed prairie dog colony. The RSFs provide a fine-scale habitat assessment for ferrets that can be used to prioritize releases of ferrets and habitat restoration for prairie dogs and ferrets. A method to quickly inventory the distribution of prairie dog burrow openings would greatly facilitate application of the RSFs.

  19. Activated recombinative desorption: A potential component in mechanisms of spacecraft glow

    NASA Technical Reports Server (NTRS)

    Cross, J. B.

    1985-01-01

    The concept of activated recombination of atomic species on surfaces can explain the production of vibrationally and translationally excited desorbed molecular species. Equilibrium statistical mechanics predicts that the molecular quantum state distributions of desorbing molecules are a function of surface temperature only when the adsorption probability is unity and independent of initial collision conditions. In most cases, the adsorption probability depends on initial conditions such as the collision energy or internal quantum state distribution of the impinging molecules. By detailed balance, such dynamical behavior is reflected in the internal quantum state distribution of the desorbing molecule. This concept, activated recombinative desorption, may offer a common thread in proposed mechanisms of spacecraft glow. Using molecular beam techniques and equipment available at Los Alamos, which includes a high translational energy O-atom beam source, mass spectrometric detection of desorbed species, chemiluminescence/laser induced fluorescence detection of electronic and vibrationally excited reaction products, and Auger detection of surface-adsorbed reaction products, a fundamental study of the gas-surface chemistry underlying the glow process is proposed.

  20. Post-glacial redistribution and shifts in productivity of giant kelp forests

    PubMed Central

    Graham, Michael H.; Kinlan, Brian P.; Grosberg, Richard K.

    2010-01-01

    Quaternary glacial–interglacial cycles create lasting biogeographic, demographic and genetic effects on ecosystems, yet the ecological effects of ice ages on benthic marine communities are unknown. We analysed long-term datasets to develop a niche-based model of southern Californian giant kelp (Macrocystis pyrifera) forest distribution as a function of oceanography and geomorphology, and synthesized palaeo-oceanographic records to show that late Quaternary climate change probably drove high millennial variability in the distribution and productivity of this foundation species. Our predictions suggest that kelp forest biomass increased up to threefold from the glacial maximum to the mid-Holocene, then rapidly declined by 40–70 per cent to present levels. The peak in kelp forest productivity would have coincided with the earliest coastal archaeological sites in the New World. Similar late Quaternary changes in kelp forest distribution and productivity probably occurred in coastal upwelling systems along active continental margins worldwide, which would have resulted in complex shifts in the relative productivity of terrestrial and marine components of coastal ecosystems. PMID:19846450

  1. Statistical Orbit Determination using the Particle Filter for Incorporating Non-Gaussian Uncertainties

    NASA Technical Reports Server (NTRS)

    Mashiku, Alinda; Garrison, James L.; Carpenter, J. Russell

    2012-01-01

    The tracking of space objects requires frequent and accurate monitoring for collision avoidance. As even collision events with very low probability are important, accurate prediction of collisions requires the representation of the full probability density function (PDF) of the random orbit state. By representing the full PDF of the orbit state for orbit maintenance and collision avoidance, we can take advantage of the statistical information present in heavy-tailed distributions, more accurately representing the orbit states with low probability. The classical methods of orbit determination (i.e. the Kalman Filter and its derivatives) provide state estimates based on only the second moments of the state and measurement errors, which are captured by assuming a Gaussian distribution. Although the measurement errors can be accurately assumed to have a Gaussian distribution, errors with a non-Gaussian distribution could arise during propagation between observations. Moreover, unmodeled dynamics in the orbit model could introduce non-Gaussian errors into the process noise. A Particle Filter (PF) is proposed as a nonlinear filtering technique that is capable of propagating and estimating a more complete representation of the state distribution as an accurate approximation of the full PDF. The PF uses Monte Carlo runs to generate particles that approximate the full PDF representation. The PF is applied in the estimation and propagation of a highly eccentric orbit and the results are compared to the Extended Kalman Filter and Splitting Gaussian Mixture algorithms to demonstrate its proficiency.
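
    A bootstrap particle filter is compact enough to sketch on a toy one-dimensional system; replacing f and h with orbital dynamics and tracking measurements gives the paper's setting. This is an illustration, not the authors' implementation:

```python
import numpy as np

rng = np.random.default_rng(4)

def f(x):  return 0.5 * x + 25 * x / (1 + x**2)   # nonlinear process model
def h(x):  return x**2 / 20                        # nonlinear measurement model

n_steps, n_particles, q, r = 50, 1000, 1.0, 1.0    # noise std devs q, r
x_true = 0.1
particles = rng.normal(0, 2, n_particles)
estimates = []
for _ in range(n_steps):
    x_true = f(x_true) + rng.normal(0, q)
    z = h(x_true) + rng.normal(0, r)
    # 1) propagate particles through the (possibly non-Gaussian) dynamics
    particles = f(particles) + rng.normal(0, q, n_particles)
    # 2) weight by the measurement likelihood
    w = np.exp(-0.5 * (z - h(particles))**2 / r**2)
    w /= w.sum()
    # 3) resample to avoid weight degeneracy (systematic resampling is
    #    common; multinomial keeps this sketch short)
    particles = rng.choice(particles, size=n_particles, p=w)
    estimates.append(particles.mean())   # any summary of the PDF is available

print(estimates[-1], x_true)
```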

  2. Scale relativity and quantization of planet obliquities.

    NASA Astrophysics Data System (ADS)

    Nottale, L.

    1998-07-01

    The author applies the theory of scale relativity to the equations of rotational motion of solid bodies. He predicts in the new framework that the obliquities and inclinations of planets and satellites in the solar system must be quantized. Namely, one expects their distribution to be no longer uniform between 0 and π, but instead to display well-defined peaks of probability density at angles θk = kπ/n. The author shows in the present paper that the observational data agree very well with the prediction for n = 7, including the retrograde bodies and those which are heeled over the ecliptic plane. In particular, the value 23°27' of the obliquity of the Earth, which partly determines its climate, is not a random one, but lies in one of the main probability peaks at θ = π/7.

  3. Pseudochemotaxis in inhomogeneous active Brownian systems

    NASA Astrophysics Data System (ADS)

    Vuijk, Hidde D.; Sharma, Abhinav; Mondal, Debasish; Sommer, Jens-Uwe; Merlitz, Holger

    2018-04-01

    We study dynamical properties of confined, self-propelled Brownian particles in an inhomogeneous activity profile. Using Brownian dynamics simulations, we calculate the probability that an active particle reaches a fixed target and its mean first passage time to the target. We show that both these quantities are strongly influenced by the inhomogeneous activity. When the activity is distributed such that a high-activity zone is located between the target and the starting location, the target-finding probability is increased and the passage time is decreased in comparison to a uniformly active system. Moreover, for a continuously varying activity profile, the activity gradient results in a drift of the active particle up the gradient, bearing resemblance to chemotaxis. Integrating out the orientational degrees of freedom, we derive an approximate Fokker-Planck equation and show that the theoretical predictions are in very good agreement with the Brownian dynamics simulations.
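
    The overdamped active-Brownian-particle dynamics described here can be integrated directly. The sketch below uses an illustrative stripe-shaped activity profile, target, and parameters (not the paper's) to compare target-finding rates:

```python
import numpy as np

rng = np.random.default_rng(7)

# 2-D active Brownian particle with space-dependent self-propulsion v(x):
#   dx = v(x) e(theta) dt + sqrt(2 D) dW,   dtheta = sqrt(2 Dr) dW_r
def reaches_target(v_profile, n_steps=5000, dt=0.002, D=0.05, Dr=1.0):
    x = np.array([0.0, 0.0])                        # starting location
    theta = rng.uniform(0, 2 * np.pi)
    target = np.array([2.0, 0.0])
    for _ in range(n_steps):
        e = np.array([np.cos(theta), np.sin(theta)])
        x += v_profile(x) * e * dt + np.sqrt(2 * D * dt) * rng.normal(size=2)
        theta += np.sqrt(2 * Dr * dt) * rng.normal()
        if np.linalg.norm(x - target) < 0.2:        # target hit
            return True
    return False

# High-activity stripe between the start (0,0) and the target (2,0):
stripe  = lambda x: 5.0 if 0.5 < x[0] < 1.5 else 1.0
uniform = lambda x: 1.0
print("stripe :", np.mean([reaches_target(stripe) for _ in range(100)]))
print("uniform:", np.mean([reaches_target(uniform) for _ in range(100)]))
# Placing the high-activity zone en route should raise the hit rate, in line
# with the pseudochemotaxis effect reported in the abstract.
```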

  4. Quantum Theory of Wormholes

    NASA Astrophysics Data System (ADS)

    González-Díaz, Pedro F.

    We re-explore the effects of multiply-connected wormholes on ordinary matter at low energies. We find that the path integral that describes these effects is given in terms of a Planckian probability distribution for the Coleman α-parameters, rather than a classical Gaussian distribution law. This implies that the path integral over all low-energy fields with the wormhole effective interactions can no longer vary continuously, and that the quantities α² are interpretable as the momenta of a quantum field. Using the new result that the Euclidean action must equal negative entropy, rather than being given in terms of the Coleman-Hawking probability, the model predicts a very small but still nonzero cosmological constant and quite reasonable values for the pion and neutrino masses. The divergence problems of Euclidean quantum gravity are also discussed in the light of the above results.

  5. Safe leads and lead changes in competitive team sports.

    PubMed

    Clauset, A; Kogan, M; Redner, S

    2015-06-01

    We investigate the time evolution of lead changes within individual games of competitive team sports. Exploiting ideas from the theory of random walks, we show that the number of lead changes within a single game follows a Gaussian distribution, and that the time of the last lead change and the time of the largest lead are governed by the same arcsine law, a bimodal distribution that diverges at the start and at the end of the game. We also determine the probability that a given lead is "safe" as a function of its size L and game time t. Our predictions generally agree with comprehensive data on more than 1.25 million scoring events in roughly 40,000 games across four professional or semiprofessional team sports, and are more accurate than popular heuristics currently used in sports analytics.
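
    The arcsine prediction can be checked numerically. Below is a minimal sketch assuming the paper's idealization of the score difference as a symmetric random walk (the step count and bin width are arbitrary choices): simulate many games, record the fraction of game time at which the lead last changes, and compare the histogram with the arcsine density 1/(π√(t(1-t))):

        import numpy as np

        rng = np.random.default_rng(1)
        steps, games = 1000, 20000
        last_change = np.empty(games)

        for g in range(games):
            walk = np.cumsum(rng.choice([-1, 1], size=steps))
            sign_changes = np.nonzero(np.diff(np.sign(walk)))[0]
            last_change[g] = sign_changes[-1] / steps if sign_changes.size else 0.0

        hist, edges = np.histogram(last_change, bins=10, range=(0, 1), density=True)
        mid = 0.5 * (edges[:-1] + edges[1:])
        arcsine = 1.0 / (np.pi * np.sqrt(mid * (1 - mid)))
        for m, h, a in zip(mid, hist, arcsine):
            print(f"t={m:.2f}  simulated={h:.2f}  arcsine={a:.2f}")

    The bimodal shape emerges directly: the last lead change tends to happen either very early or very late in the game.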

  6. Safe leads and lead changes in competitive team sports

    NASA Astrophysics Data System (ADS)

    Clauset, A.; Kogan, M.; Redner, S.

    2015-06-01

    We investigate the time evolution of lead changes within individual games of competitive team sports. Exploiting ideas from the theory of random walks, we show that the number of lead changes within a single game follows a Gaussian distribution, and that the time of the last lead change and the time of the largest lead are governed by the same arcsine law, a bimodal distribution that diverges at the start and at the end of the game. We also determine the probability that a given lead is "safe" as a function of its size L and game time t. Our predictions generally agree with comprehensive data on more than 1.25 million scoring events in roughly 40,000 games across four professional or semiprofessional team sports, and are more accurate than popular heuristics currently used in sports analytics.

  7. Brook trout distributional response to unconventional oil and gas development: Landscape context matters

    USGS Publications Warehouse

    Merriam, Eric R.; Petty, J. Todd; Maloney, Kelly O.; Young, John A.; Faulkner, Stephen; Slonecker, Terry; Milheim, Lesley E.; Hailegiorgis, Atesmachew; Niles, Jonathan M.

    2018-01-01

    We conducted a large-scale assessment of unconventional oil and gas (UOG) development effects on brook trout (Salvelinus fontinalis) distribution. We compiled 2231 brook trout collection records from the Upper Susquehanna River Watershed, USA. We used boosted regression tree (BRT) analysis to predict occurrence probability at the 1:24,000 stream-segment scale as a function of natural and anthropogenic landscape and climatic attributes. We then evaluated the importance of landscape context (i.e., pre-existing natural habitat quality and anthropogenic degradation) in modulating the effects of UOG on brook trout distribution under UOG development scenarios. BRT made use of 5 anthropogenic (28% relative influence) and 7 natural (72% relative influence) variables to model occurrence with a high degree of accuracy [Area Under the Receiver Operating Curve (AUC) = 0.85 and cross-validated AUC = 0.81]. UOG development impacted 11% (n = 2784) of streams and resulted in a loss of predicted occurrence in 126 (4%). Most streams impacted by UOG had unsuitable underlying natural habitat quality (n = 1220; 44%). Brook trout were predicted to be absent from an additional 26% (n = 733) of streams due to pre-existing non-UOG land uses (i.e., agriculture, residential and commercial development, or historic mining). Streams with a predicted and observed (via existing pre- and post-disturbance fish sampling records) loss of occurrence due to UOG tended to have intermediate natural habitat quality and/or intermediate levels of non-UOG stress. Simulated development of permitted but undeveloped UOG wells (n = 943) resulted in a loss of predicted occurrence in 27 additional streams. Loss of occurrence was strongly dependent upon landscape context, suggesting effects of current and future UOG development are likely most relevant in streams near the probability threshold due to pre-existing habitat degradation.
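
    As an illustration of the modeling step, scikit-learn's GradientBoostingClassifier can serve as a stand-in for a BRT occurrence model. The predictors, coefficients, and data below are entirely synthetic, not the watershed data used in the study:

        import numpy as np
        from sklearn.ensemble import GradientBoostingClassifier
        from sklearn.metrics import roc_auc_score
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(2)
        n = 2000
        # hypothetical stream-segment predictors: natural and anthropogenic
        X = np.column_stack([
            rng.normal(10, 3, n),    # mean summer temperature (natural)
            rng.uniform(0, 1, n),    # forest cover fraction (natural)
            rng.uniform(0, 1, n),    # UOG well density, scaled (anthropogenic)
        ])
        logit = 1.5 - 0.3 * (X[:, 0] - 10) + 2.0 * X[:, 1] - 1.5 * X[:, 2]
        y = rng.random(n) < 1 / (1 + np.exp(-logit))

        Xtr, Xte, ytr, yte = train_test_split(X, y, random_state=0)
        brt = GradientBoostingClassifier(n_estimators=500, learning_rate=0.01,
                                         max_depth=3).fit(Xtr, ytr)
        p = brt.predict_proba(Xte)[:, 1]   # predicted occurrence probability
        print("AUC:", roc_auc_score(yte, p))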

  8. A bioinformatic survey of distribution, conservation, and probable functions of LuxR solo regulators in bacteria.

    PubMed

    Subramoni, Sujatha; Florez Salcedo, Diana Vanessa; Suarez-Moreno, Zulma R

    2015-01-01

    LuxR solo transcriptional regulators contain both an autoinducer-binding domain (ABD; N-terminal) and a DNA-binding helix-turn-helix domain (HTH; C-terminal), but are not associated with a cognate N-acyl homoserine lactone (AHL) synthase coding gene in the same genome. Although a few LuxR solos have been characterized, their distribution as well as their role in bacterial signal perception and other processes are poorly understood. In this study we have carried out a systematic survey of the distribution of all ABD-containing LuxR transcriptional regulators (QS domain LuxRs) available in the InterPro database (IPR005143), and identified those lacking a cognate AHL synthase. These LuxR solos were then analyzed with regard to their taxonomic distribution, the predicted functions of neighboring genes, and the presence of complete AHL-QS systems in the genomes that carry them. Our analyses reveal the presence of one or multiple predicted LuxR solos in many proteobacterial genomes carrying QS domain LuxRs, some of them harboring genes for one or more AHL-QS circuits. The presence of LuxR solos in bacteria occupying diverse environments suggests potential ecological functions for these proteins beyond AHL and interkingdom signaling. Based on gene context and the conservation levels of invariant amino acids of the ABD, we have classified LuxR solos into functionally meaningful groups or putative orthologs. Surprisingly, putative LuxR solos were also found in a few non-proteobacterial genomes not known to carry AHL-QS systems. Multiple predicted LuxR solos in the same genome appeared to have different levels of conservation of the invariant amino acid residues of the ABD, calling their binding to AHLs into question. In summary, this study provides a detailed overview of the distribution of LuxR solos and their probable roles in bacteria with genome sequence information.

  9. A bioinformatic survey of distribution, conservation, and probable functions of LuxR solo regulators in bacteria

    PubMed Central

    Subramoni, Sujatha; Florez Salcedo, Diana Vanessa; Suarez-Moreno, Zulma R.

    2015-01-01

    LuxR solo transcriptional regulators contain both an autoinducer-binding domain (ABD; N-terminal) and a DNA-binding helix-turn-helix domain (HTH; C-terminal), but are not associated with a cognate N-acyl homoserine lactone (AHL) synthase coding gene in the same genome. Although a few LuxR solos have been characterized, their distribution as well as their role in bacterial signal perception and other processes are poorly understood. In this study we have carried out a systematic survey of the distribution of all ABD-containing LuxR transcriptional regulators (QS domain LuxRs) available in the InterPro database (IPR005143), and identified those lacking a cognate AHL synthase. These LuxR solos were then analyzed with regard to their taxonomic distribution, the predicted functions of neighboring genes, and the presence of complete AHL-QS systems in the genomes that carry them. Our analyses reveal the presence of one or multiple predicted LuxR solos in many proteobacterial genomes carrying QS domain LuxRs, some of them harboring genes for one or more AHL-QS circuits. The presence of LuxR solos in bacteria occupying diverse environments suggests potential ecological functions for these proteins beyond AHL and interkingdom signaling. Based on gene context and the conservation levels of invariant amino acids of the ABD, we have classified LuxR solos into functionally meaningful groups or putative orthologs. Surprisingly, putative LuxR solos were also found in a few non-proteobacterial genomes not known to carry AHL-QS systems. Multiple predicted LuxR solos in the same genome appeared to have different levels of conservation of the invariant amino acid residues of the ABD, calling their binding to AHLs into question. In summary, this study provides a detailed overview of the distribution of LuxR solos and their probable roles in bacteria with genome sequence information. PMID:25759807

  10. Neural response to reward anticipation under risk is nonlinear in probabilities.

    PubMed

    Hsu, Ming; Krajbich, Ian; Zhao, Chen; Camerer, Colin F

    2009-02-18

    A widely observed phenomenon in decision making under risk is the apparent overweighting of unlikely events and the underweighting of nearly certain events. This violates standard assumptions in expected utility theory, which requires that expected utility be linear (objective) in probabilities. Models such as prospect theory have relaxed this assumption and introduced the notion of a "probability weighting function," which captures the key properties found in experimental data. This study reports functional magnetic resonance imaging (fMRI) data showing that neural response to expected reward is nonlinear in probabilities. Specifically, we found that activity in the striatum during valuation of monetary gambles is nonlinear in probabilities in the pattern predicted by prospect theory, suggesting that probability distortion is reflected at the level of the reward encoding process. The degree of nonlinearity reflected in individual subjects' decisions is also correlated with striatal activity across subjects. Our results shed light on the neural mechanisms of reward processing, and have implications for future neuroscientific studies of decision making involving extreme tails of the distribution, where probability weighting provides an explanation for commonly observed behavioral anomalies.
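
    For readers unfamiliar with probability weighting, the one-parameter Tversky-Kahneman form is a common concrete choice; the functional form and the γ value here are illustrative assumptions, not necessarily the form fitted in this study:

        import numpy as np

        def w(p, gamma=0.65):
            # Tversky-Kahneman one-parameter probability weighting function
            return p**gamma / (p**gamma + (1 - p)**gamma) ** (1 / gamma)

        for p in [0.01, 0.10, 0.40, 0.60, 0.90, 0.99]:
            print(f"p={p:.2f}  w(p)={w(p):.3f}")
        # small p are overweighted (w(p) > p); near-certain p are underweighted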

  11. Data Applicability of Heritage and New Hardware For Launch Vehicle Reliability Models

    NASA Technical Reports Server (NTRS)

    Al Hassan, Mohammad; Novack, Steven

    2015-01-01

    Bayesian reliability requires the development of a prior distribution to represent degree of belief about the value of a parameter (such as a component's failure rate) before system specific data become available from testing or operations. Generic failure data are often provided in reliability databases as point estimates (mean or median). A component's failure rate is considered a random variable where all possible values are represented by a probability distribution. The applicability of the generic data source is a significant source of uncertainty that affects the spread of the distribution. This presentation discusses heuristic guidelines for quantifying uncertainty due to generic data applicability when developing prior distributions mainly from reliability predictions.
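
    One common heuristic of this kind treats the generic point estimate as the median of a lognormal prior and expresses applicability uncertainty through an error factor, the ratio of the 95th to the 50th percentile. A minimal sketch with an assumed error factor; the numbers are illustrative, not from the presentation:

        import numpy as np
        from scipy import stats

        median = 1e-5        # generic failure-rate point estimate (per hour)
        error_factor = 10.0  # 95th/50th percentile ratio; wider EF = less applicable data

        sigma = np.log(error_factor) / 1.645   # ln(EF) = 1.645 * sigma for a lognormal
        prior = stats.lognorm(s=sigma, scale=median)

        print("5th: ", prior.ppf(0.05))
        print("50th:", prior.ppf(0.50))
        print("95th:", prior.ppf(0.95))
        print("mean:", prior.mean())   # the spread pushes the mean above the median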

  12. Deriving the species richness distribution of Geotrupinae (Coleoptera: Scarabaeoidea) in Mexico from the overlap of individual model predictions.

    PubMed

    Trotta-Moreu, Nuria; Lobo, Jorge M

    2010-02-01

    Predictions from individual distribution models for Mexican Geotrupinae species were overlaid to obtain a total species richness map for this group. A database (GEOMEX) that compiles available information from the literature and from several entomological collections was used. A Maximum Entropy method (MaxEnt) was applied to estimate the distribution of each species, taking into account 19 climatic variables as predictors. For each species, suitability values ranging from 0 to 100 were calculated for each grid cell on the map, and 21 different thresholds were used to convert these continuous suitability values into binary ones (presence-absence). By summing all of the individual binary maps, we generated a species richness prediction for each of the considered thresholds. The number of species and faunal composition thus predicted for each Mexican state were subsequently compared with those observed in a preselected set of well-surveyed states. Our results indicate that the sum of individual predictions tends to overestimate species richness but that the selection of an appropriate threshold can reduce this bias. Even under the most optimistic prediction threshold, the mean species richness error is 61% of the observed species richness, with commission errors being significantly more common than omission errors (71 ± 29% versus 18 ± 10%). The estimated distribution of Geotrupinae species richness in Mexico is discussed, although our conclusions are preliminary and contingent on the scarce and probably biased available data.

  13. DBH Prediction Using Allometry Described by Bivariate Copula Distribution

    NASA Astrophysics Data System (ADS)

    Xu, Q.; Hou, Z.; Li, B.; Greenberg, J. A.

    2017-12-01

    Forest biomass mapping based on single tree detection from airborne laser scanning (ALS) usually depends on an allometric equation that relates diameter at breast height (DBH) to per-tree aboveground biomass. The incapability of the ALS technology to directly measure DBH leads to the need to predict DBH from other ALS-measured tree-level structural parameters. A copula-based method is proposed in this study to predict DBH from the ALS-measured tree height and crown diameter, using a dataset measured in the Lassen National Forest in California. Instead of exploring an explicit mathematical equation that explains the underlying relationship between DBH and other structural parameters, the copula-based prediction method utilizes the dependency between the cumulative distributions of these variables, and solves for DBH based on the assumption that, for a single tree, the cumulative probability of each structural parameter is identical. Results show that, compared with the benchmark least-squares linear regression and k-MSN imputation, the copula-based method achieves better DBH accuracy for the Lassen National Forest. To assess the generalization of the proposed method, prediction uncertainty is quantified using bootstrapping techniques that examine the variability of the RMSE of the predicted DBH. We find that the copula distribution is reliable in describing the allometric relationship between tree-level structural parameters, and that it contributes to the reduction of prediction uncertainty.
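
    The identical-cumulative-probability assumption can be sketched with empirical CDFs: map a measured height to its cumulative probability, then invert the DBH distribution at the same quantile. This comonotonic shortcut omits the actual copula fit, and the training data below are synthetic:

        import numpy as np

        rng = np.random.default_rng(3)
        # hypothetical field-measured training data (height in m, DBH in cm)
        height_train = rng.gamma(9, 2.5, 400)
        dbh_train = 2.0 * height_train + rng.normal(0, 4, 400)

        def predict_dbh(height_new):
            # empirical CDF of height evaluated at the new observation
            u = (np.searchsorted(np.sort(height_train), height_new)
                 / len(height_train))
            # identical cumulative probability: invert the empirical DBH CDF
            return np.quantile(dbh_train, np.clip(u, 0.0, 1.0))

        for h in [15.0, 25.0, 35.0]:
            print(f"height {h:.0f} m -> predicted DBH {predict_dbh(h):.1f} cm")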

  14. Ecology, distribution, and predictive occurrence modeling of Palmer's chipmunk (Tamias palmeri): a high-elevation small mammal endemic to the Spring Mountains in southern Nevada, USA

    USGS Publications Warehouse

    Lowrey, Chris E.; Longshore, Kathleen M.; Riddle, Brett R.; Mantooth, Stacy

    2016-01-01

    Although montane sky islands surrounded by desert scrub and shrub steppe comprise a large part of the biological diversity of the Basin and Range Province of southwestern North America, comprehensive ecological and population demographic studies for high-elevation small mammals within these areas are rare. Here, we examine the ecology and population parameters of Palmer's chipmunk (Tamias palmeri) in the Spring Mountains of southern Nevada, and present a predictive GIS-based distribution and probability of occurrence model at both home range and geographic spatial scales. Logistic regression analyses and Akaike Information Criterion model selection identified forest type, slope, and distance to water sources as predictive of chipmunk occurrence at the geographic scale. At the home range scale, increasing population density, decreasing overstory canopy cover, and decreasing understory canopy cover contributed to increased survival rates.

  15. Are We Predicting the Actual or Apparent Distribution of Temperate Marine Fishes?

    PubMed Central

    Monk, Jacquomo; Ierodiaconou, Daniel; Harvey, Euan; Rattray, Alex; Versace, Vincent L.

    2012-01-01

    Planning for resilience is the focus of many marine conservation programs and initiatives. These efforts aim to inform conservation strategies for marine regions to ensure they have inbuilt capacity to retain biological diversity and ecological function in the face of global environmental change, particularly changes in climate and resource exploitation. In the absence of direct biological and ecological information for many marine species, scientists are increasingly using spatially explicit, predictive-modeling approaches. Through improved access to multibeam sonar and underwater video technology, these models provide spatial predictions of the most suitable regions for an organism at resolutions previously not possible. However, sensible-looking, well-performing models can provide very different predictions of distribution depending on which occurrence dataset is used. To examine this, we construct species distribution models for nine temperate marine sedentary fishes for a 25.7 km² study region off the coast of southeastern Australia. We use generalized linear models (GLM), generalized additive models (GAM) and maximum entropy (MAXENT) to build models based on co-located occurrence datasets derived from two underwater video methods (i.e. baited and towed video) and fine-scale multibeam sonar based seafloor habitat variables. Overall, this study found that the choice of modeling approach did not considerably influence the prediction of distributions based on the same occurrence dataset. However, greater dissimilarity between model predictions was observed across the nine fish taxa when the two occurrence datasets were compared (relative to models based on the same dataset). Based on these results it is difficult to draw any general trends as to which video method provides more reliable occurrence datasets. Nonetheless, we suggest that these predictions reflect the species' apparent distribution (i.e. a combination of the species' true distribution and the probability of detecting it). Consequently, we also encourage researchers and marine managers to carefully interpret model predictions. PMID:22536325

  16. Size distribution of submarine landslides along the U.S. Atlantic margin

    USGS Publications Warehouse

    Chaytor, J.D.; ten Brink, Uri S.; Solow, A.R.; Andrews, B.D.

    2009-01-01

    Assessment of the probability for destructive landslide-generated tsunamis depends on the knowledge of the number, size, and frequency of large submarine landslides. This paper investigates the size distribution of submarine landslides along the U.S. Atlantic continental slope and rise using the size of the landslide source regions (landslide failure scars). Landslide scars along the margin identified in a detailed bathymetric Digital Elevation Model (DEM) have areas that range between 0.89 km² and 2410 km² and volumes between 0.002 km³ and 179 km³. The area to volume relationship of these failure scars is almost linear (inverse power-law exponent close to 1), suggesting a fairly uniform failure thickness of a few 10s of meters in each event, with only rare, deep excavating landslides. The cumulative volume distribution of the failure scars is very well described by a log-normal distribution rather than by an inverse power-law, the most commonly used distribution for both subaerial and submarine landslides. A log-normal distribution centered on a volume of 0.86 km³ may indicate that landslides preferentially mobilize a moderate amount of material (on the order of 1 km³), rather than large landslides or very small ones. Alternatively, the log-normal distribution may reflect an inverse power law distribution modified by a size-dependent probability of observing landslide scars in the bathymetry data. If the latter is the case, an inverse power-law distribution with an exponent of 1.3 ± 0.3, modified by a size-dependent conditional probability of identifying more failure scars with increasing landslide size, fits the observed size distribution. This exponent value is similar to the predicted exponent of 1.2 ± 0.3 for subaerial landslides in unconsolidated material. Both the log-normal and modified inverse power-law distributions of the observed failure scar volumes suggest that large landslides, which have the greatest potential to generate damaging tsunamis, occur infrequently along the margin. © 2008 Elsevier B.V.
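
    The competing size-distribution hypotheses can be compared with maximum-likelihood fits. A sketch on synthetic volumes (the power-law exponent estimator is the standard MLE above a threshold; scipy's Pareto shape parameter equals the density exponent minus one):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(4)
        volumes = rng.lognormal(mean=np.log(0.86), sigma=1.5, size=500)  # km^3, synthetic

        # log-normal fit (location fixed at zero)
        shape, loc, scale = stats.lognorm.fit(volumes, floc=0)
        ll_lognorm = stats.lognorm.logpdf(volumes, shape, loc, scale).sum()

        # inverse power-law (Pareto) fit above the minimum observed volume
        vmin = volumes.min()
        alpha = 1.0 + len(volumes) / np.log(volumes / vmin).sum()   # MLE exponent
        ll_pareto = stats.pareto.logpdf(volumes, alpha - 1, scale=vmin).sum()

        print(f"log-normal log-likelihood: {ll_lognorm:.1f}")
        print(f"power-law  log-likelihood: {ll_pareto:.1f}")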

  17. NON-EXTENSIVE STATISTICS TO THE COSMOLOGICAL LITHIUM PROBLEM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hou, S. Q.; He, J. J.; Parikh, A.

    Big Bang nucleosynthesis (BBN) theory predicts the abundances of the light elements D, ³He, ⁴He, and ⁷Li produced in the early universe. The primordial abundances of D and ⁴He inferred from observational data are in good agreement with predictions; however, BBN theory overestimates the primordial ⁷Li abundance by about a factor of three. This is the so-called "cosmological lithium problem." Solutions to this problem using conventional astrophysics and nuclear physics have not been successful over the past few decades, probably indicating the presence of new physics during the era of BBN. We have investigated the impact on BBN predictions of adopting a generalized distribution to describe the velocities of nucleons in the framework of Tsallis non-extensive statistics. This generalized velocity distribution is characterized by a parameter q, and reduces to the usually assumed Maxwell-Boltzmann distribution for q = 1. We find excellent agreement between predicted and observed primordial abundances of D, ⁴He, and ⁷Li for 1.069 ≤ q ≤ 1.082, suggesting a possible new solution to the cosmological lithium problem.
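
    The Tsallis generalization replaces the Boltzmann factor with a q-exponential that recovers the ordinary exponential as q → 1. A short numeric sketch (one common sign convention assumed) shows how gently the favored q values perturb a fixed point on the Maxwell-Boltzmann tail:

        import numpy as np

        def q_exp(x, q):
            # Tsallis q-exponential; reduces to exp(x) as q -> 1
            if abs(q - 1.0) < 1e-12:
                return np.exp(x)
            base = 1.0 + (1.0 - q) * x
            return np.where(base > 0, base ** (1.0 / (1.0 - q)), 0.0)

        # relative velocity weight at a fixed scaled energy E/kT = 3
        x = -3.0
        for q in [1.0, 1.01, 1.05, 1.069, 1.082]:
            print(f"q={q:<6} weight={q_exp(x, q):.5f}")
        # the small q-1 values favored by the BBN fit change the tail only
        # slightly, but thermonuclear rates are exponentially sensitive to it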

  18. Using the Lorenz Curve to Characterize Risk Predictiveness and Etiologic Heterogeneity

    PubMed Central

    Mauguen, Audrey; Begg, Colin B.

    2017-01-01

    The Lorenz curve is a graphical tool that is used widely in econometrics. It represents the spread of a probability distribution, and its traditional use has been to characterize population distributions of wealth or income, or more specifically, inequalities in wealth or income. However, its utility in public health research has not been broadly established. The purpose of this article is to explain its special usefulness for characterizing the population distribution of disease risks, and in particular for identifying the precise disease burden that can be predicted to occur in segments of the population that are known to have especially high (or low) risks, a feature that is important for evaluating the yield of screening or other disease prevention initiatives. We demonstrate that, although the Lorenz curve represents the distribution of predicted risks in a population at risk for the disease, in fact it can be estimated from a case–control study conducted in the population without the need for information on absolute risks. We explore two different estimation strategies and compare their statistical properties using simulations. The Lorenz curve is a statistical tool that deserves wider use in public health research. PMID:27096256
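
    Computing a Lorenz curve from predicted risks takes a few lines: sort the risks, cumulate, and read off how much of the predicted disease burden falls in the highest-risk segment. The risk distribution below is hypothetical:

        import numpy as np

        rng = np.random.default_rng(5)
        risk = rng.beta(0.5, 20, 100000)           # hypothetical predicted risks

        r = np.sort(risk)                          # low to high risk
        pop_share = np.arange(1, r.size + 1) / r.size
        burden_share = np.cumsum(r) / r.sum()      # Lorenz curve ordinate

        top10 = 1.0 - burden_share[int(0.9 * r.size)]
        gini = 1.0 - 2.0 * np.trapz(burden_share, pop_share)
        print(f"share of cases in the top 10% of risk: {top10:.2f}")
        print(f"Gini coefficient of the risk distribution: {gini:.2f}")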

  19. Gravitational wave hotspots: Ranking potential locations of single-source gravitational wave emission

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simon, Joseph; Polin, Abigail; Lommen, Andrea

    2014-03-20

    The steadily improving sensitivity of pulsar timing arrays (PTAs) suggests that gravitational waves (GWs) from supermassive black hole binary (SMBHB) systems in the nearby universe will be detectable sometime during the next decade. Currently, PTAs assume an equal probability of detection from every sky position, but as evidence grows for a non-isotropic distribution of sources, is there a most likely sky position for a detectable single source of GWs? In this paper, a collection of Galactic catalogs is used to calculate various metrics related to the detectability of a single GW source resolvable above a GW background, assuming that every galaxy has the same probability of containing an SMBHB. Our analyses of these data reveal small probabilities that one of these sources is currently in the PTA band, but as sensitivity improves, regions of consistent probability density are found in predictable locations, specifically around local galaxy clusters.

  20. Using a topographic index to distribute variable source area runoff predicted with the SCS curve-number equation

    NASA Astrophysics Data System (ADS)

    Lyon, Steve W.; Walter, M. Todd; Gérard-Marchant, Pierre; Steenhuis, Tammo S.

    2004-10-01

    Because the traditional Soil Conservation Service curve-number (SCS-CN) approach continues to be used ubiquitously in water quality models, new application methods are needed that are consistent with variable source area (VSA) hydrological processes in the landscape. We developed and tested a distributed approach for applying the traditional SCS-CN equation to watersheds where VSA hydrology is a dominant process. Predicting the location of source areas is important for watershed planning because restricting potentially polluting activities from runoff source areas is fundamental to controlling non-point-source pollution. The method presented here used the traditional SCS-CN approach to predict runoff volume and spatial extent of saturated areas and a topographic index, like that used in TOPMODEL, to distribute runoff source areas through watersheds. The resulting distributed CN-VSA method was applied to two subwatersheds of the Delaware basin in the Catskill Mountains region of New York State and one watershed in south-eastern Australia to produce runoff-probability maps. Observed saturated area locations in the watersheds agreed with the distributed CN-VSA method. Results showed good agreement with those obtained from the previously validated soil moisture routing (SMR) model. When compared with the traditional SCS-CN method, the distributed CN-VSA method predicted a similar total volume of runoff, but vastly different locations of runoff generation. Thus, the distributed CN-VSA approach provides a physically based method that is simple enough to be incorporated into water quality models, and other tools that currently use the traditional SCS-CN method, while still adhering to the principles of VSA hydrology.
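
    For reference, the traditional SCS-CN runoff equation that the method redistributes is, in US customary units, Q = (P - Ia)² / (P - Ia + S) with S = 1000/CN - 10 and Ia = 0.2S. A worked sketch:

        def scs_cn_runoff(p_in, cn):
            # runoff depth (inches) from the traditional SCS curve-number equation
            s = 1000.0 / cn - 10.0    # potential maximum retention (inches)
            ia = 0.2 * s              # standard initial abstraction
            if p_in <= ia:
                return 0.0
            return (p_in - ia) ** 2 / (p_in - ia + s)

        # a 3-inch storm on progressively wetter / less permeable conditions
        for cn in [60, 75, 90]:
            print(f"CN={cn}: Q={scs_cn_runoff(3.0, cn):.2f} in")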

  1. Explosion probability of unexploded ordnance: expert beliefs.

    PubMed

    MacDonald, Jacqueline Anne; Small, Mitchell J; Morgan, M G

    2008-08-01

    This article reports on a study to quantify expert beliefs about the explosion probability of unexploded ordnance (UXO). Some 1,976 sites at closed military bases in the United States are contaminated with UXO and are slated for cleanup, at an estimated cost of $15-140 billion. Because no available technology can guarantee 100% removal of UXO, information about explosion probability is needed to assess the residual risks of civilian reuse of closed military bases and to make decisions about how much to invest in cleanup. This study elicited probability distributions for the chance of UXO explosion from 25 experts in explosive ordnance disposal, all of whom have had field experience in UXO identification and deactivation. The study considered six different scenarios: three different types of UXO handled in two different ways (one involving children and the other involving construction workers). We also asked the experts to rank by sensitivity to explosion 20 different kinds of UXO found at a case study site at Fort Ord, California. We found that the experts do not agree about the probability of UXO explosion, with significant differences among experts in their mean estimates of explosion probabilities and in the amount of uncertainty that they express in their estimates. In three of the six scenarios, the divergence was so great that the average of all the expert probability distributions was statistically indistinguishable from a uniform (0, 1) distribution, suggesting that the sum of expert opinion provides no information at all about the explosion risk. The experts' opinions on the relative sensitivity to explosion of the 20 UXO items also diverged. The average correlation between rankings of any pair of experts was 0.41, which, statistically, is barely significant (p = 0.049) at the 95% confidence level. Thus, one expert's rankings provide little predictive information about another's rankings. The lack of consensus among experts suggests that empirical studies are needed to better understand the explosion risks of UXO.

  2. Improving operational flood ensemble prediction by the assimilation of satellite soil moisture: comparison between lumped and semi-distributed schemes

    NASA Astrophysics Data System (ADS)

    Alvarez-Garreton, C.; Ryu, D.; Western, A. W.; Su, C.-H.; Crow, W. T.; Robertson, D. E.; Leahy, C.

    2014-09-01

    Assimilation of remotely sensed soil moisture data (SM-DA) to correct soil water stores of rainfall-runoff models has shown skill in improving streamflow prediction. In the case of large and sparsely monitored catchments, SM-DA is a particularly attractive tool. Within this context, we assimilate active and passive satellite soil moisture (SSM) retrievals using an ensemble Kalman filter to improve operational flood prediction within a large semi-arid catchment in Australia (>40,000 km²). We assess the importance of accounting for channel routing and the spatial distribution of forcing data by applying SM-DA to a lumped and a semi-distributed scheme of the probability distributed model (PDM). Our scheme also accounts for model error representation and seasonal biases and errors in the satellite data. Before assimilation, the semi-distributed model provided more accurate streamflow prediction (Nash-Sutcliffe efficiency, NS = 0.77) than the lumped model (NS = 0.67) at the catchment outlet. However, this did not ensure good performance at the "ungauged" inner catchments. After SM-DA, the streamflow ensemble prediction at the outlet was improved in both the lumped and the semi-distributed schemes: the root mean square error of the ensemble was reduced by 27 and 31%, respectively; the NS of the ensemble mean increased by 7 and 38%, respectively; the false alarm ratio was reduced by 15 and 25%, respectively; and the ensemble prediction spread was reduced while its reliability was maintained. Our findings imply that even when rainfall is the main driver of flooding in semi-arid catchments, adequately processed SSM can be used to reduce errors in the model soil moisture, which in turn provides better streamflow ensemble prediction. We demonstrate that SM-DA efficacy is enhanced when the spatial distribution in forcing data and routing processes are accounted for. At ungauged locations, SM-DA is effective at improving streamflow ensemble prediction; however, the updated prediction is still poor since SM-DA does not address systematic errors in the model.
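
    The analysis step of a stochastic (perturbed-observation) ensemble Kalman filter, of the kind used here to nudge a modeled soil water store toward a satellite retrieval, can be sketched in a few lines. The observation operator, variances, and values below are synthetic, not the PDM configuration of the study:

        import numpy as np

        rng = np.random.default_rng(6)
        n_ens = 100
        soil_moisture = rng.normal(0.25, 0.04, n_ens)   # forecast ensemble (m3/m3)
        obs, obs_var = 0.30, 0.02 ** 2                  # bias-corrected SSM retrieval

        # Kalman gain from the ensemble forecast variance (observation operator H = 1)
        var_f = soil_moisture.var(ddof=1)
        gain = var_f / (var_f + obs_var)

        # perturbed-observation update keeps the analysis spread statistically consistent
        perturbed = obs + rng.normal(0.0, np.sqrt(obs_var), n_ens)
        analysis = soil_moisture + gain * (perturbed - soil_moisture)

        print(f"forecast mean {soil_moisture.mean():.3f} -> analysis mean {analysis.mean():.3f}")
        print(f"forecast sd {soil_moisture.std(ddof=1):.3f} -> analysis sd {analysis.std(ddof=1):.3f}")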

  3. Spacecraft Collision Avoidance

    NASA Astrophysics Data System (ADS)

    Bussy-Virat, Charles

    The rapid increase of the number of objects in orbit around the Earth poses a serious threat to operational spacecraft and astronauts. In order to effectively avoid collisions, mission operators need to assess the risk of collision between the satellite and any other object whose orbit is likely to approach its trajectory. Several algorithms predict the probability of collision but have limitations that impair the accuracy of the prediction. An important limitation is that uncertainties in the atmospheric density are usually not taken into account in the propagation of the covariance matrix from current epoch to closest approach time. The Spacecraft Orbital Characterization Kit (SpOCK) was developed to accurately predict the positions and velocities of spacecraft. The central capability of SpOCK is a high accuracy numerical propagator of spacecraft orbits and computations of ancillary parameters. The numerical integration uses a comprehensive modeling of the dynamics of spacecraft in orbit that includes all the perturbing forces that a spacecraft is subject to in orbit. In particular, the atmospheric density is modeled by thermospheric models to allow for an accurate representation of the atmospheric drag. SpOCK predicts the probability of collision between two orbiting objects taking into account the uncertainties in the atmospheric density. Monte Carlo procedures are used to perturb the initial position and velocity of the primary and secondary spacecraft from their covariance matrices. Developed in C, SpOCK supports parallelism to quickly assess the risk of collision so it can be used operationally in real time. The upper atmosphere of the Earth is strongly driven by the solar activity. In particular, abrupt transitions from slow to fast solar wind cause important disturbances of the atmospheric density, hence of the drag acceleration that spacecraft are subject to. The Probability Distribution Function (PDF) model was developed to predict the solar wind speed five days in advance. In particular, the PDF model is able to predict rapid enhancements in the solar wind speed. It was found that 60% of the positive predictions were correct, while 91% of the negative predictions were correct, and 20% to 33% of the peaks in the speed were found by the model. Ensemble forecasts provide the forecasters with an estimation of the uncertainty in the prediction, which can be used to derive uncertainties in the atmospheric density and in the drag acceleration. The dissertation then demonstrates that uncertainties in the atmospheric density result in large uncertainties in the prediction of the probability of collision. As an example, the effects of a geomagnetic storm on the probability of collision are illustrated. The research aims at providing tools and analyses that help understand and predict the effects of uncertainties in the atmospheric density on the probability of collision. The ultimate motivation is to support mission operators in making the correct decision with regard to a potential collision avoidance maneuver by providing an uncertainty on the prediction of the probability of collision instead of a single value. This approach can help avoid performing unnecessary costly maneuvers, while making sure that the risk of collision is fully evaluated.
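
    The Monte Carlo core of such a collision-probability estimate reduces to sampling the relative state at closest approach from its covariance and counting miss distances below the combined hard-body radius. A minimal sketch with hypothetical numbers; SpOCK's actual propagation and covariance handling are far more detailed:

        import numpy as np

        rng = np.random.default_rng(7)
        n = 200000

        # hypothetical relative-position mean and covariance at closest approach (m)
        mean = np.array([120.0, -40.0, 60.0])
        cov = np.diag([80.0, 200.0, 50.0]) ** 2  # density uncertainty inflates in-track term

        samples = rng.multivariate_normal(mean, cov, size=n)
        miss = np.linalg.norm(samples, axis=1)
        hard_body = 20.0                          # combined object radius (m)

        p_collision = np.mean(miss < hard_body)
        print(f"estimated collision probability: {p_collision:.2e}")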

  4. A hybrid probabilistic/spectral model of scalar mixing

    NASA Astrophysics Data System (ADS)

    Vaithianathan, T.; Collins, Lance

    2002-11-01

    In the probability density function (PDF) description of a turbulent reacting flow, the local temperature and species concentration are replaced by a high-dimensional joint probability that describes the distribution of states in the fluid. The PDF has the great advantage of rendering the chemical reaction source terms closed, independent of their complexity. However, molecular mixing, which involves two-point information, must be modeled. Indeed, the qualitative shape of the PDF is sensitive to this modeling, hence the reliability of the model to predict even the closed chemical source terms rests heavily on the mixing model. We will present a new closure to the mixing based on a spectral representation of the scalar field. The model is implemented as an ensemble of stochastic particles, each carrying scalar concentrations at different wavenumbers. Scalar exchanges within a given particle represent "transfer" while scalar exchanges between particles represent "mixing." The equations governing the scalar concentrations at each wavenumber are derived from the eddy damped quasi-normal Markovian (or EDQNM) theory. The model correctly predicts the evolution of an initial double delta function PDF into a Gaussian, as seen in the numerical study by Eswaran & Pope (1988). Furthermore, the model predicts that the scalar gradient distribution (which is available in this representation) approaches log-normal at long times. Comparisons of the model with data derived from direct numerical simulations will be shown.

  5. Effect of thermal noise on vesicles and capsules in shear flow.

    PubMed

    Abreu, David; Seifert, Udo

    2012-07-01

    We add thermal noise consistently to reduced models of undeformable vesicles and capsules in shear flow and derive analytically the corresponding stochastic equations of motion. We calculate the steady-state probability distribution function and construct the corresponding phase diagrams for the different dynamical regimes. For fluid vesicles, we predict that at small shear rates thermal fluctuations induce a tumbling motion for any viscosity contrast. For elastic capsules, due to thermal mixing, an intermittent regime appears in regions where deterministic models predict only pure tank treading or tumbling.

  6. On the use of the noncentral chi-square density function for the distribution of helicopter spectral estimates

    NASA Technical Reports Server (NTRS)

    Garber, Donald P.

    1993-01-01

    A probability density function for the variability of ensemble averaged spectral estimates from helicopter acoustic signals in Gaussian background noise was evaluated. Numerical methods for calculating the density function and for determining confidence limits were explored. Density functions were predicted for both synthesized and experimental data and compared with observed spectral estimate variability.
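
    scipy exposes this distribution directly as scipy.stats.ncx2, which makes the density evaluation and confidence limits a short exercise; the degrees of freedom and noncentrality below are illustrative assumptions, not values from the report:

        from scipy import stats

        df = 2 * 32   # 2 degrees of freedom per periodogram bin, 32 ensemble averages
        nc = 50.0     # noncentrality from the deterministic (tonal) signal component

        dist = stats.ncx2(df, nc)
        print("pdf at the mean:", dist.pdf(dist.mean()))
        print("95% confidence limits:", dist.ppf(0.025), dist.ppf(0.975))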

  7. Comparing neuronal spike trains with inhomogeneous Poisson distribution: evaluation procedure and experimental application in cases of cyclic activity.

    PubMed

    Fiore, Lorenzo; Lorenzetti, Walter; Ratti, Giovannino

    2005-11-30

    A procedure is proposed to compare single-unit spiking activity elicited in repetitive cycles with an inhomogeneous Poisson process (IPP). Each spike sequence in a cycle is discretized and represented as a point process on a circle. The interspike interval probability density predicted for an IPP is computed on the basis of the experimental firing probability density; differences from the experimental interval distribution are assessed. This procedure was applied to spike trains repetitively induced by opening-closing movements of the distal article of a lobster leg. As expected, the density of short interspike intervals, less than 20-40 ms in length, was found to lie well below the level predicted for an IPP, reflecting the occurrence of the refractory period. Conversely, longer intervals, ranging from 20-40 to 100-120 ms, were markedly more abundant than expected; this provided evidence for a time window of increased tendency to fire again after a spike. Less consistently, a weak depression of spike generation was observed for longer intervals. A Monte Carlo procedure, implemented for comparison, produced quite similar results, but was slightly less precise and more demanding in terms of computation time.
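
    The IPP benchmark itself can be simulated by Lewis-Shedler thinning: generate candidate spikes from a homogeneous Poisson process at the peak rate and keep each with probability rate(t)/rate_max. A sketch with a hypothetical cyclic rate profile, not the lobster data:

        import numpy as np

        rng = np.random.default_rng(9)
        period = 1.0                                  # movement cycle length (s)
        rate = lambda t: 20.0 + 15.0 * np.sin(2 * np.pi * t / period)  # spikes/s
        rate_max = 35.0

        # thinning: homogeneous candidates, kept with probability rate/rate_max
        t_end = 2000.0
        candidates = np.cumsum(rng.exponential(1.0 / rate_max,
                                               int(1.5 * rate_max * t_end)))
        candidates = candidates[candidates < t_end]
        spikes = candidates[rng.random(candidates.size) < rate(candidates) / rate_max]

        isi = np.diff(spikes)
        hist, edges = np.histogram(isi, bins=np.arange(0.0, 0.12, 0.02), density=True)
        for lo, h in zip(edges[:-1], hist):
            print(f"ISI {1000*lo:3.0f}-{1000*(lo+0.02):3.0f} ms: density {h:.1f}")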

  8. Measurement of the fusion probability P_CN for the reaction of ⁵⁰Ti with ²⁰⁸Pb

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Naik, R. S.; Loveland, W.; Sprunger, P. H.

    2007-11-15

    The capture cross sections and fission fragment angular distributions were measured for the reaction of ⁵⁰Ti with ²⁰⁸Pb at center-of-mass projectile energies (E_c.m.) of 183.7, 186.2, 190.2, 194.2, and 202.3 MeV (E* = 14.2, 16.6, 20.6, 24.7, and 32.7 MeV). From fitting the backward-angle fragment angular distributions, the cross sections for quasifission and fusion-fission and P_CN, the probability that the colliding nuclei go from the contact configuration to inside the fission saddle point, were deduced. These quantities, along with the known values of the evaporation residue production cross sections for this reaction, were used to deduce values of the survival probabilities, W_sur, for this reaction as a function of excitation energy. The deduced values of P_CN and W_sur and their dependence on excitation energy differ from some current theoretical predictions of these quantities.

  9. Development of a European Ensemble System for Seasonal Prediction: Application to crop yield

    NASA Astrophysics Data System (ADS)

    Terres, J. M.; Cantelaube, P.

    2003-04-01

    Western European agriculture is highly intensive and the weather is the main source of uncertainty for crop yield assessment and for crop management. In the current system, at the time when a crop yield forecast is issued, the weather conditions leading up to harvest time are unknown and are therefore a major source of uncertainty. The use of seasonal weather forecasts would bring additional information for the remaining crop season and has valuable benefits for improving the management of agricultural markets and environmentally sustainable farm practices. An innovative method for supplying seasonal forecast information to crop simulation models has been developed in the frame of the EU funded research project DEMETER. It consists of running a crop model on each individual member of the seasonal hindcasts to derive a probability distribution of crop yield. Preliminary results for the cumulative probability function of wheat yield provide information on both the yield anomaly and the reliability of the forecast. Based on the spread of the probability distribution, the end-user can directly quantify the benefits and risks of taking weather-sensitive decisions.

  10. Funnel plot control limits to identify poorly performing healthcare providers when there is uncertainty in the value of the benchmark.

    PubMed

    Manktelow, Bradley N; Seaton, Sarah E; Evans, T Alun

    2016-12-01

    There is an increasing use of statistical methods, such as funnel plots, to identify poorly performing healthcare providers. Funnel plots comprise the construction of control limits around a benchmark, and providers with outcomes falling outside the limits are investigated as potential outliers. The benchmark is usually estimated from observed data, but uncertainty in this estimate is usually ignored when constructing control limits. In this paper, the use of funnel plots in the presence of uncertainty in the value of the benchmark is reviewed for outcomes from a Binomial distribution. Two methods to derive the control limits are shown: (i) prediction intervals; (ii) tolerance intervals. Tolerance intervals formally include the uncertainty in the value of the benchmark, while prediction intervals do not. The probability properties of 95% control limits derived using each method were investigated through hypothesised scenarios. Neither prediction intervals nor tolerance intervals produce funnel plot control limits that satisfy the nominal probability characteristics when there is uncertainty in the value of the benchmark. This is not necessarily to say that funnel plots have no role to play in healthcare, but that without the development of intervals satisfying the nominal probability characteristics they must be interpreted with care. © The Author(s) 2014.
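
    The prediction-interval variant with an assumed-known benchmark is the simpler of the two and is easy to sketch with exact binomial quantiles; the benchmark-uncertainty issue the paper analyzes is deliberately not modeled here:

        import numpy as np
        from scipy import stats

        benchmark = 0.12                     # assumed-known event proportion
        volumes = np.array([50, 100, 250, 500, 1000])

        lower = stats.binom.ppf(0.025, volumes, benchmark) / volumes
        upper = stats.binom.ppf(0.975, volumes, benchmark) / volumes

        for n, lo, hi in zip(volumes, lower, upper):
            print(f"n={n:4d}  95% funnel limits: ({lo:.3f}, {hi:.3f})")
        # limits tighten as provider volume grows, producing the funnel shape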

  11. Reliability Estimation of Parameters of Helical Wind Turbine with Vertical Axis

    PubMed Central

    Dumitrascu, Adela-Eliza; Lepadatescu, Badea; Dumitrascu, Dorin-Ion; Nedelcu, Anisor; Ciobanu, Doina Valentina

    2015-01-01

    Because wind turbines are used over prolonged periods, they must be characterized by high reliability. This can be achieved through rigorous design, appropriate simulation and testing, and proper construction. The reliability prediction and analysis of these systems will lead to identifying the critical components, increasing the operating time, minimizing the failure rate, and minimizing maintenance costs. To estimate the energy produced by the wind turbine, an evaluation approach based on a Monte Carlo simulation model is developed, which enables us to estimate the probability of minimum and maximum parameters. In our simulation process we used triangular distributions. The analysis of simulation results has been focused on the interpretation of the relative frequency histograms and the cumulative distribution curve (ogive diagram), which indicates the probability of obtaining the daily or annual energy output as a function of wind speed. The experimental research consists of estimating the reliability and unreliability functions and the hazard rate of the helical vertical-axis wind turbine, designed and patented for the climatic conditions of Romanian regions. Also, the variation of the power produced for different wind speeds, the Weibull distribution of wind probability, and the power generated were determined. The analysis of experimental results indicates that this type of wind turbine is efficient at low wind speeds. PMID:26167524
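
    The Monte Carlo step with triangular inputs and the resulting ogive can be sketched directly with numpy.random.triangular; the wind speed and efficiency parameters below are hypothetical, not the turbine's measured values:

        import numpy as np

        rng = np.random.default_rng(8)
        n = 100000

        wind = rng.triangular(2.0, 5.0, 12.0, n)       # daily mean wind speed (m/s)
        efficiency = rng.triangular(0.20, 0.30, 0.35, n)
        energy = efficiency * 0.5 * wind ** 3           # simplified cubic power law

        values = np.sort(energy)
        # empirical cumulative distribution (ogive): P(E <= value)
        for q in [0.05, 0.50, 0.95]:
            print(f"P(E <= {values[int(q * n) - 1]:8.1f}) = {q:.2f}")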

  12. Reliability Estimation of Parameters of Helical Wind Turbine with Vertical Axis.

    PubMed

    Dumitrascu, Adela-Eliza; Lepadatescu, Badea; Dumitrascu, Dorin-Ion; Nedelcu, Anisor; Ciobanu, Doina Valentina

    2015-01-01

    Because wind turbines are used over prolonged periods, they must be characterized by high reliability. This can be achieved through rigorous design, appropriate simulation and testing, and proper construction. The reliability prediction and analysis of these systems will lead to identifying the critical components, increasing the operating time, minimizing the failure rate, and minimizing maintenance costs. To estimate the energy produced by the wind turbine, an evaluation approach based on a Monte Carlo simulation model is developed, which enables us to estimate the probability of minimum and maximum parameters. In our simulation process we used triangular distributions. The analysis of simulation results has been focused on the interpretation of the relative frequency histograms and the cumulative distribution curve (ogive diagram), which indicates the probability of obtaining the daily or annual energy output as a function of wind speed. The experimental research consists of estimating the reliability and unreliability functions and the hazard rate of the helical vertical-axis wind turbine, designed and patented for the climatic conditions of Romanian regions. Also, the variation of the power produced for different wind speeds, the Weibull distribution of wind probability, and the power generated were determined. The analysis of experimental results indicates that this type of wind turbine is efficient at low wind speeds.

  13. Confined active Brownian particles: theoretical description of propulsion-induced accumulation

    NASA Astrophysics Data System (ADS)

    Das, Shibananda; Gompper, Gerhard; Winkler, Roland G.

    2018-01-01

    The stationary-state distribution function of confined active Brownian particles (ABPs) is analyzed by computer simulations and analytical calculations. We consider a radial harmonic as well as an anharmonic confinement potential. In the simulations, the ABP is propelled with a prescribed velocity along a body-fixed direction, which changes in a diffusive manner. For the analytical approach, the Cartesian components of the propulsion velocity are assumed to change independently (active Ornstein-Uhlenbeck particle, AOUP). This results in very different velocity distribution functions. The analytical solution of the Fokker-Planck equation for an AOUP in a harmonic potential is presented, and a conditional distribution function is provided for the radial particle distribution at a given magnitude of the propulsion velocity. This conditional probability distribution facilitates the description of the coupling of the spatial coordinate and propulsion, which yields activity-induced accumulation of particles. For the anharmonic potential, a probability distribution function is derived within the unified colored noise approximation. The comparison of the simulation results with theoretical predictions yields good agreement for large rotational diffusion coefficients, e.g. due to tumbling, even for large propulsion velocities (Péclet numbers). However, we find significant deviations already for moderate Péclet numbers, when the rotational diffusion coefficient is on the order of the thermal one.

  14. Deriving field-based species sensitivity distributions (f-SSDs) from stacked species distribution models (S-SDMs).

    PubMed

    Schipper, Aafke M; Posthuma, Leo; de Zwart, Dick; Huijbregts, Mark A J

    2014-12-16

    Quantitative relationships between species richness and single environmental factors, also called species sensitivity distributions (SSDs), are helpful to understand and predict biodiversity patterns, identify environmental management options and set environmental quality standards. However, species richness is typically dependent on a variety of environmental factors, implying that it is not straightforward to quantify SSDs from field monitoring data. Here, we present a novel and flexible approach to solve this, based on the method of stacked species distribution modeling. First, a species distribution model (SDM) is established for each species, describing its probability of occurrence in relation to multiple environmental factors. Next, the predictions of the SDMs are stacked along the gradient of each environmental factor with the remaining environmental factors at fixed levels. By varying those fixed levels, our approach can be used to investigate how field-based SSDs for a given environmental factor change in relation to changing confounding influences, including for example optimal, typical, or extreme environmental conditions. This provides an asset in the evaluation of potential management measures to reach good ecological status.

  15. Mixed H2/H∞ distributed robust model predictive control for polytopic uncertain systems subject to actuator saturation and missing measurements

    NASA Astrophysics Data System (ADS)

    Song, Yan; Fang, Xiaosheng; Diao, Qingda

    2016-03-01

    In this paper, we discuss the mixed H2/H∞ distributed robust model predictive control problem for polytopic uncertain systems subject to randomly occurring actuator saturation and packet loss. The global system is decomposed into several subsystems connected by a fixed-topology network, which defines the packet loss among the subsystems. To better use the information successfully transmitted over the network, both the phenomena of actuator saturation and packet loss resulting from the limited communication bandwidth are taken into consideration. A novel distributed controller model is established to account for actuator saturation and packet loss in a unified representation by using two sets of Bernoulli distributed white sequences with known conditional probabilities. With the nonlinear feedback control law represented by the convex hull of a group of linear feedback laws, the distributed controllers for the subsystems are obtained by solving a linear matrix inequality (LMI) optimisation problem. Finally, numerical studies demonstrate the effectiveness of the proposed techniques.

  16. Quantifying predictability in a model with statistical features of the atmosphere

    PubMed Central

    Kleeman, Richard; Majda, Andrew J.; Timofeyev, Ilya

    2002-01-01

    The Galerkin truncated inviscid Burgers equation has recently been shown by the authors to be a simple model with many degrees of freedom, with many statistical properties similar to those occurring in dynamical systems relevant to the atmosphere. These properties include long time-correlated, large-scale modes of low frequency variability and short time-correlated “weather modes” at smaller scales. The correlation scaling in the model extends over several decades and may be explained by a simple theory. Here a thorough analysis of the nature of predictability in the idealized system is developed by using a theoretical framework developed by R.K. This analysis is based on a relative entropy functional that has been shown elsewhere by one of the authors to measure the utility of statistical predictions precisely. The analysis is facilitated by the fact that most relevant probability distributions are approximately Gaussian if the initial conditions are assumed to be so. Rather surprisingly this holds for both the equilibrium (climatological) and nonequilibrium (prediction) distributions. We find that in most cases the absolute difference in the first moments of these two distributions (the “signal” component) is the main determinant of predictive utility variations. Contrary to conventional belief in the ensemble prediction area, the dispersion of prediction ensembles is generally of secondary importance in accounting for variations in utility associated with different initial conditions. This conclusion has potentially important implications for practical weather prediction, where traditionally most attention has focused on dispersion and its variability. PMID:12429863
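
    For one-dimensional Gaussians, the relative entropy between the prediction and the climatology splits cleanly into a dispersion term and a signal term, which is the decomposition behind the conclusion above. A minimal sketch:

        import numpy as np

        def relative_entropy_1d(mu_p, var_p, mu_c, var_c):
            # KL divergence of prediction N(mu_p, var_p) from climatology
            # N(mu_c, var_c), split into dispersion and signal components
            dispersion = 0.5 * (np.log(var_c / var_p) + var_p / var_c - 1.0)
            signal = 0.5 * (mu_p - mu_c) ** 2 / var_c
            return dispersion, signal

        # a prediction whose mean shift dominates its utility
        disp, sig = relative_entropy_1d(mu_p=0.8, var_p=0.9, mu_c=0.0, var_c=1.0)
        print(f"dispersion component: {disp:.3f}")
        print(f"signal component:     {sig:.3f}")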

  17. Ensemble modeling of stochastic unsteady open-channel flow in terms of its time-space evolutionary probability distribution - Part 2: numerical application

    NASA Astrophysics Data System (ADS)

    Dib, Alain; Kavvas, M. Levent

    2018-03-01

    The characteristic form of the Saint-Venant equations is solved in a stochastic setting by using a newly proposed Fokker-Planck Equation (FPE) methodology. This methodology computes the ensemble behavior and variability of the unsteady flow in open channels by directly solving for the flow variables' time-space evolutionary probability distribution. The new methodology is tested on a stochastic unsteady open-channel flow problem, with an uncertainty arising from the channel's roughness coefficient. The computed statistical descriptions of the flow variables are compared to the results obtained through Monte Carlo (MC) simulations in order to evaluate the performance of the FPE methodology. The comparisons show that the proposed methodology can adequately predict the results of the considered stochastic flow problem, including the ensemble averages, variances, and probability density functions in time and space. Unlike the large number of simulations performed by the MC approach, only one simulation is required by the FPE methodology. Moreover, the total computational time of the FPE methodology is smaller than that of the MC approach, which could prove to be a particularly crucial advantage in systems with a large number of uncertain parameters. As such, the results obtained in this study indicate that the proposed FPE methodology is a powerful and time-efficient approach for predicting the ensemble average and variance behavior, in both space and time, for an open-channel flow process under an uncertain roughness coefficient.

  18. Multiple model cardinalized probability hypothesis density filter

    NASA Astrophysics Data System (ADS)

    Georgescu, Ramona; Willett, Peter

    2011-09-01

    The Probability Hypothesis Density (PHD) filter propagates the first-moment approximation to the multi-target Bayesian posterior distribution while the Cardinalized PHD (CPHD) filter propagates both the posterior likelihood of (an unlabeled) target state and the posterior probability mass function of the number of targets. Extensions of the PHD filter to the multiple model (MM) framework have been published and were implemented either with a Sequential Monte Carlo or a Gaussian Mixture approach. In this work, we introduce the multiple model version of the more elaborate CPHD filter. We present the derivation of the prediction and update steps of the MMCPHD particularized for the case of two target motion models and proceed to show that in the case of a single model, the new MMCPHD equations reduce to the original CPHD equations.

  19. Axon and dendrite geography predict the specificity of synaptic connections in a functioning spinal cord network.

    PubMed

    Li, Wen-Chang; Cooke, Tom; Sautois, Bart; Soffe, Stephen R; Borisyuk, Roman; Roberts, Alan

    2007-09-10

    How specific are the synaptic connections formed as neuronal networks develop and can simple rules account for the formation of functioning circuits? These questions are assessed in the spinal circuits controlling swimming in hatchling frog tadpoles. This is possible because detailed information is now available on the identity and synaptic connections of the main types of neuron. The probabilities of synapses between 7 types of identified spinal neuron were measured directly by making electrical recordings from 500 pairs of neurons. For the same neuron types, the dorso-ventral distributions of axons and dendrites were measured and then used to calculate the probabilities that axons would encounter particular dendrites and so potentially form synaptic connections. Surprisingly, synapses were found between all types of neuron but contact probabilities could be predicted simply by the anatomical overlap of their axons and dendrites. These results suggested that synapse formation may not require axons to recognise specific, correct dendrites. To test the plausibility of simpler hypotheses, we first made computational models that were able to generate longitudinal axon growth paths and reproduce the axon distribution patterns and synaptic contact probabilities found in the spinal cord. To test if probabilistic rules could produce functioning spinal networks, we then made realistic computational models of spinal cord neurons, giving them established cell-specific properties and connecting them into networks using the contact probabilities we had determined. A majority of these networks produced robust swimming activity. Simple factors such as morphogen gradients controlling dorso-ventral soma, dendrite and axon positions may sufficiently constrain the synaptic connections made between different types of neuron as the spinal cord first develops and allow functional networks to form. Our analysis implies that detailed cellular recognition between spinal neuron types may not be necessary for the reliable formation of functional networks to generate early behaviour like swimming.
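
    The anatomical-overlap calculation reduces to integrating the product of the axon and dendrite dorso-ventral density profiles. A discretized sketch with hypothetical Gaussian profiles; the distributions in the study are empirical measurements:

        import numpy as np

        # dorso-ventral axis discretized into bins (0 = ventral, 1 = dorsal)
        z = np.linspace(0.0, 1.0, 101)

        def gaussian_profile(center, width):
            p = np.exp(-0.5 * ((z - center) / width) ** 2)
            return p / np.trapz(p, z)             # normalize to a density

        axon = gaussian_profile(0.35, 0.15)       # hypothetical axon distribution
        dendrite = gaussian_profile(0.50, 0.20)   # hypothetical dendrite distribution

        # probability of encounter is proportional to the profile overlap integral
        overlap = np.trapz(axon * dendrite, z)
        print(f"overlap integral (relative contact probability): {overlap:.3f}")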

  20. Probabilities of Dilating Vesicoureteral Reflux in Children with First Time Simple Febrile Urinary Tract Infection, and Normal Renal and Bladder Ultrasound.

    PubMed

    Rianthavorn, Pornpimol; Tangngamsakul, Onjira

    2016-11-01

    We evaluated risk factors and assessed predicted probabilities for grade III or higher vesicoureteral reflux (dilating reflux) in children with a first simple febrile urinary tract infection and normal renal and bladder ultrasound. Data for 167 children 2 to 72 months old with a first febrile urinary tract infection and normal ultrasound were compared between those who had dilating vesicoureteral reflux (12 patients, 7.2%) and those who did not. Exclusion criteria consisted of a history of prenatal hydronephrosis or familial reflux and complicated urinary tract infection. A logistic regression model was used to identify independent variables associated with dilating reflux, and predicted probabilities for dilating reflux were assessed. Patient age and the prevalence of non-Escherichia coli bacteria were greater in children who had dilating reflux than in those who did not (p = 0.02 and p = 0.004, respectively). Gender distribution was similar between the 2 groups (p = 0.08). In multivariate analysis older age and non-E. coli bacteria independently predicted dilating reflux, with odds ratios of 1.04 (95% CI 1.01-1.07, p = 0.02) and 3.76 (95% CI 1.05-13.39, p = 0.04), respectively. The impact of non-E. coli bacteria on the predicted probability of dilating reflux increased with patient age. We support the concept of selective voiding cystourethrogram in children with a first simple febrile urinary tract infection and normal ultrasound. Voiding cystourethrogram should be considered in children with late onset urinary tract infection due to non-E. coli bacteria, since they are at risk for dilating reflux even if the ultrasound is normal. Copyright © 2016 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
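    A hedged sketch of how the two reported odds ratios translate into predicted probabilities in a logistic model follows. The intercept is hypothetical (the abstract does not report it), so only the relative effect of age and non-E. coli infection is meaningful here; age is assumed to be coded in months.

```python
import math

b_age = math.log(1.04)        # log-odds per month of age (OR 1.04)
b_non_ecoli = math.log(3.76)  # log-odds for non-E. coli infection (OR 3.76)
b0 = -4.0                     # intercept: hypothetical, not reported

def p_dilating_reflux(age_months, non_e_coli):
    """Predicted probability of dilating reflux from a logistic model."""
    logit = b0 + b_age * age_months + b_non_ecoli * non_e_coli
    return 1.0 / (1.0 + math.exp(-logit))

# The non-E. coli effect on the predicted probability grows with age.
for age in (6, 24, 60):
    print(age, round(p_dilating_reflux(age, 0), 3),
          round(p_dilating_reflux(age, 1), 3))
```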

  1. Improving effectiveness of systematic conservation planning with density data.

    PubMed

    Veloz, Samuel; Salas, Leonardo; Altman, Bob; Alexander, John; Jongsomjit, Dennis; Elliott, Nathan; Ballard, Grant

    2015-08-01

    Systematic conservation planning aims to design networks of protected areas that meet conservation goals across large landscapes. The optimal design of these conservation networks is most frequently based on the modeled habitat suitability or probability of occurrence of species, despite evidence that model predictions may not be highly correlated with species density. We hypothesized that conservation networks designed using species density distributions more efficiently conserve populations of all species considered than networks designed using probability of occurrence models. To test this hypothesis, we used the Zonation conservation prioritization algorithm to evaluate conservation network designs based on probability of occurrence versus density models for 26 land bird species in the U.S. Pacific Northwest. We assessed the efficacy of each conservation network based on predicted species densities and predicted species diversity. High-density model Zonation rankings protected more individuals per species when networks protected the highest priority 10-40% of the landscape. Compared with density-based models, the occurrence-based models protected more individuals in the lowest 50% priority areas of the landscape. The 2 approaches conserved species diversity in similar ways: predicted diversity was higher in higher priority locations in both conservation networks. We conclude that both density and probability of occurrence models can be useful for setting conservation priorities but that density-based models are best suited for identifying the highest priority areas. Developing methods to aggregate species count data from unrelated monitoring efforts and making these data widely available through ecoinformatics portals such as the Avian Knowledge Network will enable species count data to be more widely incorporated into systematic conservation planning efforts. © 2015, Society for Conservation Biology.

  2. Bayesian calibration of mechanistic aquatic biogeochemical models and benefits for environmental management

    NASA Astrophysics Data System (ADS)

    Arhonditsis, George B.; Papantou, Dimitra; Zhang, Weitao; Perhar, Gurbir; Massos, Evangelia; Shi, Molu

    2008-09-01

    Aquatic biogeochemical models have been an indispensable tool for addressing pressing environmental issues, e.g., understanding the oceanic response to climate change, elucidating the interplay between plankton dynamics and atmospheric CO2 levels, and examining alternative management schemes for eutrophication control. Their ability to form the scientific basis for environmental management decisions can be undermined by the underlying structural and parametric uncertainty. In this study, we outline how we can attain realistic predictive links between management actions and ecosystem response through a probabilistic framework that accommodates rigorous uncertainty analysis of a variety of error sources, i.e., measurement error, parameter uncertainty, and the discrepancy between the model and the natural system. Because model uncertainty analysis essentially aims to quantify the joint probability distribution of model parameters and to make inference about this distribution, we believe that the iterative nature of Bayes' Theorem is a logical means to incorporate existing knowledge and update the joint distribution as new information becomes available. The statistical methodology begins with the characterization of parameter uncertainty in the form of probability distributions; water quality data are then used to update the distributions, yielding posterior parameter estimates along with predictive uncertainty bounds. Our illustration is based on a six-state-variable (nitrate, ammonium, dissolved organic nitrogen, phytoplankton, zooplankton, and bacteria) ecological model developed for gaining insight into the mechanisms that drive plankton dynamics in a coastal embayment: the Gulf of Gera, Island of Lesvos, Greece. The lack of analytical expressions for the posterior parameter distributions was overcome using Markov chain Monte Carlo simulations, a convenient way to obtain representative samples of parameter values. The Bayesian calibration resulted in realistic reproduction of the key temporal patterns of the system, offered insights into the degree of information the data contain about model inputs, and allowed the quantification of the dependence structure among the parameter estimates. Finally, our study uses two synthetic datasets to examine the ability of the updated model to provide estimates of predictive uncertainty for water quality variables of environmental management interest.
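    The prior-update-posterior machinery described above can be illustrated with a minimal stand-in: a single decay-rate parameter calibrated to noisy observations by random-walk Metropolis. The toy model, data, prior, and tuning constants are all hypothetical; the study applies the same logic, via Markov chain Monte Carlo, to a six-state-variable plankton model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "water quality" observations from a one-parameter decay model.
t_obs = np.linspace(0.0, 10.0, 20)
k_true, sigma = 0.7, 0.05
y_obs = np.exp(-k_true * t_obs) + rng.normal(0.0, sigma, t_obs.size)

def log_prior(k):            # lognormal prior on a positive rate constant
    return -0.5 * ((np.log(k) - np.log(0.5)) / 0.5) ** 2 if k > 0 else -np.inf

def log_like(k):
    resid = y_obs - np.exp(-k * t_obs)
    return -0.5 * np.sum((resid / sigma) ** 2)

# Random-walk Metropolis: propose, accept with probability min(1, ratio).
k, chain = 0.5, []
lp = log_prior(k) + log_like(k)
for _ in range(20_000):
    k_prop = k + rng.normal(0.0, 0.05)
    lp_prop = log_prior(k_prop) + log_like(k_prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        k, lp = k_prop, lp_prop
    chain.append(k)

post = np.array(chain[5_000:])        # discard burn-in
print("posterior mean +/- sd:", post.mean(), post.std())
```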

  3. Multiscale modelling of precipitation in concentrated alloys: from atomistic Monte Carlo simulations to cluster dynamics I thermodynamics

    NASA Astrophysics Data System (ADS)

    Lépinoux, J.; Sigli, C.

    2018-01-01

    In a recent paper, the authors showed how the clusters' free energies are constrained by the coagulation probability, and explained various anomalies observed during precipitation kinetics in concentrated alloys. This coagulation probability appeared to be too complex a function to be predicted accurately from the cluster distribution alone in Cluster Dynamics (CD). Using atomistic Monte Carlo (MC) simulations, it is shown that during a transformation at constant temperature, after a short transient regime, the transformation occurs at quasi-equilibrium. It is proposed to use MC simulations until the system quasi-equilibrates and then to switch to CD, which is mean-field but not limited by a box size as MC is. In this paper, we explain how to take into account the information available before the quasi-equilibrium state to establish guidelines for safely predicting the cluster free energies.

  4. Refinement of the probability density function model for preferential concentration of aerosol particles in isotropic turbulence

    NASA Astrophysics Data System (ADS)

    Zaichik, Leonid I.; Alipchenkov, Vladimir M.

    2007-11-01

    The purposes of the paper are threefold: (i) to refine the statistical model of preferential particle concentration in isotropic turbulence that was previously proposed by Zaichik and Alipchenkov [Phys. Fluids 15, 1776 (2003)], (ii) to investigate the effect of clustering of low-inertia particles using the refined model, and (iii) to advance a simple model for predicting the collision rate of aerosol particles. The model developed is based on a kinetic equation for the two-point probability density function of the relative velocity distribution of particle pairs. Improvements in predicting the preferential concentration of low-inertia particles are attained by refining the description of the turbulent velocity field of the carrier fluid to include a difference between the time scales of the strain-rate and rotation-rate correlations. The refined model results in better agreement with direct numerical simulations for aerosol particles.

  5. Can different quantum state vectors correspond to the same physical state? An experimental test

    NASA Astrophysics Data System (ADS)

    Nigg, Daniel; Monz, Thomas; Schindler, Philipp; Martinez, Esteban A.; Hennrich, Markus; Blatt, Rainer; Pusey, Matthew F.; Rudolph, Terry; Barrett, Jonathan

    2016-01-01

    A century after the development of quantum theory, the interpretation of a quantum state is still discussed. If a physicist claims to have produced a system with a particular quantum state vector, does this represent directly a physical property of the system, or is the state vector merely a summary of the physicist’s information about the system? Assume that a state vector corresponds to a probability distribution over possible values of an unknown physical or ‘ontic’ state. Then, a recent no-go theorem shows that distinct state vectors with overlapping distributions lead to predictions different from quantum theory. We report an experimental test of these predictions using trapped ions. Within experimental error, the results confirm quantum theory. We analyse which kinds of models are ruled out.

  6. Wormlike Chain Theory and Bending of Short DNA

    NASA Astrophysics Data System (ADS)

    Mazur, Alexey K.

    2007-05-01

    The probability distributions for bending angles in double-helical DNA obtained in all-atom molecular dynamics simulations are compared with theoretical predictions. The computed distributions agree remarkably well with the wormlike chain theory and qualitatively differ from the predictions of the subelastic chain model. The computed data exhibit only small anomalies in the apparent flexibility of short DNA and cannot account for the recently reported AFM data. It is possible that current atomistic DNA models miss some essential mechanisms of DNA bending on intermediate length scales. Analysis of bent DNA structures reveals, however, that the bending motion is structurally heterogeneous and directionally anisotropic on the length scales where the experimental anomalies were detected. These effects are essential for the interpretation of the experimental data and may also be responsible for the apparent discrepancy.
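    For reference, the wormlike-chain prediction being tested takes, in the weakly bending limit, a near-Gaussian form for the bending angle \theta accumulated over a contour length L (a standard textbook expression, quoted here as background rather than from the paper):

    P(\theta)\, d\theta \propto \sin\theta \, \exp\!\left( -\frac{\ell_p \theta^{2}}{2L} \right) d\theta ,

    where \ell_p \approx 50 nm is the DNA persistence length. Subelastic-chain models predict substantially heavier large-angle tails, which is what the AFM experiments appeared to see and the simulations here do not reproduce.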

  7. Anomalous finite-size effects in the Battle of the Sexes

    NASA Astrophysics Data System (ADS)

    Cremer, J.; Reichenbach, T.; Frey, E.

    2008-06-01

    The Battle of the Sexes describes asymmetric conflicts in the mating behavior of males and females. Males can be philanderers or faithful, while females are either fast or coy, leading to cyclic dynamics. The adjusted replicator equation predicts stable coexistence of all four strategies. In this situation, we consider the effects of fluctuations stemming from a finite population size. We show that they unavoidably lead to the extinction of two strategies in the population. However, the typical time until extinction occurs grows strongly with increasing system size. In the emerging time window, a quasi-stationary probability distribution forms that is anomalously flat in the vicinity of the coexistence state. This behavior originates in a vanishing linear deterministic drift near the fixed point. We provide numerical data as well as an analytical approach to the mean extinction time and the quasi-stationary probability distribution.

  8. Model microswimmers in channels with varying cross section

    NASA Astrophysics Data System (ADS)

    Malgaretti, Paolo; Stark, Holger

    2017-05-01

    We study different types of microswimmers moving in channels with varying cross section and thereby interacting hydrodynamically with the channel walls. Starting from the Smoluchowski equation for a dilute suspension, for which interactions among swimmers can be neglected, we derive analytic expressions for the lateral probability distribution between plane channel walls. For weakly corrugated channels, we extend the Fick-Jacobs approach to microswimmers and thereby derive an effective equation for the probability distribution along the channel axis. Two regimes arise dominated either by entropic forces due to the geometrical confinement or by the active motion. In particular, our results show that the accumulation of microswimmers at channel walls is sensitive to both the underlying swimming mechanism and the geometry of the channels. Finally, for asymmetric channel corrugation, our model predicts a rectification of microswimmers along the channel, the strength and direction of which strongly depends on the swimmer type.

  9. Future probabilities of coastal floods in Finland

    NASA Astrophysics Data System (ADS)

    Pellikka, Havu; Leijala, Ulpu; Johansson, Milla M.; Leinonen, Katri; Kahma, Kimmo K.

    2018-04-01

    Coastal planning requires detailed knowledge of future flooding risks, and effective planning must consider both short-term sea level variations and the long-term trend. We calculate distributions that combine short- and long-term effects to provide estimates of flood probabilities in 2050 and 2100 on the Finnish coast in the Baltic Sea. Our distributions of short-term sea level variations are based on 46 years (1971-2016) of observations from the 13 Finnish tide gauges. The long-term scenarios of mean sea level combine postglacial land uplift, regionally adjusted scenarios of global sea level rise, and the effect of changes in the wind climate. The results predict that flooding risks will clearly increase by 2100 in the Gulf of Finland and the Bothnian Sea, while only a small increase or no change compared to present-day conditions is expected in the Bothnian Bay, where the land uplift is stronger.

  10. Parsimonious nonstationary flood frequency analysis

    NASA Astrophysics Data System (ADS)

    Serago, Jake M.; Vogel, Richard M.

    2018-02-01

    There is now widespread awareness of the impact of anthropogenic influences on extreme floods (and droughts) and thus an increasing need for methods to account for such influences when estimating a frequency distribution. We introduce a parsimonious approach to nonstationary flood frequency analysis (NFFA) based on a bivariate regression equation which describes the relationship between annual maximum floods, x, and an exogenous variable which may explain the nonstationary behavior of x. The conditional mean, variance and skewness of both x and y = ln(x) are derived, and combined with numerous common probability distributions including the lognormal, generalized extreme value and log Pearson type III models, resulting in a very simple and general approach to NFFA. Our approach offers several advantages over existing approaches including: parsimony, ease of use, graphical display, prediction intervals, and opportunities for uncertainty analysis. We introduce nonstationary probability plots and document how such plots can be used to assess the improved goodness of fit associated with an NFFA.
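    A minimal sketch of the regression idea on synthetic data (the covariate z, coefficients, and the conditional lognormal choice are assumptions for illustration): regress y = ln(x) on the exogenous variable, then read nonstationary flood quantiles off the conditional distribution.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(2)

# Synthetic annual maxima whose log-mean drifts with a covariate z
# (e.g., an urbanization or climate index); all values are hypothetical.
years = np.arange(1960, 2020)
z = (years - 1960) / 60.0
y = 5.0 + 0.8 * z + rng.normal(0.0, 0.3, years.size)   # y = ln(x)

# Bivariate regression of y on z gives the conditional moments of ln(x).
b, a = np.polyfit(z, y, 1)                  # slope, intercept
s = np.sqrt(np.sum((y - (a + b * z)) ** 2) / (y.size - 2))

def flood_quantile(z0, p):
    """p-quantile of the annual maximum, given covariate value z0,
    under a conditional lognormal model."""
    return np.exp(a + b * z0 + norm.ppf(p) * s)

# The "100-year flood" is no longer a single number: it shifts with z.
print("q99 at z = 0:", flood_quantile(0.0, 0.99))
print("q99 at z = 1:", flood_quantile(1.0, 0.99))
```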

  11. Crystallization of hard spheres revisited. I. Extracting kinetics and free energy landscape from forward flux sampling.

    PubMed

    Richard, David; Speck, Thomas

    2018-03-28

    We investigate the kinetics and the free energy landscape of the crystallization of hard spheres from a supersaturated metastable liquid through direct simulations and forward flux sampling. In this first paper, we describe and test two different ways to reconstruct the free energy barriers from the sampled steady state probability distribution of cluster sizes without sampling the equilibrium distribution. The first method is based on mean first passage times, and the second method is based on splitting probabilities. We verify both methods for a single particle moving in a double-well potential. For the nucleation of hard spheres, these methods allow us to probe a wide range of supersaturations and to reconstruct the kinetics and the free energy landscape from the same simulation. Results are consistent with the scaling predicted by classical nucleation theory, although a quantitative fit requires a rather large effective interfacial tension.

  12. Crystallization of hard spheres revisited. I. Extracting kinetics and free energy landscape from forward flux sampling

    NASA Astrophysics Data System (ADS)

    Richard, David; Speck, Thomas

    2018-03-01

    We investigate the kinetics and the free energy landscape of the crystallization of hard spheres from a supersaturated metastable liquid through direct simulations and forward flux sampling. In this first paper, we describe and test two different ways to reconstruct the free energy barriers from the sampled steady state probability distribution of cluster sizes without sampling the equilibrium distribution. The first method is based on mean first passage times, and the second method is based on splitting probabilities. We verify both methods for a single particle moving in a double-well potential. For the nucleation of hard spheres, these methods allow us to probe a wide range of supersaturations and to reconstruct the kinetics and the free energy landscape from the same simulation. Results are consistent with the scaling predicted by classical nucleation theory, although a quantitative fit requires a rather large effective interfacial tension.
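    Both reconstruction routes in this paper target the same object: a free-energy profile over cluster size n defined, up to an additive constant, by F(n) = -kT ln P(n). The sketch below illustrates only that defining relation on a synthetic distribution with a classical-nucleation-theory shape (all constants invented); the paper's contribution is recovering F from mean first passage times or splitting probabilities without sampling the equilibrium distribution.

```python
import numpy as np

kT = 1.0                                    # work in units of kT

# Synthetic landscape: surface term ~ n^(2/3) against a bulk term ~ n.
n = np.arange(1, 200)
F_true = 2.5 * n ** (2.0 / 3.0) - 0.4 * n
P = np.exp(-F_true / kT)
P /= P.sum()                                # synthetic cluster-size distribution

# Defining relation: F(n) = -kT ln P(n), anchored here at F(1) = 0.
F_rec = -kT * np.log(P)
F_rec -= F_rec[0]

i = np.argmax(F_rec)
print(f"barrier height ~ {F_rec[i]:.1f} kT at critical size n* = {n[i]}")
```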

  13. Conflation and aggregation of spatial data improve predictive models for species with limited habitats: a case of the threatened yellow-billed cuckoo in Arizona, USA

    USGS Publications Warehouse

    Villarreal, Miguel L.; van Riper, Charles; Petrakis, Roy E.

    2013-01-01

    Riparian vegetation provides important wildlife habitat in the Southwestern United States, but limited distributions and spatial complexity often lead to inaccurate representation in maps used to guide conservation. We test the use of data conflation and aggregation on multiple vegetation/land-cover maps to improve the accuracy of habitat models for the threatened western yellow-billed cuckoo (Coccyzus americanus occidentalis). We used species observations (n = 479) from a state-wide survey to develop habitat models from 1) three vegetation/land-cover maps produced at different geographic scales ranging from state to national, and 2) new aggregate maps defined by the spatial agreement of cover types, which were defined as high (agreement = all data sets), moderate (agreement ≥ 2), and low (no agreement required). Model accuracies, predicted habitat locations, and total area of predicted habitat varied considerably, illustrating the effects of input data quality on habitat predictions and the resulting potential impacts on conservation planning. Habitat models based on aggregated and conflated data were more accurate and had higher model sensitivity than those based on the original vegetation/land-cover maps, but this accuracy came at the cost of a reduced geographic extent of predicted habitat. Using the highest performing models, we assessed cuckoo habitat preference and distribution in Arizona and found that major watersheds containing high-probability habitat are fragmented by wide swaths of low-probability habitat. A focus on riparian restoration in these areas could provide more breeding habitat for the threatened cuckoo, offset potential future habitat losses in adjacent watersheds, and increase regional connectivity for other threatened vertebrates that also use riparian corridors.

  14. Multi-dimensional classification of GABAergic interneurons with Bayesian network-modeled label uncertainty.

    PubMed

    Mihaljević, Bojan; Bielza, Concha; Benavides-Piccione, Ruth; DeFelipe, Javier; Larrañaga, Pedro

    2014-01-01

    Interneuron classification is an important and long-debated topic in neuroscience. A recent study provided a data set of digitally reconstructed interneurons classified by 42 leading neuroscientists according to a pragmatic classification scheme composed of five categorical variables, namely, the interneuron type and four features of axonal morphology. From this data set we learned a model which can classify interneurons, on the basis of their axonal morphometric parameters, into these five descriptive variables simultaneously. Because of differences in opinion among the neuroscientists, especially regarding neuronal type, for many interneurons we lacked a unique, agreed-upon classification which we could use to guide model learning. Instead, we guided model learning with a probability distribution over the neuronal type and the axonal features, obtained, for each interneuron, from the neuroscientists' classification choices. We conveniently encoded such probability distributions with Bayesian networks, calling them label Bayesian networks (LBNs), and developed a method to predict them. This method predicts an LBN by forming a probabilistic consensus among the LBNs of the interneurons most similar to the one being classified. We used 18 axonal morphometric parameters as predictor variables, 13 of which we introduce in this paper as quantitative counterparts to the categorical axonal features. We were able to accurately predict interneuronal LBNs. Furthermore, when extracting crisp (i.e., non-probabilistic) predictions from the predicted LBNs, our method outperformed related work on interneuron classification. Our results indicate that our method is adequate for multi-dimensional classification of interneurons with probabilistic labels. Moreover, the introduced morphometric parameters are good predictors of interneuron type and the four features of axonal morphology, and thus may serve as objective counterparts to the subjective, categorical axonal features.

  15. Seasonal streamflow prediction using ensemble streamflow prediction technique for the Rangitata and Waitaki River basins on the South Island of New Zealand

    NASA Astrophysics Data System (ADS)

    Singh, Shailesh Kumar

    2014-05-01

    Streamflow forecasts are essential for making critical decisions about the optimal allocation of water supplies for various demands, including irrigation for agriculture, habitat for fisheries, hydropower production, and flood warning. The major objective of this study is to explore Ensemble Streamflow Prediction (ESP) based forecasting in New Zealand catchments and to highlight the present seasonal flow forecasting capability of the National Institute of Water and Atmospheric Research (NIWA). In this study a probabilistic forecast framework for ESP is presented. The basic assumption in ESP is that future weather patterns will resemble those experienced historically. Hence, past forcing data can be used with the current initial condition to generate an ensemble of predictions. Small differences in initial conditions can result in large differences in the forecast. The initial state of the catchment is obtained by running the model continuously up to the current time; this initial state is then used with past forcing data to generate an ensemble of future flows. The approach taken here is to run the TopNet hydrological model with a range of past forcing data (precipitation, temperature, etc.) from the current initial conditions. The collection of runs is called the ensemble, and probability distributions can be derived from the ensemble members. These probability distributions capture part of the intrinsic uncertainty in weather and climate. An ensemble streamflow prediction system that provides probabilistic hydrological forecasts with lead times of up to 3 months is presented for the Rangitata, Ahuriri, Hooker, and Jollie rivers on the South Island of New Zealand. ESP-based seasonal forecasts show better skill than climatology. This system can provide better overall information for holistic water resource management.
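    The ESP recipe itself is compact enough to sketch (a toy linear reservoir stands in for TopNet; the forcing record, parameters, and state value are hypothetical): hold the current initial state fixed, replay each historical forcing sequence through the model, and summarize the resulting ensemble as percentiles.

```python
import numpy as np

def toy_runoff_model(state, precip):
    """Stand-in for TopNet: a linear reservoir stepped once per day."""
    flows = []
    for p in precip:
        state = state + p            # add the day's forcing
        q = 0.05 * state             # outflow proportional to storage
        state -= q
        flows.append(q)
    return np.array(flows)

rng = np.random.default_rng(3)
# One 90-day historical forcing sequence per past year (synthetic).
history = {yr: rng.gamma(0.6, 4.0, 90) for yr in range(1980, 2010)}

state_now = 120.0   # initial state from running the model up to today

# ESP: one trace per historical year, all from the same initial state.
ensemble = np.array([toy_runoff_model(state_now, p) for p in history.values()])

# Probabilistic seasonal forecast: percentiles of total 90-day volume.
q10, q50, q90 = np.percentile(ensemble.sum(axis=1), [10, 50, 90])
print(f"seasonal volume forecast (10/50/90%): {q10:.0f}/{q50:.0f}/{q90:.0f}")
```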

  16. Design prediction for long term stress rupture service of composite pressure vessels

    NASA Technical Reports Server (NTRS)

    Robinson, Ernest Y.

    1992-01-01

    Extensive stress rupture studies on glass composites and Kevlar composites were conducted by the Lawrence Radiation Laboratory beginning in the late 1960's and extending to about 8 years in some cases. Some of the data from these studies published over the years were incomplete or were tainted by spurious failures, such as grip slippage. Updated data sets were defined for both fiberglass and Kevlar composite strand test specimens. These updated data are analyzed in this report by a convenient form of the bivariate Weibull distribution to establish a consistent set of design prediction charts that may be used as a conservative basis for predicting the stress rupture life of composite pressure vessels. The updated glass composite data exhibit an invariant Weibull modulus with lifetime. The data are analyzed in terms of homologous service load (referenced to the observed median strength). The equations relating life, homologous load, and probability are given, and corresponding design prediction charts are presented. A similar approach is taken for Kevlar composites, where the updated strand data do show a turndown tendency at long life accompanied by a corresponding change (increase) of the Weibull modulus. The turndown characteristic is not present in stress rupture test data of Kevlar pressure vessels. A modification of the stress rupture equations is presented to incorporate a latent, but limited, strength drop, and design prediction charts are presented that incorporate such behavior. The methods presented utilize Cartesian plots of the probability distributions (which are a more natural display for the design engineer), based on median-normalized data that are independent of statistical parameters and are readily defined for any set of test data.
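    The design-chart logic generalizes to a compact form: assume the rupture life at homologous load s is Weibull-distributed with a scale that falls off as a power of the load. This is a common stress-rupture parameterization, and the constants below are invented for illustration; they are not the report's fitted values.

```python
import numpy as np

# Weibull modulus, reference life [h], and load exponent (all hypothetical).
m, t0, rho = 1.2, 1.0e4, 20.0

def rupture_probability(t_hours, s):
    """P(failure by time t) at homologous load s = stress / median strength,
    with a Weibull life law t ~ Weibull(m, t_c) and t_c(s) = t0 * s**-rho."""
    t_c = t0 * s ** (-rho)
    return 1.0 - np.exp(-(t_hours / t_c) ** m)

# One row of a design chart: failure probability after 10 years on load.
t = 10 * 8760.0
for s in (0.4, 0.5, 0.6):
    print(f"s = {s:.1f}: P_f = {rupture_probability(t, s):.3g}")
```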

  17. Design of a sampling plan to detect ochratoxin A in green coffee.

    PubMed

    Vargas, E A; Whitaker, T B; Dos Santos, E A; Slate, A B; Lima, F B; Franca, R C A

    2006-01-01

    The establishment of maximum limits for ochratoxin A (OTA) in coffee by importing countries requires that coffee-producing countries develop scientifically based sampling plans to assess OTA contents in lots of green coffee before coffee enters the market, thus reducing consumer exposure to OTA, minimizing the number of lots rejected, and reducing financial losses for producing countries. A study was carried out to design an official sampling plan to determine OTA in green coffee produced in Brazil. Twenty-five lots of green coffee (type 7 - approximately 160 defects) were sampled according to an experimental protocol in which 16 test samples were taken from each lot (a total of 16 kg), resulting in a total of 800 OTA analyses. The total, sampling, sample preparation, and analytical variances were 10.75 (CV = 65.6%), 7.80 (CV = 55.8%), 2.84 (CV = 33.7%), and 0.11 (CV = 6.6%), respectively, assuming a regulatory limit of 5 microg kg(-1) OTA and using a 1 kg sample, a Romer RAS mill, 25 g sub-samples, and high performance liquid chromatography. The observed OTA distribution among the 16 OTA sample results was compared to several theoretical distributions. The 2-parameter lognormal distribution was selected to model OTA test results for green coffee, as it gave the best fit across all 25 lot distributions. Specific computer software was developed using the variance and distribution information to predict the probability of accepting or rejecting coffee lots at specific OTA concentrations. The acceptance probability was used to compute an operating characteristic (OC) curve specific to a sampling plan design. The OC curve was used to predict the rejection of good lots (sellers' or exporters' risk) and the acceptance of bad lots (buyers' or importers' risk).
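    A hedged sketch of how such an OC curve is generated: model the test result for a lot at a given true concentration as a 2-parameter lognormal (the distribution the study selected), with spread set by the reported total CV, and count how often the measured value falls at or below the 5 microgram per kg limit. Treating the CV as constant across concentrations is a simplification made here for illustration.

```python
import numpy as np

rng = np.random.default_rng(4)
LIMIT = 5.0        # regulatory limit [micrograms per kg]
CV_TOTAL = 0.656   # total CV reported at the 5 microgram per kg limit

def accept_probability(lot_conc, n_rep=200_000):
    """P(measured OTA <= LIMIT) for a lot at true concentration lot_conc."""
    # Lognormal parameterized to match the target mean and CV.
    s2 = np.log(1.0 + CV_TOTAL ** 2)
    mu = np.log(lot_conc) - s2 / 2.0
    results = rng.lognormal(mu, np.sqrt(s2), n_rep)
    return (results <= LIMIT).mean()

# One OC-curve point per lot concentration.
for c in (2, 4, 5, 6, 10):
    print(f"{c:>3} ug/kg: P(accept) = {accept_probability(c):.2f}")
```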

  18. Impact assessment of a high-speed railway line on species distribution: application to the European tree frog (Hyla arborea) in Franche-Comté.

    PubMed

    Clauzel, Céline; Girardet, Xavier; Foltête, Jean-Christophe

    2013-09-30

    The aim of the present work is to assess the potential long-distance effect of a high-speed railway line on the distribution of the European tree frog (Hyla arborea) in eastern France by combining graph-based analysis and species distribution models. This combination is a way to integrate patch-level connectivity metrics on different scales into a predictive model. The approach used is put in place before the construction of the infrastructure and allows areas potentially affected by isolation to be mapped. Through a diachronic analysis, comparing species distribution before and after the construction of the infrastructure, we identify changes in the probability of species presence and we determine the maximum distance of impact. The results show that the potential impact decreases with distance from the high-speed railway line and the largest disturbances occur within the first 500 m. Between 500 m and 3500 m, the infrastructure generates a moderate decrease in the probability of presence with maximum values close to -40%. Beyond 3500 m the average disturbance is less than -10%. The spatial extent of the impact is greater than the dispersal distance of the tree frog, confirming the assumption of the long-distance effect of the infrastructure. This predictive modelling approach appears to be a useful tool for environmental impact assessment and strategic environmental assessment. The results of the species distribution assessment may provide guidance for field surveys and support for conservation decisions by identifying the areas most affected. Copyright © 2013 Elsevier Ltd. All rights reserved.

  19. Modelling the spatial distribution of Fasciola hepatica in dairy cattle in Europe.

    PubMed

    Ducheyne, Els; Charlier, Johannes; Vercruysse, Jozef; Rinaldi, Laura; Biggeri, Annibale; Demeler, Janina; Brandt, Christina; De Waal, Theo; Selemetas, Nikolaos; Höglund, Johan; Kaba, Jaroslaw; Kowalczyk, Slawomir J; Hendrickx, Guy

    2015-03-26

    A harmonized sampling approach in combination with spatial modelling is required to update current knowledge of fasciolosis in dairy cattle in Europe. Within the scope of the EU project GLOWORM, samples from 3,359 randomly selected farms in 849 municipalities in Belgium, Germany, Ireland, Poland and Sweden were collected and their infection status assessed using an indirect bulk tank milk (BTM) enzyme-linked immunosorbent assay (ELISA). Dairy farms were considered exposed when the optical density ratio (ODR) exceeded the 0.3 cut-off. Two ensemble-modelling techniques, Random Forests (RF) and Boosted Regression Trees (BRT), were used to obtain the spatial distribution of the probability of exposure to Fasciola hepatica using remotely sensed environmental variables (1-km spatial resolution) and interpolated values from meteorological stations as predictors. The median ODRs amounted to 0.31, 0.12, 0.54, 0.25 and 0.44 for Belgium, Germany, Ireland, Poland and southern Sweden, respectively. Using the 0.3 threshold, 571 municipalities were categorized as positive and 429 as negative. RF predicted the spatial distribution of exposure with an area under the receiver operating characteristic (ROC) curve (AUC) of 0.83 (0.96 for BRT). Both models identified rainfall and temperature as the most important factors for the probability of exposure. Areas of high and low exposure were identified by both models, with BRT better at discriminating between low-probability and high-probability exposure; this model may therefore be more useful in practice. Given a harmonized sampling strategy, it should be possible to generate robust spatial models for fasciolosis in dairy cattle in Europe to be used as input for temporal models and for the detection of deviations in baseline probability. Further research is required for model output in areas outside the eco-climatic range investigated.

  20. The Effect of Nondeterministic Parameters on Shock-Associated Noise Prediction Modeling

    NASA Technical Reports Server (NTRS)

    Dahl, Milo D.; Khavaran, Abbas

    2010-01-01

    Engineering applications for aircraft noise prediction contain models of physical phenomena that enable solutions to be computed quickly. These models contain parameters whose uncertainty is not accounted for in the solution. To include uncertainty in the solution, nondeterministic computational methods are applied. Using prediction models for supersonic jet broadband shock-associated noise, fixed model parameters are replaced by probability distributions to illustrate one of these methods. The results show the impact of using nondeterministic parameters both on estimating the model output uncertainty and on the model spectral level prediction. In addition, a global sensitivity analysis is used to determine the influence of the model parameters on the output and to identify the parameters with the least influence on model output.

  1. Predicting a future lifetime through Box-Cox transformation.

    PubMed

    Yang, Z

    1999-09-01

    In predicting a future lifetime based on a sample of past lifetimes, the Box-Cox transformation method provides a simple and unified procedure that is shown in this article to meet or often outperform the corresponding frequentist solution in terms of coverage probability and average length of prediction intervals. Kullback-Leibler information and second-order asymptotic expansion are used to justify the Box-Cox procedure. Extensive Monte Carlo simulations are also performed to evaluate the small-sample behavior of the procedure. Certain popular lifetime distributions, such as the Weibull, inverse Gaussian and Birnbaum-Saunders, serve as illustrative examples. One important advantage of the Box-Cox procedure lies in its easy extension to linear model predictions, where exact frequentist solutions are often not available.
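    A minimal sketch of the Box-Cox prediction procedure on synthetic lifetimes (sample size, distribution, and interval level chosen arbitrarily): estimate the transform, build a normal-theory prediction interval for a single future observation on the transformed scale, and invert the transform.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
past = rng.weibull(1.5, 30) * 100.0   # hypothetical past lifetimes

# Box-Cox: the power transform making the sample most nearly normal.
z, lam = stats.boxcox(past)

# Normal-theory prediction interval on the transformed scale; the
# sqrt(1 + 1/n) factor accounts for predicting a *future* draw.
n = z.size
zbar, s = z.mean(), z.std(ddof=1)
half = stats.t.ppf(0.975, n - 1) * s * np.sqrt(1.0 + 1.0 / n)
lo, hi = zbar - half, zbar + half

# Invert the Box-Cox transform to state the interval in lifetime units.
inv = lambda y: (lam * y + 1.0) ** (1.0 / lam) if lam != 0 else np.exp(y)
print(f"lambda = {lam:.2f}, 95% prediction interval: "
      f"[{inv(lo):.1f}, {inv(hi):.1f}]")
```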

  2. Prediction of Spatiotemporal Patterns of Neural Activity from Pairwise Correlations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marre, O.; El Boustani, S.; Fregnac, Y.

    We designed a model-based analysis to predict the occurrence of population patterns in distributed spiking activity. Using a maximum entropy principle with a Markovian assumption, we obtain a model that accounts for both spatial and temporal pairwise correlations among neurons. This model is tested on data generated with a Glauber spin-glass system and is shown to correctly predict the occurrence probabilities of spatiotemporal patterns significantly better than Ising models based only on spatial correlations. This increase of predictability was also observed on experimental data recorded in parietal cortex during slow-wave sleep. This approach can also be used to generate surrogates that reproduce the spatial and temporal correlations of a given data set.

  3. Statistical thermodynamics of clustered populations.

    PubMed

    Matsoukas, Themis

    2014-08-01

    We present a thermodynamic theory for a generic population of M individuals distributed into N groups (clusters). We construct the ensemble of all distributions with fixed M and N, introduce a selection functional that embodies the physics that governs the population, and obtain the distribution that emerges in the scaling limit as the most probable among all distributions consistent with the given physics. We develop the thermodynamics of the ensemble and establish a rigorous mapping to regular thermodynamics. We treat the emergence of a so-called giant component as a formal phase transition and show that the criteria for its emergence are entirely analogous to the equilibrium conditions in molecular systems. We demonstrate the theory by an analytic model and confirm the predictions by Monte Carlo simulation.

  4. Modelling benthic macrofauna and seagrass distribution patterns in a North Sea tidal basin in response to 2050 climatic and environmental scenarios

    NASA Astrophysics Data System (ADS)

    Singer, Anja; Millat, Gerald; Staneva, Joanna; Kröncke, Ingrid

    2017-03-01

    Small-scale spatial distribution patterns of seven macrofauna species, seagrass beds, and mixed mussel/oyster reefs were modelled for the Jade Bay (North Sea, Germany) in response to climatic and environmental scenarios representing 2050. For the species distribution models, four presence-absence modelling methods were merged within the ensemble forecasting platform 'biomod2'. The present spatial distribution (representing 2009) was modelled from statistically related species presences, true species absences, and six high-resolution environmental grids. The future spatial distribution was then predicted in response to ongoing, climate change-induced (1) sea-level rise and (2) water temperature increase. Between 2009 and 2050, the present and future prediction maps revealed a significant range gain for two macrofauna species (Macoma balthica, Tubificoides benedii), whereas the range sizes of five macrofauna species remained relatively stable across space and time. The predicted probability of occurrence (PO) of two macrofauna species (Cerastoderma edule, Scoloplos armiger) decreased significantly under the potential future habitat conditions. In addition, a clear seagrass bed extension (Zostera noltii) on the lower intertidal flats (mixed sediments) and a decrease in the PO of mixed Mytilus edulis/Crassostrea gigas reefs were predicted for 2050. Until the mid-21st century, our future climatic and environmental scenario revealed significant changes in the range sizes (gains-losses) and/or the PO (increases-decreases) for seven of the 10 modelled species at the study site.

  5. A computational framework to empower probabilistic protein design

    PubMed Central

    Fromer, Menachem; Yanover, Chen

    2008-01-01

    Motivation: The task of engineering a protein to perform a target biological function is known as protein design. A commonly used paradigm casts this functional design problem as a structural one, assuming a fixed backbone. In probabilistic protein design, positional amino acid probabilities are used to create a random library of sequences to be simultaneously screened for biological activity. Clearly, certain choices of probability distributions will be more successful in yielding functional sequences. However, since the number of sequences is exponential in protein length, computational optimization of the distribution is difficult. Results: In this paper, we develop a computational framework for probabilistic protein design following the structural paradigm. We formulate the distribution of sequences for a structure using the Boltzmann distribution over their free energies. The corresponding probabilistic graphical model is constructed, and we apply belief propagation (BP) to calculate marginal amino acid probabilities. We test this method on a large structural dataset and demonstrate the superiority of BP over previous methods. Nevertheless, since the results obtained by BP are far from optimal, we thoroughly assess the paradigm using high-quality experimental data. We demonstrate that, for small scale sub-problems, BP attains identical results to those produced by exact inference on the paradigmatic model. However, quantitative analysis shows that the predicted distributions differ significantly from the experimental data. These findings, along with the excellent performance we observed using BP on the smaller problems, suggest potential shortcomings of the paradigm. We conclude with a discussion of how it may be improved in the future. Contact: fromer@cs.huji.ac.il PMID:18586717

  6. A machine learning approach to quantifying geologic similarities between sites of gas hydrate accumulation

    NASA Astrophysics Data System (ADS)

    Runyan, T. E.; Wood, W. T.; Palmsten, M. L.; Zhang, R.

    2016-12-01

    Gas hydrates, specifically methane hydrates, are sparsely sampled on a global scale, and their accumulation is difficult to predict geospatially. Several attempts have been made at estimating global inventories, and to some extent geospatial distribution, using geospatial extrapolations guided by geophysical and geochemical methods. Our objective is to quantitatively predict the geospatial likelihood of encountering methane hydrates, with uncertainty. Predictions could be incorporated into analyses of drilling hazards as well as climate change. We use global data sets (including water depth, temperature, pressure, TOC, sediment thickness, and heat flow) as parameters to train a k-nearest neighbor (KNN) machine learning technique. The KNN is unsupervised and non-parametric; we impose no interpretive influence on the prior probability distribution, so our results are strictly data driven. We have selected as test sites several locations where gas hydrates have been well studied, each with a significantly different geologic setting. These include the Blake Ridge (U.S. East Coast), Hydrate Ridge (U.S. West Coast), and the Gulf of Mexico. We then use KNN to quantify similarities between these sites and determine, via the distance in parameter space, the likelihood and uncertainty of encountering gas hydrate anywhere in the world. Here we operate under the assumption that the distance in parameter space is proportional to the probability of the occurrence of gas hydrate. We then compare the global similarity maps made from our several test sites to identify the geologic (geophysical, bio-geochemical) parameters best suited for predicting gas hydrate occurrence.
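    The distance-in-parameter-space idea can be sketched directly; the feature values below and the exponential mapping from distance to a likelihood score are hypothetical placeholders (the study trains on global gridded data sets rather than three rows).

```python
import numpy as np

# Columns: water depth [m], temperature [C], TOC [%], sediment thickness [km]
# (rows are invented stand-ins for well-studied reference sites).
train = np.array([[2800.0, 4.0, 1.5,  9.0],    # Blake Ridge-like
                  [ 800.0, 5.5, 2.0,  3.0],    # Hydrate Ridge-like
                  [1500.0, 4.5, 1.8, 12.0]])   # Gulf of Mexico-like
query = np.array([2500.0, 4.2, 1.3, 7.0])      # an unsampled location

# Standardize so each geologic parameter contributes comparably.
mu, sd = train.mean(axis=0), train.std(axis=0)
d = np.linalg.norm((train - mu) / sd - (query - mu) / sd, axis=1)

# Working assumption from the abstract: closeness in parameter space
# tracks the likelihood of hydrate occurrence; one monotone mapping:
score = np.exp(-d)
print("distances:", np.round(d, 2))
print("relative likelihood scores:", np.round(score / score.max(), 2))
```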

  7. Metocean design parameter estimation for fixed platform based on copula functions

    NASA Astrophysics Data System (ADS)

    Zhai, Jinjin; Yin, Qilin; Dong, Sheng

    2017-08-01

    Considering the dependence among wave height, wind speed, and current velocity, we construct novel trivariate joint probability distributions via Archimedean copula functions. Thirty years of wave height, wind speed, and current velocity data for the Bohai Sea are hindcast and sampled for a case study. Four kinds of distributions, namely the Gumbel distribution, lognormal distribution, Weibull distribution, and Pearson Type III distribution, are candidate models for the marginal distributions of wave height, wind speed, and current velocity. The Pearson Type III distribution is selected as the optimal model. Bivariate and trivariate probability distributions of these environmental conditions are established based on four bivariate and trivariate Archimedean copulas, namely the Clayton, Frank, Gumbel-Hougaard, and Ali-Mikhail-Haq copulas. These joint probability models can maximize marginal information and capture the dependence among the three variables. The design return values of these three variables can be obtained by three methods: univariate probability, conditional probability, and joint probability. The joint return periods of different load combinations are estimated by the proposed models, and platform responses (including base shear, overturning moment, and deck displacement) are further calculated. For the same return period, the design values of wave height, wind speed, and current velocity obtained by the conditional and joint probability models are much smaller than those obtained by univariate probability. By accounting for the dependence among variables, the multivariate probability distributions provide design parameters closer to the actual sea state for ocean platform design.
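    As a small illustration of the copula construction (the marginal non-exceedance probabilities and dependence parameter theta are invented for the example), the Gumbel-Hougaard copula joins two marginals into a joint CDF, from which an "OR"-type joint return period follows directly.

```python
import numpy as np

def gumbel_copula(u, v, theta):
    """Gumbel-Hougaard copula C(u, v); theta >= 1 sets dependence strength."""
    return np.exp(-(((-np.log(u)) ** theta
                     + (-np.log(v)) ** theta) ** (1.0 / theta)))

u = 0.99      # P(annual max wave height <= h), hypothetical design level
v = 0.99      # P(annual max wind speed  <= w), hypothetical design level
theta = 2.0   # hypothetical dependence parameter

# "OR" joint return period: either variable exceeds its level in a year
# with probability 1 - C(u, v); T is in years for annual maxima.
C = gumbel_copula(u, v, theta)
print(f"C(u, v) = {C:.4f}, joint 'OR' return period = {1.0 / (1.0 - C):.0f} yr")
```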

  8. Models of multidimensional discrete distribution of probabilities of random variables in information systems

    NASA Astrophysics Data System (ADS)

    Gromov, Yu Yu; Minin, Yu V.; Ivanova, O. G.; Morozova, O. N.

    2018-03-01

    Multidimensional discrete probability distributions of independent random variables were obtained; their one-dimensional counterparts are widely used in probability theory. Generating functions of these multidimensional distributions were also derived.

  9. Maximum predictive power and the superposition principle

    NASA Technical Reports Server (NTRS)

    Summhammer, Johann

    1994-01-01

    In quantum physics the direct observables are probabilities of events. We ask how observed probabilities must be combined to achieve what we call maximum predictive power. According to this concept the accuracy of a prediction must only depend on the number of runs whose data serve as input for the prediction. We transform each probability to an associated variable whose uncertainty interval depends only on the amount of data and strictly decreases with it. We find that for a probability which is a function of two other probabilities maximum predictive power is achieved when linearly summing their associated variables and transforming back to a probability. This recovers the quantum mechanical superposition principle.

  10. Forecasting the Rupture Directivity of Large Earthquakes: Centroid Bias of the Conditional Hypocenter Distribution

    NASA Astrophysics Data System (ADS)

    Donovan, J.; Jordan, T. H.

    2012-12-01

    Forecasting the rupture directivity of large earthquakes is an important problem in probabilistic seismic hazard analysis (PSHA), because directivity is known to strongly influence ground motions. We describe how rupture directivity can be forecast in terms of the "conditional hypocenter distribution" or CHD, defined to be the probability distribution of a hypocenter given the spatial distribution of moment release (fault slip). The simplest CHD is a uniform distribution, in which the hypocenter probability density equals the moment-release probability density. For rupture models in which the rupture velocity and rise time depend only on the local slip, the CHD completely specifies the distribution of the directivity parameter D, defined in terms of the degree-two polynomial moments of the source space-time function. This parameter, which is zero for a bilateral rupture and unity for a unilateral rupture, can be estimated from finite-source models or by the direct inversion of seismograms (McGuire et al., 2002). We compile D-values from published studies of 65 large earthquakes and show that these data are statistically inconsistent with the uniform CHD advocated by McGuire et al. (2002). Instead, the data indicate a "centroid biased" CHD, in which the expected distance between the hypocenter and the hypocentroid is less than that of a uniform CHD. In other words, the observed directivities appear to be closer to bilateral than predicted by this simple model. We discuss the implications of these results for rupture dynamics and fault-zone heterogeneities. We also explore their PSHA implications by modifying the CyberShake simulation-based hazard model for the Los Angeles region, which assumed a uniform CHD (Graves et al., 2011).

  11. Estimating large carnivore populations at global scale based on spatial predictions of density and distribution – Application to the jaguar (Panthera onca)

    PubMed Central

    Robinson, Hugh S.; Abarca, Maria; Zeller, Katherine A.; Velasquez, Grisel; Paemelaere, Evi A. D.; Goldberg, Joshua F.; Payan, Esteban; Hoogesteijn, Rafael; Boede, Ernesto O.; Schmidt, Krzysztof; Lampo, Margarita; Viloria, Ángel L.; Carreño, Rafael; Robinson, Nathaniel; Lukacs, Paul M.; Nowak, J. Joshua; Salom-Pérez, Roberto; Castañeda, Franklin; Boron, Valeria; Quigley, Howard

    2018-01-01

    Broad-scale population estimates of declining species are desired for conservation efforts. However, for many secretive species, including large carnivores, such estimates are often difficult. Based on published density estimates obtained through camera trapping, presence/absence data, and globally available predictive variables derived from satellite imagery, we modelled density and occurrence of a large carnivore, the jaguar, across the species’ entire range. We then combined these models in a hierarchical framework to estimate the total population. Our models indicate that potential jaguar density is best predicted by measures of primary productivity, with the highest densities in the most productive tropical habitats and a clear declining gradient with distance from the equator. Jaguar distribution, in contrast, is determined by the combined effects of human impacts and environmental factors: probability of jaguar occurrence increased with forest cover, mean temperature, and annual precipitation and declined with increases in the human footprint index and human density. Probability of occurrence was also significantly higher inside protected areas than outside of them. We estimated the world’s jaguar population at 173,000 (95% CI: 138,000–208,000) individuals, mostly concentrated in the Amazon Basin; elsewhere, populations tend to be small and fragmented. The high number of jaguars results from the large total area still occupied (almost 9 million km2) and low human densities (< 1 person/km2) coinciding with high primary productivity in the core area of the jaguar range. Our results show the importance of protected areas for jaguar persistence. We conclude that combining modelling of density and distribution can reveal ecological patterns and processes at global scales, provide robust estimates for use in species assessments, and guide broad-scale conservation actions. PMID:29579129

  12. State-space modeling to support management of brucellosis in the Yellowstone bison population

    USGS Publications Warehouse

    Hobbs, N. Thompson; Geremia, Chris; Treanor, John; Wallen, Rick; White, P.J.; Hooten, Mevin B.; Rhyan, Jack C.

    2015-01-01

    The bison (Bison bison) of the Yellowstone ecosystem, USA, exemplify the difficulty of conserving large mammals that migrate across the boundaries of conservation areas. Bison are infected with brucellosis (Brucella abortus) and their seasonal movements can expose livestock to infection. Yellowstone National Park has embarked on a program of adaptive management of bison, which requires a model that assimilates data to support management decisions. We constructed a Bayesian state-space model to reveal the influence of brucellosis on the Yellowstone bison population. A frequency-dependent model of brucellosis transmission was superior to a density-dependent model in predicting out-of-sample observations of horizontal transmission probability. A mixture model including both transmission mechanisms converged on frequency dependence. Conditional on the frequency-dependent model, brucellosis median transmission rate was 1.87 yr−1. The median of the posterior distribution of the basic reproductive ratio (R0) was 1.75. Seroprevalence of adult females varied around 60% over two decades, but only 9.6 of 100 adult females were infectious. Brucellosis depressed recruitment; estimated population growth rate λ averaged 1.07 for an infected population and 1.11 for a healthy population. We used five-year forecasting to evaluate the ability of different actions to meet management goals relative to no action. Annually removing 200 seropositive female bison increased by 30-fold the probability of reducing seroprevalence below 40% and increased by a factor of 120 the probability of achieving a 50% reduction in transmission probability relative to no action. Annually vaccinating 200 seronegative animals increased the likelihood of a 50% reduction in transmission probability by fivefold over no action. However, including uncertainty in the ability to implement management by representing stochastic variation in the number of accessible bison dramatically reduced the probability of achieving goals using interventions relative to no action. Because the width of the posterior predictive distributions of future population states expands rapidly with increases in the forecast horizon, managers must accept high levels of uncertainty. These findings emphasize the necessity of iterative, adaptive management with relatively short-term commitment to action and frequent reevaluation in response to new data and model forecasts. We believe our approach has broad applications.

  13. Machine health prognostics using the Bayesian-inference-based probabilistic indication and high-order particle filtering framework

    NASA Astrophysics Data System (ADS)

    Yu, Jianbo

    2015-12-01

    Prognostics is an efficient means of achieving zero-downtime performance, maximum productivity, and proactive maintenance of machines. Prognostics aims to assess and predict the time evolution of machine health degradation so that machine failures can be predicted and prevented. A novel prognostics system is developed based on a data-model-fusion scheme using a Bayesian-inference-based self-organizing map (SOM) and an integration of logistic regression (LR) and high-order particle filtering (HOPF). In this prognostics system, a baseline SOM is constructed to model the data distribution space of a healthy machine, under the assumption that predictable fault patterns are not available. A Bayesian-inference-based probability (BIP) derived from the baseline SOM is developed as a quantitative indication of machine health degradation. BIP is capable of offering a failure probability for the monitored machine, which has an intuitive interpretation in terms of the health degradation state. Based on historic BIPs, the constructed LR and its modeling noise constitute a high-order Markov process (HOMP) describing machine health propagation. HOPF is used to solve the HOMP estimation and predict the evolution of machine health in the form of a probability density function (PDF). An on-line model update scheme is developed to quickly adapt the Markov process to changes in machine health dynamics. Experimental results on a bearing test-bed illustrate the potential applications of the proposed system as an effective and simple tool for machine health prognostics.

  14. The Statistics of Urban Scaling and Their Connection to Zipf’s Law

    PubMed Central

    Gomez-Lievano, Andres; Youn, HyeJin; Bettencourt, Luís M. A.

    2012-01-01

    Urban scaling relations characterizing how diverse properties of cities vary on average with their population size have recently been shown to be a general quantitative property of many urban systems around the world. However, in previous studies the statistics of urban indicators were not analyzed in detail, raising important questions about the full characterization of urban properties and how scaling relations may emerge in these larger contexts. Here, we build a self-consistent statistical framework that characterizes the joint probability distributions of urban indicators and city population sizes across an urban system. To develop this framework empirically we use one of the most granular and stochastic urban indicators available, specifically measuring homicides in cities of Brazil, Colombia and Mexico, three nations with high and fast-changing rates of violent crime. We use these data to derive the conditional probability of the number of homicides per year given the population size of a city. To do this we use Bayes’ rule together with the estimated conditional probability of city size given the number of homicides and the distribution of total homicides. We then show that scaling laws emerge as expectation values of these conditional statistics. Knowledge of these distributions implies, in turn, a relationship between scaling and population size distribution exponents that can be used to predict Zipf’s exponent from urban indicator statistics. Our results also suggest how a general statistical theory of urban indicators may be constructed from the stochastic dynamics of social interaction processes in cities. PMID:22815745

  15. Radiation transport codes for potential applications related to radiobiology and radiotherapy using protons, neutrons, and negatively charged pions

    NASA Technical Reports Server (NTRS)

    Armstrong, T. W.

    1972-01-01

    Several Monte Carlo radiation transport computer codes are used to predict quantities of interest in the fields of radiotherapy and radiobiology. The calculational methods are described, and comparisons of calculated and experimental results are presented for dose distributions produced by protons, neutrons, and negatively charged pions. Comparisons of calculated and experimental cell survival probabilities are also presented.

  16. Statistical Maps of Ground Magnetic Disturbance Derived from Global Geospace Models

    NASA Astrophysics Data System (ADS)

    Rigler, E. J.; Wiltberger, M. J.; Love, J. J.

    2017-12-01

    Electric currents in space are the principal driver of magnetic variations measured at Earth's surface. These in turn induce geoelectric fields that present a natural hazard for technological systems like high-voltage power distribution networks. Modern global geospace models can reasonably simulate large-scale geomagnetic response to solar wind variations, but they are less successful at deterministic predictions of intense localized geomagnetic activity that most impacts technological systems on the ground. Still, recent studies have shown that these models can accurately reproduce the spatial statistical distributions of geomagnetic activity, suggesting that their physics are largely correct. Since the magnetosphere is a largely externally driven system, most model-measurement discrepancies probably arise from uncertain boundary conditions. So, with realistic distributions of solar wind parameters to establish its boundary conditions, we use the Lyon-Fedder-Mobarry (LFM) geospace model to build a synthetic multivariate statistical model of gridded ground magnetic disturbance. From this, we analyze the spatial modes of geomagnetic response, regress on available measurements to fill in unsampled locations on the grid, and estimate the global probability distribution of extreme magnetic disturbance. The latter offers a prototype geomagnetic "hazard map", similar to those used to characterize better-known geophysical hazards like earthquakes and floods.
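
    The mode-regression infilling step can be sketched with an SVD; the synthetic disturbance fields and magnetometer indices below are placeholders for the LFM model output and real observations:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Synthetic stand-in for gridded disturbance fields from many model runs:
    # rows = simulations, columns = grid points.
    X = rng.standard_normal((500, 200))

    # Spatial modes (EOFs) of the simulated disturbance.
    X_mean = X.mean(axis=0)
    U, s, Vt = np.linalg.svd(X - X_mean, full_matrices=False)
    modes = Vt[:5]                       # leading spatial modes

    # Suppose magnetometers sample only a few grid points.
    obs_idx = np.array([3, 40, 77, 120, 180])
    y_obs = X[0, obs_idx]                # pretend these are measurements

    # Least-squares fit of mode amplitudes to the sampled points, then
    # reconstruction at every (unsampled) grid point.
    A = modes[:, obs_idx].T
    amps, *_ = np.linalg.lstsq(A, y_obs - X_mean[obs_idx], rcond=None)
    reconstruction = X_mean + amps @ modes
    print(reconstruction[:5])
    ```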

  17. The statistical properties and possible causes of polar motion prediction errors

    NASA Astrophysics Data System (ADS)

    Kosek, Wieslaw; Kalarus, Maciej; Wnek, Agnieszka; Zbylut-Gorska, Maria

    2015-08-01

    The pole coordinate data predictions from different prediction contributors of the Earth Orientation Parameters Combination of Prediction Pilot Project (EOPCPPP) were studied to determine the statistical properties of polar motion forecasts, by examining the time series of differences between them and the future IERS pole coordinate data. The mean absolute errors, standard deviations, skewness and kurtosis of these differences were computed, together with their error bars, as a function of prediction length. The ensemble predictions show slightly smaller mean absolute errors and standard deviations; however, their skewness and kurtosis values are similar to those of the predictions from the individual contributors. The skewness and kurtosis make it possible to check whether these prediction differences follow a normal distribution. The kurtosis values diminish with prediction length, which means that the probability distribution of these prediction differences becomes more platykurtic than leptokurtic. Nonzero skewness values result from the oscillating character of these differences for particular prediction lengths, which can be attributed to the irregular change of the annual oscillation phase in the joint fluid (atmospheric + ocean + land hydrology) excitation functions. The variations of the annual oscillation phase computed by a combination of the Fourier-transform band-pass filter and the Hilbert transform from pole coordinate data, as well as from pole coordinate model data obtained from fluid excitations, are in good agreement.
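
    The moment statistics as a function of prediction length can be computed directly with scipy; the difference series below are synthetic placeholders for the EOPCPPP prediction-minus-IERS series:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)

    # diffs[i, j]: j-th (prediction - IERS) difference at lead time i;
    # synthetic placeholder data with error growing with lead time.
    diffs = rng.standard_normal((90, 300)) * np.linspace(1.0, 5.0, 90)[:, None]

    for lead in (0, 30, 89):
        d = diffs[lead]
        print(lead,
              np.mean(np.abs(d)),     # mean absolute error
              np.std(d, ddof=1),      # standard deviation
              stats.skew(d),          # skewness (0 for a normal sample)
              stats.kurtosis(d))      # excess kurtosis (>0 leptokurtic, <0 platykurtic)
    ```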

  18. Storage in alluvial deposits controls the timing of particle delivery from large watersheds, filtering upland erosional signals and delaying benefits from watershed best management practices

    NASA Astrophysics Data System (ADS)

    Pizzuto, J. E.; Skalak, K.; Karwan, D. L.

    2017-12-01

    Transport of suspended sediment and sediment-borne constituents (here termed fluvial particles) through large river systems can be significantly influenced by episodic storage in floodplains and other alluvial deposits. Geomorphologists quantify the importance of storage using sediment budgets, but these data alone are insufficient to determine how storage influences the routing of fluvial particles through river corridors across large spatial scales. For steady state systems, models that combine sediment budget data with "waiting time distributions" (to define how long deposited particles remain stored until being remobilized) and velocities during transport events can provide useful predictions. Limited field data suggest that waiting time distributions are well represented by power laws, extending from <1 to >10^4 years, while the probability of storage defined by sediment budgets varies from 0.1 km^-1 for small drainage basins to 0.001 km^-1 for the world's largest watersheds. Timescales of particle delivery from large watersheds are determined by storage rather than by transport processes, with most particles requiring 10^2-10^4 years to reach the basin outlet. These predictions suggest that erosional "signals" induced by climate change, tectonics, or anthropogenic activity will be transformed by storage before delivery to the outlets of large watersheds. In particular, best management practices (BMPs) implemented in upland source areas, designed to reduce the loading of fluvial particles to estuarine receiving waters, will not achieve their intended benefits for centuries (or longer). For transient systems, waiting time distributions cannot be constant, but will vary as portions of transient sediment "pulses" enter and are later released from storage. The delivery of sediment pulses under transient conditions can be predicted by adopting the hypothesis that the probability of erosion of stored particles will decrease with increasing "age" (where age is defined as the elapsed time since deposition). Then, waiting time and age distributions for stored particles become predictions based on the architecture of alluvial storage and the tendency for erosional processes to preferentially remove younger deposits, improving assessment of watershed BMPs and other important applications.
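
    A Monte Carlo sketch of transit time with power-law waiting times; the storage probability, velocity and Pareto parameters are illustrative values chosen to be consistent with the ranges quoted above:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def delivery_times(n=10000, length_km=1000.0, p_storage=0.01,
                       v_kmday=50.0, tmin_yr=0.1, alpha=1.5):
        """Monte Carlo transit times through a river corridor in which a
        particle enters storage with probability p_storage per km and
        waiting times follow a power law with lower bound tmin_yr."""
        n_events = rng.binomial(int(length_km), p_storage, size=n)
        times = np.empty(n)
        for i, k in enumerate(n_events):
            waits = tmin_yr * (1.0 + rng.pareto(alpha, size=k))  # years
            times[i] = length_km / v_kmday / 365.0 + waits.sum()
        return times

    t = delivery_times()
    print(np.median(t), np.percentile(t, 90))  # storage dominates transit time
    ```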

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, M; Choi, E; Chuong, M

    Purpose: To evaluate whether current radiobiological models can predict normal liver complications of radioactive Yttrium-90 ({sup 90}Y) selective internal radiation treatment (SIRT) for metastatic liver lesions, based on post-infusion {sup 90}Y PET images. Methods: A total of 20 patients with metastatic liver tumors treated with SIRT who received a post-infusion {sup 90}Y-PET/CT scan were analyzed in this work. The 3D activity distribution of the PET images was converted into a 3D dose distribution via a kernel convolution process. The physical dose distribution was converted into the equivalent dose delivered at 2 Gy per fraction (EQ2) based on the linear-quadratic (LQ) model, considering the dose-rate effect. The biological endpoint of this work was radiation-induced liver disease (RILD). The NTCPs were calculated with four different repair times (T1/2-Liver-Repair = 0, 0.5, 1.0, 2.0 hr), and three published NTCP models (Lyman external-RT, Lyman {sup 90}Y-HCC-SIRT, parallel model) were compared to the incidence of RILD in the recruited patients to evaluate their ability to predict outcomes. Results: The mean normal liver physical dose (avg. 51.9 Gy, range 31.9–69.8 Gy) is higher than the suggested liver dose constraint for external beam treatment (~30 Gy). However, none of the patients in our study developed RILD after the SIRT. The estimated probability of 'no patient developing RILD' obtained from the two Lyman models is 46.3% to 48.3% (T1/2-Liver-Repair = 0 hr) and <1% for all other repair times. For the parallel model, the estimated probability is 97.3% (0 hr), 51.7% (0.5 hr), 2.0% (1.0 hr) and <1% (2.0 hr). Conclusion: Molecular images providing the distribution of {sup 90}Y enable dose-volume based dose/outcome analysis for SIRT. Current NTCP models fail to predict RILD complications in our patient population unless a very short repair time for the liver is assumed. The discrepancy between the Lyman {sup 90}Y-HCC-SIRT model predictions and the clinically observed outcomes further demonstrates the need for an NTCP model specific to metastatic liver SIRT.
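
    A sketch of the physical-dose-to-EQ2 conversion, assuming a Dale-style relative-effectiveness form of the LQ model for a mono-exponentially decaying source; the alpha/beta ratio is illustrative, and the repair half-times mirror the four values tested in the study (this is not necessarily the exact formulation used by the authors):

    ```python
    import numpy as np

    def eqd2_y90(D, ab=2.5, t_half_phys_h=64.1, t_rep_h=1.0):
        """EQD2 for a permanently decaying 90Y source under the LQ model.
        ab is alpha/beta (Gy); t_rep_h is the sublethal-damage repair
        half-time. t_rep_h = 0 is treated as instant repair (relative
        effectiveness of 1, i.e. no dose-rate penalty)."""
        lam = np.log(2.0) / t_half_phys_h        # physical decay constant
        if t_rep_h <= 0.0:
            bed = D                              # instant repair limit
        else:
            mu = np.log(2.0) / t_rep_h           # repair rate constant
            bed = D * (1.0 + D * lam / ((mu + lam) * ab))
        return bed / (1.0 + 2.0 / ab)            # equivalent dose in 2 Gy fractions

    for t_rep in (0.0, 0.5, 1.0, 2.0):           # repair times from the study
        print(t_rep, eqd2_y90(50.0, t_rep_h=t_rep))
    ```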

  20. Statistical Surrogate Models for Estimating Probability of High-Consequence Climate Change

    NASA Astrophysics Data System (ADS)

    Field, R.; Constantine, P.; Boslough, M.

    2011-12-01

    We have posed the climate change problem in a framework similar to that used in safety engineering, by acknowledging that probabilistic risk assessments focused on low-probability, high-consequence climate events are perhaps more appropriate than studies focused simply on best estimates. To properly explore the tails of the distribution requires extensive sampling, which is not possible with existing coupled atmospheric models due to the high computational cost of each simulation. We have developed specialized statistical surrogate models (SSMs) that can be used to make predictions about the tails of the associated probability distributions. An SSM is different from a deterministic surrogate model in that it represents each climate variable of interest as a space/time random field, that is, a random variable for every fixed location in the atmosphere at all times. The SSM can be calibrated to available spatial and temporal data from existing climate databases, or to a collection of outputs from general circulation models. Because of its reduced size and complexity, the realization of a large number of independent model outputs from an SSM becomes computationally straightforward, so that quantifying the risk associated with low-probability, high-consequence climate events becomes feasible. A Bayesian framework was also developed to provide quantitative measures of confidence, via Bayesian credible intervals, to assess these risks. To illustrate the use of the SSM, we considered two collections of NCAR CCSM 3.0 output data. The first collection corresponds to average December surface temperature for years 1990-1999 based on a collection of 8 different model runs obtained from the Program for Climate Model Diagnosis and Intercomparison (PCMDI). We calibrated the surrogate model to the available model data and made various point predictions. We also analyzed average precipitation rate in June, July, and August over a 54-year period assuming a cyclic Y2K ocean model. We applied the calibrated surrogate model to study the probability that the precipitation rate falls below certain thresholds and utilized the Bayesian approach to quantify our confidence in these predictions. Sandia is a multiprogram laboratory operated by Sandia Corporation, a Lockheed Martin Company, for the United States Department of Energy's National Nuclear Security Administration under Contract DE-AC04-94AL85000.

  1. Hyperspectral Mapping of the Invasive Species Pepperweed and the Development of a Habitat Suitability Model

    NASA Technical Reports Server (NTRS)

    Nguyen, Andrew; Gole, Alexander; Randall, Jarom; Dlott, Glade; Zhang, Sylvia; Alfaro, Brian; Schmidt, Cindy; Skiles, J. W.

    2011-01-01

    Mapping and predicting the spatial distribution of invasive plant species is central to habitat management; however, it is difficult to implement at landscape and regional scales. Remote sensing techniques can reduce the impact field campaigns have on these ecologically sensitive areas and can provide a regional and multi-temporal view of invasive species spread. Invasive perennial pepperweed (Lepidium latifolium) is now widespread in fragmented estuaries of the South San Francisco Bay, and has been shown to degrade native vegetation in estuaries and adjacent habitats, thereby reducing forage and shelter for wildlife. The purpose of this study is to map the present distribution of pepperweed in estuarine areas of the South San Francisco Bay Salt Pond Restoration Project (Alviso, CA), and to create a habitat suitability model to predict future spread. Pepperweed reflectance data were collected in situ with a GER 1500 spectroradiometer, along with 88 corresponding pepperweed presence and absence points used for building the statistical models. The spectral angle mapper (SAM) classification algorithm was used to distinguish the reflectance spectrum of pepperweed and map its distribution using an image from EO-1 Hyperion. To map pepperweed, we performed a supervised classification on an ASTER image with a resulting classification accuracy of 71.8%. We generated a weighted overlay analysis model within a geographic information system (GIS) framework to predict areas in the study site most susceptible to pepperweed colonization. Variables for the model included propensity for disturbance, status of pond restoration, proximity to water channels, and terrain curvature. A Generalized Additive Model (GAM) was also used to generate a probability map and to investigate how significantly each variable contributed to predicting pepperweed spread. Results from the GAM revealed that distance to channels, distance to ponds and curvature were statistically significant (p < 0.01) in determining the locations of suitable pepperweed habitats.
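
    The spectral angle mapper step can be sketched directly from its definition (the angle between a pixel spectrum and a reference spectrum); the 0.10-radian threshold and the toy image cube are hypothetical:

    ```python
    import numpy as np

    def spectral_angle(pixel, reference):
        """Spectral angle (radians) between a pixel spectrum and a reference
        spectrum; smaller angles indicate more similar materials."""
        cos = np.dot(pixel, reference) / (np.linalg.norm(pixel) *
                                          np.linalg.norm(reference))
        return np.arccos(np.clip(cos, -1.0, 1.0))

    def sam_classify(image, reference, threshold=0.10):
        """Flag each pixel of a (rows, cols, bands) cube as pepperweed if its
        spectral angle to the field-measured reference is below threshold."""
        rows, cols, bands = image.shape
        flat = image.reshape(-1, bands)
        angles = np.array([spectral_angle(p, reference) for p in flat])
        return (angles < threshold).reshape(rows, cols)

    rng = np.random.default_rng(12)
    image = rng.random((4, 4, 100))     # toy hyperspectral cube
    reference = rng.random(100)         # stand-in for the measured spectrum
    print(sam_classify(image, reference).sum(), "pixels flagged")
    ```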

  2. Limits on the prediction of helicopter rotor noise using thickness and loading sources: Validation of helicopter noise prediction techniques

    NASA Technical Reports Server (NTRS)

    Succi, G. P.

    1983-01-01

    The techniques of helicopter rotor noise prediction attempt to describe precisely the details of the noise field and remove the empiricisms and restrictions inherent in previous methods. These techniques require detailed inputs of the rotor geometry, operating conditions, and blade surface pressure distribution. The Farassat noise prediction technique was studied, and high-speed helicopter noise prediction using more detailed representations of the thickness and loading noise sources was investigated. These predictions were based on the measured blade surface pressures on an AH-1G rotor and compared to the measured sound field. Although refinements in the representation of the thickness and loading noise sources improve the calculation, there are still discrepancies between the measured and predicted sound fields. Analysis of the blade surface pressure data indicates shocks on the blades, which are probably responsible for these discrepancies.

  3. Analysis of the Mean Absolute Error (MAE) and the Root Mean Square Error (RMSE) in Assessing Rounding Model

    NASA Astrophysics Data System (ADS)

    Wang, Weijie; Lu, Yanmin

    2018-03-01

    Most existing Collaborative Filtering (CF) algorithms predict a rating as the preference of an active user toward a given item, which is always a decimal fraction. Meanwhile, the actual ratings in most data sets are integers. In this paper, we discuss and demonstrate why rounding can affect these two metrics differently, and we show that rounding is necessary in post-processing of the predicted ratings, eliminating model prediction bias and improving prediction accuracy. In addition, we propose two new rounding approaches based on the predicted rating probability distribution, which can be used to round the predicted rating to an optimal integer rating and achieve better prediction accuracy compared to the Basic Rounding approach. Extensive experiments on different data sets validate the correctness of our analysis and the effectiveness of our proposed rounding approaches.
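
    One way to make such distribution-based rounding concrete: under MAE the expected-loss-minimizing integer is the distribution's median, while under RMSE it is the rounded mean. This is a plausible reading of the general idea, not necessarily the paper's exact algorithm:

    ```python
    import numpy as np

    def optimal_integer_rating(probs, ratings=(1, 2, 3, 4, 5), metric="mae"):
        """Round a predicted rating distribution to the integer minimizing
        expected error: the distribution's median for MAE, the (rounded)
        mean for RMSE. probs[i] is the probability of ratings[i]."""
        probs = np.asarray(probs, dtype=float)
        ratings = np.asarray(ratings)
        if metric == "mae":
            return int(ratings[np.searchsorted(np.cumsum(probs), 0.5)])
        return int(round(float(probs @ ratings)))

    p = [0.05, 0.10, 0.20, 0.40, 0.25]      # a predicted rating distribution
    print(optimal_integer_rating(p, metric="mae"),
          optimal_integer_rating(p, metric="rmse"))
    ```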

  4. Modelling the spatial distribution of the nuisance mosquito species Anopheles plumbeus (Diptera: Culicidae) in the Netherlands.

    PubMed

    Ibañez-Justicia, Adolfo; Cianci, Daniela

    2015-05-01

    Landscape modifications, urbanization or changes in the use of rural-agricultural areas can create more favourable conditions for certain mosquito species and therefore indirectly cause nuisance problems for humans. This could potentially result in mosquito-borne disease outbreaks when the nuisance is caused by mosquito species that can transmit pathogens. Anopheles plumbeus is a nuisance mosquito species and a potential malaria vector. It is one of the most frequently observed species in the Netherlands. Information on the distribution of this species is essential for risk assessments. The purpose of the study was to investigate the potential spatial distribution of An. plumbeus in the Netherlands. Random forest models were used to link the occurrence and the abundance of An. plumbeus with environmental features and to produce distribution maps for the Netherlands. Mosquito data were collected using a cross-sectional study design in the Netherlands, from April to October of 2010-2013. The environmental data were obtained from satellite imagery and weather stations. Statistical measures (accuracy for the occurrence model and mean squared error for the abundance model) were used to evaluate the models' performance, and the models were externally validated. The maps show that forested areas (in the centre of the Netherlands) and the east of the country were predicted as suitable for An. plumbeus. In particular, high suitability and high abundance were predicted in the south-eastern provinces of Limburg and North Brabant. Elevation, precipitation, day and night temperature and vegetation indices were important predictors for calculating the probability of occurrence of An. plumbeus; the probability of occurrence, vegetation indices and precipitation were important for predicting its abundance. The AUC value was 0.73 and the error in the validation was 0.29; the mean squared error value was 0.12. The areas identified by the model as suitable and with high abundance of An. plumbeus are consistent with the areas from which nuisance was reported. Our results can be helpful in the assessment of vector-borne disease risk.
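
    A minimal sketch of the occurrence model, assuming scikit-learn's RandomForestClassifier and synthetic placeholder predictors in place of the satellite-derived covariates listed above:

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(4)

    # Placeholder predictors per trapping site: elevation, precipitation,
    # day/night temperature and a vegetation index.
    X = rng.standard_normal((500, 5))
    y = rng.integers(0, 2, size=500)        # observed occurrence (0/1)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
    model = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_tr, y_tr)

    p_occurrence = model.predict_proba(X_te)[:, 1]   # probability-of-occurrence map input
    print(roc_auc_score(y_te, p_occurrence))
    ```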

  5. Scalable Joint Models for Reliable Uncertainty-Aware Event Prediction.

    PubMed

    Soleimani, Hossein; Hensman, James; Saria, Suchi

    2017-08-21

    Missing data and noisy observations pose significant challenges for reliably predicting events from irregularly sampled multivariate time series (longitudinal) data. Imputation methods, which are typically used for completing the data prior to event prediction, lack a principled mechanism to account for the uncertainty due to missingness. Alternatively, state-of-the-art joint modeling techniques can be used to jointly model the longitudinal and event data and to compute event probabilities conditioned on the longitudinal observations. These approaches, however, make strong parametric assumptions and do not easily scale to multivariate signals with many observations. Our proposed approach consists of several key innovations. First, we develop a flexible and scalable joint model based upon sparse multiple-output Gaussian processes. Unlike state-of-the-art joint models, the proposed model can explain highly challenging structure, including non-Gaussian noise, while scaling to large data. Second, we derive an optimal policy for predicting events using the distribution of the event occurrence estimated by the joint model. The derived policy trades off the cost of a delayed detection versus incorrect assessments, and abstains from making decisions when the estimated event probability does not satisfy the derived confidence criteria. Experiments on a large dataset show that the proposed framework significantly outperforms state-of-the-art techniques in event prediction.
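
    An abstaining decision policy can be illustrated with a generic expected-cost rule; the costs below are hypothetical, and the paper derives its policy more formally from the joint model's event distribution:

    ```python
    def decide(p_event, c_fp=1.0, c_fn=10.0, c_abstain=0.8):
        """Cost-based decision with an abstain option: choose the action
        with the lowest expected cost given the model's event probability."""
        costs = {
            "alarm":    (1.0 - p_event) * c_fp,  # cost incurred if no event occurs
            "no_alarm": p_event * c_fn,          # cost incurred if the event occurs
            "abstain":  c_abstain,               # fixed cost of deferring
        }
        return min(costs, key=costs.get)

    for p in (0.02, 0.1, 0.4, 0.95):
        print(p, decide(p))    # abstains only in the ambiguous middle range
    ```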

  6. Protein-Protein Interaction Site Predictions with Three-Dimensional Probability Distributions of Interacting Atoms on Protein Surfaces

    PubMed Central

    Chen, Ching-Tai; Peng, Hung-Pin; Jian, Jhih-Wei; Tsai, Keng-Chang; Chang, Jeng-Yih; Yang, Ei-Wen; Chen, Jun-Bo; Ho, Shinn-Ying; Hsu, Wen-Lian; Yang, An-Suei

    2012-01-01

    Protein-protein interactions are key to many biological processes. Computational methodologies devised to predict protein-protein interaction (PPI) sites on protein surfaces are important tools in providing insights into the biological functions of proteins and in developing therapeutics targeting the protein-protein interaction sites. One of the general features of PPI sites is that the core regions from the two interacting protein surfaces are complementary to each other, similar to the interior of proteins in packing density and in the physicochemical nature of the amino acid composition. In this work, we simulated the physicochemical complementarities by constructing three-dimensional probability density maps of non-covalent interacting atoms on the protein surfaces. The interacting probabilities were derived from the interior of known structures. Machine learning algorithms were applied to learn the characteristic patterns of the probability density maps specific to the PPI sites. The trained predictors for PPI sites were cross-validated with the training cases (consisting of 432 proteins) and were tested on an independent dataset (consisting of 142 proteins). The residue-based Matthews correlation coefficient for the independent test set was 0.423; the accuracy, precision, sensitivity, and specificity were 0.753, 0.519, 0.677, and 0.779, respectively. The benchmark results indicate that the optimized machine learning models are among the best predictors in identifying PPI sites on protein surfaces. In particular, the PPI site prediction accuracy increases with increasing size of the PPI site and with increasing hydrophobicity in the amino acid composition of the PPI interface; the core interface regions are more likely to be recognized with high prediction confidence. The results indicate that the physicochemical complementarity patterns on protein surfaces are important determinants in PPIs, and a substantial portion of the PPI sites can be predicted correctly with the physicochemical complementarity features based on the non-covalent interaction data derived from protein interiors. PMID:22701576

  7. Statistical tests for whether a given set of independent, identically distributed draws comes from a specified probability density.

    PubMed

    Tygert, Mark

    2010-09-21

    We discuss several tests for determining whether a given set of independent and identically distributed (i.i.d.) draws does not come from a specified probability density function. The most commonly used are Kolmogorov-Smirnov tests, particularly Kuiper's variant, which focus on discrepancies between the cumulative distribution function for the specified probability density and the empirical cumulative distribution function for the given set of i.i.d. draws. Unfortunately, variations in the probability density function often get smoothed over in the cumulative distribution function, making it difficult to detect discrepancies in regions where the probability density is small in comparison with its values in surrounding regions. We discuss tests without this deficiency, complementing the classical methods. The tests of the present paper are based on the plain fact that it is unlikely to draw a random number whose probability is small, provided that the draw is taken from the same distribution used in calculating the probability (thus, if we draw a random number whose probability is small, then we can be confident that we did not draw the number from the same distribution used in calculating the probability).
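
    A quick example of the classical test being complemented here, using scipy's one-sample Kolmogorov-Smirnov test against a fully specified distribution:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(10)
    draws = rng.standard_t(df=5, size=1000)    # i.i.d. draws to be tested

    # One-sample Kolmogorov-Smirnov test against a specified density (here
    # the standard normal): it compares the empirical CDF of the draws with
    # the specified CDF, which is exactly the smoothing the paper cautions
    # can hide local discrepancies in the density.
    ks_stat, p_value = stats.kstest(draws, "norm")
    print(ks_stat, p_value)
    ```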

  8. Predicting anthropogenic soils across the Amazonia

    NASA Astrophysics Data System (ADS)

    Mcmichael, C.; Palace, M. W.; Bush, M. B.; Braswell, B. H.; Hagen, S. C.; Silman, M.; Neves, E.; Czarnecki, C.

    2012-12-01

    Hidden under the forest canopy in lowland Amazonia are nutrient-enriched soils, called terra pretas (or Amazonian black earths), which were formed by prehistoric indigenous populations. These anthrosols are in stark contrast to typical nutrient-poor Amazonian soils, and have retained increased nutrient levels for hundreds of years. Because of their long-term nutrient retaining ability, terra pretas may be crucial for developing sustainable agricultural practices in Amazonia, especially given the deforestation necessary for traditional slash-and-burn systems. However, the frequency and distribution of terra preta soils across the landscape remains debatable, and archaeologists have estimated that terra pretas cover anywhere from 0.1% to 10% of the lowland Amazonian forests. The highest concentration of terra preta soils has been found along the central and eastern portions of the Amazon River and its major tributaries, but whether this is a true pattern or simply reflects sampling bias remains unknown. A possible explanation is that specific environmental or biotic conditions were preferred for human settlement and terra preta formation. Here, we use environmental parameters to predict the probabilities of terra preta soils across lowland Amazonian forests. We compiled a database of 2708 sites across Amazonia, including locations that contain terra pretas (n = 917), and those that are known to be terra preta-free (n = 1791). More than 20 environmental variables, including precipitation, elevation, slope, soil fertility, and distance to river, were converted into 90-m resolution raster images across Amazonia and used to model the probability of terra preta occurrence. The relationship between the predictor variables and the occurrence of terra preta was examined using three modeling techniques: logistic regression, auto-logistic regression, and maximum entropy estimations. All three techniques provided similar predictions for terra preta distributions and the amount of area covered by terra preta. Distance to river, locations of bluffs, elevation, and soil fertility were important factors in determining distributions of terra preta, while other environmental variables had less effect. Terra pretas were most likely to be found in central and eastern Amazonia near the confluences of the Amazon River and its major tributaries. Within this general area of higher probability, terra pretas are most likely found atop the bluffs overlooking the rivers as opposed to lying on the floodplain. Interestingly, terra pretas are more probable in areas with less-fertile and more highly weathered soils. Although all three modeling techniques provided similar predictions of terra preta across Amazonia, we suggest that maximum entropy modeling is the best technique to predict anthropogenic soils across the vast Amazonian landscape. The auto-logistic regression corrects for spatial autocorrelation inherent to archaeological surveys, but still requires absence data, which was collected at different times and on different spatial scales than the presence data. The maximum entropy model requires presence-only data, accounts for spatial autocorrelation, and is not affected by the differential soil sampling techniques.
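
    Of the three techniques, the logistic regression is the simplest to sketch; the predictors below are random placeholders shaped like the database described above, not the study's rasters:

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(11)

    # Placeholder site predictors: distance to river, elevation, slope,
    # soil fertility (the study used >20 variables on 90-m rasters).
    X = rng.standard_normal((2708, 4))
    y = np.r_[np.ones(917), np.zeros(1791)].astype(int)  # presences/absences

    model = LogisticRegression(max_iter=1000).fit(X, y)
    p_terra_preta = model.predict_proba(X)[:, 1]  # per-site occurrence probability
    print(p_terra_preta[:5])
    ```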

  9. Exact Extremal Statistics in the Classical 1D Coulomb Gas

    NASA Astrophysics Data System (ADS)

    Dhar, Abhishek; Kundu, Anupam; Majumdar, Satya N.; Sabhapandit, Sanjib; Schehr, Grégory

    2017-08-01

    We consider a one-dimensional classical Coulomb gas of N like charges in a harmonic potential, also known as the one-dimensional one-component plasma. We compute, analytically, the probability distribution of the position x_max of the rightmost charge in the limit of large N. We show that the typical fluctuations of x_max around its mean are described by a nontrivial scaling function with asymmetric tails. This distribution is different from the Tracy-Widom distribution of x_max for Dyson's log gas. We also compute the large deviation functions of x_max explicitly and show that the system exhibits a third-order phase transition, as in the log gas. Our theoretical predictions are verified numerically.

  10. Benford's Law and articles of scientific journals: comparison of JCR® and Scopus data.

    PubMed

    Alves, Alexandre Donizeti; Yanasse, Horacio Hideki; Soma, Nei Yoshihiro

    2014-01-01

    Benford's Law is a logarithmic probability distribution function used to predict the distribution of first significant digits in numerical data. This paper presents the results of a study of the distribution of the first significant digits of the number of articles published by journals indexed in the JCR® Sciences and Social Sciences Editions from 2007 to 2011. The data for these journals were also analyzed by country of origin and by journal category. Results based on the numbers of published articles reported by Scopus are also presented. Comparing the results, we observe a significant difference between the data reported in the two databases.
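
    The expected Benford frequencies, P(d) = log10(1 + 1/d) for d = 1..9, and an empirical first-digit tally can be computed directly; the article counts below are toy values:

    ```python
    import numpy as np

    def benford_expected():
        """Benford's Law: P(d) = log10(1 + 1/d) for first digits d = 1..9."""
        d = np.arange(1, 10)
        return np.log10(1.0 + 1.0 / d)

    def first_digit_freq(counts):
        """Empirical first-significant-digit frequencies of positive counts
        (e.g., articles published per journal)."""
        digits = np.array([int(str(c)[0]) for c in counts if c > 0])
        return np.bincount(digits, minlength=10)[1:] / len(digits)

    counts = [12, 130, 27, 45, 1190, 88, 19, 210, 33, 154]   # toy data
    print(benford_expected())
    print(first_digit_freq(counts))
    ```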

  11. Stochastic Growth Theory of Spatially-Averaged Distributions of Langmuir Fields in Earth's Foreshock

    NASA Technical Reports Server (NTRS)

    Boshuizen, Christopher R.; Cairns, Iver H.; Robinson, P. A.

    2001-01-01

    Langmuir-like waves in the foreshock of Earth are characteristically bursty and irregular, and are the subject of a number of recent studies. Averaged over the foreshock, the observed probability distribution P̄(log E) of the wave field E is a power law, with the bar denoting averaging over position. In this paper it is shown that stochastic growth theory (SGT) can explain such power-law spatially averaged distributions P̄(log E) when the observed power-law variations of the mean and standard deviation of log E with position are combined with the lognormal statistics predicted by SGT at each location.

  12. Analytical Algorithms to Quantify the Uncertainty in Remaining Useful Life Prediction

    NASA Technical Reports Server (NTRS)

    Sankararaman, Shankar; Saxena, Abhinav; Daigle, Matthew; Goebel, Kai

    2013-01-01

    This paper investigates the use of analytical algorithms to quantify the uncertainty in the remaining useful life (RUL) estimate of components used in aerospace applications. The prediction of RUL is affected by several sources of uncertainty, and it is important to systematically quantify their combined effect by computing the uncertainty in the RUL prediction in order to aid risk assessment, risk mitigation, and decision-making. While sampling-based algorithms have been conventionally used for quantifying the uncertainty in RUL, analytical algorithms are computationally cheaper and are sometimes better suited for online decision-making. While exact analytical algorithms are available only for certain special cases (e.g., linear models with Gaussian variables), effective approximations can be made using the first-order second moment method (FOSM), the first-order reliability method (FORM), and the inverse first-order reliability method (Inverse FORM). These methods can be used not only to calculate the entire probability distribution of RUL but also to obtain probability bounds on RUL. This paper explains these three methods in detail and illustrates them using the state-space model of a lithium-ion battery.
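
    A generic first-order second-moment (FOSM) propagation can be sketched in a few lines; the two-parameter model below is a toy stand-in for the battery state-space model, not the paper's formulation:

    ```python
    import numpy as np

    def fosm(g, mu, cov, eps=1e-6):
        """FOSM approximation: propagate input mean and covariance through
        g via a finite-difference gradient, giving mean(g) ~ g(mu) and
        var(g) ~ grad(g) @ cov @ grad(g)."""
        mu = np.asarray(mu, dtype=float)
        grad = np.empty_like(mu)
        g0 = g(mu)
        for i in range(mu.size):
            step = np.zeros_like(mu)
            step[i] = eps
            grad[i] = (g(mu + step) - g0) / eps
        return g0, grad @ np.asarray(cov) @ grad

    # Toy RUL model: time for a damage state to reach a threshold under a
    # constant growth rate; x = (current damage, growth rate per cycle).
    rul = lambda x: (1.0 - x[0]) / x[1]
    mean_rul, var_rul = fosm(rul, mu=[0.4, 0.01],
                             cov=np.diag([0.01**2, 0.002**2]))
    print(mean_rul, np.sqrt(var_rul))   # RUL estimate and its std. deviation
    ```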

  13. Semiparametric Bayesian classification with longitudinal markers

    PubMed Central

    De la Cruz-Mesía, Rolando; Quintana, Fernando A.; Müller, Peter

    2013-01-01

    We analyse data from a study involving 173 pregnant women. The data are observed values of the β human chorionic gonadotropin hormone measured during the first 80 days of gestational age, including from one up to six longitudinal responses for each woman. The main objective in this study is to predict normal versus abnormal pregnancy outcomes from data that are available at the early stages of pregnancy. We achieve the desired classification with a semiparametric hierarchical model. Specifically, we consider a Dirichlet process mixture prior for the distribution of the random effects in each group. The unknown random-effects distributions are allowed to vary across groups but are made dependent by using a design vector to select different features of a single underlying random probability measure. The resulting model is an extension of the dependent Dirichlet process model, with an additional probability model for group classification. The model is shown to perform better than an alternative model which is based on independent Dirichlet processes for the groups. Relevant posterior distributions are summarized by using Markov chain Monte Carlo methods. PMID:24368871

  14. The role of ensemble post-processing for modeling the ensemble tail

    NASA Astrophysics Data System (ADS)

    Van De Vyver, Hans; Van Schaeybroeck, Bert; Vannitsem, Stéphane

    2016-04-01

    Over the past decades, the numerical weather prediction community has witnessed a paradigm shift from deterministic to probabilistic forecasting and state estimation (Buizza and Leutbecher, 2015; Buizza et al., 2008), in an attempt to quantify the uncertainties associated with initial-condition and model errors. An important benefit of a probabilistic framework is the improved prediction of extreme events. However, one may ask to what extent such model estimates contain information on the occurrence probability of extreme events and how this information can be optimally extracted. Different approaches have been proposed and applied to real-world systems which, based on extreme value theory, allow the estimation of extreme-event probabilities conditional on forecasts and state estimates (Ferro, 2007; Friederichs, 2010). Using ensemble predictions generated with a model of low dimensionality, a thorough investigation is presented quantifying the change in predictability of extreme events associated with ensemble post-processing and other influencing factors, including the finite ensemble size, lead time, model assumptions and the use of different covariates (ensemble mean, maximum, spread, ...) for modeling the tail distribution. Tail modeling is performed by deriving extreme-quantile estimates using a peaks-over-threshold representation (generalized Pareto distribution) or quantile regression. Common ensemble post-processing methods aim to improve mostly the ensemble mean and spread of a raw forecast (Van Schaeybroeck and Vannitsem, 2015). Conditional tail modeling, on the other hand, is a post-processing in itself, focusing on the tails only. Therefore, it is unclear how applying ensemble post-processing prior to conditional tail modeling impacts the skill of extreme-event predictions; this work investigates that question in detail. References: Buizza, Leutbecher and Isaksen, 2008: Potential use of an ensemble of analyses in the ECMWF Ensemble Prediction System. Q. J. R. Meteorol. Soc. 134: 2051-2066. Buizza and Leutbecher, 2015: The forecast skill horizon. Q. J. R. Meteorol. Soc. 141: 3366-3382. Ferro, 2007: A probability model for verifying deterministic forecasts of extreme events. Weather and Forecasting 22 (5), 1089-1100. Friederichs, 2010: Statistical downscaling of extreme precipitation events using extreme value theory. Extremes 13, 109-132. Van Schaeybroeck and Vannitsem, 2015: Ensemble post-processing using member-by-member approaches: theoretical aspects. Q. J. R. Meteorol. Soc. 141: 807-818.
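
    A peaks-over-threshold sketch with scipy, using a synthetic heavy-tailed sample in place of the verifying variable and its ensemble forecasts:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    obs = rng.gamma(shape=2.0, scale=10.0, size=5000)   # synthetic variable

    # Peaks-over-threshold: fit a generalized Pareto distribution to the
    # excesses above a high threshold, then estimate an extreme quantile.
    u = np.quantile(obs, 0.95)
    excesses = obs[obs > u] - u
    c, loc, scale = stats.genpareto.fit(excesses, floc=0.0)

    p_exceed_u = np.mean(obs > u)
    # Level exceeded with probability 1/1000 (a "return level"):
    level = u + stats.genpareto.ppf(1.0 - 0.001 / p_exceed_u, c,
                                    loc=0.0, scale=scale)
    print(u, level)
    ```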

  15. Quantifying and Predicting Three-Dimensional Heterogeneity in Transient Storage Using Roving Profiling

    NASA Astrophysics Data System (ADS)

    Kaplan, D. A.; Reaver, N.; Hensley, R. T.; Cohen, M. J.

    2017-12-01

    Hydraulic transport is an important component of nutrient spiraling in streams. Quantifying conservative solute transport is a prerequisite for understanding the cycling and fate of reactive solutes, such as nutrients. Numerous studies have modeled solute transport within streams using the one-dimensional advection, dispersion and storage (ADS) equation calibrated to experimental data from tracer experiments. However, there are limitations to the information about in-stream transient storage that can be derived from calibrated ADS model parameters. Transient storage (TS) in the ADS model is most often modeled as a single process, and calibrated model parameters are "lumped" values that are the best-fit representation of multiple real-world TS processes. In this study, we developed a roving profiling method to assess and predict the spatial heterogeneity of in-stream TS. We performed five tracer experiments on three spring-fed rivers in Florida (USA) using Rhodamine WT. During each tracer release, stationary fluorometers were deployed to measure breakthrough curves for multiple reaches within the river. Teams of roving samplers moved along the rivers measuring tracer concentrations at various locations and depths within the reaches. A Bayesian statistical method was used to calibrate the ADS model to the stationary breakthrough curves, resulting in probability distributions for both the advective and TS zones as a function of river distance and time. Rover samples were then assigned a probability of being from either the advective or TS zone by comparing measured concentrations to the probability distributions of concentrations in the ADS advective and TS zones. A regression model was used to predict the probability of any in-stream position being located within the advective versus TS zone based on spatiotemporal predictors (time, river position, depth, and distance from bank) and eco-geomorphological features (eddies, woody debris, benthic depressions, and aquatic vegetation). Results confirm that TS is spatially variable as a function of spatiotemporal and eco-geomorphological features. A substantial number of samples with nearly equivalent chances of being from the advective or TS zones suggests that the distinction between zones is often poorly defined.
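
    The zone-assignment step can be sketched as a posterior comparison of a measured concentration against samples of each zone's concentration distribution; Gaussian KDEs and a 50/50 prior are stand-ins for the calibrated ADS posterior, which the paper derives in full:

    ```python
    import numpy as np
    from scipy import stats

    def p_advective(c_measured, adv_samples, ts_samples, prior_adv=0.5):
        """Posterior probability that a rover sample came from the advective
        zone, given samples of the two zones' concentration distributions."""
        f_adv = stats.gaussian_kde(adv_samples)(c_measured)[0]
        f_ts = stats.gaussian_kde(ts_samples)(c_measured)[0]
        return prior_adv * f_adv / (prior_adv * f_adv + (1 - prior_adv) * f_ts)

    rng = np.random.default_rng(13)
    adv = rng.normal(10.0, 1.0, 2000)   # synthetic advective concentrations
    ts = rng.normal(6.0, 2.0, 2000)     # synthetic storage concentrations
    print(p_advective(7.0, adv, ts))
    ```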

  16. Regional Permafrost Probability Modelling in the northwestern Cordillera, 59°N - 61°N, Canada

    NASA Astrophysics Data System (ADS)

    Bonnaventure, P. P.; Lewkowicz, A. G.

    2010-12-01

    High resolution (30 x 30 m) permafrost probability models were created for eight mountainous areas in the Yukon and northernmost British Columbia. Empirical-statistical modelling based on the Basal Temperature of Snow (BTS) method was used to develop spatial relationships. Model inputs include equivalent elevation (a variable that incorporates non-uniform temperature change with elevation), potential incoming solar radiation and slope. Probability relationships between predicted BTS and permafrost presence were developed for each area using late-summer physical observations in pits, or by using year-round ground temperature measurements. A high-resolution spatial model for the region has now been generated based on seven of the area models. Each was applied to the entire region, and their predictions were then blended based on a distance decay function from the model source area. The regional model is challenging to validate independently because there are few boreholes in the region. However, a comparison of results to a recently established inventory of rock glaciers for the Yukon suggests its validity, because predicted permafrost probabilities were 0.8 or greater for almost 90% of these landforms. Furthermore, the regional model results have a similar spatial pattern to those modelled independently in the eighth area, although predicted probabilities using the regional model are generally higher. The regional model predicts that permafrost underlies about half of the non-glaciated terrain in the region, with probabilities increasing regionally from south to north and from east to west. Elevation is significant, but not always linked in a straightforward fashion because of weak or inverted trends in permafrost probability below treeline. Above treeline, however, permafrost probabilities increase and approach 1.0 in very high elevation areas throughout the study region. The regional model shows many similarities to previous Canadian permafrost maps (Heginbottom and Radburn, 1992; Heginbottom et al., 1995) but is several orders of magnitude more detailed. It also exhibits some significant differences, including the presence of an area of valley-floor continuous permafrost around Beaver Creek near the Alaskan border in the west, as well as higher probabilities of permafrost in the central parts of the region near the boundaries of the sporadic and extensive discontinuous zones. In addition, parts of the northernmost portion of the region would be classified as sporadic discontinuous permafrost because of inversions in the terrestrial surface lapse rate, which cause permafrost probabilities to decrease with elevation through the forest. These model predictions are expected to be of direct use for infrastructure planning and northern development, and can serve as a benchmark for future studies of permafrost distribution in the Yukon. References: Heginbottom, J.R., Dubreuil, M.A. and Haker, P.T., 1995. Canada Permafrost (1:7,500,000 scale). In The National Atlas of Canada, 5th Edition, sheet MCR 4177. Ottawa: Natural Resources Canada. Heginbottom, J.A. and Radburn, L.K., 1992. Permafrost and ground ice conditions of northwestern Canada. Geological Survey of Canada, Map 1691A, scale 1:1,000,000. Digitized by S. Smith, Geological Survey of Canada.
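
    A sketch of distance-decay blending of the area models' probabilities at a single grid cell; the exponential form and length scale are assumptions, since the abstract does not specify the decay function:

    ```python
    import numpy as np

    def blend_predictions(probs, dists_km, length_scale_km=100.0):
        """Blend several area models' permafrost probabilities at one grid
        cell, weighting each by a distance-decay function from its source
        area (exponential decay assumed here)."""
        w = np.exp(-np.asarray(dists_km) / length_scale_km)
        return float(np.asarray(probs) @ w / w.sum())

    # Seven hypothetical area-model probabilities, with distances (km) from
    # the cell being mapped to each model's source area.
    print(blend_predictions([0.9, 0.7, 0.4, 0.8, 0.6, 0.5, 0.75],
                            [20, 150, 400, 90, 300, 500, 60]))
    ```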

  17. Predicting species distributions from checklist data using site-occupancy models

    USGS Publications Warehouse

    Kery, M.; Gardner, B.; Monnerat, C.

    2010-01-01

    Aim: (1) To increase awareness of the challenges induced by imperfect detection, which is a fundamental issue in species distribution modelling; (2) to emphasize the value of replicate observations for species distribution modelling; and (3) to show how 'cheap' checklist data in faunal/floral databases may be used for the rigorous modelling of distributions by site-occupancy models. Location: Switzerland. Methods: We used checklist data collected by volunteers during 1999 and 2000 to analyse the distribution of the blue hawker, Aeshna cyanea (Odonata, Aeshnidae), a common dragonfly in Switzerland. We used data from repeated visits to 1-ha pixels to derive 'detection histories' and apply site-occupancy models to estimate the 'true' species distribution, i.e. corrected for imperfect detection. We modelled blue hawker distribution as a function of elevation and year, and its detection probability as a function of elevation, year and season. Results: The best model contained cubic polynomial elevation effects for distribution and quadratic effects of elevation and season for detectability. We compared the site-occupancy model with a conventional distribution model based on a generalized linear model, which assumes perfect detectability (p = 1). The conventional distribution map looked very different from the distribution map obtained using site-occupancy models that accounted for imperfect detection. The conventional model underestimated the species distribution by 60%, and the slope parameters of the occurrence-elevation relationship were also underestimated when assuming p = 1. Elevation was not only an important predictor of blue hawker occurrence, but also of the detection probability, with a bell-shaped relationship. Furthermore, detectability increased over the season. The average detection probability was estimated at only 0.19 per survey. Main conclusions: Conventional species distribution models do not model species distributions per se but rather the apparent distribution, i.e. an unknown proportion of the species distribution. That unknown proportion is equivalent to detectability. Imperfect detection in conventional species distribution models yields underestimates of the extent of distributions and covariate effects that are biased towards zero. In addition, patterns in detectability will erroneously be ascribed to species distributions. In contrast, site-occupancy models applied to replicated detection/non-detection data offer a powerful framework for making inferences about species distributions corrected for imperfect detection. The use of 'cheap' checklist data greatly enhances the scope of applications of this useful class of models. © 2010 Blackwell Publishing Ltd.
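
    A minimal constant-occupancy, constant-detection occupancy likelihood (the paper lets both vary with covariates), fitted to simulated detection histories whose true parameters echo the values reported above:

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def neg_log_lik(params, Y):
        """Site-occupancy likelihood with constant occupancy psi and constant
        per-visit detection probability p; Y is a sites x visits 0/1 matrix."""
        psi, p = 1.0 / (1.0 + np.exp(-params))   # logit scale -> probabilities
        det = Y.sum(axis=1)
        J = Y.shape[1]
        lik = psi * p**det * (1.0 - p)**(J - det)
        lik = lik + (det == 0) * (1.0 - psi)     # never-detected sites may be unoccupied
        return -np.log(np.clip(lik, 1e-300, None)).sum()

    rng = np.random.default_rng(6)
    z = rng.random(200) < 0.6                      # simulated true occupancy
    Y = (rng.random((200, 3)) < 0.19) * z[:, None] # detections given presence

    fit = minimize(neg_log_lik, x0=[0.0, 0.0], args=(Y,))
    print(1.0 / (1.0 + np.exp(-fit.x)))            # estimates of (psi, p)
    ```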

  18. Performance of two predictive uncertainty estimation approaches for conceptual Rainfall-Runoff Model: Bayesian Joint Inference and Hydrologic Uncertainty Post-processing

    NASA Astrophysics Data System (ADS)

    Hernández-López, Mario R.; Romero-Cuéllar, Jonathan; Camilo Múnera-Estrada, Juan; Coccia, Gabriele; Francés, Félix

    2017-04-01

    It is important to emphasize the role of uncertainty, particularly when model forecasts are used to support decision-making and water management. This research compares two approaches to evaluating predictive uncertainty in hydrological modeling. The first approach is Bayesian Joint Inference of the hydrological and error models. The second approach is carried out through the Model Conditional Processor, using the Truncated Normal Distribution in the transformed space. The comparison focuses on the reliability of the predictive distribution. The case study is applied to two basins included in the Model Parameter Estimation Experiment (MOPEX). These two basins, which have different hydrological complexity, are the French Broad River (North Carolina) and the Guadalupe River (Texas). The results indicate that, in general, both approaches provide similar predictive performance. However, differences between them can arise in basins with complex hydrology (e.g. ephemeral basins), because the results obtained with Bayesian Joint Inference depend strongly on the suitability of the hypothesized error model. Similarly, the results of the Model Conditional Processor are mainly influenced by the selected model of the tails, or even by the selected full probability distribution model of the data in the real space, and by the definition of the Truncated Normal Distribution in the transformed space. In summary, the different hypotheses that the modeller adopts in each of the two approaches are the main cause of the different results. This research also explores a combination of both methodologies that could be useful for achieving less biased hydrological parameter estimation: first, the predictive distribution is obtained through the Model Conditional Processor; second, this predictive distribution is used to derive the corresponding additive error model, which is employed for hydrological parameter estimation with the Bayesian Joint Inference methodology.

  19. Improving Photometric Redshifts for Hyper Suprime-Cam

    NASA Astrophysics Data System (ADS)

    Speagle, Josh S.; Leauthaud, Alexie; Eisenstein, Daniel; Bundy, Kevin; Capak, Peter L.; Leistedt, Boris; Masters, Daniel C.; Mortlock, Daniel; Peiris, Hiranya; HSC Photo-z Team; HSC Weak Lensing Team

    2017-01-01

    Deriving accurate photometric redshift (photo-z) probability distribution functions (PDFs) is a crucial science component for current and upcoming large-scale surveys. We outline how rigorous Bayesian inference and machine learning can be combined to quickly derive joint photo-z PDFs for individual galaxies and their parent populations. Using the first 170 deg^2 of data from the ongoing Hyper Suprime-Cam survey, we demonstrate that our method is able to generate accurate predictions and reliable credible intervals over ~370k high-quality redshifts. We then use galaxy-galaxy lensing to empirically validate our predicted photo-z's over ~14M objects, finding a robust signal.

  20. An empirical analysis of the distribution of the duration of overshoots in a stationary gaussian stochastic process

    NASA Technical Reports Server (NTRS)

    Parrish, R. S.; Carter, M. C.

    1974-01-01

    This analysis utilizes computer simulation and statistical estimation. Realizations of stationary Gaussian stochastic processes with selected autocorrelation functions were computer simulated. Analysis of the simulated data revealed that the mean and the variance of a process were functionally dependent upon the autocorrelation parameter and crossing level. Using predicted values for the mean and standard deviation, the distribution parameters were estimated by the method of moments. Thus, given the autocorrelation parameter, crossing level, mean, and standard deviation of a process, the probability of exceeding the crossing level for a particular length of time was calculated.

  1. Prediction future asset price which is non-concordant with the historical distribution

    NASA Astrophysics Data System (ADS)

    Seong, Ng Yew; Hin, Pooi Ah

    2015-12-01

    This paper attempts to predict the major characteristics of a future asset price that is non-concordant with the distribution estimated from today's price and the prices on a large number of previous days. The three major characteristics of the i-th non-concordant asset price are: the length of the interval between the occurrence time of the previous non-concordant price and that of the present one; an indicator taking the value -1 or 1 according to whether the non-concordant price is extremely small or extremely large; and the degree of non-concordance, given by the negative logarithm of the probability of the left or right tail whose end point is the observed future price. The vector of the three major characteristics of the next non-concordant price is modelled as dependent on the vectors corresponding to the present and the l - 1 previous non-concordant prices via a 3-dimensional conditional distribution derived from a 3(l + 1)-dimensional power-normal mixture distribution. The marginal distribution for each of the three major characteristics can then be derived from the conditional distribution. The mean of the j-th marginal distribution is an estimate of the j-th characteristic of the next non-concordant price, while the 100(α/2)% and 100(1 - α/2)% points of the j-th marginal distribution can be used to form a prediction interval for that characteristic. The performance measures of these estimates and prediction intervals indicate that the fitted conditional distribution is satisfactory. Thus, incorporating the distribution of the characteristics of the next non-concordant price into the model for the asset price has good potential for yielding a more realistic model.

  2. Design of optimal hyperthermia protocols for prostate cancer by controlling HSP expression through computer modeling (Invited Paper)

    NASA Astrophysics Data System (ADS)

    Rylander, Marissa N.; Feng, Yusheng; Diller, Kenneth; Bass, J.

    2005-04-01

    Heat shock proteins (HSPs) are critical components of a complex defense mechanism essential for preserving cell survival under adverse environmental conditions. It is inevitable that hyperthermia will enhance tumor tissue viability, owing to HSP expression in regions where temperatures are insufficient to coagulate proteins, and this would likely increase the probability of cancer recurrence. Although hyperthermia therapy is commonly used in conjunction with radiotherapy, chemotherapy, and gene therapy to increase therapeutic effectiveness, the efficacy of these therapies can be substantially hindered by HSP expression when hyperthermia is applied prior to these procedures. Therefore, in planning hyperthermia protocols, a prediction of the tumor's HSP response must be incorporated into the treatment plan to optimize the thermal dose delivery and permit prediction of the overall tissue response. In this paper, we present a highly accurate, adaptive, finite element tumor model capable of predicting the HSP expression distribution and tissue damage region from measured cellular data when hyperthermia protocols are specified. Cubic spline representations of HSP27 and HSP70 expression, together with Arrhenius damage models, were integrated into the finite element model to enable prediction of the HSP expression and damage distributions in the tissue following laser heating. Application of the model can enable optimized treatment planning by controlling the tissue response to therapy based on accurate prediction of the HSP expression and cell damage distribution.
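
    The Arrhenius damage model mentioned above reduces to a damage integral over the temperature history; the constants below are the classic Henriques skin-burn values used for illustration, whereas the paper fits constants to its own measured cellular data:

    ```python
    import numpy as np

    def arrhenius_damage(t_s, T_K, A=3.1e98, Ea=6.28e5, R=8.314):
        """Arrhenius damage integral Omega = int A * exp(-Ea / (R T(t))) dt,
        computed with the trapezoidal rule; Omega >= 1 is the conventional
        criterion for irreversible thermal damage."""
        rates = A * np.exp(-Ea / (R * np.asarray(T_K)))
        return float(np.sum(0.5 * (rates[1:] + rates[:-1]) * np.diff(t_s)))

    t = np.linspace(0.0, 600.0, 601)                         # 10-min exposure, s
    T = 310.0 + 10.0 * np.exp(-((t - 300.0) / 120.0) ** 2)   # heating pulse, K
    print(arrhenius_damage(t, T))
    ```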

  3. Modelling of PM10 concentration for industrialized area in Malaysia: A case study in Shah Alam

    NASA Astrophysics Data System (ADS)

    N, Norazian Mohamed; Abdullah, M. M. A.; Tan, Cheng-yau; Ramli, N. A.; Yahaya, A. S.; Fitri, N. F. M. Y.

    In Malaysia, the predominant air pollutants are suspended particulate matter (SPM) and nitrogen dioxide (NO2). This research focuses on PM10, which may harm human health as well as the environment. Six distributions, namely Weibull, log-normal, gamma, Rayleigh, Gumbel and Frechet, were chosen to model the PM10 observations at the chosen industrial area, i.e. Shah Alam. Hourly average data for the one-year periods 2006 and 2007 were used for this research. For parameter estimation, the method of maximum likelihood estimation (MLE) was selected. Four performance indicators, namely mean absolute error (MAE), root mean squared error (RMSE), coefficient of determination (R2) and prediction accuracy (PA), were applied to determine the goodness-of-fit of the distributions. The distribution that best fits the PM10 observations in Shah Alam was found to be the log-normal distribution. The probabilities of exceedance concentrations were calculated, and the return period for the coming year was predicted from the cumulative density function (cdf) of the best-fit distribution. Based on the 2006 data, Shah Alam was predicted to exceed 150 μg/m3 for 5.9 days in 2007, with a return period of one occurrence per 62 days. For 2007, the studied area was predicted not to exceed the MAAQG of 150 μg/m3.
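
    A sketch of the fit-and-exceedance calculation with scipy, using synthetic hourly concentrations in place of the Shah Alam data:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    pm10 = rng.lognormal(mean=3.7, sigma=0.5, size=8760)  # synthetic hourly PM10, ug/m3

    # MLE fit of the log-normal distribution (location fixed at zero).
    shape, loc, scale = stats.lognorm.fit(pm10, floc=0.0)

    threshold = 150.0                                     # MAAQG, ug/m3
    p_exceed = stats.lognorm.sf(threshold, shape, loc=loc, scale=scale)
    print(p_exceed * 365.0)         # expected exceedance days per year
    print(1.0 / (p_exceed * 24.0))  # return period in days between exceedance hours
    ```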

  4. Exact calculation of distributions on integers, with application to sequence alignment.

    PubMed

    Newberg, Lee A; Lawrence, Charles E

    2009-01-01

    Computational biology is replete with high-dimensional discrete prediction and inference problems. Dynamic programming recursions can be applied to several of the most important of these, including sequence alignment, RNA secondary-structure prediction, phylogenetic inference, and motif finding. In these problems, attention is frequently focused on some scalar quantity of interest, a score, such as an alignment score or the free energy of an RNA secondary structure. In many cases, score is naturally defined on integers, such as a count of the number of pairing differences between two sequence alignments, or else an integer score has been adopted for computational reasons, such as in the test of significance of motif scores. The probability distribution of the score under an appropriate probabilistic model is of interest, such as in tests of significance of motif scores, or in calculation of Bayesian confidence limits around an alignment. Here we present three algorithms for calculating the exact distribution of a score of this type; then, in the context of pairwise local sequence alignments, we apply the approach so as to find the alignment score distribution and Bayesian confidence limits.
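
    For independent per-position integer scores, the exact score distribution follows from repeated convolution; real alignment recursions track more state, but the core dynamic-programming step looks like this:

    ```python
    import numpy as np

    def exact_score_distribution(step_dists):
        """Exact distribution of a sum of independent integer-valued scores,
        built by repeated convolution; dist[s] ends up as P(total = s)."""
        dist = np.array([1.0])                 # P(score = 0) before any step
        for d in step_dists:
            dist = np.convolve(dist, d)        # fold in one step's score
        return dist

    # Three positions, each scoring 0, 1 or 2 with the given probabilities.
    steps = [np.array([0.5, 0.3, 0.2])] * 3
    pmf = exact_score_distribution(steps)
    print(pmf, pmf.sum())                      # exact PMF over scores 0..6
    ```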

  5. Suboptimal Decision Criteria Are Predicted by Subjectively Weighted Probabilities and Rewards

    PubMed Central

    Ackermann, John F.; Landy, Michael S.

    2014-01-01

    Subjects performed a visual detection task in which the probability of target occurrence at each of the two possible locations, and the rewards for correct responses for each, were varied across conditions. To maximize monetary gain, observers should bias their responses, choosing one location more often than the other in line with the varied probabilities and rewards. Typically, and in our task, observers do not bias their responses to the extent they should, and instead distribute their responses more evenly across locations, a phenomenon referred to as ‘conservatism.’ We investigated several hypotheses regarding the source of the conservatism. We measured utility and probability weighting functions under Prospect Theory for each subject in an independent economic choice task and used the weighting-function parameters to calculate each subject’s subjective utility (SU(c)) as a function of the criterion c, and the corresponding weighted optimal criteria (wc_opt). Subjects’ criteria were not close to optimal relative to wc_opt. The slope of SU(c) and of expected gain EG(c) at the neutral criterion corresponding to β = 1 were both predictive of subjects’ criteria. The slope of SU(c) was a better predictor of observers’ decision criteria overall. Thus, rather than behaving optimally, subjects move their criterion away from the neutral criterion by estimating how much they stand to gain by such a change based on the slope of subjective gain as a function of criterion, using inherently distorted probabilities and values. PMID:25366822

  6. Suboptimal decision criteria are predicted by subjectively weighted probabilities and rewards.

    PubMed

    Ackermann, John F; Landy, Michael S

    2015-02-01

    Subjects performed a visual detection task in which the probability of target occurrence at each of the two possible locations, and the rewards for correct responses for each, were varied across conditions. To maximize monetary gain, observers should bias their responses, choosing one location more often than the other in line with the varied probabilities and rewards. Typically, and in our task, observers do not bias their responses to the extent they should, and instead distribute their responses more evenly across locations, a phenomenon referred to as 'conservatism.' We investigated several hypotheses regarding the source of the conservatism. We measured utility and probability weighting functions under Prospect Theory for each subject in an independent economic choice task and used the weighting-function parameters to calculate each subject's subjective utility (SU(c)) as a function of the criterion c, and the corresponding weighted optimal criteria (wc_opt). Subjects' criteria were not close to optimal relative to wc_opt. The slope of SU(c) and of expected gain EG(c) at the neutral criterion corresponding to β = 1 were both predictive of the subjects' criteria. The slope of SU(c) was a better predictor of observers' decision criteria overall. Thus, rather than behaving optimally, subjects move their criterion away from the neutral criterion by estimating how much they stand to gain by such a change based on the slope of subjective gain as a function of criterion, using inherently distorted probabilities and values.

  7. Impact of signal scattering and parametric uncertainties on receiver operating characteristics

    NASA Astrophysics Data System (ADS)

    Wilson, D. Keith; Breton, Daniel J.; Hart, Carl R.; Pettit, Chris L.

    2017-05-01

    The receiver operating characteristic (ROC) curve, a plot of the probability of detection as a function of the probability of false alarm, plays a key role in the classical analysis of detector performance. However, meaningful characterization of the ROC curve is challenging when practically important complications such as variations in source emissions, environmental impacts on the signal propagation, uncertainties in the sensor response, and multiple sources of interference are considered. In this paper, a relatively simple but realistic model for scattered signals is employed to explore how parametric uncertainties impact the ROC curve. In particular, we show that parametric uncertainties in the mean signal and noise power substantially raise the tails of the distributions; since receiver operation with a very low probability of false alarm and a high probability of detection is normally desired, these tails lead to severely degraded performance. Because full a priori knowledge of such parametric uncertainties is rarely available in practice, analyses must typically be based on a finite sample of environmental states, which only partially characterize the range of parameter variations. We show how this effect can lead to misleading assessments of system performance. For the cases considered, approximately 64 or more statistically independent samples of the uncertain parameters are needed to accurately predict the probabilities of detection and false alarm. A connection is also described between selection of suitable distributions for the uncertain parameters, and Bayesian adaptive methods for inferring the parameters.
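
    A small numerical sketch of the central effect, under an assumed square-law detector for a Rayleigh-fading signal in unit-mean exponential noise power (so Pfa = exp(-t) and Pd = exp(-t/(1+S))): averaging Pd over a log-normal scatter in mean SNR fattens the low-Pfa end of the ROC relative to the fixed-SNR curve. The 10 dB nominal SNR and 4 dB scatter are invented values, not the paper's.

    ```python
    # ROC degradation from uncertainty in mean signal power, square-law detector.
    import numpy as np

    rng = np.random.default_rng(1)
    pfa = np.logspace(-6, -1, 6)
    thresh = -np.log(pfa)                     # Pfa = exp(-t) for unit-mean noise

    S_nominal = 10 ** (10.0 / 10)             # nominal mean SNR of 10 dB (assumed)
    pd_fixed = np.exp(-thresh / (1 + S_nominal))

    # Mean SNR uncertain: 4 dB log-normal scatter about the nominal value.
    S = 10 ** ((10.0 + 4.0 * rng.standard_normal(20000)) / 10)
    pd_uncertain = np.exp(-thresh[:, None] / (1 + S[None, :])).mean(axis=1)

    for p, d1, d2 in zip(pfa, pd_fixed, pd_uncertain):
        print(f"Pfa={p:.0e}  Pd fixed={d1:.3f}  Pd uncertain={d2:.3f}")
    ```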

  8. A strategy for improved computational efficiency of the method of anchored distributions

    NASA Astrophysics Data System (ADS)

    Over, Matthew William; Yang, Yarong; Chen, Xingyuan; Rubin, Yoram

    2013-06-01

    This paper proposes a strategy for improving the computational efficiency of model inversion using the method of anchored distributions (MAD) by "bundling" similar model parametrizations in the likelihood function. Inferring the likelihood function typically requires a large number of forward model (FM) simulations for each possible model parametrization; as a result, the process is quite expensive. To ease this prohibitive cost, we present an approximation for the likelihood function, called bundling, that relaxes the requirement for large numbers of FM simulations. This approximation redefines the conditional statement of the likelihood function as the probability that a set of similar model parametrizations (a "bundle") replicates field measurements, which we show is neither a model reduction nor a sampling approach to improving the computational efficiency of model inversion. To evaluate the effectiveness of these modifications, we compare the quality of predictions and computational cost of bundling relative to a baseline MAD inversion of 3-D flow and transport model parameters. Additionally, to aid understanding of the implementation, we provide a tutorial for bundling in the form of a sample data set and script for the R statistical computing language. For our synthetic experiment, bundling achieved a 35% reduction in overall computational cost and had a limited negative impact on the predicted probability distributions of the model parameters. Strategies for minimizing error in the bundling approximation, for enforcing similarity among the sets of model parametrizations, and for identifying convergence of the likelihood function are also presented.
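
    The paper's tutorial is in R; purely to convey the flavor of the idea in the running language of the examples here, below is a hedged Python sketch: cluster similar parametrizations into bundles, run the forward model once per bundle representative, and share that output across the bundle when evaluating the likelihood. The forward model, data and error model are all invented.

    ```python
    # Bundling sketch: 25 forward-model runs stand in for 500.
    import numpy as np
    from scipy.cluster.vq import kmeans2
    from scipy.stats import norm

    def forward_model(theta):          # stand-in for an expensive FM simulation
        return np.array([theta[0] + theta[1], theta[0] * theta[1]])

    rng = np.random.default_rng(2)
    thetas = rng.normal(size=(500, 2))         # candidate model parametrizations
    y_obs = np.array([1.0, 0.2])               # "field measurements" (made up)
    sigma = 0.3                                # measurement error sd (assumed)

    # Bundle the 500 parametrizations into 25 groups; one FM run per bundle.
    centers, labels = kmeans2(thetas, 25, minit='++', seed=2)
    bundle_pred = np.array([forward_model(c) for c in centers])

    # Each parametrization borrows its bundle's prediction in the likelihood.
    log_like = norm.logpdf(y_obs, loc=bundle_pred[labels], scale=sigma).sum(axis=1)
    print("FM runs: 25 instead of 500; best log-likelihood:", log_like.max().round(3))
    ```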

  9. Where to Dig for Fossils: Combining Climate-Envelope, Taphonomy and Discovery Models

    PubMed Central

    Block, Sebastián; Saltré, Frédérik; Rodríguez-Rey, Marta; Fordham, Damien A.; Unkel, Ingmar; Bradshaw, Corey J. A.

    2016-01-01

    Fossils represent invaluable data to reconstruct the past history of life, yet fossil-rich sites are often rare and difficult to find. The traditional fossil-hunting approach focuses on small areas and has not yet taken advantage of modelling techniques commonly used in ecology to account for an organism’s past distributions. We propose a new method to assist finding fossils at continental scales based on modelling the past distribution of species, the geological suitability of fossil preservation and the likelihood of fossil discovery in the field, and apply it to several genera of Australian megafauna that went extinct in the Late Quaternary. Our models predicted higher fossil potentials for independent sites than for randomly selected locations (mean Kolmogorov-Smirnov statistic = 0.66). We demonstrate the utility of accounting for the distribution history of fossil taxa when trying to find the most suitable areas to look for fossils. For some genera, the probability of finding fossils based on simple climate-envelope models was higher than the probability based on models incorporating current conditions associated with fossil preservation and discovery as predictors. However, combining the outputs from climate-envelope, preservation, and discovery models resulted in the most accurate predictions of potential fossil sites at a continental scale. We proposed potential areas to discover new fossils of Diprotodon, Zygomaturus, Protemnodon, Thylacoleo, and Genyornis, and provide guidelines on how to apply our approach to assist fossil hunting in other continents and geological settings. PMID:27027874
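
    Assuming the three model outputs are per-cell probability surfaces on a common raster, the combination step can be sketched as a simple product rule (a find requires suitable past habitat, preservation and discovery jointly, treated as independent here); the random layers below are placeholders, not the study's surfaces.

    ```python
    # Combine climate-envelope, preservation and discovery surfaces into a
    # fossil-potential map and rank cells for survey.
    import numpy as np

    rng = np.random.default_rng(3)
    shape = (100, 100)                          # illustrative raster
    p_climate = rng.uniform(size=shape)         # past climate-envelope suitability
    p_preserve = rng.uniform(size=shape)        # taphonomic suitability
    p_discover = rng.uniform(size=shape)        # discovery likelihood

    fossil_potential = p_climate * p_preserve * p_discover

    idx = np.argsort(fossil_potential, axis=None)[::-1][:5]
    rows, cols = np.unravel_index(idx, shape)
    print("top candidate cells (row, col):", list(zip(rows.tolist(), cols.tolist())))
    ```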

  10. Where to Dig for Fossils: Combining Climate-Envelope, Taphonomy and Discovery Models.

    PubMed

    Block, Sebastián; Saltré, Frédérik; Rodríguez-Rey, Marta; Fordham, Damien A; Unkel, Ingmar; Bradshaw, Corey J A

    2016-01-01

    Fossils represent invaluable data to reconstruct the past history of life, yet fossil-rich sites are often rare and difficult to find. The traditional fossil-hunting approach focuses on small areas and has not yet taken advantage of modelling techniques commonly used in ecology to account for an organism's past distributions. We propose a new method to assist finding fossils at continental scales based on modelling the past distribution of species, the geological suitability of fossil preservation and the likelihood of fossil discovery in the field, and apply it to several genera of Australian megafauna that went extinct in the Late Quaternary. Our models predicted higher fossil potentials for independent sites than for randomly selected locations (mean Kolmogorov-Smirnov statistic = 0.66). We demonstrate the utility of accounting for the distribution history of fossil taxa when trying to find the most suitable areas to look for fossils. For some genera, the probability of finding fossils based on simple climate-envelope models was higher than the probability based on models incorporating current conditions associated with fossil preservation and discovery as predictors. However, combining the outputs from climate-envelope, preservation, and discovery models resulted in the most accurate predictions of potential fossil sites at a continental scale. We proposed potential areas to discover new fossils of Diprotodon, Zygomaturus, Protemnodon, Thylacoleo, and Genyornis, and provide guidelines on how to apply our approach to assist fossil hunting in other continents and geological settings.

  11. Predicting structures in the Zone of Avoidance

    NASA Astrophysics Data System (ADS)

    Sorce, Jenny G.; Colless, Matthew; Kraan-Korteweg, Renée C.; Gottlöber, Stefan

    2017-11-01

    The Zone of Avoidance (ZOA), whose apparent emptiness is an artefact of our Galaxy's dust, has been challenging observers as well as theorists for many years. Multiple attempts have been made on the observational side to map this region in order to better understand the local flows. On the theoretical side, however, this region is often simply statistically populated with structures, and no real attempt has been made to confront theoretical and observed matter distributions. This paper takes a step forward using constrained realizations (CRs) of the local Universe, shown to be perfect substitutes for local Universe-like simulations in smoothed high-density peak studies. Far from generating completely 'random' structures in the ZOA, the reconstruction technique arranges matter according to the surrounding environment of this region. More precisely, the mean distributions of structures in a series of constrained and random realizations (RRs) differ: while densities annihilate each other when averaging over 200 RRs, structures persist when summing 200 CRs. The probability distribution function for ZOA grid cells to be highly overdense is a Gaussian with a 15 per cent mean in the random case, while that of the constrained case exhibits large tails. This implies that the areas with the largest probabilities most likely host a structure. Comparisons between these predictions and observations, like those of the Puppis 3 cluster, show remarkable agreement and allow us to assert the presence of the Vela supercluster, recently highlighted by observations, at about 180 h-1 Mpc, right behind the thickest dust layers of our Galaxy.

  12. Technical note: Bayesian calibration of dynamic ruminant nutrition models.

    PubMed

    Reed, K F; Arhonditsis, G B; France, J; Kebreab, E

    2016-08-01

    Mechanistic models of ruminant digestion and metabolism have advanced our understanding of the processes underlying ruminant animal physiology. Deterministic modeling practices ignore the inherent variation within and among individual animals and thus have no way to assess how sources of error influence model outputs. We introduce Bayesian calibration of mathematical models to address the need for robust mechanistic modeling tools that can accommodate error analysis by remaining within the bounds of data-based parameter estimation. For the purpose of prediction, the Bayesian approach generates a posterior predictive distribution that represents the current estimate of the value of the response variable, taking into account both the uncertainty about the parameters and model residual variability. Predictions are expressed as probability distributions, thereby conveying significantly more information than point estimates in regard to uncertainty. Our study illustrates some of the technical advantages of Bayesian calibration and discusses the future perspectives in the context of animal nutrition modeling. Copyright © 2016 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.
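
    A toy sketch of the calibration loop the note describes, with a one-parameter stand-in model and a random-walk Metropolis sampler: the posterior predictive distribution combines parameter uncertainty with residual variability. Everything numerical here is invented for illustration.

    ```python
    # Bayesian calibration of a toy "mechanistic" model y = theta * sqrt(x).
    import numpy as np

    rng = np.random.default_rng(4)
    x = np.linspace(0, 10, 30)
    theta_true, sigma = 2.0, 1.0
    y = theta_true * np.sqrt(x) + sigma * rng.standard_normal(x.size)

    def log_post(theta):                        # Gaussian likelihood, N(0,100) prior
        resid = y - theta * np.sqrt(x)
        return -0.5 * np.sum(resid**2) / sigma**2 - 0.5 * theta**2 / 100

    samples, cur, lp_cur = [], 1.0, log_post(1.0)
    for _ in range(5000):                       # random-walk Metropolis
        prop = cur + 0.2 * rng.standard_normal()
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp_cur:
            cur, lp_cur = prop, lp_prop
        samples.append(cur)
    post = np.array(samples[1000:])             # drop burn-in

    # Posterior predictive at a new input: parameter draw plus residual noise.
    y_new = post * np.sqrt(7.5) + sigma * rng.standard_normal(post.size)
    print("posterior predictive mean/sd:", y_new.mean().round(2), y_new.std().round(2))
    ```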

  13. A synopsis of climate change effects on groundwater recharge

    NASA Astrophysics Data System (ADS)

    Smerdon, Brian D.

    2017-12-01

    Six review articles published between 2011 and 2016 on groundwater and climate change are briefly summarized. This synopsis focuses on aspects related to predicting changes to groundwater recharge conditions, with several common conclusions between the review articles being noted. The uncertainty of distribution and trend in future precipitation from General Circulation Models (GCMs) results in varying predictions of recharge, so much so that modelling studies are often not able to predict the magnitude and direction (increase or decrease) of future recharge conditions. Evolution of modelling approaches has led to the use of multiple GCMs and hydrologic models to create an envelope of future conditions that reflects the probability distribution. The choice of hydrologic model structure and complexity, and the choice of emissions scenario, has been investigated and somewhat resolved; however, recharge results remain sensitive to downscaling methods. To overcome uncertainty and provide practical use in water management, the research community indicates that modelling at a mesoscale, somewhere between watersheds and continents, is likely ideal. Improvements are also suggested for incorporating groundwater processes within GCMs.

  14. Data Assimilation - Advances and Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Brian J.

    2014-07-30

    This presentation provides an overview of data assimilation (model calibration) for complex computer experiments. Calibration refers to the process of probabilistically constraining uncertain physics/engineering model inputs to be consistent with observed experimental data. An initial probability distribution for these parameters is updated using the experimental information. Utilization of surrogate models and empirical adjustment for model form error in code calibration form the basis for the statistical methodology considered. The role of probabilistic code calibration in supporting code validation is discussed. Incorporation of model form uncertainty in rigorous uncertainty quantification (UQ) analyses is also addressed. Design criteria used within a batch sequential design algorithm are introduced for efficiently achieving predictive maturity and improved code calibration. Predictive maturity refers to obtaining stable predictive inference with calibrated computer codes. These approaches allow for augmentation of initial experiment designs for collecting new physical data. A standard framework for data assimilation is presented and techniques for updating the posterior distribution of the state variables based on particle filtering and the ensemble Kalman filter are introduced.
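
    Of the update techniques mentioned, the ensemble Kalman filter has the most compact core. Below is a hedged sketch of one stochastic (perturbed-observation) analysis step on an invented 10-dimensional state with a single observed component.

    ```python
    # One EnKF analysis step: update a forecast ensemble with a noisy observation.
    import numpy as np

    rng = np.random.default_rng(5)
    n_state, n_ens = 10, 50
    X = rng.standard_normal((n_state, n_ens)) + 2.0   # forecast ensemble (columns)
    H = np.zeros((1, n_state)); H[0, 3] = 1.0         # observe state component 3
    R = np.array([[0.25]])                            # observation error covariance
    y = np.array([1.2])                               # the observation

    A = X - X.mean(axis=1, keepdims=True)             # ensemble anomalies
    P = A @ A.T / (n_ens - 1)                         # sample forecast covariance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)      # Kalman gain

    # Perturbed-observation update: one perturbed copy of y per member.
    Y = y[:, None] + rng.multivariate_normal(np.zeros(1), R, size=n_ens).T
    Xa = X + K @ (Y - H @ X)                          # analysis ensemble
    print("prior mean:", X[3].mean().round(2), "-> posterior:", Xa[3].mean().round(2))
    ```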

  15. Risky business: The impact of climate and climate variability on human population dynamics in Western Europe during the Last Glacial Maximum

    NASA Astrophysics Data System (ADS)

    Burke, Ariane; Kageyama, Masa; Latombe, Guillaume; Fasel, Marc; Vrac, Mathieu; Ramstein, Gilles; James, Patrick M. A.

    2017-05-01

    The extent to which climate change has affected the course of human evolution is an enduring question. The ability to maintain spatially extensive social networks and a fluid social structure allows human foragers to "map onto" the landscape, mitigating the impact of ecological risk and conferring resilience. But what are the limits of resilience and to which environmental variables are foraging populations sensitive? We address this question by testing the impact of a suite of environmental variables, including climate variability, on the distribution of human populations in Western Europe during the Last Glacial Maximum (LGM). Climate variability affects the distribution of plant and animal resources unpredictably, creating an element of risk for foragers for whom mobility comes at a cost. We produce a model of habitat suitability that allows us to generate predictions about the probable distribution of human populations and discuss the implications of these predictions for the structure of human populations and their social and cultural evolution during the LGM.

  16. A new species of Desmopachria Babington (Coleoptera: Dytiscidae) from Cuba with a prediction of its geographic distribution and notes on other Cuban species of the genus.

    PubMed

    Megna, Yoandri S; Sánchez-Fernández, David

    2014-01-10

    A new species, Desmopachria andreae sp. n., is described from Cuba. Diagnostic characters, including illustrations of male genitalia, are provided for the five species of the genus occurring on the island. For these five species, both a simple key to adults and maps of their known distribution in Cuba are provided. Using a Maximum Entropy method (MaxEnt), a distribution model was developed for D. andreae sp. n. Based on the model's predictions, this species has a higher probability of occurring in high-altitude forests (above 1000 m a.s.l.) characterised by relatively low temperatures, especially during the hottest and wettest seasons; specifically, the mountainous areas of the Macizo de Guamuhaya (Central Cuba), Sierra Maestra (S Cuba) and Nipe-Sagua-Baracoa (NE Cuba). In some of these areas the species has not yet been recorded, and it should be searched for in future field surveys.

  17. Combining public participatory surveillance and occupancy modelling to predict the distributional response of Ixodes scapularis to climate change.

    PubMed

    Lieske, David J; Lloyd, Vett K

    2018-03-01

    Ixodes scapularis, a known vector of Borrelia burgdorferi sensu stricto (Bbss), is undergoing range expansion in many parts of Canada. The province of New Brunswick, which borders jurisdictions with established populations of I. scapularis, constitutes a range expansion zone for this species. To better understand the current and potential future distribution of this tick under climate change projections, this study applied occupancy modelling to distributional records of adult ticks that successfully overwintered, obtained through passive surveillance. This study indicates that I. scapularis occurs throughout the southern-most portion of the province, in close proximity to coastlines and major waterways. Milder winter conditions, as indicated by the number of degree days <0 °C, were determined to be a strong predictor of tick occurrence, as were, to a lesser degree, rising levels of annual precipitation, leading to a final model with a predictive accuracy of 0.845 (range: 0.828-0.893). Both RCP 4.5 and RCP 8.5 climate projections predict that a significant proportion of the province (roughly a quarter to a third) will be highly suitable for I. scapularis by the 2080s. Comparison with cases of canine infection shows good spatial agreement with baseline model predictions, but the presence of canine Borrelia infections beyond the climate envelope, defined by the highest probabilities of tick occurrence, suggests the presence of Bbss-carrying ticks distributed by long-range dispersal events. This research demonstrates that predictive statistical modelling of multi-year surveillance information is an efficient way to identify areas where I. scapularis is most likely to occur, and can be used to guide subsequent active sampling efforts in order to better understand fine-scale species distributional patterns. Copyright © 2018 The Authors. Published by Elsevier GmbH. All rights reserved.

  18. Mathematical modeling and experimental validation of the spatial distribution of boron in the root of Arabidopsis thaliana identify high boron accumulation in the tip and predict a distinct root tip uptake function.

    PubMed

    Shimotohno, Akie; Sotta, Naoyuki; Sato, Takafumi; De Ruvo, Micol; Marée, Athanasius F M; Grieneisen, Verônica A; Fujiwara, Toru

    2015-04-01

    Boron, an essential micronutrient, is transported in roots of Arabidopsis thaliana mainly by two different types of transporters, BORs and NIPs (nodulin26-like intrinsic proteins). Both are plasma membrane localized, but have distinct transport properties and patterns of cell type-specific accumulation with different polar localizations, which are likely to affect boron distribution. Here, we used mathematical modeling and an experimental determination to address boron distributions in the root. A computational model of the root is created at the cellular level, describing the boron transporters as observed experimentally. Boron is allowed to diffuse into roots, in cells and cell walls, and to be transported over plasma membranes, reflecting the properties of the different transporters. The model predicts that a region around the quiescent center has a higher concentration of soluble boron than other portions. To evaluate this prediction experimentally, we determined the boron distribution in roots using laser ablation-inductively coupled plasma-mass spectrometry. The analysis indicated that the boron concentration is highest near the tip and is lower in the more proximal region of the meristem zone, similar to the pattern of soluble boron distribution predicted by the model. Our model also predicts that upward boron flux does not continuously increase from the root tip toward the mature region, indicating that boron taken up in the root tip is not efficiently transported to shoots. This suggests that root tip-absorbed boron is probably used for local root growth, and that instead it is the more mature root regions which have a greater role in transporting boron toward the shoots. © The Author 2015. Published by Oxford University Press on behalf of Japanese Society of Plant Physiologists.

  19. Mathematical Modeling and Experimental Validation of the Spatial Distribution of Boron in the Root of Arabidopsis thaliana Identify High Boron Accumulation in the Tip and Predict a Distinct Root Tip Uptake Function

    PubMed Central

    Shimotohno, Akie; Sotta, Naoyuki; Sato, Takafumi; De Ruvo, Micol; Marée, Athanasius F.M.; Grieneisen, Verônica A.; Fujiwara, Toru

    2015-01-01

    Boron, an essential micronutrient, is transported in roots of Arabidopsis thaliana mainly by two different types of transporters, BORs and NIPs (nodulin26-like intrinsic proteins). Both are plasma membrane localized, but have distinct transport properties and patterns of cell type-specific accumulation with different polar localizations, which are likely to affect boron distribution. Here, we used mathematical modeling and an experimental determination to address boron distributions in the root. A computational model of the root is created at the cellular level, describing the boron transporters as observed experimentally. Boron is allowed to diffuse into roots, in cells and cell walls, and to be transported over plasma membranes, reflecting the properties of the different transporters. The model predicts that a region around the quiescent center has a higher concentration of soluble boron than other portions. To evaluate this prediction experimentally, we determined the boron distribution in roots using laser ablation-inductively coupled plasma-mass spectrometry. The analysis indicated that the boron concentration is highest near the tip and is lower in the more proximal region of the meristem zone, similar to the pattern of soluble boron distribution predicted by the model. Our model also predicts that upward boron flux does not continuously increase from the root tip toward the mature region, indicating that boron taken up in the root tip is not efficiently transported to shoots. This suggests that root tip-absorbed boron is probably used for local root growth, and that instead it is the more mature root regions which have a greater role in transporting boron toward the shoots. PMID:25670713
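
    The full model is cellular and multi-transporter; purely as a caricature of the qualitative mechanism (tip-localized uptake plus axial diffusion and shootward loss producing a concentration maximum near the tip), here is a 1-D finite-difference sketch with invented parameters.

    ```python
    # 1-D toy: boron diffuses along the root axis; an importer confined to the
    # tip adds boron, and a uniform efflux removes it.
    import numpy as np

    n, dx, dt = 100, 1.0, 0.1         # cells (index 0 = tip), spacing, time step
    D, k_up, k_out = 1.0, 0.5, 0.02   # diffusivity, tip uptake, efflux (invented)
    c = np.zeros(n)
    uptake = np.zeros(n); uptake[:10] = k_up    # importer activity near the tip

    for _ in range(20000):            # explicit Euler toward steady state
        lap = np.zeros(n)
        lap[1:-1] = (c[2:] - 2 * c[1:-1] + c[:-2]) / dx**2
        lap[0] = (c[1] - c[0]) / dx**2          # no-flux boundary at the tip
        lap[-1] = (c[-2] - c[-1]) / dx**2       # no-flux at the proximal end
        c += dt * (D * lap + uptake - k_out * c)

    print("tip:", c[0].round(2), " mid:", c[n // 2].round(2),
          " proximal:", c[-1].round(2))         # maximum sits near the tip
    ```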

  20. Anurans in a Subarctic Tundra Landscape Near Cape Churchill, Manitoba

    USGS Publications Warehouse

    Reiter, M.E.; Boal, C.W.; Andersen, D.E.

    2008-01-01

    Distribution, abundance, and habitat relationships of anurans inhabiting subarctic regions are poorly understood, and anuran monitoring protocols developed for temperate regions may not be applicable across large roadless areas of northern landscapes. In addition, arctic and subarctic regions of North America are predicted to experience changes in climate and, in some areas, are experiencing habitat alteration due to high rates of herbivory by breeding and migrating waterfowl. To better understand subarctic anuran abundance, distribution, and habitat associations, we conducted anuran calling surveys in the Cape Churchill region of Wapusk National Park, Manitoba, Canada, in 2004 and 2005. We conducted surveys along ~1-km transects distributed across three landscape types (coastal tundra, interior sedge meadow-tundra, and boreal forest-tundra interface) to estimate densities and probabilities of detection of Boreal Chorus Frogs (Pseudacris maculata) and Wood Frogs (Lithobates sylvaticus). We detected a Wood Frog or Boreal Chorus Frog on 22 (87%) of 26 transects surveyed, but probability of detection varied between years and species and among landscape types. Estimated densities of both species increased from the coastal zone inland toward the boreal forest edge. Our results suggest anurans occur across all three landscape types in our study area, but that species-specific spatial patterns exist in their abundances. Considerations for both spatial and temporal variation in abundance and detection probability need to be incorporated into surveys and monitoring programs for subarctic anurans.

  1. Glyph-based analysis of multimodal directional distributions in vector field ensembles

    NASA Astrophysics Data System (ADS)

    Jarema, Mihaela; Demir, Ismail; Kehrer, Johannes; Westermann, Rüdiger

    2015-04-01

    Ensemble simulations are increasingly often performed in the geosciences in order to study the uncertainty and variability of model predictions. Describing ensemble data by mean and standard deviation can be misleading in case of multimodal distributions. We present first results of a glyph-based visualization of multimodal directional distributions in 2D and 3D vector ensemble data. Directional information on the circle/sphere is modeled using mixtures of probability density functions (pdfs), which enables us to characterize the distributions with relatively few parameters. The resulting mixture models are represented by 2D and 3D lobular glyphs showing direction, spread and strength of each principal mode of the distributions. A 3D extension of our approach is realized by means of an efficient GPU rendering technique. We demonstrate our method in the context of ensemble weather simulations.
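
    A sketch of the modelling step, assuming 2-D directions (angles) and a two-component von Mises mixture fitted by direct minimization of the negative log-likelihood; the sample data and starting values are invented, and the glyph rendering itself is omitted.

    ```python
    # Fit a two-component von Mises mixture to angular ensemble data.
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import vonmises

    rng = np.random.default_rng(6)
    theta = np.concatenate([vonmises.rvs(4.0, loc=0.5, size=300, random_state=rng),
                            vonmises.rvs(6.0, loc=2.8, size=200, random_state=rng)])

    def nll(p):                        # p = [logit w, log k1, mu1, log k2, mu2]
        w = 1 / (1 + np.exp(-p[0]))
        pdf = (w * vonmises.pdf(theta, np.exp(p[1]), loc=p[2]) +
               (1 - w) * vonmises.pdf(theta, np.exp(p[3]), loc=p[4]))
        return -np.sum(np.log(pdf + 1e-300))

    res = minimize(nll, x0=[0.0, 1.0, 0.4, 1.0, 3.0], method='Nelder-Mead',
                   options={'maxiter': 5000})
    w = 1 / (1 + np.exp(-res.x[0]))
    print(f"weight={w:.2f}; mode 1 at {res.x[2]:.2f} (kappa={np.exp(res.x[1]):.1f}); "
          f"mode 2 at {res.x[4]:.2f} (kappa={np.exp(res.x[3]):.1f})")
    ```

    Each fitted component's direction, spread (kappa) and weight would then drive one lobe of a glyph.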

  2. Dispersion of the invasive common carp Cyprinus carpio in southern South America: changes and expectations, westward and southward.

    PubMed

    Crichigno, S; Cordero, P; Blasetti, G; Cussac, V

    2016-07-01

    Common carp Cyprinus carpio possess multiple traits that contribute to their success as an invasive species. They have been introduced across the globe, and abundant populations can have numerous negative effects. Although ecological niche-based modelling techniques have been used to predict the potential range of C. carpio invasion in U.S.A., occurrence and abundance patterns have not yet been considered on a regional scale. In the present review new locations are documented, the status of the southernmost population has been studied and the probability of new lakes and reservoirs being colonized by C. carpio has been obtained and related to environmental conditions. The new localities for C. carpio have expanded its distribution westward, into the Andean Region, and present results from the South American southernmost population have shown a well-established population. Analysis of presence data provided two principal results: (1) the probability of a site being with C. carpio can be inferred using environmental variables and (2) the probability of a site being with C. carpio is a useful tool for the prediction of future invasions. Selective fishing on the Negro basin could constitute a potential mitigation measure, decreasing the abundance of the species and thus reducing the species' potential for southward expansion. These results reinforce the idea that artisanal fisheries, food production and conservation interests should be taken into account by local government management agencies in any discussion regarding the southern distribution of C. carpio in the near future. © 2016 The Fisheries Society of the British Isles.

  3. Niche modeling predictions of the potential distribution of Marmota himalayana, the host animal of plague in Yushu County of Qinghai.

    PubMed

    Lu, Liang; Ren, Zhoupeng; Yue, Yujuan; Yu, Xiaotao; Lu, Shan; Li, Guichang; Li, Hailong; Wei, Jianchun; Liu, Jingli; Mu, You; Hai, Rong; Yang, Yonghai; Wei, Rongjie; Kan, Biao; Wang, Hu; Wang, Jinfeng; Wang, Zuyun; Liu, Qiyong; Xu, Jianguo

    2016-02-24

    After the earthquake on 14, April 2010 at Yushu in China, a plague epidemic hosted by Himalayan marmot (Marmota himalayana) became a major public health concern during the reconstruction period. A rapid assessment of the distribution of Himalayan marmot in the area was urgent. The aims of this study were to analyze the relationship between environmental factors and the distribution of burrow systems of the marmot and to predict the distribution of marmots. Two types of marmot burrows (hibernation and temporary) in Yushu County were investigated from June to September in 2011. The location of every burrow was recorded with a global positioning system receiver. An ecological niche model was used to determine the relationship between the burrow occurrence data and environmental variables, such as land surface temperature (LST) in winter and summer, normalized difference vegetation index (NDVI) in winter and summer, elevation, and soil type. The predictive accuracies of the models were assessed by the area under the curve of the receiving operator curve. The models for hibernation and temporary burrows both performed well. The contribution orders of the variables were LST in winter and soil type, NDVI in winter and elevation for the hibernation burrow model, and LST in summer, NDVI in summer, soil type and elevation in the temporary burrow model. There were non-linear relationships between the probability of burrow presence and LST, NDVI and elevation. LST of 14 and 23 °C, NDVI of 0.22 and 0.60, and 4100 m were inflection points. A substantially higher probability of burrow presence was observed in swamp soil and dark felty soil than in other soil types. The potential area for hibernation burrows was 5696 km(2) (37.7% of Yushu County), and the area for temporary burrows was 7711 km(2) (51.0% of Yushu County). The results suggested that marmots preferred warm areas with relatively low altitudes and good vegetation conditions in Yushu County. Based on these results, the present research is useful in understanding the niche selection and distribution pattern of marmots in this region.

  4. The study of RMB exchange rate complex networks based on fluctuation mode

    NASA Astrophysics Data System (ADS)

    Yao, Can-Zhong; Lin, Ji-Nan; Zheng, Xu-Zhou; Liu, Xiao-Feng

    2015-10-01

    In this paper, we investigate the characteristics of RMB exchange rate time-series fluctuations using symbolization and coarse-graining methods. First, based on the fluctuation features of the RMB exchange rate, we define the first type of fluctuation mode as one specific foreign currency against RMB over four days' fluctuating situations, and the second type as four different foreign currencies against RMB in one day's fluctuating situation. With this transformation, we construct unique-currency and multi-currency complex networks. Further, by analyzing topological features of the fluctuation-mode complex networks, including out-degree, betweenness centrality and clustering coefficient, we find that the out-degree distributions of both types of fluctuation mode basically follow power laws with exponents between 1 and 2. Further analysis reveals that the out-degree and the clustering coefficient generally obey an approximate negative correlation. With this result, we confirm previous observations showing that the RMB exchange rate exhibits a characteristic of long-range memory. Finally, we analyze the most probable transmission route of fluctuation modes and provide a probability prediction matrix. The transmission route for RMB exchange rate fluctuation modes exhibits characteristics of partially closed loops, repetition and reversibility, which lays a solid foundation for predicting RMB exchange rate fluctuation patterns with large volumes of data.
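
    A sketch of the symbolization-and-network construction for the first mode type, using a random-walk stand-in for the exchange-rate series: daily moves become u/d symbols, 4-day windows become modes, and consecutive modes become directed edges whose row-normalized counts form a probability prediction matrix.

    ```python
    # Build a fluctuation-mode transition network from a symbolized series.
    from collections import Counter, defaultdict
    import numpy as np

    rng = np.random.default_rng(7)
    rate = np.cumsum(rng.standard_normal(1000))        # stand-in rate series
    symbols = ''.join('u' if d > 0 else 'd' for d in np.diff(rate))

    modes = [symbols[i:i + 4] for i in range(len(symbols) - 3)]  # 4-day modes
    edge_counts = Counter(zip(modes[:-1], modes[1:]))

    out_neigh = defaultdict(set)
    trans = defaultdict(dict)                          # transition counts
    for (a, b), k in edge_counts.items():
        out_neigh[a].add(b)
        trans[a][b] = k
    for a in trans:                                    # row-normalize to probabilities
        total = sum(trans[a].values())
        trans[a] = {b: k / total for b, k in trans[a].items()}

    print("modes:", len(out_neigh),
          "; max out-degree:", max(len(v) for v in out_neigh.values()))
    print("P(next mode | 'uudd'):", trans.get('uudd'))
    ```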

  5. Development of the first georeferenced map of Rhipicephalus (Boophilus) spp. in Mexico from 1970 to date and prediction of its spatial distribution.

    PubMed

    Alcala-Canto, Yazmin; Figueroa-Castillo, Juan Antonio; Ibarra-Velarde, Froylán; Vera-Montenegro, Yolanda; Cervantes-Valencia, María Eugenia; Salem, Abdelfattah Z M; Cuéllar-Ordaz, Jorge Alfredo

    2018-05-07

    The tick genus Rhipicephalus (Boophilus), particularly R. microplus, is one of the most important ectoparasites affecting livestock health and is considered an epidemiological risk because it causes significant economic losses due, mainly, to restrictions in the export of infested animals to several countries. Its spatial distribution has been tied to environmental factors, mainly warm temperatures and high relative humidity. In this work, we integrated a dataset consisting of 5843 records of Rhipicephalus spp. in Mexico covering close to 50 years to determine which environmental variables most influence this tick's distribution. Occurrences were georeferenced using the software DIVA-GIS and the potential current distribution was modelled using the maximum entropy method (Maxent). The algorithm generated a map of high predictive capability (area under the curve = 0.942), providing the contribution and permutation importance of the tested variables. Precipitation seasonality, precipitation in March, and isothermality were found to be the most significant climate variables in determining the probability of the spatial distribution of Rhipicephalus spp. in Mexico (contributing 15.7%, 36.0% and 11.1%, respectively). Our findings demonstrate that Rhipicephalus has colonized Mexico widely, including areas characterized by different types of climate. We conclude that the Maxent distribution model using Rhipicephalus records and a set of environmental variables can predict the extent of the tick range in this country, information that should support the development of integrated control strategies.

  6. On the continuity of the stationary state distribution of DPCM

    NASA Astrophysics Data System (ADS)

    Naraghi-Pour, Morteza; Neuhoff, David L.

    1990-03-01

    Continuity and singularity properties of the stationary state distribution of differential pulse code modulation (DPCM) are explored. Two-level DPCM (i.e., delta modulation) operating on a first-order autoregressive source is considered, and it is shown that, when the magnitude of the DPCM prediction coefficient is between zero and one-half, the stationary state distribution is singularly continuous; i.e., it is not discrete but concentrates on an uncountable set with a Lebesgue measure of zero. Consequently, it cannot be represented with a probability density function. For prediction coefficients with magnitude greater than or equal to one-half, the distribution is pure, i.e., either absolutely continuous and representable with a density function, or singular. This problem is compared to the well-known and still substantially unsolved problem of symmetric Bernoulli convolutions.
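
    The singular regime can be visualized numerically. For the related symmetric Bernoulli convolution X = Σ a^k b_k with independent equiprobable b_k = ±1, the support is a Cantor-like set of zero Lebesgue measure when 0 < a < 1/2; a crude histogram-occupancy check (a toy illustration, not the paper's analysis) makes the contrast with a > 1/2 visible.

    ```python
    # Histogram-occupancy contrast between a = 0.4 (singular) and a = 0.6.
    import numpy as np

    rng = np.random.default_rng(8)

    def sample_bernoulli_convolution(a, n_terms=50, n_samples=100000):
        signs = rng.choice([-1.0, 1.0], size=(n_samples, n_terms))
        return signs @ (a ** np.arange(n_terms))       # X = sum_k a^k * (+/-1)

    for a in (0.4, 0.6):
        x = sample_bernoulli_convolution(a)
        hist, _ = np.histogram(x, bins=400)
        print(f"a={a}: {np.mean(hist > 0):.0%} of bins carry mass")
    ```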

  7. Cetacean population density estimation from single fixed sensors using passive acoustics.

    PubMed

    Küsel, Elizabeth T; Mellinger, David K; Thomas, Len; Marques, Tiago A; Moretti, David; Ward, Jessica

    2011-06-01

    Passive acoustic methods are increasingly being used to estimate animal population density. Most density estimation methods are based on estimates of the probability of detecting calls as functions of distance. Typically these are obtained using receivers capable of localizing calls or from studies of tagged animals. However, both approaches are expensive to implement. The approach described here uses a Monte Carlo model to estimate the probability of detecting calls from single sensors. The passive sonar equation is used to predict signal-to-noise ratios (SNRs) of received clicks, which are then combined with a detector characterization that predicts probability of detection as a function of SNR. Input distributions for source level, beam pattern, and whale depth are obtained from the literature. Acoustic propagation modeling is used to estimate transmission loss. Other inputs for density estimation are call rate, obtained from the literature, and false positive rate, obtained from manual analysis of a data sample. The method is applied to estimate density of Blainville's beaked whales over a 6-day period around a single hydrophone located in the Tongue of the Ocean, Bahamas. Results are consistent with those from previous analyses, which use additional tag data. © 2011 Acoustical Society of America
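
    A bare-bones sketch of the single-sensor Monte Carlo: draw source level, beam-pattern loss and range, push each click through the passive sonar equation, and average an assumed detector characterization over the resulting SNRs. All distributions and constants below are invented placeholders, not the paper's inputs.

    ```python
    # Monte Carlo estimate of the mean probability of detecting a click.
    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(9)
    n = 100000
    SL = rng.normal(200, 5, n)              # source level, dB (assumed)
    beam = -rng.exponential(10, n)          # off-axis beam-pattern loss, dB (assumed)
    r = rng.uniform(100, 4000, n)           # range to the animal, m (assumed)
    TL = 20 * np.log10(r)                   # spherical-spreading transmission loss
    NL = 50.0                               # band noise level, dB (assumed)

    snr = SL + beam - TL - NL               # passive sonar equation (dB)

    # Detector characterization: smooth P(detect | SNR) around a 15 dB threshold.
    p_det = norm.cdf(snr, loc=15.0, scale=3.0)
    print(f"mean P(detect a click): {p_det.mean():.3f}")
    ```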

  8. High monetary reward rates and caloric rewards decrease temporal persistence

    PubMed Central

    Fung, Bowen J.; Bode, Stefan; Murawski, Carsten

    2017-01-01

    Temporal persistence refers to an individual's capacity to wait for future rewards, while forgoing possible alternatives. This requires a trade-off between the potential value of delayed rewards and opportunity costs, and is relevant to many real-world decisions, such as dieting. Theoretical models have previously suggested that high monetary reward rates, or positive energy balance, may result in decreased temporal persistence. In our study, 50 fasted participants engaged in a temporal persistence task, incentivised with monetary rewards. In alternating blocks of this task, rewards were delivered at delays drawn randomly from distributions with either a lower or higher maximum reward rate. During some blocks participants received either a caloric drink or water. We used survival analysis to estimate participants' probability of quitting conditional on the delay distribution and the consumed liquid. Participants had a higher probability of quitting in blocks with the higher reward rate. Furthermore, participants who consumed the caloric drink had a higher probability of quitting than those who consumed water. Our results support the predictions from the theoretical models, and importantly, suggest that both higher monetary reward rates and physiologically relevant rewards can decrease temporal persistence, which is a crucial determinant for survival in many species. PMID:28228517
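
    The survival analysis can be sketched with a hand-rolled Kaplan-Meier estimator: trials where the subject quits are events, and trials that end with reward delivery are censored. The synthetic data below are placeholders for the task records.

    ```python
    # Kaplan-Meier estimate of P(still waiting beyond t) from quit/censor data.
    import numpy as np

    rng = np.random.default_rng(10)
    n = 400
    quit_time = rng.exponential(8.0, n)     # latent time-to-quit per trial
    reward_time = rng.uniform(0, 12, n)     # scheduled reward delay per trial
    t = np.minimum(quit_time, reward_time)  # observed trial duration
    event = quit_time <= reward_time        # True where the subject quit

    order = np.argsort(t)
    t, event = t[order], event[order]
    at_risk = np.arange(n, 0, -1)           # trials still waiting at each time
    surv = np.cumprod(1 - event / at_risk)  # KM survival curve

    for q in (0.25, 0.50, 0.75):
        i = np.searchsorted(1 - surv, q)
        if i < n:
            print(f"P(quit by {t[i]:5.2f} s) ~ {q:.2f}")
    ```

    Comparing such curves between reward-rate blocks, or between the caloric-drink and water conditions, is the comparison the abstract describes.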

  9. Is Einsteinian no-signalling violated in Bell tests?

    NASA Astrophysics Data System (ADS)

    Kupczynski, Marian

    2017-11-01

    Relativistic invariance is a physical law verified in several domains of physics. The impossibility of faster-than-light influences is not questioned by quantum theory. In quantum electrodynamics, in quantum field theory and in the standard model, relativistic invariance is incorporated by construction. Quantum mechanics predicts strong long-range correlations between outcomes of spin projection measurements performed in distant laboratories. In spite of these strong correlations, marginal probability distributions should not depend on what was measured in the other laboratory; this is referred to, in short, as no-signalling. In several experiments performed to test various Bell-type inequalities, some unexplained dependence of empirical marginal probability distributions on distant settings was observed. In this paper we demonstrate how a particular identification and selection procedure of paired distant outcomes is the most probable cause of this apparent violation of the no-signalling principle. Thus this unexpected setting dependence does not prove the existence of superluminal influences, and the Einsteinian no-signalling principle has to be tested differently in dedicated experiments. We propose a detailed protocol telling how such experiments should be designed in order to be conclusive. We also explain how magical quantum correlations may be explained in a locally causal way.

  10. High monetary reward rates and caloric rewards decrease temporal persistence.

    PubMed

    Fung, Bowen J; Bode, Stefan; Murawski, Carsten

    2017-02-22

    Temporal persistence refers to an individual's capacity to wait for future rewards, while forgoing possible alternatives. This requires a trade-off between the potential value of delayed rewards and opportunity costs, and is relevant to many real-world decisions, such as dieting. Theoretical models have previously suggested that high monetary reward rates, or positive energy balance, may result in decreased temporal persistence. In our study, 50 fasted participants engaged in a temporal persistence task, incentivised with monetary rewards. In alternating blocks of this task, rewards were delivered at delays drawn randomly from distributions with either a lower or higher maximum reward rate. During some blocks participants received either a caloric drink or water. We used survival analysis to estimate participants' probability of quitting conditional on the delay distribution and the consumed liquid. Participants had a higher probability of quitting in blocks with the higher reward rate. Furthermore, participants who consumed the caloric drink had a higher probability of quitting than those who consumed water. Our results support the predictions from the theoretical models, and importantly, suggest that both higher monetary reward rates and physiologically relevant rewards can decrease temporal persistence, which is a crucial determinant for survival in many species. © 2017 The Authors.

  11. Investigation of Dielectric Breakdown Characteristics for Double-break Vacuum Interrupter and Dielectric Breakdown Probability Distribution in Vacuum Interrupter

    NASA Astrophysics Data System (ADS)

    Shioiri, Tetsu; Asari, Naoki; Sato, Junichi; Sasage, Kosuke; Yokokura, Kunio; Homma, Mitsutaka; Suzuki, Katsumi

    To investigate the reliability of vacuum insulation equipment, a study was carried out to clarify breakdown probability distributions in a vacuum gap. Further, a double-break vacuum circuit breaker was investigated for its breakdown probability distribution. The test results show that the breakdown probability distribution of the vacuum gap can be represented by a Weibull distribution using a location parameter, which gives the voltage at which the breakdown probability is zero. The location parameter obtained from the Weibull plot depends on electrode area. The shape parameter obtained from the Weibull plot of the vacuum gap was 10∼14, and is constant irrespective of the non-uniform field factor. The breakdown probability distribution after no-load switching can also be represented by a Weibull distribution using a location parameter. The shape parameter after no-load switching was 6∼8.5, and is constant irrespective of gap length. This indicates that the scatter of breakdown voltage is increased by no-load switching. If the vacuum circuit breaker uses a double break, the breakdown probability at low voltage becomes lower than the single-break probability. Although potential distribution is a concern in the double-break vacuum circuit breaker, its insulation reliability is better than that of the single-break vacuum interrupter even if the bias of the vacuum interrupter's sharing voltage is taken into account.
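
    The three-parameter (location) Weibull model described here is straightforward to evaluate. In the sketch below, the location parameter gamma is the voltage at which breakdown probability is zero, the shape parameter is taken in the 10-14 band reported above, and the scale and test voltages are invented; the double-break line assumes an idealized even voltage split across two independent gaps.

    ```python
    # Breakdown probability from a three-parameter Weibull, single vs double break.
    from scipy.stats import weibull_min

    shape, gamma, eta = 12.0, 40.0, 25.0   # shape per abstract; gamma, eta assumed (kV)
    bd = weibull_min(shape, loc=gamma, scale=eta)

    for v in (45.0, 55.0, 65.0, 75.0):
        print(f"P(breakdown at {v:.0f} kV) = {bd.cdf(v):.4f}")

    # Idealized double break: each gap sees half the voltage; either may fail.
    v = 75.0
    p_double = 1 - (1 - bd.cdf(v / 2)) ** 2
    print(f"single-break: {bd.cdf(v):.4f}, idealized double-break: {p_double:.6f}")
    ```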

  12. Population pharmacokinetics and maximum a posteriori probability Bayesian estimator of abacavir: application of individualized therapy in HIV-infected infants and toddlers.

    PubMed

    Zhao, Wei; Cella, Massimo; Della Pasqua, Oscar; Burger, David; Jacqz-Aigrain, Evelyne

    2012-04-01

    Abacavir is used to treat HIV infection in both adults and children. The recommended paediatric dose is 8 mg kg(-1) twice daily up to a maximum of 300 mg twice daily. Weight was identified as the central covariate influencing the pharmacokinetics of abacavir in children. A population pharmacokinetic model was developed to describe both once and twice daily pharmacokinetic profiles of abacavir in infants and toddlers. The standard dosage regimen is associated with large interindividual variability in abacavir concentrations. A maximum a posteriori probability Bayesian estimator of AUC(0–t) based on three time points (0, 1 or 2, and 3 h) is proposed to support area under the concentration-time curve (AUC) targeted individualized therapy in infants and toddlers. The aims were to develop a population pharmacokinetic model for abacavir in HIV-infected infants and toddlers, to describe both once and twice daily pharmacokinetic profiles, to identify covariates that explain variability, and to propose optimal time points to optimize AUC-targeted dosage and individualize therapy. The pharmacokinetics of abacavir was described with plasma concentrations from 23 patients using nonlinear mixed-effects modelling (NONMEM) software. A two-compartment model with first-order absorption and elimination was developed. The final model was validated using bootstrap, visual predictive check and normalized prediction distribution errors. The Bayesian estimator was validated using the cross-validation and simulation-estimation method. The typical population pharmacokinetic parameters and relative standard errors (RSE) were: apparent systemic clearance (CL) 13.4 l h−1 (RSE 6.3%), apparent central volume of distribution 4.94 l (RSE 28.7%), apparent peripheral volume of distribution 8.12 l (RSE 14.2%), apparent intercompartment clearance 1.25 l h−1 (RSE 16.9%) and absorption rate constant 0.758 h−1 (RSE 5.8%). The covariate analysis identified weight as the individual factor influencing the apparent oral clearance: CL = 13.4 × (weight/12)^1.14. The maximum a posteriori probability Bayesian estimator, based on three concentrations measured at 0, 1 or 2, and 3 h after drug intake, allowed prediction of the individual AUC(0–t). The population pharmacokinetic model developed for abacavir in HIV-infected infants and toddlers accurately described both once and twice daily pharmacokinetic profiles. The maximum a posteriori probability Bayesian estimator of AUC(0–t) was developed from the final model and can be used routinely to optimize individual dosing. © 2011 The Authors. British Journal of Clinical Pharmacology © 2011 The British Pharmacological Society.
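
    Purely to illustrate the MAP mechanics (the paper's model is a NONMEM two-compartment model; this sketch collapses it to one compartment with invented priors, error terms and observations), the individual parameters are found by minimizing the negative log-posterior given three sampled concentrations, and AUC then follows as dose/CL.

    ```python
    # Simplified MAP Bayesian estimation of individual CL and V from 3 samples.
    import numpy as np
    from scipy.optimize import minimize

    ka, dose = 0.76, 120.0                  # absorption rate (1/h), dose (mg); assumed
    t_obs = np.array([0.5, 1.0, 3.0])       # sampling times (h)
    c_obs = np.array([4.1, 5.0, 2.4])       # concentrations (mg/l); made up
    pop_mu = np.log(np.array([13.4, 4.9]))  # population CL (l/h) and V (l)
    omega = np.array([0.4, 0.4])            # between-subject sd, log scale (assumed)
    sigma = 0.3                             # residual sd (mg/l); assumed

    def conc(cl, v, t):                     # one-compartment oral model
        ke = cl / v
        return dose * ka / (v * (ka - ke)) * (np.exp(-ke * t) - np.exp(-ka * t))

    def neg_log_post(log_p):                # -log likelihood - log prior
        pred = conc(*np.exp(log_p), t_obs)
        return (0.5 * np.sum((c_obs - pred) ** 2) / sigma**2
                + 0.5 * np.sum(((log_p - pop_mu) / omega) ** 2))

    res = minimize(neg_log_post, x0=pop_mu, method='Nelder-Mead')
    cl_map, v_map = np.exp(res.x)
    print(f"MAP CL={cl_map:.1f} l/h, V={v_map:.1f} l, AUC ~ {dose / cl_map:.1f} mg*h/l")
    ```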

  13. Using a Betabinomial distribution to estimate the prevalence of adherence to physical activity guidelines among children and youth.

    PubMed

    Garriguet, Didier

    2016-04-01

    Estimates of the prevalence of adherence to physical activity guidelines in the population are generally the result of averaging individual probability of adherence based on the number of days people meet the guidelines and the number of days they are assessed. Given this number of active and inactive days (days assessed minus days active), the conditional probability of meeting the guidelines that has been used in the past is a Beta(1 + active days, 1 + inactive days) distribution, assuming the probability p of a day being active is bounded by 0 and 1 and averages 50%. A change in the assumption about the distribution of p is required to better match the discrete nature of the data and to better assess the probability of adherence when the percentage of active days in the population differs from 50%. Using accelerometry data from the Canadian Health Measures Survey, the probability of adherence to physical activity guidelines is estimated using a conditional probability given the number of active and inactive days distributed as a Betabinomial(n, α + active days, β + inactive days), assuming that p is randomly distributed as Beta(α, β) where the parameters α and β are estimated by maximum likelihood. The resulting Betabinomial distribution is discrete. For children aged 6 or older, the probability of meeting physical activity guidelines 7 out of 7 days is similar to published estimates. For pre-schoolers, the Betabinomial distribution yields higher estimates of adherence to the guidelines than the Beta distribution, in line with the probability of being active on any given day. In estimating the probability of adherence to physical activity guidelines, the Betabinomial distribution has several advantages over the previously used Beta distribution. It is a discrete distribution and maximizes the richness of accelerometer data.
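
    With scipy's discrete beta-binomial this estimator is a one-liner once the hyperparameters are known; alpha and beta below are invented stand-ins for the maximum-likelihood values, and the record (4 active days out of 6 assessed) is a made-up example.

    ```python
    # Beta-binomial posterior predictive for days active out of 7.
    import numpy as np
    from scipy.stats import betabinom

    alpha, beta = 2.0, 1.5          # assumed Beta(alpha, beta) hyperparameters
    active, assessed = 4, 6         # observed accelerometer record (made up)
    inactive = assessed - active

    n = 7
    p7 = betabinom.pmf(7, n, alpha + active, beta + inactive)
    print(f"P(active 7/7 days | record) = {p7:.3f}")

    k = np.arange(n + 1)            # full discrete distribution over 0..7 days
    print(dict(zip(k.tolist(),
                   betabinom.pmf(k, n, alpha + active, beta + inactive).round(3))))
    ```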

  14. Predicting impacts of future human population growth and development on occupancy rates of forest-dependent birds

    USGS Publications Warehouse

    Brown, Michelle L.; Donovan, Therese; Schwenk, W. Scott; Theobald, David M.

    2014-01-01

    Forest loss and fragmentation are among the largest threats to forest-dwelling wildlife species today, and projected increases in human population growth are expected to increase these threats in the next century. We combined spatially-explicit growth models with wildlife distribution models to predict the effects of human development on 5 forest-dependent bird species in Vermont, New Hampshire, and Massachusetts, USA. We used single-species occupancy models to derive the probability of occupancy for each species across the study area in the years 2000 and 2050. Over half a million new housing units were predicted to be added to the landscape. The maximum change in housing density was nearly 30 houses per hectare; however, 30% of the towns in the study area were projected to add less than 1 housing unit per hectare. In the face of predicted human growth, the overall occupancy of each species decreased by as much as 38% (ranging from 19% to 38% declines in the worst-case scenario) in the year 2050. These declines were greater outside of protected areas than within protected lands. Ninety-seven percent of towns experienced some decline in species occupancy within their borders, highlighting the value of spatially-explicit models. The mean decrease in occupancy probability within towns ranged from 3% for hairy woodpecker to 8% for ovenbird and hermit thrush. Reductions in occupancy probability occurred on the perimeters of cities and towns where exurban development is predicted to increase in the study area. This spatial approach to wildlife planning provides data to evaluate trade-offs between development scenarios and forest-dependent wildlife species.

  15. Correlation of spatial climate/weather maps and the advantages of using the Mahalanobis metric in predictions

    NASA Astrophysics Data System (ADS)

    Stephenson, D. B.

    1997-10-01

    The skill in predicting spatially varying weather/climate maps depends on the definition of the measure of similarity between the maps. Under the justifiable approximation that the anomaly maps are distributed multinormally, it is shown analytically that the choice of weighting metric, used in defining the anomaly correlation between spatial maps, can change the resulting probability distribution of the correlation coefficient. The estimate of the number of degrees of freedom based on the variance of the correlation distribution can vary from unity up to the number of grid points, depending on the choice of weighting metric. The (pseudo-)inverse of the sample covariance matrix acts as a special choice for the metric in that it gives a correlation distribution which has minimal kurtosis and maximum dimension. Minimal kurtosis suggests that the average predictive skill might be improved due to the rarer occurrence of troublesome outlier patterns far from the mean state. Maximum dimension has a disadvantage for analogue prediction schemes in that it gives the minimum number of analogue states. This metric also has an advantage in that it allows one to powerfully test the null hypothesis of multinormality by examining the second and third moments of the correlation coefficient, which were introduced by Mardia as invariant measures of multivariate kurtosis and skewness. For these reasons, it is suggested that this metric could be usefully employed in the prediction of weather/climate and in fingerprinting anthropogenic climate change. The ideas are illustrated using the bivariate example of the observed monthly mean sea-level pressures at Darwin and Tahiti from 1866 to 1995.
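
    The metric-weighted correlation itself is one line of linear algebra. A sketch with synthetic anomaly maps, using the (pseudo-)inverse sample covariance as the Mahalanobis metric:

    ```python
    # Anomaly correlation under the Euclidean vs Mahalanobis weighting metric:
    # corr_W(x, y) = x'Wy / sqrt((x'Wx)(y'Wy)).
    import numpy as np

    rng = np.random.default_rng(11)
    n_time, n_grid = 120, 40
    fields = rng.standard_normal((n_time, n_grid)) @ rng.standard_normal((n_grid, n_grid))
    anoms = fields - fields.mean(axis=0)             # anomaly maps (rows)

    W = np.linalg.pinv(np.cov(anoms, rowvar=False))  # Mahalanobis metric

    def weighted_corr(x, y, W):
        return (x @ W @ y) / np.sqrt((x @ W @ x) * (y @ W @ y))

    x, y = anoms[0], anoms[1]
    print("Euclidean metric:  ", round(weighted_corr(x, y, np.eye(n_grid)), 3))
    print("Mahalanobis metric:", round(weighted_corr(x, y, W), 3))
    ```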

  16. Base pair probability estimates improve the prediction accuracy of RNA non-canonical base pairs

    PubMed Central

    2017-01-01

    Prediction of RNA tertiary structure from sequence is an important problem, but generating accurate structure models for even short sequences remains difficult. Predictions of RNA tertiary structure tend to be least accurate in loop regions, where non-canonical pairs are important for determining the details of structure. Non-canonical pairs can be predicted using a knowledge-based model of structure that scores nucleotide cyclic motifs, or NCMs. In this work, a partition function algorithm is introduced that allows the estimation of base pairing probabilities for both canonical and non-canonical interactions. Pairs that are predicted to be probable are more likely to be found in the true structure than pairs of lower probability. Pair probability estimates can be further improved by predicting the structure conserved across multiple homologous sequences using the TurboFold algorithm. These pairing probabilities, used in concert with prior knowledge of the canonical secondary structure, allow accurate inference of non-canonical pairs, an important step towards accurate prediction of the full tertiary structure. Software to predict non-canonical base pairs and pairing probabilities is now provided as part of the RNAstructure software package. PMID:29107980

  17. Performance assessment of a Bayesian Forecasting System (BFS) for real-time flood forecasting

    NASA Astrophysics Data System (ADS)

    Biondi, D.; De Luca, D. L.

    2013-02-01

    The paper evaluates, for a number of flood events, the performance of a Bayesian Forecasting System (BFS), with the aim of evaluating total uncertainty in real-time flood forecasting. The predictive uncertainty of future streamflow is estimated through the Bayesian integration of two separate processors. The former evaluates the propagation of input uncertainty on simulated river discharge, the latter computes the hydrological uncertainty of actual river discharge associated with all other possible sources of error. A stochastic model and a distributed rainfall-runoff model were assumed, respectively, for rainfall and hydrological response simulations. A case study was carried out for a small basin in the Calabria region (southern Italy). The performance assessment of the BFS was performed with adequate verification tools suited for probabilistic forecasts of continuous variables such as streamflow. Graphical tools and scalar metrics were used to evaluate several attributes of the forecast quality of the entire time-varying predictive distributions: calibration, sharpness, accuracy, and continuous ranked probability score (CRPS). Besides the overall system, which incorporates both sources of uncertainty, other hypotheses resulting from the BFS properties were examined, corresponding to (i) a perfect hydrological model; (ii) a non-informative rainfall forecast for predicting streamflow; and (iii) a perfect input forecast. The results emphasize the importance of using different diagnostic approaches to perform comprehensive analyses of predictive distributions, to arrive at a multifaceted view of the attributes of the prediction. For the case study, the selected criteria revealed the interaction of the different sources of error, in particular the crucial role of the hydrological uncertainty processor when compensating, at the cost of wider forecast intervals, for the unreliable and biased predictive distribution resulting from the Precipitation Uncertainty Processor.
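
    Among the scalar metrics listed, the CRPS for an ensemble forecast has a convenient closed form, CRPS = E|X - y| - 0.5 E|X - X'|, sketched below on invented discharge ensembles; lower is better, rewarding both calibration and sharpness.

    ```python
    # Ensemble CRPS via the energy-form identity.
    import numpy as np

    def crps_ensemble(members, obs):
        members = np.asarray(members, dtype=float)
        term1 = np.abs(members - obs).mean()
        term2 = 0.5 * np.abs(members[:, None] - members[None, :]).mean()
        return term1 - term2

    rng = np.random.default_rng(12)
    obs = 42.0                                # observed discharge (made up)
    sharp = rng.normal(41.0, 2.0, 50)         # sharp, slightly biased ensemble
    wide = rng.normal(42.0, 10.0, 50)         # centred but diffuse ensemble
    print("CRPS sharp:", round(crps_ensemble(sharp, obs), 3))
    print("CRPS wide: ", round(crps_ensemble(wide, obs), 3))
    ```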

  18. Estimation of State Transition Probabilities: A Neural Network Model

    NASA Astrophysics Data System (ADS)

    Saito, Hiroshi; Takiyama, Ken; Okada, Masato

    2015-12-01

    Humans and animals can predict future states on the basis of acquired knowledge. This prediction of the state transition is important for choosing the best action, and the prediction is only possible if the state transition probability has already been learned. However, how our brains learn the state transition probability is unknown. Here, we propose a simple algorithm for estimating the state transition probability by utilizing the state prediction error. We analytically and numerically confirmed that our algorithm is able to learn the probability completely with an appropriate learning rate. Furthermore, our learning rule reproduced experimentally reported psychometric functions and neural activities in the lateral intraparietal area in a decision-making task. Thus, our algorithm might describe the manner in which our brains learn state transition probabilities and predict future states.
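
    A minimal sketch of a prediction-error-driven update of this kind (not necessarily the authors' exact rule): the estimated transition row for the current state is nudged toward a one-hot encoding of the observed next state, so the update is the state prediction error scaled by a learning rate, and each row stays normalized because the error sums to zero:

        import numpy as np

        rng = np.random.default_rng(0)
        n_states, alpha = 3, 0.05
        P_true = np.array([[0.7, 0.2, 0.1],
                           [0.1, 0.6, 0.3],
                           [0.3, 0.3, 0.4]])
        P_hat = np.full((n_states, n_states), 1.0 / n_states)   # uniform initial guess

        s = 0
        for _ in range(20000):
            s_next = rng.choice(n_states, p=P_true[s])
            prediction_error = np.eye(n_states)[s_next] - P_hat[s]
            P_hat[s] += alpha * prediction_error
            s = s_next

        print(np.round(P_hat, 2))   # approaches P_true for an appropriate learning rate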

  19. Predictive accuracy of a ground-water model--Lessons from a postaudit

    USGS Publications Warehouse

    Konikow, Leonard F.

    1986-01-01

    Hydrogeologic studies commonly include the development, calibration, and application of a deterministic simulation model. To help assess the value of using such models to make predictions, a postaudit was conducted on a previously studied area in the Salt River and lower Santa Cruz River basins in central Arizona. A deterministic, distributed-parameter model of the ground-water system in these alluvial basins was calibrated by Anderson (1968) using about 40 years of data (1923–64). The calibrated model was then used to predict future water-level changes during the next 10 years (1965–74). Examination of actual water-level changes in 77 wells from 1965–74 indicates a poor correlation between observed and predicted water-level changes. The differences have a mean of 73 ft (that is, predicted declines consistently exceeded those observed) and a standard deviation of 47 ft. The bias in the predicted water-level change can be accounted for by the large error in the assumed total pumpage during the prediction period. However, the spatial distribution of errors in predicted water-level change does not correlate with the spatial distribution of errors in pumpage. Consequently, the lack of precision probably is not related only to errors in assumed pumpage, but may indicate the presence of other sources of error in the model, such as the two-dimensional representation of a three-dimensional problem or the lack of consideration of land-subsidence processes. This type of postaudit is a valuable method of verifying a model, and an evaluation of predictive errors can provide an increased understanding of the system and aid in assessing the value of undertaking development of a revised model.

  20. Chromosome Model reveals Dynamic Redistribution of DNA Damage into Nuclear Sub-domains

    NASA Technical Reports Server (NTRS)

    Costes, Sylvain V.; Ponomarev, Artem; Chen, James L.; Cucinotta, Francis A.; Barcellos-Hoff, Helen

    2007-01-01

    Several proteins involved in the response to DNA double-strand breaks (DSBs) form microscopically visible nuclear domains, or foci, after exposure to ionizing radiation. Radiation-induced foci (RIF) are believed to be located where DNA damage is induced. To test this assumption, we analyzed the spatial distribution of 53BP1, phosphorylated ATM, and gammaH2AX RIF in cells irradiated with high linear energy transfer (LET) radiation. Since energy is randomly deposited along high-LET particle paths, RIF along these paths should also be randomly distributed. The probability of inducing a DSB can be derived from DNA fragment data measured experimentally by pulsed-field gel electrophoresis. We used this probability in Monte Carlo simulations to predict DSB locations in synthetic nuclei geometrically described by a complete set of human chromosomes, taking into account the microscope optics from real experiments. As expected, simulations produced DNA-weighted random (Poisson) distributions. In contrast, the distributions of RIF obtained as early as 5 min after exposure to high LET (1 GeV/amu Fe) were non-random. This deviation from the expected DNA-weighted random pattern can be further characterized by relative DNA image measurements. This novel imaging approach shows that RIF were located preferentially at the interface between high- and low-density DNA regions, and were more frequent in regions of lower-density DNA than predicted. This deviation from random behavior was more pronounced within the first 5 min following irradiation for phosphorylated ATM RIF, while gammaH2AX and 53BP1 RIF showed very pronounced deviation up to 30 min after exposure. These data suggest the existence of repair centers in mammalian epithelial cells: nuclear sub-domains where DNA lesions would be collected for more efficient repair.

  1. Probability differently modulating the effects of reward and punishment on visuomotor adaptation.

    PubMed

    Song, Yanlong; Smiley-Oyen, Ann L

    2017-12-01

    Recent human motor learning studies revealed that punishment seemingly accelerates motor learning whereas reward enhances consolidation of motor memory. It is not evident how intrinsic properties of reward and punishment modulate these potentially dissociable effects on motor learning and motor memory, nor what causes the dissociation. By manipulating the probability with which reward and punishment were delivered, a critical property of both, the present study demonstrated that probability modulates the effects of reward and punishment on adaptation to a sudden visual rotation, and on consolidation of the adaptation memory, in distinct ways. Specifically, two probabilities of monetary reward and punishment delivery, 50 and 100%, were applied while young adult participants adapted to a sudden visual rotation. Punishment and reward showed distinct effects on motor adaptation and motor memory: the group that received punishments in 100% of the adaptation trials adapted significantly faster than the other three groups, whereas the group that received rewards in 100% of the adaptation trials showed marked savings in re-adapting to the same rotation. In addition, the group that received punishments in a randomly selected 50% of the adaptation trials also showed savings in re-adapting to the same rotation. Sensitivity to sensory prediction error, or differences in the explicit process induced by reward and punishment, may contribute to these distinct effects.

  2. Species Distribution Modelling of Aedes aegypti in two dengue-endemic regions of Pakistan.

    PubMed

    Fatima, Syeda Hira; Atif, Salman; Rasheed, Syed Basit; Zaidi, Farrah; Hussain, Ejaz

    2016-03-01

    Statistical tools are effectively used to determine the distribution of mosquitoes and to make ecological inferences about vector-borne disease dynamics. In this study, we utilised species distribution models to understand spatial patterns of Aedes aegypti in two dengue-prevalent regions of Pakistan, Lahore and Swat. Species distribution models can indicate the probability of habitat suitability for Ae. aegypti once it is introduced to new regions like Swat, where invasion of this species is a recent phenomenon. The distribution of Ae. aegypti was determined by applying the MaxEnt algorithm to a set of potential environmental factors and species occurrence records. The ecological dependency of the species on each environmental variable was analysed using response curves. We quantified the statistical performance of the models based on accuracy assessment and spatial predictions. Our results suggest that Ae. aegypti is widely distributed in Lahore. Human population density and urban infrastructure are primarily responsible for the greater probability of mosquito occurrence in this region. In Swat, Ae. aegypti has a clumped distribution, in which urban patches provide refuge to the species in an otherwise hostile, heterogeneous environment, and road networks appear to have facilitated its passive dispersal. In Pakistan, Ae. aegypti is expanding its range northwards; this could be associated with rapid urbanisation, trade and travel. The main implication of this expansion is that more people are at risk of dengue fever in the northern highlands of Pakistan. © 2016 John Wiley & Sons Ltd.

  3. An operational real-time flood forecasting system in Southern Italy

    NASA Astrophysics Data System (ADS)

    Ortiz, Enrique; Coccia, Gabriele; Todini, Ezio

    2015-04-01

    A real-time flood forecasting system has been operating since 2012 as a non-structural measure for mitigating flood risk in the Campania Region (southern Italy), within the Sele river basin (3,240 km2). The Sele Flood Forecasting System (SFFS) has been built on the FEWS (Flood Early Warning System) platform developed by Deltares and assimilates numerical weather predictions of the COSMO LAM family: the deterministic COSMO-LAMI I2, the deterministic COSMO-LAMI I7 and the ensemble prediction system COSMO-LEPS (16 members). The Sele FFS is composed of a cascade of three main models. The first is a fully continuous, physically based, distributed hydrological model, TOPKAPI-eXtended (Idrologia&Ambiente s.r.l., Naples, Italy), simulating the dominant processes controlling soil water dynamics, runoff generation and discharge at a spatial resolution of 250 m. The second is a set of artificial neural networks (ANNs) built to forecast river stages at a set of monitored cross-sections. The third is a Model Conditional Processor (MCP), which provides the predictive uncertainty (i.e., the probability of occurrence of a future flood event) within the framework of a multi-temporal forecast, following the most recent advances on this topic (Coccia and Todini, HESS, 2011). Starting from one or more model forecasts, the MCP quantifies the probability of exceeding a maximum river stage within the forecast lead time, by means of a discrete time function representing the variation of the cumulative probability of exceeding a river stage during the lead time, together with the distribution of the time of occurrence of the flood peak. This work shows the Sele FFS performance after two years of operation, highlighting the added value it can provide to a flood early warning and emergency management system.
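
    A minimal sketch of the multi-temporal exceedance idea, assuming a matrix of ensemble river-stage forecasts (members x hourly lead times; all numbers invented): the probability that an alert stage is exceeded at some time within the lead time is estimated from the fraction of members whose running maximum exceeds it:

        import numpy as np

        rng = np.random.default_rng(2)
        n_members, n_leads = 16, 24     # e.g. a 16-member ensemble over 24 hourly leads
        stages = 2.0 + np.cumsum(rng.normal(0.02, 0.05, size=(n_members, n_leads)), axis=1)

        threshold = 2.5                 # alert stage (m), illustrative
        running_max = np.maximum.accumulate(stages, axis=1)
        p_exceed = (running_max > threshold).mean(axis=0)
        print(np.round(p_exceed, 2))    # cumulative exceedance probability, non-decreasing in lead time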

  4. The origin of anomalous transport in porous media - is it possible to make a priori predictions?

    NASA Astrophysics Data System (ADS)

    Bijeljic, Branko; Blunt, Martin

    2013-04-01

    Despite the range of significant applications of flow and solute transport in porous rock, including contaminant migration in subsurface hydrology, geological storage of carbon dioxide, tracer studies and miscible displacement in oil recovery, even the qualitative behavior in the subsurface is uncertain. The non-Fickian nature of dispersive processes in heterogeneous porous media has been demonstrated experimentally from pore to field scales. However, the exact relationship between structure, velocity field and transport has not been fully understood. Advances in X-ray imaging techniques have made it possible to accurately describe the structure of the pore space, helping predict flow and anomalous transport behaviour using direct simulation. This is demonstrated by simulating solute transport through 3D images of rock samples, with resolutions of a few microns, representing geological media of increasing pore-scale complexity: a sandpack, a sandstone, and a carbonate. A novel methodology is developed that predicts solute transport at the pore scale by using probability density functions of displacement (propagators) and of transit time between the image voxels, and relates them to the probability density function of normalized local velocity. A key advantage is that full information on velocity and solute concentration is retained in the models. The methodology comprises solving for Stokes flow with OpenFOAM, solving for advective transport with a novel streamline simulation method, and superimposing diffusive transport with the random walk method. It is shown how the computed propagators for the beadpack, sandstone and carbonate depend on the spread of the velocity distribution. A narrow velocity distribution in the beadpack leads to the least anomalous behaviour, with propagators that rapidly become Gaussian; the wider velocity distribution in the sandstone gives rise to a small immobile concentration peak and a large secondary mobile peak moving at approximately the average flow speed; in the carbonate, with the widest velocity distribution, the stagnant concentration peak is persistent while a smaller secondary mobile peak emerges, leading to highly anomalous behavior. This identifies the generically different nature of non-Fickian transport in the three media and quantifies the effect of pore structure on transport. Moreover, the propagators obtained by the model are in very good agreement with the propagators measured on beadpack, Bentheimer sandstone and Portland carbonate cores in nuclear magnetic resonance experiments. These findings demonstrate that it is possible to make a priori predictions of anomalous transport in porous media. The importance of these findings for transport in complex carbonate rock micro-CT images is discussed, classifying the rocks in terms of the degree of anomalous transport, which can have an impact at the field scale. Extensions to reactive transport will be discussed.
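
    A minimal sketch of the qualitative effect described above, under a deliberately crude model (all parameters invented): each particle draws a new local velocity from a lognormal distribution at every short transit step, with diffusion superimposed as a random walk; the wider the velocity distribution, the more stretched and non-Gaussian the resulting propagator:

        import numpy as np

        def propagator(sigma_v, n=50000, t_total=1.0, n_steps=100, D=1e-3, seed=0):
            """Particle displacements after t_total for lognormal step velocities
            with log-standard-deviation sigma_v plus superimposed diffusion."""
            rng = np.random.default_rng(seed)
            dt = t_total / n_steps
            x = np.zeros(n)
            for _ in range(n_steps):
                v = rng.lognormal(mean=0.0, sigma=sigma_v, size=n)   # local velocities
                x += v * dt + rng.normal(0.0, np.sqrt(2 * D * dt), size=n)
            return x

        for sigma_v in (0.1, 0.8, 1.6):   # beadpack-like to carbonate-like velocity spread
            x = propagator(sigma_v)
            print(sigma_v, round(float(np.mean(x)), 3), round(float(np.std(x)), 3))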

  5. A probabilistic approach for shallow rainfall-triggered landslide modeling at basin scale. A case study in the Luquillo Forest, Puerto Rico

    NASA Astrophysics Data System (ADS)

    Dialynas, Y. G.; Arnone, E.; Noto, L. V.; Bras, R. L.

    2013-12-01

    Slope stability depends on geotechnical and hydrological factors that exhibit wide natural spatial variability, yet sufficient measurements of the related parameters are rarely available over entire study areas. The uncertainty associated with the inability to fully characterize hydrologic behavior affects any attempt to model landslide hazards. This work suggests a way to systematically account for this uncertainty in coupled distributed hydrological-stability models for shallow landslide hazard assessment. A probabilistic approach for the prediction of rainfall-triggered landslide occurrence at basin scale was implemented in an existing distributed eco-hydrological and landslide model, tRIBS-VEGGIE-Landslide (Triangulated Irregular Network (TIN)-based Real-time Integrated Basin Simulator - VEGetation Generator for Interactive Evolution). More precisely, we upgraded tRIBS-VEGGIE-Landslide to assess the likelihood of shallow landslides by accounting for uncertainty in the geotechnical and hydrological factors that directly affect slope stability. Natural variability of geotechnical soil characteristics was considered by randomizing soil cohesion and friction angle. Hydrological uncertainty related to the estimation of matric suction was taken into account by treating soil retention parameters as correlated random variables. The probability of failure is estimated through an assumed theoretical Factor of Safety (FS) distribution, conditioned on soil moisture content. At each cell, the temporally varying FS statistics are approximated by the First Order Second Moment (FOSM) method as a function of the parameters' statistical properties. The model was applied to the Rio Mameyes Basin, located in the Luquillo Experimental Forest in Puerto Rico, where previous landslide analyses have been carried out. At each time step, model outputs include the probability of landslide occurrence across the basin and the most probable depth of failure at each soil column. The proposed probabilistic approach for shallow landslide prediction reveals and quantifies landslide risk at slopes assessed as stable by simpler deterministic methods.
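
    A minimal sketch of the FOSM step for an infinite-slope factor of safety, with invented parameter statistics and independence assumed between the random variables: the FS mean is evaluated at the parameter means, its variance from first-order sensitivities, and P(FS < 1) follows from an assumed normal FS distribution:

        import numpy as np
        from scipy.stats import norm

        def factor_of_safety(c, phi_deg, gamma=18.0, z=1.5, slope_deg=30.0):
            """Infinite-slope FS, dry case (illustrative form):
            FS = (c + gamma*z*cos^2(s)*tan(phi)) / (gamma*z*sin(s)*cos(s))."""
            s, phi = np.radians(slope_deg), np.radians(phi_deg)
            return (c + gamma * z * np.cos(s) ** 2 * np.tan(phi)) / (
                gamma * z * np.sin(s) * np.cos(s))

        means = np.array([5.0, 32.0])   # cohesion (kPa), friction angle (deg); invented
        stds = np.array([2.0, 3.0])

        fs_mean = factor_of_safety(*means)
        grad = np.empty(2)              # first-order sensitivities by central differences
        for k in range(2):
            d = np.zeros(2)
            d[k] = 1e-4 * max(1.0, abs(means[k]))
            grad[k] = (factor_of_safety(*(means + d)) - factor_of_safety(*(means - d))) / (2 * d[k])
        fs_var = np.sum((grad * stds) ** 2)

        p_failure = norm.cdf(1.0, loc=fs_mean, scale=np.sqrt(fs_var))
        print(round(float(fs_mean), 2), round(float(p_failure), 3))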

  6. Likelihood analysis of species occurrence probability from presence-only data for modelling species distributions

    USGS Publications Warehouse

    Royle, J. Andrew; Chandler, Richard B.; Yackulic, Charles; Nichols, James D.

    2012-01-01

    1. Understanding the factors affecting species occurrence is a pre-eminent focus of applied ecological research. However, direct information about species occurrence is lacking for many species. Instead, researchers sometimes have to rely on so-called presence-only data (i.e. when no direct information about absences is available), which often result from opportunistic, unstructured sampling. MAXENT is a widely used software program designed to model and map species distributions using presence-only data. 2. We provide a critical review of MAXENT as applied to species distribution modelling and discuss how it can lead to inferential errors. A chief concern is that MAXENT produces a number of poorly defined indices that are not directly related to the actual parameter of interest: the probability of occurrence (ψ). This focus on an index was motivated by the belief that it is not possible to estimate ψ from presence-only data; however, we demonstrate that ψ is identifiable using conventional likelihood methods under the assumptions of random sampling and constant probability of species detection. 3. The model is implemented in a convenient R package, which we use to apply the model to simulated data and data from the North American Breeding Bird Survey. We demonstrate that MAXENT produces extreme under-predictions when compared to estimates produced by logistic regression that uses the full (presence/absence) data set. We note that MAXENT predictions are extremely sensitive to specification of the background prevalence, which is not objectively estimated using the MAXENT method. 4. As with MAXENT, formal model-based inference requires a random sample of presence locations. Many presence-only data sets, such as those based on museum records and herbarium collections, may not satisfy this assumption. However, when sampling is random, we believe that inference should be based on formal methods that facilitate inference about interpretable ecological quantities instead of vaguely defined indices.
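
    A minimal sketch of the likelihood idea under the stated assumptions (random sampling, constant detection), not a reimplementation of the authors' R package: with a logistic occurrence model, the likelihood of presence locations conditions on presence, so a background sample enters only through a normalizing average; the intercept is identified through the nonlinearity of the link and can be weakly estimated in practice:

        import numpy as np
        from scipy.optimize import minimize
        from scipy.special import expit

        rng = np.random.default_rng(0)
        # Invented landscape: covariate values at background locations, and a
        # true occurrence model psi(x) = expit(b0 + b1*x) used to thin them
        x_bg = rng.normal(0.0, 1.0, size=20000)
        beta_true = (-1.0, 1.5)
        x_pres = x_bg[rng.random(x_bg.size) < expit(beta_true[0] + beta_true[1] * x_bg)]

        def negloglik(beta):
            psi_pres = expit(beta[0] + beta[1] * x_pres)
            psi_back = expit(beta[0] + beta[1] * x_bg)
            # log f(x | presence): sum of log psi at presences minus n*log(mean psi)
            return -(np.sum(np.log(psi_pres)) - x_pres.size * np.log(np.mean(psi_back)))

        fit = minimize(negloglik, x0=np.zeros(2), method="Nelder-Mead")
        print(np.round(fit.x, 2))   # approximately recovers beta_true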

  7. Climate Change Impact Assessment in Pacific North West Using Copula based Coupling of Temperature and Precipitation variables

    NASA Astrophysics Data System (ADS)

    Qin, Y.; Rana, A.; Moradkhani, H.

    2014-12-01

    Multi-model downscaled-scenario products allow us to better assess the uncertainty in changes of precipitation and temperature between current and future periods. Joint probability distribution functions (PDFs) of the two climatic variables help characterize their interdependence and thus support more confident assessment of future conditions. The joint distribution of temperature and precipitation is also of significant importance in hydrological applications and climate change studies. In the present study, we used ensembles of statistically downscaled precipitation and temperature from two datasets: products for 10 Global Climate Models (GCMs) from the CMIP5 daily dataset downscaled with the Bias Correction and Spatial Downscaling (BCSD) technique at Portland State University, and products downscaled with the Multivariate Adaptive Constructed Analogs (MACA) technique at the University of Idaho, yielding two ensemble time series from 20 GCM products. The ensemble PDFs of precipitation and temperature are then evaluated for summer, winter, and annual periods for all 10 sub-basins of the Columbia River Basin (CRB). Finally, a copula is applied to establish the joint distribution of the two variables, which models their joint behavior under any level of correlation and dependence and removes restrictions on the marginal distributions of the variables in question. The joint distribution is then used to estimate change trends of joint precipitation and temperature between the current and future periods, along with the probabilities of the given changes. Results indicate varied change trends in the joint distribution at summer, winter, and annual time scales across the 10 sub-basins. The estimated probabilities of change provide useful insight for hydrological and climate change predictions.
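
    A minimal sketch of the copula step, assuming a Gaussian copula with a normal temperature marginal and a gamma precipitation marginal (all data invented; the study itself works from downscaled GCM ensembles):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        temp = rng.normal(8.0, 2.0, size=300)              # invented seasonal series
        prec = rng.gamma(shape=4.0, scale=50.0, size=300)

        # 1) Fit the marginals
        t_mu, t_sd = stats.norm.fit(temp)
        p_a, p_loc, p_scale = stats.gamma.fit(prec, floc=0.0)

        # 2) Map both series to Gaussian scores and estimate the copula correlation
        z_t = stats.norm.ppf(stats.norm.cdf(temp, t_mu, t_sd))
        z_p = stats.norm.ppf(stats.gamma.cdf(prec, p_a, loc=p_loc, scale=p_scale))
        rho = np.corrcoef(z_t, z_p)[0, 1]

        # 3) Sample jointly from the fitted copula, then back-transform the marginals
        z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=1000)
        temp_sim = stats.norm.ppf(stats.norm.cdf(z[:, 0]), t_mu, t_sd)
        prec_sim = stats.gamma.ppf(stats.norm.cdf(z[:, 1]), p_a, loc=p_loc, scale=p_scale)
        print(round(float(rho), 2), round(float(prec_sim.mean()), 1))

    The design point this illustrates is that the copula separates the dependence structure from the marginals, which is what removes the restriction on the marginal distributions mentioned above.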

  8. Persistence of exponential bed thickness distributions in the stratigraphic record: Experiments and theory

    NASA Astrophysics Data System (ADS)

    Straub, K. M.; Ganti, V. K.; Paola, C.; Foufoula-Georgiou, E.

    2010-12-01

    Stratigraphy preserved in alluvial basins houses the most complete record of information necessary to reconstruct past environmental conditions. Indeed, the character of the sedimentary record is inextricably related to the surface processes that formed it. In this presentation we explore how the signals of surface processes are recorded in stratigraphy through the use of physical and numerical experiments. We focus on linking surface processes to stratigraphy in 1D by quantifying the probability distributions of processes that govern the evolution of depositional systems to the probability distribution of preserved bed thicknesses. In this study we define a bed as a package of sediment bounded above and below by erosional surfaces. In a companion presentation we document heavy-tailed statistics of erosion and deposition from high-resolution temporal elevation data recorded during a controlled physical experiment. However, the heavy tails in the magnitudes of erosional and depositional events are not preserved in the experimental stratigraphy. Similar to many bed thickness distributions reported in field studies we find that an exponential distribution adequately describes the thicknesses of beds preserved in our experiment. We explore the generation of exponential bed thickness distributions from heavy-tailed surface statistics using 1D numerical models. These models indicate that when the full distribution of elevation fluctuations (both erosional and depositional events) is symmetrical, the resulting distribution of bed thicknesses is exponential in form. Finally, we illustrate that a predictable relationship exists between the coefficient of variation of surface elevation fluctuations and the scale-parameter of the resulting exponential distribution of bed thicknesses.

  9. Heuristics can produce surprisingly rational probability estimates: Comment on Costello and Watts (2014).

    PubMed

    Nilsson, Håkan; Juslin, Peter; Winman, Anders

    2016-01-01

    Costello and Watts (2014) present a model assuming that people's knowledge of probabilities adheres to probability theory, but that their probability judgments are perturbed by random noise in retrieval from memory. Predictions for the relationships between probability judgments for constituent events and their disjunctions and conjunctions, as well as for sums of such judgments, were derived from probability theory. Costello and Watts (2014) report behavioral data showing that subjective probability judgments accord with these predictions. Based on the finding that subjective probability judgments follow probability theory, Costello and Watts (2014) conclude that the results imply that people's probability judgments embody the rules of probability theory and thereby refute theories of heuristic processing. Here, we demonstrate the invalidity of this conclusion by showing that all of the tested predictions follow straightforwardly from an account assuming heuristic probability integration (Nilsson, Winman, Juslin, & Hansson, 2009). We end with a discussion of a number of previous findings that harmonize very poorly with the predictions of the model suggested by Costello and Watts (2014). (c) 2015 APA, all rights reserved.

  10. Methods of Information Geometry to model complex shapes

    NASA Astrophysics Data System (ADS)

    De Sanctis, A.; Gattone, S. A.

    2016-09-01

    In this paper, a new statistical method to model patterns emerging in complex systems is proposed. A framework for shape analysis of 2-dimensional landmark data is introduced, in which each landmark is represented by a bivariate Gaussian distribution. From Information Geometry we know that the Fisher-Rao metric endows the statistical manifold of the parameters of a family of probability distributions with a Riemannian metric. This approach therefore allows the intermediate steps in the evolution between observed shapes to be reconstructed by computing the geodesic, with respect to the Fisher-Rao metric, between the corresponding distributions. Furthermore, the geodesic path can be used for shape prediction. As an application, we study the evolution of the rat skull shape. A future application in ophthalmology is introduced.

  11. A Local-Realistic Model of Quantum Mechanics Based on a Discrete Spacetime

    NASA Astrophysics Data System (ADS)

    Sciarretta, Antonio

    2018-01-01

    This paper presents a realistic, stochastic, and local model that reproduces nonrelativistic quantum mechanics (QM) results without using its mathematical formulation. The proposed model only uses integer-valued quantities and operations on probabilities, in particular assuming a discrete spacetime in the form of a Euclidean lattice. Individual (spinless) particle trajectories are described as random walks. Transition probabilities are simple functions of a few quantities that are either randomly associated with the particles during their preparation, or stored in the lattice nodes they visit during the walk. QM predictions are retrieved as probability distributions of similarly prepared ensembles of particles. The scenarios considered to assess the model comprise the free particle, a constant external force, the harmonic oscillator, the particle in a box, the Delta potential, the particle on a ring, and the particle on a sphere, and include quantization of energy levels and angular momentum, as well as momentum entanglement.

  12. Ignition probability of polymer-bonded explosives accounting for multiple sources of material stochasticity

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, S.; Barua, A.; Zhou, M., E-mail: min.zhou@me.gatech.edu

    2014-05-07

    Accounting for the combined effect of multiple sources of stochasticity in material attributes, we develop an approach that computationally predicts the probability of ignition of polymer-bonded explosives (PBXs) under impact loading. The probabilistic nature of the specific ignition processes is assumed to arise from two sources of stochasticity. The first source involves random variations in material microstructural morphology; the second source involves random fluctuations in grain-binder interfacial bonding strength. The effect of the first source of stochasticity is analyzed with multiple sets of statistically similar microstructures and constant interfacial bonding strength. Subsequently, each of the microstructures in the multiple sets is assigned multiple instantiations of randomly varying grain-binder interfacial strengths to analyze the effect of the second source of stochasticity. Critical hotspot size-temperature states reaching the threshold for ignition are calculated through finite element simulations that explicitly account for microstructure and bulk and interfacial dissipation to quantify the time to criticality (t_c) of individual samples, allowing the probability distribution of the time to criticality that results from each source of stochastic variation to be analyzed. Two probability superposition models are considered to combine the effects of the multiple sources of stochasticity. The first is a parallel and series combination model, and the second is a nested probability function model. Results show that the nested Weibull distribution provides an accurate description of the combined ignition probability. The approach developed here represents a general framework for analyzing the stochasticity in material behavior that arises out of multiple types of uncertainty associated with the structure, design, synthesis and processing of materials.

  13. The Forecast Interpretation Tool—a Monte Carlo technique for blending climatic distributions with probabilistic forecasts

    USGS Publications Warehouse

    Husak, Gregory J.; Michaelsen, Joel; Kyriakidis, P.; Verdin, James P.; Funk, Chris; Galu, Gideon

    2011-01-01

    Probabilistic forecasts are produced by a variety of outlets to help predict rainfall and other meteorological events for periods of 1 month or more. Such forecasts are expressed as probabilities of a rainfall event, e.g. being in the upper, middle, or lower third of the relevant distribution of rainfall in the region. The impact of these forecasts on the expectation for the event is not always clear or easily conveyed. This article proposes a technique based on Monte Carlo simulation for adjusting existing climatologic statistical parameters to match forecast information, resulting in new parameters defining the probability of events for the forecast interval. The resulting parameters are shown to approximate the forecasts with reasonable accuracy. To show the value of the technique as an application for seasonal rainfall, it is applied to the consensus forecast developed for the Greater Horn of Africa for the 2009 March-April-May season. An alternative, analytical approach is also proposed and discussed in comparison to the simulation-based technique.
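
    A minimal sketch of the simulation idea with invented parameters: draw rainfall from the climatological gamma distribution, importance-resample so that the terciles carry the forecast probabilities instead of one third each, and refit the gamma parameters to the adjusted sample:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(4)
        a_clim, scale_clim = 4.0, 50.0                     # climatological gamma; invented
        q1, q2 = stats.gamma.ppf([1 / 3, 2 / 3], a_clim, scale=scale_clim)

        p_below, p_normal, p_above = 0.20, 0.35, 0.45      # illustrative tercile forecast

        draws = stats.gamma.rvs(a_clim, scale=scale_clim, size=200000, random_state=rng)
        tercile = np.digitize(draws, [q1, q2])             # 0 below, 1 normal, 2 above
        weights = np.array([p_below, p_normal, p_above])[tercile] / (1 / 3)
        sample = rng.choice(draws, size=50000, p=weights / weights.sum())

        a_new, _, scale_new = stats.gamma.fit(sample, floc=0.0)
        print(round(a_new, 2), round(scale_new, 1))        # forecast-adjusted parameters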

  14. Nonparametric Density Estimation Based on Self-Organizing Incremental Neural Network for Large Noisy Data.

    PubMed

    Nakamura, Yoshihiro; Hasegawa, Osamu

    2017-01-01

    With the ongoing development and expansion of communication networks and sensors, massive amounts of data are continuously generated in real time from real environments. Predicting in advance the distribution underlying such data is difficult; furthermore, the data include substantial amounts of noise. These factors make it difficult to estimate probability densities. To handle these issues and massive amounts of data, we propose a nonparametric density estimator that rapidly learns data online and has high robustness. Our approach is an extension of both kernel density estimation (KDE) and the self-organizing incremental neural network (SOINN); therefore, we call our approach KDESOINN. An SOINN provides a clustering method that learns the given data as networks of prototypes; more specifically, an SOINN can learn the distribution underlying the given data. Using this information, KDESOINN estimates the probability density function. The results of our experiments show that KDESOINN outperforms or achieves performance comparable to the current state-of-the-art approaches in terms of robustness, learning time, and accuracy.

  15. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions.

    PubMed

    Dinov, Ivo D; Siegrist, Kyle; Pearl, Dennis K; Kalinin, Alexandr; Christou, Nicolas

    2016-06-01

    Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome , which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the learning assessment protocols.

  17. Landscape factors influencing the spatial distribution and abundance of mosquito vector Culex quinquefasciatus (Diptera: Culicidae) in a mixed residential-agricultural community in Hawai'i

    USGS Publications Warehouse

    Reiter, M.E.; Lapointe, D.A.

    2007-01-01

    Mosquito-borne avian diseases, principally avian malaria (Plasmodium relictum Grassi and Feletti) and avian pox (Avipoxvirus sp.), have been implicated as the key limiting factor associated with recent declines of endemic avifauna in the Hawaiian Island archipelago. We present data on the relative abundance, infection status, and spatial distribution of the primary mosquito vector Culex quinquefasciatus Say (Diptera: Culicidae) across a mixed residential-agricultural community adjacent to Hawai'i Volcanoes National Park on Hawai'i Island. We modeled the effect of agriculture and forest fragmentation on the relative abundance of adult Cx. quinquefasciatus in Volcano Village, and we implemented our statistical model in a geographic information system to generate a probability-of-mosquito-capture prediction surface for the study area. Our model was based on biweekly captures of adult mosquitoes from 20 locations within Volcano Village from October 2001 to April 2003. We used mixed-effects logistic regression to model the probability of capturing a mosquito, and we developed a set of 17 competing models a priori to specifically evaluate the effect of agriculture and fragmentation (i.e., residential landscapes) at two spatial scales. In total, 2,126 mosquitoes were captured in CO2-baited traps, with an average probability of 0.27 (SE = 0.10) of capturing one or more mosquitoes per trap night. Twelve percent of mosquitoes captured were infected with P. relictum. Our data indicate that agricultural lands and forest fragmentation significantly increase the probability of mosquito capture. The prediction surface identified areas along the Hawai'i Volcanoes National Park boundary that may have high relative abundance of the vector. Our data document the potential of avian malaria transmission in residential-agricultural landscapes and support the need for vector management that extends beyond reserve boundaries and considers a reserve's spatial position in a highly heterogeneous landscape.

  18. Cryptosporidiosis susceptibility and risk: a case study.

    PubMed

    Makri, Anna; Modarres, Reza; Parkin, Rebecca

    2004-02-01

    Regional estimates of cryptosporidiosis risks from drinking water exposure were developed and validated, accounting for AIDS status and age. We constructed a model with probability distributions and point estimates representing Cryptosporidium in tap water and tap water consumed per day (exposure characterization); dose response, illness given infection, and prolonged illness given illness; and three conditional probabilities describing the likelihood of case detection by active surveillance (health effects characterization). The model predictions were combined with population data to derive expected case numbers and incidence rates per 100,000 population, by age and AIDS status, borough-specific and for New York City overall in 2000 (risk characterization). They were compared with same-year surveillance data, assumed to represent the true incidence of waterborne cryptosporidiosis, to evaluate predictive ability. The predicted mean risks, similar to previously published estimates for this region, overpredicted observed incidence, most extensively when accounting for AIDS status. The results suggest that overprediction may be due to conservative parameters applied to both non-AIDS and AIDS populations, and that biological differences for children need to be incorporated. Interpretations are limited by the unknown accuracy of available surveillance data, in addition to the variability and uncertainty of the model predictions. The model appears sensitive to geographical differences in AIDS prevalence. The use of surveillance data for validation and model parameters pertinent to susceptibility are discussed.
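
    A minimal sketch of the risk-characterization chain with invented distributions and point estimates: exposure and dose-response uncertainty are propagated by Monte Carlo, and the conditional illness and detection probabilities then scale predicted infections down to expected detected cases:

        import numpy as np

        rng = np.random.default_rng(5)
        n_sim, population = 100000, 8_000_000

        oocysts_per_l = rng.lognormal(np.log(0.005), 1.0, size=n_sim)   # tap water; invented
        litres_per_day = rng.lognormal(np.log(1.0), 0.5, size=n_sim)
        r = 0.004                                    # exponential dose-response; invented
        p_inf_day = 1.0 - np.exp(-r * oocysts_per_l * litres_per_day)
        p_inf_year = 1.0 - (1.0 - p_inf_day) ** 365

        p_ill_given_inf = 0.4                        # illustrative point estimates
        p_detected_given_ill = 0.02                  # care-seeking x testing x reporting

        expected_cases = population * np.mean(p_inf_year * p_ill_given_inf * p_detected_given_ill)
        print(round(float(expected_cases)), round(1e5 * float(expected_cases) / population, 1))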

  19. Random Partition Distribution Indexed by Pairwise Information

    PubMed Central

    Dahl, David B.; Day, Ryan; Tsai, Jerry W.

    2017-01-01

    We propose a random partition distribution indexed by pairwise similarity information such that partitions compatible with the similarities are given more probability. The use of pairwise similarities, in the form of distances, is common in some clustering algorithms (e.g., hierarchical clustering), but we show how to use this type of information to define a prior partition distribution for flexible Bayesian modeling. A defining feature of the distribution is that it allocates probability among partitions within a given number of subsets, but it does not shift probability among sets of partitions with different numbers of subsets. Our distribution places more probability on partitions that group similar items yet keeps the total probability of partitions with a given number of subsets constant. The distribution of the number of subsets (and its moments) is available in closed-form and is not a function of the similarities. Our formulation has an explicit probability mass function (with a tractable normalizing constant) so the full suite of MCMC methods may be used for posterior inference. We compare our distribution with several existing partition distributions, showing that our formulation has attractive properties. We provide three demonstrations to highlight the features and relative performance of our distribution. PMID:29276318

  20. Nested Sampling for Bayesian Model Comparison in the Context of Salmonella Disease Dynamics

    PubMed Central

    Dybowski, Richard; McKinley, Trevelyan J.; Mastroeni, Pietro; Restif, Olivier

    2013-01-01

    Understanding the mechanisms underlying the observed dynamics of complex biological systems requires the statistical assessment and comparison of multiple alternative models. Although this has traditionally been done using maximum likelihood-based methods such as Akaike's Information Criterion (AIC), Bayesian methods have gained in popularity because they provide more informative output in the form of posterior probability distributions. However, comparison between multiple models in a Bayesian framework is made difficult by the computational cost of numerical integration over large parameter spaces. A new, efficient method for the computation of posterior probabilities has recently been proposed and applied to complex problems from the physical sciences. Here we demonstrate how nested sampling can be used for inference and model comparison in biological sciences. We present a reanalysis of data from experimental infection of mice with Salmonella enterica showing the distribution of bacteria in liver cells. In addition to confirming the main finding of the original analysis, which relied on AIC, our approach provides: (a) integration across the parameter space, (b) estimation of the posterior parameter distributions (with visualisations of parameter correlations), and (c) estimation of the posterior predictive distributions for goodness-of-fit assessments of the models. The goodness-of-fit results suggest that alternative mechanistic models and a relaxation of the quasi-stationary assumption should be considered. PMID:24376528
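
    A minimal nested-sampling sketch for a toy one-dimensional problem (uniform prior on [0, 1], Gaussian likelihood), using naive rejection to sample the constrained prior; real applications replace the rejection step with a smarter constrained sampler:

        import numpy as np

        rng = np.random.default_rng(6)

        def loglike(theta):   # toy likelihood: normalized Gaussian centred at 0.3
            return -0.5 * ((theta - 0.3) / 0.05) ** 2 - 0.5 * np.log(2 * np.pi * 0.05 ** 2)

        n_live, n_iter = 100, 600
        live = rng.uniform(0.0, 1.0, size=n_live)
        live_logl = loglike(live)

        log_z = -np.inf
        for i in range(1, n_iter + 1):
            worst = np.argmin(live_logl)
            # prior-volume shell: w_i = X_{i-1} - X_i with X_i = exp(-i / n_live)
            log_w = -i / n_live + np.log(np.expm1(1.0 / n_live))
            log_z = np.logaddexp(log_z, live_logl[worst] + log_w)
            threshold = live_logl[worst]
            while True:                            # rejection: draw from the prior
                candidate = rng.uniform(0.0, 1.0)  # above the likelihood floor
                if loglike(candidate) > threshold:
                    break
            live[worst], live_logl[worst] = candidate, loglike(candidate)

        # add the remaining live-point contribution
        log_z = np.logaddexp(log_z, np.log(np.mean(np.exp(live_logl))) - n_iter / n_live)
        print(round(float(log_z), 2))   # ~0 here, since this likelihood integrates to ~1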

  1. Application of Maxent Multivariate Analysis to Define Climate-Change Effects on Species Distributions and Changes

    DTIC Science & Technology

    2014-09-01

    A software-based statistical multivariate analysis (Maxent) is applied to define the current and projected future range probability for species of interest to Army land managers. [Figure 4: RCW omission rate and predicted area as a function of the cumulative threshold.]

  2. Spatial Prediction of Coxiella burnetii Outbreak Exposure via Notified Case Counts in a Dose-Response Model.

    PubMed

    Brooke, Russell J; Kretzschmar, Mirjam E E; Hackert, Volker; Hoebe, Christian J P A; Teunis, Peter F M; Waller, Lance A

    2017-01-01

    We develop a novel approach to study an outbreak of Q fever in 2009 in the Netherlands by combining a human dose-response model with geostatistical prediction to relate the probability of infection, and the associated probability of illness, to an effective dose of Coxiella burnetii. The spatial distribution of the 220 notified cases in the at-risk population is translated into a smooth spatial field of dose. Based on these symptomatic cases, the dose-response model predicts a median of 611 asymptomatic infections (95% range: 410, 1,084) for the 220 reported symptomatic cases in the at-risk population; 2.78 (95% range: 1.86, 4.93) asymptomatic infections for each reported case. The attack rates observed during the outbreak were low (the numerical expressions are given in the full-text article). The estimated peak levels of exposure extend to the north-east from the point source, with an increasing proportion of asymptomatic infections further from the source. Our work combines established methodology from model-based geostatistics and dose-response modeling, allowing a novel approach to studying outbreaks. Unobserved infections and the spatially varying effective dose can be predicted within this flexible framework without assuming any underlying spatial structure of the outbreak process. Such predictions are important for targeting interventions during an outbreak, estimating future disease burden, and determining acceptable risk levels.
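
    A minimal sketch of the dose-response logic with invented parameter values: an exponential dose-response converts each location's effective dose into a probability of infection, and the symptomatic fraction scales notified cases up to unobserved asymptomatic infections:

        import numpy as np

        r = 0.01                                     # dose-response parameter; invented
        dose = np.array([5.0, 20.0, 80.0, 300.0])    # effective doses at locations; invented

        p_infection = 1.0 - np.exp(-r * dose)        # exponential dose-response model
        p_symptomatic = 0.25                         # symptomatic fraction; invented

        notified = np.array([2, 7, 15, 40])          # notified (symptomatic) cases; invented
        asymptomatic = notified * (1.0 - p_symptomatic) / p_symptomatic
        print(np.round(p_infection, 3), asymptomatic)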

  3. Parameterizing the Spatial Markov Model from Breakthrough Curve Data Alone

    NASA Astrophysics Data System (ADS)

    Sherman, T.; Bolster, D.; Fakhari, A.; Miller, S.; Singha, K.

    2017-12-01

    The spatial Markov model (SMM) uses a correlated random walk and has been shown to effectively capture anomalous transport in porous media systems; in the SMM, particles' future trajectories are correlated to their current velocity. It is common practice to use a priori Lagrangian velocity statistics obtained from high-resolution simulations to determine a distribution of transition probabilities (correlation) between velocity classes that govern predicted transport behavior; however, this approach is computationally cumbersome. Here, we introduce a methodology to quantify velocity correlation from breakthrough curve (BTC) data alone; discretizing two measured BTCs into a set of arrival times and reverse-engineering the rules of the SMM allows for prediction of velocity correlation, thereby enabling parameterization of the SMM in studies where Lagrangian velocity statistics are not available. The introduced methodology is applied to estimate velocity correlation from BTCs measured in high-resolution simulations, thus allowing for a comparison of estimated parameters with known simulated values. Results show that (1) estimated transition probabilities agree with simulated values and (2) using the SMM with the estimated parameterization accurately predicts BTCs downstream. Additionally, we include uncertainty measurements by calculating lower and upper estimates of velocity correlation, which allow for prediction of a range of BTCs; the simulated BTCs fall within the predicted range. This research proposes a novel method to parameterize the SMM from BTC data alone, thereby reducing the SMM's computational costs and widening its applicability.
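
    A minimal sketch of the SMM's central object, the transition matrix between velocity classes, estimated here from a synthetic correlated log-velocity series standing in for Lagrangian statistics (the paper's contribution is to recover this matrix from two measured BTCs instead):

        import numpy as np

        rng = np.random.default_rng(7)
        n, rho = 100000, 0.8                 # AR(1) log-velocity series; invented
        logv = np.zeros(n)
        for i in range(1, n):
            logv[i] = rho * logv[i - 1] + np.sqrt(1 - rho ** 2) * rng.normal()

        n_classes = 5                        # equiprobable velocity classes
        edges = np.quantile(logv, np.linspace(0.0, 1.0, n_classes + 1))
        cls = np.clip(np.searchsorted(edges, logv, side="right") - 1, 0, n_classes - 1)

        T = np.zeros((n_classes, n_classes))   # empirical transition counts
        for a, b in zip(cls[:-1], cls[1:]):
            T[a, b] += 1
        T /= T.sum(axis=1, keepdims=True)
        print(np.round(T, 2))   # heavy diagonal = the velocity correlation the SMM encodes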

  4. Cyclic variation in seasonal recruitment and the evolution of the seasonal decline in Ural owl clutch size.

    PubMed Central

    Brommer, Jon E; Pietiäinen, Hannu; Kokko, Hanna

    2002-01-01

    Plastic life-history traits can be viewed as adaptive responses to environmental conditions, described by a reaction norm. In birds, the decline in clutch size with advancing laying date has been viewed as a reaction norm in response to the parent's own (somatic or local environmental) condition and the seasonal decline in its offspring's reproductive value. Theory predicts that differences in seasonal recruitment are mirrored in the seasonal decrease in clutch size. We tested this prediction in the Ural owl. The owl's main prey, voles, show a cycle of low, increase and peak phases. Recruitment probability had a humped distribution in both increase and peak phases. Average recruitment probability was two to three times higher in the increase phase and declined faster in the latter part of the season compared with the peak phase. Clutch size decreased twice as steeply in the peak phase (0.1 eggs per day) as in the increase phase (0.05 eggs per day). This result appears to refute theoretical predictions of seasonal clutch size declines. However, a re-examination of current theory shows that the predictions of modelling are less robust to the details of seasonal condition accumulation in birds than originally thought. The observed pattern can be predicted by assuming specifically shaped seasonal increases in condition across individuals. PMID:11916482

  5. A brief introduction to probability.

    PubMed

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: as early as the 1600s, French mathematicians used the rules of probability to place and win bets. Since then, the knowledge of probability has evolved significantly and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability are reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction to probability, we review the concept of the "probability distribution", a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention is focused on the normal distribution, the distribution most widely applied in statistical analysis.

  6. On the degree distribution of horizontal visibility graphs associated with Markov processes and dynamical systems: diagrammatic and variational approaches

    NASA Astrophysics Data System (ADS)

    Lacasa, Lucas

    2014-09-01

    Dynamical processes can be transformed into graphs through a family of mappings called visibility algorithms, enabling (i) empirical time series analysis and signal processing and (ii) the characterization of classes of dynamical systems and stochastic processes using the tools of graph theory. Recent works show that the degree distribution of these graphs encapsulates much information on the signals' variability, and therefore constitutes a fundamental feature for statistical learning purposes. However, exact solutions for the degree distributions are only known in a few cases, such as for uncorrelated random processes. Here we analytically explore these distributions in a list of situations. We present a diagrammatic formalism which computes, for all degrees, the corresponding probability as a series expansion in a coupling constant which is the number of hidden variables. We offer a constructive solution for general Markovian stochastic processes and deterministic maps. As test cases we focus on Ornstein-Uhlenbeck processes, fully chaotic and quasiperiodic maps. Whereas only for certain degree probabilities can all diagrams be summed exactly, in the general case we show that the perturbation theory converges. In a second part, we make use of a variational technique to predict the complete degree distribution for special classes of Markovian dynamics with fast-decaying correlations. In every case we compare the theory with numerical experiments.
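
    A minimal sketch of the horizontal visibility construction for an iid series, for which the exact degree distribution P(k) = (1/3)(2/3)^(k-2) is known:

        import numpy as np
        from collections import Counter

        def hvg_degrees(x):
            """Horizontal visibility graph degrees: i and j (i < j) are linked
            iff every intermediate value lies strictly below min(x[i], x[j])."""
            n = len(x)
            deg = np.zeros(n, dtype=int)
            for i in range(n):
                run_max = -np.inf                 # running max of intermediates
                for j in range(i + 1, n):
                    if run_max < min(x[i], x[j]):
                        deg[i] += 1
                        deg[j] += 1
                    run_max = max(run_max, x[j])
                    if run_max >= x[i]:           # nothing further can see node i
                        break
            return deg

        x = np.random.default_rng(8).random(2000)   # iid uniform series
        counts = Counter(hvg_degrees(x))
        for k in sorted(counts):                     # empirical vs exact (boundary effects aside)
            print(k, round(counts[k] / len(x), 3), round((1 / 3) * (2 / 3) ** (k - 2), 3))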

  7. Potential distribution of Mexican primates: modeling the ecological niche with the maximum entropy algorithm.

    PubMed

    Vidal-García, Francisca; Serio-Silva, Juan Carlos

    2011-07-01

    We developed potential distribution models for the tropical rain forest primates of southern Mexico: the black howler monkey (Alouatta pigra), the mantled howler monkey (Alouatta palliata), and the spider monkey (Ateles geoffroyi). To do so, we applied the maximum entropy algorithm of the ecological niche modeling program MaxEnt. For each species, we used occurrence records from scientific collections and published and unpublished sources, together with the 19 environmental coverage variables related to precipitation and temperature from WorldClim. The predicted distribution of A. pigra was most strongly associated with the mean temperature of the warmest quarter (23.6%), whereas the potential distributions of A. palliata and A. geoffroyi were most strongly associated with precipitation during the coldest quarter (52.2 and 34.3%, respectively). The potential distribution of A. geoffroyi is broader than that of the Alouatta spp. The areas with the greatest probability of presence of A. pigra and A. palliata are strongly associated with riparian vegetation, whereas the presence of A. geoffroyi is more strongly associated with rain forest. Our most significant contribution is the identification of areas with a high probability of the presence of these primate species, information that can be applied to planning future studies and to establishing criteria for the creation of primate conservation areas in Mexico.

  8. Chance-Constrained AC Optimal Power Flow for Distribution Systems With Renewables

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DallAnese, Emiliano; Baker, Kyri; Summers, Tyler

    This paper focuses on distribution systems featuring renewable energy sources (RESs) and energy storage systems, and presents an AC optimal power flow (OPF) approach to optimize system-level performance objectives while coping with uncertainty in both RES generation and loads. The proposed method hinges on a chance-constrained AC OPF formulation where probabilistic constraints are utilized to enforce voltage regulation with prescribed probability. A computationally more affordable convex reformulation is developed by resorting to suitable linear approximations of the AC power-flow equations as well as convex approximations of the chance constraints. The approximate chance constraints provide conservative bounds that hold for arbitrary distributions of the forecasting errors. An adaptive strategy is then obtained by embedding the proposed AC OPF task into a model predictive control framework. Finally, a distributed solver is developed to strategically distribute the solution of the optimization problems across utility and customers.

  9. Predictive models attribute effects on fish assemblages to toxicity and habitat alteration.

    PubMed

    de Zwart, Dick; Dyer, Scott D; Posthuma, Leo; Hawkins, Charles P

    2006-08-01

    Biological assessments should both estimate the condition of a biological resource (magnitude of alteration) and provide environmental managers with a diagnosis of the potential causes of impairment. Although methods of quantifying condition are well developed, identifying and proportionately attributing impairment to probable causes remain problematic. Furthermore, analyses of both condition and cause have often been difficult to communicate. We developed an approach that (1) links fish, habitat, and chemistry data collected from hundreds of sites in Ohio (USA) streams, (2) assesses the biological condition at each site, (3) attributes impairment to multiple probable causes, and (4) provides the results of the analyses in simple-to-interpret pie charts. The data set was managed using a geographic information system. Biological condition was assessed using a RIVPACS (river invertebrate prediction and classification system)-like predictive model. The model provided probabilities of capture for 117 fish species based on the geographic location of sites and local habitat descriptors. Impaired biological condition was defined as the proportion of those native species predicted to occur at a site that were observed. The potential toxic effects of exposure to mixtures of contaminants were estimated using species sensitivity distributions and mixture toxicity principles. Generalized linear regression models described species abundance as a function of habitat characteristics. Statistically linking biological condition, habitat characteristics including mixture risks, and species abundance allowed us to evaluate the losses of species associated with environmental conditions. Results were mapped as simple effect and probable-cause pie charts (EPC pie diagrams), with pie sizes corresponding to the magnitude of local impairment and slice sizes to the relative probable contributions of different stressors. The types of models we used have been successfully applied in ecology and ecotoxicology, but they have not previously been used in concert to quantify impairment and its likely causes. Although data limitations constrained our ability to examine complex interactions between stressors and species, the direct relationships we detected likely represent conservative estimates of stressor contributions to local impairment. Future refinements of the general approach and specific methods described here should yield even more promising results.

  10. Landscape genetics and the spatial distribution of chronic wasting disease

    USGS Publications Warehouse

    Blanchong, Julie A.; Samuel, M.D.; Scribner, K.T.; Weckworth, B.V.; Langenberg, J.A.; Filcek, K.B.

    2008-01-01

    Predicting the spread of wildlife disease is critical for identifying populations at risk, targeting surveillance and designing proactive management programmes. We used a landscape genetics approach to identify landscape features that influenced gene flow and the distribution of chronic wasting disease (CWD) in Wisconsin white-tailed deer. CWD prevalence was negatively correlated with genetic differentiation of study area deer from deer in the area of disease origin (core-area). Genetic differentiation was greatest, and CWD prevalence lowest, in areas separated from the core-area by the Wisconsin River, indicating that this river reduced deer gene flow and probably disease spread. Features of the landscape that influence host dispersal and spatial patterns of disease can be identified based on host spatial genetic structure. Landscape genetics may be used to predict high-risk populations based on their genetic connection to infected populations and to target disease surveillance, control and preventative activities. © 2007 The Royal Society.

  11. Statistical time-dependent model for the interstellar gas

    NASA Technical Reports Server (NTRS)

    Gerola, H.; Kafatos, M.; Mccray, R.

    1974-01-01

    We present models for temperature and ionization structure of low, uniform-density (approximately 0.3 per cu cm) interstellar gas in a galactic disk which is exposed to soft X rays from supernova outbursts occurring randomly in space and time. The structure was calculated by computing the time record of temperature and ionization at a given point by Monte Carlo simulation. The calculation yields probability distribution functions for ionized fraction, temperature, and their various observable moments. These time-dependent models predict a bimodal temperature distribution of the gas that agrees with various observations. Cold regions in the low-density gas may have the appearance of clouds in 21-cm absorption. The time-dependent model, in contrast to the steady-state model, predicts large fluctuations in ionization rate and the existence of cold (approximately 30 K), ionized (ionized fraction equal to about 0.1) regions.

  12. On the application of a hairpin vortex model of wall turbulence to trailing edge noise prediction

    NASA Technical Reports Server (NTRS)

    Liu, N. S.; Shamroth, S. J.

    1985-01-01

    The goal is to develop a technique via a hairpin vortex model of the turbulent boundary layer, which would lead to the estimation of the aerodynamic input for use in trailing edge noise prediction theories. The work described represents an initial step in reaching this goal. The hairpin vortex is considered as the underlying structure of the wall turbulence and the turbulent boundary layer is viewed as an ensemble of typical hairpin vortices of different sizes. A synthesis technique is examined which links the mean flow and various turbulence quantities via these typical vortices. The distribution of turbulence quantities among vortices of different scales follows directly from the probability distribution needed to give the measured mean flow vorticity. The main features of individual representative hairpin vortices are discussed in detail and a preliminary assessment of the synthesis approach is made.

  13. A simple two-stage model predicts response time distributions.

    PubMed

    Carpenter, R H S; Reddi, B A J; Anderson, A J

    2009-08-15

    The neural mechanisms underlying reaction times have previously been modelled in two distinct ways. When stimuli are hard to detect, response time tends to follow a random-walk model that integrates noisy sensory signals. But studies investigating the influence of higher-level factors such as prior probability and response urgency typically use highly detectable targets, and response times then usually correspond to a linear rise-to-threshold mechanism. Here we show that a model incorporating both types of element in series - a detector integrating noisy afferent signals, followed by a linear rise-to-threshold process performing the decision - successfully predicts not only mean response times but, much more stringently, the observed distribution of these times and the rate of decision errors over a wide range of stimulus detectability. By reconciling what previously may have seemed to be conflicting theories, we are now closer to having a complete description of reaction time and the decision processes that underlie it.
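
    A minimal simulation of such a serial two-stage model, a noisy random-walk detection stage followed by a LATER-style linear rise-to-threshold whose rate varies normally across trials, is sketched below; all parameter names and values are illustrative, not the authors' fitted ones.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def simulate_rt(n_trials=2000, signal=0.05, noise=1.0, detect_theta=10.0,
                    rate_mu=1.0, rate_sd=0.3, decide_theta=1.0, dt=0.001):
        """Two stages in series: (1) integrate noisy sensory samples until a
        detection threshold is crossed; (2) a linear rise to a decision
        threshold at a rate drawn fresh on each trial (LATER-like)."""
        rts = np.empty(n_trials)
        for i in range(n_trials):
            x, steps = 0.0, 0
            while x < detect_theta:          # stage 1: random-walk detection
                x += signal + noise * rng.standard_normal()
                steps += 1
            r = rng.normal(rate_mu, rate_sd)
            while r <= 0:                    # redraw non-positive rise rates
                r = rng.normal(rate_mu, rate_sd)
            rts[i] = steps * dt + decide_theta / r   # stage 2: rise time
        return rts

    rts = simulate_rt()
    print(f"median RT ~ {np.median(rts):.3f} s, 95th pct {np.percentile(rts, 95):.3f} s")
    ```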

  14. On the use of Bayesian Monte-Carlo in evaluation of nuclear data

    NASA Astrophysics Data System (ADS)

    De Saint Jean, Cyrille; Archier, Pascal; Privas, Edwin; Noguere, Gilles

    2017-09-01

    As model parameters, necessary ingredients of theoretical models, are not always predicted by theory, a formal mathematical framework associated with the evaluation work is needed to obtain the best set of parameters (resonance parameters, optical models, fission barriers, average widths, multigroup cross sections) with Bayesian statistical inference by comparing theory to experiment. The formal rule underlying this methodology is to estimate the posterior probability density function of a set of parameters by solving an equation of the following type: pdf(posterior) ∝ pdf(prior) × likelihood. A fitting procedure can be seen as an estimation of the posterior probability density of a set of parameters (referred to as x⃗) knowing a prior on these parameters and a likelihood which gives the probability density of observing a data set knowing x⃗. To solve this problem, two major paths could be taken: add approximations and hypotheses and obtain an equation to be solved numerically (minimization of a cost function, or Generalized Least Squares method, referred to as GLS), or use Monte-Carlo sampling of all prior distributions and estimate the final posterior distribution. Monte Carlo methods are a natural solution for Bayesian inference problems. They avoid approximations (present in traditional adjustment procedures based on chi-square minimization) and propose alternatives in the choice of probability density distributions for priors and likelihoods. This paper proposes the use of what we call Bayesian Monte Carlo (referred to as BMC in the rest of the manuscript) over the whole energy range, from the thermal and resonance ranges to the continuum, for all nuclear reaction models at these energies. Algorithms based on Monte-Carlo sampling and Markov chains will be presented. The objectives of BMC are to provide a reference calculation for validating the GLS calculations and approximations, to test the effects of probability density distributions, and to provide a framework for finding the global minimum if several local minima exist. Applications to resolved resonance, unresolved resonance, and continuum evaluation, as well as multigroup cross section data assimilation, will be presented.
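
    The core BMC idea can be sketched as prior sampling plus likelihood weighting (importance sampling); the toy one-parameter problem below is purely illustrative of the pdf(posterior) ∝ pdf(prior) × likelihood rule, not of the nuclear models themselves.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical data and prior for a single model parameter theta.
    data = np.array([1.9, 2.1, 2.0, 2.2])       # measurements
    sigma_exp = 0.1                              # experimental uncertainty
    theta = rng.normal(1.5, 0.5, size=100_000)   # samples from the prior

    # Gaussian log-likelihood of the data for every prior sample, then
    # normalized importance weights: w ~ likelihood(data | theta).
    ll = (-0.5 * ((data[None, :] - theta[:, None]) / sigma_exp) ** 2).sum(axis=1)
    w = np.exp(ll - ll.max())
    w /= w.sum()

    post_mean = np.sum(w * theta)
    post_std = np.sqrt(np.sum(w * (theta - post_mean) ** 2))
    print(f"posterior: {post_mean:.3f} +/- {post_std:.3f}")
    ```

    No cost-function minimization is involved, and non-Gaussian priors or likelihoods can be substituted freely, which is precisely the flexibility the abstract contrasts with GLS.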

  15. TU-AB-BRB-01: Coverage Evaluation and Probabilistic Treatment Planning as a Margin Alternative

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Siebers, J.

    The accepted clinical method to accommodate targeting uncertainties inherent in fractionated external beam radiation therapy is to utilize GTV-to-CTV and CTV-to-PTV margins during the planning process to design a PTV-conformal static dose distribution on the planning image set. Ideally, margins are selected to ensure a high (e.g. >95%) target coverage probability (CP) in spite of inherent inter- and intra-fractional positional variations, tissue motions, and initial contouring uncertainties. Robust optimization techniques, also known as probabilistic treatment planning techniques, explicitly incorporate the dosimetric consequences of targeting uncertainties by including CP evaluation into the planning optimization process along with coverage-based planning objectives. The treatment planner no longer needs to use PTV and/or PRV margins; instead robust optimization utilizes probability distributions of the underlying uncertainties in conjunction with CP-evaluation for the underlying CTVs and OARs to design an optimal treated volume. This symposium will describe CP-evaluation methods as well as various robust planning techniques including use of probability-weighted dose distributions, probability-weighted objective functions, and coverage optimized planning. Methods to compute and display the effect of uncertainties on dose distributions will be presented. The use of robust planning to accommodate inter-fractional setup uncertainties, organ deformation, and contouring uncertainties will be examined as will its use to accommodate intra-fractional organ motion. Clinical examples will be used to inter-compare robust and margin-based planning, highlighting advantages of robust-plans in terms of target and normal tissue coverage. Robust-planning limitations as uncertainties approach zero and as the number of treatment fractions becomes small will be presented, as well as the factors limiting clinical implementation of robust planning. Learning Objectives: To understand robust-planning as a clinical alternative to using margin-based planning. To understand conceptual differences between uncertainty and predictable motion. To understand fundamental limitations of the PTV concept that probabilistic planning can overcome. To understand the major contributing factors to target and normal tissue coverage probability. To understand the similarities and differences of various robust planning techniques. To understand the benefits and limitations of robust planning techniques.

  16. TU-AB-BRB-03: Coverage-Based Treatment Planning to Accommodate Organ Deformable Motions and Contouring Uncertainties for Prostate Treatment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, H.

    The accepted clinical method to accommodate targeting uncertainties inherent in fractionated external beam radiation therapy is to utilize GTV-to-CTV and CTV-to-PTV margins during the planning process to design a PTV-conformal static dose distribution on the planning image set. Ideally, margins are selected to ensure a high (e.g. >95%) target coverage probability (CP) in spite of inherent inter- and intra-fractional positional variations, tissue motions, and initial contouring uncertainties. Robust optimization techniques, also known as probabilistic treatment planning techniques, explicitly incorporate the dosimetric consequences of targeting uncertainties by including CP evaluation into the planning optimization process along with coverage-based planning objectives. The treatment planner no longer needs to use PTV and/or PRV margins; instead robust optimization utilizes probability distributions of the underlying uncertainties in conjunction with CP-evaluation for the underlying CTVs and OARs to design an optimal treated volume. This symposium will describe CP-evaluation methods as well as various robust planning techniques including use of probability-weighted dose distributions, probability-weighted objective functions, and coverage optimized planning. Methods to compute and display the effect of uncertainties on dose distributions will be presented. The use of robust planning to accommodate inter-fractional setup uncertainties, organ deformation, and contouring uncertainties will be examined as will its use to accommodate intra-fractional organ motion. Clinical examples will be used to inter-compare robust and margin-based planning, highlighting advantages of robust-plans in terms of target and normal tissue coverage. Robust-planning limitations as uncertainties approach zero and as the number of treatment fractions becomes small will be presented, as well as the factors limiting clinical implementation of robust planning. Learning Objectives: To understand robust-planning as a clinical alternative to using margin-based planning. To understand conceptual differences between uncertainty and predictable motion. To understand fundamental limitations of the PTV concept that probabilistic planning can overcome. To understand the major contributing factors to target and normal tissue coverage probability. To understand the similarities and differences of various robust planning techniques. To understand the benefits and limitations of robust planning techniques.

  17. TU-AB-BRB-02: Stochastic Programming Methods for Handling Uncertainty and Motion in IMRT Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Unkelbach, J.

    The accepted clinical method to accommodate targeting uncertainties inherent in fractionated external beam radiation therapy is to utilize GTV-to-CTV and CTV-to-PTV margins during the planning process to design a PTV-conformal static dose distribution on the planning image set. Ideally, margins are selected to ensure a high (e.g. >95%) target coverage probability (CP) in spite of inherent inter- and intra-fractional positional variations, tissue motions, and initial contouring uncertainties. Robust optimization techniques, also known as probabilistic treatment planning techniques, explicitly incorporate the dosimetric consequences of targeting uncertainties by including CP evaluation into the planning optimization process along with coverage-based planning objectives. The treatment planner no longer needs to use PTV and/or PRV margins; instead robust optimization utilizes probability distributions of the underlying uncertainties in conjunction with CP-evaluation for the underlying CTVs and OARs to design an optimal treated volume. This symposium will describe CP-evaluation methods as well as various robust planning techniques including use of probability-weighted dose distributions, probability-weighted objective functions, and coverage optimized planning. Methods to compute and display the effect of uncertainties on dose distributions will be presented. The use of robust planning to accommodate inter-fractional setup uncertainties, organ deformation, and contouring uncertainties will be examined as will its use to accommodate intra-fractional organ motion. Clinical examples will be used to inter-compare robust and margin-based planning, highlighting advantages of robust-plans in terms of target and normal tissue coverage. Robust-planning limitations as uncertainties approach zero and as the number of treatment fractions becomes small will be presented, as well as the factors limiting clinical implementation of robust planning. Learning Objectives: To understand robust-planning as a clinical alternative to using margin-based planning. To understand conceptual differences between uncertainty and predictable motion. To understand fundamental limitations of the PTV concept that probabilistic planning can overcome. To understand the major contributing factors to target and normal tissue coverage probability. To understand the similarities and differences of various robust planning techniques. To understand the benefits and limitations of robust planning techniques.

  18. TU-AB-BRB-00: New Methods to Ensure Target Coverage

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    2015-06-15

    The accepted clinical method to accommodate targeting uncertainties inherent in fractionated external beam radiation therapy is to utilize GTV-to-CTV and CTV-to-PTV margins during the planning process to design a PTV-conformal static dose distribution on the planning image set. Ideally, margins are selected to ensure a high (e.g. >95%) target coverage probability (CP) in spite of inherent inter- and intra-fractional positional variations, tissue motions, and initial contouring uncertainties. Robust optimization techniques, also known as probabilistic treatment planning techniques, explicitly incorporate the dosimetric consequences of targeting uncertainties by including CP evaluation into the planning optimization process along with coverage-based planning objectives. The treatment planner no longer needs to use PTV and/or PRV margins; instead robust optimization utilizes probability distributions of the underlying uncertainties in conjunction with CP-evaluation for the underlying CTVs and OARs to design an optimal treated volume. This symposium will describe CP-evaluation methods as well as various robust planning techniques including use of probability-weighted dose distributions, probability-weighted objective functions, and coverage optimized planning. Methods to compute and display the effect of uncertainties on dose distributions will be presented. The use of robust planning to accommodate inter-fractional setup uncertainties, organ deformation, and contouring uncertainties will be examined as will its use to accommodate intra-fractional organ motion. Clinical examples will be used to inter-compare robust and margin-based planning, highlighting advantages of robust-plans in terms of target and normal tissue coverage. Robust-planning limitations as uncertainties approach zero and as the number of treatment fractions becomes small will be presented, as well as the factors limiting clinical implementation of robust planning. Learning Objectives: To understand robust-planning as a clinical alternative to using margin-based planning. To understand conceptual differences between uncertainty and predictable motion. To understand fundamental limitations of the PTV concept that probabilistic planning can overcome. To understand the major contributing factors to target and normal tissue coverage probability. To understand the similarities and differences of various robust planning techniques. To understand the benefits and limitations of robust planning techniques.

  19. [Research on Kalman interpolation prediction model based on micro-region PM2.5 concentration].

    PubMed

    Wang, Wei; Zheng, Bin; Chen, Binlin; An, Yaoming; Jiang, Xiaoming; Li, Zhangyong

    2018-02-01

    In recent years, the pollution problem of particulate matter, especially PM2.5, has become more and more serious, attracting attention from all over the world. In this paper, a Kalman prediction model combined with cubic spline interpolation is proposed and applied to predict the concentration of PM2.5 in the micro-regional environment of a campus, and to generate interpolated maps that simulate the spatial distribution of PM2.5. The experimental data come from the environmental information monitoring system set up by our laboratory. The predicted and actual values of the PM2.5 concentration data were compared using a Wilcoxon signed-rank test: the two-sided asymptotic significance probability was 0.527, much greater than the significance level α = 0.05. The mean absolute error (MAE) of the Kalman prediction model was 1.8 μg/m³, the mean relative error (MRE) was 6%, and the correlation coefficient R was 0.87. Thus, the Kalman prediction model predicts the concentration of PM2.5 better than back propagation (BP) and support vector machine (SVM) predictions. In addition, combining the Kalman prediction model with the spline interpolation method allows the spatial distribution and local pollution characteristics of PM2.5 to be simulated.
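
    As a concrete sketch of the two ingredients, a random-walk Kalman filter for one-step-ahead prediction and cubic-spline interpolation between monitoring points, consider the following; the process/measurement variances and sensor values are hypothetical, not the paper's.

    ```python
    import numpy as np
    from scipy.interpolate import CubicSpline

    def kalman_predict_1d(z, q=1.0, r=4.0, p0=10.0):
        """Random-walk Kalman filter over a PM2.5 series z; returns the
        one-step-ahead prediction made before each measurement arrives.
        q and r are assumed process and measurement noise variances."""
        x, p = z[0], p0
        preds = np.empty_like(z)
        for i, meas in enumerate(z):
            p = p + q                      # predict (state is a random walk)
            preds[i] = x
            k = p / (p + r)                # update with the new measurement
            x = x + k * (meas - x)
            p = (1 - k) * p
        return preds

    pm25 = np.array([35.0, 38.0, 36.0, 40.0, 42.0, 41.0, 45.0])  # toy series
    print(kalman_predict_1d(pm25))

    # Spline interpolation between sensors along a transect, standing in for
    # the paper's spatial simulation step (positions and values hypothetical).
    xs = np.array([0.0, 100.0, 250.0, 400.0])        # sensor positions (m)
    cs = CubicSpline(xs, np.array([35.0, 42.0, 38.0, 44.0]))
    print(cs(np.linspace(0.0, 400.0, 9)))
    ```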

  20. Weak Measurement and Quantum Smoothing of a Superconducting Qubit

    NASA Astrophysics Data System (ADS)

    Tan, Dian

    In quantum mechanics, the measurement outcome of an observable in a quantum system is intrinsically random, yielding a probability distribution. The state of the quantum system can be described by a density matrix rho(t), which depends on the information accumulated until time t, and represents our knowledge about the system. The density matrix rho(t) gives probabilities for the outcomes of measurements at time t. Further probing of the quantum system allows us to refine our prediction in hindsight. In this thesis, we experimentally examine a quantum smoothing theory in a superconducting qubit by introducing an auxiliary matrix E(t) which is conditioned on information obtained from time t to a final time T. With the complete information before and after time t, the pair of matrices [rho(t), E(t)] can be used to make smoothed predictions for the measurement outcome at time t. We apply the quantum smoothing theory in the case of continuous weak measurement unveiling the retrodicted quantum trajectories and weak values. In the case of strong projective measurement, while the density matrix rho(t) with only diagonal elements in a given basis |n〉 may be treated as a classical mixture, we demonstrate a failure of this classical mixture description in determining the smoothed probabilities for the measurement outcome at time t with both diagonal rho(t) and diagonal E(t). We study the correlations between quantum states and weak measurement signals and examine aspects of the time symmetry of continuous quantum measurement. We also extend our study of quantum smoothing theory to the case of resonance fluorescence of a superconducting qubit with homodyne measurement and observe some interesting effects such as the modification of the excited state probabilities, weak values, and evolution of the predicted and retrodicted trajectories.
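
    For projective measurements, the smoothed outcome probabilities implied by the pair [rho(t), E(t)] take the form p(n) ∝ Tr(P_n rho P_n E) in the past-quantum-state formalism; this formula is quoted from that general theory, not from this thesis specifically. The toy qubit below (hypothetical diagonal matrices) shows how retrodiction modifies the forward prediction diag(rho).

    ```python
    import numpy as np

    # Hypothetical diagonal rho(t) (forward state) and E(t) (backward matrix).
    rho = np.diag([0.7, 0.3])
    E = np.diag([0.2, 0.8])
    projectors = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]

    # Smoothed probability of outcome n: Tr(P_n rho P_n E), normalized.
    w = np.array([np.trace(P @ rho @ P @ E).real for P in projectors])
    print(w / w.sum())   # ~ (0.37, 0.63) vs the forward prediction (0.7, 0.3)
    ```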

  1. Minimizing predation risk in a landscape of multiple predators: effects on the spatial distribution of African ungulates.

    PubMed

    Thaker, Maria; Vanak, Abi T; Owen, Cailey R; Ogden, Monika B; Niemann, Sophie M; Slotow, Rob

    2011-02-01

    Studies that focus on single predator-prey interactions can be inadequate for understanding antipredator responses in multi-predator systems. Yet there is still a general lack of information about the strategies of prey to minimize predation risk from multiple predators at the landscape level. Here we examined the distribution of seven African ungulate species in the fenced Karongwe Game Reserve (KGR), South Africa, as a function of predation risk from all large carnivore species (lion, leopard, cheetah, African wild dog, and spotted hyena). Using observed kill data, we generated ungulate-specific predictions of relative predation risk and of riskiness of habitats. To determine how ungulates minimize predation risk at the landscape level, we explicitly tested five hypotheses consisting of strategies that reduce the probability of encountering predators, and the probability of being killed. All ungulate species avoided risky habitats, and most selected safer habitats, thus reducing their probability of being killed. To reduce the probability of encountering predators, most of the smaller prey species (impala, warthog, waterbuck, kudu) avoided the space use of all predators, while the larger species (wildebeest, zebra, giraffe) only avoided areas where lion and leopard space use were high. The strength of avoidance for the space use of predators generally did not correspond to the relative predation threat from those predators. Instead, ungulates used a simpler behavioral rule of avoiding the activity areas of sit-and-pursue predators (lion and leopard), but not those of cursorial predators (cheetah and African wild dog). In general, selection and avoidance of habitats was stronger than avoidance of the predator activity areas. We expect similar decision rules to drive the distribution pattern of ungulates in other African savannas and in other multi-predator systems, especially where predators differ in their hunting modes.

  2. Use of weather data and remote sensing to predict the geographic and seasonal distribution of Phlebotomus papatasi in southwest Asia.

    PubMed

    Cross, E R; Newcomb, W W; Tucker, C J

    1996-05-01

    Sandfly fever and leishmaniasis were major causes of infectious disease morbidity among military personnel deployed to the Middle East during World War II. Recently, leishmaniasis has been reported in the United Nations Multinational Forces and Observers in the Sinai. Despite these indications of endemicity, no cases of sandfly fever and only 31 cases of leishmaniasis have been identified among U.S. veterans of the Persian Gulf War. The distribution in the Persian Gulf of the vector, Phlebotomus papatasi, is thought to be highly dependent on environmental conditions, especially temperature and relative humidity. A computer model was developed using the occurrence of P. papatasi as the dependent variable and weather data as the independent variables. The results of this model indicated that the greatest sand fly activity and thus the highest risk of sandfly fever and leishmania infections occurred during the spring/summer months before U.S. troops were deployed to the Persian Gulf. Because the weather model produced probability of occurrence information for locations of the weather stations only, normalized difference vegetation index (NDVI) levels from remotely sensed Advanced Very High Resolution Radiometer satellites were determined for each weather station. From the results of the frequency of NDVI levels by probability of occurrence, the range of NDVI levels for presence of the vector was determined. The computer then identified all pixels within the NDVI range indicated and produced a computer-generated map of the probable distribution of P. papatasi. The resulting map expanded the analysis to areas where there were no weather stations and from which no information was reported in the literature, identifying these areas as having either a high or low probability of vector occurrence.
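
    The final mapping step amounts to flagging pixels whose NDVI falls inside the presence-associated range; a minimal sketch follows, where the range and raster are hypothetical, not the study's fitted values.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    ndvi = rng.uniform(-0.1, 0.6, size=(4, 5))   # toy NDVI raster

    lo, hi = 0.10, 0.30                          # assumed presence range
    probable = (ndvi >= lo) & (ndvi <= hi)
    print(probable.astype(int))                  # 1 = probable vector habitat
    ```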

  3. Dissociation rate of bromine diatomics in an argon heat bath

    NASA Technical Reports Server (NTRS)

    Razner, R.; Hopkins, D.

    1973-01-01

    The evolution of a collection of 300 K bromine diatomics embedded in a heat bath of argon atoms at 1800 K was studied by computer, and a dissociation-rate constant for the reaction Br2 + Ar yields Br + Br + Ar was determined. Previously published probability distributions for energy and angular momentum transfers in classical three-dimensional Br2-Ar collisions were used in conjunction with a newly developed Monte Carlo scheme for this purpose. Results are compared with experimental shock-tube data and the predictions of several other theoretical models. A departure from equilibrium is obtained which is significantly greater than that predicted by any of these other theories.

  4. The ratio of inclusive jet cross sections at √s = 630 GeV and √s = 1800 GeV

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krane, John

    This dissertation presents an analysis of hadronic jet production from proton-antiproton collisions at two center-of-mass energies. Measurements were performed in the central region (|η|<0.5) of the D0 detector at Fermi National Accelerator Laboratory (Batavia, IL). Results are compared to next-to-leading-order QCD predictions generated with JETRAD and EKS Monte Carlo. Several techniques reduce the uncertainty in the ratio of cross sections to as low as 5%. The observed normalization difference results in a low probability that the data and predictions describe the same distribution.

  5. Relative mass distributions of neutron-rich thermally fissile nuclei within a statistical model

    NASA Astrophysics Data System (ADS)

    Kumar, Bharat; Kannan, M. T. Senthil; Balasubramaniam, M.; Agrawal, B. K.; Patra, S. K.

    2017-09-01

    We study the binary mass distribution for the recently predicted thermally fissile neutron-rich uranium and thorium nuclei using a statistical model. The level density parameters needed for the study are evaluated from the excitation energies of the temperature-dependent relativistic mean field formalism. The excitation energy and the level density parameter for a given temperature are employed in the convolution integral method to obtain the probability of the particular fragmentation. As representative cases, we present the results for the binary yields of 250U and 254Th. The relative yields are presented for three different temperatures: T =1 , 2, and 3 MeV.

  6. Strong regularities in world wide web surfing

    PubMed

    Huberman; Pirolli; Pitkow; Lukose

    1998-04-03

    One of the most common modes of accessing information in the World Wide Web is surfing from one document to another along hyperlinks. Several large empirical studies have revealed common patterns of surfing behavior. A model that assumes that users make a sequence of decisions to proceed to another page, continuing as long as the value of the current page exceeds some threshold, yields the probability distribution for the number of pages that a user visits within a given Web site. This model was verified by comparing its predictions with detailed measurements of surfing patterns. The model also explains the observed Zipf-like distributions in page hits observed at Web sites.
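
    The model is easy to simulate: page value follows a random walk with drift, and the user stops when it falls below a threshold, so the number of pages visited is a first-passage time (inverse-Gaussian distributed, hence the heavy tail). A sketch with illustrative parameters:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def pages_visited(n_users=20_000, v0=2.0, drift=-0.1, sigma=1.0, cap=500):
        """Each user keeps surfing while the current page's value stays
        positive; value evolves as a drifting random walk (parameters are
        illustrative, not fitted to the empirical studies)."""
        counts = np.empty(n_users, dtype=int)
        for i in range(n_users):
            v, n = v0, 1
            while v > 0 and n < cap:
                v += drift + sigma * rng.standard_normal()
                n += 1
            counts[i] = n
        return counts

    L = pages_visited()
    print(np.mean(L), np.percentile(L, [50, 90, 99]))  # heavy right tail
    ```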

  7. HIV Testing and Counseling Leads to Immediate Consistent Condom Use among South African Stable HIV-discordant Couples

    PubMed Central

    Rosenberg, Nora E; Pettifor, Audrey E; Bruyn, Guy DE; Westreich, Daniel; Delany-Moretlwe, Sinead; Behets, Frieda; Maman, Suzanne; Coetzee, David; Kamupira, Mercy; Miller, William C

    2012-01-01

    Introduction: Effective behavioral HIV prevention is needed for stable HIV-discordant couples at risk for HIV, especially those without access to biomedical prevention. This analysis addressed whether HIV testing and counseling (HTC) with ongoing counseling and condom distribution lead to reduced unprotected sex in HIV-discordant couples. Methods: Partners in Prevention HSV/HIV Transmission Study was a randomized trial conducted from 2004–2008 assessing whether acyclovir reduced HIV transmission from HSV-2/HIV-1 co-infected persons to HIV-uninfected sex partners. This analysis relied on self-reported behavioral data from 508 HIV-infected South African participants. The exposure was timing of first HTC: 0–7, 8–14, 15–30, or >30 days before baseline. In each exposure group, predicted probabilities of unprotected sex in the last month were calculated at baseline, month one, and month twelve using generalized estimating equations with a logit link and exchangeable correlation matrix. Results: At baseline, participants who knew their HIV status for less time experienced higher predicted probabilities of unprotected sex in the last month: 0–7 days, 0.71; 8–14 days, 0.52; 15–30 days, 0.49; >30 days, 0.26. At month one, once all participants had been aware of being in HIV-discordant relationships for ≥ 1 month, predicted probabilities declined: 0–7 days, 0.08; 8–14 days, 0.08; 15–30 days, 0.15; >30 days, 0.14. Lower predicted probabilities were sustained through month twelve: 0–7 days, 0.08; 8–14 days, 0.11; 15–30 days, 0.05; >30 days, 0.19. Conclusions: Unprotected sex declined after HIV-positive diagnosis, and declined further after awareness of HIV-discordance. Identifying HIV-discordant couples for behavioral prevention is important for reducing HIV transmission risk. PMID:23117500
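
    The modeling step described above can be reproduced in outline with statsmodels' GEE (logit link, exchangeable working correlation); the data frame below is synthetic, standing in for the trial's participant-visit records.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.api as sm
    import statsmodels.formula.api as smf
    from statsmodels.genmod.cov_struct import Exchangeable

    rng = np.random.default_rng(11)

    # Synthetic repeated measures: one row per participant-visit, with a
    # binary indicator of unprotected sex in the last month.
    n = 200
    df = pd.DataFrame({"pid": np.repeat(np.arange(n), 3),
                       "visit": np.tile([0, 1, 12], n)})
    p_true = np.where(df["visit"] == 0, 0.5, 0.1)    # toy decline after HTC
    df["unprotected"] = rng.binomial(1, p_true)

    model = smf.gee("unprotected ~ C(visit)", groups="pid", data=df,
                    family=sm.families.Binomial(), cov_struct=Exchangeable())
    res = model.fit()
    print(res.predict(pd.DataFrame({"visit": [0, 1, 12]})))  # predicted probs
    ```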

  8. Survival Predictions of Ceramic Crowns Using Statistical Fracture Mechanics

    PubMed Central

    Nasrin, S.; Katsube, N.; Seghi, R.R.; Rokhlin, S.I.

    2017-01-01

    This work establishes a survival probability methodology for interface-initiated fatigue failures of monolithic ceramic crowns under simulated masticatory loading. A complete 3-dimensional (3D) finite element analysis model of a minimally reduced molar crown was developed using commercially available hardware and software. Estimates of material surface flaw distributions and fatigue parameters for 3 reinforced glass-ceramics (fluormica [FM], leucite [LR], and lithium disilicate [LD]) and a dense sintered yttrium-stabilized zirconia (YZ) were obtained from the literature and incorporated into the model. Utilizing the proposed fracture mechanics–based model, crown survival probability as a function of loading cycles was obtained from simulations performed on the 4 ceramic materials utilizing identical crown geometries and loading conditions. The weaker ceramic materials (FM and LR) resulted in lower survival rates than the more recently developed higher-strength ceramic materials (LD and YZ). The simulated 10-y survival rate of crowns fabricated from YZ was only slightly better than those fabricated from LD. In addition, 2 of the model crown systems (FM and LD) were expanded to determine regional-dependent failure probabilities. This analysis predicted that the LD-based crowns were more likely to fail from fractures initiating from margin areas, whereas the FM-based crowns showed a slightly higher probability of failure from fractures initiating from the occlusal table below the contact areas. These 2 predicted fracture initiation locations have some agreement with reported fractographic analyses of failed crowns. In this model, we considered the maximum tensile stress tangential to the interfacial surface, as opposed to the more universally reported maximum principal stress, because it more directly impacts crack propagation. While the accuracy of these predictions needs to be experimentally verified, the model can provide a fundamental understanding of the importance that pre-existing flaws at the intaglio surface have on fatigue failures. PMID:28107637

  9. Digital simulation of two-dimensional random fields with arbitrary power spectra and non-Gaussian probability distribution functions.

    PubMed

    Yura, Harold T; Hanson, Steen G

    2012-04-01

    Methods for simulation of two-dimensional signals with arbitrary power spectral densities and signal amplitude probability density functions are disclosed. The method relies on initially transforming a white noise sample set of random Gaussian distributed numbers into a corresponding set with the desired spectral distribution, after which this colored Gaussian probability distribution is transformed via an inverse transform into the desired probability distribution. In most cases the method provides satisfactory results and can thus be considered an engineering approach. Several illustrative examples with relevance for optics are given.
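
    In outline, the disclosed two-step procedure is (1) spectral shaping of white Gaussian noise via the FFT and (2) a pointwise transform through the Gaussian CDF and the target inverse CDF. A sketch with an assumed power-law spectrum and exponential marginal:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)
    N = 256

    # Step 1: color white Gaussian noise with a desired (here power-law) PSD.
    kx = np.fft.fftfreq(N)[:, None]
    ky = np.fft.fftfreq(N)[None, :]
    k = np.hypot(kx, ky)
    k[0, 0] = k[0, 1]                       # avoid division by zero at DC
    amp = k ** -1.5                         # assumed spectral shape
    field = np.fft.ifft2(np.fft.fft2(rng.standard_normal((N, N))) * amp).real

    # Step 2: rank-preserving pointwise map from the colored Gaussian field
    # to the target marginal (exponential here) via CDF composition.
    z = (field - field.mean()) / field.std()
    target = stats.expon.ppf(stats.norm.cdf(z))
    print(target.mean(), target.std())
    ```

    Note that the pointwise transform perturbs the spectrum somewhat, which is presumably why the abstract characterizes the method as a satisfactory engineering approach rather than an exact one.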

  10. A Brownian model for recurrent earthquakes

    USGS Publications Warehouse

    Matthews, M.V.; Ellsworth, W.L.; Reasenberg, P.A.

    2002-01-01

    We construct a probability model for rupture times on a recurrent earthquake source. Adding Brownian perturbations to steady tectonic loading produces a stochastic load-state process. Rupture is assumed to occur when this process reaches a critical-failure threshold. An earthquake relaxes the load state to a characteristic ground level and begins a new failure cycle. The load-state process is a Brownian relaxation oscillator. Intervals between events have a Brownian passage-time distribution that may serve as a temporal model for time-dependent, long-term seismic forecasting. This distribution has the following noteworthy properties: (1) the probability of immediate rerupture is zero; (2) the hazard rate increases steadily from zero at t = 0 to a finite maximum near the mean recurrence time and then decreases asymptotically to a quasi-stationary level, in which the conditional probability of an event becomes time independent; and (3) the quasi-stationary failure rate is greater than, equal to, or less than the mean failure rate because the coefficient of variation is less than, equal to, or greater than 1/√2 ≈ 0.707. In addition, the model provides expressions for the hazard rate and probability of rupture on faults for which only a bound can be placed on the time of the last rupture. The Brownian relaxation oscillator provides a connection between observable event times and a formal state variable that reflects the macromechanics of stress and strain accumulation. Analysis of this process reveals that the quasi-stationary distance to failure has a gamma distribution, and residual life has a related exponential distribution. It also enables calculation of "interaction" effects due to external perturbations to the state, such as stress-transfer effects from earthquakes outside the target source. The influence of interaction effects on recurrence times is transient and strongly dependent on when in the loading cycle step perturbations occur. Transient effects may be much stronger than would be predicted by the "clock change" method and characteristically decay inversely with elapsed time after the perturbation.
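
    The Brownian passage-time distribution coincides with the inverse Gaussian, so its hazard can be examined directly with scipy; the mean recurrence and aperiodicity below are illustrative. In scipy's parametrization, invgauss(mu=a², scale=m/a²) has mean m, shape m/a², and coefficient of variation a.

    ```python
    import numpy as np
    from scipy import stats

    m, a = 100.0, 0.5                      # mean recurrence (yr), aperiodicity
    bpt = stats.invgauss(mu=a**2, scale=m / a**2)

    t = np.linspace(1.0, 400.0, 400)
    hazard = bpt.pdf(t) / bpt.sf(t)        # instantaneous failure rate
    # Near zero at t = 0, maximum near the mean recurrence time, then decay
    # toward the quasi-stationary level described in the abstract.
    print(hazard[0], hazard.max(), hazard[-1])
    ```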

  11. Changes in tropical precipitation cluster size distributions under global warming

    NASA Astrophysics Data System (ADS)

    Neelin, J. D.; Quinn, K. M.

    2016-12-01

    The total amount of precipitation integrated across a tropical storm or other precipitation feature (contiguous clusters of precipitation exceeding a minimum rain rate) is a useful measure of the aggregate size of the disturbance. To establish baseline behavior in current climate, the probability distribution of cluster sizes from multiple satellite retrievals and National Center for Environmental Prediction (NCEP) reanalysis is compared to those from Coupled Model Intercomparison Project (CMIP5) models and the Geophysical Fluid Dynamics Laboratory high-resolution atmospheric model (HIRAM-360 and -180). With the caveat that a minimum rain rate threshold is important in the models (which tend to overproduce low rain rates), the models agree well with observations in leading properties. In particular, scale-free power law ranges in which the probability drops slowly with increasing cluster size are well modeled, followed by a rapid drop in probability of the largest clusters above a cutoff scale. Under the RCP 8.5 global warming scenario, the models indicate substantial increases in probability (up to an order of magnitude) of the largest clusters by the end of century. For models with continuous time series of high resolution output, there is substantial spread on when these probability increases for the largest precipitation clusters should be detectable, ranging from detectable within the observational period to statistically significant trends emerging only in the second half of the century. Examination of NCEP reanalysis and SSMI/SSMIS series of satellite retrievals from 1979 to present does not yield reliable evidence of trends at this time. The results suggest improvements in inter-satellite calibration of the SSMI/SSMIS retrievals could aid future detection.
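
    The basic cluster-size measurement is straightforward to reproduce on any gridded rain-rate field: threshold at the minimum rain rate, label contiguous wet regions, and histogram the rain integrated over each cluster. A sketch on a synthetic field (the exponential rain rates and threshold are stand-ins, not the retrievals):

    ```python
    import numpy as np
    from scipy import ndimage

    rng = np.random.default_rng(5)
    rain = rng.exponential(1.0, size=(500, 500))  # toy rain-rate field

    wet = rain > 3.0                              # minimum rain-rate threshold
    labels, n = ndimage.label(wet)                # contiguous clusters
    sizes = ndimage.sum(rain, labels, index=np.arange(1, n + 1))

    # Log-binned counts, to look for a power-law range and a large-size cutoff.
    bins = np.logspace(np.log10(sizes.min()), np.log10(sizes.max()), 20)
    hist, _ = np.histogram(sizes, bins=bins)
    print(n, hist)
    ```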

  12. The global impact distribution of Near-Earth objects

    NASA Astrophysics Data System (ADS)

    Rumpf, Clemens; Lewis, Hugh G.; Atkinson, Peter M.

    2016-02-01

    Asteroids that could collide with the Earth are listed on the publicly available Near-Earth object (NEO) hazard web sites maintained by the National Aeronautics and Space Administration (NASA) and the European Space Agency (ESA). The impact probability distributions of 69 potentially threatening NEOs from these lists, which produce 261 dynamically distinct impact instances, or Virtual Impactors (VIs), were calculated using the Asteroid Risk Mitigation and Optimization Research (ARMOR) tool in conjunction with OrbFit. ARMOR projected the impact probability of each VI onto the surface of the Earth as a spatial probability distribution. The projection considers orbit solution accuracy and the global impact probability. The method of ARMOR is introduced and the tool is validated against two asteroid-Earth collision cases with objects 2008 TC3 and 2014 AA. In the analysis, the natural distribution of impact corridors is contrasted against the impact probability distribution to evaluate the distributions' conformity with the uniform impact distribution assumption. The distribution of impact corridors is based on the NEO population and orbital mechanics. The analysis shows that the distribution of impact corridors matches the common assumption of uniform impact distribution, and the result extends the evidence base for the uniform assumption from qualitative analysis of historic impact events into the future in a quantitative way. This finding is confirmed in a parallel analysis of impact points belonging to a synthetic population of 10,006 VIs. Taking the impact probabilities into account introduced significant variation into the results, and the impact probability distribution consequently deviates markedly from uniformity. The concept of impact probabilities is a product of the asteroid observation and orbit determination technique and thus represents a man-made component that is largely disconnected from natural processes. It is important to consider impact probabilities because such information represents the best estimate of where an impact might occur.

  13. Predicting the occurrence probability of freak waves based on buoy data and non-stationary extreme value models

    NASA Astrophysics Data System (ADS)

    Tomas, A.; Menendez, M.; Mendez, F. J.; Coco, G.; Losada, I. J.

    2012-04-01

    In the last decades, freak or rogue waves have become an important topic in engineering and science. Forecasting the occurrence probability of freak waves is a challenge for oceanographers, engineers, physicists and statisticians. There are several mechanisms responsible for the formation of freak waves, and different theoretical formulations (primarily based on numerical models with simplifying assumptions) have been proposed to predict the occurrence probability of freak waves in a sea state as a function of N (number of individual waves) and kurtosis (k). On the other hand, different attempts to parameterize k as a function of spectral parameters such as the Benjamin-Feir Index (BFI) and the directional spreading (Mori et al., 2011) have been proposed. The objective of this work is twofold: (1) develop a statistical model to describe the uncertainty of the maximum individual wave height, Hmax, considering N and k as covariates; (2) obtain a predictive formulation to estimate k as a function of aggregated sea state spectral parameters. For both purposes, we use free surface measurements (more than 300,000 20-minute sea states) from the Spanish deep water buoy network (Puertos del Estado, Spanish Ministry of Public Works). Non-stationary extreme value models are nowadays widely used to analyze the time-dependent or directional-dependent behavior of extreme values of geophysical variables such as significant wave height (Izaguirre et al., 2010). In this work, a Generalized Extreme Value (GEV) statistical model for the dimensionless maximum wave height (x = Hmax/Hs) in every sea state is used to assess the probability of freak waves. We allow the location, scale and shape parameters of the GEV distribution to vary as a function of k and N. The kurtosis dependency is parameterized using third-order polynomials and the model is fitted using standard log-likelihood theory, obtaining very good predictive performance for the occurrence probability of freak waves (x > 2). Regarding the second objective of this work, we apply different algorithms using three spectral parameters (wave steepness, directional dispersion, frequency dispersion) as predictors to estimate the probability density function of the kurtosis for a given sea state. The authors thank Puertos del Estado (Spanish Ministry of Public Works) for providing the free surface measurement database.
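
    A simplified version of such a covariate-dependent GEV fit, with linear rather than third-order kurtosis dependence, synthetic data, and scipy's sign convention c = -ξ for the shape, might look like this:

    ```python
    import numpy as np
    from scipy import stats, optimize

    rng = np.random.default_rng(6)

    # Synthetic sea states: kurtosis k and dimensionless maxima x = Hmax/Hs
    # whose GEV location drifts with k (all numbers illustrative).
    k = rng.uniform(2.8, 3.6, size=2000)
    x = stats.genextreme.rvs(c=0.1, loc=1.5 + 0.3 * (k - 3.0), scale=0.12,
                             size=k.size, random_state=rng)

    def negloglik(p):
        c, b0, b1, scale = p
        if scale <= 0:
            return np.inf
        return -stats.genextreme.logpdf(x, c=c, loc=b0 + b1 * (k - 3.0),
                                        scale=scale).sum()

    res = optimize.minimize(negloglik, x0=[0.0, 1.5, 0.0, 0.1],
                            method="Nelder-Mead")
    c, b0, b1, scale = res.x
    # Freak-wave probability, Pr(x > 2), for a sea state with kurtosis 3.4:
    print(stats.genextreme.sf(2.0, c=c, loc=b0 + b1 * 0.4, scale=scale))
    ```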

  14. Know the risk, take the win: how executive functions and probability processing influence advantageous decision making under risk conditions.

    PubMed

    Brand, Matthias; Schiebener, Johannes; Pertl, Marie-Theres; Delazer, Margarete

    2014-01-01

    Recent models on decision making under risk conditions have suggested that numerical abilities are important ingredients of advantageous decision-making performance, but empirical evidence is still limited. The results of our first study show that logical reasoning and basic mental calculation capacities predict ratio processing and that ratio processing predicts decision making under risk. In the second study, logical reasoning together with executive functions predicted probability processing (numeracy and probability knowledge), and probability processing predicted decision making under risk. These findings suggest that increasing an individual's understanding of ratios and probabilities should lead to more advantageous decisions under risk conditions.

  15. Stable laws and cosmic ray physics

    NASA Astrophysics Data System (ADS)

    Genolini, Y.; Salati, P.; Serpico, P. D.; Taillet, R.

    2017-04-01

    Context. In the new "precision era" for cosmic ray astrophysics, scientists making theoretical predictions cannot content themselves with average trends, but need to correctly take into account intrinsic uncertainties. The space-time discreteness of the cosmic ray sources, together with a substantial ignorance of their precise epochs and locations (with the possible exception of the most recent and close ones) play an important role in this sense. Aims: We elaborate a statistical theory to deal with this problem, relating the composite probability P(Ψ) to obtain a flux Ψ at the Earth and the single-source probability p(ψ) to contribute with a flux ψ. The main difficulty arises from the fact that p(ψ) is a "heavy tail" distribution, characterized by power-law or broken power-law behavior up to very large fluxes, for which the central limit theorem does not hold, and leading to distributions different from Gaussian. The functional form of the distribution for the aggregated flux is nonetheless unchanged by its own convolution, that is, it belongs to the so-called stable laws class. Methods: We analytically discuss the regime of validity of the stable laws associated with the distributions arising in cosmic ray astrophysics, as well as the limitations to the treatment imposed by causal considerations and partial source catalog knowledge. We validate our results with extensive Monte Carlo simulations, for different regimes of propagation parameters and energies. Results: We find that relatively simple recipes provide a satisfactory description of the probability P(Ψ). We also find that a naive Gaussian fit to simulation results would underestimate the probability of very large fluxes, that is, several times above the average, while overestimating the probability of relatively milder excursions. At large energies, large flux fluctuations are prevented by causal considerations, while at low energies, a partial knowledge of the recent and nearby population of sources plays an important role. A few proposals have been recently discussed in the literature to account for spectral breaks reported in cosmic ray data in terms of local contributions. We apply our newly developed theory to assess their probabilities, finding that they are relatively small, typically at the 0.1% level or smaller, never exceeding 1%. Conclusions: The use of heavy tail distributions is relevant in assessing how likely a measured cosmic ray flux is to depart from the average expectation in a given model. The existing mathematical theory leading to stable laws can be adapted to the case of interest via some recipes that closely reproduce numerical simulations and are relatively easy to implement.
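
    The practical point — that a Gaussian fit to the aggregated flux underestimates large excursions when single-source fluxes are heavy-tailed — can be seen in a few lines of Monte Carlo; the Pareto tail index below is a toy choice, not a physical source model.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Aggregate flux = sum over many sources of heavy-tailed single-source
    # fluxes; a tail index < 2 puts the sum in the non-Gaussian stable regime.
    n_real, n_sources = 10_000, 500
    Psi = rng.pareto(1.7, size=(n_real, n_sources)).sum(axis=1)

    mu, sd = Psi.mean(), Psi.std()
    for q, z in [(0.99, 2.326), (0.999, 3.090)]:
        print(f"q={q}: empirical {np.quantile(Psi, q):.0f} "
              f"vs Gaussian-fit {mu + z * sd:.0f}")
    ```

    The Gaussian fit roughly tracks moderate quantiles but falls increasingly short in the far tail, exactly the regime where the stable-law treatment is needed.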

  16. Experimental Study of the Effect of the Initial Spectrum Width on the Statistics of Random Wave Groups

    NASA Astrophysics Data System (ADS)

    Shemer, L.; Sergeeva, A.

    2009-12-01

    The statistics of a random water wave field determine the probability of appearance of extremely high (freak) waves. This probability is strongly related to the spectral wave field characteristics. Laboratory investigation of the spatial variation of the random wave-field statistics for various initial conditions is thus of substantial practical importance. Unidirectional nonlinear random wave groups are investigated experimentally in the 300 m long Large Wave Channel (GWK) in Hannover, Germany, which is the biggest facility of its kind in Europe. Numerous realizations of a wave field with the prescribed frequency power spectrum, yet randomly-distributed initial phases of each harmonic, were generated by a computer-controlled piston-type wavemaker. Several initial spectral shapes with identical dominant wave length but different widths were considered. For each spectral shape, the total duration of sampling in all realizations was long enough to yield a sufficient sample size for reliable statistics. Through all experiments, an effort was made to retain the characteristic wave height value and thus the degree of nonlinearity of the wave field. Spatial evolution of numerous statistical wave field parameters (skewness, kurtosis and probability distributions) is studied using about 25 wave gauges distributed along the tank. It is found that, depending on the initial spectral shape, the frequency spectrum of the wave field may undergo significant modification in the course of its evolution along the tank; the values of all statistical wave parameters are strongly related to the local spectral width. A sample of the measured wave height probability functions (scaled by the variance of surface elevation) is plotted in Fig. 1 for the initially narrow rectangular spectrum. The results in Fig. 1 resemble findings obtained in [1] for the initial Gaussian spectral shape. The probability of large waves notably surpasses that predicted by the Rayleigh distribution and is the highest at the distance of about 100 m. Acknowledgement: This study was carried out in the framework of the EC-supported project "Transnational access to large-scale tests in the Large Wave Channel (GWK) of Forschungszentrum Küste" (Contract HYDRALAB III - No. 022441). [1] L. Shemer and A. Sergeeva, J. Geophys. Res. Oceans 114, C01015 (2009). Figure 1. Variation along the tank of the measured wave height distribution for the rectangular initial spectral shape; carrier wave period T0 = 1.5 s.

  17. Big data prediction of durations for online collective actions based on peak's timing

    NASA Astrophysics Data System (ADS)

    Nie, Shizhao; Wang, Zheng; Pujia, Wangmo; Nie, Yuan; Lu, Peng

    2018-02-01

    The Peak Model states that each collective action has a life cycle comprising four periods, "prepare", "outbreak", "peak", and "vanish", and that the peak determines the maximum energy and the whole process. Re-simulation of the peak model indicates that there seems to be a stable ratio between the peak's timing (TP) and the total span (T), or duration, of collective actions, which needs further validation through empirical data on collective actions. Therefore, daily big data on online collective actions, obtained from online data recording and mining of websites, is applied to validate the model; the key is to check the ratio between the peak's timing and the total span. The empirical big data verify that there is a stable ratio between TP and T; furthermore, it appears to be normally distributed. This rule holds both for the general case and for sub-types of collective actions. Given the distribution of the ratio, an estimated probability density function can be obtained, and therefore the span can be predicted via the peak's timing. Under the big data scenario, the instant span (how long the collective action lasts or when it ends) can be monitored and predicted in real time. With denser data (big data), the estimation of the ratio's distribution gets more robust, and the prediction of collective actions' spans or durations becomes more accurate.
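
    In that scheme, an observed peak time TP converts into a predictive distribution for the span via T = TP/ρ, with ρ = TP/T drawn from the (approximately normal) estimated ratio distribution. A sketch with hypothetical parameters:

    ```python
    import numpy as np

    rng = np.random.default_rng(8)

    rho_mean, rho_sd = 0.4, 0.08          # assumed ratio distribution
    TP = 6.0                              # observed peak timing (days)

    rho = rng.normal(rho_mean, rho_sd, size=100_000)
    rho = rho[(rho > 0.05) & (rho <= 1.0)]   # a valid ratio lies in (0, 1]
    T = TP / rho
    print(np.percentile(T, [10, 50, 90]))    # predictive interval for the span
    ```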

  18. Selective Attention in Pigeon Temporal Discrimination.

    PubMed

    Subramaniam, Shrinidhi; Kyonka, Elizabeth

    2017-07-27

    Cues can vary in how informative they are about when specific outcomes, such as food availability, will occur. This study was an experimental investigation of the functional relation between cue informativeness and temporal discrimination in a peak-interval (PI) procedure. Each session consisted of fixed-interval (FI) 2-s and 4-s schedules of food and occasional, 12-s PI trials during which pecks had no programmed consequences. Across conditions, the phi (ϕ) correlation between key light color and FI schedule value was manipulated. Red and green key lights signaled the onset of either or both FI schedules. Different colors were either predictive (ϕ = 1), moderately predictive (ϕ = 0.2-0.8), or not predictive (ϕ = 0) of a specific FI schedule. This study tested the hypothesis that temporal discrimination is a function of the momentary conditional probability of food; that is, pigeons peck the most at either 2 s or 4 s when ϕ = 1 and peck at both intervals when ϕ < 1. Response distributions were bimodal Gaussian curves; distributions from red- and green-key PI trials converged when ϕ ≤ 0.6. Peak times estimated by summed Gaussian functions, averaged across conditions and pigeons, were 1.85 s and 3.87 s; however, pigeons did not always maximize the momentary probability of food. When key light color was highly correlated with FI schedules (ϕ ≥ 0.6), estimates of peak times indicated that temporal discrimination accuracy was reduced at the unlikely interval, but not the likely interval. The mechanism of this reduced temporal discrimination accuracy could be interpreted as an attentional process.

  19. Wave-height hazard analysis in Eastern Coast of Spain - Bayesian approach using generalized Pareto distribution

    NASA Astrophysics Data System (ADS)

    Egozcue, J. J.; Pawlowsky-Glahn, V.; Ortego, M. I.

    2005-03-01

    Standard practice of wave-height hazard analysis often pays little attention to the uncertainty of assessed return periods and occurrence probabilities. This fact favors the opinion that, when large events happen, the hazard assessment should change accordingly. However, the uncertainty of the hazard estimates is normally able to hide the effect of those large events. This is illustrated using data from the Mediterranean coast of Spain, where the last years have been extremely disastrous. Thus, it is possible to compare the hazard assessment based on data previous to those years with the analysis including them. With our approach, no significant change is detected when the statistical uncertainty is taken into account. The hazard analysis is carried out with a standard model. Time-occurrence of events is assumed Poisson distributed. The wave-height of each event is modelled as a random variable whose upper tail follows a Generalized Pareto Distribution (GPD). Moreover, wave-heights are assumed independent from event to event and also independent of their occurrence in time. A threshold for excesses is assessed empirically. The other three parameters (Poisson rate, shape and scale parameters of the GPD) are jointly estimated using Bayes' theorem. The prior distribution accounts for physical features of ocean waves in the Mediterranean sea and experience with these phenomena. The posterior distribution of the parameters allows one to obtain posterior distributions of other derived parameters like occurrence probabilities and return periods. Predictive distributions are also available. Computations are carried out using the program BGPE v2.0.
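
    A compact way to reproduce the inferential machinery (not the paper's exact priors, data, or the BGPE program) is a random-walk Metropolis sampler over the three parameters, from which posterior return levels follow directly:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(9)

    # Synthetic threshold excesses over an assumed 30-year record.
    excess = stats.genpareto.rvs(c=0.1, scale=0.8, size=60, random_state=rng)
    years, n = 30.0, excess.size

    def logpost(theta):
        lam, c, scale = theta              # Poisson rate, GPD shape and scale
        if lam <= 0 or scale <= 0:
            return -np.inf
        ll = (n * np.log(lam) - lam * years          # Poisson occurrence
              + stats.genpareto.logpdf(excess, c=c, scale=scale).sum())
        lp = (stats.gamma.logpdf(lam, a=2, scale=2)  # illustrative priors
              + stats.norm.logpdf(c, 0.0, 0.5)
              + stats.gamma.logpdf(scale, a=2, scale=2))
        return ll + lp

    theta = np.array([2.0, 0.0, 1.0])
    lp = logpost(theta)
    chain = []
    for _ in range(20_000):
        prop = theta + rng.normal(0.0, [0.10, 0.05, 0.05])
        lp_prop = logpost(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        chain.append(theta.copy())
    lam, c, scale = np.array(chain[5000:]).T         # drop burn-in

    # Posterior of the 50-year return level (excess above the threshold).
    rl50 = stats.genpareto.ppf(1.0 - 1.0 / (50.0 * lam), c=c, scale=scale)
    print(np.percentile(rl50, [5, 50, 95]))
    ```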

  20. Ecology of nonnative Siberian prawn (Palaemon modestus) in the lower Snake River, Washington, USA

    USGS Publications Warehouse

    Erhardt, John M.; Tiffan, Kenneth F.

    2016-01-01

    We assessed the abundance, distribution, and ecology of the nonnative Siberian prawn Palaemon modestus in the lower Snake River, Washington, USA. Analysis of prawn passage abundance at three Snake River dams showed that populations are growing at exponential rates, especially at Little Goose Dam where over 464,000 prawns were collected in 2015. Monthly beam trawling during 2011–2013 provided information on prawn abundance and distribution in Lower Granite and Little Goose Reservoirs. Zero-inflated regression predicted that the probability of prawn presence increased with decreasing water velocity and increasing depth. Negative binomial models predicted higher catch rates of prawns in deeper water and in closer proximity to dams. Temporally, prawn densities decreased slightly in the summer, likely due to the mortality of older individuals, and then increased in autumn and winter with the emergence and recruitment of young of the year. Seasonal length frequencies showed that distinct juvenile and adult size classes exist throughout the year, suggesting prawns live from 1 to 2 years and may be able to reproduce multiple times during their life. Most juvenile prawns become reproductive adults in 1 year, and peak reproduction occurs from late July through October. Mean fecundity (189 eggs) and reproductive output (11.9 %) are similar to that in their native range. The current use of deep habitats by prawns likely makes them unavailable to most predators in the reservoirs. The distribution and role of Siberian prawns in the lower Snake River food web will probably continue to change as the population grows and warrants continued monitoring and investigation.

  1. Computer Modeling to Evaluate the Impact of Technology Changes on Resident Procedural Volume.

    PubMed

    Grenda, Tyler R; Ballard, Tiffany N S; Obi, Andrea T; Pozehl, William; Seagull, F Jacob; Chen, Ryan; Cohn, Amy M; Daskin, Mark S; Reddy, Rishindra M

    2016-12-01

    As resident "index" procedures change in volume due to advances in technology or reliance on simulation, it may be difficult to ensure trainees meet case requirements. Training programs are in need of metrics to determine how many residents their institutional volume can support. As a case study of how such metrics can be applied, we evaluated a case distribution simulation model to examine program-level mediastinoscopy and endobronchial ultrasound (EBUS) volumes needed to train thoracic surgery residents. A computer model was created to simulate case distribution based on annual case volume, number of trainees, and rotation length. Single institutional case volume data (2011-2013) were applied, and 10,000 simulation years were run to predict the likelihood (95% confidence interval) of all residents (4 trainees) achieving board requirements for operative volume during a 2-year program. The mean annual mediastinoscopy volume was 43. In a simulation of pre-2012 board requirements (thoracic pathway, 25; cardiac pathway, 10), there was a 6% probability of all 4 residents meeting requirements. Under post-2012 requirements (thoracic, 15; cardiac, 10), however, the likelihood increased to 88%. When EBUS volume (mean 19 cases per year) was concurrently evaluated in the post-2012 era (thoracic, 10; cardiac, 0), the likelihood of all 4 residents meeting case requirements was only 23%. This model provides a metric to predict the probability of residents meeting case requirements in an era of changing volume by accounting for unpredictable and inequitable case distribution. It could be applied across operations, procedures, or disease diagnoses and may be particularly useful in developing resident curricula and schedules.
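
    A minimal version of such a simulation is shown below. It assumes cases arrive Poisson-distributed and are assigned uniformly at random to residents, which is a simplification of the paper's rotation-based model, and it uses a single 15-case requirement rather than separate thoracic/cardiac pathways:

        import numpy as np

        rng = np.random.default_rng(1)

        def p_all_meet(annual_mean, residents, years, requirement,
                       trials=10_000):
            """Estimate the probability that every resident reaches the
            case requirement under random case assignment."""
            ok = 0
            for _ in range(trials):
                n_cases = rng.poisson(annual_mean * years)
                counts = np.bincount(rng.integers(0, residents, n_cases),
                                     minlength=residents)
                ok += counts.min() >= requirement
            return ok / trials

        # The abstract's numbers: 43 mediastinoscopies/yr, 4 residents,
        # 2-year program, 15-case post-2012 thoracic requirement.
        print(p_all_meet(43, 4, 2, 15))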

  2. Microdosimetric Modeling of Biological Effectiveness for Boron Neutron Capture Therapy Considering Intra- and Intercellular Heterogeneity in 10B Distribution.

    PubMed

    Sato, Tatsuhiko; Masunaga, Shin-Ichiro; Kumada, Hiroaki; Hamada, Nobuyuki

    2018-01-17

    We here propose a new model for estimating the biological effectiveness of boron neutron capture therapy (BNCT) that considers intra- and intercellular heterogeneity in 10B distribution. The new model was developed from our previously established stochastic microdosimetric kinetic model, which determines the surviving fraction of cells irradiated with any type of radiation. In the model, the probability density of the absorbed doses at microscopic scales is the fundamental physical index for characterizing the radiation fields. A new computational method was established to determine this probability density for application to BNCT using the Particle and Heavy Ion Transport code System PHITS. The parameters used in the model were determined from the measured surviving fractions of tumor cells administered two kinds of 10B compounds. The model quantitatively highlighted the indispensable need to consider the synergistic effect and the dose dependence of the biological effectiveness when estimating the therapeutic effect of BNCT. The model can predict the biological effectiveness of newly developed 10B compounds based on their intra- and intercellular distributions, and thus it can play important roles not only in treatment planning but also in drug discovery research for future BNCT.

  3. An exactly solvable coarse-grained model for species diversity

    NASA Astrophysics Data System (ADS)

    Suweis, Samir; Rinaldo, Andrea; Maritan, Amos

    2012-07-01

    We present novel analytical results concerning ecosystem species diversity that stem from a proposed coarse-grained neutral model based on birth-death processes. The relevance of the problem lies in the urgency for understanding and synthesizing both theoretical results from ecological neutral theory and empirical evidence on species diversity preservation. The neutral model of biodiversity deals with ecosystems at the same trophic level, where per capita vital rates are assumed to be species independent. Closed-form analytical solutions for the neutral theory are obtained within a coarse-grained model, where the only input is the species persistence time distribution. Our results pertain to: the probability distribution function of the number of species in the ecosystem, both in transient and in stationary states; the n-point connected time correlation function; and the survival probability, defined as the distribution of time spans to local extinction for a species randomly sampled from the community. Analytical predictions are also tested on empirical data from an estuarine fish ecosystem. We find that emerging properties of the ecosystem are very robust and do not depend on specific details of the model, with implications for biodiversity and conservation biology.
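
    The "only input is the persistence-time distribution" structure can be mimicked with an M/G/∞-style toy model: species originate as a Poisson process, each persists for an i.i.d. random lifetime, and the stationary species count is then Poisson with mean (origination rate) × (mean persistence time). The lifetime law below is an arbitrary choice for illustration only:

        import numpy as np

        rng = np.random.default_rng(2)

        rate, horizon = 2.0, 5_000.0
        n_arrivals = rng.poisson(rate * horizon)
        t0 = rng.uniform(0.0, horizon, n_arrivals)      # origination times
        lifetime = rng.pareto(2.5, n_arrivals) + 0.1    # persistence times

        # Species richness sampled at many late observation times.
        obs = np.linspace(0.5 * horizon, horizon, 200)
        richness = [np.sum((t0 <= t) & (t0 + lifetime > t)) for t in obs]

        # Empirical mean richness vs. the Poisson mean rate * E[lifetime].
        print(np.mean(richness), rate * lifetime.mean())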

  4. Modeling close encounters with massive asteroids: a Markovian approach. An application to the Vesta family

    NASA Astrophysics Data System (ADS)

    Carruba, V.; Roig, F.; Michtchenko, T. A.; Ferraz-Mello, S.; Nesvorný, D.

    2007-04-01

    Context: Nearly all members of the Vesta family cross the orbits of (4) Vesta, one of the most massive asteroids in the main belt, and some of them approach it closely. When mutual velocities during such close encounters are low, the trajectory of the small body can be gravitationally deflected, consequently changing its heliocentric orbital elements. While the effect of a single close encounter may be small, repeated close encounters may significantly change the proper-element distribution of members of asteroid families. Aims: We develop a model of the long-term effect of close encounters with massive asteroids, so as to be able to predict how far former members of the Vesta family could have drifted away from the family. Methods: We first developed a new symplectic integrator that simulates both the effects of close encounters and the Yarkovsky effect. We analyzed the results of a simulation involving a fictitious Vesta family, and propagated the asteroid proper-element distribution using the probability density function (pdf hereafter), i.e., the function that describes the probability of an encounter modifying a proper element x by Δx, for all possible values of Δx. Given any asteroid's proper-element distribution at time t, the distribution at time t+T may be predicted if the pdf is known (Bachelier 1900, Théorie de la spéculation; Hughes 1995, Random Walks and Random Environments, Vol. I). Results: We applied our new method to the problem of V-type asteroids outside the Vesta family (i.e., the 31 currently known asteroids in the inner asteroid belt that have the same spectral type as members of the Vesta family, but that lie outside the limits of the dynamical family) and determined that at least ten objects have a significant diffusion probability over the minimum estimated age of the Vesta family of 1.2 Gyr (Carruba et al. 2005, A&A, 441, 819). These objects can therefore be explained in the framework of diffusion via repeated close encounters with (4) Vesta of asteroids originally closer to the parent body. Conclusions: We computed diffusion probabilities at the location of four of these asteroids for various initial conditions, parametrized by values of the initial ejection velocity V_ej. Based on our results, we estimate the Vesta family to be (1200 ± 700) Myr old, with an initial ejection velocity of (240 ± 60) m/s. Appendices are only available in electronic form at http://www.aanda.org
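
    The propagation step itself is a convolution: p(x, t+T) = ∫ p(x−Δx, t) K(Δx) dΔx, with K the encounter pdf. A discretized toy version follows, with a Gaussian initial family and a Laplace-shaped kernel standing in for the measured pdf:

        import numpy as np

        x = np.linspace(-0.02, 0.02, 801)   # proper-element offset grid
        dx = x[1] - x[0]

        p = np.exp(-0.5 * (x / 0.002) ** 2)       # family distribution at t
        p /= p.sum() * dx

        kernel = np.exp(-np.abs(x) / 0.0005)      # encounter pdf K(dx)
        kernel /= kernel.sum() * dx

        for _ in range(50):                       # 50 steps of length T
            p = np.convolve(p, kernel, mode="same") * dx

        print(p.sum() * dx)   # probability mass (~1, minus edge leakage)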

  5. The role of correlations in uncertainty quantification of transportation relevant fuel models

    DOE PAGES

    Fridlyand, Aleksandr; Johnson, Matthew S.; Goldsborough, S. Scott; ...

    2017-02-03

    Large reaction mechanisms are often used to describe the combustion behavior of transportation-relevant fuels like gasoline, where these are typically represented by surrogate blends, e.g., n-heptane/iso-octane/toluene. We describe efforts to quantify the uncertainty in the predictions of such mechanisms at realistic engine conditions, seeking to better understand the robustness of the model as well as the important reaction pathways and their impacts on combustion behavior. In this work, we examine the importance of accounting for correlations among reactions that utilize the same rate rules, and among those with multiple product channels, in forward propagation of uncertainty by Monte Carlo simulations. Automated means are developed to generate the uncertainty factor assignment for a detailed chemical kinetic mechanism, by first uniquely identifying each reacting species, then sorting each of the reactions based on the rate rule utilized. Simulation results reveal that in the low-temperature combustion regime for iso-octane, the majority of the uncertainty in the model predictions can be attributed to low-temperature reactions of the fuel sub-mechanism. The foundational, or small-molecule, chemistry (C0-C4) contributes significantly to uncertainties in the predictions only at the highest temperatures (Tc = 900 K). Accounting for correlations between important reactions is shown to produce non-negligible differences in the estimates of uncertainty. Including correlations among reactions that use the same rate rules increases uncertainty in the model predictions, while accounting for correlations among reactions with multiple branches decreases uncertainty in some cases. Significant non-linear response is observed in the model predictions depending on how the probability distributions of the uncertain rate constants are defined. Finally, we conclude that care must be exercised in defining these probability distributions in order to reduce bias and physically unrealistic estimates in the forward propagation of uncertainty for a range of UQ activities.
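
    The correlation effect noted above can be seen in a two-reaction toy, with a log-linear surrogate in place of a kinetic mechanism. When both rate constants come from the same rate rule, one shared log-normal perturbation is applied to both; the sensitivities of 0.5 each are hypothetical:

        import numpy as np

        rng = np.random.default_rng(3)
        n, sigma = 100_000, np.log(2.0)   # factor-of-2 1-sigma uncertainty
        s1 = s2 = 0.5                     # hypothetical sensitivities

        z_ind = rng.normal(0.0, sigma, (n, 2))   # independent perturbations
        z_cor = np.repeat(rng.normal(0.0, sigma, (n, 1)), 2, axis=1)  # shared

        for name, z in [("independent", z_ind), ("rule-correlated", z_cor)]:
            log_tau = s1 * z[:, 0] + s2 * z[:, 1]   # surrogate prediction
            print(name, np.exp(log_tau).std())

    With same-sign sensitivities the correlated case is strictly wider, consistent with the finding that rate-rule correlations increase predicted uncertainty.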

  6. Probabilistic determination of the ecological risk from OTNE in aquatic and terrestrial compartments based on US-wide monitoring data.

    PubMed

    McDonough, Kathleen; Casteel, Kenneth; Zoller, Ann; Wehmeyer, Kenneth; Hulzebos, Etje; Rila, Jean-Paul; Salvito, Daniel; Federle, Thomas

    2017-01-01

    OTNE [1-(1,2,3,4,5,6,7,8-octahydro-2,3,8,8-tetramethyl-2-naphthyl)ethan-1-one; trade name Iso E Super] is a fragrance ingredient commonly used in consumer products that are disposed of down the drain. This research measured effluent and sludge concentrations of OTNE at 44 US wastewater treatment plants (WWTP). The mean effluent and sludge concentrations were 0.69 ± 0.65 μg/L and 20.6 ± 33.8 mg/kg dw, respectively. Distributions of OTNE effluent concentrations and dilution factors were used to predict surface water and sediment concentrations, and distributions of OTNE sludge concentrations and loading rates were used to predict terrestrial concentrations. The 90th percentile concentration of OTNE in US WWTP mixing zones was predicted to be 0.04 and 0.85 μg/L under mean and 7Q10 low flow (lowest river flow occurring over a 7-day period every 10 years) conditions, respectively. The 90th percentile sediment concentrations under mean and 7Q10 low flow conditions were predicted to be 0.081 and 1.6 mg/kg dw, respectively. Based on current US sludge application practices, the 90th percentile OTNE terrestrial concentration was 1.38 mg/kg dw. The probability of OTNE concentrations being below the predicted no effect concentration (PNEC) for the aquatic and sediment compartments was greater than 99%. For the terrestrial compartment, the probability of OTNE concentrations being lower than the PNEC was 97% for current US sludge application practices. Based on the results of this study, OTNE concentrations in US WWTP effluent and sludge do not pose an ecological risk to aquatic, sediment and terrestrial organisms. Copyright © 2016 Elsevier Ltd. All rights reserved.
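
    The percentile calculations lend themselves to a simple Monte Carlo sketch. The effluent distribution below is a lognormal moment-matched to the reported mean and standard deviation; the dilution-factor distribution is purely hypothetical:

        import numpy as np

        rng = np.random.default_rng(4)

        m, sd = 0.69, 0.65                  # reported effluent stats, ug/L
        mu = np.log(m**2 / np.sqrt(m**2 + sd**2))
        s = np.sqrt(np.log(1.0 + (sd / m) ** 2))
        effluent = rng.lognormal(mu, s, 1_000_000)

        dilution = rng.lognormal(np.log(20.0), 1.0, 1_000_000)  # hypothetical

        mixing_zone = effluent / dilution
        print(np.percentile(mixing_zone, 90), "ug/L (90th percentile)")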

  7. A validation study of a stochastic model of human interaction

    NASA Astrophysics Data System (ADS)

    Burchfield, Mitchel Talmadge

    The purpose of this dissertation is to validate a stochastic model of human interactions which is part of a developmentalism paradigm. Incorporating elements of ancient and contemporary philosophy and science, developmentalism defines human development as a progression of increasing competence and utilizes compatible theories of developmental psychology, cognitive psychology, educational psychology, social psychology, curriculum development, neurology, psychophysics, and physics. To validate a stochastic model of human interactions, the study addressed four research questions: (a) Does attitude vary over time? (b) What are the distributional assumptions underlying attitudes? (c) Does the stochastic model \( N\int_{-\infty}^{\infty} \varphi(\chi,\tau)\,\Psi(\tau)\,d\tau \) have utility for the study of attitudinal distributions and dynamics? (d) Are the Maxwell-Boltzmann, Fermi-Dirac, and Bose-Einstein theories applicable to human groups? Approximately 25,000 attitude observations were made using the Semantic Differential Scale. Positions of individuals varied over time, and the logistic model predicted observed distributions with correlations between 0.98 and 1.0, with estimated standard errors significantly less than the magnitudes of the parameters. The results bring into question the applicability of Fisherian research designs (Fisher, 1922, 1928, 1938) for behavioral research, based on the apparent failure of two fundamental assumptions: the noninteractive nature of the objects being studied and the normal distribution of attributes. The findings indicate that individual belief structures are representable in terms of a psychological space which has the same or similar properties as physical space. The psychological space not only has dimension, but individuals interact by force equations similar to those described in theoretical physics models. Nonlinear regression techniques were used to estimate Fermi-Dirac parameters from the data. The model explained a high degree of the variance in each probability distribution. The correlation between predicted and observed probabilities ranged from 0.955 to 0.998, indicating that humans behave in psychological space as fermions behave in momentum space.
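
    The Fermi-Dirac estimation step can be sketched with standard nonlinear least squares; the data here are synthetic, not the dissertation's attitude observations:

        import numpy as np
        from scipy.optimize import curve_fit

        def fermi_dirac(x, mu, kT):
            """Occupancy as a function of position x, location mu, width kT."""
            return 1.0 / (np.exp((x - mu) / kT) + 1.0)

        rng = np.random.default_rng(5)
        x = np.linspace(-3.0, 3.0, 25)                 # attitude positions
        y = fermi_dirac(x, 0.3, 0.7) + rng.normal(0.0, 0.02, x.size)

        (mu_hat, kT_hat), cov = curve_fit(fermi_dirac, x, y, p0=(0.0, 1.0))
        print(mu_hat, kT_hat, np.sqrt(np.diag(cov)))   # estimates and SEs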

  8. Using satellite radiotelemetry data to delineate and manage wildlife populations

    USGS Publications Warehouse

    Amstrup, Steven C.; McDonald, T.L.; Durner, George M.

    2004-01-01

    The greatest promise of radiotelemetry has always been a better understanding of animal movements. Telemetry has helped us know when animals are active, how active they are, how far and how fast they move, the geographic areas they occupy, and whether individuals vary in these traits. Unfortunately, the inability to estimate the error in animals' utilization distributions (UDs) has prevented probabilistic linkage of movement data, which are always retrospective, with future management actions. We used the example of the harvested population of polar bears (Ursus maritimus) in the Southern Beaufort Sea to illustrate a method that provides that linkage. We employed a 2-dimensional Gaussian kernel density estimator to smooth and scale frequencies of polar bear radio locations within cells of a grid overlying our study area. True 2-dimensional smoothing allowed us to create accurate descriptions of the UDs of individuals and groups of bears. We used a new method of clustering, based upon the relative use collared bears made of each cell in our grid, to assign individual animals to populations. We applied the fast Fourier transform to make bootstrapped estimates of the error in UDs computationally feasible. Clustering and kernel smoothing identified 3 populations of polar bears in the region between Wrangel Island, Russia, and Banks Island, Canada. The relative probability of occurrence of animals from each population varied significantly among grid cells distributed across the study area. We displayed occurrence probabilities as contour maps wherein each contour line corresponded with a change in relative probability. Only at the edges of our study area and in some offshore regions were bootstrapped estimates of error in occurrence probabilities too high to allow prediction. Error estimates, which also were displayed as contours, allowed us to show that occurrence probabilities did not vary by season. Near Barrow, Alaska, 50% of bears observed are predicted to be from the Chukchi Sea population and 50% from the Southern Beaufort Sea population. At Tuktoyaktuk, Northwest Territories, Canada, 50% are from the Southern Beaufort Sea and 50% from the Northern Beaufort Sea population. The methods described here will aid managers of all wildlife that can be studied by telemetry to allocate harvests and other human perturbations to the appropriate populations, make risk assessments, and predict impacts of human activities. They will aid researchers by providing the refined descriptions of study populations that are necessary for population estimation and other investigative tasks. Key words: Arctic, Beaufort Sea, boundaries, clustering, Fourier transform, kernel, management, polar bears, population delineation, radiotelemetry, satellite, smoothing, Ursus maritimus.
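
    A bare-bones version of the kernel-smoothing step, using FFT convolution (the same trick that makes bootstrap replicates of the UD affordable). The telemetry fixes are simulated and the bandwidth is arbitrary:

        import numpy as np
        from scipy.signal import fftconvolve

        rng = np.random.default_rng(6)

        # Bin simulated location fixes on a grid.
        counts, _, _ = np.histogram2d(rng.normal(0, 1, 500),
                                      rng.normal(0, 1, 500),
                                      bins=128, range=[[-4, 4], [-4, 4]])

        ax = np.arange(-20, 21)
        g = np.exp(-0.5 * (ax / 5.0) ** 2)   # 1-D Gaussian, 5-cell bandwidth
        kernel = np.outer(g, g)
        kernel /= kernel.sum()

        ud = fftconvolve(counts, kernel, mode="same")  # smoothed frequencies
        ud /= ud.sum()                                 # utilization distribution
        print(float(ud.max()))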

  9. Bayesian modeling and inference for diagnostic accuracy and probability of disease based on multiple diagnostic biomarkers with and without a perfect reference standard.

    PubMed

    Jafarzadeh, S Reza; Johnson, Wesley O; Gardner, Ian A

    2016-03-15

    The area under the receiver operating characteristic (ROC) curve (AUC) is used as a performance metric for quantitative tests. Although multiple biomarkers may be available for diagnostic or screening purposes, diagnostic accuracy is often assessed individually rather than in combination. In this paper, we consider the interesting problem of combining multiple biomarkers for use in a single diagnostic criterion with the goal of improving the diagnostic accuracy above that of an individual biomarker. The diagnostic criterion created from multiple biomarkers is based on the predictive probability of disease, conditional on given multiple biomarker outcomes. If the computed predictive probability exceeds a specified cutoff, the corresponding subject is allocated as 'diseased'. This defines a standard diagnostic criterion that has its own ROC curve, namely, the combined ROC (cROC). The AUC metric for cROC, namely, the combined AUC (cAUC), is used to compare the predictive criterion based on multiple biomarkers to one based on fewer biomarkers. A multivariate random-effects model is proposed for modeling multiple normally distributed dependent scores. Bayesian methods for estimating ROC curves and corresponding (marginal) AUCs are developed when a perfect reference standard is not available. In addition, cAUCs are computed to compare the accuracy of different combinations of biomarkers for diagnosis. The methods are evaluated using simulations and are applied to data for Johne's disease (paratuberculosis) in cattle. Copyright © 2015 John Wiley & Sons, Ltd.
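
    In the fully known-parameter case, the combined criterion reduces to scoring each subject by Bayes' rule and ranking. A toy version with simulated bivariate-normal biomarkers follows; the paper's actual setting is Bayesian with an imperfect reference standard, which this sketch ignores:

        import numpy as np
        from scipy.stats import multivariate_normal

        rng = np.random.default_rng(7)

        mean_d, mean_h = np.array([1.0, 0.8]), np.zeros(2)
        cov = np.array([[1.0, 0.4], [0.4, 1.0]])
        prev = 0.2                                   # assumed prevalence

        xd = rng.multivariate_normal(mean_d, cov, 300)   # diseased
        xh = rng.multivariate_normal(mean_h, cov, 700)   # healthy

        def p_disease(x):
            """Predictive probability of disease given both biomarkers."""
            fd = multivariate_normal.pdf(x, mean_d, cov)
            fh = multivariate_normal.pdf(x, mean_h, cov)
            return prev * fd / (prev * fd + (1 - prev) * fh)

        # cAUC via the Mann-Whitney statistic on the combined scores.
        sd, sh = p_disease(xd), p_disease(xh)
        print((sd[:, None] > sh[None, :]).mean())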

  10. Conservation status assessment of an endangered insular raptor: the Sharp-shinned Hawk in Puerto Rico

    USGS Publications Warehouse

    Gallardo, Julio C.; Vilella, Francisco

    2017-01-01

    Sharp‐shinned Hawks (Accipiter striatus) are forest raptors that are widely distributed in the Americas. A subspecies endemic to Puerto Rico (A. s. venator) is listed as endangered and restricted to mature and old secondary montane forests and shade coffee plantations. However, recent information about the population status and distribution of Puerto Rican Sharp‐shinned Hawks is lacking. We developed a spatial geographic distribution model for Sharp‐shinned Hawks in Puerto Rico from 33 locations collected during four breeding seasons (2013–2016) using biologically relevant landscape variables (aspect, canopy closure, elevation, rainfall, slope, and terrain roughness). Elevation accounted for 89.8% of the model fit and predicted that the greatest probability of occurrence of Sharp‐shinned Hawks in Puerto Rico (> 60%) was at elevations above 900 m. Based on our model, an estimated 56.1 km2 of habitat exists in Puerto Rico with a high probability of occurrence. This total represents ~0.6% of the island's area. Public lands included 43.8% of habitat with high probability of occurrence (24.6 km2), 96% of which was located within four protected areas. Our results suggest that Sharp‐shinned Hawks are rare in Puerto Rico and restricted to the higher elevations of the Cordillera Central. Additional research is needed to identify and address ecological limiting factors, and recovery actions are needed to avoid the extinction of this endemic island raptor.

  11. Extinction probabilities and stationary distributions of mobile genetic elements in prokaryotes: The birth-death-diversification model.

    PubMed

    Drakos, Nicole E; Wahl, Lindi M

    2015-12-01

    Theoretical approaches are essential to our understanding of the complex dynamics of mobile genetic elements (MGEs) within genomes. Recently, the birth-death-diversification model was developed to describe the dynamics of mobile promoters (MPs), a particular class of MGEs in prokaryotes. A unique feature of this model is that genetic diversification of elements was included. To explore the implications of diversification on the long-term fate of MGE lineages, in this contribution we analyze the extinction probabilities, extinction times and equilibrium solutions of the birth-death-diversification model. We find that diversification increases both the survival and growth rate of MGE families, but the strength of this effect depends on the rate of horizontal gene transfer (HGT). We also find that the distribution of MGE families per genome is not necessarily monotonically decreasing, as observed for MPs, but may have a peak in the distribution that is related to the HGT rate. For MPs specifically, we find that new families have a high extinction probability, and predict that the number of MPs is increasing, albeit at a very slow rate. Additionally, we develop an extension of the birth-death-diversification model which allows MGEs in different regions of the genome, for example coding and non-coding, to be described by different rates. This extension may offer a potential explanation as to why the majority of MPs are located in non-promoter regions of the genome. Copyright © 2015 Elsevier Inc. All rights reserved.
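
    For the plain linear birth-death core of such models (ignoring diversification and HGT, which the paper adds), the extinction probability of a family founded by one element is the smallest root of the embedded branching-process equation, found by fixed-point iteration:

        def extinction_prob(birth, death, tol=1e-12):
            """Extinction probability for a linear birth-death process:
            smallest fixed point of q = (death + birth*q^2)/(birth + death),
            i.e. min(1, death/birth)."""
            q = 0.0
            while True:
                q_next = (death + birth * q * q) / (birth + death)
                if abs(q_next - q) < tol:
                    return q_next
                q = q_next

        print(extinction_prob(1.2, 1.0))  # supercritical: q = d/b ~ 0.833
        print(extinction_prob(0.8, 1.0))  # subcritical: extinction certain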

  12. Pre-scission model predictions of fission fragment mass distributions for super-heavy elements

    NASA Astrophysics Data System (ADS)

    Carjan, N.; Ivanyuk, F. A.; Oganessian, Yu. Ts.

    2017-12-01

    The total deformation energy just before the moment of neck rupture is calculated for the heaviest nuclei for which spontaneous fission has been detected (279-281Ds, 281Rg and 282-284Cn). Strutinsky's prescription is used, and nuclear shapes just before scission are described in terms of Cassinian ovals defined for a fixed value of the elongation parameter α = 0.98 and generalized by the inclusion of four additional shape parameters: α1, α3, α4, and α6. Supposing that the probability of each point in the deformation space is given by a Boltzmann factor, the distribution of the fission-fragment masses is estimated. The octupole deformation α3 at scission is found to play a decisive role in determining the main feature of the mass distribution: symmetric or asymmetric. Only the inclusion of α3 leads to an asymmetric division. Finally, the calculations are extended to an unexplored region of super-heavy nuclei: the even-even Fl (Z = 114), Lv (Z = 116), Og (Z = 118) and (Z = 126) isotopes. For these nuclei, the most probable mass of the light fragment has an almost constant value (≈136), like the most probable mass of the heavy fragment in the actinide region. It is the neutron shell at N = 82 that makes this light fragment so stable. Naturally, for very neutron-deficient isotopes, the mass division becomes symmetric when N = 2 × 82.
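
    The Boltzmann-factor recipe is simple to caricature in one deformation coordinate. The energy landscape below, a quadratic plus a shell-like dip at finite octupole deformation, is invented purely to show how a shell minimum at nonzero α3 produces an asymmetric (two-humped) yield:

        import numpy as np

        a3 = np.linspace(-0.6, 0.6, 601)    # octupole-like coordinate
        E = 4.0 * a3**2 - 3.0 * np.exp(-(((np.abs(a3) - 0.35) / 0.08) ** 2))
        T = 0.8                             # temperature-like parameter, MeV

        w = np.exp(-(E - E.min()) / T)      # Boltzmann weights
        da3 = a3[1] - a3[0]
        yield_a3 = w / (w.sum() * da3)      # normalized yield vs. a3

        # With the shell dip, the yield peaks at |a3| ~ 0.35 (asymmetric
        # division); with E = 4*a3**2 alone it peaks at a3 = 0 (symmetric).
        print(a3[np.argmax(yield_a3)])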

  13. Predictions from star formation in the multiverse

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bousso, Raphael; Leichenauer, Stefan

    2010-03-15

    We compute trivariate probability distributions in the landscape, scanning simultaneously over the cosmological constant, the primordial density contrast, and spatial curvature. We consider two different measures for regulating the divergences of eternal inflation, and three different models for observers. In one model, observers are assumed to arise in proportion to the entropy produced by stars; in the others, they arise at a fixed time (5 or 10×10^9 years) after star formation. The star formation rate, which underlies all our observer models, depends sensitively on the three scanning parameters. We employ a recently developed model of star formation in the multiverse, a considerable refinement over previous treatments of the astrophysical and cosmological properties of different pocket universes. For each combination of observer model and measure, we display all single and bivariate probability distributions, both with the remaining parameter(s) held fixed and marginalized. Our results depend only weakly on the observer model but more strongly on the measure. Using the causal diamond measure, the observed parameter values (or bounds) lie within the central 2σ of nearly all probability distributions we compute, and always within 3σ. This success is encouraging and rather nontrivial, considering the large size and dimension of the parameter space. The causal patch measure gives similar results as long as curvature is negligible. If curvature dominates, the causal patch leads to a novel runaway: it prefers a negative value of the cosmological constant, with the smallest magnitude available in the landscape.

  14. Evaluation of carotid plaque echogenicity based on the integral of the cumulative probability distribution using gray-scale ultrasound images.

    PubMed

    Huang, Xiaowei; Zhang, Yanling; Meng, Long; Abbott, Derek; Qian, Ming; Wong, Kelvin K L; Zheng, Rongqing; Zheng, Hairong; Niu, Lili

    2017-01-01

    Carotid plaque echogenicity is associated with the risk of cardiovascular events. The gray-scale median (GSM) of the ultrasound image of carotid plaques has been widely used as an objective method for evaluating plaque echogenicity in patients with atherosclerosis. We proposed a computer-aided method to evaluate plaque echogenicity and compared its efficiency with GSM. One hundred and twenty-five carotid plaques (43 echo-rich, 35 intermediate, 47 echolucent) were collected from 72 patients in this study. Cumulative probability distribution curves were obtained from statistics of the pixels in the gray-level images of plaques. The area under the cumulative probability distribution curve (AUCPDC) was calculated as its integral value to evaluate plaque echogenicity. The classification accuracy for the three types of plaques is 78.4% (kappa value, κ = 0.673) when the AUCPDC is used for classifier training, whereas that for GSM is 64.8% (κ = 0.460). Receiver operating characteristic curves were produced to test the effectiveness of AUCPDC and GSM for the identification of echolucent plaques. The area under the curve (AUC) was 0.817 when AUCPDC was used for training the classifier, which is higher than that achieved using GSM (AUC = 0.746). Compared with GSM, the AUCPDC showed a borderline association with coronary heart disease (Spearman r = 0.234, p = 0.050). Our experimental results suggest that AUCPDC analysis is a promising method for evaluating plaque echogenicity and predicting cardiovascular events in patients with plaques.
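
    The AUCPDC statistic itself is a few lines given a gray-level patch; the exact normalization used in the paper may differ from the sketch below:

        import numpy as np

        def aucpdc(pixels, n_levels=256):
            """Integral of the empirical cumulative distribution of gray
            levels (assumed integers in 0..n_levels-1), scaled to [0, 1].
            Darker (echolucent) regions give larger values because their
            CDF rises earlier."""
            hist = np.bincount(pixels.ravel(), minlength=n_levels)
            cdf = np.cumsum(hist / pixels.size)
            return cdf.sum() / n_levels

        rng = np.random.default_rng(8)
        dark = rng.integers(0, 80, (64, 64))       # echolucent-like patch
        bright = rng.integers(120, 256, (64, 64))  # echo-rich-like patch
        print(aucpdc(dark), aucpdc(bright))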

  15. Joint genome-wide prediction in several populations accounting for randomness of genotypes: A hierarchical Bayes approach. I: Multivariate Gaussian priors for marker effects and derivation of the joint probability mass function of genotypes.

    PubMed

    Martínez, Carlos Alberto; Khare, Kshitij; Banerjee, Arunava; Elzo, Mauricio A

    2017-03-21

    It is important to consider heterogeneity of marker effects and allelic frequencies in across-population genome-wide prediction studies. Moreover, all regression models used in genome-wide prediction overlook the randomness of genotypes. In this study, a family of hierarchical Bayesian models was developed to perform across-population genome-wide prediction, modeling genotypes as random variables and allowing population-specific effects for each marker. Models shared a common structure and differed in the priors used and the assumption about residual variances (homogeneous or heterogeneous). Randomness of genotypes was accounted for by deriving the joint probability mass function of marker genotypes conditional on allelic frequencies and pedigree information. As a consequence, these models incorporated kinship and genotypic information that made it possible not only to account for heterogeneity of allelic frequencies, but also to include individuals with missing genotypes at some or all loci without the need for previous imputation. This was possible because the non-observed fraction of the design matrix was treated as an unknown model parameter. For each model, a simpler version ignoring population structure, but still accounting for randomness of genotypes, was proposed. Implementation of these models and computation of some criteria for model comparison were illustrated using two simulated datasets. Theoretical and computational issues along with possible applications, extensions and refinements were discussed. Some features of the models developed in this study make them promising for genome-wide prediction; the use of the information contained in the probability distribution of genotypes is perhaps the most appealing. Further studies to assess the performance of the models proposed here, and to compare them with conventional models used in genome-wide prediction, are needed. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. The Integrated Medical Model: A Probabilistic Simulation Model for Predicting In-Flight Medical Risks

    NASA Technical Reports Server (NTRS)

    Keenan, Alexandra; Young, Millennia; Saile, Lynn; Boley, Lynn; Walton, Marlei; Kerstman, Eric; Shah, Ronak; Goodenow, Debra A.; Myers, Jerry G.

    2015-01-01

    The Integrated Medical Model (IMM) is a probabilistic model that uses simulation to predict mission medical risk. Given a specific mission and crew scenario, medical events are simulated using Monte Carlo methodology to provide estimates of resource utilization, probability of evacuation, probability of loss of crew, and the amount of mission time lost due to illness. Mission and crew scenarios are defined by mission length, extravehicular activity (EVA) schedule, and crew characteristics including: sex, coronary artery calcium score, contacts, dental crowns, history of abdominal surgery, and EVA eligibility. The Integrated Medical Evidence Database (iMED) houses the model inputs for one hundred medical conditions using in-flight, analog, and terrestrial medical data. Inputs include incidence, event durations, resource utilization, and crew functional impairment. Severity of conditions is addressed by defining statistical distributions on the dichotomized best and worst-case scenarios for each condition. The outcome distributions for conditions are bounded by the treatment extremes of the fully treated scenario in which all required resources are available and the untreated scenario in which no required resources are available. Upon occurrence of a simulated medical event, treatment availability is assessed, and outcomes are generated depending on the status of the affected crewmember at the time of onset, including any pre-existing functional impairments or ongoing treatment of concurrent conditions. The main IMM outcomes, including probability of evacuation and loss of crew life, time lost due to medical events, and resource utilization, are useful in informing mission planning decisions. To date, the IMM has been used to assess mission-specific risks with and without certain crewmember characteristics, to determine the impact of eliminating certain resources from the mission medical kit, and to design medical kits that maximally benefit crew health while meeting mass and volume constraints.
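
    The simulation core reduces to drawing event counts and outcomes per condition. The three conditions and all parameters below are invented placeholders for the roughly one hundred iMED-derived inputs, and treatment is collapsed to a single kit-coverage probability:

        import numpy as np

        rng = np.random.default_rng(9)

        # (incidence per person-day, days lost if treated, P(evac) if untreated)
        conditions = [(2e-3, 0.5, 0.00), (5e-4, 2.0, 0.02), (1e-4, 5.0, 0.25)]
        crew, days, kit_coverage, trials = 4, 180, 0.9, 20_000

        n_evac, time_lost = 0, []
        for _ in range(trials):
            lost, evacuated = 0.0, False
            for incidence, days_lost, p_evac in conditions:
                for _ in range(rng.poisson(incidence * crew * days)):
                    if rng.random() < kit_coverage:   # resources available
                        lost += days_lost
                    elif rng.random() < p_evac:       # untreated, may escalate
                        evacuated = True
            n_evac += evacuated
            time_lost.append(lost)

        print(n_evac / trials, np.mean(time_lost))  # P(evac), mean days lost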

  18. Methods, apparatus and system for notification of predictable memory failure

    DOEpatents

    Cher, Chen-Yong; Andrade Costa, Carlos H.; Park, Yoonho; Rosenburg, Bryan S.; Ryu, Kyung D.

    2017-01-03

    A method for providing notification of a predictable memory failure includes the steps of: obtaining information regarding at least one condition associated with a memory; calculating a memory failure probability as a function of the obtained information; calculating a failure probability threshold; and generating a signal when the memory failure probability exceeds the failure probability threshold, the signal being indicative of a predicted future memory failure.
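
    In code, the claimed steps amount to a small monitoring routine. Everything below (the condition-to-probability mapping, its weights, and the fixed threshold) is hypothetical; a real implementation would calibrate against observed error logs, and the patent computes the threshold rather than fixing it:

        import math

        def failure_probability(temperature_c, correctable_errors, age_days):
            """Map monitored memory conditions to a failure probability
            (made-up weights, for illustration only)."""
            score = (0.02 * max(temperature_c - 60.0, 0.0)
                     + 0.05 * correctable_errors
                     + 0.0001 * age_days)
            return 1.0 - math.exp(-score)

        def check_and_notify(conditions, threshold=0.10):
            p = failure_probability(**conditions)
            if p > threshold:   # generate the notification signal
                print(f"WARN: predicted memory failure (p={p:.2f}); "
                      "migrate pages / schedule replacement")
            return p

        check_and_notify({"temperature_c": 72, "correctable_errors": 9,
                          "age_days": 900})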

  19. Predicting longitudinal trajectories of health probabilities with random-effects multinomial logit regression.

    PubMed

    Liu, Xian; Engel, Charles C

    2012-12-20

    Researchers often encounter longitudinal health data characterized by three or more ordinal or nominal categories. Random-effects multinomial logit models are generally applied to account for the potential lack of independence inherent in such clustered data. When parameter estimates are used to describe longitudinal processes, however, random effects, both between and within individuals, need to be retransformed to predict outcome probabilities correctly. This study attempts to go beyond existing work by developing a retransformation method that derives longitudinal growth trajectories of unbiased health probabilities. We estimated variances of the predicted probabilities by using the delta method. Additionally, we transformed the covariates' regression coefficients on the multinomial logit scale, which are not substantively meaningful in themselves, into conditional effects on the predicted probabilities. The empirical illustration uses longitudinal data from the Asset and Health Dynamics among the Oldest Old study. Our analysis compared three sets of predicted probabilities of three health states at six time points, obtained from, respectively, the retransformation method, the best linear unbiased prediction, and the fixed-effects approach. The results demonstrate that neglecting to retransform random errors in the random-effects multinomial logit model results in severely biased longitudinal trajectories of health probabilities as well as overestimated effects of covariates on the probabilities. Copyright © 2012 John Wiley & Sons, Ltd.
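
    The heart of the retransformation issue is that the expectation of a softmax over a random effect is not the softmax at the mean effect. A toy three-category version using Gauss-Hermite quadrature follows (hypothetical logits and random-intercept scale; the paper additionally handles within-person errors and delta-method variances):

        import numpy as np
        from numpy.polynomial.hermite import hermgauss

        def softmax(eta):
            e = np.exp(eta - eta.max())
            return e / e.sum()

        def marginal_probs(eta, sigma_u, deg=30):
            """Average softmax probabilities over a shared random
            intercept u ~ N(0, sigma_u^2) on the non-reference logits."""
            x, w = hermgauss(deg)      # nodes/weights for weight exp(-x^2)
            p = np.zeros_like(eta)
            for xi, wi in zip(x, w):
                u = np.sqrt(2.0) * sigma_u * xi
                p += wi * softmax(eta + np.array([0.0, u, u]))
            return p / np.sqrt(np.pi)

        eta = np.array([0.0, 0.8, -0.4])   # reference category + two logits
        print(softmax(eta))                # naive plug-in at u = 0 (biased)
        print(marginal_probs(eta, 1.5))    # retransformed probabilities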

  20. Precipitation Cluster Distributions: Current Climate Storm Statistics and Projected Changes Under Global Warming

    NASA Astrophysics Data System (ADS)

    Quinn, Kevin Martin

    The total amount of precipitation integrated across a precipitation cluster (contiguous precipitating grid cells exceeding a minimum rain rate) is a useful measure of the aggregate size of the disturbance, expressed as the rate of water mass lost or latent heat released, i.e. the power of the disturbance. Probability distributions of cluster power are examined during boreal summer (May-September) and winter (January-March) using satellite-retrieved rain rates from the Tropical Rainfall Measuring Mission (TRMM) 3B42 and Special Sensor Microwave Imager and Sounder (SSM/I and SSMIS) programs, model output from the High Resolution Atmospheric Model (HIRAM, roughly 0.25-0.5° resolution), seven 1-2° resolution members of the Coupled Model Intercomparison Project Phase 5 (CMIP5) experiment, and the National Center for Atmospheric Research Large Ensemble (NCAR LENS). Spatial distributions of precipitation-weighted centroids are also investigated in observations (TRMM-3B42) and climate models during winter as a metric for changes in mid-latitude storm tracks. Observed probability distributions for both seasons are scale-free from the smallest clusters up to a cutoff scale at high cluster power, after which the probability density drops rapidly. When low rain rates are excluded by choosing a minimum rain rate threshold in defining clusters, the models accurately reproduce observed cluster power statistics and winter storm tracks. Changes in behavior in the tail of the distribution, above the cutoff, are important for impacts since these quantify the frequency of the most powerful storms. End-of-century cluster power distributions and storm track locations are investigated in these models under a "business as usual" global warming scenario. The probability of high cluster power events increases by end-of-century across all models, by up to an order of magnitude for the highest-power events for which statistics can be computed. For the three models in the suite with continuous time series of high resolution output, there is substantial variability in when these probability increases for the most powerful precipitation clusters become detectable, ranging from detectable within the observational period to statistically significant trends emerging only after 2050. A similar analysis of National Centers for Environmental Prediction (NCEP) Reanalysis 2 and SSM/I-SSMIS rain rate retrievals in the recent observational record does not yield reliable evidence of trends in high-power cluster probabilities at this time. Large impacts to mid-latitude storm tracks are projected over the West Coast and eastern North America, with no less than 8 of the 9 models examined showing large increases by end-of-century in the probability density of the most powerful storms, ranging up to a factor of 6.5 in the highest range bin for which historical statistics are computed. However, within these regional domains, there is considerable variation among models in pinpointing exactly where the largest increases will occur.
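
    Extracting cluster-power statistics from a gridded rain-rate field is mechanical once a minimum rain-rate threshold is chosen; the field below is synthetic, and the physical scaling (cell area, latent heat of vaporization) is omitted:

        import numpy as np
        from scipy import ndimage

        rng = np.random.default_rng(10)

        # Synthetic rain-rate field (mm/hr) on a grid; real inputs would be
        # TRMM-3B42, SSM/I-SSMIS retrievals, or model output.
        rain = np.maximum(rng.normal(0.0, 1.0, (200, 200)), 0.0) ** 3

        threshold = 0.5   # minimum rain rate defining a cluster
        labels, n = ndimage.label(rain > threshold)   # contiguous cells
        power = ndimage.sum_labels(rain, labels, np.arange(1, n + 1))

        # Tail of the cluster-power distribution: a scale-free range,
        # then a cutoff at high power.
        print(n, np.sort(power)[-5:])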
