Sample records for gamma probability distribution

  1. Computer routines for probability distributions, random numbers, and related functions

    USGS Publications Warehouse

    Kirby, W.

    1983-01-01

    Use of previously coded and tested subroutines simplifies and speeds up program development and testing. This report presents routines that can be used to calculate various probability distributions and other functions of importance in statistical hydrology. The routines are designed as general-purpose Fortran subroutines and functions to be called from user-written main programs. The probability distributions provided include the beta, chi-square, gamma, Gaussian (normal), Pearson Type III (tables and approximation), and Weibull. Also provided are the distributions of the Grubbs-Beck outlier test, Kolmogorov's and Smirnov's D, Student's t, noncentral t (approximate), and Snedecor's F. Other mathematical functions include the Bessel function I₀, gamma and log-gamma functions, error functions, and the exponential integral. Auxiliary services include sorting and printer-plotting. Random number generators for uniform and normal numbers are provided and may be used with some of the above routines to generate numbers from other distributions. (USGS)
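
    A minimal modern analogue of such calls in Python/SciPy (illustrative parameter values; SciPy's pearson3 standing in for the report's Pearson Type III routines):

    ```python
    from scipy import stats

    # Cumulative probabilities and quantiles for distributions named in the report
    print(stats.gamma.cdf(2.5, a=2.0))        # gamma with shape a = 2
    print(stats.chi2.ppf(0.95, df=4))         # chi-square 95th percentile
    print(stats.pearson3.cdf(1.2, skew=0.5))  # Pearson Type III
    print(stats.weibull_min.cdf(1.0, c=1.5))  # Weibull with shape c = 1.5

    # Uniform and normal random numbers, as the report's generators supplied
    u = stats.uniform.rvs(size=5)
    z = stats.norm.rvs(size=5)
    ```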

  2. Computer routines for probability distributions, random numbers, and related functions

    USGS Publications Warehouse

    Kirby, W.H.

    1980-01-01

    Use of previously coded and tested subroutines simplifies and speeds up program development and testing. This report presents routines that can be used to calculate various probability distributions and other functions of importance in statistical hydrology. The routines are designed as general-purpose Fortran subroutines and functions to be called from user-written main programs. The probability distributions provided include the beta, chi-square, gamma, Gaussian (normal), Pearson Type III (tables and approximation), and Weibull. Also provided are the distributions of the Grubbs-Beck outlier test, Kolmogorov's and Smirnov's D, Student's t, noncentral t (approximate), and Snedecor's F. Other mathematical functions include the Bessel function I₀, gamma and log-gamma functions, error functions, and the exponential integral. Auxiliary services include sorting and printer plotting. Random number generators for uniform and normal numbers are provided and may be used with some of the above routines to generate numbers from other distributions. (USGS)

  3. Measurement of absolute gamma emission probabilities

    NASA Astrophysics Data System (ADS)

    Sumithrarachchi, Chandana S.; Rengan, Krish; Griffin, Henry C.

    2003-06-01

    The energies and emission probabilities (intensities) of gamma-rays emitted in radioactive decays of particular nuclides are the most important characteristics by which to quantify mixtures of radionuclides. Often, quantification is limited by uncertainties in measured intensities. A technique was developed to reduce these uncertainties. The method involves obtaining a pure sample of a nuclide using radiochemical techniques, and using appropriate fractions for beta and gamma measurements. The beta emission rates were measured using a liquid scintillation counter, and the gamma emission rates were measured with a high-purity germanium detector. Results were combined to obtain absolute gamma emission probabilities. All sources of uncertainties greater than 0.1% were examined. The method was tested with 38Cl and 88Rb.

  4. The exact probability distribution of the rank product statistics for replicated experiments.

    PubMed

    Eisinga, Rob; Breitling, Rainer; Heskes, Tom

    2013-03-18

    The rank product method is a widely accepted technique for detecting differentially regulated genes in replicated microarray experiments. To approximate the sampling distribution of the rank product statistic, the original publication proposed a permutation approach, whereas recently an alternative approximation based on the continuous gamma distribution was suggested. However, both approximations are imperfect for estimating small tail probabilities. In this paper we relate the rank product statistic to number theory and provide a derivation of its exact probability distribution and the true tail probabilities. Copyright © 2013 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.
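
    As a sketch of the continuous gamma approximation the abstract refers to (whose small-tail inaccuracy motivates the exact derivation): under the null the scaled ranks are roughly uniform on (0, 1), so minus the sum of their logs is approximately Gamma(k, 1). The ranks and gene count below are hypothetical:

    ```python
    import numpy as np
    from scipy import stats

    def rank_product_pvalue_gamma(ranks, n_genes):
        """Gamma approximation to P(rank product <= observed) under the null.

        ranks: the gene's rank in each of k replicates (1 = most regulated).
        With r/n roughly U(0,1) under the null, -sum(log(r/n)) ~ Gamma(k, 1).
        """
        u = np.asarray(ranks, dtype=float) / n_genes
        return stats.gamma.sf(-np.log(u).sum(), a=len(u))

    # Example: a gene ranked 3rd, 5th and 2nd among 1000 genes in 3 replicates
    print(rank_product_pvalue_gamma([3, 5, 2], 1000))
    ```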

  5. Inverse Gaussian gamma distribution model for turbulence-induced fading in free-space optical communication.

    PubMed

    Cheng, Mingjian; Guo, Ya; Li, Jiangting; Zheng, Xiaotong; Guo, Lixin

    2018-04-20

    We introduce an alternative distribution to the gamma-gamma (GG) distribution, called the inverse Gaussian gamma (IGG) distribution, which can efficiently describe moderate-to-strong irradiance fluctuations. The proposed stochastic model is based on a modulation process between small- and large-scale irradiance fluctuations, which are modeled by gamma and inverse Gaussian distributions, respectively. The model parameters of the IGG distribution are directly related to atmospheric parameters. The accuracy of the fit of the IGG, log-normal (LN), and GG distributions to experimental probability density functions in moderate-to-strong turbulence is compared, and the results indicate that the newly proposed IGG model provides an excellent fit to the experimental data. When the receiving diameter is comparable with the atmospheric coherence radius, the proposed IGG model can reproduce the shape of the experimental data, whereas the GG and LN models fail to match the experimental data. The fundamental channel statistics of a free-space optical communication system are also investigated in an IGG-distributed turbulent atmosphere, and a closed-form expression for the outage probability of the system is derived with Meijer's G-function.
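
    A numerical sketch of the modulation process described here, assuming independent unit-mean factors: a gamma small-scale term (shape α) and an inverse Gaussian large-scale term (shape λ). The parameter values are illustrative, and SciPy's invgauss(mu=1/λ, scale=λ) is the unit-mean inverse Gaussian:

    ```python
    import numpy as np
    from scipy import stats
    from scipy.integrate import quad

    alpha, lam = 4.0, 2.0   # hypothetical small-scale and large-scale parameters

    def igg_pdf(i):
        # I = X * Y: Y ~ Gamma(alpha, 1/alpha), X ~ inverse Gaussian (mean 1, shape lam)
        small = lambda y: stats.gamma.pdf(y, a=alpha, scale=1.0 / alpha)
        large = lambda x: stats.invgauss.pdf(x, mu=1.0 / lam, scale=lam)
        return quad(lambda x: small(i / x) / x * large(x), 0, np.inf)[0]

    print([round(igg_pdf(i), 4) for i in (0.5, 1.0, 2.0)])
    ```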

  6. Probability distributions of the electroencephalogram envelope of preterm infants.

    PubMed

    Saji, Ryoya; Hirasawa, Kyoko; Ito, Masako; Kusuda, Satoshi; Konishi, Yukuo; Taga, Gentaro

    2015-06-01

    To determine the stationary characteristics of electroencephalogram (EEG) envelopes for prematurely born (preterm) infants and investigate the intrinsic characteristics of early brain development in preterm infants. Twenty sets of EEGs recorded in neurologically normal infants with a post-conceptional age (PCA) range of 26-44 weeks (mean 37.5 ± 5.0 weeks) were analyzed. The Hilbert transform was applied to extract the envelope. We determined the suitable probability distribution of the envelope and performed a statistical analysis. It was found that (i) the probability distributions for preterm EEG envelopes were best fitted by lognormal distributions at 38 weeks PCA or less, and by gamma distributions at 44 weeks PCA; (ii) the scale parameter of the lognormal distribution had positive correlations with PCA as well as a strong negative correlation with the percentage of low-voltage activity; (iii) the shape parameter of the lognormal distribution had significant positive correlations with PCA; (iv) the statistics of the mode showed significant linear relationships with PCA and, therefore, it was considered a useful index for PCA prediction. These statistics, including the scale parameter of the lognormal distribution and the skewness and mode derived from a suitable probability distribution, may be good indexes for estimating the stationary nature of developing brain activity in preterm infants. The stationary characteristics, such as discontinuity, asymmetry, and unimodality, of preterm EEGs are well indicated by the statistics estimated from the probability distribution of the preterm EEG envelopes. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
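
    A minimal sketch of the model-selection step, comparing lognormal and gamma fits by AIC; the synthetic sample below merely stands in for an EEG envelope:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(0)
    envelope = rng.lognormal(mean=0.0, sigma=0.6, size=5000)  # stand-in data

    for name, dist in [("lognormal", stats.lognorm), ("gamma", stats.gamma)]:
        params = dist.fit(envelope, floc=0)            # location fixed at zero
        loglik = np.sum(dist.logpdf(envelope, *params))
        k = len(params) - 1                            # free parameters (loc fixed)
        print(name, "AIC =", round(2 * k - 2 * loglik, 1))
    ```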

  7. Probability distribution functions for unit hydrographs with optimization using genetic algorithm

    NASA Astrophysics Data System (ADS)

    Ghorbani, Mohammad Ali; Singh, Vijay P.; Sivakumar, Bellie; H. Kashani, Mahsa; Atre, Atul Arvind; Asadi, Hakimeh

    2017-05-01

    A unit hydrograph (UH) of a watershed may be viewed as the unit pulse response function of a linear system. In recent years, the use of probability distribution functions (pdfs) for determining a UH has received much attention. In this study, a nonlinear optimization model is developed to transmute a UH into a pdf. The potential of six popular pdfs, namely the two-parameter gamma, two-parameter Gumbel, two-parameter log-normal, two-parameter normal, three-parameter Pearson, and two-parameter Weibull distributions, is tested on data from the Lighvan catchment in Iran. The probability distribution parameters are determined using the nonlinear least squares optimization method in two ways: (1) optimization by programming in Mathematica; and (2) optimization by applying a genetic algorithm. The results are compared with those obtained by the traditional linear least squares method. The results show comparable capability and performance of the two nonlinear methods. The gamma and Pearson distributions are the most successful models in preserving the rising and recession limbs of the unit hydrographs. The log-normal distribution has a high ability in predicting both the peak flow and time to peak of the unit hydrograph. The nonlinear optimization method does not outperform the linear least squares method in determining the UH (especially for excess rainfall of one pulse), but is comparable.
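
    A sketch of the core fitting step under stated assumptions: SciPy's curve_fit (nonlinear least squares) in place of the paper's Mathematica and genetic-algorithm optimizers, with hypothetical hydrograph ordinates:

    ```python
    import numpy as np
    from scipy import stats
    from scipy.optimize import curve_fit

    # Hypothetical unit hydrograph: time (h) and discharge ordinates
    t = np.arange(1.0, 13.0)
    q = np.array([0.4, 1.8, 3.6, 4.2, 3.8, 3.0, 2.2, 1.5, 1.0, 0.6, 0.35, 0.2])

    def gamma_uh(t, shape, scale, volume):
        # two-parameter gamma pdf scaled by the hydrograph volume
        return volume * stats.gamma.pdf(t, a=shape, scale=scale)

    popt, _ = curve_fit(gamma_uh, t, q, p0=[2.0, 2.0, q.sum()])
    print("shape, scale, volume:", popt.round(3))
    ```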

  8. Modeling the probability distribution of peak discharge for infiltrating hillslopes

    NASA Astrophysics Data System (ADS)

    Baiamonte, Giorgio; Singh, Vijay P.

    2017-07-01

    Hillslope response plays a fundamental role in the prediction of peak discharge at the basin outlet. The peak discharge for the critical duration of rainfall and its probability distribution are needed for designing urban infrastructure facilities. This study derives the probability distribution, denoted as the GABS model, by coupling three models: (1) the Green-Ampt model for computing infiltration, (2) the kinematic wave model for computing the discharge hydrograph from the hillslope, and (3) the intensity-duration-frequency (IDF) model for computing design rainfall intensity. The Hortonian mechanism for runoff generation is employed for computing the surface runoff hydrograph. Since the antecedent soil moisture condition (ASMC) significantly affects the rate of infiltration, its effect on the probability distribution of peak discharge is investigated. Application to a watershed in Sicily, Italy, shows that, as the probability increases, the expected effect of ASMC in increasing the maximum discharge diminishes. Only for low probabilities is the critical duration of rainfall influenced by ASMC, whereas its effect on the peak discharge appears small at any probability. For a set of parameters, the derived probability distribution of peak discharge seems to be well fitted by the gamma distribution. Finally, an application to a small watershed, aimed at testing the possibility of arranging in advance the rational runoff coefficient tables to be used for the rational method, and a comparison between peak discharges obtained by the GABS model and those measured in an experimental flume for a loamy-sand soil, were carried out.

  9. 134Cs emission probabilities determination by gamma spectrometry

    NASA Astrophysics Data System (ADS)

    de Almeida, M. C. M.; Poledna, R.; Delgado, J. U.; Silva, R. L.; Araujo, M. T. F.; da Silva, C. J.

    2018-03-01

    The National Laboratory for Ionizing Radiation Metrology (LNMRI/IRD/CNEN) of Rio de Janeiro has performed primary and secondary standardization of different radionuclides, achieving satisfactory uncertainties. A solution of the radionuclide 134Cs was purchased from a commercial supplier for the determination of the emission probabilities of some of its energies. 134Cs is a beta-gamma emitter with a half-life of 754 days. This radionuclide is used as a standard in environmental, water, and food control, and is also important for germanium detector calibration. The gamma emission probabilities (Pγ) were determined for some of the main energies of 134Cs by the efficiency-curve method, and the absolute Pγ uncertainties obtained were below 1% (k=1).
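
    In the efficiency-curve method, the emission probability is essentially Pγ = N / (ε · A · t), with N the net photopeak counts, ε the full-energy-peak efficiency at that energy, A the standardized activity, and t the live time. A minimal sketch with hypothetical numbers (decay and coincidence-summing corrections omitted):

    ```python
    # Hypothetical numbers for one gamma line
    net_peak_counts = 1.25e5   # background-subtracted counts in the photopeak
    efficiency = 2.1e-3        # full-energy peak efficiency from the curve
    activity_bq = 5.0e4        # source activity from primary standardization
    live_time_s = 1200.0

    p_gamma = net_peak_counts / (efficiency * activity_bq * live_time_s)
    print(f"emission probability: {p_gamma:.4f}")
    ```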

  10. Time Evolving Fission Chain Theory and Fast Neutron and Gamma-Ray Counting Distributions

    DOE PAGES

    Kim, K. S.; Nakae, L. F.; Prasad, M. K.; ...

    2015-11-01

    Here, we solve a simple theoretical model of time evolving fission chains due to Feynman that generalizes and asymptotically approaches the point model theory. The point model theory has been used to analyze thermal neutron counting data. This extension of the theory underlies fast counting data for both neutrons and gamma rays from metal systems. Fast neutron and gamma-ray counting is now possible using liquid scintillator arrays with nanosecond time resolution. For individual fission chains, the differential equations describing three correlated probability distributions are solved: the time-dependent internal neutron population, accumulation of fissions in time, and accumulation of leaked neutrons in time. Explicit analytic formulas are given for correlated moments of the time evolving chain populations. The equations for random time gate fast neutron and gamma-ray counting distributions, due to randomly initiated chains, are presented. Correlated moment equations are given for both random time gate and triggered time gate counting. Explicit formulas are given for all correlated moments up to triple order, for all combinations of correlated fast neutrons and gamma rays. The nonlinear differential equations for the probabilities of time-dependent fission chain populations have a remarkably simple Monte Carlo realization. A Monte Carlo code was developed for this theory and is shown to statistically realize the solutions to the fission chain theory probability distributions. Combined with random initiation of chains and detection of external quanta, the Monte Carlo code generates time tagged data for neutron and gamma-ray counting and, from these data, the counting distributions.
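
    A toy branching-process Monte Carlo in the spirit of the chain realization described here; the fission probability and multiplicity pmf below are hypothetical, and detection and time-gating are omitted:

    ```python
    import numpy as np

    rng = np.random.default_rng(6)

    def simulate_chain(p_fission=0.3, nu_pmf=(0.0, 0.2, 0.4, 0.3, 0.1)):
        """One fission chain: each neutron either leaks or induces a fission
        that emits nu new neutrons, nu drawn from nu_pmf (values 0..4)."""
        neutrons, fissions, leaks = 1, 0, 0
        while neutrons:
            neutrons -= 1
            if rng.random() < p_fission:   # subcritical: k = 0.3 * 2.3 < 1
                fissions += 1
                neutrons += rng.choice(len(nu_pmf), p=nu_pmf)
            else:
                leaks += 1
        return fissions, leaks

    chains = np.array([simulate_chain() for _ in range(10_000)])
    print("mean fissions, mean leaked neutrons:", chains.mean(axis=0).round(3))
    ```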

  11. Novel Radiobiological Gamma Index for Evaluation of 3-Dimensional Predicted Dose Distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sumida, Iori, E-mail: sumida@radonc.med.osaka-u.ac.jp; Yamaguchi, Hajime; Kizaki, Hisao

    2015-07-15

    Purpose: To propose a gamma index-based dose evaluation index that integrates the radiobiological parameters of tumor control (TCP) and normal tissue complication probabilities (NTCP). Methods and Materials: Fifteen prostate and head and neck (H&N) cancer patients received intensity modulated radiation therapy. Before treatment, patient-specific quality assurance was conducted via beam-by-beam analysis, and beam-specific dose error distributions were generated. The predicted 3-dimensional (3D) dose distribution was calculated by back-projection of the relative dose error distribution per beam. A 3D gamma analysis of different organs (prostate: clinical [CTV] and planned target volumes [PTV], rectum, bladder, femoral heads; H&N: gross tumor volume [GTV], CTV, spinal cord, brain stem, both parotids) was performed using predicted and planned dose distributions under 2%/2 mm tolerance, and the physical gamma passing rate was calculated. TCP and NTCP values were calculated for voxels with physical gamma indices (PGI) >1. We propose a new radiobiological gamma index (RGI) to quantify the radiobiological effects of TCP and NTCP and calculate radiobiological gamma passing rates. Results: The mean RGI gamma passing rates for prostate cases were significantly different compared with those of PGI (P<.03–.001). The mean RGI gamma passing rates for H&N cases (except for GTV) were significantly different compared with those of PGI (P<.001). Differences in gamma passing rates between PGI and RGI were due to dose differences between the planned and predicted dose distributions. The radiobiological gamma distribution was visualized to identify areas where the dose was radiobiologically important. Conclusions: RGI was proposed to integrate radiobiological effects into PGI. This index would assist physicians and medical physicists not only in physical evaluations of treatment delivery accuracy, but also in clinical evaluations of predicted dose distribution.
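
    For reference, a one-dimensional sketch of the physical gamma index (PGI) that the proposed RGI builds on, under a 2%/2 mm criterion; the dose profiles are hypothetical and the TCP/NTCP weighting of the RGI is not reproduced:

    ```python
    import numpy as np

    def gamma_index_1d(x, dose_ref, dose_eval, dta_mm=2.0, dd_frac=0.02):
        """Physical gamma index on a 1-D dose profile (global 2%/2 mm)."""
        dd_abs = dd_frac * dose_ref.max()
        gammas = np.empty_like(dose_ref)
        for i, (xi, di) in enumerate(zip(x, dose_ref)):
            dist2 = ((x - xi) / dta_mm) ** 2
            dose2 = ((dose_eval - di) / dd_abs) ** 2
            gammas[i] = np.sqrt(np.min(dist2 + dose2))
        return gammas

    x = np.linspace(0, 50, 101)                    # position, mm
    ref = np.exp(-((x - 25) / 10) ** 2)            # hypothetical planned profile
    ev = np.exp(-((x - 25.5) / 10) ** 2) * 1.01    # hypothetical predicted profile
    g = gamma_index_1d(x, ref, ev)
    print("passing rate:", np.mean(g <= 1.0))
    ```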

  12. Maximizing a Probability: A Student Workshop on an Application of Continuous Distributions

    ERIC Educational Resources Information Center

    Griffiths, Martin

    2010-01-01

    For many students meeting, say, the gamma distribution for the first time, it may well turn out to be a rather fruitless encounter unless they are immediately able to see an application of this probability model to some real-life situation. With this in mind, we pose here an appealing problem that can be used as the basis for a workshop activity…

  13. The probability distribution model of air pollution index and its dominants in Kuala Lumpur

    NASA Astrophysics Data System (ADS)

    AL-Dhurafi, Nasr Ahmed; Razali, Ahmad Mahir; Masseran, Nurulkamal; Zamzuri, Zamira Hasanah

    2016-11-01

    This paper focuses on statistical modeling of the distributions of the air pollution index (API) and its sub-index data observed at Kuala Lumpur in Malaysia. Five pollutants or sub-indexes are measured, including carbon monoxide (CO), sulphur dioxide (SO2), nitrogen dioxide (NO2), and particulate matter (PM10). Four probability distributions are considered, namely the log-normal, exponential, gamma, and Weibull, in search of the best-fit distribution for the Malaysian air pollutant data. In order to determine the best distribution for describing the air pollutant data, five goodness-of-fit criteria are applied. This will help in minimizing the uncertainty in pollution resource estimates and improving the assessment phase of planning. Conflicts in the criterion results for selecting the best distribution were overcome by using the weight-of-ranks method. We found that the gamma distribution is the best distribution for the majority of air pollutant data in Kuala Lumpur.
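
    A sketch of the distribution-selection step under a single criterion (the KS statistic); synthetic data stand in for a sub-index series, and the paper's weight-of-ranks combination of five criteria is reduced to one ranking here:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    pm10 = rng.gamma(shape=3.0, scale=20.0, size=365)  # stand-in for a PM10 series

    candidates = {"log-normal": stats.lognorm, "exponential": stats.expon,
                  "gamma": stats.gamma, "weibull": stats.weibull_min}
    scores = {}
    for name, dist in candidates.items():
        params = dist.fit(pm10, floc=0)                # location fixed at zero
        scores[name] = stats.kstest(pm10, dist.cdf, args=params).statistic

    for name in sorted(scores, key=scores.get):        # smaller KS = better fit
        print(f"{name}: KS = {scores[name]:.4f}")
    ```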

  14. Use of the gamma distribution to represent monthly rainfall in Africa for drought monitoring applications

    USGS Publications Warehouse

    Husak, Gregory J.; Michaelsen, Joel C.; Funk, Christopher C.

    2007-01-01

    Evaluating a range of scenarios that accurately reflect precipitation variability is critical for water resource applications. Inputs to these applications can be provided using location- and interval-specific probability distributions. These distributions make it possible to estimate the likelihood of rainfall being within a specified range. In this paper, we demonstrate the feasibility of fitting cell-by-cell probability distributions to grids of monthly interpolated, continent-wide data. Future work will then detail applications of these grids to improved satellite-remote sensing of drought and interpretations of probabilistic climate outlook forum forecasts. The gamma distribution is well suited to these applications because it is fairly familiar to African scientists and capable of representing a variety of distribution shapes. This study tests the goodness-of-fit using the Kolmogorov–Smirnov (KS) test, and compares these results against another distribution commonly used for rainfall, the Weibull. The gamma distribution is suitable for roughly 98% of the locations over all months. The techniques and results presented in this study provide a foundation for use of the gamma distribution to generate drivers for various rain-related models. These models are used as decision support tools for the management of water and agricultural resources as well as food reserves by providing decision makers with ways to evaluate the likelihood of various rainfall accumulations and assess different scenarios in Africa.
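
    A cell-by-cell sketch of this fitting-and-testing workflow with synthetic monthly totals; in the paper the loop runs over grid cells and calendar months, with the Weibull fitted for comparison:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    # Hypothetical July rainfall totals (mm) for three grid cells over 30 years
    grid = rng.gamma(shape=2.0, scale=40.0, size=(3, 30))

    for cell, sample in enumerate(grid):
        a, loc, scale = stats.gamma.fit(sample, floc=0)
        p = stats.kstest(sample, stats.gamma.cdf, args=(a, loc, scale)).pvalue
        verdict = "fit accepted" if p > 0.05 else "fit rejected"
        print(f"cell {cell}: shape={a:.2f}, scale={scale:.1f}, KS p={p:.2f} -> {verdict}")
    ```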

  15. Some properties of a 5-parameter bivariate probability distribution

    NASA Technical Reports Server (NTRS)

    Tubbs, J. D.; Brewer, D. W.; Smith, O. E.

    1983-01-01

    A five-parameter bivariate gamma distribution having two shape parameters, two location parameters and a correlation parameter was developed. This more general bivariate gamma distribution reduces to the known four-parameter distribution. The five-parameter distribution gives a better fit to the gust data. The statistical properties of this general bivariate gamma distribution and a hypothesis test were investigated. Although these developments have come too late in the Shuttle program to be used directly as design criteria for ascent wind gust loads, the new wind gust model has helped to explain the wind profile conditions which cause large dynamic loads. Other potential applications of the newly developed five-parameter bivariate gamma distribution are in the areas of reliability theory, signal noise, and vibration mechanics.
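
    For intuition, a correlated bivariate gamma pair can be built by trivariate reduction (a shared gamma component); this is a simpler construction than the paper's five-parameter family, shown only to illustrate how a correlation parameter arises:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # X = G0 + G1, Y = G0 + G2 with independent gammas: both marginals are gamma
    a_shared, a1, a2, theta = 1.0, 2.0, 1.5, 1.0
    g0 = rng.gamma(a_shared, theta, size=100_000)
    x = g0 + rng.gamma(a1, theta, size=100_000)
    y = g0 + rng.gamma(a2, theta, size=100_000)

    print("sample corr(X, Y):", np.corrcoef(x, y)[0, 1].round(3))
    # theory: a_shared / sqrt((a_shared + a1) * (a_shared + a2))
    print("theoretical corr :", a_shared / np.sqrt((a_shared + a1) * (a_shared + a2)))
    ```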

  16. Univariate Probability Distributions

    ERIC Educational Resources Information Center

    Leemis, Lawrence M.; Luckett, Daniel J.; Powell, Austin G.; Vermeer, Peter E.

    2012-01-01

    We describe a web-based interactive graphic that can be used as a resource in introductory classes in mathematical statistics. This interactive graphic presents 76 common univariate distributions and gives details on (a) various features of the distribution such as the functional form of the probability density function and cumulative distribution…

  17. A method to describe inelastic gamma field distribution in neutron gamma density logging.

    PubMed

    Zhang, Feng; Zhang, Quanying; Liu, Juntao; Wang, Xinguang; Wu, He; Jia, Wenbao; Ti, Yongzhou; Qiu, Fei; Zhang, Xiaoyang

    2017-11-01

    Pulsed neutron gamma density logging (NGD) is of great significance for radioprotection and density measurement in LWD; however, current methods have difficulty with quantitative calculation and single-factor analysis of the inelastic gamma field distribution. In order to clarify the NGD mechanism, a new method is developed to describe the inelastic gamma field distribution. Based on fast-neutron scattering and gamma attenuation, the inelastic gamma field distribution is characterized by the inelastic scattering cross section, the fast-neutron scattering free path, the formation density, and other parameters, and the contribution of the formation parameters to the field distribution is quantitatively analyzed. The results show that the contribution of density attenuation is opposite to that of the inelastic scattering cross section and the fast-neutron scattering free path. As the detector spacing increases, density attenuation gradually plays the dominant role in the gamma field distribution, which means that a large detector spacing is more favorable for density measurement. In addition, the relationship between density sensitivity and detector spacing was studied according to this gamma field distribution, and the spacings of the near and far gamma-ray detectors were thereby determined. The research provides theoretical guidance for tool parameter design and density determination in the pulsed neutron gamma density logging technique. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Statistical measurement of the gamma-ray source-count distribution as a function of energy

    NASA Astrophysics Data System (ADS)

    Zechlin, H.-S.; Cuoco, A.; Donato, F.; Fornengo, N.; Regis, M.

    2017-01-01

    Photon count statistics have recently been proven to provide a sensitive observable for characterizing gamma-ray source populations and for measuring the composition of the gamma-ray sky. In this work, we generalize the use of the standard 1-point probability distribution function (1pPDF) to decompose the high-latitude gamma-ray emission observed with Fermi-LAT into: (i) point-source contributions, (ii) the Galactic foreground contribution, and (iii) a diffuse isotropic background contribution. We analyze gamma-ray data in five adjacent energy bands between 1 and 171 GeV. We measure the source-count distribution dN/dS as a function of energy, and demonstrate that our results extend current measurements from source catalogs to the regime of so-far undetected sources. Our method improves the sensitivity for resolving point-source populations by about one order of magnitude in flux. The dN/dS distribution as a function of flux is found to be compatible with a broken power law. We derive upper limits on further possible breaks as well as the angular power of unresolved sources. We discuss the composition of the gamma-ray sky and the capabilities of the 1pPDF method.

  19. Simulated and measured neutron/gamma light output distribution for poly-energetic neutron/gamma sources

    NASA Astrophysics Data System (ADS)

    Hosseini, S. A.; Zangian, M.; Aghabozorgi, S.

    2018-03-01

    In the present paper, the light output distribution due to a poly-energetic neutron/gamma (neutron or gamma) source was calculated using the developed MCNPX-ESUT-PE (MCNPX-Energy engineering of Sharif University of Technology-Poly Energetic version) computational code. The simulation of the light output distribution includes modeling the particle transport, calculating the scintillation photons induced by charged particles, simulating the scintillation photon transport, and applying the light resolution obtained from experiment. The developed computational code is able to simulate the light output distribution due to any neutron/gamma source. In the experimental step of the present study, neutron-gamma discrimination based on the light output distribution was performed using the zero-crossing method. As a case study, a 241Am-9Be source was considered, and the simulated and measured neutron/gamma light output distributions were compared. There is an acceptable agreement between the discriminated neutron/gamma light output distributions obtained from the simulation and the experiment.

  20. The Estimation of Tree Posterior Probabilities Using Conditional Clade Probability Distributions

    PubMed Central

    Larget, Bret

    2013-01-01

    In this article I introduce the idea of conditional independence of separated subtrees as a principle by which to estimate the posterior probability of trees using conditional clade probability distributions rather than simple sample relative frequencies. I describe an algorithm for these calculations and software which implements these ideas. I show that these alternative calculations are very similar to simple sample relative frequencies for high probability trees but are substantially more accurate for relatively low probability trees. The method allows the posterior probability of unsampled trees to be calculated when these trees contain only clades that are in other sampled trees. Furthermore, the method can be used to estimate the total probability of the set of sampled trees which provides a measure of the thoroughness of a posterior sample. [Bayesian phylogenetics; conditional clade distributions; improved accuracy; posterior probabilities of trees.] PMID:23479066

  1. Statistical Measurement of the Gamma-Ray Source-count Distribution as a Function of Energy

    NASA Astrophysics Data System (ADS)

    Zechlin, Hannes-S.; Cuoco, Alessandro; Donato, Fiorenza; Fornengo, Nicolao; Regis, Marco

    2016-08-01

    Statistical properties of photon count maps have recently been proven to be a new tool to study the composition of the gamma-ray sky with high precision. We employ the 1-point probability distribution function of six years of Fermi-LAT data to measure the source-count distribution dN/dS and the diffuse components of the high-latitude gamma-ray sky as a function of energy. To that aim, we analyze the gamma-ray emission in five adjacent energy bands between 1 and 171 GeV. It is demonstrated that the source-count distribution as a function of flux is compatible with a broken power law up to energies of ~50 GeV. The index below the break is between 1.95 and 2.0. For higher energies, a simple power law fits the data, with an index of 2.2^{+0.7}_{-0.3} in the energy band between 50 and 171 GeV. Upper limits on further possible breaks as well as the angular power of unresolved sources are derived. We find that point-source populations probed by this method can explain 83^{+7}_{-13}% (81^{+52}_{-19}%) of the extragalactic gamma-ray background between 1.04 and 1.99 GeV (50 and 171 GeV). The method has excellent capabilities for constraining the gamma-ray luminosity function and the spectra of unresolved blazars.

  2. Statistical measurement of the gamma-ray source-count distribution as a function of energy

    DOE PAGES

    Zechlin, Hannes-S.; Cuoco, Alessandro; Donato, Fiorenza; ...

    2016-07-29

    Statistical properties of photon count maps have recently been proven to be a new tool to study the composition of the gamma-ray sky with high precision. Here, we employ the 1-point probability distribution function of six years of Fermi-LAT data to measure the source-count distribution dN/dS and the diffuse components of the high-latitude gamma-ray sky as a function of energy. To that aim, we analyze the gamma-ray emission in five adjacent energy bands between 1 and 171 GeV. It is demonstrated that the source-count distribution as a function of flux is compatible with a broken power law up to energies of ~50 GeV. Furthermore, the index below the break is between 1.95 and 2.0. For higher energies, a simple power law fits the data, with an index of $${2.2}_{-0.3}^{+0.7}$$ in the energy band between 50 and 171 GeV. Upper limits on further possible breaks as well as the angular power of unresolved sources are derived. We find that point-source populations probed by this method can explain $${83}_{-13}^{+7}$$% ($${81}_{-19}^{+52}$$%) of the extragalactic gamma-ray background between 1.04 and 1.99 GeV (50 and 171 GeV). Our method has excellent capabilities for constraining the gamma-ray luminosity function and the spectra of unresolved blazars.

  3. Statistical measurement of the gamma-ray source-count distribution as a function of energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zechlin, Hannes-S.; Cuoco, Alessandro; Donato, Fiorenza

    Statistical properties of photon count maps have recently been proven to be a new tool to study the composition of the gamma-ray sky with high precision. Here, we employ the 1-point probability distribution function of six years of Fermi-LAT data to measure the source-count distribution dN/dS and the diffuse components of the high-latitude gamma-ray sky as a function of energy. To that aim, we analyze the gamma-ray emission in five adjacent energy bands between 1 and 171 GeV. It is demonstrated that the source-count distribution as a function of flux is compatible with a broken power law up to energies of ~50 GeV. Furthermore, the index below the break is between 1.95 and 2.0. For higher energies, a simple power law fits the data, with an index of $${2.2}_{-0.3}^{+0.7}$$ in the energy band between 50 and 171 GeV. Upper limits on further possible breaks as well as the angular power of unresolved sources are derived. We find that point-source populations probed by this method can explain $${83}_{-13}^{+7}$$% ($${81}_{-19}^{+52}$$%) of the extragalactic gamma-ray background between 1.04 and 1.99 GeV (50 and 171 GeV). Our method has excellent capabilities for constraining the gamma-ray luminosity function and the spectra of unresolved blazars.

  4. Constructing inverse probability weights for continuous exposures: a comparison of methods.

    PubMed

    Naimi, Ashley I; Moodie, Erica E M; Auger, Nathalie; Kaufman, Jay S

    2014-03-01

    Inverse probability-weighted marginal structural models with binary exposures are common in epidemiology. Constructing inverse probability weights for a continuous exposure can be complicated by the presence of outliers, and the need to identify a parametric form for the exposure and account for nonconstant exposure variance. We explored the performance of various methods to construct inverse probability weights for continuous exposures using Monte Carlo simulation. We generated two continuous exposures and binary outcomes using data sampled from a large empirical cohort. The first exposure followed a normal distribution with homoscedastic variance. The second exposure followed a contaminated Poisson distribution, with heteroscedastic variance equal to the conditional mean. We assessed six methods to construct inverse probability weights using: a normal distribution, a normal distribution with heteroscedastic variance, a truncated normal distribution with heteroscedastic variance, a gamma distribution, a t distribution (1, 3, and 5 degrees of freedom), and a quantile binning approach (based on 10, 15, and 20 exposure categories). We estimated the marginal odds ratio for a single-unit increase in each simulated exposure in a regression model weighted by the inverse probability weights constructed using each approach, and then computed the bias and mean squared error for each method. For the homoscedastic exposure, the standard normal, gamma, and quantile binning approaches performed best. For the heteroscedastic exposure, the quantile binning, gamma, and heteroscedastic normal approaches performed best. Our results suggest that the quantile binning approach is a simple and versatile way to construct inverse probability weights for continuous exposures.
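
    A minimal sketch of the first approach compared in the paper: stabilized weights from a normal conditional density, using a simulated confounder and exposure (all names and values hypothetical):

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    x = rng.normal(size=2000)                      # confounder
    a = 1.0 + 0.5 * x + rng.normal(size=2000)      # continuous exposure

    # Denominator: normal density of A given X, from a linear model for E[A|X]
    X = np.column_stack([np.ones_like(x), x])
    beta, *_ = np.linalg.lstsq(X, a, rcond=None)
    resid = a - X @ beta
    dens_cond = stats.norm.pdf(a, loc=X @ beta, scale=resid.std())

    # Numerator: marginal density of A (stabilization)
    dens_marg = stats.norm.pdf(a, loc=a.mean(), scale=a.std())

    sw = dens_marg / dens_cond
    print("mean weight (should be near 1):", sw.mean().round(3), " max:", sw.max().round(2))
    ```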

  5. Optimal methods for fitting probability distributions to propagule retention time in studies of zoochorous dispersal.

    PubMed

    Viana, Duarte S; Santamaría, Luis; Figuerola, Jordi

    2016-02-01

    Propagule retention time is a key factor in determining propagule dispersal distance and the shape of "seed shadows". Propagules dispersed by animal vectors are either ingested and retained in the gut until defecation or attached externally to the body until detachment. Retention time is a continuous variable, but it is commonly measured at discrete time points, according to pre-established sampling time-intervals. Although parametric continuous distributions have been widely fitted to these interval-censored data, the performance of different fitting methods has not been evaluated. To investigate the performance of five different fitting methods, we fitted parametric probability distributions to typical discretized retention-time data with known distribution, using as data points either the lower, mid, or upper bounds of the sampling intervals, as well as the cumulative distribution of observed values (using either maximum likelihood or non-linear least squares for parameter estimation); we then compared the estimated and original distributions to assess the accuracy of each method. We also assessed the robustness of these methods to variations in the sampling procedure (sample size and length of sampling time-intervals). Fittings to the cumulative distribution performed better for all types of parametric distributions (lognormal, gamma and Weibull distributions) and were more robust to variations in sample size and sampling time-intervals. These estimated distributions had negligible deviations of up to 0.045 in cumulative probability of retention times (according to the Kolmogorov-Smirnov statistic) in relation to the original distributions from which propagule retention time was simulated, supporting the overall accuracy of this fitting method. In contrast, fitting the sampling-interval bounds resulted in greater deviations, ranging from 0.058 to 0.273 in cumulative probability of retention times, which may introduce considerable biases in parameter estimates.
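
    A sketch of the best-performing method reported here: least-squares fitting of a parametric CDF to the cumulative distribution of observed retention times, with hypothetical interval-censored data and the gamma as candidate distribution:

    ```python
    import numpy as np
    from scipy import stats
    from scipy.optimize import curve_fit

    # Hypothetical data: upper bounds of sampling intervals (h) and the
    # cumulative fraction of propagules recovered by each time
    t_upper = np.array([1, 2, 4, 8, 12, 24], dtype=float)
    cum_frac = np.array([0.10, 0.28, 0.55, 0.80, 0.91, 1.00])

    def gamma_cdf(t, shape, scale):
        return stats.gamma.cdf(t, a=shape, scale=scale)

    popt, _ = curve_fit(gamma_cdf, t_upper, cum_frac, p0=[2.0, 4.0])
    print("fitted shape, scale:", popt.round(3))
    ```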

  6. The Z → cc̄ → γγ*, Z → bb̄ → γγ* triangle diagrams and the Z → γψ, Z → γΥ decays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Achasov, N. N., E-mail: achasov@math.nsc.ru

    2011-03-15

    The approach to the Z → γψ and Z → γΥ decay study is presented in detail, based on the sum rules for the Z → cc̄ → γγ* and Z → bb̄ → γγ* amplitudes and their derivatives. The branching ratios of the Z → γψ and Z → γΥ decays are calculated for different hypotheses on saturation of the sum rules. The lower bounds Σ_ψ BR(Z → γψ) = 1.95 × 10⁻⁷ and Σ_Υ BR(Z → γΥ) = 7.23 × 10⁻⁷ are found. Deviations from the lower bounds are discussed, including the possibility of BR(Z → γJ/ψ(1S)) ≈ BR(Z → γΥ(1S)) ≈ 10⁻⁶, which could probably be measured at the LHC. The angular distributions in the Z → γψ and Z → γΥ decays are also calculated.

  7. The estimation of tree posterior probabilities using conditional clade probability distributions.

    PubMed

    Larget, Bret

    2013-07-01

    In this article I introduce the idea of conditional independence of separated subtrees as a principle by which to estimate the posterior probability of trees using conditional clade probability distributions rather than simple sample relative frequencies. I describe an algorithm for these calculations and software which implements these ideas. I show that these alternative calculations are very similar to simple sample relative frequencies for high probability trees but are substantially more accurate for relatively low probability trees. The method allows the posterior probability of unsampled trees to be calculated when these trees contain only clades that are in other sampled trees. Furthermore, the method can be used to estimate the total probability of the set of sampled trees which provides a measure of the thoroughness of a posterior sample.

  8. Fitness Probability Distribution of Bit-Flip Mutation.

    PubMed

    Chicano, Francisco; Sutton, Andrew M; Whitley, L Darrell; Alba, Enrique

    2015-01-01

    Bit-flip mutation is a common mutation operator for evolutionary algorithms applied to optimize functions over binary strings. In this paper, we develop results from the theory of landscapes and Krawtchouk polynomials to exactly compute the probability distribution of fitness values of a binary string undergoing uniform bit-flip mutation. We prove that this probability distribution can be expressed as a polynomial in p, the probability of flipping each bit. We analyze these polynomials and provide closed-form expressions for an easy linear problem (Onemax), and an NP-hard problem, MAX-SAT. We also discuss a connection of the results with runtime analysis.
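
    For the Onemax case, the exact offspring-fitness distribution is a short binomial convolution, polynomial in p as the paper proves. A minimal sketch (string length and parent fitness are illustrative):

    ```python
    import numpy as np
    from scipy import stats

    def onemax_offspring_pmf(n, f, p):
        """Exact pmf of offspring fitness for Onemax under bit-flip mutation.

        The parent has f ones; it loses B1 ~ Bin(f, p) ones and gains
        B2 ~ Bin(n - f, p); offspring fitness (f - B1) + B2 has support 0..n.
        """
        pmf_kept = stats.binom.pmf(np.arange(f + 1), f, p)[::-1]   # pmf of f - B1
        pmf_gain = stats.binom.pmf(np.arange(n - f + 1), n - f, p)
        return np.convolve(pmf_kept, pmf_gain)                     # support 0..n

    pmf = onemax_offspring_pmf(n=20, f=15, p=1 / 20)
    print("P(offspring strictly better):", pmf[16:].sum().round(4))
    ```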

  9. Incorporating Skew into RMS Surface Roughness Probability Distribution

    NASA Technical Reports Server (NTRS)

    Stahl, Mark T.; Stahl, H. Philip.

    2013-01-01

    The standard treatment of RMS surface roughness data is the application of a Gaussian probability distribution. This handling of surface roughness ignores the skew present in the surface and overestimates the most probable RMS of the surface, the mode. Using experimental data we confirm the Gaussian distribution overestimates the mode and application of an asymmetric distribution provides a better fit. Implementing the proposed asymmetric distribution into the optical manufacturing process would reduce the polishing time required to meet surface roughness specifications.
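
    A small illustration of the point being made: a symmetric (normal) fit places the mode at the sample mean, while a skewed model shifts it lower. The lognormal below is one possible asymmetric choice, not necessarily the distribution the authors used, and the data are synthetic:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    rms = rng.lognormal(mean=1.0, sigma=0.4, size=500)   # stand-in roughness data

    # Symmetric fit: the mode equals the mean, overestimating the most probable RMS
    mu, sigma = stats.norm.fit(rms)

    # Asymmetric fit: lognormal mode = scale * exp(-s^2)
    s, loc, scale = stats.lognorm.fit(rms, floc=0)
    mode_lognorm = scale * np.exp(-s**2)

    print(f"normal-fit mode: {mu:.2f}, skewed-fit mode: {mode_lognorm:.2f}")
    ```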

  10. A bivariate gamma probability distribution with application to gust modeling. [for the ascent flight of the space shuttle

    NASA Technical Reports Server (NTRS)

    Smith, O. E.; Adelfang, S. I.; Tubbs, J. D.

    1982-01-01

    A five-parameter bivariate gamma distribution (BGD) having two shape parameters, two location parameters, and a correlation parameter is investigated. This general BGD is expressed as a double series and as a single series of the modified Bessel function. It reduces to the known special case for equal shape parameters. Practical functions for computer evaluation of the general BGD and of special cases are presented. Applications to wind gust modeling for the ascent flight of the space shuttle are illustrated.

  11. Estimation of neutron energy distributions from prompt gamma emissions

    NASA Astrophysics Data System (ADS)

    Panikkath, Priyada; Udupi, Ashwini; Sarkar, P. K.

    2017-11-01

    A technique of estimating the incident neutron energy distribution from emitted prompt gamma intensities from a system exposed to neutrons is presented. The emitted prompt gamma intensities, or the measured photopeaks in a gamma detector, are related to the incident neutron energy distribution through a convolution of the response of the system generating the prompt gammas to mono-energetic neutrons. Presently, the system studied is a cylinder of high-density polyethylene (HDPE) placed inside another cylinder of borated HDPE (BHDPE) with an outer Pb cover, exposed to neutrons. The five emitted prompt gamma peaks from hydrogen, boron, carbon, and lead can be utilized to unfold the incident neutron energy distribution as an under-determined deconvolution problem. Such an under-determined set of equations is solved using the genetic-algorithm-based Monte Carlo deconvolution code GAMCD. Feasibility of the proposed technique is demonstrated theoretically using the Monte Carlo calculated response matrix and intensities of emitted prompt gammas from the Pb-covered BHDPE-HDPE system for several incident neutron spectra spanning different energy ranges.
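
    A stand-in sketch of the unfolding step using non-negative least squares instead of the paper's genetic-algorithm code GAMCD, with a made-up response matrix (5 peaks, 8 energy groups, hence under-determined):

    ```python
    import numpy as np
    from scipy.optimize import nnls

    rng = np.random.default_rng(4)
    # Hypothetical response matrix: 5 prompt-gamma peaks x 8 neutron energy groups
    R = rng.uniform(0.05, 1.0, size=(5, 8))
    true_spectrum = rng.uniform(0.0, 2.0, size=8)
    measured = R @ true_spectrum                 # noiseless peak intensities

    # Non-negative least squares in place of the GA-based deconvolution
    spectrum, resid = nnls(R, measured)
    print("unfolded:", spectrum.round(2))
    print("true    :", true_spectrum.round(2), " residual:", round(resid, 6))
    ```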

  12. Probability distributions for multimeric systems.

    PubMed

    Albert, Jaroslav; Rooman, Marianne

    2016-01-01

    We propose a fast and accurate method of obtaining the equilibrium mono-modal joint probability distributions for multimeric systems. The method requires only two assumptions: that the copy number of all species of molecule may be treated as continuous, and that the probability density functions (pdfs) are well approximated by multivariate skew normal distributions (MSND). Starting from the master equation, we convert the problem into a set of equations for the statistical moments, which are then expressed in terms of the parameters intrinsic to the MSND. Using an optimization package in Mathematica, we minimize a Euclidean distance function comprising a sum of the squared differences between the left- and right-hand sides of these equations. Comparison of results obtained via our method with those rendered by the Gillespie algorithm demonstrates our method to be highly accurate as well as efficient.

  13. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions.

    PubMed

    Dinov, Ivo D; Siegrist, Kyle; Pearl, Dennis K; Kalinin, Alexandr; Christou, Nicolas

    2016-06-01

    Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome , which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the

  14. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions

    PubMed Central

    Dinov, Ivo D.; Siegrist, Kyle; Pearl, Dennis K.; Kalinin, Alexandr; Christou, Nicolas

    2015-01-01

    Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome, which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the

  15. On the emergence of a generalised Gamma distribution. Application to traded volume in financial markets

    NASA Astrophysics Data System (ADS)

    Duarte Queirós, S. M.

    2005-08-01

    This letter reports on a stochastic dynamical scenario whose associated stationary probability density function is exactly a generalised form, with a power law instead of exponential decay, of the ubiquitous Gamma distribution. This generalisation, also known as the F-distribution, was first proposed empirically to fit high-frequency stock traded volume distributions in financial markets and was verified in experiments with granular material. The dynamical assumption presented herein is based on local temporal fluctuations of the average value of the observable under study. This proposal is related to superstatistics and thus to the current nonextensive statistical mechanics framework. For the specific case of stock traded volume, we connect the local fluctuations in the mean stock traded volume with the typical herding behaviour presented by financial traders. Last of all, NASDAQ 1- and 2-minute stock traded volume sequences and probability density functions are numerically reproduced.

  16. Sampling probability distributions of lesions in mammograms

    NASA Astrophysics Data System (ADS)

    Looney, P.; Warren, L. M.; Dance, D. R.; Young, K. C.

    2015-03-01

    One approach to image perception studies in mammography using virtual clinical trials involves the insertion of simulated lesions into normal mammograms. To facilitate this, a method has been developed that allows for sampling of lesion positions across the cranio-caudal and medio-lateral radiographic projections in accordance with measured distributions of real lesion locations. 6825 mammograms from our mammography image database were segmented to find the breast outline. The outlines were averaged and smoothed to produce an average outline for each laterality and radiographic projection. Lesions in 3304 mammograms with malignant findings were mapped on to a standardised breast image corresponding to the average breast outline using piecewise affine transforms. A four-dimensional probability distribution function was found from the lesion locations in the cranio-caudal and medio-lateral radiographic projections for calcification and non-calcification lesions. Lesion locations sampled from this probability distribution function were mapped on to individual mammograms using a piecewise affine transform which transforms the average outline to the outline of the breast in the mammogram. The four-dimensional probability distribution function was validated by comparing it to the two-dimensional distributions found by considering each radiographic projection and laterality independently. The correlation of the location of the lesions sampled from the four-dimensional probability distribution function across radiographic projections was shown to match the correlation of the locations of the original mapped lesion locations. The current system has been implemented as a web service using the Python Django framework. The server performs the sampling, performs the mapping, and returns the results in JavaScript Object Notation (JSON) format.

  17. Exact probability distribution functions for Parrondo's games

    NASA Astrophysics Data System (ADS)

    Zadourian, Rubina; Saakian, David B.; Klümper, Andreas

    2016-12-01

    We study the discrete time dynamics of Brownian ratchet models and Parrondo's games. Using the Fourier transform, we calculate the exact probability distribution functions for both the capital-dependent and history-dependent Parrondo's games. In certain cases we find strong oscillations near the maximum of the probability distribution, with two limiting distributions for odd and even numbers of rounds of the game. Indications of such oscillations first appeared in the analysis of real financial data; we now find the phenomenon in model systems and provide a theoretical understanding of it. The method of our work can be applied to Brownian ratchets, molecular motors, and portfolio optimization.
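
    A sketch of the capital-dependent game B alone, iterating the exact pmf of the capital directly rather than via the paper's Fourier-transform route; the bias ε and win probabilities follow the standard Parrondo setup:

    ```python
    eps = 0.005
    p_good, p_bad = 3 / 4 - eps, 1 / 10 - eps   # game B win probabilities

    # Evolve the exact pmf of capital over 50 rounds of game B, starting at 0
    pmf = {0: 1.0}
    for _ in range(50):
        new = {}
        for c, p in pmf.items():
            w = p_bad if c % 3 == 0 else p_good   # capital-dependent rule
            new[c + 1] = new.get(c + 1, 0.0) + p * w
            new[c - 1] = new.get(c - 1, 0.0) + p * (1 - w)
        pmf = new

    mean = sum(c * p for c, p in pmf.items())
    print("mean capital after 50 rounds of game B alone:", round(mean, 3))
    ```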

  18. Exact probability distribution functions for Parrondo's games.

    PubMed

    Zadourian, Rubina; Saakian, David B; Klümper, Andreas

    2016-12-01

    We study the discrete time dynamics of Brownian ratchet models and Parrondo's games. Using the Fourier transform, we calculate the exact probability distribution functions for both the capital-dependent and history-dependent Parrondo's games. In certain cases we find strong oscillations near the maximum of the probability distribution, with two limiting distributions for odd and even numbers of rounds of the game. Indications of such oscillations first appeared in the analysis of real financial data; we now find the phenomenon in model systems and provide a theoretical understanding of it. The method of our work can be applied to Brownian ratchets, molecular motors, and portfolio optimization.

  19. Fission prompt gamma-ray multiplicity distribution measurements and simulations at DANCE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chyzh, A; Wu, C Y; Ullmann, J

    2010-08-24

    The near energy-independence of the DANCE efficiency and multiplicity response to γ rays makes it possible to measure the prompt γ-ray multiplicity distribution in fission. We demonstrate this unique capability of DANCE through the comparison of γ-ray energy and multiplicity distributions between measurement and numerical simulation for three radioactive sources: ²²Na, ⁶⁰Co, and ⁸⁸Y. The prospect for measuring the γ-ray multiplicity distribution for both spontaneous and neutron-induced fission is discussed.

  20. Derivation of a Multiparameter Gamma Model for Analyzing the Residence-Time Distribution Function for Nonideal Flow Systems as an Alternative to the Advection-Dispersion Equation

    DOE PAGES

    Embry, Irucka; Roland, Victor; Agbaje, Oluropo; ...

    2013-01-01

    A new residence-time distribution (RTD) function has been developed and applied to quantitative dye studies as an alternative to the traditional advection-dispersion equation (AdDE). The new method is based on a jointly combined four-parameter gamma probability density function (PDF). The gamma residence-time distribution (RTD) function and its first and second moments are derived from the individual two-parameter gamma distributions of the randomly distributed variables, tracer travel distance and linear velocity, which are based on their relationship with time. The gamma RTD function was used on a steady-state, nonideal system modeled as a plug-flow reactor (PFR) in the laboratory to validate the effectiveness of the model. The normalized forms of the gamma RTD and the advection-dispersion equation RTD were compared with the normalized tracer RTD. The normalized gamma RTD had a lower mean-absolute deviation (MAD) (0.16) than the normalized form of the advection-dispersion equation (0.26) when compared to the normalized tracer RTD. The gamma RTD function is tied back to the actual physical site due to its randomly distributed variables. The results validate using the gamma RTD as a suitable alternative to the advection-dispersion equation for quantitative tracer studies of non-ideal flow systems.
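
    A numeric sketch of the underlying idea, travel distance X and linear velocity V each gamma-distributed and jointly inducing the residence time T = X/V; independence of X and V is assumed here for illustration, and the parameters are hypothetical:

    ```python
    import numpy as np
    from scipy import stats
    from scipy.integrate import quad

    fx = lambda x: stats.gamma.pdf(x, a=3.0, scale=10.0)   # travel distance, m
    fv = lambda v: stats.gamma.pdf(v, a=4.0, scale=0.05)   # linear velocity, m/s

    def rtd_pdf(t):
        # T = X / V  =>  f_T(t) = integral over v of f_X(v t) * v * f_V(v)
        return quad(lambda v: fx(v * t) * v * fv(v), 0, np.inf)[0]

    times = [30, 60, 120, 240, 480]                        # seconds
    print([round(rtd_pdf(t), 5) for t in times])
    ```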

  1. Determination of photon emission probabilities for the main gamma-rays of ²²³Ra in equilibrium with its progeny.

    PubMed

    Pibida, L; Zimmerman, B; Fitzgerald, R; King, L; Cessna, J T; Bergeron, D E

    2015-07-01

    The currently published ²²³Ra gamma-ray emission probabilities display a wide variation in values depending on the source of the data. The National Institute of Standards and Technology performed activity measurements on a ²²³Ra solution, which was used to prepare several sources from which the photon emission probabilities for the main gamma-rays of ²²³Ra in equilibrium with its progeny were determined. Several high-purity germanium (HPGe) detectors were used to perform the gamma-ray spectrometry measurements. Published by Elsevier Ltd.

  2. Gravitational lensing, time delay, and gamma-ray bursts

    NASA Technical Reports Server (NTRS)

    Mao, Shude

    1992-01-01

    The probability distributions of time delay in gravitational lensing by point masses and isolated galaxies (modeled as singular isothermal spheres) are studied. For point lenses (all with the same mass) the probability distribution is broad, with a peak at Δt of about 50 s; for singular isothermal spheres, the probability distribution is a rapidly decreasing function of time delay, with a median Δt of about 1/h months, and its behavior depends sensitively on the luminosity function of galaxies. The present simplified calculation is particularly relevant to gamma-ray bursts if they are of cosmological origin. The frequency of 'recurrent' bursts due to gravitational lensing by galaxies is probably between 0.05 and 0.4 percent. Gravitational lensing can be used as a test of the cosmological origin of gamma-ray bursts.

  3. ProbOnto: ontology and knowledge base of probability distributions.

    PubMed

    Swat, Maciej J; Grenon, Pierre; Wimalaratne, Sarala

    2016-09-01

    Probability distributions play a central role in mathematical and statistical modelling. The encoding, annotation and exchange of such models could be greatly simplified by a resource providing a common reference for the definition of probability distributions. Although some resources exist, no suitably detailed and comprehensive ontology exists, nor any database allowing programmatic access. ProbOnto is an ontology-based knowledge base of probability distributions, featuring more than 80 uni- and multivariate distributions with their defining functions, characteristics, relationships and re-parameterization formulas. It can be used for model annotation and facilitates the encoding of distribution-based models, related functions and quantities. http://probonto.org mjswat@ebi.ac.uk Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  4. Method of self-consistent evaluation of absolute emission probabilities of particles and gamma rays

    NASA Astrophysics Data System (ADS)

    Badikov, Sergei; Chechev, Valery

    2017-09-01

    Assuming a well-established decay scheme, the method provides a) exact balance relationships, b) lower (compared to traditional techniques) uncertainties of the recommended absolute emission probabilities of particles and gamma rays, and c) evaluation of correlations between the recommended emission probabilities (for the same and different decay modes). Application of the method to the decay data evaluation for even curium isotopes led to paradoxical results. The multidimensional confidence regions for the probabilities of the most intensive alpha transitions constructed on the basis of the present and the ENDF/B-VII.1, JEFF-3.1 and DDEP evaluations are inconsistent, whereas the confidence intervals for the evaluated probabilities of single transitions agree with each other.

  5. Integral-moment analysis of the BATSE gamma-ray burst intensity distribution

    NASA Technical Reports Server (NTRS)

    Horack, John M.; Emslie, A. Gordon

    1994-01-01

    We have applied the technique of integral-moment analysis to the intensity distribution of the first 260 gamma-ray bursts observed by the Burst and Transient Source Experiment (BATSE) on the Compton Gamma Ray Observatory. This technique provides direct measurement of properties such as the mean, variance, and skewness of the convolved luminosity-number density distribution, as well as associated uncertainties. Using this method, one obtains insight into the nature of the source distributions unavailable through computation of traditional single parameters such as V/V(sub max). If the luminosity function of the gamma-ray bursts is strongly peaked, giving bursts only a narrow range of luminosities, these results are then direct probes of the radial distribution of sources, regardless of whether the bursts are a local phenomenon, are distributed in a galactic halo, or are at cosmological distances. Accordingly, an integral-moment analysis of the intensity distribution of the gamma-ray bursts provides for the most complete analytic description of the source distribution available from the data, and offers the most comprehensive test of the compatibility of a given hypothesized distribution with observation.

  6. Work probability distribution and tossing a biased coin

    NASA Astrophysics Data System (ADS)

    Saha, Arnab; Bhattacharjee, Jayanta K.; Chakraborty, Sagar

    2011-01-01

    We show that the rare events present in dissipated work that enters Jarzynski equality, when mapped appropriately to the phenomenon of large deviations found in a biased coin toss, are enough to yield a quantitative work probability distribution for the Jarzynski equality. This allows us to propose a recipe for constructing work probability distribution independent of the details of any relevant system. The underlying framework, developed herein, is expected to be of use in modeling other physical phenomena where rare events play an important role.

  7. Modeling highway travel time distribution with conditional probability models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oliveira Neto, Francisco Moraes; Chin, Shih-Miao; Hwang, Ho-Ling

    ABSTRACT Under the sponsorship of the Federal Highway Administration's Office of Freight Management and Operations, the American Transportation Research Institute (ATRI) has developed performance measures through the Freight Performance Measures (FPM) initiative. Under this program, travel speed information is derived from data collected using wireless-based global positioning systems. These telemetric data systems are subscribed to and used by the trucking industry as an operations management tool. More than one telemetric operator submits their data dumps to ATRI on a regular basis. Each data transmission contains the truck location, its travel time, and a clock time/date stamp. Data from the FPM program provides a unique opportunity for studying the upstream-downstream speed distributions at different locations, as well as at different times of the day and days of the week. This research is focused on the stochastic nature of successive link travel speed data on the continental United States Interstate network. Specifically, a method to estimate route probability distributions of travel time is proposed. This method uses the concepts of convolution of probability distributions and bivariate, link-to-link, conditional probability to estimate the expected distributions for the route travel time. A major contribution of this study is the consideration of speed correlation between upstream and downstream contiguous Interstate segments through conditional probability. The established conditional probability distributions, between successive segments, can be used to provide travel time reliability measures. This study also suggests an adaptive method for calculating and updating the route travel time distribution as new data or information are added. This methodology can be useful to estimate performance measures as required by the recent Moving Ahead for Progress in the 21st Century Act (MAP-21).
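
    As a hedged sketch of the convolution idea (all PMFs below are hypothetical, not ATRI data), the route travel-time distribution over two successive links can be computed both under independence and with a link-to-link conditional probability table:

      import numpy as np

      # Hypothetical discretized travel-time PMFs for two successive links (1-minute bins).
      link1 = np.array([0.1, 0.3, 0.4, 0.2])          # P(T1 = 0, 1, 2, 3 min)
      link2 = np.array([0.2, 0.5, 0.3])               # P(T2 = 0, 1, 2 min)

      # Under independence, the route PMF is the convolution of the link PMFs.
      route_indep = np.convolve(link1, link2)

      # The study instead conditions on the upstream link: replace link2 by a
      # conditional table P(T2 = j | T1 = i) and accumulate the joint mass.
      cond = np.array([[0.5, 0.4, 0.1],               # hypothetical row per T1 = i
                       [0.3, 0.5, 0.2],
                       [0.2, 0.5, 0.3],
                       [0.1, 0.4, 0.5]])
      joint = link1[:, None] * cond                   # joint PMF of (T1, T2)
      route_cond = np.zeros(link1.size + cond.shape[1] - 1)
      for i in range(joint.shape[0]):
          for j in range(joint.shape[1]):
              route_cond[i + j] += joint[i, j]

      print(np.round(route_indep, 3))
      print(np.round(route_cond, 3))                  # correlation shifts the mass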

  8. Extended q-Gaussian and q-exponential distributions from gamma random variables

    NASA Astrophysics Data System (ADS)

    Budini, Adrián A.

    2015-05-01

    The family of q-Gaussian and q-exponential probability densities fit the statistical behavior of diverse complex self-similar nonequilibrium systems. These distributions, independently of the underlying dynamics, can rigorously be obtained by maximizing Tsallis "nonextensive" entropy under appropriate constraints, as well as from superstatistical models. In this paper we provide an alternative and complementary scheme for deriving these objects. We show that q-Gaussian and q-exponential random variables can always be expressed as a function of two statistically independent gamma random variables with the same scale parameter. Their shape index determines the complexity q parameter. This result also allows us to define an extended family of asymmetric q-Gaussian and modified q-exponential densities, which reduce to the standard ones when the shape parameters are the same. Furthermore, we demonstrate that a simple change of variables always allows relating any of these distributions with a beta stochastic variable. The extended distributions are applied in the statistical description of different complex dynamics such as log-return signals in financial markets and motion of point defects in a fluid flow.
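
    A hedged sketch of one route to such variables: the ratio of two independent gamma variates with the same scale is beta-prime distributed, which for unit numerator shape is a Lomax (q-exponential) law. The mapping q = 1 + 1/(nu + 1) below is our own reading of that identification, not a formula quoted from the paper.

      import numpy as np

      rng = np.random.default_rng(0)
      nu = 4.0                              # shape index of the denominator gamma

      # Two statistically independent gamma variates with the same scale parameter.
      g1 = rng.gamma(shape=1.0, scale=1.0, size=200_000)
      g2 = rng.gamma(shape=nu, scale=1.0, size=200_000)
      x = g1 / g2                           # beta-prime(1, nu): density nu*(1+x)**(-nu-1)

      # Check the sampled tail against the analytic survival function (1+s)**(-nu).
      for s in (0.5, 1.0, 2.0):
          print(s, round((x > s).mean(), 4), round((1 + s) ** (-nu), 4))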

  9. Probability Distributions for Random Quantum Operations

    NASA Astrophysics Data System (ADS)

    Schultz, Kevin

    Motivated by uncertainty quantification and inference of quantum information systems, in this work we draw connections between the notions of random quantum states and operations in quantum information with probability distributions commonly encountered in the field of orientation statistics. This approach identifies natural sample spaces and probability distributions upon these spaces that can be used in the analysis, simulation, and inference of quantum information systems. The theory of exponential families on Stiefel manifolds provides the appropriate generalization to the classical case. Furthermore, this viewpoint motivates a number of additional questions into the convex geometry of quantum operations relative to both the differential geometry of Stiefel manifolds as well as the information geometry of exponential families defined upon them. In particular, we draw on results from convex geometry to characterize which quantum operations can be represented as the average of a random quantum operation. This project was supported by the Intelligence Advanced Research Projects Activity via Department of Interior National Business Center Contract Number 2012-12050800010.

  10. Comparative analysis through probability distributions of a data set

    NASA Astrophysics Data System (ADS)

    Cristea, Gabriel; Constantinescu, Dan Mihai

    2018-02-01

    In practice, probability distributions are applied in such diverse fields as risk analysis, reliability engineering, chemical engineering, hydrology, image processing, physics, market research, business and economic research, customer support, medicine, sociology, demography, etc. This article highlights important aspects of fitting probability distributions to data and applying the analysis results to make informed decisions. There are a number of statistical methods available which can help us to select the best-fitting model. Some of the graphs display both input data and fitted distributions at the same time, as probability density and cumulative distribution. Goodness-of-fit tests can be used to determine whether a certain distribution is a good fit. The main idea is to measure the "distance" between the data and the tested distribution, and compare that distance to some threshold values. Calculating the goodness-of-fit statistics also enables us to order the fitted distributions according to how well they fit the data. This particular feature is very helpful for comparing the fitted models. The paper presents a comparison of the most commonly used goodness-of-fit tests: Kolmogorov-Smirnov, Anderson-Darling, and Chi-Squared. A large set of data is analyzed and conclusions are drawn by visualizing the data, comparing multiple fitted distributions and selecting the best model. These graphs should be viewed as an addition to the goodness-of-fit tests.
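
    A minimal sketch of such a ranking, assuming hypothetical data and using the Kolmogorov-Smirnov distance as the "distance" between data and fitted model (scipy's anderson test supports fewer families, so only KS is shown here):

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(7)
      data = rng.gamma(shape=2.5, scale=1.8, size=500)     # hypothetical data set

      # Fit candidate families by maximum likelihood, then order them by KS distance.
      candidates = [stats.gamma, stats.lognorm, stats.weibull_min]
      results = []
      for dist in candidates:
          params = dist.fit(data)
          ks = stats.kstest(data, dist.name, args=params).statistic
          results.append((ks, dist.name))

      for ks, name in sorted(results):                     # smallest distance first
          print(f"{name:12s} KS distance = {ks:.4f}")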

  11. Estimating the probability distribution of the incubation period for rabies using data from the 1948-1954 rabies epidemic in Tokyo.

    PubMed

    Tojinbara, Kageaki; Sugiura, K; Yamada, A; Kakitani, I; Kwan, N C L; Sugiura, K

    2016-01-01

    Data from 98 rabies cases in dogs and cats from the 1948-1954 rabies epidemic in Tokyo were used to estimate the probability distribution of the incubation period. Lognormal, gamma and Weibull distributions were used to model the incubation period. The maximum likelihood estimates of the mean incubation period ranged from 27.30 to 28.56 days according to the different distributions. The mean incubation period was shortest with the lognormal distribution (27.30 days), and longest with the Weibull distribution (28.56 days). The best distribution in terms of AIC value was the lognormal distribution with mean value of 27.30 (95% CI: 23.46-31.55) days and standard deviation of 20.20 (15.27-26.31) days. There were no significant differences between the incubation periods for dogs and cats, or between those for male and female dogs. Copyright © 2015 Elsevier B.V. All rights reserved.
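
    The fitting-and-ranking step can be sketched as follows, with simulated stand-in data (the 98 Tokyo case records are not reproduced here); AIC = 2k - 2 log L with k = 2 free parameters once the location is fixed at zero:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(3)
      days = rng.lognormal(mean=np.log(24.0), sigma=0.6, size=98)   # stand-in incubation data

      for name, dist in [("lognormal", stats.lognorm),
                         ("gamma", stats.gamma),
                         ("Weibull", stats.weibull_min)]:
          params = dist.fit(days, floc=0)        # maximum likelihood, location fixed at 0
          loglik = np.sum(dist.logpdf(days, *params))
          aic = 2 * 2 - 2 * loglik               # k = 2 estimated parameters
          print(f"{name:9s} AIC={aic:8.2f}  fitted mean={dist.mean(*params):6.2f} days")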

  12. Predicting the probability of slip in gait: methodology and distribution study.

    PubMed

    Gragg, Jared; Yang, James

    2016-01-01

    The likelihood of a slip is related to the available and required friction for a certain activity, here gait. Classical slip and fall analysis presumed that a walking surface was safe if the difference between the mean available and required friction coefficients exceeded a certain threshold. Previous research was dedicated to reformulating the classical slip and fall theory to include the stochastic variation of the available and required friction when predicting the probability of slip in gait. However, when predicting the probability of a slip, previous researchers have either ignored the variation in the required friction or assumed the available and required friction to be normally distributed. Also, there are no published results that actually give the probability of slip for various combinations of required and available frictions. This study proposes a modification to the equation for predicting the probability of slip, reducing the previous equation from a double-integral to a more convenient single-integral form. Also, a simple numerical integration technique is provided to predict the probability of slip in gait: the trapezoidal method. The effect of the random variable distributions on the probability of slip is also studied. It is shown that both the required and available friction distributions cannot automatically be assumed as being normally distributed. The proposed methods allow for any combination of distributions for the available and required friction, and numerical results are compared to analytical solutions for an error analysis. The trapezoidal method is shown to be highly accurate and efficient. The probability of slip is also shown to be sensitive to the input distributions of the required and available friction. Lastly, a critical value for the probability of slip is proposed based on the number of steps taken by an average person in a single day.
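
    A minimal sketch of the single-integral form evaluated with the trapezoidal rule, assuming hypothetical (deliberately non-normal) friction distributions: P(slip) = P(available < required) = integral of F_avail(x) * f_req(x) dx.

      import numpy as np
      from scipy import stats

      # Hypothetical friction-coefficient distributions; neither needs to be normal.
      available = stats.lognorm(s=0.25, scale=0.48)   # available friction, shoe-floor
      required = stats.gamma(a=30.0, scale=0.006)     # required friction during gait

      # Integrand of the single-integral form, on a uniform grid.
      x = np.linspace(0.0, 1.0, 2001)
      y = available.cdf(x) * required.pdf(x)

      # Trapezoidal rule, written out explicitly.
      dx = x[1] - x[0]
      p_slip = dx * (0.5 * y[0] + y[1:-1].sum() + 0.5 * y[-1])
      print(f"probability of slip = {p_slip:.3e}")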

  13. Probability distribution functions in turbulent convection

    NASA Technical Reports Server (NTRS)

    Balachandar, S.; Sirovich, L.

    1991-01-01

    Results of an extensive investigation of probability distribution functions (pdfs) for Rayleigh-Benard convection, in hard turbulence regime, are presented. It is shown that the pdfs exhibit a high degree of internal universality. In certain cases this universality is established within two Kolmogorov scales of a boundary. A discussion of the factors leading to the universality is presented.

  14. Statistical study of air pollutant concentrations via generalized gamma distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marani, A.; Lavagnini, I.; Buttazzoni, C.

    1986-11-01

    This paper deals with modeling observed frequency distributions of air quality data measured in the area of Venice, Italy. The paper discusses the application of the generalized gamma distribution (ggd) which has not been commonly applied to air quality data notwithstanding the fact that it embodies most distribution models used for air quality analyses. The approach yields important simplifications for statistical analyses. A comparison among the ggd and other relevant models (standard gamma, Weibull, lognormal), carried out on daily sulfur dioxide concentrations in the area of Venice underlines the efficiency of ggd models in portraying experimental data.
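
    A hedged sketch of such a comparison with scipy's generalized gamma (stats.gengamma, which reduces to the standard gamma at c = 1 and to the Weibull at a = 1), on synthetic stand-in concentrations rather than the Venice data:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(11)
      so2 = rng.lognormal(mean=np.log(40.0), sigma=0.5, size=365)  # stand-in daily SO2

      # Generalized gamma fit (location fixed at zero).
      a, c, loc, scale = stats.gengamma.fit(so2, floc=0)
      ll_gg = np.sum(stats.gengamma.logpdf(so2, a, c, loc, scale))
      print(f"gengamma: a={a:.2f} c={c:.2f} scale={scale:.1f} loglik={ll_gg:.1f}")

      # Ordinary two-parameter gamma on the same data, for comparison.
      g = stats.gamma.fit(so2, floc=0)
      ll_g = np.sum(stats.gamma.logpdf(so2, *g))
      print(f"gamma:    a={g[0]:.2f} scale={g[2]:.1f} loglik={ll_g:.1f}")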

  15. Testing the anisotropy in the angular distribution of Fermi/GBM gamma-ray bursts

    NASA Astrophysics Data System (ADS)

    Tarnopolski, M.

    2017-12-01

    Gamma-ray bursts (GRBs) were confirmed to be of extragalactic origin due to their isotropic angular distribution, combined with the fact that they exhibited an intensity distribution that deviated strongly from the -3/2 power law. This finding was later confirmed with the first redshift, equal to at least z = 0.835, measured for GRB970508. Despite this result, the data from CGRO/BATSE and Swift/BAT indicate that long GRBs are indeed distributed isotropically, but the distribution of short GRBs is anisotropic. Fermi/GBM has detected 1669 GRBs to date, and their sky distribution is examined in this paper. A number of statistical tests are applied: nearest neighbour analysis, fractal dimension, dipole and quadrupole moments of the distribution function decomposed into spherical harmonics, binomial test and the two-point angular correlation function. Monte Carlo benchmark testing of each test is performed in order to evaluate its reliability. It is found that short GRBs are distributed anisotropically in the sky, and long ones have an isotropic distribution. The probability that these results are not a chance occurrence is equal to at least 99.98 per cent and 30.68 per cent for short and long GRBs, respectively. The cosmological context of this finding and its relation to large-scale structures is discussed.

  16. Probability distributions for Markov chain based quantum walks

    NASA Astrophysics Data System (ADS)

    Balu, Radhakrishnan; Liu, Chaobin; Venegas-Andraca, Salvador E.

    2018-01-01

    We analyze the probability distributions of the quantum walks induced from Markov chains by Szegedy (2004). The first part of this paper is devoted to the quantum walks induced from finite state Markov chains. It is shown that the probability distribution on the states of the underlying Markov chain is always convergent in the Cesaro sense. In particular, we deduce that the limiting distribution is uniform if the transition matrix is symmetric. In the case of a non-symmetric Markov chain, we exemplify that the limiting distribution of the quantum walk is not necessarily identical with the stationary distribution of the underlying irreducible Markov chain. The Szegedy scheme can be extended to infinite state Markov chains (random walks). In the second part, we formulate the quantum walk induced from a lazy random walk on the line. We then obtain the weak limit of the quantum walk. It is noted that the current quantum walk appears to spread faster than its counterpart, the quantum walk on the line driven by the Grover coin discussed in the literature. The paper closes with an outlook on possible future directions.

  17. Models of multidimensional discrete distribution of probabilities of random variables in information systems

    NASA Astrophysics Data System (ADS)

    Gromov, Yu Yu; Minin, Yu V.; Ivanova, O. G.; Morozova, O. N.

    2018-03-01

    Multidimensional discrete probability distributions of independent random variables were derived. Their one-dimensional counterparts are widely used in probability theory. Generating functions of those multidimensional distributions were also obtained.

  18. Performance of multi-hop parallel free-space optical communication over gamma-gamma fading channel with pointing errors.

    PubMed

    Gao, Zhengguang; Liu, Hongzhan; Ma, Xiaoping; Lu, Wei

    2016-11-10

    Multi-hop parallel relaying is considered in a free-space optical (FSO) communication system deploying binary phase-shift keying (BPSK) modulation under the combined effects of a gamma-gamma (GG) distribution and misalignment fading. Based on the best path selection criterion, the cumulative distribution function (CDF) of this cooperative random variable is derived. Then the performance of this optical mesh network is analyzed in detail. A Monte Carlo simulation is also conducted to demonstrate the effectiveness of the results for the average bit error rate (ABER) and outage probability. The numerical results prove that a smaller average transmitted optical power is needed to achieve the same ABER and outage probability when using the multi-hop parallel network in FSO links. Furthermore, using a larger number of hops and cooperative paths can improve the quality of the communication.
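
    A hedged Monte Carlo sketch of the gamma-gamma channel itself: the normalized irradiance is commonly modeled as the product of two independent unit-mean gamma variates, with alpha and beta set by the turbulence strength (the values below are illustrative, not taken from the paper):

      import numpy as np

      rng = np.random.default_rng(5)
      alpha, beta = 4.0, 1.9                 # illustrative turbulence parameters
      n = 1_000_000

      # Gamma-gamma irradiance: I = X * Y with E[X] = E[Y] = 1, so E[I] = 1.
      I = rng.gamma(alpha, 1.0 / alpha, n) * rng.gamma(beta, 1.0 / beta, n)

      # Outage probability: chance that the irradiance drops below a threshold.
      for thresh in (0.1, 0.2, 0.5):
          print(f"P(I < {thresh}) = {(I < thresh).mean():.4f}")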

  19. Nonadditive entropies yield probability distributions with biases not warranted by the data.

    PubMed

    Pressé, Steve; Ghosh, Kingshuk; Lee, Julian; Dill, Ken A

    2013-11-01

    Different quantities that go by the name of entropy are used in variational principles to infer probability distributions from limited data. Shore and Johnson showed that maximizing the Boltzmann-Gibbs form of the entropy ensures that probability distributions inferred satisfy the multiplication rule of probability for independent events in the absence of data coupling such events. Other types of entropies that violate the Shore and Johnson axioms, including nonadditive entropies such as the Tsallis entropy, violate this basic consistency requirement. Here we use the axiomatic framework of Shore and Johnson to show how such nonadditive entropy functions generate biases in probability distributions that are not warranted by the underlying data.

  20. Hybrid computer technique yields random signal probability distributions

    NASA Technical Reports Server (NTRS)

    Cameron, W. D.

    1965-01-01

    Hybrid computer determines the probability distributions of instantaneous and peak amplitudes of random signals. This combined digital and analog computer system reduces the errors and delays of manual data analysis.

  1. Probability and the changing shape of response distributions for orientation.

    PubMed

    Anderson, Britt

    2014-11-18

    Spatial attention and feature-based attention are regarded as two independent mechanisms for biasing the processing of sensory stimuli. Feature attention is held to be a spatially invariant mechanism that advantages a single feature per sensory dimension. In contrast to the prediction of location independence, I found that participants were able to report the orientation of a briefly presented visual grating better for targets defined by high probability conjunctions of features and locations even when orientations and locations were individually uniform. The advantage for high-probability conjunctions was accompanied by changes in the shape of the response distributions. High-probability conjunctions had error distributions that were not normally distributed but demonstrated increased kurtosis. The increase in kurtosis could be explained as a change in the variances of the component tuning functions that comprise a population mixture. By changing the mixture distribution of orientation-tuned neurons, it is possible to change the shape of the discrimination function. This prompts the suggestion that attention may not "increase" the quality of perceptual processing in an absolute sense but rather prioritizes some stimuli over others. This results in an increased number of highly accurate responses to probable targets and, simultaneously, an increase in the number of very inaccurate responses. © 2014 ARVO.

  2. Polynomial probability distribution estimation using the method of moments

    PubMed Central

    Mattsson, Lars; Rydén, Jesper

    2017-01-01

    We suggest a procedure for estimating Nth degree polynomial approximations to unknown (or known) probability density functions (PDFs) based on N statistical moments from each distribution. The procedure is based on the method of moments and is set up algorithmically to aid applicability and to ensure rigor in use. In order to show applicability, polynomial PDF approximations are obtained for the distribution families Normal, Log-Normal, Weibull as well as for a bimodal Weibull distribution and a data set of anonymized household electricity use. The results are compared with results for traditional PDF series expansion methods of Gram–Charlier type. It is concluded that this procedure is a comparatively simple procedure that could be used when traditional distribution families are not applicable or when polynomial expansions of probability distributions might be considered useful approximations. In particular this approach is practical for calculating convolutions of distributions, since such operations become integrals of polynomial expressions. Finally, in order to show an advanced applicability of the method, it is shown to be useful for approximating solutions to the Smoluchowski equation. PMID:28394949

  3. Polynomial probability distribution estimation using the method of moments.

    PubMed

    Munkhammar, Joakim; Mattsson, Lars; Rydén, Jesper

    2017-01-01

    We suggest a procedure for estimating Nth degree polynomial approximations to unknown (or known) probability density functions (PDFs) based on N statistical moments from each distribution. The procedure is based on the method of moments and is set up algorithmically to aid applicability and to ensure rigor in use. In order to show applicability, polynomial PDF approximations are obtained for the distribution families Normal, Log-Normal, Weibull as well as for a bimodal Weibull distribution and a data set of anonymized household electricity use. The results are compared with results for traditional PDF series expansion methods of Gram-Charlier type. It is concluded that this procedure is a comparatively simple procedure that could be used when traditional distribution families are not applicable or when polynomial expansions of probability distributions might be considered useful approximations. In particular this approach is practical for calculating convolutions of distributions, since such operations become integrals of polynomial expressions. Finally, in order to show an advanced applicability of the method, it is shown to be useful for approximating solutions to the Smoluchowski equation.
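
    A minimal sketch of the moment-matching system behind this method: requiring the polynomial p(x) = sum_j c_j x^j on [a, b] to reproduce the first N raw moments gives a linear system for the coefficients (the known caveat that the resulting polynomial may dip below zero is not handled here):

      import numpy as np

      def polynomial_pdf_coeffs(moments, a, b):
          """Coefficients c_j of p(x) = sum_j c_j x^j on [a, b] whose raw moments
          match `moments`; moments[0] must be 1 so that p integrates to one."""
          n = len(moments)
          A = np.empty((n, n))
          for k in range(n):          # moment order
              for j in range(n):      # polynomial degree
                  A[k, j] = (b**(k + j + 1) - a**(k + j + 1)) / (k + j + 1)
          return np.linalg.solve(A, np.asarray(moments, dtype=float))

      # Sanity check: the first three raw moments of uniform(0, 1) recover p(x) = 1.
      c = polynomial_pdf_coeffs([1.0, 0.5, 1.0 / 3.0], 0.0, 1.0)
      print(np.round(c, 6))           # ~[1, 0, 0]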

  4. Probability density of aperture-averaged irradiance fluctuations for long range free space optical communication links.

    PubMed

    Lyke, Stephen D; Voelz, David G; Roggemann, Michael C

    2009-11-20

    The probability density function (PDF) of aperture-averaged irradiance fluctuations is calculated from wave-optics simulations of a laser after propagating through atmospheric turbulence to investigate the evolution of the distribution as the aperture diameter is increased. The simulation data distribution is compared to theoretical gamma-gamma and lognormal PDF models under a variety of scintillation regimes from weak to strong. Results show that under weak scintillation conditions both the gamma-gamma and lognormal PDF models provide a good fit to the simulation data for all aperture sizes studied. Our results indicate that in moderate scintillation the gamma-gamma PDF provides a better fit to the simulation data than the lognormal PDF for all aperture sizes studied. In the strong scintillation regime, the simulation data distribution is gamma-gamma for aperture sizes much smaller than the coherence radius ρ0 and lognormal for aperture sizes on the order of ρ0 and larger. Examples of how these results affect the bit-error rate of an on-off keyed free space optical communication link are presented.

  5. PHEV Energy Use Estimation: Validating the Gamma Distribution for Representing the Random Daily Driving Distance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Zhenhong; Dong, Jing; Liu, Changzheng

    2012-01-01

    The petroleum and electricity consumptions of plug-in hybrid electric vehicles (PHEVs) are sensitive to the variation of daily vehicle miles traveled (DVMT). Some studies assume DVMT to follow a Gamma distribution, but such a Gamma assumption is yet to be validated. This study finds the Gamma assumption valid in the context of PHEV energy analysis, based on continuous GPS travel data of 382 vehicles, each tracked for at least 183 days. The validity conclusion is based on the found small prediction errors, resulting from the Gamma assumption, in PHEV petroleum use, electricity use, and energy cost. The finding that the Gamma distribution is valid and reliable is important. It paves the way for the Gamma distribution to be assumed for analyzing energy uses of PHEVs in the real world. The Gamma distribution can be easily specified with very few pieces of driver information and is relatively easy for mathematical manipulation. Given the validation in this study, the Gamma distribution can now be used with better confidence in a variety of applications, such as improving vehicle consumer choice models, quantifying range anxiety for battery electric vehicles, investigating roles of charging infrastructure, and constructing online calculators that provide personal estimates of PHEV energy use.
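
    A small sketch of why the assumption is convenient (the driver statistics below are hypothetical, and the method-of-moments specification is our illustrative choice): a gamma DVMT distribution is pinned down by a driver's mean and variance alone, and the share of days exceeding the charge-depleting range follows directly.

      import numpy as np
      from scipy import stats

      mean_dvmt, std_dvmt = 36.0, 30.0                 # hypothetical driver summary, miles

      # Method-of-moments gamma parameters from mean and variance.
      shape = (mean_dvmt / std_dvmt) ** 2
      scale = std_dvmt**2 / mean_dvmt
      dvmt = stats.gamma(a=shape, scale=scale)

      # Fraction of days the driver would exceed a PHEV charge-depleting range,
      # i.e. days on which some petroleum would be used.
      for cd_range in (10.0, 20.0, 40.0):
          print(f"P(DVMT > {cd_range:4.0f} mi) = {dvmt.sf(cd_range):.3f}")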

  6. Multiobjective fuzzy stochastic linear programming problems with inexact probability distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hamadameen, Abdulqader Othman; Zainuddin, Zaitul Marlizawati

    This study deals with multiobjective fuzzy stochastic linear programming problems with an uncertain probability distribution, defined through fuzzy assertions by ambiguous experts. The problem formulation is presented, and the two solution strategies are: the fuzzy transformation via a ranking function, and the stochastic transformation in which the α-cut technique and linguistic hedges are used in the uncertain probability distribution. A development of Sen’s method is employed to find a compromise solution, supported by an illustrative numerical example.

  7. Correlation between discrete probability and reaction front propagation rate in heterogeneous mixtures

    NASA Astrophysics Data System (ADS)

    Naine, Tarun Bharath; Gundawar, Manoj Kumar

    2017-09-01

    We demonstrate a very powerful correlation between the discrete probability of distances of neighboring cells and the thermal wave propagation rate for a system of cells spread on a one-dimensional chain. A gamma distribution is employed to model the distances of neighboring cells. Because an analytical solution is lacking and the differences in ignition times of adjacent reaction cells follow non-Markovian statistics, the thermal wave propagation rate for a one-dimensional system with randomly distributed cells is invariably obtained by numerical simulation. However, such simulations, which are based on Monte Carlo methods, require several iterations of calculations for different realizations of the distribution of adjacent cells. For several one-dimensional systems differing in the value of the shaping parameter of the gamma distribution, we show that the average reaction front propagation rates obtained from a discrete probability between two limits are in excellent agreement with those obtained numerically. With the upper limit at 1.3, the lower limit depends on the non-dimensional ignition temperature. Additionally, this approach also facilitates the prediction of burning limits of heterogeneous thermal mixtures. The proposed method completely eliminates the need for laborious, time-intensive numerical calculations, as the thermal wave propagation rates can now be calculated based only on the macroscopic entity of discrete probability.

  8. Steady-state distributions of probability fluxes on complex networks

    NASA Astrophysics Data System (ADS)

    Chełminiak, Przemysław; Kurzyński, Michał

    2017-02-01

    We consider a simple model of the Markovian stochastic dynamics on complex networks to examine the statistical properties of the probability fluxes. The additional transition, called hereafter a gate, powered by the external constant force breaks a detailed balance in the network. We argue, using a theoretical approach and numerical simulations, that the stationary distributions of the probability fluxes emergent under such conditions converge to the Gaussian distribution. By virtue of the stationary fluctuation theorem, its standard deviation depends directly on the square root of the mean flux. In turn, the nonlinear relation between the mean flux and the external force, which provides the key result of the present study, allows us to calculate the two parameters that entirely characterize the Gaussian distribution of the probability fluxes both close to as well as far from the equilibrium state. Also, the other effects that modify these parameters, such as the addition of shortcuts to the tree-like network, the extension and configuration of the gate and a change in the network size studied by means of computer simulations are widely discussed in terms of the rigorous theoretical predictions.

  9. High energy gamma ray results from the second small astronomy satellite

    NASA Technical Reports Server (NTRS)

    Fichtel, C. E.; Hartman, R. C.; Kniffen, D. A.; Thompson, D. J.; Bignami, G. F.; Oegelman, H.; Oezel, M. F.; Tuemer, T.

    1974-01-01

    A high energy (above 35 MeV) gamma ray telescope employing a thirty-two level magnetic core spark chamber system was flown on SAS 2. The high energy galactic gamma radiation is observed to dominate over the general diffuse radiation along the entire galactic plane, and when examined in detail, the longitudinal and latitudinal distributions seem generally correlated with galactic structural features, particularly with arm segments. The general high energy gamma radiation from the galactic plane, on the basis of its angular distribution and magnitude, probably results primarily from cosmic ray interactions with interstellar matter.

  10. Probability distributions of continuous measurement results for conditioned quantum evolution

    NASA Astrophysics Data System (ADS)

    Franquet, A.; Nazarov, Yuli V.

    2017-02-01

    We address the statistics of continuous weak linear measurement on a few-state quantum system that is subject to a conditioned quantum evolution. For a conditioned evolution, both the initial and final states of the system are fixed: the latter is achieved by the postselection in the end of the evolution. The statistics may drastically differ from the nonconditioned case, and the interference between initial and final states can be observed in the probability distributions of measurement outcomes as well as in the average values exceeding the conventional range of nonconditioned averages. We develop a proper formalism to compute the distributions of measurement outcomes, and evaluate and discuss the distributions in experimentally relevant setups. We demonstrate the manifestations of the interference between initial and final states in various regimes. We consider analytically simple examples of nontrivial probability distributions. We reveal peaks (or dips) at half-quantized values of the measurement outputs. We discuss in detail the case of zero overlap between initial and final states demonstrating anomalously big average outputs and sudden jump in time-integrated output. We present and discuss the numerical evaluation of the probability distribution aiming at extending the analytical results and describing a realistic experimental situation of a qubit in the regime of resonant fluorescence.

  11. Probability of lensing magnification by cosmologically distributed galaxies

    NASA Technical Reports Server (NTRS)

    Pei, Yichuan C.

    1993-01-01

    We present analytical formulae for computing the magnification probability caused by cosmologically distributed galaxies. The galaxies are assumed to be singular, truncated isothermal spheres without evolution or clustering in redshift. We find that, for a fixed total mass, extended galaxies produce a broader shape in the magnification probability distribution and hence are less efficient as gravitational lenses than compact galaxies. The high-magnification tail caused by large galaxies is well approximated by an A^(-3) form, while the tail produced by small galaxies is slightly shallower. The mean magnification as a function of redshift is, however, found to be independent of the size of the lensing galaxies. In terms of flux conservation, our formulae for the isothermal galaxy model predict a mean magnification that agrees to within a few percent with that of the Dyer-Roeder model of a clumpy universe.

  12. Newton/Poisson-Distribution Program

    NASA Technical Reports Server (NTRS)

    Bowerman, Paul N.; Scheuer, Ernest M.

    1990-01-01

    NEWTPOIS, one of two computer programs making calculations involving cumulative Poisson distributions. NEWTPOIS (NPO-17715) and CUMPOIS (NPO-17714) used independently of one another. NEWTPOIS determines Poisson parameter for given cumulative probability, from which one obtains percentiles for gamma distributions with integer shape parameters and percentiles for chi-square (χ²) distributions with even degrees of freedom. Used by statisticians and others concerned with probabilities of independent events occurring over specific units of time, area, or volume. Program written in C.
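
    A hedged re-creation of the core iteration (not NEWTPOIS itself, whose source is not reproduced here): Newton's method for the Poisson mean lam satisfying P(X <= k; lam) = p, using d/d(lam) P(X <= k; lam) = -pmf(k; lam), together with the classical identity linking the answer to a gamma percentile.

      from scipy import stats

      def poisson_parameter(k, p, tol=1e-10, max_iter=50):
          """Poisson mean lam such that P(X <= k) = p, by Newton iteration."""
          lam = k + 1.0                     # a reasonable starting point
          for _ in range(max_iter):
              f = stats.poisson.cdf(k, lam) - p
              if abs(f) < tol:
                  break
              # d/d(lam) CDF = -pmf, so the Newton step is +f/pmf; keep lam > 0.
              lam = max(lam + f / stats.poisson.pmf(k, lam), 1e-12)
          return lam

      # P(Poisson(lam) <= k) = p is equivalent to lam being the (1-p) percentile
      # of a gamma distribution with integer shape k + 1:
      lam = poisson_parameter(k=4, p=0.90)
      print(lam, stats.gamma.ppf(0.10, a=5))   # the two values agree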

  13. Neural correlates of the divergence of instrumental probability distributions.

    PubMed

    Liljeholm, Mimi; Wang, Shuo; Zhang, June; O'Doherty, John P

    2013-07-24

    Flexible action selection requires knowledge about how alternative actions impact the environment: a "cognitive map" of instrumental contingencies. Reinforcement learning theories formalize this map as a set of stochastic relationships between actions and states, such that for any given action considered in a current state, a probability distribution is specified over possible outcome states. Here, we show that activity in the human inferior parietal lobule correlates with the divergence of such outcome distributions-a measure that reflects whether discrimination between alternative actions increases the controllability of the future-and, further, that this effect is dissociable from those of other information theoretic and motivational variables, such as outcome entropy, action values, and outcome utilities. Our results suggest that, although ultimately combined with reward estimates to generate action values, outcome probability distributions associated with alternative actions may be contrasted independently of valence computations, to narrow the scope of the action selection problem.

  14. Superthermal photon bunching in terms of simple probability distributions

    NASA Astrophysics Data System (ADS)

    Lettau, T.; Leymann, H. A. M.; Melcher, B.; Wiersig, J.

    2018-05-01

    We analyze the second-order photon autocorrelation function g(2) with respect to the photon probability distribution and discuss the generic features of a distribution that results in superthermal photon bunching [g(2)(0) > 2]. Superthermal photon bunching has been reported for a number of optical microcavity systems that exhibit processes such as superradiance or mode competition. We show that a superthermal photon number distribution cannot be constructed from the principle of maximum entropy if only the intensity and the second-order autocorrelation are given. However, for bimodal systems, an unbiased superthermal distribution can be constructed from second-order correlations and the intensities alone. Our findings suggest modeling superthermal single-mode distributions by a mixture of a thermal and a lasinglike state and thus reveal a generic mechanism in the photon probability distribution responsible for creating superthermal photon bunching. We relate our general considerations to a physical system, i.e., a (single-emitter) bimodal laser, and show that its statistics can be approximated and understood within our proposed model. Furthermore, the excellent agreement of the statistics of the bimodal laser and our model reveals that the bimodal laser is an ideal source of bunched photons, in the sense that it can generate statistics that contain no other features but the superthermal bunching.
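
    The mixture picture can be checked directly from photon number distributions, using g2(0) = <n(n-1)>/<n>^2; the weights and mean photon numbers below are illustrative assumptions, not the paper's fitted values:

      import numpy as np
      from scipy import stats

      def g2_zero(pn, n):
          """Equal-time second-order autocorrelation from a photon distribution."""
          mean = np.sum(n * pn)
          return np.sum(n * (n - 1) * pn) / mean**2

      n = np.arange(400)
      nbar_th, nbar_las = 2.0, 40.0
      thermal = nbar_th**n / (1 + nbar_th) ** (n + 1)   # Bose-Einstein: g2(0) = 2
      lasing = stats.poisson.pmf(n, nbar_las)           # Poissonian:    g2(0) = 1

      # Mixing a weak thermal component with a bright lasing-like state can push
      # g2(0) above 2, i.e. superthermal bunching.
      for w in (0.1, 0.6, 0.9):
          mix = w * thermal + (1 - w) * lasing
          print(f"w={w:.1f}  g2(0)={g2_zero(mix, n):.2f}")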

  15. Net present value probability distributions from decline curve reserves estimates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simpson, D.E.; Huffman, C.H.; Thompson, R.S.

    1995-12-31

    This paper demonstrates how reserves probability distributions can be used to develop net present value (NPV) distributions. NPV probability distributions were developed from the rate and reserves distributions presented in SPE 28333. This real-data study used practicing engineers' evaluations of production histories. Two approaches were examined to quantify portfolio risk. The first approach, the NPV Relative Risk Plot, compares the mean NPV with the NPV relative risk ratio for the portfolio. The relative risk ratio is the NPV standard deviation (σ) divided by the mean (μ) NPV. The second approach, the Risk-Return Plot, is a plot of the mean (μ) discounted cash flow rate of return (DCFROR) versus the standard deviation (σ) of the DCFROR distribution. This plot provides a risk-return relationship for comparing various portfolios. These methods may help evaluate property acquisition and divestiture alternatives and assess the relative risk of a suite of wells or fields for bank loans.

  16. Methods for fitting a parametric probability distribution to most probable number data.

    PubMed

    Williams, Michael S; Ebel, Eric D

    2012-07-02

    Every year hundreds of thousands, if not millions, of samples are collected and analyzed to assess microbial contamination in food and water. The concentration of pathogenic organisms at the end of the production process is low for most commodities, so a highly sensitive screening test is used to determine whether the organism of interest is present in a sample. In some applications, samples that test positive are subjected to quantitation. The most probable number (MPN) technique is a common method to quantify the level of contamination in a sample because it is able to provide estimates at low concentrations. This technique uses a series of dilution count experiments to derive estimates of the concentration of the microorganism of interest. An application for these data is food-safety risk assessment, where the MPN concentration estimates can be fitted to a parametric distribution to summarize the range of potential exposures to the contaminant. Many different methods (e.g., substitution methods, maximum likelihood and regression on order statistics) have been proposed to fit microbial contamination data to a distribution, but the development of these methods rarely considers how the MPN technique influences the choice of distribution function and fitting method. An often overlooked aspect when applying these methods is whether the data represent actual measurements of the average concentration of microorganism per milliliter or the data are real-valued estimates of the average concentration, as is the case with MPN data. In this study, we propose two methods for fitting MPN data to a probability distribution. The first method uses a maximum likelihood estimator that takes average concentration values as the data inputs. The second is a Bayesian latent variable method that uses the counts of the number of positive tubes at each dilution to estimate the parameters of the contamination distribution. The performance of the two fitting methods is compared for two

  17. Estimation of rates-across-sites distributions in phylogenetic substitution models.

    PubMed

    Susko, Edward; Field, Chris; Blouin, Christian; Roger, Andrew J

    2003-10-01

    Previous work has shown that it is often essential to account for the variation in rates at different sites in phylogenetic models in order to avoid phylogenetic artifacts such as long branch attraction. In most current models, the gamma distribution is used for the rates-across-sites distributions and is implemented as an equal-probability discrete gamma. In this article, we introduce discrete distribution estimates with large numbers of equally spaced rate categories allowing us to investigate the appropriateness of the gamma model. With large numbers of rate categories, these discrete estimates are flexible enough to approximate the shape of almost any distribution. Likelihood ratio statistical tests and a nonparametric bootstrap confidence-bound estimation procedure based on the discrete estimates are presented that can be used to test the fit of a parametric family. We applied the methodology to several different protein data sets, and found that although the gamma model often provides a good parametric model for this type of data, rate estimates from an equal-probability discrete gamma model with a small number of categories will tend to underestimate the largest rates. In cases when the gamma model assumption is in doubt, rate estimates coming from the discrete rate distribution estimate with a large number of rate categories provide a robust alternative to gamma estimates. An alternative implementation of the gamma distribution is proposed that, for equal numbers of rate categories, is computationally more efficient during optimization than the standard gamma implementation and can provide more accurate estimates of site rates.
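
    A short sketch of the equal-probability discrete gamma itself, the construction this article starts from (commonly attributed to Yang 1994): cut a mean-one gamma into K equal-probability bins and represent each bin by its conditional mean.

      import numpy as np
      from scipy import stats

      def discrete_gamma_rates(alpha, ncat):
          """Equal-probability discrete gamma rate categories for a mean-1 gamma;
          each category rate is the mean of its quantile bin."""
          scale = 1.0 / alpha                          # enforce mean 1
          edges = stats.gamma.ppf(np.linspace(0, 1, ncat + 1), a=alpha, scale=scale)
          # E[X; a < X < b] for a gamma uses the CDF with shape alpha + 1.
          upper = stats.gamma.cdf(edges, a=alpha + 1, scale=scale)
          return ncat * np.diff(upper)

      rates = discrete_gamma_rates(alpha=0.5, ncat=4)
      print(np.round(rates, 4), rates.mean())          # category rates; mean is 1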

  18. A note on `Analysis of gamma-ray burst duration distribution using mixtures of skewed distributions'

    NASA Astrophysics Data System (ADS)

    Kwong, Hok Shing; Nadarajah, Saralees

    2018-01-01

    Tarnopolski [Monthly Notices of the Royal Astronomical Society, 458 (2016) 2024-2031] analysed data sets on gamma-ray burst durations using skew distributions. He showed that the best fits are provided by two skew normal and three Gaussian distributions. Here, we suggest other distributions, including some that are heavy tailed. At least one of these distributions is shown to provide better fits than those considered in Tarnopolski. Five criteria are used to assess best fits.

  19. Impact of temporal probability in 4D dose calculation for lung tumors.

    PubMed

    Rouabhi, Ouided; Ma, Mingyu; Bayouth, John; Xia, Junyi

    2015-11-08

    The purpose of this study was to evaluate the dosimetric uncertainty in 4D dose calculation using three temporal probability distributions: uniform distribution, sinusoidal distribution, and patient-specific distribution derived from the patient respiratory trace. Temporal probability, defined as the fraction of time a patient spends in each respiratory amplitude, was evaluated in nine lung cancer patients. Four-dimensional computed tomography (4D CT), along with deformable image registration, was used to compute 4D dose incorporating the patient's respiratory motion. First, the dose of each of 10 phase CTs was computed using the same planning parameters as those used in 3D treatment planning based on the breath-hold CT. Next, deformable image registration was used to deform the dose of each phase CT to the breath-hold CT using the deformation map between the phase CT and the breath-hold CT. Finally, the 4D dose was computed by summing the deformed phase doses using their corresponding temporal probabilities. In this study, 4D dose calculated from the patient-specific temporal probability distribution was used as the ground truth. The dosimetric evaluation metrics included: 1) 3D gamma analysis, 2) mean tumor dose (MTD), 3) mean lung dose (MLD), and 4) lung V20. For seven out of nine patients, both uniform and sinusoidal temporal probability dose distributions were found to have an average gamma passing rate > 95% for both the lung and PTV regions. Compared with 4D dose calculated using the patient respiratory trace, doses using uniform and sinusoidal distribution showed a percentage difference on average of -0.1% ± 0.6% and -0.2% ± 0.4% in MTD, -0.2% ± 1.9% and -0.2% ± 1.3% in MLD, 0.09% ± 2.8% and -0.07% ± 1.8% in lung V20, -0.1% ± 2.0% and 0.08% ± 1.34% in lung V10, 0.47% ± 1.8% and 0.19% ± 1.3% in lung V5, respectively. We concluded that four-dimensional dose computed using either a uniform or sinusoidal temporal probability distribution can
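
    The final weighting step reduces to a probability-weighted sum over phase doses. A toy sketch on random arrays follows (the deformable registration itself is outside the scope of the snippet; the sinusoid-like weights are an illustrative stand-in, not the paper's exact distribution):

      import numpy as np

      rng = np.random.default_rng(42)
      n_phases, grid = 10, (64, 64, 32)

      # Hypothetical phase doses already deformed onto the breath-hold CT grid (Gy).
      phase_dose = 2.0 * rng.random((n_phases, *grid))

      # Temporal probabilities: fraction of the breathing cycle spent in each phase.
      w_uniform = np.full(n_phases, 1.0 / n_phases)
      phi = (np.arange(n_phases) + 0.5) / n_phases
      w_sin = np.sin(np.pi * phi)
      w_sin /= w_sin.sum()

      # 4D dose = sum over phases of (temporal probability) x (deformed phase dose).
      dose_uniform = np.tensordot(w_uniform, phase_dose, axes=1)
      dose_sin = np.tensordot(w_sin, phase_dose, axes=1)
      print(dose_uniform.mean(), dose_sin.mean())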

  20. How Can Histograms Be Useful for Introducing Continuous Probability Distributions?

    ERIC Educational Resources Information Center

    Derouet, Charlotte; Parzysz, Bernard

    2016-01-01

    The teaching of probability has changed a great deal since the end of the last century. The development of technologies is indeed part of this evolution. In France, continuous probability distributions began to be studied in 2002 by scientific 12th graders, but this subject was marginal and appeared only as an application of integral calculus.…

  1. Peculiarities of gamma-quanta distribution at 20 TeV energy

    NASA Technical Reports Server (NTRS)

    Ermakov, P. M.; Loktionov, A. A.; Lukin, Y. T.; Sadykov, T. K.

    1985-01-01

    The angular distribution of protons from the fragmentational region is analyzed. The gamma-quanta families are generated in a dense target by cosmic ray particles at 20 TeV energy. Families were found which had dense groups (spikes) of gamma-quanta where the rapidity density is 3 times greater than the average value determined for all registered families. The experimental data are compared with the results of simulated artificial families.

  2. Probability distribution of extreme share returns in Malaysia

    NASA Astrophysics Data System (ADS)

    Zin, Wan Zawiah Wan; Safari, Muhammad Aslam Mohd; Jaaman, Saiful Hafizah; Yie, Wendy Ling Shin

    2014-09-01

    The objective of this study is to investigate the suitable probability distribution to model the extreme share returns in Malaysia. To achieve this, weekly and monthly maximum daily share returns are derived from share prices data obtained from Bursa Malaysia over the period of 2000 to 2012. The study starts with summary statistics of the data, which provide a clue on the likely candidates for the best-fitting distribution. Next, the suitability of six extreme value distributions, namely the Gumbel, Generalized Extreme Value (GEV), Generalized Logistic (GLO), Generalized Pareto (GPA), Lognormal (GNO) and Pearson Type III (PE3) distributions, is evaluated. The method of L-moments is used in parameter estimation. Based on several goodness-of-fit tests and the L-moment diagram test, the Generalized Pareto distribution and the Pearson distribution are found to be the best-fitted distributions to represent the weekly and monthly maximum share returns in the Malaysian stock market during the studied period, respectively.
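
    For reference, the sample L-moments that drive this kind of fit can be computed with Hosking's unbiased b-statistics; the returns below are a simulated stand-in for the Bursa Malaysia maxima.

      import numpy as np
      from math import comb

      def sample_lmoments(data):
          """First four sample L-moments via the unbiased b-statistics."""
          x = np.sort(np.asarray(data, dtype=float))
          n = x.size
          b = [sum(comb(i, r) * x[i] for i in range(r, n)) / (n * comb(n - 1, r))
               for r in range(4)]
          l1 = b[0]                                   # L-location (mean)
          l2 = 2 * b[1] - b[0]                        # L-scale
          t3 = (6 * b[2] - 6 * b[1] + b[0]) / l2      # L-skewness ratio
          t4 = (20 * b[3] - 30 * b[2] + 12 * b[1] - b[0]) / l2   # L-kurtosis ratio
          return l1, l2, t3, t4

      rng = np.random.default_rng(9)
      returns = 0.02 * rng.standard_t(df=4, size=676)   # stand-in weekly maxima
      print([round(v, 4) for v in sample_lmoments(returns)])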

  3. A probable probability distribution of a series nonequilibrium states in a simple system out of equilibrium

    NASA Astrophysics Data System (ADS)

    Gao, Haixia; Li, Ting; Xiao, Changming

    2016-05-01

    When a simple system is in a nonequilibrium state, it will shift toward its equilibrium state. Obviously, in this process, there is a series of nonequilibrium states. With the assistance of Bayesian statistics and the hyperensemble, a probable probability distribution of these nonequilibrium states can be determined by maximizing the hyperensemble entropy. The equilibrium state has the largest probability, and the farther a nonequilibrium state is from the equilibrium one, the smaller its probability will be; the same conclusion can also be obtained in the multi-state space. Furthermore, if the probability stands for the relative time the corresponding nonequilibrium state can stay, then the velocity at which a nonequilibrium state returns to equilibrium can also be determined through the reciprocal of the derivative of this probability. It tells us that the farther the state is from equilibrium, the faster the returning velocity will be; if the system is near its equilibrium state, the velocity tends to become smaller and smaller, and finally tends to 0 when the system reaches the equilibrium state.

  4. Computer simulation of random variables and vectors with arbitrary probability distribution laws

    NASA Technical Reports Server (NTRS)

    Bogdan, V. M.

    1981-01-01

    Assume that there is given an arbitrary n-dimensional probability distribution F. A recursive construction is found for a sequence of functions x_1 = f_1(U_1, ..., U_n), ..., x_n = f_n(U_1, ..., U_n) such that if U_1, ..., U_n are independent random variables having uniform distribution over the open interval (0,1), then the joint distribution of the variables x_1, ..., x_n coincides with the distribution F. Since uniform independent random variables can be well simulated by means of a computer, this result allows one to simulate arbitrary n-dimensional random variables if their joint probability distribution is known.
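
    In one dimension the construction is plain inverse-transform sampling; in n dimensions it proceeds through conditional distribution functions (often called the Rosenblatt construction). A one-dimensional sketch with an exponential target:

      import numpy as np

      rng = np.random.default_rng(2)
      u = rng.uniform(size=100_000)          # independent U(0, 1) variates

      # x = F^{-1}(u) has distribution F; for F(x) = 1 - exp(-x / tau):
      tau = 3.0
      x = -tau * np.log1p(-u)                # inverse CDF applied to the uniforms
      print(x.mean(), x.var())               # approximately tau and tau**2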

  5. Probability Distribution of Turbulent Kinetic Energy Dissipation Rate in Ocean: Observations and Approximations

    NASA Astrophysics Data System (ADS)

    Lozovatsky, I.; Fernando, H. J. S.; Planella-Morato, J.; Liu, Zhiyu; Lee, J.-H.; Jinadasa, S. U. P.

    2017-10-01

    The probability distribution of the turbulent kinetic energy dissipation rate in the stratified ocean usually deviates from the classic lognormal distribution that has been formulated for and often observed in unstratified homogeneous layers of atmospheric and oceanic turbulence. Our measurements of vertical profiles of micro-scale shear, collected in the East China Sea, the northern Bay of Bengal, to the south and east of Sri Lanka, and in the Gulf Stream region, show that the probability distributions of the dissipation rate ε_r in the pycnoclines (r ≈ 1.4 m is the averaging scale) can be successfully modeled by the Burr (type XII) probability distribution. In weakly stratified boundary layers, the lognormal distribution of ε_r is preferable, although the Burr is an acceptable alternative. The skewness Sk_ε and the kurtosis K_ε of the dissipation rate appear to be well correlated in a wide range of Sk_ε and K_ε variability.

  6. Study on probability distributions for evolution in modified extremal optimization

    NASA Astrophysics Data System (ADS)

    Zeng, Guo-Qiang; Lu, Yong-Zai; Mao, Wei-Jie; Chu, Jian

    2010-05-01

    It is widely believed that the power law is a proper probability distribution that can be effectively applied for evolution in τ-EO (extremal optimization), a general-purpose stochastic local-search approach inspired by self-organized criticality, and in its applications to some NP-hard problems, e.g., graph partitioning, graph coloring, spin glasses, etc. In this study, we discover that the exponential distributions or hybrid ones (e.g., power laws with exponential cutoff) popularly used in the research of network sciences may replace the original power laws in a modified τ-EO method called the self-organized algorithm (SOA), and provide better performance than other statistical-physics-oriented methods, such as simulated annealing, τ-EO and SOA, based on experimental results on random Euclidean traveling salesman problems (TSP) and non-uniform instances. From the perspective of optimization, our results appear to demonstrate that the power law is not the only proper probability distribution for evolution in EO-like methods, at least for the TSP; the exponential and hybrid distributions may be other choices.

  7. Determination of photon emission probability for the main gamma ray and half-life measurements of 64Cu.

    PubMed

    Pibida, L; Zimmerman, B; Bergeron, D E; Fitzgerald, R; Cessna, J T; King, L

    2017-11-01

    The National Institute of Standards and Technology (NIST) performed new standardization measurements for 64Cu. As part of this work, the photon emission probability for the main gamma-ray line and the half-life were determined using several high-purity germanium (HPGe) detectors. Half-life determinations were also carried out with a NaI(Tl) well counter and two pressurized ionization chambers. Published by Elsevier Ltd.

  8. SU-E-T-471: Improvement of Gamma Knife Treatment Planning Through Tumor Control Probability for Metastatic Brain Tumors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, Z; Feng, Y; Lo, S

    2015-06-15

    Purpose: The dose-volume histogram (DVH) has been normally accepted as a tool for treatment plan evaluation. However, spatial information is lacking in the DVH. As a supplement to the DVH in three-dimensional treatment planning, the differential DVH (DDVH) provides the spatial variation, the size and the magnitude of the different dose regions within a region of interest, which can be incorporated into a tumor control probability model. This study aimed to provide a method for evaluating and improving Gamma Knife treatment planning. Methods: 10 patients with brain metastases from different primary tumors including melanoma (#1, #4, #5, #10), breast cancer (#2), prostate cancer (#3) and lung cancer (#6-9) were analyzed. By using Leksell GammaPlan software, two plans were prepared for each patient. Special attention was given to the DDVHs, which differed between plans and were used for a comparison between the two plans. The dose distribution inside the target and the tumor control probability (TCP) based on the DDVH were calculated, where cell density and radiobiological parameters were adopted from the literature. The plans were compared based on DVH, DDVH and TCP. Results: Using DVH, the coverage and selectivity were the same between plans for the 10 patients. DDVHs differed between the two plans for each patient. The paired t-test showed no significant difference in TCP between the two plans. For brain metastases from melanoma (#1, #4-5), breast cancer (#2) and lung cancer (#6-8), the difference in TCP was less than 5%. But the difference in TCP was about 6.5% for patient #3 with the metastasis from prostate cancer, and 10.1% and 178.7% for two patients (#9-10) with metastases from lung cancer. Conclusion: Although DVH provides average dose-volume information, DDVH provides differential dose-volume information with respect to different regions inside the tumor. TCP provides radiobiological information and adds additional information on improving treatment planning as well as

  9. Unfolding the fission prompt gamma-ray energy and multiplicity distribution measured by DANCE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chyzh, A; Wu, C Y; Bredeweg, T

    2010-10-16

    The nearly energy-independent γ-ray efficiency and multiplicity response of the DANCE array, the unusual characteristic elucidated in our earlier technical report (LLNL-TR-452298), gives one a unique opportunity to derive the true prompt γ-ray energy and multiplicity distribution in fission from the measurement. This unfolding procedure for the experimental data will be described in detail, and examples will be given to demonstrate the feasibility of reconstructing the true distribution.

  10. Precipitation intensity probability distribution modelling for hydrological and construction design purposes

    NASA Astrophysics Data System (ADS)

    Koshinchanov, Georgy; Dimitrov, Dobri

    2008-11-01

    The characteristics of rainfall intensity are important for many purposes, including the design of sewage and drainage systems, the tuning of flood warning procedures, etc. These are usually statistical estimates of the precipitation intensity realized over a certain period of time (e.g. 5, 10 min, etc.) with different return periods (e.g. 20, 100 years, etc.). The traditional approach to evaluating these precipitation intensities is to process the pluviometer records and fit a probability distribution to samples of intensities valid for certain locations or regions. Those estimates then become part of the state regulations used for various economic activities. Two problems occur with this approach: 1. Due to various factors the climate conditions have changed, and the precipitation intensity estimates need regular updating; 2. As the extremes of the probability distribution are of particular importance for practice, the methodology of distribution fitting needs specific attention to those parts of the distribution. The aim of this paper is to review the existing methodologies for processing intensive rainfalls and to refresh some of the statistical estimates for the studied areas. The methodologies used in Bulgaria for analyzing intensive rainfalls and producing the relevant statistical estimates are: the method of maximum intensity, used in the National Institute of Meteorology and Hydrology to process and decode the pluviometer records, followed by distribution fitting for each precipitation duration period; as above, but with separate modelling of the probability distribution for the middle and high probability quantiles; a method similar to the first one, but with an intensity threshold of 0.36 mm/min; another method proposed by the Russian hydrologist G. A. Aleksiev for regionalization of estimates over a territory, improved and adapted for Bulgaria by S. Gerasimov; the next method is considering only the

  11. Development of MCNPX-ESUT computer code for simulation of neutron/gamma pulse height distribution

    NASA Astrophysics Data System (ADS)

    Abolfazl Hosseini, Seyed; Vosoughi, Naser; Zangian, Mehdi

    2015-05-01

    In this paper, the development of the MCNPX-ESUT (MCNPX-Energy Engineering of Sharif University of Technology) computer code for simulation of neutron/gamma pulse height distributions is reported. Since liquid organic scintillators like NE-213 are well suited to and routinely used for spectrometry in mixed neutron/gamma fields, this type of detector was selected for simulation in the present study. The proposed simulation algorithm includes four main steps. The first step is the modeling of neutron/gamma transport and interactions with the materials in the environment and detector volume. In the second step, the number of scintillation photons produced by charged particles such as electrons, alphas, protons and carbon nuclei in the scintillator material is calculated. In the third step, the transport of scintillation photons in the scintillator and light guide is simulated. Finally, the resolution corresponding to the experiment is applied in the last step of the simulation. Unlike similar computer codes such as SCINFUL, NRESP7 and PHRESP, the developed code is applicable to both neutron and gamma sources; hence, discrimination of neutrons and gammas in mixed fields may be performed using MCNPX-ESUT. The main feature of the MCNPX-ESUT code is that the neutron/gamma pulse height simulation may be performed without any post-processing. In the present study, the pulse height distributions due to a monoenergetic neutron/gamma source in an NE-213 detector are simulated using MCNPX-ESUT. The simulated neutron pulse height distributions are validated through comparison with experimental data (Gohil et al., Nuclear Instruments and Methods in Physics Research Section A: Accelerators, Spectrometers, Detectors and Associated Equipment, 664 (2012) 304-309) and with results obtained from similar computer codes like SCINFUL, NRESP7 and Geant4. The simulated gamma pulse height distribution for a 137Cs
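
    The final resolution step is typically a Gaussian broadening of the ideal light output. Below is a minimal sketch using the standard resolution form for organic scintillators; the parameters a, b, and c are illustrative values, not those used in MCNPX-ESUT.

        import numpy as np

        rng = np.random.default_rng(0)

        def broaden(light_output, a=0.1, b=0.05, c=0.002):
            # Smear ideal light output L (MeVee) with a Gaussian whose FWHM follows
            # the usual scintillator form dL/L = sqrt(a^2 + b^2/L + (c/L)^2).
            L = np.asarray(light_output, dtype=float)
            fwhm = L * np.sqrt(a**2 + b**2 / L + (c / L) ** 2)
            return rng.normal(L, fwhm / 2.355)   # convert FWHM to sigma

        # toy spectrum: a monoenergetic light deposition, then the broadened histogram
        ideal = np.full(100000, 0.478)           # e.g. electrons at the 137Cs Compton edge, MeVee
        counts, edges = np.histogram(broaden(ideal), bins=200)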

  12. Work probability distribution for a ferromagnet with long-ranged and short-ranged correlations

    NASA Astrophysics Data System (ADS)

    Bhattacharjee, J. K.; Kirkpatrick, T. R.; Sengers, J. V.

    2018-04-01

    Work fluctuations and work probability distributions are fundamentally different in systems with short-ranged versus long-ranged correlations. Specifically, in systems with long-ranged correlations the work distribution is extraordinarily broad compared to systems with short-ranged correlations. This difference profoundly affects the possible applicability of fluctuation theorems like the Jarzynski fluctuation theorem. The Heisenberg ferromagnet, well below its Curie temperature, is a system with long-ranged correlations in very low magnetic fields due to the presence of Goldstone modes. As the magnetic field is increased the correlations gradually become short ranged. Hence, such a ferromagnet is an ideal system for elucidating the changes of the work probability distribution as one goes from a domain with long-ranged correlations to a domain with short-ranged correlations by tuning the magnetic field. A quantitative analysis of this crossover behavior of the work probability distribution and the associated fluctuations is presented.

  13. Neural Correlates of the Divergence of Instrumental Probability Distributions

    PubMed Central

    Wang, Shuo; Zhang, June; O'Doherty, John P.

    2013-01-01

    Flexible action selection requires knowledge about how alternative actions impact the environment: a "cognitive map" of instrumental contingencies. Reinforcement learning theories formalize this map as a set of stochastic relationships between actions and states, such that for any given action considered in a current state, a probability distribution is specified over possible outcome states. Here, we show that activity in the human inferior parietal lobule correlates with the divergence of such outcome distributions, a measure that reflects whether discrimination between alternative actions increases the controllability of the future, and, further, that this effect is dissociable from those of other information-theoretic and motivational variables, such as outcome entropy, action values, and outcome utilities. Our results suggest that, although ultimately combined with reward estimates to generate action values, outcome probability distributions associated with alternative actions may be contrasted independently of valence computations, to narrow the scope of the action selection problem. PMID:23884955

  14. Measurements of gas hydrate formation probability distributions on a quasi-free water droplet

    NASA Astrophysics Data System (ADS)

    Maeda, Nobuo

    2014-06-01

    A High Pressure Automated Lag Time Apparatus (HP-ALTA) can measure gas hydrate formation probability distributions for water in a glass sample cell. In an HP-ALTA, gas hydrate formation originates near the edges of the sample cell, and gas hydrate films subsequently grow across the water-guest gas interface. Ideally, one would measure the gas hydrate formation probability distributions of a single water droplet or mist freely levitating in a guest gas, but this is technically challenging. The next best option is to let a water droplet sit on top of a denser, immiscible, inert, wall-wetting hydrophobic liquid, to avoid contact between the water droplet and the solid walls. Here we report the development of a second-generation HP-ALTA which can measure gas hydrate formation probability distributions of a water droplet sitting on a perfluorocarbon oil in a container coated with 1H,1H,2H,2H-perfluorodecyltriethoxysilane. It was found that the gas hydrate formation probability distributions of such a quasi-free water droplet were significantly lower than those of water in a glass sample cell.

  15. Uncertainty squared: Choosing among multiple input probability distributions and interpreting multiple output probability distributions in Monte Carlo climate risk models

    NASA Astrophysics Data System (ADS)

    Baer, P.; Mastrandrea, M.

    2006-12-01

    Simple probabilistic models which attempt to estimate likely transient temperature change from specified CO2 emissions scenarios must make assumptions about at least six uncertain aspects of the causal chain between emissions and temperature: current radiative forcing (including but not limited to aerosols), current land use emissions, carbon sinks, future non-CO2 forcing, ocean heat uptake, and climate sensitivity. Of these, multiple PDFs (probability density functions) have been published for the climate sensitivity, a couple for current forcing and ocean heat uptake, one for future non-CO2 forcing, and none for current land use emissions or carbon cycle uncertainty (which are interdependent). Different assumptions about these parameters, as well as different model structures, will lead to different estimates of likely temperature increase from the same emissions pathway. Thus policymakers will be faced with a range of temperature probability distributions for the same emissions scenarios, each described by a central tendency and spread. Because our conventional understanding of uncertainty and probability requires that a probabilistically defined variable of interest have only a single mean (or median, or modal) value and a well-defined spread, this "multidimensional" uncertainty defies straightforward utilization in policymaking. We suggest that there are no simple solutions to the questions raised. Crucially, we must dispel the notion that there is a "true" probability: probabilities of this type are necessarily subjective, and reasonable people may disagree. Indeed, we suggest that what is at stake is precisely the question: what is it reasonable to believe, and to act as if we believe? As a preliminary suggestion, we demonstrate how the output of a simple probabilistic climate model might be evaluated regarding the reasonableness of the outputs it calculates with different input PDFs. We suggest further that where there is insufficient evidence to clearly

  16. Score distributions of gapped multiple sequence alignments down to the low-probability tail

    NASA Astrophysics Data System (ADS)

    Fieth, Pascal; Hartmann, Alexander K.

    2016-08-01

    Assessing the significance of alignment scores of optimally aligned DNA or amino acid sequences can be achieved via knowledge of the score distribution of random sequences. But this requires obtaining the distribution in the biologically relevant high-scoring region, where the probabilities are exponentially small. For gapless local alignments of infinitely long sequences this distribution is known analytically to follow a Gumbel distribution. Distributions for gapped local alignments and global alignments of finite lengths can only be obtained numerically. To obtain results in the small-probability region, specific statistical mechanics-based rare-event algorithms can be applied. In previous studies, this was achieved for pairwise alignments; those studies showed that, contrary to results from earlier simple-sampling studies, strong deviations from the Gumbel distribution occur for finite sequence lengths. Here we extend the studies to multiple sequence alignments with gaps, which are much more relevant for practical applications in molecular biology. We study the distributions of scores over a large range of the support, reaching probabilities as small as 10^-160, for global and local (sum-of-pair scores) multiple alignments. We find that even after suitable rescaling, eliminating the sequence-length dependence, the distributions for multiple alignments differ from the pairwise alignment case. Furthermore, we show that the previously discussed Gaussian correction to the Gumbel distribution needs to be refined, also for the case of pairwise alignments.

  17. Exact probability distribution function for the volatility of cumulative production

    NASA Astrophysics Data System (ADS)

    Zadourian, Rubina; Klümper, Andreas

    2018-04-01

    In this paper we study the volatility and its probability distribution function for cumulative production based on the experience curve hypothesis. This work generalizes the study of volatility in Lafond et al. (2017), which addressed the effects of normally distributed noise in the production process. Owing to its wide applicability in industrial and technological activities, we present here the mathematical foundation for an arbitrary distribution function of the process, which we expect will pave the way for future research on forecasting of production processes.

  18. Relationships between log N-log S and celestial distribution of gamma-ray bursts

    NASA Technical Reports Server (NTRS)

    Nishimura, J.; Yamagami, T.

    1985-01-01

    The apparent conflict between the log N-log S curve and the isotropic celestial distribution of gamma-ray bursts is discussed. A possible selection effect due to the time profile of each burst is examined. It is shown that the contradiction is due to this selection effect of the gamma-ray bursts.

  19. A Variance Distribution Model of Surface EMG Signals Based on Inverse Gamma Distribution.

    PubMed

    Hayashi, Hideaki; Furui, Akira; Kurita, Yuichi; Tsuji, Toshio

    2017-11-01

    Objective: This paper describes the formulation of a surface electromyogram (EMG) model capable of representing the variance distribution of EMG signals. Methods: In the model, EMG signals are handled based on a Gaussian white noise process with a mean of zero for each variance value. EMG signal variance is taken as a random variable that follows inverse gamma distribution, allowing the representation of noise superimposed onto this variance. Variance distribution estimation based on marginal likelihood maximization is also outlined in this paper. The procedure can be approximated using rectified and smoothed EMG signals, thereby allowing the determination of distribution parameters in real time at low computational cost. Results: A simulation experiment was performed to evaluate the accuracy of distribution estimation using artificially generated EMG signals, with results demonstrating that the proposed model's accuracy is higher than that of maximum-likelihood-based estimation. Analysis of variance distribution using real EMG data also suggested a relationship between variance distribution and signal-dependent noise. Conclusion: The study reported here was conducted to examine the performance of a proposed surface EMG model capable of representing variance distribution and a related distribution parameter estimation method. Experiments using artificial and real EMG data demonstrated the validity of the model. Significance: Variance distribution estimated using the proposed model exhibits potential in the estimation of muscle force.
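
    A minimal generative sketch of the model described above: each EMG sample is zero-mean Gaussian, and its variance is itself a draw from an inverse gamma distribution. The hyperparameters are arbitrary illustrative values, and this is not the authors' parameter estimation procedure.

        import numpy as np

        rng = np.random.default_rng(1)

        alpha, beta = 3.0, 2.0    # inverse gamma shape and scale (assumed values)
        n = 10000
        # if X ~ Gamma(shape=alpha, rate=beta), then 1/X ~ InvGamma(alpha, beta)
        variances = 1.0 / rng.gamma(alpha, 1.0 / beta, size=n)
        emg = rng.normal(0.0, np.sqrt(variances))   # Gaussian sample given each variance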

  20. Probability Distribution Estimated From the Minimum, Maximum, and Most Likely Values: Applied to Turbine Inlet Temperature Uncertainty

    NASA Technical Reports Server (NTRS)

    Holland, Frederic A., Jr.

    2004-01-01

    Modern engineering design practices are tending more toward the treatment of design parameters as random variables as opposed to fixed, or deterministic, values. The probabilistic design approach attempts to account for the uncertainty in design parameters by representing them as a distribution of values rather than as a single value. The motivations for this effort include preventing excessive overdesign as well as assessing and assuring reliability, both of which are important for aerospace applications. However, the determination of the probability distribution is a fundamental problem in reliability analysis. A random variable is often defined by the parameters of the theoretical distribution function that gives the best fit to experimental data. In many cases the distribution must be assumed from very limited information or data. Often the types of information that are available or reasonably estimated are the minimum, maximum, and most likely values of the design parameter. For these situations the beta distribution model is very convenient because the parameters that define the distribution can be easily determined from these three pieces of information. Widely used in the field of operations research, the beta model is very flexible and is also useful for estimating the mean and standard deviation of a random variable given only the aforementioned three values. However, an assumption is required to determine the four parameters of the beta distribution from only these three pieces of information (some of the more common distributions, like the normal, lognormal, gamma, and Weibull distributions, have two or three parameters). The conventional method assumes that the standard deviation is a certain fraction of the range. The beta parameters are then determined by solving a set of equations simultaneously. A new method developed in-house at the NASA Glenn Research Center assumes a value for one of the beta shape parameters based on an analogy with the normal
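
    The conventional method mentioned above (standard deviation taken as a fixed fraction of the range) has a common closed-form realization in the PERT approximation, sketched below; this illustrates that convention only, not the NASA Glenn in-house method the report develops.

        def pert_beta(lo, mode, hi):
            # Beta shape parameters from minimum, most likely, and maximum under the
            # PERT convention, which assumes sigma is 1/6 of the range.
            alpha = 1.0 + 4.0 * (mode - lo) / (hi - lo)
            beta = 1.0 + 4.0 * (hi - mode) / (hi - lo)
            mean = (lo + 4.0 * mode + hi) / 6.0
            std = (hi - lo) / 6.0
            return alpha, beta, mean, std

        # hypothetical turbine inlet temperature estimates (K): min, most likely, max
        print(pert_beta(1400.0, 1500.0, 1650.0))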

  1. Measurements of neutron distribution in neutrons-gamma-rays mixed field using imaging plate for neutron capture therapy.

    PubMed

    Tanaka, Kenichi; Endo, Satoru; Hoshi, Masaharu

    2010-01-01

    The imaging plate (IP) technique was investigated as a convenient method to measure the spatial neutron distribution via the 157Gd(n,gamma)158Gd reaction for neutron capture therapy (NCT). For this purpose, an IP was set in a water phantom and irradiated in a mixed field of neutrons and gamma-rays. The Hiroshima University Radiobiological Research Accelerator was used for this experiment. The neutrons were moderated with 20-cm-thick D2O to obtain a neutron field suitable for NCT. The signal of an IP doped with Gd as a neutron-response enhancer was corrected for the gamma-ray contribution, which was estimated using an IP without Gd. The gamma-ray response ratio of the Gd-doped IP to the non-Gd IP was set at 1.34, the value measured for 60Co gamma-rays, in estimating the gamma-ray contribution to the Gd-doped IP signal. The measured distribution of the 157Gd(n,gamma)158Gd reaction rate then agrees within 10% with the value calculated by a method whose reproducibility has already been validated against Au activation. However, the evaluated distribution of the 157Gd(n,gamma)158Gd reaction rate is quite sensitive to the assumed gamma-ray energy; e.g., the discrepancy between measurement and calculation grows to 30% when the photon energy is changed from 33 keV to 1.253 MeV.

  2. Regional probability distribution of the annual reference evapotranspiration and its effective parameters in Iran

    NASA Astrophysics Data System (ADS)

    Khanmohammadi, Neda; Rezaie, Hossein; Montaseri, Majid; Behmanesh, Javad

    2017-10-01

    The reference evapotranspiration (ET0) plays an important role in water management plans in arid and semi-arid countries such as Iran, so regional analysis of this parameter is important. The ET0 process is affected by several meteorological parameters such as wind speed, solar radiation, temperature and relative humidity; therefore, the effect of the distribution type of the effective meteorological variables on the ET0 distribution was analyzed. For this purpose, the regional probability distributions of annual ET0 and its effective parameters were determined. The data used in this research were recorded at 30 synoptic stations of Iran during 1960-2014. Using the probability plot correlation coefficient (PPCC) test and the L-moment method, five common distributions were compared and the best distribution was selected. The results of the PPCC test and the L-moment diagram indicated that the Pearson type III distribution was the best probability distribution for fitting annual ET0 and its four effective parameters. The RMSE results showed that the PPCC test and the L-moment method have similar ability for regional analysis of reference evapotranspiration and its effective parameters. The results also showed that the distribution type of the parameters affecting ET0 can affect the distribution of reference evapotranspiration.
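
    The PPCC test is straightforward to sketch: for each candidate distribution, correlate the ordered sample with the quantiles the fitted distribution assigns to the plotting positions; the candidate with PPCC closest to 1 is preferred. The snippet below runs on synthetic stand-in data and uses maximum-likelihood fits rather than the L-moment fits of the paper.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(6)
        et0 = rng.gamma(20.0, 60.0, size=55)   # stand-in for 55 years of annual ET0 (mm)

        def ppcc(sample, dist):
            x = np.sort(sample)
            n = len(x)
            pp = (np.arange(1, n + 1) - 0.44) / (n + 0.12)   # Gringorten plotting positions
            params = dist.fit(sample)
            return np.corrcoef(x, dist.ppf(pp, *params))[0, 1]

        for dist in (stats.pearson3, stats.norm, stats.gumbel_r, stats.lognorm):
            print(dist.name, round(ppcc(et0, dist), 4))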

  3. Distribution of cosmic gamma rays in the galactic anticenter region as observed by SAS-2

    NASA Technical Reports Server (NTRS)

    Kniffen, D. A.; Fichtel, C. E.; Hartman, R. C.; Thompson, D. J.; Ozel, M. E.; Tumer, T.; Bignami, G. F.; Ogelman, H.

    1975-01-01

    The high energy (above 35 MeV) gamma-ray telescope flown on the second Small Astronomy Satellite collected over one thousand gamma rays from the direction of the galactic anticenter. In addition to the diffuse galactic emission, the distribution indicates a strong pulsed contribution from the Crab nebula with the same period and phase as the NP 0532 pulsar. There also appears to be an excess in the direction of (gal. long. ≈ 195 deg; gal. lat. ≈ +5 deg), a region containing old supernova remnants. Searches for gamma-ray pulsations from other pulsars in the region do not show any statistically significant signal. The general intensity distribution of the gamma rays away from the plane appears to be similar to nonthermal radio emission brightness contours.

  4. (p,gamma) angular distribution measurements on 19F(p,alpha gamma)16O at 340, 598, and 669 keV (in German)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Retz-Schmidt, Th.

    1958-10-01

    Experimental investigations of the behavior of the 6.14-MeV radiation in the 19F(p,alpha gamma)16O reaction gave the following angular distributions: I_gamma(669) ~ isotropic, I_gamma(598) ~ 1 + 0.17 cos^2(THETA), and I_gamma(340) ~ 1 - 0.035 cos^2(THETA). The result in the last case, which deviates from earlier measurements, is in better agreement with the basic assumption that, in addition to the s-protons, approximately 1% d-protons participate in the reaction at E_p = 340 keV. (tr-auth)

  5. Modelling the Probability of Landslides Impacting Road Networks

    NASA Astrophysics Data System (ADS)

    Taylor, F. E.; Malamud, B. D.

    2012-04-01

    During a landslide-triggering event, the threat of landslides blocking roads poses a risk to logistics, rescue efforts and communities dependent on those road networks. Here we present preliminary results of a stochastic model we have developed to evaluate the probability of landslides intersecting a simple road network during a landslide-triggering event, and apply simple network indices to measure the state of the road network in the affected region. A 4000 x 4000 cell array with a 5 m x 5 m resolution was used, with a pre-defined simple road network laid onto it and landslides 'randomly' dropped onto it. Landslide areas (AL) were randomly selected from a three-parameter inverse gamma probability density function, consisting of a power-law decay of about -2.4 for medium and large values of AL and an exponential rollover for small values of AL; the rollover (maximum probability) occurs at about AL = 400 m2. This statistical distribution was chosen based on three substantially complete triggered landslide inventories recorded in the existing literature. The number of landslide areas (NL) selected for each triggered event iteration was chosen to give an average density of 1 landslide km-2, i.e. NL = 400 landslide areas chosen randomly for each iteration, based on several existing triggered landslide event inventories. A simple road network was chosen, in a 'T' shape configuration, with one road of 1 x 4000 cells (5 m x 20 km) joined in a 'T' formation by another road of 1 x 2000 cells (5 m x 10 km). The landslide areas were then randomly 'dropped' over the road array and indices such as the location, size (ABL) and number of road blockages (NBL) recorded. This process was performed 500 times (iterations) in a Monte-Carlo type simulation. Initial results show that for a landslide-triggering event with 400 landslides over a 400 km2 region, the number of road blocks per iteration, NBL, ranges from 0 to 7. The average blockage area over the 500 iterations (ABL) is about 3000 m
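
    The core of such a simulation can be sketched in a few lines. The inverse gamma parameters below are assumed, chosen only so that the density peaks near AL = 400 m2 and decays with a power-law exponent of about -2.4; landslides are idealized as circles and the road as the line y = 0.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(2)
        area_dist = stats.invgamma(a=1.4, scale=1000.0)   # landslide areas (m^2); tail ~ A**-2.4

        n_iter, n_slides, L = 500, 400, 20000.0           # iterations, slides per event, domain (m)
        blocks = np.zeros(n_iter, dtype=int)
        for i in range(n_iter):
            areas = area_dist.rvs(n_slides, random_state=rng)
            y = rng.uniform(-L / 2, L / 2, n_slides)      # distance of each centre from the road
            r = np.sqrt(areas / np.pi)                    # landslide idealized as a circle
            blocks[i] = np.count_nonzero(np.abs(y) <= r)  # circle reaches the road y = 0
        print(blocks.min(), blocks.mean(), blocks.max())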

  6. SER Analysis of MPPM-Coded MIMO-FSO System over Uncorrelated and Correlated Gamma-Gamma Atmospheric Turbulence Channels

    NASA Astrophysics Data System (ADS)

    Khallaf, Haitham S.; Garrido-Balsells, José M.; Shalaby, Hossam M. H.; Sampei, Seiichi

    2015-12-01

    The performance of multiple-input multiple-output free-space optical (MIMO-FSO) communication systems that adopt multipulse pulse-position modulation (MPPM) is analyzed. Both exact and approximate symbol-error rates (SERs) are derived for the cases of uncorrelated and correlated channels. The effects of background noise, receiver shot noise, and atmospheric turbulence are taken into consideration in our analysis. The random fluctuations of the received optical irradiance produced by atmospheric turbulence are modeled by the widely used gamma-gamma statistical distribution, while uncorrelated MIMO channels are modeled by the α-μ distribution. A closed-form expression for the probability density function of the received optical irradiance is derived for the case of correlated MIMO channels. Using our analytical expressions, the degradation of system performance with increasing correlation between MIMO channels is corroborated.
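
    For reference, the gamma-gamma model treats the received irradiance as the product of two independent, unit-mean gamma variates representing large- and small-scale turbulence. A sampling sketch with assumed turbulence parameters (not values from the paper):

        import numpy as np

        rng = np.random.default_rng(3)

        def gamma_gamma_irradiance(alpha, beta, size):
            # I = X * Y with X, Y independent unit-mean gamma variates
            x = rng.gamma(alpha, 1.0 / alpha, size)   # large-scale fluctuations, E[X] = 1
            y = rng.gamma(beta, 1.0 / beta, size)     # small-scale fluctuations, E[Y] = 1
            return x * y

        I = gamma_gamma_irradiance(4.0, 1.9, 100000)  # assumed moderate-to-strong turbulence
        print(I.mean(), I.var())  # mean ~ 1; the variance is the scintillation index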

  7. Cumulative Poisson Distribution Program

    NASA Technical Reports Server (NTRS)

    Bowerman, Paul N.; Scheuer, Ernest M.; Nolty, Robert

    1990-01-01

    Overflow and underflow in sums prevented. Cumulative Poisson Distribution Program, CUMPOIS, one of two computer programs that make calculations involving cumulative Poisson distributions. Both programs, CUMPOIS (NPO-17714) and NEWTPOIS (NPO-17715), used independently of one another. CUMPOIS determines cumulative Poisson distribution, used to evaluate cumulative distribution function (cdf) for gamma distributions with integer shape parameters and cdf for chi-square distributions with even degrees of freedom. Used by statisticians and others concerned with probabilities of independent events occurring over specific units of time, area, or volume. Written in C.
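
    The identities CUMPOIS exploits are easy to verify numerically: the cumulative Poisson probability equals the survival function of a gamma distribution with integer shape parameter, and of a chi-square distribution with even degrees of freedom. A SciPy check of these relationships (CUMPOIS itself is written in C):

        from scipy import stats
        from scipy.special import gammaincc

        lam, k = 7.5, 10
        print(stats.poisson.cdf(k, lam))               # direct cumulative Poisson
        print(stats.gamma(a=k + 1).sf(lam))            # gamma sf with integer shape k+1
        print(stats.chi2(df=2 * (k + 1)).sf(2 * lam))  # chi-square sf with even dof
        print(gammaincc(k + 1, lam))                   # regularized upper incomplete gamma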

  8. The hump in the Cerenkov lateral distribution of gamma ray showers

    NASA Technical Reports Server (NTRS)

    Sinha, S.; Sao, M. V. S.

    1985-01-01

    The lateral distribution of atmospheric Cerenkov photons emitted by gamma-ray showers of energy 100 GeV is calculated. The lateral distribution shows a characteristic hump at a distance of approximately 135 meters from the core. The hump is shown to be due to electrons above a threshold energy of 1 GeV, beyond which the mean scattering angle becomes smaller than the Cerenkov angle.

  9. Statistical tests for whether a given set of independent, identically distributed draws comes from a specified probability density.

    PubMed

    Tygert, Mark

    2010-09-21

    We discuss several tests for determining whether a given set of independent and identically distributed (i.i.d.) draws does not come from a specified probability density function. The most commonly used are Kolmogorov-Smirnov tests, particularly Kuiper's variant, which focus on discrepancies between the cumulative distribution function for the specified probability density and the empirical cumulative distribution function for the given set of i.i.d. draws. Unfortunately, variations in the probability density function often get smoothed over in the cumulative distribution function, making it difficult to detect discrepancies in regions where the probability density is small in comparison with its values in surrounding regions. We discuss tests without this deficiency, complementing the classical methods. The tests of the present paper are based on the plain fact that it is unlikely to draw a random number whose probability is small, provided that the draw is taken from the same distribution used in calculating the probability (thus, if we draw a random number whose probability is small, then we can be confident that we did not draw the number from the same distribution used in calculating the probability).

  10. Comparison of different probability distributions for earthquake hazard assessment in the North Anatolian Fault Zone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yilmaz, Şeyda, E-mail: seydayilmaz@ktu.edu.tr; Bayrak, Erdem, E-mail: erdmbyrk@gmail.com; Bayrak, Yusuf, E-mail: bayrak@ktu.edu.tr

    In this study we examined and compared three different probability distribution methods to determine the most suitable model for probabilistic assessment of earthquake hazards. We analyzed a reliable homogeneous earthquake catalogue for the period 1900-2015 for magnitudes M ≥ 6.0 and estimated the probabilistic seismic hazard in the North Anatolian Fault zone (39°-41° N, 30°-40° E) using three distributions, namely the Weibull distribution, the Frechet distribution and the three-parameter Weibull distribution. The suitability of the distribution parameters was evaluated with the Kolmogorov-Smirnov (K-S) goodness-of-fit test. We also compared the estimated cumulative probabilities and the conditional probabilities of earthquake occurrence for different elapsed times using these three distributions. We used Easyfit and Matlab software to calculate the distribution parameters and plotted the conditional probability curves. We concluded that the Weibull distribution was the most suitable of the distribution methods considered in this region.

  11. Zipf's law and the effect of ranking on probability distributions

    NASA Astrophysics Data System (ADS)

    Günther, R.; Levitin, L.; Schapiro, B.; Wagner, P.

    1996-02-01

    Ranking procedures are widely used in the description of many different types of complex systems. Zipf's law is one of the most remarkable frequency-rank relationships and has been observed independently in physics, linguistics, biology, demography, etc. We show that ranking plays a crucial role in making it possible to detect empirical relationships in systems that exist in one realization only, even when the statistical ensemble to which the systems belong has a very broad probability distribution. Analytical results and numerical simulations are presented which clarify the relations between the probability distributions and the behavior of expected values for unranked and ranked random variables. This analysis is performed, in particular, for the evolutionary model presented in our previous papers which leads to Zipf's law and reveals the underlying mechanism of this phenomenon in terms of a system with interdependent and interacting components as opposed to the “ideal gas” models suggested by previous researchers. The ranking procedure applied to this model leads to a new, unexpected phenomenon: a characteristic “staircase” behavior of the mean values of the ranked variables (ranked occupation numbers). This result is due to the broadness of the probability distributions for the occupation numbers and does not follow from the “ideal gas” model. Thus, it provides an opportunity, by comparison with empirical data, to obtain evidence as to which model relates to reality.

  12. Investigation of Dielectric Breakdown Characteristics for Double-break Vacuum Interrupter and Dielectric Breakdown Probability Distribution in Vacuum Interrupter

    NASA Astrophysics Data System (ADS)

    Shioiri, Tetsu; Asari, Naoki; Sato, Junichi; Sasage, Kosuke; Yokokura, Kunio; Homma, Mitsutaka; Suzuki, Katsumi

    To investigate the reliability of vacuum insulation equipment, a study was carried out to clarify the breakdown probability distributions of the vacuum gap. Further, a double-break vacuum circuit breaker was investigated for its breakdown probability distribution. The test results show that the breakdown probability distribution of the vacuum gap can be represented by a Weibull distribution with a location parameter, which gives the voltage at which the breakdown probability is zero. The location parameter obtained from the Weibull plot depends on the electrode area. The shape parameter obtained from the Weibull plot of the vacuum gap was 10∼14, and is constant irrespective of the non-uniform field factor. The breakdown probability distribution after no-load switching can also be represented by a Weibull distribution with a location parameter. The shape parameter after no-load switching was 6∼8.5, and is constant irrespective of gap length; this indicates that the scatter of the breakdown voltage is increased by no-load switching. If the vacuum circuit breaker uses a double break, the breakdown probability at low voltage becomes lower than for a single break. Although the potential distribution is a concern in the double-break vacuum circuit breaker, its insulation reliability is better than that of the single-break vacuum interrupter even if the bias of the vacuum interrupters' voltage sharing is taken into account.
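
    As a sketch of the analysis described, fitting a three-parameter Weibull distribution to breakdown voltages recovers the location parameter (the zero-breakdown-probability voltage) along with the shape parameter. The data below are synthetic, generated from assumed parameters, not the paper's measurements.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        # synthetic breakdown voltages (kV) with an assumed location of 80 kV
        v_bd = stats.weibull_min(c=12.0, loc=80.0, scale=40.0).rvs(60, random_state=rng)
        c, loc, scale = stats.weibull_min.fit(v_bd)
        print(f"shape={c:.1f}, location={loc:.1f} kV, scale={scale:.1f} kV")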

  13. The calculation of neutron capture gamma-ray yields for space shielding applications

    NASA Technical Reports Server (NTRS)

    Yost, K. J.

    1972-01-01

    The application of nuclear models to the calculation of neutron capture and inelastic scattering gamma yields is discussed. The gamma-ray cascade model describes the cascade process in terms of parameters which either (1) embody statistical assumptions regarding electric and magnetic multipole transition strengths, level densities, and spin and parity distributions, or (2) are fixed by experiment, such as measured energies, spin and parity values, and transition probabilities for low-lying states.

  14. Landslide Probability Assessment by the Derived Distributions Technique

    NASA Astrophysics Data System (ADS)

    Muñoz, E.; Ochoa, A.; Martínez, H.

    2012-12-01

    Landslides are potentially disastrous events that bring human and economic losses, especially in cities where accelerated and unorganized growth leads to settlements on steep and potentially unstable areas. Among the main causes of landslides are geological, geomorphological, geotechnical, climatological and hydrological conditions, and anthropic intervention. This paper studies landslides detonated by rain, commonly known as "soil-slips", which are characterized by a superficial failure surface (typically between 1 and 1.5 m deep) parallel to the slope face and by being triggered by intense and/or sustained periods of rain. This type of landslide is caused by changes in pore pressure produced by a decrease in suction when a humid front enters, as a consequence of infiltration initiated by rain and governed by the hydraulic characteristics of the soil. Failure occurs when this front reaches a critical depth and the shear strength of the soil is not enough to guarantee the stability of the mass. Critical rainfall thresholds in combination with a slope stability model are widely used for assessing landslide probability. In this paper we present a model for estimating the occurrence of landslides based on the derived distributions technique. Since the works of Eagleson in the 1970s, the derived distributions technique has been widely used in hydrology to estimate the probability of occurrence of extreme flows. The model estimates the probability density function (pdf) of the Factor of Safety (FOS) from the statistical behavior of the rainfall process and some slope parameters. The stochastic character of the rainfall is transformed by means of a deterministic failure model into the FOS pdf. Exceedance probability and return period estimation are then straightforward. The rainfall process is modeled as a Rectangular Pulses Poisson Process (RPPP) with independent exponential pdfs for the mean intensity and duration of the storms. The Philip infiltration model

  15. Dosimetric effects of Onyx embolization on Gamma Knife arteriovenous malformation dose distributions.

    PubMed

    Schlesinger, David J; Nordström, Håkan; Lundin, Anders; Xu, Zhiyuan; Sheehan, Jason P

    2016-12-01

    OBJECTIVE Patients with arteriovenous malformations (AVMs) treated with Gamma Knife radiosurgery (GKRS) subsequent to embolization suffer from elevated local failure rates and differences in adverse radiation effects. Onyx is a common embolic material for AVMs. Onyx is formulated with tantalum, a high atomic number (Z = 73) element that has been investigated as a source of dosimetric uncertainty contributing to the less favorable clinical results. However, prior studies have not modeled the complicated anatomical and beam geometries characteristic of GKRS. This study investigated the magnitude of dose perturbation that can occur due to Onyx embolization using clinically realistic anatomical and Gamma Knife beam models. METHODS Leksell GammaPlan (LGP) was used to segment the AVM nidus and areas of Onyx from postcontrast stereotactic MRI for 7 patients treated with GKRS postembolization. The resulting contours, skull surface, and clinically selected dose distributions were exported from LGP in DICOM-RT (Digital Imaging and Communications in Medicine-radiotherapy) format. Isocenter locations and dwell times were recorded from the LGP database. Contours were converted into 3D mesh representations using commercial and in-house mesh-editing software. The resulting data were imported into a Monte Carlo (MC) dose calculation engine (Pegasos, Elekta Instruments AB) with a beam geometry for the Gamma Knife Perfexion. The MC-predicted dose distributions were calculated with Onyx assigned manufacturer-reported physical constants (MC-Onyx), and then compared with corresponding distributions in which Onyx was reassigned constants for water (MC-water). Differences in dose metrics were determined, including minimum, maximum, and mean dose to the AVM nidus; selectivity index; and target coverage. Combined differences in dose magnitude and distance to agreement were calculated as 3D Gamma analysis passing rates using tolerance criteria of 0.5%/0.5 mm, 1.0%/1.0 mm, and 3.0%/3.0 mm

  16. Code CUGEL: A code to unfold Ge(Li) spectrometer polyenergetic gamma photon experimental distributions

    NASA Technical Reports Server (NTRS)

    Steyn, J. J.; Born, U.

    1970-01-01

    A FORTRAN code was developed for the Univac 1108 digital computer to unfold polyenergetic gamma-photon experimental distributions from lithium-drifted germanium [Ge(Li)] semiconductor spectrometers. It was designed to analyze the combined continuous and monoenergetic gamma radiation field of radioisotope volumetric sources. The code generates the detector system response matrix function and applies it to monoenergetic spectral components discretely and to the continuum iteratively. It corrects for system drift, source decay, background, and detection efficiency. Results are presented in digital form for differential and integrated photon number and energy distributions, and for exposure dose.

  17. Digital simulation of two-dimensional random fields with arbitrary power spectra and non-Gaussian probability distribution functions.

    PubMed

    Yura, Harold T; Hanson, Steen G

    2012-04-01

    Methods for the simulation of two-dimensional signals with arbitrary power spectral densities and signal amplitude probability density functions are disclosed. The method relies on initially transforming a white-noise sample set of Gaussian-distributed random numbers into a corresponding set with the desired spectral distribution, after which this colored Gaussian probability distribution is transformed via an inverse transform into the desired probability distribution. In most cases the method provides satisfactory results and can thus be considered an engineering approach. Several illustrative examples with relevance for optics are given.
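
    A minimal two-step sketch of this approach, assuming a Gaussian power spectrum and a Weibull target amplitude distribution (the published method additionally iterates to correct the spectral distortion introduced by the final memoryless transform, which is omitted here):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(4)
        n = 256
        white = rng.standard_normal((n, n))

        # step 1: impose the desired power spectrum by filtering in Fourier space
        kx = np.fft.fftfreq(n)[None, :]
        ky = np.fft.fftfreq(n)[:, None]
        H = np.exp(-(kx**2 + ky**2) / (2 * 0.05**2))   # sqrt of an assumed Gaussian spectrum
        colored = np.real(np.fft.ifft2(np.fft.fft2(white) * H))
        colored /= colored.std()

        # step 2: memoryless transform to the target marginal distribution:
        # Gaussian -> uniform via the normal CDF, uniform -> Weibull via the inverse CDF
        u = stats.norm.cdf(colored)
        field = stats.weibull_min(c=1.5).ppf(u)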

  18. Vertical changes in the probability distribution of downward irradiance within the near-surface ocean under sunny conditions

    NASA Astrophysics Data System (ADS)

    Gernez, Pierre; Stramski, Dariusz; Darecki, Miroslaw

    2011-07-01

    Time series measurements of fluctuations in underwater downward irradiance, Ed, within the green spectral band (532 nm) show that the probability distribution of instantaneous irradiance varies greatly as a function of depth within the near-surface ocean under sunny conditions. Because of intense light flashes caused by surface wave focusing, the near-surface probability distributions are highly skewed to the right and are heavy tailed. The coefficients of skewness and excess kurtosis at depths smaller than 1 m can exceed 3 and 20, respectively. We tested several probability models, such as lognormal, Gumbel, Fréchet, log-logistic, and Pareto, which are potentially suited to describe the highly skewed heavy-tailed distributions. We found that the models cannot approximate with consistently good accuracy the high irradiance values within the right tail of the experimental distribution, where the probability of these values is less than 10%. This portion of the distribution corresponds approximately to light flashes with Ed > 1.5 Ēd, where Ēd is the time-averaged downward irradiance. However, the remaining part of the probability distribution, covering all irradiance values smaller than the 90th percentile, can be described with reasonable accuracy (i.e., within 20%) by a lognormal model for all 86 measurements from the top 10 m of the ocean included in this analysis. As the intensity of irradiance fluctuations decreases with depth, the probability distribution tends toward a function symmetrical around the mean, like the normal distribution. For the examined data set, the skewness and excess kurtosis assumed values very close to zero at a depth of about 10 m.

  19. Human EEG gamma oscillations in neuropsychiatric disorders.

    PubMed

    Herrmann, C S; Demiralp, T

    2005-12-01

    Due to their small amplitude, the importance of high-frequency EEG oscillations with respect to cognitive functions and disorders is often underestimated as compared to slower oscillations. This article reviews the literature on the alterations of gamma oscillations (about 30-80 Hz) during the course of neuropsychiatric disorders and relates them to a model for the functional role of these oscillations for memory matching. The synchronous firing of neurons in the gamma-band has been proposed to bind multiple features of an object, which are coded in a distributed manner in the brain, and is modulated by cognitive processes such as attention and memory. In certain neuropsychiatric disorders the gamma activity shows significant changes. In schizophrenic patients, negative symptoms correlate with a decrease of gamma responses, whereas a significant increase in gamma amplitudes is observed during positive symptoms such as hallucinations. A reduction is also observed in Alzheimer's Disease (AD), whereas an increase is found in epileptic patients, probably reflecting both cortical excitation and perceptual distortions such as déjà vu phenomena frequently observed in epilepsy. ADHD patients also exhibit increased gamma amplitudes. A hypothesis of a gamma axis of these disorders mainly based on the significance of gamma oscillations for memory matching is formulated.

  20. Generating an Empirical Probability Distribution for the Andrews-Pregibon Statistic.

    ERIC Educational Resources Information Center

    Jarrell, Michele G.

    A probability distribution was developed for the Andrews-Pregibon (AP) statistic. The statistic, developed by D. F. Andrews and D. Pregibon (1978), identifies multivariate outliers. It is a ratio of the determinant of the data matrix with an observation deleted to the determinant of the entire data matrix. Although the AP statistic has been used…

  1. Estimating probable flaw distributions in PWR steam generator tubes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gorman, J.A.; Turner, A.P.L.

    1997-02-01

    This paper describes methods for estimating the number and size distributions of flaws of various types in PWR steam generator tubes. These estimates are needed when calculating the probable primary-to-secondary leakage through steam generator tubes under postulated accidents such as severe core accidents and steam line breaks. The paper describes methods for two types of predictions: (1) the number of tubes with detectable flaws of various types as a function of time, and (2) the size distributions of these flaws. Results are provided for hypothetical severely affected, moderately affected and lightly affected units. Discussion is provided regarding uncertainties and assumptions in the data and analyses.

  2. Gas Hydrate Formation Probability Distributions: The Effect of Shear and Comparisons with Nucleation Theory.

    PubMed

    May, Eric F; Lim, Vincent W; Metaxas, Peter J; Du, Jianwei; Stanwix, Paul L; Rowland, Darren; Johns, Michael L; Haandrikman, Gert; Crosby, Daniel; Aman, Zachary M

    2018-03-13

    Gas hydrate formation is a stochastic phenomenon of considerable significance for any risk-based approach to flow assurance in the oil and gas industry. In principle, well-established results from nucleation theory offer the prospect of predictive models for hydrate formation probability in industrial production systems. In practice, however, heuristics are relied on when estimating formation risk for a given flowline subcooling or when quantifying kinetic hydrate inhibitor (KHI) performance. Here, we present statistically significant measurements of formation probability distributions for natural gas hydrate systems under shear, which are quantitatively compared with theoretical predictions. Distributions with over 100 points were generated using low-mass, Peltier-cooled pressure cells, cycled in temperature between 40 and -5 °C at up to 2 K·min-1 and analyzed with robust algorithms that automatically identify hydrate formation and initial growth rates from dynamic pressure data. The application of shear had a significant influence on the measured distributions: at 700 rpm mass-transfer limitations were minimal, as demonstrated by the kinetic growth rates observed. The formation probability distributions measured at this shear rate had mean subcoolings consistent with theoretical predictions and steel-hydrate-water contact angles of 14-26°. However, the experimental distributions were substantially wider than predicted, suggesting that phenomena acting on macroscopic length scales are responsible for much of the observed stochastic formation. Performance tests of a KHI provided new insights into how such chemicals can reduce the risk of hydrate blockage in flowlines. Our data demonstrate that the KHI not only reduces the probability of formation (by both shifting and sharpening the distribution) but also reduces hydrate growth rates by a factor of 2.

  3. Maximum-entropy probability distributions under Lp-norm constraints

    NASA Technical Reports Server (NTRS)

    Dolinar, S.

    1991-01-01

    Continuous probability density functions and discrete probability mass functions are tabulated which maximize the differential entropy or absolute entropy, respectively, among all probability distributions with a given L sub p norm (i.e., a given pth absolute moment when p is a finite integer) and an unconstrained or constrained value set. Expressions for the maximum entropy are evaluated as functions of the L sub p norm. The most interesting results are obtained and plotted for unconstrained (real-valued) continuous random variables and for integer-valued discrete random variables. The maximum entropy expressions are obtained in closed form for unconstrained continuous random variables, and in this case there is a simple straight-line relationship between the maximum differential entropy and the logarithm of the L sub p norm. Corresponding expressions for arbitrary discrete and constrained continuous random variables are given parametrically; closed-form expressions are available only for special cases. However, simpler alternative bounds on the maximum entropy of integer-valued discrete random variables are obtained by applying the differential entropy results to continuous random variables which approximate the integer-valued random variables in a natural manner. All the results are presented in an integrated framework that includes continuous and discrete random variables, constraints on the permissible value set, and all possible values of p. Understanding of this kind is useful in evaluating the performance of data compression schemes.
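
    For the unconstrained continuous case the maximizer is the generalized Gaussian density, which makes the straight-line relationship explicit. The following is a reconstruction of the standard result (entropy in nats); the report's exact normalization may differ:

        f^*(x) = \frac{1}{2a\,\Gamma(1+1/p)} \exp\!\left[-\left(\frac{|x|}{a}\right)^{p}\right],
        \qquad \|X\|_p = \left(E|X|^p\right)^{1/p} = a\,p^{-1/p},

        h_{\max} = \ln\|X\|_p + \frac{1+\ln p}{p} + \ln\!\left(2\,\Gamma(1+1/p)\right),

    so h_max is linear in ln ||X||_p with unit slope; p = 2 recovers the Gaussian and h_max = (1/2) ln(2 pi e sigma^2).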

  4. Evaluation of the Three Parameter Weibull Distribution Function for Predicting Fracture Probability in Composite Materials

    DTIC Science & Technology

    1978-03-01

    Thesis, Air Force Institute of Technology (AFIT/GAE). Evaluates the three-parameter Weibull distribution function for predicting fracture probability in composite materials; the surviving OCR fragment derives an expression for the risk of rupture of a unidirectionally laminated composite subjected to pure bending, which can be simplified further.

  5. Neighbor-Dependent Ramachandran Probability Distributions of Amino Acids Developed from a Hierarchical Dirichlet Process Model

    PubMed Central

    Mitra, Rajib; Jordan, Michael I.; Dunbrack, Roland L.

    2010-01-01

    Distributions of the backbone dihedral angles of proteins have been studied for over 40 years. While many statistical analyses have been presented, only a handful of probability densities are publicly available for use in structure validation and structure prediction methods. The available distributions differ in a number of important ways, which determine their usefulness for various purposes. These include: 1) input data size and criteria for structure inclusion (resolution, R-factor, etc.); 2) filtering of suspect conformations and outliers using B-factors or other features; 3) secondary structure of input data (e.g., whether helix and sheet are included; whether beta turns are included); 4) the method used for determining probability densities ranging from simple histograms to modern nonparametric density estimation; and 5) whether they include nearest neighbor effects on the distribution of conformations in different regions of the Ramachandran map. In this work, Ramachandran probability distributions are presented for residues in protein loops from a high-resolution data set with filtering based on calculated electron densities. Distributions for all 20 amino acids (with cis and trans proline treated separately) have been determined, as well as 420 left-neighbor and 420 right-neighbor dependent distributions. The neighbor-independent and neighbor-dependent probability densities have been accurately estimated using Bayesian nonparametric statistical analysis based on the Dirichlet process. In particular, we used hierarchical Dirichlet process priors, which allow sharing of information between densities for a particular residue type and different neighbor residue types. The resulting distributions are tested in a loop modeling benchmark with the program Rosetta, and are shown to improve protein loop conformation prediction significantly. The distributions are available at http://dunbrack.fccc.edu/hdp. PMID:20442867

  6. The effect of microscopic friction and size distributions on conditional probability distributions in soft particle packings

    NASA Astrophysics Data System (ADS)

    Saitoh, Kuniyasu; Magnanimo, Vanessa; Luding, Stefan

    2017-10-01

    Employing two-dimensional molecular dynamics (MD) simulations of soft particles, we study their non-affine responses to quasi-static isotropic compression, examining the effects of microscopic friction between particles in contact and of the particle size distribution. To quantify the complicated restructuring of force-chain networks under isotropic compression, we introduce conditional probability distributions (CPDs) of particle overlaps such that a master equation for the distribution of overlaps in soft particle packings can be constructed. From our MD simulations, we observe that the CPDs are well described by q-Gaussian distributions, where we find that the correlation in the evolution of particle overlaps is suppressed by microscopic friction, while it increases significantly with increasing polydispersity.

  7. Suzaku Wide-band All-sky Monitor measurements of duration distributions of gamma-ray bursts

    NASA Astrophysics Data System (ADS)

    Ohmori, Norisuke; Yamaoka, Kazutaka; Ohno, Masanori; Sugita, Satoshi; Kinoshita, Ryuuji; Nishioka, Yusuke; Hurley, Kevin; Hanabata, Yoshitaka; Tashiro, Makoto S.; Enomoto, Junichi; Fujinuma, Takeshi; Fukazawa, Yasushi; Iwakiri, Wataru; Kawano, Takafumi; Kokubun, Motohide; Makishima, Kazuo; Matsuoka, Shunsuke; Nagayoshi, Tsutomu; Nakagawa, Yujin E.; Nakaya, Souhei; Nakazawa, Kazuhiro; Takahashi, Tadayuki; Takeda, Sawako; Terada, Yukikatsu; Urata, Yuji; Yabe, Seiya; Yasuda, Tetsuya; Yamauchi, Makoto

    2016-06-01

    We report on the T90 and T50 duration distributions and their relations with spectral hardness using 1464 gamma-ray bursts (GRBs), which were observed by the Suzaku Wide-band All-sky Monitor (WAM) from 2005 August 4 to 2010 December 29. The duration distribution is clearly bimodal in three energy ranges (50-120, 120-250, and 250-550 keV), but is unclear in the 550-5000 keV range, probably because of the limited sample size. The WAM durations decrease with energy according to a power-law index of -0.058(-0.034, +0.033). The hardness-duration relation reveals the presence of short-hard and long-soft GRBs. The short:long event ratio tends to be higher with increasing energy. We compared the WAM distribution with ones measured by eight other GRB instruments. The WAM T90 distribution is very similar to those of INTEGRAL/SPI-ACS and Granat/PHEBUS, and least likely to match the Swift/BAT distribution. The WAM short:long event ratio (0.25:0.75) is much different from Swift/BAT (0.08:0.92), but is almost the same as CGRO/BATSE (0.25:0.75). To explain this difference for BAT, we examined three effects: BAT trigger types, energy dependence of the duration, and detection sensitivity differences between BAT and WAM. As a result, we found that the ratio difference could be explained mainly by energy dependence including soft extended emissions for short GRBs and much better sensitivity for BAT which can detect weak/long GRBs. The reason for the same short:long event ratio for BATSE and WAM was confirmed by calculation using the trigger efficiency curve.

  8. Comparison of three-parameter probability distributions for representing annual extreme and partial duration precipitation series

    NASA Astrophysics Data System (ADS)

    Wilks, Daniel S.

    1993-10-01

    The performance of eight three-parameter probability distributions for representing annual extreme and partial duration precipitation data at stations in the northeastern and southeastern United States is investigated. Particular attention is paid to fidelity on the right tail, through use of a bootstrap procedure simulating extrapolation on the right tail beyond the data. It is found that the beta-κ distribution best describes the extreme right tail of annual extreme series, and the beta-P distribution is best for the partial duration data. The conventionally employed two-parameter Gumbel distribution is found to substantially underestimate the probabilities associated with the larger precipitation amounts for both annual extreme and partial duration data. Fitting the distributions using left-censored data did not result in improved fits to the right tail.

  9. A microcomputer program for energy assessment and aggregation using the triangular probability distribution

    USGS Publications Warehouse

    Crovelli, R.A.; Balay, R.H.

    1991-01-01

    A general risk-analysis method was developed for petroleum-resource assessment and other applications. The triangular probability distribution is used as a model with an analytic aggregation methodology based on probability theory rather than Monte-Carlo simulation. Among the advantages of the analytic method are its computational speed and flexibility, and the saving of time and cost on a microcomputer. The input into the model consists of a set of components (e.g. geologic provinces) and, for each component, three potential resource estimates: minimum, most likely (mode), and maximum. Assuming a triangular probability distribution, the mean, standard deviation, and seven fractiles (F100, F95, F75, F50, F25, F5, and F0) are computed for each component, where, for example, the probability of more than F95 is equal to 0.95. The components are aggregated by combining the means, standard deviations, and respective fractiles under three possible situations: (1) perfect positive correlation, (2) complete independence, and (3) any degree of dependence between these two polar situations. A package of computer programs named the TRIAGG system was written in Turbo Pascal 4.0 to perform the analytic probabilistic methodology. The system consists of a program for processing triangular probability distribution assessments and aggregations, and a separate aggregation routine for aggregating aggregations. The user's documentation and program diskette of the TRIAGG system are available from USGS Open File Services. TRIAGG requires an IBM-PC/XT/AT-compatible microcomputer with 256 kbytes of main memory, MS-DOS 3.1 or later, either two diskette drives or a fixed disk, and a 132-column printer. A graphics adapter and color display are optional. © 1991.
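
    The triangular-distribution computations described above are straightforward to reproduce; a minimal sketch using scipy's triang, with hypothetical province estimates (this is not the TRIAGG code itself):

```python
from scipy.stats import triang

def triangular(minimum, mode, maximum):
    """Frozen scipy triangular distribution from (min, most-likely, max) estimates."""
    scale = maximum - minimum
    return triang((mode - minimum) / scale, loc=minimum, scale=scale)

# hypothetical resource estimates (min, mode, max) for two provinces
provinces = [(0.0, 2.0, 10.0), (1.0, 3.0, 8.0)]
dists = [triangular(*p) for p in provinces]

# F95 is the amount exceeded with probability 0.95, i.e. the 5th percentile
for d in dists:
    print("mean %.3f  sd %.3f  F95 %.3f" % (d.mean(), d.std(), d.ppf(0.05)))

# aggregation: means always add; variances add under complete independence,
# while standard deviations add under perfect positive correlation
mean_total = sum(d.mean() for d in dists)
sd_indep = sum(d.var() for d in dists) ** 0.5
sd_corr = sum(d.std() for d in dists)
print(mean_total, sd_indep, sd_corr)
```

    Under perfect positive correlation the fractiles also add directly (quantiles of comonotonic sums are additive); under independence the aggregate fractiles require the convolved distribution.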

  10. Idealized models of the joint probability distribution of wind speeds

    NASA Astrophysics Data System (ADS)

    Monahan, Adam H.

    2018-05-01

    The joint probability distribution of wind speeds at two separate locations in space or points in time completely characterizes the statistical dependence of these two quantities, providing more information than linear measures such as correlation. In this study, we consider two models of the joint distribution of wind speeds obtained from idealized models of the dependence structure of the horizontal wind velocity components. The bivariate Rice distribution follows from assuming that the wind components have Gaussian and isotropic fluctuations. The bivariate Weibull distribution arises from power law transformations of wind speeds corresponding to vector components with Gaussian, isotropic, mean-zero variability. Maximum likelihood estimates of these distributions are compared using wind speed data from the mid-troposphere, from different altitudes at the Cabauw tower in the Netherlands, and from scatterometer observations over the sea surface. While the bivariate Rice distribution is more flexible and can represent a broader class of dependence structures, the bivariate Weibull distribution is mathematically simpler and may be more convenient in many applications. The complexity of the mathematical expressions obtained for the joint distributions suggests that the development of explicit functional forms for multivariate speed distributions from distributions of the components will not be practical for more complicated dependence structures or for more than two speed variables.
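
    The bivariate-Weibull construction from Gaussian components can be simulated directly; in this sketch the target shape, the cross-site correlation, and the two-site setup are illustrative assumptions:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n, rho = 100_000, 0.7        # rho: correlation of each velocity component across sites

# mean-zero, isotropic components at two sites; u and v independent of each other
cov = [[1.0, rho], [rho, 1.0]]
u = rng.multivariate_normal([0.0, 0.0], cov, size=n)   # u at sites 1 and 2
v = rng.multivariate_normal([0.0, 0.0], cov, size=n)
speed = np.hypot(u, v)        # Rayleigh marginals, one column per site

k = 1.6                        # target Weibull shape (assumed)
w = speed ** (2.0 / k)         # power transform: w**k is exponential, so w is Weibull(k)

shape, loc, scale = stats.weibull_min.fit(w[:, 0], floc=0)
print("fitted marginal shape ~ %.2f (target %.2f)" % (shape, k))
```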

  11. Small-Scale Spatio-Temporal Distribution of Bactrocera minax (Enderlein) (Diptera: Tephritidae) Using Probability Kriging.

    PubMed

    Wang, S Q; Zhang, H Y; Li, Z L

    2016-10-01

    Understanding the spatio-temporal distribution of pests in orchards can provide important information for designing monitoring schemes and establishing better means of pest control. In this study, the spatial and temporal distribution of Bactrocera minax (Enderlein) (Diptera: Tephritidae) was assessed, and activity trends were evaluated, using probability kriging. Adults of B. minax were captured during two successive occurrence periods in a small-scale citrus orchard using food bait traps placed both inside and outside the orchard. The weekly spatial distribution of B. minax within the orchard and adjacent woods was examined using semivariogram parameters. Edge concentration was observed during most weeks of the adult occurrence period, and the adult population aggregated with high probability within a band less than 100 m wide on both sides of the boundary between the orchard and the woods. The sequential probability kriged maps showed that adults were located in the marginal zone with higher probability, especially in the early and peak stages. The feeding, ovipositing, and mating behaviors of B. minax are possible explanations for these spatio-temporal patterns. Therefore, the spatial arrangement of traps or spraying spots, and their distance to the forest edge, should be considered to enhance control of B. minax in small-scale orchards.

  12. A method to deconvolve stellar rotational velocities II. The probability distribution function via Tikhonov regularization

    NASA Astrophysics Data System (ADS)

    Christen, Alejandra; Escarate, Pedro; Curé, Michel; Rial, Diego F.; Cassetti, Julia

    2016-10-01

    Aims: Knowing the distribution of stellar rotational velocities is essential for understanding stellar evolution. Because we measure the projected rotational speed v sin i, we need to solve an ill-posed problem given by a Fredholm integral of the first kind to recover the "true" rotational velocity distribution. Methods: After discretization of the Fredholm integral, we apply the Tikhonov regularization method to obtain the probability distribution function of stellar rotational velocities directly. We propose a simple and straightforward procedure for determining the Tikhonov parameter. We use Monte Carlo simulations to show that the Tikhonov method is a consistent and asymptotically unbiased estimator. Results: The method is applied to a sample of cluster stars. We obtain confidence intervals using a bootstrap method. Our results are in close agreement with those obtained using the Lucy method for recovering the probability density distribution of rotational velocities; furthermore, the Lucy estimate lies inside our confidence interval. Conclusions: Tikhonov regularization is a highly robust method that deconvolves the rotational velocity probability density function from a sample of v sin i data directly, without the need for any convergence criteria.
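
    A generic version of the discretised-Fredholm Tikhonov step might look like the following; the Gaussian toy kernel, the second-difference smoothness penalty, and the hand-picked regularisation parameter are all assumptions for illustration (the real projection kernel for v sin i is different, and the paper proposes its own parameter-selection procedure):

```python
import numpy as np

def tikhonov_solve(K, y, lam):
    """Solve y = K p in the least-squares sense with a smoothness penalty:
    minimises ||K p - y||^2 + lam^2 ||D p||^2, D = discrete second derivative."""
    n = K.shape[1]
    D = np.diff(np.eye(n), n=2, axis=0)
    A = np.vstack([K, lam * D])
    b = np.concatenate([y, np.zeros(D.shape[0])])
    return np.linalg.lstsq(A, b, rcond=None)[0]

# toy demo: recover a smooth distribution blurred by a Gaussian kernel
n = 80
x = np.linspace(0.0, 1.0, n)
K = np.exp(-0.5 * ((x[:, None] - x[None, :]) / 0.05) ** 2)
p_true = np.exp(-0.5 * ((x - 0.5) / 0.1) ** 2)
y = K @ p_true + np.random.default_rng(0).normal(0.0, 0.01, n)
p_hat = tikhonov_solve(K, y, lam=0.1)
print("max abs error: %.3f" % np.abs(p_hat - p_true).max())
```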

  13. Goodness of fit of probability distributions for sightings as species approach extinction.

    PubMed

    Vogel, Richard M; Hosking, Jonathan R M; Elphick, Chris S; Roberts, David L; Reed, J Michael

    2009-04-01

    Estimating the probability that a species is extinct and the timing of extinctions is useful in biological fields ranging from paleoecology to conservation biology. Various statistical methods have been introduced to infer the time of extinction and extinction probability from a series of individual sightings. There is little evidence, however, as to which of these models provide adequate fit to actual sighting records. We use L-moment diagrams and probability plot correlation coefficient (PPCC) hypothesis tests to evaluate the goodness of fit of various probabilistic models to sighting data collected for a set of North American and Hawaiian bird populations that have either gone extinct, or are suspected of having gone extinct, during the past 150 years. For our data, the uniform, truncated exponential, and generalized Pareto models performed moderately well, but the Weibull model performed poorly. Of the acceptable models, the uniform distribution performed best based on PPCC goodness of fit comparisons and sequential Bonferroni-type tests. Further analyses using field significance tests suggest that although the uniform distribution is the best of those considered, additional work remains to evaluate the truncated exponential model more fully. The methods we present here provide a framework for evaluating subsequent models.
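
    The PPCC statistic itself is simple to compute: it is the correlation between the ordered data and the fitted distribution's quantiles at plotting positions. A sketch, where the Gringorten plotting positions and the synthetic sighting record are assumptions:

```python
import numpy as np
from scipy import stats

def ppcc(x, dist):
    """Probability plot correlation coefficient for a candidate distribution."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    pp = (np.arange(1, n + 1) - 0.44) / (n + 0.12)   # Gringorten plotting positions
    return np.corrcoef(x, dist.ppf(pp))[0, 1]

# hypothetical sighting times (years since first record) for one population
sightings = np.random.default_rng(8).uniform(0.0, 150.0, 25)
print("uniform PPCC: %.4f" % ppcc(sightings, stats.uniform(0.0, 150.0)))
```

    A PPCC near 1 indicates an acceptable fit; the hypothesis test compares the observed value against critical values simulated under the candidate model.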

  14. A compound scattering pdf for the ultrasonic echo envelope and its relationship to K and Nakagami distributions.

    PubMed

    Shankar, P Mohana

    2003-03-01

    A compound probability density function (pdf) is presented to describe the envelope of the backscattered echo from tissue. This pdf allows for both local and global variation in scattering cross sections in tissue. The ultrasonic backscattering cross sections are assumed to be gamma distributed, and the gamma distribution is also used to model the randomness in the average cross sections. This gamma-gamma model results in the compound scattering pdf for the envelope. The relationship of this compound pdf to the Rayleigh, K, and Nakagami distributions is explored through an analysis of the signal-to-noise ratio of the envelopes and through random number simulations. The three-parameter compound pdf appears to be flexible enough to represent envelope statistics giving rise to Rayleigh, K, and Nakagami distributions.
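
    A simulation sketch of the gamma-gamma compounding and the envelope signal-to-noise diagnostic; the shape values here are arbitrary assumptions, not the paper's parameters:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000
alpha, beta = 2.0, 3.0   # shapes of the global and local gamma laws (assumed)

# global level: gamma-distributed average cross section (unit mean);
# local level: gamma-distributed intensity around that random mean
mean_cs = rng.gamma(alpha, 1.0 / alpha, size=n)
intensity = rng.gamma(beta, mean_cs / beta)
envelope = np.sqrt(intensity)

# envelope SNR (mean/std); a non-random mean with beta = 1 gives the
# Rayleigh case, whose envelope SNR is about 1.91
print("envelope SNR = %.3f" % (envelope.mean() / envelope.std()))
```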

  15. Count data, detection probabilities, and the demography, dynamics, distribution, and decline of amphibians.

    PubMed

    Schmidt, Benedikt R

    2003-08-01

    The evidence for amphibian population declines is based on count data that were not adjusted for detection probabilities. Such data are not reliable even when collected using standard methods. The formula C = Np (where C is a count, N the true parameter value, and p is a detection probability) relates count data to demography, population size, or distributions. With unadjusted count data, one assumes a linear relationship between C and N and that p is constant. These assumptions are unlikely to be met in studies of amphibian populations. Amphibian population data should be based on methods that account for detection probabilities.

  16. Multi-scale Characterization and Modeling of Surface Slope Probability Distribution for ~20-km Diameter Lunar Craters

    NASA Astrophysics Data System (ADS)

    Mahanti, P.; Robinson, M. S.; Boyd, A. K.

    2013-12-01

    Craters ~20 km in diameter and larger significantly shaped the lunar landscape, and the statistical nature of the slope distribution on their walls and floors dominates the overall slope statistics of the lunar surface. Slope statistics are inherently useful for characterizing the current topography of the surface, determining accurate photometric and surface scattering properties, and defining lunar surface trafficability [1-4]. Earlier experimental studies of the statistical nature of lunar surface slopes were restricted either by resolution limits (Apollo-era photogrammetric studies) or by model error considerations (photoclinometric and radar scattering studies), so the true nature of the slope probability distribution was not discernible at baselines smaller than a kilometer [2,3,5]. Accordingly, historical models of lunar slope probability distributions, for applications such as scattering theory development or rover traversability assessment, are general in nature (using simple statistical models such as the Gaussian distribution [1,2,5,6]). With the advent of high-resolution, high-precision topographic models of the Moon [7,8], slopes in lunar craters can now be obtained at baselines as small as 6 meters, allowing unprecedented multi-scale (multiple-baseline) modeling of slope probability distributions. Topographic analysis (Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Camera (NAC) 2-m digital elevation models (DEMs)) of ~20-km diameter Copernican lunar craters revealed generally steep slopes on interior walls (30° to 36°, locally exceeding 40°) over 15-meter baselines [9]. In this work, we extend the analysis from a probability distribution modeling point of view with NAC DEMs to characterize the slope statistics for the floors and walls of the same ~20-km Copernican lunar craters. The difference in slope standard deviations between the Gaussian approximation and the actual distribution (2-meter sampling) was

  17. Methods to elicit probability distributions from experts: a systematic review of reported practice in health technology assessment.

    PubMed

    Grigore, Bogdan; Peters, Jaime; Hyde, Christopher; Stein, Ken

    2013-11-01

    Elicitation is a technique that can be used to obtain probability distributions from experts about unknown quantities. We conducted a methodology review of reports in which probability distributions had been elicited from experts for use in model-based health technology assessments. Databases including MEDLINE, EMBASE and the CRD database were searched from inception to April 2013. Reference lists were checked and citation mapping was also used. Studies describing their approach to the elicitation of probability distributions were included. Data were abstracted on pre-defined aspects of the elicitation technique. Reports were critically appraised on their consideration of the validity, reliability and feasibility of the elicitation exercise. Fourteen articles were included. Across these studies, the most marked features were heterogeneity in elicitation approach and failure to report key aspects of the elicitation method. The most frequently used approaches to elicitation were the histogram technique and the bisection method. Only three papers explicitly considered the validity, reliability and feasibility of the elicitation exercises. Judged by the studies identified in the review, reports of expert elicitation are insufficiently detailed, and this impacts the perceived usability of expert-elicited probability distributions. In this context, the wider credibility of elicitation will only be improved by better reporting and greater standardisation of approach. Until then, the advantages of eliciting probability distributions from experts may be lost.

  18. Comparative effects of 60Co gamma-rays and neon and helium ions on cycle duration and division probability of EMT 6 cells. A time-lapse cinematography study.

    PubMed

    Collyn-d'Hooghe, M; Hemon, D; Gilet, R; Curtis, S B; Valleron, A J; Malaise, E P

    1981-03-01

    Exponentially growing cultures of EMT 6 cells were irradiated in vitro with neon ions, helium ions, or 60Co gamma-rays. Time-lapse cinematography allowed determination, for individual cells, of cycle duration, the success of mitotic division, and the age of the cell at the moment of irradiation. Irradiation induced a significant mitotic delay that increased proportionally with the delivered dose. Using mitotic delay as an endpoint, the r.b.e. for neon ions with respect to 60Co gamma-rays was 3.3 +/- 0.2, while for helium ions it was 1.2 +/- 0.1. Mitotic delay was greatest in those cells that had progressed furthest in their cycle at the time of irradiation. No significant mitotic delay was observed in the post-irradiation generation. Division probability was significantly reduced by irradiation in both the irradiated and the post-irradiation generations. The reduction in division probability obtained with 3 Gy of neon ions was similar to that obtained after irradiation with 6 Gy of helium ions or 60Co gamma-rays.

  19. Size distributions of air showers accompanied with high energy gamma ray bundles observed at Mt. Chacaltaya

    NASA Technical Reports Server (NTRS)

    Matano, T.; Machida, M.; Tsuchima, I.; Kawasumi, N.; Honda, K.; Hashimoto, K.; Martinic, N.; Zapata, J.; Navia, C. E.; Aquirre, C.

    1985-01-01

    We study the size distributions of air showers accompanied by bundles of high-energy gamma rays and/or large bursts under emulsion chambers, in order to investigate the composition of primary cosmic rays and the characteristics of high-energy nuclear interactions. Air showers initiated by particles with a large interaction cross section develop within a narrow region of the atmosphere near the top, whereas the starting levels of air showers initiated by particles with smaller cross sections fluctuate over a wider region of the atmosphere. Air showers of extremely small size accompanied by gamma-ray bundles may be those initiated by protons at low altitude after penetrating deep into the atmosphere without interacting. We determine the relative size distribution as a function of the total energy of the gamma-ray bundle and of the total burst size observed under a 15 cm lead absorber.

  20. Technology-enhanced Interactive Teaching of Marginal, Joint and Conditional Probabilities: The Special Case of Bivariate Normal Distribution

    PubMed Central

    Dinov, Ivo D.; Kamino, Scott; Bhakhrani, Bilal; Christou, Nicolas

    2014-01-01

    Summary Data analysis requires subtle probability reasoning to answer questions like What is the chance of event A occurring, given that event B was observed? This generic question arises in discussions of many intriguing scientific questions such as What is the probability that an adolescent weighs between 120 and 140 pounds given that they are of average height? and What is the probability of (monetary) inflation exceeding 4% and housing price index below 110? To address such problems, learning some applied, theoretical or cross-disciplinary probability concepts is necessary. Teaching such courses can be improved by utilizing modern information technology resources. Students’ understanding of multivariate distributions, conditional probabilities, correlation and causation can be significantly strengthened by employing interactive web-based science educational resources. Independent of the type of a probability course (e.g. majors, minors or service probability course, rigorous measure-theoretic, applied or statistics course) student motivation, learning experiences and knowledge retention may be enhanced by blending modern technological tools within the classical conceptual pedagogical models. We have designed, implemented and disseminated a portable open-source web-application for teaching multivariate distributions, marginal, joint and conditional probabilities using the special case of bivariate Normal distribution. A real adolescent height and weight dataset is used to demonstrate the classroom utilization of the new web-application to address problems of parameter estimation, univariate and multivariate inference. PMID:25419016

  1. Technology-enhanced Interactive Teaching of Marginal, Joint and Conditional Probabilities: The Special Case of Bivariate Normal Distribution.

    PubMed

    Dinov, Ivo D; Kamino, Scott; Bhakhrani, Bilal; Christou, Nicolas

    2013-01-01

    Data analysis requires subtle probability reasoning to answer questions like What is the chance of event A occurring, given that event B was observed? This generic question arises in discussions of many intriguing scientific questions such as What is the probability that an adolescent weighs between 120 and 140 pounds given that they are of average height? and What is the probability of (monetary) inflation exceeding 4% and housing price index below 110? To address such problems, learning some applied, theoretical or cross-disciplinary probability concepts is necessary. Teaching such courses can be improved by utilizing modern information technology resources. Students' understanding of multivariate distributions, conditional probabilities, correlation and causation can be significantly strengthened by employing interactive web-based science educational resources. Independent of the type of a probability course (e.g. majors, minors or service probability course, rigorous measure-theoretic, applied or statistics course) student motivation, learning experiences and knowledge retention may be enhanced by blending modern technological tools within the classical conceptual pedagogical models. We have designed, implemented and disseminated a portable open-source web-application for teaching multivariate distributions, marginal, joint and conditional probabilities using the special case of bivariate Normal distribution. A real adolescent height and weight dataset is used to demonstrate the classroom utilization of the new web-application to address problems of parameter estimation, univariate and multivariate inference.
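
    For the bivariate normal case discussed above, conditional probabilities reduce to one-dimensional normal calculations: given height at its mean, weight is normal with unchanged mean and standard deviation shrunk by sqrt(1 − ρ²). A sketch with made-up height/weight parameters (these are illustrative, not the paper's dataset):

```python
from math import sqrt
from scipy.stats import norm

# hypothetical adolescent population parameters
mu_h, mu_w = 65.0, 125.0      # mean height (in), mean weight (lb)
sd_h, sd_w, rho = 3.5, 15.0, 0.6

# weight | height = mu_h  ~  Normal(mu_w, sd_w * sqrt(1 - rho^2))
cond = norm(mu_w, sd_w * sqrt(1.0 - rho ** 2))
p = cond.cdf(140.0) - cond.cdf(120.0)
print("P(120 <= W <= 140 | H = mean) = %.3f" % p)
```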

  2. Animating Statistics: A New Kind of Applet for Exploring Probability Distributions

    ERIC Educational Resources Information Center

    Kahle, David

    2014-01-01

    In this article, I introduce a novel applet ("module") for exploring probability distributions, their samples, and various related statistical concepts. The module is primarily designed to be used by the instructor in the introductory course, but it can be used far beyond it as well. It is a free, cross-platform, stand-alone interactive…

  3. The effects of pure density evolution on the brightness distribution of cosmological gamma-ray bursts

    NASA Technical Reports Server (NTRS)

    Horack, J. M.; Emslie, A. G.; Hartmann, D. H.

    1995-01-01

    In this work, we explore the effects of burst rate density evolution on the observed brightness distribution of cosmological gamma-ray bursts. Although the brightness distribution of gamma-ray bursts observed by the BATSE experiment has been shown to be consistent with a nonevolving source population observed to redshifts of order unity, evolution of some form is likely to be present in the gamma-ray bursts. Additionally, nonevolving models place significant constraints on the range of observed burst luminosities, which are relaxed if evolution of the burst population is present. In this paper, three analytic forms of density evolution are examined. In general, forms of evolution with densities that increase monotonically with redshift require that the BATSE data correspond to bursts at larger redshifts, or incorporate a wider range of burst luminosities, or both. Independent estimates of the maximum observed redshift in the BATSE data and/or the range of luminosity from which a large fraction of the observed bursts are drawn therefore allow constraints to be placed on the amount of evolution that may be present in the burst population. Specifically, if recent estimates of the actual limiting redshift in the BATSE data, z_lim = 2, obtained from analysis of the BATSE duration distribution, are correct, then the BATSE N(P) distribution in a Lambda = 0 universe is inconsistent at a level of approximately 3σ with nonevolving gamma-ray bursts, and some form of evolution in the population is required. The sense of this required source evolution is to provide a higher density, larger luminosities, or both with increasing redshift.

  4. The force distribution probability function for simple fluids by density functional theory.

    PubMed

    Rickayzen, G; Heyes, D M

    2013-02-28

    Classical density functional theory (DFT) is used to derive a formula for the probability density distribution function, P(F), and the probability distribution function, W(F), for simple fluids, where F is the net force on a particle. The final formula gives P(F) ∝ exp(−AF²), where A depends on the fluid density, the temperature, and the Fourier transform of the pair potential. The form of the DFT used is applicable only to bounded-potential fluids. When combined with the hypernetted chain closure of the Ornstein-Zernike equation, the DFT theory for W(F) agrees with molecular dynamics computer simulations for the Gaussian and bounded soft-sphere potentials at high density. The Gaussian form for P(F) is still accurate at lower densities (but not too low) for the two potentials, but with a smaller value of the constant A than predicted by the DFT theory.

  5. Radial particle distributions in PARMILA simulation beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boicourt, G.P.

    1984-03-01

    The estimation of beam spill in particle accelerators is becoming more important as higher-current designs are funded. To date, no numerical method for predicting beam spill has been available. In this paper, we present an approach to the loss-estimation problem that uses probability distributions fitted to particle-simulation beams. The properties of the PARMILA code's radial particle distribution are discussed, and a broad class of probability distributions is examined for its ability to fit them. The possibility that the PARMILA distribution is a mixture is discussed, and a fitting distribution consisting of a mixture of two generalized gamma distributions is found. An efficient algorithm to accomplish the fit is presented. Examples of the relative prediction of beam spill are given. 26 references, 18 figures, 1 table.
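
    A mixture of two generalized gamma distributions can be fitted by direct maximum likelihood; the sketch below uses scipy's gengamma on synthetic stand-in data, with log-parametrisation for positivity. This is a generic approach, not the paper's fitting algorithm:

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(3)
# synthetic radial data: a stand-in two-component mixture (hypothetical parameters)
r = np.concatenate([
    stats.gengamma.rvs(2.0, 1.5, scale=1.0, size=4000, random_state=rng),
    stats.gengamma.rvs(4.0, 1.0, scale=2.5, size=1000, random_state=rng),
])

def nll(theta):
    """Negative log-likelihood of a two-component generalized gamma mixture."""
    w = 1.0 / (1.0 + np.exp(-theta[0]))            # mixing weight in (0, 1)
    a1, c1, s1, a2, c2, s2 = np.exp(theta[1:])     # positive shape/scale params
    pdf = (w * stats.gengamma.pdf(r, a1, c1, scale=s1)
           + (1.0 - w) * stats.gengamma.pdf(r, a2, c2, scale=s2))
    return -np.log(pdf + 1e-300).sum()

theta0 = np.array([1.0, np.log(2.0), np.log(1.5), 0.0, np.log(4.0), 0.0, np.log(2.5)])
res = optimize.minimize(nll, theta0, method="Nelder-Mead",
                        options={"maxiter": 20_000, "xatol": 1e-6})
w_hat = 1.0 / (1.0 + np.exp(-res.x[0]))
print("weight %.2f, component params:" % w_hat, np.exp(res.x[1:]).round(2))
```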

  6. Using gamma distribution to determine half-life of rotenone, applied in freshwater.

    PubMed

    Rohan, Maheswaran; Fairweather, Alastair; Grainger, Natasha

    2015-09-15

    Following the use of rotenone to eradicate invasive pest fish, a dynamic first-order kinetic model is usually used to determine the half-life and rate at which rotenone dissipated from the treated waterbody. In this study, we investigate the use of a stochastic gamma model for determining the half-life and rate at which rotenone dissipates from waterbodies. The first-order kinetic and gamma models produced similar values for the half-life (4.45 days and 5.33 days respectively) and days to complete dissipation (51.2 days and 52.48 days respectively). However, the gamma model fitted the data better and was more flexible than the first-order kinetic model, allowing us to use covariates and to predict a possible range for the half-life of rotenone. These benefits are particularly important when examining the influence that different environmental factors have on rotenone dissipation and when trying to predict the rate at which rotenone will dissipate during future operations. We therefore recommend that in future the gamma distribution model is used when calculating the half-life of rotenone in preference to the dynamic first-order kinetics model. Copyright © 2015 Elsevier B.V. All rights reserved.
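
    A hedged sketch of the gamma-model idea: treat the remaining concentration as a gamma survival curve in time and read the half-life off the fitted curve. The data values and the survival-function formulation are assumptions for illustration, not the authors' exact model:

```python
import numpy as np
from scipy.stats import gamma
from scipy.optimize import curve_fit

# hypothetical rotenone concentrations (relative units) after treatment
t = np.array([0.0, 1.0, 2.0, 4.0, 7.0, 10.0, 15.0, 20.0, 30.0])   # days
c = np.array([1.0, 0.95, 0.82, 0.60, 0.38, 0.24, 0.10, 0.05, 0.01])

def model(t, c0, k, theta):
    """Mean concentration modelled as c0 times a gamma survival function."""
    return c0 * gamma.sf(t, k, scale=theta)

(c0, k, theta), _ = curve_fit(model, t, c, p0=(1.0, 1.5, 5.0),
                              bounds=(1e-6, np.inf))
half_life = gamma.isf(0.5, k, scale=theta)    # time at which 50% remains
print("half-life ~ %.2f days" % half_life)
```

    Covariates could then enter through the gamma parameters, which is the flexibility the abstract highlights over a single first-order rate constant.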

  7. Transient Properties of Probability Distribution for a Markov Process with Size-dependent Additive Noise

    NASA Astrophysics Data System (ADS)

    Yamada, Yuhei; Yamazaki, Yoshihiro

    2018-04-01

    This study considered a stochastic model for cluster growth in a Markov process with a cluster size dependent additive noise. According to this model, the probability distribution of the cluster size transiently becomes an exponential or a log-normal distribution depending on the initial condition of the growth. In this letter, a master equation is obtained for this model, and derivation of the distributions is discussed.

  8. Zirconium and Yttrium (p, d) Surrogate Nuclear Reactions: Measurement and determination of gamma-ray probabilities: Experimental Physics Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Burke, J. T.; Hughes, R. O.; Escher, J. E.

    This technical report documents the surrogate reaction method and experimental results used to determine the desired neutron-induced 87Y(n,γ) cross section and the known 90Zr(n,γ) cross section. The experiment was performed with the STARLiTeR apparatus at the Texas A&M Cyclotron Institute, using the K150 cyclotron to produce a 28.56 MeV proton beam. The proton beam impinged on Y and Zr targets to produce the nuclear reactions 89Y(p,d)88Y and 92Zr(p,d)91Zr. Both particle singles data and particle-gamma-ray coincidence data were measured during the experiment. These data were used to determine the γ-ray probability as a function of energy for these reactions. The results for the γ-ray probabilities as a function of energy for both nuclei are documented here. For completeness, extensive tabulated and graphical results are provided in the appendices.

  9. On the issues of probability distribution of GPS carrier phase observations

    NASA Astrophysics Data System (ADS)

    Luo, X.; Mayer, M.; Heck, B.

    2009-04-01

    In common practice the observables related to the Global Positioning System (GPS) are assumed to follow a Gauss-Laplace normal distribution. Actually, full knowledge of the observables' distribution is not required for parameter estimation by means of the least-squares algorithm, which is based on the functional relation between observations and unknown parameters as well as the associated variance-covariance matrix. However, the probability distribution of GPS observations plays a key role in procedures for quality control (e.g. outlier and cycle slip detection, ambiguity resolution) and in reliability-related assessments of the estimation results. Under non-ideal observation conditions with respect to the factors impacting GPS data quality, for example multipath effects and atmospheric delays, the validity of the normal distribution postulate for GPS observations is in doubt. This paper presents a detailed analysis of the distribution properties of GPS carrier phase observations using double difference residuals. For this purpose 1-Hz observation data from the permanent SAPOS

  10. Most recent common ancestor probability distributions in gene genealogies under selection.

    PubMed

    Slade, P F

    2000-12-01

    A computational study is made of the conditional probability distribution for the allelic type of the most recent common ancestor in genealogies of samples of n genes drawn from a population under selection, given the initial sample configuration. Comparisons with the corresponding unconditional cases are presented. Such unconditional distributions differ from samples drawn from the unique stationary distribution of population allelic frequencies, known as Wright's formula, and are quantified. Biallelic haploid and diploid models are considered. A simplified structure for the ancestral selection graph of S. M. Krone and C. Neuhauser (1997, Theor. Popul. Biol. 51, 210-237) is enhanced further, reducing the effective branching rate in the graph. This improves efficiency of such a nonneutral analogue of the coalescent for use with computational likelihood-inference techniques.

  11. Probability density functions for use when calculating standardised drought indices

    NASA Astrophysics Data System (ADS)

    Svensson, Cecilia; Prosdocimi, Ilaria; Hannaford, Jamie

    2015-04-01

    Time series of drought indices like the standardised precipitation index (SPI) and standardised flow index (SFI) require a statistical probability density function to be fitted to the observed (generally monthly) precipitation and river flow data. Once fitted, the quantiles are transformed to a Normal distribution with mean = 0 and standard deviation = 1. These transformed data are the SPI/SFI, which are widely used in drought studies, including for drought monitoring and early warning applications. Different distributions were fitted to rainfall and river flow data accumulated over 1, 3, 6 and 12 months for 121 catchments in the United Kingdom. These catchments represent a range of catchment characteristics in a mid-latitude climate. Both rainfall and river flow data have a lower bound at 0, as rains and flows cannot be negative. Their empirical distributions also tend to have positive skewness, and therefore the Gamma distribution has often been a natural and suitable choice for describing the data statistically. However, after transformation of the data to Normal distributions to obtain the SPIs and SFIs for the 121 catchments, the distributions are rejected in 11% and 19% of cases, respectively, by the Shapiro-Wilk test. Three-parameter distributions traditionally used in hydrological applications, such as the Pearson type 3 for rainfall and the Generalised Logistic and Generalised Extreme Value distributions for river flow, tend to make the transformed data fit better, with rejection rates of 5% or less. However, none of these three-parameter distributions have a lower bound at zero. This means that the lower tail of the fitted distribution may potentially go below zero, which would result in a lower limit to the calculated SPI and SFI values (as observations can never reach into this lower tail of the theoretical distribution). The Tweedie distribution can overcome the problems found when using either the Gamma or the above three-parameter distributions. The
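
    The SPI transformation described above, in minimal form: fit a gamma law with lower bound zero, map the fitted quantiles through the standard normal inverse CDF, then test the result for normality. This sketch fits a single gamma to all months (operational SPI fits each calendar month separately and treats zero totals specially), so it is only an outline:

```python
import numpy as np
from scipy import stats

def spi(precip):
    """Standardised precipitation index: gamma fit, quantiles mapped to N(0, 1)."""
    shape, loc, scale = stats.gamma.fit(precip, floc=0)   # lower bound at zero
    cdf = stats.gamma.cdf(precip, shape, loc=loc, scale=scale)
    return stats.norm.ppf(cdf)

rng = np.random.default_rng(4)
monthly = rng.gamma(2.0, 30.0, size=360)    # hypothetical 30-year monthly totals
z = spi(monthly)
print("Shapiro-Wilk p = %.3f" % stats.shapiro(z).pvalue)
```

    The Shapiro-Wilk step mirrors the paper's acceptance criterion: a fitted distribution is judged by how close the transformed index is to standard normal.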

  12. A least squares approach to estimating the probability distribution of unobserved data in multiphoton microscopy

    NASA Astrophysics Data System (ADS)

    Salama, Paul

    2008-02-01

    Multi-photon microscopy has provided biologists with unprecedented opportunities for high resolution imaging deep into tissues. Unfortunately deep tissue multi-photon microscopy images are in general noisy since they are acquired at low photon counts. To aid in the analysis and segmentation of such images it is sometimes necessary to initially enhance the acquired images. One way to enhance an image is to find the maximum a posteriori (MAP) estimate of each pixel comprising an image, which is achieved by finding a constrained least squares estimate of the unknown distribution. In arriving at the distribution it is assumed that the noise is Poisson distributed, the true but unknown pixel values assume a probability mass function over a finite set of non-negative values, and since the observed data also assumes finite values because of low photon counts, the sum of the probabilities of the observed pixel values (obtained from the histogram of the acquired pixel values) is less than one. Experimental results demonstrate that it is possible to closely estimate the unknown probability mass function with these assumptions.

  13. Probability distribution for the Gaussian curvature of the zero level surface of a random function

    NASA Astrophysics Data System (ADS)

    Hannay, J. H.

    2018-04-01

    A rather natural construction for a smooth random surface in space is the level surface of value zero, or ‘nodal’ surface f(x,y,z)  =  0, of a (real) random function f; the interface between positive and negative regions of the function. A physically significant local attribute at a point of a curved surface is its Gaussian curvature (the product of its principal curvatures) because, when integrated over the surface it gives the Euler characteristic. Here the probability distribution for the Gaussian curvature at a random point on the nodal surface f  =  0 is calculated for a statistically homogeneous (‘stationary’) and isotropic zero mean Gaussian random function f. Capitalizing on the isotropy, a ‘fixer’ device for axes supplies the probability distribution directly as a multiple integral. Its evaluation yields an explicit algebraic function with a simple average. Indeed, this average Gaussian curvature has long been known. For a non-zero level surface instead of the nodal one, the probability distribution is not fully tractable, but is supplied as an integral expression.

  14. Learning Probabilities From Random Observables in High Dimensions: The Maximum Entropy Distribution and Others

    NASA Astrophysics Data System (ADS)

    Obuchi, Tomoyuki; Cocco, Simona; Monasson, Rémi

    2015-11-01

    We consider the problem of learning a target probability distribution over a set of N binary variables from the knowledge of the expectation values (with this target distribution) of M observables, drawn uniformly at random. The space of all probability distributions compatible with these M expectation values within some fixed accuracy, called version space, is studied. We introduce a biased measure over the version space, which gives a boost increasing exponentially with the entropy of the distributions and with an arbitrary inverse `temperature' Γ . The choice of Γ allows us to interpolate smoothly between the unbiased measure over all distributions in the version space (Γ =0) and the pointwise measure concentrated at the maximum entropy distribution (Γ → ∞ ). Using the replica method we compute the volume of the version space and other quantities of interest, such as the distance R between the target distribution and the center-of-mass distribution over the version space, as functions of α =(log M)/N and Γ for large N. Phase transitions at critical values of α are found, corresponding to qualitative improvements in the learning of the target distribution and to the decrease of the distance R. However, for fixed α the distance R does not vary with Γ which means that the maximum entropy distribution is not closer to the target distribution than any other distribution compatible with the observable values. Our results are confirmed by Monte Carlo sampling of the version space for small system sizes (N≤ 10).

  15. Probability weighted moments: Definition and relation to parameters of several distributions expressable in inverse form

    USGS Publications Warehouse

    Greenwood, J. Arthur; Landwehr, J. Maciunas; Matalas, N.C.; Wallis, J.R.

    1979-01-01

    Distributions whose inverse forms are explicitly defined, such as Tukey's lambda, may present problems in deriving their parameters by more conventional means. Probability weighted moments are introduced and shown to be potentially useful in expressing the parameters of these distributions.
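
    Sample probability weighted moments have a standard unbiased estimator based on order statistics; a sketch using one common convention, b_r estimating E[X F(X)^r] (b_0 reduces to the sample mean):

```python
import numpy as np
from math import comb

def pwm(x, r):
    """Unbiased sample probability weighted moment b_r, estimating E[X F(X)^r]."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    # weight of the (j+1)-th order statistic: C(j, r) / C(n-1, r)
    w = np.array([comb(j, r) / comb(n - 1, r) for j in range(n)])
    return (w * x).mean()

sample = np.random.default_rng(5).gumbel(size=1000)
print([round(pwm(sample, r), 4) for r in range(3)])
```

    Parameters of inverse-form distributions such as Tukey's lambda follow by equating a few of these sample moments to their analytic expressions.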

  16. Study on detecting spatial distribution of neutrons and gamma rays using a multi-imaging plate system.

    PubMed

    Tanaka, Kenichi; Sakurai, Yoshinori; Endo, Satoru; Takada, Jun

    2014-06-01

    In order to measure the spatial distributions of neutrons and gamma rays separately using the imaging plate, the requirements for converters that enhance a specific component were investigated with the PHITS code. Enhancing fast neutrons using recoil protons from epoxy resin proved ineffective owing to the imaging plate's high sensitivity to gamma rays. However, a converter of epoxy resin doped with (10)B was found to have potential for thermal and epithermal neutrons, and graphite for gamma rays. Copyright © 2014 Elsevier Ltd. All rights reserved.

  17. The Impact of an Instructional Intervention Designed to Support Development of Stochastic Understanding of Probability Distribution

    ERIC Educational Resources Information Center

    Conant, Darcy Lynn

    2013-01-01

    Stochastic understanding of probability distribution undergirds development of conceptual connections between probability and statistics and supports development of a principled understanding of statistical inference. This study investigated the impact of an instructional course intervention designed to support development of stochastic…

  18. The second fermi large area telescope catalog of gamma-ray pulsars

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abdo, A. A.; Ajello, M.; Allafort, A.

    2013-09-19

    This catalog summarizes 117 high-confidence ≥0.1 GeV gamma-ray pulsar detections using three years of data acquired by the Large Area Telescope (LAT) on the Fermi satellite. Half are neutron stars discovered using LAT data through periodicity searches in gamma-ray and radio data around LAT unassociated source positions. The 117 pulsars are evenly divided into three groups: millisecond pulsars, young radio-loud pulsars, and young radio-quiet pulsars. We characterize the pulse profiles and energy spectra and derive luminosities when distance information exists. Spectral analysis of the off-peak phase intervals indicates probable pulsar wind nebula emission for four pulsars, and off-peak magnetospheric emission for several young and millisecond pulsars. We compare the gamma-ray properties with those in the radio, optical, and X-ray bands. We provide flux limits for pulsars with no observed gamma-ray emission, highlighting a small number of gamma-faint, radio-loud pulsars. The large, varied gamma-ray pulsar sample constrains emission models. Fermi's selection biases complement those of radio surveys, enhancing comparisons with predicted population distributions.

  19. Probability distribution functions for intermittent scrape-off layer plasma fluctuations

    NASA Astrophysics Data System (ADS)

    Theodorsen, A.; Garcia, O. E.

    2018-03-01

    A stochastic model for intermittent fluctuations in the scrape-off layer of magnetically confined plasmas has been constructed based on a superposition of uncorrelated pulses arriving according to a Poisson process. In the most common applications of the model, the pulse amplitudes are assumed exponentially distributed, supported by conditional averaging of large-amplitude fluctuations in experimental measurement data. This basic assumption has two potential limitations. First, statistical analysis of measurement data using conditional averaging only reveals the tail of the amplitude distribution to be exponentially distributed. Second, exponentially distributed amplitudes lead to a positive-definite signal, which cannot capture fluctuations in, for example, electric potential and radial velocity. Assuming pulse amplitudes that are not positive definite often makes finding a closed form for the probability density function (PDF) difficult, even if the characteristic function remains relatively simple. Thus, estimating model parameters requires an approach based on the characteristic function rather than the PDF. In this contribution, the effect of changing the amplitude distribution on the moments, PDF, and characteristic function of the process is investigated, and a parameter estimation method using the empirical characteristic function is presented and tested on synthetically generated data. This proves valuable for describing intermittent fluctuations of all plasma parameters in the boundary region of magnetized plasmas.
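
    A sketch of the underlying filtered-Poisson model and the empirical characteristic function used for estimation; the grid, pulse truncation, and parameter values are assumptions, and the exponential-amplitude case shown is the baseline the paper generalizes away from:

```python
import numpy as np
from scipy.signal import fftconvolve

rng = np.random.default_rng(6)
dt, T, tau_d, tau_w = 0.01, 2000.0, 1.0, 0.1   # step, record length,
n = int(T / dt)                                 # pulse decay and mean waiting time

# Poisson arrivals on the grid with exponentially distributed amplitudes
fires = rng.random(n) < dt / tau_w
train = np.where(fires, rng.exponential(1.0, n), 0.0)

# one-sided exponential pulse shape, truncated at ten decay times
kernel = np.exp(-np.arange(0.0, 10.0 * tau_d, dt) / tau_d)
signal = fftconvolve(train, kernel)[:n]

# for exponential amplitudes the stationary PDF is gamma with shape
# tau_d / tau_w; for other amplitude laws the empirical characteristic
# function below is the object used for parameter estimation
u = np.linspace(0.0, 2.0, 21)
ecf = np.array([np.exp(1j * ui * signal).mean() for ui in u])
print("mean %.2f (expect %.1f)" % (signal.mean(), tau_d / tau_w))
```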

  1. Audio feature extraction using probability distribution function

    NASA Astrophysics Data System (ADS)

    Suhaib, A.; Wan, Khairunizam; Aziz, Azri A.; Hazry, D.; Razlan, Zuradzman M.; Shahriman A., B.

    2015-05-01

    Voice recognition has been one of the popular applications in the robotics field, and it has also recently been used in biometric and multimedia information retrieval systems. This technology builds on successive research into audio feature extraction. The probability distribution function (PDF) is a statistical method usually used as one step within complex feature extraction methods such as GMM and PCA. In this paper, a new method for audio feature extraction is proposed in which the PDF alone serves as the feature extraction method for speech analysis. Certain pre-processing techniques are performed prior to the proposed feature extraction. Subsequently, the PDF values for each frame of the sampled voice signals, obtained from a number of individuals, are plotted. From the experimental results, it can be seen visually from the plotted data that each individual's voice has comparable PDF values and shapes.
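
    A minimal sketch of using a histogram-estimated PDF per frame as the feature vector; the frame length, bin edges, and stand-in signal are assumptions, not the paper's settings:

```python
import numpy as np

def frame_pdf_features(x, frame_len=400, bins=32):
    """Histogram-estimated PDF of sample amplitudes, one feature vector per frame."""
    n_frames = len(x) // frame_len
    frames = x[:n_frames * frame_len].reshape(n_frames, frame_len)
    edges = np.linspace(-1.0, 1.0, bins + 1)   # assumes signal normalised to [-1, 1]
    return np.stack([np.histogram(f, bins=edges, density=True)[0] for f in frames])

# hypothetical normalised voice samples (one second at 16 kHz)
x = np.random.default_rng(9).uniform(-0.5, 0.5, 16_000)
feats = frame_pdf_features(x)
print(feats.shape)   # (frames, bins)
```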

  2. Using type IV Pearson distribution to calculate the probabilities of underrun and overrun of lists of multiple cases.

    PubMed

    Wang, Jihan; Yang, Kai

    2014-07-01

    An efficient operating room needs both little underutilised and overutilised time to achieve optimal cost efficiency. The probabilities of underrun and overrun of lists of cases can be estimated by a well defined duration distribution of the lists. To propose a method of predicting the probabilities of underrun and overrun of lists of cases using Type IV Pearson distribution to support case scheduling. Six years of data were collected. The first 5 years of data were used to fit distributions and estimate parameters. The data from the last year were used as testing data to validate the proposed methods. The percentiles of the duration distribution of lists of cases were calculated by Type IV Pearson distribution and t-distribution. Monte Carlo simulation was conducted to verify the accuracy of percentiles defined by the proposed methods. Operating rooms in John D. Dingell VA Medical Center, United States, from January 2005 to December 2011. Differences between the proportion of lists of cases that were completed within the percentiles of the proposed duration distribution of the lists and the corresponding percentiles. Compared with the t-distribution, the proposed new distribution is 8.31% (0.38) more accurate on average and 14.16% (0.19) more accurate in calculating the probabilities at the 10th and 90th percentiles of the distribution, which is a major concern of operating room schedulers. The absolute deviations between the percentiles defined by Type IV Pearson distribution and those from Monte Carlo simulation varied from 0.20  min (0.01) to 0.43  min (0.03). Operating room schedulers can rely on the most recent 10 cases with the same combination of surgeon and procedure(s) for distribution parameter estimation to plan lists of cases. Values are mean (SEM). The proposed Type IV Pearson distribution is more accurate than t-distribution to estimate the probabilities of underrun and overrun of lists of cases. However, as not all the individual case durations
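
    For context, the t-distribution comparator that the paper benchmarks against can be sketched as a lognormal prediction bound computed from the ten most recent cases of one surgeon-procedure pair. The durations below are hypothetical, and scipy has no built-in Pearson Type IV, so the proposed method itself is not reproduced here:

```python
import numpy as np
from scipy import stats

# ten most recent case durations (minutes) for one surgeon-procedure pair
d = np.array([92, 105, 88, 120, 97, 110, 101, 95, 130, 99], dtype=float)

# prediction bound for the next case on the log scale, using the t-distribution
logs = np.log(d)
n = len(d)
m, s = logs.mean(), logs.std(ddof=1)
q90 = np.exp(m + stats.t.ppf(0.90, n - 1) * s * np.sqrt(1.0 + 1.0 / n))
print("90th-percentile prediction ~ %.0f min" % q90)
```

    The paper's point is that a four-moment Pearson Type IV fit tracks the 10th and 90th percentiles of list durations more accurately than this two-moment lognormal/t approach.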

  3. A brief introduction to probability.

    PubMed

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: back in the 1600s, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution", a function providing the probabilities of occurrence of the different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, the distribution most relevant to statistical analysis.

  4. Fieldable computer system for determining gamma-ray pulse-height distributions, flux spectra, and dose rates from Little Boy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moss, C.E.; Lucas, M.C.; Tisinger, E.W.

    1984-01-01

    Our system consists of a LeCroy 3500 data acquisition system with a built-in CAMAC crate and eight bismuth-germanate detectors, 7.62 cm in diameter and 7.62 cm long. Gamma-ray pulse-height distributions are acquired simultaneously for up to eight positions. The system was very carefully calibrated and characterized from 0.1 to 8.3 MeV using gamma-ray spectra from a variety of radioactive sources. By fitting the pulse-height distributions from the sources with a function containing 17 parameters, we determined theoretical response functions. We use these response functions to unfold the distributions and obtain flux spectra. A flux-to-dose-rate conversion curve based on the work of Dimbylow and Francis is then used to obtain dose rates. Direct use of measured spectra and flux-to-dose-rate curves to obtain dose rates avoids the errors that can arise from spectrum dependence in simple gamma-ray dosimeter instruments. We present some gamma-ray doses for the Little Boy assembly operated at low power. These results can be used to determine the exposures of the Hiroshima survivors and thus aid in establishing radiation exposure limits for the nuclear industry.
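
    The unfold-then-convert pipeline can be imitated with a toy response matrix and non-negative least squares; everything below (matrix shape, spectrum, dose curve) is an invented stand-in for the 17-parameter response functions and the Dimbylow-Francis curve of the real system:

```python
import numpy as np
from scipy.optimize import nnls

# toy response matrix: column j = pulse-height response to unit flux at E[j],
# a full-energy peak on the diagonal plus a crude continuum below it
nbins = 40
E = np.linspace(0.1, 8.3, nbins)
peak = np.exp(-0.5 * ((E[:, None] - E[None, :]) / 0.15) ** 2)
continuum = 0.05 * (E[:, None] < E[None, :])
R = peak + continuum

flux_true = np.exp(-0.5 * E)                       # assumed spectrum shape
measured = R @ flux_true
measured += np.random.default_rng(7).normal(0.0, 1e-3, nbins)

flux_hat, _ = nnls(R, measured)                    # unfold with non-negative flux
dose_per_flux = 0.1 + 0.2 * E                      # toy flux-to-dose-rate curve
print("dose rate (arb. units): %.3f" % (flux_hat @ dose_per_flux))
```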

  5. Spatial Probability Distribution of Strata's Lithofacies and its Impacts on Land Subsidence in Huairou Emergency Water Resources Region of Beijing

    NASA Astrophysics Data System (ADS)

    Li, Y.; Gong, H.; Zhu, L.; Guo, L.; Gao, M.; Zhou, C.

    2016-12-01

    Continuous over-exploitation of groundwater causes dramatic drawdown and leads to regional land subsidence in the Huairou Emergency Water Resources region, located in the upper-middle part of the Chaobai river basin of Beijing. Owing to the spatial heterogeneity of the strata's lithofacies in the alluvial fan, ground deformation has no significant positive correlation with groundwater drawdown, and one of the challenges ahead is to quantify the spatial distribution of the lithofacies. The transition probability geostatistics approach provides potential for characterizing the distribution of heterogeneous lithofacies in the subsurface. Combining the thickness of the clay layer extracted from the simulation with the deformation field acquired by PS-InSAR, the influence of the strata's lithofacies on land subsidence can be analyzed quantitatively. The lithofacies derived from borehole data were generalized into four categories, and their probability distribution in the observed space was mined using transition probability geostatistics; clay was the predominant compressible material. Geologically plausible realizations of the lithofacies distribution were produced, accounting for the complex heterogeneity of the alluvial plain. At a particular probability level of more than 40 percent, the volume of clay defined was 55 percent of the total volume of lithofacies. This level, nearly equaling the volume of compressible clay derived from the geostatistics, was thus chosen to represent the boundary between compressible and incompressible material. The method incorporates statistical geological information, such as distribution proportions, average lengths, and juxtaposition tendencies of geological types, mainly derived from borehole data and expert knowledge, into the Markov chain model of transition probability. Some similarities of pattern were indicated between the spatial distribution of the deformation field and the clay layer. In the area with
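
    The core of transition probability geostatistics is a Markov chain over facies categories; a sketch of estimating a vertical transition matrix from one hypothetical borehole log (the facies codes and sequence are invented):

```python
import numpy as np

# vertical facies sequence from one borehole (0=clay, 1=silt, 2=sand, 3=gravel)
seq = np.array([0, 0, 1, 2, 2, 2, 1, 0, 0, 3, 2, 2, 0, 0, 1, 1, 2])
k = 4

T = np.zeros((k, k))
for a, b in zip(seq[:-1], seq[1:]):   # count upward transitions between layers
    T[a, b] += 1
T /= T.sum(axis=1, keepdims=True)     # row-normalise to transition probabilities
print(T.round(2))
```

    In the full method these one-dimensional transition rates, together with proportions and mean lengths, parametrise a spatial Markov chain used to simulate facies realizations in three dimensions.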

  6. Gamma-Ray Bursts from Neutron Star Kicks

    NASA Astrophysics Data System (ADS)

    Huang, Y. F.; Dai, Z. G.; Lu, T.; Cheng, K. S.; Wu, X. F.

    2003-09-01

    The idea that gamma-ray bursts might be a phenomenon associated with neutron star kicks was first proposed by Dar & Plaga. Here we study this mechanism in more detail and point out that the neutron star should be a high-speed one (with proper motion larger than ~1000 km s-1). It is shown that the model agrees well with observations in many aspects, such as the energetics, the event rate, the collimation, the bimodal distribution of durations, the narrowly clustered intrinsic energy, and the association of gamma-ray bursts with supernovae and star-forming regions. We also discuss the implications of this model on the neutron star kick mechanism and suggest that the high kick speed was probably acquired as the result of the electromagnetic rocket effect of a millisecond magnetar with an off-centered magnetic dipole.

  7. Spatial distribution of the gamma-ray bursts at very high redshift

    NASA Astrophysics Data System (ADS)

    Mészáros, Attila

    2018-05-01

    The author, with his collaborators, showed already in 1995-96, purely from analyses of the observations, that gamma-ray bursts (GRBs) can occur out to redshift 20. Since that time, several other statistical studies of the spatial distribution of GRBs have been carried out. Remarkable conclusions were obtained concerning the star-formation rate and the validity of the cosmological principle in the regions of the cosmic dawn. In this contribution these efforts are surveyed.

  8. Improving Conceptual Models Using AEM Data and Probability Distributions

    NASA Astrophysics Data System (ADS)

    Davis, A. C.; Munday, T. J.; Christensen, N. B.

    2012-12-01

    With emphasis being placed on uncertainty in groundwater modelling and prediction, coupled with questions concerning the value of geophysical methods in hydrogeology, it is important to ask meaningful questions of hydrogeophysical data and inversion results. For example, to characterise aquifers using electromagnetic (EM) data, we ask questions such as "Given that the electrical conductivity of aquifer 'A' is less than x, where is that aquifer elsewhere in the survey area?" The answer may be given by examining inversion models, selecting locations and layers that satisfy the condition 'conductivity <= x', and labelling them as aquifer 'A'. One difficulty with this approach is that the inversion result is often considered to be the only model for the data. In reality it is just one image of the subsurface that, given the method and the regularisation imposed in the inversion, agrees with the measured data within a given error bound. We have no idea whether the final model realised by the inversion attains the global minimum error, or whether it merely sits in a local minimum. There is a distribution of inversion models that satisfy the error tolerance condition: the final model is not the only one, nor is it necessarily the correct one. AEM inversions are often linearised in the calculation of parameter sensitivity: we rely on the second derivatives in the Taylor expansion, so the minimum model has all layer parameters distributed about their mean values with well-defined variance. We investigate the validity of the minimum model, and its uncertainty, by examining the full posterior covariance matrix. We ask questions of the minimum model and answer them probabilistically. The simplest question we can pose is "What is the probability that all layer resistivity values are <= a cut-off value?" This can be calculated through use of the erf or erfc functions. The covariance values of the inversion become marginalised in the integration: only the
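
    The erf-based layer probability is a one-line calculation; the sketch below treats the marginalised layer distributions as independent Gaussians with hypothetical means and standard deviations (a simplification: the paper works with the full posterior covariance):

```python
import numpy as np
from math import erf, sqrt

def p_below(cutoff, mean, sd):
    """P(parameter <= cutoff) for a Gaussian marginal, via the error function."""
    return 0.5 * (1.0 + erf((cutoff - mean) / (sd * sqrt(2.0))))

# hypothetical layer log10-resistivities and marginal standard deviations
means = np.array([1.2, 0.8, 1.5])
sds = np.array([0.2, 0.1, 0.4])

p_each = [p_below(1.0, m, s) for m, s in zip(means, sds)]
p_all = float(np.prod(p_each))   # independence assumed across layers here
print(p_each, p_all)
```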

  9. Unveiling the Gamma-Ray Source Count Distribution Below the Fermi Detection Limit with Photon Statistics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zechlin, Hannes-S.; Cuoco, Alessandro; Donato, Fiorenza

    The source-count distribution as a function of flux, dN/dS, is one of the main quantities characterizing gamma-ray source populations. In this paper, we employ statistical properties of the Fermi Large Area Telescope (LAT) photon counts map to measure the composition of the extragalactic gamma-ray sky at high latitudes (|b| ≥ 30°) between 1 and 10 GeV. We present a new method, generalizing the use of standard pixel-count statistics, to decompose the total observed gamma-ray emission into (a) point-source contributions, (b) the Galactic foreground contribution, and (c) a truly diffuse isotropic background contribution. Using the 6 yr Fermi-LAT data set (P7REP), we show that the dN/dS distribution in the regime of so-far-undetected point sources can be consistently described with a power law with an index between 1.9 and 2.0. We measure dN/dS down to an integral flux of ~2 × 10⁻¹¹ cm⁻² s⁻¹, improving beyond the 3FGL catalog detection limit by about one order of magnitude. The overall dN/dS distribution is consistent with a broken power law, with a break at 2.1(+1.0/−1.3) × 10⁻⁸ cm⁻² s⁻¹. The power-law index n₁ = 3.1(+0.7/−0.5) for bright sources above the break hardens to n₂ = 1.97 ± 0.03 for fainter sources below the break. A possible second break of the dN/dS distribution is constrained to be at fluxes below 6.4 × 10⁻¹¹ cm⁻² s⁻¹ at 95% confidence level. Finally, the high-latitude gamma-ray sky between 1 and 10 GeV is shown to be composed of ~25% point sources, ~69.3% diffuse Galactic foreground emission, and ~6% isotropic diffuse background.

  10. Unveiling the Gamma-Ray Source Count Distribution Below the Fermi Detection Limit with Photon Statistics

    DOE PAGES

    Zechlin, Hannes-S.; Cuoco, Alessandro; Donato, Fiorenza; ...

    2016-07-26

    The source-count distribution as a function of flux, dN/dS, is one of the main quantities characterizing gamma-ray source populations. In this paper, we employ statistical properties of the Fermi Large Area Telescope (LAT) photon counts map to measure the composition of the extragalactic gamma-ray sky at high latitudes (|b| >= 30°) between 1 and 10 GeV. We present a new method, generalizing the use of standard pixel-count statistics, to decompose the total observed gamma-ray emission into (a) point-source contributions, (b) the Galactic foreground contribution, and (c) a truly diffuse isotropic background contribution. Using the 6 yr Fermi-LAT data set (P7REP), we show that the dN/dS distribution in the regime of so-far-undetected point sources can be consistently described with a power law with an index between 1.9 and 2.0. We measure dN/dS down to an integral flux of ~2 × 10^-11 cm^-2 s^-1, improving beyond the 3FGL catalog detection limit by about one order of magnitude. The overall dN/dS distribution is consistent with a broken power law, with a break at 2.1 (+1.0/-1.3) × 10^-8 cm^-2 s^-1. The power-law index n1 = 3.1 (+0.7/-0.5) for bright sources above the break hardens to n2 = 1.97 ± 0.03 for fainter sources below the break. A possible second break of the dN/dS distribution is constrained to be at fluxes below 6.4 × 10^-11 cm^-2 s^-1 at 95% confidence level. Finally, the high-latitude gamma-ray sky between 1 and 10 GeV is shown to be composed of ~25% point sources, ~69.3% diffuse Galactic foreground emission, and ~6% isotropic diffuse background.

  11. Growth kinetics of gamma-prime precipitates in a directionally solidified eutectic, gamma/gamma-prime-delta

    NASA Technical Reports Server (NTRS)

    Tewari, S. N.

    1976-01-01

    A directionally solidified eutectic alloy (DSEA), one of those viewed as potential candidates for the next generation of aircraft gas turbine blade materials, is studied for gamma-prime growth kinetics in the Ni-Nb-Cr-Al system, specifically the Ni-20 w/o Nb-6 w/o Cr-2.5 w/o Al gamma/gamma-prime-delta DSEA. Heat treatment, polishing and etching, and preparation for electron micrography are described, and the size distribution of the gamma-prime phase following various anneals is plotted, along with gamma-prime growth kinetics in this specific DSEA and the cube of gamma-prime particle size vs. anneal time. Activation energies and coarsening kinetics are studied.

  12. Analysis of nonlocal neural fields for both general and gamma-distributed connectivities

    NASA Astrophysics Data System (ADS)

    Hutt, Axel; Atay, Fatihcan M.

    2005-04-01

    This work studies the stability of equilibria in spatially extended neuronal ensembles. We first derive the model equation from statistical properties of the neuron population. The obtained integro-differential equation includes synaptic and space-dependent transmission delay for both general and gamma-distributed synaptic connectivities. The latter connectivity type reveals infinite, finite, and vanishing self-connectivities. The work derives conditions for stationary and nonstationary instabilities for both kernel types. In addition, a nonlinear analysis for general kernels yields the order parameter equation of the Turing instability. To compare the results to findings for partial differential equations (PDEs), two typical PDE-types are derived from the examined model equation, namely the general reaction-diffusion equation and the Swift-Hohenberg equation. Hence, the discussed integro-differential equation generalizes these PDEs. In the case of the gamma-distributed kernels, the stability conditions are formulated in terms of the mean excitatory and inhibitory interaction ranges. As a novel finding, we obtain Turing instabilities in fields with local inhibition-lateral excitation, while wave instabilities occur in fields with local excitation and lateral inhibition. Numerical simulations support the analytical results.

  13. Properties of the probability density function of the non-central chi-squared distribution

    NASA Astrophysics Data System (ADS)

    András, Szilárd; Baricz, Árpád

    2008-10-01

    In this paper we consider the probability density function (pdf) of a non-central chi-squared distribution with an arbitrary number of degrees of freedom. For this function we prove that it can be represented as a finite sum, and we deduce a partial derivative formula. Moreover, we show that the pdf is log-concave when the number of degrees of freedom is greater than or equal to 2. At the end of the paper we present some Turán-type inequalities for this function, and an elegant application of the monotone form of l'Hospital's rule in probability theory is given.
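
    For context, the pdf in question can also be written as the classical Poisson-weighted mixture of central chi-squared densities. The sketch below (an infinite series truncated at jmax terms, related to but not the same as the finite-sum form the paper derives) checks that representation against scipy's built-in ncx2:

```python
# Non-central chi-squared pdf as a Poisson-weighted mixture of central
# chi-squared pdfs, compared with scipy's reference implementation.
# Truncation at jmax terms is an assumption of this illustration.
import numpy as np
from scipy.stats import chi2, ncx2, poisson

def ncx2_pdf_series(x, k, lam, jmax=200):
    """f(x; k, lam) = sum_j Pois(j; lam/2) * chi2.pdf(x, k + 2j)."""
    j = np.arange(jmax)
    weights = poisson.pmf(j, lam / 2.0)
    return np.sum(weights * chi2.pdf(x, k + 2 * j))

x, k, lam = 7.5, 4, 3.0
print(ncx2_pdf_series(x, k, lam))   # series evaluation
print(ncx2.pdf(x, k, lam))          # scipy reference; the two should agree
```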

  14. Variation in the standard deviation of the lure rating distribution: Implications for estimates of recollection probability.

    PubMed

    Dopkins, Stephen; Varner, Kaitlin; Hoyer, Darin

    2017-10-01

    In word recognition, semantic priming of test words increased the false-alarm rate and the mean of confidence ratings to lures. Such priming also increased the standard deviation of confidence ratings to lures and the slope of the z-ROC function, suggesting that the priming increased the standard deviation of the lure evidence distribution. The Unequal Variance Signal Detection (UVSD) model interpreted the priming as increasing the standard deviation of the lure evidence distribution. Without additional parameters, the Dual Process Signal Detection (DPSD) model could only accommodate the results by fitting the data for related and unrelated primes separately, interpreting the priming, implausibly, as decreasing the probability of target recollection. With an additional parameter for the probability of false (lure) recollection, the model could fit the data for related and unrelated primes together, interpreting the priming as increasing the probability of false recollection. These results suggest that DPSD estimates of target recollection probability will decrease with increases in the lure confidence/evidence standard deviation unless a parameter is included for false recollection. Unfortunately, the size of a given lure confidence/evidence standard deviation relative to other possible lure confidence/evidence standard deviations is often unspecified by context. Hence the model often has no way of estimating false recollection probability and thereby correcting its estimates of target recollection probability.

  15. The Distribution of Cosmic-Ray Sources in the Galaxy, Gamma-Rays and the Gradient in the CO-to-H2 Relation

    NASA Technical Reports Server (NTRS)

    Strong, A. W.; Moskalenko, I. V.; Reimer, O.; Diehl, S.; Diehl, R.

    2004-01-01

    We present a solution to the apparent discrepancy between the radial gradient in the diffuse Galactic gamma-ray emissivity and the distribution of supernova remnants, believed to be the sources of cosmic rays. Recent determinations of the pulsar distribution have made the discrepancy even more apparent. The problem is shown to be plausibly solved by a variation in the W(CO)-to-N(H2) scaling factor. If this factor increases by a factor of 5-10 from the inner to the outer Galaxy, as expected from the Galactic metallicity gradient and supported by other evidence, we show that the source distribution required to match the radial gradient of gamma-rays can be reconciled with the distribution of supernova remnants as traced by current studies of pulsars. The resulting model fits the EGRET gamma-ray profiles extremely well in longitude, and reproduces the mid-latitude inner Galaxy intensities better than previous models.

  16. The Probability Distribution for a Biased Spinner

    ERIC Educational Resources Information Center

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)

  17. A Performance Comparison on the Probability Plot Correlation Coefficient Test using Several Plotting Positions for GEV Distribution.

    NASA Astrophysics Data System (ADS)

    Ahn, Hyunjun; Jung, Younghun; Om, Ju-Seong; Heo, Jun-Haeng

    2014-05-01

    Selecting an appropriate probability distribution is very important in statistical hydrology. A goodness-of-fit test is a statistical method for selecting an appropriate probability model for given data. The probability plot correlation coefficient (PPCC) test, one such goodness-of-fit test, was originally developed for the normal distribution. Since then, the test has been widely applied to other probability models. The PPCC test is regarded as one of the best goodness-of-fit tests because of its high rejection power. In this study, we focus on PPCC tests for the GEV distribution, which is widely used worldwide. For the GEV model, several plotting position formulas have been suggested. However, PPCC statistics have been derived only for the plotting position formulas (Goel and De, In-na and Nguyen, and Kim et al.) in which the skewness coefficient (or shape parameter) is included. Regression equations are then derived as a function of the shape parameter and sample size for a given significance level. In addition, the rejection powers of these formulas are compared using Monte Carlo simulation. Keywords: Goodness-of-fit test, Probability plot correlation coefficient test, Plotting position, Monte Carlo simulation. ACKNOWLEDGEMENTS: This research was supported by a grant 'Establishing Active Disaster Management System of Flood Control Structures by using 3D BIM Technique' [NEMA-12-NH-57] from the Natural Hazard Mitigation Research Group, National Emergency Management Agency of Korea.
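
    A PPCC statistic of this kind can be sketched as the correlation between ordered observations and GEV quantiles evaluated at plotting positions. The Cunnane-type formula and the shape value below are illustrative stand-ins, not the skewness-dependent formulas (Goel and De, In-na and Nguyen, Kim et al.) the study evaluates:

```python
# PPCC for the GEV distribution: correlate sorted data against GEV
# quantiles at plotting positions p_i = (i - a)/(n + 1 - 2a).
import numpy as np
from scipy.stats import genextreme

def ppcc_gev(sample, shape, a=0.4):
    """PPCC statistic for a GEV with a given shape parameter."""
    x = np.sort(sample)
    n = len(x)
    p = (np.arange(1, n + 1) - a) / (n + 1 - 2 * a)
    q = genextreme.ppf(p, shape)        # standard GEV quantiles
    return np.corrcoef(x, q)[0, 1]

rng = np.random.default_rng(1)
sample = genextreme.rvs(-0.1, size=50, random_state=rng)
print("PPCC:", ppcc_gev(sample, -0.1))  # close to 1 for a well-fitting model
```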

  18. The Intensity Distribution for Gamma-Ray Bursts Observed with BATSE

    NASA Technical Reports Server (NTRS)

    Pendleton, Geoffrey N.; Mallozzi, Robert S.; Paciesas, William S.; Briggs, Michael S.; Preece, Robert D.; Koshut, Tom M.; Horack, John M.; Meegan, Charles A.; Fishman, Gerald J.; Hakkila, Jon; et al.

    1996-01-01

    The intensity distributions of gamma-ray bursts observed by BATSE from 19 April 1991 to 19 September 1994 are presented. For this data set, <V/V_max> is 0.329 +/- 0.011, which is 15.5 sigma away from the value of 0.5 expected for a homogeneous distribution. Standard cosmological model parameters are obtained by fitting the differentially binned peak flux distribution expressed in units of photons cm^-2 s^-1 in the energy range 50-300 keV. The value of z calculated for a peak flux of 1 photon cm^-2 s^-1 is 0.8 +/- 0.33. The procedures used to produce the peak flux data and C_p/C_lim data are presented. The differences between the two representations of burst intensity are emphasized so that researchers can determine which type of data is most appropriate for their studies. The sky sensitivity correction as a function of intensity for the peak flux data is also described.

  19. Comparison of high energy gamma rays from absolute value of b greater than 30 deg with the galactic neutral hydrogen distribution

    NASA Technical Reports Server (NTRS)

    Ozel, M. E.; Ogelman, H.; Tumer, T.; Fichtel, C. E.; Hartman, R. C.; Kniffen, D. A.; Thompson, F. J.

    1978-01-01

    High-energy gamma-ray (energy above 35 MeV) data from the SAS 2 satellite have been used to compare the intensity distribution of gamma rays with that of neutral hydrogen (H I) density along the line of sight, at high galactic latitudes (absolute values greater than 30 deg). A model has been constructed for the case where the observed gamma-ray intensity has been assumed to be the sum of a galactic component proportional to the H I distribution plus an isotropic extragalactic emission. A chi-squared test of the model parameters indicates that about 30% of the total high-latitude emission may originate within the Galaxy.

  20. Probability Distributions over Cryptographic Protocols

    DTIC Science & Technology

    2009-06-01

    (Abstract unavailable; the indexed excerpt is table-of-contents residue listing an artificial immune algorithm, design decisions, and message- and protocol-creation algorithms for unbounded, unbounded naive, intended-run, and realistic distributions.)

  1. Seasonal Variability of Middle Latitude Ozone in the Lowermost Stratosphere Derived from Probability Distribution Functions

    NASA Technical Reports Server (NTRS)

    Rood, Richard B.; Douglass, Anne R.; Cerniglia, Mark C.; Sparling, Lynn C.; Nielsen, J. Eric

    1999-01-01

    We present a study of the distribution of ozone in the lowermost stratosphere with the goal of characterizing the observed variability. The air in the lowermost stratosphere is divided into two population groups based on Ertel's potential vorticity at 300 hPa. High (low) potential vorticity at 300 hPa indicates that the tropopause is low (high), and the identification of these two groups is made to account for the dynamic variability. Conditional probability distribution functions are used to define the statistics of the ozone distribution from both observations and a three-dimensional model simulation using winds from the Goddard Earth Observing System Data Assimilation System for transport. Ozone data sets include ozonesonde observations from northern midlatitude stations (1991-96) and midlatitude observations made by the Halogen Occultation Experiment (HALOE) on the Upper Atmosphere Research Satellite (UARS) (1994-1998). The conditional probability distribution functions are calculated at a series of potential temperature surfaces spanning the domain from the midlatitude tropopause to surfaces higher than the mean tropical tropopause (approximately 380K). The probability distribution functions are similar for the two data sources, despite differences in horizontal and vertical resolution and spatial and temporal sampling. Comparisons with the model demonstrate that the model maintains a mix of air in the lowermost stratosphere similar to the observations. The model also simulates a realistic annual cycle. Results show that during summer, much of the observed variability is explained by the height of the tropopause. During the winter and spring, when the tropopause fluctuations are larger, less of the variability is explained by tropopause height. This suggests that more mixing occurs during these seasons. During all seasons, there is a transition zone near the tropopause that contains air characteristic of both the troposphere and the stratosphere. The

  2. A New Insight into the Earthquake Recurrence Studies from the Three-parameter Generalized Exponential Distributions

    NASA Astrophysics Data System (ADS)

    Pasari, S.; Kundu, D.; Dikshit, O.

    2012-12-01

    Earthquake recurrence interval is one of the important ingredients in probabilistic seismic hazard assessment (PSHA) for any location. The exponential, gamma, Weibull and lognormal distributions are well-established probability models for recurrence interval estimation. However, they have certain shortcomings, so it is worth searching for alternative, more sophisticated distributions. In this paper, we introduce a three-parameter (location, scale and shape) exponentiated exponential distribution and investigate its scope as an alternative to the afore-mentioned distributions in earthquake recurrence studies. This distribution is a particular member of the exponentiated Weibull family. Despite its complicated form, it is widely accepted in medical and biological applications. Furthermore, it shares many physical properties with the gamma and Weibull families. Unlike the gamma distribution, the hazard function of the generalized exponential distribution can be easily computed even if the shape parameter is not an integer. To assess the plausibility of this model, a complete and homogeneous earthquake catalogue of 20 events (M >= 7.0) spanning the period 1846 to 1995 from the North-East Himalayan region (20-32 deg N and 87-100 deg E) has been used. The model parameters are estimated using the maximum likelihood estimator (MLE) and the method of moments estimator (MOME). No geological or geophysical evidence has been considered in this calculation. The estimated conditional probability becomes quite high after about a decade for an elapsed time of 17 years (i.e. 2012). Moreover, this study shows that the generalized exponential distribution fits the above data more closely than the conventional models, and hence it is tentatively concluded that the generalized exponential distribution can be effectively considered in earthquake recurrence studies.
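
    A sketch of the three-parameter exponentiated (generalized) exponential model described above; the closed-form hazard illustrates the computational advantage over the gamma model noted in the abstract, and the parameter values are illustrative only:

```python
# Exponentiated exponential distribution:
#   F(t) = (1 - exp(-(t - mu)/sigma))^alpha  for t > mu,
# with shape alpha, location mu, scale sigma.
import numpy as np

def gexp_cdf(t, alpha, mu, sigma):
    z = np.clip((t - mu) / sigma, 0.0, None)
    return (1.0 - np.exp(-z)) ** alpha

def gexp_pdf(t, alpha, mu, sigma):
    z = np.clip((t - mu) / sigma, 0.0, None)
    return (alpha / sigma) * (1.0 - np.exp(-z)) ** (alpha - 1.0) * np.exp(-z)

def gexp_hazard(t, alpha, mu, sigma):
    """Hazard h(t) = f(t) / (1 - F(t)); closed form for any alpha."""
    return gexp_pdf(t, alpha, mu, sigma) / (1.0 - gexp_cdf(t, alpha, mu, sigma))

t = np.linspace(1.0, 60.0, 5)   # e.g. years since the last event
print(gexp_hazard(t, alpha=1.8, mu=0.0, sigma=15.0))
```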

  3. Probability evolution method for exit location distribution

    NASA Astrophysics Data System (ADS)

    Zhu, Jinjie; Chen, Zhen; Liu, Xianbin

    2018-03-01

    The exit problem in the framework of large deviation theory has been a hot topic in the past few decades. The most probable escape path in the weak-noise limit has been clarified by the Freidlin-Wentzell action functional. However, noise in real physical systems cannot be arbitrarily small, and noise with finite strength may induce nontrivial phenomena, such as noise-induced shift and noise-induced saddle-point avoidance. Traditional Monte Carlo simulation of noise-induced escape takes exponentially large time as noise approaches zero, with the majority of the time wasted on uninteresting wandering around the attractors. In this paper, a new method is proposed to decrease the escape simulation time by an exponentially large factor by introducing a series of interfaces and applying reinjection on them. This method can be used to calculate the exit location distribution. It is verified on two classical examples and compared with theoretical predictions. The results show that the method performs well for weak noise, while it may induce certain deviations for large noise. Finally, some possible ways to improve the method are discussed.

  4. Parameter Estimation in Astronomy with Poisson-Distributed Data. 1; The (CHI)2(gamma) Statistic

    NASA Technical Reports Server (NTRS)

    Mighell, Kenneth J.

    1999-01-01

    Applying the standard weighted mean formula, [Σ_i n_i σ_i^-2] / [Σ_i σ_i^-2], to determine the weighted mean of data n_i drawn from a Poisson distribution will, on average, underestimate the true mean by ~1 for all true mean values larger than ~3 when the common assumption is made that the error of the i-th observation is σ_i = max(√n_i, 1). This small but statistically significant offset explains the long-known observation that chi-square minimization techniques which use the modified Neyman's chi-squared statistic, χ²_N ≡ Σ_i (n_i - y_i)² / max(n_i, 1), to compare Poisson-distributed data with model values y_i will typically predict a total number of counts that underestimates the true total by about 1 count per bin. Based on my finding that the weighted mean of data drawn from a Poisson distribution can be determined using the formula [Σ_i (n_i + min(n_i, 1))(n_i + 1)^-1] / [Σ_i (n_i + 1)^-1], I propose that a new chi-squared statistic, χ²_γ ≡ Σ_i (n_i + min(n_i, 1) - y_i)² / (n_i + 1), should always be used to analyze Poisson-distributed data in preference to the modified Neyman's chi-squared statistic. I demonstrate the power and usefulness of χ²_γ minimization by using two statistical fitting techniques and five chi-squared statistics to analyze simulated X-ray power-law 15-channel spectra with large and small counts per bin. I show that χ²_γ minimization with the Levenberg-Marquardt or Powell's method can produce excellent results (mean slope errors of approximately less than 3%) with spectra having as few as 25 total counts.
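
    The proposed statistic is simple to implement. A minimal sketch comparing it with the modified Neyman statistic on simulated Poisson data (the model values y are illustrative):

```python
# Modified Neyman's chi-squared versus the chi-squared-gamma statistic
# from the abstract, evaluated on the same simulated Poisson counts.
import numpy as np

def chi2_neyman(n, y):
    return np.sum((n - y) ** 2 / np.maximum(n, 1))

def chi2_gamma(n, y):
    return np.sum((n + np.minimum(n, 1) - y) ** 2 / (n + 1))

rng = np.random.default_rng(0)
y = np.full(15, 4.0)          # true model: 4 counts per bin, 15 channels
n = rng.poisson(y)            # simulated Poisson data
print("Neyman:", chi2_neyman(n, y), " gamma:", chi2_gamma(n, y))
```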

  5. On the probability distribution of daily streamflow in the United States

    USGS Publications Warehouse

    Blum, Annalise G.; Archfield, Stacey A.; Vogel, Richard M.

    2017-01-01

    Daily streamflows are often represented by flow duration curves (FDCs), which illustrate the frequency with which flows are equaled or exceeded. FDCs have had broad applications across both operational and research hydrology for decades; however, modeling FDCs has proven elusive. Daily streamflow is a complex time series with flow values ranging over many orders of magnitude. The identification of a probability distribution that can approximate daily streamflow would improve understanding of the behavior of daily flows and the ability to estimate FDCs at ungaged river locations. Comparisons of modeled and empirical FDCs at nearly 400 unregulated, perennial streams illustrate that the four-parameter kappa distribution provides a very good representation of daily streamflow across the majority of physiographic regions in the conterminous United States (US). Further, for some regions of the US, the three-parameter generalized Pareto and lognormal distributions also provide a good approximation to FDCs. Similar results are found for the period of record FDCs, representing the long-term hydrologic regime at a site, and median annual FDCs, representing the behavior of flows in a typical year.
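
    As a sketch of the fitting idea (not the authors' exact procedure), scipy's four-parameter kappa distribution can be fitted to daily flows and exceedance quantiles read off the modeled FDC; the synthetic flow record below is generated from a kappa4 itself, so the fit has a well-specified target:

```python
# Fit a four-parameter kappa distribution to daily flows and read flow
# duration curve quantiles from it. All parameter values are illustrative.
import numpy as np
from scipy.stats import kappa4

rng = np.random.default_rng(2)
flows = kappa4.rvs(0.2, 0.3, loc=20.0, scale=15.0, size=3650,
                   random_state=rng)       # ten years of fake daily flows

h, k, loc, scale = kappa4.fit(flows)       # maximum-likelihood fit

# FDC: flow equaled or exceeded p percent of the time
for p in (5, 50, 95):
    q = kappa4.ppf(1 - p / 100.0, h, k, loc=loc, scale=scale)
    print(f"Q{p}: {q:.2f}")
```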

  6. Estimating the empirical probability of submarine landslide occurrence

    USGS Publications Warehouse

    Geist, Eric L.; Parsons, Thomas E.; Mosher, David C.; Shipp, Craig; Moscardelli, Lorena; Chaytor, Jason D.; Baxter, Christopher D. P.; Lee, Homa J.; Urgeles, Roger

    2010-01-01

    The empirical probability of submarine landslide occurrence at a given location can be estimated from age dates of past landslides. In this study, tools developed to estimate earthquake probability from paleoseismic horizons are adapted to estimate submarine landslide probability. In both types of estimate, one has to account for the uncertainty associated with age-dating individual events as well as the open time intervals before and after the observed sequence of landslides. For observed sequences of submarine landslides, we typically have only the age date of the youngest event and possibly of a seismic horizon that lies below the oldest event in a landslide sequence. We use an empirical Bayes analysis based on the Poisson-Gamma conjugate prior model, specifically applied to the landslide probability problem. This model assumes that landslide events as imaged in geophysical data are independent and occur in time according to a Poisson distribution characterized by a rate parameter λ. With this method, we are able to estimate the most likely value of λ and, importantly, the range of uncertainty in this estimate. Examples considered include landslide sequences observed in the Santa Barbara Channel, California, and in Port Valdez, Alaska. By performing a statistical test of age dates representing the main failure episode of the Holocene Storegga landslide complex, we confirm that, given the uncertainties of age dating, landslide complexes can be treated as single events.
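
    A minimal sketch of the Poisson-Gamma conjugate update at the heart of this kind of analysis, with hypothetical prior parameters and event counts:

```python
# Poisson-Gamma conjugate model for an event rate lambda: with a
# Gamma(a0, b0) prior and n dated events observed over a record of
# length T, the posterior is Gamma(a0 + n, b0 + T). Numbers are toy values.
from scipy.stats import gamma

a0, b0 = 0.5, 0.01        # weak prior (shape, rate)
n, T = 5, 12.0            # e.g. 5 landslides in a 12 kyr record

a_post, b_post = a0 + n, b0 + T
post = gamma(a_post, scale=1.0 / b_post)

print("posterior mean rate:", post.mean(), "events per kyr")
print("95% credible interval:", post.ppf([0.025, 0.975]))
```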

  7. A revision of the gamma-evaluation concept for the comparison of dose distributions.

    PubMed

    Bakai, Annemarie; Alber, Markus; Nüsslin, Fridtjof

    2003-11-07

    A method for the quantitative four-dimensional (4D) evaluation of discrete dose data based on gradient-dependent local acceptance thresholds is presented. The method takes into account the local dose gradients of a reference distribution for critical appraisal of misalignment and collimation errors. These contribute to the maximum tolerable dose error at each evaluation point to which the local dose differences between comparison and reference data are compared. As shown, the presented concept is analogous to the gamma-concept of Low et al (1998a Med. Phys. 25 656-61) if extended to (3+1) dimensions. The pointwise dose comparisons of the reformulated concept are easier to perform and speed up the evaluation process considerably, especially for fine-grid evaluations of 3D dose distributions. The occurrences of false negative indications due to the discrete nature of the data are reduced with the method. The presented method was applied to film-measured, clinical data and compared with gamma-evaluations. 4D and 3D evaluations were performed. Comparisons prove that 4D evaluations have to be given priority, especially if complex treatment situations are verified, e.g., non-coplanar beam configurations.
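
    For reference, the original gamma concept of Low et al. that the paper reformulates can be sketched in one dimension as a combined dose-difference and distance-to-agreement search. The criteria (3%, 3 mm) and profiles below are illustrative, and this is the standard gamma index, not the gradient-threshold reformulation itself:

```python
# 1D gamma index (Low et al. 1998): for each reference point, search the
# evaluated profile for the minimum combined dose/distance discrepancy.
import numpy as np

def gamma_index(x, d_ref, d_eval, dose_tol=0.03, dist_tol=3.0):
    """Return gamma at each reference point (a point passes if gamma <= 1)."""
    g = np.empty_like(d_ref)
    for i, (xi, di) in enumerate(zip(x, d_ref)):
        dist2 = ((x - xi) / dist_tol) ** 2
        dose2 = ((d_eval - di) / (dose_tol * d_ref.max())) ** 2
        g[i] = np.sqrt(np.min(dist2 + dose2))
    return g

x = np.linspace(0.0, 100.0, 201)              # position in mm
d_ref = np.exp(-((x - 50.0) / 20.0) ** 2)     # reference dose profile
d_eval = np.exp(-((x - 51.0) / 20.0) ** 2)    # comparison, shifted by 1 mm
print("max gamma:", gamma_index(x, d_ref, d_eval).max())
```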

  8. Statistical Distributions of Optical Flares from Gamma-Ray Bursts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yi, Shuang-Xi; Yu, Hai; Wang, F. Y.

    2017-07-20

    We statistically study gamma-ray burst (GRB) optical flares from the Swift/UVOT catalog. We compile 119 optical flares, including 77 flares with redshift measurements. Some tight correlations among the timescales of optical flares are found. For example, the rise time is correlated with the decay time, and the duration is correlated with the peak time of optical flares. These two tight correlations indicate that longer rise times are associated with longer decay times, and also suggest that broader optical flares peak at later times, consistent with the corresponding correlations for X-ray flares. We also study the frequency distributions of optical flare parameters, including the duration, rise time, decay time, peak time, and waiting time. Similar power-law distributions for optical and X-ray flares are found. Our statistical results imply that GRB optical flares and X-ray flares may share a similar physical origin, and both are possibly related to central engine activities.

  9. Seasonal Variability of Middle Latitude Ozone in the Lowermost Stratosphere Derived from Probability Distribution Functions

    NASA Technical Reports Server (NTRS)

    Cerniglia, M. C.; Douglass, A. R.; Rood, R. B.; Sparling, L. C.; Nielsen, J. E.

    1999-01-01

    We present a study of the distribution of ozone in the lowermost stratosphere with the goal of understanding the relative contribution to the observations of air of either distinctly tropospheric or stratospheric origin. The air in the lowermost stratosphere is divided into two population groups based on Ertel's potential vorticity at 300 hPa. High [low] potential vorticity at 300 hPa suggests that the tropopause is low [high], and the identification of the two groups helps to account for dynamic variability. Conditional probability distribution functions are used to define the statistics of the mix from both observations and model simulations. Two data sources are chosen. First, several years of ozonesonde observations are used to exploit the high vertical resolution. Second, observations made by the Halogen Occultation Experiment [HALOE] on the Upper Atmosphere Research Satellite [UARS] are used to understand the impact on the results of the spatial limitations of the ozonesonde network. The conditional probability distribution functions are calculated at a series of potential temperature surfaces spanning the domain from the midlatitude tropopause to surfaces higher than the mean tropical tropopause [about 380K]. Despite the differences in spatial and temporal sampling, the probability distribution functions are similar for the two data sources. Comparisons with the model demonstrate that the model maintains a mix of air in the lowermost stratosphere similar to the observations. The model also simulates a realistic annual cycle. By using the model, possible mechanisms for the maintenance of mix of air in the lowermost stratosphere are revealed. The relevance of the results to the assessment of the environmental impact of aircraft effluence is discussed.

  11. Computation of marginal distributions of peak-heights in electropherograms for analysing single source and mixture STR DNA samples.

    PubMed

    Cowell, Robert G

    2018-05-04

    Current models for single source and mixture samples, and the probabilistic genotyping software based on them used for analysing STR electropherogram data, assume simple probability distributions, such as the gamma distribution, to model the allelic peak height variability given the initial amount of DNA prior to PCR amplification. Here we illustrate how amplicon number distributions, for a model of the process of sample DNA collection and PCR amplification, may be efficiently computed by evaluating probability generating functions using discrete Fourier transforms.
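
    The generating-function trick can be sketched generically: evaluate the PGF at the N-th roots of unity and invert with an FFT to recover the probability mass function. The PGF below (a Bernoulli-thinned Poisson, a toy stand-in for an amplicon-number model) is an assumption of this illustration, not the paper's model:

```python
# Recover a pmf from its probability generating function G via a DFT:
#   p_k = (1/N) * sum_j G(w^j) * w^(-j*k),  w = exp(2*pi*i/N),  k < N.
import numpy as np

def pmf_from_pgf(G, N):
    z = np.exp(2j * np.pi * np.arange(N) / N)   # N-th roots of unity
    return np.real(np.fft.fft(G(z))) / N

lam, q = 3.0, 0.4   # mean template count, per-template success probability
G = lambda z: np.exp(lam * ((1 - q + q * z) - 1.0))  # thinned Poisson PGF
pmf = pmf_from_pgf(G, 64)
print(pmf[:6], pmf.sum())   # first few probabilities; total is ~1
```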

  12. Monitoring the distribution of prompt gamma rays in boron neutron capture therapy using a multiple-scattering Compton camera: A Monte Carlo simulation study

    NASA Astrophysics Data System (ADS)

    Lee, Taewoong; Lee, Hyounggun; Lee, Wonho

    2015-10-01

    This study evaluated the use of Compton imaging technology to monitor prompt gamma rays emitted by 10B in boron neutron capture therapy (BNCT) applied to a computerized human phantom. The Monte Carlo method, including particle-tracking techniques, was used for the simulation. The distribution of prompt gamma rays emitted by the phantom during irradiation with neutron beams is closely associated with the distribution of boron in the phantom. The maximum likelihood expectation maximization (MLEM) method was applied to the information obtained from the detected prompt gamma rays to reconstruct the distribution of the tumor, including the boron uptake regions (BURs). The reconstructed Compton images of the prompt gamma rays were combined with the cross-sectional images of the human phantom. Quantitative analysis of the intensity curves showed that all combined images matched the predetermined conditions of the simulation. The tumors including the BURs were distinguishable if they were more than 2 cm apart.
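
    A minimal sketch of the MLEM update itself, applied to a toy linear system rather than an actual Compton-camera response matrix:

```python
# MLEM iteration: lambda <- (lambda / A^T 1) * A^T (y / (A lambda)).
# The system matrix A here is a random toy stand-in for a camera response.
import numpy as np

rng = np.random.default_rng(3)
n_det, n_vox = 60, 20
A = rng.random((n_det, n_vox))               # detection probabilities (toy)
truth = np.zeros(n_vox); truth[8:11] = 5.0   # a small "uptake region"
y = rng.poisson(A @ truth)                   # measured counts

lam = np.ones(n_vox)                         # flat initial image
sens = A.T @ np.ones(n_det)                  # sensitivity image A^T 1
for _ in range(50):
    lam *= (A.T @ (y / (A @ lam))) / sens

print(np.round(lam, 2))                      # peaks near voxels 8-10
```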

  13. Properties of the probability distribution associated with the largest event in an earthquake cluster and their implications to foreshocks.

    PubMed

    Zhuang, Jiancang; Ogata, Yosihiko

    2006-04-01

    The space-time epidemic-type aftershock sequence model is a stochastic branching process in which earthquake activity is classified into background and clustering components and each earthquake triggers other earthquakes independently according to certain rules. This paper gives the probability distributions associated with the largest event in a cluster, and their properties, for all three cases in which the process is subcritical, critical, or supercritical. One direct use of these probability distributions is to evaluate the probability of an earthquake being a foreshock, and the magnitude distributions of foreshocks and non-foreshock earthquakes. To verify these theoretical results, the Japan Meteorological Agency earthquake catalog is analyzed. The proportion of events that have one or more larger descendants is found to be as high as about 15%. When the differences between background and triggered events in the behavior of triggering children are considered, a background event has a probability of about 8% of being a foreshock. This probability decreases as the magnitude of the background event increases. These results, obtained from a complicated clustering model in which the characteristics of background and triggered events differ, are consistent with the results obtained in [Ogata, Geophys. J. Int. 127, 17 (1996)] using the conventional single-linked cluster declustering method.

  14. Constraining Gamma-Ray Pulsar Gap Models with a Simulated Pulsar Population

    NASA Technical Reports Server (NTRS)

    Pierbattista, Marco; Grenier, I. A.; Harding, A. K.; Gonthier, P. L.

    2012-01-01

    With the large sample of young gamma-ray pulsars discovered by the Fermi Large Area Telescope (LAT), population synthesis has become a powerful tool for comparing their collective properties with model predictions. We synthesised a pulsar population based on a radio emission model and four gamma-ray gap models (Polar Cap, Slot Gap, Outer Gap, and One Pole Caustic). Applying gamma-ray and radio visibility criteria, we normalise the simulation to the number of radio pulsars detected by a select group of ten radio surveys. The luminosity and the wide beams from the outer gaps can easily account for the number of Fermi detections in 2 years of observations. The wide slot-gap beam requires an increase by a factor of 10 in the predicted luminosity to produce a reasonable number of gamma-ray pulsars. Such large increases in the luminosity may be accommodated by implementing offset polar caps. The narrow polar-cap beams contribute at most only a handful of LAT pulsars. Using standard distributions in birth location and pulsar spin-down power (E), we skew the initial magnetic field and period distributions in an attempt to account for the high-E Fermi pulsars. While we compromise the agreement between simulated and detected distributions of radio pulsars, the simulations fail to reproduce the LAT findings: all models under-predict the number of LAT pulsars with high E, and they cannot explain the high probability of detecting both the radio and gamma-ray beams at high E. The beaming factor remains close to 1.0 over 4 decades of E evolution for the slot gap, whereas it decreases significantly with increasing age for the outer gaps. The evolution of the enhanced slot-gap luminosity with E is compatible with the large dispersion of gamma-ray luminosity seen in the LAT data. The stronger evolution predicted for the outer gap, which is linked to polar cap heating by the return current, is apparently not supported by the LAT data. The LAT sample of gamma-ray pulsars

  15. Break Point Distribution on Chromosome 3 of Human Epithelial Cells exposed to Gamma Rays, Neutrons and Fe Ions

    NASA Technical Reports Server (NTRS)

    Hada, M.; Saganti, P. B.; Gersey, B.; Wilkins, R.; Cucinotta, F. A.; Wu, H.

    2007-01-01

    Most reported studies of break point distributions on chromosomes damaged by radiation exposure were carried out with the G-banding technique or determined from the relative lengths of the broken chromosomal fragments. However, these techniques lack accuracy in comparison with the later-developed multicolor banding in situ hybridization (mBAND) technique that is generally used for analysis of intrachromosomal aberrations such as inversions. Using mBAND, we studied chromosome aberrations in human epithelial cells exposed in vitro to low or high dose rate gamma rays in Houston, low dose rate secondary neutrons at Los Alamos National Laboratory, and high dose rate 600 MeV/u Fe ions at the NASA Space Radiation Laboratory. Detailed analysis of the inversion types revealed that all three radiation types induced a low incidence of simple inversions. Half of the inversions observed after neutron or Fe ion exposure, and the majority of inversions in gamma-irradiated samples, were accompanied by other types of intrachromosomal aberrations. In addition, neutrons and Fe ions induced a significant fraction of inversions that involved complex rearrangements of both inter- and intrachromosome exchanges. We further compared the distribution of break points on chromosome 3 for the three radiation types. The break points were found to be randomly distributed on chromosome 3 after neutron or Fe ion exposure, whereas a non-random distribution with clustered break points was observed for gamma rays. The break point distribution may serve as a potential fingerprint of high-LET radiation exposure.

  16. Unit-Sphere Anisotropic Multiaxial Stochastic-Strength Model Probability Density Distribution for the Orientation of Critical Flaws

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel

    2013-01-01

    Models that predict the failure probability of monolithic glass and ceramic components under multiaxial loading have been developed by authors such as Batdorf, Evans, and Matsuo. These "unit-sphere" failure models assume that the strength-controlling flaws are randomly oriented, noninteracting planar microcracks of specified geometry but of variable size. This report develops a formulation to describe the probability density distribution of the orientation of critical strength-controlling flaws that results from an applied load. This distribution is a function of the multiaxial stress state, the shear sensitivity of the flaws, the Weibull modulus, and the strength anisotropy. Examples are provided showing the predicted response on the unit sphere for various stress states for isotropic and transversely isotropic (anisotropic) materials, including the most probable orientation of critical flaws for offset uniaxial loads with strength anisotropy. The author anticipates that this information could be used to determine anisotropic stiffness degradation or anisotropic damage evolution for individual brittle (or quasi-brittle) composite material constituents within finite element or micromechanics-based software.

  17. A novel gamma-fitting statistical method for anti-drug antibody assays to establish assay cut points for data with non-normal distribution.

    PubMed

    Schlain, Brian; Amaravadi, Lakshmi; Donley, Jean; Wickramasekera, Ananda; Bennett, Donald; Subramanyam, Meena

    2010-01-31

    In recent years there has been growing recognition of the impact of anti-drug or anti-therapeutic antibodies (ADAs, ATAs) on the pharmacokinetic and pharmacodynamic behavior of a drug, which ultimately affects drug exposure and activity. These anti-drug antibodies can also impact the safety of the therapeutic by inducing a range of reactions from hypersensitivity to neutralization of the activity of an endogenous protein. Assessments of immunogenicity therefore depend critically on the bioanalytical method used to test samples, in which positive versus negative reactivity is determined by a statistically derived cut point based on the distribution of drug-naïve samples. For non-normally distributed data, a novel gamma-fitting method for obtaining assay cut points is presented. Non-normal immunogenicity data distributions, which tend to be unimodal and positively skewed, can often be modeled by 3-parameter gamma fits. Under a gamma regime, gamma-based cut points were found to be more accurate (closer to their targeted false positive rates) than normal or log-normal methods, and more precise (smaller standard errors of cut point estimators) than the nonparametric percentile method. Under a gamma regime, normal-theory-based methods for estimating cut points targeting a 5% false positive rate were found in computer simulation experiments to have, on average, false positive rates ranging from 6.2 to 8.3% (or positive biases between +1.2 and +3.3%), with bias decreasing with the magnitude of the gamma shape parameter. The log-normal fits tended, on average, to underestimate false positive rates, with negative biases as large as -2.3%, with absolute bias decreasing with the shape parameter. These results are consistent with the well-known fact that gamma distributions become less skewed and closer to a normal distribution as their shape parameters increase. Inflated false positive rates, especially in a screening assay, shift the emphasis to confirm
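
    A sketch of the gamma-based cut point idea: fit a 3-parameter gamma to drug-naive responses and take the percentile matching the targeted false positive rate. The simulated data and the 5% target are illustrative:

```python
# Gamma-based screening cut point: fit a 3-parameter gamma (shape,
# location, scale) to drug-naive responses, take the 95th percentile.
import numpy as np
from scipy.stats import gamma

rng = np.random.default_rng(4)
naive = gamma.rvs(a=3.0, loc=0.05, scale=0.02, size=200, random_state=rng)

a, loc, scale = gamma.fit(naive)            # 3-parameter gamma fit
cut_point = gamma.ppf(0.95, a, loc=loc, scale=scale)
print("screening cut point:", round(cut_point, 4))
print("empirical false positive rate:", np.mean(naive > cut_point))
```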

  18. Combining Probability Distributions of Wind Waves and Sea Level Variations to Assess Return Periods of Coastal Floods

    NASA Astrophysics Data System (ADS)

    Leijala, U.; Bjorkqvist, J. V.; Pellikka, H.; Johansson, M. M.; Kahma, K. K.

    2017-12-01

    Predicting the joint effect of sea level and wind waves is of great significance due to the major impact of flooding events in densely populated coastal regions. As mean sea level rises, the effect of sea level variations accompanied by waves will be even more harmful in the future. The main challenge when evaluating the joint effect of waves and sea level variations is that long time series of both variables rarely exist. Wave statistics are also highly location-dependent, thus requiring wave buoy measurements and/or high-resolution wave modelling. As an initial approximation of the joint effect, the variables may be treated as independent random variables to obtain the probability distribution of their sum. We present results of a case study based on three probability distributions: 1) wave run-up constructed from individual wave buoy measurements, 2) short-term sea level variability based on tide gauge data, and 3) mean sea level projections based on up-to-date regional scenarios. The wave measurements were conducted during 2012-2014 on the coast of the city of Helsinki, located in the Gulf of Finland in the Baltic Sea. The short-term sea level distribution contains the last 30 years (1986-2015) of hourly data from the Helsinki tide gauge, and the mean sea level projections are scenarios adjusted for the Gulf of Finland. Additionally, we present a sensitivity test based on six different theoretical wave height distributions representing different wave behaviour in relation to sea level variations. As these wave distributions are merged with one common sea level distribution, we can study how the different shapes of the wave height distribution affect the distribution of the sum, and which of the components dominates under different wave conditions. As an outcome of the method, we obtain a probability distribution of the maximum elevation of the continuous water mass, which enables a flexible tool for evaluating different risk levels in the
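
    The independence approximation described above amounts to convolving the two densities. A sketch with stand-in wave run-up and sea level densities (illustrative shapes, not the measured Helsinki distributions):

```python
# Distribution of the sum of two independent random variables by numeric
# convolution of their densities on a common grid.
import numpy as np
from scipy.stats import gumbel_r, norm

dx = 0.01                                       # grid step in metres
x = np.arange(-2.0, 6.0, dx)
f_wave = gumbel_r.pdf(x, loc=0.8, scale=0.3)    # wave run-up density (toy)
f_sea = norm.pdf(x, loc=0.2, scale=0.25)        # short-term sea level (toy)

f_sum = np.convolve(f_wave, f_sea) * dx         # density of the sum
x_sum = np.arange(len(f_sum)) * dx + 2 * x[0]   # support of the sum

# exceedance probability of a given maximum water-mass elevation
cdf = np.cumsum(f_sum) * dx
level = 2.0
print("P(sum >", level, "m) =", 1 - np.interp(level, x_sum, cdf))
```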

  19. Constraining the redshift distribution of ultrahigh-energy-cosmic-ray sources by isotropic gamma-ray background

    NASA Astrophysics Data System (ADS)

    Liu, Ruo-Yu; Taylor, Andrew; Wang, Xiang-Yu; Aharonian, Felix

    2017-01-01

    By interacting with cosmic background photons during their propagation through intergalactic space, ultrahigh energy cosmic rays (UHECRs) produce energetic electron/positron pairs and photons which initiate electromagnetic cascades, contributing to the isotropic gamma-ray background (IGRB). The generated gamma-ray flux level depends strongly on the redshift evolution of the UHECR sources. Recently, the Fermi-LAT collaboration reported that 86 (+16/-14)% of the total extragalactic gamma-ray flux comes from extragalactic point sources, including unresolved ones. This leaves limited room for the diffuse gamma rays generated via UHECR propagation, and subsequently constrains the source distribution in the Universe. Normalizing the total cosmic ray energy budget with the observed UHECR flux in the energy band (1-4) × 10^18 eV, we calculate the diffuse gamma-ray flux generated through UHECR propagation. We find that, in order not to overshoot the new IGRB limit, these sub-ankle UHECRs should be produced mainly by nearby sources, with a possible non-negligible contribution from our Galaxy. The distance to the majority of UHECR sources can be further constrained if a given fraction of the observed IGRB at 820 GeV originates from UHECRs. We note that our result should be conservative, since there may be various other contributions to the IGRB that are not included here.

  20. An Estimation of the Gamma-Ray Burst Afterglow Apparent Optical Brightness Distribution Function

    NASA Astrophysics Data System (ADS)

    Akerlof, Carl W.; Swan, Heather F.

    2007-12-01

    By using recent publicly available observational data obtained in conjunction with the NASA Swift gamma-ray burst (GRB) mission and a novel data analysis technique, we have been able to make some rough estimates of the GRB afterglow apparent optical brightness distribution function. The results suggest that 71% of all burst afterglows have optical magnitudes with m_R < 22.1 at 1000 s after the burst onset, the dimmest detected object in the data sample. There is a strong indication that the apparent optical magnitude distribution function peaks at m_R ~ 19.5. Such estimates may prove useful in guiding future plans to improve GRB counterpart observation programs. The employed numerical techniques might find application in a variety of other data analysis problems in which the intrinsic distributions must be inferred from a heterogeneous sample.

  1. Comparison of gamma-gamma-prime Phase Coarsening Responses of Three Powder Metal Disk Superalloys

    NASA Technical Reports Server (NTRS)

    Gabb, T. P.; Gayda, J.; Johnson, D. F.; MacKay, R. A.; Rogers, R. B.; Sudbrack, C. K.; Garg, A.; Locci, I. E.; Semiatin, S. L.; Kang, E.

    2016-01-01

    The phase microstructures of several powder metal (PM) disk superalloys were quantitatively evaluated. Contents, chemistries, and lattice parameters of the gamma matrix and gamma-prime strengthening phases were determined for conventionally heat treated Alloy 10, LSHR, and ME3 superalloys, after electrolytic phase extractions. Several long-term heat treatments were then performed to allow quantification of the precipitation, content, and size distribution of gamma-prime at a long time interval approximating equilibrium conditions. Additional coarsening heat treatments were performed at multiple temperatures and shorter time intervals, to allow quantification of the precipitation, contents, and size distributions of gamma-prime at conditions diverging from equilibrium. Modest differences in gamma and gamma-prime lattice parameters and their mismatch were observed among the alloys, which varied with heat treatment. Yet gamma-prime coarsening rates were very similar for all three alloys in the heat treatment conditions examined. Alloy 10 had higher gamma-prime dissolution and formation temperatures than LSHR and ME3, but a lower lattice mismatch, which was slightly positive for all three alloys at room temperature. The gamma-prime precipitates of Alloy 10 appeared to remain coherent at higher temperatures than those of LSHR and ME3. Higher coarsening rates were observed for gamma-prime precipitates residing along grain boundaries than for those within grains in all three alloys, during slow-to-moderate quenching from supersolvus solution heat treatments and during aging at temperatures of 843 C and higher.

  2. Probability distributions of hydraulic conductivity for the hydrogeologic units of the Death Valley regional ground-water flow system, Nevada and California

    USGS Publications Warehouse

    Belcher, Wayne R.; Sweetkind, Donald S.; Elliott, Peggy E.

    2002-01-01

    The use of geologic information such as lithology and rock properties is important for constraining conceptual and numerical hydrogeologic models. This geologic information is difficult to apply explicitly to numerical modeling and analyses because it tends to be qualitative rather than quantitative. This study uses a compilation of hydraulic-conductivity measurements to derive estimates of the probability distributions for several hydrogeologic units within the Death Valley regional ground-water flow system, a geologically and hydrologically complex region underlain by basin-fill sediments and volcanic, intrusive, sedimentary, and metamorphic rocks. Probability distributions of hydraulic conductivity for general rock types have been studied previously; this study provides a more detailed definition of hydrogeologic units based on lithostratigraphy, lithology, alteration, and fracturing, and compares the probability distributions to the aquifer test data. Results suggest that these probability distributions can be used for studies involving, for example, numerical flow modeling, recharge, evapotranspiration, and rainfall runoff, both for the hydrogeologic units in the region and for similar rock types elsewhere. Within the study area, fracturing appears to have the greatest influence on the hydraulic conductivity of carbonate bedrock hydrogeologic units. Similar to earlier studies, we find that alteration and welding in the Tertiary volcanic rocks greatly influence hydraulic conductivity: as alteration increases, hydraulic conductivity tends to decrease, while increasing degrees of welding appear to increase hydraulic conductivity because welding increases the brittleness of the volcanic rocks, thus increasing the amount of fracturing.

  3. Detecting background changes in environments with dynamic foreground by separating probability distribution function mixtures using Pearson's method of moments

    NASA Astrophysics Data System (ADS)

    Jenkins, Colleen; Jordan, Jay; Carlson, Jeff

    2007-02-01

    This paper presents parameter estimation techniques useful for detecting background changes in a video sequence with extreme foreground activity. A specific application of interest is automated detection of the covert placement of threats (e.g., a briefcase bomb) inside crowded public facilities. We propose that a histogram of pixel intensity acquired from a fixed mounted camera over time for a series of images will be a mixture of two Gaussian functions: the foreground probability distribution function and background probability distribution function. We will use Pearson's Method of Moments to separate the two probability distribution functions. The background function can then be "remembered" and changes in the background can be detected. Subsequent comparisons of background estimates are used to detect changes. Changes are flagged to alert security forces to the presence and location of potential threats. Results are presented that indicate the significant potential for robust parameter estimation techniques as applied to video surveillance.
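
    In the spirit of Pearson's method of moments (though solving the moment equations numerically rather than via Pearson's nonic polynomial), a two-Gaussian mixture can be separated by matching the first five raw moments. The data, starting values, and numerical solver below are illustrative assumptions; robustness is not claimed:

```python
# Separate a mixture of two Gaussians by matching its first five raw
# moments, solved numerically with fsolve.
import numpy as np
from scipy.optimize import fsolve

def gauss_raw_moments(mu, var):
    """First five raw moments of N(mu, var)."""
    return np.array([mu,
                     mu**2 + var,
                     mu**3 + 3*mu*var,
                     mu**4 + 6*mu**2*var + 3*var**2,
                     mu**5 + 10*mu**3*var + 15*mu*var**2])

def moment_equations(params, m):
    p, mu1, mu2, v1, v2 = params
    mix = p * gauss_raw_moments(mu1, v1) + (1 - p) * gauss_raw_moments(mu2, v2)
    return mix - m

rng = np.random.default_rng(5)
# toy normalized pixel intensities: foreground plus background mixture
data = np.concatenate([rng.normal(0.4, 0.10, 7000),
                       rng.normal(0.8, 0.06, 3000)])
m = np.array([np.mean(data**k) for k in range(1, 6)])

p, mu1, mu2, v1, v2 = fsolve(moment_equations,
                             x0=(0.5, 0.3, 0.9, 0.02, 0.02), args=(m,))
print(f"p={p:.2f} mu1={mu1:.2f} mu2={mu2:.2f} "
      f"sd1={abs(v1)**0.5:.3f} sd2={abs(v2)**0.5:.3f}")
```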

  4. Wealth of the world's richest publicly traded companies per industry and per employee: Gamma, Log-normal and Pareto power-law as universal distributions?

    NASA Astrophysics Data System (ADS)

    Soriano-Hernández, P.; del Castillo-Mussot, M.; Campirán-Chávez, I.; Montemayor-Aldrete, J. A.

    2017-04-01

    Forbes Magazine published its list of the world's two thousand leading or strongest publicly traded companies (G-2000) based on four independent metrics: sales or revenues, profits, assets, and market value. Each of these wealth metrics yields particular information on the corporate size or wealth of each firm. The G-2000 cumulative probability wealth distribution per employee (per capita) for all four metrics exhibits a two-class structure: quasi-exponential in the lower part and a Pareto power-law in the higher part. These two-class per capita distributions are qualitatively similar to income and wealth distributions in many countries of the world, but the fraction of firms per employee within the high-class Pareto zone is about 49% in sales per employee, and 33% after averaging over the four metrics, whereas in countries the fraction of rich agents in the Pareto zone is less than 10%. The quasi-exponential zone can be fitted by Gamma or Log-normal distributions. On the other hand, Forbes classifies the G-2000 firms into 82 different industries or economic activities. Within each industry, the wealth distribution per employee also follows a two-class structure, but when the aggregate wealth of firms in each industry for the four metrics is divided by the total number of employees in that industry, the 82 points of the aggregate wealth distribution by industry per employee can be well fitted by quasi-exponential curves for the four metrics.

  5. A GIS model-based assessment of the environmental distribution of gamma-hexachlorocyclohexane in European soils and waters.

    PubMed

    Vizcaíno, P; Pistocchi, A

    2010-10-01

    The MAPPE GIS-based multimedia model is used to produce a quantitative description of the behaviour of gamma-hexachlorocyclohexane (gamma-HCH) in Europe, with emphasis on continental surface waters. The model is found to reasonably reproduce gamma-HCH distributions and their variations over the years in the atmosphere and soil. For continental surface waters, concentrations were reasonably well predicted for the year 1995, when lindane was still used in agriculture, while for 2005, assuming severe restrictions in use, the model yields substantial underestimation. Much better results were obtained when the same mode of release as in 1995 was assumed, supporting the conjecture that for gamma-HCH, emission data rather than model structure and parameterization can be responsible for incorrect estimation of concentrations. Future research should be directed at improving the quality of emission data. Joint interpretation of monitoring and modelling results highlights that lindane emissions in Europe, despite the marked decreasing trend, persist beyond the provisions of existing legislation.

  6. Density probability distribution functions of diffuse gas in the Milky Way

    NASA Astrophysics Data System (ADS)

    Berkhuijsen, E. M.; Fletcher, A.

    2008-10-01

    In a search for the signature of turbulence in the diffuse interstellar medium (ISM) in gas density distributions, we determined the probability distribution functions (PDFs) of the average volume densities of the diffuse gas. The densities were derived from dispersion measures and HI column densities towards pulsars and stars at known distances. The PDFs of the average densities of the diffuse ionized gas (DIG) and the diffuse atomic gas are close to lognormal, especially when lines of sight at |b| < 5° and |b| >= 5° are considered separately. The PDF of the average density at high |b| is twice as wide as that at low |b|. The width of the PDF of the DIG is about 30 per cent smaller than that of the warm HI at the same latitudes. The results reported here provide strong support for the existence of a lognormal density PDF in the diffuse ISM, consistent with a turbulent origin of density structure in the diffuse gas.

  7. Probability distribution of the entanglement across a cut at an infinite-randomness fixed point

    NASA Astrophysics Data System (ADS)

    Devakul, Trithep; Majumdar, Satya N.; Huse, David A.

    2017-03-01

    We calculate the probability distribution of entanglement entropy S across a cut of a finite one-dimensional spin chain of length L at an infinite-randomness fixed point using Fisher's strong randomness renormalization group (RG). Using the random transverse-field Ising model as an example, the distribution is shown to take the form p(S|L) ~ L^(-ψ(k)), where k ≡ S/ln[L/L0], the large deviation function ψ(k) is found explicitly, and L0 is a nonuniversal microscopic length. We discuss the implications of such a distribution for numerical techniques that rely on entanglement, such as matrix-product-state-based techniques. Our results are verified with numerical RG simulations, as well as with the actual entanglement entropy distribution for the random transverse-field Ising model, which we calculate for large L via a mapping to Majorana fermions.

  8. Serial Spike Time Correlations Affect Probability Distribution of Joint Spike Events.

    PubMed

    Shahi, Mina; van Vreeswijk, Carl; Pipa, Gordon

    2016-01-01

    Detecting the existence of temporally coordinated spiking activity, and its role in information processing in the cortex, has remained a major challenge for neuroscience research. Different methods and approaches have been suggested to test whether observed synchronized events are significantly different from those expected by chance. To analyze simultaneous spike trains for precise spike correlation, these methods typically model the spike trains as a Poisson process, implying that the generation of each spike is independent of all other spikes. However, studies have shown that neural spike trains exhibit dependence among spike sequences, such as the absolute and relative refractory periods, which govern the spike probability of the oncoming action potential based on the time of the last spike, or bursting behavior, which is characterized by short epochs of rapid action potentials followed by longer episodes of silence. Here we investigate non-renewal processes with an inter-spike interval distribution model that incorporates the spike-history dependence of individual neurons. For that, we use the Monte Carlo method to estimate the full shape of the coincidence count distribution and to generate false positives for coincidence detection. The results show that, compared to distributions based on homogeneous Poisson processes, and also non-Poisson processes, the width of the distribution of joint spike events changes. Non-renewal processes can lead to either heavy-tailed or narrow coincidence distributions. We conclude that small differences in the exact autostructure of the point process can cause large differences in the width of a coincidence distribution. Therefore, manipulations of the autostructure for the estimation of the significance of joint spike events seem to be inadequate.
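
    To illustrate the kind of Monte Carlo experiment described above, the following sketch contrasts a homogeneous Poisson train with a (non-Poisson) gamma-ISI renewal train and compares the spread of the coincidence-count distributions; rates, window, and ISI shapes are assumed, and the paper's non-renewal history dependence is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(7)

def spike_train(rate, T, shape):
    # renewal train with gamma ISIs; shape=1 is Poisson, shape>1 is more regular
    isi = rng.gamma(shape, 1.0 / (rate * shape), size=int(3 * rate * T) + 10)
    t = np.cumsum(isi)
    return t[t < T]

def coincidences(t1, t2, win=0.005):
    # number of spikes in train 1 with a partner in train 2 within +/- win
    return int(np.sum(np.min(np.abs(t2[None, :] - t1[:, None]), axis=1) < win))

for shape in (1.0, 4.0):
    c = [coincidences(spike_train(20, 10, shape), spike_train(20, 10, shape))
         for _ in range(1000)]
    print(f"ISI shape {shape}: mean {np.mean(c):.1f}, SD {np.std(c):.2f}")
```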

  9. Serial Spike Time Correlations Affect Probability Distribution of Joint Spike Events

    PubMed Central

    Shahi, Mina; van Vreeswijk, Carl; Pipa, Gordon

    2016-01-01

    Detecting the existence of temporally coordinated spiking activity, and its role in information processing in the cortex, has remained a major challenge for neuroscience research. Different methods and approaches have been suggested to test whether observed synchronized events are significantly different from those expected by chance. To analyze simultaneous spike trains for precise spike correlation, these methods typically model the spike trains as a Poisson process, implying that the generation of each spike is independent of all other spikes. However, studies have shown that neural spike trains exhibit dependence among spike sequences, such as the absolute and relative refractory periods, which govern the spike probability of the oncoming action potential based on the time of the last spike, or bursting behavior, which is characterized by short epochs of rapid action potentials followed by longer episodes of silence. Here we investigate non-renewal processes with an inter-spike interval distribution model that incorporates the spike-history dependence of individual neurons. For that, we use the Monte Carlo method to estimate the full shape of the coincidence count distribution and to generate false positives for coincidence detection. The results show that, compared to distributions based on homogeneous Poisson processes, and also non-Poisson processes, the width of the distribution of joint spike events changes. Non-renewal processes can lead to either heavy-tailed or narrow coincidence distributions. We conclude that small differences in the exact autostructure of the point process can cause large differences in the width of a coincidence distribution. Therefore, manipulations of the autostructure for the estimation of the significance of joint spike events seem to be inadequate. PMID:28066225

  10. Failure probability under parameter uncertainty.

    PubMed

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and that the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.
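
    The pivotal property described above (failure probabilities independent of the true parameters) can be checked by simulation for the lognormal case; the nominal failure probability, sample size, and parameter pairs below are assumed.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def attained_failure_prob(mu, sigma, n, p_nom=0.01, reps=20000):
    # log-losses are N(mu, sigma^2); the controller estimates the threshold
    # exp(m + z*s) from n observations, aiming at failure probability p_nom
    z = stats.norm.ppf(1 - p_nom)
    x = rng.normal(mu, sigma, size=(reps, n))
    m, s = x.mean(axis=1), x.std(axis=1, ddof=1)
    # true probability that a new loss exceeds the estimated threshold
    return stats.norm.sf(((m + z * s) - mu) / sigma)

for mu, sigma in [(0.0, 1.0), (5.0, 3.0)]:
    fp = attained_failure_prob(mu, sigma, n=30)
    print(mu, sigma, fp.mean())  # identical up to noise, and above 0.01
```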

  11. CUMPOIS- CUMULATIVE POISSON DISTRIBUTION PROGRAM

    NASA Technical Reports Server (NTRS)

    Bowerman, P. N.

    1994-01-01

    The Cumulative Poisson distribution program, CUMPOIS, is one of two programs which make calculations involving cumulative Poisson distributions. Both programs, CUMPOIS (NPO-17714) and NEWTPOIS (NPO-17715), can be used independently of one another. CUMPOIS determines the approximate cumulative binomial distribution, evaluates the cumulative distribution function (cdf) for gamma distributions with integer shape parameters, and evaluates the cdf for chi-square distributions with even degrees of freedom. It can be used by statisticians and others concerned with the probabilities of independent events occurring over specific units of time, area, or volume. CUMPOIS calculates the probability that n or fewer events (i.e., cumulative) will occur within any unit when the expected number of events is given as lambda. Normally, this probability is calculated by a direct summation, from i=0 to n, of terms involving the exponential function, lambda, and inverse factorials. This approach, however, eventually fails due to underflow for sufficiently large values of n. Additionally, when the exponential term is moved outside of the summation for simplification purposes, there is a risk that the terms remaining within the summation, and the summation itself, will overflow for certain values of i and lambda. CUMPOIS eliminates these possibilities by multiplying an additional exponential factor into the summation terms and the partial sum whenever overflow/underflow situations threaten. The reciprocal of this term is then multiplied into the completed sum, giving the cumulative probability. The CUMPOIS program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly on most C compilers. The program format is interactive, accepting lambda and n as inputs. It has been implemented under DOS 3.2 and has a memory requirement of 26K. CUMPOIS was developed in 1988.
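
    The overflow/underflow protection described above can be mimicked in a few lines by accumulating the summation in log space and factoring out the largest term; this is a sketch of the idea, not the original C implementation.

```python
import math

def cum_poisson(n, lam):
    # P(X <= n) for X ~ Poisson(lam); terms are accumulated in log space so
    # that neither exp(-lam) underflow nor lam**i / i! overflow can occur
    log_terms = [-lam + i * math.log(lam) - math.lgamma(i + 1)
                 for i in range(n + 1)]
    m = max(log_terms)                 # factor out the largest term
    return math.exp(m) * sum(math.exp(t - m) for t in log_terms)

print(cum_poisson(800, 1000.0))  # direct summation would underflow at exp(-1000)
```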

  12. Probability distribution of financial returns in a model of multiplicative Brownian motion with stochastic diffusion coefficient

    NASA Astrophysics Data System (ADS)

    Silva, Antonio

    2005-03-01

    It is well-known that the mathematical theory of Brownian motion was first developed in the Ph.D. thesis of Louis Bachelier for the French stock market, before Einstein [1]. In Ref. [2] we studied the so-called Heston model, where the stock-price dynamics is governed by multiplicative Brownian motion with a stochastic diffusion coefficient. We solved the corresponding Fokker-Planck equation exactly and found an analytic formula for the time-dependent probability distribution of stock price changes (returns). The formula interpolates between the exponential (tent-shaped) distribution for short time lags and the Gaussian (parabolic) distribution for long time lags. The theoretical formula agrees very well with the actual stock-market data, ranging from the Dow-Jones index [2] to individual companies [3], such as Microsoft, Intel, etc. [1] Louis Bachelier, "Théorie de la spéculation," Annales Scientifiques de l'École Normale Supérieure, III-17:21-86 (1900). [2] A. A. Dragulescu and V. M. Yakovenko, "Probability distribution of returns in the Heston model with stochastic volatility," Quantitative Finance 2, 443-453 (2002); Erratum 3, C15 (2003). [cond-mat/0203046] [3] A. C. Silva, R. E. Prange, and V. M. Yakovenko, "Exponential distribution of financial returns at mesoscopic time lags: a new stylized fact," Physica A 344, 227-235 (2004). [cond-mat/0401225]
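
    A minimal Euler-scheme simulation of Heston-type dynamics (multiplicative noise with a mean-reverting stochastic variance) reproduces the qualitative interpolation described above; all parameter values are illustrative assumptions.

```python
import numpy as np
from scipy.stats import kurtosis

rng = np.random.default_rng(0)
dt, n_paths = 1.0 / 250.0, 50_000
theta, gamma_, kappa = 0.04, 5.0, 0.6  # mean variance, relaxation rate, vol of vol

v = np.full(n_paths, theta)            # start variance at its long-run mean
log_ret = np.zeros(n_paths)
snapshots = {}
for step in range(1, 251):
    dw1 = rng.normal(0.0, np.sqrt(dt), n_paths)
    dw2 = rng.normal(0.0, np.sqrt(dt), n_paths)
    log_ret += -0.5 * v * dt + np.sqrt(v) * dw1
    v = np.abs(v + gamma_ * (theta - v) * dt + kappa * np.sqrt(v) * dw2)  # keep v >= 0
    if step in (5, 250):
        snapshots[step] = log_ret.copy()

# short lags: leptokurtic, tent-shaped; long lags: near-Gaussian
for lag, r in snapshots.items():
    print(lag, "days, excess kurtosis:", round(kurtosis(r), 2))
```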

  13. The mathematical formula of the intravaginal ejaculation latency time (IELT) distribution of lifelong premature ejaculation differs from the IELT distribution formula of men in the general male population

    PubMed Central

    Janssen, Paddy K.C.

    2016-01-01

    Purpose: To find the most accurate mathematical description of the intravaginal ejaculation latency time (IELT) distribution in the general male population. Materials and Methods: We compared the fit of various well-known mathematical distributions with the IELT distribution of two previously published stopwatch studies of the Caucasian general male population and a stopwatch study of Dutch Caucasian men with lifelong premature ejaculation (PE). The accuracy of fit is expressed by the goodness of fit (GOF): the smaller the GOF, the more accurate the fit. Results: The three IELT distributions are gamma distributions, but the IELT distribution of lifelong PE is a different gamma distribution from the IELT distribution of men in the general male population. The Lognormal distribution of the gamma distributions most accurately fits the IELT distribution of 965 men in the general population, with a GOF of 0.057. The Gumbel Max distribution most accurately fits the IELT distribution of 110 men with lifelong PE, with a GOF of 0.179. There are more men with lifelong PE ejaculating within 30 and 60 seconds than can be extrapolated from the probability density curve of the Lognormal IELT distribution of men in the general population. Conclusions: Men with lifelong PE have a distinct IELT distribution, i.e., a Gumbel Max IELT distribution, which could only be retrieved from the general male population's Lognormal IELT distribution if thousands of men participated in an IELT stopwatch study. The mathematical formula of the Lognormal IELT distribution is useful for epidemiological research on the IELT. PMID:26981594

  14. Study on probability distribution of prices in electricity market: A case study of zhejiang province, china

    NASA Astrophysics Data System (ADS)

    Zhou, H.; Chen, B.; Han, Z. X.; Zhang, F. Q.

    2009-05-01

    Studying the probability density and distribution functions of electricity prices helps power suppliers and purchasers make accurate operational estimates, and helps the regulator monitor periods that deviate from the normal distribution. Based on the assumption of normally distributed load and the non-linear characteristic of the aggregate supply curve, this paper derives the distribution of electricity prices as a function of the random load variable. The conclusion has been validated with electricity price data from the Zhejiang market. The results show that electricity prices are approximately normally distributed only when the supply-demand relationship is loose, whereas the prices deviate from the normal distribution and exhibit strong right-skewness otherwise. Finally, real electricity markets also display a narrow-peak characteristic when undersupply occurs.
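
    The qualitative mechanism (normally distributed load pushed through a convex supply curve) can be sketched by Monte Carlo; the stylized supply curve and load parameters below are assumptions, not the Zhejiang market model.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def price(load, cap=1.0):
    # stylized convex aggregate supply curve: prices spike near capacity
    return 20.0 + 80.0 * (load / cap) ** 6

loose = price(rng.normal(0.6, 0.08, 100_000).clip(0.0, 0.999))
tight = price(rng.normal(0.9, 0.08, 100_000).clip(0.0, 0.999))
# loose market: near-normal prices; tight market: strong right skew
print(stats.skew(loose), stats.skew(tight))
```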

  15. Confining the angular distribution of terrestrial gamma ray flash emission

    NASA Astrophysics Data System (ADS)

    Gjesteland, T.; Østgaard, N.; Collier, A. B.; Carlson, B. E.; Cohen, M. B.; Lehtinen, N. G.

    2011-11-01

    Terrestrial gamma ray flashes (TGFs) are bremsstrahlung emissions from relativistic electrons accelerated in electric fields associated with thunderstorms, with photon energies up to at least 40 MeV, which sets the lowest estimate of the total potential at 40 MV. The electric field that produces TGFs is reflected in the initial angular distribution of the TGF emission. Here we present the first constraints on the TGF emission cone based on accurately geolocated TGFs. The source lightning discharges associated with TGFs detected by RHESSI are determined from the Atmospheric Weather Electromagnetic System for Observation, Modeling, and Education (AWESOME) network and the World Wide Lightning Location Network (WWLLN). The distribution of observation angles for 106 TGFs is compared to Monte Carlo simulations. We find that TGF emission within a half angle >30° is consistent with the distributions of observation angle derived from the networks. In addition, 36 events occurring before 2006 are used for spectral analysis. The energy spectra are binned according to observation angle. The result is a significant softening of the TGF energy spectrum for large (>40°) observation angles, which is consistent with a TGF emission half angle <40°. The softening is due to Compton scattering, which reduces the photon energies.

  16. Development and application of an empirical probability distribution for the prediction error of re-entry body maximum dynamic pressure

    NASA Technical Reports Server (NTRS)

    Lanzi, R. James; Vincent, Brett T.

    1993-01-01

    The relationship between actual and predicted re-entry maximum dynamic pressure is characterized using a probability density function and a cumulative distribution function derived from sounding rocket flight data. This paper explores the properties of this distribution and combines it with observed sounding rocket re-entry body damage characteristics to assess the probabilities of sustaining various levels of heating damage. The results effectively bridge the gap in sounding rocket re-entry analysis between the known damage level/flight environment relationships and the predicted flight environment.

  17. A Comparison of the Exact Kruskal-Wallis Distribution to Asymptotic Approximations for All Sample Sizes up to 105

    ERIC Educational Resources Information Center

    Meyer, J. Patrick; Seaman, Michael A.

    2013-01-01

    The authors generated exact probability distributions for sample sizes up to 35 in each of three groups ("n" less than or equal to 105) and up to 10 in each of four groups ("n" less than or equal to 40). They compared the exact distributions to the chi-square, gamma, and beta approximations. The beta approximation was best in…
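
    Exact null distributions of the kind the authors tabulate can be generated for small samples by enumerating all equally likely rank assignments; a sketch for three groups follows (the Kruskal-Wallis statistic H and the reference value 5.6 are standard, everything else is illustrative).

```python
from itertools import combinations
from collections import Counter

def exact_kw_distribution(n1, n2, n3):
    # enumerate all equally likely assignments of ranks 1..N to three groups
    N = n1 + n2 + n3
    ranks = set(range(1, N + 1))
    tally = Counter()
    for g1 in combinations(sorted(ranks), n1):
        r1 = ranks - set(g1)
        for g2 in combinations(sorted(r1), n2):
            g3 = r1 - set(g2)
            h = (12.0 / (N * (N + 1))
                 * (sum(g1) ** 2 / n1 + sum(g2) ** 2 / n2 + sum(g3) ** 2 / n3)
                 - 3.0 * (N + 1))
            tally[round(h, 10)] += 1
    total = sum(tally.values())
    return {h: c / total for h, c in sorted(tally.items())}

dist = exact_kw_distribution(3, 3, 3)   # 9!/(3!3!3!) = 1680 assignments
p_exact = sum(p for h, p in dist.items() if h >= 5.6)
print(f"P(H >= 5.6) = {p_exact:.4f}")   # compare with the chi-square(df=2) tail
```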

  18. A general formula for computing maximum proportion correct scores in various psychophysical paradigms with arbitrary probability distributions of stimulus observations.

    PubMed

    Dai, Huanping; Micheyl, Christophe

    2015-05-01

    Proportion correct (Pc) is a fundamental measure of task performance in psychophysics. The maximum Pc score that can be achieved by an optimal (maximum-likelihood) observer in a given task is of both theoretical and practical importance, because it sets an upper limit on human performance. Within the framework of signal detection theory, analytical solutions for computing the maximum Pc score have been established for several common experimental paradigms under the assumption of Gaussian additive internal noise. However, as the scope of applications of psychophysical signal detection theory expands, the need is growing for psychophysicists to compute maximum Pc scores for situations involving non-Gaussian (internal or stimulus-induced) noise. In this article, we provide a general formula for computing the maximum Pc in various psychophysical experimental paradigms for arbitrary probability distributions of sensory activity. Moreover, easy-to-use MATLAB code implementing the formula is provided. Practical applications of the formula are illustrated, and its accuracy is evaluated, for two paradigms and two types of probability distributions (uniform and Gaussian). The results demonstrate that Pc scores computed using the formula remain accurate even for continuous probability distributions, as long as the conversion from continuous probability density functions to discrete probability mass functions is supported by a sufficiently high sampling resolution. We hope that the exposition in this article, and the freely available MATLAB code, facilitates calculations of maximum performance for a wider range of experimental situations, as well as explorations of the impact of different assumptions concerning internal-noise distributions on maximum performance in psychophysical experiments.
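
    For the simplest case, the maximum Pc of the maximum-likelihood observer in two-interval forced choice can be estimated by Monte Carlo for arbitrary distributions: sample the decision variable under signal and noise and compare likelihood ratios. This sketch is not the paper's formula or MATLAB code; the gamma distributions are assumed.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# assumed decision-variable distributions on noise and signal intervals
noise = stats.gamma(a=2.0, scale=1.0)
signal = stats.gamma(a=3.5, scale=1.0)

def max_pc_2afc(n=500_000):
    xs = signal.rvs(size=n, random_state=rng)
    xn = noise.rvs(size=n, random_state=rng)
    # the optimal observer chooses the interval with the larger likelihood ratio
    lr_s = signal.pdf(xs) / noise.pdf(xs)
    lr_n = signal.pdf(xn) / noise.pdf(xn)
    return np.mean(lr_s > lr_n) + 0.5 * np.mean(lr_s == lr_n)

print(f"maximum Pc ~ {max_pc_2afc():.3f}")
```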

  19. Computer methods for sampling from the gamma distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, M.E.; Tadikamalla, P.R.

    1978-01-01

    Considerable attention has recently been directed at developing ever faster algorithms for generating gamma random variates on digital computers. This paper surveys the current state of the art including the leading algorithms of Ahrens and Dieter, Atkinson, Cheng, Fishman, Marsaglia, Tadikamalla, and Wallace. General random variate generation techniques are explained with reference to these gamma algorithms. Computer simulation experiments on IBM and CDC computers are reported.
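
    As a flavor of the rejection-type generators this survey covers, here is the later Marsaglia-Tsang (2000) squeeze method, which postdates the survey but is now a common default; shown as an illustrative sketch.

```python
import math
import random

def gamma_variate(shape, rng=random):
    # Marsaglia-Tsang squeeze method for Gamma(shape, 1)
    if shape < 1.0:
        # boost: G(a) = G(a + 1) * U^(1/a)
        return gamma_variate(shape + 1.0, rng) * rng.random() ** (1.0 / shape)
    d = shape - 1.0 / 3.0
    c = 1.0 / math.sqrt(9.0 * d)
    while True:
        x = rng.gauss(0.0, 1.0)
        v = (1.0 + c * x) ** 3
        if v <= 0.0:
            continue
        u = rng.random()
        if math.log(u) < 0.5 * x * x + d - d * v + d * math.log(v):
            return d * v

print(sum(gamma_variate(2.5) for _ in range(100_000)) / 100_000)  # ~ shape = 2.5
```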

  20. Transformation and Self-Similarity Properties of Gamma and Weibull Fragment Size Distributions

    DTIC Science & Technology

    2015-12-01

    …spray formed when a fast gas stream blows over a liquid volume." As a theoretical justification, they showed that Gamma size distributions are… [remainder of record garbled in extraction; the recoverable fragments are reference citations (International Journal of Fracture, 140, 243, 2006; P.-K. Wu, G. A. Ruff, and G. M. Faeth, "Primary Breakup in Liquid-Gas Mixing Layers," Atomization and Sprays, 1, 421-440) and a unit-conversion table from the report's front matter]

  1. Brain early infarct detection using gamma correction extreme-level eliminating with weighting distribution.

    PubMed

    Teh, V; Sim, K S; Wong, E K

    2016-11-01

    According to statistics from the World Health Organization (WHO), stroke is one of the major causes of death globally. Computed tomography (CT) is one of the main medical diagnostic tools used for the diagnosis of ischemic stroke. CT provides brain images in the Digital Imaging and Communications in Medicine (DICOM) format. The presentation of CT brain images relies mainly on the window setting (window center and window width), which converts an image from the DICOM format into a normal grayscale format. Nevertheless, ordinary window parameters cannot deliver proper contrast on CT brain images for ischemic stroke detection. In this paper, a new method, gamma correction extreme-level eliminating with weighting distribution (GCELEWD), is proposed to improve the contrast of CT brain images. GCELEWD is capable of highlighting the hypodense region for the diagnosis of ischemic stroke. The performance of this new technique is compared with that of four existing contrast enhancement techniques: brightness preserving bi-histogram equalization (BBHE), dualistic sub-image histogram equalization (DSIHE), extreme-level eliminating histogram equalization (ELEHE), and adaptive gamma correction with weighting distribution (AGCWD). GCELEWD shows better visualization for ischemic stroke detection and higher values with an image quality assessment (IQA) module. SCANNING 38:842-856, 2016. © 2016 Wiley Periodicals, Inc.
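
    For orientation, the AGCWD baseline mentioned above is commonly described as histogram weighting followed by a CDF-driven gamma exponent; a sketch under the assumption of an 8-bit grayscale image with a non-flat histogram follows (this is the comparison method, not GCELEWD itself).

```python
import numpy as np

def agcwd(img, alpha=0.5):
    # adaptive gamma correction with weighting distribution (AGCWD-style);
    # img is assumed to be a uint8 grayscale array
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    pdf = hist / hist.sum()
    lo, hi = pdf.min(), pdf.max()
    pdf_w = hi * ((pdf - lo) / (hi - lo)) ** alpha   # weighting distribution
    cdf_w = np.cumsum(pdf_w) / pdf_w.sum()
    gamma = 1.0 - cdf_w                              # per-level gamma exponent
    levels = np.arange(256) / 255.0
    lut = np.round(255.0 * levels ** gamma).astype(np.uint8)
    return lut[img]

demo = (np.arange(64, dtype=np.uint8).reshape(8, 8) + 32)
print(agcwd(demo)[0])   # enhanced copy of a small synthetic tile
```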

  2. Construction and identification of a D-Vine model applied to the probability distribution of modal parameters in structural dynamics

    NASA Astrophysics Data System (ADS)

    Dubreuil, S.; Salaün, M.; Rodriguez, E.; Petitjean, F.

    2018-01-01

    This study investigates the construction and identification of the probability distribution of random modal parameters (natural frequencies and effective parameters) in structural dynamics. As these parameters present various types of dependence structures, the retained approach is based on pair copula construction (PCC). A literature review leads us to choose a D-Vine model for the construction of the modal parameters' probability distributions. Identification of this model is based on likelihood maximization, which makes it sensitive to the dimension of the distribution, namely the number of modes considered in our context. To this end, a mode selection preprocessing step is proposed that allows the selection of the relevant random modes for a given transfer function. The second point addressed in this study concerns the choice of the D-Vine model. Indeed, the D-Vine model is not uniquely defined. Two strategies are proposed and compared: the first is based on the context of the study, whereas the second is purely based on statistical considerations. Finally, the proposed approaches are numerically studied and compared with respect to their capabilities, first in the identification of the probability distribution of random modal parameters and second in the estimation of the 99% quantiles of some transfer functions.

  3. Electron number probability distributions for correlated wave functions.

    PubMed

    Francisco, E; Martín Pendás, A; Blanco, M A

    2007-03-07

    Efficient formulas for computing the probability of finding exactly an integer number of electrons in an arbitrarily chosen volume are only known for single-determinant wave functions [E. Cances et al., Theor. Chem. Acc. 111, 373 (2004)]. In this article, an algebraic method is presented that extends these formulas to the case of multideterminant wave functions and any number of disjoint volumes. The derived expressions are applied to compute the probabilities within the atomic domains derived from the space partitioning based on the quantum theory of atoms in molecules. Results for a series of test molecules are presented, paying particular attention to the effects of electron correlation and of some numerical approximations on the computed probabilities.

  4. A probability space for quantum models

    NASA Astrophysics Data System (ADS)

    Lemmens, L. F.

    2017-06-01

    A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions allows the use of the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, the choice of constraints, and the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac, and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.
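
    A toy version of the maximum entropy assignment with a single average-energy constraint reduces to solving for one Lagrange multiplier (a Gibbs-type distribution); the six-level energy spectrum and target mean below are assumed.

```python
import numpy as np
from scipy.optimize import brentq

E = np.arange(6, dtype=float)   # energies of six outcomes (assumed)
E_bar = 1.5                     # average-energy constraint (assumed)

def mean_energy(beta):
    p = np.exp(-beta * E)
    p /= p.sum()
    return p @ E

# solve the constraint for the Lagrange multiplier beta
beta = brentq(lambda b: mean_energy(b) - E_bar, -50.0, 50.0)
p = np.exp(-beta * E)
p /= p.sum()                    # maximum-entropy (Gibbs) probability assignment
print(beta, p.round(4))
```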

  5. Primary gamma rays. [resulting from cosmic ray interaction with interstellar matter

    NASA Technical Reports Server (NTRS)

    Fichtel, C. E.

    1974-01-01

    Within this galaxy, cosmic rays reveal their presence in interstellar space and probably in source regions by their interactions with interstellar matter which lead to gamma rays with a very characteristic energy spectrum. From the study of the intensity of the high energy gamma radiation as a function of galactic longitude, it is already clear that cosmic rays are almost certainly not uniformly distributed in the galaxy and are not concentrated in the center of the galaxy. The galactic cosmic rays appear to be tied to galactic structural features, presumably by the galactic magnetic fields which are in turn held by the matter in the arm segments and the clouds. On the extragalactic scale, it is now possible to say that cosmic rays are not universal at the density seen near the earth. The diffuse celestial gamma ray spectrum that is observed presents the interesting possibility of cosmological studies and possible evidence for a residual universal cosmic ray density, which is much lower than the present galactic cosmic ray density.

  6. Multichannel Speech Enhancement Based on Generalized Gamma Prior Distribution with Its Online Adaptive Estimation

    NASA Astrophysics Data System (ADS)

    Dat, Tran Huy; Takeda, Kazuya; Itakura, Fumitada

    We present a multichannel speech enhancement method based on MAP speech spectral magnitude estimation using a generalized gamma model of the speech prior distribution, where the model parameters are adapted from actual noisy speech in a frame-by-frame manner. The utilization of a more general prior distribution with its online adaptive estimation is shown to be effective for speech spectral estimation in noisy environments. Furthermore, the multichannel information in terms of cross-channel statistics is shown to be useful for better adapting the prior distribution parameters to the actual observation, resulting in better performance of the speech enhancement algorithm. We tested the proposed algorithm on an in-car speech database and obtained significant improvements in speech recognition performance, particularly under non-stationary noise conditions such as music, air-conditioning, and open windows.

  7. Count distribution for mixture of two exponentials as renewal process duration with applications

    NASA Astrophysics Data System (ADS)

    Low, Yeh Ching; Ong, Seng Huat

    2016-06-01

    A count distribution is presented by considering a renewal process where the distribution of the duration is a finite mixture of exponential distributions. This distribution is able to model overdispersion, a feature often found in observed count data. The computation of the probabilities and of the renewal function (the expected number of renewals) is examined. Parameter estimation by the method of maximum likelihood is considered, with applications of the count distribution to real frequency count data exhibiting overdispersion. It is shown that the mixture-of-exponentials count distribution fits overdispersed data better than the Poisson process and serves as an alternative to the gamma count distribution.
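
    A quick Monte Carlo sketch of the count distribution with hyperexponential (two-component mixture) durations shows the overdispersion relative to a Poisson process; mixture weights and rates are assumed.

```python
import numpy as np

rng = np.random.default_rng(0)

def renewal_count(T, w=0.3, r1=0.5, r2=5.0):
    # count renewals in (0, T]; each duration is exponential with rate r1
    # (probability w) or rate r2 (probability 1 - w)
    t, n = 0.0, 0
    while True:
        rate = r1 if rng.random() < w else r2
        t += rng.exponential(1.0 / rate)
        if t > T:
            return n
        n += 1

counts = np.array([renewal_count(10.0) for _ in range(20_000)])
# index of dispersion > 1 signals overdispersion (Poisson would give 1)
print(counts.mean(), counts.var() / counts.mean())
```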

  8. The Finite-Size Scaling Relation for the Order-Parameter Probability Distribution of the Six-Dimensional Ising Model

    NASA Astrophysics Data System (ADS)

    Merdan, Ziya; Karakuş, Özlem

    2016-11-01

    The six-dimensional Ising model with nearest-neighbor pair interactions has been simulated and verified numerically on the Creutz cellular automaton using five-bit demons near the infinite-lattice critical temperature with linear dimensions L=4,6,8,10. The order-parameter probability distribution for the six-dimensional Ising model has been calculated at the critical temperature. The constants of the analytical function have been estimated by fitting to the probability function obtained numerically at the finite-size critical point.

  9. NEWTPOIS- NEWTON POISSON DISTRIBUTION PROGRAM

    NASA Technical Reports Server (NTRS)

    Bowerman, P. N.

    1994-01-01

    The cumulative Poisson distribution program, NEWTPOIS, is one of two programs which make calculations involving cumulative Poisson distributions. Both programs, NEWTPOIS (NPO-17715) and CUMPOIS (NPO-17714), can be used independently of one another. NEWTPOIS determines percentiles for gamma distributions with integer shape parameters and calculates percentiles for chi-square distributions with even degrees of freedom. It can be used by statisticians and others concerned with the probabilities of independent events occurring over specific units of time, area, or volume. NEWTPOIS determines the Poisson parameter (lambda), that is, the mean (or expected) number of events occurring in a given unit of time, area, or space. Given lambda, it is usually a simple matter of substitution into the Poisson distribution summation to arrive at the cumulative probability for a specific number of occurrences (n). However, the reverse problem, direct calculation of the Poisson parameter, becomes difficult for small positive values of n and unmanageable for large values. NEWTPOIS uses Newton's iteration method to extract lambda from the initial value condition of the Poisson distribution where n=0, taking successive estimations until some user-specified error term (epsilon) is reached. The NEWTPOIS program is written in C. It was developed on an IBM AT with a numeric co-processor using Microsoft C 5.0. Because the source code is written using standard C structures and functions, it should compile correctly on most C compilers. The program format is interactive, accepting epsilon, n, and the cumulative probability of the occurrence of n as inputs. It has been implemented under DOS 3.2 and has a memory requirement of 30K. NEWTPOIS was developed in 1988.
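
    Newton's method for this inversion follows from the identity dP(X<=n)/dlambda = -pmf(n; lambda); the following sketch mirrors the idea (it is not the original C program), with the starting point and tolerance assumed.

```python
import math

def poisson_cdf(n, lam):
    # log-space summation of P(X <= n) to avoid overflow/underflow
    log_t = [-lam + i * math.log(lam) - math.lgamma(i + 1) for i in range(n + 1)]
    m = max(log_t)
    return math.exp(m) * sum(math.exp(t - m) for t in log_t)

def newtpois(n, q, eps=1e-10):
    # find lambda with P(X <= n) = q, using F'(lam) = -pmf(n; lam)
    lam = float(n) + 1.0                  # rough starting point (assumed)
    for _ in range(100):
        f = poisson_cdf(n, lam) - q
        pmf = math.exp(-lam + n * math.log(lam) - math.lgamma(n + 1))
        lam += f / pmf                    # Newton step: lam - f / (-pmf)
        if abs(f) < eps:
            break
    return lam

print(newtpois(5, 0.5))   # lambda whose Poisson median region contains n = 5
```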

  10. Description of atomic burials in compact globular proteins by Fermi-Dirac probability distributions.

    PubMed

    Gomes, Antonio L C; de Rezende, Júlia R; Pereira de Araújo, Antônio F; Shakhnovich, Eugene I

    2007-02-01

    We perform a statistical analysis of atomic distributions as a function of the distance R from the molecular geometrical center in a nonredundant set of compact globular proteins. The number of atoms increases quadratically for small R, indicating a constant average density inside the core, reaches a maximum at a size-dependent distance R(max), and falls rapidly for larger R. The empirical curves turn out to be consistent with the volume increase of spherical concentric solid shells and a Fermi-Dirac distribution in which the distance R plays the role of an effective atomic energy epsilon(R) = R. The effective chemical potential mu governing the distribution increases with the number of residues, reflecting the size of the protein globule, while the temperature parameter beta decreases. Interestingly, the product beta*mu is not as strongly dependent on protein size and appears to be tuned to maintain approximately half of the atoms in the high-density interior and the other half in the exterior region of rapidly decreasing density. A normalized size-independent distribution was obtained for the atomic probability as a function of the reduced distance r = R/R(g), where R(g) is the radius of gyration. The global normalized Fermi distribution, F(r), can be reasonably decomposed into Fermi-like subdistributions for different atomic types tau, F(tau)(r), with Sigma(tau) F(tau)(r) = F(r), which depend on two additional parameters mu(tau) and h(tau). The chemical potential mu(tau) affects a scaling prefactor and depends on the overall frequency of the corresponding atomic type, while the maximum position of the subdistribution is determined by h(tau), which appears in a type-dependent atomic effective energy, epsilon(tau)(r) = h(tau)*r, and is strongly correlated with available hydrophobicity scales. Better adjustments are obtained when the effective energy is not assumed to be necessarily linear, epsilon(tau)*(r) = h(tau)*r^alpha, in which case a correlation with hydrophobicity
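
    The shell-count model described above (a quadratic volume factor times a Fermi-Dirac occupancy) can be fitted in a few lines; the synthetic counts and parameter values below are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import curve_fit

def shell_count(R, A, mu, beta):
    # atoms per radial shell: volume term R^2 times a Fermi-Dirac occupancy
    return A * R**2 / (np.exp(beta * (R - mu)) + 1.0)

R = np.linspace(0.5, 25.0, 60)
rng = np.random.default_rng(1)
counts = rng.poisson(shell_count(R, 50.0, 15.0, 0.8))   # synthetic "data"

popt, _ = curve_fit(shell_count, R, counts, p0=[40.0, 12.0, 0.5])
print("A, mu, beta =", np.round(popt, 2))   # recovers the assumed parameters
```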

  11. Multinomial mixture model with heterogeneous classification probabilities

    USGS Publications Warehouse

    Holland, M.D.; Gray, B.R.

    2011-01-01

    Royle and Link (Ecology 86(9):2505-2512, 2005) proposed an analytical method that allowed estimation of multinomial distribution parameters and classification probabilities from categorical data measured with error. While useful, we demonstrate algebraically and by simulations that this method yields biased multinomial parameter estimates when the probabilities of correct category classifications vary among sampling units. We address this shortcoming by treating these probabilities as logit-normal random variables within a Bayesian framework. We use Markov chain Monte Carlo to compute Bayes estimates from a simulated sample from the posterior distribution. Based on simulations, this elaborated Royle-Link model yields nearly unbiased estimates of multinomial and correct classification probability estimates when classification probabilities are allowed to vary according to the normal distribution on the logit scale or according to the Beta distribution. The method is illustrated using categorical submersed aquatic vegetation data. © 2010 Springer Science+Business Media, LLC.

  12. Introduction and Application of non-stationary Standardized Precipitation Index Considering Probability Distribution Function and Return Period

    NASA Astrophysics Data System (ADS)

    Park, J.; Lim, Y. J.; Sung, J. H.; Kang, H. S.

    2017-12-01

    The widely used meteorological drought index, the Standardized Precipitation Index (SPI), basically assumes stationarity, but recent changes in the climate have led to a need to review this hypothesis. In this study, a new non-stationary SPI that considers not only the modified probability distribution parameter but also the return period under the non-stationary process has been proposed. The results are evaluated for two severe drought cases during the last 10 years in South Korea. As a result, SPIs that considered the non-stationary hypothesis underestimated the drought severity relative to the stationary SPI, even though these past two droughts were recognized as significantly severe. This may be because the variances of summer and autumn precipitation become larger over time, which can make the shape of the probability distribution function wider than before. This understanding implies that drought expressions by a statistical index such as the SPI can be distorted by the stationarity assumption, and a cautious approach is needed when deciding drought levels under a changing climate.

  13. Introduction and application of non-stationary standardized precipitation index considering probability distribution function and return period

    NASA Astrophysics Data System (ADS)

    Park, Junehyeong; Sung, Jang Hyun; Lim, Yoon-Jin; Kang, Hyun-Suk

    2018-05-01

    The widely used meteorological drought index, the Standardized Precipitation Index (SPI), basically assumes stationarity, but recent changes in the climate have led to a need to review this hypothesis. In this study, a new non-stationary SPI that considers not only the modified probability distribution parameter but also the return period under the non-stationary process was proposed. The results were evaluated for two severe drought cases during the last 10 years in South Korea. As a result, SPIs that considered the non-stationary hypothesis underestimated the drought severity relative to the stationary SPI, even though these past two droughts were recognized as significantly severe. This may be because the variances of summer and autumn precipitation become larger over time, which can make the probability distribution wider than before. This implies that drought expressions by a statistical index such as the SPI can be distorted by the stationarity assumption, and a cautious approach is needed when deciding drought levels under a changing climate.
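
    For reference, the stationary SPI that both versions of this study take as a starting point is built by fitting a gamma distribution to the accumulation series and mapping through the standard normal quantile function; this sketch omits the usual special handling of zero-precipitation months.

```python
import numpy as np
from scipy import stats

def spi(accum):
    # stationary SPI: fit a gamma law to strictly positive precipitation
    # accumulations, then map the fitted CDF through the normal quantile
    a, _, scale = stats.gamma.fit(accum, floc=0)
    cdf = stats.gamma.cdf(accum, a, loc=0, scale=scale)
    return stats.norm.ppf(cdf)

rng = np.random.default_rng(5)
monthly = rng.gamma(2.0, 45.0, size=360)   # 30 years of assumed accumulations
print(spi(monthly)[:6].round(2))           # negative values indicate dry months
```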

  14. Angular Distribution of Gamma-Ray Bursts: An Observational Probe of Cosmological Principle

    NASA Astrophysics Data System (ADS)

    Mészáros, A.; Balázs, L. G.; Vavrek, R.; Horváth, I.; Bagoly, Z.

    The test of isotropy in the angular distribution of the gamma-ray bursts collected in the BATSE Catalog (Meegan C. A. et al., http://www.batse.msfc.nasa.gov/data, 2000) is a test of the cosmological principle itself, because the gamma-ray bursts lie at cosmological distances. Several articles by the authors study this question (Balázs L. G., Mészáros A., & Horváth I., Astron. Astrophys., 339, 1, 1998; Balázs L. G., Mészáros A., Horváth I., & Vavrek R., Astron. Astrophys. Suppl., 138, 417, 1999; Mészáros A., Bagoly Z., & Vavrek R., Astron. Astrophys., in press, 2000). The final conclusion concerning the validity of isotropy is complicated both by instrumental effects and by the fact that there are three subgroups of gamma-ray bursts ("short", "intermediate", "long"; the separation is made with respect to the duration of the bursts). The long bursts certainly extend up to z ≃ 4 (z is the redshift); for the remaining two subclasses the redshifts are unknown. The tests of isotropy performed suggest (after the elimination of instrumental effects) the existence of anisotropy for the intermediate subclass at a confidence level > 95%. On the other hand, for the remaining two subclasses the situation is unclear; there is no unambiguous rejection of isotropy for them yet at more than the 95% confidence level. If the bursts of the intermediate subclass are at high z (say, at z > 0.1), then the validity of the cosmological principle would be in serious doubt.

  15. Towards a theoretical determination of the geographical probability distribution of meteoroid impacts on Earth

    NASA Astrophysics Data System (ADS)

    Zuluaga, Jorge I.; Sucerquia, Mario

    2018-06-01

    The Tunguska and Chelyabinsk impact events occurred inside a geographical area covering only 3.4 per cent of the Earth's surface. Although two events hardly constitute a statistically significant demonstration of a geographical pattern of impacts, their spatial coincidence is at least tantalizing. To understand whether this concurrence reflects an underlying geographical and/or temporal pattern, we must aim at predicting the spatio-temporal distribution of meteoroid impacts on Earth. For this purpose we designed, implemented, and tested a novel numerical technique, 'Gravitational Ray Tracing' (GRT), to compute the relative impact probability (RIP) on the surface of any planet. GRT is inspired by the ray-casting techniques used to render realistic images of complex 3D scenes. In this paper we describe the method and the results of testing it at the times of large impact events. Our findings suggest a non-trivial pattern of impact probabilities at any given time on the Earth. Locations at 60-90° from the apex are more prone to impacts, especially at midnight. Counterintuitively, sites close to the apex direction have the lowest RIP, while in the antapex RIPs are slightly larger than average. We present preliminary maps of the RIP at the times of the Tunguska and Chelyabinsk events and find no evidence of a spatial or temporal pattern, suggesting that their coincidence was fortuitous. We apply the GRT method to compute theoretical RIPs at the locations and times of 394 large fireballs. Although the predicted spatio-temporal impact distribution matches the observed events only marginally, we successfully predict their impact speed distribution.

  16. Pricing the property claim service (PCS) catastrophe insurance options using gamma distribution

    NASA Astrophysics Data System (ADS)

    Noviyanti, Lienda; Soleh, Achmad Zanbar; Setyanto, Gatot R.

    2017-03-01

    Catastrophic events like earthquakes, hurricanes, or flooding are characteristic of some areas, where a properly calculated annual premium would be nearly as high as the loss insured. From an actuarial perspective, such events constitute risks that are not insurable. On the other hand, people living in such areas need protection. In order to securitize catastrophe risk, futures or options based on a loss index can be considered. The Chicago Board of Trade launched a new class of catastrophe insurance options based on new indices provided by Property Claim Services (PCS). The PCS option is based on the Property Claim Services index (PCS index), which is used to determine payouts when writing index-based insurance derivatives. The objective of this paper is to price a PCS catastrophe insurance option based on the PCS catastrophe index. A gamma distribution is used to estimate the distribution of the PCS catastrophe index.
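
    Under a gamma-distributed loss index, the discounted expected payoff of a European-style call has a closed form via the gamma tail identity E[L*1{L>K}] = k*theta*(1 - F_{k+1}(K)); a sketch with assumed parameters follows (the paper's actual contract specification may differ).

```python
import math
from scipy import stats

def pcs_call_price(k, theta, K, r=0.03, T=0.25):
    # discounted expected payoff e^{-rT} E[max(L - K, 0)] for L ~ Gamma(k, theta)
    tail1 = stats.gamma.sf(K, k + 1, scale=theta)   # 1 - F_{k+1}(K)
    tail0 = stats.gamma.sf(K, k, scale=theta)       # 1 - F_k(K)
    return math.exp(-r * T) * (k * theta * tail1 - K * tail0)

print(pcs_call_price(k=2.0, theta=50.0, K=120.0))   # all parameters illustrative
```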

  17. Gamma-ray Output Spectra from 239 Pu Fission

    DOE PAGES

    Ullmann, John

    2015-05-25

    The gamma-ray multiplicities, individual gamma-ray energy spectra, and total gamma energy spectra following neutron-induced fission of 239Pu were measured using the DANCE detector at Los Alamos. Corrections for detector response were made using a forward-modeling technique based on propagating sets of gamma rays generated from a parameterized model through a GEANT model of the DANCE array and adjusting the parameters for best fit to the measured spectra. The results for the gamma-ray spectrum and multiplicity are in general agreement with previous results, but the measured total gamma-ray energy is about 10% higher. A dependence of the gamma-ray spectrum on the gamma-ray multiplicity was also observed. Finally, global model calculations of the multiplicity and gamma energy distributions are in good agreement with the data, but predict a slightly softer total-energy distribution.

  18. Average capacity of the ground to train communication link of a curved track in the turbulence of gamma-gamma distribution

    NASA Astrophysics Data System (ADS)

    Yang, Yanqiu; Yu, Lin; Zhang, Yixin

    2017-04-01

    A model of the average capacity of an optical wireless communication link with pointing errors for ground-to-train communication on a curved track is established based on non-Kolmogorov turbulence. By adopting the gamma-gamma distribution model, we derive the average capacity expression for this channel. The numerical analysis reveals that heavier fog reduces the average capacity of the link. For a larger average link capacity, the strength of atmospheric turbulence, the variance of the pointing errors, and the covered track length need to be reduced, while the normalized beamwidth and the average signal-to-noise ratio (SNR) of the turbulence-free link need to be increased. The transmit aperture can be enlarged to expand the beamwidth and enhance the signal intensity, thereby decreasing the impact of beam wander. If the system adopts automatic beam tracking at a receiver positioned on the roof of the train, eliminating the pointing errors caused by beam wander and train vibration, the equivalent average capacity of the channel reaches a maximum value. The impact of variations in the non-Kolmogorov spectral index on the average capacity of the link can be ignored.
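
    The ergodic capacity under gamma-gamma fading can be estimated by Monte Carlo using the fact that a gamma-gamma variate is the product of two independent unit-mean gamma variates; the turbulence parameters and SNR grid below are assumed, and pointing errors are omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
alpha, beta = 4.0, 2.0          # gamma-gamma turbulence parameters (assumed)
n = 200_000

# gamma-gamma irradiance as the product of two independent unit-mean gammas
I = rng.gamma(alpha, 1.0 / alpha, n) * rng.gamma(beta, 1.0 / beta, n)

for snr_db in range(0, 31, 5):
    snr = 10.0 ** (snr_db / 10.0)
    C = np.mean(np.log2(1.0 + snr * I))   # ergodic capacity, bits/s/Hz
    print(snr_db, "dB:", round(C, 3))
```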

  19. Gesture Recognition Based on the Probability Distribution of Arm Trajectories

    NASA Astrophysics Data System (ADS)

    Wan, Khairunizam; Sawada, Hideyuki

    The use of human motions for the interaction between humans and computers is becoming an attractive alternative to verbal media, especially through the visual interpretation of human body motion. In particular, hand gestures are used as a non-verbal medium through which humans communicate with machines. This paper introduces a 3D motion measurement of the human upper body for the purpose of gesture recognition, based on the probability distribution of arm trajectories. In this study, by examining the characteristics of the arm trajectories given by a signer, motion features are selected and classified by using a fuzzy technique. Experimental results show that the use of features extracted from arm trajectories works effectively for the recognition of dynamic human gestures, and gives good performance in classifying various gesture patterns.

  20. Generalized quantum Fokker-Planck, diffusion, and Smoluchowski equations with true probability distribution functions.

    PubMed

    Banik, Suman Kumar; Bag, Bidhan Chandra; Ray, Deb Shankar

    2002-05-01

    Traditionally, quantum Brownian motion is described by Fokker-Planck or diffusion equations in terms of quasiprobability distribution functions, e.g., Wigner functions. These often become singular or negative in the full quantum regime. In this paper a simple approach to non-Markovian theory of quantum Brownian motion using true probability distribution functions is presented. Based on an initial coherent state representation of the bath oscillators and an equilibrium canonical distribution of the quantum mechanical mean values of their coordinates and momenta, we derive a generalized quantum Langevin equation in c numbers and show that the latter is amenable to a theoretical analysis in terms of the classical theory of non-Markovian dynamics. The corresponding Fokker-Planck, diffusion, and Smoluchowski equations are the exact quantum analogs of their classical counterparts. The present work is independent of path integral techniques. The theory as developed here is a natural extension of its classical version and is valid for arbitrary temperature and friction (the Smoluchowski equation being considered in the overdamped limit).

  1. Performance analysis of OOK-based FSO systems in Gamma-Gamma turbulence with imprecise channel models

    NASA Astrophysics Data System (ADS)

    Feng, Jianfeng; Zhao, Xiaohui

    2017-11-01

    For an FSO communication system with an imprecise channel model, we investigate the system performance in terms of outage probability, average BEP, and ergodic capacity. The exact FSO links are modeled as a Gamma-Gamma fading channel in consideration of both atmospheric turbulence and pointing errors, and the imprecise channel model is treated as the superposition of the exact channel gain and a Gaussian random variable. We derive the PDF, CDF, and nth moment of the imprecise channel gain, and based on these statistics we obtain expressions for the outage probability, the average BEP, and the ergodic capacity in terms of Meijer's G-functions. Both numerical and analytical results are presented. The simulation results show that the communication performance deteriorates under the imprecise channel model, and approaches the exact performance curves as the channel model becomes accurate.

  2. Spatial distribution and cognitive correlates of gamma noise power in schizophrenia.

    PubMed

    Díez, A; Suazo, V; Casado, P; Martín-Loeches, M; Molina, V

    2013-06-01

    Brain activity is less organized in patients with schizophrenia than in healthy controls (HC). Noise power (scalp-recorded electroencephalographic activity unlocked to stimuli) may be of use for studying this disorganization. Method: Fifty-four patients with schizophrenia (29 minimally treated and 25 stable treated), 23 first-degree relatives, and 27 HC underwent clinical and cognitive assessments and an electroencephalographic recording during an oddball P300 paradigm to calculate the noise power magnitude in the gamma band. We used a principal component analysis (PCA) to determine the factor structure of gamma noise power values across electrodes and the clinical and cognitive correlates of the resulting factors. The PCA revealed three noise power factors, roughly corresponding to the default mode network (DMN), frontal, and occipital regions, respectively. Patients showed higher gamma noise power loadings on the first factor when compared to HC and first-degree relatives. In the patients, frontal gamma noise factor scores were significantly and inversely related to working memory and problem-solving performance. There were no associations with symptoms. There is elevated gamma activity unrelated to task processing over regions coherent with the DMN topography in patients with schizophrenia. The same type of gamma activity over frontal regions is inversely related to performance in tasks with high involvement of these frontal areas. The idea of gamma noise as a possible biological marker for schizophrenia seems promising. Gamma noise might be of use in the study of the underlying neurophysiological mechanisms involved in this disease.

  3. Generalized skew-symmetric interfacial probability distribution in reflectivity and small-angle scattering analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, Zhang; Chen, Wei

    Generalized skew-symmetric probability density functions are proposed to model asymmetric interfacial density distributions for the parameterization of arbitrary density profiles in the 'effective-density model'. The penetration of the densities into adjacent layers can be selectively controlled and parameterized. A continuous density profile is generated and discretized into many thin, independent slices with constant density values and sharp interfaces. The discretized profile can be used to calculate reflectivities via Parratt's recursive formula, or small-angle scattering via the concentric onion model that is also developed in this work.

  4. Generalized skew-symmetric interfacial probability distribution in reflectivity and small-angle scattering analysis

    DOE PAGES

    Jiang, Zhang; Chen, Wei

    2017-11-03

    Generalized skew-symmetric probability density functions are proposed to model asymmetric interfacial density distributions for the parameterization of arbitrary density profiles in the 'effective-density model'. The penetration of the densities into adjacent layers can be selectively controlled and parameterized. A continuous density profile is generated and discretized into many thin, independent slices with constant density values and sharp interfaces. The discretized profile can be used to calculate reflectivities via Parratt's recursive formula, or small-angle scattering via the concentric onion model that is also developed in this work.
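
    A sketch of the idea with a skew-normal CDF standing in for the generalized skew-symmetric interface profile (densities and shape parameters assumed); the slicing step mirrors the discretization described above.

```python
import numpy as np
from scipy.stats import skewnorm

# assumed densities of the two layers and skew-normal interface shape
rho_top, rho_bottom = 0.0, 2.33
a, loc, scale = 4.0, 0.0, 1.0      # a != 0 skews penetration into one layer

z = np.linspace(-6.0, 6.0, 1201)   # depth axis
profile = rho_top + (rho_bottom - rho_top) * skewnorm.cdf(z, a, loc=loc, scale=scale)

# discretize into thin constant-density slices with sharp interfaces, ready
# for a Parratt-style reflectivity recursion over the slab stack
edges = np.linspace(z[0], z[-1], 201)
mid = 0.5 * (edges[:-1] + edges[1:])
slabs = rho_top + (rho_bottom - rho_top) * skewnorm.cdf(mid, a, loc=loc, scale=scale)
print(slabs[:5].round(4))
```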

  5. An innovative method for offshore wind farm site selection based on the interval number with probability distribution

    NASA Astrophysics Data System (ADS)

    Wu, Yunna; Chen, Kaifeng; Xu, Hu; Xu, Chuanbo; Zhang, Haobo; Yang, Meng

    2017-12-01

    There is insufficient research on offshore wind farm site selection in China, and the current methods for site selection have some defects. First, information loss is caused by two factors: the implicit assumption that the probability distribution on the interval number is uniform, and the neglect of the value of decision makers' (DMs') common opinion in evaluating the criteria information. Second, differences in the DMs' utility functions have failed to receive attention. An innovative method is proposed in this article to address these drawbacks. First, a new form of interval number and its weighted operator are proposed to reflect the uncertainty and reduce information loss. Second, a new stochastic dominance degree is proposed to quantify the interval number with a probability distribution. Third, a two-stage method integrating the weighted operator with the stochastic dominance degree is proposed to evaluate the alternatives. Finally, a case from China proves the effectiveness of this method.

  6. Does Breast Cancer Drive the Building of Survival Probability Models among States? An Assessment of Goodness of Fit for Patient Data from SEER Registries

    PubMed

    Khan, Hafiz; Saxena, Anshul; Perisetti, Abhilash; Rafiq, Aamrin; Gabbidon, Kemesha; Mende, Sarah; Lyuksyutova, Maria; Quesada, Kandi; Blakely, Summre; Torres, Tiffany; Afesse, Mahlet

    2016-12-01

    Background: Breast cancer is a worldwide public health concern and is the most prevalent type of cancer in women in the United States. This study concerned the best fit of statistical probability models on the basis of survival times for nine state cancer registries: California, Connecticut, Georgia, Hawaii, Iowa, Michigan, New Mexico, Utah, and Washington. Materials and Methods: A probability random sampling method was applied to select and extract the records of 2,000 breast cancer patients from the Surveillance Epidemiology and End Results (SEER) database for each of the nine state cancer registries used in this study. EasyFit software was utilized to identify the best probability models by using goodness of fit tests, and to estimate parameters for various statistical probability distributions that fit the survival data. Results: Summary statistics are reported for each of the states for the years 1973 to 2012. Kolmogorov-Smirnov, Anderson-Darling, and Chi-squared goodness of fit test values were used for the survival data, the highest values of the goodness of fit statistics being considered indicative of the best-fit survival model for each state. Conclusions: It was found that California, Connecticut, Georgia, Iowa, New Mexico, and Washington followed the Burr probability distribution, while the Dagum probability distribution gave the best fit for Michigan and Utah, and Hawaii followed the Gamma probability distribution. These findings highlight differences between states through selected sociodemographic variables and also demonstrate probability modeling differences in breast cancer survival times. The results of this study can be used to guide healthcare providers and researchers in further investigations into social and environmental factors in order to reduce the occurrence of and mortality due to breast cancer.
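
    The fitting-and-ranking workflow can be sketched with scipy in place of EasyFit; the survival times below are synthetic stand-ins, and common scipy families replace the study's Burr/Dagum fits.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(9)
surv = rng.gamma(1.4, 38.0, size=2000)   # stand-in for survival times (months)

# candidate families fitted by maximum likelihood, ranked by the KS statistic
for name, dist in [("gamma", stats.gamma), ("lognorm", stats.lognorm),
                   ("weibull", stats.weibull_min)]:
    params = dist.fit(surv, floc=0)
    ks = stats.kstest(surv, dist.cdf, args=params)
    print(f"{name:8s} KS statistic = {ks.statistic:.4f}")
```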

  7. Analytical results for the statistical distribution related to a memoryless deterministic walk: dimensionality effect and mean-field models.

    PubMed

    Terçariol, César Augusto Sangaletti; Martinez, Alexandre Souto

    2005-08-01

    Consider a medium characterized by N points whose coordinates are randomly generated by a uniform distribution along the edges of a unitary d-dimensional hypercube. A walker leaves from each point of this disordered medium and moves according to the deterministic rule of going to the nearest point which has not been visited in the preceding mu steps (deterministic tourist walk). Each trajectory generated by this dynamics has an initial nonperiodic part of t steps (transient) and a final periodic part of p steps (attractor). The neighborhood rank probabilities are parametrized by the normalized incomplete beta function I_d = I_{1/4}(1/2, (d+1)/2). The joint distribution S_N^{(mu,d)}(t,p) is relevant, and the marginal distributions previously studied are particular cases. We show that, for the memoryless deterministic tourist walk in Euclidean space, this distribution is S_infinity^{(1,d)}(t,p) = [Gamma(1 + I_d^{-1}) (t + I_d^{-1}) / Gamma(t + p + I_d^{-1})] delta_{p,2}, where t = 0, 1, 2, ..., Gamma(z) is the gamma function, and delta_{i,j} is the Kronecker delta. The mean-field models are the random link models, which correspond to d --> infinity, and the random map model which, even for mu = 0, presents a nontrivial cycle distribution [S_N^{(0,rm)}(p) proportional to p^{-1}]: S_N^{(0,rm)}(t,p) = Gamma(N) / {Gamma[N + 1 - (t + p)] N^{t+p}}. The fundamental quantities are the number of explored points n_e = t + p and I_d. Although the obtained distributions are simple, they do not follow straightforwardly, and they have been validated by numerical experiments.

  8. Probability distributions of whisker-surface contact: quantifying elements of the rat vibrissotactile natural scene.

    PubMed

    Hobbs, Jennifer A; Towal, R Blythe; Hartmann, Mitra J Z

    2015-08-01

    Analysis of natural scene statistics has been a powerful approach for understanding neural coding in the auditory and visual systems. In the field of somatosensation, it has been more challenging to quantify the natural tactile scene, in part because somatosensory signals are so tightly linked to the animal's movements. The present work takes a step towards quantifying the natural tactile scene for the rat vibrissal system by simulating rat whisking motions to systematically investigate the probabilities of whisker-object contact in naturalistic environments. The simulations permit an exhaustive search through the complete space of possible contact patterns, thereby allowing for the characterization of the patterns that would most likely occur during long sequences of natural exploratory behavior. We specifically quantified the probabilities of 'concomitant contact', that is, given that a particular whisker makes contact with a surface during a whisk, what is the probability that each of the other whiskers will also make contact with the surface during that whisk? Probabilities of concomitant contact were quantified in simulations that assumed increasingly naturalistic conditions: first, the space of all possible head poses; second, the space of behaviorally preferred head poses as measured experimentally; and third, common head poses in environments such as cages and burrows. As environments became more naturalistic, the probability distributions shifted from exhibiting a 'row-wise' structure to a more diagonal structure. Results also reveal that the rat appears to use motor strategies (e.g. head pitches) that generate contact patterns that are particularly well suited to extract information in the presence of uncertainty. © 2015. Published by The Company of Biologists Ltd.

  9. Distributed Attention Is Implemented through Theta-Rhythmic Gamma Modulation.

    PubMed

    Landau, Ayelet Nina; Schreyer, Helene Marianne; van Pelt, Stan; Fries, Pascal

    2015-08-31

    When subjects monitor a single location, visual target detection depends on the pre-target phase of an ∼8 Hz brain rhythm. When multiple locations are monitored, performance decrements suggest a division of the 8 Hz rhythm over the number of locations, indicating that different locations are sequentially sampled. Indeed, when subjects monitor two locations, performance benefits alternate at a 4 Hz rhythm. These performance alternations were revealed after a reset of attention to one location. Although resets are common and important events for attention, it is unknown whether, in the absence of resets, ongoing attention samples stimuli in alternation. Here, we examined whether spatially specific attentional sampling can be revealed by ongoing pre-target brain rhythms. Visually induced gamma-band activity plays a role in spatial attention. Therefore, we hypothesized that performance on two simultaneously monitored stimuli can be predicted by a 4 Hz modulation of gamma-band activity. Brain rhythms were assessed with magnetoencephalography (MEG) while subjects monitored bilateral grating stimuli for a unilateral target event. The corresponding contralateral gamma-band responses were subtracted from each other to isolate spatially selective, target-related fluctuations. The resulting lateralized gamma-band activity (LGA) showed opposite pre-target 4 Hz phases for detected versus missed targets. The 4 Hz phase of pre-target LGA accounted for a 14.5% modulation in performance. These findings suggest that spatial attention is a theta-rhythmic sampling process that is continuously ongoing, with each sampling cycle being implemented through gamma-band synchrony. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. The galactic gamma-ray distribution: Implications for galactic structure and the radial cosmic ray gradient

    NASA Technical Reports Server (NTRS)

    Harding, A. K.; Stecker, F. W.

    1984-01-01

    The radial distribution of gamma-ray emissivity in the Galaxy was derived from flux-longitude profiles, using both the final SAS-2 results and the recently corrected COS-B results and analyzing the northern and southern galactic regions separately. The recent CO surveys of the Southern Hemisphere were used in conjunction with the Northern Hemisphere data to derive the radial distribution of cosmic rays on both sides of the galactic plane. In addition to the 5 kpc ring, the radial asymmetry provides evidence for spiral features consistent with those derived from the distribution of bright HII regions. Positive evidence was also found for a strong increase in the cosmic-ray flux in the inner Galaxy, particularly in the 5 kpc region in both halves of the plane.

  11. Monitoring Radionuclide Transport and Spatial Distribution with a 1D Gamma-Ray Scanner

    NASA Astrophysics Data System (ADS)

    Dozier, R.; Erdmann, B.; Sams, A.; Barber, K.; DeVol, T. A.; Moysey, S. M.; Powell, B. A.

    2016-12-01

    Understanding radionuclide movement in the environment is important for informing strategies for radioactive waste management and disposal. A 1-dimensional (1D) gamma-ray emission scanning system was developed to investigate radionuclide transport behavior within soils. Two case studies illustrate the use of the system for non-destructively monitoring transport processes within a soil column. The first case study explores the system's capabilities for simultaneously detecting technetium-99m (99mTc), iodine-131 (131I), and sodium-22 (22Na) moving through a column (length = 14.1 cm, diameter = 3.8 cm) packed with soil from the Department of Energy's Savannah River Site. A sodium iodide (NaI) detector was placed 4 cm above the influent end and a bismuth germanate (BGO) detector about 10 cm above the influent end. The NaI detector results show that 99mTc, 131I, and 22Na have similar breakthrough curves, with the tail of 99mTc lower than those of 131I and 22Na. NaCl tracer results complement the gamma-ray emission measurements. These results are promising because they show that movement of the isotopes in the column can be monitored in real time. In the second case study, the 1D gamma scanner was used to quantify radionuclide mobility within a lysimeter (length = 51 cm, diameter = 10 cm). A cementitious waste form containing cobalt-60 (60Co), barium-133 (133Ba), cesium-137 (137Cs), and europium-152 (152Eu), with the amount of each contained in the cement ranging from 3 to 8.5 MBq, was placed at the midpoint of the lysimeter. The lysimeter was then exposed to natural rainfall and environmental conditions, and effluent samples were collected and quantified on a quarterly basis. Following 3.3 years of exposure, the radionuclide distribution in the lysimeter was quantified with a 0.64 cm collimated high-purity germanium gamma-ray spectrometer. Diffusion of 137Cs away from the cementitious waste form was observed. No movement was seen for 133Ba, 60Co, or 152Eu within the detection limits.

  12. Joint probabilities and quantum cognition

    NASA Astrophysics Data System (ADS)

    de Barros, J. Acacio

    2012-12-01

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.

  13. Time-dependent landslide probability mapping

    USGS Publications Warehouse

    Campbell, Russell H.; Bernknopf, Richard L.; ,

    1993-01-01

    Case studies where the time of failure is known for rainfall-triggered debris flows can be used to estimate the parameters of a hazard model in which the probability of failure is a function of time. As an example, a time-dependent function for the conditional probability of a soil slip is estimated from independent variables representing hillside morphology, approximations of material properties, and the duration and rate of rainfall. If probabilities are calculated in a GIS (geographic information system) environment, the spatial distribution of the result for any given hour can be displayed on a map. Although the probability levels in this example are uncalibrated, the method offers a potential for evaluating different physical models and different earth-science variables by comparing the map distribution of predicted probabilities with inventory maps for different areas and different storms. If linked with spatial and temporal socio-economic variables, this method could be used for short-term risk assessment.

  14. Spatial distribution and occurrence probability of regional new particle formation events in eastern China

    NASA Astrophysics Data System (ADS)

    Shen, Xiaojing; Sun, Junying; Kivekäs, Niku; Kristensson, Adam; Zhang, Xiaoye; Zhang, Yangmei; Zhang, Lu; Fan, Ruxia; Qi, Xuefei; Ma, Qianli; Zhou, Huaigang

    2018-01-01

    In this work, the spatial extent of new particle formation (NPF) events and the relative probability of observing particles originating from different spatial origins around three rural sites in eastern China were investigated using the NanoMap method, based on particle number size distribution (PNSD) data and air mass back trajectories. The lengths of the datasets used were 7, 1.5, and 3 years at the rural sites Shangdianzi (SDZ) in the North China Plain (NCP), Mt. Tai (TS) in central eastern China, and Lin'an (LAN) in the Yangtze River Delta region in eastern China, respectively. Regional NPF events were observed to occur with a horizontal extent larger than 500 km at SDZ and TS, favoured by the fast transport of northwesterly air masses. At LAN, however, the spatial footprint of NPF events was mostly observed around the site within 100-200 km. Differences in the horizontal spatial distribution of new particle source areas at the different sites were connected to the typical meteorological conditions at each site. Consecutive large-scale regional NPF events were observed at SDZ and TS simultaneously and were associated with a high surface pressure system dominating over this area. Simultaneous NPF events at SDZ and LAN were seldom observed. At SDZ the polluted air masses arriving over the NCP were associated with higher particle growth rates (GR) and new particle formation rates (J) than air masses from Inner Mongolia (IM). At TS the same phenomenon was observed for J, but GR was somewhat lower in air masses arriving over the NCP compared with those arriving from IM. The capability of NanoMap to capture the NPF occurrence probability depends on the length of the PNSD dataset but also on the topography around the measurement site and the typical air mass advection speed during NPF events. Thus long-term measurements of PNSD in the planetary boundary layer are necessary in the further study of spatial extent and the

  15. Gamma rays of energy ≥ 10¹⁵ eV from Cyg X-3

    NASA Technical Reports Server (NTRS)

    Kifune, T.; Nishijima, K.; Hara, T.; Hatano, Y.; Hayashida, N.; Honda, M.; Kamata, K.; Matsubara, Y.; Mori, M.; Nagano, M.

    1985-01-01

    The experimental data of extensive air showers observed at Akeno have been analyzed to detect the gamma-ray signal from Cyg X-3. After muon-poor air showers are selected, the correlation of data acquisition time with the 4.8 hour X-ray period is studied, showing a concentration of events near phase 0.6, the time of the X-ray maximum. The probability that a uniform background creates this distribution is 0.2%. The time-averaged integral gamma-ray flux is estimated as (1.1 ± 0.4)×10⁻¹⁴ cm⁻² s⁻¹ for E₀ ≥ 10¹⁵ eV and (8.8 ± 5.0)×10⁻¹⁴ cm⁻² s⁻¹ for E₀ ≥ 6×10¹⁴ eV.

  16. Site-to-Source Finite Fault Distance Probability Distribution in Probabilistic Seismic Hazard and the Relationship Between Minimum Distances

    NASA Astrophysics Data System (ADS)

    Ortega, R.; Gutierrez, E.; Carciumaru, D. D.; Huesca-Perez, E.

    2017-12-01

    We present a method to compute the conditional and unconditional probability density function (PDF) of the finite fault distance distribution (FFDD). Two cases are described: lines and areas. The case of lines has a simple analytical solution, while in the case of areas the geometrical probability of a fault based on the strike, dip, and fault segment vertices is obtained using the projection of spheres onto a piecewise rectangular surface. The cumulative distribution is computed by measuring the projection of a sphere of radius r in an effective area, using an algorithm that estimates the area of a circle within a rectangle. In addition, we introduce the finite fault distance metric: the distance at which the maximum stress release occurs within the fault plane, generating the peak ground motion. The appropriate ground motion prediction equations (GMPEs) for PSHA can then be applied at this distance. The conditional probability of distance given magnitude is also presented using different scaling laws. A simple model with the centroid fixed at the geometrical mean is discussed; in this model, hazard is reduced at the edges because the effective size is reduced. There is currently a trend toward using extended-source distances in PSHA; however, it is not possible to separate the fault geometry from the GMPE. With this new approach it is possible to add fault rupture models, separating geometrical and propagation effects.
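
    The geometric idea can be illustrated with a crude Monte Carlo stand-in (not the authors' semi-analytic projection algorithm): treat the point of maximum stress release as uniformly distributed over a hypothetical dipping rectangular fault plane and tabulate site-to-point distances. All geometry values below are assumptions for illustration only:

        import numpy as np

        rng = np.random.default_rng(1)

        # Hypothetical fault: 40 km x 15 km plane striking along x, dipping 60 degrees,
        # top edge buried 2 km; site on the free surface, 10 km from the surface trace
        L, W, dip, z_top = 40.0, 15.0, np.radians(60.0), 2.0
        site = np.array([0.0, 10.0, 0.0])

        u = rng.uniform(0.0, L, 100_000)             # along-strike coordinate
        w = rng.uniform(0.0, W, 100_000)             # down-dip coordinate
        pts = np.column_stack([u - L / 2.0,          # x: centered along strike
                               w * np.cos(dip),      # y: horizontal down-dip component
                               z_top + w * np.sin(dip)])   # z: depth

        r = np.linalg.norm(pts - site, axis=1)       # site-to-rupture-point distances (km)
        print("median distance %.1f km; P(distance <= 20 km) = %.2f"
              % (np.median(r), (r <= 20.0).mean()))

    Histogramming r would give an empirical FFDD; the paper's contribution is obtaining this distribution semi-analytically and conditioning it on magnitude.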

  17. Experimental investigation of the intensity fluctuation joint probability and conditional distributions of the twin-beam quantum state.

    PubMed

    Zhang, Yun; Kasai, Katsuyuki; Watanabe, Masayoshi

    2003-01-13

    We give the intensity fluctuation joint probability of the twin-beam quantum state, which was generated with an optical parametric oscillator operating above threshold. We then present what is, to our knowledge, the first measurement of the intensity fluctuation conditional probability distributions of twin beams. The measured inference variance of the twin beams, 0.62 ± 0.02, is less than the standard quantum limit of unity, indicating inference with a precision better than that attainable with separable states. The measured photocurrent variance exhibits a quantum correlation of as much as -4.9 ± 0.2 dB between the signal and the idler.

  18. In favor of general probability distributions: lateral prefrontal and insular cortices respond to stimulus inherent, but irrelevant differences.

    PubMed

    Mestres-Missé, Anna; Trampel, Robert; Turner, Robert; Kotz, Sonja A

    2016-04-01

    A key aspect of optimal behavior is the ability to predict what will come next. To achieve this, we must have a fairly good idea of the probability of occurrence of possible outcomes. This is based both on prior knowledge about a particular or similar situation and on immediately relevant new information. One question that arises is: when considering converging prior probability and external evidence, is the most probable outcome selected or does the brain represent degrees of uncertainty, even highly improbable ones? Using functional magnetic resonance imaging, the current study explored these possibilities by contrasting words that differ in their probability of occurrence, namely, unbalanced ambiguous words and unambiguous words. Unbalanced ambiguous words have a strong frequency-based bias towards one meaning, while unambiguous words have only one meaning. The current results reveal larger activation in lateral prefrontal and insular cortices in response to dominant ambiguous compared to unambiguous words even when prior and contextual information biases one interpretation only. These results suggest a probability distribution, whereby all outcomes and their associated probabilities of occurrence--even if very low--are represented and maintained.

  19. The distribution of the intervals between neural impulses in the maintained discharges of retinal ganglion cells.

    PubMed

    Levine, M W

    1991-01-01

    Simulated neural impulse trains were generated by a digital realization of the integrate-and-fire model. The variability in these impulse trains had as its origin a random noise of specified distribution. Three different distributions were used: the normal (Gaussian) distribution (no skew, normokurtic), a first-order gamma distribution (positive skew, leptokurtic), and a uniform distribution (no skew, platykurtic). Despite these differences in the distribution of the variability, the distributions of the intervals between impulses were nearly indistinguishable. These inter-impulse distributions were better fit with a hyperbolic gamma distribution than a hyperbolic normal distribution, although one might expect a better approximation for normally distributed inverse intervals. Consideration of why the inter-impulse distribution is independent of the distribution of the causative noise suggests two putative interval distributions that do not depend on the assumed noise distribution: the log normal distribution, which is predicated on the assumption that long intervals occur with the joint probability of small input values, and the random walk equation, which is the diffusion equation applied to a random walk model of the impulse generating process. Either of these equations provides a more satisfactory fit to the simulated impulse trains than the hyperbolic normal or hyperbolic gamma distributions. These equations also provide better fits to impulse trains derived from the maintained discharges of ganglion cells in the retinae of cats or goldfish. It is noted that both equations are free from the constraint that the coefficient of variation (CV) have a maximum of unity.(ABSTRACT TRUNCATED AT 250 WORDS)
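
    A minimal Python sketch of the simulation setup described above (the threshold and noise parameters are assumed, not taken from the paper) shows how noises matched in mean and variance but differing in shape yield similar interval statistics:

        import numpy as np

        rng = np.random.default_rng(0)
        theta, n_steps = 10.0, 200_000        # firing threshold and number of time steps

        def intervals(noise):
            """Accumulate input; emit a spike and reset whenever the sum crosses theta."""
            v, last, out = 0.0, 0, []
            for i, increment in enumerate(noise):
                v += increment
                if v >= theta:
                    out.append(i - last)
                    last, v = i, 0.0
            return np.array(out)

        mu, sd = 1.0, 1.0                     # all three noises share this mean and SD
        noises = {
            "normal":  rng.normal(mu, sd, n_steps),
            "gamma":   rng.gamma(1.0, sd, n_steps) + (mu - sd),   # first-order gamma, shifted
            "uniform": rng.uniform(mu - np.sqrt(3.0) * sd, mu + np.sqrt(3.0) * sd, n_steps),
        }
        for name, x in noises.items():
            isi = intervals(x)
            print("%-7s mean interval %.2f, CV %.2f" % (name, isi.mean(), isi.std() / isi.mean()))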

  20. Distribution functions of air-scattered gamma rays above isotropic plane sources

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michael, J A; Lamonds, H A

    1967-06-01

    Using the moments method of Spencer and Fano and a reconstruction technique suggested by Berger, the authors have calculated energy and angular distribution functions for air-scattered gamma rays emitted from infinite-plane isotropic monoenergetic sources as functions of source energy, radiation incidence angle at the detector, and detector altitude. Incremental and total buildup factors have been calculated for both number and exposure. The results are presented in tabular form for a detector located at altitudes of 3, 50, 100, 200, 300, 400, 500, and 1000 feet above source planes of 15 discrete energies spanning the range of 0.1 to 3.0 MeV. Calculational techniques, including results of sensitivity studies, are discussed, and plots of typical results are presented. (auth)

  1. AN EVOLVING STELLAR INITIAL MASS FUNCTION AND THE GAMMA-RAY BURST REDSHIFT DISTRIBUTION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, F. Y.; Dai, Z. G.

    2011-02-01

    Recent studies suggest that Swift gamma-ray bursts (GRBs) may not trace an ordinary star formation history (SFH). Here, we show that the GRB rate turns out to be consistent with the SFH with an evolving stellar initial mass function (IMF). We first show that the latest Swift sample of GRBs reveals an increasing evolution in the GRB rate relative to the ordinary star formation rate at high redshifts. We then assume that only massive stars with masses greater than a critical value produce GRBs and use an evolving stellar IMF suggested by Dave to fit the latest GRB redshift distribution. This evolving IMF would increase the relative number of massive stars, which could lead to more GRB explosions at high redshifts. We find that the evolving IMF can well reproduce the observed redshift distribution of Swift GRBs.

  2. A chi-square goodness-of-fit test for non-identically distributed random variables: with application to empirical Bayes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Conover, W.J.; Cox, D.D.; Martz, H.F.

    1997-12-01

    When using parametric empirical Bayes estimation methods for estimating the binomial or Poisson parameter, the validity of the assumed beta or gamma conjugate prior distribution is an important diagnostic consideration. Chi-square goodness-of-fit tests of the beta or gamma prior hypothesis are developed for use when the binomial sample sizes or Poisson exposure times vary. Nine examples illustrate the application of the methods, using real data from such diverse applications as the loss of feedwater flow rates in nuclear power plants, the probability of failure to run on demand and the failure rates of the high-pressure coolant injection systems at US commercial boiling water reactors, the probability of failure to run on demand of emergency diesel generators in US commercial nuclear power plants, the rate of failure of aircraft air conditioners, baseball batting averages, the probability of testing positive for toxoplasmosis, and the probability of tumors in rats. The tests are easily applied in practice by means of corresponding Mathematica® computer programs, which are provided.

  3. Development and application of a probability distribution retrieval scheme to the remote sensing of clouds and precipitation

    NASA Astrophysics Data System (ADS)

    McKague, Darren Shawn

    2001-12-01

    The statistical properties of clouds and precipitation on a global scale are important to our understanding of climate. Inversion methods exist to retrieve the needed cloud and precipitation properties from satellite data pixel-by-pixel that can then be summarized over large data sets to obtain the desired statistics. These methods can be quite computationally expensive, and typically don't provide errors on the statistics. A new method is developed to directly retrieve probability distributions of parameters from the distribution of measured radiances. The method also provides estimates of the errors on the retrieved distributions. The method can retrieve joint distributions of parameters that allows for the study of the connection between parameters. A forward radiative transfer model creates a mapping from retrieval parameter space to radiance space. A Monte Carlo procedure uses the mapping to transform probability density from the observed radiance histogram to a two- dimensional retrieval property probability distribution function (PDF). An estimate of the uncertainty in the retrieved PDF is calculated from random realizations of the radiance to retrieval parameter PDF transformation given the uncertainty of the observed radiances, the radiance PDF, the forward radiative transfer, the finite number of prior state vectors, and the non-unique mapping to retrieval parameter space. The retrieval method is also applied to the remote sensing of precipitation from SSM/I microwave data. A method of stochastically generating hydrometeor fields based on the fields from a numerical cloud model is used to create the precipitation parameter radiance space transformation. The impact of vertical and horizontal variability within the hydrometeor fields has a significant impact on algorithm performance. Beamfilling factors are computed from the simulated hydrometeor fields. The beamfilling factors vary quite a bit depending upon the horizontal structure of the rain. The
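
    The core Monte Carlo transformation can be sketched with a toy forward model (the function, noise levels, and priors below are illustrative assumptions, not the thesis' radiative transfer): probability density is pushed from the observed radiance histogram back into parameter space through a library of forward-modeled states.

        import numpy as np

        rng = np.random.default_rng(4)

        def forward(theta):
            """Toy stand-in for the forward radiative transfer model."""
            return 100.0 + 40.0 * np.tanh(theta / 5.0)

        # Library of prior parameter states with modeled (noisy) radiances
        theta_lib = rng.uniform(0.0, 20.0, 50_000)
        rad_lib = forward(theta_lib) + rng.normal(0.0, 1.0, theta_lib.size)

        # 'Observed' radiances; the underlying truth is theta ~ N(8, 2)
        rad_obs = forward(rng.normal(8.0, 2.0, 20_000)) + rng.normal(0.0, 1.0, 20_000)

        # Transform density from radiance space to parameter space: match each
        # observed radiance to a library member with a nearby modeled radiance
        order = np.argsort(rad_lib)
        pos = np.clip(np.searchsorted(rad_lib[order], rad_obs), 0, rad_lib.size - 1)
        theta_draws = theta_lib[order][pos]

        pdf, edges = np.histogram(theta_draws, bins=40, range=(0.0, 20.0), density=True)
        centers = (edges[:-1] + edges[1:]) / 2.0
        print("retrieved mean %.2f (truth 8.0); PDF peak near %.1f"
              % (theta_draws.mean(), centers[pdf.argmax()]))

    Repeating the match with resampled libraries and noise realizations, as the thesis describes, would yield the uncertainty on the retrieved PDF.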

  4. Evaluation of the radiobiological gamma index with motion interplay in tangential IMRT breast treatment

    PubMed Central

    Sumida, Iori; Yamaguchi, Hajime; Das, Indra J.; Kizaki, Hisao; Aboshi, Keiko; Tsujii, Mari; Yamada, Yuji; Tamari, Kiesuke; Suzuki, Osamu; Seo, Yuji; Isohashi, Fumiaki; Yoshioka, Yasuo; Ogawa, Kazuhiko

    2016-01-01

    The purpose of this study was to evaluate the impact of the motion interplay effect in early-stage left-sided breast cancer intensity-modulated radiation therapy (IMRT), incorporating the radiobiological gamma index (RGI). The IMRT dosimetry for various breathing amplitudes and cycles was investigated in 10 patients. The predicted dose was calculated using the convolution of segmented measured doses. The physical gamma index (PGI) of the planning target volume (PTV) and the organs at risk (OAR) was calculated by comparing the original with the predicted dose distributions. The RGI was calculated from the PGI using the tumor control probability (TCP) and the normal tissue complication probability (NTCP). The predicted mean dose and the generalized equivalent uniform dose (gEUD) to the target with various breathing amplitudes were lower than the original dose (P < 0.01). The predicted mean dose and gEUD to the OARs with motion were higher than for the original dose to the OARs (P < 0.01). However, the predicted data did not differ significantly between the various breathing cycles for either the PTV or the OARs. The mean RGI gamma passing rate for the PTV was higher than that for the PGI (P < 0.01), and for OARs, the RGI values were higher than those for the PGI (P < 0.01). The gamma passing rates of the RGI for the target and the OARs other than the contralateral lung differed significantly from those of the PGI under organ motion. Provided an NTCP value <0.05 is considered acceptable, it may be possible, by taking breathing motion into consideration, to escalate the dose to achieve the PTV coverage without compromising the TCP. PMID:27534793
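
    For reference, the physical gamma index underlying both the PGI and the RGI can be sketched in a few lines; the 3%/3 mm criteria and the dose profiles below are assumed examples, and the radiobiological variant would additionally weight dose differences through TCP/NTCP models:

        import numpy as np

        def gamma_index_1d(x, d_ref, d_eval, dd=0.03, dta=3.0):
            """Global gamma: min over evaluated points of the combined dose/distance metric."""
            g = np.empty_like(d_ref)
            for k, (xk, dk) in enumerate(zip(x, d_ref)):
                dose_term = (d_eval - dk) / (dd * d_ref.max())
                dist_term = (x - xk) / dta
                g[k] = np.sqrt(dose_term ** 2 + dist_term ** 2).min()
            return g

        x = np.linspace(0.0, 100.0, 201)                    # position (mm)
        d_ref = np.exp(-((x - 50.0) / 20.0) ** 2)           # planned profile
        d_eval = 1.02 * np.exp(-((x - 51.0) / 20.0) ** 2)   # delivered: shifted and scaled
        g = gamma_index_1d(x, d_ref, d_eval)
        print("gamma passing rate (gamma <= 1): %.1f%%" % (100.0 * (g <= 1.0).mean()))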

  5. Applications of Bayesian Statistics to Problems in Gamma-Ray Bursts

    NASA Technical Reports Server (NTRS)

    Meegan, Charles A.

    1997-01-01

    This presentation will describe two applications of Bayesian statistics to gamma-ray bursts (GRBs). The first attempts to quantify the evidence for a cosmological versus galactic origin of GRBs using only the observations of the dipole and quadrupole moments of the angular distribution of bursts. The cosmological hypothesis predicts isotropy, while the galactic hypothesis is assumed to produce a uniform probability distribution over positive values for these moments. The observed isotropic distribution indicates that the Bayes factor for the cosmological hypothesis over the galactic hypothesis is about 300. Another application of Bayesian statistics is in the estimation of chance associations of optical counterparts with galaxies. The Bayesian approach is preferred to frequentist techniques here because the Bayesian approach easily accounts for galaxy mass distributions and because one can incorporate three disjoint hypotheses: (1) bursts come from galactic centers, (2) bursts come from galaxies in proportion to luminosity, and (3) bursts do not come from external galaxies. This technique was used in the analysis of the optical counterpart to GRB970228.

  6. On probability-possibility transformations

    NASA Technical Reports Server (NTRS)

    Klir, George J.; Parviz, Behzad

    1992-01-01

    Several probability-possibility transformations are compared in terms of the closeness of preserving second-order properties. The comparison is based on experimental results obtained by computer simulation. Two second-order properties are involved in this study: noninteraction of two distributions and projections of a joint distribution.

  7. Modeling the probability distribution of positional errors incurred by residential address geocoding.

    PubMed

    Zimmerman, Dale L; Fang, Xiangming; Mazumdar, Soumya; Rushton, Gerard

    2007-01-10

    The assignment of a point-level geocode to subjects' residences is an important data assimilation component of many geographic public health studies. Often, these assignments are made by a method known as automated geocoding, which attempts to match each subject's address to an address-ranged street segment georeferenced within a streetline database and then interpolate the position of the address along that segment. Unfortunately, this process results in positional errors. Our study sought to model the probability distribution of positional errors associated with automated geocoding and E911 geocoding. Positional errors were determined for 1423 rural addresses in Carroll County, Iowa as the vector difference between each 100%-matched automated geocode and its true location as determined by orthophoto and parcel information. Errors were also determined for 1449 60%-matched geocodes and 2354 E911 geocodes. Huge (> 15 km) outliers occurred among the 60%-matched geocoding errors; outliers occurred for the other two types of geocoding errors also but were much smaller. E911 geocoding was more accurate (median error length = 44 m) than 100%-matched automated geocoding (median error length = 168 m). The empirical distributions of positional errors associated with 100%-matched automated geocoding and E911 geocoding exhibited a distinctive Greek-cross shape and had many other interesting features that were not capable of being fitted adequately by a single bivariate normal or t distribution. However, mixtures of t distributions with two or three components fit the errors very well. Mixtures of bivariate t distributions with few components appear to be flexible enough to fit many positional error datasets associated with geocoding, yet parsimonious enough to be feasible for nascent applications of measurement-error methodology to spatial epidemiology.

  8. The solar gamma ray and neutron capabilities of COMPTEL on the Gamma Ray Observatory

    NASA Technical Reports Server (NTRS)

    Ryan, James M.; Lockwood, John A.

    1989-01-01

    The imaging Compton telescope COMPTEL on the Gamma Ray Observatory (GRO) has unusual spectroscopic capabilities for measuring solar gamma-ray and neutron emission. The launch of the GRO is scheduled for June 1990 near the peak of the sunspot cycle. With a 30 to 40 percent probability for the Sun being in the COMPTEL field-of-view during the sunlit part of an orbit, a large number of flares will be observed above the 800 keV gamma-ray threshold of the telescope. The telescope energy range extends to 30 MeV with high time resolution burst spectra available from 0.1 to 10 MeV. Strong Compton tail suppression of instrumental gamma-ray interactions will facilitate improved spectral analysis of solar flare emissions. In addition, the high signal to noise ratio for neutron detection and measurement will provide new neutron spectroscopic capabilities. Specifically, a flare similar to that of 3 June 1982 will provide spectroscopic data on greater than 1500 individual neutrons, enough to construct an unambiguous spectrum in the energy range of 20 to 200 MeV. Details of the instrument and its response to solar gamma-rays and neutrons will be presented.

  9. CAN'T MISS--conquer any number task by making important statistics simple. Part 2. Probability, populations, samples, and normal distributions.

    PubMed

    Hansen, John P

    2003-01-01

    Healthcare quality improvement professionals need to understand and use inferential statistics to interpret sample data from their organizations. In quality improvement and healthcare research studies all the data from a population often are not available, so investigators take samples and make inferences about the population by using inferential statistics. This three-part series will give readers an understanding of the concepts of inferential statistics as well as the specific tools for calculating confidence intervals for samples of data. This article, Part 2, describes probability, populations, and samples. The uses of descriptive and inferential statistics are outlined. The article also discusses the properties and probability of normal distributions, including the standard normal distribution.

  10. A classification scheme for edge-localized modes based on their probability distributions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shabbir, A., E-mail: aqsa.shabbir@ugent.be; Max Planck Institute for Plasma Physics, D-85748 Garching; Hornung, G.

    We present here an automated classification scheme which is particularly well suited to scenarios where the parameters have significant uncertainties or are stochastic quantities. To this end, the parameters are modeled with probability distributions in a metric space and classification is conducted using the notion of nearest neighbors. The presented framework is then applied to the classification of type I and type III edge-localized modes (ELMs) from a set of carbon-wall plasmas at JET. This provides a fast, standardized classification of ELM types which is expected to significantly reduce the effort of ELM experts in identifying ELM types. Further, the classification scheme is general and can be applied to various other plasma phenomena as well.

  11. Did A Galactic Gamma-Ray Burst Kill the Dinosaurs?

    NASA Astrophysics Data System (ADS)

    Brecher, K.

    1997-12-01

    Gamma-ray bursts now appear to be primarily of extragalactic origin. Statistically, assuming isotropic emission, the observed event rates and fluxes imply that one event occurs per 10⁴-10⁶ years per galaxy, with about 10⁵¹-10⁵³ ergs in gamma rays emitted per event. Unless the Milky Way is unusual, a gamma-ray burst should occur within 10²-10³ pc of the Sun in a time span of order 10⁸ years. Independent of the underlying cause of the event, it would irradiate the solar system with a brief flash of MeV gamma rays with a fluence as large as 10⁹-10¹¹ erg cm⁻². What is the effect of such an event on the Earth and objects nearby? Ruderman (Science, 184, 1079, 1974) and subsequent authors have considered a number of effects of a flash of gamma rays from a nearby supernova explosion on the Earth's atmosphere and on its biota. However, with regard to the demise of the dinosaurs, there was a marked increase in the deposition rate of the rare element iridium coincident with their extinction. For this reason, an asteroid-Earth impact has been considered the leading contender for the death of the dinosaurs. Here we consider a new mechanism for mass biological extinctions, caused by small comets nudged into the inner solar system by nearby gamma-ray bursts. If comets populate the Oort cloud with a wide distribution of masses, radii, and orbital eccentricities, we find that small (< 1 km), low-density (10⁻² g cm⁻³) objects in highly eccentric orbits can be injected into the inner solar system by a nearby gamma-ray burst. For a relatively brief period of time, the near-Earth comet population would increase dramatically. The consequent increased probability of comet-Earth impacts of appropriate energy and material content could account for many of the characteristics of the Cretaceous-Tertiary or other terrestrial mass biological extinctions.

  12. Probability distribution and statistical properties of spherically compensated cosmic regions in ΛCDM cosmology

    NASA Astrophysics Data System (ADS)

    Alimi, Jean-Michel; de Fromont, Paul

    2018-04-01

    The statistical properties of cosmic structures are well known to be strong probes for cosmology. In particular, several studies have tried to use the cosmic void counting number to obtain tight constraints on dark energy. In this paper, we model the statistical properties of these regions using the CoSphere formalism (de Fromont & Alimi) in both the primordial and non-linearly evolved Universe in the standard Λ cold dark matter model. This formalism applies similarly for minima (voids) and maxima (such as DM haloes), which are here considered symmetrically. We first derive the full joint Gaussian distribution of CoSphere's parameters in the Gaussian random field. We recover the results of Bardeen et al. only in the limit where the compensation radius becomes very large, i.e. when the central extremum decouples from its cosmic environment. We compute the probability distribution of the compensation size in this primordial field. We show that this distribution is redshift independent and can be used to model the cosmic void size distribution. We also derive the statistical distribution of the peak parameters introduced by Bardeen et al. and discuss their correlation with the cosmic environment. We show that small central extrema with low density are associated with narrow compensation regions with deep compensation density, while higher central extrema are preferentially located in larger but smoother over/under massive regions.

  13. Population Synthesis of Radio & Gamma-Ray Millisecond Pulsars

    NASA Astrophysics Data System (ADS)

    Frederick, Sara; Gonthier, P. L.; Harding, A. K.

    2014-01-01

    In recent years, the number of known gamma-ray millisecond pulsars (MSPs) in the Galactic disk has risen substantially thanks to confirmed detections by Fermi Gamma-ray Space Telescope (Fermi). We have developed a new population synthesis of gamma-ray and radio MSPs in the galaxy which uses Markov Chain Monte Carlo techniques to explore the large and small worlds of the model parameter space and allows for comparisons of the simulated and detected MSP distributions. The simulation employs empirical radio and gamma-ray luminosity models that are dependent upon the pulsar period and period derivative with freely varying exponents. Parameters associated with the birth distributions are also free to vary. The computer code adjusts the magnitudes of the model luminosities to reproduce the number of MSPs detected by a group of ten radio surveys, thus normalizing the simulation and predicting the MSP birth rates in the Galaxy. Computing many Markov chains leads to preferred sets of model parameters that are further explored through two statistical methods. Marginalized plots define confidence regions in the model parameter space using maximum likelihood methods. A secondary set of confidence regions is determined in parallel using Kuiper statistics calculated from comparisons of cumulative distributions. These two techniques provide feedback to affirm the results and to check for consistency. Radio flux and dispersion measure constraints have been imposed on the simulated gamma-ray distributions in order to reproduce realistic detection conditions. The simulated and detected distributions agree well for both sets of radio and gamma-ray pulsar characteristics, as evidenced by our various comparisons.

  14. Site occupancy models with heterogeneous detection probabilities

    USGS Publications Warehouse

    Royle, J. Andrew

    2006-01-01

    Models for estimating the probability of occurrence of a species in the presence of imperfect detection are important in many ecological disciplines. In these 'site occupancy' models, the possibility of heterogeneity in detection probabilities among sites must be considered because variation in abundance (and other factors) among sampled sites induces variation in detection probability (p). In this article, I develop occurrence probability models that allow for heterogeneous detection probabilities by considering several common classes of mixture distributions for p. For any mixing distribution, the likelihood has the general form of a zero-inflated binomial mixture for which inference based upon integrated likelihood is straightforward. A recent paper by Link (2003, Biometrics 59, 1123-1130) demonstrates that in closed population models used for estimating population size, different classes of mixture distributions are indistinguishable from data, yet can produce very different inferences about population size. I demonstrate that this problem can also arise in models for estimating site occupancy in the presence of heterogeneous detection probabilities. The implications of this are discussed in the context of an application to avian survey data and the development of animal monitoring programs.
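
    A minimal sketch of the integrated likelihood for one such mixture follows, using a beta mixing distribution for p (giving a zero-inflated beta-binomial); the parameterization and simulated data are assumptions for illustration, not the paper's analysis:

        import numpy as np
        from scipy.special import betaln, gammaln
        from scipy.optimize import minimize

        def log_betabinom(y, J, a, b):
            """log P(y | J), with detection probability p integrated over Beta(a, b)."""
            return (gammaln(J + 1) - gammaln(y + 1) - gammaln(J - y + 1)
                    + betaln(y + a, J - y + b) - betaln(a, b))

        def neg_loglik(params, y, J):
            """Zero-inflated beta-binomial: psi * BB(y) + (1 - psi) * 1{y = 0}."""
            psi = 1.0 / (1.0 + np.exp(-params[0]))          # occupancy probability
            a, b = np.exp(params[1]), np.exp(params[2])     # beta mixing parameters
            lik = psi * np.exp(log_betabinom(y, J, a, b)) + (1.0 - psi) * (y == 0)
            return -np.log(lik + 1e-300).sum()

        # Simulated data: 200 sites, 5 visits, true psi = 0.6, heterogeneous p ~ Beta(2, 4)
        rng = np.random.default_rng(3)
        J, occupied = 5, rng.random(200) < 0.6
        y = np.where(occupied, rng.binomial(J, rng.beta(2.0, 4.0, 200)), 0)

        fit = minimize(neg_loglik, x0=np.zeros(3), args=(y, J), method="Nelder-Mead")
        print("estimated psi: %.2f (truth 0.60)" % (1.0 / (1.0 + np.exp(-fit.x[0]))))

    Swapping the beta for another mixing distribution changes only log_betabinom, which is exactly how the indistinguishability problem discussed in the abstract can be explored.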

  15. Learning probability distributions from smooth observables and the maximum entropy principle: some remarks

    NASA Astrophysics Data System (ADS)

    Obuchi, Tomoyuki; Monasson, Rémi

    2015-09-01

    The maximum entropy principle (MEP) is a very useful working hypothesis in a wide variety of inference problems, ranging from biological to engineering tasks. To better understand the reasons of the success of MEP, we propose a statistical-mechanical formulation to treat the space of probability distributions constrained by the measures of (experimental) observables. In this paper we first review the results of a detailed analysis of the simplest case of randomly chosen observables. In addition, we investigate by numerical and analytical means the case of smooth observables, which is of practical relevance. Our preliminary results are presented and discussed with respect to the efficiency of the MEP.

  16. An experimental study of the surface elevation probability distribution and statistics of wind-generated waves

    NASA Technical Reports Server (NTRS)

    Huang, N. E.; Long, S. R.

    1980-01-01

    Laboratory experiments were performed to measure the surface elevation probability density function and associated statistical properties for a wind-generated wave field. The laboratory data along with some limited field data were compared. The statistical properties of the surface elevation were processed for comparison with the results derived from the Longuet-Higgins (1963) theory. It is found that, even for the highly non-Gaussian cases, the distribution function proposed by Longuet-Higgins still gives good approximations.

  17. Characterizing the Lyman-alpha forest flux probability distribution function using Legendre polynomials

    NASA Astrophysics Data System (ADS)

    Cieplak, Agnieszka; Slosar, Anze

    2017-01-01

    The Lyman-alpha forest has become a powerful cosmological probe of the underlying matter distribution at high redshift. It is a highly non-linear field with much information present beyond the two-point statistics of the power spectrum. The flux probability distribution function (PDF) in particular has been used as a successful probe of small-scale physics. In addition to the cosmological evolution however, it is also sensitive to pixel noise, spectrum resolution, and continuum fitting, all of which lead to possible biased estimators. Here we argue that measuring coefficients of the Legendre polynomial expansion of the PDF offers several advantages over the binned PDF as is commonly done. Since the n-th coefficient can be expressed as a linear combination of the first n moments of the field, this allows for the coefficients to be measured in the presence of noise and allows for a clear route towards marginalization over the mean flux. In addition, we use hydrodynamic cosmological simulations to demonstrate that in the presence of noise, a finite number of these coefficients are well measured with a very sharp transition into noise dominance. This compresses the information into a finite small number of well-measured quantities.
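
    The moment property can be sketched directly: because the n-th Legendre coefficient of a PDF on [-1, 1] is proportional to E[P_n(X)], it can be estimated as a simple sample mean. The data below are a synthetic stand-in for normalized flux values, not Lyman-alpha data:

        import numpy as np
        from numpy.polynomial import legendre

        rng = np.random.default_rng(7)
        flux = rng.beta(5.0, 2.0, 50_000)     # stand-in for normalized flux in [0, 1]
        x = 2.0 * flux - 1.0                  # map the support onto [-1, 1]

        n_max = 8
        # c_n = (2n+1)/2 * E[P_n(X)], estimated by the sample mean of P_n(x)
        coeffs = np.array([(2 * n + 1) / 2.0
                           * legendre.legval(x, np.eye(n_max + 1)[n]).mean()
                           for n in range(n_max + 1)])

        # Compare the truncated series with a histogram estimate of the PDF
        hist, edges = np.histogram(x, bins=20, range=(-1.0, 1.0), density=True)
        centers = (edges[:-1] + edges[1:]) / 2.0
        print("max |series - histogram|: %.3f"
              % np.abs(legendre.legval(centers, coeffs) - hist).max())

    Because each coefficient is a sample mean, noise propagates into it linearly, which is the tractability advantage the abstract highlights over binned PDF estimates.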

  18. Estimation of the Vertical Distribution of Radiocesium in Soil on the Basis of the Characteristics of Gamma-Ray Spectra Obtained via Aerial Radiation Monitoring Using an Unmanned Helicopter.

    PubMed

    Ochi, Kotaro; Sasaki, Miyuki; Ishida, Mutsushi; Hamamoto, Shoichiro; Nishimura, Taku; Sanada, Yukihisa

    2017-08-17

    After the Fukushima Daiichi Nuclear Power Plant accident, the vertical distribution of radiocesium in soil has been investigated to better understand the behavior of radiocesium in the environment. The typical method used for measuring the vertical distribution of radiocesium is troublesome because it requires collection and measurement of the activity of soil samples. In this study, we established a method of estimating the vertical distribution of radiocesium by focusing on the characteristics of gamma-ray spectra obtained via aerial radiation monitoring using an unmanned helicopter. The estimates are based on actual measurement data collected at an extended farm. In this method, the change in the ratio of direct gamma rays to scattered gamma rays at various depths in the soil was utilized to quantify the vertical distribution of radiocesium. The results show a positive correlation between this ratio and the actual vertical distribution of radiocesium measured in the soil samples. A vertical distribution map was created on the basis of this ratio using a simple equation derived from the abovementioned correlation. This technique can provide a novel approach for effective selection of high-priority areas that require decontamination.

  19. Estimation of the Vertical Distribution of Radiocesium in Soil on the Basis of the Characteristics of Gamma-Ray Spectra Obtained via Aerial Radiation Monitoring Using an Unmanned Helicopter

    PubMed Central

    Ochi, Kotaro; Sasaki, Miyuki; Ishida, Mutsushi; Sanada, Yukihisa

    2017-01-01

    After the Fukushima Daiichi Nuclear Power Plant accident, the vertical distribution of radiocesium in soil has been investigated to better understand the behavior of radiocesium in the environment. The typical method used for measuring the vertical distribution of radiocesium is troublesome because it requires collection and measurement of the activity of soil samples. In this study, we established a method of estimating the vertical distribution of radiocesium by focusing on the characteristics of gamma-ray spectra obtained via aerial radiation monitoring using an unmanned helicopter. The estimates are based on actual measurement data collected at an extended farm. In this method, the change in the ratio of direct gamma rays to scattered gamma rays at various depths in the soil was utilized to quantify the vertical distribution of radiocesium. The results show a positive correlation between this ratio and the actual vertical distribution of radiocesium measured in the soil samples. A vertical distribution map was created on the basis of this ratio using a simple equation derived from the abovementioned correlation. This technique can provide a novel approach for effective selection of high-priority areas that require decontamination. PMID:28817098

  20. Fitting distributions to microbial contamination data collected with an unequal probability sampling design.

    PubMed

    Williams, M S; Ebel, E D; Cao, Y

    2013-01-01

    The fitting of statistical distributions to microbial sampling data is common in quantitative microbiology and risk assessment applications. An underlying assumption of most fitting techniques is that data are collected with simple random sampling, which is often not the case. This study develops a weighted maximum likelihood estimation framework that is appropriate for microbiological samples collected with unequal probabilities of selection. Two examples, based on the collection of food samples during processing, are provided to demonstrate the method and to highlight the magnitude of the bias in the maximum likelihood estimator when data are inappropriately treated as a simple random sample. Failure to weight samples properly to account for how data are collected can introduce substantial biases into inferences drawn from the data. The proposed methodology will reduce or eliminate an important source of bias in inferences drawn from the analysis of microbial data. This will also make comparisons between studies and the combination of results from different studies more reliable, which is important for risk assessment applications. © 2012 No claim to US Government works.
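
    A minimal sketch of the weighted (pseudo-)likelihood idea, with hypothetical data and inverse-selection-probability weights, shows the bias that arises when an unequal-probability sample is treated as a simple random sample:

        import numpy as np
        from scipy import stats
        from scipy.optimize import minimize

        rng = np.random.default_rng(11)
        conc = rng.lognormal(mean=1.0, sigma=0.8, size=500)     # true concentrations
        # Unequal-probability design: higher concentrations more likely to be sampled
        p_select = np.clip(conc / conc.max(), 0.05, 1.0)
        keep = rng.random(500) < p_select
        y, w = conc[keep], 1.0 / p_select[keep]                 # data and design weights

        def neg_ll(theta, weights):
            """(Weighted) lognormal negative log-likelihood, up to a constant."""
            mu, log_sigma = theta
            return -(weights * stats.norm.logpdf(np.log(y), mu, np.exp(log_sigma))).sum()

        naive = minimize(neg_ll, [0.0, 0.0], args=(np.ones_like(y),), method="Nelder-Mead")
        weighted = minimize(neg_ll, [0.0, 0.0], args=(w,), method="Nelder-Mead")
        print("naive mu-hat %.2f vs weighted mu-hat %.2f (truth 1.0)"
              % (naive.x[0], weighted.x[0]))

    The unweighted fit is pulled upward because large concentrations are over-represented; weighting each term by the inverse selection probability restores an approximately unbiased estimate.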

  1. Probability density functions of power-in-bucket and power-in-fiber for an infrared laser beam propagating in the maritime environment.

    PubMed

    Nelson, Charles; Avramov-Zamurovic, Svetlana; Korotkova, Olga; Malek-Madani, Reza; Sova, Raymond; Davidson, Frederic

    2013-11-01

    Irradiance fluctuations of an infrared laser beam from a shore-to-ship data link ranging from 5.1 to 17.8 km are compared to lognormal (LN), gamma-gamma (GG) with aperture averaging, and gamma-Laguerre (GL) distributions. From our data analysis, the LN and GG probability density function (PDF) models were generally in good agreement in near-weak to moderate fluctuation conditions. This was also true in moderate to strong fluctuations when the spatial coherence radius was smaller than the detector aperture size, with the exception of the 2.54 cm power-in-bucket (PIB), where the LN PDF model fit best. For moderate to strong fluctuations, the GG PDF model tended to outperform the LN PDF model when the spatial coherence radius was greater than the detector aperture size. Additionally, the GL PDF model had the best or next-to-best overall fit in all cases, with the exception of the 2.54 cm PIB, where the scintillation index was highest. The GL PDF model also appears to be robust for off-beam-center laser applications.
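
    For reference, the LN and GG densities in their standard unit-mean forms can be evaluated as follows; the parameter values are assumed examples, not the fitted link parameters from the study:

        import numpy as np
        from scipy.integrate import trapezoid
        from scipy.special import gammaln, kv

        def lognormal_pdf(I, sigma2):
            """Unit-mean LN PDF; the scintillation index is exp(sigma2) - 1."""
            return (np.exp(-(np.log(I) + sigma2 / 2.0) ** 2 / (2.0 * sigma2))
                    / (I * np.sqrt(2.0 * np.pi * sigma2)))

        def gamma_gamma_pdf(I, a, b):
            """Unit-mean GG PDF: 2(ab)^((a+b)/2)/(G(a)G(b)) I^((a+b)/2-1) K_{a-b}(2 sqrt(ab I))."""
            logc = np.log(2.0) + (a + b) / 2.0 * np.log(a * b) - gammaln(a) - gammaln(b)
            return (np.exp(logc + ((a + b) / 2.0 - 1.0) * np.log(I))
                    * kv(a - b, 2.0 * np.sqrt(a * b * I)))

        I = np.linspace(0.01, 6.0, 600)
        for tag, pdf in (("LN", lognormal_pdf(I, 0.2)), ("GG", gamma_gamma_pdf(I, 4.0, 2.0))):
            print("%s: integral %.3f, mean %.3f over [0.01, 6]"
                  % (tag, trapezoid(pdf, I), trapezoid(I * pdf, I)))

    Both integrals should be close to one (small truncation error aside), which makes this a convenient starting point for comparing fitted PDFs against measured irradiance histograms.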

  2. The gamma cycle.

    PubMed

    Fries, Pascal; Nikolić, Danko; Singer, Wolf

    2007-07-01

    Activated neuronal groups typically engage in rhythmic synchronization in the gamma-frequency range (30-100 Hz). Experimental and modeling studies demonstrate that each gamma cycle is framed by synchronized spiking of inhibitory interneurons. Here, we review evidence suggesting that the resulting rhythmic network inhibition interacts with excitatory input to pyramidal cells such that the more excited cells fire earlier in the gamma cycle. Thus, the amplitude of excitatory drive is recoded into phase values of discharges relative to the gamma cycle. This recoding enables transmission and read out of amplitude information within a single gamma cycle without requiring rate integration. Furthermore, variation of phase relations can be exploited to facilitate or inhibit exchange of information between oscillating cell assemblies. The gamma cycle could thus serve as a fundamental computational mechanism for the implementation of a temporal coding scheme that enables fast processing and flexible routing of activity, supporting fast selection and binding of distributed responses. This review is part of the INMED/TINS special issue Physiogenic and pathogenic oscillations: the beauty and the beast, based on presentations at the annual INMED/TINS symposium (http://inmednet.com).

  3. On the probability distribution function of the mass surface density of molecular clouds. II.

    NASA Astrophysics Data System (ADS)

    Fischera, Jörg

    2014-11-01

    The probability distribution function (PDF) of the mass surface density of molecular clouds provides essential information about the structure of molecular cloud gas and condensed structures out of which stars may form. In general, the PDF shows two basic components: a broad distribution around the maximum with resemblance to a log-normal function, and a tail at high mass surface densities attributed to turbulence and self-gravity. In a previous paper, the PDF of condensed structures was analyzed and an analytical formula presented based on a truncated radial density profile, ρ(r) = ρc/(1 + (r/r0)²)^(n/2), with central density ρc and inner radius r0, widely used in astrophysics as a generalization of physical density profiles. In this paper, the results are applied to analyze the PDF of self-gravitating, isothermal, pressurized, spherical (Bonnor-Ebert spheres) and cylindrical condensed structures, with emphasis on the dependence of the PDF on the external pressure pext and on the overpressure q⁻¹ = pc/pext, where pc is the central pressure. Apart from individual clouds, we also consider ensembles of spheres or cylinders, where effects caused by a variation of the pressure ratio, a distribution of condensed cores within a turbulent gas, and (in the case of cylinders) a distribution of inclination angles on the mean PDF are analyzed. The probability distribution of pressure ratios q⁻¹ is assumed to be given by P(q⁻¹) ∝ q^(−k₁)/(1 + (q₀/q)^γ)^((k₁+k₂)/γ), where k₁, γ, k₂, and q₀ are fixed parameters. The PDF of individual spheres with overpressures below ~100 is well represented by the PDF of a sphere with an analytical density profile with n = 3. At higher pressure ratios, the PDF at mass surface densities Σ ≪ Σ(0), where Σ(0) is the central mass surface density, asymptotically approaches the PDF of a sphere with n = 2. Consequently, the power-law asymptote at mass surface densities above the peak steepens from Psph(Σ) ∝ Σ⁻² to Psph(Σ) ∝ Σ⁻³. The

  4. Eruption probabilities for the Lassen Volcanic Center and regional volcanism, northern California, and probabilities for large explosive eruptions in the Cascade Range

    USGS Publications Warehouse

    Nathenson, Manuel; Clynne, Michael A.; Muffler, L.J. Patrick

    2012-01-01

    Chronologies for eruptive activity of the Lassen Volcanic Center and for eruptions from the regional mafic vents in the surrounding area of the Lassen segment of the Cascade Range are here used to estimate probabilities of future eruptions. For the regional mafic volcanism, the ages of many vents are known only within broad ranges, and two models are developed that should bracket the actual eruptive ages. These chronologies are used with exponential, Weibull, and mixed-exponential probability distributions to match the data for time intervals between eruptions. For the Lassen Volcanic Center, the probability of an eruption in the next year is 1.4×10⁻⁴ for the exponential distribution and 2.3×10⁻⁴ for the mixed-exponential distribution. For the regional mafic vents, the exponential distribution gives a probability of an eruption in the next year of 6.5×10⁻⁴, but the mixed-exponential distribution indicates that the current probability, 12,000 years after the last event, could be significantly lower. For the exponential distribution, the highest probability is for an eruption from a regional mafic vent. Data on areas and volumes of lava flows and domes of the Lassen Volcanic Center and of eruptions from the regional mafic vents provide constraints on the probable sizes of future eruptions. Probabilities of lava-flow coverage are similar for the Lassen Volcanic Center and for regional mafic vents, whereas the probable eruptive volumes for the mafic vents are generally smaller. Data have been compiled for large explosive eruptions (≳5 km³ in deposit volume) in the Cascade Range during the past 1.2 m.y. in order to estimate probabilities of eruption. For erupted volumes ≳5 km³, the rate of occurrence since 13.6 ka is much higher than for the entire period, and we use these data to calculate the annual probability of a large eruption at 4.6×10⁻⁴. For erupted volumes ≥10 km³, the rate of occurrence has been reasonably constant from 630 ka to the present, giving
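
    The arithmetic connecting these annual probabilities to an exponential recurrence model is easy to check; the sketch below uses the annual probabilities quoted above and an assumed 30-year window:

        import math

        # Annual probabilities quoted above: LVC exponential, LVC mixed-exponential,
        # regional mafic vents (exponential), and large Cascade explosive eruptions
        for p_annual in (1.4e-4, 2.3e-4, 6.5e-4, 4.6e-4):
            tau = 1.0 / p_annual                  # implied mean recurrence interval (yr),
                                                  # since 1 - exp(-1/tau) ~ 1/tau here
            p_30 = 1.0 - math.exp(-30.0 / tau)    # probability within a 30-yr window
            print("annual %.1e -> recurrence ~%6.0f yr, 30-yr probability %.4f"
                  % (p_annual, tau, p_30))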

  5. On Probability Domains IV

    NASA Astrophysics Data System (ADS)

    Frič, Roman; Papčo, Martin

    2017-12-01

    Stressing a categorical approach, we continue our study of fuzzified domains of probability, in which classical random events are replaced by measurable fuzzy random events. In operational probability theory (S. Bugajski) classical random variables are replaced by statistical maps (generalized distribution maps induced by random variables) and in fuzzy probability theory (S. Gudder) the central role is played by observables (maps between probability domains). We show that to each of the two generalized probability theories there corresponds a suitable category and the two resulting categories are dually equivalent. Statistical maps and observables become morphisms. A statistical map can send a degenerated (pure) state to a non-degenerated one —a quantum phenomenon and, dually, an observable can map a crisp random event to a genuine fuzzy random event —a fuzzy phenomenon. The dual equivalence means that the operational probability theory and the fuzzy probability theory coincide and the resulting generalized probability theory has two dual aspects: quantum and fuzzy. We close with some notes on products and coproducts in the dual categories.

  6. Bayesian Probability Theory

    NASA Astrophysics Data System (ADS)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  7. Photocounting distributions for exponentially decaying sources.

    PubMed

    Teich, M C; Card, H C

    1979-05-01

    Exact photocounting distributions are obtained for a pulse of light whose intensity is exponentially decaying in time, when the underlying photon statistics are Poisson. It is assumed that the starting time for the sampling interval (which is of arbitrary duration) is uniformly distributed. The probability of registering n counts in the fixed time T is given in terms of the incomplete gamma function for n >/= 1 and in terms of the exponential integral for n = 0. Simple closed-form expressions are obtained for the count mean and variance. The results are expected to be of interest in certain studies involving spontaneous emission, radiation damage in solids, and nuclear counting. They will also be useful in neurobiology and psychophysics, since habituation and sensitization processes may sometimes be characterized by the same stochastic model.
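
    The setup lends itself to a quick Monte Carlo check (the parameter values are assumed, not from the paper): conditioning on a uniformly distributed start time makes the counts a mixed Poisson process, so the variance exceeds the mean.

        import numpy as np

        rng = np.random.default_rng(5)
        I0, tau, T, window = 50.0, 1.0, 0.5, 5.0     # peak rate, decay time, gate, start range

        t0 = rng.uniform(0.0, window, 1_000_000)     # uniformly distributed start times
        # Integrated intensity over [t0, t0 + T] for I(t) = I0 exp(-t / tau)
        mean_counts = I0 * tau * np.exp(-t0 / tau) * (1.0 - np.exp(-T / tau))
        n = rng.poisson(mean_counts)                 # doubly stochastic (mixed Poisson) counts

        print("P(n = 0) = %.4f" % (n == 0).mean())
        print("count mean %.3f, variance %.3f (variance > mean: mixed-Poisson bunching)"
              % (n.mean(), n.var()))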

  8. Normal probability plots with confidence.

    PubMed

    Chantarangsi, Wanpen; Liu, Wei; Bretz, Frank; Kiatsupaibul, Seksan; Hayter, Anthony J; Wan, Fang

    2015-01-01

    Normal probability plots are widely used as a statistical tool for assessing whether an observed simple random sample is drawn from a normally distributed population. The users, however, have to judge subjectively, if no objective rule is provided, whether the plotted points fall close to a straight line. In this paper, we focus on how a normal probability plot can be augmented by intervals for all the points so that, if the population distribution is normal, then all the points should fall into the corresponding intervals simultaneously with probability 1-α. These simultaneous 1-α probability intervals therefore provide an objective means to judge whether the plotted points fall close to the straight line: the plotted points fall close to the straight line if and only if all the points fall into the corresponding intervals. The powers of several normal-probability-plot-based (graphical) tests and the most popular nongraphical Anderson-Darling and Shapiro-Wilk tests are compared by simulation. Based on this comparison, recommendations are given in Section 3 on which graphical tests should be used in what circumstances. An example is provided to illustrate the methods. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
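
    The authors' exact interval construction is not reproduced here; the following minimal sketch conveys the idea by Monte Carlo: per-order-statistic intervals are widened (by shrinking their pointwise level) until all n plotted points fall inside them simultaneously with probability 1-α under normality. The sample size, replication count, and the 0.9 shrink factor are arbitrary choices.

      import numpy as np

      rng = np.random.default_rng(0)
      n, reps, alpha = 30, 20000, 0.05
      # Order statistics of many standard normal samples of size n
      sims = np.sort(rng.standard_normal((reps, n)), axis=1)

      def coverage(gamma):
          # Simultaneous coverage of pointwise level-(1 - gamma) intervals
          lo = np.quantile(sims, gamma / 2, axis=0)
          hi = np.quantile(sims, 1 - gamma / 2, axis=0)
          return np.mean(np.all((sims >= lo) & (sims <= hi), axis=1))

      gamma = alpha
      while coverage(gamma) < 1 - alpha:   # shrink the pointwise level until the
          gamma *= 0.9                     # simultaneous coverage reaches 1 - alpha
      print(f"pointwise level {gamma:.4f} -> simultaneous coverage {coverage(gamma):.3f}")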

  9. Gamma-ray luminosity and photon index evolution of FSRQ blazars and contribution to the gamma-ray background

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singal, J.; Ko, A.; Petrosian, V., E-mail: jsingal@richmond.edu

    We present the redshift evolutions and distributions of the gamma-ray luminosity and photon spectral index of flat spectrum radio quasar (FSRQ) type blazars, using non-parametric methods to obtain the evolutions and distributions directly from the data. The sample we use for analysis consists of almost all FSRQs observed with a greater than approximately 7σ detection threshold in the first-year catalog of the Fermi Gamma-ray Space Telescope's Large Area Telescope, with redshifts as determined from optical spectroscopy by Shaw et al. We find that FSRQs undergo rapid gamma-ray luminosity evolution, but negligible photon index evolution, with redshift. With these evolutions accounted for, we determine the density evolution and luminosity function of FSRQs and calculate their total contribution to the extragalactic gamma-ray background radiation, resolved and unresolved, which is found to be 16(+10/–4)%, in agreement with previous studies.

  10. Solar Gamma Rays Above 8 MeV

    NASA Technical Reports Server (NTRS)

    Crannell, C. J.; Crannell, H.; Ramaty, R.

    1978-01-01

    Processes which lead to the production of gamma rays with energy greater than 8 MeV in solar flares are reviewed and evaluated. Excited states produced by inelastic scattering, charge exchange, and spallation reactions in the abundant nuclear species are considered in order to identify nuclear lines which may contribute to the gamma-ray spectrum of solar flares. The flux of 15.11 MeV gamma rays relative to the flux of 4.44 MeV gamma rays from the de-excitation of the corresponding states in C12 is calculated for a number of assumed distributions of exciting particles. This flux ratio is a sensitive diagnostic of accelerated particle spectra. Other high energy nuclear levels are not so isolated as the 15.11 MeV state and are not expected to be so strong. The spectrum of gamma rays from pion decay is sensitive to the energy distribution of particles accelerated to energies greater than 100 MeV.

  11. High throughput nonparametric probability density estimation.

    PubMed

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under- and overfitting the data as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.

  12. High throughput nonparametric probability density estimation

    PubMed Central

    Farmer, Jenny

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under- and overfitting the data as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference. PMID:29750803

  13. Is a data set distributed as a power law? A test, with application to gamma-ray burst brightnesses

    NASA Technical Reports Server (NTRS)

    Wijers, Ralph A. M. J.; Lubin, Lori M.

    1994-01-01

    We present a method to determine whether an observed sample of data is drawn from a parent distribution that is pure power law. The method starts from a class of statistics which have zero expectation value under the null hypothesis, H(sub 0), that the distribution is a pure power law: F(x) varies as x(exp -alpha). We study one simple member of the class, named the `bending statistic' B, in detail. It is most effective for detecting a type of deviation from a power law where the power-law slope varies slowly and monotonically as a function of x. Our estimator of B has a distribution under H(sub 0) that depends only on the size of the sample, not on the parameters of the parent population, and is approximated well by a normal distribution even for modest sample sizes. The bending statistic can therefore be used to test whether a set of numbers is drawn from any power-law parent population. Since many measurable quantities in astrophysics have distributions that are approximately power laws, and since deviations from the ideal power law often provide interesting information about the object of study (e.g., a `bend' or `break' in a luminosity function, a line in an X- or gamma-ray spectrum), we believe that a test of this type will be useful in many different contexts. In the present paper, we apply our test to various subsamples of gamma-ray burst brightnesses from the first-year Burst and Transient Source Experiment (BATSE) catalog and show that we can only marginally detect the expected steepening of the log N(>C(sub max)) - log C(sub max) distribution.
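
    The bending statistic B itself is not defined in the abstract, so it is not reproduced here. As a crude stand-in for the same idea (detecting a slowly varying power-law slope), the sketch below compares maximum-likelihood power-law indices fitted to the lower and upper halves of a sample and calibrates the difference against pure power-law simulations; the index 2.0, sample size, and replication count are arbitrary.

      import numpy as np

      rng = np.random.default_rng(1)

      def mle_index(x, xmin):
          # MLE of alpha for a Pareto density f(x) ~ x^(-alpha), x >= xmin
          return 1.0 + len(x) / np.log(x / xmin).sum()

      def bend(x):
          # Crude 'bending' diagnostic: index fitted below vs. above the sample median
          x = np.sort(x)
          med = x[len(x) // 2]
          lower, upper = x[x < med], x[x >= med]
          return mle_index(lower, x[0]) - mle_index(upper, med)

      # Null distribution of the diagnostic under a pure power law (alpha = 2.0 assumed)
      n = 500
      null = np.array([bend(rng.pareto(1.0, n) + 1.0) for _ in range(2000)])
      obs = bend(rng.pareto(1.0, n) + 1.0)  # replace with real data, e.g. burst brightnesses
      p = np.mean(np.abs(null) >= np.abs(obs))
      print(f"two-sided p-value under H0: {p:.3f}")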

  14. Measurement of the 157Gd(n,γ) reaction with the DANCE γ calorimeter array

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chyzh, A.; Dashdorj, D.; Lawrence Livermore National Laboratory, Livermore, California 94551

    2011-07-15

    The 157Gd(n,γ) reaction was measured with the DANCE γ calorimeter (consisting of 160 BaF2 scintillation detectors) at the Los Alamos Neutron Science Center. The multiplicity distributions of the γ decay were used to determine the resonance spins up to En = 300 eV. The γ-ray energy spectra for different multiplicities were measured for the s-wave resonances. The shapes of these spectra were compared with simulations based on the use of the DICEBOX statistical model code. Simulations showed that the scissors mode is required not only for the ground-state transitions but also for transitions between excited states.

  15. Bayesian Hierarchical Random Intercept Model Based on Three Parameter Gamma Distribution

    NASA Astrophysics Data System (ADS)

    Wirawati, Ika; Iriawan, Nur; Irhamah

    2017-06-01

    Hierarchical data structures are common throughout many areas of research, although their presence in data has often gone unrecognized in analysis. The appropriate statistical analysis for this type of data is the hierarchical linear model (HLM). This article focuses on the random intercept model (RIM), a subclass of HLM, which assumes that the intercepts of the models at the lowest level vary among those models while their slopes are fixed. The differences between intercepts are suspected to be affected by variables at the upper level, so the intercepts are regressed against those upper-level variables as predictors. This paper demonstrates the proposed two-level RIM by modeling per capita household expenditure in Maluku Utara, with five characteristics at the first level and three characteristics of districts/cities at the second level. The per capita household expenditure data at the first level are captured by a three-parameter Gamma distribution. The model is therefore rather complex, owing to the interaction of the many parameters representing the hierarchical structure and the distributional pattern of the data. To simplify the estimation of parameters, a computational Bayesian method coupled with a Markov Chain Monte Carlo (MCMC) algorithm and Gibbs sampling is employed.
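
    The paper's Bayesian MCMC machinery is not reproduced here. A minimal frequentist sketch of the same two-level structure, on synthetic data: scipy's gamma distribution is already a three-parameter family (shape, location, scale), so a location can be fitted per district and then regressed on a district-level predictor. All parameter values are invented.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(2)

      # Synthetic two-level data standing in for household expenditure:
      # each district j has its own intercept driven by a district-level predictor z_j
      n_groups, n_per = 8, 200
      z = rng.normal(size=n_groups)                   # upper-level predictor
      true_intercept = 3.0 + 1.5 * z                  # varies across districts
      data = [true_intercept[j] + rng.gamma(2.0, 1.0, n_per) for j in range(n_groups)]

      # Level 1: fit a three-parameter gamma (shape, loc, scale) in each district;
      # the fitted location plays the role of the random intercept
      locs = np.array([stats.gamma.fit(y)[1] for y in data])

      # Level 2: regress the estimated intercepts on the district-level predictor
      res = stats.linregress(z, locs)
      print(f"recovered level-2 slope ~ {res.slope:.2f} (true value 1.5)")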

  16. Galactic plane gamma-radiation

    NASA Technical Reports Server (NTRS)

    Hartman, R. C.; Kniffen, D. A.; Thompson, D. J.; Fichtel, C. E.; Ogelman, H. B.; Tumer, T.; Ozel, M. E.

    1979-01-01

    Analysis of the SAS 2 data together with the COS B results shows that the distribution of galactic gamma-radiation has several similarities to that of other large-scale tracers of galactic structure. The radiation is primarily confined to a thin disc which exhibits offsets from b = 0 degrees similar to the warping seen at radio frequencies. The principal distinction of the gamma-radiation is a stronger contrast in intensity between the region from 310 to 45 degrees in longitude and the regions away from the center, which can be attributed to a variation in cosmic-ray density as a function of position in the Galaxy. The diffuse galactic gamma-ray energy spectrum shows no significant variation with direction, and the spectrum seen along the plane is the same as that for the galactic component of the gamma-radiation at high latitudes. The uniformity of the galactic gamma-ray spectrum, the smooth decrease in intensity as a function of latitude, and the absence of any galactic gamma-ray sources at high latitudes indicate a diffuse origin for the bulk of the galactic gamma-radiation rather than a collection of localized sources.

  17. Gamma-radiation effects on luminescence properties of Eu3+ activated LaPO4 phosphor

    NASA Astrophysics Data System (ADS)

    Vujčić, Ivica; Gavrilović, Tamara; Sekulić, Milica; Mašić, Slobodan; Putić, Slaviša; Papan, Jelena; Dramićanin, Miroslav D.

    2018-05-01

    Eu3+ activated LaPO4 phosphors were prepared by a high-temperature solid-state method and irradiated with gamma-radiation at different high doses in the 0-4 MGy range. No effects of high doses of high-energy radiation on the phosphor's morphology and structure were observed, as documented by electron microscopy and X-ray diffraction measurements. On the other hand, photoluminescence measurements showed that the emission properties of the phosphor were affected by gamma-radiation, with changes in radiative properties being prominent for absorbed radiation doses up to 250 kGy, after which no additional changes are observed. Judd-Ofelt analysis of the emission spectra is performed to thoroughly investigate the radiative properties of the phosphors. The analysis showed that the radiative transition probability of Eu3+ emission decreases while the non-radiative probability increases upon gamma-irradiation. The quantum efficiency of emission decreases from about 46% to 35% when Eu3+ doped LaPO4 powders are exposed to a gamma-radiation dose of 250 kGy, showing no additional decrease for higher gamma-radiation doses.

  18. Modelling rainfall amounts using mixed-gamma model for Kuantan district

    NASA Astrophysics Data System (ADS)

    Zakaria, Roslinazairimah; Moslim, Nor Hafizah

    2017-05-01

    An efficient design of flood mitigation and construction of crop growth models depend upon a good understanding of the rainfall process and its characteristics. The gamma distribution is usually used to model nonzero rainfall amounts. In this study, the mixed-gamma model is applied to accommodate both zero and nonzero rainfall amounts. The mixed-gamma model presented is for the independent case. The formulae of mean and variance are derived for the sum of two and three independent mixed-gamma variables, respectively. Firstly, the gamma distribution is used to model the nonzero rainfall amounts and the parameters of the distribution (shape and scale) are estimated using the maximum likelihood estimation method. Then, the mixed-gamma model is defined for both zero and nonzero rainfall amounts simultaneously. The formulae of mean and variance derived for the sum of two and three independent mixed-gamma variables are tested using the monthly rainfall amounts from rainfall stations within Kuantan district in Pahang, Malaysia. Based on the Kolmogorov-Smirnov goodness of fit test, the results demonstrate that the descriptive statistics of the observed sum of rainfall amounts are not significantly different at the 5% significance level from the generated sum of independent mixed-gamma variables. The methodology and formulae demonstrated can be applied to find the sum of more than three independent mixed-gamma variables.
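
    A minimal sketch of the mixed-gamma (zero-inflated gamma) model described above, assuming independence: the moment formulae follow from conditioning on a wet/dry Bernoulli indicator, and for sums of independent variables the means and variances simply add. All rainfall numbers are synthetic.

      import numpy as np
      from scipy import stats

      def fit_mixed_gamma(x):
          # Zero-inflated gamma: P(X = 0) = 1 - p; X | X > 0 ~ Gamma(shape, scale)
          wet = x[x > 0]
          p = wet.size / x.size
          shape, _, scale = stats.gamma.fit(wet, floc=0)  # MLE with location fixed at 0
          return p, shape, scale

      def mixed_gamma_moments(p, shape, scale):
          m = p * shape * scale                                        # E[X]
          v = p * shape * scale**2 + p * (1 - p) * (shape * scale)**2  # Var[X]
          return m, v

      # Means and variances of a sum of independent mixed-gamma variables add,
      # which is the property the paper tests on station data
      rng = np.random.default_rng(3)
      x1 = np.where(rng.random(5000) < 0.6, rng.gamma(2.0, 3.0, 5000), 0.0)
      x2 = np.where(rng.random(5000) < 0.4, rng.gamma(1.5, 5.0, 5000), 0.0)
      m1, v1 = mixed_gamma_moments(*fit_mixed_gamma(x1))
      m2, v2 = mixed_gamma_moments(*fit_mixed_gamma(x2))
      print(m1 + m2, (x1 + x2).mean())  # fitted vs. empirical mean of the sum
      print(v1 + v2, (x1 + x2).var())   # fitted vs. empirical variance of the sum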

  19. Combined Effects of Gamma Radiation and High Dietary Iron on Peripheral Leukocyte Distribution and Function

    NASA Technical Reports Server (NTRS)

    Crucian, Brian E.; Morgan, Jennifer L. L.; Quiriarte, Heather A.; Sams, Clarence F.; Smith, Scott M.; Zwart, Sara R.

    2012-01-01

    Both radiation and increased iron stores can independently increase oxidative damage, resulting in protein, lipid and DNA oxidation. Oxidative stress increases the risk of many health problems including cancer, cataracts, and heart disease. This study, a subset of a larger interdisciplinary investigation of the combined effect of iron overload on sensitivity to radiation injury, monitored immune parameters in the peripheral blood of rats subjected to gamma radiation, high dietary iron or both. Specific immune measures consisted of: (1) peripheral leukocyte distribution, (2) plasma cytokine levels and (3) cytokine production profiles following whole blood mitogenic stimulation

  20. RoboPol: the optical polarization of gamma-ray-loud and gamma-ray-quiet blazars

    NASA Astrophysics Data System (ADS)

    Angelakis, E.; Hovatta, T.; Blinov, D.; Pavlidou, V.; Kiehlmann, S.; Myserlis, I.; Böttcher, M.; Mao, P.; Panopoulou, G. V.; Liodakis, I.; King, O. G.; Baloković, M.; Kus, A.; Kylafis, N.; Mahabal, A.; Marecki, A.; Paleologou, E.; Papadakis, I.; Papamastorakis, I.; Pazderski, E.; Pearson, T. J.; Prabhudesai, S.; Ramaprakash, A. N.; Readhead, A. C. S.; Reig, P.; Tassis, K.; Urry, M.; Zensus, J. A.

    2016-12-01

    We present average R-band optopolarimetric data, as well as variability parameters, from the first and second RoboPol observing season. We investigate whether gamma-ray-loud and gamma-ray-quiet blazars exhibit systematic differences in their optical polarization properties. We find that gamma-ray-loud blazars have a systematically higher polarization fraction (0.092) than gamma-ray-quiet blazars (0.031), with the hypothesis of the two samples being drawn from the same distribution of polarization fractions being rejected at the 3σ level. We have not found any evidence that this discrepancy is related to differences in the redshift distribution, rest-frame R-band luminosity density, or the source classification. The median polarization fraction versus synchrotron-peak-frequency plot shows an envelope implying that high-synchrotron-peaked sources have a smaller range of median polarization fractions concentrated around lower values. Our gamma-ray-quiet sources show similar median polarization fractions although they are all low-synchrotron-peaked. We also find that the randomness of the polarization angle depends on the synchrotron peak frequency. For high-synchrotron-peaked sources, it tends to concentrate around preferred directions while for low-synchrotron-peaked sources, it is more variable and less likely to have a preferred direction. We propose a scenario which mediates efficient particle acceleration in shocks and increases the helical B-field component immediately downstream of the shock.

  1. mrpy: Renormalized generalized gamma distribution for HMF and galaxy ensemble properties comparisons

    NASA Astrophysics Data System (ADS)

    Murray, Steven G.; Robotham, Aaron S. G.; Power, Chris

    2018-02-01

    mrpy calculates the MRP parameterization of the Halo Mass Function. It calculates basic statistics of the truncated generalized gamma distribution (TGGD) with the TGGD class, including mean, mode, variance, skewness, pdf, and cdf. It generates MRP quantities with the MRP class, such as differential number counts and cumulative number counts, and offers various methods for generating normalizations. It can generate the MRP-based halo mass function as a function of physical parameters via the mrp_b13 function, and fit MRP parameters to data in the form of arbitrary curves and in the form of a sample of variates with the SimFit class. mrpy also calculates analytic hessians and jacobians at any point, and allows the user to alternate parameterizations of the same form via the reparameterize module.
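
    mrpy's own classes (TGGD, MRP, SimFit) are not reproduced here; as an illustration of the central object, the sketch below evaluates a truncated generalized gamma density with a numerically computed normalization, using MRP-like parameter values chosen purely for illustration.

      import numpy as np
      from scipy.integrate import quad

      # Hypothetical MRP-like parameters (not fitted values from the paper):
      Hs, alpha, beta, mmin = 1e14, -1.9, 0.75, 1e10  # scale mass, slopes, truncation mass

      def tggd_pdf(m):
          # Truncated generalized gamma density:
          # f(m) ~ (m/Hs)^alpha * exp(-(m/Hs)^beta) for m >= mmin, normalized to 1
          x, xmin = m / Hs, mmin / Hs
          g = lambda t: t**alpha * np.exp(-t**beta)
          norm = quad(g, xmin, 1.0)[0] + quad(g, 1.0, np.inf)[0]  # split for stability
          return np.where(m >= mmin, g(x) / (norm * Hs), 0.0)

      masses = np.logspace(10, 15, 6)
      print(tggd_pdf(masses))  # density per unit mass; integrates to 1 over [mmin, inf)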

  2. Void probability as a function of the void's shape and scale-invariant models. [in studies of spatial galactic distribution]

    NASA Technical Reports Server (NTRS)

    Elizalde, E.; Gaztanaga, E.

    1992-01-01

    The dependence of counts in cells on the shape of the cell for the large-scale galaxy distribution is studied. A very concrete prediction can be made concerning the void distribution for scale-invariant models. The prediction is tested on a sample of the CfA catalog, and good agreement is found. It is observed that the probability that a cell is occupied is larger for certain elongated cells. A phenomenological scale-invariant model for the observed distribution of the counts in cells, an extension of the negative binomial distribution, is presented in order to illustrate how this dependence can be quantitatively determined. An original, intuitive derivation of this model is presented.

  3. Prompt gamma-ray imaging for small animals

    NASA Astrophysics Data System (ADS)

    Xu, Libai

    Small animal imaging is recognized as a powerful discovery tool for small animal modeling of human diseases, providing an important clue to the complete understanding of disease mechanisms and helping researchers develop and test new treatments. The current small animal imaging techniques include positron emission tomography (PET), single photon emission tomography (SPECT), computed tomography (CT), magnetic resonance imaging (MRI), and ultrasound (US). A new imaging modality called prompt gamma-ray imaging (PGI) has been identified and investigated primarily by Monte Carlo simulation. Currently it is suggested for use on small animals. This new technique could greatly enhance and extend the present capabilities of PET and SPECT imaging from ingested radioisotopes to the imaging of selected non-radioactive elements, such as Gd, Cd, Hg, and B, and has great potential for use in neutron cancer therapy to monitor the neutron distribution and the neutron-capture agent distribution. The approach consists of irradiating small animals in the thermal neutron beam of a nuclear reactor to produce prompt gamma rays from the elements in the sample by the radiative capture (n, gamma) reaction. These prompt gamma rays are emitted at energies that are characteristic of each element, and they are also produced in characteristic coincident chains. After measuring these prompt gamma rays with a surrounding spectrometry array, the distribution of each element of interest in the sample is reconstructed from the mapping of each detected signature gamma ray by either electronic or mechanical collimation. In addition, the transmitted neutrons from the beam can be simultaneously used for very sensitive anatomical imaging, which provides the registration for the elemental distributions obtained from PGI. The primary approach is to use Monte Carlo simulation methods either with the specific purpose code CEARCPG, developed at NC State University, or with the general purpose

  4. The probability distribution of side-chain conformations in [Leu] and [Met]enkephalin determines the potency and selectivity to mu and delta opiate receptors.

    PubMed

    Nielsen, Bjørn G; Jensen, Morten Ø; Bohr, Henrik G

    2003-01-01

    The structure of enkephalin, a small neuropeptide with five amino acids, has been simulated on computers using molecular dynamics. Such simulations exhibit a few stable conformations, which also have been identified experimentally. The simulations provide the possibility to perform cluster analysis in the space defined by potentially pharmacophoric measures such as dihedral angles, side-chain orientation, etc. By analyzing the statistics of the resulting clusters, the probability distribution of the side-chain conformations may be determined. These probabilities allow us to predict the selectivity of [Leu]enkephalin and [Met]enkephalin to the known mu- and delta-type opiate receptors to which they bind as agonists. Other plausible consequences of these probability distributions are discussed in relation to the way in which they may influence the dynamics of the synapse. Copyright 2003 Wiley Periodicals, Inc. Biopolymers (Pept Sci) 71: 577-592, 2003

  5. Fermi GBM Observations of Terrestrial Gamma Flashes

    NASA Technical Reports Server (NTRS)

    Wilson-Hodge, Colleen A.; Briggs, M. S.; Connaughton, V.; Fishman, G. J.; Bhat, P. N.; Paciesas, W. S.; Preece, R. D.; Kippen, R. M.; vonKienlin, A.; Dwyer, J. R.; hide

    2010-01-01

    In its first two years of operation, the Fermi Gamma Ray Burst Monitor (GBM) has observed 79 Terrestrial Gamma Flashes (TGFs). The thick Bismuth Germanate (BGO) detectors are excellent for TGF spectroscopy, having a high probability of recording the full energy of an incident photon, spanning a broad energy range from 150 keV to 40 MeV, and recording a large number of photons per TGF. Correlations between GBM TGF triggers and lightning sferics detected with the World-Wide Lightning Location Network indicate that TGFs and lightning are simultaneous to within tens of microseconds.

  6. Benchmarking PARTISN with Analog Monte Carlo: Moments of the Neutron Number and the Cumulative Fission Number Probability Distributions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Rourke, Patrick Francis

    The purpose of this report is to provide the reader with an understanding of how a Monte Carlo neutron transport code was written, developed, and evolved to calculate the probability distribution functions (PDFs) and their moments for the neutron number at a final time as well as the cumulative fission number, along with introducing several basic Monte Carlo concepts.

  7. Gamma-Ray Telescopes: 400 Years of Astronomical Telescopes

    NASA Technical Reports Server (NTRS)

    Gehrels, Neil; Cannizzo, John K.

    2010-01-01

    The last half-century has seen dramatic developments in gamma-ray telescopes, from their initial conception and development through to their blossoming into full maturity as a potent research tool in astronomy. Gamma-ray telescopes are leading research in diverse areas such as gamma-ray bursts, blazars, Galactic transients, and the Galactic distribution of Al-26.

  8. A probability distribution model of tooth pits for evaluating time-varying mesh stiffness of pitting gears

    NASA Astrophysics Data System (ADS)

    Lei, Yaguo; Liu, Zongyao; Wang, Delong; Yang, Xiao; Liu, Huan; Lin, Jing

    2018-06-01

    Tooth damage often causes a reduction in gear mesh stiffness. Thus time-varying mesh stiffness (TVMS) can be treated as an indicator of gear health conditions. This study is devoted to investigating the mesh stiffness variations of a pair of external spur gears with tooth pitting, and proposes a new model for describing tooth pitting based on probability distribution. In the model, considering the appearance and development process of tooth pitting, the pitting on the surface of spur gear teeth is modeled as a series of pits distributed uniformly along the tooth width and normally along the tooth height. In addition, four pitting degrees, from no pitting to severe pitting, are modeled. Finally, the influences of tooth pitting on TVMS are analyzed in detail and the proposed model is validated by comparison with a finite element model. The comparison results show that the proposed model is effective for the TVMS evaluation of pitting gears.
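
    A minimal sketch of the pit-placement assumptions described above (pit centers uniform across the tooth width, normal along the tooth height); the tooth dimensions, pit radius, and pit counts for the four severity levels are invented for illustration.

      import numpy as np

      rng = np.random.default_rng(4)

      def generate_pits(n_pits, width, height, pitch_line, sigma_h):
          # Pit centers: uniform across the tooth width, normal around the pitch
          # line in the height direction, clipped to the tooth face
          x = rng.uniform(0.0, width, n_pits)
          y = np.clip(rng.normal(pitch_line, sigma_h, n_pits), 0.0, height)
          return x, y

      radius = 0.3  # pit radius in mm (hypothetical)
      for level, n in zip(["none", "slight", "moderate", "severe"], [0, 20, 80, 200]):
          x, y = generate_pits(n, width=20.0, height=8.0, pitch_line=4.0, sigma_h=1.0)
          print(f"{level:8s}: {n:3d} pits, pitted area ~ {n * np.pi * radius**2:6.1f} mm^2")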

  9. Optimizing probability of detection point estimate demonstration

    NASA Astrophysics Data System (ADS)

    Koshti, Ajay M.

    2017-04-01

    The paper discusses optimizing probability of detection (POD) demonstration experiments using the point estimate method. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false (POF) calls while keeping the flaw sizes in the set as small as possible. The POD point estimate method is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size within some tolerance is used in the demonstration. Traditionally, the largest flaw size in the set is considered to be a conservative estimate of the flaw size with minimum 90% probability and 95% confidence, denoted as α90/95PE. The paper investigates the relationship between the range of flaw sizes and α90, i.e. the 90% probability flaw size, to provide a desired PPD. The range of flaw sizes is expressed as a proportion of the standard deviation of the probability density distribution, as is the difference between the median or average of the 29 flaws and α90. In general, it is concluded that, if the probability of detection increases with flaw size, the average of the 29 flaw sizes will always be larger than or equal to α90 and is an acceptable measure of α90/95PE. If the NDE technique has sufficient sensitivity and signal-to-noise ratio, then the 29-flaw set can be optimized to meet the requirements of minimum required PPD, maximum allowable POF, flaw size tolerance about the mean flaw size, and flaw size detectability. The paper provides a procedure for optimizing flaw sizes in the point estimate demonstration flaw set.
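
    A sketch of the binomial arithmetic behind the 29-flaw point estimate (illustrative only, not the paper's optimization procedure): the probability of passing the demonstration is p^29 for a true POD p, and a 29-of-29 pass supports POD ≥ 0.90 at 95% confidence because 0.90^29 falls just below 5%.

      from scipy import stats

      n = 29  # flaws in the demonstration set, nominally of the same size

      # Probability of passing the demonstration (all n flaws detected) vs. true POD
      for p in [0.90, 0.95, 0.98]:
          print(f"true POD {p:.2f}: probability of passing = {p**n:.3f}")

      # The 90/95 reading: if the true POD were only 0.90, a 29-of-29 pass
      # would occur with probability below 5%
      print(stats.binom.sf(n - 1, n, 0.90))  # P(all 29 detected | p = 0.90) ~ 0.047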

  10. A new probability distribution model of turbulent irradiance based on Born perturbation theory

    NASA Astrophysics Data System (ADS)

    Wang, Hongxing; Liu, Min; Hu, Hao; Wang, Qian; Liu, Xiguo

    2010-10-01

    The subject of the PDF (Probability Density Function) of the irradiance fluctuations in a turbulent atmosphere is still unsettled. Theory reliably describes the behavior in the weak turbulence regime, but theoretical descriptions in the strong and whole turbulence regimes remain controversial. Based on Born perturbation theory, the physical manifestations and correlations of three typical PDF models (Rice-Nakagami, exponential-Bessel and negative-exponential distribution) were theoretically analyzed. It is shown that these models can be derived by separately making circular-Gaussian, strong-turbulence and strong-turbulence-circular-Gaussian approximations in Born perturbation theory, which refutes the viewpoint that the Rice-Nakagami model is only applicable in the extremely weak turbulence regime and provides theoretical arguments for choosing rational models in practical applications. In addition, a common shortcoming of the three models is that they are all approximations. A new model, called the Maclaurin-spread distribution, is proposed without any approximation except for assuming the correlation coefficient to be zero. It is therefore considered that the new model exactly reflects the Born perturbation theory. Simulated results confirm the accuracy of this new model.

  11. Opacity probability distribution functions for electronic systems of CN and C2 molecules including their stellar isotopic forms.

    NASA Technical Reports Server (NTRS)

    Querci, F.; Kunde, V. G.; Querci, M.

    1971-01-01

    The basis and techniques are presented for generating opacity probability distribution functions for the CN molecule (red and violet systems) and the C2 molecule (Swan, Phillips, Ballik-Ramsay systems), two of the more important diatomic molecules in the spectra of carbon stars, with a view to including these distribution functions in equilibrium model atmosphere calculations. Comparisons to the CO molecule are also shown. The computation of the monochromatic absorption coefficient uses the most recent molecular data with revision of the oscillator strengths for some of the band systems. The total molecular stellar mass absorption coefficient is established through fifteen equations of molecular dissociation equilibrium to relate the distribution functions to each other on a per gram of stellar material basis.

  12. COSMIC MICROWAVE BACKGROUND LIKELIHOOD APPROXIMATION FOR BANDED PROBABILITY DISTRIBUTIONS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gjerløw, E.; Mikkelsen, K.; Eriksen, H. K.

    We investigate sets of random variables that can be arranged sequentially such that a given variable only depends conditionally on its immediate predecessor. For such sets, we show that the full joint probability distribution may be expressed exclusively in terms of uni- and bivariate marginals. Under the assumption that the cosmic microwave background (CMB) power spectrum likelihood only exhibits correlations within a banded multipole range, Δl_C, we apply this expression to two outstanding problems in CMB likelihood analysis. First, we derive a statistically well-defined hybrid likelihood estimator, merging two independent (e.g., low- and high-l) likelihoods into a single expression that properly accounts for correlations between the two. Applying this expression to the Wilkinson Microwave Anisotropy Probe (WMAP) likelihood, we verify that the effect of correlations on cosmological parameters in the transition region is negligible for WMAP; the largest relative shift seen for any parameter is 0.06σ. However, because this may not hold for other experimental setups (e.g., for different instrumental noise properties or analysis masks), but must rather be verified on a case-by-case basis, we recommend our new hybridization scheme for future experiments for statistical self-consistency reasons. Second, we use the same expression to improve the convergence rate of the Blackwell-Rao likelihood estimator, reducing the required number of Monte Carlo samples by several orders of magnitude, and thereby extend it to high-l applications.
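
    A toy numerical check of the factorization claim, using a Gaussian AR(1) chain (our choice, not the paper's data): for a first-order Markov sequence, the joint log-density equals the sum of the bivariate marginal log-densities of adjacent pairs minus the univariate log-densities of the interior variables.

      import numpy as np
      from scipy import stats

      rho, n = 0.6, 6
      # Gaussian AR(1): each x_i depends only on x_{i-1}; covariance is rho^|i-j|
      cov = rho ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
      x = stats.multivariate_normal(np.zeros(n), cov).rvs(random_state=0)

      # Full joint log-density
      full = stats.multivariate_normal(np.zeros(n), cov).logpdf(x)

      # Same quantity built from uni- and bivariate marginals only (the banded identity)
      biv = sum(stats.multivariate_normal(np.zeros(2), [[1, rho], [rho, 1]]).logpdf(x[i:i+2])
                for i in range(n - 1))
      uni = sum(stats.norm.logpdf(x[i]) for i in range(1, n - 1))
      print(full, biv - uni)  # the two agree for a first-order Markov chain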

  13. Distribution of gamma-glutamyl-beta-alanylhistidine isopeptide in the macromolecular fractions of commercial meat extracts and correlation with the color of the macromolecular fractions.

    PubMed

    Kuroda, Motonaka; Harada, Tsutomu

    2002-03-27

    The measurement of gamma-glutamyl-beta-alanylhistidine isopeptide in the macromolecular fraction of various commercial meat extracts indicated that all of the commercial meat extracts tested contained the isopeptide, in concentrations ranging from 0.04 to 0.87 micromol/g of dry matter. This variation was suggested to be due to the differences between the processes of extraction and the differences in the initial amounts of carnosine. A positive correlation between the content of gamma-glutamyl-beta-alanylhistidine and the color of the macromolecular fraction was observed. These results suggested that gamma-glutamyl-beta-alanylhistidine is widely distributed in meat products and that the content can be used as an index of protein denaturation during the heating process.

  14. Stellar Photon Archaeology with Gamma-Rays

    NASA Technical Reports Server (NTRS)

    Stecker, Floyd W.

    2009-01-01

    Ongoing deep surveys of galaxy luminosity distribution functions, spectral energy distributions and backwards evolution models of star formation rates can be used to calculate the past history of intergalactic photon densities and, from them, the present and past optical depth of the Universe to gamma-rays from pair production interactions with these photons. The energy-redshift dependence of the optical depth of the Universe to gamma-rays has become known as the Fazio-Stecker relation (Fazio & Stecker 1970). Stecker, Malkan & Scully have calculated the densities of intergalactic background light (IBL) photons of energies from 0.03 eV to the Lyman limit at 13.6 eV and for 0 < z < 6, using deep survey galaxy observations from Spitzer, Hubble and GALEX, and have consequently predicted spectral absorption features for extragalactic gamma-ray sources. This procedure can also be reversed. Determining the cutoff energies of gamma-ray sources with known redshifts using the recently launched Fermi gamma-ray space telescope may enable a more precise determination of the IBL photon densities in the past, i.e., the "archaeo-IBL", and therefore allow a better measure of the past history of the total star formation rate, including that from galaxies too faint to be observed.

  15. Segmentation and intensity estimation of microarray images using a gamma-t mixture model.

    PubMed

    Baek, Jangsun; Son, Young Sook; McLachlan, Geoffrey J

    2007-02-15

    We present a new approach to the analysis of images for complementary DNA microarray experiments. The image segmentation and intensity estimation are performed simultaneously by adopting a two-component mixture model. One component of this mixture corresponds to the distribution of the background intensity, while the other corresponds to the distribution of the foreground intensity. The intensity measurement is a bivariate vector consisting of red and green intensities. The background intensity component is modeled by the bivariate gamma distribution, whose marginal densities for the red and green intensities are independent three-parameter gamma distributions with different parameters. The foreground intensity component is taken to be the bivariate t distribution, with the constraint that the mean of the foreground is greater than that of the background for each of the two colors. The degrees of freedom of this t distribution are inferred from the data but they could be specified in advance to reduce the computation time. Also, the covariance matrix is not restricted to being diagonal and so it allows for nonzero correlation between R and G foreground intensities. This gamma-t mixture model is fitted by maximum likelihood via the EM algorithm. A final step is executed whereby nonparametric (kernel) smoothing is undertaken of the posterior probabilities of component membership. The main advantages of this approach are: (1) it enjoys the well-known strengths of a mixture model, namely flexibility and adaptability to the data; (2) it considers the segmentation and intensity simultaneously and not separately as in commonly used existing software, and it also works with the red and green intensities in a bivariate framework as opposed to their separate estimation via univariate methods; (3) the use of the three-parameter gamma distribution for the background red and green intensities provides a much better fit than the normal (log normal) or t distributions; (4) the

  16. Investigation of Probability Distributions Using Dice Rolling Simulation

    ERIC Educational Resources Information Center

    Lukac, Stanislav; Engel, Radovan

    2010-01-01

    Dice are considered one of the oldest gambling devices and thus many mathematicians have been interested in various dice gambling games in the past. Dice have been used to teach probability, and dice rolls can be effectively simulated using technology. The National Council of Teachers of Mathematics (NCTM) recommends that teachers use simulations…

  17. Likelihood analysis of species occurrence probability from presence-only data for modelling species distributions

    USGS Publications Warehouse

    Royle, J. Andrew; Chandler, Richard B.; Yackulic, Charles; Nichols, James D.

    2012-01-01

    1. Understanding the factors affecting species occurrence is a pre-eminent focus of applied ecological research. However, direct information about species occurrence is lacking for many species. Instead, researchers sometimes have to rely on so-called presence-only data (i.e. when no direct information about absences is available), which often results from opportunistic, unstructured sampling. MAXENT is a widely used software program designed to model and map species distribution using presence-only data. 2. We provide a critical review of MAXENT as applied to species distribution modelling and discuss how it can lead to inferential errors. A chief concern is that MAXENT produces a number of poorly defined indices that are not directly related to the actual parameter of interest – the probability of occurrence (ψ). This focus on an index was motivated by the belief that it is not possible to estimate ψ from presence-only data; however, we demonstrate that ψ is identifiable using conventional likelihood methods under the assumptions of random sampling and constant probability of species detection. 3. The model is implemented in a convenient r package which we use to apply the model to simulated data and data from the North American Breeding Bird Survey. We demonstrate that MAXENT produces extreme under-predictions when compared to estimates produced by logistic regression which uses the full (presence/absence) data set. We note that MAXENT predictions are extremely sensitive to specification of the background prevalence, which is not objectively estimated using the MAXENT method. 4. As with MAXENT, formal model-based inference requires a random sample of presence locations. Many presence-only data sets, such as those based on museum records and herbarium collections, may not satisfy this assumption. However, when sampling is random, we believe that inference should be based on formal methods that facilitate inference about interpretable ecological quantities

  18. Bivariate normal, conditional and rectangular probabilities: A computer program with applications

    NASA Technical Reports Server (NTRS)

    Swaroop, R.; Brownlow, J. D.; Ashworth, G. R.; Winter, W. R.

    1980-01-01

    Some results for the bivariate normal distribution analysis are presented. Computer programs for conditional normal probabilities, marginal probabilities, as well as joint probabilities for rectangular regions are given; routines for computing fractile points and distribution functions are also presented. Some examples from a closed-circuit television experiment are included.
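
    The original Fortran routines are not reproduced here; the sketch below computes the same rectangular-region probability using scipy's bivariate normal CDF and inclusion-exclusion. The mean, covariance, and rectangle are arbitrary examples.

      from scipy.stats import multivariate_normal

      def rectangle_prob(a, b, mean, cov):
          # P(a1 < X1 <= b1, a2 < X2 <= b2) by inclusion-exclusion on the joint CDF
          mvn = multivariate_normal(mean=mean, cov=cov)
          (a1, a2), (b1, b2) = a, b
          return (mvn.cdf([b1, b2]) - mvn.cdf([a1, b2])
                  - mvn.cdf([b1, a2]) + mvn.cdf([a1, a2]))

      # Example: standard bivariate normal with correlation 0.5
      p = rectangle_prob((-1.0, -1.0), (1.0, 1.0), mean=[0.0, 0.0],
                         cov=[[1.0, 0.5], [0.5, 1.0]])
      print(f"P(-1 < X1 <= 1, -1 < X2 <= 1) ~ {p:.4f}")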

  19. An Experiment to Demonstrate the Energy Broadening of Annihilation Gamma Rays

    ERIC Educational Resources Information Center

    Ouseph, P. J.; DuBard, James L.

    1978-01-01

    Shows that when positrons annihilate in solid materials the energy distribution of the annihilation gamma rays is much broader than that of a 0.511-MeV gamma peak. This broadening is caused by the momentum distribution of the electrons in the material. (Author/GA)

  20. Performance Probability Distributions for Sediment Control Best Management Practices

    NASA Astrophysics Data System (ADS)

    Ferrell, L.; Beighley, R.; Walsh, K.

    2007-12-01

    Controlling soil erosion and sediment transport can be a significant challenge during the construction process due to the extent and conditions of bare, disturbed soils. Best Management Practices (BMPs) are used as the framework for the design of sediment discharge prevention systems in stormwater pollution prevention plans, which are typically required for construction sites. This research focuses on commonly used BMP systems for perimeter control of sediment export: silt fences and fiber rolls. Although these systems are widely used, the physical and engineering parameters describing their performance are not well understood. Performance expectations are based on manufacturer results, but due to the dynamic conditions that exist on a construction site, performance expectations are not always achievable in the field. Experimental results show that product performance is highly variable: experiments using the same installation procedures show inconsistent sediment removal performances ranging from more than 85 percent to zero. The goal of this research is to improve the determination of off-site sediment yield based on probabilistic performance results of perimeter control BMPs. BMPs are evaluated in the Soil Erosion Research Laboratory (SERL) in the Civil and Environmental Engineering department at San Diego State University. SERL experiments are performed on a 3-m by 10-m tilting soil bed with a soil depth of 0.5 meters and a slope of 33 percent. The simulated storm event consists of 17 mm/hr for 20 minutes followed by 51 mm/hr for 30 minutes. The storm event is based on an ASTM design storm intended to simulate BMP failures. BMP performance is assessed based on experiments where BMPs are installed per manufacturer specifications, less than optimal installations, and no-treatment conditions. Preliminary results from 30 experiments are presented and used to develop probability distributions for BMP sediment removal efficiencies. The results are then combined with

  1. A discussion on the origin of quantum probabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holik, Federico, E-mail: olentiev2@gmail.com; Departamento de Matemática - Ciclo Básico Común, Universidad de Buenos Aires - Pabellón III, Ciudad Universitaria, Buenos Aires; Sáenz, Manuel

    We study the origin of quantum probabilities as arising from non-Boolean propositional-operational structures. We apply the method developed by Cox to non-distributive lattices and develop an alternative formulation of non-Kolmogorovian probability measures for quantum mechanics. By generalizing the method presented in previous works, we outline a general framework for the deduction of probabilities in general propositional structures represented by lattices (including the non-distributive case). -- Highlights: • Several recent works use a derivation similar to that of R.T. Cox to obtain quantum probabilities. • We apply Cox's method to the lattice of subspaces of the Hilbert space. • We obtain a derivation of quantum probabilities which includes mixed states. • The method presented in this work is susceptible to generalization. • It includes quantum mechanics and classical mechanics as particular cases.

  2. Lateral distribution of high energy hadrons and gamma ray in air shower cores observed with emulsion chambers

    NASA Technical Reports Server (NTRS)

    Matano, T.; Machida, M.; Kawasumi, N.; Tsushima, I.; Honda, K.; Hashimoto, K.; Navia, C. E.; Matinic, N.; Aquirre, C.

    1985-01-01

    A high energy event consisting of a bundle of electrons, gamma rays and hadronic gamma rays in an air shower core was observed. The bundle was detected with an emulsion chamber with a thickness of 15 cm of lead. This air shower is estimated to have been initiated by a proton with energy around 10^17 to 10^18 eV at an altitude of around 100 g/cm2. Lateral distributions of the electromagnetic component with energy above 2 TeV and of the hadronic component with energy above 6 TeV of this air shower core were determined. Since the particles in the bundle are produced through the development of the nuclear cascade, the primary energy of each interaction in the cascade which produces these particles is unknown. To study the primary energy dependence of the transverse momentum, the average products of energy and distance for various average energies of secondary particles are examined.

  3. Systematic Effects on Duration Measurements of Gamma-Ray Bursts

    NASA Technical Reports Server (NTRS)

    Koshut, Thomas M.; Paciesas, William S.; Kouveliotou, Chryssa; vanParadijs, Jan; Pendleton, Geoffrey N.; Fishman, Gerald J.; Meegan, Charles A.

    1996-01-01

    The parameters T(sub 90) and T(sub 50) have recently been introduced as a measurement of the duration of gamma-ray bursts. We present here a description of the method of measuring T(sub 90) and T(sub 50) and its application to gamma-ray bursts observed with the Burst and Transient Source Experiment (BATSE) onboard the Compton Gamma-Ray Observatory (CGRO). We use simulated as well as observed time profiles to address some of the possible systematic effects affecting individual T(sub 90) (T(sub 50)) measurements. We show that these systematic effects do not mimic those effects that would result from time dilation if the burst sources are at distances of several Gpc. We discuss the impact of these systematic effects on the T(sub 90) (T(sub 50)) distributions for the gamma-ray bursts observed with BATSE. We distinguish between various types of T(sub 90) (T(sub 50)) distributions, and discuss the ways in which distributions observed with different experiments can vary, even though the measurements for commonly observed bursts may be the same. We then discuss the distributions observed with BATSE and compare them to those observed with other experiments.

  4. Hemlock Alkaloids in Aloes. Occurrence and Distribution of gamma-Coniceine.

    PubMed

    Dring, J V; Nash, R J; Roberts, M F; Reynolds, T

    1984-10-01

    The hemlock alkaloid gamma-coniceine was identified in a number of ALOE species, namely A. GILLILANDII Reynolds, A. BALLYI Reynolds, A. RUSPOLIANA Baker, A. IBITIENSIS Perrier and A. DELTOIDEODONTA Baker. Coniine was identified in A. VIGUIERI Perrier. The levels of gamma-coniceine are higher than those found in CONIUM MACULATUM L. Some species also contained trace amounts of conhydrinone and pseudoconhydrin. Three of the species are Madagascan endemics, one is restricted to Arabia, while the rest are remote from each other in East Africa. Some of the species are loosely related but there is no overall taxonomic affinity between them.

  5. ANNz2: Photometric Redshift and Probability Distribution Function Estimation using Machine Learning

    NASA Astrophysics Data System (ADS)

    Sadeh, I.; Abdalla, F. B.; Lahav, O.

    2016-10-01

    We present ANNz2, a new implementation of the public software for photometric redshift (photo-z) estimation of Collister & Lahav, which now includes generation of full probability distribution functions (PDFs). ANNz2 utilizes multiple machine learning methods, such as artificial neural networks and boosted decision/regression trees. The objective of the algorithm is to optimize the performance of the photo-z estimation, to properly derive the associated uncertainties, and to produce both single-value solutions and PDFs. In addition, estimators are made available, which mitigate possible problems of non-representative or incomplete spectroscopic training samples. ANNz2 has already been used as part of the first weak lensing analysis of the Dark Energy Survey, and is included in the experiment's first public data release. Here we illustrate the functionality of the code using data from the tenth data release of the Sloan Digital Sky Survey and the Baryon Oscillation Spectroscopic Survey. The code is available for download at http://github.com/IftachSadeh/ANNZ.

  6. Skill of Ensemble Seasonal Probability Forecasts

    NASA Astrophysics Data System (ADS)

    Smith, Leonard A.; Binter, Roman; Du, Hailiang; Niehoerster, Falk

    2010-05-01

    In operational forecasting, the computational complexity of large simulation models is, ideally, justified by enhanced performance over simpler models. We will consider probability forecasts and contrast the skill of ENSEMBLES-based seasonal probability forecasts of interest to the finance sector (specifically temperature forecasts for Nino 3.4 and the Atlantic Main Development Region (MDR)). The ENSEMBLES model simulations will be contrasted against forecasts from statistical models based on the observations (climatological distributions) and empirical dynamics based on the observations but conditioned on the current state (dynamical climatology). For some start dates, individual ENSEMBLES models yield significant skill even at a lead-time of 14 months. The nature of this skill is discussed, and chances of application are noted. Questions surrounding the interpretation of probability forecasts based on these multi-model ensemble simulations are then considered; the distributions considered are formed by kernel dressing the ensemble and blending with the climatology. The sources of apparent (RMS) skill in distributions based on multi-model simulations are discussed, and it is demonstrated that the inclusion of "zero-skill" models in the long range can improve Root-Mean-Square-Error scores, casting some doubt on the common justification for the claim that all models should be included in forming an operational probability forecast. It is argued that the rational response varies with lead time.

  7. Fisher information for two gamma frailty bivariate Weibull models.

    PubMed

    Bjarnason, H; Hougaard, P

    2000-03-01

    The asymptotic properties of frailty models for multivariate survival data are not well understood. To study this aspect, the Fisher information is derived in the standard bivariate gamma frailty model, where the survival distribution is of Weibull form conditional on the frailty. For comparison, the Fisher information is also derived in the bivariate gamma frailty model, where the marginal distribution is of Weibull form.

  8. Log-gamma directed polymer with fixed endpoints via the replica Bethe Ansatz

    NASA Astrophysics Data System (ADS)

    Thiery, Thimothée; Le Doussal, Pierre

    2014-10-01

    We study the model of a discrete directed polymer (DP) on a square lattice with homogeneous inverse gamma distribution of site random Boltzmann weights, introduced by Seppalainen (2012 Ann. Probab. 40 19-73). The integer moments of the partition sum, $\overline{Z^n}$, are studied using a transfer matrix formulation, which appears as a generalization of the Lieb-Liniger quantum mechanics of bosons to discrete time and space. In the present case of the inverse gamma distribution the model is integrable in terms of a coordinate Bethe Ansatz, as discovered by Brunet. Using the Brunet-Bethe eigenstates we obtain an exact expression for the integer moments $\overline{Z^n}$ for polymers of arbitrary lengths and fixed endpoint positions. Although these moments do not exist for all integer n, we are nevertheless able to construct a generating function which reproduces all existing integer moments and which takes the form of a Fredholm determinant (FD). This suggests an analytic continuation via a Mellin-Barnes transform and we thereby propose a FD ansatz representation for the probability distribution function (PDF) of Z and its Laplace transform. In the limit of a very long DP, this ansatz yields that the distribution of the free energy converges to the Gaussian unitary ensemble (GUE) Tracy-Widom distribution up to a non-trivial average and variance that we calculate. Our asymptotic predictions coincide with a result by Borodin et al (2013 Commun. Math. Phys. 324 215-32) based on a formula obtained by Corwin et al (2011 arXiv:1110.3489) using the geometric Robinson-Schensted-Knuth (gRSK) correspondence. In addition we obtain the dependence on the endpoint position and the exact elastic coefficient at a large time. We argue the equivalence between our formula and that of Borodin et al. As we will discuss, this provides a connection between quantum integrability and tropical combinatorics.

  9. MODELING THE MULTIWAVELENGTH EMISSION FROM G73.9+0.9: GAMMA RAYS FROM AN SNR–MC INTERACTION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Araya, Miguel, E-mail: miguel.araya@ucr.ac.cr

    G73.9+0.9 has been classified as a probable shell-type supernova remnant, though it has also been suggested that it could have a pulsar wind nebula (PWN). Here, a broadband model of the non-thermal emission of G73.9+0.9 from radio to gamma rays is presented. The model includes a new gamma-ray observation obtained by the analysis of seven years of data from the Fermi/LAT telescope. Above 200 MeV, the source is detected with a significance of 13σ and the spectrum of the radiation is best described by a power law with an index of ∼2.5. The leptonic mechanisms are hard to reconcile with the measured radio and gamma-ray spectral energy distribution. A PWN origin for the high-energy emission is also not very likely, due to the lack of detection of pulsars and of X-ray emission in the region, as well as from the shape of the gamma-ray spectrum. Given the possibility that the object is interacting with molecular clouds, a hadronic origin of the high-energy emission is more likely, and the spectral properties of the cosmic rays responsible for this radiation are derived.

  10. A gamma beam profile imager for ELI-NP Gamma Beam System

    NASA Astrophysics Data System (ADS)

    Cardarelli, P.; Paternò, G.; Di Domenico, G.; Consoli, E.; Marziani, M.; Andreotti, M.; Evangelisti, F.; Squerzanti, S.; Gambaccini, M.; Albergo, S.; Cappello, G.; Tricomi, A.; Veltri, M.; Adriani, O.; Borgheresi, R.; Graziani, G.; Passaleva, G.; Serban, A.; Starodubtsev, O.; Variola, A.; Palumbo, L.

    2018-06-01

    The Gamma Beam System of ELI-Nuclear Physics is a high brilliance monochromatic gamma source based on the inverse Compton interaction between an intense high power laser and a bright electron beam with tunable energy. The source, currently being assembled in Magurele (Romania), is designed to provide a beam with tunable average energy ranging from 0.2 to 19.5 MeV, rms energy bandwidth down to 0.5% and flux of about 10^8 photons/s. The system includes a set of detectors for the diagnostics and complete characterization of the gamma beam. To evaluate the spatial distribution of the beam, a gamma beam profile imager is required. For this purpose, a detector based on a scintillator target coupled to a CCD camera was designed and a prototype was tested at the INFN-Ferrara laboratories. A set of analytical calculations and Monte Carlo simulations was carried out to optimize the imager design and evaluate the performance expected with the ELI-NP gamma beam. In this work the design of the imager is described in detail, as well as the simulation tools used and the results obtained. The simulation parameters were tuned and cross-checked with the experimental measurements carried out on the assembled prototype using the beam from an x-ray tube.

  11. The sensitivity of EGRET to gamma ray polarization

    NASA Astrophysics Data System (ADS)

    Mattox, John R.

    1990-05-01

    A Monte Carlo simulation shows that EGRET (Energetic Gamma Ray Experiment Telescope) does not have sufficient sensitivity to detect even 100 percent polarized gamma-rays. This is confirmed by analysis of calibration data. A Monte Carlo study shows that the sensitivity of EGRET to polarization peaks around 100 MeV. However, more than 10^5 gamma-ray events with 100 percent polarization would be required for a 3 sigma significance detection - more than available from calibration, and probably more than will result from a single source during flight. A drift chamber gamma ray telescope under development (Hunter and Cuddapah 1989) will offer better sensitivity to polarization. The lateral position uncertainty will be improved by an order of magnitude. Also, if pair production occurs in the drift chamber gas (xenon at 2 bar) instead of tantalum foils, the effects of multiple Coulomb scattering will be reduced.

  12. Probability distribution of haplotype frequencies under the two-locus Wright-Fisher model by diffusion approximation.

    PubMed

    Boitard, Simon; Loisel, Patrice

    2007-05-01

    The probability distribution of haplotype frequencies in a population, and the way it is influenced by genetic forces such as recombination, selection, and random drift, is a question of fundamental interest in population genetics. For large populations, the distribution of haplotype frequencies for two linked loci under the classical Wright-Fisher model is almost impossible to compute for numerical reasons. However, the Wright-Fisher process can in such cases be approximated by a diffusion process, and the transition density can then be deduced from the Kolmogorov equations. As no exact solution has been found for these equations, we developed a numerical method based on finite differences to solve them. It applies to transient states and to models including selection or mutations. We show by several tests that this method is accurate for computing the conditional joint density of haplotype frequencies given that no haplotype has been lost. We also prove that it is far less time consuming than other methods such as Monte Carlo simulations.
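
    The finite-difference idea is easiest to see in a single-locus analogue. The sketch below is a deliberately simplified stand-in (one locus, neutrality, no mutation) for the two-locus scheme of the paper: it integrates the forward Kolmogorov equation dp/dt = 1/2 d^2[x(1-x)p]/dx^2 with an explicit scheme; grid size and time step are illustrative.

```python
import numpy as np

# Explicit finite differences for the forward Kolmogorov equation of a
# neutral single-locus Wright-Fisher diffusion:
#   dp/dt = 0.5 * d^2/dx^2 [ x (1 - x) p ]
nx, dt, steps = 201, 1.0e-5, 20000
x = np.linspace(0.0, 1.0, nx)
dx = x[1] - x[0]

p = np.exp(-((x - 0.5) / 0.02) ** 2)  # density concentrated near x = 0.5
p /= p.sum() * dx                     # normalize

for _ in range(steps):
    g = 0.5 * x * (1.0 - x) * p       # x(1-x)p/2, vanishes at the boundaries
    lap = np.zeros_like(g)
    lap[1:-1] = (g[2:] - 2.0 * g[1:-1] + g[:-2]) / dx ** 2
    p += dt * lap                     # explicit Euler step

# mass still segregating; the deficit has been absorbed at fixation/loss
print("mass in (0,1):", p[1:-1].sum() * dx)
```

    The explicit step is stable only for dt of order dx^2, which is one reason production codes favor implicit or Crank-Nicolson schemes for this equation.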

  13. Extinction probabilities and stationary distributions of mobile genetic elements in prokaryotes: The birth-death-diversification model.

    PubMed

    Drakos, Nicole E; Wahl, Lindi M

    2015-12-01

    Theoretical approaches are essential to our understanding of the complex dynamics of mobile genetic elements (MGEs) within genomes. Recently, the birth-death-diversification model was developed to describe the dynamics of mobile promoters (MPs), a particular class of MGEs in prokaryotes. A unique feature of this model is that genetic diversification of elements was included. To explore the implications of diversification on the long-term fate of MGE lineages, in this contribution we analyze the extinction probabilities, extinction times and equilibrium solutions of the birth-death-diversification model. We find that diversification increases both the survival and growth rate of MGE families, but the strength of this effect depends on the rate of horizontal gene transfer (HGT). We also find that the distribution of MGE families per genome is not necessarily monotonically decreasing, as observed for MPs, but may have a peak in the distribution that is related to the HGT rate. For MPs specifically, we find that new families have a high extinction probability, and predict that the number of MPs is increasing, albeit at a very slow rate. Additionally, we develop an extension of the birth-death-diversification model which allows MGEs in different regions of the genome, for example coding and non-coding, to be described by different rates. This extension may offer a potential explanation as to why the majority of MPs are located in non-promoter regions of the genome. Copyright © 2015 Elsevier Inc. All rights reserved.
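
    As a point of reference for the extinction results, the classic linear birth-death process (no diversification and no HGT) has a closed-form extinction probability; the snippet below computes it for placeholder rates, while the full birth-death-diversification model of the paper requires numerical analysis of its richer rate structure.

```python
# Extinction probability of a linear birth-death process started from n
# elements with per-element birth rate b and death rate d:
#   q = min(1, d / b) ** n
# (a baseline only; diversification and HGT modify this in the full model)
def extinction_probability(n, b, d):
    if b <= 0.0:
        return 1.0
    return min(1.0, d / b) ** n

print(extinction_probability(5, b=1.2, d=1.0))  # (1/1.2)^5 ~ 0.40
```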

  14. Tomographic analysis of neutron and gamma pulse shape distributions from liquid scintillation detectors at Joint European Torus.

    PubMed

    Giacomelli, L; Conroy, S; Gorini, G; Horton, L; Murari, A; Popovichev, S; Syme, D B

    2014-02-01

    The Joint European Torus (JET, Culham, UK) is the largest tokamak in the world devoted to nuclear fusion experiments of magnetically confined Deuterium (D)/Deuterium-Tritium (DT) plasmas. Neutrons produced in these plasmas are measured using various types of neutron detectors and spectrometers. Two of these instruments on JET make use of organic liquid scintillator detectors. The neutron emission profile monitor implements 19 liquid scintillation counters to detect the 2.45 MeV neutron emission from D plasmas. A new compact neutron spectrometer has been operational at JET since 2010 to measure the neutron energy spectra from both D and DT plasmas. Liquid scintillation detectors are sensitive to both neutron and gamma radiation but give light responses of different decay times, such that pulse shape discrimination techniques can be applied to identify the neutron contribution of interest in the data. The most common technique consists of integrating the radiation pulse shapes within different ranges of their rising and/or trailing edges. In this article, a step forward in this type of analysis is presented. The method applies a tomographic analysis to the 3-dimensional neutron and gamma pulse shape and pulse height distribution data obtained from liquid scintillation detectors, such that n/γ discrimination can be extended to lower energies and additional information can be gained on neutron contributions to the gamma events and vice versa.
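
    For context, here is a minimal sketch of the conventional charge-integration discrimination that the tomographic method refines; the decay constants and gate lengths are illustrative values, not JET calibration settings.

```python
import numpy as np

def psd_ratio(pulse, start, short_gate, long_gate):
    """Tail-to-total charge ratio. Neutron pulses in organic liquid
    scintillators carry relatively more light in the slow tail, so a
    larger ratio suggests a neutron rather than a gamma event."""
    total = pulse[start:start + long_gate].sum()
    tail = pulse[start + short_gate:start + long_gate].sum()
    return tail / total if total > 0.0 else 0.0

t = np.arange(200.0)  # sample index, illustrative units
gamma_pulse = np.exp(-t / 5.0)                                    # fast only
neutron_pulse = 0.8 * np.exp(-t / 5.0) + 0.2 * np.exp(-t / 80.0)  # slow tail
for name, pulse in (("gamma", gamma_pulse), ("neutron", neutron_pulse)):
    print(name, round(psd_ratio(pulse, 0, 20, 200), 3))
```

    Plotting the tail integral against the total integral for many pulses gives the familiar two-branch scatter; the tomographic analysis described above operates on exactly that kind of pulse shape and pulse height distribution data.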

  15. Probability modeling of high flow extremes in Yingluoxia watershed, the upper reaches of Heihe River basin

    NASA Astrophysics Data System (ADS)

    Li, Zhanling; Li, Zhanjie; Li, Chengcheng

    2014-05-01

    Probability modeling of hydrological extremes is one of the major research areas in hydrological science. Most such studies have concerned basins in the humid and semi-humid south and east of China, while for the inland river basins, which occupy about 35% of the country's area, few studies exist, partly because of limited data availability and relatively low mean annual flows. The objective of this study is to carry out probability modeling of high flow extremes in the upper reach of the Heihe River basin, the second largest inland river basin in China, using the peaks-over-threshold (POT) method and the Generalized Pareto Distribution (GPD); the selection of the threshold and the inherent assumptions for the POT series are elaborated in detail. For comparison, other widely used probability distributions, including the generalized extreme value (GEV), Lognormal, Log-logistic and Gamma, are employed as well. Maximum likelihood estimation is used for parameter estimation. Daily flow data at Yingluoxia station from 1978 to 2008 are used. Results show that, synthesizing the approaches of the mean excess plot, stability features of model parameters, the return level plot and the inherent independence assumption of the POT series, an optimum threshold of 340 m^3/s is finally determined for high flow extremes in Yingluoxia watershed. The resulting POT series is shown to be stationary and independent based on the Mann-Kendall test, the Pettitt test and an autocorrelation test. In terms of the Kolmogorov-Smirnov test, the Anderson-Darling test and several graphical diagnostics such as quantile and cumulative density function plots, the GPD provides the best fit to high flow extremes in the study area. The estimated high flows for long return periods demonstrate that, as the return period increases, the return level estimates become more uncertain. The frequency of high flow extremes exhibits a very slight but not significant decreasing trend from 1978 to
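
    A hedged sketch of the POT/GPD fit and the return-level computation described above, using scipy and synthetic data in place of the Yingluoxia record; the 340 m^3/s threshold is the value reported in the study, everything else is illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
daily_flow = rng.gamma(shape=2.0, scale=60.0, size=31 * 365)  # synthetic m^3/s

threshold = 340.0
exceedances = daily_flow[daily_flow > threshold] - threshold
shape, loc, scale = stats.genpareto.fit(exceedances, floc=0.0)

# T-year return level: threshold plus the GPD quantile that is exceeded
# once in T years, given the mean number of exceedances per year
events_per_year = exceedances.size / 31.0
T = 100.0
p = 1.0 - 1.0 / (T * events_per_year)
level = threshold + stats.genpareto.ppf(p, shape, loc=0.0, scale=scale)
print(f"{T:.0f}-year return level ~ {level:.0f} m^3/s")
```

    In a real application the threshold would be chosen exactly as the abstract describes, by inspecting the mean excess plot and the stability of the fitted shape and scale parameters across candidate thresholds.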

  16. Probable gamma-aminobutyric acid involvement in bisphenol A effect at the hypothalamic level in adult male rats.

    PubMed

    Cardoso, Nancy; Pandolfi, Matías; Lavalle, Justina; Carbone, Silvia; Ponzo, Osvaldo; Scacchi, Pablo; Reynoso, Roxana

    2011-12-01

    The aim of the present study was to investigate the effects of bisphenol A (BPA) on the neuroendocrine mechanism of control of the reproductive axis in adult male rats exposed to it during the pre- and early postnatal periods. Mated Wistar rats were treated with either 0.1% ethanol or BPA in their drinking water until their offspring were weaned at the age of 21 days. The estimated average dose of exposure to dams was approximately 2.5 mg/kg body weight per day of BPA. After 21 days, the pups were separated from the mother and sacrificed on day 70 of life. GnRH and gamma-aminobutyric acid (GABA) release from hypothalamic fragments was measured. LH, FSH, and testosterone concentrations were determined, and histological and morphometrical studies of the testis were performed. GnRH release decreased significantly, while GABA serum levels were markedly increased by treatment. LH serum levels showed no changes, and FSH and testosterone levels decreased significantly. Histological studies showed abnormalities in the tubular organization of the germinal epithelium. The cytoarchitecture of germinal cells was apparently normal, and a reduction of the nuclear area of Leydig cells, but not of their number, was observed. Taken together, these results provide evidence of the effect caused by BPA on the adult male reproductive axis when exposure occurs during the pre- and postnatal periods. Moreover, our findings suggest a probable GABA involvement in its effect at the hypothalamic level.

  17. Kinematics of the Elastic Scattering of γ in Hydrogen (Compton Effect) Between 300 and 1500 MeV [CINEMATICA DELLA DIFFUSIONE ELASTICA DI γ IN IDROGENO (EFFETTO COMPTON) TRA 300 E 1500 MEV]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salvadori, P.

    1962-10-31

    The proton (p) and gamma energy and angular distributions from the elastic (Compton) interaction p + γ → p + γ are calculated. The results are tabulated in 25-MeV gamma increments, from 300 to 1500 MeV. (T.F.H.)
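
    The tabulation follows from the Compton formula with the proton mass in place of the electron mass. A small sketch (proton rest energy 938.272 MeV; angle and grid choices illustrative) reproduces the kinematics, the proton kinetic energy being T_p = E - E' by energy conservation.

```python
import math

M_P = 938.272  # proton rest energy, MeV

def scattered_gamma_energy(e_gamma, theta_deg):
    """Photon energy after elastic (Compton) scattering at lab angle
    theta off a proton at rest: E' = E / (1 + (E/M)(1 - cos theta))."""
    theta = math.radians(theta_deg)
    return e_gamma / (1.0 + (e_gamma / M_P) * (1.0 - math.cos(theta)))

# tabulate in 25 MeV increments from 300 to 1500 MeV, as in the report
for e in range(300, 1525, 25):
    e_prime = scattered_gamma_energy(float(e), 90.0)
    print(e, round(e_prime, 1), round(e - e_prime, 1))  # E, E', T_p
```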

  18. Impact of Image Noise on Gamma Index Calculation

    NASA Astrophysics Data System (ADS)

    Chen, M.; Mo, X.; Parnell, D.; Olivera, G.; Galmarini, D.; Lu, W.

    2014-03-01

    Purpose: The Gamma Index defines an asymmetric metric between the evaluated image and the reference image. It provides a quantitative comparison that can be used to indicate sample-wise pass/fail on the agreement of the two images. The Gamma passing/failing rate has become an important clinical evaluation tool. However, the presence of noise in the evaluated and/or reference images may change the Gamma Index, hence the passing/failing rate, and further, clinical decisions. In this work, we systematically studied the impact of image noise on the Gamma Index calculation. Methods: We used both analytic formulation and numerical calculations in our study. The numerical calculations included simulations and clinical images. Three different noise scenarios were studied in simulations: noise in reference images only, in evaluated images only, and in both. Both white and spatially correlated noises of various magnitudes were simulated. For clinical images of various noise levels, the Gamma Index of measurement against calculation, calculation against measurement, and measurement against measurement, were evaluated. Results: Numerical calculations for both the simulation and clinical data agreed with the analytic formulations, and the clinical data agreed with the simulations. For the Gamma Index of measurement against calculation, its distribution has an increased mean and an increased standard deviation as the noise increases. On the contrary, for the Gamma Index of calculation against measurement, its distribution has a decreased mean and a stabilized standard deviation as the noise increases. White noise has greater impact on the Gamma Index than spatially correlated noise. Conclusions: The noise has significant impact on the Gamma Index calculation and the impact is asymmetric. The Gamma Index should be reported along with the noise levels in both reference and evaluated images. Reporting of the Gamma Index with switched roles of the images as reference and
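
    A minimal 1D implementation makes the asymmetry discussed above concrete: swapping which image serves as reference and which as evaluated changes the result, because the minimization search runs over the evaluated image only. The criteria (3%/3 mm) and the noisy test profile are illustrative.

```python
import numpy as np

def gamma_index_1d(dose_ref, dose_eval, spacing, dd=0.03, dta=3.0):
    """Gamma Index of an evaluated 1D dose profile against a reference.
    dd: dose-difference criterion as a fraction of the reference maximum,
    dta: distance-to-agreement criterion (mm), spacing: grid step (mm)."""
    x = np.arange(dose_ref.size) * spacing
    dmax = dose_ref.max()
    gam = np.empty(dose_ref.size)
    for i in range(dose_ref.size):
        dist2 = ((x - x[i]) / dta) ** 2
        dose2 = ((dose_eval - dose_ref[i]) / (dd * dmax)) ** 2
        gam[i] = np.sqrt((dist2 + dose2).min())  # search over evaluated image
    return gam

ref = np.exp(-((np.arange(100) - 50.0) / 15.0) ** 2)
noisy = ref + np.random.default_rng(2).normal(0.0, 0.01, ref.size)
print("noisy vs ref :", (gamma_index_1d(ref, noisy, 1.0) <= 1).mean())
print("ref vs noisy :", (gamma_index_1d(noisy, ref, 1.0) <= 1).mean())
```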

  19. LUMINOSITY EVOLUTION OF GAMMA-RAY PULSARS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hirotani, Kouichi, E-mail: hirotani@tiara.sinica.edu.tw

    2013-04-01

    We investigate the electrodynamic structure of a pulsar outer-magnetospheric particle accelerator and the resulting gamma-ray emission. By considering the condition for the accelerator to be self-sustained, we derive how the trans-magnetic-field thickness of the accelerator evolves with the pulsar age. It is found that the thickness is small but increases steadily if the neutron-star envelope is contaminated by sufficient light elements. For such a light element envelope, the gamma-ray luminosity of the accelerator is kept approximately constant as a function of age in the initial 10,000 yr, forming the lower bound of the observed distribution of the gamma-ray luminosity of rotation-powered pulsars. If the envelope consists of only heavy elements, on the other hand, the thickness is greater, but it increases less rapidly than for a light element envelope. For such a heavy element envelope, the gamma-ray luminosity decreases relatively rapidly, forming the upper bound of the observed distribution. The gamma-ray luminosity of a general pulsar resides between these two extreme cases, reflecting the envelope composition and the magnetic inclination angle with respect to the rotation axis. The cutoff energy of the primary curvature emission is regulated below several GeV even for young pulsars because the gap thickness, and hence the acceleration electric field, is suppressed by the polarization of the produced pairs.

  20. High energy gamma ray astronomy

    NASA Technical Reports Server (NTRS)

    Fichtel, Carl E.

    1987-01-01

    High energy gamma ray astronomy has evolved with the space age. Nonexistent twenty-five years ago, there is now a general sketch of the gamma ray sky which should develop into a detailed picture with the results expected to be forthcoming over the next decade. The galactic plane is the dominant feature of the gamma ray sky, the longitude and latitude distribution being generally correlated with galactic structural features including the spiral arms. Two molecular clouds have already been seen. Two of the three strongest gamma ray sources are pulsars. The highly variable X-ray source Cygnus X-3 was seen at one time, but not at another, in the 100 MeV region, and it was also observed at very high energies. Beyond the Milky Way Galaxy, a diffuse radiation is seen, whose origin remains uncertain, as well as at least one quasar, 3C 273. Looking to the future, the satellite opportunities for high energy gamma ray astronomy in the near term are the GAMMA-I, planned to be launched in late 1987, and the Gamma Ray Observatory, scheduled for launch in 1990. The Gamma Ray Observatory will carry a total of four instruments covering the entire energy range from 30,000 eV to 3 × 10^10 eV with over an order of magnitude increase in sensitivity relative to previous satellite instruments.

  1. Reversion and conversion of Mycobacterium tuberculosis IFN-gamma ELISpot results during anti-tuberculous treatment in HIV-infected children.

    PubMed

    Connell, Tom G; Davies, Mary-Ann; Johannisen, Christine; Wood, Kathryn; Pienaar, Sandy; Wilkinson, Katalin A; Wilkinson, Robert J; Zar, Heather J; Beatty, David; Nicol, Mark P; Curtis, Nigel; Eley, Brian

    2010-05-27

    Recent interest has focused on the potential use of serial interferon gamma (IFN-gamma) release assay (IGRA) measurements to assess the response to anti-tuberculous (TB) treatment. The kinetics of IFN-gamma responses to Mycobacterium tuberculosis (MTB) antigens in HIV-infected children during treatment have not however been previously investigated. IFN-gamma responses to the MTB antigens, ESAT-6, CFP-10 and PPD were measured by an enzyme-linked immunospot assay (IFN-gamma ELISpot) at presentation and at one, two and six months after starting anti-tuberculous treatment in HIV-infected children with definite or probable TB. Responses at different time points were compared using a Mann-Whitney U test with paired data analysed using the Wilcoxon signed rank test. A Fisher's exact or Chi-squared test was used to compare proportions when test results were analysed as dichotomous outcomes. Of 102 children with suspected TB, 22 (21%) had definite TB and 24 (23%) probable TB. At least one follow up IFN-gamma ELISpot assay result was available for 31 (67%) of the 46 children. In children with definite or probable TB in whom the IFN-gamma ELISpot assay result was positive at presentation, anti-tuberculous treatment was accompanied by a significant decrease in both the magnitude of the IFN-gamma response to individual or combined MTB-specific antigens (ESAT-6 median 110 SFCs/10^6 PBMC (IQR 65-305) at presentation vs. 15 (10-115) at six months, p = 0.04; CFP-10 177 (48-508) vs. 20 (5-165), p = 0.004, ESAT-6 or CFP-10 median 250 SFCs/10^6 PBMC (IQR 94-508) vs. 25 (10-165), p = 0.004) and in the proportion of children with a positive IFN-gamma ELISpot assay (Fisher's exact test: ESAT-6 15/0 vs 5/11, p = 0.0002, CFP-10 22/0 vs 8/17, p = 0.0001, ESAT-6 or CFP-10 22/0 vs. 9/17, p = 0.002). However almost half of the children had a positive IFN-gamma ELISpot assay after six months of anti-tuberculous treatment. In addition, there was conversion of the IFN-gamma ELISpot assay result

  2. Probability distributions of bed load particle velocities, accelerations, hop distances, and travel times informed by Jaynes's principle of maximum entropy

    USGS Publications Warehouse

    Furbish, David; Schmeeckle, Mark; Schumer, Rina; Fathel, Siobhan

    2016-01-01

    We describe the most likely forms of the probability distributions of bed load particle velocities, accelerations, hop distances, and travel times, in a manner that formally appeals to inferential statistics while honoring mechanical and kinematic constraints imposed by equilibrium transport conditions. The analysis is based on E. Jaynes's elaboration of the implications of the similarity between the Gibbs entropy in statistical mechanics and the Shannon entropy in information theory. By maximizing the information entropy of a distribution subject to known constraints on its moments, our choice of the form of the distribution is unbiased. The analysis suggests that particle velocities and travel times are exponentially distributed and that particle accelerations follow a Laplace distribution with zero mean. Particle hop distances, viewed alone, ought to be distributed exponentially. However, the covariance between hop distances and travel times precludes this result. Instead, the covariance structure suggests that hop distances follow a Weibull distribution. These distributions are consistent with high-resolution measurements obtained from high-speed imaging of bed load particle motions. The analysis brings us closer to choosing distributions based on our mechanical insight.
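
    For the simplest of these results, the exponential distribution of particle velocities, the maximum-entropy step can be written in one line; this is the textbook single-constraint case, not the covariance-constrained argument the authors need for the hop distances.

```latex
% Among densities f(u) on u >= 0 with fixed mean mu, maximize the Shannon
% entropy subject to normalization and the mean constraint:
\max_{f}\ -\int_0^{\infty} f(u)\,\ln f(u)\,\mathrm{d}u
\quad \text{s.t.} \quad
\int_0^{\infty} f(u)\,\mathrm{d}u = 1, \qquad
\int_0^{\infty} u\,f(u)\,\mathrm{d}u = \mu
\;\Longrightarrow\;
f(u) = \frac{1}{\mu}\, e^{-u/\mu}.
```

    Constraining the mean magnitude E|a| instead, over accelerations a of either sign, yields the zero-mean Laplace distribution by the same argument.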

  3. Zγγγ → 0 Processes in SANC

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bardin, D. Yu., E-mail: bardin@nu.jinr.ru; Kalinovskaya, L. V., E-mail: kalinov@nu.jinr.ru; Uglov, E. D., E-mail: corner@nu.jinr.ru

    2013-11-15

    We describe the analytic and numerical evaluation of the γγ → γZ process cross section and the Z → γγγ decay rate within the SANC system multi-channel approach at the one-loop accuracy level, with all masses taken into account. The corresponding package for numeric calculations is presented. To check the correctness of the results, we compare them with other independent calculations.

  4. Integrated seismic stochastic inversion and multi-attributes to delineate reservoir distribution: Case study MZ fields, Central Sumatra Basin

    NASA Astrophysics Data System (ADS)

    Haris, A.; Novriyani, M.; Suparno, S.; Hidayat, R.; Riyanto, A.

    2017-07-01

    This study presents the integration of seismic stochastic inversion and multi-attribute analysis for delineating the reservoir distribution, in terms of lithology and porosity, in the formation within the depth interval between the Top Sihapas and the Top Pematang. The method used is a stochastic inversion integrated with multi-attribute seismic analysis based on a Probabilistic Neural Network (PNN). Stochastic methods are used to predict the probability of sandstone mapping from the impedance varied over 50 realizations, which produces a good probability estimate. Stochastic seismic inversion is more interpretive because it directly gives the value of the property. Our experiment shows that the acoustic impedance (AI) from stochastic inversion captures more diverse uncertainty, so that the probability values will be close to the actual ones. The produced AI is then used as an input for a multi-attribute analysis, which is used to predict the gamma ray, density and porosity logs. To select the attributes that are used, a stepwise regression algorithm is applied. The resulting attributes are used in the PNN process. The PNN method is chosen because it has the best correlation among neural network methods. Finally, the products of the multi-attribute analysis, in the form of pseudo-gamma-ray, density and pseudo-porosity volumes, are interpreted to delineate the reservoir distribution. Our interpretation shows that the structural trap is identified in the southeastern part of the study area, along the anticline.
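
    For readers unfamiliar with the PNN, a minimal Parzen-window form conveys the idea. The sketch below is a classifier; predicting continuous logs, as in this study, would use the closely related kernel-regression (GRNN-style) variant in which the kernel responses weight the training targets. Data shapes and the smoothing width sigma are illustrative assumptions.

```python
import numpy as np

def pnn_predict(X_train, y_train, X_test, sigma=0.1):
    """Probabilistic Neural Network: one Gaussian Parzen kernel per
    training sample; each test point goes to the class whose kernels
    respond most strongly on average."""
    classes = np.unique(y_train)
    preds = []
    for xt in X_test:
        d2 = np.sum((X_train - xt) ** 2, axis=1)
        k = np.exp(-d2 / (2.0 * sigma ** 2))
        scores = [k[y_train == c].mean() for c in classes]
        preds.append(classes[int(np.argmax(scores))])
    return np.array(preds)

rng = np.random.default_rng(8)
X = np.vstack([rng.normal(0.0, 0.2, (50, 2)), rng.normal(1.0, 0.2, (50, 2))])
y = np.array([0] * 50 + [1] * 50)
print(pnn_predict(X, y, np.array([[0.1, 0.0], [0.9, 1.1]])))
```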

  5. Sneaky Gamma-Rays: Using Gravitational Lensing to Avoid Gamma-Gamma-Absorption

    NASA Astrophysics Data System (ADS)

    Boettcher, Markus; Barnacka, Anna

    2014-08-01

    It has recently been suggested that gravitational lensing studies of gamma-ray blazars might be a promising avenue to probe the location of the gamma-ray emitting region in blazars. Motivated by these prospects, we have investigated potential gamma-gamma absorption signatures of intervening lenses in the very-high-energy gamma-ray emission from lensed blazars. We considered intervening galaxies and individual stars within these galaxies. We find that the collective radiation field of galaxies acting as sources of macrolensing is not expected to lead to significant gamma-gamma absorption. Individual stars within intervening galaxies could, in principle, cause a significant opacity to gamma-gamma absorption for VHE gamma-rays if the impact parameter (the distance of closest approach of the gamma-ray to the center of the star) is small enough. However, we find that the curvature of the photon path due to gravitational lensing will cause gamma-ray photons to maintain a sufficiently large distance from such stars to avoid significant gamma-gamma absorption. This reinforces the prospect of gravitational-lensing studies of gamma-ray blazars without interference from gamma-gamma absorption by the lensing objects.

  6. Neutron Capture Energies for Flux Normalization and Approximate Model for Gamma-Smeared Power

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Kang Seog; Clarno, Kevin T.; Liu, Yuxuan

    The Consortium for Advanced Simulation of Light Water Reactors (CASL) Virtual Environment for Reactor Applications (VERA) neutronics simulator MPACT has used a single recoverable fission energy for each fissionable nuclide, assuming that all recoverable energies come only from the fission reaction, for which capture energy is merged with fission energy. This approach includes approximations and requires improvement by separating capture energy from the merged effective recoverable energy. This report documents the procedure to generate recoverable neutron capture energies and the development of a program called CapKappa to generate capture energies. Recoverable neutron capture energies have been generated by using CapKappa with the evaluated nuclear data file (ENDF)/B-7.0 and 7.1 cross section and decay libraries. The new capture kappas were compared to the current SCALE-6.2 and CASMO-5 capture kappas. These new capture kappas have been incorporated into the Simplified AMPX 51- and 252-group libraries, and they can be used for the AMPX multigroup (MG) libraries and the SCALE code package. The CASL VERA neutronics simulator MPACT does not include a gamma transport capability, which limits it to explicitly estimating local energy deposition from fission, neutron and gamma slowing down, and capture. Since the mean free path of gamma rays is typically much longer than that of the neutron, and the total gamma energy is about 10% of the total energy, the gamma-smeared power distribution is different from the fission power distribution. Explicit local energy deposition through neutron and gamma transport calculation is significantly important in multi-physics whole-core simulation with thermal-hydraulic feedback. Therefore, the gamma transport capability should be incorporated into the CASL neutronics simulator MPACT. However, this task will be time-consuming in developing the neutron-induced gamma production and gamma cross section libraries. This study is to

  7. The return period analysis of natural disasters with statistical modeling of bivariate joint probability distribution.

    PubMed

    Li, Ning; Liu, Xueqin; Xie, Wei; Wu, Jidong; Zhang, Peng

    2013-01-01

    New features of natural disasters have been observed over the last several years. The factors that influence the disasters' formation mechanisms, regularity of occurrence and main characteristics have been revealed to be more complicated and diverse in nature than previously thought. As the uncertainty involved increases, the variables need to be examined further. This article discusses the importance and the shortage of multivariate analysis of natural disasters and presents a method to estimate the joint probability of the return periods and perform a risk analysis. Severe dust storms from 1990 to 2008 in Inner Mongolia were used as a case study to test this new methodology, as they are normal and recurring climatic phenomena on Earth. Based on the 79 investigated events and according to the bivariate dust storm definition, the joint probability distribution of severe dust storms was established using the observed data of maximum wind speed and duration. The joint return periods of severe dust storms were calculated, and the relevant risk was analyzed according to the joint probability. The copula function is able to simulate severe dust storm disasters accurately. The joint return periods generated are closer to those observed in reality than the univariate return periods and thus have more value in severe dust storm disaster mitigation, strategy making, program design, and improvement of risk management. This research may prove useful in risk-based decision making. The exploration of multivariate analysis methods can also lay the foundation for further applications in natural disaster risk analysis. © 2012 Society for Risk Analysis.
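
    A hedged sketch of the joint-return-period arithmetic with a Gumbel copula, a common choice for positively dependent extremes (the abstract does not state which copula family was used; the dependence parameter and event rate below are illustrative).

```python
import math

def gumbel_copula(u, v, theta):
    """Gumbel copula C(u,v) = exp(-[(-ln u)^theta + (-ln v)^theta]^(1/theta))."""
    s = (-math.log(u)) ** theta + (-math.log(v)) ** theta
    return math.exp(-s ** (1.0 / theta))

def joint_return_period_and(u, v, theta, events_per_year):
    """'AND' return period: both variables exceed their marginal quantiles.
    Uses P(U > u, V > v) = 1 - u - v + C(u, v)."""
    p_and = 1.0 - u - v + gumbel_copula(u, v, theta)
    return 1.0 / (events_per_year * p_and)

# wind speed and duration both above their 90th-percentile values,
# with illustrative dependence theta = 2 and about 4 storms per year
print(joint_return_period_and(0.9, 0.9, theta=2.0, events_per_year=4.0))
```

    The univariate analogue, 1 / (events_per_year * (1 - u)), would give 2.5 years here, while the joint 'AND' event comes out around 4 years; the gap between univariate and joint return periods is exactly what the authors exploit.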

  8. High-energy Neutrino Emission from Short Gamma-Ray Bursts: Prospects for Coincident Detection with Gravitational Waves

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kimura, Shigeo S.; Murase, Kohta; Mészáros, Peter

    We investigate current and future prospects for coincident detection of high-energy neutrinos and gravitational waves (GWs). Short gamma-ray bursts (SGRBs) are believed to originate from mergers of compact star binaries involving neutron stars. We estimate high-energy neutrino fluences from prompt emission, extended emission (EE), X-ray flares, and plateau emission, and we show that neutrino signals associated with the EE are the most promising. Assuming that the cosmic-ray loading factor is ∼10 and the Lorentz factor distribution is lognormal, we calculate the probability of neutrino detection from EE by current and future neutrino detectors, and we find that the quasi-simultaneous detection of high-energy neutrinos, gamma-rays, and GWs is possible with future instruments or even with current instruments for nearby SGRBs having EE. We also discuss stacking analyses that will also be useful with future experiments such as IceCube-Gen2.

  9. OBSERVATION OF TeV GAMMA RAYS FROM THE FERMI BRIGHT GALACTIC SOURCES WITH THE TIBET AIR SHOWER ARRAY

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Amenomori, M.; Bi, X. J.; Ding, L. K.

    2010-01-20

    Using the Tibet-III air shower array, we search for TeV γ-rays from 27 potential Galactic sources in the early list of bright sources obtained by the Fermi Large Area Telescope at energies above 100 MeV. Among them, we observe seven sources, instead of the expected 0.61 sources, with an excess at a significance of 2σ or more. The chance probability from Poisson statistics would be estimated to be 3.8 × 10^-6. If the excess distribution observed by the Tibet-III array has a density gradient toward the Galactic plane, the expected number of sources may be enhanced in chance association. Then, the chance probability rises slightly, to 1.2 × 10^-5, based on a simple Monte Carlo simulation. These low chance probabilities clearly show that the Fermi bright Galactic sources have statistically significant correlations with TeV γ-ray excesses. We also find that all seven sources are associated with pulsars, and six of them are coincident with sources detected by the Milagro experiment at a significance of 3σ or more at the representative energy of 35 TeV. The significance maps observed by the Tibet-III air shower array around the Fermi sources which are coincident with the Milagro ≥3σ sources are consistent with the Milagro observations. This is the first result of the northern sky survey of the Fermi bright Galactic sources in the TeV region.
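
    The quoted chance probability is a Poisson tail computation that is easy to reproduce approximately (scipy call shown as a sketch; the small residual difference from the quoted value presumably reflects rounding of the 0.61 expectation).

```python
from scipy import stats

expected, observed = 0.61, 7
p_chance = stats.poisson.sf(observed - 1, expected)  # P(N >= 7 | mean 0.61)
print(p_chance)  # ~3.7e-6, close to the quoted 3.8e-6
```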

  10. An evaluation of procedures to estimate monthly precipitation probabilities

    NASA Astrophysics Data System (ADS)

    Legates, David R.

    1991-01-01

    Many frequency distributions have been used to evaluate monthly precipitation probabilities. Eight of these distributions (including Pearson type III, extreme value, and transform normal probability density functions) are comparatively examined to determine their ability to represent accurately variations in monthly precipitation totals for global hydroclimatological analyses. Results indicate that a modified version of the Box-Cox transform-normal distribution more adequately describes the 'true' precipitation distribution than does any of the other methods. This assessment was made using a cross-validation procedure for a global network of 253 stations for which at least 100 years of monthly precipitation totals were available.
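
    A sketch of the standard Box-Cox transform-normal fit (the study uses a modified version, whose details are not given here; the data and the 50 mm query are synthetic placeholders):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
monthly_precip = rng.gamma(shape=2.0, scale=40.0, size=100)  # synthetic, mm

# find the Box-Cox power lambda that makes the data most nearly Gaussian
transformed, lam = stats.boxcox(monthly_precip)
mu, sd = transformed.mean(), transformed.std(ddof=1)

# probability of a month with less than 50 mm, evaluated on the
# transformed scale: y = (x^lambda - 1) / lambda
q = (50.0 ** lam - 1.0) / lam if lam != 0.0 else np.log(50.0)
print("P(precip < 50 mm) ~", stats.norm.cdf((q - mu) / sd))
```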

  11. Dynamic gamma knife radiosurgery

    NASA Astrophysics Data System (ADS)

    Luan, Shuang; Swanson, Nathan; Chen, Zhe; Ma, Lijun

    2009-03-01

    Gamma knife has been the treatment of choice for various brain tumors and functional disorders. Current gamma knife radiosurgery is planned in a 'ball-packing' approach and delivered in a 'step-and-shoot' manner, i.e. it aims to 'pack' the different sized spherical high-dose volumes (called 'shots') into a tumor volume. We have developed a dynamic scheme for gamma knife radiosurgery based on the concept of 'dose-painting' to take advantage of the new robotic patient positioning system on the latest Gamma Knife C™ and Perfexion™ units. In our scheme, the spherical high dose volume created by the gamma knife unit will be viewed as a 3D spherical 'paintbrush', and treatment planning reduces to finding the best route of this 'paintbrush' to 'paint' a 3D tumor volume. Under our dose-painting concept, gamma knife radiosurgery becomes dynamic, where the patient moves continuously under the robotic positioning system. We have implemented a fully automatic dynamic gamma knife radiosurgery treatment planning system, where the inverse planning problem is solved as a traveling salesman problem combined with constrained least-square optimizations. We have also carried out experimental studies of dynamic gamma knife radiosurgery and showed the following. (1) Dynamic gamma knife radiosurgery is ideally suited for fully automatic inverse planning, where high quality radiosurgery plans can be obtained in minutes of computation. (2) Dynamic radiosurgery plans are more conformal than step-and-shoot plans and can maintain a steep dose gradient (around 13% per mm) between the target tumor volume and the surrounding critical structures. (3) It is possible to prescribe multiple isodose lines with dynamic gamma knife radiosurgery, so that the treatment can cover the periphery of the target volume while escalating the dose for high tumor burden regions. (4) With dynamic gamma knife radiosurgery, one can obtain a family of plans representing a tradeoff between the delivery time and the
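
    A toy sketch of the two ingredients named above, a travelling-salesman-style ordering of shot positions and a constrained least-squares fit of nonnegative beam-on weights; the geometry, kernel width and sizes are all illustrative stand-ins for a real planning system.

```python
import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(7)
shots = rng.uniform(0.0, 30.0, size=(12, 3))  # candidate shot centres, mm

# (1) nearest-neighbour tour: a cheap travelling-salesman heuristic
route, remaining = [0], set(range(1, len(shots)))
while remaining:
    last = shots[route[-1]]
    nxt = min(remaining, key=lambda j: np.linalg.norm(shots[j] - last))
    route.append(nxt)
    remaining.remove(nxt)

# (2) nonnegative least squares: columns are spherical (Gaussian) dose
# kernels sampled at voxels, weights play the role of beam-on times
voxels = rng.uniform(0.0, 30.0, size=(200, 3))
A = np.exp(-np.linalg.norm(voxels[:, None] - shots[None], axis=2) ** 2 / 50.0)
weights, residual = nnls(A, np.ones(len(voxels)))  # uniform prescription
print("tour:", route, " fit residual:", round(residual, 3))
```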

  12. The Impact of Spatial and Temporal Resolutions in Tropical Summer Rainfall Distribution: Preliminary Results

    NASA Astrophysics Data System (ADS)

    Liu, Q.; Chiu, L. S.; Hao, X.

    2017-10-01

    The abundance or lack of rainfall affects people's lives and activities. As a major component of the global hydrological cycle (Chokngamwong & Chiu, 2007), accurate representations at various spatial and temporal scales are crucial for many decision-making processes. Climate models show a warmer and wetter climate due to increases of Greenhouse Gases (GHG). However, the models' resolutions are often too coarse to be directly applicable to local scales that are useful for mitigation purposes. Hence disaggregation (downscaling) procedures are needed to transfer the coarse scale products to higher spatial and temporal resolutions. The aim of this paper is to examine the changes in the statistical parameters of rainfall at various spatial and temporal resolutions. The TRMM Multi-satellite Precipitation Analysis (TMPA) 0.25 degree, 3 hourly grid rainfall data for a summer are aggregated to 0.5, 1.0, 2.0 and 2.5 degree and to 6, 12, 24 hourly, pentad (five days) and monthly resolutions. The probability distribution functions (PDF) and cumulative distribution functions (CDF) of rain amount at these resolutions are computed and modeled as a mixed distribution. Parameters of the PDFs are compared using the Kolmogorov-Smirnov (KS) test, both for the mixed and the marginal distributions. These distributions are shown to be distinct. The marginal distributions are fitted with Lognormal and Gamma distributions, and it is found that the Gamma distributions fit much better than the Lognormal.
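
    A sketch of the marginal-fit comparison for the wet part of such a mixed distribution (synthetic rain amounts; in the mixed model the probability mass at zero would be handled separately):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
rain = rng.gamma(shape=0.8, scale=5.0, size=2000)  # synthetic wet amounts

for dist in (stats.gamma, stats.lognorm):
    params = dist.fit(rain, floc=0.0)          # keep the origin at zero
    ks = stats.kstest(rain, dist.name, args=params)
    print(dist.name, "KS statistic:", round(ks.statistic, 4))
```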

  13. Gamma-delta t-cell lymphomas.

    PubMed

    Foppoli, Marco; Ferreri, Andrés J M

    2015-03-01

    Gamma-delta T-cell lymphomas are aggressive and rare diseases originating from gamma-delta lymphocytes. These cells, which naturally play a role in the innate, non-specific immune response, develop from thymic precursors in the bone marrow, lack the major histocompatibility complex restrictions and can be divided into two subpopulations: Vdelta1, mostly represented in the intestine, and Vdelta2, mainly located in the skin, tonsils and lymph nodes. Chronic immunosuppression, such as in solid organ transplant recipients, and prolonged antigenic exposure are probably the strongest risk factors for triggering lymphomagenesis. Two entities are recognised by the 2008 WHO Classification: hepatosplenic gamma-delta T-cell lymphoma (HSGDTL) and primary cutaneous gamma-delta T-cell lymphoma (PCGDTL). The former is more common among young males, presenting with B symptoms, splenomegaly and thrombocytopenia, usually with the absence of nodal involvement. The natural behaviour of HSGDTL is characterised by low response rates, poor treatment tolerability, common early progression of disease and disappointing survival figures. PCGDTL accounts for <1% of all primary cutaneous lymphomas, occurring in adults with relevant comorbidities. Cutaneous lesions may vary, but its clinical behaviour is usually aggressive and long-term survival is anecdotal. The available literature on gamma-delta T-cell lymphomas is fragmented, mostly consisting of case reports or small cumulative series. Therefore, clinical suspicion and diagnosis are usually delayed, and therapeutic management remains to be established. This review critically analyses available evidence on the diagnosis, staging and behaviour of gamma-delta T-cell lymphomas, provides recommendations for therapeutic management in routine practice and discusses relevant unmet clinical needs for future studies. © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  14. A Monte Carlo Simulation of Prompt Gamma Emission from Fission Fragments

    NASA Astrophysics Data System (ADS)

    Regnier, D.; Litaize, O.; Serot, O.

    2013-03-01

    The prompt fission gamma spectra and multiplicities are investigated with the Monte Carlo code FIFRELIN, which is developed at the CEA Cadarache research center. Knowing the fully accelerated fragment properties, their de-excitation is simulated through a cascade of neutron, gamma and/or electron emissions. This paper presents the recent developments in the FIFRELIN code and the results obtained on the spontaneous fission of 252Cf. Concerning the decay cascade simulation, a full Hauser-Feshbach model is compared with a previous one using a Weisskopf spectrum for neutron emission. Particular attention is paid to the treatment of the neutron/gamma competition. Calculations performed using different level density and gamma strength function models show significant discrepancies in the slope of the gamma spectra at high energy. The underestimation of the prompt gamma spectra obtained regardless of our de-excitation cascade modeling choice is discussed. This discrepancy is probably linked to an underestimation of the post-neutron fragment spins in our calculation.

  15. USE OF MODELS FOR GAMMA SHIELDING STUDIES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clifford, C.E.

    1962-02-01

    The use of models for shielding studies of buildings exposed to gamma radiation was evaluated by comparing the dose distributions produced in a blockhouse with movable inside walls exposed to 0.66 MeV gamma radiation with the corresponding distributions in an iron 1 to 10 scale model. The effects of air and ground scaling on the readings in the model were also investigated. Iron appeared to be a suitable model material for simple closed buildings, but for more complex structures it appeared that the use of iron models would progressively overestimate the gamma shielding protection as the complexity increased. (auth)

  16. Prediction of fatty acid-binding residues on protein surfaces with three-dimensional probability distributions of interacting atoms.

    PubMed

    Mahalingam, Rajasekaran; Peng, Hung-Pin; Yang, An-Suei

    2014-08-01

    Protein-fatty acid interaction is vital for many cellular processes, and understanding this interaction is important for functional annotation as well as drug discovery. In this work, we present a method for predicting fatty acid (FA)-binding residues using three-dimensional probability density distributions of interacting FA atoms on protein surfaces, which are derived from known protein-FA complex structures. A machine learning algorithm was established to learn the characteristic patterns of the probability density maps specific to the FA-binding sites. The predictor was trained with five-fold cross-validation on a non-redundant training set and then evaluated with an independent test set as well as on a holo-apo pairs dataset. The results showed good accuracy in predicting the FA-binding residues. Further, the predictor developed in this study is implemented as an online server which is freely accessible at the following website: http://ismblab.genomics.sinica.edu.tw/. Copyright © 2014 Elsevier B.V. All rights reserved.

  17. Characterizing the Lyα forest flux probability distribution function using Legendre polynomials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cieplak, Agnieszka M.; Slosar, Anze

    The Lyman-α forest is a highly non-linear field with considerable information available in the data beyond the power spectrum. The flux probability distribution function (PDF) has been used as a successful probe of small-scale physics. In this paper we argue that measuring coefficients of the Legendre polynomial expansion of the PDF offers several advantages over measuring the binned values as is commonly done. In particular, the n-th Legendre coefficient can be expressed as a linear combination of the first n moments, allowing these coefficients to be measured in the presence of noise and allowing a clear route for marginalisation over mean flux. Moreover, in the presence of noise, our numerical work shows that a finite number of coefficients are well measured with a very sharp transition into noise dominance. This compresses the available information into a small number of well-measured quantities. In conclusion, we find that the amount of recoverable information is a very non-linear function of spectral noise that strongly favors fewer quasars measured at better signal to noise.

  18. Characterizing the Lyα forest flux probability distribution function using Legendre polynomials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cieplak, Agnieszka M.; Slosar, Anže, E-mail: acieplak@bnl.gov, E-mail: anze@bnl.gov

    The Lyman-α forest is a highly non-linear field with considerable information available in the data beyond the power spectrum. The flux probability distribution function (PDF) has been used as a successful probe of small-scale physics. In this paper we argue that measuring coefficients of the Legendre polynomial expansion of the PDF offers several advantages over measuring the binned values as is commonly done. In particular, the n-th Legendre coefficient can be expressed as a linear combination of the first n moments, allowing these coefficients to be measured in the presence of noise and allowing a clear route for marginalisation over mean flux. Moreover, in the presence of noise, our numerical work shows that a finite number of coefficients are well measured with a very sharp transition into noise dominance. This compresses the available information into a small number of well-measured quantities. We find that the amount of recoverable information is a very non-linear function of spectral noise that strongly favors fewer quasars measured at better signal to noise.

  19. Characterizing the Lyα forest flux probability distribution function using Legendre polynomials

    NASA Astrophysics Data System (ADS)

    Cieplak, Agnieszka M.; Slosar, Anže

    2017-10-01

    The Lyman-α forest is a highly non-linear field with considerable information available in the data beyond the power spectrum. The flux probability distribution function (PDF) has been used as a successful probe of small-scale physics. In this paper we argue that measuring coefficients of the Legendre polynomial expansion of the PDF offers several advantages over measuring the binned values as is commonly done. In particular, the n-th Legendre coefficient can be expressed as a linear combination of the first n moments, allowing these coefficients to be measured in the presence of noise and allowing a clear route for marginalisation over mean flux. Moreover, in the presence of noise, our numerical work shows that a finite number of coefficients are well measured with a very sharp transition into noise dominance. This compresses the available information into a small number of well-measured quantities. We find that the amount of recoverable information is a very non-linear function of spectral noise that strongly favors fewer quasars measured at better signal to noise.

  20. Characterizing the Lyα forest flux probability distribution function using Legendre polynomials

    DOE PAGES

    Cieplak, Agnieszka M.; Slosar, Anze

    2017-10-12

    The Lyman-α forest is a highly non-linear field with considerable information available in the data beyond the power spectrum. The flux probability distribution function (PDF) has been used as a successful probe of small-scale physics. In this paper we argue that measuring coefficients of the Legendre polynomial expansion of the PDF offers several advantages over measuring the binned values as is commonly done. In particular, the n-th Legendre coefficient can be expressed as a linear combination of the first n moments, allowing these coefficients to be measured in the presence of noise and allowing a clear route for marginalisation over mean flux. Moreover, in the presence of noise, our numerical work shows that a finite number of coefficients are well measured with a very sharp transition into noise dominance. This compresses the available information into a small number of well-measured quantities. In conclusion, we find that the amount of recoverable information is a very non-linear function of spectral noise that strongly favors fewer quasars measured at better signal to noise.
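
    The central computational claim of these records, that the n-th Legendre coefficient of the PDF is a linear combination of the first n moments, is easy to check numerically; the sketch below uses a synthetic stand-in for the flux, which lives in [0, 1] and is mapped onto the Legendre support [-1, 1].

```python
import numpy as np
from numpy.polynomial import legendre

# c_n = (2n + 1)/2 * E[P_n(X)], and P_n is a degree-n polynomial, so c_n
# is a linear combination of the moments E[X^0], ..., E[X^n].
rng = np.random.default_rng(5)
flux = rng.beta(5.0, 2.0, size=100_000)  # stand-in for transmitted flux
x = 2.0 * flux - 1.0                     # map [0, 1] onto [-1, 1]

coeffs = []
for n in range(7):
    pn_pow = legendre.leg2poly([0.0] * n + [1.0])   # power coeffs of P_n
    moments = np.array([np.mean(x ** k) for k in range(n + 1)])
    coeffs.append((2 * n + 1) / 2.0 * pn_pow @ moments)
print(np.round(coeffs, 4))
```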

  1. Sensory-driven and spontaneous gamma oscillations engage distinct cortical circuitry

    PubMed Central

    2015-01-01

    Gamma oscillations are a robust component of sensory responses but are also part of the background spontaneous activity of the brain. To determine whether the properties of gamma oscillations in cortex are specific to their mechanism of generation, we compared in mouse visual cortex in vivo the laminar geometry and single-neuron rhythmicity of oscillations produced during sensory representation with those occurring spontaneously in the absence of stimulation. In mouse visual cortex under anesthesia (isoflurane and xylazine), visual stimulation triggered oscillations mainly between 20 and 50 Hz, which, because of their similar functional significance to gamma oscillations in higher mammals, we define here as gamma range. Sensory representation in visual cortex specifically increased gamma oscillation amplitude in the supragranular (L2/3) and granular (L4) layers and strongly entrained putative excitatory and inhibitory neurons in infragranular layers, while spontaneous gamma oscillations were distributed evenly through the cortical depth and primarily entrained putative inhibitory neurons in the infragranular (L5/6) cortical layers. The difference in laminar distribution of gamma oscillations during the two different conditions may result from differences in the source of excitatory input to the cortex. In addition, modulation of superficial gamma oscillation amplitude did not result in a corresponding change in deep-layer oscillations, suggesting that superficial and deep layers of cortex may utilize independent but related networks for gamma generation. These results demonstrate that stimulus-driven gamma oscillations engage cortical circuitry in a manner distinct from spontaneous oscillations and suggest multiple networks for the generation of gamma oscillations in cortex. PMID:26719085

  2. On the detection of very high redshift gamma-ray bursts with Swift

    NASA Astrophysics Data System (ADS)

    Salvaterra, R.; Campana, S.; Chincarini, G.; Tagliaferri, G.; Covino, S.

    2007-09-01

    We compute the probability of detecting long gamma-ray bursts (GRBs) at z >= 5 with Swift, assuming that GRBs form preferentially in low-metallicity environments. The model fits both the observed Burst and Transient Source Experiment (BATSE) and Swift GRB differential peak flux distributions well and is consistent with the number of z >= 2.5 detections in the 2-yr Swift data. We find that the probability of observing a burst at z >= 5 becomes larger than 10 per cent for photon fluxes P < 1 ph s^-1 cm^-2, consistent with the number of confirmed detections. The corresponding fraction of z >= 5 bursts in the Swift catalogue is ~10-30 per cent depending on the adopted metallicity threshold for GRB formation. We propose to use the computed probability as a tool to identify high-redshift GRBs. By jointly considering promptly available information provided by Swift and model results, we can select reliable z >= 5 candidates within a few hours of the BAT detection. We test the procedure against the past year of Swift data: only three bursts match all our requirements, two being confirmed at z >= 5. Another three possible candidates are picked up by slightly relaxing the adopted criteria. No low-z interloper is found among the six candidates.

  3. Pan-European comparison of candidate distributions for climatological drought indices, SPI and SPEI

    NASA Astrophysics Data System (ADS)

    Stagge, James; Tallaksen, Lena; Gudmundsson, Lukas; Van Loon, Anne; Stahl, Kerstin

    2013-04-01

    Drought indices are vital to objectively quantify and compare drought severity, duration, and extent across regions with varied climatic and hydrologic regimes. The Standardized Precipitation Index (SPI), a well-reviewed meteorological drought index recommended by the WMO, and its more recent water balance variant, the Standardized Precipitation-Evapotranspiration Index (SPEI), both rely on the selection of univariate probability distributions to normalize the index, allowing for comparisons across climates. The SPI, considered a universal meteorological drought index, measures anomalies in precipitation, whereas the SPEI measures anomalies in climatic water balance (precipitation minus potential evapotranspiration), a more comprehensive measure of water availability that incorporates temperature. Many reviewers recommend use of the gamma (Pearson Type III) distribution for SPI normalization, while the developers of the SPEI recommend the three-parameter log-logistic distribution, based on point observation validation. Before the SPEI can be implemented at the pan-European scale, it is necessary to further validate the index using a range of candidate distributions to determine sensitivity to distribution selection, identify recommended distributions, and highlight those instances where a given distribution may not be valid. This study rigorously compares a suite of candidate probability distributions using WATCH Forcing Data, a global, historical (1958-2001) climate dataset based on ERA40 reanalysis with 0.5 x 0.5 degree resolution and bias-correction based on CRU-TS2.1 observations. Using maximum likelihood estimation, alternative candidate distributions are fit for the SPI and SPEI across the range of European climate zones. When evaluated at this scale, the gamma distribution for the SPI results in negatively skewed values, exaggerating the index severity of extreme dry conditions, while decreasing the index severity of extreme high precipitation. This bias is
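
    For the SPI half of the comparison, a minimal gamma-based normalization looks as follows (the zero-precipitation convention and the synthetic record are assumptions of this sketch):

```python
import numpy as np
from scipy import stats

def spi(precip):
    """Standardized Precipitation Index for one calendar month: fit a
    gamma distribution to the nonzero totals, mix in the dry-month
    probability, and map the cumulative probability to a standard normal."""
    precip = np.asarray(precip, dtype=float)
    p_zero = np.mean(precip == 0.0)
    wet = precip[precip > 0.0]
    a, loc, scale = stats.gamma.fit(wet, floc=0.0)
    cdf = np.where(precip > 0.0,
                   p_zero + (1.0 - p_zero) * stats.gamma.cdf(precip, a, scale=scale),
                   p_zero / 2.0)   # one common convention for zero totals
    return stats.norm.ppf(cdf)

rng = np.random.default_rng(6)
print(np.round(spi(rng.gamma(2.0, 30.0, size=44)), 2)[:5])
```

    The SPEI replaces the precipitation totals with precipitation minus potential evapotranspiration, which can be negative, hence the preference for a three-parameter family such as the log-logistic.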

  4. On the use of the energy probability distribution zeros in the study of phase transitions

    NASA Astrophysics Data System (ADS)

    Mól, L. A. S.; Rodrigues, R. G. M.; Stancioli, R. A.; Rocha, J. C. S.; Costa, B. V.

    2018-04-01

    This contribution is devoted to some technical aspects related to the use of the recently proposed energy probability distribution zeros in the study of phase transitions. This method is based on partial knowledge of the partition function zeros and has been shown to be extremely efficient at precisely locating phase transition temperatures. It is based on an iterative method such that the transition temperature can be approached at will. The iterative method is detailed, and some convergence issues that have been observed in its application to the 2D Ising model and to an artificial spin ice model are shown, together with ways to circumvent them.
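
    A schematic of one step of such an iteration, under loose assumptions about the exact construction (conventions for reweighting the histogram and updating beta vary); it only illustrates the mechanics of extracting a dominant zero from an energy histogram.

```python
import numpy as np

def epd_dominant_zero(energies, nbins=64):
    """Histogram the sampled energies and treat the counts h_j as the
    coefficients of the polynomial sum_j h_j x^j; return the zero closest
    to x = 1, which pinches the real axis near a phase transition."""
    h, edges = np.histogram(energies, bins=nbins)
    zeros = np.roots(h[::-1].astype(float))  # np.roots wants highest power first
    z = zeros[np.argmin(np.abs(zeros - 1.0))]
    return z, edges[1] - edges[0]

# schematic update: with x = exp(-dbeta * ebin), the modulus of the
# dominant zero suggests the shift toward the transition temperature,
# e.g. beta_new = beta_old - log(abs(z)) / ebin, iterated to convergence
```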

  5. Airborne gamma-ray and magnetic anomaly signatures of serpentinite in relation to soil geochemistry, northern California

    USGS Publications Warehouse

    McCafferty, A.E.; Van Gosen, B. S.

    2009-01-01

    Serpentinized ultramafic rocks and associated soils in northern California are characterized by high concentrations of Cr and Ni, low levels of radioelements (K, Th, and U) and high amounts of ferrimagnetic minerals (primarily magnetite). Geophysical attributes over ultramafic rocks, which include airborne gamma-ray and magnetic anomaly data, are quantified and provide indirect measurements on the relative abundance of radioelements and magnetic minerals, respectively. Attributes are defined through a statistical modeling approach and the results are portrayed as probabilities in chart and map form. Two predictive models are presented, including one derived from the aeromagnetic anomaly data and one from a combination of the airborne K, Th and U gamma-ray data. Both models distinguish preferential values within the aerogeophysical data that coincide with mapped and potentially unmapped ultramafic rocks. The magnetic predictive model shows positive probabilities associated with magnetic anomaly highs and, to a lesser degree, anomaly lows, which accurately locate many known ultramafic outcrops, but more interestingly, locate potentially unmapped ultramafic rocks, possible extensions of ultramafic bodies that dip into the shallow subsurface, as well as prospective buried ultramafic rocks. The airborne radiometric model shows positive probabilities in association with anomalously low gamma radiation measurements over ultramafic rock, which is similar to that produced by gabbro, metavolcanic rock, and water bodies. All of these features share the characteristic of being depleted in K, Th and U. Gabbro is the only rock type in the study area that shares similar magnetic properties with the ultramafic rock. The aerogeophysical model results are compared to the distribution of ultramafic outcrops and to Cr, Ni, K, Th and U concentrations and magnetic susceptibility measurements from soil samples. Analysis of the soil data indicates high positive correlation between

  6. Alpha, beta, or gamma: where does all the diversity go?

    NASA Technical Reports Server (NTRS)

    Sepkoski, J. J., Jr. (Principal Investigator)

    1988-01-01

    Global taxonomic richness is affected by variation in three components: within-community, or alpha, diversity; between-community, or beta, diversity; and between-region, or gamma, diversity. A data set consisting of 505 faunal lists distributed among 40 stratigraphic intervals and six environmental zones was used to investigate how variation of alpha and beta diversity influenced global diversity through the Paleozoic, and especially during the Ordovician radiations. As first shown by Bambach (1977), alpha diversity increased by 50 to 70 percent in offshore marine environments during the Ordovician and then remained essentially constant for the remainder of the Paleozoic. The increase is insufficient, however, to account for the 300 percent rise observed in global generic diversity. It is shown that beta diversity among level soft-bottom communities also increased significantly during the early Paleozoic. This change is related to enhanced habitat selection, and presumably increased overall specialization, among diversifying taxa during the Ordovician radiations. Combined with alpha diversity, the measured change in beta diversity still accounts for only about half of the increase in global diversity. Other sources of increase are probably not related to variation in gamma diversity but rather to the appearance and/or expansion of organic reefs, hardground communities, bryozoan thickets, and crinoid gardens during the Ordovician.

  7. Neutron and gamma flux distributions and their implications for radiation damage in the shielded superconducting core of a fusion power plant

    NASA Astrophysics Data System (ADS)

    Windsor, Colin G.; Morgan, J. Guy

    2017-11-01

    The neutron and gamma ray fluxes within the shielded high-temperature superconducting central columns of proposed spherical tokamak power plants have been studied using the MCNP Monte-Carlo code. The spatial, energy and angular variations of the fluxes over the shield and superconducting core are computed and used to specify experimental studies relevant to radiation damage and activation. The mean neutron and gamma fluxes, averaged over energy and angle, are shown to decay exponentially through the shield and then to remain roughly constant in the core region. The mean energy of neutrons is shown to decay more slowly than the neutron flux through the shield, while the gamma energy is almost constant around 2 MeV. The differential neutron and gamma fluxes as a function of energy are examined. The neutron spectrum shows a fusion peak around 1 MeV, changing at lower energies into an epithermal E^(-0.85) variation and at thermal energies into a Maxwellian distribution. The neutron and gamma energy spectra are defined for the outer surface of the superconducting core, relevant to damage studies. The inclusion of tungsten boride in the shield is shown to reduce energy deposition. A series of plasma scenarios with varying plasma major radii between 0.6 and 2.5 m was considered. Neutron and gamma fluxes are shown to decay exponentially with plasma radius, except at low shield thickness. Using the currently known experimental fluence limitations for high temperature superconductors, the continuous running time before the fluence limit is reached has been calculated to be days at 1.4 m major radius increasing to years at 2.2 m. This work helps validate the concept of the spherical tokamak route to fusion power by demonstrating that the neutron shielding required for long-lifetime fusion power generation can be accommodated in a compact device.

  8. Confidence as Bayesian Probability: From Neural Origins to Behavior.

    PubMed

    Meyniel, Florent; Sigman, Mariano; Mainen, Zachary F

    2015-10-07

    Research on confidence spreads across several sub-fields of psychology and neuroscience. Here, we explore how a definition of confidence as Bayesian probability can unify these viewpoints. This computational view entails that there are distinct forms in which confidence is represented and used in the brain, including distributional confidence, pertaining to neural representations of probability distributions, and summary confidence, pertaining to scalar summaries of those distributions. Summary confidence is, normatively, derived or "read out" from distributional confidence. Neural implementations of readout will trade off optimality versus flexibility of routing across brain systems, allowing confidence to serve diverse cognitive functions. Copyright © 2015 Elsevier Inc. All rights reserved.

  9. Fisher's method of combining dependent statistics using generalizations of the gamma distribution with applications to genetic pleiotropic associations.

    PubMed

    Li, Qizhai; Hu, Jiyuan; Ding, Juan; Zheng, Gang

    2014-04-01

    A classical approach to combining independent test statistics is Fisher's combination of $p$-values, which follows the $\chi^2$ distribution. When the test statistics are dependent, the gamma distribution (GD) is commonly used for the Fisher's combination test (FCT). We propose to use two generalizations of the GD: the generalized and the exponentiated GDs. We study some properties of mis-using the GD for the FCT to combine dependent statistics when one of the two proposed distributions is true. Our results show that both generalizations have better control of type I error rates than the GD, which tends to have inflated type I error rates at more extreme tails. In practice, common model selection criteria (e.g. Akaike information criterion/Bayesian information criterion) can be used to help select a better distribution to use for the FCT. A simple strategy for applying the two generalizations of the GD in genome-wide association studies is discussed. Applications of the results to genetic pleiotropic associations are described, where multiple traits are tested for association with a single marker.
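
    For concreteness, here is a minimal Python sketch of the combination step, using the fact that for k independent p-values the statistic T = -2 * sum(log p_i) follows a chi-square distribution with 2k degrees of freedom, which is exactly the Gamma(shape = k, scale = 2) distribution; the function name and layout are illustrative assumptions, not code from the paper.

        import numpy as np
        from scipy import stats

        def fisher_combination(pvalues):
            """Fisher's combination statistic T = -2 * sum(log p_i).

            For k independent p-values T ~ chi-square(2k), i.e. the
            Gamma(shape=k, scale=2) distribution; for dependent statistics
            a gamma with moment-matched parameters is the common surrogate
            (the GD of the abstract).
            """
            p = np.asarray(pvalues, dtype=float)
            T = -2.0 * np.log(p).sum()
            k = p.size
            p_chi2 = stats.chi2.sf(T, df=2 * k)
            p_gamma = stats.gamma.sf(T, a=k, scale=2.0)  # identical when independent
            return T, p_chi2, p_gamma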

  10. Two Active States of the Narrow-Line Gamma-Ray-Loud AGN GB 1310 + 487

    NASA Technical Reports Server (NTRS)

    Sokolovsky, K. V.; Schinzel, F. K.; Tanaka, Y. T.; Abolmasov, P. K.; Angelakis, E.; Bulgarelli, A.; Carrasco, L.; Cenko, S. B.; Cheung, C. C.; Clubb, K. I.; et al.

    2014-01-01

    Context. Previously unremarkable, the extragalactic radio source GB 1310+487 showed a gamma-ray flare on 2009 November 18, reaching a daily flux of approximately 10^-6 photons cm^-2 s^-1 at energies E > 100 MeV, and became one of the brightest GeV sources for about two weeks. Its optical spectrum shows strong forbidden-line emission while lacking broad permitted lines, which is not typical for a blazar. Instead, the spectrum resembles those of narrow emission-line galaxies. Aims. We investigate changes in the object's radio-to-GeV spectral energy distribution (SED) during and after the prominent gamma-ray flare with the aim of determining the nature of the object and of constraining the origin of the variable high-energy emission. Methods. The data collected by the Fermi and AGILE satellites at gamma-ray energies; Swift at X-ray and ultraviolet (UV); the Kanata, NOT, and Keck telescopes at optical; OAGH and WISE at infrared (IR); and IRAM 30 m, OVRO 40 m, Effelsberg 100 m, RATAN-600, and VLBA at radio are analyzed together to trace the SED evolution on timescales of months. Results. The gamma-ray radio-loud narrow-line active galactic nucleus (AGN) is located at redshift z = 0.638. It shines through an unrelated foreground galaxy at z = 0.500. The AGN light is probably amplified by gravitational lensing. The AGN SED shows a two-humped structure typical of blazars and gamma-ray-loud narrow-line Seyfert 1 galaxies, with the high-energy (inverse-Compton) emission dominating by more than an order of magnitude over the low-energy (synchrotron) emission during gamma-ray flares. The difference between the two SED humps is smaller during the low-activity state. Fermi observations reveal a strong correlation between the gamma-ray flux and spectral index, with the hardest spectrum observed during the brightest gamma-ray state. The gamma-ray flares occurred before and during a slow rising trend in the radio, but no direct association between gamma-ray and

  11. Application of quasi-distributions for solving inverse problems of neutron and gamma-ray transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pogosbekyan, L.R.; Lysov, D.A.

    The inverse problems considered deal with the calculation of unknown parameters of nuclear installations by means of known (goal) functionals of neutron/gamma-ray distributions. Examples of these problems are the calculation of the automatic control rod positions as a function of neutron sensor readings, or the calculation of experimentally corrected values of cross-sections, isotope concentrations, and fuel enrichment via the measured functionals. The authors have developed a new method to solve the inverse problem. It finds the flux density as a quasi-solution of the particle-conservation linear system adjoined to the equalities for the functionals. The method is more effective than the one based on the classical perturbation theory. It is suitable for vectorization and can be used successfully in optimization codes.

  12. Failure modes and effects analysis (FMEA) for Gamma Knife radiosurgery.

    PubMed

    Xu, Andy Yuanguang; Bhatnagar, Jagdish; Bednarz, Greg; Flickinger, John; Arai, Yoshio; Vacsulka, Jonet; Feng, Wenzheng; Monaco, Edward; Niranjan, Ajay; Lunsford, L Dade; Huq, M Saiful

    2017-11-01

    Gamma Knife radiosurgery is a highly precise and accurate treatment technique for treating brain diseases with a low risk of serious error that nevertheless could potentially be reduced. We applied the AAPM Task Group 100 recommended failure modes and effects analysis (FMEA) tool to develop a risk-based quality management program for Gamma Knife radiosurgery. A team consisting of medical physicists, radiation oncologists, neurosurgeons, radiation safety officers, nurses, operating room technologists, and schedulers at our institution and an external physicist expert on Gamma Knife was formed for the FMEA study. A process tree and a failure mode table were created for the Gamma Knife radiosurgery procedures using the Leksell Gamma Knife Perfexion and 4C units. Three scores for the probability of occurrence (O), the severity (S), and the probability of no detection for a failure mode (D) were assigned to each failure mode by 8 professionals on a scale from 1 to 10. An overall risk priority number (RPN) for each failure mode was then calculated from the averaged O, S, and D scores. The coefficient of variation for each O, S, or D score was also calculated. The failure modes identified were prioritized in terms of both the RPN scores and the severity scores. The established process tree for Gamma Knife radiosurgery consists of 10 subprocesses and 53 steps, including a subprocess for frame placement and 11 steps that are directly related to the frame-based nature of Gamma Knife radiosurgery. Out of the 86 failure modes identified, 40 Gamma Knife specific failure modes were caused by the potential for inappropriate use of the radiosurgery head frame, the imaging fiducial boxes, the Gamma Knife helmets and plugs, the skull definition tools, as well as other features of the GammaPlan treatment planning system. The other 46 failure modes are associated with the registration, imaging, image transfer, and contouring processes that are common to all external beam radiation therapy
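
    As a worked illustration of the scoring arithmetic described above, the sketch below averages hypothetical O, S and D scores from 8 raters, forms the risk priority number RPN = O x S x D from the averaged scores, and reports each score's coefficient of variation; the numbers are invented for the example.

        import numpy as np

        # Hypothetical scores (1-10) from 8 raters for a single failure mode.
        O = np.array([3, 2, 4, 3, 2, 3, 4, 3])  # probability of occurrence
        S = np.array([8, 7, 9, 8, 8, 7, 9, 8])  # severity
        D = np.array([5, 6, 5, 4, 6, 5, 5, 6])  # probability of no detection

        rpn = O.mean() * S.mean() * D.mean()  # overall risk priority number
        cv = {name: x.std(ddof=1) / x.mean() for name, x in
              {"O": O, "S": S, "D": D}.items()}  # rater disagreement per score
        print(rpn, cv)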

  13. Exact valence bond entanglement entropy and probability distribution in the XXX spin chain and the Potts model.

    PubMed

    Jacobsen, J L; Saleur, H

    2008-02-29

    We determine exactly the probability distribution of the number N_c of valence bonds connecting a subsystem of length L >> 1 to the rest of the system in the ground state of the XXX antiferromagnetic spin chain. This provides, in particular, the asymptotic behavior of the valence-bond entanglement entropy S_VB = N_c ln 2 = (4 ln 2 / pi^2) ln L, disproving a recent conjecture that this should be related to the von Neumann entropy, and thus equal to (1/3) ln L. Our results generalize to the Q-state Potts model.

  14. Probability Distributions of Minkowski Distances between Discrete Random Variables.

    ERIC Educational Resources Information Center

    Schroger, Erich; And Others

    1993-01-01

    Minkowski distances are used to indicate the similarity of two vectors in an N-dimensional space. The article shows how to compute the probability function, the expectation, and the variance for Minkowski distances and for the special cases of the city-block distance and the Euclidean distance. Critical values for tests of significance are presented in tables. (SLD)
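
    When the component distributions are discrete, the probability function of such a distance can be obtained by direct enumeration. The following sketch (the function name and the uniform example are assumptions for illustration) computes the exact PMF of the order-r Minkowski distance between two i.i.d. random vectors.

        import itertools
        from collections import defaultdict

        def minkowski_distance_pmf(support, pmf, dim, r=1):
            """Exact PMF of the order-r Minkowski distance between two
            independent random vectors with i.i.d. components on `support`.
            r=1 is the city-block distance, r=2 the Euclidean distance."""
            dist_pmf = defaultdict(float)
            for x in itertools.product(support, repeat=dim):
                for y in itertools.product(support, repeat=dim):
                    p = 1.0
                    for xi, yi in zip(x, y):
                        p *= pmf[xi] * pmf[yi]
                    d = sum(abs(xi - yi) ** r for xi, yi in zip(x, y)) ** (1.0 / r)
                    dist_pmf[round(d, 12)] += p
            return dict(dist_pmf)

        # Components uniform on {0, 1, 2}, two dimensions, city-block distance.
        support = [0, 1, 2]
        pmf = {v: 1.0 / 3.0 for v in support}
        table = minkowski_distance_pmf(support, pmf, dim=2, r=1)
        mean = sum(d * p for d, p in table.items())  # expectation of the distance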

  15. Implications of the IRAS data for galactic gamma ray astronomy and EGRET

    NASA Technical Reports Server (NTRS)

    Stecker, Floyd W.

    1990-01-01

    Using the results of gamma-ray, millimeter wave, and far-infrared surveys of the galaxy, a logically consistent picture of the large scale distribution of galactic gas and cosmic rays was derived, tied to the overall processes of stellar birth and destruction on a galactic scale. Using the results of the IRAS far-infrared survey of the galaxy, the large scale radial distributions of galactic far-infrared emission were obtained independently for both the Northern and Southern Hemisphere sides of the Galaxy. The dominant feature in these distributions was found to be a broad peak coincident with the 5 kpc molecular gas cloud ring. Evidence was found for spiral arm features. Strong correlations are evident between the large scale galactic distributions of far-infrared emission, gamma-ray emission, and total CO emission. There is a particularly tight correlation between the distribution of warm molecular clouds and far-infrared emission on a galactic scale. The 5 kpc ring was evident in existing galactic gamma-ray data. The extent to which the more detailed spiral arm features are evident in the more resolved EGRET (Energetic Gamma Ray Experiment Telescope) data will help to determine more precisely the propagation characteristics of cosmic rays.

  16. New S control chart using skewness correction method for monitoring process dispersion of skewed distributions

    NASA Astrophysics Data System (ADS)

    Atta, Abdu; Yahaya, Sharipah; Zain, Zakiyah; Ahmed, Zalikha

    2017-11-01

    Control charts are established as one of the most powerful tools in Statistical Process Control (SPC) and are widely used in industries. Conventional control charts rely on the normality assumption, which is not always the case for industrial data. This paper proposes a new S control chart for monitoring process dispersion using the skewness correction method for skewed distributions, named the SC-S control chart. Its performance in terms of false alarm rate is compared with that of various existing control charts for monitoring process dispersion, such as the scaled weighted variance S chart (SWV-S), the skewness correction R chart (SC-R), the weighted variance R chart (WV-R), the weighted variance S chart (WV-S), and the standard S chart (STD-S). A comparison with the exact S control chart with regard to the probability of out-of-control detection is also carried out. The Weibull and gamma distributions adopted in this study are assessed along with the normal distribution. The simulation study shows that the proposed SC-S control chart provides good in-control performance (Type I error) at almost all skewness levels and sample sizes, n. In terms of the probability of detecting a shift, the proposed SC-S chart is closer to the exact S control chart than the existing charts for skewed distributions, except for the SC-R control chart. In general, the performance of the proposed SC-S control chart is better than that of all the existing control charts for monitoring process dispersion in terms of both Type I error and the probability of detecting a shift.
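
    The false-alarm comparison can be reproduced in miniature by simulation. The sketch below estimates the Type I error of the standard normal-theory S chart (not the proposed SC-S chart, whose correction formulas are not reproduced here) under normal data and under skewed gamma data with the same standard deviation; the constants follow the textbook S-chart limits.

        import numpy as np
        from scipy.special import gammaln

        rng = np.random.default_rng(1)

        def c4(n):
            """Unbiasing constant for the sample standard deviation."""
            return np.sqrt(2.0 / (n - 1)) * np.exp(gammaln(n / 2) - gammaln((n - 1) / 2))

        def std_s_limits(sigma, n):
            """3-sigma limits of the standard S chart with sigma known."""
            c = c4(n)
            half = 3.0 * sigma * np.sqrt(1.0 - c * c)
            return max(0.0, c * sigma - half), c * sigma + half

        def false_alarm_rate(sampler, sigma, n=5, subgroups=200_000):
            lcl, ucl = std_s_limits(sigma, n)
            s = sampler((subgroups, n)).std(axis=1, ddof=1)
            return np.mean((s < lcl) | (s > ucl))

        # In-control data: normal versus a skewed gamma, both with sigma = 1.
        shape = 2.0
        scale = 1.0 / np.sqrt(shape)  # gamma variance = shape * scale**2 = 1
        print(false_alarm_rate(lambda sz: rng.normal(0.0, 1.0, sz), 1.0))
        print(false_alarm_rate(lambda sz: rng.gamma(shape, scale, sz), 1.0))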

  17. Phenomenological characteristic of the electron component in gamma-quanta initiated showers

    NASA Technical Reports Server (NTRS)

    Nikolsky, S. I.; Stamenov, J. N.; Ushev, S. Z.

    1985-01-01

    The phenomenological characteristics of the electron component in showers initiated by primary gamma-quanta were analyzed on the basis of the Tien Shan experimental data. It is shown that the lateral distribution of the electrons in gamma-quanta initiated showers can be described by an NKG function with mean age parameter S = 0.76 ± 0.02, different from the same parameter for normal showers of the same size, S = 0.85 ± 0.01. The lateral distribution of the corresponding electron energy flux in gamma-quanta initiated showers is steeper than in normal cosmic ray showers.

  18. A Large Sample Procedure for Testing Coefficients of Ordinal Association: Goodman and Kruskal's Gamma and Somers' d_ba and d_ab

    ERIC Educational Resources Information Center

    Berry, Kenneth J.; And Others

    1977-01-01

    A FORTRAN program, GAMMA, computes Goodman and Kruskal's coefficient of ordinal association, gamma, and Somers' coefficients d_ba and d_ab. The program also provides the associated standard errors, standard scores, and probability values. (Author/JKS)
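
    A minimal Python counterpart of the gamma computation (a generic implementation, not the FORTRAN source): count concordant and discordant pairs and form gamma = (C - D) / (C + D), ignoring ties.

        import numpy as np

        def goodman_kruskal_gamma(x, y):
            """Goodman and Kruskal's gamma for two ordinal variables."""
            x, y = np.asarray(x), np.asarray(y)
            C = D = 0
            for i in range(len(x)):
                for j in range(i + 1, len(x)):
                    s = (x[i] - x[j]) * (y[i] - y[j])
                    if s > 0:
                        C += 1   # concordant pair
                    elif s < 0:
                        D += 1   # discordant pair
            return (C - D) / (C + D)

        print(goodman_kruskal_gamma([1, 2, 2, 3, 4], [1, 1, 2, 3, 3]))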

  19. Probability Forecasting Using Monte Carlo Simulation

    NASA Astrophysics Data System (ADS)

    Duncan, M.; Frisbee, J.; Wysack, J.

    2014-09-01

    Space Situational Awareness (SSA) is defined as the knowledge and characterization of all aspects of space. SSA is now a fundamental and critical component of space operations. Increased dependence on our space assets has in turn led to a greater need for accurate, near real-time knowledge of all space activities. With the growth of the orbital debris population, satellite operators are performing collision avoidance maneuvers more frequently. Frequent maneuver execution expends fuel and reduces the operational lifetime of the spacecraft. Thus new, more sophisticated collision threat characterization methods must be implemented. The collision probability metric is used operationally to quantify the collision risk. The collision probability is typically calculated days into the future, so that high risk and potentially high risk conjunction events are identified early enough to develop an appropriate course of action. As the time horizon to the conjunction event is reduced, the collision probability changes. A significant change in the collision probability will change the satellite mission stakeholder's course of action. Constructing a method for estimating how the collision probability will evolve therefore improves operations by providing satellite operators with a new piece of information, namely an estimate or 'forecast' of how the risk will change as time to the event is reduced. Collision probability forecasting is a predictive process in which the future risk of a conjunction event is estimated. The method utilizes a Monte Carlo simulation that produces a likelihood distribution for a given collision threshold. Using known state and state uncertainty information, the simulation generates a set of possible trajectories for a given space object pair. Each new trajectory produces a unique event geometry at the time of close approach. Given state uncertainty information for both objects, a collision probability value can be computed for every trial. This yields a
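
    A toy version of the Monte Carlo machinery described above: sample the relative position at the time of closest approach from the combined state uncertainty and count the fraction of trials whose miss distance falls below the combined hard-body radius. All numbers and names are illustrative assumptions, not operational values.

        import numpy as np

        rng = np.random.default_rng(7)

        def collision_probability(mu_rel, cov_rel, hard_body_radius, trials=100_000):
            """Monte Carlo collision probability for one conjunction event."""
            samples = rng.multivariate_normal(mu_rel, cov_rel, size=trials)
            miss = np.linalg.norm(samples, axis=1)   # miss distance per trial
            return np.mean(miss < hard_body_radius)  # fraction of 'hits'

        # Toy conjunction: 200 m mean miss, 100 m (1-sigma) uncertainty per axis.
        mu = np.array([200.0, 0.0, 0.0])
        cov = np.diag([100.0**2] * 3)
        print(collision_probability(mu, cov, hard_body_radius=20.0))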

  20. Vaginal distribution and retention of a multiparticulate drug delivery system, assessed by gamma scintigraphy and magnetic resonance imaging.

    PubMed

    Mehta, Samata; Verstraelen, Hans; Peremans, Kathelijne; Villeirs, Geert; Vermeire, Simon; De Vos, Filip; Mehuys, Els; Remon, Jean Paul; Vervaet, Chris

    2012-04-15

    For any new vaginal dosage form, the distribution and retention in the vagina has to be assessed by in vivo evaluation. We evaluated the vaginal distribution and retention of starch-based pellets in sheep as a live animal model by gamma scintigraphy (using Indium-111 DTPA as radiolabel) and in women via magnetic resonance imaging (MRI, using a gadolinium chelate as contrast agent). A conventional cream formulation was used as reference in both studies. Cream and pellets were administered to sheep (n=6) in a two-period, two-treatment study and to healthy female volunteers (n=6) via a randomized crossover trial. Pellets (filled into a hard gelatin capsule) and cetomacrogol cream, both labeled with Indium-111 DTPA (for gamma scintigraphy) or with a gadolinium chelate (for MRI), were evaluated for their intravaginal distribution and retention over a 24 h period. Spreading in the vagina was assessed based on the part of the vagina covered with formulation (expressed in relation to the total vaginal length). Vaginal retention of the formulation was quantified based on the radioactivity remaining in the vaginal area (sheep study), or qualitatively evaluated (women study). Both trials indicated a rapid distribution of the cream within the vagina, as complete coverage of the vaginal mucosa was seen 1 h after dose administration. Clearance of the cream was rapid: about 10% activity remained in the vaginal area of the sheep 12 h post-administration, while after 8 h only a thin layer of cream was detected on the vaginal mucosa of women. After disintegration of the hard gelatin capsule, the pellet formulation gradually distributed over the entire vaginal mucosa. The residence time of the pellets in the vagina was longer compared to the semi-solid formulation: after 24 h, 23 ± 7% radioactivity was detected in the vaginal area of the sheep, while in women the pellet formulation was still detected throughout the vagina. A multi-particulate system containing starch-based pellets was identified as a

  1. A Unifying Probability Example.

    ERIC Educational Resources Information Center

    Maruszewski, Richard F., Jr.

    2002-01-01

    Presents an example from probability and statistics that ties together several topics including the mean and variance of a discrete random variable, the binomial distribution and its particular mean and variance, the sum of independent random variables, the mean and variance of the sum, and the central limit theorem. Uses Excel to illustrate these…
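
    The ties mentioned above can be made concrete in a few lines (here in Python rather than Excel): a binomial variable is a sum of independent Bernoulli trials, its mean and variance follow from those of a single trial, and the central limit theorem gives the normal approximation.

        import numpy as np
        from scipy import stats

        n, p = 30, 0.4  # binomial: the sum of n independent Bernoulli(p) trials

        # Mean/variance of one trial, hence of the sum of n independent trials.
        print(n * p, n * p * (1 - p))     # np and np(1-p)
        print(stats.binom.stats(n, p))    # agrees: (mean, variance)

        # Central limit theorem: the standardized binomial is nearly normal.
        k = 15
        exact = stats.binom.cdf(k, n, p)
        approx = stats.norm.cdf((k + 0.5 - n * p) / np.sqrt(n * p * (1 - p)))
        print(exact, approx)              # close, with continuity correction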

  2. Oil spill contamination probability in the southeastern Levantine basin.

    PubMed

    Goldman, Ron; Biton, Eli; Brokovich, Eran; Kark, Salit; Levin, Noam

    2015-02-15

    Recent gas discoveries in the eastern Mediterranean Sea led to multiple operations with substantial economic interest, and with them there is a risk of oil spills and their potential environmental impacts. To examine the potential spatial distribution of this threat, we created seasonal maps of the probability of oil spill pollution reaching an area in the Israeli coastal and exclusive economic zones, given knowledge of its initial sources. We performed simulations of virtual oil spills using realistic atmospheric and oceanic conditions. The resulting maps show dominance of the alongshore northerly current, which causes the high probability areas to be stretched parallel to the coast, increasing contamination probability downstream of source points. The seasonal westerly wind forcing determines how wide the high probability areas are, and may also restrict these to a small coastal region near source points. Seasonal variability in probability distribution, oil state, and pollution time is also discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.

  3. Neutron detection in a high gamma-ray background with EJ-301 and EJ-309 liquid scintillators

    NASA Astrophysics Data System (ADS)

    Stevanato, L.; Cester, D.; Nebbia, G.; Viesti, G.

    2012-10-01

    Using a fast digitizer, the neutron-gamma discrimination capability of the new liquid scintillator EJ-309 is compared with that obtained using standard EJ-301. Moreover, the capability of both scintillation detectors to identify a weak neutron source in a high gamma-ray background is demonstrated. The probability of neutron detection is P_D = 95% at 95% confidence level for a gamma-ray background corresponding to a dose rate of 100 μSv/h.

  4. Probability distributions of linear statistics in chaotic cavities and associated phase transitions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vivo, Pierpaolo; Majumdar, Satya N.; Bohigas, Oriol

    2010-03-01

    We establish large deviation formulas for linear statistics on the N transmission eigenvalues (T_i) of a chaotic cavity, in the framework of random matrix theory. Given any linear statistic of interest A = Σ_{i=1}^{N} a(T_i), the probability distribution P_A(A, N) of A generically satisfies the large deviation formula lim_{N→∞} [-2 log P_A(Nx, N) / (β N²)] = Ψ_A(x), where Ψ_A(x) is a rate function that we compute explicitly in many cases (conductance, shot noise, and moments) and β corresponds to the different symmetry classes. Using these large deviation expressions, it is possible to recover easily known results and to produce new formulas, such as a closed-form expression for v(n) = lim_{N→∞} var(T_n) (where T_n = Σ_i T_i^n) for arbitrary integer n. The universal limit v* = lim_{n→∞} v(n) = 1/(2πβ) is also computed exactly. The distributions display a central Gaussian region flanked on both sides by non-Gaussian tails. At the junction of the two regimes, weakly nonanalytical points appear, a direct consequence of phase transitions in an associated Coulomb gas problem. Numerical checks are also provided, which are in full agreement with our asymptotic results in both real and Laplace space even for moderately small N. Part of the results have been announced by Vivo et al. [Phys. Rev. Lett. 101, 216809 (2008)].

  5. An efficient multi-objective optimization method for water quality sensor placement within water distribution systems considering contamination probability variations.

    PubMed

    He, Guilin; Zhang, Tuqiao; Zheng, Feifei; Zhang, Qingzhou

    2018-06-20

    Water quality security within water distribution systems (WDSs) has been an important issue due to their inherent vulnerability associated with contamination intrusion. This motivates intensive studies to identify optimal water quality sensor placement (WQSP) strategies, aimed at timely and effectively detecting (un)intentional intrusion events. However, the available WQSP optimization methods have consistently presumed that each WDS node has an equal contamination probability. While simple in implementation, this assumption may not conform to the fact that nodal contamination probability may vary significantly by region owing to variations in population density and user properties. Furthermore, low computational efficiency is another important factor that has seriously hampered the practical application of the currently available WQSP optimization approaches. To address these two issues, this paper proposes an efficient multi-objective WQSP optimization method that explicitly accounts for contamination probability variations. Four different contamination probability functions (CPFs) are proposed to represent the potential variations of nodal contamination probabilities within the WDS. Two real-world WDSs are used to demonstrate the utility of the proposed method. Results show that WQSP strategies can be significantly affected by the choice of the CPF. For example, when the proposed method is applied to the large case study with the CPF accounting for user properties, the event detection probabilities of the resultant solutions are approximately 65%, while these values are around 25% for the traditional approach, and such design solutions are achieved approximately 10,000 times faster than the traditional method. This paper provides an alternative method to identify optimal WQSP solutions for the WDS, and also builds knowledge regarding the impacts of different CPFs on sensor deployments. Copyright © 2018 Elsevier Ltd. All rights reserved.

  6. Supervised learning of probability distributions by neural networks

    NASA Technical Reports Server (NTRS)

    Baum, Eric B.; Wilczek, Frank

    1988-01-01

    Supervised learning algorithms for feedforward neural networks are investigated analytically. The back-propagation algorithm described by Werbos (1974), Parker (1985), and Rumelhart et al. (1986) is generalized by redefining the values of the input and output neurons as probabilities. The synaptic weights are then varied to follow gradients in the logarithm of likelihood rather than in the error. This modification is shown to provide a more rigorous theoretical basis for the algorithm and to permit more accurate predictions. A typical application involving a medical-diagnosis expert system is discussed.
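
    A minimal sketch of the idea under the assumption of a single sigmoid output: reading the output as a probability and ascending the log-likelihood gradient gives the update w += lr * X^T (t - y) / m, the cross-entropy analogue of the squared-error rule. The data below are synthetic.

        import numpy as np

        rng = np.random.default_rng(0)

        def sigmoid(z):
            return 1.0 / (1.0 + np.exp(-z))

        def train_probabilistic(X, t, lr=0.5, epochs=500):
            """Follow the gradient of the log-likelihood (cross-entropy)."""
            w = np.zeros(X.shape[1])
            for _ in range(epochs):
                y = sigmoid(X @ w)
                # d/dw sum[t log y + (1 - t) log(1 - y)] = X^T (t - y)
                w += lr * X.T @ (t - y) / len(t)
            return w

        X = rng.normal(size=(200, 3))
        true_w = np.array([1.5, -2.0, 0.5])
        t = rng.binomial(1, sigmoid(X @ true_w))  # probabilistic 0/1 targets
        print(train_probabilistic(X, t))          # roughly recovers true_w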

  7. Significance of medium energy gamma ray astronomy in the study of cosmic rays

    NASA Technical Reports Server (NTRS)

    Fichtel, C. E.; Kniffen, D. A.; Thompson, D. J.; Bignami, G. F.; Cheung, C. Y.

    1975-01-01

    Medium energy (about 10 to 30 MeV) gamma ray astronomy provides information on the product of the galactic electron cosmic ray intensity and the galactic matter to which the electrons are dynamically coupled by the magnetic field. Because high energy (greater than 100 MeV) gamma ray astronomy provides analogous information for the nucleonic cosmic rays and the relevant matter, a comparison between high energy and medium energy gamma ray intensities provides a direct ratio of the cosmic ray electrons and nucleons throughout the galaxy. A calculation of gamma ray production by electron bremsstrahlung shows that bremsstrahlung energy loss is probably not negligible over the lifetime of the electrons in the galaxy, and that the approximate bremsstrahlung calculation often used previously overestimates the gamma ray intensity by about a factor of two. As a specific example, expected medium energy gamma ray intensities are calculated for the spiral arm model.

  8. Assessment of source probabilities for potential tsunamis affecting the U.S. Atlantic coast

    USGS Publications Warehouse

    Geist, E.L.; Parsons, T.

    2009-01-01

    Estimating the likelihood of tsunamis occurring along the U.S. Atlantic coast critically depends on knowledge of tsunami source probability. We review available information on both earthquake and landslide probabilities from potential sources that could generate local and transoceanic tsunamis. Estimating source probability includes defining both size and recurrence distributions for earthquakes and landslides. For the former distribution, source sizes are often distributed according to a truncated or tapered power-law relationship. For the latter distribution, sources are often assumed to occur in time according to a Poisson process, simplifying the way tsunami probabilities from individual sources can be aggregated. For the U.S. Atlantic coast, earthquake tsunami sources primarily occur at transoceanic distances along plate boundary faults. Probabilities for these sources are constrained from previous statistical studies of global seismicity for similar plate boundary types. In contrast, there is presently little information constraining landslide probabilities that may generate local tsunamis. Though there is significant uncertainty in tsunami source probabilities for the Atlantic, results from this study yield a comparative analysis of tsunami source recurrence rates that can form the basis for future probabilistic analyses.

  9. The structure and content of the galaxy and galactic gamma rays

    NASA Technical Reports Server (NTRS)

    Fichtel, C. E. (Editor); Stecker, F. W. (Editor)

    1977-01-01

    Gamma radiation investigations by the COS-B and SAS-2 satellites are reported. Data from CO surveys of the galaxy and the galactic distribution of pulsars are analyzed. Theories of galactic gamma ray emission are explored.

  10. Characterizing the Lyman-alpha forest flux probability distribution function using Legendre polynomials

    NASA Astrophysics Data System (ADS)

    Cieplak, Agnieszka; Slosar, Anze

    2018-01-01

    The Lyman-alpha forest has become a powerful cosmological probe at intermediate redshift. It is a highly non-linear field with much information present beyond the power spectrum. The flux probability distribution function (PDF) in particular has been a successful probe of small scale physics. However, it is also sensitive to pixel noise, spectrum resolution, and continuum fitting, all of which can bias the estimators. Here we argue that measuring the coefficients of the Legendre polynomial expansion of the PDF offers several advantages over measuring the binned values, as is commonly done. Since the n-th Legendre coefficient can be expressed as a linear combination of the first n moments of the field, this allows the coefficients to be measured in the presence of noise and provides a clear route towards marginalization over the mean flux. Additionally, in the presence of noise, a finite number of these coefficients are well measured, with a very sharp transition into noise dominance. This compresses the information into a small number of well-measured quantities. Finally, we find that measuring fewer quasars with high signal-to-noise produces a higher amount of recoverable information.
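
    Because the n-th Legendre coefficient is a linear combination of the first n moments, the coefficients can be estimated directly as sample averages of Legendre polynomials: with x = 2F - 1 on [-1, 1], c_n = (2n + 1)/2 * E[P_n(x)]. A sketch with a beta-distributed stand-in for real flux data:

        import numpy as np
        from numpy.polynomial import legendre

        rng = np.random.default_rng(3)

        def legendre_pdf_coeffs(samples, nmax):
            """Legendre coefficients of a PDF supported on [0, 1]."""
            x = 2.0 * np.asarray(samples) - 1.0
            coeffs = []
            for n in range(nmax + 1):
                basis = np.zeros(n + 1)
                basis[n] = 1.0  # selects P_n in legval
                coeffs.append((2 * n + 1) / 2.0 * legendre.legval(x, basis).mean())
            return np.array(coeffs)

        flux = rng.beta(2.0, 5.0, size=50_000)  # synthetic 'flux' in [0, 1]
        c = legendre_pdf_coeffs(flux, nmax=6)
        grid = np.linspace(-1.0, 1.0, 201)
        pdf_in_x = legendre.legval(grid, c)  # reconstructed PDF in x = 2F - 1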

  11. Convergence of the Transition Probability Matrix in CLV-Markov Models

    NASA Astrophysics Data System (ADS)

    Permana, D.; Pasaribu, U. S.; Indratno, S. W.; Suprayogi, S.

    2018-04-01

    A transition probability matrix is an arrangement of the transition probabilities from one state to another in a Markov chain model (MCM). One interesting aspect of the MCM is its behavior far into the future, which is derived from a property of the n-step transition probability matrix: its convergence as n tends to infinity. Mathematically, establishing the convergence of the transition probability matrix means finding the limit of the n-th power of the transition matrix as n tends to infinity. The convergent form of the transition probability matrix is of particular interest because it brings the matrix to its stationary form, which is useful for predicting the probabilities of transitions between states in the future. The method usually used to find the convergence of the transition probability matrix is the limiting-distribution process. In this paper, the convergence of the transition probability matrix is instead obtained using a simple concept of linear algebra: diagonalizing the matrix. This method has a higher level of complexity because it requires diagonalization of the matrix, but it has the advantage of yielding a general closed form for the n-th power of the transition probability matrix, which is useful for examining the transition matrix before it reaches stationarity. Example cases are taken from a customer lifetime value (CLV) model based on an MCM, called the CLV-Markov model. Several transition probability matrices are examined to find their convergent forms. The result is that the convergence of the transition probability matrix obtained through diagonalization agrees with that obtained by the commonly used limiting-distribution method.
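
    The diagonalization route is easy to reproduce: write P = V diag(lambda) V^-1, so that P^n = V diag(lambda^n) V^-1, and the eigenvalues with modulus below one decay while the eigenvalue 1 carries the stationary part. A minimal sketch with an invented 3-state matrix:

        import numpy as np

        def power_by_diagonalization(P, n):
            """Closed form for P^n via eigendecomposition (P diagonalizable)."""
            eigvals, V = np.linalg.eig(P)
            return (V @ np.diag(eigvals ** n) @ np.linalg.inv(V)).real

        P = np.array([[0.7, 0.2, 0.1],
                      [0.3, 0.5, 0.2],
                      [0.2, 0.3, 0.5]])  # rows sum to 1

        print(power_by_diagonalization(P, 50))  # every row -> stationary dist.
        print(np.linalg.matrix_power(P, 50))    # agrees with direct powering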

  12. Characterization of cDNAs encoding the chick retinoic acid receptor gamma 2 and preferential distribution of retinoic acid receptor gamma transcripts during chick skin development.

    PubMed

    Michaille, J J; Blanchet, S; Kanzler, B; Garnier, J M; Dhouailly, D

    1994-12-01

    Retinoic acid receptors alpha, beta and gamma (RAR alpha, beta and gamma) are ligand-inducible transcriptional activators which belong to the steroid/thyroid hormone receptor superfamily. At least two major isoforms (1 and 2) of each RAR arise by differential use of two promoters and alternative splicing. In mouse, the three RAR genes are expressed in stage- and tissue-specific patterns during embryonic development. In order to understand the roles of the different RARs in chick, RAR gamma 2 cDNAs were isolated from an 8.5-day (stage 35 of Hamburger and Hamilton) chick embryo skin library. The deduced chick RAR gamma 2 amino acid sequence displays uncommon features such as 21 specific amino acid replacements, 12 of them being clustered in the amino-terminal region (domains A2 and B), and a truncated acidic carboxy-terminal region (F domain). However, the pattern of RAR gamma expression in chick embryo resembles that reported in mouse, particularly in skin, where RAR gamma expression occurs in both the dermal and epidermal layers at the beginning of feather formation and is subsequently restricted to the differentiating epidermal cells. Northern blot analysis suggests that different RAR gamma isoforms could be successively required during chick development.

  13. Multivariate Distributions in Reliability Theory and Life Testing.

    DTIC Science & Technology

    1981-04-01

    Downton Distribution: This distribution is a special case of a classical bivariate gamma distribution due to Wicksell and to Kibble. See Krishnaiah and ... Krishnamoorthy and Parthasarathy (1951) (see also Krishnaiah and Rao (1961) and Krishnaiah (1977)) and also within the framework of the Arnold classes. A ... for these distributions and their properties is Johnson and Kotz (1972). Krishnaiah (1977) has specifically discussed multivariate gamma

  14. Imprecise Probability Methods for Weapons UQ

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Picard, Richard Roy; Vander Wiel, Scott Alan

    Building on recent work in uncertainty quantification, we examine the use of imprecise probability methods to better characterize expert knowledge and to improve on misleading aspects of Bayesian analysis with informative prior distributions. Quantitative approaches to incorporating uncertainties in weapons certification are subject to rigorous external peer review, and in this regard, certain imprecise probability methods are well established in the literature and attractive. These methods are illustrated using experimental data from LANL detonator impact testing.

  15. Distribution, characterization, and exposure of MC252 oil in the supratidal beach environment.

    PubMed

    Lemelle, Kendall R; Elango, Vijaikrishnah; Pardue, John H

    2014-07-01

    The distribution and characteristics of MC252 oil:sand aggregates, termed surface residue balls (SRBs), were measured on the supratidal beach environment of oil-impacted Fourchon Beach in Louisiana (USA). Probability distributions of 4 variables, namely surface coverage (%), size of SRBs (mm² of projected area), mass of SRBs per m² (g/m²), and concentrations of polycyclic aromatic hydrocarbons (PAHs) and n-alkanes in the SRBs (mg of crude oil component per kg of SRB), were determined using parametric and nonparametric statistical techniques. Surface coverage of SRBs, an operational remedial standard for the beach surface, was a gamma-distributed variable ranging from 0.01% to 8.1%. The SRB sizes had a mean of 90.7 mm² but fit no probability distribution, and a nonparametric ranking was used to describe the size distributions. Concentrations of total PAHs ranged from 2.5 mg/kg to 126 mg/kg of SRB. Individual PAH concentration distributions, consisting primarily of alkylated phenanthrenes, dibenzothiophenes, and chrysenes, did not consistently fit a parametric distribution. Surface coverage was correlated with oil mass per unit area, but with substantial error at lower coverage (i.e., <2%). These data provide probabilistic risk assessors with the ability to specify uncertainty in PAH concentration, exposure frequency, and ingestion rate, based on SRB characteristics, for the dominant oil form on beaches along the US Gulf Coast. © 2014 SETAC.
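
    As an illustration of how such a parametric fit might be done, the sketch below fits a gamma distribution to synthetic surface-coverage data (the real Fourchon Beach measurements are not reproduced here) with the location pinned at zero, then checks the fit with a Kolmogorov-Smirnov test.

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(5)

        # Synthetic stand-in for measured surface coverage (%).
        coverage = rng.gamma(shape=1.2, scale=0.8, size=300)

        # Maximum-likelihood gamma fit with the location fixed at zero.
        shape, loc, scale = stats.gamma.fit(coverage, floc=0.0)

        # Goodness of fit: Kolmogorov-Smirnov test against the fitted gamma.
        ks_stat, p_value = stats.kstest(coverage, "gamma", args=(shape, loc, scale))
        print(shape, scale, ks_stat, p_value)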

  16. 137Ba Double Gamma Decay Measurement with GAMMASPHERE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merchán, E.; Moran, K.; Lister, C. J.

    2015-05-28

    The study of electromagnetic (EM) moments and decay probabilities provides detailed information about nuclear wave functions. The well-known properties of EM interactions make them good probes for extracting information about the motion of nucleons. Higher order EM processes always occur, but are usually too weak to be measured. In the case of a 0+ → 0+ transition, where a single gamma transition is forbidden, the simultaneous emission of two γ-rays has been studied. An interesting opportunity to further investigate the two-photon emission phenomenon is to use a standard 137Cs source populating, via β-decay, the J^π = 11/2- isomeric state at 662 keV in 137Ba. In this case, the two-photon process can have contributions from quadrupole-quadrupole or dipole-octupole multipolarities in direct competition with the high-multipolarity M4 decay. Since the yield of the double gamma decay is around six orders of magnitude less than that of the first order transition, very good statistics are needed in order to observe the phenomenon, and great care must be taken to suppress the first-order decay. The Gammasphere array is ideal since its configuration allows good coverage of the angular distribution and Compton events can be suppressed. Nevertheless, the process of understanding and eliminating the Compton background is a challenge. Geant4 simulations were carried out to help understand and correct for those factors.

  17. The Homotopic Probability Distribution and the Partition Function for the Entangled System Around a Ribbon Segment Chain

    NASA Astrophysics Data System (ADS)

    Qian, Shang-Wu; Gu, Zhi-Yu

    2001-12-01

    Using the Feynman path integral with topological constraints arising from the presence of one singular line, we find the homotopic probability distribution P_L^n for the winding number n and the partition function P_L of the entangled system around a ribbon segment chain. We find that when the width 2a of the ribbon segment chain increases, the partition function decreases exponentially, whereas the free energy increases by an amount proportional to the square of the width. When the width tends to zero we obtain the same results as those for a single chain with one singular point.

  18. Quantum probability assignment limited by relativistic causality.

    PubMed

    Han, Yeong Deok; Choi, Taeseung

    2016-03-14

    Quantum theory has nonlocal correlations, which bothered Einstein, but which were found to satisfy relativistic causality. Correlation for a shared quantum state manifests itself, in the standard quantum framework, through joint probability distributions that can be obtained by applying state reduction and the probability assignment rule known as the Born rule. Quantum correlations, which show nonlocality when the shared state is entangled, can be changed if we apply a different probability assignment rule. As a result, the amount of nonlocality in the quantum correlation will change. The issue is whether a change in the rule of quantum probability assignment breaks relativistic causality. We have shown that the Born rule for quantum measurement can be derived by requiring the relativistic causality condition. This shows how relativistic causality limits the upper bound of quantum nonlocality through the quantum probability assignment.

  19. Menzerath-Altmann law for distinct word distribution analysis in a large text

    NASA Astrophysics Data System (ADS)

    Eroglu, Sertac

    2013-06-01

    The empirical law uncovered by Menzerath and formulated by Altmann, known as the Menzerath-Altmann law (henceforth the MA law), reveals the statistical distribution behavior of human language in various organizational levels. Building on previous studies relating organizational regularities in a language, we propose that the distribution of distinct (or different) words in a large text can effectively be described by the MA law. The validity of the proposition is demonstrated by examining two text corpora written in different languages not belonging to the same language family (English and Turkish). The results show not only that distinct word distribution behavior can accurately be predicted by the MA law, but that this result appears to be language-independent. This result is important not only for quantitative linguistic studies, but also may have significance for other naturally occurring organizations that display analogous organizational behavior. We also deliberately demonstrate that the MA law is a special case of the probability function of the generalized gamma distribution.
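
    The connection asserted in the last sentence can be written out explicitly, using one common parameterization of the generalized gamma density (a notational assumption; the paper's own parameterization is not reproduced here):

        % Menzerath-Altmann curve
        \[
          y(x) \;=\; a\, x^{b} e^{-c x}
        \]
        % Generalized gamma density
        \[
          f(x;\,a,d,p) \;=\; \frac{p/a^{d}}{\Gamma(d/p)}\; x^{d-1} e^{-(x/a)^{p}},
          \qquad x > 0 .
        \]
        % For p = 1 this reduces to the ordinary gamma density
        % f(x) = x^{d-1} e^{-x/a} / (Gamma(d) a^d), which is exactly of the
        % Menzerath-Altmann form a x^b e^{-cx} with b = d - 1 and c = 1/a,
        % up to the normalizing constant.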

  20. The perception of probability.

    PubMed

    Gallistel, C R; Krishan, Monika; Liu, Ye; Miller, Reilly; Latham, Peter E

    2014-01-01

    We present a computational model to explain the results from experiments in which subjects estimate the hidden probability parameter of a stepwise nonstationary Bernoulli process outcome by outcome. The model captures the following results qualitatively and quantitatively, with only 2 free parameters: (a) Subjects do not update their estimate after each outcome; they step from one estimate to another at irregular intervals. (b) The joint distribution of step widths and heights cannot be explained on the assumption that a threshold amount of change must be exceeded in order for them to indicate a change in their perception. (c) The mapping of observed probability to the median perceived probability is the identity function over the full range of probabilities. (d) Precision (how close estimates are to the best possible estimate) is good and constant over the full range. (e) Subjects quickly detect substantial changes in the hidden probability parameter. (f) The perceived probability sometimes changes dramatically from one observation to the next. (g) Subjects sometimes have second thoughts about a previous change perception, after observing further outcomes. (h) The frequency with which they perceive changes moves in the direction of the true frequency over sessions. (Explaining this finding requires 2 additional parametric assumptions.) The model treats the perception of the current probability as a by-product of the construction of a compact encoding of the experienced sequence in terms of its change points. It illustrates the why and the how of intermittent Bayesian belief updating and retrospective revision in simple perception. It suggests a reinterpretation of findings in the recent literature on the neurobiology of decision making. (PsycINFO Database Record (c) 2014 APA, all rights reserved).

  1. A matrix-inversion method for gamma-source mapping from gamma-count data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adsley, Ian; Burgess, Claire; Bull, Richard K

    In a previous paper it was proposed that a simple matrix inversion method could be used to extract source distributions from gamma-count maps, using simple models to calculate the response matrix. The method was tested using numerically generated count maps. In the present work a 100 kBq Co-60 source has been placed on a gridded surface and the count rate measured using a NaI scintillation detector. The resulting map of gamma counts was used as input to the matrix inversion procedure and the source position recovered. A multi-source array was simulated by superposition of several single-source count maps and the source distribution was again recovered using matrix inversion. The measurements were performed for several detector heights. The effects of uncertainties in source-detector distances on the matrix inversion method are also examined. The results from this work give confidence in the application of the method to practical applications, such as the segregation of highly active objects amongst fuel-element debris. (authors)
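
    In the spirit of the method, though with an invented inverse-square response model standing in for the paper's detector models, a count map can be inverted in a few lines:

        import numpy as np

        def response_matrix(detector_xy, source_xy, height):
            """Toy response: counts fall off as 1 / (distance^2 + height^2)."""
            d2 = ((detector_xy[:, None, :] - source_xy[None, :, :]) ** 2).sum(-1)
            return 1.0 / (d2 + height ** 2)

        def recover_sources(R, counts, rcond=1e-10):
            """Solve counts ~ R @ sources by (pseudo-inverse) least squares."""
            sources, *_ = np.linalg.lstsq(R, counts, rcond=rcond)
            return np.clip(sources, 0.0, None)  # activities are non-negative

        # Demo on a 5x5 grid, detector scanned over the same grid at height 1.
        xy = np.array([(i, j) for i in range(5) for j in range(5)], dtype=float)
        R = response_matrix(xy, xy, height=1.0)
        truth = np.zeros(25)
        truth[12] = 100.0                   # one source at the grid centre
        print(recover_sources(R, R @ truth).round(1))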

  2. Path probability of stochastic motion: A functional approach

    NASA Astrophysics Data System (ADS)

    Hattori, Masayuki; Abe, Sumiyoshi

    2016-06-01

    The path probability of a particle undergoing stochastic motion is studied by the use of a functional technique, and the general formula is derived for the path probability distribution functional. The probability of finding paths inside a tube/band, the center of which is stipulated by a given path, is analytically evaluated in a way analogous to continuous measurements in quantum mechanics. Then, the formalism developed here is applied to the stochastic dynamics of stock prices in finance.

  3. Explosion probability of unexploded ordnance: expert beliefs.

    PubMed

    MacDonald, Jacqueline Anne; Small, Mitchell J; Morgan, M G

    2008-08-01

    This article reports on a study to quantify expert beliefs about the explosion probability of unexploded ordnance (UXO). Some 1,976 sites at closed military bases in the United States are contaminated with UXO and are slated for cleanup, at an estimated cost of $15-140 billion. Because no available technology can guarantee 100% removal of UXO, information about explosion probability is needed to assess the residual risks of civilian reuse of closed military bases and to make decisions about how much to invest in cleanup. This study elicited probability distributions for the chance of UXO explosion from 25 experts in explosive ordnance disposal, all of whom have had field experience in UXO identification and deactivation. The study considered six different scenarios: three different types of UXO handled in two different ways (one involving children and the other involving construction workers). We also asked the experts to rank by sensitivity to explosion 20 different kinds of UXO found at a case study site at Fort Ord, California. We found that the experts do not agree about the probability of UXO explosion, with significant differences among experts in their mean estimates of explosion probabilities and in the amount of uncertainty that they express in their estimates. In three of the six scenarios, the divergence was so great that the average of all the expert probability distributions was statistically indistinguishable from a uniform (0, 1) distribution, suggesting that the sum of expert opinion provides no information at all about the explosion risk. The experts' opinions on the relative sensitivity to explosion of the 20 UXO items also diverged. The average correlation between the rankings of any pair of experts was 0.41, which, statistically, is barely significant (p = 0.049) at the 95% confidence level. Thus, one expert's rankings provide little predictive information about another's rankings. The lack of consensus among experts suggests that empirical studies

  4. The transition probability and the probability for the left-most particle's position of the q-totally asymmetric zero range process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Korhonen, Marko; Lee, Eunghyun

    2014-01-15

    We treat the N-particle zero range process whose jumping rates satisfy a certain condition. This condition is required to use the Bethe ansatz, and the resulting model is the q-boson model by Sasamoto and Wadati ["Exact results for one-dimensional totally asymmetric diffusion models," J. Phys. A 31, 6057–6071 (1998)] or the q-totally asymmetric zero range process (TAZRP) by Borodin and Corwin ["Macdonald processes," Probab. Theory Relat. Fields (to be published)]. We find the explicit formula for the transition probability of the q-TAZRP via the Bethe ansatz. Using the transition probability we find the probability distribution of the left-most particle's position at time t. To find the probability for the left-most particle's position we find a new identity corresponding to the identity for the asymmetric simple exclusion process by Tracy and Widom ["Integral formulas for the asymmetric simple exclusion process," Commun. Math. Phys. 279, 815–844 (2008)]. For the initial state in which all particles occupy a single site, the probability distribution of the left-most particle's position at time t is represented by the contour integral of a determinant.

  5. EDF: Computing electron number probability distribution functions in real space from molecular wave functions

    NASA Astrophysics Data System (ADS)

    Francisco, E.; Pendás, A. Martín; Blanco, M. A.

    2008-04-01

    Given an N-electron molecule and an exhaustive partition of real space (R) into m arbitrary regions Ω_1, Ω_2, …, Ω_m (∪_{i=1}^{m} Ω_i = R), the edf program computes all the probabilities P(n_1, n_2, …, n_m) of having exactly n_1 electrons in Ω_1, n_2 electrons in Ω_2, …, and n_m electrons (n_1 + n_2 + ⋯ + n_m = N) in Ω_m. Each Ω_i may correspond to a single basin (atomic domain) or several such basins (functional group). In the latter case, each atomic domain must belong to a single Ω_i. The program can manage both single- and multi-determinant wave functions which are read in from an aimpac-like wave function description (.wfn) file (T.A. Keith et al., The AIMPAC95 programs, http://www.chemistry.mcmaster.ca/aimpac, 1995). For multi-determinantal wave functions a generalization of the original .wfn file has been introduced. The new format is completely backwards compatible, adding to the previous structure a description of the configuration interaction (CI) coefficients and the determinants of correlated wave functions. Besides the .wfn file, edf only needs the overlap integrals over all the atomic domains between the molecular orbitals (MO). After the P(n_1, n_2, …, n_m) probabilities are computed, edf obtains from them several magnitudes relevant to chemical bonding theory, such as average electronic populations and localization/delocalization indices. Regarding spin, edf may be used in two ways: with or without a splitting of the P(n_1, n_2, …, n_m) probabilities into α and β spin components. Program summary: Program title: edf. Catalogue identifier: AEAJ_v1_0. Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEAJ_v1_0.html. Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland. Licensing provisions: Standard CPC licence, http://cpc.cs.qub.ac.uk/licence/licence.html. No. of lines in distributed program, including test data, etc.: 5387. No. of bytes in distributed program, including test data, etc.: 52 381. Distribution format: tar.gz. Programming language: Fortran 77. Computer

  6. 21 CFR 1304.26 - Additional recordkeeping requirements applicable to drug products containing gamma-hydroxybutyric...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... to drug products containing gamma-hydroxybutyric acid. 1304.26 Section 1304.26 Food and Drugs DRUG....26 Additional recordkeeping requirements applicable to drug products containing gamma-hydroxybutyric....22, practitioners dispensing gamma-hydroxybutyric acid that is manufactured or distributed in...

  7. Fermi GBM Observations of Terrestrial Gamma Flashes

    NASA Technical Reports Server (NTRS)

    Wilson-Hodge, Colleen A.; Briggs, M. S.; Fishman, G. J.; Bhat, P. N.; Paciesas, W. S.; Preece, R.; Kippen, R. M.; von Kienlin, A.; Dwyer, J. R.; Smith, D. M.; et al.

    2010-01-01

    In its first two years of operation, the Fermi Gamma Ray Burst Monitor (GBM) has observed more than 77 Terrestrial Gamma Flashes (TGFs). The thick Bismuth Germanate (BGO) detectors are excellent for TGF spectroscopy, having a high probability of recording the full energy of an incident photon, spanning a broad energy range from 150 keV to 40 MeV, and recording a large number of photons per TGF. Correlations between GBM TGF triggers and lightning sferics detected with the World-Wide Lightning Location Network indicate that TGFs and lightning are simultaneous to within tens of microseconds. The energy spectra of some TGFs have strong 511 keV positron annihilation lines, indicating that these TGFs contain a large fraction of positrons.

  8. Gamma motes for detection of radioactive materials in shipping containers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harold McHugh; William Quam; Stephan Weeks

    Shipping containers can be effectively monitored for radiological materials using gamma (and neutron) motes in distributed mesh networks. The mote platform is ideal for collecting data for integration into operational management systems required for efficiently and transparently monitoring international trade. Significant reductions in size and power requirements have been achieved for room-temperature cadmium zinc telluride (CZT) gamma detectors. Miniaturization of radio modules and microcontroller units are paving the way for low-power, deeply-embedded, wireless sensor distributed mesh networks.

  9. GRIS observations of Al-26 gamma-ray line emission from two points in the Galactic plane

    NASA Technical Reports Server (NTRS)

    Teegarden, B. J.; Barthelmy, S. D.; Gehrels, N.; Tueller, J.; Leventhal, M.

    1991-01-01

    Both observations of the Galactic center region by the Gamma-Ray Imaging Spectrometer (GRIS) experiment, at l = 0 deg and l = 335 deg respectively, detected Al-26 gamma-ray line emission. While these observations are consistent with the assumed high-energy gamma-ray distribution, they are consistent with other distributions as well. The data suggest that the Al-26 emission is distributed over Galactic longitude rather than being confined to a point source. The GRIS data also indicate that the 1809 keV line is broadened.

  10. Estimation of transition probabilities of credit ratings

    NASA Astrophysics Data System (ADS)

    Peng, Gan Chew; Hin, Pooi Ah

    2015-12-01

    The present research is based on the quarterly credit ratings of ten companies over 15 years, taken from the database of the Taiwan Economic Journal. The components of the vector m_i = (m_i1, m_i2, …, m_i10) may first be used to denote the credit ratings of the ten companies in the i-th quarter. The vector m_{i+1} in the next quarter is modelled to be dependent on the vector m_i via a conditional distribution which is derived from a 20-dimensional power-normal mixture distribution. The transition probability P_kl(i, j) for getting m_{i+1,j} = l given that m_{i,j} = k is then computed from the conditional distribution. It is found that the variation of the transition probability P_kl(i, j) as i varies is able to give an indication of the possible transition of the credit rating of the j-th company in the near future.
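
    For comparison, the simplest estimator of a transition probability matrix is plain counting. The sketch below implements this generic estimator, not the paper's 20-dimensional power-normal mixture, and the rating path is invented.

        import numpy as np

        def empirical_transition_matrix(ratings, n_states):
            """P_hat[k, l]: fraction of quarters in state k that move to l."""
            counts = np.zeros((n_states, n_states))
            for k, l in zip(ratings[:-1], ratings[1:]):
                counts[k, l] += 1
            rows = counts.sum(axis=1, keepdims=True)
            return np.divide(counts, rows, out=np.zeros_like(counts),
                             where=rows > 0)

        # Toy quarterly rating path for one company, states coded 0..2.
        path = [0, 0, 1, 1, 1, 2, 1, 0, 0, 1, 2, 2, 1, 1, 0]
        print(empirical_transition_matrix(path, n_states=3))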

  11. Novel gamma-ray signatures of PeV-scale dark matter

    NASA Astrophysics Data System (ADS)

    Blanco, Carlos; Harding, J. Patrick; Hooper, Dan

    2018-04-01

    The gamma-ray annihilation and decay products of very heavy dark matter particles can undergo attenuation through pair production, leading to the development of electromagnetic cascades. This has a significant impact not only on the spectral shape of the gamma-ray signal, but also on the angular distribution of the observed photons. Such phenomena are particularly important in light of the new HAWC experiment, which provides unprecedented sensitivity to multi-TeV photons and thus to very heavy dark matter particles. In this study, we focus on dark matter in the 100 TeV–100 PeV mass range, and calculate the spectral and angular distribution of gamma-rays from dwarf galaxies and from nearby galaxy clusters in this class of models.

  12. PRODIGEN: visualizing the probability landscape of stochastic gene regulatory networks in state and time space.

    PubMed

    Ma, Chihua; Luciani, Timothy; Terebus, Anna; Liang, Jie; Marai, G Elisabeta

    2017-02-15

    Visualizing the complex probability landscape of stochastic gene regulatory networks can further biologists' understanding of phenotypic behavior associated with specific genes. We present PRODIGEN (PRObability DIstribution of GEne Networks), a web-based visual analysis tool for the systematic exploration of probability distributions over simulation time and state space in such networks. PRODIGEN was designed in collaboration with bioinformaticians who research stochastic gene networks. The analysis tool combines in a novel way existing, expanded, and new visual encodings to capture the time-varying characteristics of probability distributions: spaghetti plots over one dimensional projection, heatmaps of distributions over 2D projections, enhanced with overlaid time curves to display temporal changes, and novel individual glyphs of state information corresponding to particular peaks. We demonstrate the effectiveness of the tool through two case studies on the computed probabilistic landscape of a gene regulatory network and of a toggle-switch network. Domain expert feedback indicates that our visual approach can help biologists: 1) visualize probabilities of stable states, 2) explore the temporal probability distributions, and 3) discover small peaks in the probability landscape that have potential relation to specific diseases.

  13. Probability density function of the intensity of a laser beam propagating in the maritime environment.

    PubMed

    Korotkova, Olga; Avramov-Zamurovic, Svetlana; Malek-Madani, Reza; Nelson, Charles

    2011-10-10

    A number of field experiments measuring the fluctuating intensity of a laser beam propagating along horizontal paths in the maritime environment were performed over sub-kilometer distances at the United States Naval Academy. Both above-ground and over-water links were explored. Two different detection schemes, one photographing the beam on a white board and the other capturing the beam directly using a CCD sensor, gave consistent results. The probability density function (pdf) of the fluctuating intensity is reconstructed with the help of two theoretical models, the Gamma-Gamma and the Gamma-Laguerre, and compared with the intensity's histograms. It is found that the on-ground experimental results are in good agreement with theoretical predictions. The results obtained above the water paths lead to appreciable discrepancies, especially in the case of the Gamma-Gamma model. These discrepancies are attributed to the presence of various scatterers along the path of the beam, such as water droplets, aerosols and other airborne particles. Our paper's main contribution is providing a methodology for computing the pdf of the laser beam intensity in the maritime environment using field measurements.
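
    The Gamma-Gamma model referred to here has a standard closed form involving a modified Bessel function; a minimal sketch evaluating it with SciPy, where the parameters alpha and beta (the effective numbers of large- and small-scale eddies) are chosen arbitrarily for illustration:

      import numpy as np
      from scipy.special import gamma, kv  # kv: modified Bessel function K

      def gamma_gamma_pdf(I, alpha, beta):
          """Gamma-Gamma PDF of the normalized (unit-mean) irradiance I."""
          coef = 2.0 * (alpha * beta) ** ((alpha + beta) / 2.0) \
                 / (gamma(alpha) * gamma(beta))
          return coef * I ** ((alpha + beta) / 2.0 - 1.0) \
                 * kv(alpha - beta, 2.0 * np.sqrt(alpha * beta * I))

      I = np.linspace(0.01, 4.0, 400)
      pdf = gamma_gamma_pdf(I, alpha=4.0, beta=2.0)
      print(np.trapz(pdf, I))  # should be close to 1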

  14. Maximum-Likelihood Methods for Processing Signals From Gamma-Ray Detectors

    PubMed Central

    Barrett, Harrison H.; Hunter, William C. J.; Miller, Brian William; Moore, Stephen K.; Chen, Yichun; Furenlid, Lars R.

    2009-01-01

    In any gamma-ray detector, each event produces electrical signals on one or more circuit elements. From these signals, we may wish to determine the presence of an interaction; whether multiple interactions occurred; the spatial coordinates in two or three dimensions of at least the primary interaction; or the total energy deposited in that interaction. We may also want to compute listmode probabilities for tomographic reconstruction. Maximum-likelihood methods provide a rigorous and in some senses optimal approach to extracting this information, and the associated Fisher information matrix provides a way of quantifying and optimizing the information conveyed by the detector. This paper will review the principles of likelihood methods as applied to gamma-ray detectors and illustrate their power with recent results from the Center for Gamma-ray Imaging. PMID:20107527
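
    As a toy illustration of the principle (not the Center's actual calibration or estimators): with calibrated mean sensor responses and i.i.d. Gaussian noise, the maximum-likelihood position estimate reduces to a least-squares search. All sensor positions and response parameters below are hypothetical:

      import numpy as np

      sensor_x = np.array([0.0, 1.0, 2.0, 3.0])  # hypothetical sensor positions (cm)

      def mean_response(x, width=0.8, amp=100.0):
          """Calibrated mean signal of each sensor for an interaction at x."""
          return amp * np.exp(-0.5 * ((sensor_x - x) / width) ** 2)

      def ml_position(signals, grid=np.linspace(0.0, 3.0, 301)):
          """For i.i.d. Gaussian noise, maximizing the likelihood equals
          minimizing the summed squared residuals over the search grid."""
          resid = [np.sum((signals - mean_response(x)) ** 2) for x in grid]
          return grid[int(np.argmin(resid))]

      rng = np.random.default_rng(1)
      signals = mean_response(1.37) + rng.normal(0.0, 3.0, sensor_x.size)
      print(ml_position(signals))  # close to the true position 1.37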

  15. Recent results on celestial gamma radiation from SMM

    NASA Technical Reports Server (NTRS)

    Share, Gerald H.

    1991-01-01

    Observations made by the Gamma Ray Spectrometer on board the SMM are described. Recent results reported include observations and analyses of gamma-ray lines from Co-56 produced in supernovae, observations of the temporal variation of the 511 keV line observed during Galactic center transits, and measurements of the diffuse Galactic spectrum from 0.3 to 8.5 MeV. The work in progress includes measurements of the distribution of Galactic Al-26, observations to place limits on Galactic Ti-44 and Fe-60 and on Be-7 produced in novae, and searches for a characteristic gamma-ray emission from pair plasmas, a 2.223 MeV line emission, limits on deexcitation lines from interstellar C and O, and gamma-ray bursts.

  16. Identification of probabilities.

    PubMed

    Vitányi, Paul M B; Chater, Nick

    2017-02-01

    Within psychology, neuroscience and artificial intelligence, there has been increasing interest in the proposal that the brain builds probabilistic models of sensory and linguistic input: that is, to infer a probabilistic model from a sample. The practical problems of such inference are substantial: the brain has limited data and restricted computational resources. But there is a more fundamental question: is the problem of inferring a probabilistic model from a sample possible even in principle? We explore this question and find some surprisingly positive and general results. First, for a broad class of probability distributions characterized by computability restrictions, we specify a learning algorithm that will almost surely identify a probability distribution in the limit given a finite i.i.d. sample of sufficient but unknown length. This is similarly shown to hold for sequences generated by a broad class of Markov chains, subject to computability assumptions. The technical tool is the strong law of large numbers. Second, for a large class of dependent sequences, we specify an algorithm which identifies in the limit a computable measure for which the sequence is typical, in the sense of Martin-Löf (there may be more than one such measure). The technical tool is the theory of Kolmogorov complexity. We analyze the associated predictions in both cases. We also briefly consider special cases, including language learning, and wider theoretical implications for psychology.

  17. Limitations of the Porter-Thomas distribution

    NASA Astrophysics Data System (ADS)

    Weidenmüller, Hans A.

    2017-12-01

    Data on the distribution of reduced partial neutron widths and on the distribution of total gamma decay widths disagree with the Porter-Thomas distribution (PTD) for reduced partial widths or with predictions of the statistical model. We recall why the disagreement is important: The PTD is a direct consequence of the orthogonal invariance of the Gaussian Orthogonal Ensemble (GOE) of random matrices. The disagreement is reviewed. Two possible causes for violation of orthogonal invariance of the GOE are discussed, and their consequences explored. The disagreement of the distribution of total gamma decay widths with theoretical predictions cannot be blamed on the statistical model.

  18. Multifrequency Retrieval of Cloud Ice Particle Size Distributions

    DTIC Science & Technology

    2005-01-01

    The normalized gamma distribution (Testud et al., 2001) is used to represent the particle size distribution (PSD). The normalized gamma distribution has several advantages over a typical gamma PSD, in which variation in N is correlated with variation in mu (Testud et al., 2001); this requires a priori restrictions on the parameter variances.
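
    For reference, the normalized gamma PSD of Testud et al. (2001) has a standard functional form; a minimal sketch, with parameter values that are illustrative only:

      import numpy as np
      from scipy.special import gamma

      def normalized_gamma_psd(D, Nw, D0, mu):
          """N(D) = Nw * f(mu) * (D/D0)^mu * exp(-(3.67+mu) D/D0),
          with D, D0 in mm, Nw in mm^-1 m^-3, and shape parameter mu."""
          f_mu = (6.0 / 3.67**4) * (3.67 + mu) ** (mu + 4) / gamma(mu + 4)
          return Nw * f_mu * (D / D0) ** mu * np.exp(-(3.67 + mu) * D / D0)

      D = np.linspace(0.1, 6.0, 200)   # particle diameter (mm)
      N = normalized_gamma_psd(D, Nw=8000.0, D0=1.5, mu=2.0)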

  19. About cosmic gamma ray lines

    NASA Astrophysics Data System (ADS)

    Diehl, Roland

    2017-06-01

    Gamma ray lines from cosmic sources convey the action of nuclear reactions in cosmic sites and their impacts on astrophysical objects. Gamma rays at characteristic energies result from nuclear transitions following radioactive decays or high-energy collisions with excitation of nuclei. The gamma-ray line from the annihilation of positrons at 511 keV falls into the same energy window, although of different origin. We present here the concepts of cosmic gamma ray spectrometry and the corresponding instruments and missions, followed by a discussion of recent results and the challenges and open issues for the future. Among the lessons learned are the diffuse radioactive afterglow of massive-star nucleosynthesis in 26Al and 60Fe gamma rays, which is now being exploited towards the cycle of matter driven by massive stars and their supernovae; large interstellar cavities and superbubbles have been recognised to be of key importance here. Also, constraints on the complex processes making stars explode as either thermonuclear or core-collapse supernovae are being illuminated by gamma-ray lines, in this case from short-lived radioactivities from 56Ni and 44Ti decays. In particular, the three-dimensionality and asphericities that have recently been recognised as important are enlightened in different ways through such gamma-ray line spectroscopy. Finally, the distribution of positron annihilation gamma ray emission with its puzzling bulge-dominated intensity distribution is measured through spatially-resolved spectra, which indicate that annihilation conditions may differ in different parts of our Galaxy. But it is now understood that a variety of sources may feed positrons into the interstellar medium, and their characteristics largely get lost during slowing down and propagation of positrons before annihilation; a recent microquasar flare was caught as an opportunity to see positrons annihilate at a source.

  20. Determining probability distribution of coherent integration time near 133 Hz and 1346 km in the Pacific Ocean.

    PubMed

    Spiesberger, John L

    2013-02-01

    The hypothesis tested is that internal gravity waves limit the coherent integration time of sound at 1346 km in the Pacific Ocean at 133 Hz and a pulse resolution of 0.06 s. Six months of continuous transmissions at about 18 min intervals are examined. The source and receiver are mounted on the bottom of the ocean, with timing governed by atomic clocks, so the measured variability is due only to fluctuations in the ocean. A model for the propagation of sound through fluctuating internal waves is run without any tuning to the data. Excellent resemblance is found between the model's and the data's probability distributions of integration time up to five hours.

  1. The Third BATSE Gamma-Ray Burst Catalog

    NASA Technical Reports Server (NTRS)

    Meegan, Charles A.; Pendleton, Geoffrey N.; Briggs, Michael S.; Kouveliotou, Chryssa; Koshut, Thomas M.; Lestrade, John Patrick; Paciesas, William S.; McCollough, Michael L.; Brainerd, Jerome J.; Horack, John M.; et al.

    1996-01-01

    The Burst and Transient Source Experiment (BATSE) on the Compton Gamma Ray Observatory (CGRO) has triggered on 1122 cosmic gamma-ray bursts between 1991 April 19 and 1994 September 19. These events constitute the Third BATSE (3B) burst catalog. This catalog includes the events previously reported in the 2B catalog, which covered the time interval 1991 April 19 to 1993 March 9. We present tables of the burst occurrence times, locations, peak fluxes, fluences, and durations. In general, results from previous BATSE catalogs are confirmed here with greater statistical significance. The angular distribution is consistent with isotropy. The mean galactic dipole and quadrupole moments are within 0.6σ and 0.3σ, respectively, of the values expected for isotropy. The intensity distribution is not consistent with a homogeneous distribution of burst sources, with V/V_max = 0.33 +/- 0.01. The duration distribution (T_90) exhibits bimodality, with peaks at approx. 0.5 and approx. 30 s. There is no compelling evidence for burst repetition, but only weak limits can be placed on the repetition rate.
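
    The V/V_max statistic quoted above is computed per burst as (C_p/C_lim)^(-3/2), where C_p is the peak count rate and C_lim the trigger threshold; a homogeneous Euclidean population gives a mean of 0.5. A sketch with a synthetic homogeneous population (not the BATSE data):

      import numpy as np

      def v_vmax(peak_rate, threshold):
          """V/Vmax = (C_p / C_lim)^(-3/2); mean 0.5 for homogeneous sources."""
          return (peak_rate / threshold) ** -1.5

      rng = np.random.default_rng(2)
      r = rng.uniform(0.0, 1.0, 20000) ** (1.0 / 3.0)  # uniform in volume
      c_p = 1.0 / r**2                                  # inverse-square peak rate
      s = v_vmax(c_p, threshold=1.0)                    # all above threshold here
      print(s.mean(), s.std() / np.sqrt(s.size))        # ~0.5 within the error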

  2. Spectra and angular distributions of atmospheric gamma rays from 0.3 to 10 MeV at lambda = 40 deg

    NASA Technical Reports Server (NTRS)

    Ling, J. C.; Gruber, D. E.

    1977-01-01

    Measurements of the spectral and angular distributions of atmospheric gamma rays in the energy range 0.3-10 MeV over Palestine, Texas, at residual depths of 2.5 and 70 g/sq cm are reported. In confirmation of the general features of a model prediction, the measurements show at 2.5 g/sq cm upward-moving fluxes greater than the downward-moving fluxes, the effect increasing with energy, and approximate isotropy at 70 g/sq cm. Numerous characteristic gamma-ray lines were observed, most prominently at 0.511, 1.6, 2.3, 4.4, and 6.1 MeV. Their intensities were also compared with model predictions. Observations were made with an actively shielded scintillation counter with two detectors, one of aperture 50 deg FWHM and the other of 120 deg FWHM. Above 1 MeV, contributions to the counting rate from photons penetrating the shield annulus and from neutron interactions were large; they were studied by means of a Monte Carlo code and are extensively discussed.

  3. Naima: a Python package for inference of particle distribution properties from nonthermal spectra

    NASA Astrophysics Data System (ADS)

    Zabalza, V.

    2015-07-01

    The ultimate goal of the observation of nonthermal emission from astrophysical sources is to understand the underlying particle acceleration and evolution processes, yet few tools are publicly available to infer the particle distribution properties from the observed photon spectra from X-rays to VHE gamma rays. Here I present naima, an open source Python package that provides models for nonthermal radiative emission from homogeneous distributions of relativistic electrons and protons. Contributions from synchrotron, inverse Compton, nonthermal bremsstrahlung, and neutral-pion decay can be computed for a series of functional shapes of the particle energy distributions, with the possibility of using user-defined particle distribution functions. In addition, naima provides a set of functions that allow these models to be used to fit observed nonthermal spectra through an MCMC procedure, obtaining probability distribution functions for the particle distribution parameters. Here I present the models and methods available in naima and an example of their application to the understanding of a galactic nonthermal source. naima's documentation, including how to install the package, is available at http://naima.readthedocs.org.
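
    A minimal usage sketch, adapted from the kind of example given in naima's documentation; class and keyword names should be verified against the installed version, and the spectral parameters below are arbitrary:

      import numpy as np
      import astropy.units as u
      from naima.models import ExponentialCutoffPowerLaw, InverseCompton

      # assumed electron spectrum: power law with an exponential cutoff
      electrons = ExponentialCutoffPowerLaw(amplitude=1e36 / u.eV,
                                            e_0=10 * u.TeV,
                                            alpha=2.7,
                                            e_cutoff=30 * u.TeV)
      ic = InverseCompton(electrons, seed_photon_fields=["CMB"])

      photon_energy = np.logspace(-1, 2, 50) * u.TeV
      sed = ic.sed(photon_energy, distance=2.0 * u.kpc)  # E^2 dN/dE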

  4. On the probability distribution function of the mass surface density of molecular clouds. I

    NASA Astrophysics Data System (ADS)

    Fischera, Jörg

    2014-05-01

    The probability distribution function (PDF) of the mass surface density is an essential characteristic of the structure of molecular clouds or the interstellar medium in general. Observations of the PDF of molecular clouds indicate a composition of a broad distribution around the maximum and a decreasing tail at high mass surface densities. The first component is attributed to the random distribution of gas which is modeled using a log-normal function while the second component is attributed to condensed structures modeled using a simple power-law. The aim of this paper is to provide an analytical model of the PDF of condensed structures which can be used by observers to extract information about the condensations. The condensed structures are considered to be either spheres or cylinders with a truncated radial density profile at cloud radius r_cl. The assumed profile is of the form ρ(r) = ρ_c / (1 + (r/r_0)^2)^(n/2) for arbitrary power n, where ρ_c and r_0 are the central density and the inner radius, respectively. An implicit function is obtained which either truncates (sphere) or has a pole (cylinder) at maximal mass surface density. The PDF of spherical condensations and the asymptotic PDF of cylinders in the limit of infinite overdensity ρ_c/ρ(r_cl) flattens for steeper density profiles and has a power law asymptote at low and high mass surface densities and a well defined maximum. The power index of the asymptote Σ^(-γ) of the logarithmic PDF (Σ P(Σ)) in the limit of high mass surface densities is given by γ = (n + 1)/(n - 1) - 1 (spheres) or by γ = n/(n - 1) - 1 (cylinders in the limit of infinite overdensity). Appendices are available in electronic form at http://www.aanda.org
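
    The PDF described here can be checked numerically: integrate the truncated profile along lines of sight at impact parameters drawn uniformly over the projected area, then histogram the resulting mass surface densities. A sketch under those assumptions (grid sizes and parameter values are arbitrary):

      import numpy as np

      def sigma_of_b(b, rcl=1.0, r0=0.1, rho_c=1.0, n=2.0, nz=400):
          """Surface density along a chord at impact parameter b through a
          sphere with rho(r) = rho_c / (1 + (r/r0)^2)^(n/2), truncated at rcl."""
          zmax = np.sqrt(rcl**2 - b**2)
          z = np.linspace(-zmax, zmax, nz)
          rho = rho_c / (1.0 + (b**2 + z**2) / r0**2) ** (n / 2.0)
          return np.trapz(rho, z)

      rng = np.random.default_rng(3)
      b = np.sqrt(rng.uniform(0.0, 1.0, 5000))   # uniform over projected area, rcl = 1
      sigma = np.array([sigma_of_b(bi) for bi in b])
      pdf, edges = np.histogram(sigma, bins=60, density=True)
      # the high-Sigma behavior of Sigma*P(Sigma) can be compared with the
      # quoted asymptotic index gamma = (n + 1)/(n - 1) - 1 for spheres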

  5. Modulated high-energy gamma-ray emission from the microquasar Cygnus X-3.

    PubMed

    Abdo, A A; Ackermann, M; Ajello, M; Axelsson, M; Baldini, L; Ballet, J; Barbiellini, G; Bastieri, D; Baughman, B M; Bechtol, K; Bellazzini, R; Berenji, B; Blandford, R D; Bloom, E D; Bonamente, E; Borgland, A W; Brez, A; Brigida, M; Bruel, P; Burnett, T H; Buson, S; Caliandro, G A; Cameron, R A; Caraveo, P A; Casandjian, J M; Cecchi, C; Celik, O; Chaty, S; Cheung, C C; Chiang, J; Ciprini, S; Claus, R; Cohen-Tanugi, J; Cominsky, L R; Conrad, J; Corbel, S; Corbet, R; Dermer, C D; de Palma, F; Digel, S W; do Couto e Silva, E; Drell, P S; Dubois, R; Dubus, G; Dumora, D; Farnier, C; Favuzzi, C; Fegan, S J; Focke, W B; Fortin, P; Frailis, M; Fusco, P; Gargano, F; Gehrels, N; Germani, S; Giavitto, G; Giebels, B; Giglietto, N; Giordano, F; Glanzman, T; Godfrey, G; Grenier, I A; Grondin, M-H; Grove, J E; Guillemot, L; Guiriec, S; Hanabata, Y; Harding, A K; Hayashida, M; Hays, E; Hill, A B; Hjalmarsdotter, L; Horan, D; Hughes, R E; Jackson, M S; Jóhannesson, G; Johnson, A S; Johnson, T J; Johnson, W N; Kamae, T; Katagiri, H; Kawai, N; Kerr, M; Knödlseder, J; Kocian, M L; Koerding, E; Kuss, M; Lande, J; Latronico, L; Lemoine-Goumard, M; Longo, F; Loparco, F; Lott, B; Lovellette, M N; Lubrano, P; Madejski, G M; Makeev, A; Marchand, L; Marelli, M; Max-Moerbeck, W; Mazziotta, M N; McColl, N; McEnery, J E; Meurer, C; Michelson, P F; Migliari, S; Mitthumsiri, W; Mizuno, T; Monte, C; Monzani, M E; Morselli, A; Moskalenko, I V; Murgia, S; Nolan, P L; Norris, J P; Nuss, E; Ohsugi, T; Omodei, N; Ong, R A; Ormes, J F; Paneque, D; Parent, D; Pelassa, V; Pepe, M; Pesce-Rollins, M; Piron, F; Pooley, G; Porter, T A; Pottschmidt, K; Rainò, S; Rando, R; Ray, P S; Razzano, M; Rea, N; Readhead, A; Reimer, A; Reimer, O; Richards, J L; Rochester, L S; Rodriguez, J; Rodriguez, A Y; Romani, R W; Ryde, F; Sadrozinski, H F-W; Sander, A; Saz Parkinson, P M; Sgrò, C; Siskind, E J; Smith, D A; Smith, P D; Spinelli, P; Starck, J-L; Stevenson, M; Strickman, M S; Suson, D J; Takahashi, H; Tanaka, T; Thayer, J B; Thompson, D J; Tibaldo, L; Tomsick, J A; Torres, D F; Tosti, G; Tramacere, A; Uchiyama, Y; Usher, T L; Vasileiou, V; Vilchez, N; Vitale, V; Waite, A P; Wang, P; Wilms, J; Winer, B L; Wood, K S; Ylinen, T; Ziegler, M

    2009-12-11

    Microquasars are accreting black holes or neutron stars in binary systems with associated relativistic jets. Despite their frequent outburst activity, they have never been unambiguously detected emitting high-energy gamma rays. The Fermi Large Area Telescope (LAT) has detected a variable high-energy source coinciding with the position of the x-ray binary and microquasar Cygnus X-3. Its identification with Cygnus X-3 is secured by the detection of its orbital period in gamma rays, as well as the correlation of the LAT flux with radio emission from the relativistic jets of Cygnus X-3. The gamma-ray emission probably originates from within the binary system, opening new areas in which to study the formation of relativistic jets.

  6. Multiple Streaming and the Probability Distribution of Density in Redshift Space

    NASA Astrophysics Data System (ADS)

    Hui, Lam; Kofman, Lev; Shandarin, Sergei F.

    2000-07-01

    We examine several aspects of redshift distortions by expressing the redshift-space density in terms of the eigenvalues and orientation of the local Lagrangian deformation tensor. We explore the importance of multiple streaming using the Zeldovich approximation (ZA), and compute the average number of streams in both real and redshift space. We find that multiple streaming can be significant in redshift space but negligible in real space, even at moderate values of the linear fluctuation amplitude (σ_l ≲ 1). Moreover, unlike their real-space counterparts, redshift-space multiple streams can flow past each other with minimal interactions. Such nonlinear redshift-space effects, which are physically distinct from the fingers-of-God due to small-scale virialized motions, might in part explain the well-known departure of redshift distortions from the classic linear prediction by Kaiser, even at relatively large scales where the corresponding density field in real space is well described by linear perturbation theory. We also compute, using the ZA, the probability distribution function (PDF) of the density, as well as S_3, in real and redshift space, and compare it with the PDF measured from N-body simulations. The role of caustics in defining the character of the high-density tail is examined. We find that (non-Lagrangian) smoothing, due to both finite resolution or discreteness and small-scale velocity dispersions, is very effective in erasing caustic structures, unless the initial power spectrum is sufficiently truncated.

  7. Near-infrared and gamma-ray monitoring of TANAMI gamma-ray bright sources

    DOE PAGES

    Nesci, R.; Tosti, G.; Pursimo, T.; ...

    2013-06-18

    Context. Spectral energy distribution and its variability are basic tools for understanding the physical processes operating in active galactic nuclei (AGN). Aims. In this paper we report the results of a one-year near-infrared (NIR) and optical monitoring of a sample of 22 AGN known to be gamma-ray emitters, aimed at discovering correlations between optical and gamma-ray emission. Methods. We observed our objects with the Rapid Eye Mount (REM) telescope in the J, H, K, and R bands nearly twice every month during their visibility window and derived light curves and spectral indexes. We also analyzed the gamma-ray data from the Fermi Gamma-ray Space Telescope, making weekly averages. Results. Six sources were never detected during our monitoring, proving to be fainter than their historical Two Micron All Sky Survey (2MASS) level. All of the sixteen detected sources showed marked flux density variability, while the spectral indexes remained unchanged within our sensitivity limits. Steeper sources showed, on average, a larger variability. From the NIR light curves we also computed a variability speed index for each detected source. Only one source (PKS 0208-512) underwent an NIR flare during our monitoring. Half of the sources showed a regular flux density trend on a one-year time scale, but do not show any other peculiar characteristic. The broadband spectral index α_ro appears to be a good proxy of the NIR spectral index only for BL Lac objects. No clear correlation between NIR and gamma-ray data is evident in our data, save for PKS 0537-441, PKS 0521-360, PKS 2155-304, and PKS 1424-418. In conclusion, the gamma-ray/NIR flux ratio showed a large spread, QSOs being generally gamma-louder than BL Lacs, with a marked correlation with the estimated peak frequency (ν_peak) of the synchrotron emission.

  8. Gamma-Ray Astronomy Across 6 Decades of Energy: Synergy between Fermi, IACTs, and HAWC

    NASA Technical Reports Server (NTRS)

    Hui, C. Michelle

    2017-01-01

    Keywords: gamma-ray observatories, gamma-ray astrophysics, GeV-TeV sky survey, Galaxy, Galactic plane, source distribution. The gamma-ray sky is currently well monitored with good survey coverage. Many instruments covering different wavebands and messengers (X rays, gamma rays, neutrinos, gravitational waves such as LIGO) are available for simultaneous observations. Both wide-field and pointing instruments are in development and coming online in the next decade.

  9. Rapidly assessing the probability of exceptionally high natural hazard losses

    NASA Astrophysics Data System (ADS)

    Gollini, Isabella; Rougier, Jonathan

    2014-05-01

    One of the objectives in catastrophe modeling is to assess the probability distribution of losses for a specified period, such as a year. From the point of view of an insurance company, the whole of the loss distribution is interesting, and valuable in determining insurance premiums. But the shape of the right-hand tail is critical, because it impinges on the solvency of the company. A simple measure of the risk of insolvency is the probability that the annual loss will exceed the company's current operating capital. Imposing an upper limit on this probability is one of the objectives of the EU Solvency II directive. If a probabilistic model is supplied for the loss process, then this tail probability can be computed, either directly, or by simulation. This can be a lengthy calculation for complex losses. Given the inevitably subjective nature of quantifying loss distributions, computational resources might be better used in a sensitivity analysis. This requires either a quick approximation to the tail probability or an upper bound on the probability, ideally a tight one. We present several different bounds, all of which can be computed nearly instantly from a very general event loss table. We provide a numerical illustration, and discuss the conditions under which the bound is tight. Although we consider the perspective of insurance and reinsurance companies, exactly the same issues concern the risk manager, who is typically very sensitive to large losses.
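
    As an illustration of the trade-off, one crude but instant bound is Markov's inequality applied to the mean annual loss from the event loss table; it can be checked against a Monte Carlo estimate. The table below is invented, independent Poisson event counts are assumed, and the authors' tighter bounds are not reproduced here:

      import numpy as np

      rates  = np.array([0.02, 0.05, 0.10, 0.50])   # hypothetical events per year
      losses = np.array([900., 400., 150., 20.])    # hypothetical loss per event
      capital = 1000.0

      # Markov bound: P(S >= c) <= E[S] / c, instant from the table
      markov_bound = min(1.0, np.sum(rates * losses) / capital)

      # Monte Carlo estimate of the same tail probability
      rng = np.random.default_rng(4)
      counts = rng.poisson(rates, size=(200_000, rates.size))
      annual = (counts * losses).sum(axis=1)
      print(markov_bound, np.mean(annual > capital))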

  10. Classroom Research: Assessment of Student Understanding of Sampling Distributions of Means and the Central Limit Theorem in Post-Calculus Probability and Statistics Classes

    ERIC Educational Resources Information Center

    Lunsford, M. Leigh; Rowell, Ginger Holmes; Goodson-Espy, Tracy

    2006-01-01

    We applied a classroom research model to investigate student understanding of sampling distributions of sample means and the Central Limit Theorem in post-calculus introductory probability and statistics courses. Using a quantitative assessment tool developed by previous researchers and a qualitative assessment tool developed by the authors, we…

  11. Anticonvulsant properties of alpha, gamma, and alpha, gamma-substituted gamma-butyrolactones.

    PubMed

    Klunk, W E; Covey, D F; Ferrendelli, J A

    1982-09-01

    Derivatives of gamma-butyrolactone (GBL) substituted on the alpha- and/or gamma-positions were synthesized and tested for their effects on behavior in mice, on the electroencephalographs and blood pressure of paralyzed-ventilated guinea pigs, and on electrical activity of incubated hippocampal slices. Several compounds, including alpha-ethyl-alpha-methyl GBL (alpha-EMGBL), alpha, alpha-dimethyl GBL, alpha, gamma-diethyl-alpha, gamma-dimethyl GBL, and gamma-ethyl-gamma-methyl GBL, prevented seizures induced by pentylenetetrazol, beta-ethyl-beta-methyl-gamma-butyrolactone (beta-EMGBL), picrotoxin, or all three compounds in mice and guinea pigs but had no effect on seizures induced by maximal electroshock or bicuculline. Neither gamma-hydroxybutyrate (GHB) nor alpha-isopropylidine GBL had any anticonvulsant activity. The anticonvulsant alpha-substituted compounds had a potent hypotensive effect and antagonized the hypertensive effect of beta-EMGBL. Alpha-EMGBL was tested in incubated hippocampal slices and was found to depress basal activity and antagonize excitation induced by beta-EMGBL. These results demonstrate that alpha-alkyl-substituted GBL and, to a lesser extent, gamma-substituted derivatives are anticonvulsant agents and that their effects are strikingly different from those of GHB or beta-alkyl-substituted GBLs, which are epileptogenic. Possibly beta- and alpha-substituted GBLs act at the same site as agonists and antagonists, respectively.

  12. Determination of the measurement threshold in gamma-ray spectrometry.

    PubMed

    Korun, M; Vodenik, B; Zorko, B

    2017-03-01

    In gamma-ray spectrometry the measurement threshold describes the lower boundary of the interval of peak areas originating in the response of the spectrometer to gamma-rays from the sample measured. In this sense it presents a generalization of the net indication corresponding to the decision threshold, which is the measurement threshold at the quantity value zero for a predetermined probability of making errors of the first kind. Measurement thresholds were determined for peaks appearing in the spectra of the radon daughters 214Pb and 214Bi by measuring the spectrum 35 times under repeatable conditions. For the calculation of the measurement threshold the probability of detection of the peaks and the mean relative uncertainty of the peak area were used. The relative measurement thresholds, the ratios between the measurement threshold and the mean peak area uncertainty, were determined for 54 peaks where the probability of detection varied between a few percent and about 95% and the relative peak area uncertainty between 30% and 80%. The relative measurement thresholds vary considerably from peak to peak, although the nominal value of the sensitivity parameter defining the sensitivity for locating peaks was equal for all peaks. At the value of the sensitivity parameter used, the peak analysis does not locate peaks corresponding to the decision threshold with a probability in excess of 50%. This implies that peaks in the spectrum may not be located although the true value of the measurand exceeds the decision threshold.

  13. An empirical probability density distribution of planetary ionosphere storms with geomagnetic precursors

    NASA Astrophysics Data System (ADS)

    Gulyaeva, Tamara; Stanislawska, Iwona; Arikan, Feza; Arikan, Orhan

    The probability of occurrence of positive and negative planetary ionosphere storms is evaluated using the W index maps produced from Global Ionospheric Maps of Total Electron Content, GIM-TEC, provided by the Jet Propulsion Laboratory, and transformed from geographic coordinates to the magnetic coordinates frame. The auroral electrojet AE index and the equatorial disturbance storm time Dst index are investigated as precursors of the global ionosphere storm. The superposed epoch analysis is performed for 77 intense storms (Dst ≤ -100 nT) and 227 moderate storms (-100 nT < Dst ≤ -50 nT), yielding the positive storm probability per map, pW+, and the negative storm probability, pW-, with model parameters determined using a Particle Swarm Optimization routine with the best fit to the data in the least squares sense. The normalized cross-correlation function is used to define the lag (time delay) between the probability of the positive phase pW+ (W = 3 and 4) and the negative phase pW- (W = -3 and -4) of the ionosphere storm, versus the AE index and the Dst index. It is found that the AE index is better suited than the Dst index to serve as a precursor of the ionosphere storm, with the onset of the average auroral AE storm occurring 6 h before the equatorial Dst storm onset for intense storms and 3 h in advance of moderate Dst storms. A similar space-zone advancement of the ionosphere storm is observed with the W index (pW+ and pW-), depicting a maximum localized in the polar magnetic zone and a minimum at the magnetic equator. An empirical relation for pW+ and pW- with the AE storm precursor is derived which enables the probability of occurrence of the ionosphere storm to be predicted with a leading time of 1-2 h for the positive ionosphere storm and 9-10 h for the negative ionosphere storm. The ionosphere storm probability model is validated using data for 2 intense and 20 moderate storms.

  14. Novel approximation of misalignment fading modeled by Beckmann distribution on free-space optical links.

    PubMed

    Boluda-Ruiz, Rubén; García-Zambrana, Antonio; Castillo-Vázquez, Carmen; Castillo-Vázquez, Beatriz

    2016-10-03

    A novel accurate and useful approximation of the well-known Beckmann distribution is presented here, which is used to model generalized pointing errors in the context of free-space optical (FSO) communication systems. We derive an approximate closed-form probability density function (PDF) for the composite gamma-gamma (GG) atmospheric turbulence with the pointing error model using the proposed approximation of the Beckmann distribution, which is valid for most practical terrestrial FSO links. This approximation takes into account the effect of the beam width, different jitters for the elevation and the horizontal displacement and the simultaneous effect of nonzero boresight errors for each axis at the receiver plane. Additionally, the proposed approximation allows us to delimit two different FSO scenarios. The first of them is when atmospheric turbulence is the dominant effect in relation to generalized pointing errors, and the second one when generalized pointing error is the dominant effect in relation to atmospheric turbulence. The second FSO scenario has not been studied in-depth by the research community. Moreover, the accuracy of the method is measured both visually and quantitatively using curve-fitting metrics. Simulation results are further included to confirm the analytical results.

  15. Distribution of leached radioactive material in the Legin Group Area, San Miguel County, Colorado

    USGS Publications Warehouse

    Rogers, Allen S.

    1950-01-01

    Radioactivity anomalies, which are small in magnitude, and probably are not caused by extensions of known uranium-vanadium ore bodies, were detected during the gamma-ray logging of diamond-drill holes in the Legin group of claims, southwest San Miguel County, Colo. The positions of these anomalies are at the top surfaces of mudstone strata within, and at the base of, the ore-bearing sandstone of the Salt Wash member of the Morrison formation. The distribution of these anomalies suggests that ground water has leached radioactive material from the ore bodies and has carried it down dip and laterally along the top surfaces of underlying impermeable mudstone strata for distances as great as 300 feet. The anomalies are probably caused by radon and its daughter elements. Preliminary tests indicate that radon in quantities up to 10^-7 curies per liter may be present in ground water flowing along sandstone-mudstone contacts under carnotite ore bodies. In comparison, the radium content of the same water is less than 10^-10 curies per liter. Further substantiation of the relationship between ore bodies, the movement of water, and the radon-caused anomalies may greatly increase the scope of gamma-ray logs of drill holes as an aid to prospecting.

  16. Gamma ray bursts: Current status of observations and theory

    NASA Technical Reports Server (NTRS)

    Meegan, Charles A.

    1990-01-01

    Gamma-ray bursts display a wide range of temporal and spectral characteristics, but typically last several seconds and emit most of their energy in the low-energy gamma-ray region. The burst sources appear to be isotropically distributed on the sky. Several lines of evidence suggest magnetic neutron stars as sources for bursts. A variety of energy sources and emission mechanisms were proposed.

  17. Gamma ray bursts: Current status of observations and theory

    NASA Technical Reports Server (NTRS)

    Meegan, Charles A.

    1990-01-01

    Gamma ray bursts display a wide range of temporal and spectral characteristics, but typically last several seconds and emit most of their energy in the low-energy gamma-ray region. The burst sources appear to be isotropically distributed on the sky. Several lines of evidence suggest magnetic neutron stars as sources for bursts. A variety of energy sources and emission mechanisms are proposed.

  18. Calculating Absolute Transition Probabilities for Deformed Nuclei in the Rare-Earth Region

    NASA Astrophysics Data System (ADS)

    Stratman, Anne; Casarella, Clark; Aprahamian, Ani

    2017-09-01

    Absolute transition probabilities are the cornerstone of understanding nuclear structure physics in comparison to nuclear models. We have developed a code to calculate absolute transition probabilities from measured lifetimes, using a Python script and a Mathematica notebook. Both of these methods take pertinent quantities such as the lifetime of a given state, the energy and intensity of the emitted gamma ray, and the multipolarities of the transitions to calculate the appropriate B(E1), B(E2), B(M1) or in general, any B(σλ) values. The program allows for the inclusion of mixing ratios of different multipolarities and the electron conversion of gamma-rays to correct for their intensities, and yields results in absolute units or results normalized to Weisskopf units. The code has been tested against available data in a wide range of nuclei from the rare earth region (28 in total), including 146-154Sm, 154-160Gd, 158-164Dy, 162-170Er, 168-176Yb, and 174-182Hf. It will be available from the Notre Dame Nuclear Science Laboratory webpage for use by the community. This work was supported by the University of Notre Dame College of Science, and by the National Science Foundation, under Contract PHY-1419765.
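
    A minimal sketch of the lifetime-to-B(σλ) conversion such a code performs, using the standard single-multipole rate formulas (E_γ in MeV, rates in s^-1) as tabulated in standard references; the function and its arguments are illustrative, not the Notre Dame code itself:

      # lambda(E1) = 1.587e15 * E^3 * B(E1) [e^2 fm^2]
      # lambda(E2) = 1.223e9  * E^5 * B(E2) [e^2 fm^4]
      # lambda(M1) = 1.779e13 * E^3 * B(M1) [mu_N^2]
      COEF = {"E1": (1.587e15, 3), "E2": (1.223e9, 5), "M1": (1.779e13, 3)}

      def b_value(tau_ps, e_gamma_mev, mult="E2", branching=1.0, alpha_ce=0.0):
          """B(sigma lambda) from a mean lifetime tau (ps), correcting the
          gamma intensity for the branching ratio and electron conversion."""
          partial_rate = branching / (tau_ps * 1e-12 * (1.0 + alpha_ce))
          coef, power = COEF[mult]
          return partial_rate / (coef * e_gamma_mev ** power)

      # example: a 100 ps state decaying by a pure 0.3 MeV E2 transition
      print(b_value(100.0, 0.3, mult="E2"))  # B(E2) in e^2 fm^4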

  19. Comparison of penumbra regions produced by ancient Gamma knife model C and Gamma ART 6000 using Monte Carlo MCNP6 simulation.

    PubMed

    Banaee, Nooshin; Asgari, Sepideh; Nedaie, Hassan Ali

    2018-07-01

    The accuracy of penumbral measurements in radiotherapy is pivotal because dose planning computers require accurate data to adequately model the beams, which in turn are used to calculate patient dose distributions. Gamma knife is a non-invasive intracranial technique based on principles of the Leksell stereotactic system for open deep brain surgeries, invented and developed by Professor Lars Leksell. The aim of this study is to compare the penumbra widths of the Leksell Gamma Knife model C and Gamma ART 6000. Initially, the structures of both systems were simulated using the Monte Carlo MCNP6 code and, after validating the accuracy of the simulation, beam profiles of different collimators were plotted. MCNP6 beam profile calculations showed that the penumbra values of the Leksell Gamma Knife model C and Gamma ART 6000 for the 18, 14, 8 and 4 mm collimators are 9.7, 7.9, 4.3, and 2.6 mm and 8.2, 6.9, 3.6, and 2.4 mm, respectively. The results of this study showed that since Gamma ART 6000 has a larger solid angle in comparison with Gamma Knife model C, it produces better beam profile penumbras than Gamma Knife model C in the direct plane.

  20. Probability theory for 3-layer remote sensing radiative transfer model: univariate case.

    PubMed

    Ben-David, Avishai; Davidson, Charles E

    2012-04-23

    A probability model for a 3-layer radiative transfer model (foreground layer, cloud layer, background layer, and an external source at the end of line of sight) has been developed. The 3-layer model is fundamentally important as the primary physical model in passive infrared remote sensing. The probability model is described by the Johnson family of distributions that are used as a fit for theoretically computed moments of the radiative transfer model. From the Johnson family we use the SU distribution that can address a wide range of skewness and kurtosis values (in addition to addressing the first two moments, mean and variance). In the limit, SU can also describe lognormal and normal distributions. With the probability model one can evaluate the potential for detecting a target (vapor cloud layer), the probability of observing thermal contrast, and evaluate performance (receiver operating characteristics curves) in clutter-noise limited scenarios. This is (to our knowledge) the first probability model for the 3-layer remote sensing geometry that treats all parameters as random variables and includes higher-order statistics.
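
    A sketch of the moment-matching step using SciPy's Johnson SU implementation; the four target moments below are invented stand-ins for values computed from the radiative transfer model:

      import numpy as np
      from scipy import stats, optimize

      # target (mean, variance, skewness, excess kurtosis) -- illustrative values
      target = np.array([1.0, 0.04, -0.8, 1.5])

      def mismatch(p):
          a, b, loc, scale = p
          if b <= 0 or scale <= 0:
              return 1e9                               # keep parameters valid
          m, v, s, k = stats.johnsonsu.stats(a, b, loc=loc, scale=scale,
                                             moments="mvsk")
          return float(np.sum((np.array([m, v, s, k]) - target) ** 2))

      res = optimize.minimize(mismatch, x0=[1.0, 2.0, 1.0, 0.2],
                              method="Nelder-Mead")
      a, b, loc, scale = res.x                         # fitted SU parameters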

  1. Observation of the χ_c2(2P) meson in the reaction γγ → DD̄ at BABAR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aubert, B.; Karyotakis, Y.; Lees, J. P.

    2010-05-01

    A search for the Z(3930) resonance in γγ production of the DD̄ system has been performed using a data sample corresponding to an integrated luminosity of 384 fb^-1 recorded by the BABAR experiment at the PEP-II asymmetric-energy electron-positron collider. The DD̄ invariant mass distribution shows clear evidence of the Z(3930) state with a significance of 5.8σ. We determine mass and width values of (3926.7 ± 2.7 ± 1.1) MeV/c^2 and (21.3 ± 6.8 ± 3.6) MeV, respectively. A decay angular analysis provides evidence that the Z(3930) is a tensor state with positive parity and C parity (J^PC = 2^++); therefore we identify the Z(3930) state as the χ_c2(2P) meson. The value of the partial width Γ_γγ × B(Z(3930) → DD̄) is found to be (0.24 ± 0.05 ± 0.04) keV.

  2. Transient slow gamma synchrony underlies hippocampal memory replay

    PubMed Central

    Carr, Margaret F.; Karlsson, Mattias P.; Frank, Loren M.

    2012-01-01

    The replay of previously stored memories during hippocampal sharp wave ripples (SWRs) is thought to support both memory retrieval and consolidation in distributed hippocampal-neocortical circuits. Replay events consist of precisely timed sequences of spikes from CA3 and CA1 neurons that are coordinated both within and across hemispheres. The mechanism of this coordination is not understood. Here we show that during SWRs in both awake and quiescent states there are transient increases in slow gamma (20-50 Hz) power and synchrony across dorsal CA3 and CA1 networks of both hemispheres. These gamma oscillations entrain CA3 and CA1 spiking. Moreover, during awake SWRs, higher levels of slow gamma synchrony are predictive of higher quality replay of past experiences. Our results indicate that CA3–CA1 gamma synchronization is a central component of awake memory replay and suggest that transient gamma synchronization serves as a clocking mechanism to enable coordinated memory reactivation across the hippocampal network. PMID:22920260

  3. Swift Gamma-Ray Burst Explorer: Mission Design for Rapid, Accurate Location of Gamma-ray Bursts

    NASA Technical Reports Server (NTRS)

    Bundas, David J.

    2004-01-01

    The Swift Gamma-ray Burst Explorer is a NASA Mid-sized Explorer (MIDEX) with the primary mission of determining the origins of Gamma-Ray Bursts (GRBs). It will be the first mission to autonomously respond to newly-discovered GRBs and provide immediate follow-up with narrow-field instruments capable of multi-wavelength (UV, optical, X-ray) observations. The characteristics of GRBs that are the key mission design drivers are their non-repeating and brief-duration bursts of multi-wavelength photons. In addition, rapid notification of the location and characteristics of the GRBs to ground- and space-based observatories drives the end-to-end data analysis and distribution requirements.

  4. Swift Gamma-ray Burst Explorer: Mission Design for Rapid, Accurate Location of Gamma-ray Bursts

    NASA Technical Reports Server (NTRS)

    Bundas, David J.

    2005-01-01

    The Swift Gamma-ray Burst Explorer is a NASA Mid-sized Explorer (MIDEX) with the primary mission of determining the origins of Gamma-Ray Bursts (GRBs). It will be the first mission to autonomously respond to newly-discovered GRBs and provide immediate follow-up with narrow-field instruments capable of multi-wavelength (UV, optical, X-ray) observations. The characteristics of GRBs that are the key mission design drivers are their non-repeating and brief-duration bursts of multi-wavelength photons. In addition, rapid notification of the location and characteristics of the GRBs to ground- and space-based observatories drives the end-to-end data analysis and distribution requirements.

  5. A simultaneous beta and coincidence-gamma imaging system for plant leaves

    NASA Astrophysics Data System (ADS)

    Ranjbar, Homayoon; Wen, Jie; Mathews, Aswin J.; Komarov, Sergey; Wang, Qiang; Li, Ke; O'Sullivan, Joseph A.; Tai, Yuan-Chuan

    2016-05-01

    Positron emitting isotopes, such as 11C, 13N, and 18F, can be used to label molecules. The tracers, such as 11CO2, are delivered to plants to study their biological processes, particularly metabolism and photosynthesis, which may contribute to the development of plants that have a higher yield of crops and biomass. Measurements and resulting images from PET scanners are not quantitative in young plant structures or in plant leaves due to poor positron annihilation in thin objects. To address this problem we have designed, assembled, modeled, and tested a nuclear imaging system (simultaneous beta-gamma imager). The imager can simultaneously detect positrons (β+) and coincidence-gamma rays (γ). The imaging system employs two planar detectors; one is a regular gamma detector which has a LYSO crystal array, and the other is a phoswich detector which has an additional BC-404 plastic scintillator for beta detection. A forward model for positrons is proposed along with a joint image reconstruction formulation to utilize the beta and coincidence-gamma measurements for estimating radioactivity distribution in plant leaves. The joint reconstruction algorithm first reconstructs beta and gamma images independently to estimate the thickness component of the beta forward model and afterward jointly estimates the radioactivity distribution in the object. We have validated the physics model and reconstruction framework through a phantom imaging study and imaging a tomato leaf that has absorbed 11CO2. The results demonstrate that the simultaneously acquired beta and coincidence-gamma data, combined with our proposed joint reconstruction algorithm, improved the quantitative accuracy of estimating radioactivity distribution in thin objects such as leaves. We used the structural similarity (SSIM) index for comparing the leaf images from the simultaneous beta-gamma imager with the ground truth image. The jointly reconstructed images yield SSIM indices of 0.69 and 0.63, whereas the

  6. A simultaneous beta and coincidence-gamma imaging system for plant leaves.

    PubMed

    Ranjbar, Homayoon; Wen, Jie; Mathews, Aswin J; Komarov, Sergey; Wang, Qiang; Li, Ke; O'Sullivan, Joseph A; Tai, Yuan-Chuan

    2016-05-07

    Positron emitting isotopes, such as 11C, 13N, and 18F, can be used to label molecules. The tracers, such as 11CO2, are delivered to plants to study their biological processes, particularly metabolism and photosynthesis, which may contribute to the development of plants that have a higher yield of crops and biomass. Measurements and resulting images from PET scanners are not quantitative in young plant structures or in plant leaves due to poor positron annihilation in thin objects. To address this problem we have designed, assembled, modeled, and tested a nuclear imaging system (simultaneous beta-gamma imager). The imager can simultaneously detect positrons (β+) and coincidence-gamma rays (γ). The imaging system employs two planar detectors; one is a regular gamma detector which has a LYSO crystal array, and the other is a phoswich detector which has an additional BC-404 plastic scintillator for beta detection. A forward model for positrons is proposed along with a joint image reconstruction formulation to utilize the beta and coincidence-gamma measurements for estimating radioactivity distribution in plant leaves. The joint reconstruction algorithm first reconstructs beta and gamma images independently to estimate the thickness component of the beta forward model and afterward jointly estimates the radioactivity distribution in the object. We have validated the physics model and reconstruction framework through a phantom imaging study and imaging a tomato leaf that has absorbed 11CO2. The results demonstrate that the simultaneously acquired beta and coincidence-gamma data, combined with our proposed joint reconstruction algorithm, improved the quantitative accuracy of estimating radioactivity distribution in thin objects such as leaves. We used the structural similarity (SSIM) index for comparing the leaf images from the simultaneous beta-gamma imager with the ground truth image. The jointly reconstructed images yield SSIM indices of 0.69 and 0.63.
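
    The SSIM comparison used here is available off the shelf; a minimal sketch with synthetic images standing in for the reconstructed leaf images (scikit-image is assumed to be installed):

      import numpy as np
      from skimage.metrics import structural_similarity

      rng = np.random.default_rng(5)
      truth = rng.random((64, 64))                      # stand-in activity map
      recon = truth + rng.normal(0.0, 0.05, truth.shape)

      score = structural_similarity(truth, recon,
                                    data_range=truth.max() - truth.min())
      print(score)  # 1.0 would mean identical images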

  7. Failure probability analysis of optical grid

    NASA Astrophysics Data System (ADS)

    Zhong, Yaoquan; Guo, Wei; Sun, Weiqiang; Jin, Yaohui; Hu, Weisheng

    2008-11-01

    Optical grid, the integrated computing environment based on optical networks, is expected to be an efficient infrastructure to support advanced data-intensive grid applications. In an optical grid, faults of both computational and network resources are inevitable due to the large scale and high complexity of the system. As optical-network-based distributed computing systems become extensively applied to data processing, the application failure probability has become an important indicator of application quality and an important aspect that operators consider. This paper presents a task-based method for analyzing the application failure probability in an optical grid, so that the failure probability of an entire application can be quantified and the performance of different backup strategies in reducing it can be compared, allowing the different requirements of different clients to be satisfied. When an application based on a DAG (directed acyclic graph) is executed under different backup strategies, the application failure probability and the application completion time differ. This paper proposes a new multi-objective differentiated services algorithm (MDSA). The new application scheduling algorithm can guarantee the required failure probability and improve network resource utilization, realizing a compromise between the network operator and the application submitter. Differentiated services can thus be achieved in an optical grid.

  8. Evaluation of carotid plaque echogenicity based on the integral of the cumulative probability distribution using gray-scale ultrasound images.

    PubMed

    Huang, Xiaowei; Zhang, Yanling; Meng, Long; Abbott, Derek; Qian, Ming; Wong, Kelvin K L; Zheng, Rongqing; Zheng, Hairong; Niu, Lili

    2017-01-01

    Carotid plaque echogenicity is associated with the risk of cardiovascular events. Gray-scale median (GSM) of the ultrasound image of carotid plaques has been widely used as an objective method for evaluation of plaque echogenicity in patients with atherosclerosis. We proposed a computer-aided method to evaluate plaque echogenicity and compared its efficiency with GSM. One hundred and twenty-five carotid plaques (43 echo-rich, 35 intermediate, 47 echolucent) were collected from 72 patients in this study. The cumulative probability distribution curves were obtained based on statistics of the pixels in the gray-level images of plaques. The area under the cumulative probability distribution curve (AUCPDC) was calculated as its integral value to evaluate plaque echogenicity. The classification accuracy for the three types of plaques is 78.4% (kappa value, κ = 0.673) when the AUCPDC is used for classifier training, whereas it is 64.8% (κ = 0.460) for GSM. The receiver operating characteristic curves were produced to test the effectiveness of AUCPDC and GSM for the identification of echolucent plaques. The area under the curve (AUC) was 0.817 when AUCPDC was used for training the classifier, which is higher than that achieved using GSM (AUC = 0.746). Compared with GSM, the AUCPDC showed a borderline association with coronary heart disease (Spearman r = 0.234, p = 0.050). Our experimental results suggest that AUCPDC analysis is a promising method for evaluation of plaque echogenicity and predicting cardiovascular events in patients with plaques.
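
    The AUCPDC statistic is straightforward to compute from a gray-level region of interest: build the empirical cumulative distribution of pixel values over the 256 gray levels and integrate it. A sketch (the toy ROI is invented; darker, echolucent-like regions give larger AUCPDC):

      import numpy as np

      def gsm_and_aucpdc(roi, n_levels=256):
          """Gray-scale median and area under the cumulative probability
          distribution curve, normalized to [0, 1]."""
          pixels = np.asarray(roi, dtype=float).ravel()
          levels = np.arange(n_levels)
          cdf = np.array([(pixels <= g).mean() for g in levels])
          aucpdc = np.trapz(cdf, levels) / (n_levels - 1)
          return np.median(pixels), aucpdc

      rng = np.random.default_rng(6)
      roi = rng.normal(60.0, 20.0, size=(40, 40)).clip(0, 255)  # darkish plaque
      print(gsm_and_aucpdc(roi))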

  9. Quantum temporal probabilities in tunneling systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anastopoulos, Charis, E-mail: anastop@physics.upatras.gr; Savvidou, Ntina, E-mail: ksavvidou@physics.upatras.gr

    We study the temporal aspects of quantum tunneling as manifested in time-of-arrival experiments in which the detected particle tunnels through a potential barrier. In particular, we present a general method for constructing temporal probabilities in tunneling systems that (i) defines 'classical' time observables for quantum systems and (ii) applies to relativistic particles interacting through quantum fields. We show that the relevant probabilities are defined in terms of specific correlation functions of the quantum field associated with tunneling particles. We construct a probability distribution with respect to the time of particle detection that contains all information about the temporal aspects of the tunneling process. In specific cases, this probability distribution leads to the definition of a delay time that, for parity-symmetric potentials, reduces to the phase time of Bohm and Wigner. We apply our results to piecewise constant potentials, by deriving the appropriate junction conditions on the points of discontinuity. For the double square potential, in particular, we demonstrate the existence of (at least) two physically relevant time parameters, the delay time and a decay rate that describes the escape of particles trapped in the inter-barrier region. Finally, we propose a resolution to the paradox of apparent superluminal velocities for tunneling particles. We demonstrate that the idea of faster-than-light speeds in tunneling follows from an inadmissible use of classical reasoning in the description of quantum systems. Highlights: • Present a general methodology for deriving temporal probabilities in tunneling systems. • Treatment applies to relativistic particles interacting through quantum fields. • Derive a new expression for tunneling time. • Identify new time parameters relevant to tunneling. • Propose a resolution of the superluminality paradox in tunneling.

  10. Probability of success for phase III after exploratory biomarker analysis in phase II.

    PubMed

    Götte, Heiko; Kirchner, Marietta; Sailer, Martin Oliver

    2017-05-01

    The probability of success or average power describes the potential of a future trial by weighting the power with a probability distribution of the treatment effect. The treatment effect estimate from a previous trial can be used to define such a distribution. During the development of targeted therapies, it is common practice to look for predictive biomarkers. The consequence is that the trial population for phase III is often selected on the basis of the most extreme result from phase II biomarker subgroup analyses. In such a case, there is a tendency to overestimate the treatment effect. We investigate whether the overestimation of the treatment effect estimate from phase II is transformed into a positive bias for the probability of success for phase III. We simulate a phase II/III development program for targeted therapies. This simulation allows us to investigate selection probabilities and to compare the estimated with the true probability of success. We consider the estimated probability of success with and without subgroup selection. Depending on the true treatment effects, there is a negative bias without selection because of the weighting by the phase II distribution. In comparison, selection increases the estimated probability of success. Thus, selection does not lead to a bias in probability of success if underestimation due to the phase II distribution and overestimation due to selection cancel each other out. We recommend to perform similar simulations in practice to get the necessary information about the risk and chances associated with such subgroup selection designs.
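
    The core computation, without the subgroup-selection simulation, is the "assurance" integral: average the phase-III power over a normal distribution for the treatment effect centered at the phase-II estimate. A Monte Carlo sketch for a one-sided z-test, with all numbers illustrative:

      import numpy as np
      from scipy import stats

      def probability_of_success(theta_hat, se2, se3, alpha=0.025,
                                 n_draws=200_000, seed=7):
          """Average power: weight phase-III power by N(theta_hat, se2^2)."""
          rng = np.random.default_rng(seed)
          theta = rng.normal(theta_hat, se2, n_draws)   # effect uncertainty
          z_crit = stats.norm.ppf(1.0 - alpha)
          power = stats.norm.cdf(theta / se3 - z_crit)  # one-sided z-test power
          return float(power.mean())

      print(probability_of_success(theta_hat=0.3, se2=0.15, se3=0.10))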

  11. Optimizing Probability of Detection Point Estimate Demonstration

    NASA Technical Reports Server (NTRS)

    Koshti, Ajay M.

    2017-01-01

    Probability of detection (POD) analysis is used in assessing the reliably detectable flaw size in nondestructive evaluation (NDE). MIL-HDBK-1823 and the associated mh1823 POD software give the most common methods of POD analysis. These NDE methods are intended to detect real flaws such as cracks and crack-like flaws. A reliably detectable crack size is required for safe-life analysis of fracture-critical parts. The paper discusses optimizing probability of detection (POD) demonstration experiments using the point estimate method, which is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size (within some tolerance) is used in the demonstration. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false calls (POF) while keeping the flaw sizes in the set as small as possible.
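
    The zero-miss binomial logic is compact enough to illustrate directly. A minimal sketch, assuming the classic 29-flaw, zero-miss demonstration; the POD values are illustrative and this is not NASA's qualification code.

        # With n flaws and no misses allowed, the probability of passing the
        # demonstration (PPD) at a true POD p is p**n.
        from scipy import stats

        n = 29                                   # classic 29-of-29 demonstration
        for pod in (0.80, 0.90, 0.95, 0.99):
            ppd = stats.binom.pmf(n, n, pod)     # probability all n are detected
            print(f"true POD = {pod:.2f}  ->  PPD = {ppd:.4f}")

        # A detector with true POD = 0.90 passes only ~4.7% of the time, which
        # is why passing 29/29 supports a 90% POD claim at 95% confidence.

    Optimizing the demonstration then amounts to trading PPD against the probability of false calls while keeping the flaw sizes small, as described above.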

  12. The maximum entropy method of moments and Bayesian probability theory

    NASA Astrophysics Data System (ADS)

    Bretthorst, G. Larry

    2013-08-01

    The problem of density estimation occurs in many disciplines. For example, in MRI it is often necessary to classify the types of tissues in an image. To perform this classification one must first identify the characteristics of the tissues to be classified. These characteristics might be the intensity of a T1-weighted image; in MRI many other types of characteristic weightings (classifiers) may be generated. In a given tissue type there is no single intensity that characterizes the tissue; rather, there is a distribution of intensities. Often these distributions can be characterized by a Gaussian, but just as often they are much more complicated. Either way, estimating the distribution of intensities is an inference problem. In the case of a Gaussian distribution, one must estimate the mean and standard deviation. However, in the non-Gaussian case the shape of the density function itself must be inferred. Three common techniques for estimating density functions are binned histograms [1, 2], kernel density estimation [3, 4], and the maximum entropy method of moments [5, 6]. In the introduction, the maximum entropy method of moments will be reviewed. Some of its problems and conditions under which it fails will be discussed. Then in later sections, the functional form of the maximum entropy method of moments probability distribution will be incorporated into Bayesian probability theory. It will be shown that Bayesian probability theory solves all of the problems with the maximum entropy method of moments. One gets posterior probabilities for the Lagrange multipliers, and, finally, one can put error bars on the resulting estimated density function.
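
    Because the record describes the method only in words, a minimal numerical sketch of the classical maximum entropy method of moments may help; the grid, the two moment constraints, and the use of the convex dual are assumptions made for this example, and Bretthorst's Bayesian extension is not reproduced here.

        # Maximum entropy density subject to moment constraints: p(x) is
        # proportional to exp(-sum_k lam_k x**k), with the multipliers found
        # by minimizing the convex dual logZ(lam) + lam . m.
        import numpy as np
        from scipy.optimize import minimize

        x = np.linspace(-5, 5, 2001)
        target = np.array([0.0, 1.0])      # assumed constraints: E[x], E[x^2]
        powers = np.vstack([x, x**2])      # moment functions

        def dual(lam):
            logp = -(lam @ powers)
            m = logp.max()                 # stabilize the exponential
            logZ = m + np.log(np.trapz(np.exp(logp - m), x))
            return logZ + lam @ target

        res = minimize(dual, x0=np.zeros(2), method="BFGS")
        p = np.exp(-(res.x @ powers))
        p /= np.trapz(p, x)
        print("multipliers:", res.x.round(3))   # for these constraints: (0, 0.5)
        print("recovered moments:", [round(np.trapz(x**k * p, x), 3) for k in (1, 2)])

    With only the first two moments constrained, the solution is the standard Gaussian, which is a useful correctness check for the optimizer.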

  13. Probabilistic Cloning of Three Real States with Optimal Success Probabilities

    NASA Astrophysics Data System (ADS)

    Rui, Pin-shu

    2017-06-01

    We investigate the probabilistic quantum cloning (PQC) of three real states with average probability distribution. To get the analytic forms of the optimal success probabilities, we assume that the three states have only two pairwise inner products. Based on the optimal success probabilities, we derive the explicit form of 1 → 2 PQC for cloning three real states. The unitary operation needed in the PQC process is worked out too. The optimal success probabilities are also generalized to the M → N PQC case.

  14. Binomial probability distribution model-based protein identification algorithm for tandem mass spectrometry utilizing peak intensity information.

    PubMed

    Xiao, Chuan-Le; Chen, Xiao-Zhou; Du, Yang-Li; Sun, Xuesong; Zhang, Gong; He, Qing-Yu

    2013-01-04

    Mass spectrometry has become one of the most important technologies in proteomic analysis. Tandem mass spectrometry (LC-MS/MS) is a major tool for the analysis of peptide mixtures from protein samples. The key step of MS data processing is the identification of peptides from experimental spectra by searching public sequence databases. Although a number of algorithms to identify peptides from MS/MS data have already been proposed, e.g. Sequest, OMSSA, X!Tandem, Mascot, etc., they are mainly based on statistical models that consider only peak matches between experimental and theoretical spectra, not peak intensity information. Moreover, different algorithms give different results from the same MS data, implying their probable incompleteness and questionable reproducibility. We developed a novel peptide identification algorithm, ProVerB, based on a binomial probability distribution model of protein tandem mass spectrometry combined with a new scoring function, making full use of peak intensity information and, thus, enhancing the ability of identification. Compared with Mascot, Sequest, and SQID, ProVerB identified significantly more peptides from LC-MS/MS data sets at a 1% false discovery rate (FDR) and provided more confident peptide identifications. ProVerB is also compatible with various platforms and experimental data sets, showing its robustness and versatility. The open-source program ProVerB is available at http://bioinformatics.jnu.edu.cn/software/proverb/.

  15. Mixture distributions of wind speed in the UAE

    NASA Astrophysics Data System (ADS)

    Shin, J.; Ouarda, T.; Lee, T. S.

    2013-12-01

    Wind speed probability distributions are commonly used to estimate potential wind energy. The 2-parameter Weibull distribution has been most widely used to characterize the distribution of wind speed. However, it is unable to properly model wind speed regimes when the wind speed distribution presents bimodal or kurtotic shapes. Several studies have concluded that the Weibull distribution should not be used for frequency analysis of wind speed without investigation of the wind speed distribution. Due to these mixture distributional characteristics of wind speed data, the application of mixture distributions should be further investigated in the frequency analysis of wind speed. A number of studies have investigated the potential wind energy in different parts of the Arabian Peninsula, and mixture distributional characteristics of wind speed were detected in some of them. Nevertheless, mixture distributions have not been employed for wind speed modeling in the Arabian Peninsula. In order to improve our understanding of wind energy potential in the Arabian Peninsula, mixture distributions should be tested for the frequency analysis of wind speed. The aim of the current study is to assess the suitability of mixture distributions for the frequency analysis of wind speed in the UAE. Hourly mean wind speed data at 10-m height from 7 stations were used in the current study. The Weibull and Kappa distributions were employed as representatives of the conventional non-mixture distributions. Ten mixture distributions were constructed by mixing four probability distributions: Normal, Gamma, Weibull, and Extreme Value type-one (EV-1). Three parameter estimation methods, the Expectation-Maximization algorithm, the Least Squares method, and Meta-Heuristic Maximum Likelihood (MHML), were employed to estimate the parameters of the mixture distributions. In order to compare the goodness-of-fit of tested distributions and parameter estimation methods for
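
    As a complement to the (truncated) summary above, here is a minimal sketch of one estimation route it mentions: fitting a two-component Weibull mixture by direct maximum likelihood. The synthetic bimodal sample and the starting values are our assumptions, not the study's data.

        # Two-component Weibull mixture fitted by maximizing the log-likelihood;
        # the weight is kept in (0,1) by a logistic transform and the shape and
        # scale parameters are kept positive by working on the log scale.
        import numpy as np
        from scipy import stats
        from scipy.optimize import minimize

        rng = np.random.default_rng(7)
        v = np.concatenate([
            stats.weibull_min.rvs(2.0, scale=4.0, size=600, random_state=rng),
            stats.weibull_min.rvs(3.5, scale=9.0, size=400, random_state=rng),
        ])                                  # synthetic bimodal "wind speeds" (m/s)

        def nll(theta):
            w = 1 / (1 + np.exp(-theta[0]))
            k1, s1, k2, s2 = np.exp(theta[1:])
            pdf = (w * stats.weibull_min.pdf(v, k1, scale=s1)
                   + (1 - w) * stats.weibull_min.pdf(v, k2, scale=s2))
            return -np.sum(np.log(pdf + 1e-300))

        res = minimize(nll, x0=[0.0, np.log(2), np.log(3), np.log(3), np.log(8)],
                       method="Nelder-Mead", options={"maxiter": 20000})
        print("weight:", round(1 / (1 + np.exp(-res.x[0])), 3))
        print("shape/scale pairs:", np.exp(res.x[1:]).round(2))

    An Expectation-Maximization implementation would instead iterate responsibilities and weighted Weibull fits; the direct-likelihood route is simply the shortest to write down.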

  16. Determination of solar flare accelerated ion angular distributions from SMM gamma ray and neutron measurements and determination of the He-3/H ratio in the solar photosphere from SMM gamma ray measurements

    NASA Technical Reports Server (NTRS)

    Lingenfelter, Richard E.

    1989-01-01

    Comparisons of Solar Maximum Mission (SMM) observations of gamma-ray line and neutron emission with theoretical calculations of their expected production by flare-accelerated ion interactions in the solar atmosphere have led to significant advances in the understanding of solar flare particle acceleration and interaction, as well as the flare process itself. These comparisons have enabled the determination of not only the total number and energy spectrum of accelerated ions trapped at the Sun, but also the angular distribution of the ions as they interact in the solar atmosphere. The Monte Carlo program was modified to include in the calculations of ion trajectories the effects of both mirroring in converging magnetic fields and pitch-angle scattering. By comparing the results of these calculations with the SMM observations, not only the angular distribution of the interacting ions but also the initial angular distribution of the ions at acceleration can be determined. The reliable determination of the solar photospheric He-3 abundance is of great importance for understanding nucleosynthesis in the early universe and its implications for cosmology, as well as for the study of the evolution of the Sun. It is also essential for determinations of the spectrum and total number of flare-accelerated ions from the SMM/GRS gamma-ray line measurements. Systematic Monte Carlo calculations of the time dependence were made as a function of the He-3 abundance and other variables. A new series of calculations was compared with the time-dependent flux of 2.223 MeV neutron-capture line emission and the ratio of the time-integrated flux in the 2.223 MeV line to that in the 4.1 to 6.4 MeV nuclear deexcitation band.

  17. Isotopic response with small scintillator based gamma-ray spectrometers

    DOEpatents

    Madden, Norman W [Sparks, NV; Goulding, Frederick S [Lafayette, CA; Asztalos, Stephen J [Oakland, CA

    2012-01-24

    The intrinsic background of a gamma ray spectrometer is significantly reduced by surrounding the scintillator with a second scintillator. This second (external) scintillator surrounds the first scintillator and has an opening of approximately the same diameter as the smaller central scintillator in the forward direction. The second scintillator is selected to have a higher atomic number, and thus has a larger probability for a Compton scattering interaction than within the inner region. Scattering events that are essentially simultaneous in coincidence to the first and second scintillators, from an electronics perspective, are precluded electronically from the data stream. Thus, only gamma-rays that are wholly contained in the smaller central scintillator are used for analytic purposes.

  18. Quantum Probability Cancellation Due to a Single-Photon State

    NASA Technical Reports Server (NTRS)

    Ou, Z. Y.

    1996-01-01

    When an N-photon state enters a lossless symmetric beamsplitter from one input port, the photon distribution at the two output ports has the form of a Bernoulli binomial, with the highest probability at equal partition (N/2 at one output port and N/2 at the other). However, injection of a single-photon state at the other input port can dramatically change the photon distribution at the outputs, resulting in zero probability at equal partition. Such a strong deviation from classical particle theory stems from quantum probability amplitude cancellation. The effect persists even if the N-photon state is replaced by an arbitrary state of light. A special case is the coherent state, which corresponds to homodyne detection of a single-photon state and can lead to the measurement of the wave function of a single-photon state.

  19. Gamma-Ray Emission from the Broad-Line Radio Galaxy 3C 111

    NASA Technical Reports Server (NTRS)

    Hartman, Robert C.; Kadler, Matthias; Tueller, Jack

    2008-01-01

    The broad-line radio galaxy 3C 111 has been suggested as the counterpart of the gamma-ray source 3EG J0416+3650. While 3C 111 meets most of the criteria for a high-probability identification, like a bright flat-spectrum radio core and a blazar-like broadband SED, in the Third EGRET Catalog the large positional offset of about 1.5 degrees put 3C 111 outside the 99% probability region for 3EG J0416+3650, making this association questionable. We present a re-analysis of all available data for 3C 111 from the EGRET archives, resulting in a probable detection of high-energy gamma-ray emission above 1000 MeV from a position close to the nominal position of 3C 111, in two separate viewing periods (VPs), at a 3σ level in each. A new source, GRO J0426+3747, appears to be present nearby, seen only in the >1000 MeV data. For >100 MeV, the data are in agreement with only one source (at the original catalog position) accounting for most of the EGRET-detected emission of 3EG J0416+3650. A follow-up Swift UVOT/XRT observation reveals one moderately bright X-ray source in the error box of 3EG J0416+3650, but because of the large EGRET position uncertainty, it is not certain that the X-ray and gamma-ray sources are associated. A Swift observation of GRO J0426+3747 detected no X-ray source nearby.

  20. Dual-isotope PET using positron-gamma emitters.

    PubMed

    Andreyev, A; Celler, A

    2011-07-21

    Positron emission tomography (PET) is widely recognized as a highly effective functional imaging modality. Unfortunately, standard PET cannot be used for dual-isotope imaging (which would allow for simultaneous investigation of two different biological processes), because positron-electron annihilation products from different tracers are indistinguishable in terms of energy. Methods that have been proposed for dual-isotope PET rely on differences in half-lives of the participating isotopes; these approaches, however, require making assumptions concerning kinetic behavior of the tracers and may not lead to optimal results. In this paper we propose a novel approach for dual-isotope PET and investigate its performance using GATE simulations. Our method requires one of the two radioactive isotopes to be a pure positron emitter and the second isotope to emit an additional high-energy gamma in a cascade simultaneously with positron emission. Detection of this auxiliary prompt gamma in coincidence with the annihilation event allows us to identify the corresponding 511 keV photon pair as originating from the same isotope. Two list-mode datasets are created: a primary dataset that contains all detected 511 keV photon pairs from both isotopes, and a second, tagged (much smaller) dataset that contains only those PET events for which a coincident prompt gamma has also been detected. An image reconstructed from the tagged dataset reflects the distribution of the second positron-gamma radiotracer and serves as a prior for the reconstruction of the primary dataset. Our preliminary simulation study with partially overlapping (18)F/(22)Na and (18)F/(60)Cu radiotracer distributions showed that in these two cases the dual-isotope PET method allowed for separation of the two activity distributions and recovered total activities with relative errors of about 5%.

  1. Multiple Streaming and the Probability Distribution of Density in Redshift Space

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hui, Lam; Kofman, Lev; Shandarin, Sergei F.

    2000-07-01

    We examine several aspects of redshift distortions by expressing the redshift-space density in terms of the eigenvalues and orientation of the local Lagrangian deformation tensor. We explore the importance of multiple streaming using the Zeldovich approximation (ZA), and compute the average number of streams in both real and redshift space. We find that multiple streaming can be significant in redshift space but negligible in real space, even at moderate values of the linear fluctuation amplitude (σ_l ≲ 1). Moreover, unlike their real-space counterparts, redshift-space multiple streams can flow past each other with minimal interactions. Such nonlinear redshift-space effects, which are physically distinct from the fingers-of-God due to small-scale virialized motions, might in part explain the well-known departure of redshift distortions from the classic linear prediction by Kaiser, even at relatively large scales where the corresponding density field in real space is well described by linear perturbation theory. We also compute, using the ZA, the probability distribution function (PDF) of the density, as well as S_3, in real and redshift space, and compare it with the PDF measured from N-body simulations. The role of caustics in defining the character of the high-density tail is examined. We find that (non-Lagrangian) smoothing, due to both finite resolution or discreteness and small-scale velocity dispersions, is very effective in erasing caustic structures, unless the initial power spectrum is sufficiently truncated. (c) 2000 The American Astronomical Society.

  2. Stability metrics for multi-source biomedical data based on simplicial projections from probability distribution distances.

    PubMed

    Sáez, Carlos; Robles, Montserrat; García-Gómez, Juan M

    2017-02-01

    Biomedical data may be composed of individuals generated from distinct, meaningful sources. Due to possible contextual biases in the processes that generate data, there may exist an undesirable and unexpected variability among the probability distribution functions (PDFs) of the source subsamples, which, when uncontrolled, may lead to inaccurate or unreproducible research results. Classical statistical methods may have difficulty uncovering such variability when dealing with multi-modal, multi-type, multivariate data. This work proposes two metrics for the analysis of stability among multiple data sources, robust to the aforementioned conditions, and defined in the context of data quality assessment. Specifically, a global probabilistic deviation metric and a source probabilistic outlyingness metric are proposed. The first provides a bounded degree of the global multi-source variability, designed as an estimator equivalent to the notion of normalized standard deviation of PDFs. The second provides a bounded degree of the dissimilarity of each source to a latent central distribution. The metrics are based on the projection of a simplex geometrical structure constructed from the Jensen-Shannon distances among the sources' PDFs. The metrics have been evaluated and demonstrated correct behaviour on a simulated benchmark and with real multi-source biomedical data using the UCI Heart Disease data set. Biomedical data quality assessment based on the proposed stability metrics may improve the efficiency and effectiveness of biomedical data exploitation and research.
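
    A minimal sketch of the pairwise building block underlying such metrics: Jensen-Shannon distances among per-source histograms. The simple mean-distance summaries below are our simplification for illustration, not the paper's simplex-projection estimators.

        # Pairwise Jensen-Shannon distances among source PDFs estimated by
        # histograms, plus crude global-variability and outlyingness summaries.
        import numpy as np
        from scipy.spatial.distance import jensenshannon

        rng = np.random.default_rng(6)
        sources = [rng.normal(0.0, 1.0, 2000),      # three hypothetical sources,
                   rng.normal(0.0, 1.0, 2000),      # one of them biased
                   rng.normal(0.8, 1.2, 2000)]
        edges = np.linspace(-5, 6, 45)
        pdfs = [np.histogram(s, bins=edges, density=True)[0] + 1e-12 for s in sources]

        n = len(pdfs)
        d = np.zeros((n, n))
        for i in range(n):
            for j in range(i + 1, n):
                d[i, j] = d[j, i] = jensenshannon(pdfs[i], pdfs[j], base=2)

        print("pairwise JS distances:\n", d.round(3))
        print("global variability (mean pairwise):",
              round(d[np.triu_indices(n, 1)].mean(), 3))
        print("per-source outlyingness:", (d.sum(axis=1) / (n - 1)).round(3))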

  3. Observations of Galactic gamma-radiation with the SMM spectrometer

    NASA Technical Reports Server (NTRS)

    Share, G. H.; Kinzer, R. L.; Messina, D. C.; Purcell, W. R.; Chupp, E. L.

    1986-01-01

    Preliminary results from the SMM gamma-ray spectrometer are reported which indicate the detection of a constant source of 0.511-MeV annihilation radiation from the Galaxy. Year-to-year variability appears to be less than 30 percent. The radiation probably comes from a diffuse source and is not associated with the reported compact object at the Galactic center.

  4. Sample size guidelines for fitting a lognormal probability distribution to censored most probable number data with a Markov chain Monte Carlo method.

    PubMed

    Williams, Michael S; Cao, Yong; Ebel, Eric D

    2013-07-15

    Levels of pathogenic organisms in food and water have steadily declined in many parts of the world. A consequence of this reduction is that the proportion of samples that test positive for the most contaminated product-pathogen pairings has fallen to less than 0.1. While this is unequivocally beneficial to public health, datasets with very few enumerated samples present an analytical challenge because a large proportion of the observations are censored values. One application of particular interest to risk assessors is the fitting of a statistical distribution function to datasets collected at some point in the farm-to-table continuum. The fitted distribution forms an important component of an exposure assessment. A number of studies have compared different fitting methods and proposed lower limits on the proportion of samples where the organisms of interest are identified and enumerated, with the recommended lower limit of enumerated samples being 0.2. This recommendation may not be applicable to food safety risk assessments for a number of reasons, which include the development of new Bayesian fitting methods, the use of highly sensitive screening tests, and the generally larger sample sizes found in surveys of food commodities. This study evaluates the performance of a Markov chain Monte Carlo fitting method when used in conjunction with a screening test and enumeration of positive samples by the Most Probable Number technique. The results suggest that levels of contamination for common product-pathogen pairs, such as Salmonella on poultry carcasses, can be reliably estimated with the proposed fitting method and sample sizes in excess of 500 observations. The results do, however, demonstrate that simple guidelines for this application, such as the proportion of positive samples, cannot be provided. Published by Elsevier B.V.
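
    A minimal sketch of the censored-likelihood idea at the heart of such fitting: a random-walk Metropolis sampler for a lognormal with a detection limit. The flat priors, proposal widths, and synthetic data are assumptions, and the paper's screening-test and MPN error structure is deliberately omitted.

        # Bayesian fit of a lognormal to left-censored data: enumerated values
        # contribute density terms, censored observations contribute CDF mass.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        lod = 1.0                                   # assumed detection limit
        x = rng.lognormal(0.0, 1.0, 600)            # synthetic concentrations
        obs, n_cens = x[x >= lod], int((x < lod).sum())

        def log_post(mu, sigma):
            if sigma <= 0:
                return -np.inf
            ll = stats.norm.logpdf(np.log(obs), mu, sigma).sum()
            ll += n_cens * stats.norm.logcdf((np.log(lod) - mu) / sigma)
            return ll                               # flat priors assumed

        chain, cur, cur_lp = [], (0.5, 0.5), log_post(0.5, 0.5)
        for _ in range(20000):
            prop = (cur[0] + rng.normal(0, 0.05), cur[1] + rng.normal(0, 0.05))
            lp = log_post(*prop)
            if np.log(rng.random()) < lp - cur_lp:
                cur, cur_lp = prop, lp
            chain.append(cur)
        mu_s, sig_s = np.array(chain[5000:]).T      # discard burn-in
        print("posterior means (mu, sigma):", mu_s.mean().round(3), sig_s.mean().round(3))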

  5. Gamma-ray momentum reconstruction from Compton electron trajectories by filtered back-projection

    DOE PAGES

    Haefner, A.; Gunter, D.; Plimley, B.; ...

    2014-11-03

    Gamma-ray imaging utilizing Compton scattering has traditionally relied on measuring coincident gamma-ray interactions to map directional information of the source distribution. This coincidence requirement makes it an inherently inefficient process. We present an approach to gamma-ray reconstruction from Compton scattering that requires only a single electron tracking detector, thus removing the coincidence requirement. From the Compton scattered electron momentum distribution, our algorithm analytically computes the incident photon's correlated direction and energy distributions. Because this method maps the source energy and location, it is useful in applications, where prior information about the source distribution is unknown. We demonstrate this method withmore » electron tracks measured in a scientific Si charge coupled device. While this method was demonstrated with electron tracks in a Si-based detector, it is applicable to any detector that can measure electron direction and energy, or equivalently the electron momentum. For example, it can increase the sensitivity to obtain energy and direction in gas-based systems that suffer from limited efficiency.« less

  6. V/V(max) test applied to SMM gamma-ray bursts

    NASA Technical Reports Server (NTRS)

    Matz, S. M.; Higdon, J. C.; Share, G. H.; Messina, D. C.; Iadicicco, A.

    1992-01-01

    We have applied the V/V(max) test to candidate gamma-ray bursts detected by the Gamma-Ray Spectrometer (GRS) aboard the SMM satellite to examine quantitatively the uniformity of the burst source population. For a sample of 132 candidate bursts identified in the GRS data by an automated search using a single uniform trigger criterion we find average V/V(max) = 0.40 +/- 0.025. This value is significantly different from 0.5, the average for a uniform distribution in space of the parent population of burst sources; however, the shape of the observed distribution of V/V(max) is unusual and our result conflicts with previous measurements. For these reasons we can currently draw no firm conclusion about the distribution of burst sources.
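
    A minimal sketch of the Euclidean V/V(max) statistic on a toy flux-limited sample (the GRS trigger modeling is not reproduced here): for sources uniform in space, V/V(max) = (F/F_lim)^(-3/2) is uniform on [0, 1] with mean 0.5.

        # Toy homogeneous population: standard candles uniform in a unit sphere,
        # so flux = r**-2 and V/Vmax = (flux/f_lim)**-1.5 should be uniform.
        import numpy as np

        rng = np.random.default_rng(11)
        f_lim = 1.0
        r = rng.uniform(0, 1, 200000) ** (1 / 3)   # uniform in volume
        flux = r ** -2.0
        flux = flux[flux >= f_lim][:132]           # a sample of 132 "bursts"

        v_over_vmax = (flux / f_lim) ** -1.5
        err = 1 / np.sqrt(12 * len(v_over_vmax))   # sigma of the mean for U(0,1)
        print(f"<V/Vmax> = {v_over_vmax.mean():.3f} +/- {err:.3f} (expect 0.5)")

    For 132 bursts the expected scatter of the mean is about 0.025, matching the uncertainty quoted above, so a measured value of 0.40 is a roughly 4-sigma departure from spatial homogeneity.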

  7. The structure and content of the galaxy and galactic gamma rays. [conferences

    NASA Technical Reports Server (NTRS)

    Fichtel, C. E.; Stecker, F. W.

    1976-01-01

    Papers are presented dealing with galactic structure drawing on all branches of galactic astronomy with emphasis on the implications of the new gamma ray observations. Topics discussed include: (1) results from the COS-B gamma ray satellite; (2) results from SAS-2 on gamma ray pulsar, Cygnus X-3, and maps of the galactic diffuse flux; (3) recent data from CO surveys of the galaxy; (4) high resolution radio surveys of external galaxies; (5) results on the galactic distribution of pulsars; and (6) theoretical work on galactic gamma ray emission.

  8. Towards a Global Water Scarcity Risk Assessment Framework: Incorporation of Probability Distributions and Hydro-Climatic Variability

    NASA Technical Reports Server (NTRS)

    Veldkamp, T. I. E.; Wada, Y.; Aerts, J. C. J. H.; Ward, P. J.

    2016-01-01

    Changing hydro-climatic and socioeconomic conditions increasingly put pressure on fresh water resources and are expected to aggravate water scarcity conditions in the future. Despite numerous calls for risk-based water scarcity assessments, a global-scale framework that includes UNISDR's definition of risk does not yet exist. This study provides a first step towards such a risk-based assessment, applying a Gamma distribution to estimate water scarcity conditions at the global scale under historic and future conditions, using multiple climate change and population growth scenarios. Our study highlights that water scarcity risk, expressed in terms of expected annual exposed population, increases under all future scenarios, up to greater than 56.2% of the global population in 2080. Looking at the drivers of risk, we find that population growth outweighs the impacts of climate change at global and regional scales. Using a risk-based method to assess water scarcity, we show the results to be less sensitive than traditional water scarcity assessments to the use of fixed thresholds to represent different levels of water scarcity. This becomes especially important when moving from global to local scales, whereby deviations increase up to 50% of estimated risk levels.
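
    A minimal sketch (with illustrative numbers only) of the risk-style calculation implied above: fit a Gamma distribution to annual per-capita water availability and convert a scarcity threshold into an annual probability, rather than a fixed yes/no classification.

        # Fit a Gamma distribution to simulated annual availability and compute
        # an annual scarcity probability and expected exposed population.
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(10)
        avail = rng.gamma(4.0, 450.0, size=60)     # 60 years, m^3/capita/yr (toy)

        shape, loc, scale = stats.gamma.fit(avail, floc=0.0)
        p_scarce = stats.gamma.cdf(1000.0, shape, loc=loc, scale=scale)
        pop = 2.5e6                                # hypothetical population
        print(f"annual scarcity probability: {p_scarce:.3f}")
        print(f"expected annual exposed population: {p_scarce * pop:,.0f}")

    The 1000 m^3/capita/yr threshold is the familiar Falkenmark-style level, used here only to make the exceedance calculation concrete.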

  9. A bioinformatic survey of distribution, conservation, and probable functions of LuxR solo regulators in bacteria.

    PubMed

    Subramoni, Sujatha; Florez Salcedo, Diana Vanessa; Suarez-Moreno, Zulma R

    2015-01-01

    LuxR solo transcriptional regulators contain both an autoinducer binding domain (ABD; N-terminal) and a DNA binding Helix-Turn-Helix domain (HTH; C-terminal), but are not associated with a cognate N-acyl homoserine lactone (AHL) synthase coding gene in the same genome. Although a few LuxR solos have been characterized, their distributions as well as their role in bacterial signal perception and other processes are poorly understood. In this study we have carried out a systematic survey of distribution of all ABD containing LuxR transcriptional regulators (QS domain LuxRs) available in the InterPro database (IPR005143), and identified those lacking a cognate AHL synthase. These LuxR solos were then analyzed regarding their taxonomical distribution, predicted functions of neighboring genes and the presence of complete AHL-QS systems in the genomes that carry them. Our analyses reveal the presence of one or multiple predicted LuxR solos in many proteobacterial genomes carrying QS domain LuxRs, some of them harboring genes for one or more AHL-QS circuits. The presence of LuxR solos in bacteria occupying diverse environments suggests potential ecological functions for these proteins beyond AHL and interkingdom signaling. Based on gene context and the conservation levels of invariant amino acids of ABD, we have classified LuxR solos into functionally meaningful groups or putative orthologs. Surprisingly, putative LuxR solos were also found in a few non-proteobacterial genomes which are not known to carry AHL-QS systems. Multiple predicted LuxR solos in the same genome appeared to have different levels of conservation of invariant amino acid residues of ABD questioning their binding to AHLs. In summary, this study provides a detailed overview of distribution of LuxR solos and their probable roles in bacteria with genome sequence information.

  11. Survival Bayesian Estimation of Exponential-Gamma Under Linex Loss Function

    NASA Astrophysics Data System (ADS)

    Rizki, S. W.; Mara, M. N.; Sulistianingsih, E.

    2017-06-01

    This paper presents a study of censored survival data from cancer patients after treatment, using Bayesian estimation under the LINEX loss function for a survival model assumed to follow an exponential distribution. Using a Gamma prior, the likelihood function yields a Gamma posterior distribution. The posterior distribution is used to find the Bayes estimator λ_BL of the exponential rate via the LINEX approximation; from λ_BL, the estimators h_BL of the hazard function and S_BL of the survival function follow. Finally, we compare Maximum Likelihood Estimation (MLE) with the LINEX approximation, taking the smaller MSE as the criterion for the better method. The MSEs of the hazard and survival estimates are 2.91728E-07 and 0.000309004 under MLE, versus 2.8727E-07 and 0.000304131 under the Bayesian LINEX approach, respectively. We conclude that the Bayesian LINEX estimator is better than MLE.
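
    The conjugate structure in this record admits a closed-form estimator, sketched below under standard assumptions (Gamma prior on the exponential rate; the prior values, censoring scheme, and data are illustrative and not the paper's).

        # Bayes estimator of an exponential rate under LINEX loss. With a
        # Gamma(a0, b0) prior, d observed events, and total time-on-test T, the
        # posterior is Gamma(a0 + d, b0 + T) and the LINEX estimator with shape
        # parameter c is lambda_BL = ((a0 + d) / c) * ln(1 + c / (b0 + T)).
        import numpy as np

        rng = np.random.default_rng(5)
        a0, b0, c = 2.0, 1.0, 0.5                  # assumed prior and LINEX shape
        t = rng.exponential(scale=2.0, size=40)    # synthetic data, true rate 0.5
        censor = 3.0
        d = int((t < censor).sum())                # observed events
        T = float(np.minimum(t, censor).sum())     # total time on test

        lam_mle = d / T
        lam_bl = ((a0 + d) / c) * np.log(1.0 + c / (b0 + T))
        print(f"MLE: {lam_mle:.4f}   Bayes-LINEX: {lam_bl:.4f}")

    The LINEX estimator equals -(1/c) ln E[exp(-c λ)] under the posterior, which for a Gamma posterior reduces to the closed form in the comment; hazard and survival estimates follow by plugging the rate estimate into h(t) and S(t) for the exponential model.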

  12. Gamma-ray detectors for breast imaging

    NASA Astrophysics Data System (ADS)

    Williams, Mark B.; Goode, Allen R.; Majewski, Stan; Steinbach, Daniela; Weisenberger, Andrew G.; Wojcik, Randolph F.; Farzanpay, Farzin

    1997-07-01

    Breast cancer is the most common cancer of American women and is the leading cause of cancer-related death among women aged 15-54. Recent years have shown that early detection using x-ray mammography can lead to a high probability of cure; however, because of mammography's low positive predictive value, surgical or core biopsy is typically required for diagnosis. In addition, the low radiographic contrast of many nonpalpable breast masses, particularly among women with radiographically dense breasts, results in an overall rate of 10% to 25% for missed tumors. Nuclear imaging of the breast using single-gamma emitters (scintimammography) such as (superscript 99m)Tc, or positron emitters such as F-18-fluorodeoxyglucose (FDG) for positron emission tomography (PET), can provide information on functional or metabolic tumor activity that is complementary to the structural information of x-ray mammography, thereby potentially reducing the number of unnecessary biopsies and missed cancers. This paper summarizes recent data on the efficacy of scintimammography using conventional gamma cameras, and describes the development of dedicated detectors for gamma-emission breast imaging. The detectors use new, high-density crystal scintillators and large-area position-sensitive photomultiplier tubes (PSPMTs). Detector design, imaging requirements, and preliminary measured imaging performance are discussed.

  13. Probability shapes perceptual precision: A study in orientation estimation.

    PubMed

    Jabar, Syaheed B; Anderson, Britt

    2015-12-01

    Probability is known to affect perceptual estimations, but an understanding of mechanisms is lacking. Moving beyond binary classification tasks, we had naive participants report the orientation of briefly viewed gratings for which we systematically manipulated contingent probability. Participants rapidly developed faster and more precise estimations for high-probability tilts. The shapes of their error distributions, as indexed by a kurtosis measure, also showed a distortion from Gaussian. This kurtosis metric was robust, capturing probability effects that were graded, contextual, and varying as a function of stimulus orientation. Our data can be understood as a probability-induced reduction in the variability or "shape" of estimation errors, as would be expected if probability affects the perceptual representations. As probability manipulations are an implicit component of many endogenous cuing paradigms, changes at the perceptual level could account for changes in performance that might traditionally have been ascribed to "attention." (c) 2015 APA, all rights reserved.

  14. Testing the Gamma-Ray Burst Energy Relationships

    NASA Technical Reports Server (NTRS)

    Band, David L.; Preece, Robert D.

    2005-01-01

    Building on Nakar & Piran's analysis of the Amati relation relating gamma-ray burst peak energies E_p and isotropic energies E_iso, we test the consistency of a large sample of BATSE bursts with the Amati and Ghirlanda (which relates peak energies and actual gamma-ray energies E_gamma) relations. Each of these relations can be expressed as a ratio of the different energies that is a function of redshift (for both the Amati and Ghirlanda relations) and beaming fraction f_B (for the Ghirlanda relation). The most rigorous test, which allows bursts to be at any redshift, corroborates Nakar & Piran's result: 88% of the BATSE bursts are inconsistent with the Amati relation, while only 1.6% of the bursts are inconsistent with the Ghirlanda relation if f_B = 1. Modelling the redshift distribution results in an energy ratio distribution for the Amati relation that is shifted by an order of magnitude relative to the observed distributions; any sub-population satisfying the Amati relation can comprise at most approx. 18% of our burst sample. A similar analysis of the Ghirlanda relation depends sensitively on the beaming fraction distribution for small values of f_B; for reasonable estimates of this distribution about a third of the burst sample is inconsistent with the Ghirlanda relation. Our results indicate that these relations are an artifact of the selection effects of the burst sample in which they were found; these selection effects may favor sub-populations for which these relations are valid.

  15. Size-biased distributions in the generalized beta distribution family, with applications to forestry

    Treesearch

    Mark J. Ducey; Jeffrey H. Gove

    2015-01-01

    Size-biased distributions arise in many forestry applications, as well as other environmental, econometric, and biomedical sampling problems. We examine the size-biased versions of the generalized beta of the first kind, generalized beta of the second kind and generalized gamma distributions. These distributions include, as special cases, the Dagum (Burr Type III),...

  16. Stochastic analysis of particle movement over a dune bed

    USGS Publications Warehouse

    Lee, Baum K.; Jobson, Harvey E.

    1977-01-01

    Stochastic models are available that can be used to predict the transport and dispersion of bed-material sediment particles in an alluvial channel. These models are based on the proposition that the movement of a single bed-material sediment particle consists of a series of steps of random length separated by rest periods of random duration and, therefore, application of the models requires a knowledge of the probability distributions of the step lengths, the rest periods, the elevation of particle deposition, and the elevation of particle erosion. The procedure was tested by determining distributions from bed profiles formed in a large laboratory flume with a coarse sand as the bed material. The elevation of particle deposition and the elevation of particle erosion can be considered to be identically distributed, and their distribution can be described by either a 'truncated Gaussian' or a 'triangular' density function. The conditional probability distribution of the rest period given the elevation of particle deposition closely followed the two-parameter gamma distribution. The conditional probability distribution of the step length given the elevation of particle erosion and the elevation of particle deposition also closely followed the two-parameter gamma density function. For a given flow, the scale and shape parameters describing the gamma probability distributions can be expressed as functions of bed elevation. (Woodard-USGS)
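
    A minimal sketch of the stochastic transport model described above, with a particle alternating gamma-distributed rest periods and gamma-distributed step lengths; the parameter values are assumed for illustration and are not the flume-derived estimates.

        # Simulate alternating rests and steps and record the distance a
        # particle travels within a fixed observation time.
        import numpy as np

        rng = np.random.default_rng(2)
        k_step, th_step = 2.0, 0.15    # gamma shape/scale for step length (m)
        k_rest, th_rest = 1.5, 40.0    # gamma shape/scale for rest period (s)
        t_total = 3600.0               # observe for one hour

        def travel_distance():
            t, x = 0.0, 0.0
            while True:
                t += rng.gamma(k_rest, th_rest)    # wait on the bed
                if t >= t_total:
                    return x
                x += rng.gamma(k_step, th_step)    # take a step

        dist = np.array([travel_distance() for _ in range(5000)])
        print(f"mean distance in 1 h: {dist.mean():.2f} m (std {dist.std():.2f} m)")

    Making the gamma parameters functions of bed elevation, as the abstract describes, would turn this into the full elevation-dependent model.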

  17. Prompt-gamma monitoring in hadrontherapy: A review

    NASA Astrophysics Data System (ADS)

    Krimmer, J.; Dauvergne, D.; Létang, J. M.; Testa, É.

    2018-01-01

    Secondary radiation emission induced by nuclear reactions is correlated with the path of ions in matter. Therefore, such penetrating radiation can be used for in vivo control of hadrontherapy treatments, for which the primary beam is absorbed inside the patient. Among secondary radiations, prompt-gamma rays were proposed for real-time verification of ion range. Such verification is a desired condition for reducing uncertainties in treatment planning. For more than a decade, efforts have been undertaken worldwide to promote prompt-gamma-based devices for use in clinical conditions. Dedicated cameras are necessary to overcome the challenges of a broad and high-energy distribution, a large background, high instantaneous count rates, and compatibility constraints with patient irradiation. Several types of prompt-gamma imaging devices have been proposed, either physically collimated or electronically collimated (Compton cameras). Clinical tests are now under way. Meanwhile, methods other than direct prompt-gamma imaging have been proposed, based on specific counting using either time-of-flight or photon-energy measurements. In the present article, we review and discuss the state of the art for all techniques using prompt-gamma detection to improve quality assurance in hadrontherapy.

  18. Some New Approaches to Multivariate Probability Distributions.

    DTIC Science & Technology

    1986-12-01


  19. Bivariate extreme value distributions

    NASA Technical Reports Server (NTRS)

    Elshamy, M.

    1992-01-01

    In certain engineering applications, such as those occurring in the analyses of ascent structural loads for the Space Transportation System (STS), some of the load variables have a lower bound of zero. Thus, the need for practical models of bivariate extreme value probability distribution functions with lower limits was identified. We discuss the Gumbel models and present practical forms of bivariate extreme probability distributions of Weibull and Fréchet types with two parameters. Bivariate extreme value probability distribution functions can be expressed in terms of the marginal extreme value distributions and a 'dependence' function subject to certain analytical conditions. Properties of such bivariate extreme distributions, sums and differences of paired extremals, as well as the corresponding forms of conditional distributions, are discussed. Practical estimation techniques are also given.

  20. Conditional Probabilities and Collapse in Quantum Measurements

    NASA Astrophysics Data System (ADS)

    Laura, Roberto; Vanni, Leonardo

    2008-09-01

    We show that, by including both the system and the apparatus in the quantum description of the measurement process and using the concept of conditional probabilities, it is possible to deduce the statistical operator of the system after a measurement with a given result, which gives the probability distribution for all possible consecutive measurements on the system. This statistical operator, representing the state of the system after the first measurement, is in general not the same as the one that would be obtained using the postulate of collapse.

  1. A hydroclimatological approach to predicting regional landslide probability using Landlab

    NASA Astrophysics Data System (ADS)

    Strauch, Ronda; Istanbulluoglu, Erkan; Nudurupati, Sai Siddhartha; Bandaragoda, Christina; Gasparini, Nicole M.; Tucker, Gregory E.

    2018-02-01

    We develop a hydroclimatological approach to the modeling of regional shallow landslide initiation that integrates spatial and temporal dimensions of parameter uncertainty to estimate an annual probability of landslide initiation based on Monte Carlo simulations. The physically based model couples the infinite-slope stability model with a steady-state subsurface flow representation and operates on a digital elevation model. Spatially distributed gridded data for soil properties and vegetation classification are used for parameter estimation of probability distributions that characterize model input uncertainty. Hydrologic forcing to the model is through annual maximum daily recharge to subsurface flow obtained from a macroscale hydrologic model. We demonstrate the model in a steep mountainous region in northern Washington, USA, over 2700 km2. The influence of soil depth on the probability of landslide initiation is investigated through comparisons among model output produced using three different soil depth scenarios reflecting the uncertainty of soil depth and its potential long-term variability. We found elevation-dependent patterns in probability of landslide initiation that showed the stabilizing effects of forests at low elevations, an increased landslide probability with forest decline at mid-elevations (1400 to 2400 m), and soil limitation and steep topographic controls at high alpine elevations and in post-glacial landscapes. These dominant controls manifest themselves in a bimodal distribution of spatial annual landslide probability. Model testing with limited observations revealed similarly moderate model confidence for the three hazard maps, suggesting suitable use as relative hazard products. The model is available as a component in Landlab, an open-source, Python-based landscape earth systems modeling environment, and is designed to be easily reproduced utilizing HydroShare cyberinfrastructure.
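
    A minimal sketch of the Monte Carlo step such a model performs at each grid cell (illustrative parameter ranges, not the Landlab component itself): sample uncertain soil and hydrologic inputs, evaluate the infinite-slope factor of safety, and report the fraction of realizations with FS < 1.

        # Infinite-slope stability with sampled parameters; the failure
        # probability is the fraction of simulations with factor of safety < 1.
        import numpy as np

        rng = np.random.default_rng(12)
        n = 100000
        theta = np.radians(35.0)                          # assumed slope angle
        phi = np.radians(rng.uniform(30.0, 40.0, n))      # friction angle
        c = rng.uniform(2.0, 8.0, n) * 1e3                # cohesion, Pa
        hs = rng.triangular(0.5, 1.0, 2.0, n)             # soil depth, m
        w = np.clip(rng.lognormal(-0.5, 0.6, n), 0, 1)    # relative wetness
        rho_s, rho_w, g = 1800.0, 1000.0, 9.81

        fs = (c / (rho_s * g * hs * np.sin(theta) * np.cos(theta))
              + (1 - w * rho_w / rho_s) * np.tan(phi) / np.tan(theta))
        print("P(FS < 1) =", round(np.mean(fs < 1.0), 3))

    Repeating this cell by cell, with wetness driven by the recharge forcing from a hydrologic model, yields the annual landslide probability maps described above.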

  2. Probability Distribution of Dose and Dose-Rate Effectiveness Factor for use in Estimating Risks of Solid Cancers From Exposure to Low-Let Radiation.

    PubMed

    Kocher, David C; Apostoaei, A Iulian; Hoffman, F Owen; Trabalka, John R

    2018-06-01

    This paper presents an analysis to develop a subjective state-of-knowledge probability distribution of a dose and dose-rate effectiveness factor (DDREF) for use in estimating risks of solid cancers from exposure to low linear energy transfer radiation (photons or electrons) whenever linear dose responses from acute and chronic exposure are assumed. A dose and dose-rate effectiveness factor represents an assumption that the risk of a solid cancer per Gy at low acute doses or low dose rates of low linear energy transfer radiation, RL, differs from the risk per Gy at higher acute doses, RH; RL is estimated as RH divided by a dose and dose-rate effectiveness factor, where RH is estimated from analyses of dose responses in Japanese atomic-bomb survivors. A probability distribution to represent uncertainty in a dose and dose-rate effectiveness factor for solid cancers was developed from analyses of epidemiologic data on risks of incidence or mortality from all solid cancers as a group or all cancers excluding leukemias, including (1) analyses of possible nonlinearities in dose responses in atomic-bomb survivors, which give estimates of a low-dose effectiveness factor, and (2) comparisons of risks in radiation workers or members of the public from chronic exposure to low linear energy transfer radiation at low dose rates with risks in atomic-bomb survivors, which give estimates of a dose-rate effectiveness factor. Probability distributions of uncertain low-dose effectiveness factors and dose-rate effectiveness factors for solid cancer incidence and mortality were combined using assumptions about the relative weight that should be assigned to each estimate to represent its relevance to estimation of a dose and dose-rate effectiveness factor. The probability distribution of a dose and dose-rate effectiveness factor for solid cancers developed in this study has a median (50th percentile) and 90% subjective confidence interval of 1.3 (0.47, 3.6). The harmonic mean is 1.1, which
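
    A minimal sketch of how such a distribution can be encoded and propagated; the lognormal form is our assumption for illustration (the paper's distribution is assembled from weighted epidemiologic comparisons), chosen only because it reproduces the quoted median and 90% interval well.

        # Encode a DDREF with median 1.3 and 90% interval (0.47, 3.6) as a
        # lognormal, then propagate it into a low-dose risk R_L = R_H / DDREF.
        import numpy as np
        from scipy import stats

        mu = np.log(1.3)                                   # lognormal median 1.3
        sigma = np.log(3.6 / 1.3) / stats.norm.ppf(0.95)   # match the 95th pct
        ddref = stats.lognorm(s=sigma, scale=np.exp(mu))
        print("5th/50th/95th percentiles:",
              [round(ddref.ppf(q), 2) for q in (0.05, 0.5, 0.95)])

        rng = np.random.default_rng(9)
        r_high = 0.10                      # hypothetical high-dose risk per Gy
        r_low = r_high / ddref.rvs(100000, random_state=rng)
        print("median low-dose risk:", round(np.median(r_low), 4))
        print("90% interval:", np.round(np.percentile(r_low, [5, 95]), 4))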

  3. Radiation measurement above the lunar surface by Kaguya gamma-ray spectrometer

    NASA Astrophysics Data System (ADS)

    Hasebe, Nobuyuki; Nagaoka, Hiroshi; Kusano, Hiroki; Hareyama, Matoko; Ideguchi, Yusuke; Shimizu, Sota; Shibamura, Eido

    The lunar surface is filled with various ionizing radiations such as high-energy galactic particles, albedo particles, and secondary radiations of neutrons, gamma rays, and other elementary particles. The high-resolution Kaguya Gamma-Ray Spectrometer (KGRS) was carried on Japan's lunar explorer SELENE (Kaguya), the largest lunar orbiter since the Apollo missions. The KGRS instrument employed, for the first time in lunar exploration, a high-purity Ge crystal to increase the capability of identifying elemental gamma-ray lines. The Ge detector is surrounded by BGO and plastic counters serving as anticoincidence shields. The KGRS measured gamma rays in the energy range from 200 keV to 13 MeV with high precision to determine the chemical composition of the lunar surface, providing data on the abundance of major elements over the entire lunar surface. In addition to the gamma-ray observation, the KGRS successfully measured the global distribution of fast neutrons. The energy spectra observed by the KGRS include several saw-tooth peaks of Ge, formed by collisions of lunar fast neutrons with Ge atoms in the crystal. From an analysis of these saw-tooth peaks, a global distribution of neutrons emitted from the lunar surface was successfully created and compared with the neutron maps previously obtained by Lunar Prospector. Another anticoincidence counter, a plastic counter 5 mm thick, was used to veto radiation events mostly generated by charged particles; a single photomultiplier counts the scintillation light from this counter. A global map of the counting rates observed by the plastic counter was also created, and the counting rate reflects the geological distribution even though the plastic counter mostly measures high-energy charged particles and energetic neutrons. These results are presented and discussed.

  4. Visualization of the operational space of edge-localized modes through low-dimensional embedding of probability distributions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shabbir, A., E-mail: aqsa.shabbir@ugent.be; Noterdaeme, J. M.; Max-Planck-Institut für Plasmaphysik, Garching D-85748

    2014-11-15

    Information visualization aimed at facilitating human perception is an important tool for the interpretation of experiments on the basis of complex multidimensional data characterizing the operational space of fusion devices. This work describes a method for visualizing the operational space on a two-dimensional map and applies it to the discrimination of type I and type III edge-localized modes (ELMs) from a series of carbon-wall ELMy discharges at JET. The approach accounts for stochastic uncertainties that play an important role in fusion data sets, by modeling measurements with probability distributions in a metric space. The method is aimed at contributing to physical understanding of ELMs as well as their control. Furthermore, it is a general method that can be applied to the modeling of various other plasma phenomena as well.

  5. Evaluation of the Combined Effects of Gamma Radiation and High Dietary Iron on Peripheral Leukocyte Distribution and Function

    NASA Technical Reports Server (NTRS)

    Crucian, Brian E.; Morgan, Jennifer L. L.; Quiriarte, Heather A.; Sams, Clarence F.; Smith, Scott M.; Zwart, Sara R.

    2011-01-01

    NASA is concerned with the health risks to astronauts, particularly those risks related to radiation exposure. Both radiation and increased iron stores can independently increase oxidative damage, resulting in protein, lipid and DNA oxidation. Oxidative stress increases the risk of many health problems including cancer, cataracts, and heart disease. This study, a subset of a larger interdisciplinary investigation of the combined effect of iron overload on sensitivity to radiation injury, monitored immune parameters in the peripheral blood of rats subjected to gamma radiation, high dietary iron or both. Specific immune measures consisted of (A) peripheral leukocyte distribution; (B) plasma cytokine levels; (C) cytokine production profiles following whole blood stimulation of either T cells or monocytes.

  6. Development of a Gamma-Ray Spectrometer for Korean Pathfinder Lunar Orbiter

    NASA Astrophysics Data System (ADS)

    Kim, Kyeong Ja; Park, Junghun; Choi, Yire; Lee, Sungsoon; Yeon, Youngkwang; Yi, Eung Seok; Jeong, Meeyoung; Sun, Changwan; van Gasselt, Stephan; Lee, K. B.; Kim, Yongkwon; Min, Kyungwook; Kang, Kyungin; Cho, Jinyeon; Park, Kookjin; Hasebe, Nobuyuki; Elphic, Richard; Englert, Peter; Gasnault, Olivier; Lim, Lucy; Shibamura, Eido; GRS Team

    2016-10-01

    Korea is preparing a lunar orbiter mission (KPLO) to be developed no later than 2018. Onboard the spacecraft is a gamma-ray spectrometer (KLGRS) that will collect low-energy gamma-ray signals in order to detect elements by either X-ray fluorescence or natural radioactive decay, in the low- as well as higher-energy regions of up to 10 MeV. Scientific objectives include lunar resources (water and volatile measurements, rare earth elements and precious metals, energy resources, and major elemental distributions for prospective in-situ utilization), investigation of lunar geology, and studies of the lunar environment (mapping of the global radiation environment from keV to 10 MeV, and the high-energy cosmic-ray flux using the plastic scintillator). The Gamma-Ray Spectrometer (GRS) system is a compact, low-weight instrument for the chemical analysis of lunar surface materials within a gamma-ray energy range from tens of keV to 10 MeV. The main LaBr3 detector is surrounded by an anti-coincidence counting module of BGO/PS scintillators to reduce both the low-energy gamma-ray background from the spacecraft and housing materials and the high-energy gamma-ray background from cosmic rays. The GRS system will determine the elemental compositions of the near surface of the Moon. The GRS system is a recently developed scintillation-based gamma-ray detector that can be used as a replacement for an HPGe GRS sensor, with the advantage of being able to operate over a wide range of temperatures with remarkable energy resolution. LaBr3 also has a high photoelectron yield, fast scintillation response, good linearity, and thermal stability. With these major advantages, the LaBr3 GRS system will allow us to investigate scientific objectives and assess important research questions on lunar geology and resource exploration. The GRS investigation will help to assess open questions related to the spatial distribution and origin of the elements on the lunar surface and will contribute to unravel geological surface

  7. Derived distribution of floods based on the concept of partial area coverage with a climatic appeal

    NASA Astrophysics Data System (ADS)

    Iacobellis, Vito; Fiorentino, Mauro

    2000-02-01

    A new rationale for deriving the probability distribution of floods, and for helping to understand the physical processes underlying the distribution itself, is presented. On this basis a model that introduces a number of new assumptions is developed. The basic ideas are as follows: (1) The peak direct streamflow Q can always be expressed as the product of two random variates, namely, the average runoff per unit area u_a and the peak contributing area a; (2) the distribution of u_a conditional on a can be related to that of the rainfall depth occurring in a duration equal to a characteristic response time τ_a of the contributing part of the basin; and (3) τ_a is assumed to vary with a according to a power law. Consequently, the probability density function of Q can be found as the integral, over the total basin area A, of the density function of a times the density function of u_a given a. It is suggested that u_a can be expressed as a fraction of the excess rainfall and that the annual flood distribution can be related to that of Q by the hypothesis that the flood occurrence process is Poissonian. In the proposed model it is assumed, as an exploratory attempt, that a and u_a are gamma and Weibull distributed, respectively. The model was applied to the annual flood series of eight gauged basins in Basilicata (southern Italy) with catchment areas ranging from 40 to 1600 km2. The results showed strong physical consistency, as the parameters tended to assume values in good agreement with well-consolidated geomorphologic knowledge and suggested a new key to understanding the climatic control of the probability distribution of floods.
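
    A minimal sketch of the derived-distribution idea with the stated functional forms (gamma-distributed contributing area, Weibull-distributed unit runoff); all parameter values, the truncation at basin area, and the power-law decay of the runoff scale are assumptions made for illustration.

        # Monte Carlo version of Q = u_a * a: sample a from a gamma truncated
        # at the basin area A, then u_a from a Weibull whose scale decays with
        # a (echoing tau_a varying with a as a power law).
        import numpy as np

        rng = np.random.default_rng(4)
        A = 500.0                      # basin area, km^2 (assumed)
        k_a, th_a = 2.0, 120.0         # gamma shape/scale for contributing area
        c_w, beta = 1.2, -0.3          # Weibull shape; scale of u_a ~ a**beta

        n = 200000
        a = rng.gamma(k_a, th_a, size=3 * n)
        a = a[a <= A][:n]                          # truncate at the basin area
        scale_u = 5.0 * (a / A) ** beta            # runoff scale, mm/h
        u_a = scale_u * rng.weibull(c_w, size=len(a))
        q = u_a * a / 3.6                          # mm/h * km^2 -> m^3/s

        print("median peak flow:", round(np.median(q), 1), "m^3/s")
        print("0.99 quantile:", round(np.quantile(q, 0.99), 1), "m^3/s")

    Passing these event peaks through a Poissonian occurrence model, as the abstract suggests, would then give the annual flood distribution.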

  8. Performance evaluation of receive-diversity free-space optical communications over correlated Gamma-Gamma fading channels.

    PubMed

    Yang, Guowei; Khalighi, Mohammad-Ali; Ghassemlooy, Zabih; Bourennane, Salah

    2013-08-20

    The efficacy of spatial diversity in practical free-space optical communication systems is impaired by fading correlation among the underlying subchannels. In this paper we consider the generation of correlated Gamma-Gamma random variables with a view to evaluating the system outage probability and bit error rate under correlated fading. Considering the case of receive-diversity systems with intensity modulation and direct detection, we propose a set of criteria for setting the correlation coefficients on the small- and large-scale fading components based on scintillation theory. We verify these criteria using wave-optics simulations and further show through Monte Carlo simulations that we can effectively neglect the correlation corresponding to the small-scale turbulence in most practical systems, irrespective of the specific turbulence conditions. This has not been clarified before, to the best of our knowledge. We then present some numerical results to illustrate the effect of fading correlation on the system performance. Our conclusions can be generalized to the cases of multiple-beam and multiple-beam multiple-aperture systems.
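
    A minimal sketch of one way to generate such variates: a Gaussian copula correlating the large-scale gamma components across two apertures, with independent small-scale components. The copula construction and all parameter values are assumptions for illustration, not necessarily the paper's generation method.

        # Two receive apertures with correlated large-scale fading and
        # independent small-scale fading: I_i = X_i * Y_i, where
        # X ~ Gamma(alpha, 1/alpha) and Y ~ Gamma(beta, 1/beta).
        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(8)
        alpha, beta = 4.0, 1.9        # large-/small-scale parameters (assumed)
        rho = 0.6                     # copula correlation, large-scale part
        n = 200000

        z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)
        u = stats.norm.cdf(z)                              # correlated uniforms
        x = stats.gamma.ppf(u, a=alpha, scale=1 / alpha)   # correlated large scale
        y = rng.gamma(beta, 1 / beta, size=(n, 2))         # independent small scale
        i1, i2 = (x * y).T                                 # Gamma-Gamma irradiances

        print("corr(I1, I2) =", round(np.corrcoef(i1, i2)[0, 1], 3))
        thr = 0.1                                          # deep-fade threshold
        print("P(both in a deep fade):", round(np.mean((i1 < thr) & (i2 < thr)), 5))

    Comparing the joint fade probability against the product of the marginals shows directly how correlation erodes the diversity gain.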

  9. Diffuse gamma radiation

    NASA Technical Reports Server (NTRS)

    Fichtel, C. E.; Simpson, G. A.; Thompson, D. J.

    1977-01-01

    An examination of the intensity, energy spectrum, and spatial distribution of the diffuse gamma radiation observed by the SAS-2 satellite away from the galactic plane, in the energy range above 35 MeV, has shown that it consists of two components. One component is correlated with galactic latitude, with the atomic hydrogen column density deduced from 21 cm measurements, and with the continuum radio emission believed to be synchrotron emission. It has an energy spectrum similar to that in the plane and joins smoothly to the intense radiation from the plane; it is therefore presumed to be of galactic origin. The other component is apparently isotropic, at least on a coarse scale, and has a steep energy spectrum. No evidence is found for a cosmic-ray halo surrounding the galaxy in the shape of a sphere or oblate spheroid with galactic dimensions. Constraints on a halo model with significantly larger dimensions are set on the basis of an upper limit to the gamma-ray anisotropy.

  10. A Case Series of the Probability Density and Cumulative Distribution of Laryngeal Disease in a Tertiary Care Voice Center.

    PubMed

    de la Fuente, Jaime; Garrett, C Gaelyn; Ossoff, Robert; Vinson, Kim; Francis, David O; Gelbard, Alexander

    2017-11-01

    To examine the distribution of clinic and operative pathology in a tertiary care laryngology practice. Probability density and cumulative distribution analyses (Pareto analysis) were used to rank-order the laryngeal conditions seen in an outpatient tertiary care laryngology practice and those requiring surgical intervention during a 3-year period. Among 3783 new clinic consultations and 1380 operative procedures, voice disorders were the most common primary diagnostic category seen in clinic (n = 3223), followed by airway (n = 374) and swallowing (n = 186) disorders. Within the voice stratum, the most common primary ICD-9 code was dysphonia (41%), followed by unilateral vocal fold paralysis (UVFP) (9%) and cough (7%). Among new voice patients, 45% were found to have a structural abnormality. The most common surgical indications were laryngotracheal stenosis (37%), followed by recurrent respiratory papillomatosis (18%) and UVFP (17%). Nearly 55% of patients presenting to a tertiary referral laryngology practice did not have an identifiable structural abnormality of the larynx on direct or indirect examination. The distribution of ICD-9 codes requiring surgical intervention was disparate from that seen in clinic. Application of the Pareto principle may improve resource allocation in laryngology, but these initial results require confirmation across multiple institutions.
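
    The Pareto tabulation itself is just a rank-ordered cumulative sum. A minimal sketch, with counts approximately reconstructed from the percentages quoted above (illustrative, not the study's raw data):

```python
from collections import Counter

# Approximate counts implied by the abstract: 3223 voice consultations
# (41% dysphonia, 9% UVFP, 7% cough), 374 airway, 186 swallowing.
counts = Counter({"dysphonia": 1321, "other voice": 1386, "airway": 374,
                  "UVFP": 290, "cough": 226, "swallowing": 186})

total = sum(counts.values())
cumulative = 0.0
for diagnosis, n in counts.most_common():   # rank order, largest first
    cumulative += n / total
    print(f"{diagnosis:12s} {n:5d} {n / total:7.1%}  cumulative {cumulative:7.1%}")
```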

  11. Simultaneous Planck, Swift, and Fermi Observations of X-ray and Gamma-ray Selected Blazars

    NASA Technical Reports Server (NTRS)

    Giommi, P.; Polenta, G.; Laehteenmaeki, A.; Thompson, D. J.; Capalbi, M.; Cutini, S.; Gasparrini, D.; Gonzalez-Nuevo, J.; Leon-Tavares, J.; Lopez-Caniego, M.; et al.

    2012-01-01

    We present simultaneous Planck, Swift, Fermi, and ground-based data for 105 blazars belonging to three samples with flux limits in the soft X-ray, hard X-ray, and gamma-ray bands, with an additional 5 GHz flux-density limit to ensure a good probability of a Planck detection. We compare our results to those of a companion paper presenting simultaneous Planck and multi-frequency observations of 104 radio-loud northern active galactic nuclei selected at radio frequencies. While we confirm several previous results, our unique data set allows us to demonstrate that the selection method strongly influences the results, producing biases that cannot be ignored. Almost all the BL Lac objects have been detected by the Fermi Large Area Telescope (LAT), whereas 30% to 40% of the flat-spectrum radio quasars (FSRQs) in the radio, soft X-ray, and hard X-ray selected samples are still below the gamma-ray detection limit even after integrating 27 months of Fermi-LAT data. The radio to sub-millimetre spectral slope of blazars is quite flat, with α ≈ 0 up to about 70 GHz, above which it steepens to α ≈ -0.65. The BL Lacs have significantly flatter spectra than FSRQs at higher frequencies. The distribution of the rest-frame synchrotron peak frequency ν^S_peak in the spectral energy distribution (SED) of FSRQs is the same in all the blazar samples, with ν^S_peak = 10^(13.1 +/- 0.1) Hz, while the mean inverse-Compton peak frequency ν^IC_peak ranges from 10^21 to 10^22 Hz. The distributions of ν^S_peak and ν^IC_peak of BL Lacs are much broader and are shifted to higher energies than those of FSRQs; their shapes strongly depend on the selection method. The Compton dominance of blazars, defined as the ratio of the inverse-Compton to synchrotron peak luminosities, ranges from less than 0.2 to nearly 100, with only FSRQs reaching values larger than about 3. Its distribution is broad and depends

  12. Large Deviations: Advanced Probability for Undergrads

    ERIC Educational Resources Information Center

    Rolls, David A.

    2007-01-01

    In the branch of probability called "large deviations," rates of convergence (e.g. of the sample mean) are considered. The theory makes use of the moment generating function. So, particularly for sums of independent and identically distributed random variables, the theory can be made accessible to senior undergraduates after a first course in…
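
    The machinery the abstract alludes to is Cramér's theorem; in its standard textbook form (not specific to this article), for i.i.d. random variables X_1, X_2, ... with sample mean X̄_n:

```latex
\Lambda(\theta) = \log \mathbb{E}\, e^{\theta X_1}, \qquad
I(x) = \sup_{\theta \in \mathbb{R}} \bigl( \theta x - \Lambda(\theta) \bigr), \qquad
\lim_{n \to \infty} \frac{1}{n} \log \Pr\bigl( \bar X_n \ge x \bigr) = -I(x)
\quad \text{for } x > \mathbb{E} X_1 .
```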

  13. A statistical approach to estimate the 3D size distribution of spheres from 2D size distributions

    USGS Publications Warehouse

    Kong, M.; Bhattacharya, R.N.; James, C.; Basu, A.

    2005-01-01

    The size distribution of rigidly embedded spheres in a groundmass is usually determined from measurements of the radii of the two-dimensional (2D) circular cross sections of the spheres in random flat planes of a sample, such as in thin sections or polished slabs. Several methods have been devised to find a simple factor to convert the mean of such 2D size distributions to the actual 3D mean size of the spheres, without a consensus. We derive an entirely theoretical solution based on well-established probability laws and not constrained by limitations of absolute size, which indicates that the ratio of the means of the measured 2D and estimated 3D grain size distributions should be π/4 (= 0.785). The actual 2D size distribution of the radii of submicron-sized, pure Fe⁰ globules in lunar agglutinitic glass, determined from backscattered electron images, is found to fit the gamma size distribution model better than the log-normal model. Numerical analysis of the 2D size distributions of Fe⁰ globules in 9 lunar soils shows that the average 2D/3D ratio of means is 0.84, which is very close to the theoretical value. These results converge with the ratio 0.8 that Hughes (1978) determined for millimeter-sized chondrules from empirical measurements. We recommend that a factor of 1.273 (the reciprocal of 0.785) be used to convert the determined 2D mean size (radius or diameter) of a population of spheres to an estimate of their actual 3D size. © 2005 Geological Society of America.
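
    The π/4 result is easy to check numerically: a sphere of radius R cut by a random plane whose distance from the centre is uniform on [0, R] shows a circular section of radius sqrt(R² - d²), whose mean is (π/4)R. A quick Monte Carlo confirmation:

```python
import numpy as np

rng = np.random.default_rng(1)
R = 1.0
d = rng.uniform(0.0, R, size=1_000_000)   # plane-to-centre distances
r2d = np.sqrt(R**2 - d**2)                # observed 2D section radii

print(r2d.mean() / R)   # ~0.785 = pi/4, the mean 2D/3D size ratio
print(4.0 / np.pi)      # ~1.273, the recommended 2D -> 3D conversion factor
```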

  14. A short walk in quantum probability

    NASA Astrophysics Data System (ADS)

    Hudson, Robin

    2018-04-01

    This is a personal survey of aspects of quantum probability related to the Heisenberg commutation relation for canonical pairs. Using the failure, in general, of non-negativity of the Wigner distribution for canonical pairs to motivate a more satisfactory quantum notion of joint distribution, we visit a central limit theorem for such pairs and a resulting family of quantum planar Brownian motions which deform the classical planar Brownian motion, together with a corresponding family of quantum stochastic areas. This article is part of the themed issue `Hilbert's sixth problem'.

  15. A short walk in quantum probability.

    PubMed

    Hudson, Robin

    2018-04-28

    This is a personal survey of aspects of quantum probability related to the Heisenberg commutation relation for canonical pairs. Using the failure, in general, of non-negativity of the Wigner distribution for canonical pairs to motivate a more satisfactory quantum notion of joint distribution, we visit a central limit theorem for such pairs and a resulting family of quantum planar Brownian motions which deform the classical planar Brownian motion, together with a corresponding family of quantum stochastic areas. This article is part of the themed issue 'Hilbert's sixth problem'. © 2018 The Author(s).

  16. Cosmological gamma-ray bursts

    NASA Technical Reports Server (NTRS)

    Paczynski, Bohdan

    1991-01-01

    The distribution in angle and flux of gamma-ray bursts indicates that the majority of gamma-ray bursters are at cosmological distances, i.e., at z of about 1. The rate is then about 10^-8 per year in a galaxy like the Milky Way, i.e., orders of magnitude lower than the estimated rate of collisions between neutron stars in close binary systems. The energy per burst is about 10^51 ergs, assuming isotropic emission. The events appear to be less energetic and more frequent if their emission is strongly beamed. Some tests of the distance scale are discussed: a correlation between a burst's strength and its spectrum; absorption by the Galactic gas below about 2 keV; X-ray tails caused by forward scattering by Galactic dust; roughly 1-month recurrences of some bursts caused by gravitational lensing by foreground galaxies; and a search for gamma-ray bursts in M31. The bursts appear to be a manifestation of something exotic, but conventional compact objects can provide an explanation. The best possibility is offered by the decay of a binary composed of a spinning stellar-mass black-hole primary and a neutron-star or strange-quark-star secondary. In the final phase the secondary is tidally disrupted, forms an accretion disk, and up to 10^54 ergs are released. A very small fraction of this energy powers the gamma-ray burst.

  17. Use of Probability Distribution Functions for Discriminating Between Cloud and Aerosol in Lidar Backscatter Data

    NASA Technical Reports Server (NTRS)

    Liu, Zhaoyan; Vaughan, Mark A.; Winker, David M.; Hostetler, Chris A.; Poole, Lamont R.; Hlavka, Dennis; Hart, William; McGill, Matthew

    2004-01-01

    In this paper we describe the algorithm that will be used during the upcoming Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observations (CALIPSO) mission for discriminating between clouds and aerosols detected in two-wavelength backscatter lidar profiles. We first analyze single-test and multiple-test classification approaches based on one-dimensional and multi-dimensional probability density functions (PDFs) in the context of a two-class feature identification scheme. From these studies we derive an operational algorithm based on a set of 3-dimensional probability distribution functions characteristic of clouds and aerosols. A dataset acquired by the Cloud Physics Lidar (CPL) is used to test the algorithm, and comparisons are conducted between the CALIPSO algorithm results and the CPL data product. The results show generally good agreement between the two methods. However, of a total of 228,264 layers analyzed, approximately 5.7% are classified as different types by the CALIPSO and CPL algorithms. This disparity is shown to be due largely to the misclassification of clouds as aerosols by the CPL algorithm; the use of 3-dimensional PDFs in the CALIPSO algorithm is found to significantly reduce this type of error. Dust presents a special case: because the intrinsic scattering properties of dust layers can be very similar to those of clouds, additional algorithm testing was performed using an optically dense layer of Saharan dust measured during the Lidar In-space Technology Experiment (LITE). In general, the method is shown to distinguish reliably between dust layers and clouds. The relatively few erroneous classifications occurred most often in the LITE data, in those regions of the Saharan dust layer where the optical thickness was highest.
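
    Once the class PDFs are in hand, the decision rule itself is a pointwise comparison. A minimal sketch, with Gaussians standing in for the operational 3-dimensional empirical PDFs and all feature values assumed purely for illustration:

```python
import numpy as np
from scipy.stats import multivariate_normal

# Stand-in 3D class PDFs over a hypothetical feature vector
# (mean attenuated backscatter, colour ratio, mid-layer altitude in km).
pdf_cloud = multivariate_normal(mean=[0.05, 1.0, 6.0],
                                cov=np.diag([0.02**2, 0.2**2, 3.0**2]))
pdf_aerosol = multivariate_normal(mean=[0.01, 0.5, 2.0],
                                  cov=np.diag([0.01**2, 0.2**2, 2.0**2]))

def classify(feature):
    """Label a detected layer by whichever class PDF dominates."""
    return "cloud" if pdf_cloud.pdf(feature) >= pdf_aerosol.pdf(feature) else "aerosol"

print(classify([0.04, 0.9, 5.0]))    # -> cloud
print(classify([0.012, 0.55, 2.5]))  # -> aerosol
```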

  18. gamma-Hexachlorocyclohexane (gamma-HCH)

    Integrated Risk Information System (IRIS)

    gamma-Hexachlorocyclohexane (gamma-HCH); CASRN 58-89-9. Human health assessment information on a chemical substance is included in the IRIS database only after a comprehensive review of toxicity data, as outlined in the IRIS assessment development process. Sections I (Health Hazard Asse

  19. GAMSOR: Gamma Source Preparation and DIF3D Flux Solution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, M. A.; Lee, C. H.; Hill, R. N.

    2016-12-15

    Nuclear reactors that rely upon the fission reaction have two modes of thermal energy deposition in the reactor system: neutron absorption and gamma absorption. The gamma rays are typically generated by neutron absorption reactions or during the fission process, which means the primary driver of energy production is of course the neutron interaction. In conventional reactor physics methods, the gamma heating component is ignored, such that the gamma absorption is forced to occur at the gamma emission site. For experimental reactor systems like EBR-II and FFTF, the placement of structural pins and assemblies internal to the core leads to problems with power heating predictions, because there is no fission power source internal to the assembly to dictate a spatial distribution of the power. As part of the EBR-II support work in the 1980s, the GAMSOR code was developed to assist analysts in calculating the gamma heating. The GAMSOR code is a modified version of DIF3D and actually functions within a sequence of DIF3D calculations. The gamma flux in a conventional fission reactor system does not perturb the neutron flux, and thus the gamma flux calculation can be cast as a fixed-source problem given a solution to the steady-state neutron flux equation. This leads to a sequence of DIF3D calculations, called the GAMSOR sequence, which involves solving for the neutron flux, then the gamma flux, and then combining the results to do a summary edit. In this manuscript, we go over the GAMSOR code and detail how it is put together and functions. We also discuss how to set up the GAMSOR sequence and the input for each DIF3D calculation in the sequence. With the GAMSOR capability, users can take any valid steady-state DIF3D calculation and compute the power distribution due to neutron and gamma heating. The MC2-3 code is the preferred companion code for generating neutron and gamma cross section data, but the GAMSOR code can accept cross section data from other sources. To
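
    The sequence described (a steady-state neutron eigenvalue solve, then a fixed-source gamma solve driven by the neutron reaction rates, then a combined heating edit) can be illustrated with a deliberately crude one-group, one-dimensional diffusion toy. Every constant below is invented for illustration, and nothing here is the DIF3D/GAMSOR interface or data:

```python
import numpy as np

N, L = 100, 100.0                          # mesh cells, slab width (cm)
h = L / N
D_n, siga_n, nusigf = 1.2, 0.030, 0.033    # assumed neutron constants (1/cm)
D_g, siga_g = 2.0, 0.020                   # assumed gamma constants (1/cm)
mev_gamma_per_fission, nu = 7.0, 2.43      # assumed gamma yield and nu-bar

def diffusion_matrix(D, siga):
    """Finite-difference -D d2/dx2 + siga with zero-flux boundaries."""
    main = np.full(N, 2.0 * D / h**2 + siga)
    off = np.full(N - 1, -D / h**2)
    return np.diag(main) + np.diag(off, 1) + np.diag(off, -1)

# Step 1: steady-state neutron eigenvalue problem by power iteration.
A_n = diffusion_matrix(D_n, siga_n)
phi_n, k = np.ones(N), 1.0
for _ in range(500):
    phi_new = np.linalg.solve(A_n, nusigf * phi_n / k)
    k *= phi_new.sum() / phi_n.sum()
    phi_n = phi_new

# Step 2: the gammas do not perturb phi_n, so their emission
# (proportional to the local fission rate) is a fixed source.
q_gamma = mev_gamma_per_fission * (nusigf / nu) * phi_n
phi_g = np.linalg.solve(diffusion_matrix(D_g, siga_g), q_gamma)

# Step 3: summary edit combining the two solutions.
heating_g = siga_g * phi_g                 # gamma energy deposition rate
print("keff ~", round(k, 4))
print("gamma heating peak/edge ratio ~", round(heating_g.max() / heating_g[0], 2))
```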

  20. Regional flood probabilities

    USGS Publications Warehouse

    Troutman, Brent M.; Karlinger, Michael R.

    2003-01-01

    The T-year annual maximum flood at a site is defined to be the streamflow that has probability 1/T of being exceeded in any given year, and for a group of sites the corresponding regional flood probability (RFP) is the probability that at least one site will experience a T-year flood in any given year. The RFP depends on the number of sites of interest and on the spatial correlation of flows among the sites. We present a Monte Carlo method for obtaining the RFP and demonstrate that the spatial correlation estimates used in this method may be obtained from rank-transformed data, and therefore that knowledge of the at-site peak flow distribution is not necessary. We examine the extent to which the estimates depend on the specification of a parametric form for the spatial correlation function, which is known to be nonstationary for peak flows. It is shown in a simulation study that use of a stationary correlation function to compute RFPs yields satisfactory estimates for certain nonstationary processes. Application of asymptotic extreme value theory is examined, and a methodology for separating channel network and rainfall effects on RFPs is suggested. A case study is presented using peak flow data from the state of Washington. For 193 sites in the Puget Sound region it is estimated that a 100-year flood will occur on average every 4.5 years.
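
    Because the method works on rank-transformed data, the marginal flood distribution drops out, and the RFP can be sketched with standard normal marginals under an assumed correlation structure. A minimal sketch using equicorrelation (a simplification; the paper fits a spatial correlation function to peak-flow data):

```python
import numpy as np
from scipy.stats import norm

def regional_flood_prob(n_sites, rho, T, n_years=20_000, seed=0):
    """Monte Carlo estimate of the probability that at least one of
    n_sites experiences its T-year flood in a given year, under a
    Gaussian copula with common pairwise correlation rho."""
    rng = np.random.default_rng(seed)
    cov = np.full((n_sites, n_sites), rho) + (1.0 - rho) * np.eye(n_sites)
    z = rng.multivariate_normal(np.zeros(n_sites), cov, size=n_years)
    threshold = norm.ppf(1.0 - 1.0 / T)       # per-site T-year quantile
    return (z.max(axis=1) > threshold).mean()

# Independence gives 1 - (1 - 1/T)^n ~ 0.86 for n = 193, T = 100;
# positive spatial correlation pulls the RFP well below that.
print(regional_flood_prob(n_sites=193, rho=0.0, T=100))
print(regional_flood_prob(n_sites=193, rho=0.5, T=100))
```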