Sample records for content probability distribution

  1. Characterization of Cloud Water-Content Distribution

    NASA Technical Reports Server (NTRS)

    Lee, Seungwon

    2010-01-01

    The development of realistic cloud parameterizations for climate models requires accurate characterizations of subgrid distributions of thermodynamic variables. To this end, a software tool was developed to characterize cloud water-content distributions in climate-model sub-grid scales. This software characterizes distributions of cloud water content with respect to cloud phase, cloud type, precipitation occurrence, and geo-location using CloudSat radar measurements. It uses a statistical method called maximum likelihood estimation to estimate the probability density function of the cloud water content.
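
As a rough illustration of the MLE step this abstract describes, the sketch below fits a parametric density to synthetic water-content values; the gamma model and all numbers are assumptions for illustration, not the paper's actual parameterization.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Stand-in for CloudSat-derived water-content values (g m-3), gamma-distributed.
water_content = rng.gamma(shape=2.0, scale=0.15, size=5000)

# scipy's fit() maximizes the likelihood numerically; floc=0 pins the location at zero.
shape, loc, scale = stats.gamma.fit(water_content, floc=0)
print(f"MLE estimates: shape={shape:.3f}, scale={scale:.3f}")
```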

  2. Inverse statistics and information content

    NASA Astrophysics Data System (ADS)

    Ebadi, H.; Bolgorian, Meysam; Jafari, G. R.

    2010-12-01

    Inverse statistics analysis studies the distribution of investment horizons required to achieve a predefined level of return. This distribution provides a maximum investment horizon, which determines the most likely horizon for gaining a specific return. There is a significant difference between the inverse statistics of financial market data and those of a fractional Brownian motion (fBm) as an uncorrelated time series, and this difference is a suitable criterion for measuring the information content in financial data. In this paper we perform this analysis for the DJIA and S&P500 as two developed markets and the Tehran price index (TEPIX) as an emerging market. We also compare these probability distributions with the fBm probability distribution to detect when the behavior of the stocks is the same as that of fBm.
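
A minimal sketch of the inverse-statistics computation described above: for each starting day, find the first-passage time at which the log-return reaches a predefined level rho. The random-walk input below stands in for real index data.

```python
import numpy as np

rng = np.random.default_rng(1)
log_price = np.cumsum(rng.normal(0.0, 0.01, size=5000))  # stand-in for a real index
rho = 0.02                                               # predefined return level

horizons = []
for t in range(len(log_price) - 1):
    gain = log_price[t + 1:] - log_price[t]
    hit = np.nonzero(gain >= rho)[0]
    if hit.size:
        horizons.append(hit[0] + 1)                      # first-passage time in days

counts = np.bincount(np.array(horizons))
print("most likely (optimal) investment horizon:", counts.argmax(), "days")
```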

  3. Ecological risk assessment of a coastal zone in Southern Vietnam: Spatial distribution and content of heavy metals in water and surface sediments of the Thi Vai Estuary and Can Gio Mangrove Forest.

    PubMed

    Costa-Böddeker, Sandra; Hoelzmann, Philipp; Thuyên, Lê Xuân; Huy, Hoang Duc; Nguyen, Hoang Anh; Richter, Otto; Schwalb, Antje

    2017-01-30

    Enrichment of heavy metals was assessed in the Thi Vai Estuary and in the Can Gio Mangrove Forest (SE Vietnam). Cd, Co, Cr, Cu, Mn, Ni, Pb and Zn contents in water and in sediments were measured. Total organic carbon, nitrogen, phosphorus and C/N ratios were determined. Cu and Cr values were higher than the threshold effect level of toxicity, while Ni exceeded the probable effect level, indicating a risk of probable toxicity effects. Enrichment factors (EF), contamination factors (CF) and the geo-accumulation index (I-geo) were determined. CF reveals moderate to considerable pollution with Cr and Ni. EF suggests anthropogenic sources of Cr, Cu and Ni. I-geo indicates low contamination with Co, Cu and Zn and moderate contamination with Cr and Ni. Overall metal contents were lower than expected for this highly industrialized region, probably due to dilution, suggesting that erosion rates and hydrodynamics may also play a role in the distribution of metal contents. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Method for localizing and isolating an errant process step

    DOEpatents

    Tobin, Jr., Kenneth W.; Karnowski, Thomas P.; Ferrell, Regina K.

    2003-01-01

    A method for localizing and isolating an errant process step includes the steps of retrieving, from a defect image database, a selection of images whose image content is similar to the image content extracted from a query image depicting a defect, each image in the selection having corresponding defect characterization data. A conditional probability distribution of the defect having occurred in a particular process step is derived from the defect characterization data. The process step that is the most probable source of the defect according to the derived conditional probability distribution is then identified. A method for process-step defect identification includes the steps of characterizing anomalies in a product, the anomalies being detected by an imaging system. A query image of a product defect is then acquired. A particular characterized anomaly is then correlated with the query image. An errant process step is then associated with the correlated image.
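
The core inference in this patent abstract reduces to a conditional distribution over process steps given the retrieved defect records. A toy sketch, with invented step names and counts:

```python
from collections import Counter

# Process-step labels attached to the defect images retrieved for a query image
# (names and counts invented for illustration).
retrieved_steps = ["etch", "etch", "litho", "deposition", "etch", "litho"]

counts = Counter(retrieved_steps)
total = sum(counts.values())
posterior = {step: n / total for step, n in counts.items()}   # P(step | defect)
print(posterior)
print("most probable source:", max(posterior, key=posterior.get))
```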

  5. Relationship between radiation-induced aberrations in individual chromosomes and their DNA content: effects of interaction distance

    NASA Technical Reports Server (NTRS)

    Wu, H.; Durante, M.; Lucas, J. N.

    2001-01-01

    PURPOSE: To study the effect of the interaction distance on the frequency of inter- and intrachromosome exchanges in individual chromosomes with respect to their DNA content. Assumptions: Chromosome exchanges are formed by misrejoining of two DNA double-strand breaks (DSB) induced within an interaction distance, d. It is assumed that chromosomes in the G(0)/G(1) phase of the cell cycle occupy spherical domains in the cell nucleus, with no spatial overlap between individual chromosome domains. RESULTS: Formulae are derived for the probability of formation of inter-, as well as intra-, chromosome exchanges relating to the DNA content of the chromosome for a given interaction distance. For interaction distances <1 microm, the relative frequency of interchromosome exchanges predicted by the present model is similar to that predicted by Cigarran et al. (1998) based on the assumption that the probability of interchromosome exchanges is proportional to the "surface area" of the chromosome territory. The "surface area" assumption is shown to be a limiting case of d → 0 in the present model. The present model also predicts that the probability of intrachromosome exchanges occurring in individual chromosomes is proportional to their DNA content, with correction terms. CONCLUSION: When the interaction distance is small, the "surface area" distribution for chromosome participation in interchromosome exchanges is expected. However, the present model shows that for an interaction distance as large as 1 microm, the predicted probability of interchromosome exchange formation is still close to the surface area distribution. Therefore, this distribution does not necessarily rule out the formation of complex chromosomal aberrations by long-range misrejoining of DSB.

  6. Attention as Inference: Selection Is Probabilistic; Responses Are All-or-None Samples

    ERIC Educational Resources Information Center

    Vul, Edward; Hanus, Deborah; Kanwisher, Nancy

    2009-01-01

    Theories of probabilistic cognition postulate that internal representations are made up of multiple simultaneously held hypotheses, each with its own probability of being correct (henceforth, "probability distributions"). However, subjects make discrete responses and report the phenomenal contents of their mind to be all-or-none states rather than…

  7. Bayesian Cherry Picking Revisited

    NASA Astrophysics Data System (ADS)

    Garrett, Anthony J. M.; Prozesky, Victor M.; Padayachee, J.

    2004-04-01

    Tins are marketed as containing nine cherries. To fill the tins, cherries are fed into a drum containing twelve holes through which air is sucked; either zero, one or two cherries stick in each hole. Dielectric measurements are then made on each hole. Three outcomes are distinguished: an empty hole (which is reliable); one cherry (which indicates one cherry with high probability, or two cherries with a complementary low probability known from calibration); or an uncertain number (which also indicates one cherry or two, with known probabilities that are quite similar). A choice can then be made as to which holes simultaneously discharge their contents into the tin. The sum and product rules of probability are applied in a Bayesian manner to find the distribution for the number of cherries in the tin. Based on this distribution, ways are discussed to optimise the number to nine cherries.
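
A compact way to reproduce the kind of distribution the abstract derives is to convolve per-hole probability mass functions; the hole classifications and probabilities below are invented for illustration.

```python
import numpy as np

def hole_pmf(p_one):
    # Index = number of cherries the hole contributes: 0, 1 or 2.
    return np.array([0.0, p_one, 1.0 - p_one])

def tin_distribution(p_ones):
    pmf = np.array([1.0])
    for p in p_ones:
        pmf = np.convolve(pmf, hole_pmf(p))  # add one hole's contribution
    return pmf

# Eight holes classified "one cherry" (p=0.9) plus one "uncertain" hole (p=0.55).
pmf = tin_distribution([0.9] * 8 + [0.55])
print("P(exactly 9 cherries) =", pmf[9])
print("most probable count =", pmf.argmax())
```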

  8. An information measure for class discrimination [in remote sensing of crop observation]

    NASA Technical Reports Server (NTRS)

    Shen, S. S.; Badhwar, G. D.

    1986-01-01

    This article describes a separability measure for class discrimination. The measure is based on the Fisher information measure for estimating the mixing proportion of two classes. The Fisher information measure not only provides a means to assess quantitatively the information content in the features for separating classes, but also gives the lower bound for the variance of any unbiased estimate of the mixing proportion based on observations of the features. Unlike most commonly used separability measures, this measure does not depend on the form of the probability distribution of the features and does not imply a specific estimation procedure. This is important because the probability distribution function that describes the data for a given class often does not have a simple analytic form, such as a Gaussian. Results of applying this measure to compare the information content provided by three Landsat-derived feature vectors for the purpose of separating small grains from other crops are presented.
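
The measure described here is the Fisher information for the mixing proportion pi of a two-class mixture, I(pi) = ∫ (f1(x) − f2(x))² / (pi·f1(x) + (1 − pi)·f2(x)) dx. The sketch below evaluates it by quadrature; Gaussian class densities are used only to make the example concrete, since the measure itself does not assume a distributional form.

```python
import numpy as np
from scipy import stats

def fisher_information(pi, f1, f2, grid):
    # I(pi) for the mixture f = pi*f1 + (1-pi)*f2, by trapezoidal quadrature.
    num = (f1(grid) - f2(grid)) ** 2
    den = pi * f1(grid) + (1.0 - pi) * f2(grid)
    return np.trapz(num / den, grid)

grid = np.linspace(-10.0, 10.0, 4001)
f1 = stats.norm(0.0, 1.0).pdf     # illustrative class densities
f2 = stats.norm(1.5, 1.0).pdf
info = fisher_information(0.4, f1, f2, grid)
print(f"I(pi=0.4) = {info:.4f}; CRLB for one observation = {1.0 / info:.4f}")
```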

  9. Technical assistance report : I-73 economic impact analysis.

    DOT National Transportation Integrated Search

    1995-01-01

    This study assessed the probable economic impact of the future Interstate 73 along each of twelve alternative corridors that were proposed for the new highway. The contents of this report were originally distributed in four parts during February and ...

  10. Recent Additions for 2000

    EPA Science Inventory

    December 5, 2000
    Options for Development of Parametric Probability Distributions for Exposure Factors
    (This file contains the Table of Contents) EPA/6...

  11. Behavioral Analysis of Visitors to a Medical Institution's Website Using Markov Chain Monte Carlo Methods.

    PubMed

    Suzuki, Teppei; Tani, Yuji; Ogasawara, Katsuhiko

    2016-07-25

    Consistent with the "attention, interest, desire, memory, action" (AIDMA) model of consumer behavior, patients collect information about available medical institutions using the Internet to select information for their particular needs. Studies of consumer behavior may be found in areas other than medical institution websites. Such research uses Web access logs for visitor search behavior. At this time, research applying the patient searching behavior model to medical institution website visitors is lacking. We have developed a hospital website search behavior model using a Bayesian approach to clarify the behavior of medical institution website visitors and determine the probability of their visits, classified by search keyword. We used the website access log of a clinic of internal medicine and gastroenterology in the Sapporo suburbs, collecting data from January 1 through June 30, 2011. The contents of the 6 website pages included the following: home, news, content introduction for medical examinations, mammography screening, holiday person-on-duty information, and other. The search keywords we identified as best expressing website visitor needs were listed as the top 4 headings from the access log: clinic name, clinic name + regional name, clinic name + medical examination, and mammography screening. Using the search keywords as the explanatory variable, we built a binomial probit model that allows inspection of the contents of each objective variable. Using this model, we determined a beta value and generated a posterior distribution. We performed the simulation using Markov Chain Monte Carlo methods with a noninformative prior distribution for this model and determined the visit probability, classified by keyword, for each category. In the case of the keyword "clinic name," the visit probability to the website, repeated visits to the website, and visits to the contents page for medical examinations were positive. In the case of the keyword "clinic name + regional name," the probability of a repeated visit to the website and of a visit to the mammography screening page was negative. In the case of the keyword "clinic name + medical examination," the visit probability to the website was positive, and the visit probability to the information page was negative. When visitors referred to the keyword "mammography screening," the visit probability to the mammography screening page was positive (95% highest posterior density interval = 3.38-26.66). Further analysis of not only the clinic website but also various other medical institution websites is necessary to build a general inspection model for medical institution websites; we want to consider this in future research. Additionally, we hope to use the results obtained in this study as a prior distribution for future work to conduct higher-precision analysis.
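
A hedged sketch of this modeling approach: a probit regression of a binary outcome on keyword-category dummies, sampled with random-walk Metropolis under a flat prior. The design matrix and responses are synthetic placeholders, not the clinic's actual access log.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 400
X = np.column_stack([np.ones(n), rng.integers(0, 2, size=(n, 2))])  # intercept + 2 keyword dummies
beta_true = np.array([-0.5, 1.0, -0.8])
y = (X @ beta_true + rng.normal(size=n) > 0).astype(int)            # synthetic visit outcomes

def log_post(beta):
    p = np.clip(stats.norm.cdf(X @ beta), 1e-12, 1 - 1e-12)
    return np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))          # flat prior

beta, lp, draws = np.zeros(3), -np.inf, []
for step in range(20_000):
    proposal = beta + rng.normal(scale=0.1, size=3)
    lp_prop = log_post(proposal)
    if np.log(rng.random()) < lp_prop - lp:                          # Metropolis accept step
        beta, lp = proposal, lp_prop
    if step >= 5_000:                                                # discard burn-in
        draws.append(beta.copy())

print("posterior means:", np.mean(draws, axis=0))
```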

  12. Behavioral Analysis of Visitors to a Medical Institution’s Website Using Markov Chain Monte Carlo Methods

    PubMed Central

    Tani, Yuji

    2016-01-01

    Background Consistent with the "attention, interest, desire, memory, action" (AIDMA) model of consumer behavior, patients collect information about available medical institutions using the Internet to select information for their particular needs. Studies of consumer behavior may be found in areas other than medical institution websites. Such research uses Web access logs for visitor search behavior. At this time, research applying the patient searching behavior model to medical institution website visitors is lacking. Objective We have developed a hospital website search behavior model using a Bayesian approach to clarify the behavior of medical institution website visitors and determine the probability of their visits, classified by search keyword. Methods We used the website access log of a clinic of internal medicine and gastroenterology in the Sapporo suburbs, collecting data from January 1 through June 30, 2011. The contents of the 6 website pages included the following: home, news, content introduction for medical examinations, mammography screening, holiday person-on-duty information, and other. The search keywords we identified as best expressing website visitor needs were listed as the top 4 headings from the access log: clinic name, clinic name + regional name, clinic name + medical examination, and mammography screening. Using the search keywords as the explanatory variable, we built a binomial probit model that allows inspection of the contents of each objective variable. Using this model, we determined a beta value and generated a posterior distribution. We performed the simulation using Markov Chain Monte Carlo methods with a noninformative prior distribution for this model and determined the visit probability, classified by keyword, for each category. Results In the case of the keyword "clinic name," the visit probability to the website, repeated visits to the website, and visits to the contents page for medical examinations were positive. In the case of the keyword "clinic name + regional name," the probability of a repeated visit to the website and of a visit to the mammography screening page was negative. In the case of the keyword "clinic name + medical examination," the visit probability to the website was positive, and the visit probability to the information page was negative. When visitors referred to the keyword "mammography screening," the visit probability to the mammography screening page was positive (95% highest posterior density interval = 3.38-26.66). Conclusions Further analysis of not only the clinic website but also various other medical institution websites is necessary to build a general inspection model for medical institution websites; we want to consider this in future research. Additionally, we hope to use the results obtained in this study as a prior distribution for future work to conduct higher-precision analysis. PMID:27457537

  13. Traffic-Adaptive, Flow-Specific Medium Access for Wireless Networks

    DTIC Science & Technology

    2009-09-01

    hybrid, contention and non-contention schemes are shown to be special cases. This work also compares the energy efficiency of centralized and distributed...solutions and proposes an energy efficient version of traffic-adaptive CWS-MAC that includes an adaptive sleep cycle coordinated through the use of...preamble sampling. A preamble sampling probability parameter is introduced to manage the trade-off between energy efficiency and throughput and delay

  14. Digital simulation of an arbitrary stationary stochastic process by spectral representation.

    PubMed

    Yura, Harold T; Hanson, Steen G

    2011-04-01

    In this paper we present a straightforward, efficient, and computationally fast method for creating a large number of discrete samples with an arbitrary given probability density function and a specified spectral content. The method relies on initially transforming a white noise sample set of random Gaussian distributed numbers into a corresponding set with the desired spectral distribution, after which this colored Gaussian probability distribution is transformed via an inverse transform into the desired probability distribution. In contrast to previous work, where the analyses were limited to autoregressive and/or iterative techniques to obtain satisfactory results, we find that a single application of the inverse transform method yields satisfactory results for a wide class of arbitrary probability distributions. Although a single application of the inverse transform technique does not conserve the power spectra exactly, it yields highly accurate numerical results for a wide range of probability distributions and target power spectra that are sufficient for system simulation purposes and can thus be regarded as an accurate engineering approximation, which can be used for a wide range of practical applications. A sufficiency condition is presented regarding the range of parameter values where a single application of the inverse transform method yields satisfactory agreement between the simulated and target power spectra, and a series of examples relevant for the optics community is presented and discussed. Outside this parameter range the agreement gracefully degrades but does not distort in shape. Although we demonstrate the method here focusing on stationary random processes, we see no reason why the method could not be extended to simulate non-stationary random processes. © 2011 Optical Society of America
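
The two-step recipe the abstract describes can be sketched as follows: color a white Gaussian sequence to a target power spectrum, then map it through the inverse CDF of the desired marginal. The Lorentzian spectrum and exponential marginal below are illustrative assumptions.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n = 2 ** 16
white = rng.normal(size=n)

# Step 1: impose the target spectral shape in the frequency domain.
freqs = np.fft.rfftfreq(n)
target_spectrum = 1.0 / (1.0 + (freqs / 0.01) ** 2)        # illustrative Lorentzian
colored = np.fft.irfft(np.fft.rfft(white) * np.sqrt(target_spectrum), n)
colored /= colored.std()

# Step 2: Gaussian CDF -> uniform -> inverse CDF of the target marginal.
uniform = np.clip(stats.norm.cdf(colored), 1e-12, 1 - 1e-12)
samples = stats.expon.ppf(uniform)                         # exponential marginal
print("sample mean (expect ~1 for a unit exponential):", samples.mean())
```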

  15. Growing and navigating the small world Web by local content

    PubMed Central

    Menczer, Filippo

    2002-01-01

    Can we model the scale-free distribution of Web hypertext degree under realistic assumptions about the behavior of page authors? Can a Web crawler efficiently locate an unknown relevant page? These questions are receiving much attention due to their potential impact for understanding the structure of the Web and for building better search engines. Here I investigate the connection between the linkage and content topology of Web pages. The relationship between a text-induced distance metric and a link-based neighborhood probability distribution displays a phase transition between a region where linkage is not determined by content and one where linkage decays according to a power law. This relationship is used to propose a Web growth model that is shown to accurately predict the distribution of Web page degree, based on textual content and assuming only local knowledge of degree for existing pages. A qualitatively similar phase transition is found between linkage and semantic distance, with an exponential decay tail. Both relationships suggest that efficient paths can be discovered by decentralized Web navigation algorithms based on textual and/or categorical cues. PMID:12381792

  16. Growing and navigating the small world Web by local content

    NASA Astrophysics Data System (ADS)

    Menczer, Filippo

    2002-10-01

    Can we model the scale-free distribution of Web hypertext degree under realistic assumptions about the behavior of page authors? Can a Web crawler efficiently locate an unknown relevant page? These questions are receiving much attention due to their potential impact for understanding the structure of the Web and for building better search engines. Here I investigate the connection between the linkage and content topology of Web pages. The relationship between a text-induced distance metric and a link-based neighborhood probability distribution displays a phase transition between a region where linkage is not determined by content and one where linkage decays according to a power law. This relationship is used to propose a Web growth model that is shown to accurately predict the distribution of Web page degree, based on textual content and assuming only local knowledge of degree for existing pages. A qualitatively similar phase transition is found between linkage and semantic distance, with an exponential decay tail. Both relationships suggest that efficient paths can be discovered by decentralized Web navigation algorithms based on textual and/or categorical cues.

  17. Growing and navigating the small world Web by local content.

    PubMed

    Menczer, Filippo

    2002-10-29

    Can we model the scale-free distribution of Web hypertext degree under realistic assumptions about the behavior of page authors? Can a Web crawler efficiently locate an unknown relevant page? These questions are receiving much attention due to their potential impact for understanding the structure of the Web and for building better search engines. Here I investigate the connection between the linkage and content topology of Web pages. The relationship between a text-induced distance metric and a link-based neighborhood probability distribution displays a phase transition between a region where linkage is not determined by content and one where linkage decays according to a power law. This relationship is used to propose a Web growth model that is shown to accurately predict the distribution of Web page degree, based on textual content and assuming only local knowledge of degree for existing pages. A qualitatively similar phase transition is found between linkage and semantic distance, with an exponential decay tail. Both relationships suggest that efficient paths can be discovered by decentralized Web navigation algorithms based on textual and/or categorical cues.
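
The three records above describe the same model. A toy version of its growth rule, in which a new page attaches preferentially by degree but only within a content-distance threshold, might look like the following; the 1-D content space and threshold value are simplifying assumptions, not the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(3)
positions = [0.0]   # "content" coordinate of each page
degree = [1]
edges = []

for _ in range(2000):
    pos = rng.random()
    # Degree-proportional attachment, but only to pages within content distance 0.1.
    weights = np.array([d if abs(pos - q) < 0.1 else 0.0
                        for q, d in zip(positions, degree)])
    if weights.sum() > 0:
        target = rng.choice(len(positions), p=weights / weights.sum())
        degree[target] += 1
        edges.append((len(positions), target))
    positions.append(pos)
    degree.append(1)

# A heavy right tail in the degree sequence signals power-law-like growth.
print("max degree:", max(degree), " mean degree:", float(np.mean(degree)))
```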

  18. Spatial distribution of environmental risk associated to a uranium abandoned mine (Central Portugal)

    NASA Astrophysics Data System (ADS)

    Antunes, I. M.; Ribeiro, A. F.

    2012-04-01

    The abandoned uranium mine of Canto do Lagar is located at Arcozelo da Serra, central Portugal. The mine was exploited in an open pit and produced about 12,430 kg of uranium oxide (U3O8) between 1987 and 1988. The dominant geological unit is a porphyritic coarse-grained two-mica granite, with biotite > muscovite. The uranium deposit consists of two crushed zones, parallel to the coarse-grained porphyritic granite, with an average direction of N30°E, silicified, sericitized and reddish jasperized, with a width of approximately 10 meters. These zones are accompanied by two thin veins of white quartz, 70°-80° WNW, ferruginous and jasperized with chalcedony, red jasper and opal. The veins are about 6 meters away from each other. They contain secondary U-phosphate phases such as autunite and torbernite. Rejected materials (1,000,000 t) were deposited on two dumps, and a lake formed in the open pit. To assess the environmental risk of the abandoned uranium mine of Canto do Lagar, 70 samples of stream sediments, soils and mine tailings materials were collected and analysed. The relations between sample compositions were tested using Principal Components Analysis (PCA) (multivariate analysis), and the spatial distribution was modelled using indicator kriging. The spatial distribution of stream sediments shows that the probability of expression of principal component 1 (explaining Y, Zr, Nb, La, Ce, Pr, Nd, Sm, Eu, Gd, Tb, Dy, Ho, Er, Tm, Yb, Hf, Th and U contents) decreases along the SE-NW direction. This component is explained by the samples located inside the mine influence. The probability of expression of principal component 2 (explaining Be, Na, Al, Si, P, K, Ca, Ti, Mn, Fe, Co, Ni, Cu, As, Rb, Sr, Mo, Cs, Ba, Tl and Bi contents) increases towards the middle stream line. This component is explained by the samples located outside the mine influence. The spatial distribution of soils shows that the probability of expression of principal component 1 (explaining Mg, P, Ca, Ge, Sr, Y, Zr, La, Ce, Pr, Nd, Sm, Eu, Gd, Tb, Dy, Ho, Er, Tm, Yb, Lu, Hf, W, Th and U contents) decreases along the SE direction and increases along the NE and SW directions. The probability of expression of principal component 2 (explaining pH, K, Sc, Ti, V, Cr, Mn, Fe, Co, Ni, Cu, Zn, Sr and Pb contents) decreases from central points (inside the mine influence) to peripheral points (outside the mine influence) and gradually increases along the N and SW directions. The tailings materials did not yield a consistent spatial distribution. In general, the stream sediments are classified as unpolluted to moderately polluted according to the Müller geoaccumulation index, with the exception of samples located inside the mine influence. The soils cannot be used for public, private or residential purposes according to Canadian soil legislation; however, most samples can be used for commercial or industrial occupation. According to the target and intervention values for soil remediation, these soils need intervention. Tailings samples are heavily polluted with thorium (Th) and uranium (U) and cannot be used for public, private or residential purposes.

  19. Visualizing Metal Content and Intracellular Distribution in Primary Hippocampal Neurons with Synchrotron X-Ray Fluorescence

    DOE PAGES

    Colvin, Robert A.; Jin, Qiaoling; Lai, Barry; ...

    2016-07-19

    Increasing evidence suggests that metal dyshomeostasis plays an important role in human neurodegenerative diseases. Although distinctive metal distributions are described for mature hippocampus and cortex, much less is known about metal levels and intracellular distribution in individual hippocampal neuronal somata. To solve this problem, we conducted quantitative metal analyses utilizing synchrotron radiation X-Ray fluorescence on frozen hydrated primary cultured neurons derived from rat embryonic cortex (CTX) and two regions of the hippocampus: dentate gyrus (DG) and CA1. Comparing average metal contents showed that the most abundant metals were calcium, iron, and zinc, whereas metals such as copper and manganese were less than 10% of zinc. Average metal contents were generally similar when compared across neurons cultured from CTX, DG, and CA1, except for manganese that was larger in CA1. However, each metal showed a characteristic spatial distribution in individual neuronal somata. Zinc was uniformly distributed throughout the cytosol, with no evidence for the existence of previously identified zinc-enriched organelles, zincosomes. Calcium showed a peri-nuclear distribution consistent with accumulation in endoplasmic reticulum and/or mitochondria. Iron showed 2-3 distinct highly concentrated puncta only in peri-nuclear locations. Notwithstanding the small sample size, these analyses demonstrate that primary cultured neurons show characteristic metal signatures. The iron puncta probably represent iron-accumulating organelles, siderosomes. Thus, the metal distributions observed in mature brain structures are likely the result of both intrinsic neuronal factors that control cellular metal content and extrinsic factors related to the synaptic organization, function, and contacts formed and maintained in each region.

  20. Visualizing Metal Content and Intracellular Distribution in Primary Hippocampal Neurons with Synchrotron X-Ray Fluorescence

    PubMed Central

    2016-01-01

    Increasing evidence suggests that metal dyshomeostasis plays an important role in human neurodegenerative diseases. Although distinctive metal distributions are described for mature hippocampus and cortex, much less is known about metal levels and intracellular distribution in individual hippocampal neuronal somata. To solve this problem, we conducted quantitative metal analyses utilizing synchrotron radiation X-Ray fluorescence on frozen hydrated primary cultured neurons derived from rat embryonic cortex (CTX) and two regions of the hippocampus: dentate gyrus (DG) and CA1. Comparing average metal contents showed that the most abundant metals were calcium, iron, and zinc, whereas metals such as copper and manganese were less than 10% of zinc. Average metal contents were generally similar when compared across neurons cultured from CTX, DG, and CA1, except for manganese that was larger in CA1. However, each metal showed a characteristic spatial distribution in individual neuronal somata. Zinc was uniformly distributed throughout the cytosol, with no evidence for the existence of previously identified zinc-enriched organelles, zincosomes. Calcium showed a peri-nuclear distribution consistent with accumulation in endoplasmic reticulum and/or mitochondria. Iron showed 2–3 distinct highly concentrated puncta only in peri-nuclear locations. Notwithstanding the small sample size, these analyses demonstrate that primary cultured neurons show characteristic metal signatures. The iron puncta probably represent iron-accumulating organelles, siderosomes. Thus, the metal distributions observed in mature brain structures are likely the result of both intrinsic neuronal factors that control cellular metal content and extrinsic factors related to the synaptic organization, function, and contacts formed and maintained in each region. PMID:27434052

  1. [Distribution of polycyclic aromatic hydrocarbons in water and sediment from Zhoushan coastal area, China].

    PubMed

    Jiang, Min; Tuan, Le Huy; Mei, Wei-Ping; Ruan, Hui-Hui; Wu, Hao

    2014-07-01

    The spatial and temporal distribution of 16 polycyclic aromatic hydrocarbons (PAHs) was investigated in water and sediments of the Zhoushan coastal area every two months in 2012. The concentrations of total PAHs ranged from 382.3 to 816.9 ng x L(-1), with a mean value of 552.5 ng x L(-1), in water, whereas they ranged from 1017.9 to 3047.1 ng x g(-1), with a mean value of 2022.4 ng x g(-1), in sediment. The spatial distribution showed that the Yangshan and Yanwoshan offshore areas had the maximum and minimum total PAHs contents in water, while in sediment the maximum and minimum occurred at the Yangshan and Zhujiajian Nansha offshore areas. The temporal distribution revealed that total PAHs contents in water reached their maximum and minimum values in October and June, whereas in sediments these values were found in August and June, respectively. The PAHs pollution was affected by oil emission and by charcoal and coal combustion. Using the biological threshold and exceeded coefficient method to assess the ecological risk of PAHs in the Zhoushan coastal area, the results showed that the total PAHs had a lower probability of potential risk, while there was a higher probability of potential risk for the acenaphthylene monomer, and there might be ecological risk for acenaphthene and fluorene. The distribution of PAHs between sediment and water showed that Zhoushan coastal sediments were enriched in PAHs, and the enrichment coefficient (K(d) value) of sediment at Daishan Island was larger than that at Zhoushan main island.

  2. Contextuality in canonical systems of random variables

    NASA Astrophysics Data System (ADS)

    Dzhafarov, Ehtibar N.; Cervantes, Víctor H.; Kujala, Janne V.

    2017-10-01

    Random variables representing measurements, broadly understood to include any responses to any inputs, form a system in which each of them is uniquely identified by its content (that which it measures) and its context (the conditions under which it is recorded). Two random variables are jointly distributed if and only if they share a context. In a canonical representation of a system, all random variables are binary, and every content-sharing pair of random variables has a unique maximal coupling (the joint distribution imposed on them so that they coincide with maximal possible probability). The system is contextual if these maximal couplings are incompatible with the joint distributions of the context-sharing random variables. We propose to represent any system of measurements in a canonical form and to consider the system contextual if and only if its canonical representation is contextual. As an illustration, we establish a criterion for contextuality of the canonical system consisting of all dichotomizations of a single pair of content-sharing categorical random variables. This article is part of the themed issue `Second quantum revolution: foundational questions'.
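
The maximal coupling invoked here has a simple closed form: for two discrete variables with marginals p and q, the largest achievable probability that they coincide is the sum over outcomes of min(p_i, q_i). A binary example:

```python
import numpy as np

def max_coupling_prob(p, q):
    # Largest achievable P(X = Y) over all couplings with marginals p and q.
    return np.minimum(p, q).sum()

p = np.array([0.7, 0.3])   # P(A=0), P(A=1) in one context
q = np.array([0.5, 0.5])   # the same content recorded in another context
print("max P(A = A') =", max_coupling_prob(p, q))   # -> 0.8
```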

  3. The contents and distributions of cadmium, mercury, and lead in Usnea antarctica lichens from Solorina Valley, James Ross Island (Antarctica).

    PubMed

    Zvěřina, Ondřej; Coufalík, Pavel; Barták, Miloš; Petrov, Michal; Komárek, Josef

    2017-12-11

    Lichens are efficient and cost-effective biomonitors of the environment. Their geographic distribution, together with their slow growth rate, enables investigation of the deposition patterns of various elements and substances. In this research, levels of cadmium, lead, and mercury in Usnea antarctica lichens in the area of James Ross Island, Antarctica, were investigated. The lichens were microwave-digested, and the metals were determined by means of atomic absorption spectrometry with a graphite furnace and a direct mercury analyzer. Median total contents of Cd, Hg, and Pb were 0.04, 0.47, and 1.6 mg/kg in whole lichens, respectively. The bottom-up distributions of these metals in the fruticose lichen thalli were investigated, and it was revealed that the accumulation patterns for mercury and lead were opposite to that for cadmium. The reason for this phenomenon may lie in the inner structure of the thalli. The total contents of metals were comparable with those published for other unpolluted areas of maritime Antarctica. However, this finding was not expected for mercury, since the sampling locality was close to an area with some of the highest mercury contents published for Antarctic lichens. In short, lichens proved their usability as biological monitors, even in harsh conditions. However, the findings emphasize the need to take into account the distributions of elements both in the environment and in the lichen itself.

  4. A prototype method for diagnosing high ice water content probability using satellite imager data

    NASA Astrophysics Data System (ADS)

    Yost, Christopher R.; Bedka, Kristopher M.; Minnis, Patrick; Nguyen, Louis; Strapp, J. Walter; Palikonda, Rabindra; Khlopenkov, Konstantin; Spangenberg, Douglas; Smith, William L., Jr.; Protat, Alain; Delanoe, Julien

    2018-03-01

    Recent studies have found that ingestion of high mass concentrations of ice particles in regions of deep convective storms, with radar reflectivity considered safe for aircraft penetration, can adversely impact aircraft engine performance. Previous aviation industry studies have used the term high ice water content (HIWC) to define such conditions. Three airborne field campaigns were conducted in 2014 and 2015 to better understand how HIWC is distributed in deep convection, both as a function of altitude and proximity to convective updraft regions, and to facilitate development of new methods for detecting HIWC conditions, in addition to many other research and regulatory goals. This paper describes a prototype method for detecting HIWC conditions using geostationary (GEO) satellite imager data coupled with in situ total water content (TWC) observations collected during the flight campaigns. Three satellite-derived parameters were determined to be most useful for determining HIWC probability: (1) the horizontal proximity of the aircraft to the nearest overshooting convective updraft or textured anvil cloud, (2) tropopause-relative infrared brightness temperature, and (3) daytime-only cloud optical depth. Statistical fits between collocated TWC and GEO satellite parameters were used to determine the membership functions for the fuzzy logic derivation of HIWC probability. The products were demonstrated using data from several campaign flights and validated using a subset of the satellite-aircraft collocation database. The daytime HIWC probability was found to agree quite well with TWC time trends and identified extreme TWC events with high probability. Discrimination of HIWC was more challenging at night with IR-only information. The products show the greatest capability for discriminating TWC ≥ 0.5 g m-3. Product validation remains challenging due to vertical TWC uncertainties and the typically coarse spatio-temporal resolution of the GEO data.

  5. Evidence-Based Medicine as a Tool for Undergraduate Probability and Statistics Education.

    PubMed

    Masel, J; Humphrey, P T; Blackburn, B; Levine, J A

    2015-01-01

    Most students have difficulty reasoning about chance events, and misconceptions regarding probability can persist or even strengthen following traditional instruction. Many biostatistics classes sidestep this problem by prioritizing exploratory data analysis over probability. However, probability itself, in addition to statistics, is essential both to the biology curriculum and to informed decision making in daily life. One area in which probability is particularly important is medicine. Given the preponderance of pre-health students, in addition to more general interest in medicine, we capitalized on students' intrinsic motivation in this area to teach both probability and statistics. We use the randomized controlled trial as the centerpiece of the course, because it exemplifies the most salient features of the scientific method and the application of critical thinking to medicine. The other two pillars of the course are biomedical applications of Bayes' theorem and science-and-society content. Backward design from these three overarching aims was used to select appropriate probability and statistics content, with a focus on eliciting and countering previously documented misconceptions in their medical context. Pretest/posttest assessments using the Quantitative Reasoning Quotient and Attitudes Toward Statistics instruments are positive, bucking several negative trends previously reported in statistics education. © 2015 J. Masel et al. Distributed by The American Society for Cell Biology under an Attribution-Noncommercial-Share Alike 3.0 Unported Creative Commons License (http://creativecommons.org/licenses/by-nc-sa/3.0).
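
The kind of Bayes'-theorem exercise such a course builds on is the posterior probability of disease after a positive diagnostic test; the numbers below are illustrative, not taken from the course.

```python
prevalence = 0.01      # P(disease)
sensitivity = 0.90     # P(test+ | disease)
false_positive = 0.05  # P(test+ | no disease)

p_positive = sensitivity * prevalence + false_positive * (1 - prevalence)
posterior = sensitivity * prevalence / p_positive
print(f"P(disease | test+) = {posterior:.3f}")   # ~0.154 despite a 90%-sensitive test
```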

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simonen, E.P.; Johnson, K.I.; Simonen, F.A.

    The Vessel Integrity Simulation Analysis (VISA-II) code was developed to allow calculations of the failure probability of a reactor pressure vessel subject to defined pressure/temperature transients. A version of the code, revised by Pacific Northwest Laboratory for the US Nuclear Regulatory Commission, was used to evaluate the sensitivities of the calculated through-wall flaw probability to material, flaw and calculational assumptions. Probabilities were more sensitive to flaw assumptions than to material or calculational assumptions. Alternative flaw assumptions changed the probabilities by one to two orders of magnitude, whereas alternative material assumptions typically changed the probabilities by a factor of two or less. The flaw sensitivities examined were flaw shape, flaw through-wall position and flaw inspection. Material property sensitivities included the assumed distributions of copper content and fracture toughness. The methods of modeling flaw propagation that were evaluated included arrest/reinitiation toughness correlations, multiple toughness values along the length of a flaw, the flaw jump distance for each computer simulation, and the added error in estimating irradiated properties caused by the trend curve correlation error.

  7. Behavior of suspended particles in the Changjiang Estuary: Size distribution and trace metal contamination.

    PubMed

    Yao, Qingzhen; Wang, Xiaojing; Jian, Huimin; Chen, Hongtao; Yu, Zhigang

    2016-02-15

    Suspended particulate matter (SPM) samples were collected along a salinity gradient in the Changjiang Estuary in June 2011. A custom-built water elutriation apparatus was used to separate the suspended sediments into five size fractions. The results indicated that Cr and Pb originated from natural weathering processes, whereas Cu, Zn, and Cd originated from other sources. The distribution of most trace metals in different particle sizes increased with decreasing particle size. The contents of Fe/Mn and organic matter were confirmed to play an important role in increasing the level of heavy metal contents. The Cu, Pb, Zn, and Cd contents varied significantly with increasing salinity in the medium-low salinity region, thus indicating the release of Cu, Pb, Zn, and Cd particles. Thus, the transfer of polluted fine particles into the open sea is probably accompanied by release of pollutants into the dissolved compartment, thereby amplifying the potential harmful effects to marine organisms. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. DETECTION OF FLUX EMERGENCE, SPLITTING, MERGING, AND CANCELLATION OF NETWORK FIELD. I. SPLITTING AND MERGING

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iida, Y.; Yokoyama, T.; Hagenaar, H. J.

    2012-06-20

    Frequencies of magnetic patch processes on the supergranule boundary, namely, flux emergence, splitting, merging, and cancellation, are investigated through automatic detection. We use a set of line-of-sight magnetograms taken by the Solar Optical Telescope (SOT) on board the Hinode satellite. We found 1636 positive patches and 1637 negative patches in the data set, whose time duration is 3.5 hr and field of view is 112'' × 112''. The total numbers of magnetic processes are as follows: 493 positive and 482 negative splittings, 536 positive and 535 negative mergings, 86 cancellations, and 3 emergences. The total numbers of emergence and cancellation are significantly smaller than those of splitting and merging. Further, the frequency dependence of the merging and splitting processes on the flux content is investigated. Merging has a weak dependence on the flux content, with a power-law index of only 0.28. The timescale for splitting is found to be independent of the parent flux content before splitting, corresponding to ≈33 minutes. It is also found that patches split into any flux contents with the same probability. This splitting has a power-law distribution of the flux content with an index of -2 as a time-independent solution. These results support the view that the frequency distribution of the flux content in the analyzed flux range is rapidly maintained by merging and splitting, namely, surface processes. We suggest a model for the frequency distributions of cancellation and emergence based on this idea.

  9. Developing a Questionnaire to Assess the Probability Content Knowledge of Prospective Primary School Teachers

    ERIC Educational Resources Information Center

    Gómez-Torres, Emilse; Batanero, Carmen; Díaz, Carmen; Contreras, José Miguel

    2016-01-01

    In this paper we describe the development of a questionnaire designed to assess the probability content knowledge of prospective primary school teachers. Three components of mathematical knowledge for teaching and three different meanings of probability (classical, frequentist and subjective) are considered. The questionnaire content is based on…

  10. The Contextuality Loophole is Fatal for the Derivation of Bell Inequalities: Reply to a Comment by I. Schmelzer

    NASA Astrophysics Data System (ADS)

    Nieuwenhuizen, Theodorus M.; Kupczynski, Marian

    2017-02-01

    Ilya Schmelzer recently wrote: "Nieuwenhuizen argued that there exists some 'contextuality loophole' in Bell's theorem. This claim is unjustified." It is made clear that this criticism arose from attaching a meaning to the title and the content of the paper different from the one intended by Nieuwenhuizen. "Contextuality loophole" means only that if the supplementary parameters describing measuring instruments are correctly introduced, Bell and Bell-type inequalities may not be proven. It is also stressed that a hidden variable model suffers from a "contextuality loophole" if it tries to describe different sets of incompatible experiments using a unique probability space and a unique joint probability distribution.

  11. Risk of false decision on conformity of a multicomponent material when test results of the components' content are correlated.

    PubMed

    Kuselman, Ilya; Pennecchi, Francesca R; da Silva, Ricardo J N B; Hibbert, D Brynn

    2017-11-01

    The probability of a false decision on the conformity of a multicomponent material due to measurement uncertainty is discussed for the case where test results are correlated. Specification limits on the components' content of such a material generate a multivariate specification interval/domain. When the true values of the components' content and the corresponding test results are modelled by multivariate distributions (e.g. multivariate normal distributions), the total global risk of a false decision on the material's conformity can be evaluated by integrating their joint probability density function; no transformation of the raw data is required. The total specific risk can be evaluated as the joint posterior cumulative probability that the true values of a specific batch or lot lie outside the multivariate specification domain when the vector of test results obtained for the lot is inside this domain. It was shown, using a case study of four components under control in a drug, that the influence of correlation on the risk value is not easily predictable. To assess this influence, the evaluated total risk values were compared with those calculated for independent test results, and also with those assuming much stronger correlation than that observed. While the observed statistically significant correlation did not lead to a visible difference in the total risk values in comparison with independent test results, stronger correlation among the variables caused the total risk either to decrease or to increase, depending on the actual values of the test results. Copyright © 2017 Elsevier B.V. All rights reserved.
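
A Monte Carlo sketch of the total global risk integral described above: the probability that the true content vector lies outside the specification domain while the correlated test results fall inside it. The two-component means, limits, and covariances are invented for illustration, not taken from the case study.

```python
import numpy as np

rng = np.random.default_rng(11)
n = 200_000
mu = np.array([5.0, 10.0])                            # true-content means
cov_true = np.array([[0.04, 0.01], [0.01, 0.09]])     # covariance of true values
cov_meas = np.array([[0.02, 0.008], [0.008, 0.03]])   # covariance of measurement errors
lo, hi = np.array([4.5, 9.0]), np.array([5.5, 11.0])  # specification limits

true_vals = rng.multivariate_normal(mu, cov_true, size=n)
tests = true_vals + rng.multivariate_normal([0.0, 0.0], cov_meas, size=n)

def inside(x):
    return np.all((x >= lo) & (x <= hi), axis=1)

# Fraction of lots that truly fail specification but test as conforming.
global_risk = np.mean(~inside(true_vals) & inside(tests))
print("estimated total global risk:", global_risk)
```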

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Colvin, Robert A.; Jin, Qiaoling; Lai, Barry

    Increasing evidence suggests that metal dyshomeostasis plays an important role in human neurodegenerative diseases. Although distinctive metal distributions are described for mature hippocampus and cortex, much less is known about metal levels and intracellular distribution in individual hippocampal neuronal somata. To solve this problem, we conducted quantitative metal analyses utilizing synchrotron radiation X-Ray fluorescence on frozen hydrated primary cultured neurons derived from rat embryonic cortex (CTX) and two regions of the hippocampus: dentate gyrus (DG) and CA1. Comparing average metal contents showed that the most abundant metals were calcium, iron, and zinc, whereas metals such as copper and manganese were less than 10% of zinc. Average metal contents were generally similar when compared across neurons cultured from CTX, DG, and CA1, except for manganese that was larger in CA1. However, each metal showed a characteristic spatial distribution in individual neuronal somata. Zinc was uniformly distributed throughout the cytosol, with no evidence for the existence of previously identified zinc-enriched organelles, zincosomes. Calcium showed a peri-nuclear distribution consistent with accumulation in endoplasmic reticulum and/or mitochondria. Iron showed 2-3 distinct highly concentrated puncta only in peri-nuclear locations. Notwithstanding the small sample size, these analyses demonstrate that primary cultured neurons show characteristic metal signatures. The iron puncta probably represent iron-accumulating organelles, siderosomes. Thus, the metal distributions observed in mature brain structures are likely the result of both intrinsic neuronal factors that control cellular metal content and extrinsic factors related to the synaptic organization, function, and contacts formed and maintained in each region.

  13. Methanol and ethanol conversion into hydrocarbons over H-ZSM-5 catalyst

    NASA Astrophysics Data System (ADS)

    Hamieh, S.; Canaff, C.; Tayeb, K. Ben; Tarighi, M.; Maury, S.; Vezin, H.; Pouilloux, Y.; Pinard, L.

    2015-07-01

    Ethanol and methanol are converted over H-ZSM-5 zeolite at 623 K and 3.0 MPa into identical hydrocarbons (paraffins, olefins and aromatics), and moreover with identical selectivities. The distributions of olefins and paraffins follow the Flory distribution with a growth probability of 0.53. Regardless of the alcohol, the catalyst lifetime and the selectivity to C3+ hydrocarbons are high in spite of a substantial coke content. The coke, which poisons the Brønsted acid sites without blocking their access, is composed in part of radical polyalkylaromatics. The addition of hydroquinone, an inhibitor of radicals, to the feed provokes immediate catalyst deactivation.
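
The Flory (most-probable) distribution quoted here assigns a mole fraction (1 − alpha)·alpha^(n−1) to chains of length n, with growth probability alpha = 0.53:

```python
alpha = 0.53  # growth probability reported in the abstract
for n in range(1, 8):
    frac = (1 - alpha) * alpha ** (n - 1)     # mole fraction of chain length n
    print(f"n={n}: mole fraction = {frac:.3f}")
```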

  14. Identification of cloud fields by the nonparametric algorithm of pattern recognition from normalized video data recorded with the AVHRR instrument

    NASA Astrophysics Data System (ADS)

    Protasov, Konstantin T.; Pushkareva, Tatyana Y.; Artamonov, Evgeny S.

    2002-02-01

    The problem of cloud field recognition from NOAA satellite data is urgent not only for meteorological problems but also for resource-ecological monitoring of the Earth's underlying surface, associated with the detection of thunderstorm clouds, estimation of the liquid water content of clouds and the moisture of the soil, the degree of fire hazard, etc. To solve these problems, we used AVHRR/NOAA video data that regularly display the situation in the territory. The complexity and extremely nonstationary character of the problems to be solved call for the use of information from all spectral channels, the mathematical apparatus of testing statistical hypotheses, and methods of pattern recognition and identification of the informative parameters. For this class of detection and pattern recognition problems, the average risk functional is a natural criterion for the quality and the information content of the synthesized decision rules. In this case, to solve efficiently the problem of identifying cloud field types, the informative parameters must be determined by minimization of this functional. Since the conditional probability density functions, representing mathematical models of stochastic patterns, are unknown, the problem arises of nonparametric reconstruction of distributions from the learning samples. To this end, we used nonparametric estimates of distributions with the modified Epanechnikov kernel. The unknown parameters of these distributions were determined by minimization of the risk functional, which for the learning sample was replaced by the empirical risk. After the conditional probability density functions had been reconstructed for the examined hypotheses, the cloudiness type was identified using the Bayes decision rule.
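
A sketch of the nonparametric ingredient the abstract builds on: a kernel density estimate with the textbook (unmodified) Epanechnikov kernel. The paper's modification and its risk-minimizing parameter selection are not reproduced here.

```python
import numpy as np

def epanechnikov_kde(x, data, h):
    # Kernel density estimate with K(u) = 0.75*(1 - u^2) for |u| <= 1.
    u = (x[:, None] - data[None, :]) / h
    k = 0.75 * (1.0 - u ** 2) * (np.abs(u) <= 1.0)
    return k.sum(axis=1) / (len(data) * h)

rng = np.random.default_rng(5)
data = rng.normal(0.0, 1.0, size=500)
grid = np.linspace(-4.0, 4.0, 9)
print(np.round(epanechnikov_kde(grid, data, h=0.5), 3))
```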

  15. Improving Constraints on Climate System Properties withAdditional Data and New Statistical and Sampling Methods

    NASA Astrophysics Data System (ADS)

    Forest, C. E.; Libardoni, A. G.; Sokolov, A. P.; Monier, E.

    2017-12-01

    We use the updated MIT Earth System Model (MESM) to derive the joint probability distribution function for the equilibrium climate sensitivity (S), an effective heat diffusivity (Kv), and the net aerosol forcing (Faer). Using a new 1800-member ensemble of MESM runs, we derive PDFs by comparing model outputs against historical observations of surface temperature and global mean ocean heat content. We focus on how changes in (i) the MESM model, (ii) recent surface temperature and ocean heat content observations, and (iii) estimates of internal climate variability all contribute to the uncertainties. We show that estimates of S increase and that Faer becomes less negative. These shifts result partly from new model forcing inputs but also from including recent temperature records that lead to higher values of S and Kv. We show that the parameter distributions are sensitive to the internal variability of the climate system. Considering these factors, we derive our best estimate of the joint probability distribution of the climate system properties. We estimate the 90-percent confidence intervals as 2.7-5.4 °C with a mode of 3.5 °C for climate sensitivity, 1.9-23.0 cm2 s-1 with a mode of 4.41 cm2 s-1 for Kv, and -0.4 to -0.04 W m-2 with a mode of -0.25 W m-2 for Faer. Lastly, we estimate TCR to be between 1.4 and 2.1 °C with a mode of 1.8 °C.

  16. Gluten-containing grains skew gluten assessment in oats due to sample grind non-homogeneity.

    PubMed

    Fritz, Ronald D; Chen, Yumin; Contreras, Veronica

    2017-02-01

    Oats are easily contaminated with gluten-rich kernels of wheat, rye and barley. These contaminants are like gluten 'pills', shown here to skew gluten analysis results. Using the R-Biopharm R5 ELISA, we quantified gluten in gluten-free oatmeal servings from an in-market survey. For samples with a 5-20 ppm reading on a first test, replicate analyses provided results ranging from <5 ppm to >160 ppm. This suggests sample grinding may inadequately disperse gluten to allow a single accurate gluten assessment. To ascertain this, and to characterize the distribution of 0.25-g gluten test results for kernel-contaminated oats, twelve 50-g samples of pure oats, each spiked with a wheat kernel, were tested; the 0.25-g test results followed log-normal-like distributions. With this, we estimate probabilities of mis-assessment for a 'single measure/sample' relative to the <20 ppm regulatory threshold, and derive an equation relating the probability of mis-assessment to the sample's average gluten content. Copyright © 2016 Elsevier Ltd. All rights reserved.
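
Under a log-normal model of the kind the abstract reports, the chance that a single 0.25-g test lands on the wrong side of the 20 ppm threshold can be read off the log-normal CDF. The log-scale spread sigma below is an assumed value, not the paper's fitted parameter.

```python
import numpy as np
from scipy import stats

threshold = 20.0   # ppm regulatory limit
sigma = 1.0        # assumed log-scale standard deviation of single-test readings

for mean_ppm in (5.0, 15.0, 40.0):
    mu = np.log(mean_ppm) - sigma ** 2 / 2          # log-normal with this mean
    p_above = stats.lognorm.sf(threshold, s=sigma, scale=np.exp(mu))
    mis = p_above if mean_ppm < threshold else 1.0 - p_above
    kind = "false fail" if mean_ppm < threshold else "false pass"
    print(f"sample mean {mean_ppm:5.1f} ppm -> P({kind}) = {mis:.3f}")
```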

  17. Generalized spherical and simplicial coordinates

    NASA Astrophysics Data System (ADS)

    Richter, Wolf-Dieter

    2007-12-01

    Elementary trigonometric quantities are defined in l_{2,p} analogously to those in l_{2,2}; the sine and cosine functions are generalized for each p > 0 as functions sin_p and cos_p such that they satisfy the basic equation cos_p(φ)^p + sin_p(φ)^p = 1. The p-generalized radius coordinate of a point ξ ∈ R^n is defined for each p > 0 as r_{n,p}(ξ) = (|ξ_1|^p + … + |ξ_n|^p)^(1/p). On combining these quantities, l_{n,p}-spherical coordinates are defined. It is shown that these coordinates are closely related to l_{n,p}-simplicial coordinates. The Jacobians of these generalized coordinate transformations are derived. Applications and interpretations from analysis deal especially with the definition of a generalized surface content on l_{n,p}-spheres, which is closely related to a modified co-area formula and an extension of Cavalieri's and Torricelli's method of indivisibles, and with differential equations. Applications from probability theory deal especially with a geometric interpretation of the uniform probability distribution on the l_{n,p}-sphere and with the derivation of certain generalized statistical distributions.
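
A numeric check of the generalized identity, using the normalization construction cos_p(φ) = cos(φ)/N_p(φ) and sin_p(φ) = sin(φ)/N_p(φ) with N_p(φ) = (|cos φ|^p + |sin φ|^p)^(1/p); this construction is an assumption consistent with the abstract, not a verbatim reproduction of the paper's definitions.

```python
import numpy as np

def cos_sin_p(phi, p):
    norm = (np.abs(np.cos(phi)) ** p + np.abs(np.sin(phi)) ** p) ** (1.0 / p)
    return np.cos(phi) / norm, np.sin(phi) / norm

def radius_p(xi, p):
    # p-generalized radius of a point xi in R^n
    return np.sum(np.abs(xi) ** p) ** (1.0 / p)

phi, p = 0.7, 3.0
c, s = cos_sin_p(phi, p)
print(abs(c) ** p + abs(s) ** p)               # -> 1.0 up to rounding
print(radius_p(np.array([1.0, -2.0, 0.5]), p)) # r_{3,3} of an example point
```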

  18. A Bayesian approach to microwave precipitation profile retrieval

    NASA Technical Reports Server (NTRS)

    Evans, K. Franklin; Turk, Joseph; Wong, Takmeng; Stephens, Graeme L.

    1995-01-01

    A multichannel passive microwave precipitation retrieval algorithm is developed. Bayes' theorem is used to combine statistical information from numerical cloud models with forward radiative transfer modeling. A multivariate lognormal prior probability distribution contains the covariance information about hydrometeor distributions that resolves the nonuniqueness inherent in the inversion process. Hydrometeor profiles are retrieved by maximizing the posterior probability density for each vector of observations. The hydrometeor profile retrieval method is tested with data from the Advanced Microwave Precipitation Radiometer (10, 19, 37, and 85 GHz) for convection over ocean and land in Florida. CP-2 multiparameter radar data are used to verify the retrieved profiles. The results show that the method can retrieve approximate hydrometeor profiles, with larger errors over land than over water. There is considerably greater accuracy in the retrieval of integrated hydrometeor contents than of profiles. Many of the retrieval errors are traced to problems with the cloud model microphysical information, and future improvements to the algorithm are suggested.

  19. iSEDfit: Bayesian spectral energy distribution modeling of galaxies

    NASA Astrophysics Data System (ADS)

    Moustakas, John

    2017-08-01

    iSEDfit uses Bayesian inference to extract the physical properties of galaxies from their observed broadband photometric spectral energy distribution (SED). In its default mode, the inputs to iSEDfit are the measured photometry (fluxes and corresponding inverse variances) and a measurement of the galaxy redshift. Alternatively, iSEDfit can be used to estimate photometric redshifts from the input photometry alone. After the priors have been specified, iSEDfit calculates the marginalized posterior probability distributions for the physical parameters of interest, including the stellar mass, star-formation rate, dust content, star formation history, and stellar metallicity. iSEDfit also optionally computes K-corrections and produces multiple "quality assurance" (QA) plots at each stage of the modeling procedure to aid in the interpretation of the prior parameter choices and subsequent fitting results. The software is distributed as part of the impro IDL suite.

  20. Mercury profiles in sediment from the marginal high of Arabian Sea: an indicator of increasing anthropogenic Hg input.

    PubMed

    Chakraborty, Parthasarathi; Vudamala, Krushna; Chennuri, Kartheek; Armoury, Kazip; Linsy, P; Ramteke, Darwin; Sebastian, Tyson; Jayachandran, Saranya; Naik, Chandan; Naik, Richita; Nath, B Nagender

    2016-05-01

    The distribution and speciation of total Hg were determined in two sediment cores collected from the western continental marginal high of India. Total Hg content in the sediment was found to increase gradually (by approximately a factor of two) towards the surface in both cores. Hg was found to be preferentially bound to sulfide under anoxic conditions; however, redox-mediated reactions in the upper part of the cores also influenced the total Hg content. This study suggests that a probable increase in authigenic and allogenic Hg deposition accounts for the increasing Hg concentration in the surface sediments of the study area.

  1. On the distribution of interspecies correlation for Markov models of character evolution on Yule trees.

    PubMed

    Mulder, Willem H; Crawford, Forrest W

    2015-01-07

    Efforts to reconstruct phylogenetic trees and understand evolutionary processes depend fundamentally on stochastic models of speciation and mutation. The simplest continuous-time model for speciation in phylogenetic trees is the Yule process, in which new species are "born" from existing lineages at a constant rate. Recent work has illuminated some of the structural properties of Yule trees, but it remains mostly unknown how these properties affect sequence and trait patterns observed at the tips of the phylogenetic tree. Understanding the interplay between speciation and mutation under simple models of evolution is essential for deriving valid phylogenetic inference methods and gives insight into the optimal design of phylogenetic studies. In this work, we derive the probability distribution of interspecies covariance under Brownian motion and Ornstein-Uhlenbeck models of phenotypic change on a Yule tree. We compute the probability distribution of the number of mutations shared between two randomly chosen taxa in a Yule tree under discrete Markov mutation models. Our results suggest summary measures of phylogenetic information content, illuminate the correlation between site patterns in sequences or traits of related organisms, and provide heuristics for experimental design and reconstruction of phylogenetic trees. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. Probability analysis of multiple-tank-car release incidents in railway hazardous materials transportation.

    PubMed

    Liu, Xiang; Saat, Mohd Rapik; Barkan, Christopher P L

    2014-07-15

    Railroads play a key role in the transportation of hazardous materials in North America. Rail transport differs from highway transport in several aspects, an important one being that rail transport involves trains in which many railcars carrying hazardous materials travel together. In contrast to truck accidents, a train accident may involve multiple hazardous materials cars derailing and releasing their contents, with consequently greater potential impact on human health, property and the environment. In this paper, a probabilistic model is developed to estimate the probability distribution of the number of tank cars releasing contents in a train derailment. Principal operational characteristics considered include train length, derailment speed, accident cause, position of the first car derailed, number and placement of tank cars in the train, and tank car safety design. The effects of train speed, tank car safety design and tank car position in the train on the number of cars releasing contents in a derailment were evaluated. This research provides insights regarding the circumstances affecting multiple-tank-car release incidents and potential strategies to reduce their occurrence. The model can be incorporated into a larger risk management framework to enable better local, regional and national safety management of hazardous materials transportation by rail. Copyright © 2014 Elsevier B.V. All rights reserved.
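
    A toy Monte Carlo version of such a model is sketched below; the train makeup, derailment-length distribution, and conditional release probability are illustrative assumptions rather than the paper's calibrated inputs.

    # Distribution of the number of tank cars releasing in a derailment.
    import numpy as np

    rng = np.random.default_rng(1)
    train_len = 100
    tank_positions = set(range(40, 60))   # 20 tank cars placed mid-train
    p_release = 0.2                       # P(release | car derailed), assumed

    def releases_in_one_derailment():
        first = rng.integers(1, train_len + 1)   # position of first car derailed
        n_derailed = 1 + rng.poisson(10)         # toy derailment-length model
        derailed = set(range(first, min(first + n_derailed, train_len + 1)))
        return rng.binomial(len(derailed & tank_positions), p_release)

    counts = np.bincount([releases_in_one_derailment() for _ in range(100_000)])
    print("P(N cars release):", (counts / counts.sum()).round(4))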

  3. [Analysis of X-Ray Fluorescence Spectroscopy and Plasma Mass Spectrometry of Pangxidong Composite Granitoid Pluton and Its Implications for Magmatic Differentiation].

    PubMed

    Zeng, Chang-yu; Ding, Ru-xin; Li, Hong-zhong; Zhou, Yong-zhang; Niu, Jia; Zhang, Jie-tang

    2015-11-01

    The Pangxidong composite granitoid pluton is located at the southwestern margin of the Yunkai massif. The metamorphic grade of this pluton increases from outside to inside; that is, banded-augen granitic gneisses, gneissoid granites and granites are distributed in order from edge to core. X-ray fluorescence spectroscopy and plasma mass spectrometry were used to study the geochemical characteristics of the three rock types. The results show that all three rock types are peraluminous and that their major-element and rare earth element contents change gradually. From granitic gneisses to granites, the contents of Al₂O₃, CaO, MgO, TiO₂, total rare earth elements and light rare earth elements increase, while the contents of SiO₂ and heavy rare earth elements decrease. This suggests that a genetic relationship exists among the granitic gneisses, gneissoid granites and granites formed during the multi-stage tectonic evolution. Furthermore, the remelting of metamorphosed supracrustal rocks in the Yunkai massif is probably an important cause of the formation of these granitoid rocks. The evolutionary mechanism is probably that SiO₂ and heavy rare earth elements were melted out of the protolith and gradually enriched upward, while Al₂O₃, CaO, MgO, TiO₂ and light rare earth elements were enriched downward.

  4. Kolmogorov-Smirnov test for spatially correlated data

    USGS Publications Warehouse

    Olea, R.A.; Pawlowsky-Glahn, V.

    2009-01-01

    The Kolmogorov-Smirnov test is a convenient method for investigating whether two underlying univariate probability distributions can be regarded as indistinguishable from each other or whether an underlying probability distribution differs from a hypothesized distribution. Application of the test requires that the sample be unbiased and the outcomes be independent and identically distributed, conditions that are violated to varying degrees by spatially continuous attributes, such as topographical elevation. A generalized form of the bootstrap method is used here for the purpose of modeling the distribution of the statistic D of the Kolmogorov-Smirnov test. The innovation is in the resampling, which in the traditional formulation of bootstrap is done by drawing from the empirical sample with replacement, presuming independence. The generalization consists of preparing resamplings with the same spatial correlation as the empirical sample. This is accomplished by reading the values of unconditional stochastic realizations at the sampling locations, realizations that are generated by simulated annealing. The new approach was tested on two empirical samples taken from an exhaustive sample closely following a lognormal distribution. One sample was a regular, unbiased sample, while the other was a clustered, preferential sample that had to be preprocessed. Our results show that the p-value for the spatially correlated case is always larger than the p-value of the statistic in the absence of spatial correlation, which is in agreement with the fact that the information content of an uncorrelated sample is larger than that of a spatially correlated sample of the same size. © Springer-Verlag 2008.
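
    The baseline (independence) bootstrap of the statistic D is sketched below; the paper's innovation replaces this independent resampling with resamples that reproduce the empirical spatial correlation via simulated annealing, which is beyond this sketch.

    # Plain two-sample bootstrap of the Kolmogorov-Smirnov statistic D.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(2)
    sample = rng.lognormal(mean=0.0, sigma=1.0, size=200)
    reference = rng.lognormal(mean=0.1, sigma=1.0, size=200)

    d_obs = stats.ks_2samp(sample, reference).statistic
    pooled = np.concatenate([sample, reference])

    d_boot = []
    for _ in range(2000):
        res = rng.choice(pooled, size=pooled.size, replace=True)
        d_boot.append(stats.ks_2samp(res[:200], res[200:]).statistic)

    print("D =", round(d_obs, 3), " bootstrap p =", np.mean(np.array(d_boot) >= d_obs))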

  5. Metocean design parameter estimation for fixed platform based on copula functions

    NASA Astrophysics Data System (ADS)

    Zhai, Jinjin; Yin, Qilin; Dong, Sheng

    2017-08-01

    Considering the dependence among wave height, wind speed, and current velocity, we construct novel trivariate joint probability distributions via Archimedean copula functions. Thirty years of wave height, wind speed, and current velocity data in the Bohai Sea are hindcast and sampled for a case study. Four distributions, namely the Gumbel, lognormal, Weibull, and Pearson Type III distributions, are candidate models for the marginal distributions of wave height, wind speed, and current velocity; the Pearson Type III distribution is selected as the optimal model. Bivariate and trivariate probability distributions of these environmental conditions are established based on four bivariate and trivariate Archimedean copulas, namely the Clayton, Frank, Gumbel-Hougaard, and Ali-Mikhail-Haq copulas. These joint probability models make full use of the marginal information and the dependence among the three variables. The design return values of the three variables can be obtained by three methods: univariate probability, conditional probability, and joint probability. The joint return periods of different load combinations are estimated by the proposed models, and platform responses (including base shear, overturning moment, and deck displacement) are further calculated. For the same return period, the design values of wave height, wind speed, and current velocity obtained by the conditional and joint probability models are much smaller than those obtained by univariate probability. By accounting for the dependence among variables, the multivariate probability distributions provide design parameters closer to the actual sea state for ocean platform design.
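
    For intuition, a trivariate Gumbel-Hougaard copula and a joint return period can be written directly from the copula's closed form; the dependence parameter and marginal design levels below are placeholders, not the fitted values.

    # Gumbel-Hougaard copula:
    # C(u,v,w) = exp(-[(-ln u)^t + (-ln v)^t + (-ln w)^t]^(1/t)), t >= 1.
    import numpy as np

    def gumbel_hougaard(u, v, w, theta=2.0):
        s = (-np.log(u))**theta + (-np.log(v))**theta + (-np.log(w))**theta
        return np.exp(-s**(1.0 / theta))

    # Marginal non-exceedance probabilities of a candidate design state
    u = v = w = 0.98                      # e.g. 50-year marginal levels
    p_joint = gumbel_hougaard(u, v, w)    # P(H<=h, W<=w, V<=v)
    T_or = 1.0 / (1.0 - p_joint)          # return period of at least one exceedance
    print("joint ('OR') return period in years:", round(T_or, 1))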

  6. Models of multidimensional discrete distribution of probabilities of random variables in information systems

    NASA Astrophysics Data System (ADS)

    Gromov, Yu Yu; Minin, Yu V.; Ivanova, O. G.; Morozova, O. N.

    2018-03-01

    Multidimensional discrete probability distributions of independent random variables are derived; their one-dimensional marginals are widely used in probability theory. Generating functions of these multidimensional distributions are also obtained.

  7. Lichens as bioindicators of aerial fallout of heavy metals in Zaria, Nigeria

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kapu, M.M.; Ipaye, M.M.; Ega, R.A.I.

    1991-09-01

    Lichens and other epiphytic cryptogams possess efficient ion-exchange mechanisms which enable many species to accumulate airborne metals and which probably contribute to their tolerating metals at concentrations high enough to cause death to other plant species. A direct relationship between the distribution pattern of lichens and the trace metal content of the surrounding air has been demonstrated. The present study used lichens to assess the aerial fallout of heavy metals from traffic in Zaria, northern Nigeria.

  8. Determination of subsurface fluid contents at a crude-oil spill site

    USGS Publications Warehouse

    Hess, K.M.; Herkelrath, W.N.; Essaid, H.I.

    1992-01-01

    Measurement of the fluid-content distribution at sites contaminated by immiscible fluids, including crude oil, is needed to better understand the movement of these fluids in the subsurface and to provide data to calibrate and verify numerical models and geophysical methods. A laboratory method was used to quantify the fluid contents of 146 core sections retrieved from boreholes aligned along a 120-m longitudinal transect at a crude-oil spill site near Bemidji, Minnesota, U.S.A. The 47-mm-diameter, minimally disturbed cores spanned a 4-m vertical interval contaminated by oil. Cores were frozen on site in a dry ice-alcohol bath to prevent redistribution and loss of fluids while sectioning the cores. We gravimetrically determined oil and water contents using a two-step method: (1) samples were slurried and the oil was removed by absorption onto strips of hydrophobic porous polyethylene (PPE); and (2) the samples were oven-dried to remove the water. The resulting data show sharp vertical gradients in the water and oil contents and a clearly defined oil body. The subsurface distribution is complex and appears to be influenced by sediment heterogeneities and water-table fluctuations. The center of the oil body has depressed the water-saturated zone boundary, and the oil is migrating laterally within the capillary fringe. The oil contents are as high as 0.3 cm³ cm⁻³, which indicates that oil is probably still mobile 10 years after the spill occurred. The thickness of oil measured in wells suggests that accumulated thickness in wells is a poor indicator of the actual distribution of oil in the subsurface. Several possible sources of error are identified with the field and laboratory methods. An error analysis indicates that adsorption of water and sediment into the PPE adds as much as 4% to the measured oil masses and that uncertainties in the calculated sample volume and the assumed oil density introduce an additional ±3% error when the masses are converted to fluid contents.

  9. Stable laws and cosmic ray physics

    NASA Astrophysics Data System (ADS)

    Genolini, Y.; Salati, P.; Serpico, P. D.; Taillet, R.

    2017-04-01

    Context. In the new "precision era" for cosmic ray astrophysics, scientists making theoretical predictions cannot content themselves with average trends, but need to correctly take into account intrinsic uncertainties. The space-time discreteness of the cosmic ray sources, together with a substantial ignorance of their precise epochs and locations (with the possible exception of the most recent and close ones) play an important role in this sense. Aims: We elaborate a statistical theory to deal with this problem, relating the composite probability P(Ψ) to obtain a flux Ψ at the Earth and the single-source probability p(ψ) to contribute with a flux ψ. The main difficulty arises from the fact that p(ψ) is a "heavy tail" distribution, characterized by power-law or broken power-law behavior up to very large fluxes, for which the central limit theorem does not hold, and leading to distributions different from Gaussian. The functional form of the distribution for the aggregated flux is nonetheless unchanged by its own convolution, that is, it belongs to the so-called stable laws class. Methods: We analytically discuss the regime of validity of the stable laws associated with the distributions arising in cosmic ray astrophysics, as well as the limitations to the treatment imposed by causal considerations and partial source catalog knowledge. We validate our results with extensive Monte Carlo simulations, for different regimes of propagation parameters and energies. Results: We find that relatively simple recipes provide a satisfactory description of the probability P(Ψ). We also find that a naive Gaussian fit to simulation results would underestimate the probability of very large fluxes, that is, several times above the average, while overestimating the probability of relatively milder excursions. At large energies, large flux fluctuations are prevented by causal considerations, while at low energies, a partial knowledge of the recent and nearby population of sources plays an important role. A few proposals have been recently discussed in the literature to account for spectral breaks reported in cosmic ray data in terms of local contributions. We apply our newly developed theory to assess their probabilities, finding that they are relatively small, typically at the 0.1% level or smaller, never exceeding 1%. Conclusions: The use of heavy tail distributions is relevant in assessing how likely a measured cosmic ray flux is to depart from the average expectation in a given model. The existing mathematical theory leading to stable laws can be adapted to the case of interest via some recipes that closely reproduce numerical simulations and are relatively easy to implement.
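
    A toy numerical illustration of the central point, aggregating heavy-tailed single-source fluxes and comparing a tail probability with its Gaussian-fit estimate (the index, source count, and threshold are arbitrary demo choices):

    # Aggregate flux from Pareto-distributed single-source contributions.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    n_sources, alpha, n_trials = 300, 1.7, 20_000   # infinite-variance regime
    psi = rng.pareto(alpha, size=(n_trials, n_sources)) + 1.0
    total = psi.sum(axis=1)

    level = 2.0 * np.median(total)                  # a "large excursion" level
    p_emp = np.mean(total > level)                  # empirical tail probability
    p_gauss = stats.norm.sf(level, loc=total.mean(), scale=total.std())
    print(f"empirical P = {p_emp:.4g}, Gaussian-fit P = {p_gauss:.4g}")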

  10. Statistical tests for whether a given set of independent, identically distributed draws comes from a specified probability density.

    PubMed

    Tygert, Mark

    2010-09-21

    We discuss several tests for determining whether a given set of independent and identically distributed (i.i.d.) draws does not come from a specified probability density function. The most commonly used are Kolmogorov-Smirnov tests, particularly Kuiper's variant, which focus on discrepancies between the cumulative distribution function for the specified probability density and the empirical cumulative distribution function for the given set of i.i.d. draws. Unfortunately, variations in the probability density function often get smoothed over in the cumulative distribution function, making it difficult to detect discrepancies in regions where the probability density is small in comparison with its values in surrounding regions. We discuss tests without this deficiency, complementing the classical methods. The tests of the present paper are based on the plain fact that it is unlikely to draw a random number whose probability is small, provided that the draw is taken from the same distribution used in calculating the probability (thus, if we draw a random number whose probability is small, then we can be confident that we did not draw the number from the same distribution used in calculating the probability).

  11. Forecasting of the selected features of Poaceae (R. Br.) Barnh., Artemisia L. and Ambrosia L. pollen season in Szczecin, north-western Poland, using Gumbel's distribution.

    PubMed

    Puc, Małgorzata; Wolski, Tomasz

    2013-01-01

    The allergenic pollen content of the atmosphere varies according to climate, biogeography and vegetation. Minimisation of pollen allergy symptoms depends on the possibility of avoiding large doses of the allergen. Measurements performed in Szczecin over a period of 13 years (2000-2012 inclusive) permitted prediction of theoretical maximum concentrations of pollen grains, and their probabilities, for the pollen seasons of Poaceae, Artemisia and Ambrosia. Moreover, probabilities were determined for a given date being the beginning of the pollen season, the date of the maximum pollen count, the Seasonal Pollen Index value, and the number of days with pollen counts above threshold values. Aerobiological monitoring was conducted using a Hirst volumetric trap (Lanzoni VPPS). A linear trend with its coefficient of determination (R²) was calculated, and a model for long-term forecasting was built using a method based on Gumbel's distribution. A statistically significant negative correlation was found between the duration of the Poaceae and Artemisia pollen seasons and the Seasonal Pollen Index value. Seasonal total pollen counts of Artemisia and Ambrosia showed a strong, statistically significant decreasing tendency. On the basis of Gumbel's distribution, a model was proposed for Szczecin that allows prediction of the probabilities of maximum pollen count values that may appear once in, e.g., 5, 10 or 100 years. Short pollen seasons are characterised by a higher intensity of pollination than long ones. Prediction of maximum pollen count values, dates of the pollen season beginning, and the number of days with pollen counts above the threshold, on the basis of Gumbel's distribution, is expected to improve the prophylaxis and therapy of persons allergic to pollen.
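
    The return-level calculation based on Gumbel's distribution can be sketched as follows, with made-up annual maxima standing in for the 13-year Szczecin series.

    # Fit a Gumbel distribution to annual maxima and compute once-in-T-years levels.
    import numpy as np
    from scipy import stats

    annual_max = np.array([310, 450, 280, 520, 390, 610, 340,
                           430, 500, 295, 470, 380, 540])   # hypothetical counts
    loc, scale = stats.gumbel_r.fit(annual_max)

    for T in (5, 10, 100):
        level = stats.gumbel_r.ppf(1.0 - 1.0 / T, loc=loc, scale=scale)
        print(f"once-in-{T}-years maximum: {level:.0f} grains/m^3")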

  12. Examination of Mathematics Teachers' Pedagogical Content Knowledge of Probability

    ERIC Educational Resources Information Center

    Danisman, Sahin; Tanisli, Dilek

    2017-01-01

    The aim of this study is to explore the probability-related pedagogical content knowledge (PCK) of secondary school mathematics teachers in terms of content knowledge, curriculum knowledge, student knowledge, and knowledge of teaching methods and strategies. Case study design, a qualitative research model, was used in the study, and the…

  13. Sources and distribution of aromatic hydrocarbons in a tropical marine protected area estuary under influence of sugarcane cultivation.

    PubMed

    Arruda-Santos, Roxanny Helen de; Schettini, Carlos Augusto França; Yogui, Gilvan Takeshi; Maciel, Daniele Claudino; Zanardi-Lamardo, Eliete

    2018-05-15

    The Goiana estuary is a well-preserved marine protected area (MPA) on the northeastern coast of Brazil. Despite its current state, human activities in the watershed represent a potential threat to its long-term preservation. Dissolved/dispersed aromatic hydrocarbons and polycyclic aromatic hydrocarbons (PAHs) were investigated in water and sediments across the estuarine salt gradient. Concentrations of aromatic hydrocarbons were low in all samples. The results indicate that aromatic hydrocarbons are associated with suspended particulate matter (SPM) carried into the estuary by river waters. An estuarine turbidity maximum (ETM) was identified in the upper estuary, indicating that both sediments and contaminants are trapped prior to occasional export to the adjacent sea. The distribution of PAHs in sediments was associated with organic matter and mud content. Diagnostic ratios indicated pyrolytic processes, probably associated with sugarcane burning and combustion engines, as the main local source of PAHs. The low PAH concentrations probably do not cause adverse biological effects to the local biota, although their presence indicates anthropogenic contamination and pressure on the Goiana estuary MPA. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Investigation of Dielectric Breakdown Characteristics for Double-break Vacuum Interrupter and Dielectric Breakdown Probability Distribution in Vacuum Interrupter

    NASA Astrophysics Data System (ADS)

    Shioiri, Tetsu; Asari, Naoki; Sato, Junichi; Sasage, Kosuke; Yokokura, Kunio; Homma, Mitsutaka; Suzuki, Katsumi

    To investigate the reliability of vacuum insulation equipment, a study was carried out to clarify breakdown probability distributions in a vacuum gap. A double-break vacuum circuit breaker was also investigated for its breakdown probability distribution. The test results show that the breakdown probability distribution of the vacuum gap can be represented by a Weibull distribution with a location parameter, which gives the voltage at which the breakdown probability is zero. The location parameter obtained from the Weibull plot depends on the electrode area. The shape parameter obtained from the Weibull plot of the vacuum gap was 10∼14, and is constant irrespective of the non-uniform field factor. The breakdown probability distribution after no-load switching can also be represented by a Weibull distribution with a location parameter. The shape parameter after no-load switching was 6∼8.5, and is constant irrespective of gap length, indicating that the scatter of the breakdown voltage is increased by no-load switching. If the vacuum circuit breaker uses a double break, the breakdown probability at low voltage becomes lower than that of a single break. Although the potential distribution is a concern in the double-break vacuum circuit breaker, its insulation reliability is better than that of the single-break vacuum interrupter even when the bias of the interrupters' voltage sharing is taken into account.
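
    A sketch of the three-parameter Weibull form implied above, with the location parameter as the zero-breakdown-probability voltage; the parameter values, and the equal-voltage-sharing simplification for the double break, are our own assumptions.

    # Breakdown probability F(V) = 1 - exp(-((V - V0)/eta)^beta) for V > V0.
    from scipy import stats

    V0, beta, eta = 40.0, 12.0, 25.0   # location, shape (~10-14), scale, in kV
    bd = stats.weibull_min(c=beta, loc=V0, scale=eta)

    for V in (45.0, 55.0, 65.0):
        print(f"P(breakdown at {V:.0f} kV) = {bd.cdf(V):.4f}")

    # Double break: assume equal voltage sharing and independent gaps,
    # so each gap sees V/2 and both must break down.
    V = 90.0
    print("double-break estimate:", bd.cdf(V / 2.0)**2, "vs single:", bd.cdf(V))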

  15. Transactional Problem Content in Cost Discounting: Parallel Effects for Probability and Delay

    ERIC Educational Resources Information Center

    Jones, Stephen; Oaksford, Mike

    2011-01-01

    Four experiments investigated the effects of transactional content on temporal and probabilistic discounting of costs. Kusev, van Schaik, Ayton, Dent, and Chater (2009) have shown that content other than gambles can alter decision-making behavior even when associated value and probabilities are held constant. Transactions were hypothesized to lead…

  16. Using a Betabinomial distribution to estimate the prevalence of adherence to physical activity guidelines among children and youth.

    PubMed

    Garriguet, Didier

    2016-04-01

    Estimates of the prevalence of adherence to physical activity guidelines in the population are generally the result of averaging individual probabilities of adherence based on the number of days people meet the guidelines and the number of days they are assessed. Given this number of active and inactive days (days assessed minus days active), the conditional probability of meeting the guidelines that has been used in the past is a Beta(1 + active days, 1 + inactive days) distribution, assuming the probability p of a day being active is bounded by 0 and 1 and averages 50%. A change in the assumption about the distribution of p is required to better match the discrete nature of the data and to better assess the probability of adherence when the percentage of active days in the population differs from 50%. Using accelerometry data from the Canadian Health Measures Survey, the probability of adherence to physical activity guidelines is estimated using a conditional probability, given the number of active and inactive days, distributed as Betabinomial(n, α + active days, β + inactive days), assuming that p is randomly distributed as Beta(α, β), where the parameters α and β are estimated by maximum likelihood. The resulting Betabinomial distribution is discrete. For children aged 6 or older, the probability of meeting physical activity guidelines 7 out of 7 days is similar to published estimates. For pre-schoolers, the Betabinomial distribution yields higher estimates of adherence to the guidelines than the Beta distribution, in line with the probability of being active on any given day. In estimating the probability of adherence to physical activity guidelines, the Betabinomial distribution has several advantages over the previously used Beta distribution. It is a discrete distribution and maximizes the richness of accelerometer data.
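
    A minimal sketch of the estimator, using hypothetical prior parameters in place of the maximum-likelihood fits reported in the paper:

    # Posterior predictive number of active days in a full week:
    # with p ~ Beta(a, b) and k active days out of n assessed,
    # days-active-in-7 ~ Betabinomial(7, a + k, b + (n - k)).
    from scipy import stats

    a, b = 2.0, 2.0    # hypothetical Beta prior on P(a day is active)
    k, n = 4, 6        # 4 active days out of 6 valid wear days

    posterior = stats.betabinom(7, a + k, b + (n - k))
    print("P(active on all 7 days) =", round(posterior.pmf(7), 4))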

  17. [Gemstone computed tomography in the evaluation of material distribution in pulmonary parenchyma for pulmonary embolism].

    PubMed

    Zhang, Lan; Lü, Lei; Wu, Hua-wei; Zhang, Hao; Zhang, Ji-wei

    2011-12-06

    To present our initial experience with pulmonary high-definition multidetector computed tomography (HDCT) in patients with acute venous thromboembolism (AVTE) and to evaluate the corresponding clinical manifestations. From December 2009 to March 2010, 23 AVTE patients underwent HDCT at our hospital. Pulmonary embolism (PE) was diagnosed based on 3D-reconstructed images from computed tomography pulmonary angiography (CTPA). The post-processed data were analyzed with spectral imaging software to obtain iodine distribution maps. Perfusion defects, quantified as iodine content values, were compared with normal lung parenchymal perfusion in the absence of PE. Among the patients, 14 were definitively diagnosed with PE. Prior to anticoagulant therapy, their iodine content values in defective perfusion areas were significantly lower than those in normally perfused areas. After 3 months of anticoagulant therapy, the iodine content values in the defective perfusion areas increased significantly (P < 0.05). There was no significant correlation between the iodine content values in segmental/subsegmental filling defect areas and the clinical risk score of DVT (r = 2.68, P > 0.05), but there was a significant negative correlation between these values and the clinical probability score of PE (r = 0.78, P < 0.05). HDCT is a promising modality for visualizing the pulmonary microvasculature as a correlative manifestation of regional perfusion. PE results in hypoperfusion with decreased iodine content values in the affected lung parenchyma. Hemodynamic changes in affected areas correlate with the severity of the clinical manifestations of PE.

  18. Spectrum-efficient multipath provisioning with content connectivity for the survivability of elastic optical datacenter networks

    NASA Astrophysics Data System (ADS)

    Gao, Tao; Li, Xin; Guo, Bingli; Yin, Shan; Li, Wenzhe; Huang, Shanguo

    2017-07-01

    Multipath provisioning is a survivable and resource-efficient solution against the increasing number of link failures caused by natural or man-made disasters in elastic optical datacenter networks (EODNs). Nevertheless, the conventional multipath provisioning scheme is designed only for connecting a specific node pair, and the number of node-disjoint paths between any two nodes is limited by the network connectivity, which is fixed for a given topology. Recently, the concept of content connectivity in EODNs has been proposed, which guarantees that a user can be served by any datacenter hosting the required content, regardless of where it is located. From this new perspective, we propose a survivable multipath provisioning with content connectivity (MPCC) scheme, which is expected to improve spectrum efficiency and whole-system survivability. We formulate the MPCC scheme as an Integer Linear Program (ILP) for the static traffic scenario, and propose a heuristic approach for the dynamic traffic scenario. Furthermore, to adapt MPCC to the varying network state in the dynamic traffic scenario, we propose a dynamic content placement (DCP) strategy within the MPCC scheme that detects variation in the distribution of user requests and adjusts content locations dynamically. Simulation results indicate that the MPCC scheme reduces spectrum consumption by over 20% compared with the conventional multipath provisioning scheme in the static traffic scenario. In the dynamic traffic scenario, the MPCC scheme reduces spectrum consumption by over 20% and blocking probability by over 50% relative to the conventional scheme. Meanwhile, benefiting from the DCP strategy, the MPCC scheme adapts well to variation in the distribution of user requests.

  19. Studies on Physical and Sensory Properties of Premium Vanilla Ice Cream Distributed in Korean Market.

    PubMed

    Choi, Mi-Jung; Shin, Kwang-Soon

    2014-01-01

    The objective of this study was to investigate differences in the physical and sensory properties of various premium ice creams. The physical properties of the ice creams were compared by manufacturing brand. The water contents of the samples differed, with BR having the highest value at 60.5%, followed by NT and CS at 57.8% and 56.9%, respectively. The higher the water content, the lower the Brix and milk fat contents in all samples. The densities of the samples were almost identical (p>0.05), and the viscosity of each ice cream was unaffected by water content in any of the brands. Before melting, the total color difference depended on lightness, especially in the vanilla ice cream, owing to the reflection of light on the surface of the ice crystals. The CS product melted the fastest. In the sensory test, CS obtained a significantly higher sweetness intensity score but a lower score for color intensity, probably because the smaller total color difference led consumers to perceive the color of CS as less intense. This study suggests that the cold chain system used in ice cream distribution may be important in determining the physical properties, although the concentration of milk fat is the key factor in premium ice cream.

  1. Galactic hydrostatic equilibrium with magnetic tension and cosmic-ray diffusion

    NASA Technical Reports Server (NTRS)

    Boulares, Ahmed; Cox, Donald P.

    1990-01-01

    Three gravitational potentials differing in the content of dark matter in the Galactic plane are used to study the structure of the z-distribution of mass and pressure in the solar neighborhood. A P(0) of roughly (3.9 ± 0.6) × 10⁻¹² dyn/cm² is obtained, with roughly equal contributions from magnetic field, cosmic ray, and kinetic terms. This boundary condition restricts both the magnitude of gravity and the high-z pressure. It favors lower gravity and higher values for the cosmic ray, magnetic field, and probably the kinetic pressures than have been popular in the past. Inclusion of the warm H(+) distribution carries a significant mass component into the z ≈ 1 kpc regime.

  2. In vitro culture increases mechanical stability of human tissue engineered cartilage constructs by prevention of microscale scaffold buckling.

    PubMed

    Middendorf, Jill M; Shortkroff, Sonya; Dugopolski, Caroline; Kennedy, Stephen; Siemiatkoski, Joseph; Bartell, Lena R; Cohen, Itai; Bonassar, Lawrence J

    2017-11-07

    Many studies have measured the global compressive properties of tissue engineered (TE) cartilage grown on porous scaffolds. Such scaffolds are known to exhibit strain softening due to local buckling under loading. As matrix is deposited onto these scaffolds, the global compressive properties increase. However, the relationship between the amount and distribution of matrix in the scaffold and local buckling is unknown. To address this knowledge gap, we studied how local strain and construct buckling in human TE constructs change over culture time and GAG content. Confocal elastography techniques and digital image correlation (DIC) were used to measure and record buckling modes and local strains. Receiver operating characteristic (ROC) curves were used to quantify construct buckling. The results from the ROC analysis were placed into Kaplan-Meier survival curves to establish the probability that any point in a construct buckled. These analysis techniques revealed the presence of buckling at early time points, but bending at later time points. An inverse correlation was observed between the probability of buckling and the total GAG content of each construct. These data suggest that increased GAG content prevents the onset of construct buckling and improves the microscale compressive tissue properties, leading to enhanced global compressive properties by prevention of microscale buckling. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. Statistical behavior of ten million experimental detection limits

    NASA Astrophysics Data System (ADS)

    Voigtman, Edward; Abraham, Kevin T.

    2011-02-01

    Using a lab-constructed laser-excited fluorimeter, together with bootstrapping methodology, the authors have generated many millions of experimental linear calibration curves for the detection of rhodamine 6G tetrafluoroborate in ethanol solutions. The detection limits computed from them are in excellent agreement with both previously published theory and with comprehensive Monte Carlo computer simulations. Currie decision levels and Currie detection limits, each in the theoretical, chemical content domain, were found to be simply scaled reciprocals of the non-centrality parameter of the non-central t distribution that characterizes univariate linear calibration curves that have homoscedastic, additive Gaussian white noise. Accurate and precise estimates of the theoretical, content domain Currie detection limit for the experimental system, with 5% (each) probabilities of false positives and false negatives, are presented.
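
    For orientation, the simplest textbook (known-sigma, homoscedastic) form of Currie's limits with 5% false-positive and 5% false-negative rates is sketched below; the paper's non-central-t treatment of estimated calibration curves is more involved.

    # Currie decision level and detection limit, known-sigma special case.
    from scipy import stats

    sigma0, slope = 0.8, 50.0            # hypothetical blank noise and sensitivity
    z = stats.norm.ppf(0.95)             # ~1.645 for 5% error rates

    L_C = z * sigma0                     # decision level, response domain
    L_D = 2.0 * z * sigma0               # detection limit, response domain
    x_C, x_D = L_C / slope, L_D / slope  # content domain via the calibration slope
    print(f"decision level x_C = {x_C:.4f}, detection limit x_D = {x_D:.4f}")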

  4. Ordinary chondrites - Multivariate statistical analysis of trace element contents

    NASA Technical Reports Server (NTRS)

    Lipschutz, Michael E.; Samuels, Stephen M.

    1991-01-01

    The contents of mobile trace elements (Co, Au, Sb, Ga, Se, Rb, Cs, Te, Bi, Ag, In, Tl, Zn, and Cd) in Antarctic and non-Antarctic populations of H4-6 and L4-6 chondrites were compared using standard multivariate discriminant functions borrowed from linear discriminant analysis and logistic regression. A nonstandard randomization-simulation method was developed, making it possible to carry out probability assignments on a distribution-free basis. Compositional differences were found both between the Antarctic and non-Antarctic H4-6 chondrite populations and between the two L4-6 chondrite populations. It is shown that, for various types of meteorites (in particular, the H4-6 chondrites), the Antarctic/non-Antarctic compositional difference is due to preterrestrial differences in the genesis of their parent materials.
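
    The distribution-free randomization idea can be sketched as a label-permutation test around a standard linear discriminant; the synthetic data below stand in for the log-scaled trace-element contents.

    # Permutation null distribution for a discriminant separation score.
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    rng = np.random.default_rng(4)
    X = rng.normal(size=(60, 14))          # 14 mobile trace elements per meteorite
    X[:30] += 0.4                          # Antarctic group slightly shifted
    y = np.array([0] * 30 + [1] * 30)      # 0 = Antarctic, 1 = non-Antarctic

    score_obs = LinearDiscriminantAnalysis().fit(X, y).score(X, y)

    null = []
    for _ in range(500):
        y_perm = rng.permutation(y)        # break any real group structure
        null.append(LinearDiscriminantAnalysis().fit(X, y_perm).score(X, y_perm))

    p_value = np.mean(np.array(null) >= score_obs)
    print(f"accuracy = {score_obs:.2f}, permutation p = {p_value:.3f}")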

  5. Probability Distributome: A Web Computational Infrastructure for Exploring the Properties, Interrelations, and Applications of Probability Distributions.

    PubMed

    Dinov, Ivo D; Siegrist, Kyle; Pearl, Dennis K; Kalinin, Alexandr; Christou, Nicolas

    2016-06-01

    Probability distributions are useful for modeling, simulation, analysis, and inference on varieties of natural processes and physical phenomena. There are uncountably many probability distributions. However, a few dozen families of distributions are commonly defined and are frequently used in practice for problem solving, experimental applications, and theoretical studies. In this paper, we present a new computational and graphical infrastructure, the Distributome , which facilitates the discovery, exploration and application of diverse spectra of probability distributions. The extensible Distributome infrastructure provides interfaces for (human and machine) traversal, search, and navigation of all common probability distributions. It also enables distribution modeling, applications, investigation of inter-distribution relations, as well as their analytical representations and computational utilization. The entire Distributome framework is designed and implemented as an open-source, community-built, and Internet-accessible infrastructure. It is portable, extensible and compatible with HTML5 and Web2.0 standards (http://Distributome.org). We demonstrate two types of applications of the probability Distributome resources: computational research and science education. The Distributome tools may be employed to address five complementary computational modeling applications (simulation, data-analysis and inference, model-fitting, examination of the analytical, mathematical and computational properties of specific probability distributions, and exploration of the inter-distributional relations). Many high school and college science, technology, engineering and mathematics (STEM) courses may be enriched by the use of modern pedagogical approaches and technology-enhanced methods. The Distributome resources provide enhancements for blended STEM education by improving student motivation, augmenting the classical curriculum with interactive webapps, and overhauling the learning assessment protocols.

  7. Random Partition Distribution Indexed by Pairwise Information

    PubMed Central

    Dahl, David B.; Day, Ryan; Tsai, Jerry W.

    2017-01-01

    We propose a random partition distribution indexed by pairwise similarity information such that partitions compatible with the similarities are given more probability. The use of pairwise similarities, in the form of distances, is common in some clustering algorithms (e.g., hierarchical clustering), but we show how to use this type of information to define a prior partition distribution for flexible Bayesian modeling. A defining feature of the distribution is that it allocates probability among partitions within a given number of subsets, but it does not shift probability among sets of partitions with different numbers of subsets. Our distribution places more probability on partitions that group similar items yet keeps the total probability of partitions with a given number of subsets constant. The distribution of the number of subsets (and its moments) is available in closed-form and is not a function of the similarities. Our formulation has an explicit probability mass function (with a tractable normalizing constant) so the full suite of MCMC methods may be used for posterior inference. We compare our distribution with several existing partition distributions, showing that our formulation has attractive properties. We provide three demonstrations to highlight the features and relative performance of our distribution. PMID:29276318

  8. [Genetic polymorphisms of 21 non-CODIS STR loci].

    PubMed

    Shao, Wei-bo; Zhang, Su-hua; Li, Li

    2011-02-01

    To investigate the genetic polymorphisms of 21 non-CODIS STR loci in the Han population of eastern China and to explore their value for forensic application. Twenty-one non-CODIS STR loci were amplified with the AGCU 21+1 STR kit from DNA samples of 225 unrelated Han individuals from eastern China. The PCR products were analyzed with a 3130 Genetic Analyzer and genotyped with GeneMapper ID v3.2 software. The genetic data were statistically analyzed with PowerStats v12.xls and Cervus 2.0 software. The distributions of all 21 non-CODIS STR loci satisfied Hardy-Weinberg equilibrium. Heterozygosity (H) ranged from 0.596 to 0.804, discrimination power (DP) from 0.764 to 0.948, probability of exclusion for duo testing (PEduo) from 0.176 to 0.492, probability of exclusion for trio testing (PEtrio) from 0.334 to 0.663, and polymorphic information content (PIC) from 0.522 to 0.807. The cumulative probability of exclusion (CPE) was 0.999707 for duo testing and 0.9999994 for trio testing, and the cumulative discrimination power (CDP) was 0.99999999999999999994. The 21 non-CODIS STR loci are highly polymorphic. They can be effectively used for personal identification and paternity testing in trio cases, and as a supplement in difficult duo paternity testing cases.
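
    The reported summary statistics follow standard formulas; for example, Botstein's polymorphic information content and the multi-locus combination rule can be computed as below (the allele frequencies and per-locus DP values are hypothetical).

    # PIC for one locus and cumulative discrimination power across loci.
    import numpy as np

    def pic(freqs):
        f = np.asarray(freqs, dtype=float)
        s2 = (f**2).sum()
        cross = sum(2.0 * f[i]**2 * f[j]**2
                    for i in range(len(f)) for j in range(i + 1, len(f)))
        return 1.0 - s2 - cross

    print("PIC:", round(pic([0.4, 0.3, 0.2, 0.1]), 3))

    dp = [0.90, 0.85, 0.92]                       # per-locus DP values
    cdp = 1.0 - np.prod([1.0 - d for d in dp])    # combined over independent loci
    print("CDP:", round(cdp, 6))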

  9. Heavy Metal Pollution Delineation Based on Uncertainty in a Coastal Industrial City in the Yangtze River Delta, China

    PubMed Central

    Zhao, Ruiying; Chen, Songchao; Zhou, Yue; Jin, Bin; Li, Yan

    2018-01-01

    Assessing heavy metal pollution and delineating pollution are the bases for evaluating pollution and determining a cost-effective remediation plan. Most existing studies are based on the spatial distribution of pollutants but ignore related uncertainty. In this study, eight heavy-metal concentrations (Cr, Pb, Cd, Hg, Zn, Cu, Ni, and Zn) were collected at 1040 sampling sites in a coastal industrial city in the Yangtze River Delta, China. The single pollution index (PI) and Nemerow integrated pollution index (NIPI) were calculated for every surface sample (0–20 cm) to assess the degree of heavy metal pollution. Ordinary kriging (OK) was used to map the spatial distribution of heavy metals content and NIPI. Then, we delineated composite heavy metal contamination based on the uncertainty produced by indicator kriging (IK). The results showed that mean values of all PIs and NIPIs were at safe levels. Heavy metals were most accumulated in the central portion of the study area. Based on IK, the spatial probability of composite heavy metal pollution was computed. The probability of composite contamination in the central core urban area was highest. A probability of 0.6 was found as the optimum probability threshold to delineate polluted areas from unpolluted areas for integrative heavy metal contamination. Results of pollution delineation based on uncertainty showed the proportion of false negative error areas was 6.34%, while the proportion of false positive error areas was 0.86%. The accuracy of the classification was 92.80%. This indicated the method we developed is a valuable tool for delineating heavy metal pollution. PMID:29642623
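
    The delineation step reduces to thresholding a probability surface and tallying misclassified area; the sketch below uses synthetic grids in place of the indicator-kriging output.

    # Threshold a pollution-probability map at 0.6 and compute error areas.
    import numpy as np

    rng = np.random.default_rng(5)
    truth = rng.random((100, 100)) < 0.25                      # truly polluted cells
    prob = np.clip(truth * 0.7 + rng.random((100, 100)) * 0.5, 0.0, 1.0)

    pred = prob >= 0.6                    # the optimum threshold found in the paper
    false_neg = np.mean(truth & ~pred)    # polluted area classified as clean
    false_pos = np.mean(~truth & pred)    # clean area classified as polluted
    accuracy = np.mean(pred == truth)
    print(f"FN area {false_neg:.2%}, FP area {false_pos:.2%}, accuracy {accuracy:.2%}")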

  11. A brief introduction to probability.

    PubMed

    Di Paola, Gioacchino; Bertani, Alessandro; De Monte, Lavinia; Tuzzolino, Fabio

    2018-02-01

    The theory of probability has been debated for centuries: back in the 1600s, French mathematicians used the rules of probability to place and win bets. Subsequently, the knowledge of probability has significantly evolved and is now an essential tool for statistics. In this paper, the basic theoretical principles of probability will be reviewed, with the aim of facilitating the comprehension of statistical inference. After a brief general introduction on probability, we will review the concept of the "probability distribution", that is, a function providing the probabilities of occurrence of different possible outcomes of a categorical or continuous variable. Specific attention will be focused on the normal distribution, which is the most relevant distribution applied in statistical analysis.

  12. Anti-collusion forensics of multimedia fingerprinting using orthogonal modulation.

    PubMed

    Wang, Z Jane; Wu, Min; Zhao, Hong Vicky; Trappe, Wade; Liu, K J Ray

    2005-06-01

    Digital fingerprinting is a method for protecting digital data in which fingerprints that are embedded in multimedia are capable of identifying unauthorized use of digital content. A powerful attack that can be employed to reduce this tracing capability is collusion, where several users combine their copies of the same content to attenuate/remove the original fingerprints. In this paper, we study the collusion resistance of a fingerprinting system employing Gaussian distributed fingerprints and orthogonal modulation. We introduce the maximum detector and the thresholding detector for colluder identification. We then analyze the collusion resistance of a system to the averaging collusion attack for the performance criteria represented by the probability of a false negative and the probability of a false positive. Lower and upper bounds for the maximum number of colluders K(max) are derived. We then show that the detectors are robust to different collusion attacks. We further study different sets of performance criteria, and our results indicate that attacks based on a few dozen independent copies can confound such a fingerprinting system. We also propose a likelihood-based approach to estimate the number of colluders. Finally, we demonstrate the performance for detecting colluders through experiments using real images.
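
    A toy version of the orthogonal-modulation scheme and the averaging attack, with a correlation detector accusing the top-K scorers (the sizes and the non-blind detection setup are demo simplifications):

    # Orthogonal Gaussian fingerprints under an averaging collusion attack.
    import numpy as np

    rng = np.random.default_rng(6)
    n_users, n_samples, K = 50, 10_000, 5
    host = rng.normal(scale=10.0, size=n_samples)
    fps = rng.normal(size=(n_users, n_samples))    # near-orthogonal for large n

    colluders = rng.choice(n_users, size=K, replace=False)
    colluded = host + fps[colluders].mean(axis=0)  # averaged marked copies

    residual = colluded - host                     # non-blind: host is known
    scores = fps @ residual / np.sqrt(n_samples)   # correlation statistics
    accused = np.argsort(scores)[-K:]              # "maximum"-style top-K detector
    print("colluders:", sorted(colluders.tolist()), "accused:", sorted(accused.tolist()))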

  13. Digital simulation of two-dimensional random fields with arbitrary power spectra and non-Gaussian probability distribution functions.

    PubMed

    Yura, Harold T; Hanson, Steen G

    2012-04-01

    Methods for simulation of two-dimensional signals with arbitrary power spectral densities and signal amplitude probability density functions are disclosed. The method relies on initially transforming a white noise sample set of random Gaussian distributed numbers into a corresponding set with the desired spectral distribution, after which this colored Gaussian probability distribution is transformed via an inverse transform into the desired probability distribution. In most cases the method provides satisfactory results and can thus be considered an engineering approach. Several illustrative examples with relevance for optics are given.
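
    The two-step recipe can be sketched directly: FFT-filter white Gaussian noise to impose a target spectrum, then map the colored Gaussian field through its CDF and the inverse CDF of the target marginal (the power-law spectrum and exponential marginal are example choices; the final memoryless transform perturbs the spectrum somewhat, a point the paper addresses).

    # 2D random field with a prescribed PSD and a non-Gaussian marginal.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(7)
    n = 256
    white = rng.normal(size=(n, n))

    kx = np.fft.fftfreq(n)[None, :]
    ky = np.fft.fftfreq(n)[:, None]
    k = np.hypot(kx, ky)
    k[0, 0] = 1.0 / n                      # avoid division by zero at DC
    amplitude = k**-1.5                    # amplitude filter -> PSD ~ k^-3

    colored = np.real(np.fft.ifft2(np.fft.fft2(white) * amplitude))
    colored = (colored - colored.mean()) / colored.std()

    u = stats.norm.cdf(colored)            # colored Gaussian -> uniform
    field = stats.expon.ppf(u, scale=2.0)  # uniform -> exponential marginal
    print(field.mean(), field.std())       # both ~2.0 for this marginal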

  14. Remembrance of inferences past: Amortization in human hypothesis generation.

    PubMed

    Dasgupta, Ishita; Schulz, Eric; Goodman, Noah D; Gershman, Samuel J

    2018-05-21

    Bayesian models of cognition assume that people compute probability distributions over hypotheses. However, the required computations are frequently intractable or prohibitively expensive. Since people often encounter many closely related distributions, selective reuse of computations (amortized inference) is a computationally efficient use of the brain's limited resources. We present three experiments that provide evidence for amortization in human probabilistic reasoning. When sequentially answering two related queries about natural scenes, participants' responses to the second query systematically depend on the structure of the first query. This influence is sensitive to the content of the queries, only appearing when the queries are related. Using a cognitive load manipulation, we find evidence that people amortize summary statistics of previous inferences, rather than storing the entire distribution. These findings support the view that the brain trades off accuracy and computational cost, to make efficient use of its limited cognitive resources to approximate probabilistic inference. Copyright © 2018 Elsevier B.V. All rights reserved.

  15. The global impact distribution of Near-Earth objects

    NASA Astrophysics Data System (ADS)

    Rumpf, Clemens; Lewis, Hugh G.; Atkinson, Peter M.

    2016-02-01

    Asteroids that could collide with the Earth are listed on the publicly available Near-Earth object (NEO) hazard web sites maintained by the National Aeronautics and Space Administration (NASA) and the European Space Agency (ESA). The impact probability distributions of 69 potentially threatening NEOs from these lists, which produce 261 dynamically distinct impact instances, or Virtual Impactors (VIs), were calculated using the Asteroid Risk Mitigation and Optimization Research (ARMOR) tool in conjunction with OrbFit. ARMOR projected the impact probability of each VI onto the surface of the Earth as a spatial probability distribution. The projection considers orbit solution accuracy and the global impact probability. The method of ARMOR is introduced and the tool is validated against two asteroid-Earth collision cases, objects 2008 TC3 and 2014 AA. In the analysis, the natural distribution of impact corridors is contrasted against the impact probability distribution to evaluate the distributions' conformity with the uniform impact distribution assumption. The distribution of impact corridors is based on the NEO population and orbital mechanics. The analysis shows that the distribution of impact corridors matches the common assumption of a uniform impact distribution, and the result extends the evidence base for the uniform assumption from qualitative analysis of historic impact events into the future in a quantitative way. This finding is confirmed in a parallel analysis of impact points belonging to a synthetic population of 10,006 VIs. Taking the impact probabilities into account introduced significant variation into the results, and the impact probability distribution consequently deviates markedly from uniformity. The concept of impact probabilities is a product of the asteroid observation and orbit determination technique and thus represents a man-made component that is largely disconnected from natural processes. It is nevertheless important to consider impact probabilities because such information represents the best estimate of where an impact might occur.

  16. Influence of mesoscale features on micronekton and large pelagic fish communities in the Mozambique Channel

    NASA Astrophysics Data System (ADS)

    Potier, Michel; Bach, Pascal; Ménard, Frédéric; Marsac, Francis

    2014-02-01

    We investigated the diversity and distribution of two communities, micronekton organisms and large predatory fishes, sampled in mesoscale features of the Mozambique Channel from 2003 to 2009, by combining mid-water trawls, stomach contents of fish predators and instrumented longline fishing surveys. The highest species richness for assemblages was found in divergences and fronts rather than in the cores of eddies. Despite an unbalanced sampling scheme, diversity indices did not differ significantly between cyclonic and anticyclonic eddies, divergences and fronts. We found that eddies and associated physical cues did not substantially affect the distribution of micronektonic species, which is mainly driven by the diel vertical migration pattern. Top predators exhibited a more complex response. Swordfish (Xiphias gladius) associated more strongly with mesoscale features than tunas, with a clear preference for divergences, which is consistent with the diel vertical migrations and occurrence of its main prey, the flying squid Sthenoteuthis oualaniensis (Ommastrephidae). On the other hand, the probability of presence of yellowfin tuna was not tied to any specific eddy structure. However, the highest values of positive yellowfin CPUEs were associated with low horizontal gradients of sea-level anomalies. We also showed a non-linear response of positive yellowfin CPUEs with respect to the depth of the minimal oxygen content: the larger the distance between the hooks and the minimal oxygen layer, towards the surface or at greater depths, the higher the CPUE, highlighting that yellowfin congregated in well-oxygenated waters. Micronekton sampled by mid-water trawls and stomach contents exhibited different species compositions. Highly mobile organisms were not caught by trawling, whereas they remained accessible to predators. The combination of stomach contents and mid-water trawls undoubtedly improved our understanding of micronekton assemblage distribution. Our results provide some evidence that mesoscale features in the Mozambique Channel do not strongly affect the distribution of mid-trophic level organisms such as micronekton and most of the large predatory fishes, and hypotheses are proposed to support this result.

  17. Command and Control Systems Requirements Analysis. Volume 2. Measuring C2 Effectiveness with Decision Probability

    DTIC Science & Technology

    1990-09-01

    Measuring C2 Effectiveness with Decision Probability, September 1990. Table of contents (fragment): 5.0 Expressing Requirements with Probability ... 15.

  18. Combining a popularity-productivity stochastic block model with a discriminative-content model for general structure detection.

    PubMed

    Chai, Bian-fang; Yu, Jian; Jia, Cai-Yan; Yang, Tian-bao; Jiang, Ya-wen

    2013-07-01

    Latent community discovery that combines links and contents of a text-associated network has drawn more attention with the advance of social media. Most of the previous studies aim at detecting densely connected communities and are not able to identify general structures, e.g., bipartite structure. Several variants based on the stochastic block model are more flexible for exploring general structures by introducing link probabilities between communities. However, these variants cannot identify the degree distributions of real networks due to a lack of modeling of the differences among nodes, and they are not suitable for discovering communities in text-associated networks because they ignore the contents of nodes. In this paper, we propose a popularity-productivity stochastic block (PPSB) model by introducing two random variables, popularity and productivity, to model the differences among nodes in receiving links and producing links, respectively. This model has the flexibility of existing stochastic block models in discovering general community structures and inherits the richness of previous models that also exploit popularity and productivity in modeling the real scale-free networks with power law degree distributions. To incorporate the contents in text-associated networks, we propose a combined model which combines the PPSB model with a discriminative model that models the community memberships of nodes by their contents. We then develop expectation-maximization (EM) algorithms to infer the parameters in the two models. Experiments on synthetic and real networks have demonstrated that the proposed models can yield better performances than previous models, especially on networks with general structures.

  19. Combining a popularity-productivity stochastic block model with a discriminative-content model for general structure detection

    NASA Astrophysics Data System (ADS)

    Chai, Bian-fang; Yu, Jian; Jia, Cai-yan; Yang, Tian-bao; Jiang, Ya-wen

    2013-07-01

    Latent community discovery that combines links and contents of a text-associated network has drawn more attention with the advance of social media. Most of the previous studies aim at detecting densely connected communities and are not able to identify general structures, e.g., bipartite structure. Several variants based on the stochastic block model are more flexible for exploring general structures by introducing link probabilities between communities. However, these variants cannot identify the degree distributions of real networks due to a lack of modeling of the differences among nodes, and they are not suitable for discovering communities in text-associated networks because they ignore the contents of nodes. In this paper, we propose a popularity-productivity stochastic block (PPSB) model by introducing two random variables, popularity and productivity, to model the differences among nodes in receiving links and producing links, respectively. This model has the flexibility of existing stochastic block models in discovering general community structures and inherits the richness of previous models that also exploit popularity and productivity in modeling the real scale-free networks with power law degree distributions. To incorporate the contents in text-associated networks, we propose a combined model which combines the PPSB model with a discriminative model that models the community memberships of nodes by their contents. We then develop expectation-maximization (EM) algorithms to infer the parameters in the two models. Experiments on synthetic and real networks have demonstrated that the proposed models can yield better performances than previous models, especially on networks with general structures.

  20. Distribution and variation of mercury in frozen soils of a high-altitude permafrost region on the northeastern margin of the Tibetan Plateau.

    PubMed

    Sun, Shiwei; Kang, Shichang; Huang, Jie; Chen, Shengyun; Zhang, Qianggong; Guo, Junming; Liu, Wenjie; Neupane, Bigyan; Qin, Dahe

    2017-06-01

    The Tibetan Plateau (TP) is home to the largest permafrost bodies at low and mid-latitudes, yet little is known about the distribution and variation of mercury (Hg) in frozen soil of the permafrost regions. In this study, extensive soil sampling campaigns were carried out in 23 soil pits from 12 plots in a high-altitude permafrost region of the Shule River Basin, northeastern TP. Hg distribution, variation, and their dependences on soil properties were analyzed. The results revealed that total Hg (THg) concentrations were low, ranging from 6.3 to 29.1 ng g⁻¹. A near-surface peak of THg concentrations followed by a continuous decrease was observed in the vertical profiles of most soil pits. Significant positive relationships among THg concentrations, soil organic carbon (SOC) contents, and silty fractions were observed, indicating that SOC content and silty fraction are two dominant factors influencing the spatial distribution of THg. THg concentrations in soils showed a decreasing trend with altitude, which was probably attributable to a lower soil potential for Hg accumulation under the condition of lower SOC contents and silty fractions at high altitudes. Approximately 130.6 t of Hg was estimated to be stored in soils (0-60 cm), and a loss of 64.2% of Hg from the highly stable and stable permafrost (H-SP) region via permafrost degradation is expected in the upstream regions of the Shule River Basin, indicating that large areas of permafrost regions may become an important source of global Hg emission as a result of the ongoing widespread permafrost degradation.

  1. Physical Properties of Sediment Related to Gas Hydrate in the Northern Gulf of Mexico

    NASA Astrophysics Data System (ADS)

    Winters, W. J.; Novosel, I.; Boldina, O. M.; Waite, W. F.; Lorenson, T. D.; Paull, C. K.; Bryant, W.

    2002-12-01

    Eighteen giant piston cores, up to 38-m long, were recovered during July 2002 to determine the distribution of gas hydrate in widely different geologic environments of the Northern Gulf of Mexico. Physical properties, including electrical resistivity, three different shear strengths, P-wave velocity, and thermal conductivity, were measured on split and whole-round cores at sea. Water content, grain density, and related properties are being determined in a shore-based laboratory from shipboard-acquired subsamples. These physical property data are important for two primary reasons: (1) to relate the presence of gas hydrate to the natural host sediment; and (2) to correlate with shallow seismic reflection records so they can be interpreted more accurately within and below the depth of coring. Preliminary results indicate that porosity and water content typically decrease rapidly to a subbottom depth of about 8 to 9 m, but then decrease at a much lower rate to the base of the core - often 30 or more mbsf. Although higher water contents are measured in the sediments that were recovered in association with gas hydrates, they are probably an artifact of post-sampling hydrate dissociation rather than an in-situ characteristic. The hydrate recovered during the cruise was present either as particles distributed throughout the sediment or as massive chunks that filled the entire 10-cm diameter of the core liner. The sediments immediately adjacent to the recovered gas hydrates are visually similar to surrounding sediments, and thus primary lithologic differences do not appear to control the distribution of these gas hydrates. Vane shear strength measurements correlate better with subbottom depth than with water content. The strength values typically increase from less than 10 kPa near the seafloor to as much as 80 to 90 kPa at the base of some cores. Electrical resistivity appears to be related to water content (and probably porewater salinity), since a break in slope with depth is often recorded in the upper 8 to 15 m of sediment. Electrical resistivity typically increases from about 0.4 to 0.5 ohm-m near the top of many cores to about 0.7 ohm-m near the base of the deeper recovered sediment. These values are typical for clay-rich fine-grained sediment with high water content. Although the amount of gas hydrate in the natural environment is enormous, little is known about its distribution in sea-floor sediment or even exactly how it forms. A goal of this cruise was to find evidence for the existence of gas hydrate away from obvious seafloor gas-hydrate mounds and at depth in the sediment. This international, multi-discipline coring cruise was conducted jointly by the Institut Polaire Français, Paul-Emile Victor (IPEV) and the USGS aboard the 120-m-long French research vessel Marion Dufresne. Partial funding was provided by the U.S. Dept. of Energy, and considerable at-sea help was provided by an international group of about 40 scientists under the IMAGES (International Marine Past Global Changes Study) and PAGE (Paleoceanography of the Atlantic and Geochemistry) programs.

  2. Moment analysis description of wetting and redistribution plumes in wettable and water-repellent soils

    NASA Astrophysics Data System (ADS)

    Xiong, Yunwu; Furman, Alex; Wallach, Rony

    2012-02-01

    Water repellency has a significant impact on water flow patterns in the soil profile. Transient 2D flow in wettable and natural water-repellent soils was monitored in a transparent flow chamber. The substantial differences in plume shape and spatial water-content distribution during the wetting and subsequent redistribution stages were related to the variation of contact angle while in contact with water. The observed plume shapes and internal water-content distributions in general, and the saturation overshoot behind the wetting front in particular, were associated with unstable flow in the repellent soils. Moment analysis was applied to characterize the measured plumes during wetting and subsequent redistribution. The center of mass and spatial variances determined for the measured evolving plumes were fitted by a model that accounts for capillary and gravitational driving forces in a medium of temporally varying wettability. Ellipses defined around the stable and unstable plumes' centers of mass, whose semi-axes represent a particular number of spatial variances, were used to characterize plume shape and internal moisture distribution. A single probability curve was able to characterize the corresponding fractions of the total added water in the different ellipses for all measured plumes, which testifies to the competence and advantage of the moment analysis method.

  3. Nonadditive entropies yield probability distributions with biases not warranted by the data.

    PubMed

    Pressé, Steve; Ghosh, Kingshuk; Lee, Julian; Dill, Ken A

    2013-11-01

    Different quantities that go by the name of entropy are used in variational principles to infer probability distributions from limited data. Shore and Johnson showed that maximizing the Boltzmann-Gibbs form of the entropy ensures that probability distributions inferred satisfy the multiplication rule of probability for independent events in the absence of data coupling such events. Other types of entropies that violate the Shore and Johnson axioms, including nonadditive entropies such as the Tsallis entropy, violate this basic consistency requirement. Here we use the axiomatic framework of Shore and Johnson to show how such nonadditive entropy functions generate biases in probability distributions that are not warranted by the underlying data.
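
    A compact way to state the consistency property at issue, in standard maximum-entropy notation (a textbook derivation, not reproduced from the paper): for data that constrain two subsystems separately, maximizing the Boltzmann-Gibbs entropy

      S_{\mathrm{BG}}[p] \;=\; -\sum_{i,j} p_{ij}\,\ln p_{ij}
      \quad \text{subject to} \quad
      \sum_{i,j} p_{ij} = 1, \qquad \sum_{i,j} f_i\, p_{ij} = \bar{f}, \qquad \sum_{i,j} g_j\, p_{ij} = \bar{g},

    yields the exponential solution p_{ij} = e^{-\lambda f_i - \mu g_j}/Z, which factorizes as p_{ij} = p_i\, q_j, so the multiplication rule for independent events holds automatically. Repeating the variation with the Tsallis entropy S_q = \bigl(1 - \sum_{i,j} p_{ij}^q\bigr)/(q-1) instead gives p_{ij} \propto \bigl[1 - (1-q)(\lambda f_i + \mu g_j)\bigr]^{1/(1-q)}, which does not factorize: it couples the subsystems even though the data never did.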

  4. [Heavy metal concentration in Nanjing urban soils and their affecting factors].

    PubMed

    Lu, Ying; Gong, Zitong; Zhang, Ganlin; Zhang, Bo

    2004-01-01

    The concentrations and sources of heavy metals in Nanjing urban soils and their relationships with soil properties were studied. The results indicated that Nanjing urban soils were not obviously polluted by Fe, Ni, Co and V, but were polluted by Mn, Cr, Cu, Zn, and Pb to a certain extent. The heavy metals were irregularly distributed in soil profiles. Fe, Ni, Co, and V originated from soil parent materials, whereas Cu, Zn, Pb, and Cr were anthropogenic inputs. Mn probably had different origins in different soils. There were positive correlations among Fe, Cr, Ni, Co, and V concentrations, and among Cu, Zn, Pb, and Cr concentrations. The Fe, Co, V, and Ni concentrations were positively correlated with soil clay content and CEC, and the Cu, Zn and Pb concentrations were negatively correlated with clay content. There were positive correlations between Cu, Zn, Pb and Cr concentrations and organic C content, and between Pb concentration and soil pH.

  5. Characterization of essential oil distribution in the root cross-section of Valeriana officinalis L. s.l. by using histological imaging techniques.

    PubMed

    Penzkofer, Michael; Baron, Andrea; Naumann, Annette; Krähmer, Andrea; Schulz, Hartwig; Heuberger, Heidi

    2018-01-01

    The essential oil is an important constituent of the root and rhizome of medicinally used valerian (Valeriana officinalis L. s.l.), with a stated minimum content in the European Pharmacopoeia. The essential oil is located in droplets, whose position and distribution in the total root cross-section of different valerian varieties, root thicknesses and root horizons are determined in this study using an adapted fluorescence-microscopy and automatic image analysis method. The study was motivated by the following observations: (1) a probable negative correlation between essential oil content and root thickness in selected single plants (elites), observed during the breeding of coarsely rooted valerian with high oil content; and (2) higher essential oil content after careful hand-harvest and processing of the roots. In preliminary tests, the existence of oil-containing droplets in the outer and inner regions of the valerian roots was confirmed by histological techniques and light microscopy, as well as Fourier-transform infrared spectroscopy. Based on this, fluorescence microscopy followed by image analysis of entire root cross-sections showed that a large number of oil droplets (on average 43% of total oil droplets) are located close to the root surface. The remaining oil droplets are located in the inner regions (parenchyma) and showed varying density gradients from the inner to the outer regions depending on genotype, root thickness and harvesting depth. Fluorescence microscopy is suitable for evaluating the prevalence and distribution of essential oil droplets of valerian in entire root cross-sections. The oil droplet density gradient varies among genotypes. Genotypes with a linear rather than an exponential increase of oil droplet density from the inner to the outer parenchyma can be chosen for better stability during post-harvest processing. The negative correlation of essential oil content and root thickness observed in our breeding material can be counteracted through selection towards generally high oil droplet density levels and large oil droplet sizes independent of root thickness.

  6. ProbOnto: ontology and knowledge base of probability distributions.

    PubMed

    Swat, Maciej J; Grenon, Pierre; Wimalaratne, Sarala

    2016-09-01

    Probability distributions play a central role in mathematical and statistical modelling. The encoding, annotation and exchange of such models could be greatly simplified by a resource providing a common reference for the definition of probability distributions. Although some resources exist, no suitably detailed and comprehensive ontology existed, nor any database allowing programmatic access. ProbOnto is an ontology-based knowledge base of probability distributions, featuring more than 80 uni- and multivariate distributions with their defining functions, characteristics, relationships and re-parameterization formulas. It can be used for model annotation and facilitates the encoding of distribution-based models, related functions and quantities. Availability: http://probonto.org. Contact: mjswat@ebi.ac.uk. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press.

  7. PRODIGEN: visualizing the probability landscape of stochastic gene regulatory networks in state and time space.

    PubMed

    Ma, Chihua; Luciani, Timothy; Terebus, Anna; Liang, Jie; Marai, G Elisabeta

    2017-02-15

    Visualizing the complex probability landscape of stochastic gene regulatory networks can further biologists' understanding of phenotypic behavior associated with specific genes. We present PRODIGEN (PRObability DIstribution of GEne Networks), a web-based visual analysis tool for the systematic exploration of probability distributions over simulation time and state space in such networks. PRODIGEN was designed in collaboration with bioinformaticians who research stochastic gene networks. The analysis tool combines in a novel way existing, expanded, and new visual encodings to capture the time-varying characteristics of probability distributions: spaghetti plots over one dimensional projection, heatmaps of distributions over 2D projections, enhanced with overlaid time curves to display temporal changes, and novel individual glyphs of state information corresponding to particular peaks. We demonstrate the effectiveness of the tool through two case studies on the computed probabilistic landscape of a gene regulatory network and of a toggle-switch network. Domain expert feedback indicates that our visual approach can help biologists: 1) visualize probabilities of stable states, 2) explore the temporal probability distributions, and 3) discover small peaks in the probability landscape that have potential relation to specific diseases.

  8. Comparison of the different probability distributions for earthquake hazard assessment in the North Anatolian Fault Zone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yilmaz, Şeyda, E-mail: seydayilmaz@ktu.edu.tr; Bayrak, Erdem, E-mail: erdmbyrk@gmail.com; Bayrak, Yusuf, E-mail: bayrak@ktu.edu.tr

    In this study we examined and compared three different probability distributions to determine the most suitable model for probabilistic assessment of earthquake hazards. We analyzed a reliable homogeneous earthquake catalogue for the period 1900-2015 and magnitude M ≥ 6.0, and estimated the probabilistic seismic hazard in the North Anatolian Fault zone (39°-41° N 30°-40° E) using three distributions, namely the Weibull distribution, the Fréchet distribution and the three-parameter Weibull distribution. The suitability of the distribution parameters was evaluated with the Kolmogorov-Smirnov (K-S) goodness-of-fit test. We also compared the estimated cumulative probabilities and the conditional probabilities of occurrence of earthquakes for different elapsed times using these three distributions. We used Easyfit and Matlab software to calculate the distribution parameters and plotted the conditional probability curves. We concluded that the Weibull distribution was the most suitable of the three distributions in this region.
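
    A minimal sketch of this kind of comparison (the study itself used Easyfit and Matlab; this is a scipy equivalent run on synthetic inter-event times, not the catalogue data). In scipy, the Fréchet distribution is available as invweibull, and the three-parameter Weibull is weibull_min with a free location parameter:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(42)
      intervals = rng.weibull(1.3, size=60) * 8.0   # synthetic inter-event times (years)

      candidates = {
          "Weibull (2-par)": stats.weibull_min,     # shape and scale only, loc fixed at 0
          "Frechet":         stats.invweibull,      # scipy's name for the Frechet
          "Weibull (3-par)": stats.weibull_min,     # shape, loc and scale all free
      }

      for name, dist in candidates.items():
          if name == "Weibull (2-par)":
              params = dist.fit(intervals, floc=0.0)
          else:
              params = dist.fit(intervals)
          # Caveat: K-S p-values are optimistic when the parameters were fitted
          # from the same data being tested (Lilliefors correction omitted).
          d_stat, p_val = stats.kstest(intervals, dist.cdf, args=params)
          print(f"{name:15s}  D = {d_stat:.3f}  p = {p_val:.3f}")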

  9. Applications of multiscale change point detections to monthly stream flow and rainfall in Xijiang River in southern China, part I: correlation and variance

    NASA Astrophysics Data System (ADS)

    Zhu, Yuxiang; Jiang, Jianmin; Huang, Changxing; Chen, Yongqin David; Zhang, Qiang

    2018-04-01

    This article, as part I, introduces three algorithms and applies them to the series of monthly stream flow and rainfall in Xijiang River, southern China. The three algorithms include (1) normalization of the probability distribution, (2) a scanning U test for change points in the correlation between two time series, and (3) a scanning F-test for change points in variances. The normalization algorithm adopts the quantile method to transform data from a non-normal into the normal probability distribution. The scanning U test and F-test have three common features: grafting the classical statistics onto the wavelet algorithm, adding corrections for independence into each statistical criterion at a given confidence level, and providing almost objective and automatic detection across multiple time scales. In addition, coherency analyses between the two series are also carried out for changes in variance. The application results show that the changes of the monthly discharge are still controlled by natural precipitation variations in Xijiang's fluvial system. Human activities have perhaps disturbed the ecological balance to a certain extent and over shorter spells, but have not violated the natural relationships of correlation and variance changes so far.

  10. Alkali elemental and potassium isotopic compositions of Semarkona chondrules

    USGS Publications Warehouse

    Alexander, C.M. O'D.; Grossman, J.N.

    2005-01-01

    We report measurements of K isotope ratios in 28 Semarkona chondrules with a wide range of petrologic types and bulk compositions, as well as the compositions of CPX-mesostasis pairs in 17 type I Semarkona chondrules, including two chondrules with radial alkali zonation, and 19 type II chondrules. Despite the wide range in K/Al ratios, no systematic variations in K isotopic compositions were found. Semarkona chondrules do not record a simple history of Rayleigh-type loss of K. Experimentally determined evaporation rates suggest that considerable alkali evaporation would have occurred during chondrule formation. Nevertheless, based on Na CPX-mesostasis distribution coefficients, the alkali contents of the cores of most chondrules in Semarkona were probably established at the time of final crystallization. However, Na CPX-mesostasis distribution coefficients also show that alkali zonation in type I Semarkona chondrules was produced by entry of alkalis after solidification, probably during parent body alteration. This alkali metasomatism may have gone to completion in some chondrules. Our preferred explanation for the lack of systematic isotopic enrichments, even in alkali-depleted type I chondrule cores, is that they exchanged with the ambient gas as they cooled. © The Meteoritical Society, 2005.

  11. Incorporating Skew into RMS Surface Roughness Probability Distribution

    NASA Technical Reports Server (NTRS)

    Stahl, Mark T.; Stahl, H. Philip.

    2013-01-01

    The standard treatment of RMS surface roughness data is the application of a Gaussian probability distribution. This handling of surface roughness ignores the skew present in the surface and overestimates the most probable RMS of the surface, the mode. Using experimental data we confirm the Gaussian distribution overestimates the mode and application of an asymmetric distribution provides a better fit. Implementing the proposed asymmetric distribution into the optical manufacturing process would reduce the polishing time required to meet surface roughness specifications.

  12. The Estimation of Tree Posterior Probabilities Using Conditional Clade Probability Distributions

    PubMed Central

    Larget, Bret

    2013-01-01

    In this article I introduce the idea of conditional independence of separated subtrees as a principle by which to estimate the posterior probability of trees using conditional clade probability distributions rather than simple sample relative frequencies. I describe an algorithm for these calculations and software which implements these ideas. I show that these alternative calculations are very similar to simple sample relative frequencies for high probability trees but are substantially more accurate for relatively low probability trees. The method allows the posterior probability of unsampled trees to be calculated when these trees contain only clades that are in other sampled trees. Furthermore, the method can be used to estimate the total probability of the set of sampled trees which provides a measure of the thoroughness of a posterior sample. [Bayesian phylogenetics; conditional clade distributions; improved accuracy; posterior probabilities of trees.] PMID:23479066

  13. Neighbor effect in complexation of a conjugated polymer.

    PubMed

    Sosorev, Andrey; Zapunidi, Sergey

    2013-09-19

    Charge-transfer complex (CTC) formation between a conjugated polymer and low-molecular-weight organic acceptor is proposed to be driven by the neighbor effect. Formation of a CTC on the polymer chain results in an increased probability of new CTC formation near the existing one. We present an analytical model for CTC distribution considering the neighbor effect, based on the principles of statistical mechanics. This model explains the experimentally observed threshold-like dependence of the CTC concentration on the acceptor content in a polymer:acceptor blend. It also allows us to evaluate binding energies of the complexes.

  14. Deciphering factors controlling groundwater arsenic spatial variability in Bangladesh

    NASA Astrophysics Data System (ADS)

    Tan, Z.; Yang, Q.; Zheng, C.; Zheng, Y.

    2017-12-01

    Elevated concentrations of geogenic arsenic in groundwater have been found in many countries to exceed 10 μg/L, the WHO's guideline value for drinking water. A common yet unexplained characteristic of groundwater arsenic spatial distribution is the extensive variability at various spatial scales. This study investigates factors influencing the spatial variability of groundwater arsenic in Bangladesh to improve the accuracy of models predicting arsenic exceedance rate spatially. A novel boosted regression tree method is used to establish a weak-learner ensemble model, which is compared to a linear model built with a conventional stepwise logistic regression method. Compared to logistic regression, boosted regression tree models offer the advantage of capturing interactions among parameters when big datasets are analyzed. The point data set (n=3,538) of groundwater hydrochemistry with 19 parameters was obtained by the British Geological Survey in 2001. The spatial data sets of geological parameters (n=13) were from the Consortium for Spatial Information, Technical University of Denmark, University of East Anglia and the FAO, while the soil parameters (n=42) were from the Harmonized World Soil Database. The aforementioned parameters were regressed on categorical groundwater arsenic concentrations below or above three thresholds, 5 μg/L, 10 μg/L and 50 μg/L, to identify the respective controlling factors. The boosted regression tree method outperformed the logistic regression method at all three threshold levels in terms of accuracy, specificity and sensitivity, resulting in an improved spatial distribution map of the probability of groundwater arsenic exceeding all three thresholds when compared to a disjunctive-kriging interpolated spatial arsenic map using the same groundwater arsenic dataset. The boosted regression tree models also show that the most important controlling factors of groundwater arsenic distribution include groundwater iron content and well depth for all three thresholds. The probability of a well with iron content higher than 5 mg/L containing more than 5 μg/L, 10 μg/L and 50 μg/L As is estimated to be more than 91%, 85% and 51%, respectively, while the probability of a well deeper than 160 m containing more than 5 μg/L, 10 μg/L and 50 μg/L As is estimated to be less than 38%, 25% and 14%, respectively.
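
    A minimal sketch of the modeling step (synthetic data and an invented generating rule, not the BGS survey): a gradient-boosted tree classifier for the probability that a well exceeds the 10 μg/L threshold, with iron content and well depth as predictors, following the two most important factors named in the abstract:

      import numpy as np
      from sklearn.ensemble import GradientBoostingClassifier
      from sklearn.metrics import roc_auc_score
      from sklearn.model_selection import train_test_split

      rng = np.random.default_rng(0)
      n = 2000
      iron = rng.lognormal(mean=0.5, sigma=1.0, size=n)   # mg/L (invented)
      depth = rng.uniform(5, 300, size=n)                 # m (invented)
      # Invented rule: shallow, iron-rich wells are more likely to exceed.
      logit = 0.8 * np.log(iron + 0.1) - 0.015 * depth + 0.5
      exceeds = rng.random(n) < 1.0 / (1.0 + np.exp(-logit))

      X = np.column_stack([iron, depth])
      X_tr, X_te, y_tr, y_te = train_test_split(X, exceeds, random_state=0)
      model = GradientBoostingClassifier(n_estimators=300, max_depth=3,
                                         learning_rate=0.05).fit(X_tr, y_tr)
      print("AUC:", roc_auc_score(y_te, model.predict_proba(X_te)[:, 1]))
      # Exceedance probability: high-iron shallow well vs. low-iron deep well.
      print(model.predict_proba([[6.0, 20.0], [1.0, 200.0]])[:, 1])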

  15. Entropy-Bayesian Inversion of Time-Lapse Tomographic GPR data for Monitoring Dielectric Permittivity and Soil Moisture Variations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hou, Z; Terry, N; Hubbard, S S

    2013-02-12

    In this study, we evaluate the possibility of monitoring soil moisture variation using tomographic ground penetrating radar travel time data through Bayesian inversion, which is integrated with entropy memory function and pilot point concepts, as well as efficient sampling approaches. It is critical to accurately estimate soil moisture content and variations in vadose zone studies. Many studies have illustrated the promise and value of GPR tomographic data for estimating soil moisture and associated changes; however, challenges still exist in the inversion of GPR tomographic data in a manner that quantifies input and predictive uncertainty, incorporates multiple data types, handles non-uniqueness and nonlinearity, and honors time-lapse tomograms collected in a series. To address these challenges, we develop a minimum relative entropy (MRE)-Bayesian based inverse modeling framework that non-subjectively defines prior probabilities, incorporates information from multiple sources, and quantifies uncertainty. The framework enables us to estimate dielectric permittivity at pilot point locations distributed within the tomogram, as well as the spatial correlation range. In the inversion framework, MRE is first used to derive prior probability distribution functions (pdfs) of dielectric permittivity based on prior information obtained from a straight-ray GPR inversion. The probability distributions are then sampled using a Quasi-Monte Carlo (QMC) approach, and the sample sets provide inputs to a sequential Gaussian simulation (SGSim) algorithm that constructs a highly resolved permittivity/velocity field for evaluation with a curved-ray GPR forward model. The likelihood functions are computed as a function of misfits, and posterior pdfs are constructed using a Gaussian kernel. Inversion of subsequent time-lapse datasets combines the Bayesian estimates from the previous inversion (as a memory function) with new data. The memory function and pilot point design takes advantage of the spatial-temporal correlation of the state variables. We first apply the inversion framework to a static synthetic example and then to a time-lapse GPR tomographic dataset collected during a dynamic experiment conducted at the Hanford Site in Richland, WA. We demonstrate that the MRE-Bayesian inversion enables us to merge various data types, quantify uncertainty, evaluate nonlinear models, and produce more detailed and better resolved estimates than straight-ray based inversion; therefore, it has the potential to improve estimates of inter-wellbore dielectric permittivity and soil moisture content and to monitor their temporal dynamics more accurately.

  16. Vertical changes in the probability distribution of downward irradiance within the near-surface ocean under sunny conditions

    NASA Astrophysics Data System (ADS)

    Gernez, Pierre; Stramski, Dariusz; Darecki, Miroslaw

    2011-07-01

    Time series measurements of fluctuations in underwater downward irradiance, Ed, within the green spectral band (532 nm) show that the probability distribution of instantaneous irradiance varies greatly as a function of depth within the near-surface ocean under sunny conditions. Because of intense light flashes caused by surface wave focusing, the near-surface probability distributions are highly skewed to the right and are heavy tailed. The coefficients of skewness and excess kurtosis at depths smaller than 1 m can exceed 3 and 20, respectively. We tested several probability models, such as lognormal, Gumbel, Fréchet, log-logistic, and Pareto, which are potentially suited to describe the highly skewed heavy-tailed distributions. We found that the models cannot approximate with consistently good accuracy the high irradiance values within the right tail of the experimental distribution where the probability of these values is less than 10%. This portion of the distribution corresponds approximately to light flashes with Ed > 1.5 Ēd, where Ēd is the time-averaged downward irradiance. However, the remaining part of the probability distribution covering all irradiance values smaller than the 90th percentile can be described with a reasonable accuracy (i.e., within 20%) with a lognormal model for all 86 measurements from the top 10 m of the ocean included in this analysis. As the intensity of irradiance fluctuations decreases with depth, the probability distribution tends toward a function symmetrical around the mean like the normal distribution. For the examined data set, the skewness and excess kurtosis assumed values very close to zero at a depth of about 10 m.
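
    A minimal sketch of the statistics involved, on a synthetic flash-dominated irradiance series (values are invented stand-ins for Ed and Ēd): compute skewness and excess kurtosis, then fit a lognormal only to the part of the distribution below the 90th percentile, as the abstract describes:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(7)
      E_d = rng.lognormal(mean=-0.5, sigma=1.0, size=5000)  # synthetic irradiance

      print("skewness:", stats.skew(E_d))             # strongly positive near surface
      print("excess kurtosis:", stats.kurtosis(E_d))  # heavy-tailed

      # Fit only the bulk of the distribution (below the 90th percentile).
      bulk = np.sort(E_d[E_d <= np.quantile(E_d, 0.90)])
      shape, loc, scale = stats.lognorm.fit(bulk, floc=0.0)

      # Relative error of the fitted CDF at a few probe points in the bulk.
      probes = np.quantile(bulk, [0.25, 0.50, 0.75])
      empirical = np.searchsorted(bulk, probes) / bulk.size
      fitted = stats.lognorm.cdf(probes, shape, loc, scale)
      print("relative CDF errors:", np.abs(fitted - empirical) / empirical)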

  17. Predicting the probability of slip in gait: methodology and distribution study.

    PubMed

    Gragg, Jared; Yang, James

    2016-01-01

    The likelihood of a slip is related to the available and required friction for a certain activity, here gait. Classical slip and fall analysis presumed that a walking surface was safe if the difference between the mean available and required friction coefficients exceeded a certain threshold. Previous research was dedicated to reformulating the classical slip and fall theory to include the stochastic variation of the available and required friction when predicting the probability of slip in gait. However, when predicting the probability of a slip, previous researchers have either ignored the variation in the required friction or assumed the available and required friction to be normally distributed. Also, there are no published results that actually give the probability of slip for various combinations of required and available frictions. This study proposes a modification to the equation for predicting the probability of slip, reducing the previous equation from a double-integral to a more convenient single-integral form. Also, a simple numerical integration technique is provided to predict the probability of slip in gait: the trapezoidal method. The effect of the random variable distributions on the probability of slip is also studied. It is shown that both the required and available friction distributions cannot automatically be assumed as being normally distributed. The proposed methods allow for any combination of distributions for the available and required friction, and numerical results are compared to analytical solutions for an error analysis. The trapezoidal method is shown to be highly accurate and efficient. The probability of slip is also shown to be sensitive to the input distributions of the required and available friction. Lastly, a critical value for the probability of slip is proposed based on the number of steps taken by an average person in a single day.
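
    A minimal sketch of the single-integral form (illustrative distributions and parameters, not the paper's experimental values): since a slip occurs when required friction exceeds available friction, P(slip) = ∫ f_req(x) F_avail(x) dx, which the trapezoidal rule evaluates directly. Heeding the abstract's warning against assuming normality for both frictions, the available friction is taken as lognormal here:

      import numpy as np
      from scipy import stats
      from scipy.integrate import trapezoid

      avail = stats.lognorm(s=0.25, scale=0.45)  # available friction (illustrative)
      req = stats.norm(loc=0.22, scale=0.05)     # required friction (illustrative)

      x = np.linspace(0.0, 1.2, 2001)
      # P(slip) = P(req > avail) = integral of f_req(x) * F_avail(x) dx
      p_slip = trapezoid(req.pdf(x) * avail.cdf(x), x)
      print(f"trapezoidal P(slip per step) = {p_slip:.2e}")

      # Monte Carlo cross-check of the same probability.
      rng = np.random.default_rng(1)
      n = 1_000_000
      mc = np.mean(req.rvs(n, random_state=rng) > avail.rvs(n, random_state=rng))
      print(f"Monte Carlo  P(slip per step) = {mc:.2e}")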

  18. Integrated-Circuit Pseudorandom-Number Generator

    NASA Technical Reports Server (NTRS)

    Steelman, James E.; Beasley, Jeff; Aragon, Michael; Ramirez, Francisco; Summers, Kenneth L.; Knoebel, Arthur

    1992-01-01

    Integrated circuit produces 8-bit pseudorandom numbers from specified probability distribution, at rate of 10 MHz. Using Boolean logic, circuit implements pseudorandom-number-generating algorithm. Circuit includes eight 12-bit pseudorandom-number generators whose outputs are uniformly distributed. 8-bit pseudorandom numbers satisfying specified nonuniform probability distribution are generated by processing uniformly distributed outputs of eight 12-bit pseudorandom-number generators through "pipeline" of D flip-flops, comparators, and memories implementing conditional probabilities on zeros and ones.
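
    A minimal software analogue of the described pipeline (a sketch of the idea, not the circuit itself): a precomputed inverse-CDF lookup table maps uniform 12-bit words to 8-bit samples of an arbitrary target distribution, which is what the memory-and-comparator stages implement in hardware:

      import numpy as np

      rng = np.random.default_rng(2024)

      # Target: any 256-bin probability vector (here a discretized Gaussian).
      bins = np.arange(256)
      target = np.exp(-0.5 * ((bins - 128) / 24.0) ** 2)
      target /= target.sum()

      # One 8-bit output per possible 12-bit input, via the cumulative table.
      cdf = np.cumsum(target)
      table = np.searchsorted(cdf, (np.arange(4096) + 0.5) / 4096).astype(np.uint8)

      uniform12 = rng.integers(0, 4096, size=100_000)  # stand-in for uniform generators
      samples = table[uniform12]                       # 8-bit nonuniform stream
      print(samples[:10], samples.mean(), samples.std())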

  19. Moment Analysis Characterizing Water Flow in Repellent Soils from On- and Sub-Surface Point Sources

    NASA Astrophysics Data System (ADS)

    Xiong, Yunwu; Furman, Alex; Wallach, Rony

    2010-05-01

    Water repellency has a significant impact on water flow patterns in the soil profile. Flow tends to become unstable in such soils, which affects water availability to plants and subsurface hydrology. In this paper, water flow in repellent soils was experimentally studied using the light reflection method. The transient 2D moisture profiles were monitored by CCD camera for tested soils packed in a transparent flow chamber. Water infiltration experiments and subsequent redistribution from on-surface and subsurface point sources with different flow rates were conducted for two soils of different repellency degrees as well as for a wettable soil. We used spatio-statistical analysis (moments) to characterize the flow patterns. The zeroth moment is related to the total volume of water inside the moisture plume, and the first and second moments correspond to the center of mass and spatial variances of the moisture plume, respectively. The experimental results demonstrate that both the general shape and size of the wetting plume and the moisture distribution within the plume for the repellent soils are significantly different from those for the wettable soil. The wetting plume of the repellent soils is smaller, narrower, and longer (finger-like) than that of the wettable soil, which tended to roundness. Compared to the wettable soil, where the soil water content decreases radially from the source, the moisture content for the water-repellent soils is higher, relatively uniform horizontally, and gradually increases with depth (saturation overshoot), indicating that flow tends to become unstable. Ellipses, defined around the mass center and whose semi-axes represent a particular number of spatial variances, were successfully used to simulate the spatial and temporal variation of the moisture distribution in the soil profiles. Cumulative probability functions were defined for the water enclosed in these ellipses. Practically identical cumulative probability functions (beta distribution) were obtained for all soils, all source types, and flow rates. Furthermore, the same distributions were obtained for the infiltration and redistribution processes. This attractive result demonstrates the competence and advantage of the moment analysis method.
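
    A minimal sketch of the moment calculations on a 2D moisture field (the field below is synthetic; in the experiments it came from light-reflection images): zeroth moment for total water, first moments for the center of mass, second central moments for the spatial variances, and the water fraction inside k-standard-deviation ellipses, whose cumulative curve is the quantity reported to follow a beta distribution:

      import numpy as np

      x = np.linspace(-10, 10, 200)          # cm
      z = np.linspace(0, 30, 300)            # cm, depth
      X, Z = np.meshgrid(x, z)
      theta = np.exp(-(X / 2.5) ** 2 - ((Z - 12) / 6.0) ** 2)  # synthetic plume

      m0 = theta.sum()                       # zeroth moment ~ total water volume
      xc = (X * theta).sum() / m0            # first moments: center of mass
      zc = (Z * theta).sum() / m0
      var_x = ((X - xc) ** 2 * theta).sum() / m0   # second central moments
      var_z = ((Z - zc) ** 2 * theta).sum() / m0
      print(f"center = ({xc:.2f}, {zc:.2f}) cm, "
            f"sigma = ({np.sqrt(var_x):.2f}, {np.sqrt(var_z):.2f}) cm")

      # Fraction of the total water inside the k-sigma ellipse, k = 1, 2, 3.
      for k in (1, 2, 3):
          inside = (X - xc) ** 2 / var_x + (Z - zc) ** 2 / var_z <= k ** 2
          print(k, round(theta[inside].sum() / m0, 3))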

  20. Application of cluster and discriminant analyses to diagnose lithological heterogeneity of the parent material according to its particle-size distribution

    NASA Astrophysics Data System (ADS)

    Giniyatullin, K. G.; Valeeva, A. A.; Smirnova, E. V.

    2017-08-01

    Particle-size distribution in soddy-podzolic and light gray forest soils of the Botanical Garden of Kazan Federal University has been studied. Cluster analysis of data on the samples from genetic soil horizons attests to the lithological heterogeneity of the profiles of all the studied soils. They probably developed from two-layered sediments with an upper colluvial layer underlain by an alluvial layer. According to the discriminant analysis, the major contribution to the discrimination of colluvial and alluvial layers is that of the fraction >0.25 mm. The results of canonical analysis show that there is only one significant discriminant function that separates alluvial and colluvial sediments on the investigated territory. The discriminant function correlates with the contents of the fractions 0.05-0.01, 0.25-0.05, and >0.25 mm. Classification functions making it possible to distinguish between alluvial and colluvial sediments have been calculated. Statistical assessment of particle-size distribution data obtained for the plow horizons on ten plowed fields within the garden indicates that this horizon is formed from colluvial sediments. We conclude that the contents of separate fractions and their ratios cannot be used as a universal criterion of lithological heterogeneity. However, an adequate combination of cluster and discriminant analyses makes it possible to give a comprehensive assessment of the lithology of soil samples from data on the contents of sand and silt fractions, which considerably increases the information value and reliability of the results.
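
    A minimal sketch of the discriminant step with invented particle-size fractions (the real analysis used the measured fractions from the genetic horizons): linear discriminant analysis separating alluvial from colluvial samples, with the coefficients showing each fraction's contribution to the single discriminant function:

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      rng = np.random.default_rng(5)
      # Columns: fractions >0.25, 0.25-0.05, 0.05-0.01 mm (invented percentages).
      alluvial = rng.normal([30, 35, 15], [5, 5, 4], size=(40, 3))
      colluvial = rng.normal([12, 28, 25], [5, 5, 4], size=(40, 3))
      X = np.vstack([alluvial, colluvial])
      y = np.array([0] * 40 + [1] * 40)      # 0 = alluvial, 1 = colluvial

      lda = LinearDiscriminantAnalysis(n_components=1).fit(X, y)
      print("training accuracy:", lda.score(X, y))
      print("discriminant coefficients:", lda.coef_)       # weight per fraction
      print("P(colluvial):", lda.predict_proba([[25.0, 33.0, 16.0]])[0, 1])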

  1. Bivariate normal, conditional and rectangular probabilities: A computer program with applications

    NASA Technical Reports Server (NTRS)

    Swaroop, R.; Brownlow, J. D.; Ashwworth, G. R.; Winter, W. R.

    1980-01-01

    Some results for the bivariate normal distribution analysis are presented. Computer programs for conditional normal probabilities, marginal probabilities, as well as joint probabilities for rectangular regions are given; routines for computing fractile points and distribution functions are also presented. Some examples from a closed circuit television experiment are included.
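
    A minimal modern equivalent of the described routines (scipy in place of the original program): rectangular-region probability by CDF inclusion-exclusion, and a conditional probability from the standard bivariate-normal conditioning formula:

      import numpy as np
      from scipy import stats

      rho = 0.6
      bvn = stats.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]])

      # P(a1 < X < b1, a2 < Y < b2) by inclusion-exclusion on the joint CDF.
      a1, b1, a2, b2 = -1.0, 1.0, -0.5, 2.0
      rect = (bvn.cdf([b1, b2]) - bvn.cdf([a1, b2])
              - bvn.cdf([b1, a2]) + bvn.cdf([a1, a2]))
      print("rectangle probability:", rect)

      # For unit variances, X | Y = y is normal with mean rho*y, var 1 - rho^2.
      y0 = 1.0
      cond = stats.norm(loc=rho * y0, scale=np.sqrt(1.0 - rho ** 2))
      print("P(X < 0.5 | Y = 1):", cond.cdf(0.5))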

  2. Assessment of source probabilities for potential tsunamis affecting the U.S. Atlantic coast

    USGS Publications Warehouse

    Geist, E.L.; Parsons, T.

    2009-01-01

    Estimating the likelihood of tsunamis occurring along the U.S. Atlantic coast critically depends on knowledge of tsunami source probability. We review available information on both earthquake and landslide probabilities from potential sources that could generate local and transoceanic tsunamis. Estimating source probability includes defining both size and recurrence distributions for earthquakes and landslides. For the former distribution, source sizes are often distributed according to a truncated or tapered power-law relationship. For the latter distribution, sources are often assumed to occur in time according to a Poisson process, simplifying the way tsunami probabilities from individual sources can be aggregated. For the U.S. Atlantic coast, earthquake tsunami sources primarily occur at transoceanic distances along plate boundary faults. Probabilities for these sources are constrained from previous statistical studies of global seismicity for similar plate boundary types. In contrast, there is presently little information constraining landslide probabilities that may generate local tsunamis. Though there is significant uncertainty in tsunami source probabilities for the Atlantic, results from this study yield a comparative analysis of tsunami source recurrence rates that can form the basis for future probabilistic analyses.
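
    A minimal sketch of the simplification the Poisson assumption buys for aggregating sources (illustrative rates only, not the study's estimates): independent Poisson sources add in rate, so the probability of at least one tsunami-generating event in an exposure window follows in one line:

      import numpy as np

      rates = np.array([1 / 500, 1 / 2000, 1 / 10000])  # per-source annual rates (invented)
      T = 50.0                                          # exposure window, years

      lam_total = rates.sum()                # Poisson rates are additive
      p_any = 1.0 - np.exp(-lam_total * T)   # P(at least one event in T years)
      print(f"P(at least one event in {T:.0f} yr) = {p_any:.3f}")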

  3. Multinomial mixture model with heterogeneous classification probabilities

    USGS Publications Warehouse

    Holland, M.D.; Gray, B.R.

    2011-01-01

    Royle and Link (Ecology 86(9):2505-2512, 2005) proposed an analytical method that allowed estimation of multinomial distribution parameters and classification probabilities from categorical data measured with error. While useful, we demonstrate algebraically and by simulations that this method yields biased multinomial parameter estimates when the probabilities of correct category classifications vary among sampling units. We address this shortcoming by treating these probabilities as logit-normal random variables within a Bayesian framework. We use Markov chain Monte Carlo to compute Bayes estimates from a simulated sample from the posterior distribution. Based on simulations, this elaborated Royle-Link model yields nearly unbiased estimates of multinomial and correct classification probability estimates when classification probabilities are allowed to vary according to the normal distribution on the logit scale or according to the Beta distribution. The method is illustrated using categorical submersed aquatic vegetation data. © 2010 Springer Science+Business Media, LLC.

  4. Ubiquity of Benford's law and emergence of the reciprocal distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friar, James Lewis; Goldman, Terrance J.; Pérez-Mercader, J.

    In this paper, we apply the Law of Total Probability to the construction of scale-invariant probability distribution functions (pdf's), and require that probability measures be dimensionless and unitless under a continuous change of scales. If the scale-change distribution function is scale invariant then the constructed distribution will also be scale invariant. Repeated application of this construction on an arbitrary set of (normalizable) pdf's results again in scale-invariant distributions. The invariant function of this procedure is given uniquely by the reciprocal distribution, suggesting a kind of universality. Finally, we separately demonstrate that the reciprocal distribution results uniquely from requiring maximum entropy for size-class distributions with uniform bin sizes.

  5. Positive phase space distributions and uncertainty relations

    NASA Technical Reports Server (NTRS)

    Kruger, Jan

    1993-01-01

    In contrast to a widespread belief, Wigner's theorem allows the construction of true joint probabilities in phase space for distributions describing the object system as well as for distributions depending on the measurement apparatus. The fundamental role of Heisenberg's uncertainty relations in Schroedinger form (including correlations) is pointed out for these two possible interpretations of joint probability distributions. Hence, in order that a multivariate normal probability distribution in phase space may correspond to a Wigner distribution of a pure or a mixed state, it is necessary and sufficient that Heisenberg's uncertainty relation in Schroedinger form should be satisfied.
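
    For reference, the Schrödinger form of the uncertainty relation invoked above, including the position-momentum correlation, is the standard inequality

      \sigma_x^2\,\sigma_p^2 - \sigma_{xp}^2 \;\ge\; \frac{\hbar^2}{4},
      \qquad
      \sigma_{xp} \;=\; \tfrac{1}{2}\,\langle \hat{x}\hat{p} + \hat{p}\hat{x} \rangle - \langle \hat{x}\rangle\langle \hat{p}\rangle,

    and for a multivariate normal phase-space distribution it is exactly the condition for the covariance matrix to be admissible as that of a Wigner function of a pure or mixed state.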

  6. Ubiquity of Benford's law and emergence of the reciprocal distribution

    DOE PAGES

    Friar, James Lewis; Goldman, Terrance J.; Pérez-Mercader, J.

    2016-04-07

    In this paper, we apply the Law of Total Probability to the construction of scale-invariant probability distribution functions (pdf's), and require that probability measures be dimensionless and unitless under a continuous change of scales. If the scale-change distribution function is scale invariant then the constructed distribution will also be scale invariant. Repeated application of this construction on an arbitrary set of (normalizable) pdf's results again in scale-invariant distributions. The invariant function of this procedure is given uniquely by the reciprocal distribution, suggesting a kind of universality. Finally, we separately demonstrate that the reciprocal distribution results uniquely from requiring maximum entropy for size-class distributions with uniform bin sizes.
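
    A minimal numerical illustration of the connection between the reciprocal distribution and Benford's law (a standard demonstration, not code from the paper): sample p(x) ∝ 1/x on [1, 10) by the inverse-CDF transform x = 10^U and compare leading-digit frequencies with log10(1 + 1/d):

      import numpy as np

      rng = np.random.default_rng(3)
      u = rng.random(1_000_000)
      x = 10.0 ** u            # inverse-CDF sample of p(x) proportional to 1/x on [1, 10)

      digits = x.astype(int)   # leading digit, since 1 <= x < 10
      observed = np.bincount(digits, minlength=10)[1:] / x.size
      benford = np.log10(1.0 + 1.0 / np.arange(1, 10))
      for d in range(1, 10):
          print(d, f"observed = {observed[d - 1]:.4f}", f"Benford = {benford[d - 1]:.4f}")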

  7. Trace elements and REE geochemistry of Middle Devonian carbonate mounds (Maïder Basin, Eastern Anti-Atlas, Morocco): Implications for early diagenetic processes

    NASA Astrophysics Data System (ADS)

    Franchi, Fulvio; Turetta, Clara; Cavalazzi, Barbara; Corami, Fabiana; Barbieri, Roberto

    2016-08-01

    Trace and rare earth elements (REEs) have proven their utility as tools for assessing the genesis and early diagenesis of widespread geological bodies such as carbonate mounds, whose genetic processes are not yet fully understood. Carbonates from the Middle Devonian conical mud mounds of the Maïder Basin (eastern Anti-Atlas, Morocco) have been analysed for their REE and trace element distribution. Collectively, the carbonates from the Maïder Basin mud mounds appear to display coherent REE patterns. Three different geochemical patterns, possibly related with three different diagenetic events, include: i) dyke fills with a normal marine REE pattern probably precipitated in equilibrium with seawater, ii) mound micrite with a particular enrichment of overall REE contents and variable Ce anomaly probably related to variation of pH, increase of alkalinity or dissolution/remineralization of organic matter during early diagenesis, and iii) haematite-rich vein fills precipitated from venting fluids of probable hydrothermal origin. Our results reinforce the hypothesis that these mounds were probably affected by an early diagenesis induced by microbial activity and triggered by abundance of dispersed organic matter, whilst venting may have affected the mounds during a later diagenetic phase.

  8. Spatial Probability Distribution of Strata's Lithofacies and its Impacts on Land Subsidence in Huairou Emergency Water Resources Region of Beijing

    NASA Astrophysics Data System (ADS)

    Li, Y.; Gong, H.; Zhu, L.; Guo, L.; Gao, M.; Zhou, C.

    2016-12-01

    Continuous over-exploitation of groundwater causes dramatic drawdown and leads to regional land subsidence in the Huairou Emergency Water Resources region, which is located in the up-middle part of the Chaobai river basin of Beijing. Owing to the spatial heterogeneity of the strata's lithofacies in the alluvial fan, ground deformation has no significant positive correlation with groundwater drawdown, and one of the challenges ahead is to quantify the spatial distribution of the strata's lithofacies. The transition probability geostatistics approach provides potential for characterizing the distribution of heterogeneous lithofacies in the subsurface. By combining the thickness of the clay layer extracted from the simulation with the deformation field acquired from PS-InSAR technology, the influence of the strata's lithofacies on land subsidence can be analyzed quantitatively. The strata's lithofacies derived from borehole data were generalized into four categories, and their probability distribution in the observation space was mined using transition probability geostatistics; clay was the predominant compressible material. Geologically plausible realizations of the lithofacies distribution were produced, accounting for the complex heterogeneity of the alluvial plain. At a probability level of more than 40 percent, the volume of clay so defined was 55 percent of the total volume of strata's lithofacies. This level, nearly equaling the volume of compressible clay derived from the geostatistics, was thus chosen to represent the boundary between compressible and uncompressible material. The method incorporates statistical geological information, such as distribution proportions, average lengths and juxtaposition tendencies of geological types, mainly derived from borehole data and expert knowledge, into the Markov chain model of transition probability. Some similarities of patterns were indicated between the spatial distribution of the deformation field and the clay layer. In areas with roughly similar water-table decline, more subsidence occurs at locations where the subsurface has a higher probability of containing compressible material than at locations with a lower probability. Such an estimate of the spatial probability distribution is useful for analyzing the uncertainty of land subsidence.

  9. The exact probability distribution of the rank product statistics for replicated experiments.

    PubMed

    Eisinga, Rob; Breitling, Rainer; Heskes, Tom

    2013-03-18

    The rank product method is a widely accepted technique for detecting differentially regulated genes in replicated microarray experiments. To approximate the sampling distribution of the rank product statistic, the original publication proposed a permutation approach, whereas recently an alternative approximation based on the continuous gamma distribution was suggested. However, both approximations are imperfect for estimating small tail probabilities. In this paper we relate the rank product statistic to number theory and provide a derivation of its exact probability distribution and the true tail probabilities. Copyright © 2013 Federation of European Biochemical Societies. Published by Elsevier B.V. All rights reserved.
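
    A minimal sketch of the statistic and of the permutation approximation that the paper's exact derivation replaces (synthetic data; rank 1 = most up-regulated per replicate):

      import numpy as np

      rng = np.random.default_rng(11)
      n_genes, k = 1000, 4
      data = rng.normal(size=(n_genes, k))   # synthetic expression changes
      data[0] += 2.0                         # one truly up-regulated gene

      ranks = (-data).argsort(axis=0).argsort(axis=0) + 1  # per-replicate ranks
      rp = np.exp(np.log(ranks).mean(axis=1))              # rank product (geometric mean)

      # Permutation null: under H0 each gene's rank is uniform on 1..n_genes
      # independently per replicate; small tail probabilities converge slowly,
      # which is the weakness the exact distribution addresses.
      n_perm = 100_000
      null = np.exp(np.log(rng.integers(1, n_genes + 1, size=(n_perm, k))).mean(axis=1))
      p_val = (null <= rp[0]).mean()
      print(f"RP(gene 0) = {rp[0]:.1f}, permutation p = {p_val:.4g}")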

  10. An observational study of entrainment rate in deep convection

    DOE PAGES

    Guo, Xiaohao; Lu, Chunsong; Zhao, Tianliang; ...

    2015-09-22

    This study estimates entrainment rate and investigates its relationships with cloud properties in 156 deep convective clouds based on in-situ aircraft observations during the TOGA-COARE (Tropical Ocean Global Atmosphere Coupled Ocean Atmosphere Response Experiment) field campaign over the western Pacific. To the authors' knowledge, this is the first study on the probability density function of entrainment rate, the relationships between entrainment rate and cloud microphysics, and the effects of dry air sources on the calculated entrainment rate in deep convection from an observational perspective. Results show that the probability density function of entrainment rate can be well fitted by lognormal, gamma or Weibull distributions, with coefficients of determination being 0.82, 0.85 and 0.80, respectively. Entrainment tends to reduce temperature, water vapor content and moist static energy in cloud due to evaporative cooling and dilution. Inspection of the relationships between entrainment rate and microphysical properties reveals a negative correlation between volume-mean radius and entrainment rate, suggesting the potential dominance of the homogeneous mechanism in the clouds examined. The entrainment rate and environmental water vapor content show similar tendencies of variation with the distance of the assumed environmental air to the cloud edges. Their variation tendencies are non-monotonic due to the relatively short distance between adjacent clouds.

  11. An observational study of entrainment rate in deep convection

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Xiaohao; Lu, Chunsong; Zhao, Tianliang

    This study estimates entrainment rate and investigates its relationships with cloud properties in 156 deep convective clouds based on in-situ aircraft observations during the TOGA-COARE (Tropical Ocean Global Atmosphere Coupled Ocean Atmosphere Response Experiment) field campaign over the western Pacific. To the authors' knowledge, this is the first study on the probability density function of entrainment rate, the relationships between entrainment rate and cloud microphysics, and the effects of dry air sources on the calculated entrainment rate in deep convection from an observational perspective. Results show that the probability density function of entrainment rate can be well fitted by lognormal, gamma or Weibull distributions, with coefficients of determination being 0.82, 0.85 and 0.80, respectively. Entrainment tends to reduce temperature, water vapor content and moist static energy in cloud due to evaporative cooling and dilution. Inspection of the relationships between entrainment rate and microphysical properties reveals a negative correlation between volume-mean radius and entrainment rate, suggesting the potential dominance of the homogeneous mechanism in the clouds examined. The entrainment rate and environmental water vapor content show similar tendencies of variation with the distance of the assumed environmental air to the cloud edges. Their variation tendencies are non-monotonic due to the relatively short distance between adjacent clouds.

  12. Modeling the probability distribution of peak discharge for infiltrating hillslopes

    NASA Astrophysics Data System (ADS)

    Baiamonte, Giorgio; Singh, Vijay P.

    2017-07-01

    Hillslope response plays a fundamental role in the prediction of peak discharge at the basin outlet. The peak discharge for the critical duration of rainfall and its probability distribution are needed for designing urban infrastructure facilities. This study derives the probability distribution, denoted as the GABS model, by coupling three models: (1) the Green-Ampt model for computing infiltration, (2) the kinematic wave model for computing the discharge hydrograph from the hillslope, and (3) the intensity-duration-frequency (IDF) model for computing design rainfall intensity. The Hortonian mechanism for runoff generation is employed for computing the surface runoff hydrograph. Since the antecedent soil moisture condition (ASMC) significantly affects the rate of infiltration, its effect on the probability distribution of peak discharge is investigated. Application to a watershed in Sicily, Italy, shows that as the probability increases, the expected effect of ASMC in increasing the maximum discharge diminishes. Only for low probabilities is the critical duration of rainfall influenced by ASMC, and its effect on the peak discharge appears small at any probability. For a given set of parameters, the derived probability distribution of peak discharge is fitted well by the gamma distribution. Finally, the GABS model was applied to a small watershed, with the aim of testing whether rational runoff coefficient tables for the rational method can be arranged in advance, and peak discharges obtained by the GABS model were compared with those measured in an experimental flume for a loamy-sand soil.
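
    The Green-Ampt component can be sketched as below: under ponded conditions, cumulative infiltration F(t) solves an implicit equation by fixed-point iteration, and ASMC enters through the moisture deficit. The parameter values are illustrative for a loamy sand, not those calibrated for the Sicilian watershed.

      import numpy as np

      def green_ampt_F(t, Ks, psi, dtheta, tol=1e-8):
          # Solve F - S*ln(1 + F/S) = Ks*t with S = psi*dtheta (wetting-front term)
          # Ks: saturated conductivity [cm/h]; psi: suction head [cm]; dtheta: deficit
          S = psi * dtheta
          F = max(Ks * t, 1e-9)
          for _ in range(100):
              F_new = Ks * t + S * np.log(1.0 + F / S)
              if abs(F_new - F) < tol:
                  break
              F = F_new
          return F

      Ks, psi, dtheta = 3.0, 6.1, 0.3      # a drier antecedent state means a larger dtheta
      for t in (0.25, 0.5, 1.0, 2.0):
          F = green_ampt_F(t, Ks, psi, dtheta)
          print(f"t = {t:4.2f} h  F = {F:5.2f} cm  f = {Ks * (1 + psi * dtheta / F):5.2f} cm/h")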

  13. Design of a sampling plan to detect ochratoxin A in green coffee.

    PubMed

    Vargas, E A; Whitaker, T B; Dos Santos, E A; Slate, A B; Lima, F B; Franca, R C A

    2006-01-01

    The establishment of maximum limits for ochratoxin A (OTA) in coffee by importing countries requires that coffee-producing countries develop scientifically based sampling plans to assess OTA content in lots of green coffee before the coffee enters the market, thus reducing consumer exposure to OTA, minimizing the number of lots rejected, and reducing financial loss for producing countries. A study was carried out to design an official sampling plan to determine OTA in green coffee produced in Brazil. Twenty-five lots of green coffee (type 7 - approximately 160 defects) were sampled according to an experimental protocol in which 16 test samples were taken from each lot (a total of 16 kg), resulting in a total of 800 OTA analyses. The total, sampling, sample-preparation, and analytical variances were 10.75 (CV = 65.6%), 7.80 (CV = 55.8%), 2.84 (CV = 33.7%), and 0.11 (CV = 6.6%), respectively, assuming a regulatory limit of 5 µg kg⁻¹ OTA and using a 1 kg sample, a Romer RAS mill, 25 g sub-samples, and high-performance liquid chromatography. The observed OTA distribution among the 16 sample results was compared to several theoretical distributions. The 2-parameter lognormal distribution was selected to model OTA test results for green coffee, as it gave the best fit across all 25 lot distributions. Computer software was developed using the variance and distribution information to predict the probability of accepting or rejecting coffee lots at specific OTA concentrations. The acceptance probability was used to compute an operating characteristic (OC) curve specific to a sampling plan design. The OC curve was used to predict the rejection of good lots (sellers' or exporters' risk) and the acceptance of bad lots (buyers' or importers' risk).
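
    The OC-curve computation can be sketched as follows, assuming, as one simple reading of the plan, that a lot's test result is 2-parameter lognormal with mean equal to the true lot concentration and a constant total CV of 65.6%; the function name is hypothetical.

      import numpy as np
      from scipy import stats

      LIMIT = 5.0          # regulatory limit, ug/kg
      CV_TOTAL = 0.656     # total CV from the variance study

      def p_accept(true_conc):
          # P(test result <= LIMIT) for a lognormal with mean true_conc and CV = CV_TOTAL
          sigma2 = np.log(1.0 + CV_TOTAL ** 2)
          mu = np.log(true_conc) - 0.5 * sigma2
          return stats.lognorm.cdf(LIMIT, s=np.sqrt(sigma2), scale=np.exp(mu))

      for m in (2, 4, 5, 6, 8, 12):
          print(f"true OTA = {m:4.1f} ug/kg   P(accept) = {p_accept(m):.3f}")
      # 1 - p_accept below the limit is the sellers' risk; p_accept above it, the buyers' risk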

  14. Constructing inverse probability weights for continuous exposures: a comparison of methods.

    PubMed

    Naimi, Ashley I; Moodie, Erica E M; Auger, Nathalie; Kaufman, Jay S

    2014-03-01

    Inverse probability-weighted marginal structural models with binary exposures are common in epidemiology. Constructing inverse probability weights for a continuous exposure can be complicated by the presence of outliers, and the need to identify a parametric form for the exposure and account for nonconstant exposure variance. We explored the performance of various methods to construct inverse probability weights for continuous exposures using Monte Carlo simulation. We generated two continuous exposures and binary outcomes using data sampled from a large empirical cohort. The first exposure followed a normal distribution with homoscedastic variance. The second exposure followed a contaminated Poisson distribution, with heteroscedastic variance equal to the conditional mean. We assessed six methods to construct inverse probability weights using: a normal distribution, a normal distribution with heteroscedastic variance, a truncated normal distribution with heteroscedastic variance, a gamma distribution, a t distribution (1, 3, and 5 degrees of freedom), and a quantile binning approach (based on 10, 15, and 20 exposure categories). We estimated the marginal odds ratio for a single-unit increase in each simulated exposure in a regression model weighted by the inverse probability weights constructed using each approach, and then computed the bias and mean squared error for each method. For the homoscedastic exposure, the standard normal, gamma, and quantile binning approaches performed best. For the heteroscedastic exposure, the quantile binning, gamma, and heteroscedastic normal approaches performed best. Our results suggest that the quantile binning approach is a simple and versatile way to construct inverse probability weights for continuous exposures.
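
    A minimal sketch of one of the compared approaches: stabilized weights built from normal densities for a homoscedastic exposure (simulated here, not the empirical cohort), followed by a weighted logistic fit for the marginal odds ratio.

      import numpy as np
      from scipy import stats
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(2)
      n = 5000
      L = rng.normal(size=n)                            # confounder
      A = 0.5 * L + rng.normal(size=n)                  # continuous exposure
      Y = rng.binomial(1, 1 / (1 + np.exp(-(0.3 * A + 0.4 * L))))

      slope, intercept = np.polyfit(L, A, 1)            # exposure model A | L
      resid = A - (slope * L + intercept)
      den = stats.norm.pdf(A, slope * L + intercept, resid.std(ddof=2))
      num = stats.norm.pdf(A, A.mean(), A.std())        # marginal density (stabilization)
      sw = num / den                                    # stabilized weights

      msm = LogisticRegression(C=1e6).fit(A.reshape(-1, 1), Y, sample_weight=sw)
      print("marginal OR per unit exposure:", np.exp(msm.coef_[0][0]))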

  15. Aggregation of Environmental Model Data for Decision Support

    NASA Astrophysics Data System (ADS)

    Alpert, J. C.

    2013-12-01

    Weather forecasts and warnings must be prepared and then delivered so as to reach their intended audience in good time to enable effective decision-making. Efforts to mitigate these difficulties were discussed at the workshop 'Sustaining National Meteorological Services - Strengthening WMO Regional and Global Centers', convened in June 2013 by the World Bank, the WMO, and the US National Weather Service (NWS). The skill and accuracy of atmospheric forecasts from deterministic models have increased, and there are now ensembles of such models that improve decisions to protect life, property and commerce. NWS production of numerical weather prediction products results in model output from global and high-resolution regional ensemble forecasts. Ensembles are constructed by perturbing the initial conditions to make a 'cloud' of forecasts that attempts to span the space of possible atmospheric realizations, which quantifies not only the most likely forecast but also its uncertainty. This has led to an unprecedented increase in data production and information content from higher-resolution, multi-model output and secondary calculations. One difficulty is to obtain the subset of data required to estimate the probability of events and report the information. Calibration to reliably estimate the probability of events, and the honing of threshold adjustments to reduce false alarms for decision makers, are also needed. To meet the future needs of the ever-broadening user community and address these issues on a national and international basis, the weather service implemented the NOAA Operational Model Archive and Distribution System (NOMADS). NOMADS provides real-time and retrospective format-independent access to climate, ocean and weather model data and delivers high-availability content services as part of NOAA's official real-time data dissemination at its new NCWCP web operations center. An important aspect of the server's abilities is to aggregate the matrix of model output, offering access to probability and calibration information for real-time decision making. The aggregation content server reports over ensemble component and forecast time, in addition to the other data dimensions of vertical layer and position for each variable. The unpacking, organization and reading of many binary packed files is accomplished most efficiently on the server, while weather-element event probability calculations, the thresholds for more accurate decision support, and display remain with the client. Our goal is to reduce uncertainty for variables of interest, e.g., those of agricultural importance. The weather service's operational GFS model ensemble and short-range ensemble forecasts can make skillful probability forecasts that alert users if and when their selected weather events will occur. A description of how this framework operates and how it can be implemented using existing NOMADS content services and applications is given.
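
    In its uncalibrated form, the event-probability step described above reduces to the fraction of ensemble members exceeding a threshold; the sketch below uses hypothetical gridded rainfall members rather than actual GFS ensemble output.

      import numpy as np

      def event_probability(members, threshold):
          # members: (n_members, ...) forecast field; returns P(event) per grid point
          return (np.asarray(members) >= threshold).mean(axis=0)

      rng = np.random.default_rng(0)
      precip = rng.gamma(shape=2.0, scale=5.0, size=(21, 4, 4))  # 21 members, 24-h rain (mm)
      print(np.round(event_probability(precip, threshold=25.0), 2))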

  16. Searching for life in extreme environments relevant to Jovian's Europa: Lessons from subglacial ice studies at Lake Vostok (East Antarctica)

    NASA Astrophysics Data System (ADS)

    Bulat, Sergey A.; Alekhina, Irina A.; Marie, Dominique; Martins, Jean; Petit, Jean Robert

    2011-08-01

    The objective was to estimate the genuine microbial content of ice samples from refrozen water (accretion ice) from the subglacial Lake Vostok (Antarctica), buried beneath the 4-km-thick East Antarctic ice sheet. The samples were extracted by heavy deep ice drilling from 3659 m below the surface. High pressure, a low carbon and chemical content, isolation, complete darkness and the probable excess of oxygen in water for millions of years characterize this extreme environment. A decontamination protocol was first applied to samples selected for the absence of cracks to remove the outer part contaminated by handling and drilling fluid. Preliminary indications showed the accretion ice samples to be almost gas free with a low impurity content. Flow cytometry showed a very low, unevenly distributed biomass, while repeated microscopic observations were unsuccessful. We used strategies of Ancient DNA research that include establishing contaminant databases and criteria to validate the amplification results. To date, positive results that passed the artifact and contaminant databases have been obtained for a pair of bacterial phylotypes, and only in accretion ice samples bearing some bedrock sediments. The phylotypes included the chemolithoautotrophic thermophile Hydrogenophilus thermoluteolus and one unclassified phylotype. Combined with geochemical and geophysical considerations, our results suggest the presence of a deep biosphere, possibly thriving within some active faults of the bedrock encircling the subglacial lake, where the temperature is as high as 50 °C and in situ hydrogen is probably present. Our approach indicates that the search for life in the subglacial Lake Vostok is constrained by a high probability of forward-contamination. Our strategy includes strict decontamination procedures, thorough tracking of contaminants at each step of the analysis and validation of the results along with geophysical and ecological considerations for the lake setting. This may serve to establish a guideline protocol for studying extraterrestrial ice samples.

  17. Assessing the Ability of Vegetation Indices to Identify Shallow Subsurface Water Flow Pathways from Hyperspectral Imagery Using Machine Learning: Application

    NASA Astrophysics Data System (ADS)

    Doctor, K.; Byers, J. M.

    2017-12-01

    Shallow underground water flow pathways expressed as slight depressions are common in the land surface. Under conditions of saturated overland flow, such as during heavy rain or snowmelt, these areas of preferential flow might appear on the surface as very shallow flowing streams. When no water is flowing in these ephemeral channels, they can be difficult to identify. It is especially difficult to discern the slight depressions above the subsurface water flow pathways (SWFP) when the area is covered by vegetation. Since the soil moisture content in these SWFP is often greater than in the surrounding area, the vegetation growing on top of these channels shows different vigor and moisture content than the vegetation growing above non-SWFP areas. Vegetation indices (VI) are used in visible and near-infrared (VNIR) hyperspectral imagery to enhance biophysical properties of vegetation, and so differences in brightness between vegetation atop SWFP and the surrounding vegetation were highlighted. We performed supervised machine learning using ground-truth class labels to determine the conditional probability of a SWFP at a given pixel given either the spectral distribution or the VI at that pixel. The training data estimate the probability distributions, to a finite sampling accuracy, for a binary Naïve Bayes classifier between SWFP and non-SWFP. The ground-truth data provide a test bed for understanding the ability to build SWFP classifiers from hyperspectral imagery. SWFP were distinguishable in the imagery within corn and grass fields and in areas with low-lying vegetation. However, the training data are limited to particular types of terrain and vegetation cover in the Shenandoah Valley, Virginia, which would limit the resulting classifier. Further training data could extend its use to other environments.
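
    A minimal sketch of the classification step, assuming a single hypothetical vegetation-index feature and simulated class-conditional values in place of the Shenandoah Valley imagery.

      import numpy as np
      from sklearn.naive_bayes import GaussianNB

      rng = np.random.default_rng(7)
      vi_swfp = rng.normal(0.72, 0.05, 300)      # VI over channels (hypothetical)
      vi_other = rng.normal(0.60, 0.07, 700)     # VI over non-SWFP ground
      X = np.concatenate([vi_swfp, vi_other]).reshape(-1, 1)
      y = np.concatenate([np.ones(300), np.zeros(700)])

      clf = GaussianNB().fit(X, y)
      # columns: P(non-SWFP), P(SWFP) for a pixel whose VI is 0.70
      print(clf.predict_proba([[0.70]]))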

  18. The Laplace method for probability measures in Banach spaces

    NASA Astrophysics Data System (ADS)

    Piterbarg, V. I.; Fatalov, V. R.

    1995-12-01

    Contents
    §1. Introduction
    Chapter I. Asymptotic analysis of continual integrals in Banach space, depending on a large parameter
    §2. The large deviation principle and logarithmic asymptotics of continual integrals
    §3. Exact asymptotics of Gaussian integrals in Banach spaces: the Laplace method
      3.1. The Laplace method for Gaussian integrals taken over the whole Hilbert space: isolated minimum points ([167], I)
      3.2. The Laplace method for Gaussian integrals in Hilbert space: the manifold of minimum points ([167], II)
      3.3. The Laplace method for Gaussian integrals in Banach space ([90], [174], [176])
      3.4. Exact asymptotics of large deviations of Gaussian norms
    §4. The Laplace method for distributions of sums of independent random elements with values in Banach space
      4.1. The case of a non-degenerate minimum point ([137], I)
      4.2. A degenerate isolated minimum point and the manifold of minimum points ([137], II)
    §5. Further examples
      5.1. The Laplace method for the local time functional of a Markov symmetric process ([217])
      5.2. The Laplace method for diffusion processes, a finite number of non-degenerate minimum points ([116])
      5.3. Asymptotics of large deviations for Brownian motion in the Hölder norm
      5.4. Non-asymptotic expansion of a strong stable law in Hilbert space ([41])
    Chapter II. The double sum method - a version of the Laplace method in the space of continuous functions
    §6. Pickands' method of double sums
      6.1. General situations
      6.2. Asymptotics of the distribution of the maximum of a Gaussian stationary process
      6.3. Asymptotics of the probability of a large excursion of a Gaussian non-stationary process
    §7. Probabilities of large deviations of trajectories of Gaussian fields
      7.1. Homogeneous fields and fields with constant dispersion
      7.2. Finitely many maximum points of dispersion
      7.3. Manifold of maximum points of dispersion
      7.4. Asymptotics of distributions of maxima of Wiener fields
    §8. Exact asymptotics of large deviations of the norm of Gaussian vectors and processes with values in the spaces L_k^p and l^2. Gaussian fields with the set of parameters in Hilbert space
      8.1. Exact asymptotics of the distribution of the l_k^p-norm of a Gaussian finite-dimensional vector with dependent coordinates, p > 1
      8.2. Exact asymptotics of probabilities of high excursions of trajectories of processes of type χ²
      8.3. Asymptotics of the probabilities of large deviations of Gaussian processes with a set of parameters in Hilbert space [74]
      8.4. Asymptotics of distributions of maxima of the norms of l²-valued Gaussian processes
      8.5. Exact asymptotics of large deviations for the l²-valued Ornstein-Uhlenbeck process
    Bibliography

  19. Use of the negative binomial-truncated Poisson distribution in thunderstorm prediction

    NASA Technical Reports Server (NTRS)

    Cohen, A. C.

    1971-01-01

    A probability model is presented for the distribution of thunderstorms over a small area given that thunderstorm events (1 or more thunderstorms) are occurring over a larger area. The model incorporates the negative binomial and truncated Poisson distributions. Probability tables for Cape Kennedy for spring, summer, and fall months and seasons are presented. The computer program used to compute these probabilities is appended.
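
    A sketch of the two distributional building blocks named above; the parameters are illustrative rather than the Cape Kennedy fits, and the exact way the report composes the two distributions is not reproduced here.

      import numpy as np
      from scipy import stats

      def truncated_poisson_pmf(k, lam):
          # zero-truncated Poisson: counts conditioned on at least one thunderstorm
          return stats.poisson.pmf(k, lam) / (1.0 - np.exp(-lam))

      lam = 1.8                                   # hypothetical mean storms per event
      k = np.arange(1, 8)
      print(np.round(truncated_poisson_pmf(k, lam), 3))
      print(np.round(stats.nbinom.pmf(np.arange(5), 2.0, 0.5), 3))   # negative binomial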

  20. Comparison of Deterministic and Probabilistic Radial Distribution Systems Load Flow

    NASA Astrophysics Data System (ADS)

    Gupta, Atma Ram; Kumar, Ashwani

    2017-12-01

    Distribution system networks today face the challenge of meeting increased load demands from the industrial, commercial, and residential sectors. The pattern of load is highly dependent on consumer behavior and on temporal factors such as the season of the year, the day of the week, or the time of day. For deterministic radial distribution load flow studies, the load is taken as constant; in reality, load varies continually and with a high degree of uncertainty, so there is a need to model probable realistic load. Monte Carlo simulation is used to model probable realistic load by generating random values of active and reactive power load from the mean and standard deviation of the load, and by solving a deterministic radial load flow for each set of sampled values. The probabilistic solution is reconstructed from the deterministic results obtained for each simulation. The main contributions of the work are: finding the impact of probable realistic ZIP load modeling on balanced radial distribution load flow; finding its impact on unbalanced radial distribution load flow; and comparing the voltage profile and losses under probable realistic ZIP load modeling for balanced and unbalanced radial distribution load flow.
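
    A minimal sketch of the Monte Carlo procedure on a toy three-bus radial feeder solved by a backward/forward sweep; the feeder data and load statistics are hypothetical, and a full ZIP load model would replace the constant-power injections used here.

      import numpy as np

      def radial_load_flow(P, Q, r, x, v0=1.0, iters=30):
          # per-unit backward/forward sweep on a chain feeder (bus i fed by segment i)
          V = np.full(len(P), complex(v0))
          for _ in range(iters):
              I = np.conj((P + 1j * Q) / V)              # bus injection currents
              branch = np.cumsum(I[::-1])[::-1]          # backward sweep: branch currents
              V = v0 - np.cumsum((r + 1j * x) * branch)  # forward sweep from the source
          return np.abs(V)

      rng = np.random.default_rng(3)
      muP = np.array([0.04, 0.03, 0.05])                 # mean active loads, p.u.
      vmin = []
      for _ in range(2000):                              # Monte Carlo over load uncertainty
          P = rng.normal(muP, 0.1 * muP)                 # sigma = 10% of mean
          vmin.append(radial_load_flow(P, 0.5 * P,
                                       np.full(3, 0.02), np.full(3, 0.01)).min())
      print(f"worst-bus voltage: mean = {np.mean(vmin):.4f}, std = {np.std(vmin):.4f}")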

  1. Uranium concentration and distribution in six peridotite inclusions of probable mantle origin

    NASA Technical Reports Server (NTRS)

    Haines, E. L.; Zartman, R. E.

    1973-01-01

    Fission-track activation was used to investigate uranium concentration and distribution in peridotite inclusions in alkali basalt from six localities. Whole-rock uranium concentrations range from 24 to 82 ng/g. Most of the uranium is uniformly distributed in the major silicate phases - olivine, orthopyroxene, and clinopyroxene. Chromian spinels may be classified into two groups on the basis of their uranium content - those which have less than 10 ng/g and those which have 100 to 150 ng/g U. In one sample accessory hydrous phases, phlogopite and hornblende, contain 130 and 300 ng/g U, respectively. The contact between the inclusion and the host basalt is usually quite sharp. Glassy or microcrystalline veinlets found in some samples contain more than 1 microgram/g. Very little uranium is associated with microcrystals of apatite. These results agree with some earlier investigators, who have concluded that suboceanic peridotites contain too little uranium to account for normal oceanic heat flow by conduction alone.

  2. Probability Distribution of Turbulent Kinetic Energy Dissipation Rate in Ocean: Observations and Approximations

    NASA Astrophysics Data System (ADS)

    Lozovatsky, I.; Fernando, H. J. S.; Planella-Morato, J.; Liu, Zhiyu; Lee, J.-H.; Jinadasa, S. U. P.

    2017-10-01

    The probability distribution of turbulent kinetic energy dissipation rate in the stratified ocean usually deviates from the classic lognormal distribution that has been formulated for, and often observed in, unstratified homogeneous layers of atmospheric and oceanic turbulence. Our measurements of vertical profiles of micro-scale shear, collected in the East China Sea, the northern Bay of Bengal, to the south and east of Sri Lanka, and in the Gulf Stream region, show that the probability distributions of the dissipation rate ε̃_r in the pycnoclines (r ≈ 1.4 m is the averaging scale) can be successfully modeled by the Burr (type XII) probability distribution. In weakly stratified boundary layers, a lognormal distribution of ε̃_r is preferable, although the Burr is an acceptable alternative. The skewness Sk_ε and the kurtosis K_ε of the dissipation rate appear to be well correlated over a wide range of Sk_ε and K_ε variability.
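
    A minimal sketch of comparing Burr (type XII) and lognormal fits to dissipation-rate samples, with synthetic data standing in for the shear-derived profiles; the Kolmogorov-Smirnov statistic serves as a simple goodness-of-fit summary.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(4)
      eps = rng.lognormal(mean=-20.0, sigma=1.5, size=2000)   # dissipation rates, W/kg

      for dist in (stats.lognorm, stats.burr12):
          params = dist.fit(eps, floc=0)
          ks = stats.kstest(eps, dist.name, args=params)
          print(f"{dist.name}: KS = {ks.statistic:.3f}")

      print("Sk =", stats.skew(eps), " K =", stats.kurtosis(eps, fisher=False))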

  3. Major and trace element distribution in soil and sediments from the Egyptian central Nile Valley

    NASA Astrophysics Data System (ADS)

    Badawy, W. M.; Ghanim, E. H.; Duliu, O. G.; El Samman, H.; Frontasyeva, M. V.

    2017-07-01

    The distributions of 32 major and trace elements in 72 surface soil and sediment samples collected from the Asyut-to-Cairo section of the Nile River were determined by epithermal neutron activation analysis and compared with corresponding data for the Upper Continental Crust, the North American Shale Composite, Average Soil, and Average Sediment, as well as suspended sediments from the Congo and Upper Niger Rivers, in order to establish to what extent the Nile sedimentary material can be related to similar material around the world as well as to local geology. The relative distributions indicate the presence of detrital material of igneous origin, most probably resulting from weathering of the Ethiopian Highlands and transported by the Blue Nile, the Nile's main tributary. The distributions of nickel, zinc, and arsenic contents suggest that the lower part of the Nile and its surroundings, including the Nile Delta, are not seriously polluted with heavy metals, so that, in spite of human activity lasting four millennia, the Nile River remains little affected by anthropogenic contamination.

  4. Comparison of Content Structure and Cognitive Structure in the Learning of Probability.

    ERIC Educational Resources Information Center

    Geeslin, William E.

    Digraphs, graphs, and task analysis were used to map out the content structure of a programed text (SMSG) in elementary probability. Mathematical structure was defined as the relationship between concepts within a set of abstract systems. The word association technique was used to measure the existing relations (cognitive structure) in S's memory…

  5. Evaluation of the Three Parameter Weibull Distribution Function for Predicting Fracture Probability in Composite Materials

    DTIC Science & Technology

    1978-03-01

    …for the risk of rupture for a unidirectionally laminated composite subjected to pure bending. This equation can be simplified further by use of… (remaining OCR fragments repeat the thesis title)

  6. Bivariate extreme value distributions

    NASA Technical Reports Server (NTRS)

    Elshamy, M.

    1992-01-01

    In certain engineering applications, such as those occurring in the analyses of ascent structural loads for the Space Transportation System (STS), some of the load variables have a lower bound of zero. Thus, the need for practical models of bivariate extreme value probability distribution functions with lower limits was identified. We discuss the Gumbel models and present practical forms of bivariate extreme probability distributions of Weibull and Frechet types with two parameters. Bivariate extreme value probability distribution functions can be expressed in terms of the marginal extremal distributions and a 'dependence' function subject to certain analytical conditions. Properties of such bivariate extreme distributions, sums and differences of paired extremals, as well as the corresponding forms of conditional distributions, are discussed. Practical estimation techniques are also given.

  7. Eruption probabilities for the Lassen Volcanic Center and regional volcanism, northern California, and probabilities for large explosive eruptions in the Cascade Range

    USGS Publications Warehouse

    Nathenson, Manuel; Clynne, Michael A.; Muffler, L.J. Patrick

    2012-01-01

    Chronologies for eruptive activity of the Lassen Volcanic Center and for eruptions from the regional mafic vents in the surrounding area of the Lassen segment of the Cascade Range are here used to estimate probabilities of future eruptions. For the regional mafic volcanism, the ages of many vents are known only within broad ranges, and two models are developed that should bracket the actual eruptive ages. These chronologies are used with exponential, Weibull, and mixed-exponential probability distributions to match the data for time intervals between eruptions. For the Lassen Volcanic Center, the probability of an eruption in the next year is 1.4×10⁻⁴ for the exponential distribution and 2.3×10⁻⁴ for the mixed-exponential distribution. For the regional mafic vents, the exponential distribution gives a probability of an eruption in the next year of 6.5×10⁻⁴, but the mixed-exponential distribution indicates that the current probability, 12,000 years after the last event, could be significantly lower. For the exponential distribution, the highest probability is for an eruption from a regional mafic vent. Data on areas and volumes of lava flows and domes of the Lassen Volcanic Center and of eruptions from the regional mafic vents provide constraints on the probable sizes of future eruptions. Probabilities of lava-flow coverage are similar for the Lassen Volcanic Center and for the regional mafic vents, whereas the probable eruptive volumes for the mafic vents are generally smaller. Data have been compiled for large explosive eruptions (≳5 km³ in deposit volume) in the Cascade Range during the past 1.2 m.y. in order to estimate probabilities of eruption. For erupted volumes ≳5 km³, the rate of occurrence since 13.6 ka is much higher than for the entire period, and we use these data to calculate an annual probability of a large eruption of 4.6×10⁻⁴. For erupted volumes ≥10 km³, the rate of occurrence has been reasonably constant from 630 ka to the present, giving more confidence in the estimate, and we use those data to calculate an annual probability of 1.4×10⁻⁵.
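
    The two renewal models can be sketched as below: the exponential model is memoryless, while a two-component mixed exponential gives a conditional annual probability that decays with time since the last event. The mixture parameters shown are hypothetical, not the paper's fits.

      import numpy as np

      def annual_prob_exponential(rate_per_yr):
          # memoryless: P(eruption within 1 yr) = 1 - exp(-rate)
          return 1.0 - np.exp(-rate_per_yr)

      def annual_prob_mixed(p_short, tau_short, tau_long, t_since):
          # survival S(t) = p*exp(-t/tau1) + (1-p)*exp(-t/tau2);
          # conditional P(event in next year) = (S(t) - S(t+1)) / S(t)
          w1 = p_short * np.exp(-t_since / tau_short)
          w2 = (1 - p_short) * np.exp(-t_since / tau_long)
          return (w1 * (1 - np.exp(-1 / tau_short)) +
                  w2 * (1 - np.exp(-1 / tau_long))) / (w1 + w2)

      print(annual_prob_exponential(6.5e-4))                    # regional mafic vents
      print(annual_prob_mixed(0.7, 2e3, 4e4, t_since=12_000))   # lower, 12 kyr after last event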

  8. Burst wait time simulation of CALIBAN reactor at delayed super-critical state

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humbert, P.; Authier, N.; Richard, B.

    2012-07-01

    In the past, the super-prompt-critical wait time probability distribution was measured on the CALIBAN fast burst reactor [4]. Afterwards, these experiments were simulated with very good agreement by solving the non-extinction probability equation [5]. Recently, the burst wait time probability distribution has been measured at CEA-Valduc on CALIBAN at different delayed super-critical states [6]. However, in the delayed super-critical case the non-extinction probability does not give access to the wait time distribution; it is necessary to compute the time-dependent evolution of the full neutron count number probability distribution. In this paper we present the point-model deterministic method used to calculate the probability distribution of the wait time before a prescribed count level, taking into account prompt neutrons and delayed neutron precursors. This method is based on the solution of the time-dependent adjoint Kolmogorov master equations for the number of detections, using the generating function methodology [8,9,10] and inverse discrete Fourier transforms. The obtained results are then compared to the measurements and to Monte Carlo calculations based on the algorithm presented in [7]. (authors)
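
    The generating-function and inverse-DFT machinery can be illustrated on its own: evaluating a probability generating function at the N-th roots of unity and applying a DFT recovers the count pmf. A Poisson generating function stands in here for the solution of the adjoint master equations.

      import numpy as np

      lam, N = 3.0, 64                         # N roots of unity; N must exceed the support needed
      z = np.exp(2j * np.pi * np.arange(N) / N)
      G = np.exp(lam * (z - 1.0))              # PGF of a Poisson(3) count (stand-in)
      pmf = np.fft.fft(G).real / N             # p_n = (1/N) * sum_k G(z_k) * z_k**(-n)
      print(pmf[:6])                           # matches scipy.stats.poisson.pmf(range(6), 3)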

  9. Steady state, relaxation and first-passage properties of a run-and-tumble particle in one-dimension

    NASA Astrophysics Data System (ADS)

    Malakar, Kanaya; Jemseena, V.; Kundu, Anupam; Vijay Kumar, K.; Sabhapandit, Sanjib; Majumdar, Satya N.; Redner, S.; Dhar, Abhishek

    2018-04-01

    We investigate the motion of a run-and-tumble particle (RTP) in one dimension. We find the exact probability distribution of the particle with and without diffusion on the infinite line, as well as in a finite interval. In the infinite domain, this probability distribution approaches a Gaussian form in the long-time limit, as in the case of a regular Brownian particle. At intermediate times, this distribution exhibits unexpected multi-modal forms. In a finite domain, the probability distribution reaches a steady-state form with peaks at the boundaries, in contrast to a Brownian particle. We also study the relaxation to the steady-state analytically. Finally we compute the survival probability of the RTP in a semi-infinite domain with an absorbing boundary condition at the origin. In the finite interval, we compute the exit probability and the associated exit times. We provide numerical verification of our analytical results.
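
    A Monte Carlo sketch of the one-dimensional run-and-tumble dynamics without translational diffusion; the parameters are illustrative. At times comparable to the tumbling time, the empirical density keeps noticeable mass near the ballistic fronts at x = +/-vt, consistent with the multi-modal forms noted above.

      import numpy as np

      rng = np.random.default_rng(8)
      v, gamma, T, dt, n = 1.0, 1.0, 2.0, 1e-3, 20_000
      x = np.zeros(n)
      sig = rng.choice([-1.0, 1.0], size=n)            # run direction of each particle
      for _ in range(int(T / dt)):
          x += v * sig * dt
          flip = rng.random(n) < gamma * dt            # tumble events at rate gamma
          sig = np.where(flip, rng.choice([-1.0, 1.0], size=n), sig)
      print("mass near the fronts:", np.mean(np.abs(x) > 0.9 * v * T))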

  10. Bayesian Networks in Educational Assessment

    PubMed Central

    Culbertson, Michael J.

    2015-01-01

    Bayesian networks (BN) provide a convenient and intuitive framework for specifying complex joint probability distributions and are thus well suited for modeling content domains of educational assessments at a diagnostic level. BN have been used extensively in the artificial intelligence community as student models for intelligent tutoring systems (ITS) but have received less attention among psychometricians. This critical review outlines the existing research on BN in educational assessment, providing an introduction to the ITS literature for the psychometric community, and points out several promising research paths. The online appendix lists 40 assessment systems that serve as empirical examples of the use of BN for educational assessment in a variety of domains. PMID:29881033

  11. Gas and hydrogen isotopic analyses of volcanic eruption clouds in Guatemala sampled by aircraft

    USGS Publications Warehouse

    Rose, W.I.; Cadle, R.D.; Heidt, L.E.; Friedman, I.; Lazrus, A.L.; Huebert, B.J.

    1980-01-01

    Gas samples were collected by aircraft entering volcanic eruption clouds of three Guatemalan volcanoes. Gas chromatographic analyses show higher H2 and S gas contents in ash eruption clouds and lower H2 and S gases in vaporous gas plumes. H isotopic data demonstrate a lighter isotopic distribution of water vapor in ash eruption clouds than in vaporous gas plumes. Most of the H2O in the vaporous plumes is probably meteoric. The data are the first direct gas analyses of explosive eruptive clouds, and demonstrate that, in spite of atmospheric admixture, useful compositional information on eruptive gases can be obtained using aircraft. © 1980.

  12. Fitness Probability Distribution of Bit-Flip Mutation.

    PubMed

    Chicano, Francisco; Sutton, Andrew M; Whitley, L Darrell; Alba, Enrique

    2015-01-01

    Bit-flip mutation is a common mutation operator for evolutionary algorithms applied to optimize functions over binary strings. In this paper, we develop results from the theory of landscapes and Krawtchouk polynomials to exactly compute the probability distribution of fitness values of a binary string undergoing uniform bit-flip mutation. We prove that this probability distribution can be expressed as a polynomial in p, the probability of flipping each bit. We analyze these polynomials and provide closed-form expressions for an easy linear problem (Onemax), and an NP-hard problem, MAX-SAT. We also discuss a connection of the results with runtime analysis.
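
    For the Onemax case, the exact fitness distribution after uniform bit-flip mutation is the convolution of two binomials, and each pmf entry is a polynomial in p; the sketch below computes it directly (a hypothetical helper, not the paper's Krawtchouk-polynomial derivation).

      import numpy as np
      from scipy import stats

      def onemax_mutation_pmf(n, k, p):
          # start with k ones out of n bits; each bit flips with probability p;
          # new fitness = k - D + U with D ~ Bin(k, p), U ~ Bin(n - k, p)
          u = np.arange(n - k + 1)
          pu = stats.binom.pmf(u, n - k, p)
          pmf = np.zeros(n + 1)
          for d in range(k + 1):
              pmf[k - d + u] += stats.binom.pmf(d, k, p) * pu
          return pmf                                   # index = fitness value

      pmf = onemax_mutation_pmf(n=20, k=15, p=0.05)
      print("P(improve) =", pmf[16:].sum())
      print("E[fitness] =", (np.arange(21) * pmf).sum())   # equals k(1-p) + (n-k)p = 14.5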

  13. [Distribution Characteristics of Sedimentary Pigments in the Changjiang Estuary and Zhe-Min Coast and its Implications].

    PubMed

    Li, Dong; Yao, Peng; Zhao, Bin; Wang, Jin-peng; Pan, Hui-hui

    2015-08-01

    Compositions and contents of sedimentary pigments were examined using high-performance liquid chromatography in order to discuss the spatial distributions of phytoplankton primary production, phytoplankton functional types, and the preservation efficiency of phytoplankton pigments, together with their influencing factors. The results showed that chloropigments [chlorins, including chlorophyll-a (Chl-a) and pheopigments (Pheo-a) such as pheophytin-a (PHtin-a), pheophorbide-a (PHide-a), pyropheophytin-a (pPHtin-a), sterol chlorin esters (SCEs) and carotenol chlorin esters (CCEs)] were the major type of sedimentary pigments. The nutrient inputs from the Changjiang Diluted Water and upwelling in the Zhe-Min coastal mud area were the major cause of the patchy distribution of high sedimentary chloropigment contents. Carotenoid contents showed no trending changes and exhibited high values in the Changjiang Estuary and along the Zhe-Min Coast. Based on the relative proportions of each diagnostic carotenoid to the total diagnostic carotenoids in the sediments, the relative contributions of diatoms, dinoflagellates, prymnesiophytes, prasinophytes, cryptophytes and cyanobacteria to the phytoplankton functional types were 48.8% ± 17.4%, 10.7% ± 11.5%, 8.1% ± 7.2%, 18.6% ± 8.2%, 9.4% ± 6.4% and 4.3% ± 3.2%, respectively. The preference for external environmental conditions (e.g., nutrient level and water salinity) was the main cause of the seaward decrease in diatom and dinoflagellate proportions and the seaward increase in prasinophytes, cryptophytes and cyanobacteria. Based on the spatial distribution of Chl-a/Pheo-a ratios, the higher preservation efficiencies of sedimentary pigments in the coastal regions (e.g., the outer edge of the maximum turbidity zone in the Changjiang Estuary, the mouth of Hangzhou Bay and the upwelling region along the Zhe-Min Coast) were mainly due to higher sedimentation rates and seasonal occurrences of hypoxia in bottom water, and these regions with higher sedimentary pigment preservation efficiencies are probably ideal areas for recording marine eco-environmental evolution. The unfavorable sedimentary environment caused by water exchange inside and outside Hangzhou Bay was the dominant reason for the low sedimentary pigment contents and preservation efficiencies in this region.

  14. Bayesian source tracking via focalization and marginalization in an uncertain Mediterranean Sea environment.

    PubMed

    Dosso, Stan E; Wilmut, Michael J; Nielsen, Peter L

    2010-07-01

    This paper applies Bayesian source tracking in an uncertain environment to Mediterranean Sea data, and investigates the resulting tracks and track uncertainties as a function of data information content (number of data time-segments, number of frequencies, and signal-to-noise ratio) and of prior information (environmental uncertainties and source-velocity constraints). To track low-level sources, acoustic data recorded for multiple time segments (corresponding to multiple source positions along the track) are inverted simultaneously. Environmental uncertainty is addressed by including unknown water-column and seabed properties as nuisance parameters in an augmented inversion. Two approaches are considered: Focalization-tracking maximizes the posterior probability density (PPD) over the unknown source and environmental parameters. Marginalization-tracking integrates the PPD over environmental parameters to obtain a sequence of joint marginal probability distributions over source coordinates, from which the most-probable track and track uncertainties can be extracted. Both approaches apply track constraints on the maximum allowable vertical and radial source velocity. The two approaches are applied for towed-source acoustic data recorded at a vertical line array at a shallow-water test site in the Mediterranean Sea where previous geoacoustic studies have been carried out.

  15. Odorous secretions in anurans: morphological and functional assessment of serous glands as a source of volatile compounds in the skin of the treefrog Hypsiboas pulchellus (Amphibia: Anura: Hylidae).

    PubMed

    Brunetti, Andrés E; Hermida, Gladys N; Iurman, Mariana G; Faivovich, Julián

    2016-03-01

    Serous (granular or venom) glands occur in the skin of almost all species of adult amphibians, and are thought to be the source of a great diversity of chemical compounds. Despite recent advances in their chemistry, odorous volatile substances are compounds that have received less attention, and until now no study has attempted to associate histological data with the presence of these molecules in amphibians, or in any other vertebrate. Given the recent identification of 40 different volatile compounds from the skin secretions of H. pulchellus (a treefrog species that releases a strong odour when handled), we examined the structure, ultrastructure, histochemistry, and distribution of skin glands of this species. Histological analysis from six body regions reveals the presence of two types of glands that differ in their distribution. Mucous glands are homogeneously distributed, whereas serous glands are more numerous in the scapular region. Ultrastructural results indicate that electron-translucent vesicles observed within granules of serous glands are similar to those found in volatile-producing glands from insects and also with lipid vesicles from different organisms. Association among lipids and volatiles is also evidenced from chemical results, which indicate that at least some of the volatile components in H. pulchellus probably originate within the metabolism of fatty acids or the mevalonate pathway. As odorous secretions are often considered to be secreted under stress situations, the release of glandular content was assessed after pharmacological treatments, epinephrine administrated in vivo and on skin explants, and through surface electrical stimulation. Serous glands responded to all treatments, generally through an obvious contraction of myoepithelial cells that surround their secretory portion. No response was observed in mucous glands. Considering these morpho-functional results, along with previous identification of volatiles from H. pulchellus and H. riojanus after electrical stimulation, we suggest that the electron-translucent inclusions found within the granules of serous glands likely are the store sites of volatile compounds and/or their precursors. Histochemical and glandular distribution analyses in five other species of frogs of the hylid tribe Cophomantini, revealed a high lipid content in all the species, whereas a heterogeneous distribution of serous glands is only observed in species of the H. pulchellus group. The distribution pattern of serous glands in members of this species group, and the odorous volatile secretions are probably related to defensive functions. © 2015 Anatomical Society.

  16. Distribution of leached radioactive material in the Legin Group Area, San Miguel County, Colorado

    USGS Publications Warehouse

    Rogers, Allen S.

    1950-01-01

    Radioactivity anomalies, which are small in magnitude and probably are not caused by extensions of known uranium-vanadium ore bodies, were detected during the gamma-ray logging of diamond-drill holes in the Legin group of claims, southwest San Miguel County, Colo. The positions of these anomalies are at the top surfaces of mudstone strata within, and at the base of, the ore-bearing sandstone of the Salt Wash member of the Morrison formation. The distribution of these anomalies suggests that ground water has leached radioactive material from the ore bodies and has carried it down dip and laterally along the top surfaces of underlying impermeable mudstone strata for distances as great as 300 feet. The anomalies are probably caused by radon and its daughter elements. Preliminary tests indicate that radon in quantities up to 10⁻⁷ curies per liter may be present in ground water flowing along sandstone-mudstone contacts under carnotite ore bodies. In comparison, the radium content of the same water is less than 10⁻¹⁰ curies per liter. Further substantiation of the relationship between ore bodies, the movement of water, and the radon-caused anomalies may greatly increase the scope of gamma-ray logs of drill holes as an aid to prospecting.

  17. Improving the chi-squared approximation for bivariate normal tolerance regions

    NASA Technical Reports Server (NTRS)

    Feiveson, Alan H.

    1993-01-01

    Let X be a two-dimensional random variable distributed according to N₂(μ, Σ), and let X̄ and S be the respective sample mean and covariance matrix calculated from N observations of X. Given a containment probability β and a level of confidence γ, we seek a number c, depending only on N, β, and γ, such that the ellipsoid R = {x : (x − X̄)ᵀS⁻¹(x − X̄) ≤ c} is a tolerance region of content β and level γ; i.e., R has probability γ of containing at least 100β percent of the distribution of X. Various approximations for c exist in the literature, but one of the simplest to compute -- a multiple of the ratio of certain chi-squared percentage points -- is badly biased for small N. For the bivariate normal case, most of the bias can be removed by a simple adjustment using a factor A which depends on β and γ. This paper provides values of A for various β and γ so that the simple approximation for c can be made viable for any reasonable sample size. The methodology provides an illustrative example of how a combination of Monte Carlo simulation and simple regression modelling can be used to improve an existing approximation.

  18. Food and habitat resource partitioning between three estuarine fish species on the Swedish west coast

    NASA Astrophysics Data System (ADS)

    Thorman, Staffan

    1983-12-01

    In 1978 the food and habitat resource partitioning of three small and common fish species, viz. Pomatoschistus microps (Krøyer), Gasterosteus aculeatus (L.) and Pungitius pungitius (L.) were studied in river Broälven estuary on the Swedish west coast (58°22'N, 11°29'E). The area was divided into three habitats, based on environmental features. In July, September, and October stomach contents and size distribution of each species present were analysed. In July there was high food and habitat overlap between the species. Interference interactions probably occurred between some size classes of P. microps and the other two species. P. pungitius was exposed to both intra- and interspecific interactions. In September the food and habitat overlaps between G. aculeatus and P. pungitius were high, while both had low food and habitat overlaps in relation to P. microps. Interactions between G. aculeatus and P. pungitius were probably influenced by more severe abiotic conditions in one habitat, which caused lower abundances there, and higher abundances in the other two habitats. In October no interactions were observed. These results indicate that competition for food at least temporarily determines the species distribution in a temperate estuary, and that estuarine fish populations are sometimes food limited.

  19. The Impact of an Instructional Intervention Designed to Support Development of Stochastic Understanding of Probability Distribution

    ERIC Educational Resources Information Center

    Conant, Darcy Lynn

    2013-01-01

    Stochastic understanding of probability distribution undergirds development of conceptual connections between probability and statistics and supports development of a principled understanding of statistical inference. This study investigated the impact of an instructional course intervention designed to support development of stochastic…

  20. Probability distributions of the electroencephalogram envelope of preterm infants.

    PubMed

    Saji, Ryoya; Hirasawa, Kyoko; Ito, Masako; Kusuda, Satoshi; Konishi, Yukuo; Taga, Gentaro

    2015-06-01

    To determine the stationary characteristics of electroencephalogram (EEG) envelopes for prematurely born (preterm) infants and investigate the intrinsic characteristics of early brain development in preterm infants. Twenty neurologically normal sets of EEGs recorded in infants with a post-conceptional age (PCA) range of 26-44 weeks (mean 37.5 ± 5.0 weeks) were analyzed. Hilbert transform was applied to extract the envelope. We determined the suitable probability distribution of the envelope and performed a statistical analysis. It was found that (i) the probability distributions for preterm EEG envelopes were best fitted by lognormal distributions at 38 weeks PCA or less, and by gamma distributions at 44 weeks PCA; (ii) the scale parameter of the lognormal distribution had positive correlations with PCA as well as a strong negative correlation with the percentage of low-voltage activity; (iii) the shape parameter of the lognormal distribution had significant positive correlations with PCA; (iv) the statistics of mode showed significant linear relationships with PCA, and, therefore, it was considered a useful index in PCA prediction. These statistics, including the scale parameter of the lognormal distribution and the skewness and mode derived from a suitable probability distribution, may be good indexes for estimating stationary nature in developing brain activity in preterm infants. The stationary characteristics, such as discontinuity, asymmetry, and unimodality, of preterm EEGs are well indicated by the statistics estimated from the probability distribution of the preterm EEG envelopes. Copyright © 2014 International Federation of Clinical Neurophysiology. Published by Elsevier Ireland Ltd. All rights reserved.
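
    A minimal sketch of the envelope extraction and distribution fitting on a synthetic amplitude-modulated trace standing in for a preterm EEG; the lognormal mode, exp(mu - sigma^2), is the kind of statistic reported above.

      import numpy as np
      from scipy import signal, stats

      fs = 250.0                                       # sampling rate, Hz (hypothetical)
      t = np.arange(0, 60, 1 / fs)
      rng = np.random.default_rng(5)
      eeg = np.sin(2 * np.pi * 3 * t) * rng.lognormal(0.0, 0.5, t.size)

      envelope = np.abs(signal.hilbert(eeg))           # analytic-signal envelope

      for name, dist in [("lognormal", stats.lognorm), ("gamma", stats.gamma)]:
          params = dist.fit(envelope, floc=0)
          nll = -np.sum(dist.logpdf(envelope, *params))
          print(f"{name}: -logL = {nll:.0f}")

      s, _, scale = stats.lognorm.fit(envelope, floc=0)
      print("lognormal mode:", scale * np.exp(-s ** 2))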

  1. Flood Frequency Curves - Use of information on the likelihood of extreme floods

    NASA Astrophysics Data System (ADS)

    Faber, B.

    2011-12-01

    Investment in the infrastructure that reduces flood risk for flood-prone communities must incorporate information on the magnitude and frequency of flooding in that area. Traditionally, that information has been a probability distribution of annual maximum streamflows developed from the historical gaged record at a stream site. Practice in the United States fits a log-Pearson Type III distribution to the annual maximum flows of an unimpaired streamflow record, using the method of moments to estimate distribution parameters. The procedure makes the assumptions that annual peak streamflow events are (1) independent, (2) identically distributed, and (3) form a representative sample of the overall probability distribution. Each of these assumptions can be challenged. We rarely have enough data to form a representative sample, and therefore must compute and display the uncertainty in the estimated flood distribution. But is there a wet/dry cycle that makes precipitation less than independent between successive years? Are the peak flows caused by different types of events from different statistical populations? How does the watershed or climate changing over time (non-stationarity) affect the probability distribution of floods? Potential approaches to avoid these assumptions vary from estimating trend and shift and removing them from early data (thus forming a homogeneous data set), to methods that estimate statistical parameters that vary with time. A further issue in estimating a probability distribution of flood magnitude (the flood frequency curve) is whether a purely statistical approach can accurately capture the range and frequency of floods that are of interest. A meteorologically based analysis produces "probable maximum precipitation" (PMP) and subsequently a "probable maximum flood" (PMF) that attempts to describe an upper bound on flood magnitude in a particular watershed. This analysis can help constrain the upper tail of the probability distribution, well beyond the range of gaged data or even historical or paleo-flood data, which can be very important in risk analyses performed for flood risk management and dam and levee safety studies.
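
    A minimal sketch of the method-of-moments fit described above: the mean, standard deviation, and skew of log10 flows define a log-Pearson Type III model, and quantiles follow from the standardized Pearson III frequency factor. The peak-flow series is simulated, not a gaged record.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(6)
      peaks = rng.lognormal(mean=6.0, sigma=0.5, size=60)   # annual peaks, m^3/s

      logq = np.log10(peaks)
      m, s, g = logq.mean(), logq.std(ddof=1), stats.skew(logq, bias=False)

      for T in (2, 10, 100, 500):
          k_t = stats.pearson3.ppf(1 - 1 / T, g)            # frequency factor for skew g
          print(f"T = {T:3d} yr   Q = {10 ** (m + k_t * s):9.0f} m^3/s")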

  2. ADS-B within a Multi-Aircraft Simulation for Distributed Air-Ground Traffic Management

    NASA Technical Reports Server (NTRS)

    Barhydt, Richard; Palmer, Michael T.; Chung, William W.; Loveness, Ghyrn W.

    2004-01-01

    Automatic Dependent Surveillance Broadcast (ADS-B) is an enabling technology for NASA's Distributed Air-Ground Traffic Management (DAG-TM) concept. DAG-TM has the goal of significantly increasing capacity within the National Airspace System, while maintaining or improving safety. Under DAG-TM, aircraft exchange state and intent information over ADS-B with other aircraft and ground stations. This information supports various surveillance functions including conflict detection and resolution, scheduling, and conformance monitoring. To conduct more rigorous concept feasibility studies, NASA Langley Research Center's PC-based Air Traffic Operations Simulation models a 1090 MHz ADS-B communication structure, based on industry standards for message content, range, and reception probability. The current ADS-B model reflects a mature operating environment, and message interference effects are limited to Mode S transponder replies and ADS-B squitters. This model was recently evaluated in a Joint DAG-TM Air/Ground Coordination Experiment with NASA Ames Research Center. Message probability of reception vs. range was lower at higher traffic levels. The highest message collision probability occurred near the meter fix serving as the confluence for two arrival streams. Even the highest traffic level encountered in the experiment was significantly less than the industry-standard "LA Basin 2020" scenario. Future studies will account for Mode A and C message interference (a major effect in several industry studies) and will include Mode A and C aircraft in the simulation, thereby increasing the total traffic level. These changes will support ongoing enhancements to separation assurance functions that focus on accommodating longer ADS-B information update intervals.

  3. Technology-enhanced Interactive Teaching of Marginal, Joint and Conditional Probabilities: The Special Case of Bivariate Normal Distribution

    PubMed Central

    Dinov, Ivo D.; Kamino, Scott; Bhakhrani, Bilal; Christou, Nicolas

    2014-01-01

    Data analysis requires subtle probability reasoning to answer questions like What is the chance of event A occurring, given that event B was observed? This generic question arises in discussions of many intriguing scientific questions such as What is the probability that an adolescent weighs between 120 and 140 pounds given that they are of average height? and What is the probability of (monetary) inflation exceeding 4% and housing price index below 110? To address such problems, learning some applied, theoretical or cross-disciplinary probability concepts is necessary. Teaching such courses can be improved by utilizing modern information technology resources. Students’ understanding of multivariate distributions, conditional probabilities, correlation and causation can be significantly strengthened by employing interactive web-based science educational resources. Independent of the type of a probability course (e.g. majors, minors or service probability course, rigorous measure-theoretic, applied or statistics course) student motivation, learning experiences and knowledge retention may be enhanced by blending modern technological tools within the classical conceptual pedagogical models. We have designed, implemented and disseminated a portable open-source web-application for teaching multivariate distributions, marginal, joint and conditional probabilities using the special case of bivariate Normal distribution. A real adolescent height and weight dataset is used to demonstrate the classroom utilization of the new web-application to address problems of parameter estimation, univariate and multivariate inference. PMID:25419016
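
    The weight-given-height question above reduces to a conditional normal computation; a short sketch with hypothetical adolescent parameters (the web application itself uses a real dataset):

      import numpy as np
      from scipy import stats

      mu_h, sd_h = 65.0, 3.5         # height, inches (hypothetical)
      mu_w, sd_w = 125.0, 15.0       # weight, pounds (hypothetical)
      rho = 0.6

      h = mu_h                       # condition on average height
      cond_mean = mu_w + rho * sd_w * (h - mu_h) / sd_h
      cond_sd = sd_w * np.sqrt(1 - rho ** 2)

      p = stats.norm.cdf(140, cond_mean, cond_sd) - stats.norm.cdf(120, cond_mean, cond_sd)
      print(f"P(120 <= W <= 140 | H = {h}) = {p:.3f}")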

  5. Stylized facts in internal rates of return on stock index and its derivative transactions

    NASA Astrophysics Data System (ADS)

    Pichl, Lukáš; Kaizoji, Taisei; Yamano, Takuya

    2007-08-01

    Universal features in stock markets and their derivative markets are studied by means of probability distributions in internal rates of return on buy-and-sell transaction pairs. Unlike the stylized facts in normalized log returns, the probability distributions for such single-asset encounters incorporate the time factor by means of the internal rate of return, defined as the continuous compound interest. The resulting stylized facts are shown in the probability distributions derived from the daily series of TOPIX, S&P 500 and FTSE 100 index close values. The application of the above analysis to minute-tick data of NIKKEI 225 and its futures market reveals an interesting difference in the behavior of the two probability distributions when a threshold on the minimal duration of the long position is imposed. It is therefore suggested that the probability distributions of the internal rates of return could be used for causality mining between the underlying and derivative stock markets. The highly specific discrete spectrum that results from noise-trader strategies, as opposed to the smooth distributions observed for fundamentalist strategies in single-encounter transactions, may be useful in deducing the type of investment strategy from the trading revenues of small portfolio investors.
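
    The internal rate of return for a single buy/sell pair, defined through continuous compounding, is r = ln(P_sell / P_buy) / dt; a one-function sketch:

      import numpy as np

      def internal_rate(buy, sell, dt_years):
          # continuously compounded annualized return of a buy/sell transaction pair
          return np.log(sell / buy) / dt_years

      print(internal_rate(100.0, 112.0, 0.5))   # about 0.227, i.e. 22.7% annualized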

  6. Probabilistic Reasoning for Robustness in Automated Planning

    NASA Technical Reports Server (NTRS)

    Schaffer, Steven; Clement, Bradley; Chien, Steve

    2007-01-01

    A general-purpose computer program for planning the actions of a spacecraft or other complex system has been augmented by incorporating a subprogram that reasons about uncertainties in such continuous variables as the times taken to perform tasks and the amounts of resources consumed. This subprogram computes parametric probability distributions for time and resource variables on the basis of user-supplied models of actions and the resources that they consume. The current system accepts bounded Gaussian distributions over action duration and resource use. The distributions are then combined during planning to determine the net probability distribution of each resource at any time point. In addition to a full combinatoric approach, several approximations for arriving at these combined distributions are available, including maximum-likelihood and pessimistic algorithms. Each such probability distribution can then be integrated to obtain the probability that execution of the plan under consideration would violate any constraints on the resource. The key idea is to use these probabilities of conflict to score potential plans and drive a search toward planning low-risk actions. An output plan provides a balance between the user's specified aversion to risk and other measures of optimality.
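
    For independent Gaussian resource draws, the combined distribution is available in closed form and the conflict probability is a Gaussian tail integral; a sketch with hypothetical action models:

      import numpy as np
      from scipy import stats

      means = np.array([10.0, 7.5, 12.0])    # expected resource use per action (hypothetical)
      sds = np.array([1.0, 0.8, 2.0])

      total_mu = means.sum()                 # net usage is Gaussian for independent draws
      total_sd = np.sqrt((sds ** 2).sum())

      capacity = 34.0
      print("conflict probability:", stats.norm.sf(capacity, total_mu, total_sd))
      # a planner can score candidate plans by such conflict probabilities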

  7. Kinetic and thermodynamic factors controlling the distribution of SO42- and Na+ in calcites and selected aragonites

    USGS Publications Warehouse

    Busenberg, E.; Plummer, Niel

    1985-01-01

    Significant amounts of SO42-, Na+, and OH- are incorporated in marine biogenic calcites. Biogenic high Mg-calcites average about 1 mole percent SO42-. Aragonites and most biogenic low Mg-calcites contain significant amounts of Na+, but very low concentrations of SO42-. The SO42- content of non-biogenic calcites and aragonites investigated was below 100 ppm. The presence of Na+ and SO42- increases the unit cell size of calcites. The solid-solutions show a solubility minimum at about 0.5 mole percent SO42-, beyond which the solubility rapidly increases. The solubility product of calcites containing 3 mole percent SO42- is the same as that of aragonite. Na+ appears to have very little effect on the solubility product of calcites. The amounts of Na+ and SO42- incorporated in calcites vary as a function of the rate of crystal growth. The variation of the distribution coefficient (D) of SO42- in calcite at 25.0 °C and 0.50 molal NaCl is described by the equation D = k0 + k1R, where k0 and k1 are constants equal to 6.16 × 10⁻⁶ and 3.941 × 10⁻⁶, respectively, and R is the rate of crystal growth of calcite in mg·min⁻¹·g⁻¹ of seed. The data on Na+ are consistent with the hypothesis that a significant amount of Na+ occupies interstitial positions in the calcite structure. The distribution of Na+ follows a Freundlich isotherm and not the Berthelot-Nernst distribution law. The numerical value of the Na+ distribution coefficient in calcite is probably dependent on the number of defects in the calcite structure. The Na+ contents of calcites are not very accurate indicators of environmental salinities. © 1985.
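
    For concreteness, the quoted rate dependence of the sulfate distribution coefficient can be evaluated directly; the growth rate chosen below is arbitrary:

      # D = k0 + k1 * R at 25.0 °C, 0.50 molal NaCl, R in mg·min^-1·g^-1 of seed
      k0, k1 = 6.16e-6, 3.941e-6

      R = 10.0                          # illustrative crystal growth rate
      D = k0 + k1 * R
      print(f"D(SO4) at R = {R}: {D:.3e}")   # 4.557e-05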

  8. Spatial distribution of pH and organic matter in urban soils and its implications on site-specific land uses in Xuzhou, China.

    PubMed

    Mao, Yingming; Sang, Shuxun; Liu, Shiqi; Jia, Jinlong

    2014-05-01

    The spatial variation of soil pH and soil organic matter (SOM) in the urban area of Xuzhou, China, was investigated in this study. Conventional statistics, geostatistics, and a geographical information system (GIS) were used to produce spatial distribution maps and to provide information about land use types. A total of 172 soil samples were collected based on a grid method in the study area. Soil pH ranged from 6.47 to 8.48, with an average of 7.62. SOM content was highly variable, ranging from 3.51 g/kg to 17.12 g/kg, with an average of 8.26 g/kg. Soil pH followed a normal distribution, while SOM followed a log-normal distribution. The results of semi-variograms indicated that soil pH and SOM had strong (21%) and moderate (44%) spatial dependence, respectively. The variogram model was spherical for soil pH and exponential for SOM. The spatial distribution maps were generated using kriging interpolation. High pH and high SOM tended to occur in the mixed forest land cover areas, such as those in the southwestern part of the urban area, while low values were found in the eastern and northern parts, probably due to the effect of industrial and human activities. In the central urban area, the soil pH was low but the SOM content was high, which is mainly attributed to the disturbance of regional resident activities and urban transportation. Furthermore, anthropogenic organic particles are possible sources of organic matter after entering the soil ecosystem in urban areas. These maps provide useful information for urban planning and environmental management. Copyright © 2014 Académie des sciences. Published by Elsevier SAS. All rights reserved.

  9. Mathematical Model to estimate the wind power using four-parameter Burr distribution

    NASA Astrophysics Data System (ADS)

    Liu, Sanming; Wang, Zhijie; Pan, Zhaoxu

    2018-03-01

    When the true probability distribution of wind speed at a given site needs to be described, the four-parameter Burr distribution is more suitable than other distributions. This paper introduces its important properties and characteristics. The application of the four-parameter Burr distribution to wind speed prediction is also discussed, and the expression for the probability distribution of the output power of a wind turbine is derived.
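
    One concrete realization of a four-parameter Burr family is SciPy's Burr XII implementation, with two shape parameters plus location and scale; the parameter values below are illustrative, not fitted to any wind record:

      from scipy.stats import burr12

      # Four parameters: shapes c and d, plus location and scale (illustrative)
      c, d, loc, scale = 2.5, 1.2, 0.0, 8.0
      wind = burr12(c, d, loc=loc, scale=scale)

      # Probability that wind speed lies within a turbine's operating band (m/s)
      cut_in, cut_out = 3.0, 25.0
      p_operating = wind.cdf(cut_out) - wind.cdf(cut_in)
      print(f"P({cut_in} <= v <= {cut_out}) = {p_operating:.3f}")
      print(f"mean wind speed: {wind.mean():.2f} m/s")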

  10. Entropy Methods For Univariate Distributions in Decision Analysis

    NASA Astrophysics Data System (ADS)

    Abbas, Ali E.

    2003-03-01

    One of the most important steps in decision analysis practice is the elicitation of the decision-maker's belief about an uncertainty of interest in the form of a representative probability distribution. However, the probability elicitation process is a task that involves many cognitive and motivational biases. Alternatively, the decision-maker may provide other information about the distribution of interest, such as its moments, and the maximum entropy method can be used to obtain a full distribution subject to the given moment constraints. In practice, however, decision makers cannot readily provide moments for the distribution, and are much more comfortable providing information about the fractiles of the distribution of interest or bounds on its cumulative probabilities. In this paper we present a graphical method to determine the maximum entropy distribution between upper and lower probability bounds, and provide an interpretation for the shape of the maximum entropy distribution subject to fractile constraints (FMED). We also discuss the limitations of the FMED, namely that it is discontinuous and flat over each fractile interval. We then present a heuristic approximation for the case in which, in addition to its fractiles, the distribution is known to be continuous, and work through full examples to illustrate the approach.
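
    The flat-over-each-fractile-interval character of the FMED is easy to make concrete: between consecutive elicited fractiles, the maximum entropy density is constant and equal to the probability mass of the interval divided by its width. A small sketch with invented fractiles:

      import numpy as np

      # Elicited fractiles: cumulative probabilities and corresponding values (invented)
      probs = np.array([0.0, 0.10, 0.50, 0.90, 1.0])
      values = np.array([0.0, 2.0, 5.0, 11.0, 20.0])

      # Maximum entropy density subject to fractile constraints: piecewise uniform
      density = np.diff(probs) / np.diff(values)
      for lo, hi, f in zip(values[:-1], values[1:], density):
          print(f"f(x) = {f:.4f} for {lo} <= x < {hi}")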

  11. Impactites from Popigai Crater

    NASA Technical Reports Server (NTRS)

    Masaitis, V. L.

    1992-01-01

    Impactites (tagamites and suevites) from the Popigai impact crater, whose diameter is about 100 km, are distributed over an area of 5000 sq km. The continuous sheet of suevite overlies the allogenic polymict breccia and partly the authigenic breccia, and may also be observed in lenses or irregular bodies. The thickness of suevites in the central part of the crater is more than 100 m. Suevites may be distinguished by their content of vitroclasts, lithoclasts, and crystalloclasts, by their dimensions, and by type of cementation, which reflects the facies settings of ejection of crushed and molten material, its sedimentation and lithification. Tagamites (impact melt rocks) are distributed on the surface predominantly in the western sector of the crater. Most characteristic are thick sheetlike bodies overlying the allogenic breccia and occurring in suevites, where minor irregular bodies are widespread. The maximal thickness of separate tagamite sheets is up to 600 m. Tagamites, whose matrix is crystallized to varying degrees, include fragments of minerals and gneiss blocks, among them shocked and thermally metamorphosed ones. Tagamite sheets have a complex inner structure; separate horizontal zones differ in crystallinity and fragment saturation. Differentiation of the impact melt in situ was not observed. The average chemical compositions of tagamites and suevites are similar, and correspond to the composition of biotite-garnet gneisses of the basement. According to the content of supplied Ir, Ni, and other siderophiles, the impact melt was contaminated by 5 percent cosmic matter from the impacting body, probably an ordinary chondrite. The total volume of remaining products of chilled impact melt is about 1750 cu km; half this amount is represented by tagamite bodies. Though the impact melt was in general well homogenized, trend analysis showed that the concentric zonation in the distribution of SiO2, MgO, and Na2O and the band-like distribution of FeO and Al2O3 contents testify to a certain inheritance and heterogeneity in country rock composition, laterally and vertically, in the melting zone.

  12. Work probability distribution for a ferromagnet with long-ranged and short-ranged correlations

    NASA Astrophysics Data System (ADS)

    Bhattacharjee, J. K.; Kirkpatrick, T. R.; Sengers, J. V.

    2018-04-01

    Work fluctuations and work probability distributions are fundamentally different in systems with short-ranged versus long-ranged correlations. Specifically, in systems with long-ranged correlations the work distribution is extraordinarily broad compared to systems with short-ranged correlations. This difference profoundly affects the possible applicability of fluctuation theorems like the Jarzynski fluctuation theorem. The Heisenberg ferromagnet, well below its Curie temperature, is a system with long-ranged correlations in very low magnetic fields due to the presence of Goldstone modes. As the magnetic field is increased the correlations gradually become short ranged. Hence, such a ferromagnet is an ideal system for elucidating the changes of the work probability distribution as one goes from a domain with long-ranged correlations to a domain with short-ranged correlations by tuning the magnetic field. A quantitative analysis of this crossover behavior of the work probability distribution and the associated fluctuations is presented.
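
    For reference, a standard statement of the Jarzynski equality whose applicability is at stake here relates the distribution P(W) of nonequilibrium work to the equilibrium free-energy difference:

      \left\langle e^{-\beta W} \right\rangle \;=\; \int P(W)\, e^{-\beta W}\, dW \;=\; e^{-\beta \Delta F}, \qquad \beta = \frac{1}{k_B T}.

    When P(W) is extraordinarily broad, as in the long-ranged-correlation regime, the average is dominated by rare trajectories deep in the low-work tail, which is why the breadth of the work distribution controls the practical usability of the theorem.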

  13. Methods to elicit probability distributions from experts: a systematic review of reported practice in health technology assessment.

    PubMed

    Grigore, Bogdan; Peters, Jaime; Hyde, Christopher; Stein, Ken

    2013-11-01

    Elicitation is a technique that can be used to obtain probability distributions from experts about unknown quantities. We conducted a methodology review of reports where probability distributions had been elicited from experts to be used in model-based health technology assessments. Databases including MEDLINE, EMBASE and the CRD database were searched from inception to April 2013. Reference lists were checked and citation mapping was also used. Studies describing their approach to the elicitation of probability distributions were included. Data were abstracted on pre-defined aspects of the elicitation technique. Reports were critically appraised on their consideration of the validity, reliability and feasibility of the elicitation exercise. Fourteen articles were included. Across these studies, the most marked features were heterogeneity in elicitation approach and failure to report key aspects of the elicitation method. The most frequently used approaches to elicitation were the histogram technique and the bisection method. Only three papers explicitly considered the validity, reliability and feasibility of the elicitation exercises. Judged by the studies identified in the review, reports of expert elicitation are insufficiently detailed, and this impacts the perceived usability of expert-elicited probability distributions. In this context, the wider credibility of elicitation will only be improved by better reporting and greater standardisation of approach. Until then, the advantages of eliciting probability distributions from experts may be lost.

  14. In vivo NMR imaging of sodium-23 in the human head.

    PubMed

    Hilal, S K; Maudsley, A A; Ra, J B; Simon, H E; Roschmann, P; Wittekoek, S; Cho, Z H; Mun, S K

    1985-01-01

    We report the first clinical nuclear magnetic resonance (NMR) images of cerebral sodium distribution in normal volunteers and in patients with a variety of pathological lesions. We have used a 1.5 T NMR magnet system. When compared with proton distribution, sodium shows a greater variation in its concentration from tissue to tissue and from normal to pathological conditions. Image contrast calculated on the basis of sodium concentration is 7 to 18 times greater than that of proton spin density. Normal images emphasize the extracellular compartments. In the clinical studies, areas of recent or old cerebral infarction and tumors show a pronounced increase of sodium content (300-400%). Actual measurements of image density values indicate that there is probably a further accentuation of the contrast by the increased "NMR visibility" of sodium in infarcted tissue. Sodium imaging may prove to be a more sensitive means for early detection of some brain disorders than other imaging methods.

  15. Azimuthal Dependence of Intrinsic Top in Photon-Quark Scattering and Higgs Production in Boson-Gluon Fusion DIS

    NASA Astrophysics Data System (ADS)

    Boroun, G. R.; Khanehzar, A.; Boustanchi Kashan, M.

    2017-11-01

    In this paper, we study the top content of the nucleon by analyzing azimuthal asymmetries in lepton-nucleon deep inelastic scattering (DIS); we also search for the Higgs boson associated production channel, tt̄H, at the Large Hadron-electron Collider (LHeC), arising from the boson-gluon fusion (BGF) contribution. We use azimuthal asymmetries in γ*Q cross sections, in terms of helicity contributions to semi-inclusive deep inelastic scattering, to investigate numerical properties of the cos 2φ distribution. We conclude that measuring azimuthal distributions caused by intrinsic heavy-quark production can directly probe heavy quarks inside the nucleon. Moreover, in order to estimate the probability of producing the Higgs boson, we suggest another approach in the framework of calculating the tt̄ cross section in the boson-gluon fusion mechanism. Finally, we can confirm that the observed massive particle is identified with the Higgs boson produced via a fermion loop.

  16. Crystal gazing. Part 2: Implications of advances in digital data storage technology

    NASA Technical Reports Server (NTRS)

    Wells, D. C.

    1984-01-01

    During the next 5-10 years it is likely that the bit density available in digital mass storage systems (magnetic tapes, optical and magnetic disks) will be increased to such an extent that it will greatly exceed that of the conventional photographic emulsions, like IIIaJ, which are used in astronomy. These developments imply that it will soon be advantageous for astronomers to use microdensitometers to completely digitize all photographic plates soon after they are developed. Distribution of digital copies of sky surveys and the contents of plate vaults will probably become feasible within ten years. Copies of other astronomical archives (e.g., Space Telescope) could also be distributed with the same techniques. The implications for designers of future microdensitometers are: (1) there will be a continuing need for precision digitization of large-format photographic imagery, and (2) the need for real-time analysis of the output of microdensitometers will decrease.

  17. Computer simulation of random variables and vectors with arbitrary probability distribution laws

    NASA Technical Reports Server (NTRS)

    Bogdan, V. M.

    1981-01-01

    Assume that there is given an arbitrary n-dimensional probability distribution F. A recursive construction is found for a sequence of functions x₁ = f₁(U₁, ..., Uₙ), ..., xₙ = fₙ(U₁, ..., Uₙ) such that if U₁, ..., Uₙ are independent random variables having uniform distribution over the open interval (0,1), then the joint distribution of the variables x₁, ..., xₙ coincides with the distribution F. Since uniform independent random variables can be well simulated by means of a computer, this result allows one to simulate arbitrary n random variables if their joint probability distribution is known.
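
    A minimal sketch of the construction for n = 2, assuming the joint distribution is specified through a marginal and a conditional CDF (the exponential forms below are invented for illustration): x₁ is obtained by inverting the marginal CDF at U₁, and x₂ by inverting the conditional CDF given x₁ at U₂.

      import numpy as np

      rng = np.random.default_rng(0)

      def sample_pair():
          u1, u2 = rng.uniform(size=2)
          # x1 = F1^{-1}(u1): Exponential(1) marginal
          x1 = -np.log(1.0 - u1)
          # x2 = F_{2|1}^{-1}(u2 | x1): Exponential whose rate depends on x1
          rate2 = 1.0 + x1
          x2 = -np.log(1.0 - u2) / rate2
          return x1, x2

      samples = np.array([sample_pair() for _ in range(100_000)])
      print("sample means:", samples.mean(axis=0))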

  18. Cache-enabled small cell networks: modeling and tradeoffs.

    PubMed

    Baştuǧ, Ejder; Bennis, Mehdi; Kountouris, Marios; Debbah, Mérouane

    We consider a network model where small base stations (SBSs) have caching capabilities as a means to alleviate the backhaul load and satisfy users' demand. The SBSs are stochastically distributed over the plane according to a Poisson point process (PPP) and serve their users either (i) by bringing the content from the Internet through a finite rate backhaul or (ii) by serving them from the local caches. We derive closed-form expressions for the outage probability and the average delivery rate as a function of the signal-to-interference-plus-noise ratio (SINR), SBS density, target file bitrate, storage size, file length, and file popularity. We then analyze the impact of key operating parameters on the system performance. It is shown that a certain outage probability can be achieved either by increasing the number of base stations or the total storage size. Our results and analysis provide key insights into the deployment of cache-enabled small cell networks (SCNs), which are seen as a promising solution for future heterogeneous cellular networks.

  19. Nuclear risk analysis of the Ulysses mission

    NASA Astrophysics Data System (ADS)

    Bartram, Bart W.; Vaughan, Frank R.; Englehart, Richard W., Dr.

    1991-01-01

    The use of a radioisotope thermoelectric generator fueled with plutonium-238 dioxide on the Space Shuttle-launched Ulysses mission implies some level of risk due to potential accidents. This paper describes the method used to quantify risks in the Ulysses mission Final Safety Analysis Report prepared for the U.S. Department of Energy. The starting point for the analysis described herein is the input of source term probability distributions provided by the General Electric Company. A Monte Carlo technique is used to develop probability distributions of radiological consequences for a range of accident scenarios throughout the mission. Factors affecting radiological consequences are identified, the probability distribution of the effect of each factor is determined, and the functional relationship among all the factors is established. The probability distributions of all the factor effects are then combined using a Monte Carlo technique. The results of the analysis are presented in terms of complementary cumulative distribution functions (CCDF) by mission sub-phase, phase, and the overall mission. The CCDFs show the total probability that consequences (calculated health effects) would be equal to or greater than a given value.
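
    A schematic Monte Carlo combination of the kind described (factor names, distributions and the functional relationship are all invented stand-ins): draw each factor, propagate through the relationship, and tabulate the complementary cumulative distribution function.

      import numpy as np

      rng = np.random.default_rng(42)
      n = 100_000

      # Illustrative factor distributions (stand-ins for source term, dispersion, dose)
      source_term = rng.lognormal(mean=0.0, sigma=1.0, size=n)
      dispersion = rng.uniform(0.1, 1.0, size=n)
      dose_factor = rng.normal(1.0, 0.2, size=n).clip(min=0.0)

      # Invented functional relationship among the factors
      consequences = source_term * dispersion * dose_factor

      # CCDF: probability that consequences equal or exceed a given value
      for c in np.logspace(-2, 1, 7):
          print(f"P(consequence >= {c:8.3f}) = {(consequences >= c).mean():.4f}")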

  20. The aluminum phosphate zone in the Peace River area, land-pebble phosphate field, Florida

    USGS Publications Warehouse

    Cathcart, James B.

    1953-01-01

    The Peace River area, comprising T. 30 and 31 S., R. 24 and 25 E., contains a thicker and more persistent aluminum phosphate zone, and one that is higher in P2O5 and uranium content than is known elsewhere in the land-pebble phosphate district. This report has been prepared to bring together all of the information on the aluminum phosphate zone in the area where the first plant to treat this material will probably be located. The area may be divided into three physiographic units, (1) the ridge, (2) the flatwoods, and (3) the valley. Maps showing distribution and grade of the aluminum phosphate zone indicate that the zone is thin or absent in the ridge unit, thickest and most persistent, and of the best grade in P2O5 and uranium in the flatwoods unit, and absent or very low in grade in the valley unit. Maps of thickness and of chemical composition show that even in favorable areas there are places where the aluminum phosphate zone is missing or of questionable economic importance. The distribution maps also show that areas of high P2O5 and high uranium content coincide closely. Areas containing thick aluminum phosphate material usually have high uranium and P2O5 contents. It is estimated that an average of 13,000 tons per day of aluminum phosphate material might be mined from this area. This figure is based on the probable amount of time, per year, that mining would be in favorable ground. When all mines in the area are in favorable ground, the tonnage per day might be about 23,000 tons. Tonnages of aluminum phosphate material have been computed for about 36 percent of the area of T. 30 S., R. 25 E., and for 18 percent of the area of T. 31 S., R. 25 E. The total inferred tonnage is about 150,000,000 short tons, with an average grade of 0.012 percent U3O8.

  1. Hoeffding Type Inequalities and their Applications in Statistics and Operations Research

    NASA Astrophysics Data System (ADS)

    Daras, Tryfon

    2007-09-01

    Large Deviation theory is the branch of Probability theory that deals with rare events. Sometimes, these events can be described by a sum of random variables that deviates from its mean by more than a "normal" amount. A precise calculation of the probabilities of such events turns out to be crucial in a variety of different contexts (e.g. in Probability Theory, Statistics, Operations Research, Statistical Physics, Financial Mathematics, etc.). Recent applications of the theory deal with random walks in random environments, interacting diffusions, heat conduction, and polymer chains [1]. In this paper we prove an inequality of exponential type, namely theorem 2.1, which gives a large deviation upper bound for a specific sequence of r.v.s. Inequalities of this type have many applications in Combinatorics [2]. The inequality generalizes already-proven results of this type in the case of symmetric probability measures. As consequences of the inequality we obtain: (a) large deviation upper bounds for exchangeable Bernoulli sequences of random variables, generalizing results proven for independent and identically distributed Bernoulli sequences of r.v.s., and (b) a general form of Bernstein's inequality. We compare the inequality with large deviation results already proven by the author and examine its advantages. Finally, using the inequality, we solve one of the basic problems of Operations Research (the bin packing problem) in the case of exchangeable r.v.s.
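
    For orientation, the classical Hoeffding inequality that results of this type generalize: for independent random variables X₁, ..., Xₙ with aᵢ ≤ Xᵢ ≤ bᵢ and Sₙ = X₁ + ... + Xₙ,

      \Pr\left(S_n - \mathbb{E}[S_n] \ge t\right) \;\le\; \exp\!\left(-\,\frac{2t^2}{\sum_{i=1}^{n}(b_i - a_i)^2}\right), \qquad t > 0.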

  2. Force Density Function Relationships in 2-D Granular Media

    NASA Technical Reports Server (NTRS)

    Youngquist, Robert C.; Metzger, Philip T.; Kilts, Kelly N.

    2004-01-01

    An integral transform relationship is developed to convert between two important probability density functions (distributions) used in the study of contact forces in granular physics. Developing this transform has now made it possible to compare and relate various theoretical approaches with one another and with the experimental data despite the fact that one may predict the Cartesian probability density and another the force magnitude probability density. Also, the transforms identify which functional forms are relevant to describe the probability density observed in nature, and so the modified Bessel function of the second kind has been identified as the relevant form for the Cartesian probability density corresponding to exponential forms in the force magnitude distribution. Furthermore, it is shown that this transform pair supplies a sufficient mathematical framework to describe the evolution of the force magnitude distribution under shearing. Apart from the choice of several coefficients, whose evolution of values must be explained in the physics, this framework successfully reproduces the features of the distribution that are taken to be an indicator of jamming and unjamming in a granular packing. Key words. Granular Physics, Probability Density Functions, Fourier Transforms

  3. An evaluation of procedures to estimate monthly precipitation probabilities

    NASA Astrophysics Data System (ADS)

    Legates, David R.

    1991-01-01

    Many frequency distributions have been used to evaluate monthly precipitation probabilities. Eight of these distributions (including Pearson type III, extreme value, and transform normal probability density functions) are comparatively examined to determine their ability to represent accurately variations in monthly precipitation totals for global hydroclimatological analyses. Results indicate that a modified version of the Box-Cox transform-normal distribution more adequately describes the 'true' precipitation distribution than does any of the other methods. This assessment was made using a cross-validation procedure for a global network of 253 stations for which at least 100 years of monthly precipitation totals were available.
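
    A sketch of the transform-normal idea under simplifying assumptions (synthetic data, and SciPy's standard Box-Cox rather than the modified version evaluated in the paper):

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(1)
      monthly_totals = rng.gamma(shape=2.0, scale=40.0, size=100)   # synthetic, mm

      # Box-Cox transform to near-normality; lambda chosen by maximum likelihood
      transformed, lam = stats.boxcox(monthly_totals)
      mu, sigma = transformed.mean(), transformed.std(ddof=1)

      # Non-exceedance probability of a 150 mm month under the transform-normal fit
      x = 150.0
      x_t = (x**lam - 1.0) / lam if lam != 0 else np.log(x)
      print(f"lambda = {lam:.3f}, P(X <= {x} mm) = {stats.norm.cdf(x_t, mu, sigma):.3f}")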

  4. q-Gaussian distributions of leverage returns, first stopping times, and default risk valuations

    NASA Astrophysics Data System (ADS)

    Katz, Yuri A.; Tian, Li

    2013-10-01

    We study the probability distributions of daily leverage returns of 520 North American industrial companies that survive de-listing during the financial crisis, 2006-2012. We provide evidence that distributions of unbiased leverage returns of all individual firms belong to the class of q-Gaussian distributions with the Tsallis entropic parameter within the interval 1

  5. Probability weighted moments: Definition and relation to parameters of several distributions expressable in inverse form

    USGS Publications Warehouse

    Greenwood, J. Arthur; Landwehr, J. Maciunas; Matalas, N.C.; Wallis, J.R.

    1979-01-01

    Distributions whose inverse forms are explicitly defined, such as Tukey's lambda, may present problems in deriving their parameters by more conventional means. Probability weighted moments are introduced and shown to be potentially useful in expressing the parameters of these distributions.
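
    In the usual notation, the probability weighted moments introduced here are expectations of the variable weighted by powers of its own distribution function,

      M_{p,r,s} \;=\; \mathbb{E}\!\left[X^{p}\,\{F(X)\}^{r}\,\{1-F(X)\}^{s}\right] \;=\; \int_{0}^{1} x(F)^{p}\, F^{r}\,(1-F)^{s}\, dF,

    where x(F) is the inverse (quantile) form of the distribution; the second integral is what makes these moments convenient for distributions, like Tukey's lambda, that are defined through x(F).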

  6. Univariate Probability Distributions

    ERIC Educational Resources Information Center

    Leemis, Lawrence M.; Luckett, Daniel J.; Powell, Austin G.; Vermeer, Peter E.

    2012-01-01

    We describe a web-based interactive graphic that can be used as a resource in introductory classes in mathematical statistics. This interactive graphic presents 76 common univariate distributions and gives details on (a) various features of the distribution such as the functional form of the probability density function and cumulative distribution…

  7. A probability space for quantum models

    NASA Astrophysics Data System (ADS)

    Lemmens, L. F.

    2017-06-01

    A probability space contains a set of outcomes, a collection of events formed by subsets of the set of outcomes, and probabilities defined for all events. A reformulation in terms of propositions allows one to use the maximum entropy method to assign the probabilities, taking some constraints into account. The construction of a probability space for quantum models is determined by the choice of propositions, the choice of constraints, and the probability assignment by the maximum entropy method. This approach shows how typical quantum distributions such as Maxwell-Boltzmann, Fermi-Dirac and Bose-Einstein are partly related to well-known classical distributions. The relation between the conditional probability density, given some averages as constraints, and the appropriate ensemble is elucidated.

  8. Regional probability distribution of the annual reference evapotranspiration and its effective parameters in Iran

    NASA Astrophysics Data System (ADS)

    Khanmohammadi, Neda; Rezaie, Hossein; Montaseri, Majid; Behmanesh, Javad

    2017-10-01

    The reference evapotranspiration (ET0) plays an important role in water management plans in arid or semi-arid countries such as Iran. For this reason, the regional analysis of this parameter is important. The ET0 process is affected by several meteorological parameters such as wind speed, solar radiation, temperature and relative humidity. Therefore, the effect of the distribution type of the effective meteorological variables on the ET0 distribution was analyzed. For this purpose, the regional probability distributions of the annual ET0 and its effective parameters were selected. The data used in this research were recorded at 30 synoptic stations in Iran during 1960-2014. Using the probability plot correlation coefficient (PPCC) test and the L-moment method, five common distributions were compared and the best distribution was selected. The results of the PPCC test and the L-moment diagram indicated that the Pearson type III distribution was the best probability distribution for fitting annual ET0 and its four effective parameters. The RMSE results showed that the ability of the PPCC test and the L-moment method for regional analysis of reference evapotranspiration and its effective parameters was similar. The results also showed that the distribution types of the parameters which affect ET0 values can affect the distribution of reference evapotranspiration.

  9. Probabilistic reasoning in data analysis.

    PubMed

    Sirovich, Lawrence

    2011-09-20

    This Teaching Resource provides lecture notes, slides, and a student assignment for a lecture on probabilistic reasoning in the analysis of biological data. General probabilistic frameworks are introduced, and a number of standard probability distributions are described using simple intuitive ideas. Particular attention is focused on random arrivals that are independent of prior history (Markovian events), with an emphasis on waiting times, Poisson processes, and Poisson probability distributions. The use of these various probability distributions is applied to biomedical problems, including several classic experimental studies.

  10. Multiobjective fuzzy stochastic linear programming problems with inexact probability distribution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hamadameen, Abdulqader Othman; Zainuddin, Zaitul Marlizawati

    This study deals with multiobjective fuzzy stochastic linear programming problems with an uncertain probability distribution, defined through fuzzy assertions by ambiguous experts. The problem formulation is presented along with two solution strategies: the fuzzy transformation via a ranking function, and the stochastic transformation in which the α-cut technique and linguistic hedges are applied to the uncertain probability distribution. A development of Sen's method is employed to find a compromise solution, supported by an illustrative numerical example.

  11. Work probability distribution and tossing a biased coin

    NASA Astrophysics Data System (ADS)

    Saha, Arnab; Bhattacharjee, Jayanta K.; Chakraborty, Sagar

    2011-01-01

    We show that the rare events present in the dissipated work that enters the Jarzynski equality, when mapped appropriately to the phenomenon of large deviations found in a biased coin toss, are enough to yield a quantitative work probability distribution for the Jarzynski equality. This allows us to propose a recipe for constructing the work probability distribution independent of the details of any relevant system. The underlying framework, developed herein, is expected to be of use in modeling other physical phenomena where rare events play an important role.

  12. A probabilistic approach for shallow rainfall-triggered landslide modeling at basin scale. A case study in the Luquillo Forest, Puerto Rico

    NASA Astrophysics Data System (ADS)

    Dialynas, Y. G.; Arnone, E.; Noto, L. V.; Bras, R. L.

    2013-12-01

    Slope stability depends on geotechnical and hydrological factors that exhibit wide natural spatial variability, yet sufficient measurements of the related parameters are rarely available over entire study areas. The uncertainty associated with the inability to fully characterize hydrologic behavior has an impact on any attempt to model landslide hazards. This work suggests a way to systematically account for this uncertainty in coupled distributed hydrological-stability models for shallow landslide hazard assessment. A probabilistic approach for the prediction of rainfall-triggered landslide occurrence at basin scale was implemented in an existing distributed eco-hydrological and landslide model, tRIBS-VEGGIE-Landslide (Triangulated Irregular Network (TIN)-based Real-time Integrated Basin Simulator - VEGetation Generator for Interactive Evolution). More precisely, we upgraded tRIBS-VEGGIE-Landslide to assess the likelihood of shallow landslides by accounting for uncertainty related to the geotechnical and hydrological factors that directly affect slope stability. Natural variability of geotechnical soil characteristics was considered by randomizing soil cohesion and friction angle. Hydrological uncertainty related to the estimation of matric suction was taken into account by considering soil retention parameters as correlated random variables. The probability of failure is estimated through an assumed theoretical Factor of Safety (FS) distribution, conditioned on soil moisture content. At each cell, the temporally varying FS statistics are approximated by the First Order Second Moment (FOSM) method, as a function of the parameters' statistical properties. The model was applied to the Rio Mameyes Basin, located in the Luquillo Experimental Forest in Puerto Rico, where previous landslide analyses have been carried out. At each time step, model outputs include the probability of landslide occurrence across the basin and the most probable depth of failure at each soil column. The proposed probabilistic approach for shallow landslide prediction is able to reveal and quantify landslide risk at slopes assessed as stable by simpler deterministic methods.
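
    A compact sketch of the FOSM step, using an illustrative infinite-slope factor-of-safety function and invented parameter statistics: the FS mean is evaluated at the parameter means, the FS variance from first-order sensitivities, and the probability of failure from an assumed Normal FS distribution.

      import math

      # Illustrative infinite-slope factor of safety (cohesion c in kPa, angles in rad)
      def fs(c, phi, gamma=18.0, H=1.5, beta=math.radians(30)):
          return c / (gamma * H * math.sin(beta) * math.cos(beta)) \
              + math.tan(phi) / math.tan(beta)

      # Invented means and standard deviations of the randomized soil parameters
      c_mu, c_sd = 5.0, 1.5
      phi_mu, phi_sd = math.radians(32), math.radians(3)

      # FOSM: mean at parameter means; variance from numerical first-order sensitivities
      fs_mu = fs(c_mu, phi_mu)
      eps = 1e-5
      d_c = (fs(c_mu + eps, phi_mu) - fs_mu) / eps
      d_phi = (fs(c_mu, phi_mu + eps) - fs_mu) / eps
      fs_sd = math.sqrt((d_c * c_sd)**2 + (d_phi * phi_sd)**2)

      # Probability of failure P(FS < 1) under an assumed Normal FS distribution
      z = (1.0 - fs_mu) / fs_sd
      p_fail = 0.5 * math.erfc(-z / math.sqrt(2.0))
      print(f"FS mean = {fs_mu:.2f}, sd = {fs_sd:.2f}, P(FS < 1) = {p_fail:.4f}")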

  13. Hybrid computer technique yields random signal probability distributions

    NASA Technical Reports Server (NTRS)

    Cameron, W. D.

    1965-01-01

    Hybrid computer determines the probability distributions of instantaneous and peak amplitudes of random signals. This combined digital and analog computer system reduces the errors and delays of manual data analysis.

  14. Fast Reliability Assessing Method for Distribution Network with Distributed Renewable Energy Generation

    NASA Astrophysics Data System (ADS)

    Chen, Fan; Huang, Shaoxiong; Ding, Jinjin; Ding, Jinjin; Gao, Bo; Xie, Yuguang; Wang, Xiaoming

    2018-01-01

    This paper proposes a fast reliability assessment method for distribution grids with distributed renewable energy generation. First, the Weibull distribution and the Beta distribution are used to describe the probability distribution characteristics of wind speed and solar irradiance, respectively, and models of the wind farm, solar park and local load are built for reliability assessment. Then, based on power system production cost simulation, probability discretization and linearized power flow, an optimal power flow problem with the objective of minimizing the cost of conventional power generation is solved. In this way a reliability assessment for the distribution grid is carried out quickly and accurately. The Loss Of Load Probability (LOLP) and Expected Energy Not Supplied (EENS) are selected as the reliability indices; a simulation of the IEEE RBTS BUS6 system in MATLAB indicates that the fast reliability assessment method calculates the reliability indices much faster than the Monte Carlo method while ensuring accuracy.
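
    For comparison with the paper's discretized approach, a toy Monte Carlo version of the two indices (all capacities, distribution shapes and the load model are invented, and network constraints are ignored):

      import numpy as np

      rng = np.random.default_rng(7)
      n = 200_000

      # Wind: Weibull wind speed -> simple cubic power curve (20 MW rated, invented)
      v = rng.weibull(2.0, size=n) * 8.0                  # shape 2, scale 8 m/s
      wind_mw = np.clip((v / 12.0)**3, 0.0, 1.0) * 20.0
      wind_mw[(v < 3.0) | (v > 25.0)] = 0.0               # cut-in / cut-out

      # Solar: Beta-distributed output fraction of a 10 MW plant (invented)
      solar_mw = rng.beta(2.0, 3.0, size=n) * 10.0

      conventional_mw = 30.0                              # firm capacity
      load_mw = rng.normal(40.0, 5.0, size=n)             # illustrative load

      shortfall = load_mw - (conventional_mw + wind_mw + solar_mw)
      lolp = (shortfall > 0).mean()                       # Loss Of Load Probability
      eens = shortfall.clip(min=0.0).mean() * 8760.0      # Expected Energy Not
      print(f"LOLP = {lolp:.4f}, EENS = {eens:.1f} MWh/yr")  # Supplied, per year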

  15. 40 CFR Appendix C to Part 191 - Guidance for Implementation of Subpart B

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... that the remaining probability distribution of cumulative releases would not be significantly changed... with § 191.13 into a “complementary cumulative distribution function” that indicates the probability of... distribution function for each disposal system considered. The Agency assumes that a disposal system can be...

  16. 40 CFR Appendix C to Part 191 - Guidance for Implementation of Subpart B

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... that the remaining probability distribution of cumulative releases would not be significantly changed... with § 191.13 into a “complementary cumulative distribution function” that indicates the probability of... distribution function for each disposal system considered. The Agency assumes that a disposal system can be...

  17. Effects of land use, climate, topography and soil properties on regional soil organic carbon and total nitrogen in the upstream watershed of Miyun Reservoir, North China.

    PubMed

    Wang, Shufang; Wang, Xiaoke; Ouyang, Zhiyun

    2012-01-01

    Soil organic carbon (SOC) and total nitrogen (TN) contents, as well as their relationships with site characteristics, are of profound importance in assessing current regional, continental and global soil C and N stocks and the potential for C sequestration and N conservation to offset anthropogenic emissions of greenhouse gases. This study investigated the contents and distribution of SOC and TN under different land uses, and the quantitative relationships between SOC or TN and site characteristics in the upstream watershed of Miyun Reservoir, North China. Overall, both SOC and TN contents in natural secondary forests and grasslands were much higher than in plantations and croplands. Land use alone explained 37.2% and 38.4% of the variation in SOC and TN contents, respectively. The optimal models for SOC and TN, achieved by multiple regression analysis combined with principal component analysis (PCA) to remove the multicollinearity among site variables, showed that elevation, slope, and soil clay and water contents were the most significant factors controlling SOC and TN contents, jointly explaining 70.3% of the variability in SOC contents and 67.1% in TN contents. Adding land use to the regressions increased the explained variability of SOC and TN contents by only an additional 1.9% and 3%, respectively, probably because environmental factors determine land use. Therefore, environmental variables were more important for SOC and TN variability than land use in the study area, and should be taken into consideration when evaluating the effects of future land use changes on SOC and TN at a regional scale.

  18. Optimal methods for fitting probability distributions to propagule retention time in studies of zoochorous dispersal.

    PubMed

    Viana, Duarte S; Santamaría, Luis; Figuerola, Jordi

    2016-02-01

    Propagule retention time is a key factor in determining propagule dispersal distance and the shape of "seed shadows". Propagules dispersed by animal vectors are either ingested and retained in the gut until defecation or attached externally to the body until detachment. Retention time is a continuous variable, but it is commonly measured at discrete time points, according to pre-established sampling time-intervals. Although parametric continuous distributions have been widely fitted to these interval-censored data, the performance of different fitting methods has not been evaluated. To investigate the performance of five different fitting methods, we fitted parametric probability distributions to typical discretized retention-time data with known distribution using as data-points either the lower, mid or upper bounds of sampling intervals, as well as the cumulative distribution of observed values (using either maximum likelihood or non-linear least squares for parameter estimation); then compared the estimated and original distributions to assess the accuracy of each method. We also assessed the robustness of these methods to variations in the sampling procedure (sample size and length of sampling time-intervals). Fittings to the cumulative distribution performed better for all types of parametric distributions (lognormal, gamma and Weibull distributions) and were more robust to variations in sample size and sampling time-intervals. These estimated distributions had negligible deviations of up to 0.045 in cumulative probability of retention times (according to the Kolmogorov-Smirnov statistic) in relation to original distributions from which propagule retention time was simulated, supporting the overall accuracy of this fitting method. In contrast, fitting the sampling-interval bounds resulted in greater deviations that ranged from 0.058 to 0.273 in cumulative probability of retention times, which may introduce considerable biases in parameter estimates. We recommend the use of cumulative probability to fit parametric probability distributions to propagule retention time, specifically using maximum likelihood for parameter estimation. Furthermore, the experimental design for an optimal characterization of unimodal propagule retention time should contemplate at least 500 recovered propagules and sampling time-intervals not larger than the time peak of propagule retrieval, except in the tail of the distribution where broader sampling time-intervals may also produce accurate fits.
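
    A sketch of the recommended strategy under simplifying assumptions: synthetic interval-censored retention times, with the lognormal CDF fitted to the cumulative proportions by nonlinear least squares (the authors' preferred maximum likelihood variant would replace the curve fit):

      import numpy as np
      from scipy import stats
      from scipy.optimize import curve_fit

      rng = np.random.default_rng(3)

      # Synthetic retention times (h), observed only at sampling-interval endpoints
      true_dist = stats.lognorm(s=0.6, scale=8.0)
      edges = np.arange(0.0, 49.0, 4.0)                # 4 h sampling intervals
      counts, _ = np.histogram(true_dist.rvs(500, random_state=rng), bins=edges)

      # Cumulative proportion of propagules recovered by the end of each interval
      cum_p = np.cumsum(counts) / counts.sum()

      # Fit the lognormal CDF to the cumulative distribution
      f = lambda t, s, scale: stats.lognorm.cdf(t, s, scale=scale)
      (s_hat, scale_hat), _ = curve_fit(f, edges[1:], cum_p, p0=(1.0, 5.0))
      print(f"fitted s = {s_hat:.3f} (true 0.6), scale = {scale_hat:.2f} (true 8.0)")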

  19. How Can Histograms Be Useful for Introducing Continuous Probability Distributions?

    ERIC Educational Resources Information Center

    Derouet, Charlotte; Parzysz, Bernard

    2016-01-01

    The teaching of probability has changed a great deal since the end of the last century. The development of technologies is indeed part of this evolution. In France, continuous probability distributions began to be studied in 2002 by scientific 12th graders, but this subject was marginal and appeared only as an application of integral calculus.…

  20. Pseudo Bayes Estimates for Test Score Distributions and Chained Equipercentile Equating. Research Report. ETS RR-09-47

    ERIC Educational Resources Information Center

    Moses, Tim; Oh, Hyeonjoo J.

    2009-01-01

    Pseudo Bayes probability estimates are weighted averages of raw and modeled probabilities; these estimates have been studied primarily in nonpsychometric contexts. The purpose of this study was to evaluate pseudo Bayes probability estimates as applied to the estimation of psychometric test score distributions and chained equipercentile equating…

  1. Failure probability under parameter uncertainty.

    PubMed

    Gerrard, R; Tsanakas, A

    2011-05-01

    In many problems of risk analysis, failure is equivalent to the event of a random risk factor exceeding a given threshold. Failure probabilities can be controlled if a decision-maker is able to set the threshold at an appropriate level. This abstract situation applies, for example, to environmental risks with infrastructure controls; to supply chain risks with inventory controls; and to insurance solvency risks with capital controls. However, uncertainty around the distribution of the risk factor implies that parameter error will be present and the measures taken to control failure probabilities may not be effective. We show that parameter uncertainty increases the probability (understood as expected frequency) of failures. For a large class of loss distributions, arising from increasing transformations of location-scale families (including the log-normal, Weibull, and Pareto distributions), the article shows that failure probabilities can be exactly calculated, as they are independent of the true (but unknown) parameters. Hence it is possible to obtain an explicit measure of the effect of parameter uncertainty on failure probability. Failure probability can be controlled in two different ways: (1) by reducing the nominal required failure probability, depending on the size of the available data set, and (2) by modifying the distribution itself that is used to calculate the risk control. Approach (1) corresponds to a frequentist/regulatory view of probability, while approach (2) is consistent with a Bayesian/personalistic view. We furthermore show that the two approaches are consistent in achieving the required failure probability. Finally, we briefly discuss the effects of data pooling and its systemic risk implications. © 2010 Society for Risk Analysis.

  2. Comparison of three-parameter probability distributions for representing annual extreme and partial duration precipitation series

    NASA Astrophysics Data System (ADS)

    Wilks, Daniel S.

    1993-10-01

    Performance of 8 three-parameter probability distributions for representing annual extreme and partial duration precipitation data at stations in the northeastern and southeastern United States is investigated. Particular attention is paid to fidelity on the right tail, through use of a bootstrap procedure simulating extrapolation on the right tail beyond the data. It is found that the beta-κ distribution best describes the extreme right tail of annual extreme series, and the beta-P distribution is best for the partial duration data. The conventionally employed two-parameter Gumbel distribution is found to substantially underestimate probabilities associated with the larger precipitation amounts for both annual extreme and partial duration data. Fitting the distributions using left-censored data did not result in improved fits to the right tail.

  3. Moisture Content Influences Ignitability of Slash Pine Litter

    Treesearch

    Winfred H. Blackmarr

    1972-01-01

    The influence of moisture content on the ignitability of slash pine litter was measured by dropping lighted matches onto fuel beds conditioned to different levels of moisture content. The percentage of matches igniting the fuel bed was used to indicate ignition probability at each moisture content. The "critical range" of fuel moisture contents within which...

  4. Differentiation in Data Analysis & Probability, PreK-Grade 2: A Content Companion for Ongoing Assessment, Grouping Students, Targeting Instruction, and Adjusting Levels of Cognitive Demand

    ERIC Educational Resources Information Center

    Taylor-Cox, Jennifer

    2008-01-01

    This book applies the author's easy but effective differentiation strategies to the data analysis and probability content standard. Taking the foundational elements of differentiation in this book, it helps you: (1) assess students' math abilities quickly and efficiently; (2) group children by need; (3) target instruction to meet every student's…

  5. The estimation of tree posterior probabilities using conditional clade probability distributions.

    PubMed

    Larget, Bret

    2013-07-01

    In this article I introduce the idea of conditional independence of separated subtrees as a principle by which to estimate the posterior probability of trees using conditional clade probability distributions rather than simple sample relative frequencies. I describe an algorithm for these calculations and software which implements these ideas. I show that these alternative calculations are very similar to simple sample relative frequencies for high probability trees but are substantially more accurate for relatively low probability trees. The method allows the posterior probability of unsampled trees to be calculated when these trees contain only clades that are in other sampled trees. Furthermore, the method can be used to estimate the total probability of the set of sampled trees which provides a measure of the thoroughness of a posterior sample.

  6. Probability Analysis of the Wave-Slamming Pressure Values of the Horizontal Deck with Elastic Support

    NASA Astrophysics Data System (ADS)

    Zuo, Weiguang; Liu, Ming; Fan, Tianhui; Wang, Pengtao

    2018-06-01

    This paper presents the probability distribution of the slamming pressure from an experimental study of regular wave slamming on an elastically supported horizontal deck. The time series of the slamming pressure during the wave impact were first obtained through statistical analyses of the experimental data. The exceedance probability distribution of the maximum slamming pressure peak and its distribution parameters were analyzed, and the results show that the exceedance probability distribution of the maximum slamming pressure peak follows the three-parameter Weibull distribution. Furthermore, the ranges of and relationships between the distribution parameters were studied. The sum of the location parameter D and the scale parameter L was approximately equal to 1.0, and the exceedance probability was more than 36.79% when the random peak was equal to the sample average during the wave impact. The variation of the distribution parameters and the slamming pressure under different model conditions is comprehensively presented, and the parameter values of the Weibull distribution of wave-slamming pressure peaks differed between test models. The parameter values were found to decrease with increased stiffness of the elastic support. A damage criterion for the structure model under wave impact is initially discussed: the structure model was destroyed when the average slamming time exceeded a certain value during the duration of the wave impact. The conclusions of the experimental study are then described.
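
    The three-parameter Weibull exceedance fit reported above corresponds to the survival function P(X > x) = exp(-((x - location)/scale)^shape); a quick numerical check with invented parameter values shows the exp(-1) ≈ 36.79% exceedance one scale unit above the location:

      from scipy.stats import weibull_min

      # Invented three-parameter Weibull for normalized slamming pressure peaks
      shape, loc, scale = 1.4, 0.2, 0.8

      peak = weibull_min(shape, loc=loc, scale=scale)
      x = loc + scale                     # one scale unit above the location
      print(f"P(peak > {x:.2f}) = {peak.sf(x):.4f}")   # exp(-1) ~ 0.3679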

  7. On the inequivalence of the CH and CHSH inequalities due to finite statistics

    NASA Astrophysics Data System (ADS)

    Renou, M. O.; Rosset, D.; Martin, A.; Gisin, N.

    2017-06-01

    Different variants of a Bell inequality, such as CHSH and CH, are known to be equivalent when evaluated on nonsignaling outcome probability distributions. However, in experimental setups, the outcome probability distributions are estimated using a finite number of samples. Therefore the nonsignaling conditions are only approximately satisfied and the robustness of the violation depends on the chosen inequality variant. We explain that phenomenon using the decomposition of the space of outcome probability distributions under the action of the symmetry group of the scenario, and propose a method to optimize the statistical robustness of a Bell inequality. In the process, we describe the finite group composed of relabeling of parties, measurement settings and outcomes, and identify correspondences between the irreducible representations of this group and properties of outcome probability distributions such as normalization, signaling or having uniform marginals.

  8. Characteristics of plastids responsible for starch synthesis in developing pea embryos.

    PubMed

    Smith, A M; Quinton-Tulloch, J; Denyer, K

    1990-03-01

    The nature of the starch-synthesising plastids in developing pea (Pisum sativum L.) embryos has been investigated. Chlorophyll and starch were distributed throughout the cotyledon during development. Chlorophyll content increased initially, then showed little change up to the point of drying out of the embryo. Starch content per embryo increased dramatically throughout development. The chlorophyll content per unit volume was highest on the outer edge of the cotyledon, while the starch content was highest on the inner face. Nycodenz gradients, which fractionated mechanically-prepared plastids according to their starch content, failed to achieve any significant separation of plastids rich in starch and ADP-glucose pyrophosphorylase from those rich in chlorophyll and a Calvin-cycle marker enzyme, NADP-glyceraldehyde-3-phosphate dehydrogenase. However, material that was not sufficiently dense to enter the gradients was enriched in activity of the Calvin-cycle marker enzyme relative to that of ADP-glucose pyrophosphorylase. Nomarski and epi-fluorescence microscopy showed that intact, isolated plastids, including those with very large starch grains, invariably contained chlorophyll in stromal structures peripheral to the starch grain. We suggest that the starch-storing plastids of developing pea embryos are derived directly from chloroplasts, and retain chloroplast-like characteristics throughout their development. Developing pea embryos also contain chloroplasts which store little or no starch. These are probably located primarily on the outer edge of the cotyledons, where there is sufficient light for photosynthesis at some stages of development.

  9. Confidence as Bayesian Probability: From Neural Origins to Behavior.

    PubMed

    Meyniel, Florent; Sigman, Mariano; Mainen, Zachary F

    2015-10-07

    Research on confidence spreads across several sub-fields of psychology and neuroscience. Here, we explore how a definition of confidence as Bayesian probability can unify these viewpoints. This computational view entails that there are distinct forms in which confidence is represented and used in the brain, including distributional confidence, pertaining to neural representations of probability distributions, and summary confidence, pertaining to scalar summaries of those distributions. Summary confidence is, normatively, derived or "read out" from distributional confidence. Neural implementations of readout will trade off optimality versus flexibility of routing across brain systems, allowing confidence to serve diverse cognitive functions. Copyright © 2015 Elsevier Inc. All rights reserved.

  10. Exact probability distribution functions for Parrondo's games

    NASA Astrophysics Data System (ADS)

    Zadourian, Rubina; Saakian, David B.; Klümper, Andreas

    2016-12-01

    We study the discrete time dynamics of Brownian ratchet models and Parrondo's games. Using the Fourier transform, we calculate the exact probability distribution functions for both the capital dependent and history dependent Parrondo's games. In certain cases we find strong oscillations near the maximum of the probability distribution with two limiting distributions for odd and even number of rounds of the game. Indications of such oscillations first appeared in the analysis of real financial data, but now we have found this phenomenon in model systems and a theoretical understanding of the phenomenon. The method of our work can be applied to Brownian ratchets, molecular motors, and portfolio optimization.

  11. Exact probability distribution functions for Parrondo's games.

    PubMed

    Zadourian, Rubina; Saakian, David B; Klümper, Andreas

    2016-12-01

    We study the discrete time dynamics of Brownian ratchet models and Parrondo's games. Using the Fourier transform, we calculate the exact probability distribution functions for both the capital dependent and history dependent Parrondo's games. In certain cases we find strong oscillations near the maximum of the probability distribution with two limiting distributions for odd and even number of rounds of the game. Indications of such oscillations first appeared in the analysis of real financial data, but now we have found this phenomenon in model systems and a theoretical understanding of the phenomenon. The method of our work can be applied to Brownian ratchets, molecular motors, and portfolio optimization.

  12. Distribution of heavy metals and radionuclides in sediments, water, and fish in an area of Great Bear Lake contaminated with mine wastes.

    PubMed

    Moore, J W; Sutherland, D J

    1981-01-01

    The concentrations of heavy metals and radionuclides in the sediments and water of Great Bear Lake were determined during 1978 near an operating silver mine and an abandoned uranium mine. Additional information on the level of mercury in fish tissues was also collected. The mines, situated on the same site, deposited tailings and other waste material directly into the lake. The concentrations of mercury, lead, manganese, and nickel in the sediments were highest near the tailings deposit and decreased significantly as the distance from the mine increased. Although there were also significant positive correlations between these metals and the organic content of the sediments, water depth and slope of the bottom had no impact on metal distribution. Since the concentrations of arsenic, cobalt, copper, radium-226, lead-210 and thorium-230 varied inconsistently throughout the study area, the distribution of these substances could not be related to any of the environmental factors that were measured. There were, however, significant negative correlations between the concentrations of thorium-232 and thorium-228 and both distance from the mine and the organic content of the sediments. Heavy metal and radionuclide levels in water were generally below detectable limits, reflecting the strong chemical bonding characteristics of the sediments. The low concentrations of mercury in the tissues of lake trout Salvelinus namaycush were probably related to low uptake rates and the ability of this species to move into uncontaminated areas of the lake.

  13. What Can Quantum Optics Say about Computational Complexity Theory?

    NASA Astrophysics Data System (ADS)

    Rahimi-Keshari, Saleh; Lund, Austin P.; Ralph, Timothy C.

    2015-02-01

    Considering the problem of sampling from the output photon-counting probability distribution of a linear-optical network for input Gaussian states, we obtain results that are of interest from both quantum theory and the computational complexity theory point of view. We derive a general formula for calculating the output probabilities, and by considering input thermal states, we show that the output probabilities are proportional to permanents of positive-semidefinite Hermitian matrices. It is believed that approximating permanents of complex matrices in general is a #P-hard problem. However, we show that these permanents can be approximated with an algorithm in the BPP^NP complexity class, as there exists an efficient classical algorithm for sampling from the output probability distribution. We further consider input squeezed-vacuum states and discuss the complexity of sampling from the probability distribution at the output.

  14. Vacuum quantum stress tensor fluctuations: A diagonalization approach

    NASA Astrophysics Data System (ADS)

    Schiappacasse, Enrico D.; Fewster, Christopher J.; Ford, L. H.

    2018-01-01

    Large vacuum fluctuations of a quantum stress tensor can be described by the asymptotic behavior of its probability distribution. Here we focus on stress tensor operators which have been averaged with a sampling function in time. The Minkowski vacuum state is not an eigenstate of the time-averaged operator, but can be expanded in terms of its eigenstates. We calculate the probability distribution and the cumulative probability distribution for obtaining a given value in a measurement of the time-averaged operator taken in the vacuum state. In these calculations, we study a specific operator that contributes to the stress-energy tensor of a massless scalar field in Minkowski spacetime, namely, the normal ordered square of the time derivative of the field. We analyze the rate of decrease of the tail of the probability distribution for different temporal sampling functions, such as compactly supported functions and the Lorentzian function. We find that the tails decrease relatively slowly, as exponentials of fractional powers, in agreement with previous work using the moments of the distribution. Our results lend additional support to the conclusion that large vacuum stress tensor fluctuations are more probable than large thermal fluctuations, and may have observable effects.

  15. Measurements of gas hydrate formation probability distributions on a quasi-free water droplet

    NASA Astrophysics Data System (ADS)

    Maeda, Nobuo

    2014-06-01

    A High Pressure Automated Lag Time Apparatus (HP-ALTA) can measure gas hydrate formation probability distributions from water in a glass sample cell. In an HP-ALTA, gas hydrate formation originates near the edges of the sample cell and gas hydrate films subsequently grow across the water-guest gas interface. Ideally, one would measure the gas hydrate formation probability distributions of a single water droplet or mist freely levitating in a guest gas, but this is technically challenging. The next best option is to let a water droplet sit on top of a denser, immiscible, inert, and wall-wetting hydrophobic liquid, to avoid contact of the water droplet with the solid walls. Here we report the development of a second-generation HP-ALTA which can measure gas hydrate formation probability distributions of a water droplet which sits on a perfluorocarbon oil in a container that is coated with 1H,1H,2H,2H-Perfluorodecyltriethoxysilane. It was found that the gas hydrate formation probability distributions of such a quasi-free water droplet were significantly lower than those of water in a glass sample cell.

  16. Characterizing a Brazilian sanitary landfill using geophysical seismic techniques.

    PubMed

    Abreu, A E S; Gandolfo, O C B; Vilar, O M

    2016-07-01

    Two different geophysical techniques, namely crosshole and multichannel analysis of surface waves (MASW), were applied to investigate the mechanical response of Municipal Solid Waste buried under a humid, subtropical climate. Direct investigations revealed that the buried waste was composed mainly of soil-like material (51%) and plastics (31%), with average moisture content values of 43% near the surface and 53% below around 11 m depth. Unit weight varied between 9 kN/m³ and 15 kN/m³. Seismic investigation of the landfill yielded shear wave velocities (VS) estimated from the crosshole tests ranging from 92 to 214 m/s, while compression wave velocities (VP) ranged from 197 to 451 m/s. Both velocities were influenced by vertical confining stress and thus tended to increase with depth. VS calculated from MASW tests were lower than the ones calculated from the crosshole tests, probably due to the different frequencies used in the tests. The results of both methods tended to form a lower bound to the values reported in the technical literature in general, as expected for low-compaction waste with small amounts of cover soil. Although VS did not show abrupt changes with depth, the VP profile distribution combined with direct investigation results, such as temperature, in-place unit weight and moisture content, suggests that the waste body could be divided into two strata. The lower one is poorly drained and shows higher moisture content, as a consequence of the operational techniques used in the first years, while the upper stratum is probably related to a better drained waste stratum, resulting from the improvement of operational standards and the increase in drainage facilities throughout the years. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Training Teachers to Teach Probability

    ERIC Educational Resources Information Center

    Batanero, Carmen; Godino, Juan D.; Roa, Rafael

    2004-01-01

    In this paper we analyze the reasons why the teaching of probability is difficult for mathematics teachers, describe the contents needed in the didactical preparation of teachers to teach probability and analyze some examples of activities to carry out this training. These activities take into account the experience at the University of Granada,…

  18. Misconceptions in Rational Numbers, Probability, Algebra, and Geometry

    ERIC Educational Resources Information Center

    Rakes, Christopher R.

    2010-01-01

    In this study, the author examined the relationship of probability misconceptions to algebra, geometry, and rational number misconceptions and investigated the potential of probability instruction as an intervention to address misconceptions in all 4 content areas. Through a review of literature, 5 fundamental concepts were identified that, if…

  19. Fragment size distribution in viscous bag breakup of a drop

    NASA Astrophysics Data System (ADS)

    Kulkarni, Varun; Bulusu, Kartik V.; Plesniak, Michael W.; Sojka, Paul E.

    2015-11-01

    In this study we examine the drop size distribution resulting from the fragmentation of a single drop in the presence of a continuous air jet. Specifically, we study the effect of the Weber number, We, and the Ohnesorge number, Oh, on the disintegration process. The regime of breakup considered is observed between 12 <= We <= 16 for Oh <= 0.1. Experiments are conducted using phase Doppler anemometry. Both the number and volume fragment size probability distributions are plotted. The volume probability distribution revealed a bi-modal behavior with two distinct peaks: one corresponding to the rim fragments and the other to the bag fragments. This behavior was suppressed in the number probability distribution. Additionally, we employ an in-house particle detection code to isolate the rim fragment size distribution from the total probability distributions. Our experiments showed that the bag fragments are smaller in diameter and larger in number, while the rim fragments are larger in diameter and smaller in number. Furthermore, with increasing We for a given Oh we observe a large number of small-diameter drops and a small number of large-diameter drops. On the other hand, with increasing Oh for a fixed We the opposite is seen.

  20. Nuclear risk analysis of the Ulysses mission

    NASA Astrophysics Data System (ADS)

    Bartram, Bart W.; Vaughan, Frank R.; Englehart, Richard W.

    An account is given of the method used to quantify the risks arising from the use of a radioisotope thermoelectric generator fueled by Pu-238 dioxide aboard the Space Shuttle-launched Ulysses mission. A Monte Carlo technique is first used to develop probability distributions for the radiological consequences of a range of accident scenarios throughout the mission; the factors affecting those consequences are then identified, in conjunction with their probability distributions. The functional relationship among all the factors is then established, and the probability distributions for all factor effects are combined by means of a Monte Carlo technique.

  1. Score distributions of gapped multiple sequence alignments down to the low-probability tail

    NASA Astrophysics Data System (ADS)

    Fieth, Pascal; Hartmann, Alexander K.

    2016-08-01

    Assessing the significance of alignment scores of optimally aligned DNA or amino acid sequences can be achieved via knowledge of the score distribution of random sequences. But this requires obtaining the distribution in the biologically relevant high-scoring region, where the probabilities are exponentially small. For gapless local alignments of infinitely long sequences this distribution is known analytically to follow a Gumbel distribution. Distributions for gapped local alignments and global alignments of finite lengths can only be obtained numerically. To obtain results for the small-probability region, specific statistical mechanics-based rare-event algorithms can be applied. In previous studies, this was achieved for pairwise alignments. They showed that, contrary to results from previous simple sampling studies, strong deviations from the Gumbel distribution occur in the case of finite sequence lengths. Here we extend the studies to multiple sequence alignments with gaps, which are much more relevant for practical applications in molecular biology. We study the distributions of scores over a large range of the support, reaching probabilities as small as 10^-160, for global and local (sum-of-pair scores) multiple alignments. We find that even after suitable rescaling, eliminating the sequence-length dependence, the distributions for multiple alignments differ from the pairwise alignment case. Furthermore, we also show that the previously discussed Gaussian correction to the Gumbel distribution needs to be refined, also for the case of pairwise alignments.
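
    The contrast between simple sampling and the Gumbel form can be sketched quickly. In the toy example below, block maxima of Gaussian variates stand in for alignment scores (an assumption purely for illustration); simple sampling resolves the tail only down to about 1/n_samples, while the fitted Gumbel extrapolates beyond that floor, which is the gap the rare-event algorithms above are designed to close.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    # Block maxima of Gaussian samples as a stand-in for alignment scores.
    scores = rng.normal(size=(100_000, 50)).max(axis=1)

    loc, scale = stats.gumbel_r.fit(scores)  # fit the extreme-value (Gumbel) form
    for s in (4.0, 5.0, 6.0):
        empirical = np.mean(scores >= s)           # hits a floor of ~1e-5, then 0
        fitted = stats.gumbel_r.sf(s, loc, scale)  # still extrapolates into the tail
        print(f"s={s}: empirical {empirical:.2e}, Gumbel fit {fitted:.2e}")
    ```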

  2. Soil Nutrient Content Influences the Abundance of Soil Microbes but Not Plant Biomass at the Small-Scale

    PubMed Central

    Koorem, Kadri; Gazol, Antonio; Öpik, Maarja; Moora, Mari; Saks, Ülle; Uibopuu, Annika; Sõber, Virve; Zobel, Martin

    2014-01-01

    Small-scale heterogeneity of abiotic and biotic factors is expected to play a crucial role in species coexistence. It is known that plants are able to concentrate their root biomass into areas with high nutrient content and also acquire nutrients via symbiotic microorganisms such as arbuscular mycorrhizal (AM) fungi. At the same time, little is known about the small-scale distribution of soil nutrients, microbes and plant biomass occurring in the same area. We examined small-scale temporal and spatial variation as well as covariation of soil nutrients, microbial biomass (using soil fatty acid biomarker content) and above- and belowground biomass of herbaceous plants in a natural herb-rich boreonemoral spruce forest. The abundance of AM fungi and bacteria decreased during the plant growing season while soil nutrient content rather increased. The abundance of all microbes studied also varied in space and was affected by soil nutrient content. In particular, the abundance of AM fungi was negatively related to soil phosphorus and positively influenced by soil nitrogen content. Neither shoot nor root biomass of herbaceous plants showed any significant relationship with variation in soil nutrient content or the abundance of soil microbes. Our study suggests that plants can compensate for low soil phosphorus concentration via interactions with soil microbes, most probably due to a more efficient symbiosis with AM fungi. This compensation results in relatively constant plant biomass despite variation in soil phosphorus content and in the abundance of AM fungi. Hence, it is crucial to consider both soil nutrient content and the abundance of soil microbes when exploring the mechanisms driving vegetation patterns. PMID:24637633

  3. Site occupancy models with heterogeneous detection probabilities

    USGS Publications Warehouse

    Royle, J. Andrew

    2006-01-01

    Models for estimating the probability of occurrence of a species in the presence of imperfect detection are important in many ecological disciplines. In these 'site occupancy' models, the possibility of heterogeneity in detection probabilities among sites must be considered because variation in abundance (and other factors) among sampled sites induces variation in detection probability (p). In this article, I develop occurrence probability models that allow for heterogeneous detection probabilities by considering several common classes of mixture distributions for p. For any mixing distribution, the likelihood has the general form of a zero-inflated binomial mixture for which inference based upon integrated likelihood is straightforward. A recent paper by Link (2003, Biometrics 59, 1123-1130) demonstrates that in closed population models used for estimating population size, different classes of mixture distributions are indistinguishable from data, yet can produce very different inferences about population size. I demonstrate that this problem can also arise in models for estimating site occupancy in the presence of heterogeneous detection probabilities. The implications of this are discussed in the context of an application to avian survey data and the development of animal monitoring programs.
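
    A minimal sketch of the integrated likelihood described above, assuming a Beta mixing distribution for detection probability p (one of several classes the paper considers); the data are simulated and all parameter values are illustrative.

    ```python
    import numpy as np
    from scipy import stats, optimize

    J = 5  # visits per site (assumed)

    def neg_log_lik(params, y):
        psi, a, b = params                       # occupancy psi, Beta(a, b) for p
        # Integrated (beta-binomial) detection likelihood for each count y_i:
        bb = stats.betabinom.pmf(y, J, a, b)
        lik = psi * bb + (1.0 - psi) * (y == 0)  # zero inflation from true absence
        return -np.sum(np.log(lik))

    rng = np.random.default_rng(2)
    occupied = rng.random(200) < 0.6             # true occupancy state per site
    p = rng.beta(2.0, 3.0, size=200)             # heterogeneous detection probs
    y = np.where(occupied, rng.binomial(J, p), 0)

    res = optimize.minimize(neg_log_lik, x0=[0.5, 1.0, 1.0], args=(y,),
                            bounds=[(0.01, 0.99), (0.05, 50), (0.05, 50)])
    print("psi_hat, a_hat, b_hat:", res.x)
    ```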

  4. Investigation of Archaeal and Bacterial community structure of five different small drinking water networks with special regard to the nitrifying microorganisms.

    PubMed

    Nagymáté, Zsuzsanna; Homonnay, Zalán G; Márialigeti, Károly

    2016-01-01

    The total microbial community structure, and particularly the nitrifying communities, inhabiting five different small drinking water networks characterized by different water physical and chemical parameters were investigated using cultivation-based methods and sequence-aided Terminal Restriction Fragment Length Polymorphism (T-RFLP) analysis. Ammonium ion, originating from well water, was only partially oxidized via nitrite to nitrate in the drinking water distribution systems. Nitrification occurred at low ammonium ion concentrations (27-46 μM), relatively high pH (7.6-8.2) and over a wide range of dissolved oxygen concentrations (0.4-9.0 mg L⁻¹). The nitrifying communities of the distribution systems were characterized by variable most probable numbers (2×10²-7.1×10⁴ MPN L⁻¹) and probably originated from the non-treated well water. The sequence-aided T-RFLP method revealed that ammonia-oxidizing microorganisms and nitrite-oxidizing Bacteria (Nitrosomonas oligotropha, Nitrosopumilus maritimus, Nitrospira moscoviensis, and 'Candidatus Nitrospira defluvii') were present in different ratios in the total microbial communities of the distinct parts of the water network systems. The nitrate generated by nitrification was partly utilized by nitrate-reducing (and denitrifying) Bacteria, present in low MPN and characterized by sequence-aided T-RFLP as Comamonas sp. and Pseudomonas spp. Different environmental factors, such as pH, chemical oxygen demand, calculated total inorganic nitrogen content (as well as nitrite and nitrate concentrations), and temperature had an important effect on the total bacterial and archaeal community distribution. Copyright © 2016 Elsevier GmbH. All rights reserved.

  5. New insights in dehydration stress behavior of two maize hybrids using advanced distributed reactivity model (DRM). Responses to the impact of 24-epibrassinolide

    PubMed Central

    Janković, Bojan; Janković, Marija; Nikolić, Bogdan; Dimkić, Ivica; Lalević, Blažo; Raičević, Vera

    2017-01-01

    A distributed reactivity model of dehydration for seedling parts of two different maize hybrids (ZP434, ZP704) was established. Dehydration stresses were induced thermally, which is also accompanied by the response of the hybrids to heat stress. It was found that an increased value of activation energy within radicle dehydration of ZP434, with a high concentration of 24-epibrassinolide (24-EBL) at elevated operating temperatures, probably causes activation of diffusion mechanisms in the cutin network and may increase the likelihood of formation of free volumes large enough to accommodate a diffusing molecule. Many small random effects were detected; these can be correlated with micro-disturbances in the water-filled space caused by thermal gradients and increasing capillary phenomena, which can induce thermo-capillary migration. The influence of the seedlings' content of various sugars and minerals on dehydration was also examined. The estimated distributed reactivity models indicate a dependence of reactivity on structural arrangements, due to interactions between water molecules and chemical species within the plant. PMID:28644899

  6. New insights in dehydration stress behavior of two maize hybrids using advanced distributed reactivity model (DRM). Responses to the impact of 24-epibrassinolide.

    PubMed

    Waisi, Hadi; Janković, Bojan; Janković, Marija; Nikolić, Bogdan; Dimkić, Ivica; Lalević, Blažo; Raičević, Vera

    2017-01-01

    A distributed reactivity model of dehydration for seedling parts of two different maize hybrids (ZP434, ZP704) was established. Dehydration stresses were induced thermally, which is also accompanied by the response of the hybrids to heat stress. It was found that an increased value of activation energy within radicle dehydration of ZP434, with a high concentration of 24-epibrassinolide (24-EBL) at elevated operating temperatures, probably causes activation of diffusion mechanisms in the cutin network and may increase the likelihood of formation of free volumes large enough to accommodate a diffusing molecule. Many small random effects were detected; these can be correlated with micro-disturbances in the water-filled space caused by thermal gradients and increasing capillary phenomena, which can induce thermo-capillary migration. The influence of the seedlings' content of various sugars and minerals on dehydration was also examined. The estimated distributed reactivity models indicate a dependence of reactivity on structural arrangements, due to interactions between water molecules and chemical species within the plant.

  7. Internal iron biomineralization in Imperata cylindrica, a perennial grass: chemical composition, speciation and plant localization.

    PubMed

    Rodríguez, N; Menéndez, N; Tornero, J; Amils, R; de la Fuente, V

    2005-03-01

    * The analysis of metal distribution in Imperata cylindrica, a perennial grass isolated from the banks of the Tinto River (Iberian Pyritic Belt), an extremely acidic environment with a high metal content, has shown a remarkable accumulation of iron. This property has been used to study iron speciation and its distribution among different tissues and structures of the plant. * Mössbauer spectroscopy (MS) and X-ray diffraction (XRD) were used to determine the iron species, scanning electron microscopy (SEM) to locate iron biominerals among plant tissue structures, and energy-dispersive X-ray microanalysis (EDAX), X-ray fluorescence (TXRF) and inductively coupled plasma emission spectroscopy (ICP-MS) to confirm their elemental composition. * The MS spectral analysis indicated that iron accumulated in this plant mainly as jarosite and ferritin. The presence of jarosite was confirmed by XRD, and the distribution of both minerals in structures of different tissues was ascertained by SEM-EDAX analysis. * The convergent results obtained by complementary techniques suggest a complex iron management system in I. cylindrica, probably as a consequence of the environmental conditions of its habitat.

  8. Universality of Citation Distributions for Academic Institutions and Journals

    PubMed Central

    Chatterjee, Arnab; Ghosh, Asim; Chakrabarti, Bikas K.

    2016-01-01

    Citations measure the importance of a publication and may serve as a proxy for its popularity and the quality of its contents. Here we study the distributions of citations to publications from individual academic institutions for a single year. The average number of citations has large variations between different institutions across the world, but the probability distributions of citations for individual institutions can be rescaled to a common form by scaling the citations by the average number of citations for that institution. This feature seems to be universal for a broad selection of institutions, irrespective of the average number of citations per article. A similar analysis for citations to publications in a particular journal in a single year reveals similar results. We find high absolute inequality for both these sets, the Gini coefficients being around 0.66 and 0.58 for institutions and journals, respectively. We also find that the top 25% of the articles hold about 75% of the total citations for institutions, and the top 29% of the articles hold about 71% of the total citations for journals. PMID:26751563

  9. Universality of Citation Distributions for Academic Institutions and Journals.

    PubMed

    Chatterjee, Arnab; Ghosh, Asim; Chakrabarti, Bikas K

    2016-01-01

    Citations measure the importance of a publication and may serve as a proxy for its popularity and the quality of its contents. Here we study the distributions of citations to publications from individual academic institutions for a single year. The average number of citations has large variations between different institutions across the world, but the probability distributions of citations for individual institutions can be rescaled to a common form by scaling the citations by the average number of citations for that institution. This feature seems to be universal for a broad selection of institutions, irrespective of the average number of citations per article. A similar analysis for citations to publications in a particular journal in a single year reveals similar results. We find high absolute inequality for both these sets, the Gini coefficients being around 0.66 and 0.58 for institutions and journals, respectively. We also find that the top 25% of the articles hold about 75% of the total citations for institutions, and the top 29% of the articles hold about 71% of the total citations for journals.

  10. Implementation of the Iterative Proportion Fitting Algorithm for Geostatistical Facies Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li Yupeng, E-mail: yupeng@ualberta.ca; Deutsch, Clayton V.

    2012-06-15

    In geostatistics, most stochastic algorithms for the simulation of categorical variables such as facies or rock types require a conditional probability distribution. The multivariate probability distribution of all the grouped locations, including the unsampled location, permits calculation of the conditional probability directly from its definition. In this article, the iterative proportional fitting (IPF) algorithm is implemented to infer this multivariate probability. Using the IPF algorithm, the multivariate probability is obtained by iterative modification of an initial estimated multivariate probability, using lower-order bivariate probabilities as constraints. The imposed bivariate marginal probabilities are inferred from profiles along drill holes or wells. In the IPF process, a sparse matrix is used to calculate the marginal probabilities from the multivariate probability, which makes the iterative fitting more tractable and practical. This algorithm can be extended to higher-order marginal probability constraints, as used in multiple-point statistics. The theoretical framework is developed and illustrated with estimation and simulation examples.
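
    The core of the IPF iteration is easy to state: alternately rescale an initial joint table so that each set of marginals matches its imposed constraint. The sketch below shows the classic two-marginal case on a toy facies table; the record's implementation works with higher-dimensional multivariate probabilities and sparse-matrix bookkeeping, which is omitted here.

    ```python
    import numpy as np

    def ipf(seed, row_targets, col_targets, tol=1e-10, max_iter=1000):
        """Iterative proportional fitting: rescale a joint probability table
        until its row/column marginals match the imposed constraints."""
        p = seed.astype(float).copy()
        for _ in range(max_iter):
            p *= (row_targets / p.sum(axis=1))[:, None]   # match row marginals
            p *= (col_targets / p.sum(axis=0))[None, :]   # match column marginals
            if np.allclose(p.sum(axis=1), row_targets, atol=tol):
                break
        return p

    # Toy example with three facies categories at two locations (values assumed).
    seed = np.full((3, 3), 1.0 / 9.0)       # initial multivariate estimate
    rows = np.array([0.5, 0.3, 0.2])        # marginal proportions, location 1
    cols = np.array([0.4, 0.4, 0.2])        # marginal proportions, location 2
    print(ipf(seed, rows, cols))
    ```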

  11. Modeling highway travel time distribution with conditional probability models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oliveira Neto, Francisco Moraes; Chin, Shih-Miao; Hwang, Ho-Ling

    Under the sponsorship of the Federal Highway Administration's Office of Freight Management and Operations, the American Transportation Research Institute (ATRI) has developed performance measures through the Freight Performance Measures (FPM) initiative. Under this program, travel speed information is derived from data collected using wireless-based global positioning systems. These telemetric data systems are subscribed to and used by the trucking industry as an operations management tool. More than one telemetric operator submits data dumps to ATRI on a regular basis. Each data transmission contains the truck location, its travel time, and a clock time/date stamp. Data from the FPM program provides a unique opportunity for studying the upstream-downstream speed distributions at different locations, as well as at different times of the day and days of the week. This research is focused on the stochastic nature of successive link travel speed data on the continental United States Interstate network. Specifically, a method to estimate route probability distributions of travel time is proposed. This method uses the concepts of convolution of probability distributions and bivariate, link-to-link, conditional probability to estimate the expected distributions for the route travel time. A major contribution of this study is the consideration of speed correlation between upstream and downstream contiguous Interstate segments through conditional probability. The established conditional probability distributions, between successive segments, can be used to provide travel time reliability measures. This study also suggests an adaptive method for calculating and updating route travel time distributions as new data or information are added. This methodology can be useful for estimating performance measures as required by the recent Moving Ahead for Progress in the 21st Century Act (MAP-21).
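
    A minimal sketch of the convolution step described above, with two discretised link travel-time pmfs (1-minute bins, assumed values). The loop form shows where a link-to-link conditional pmf would enter in place of the marginal, to capture the upstream-downstream correlation that is the study's main contribution.

    ```python
    import numpy as np

    link1 = np.array([0.0, 0.2, 0.5, 0.2, 0.1])   # P(link-1 time = 0..4 min)
    link2 = np.array([0.0, 0.1, 0.6, 0.2, 0.1])   # P(link-2 time = 0..4 min)

    # Under independence, the route pmf is the convolution of the link pmfs.
    route = np.convolve(link1, link2)             # P(route time = 0..8 min)

    # The correlated version sums P(t1) * P(t2 | t1) instead; cond[t1] would be
    # an estimated conditional pmf rather than the marginal used here.
    cond = np.tile(link2, (link1.size, 1))        # placeholder: independence again
    route_corr = np.zeros(link1.size + link2.size - 1)
    for t1, p1 in enumerate(link1):
        route_corr[t1:t1 + link2.size] += p1 * cond[t1]

    print(route.round(3))
    print(route_corr.round(3))   # identical here, by construction
    ```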

  12. Probability of success for phase III after exploratory biomarker analysis in phase II.

    PubMed

    Götte, Heiko; Kirchner, Marietta; Sailer, Martin Oliver

    2017-05-01

    The probability of success, or average power, describes the potential of a future trial by weighting the power with a probability distribution of the treatment effect. The treatment effect estimate from a previous trial can be used to define such a distribution. During the development of targeted therapies, it is common practice to look for predictive biomarkers. The consequence is that the trial population for phase III is often selected on the basis of the most extreme result from phase II biomarker subgroup analyses. In such a case, there is a tendency to overestimate the treatment effect. We investigate whether the overestimation of the treatment effect estimate from phase II is transformed into a positive bias for the probability of success for phase III. We simulate a phase II/III development program for targeted therapies. This simulation allows us to investigate selection probabilities and to compare the estimated with the true probability of success. We consider the estimated probability of success with and without subgroup selection. Depending on the true treatment effects, there is a negative bias without selection because of the weighting by the phase II distribution. In comparison, selection increases the estimated probability of success. Thus, selection does not lead to a bias in the probability of success if underestimation due to the phase II distribution and overestimation due to selection cancel each other out. We recommend performing similar simulations in practice to obtain the necessary information about the risks and chances associated with such subgroup selection designs. Copyright © 2017 John Wiley & Sons, Ltd.
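
    A minimal sketch of the probability-of-success computation described above: the phase III power curve is averaged over a normal distribution centered on the phase II treatment effect estimate. All numbers (effect estimate, standard error, sample size, alpha) are illustrative assumptions.

    ```python
    import numpy as np
    from scipy import stats

    delta_hat, se_phase2 = 0.3, 0.15       # phase II effect estimate and its SE
    n_per_arm, alpha = 200, 0.025          # planned phase III, one-sided alpha

    def power(delta):
        """One-sided two-sample z-test power at true effect delta (sd = 1)."""
        z_alpha = stats.norm.ppf(1 - alpha)
        return stats.norm.sf(z_alpha - delta * np.sqrt(n_per_arm / 2.0))

    # Average the power over the phase II distribution of the effect.
    grid = np.linspace(delta_hat - 5 * se_phase2, delta_hat + 5 * se_phase2, 2001)
    weights = stats.norm.pdf(grid, delta_hat, se_phase2)
    pos = np.sum(power(grid) * weights) * (grid[1] - grid[0])

    print(f"power at point estimate: {power(delta_hat):.3f}, PoS: {pos:.3f}")
    ```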

  13. General formulation of long-range degree correlations in complex networks

    NASA Astrophysics Data System (ADS)

    Fujiki, Yuka; Takaguchi, Taro; Yakubo, Kousuke

    2018-06-01

    We provide a general framework for analyzing degree correlations between nodes separated by more than one step (i.e., beyond nearest neighbors) in complex networks. One joint and four conditional probability distributions are introduced to fully describe long-range degree correlations with respect to degrees k and k' of two nodes and shortest path length l between them. We present general relations among these probability distributions and clarify the relevance to nearest-neighbor degree correlations. Unlike nearest-neighbor correlations, some of these probability distributions are meaningful only in finite-size networks. Furthermore, as a baseline to determine the existence of intrinsic long-range degree correlations in a network other than inevitable correlations caused by the finite-size effect, the functional forms of these probability distributions for random networks are analytically evaluated within a mean-field approximation. The utility of our argument is demonstrated by applying it to real-world networks.
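
    The joint distribution at the center of this framework, P(k, k', l), can be estimated empirically for a finite network. A rough sketch using networkx, with a small Erdős-Rényi graph as an arbitrary test case:

    ```python
    import collections
    import networkx as nx

    # Empirical joint distribution of the degrees of two nodes and their
    # shortest-path length, over all node pairs of a small random graph.
    g = nx.erdos_renyi_graph(n=300, p=0.02, seed=8)
    deg = dict(g.degree())

    counts = collections.Counter()
    for u, lengths in nx.all_pairs_shortest_path_length(g):
        for v, l in lengths.items():
            if u < v:                      # count each unordered pair once
                counts[(deg[u], deg[v], l)] += 1

    total = sum(counts.values())
    # e.g. probability that two degree-5 nodes sit three steps apart:
    print(counts[(5, 5, 3)] / total)
    ```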

  14. Stochastic analysis of particle movement over a dune bed

    USGS Publications Warehouse

    Lee, Baum K.; Jobson, Harvey E.

    1977-01-01

    Stochastic models are available that can be used to predict the transport and dispersion of bed-material sediment particles in an alluvial channel. These models are based on the proposition that the movement of a single bed-material sediment particle consists of a series of steps of random length separated by rest periods of random duration and, therefore, application of the models requires a knowledge of the probability distributions of the step lengths, the rest periods, the elevation of particle deposition, and the elevation of particle erosion. The procedure was tested by determining distributions from bed profiles formed in a large laboratory flume with a coarse sand as the bed material. The elevation of particle deposition and the elevation of particle erosion can be considered to be identically distributed, and their distribution can be described by either a 'truncated Gaussian' or a 'triangular' density function. The conditional probability distribution of the rest period given the elevation of particle deposition closely followed the two-parameter gamma distribution. The conditional probability distribution of the step length given the elevation of particle erosion and the elevation of particle deposition also closely followed the two-parameter gamma density function. For a given flow, the scale and shape parameters describing the gamma probability distributions can be expressed as functions of bed elevation. (Woodard-USGS)
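
    A minimal simulation sketch of this stochastic transport model: a particle alternates gamma-distributed rest periods and gamma-distributed step lengths, here ignoring the bed-elevation dependence of the parameters. The shape and scale values are illustrative assumptions, not the flume-fitted ones.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def travel_distance(t_total, rest_shape=1.5, rest_scale=20.0,
                        step_shape=2.0, step_scale=0.3):
        """Distance (m) moved by one particle in t_total (s); steps are taken
        instantaneously at the end of each gamma-distributed rest period."""
        t, x = 0.0, 0.0
        while True:
            t += rng.gamma(rest_shape, rest_scale)   # rest period duration
            if t > t_total:
                return x
            x += rng.gamma(step_shape, step_scale)   # step length

    distances = [travel_distance(3600.0) for _ in range(5000)]
    print("mean transport distance after 1 h:", np.mean(distances))
    ```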

  15. Effect of harvest time and physical form of alfalfa silage on chewing time and particle size distribution in boli, rumen content and faeces.

    PubMed

    Kornfelt, L F; Weisbjerg, M R; Nørgaard, P

    2013-02-01

    The study examined the effects of physical form and harvest time of alfalfa silage on eating and ruminating activity and particle size distribution in feed boli, rumen content and faeces in dry cows. The alfalfa crop was harvested at two stages of growth (early: NDF 37%, late: NDF 44% in dry matter (DM)), and from each harvest a chopped (theoretical cutting length: 19 mm) and an unchopped crop was ensiled in bales. The silages were fed to four rumen-cannulated non-lactating Jersey cows (391 ± 26 kg) in a 4 × 4 Latin square design; the cows were fed restrictively, at 80% of their ad libitum intake, twice daily. Chewing activity was recorded continuously for 96 h. Swallowed boli, rumen content, rumen fluid and faeces samples were collected, washed in nylon bags (0.01 mm pore size) and freeze-dried before dry sieving through 4.750, 2.360, 1.000, 0.500 and 0.212 mm pore sizes into six fractions. The length (PL) and width (PW) of particles within each fraction were measured by image analysis. The eating activity (min/kg dry matter intake (P < 0.01) and min/kg NDF (P < 0.05)) was affected by harvest time. The mean ruminating time (min/kg DM) was affected by harvest time (P < 0.01), physical form (P < 0.05) and NDF intake per kg BW (P < 0.01). The proportion of washed particle DM of total DM in boli, rumen content, rumen fluid and faeces was affected by harvest time (P < 0.01) and was highest when feeding late-harvested alfalfa silage. Two peaks on the probability density distribution function (PDF) of the PW and PL values of boli, rumen content and faeces were identified. Chopping of the silage decreased the mean PL and PW, the most frequent PL (mode) and the 95th percentile PL and PW values in boli. In the rumen content, chopping increased the mean PW (P < 0.05). The dimension sizes of faeces particles were not significantly affected by chopping. The mode PW value was lower in rumen content and faeces than in boli (P < 0.001), and the mode PL value was higher in boli and lower in faeces compared with rumen contents (P < 0.001). In conclusion, the mean total chewing activity per kg NDF decreased due to chopping and early harvest time. The mean PL and PW in boli decreased due to chopping and late harvest. The two peak values on the PDF(PL) and PDF(PW) of boli, rumen content and faeces particles are most likely related to the leaf and stem residues.

  16. Using type IV Pearson distribution to calculate the probabilities of underrun and overrun of lists of multiple cases.

    PubMed

    Wang, Jihan; Yang, Kai

    2014-07-01

    An efficient operating room needs little underutilised and little overutilised time to achieve optimal cost efficiency. The probabilities of underrun and overrun of lists of cases can be estimated from a well-defined duration distribution of the lists. To propose a method of predicting the probabilities of underrun and overrun of lists of cases using the Type IV Pearson distribution to support case scheduling. Six years of data were collected. The first 5 years of data were used to fit distributions and estimate parameters. The data from the last year were used as testing data to validate the proposed methods. The percentiles of the duration distribution of lists of cases were calculated by the Type IV Pearson distribution and the t-distribution. Monte Carlo simulation was conducted to verify the accuracy of percentiles defined by the proposed methods. Operating rooms in the John D. Dingell VA Medical Center, United States, from January 2005 to December 2011. Differences between the proportion of lists of cases that were completed within the percentiles of the proposed duration distribution of the lists and the corresponding percentiles. Compared with the t-distribution, the proposed new distribution is 8.31% (0.38) more accurate on average and 14.16% (0.19) more accurate in calculating the probabilities at the 10th and 90th percentiles of the distribution, which are a major concern of operating room schedulers. The absolute deviations between the percentiles defined by the Type IV Pearson distribution and those from Monte Carlo simulation varied from 0.20 min (0.01) to 0.43 min (0.03). Operating room schedulers can rely on the most recent 10 cases with the same combination of surgeon and procedure(s) for distribution parameter estimation to plan lists of cases. Values are mean (SEM). The proposed Type IV Pearson distribution is more accurate than the t-distribution for estimating the probabilities of underrun and overrun of lists of cases. However, as not all the individual case durations followed log-normal distributions, there was some deviation from the true duration distribution of the lists.
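
    The quantities being estimated can be illustrated with a Monte Carlo stand-in: given per-case duration distributions (lognormal below, an assumption), the percentiles of the list duration give the probabilities of underrun and overrun against the scheduled time. The paper's contribution is replacing such sampling with a closed-form Type IV Pearson fit, which is not reproduced here.

    ```python
    import numpy as np

    rng = np.random.default_rng(9)

    # Three cases on one list: (log-mean, log-sd) of lognormal durations (minutes).
    cases = [(np.log(60.0), 0.35), (np.log(90.0), 0.30), (np.log(45.0), 0.40)]

    sims = sum(rng.lognormal(m, s, size=200_000) for m, s in cases)
    scheduled = 210.0  # minutes allocated to the list (assumed)

    print("P(underrun) =", np.mean(sims < scheduled))
    print("P(overrun)  =", np.mean(sims > scheduled))
    print("10th/90th percentiles:", np.percentile(sims, [10, 90]))
    ```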

  17. Maceral distributions in Illinois coals and their paleoenvironmental implications

    USGS Publications Warehouse

    Harvey, R.D.; Dillon, J.W.

    1985-01-01

    For purposes of assessing the maceral distribution of Illinois (U.S.A.) coals, analyses were assembled for 326 face channel and drill core samples from 24 coal members of the Pennsylvanian System. The inertinite content of coals from the Missourian and Virgilian Series averages 16.1% (mineral free), compared to 9.4% for older coals from the Desmoinesian and older Series. This indicates there was generally a higher state of oxidation in the peat that formed the younger coals. This state probably resulted from greater exposure of these peats to weathering as the climate became drier and the water table lower than was the case for the older coals, although oxidation during allochthonous deposition of inertinite components is a genetic factor that needs further study to confirm the importance of the climate. Regional variation of the vitrinite-inertinite ratio (V-I), on a mineral- and micrinite-free basis, was observed in the Springfield (No. 5) and Herrin (No. 6) Coal Members to be related to the geographical position of paleochannel (river) deposits known to have been contemporaneous with the peats that formed these two coal strata. The V-I ratio is highest (generally 12-27) in samples from areas adjacent to the channels, and lower (5-11) some 10-20 km away. We interpret the V-I ratio to be an inverse index of the degree of oxidation to which the original peat was exposed. High V-I ratio coal located near the channels probably formed under more anoxic conditions than did the lower V-I ratio coal some distance away from the channels. The low V-I ratio coal probably formed in areas of the peat swamp where the water table was generally lower than in the channel areas. © 1986.

  18. A tool for simulating collision probabilities of animals with marine renewable energy devices.

    PubMed

    Schmitt, Pál; Culloch, Ross; Lieber, Lilian; Molander, Sverker; Hammar, Linus; Kregting, Louise

    2017-01-01

    The mathematical problem of establishing a collision probability distribution is often not trivial. The shape and motion of the animal, as well as of the device, must be evaluated in a four-dimensional space (3D motion over time). Earlier work on wind and tidal turbines was limited to a simplified two-dimensional representation, which cannot be applied to many new structures. We present a numerical algorithm to obtain such probability distributions using transient, three-dimensional numerical simulations. The method is demonstrated using a sub-surface tidal kite as an example. The necessary pre- and post-processing of the data created by the model is explained, and numerical details and potential issues and limitations in the application of the resulting probability distributions are highlighted.

  19. Lognormal Approximations of Fault Tree Uncertainty Distributions.

    PubMed

    El-Shanawany, Ashraf Ben; Ardron, Keith H; Walker, Simon P

    2018-01-26

    Fault trees are used in reliability modeling to create logical models of fault combinations that can lead to undesirable events. The output of a fault tree analysis (the top event probability) is expressed in terms of the failure probabilities of the basic events that are input to the model. Typically, the basic event probabilities are not known exactly but are modeled as probability distributions; therefore, the top event probability is also represented as an uncertainty distribution. Monte Carlo methods are generally used for evaluating the uncertainty distribution, but such calculations are computationally intensive and do not readily reveal the dominant contributors to the uncertainty. In this article, a closed-form approximation for the fault tree top event uncertainty distribution is developed, which is applicable when the uncertainties in the basic events of the model are lognormally distributed. The results of the approximate method are compared with results from two sampling-based methods: namely, the Monte Carlo method and the Wilks method based on order statistics. It is shown that the closed-form expression can provide a reasonable approximation to results obtained by Monte Carlo sampling, without incurring the computational expense. The Wilks method is found to be a useful means of providing an upper bound for the percentiles of the uncertainty distribution while being computationally inexpensive compared with full Monte Carlo sampling. The lognormal approximation method and the Wilks method appear to be attractive, practical alternatives for the evaluation of uncertainty in the output of fault trees and similar multilinear models. © 2018 Society for Risk Analysis.
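
    The flavor of the closed-form result can be shown in the simplest special case: for a pure AND gate the top-event probability is the product of the basic-event probabilities, so lognormal inputs give an exactly lognormal output whose log-space mean and variance are sums. The sketch below checks this against Monte Carlo; the paper's approximation covers general multilinear trees, which this toy case does not.

    ```python
    import numpy as np
    from scipy import stats

    mus = np.array([-6.0, -5.0, -7.0])       # log-space means (assumed values)
    sigmas = np.array([0.8, 0.5, 1.0])       # log-space standard deviations

    # AND gate: top = product of basic events, so log-parameters simply add.
    mu_top = mus.sum()
    sigma_top = np.sqrt((sigmas ** 2).sum())

    rng = np.random.default_rng(4)
    samples = np.exp(rng.normal(mus, sigmas, size=(200_000, 3))).prod(axis=1)

    for q in (0.05, 0.5, 0.95):
        closed = stats.lognorm.ppf(q, s=sigma_top, scale=np.exp(mu_top))
        print(f"q={q}: closed-form {closed:.3e}, "
              f"Monte Carlo {np.quantile(samples, q):.3e}")
    ```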

  20. Quantum key distribution without the wavefunction

    NASA Astrophysics Data System (ADS)

    Niestegge, Gerd

    A well-known feature of quantum mechanics is the secure exchange of secret bit strings which can then be used as keys to encrypt messages transmitted over any classical communication channel. It is demonstrated that this quantum key distribution admits a much more general and abstract treatment than commonly thought. The results include some generalizations of the Hilbert space version of quantum key distribution, but are based upon a general nonclassical extension of conditional probability. A special state-independent conditional probability is identified as the origin of the superior security of quantum key distribution; this is a purely algebraic property of the quantum logic and represents the transition probability between the outcomes of two consecutive quantum measurements.

  1. The complexity of divisibility.

    PubMed

    Bausch, Johannes; Cubitt, Toby

    2016-09-01

    We address two sets of long-standing open questions in linear algebra and probability theory, from a computational complexity perspective: stochastic matrix divisibility, and divisibility and decomposability of probability distributions. We prove that finite divisibility of stochastic matrices is an NP-complete problem, and extend this result to nonnegative matrices, and completely-positive trace-preserving maps, i.e. the quantum analogue of stochastic matrices. We further prove a complexity hierarchy for the divisibility and decomposability of probability distributions, showing that finite distribution divisibility is in P, but decomposability is NP-hard. For the former, we give an explicit polynomial-time algorithm. All results on distributions extend to weak-membership formulations, proving that the complexity of these problems is robust to perturbations.

  2. Probability distributions of hydraulic conductivity for the hydrogeologic units of the Death Valley regional ground-water flow system, Nevada and California

    USGS Publications Warehouse

    Belcher, Wayne R.; Sweetkind, Donald S.; Elliott, Peggy E.

    2002-01-01

    The use of geologic information such as lithology and rock properties is important to constrain conceptual and numerical hydrogeologic models. This geologic information is difficult to apply explicitly to numerical modeling and analyses because it tends to be qualitative rather than quantitative. This study uses a compilation of hydraulic-conductivity measurements to derive estimates of the probability distributions for several hydrogeologic units within the Death Valley regional ground-water flow system, a geologically and hydrologically complex region underlain by basin-fill sediments and volcanic, intrusive, sedimentary, and metamorphic rocks. Probability distributions of hydraulic conductivity for general rock types have been studied previously; this study provides a more detailed definition of hydrogeologic units based on lithostratigraphy, lithology, alteration, and fracturing, and compares the probability distributions to the aquifer test data. Results suggest that these probability distributions can be used for studies involving, for example, numerical flow modeling, recharge, evapotranspiration, and rainfall runoff, both for the hydrogeologic units in the region and for similar rock types elsewhere. Within the study area, fracturing appears to have the greatest influence on the hydraulic conductivity of carbonate bedrock hydrogeologic units. Similar to earlier studies, we find that alteration and welding in the Tertiary volcanic rocks greatly influence hydraulic conductivity. As alteration increases, hydraulic conductivity tends to decrease. An increasing degree of welding appears to increase hydraulic conductivity, because welding increases the brittleness of the volcanic rocks, thus increasing the amount of fracturing.

  3. Bayesian alternative to the ISO-GUM's use of the Welch-Satterthwaite formula

    NASA Astrophysics Data System (ADS)

    Kacker, Raghu N.

    2006-02-01

    In certain disciplines, uncertainty is traditionally expressed as an interval about an estimate for the value of the measurand. Development of such uncertainty intervals with a stated coverage probability based on the International Organization for Standardization (ISO) Guide to the Expression of Uncertainty in Measurement (GUM) requires a description of the probability distribution for the value of the measurand. The ISO-GUM propagates the estimates and their associated standard uncertainties for various input quantities through a linear approximation of the measurement equation to determine an estimate and its associated standard uncertainty for the value of the measurand. This procedure does not yield a probability distribution for the value of the measurand. The ISO-GUM suggests that under certain conditions motivated by the central limit theorem the distribution for the value of the measurand may be approximated by a scaled-and-shifted t-distribution with effective degrees of freedom obtained from the Welch-Satterthwaite (W-S) formula. The approximate t-distribution may then be used to develop an uncertainty interval with a stated coverage probability for the value of the measurand. We propose an approximate normal distribution based on a Bayesian uncertainty as an alternative to the t-distribution based on the W-S formula. A benefit of the approximate normal distribution based on a Bayesian uncertainty is that it greatly simplifies the expression of uncertainty by eliminating altogether the need for calculating effective degrees of freedom from the W-S formula. In the special case where the measurand is the difference between two means, each evaluated from statistical analyses of independent normally distributed measurements with unknown and possibly unequal variances, the probability distribution for the value of the measurand is known to be a Behrens-Fisher distribution. We compare the performance of the approximate normal distribution based on a Bayesian uncertainty and the approximate t-distribution based on the W-S formula with respect to the Behrens-Fisher distribution. The approximate normal distribution is simpler and better in this case. A thorough investigation of the relative performance of the two approximate distributions would require comparison for a range of measurement equations by numerical methods.
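
    For reference, a small sketch of the two recipes being compared: the W-S effective degrees of freedom feeding a t-based coverage factor, versus a plain normal coverage factor in the spirit of the proposed alternative (the Bayesian rescaling of the component uncertainties is omitted). The uncertainty budget values are illustrative assumptions.

    ```python
    import numpy as np
    from scipy import stats

    def welch_satterthwaite(u, dof):
        """Effective degrees of freedom for the combined standard uncertainty:
        nu_eff = u_c^4 / sum(u_i^4 / nu_i), with u_c^2 = sum(u_i^2)."""
        u = np.asarray(u, dtype=float)
        uc = np.sqrt(np.sum(u ** 2))
        return uc ** 4 / np.sum(u ** 4 / np.asarray(dof, dtype=float))

    # Illustrative uncertainty budget: two components with their dof.
    u, dof = [0.25, 0.4], [4, 9]
    nu_eff = welch_satterthwaite(u, dof)
    uc = np.hypot(*u)  # combined standard uncertainty

    # 95% expanded uncertainty: t-based (ISO-GUM) vs normal-based interval.
    k_t = stats.t.ppf(0.975, nu_eff)
    k_n = stats.norm.ppf(0.975)
    print(f"nu_eff = {nu_eff:.1f}")
    print(f"t-based:      k = {k_t:.3f}, U = {k_t * uc:.3f}")
    print(f"normal-based: k = {k_n:.3f}, U = {k_n * uc:.3f}")
    ```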

  4. Does Litter Size Variation Affect Models of Terrestrial Carnivore Extinction Risk and Management?

    PubMed Central

    Devenish-Nelson, Eleanor S.; Stephens, Philip A.; Harris, Stephen; Soulsbury, Carl; Richards, Shane A.

    2013-01-01

    Background Individual variation in both survival and reproduction has the potential to influence extinction risk. Especially for rare or threatened species, reliable population models should adequately incorporate demographic uncertainty. Here, we focus on an important form of demographic stochasticity: variation in litter sizes. We use terrestrial carnivores as an example taxon, as they are frequently threatened or of economic importance. Since data on intraspecific litter size variation are often sparse, it is unclear what probability distribution should be used to describe the pattern of litter size variation for multiparous carnivores. Methodology/Principal Findings We used litter size data on 32 terrestrial carnivore species to test the fit of 12 probability distributions. The influence of these distributions on quasi-extinction probabilities and the probability of successful disease control was then examined for three canid species – the island fox Urocyon littoralis, the red fox Vulpes vulpes, and the African wild dog Lycaon pictus. Best fitting probability distributions differed among the carnivores examined. However, the discretised normal distribution provided the best fit for the majority of species, because variation among litter-sizes was often small. Importantly, however, the outcomes of demographic models were generally robust to the distribution used. Conclusion/Significance These results provide reassurance for those using demographic modelling for the management of less studied carnivores in which litter size variation is estimated using data from species with similar reproductive attributes. PMID:23469140

  5. Does litter size variation affect models of terrestrial carnivore extinction risk and management?

    PubMed

    Devenish-Nelson, Eleanor S; Stephens, Philip A; Harris, Stephen; Soulsbury, Carl; Richards, Shane A

    2013-01-01

    Individual variation in both survival and reproduction has the potential to influence extinction risk. Especially for rare or threatened species, reliable population models should adequately incorporate demographic uncertainty. Here, we focus on an important form of demographic stochasticity: variation in litter sizes. We use terrestrial carnivores as an example taxon, as they are frequently threatened or of economic importance. Since data on intraspecific litter size variation are often sparse, it is unclear what probability distribution should be used to describe the pattern of litter size variation for multiparous carnivores. We used litter size data on 32 terrestrial carnivore species to test the fit of 12 probability distributions. The influence of these distributions on quasi-extinction probabilities and the probability of successful disease control was then examined for three canid species - the island fox Urocyon littoralis, the red fox Vulpes vulpes, and the African wild dog Lycaon pictus. Best fitting probability distributions differed among the carnivores examined. However, the discretised normal distribution provided the best fit for the majority of species, because variation among litter-sizes was often small. Importantly, however, the outcomes of demographic models were generally robust to the distribution used. These results provide reassurance for those using demographic modelling for the management of less studied carnivores in which litter size variation is estimated using data from species with similar reproductive attributes.
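
    A minimal sketch of the discretised normal distribution that best fitted most species above: integrate a normal density over unit-width bins centered on the integer litter sizes and renormalise. The mean, standard deviation and maximum litter size used are illustrative assumptions, not values from the study.

    ```python
    import numpy as np
    from scipy import stats

    def discretised_normal(mu, sigma, kmax):
        """Pmf over litter sizes 1..kmax from a normal density integrated
        over unit-width bins around each integer, then renormalised."""
        edges = np.arange(0.5, kmax + 1.5)            # bin edges around integers
        probs = np.diff(stats.norm.cdf(edges, mu, sigma))
        return probs / probs.sum()

    # Illustrative red-fox-like litter sizes (mean and SD are assumptions).
    pmf = discretised_normal(mu=4.7, sigma=1.6, kmax=12)
    for k, p in enumerate(pmf, start=1):
        print(k, round(p, 4))
    ```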

  6. Neighbor-Dependent Ramachandran Probability Distributions of Amino Acids Developed from a Hierarchical Dirichlet Process Model

    PubMed Central

    Mitra, Rajib; Jordan, Michael I.; Dunbrack, Roland L.

    2010-01-01

    Distributions of the backbone dihedral angles of proteins have been studied for over 40 years. While many statistical analyses have been presented, only a handful of probability densities are publicly available for use in structure validation and structure prediction methods. The available distributions differ in a number of important ways, which determine their usefulness for various purposes. These include: 1) input data size and criteria for structure inclusion (resolution, R-factor, etc.); 2) filtering of suspect conformations and outliers using B-factors or other features; 3) secondary structure of input data (e.g., whether helix and sheet are included; whether beta turns are included); 4) the method used for determining probability densities ranging from simple histograms to modern nonparametric density estimation; and 5) whether they include nearest neighbor effects on the distribution of conformations in different regions of the Ramachandran map. In this work, Ramachandran probability distributions are presented for residues in protein loops from a high-resolution data set with filtering based on calculated electron densities. Distributions for all 20 amino acids (with cis and trans proline treated separately) have been determined, as well as 420 left-neighbor and 420 right-neighbor dependent distributions. The neighbor-independent and neighbor-dependent probability densities have been accurately estimated using Bayesian nonparametric statistical analysis based on the Dirichlet process. In particular, we used hierarchical Dirichlet process priors, which allow sharing of information between densities for a particular residue type and different neighbor residue types. The resulting distributions are tested in a loop modeling benchmark with the program Rosetta, and are shown to improve protein loop conformation prediction significantly. The distributions are available at http://dunbrack.fccc.edu/hdp. PMID:20442867
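
    As a much simpler stand-in for the Bayesian nonparametric machinery described above, the sketch below estimates a neighbor-independent Ramachandran density as a periodic 2D histogram over (phi, psi); the toy data form a single alpha-helix-like cluster. The hierarchical Dirichlet process approach of the paper additionally shares statistical strength across neighbor types, which a histogram cannot do.

    ```python
    import numpy as np

    rng = np.random.default_rng(10)
    phi = rng.normal(-63.0, 15.0, 5000)   # toy alpha-helix-like cluster (degrees)
    psi = rng.normal(-43.0, 15.0, 5000)

    edges = np.linspace(-180.0, 180.0, 73)   # 5-degree bins over the torus
    density, _, _ = np.histogram2d(((phi + 180) % 360) - 180,
                                   ((psi + 180) % 360) - 180,
                                   bins=[edges, edges], density=True)
    print("density near (-63, -43):", density[23, 27])
    ```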

  7. Probabilistic analysis of preload in the abutment screw of a dental implant complex.

    PubMed

    Guda, Teja; Ross, Thomas A; Lang, Lisa A; Millwater, Harry R

    2008-09-01

    Screw loosening is a problem for a percentage of implants. A probabilistic analysis to determine the cumulative probability distribution of the preload, the probability of obtaining an optimal preload, and the probabilistic sensitivities identifying important variables is lacking. The purpose of this study was to examine the inherent variability of material properties, surface interactions, and applied torque in an implant system to determine the probability of obtaining desired preload values and to identify the significant variables that affect the preload. Using software programs, an abutment screw was subjected to a tightening torque and the preload was determined from finite element (FE) analysis. The FE model was integrated with probabilistic analysis software. Two probabilistic analysis methods (advanced mean value and Monte Carlo sampling) were applied to determine the cumulative distribution function (CDF) of preload. The coefficient of friction, elastic moduli, Poisson's ratios, and applied torque were modeled as random variables and defined by probability distributions. Separate probability distributions were determined for the coefficient of friction in well-lubricated and dry environments. The probabilistic analyses were performed and the cumulative distribution of preload was determined for each environment. A distinct difference was seen between the preload probability distributions generated in a dry environment (normal distribution, mean (SD): 347 (61.9) N) compared to a well-lubricated environment (normal distribution, mean (SD): 616 (92.2) N). The probability of obtaining a preload value within the target range was approximately 54% for the well-lubricated environment and only 0.02% for the dry environment. The preload is predominately affected by the applied torque and coefficient of friction between the screw threads and implant bore at lower and middle values of the preload CDF, and by the applied torque and the elastic modulus of the abutment screw at high values of the preload CDF. Lubrication at the threaded surfaces between the abutment screw and implant bore affects the preload developed in the implant complex. For the well-lubricated surfaces, only approximately 50% of implants will have preload values within the generally accepted range. This probability can be improved by applying a higher torque than normally recommended or a more closely controlled torque than typically achieved. It is also suggested that materials with higher elastic moduli be used in the manufacture of the abutment screw to achieve a higher preload.
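
    A rough sketch of the probabilistic idea, using the common short-form torque-preload relation T = K·F·d in place of the paper's finite element model. Every distribution and constant below (torque, friction, the crude nut-factor model, the target band) is an assumption for illustration only.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n = 100_000
    torque = rng.normal(0.32, 0.02, n)          # applied torque (N*m), assumed
    mu = rng.normal(0.26, 0.05, n).clip(0.05)   # dry-ish friction coefficient
    d = 0.002                                   # nominal screw diameter (m)

    k = 1.33 * mu + 0.04                        # crude nut-factor model (assumed)
    preload = torque / (k * d)                  # preload force in newtons

    target_lo, target_hi = 550.0, 700.0         # illustrative optimal band (N)
    in_band = np.mean((preload >= target_lo) & (preload <= target_hi))
    print(f"mean preload {preload.mean():.0f} N, P(optimal band) = {in_band:.2%}")
    ```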

  8. Comparative analysis through probability distributions of a data set

    NASA Astrophysics Data System (ADS)

    Cristea, Gabriel; Constantinescu, Dan Mihai

    2018-02-01

    In practice, probability distributions are applied in fields as diverse as risk analysis, reliability engineering, chemical engineering, hydrology, image processing, physics, market research, business and economic research, customer support, medicine, sociology and demography. This article highlights important aspects of fitting probability distributions to data and applying the analysis results to make informed decisions. A number of statistical methods are available that can help us select the best fitting model. Some graphs display both the input data and the fitted distributions at the same time, as probability density and cumulative distribution functions. Goodness-of-fit tests can be used to determine whether a certain distribution is a good fit. The main idea is to measure the "distance" between the data and the tested distribution and to compare that distance to threshold values. Calculating the goodness-of-fit statistics also enables us to rank the fitted distributions according to how well they fit the data. This feature is very helpful for comparing the fitted models. The paper presents a comparison of the most commonly used goodness-of-fit tests: Kolmogorov-Smirnov, Anderson-Darling, and Chi-Squared. A large set of data is analyzed and conclusions are drawn by visualizing the data, comparing multiple fitted distributions and selecting the best model. These graphs should be viewed as an addition to the goodness-of-fit tests.
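
    A minimal sketch of the workflow described above: fit several candidate distributions to one simulated data set and compare goodness-of-fit statistics. Note that estimating parameters from the same data biases the plain Kolmogorov-Smirnov p-values upward, a caveat that applies to naive use of these tests.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    data = rng.gamma(shape=2.0, scale=3.0, size=1000)  # simulated data

    for name in ("gamma", "lognorm", "norm"):
        dist = getattr(stats, name)
        params = dist.fit(data)                        # maximum likelihood fit
        ks = stats.kstest(data, name, args=params)     # Kolmogorov-Smirnov
        print(f"{name:8s} KS D={ks.statistic:.4f}  p={ks.pvalue:.3f}")

    # Anderson-Darling (scipy supports a fixed list of distributions here):
    print(stats.anderson(data, dist="norm"))
    ```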

  9. Impact of temporal probability in 4D dose calculation for lung tumors.

    PubMed

    Rouabhi, Ouided; Ma, Mingyu; Bayouth, John; Xia, Junyi

    2015-11-08

    The purpose of this study was to evaluate the dosimetric uncertainty in 4D dose calculation using three temporal probability distributions: uniform distribution, sinusoidal distribution, and patient-specific distribution derived from the patient respiratory trace. Temporal probability, defined as the fraction of time a patient spends in each respiratory amplitude, was evaluated in nine lung cancer patients. Four-dimensional computed tomography (4D CT), along with deformable image registration, was used to compute 4D dose incorporating the patient's respiratory motion. First, the dose of each of 10 phase CTs was computed using the same planning parameters as those used in 3D treatment planning based on the breath-hold CT. Next, deformable image registration was used to deform the dose of each phase CT to the breath-hold CT using the deformation map between the phase CT and the breath-hold CT. Finally, the 4D dose was computed by summing the deformed phase doses using their corresponding temporal probabilities. In this study, 4D dose calculated from the patient-specific temporal probability distribution was used as the ground truth. The dosimetric evaluation metrics included: 1) 3D gamma analysis, 2) mean tumor dose (MTD), 3) mean lung dose (MLD), and 4) lung V20. For seven out of nine patients, both uniform and sinusoidal temporal probability dose distributions were found to have an average gamma passing rate > 95% for both the lung and PTV regions. Compared with 4D dose calculated using the patient respiratory trace, doses using uniform and sinusoidal distributions showed a percentage difference on average of -0.1% ± 0.6% and -0.2% ± 0.4% in MTD, -0.2% ± 1.9% and -0.2% ± 1.3% in MLD, 0.09% ± 2.8% and -0.07% ± 1.8% in lung V20, -0.1% ± 2.0% and 0.08% ± 1.34% in lung V10, 0.47% ± 1.8% and 0.19% ± 1.3% in lung V5, respectively. We concluded that four-dimensional dose computed using either a uniform or sinusoidal temporal probability distribution can approximate four-dimensional dose computed using the patient-specific respiratory trace.
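
    The final summation step reduces to a probability-weighted sum over the phase doses. A sketch with stand-in arrays (the deformable registration step is omitted, and the sinusoidal weights are a crude stand-in for the paper's model):

        import numpy as np

        rng = np.random.default_rng(2)
        n_phases, shape = 10, (40, 40, 30)

        # Doses of the 10 phase CTs, already deformed onto the breath-hold CT
        # (random stand-ins; in practice these come from deformable registration).
        phase_doses = rng.uniform(1.8, 2.2, size=(n_phases, *shape))

        # Temporal probabilities: fraction of the breathing cycle per amplitude bin.
        uniform_w = np.full(n_phases, 1.0 / n_phases)
        t = np.linspace(0.0, 2.0 * np.pi, n_phases, endpoint=False)
        sinus_w = np.abs(np.sin(t))
        sinus_w /= sinus_w.sum()        # crude sinusoidal weighting model

        # 4D dose = probability-weighted sum of the deformed phase doses.
        dose_uniform = np.tensordot(uniform_w, phase_doses, axes=1)
        dose_sinus = np.tensordot(sinus_w, phase_doses, axes=1)
        print("mean dose difference:", abs(dose_uniform.mean() - dose_sinus.mean()))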

  10. Evidence-Based Medicine as a Tool for Undergraduate Probability and Statistics Education

    PubMed Central

    Masel, J.; Humphrey, P. T.; Blackburn, B.; Levine, J. A.

    2015-01-01

    Most students have difficulty reasoning about chance events, and misconceptions regarding probability can persist or even strengthen following traditional instruction. Many biostatistics classes sidestep this problem by prioritizing exploratory data analysis over probability. However, probability itself, in addition to statistics, is essential both to the biology curriculum and to informed decision making in daily life. One area in which probability is particularly important is medicine. Given the preponderance of pre-health students, in addition to more general interest in medicine, we capitalized on students’ intrinsic motivation in this area to teach both probability and statistics. We use the randomized controlled trial as the centerpiece of the course, because it exemplifies the most salient features of the scientific method and the application of critical thinking to medicine. The other two pillars of the course are biomedical applications of Bayes’ theorem and science-and-society content. Backward design from these three overarching aims was used to select appropriate probability and statistics content, with a focus on eliciting and countering previously documented misconceptions in their medical context. Pretest/posttest assessments using the Quantitative Reasoning Quotient and Attitudes Toward Statistics instruments are positive, bucking several negative trends previously reported in statistics education. PMID:26582236
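
    A worked example of the kind of biomedical Bayes' theorem computation such a course centers on, with purely illustrative numbers for a diagnostic test:

        # Bayes' theorem for a diagnostic test (illustrative numbers only).
        prevalence = 0.01    # P(disease)
        sensitivity = 0.90   # P(test+ | disease)
        specificity = 0.95   # P(test- | no disease)

        p_pos = sensitivity * prevalence + (1 - specificity) * (1 - prevalence)
        ppv = sensitivity * prevalence / p_pos   # P(disease | test+)
        print(f"P(test+) = {p_pos:.4f}, PPV = {ppv:.3f}")   # PPV is only ~0.15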

  11. Transient Properties of Probability Distribution for a Markov Process with Size-dependent Additive Noise

    NASA Astrophysics Data System (ADS)

    Yamada, Yuhei; Yamazaki, Yoshihiro

    2018-04-01

    This study considered a stochastic model for cluster growth in a Markov process with a cluster size dependent additive noise. According to this model, the probability distribution of the cluster size transiently becomes an exponential or a log-normal distribution depending on the initial condition of the growth. In this letter, a master equation is obtained for this model, and derivation of the distributions is discussed.

  12. q-Gaussian distributions and multiplicative stochastic processes for analysis of multiple financial time series

    NASA Astrophysics Data System (ADS)

    Sato, Aki-Hiro

    2010-12-01

    This study considers q-Gaussian distributions and stochastic differential equations with both multiplicative and additive noises. In the M-dimensional case a q-Gaussian distribution can be theoretically derived as a stationary probability distribution of the multiplicative stochastic differential equation with both mutually independent multiplicative and additive noises. By using the proposed stochastic differential equation a method to evaluate a default probability under a given risk buffer is proposed.
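
    A standard one-dimensional special case illustrates the mechanism: linear drift with combined additive and multiplicative noise relaxes to a heavy-tailed (q-Gaussian, Student-t-like) stationary density. An Euler-Maruyama sketch with hypothetical parameters:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(3)
        gamma_, A, B = 1.0, 0.5, 0.3    # drift, additive and multiplicative noise
        dt, n_steps, n_paths = 1e-3, 20_000, 2_000

        x = np.zeros(n_paths)
        for _ in range(n_steps):
            dw = rng.normal(0.0, np.sqrt(dt), n_paths)
            # dx = -gamma * x dt + sqrt(A + B * x^2) dW  (Ito)
            x += -gamma_ * x * dt + np.sqrt(A + B * x * x) * dw

        # Positive excess kurtosis signals the heavy (q-Gaussian) tails.
        print("excess kurtosis:", stats.kurtosis(x))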

  13. Net present value probability distributions from decline curve reserves estimates

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Simpson, D.E.; Huffman, C.H.; Thompson, R.S.

    1995-12-31

    This paper demonstrates how reserves probability distributions can be used to develop net present value (NPV) distributions. NPV probability distributions were developed from the rate and reserves distributions presented in SPE 28333. This real-data study used practicing engineers' evaluations of production histories. Two approaches were examined to quantify portfolio risk. The first approach, the NPV Relative Risk Plot, compares the mean NPV with the NPV relative risk ratio for the portfolio. The relative risk ratio is the NPV standard deviation (σ) divided by the mean (μ) NPV. The second approach, a Risk-Return Plot, is a plot of the mean (μ) discounted cash flow rate of return (DCFROR) versus the standard deviation (σ) of the DCFROR distribution. This plot provides a risk-return relationship for comparing various portfolios. These methods may help evaluate property acquisition and divestiture alternatives and assess the relative risk of a suite of wells or fields for bank loans.

  14. Optimal random search for a single hidden target.

    PubMed

    Snider, Joseph

    2011-01-01

    A single target is hidden at a location chosen from a predetermined probability distribution. Then, a searcher must find a second probability distribution from which random search points are sampled such that the target is found in the minimum number of trials. Here it will be shown that if the searcher must get very close to the target to find it, then the best search distribution is proportional to the square root of the target distribution regardless of dimension. For a Gaussian target distribution, the optimum search distribution is approximately a Gaussian with a standard deviation that varies inversely with how close the searcher must be to the target to find it. For a network where the searcher randomly samples nodes and looks for the fixed target along edges, the optimum is either to sample a node with probability proportional to the square root of the out-degree plus 1 or not to do so at all.
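
    The square-root rule is easy to verify numerically for a discrete target distribution: if site i holds the target with probability p_i and the searcher samples site i with probability q_i (independently, with replacement, finding the target only by landing exactly on it), the expected number of trials is the sum of p_i/q_i, which q proportional to sqrt(p) minimizes:

        import numpy as np

        rng = np.random.default_rng(4)
        p = rng.dirichlet(np.ones(50))      # target distribution over 50 sites

        def expected_trials(q, p=p):
            # Geometric search: expected draws to first hit site i is 1/q_i.
            return np.sum(p / q)

        q_sqrt = np.sqrt(p); q_sqrt /= q_sqrt.sum()
        q_same = p.copy()
        q_unif = np.full_like(p, 1.0 / p.size)

        # Note q = p and q = uniform both give exactly n = 50 expected trials;
        # the square-root law always does at least as well.
        for name, q in [("sqrt(p)", q_sqrt), ("p itself", q_same), ("uniform", q_unif)]:
            print(f"{name:9s} E[trials] = {expected_trials(q):.1f}")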

  15. Deep Ore-controlling Role Beneath the Collision-related Deposit Zone in South Tibetan Plateau, Preliminary Results Revealed by Magnetotelluric Data

    NASA Astrophysics Data System (ADS)

    Xie, C.; Jin, S.; Wei, W.; Ye, G.; Fang, Y.; Zhang, L.; Dong, H.; Yin, Y.

    2017-12-01

    The Tibetan plateau is the largest and youngest plateau orogenic belt in the world, and its southern part hosts the ongoing India-Eurasia continental collision zone. The collision-related deposit zones distributed across the southern plateau can be roughly divided into three parts: the porphyry deposits in the Gangdese magmatic belt, the chromite deposits along the Yarlung-Zangbo suture (YZS), and the prospective deposits along the gneiss domes in the Tethys Himalaya. The deep ore-controlling role of these deposit zones remains controversial. Magnetotelluric (MT) data previously acquired along profiles from the Himalayan to the Gangdese terrane were inverted using the three-dimensional (3D) MT inversion algorithm ModEM. The results show that resistive cover layers above -10 km extend along the whole profiles, although small, sporadic conductors are also imaged. The middle to lower crust beneath -25 km is imaged as large-scale but discontinuous conductive zones with central resistivities of less than 10 ohm·m. We suggest that the middle to lower crustal conductors can be interpreted as partial melting. This hypothesis is supported by previous geological and geochemical studies. Metallogenesis and partial melting play important roles in promoting each other. For metallogenesis, high water content is one of the prominent factors; water can be released on breakdown of amphibole in eclogite and garnet amphibolite during melting. On the other hand, an increase in water content would probably promote partial melting. The results indicate that the deep processes and magmatism beneath the different deposit zones probably differ. We studied the rheological characteristics from the perspective of subsurface electrical structures. We hope that, through comparative analysis, the 'origin - migration - formation' process of the deep 'magma - rheology - deposition' system will be better understood.

  16. Multi-scale Characterization and Modeling of Surface Slope Probability Distribution for ~20-km Diameter Lunar Craters

    NASA Astrophysics Data System (ADS)

    Mahanti, P.; Robinson, M. S.; Boyd, A. K.

    2013-12-01

    Craters ~20-km diameter and above significantly shaped the lunar landscape. The statistical nature of the slope distribution on their walls and floors dominates the overall slope distribution statistics for the lunar surface. Slope statistics are inherently useful for characterizing the current topography of the surface, determining accurate photometric and surface scattering properties, and in defining lunar surface trafficability [1-4]. Earlier experimental studies on the statistical nature of lunar surface slopes were restricted either by resolution limits (Apollo era photogrammetric studies) or by model error considerations (photoclinometric and radar scattering studies), where the true nature of the slope probability distribution was not discernible at baselines smaller than a kilometer [2,3,5]. Accordingly, historical modeling of lunar surface slope probability distributions for applications such as scattering theory development or rover traversability assessment is more general in nature (use of simple statistical models such as the Gaussian distribution [1,2,5,6]). With the advent of high resolution, high precision topographic models of the Moon [7,8], slopes in lunar craters can now be obtained at baselines as low as 6 meters, allowing unprecedented multi-scale (multiple baseline) modeling possibilities for slope probability distributions. Topographic analysis (Lunar Reconnaissance Orbiter Camera (LROC) Narrow Angle Camera (NAC) 2-m digital elevation models (DEM)) of ~20-km diameter Copernican lunar craters revealed generally steep slopes on interior walls (30° to 36°, locally exceeding 40°) over 15-meter baselines [9]. In this work, we extend the analysis from a probability distribution modeling point of view with NAC DEMs to characterize the slope statistics for the floors and walls of the same ~20-km Copernican lunar craters. The difference in slope standard deviations between the Gaussian approximation and the actual distribution (2-meter sampling) was computed over multiple scales. This slope analysis showed that local slope distributions are non-Gaussian for both crater walls and floors. Over larger baselines (~100 meters), crater wall slope probability distributions do approximate Gaussian distributions better, but have long distribution tails. Crater floor probability distributions, however, were always asymmetric (for the baseline scales analyzed) and less affected by baseline scale variations. Accordingly, our results suggest that use of long-tailed probability distributions (like Cauchy) and a baseline-dependent multi-scale model can be more effective in describing the slope statistics of lunar topography. References: [1] Moore, H. (1971), JGR, 75(11) [2] Marcus, A. H. (1969), JGR, 74(22). [3] R.J. Pike (1970), U.S. Geological Survey Working Paper [4] N. C. Costes, J. E. Farmer and E. B. George (1972), NASA Technical Report TR R-401 [5] M. N. Parker and G. L. Tyler (1973), Radio Science, 8(3), 177-184 [6] Alekseev, V. A. et al. (1968), Soviet Astronomy, Vol. 11, p.860 [7] Burns et al. (2012) Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci., XXXIX-B4, 483-488. [8] Smith et al. (2010) GRL 37, L18204, DOI: 10.1029/2010GL043751. [9] Wagner R., Robinson, M., Speyerer E., Mahanti, P., LPSC 2013, #2924.
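
    The practical consequence, that a Gaussian badly underpredicts extreme slopes when the underlying distribution is long-tailed, can be illustrated on synthetic data (a Student-t sample stands in for real slope measurements):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(5)
        # Synthetic slope sample: Gaussian core plus long tails (Student-t, df=3).
        slopes = stats.t.rvs(df=3, loc=8.0, scale=4.0, size=20_000, random_state=rng)

        mu, sd = slopes.mean(), slopes.std()
        for k in (2, 3, 4):
            empirical = np.mean(np.abs(slopes - mu) > k * sd)
            gaussian = 2 * stats.norm.sf(k)
            print(f"|slope - mean| > {k} SD: empirical {empirical:.2e}, "
                  f"Gaussian model {gaussian:.2e}")
        # The Gaussian underpredicts the frequency of extreme slopes by a
        # growing factor as one moves further into the tail.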

  17. Development and application of an empirical probability distribution for the prediction error of re-entry body maximum dynamic pressure

    NASA Technical Reports Server (NTRS)

    Lanzi, R. James; Vincent, Brett T.

    1993-01-01

    The relationship between actual and predicted re-entry maximum dynamic pressure is characterized using a probability density function and a cumulative distribution function derived from sounding rocket flight data. This paper explores the properties of this distribution and demonstrates applications of this data with observed sounding rocket re-entry body damage characteristics to assess probabilities of sustaining various levels of heating damage. The results from this paper effectively bridge the gap existing in sounding rocket reentry analysis between the known damage level/flight environment relationships and the predicted flight environment.

  18. Probability and the changing shape of response distributions for orientation.

    PubMed

    Anderson, Britt

    2014-11-18

    Spatial attention and feature-based attention are regarded as two independent mechanisms for biasing the processing of sensory stimuli. Feature attention is held to be a spatially invariant mechanism that advantages a single feature per sensory dimension. In contrast to the prediction of location independence, I found that participants were able to report the orientation of a briefly presented visual grating better for targets defined by high probability conjunctions of features and locations even when orientations and locations were individually uniform. The advantage for high-probability conjunctions was accompanied by changes in the shape of the response distributions. High-probability conjunctions had error distributions that were not normally distributed but demonstrated increased kurtosis. The increase in kurtosis could be explained as a change in the variances of the component tuning functions that comprise a population mixture. By changing the mixture distribution of orientation-tuned neurons, it is possible to change the shape of the discrimination function. This prompts the suggestion that attention may not "increase" the quality of perceptual processing in an absolute sense but rather prioritizes some stimuli over others. This results in an increased number of highly accurate responses to probable targets and, simultaneously, an increase in the number of very inaccurate responses. © 2014 ARVO.
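
    The kurtosis mechanism described here is straightforward to verify: a zero-mean mixture of narrow and broad Gaussian tuning components is leptokurtic. A sketch with hypothetical widths:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(6)
        n = 200_000

        # Population of orientation errors: a mixture of two zero-mean tuning
        # widths (narrow responses to probable targets, broad ones otherwise).
        narrow = rng.normal(0.0, 2.0, n // 2)
        broad = rng.normal(0.0, 10.0, n // 2)
        mixture = np.concatenate([narrow, broad])

        print("excess kurtosis, single Gaussian:", stats.kurtosis(narrow).round(2))
        print("excess kurtosis, mixture:        ", stats.kurtosis(mixture).round(2))
        # A scale mixture of normals has heavier tails and a sharper peak than
        # a single Gaussian of the same overall variance (excess kurtosis > 0).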

  19. Vector wind and vector wind shear models 0 to 27 km altitude for Cape Kennedy, Florida, and Vandenberg AFB, California

    NASA Technical Reports Server (NTRS)

    Smith, O. E.

    1976-01-01

    The techniques are presented to derive several statistical wind models. The techniques derive from the properties of the multivariate normal probability distribution function. Assuming that the winds can be considered bivariate normally distributed, then (1) the wind components and conditional wind components are univariate normally distributed, (2) the wind speed is Rayleigh distributed, (3) the conditional distribution of wind speed given a wind direction is Rayleigh distributed, and (4) the frequency of wind direction can be derived. All of these distributions are derived from the five sample parameters of wind for the bivariate normal distribution. By further assuming that the winds at two altitudes are quadravariate normally distributed, the vector wind shear is bivariate normally distributed and the modulus of the vector wind shear is Rayleigh distributed. The conditional probability of wind component shears given a wind component is normally distributed. Examples of these and other properties of the multivariate normal probability distribution function as applied to Cape Kennedy, Florida, and Vandenberg AFB, California, wind data samples are given. A technique to develop a synthetic vector wind profile model of interest to aerospace vehicle applications is presented.
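
    A quick numerical check of property (2), which holds exactly in the special case of zero-mean, equal-variance, uncorrelated wind components:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(7)
        sigma, n = 5.0, 50_000

        # Zero-mean, equal-variance, uncorrelated components: the case in
        # which the wind speed is exactly Rayleigh distributed.
        u = rng.normal(0.0, sigma, n)   # zonal component [m/s]
        v = rng.normal(0.0, sigma, n)   # meridional component [m/s]
        speed = np.hypot(u, v)

        ks = stats.kstest(speed, stats.rayleigh(scale=sigma).cdf)
        print(f"KS distance vs Rayleigh(scale={sigma}): {ks.statistic:.4f}, "
              f"p = {ks.pvalue:.3f}")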

  20. Entropy-Bayesian Inversion of Time-Lapse Tomographic GPR data for Monitoring Dielectric Permittivity and Soil Moisture Variations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hou, Zhangshuan; Terry, Neil C.; Hubbard, Susan S.

    2013-02-22

    In this study, we evaluate the possibility of monitoring soil moisture variation using tomographic ground penetrating radar travel time data through Bayesian inversion, which is integrated with entropy memory function and pilot point concepts, as well as efficient sampling approaches. It is critical to accurately estimate soil moisture content and variations in vadose zone studies. Many studies have illustrated the promise and value of GPR tomographic data for estimating soil moisture and associated changes; however, challenges still exist in the inversion of GPR tomographic data in a manner that quantifies input and predictive uncertainty, incorporates multiple data types, handles non-uniqueness and nonlinearity, and honors time-lapse tomograms collected in a series. To address these challenges, we develop a minimum relative entropy (MRE)-Bayesian based inverse modeling framework that non-subjectively defines prior probabilities, incorporates information from multiple sources, and quantifies uncertainty. The framework enables us to estimate dielectric permittivity at pilot point locations distributed within the tomogram, as well as the spatial correlation range. In the inversion framework, MRE is first used to derive prior probability density functions (pdfs) of dielectric permittivity based on prior information obtained from a straight-ray GPR inversion. The probability distributions are then sampled using a Quasi-Monte Carlo (QMC) approach, and the sample sets provide inputs to a sequential Gaussian simulation (SGSIM) algorithm that constructs a highly resolved permittivity/velocity field for evaluation with a curved-ray GPR forward model. The likelihood functions are computed as a function of misfits, and posterior pdfs are constructed using a Gaussian kernel. Inversion of subsequent time-lapse datasets combines the Bayesian estimates from the previous inversion (as a memory function) with new data. The memory function and pilot point design takes advantage of the spatial-temporal correlation of the state variables. We first apply the inversion framework to a static synthetic example and then to a time-lapse GPR tomographic dataset collected during a dynamic experiment conducted at the Hanford Site in Richland, WA. We demonstrate that the MRE-Bayesian inversion enables us to merge various data types, quantify uncertainty, evaluate nonlinear models, and produce more detailed and better resolved estimates than straight-ray based inversion; therefore, it has the potential to improve estimates of inter-wellbore dielectric permittivity and soil moisture content and to monitor their temporal dynamics more accurately.

  1. Joint probabilities and quantum cognition

    NASA Astrophysics Data System (ADS)

    de Barros, J. Acacio

    2012-12-01

    In this paper we discuss the existence of joint probability distributions for quantumlike response computations in the brain. We do so by focusing on a contextual neural-oscillator model shown to reproduce the main features of behavioral stimulus-response theory. We then exhibit a simple example of contextual random variables not having a joint probability distribution, and describe how such variables can be obtained from neural oscillators, but not from a quantum observable algebra.
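
    Whether given pairwise moments of ±1-valued variables admit any joint distribution can be posed as a linear-programming feasibility problem over the eight joint outcomes; the classic infeasible case is all three pairwise correlations equal to -1:

        import itertools
        import numpy as np
        from scipy.optimize import linprog

        def joint_exists(exy, eyz, exz):
            """Is there a pmf over (x, y, z) in {-1, 1}^3 with these moments?"""
            outcomes = list(itertools.product((-1, 1), repeat=3))
            # Equality constraints: probabilities sum to 1 and match the moments.
            a_eq = [[1.0] * 8,
                    [x * y for x, y, z in outcomes],
                    [y * z for x, y, z in outcomes],
                    [x * z for x, y, z in outcomes]]
            b_eq = [1.0, exy, eyz, exz]
            res = linprog(c=np.zeros(8), A_eq=a_eq, b_eq=b_eq,
                          bounds=[(0.0, 1.0)] * 8)
            return res.success

        print(joint_exists(-1, -1, -1))   # False: no joint distribution exists
        print(joint_exists(-1, -1, +1))   # True: Y = -X, Z = -Y gives E[XZ] = +1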

  2. Properties of the probability distribution associated with the largest event in an earthquake cluster and their implications to foreshocks.

    PubMed

    Zhuang, Jiancang; Ogata, Yosihiko

    2006-04-01

    The space-time epidemic-type aftershock sequence model is a stochastic branching process in which earthquake activity is classified into background and clustering components and each earthquake triggers other earthquakes independently according to certain rules. This paper gives the probability distributions associated with the largest event in a cluster and their properties for all three cases: when the process is subcritical, critical, and supercritical. One direct use of these probability distributions is to evaluate the probability of an earthquake being a foreshock, and the magnitude distributions of foreshocks and non-foreshock earthquakes. To verify these theoretical results, the Japan Meteorological Agency earthquake catalog is analyzed. The proportion of events that have one or more larger descendants among all events is found to be as high as about 15%. When the differences between background events and triggered events in the behavior of triggering children are considered, a background event has a probability of about 8% of being a foreshock. This probability decreases as the magnitude of the background event increases. These results, obtained from a complicated clustering model in which the characteristics of background events and triggered events differ, are consistent with the results obtained in [Ogata, Geophys. J. Int. 127, 17 (1996)] by using the conventional single-linked cluster declustering method.

  3. Sampling and Analysis of Atmospheric Pcdd/fs in South China Sea and Background Area in Vietnam

    NASA Astrophysics Data System (ADS)

    Chi, K.; Thuan, N. T.; Anh, N. X.; Lin, N.

    2011-12-01

    During the Vietnam conflict, United States (US) forces sprayed a greater volume of defoliant (Agent Orange), with higher PCDD/F content, in central Vietnam. The Vietnamese have been exposed to these compounds since the spraying, primarily through contact with former US military infrastructure. In this study, the concentrations of atmospheric PCDD/Fs were observed at three background stations (Fig. 1): Dongsha Island (Site A) in the South China Sea, Da Nang city (Site B), and Son La (Site C), in central and northern Vietnam, respectively, to further understand PCDD/F contamination in Vietnam. The measurements indicated that the atmospheric PCDD/F concentrations at Sites A, B and C were 1.66~10.8, 23.4~146 and 11.1~59.5 fg I-TEQ/m3, respectively, during the spring seasons of 2010 and 2011. The significantly lower PCDD/F concentrations and contents in suspended particles (23.7~33.9 pg I-TEQ/g-TSP) measured at Site A in the South China Sea can be attributed to the lack of any combustion sources within almost 300 km of this island. In contrast, significantly higher PCDD/F contents in suspended particles (270~300 pg I-TEQ/g-TSP) were measured at Site B in central Vietnam. In addition, Fig. 2 shows that the distribution of PCDD/F congeners measured at the Da Nang station was quite different from those measured at the other stations, with a high PCDD fraction (>85%), especially OCDD (>70%). We consider that the high fraction of PCDDs observed at Da Nang probably originated as anthropogenic emissions from a specific source in Vietnam.

  4. Permafrost and land cover as controlling factors for light fraction organic matter on the southern Qinghai-Tibetan plateau.

    PubMed

    Wu, Xiaodong; Zhao, Lin; Hu, Guojie; Liu, Guimin; Li, Wangping; Ding, Yongjian

    2018-02-01

    Permafrost degradation can stimulate the decomposition of organic soil matter and cause a large amount of greenhouse gas emissions into the atmosphere. The light fraction organic matter (LFOM) is a labile substrate for microbial decomposition and probably plays an important role in future permafrost carbon cycles. However, little is known about the distribution of LFOM and its relationship with permafrost and environmental factors. Here, we investigated the light fraction carbon (LFC) and nitrogen (LFN) contents and stocks under meadows and wet meadows with different permafrost conditions on the southern Qinghai-Tibetan Plateau. Our results showed that LFC and LFN were mainly distributed in the upper 30 cm of soils, and the sites with permafrost had significantly higher contents of LFC and LFN than those from the sites without existing permafrost. The LFC and LFN decreased sharply with depth, suggesting that the soil organic matter (SOM) in this area was highly decomposed in deep soils. Soil moisture and bulk density explained approximately 50% of the variances in LFC and LFN for all the sampling sites, while soil moisture explained approximately 30% of the variance in permafrost sites. Both the C:N ratios and LFC:LFN ratios in the sites with permafrost were higher than those in the sites without permafrost. The results suggested that the permafrost and land cover types are the main factors controlling LFOM content and stock, and that permafrost degradation would lead to a decrease of LFOM and soil C:N ratios, thus accelerating the decomposition of SOM. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Does Breast Cancer Drive the Building of Survival Probability Models among States? An Assessment of Goodness of Fit for Patient Data from SEER Registries

    PubMed

    Khan, Hafiz; Saxena, Anshul; Perisetti, Abhilash; Rafiq, Aamrin; Gabbidon, Kemesha; Mende, Sarah; Lyuksyutova, Maria; Quesada, Kandi; Blakely, Summre; Torres, Tiffany; Afesse, Mahlet

    2016-12-01

    Background: Breast cancer is a worldwide public health concern and is the most prevalent type of cancer in women in the United States. This study concerned the best fit of statistical probability models on the basis of survival times for nine state cancer registries: California, Connecticut, Georgia, Hawaii, Iowa, Michigan, New Mexico, Utah, and Washington. Materials and Methods: A probability random sampling method was applied to select and extract records of 2,000 breast cancer patients from the Surveillance Epidemiology and End Results (SEER) database for each of the nine state cancer registries used in this study. EasyFit software was utilized to identify the best probability models by using goodness of fit tests, and to estimate parameters for various statistical probability distributions that fit survival data. Results: Statistical analysis for the summary of statistics is reported for each of the states for the years 1973 to 2012. Kolmogorov-Smirnov, Anderson-Darling, and Chi-squared goodness of fit test values were used for survival data, the highest values of goodness of fit statistics being considered indicative of the best fit survival model for each state. Conclusions: It was found that California, Connecticut, Georgia, Iowa, New Mexico, and Washington followed the Burr probability distribution, while the Dagum probability distribution gave the best fit for Michigan and Utah, and Hawaii followed the Gamma probability distribution. These findings highlight differences between states through selected sociodemographic variables and also demonstrate probability modeling differences in breast cancer survival times. The results of this study can be used to guide healthcare providers and researchers for further investigations into social and environmental factors in order to reduce the occurrence of and mortality due to breast cancer.

  6. Role of the site of synaptic competition and the balance of learning forces for Hebbian encoding of probabilistic Markov sequences

    PubMed Central

    Bouchard, Kristofer E.; Ganguli, Surya; Brainard, Michael S.

    2015-01-01

    The majority of distinct sensory and motor events occur as temporally ordered sequences with rich probabilistic structure. Sequences can be characterized by the probability of transitioning from the current state to upcoming states (forward probability), as well as the probability of having transitioned to the current state from previous states (backward probability). Despite the prevalence of probabilistic sequencing of both sensory and motor events, the Hebbian mechanisms that mold synapses to reflect the statistics of experienced probabilistic sequences are not well understood. Here, we show through analytic calculations and numerical simulations that Hebbian plasticity (correlation, covariance, and STDP) with pre-synaptic competition can develop synaptic weights equal to the conditional forward transition probabilities present in the input sequence. In contrast, post-synaptic competition can develop synaptic weights proportional to the conditional backward probabilities of the same input sequence. We demonstrate that to stably reflect the conditional probability of a neuron's inputs and outputs, local Hebbian plasticity requires balance between competitive learning forces that promote synaptic differentiation and homogenizing learning forces that promote synaptic stabilization. The balance between these forces dictates a prior over the distribution of learned synaptic weights, strongly influencing both the rate at which structure emerges and the entropy of the final distribution of synaptic weights. Together, these results demonstrate a simple correspondence between the biophysical organization of neurons, the site of synaptic competition, and the temporal flow of information encoded in synaptic weights by Hebbian plasticity while highlighting the utility of balancing learning forces to accurately encode probability distributions, and prior expectations over such probability distributions. PMID:26257637
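
    A minimal sketch of the forward-probability result (a counting rule with normalization over each pre-synaptic neuron's outgoing weights stands in here for the paper's online plasticity dynamics):

        import numpy as np

        rng = np.random.default_rng(8)

        # Ground-truth forward transition probabilities of a 3-state sequence.
        p = np.array([[0.1, 0.6, 0.3],
                      [0.5, 0.2, 0.3],
                      [0.2, 0.3, 0.5]])

        # Generate a long state sequence from the Markov chain.
        states = [0]
        for _ in range(100_000):
            states.append(rng.choice(3, p=p[states[-1]]))

        # Hebbian co-activation counts w[i, j] for transitions i -> j, with
        # pre-synaptic competition modeled as normalization over the outgoing
        # weights of each pre-synaptic unit.
        w = np.zeros((3, 3))
        for pre, post in zip(states[:-1], states[1:]):
            w[pre, post] += 1.0
        w /= w.sum(axis=1, keepdims=True)

        print(np.round(w, 2))   # approximates the forward transition matrix p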

  7. A method to deconvolve stellar rotational velocities II. The probability distribution function via Tikhonov regularization

    NASA Astrophysics Data System (ADS)

    Christen, Alejandra; Escarate, Pedro; Curé, Michel; Rial, Diego F.; Cassetti, Julia

    2016-10-01

    Aims: Knowing the distribution of stellar rotational velocities is essential for understanding stellar evolution. Because we measure the projected rotational speed v sin I, we need to solve an ill-posed problem given by a Fredholm integral of the first kind to recover the "true" rotational velocity distribution. Methods: After discretization of the Fredholm integral we apply the Tikhonov regularization method to obtain directly the probability distribution function for stellar rotational velocities. We propose a simple and straightforward procedure to determine the Tikhonov parameter. We applied Monte Carlo simulations to prove that the Tikhonov method is a consistent estimator and asymptotically unbiased. Results: This method is applied to a sample of cluster stars. We obtain confidence intervals using a bootstrap method. Our results are in close agreement with those obtained using the Lucy method for recovering the probability density distribution of rotational velocities. Furthermore, Lucy estimation lies inside our confidence interval. Conclusions: Tikhonov regularization is a highly robust method that deconvolves the rotational velocity probability density function from a sample of v sin I data directly without the need for any convergence criteria.
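
    The discretized workflow has a compact closed form: with kernel matrix K, data g, and regularization parameter lambda, the Tikhonov estimate is f_hat = (K^T K + lambda*I)^(-1) K^T g. A sketch with a Gaussian blur kernel standing in for the paper's v sin I projection kernel:

        import numpy as np

        rng = np.random.default_rng(9)
        n = 100
        x = np.linspace(0.0, 1.0, n)

        # Discretized Fredholm kernel of the first kind (a Gaussian blur stands
        # in for the rotational-velocity projection kernel).
        K = np.exp(-0.5 * ((x[:, None] - x[None, :]) / 0.05) ** 2)
        K /= K.sum(axis=1, keepdims=True)

        f_true = np.exp(-0.5 * ((x - 0.5) / 0.1) ** 2)     # "true" distribution
        g = K @ f_true + rng.normal(0.0, 1e-3, n)          # blurred, noisy data

        # Tikhonov-regularized solution: argmin ||K f - g||^2 + lam ||f||^2.
        lam = 1e-4
        f_hat = np.linalg.solve(K.T @ K + lam * np.eye(n), K.T @ g)
        print("RMS reconstruction error:", np.linalg.norm(f_hat - f_true) / np.sqrt(n))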

  8. Quantitative assessment of building fire risk to life safety.

    PubMed

    Guanquan, Chu; Jinhua, Sun

    2008-06-01

    This article presents a quantitative risk assessment framework for evaluating fire risk to life safety. Fire risk is divided into two parts: the probability and the corresponding consequence of every fire scenario. The time-dependent event tree technique is used to analyze probable fire scenarios based on the effect of fire protection systems on fire spread and smoke movement. To obtain the variation of occurrence probability with time, a Markov chain is combined with a time-dependent event tree for stochastic analysis of the occurrence probability of fire scenarios. To obtain the consequences of every fire scenario, some uncertainties are considered in the risk analysis process. When calculating the onset time to untenable conditions, a range of fires is designed based on different fire growth rates, after which the uncertainty of onset time to untenable conditions can be characterized by a probability distribution. When calculating occupant evacuation time, occupant premovement time is modeled as a probability distribution. The consequences of a fire scenario can then be evaluated from the probability distributions of evacuation time and onset time of untenable conditions. Fire risk to life safety can then be evaluated based on the occurrence probability and consequences of every fire scenario. To illustrate the risk assessment method in detail, a commercial building is presented as a case study. A discussion compares the assessment result of the case study with fire statistics.
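
    The core consequence calculation, comparing uncertain evacuation time against the uncertain onset of untenable conditions, reduces to a Monte Carlo estimate; all distributions below are hypothetical:

        import numpy as np

        rng = np.random.default_rng(10)
        n = 100_000

        # Hypothetical uncertainty models for one fire scenario.
        premovement = rng.lognormal(mean=np.log(60.0), sigma=0.5, size=n)  # [s]
        walking = rng.normal(90.0, 15.0, n)                                # [s]
        evacuation = premovement + walking

        onset_untenable = rng.lognormal(mean=np.log(240.0), sigma=0.3, size=n)

        # Consequence probability: occupants still inside when conditions
        # become untenable.
        p_fail = np.mean(evacuation > onset_untenable)
        scenario_prob = 1e-3          # hypothetical occurrence probability
        print(f"P(evacuation > onset) = {p_fail:.3f}")
        print(f"scenario risk contribution = {scenario_prob * p_fail:.2e}")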

  9. Competing risk models in reliability systems, an exponential distribution model with Bayesian analysis approach

    NASA Astrophysics Data System (ADS)

    Iskandar, I.

    2018-03-01

    The exponential distribution is the most widely used distribution in reliability analysis. It is well suited to representing lifetimes in many applications and is available in a simple statistical form. The characteristic of this distribution is a constant hazard rate; the exponential distribution is the simplest member of the Weibull family (a Weibull distribution with shape parameter equal to one). In this paper our effort is to introduce the basic notions that constitute an exponential competing risks model in reliability analysis using a Bayesian analysis approach and to present the analytic methods. The cases are limited to models with independent causes of failure. A non-informative prior distribution is used in our analysis. The paper describes the likelihood function, followed by the posterior function and the estimation of the point, interval, hazard function, and reliability quantities. The net probability of failure if only one specific risk is present, the crude probability of failure due to a specific risk in the presence of other causes, and partial crude probabilities are also included.
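
    For independent exponential causes observed over a fixed total time on test, a non-informative prior gives each rate an approximately Gamma posterior, and the crude probability of failing from cause j in the presence of the others is lambda_j / sum(lambda). A sketch with hypothetical counts:

        import numpy as np

        rng = np.random.default_rng(11)

        # Hypothetical test data: total time on test and failure counts by cause.
        total_time = 1_000.0        # unit-hours observed
        failures = {"cause_1": 7, "cause_2": 3}

        # With a non-informative prior, the posterior of each exponential rate
        # is approximately Gamma(n_j, total_time); sample the joint posterior.
        samples = {c: rng.gamma(shape=n, scale=1.0 / total_time, size=50_000)
                   for c, n in failures.items()}

        lam1, lam2 = samples["cause_1"], samples["cause_2"]
        crude_1 = lam1 / (lam1 + lam2)   # P(failure is due to cause 1)
        print(f"posterior mean rate 1: {lam1.mean():.4f} per hour")
        print(f"crude probability of cause 1: {crude_1.mean():.3f} "
              f"(naive n1/(n1+n2) = {7/10:.3f})")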

  10. On probability-possibility transformations

    NASA Technical Reports Server (NTRS)

    Klir, George J.; Parviz, Behzad

    1992-01-01

    Several probability-possibility transformations are compared in terms of the closeness of preserving second-order properties. The comparison is based on experimental results obtained by computer simulation. Two second-order properties are involved in this study: noninteraction of two distributions and projections of a joint distribution.

  11. Theoretical size distribution of fossil taxa: analysis of a null model.

    PubMed

    Reed, William J; Hughes, Barry D

    2007-03-22

    This article deals with the theoretical size distribution (of number of sub-taxa) of a fossil taxon arising from a simple null model of macroevolution. New species arise through speciations occurring independently and at random at a fixed probability rate, while extinctions either occur independently and at random (background extinctions) or cataclysmically. In addition new genera are assumed to arise through speciations of a very radical nature, again assumed to occur independently and at random at a fixed probability rate. The size distributions of the pioneering genus (following a cataclysm) and of derived genera are determined. Also the distribution of the number of genera is considered along with a comparison of the probability of a monospecific genus with that of a monogeneric family.

  12. The aluminium content of breast tissue taken from women with breast cancer.

    PubMed

    House, Emily; Polwart, Anthony; Darbre, Philippa; Barr, Lester; Metaxas, George; Exley, Christopher

    2013-10-01

    The aetiology of breast cancer is multifactorial. While there are known genetic predispositions to the disease, it is probable that environmental factors are also involved. Recent research has demonstrated a regionally specific distribution of aluminium in breast tissue mastectomies, while other work has suggested mechanisms whereby breast tissue aluminium might contribute towards the aetiology of breast cancer. We sought to develop microwave digestion combined with a new form of graphite furnace atomic absorption spectrometry as a precise, accurate and reproducible method for the measurement of aluminium in breast tissue biopsies. We used this method to test the thesis that there is a regional distribution of aluminium across the breast in women with breast cancer. Microwave digestion of whole breast tissue samples resulted in clear homogenous digests perfectly suitable for the determination of aluminium by graphite furnace atomic absorption spectrometry. The instrument detection limit for the method was 0.48 μg/L. Method blanks were used to estimate background levels of contamination of 14.80 μg/L. The mean concentration of aluminium across all tissues was 0.39 μg Al/g tissue dry wt. There were no statistically significant regionally specific differences in the content of aluminium. We have developed a robust method for the precise and accurate measurement of aluminium in human breast tissue. There are very few such data currently available in the scientific literature, and they will add substantially to our understanding of any putative role of aluminium in breast cancer. While we did not observe any statistically significant differences in aluminium content across the breast, it has to be emphasised that herein we measured whole breast tissue and not defatted tissue, where such a distribution was previously noted. We are very confident that the method developed herein could now be used to provide accurate and reproducible data on the aluminium content in defatted tissue and oil from such tissues and thereby contribute towards our knowledge of aluminium and any role in breast cancer. Copyright © 2013 Elsevier GmbH. All rights reserved.

  13. [Rare earth elements contents and distribution characteristics in nasopharyngeal carcinoma tissue].

    PubMed

    Zhang, Xiangmin; Lan, Xiaolin; Zhang, Lingzhen; Xiao, Fufu; Zhong, Zhaoming; Ye, Guilin; Li, Zong; Li, Shaojin

    2016-03-01

    To investigate the rare earth element (REE) contents and distribution characteristics in nasopharyngeal carcinoma (NPC) tissue in the Gannan region. Thirty NPC patients from the Gannan region were included in this study. The REE contents were measured by inductively coupled plasma tandem mass spectrometry (ICP-MS/MS) in the 30 patients, and the REE contents and distribution were analyzed. The standard deviations of REE contents in carcinoma and normal tissues were mostly small. Light REE contents were higher than medium REE contents, which in turn were higher than heavy REE contents. REE contents in nasopharyngeal carcinoma varied markedly: the absolute contents of Nd, Ce, Pr, Gd and other light REEs varied widely, as did the degrees of change of Yb, Tb, Ho and other heavy REEs, and negative Eu and Ce anomalies were present (δEu = 0.3855, δCe = 0.5234). The distribution of REE contents in NPC patients is consistent with the odd-even (parity) pattern: with increasing atomic number, the contents decline in a wave-like manner. The distribution patterns show depletion of heavy REEs and enrichment of light REEs, with negative Eu and Ce anomalies.

  14. Newton/Poisson-Distribution Program

    NASA Technical Reports Server (NTRS)

    Bowerman, Paul N.; Scheuer, Ernest M.

    1990-01-01

    NEWTPOIS, one of two computer programs making calculations involving cumulative Poisson distributions. NEWTPOIS (NPO-17715) and CUMPOIS (NPO-17714) used independently of one another. NEWTPOIS determines Poisson parameter for given cumulative probability, from which one obtains percentiles for gamma distributions with integer shape parameters and percentiles for chi-squared (χ2) distributions with even degrees of freedom. Used by statisticians and others concerned with probabilities of independent events occurring over specific units of time, area, or volume. Program written in C.
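
    The core computation can be sketched as a Newton iteration for the Poisson parameter λ given k and a target cumulative probability, using the identity d/dλ P(X ≤ k; λ) = -P(X = k; λ); this is an illustration of the approach, not the program's actual source:

        from scipy import stats

        def poisson_lambda(k, p_target, lam=None, tol=1e-10, max_iter=100):
            """Solve P(X <= k; lam) = p_target for lam by Newton's method.

            Uses the identity d/dlam P(X <= k; lam) = -P(X = k; lam).
            """
            if lam is None:
                lam = k + 1.0                      # reasonable starting point
            for _ in range(max_iter):
                f = stats.poisson.cdf(k, lam) - p_target
                step = f / (-stats.poisson.pmf(k, lam))
                lam = max(lam - step, 1e-12)       # keep the parameter positive
                if abs(step) < tol:
                    return lam
            raise RuntimeError("Newton iteration did not converge")

        lam = poisson_lambda(k=5, p_target=0.90)
        print(lam, stats.poisson.cdf(5, lam))      # CDF recovers 0.90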

  15. Probability theory for 3-layer remote sensing radiative transfer model: univariate case.

    PubMed

    Ben-David, Avishai; Davidson, Charles E

    2012-04-23

    A probability model for a 3-layer radiative transfer model (foreground layer, cloud layer, background layer, and an external source at the end of line of sight) has been developed. The 3-layer model is fundamentally important as the primary physical model in passive infrared remote sensing. The probability model is described by the Johnson family of distributions that are used as a fit for theoretically computed moments of the radiative transfer model. From the Johnson family we use the SU distribution that can address a wide range of skewness and kurtosis values (in addition to addressing the first two moments, mean and variance). In the limit, SU can also describe lognormal and normal distributions. With the probability model one can evaluate the potential for detecting a target (vapor cloud layer), the probability of observing thermal contrast, and evaluate performance (receiver operating characteristics curves) in clutter-noise limited scenarios. This is (to our knowledge) the first probability model for the 3-layer remote sensing geometry that treats all parameters as random variables and includes higher-order statistics. © 2012 Optical Society of America

  16. Sampling probability distributions of lesions in mammograms

    NASA Astrophysics Data System (ADS)

    Looney, P.; Warren, L. M.; Dance, D. R.; Young, K. C.

    2015-03-01

    One approach to image perception studies in mammography using virtual clinical trials involves the insertion of simulated lesions into normal mammograms. To facilitate this, a method has been developed that allows for sampling of lesion positions across the cranio-caudal and medio-lateral radiographic projections in accordance with measured distributions of real lesion locations. 6825 mammograms from our mammography image database were segmented to find the breast outline. The outlines were averaged and smoothed to produce an average outline for each laterality and radiographic projection. Lesions in 3304 mammograms with malignant findings were mapped on to a standardised breast image corresponding to the average breast outline using piecewise affine transforms. A four dimensional probability distribution function was found from the lesion locations in the cranio-caudal and medio-lateral radiographic projections for calcification and noncalcification lesions. Lesion locations sampled from this probability distribution function were mapped on to individual mammograms using a piecewise affine transform which transforms the average outline to the outline of the breast in the mammogram. The four dimensional probability distribution function was validated by comparing it to the two dimensional distributions found by considering each radiographic projection and laterality independently. The correlation of the location of the lesions sampled from the four dimensional probability distribution function across radiographic projections was shown to match the correlation of the locations of the original mapped lesion locations. The current system has been implemented as a web-service on a server using the Python Django framework. The server performs the sampling, performs the mapping and returns the results in a javascript object notation format.
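
    Sampling jointly across projections, which preserves the measured cross-projection correlation, reduces to one categorical draw over the flattened bins of the discretized pdf. A sketch with a hypothetical binning:

        import numpy as np

        rng = np.random.default_rng(12)

        # Hypothetical discretized 4D pdf over (cc_x, cc_y, ml_x, ml_y) bins.
        shape = (20, 20, 20, 20)
        pdf = rng.random(shape)
        pdf /= pdf.sum()

        def sample_lesion_locations(pdf, n):
            """Draw n index tuples from a discretized joint pdf of any dimension."""
            flat_idx = rng.choice(pdf.size, size=n, p=pdf.ravel())
            return np.column_stack(np.unravel_index(flat_idx, pdf.shape))

        samples = sample_lesion_locations(pdf, 5)
        print(samples)   # each row: bin indices in the CC and ML projections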

  17. Sunflower Plants as Bioindicators of Environmental Pollution with Lead (II) Ions

    PubMed Central

    Krystofova, Olga; Shestivska, Violetta; Galiova, Michaela; Novotny, Karel; Kaiser, Jozef; Zehnalek, Josef; Babula, Petr; Opatrilova, Radka; Adam, Vojtech; Kizek, Rene

    2009-01-01

    In this study, the influence of lead (II) ions on sunflower growth and biochemistry was investigated from various points of view. Sunflower plants were treated with 0, 10, 50, 100 and/or 500 μM Pb-EDTA for eight days. We observed alterations in growth in all experimental groups compared with non-treated control plants. Further, we determined the total content of proteins by a Bradford protein assay. By the eighth day of the experiment, total protein contents in all treated plants were much lower compared to control. Particularly noticeable was the loss of approx. 8 μg/mL or 15 μg/mL in shoots or roots of plants treated with 100 μM Pb-EDTA. We also focused our attention on the activity of alanine transaminase (ALT), aspartate transaminase (AST) and urease. Activity of the enzymes increased with increasing length of the treatment and applied concentration of lead (II) ions. This increase corresponds well with a higher metabolic activity of treated plants. Contents of cysteine, reduced glutathione (GSH), oxidized glutathione (GSSG) and phytochelatin 2 (PC2) were determined by high performance liquid chromatography with electrochemical detection. Cysteine content in roots declined with increasing treatment time and concentration of Pb-EDTA. Moreover, we observed ten times higher content of cysteine in roots in comparison with shoots. The observed reduction of cysteine content probably relates to its utilization for biosynthesis of GSH and phytochelatins, because the content of GSH and PC2 was similar in roots and shoots and increased with increased treatment time and concentration of Pb-EDTA. Moreover, we observed oxidative stress caused by Pb-EDTA in roots, where the GSSG/GSH ratio was about 0.66. In shoots, the oxidative stress was less distinctive, with a GSSG/GSH ratio of 0.14. We also estimated the rate of phytochelatin biosynthesis from the slope of linear equations plotted with data measured in the particular experimental group. The highest rate was detected in roots treated with 100 μM of Pb-EDTA. Many analytical instruments can be used to determine heavy metal ions; however, most of them can only quantify the total metal content. This problem can be overcome using laser induced breakdown spectroscopy, because it can provide spatially resolved distributions of metal ions in different types of materials, including plant tissues. Data obtained were used to assemble 3D maps of Pb and Mg distribution. The distribution of these elements is concentrated around the main vascular bundle of the leaf, that is, around the midrib. PMID:22346686

  19. Density functional study for crystalline structures and electronic properties of Si1- x Sn x binary alloys

    NASA Astrophysics Data System (ADS)

    Nagae, Yuki; Kurosawa, Masashi; Shibayama, Shigehisa; Araidai, Masaaki; Sakashita, Mitsuo; Nakatsuka, Osamu; Shiraishi, Kenji; Zaima, Shigeaki

    2016-08-01

    We have carried out density functional theory (DFT) calculations for the Si1-xSnx alloy and investigated the effect of the displacement of Si and Sn atoms with strain relaxation on the lattice constant and E-k dispersion. We calculated the formation probabilities for all atomic configurations of Si1-xSnx according to the Boltzmann distribution. The average lattice constant and E-k dispersion were weighted by the formation probability of each configuration of Si1-xSnx. We estimated the displacement of Si and Sn atoms from the initial tetrahedral site in the Si1-xSnx unit cell considering structural relaxation under hydrostatic pressure, and we found that the breaking of the degenerate electronic levels of the valence band edge could be caused by the breaking of the tetrahedral symmetry. We also calculated the E-k dispersion of the Si1-xSnx alloy by the DFT+U method and found that a Sn content above 50% would be required for the indirect-direct transition.
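
    The Boltzmann-weighted configuration average is the key averaging step; a sketch with hypothetical formation energies and lattice constants:

        import numpy as np

        # Hypothetical formation energies [eV] and lattice constants [angstrom]
        # for a handful of Si1-xSnx atomic configurations at fixed composition.
        energies = np.array([0.00, 0.03, 0.05, 0.08, 0.12])
        lattice = np.array([5.52, 5.53, 5.51, 5.54, 5.55])

        kT = 0.0259  # eV at room temperature

        # Boltzmann occupation probability of each configuration.
        w = np.exp(-(energies - energies.min()) / kT)
        w /= w.sum()

        a_avg = np.sum(w * lattice)   # probability-weighted average lattice constant
        print(np.round(w, 3), f"<a> = {a_avg:.3f} angstrom")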

  20. How to model a negligible probability under the WTO sanitary and phytosanitary agreement?

    PubMed

    Powell, Mark R

    2013-06-01

    Since the 1997 EC--Hormones decision, World Trade Organization (WTO) Dispute Settlement Panels have wrestled with the question of what constitutes a negligible risk under the Sanitary and Phytosanitary Agreement. More recently, the 2010 WTO Australia--Apples Panel focused considerable attention on the appropriate quantitative model for a negligible probability in a risk assessment. The 2006 Australian Import Risk Analysis for Apples from New Zealand translated narrative probability statements into quantitative ranges. The uncertainty about a "negligible" probability was characterized as a uniform distribution with a minimum value of zero and a maximum value of 10⁻⁶. The Australia--Apples Panel found that the use of this distribution would tend to overestimate the likelihood of "negligible" events and indicated that a triangular distribution with a most probable value of zero and a maximum value of 10⁻⁶ would correct the bias. The Panel observed that the midpoint of the uniform distribution is 5 × 10⁻⁷ but did not consider that the triangular distribution has an expected value of 3.3 × 10⁻⁷. Therefore, if this triangular distribution is the appropriate correction, the magnitude of the bias found by the Panel appears modest. The Panel's detailed critique of the Australian risk assessment, and the conclusions of the WTO Appellate Body about the materiality of flaws found by the Panel, may have important implications for the standard of review for risk assessments under the WTO SPS Agreement. © 2012 Society for Risk Analysis.
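
    The Panel's arithmetic is easy to reproduce: the uniform distribution on [0, 10⁻⁶] has mean 5 × 10⁻⁷, while the triangular distribution with mode 0 and maximum 10⁻⁶ has mean (0 + 0 + 10⁻⁶)/3 ≈ 3.3 × 10⁻⁷:

        import numpy as np

        rng = np.random.default_rng(13)
        n = 1_000_000
        upper = 1e-6

        uniform = rng.uniform(0.0, upper, n)
        triangular = rng.triangular(left=0.0, mode=0.0, right=upper, size=n)

        print(f"uniform mean:    {uniform.mean():.2e}  (exact 5.0e-07)")
        print(f"triangular mean: {triangular.mean():.2e}  (exact {upper/3:.2e})")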

  1. Fourier Method for Calculating Fission Chain Neutron Multiplicity Distributions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chambers, David H.; Chandrasekaran, Hema; Walston, Sean E.

    2017-03-27

    Here, a new way of utilizing the fast Fourier transform is developed to compute the probability distribution for a fission chain to create n neutrons. We then extend this technique to compute the probability distributions for detecting n neutrons. Lastly, our technique can be used for fission chains initiated by either a single neutron inducing a fission or by the spontaneous fission of another isotope.
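
    One compact way to realize this idea (a sketch in the same spirit, not the authors' exact formulation): the generating function H(z) of the total neutron count in a subcritical chain satisfies H(z) = z·f(H(z)), with f the per-neutron offspring generating function; iterating this on the unit circle and applying an inverse FFT recovers the probabilities. The fission parameters below are hypothetical:

        import numpy as np

        # Hypothetical per-fission neutron multiplicity pmf and the probability
        # that a neutron induces a fission (subcritical: p_fis * mean(nu) < 1).
        nu_pmf = np.array([0.03, 0.16, 0.34, 0.30, 0.13, 0.04])   # P(nu = 0..5)
        p_fis = 0.30

        def offspring_pgf(s):
            # f(s) = (1 - p) + p * sum_k q_k s^k : offspring PGF of one neutron
            return (1.0 - p_fis) + p_fis * np.polynomial.polynomial.polyval(s, nu_pmf)

        # Evaluate H(z) = z * f(H(z)) by fixed-point iteration on the unit circle
        # (a contraction here, since p_fis * mean(nu) < 1).
        m = 1024
        z = np.exp(2j * np.pi * np.arange(m) / m)
        h = np.zeros(m, dtype=complex)
        for _ in range(200):
            h = z * offspring_pgf(h)

        # Inverse FFT of the PGF samples yields P(chain contains n neutrons).
        probs = np.fft.ifft(h).real
        print(probs[1:6].round(4), "total =", probs.sum().round(6))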

  3. Teaching Uncertainties

    ERIC Educational Resources Information Center

    Duerdoth, Ian

    2009-01-01

    The subject of uncertainties (sometimes called errors) is traditionally taught (to first-year science undergraduates) towards the end of a course on statistics that defines probability as the limit of many trials, and discusses probability distribution functions and the Gaussian distribution. We show how to introduce students to the concepts of…

  4. Robust approaches to quantification of margin and uncertainty for sparse data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hund, Lauren; Schroeder, Benjamin B.; Rumsey, Kelin

    Characterizing the tails of probability distributions plays a key role in quantification of margins and uncertainties (QMU), where the goal is characterization of low probability, high consequence events based on continuous measures of performance. When data are collected using physical experimentation, probability distributions are typically fit using statistical methods based on the collected data, and these parametric distributional assumptions are often used to extrapolate about the extreme tail behavior of the underlying probability distribution. In this project, we characterize the risk associated with such tail extrapolation. Specifically, we conducted a scaling study to demonstrate the large magnitude of the risk; then, we developed new methods for communicating risk associated with tail extrapolation from unvalidated statistical models; lastly, we proposed a Bayesian data-integration framework to mitigate tail extrapolation risk through integrating additional information. We conclude that decision-making using QMU is a complex process that cannot be achieved using statistical analyses alone.
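
    The hazard is easy to demonstrate: fit a normal distribution to a small sample from a heavier-tailed population and extrapolate a far-tail probability:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(14)

        true_dist = stats.t(df=4)            # heavier-tailed "truth"
        sample = true_dist.rvs(size=30, random_state=rng)

        # Fit a normal to the sparse data, then extrapolate into the far tail.
        mu, sd = stats.norm.fit(sample)
        threshold = 5.0
        p_fit = stats.norm.sf(threshold, mu, sd)
        p_true = true_dist.sf(threshold)

        print(f"fitted-normal tail estimate: {p_fit:.2e}")
        print(f"true tail probability:       {p_true:.2e}")
        # The parametric extrapolation can be off by orders of magnitude even
        # when the fit looks adequate in the bulk of the data.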

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Hang, E-mail: hangchen@mit.edu; Thill, Peter; Cao, Jianshu

    In biochemical systems, intrinsic noise may drive the system to switch from one stable state to another. We investigate how kinetic switching between stable states in a bistable network is influenced by dynamic disorder, i.e., fluctuations in the rate coefficients. Using the geometric minimum action method, we first investigate the optimal transition paths and the corresponding minimum actions based on a genetic toggle switch model in which reaction coefficients are drawn from a discrete probability distribution. For the continuous probability distribution of the rate coefficient, we then consider two models of dynamic disorder in which reaction coefficients undergo different stochastic processes with the same stationary distribution. In one, the kinetic parameters follow a discrete Markov process and in the other they follow continuous Langevin dynamics. We find that regulation of the parameters modulating the dynamic disorder, as has been demonstrated to occur through allosteric control in bistable networks in the immune system, can be crucial in shaping the statistics of optimal transition paths, transition probabilities, and the stationary probability distribution of the network.
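
    A sketch in the flavor of the first model (all parameters hypothetical): overdamped Langevin dynamics in a double well whose depth coefficient itself fluctuates as a telegraph process, with barrier crossings counted:

        import numpy as np

        rng = np.random.default_rng(15)
        dt, n_steps = 5e-3, 400_000

        x = -1.0              # start in the left well of U(x) = k*(x^2 - 1)^2/4
        k = 1.0               # fluctuating coefficient (dynamic disorder)
        switch_rate = 0.5     # telegraph rate for jumps between k = 1 and k = 3
        noise = 0.35

        crossings = 0
        for _ in range(n_steps):
            if rng.random() < switch_rate * dt:   # telegraph jump in k
                k = 4.0 - k                       # toggles between 1 and 3
            # Euler-Maruyama step with drift -dU/dx = -k * x * (x^2 - 1).
            x_new = x - k * x * (x * x - 1.0) * dt + noise * np.sqrt(dt) * rng.normal()
            if x * x_new < 0.0:                   # count barrier crossings at x = 0
                crossings += 1
            x = x_new

        print("barrier crossings observed:", crossings)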

  6. Detecting background changes in environments with dynamic foreground by separating probability distribution function mixtures using Pearson's method of moments

    NASA Astrophysics Data System (ADS)

    Jenkins, Colleen; Jordan, Jay; Carlson, Jeff

    2007-02-01

    This paper presents parameter estimation techniques useful for detecting background changes in a video sequence with extreme foreground activity. A specific application of interest is automated detection of the covert placement of threats (e.g., a briefcase bomb) inside crowded public facilities. We propose that a histogram of pixel intensity acquired from a fixed-mounted camera over time for a series of images will be a mixture of two Gaussian functions: the foreground probability distribution function and the background probability distribution function. We use Pearson's Method of Moments to separate the two probability distribution functions. The background function can then be "remembered" and changes in the background can be detected. Subsequent comparisons of background estimates are used to detect changes. Changes are flagged to alert security forces to the presence and location of potential threats. Results are presented that indicate the significant potential for robust parameter estimation techniques as applied to video surveillance.
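
    Pearson's original method of moments solves a polynomial system in the mixture moments; as a stand-in for that closed-form solution, the sketch below matches the first five raw moments of a two-Gaussian mixture numerically with a least-squares solver. All intensity values are illustrative:

      import numpy as np
      from scipy.optimize import least_squares

      def gauss_raw_moments(mu, s):
          """First five raw moments of N(mu, s^2)."""
          v = s * s
          return np.array([mu,
                           mu**2 + v,
                           mu**3 + 3*mu*v,
                           mu**4 + 6*mu**2*v + 3*v**2,
                           mu**5 + 10*mu**3*v + 15*mu*v**2])

      def mixture_moments(theta):
          w, mu1, s1, mu2, s2 = theta
          return w * gauss_raw_moments(mu1, s1) + (1 - w) * gauss_raw_moments(mu2, s2)

      rng = np.random.default_rng(0)
      pix = np.concatenate([rng.normal(80, 10, 70_000),    # "foreground" intensities
                            rng.normal(140, 8, 30_000)])   # "background" intensities
      sample_m = np.array([np.mean(pix**k) for k in range(1, 6)])

      # Match the five sample moments with the five mixture parameters
      res = least_squares(lambda t: mixture_moments(t) / sample_m - 1.0,
                          x0=[0.5, 60.0, 15.0, 150.0, 15.0],
                          bounds=([0.01, 0, 1, 0, 1], [0.99, 255, 100, 255, 100]))
      w, mu1, s1, mu2, s2 = res.x
      print(f"w={w:.2f}  N({mu1:.1f},{s1:.1f})  N({mu2:.1f},{s2:.1f})")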

  7. Study of the organic matter in the DSDP /JOIDES/ cores, legs 10-15. [Deep Sea Drilling Program

    NASA Technical Reports Server (NTRS)

    Simoneit, B. R. T.; Burlingame, A. L.

    1974-01-01

    The composition of the organic matter collected on legs 10 to 15 of the DSDP (Deep Sea Drilling Project) is described. Distributions of various alkanes, carboxylic acids, steroids and terpenoids, isoprenoid ketones and olefins, and aromatic polycyclic compounds are given. Samples analyzed had terrigenous clay components, with variable organic carbon contents and thus diverse solvent soluble matter. The distribution patterns for the various compound series monitored were of marine derivation, with the terrigenous components superimposed. Diagenesis of steroids appeared to proceed via both stanones and stanols to their respective steranes. Degradative processes were observed to be operative: oxidative products, mainly ketones derived from steroids and phytol, were identified, probably due to microbial alteration prior to or during sedimentation. Loss of alkane and fatty acid C preferences and presence of polycyclic aromatics evinced maturation. Results indicate that the accumulation, degradation, diagenesis and maturation of organic matter occurs in various steps in the deep sea environment.

  8. Spatial distribution and pollution evaluation of heavy metals in Yangtze estuary sediment.

    PubMed

    Liu, Ruimin; Men, Cong; Liu, Yongyan; Yu, Wenwen; Xu, Fei; Shen, Zhenyao

    2016-09-15

    To analyze the spatial distribution patterns and ecological risks of heavy metals, 30 sediment samples were taken in the Yangtze River Estuary (YRE) in May 2011. The content of Al, As, Cr, Cu, Fe, Mn, Ni and Pb increased as follows: inner-region

  9. Results of irradiation of (U0.55Pu0.45)N and (U0.4Pu0.6)N fuels in BOR-60 up to ˜12 at.% burn-up

    NASA Astrophysics Data System (ADS)

    Rogozkin, B. D.; Stepennova, N. M.; Fedorov, Yu. Ye.; Shishkov, M. G.; Kryukov, F. N.; Kuzmin, S. V.; Nikitin, O. N.; Belyaeva, A. V.; Zabudko, L. M.

    2013-09-01

    This article presents the results of post-irradiation examinations of helium-bonded fuel pins with mixed mononitride fuel (U0.55Pu0.45)N and (U0.4Pu0.6)N of 85% density irradiated in the BOR-60 reactor. The maximum burn-ups achieved were, respectively, 9.4 and 12.1 at.% with maximum linear heat rates of 41.9 and 54.5 kW/m. The maximum irradiation dose was 43 dpa. No damage of the claddings made of ChS-68 steel (20% cold worked) was observed, and a ductility margin remained. The maximum depth of cladding corrosion was within 15 μm. Swelling rates of (U0.4Pu0.6)N and (U0.55Pu0.45)N were, respectively, ~1.1% and ~0.68% per 1 at.%. Gas release rates did not exceed 19.3% and 19%, respectively. The pattern of porosity distribution in the fuel influenced fuel swelling and gas release rates. Plutonium and uranium are uniformly distributed in the fuel, with local minima of their content caused by pores and cracks in the pellets. The observable peaks in the content distribution are probably connected with the local formation of isolated phases (e.g. Mo, Pd), while the minimum values correspond to fuel pores and cracks. Xenon and cesium tend to migrate from the hot sections of the fuel, and therefore their minimum content is observed in the central section of the fuel pellets. The phase composition of the fuel was determined with an X-ray diffractometer. The X-ray patterns of metallographic specimens were obtained by the scanning method (step 0.02°, step exposure 2 s). From the X-ray diffraction analysis data, it follows that the nitrides of both fuel types have a single-phase structure with an FCC lattice (see Table 6).

  10. Mechanisms of graviperception and response in pea seedlings

    NASA Technical Reports Server (NTRS)

    Galston, A. W.

    1984-01-01

    A new method for the mass isolation and purification of multigranular amyloplasts from the bundle sheath parenchyma of etiolated pea epicotyls was presented. These bodies, which displace within 2-3 minutes of exposure to 1 x g, are probably the gravity receptors (statoliths) in this plant. These amyloplasts were characterized as having a double membrane with a surface-localized ATPase, a high calcium content, and their own genomic DNA. These amyloplasts are investigated as to (a) the reasons for their especially high density, probably related to their starch content, (b) the possible identity of their DNA with the DNA of chloroplasts and unigranular amyloplasts, and (c) the possible importance of their high calcium content.

  11. p-adic stochastic hidden variable model

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrew

    1998-03-01

    We propose a stochastic hidden-variable model in which the hidden variables have a p-adic probability distribution ρ(λ), while the conditional probability distributions P(U,λ), U=A,A',B,B', are ordinary probabilities defined on the basis of the Kolmogorov measure-theoretic axiomatics. The frequency definition of p-adic probability is quite similar to the ordinary frequency definition of probability: p-adic frequency probability is defined as the limit of relative frequencies νn, but in the p-adic metric. We study a model with p-adic stochastics on the level of the hidden-variable description. But, of course, responses of macroapparatuses have to be described by ordinary stochastics. Thus our model describes a mixture of p-adic stochastics of the microworld and ordinary stochastics of macroapparatuses. In this model the probabilities for physical observables are ordinary probabilities. At the same time, Bell's inequality is violated.

  12. Stationary properties of maximum-entropy random walks.

    PubMed

    Dixit, Purushottam D

    2015-10-01

    Maximum-entropy (ME) inference of state probabilities using state-dependent constraints is popular in the study of complex systems. In stochastic systems, how state space topology and path-dependent constraints affect ME-inferred state probabilities remains unknown. To that end, we derive the transition probabilities and the stationary distribution of a maximum path entropy Markov process subject to state- and path-dependent constraints. A main finding is that the stationary distribution over states differs significantly from the Boltzmann distribution and reflects a competition between path multiplicity and imposed constraints. We illustrate our results with particle diffusion on a two-dimensional landscape. Connections with the path integral approach to diffusion are discussed.
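
    The canonical special case of this construction, the maximum-entropy random walk on an undirected graph, has a closed form: with (lambda, psi) the leading eigenpair of the adjacency matrix A, the transition matrix is P_ij = A_ij psi_j / (lambda psi_i) and the stationary distribution is pi_i proportional to psi_i^2. A small numpy sketch, with the graph chosen arbitrarily (this is the unconstrained case, not the paper's general path-constrained derivation):

      import numpy as np

      A = np.array([[0, 1, 1, 0],
                    [1, 0, 1, 0],
                    [1, 1, 0, 1],
                    [0, 0, 1, 0]], dtype=float)

      vals, vecs = np.linalg.eigh(A)
      lam, psi = vals[-1], np.abs(vecs[:, -1])     # Perron eigenpair (positive eigenvector)

      P = A * psi[None, :] / (lam * psi[:, None])  # MERW transition matrix
      pi = psi**2 / np.sum(psi**2)                 # stationary distribution, pi_i ~ psi_i**2

      print(P.sum(axis=1))   # rows sum to 1
      print(pi, pi @ P)      # pi is stationary: pi P = pi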

  13. Estimation and prediction of maximum daily rainfall at Sagar Island using best fit probability models

    NASA Astrophysics Data System (ADS)

    Mandal, S.; Choudhury, B. U.

    2015-07-01

    Sagar Island, lying on the continental shelf of the Bay of Bengal, is one of the deltas most vulnerable to extreme rainfall-driven climatic hazards. Information on the probability of occurrence of maximum daily rainfall is useful in devising risk management for sustaining the rainfed agrarian economy vis-a-vis food and livelihood security. Using six probability distribution models and long-term (1982-2010) daily rainfall data, we studied the probability of occurrence of annual, seasonal and monthly maximum daily rainfall (MDR) on the island. To select the best-fit distribution models for the annual, seasonal and monthly time series, based on maximum rank with minimum value of the test statistics, three statistical goodness-of-fit tests were employed: the Kolmogorov-Smirnov test (K-S), the Anderson-Darling test (A²) and the Chi-square test (χ²). The best-fit probability distribution was identified from the highest overall score obtained from the three goodness-of-fit tests. Results revealed that the normal probability distribution was best fitted for the annual, post-monsoon and summer season MDR, while the Lognormal, Weibull and Pearson 5 distributions were best fitted for the pre-monsoon, monsoon and winter seasons, respectively. The estimated annual MDR were 50, 69, 86, 106 and 114 mm for return periods of 2, 5, 10, 20 and 25 years, respectively. The probabilities of an annual MDR of >50, >100, >150, >200 and >250 mm were estimated as 99, 85, 40, 12 and 3% levels of exceedance, respectively. The monsoon, summer and winter seasons exhibited comparatively higher probabilities (78 to 85%) for MDR of >100 mm and moderate probabilities (37 to 46%) for >150 mm. For different recurrence intervals, the percent probability of MDR varied widely across intra- and inter-annual periods. On the island, rainfall anomaly can pose a climatic threat to the sustainability of agricultural production and thus needs adequate adaptation and mitigation measures.
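
    The selection procedure can be mimicked with standard tools: fit several candidate families, rank them by a goodness-of-fit statistic, and read return-period quantiles from the winner via q_T = F^(-1)(1 - 1/T). A sketch assuming synthetic stand-in data and only the K-S statistic, not the full three-test scoring used in the study:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(7)
      mdr = rng.lognormal(mean=4.0, sigma=0.45, size=29)   # stand-in for 29 annual MDR values (mm)

      candidates = {"normal": stats.norm, "lognormal": stats.lognorm,
                    "weibull": stats.weibull_min, "gumbel": stats.gumbel_r}

      best = None
      for name, dist in candidates.items():
          params = dist.fit(mdr)
          ks = stats.kstest(mdr, dist.cdf, args=params).statistic
          print(f"{name:9s} K-S = {ks:.3f}")
          if best is None or ks < best[0]:
              best = (ks, dist, params)

      _, dist, params = best
      for T in (2, 5, 10, 20, 25):                          # return periods (years)
          print(f"T = {T:2d} yr: MDR ~ {dist.ppf(1 - 1/T, *params):.0f} mm")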

  14. Quasi-probabilities in conditioned quantum measurement and a geometric/statistical interpretation of Aharonov's weak value

    NASA Astrophysics Data System (ADS)

    Lee, Jaeha; Tsutsui, Izumi

    2017-05-01

    We show that the joint behavior of an arbitrary pair of (generally noncommuting) quantum observables can be described by quasi-probabilities, which are an extended version of the standard probabilities used for describing the outcome of measurement for a single observable. The physical situations that require these quasi-probabilities arise when one considers quantum measurement of an observable conditioned by some other variable, with the notable example being the weak measurement employed to obtain Aharonov's weak value. Specifically, we present a general prescription for the construction of quasi-joint probability (QJP) distributions associated with a given combination of observables. These QJP distributions are introduced in two complementary approaches: one from a bottom-up, strictly operational construction realized by examining the mathematical framework of the conditioned measurement scheme, and the other from a top-down viewpoint realized by applying the results of the spectral theorem for normal operators and their Fourier transforms. It is then revealed that, for a pair of simultaneously measurable observables, the QJP distribution reduces to the unique standard joint probability distribution of the pair, whereas for a noncommuting pair there exists an inherent indefiniteness in the choice of such QJP distributions, admitting a multitude of candidates that may equally be used for describing the joint behavior of the pair. In the course of our argument, we find that the QJP distributions furnish the space of operators in the underlying Hilbert space with their characteristic geometric structures such that the orthogonal projections and inner products of observables can be given statistical interpretations as, respectively, “conditionings” and “correlations”. The weak value Aw for an observable A is then given a geometric/statistical interpretation as either the orthogonal projection of A onto the subspace generated by another observable B, or equivalently, as the conditioning of A given B with respect to the QJP distribution under consideration.
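
    The weak value itself is a one-line computation, A_w = <phi_f|A|psi_i> / <phi_f|psi_i>. A small sketch with arbitrarily chosen qubit states, showing the anomalous regime in which the weak value leaves the eigenvalue range:

      import numpy as np

      theta = 1.45                                      # post-selection nearly orthogonal to |0>
      psi_i = np.array([1.0, 0.0])                      # pre-selected state |0>
      phi_f = np.array([np.cos(theta), np.sin(theta)])  # post-selected state
      A = np.array([[0.0, 1.0], [1.0, 0.0]])            # observable: Pauli X, eigenvalues +/-1

      Aw = (phi_f.conj() @ A @ psi_i) / (phi_f.conj() @ psi_i)
      print(Aw)   # ~8.2, far outside [-1, 1]: an anomalous weak value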

  15. Uncertainty in modeled upper ocean heat content change

    NASA Astrophysics Data System (ADS)

    Tokmakian, Robin; Challenor, Peter

    2014-02-01

    This paper examines the uncertainty in the change in the heat content in the ocean component of a general circulation model. We describe the design and implementation of our statistical methodology. Using an ensemble of model runs and an emulator, we produce an estimate of the full probability distribution function (PDF) for the change in upper ocean heat in an Atmosphere/Ocean General Circulation Model, the Community Climate System Model v. 3, across a multi-dimensional input space. We show how the emulator of the GCM's heat content change and hence, the PDF, can be validated and how implausible outcomes from the emulator can be identified when compared to observational estimates of the metric. In addition, the paper describes how the emulator outcomes and related uncertainty information might inform estimates of the same metric from a multi-model Coupled Model Intercomparison Project phase 3 ensemble. We illustrate how to (1) construct an ensemble based on experiment design methods, (2) construct and evaluate an emulator for a particular metric of a complex model, (3) validate the emulator using observational estimates and explore the input space with respect to implausible outcomes and (4) contribute to the understanding of uncertainties within a multi-model ensemble. Finally, we estimate the most likely value for heat content change and its uncertainty for the model, with respect to both observations and the uncertainty in the value for the input parameters.

  16. Critical Values for Lawshe's Content Validity Ratio: Revisiting the Original Methods of Calculation

    ERIC Educational Resources Information Center

    Ayre, Colin; Scally, Andrew John

    2014-01-01

    The content validity ratio originally proposed by Lawshe is widely used to quantify content validity and yet methods used to calculate the original critical values were never reported. Methods for original calculation of critical values are suggested along with tables of exact binomial probabilities.
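
    The exact binomial calculation suggested here is short: under the null hypothesis each panelist rates an item "essential" with probability 0.5, so the critical number of "essential" ratings is the smallest n_e with P(X >= n_e) <= alpha, and the critical CVR follows from Lawshe's formula CVR = (n_e - N/2)/(N/2). A sketch, assuming a one-tailed alpha of 0.05:

      from scipy.stats import binom

      def cvr(n_essential, n_panel):
          """Lawshe's content validity ratio."""
          return (n_essential - n_panel / 2) / (n_panel / 2)

      def critical_cvr(n_panel, alpha=0.05):
          """Smallest CVR whose 'essential' count could not plausibly arise by
          chance (one-tailed exact binomial test with p = 0.5)."""
          for n_e in range(n_panel + 1):
              if binom.sf(n_e - 1, n_panel, 0.5) <= alpha:   # P(X >= n_e)
                  return cvr(n_e, n_panel)

      for n in (5, 8, 10, 15, 20, 40):
          print(n, round(critical_cvr(n), 3))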

  17. Two Universality Properties Associated with the Monkey Model of Zipf's Law

    NASA Astrophysics Data System (ADS)

    Perline, Richard; Perline, Ron

    2016-03-01

    The distribution of word probabilities in the monkey model of Zipf's law is associated with two universality properties: (1) the power-law exponent converges strongly to -1 as the alphabet size increases and the letter probabilities are specified as the spacings from a random division of the unit interval for any distribution with a bounded density function on [0,1]; and (2) on a logarithmic scale, the version of the model with a finite word-length cutoff and unequal letter probabilities is approximately normally distributed in the part of the distribution away from the tails. The first property is proved using a remarkably general limit theorem for the logarithm of sample spacings from Shao and Hahn, and the second property follows from Anscombe's central limit theorem for a random number of i.i.d. random variables. The finite word-length model leads to a hybrid Zipf-lognormal mixture distribution closely related to work in other areas.

  18. Computer routines for probability distributions, random numbers, and related functions

    USGS Publications Warehouse

    Kirby, W.

    1983-01-01

    Use of previously coded and tested subroutines simplifies and speeds up program development and testing. This report presents routines that can be used to calculate various probability distributions and other functions of importance in statistical hydrology. The routines are designed as general-purpose Fortran subroutines and functions to be called from user-written main programs. The probability distributions provided include the beta, chi-square, gamma, Gaussian (normal), Pearson Type III (tables and approximation), and Weibull. Also provided are the distributions of the Grubbs-Beck outlier test, Kolmogorov's and Smirnov's D, Student's t, noncentral t (approximate), and Snedecor's F. Other mathematical functions include the Bessel function I0, gamma and log-gamma functions, error functions, and the exponential integral. Auxiliary services include sorting and printer-plotting. Random number generators for uniform and normal numbers are provided and may be used with some of the above routines to generate numbers from other distributions. (USGS)

  19. Computer routines for probability distributions, random numbers, and related functions

    USGS Publications Warehouse

    Kirby, W.H.

    1980-01-01

    Use of previously coded and tested subroutines simplifies and speeds up program development and testing. This report presents routines that can be used to calculate various probability distributions and other functions of importance in statistical hydrology. The routines are designed as general-purpose Fortran subroutines and functions to be called from user-written main programs. The probability distributions provided include the beta, chi-square, gamma, Gaussian (normal), Pearson Type III (tables and approximation), and Weibull. Also provided are the distributions of the Grubbs-Beck outlier test, Kolmogorov's and Smirnov's D, Student's t, noncentral t (approximate), and Snedecor's F tests. Other mathematical functions include the Bessel function I0, gamma and log-gamma functions, error functions, and the exponential integral. Auxiliary services include sorting and printer plotting. Random number generators for uniform and normal numbers are provided and may be used with some of the above routines to generate numbers from other distributions. (USGS)

  20. Universal laws of human society's income distribution

    NASA Astrophysics Data System (ADS)

    Tao, Yong

    2015-10-01

    General equilibrium equations in economics play the same role as many-body Newtonian equations in physics. Accordingly, each solution of the general equilibrium equations can be regarded as a possible microstate of the economic system. Because Arrow's Impossibility Theorem and Rawls' principle of social fairness provide powerful support for the hypothesis of equal probability, the principle of maximum entropy applies in a just and equilibrated economy, so that a particular income distribution occurs spontaneously (with the largest probability). Remarkably, some scholars have observed such an income distribution in some democratic countries, e.g., the USA. This result implies that the hypothesis of equal probability may be suitable only for "fair" systems (economic or physical). In this sense, non-equilibrium systems may be "unfair", so that the hypothesis of equal probability is unavailable.

  1. Polynomial chaos representation of databases on manifolds

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soize, C., E-mail: christian.soize@univ-paris-est.fr; Ghanem, R., E-mail: ghanem@usc.edu

    2017-04-15

    Characterizing the polynomial chaos expansion (PCE) of a vector-valued random variable with probability distribution concentrated on a manifold is a relevant problem in data-driven settings. The probability distribution of such random vectors is multimodal in general, leading to potentially very slow convergence of the PCE. In this paper, we build on a recent development for estimating and sampling from probabilities concentrated on a diffusion manifold. The proposed methodology constructs a PCE of the random vector together with an associated generator that samples from the target probability distribution which is estimated from data concentrated in the neighborhood of the manifold. The method is robust and remains efficient for high dimension and large datasets. The resulting polynomial chaos construction on manifolds permits the adaptation of many uncertainty quantification and statistical tools to emerging questions motivated by data-driven queries.

  2. Gravitational lensing, time delay, and gamma-ray bursts

    NASA Technical Reports Server (NTRS)

    Mao, Shude

    1992-01-01

    The probability distributions of time delay in gravitational lensing by point masses and isolated galaxies (modeled as singular isothermal spheres) are studied. For point lenses (all with the same mass) the probability distribution is broad, with a peak at delta(t) of about 50 s; for singular isothermal spheres, the probability distribution is a rapidly decreasing function of increasing time delay, with a median delta(t) of about 1/h month, and its behavior depends sensitively on the luminosity function of galaxies. The present simplified calculation is particularly relevant to gamma-ray bursts if they are of cosmological origin. The frequency of 'recurrent' bursts due to gravitational lensing by galaxies is probably between 0.05 and 0.4 percent. Gravitational lensing can be used as a test of the cosmological origin of gamma-ray bursts.

  3. Reduction of mare basalts by sulfur loss

    USGS Publications Warehouse

    Brett, R.

    1976-01-01

    Metallic Fe content and S abundance are inversely correlated in mare basalts. Either S volatilization from the melt results in reduction of Fe2+ to Fe0 or else high S content decreases Fe0 activity in the melt, thus explaining the correlation. All considerations favor the model that metallic iron in mare basalts is due to sulfur loss. The Apollo 11 and 17 mare basalt melts were probably saturated with S at the time of eruption; the Apollo 12 and 15 basalts were probably not saturated. Non-mare rocks show a positive correlation of S abundance with metallic Fe content; it is proposed that this is due to the addition of meteoritic material having a fairly constant Fe0/S ratio. If true, metallic Fe content or S abundance in non-mare rocks provides a measure of degree of meteoritic contamination. © 1976.

  4. A general formula for computing maximum proportion correct scores in various psychophysical paradigms with arbitrary probability distributions of stimulus observations.

    PubMed

    Dai, Huanping; Micheyl, Christophe

    2015-05-01

    Proportion correct (Pc) is a fundamental measure of task performance in psychophysics. The maximum Pc score that can be achieved by an optimal (maximum-likelihood) observer in a given task is of both theoretical and practical importance, because it sets an upper limit on human performance. Within the framework of signal detection theory, analytical solutions for computing the maximum Pc score have been established for several common experimental paradigms under the assumption of Gaussian additive internal noise. However, as the scope of applications of psychophysical signal detection theory expands, the need is growing for psychophysicists to compute maximum Pc scores for situations involving non-Gaussian (internal or stimulus-induced) noise. In this article, we provide a general formula for computing the maximum Pc in various psychophysical experimental paradigms for arbitrary probability distributions of sensory activity. Moreover, easy-to-use MATLAB code implementing the formula is provided. Practical applications of the formula are illustrated, and its accuracy is evaluated, for two paradigms and two types of probability distributions (uniform and Gaussian). The results demonstrate that Pc scores computed using the formula remain accurate even for continuous probability distributions, as long as the conversion from continuous probability density functions to discrete probability mass functions is supported by a sufficiently high sampling resolution. We hope that the exposition in this article, and the freely available MATLAB code, facilitates calculations of maximum performance for a wider range of experimental situations, as well as explorations of the impact of different assumptions concerning internal-noise distributions on maximum performance in psychophysical experiments.
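
    Although the paper provides an analytical formula (not reproduced here), the same maximum Pc can be checked by Monte Carlo: the optimal 2AFC observer picks the interval with the larger likelihood ratio. A sketch for arbitrary scipy distributions; the Gaussian and uniform cases mirror the two distribution types evaluated in the article:

      import numpy as np
      from scipy import stats

      def max_pc_2afc(f_signal, f_noise, n=200_000, seed=0):
          """Monte Carlo maximum proportion correct in 2AFC: the ideal observer
          chooses the interval with the larger likelihood ratio."""
          rng = np.random.default_rng(seed)
          xs = f_signal.rvs(size=n, random_state=rng)
          xn = f_noise.rvs(size=n, random_state=rng)
          llr = lambda x: f_signal.logpdf(x) - f_noise.logpdf(x)
          l_s, l_n = llr(xs), llr(xn)
          return np.mean(l_s > l_n) + 0.5 * np.mean(l_s == l_n)  # ties split at random

      print(max_pc_2afc(stats.norm(1.0, 1.0), stats.norm(0.0, 1.0)))      # Gaussian, d'=1: ~0.76
      print(max_pc_2afc(stats.uniform(0.3, 1.0), stats.uniform(0.0, 1.0)))  # uniform case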

  5. Atom counting in HAADF STEM using a statistical model-based approach: methodology, possibilities, and inherent limitations.

    PubMed

    De Backer, A; Martinez, G T; Rosenauer, A; Van Aert, S

    2013-11-01

    In the present paper, a statistical model-based method to count the number of atoms of monotype crystalline nanostructures from high resolution high-angle annular dark-field (HAADF) scanning transmission electron microscopy (STEM) images is discussed in detail together with a thorough study on the possibilities and inherent limitations. In order to count the number of atoms, it is assumed that the total scattered intensity scales with the number of atoms per atom column. These intensities are quantitatively determined using model-based statistical parameter estimation theory. The distribution describing the probability that intensity values are generated by atomic columns containing a specific number of atoms is inferred on the basis of the experimental scattered intensities. Finally, the number of atoms per atom column is quantified using this estimated probability distribution. The number of atom columns available in the observed STEM image, the number of components in the estimated probability distribution, the width of the components of the probability distribution, and the typical shape of a criterion to assess the number of components in the probability distribution directly affect the accuracy and precision with which the number of atoms in a particular atom column can be estimated. It is shown that single atom sensitivity is feasible taking the latter aspects into consideration. © 2013 Elsevier B.V. All rights reserved.
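
    The core estimation step can be imitated with an off-the-shelf Gaussian mixture: fit mixtures with increasing numbers of components to the column intensities, pick the component count with an information criterion (BIC below, as a stand-in for the criterion discussed in the paper), and read each column's atom count from its component's rank. The intensities are simulated:

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(3)
      # Simulated column intensities for columns of 3, 4 or 5 atoms (arbitrary units)
      I = np.concatenate([rng.normal(3.0, 0.12, 120),
                          rng.normal(4.0, 0.12, 200),
                          rng.normal(5.0, 0.12, 80)]).reshape(-1, 1)

      # Select the number of mixture components with an information criterion
      models = [GaussianMixture(k, random_state=0).fit(I) for k in range(1, 8)]
      best = min(models, key=lambda m: m.bic(I))
      print("estimated number of thickness classes:", best.n_components)

      # Order components by mean intensity; a column's class is its component rank
      rank = np.empty(best.n_components, dtype=int)
      rank[np.argsort(best.means_.ravel())] = np.arange(best.n_components)
      print(np.bincount(rank[best.predict(I)]))   # columns per (relative) atom count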

  6. Glyphosate and AMPA distribution in wind-eroded sediment derived from loess soil.

    PubMed

    Bento, Célia P M; Goossens, Dirk; Rezaei, Mahrooz; Riksen, Michel; Mol, Hans G J; Ritsema, Coen J; Geissen, Violette

    2017-01-01

    Glyphosate is one of the most used herbicides in agricultural land worldwide. Wind-eroded sediment and dust, as an environmental transport pathway for glyphosate and its main metabolite aminomethylphosphonic acid (AMPA), can result in environmental and human exposure far beyond the agricultural areas where the herbicide was applied. Special attention is therefore required for the airborne transport of glyphosate and AMPA. In this study, we investigated the behavior of glyphosate and AMPA in wind-eroded sediment by measuring their content in different size fractions (median diameters between 715 and 8 μm) of a loess soil during a period of 28 days after glyphosate application. Granulometrical extraction was done using a wind tunnel and a Soil Fine Particle Extractor. Extractions were conducted on days 0, 3, 7, 14, 21 and 28 after glyphosate application. Results indicated that glyphosate and AMPA contents were significantly higher in the finest particle fractions (median diameters between 8 and 18 μm) and decreased significantly with increasing particle size. However, their content remained constant when aggregates were present in the sample. Glyphosate and AMPA contents correlated positively with clay, organic matter, and silt content. The dissipation of glyphosate over time was very low, most probably due to the low soil moisture content of the sediment; consequently, the formation of AMPA was also very low. The low dissipation of glyphosate in our study indicates that the risk of glyphosate transport in dry sediment to off-target areas by wind can be very high. The highest glyphosate and AMPA contents were found in the smallest soil fractions (PM10 and smaller), which are easily inhaled and therefore contribute to human exposure. Copyright © 2016 Elsevier Ltd. All rights reserved.

  7. Theoretical size distribution of fossil taxa: analysis of a null model

    PubMed Central

    Reed, William J; Hughes, Barry D

    2007-01-01

    Background: This article deals with the theoretical size distribution (of number of sub-taxa) of a fossil taxon arising from a simple null model of macroevolution. Model: New species arise through speciations occurring independently and at random at a fixed probability rate, while extinctions either occur independently and at random (background extinctions) or cataclysmically. In addition, new genera are assumed to arise through speciations of a very radical nature, again assumed to occur independently and at random at a fixed probability rate. Conclusion: The size distributions of the pioneering genus (following a cataclysm) and of derived genera are determined. Also the distribution of the number of genera is considered, along with a comparison of the probability of a monospecific genus with that of a monogeneric family. PMID:17376249

  8. A Submillimeter Continuum Survey of Local Dust-obscured Galaxies

    NASA Astrophysics Data System (ADS)

    Lee, Jong Chul; Hwang, Ho Seong; Lee, Gwang-Ho

    2016-12-01

    We conduct a 350 μm dust continuum emission survey of 17 dust-obscured galaxies (DOGs) at z = 0.05-0.08 with the Caltech Submillimeter Observatory (CSO). We detect 14 DOGs with S(350 μm) = 114-650 mJy and signal-to-noise ratios > 3. By including two additional DOGs with submillimeter data in the literature, we are able to study the dust content of a sample of 16 local DOGs, which consists of 12 bump and four power-law types. We determine their physical parameters with a two-component modified blackbody function model. The derived dust temperatures are in the range 57-122 K and 22-35 K for the warm and cold dust components, respectively. The total dust mass and the mass fraction of the warm dust component are 3-34 × 10⁷ M⊙ and 0.03%-2.52%, respectively. We compare these results with those of other submillimeter-detected infrared luminous galaxies. The bump DOGs, the majority of the DOG sample, show distributions of dust temperature and total dust mass similar to the comparison sample. The power-law DOGs show a hint of smaller dust masses than the other samples, but this needs to be tested with a larger sample. These findings support the idea that the heavy dust obscuration of DOGs reflects not the overall amount of dust, but probably its spatial distribution.

  9. Maximum entropy approach to statistical inference for an ocean acoustic waveguide.

    PubMed

    Knobles, D P; Sagers, J D; Koch, R A

    2012-02-01

    A conditional probability distribution suitable for estimating the statistical properties of ocean seabed parameter values inferred from acoustic measurements is derived from a maximum entropy principle. The specification of the expectation value for an error function constrains the maximization of an entropy functional. This constraint determines the sensitivity factor (β) to the error function of the resulting probability distribution, which is a canonical form that provides a conservative estimate of the uncertainty of the parameter values. From the conditional distribution, marginal distributions for individual parameters can be determined from integration over the other parameters. The approach is an alternative to obtaining the posterior probability distribution without an intermediary determination of the likelihood function followed by an application of Bayes' rule. In this paper the expectation value that specifies the constraint is determined from the values of the error function for the model solutions obtained from a sparse number of data samples. The method is applied to ocean acoustic measurements taken on the New Jersey continental shelf. The marginal probability distribution for the values of the sound speed ratio at the surface of the seabed and the source levels of a towed source are examined for different geoacoustic model representations. © 2012 Acoustical Society of America

  10. Probability distributions of continuous measurement results for conditioned quantum evolution

    NASA Astrophysics Data System (ADS)

    Franquet, A.; Nazarov, Yuli V.

    2017-02-01

    We address the statistics of continuous weak linear measurement on a few-state quantum system that is subject to a conditioned quantum evolution. For a conditioned evolution, both the initial and final states of the system are fixed: the latter is achieved by postselection at the end of the evolution. The statistics may drastically differ from the nonconditioned case, and the interference between initial and final states can be observed in the probability distributions of measurement outcomes as well as in the average values exceeding the conventional range of nonconditioned averages. We develop a proper formalism to compute the distributions of measurement outcomes, and evaluate and discuss the distributions in experimentally relevant setups. We demonstrate the manifestations of the interference between initial and final states in various regimes. We consider analytically simple examples of nontrivial probability distributions. We reveal peaks (or dips) at half-quantized values of the measurement outputs. We discuss in detail the case of zero overlap between initial and final states, demonstrating anomalously large average outputs and a sudden jump in the time-integrated output. We present and discuss the numerical evaluation of the probability distribution, aiming at extending the analytical results and describing a realistic experimental situation of a qubit in the regime of resonant fluorescence.

  11. Shallow slip amplification and enhanced tsunami hazard unravelled by dynamic simulations of mega-thrust earthquakes

    PubMed Central

    Murphy, S.; Scala, A.; Herrero, A.; Lorito, S.; Festa, G.; Trasatti, E.; Tonini, R.; Romano, F.; Molinari, I.; Nielsen, S.

    2016-01-01

    The 2011 Tohoku earthquake produced an unexpected large amount of shallow slip greatly contributing to the ensuing tsunami. How frequent are such events? How can they be efficiently modelled for tsunami hazard? Stochastic slip models, which can be computed rapidly, are used to explore the natural slip variability; however, they generally do not deal specifically with shallow slip features. We study the systematic depth-dependence of slip along a thrust fault with a number of 2D dynamic simulations using stochastic shear stress distributions and a geometry based on the cross section of the Tohoku fault. We obtain a probability density for the slip distribution, which varies both with depth, earthquake size and whether the rupture breaks the surface. We propose a method to modify stochastic slip distributions according to this dynamically-derived probability distribution. This method may be efficiently applied to produce large numbers of heterogeneous slip distributions for probabilistic tsunami hazard analysis. Using numerous M9 earthquake scenarios, we demonstrate that incorporating the dynamically-derived probability distribution does enhance the conditional probability of exceedance of maximum estimated tsunami wave heights along the Japanese coast. This technique for integrating dynamic features in stochastic models can be extended to any subduction zone and faulting style. PMID:27725733

  12. ACARA - AVAILABILITY, COST AND RESOURCE ALLOCATION

    NASA Technical Reports Server (NTRS)

    Viterna, L. A.

    1994-01-01

    ACARA is a program for analyzing availability, lifecycle cost, and resource scheduling. It uses a statistical Monte Carlo method to simulate a system's capacity states as well as component failure and repair. Component failures are modelled using a combination of exponential and Weibull probability distributions. ACARA schedules component replacement to achieve optimum system performance. The scheduling will comply with any constraints on component production, resupply vehicle capacity, on-site spares, or crew manpower and equipment. ACARA is capable of many types of analyses and trade studies because of its integrated approach. It characterizes the system performance in terms of both state availability and equivalent availability (a weighted average of state availability). It can determine the probability of exceeding a capacity state to assess reliability and loss of load probability. It can also evaluate the effect of resource constraints on system availability and lifecycle cost. ACARA interprets the results of a simulation and displays tables and charts for: (1) performance, i.e., availability and reliability of capacity states, (2) frequency of failure and repair, (3) lifecycle cost, including hardware, transportation, and maintenance, and (4) usage of available resources, including mass, volume, and maintenance man-hours. ACARA incorporates a user-friendly, menu-driven interface with full screen data entry. It provides a file management system to store and retrieve input and output datasets for system simulation scenarios. ACARA is written in APL2 using the APL2 interpreter for IBM PC compatible systems running MS-DOS. Hardware requirements for the APL2 system include 640K of RAM, 2Mb of extended memory, and an 80386 or 80486 processor with an 80x87 math co-processor. A dot matrix printer is required if the user wishes to print a graph from a results table. A sample MS-DOS executable is provided on the distribution medium. The executable contains licensed material from the APL2 for the IBM PC product which is program property of IBM; Copyright IBM Corporation 1988 - All rights reserved. It is distributed with IBM's permission. The standard distribution medium for this program is a set of three 5.25 inch 360K MS-DOS format diskettes. The contents of the diskettes are compressed using the PKWARE archiving tools. The utility to unarchive the files, PKUNZIP.EXE, is included. ACARA was developed in 1992.
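
    The Monte Carlo core of such an availability analysis is simple to sketch: sample Weibull (or exponential) times to failure, insert repair downtime, and accumulate the uptime fraction. The sketch below is a single-component toy, not ACARA's multi-state capacity model, and all parameters are illustrative:

      import numpy as np

      def simulate_availability(mission_hours=8760.0, weibull_k=1.5, weibull_scale=2000.0,
                                repair_hours=48.0, n_runs=2000, seed=0):
          """Monte Carlo availability of one repairable component whose failure
          times follow a Weibull distribution (exponential when k = 1)."""
          rng = np.random.default_rng(seed)
          up_total = 0.0
          for _ in range(n_runs):
              t, up = 0.0, 0.0
              while t < mission_hours:
                  ttf = weibull_scale * rng.weibull(weibull_k)   # time to failure
                  up += min(ttf, mission_hours - t)
                  t += ttf + repair_hours                        # downtime for repair
              up_total += up
          return up_total / (n_runs * mission_hours)

      print(simulate_availability())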

  13. A Distributed Cache Update Deployment Strategy in CDN

    NASA Astrophysics Data System (ADS)

    E, Xinhua; Zhu, Binjie

    2018-04-01

    The CDN management system distributes content objects to the edge of the internet so that users can access them nearby. The cache strategy is an important problem in network content distribution. A cache strategy was designed in which content diffuses effectively within the cache group, so that more content is stored in the cache and the group hit rate is improved.

  14. TK3 eBook Software to Author, Distribute, and Use Electronic Course Content for Medical Education

    ERIC Educational Resources Information Center

    Morton, David A.; Foreman, K. Bo; Goede, Patricia A.; Bezzant, John L.; Albertine, Kurt H.

    2007-01-01

    The methods for authoring and distributing course content are undergoing substantial changes due to advancement in computer technology. Paper has been the traditional method to author and distribute course content. Paper enables students to personalize content through highlighting and note taking but does not enable the incorporation of multimedia…

  15. A discussion on the origin of quantum probabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holik, Federico, E-mail: olentiev2@gmail.com; Departamento de Matemática - Ciclo Básico Común, Universidad de Buenos Aires - Pabellón III, Ciudad Universitaria, Buenos Aires; Sáenz, Manuel

    We study the origin of quantum probabilities as arising from non-Boolean propositional-operational structures. We apply the method developed by Cox to non-distributive lattices and develop an alternative formulation of non-Kolmogorovian probability measures for quantum mechanics. By generalizing the method presented in previous works, we outline a general framework for the deduction of probabilities in general propositional structures represented by lattices (including the non-distributive case). -- Highlights: •Several recent works use a derivation similar to that of R.T. Cox to obtain quantum probabilities. •We apply Cox’s method to the lattice of subspaces of the Hilbert space. •We obtain a derivation of quantum probabilities which includes mixed states. •The method presented in this work is susceptible to generalization. •It includes quantum mechanics and classical mechanics as particular cases.

  16. Probability Weighting Functions Derived from Hyperbolic Time Discounting: Psychophysical Models and Their Individual Level Testing.

    PubMed

    Takemura, Kazuhisa; Murakami, Hajime

    2016-01-01

    A probability weighting function (w(p)) is considered to be a nonlinear function of probability (p) in behavioral decision theory. This study proposes a psychophysical model of probability weighting functions derived from a hyperbolic time discounting model and a geometric distribution. The aim of the study is to show probability weighting functions from the point of view of waiting time for a decision maker. Since the expected value of a geometrically distributed random variable X is 1/p, we formulated the probability weighting function of the expected value model for hyperbolic time discounting as w(p) = (1 - k log p)^(-1). Moreover, the probability weighting function is derived from Loewenstein and Prelec's (1992) generalized hyperbolic time discounting model. The latter model is proved to be equivalent to the hyperbolic-logarithmic weighting function considered by Prelec (1998) and Luce (2001). In this study, we derive a model from the generalized hyperbolic time discounting model assuming Fechner's (1860) psychophysical law of time and a geometric distribution of trials. In addition, we develop median models of hyperbolic time discounting and generalized hyperbolic time discounting. To illustrate the fitness of each model, a psychological experiment was conducted to assess the probability weighting and value functions at the level of the individual participant. The participants were 50 university students. The results of individual analysis indicated that the expected value model of generalized hyperbolic discounting fitted better than previous probability weighting decision-making models. The theoretical implications of this finding are discussed.
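
    The proposed weighting function is directly computable. A sketch of w(p) = (1 - k log p)^(-1), showing that w(1) = 1 and that small probabilities are over-weighted relative to p; the value of k is illustrative:

      import numpy as np

      def w_expected(p, k=1.0):
          """Probability weighting function of the expected-value model for
          hyperbolic time discounting: w(p) = (1 - k*log(p))**-1."""
          p = np.asarray(p, dtype=float)
          return 1.0 / (1.0 - k * np.log(p))

      ps = np.array([0.01, 0.1, 0.3, 0.5, 0.9, 1.0])
      print(w_expected(ps, k=1.0))   # w(1) = 1; w(0.01) ~ 0.18 >> 0.01 (over-weighting)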

  17. Hybrid Approaches and Industrial Applications of Pattern Recognition,

    DTIC Science & Technology

    1980-10-01

    emphasized that the probability distribution in (9) is correct only under the assumption that P(w|x) is known exactly. In practice this assumption will...sufficient precision. The alternative would be to take the probability distribution of estimates of P(w|x) into account in the analysis. However, from the

  18. Generalized Success-Breeds-Success Principle Leading to Time-Dependent Informetric Distributions.

    ERIC Educational Resources Information Center

    Egghe, Leo; Rousseau, Ronald

    1995-01-01

    Reformulates the success-breeds-success (SBS) principle in informetrics in order to generate a general theory of source-item relationships. Topics include a time-dependent probability, a new model for the expected probability that is compared with the SBS principle with exact combinatorial calculations, classical frequency distributions, and…

  19. The beta distribution: A statistical model for world cloud cover

    NASA Technical Reports Server (NTRS)

    Falls, L. W.

    1973-01-01

    Much work has been performed in developing empirical global cloud cover models. This investigation was made to determine an underlying theoretical statistical distribution to represent worldwide cloud cover. The beta distribution, whose probability density function is given, is proposed to represent the variability of this random variable. It is shown that the beta distribution possesses the versatile statistical characteristics necessary to assume the wide variety of shapes exhibited by cloud cover. A total of 160 representative empirical cloud cover distributions were investigated, and the conclusion was reached that this study provides sufficient statistical evidence to accept the beta probability distribution as the underlying model for world cloud cover.
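
    Fitting the beta model to observed cloud-cover fractions is a two-parameter estimation on [0, 1]. A sketch with synthetic stand-in data, fixing the support so that only the two shape parameters are fit:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(5)
      cloud = rng.beta(0.6, 0.4, size=500)      # stand-in for observed cloud-cover fractions

      a, b, loc, scale = stats.beta.fit(cloud, floc=0, fscale=1)  # support fixed to [0, 1]
      print(f"alpha = {a:.2f}, beta = {b:.2f}")

      # Goodness of fit: K-S test against the fitted distribution
      print(stats.kstest(cloud, stats.beta(a, b).cdf))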

  20. Gas Hydrate Formation Probability Distributions: The Effect of Shear and Comparisons with Nucleation Theory.

    PubMed

    May, Eric F; Lim, Vincent W; Metaxas, Peter J; Du, Jianwei; Stanwix, Paul L; Rowland, Darren; Johns, Michael L; Haandrikman, Gert; Crosby, Daniel; Aman, Zachary M

    2018-03-13

    Gas hydrate formation is a stochastic phenomenon of considerable significance for any risk-based approach to flow assurance in the oil and gas industry. In principle, well-established results from nucleation theory offer the prospect of predictive models for hydrate formation probability in industrial production systems. In practice, however, heuristics are relied on when estimating formation risk for a given flowline subcooling or when quantifying kinetic hydrate inhibitor (KHI) performance. Here, we present statistically significant measurements of formation probability distributions for natural gas hydrate systems under shear, which are quantitatively compared with theoretical predictions. Distributions with over 100 points were generated using low-mass, Peltier-cooled pressure cells, cycled in temperature between 40 and -5 °C at up to 2 K·min⁻¹ and analyzed with robust algorithms that automatically identify hydrate formation and initial growth rates from dynamic pressure data. The application of shear had a significant influence on the measured distributions: at 700 rpm mass-transfer limitations were minimal, as demonstrated by the kinetic growth rates observed. The formation probability distributions measured at this shear rate had mean subcoolings consistent with theoretical predictions and steel-hydrate-water contact angles of 14-26°. However, the experimental distributions were substantially wider than predicted, suggesting that phenomena acting on macroscopic length scales are responsible for much of the observed stochastic formation. Performance tests of a KHI provided new insights into how such chemicals can reduce the risk of hydrate blockage in flowlines. Our data demonstrate that the KHI not only reduces the probability of formation (by both shifting and sharpening the distribution) but also reduces hydrate growth rates by a factor of 2.

  1. Quantum work in the Bohmian framework

    NASA Astrophysics Data System (ADS)

    Sampaio, R.; Suomela, S.; Ala-Nissila, T.; Anders, J.; Philbin, T. G.

    2018-01-01

    At nonzero temperature classical systems exhibit statistical fluctuations of thermodynamic quantities arising from the variation of the system's initial conditions and its interaction with the environment. The fluctuating work, for example, is characterized by the ensemble of system trajectories in phase space and, by including the probabilities for various trajectories to occur, a work distribution can be constructed. However, without phase-space trajectories, the task of constructing a work probability distribution in the quantum regime has proven elusive. Here we use quantum trajectories in phase space and define fluctuating work as power integrated along the trajectories, in complete analogy to classical statistical physics. The resulting work probability distribution is valid for any quantum evolution, including cases with coherences in the energy basis. We demonstrate the quantum work probability distribution and its properties with an exactly solvable example of a driven quantum harmonic oscillator. An important feature of the work distribution is its dependence on the initial statistical mixture of pure states, which is reflected in higher moments of the work. The proposed approach introduces a fundamentally different perspective on quantum thermodynamics, allowing full thermodynamic characterization of the dynamics of quantum systems, including the measurement process.

  2. Applying the log-normal distribution to target detection

    NASA Astrophysics Data System (ADS)

    Holst, Gerald C.

    1992-09-01

    Holst and Pickard experimentally determined that MRT responses tend to follow a log-normal distribution. The log-normal distribution appeared reasonable because nearly all visual psychological data are plotted on a logarithmic scale. It has the additional advantage of being bounded to positive values; an important consideration since probability of detection is often plotted in linear coordinates. A review of published data suggests that the log-normal distribution may have universal applicability. Specifically, the log-normal distribution obtained from MRT tests appears to fit both the target transfer function and the probability of detection of rectangular targets.

  3. Tsunami Size Distributions at Far-Field Locations from Aggregated Earthquake Sources

    NASA Astrophysics Data System (ADS)

    Geist, E. L.; Parsons, T.

    2015-12-01

    The distribution of tsunami amplitudes at far-field tide gauge stations is explained by aggregating the probability of tsunamis derived from individual subduction zones and scaled by their seismic moment. The observed tsunami amplitude distributions of both continental (e.g., San Francisco) and island (e.g., Hilo) stations distant from subduction zones are examined. Although the observed probability distributions nominally follow a Pareto (power-law) distribution, there are significant deviations. Some stations exhibit varying degrees of tapering of the distribution at high amplitudes and, in the case of the Hilo station, there is a prominent break in slope on log-log probability plots. There are also differences in the slopes of the observed distributions among stations that can be significant. To explain these differences we first estimate seismic moment distributions of observed earthquakes for major subduction zones. Second, regression models are developed that relate the tsunami amplitude at a station to seismic moment at a subduction zone, correcting for epicentral distance. The seismic moment distribution is then transformed to a site-specific tsunami amplitude distribution using the regression model. Finally, a mixture distribution is developed, aggregating the transformed tsunami distributions from all relevant subduction zones. This mixture distribution is compared to the observed distribution to assess the performance of the method described above. This method allows us to estimate the largest tsunami that can be expected in a given time period at a station.

  4. Application of at-site peak-streamflow frequency analyses for very low annual exceedance probabilities

    USGS Publications Warehouse

    Asquith, William H.; Kiang, Julie E.; Cohn, Timothy A.

    2017-07-17

    The U.S. Geological Survey (USGS), in cooperation with the U.S. Nuclear Regulatory Commission, has investigated statistical methods for probabilistic flood hazard assessment to provide guidance on very low annual exceedance probability (AEP) estimation of peak-streamflow frequency and the quantification of corresponding uncertainties using streamgage-specific data. The term “very low AEP” implies exceptionally rare events defined as those having AEPs less than about 0.001 (or 1 × 10–3 in scientific notation or for brevity 10–3). Such low AEPs are of great interest to those involved with peak-streamflow frequency analyses for critical infrastructure, such as nuclear power plants. Flood frequency analyses at streamgages are most commonly based on annual instantaneous peak streamflow data and a probability distribution fit to these data. The fitted distribution provides a means to extrapolate to very low AEPs. Within the United States, the Pearson type III probability distribution, when fit to the base-10 logarithms of streamflow, is widely used, but other distribution choices exist. The USGS-PeakFQ software, implementing the Pearson type III within the Federal agency guidelines of Bulletin 17B (method of moments) and updates to the expected moments algorithm (EMA), was specially adapted for an “Extended Output” user option to provide estimates at selected AEPs from 10–3 to 10–6. Parameter estimation methods, in addition to product moments and EMA, include L-moments, maximum likelihood, and maximum product of spacings (maximum spacing estimation). This study comprehensively investigates multiple distributions and parameter estimation methods for two USGS streamgages (01400500 Raritan River at Manville, New Jersey, and 01638500 Potomac River at Point of Rocks, Maryland). The results of this study specifically involve the four methods for parameter estimation and up to nine probability distributions, including the generalized extreme value, generalized log-normal, generalized Pareto, and Weibull. Uncertainties in streamflow estimates for corresponding AEP are depicted and quantified as two primary forms: quantile (aleatoric [random sampling] uncertainty) and distribution-choice (epistemic [model] uncertainty). Sampling uncertainties of a given distribution are relatively straightforward to compute from analytical or Monte Carlo-based approaches. Distribution-choice uncertainty stems from choices of potentially applicable probability distributions for which divergence among the choices increases as AEP decreases. Conventional goodness-of-fit statistics, such as Cramér-von Mises, and L-moment ratio diagrams are demonstrated in order to hone distribution choice. The results generally show that distribution choice uncertainty is larger than sampling uncertainty for very low AEP values.
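
    A bare-bones method-of-moments version of the Pearson type III fit to log-transformed peaks (the "log-Pearson III" of Bulletin 17B, without EMA, regional skew weighting, or low-outlier handling) can be sketched as follows; the peak-flow values are illustrative, not data from the cited streamgages:

      import numpy as np
      from scipy import stats

      peaks = np.array([12000., 8500., 23000., 15400., 9800., 31000., 7600.,
                        18700., 11200., 26500., 14100., 9100., 21000., 16800.])  # annual peaks, cfs

      logq = np.log10(peaks)
      m, s = logq.mean(), logq.std(ddof=1)
      g = stats.skew(logq, bias=False)               # sample skew of the log10 flows

      for aep in (1e-2, 1e-3, 1e-4, 1e-6):
          q = 10 ** stats.pearson3.ppf(1 - aep, g, loc=m, scale=s)
          print(f"AEP {aep:.0e}: {q:,.0f} cfs")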

  5. Polynomial probability distribution estimation using the method of moments

    PubMed Central

    Munkhammar, Joakim; Mattsson, Lars; Rydén, Jesper

    2017-01-01

    We suggest a procedure for estimating Nth degree polynomial approximations to unknown (or known) probability density functions (PDFs) based on N statistical moments from each distribution. The procedure is based on the method of moments and is set up algorithmically to aid applicability and to ensure rigor in use. In order to show applicability, polynomial PDF approximations are obtained for the distribution families Normal, Log-Normal, Weibull as well as for a bimodal Weibull distribution and a data set of anonymized household electricity use. The results are compared with results for traditional PDF series expansion methods of Gram–Charlier type. It is concluded that this procedure is a comparatively simple procedure that could be used when traditional distribution families are not applicable or when polynomial expansions of probability distributions might be considered useful approximations. In particular this approach is practical for calculating convolutions of distributions, since such operations become integrals of polynomial expressions. Finally, in order to show an advanced applicability of the method, it is shown to be useful for approximating solutions to the Smoluchowski equation. PMID:28394949

  6. Polynomial probability distribution estimation using the method of moments.

    PubMed

    Munkhammar, Joakim; Mattsson, Lars; Rydén, Jesper

    2017-01-01

    We suggest a procedure for estimating Nth degree polynomial approximations to unknown (or known) probability density functions (PDFs) based on N statistical moments from each distribution. The procedure is based on the method of moments and is set up algorithmically to aid applicability and to ensure rigor in use. In order to show applicability, polynomial PDF approximations are obtained for the distribution families Normal, Log-Normal, Weibull as well as for a bimodal Weibull distribution and a data set of anonymized household electricity use. The results are compared with results for traditional PDF series expansion methods of Gram-Charlier type. It is concluded that this procedure is a comparatively simple procedure that could be used when traditional distribution families are not applicable or when polynomial expansions of probability distributions might be considered useful approximations. In particular this approach is practical for calculating convolutions of distributions, since such operations become integrals of polynomial expressions. Finally, in order to show an advanced applicability of the method, it is shown to be useful for approximating solutions to the Smoluchowski equation.
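
    On a bounded interval the procedure reduces to a linear solve: requiring the k-th raw moment of p(x) = sum_j a_j x^j on [0, 1] to equal m_k gives sum_j a_j / (k + j + 1) = m_k, a Hilbert-type linear system in the coefficients. A sketch under the assumption of support [0, 1], verified against a Beta(2, 2) density whose moments are known exactly:

      import numpy as np
      from scipy.special import beta as beta_fn

      def polynomial_pdf_from_moments(moments):
          """Coefficients a_0..a_N of p(x) = sum_j a_j x**j on [0, 1] reproducing
          raw moments m_0..m_N (m_0 = 1): sum_j a_j / (k + j + 1) = m_k."""
          m = np.asarray(moments, dtype=float)
          n = len(m)
          H = 1.0 / (np.arange(n)[:, None] + np.arange(n)[None, :] + 1.0)  # Hilbert-type matrix
          return np.linalg.solve(H, m)

      # Check against Beta(2, 2), density 6x(1 - x), moments m_k = B(2 + k, 2) / B(2, 2)
      moments = [beta_fn(2 + k, 2) / beta_fn(2, 2) for k in range(4)]
      print(polynomial_pdf_from_moments(moments))   # ~ [0, 6, -6, 0]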

  7. The Spiral of Life

    NASA Astrophysics Data System (ADS)

    Cajiao Vélez, F.; Kamiński, J. Z.; Krajewska, K.

    2018-04-01

    High-energy photoionization driven by short and circularly-polarized laser pulses is studied in the framework of the relativistic strong-field approximation. The saddle-point analysis of the integrals defining the probability amplitude is used to determine the general properties of the probability distributions. Additionally, an approximate solution to the saddle-point equation is derived. This leads to the concept of the three-dimensional spiral of life in momentum space, around which the ionization probability distribution is maximum. We demonstrate that such spiral is also obtained from a classical treatment.

  8. Count data, detection probabilities, and the demography, dynamics, distribution, and decline of amphibians.

    PubMed

    Schmidt, Benedikt R

    2003-08-01

    The evidence for amphibian population declines is based on count data that were not adjusted for detection probabilities. Such data are not reliable even when collected using standard methods. The formula C = Np (where C is a count, N the true parameter value, and p is a detection probability) relates count data to demography, population size, or distributions. With unadjusted count data, one assumes a linear relationship between C and N and that p is constant. These assumptions are unlikely to be met in studies of amphibian populations. Amphibian population data should be based on methods that account for detection probabilities.

  9. Bayesian data analysis tools for atomic physics

    NASA Astrophysics Data System (ADS)

    Trassinelli, Martino

    2017-10-01

    We present an introduction to some concepts of Bayesian data analysis in the context of atomic physics. Starting from the basic rules of probability, we present Bayes' theorem and its applications. In particular, we discuss how to calculate simple and joint probability distributions and the Bayesian evidence, a model-dependent quantity that allows one to assign probabilities to different hypotheses from the analysis of the same data set. To give some practical examples, these methods are applied to two concrete cases. In the first example, the presence or absence of a satellite line in an atomic spectrum is investigated. In the second example, we determine the most probable model among a set of possible profiles from the analysis of a statistically poor spectrum. We also show how to calculate the probability distribution of the main spectral component without having to determine the spectrum model uniquely. For these two studies, we implement the program Nested_fit to calculate the different probability distributions and other related quantities. Nested_fit is a Fortran90/Python code developed during recent years for the analysis of atomic spectra. As indicated by the name, it is based on the nested sampling algorithm, which is presented in detail together with the program itself.
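
    A toy version of the model-comparison step can be written in a few lines. The sketch below, with invented data, fixed noise, and uniform priors on a grid, computes the Bayesian evidence for "background only" versus "background plus line" and the resulting Bayes factor; Nested_fit replaces the grid integration with nested sampling, which scales to realistic problems.

      import numpy as np

      # Invented spectrum: flat background plus a possible line at channel 50.
      x = np.arange(100)
      rng = np.random.default_rng(0)
      line = np.exp(-0.5 * ((x - 50) / 3.0) ** 2)
      y = 10.0 + 4.0 * line + rng.normal(0.0, 1.0, x.size)

      def loglike(model):
          return -0.5 * np.sum((y - model) ** 2)  # Gaussian noise, sigma = 1

      # Model 1: background b only, uniform prior on [0, 20]. On a uniform
      # grid with a uniform prior, the evidence is the mean likelihood.
      b = np.linspace(0.0, 20.0, 400)
      Z1 = np.mean([np.exp(loglike(bi)) for bi in b])

      # Model 2: background plus line amplitude a (position and width fixed).
      a = np.linspace(0.0, 10.0, 200)
      Z2 = np.mean([np.exp(loglike(bi + ai * line)) for bi in b for ai in a])

      print("Bayes factor Z2/Z1 =", Z2 / Z1)  # >> 1 favours the line model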

  10. Sandpile-based model for capturing magnitude distributions and spatiotemporal clustering and separation in regional earthquakes

    NASA Astrophysics Data System (ADS)

    Batac, Rene C.; Paguirigan, Antonino A., Jr.; Tarun, Anjali B.; Longjas, Anthony G.

    2017-04-01

    We propose a cellular automata model for earthquake occurrences patterned after the sandpile model of self-organized criticality (SOC). By incorporating a single parameter describing the probability to target the most susceptible site, the model successfully reproduces the statistical signatures of seismicity. The energy distributions closely follow power-law probability density functions (PDFs) with a scaling exponent of around -1.6, consistent with the expectations of the Gutenberg-Richter (GR) law, for a wide range of targeted triggering probability values. Additionally, for targeted triggering probabilities within the range 0.004-0.007, we observe spatiotemporal distributions that show bimodal behavior, which is not observed for the original sandpile. For this critical range of probability values, the model statistics compare remarkably well with long-period empirical data from earthquakes in different seismogenic regions. The key advantage of the proposed model is that it simultaneously captures the energy, space, and time statistics of earthquakes while introducing only a single parameter into the simple rules of the sandpile. We believe that the critical targeting probability parameterizes the memory that is inherently present in earthquake-generating regions.
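
    The rules are simple enough to reproduce in a few dozen lines. The following sketch is a guess at the model's essentials from the description above: a standard Abelian sandpile in which each grain is dropped on the most susceptible (highest) site with probability q and on a random site otherwise, with avalanche sizes recorded.

      import numpy as np

      rng = np.random.default_rng(0)
      L, q, steps = 32, 0.005, 50_000
      z = rng.integers(0, 4, size=(L, L))  # sandpile heights
      sizes = []

      for _ in range(steps):
          if rng.random() < q:             # target the most susceptible site
              i, j = np.unravel_index(np.argmax(z), z.shape)
          else:                            # otherwise a uniformly random site
              i, j = rng.integers(0, L, 2)
          z[i, j] += 1
          size = 0
          while True:                      # relax all unstable sites
              ii, jj = np.where(z >= 4)
              if ii.size == 0:
                  break
              size += ii.size
              for r, c in zip(ii, jj):
                  z[r, c] -= 4
                  for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                      nr, nc = r + dr, c + dc
                      if 0 <= nr < L and 0 <= nc < L:
                          z[nr, nc] += 1   # grains fall off at open boundaries
          if size:
              sizes.append(size)

      # Log-binned avalanche sizes; a power law is roughly linear on log axes.
      hist, edges = np.histogram(sizes, bins=np.logspace(0, 4, 20))
      print(hist)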

  11. Effects of vibration and shock on the performance of gas-bearing space-power Brayton cycle turbomachinery. Part 3: Sinusoidal and random vibration data reduction and evaluation, and random vibration probability analysis

    NASA Technical Reports Server (NTRS)

    Tessarzik, J. M.; Chiang, T.; Badgley, R. H.

    1973-01-01

    The random vibration response of a gas bearing rotor support system has been experimentally and analytically investigated in the amplitude and frequency domains. The NASA Brayton Rotating Unit (BRU), a 36,000 rpm, 10 KWe turbogenerator, had previously been subjected in the laboratory to external random vibrations, and the response data recorded on magnetic tape. These data have now been analyzed for amplitude distribution and frequency content. The results of the power spectral density analysis indicate strong vibration responses for the major rotor-bearing system components at frequencies which correspond closely to their resonant frequencies obtained under periodic vibration testing. The results of the amplitude analysis indicate an increasing shift towards non-Gaussian distributions as the input level of external vibrations is raised. Analysis of the axial random vibration response of the BRU was performed by using a linear three-mass model. Power spectral densities and the root-mean-square value of thrust bearing surface contact were calculated for specified input random excitation.

  12. Fingerprint multicast in secure video streaming.

    PubMed

    Zhao, H Vicky; Liu, K J Ray

    2006-01-01

    Digital fingerprinting is an emerging technology to protect multimedia content from illegal redistribution, in which each distributed copy is labeled with unique identification information. In video streaming, huge amounts of data have to be transmitted to a large number of users under stringent latency constraints, so the bandwidth-efficient distribution of uniquely fingerprinted copies is crucial. This paper investigates the secure multicast of anticollusion fingerprinted video in streaming applications and analyzes its performance. We first propose a general fingerprint multicast scheme that can be used with most spread spectrum embedding-based multimedia fingerprinting systems. To further improve the bandwidth efficiency, we explore the special structure of the fingerprint design and propose a joint fingerprint design and distribution scheme. From our simulations, the two proposed schemes can reduce the bandwidth requirement by 48% to 87%, depending on the number of users, the characteristics of the video sequences, and the network and computation constraints. We also show that under the constraint that all colluders have the same probability of detection, the embedded fingerprints in the two schemes have approximately the same collusion resistance. Finally, we propose a fingerprint drift compensation scheme to improve the quality of the reconstructed sequences at the decoder's side without introducing extra communication overhead.

  13. Characterising RNA secondary structure space using information entropy

    PubMed Central

    2013-01-01

    Comparative methods for RNA secondary structure prediction use evolutionary information from RNA alignments to increase prediction accuracy. The model is often described in terms of stochastic context-free grammars (SCFGs), which generate a probability distribution over secondary structures. It is, however, unclear how this probability distribution changes as a function of the input alignment. As prediction programs typically only return a single secondary structure, better characterisation of the underlying probability space of RNA secondary structures is of great interest. In this work, we show how to efficiently compute the information entropy of the probability distribution over RNA secondary structures produced for RNA alignments by a phylo-SCFG, and implement it for the PPfold model. We also discuss interpretations and applications of this quantity, including how it can clarify reasons for low prediction reliability scores. PPfold and its source code are available from http://birc.au.dk/software/ppfold/. PMID:23368905

  14. The probability distribution of side-chain conformations in [Leu] and [Met]enkephalin determines the potency and selectivity to mu and delta opiate receptors.

    PubMed

    Nielsen, Bjørn G; Jensen, Morten Ø; Bohr, Henrik G

    2003-01-01

    The structure of enkephalin, a small neuropeptide with five amino acids, has been simulated on computers using molecular dynamics. Such simulations exhibit a few stable conformations, which also have been identified experimentally. The simulations provide the possibility to perform cluster analysis in the space defined by potentially pharmacophoric measures such as dihedral angles, side-chain orientation, etc. By analyzing the statistics of the resulting clusters, the probability distribution of the side-chain conformations may be determined. These probabilities allow us to predict the selectivity of [Leu]enkephalin and [Met]enkephalin to the known mu- and delta-type opiate receptors to which they bind as agonists. Other plausible consequences of these probability distributions are discussed in relation to the way in which they may influence the dynamics of the synapse. Copyright 2003 Wiley Periodicals, Inc. Biopolymers (Pept Sci) 71: 577-592, 2003

  15. A model for the distribution of watermarked digital content on mobile networks

    NASA Astrophysics Data System (ADS)

    Frattolillo, Franco; D'Onofrio, Salvatore

    2006-10-01

    Although digital watermarking can be considered one of the key technologies for implementing the copyright protection of digital contents distributed on the Internet, most of the content distribution models based on watermarking protocols proposed in the literature have been purposely designed for fixed networks and cannot be easily adapted to mobile networks. On the contrary, the use of mobile devices currently enables new types of services and business models, and this makes the development of new content distribution models for mobile environments strategic in the current scenario of the Internet. This paper presents and discusses a distribution model of watermarked digital contents for such environments that achieves a trade-off between the needs of efficiency and security.

  16. Exact probability distribution function for the volatility of cumulative production

    NASA Astrophysics Data System (ADS)

    Zadourian, Rubina; Klümper, Andreas

    2018-04-01

    In this paper we study the volatility and its probability distribution function for the cumulative production based on the experience curve hypothesis. This work presents a generalization of the study of volatility in Lafond et al. (2017), which addressed the effects of normally distributed noise in the production process. Due to its wide applicability in industrial and technological activities, we present here the mathematical foundation for an arbitrary distribution function of the process, which we expect will pave the way for future research on forecasting of the production process.

  17. Statistics of intensity in adaptive-optics images and their usefulness for detection and photometry of exoplanets.

    PubMed

    Gladysz, Szymon; Yaitskova, Natalia; Christou, Julian C

    2010-11-01

    This paper is an introduction to the problem of modeling the probability density function of adaptive-optics speckle. We show that with the modified Rician distribution one cannot describe the statistics of light on axis. A dual solution is proposed: the modified Rician distribution for off-axis speckle and gamma-based distribution for the core of the point spread function. From these two distributions we derive optimal statistical discriminators between real sources and quasi-static speckles. In the second part of the paper the morphological difference between the two probability density functions is used to constrain a one-dimensional, "blind," iterative deconvolution at the position of an exoplanet. Separation of the probability density functions of signal and speckle yields accurate differential photometry in our simulations of the SPHERE planet finder instrument.
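
    As a sketch of the discrimination idea (not the authors' exact statistics), one can fit both candidate models to a sample by maximum likelihood and threshold the log-likelihood ratio; here the sample is simulated from scipy's Rice distribution, standing in for off-axis speckle amplitudes.

      import numpy as np
      from scipy import stats

      # Simulated speckle-like amplitudes; a real source would look different.
      sample = stats.rice.rvs(b=0.5, scale=1.0, size=2000,
                              random_state=np.random.default_rng(2))

      rice_params = stats.rice.fit(sample, floc=0)    # MLE fits of both models
      gamma_params = stats.gamma.fit(sample, floc=0)

      ll_rice = stats.rice.logpdf(sample, *rice_params).sum()
      ll_gamma = stats.gamma.logpdf(sample, *gamma_params).sum()

      # Positive ratio: the sample looks more like speckle than like a source.
      print("log likelihood ratio (Rice - gamma):", ll_rice - ll_gamma)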

  18. Small-Scale Spatio-Temporal Distribution of Bactrocera minax (Enderlein) (Diptera: Tephritidae) Using Probability Kriging.

    PubMed

    Wang, S Q; Zhang, H Y; Li, Z L

    2016-10-01

    Understanding the spatio-temporal distribution of a pest in orchards can provide important information that could be used to design monitoring schemes and establish better means for pest control. In this study, the spatial and temporal distribution of Bactrocera minax (Enderlein) (Diptera: Tephritidae) was assessed, and activity trends were evaluated by using probability kriging. Adults of B. minax were captured over two successive occurrences in a small-scale citrus orchard by using food bait traps, which were placed both inside and outside the orchard. The weekly spatial distribution of B. minax within the orchard and adjacent woods was examined using semivariogram parameters. Edge concentration was observed during most weeks of the adult occurrence, and the adult population aggregated with high probability within a band less than 100 m wide on both sides of the boundary between the orchard and the woods. The sequential probability kriged maps showed that the adults were estimated with higher probability in the marginal zone, especially in the early and peak stages. The feeding, ovipositing, and mating behaviors of B. minax are possible explanations for these spatio-temporal patterns. Therefore, the spatial arrangement of traps or spraying spots and their distance to the forest edge should be considered to enhance control of B. minax in small-scale orchards.

  19. Quantitative Measurements of Autobiographical Memory Content

    PubMed Central

    Mainetti, Matteo; Ascoli, Giorgio A.

    2012-01-01

    Autobiographical memory (AM), subjective recollection of past experiences, is fundamental in everyday life. Nevertheless, characterization of the spontaneous occurrence of AM, as well as of the number and types of recollected details, remains limited. The CRAM (Cue-Recalled Autobiographical Memory) test (http://cramtest.info) adapts and combines the cue-word method with an assessment that collects counts of details recalled from different life periods. The SPAM (Spontaneous Probability of Autobiographical Memories) protocol samples introspection during everyday activity, recording memory duration and frequency. These measures provide detailed, naturalistic accounts of AM content and frequency, quantifying essential dimensions of recollection. AM content (∼20 details/recollection) decreased with the age of the episode, but less drastically than the probability of reporting remote compared to recent memories. AM retrieval was frequent (∼20/hour), each memory lasting ∼30 seconds. Testable hypotheses of the specific content retrieved in a fixed time from given life periods are presented. PMID:23028629

  20. The Detection of Signals in Impulsive Noise.

    DTIC Science & Technology

    1983-06-01

    Approved for Public Release; Distribution Unlimited. … has a symmetric distribution, sgn(x_i) will be -1 with probability 1/2 and +1 with probability 1/2. Considering the sum of observations as a binomial …

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Diwaker, E-mail: diwakerphysics@gmail.com; Chakraborty, Aniruddha

    The Smoluchowski equation with a time-dependent sink term is solved exactly. In this method, knowing the probability distribution P(0, s) at the origin allows one to derive the probability distribution P(x, s) at all positions. Exact solutions of the Smoluchowski equation are also provided in different cases where the sink term has linear, constant, inverse, and exponential variation in time.

  2. Selenium speciation profiles in biofortified sangiovese wine.

    PubMed

    Fontanella, Maria Chiara; D'Amato, Roberto; Regni, Luca; Proietti, Primo; Beone, Gian Maria; Businelli, Daniela

    2017-09-01

    Biofortification is an agronomic strategy, utilized by farmers, to produce selenium (Se)-enriched food products that may help reduce dietary deficiencies of Se occurring throughout susceptible regions of the world. The foliar application route ensures a high efficiency of Se assimilation by the plant since it does not depend on root-to-shoot translocation. In this study we treated grapevines of the Sangiovese variety in the pre-flowering period with sodium selenate (100 mg Se L⁻¹). Se content was measured by ICP-MS in leaves, in fruit at harvest time, and in wine, for treated and untreated samples. At harvest, a higher amount of Se was found in the treated leaves compared to untreated ones: 16.0±3.1 mg kg⁻¹ dry weight (dw) against 0.17±0.006 mg kg⁻¹ dw in the untreated ones. The treated grapes had a Se content of 0.800±0.08 mg kg⁻¹ dw, while the untreated ones had 0.065±0.025 mg kg⁻¹ dw. Immediately after the malolactic fermentation, the wine obtained from treated and untreated vines had a Se content of 0.620±0.09 mg Se L⁻¹ and 0.024±0.010 mg Se L⁻¹, respectively. In our case the percentage of inorganic Se is 26% of the total Se in the untreated wine, while in the Se-enriched wine this percentage increases to 47.5% of the total Se. Se(VI) was the more abundant inorganic chemical form in the enriched wine, probably due to the foliar application with selenate. The distributions of Se species suggest taking care in the choice of the enrichment solutions to promote a balanced distribution of the different chemical forms, perhaps favouring the accumulation of organic forms. Copyright © 2016 Elsevier GmbH. All rights reserved.

  3. A least squares approach to estimating the probability distribution of unobserved data in multiphoton microscopy

    NASA Astrophysics Data System (ADS)

    Salama, Paul

    2008-02-01

    Multi-photon microscopy has provided biologists with unprecedented opportunities for high resolution imaging deep into tissues. Unfortunately deep tissue multi-photon microscopy images are in general noisy since they are acquired at low photon counts. To aid in the analysis and segmentation of such images it is sometimes necessary to initially enhance the acquired images. One way to enhance an image is to find the maximum a posteriori (MAP) estimate of each pixel comprising an image, which is achieved by finding a constrained least squares estimate of the unknown distribution. In arriving at the distribution it is assumed that the noise is Poisson distributed, the true but unknown pixel values assume a probability mass function over a finite set of non-negative values, and since the observed data also assumes finite values because of low photon counts, the sum of the probabilities of the observed pixel values (obtained from the histogram of the acquired pixel values) is less than one. Experimental results demonstrate that it is possible to closely estimate the unknown probability mass function with these assumptions.
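
    A minimal sketch of the estimation idea under the stated Poisson assumption (a simplified stand-in for the paper's constrained formulation): the observed histogram is a Poisson mixture of the unknown probability mass function, so a non-negative least squares solve recovers an estimate of it.

      import numpy as np
      from scipy.optimize import nnls
      from scipy.stats import poisson

      rng = np.random.default_rng(3)
      vals = np.arange(20)                          # candidate true pixel values
      pmf = np.exp(-0.5 * ((vals - 8) / 3.0) ** 2)  # invented true PMF
      pmf /= pmf.sum()

      # Observed pixels: Poisson counts whose means follow the true PMF.
      obs = rng.poisson(rng.choice(vals, size=20_000, p=pmf))

      kmax = obs.max()
      h = np.bincount(obs, minlength=kmax + 1) / obs.size

      # h[k] = sum_j Poisson(k; mu=j) pmf[j]  ->  solve A @ pmf = h, pmf >= 0.
      A = np.array([[poisson.pmf(k, mu=j) for j in vals]
                    for k in range(kmax + 1)])
      est, _ = nnls(A, h)
      print(np.round(est / est.sum(), 3))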

  4. Probability distribution for the Gaussian curvature of the zero level surface of a random function

    NASA Astrophysics Data System (ADS)

    Hannay, J. H.

    2018-04-01

    A rather natural construction for a smooth random surface in space is the level surface of value zero, or ‘nodal’ surface f(x,y,z)  =  0, of a (real) random function f; the interface between positive and negative regions of the function. A physically significant local attribute at a point of a curved surface is its Gaussian curvature (the product of its principal curvatures) because, when integrated over the surface it gives the Euler characteristic. Here the probability distribution for the Gaussian curvature at a random point on the nodal surface f  =  0 is calculated for a statistically homogeneous (‘stationary’) and isotropic zero mean Gaussian random function f. Capitalizing on the isotropy, a ‘fixer’ device for axes supplies the probability distribution directly as a multiple integral. Its evaluation yields an explicit algebraic function with a simple average. Indeed, this average Gaussian curvature has long been known. For a non-zero level surface instead of the nodal one, the probability distribution is not fully tractable, but is supplied as an integral expression.

  5. A mechanism producing power law etc. distributions

    NASA Astrophysics Data System (ADS)

    Li, Heling; Shen, Hongjun; Yang, Bin

    2017-07-01

    Power-law distributions play an increasingly important role in the study of complex systems. Starting from the insolvability of complex systems, the idea of incomplete statistics is utilized and expanded: three different exponential factors are introduced into the equations for the normalization condition, the statistical average, and the Shannon entropy. Probability distribution functions of exponential form, of power-law form, and of the product form between a power function and an exponential function are then derived from the Shannon entropy and the maximum entropy principle. It is thus shown that the maximum entropy principle can completely replace the equal probability hypothesis. Since power-law distributions, and distributions of the product form between a power function and an exponential function, cannot be derived via the equal probability hypothesis but can be derived by means of the maximum entropy principle, it can also be concluded that the maximum entropy principle is a basic principle that embodies concepts more extensively and reveals the laws of motion of objects more fundamentally. At the same time, this principle reveals the intrinsic link between Nature and the different objects of human society and the principles they all obey.

  6. On Orbital Elements of Extrasolar Planetary Candidates and Spectroscopic Binaries

    NASA Technical Reports Server (NTRS)

    Stepinski, T. F.; Black, D. C.

    2001-01-01

    We estimate probability densities of orbital elements, periods, and eccentricities, for the population of extrasolar planetary candidates (EPC) and, separately, for the population of spectroscopic binaries (SB) with solar-type primaries. We construct empirical cumulative distribution functions (CDFs) in order to infer probability distribution functions (PDFs) for orbital periods and eccentricities. We also derive a joint probability density for period-eccentricity pairs in each population. Comparison of the respective distributions reveals that in all cases the EPC and SB populations are, in the context of orbital elements, indistinguishable from each other to a high degree of statistical significance. Probability densities of orbital periods in both populations have a P⁻¹ functional form, whereas the PDFs of eccentricities can be best characterized as a Gaussian with a mean of about 0.35 and standard deviation of about 0.2, turning into a flat distribution at small values of eccentricity. These remarkable similarities between EPC and SB must be taken into account by theories aimed at explaining the origin of extrasolar planetary candidates, and constitute an important clue as to their ultimate nature.

  7. Steady-state distributions of probability fluxes on complex networks

    NASA Astrophysics Data System (ADS)

    Chełminiak, Przemysław; Kurzyński, Michał

    2017-02-01

    We consider a simple model of Markovian stochastic dynamics on complex networks to examine the statistical properties of the probability fluxes. An additional transition, hereafter called a gate, powered by an external constant force, breaks detailed balance in the network. We argue, using a theoretical approach and numerical simulations, that the stationary distributions of the probability fluxes emergent under such conditions converge to the Gaussian distribution. By virtue of the stationary fluctuation theorem, its standard deviation depends directly on the square root of the mean flux. In turn, the nonlinear relation between the mean flux and the external force, which provides the key result of the present study, allows us to calculate the two parameters that entirely characterize the Gaussian distribution of the probability fluxes both close to and far from the equilibrium state. The other effects that modify these parameters, such as the addition of shortcuts to the tree-like network, the extension and configuration of the gate, and a change in the network size, are studied by means of computer simulations and discussed in terms of the rigorous theoretical predictions.

  8. Time-dependent landslide probability mapping

    USGS Publications Warehouse

    Campbell, Russell H.; Bernknopf, Richard L.; ,

    1993-01-01

    Case studies where the time of failure is known for rainfall-triggered debris flows can be used to estimate the parameters of a hazard model in which the probability of failure is a function of time. As an example, a time-dependent function for the conditional probability of a soil slip is estimated from independent variables representing hillside morphology, approximations of material properties, and the duration and rate of rainfall. If probabilities are calculated in a GIS (geographic information system) environment, the spatial distribution of the result for any given hour can be displayed on a map. Although the probability levels in this example are uncalibrated, the method offers a potential for evaluating different physical models and different earth-science variables by comparing the map distribution of predicted probabilities with inventory maps for different areas and different storms. If linked with spatial and temporal socio-economic variables, this method could be used for short-term risk assessment.

  9. Generalized Wishart Mixtures for Unsupervised Classification of PolSAR Data

    NASA Astrophysics Data System (ADS)

    Li, Lan; Chen, Erxue; Li, Zengyuan

    2013-01-01

    This paper presents an unsupervised clustering algorithm based upon the expectation maximization (EM) algorithm for finite mixture modelling, using the complex Wishart probability density function (PDF) for the probabilities. The mixture model makes it possible to consider heterogeneous thematic classes that cannot be well fitted by a unimodal Wishart distribution. In order to make the calculation fast and robust, we use the recently proposed generalized gamma distribution (GΓD) for the single-polarization intensity data to make the initial partition. We then use the Wishart probability density function for the corresponding sample covariance matrix to calculate the posterior class probabilities for each pixel. The posterior class probabilities are used for the prior probability estimates of each class and as weights for all class parameter updates. The proposed method is evaluated and compared with the Wishart H-Alpha-A classification. Preliminary results show that the proposed method has better performance.

  10. Oil spill contamination probability in the southeastern Levantine basin.

    PubMed

    Goldman, Ron; Biton, Eli; Brokovich, Eran; Kark, Salit; Levin, Noam

    2015-02-15

    Recent gas discoveries in the eastern Mediterranean Sea led to multiple operations with substantial economic interest, and with them there is a risk of oil spills and their potential environmental impacts. To examine the potential spatial distribution of this threat, we created seasonal maps of the probability of oil spill pollution reaching an area in the Israeli coastal and exclusive economic zones, given knowledge of its initial sources. We performed simulations of virtual oil spills using realistic atmospheric and oceanic conditions. The resulting maps show dominance of the alongshore northerly current, which causes the high probability areas to be stretched parallel to the coast, increasing contamination probability downstream of source points. The seasonal westerly wind forcing determines how wide the high probability areas are, and may also restrict these to a small coastal region near source points. Seasonal variability in probability distribution, oil state, and pollution time is also discussed. Copyright © 2014 Elsevier Ltd. All rights reserved.

  11. Probability density function of non-reactive solute concentration in heterogeneous porous formations.

    PubMed

    Bellin, Alberto; Tonina, Daniele

    2007-10-30

    Available models of solute transport in heterogeneous formations fall short of providing a complete characterization of the predicted concentration. This is a serious drawback, especially in risk analysis, where confidence intervals and probabilities of exceeding threshold values are required. Our contribution to filling this gap of knowledge is a probability distribution model for the local concentration of conservative tracers migrating in heterogeneous aquifers. Our model accounts for dilution, mechanical mixing within the sampling volume, and spreading due to formation heterogeneity. It is developed by modeling local concentration dynamics with an Ito Stochastic Differential Equation (SDE) that, under the hypothesis of statistical stationarity, leads to the Beta probability distribution function (pdf) for the solute concentration. This model shows large flexibility in capturing the smoothing effect of the sampling volume and the associated reduction of the probability of exceeding large concentrations. Furthermore, it is fully characterized by the first two moments of the solute concentration, and these are the same pieces of information required for standard geostatistical techniques employing Normal or Log-Normal distributions. Additionally, we show that in the absence of pore-scale dispersion and for point concentrations the pdf model converges to the binary distribution of [Dagan, G., 1982. Stochastic modeling of groundwater flow by unconditional and conditional probabilities, 2, The solute transport. Water Resour. Res. 18 (4), 835-848.], while it approaches the Normal distribution for sampling volumes much larger than the characteristic scale of the aquifer heterogeneity. Furthermore, we demonstrate that the same model, with the spatial moments replacing the statistical moments, can be applied to estimate the proportion of the plume volume where solute concentrations are above or below critical thresholds. Application of this model to point and vertically averaged bromide concentrations from the first Cape Cod tracer test and to a set of numerical simulations confirms the above findings and for the first time shows the superiority of the Beta model over both the Normal and Log-Normal models in interpreting field data. Furthermore, we show that assuming a priori that local concentrations are normally or log-normally distributed may result in a severe underestimate of the probability of exceeding large concentrations.
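
    Because the model is fully characterized by the first two concentration moments, exceedance probabilities follow directly. A minimal sketch, with invented moment values and concentration normalized by the source value:

      from scipy.stats import beta

      m, v = 0.25, 0.03            # invented mean and variance of concentration
      t = m * (1.0 - m) / v - 1.0  # moment-matching factor (requires v < m(1-m))
      a, b = m * t, (1.0 - m) * t  # Beta distribution parameters

      threshold = 0.6
      print("P(C > %.2f) = %.4f" % (threshold, beta.sf(threshold, a, b)))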

  12. A Performance Comparison on the Probability Plot Correlation Coefficient Test using Several Plotting Positions for GEV Distribution.

    NASA Astrophysics Data System (ADS)

    Ahn, Hyunjun; Jung, Younghun; Om, Ju-Seong; Heo, Jun-Haeng

    2014-05-01

    Selecting an appropriate probability distribution is very important in statistical hydrology. A goodness-of-fit test is a statistical method for selecting an appropriate probability model for a given data set. The probability plot correlation coefficient (PPCC) test, one such goodness-of-fit test, was originally developed for the normal distribution. Since then, this test has been widely applied to other probability models. The PPCC test is known as one of the best goodness-of-fit tests because it shows higher rejection power than most alternatives. In this study, we focus on PPCC tests for the GEV distribution, which is widely used around the world. For the GEV model, several plotting position formulas have been suggested. However, the PPCC statistics are derived only for the plotting position formulas (Goel and De, In-na and Nguyen, and Kim et al.) in which the skewness coefficient (or shape parameter) is included. Regression equations are then derived as a function of the shape parameter and sample size for a given significance level. In addition, the rejection powers of these formulas are compared using Monte Carlo simulation. Keywords: Goodness-of-fit test, Probability plot correlation coefficient test, Plotting position, Monte Carlo simulation. ACKNOWLEDGEMENTS: This research was supported by a grant, 'Establishing Active Disaster Management System of Flood Control Structures by using 3D BIM Technique' [NEMA-12-NH-57], from the Natural Hazard Mitigation Research Group, National Emergency Management Agency of Korea.
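
    The PPCC statistic itself is straightforward: correlate the ordered sample with the quantiles that the candidate distribution assigns to a plotting position. The sketch below uses the GEV with the Cunnane plotting position purely as an illustration; the shape-dependent formulas examined in the study would slot in at the marked line.

      import numpy as np
      from scipy import stats

      sample = np.sort(stats.genextreme.rvs(
          c=-0.1, size=50, random_state=np.random.default_rng(4)))

      n = sample.size
      i = np.arange(1, n + 1)
      p = (i - 0.4) / (n + 0.2)  # Cunnane plotting position (illustrative)

      c, loc, scale = stats.genextreme.fit(sample)
      q = stats.genextreme.ppf(p, c, loc=loc, scale=scale)

      ppcc = np.corrcoef(sample, q)[0, 1]
      print("PPCC =", round(ppcc, 4))  # values near 1 indicate a good fit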

  13. PubMed related articles: a probabilistic topic-based model for content similarity

    PubMed Central

    Lin, Jimmy; Wilbur, W John

    2007-01-01

    Background: We present a probabilistic topic-based model for content similarity called pmra that underlies the related article search feature in PubMed. Whether or not a document is about a particular topic is computed from term frequencies, modeled as Poisson distributions. Unlike previous probabilistic retrieval models, we do not attempt to estimate relevance; rather, our focus is "relatedness", the probability that a user would want to examine a particular document given known interest in another. We also describe a novel technique for estimating parameters that does not require human relevance judgments; instead, the process is based on the existence of MeSH® terms in MEDLINE®. Results: The pmra retrieval model was compared against bm25, a competitive probabilistic model that shares theoretical similarities. Experiments using the test collection from the TREC 2005 genomics track show a small but statistically significant improvement of pmra over bm25 in terms of precision. Conclusion: Our experiments suggest that the pmra model provides an effective ranking algorithm for related article search. PMID:17971238

  14. Classic maximum entropy recovery of the average joint distribution of apparent FRET efficiency and fluorescence photons for single-molecule burst measurements.

    PubMed

    DeVore, Matthew S; Gull, Stephen F; Johnson, Carey K

    2012-04-05

    We describe a method for analysis of single-molecule Förster resonance energy transfer (FRET) burst measurements using classic maximum entropy. Classic maximum entropy determines the Bayesian inference for the joint probability describing the total fluorescence photons and the apparent FRET efficiency. The method was tested with simulated data and then with DNA labeled with fluorescent dyes. The most probable joint distribution can be marginalized to obtain both the overall distribution of fluorescence photons and the apparent FRET efficiency distribution. This method proves to be ideal for determining the distance distribution of FRET-labeled biomolecules, and it successfully predicts the shape of the recovered distributions.

  15. Improved first-order uncertainty method for water-quality modeling

    USGS Publications Warehouse

    Melching, C.S.; Anmangandla, S.

    1992-01-01

    Uncertainties are unavoidable in water-quality modeling and subsequent management decisions. Monte Carlo simulation and first-order uncertainty analysis (involving linearization at central values of the uncertain variables) have frequently been used to estimate probability distributions for water-quality model output because of their simplicity. Each method has its drawbacks: Monte Carlo simulation's is mainly computational time; those of first-order analysis are mainly accuracy and representativeness, especially for nonlinear systems and extreme conditions. An improved (advanced) first-order method is presented, in which the linearization point varies to match the output level whose exceedance probability is sought. The advanced first-order method is tested on the Streeter-Phelps equation to estimate the probability distributions of the critical dissolved-oxygen deficit and of critical dissolved oxygen, using two hypothetical examples from the literature. The advanced first-order method provides a close approximation of the exceedance probability estimated for the Streeter-Phelps model output by Monte Carlo simulation, using less computer time (by two orders of magnitude), regardless of the probability distributions assumed for the uncertain model parameters.
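
    A compact illustration of the comparison (with hypothetical parameter values, not the paper's examples): propagate uncertain rate coefficients through the critical dissolved-oxygen deficit of the Streeter-Phelps model, once by Monte Carlo and once by a single first-order linearization at the means.

      import numpy as np
      from scipy.stats import norm

      L0 = 20.0  # ultimate BOD (mg/L), hypothetical

      def critical_deficit(k1, k2):
          """Streeter-Phelps critical DO deficit for zero initial deficit."""
          tc = np.log(k2 / k1) / (k2 - k1)
          return (k1 * L0 / k2) * np.exp(-k1 * tc)

      rng = np.random.default_rng(5)
      k1 = rng.lognormal(np.log(0.3), 0.2, 100_000)  # deoxygenation rate (1/day)
      k2 = rng.lognormal(np.log(0.7), 0.2, 100_000)  # reaeration rate (1/day)

      dc, level = critical_deficit(k1, k2), 6.0
      print("Monte Carlo P(Dc > %.1f) = %.4f" % (level, (dc > level).mean()))

      # First-order: numerical gradient at the means, normality assumed.
      m1, m2, s1, s2 = k1.mean(), k2.mean(), k1.std(), k2.std()
      e = 1e-5
      g1 = (critical_deficit(m1 + e, m2) - critical_deficit(m1 - e, m2)) / (2 * e)
      g2 = (critical_deficit(m1, m2 + e) - critical_deficit(m1, m2 - e)) / (2 * e)
      sd = np.hypot(g1 * s1, g2 * s2)  # independent inputs
      print("First-order P(Dc > %.1f) = %.4f"
            % (level, norm.sf(level, critical_deficit(m1, m2), sd)))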

  16. Measures for a multidimensional multiverse

    NASA Astrophysics Data System (ADS)

    Chung, Hyeyoun

    2015-04-01

    We explore the phenomenological implications of generalizing the causal patch and fat geodesic measures to a multidimensional multiverse, where the vacua can have differing numbers of large dimensions. We consider a simple model in which the vacua are nucleated from a D-dimensional parent spacetime through dynamical compactification of the extra dimensions, and compute the geometric contribution to the probability distribution of observations within the multiverse for each measure. We then study how the shape of this probability distribution depends on the time scales for the existence of observers, for vacuum domination, and for curvature domination (t_obs, t_Λ, and t_c, respectively). In this work we restrict ourselves to bubbles with positive cosmological constant, Λ. We find that in the case of the causal patch cutoff, when the bubble universes have p+1 large spatial dimensions with p ≥ 2, the shape of the probability distribution is such that we obtain the coincidence of time scales t_obs ~ t_Λ ~ t_c. Moreover, the size of the cosmological constant is related to the size of the landscape. However, the exact shape of the probability distribution is different in the case p = 2, compared to p ≥ 3. In the case of the fat geodesic measure, the result is even more robust: the shape of the probability distribution is the same for all p ≥ 2, and we once again obtain the coincidence t_obs ~ t_Λ ~ t_c. These results require only very mild conditions on the prior probability of the distribution of vacua in the landscape. Our work shows that the observed double coincidence of time scales is a robust prediction even when the multiverse is generalized to be multidimensional; that this coincidence is not a consequence of our particular Universe being (3+1)-dimensional; and that this observable cannot be used to preferentially select one measure over another in a multidimensional multiverse.

  17. Shape of growth-rate distribution determines the type of Non-Gibrat’s Property

    NASA Astrophysics Data System (ADS)

    Ishikawa, Atushi; Fujimoto, Shouji; Mizuno, Takayuki

    2011-11-01

    In this study, the authors examine exhaustive business data on Japanese firms, which cover nearly all companies in the mid- and large-scale ranges in terms of firm size, to reach several key findings on profits/sales distribution and business growth trends. Here, profits denote net profits. First, detailed balance is observed not only in profits data but also in sales data. Furthermore, the growth-rate distribution of sales has wider tails than the linear growth-rate distribution of profits in log-log scale. On the one hand, in the mid-scale range of profits, the probability of positive growth decreases and the probability of negative growth increases symmetrically as the initial value increases. This is called Non-Gibrat’s First Property. On the other hand, in the mid-scale range of sales, the probability of positive growth decreases as the initial value increases, while the probability of negative growth hardly changes. This is called Non-Gibrat’s Second Property. Under detailed balance, Non-Gibrat’s First and Second Properties are analytically derived from the linear and quadratic growth-rate distributions in log-log scale, respectively. In both cases, the log-normal distribution is inferred from Non-Gibrat’s Properties and detailed balance. These analytic results are verified by empirical data. Consequently, this clarifies the notion that the difference in shapes between growth-rate distributions of sales and profits is closely related to the difference between the two Non-Gibrat’s Properties in the mid-scale range.

  18. Estimation of distribution overlap of urn models.

    PubMed

    Hampton, Jerrad; Lladser, Manuel E

    2012-01-01

    A classical problem in statistics is estimating the expected coverage of a sample, which has had applications in gene expression, microbial ecology, optimization, and even numismatics. Here we consider a related extension of this problem to random samples of two discrete distributions. Specifically, we estimate what we call the dissimilarity probability of a sample, i.e., the probability of a draw from one distribution not being observed in [Formula: see text] draws from another distribution. We show our estimator of dissimilarity to be a [Formula: see text]-statistic and a uniformly minimum variance unbiased estimator of dissimilarity over the largest appropriate range of [Formula: see text]. Furthermore, despite the non-Markovian nature of our estimator when applied sequentially over [Formula: see text], we show it converges uniformly in probability to the dissimilarity parameter, and we present criteria when it is approximately normally distributed and admits a consistent jackknife estimator of its variance. As proof of concept, we analyze V35 16S rRNA data to discern between various microbial environments. Other potential applications concern any situation where dissimilarity of two discrete distributions may be of interest. For instance, in SELEX experiments, each urn could represent a random RNA pool and each draw a possible solution to a particular binding site problem over that pool. The dissimilarity of these pools is then related to the probability of finding binding site solutions in one pool that are absent in the other.

  19. Diffusion of active chiral particles

    NASA Astrophysics Data System (ADS)

    Sevilla, Francisco J.

    2016-12-01

    The diffusion of chiral active Brownian particles in three-dimensional space is studied analytically, by consideration of the corresponding Fokker-Planck equation for the probability density of finding a particle at position x and moving along the direction v ̂ at time t, and numerically, by the use of Langevin dynamics simulations. The analysis is focused on the marginal probability density of finding a particle at a given location and at a given time (independently of its direction of motion), which is found from an infinite hierarchy of differential-recurrence relations for the coefficients that appear in the multipole expansion of the probability distribution, which contains the whole kinematic information. This approach allows the explicit calculation of the time dependence of the mean-squared displacement and the time dependence of the kurtosis of the marginal probability distribution, quantities from which the effective diffusion coefficient and the "shape" of the positions distribution are examined. Oscillations between two characteristic values were found in the time evolution of the kurtosis, namely, between the value that corresponds to a Gaussian and the one that corresponds to a distribution of spherical shell shape. In the case of an ensemble of particles, each one rotating around a uniformly distributed random axis, evidence is found of the so-called "anomalous, yet Brownian, diffusion" effect, in which particles follow a non-Gaussian distribution for the positions yet the mean-squared displacement is a linear function of time.

  20. A Search Model for Imperfectly Detected Targets

    NASA Technical Reports Server (NTRS)

    Ahumada, Albert

    2012-01-01

    Under the assumptions that 1) the search region can be divided into N non-overlapping sub-regions that are searched sequentially, 2) the probability of detection is unity if a sub-region is selected, and 3) no information is available to guide the search, there are two extreme-case models. The search can be done perfectly, leading to a uniform distribution over the number of searches required, or the search can be done with no memory, leading to a geometric distribution for the number of searches required with a success probability of 1/N. If the probability of detection P is less than unity, but the search is done otherwise perfectly, the searcher will have to search the N regions repeatedly until detection occurs. The number of searches is thus the sum of two random variables. One is N times the number of full searches (a geometric distribution with success probability P) and the other is the uniform distribution over the integers 1 to N. The first three moments of this distribution were computed, giving the mean, standard deviation, and kurtosis of the distribution as a function of the two parameters. The model was fit to the data presented last year (Ahumada, Billington, & Kaiwi, 2…) on the searches required to find a single pixel target on a simulated horizon. The model gave a good fit to the three moments for all three observers.
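
    The distribution is easy to simulate, which provides a check on the moment calculations; a minimal sketch with assumed values N = 10 sub-regions and detection probability P = 0.7:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(6)
      N, P, trials = 10, 0.7, 200_000

      failed_passes = rng.geometric(P, trials) - 1  # complete searches that miss
      position = rng.integers(1, N + 1, trials)     # uniform over 1..N
      searches = N * failed_passes + position

      print("mean     =", searches.mean())          # theory: N(1-P)/P + (N+1)/2
      print("std      =", searches.std(ddof=1))
      print("kurtosis =", stats.kurtosis(searches)) # excess kurtosis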

  1. Optimizing probability of detection point estimate demonstration

    NASA Astrophysics Data System (ADS)

    Koshti, Ajay M.

    2017-04-01

    The paper discusses optimizing probability of detection (POD) demonstration experiments that use the point estimate method. The optimization is performed to provide an acceptable value for the probability of passing the demonstration (PPD) and an acceptable value for the probability of false calls (POF) while keeping the flaw sizes in the set as small as possible. The POD point estimate method is used by NASA for qualifying special NDE procedures. The point estimate method uses the binomial distribution for the probability density. Normally, a set of 29 flaws of the same size, within some tolerance, is used in the demonstration. Traditionally, the largest flaw size in the set is considered to be a conservative estimate of the flaw size with minimum 90% probability of detection and 95% confidence, denoted as α90/95PE. The paper investigates the relationship between the range of flaw sizes and α90, the 90% probability-of-detection flaw size, needed to provide a desired PPD. The range of flaw sizes is expressed as a proportion of the standard deviation of the probability density distribution. The difference between the median or average of the 29 flaw sizes and α90 is also expressed as a proportion of the standard deviation of the probability density distribution. In general, it is concluded that, if probability of detection increases with flaw size, the average of the 29 flaw sizes will always be larger than or equal to α90 and is an acceptable measure of α90/95PE. If the NDE technique has sufficient sensitivity and signal-to-noise ratio, then the 29-flaw set can be optimized to meet the requirements of minimum required PPD, maximum allowable POF, flaw size tolerance about the mean flaw size, and flaw size detectability. The paper provides a procedure for optimizing the flaw sizes in the point estimate demonstration flaw set.
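
    The binomial arithmetic behind the point estimate demonstration is short. With a zero-miss criterion on 29 flaws, the probability of passing is p^29, which falls just below 5% when the true POD is 0.90; hence passing demonstrates 90% POD at roughly 95% confidence. A sketch:

      from scipy.stats import binom

      n = 29
      for pod in (0.90, 0.95, 0.98):
          ppd = binom.pmf(n, n, pod)  # probability all n flaws are detected
          print(f"true POD {pod:.2f}: PPD = {ppd:.3f}")
      # At POD 0.90 the PPD is about 0.047, so a marginal technique passes
      # less than 5% of the time.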

  2. Theoretical cratering rates on Ida, Mathilde, Eros and Gaspra

    NASA Astrophysics Data System (ADS)

    Jeffers, S. V.; Asher, D. J.; Bailey, M. E.

    2002-11-01

    We investigate the main influences on crater size distributions by deriving results for four example target objects, (951) Gaspra, (243) Ida, (253) Mathilde and (433) Eros. The dynamical history of each of these asteroids is modelled using the MERCURY (Chambers 1999) numerical integrator. The use of an efficient, Öpik-type collision code enables the calculation of a velocity histogram and the probability of impact. This, when combined with a crater scaling law and an impactor size distribution through a Monte Carlo method, results in a crater size distribution. The resulting crater probability distributions are in good agreement with observed crater distributions on these asteroids.

  3. Velocity distributions among colliding asteroids

    NASA Technical Reports Server (NTRS)

    Bottke, William F., Jr.; Nolan, Michael C.; Greenberg, Richard; Kolvoord, Robert A.

    1994-01-01

    The probability distribution for impact velocities between two given asteroids is wide, non-Gaussian, and often contains spikes according to our new method of analysis in which each possible orbital geometry for collision is weighted according to its probability. An average value would give a good representation only if the distribution were smooth and narrow. Therefore, the complete velocity distribution we obtain for various asteroid populations differs significantly from published histograms of average velocities. For all pairs among the 682 asteroids in the main-belt with D greater than 50 km, we find that our computed velocity distribution is much wider than previously computed histograms of average velocities. In this case, the most probable impact velocity is approximately 4.4 km/sec, compared with the mean impact velocity of 5.3 km/sec. For cases of a single asteroid (e.g., Gaspra or Ida) relative to an impacting population, the distribution we find yields lower velocities than previously reported by others. The width of these velocity distributions implies that mean impact velocities must be used with caution when calculating asteroid collisional lifetimes or crater-size distributions. Since the most probable impact velocities are lower than the mean, disruption events may occur less frequently than previously estimated. However, this disruption rate may be balanced somewhat by an apparent increase in the frequency of high-velocity impacts between asteroids. These results have implications for issues such as asteroidal disruption rates, the amount/type of impact ejecta available for meteoritical delivery to the Earth, and the geology and evolution of specific asteroids like Gaspra.

  4. The Probability Distribution for a Biased Spinner

    ERIC Educational Resources Information Center

    Foster, Colin

    2012-01-01

    This article advocates biased spinners as an engaging context for statistics students. Calculating the probability of a biased spinner landing on a particular side makes valuable connections between probability and other areas of mathematics. (Contains 2 figures and 1 table.)

  5. A probable probability distribution of a series nonequilibrium states in a simple system out of equilibrium

    NASA Astrophysics Data System (ADS)

    Gao, Haixia; Li, Ting; Xiao, Changming

    2016-05-01

    When a simple system is in a nonequilibrium state, it will shift towards its equilibrium state, and in this process it passes through a series of nonequilibrium states. With the assistance of Bayesian statistics and the hyperensemble, a probable probability distribution of these nonequilibrium states can be determined by maximizing the hyperensemble entropy. The most probable state is the equilibrium state, and the farther a nonequilibrium state is from the equilibrium one, the smaller its probability will be; the same conclusion is also obtained in the multi-state space. Furthermore, if the probability stands for the relative time the corresponding nonequilibrium state can persist, then the velocity at which a nonequilibrium state returns to equilibrium can be determined through the reciprocal of the derivative of this probability. This tells us that the farther the state is from equilibrium, the faster the return velocity will be; if the system is near its equilibrium state, the velocity tends to become smaller and smaller, and finally tends to 0 as the system reaches the equilibrium state.

  6. Study on probability distributions for evolution in modified extremal optimization

    NASA Astrophysics Data System (ADS)

    Zeng, Guo-Qiang; Lu, Yong-Zai; Mao, Wei-Jie; Chu, Jian

    2010-05-01

    It is widely believed that the power law is a proper probability distribution for evolution in τ-EO (extremal optimization), a general-purpose stochastic local-search approach inspired by self-organized criticality, and for its applications to some NP-hard problems, e.g., graph partitioning, graph coloring, and spin glasses. In this study, we discover that the exponential distributions, or hybrid ones (e.g., power laws with exponential cutoff), popularly used in the research of network sciences may replace the original power laws in a modified τ-EO method called the self-organized algorithm (SOA), and provide better performance than other statistical-physics-oriented methods, such as simulated annealing and τ-EO, based on experimental results for random Euclidean traveling salesman problems (TSP) and non-uniform instances. From the perspective of optimization, our results appear to demonstrate that the power law is not the only proper probability distribution for evolution in EO-like methods, at least for the TSP; the exponential and hybrid distributions may be other choices.

  7. Rapidly assessing the probability of exceptionally high natural hazard losses

    NASA Astrophysics Data System (ADS)

    Gollini, Isabella; Rougier, Jonathan

    2014-05-01

    One of the objectives in catastrophe modeling is to assess the probability distribution of losses for a specified period, such as a year. From the point of view of an insurance company, the whole of the loss distribution is interesting, and valuable in determining insurance premiums. But the shape of the right-hand tail is critical, because it impinges on the solvency of the company. A simple measure of the risk of insolvency is the probability that the annual loss will exceed the company's current operating capital. Imposing an upper limit on this probability is one of the objectives of the EU Solvency II directive. If a probabilistic model is supplied for the loss process, then this tail probability can be computed, either directly or by simulation. This can be a lengthy calculation for complex losses. Given the inevitably subjective nature of quantifying loss distributions, computational resources might be better used in a sensitivity analysis. This requires either a quick approximation to the tail probability or an upper bound on the probability, ideally a tight one. We present several different bounds, all of which can be computed nearly instantly from a very general event loss table. We provide a numerical illustration, and discuss the conditions under which the bound is tight. Although we consider the perspective of insurance and reinsurance companies, exactly the same issues concern the risk manager, who is typically very sensitive to large losses.
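
    Two such nearly instant bounds (illustrative choices, not necessarily those of the paper): for a compound-Poisson annual loss built from an event loss table, Markov's inequality gives P(L >= c) <= E[L]/c, and the one-sided Chebyshev (Cantelli) inequality sharpens this using the variance. A sketch with an invented table:

      import numpy as np

      # Invented event loss table: annual event rates and per-event losses.
      rate = np.array([0.20, 0.05, 0.01, 0.002])
      loss = np.array([1.0, 10.0, 50.0, 200.0])

      mean = np.sum(rate * loss)      # E[L] for a compound Poisson loss
      var = np.sum(rate * loss ** 2)  # Var[L] for a compound Poisson loss

      capital = 30.0                  # operating capital, same units as loss
      t = capital - mean              # Cantelli form requires t > 0
      print("Markov:   P(L >= c) <=", round(mean / capital, 4))
      print("Cantelli: P(L >= c) <=", round(var / (var + t ** 2), 4))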

  8. Variation in the standard deviation of the lure rating distribution: Implications for estimates of recollection probability.

    PubMed

    Dopkins, Stephen; Varner, Kaitlin; Hoyer, Darin

    2017-10-01

    In word recognition, semantic priming of test words increased the false-alarm rate and the mean of confidence ratings to lures. Such priming also increased the standard deviation of confidence ratings to lures and the slope of the z-ROC function, suggesting that the priming increased the standard deviation of the lure evidence distribution. The Unequal Variance Signal Detection (UVSD) model interpreted the priming as increasing the standard deviation of the lure evidence distribution. Without additional parameters, the Dual Process Signal Detection (DPSD) model could only accommodate the results by fitting the data for related and unrelated primes separately, interpreting the priming, implausibly, as decreasing the probability of target recollection. With an additional parameter for the probability of false (lure) recollection, the model could fit the data for related and unrelated primes together, interpreting the priming as increasing the probability of false recollection. These results suggest that DPSD estimates of target recollection probability will decrease with increases in the lure confidence/evidence standard deviation unless a parameter is included for false recollection. Unfortunately, the size of a given lure confidence/evidence standard deviation relative to other possible lure confidence/evidence standard deviations is often unspecified by context. Hence the model often has no way of estimating false recollection probability and thereby correcting its estimates of target recollection probability.

  9. Probability distributions for Markov chain based quantum walks

    NASA Astrophysics Data System (ADS)

    Balu, Radhakrishnan; Liu, Chaobin; Venegas-Andraca, Salvador E.

    2018-01-01

    We analyze the probability distributions of the quantum walks induced from Markov chains by Szegedy (2004). The first part of this paper is devoted to the quantum walks induced from finite state Markov chains. It is shown that the probability distribution on the states of the underlying Markov chain is always convergent in the Cesàro sense. In particular, we deduce that the limiting distribution is uniform if the transition matrix is symmetric. In the case of a non-symmetric Markov chain, we exemplify that the limiting distribution of the quantum walk is not necessarily identical with the stationary distribution of the underlying irreducible Markov chain. The Szegedy scheme can be extended to infinite state Markov chains (random walks). In the second part, we formulate the quantum walk induced from a lazy random walk on the line. We then obtain the weak limit of the quantum walk. It is noted that this quantum walk appears to spread faster than its counterpart, the quantum walk on the line driven by the Grover coin discussed in the literature. The paper closes with an outlook on possible future directions.

  10. Comparison of characteristics of soils with and without salt crusts in a hyper-arid floodplain

    NASA Astrophysics Data System (ADS)

    LI, X.; Feng, G.

    2017-12-01

    Soil salt crusts have been shown to restrict soil erosion and to influence water and salt movement in soil, and have therefore attracted wide attention. However, there is little information comparing the characteristics of soils with and without salt crusts in hyper-arid floodplains. The objective of this study was to investigate paired samples of salt crusts and crust-free surface soils in the floodplain of the Tarim River in China. The results revealed that salt-crusted soils were mostly distributed in shrubland dominated by Tamarix, which accounted for 73% of crusted sites, followed by wetland, where the groundwater table was shallower (<2.4 m). Salt crusts formed where the salt content exceeded 109 g·kg-1, whereas no crust was found on soils with salt contents below 89 g·kg-1; soils with salt contents between 89 and 109 g·kg-1 occurred both with and without crusts. Salt crust thickness was positively correlated with salt content (R2=0.61) and with crust strength (R2=0.64). Compared with soils without salt crusts, the crusted soils had higher clay, silt, and soil organic matter contents. Those soils were located in low-lying areas that experience relatively frequent overflowing floods. This study revealed that flooding did not reduce the salt content of topsoils. Salt crusts were probably formed by salt accumulation from shallow groundwater (e.g. <2.4 m) in this region.

  11. Probability in reasoning: a developmental test on conditionals.

    PubMed

    Barrouillet, Pierre; Gauffroy, Caroline

    2015-04-01

    Probabilistic theories have been claimed to constitute a new paradigm for the psychology of reasoning. A key assumption of these theories is captured by what they call the Equation, the hypothesis that the meaning of the conditional is probabilistic in nature and that the probability of If p then q is the conditional probability, in such a way that P(if p then q)=P(q|p). Using the probabilistic truth-table task in which participants are required to evaluate the probability of If p then q sentences, the present study explored the pervasiveness of the Equation across ages (from early adolescence to adulthood), types of conditionals (basic, causal, and inducements), and contents. The results reveal that the Equation is a late developmental achievement, endorsed only by a narrow majority of educated adults for certain types of conditionals depending on the content they involve. Age-related changes in evaluating the probability of all the conditionals studied closely mirror the development of truth-value judgements observed in previous studies with traditional truth-table tasks. We argue that our modified mental model theory can account for this development, and hence for the findings related to the probability task, which consequently do not support the probabilistic approach to human reasoning over alternative theories. Copyright © 2014 Elsevier B.V. All rights reserved.
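
    A minimal numeric reading of the Equation, assuming hypothetical case counts for a probabilistic truth-table task, contrasts the conditional-probability interpretation with the material-implication reading:

```python
# Counts of the four truth-table cases for "if p then q",
# e.g. frequencies presented in a probabilistic truth-table task.
n_pq, n_p_notq, n_notp_q, n_notp_notq = 30, 10, 25, 35
total = n_pq + n_p_notq + n_notp_q + n_notp_notq

# The Equation: P(if p then q) = P(q | p) = P(p & q) / P(p)
p_conditional = n_pq / (n_pq + n_p_notq)

# Material-implication reading: true in every case except p & not-q
p_material = (total - n_p_notq) / total

print(f"P(q|p) = {p_conditional:.2f}, material implication = {p_material:.2f}")
```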

  12. Study of the physical properties of Ge-S-Ga glassy alloy

    NASA Astrophysics Data System (ADS)

    Rana, Anjli; Sharma, Raman

    2018-05-01

    In the present work, we have studied the effect of Ga doping on the physical properties of the Ge20S80-xGax glassy alloy. The basic physical parameters which play an important role in determining the structure and strength of the material, viz. average coordination number, lone-pair electrons, mean bond energy, glass transition temperature, electronegativity, bond distribution probabilities, and cohesive energy, have been computed theoretically for the Ge-S-Ga glassy alloy. Here, the glass transition temperature and mean bond energy have been investigated using the Tichy-Ticha approach, and the cohesive energy has been calculated using the chemical bond approach (CBA) method. It has been found that the average coordination number increases while all the other parameters decrease with increasing Ga content in the Ge-S-Ga system.
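
    The average coordination number and lone-pair count can be reproduced in a few lines, assuming 8-N coordination numbers (Ge = 4, S = 2, Ga = 3) and standard valence-electron counts (Ge = 4, S = 6, Ga = 3); the paper may parameterize these slightly differently, so treat this as a sketch of the arithmetic rather than its exact values.

```python
# <r> and lone-pair count L = V - <r> for Ge20 S(80-x) Gax.
def physical_params(x):
    at = {"Ge": 20.0, "S": 80.0 - x, "Ga": x}   # atomic percent
    cn = {"Ge": 4, "S": 2, "Ga": 3}             # assumed coordination numbers
    ve = {"Ge": 4, "S": 6, "Ga": 3}             # assumed valence electrons
    r = sum(at[e] * cn[e] for e in at) / 100.0  # average coordination <r>
    v = sum(at[e] * ve[e] for e in at) / 100.0  # mean valence-electron count
    return r, v - r                             # lone pairs L = V - <r>

for x in (0, 5, 10, 15):
    r, lone = physical_params(x)
    print(f"x={x:2d}  <r>={r:.2f}  lone pairs={lone:.2f}")
# <r> rises and L falls with Ga content, as the abstract reports.
```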

  13. Pedologic and climatic controls on Rn-222 concentrations in soil gas, Denver, Colorado

    USGS Publications Warehouse

    Asher-Bolinder, S.; Owen, D.E.; Schumann, R.R.

    1990-01-01

    Soil-gas radon concentrations are controlled seasonally by factors of climate and pedology. In a swelling soil of the semiarid Western United States, soil-gas radon concentrations at 100 cm depth increase in winter and spring due to increased emanation with higher soil moisture and the capping effect of surface water or ice. Radon concentrations in soil drop markedly through the summer and fall. The increased insolation of spring and summer warms and dries the soil, limiting the amount of water that reaches 100 cm. Probable controls on the distribution of uranium within the soil column include its downward leaching, its precipitation or adsorption onto B-horizon clays, concretions, or cement, and the uranium content and mineralogy of the soil's granitic and gneissic precursors. -from Authors

  14. Fragility issues of medical video streaming over 802.11e-WLAN m-health environments.

    PubMed

    Tan, Yow-Yiong Edwin; Philip, Nada; Istepanian, Robert H

    2006-01-01

    This paper presents some of the fragility issues of medical video streaming over 802.11e-WLAN in m-health applications. In particular, we present a medical channel-adaptive fair allocation (MCAFA) scheme for enhanced QoS support in IEEE 802.11 WLANs, proposed as a modification of the standard 802.11e enhanced distributed coordination function (EDCF) for improved medical data performance. MCAFA extends the EDCF by halving the contention window (CW) after zeta consecutive successful transmissions, to reduce the collision probability when the channel is busy. Simulation results show that MCAFA outperforms EDCF in terms of the overall performance relevant to the requirements of high-throughput medical data and video streaming traffic in 3G/WLAN wireless environments.
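
    A minimal sketch of the contention-window rule described above, combining standard binary exponential backoff on collision with the MCAFA-style halving after zeta consecutive successes; the CW limits and zeta value are illustrative, not the standard's or the paper's settings.

```python
class McafaBackoff:
    """Toy contention-window adapter: double CW on collision (standard
    backoff), halve CW after zeta consecutive successes (MCAFA-style)."""

    def __init__(self, cw_min=15, cw_max=1023, zeta=3):
        self.cw_min, self.cw_max, self.zeta = cw_min, cw_max, zeta
        self.cw = cw_min
        self.successes = 0

    def on_collision(self):
        self.cw = min(2 * (self.cw + 1) - 1, self.cw_max)  # double CW
        self.successes = 0

    def on_success(self):
        self.successes += 1
        if self.successes >= self.zeta:   # zeta consecutive successes:
            self.cw = max((self.cw + 1) // 2 - 1, self.cw_min)  # halve CW
            self.successes = 0

bo = McafaBackoff()
for event in ["success", "success", "success", "collision", "success"]:
    bo.on_success() if event == "success" else bo.on_collision()
    print(event, "-> CW =", bo.cw)
```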

  15. Estimating alarm thresholds and the number of components in mixture distributions

    NASA Astrophysics Data System (ADS)

    Burr, Tom; Hamada, Michael S.

    2012-09-01

    Mixtures of probability distributions arise in many nuclear assay and forensic applications, including nuclear weapon detection, neutron multiplicity counting, and in solution monitoring (SM) for nuclear safeguards. SM data is increasingly used to enhance nuclear safeguards in aqueous reprocessing facilities having plutonium in solution form in many tanks. This paper provides background for mixture probability distributions and then focuses on mixtures arising in SM data. SM data can be analyzed by evaluating transfer-mode residuals defined as tank-to-tank transfer differences, and wait-mode residuals defined as changes during non-transfer modes. A previous paper investigated impacts on transfer-mode and wait-mode residuals of event marking errors which arise when the estimated start and/or stop times of tank events such as transfers are somewhat different from the true start and/or stop times. Event marking errors contribute to non-Gaussian behavior and larger variation than predicted on the basis of individual tank calibration studies. This paper illustrates evidence for mixture probability distributions arising from such event marking errors and from effects such as condensation or evaporation during non-transfer modes, and pump carryover during transfer modes. A quantitative assessment of the sample size required to adequately characterize a mixture probability distribution arising in any context is included.
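
    One standard way to characterize such residual mixtures, sketched below under the assumption of Gaussian components, is to fit mixtures of increasing order and select the number of components by BIC; the synthetic residuals stand in for real solution-monitoring data.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)

# Synthetic wait-mode residuals: a well-behaved component plus a second
# component mimicking event-marking errors (shifted, more variable).
resid = np.concatenate([rng.normal(0.0, 1.0, 800),
                        rng.normal(3.0, 2.0, 200)]).reshape(-1, 1)

# Fit 1-, 2-, and 3-component Gaussian mixtures; pick the lowest BIC.
models = {k: GaussianMixture(n_components=k, random_state=0).fit(resid)
          for k in (1, 2, 3)}
bics = {k: m.bic(resid) for k, m in models.items()}
best = min(bics, key=bics.get)
print("BIC per k:", {k: round(v, 1) for k, v in bics.items()})
print("selected k =", best, "component means =", models[best].means_.ravel())
```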

  16. Construction and identification of a D-Vine model applied to the probability distribution of modal parameters in structural dynamics

    NASA Astrophysics Data System (ADS)

    Dubreuil, S.; Salaün, M.; Rodriguez, E.; Petitjean, F.

    2018-01-01

    This study investigates the construction and identification of the probability distribution of random modal parameters (natural frequencies and effective parameters) in structural dynamics. As these parameters present various types of dependence structures, the retained approach is based on pair copula construction (PCC). A literature review leads us to choose a D-Vine model for the construction of modal parameters probability distributions. Identification of this model is based on likelihood maximization, which makes it sensitive to the dimension of the distribution, namely the number of considered modes in our context. In this respect, a mode selection preprocessing step is proposed. It allows the selection of the relevant random modes for a given transfer function. The second point addressed in this study concerns the choice of the D-Vine model. Indeed, the D-Vine model is not uniquely defined. Two strategies are proposed and compared. The first one is based on the context of the study whereas the second one is purely based on statistical considerations. Finally, the proposed approaches are numerically studied and compared with respect to their capabilities, first in the identification of the probability distribution of random modal parameters and second in the estimation of the 99% quantiles of some transfer functions.
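
    A minimal sketch of a three-variable D-Vine likelihood, assuming Gaussian pair copulas throughout (the paper selects pair-copula families more carefully): the density factorizes as c12(u1,u2) c23(u2,u3) c13|2(h(u1|u2), h(u3|u2)), and the pair parameters are identified by likelihood maximization.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

def gauss_copula_logpdf(u, v, rho):
    """Log-density of the bivariate Gaussian copula."""
    x, y = norm.ppf(u), norm.ppf(v)
    return (-0.5 * np.log(1 - rho**2)
            - (rho**2 * (x**2 + y**2) - 2 * rho * x * y) / (2 * (1 - rho**2)))

def h(u, v, rho):
    """Conditional cdf h(u|v) for the Gaussian pair copula."""
    return norm.cdf((norm.ppf(u) - rho * norm.ppf(v)) / np.sqrt(1 - rho**2))

def dvine3_negloglik(rhos, U):
    """Negative log-likelihood of a 3-dimensional D-Vine."""
    r12, r23, r13_2 = np.tanh(rhos)   # keep correlations in (-1, 1)
    u1, u2, u3 = U.T
    ll = (gauss_copula_logpdf(u1, u2, r12)
          + gauss_copula_logpdf(u2, u3, r23)
          + gauss_copula_logpdf(h(u1, u2, r12), h(u3, u2, r23), r13_2))
    return -np.sum(ll)

# Synthetic pseudo-observations (in practice: rank-transformed modal data).
rng = np.random.default_rng(2)
Z = rng.multivariate_normal([0, 0, 0],
                            [[1, .6, .3], [.6, 1, .5], [.3, .5, 1]], 2000)
U = norm.cdf(Z)
res = minimize(dvine3_negloglik, x0=np.zeros(3), args=(U,))
print("fitted pair correlations:", np.round(np.tanh(res.x), 3))
```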

  17. Probability distribution functions in turbulent convection

    NASA Technical Reports Server (NTRS)

    Balachandar, S.; Sirovich, L.

    1991-01-01

    Results of an extensive investigation of probability distribution functions (pdfs) for Rayleigh-Benard convection, in the hard turbulence regime, are presented. It is shown that the pdfs exhibit a high degree of internal universality. In certain cases this universality is established within two Kolmogorov scales of a boundary. A discussion of the factors leading to the universality is presented.

  18. Power-law tail probabilities of drainage areas in river basins

    USGS Publications Warehouse

    Veitzer, S.A.; Troutman, B.M.; Gupta, V.K.

    2003-01-01

    The significance of power-law tail probabilities of drainage areas in river basins was discussed. The convergence to a power law was not observed for all underlying distributions, but for a large class of statistical distributions with specific limiting properties. The article also discussed the scaling properties of topologic and geometric network properties in river basins.

  19. KINETICS OF LOW SOURCE REACTOR STARTUPS. PART II

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    hurwitz, H. Jr.; MacMillan, D.B.; Smith, J.H.

    1962-06-01

    A computational technique is described for obtaining the probability distribution of power level during a low source reactor startup. The technique uses a mathematical model for the time-dependent probability distribution of neutron and precursor concentrations, with finite neutron lifetime, one group of delayed neutron precursors, and no spatial dependence. Results obtained by the technique are given. (auth)

  20. Generating an Empirical Probability Distribution for the Andrews-Pregibon Statistic.

    ERIC Educational Resources Information Center

    Jarrell, Michele G.

    A probability distribution was developed for the Andrews-Pregibon (AP) statistic. The statistic, developed by D. F. Andrews and D. Pregibon (1978), identifies multivariate outliers. It is a ratio of the determinant of the data matrix with an observation deleted to the determinant of the entire data matrix. Although the AP statistic has been used…
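
    Under the ratio-of-determinants definition quoted above, the statistic is straightforward to compute; the code below uses one common formulation based on the augmented matrix Z = [X y], with a planted outlier to show that small AP values flag it. This is an illustrative sketch, not the article's procedure for generating the empirical distribution.

```python
import numpy as np

def andrews_pregibon(X, y):
    """AP statistic per observation: det of the deleted augmented
    cross-product matrix over det of the full one; small values
    flag potential multivariate outliers."""
    Z = np.column_stack([X, y])
    full = np.linalg.det(Z.T @ Z)
    ap = np.empty(len(Z))
    for i in range(len(Z)):
        Zi = np.delete(Z, i, axis=0)
        ap[i] = np.linalg.det(Zi.T @ Zi) / full
    return ap

rng = np.random.default_rng(3)
X = rng.normal(size=(30, 2))
y = X @ np.array([1.0, -2.0]) + rng.normal(scale=0.5, size=30)
X[0], y[0] = [4.0, 4.0], 10.0        # plant one outlier at index 0
print(andrews_pregibon(X, y).round(3)[:5])   # index 0 stands out as small
```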

  1. Animating Statistics: A New Kind of Applet for Exploring Probability Distributions

    ERIC Educational Resources Information Center

    Kahle, David

    2014-01-01

    In this article, I introduce a novel applet ("module") for exploring probability distributions, their samples, and various related statistical concepts. The module is primarily designed to be used by the instructor in the introductory course, but it can be used far beyond it as well. It is a free, cross-platform, stand-alone interactive…

  2. CAN'T MISS--conquer any number task by making important statistics simple. Part 2. Probability, populations, samples, and normal distributions.

    PubMed

    Hansen, John P

    2003-01-01

    Healthcare quality improvement professionals need to understand and use inferential statistics to interpret sample data from their organizations. In quality improvement and healthcare research studies all the data from a population often are not available, so investigators take samples and make inferences about the population by using inferential statistics. This three-part series will give readers an understanding of the concepts of inferential statistics as well as the specific tools for calculating confidence intervals for samples of data. This article, Part 2, describes probability, populations, and samples. The uses of descriptive and inferential statistics are outlined. The article also discusses the properties and probability of normal distributions, including the standard normal distribution.

  3. The Finite-Size Scaling Relation for the Order-Parameter Probability Distribution of the Six-Dimensional Ising Model

    NASA Astrophysics Data System (ADS)

    Merdan, Ziya; Karakuş, Özlem

    2016-11-01

    The six-dimensional Ising model with nearest-neighbor pair interactions has been simulated and verified numerically on the Creutz Cellular Automaton by using five-bit demons near the infinite-lattice critical temperature with the linear dimensions L=4,6,8,10. The order-parameter probability distribution for the six-dimensional Ising model has been calculated at the critical temperature. The constants of the analytical function have been estimated by fitting to the probability function obtained numerically at the finite-size critical point.

  4. Neural correlates of the divergence of instrumental probability distributions.

    PubMed

    Liljeholm, Mimi; Wang, Shuo; Zhang, June; O'Doherty, John P

    2013-07-24

    Flexible action selection requires knowledge about how alternative actions impact the environment: a "cognitive map" of instrumental contingencies. Reinforcement learning theories formalize this map as a set of stochastic relationships between actions and states, such that for any given action considered in a current state, a probability distribution is specified over possible outcome states. Here, we show that activity in the human inferior parietal lobule correlates with the divergence of such outcome distributions-a measure that reflects whether discrimination between alternative actions increases the controllability of the future-and, further, that this effect is dissociable from those of other information theoretic and motivational variables, such as outcome entropy, action values, and outcome utilities. Our results suggest that, although ultimately combined with reward estimates to generate action values, outcome probability distributions associated with alternative actions may be contrasted independently of valence computations, to narrow the scope of the action selection problem.
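
    The divergence measure at the heart of the study can be illustrated with a generalized Jensen-Shannon divergence across the outcome distributions of candidate actions; the distributions below are invented for illustration.

```python
import numpy as np

def kl(p, q):
    """Kullback-Leibler divergence in bits, restricted to p's support."""
    mask = p > 0
    return np.sum(p[mask] * np.log2(p[mask] / q[mask]))

def js_divergence(dists):
    """Jensen-Shannon divergence among outcome distributions
    (each row: P(outcome state | action)), with equal weights."""
    dists = np.asarray(dists, dtype=float)
    m = dists.mean(axis=0)              # mixture of the action distributions
    return np.mean([kl(p, m) for p in dists])

# Actions with very different consequences -> high divergence, i.e.
# discriminating between them increases control over the future.
print(js_divergence([[0.9, 0.1, 0.0],
                     [0.0, 0.1, 0.9]]))
# Nearly interchangeable actions -> divergence near zero.
print(js_divergence([[0.40, 0.30, 0.30],
                     [0.35, 0.30, 0.35]]))
```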

  5. Qualitative fusion technique based on information poor system and its application to factor analysis for vibration of rolling bearings

    NASA Astrophysics Data System (ADS)

    Xia, Xintao; Wang, Zhongyu

    2008-10-01

    For some methods of stability analysis of a system using statistics, it is difficult to resolve the problems of unknown probability distribution and small samples. Therefore, a novel method is proposed in this paper to resolve these problems. This method is independent of the probability distribution and is useful for small-sample systems. After rearrangement of the original data series, the order difference and two polynomial membership functions are introduced to estimate the true value, the lower bound, and the upper bound of the system using fuzzy-set theory. Then the empirical distribution function is investigated to ensure a confidence level above 95%, and the degree of similarity is presented to evaluate the stability of the system. Computer simulation cases investigate stable systems with various probability distributions, unstable systems with linear systematic errors and periodic systematic errors, and some mixed systems. The proposed method of analysis for system stability is thereby validated.

  6. A double hit model for the distribution of time to AIDS onset

    NASA Astrophysics Data System (ADS)

    Chillale, Nagaraja Rao

    2013-09-01

    Incubation time is a key epidemiologic descriptor of an infectious disease. In the case of HIV infection it is a random variable, and probably the longest such incubation period. The probability distribution of incubation time is the major determinant of the relation between the incidence of HIV infection and its manifestation as AIDS. This is also one of the key factors used for accurate estimation of AIDS incidence in a region. The present article i) briefly reviews the work done, points out uncertainties in the estimation of AIDS onset time, and stresses the need for its precise estimation, ii) highlights some of the modelling features of the onset distribution, including the immune failure mechanism, and iii) proposes a 'Double Hit' model for the distribution of time to AIDS onset in the cases of (a) independent and (b) dependent time variables of the two markers, and examines the applicability of a few standard probability models.

  7. TK3 eBook software to author, distribute, and use electronic course content for medical education.

    PubMed

    Morton, David A; Foreman, K Bo; Goede, Patricia A; Bezzant, John L; Albertine, Kurt H

    2007-03-01

    The methods for authoring and distributing course content are undergoing substantial changes due to advancements in computer technology. Paper has been the traditional medium for authoring and distributing course content. Paper enables students to personalize content through highlighting and note taking but does not enable the incorporation of multimedia elements. Computers enable multimedia content but have traditionally lacked the means for users to personalize content. Therefore, we investigated TK3 eBooks as a potential solution incorporating the benefits of both paper and computer technology. The objective of our study was to assess the utility of TK3 eBooks in the context of authoring and distributing dermatology course content for use by second-year medical students at the University of Utah School of Medicine during the spring of 2004. We incorporated all dermatology course content into TK3 eBook format. TK3 eBooks enable students to personalize information through tools such as "notebook," "hiliter," and "stickies," page marking, and keyword search. Students were given the course content in both paper and eBook formats. At the conclusion of the dermatology course, students completed a questionnaire designed to evaluate the effectiveness of the eBooks compared with paper. Students perceived eBooks as an effective way to distribute course content and as a study tool. However, students preferred paper over eBooks for taking notes during lecture. In conclusion, the present study demonstrated that eBooks provide a convenient method for authoring, distributing, and using course content, but that students preferred paper for taking notes during lecture.

  8. A Possible Origin of the Gas Hydrate in Southwest Taiwan Offshore Area

    NASA Astrophysics Data System (ADS)

    Lee, C.; Lee, J.; Oung, J.

    2003-12-01

    Southwest Taiwan is located at the eastward subduction zone of the Eurasian plate, which is currently converging with the Philippine Sea plate at a rate of several centimeters per year. The geological setting of this region is characterized by thick accreted sediments up to several kilometers, numerous submarine canyons, active faults, and mud diapirs/volcanoes. The origin of the mud diapirs/volcanoes is probably related to the plate convergence. During the tectonic processes, organic matter was "cooked" thermogenically and biogenically to produce natural gases, and possibly oil, in the sediment. Beneath the seafloor, if the natural gases are at the appropriate temperature and pressure conditions, they form gas hydrate, which is preserved in the top sediment layers. Gas hydrate forms at water depths of about 300 to 3000 meters in this region. In seismic profiles, the bottom simulating reflector (BSR) probably represents the boundary between the solid-state and gas-state natural gas, and is regarded as an important marker for the existence of gas hydrate. It is extensively distributed in the continental margin off southwest Taiwan, but is unstable, especially along the active fault zones, where natural gas as well as mud and hydraulic fluid in the deep sediment are pushed into the surface layer. To investigate the relationship between mud diapirs and gas hydrate, we conducted geophysical and geological surveys: a 38/150 kHz high-frequency echo sounder system was used to guide and select sites over mud diapirs, and 1-3 m gravity core samples were taken. We then adopted an upside-down "headspace" tin-can technique, commonly used by oil companies, to preserve the gases, and used gas chromatography to analyze their contents. The first results show the existence of methane, ethane, propane, and possibly other higher hydrocarbons in the core samples. Methane is the most abundant gas, up to 1859 parts per million by volume (ppm); the others are not significant, probably due to leakage during sampling and transportation. We reduced the "headspace" in order to preserve more concentrated gases in a second attempt, and the results were similar. Nonetheless, our results suggest that the gases are probably a mixture of thermogenic and biogenic origin. Given the presence of higher hydrocarbons, we believe that the thermogenic gases are produced in the deep source sediments, while shallow biogenic methane mixes with them in the top sediment. In the mud diapir/volcano areas, the contents of natural gases are usually higher than on the flat seafloor. As several high gas values have been found in the nearshore area (e.g., 1604 ppm of C1 plus C2 and C3 at a water depth of 23 m), we suggest that the 300-3000 m gas hydrate zone is probably in a dynamic balance in which deep gases continuously migrate to the BSR while free gases escape from this zone. Our data illustrate the potential existence of natural gases in this region; however, we cannot quantify the reserve at this time. Further investigations with longer cores and improved techniques are needed.

  9. Learning Probabilities From Random Observables in High Dimensions: The Maximum Entropy Distribution and Others

    NASA Astrophysics Data System (ADS)

    Obuchi, Tomoyuki; Cocco, Simona; Monasson, Rémi

    2015-11-01

    We consider the problem of learning a target probability distribution over a set of N binary variables from the knowledge of the expectation values (with this target distribution) of M observables, drawn uniformly at random. The space of all probability distributions compatible with these M expectation values within some fixed accuracy, called the version space, is studied. We introduce a biased measure over the version space, which gives a boost increasing exponentially with the entropy of the distributions and with an arbitrary inverse 'temperature' Γ. The choice of Γ allows us to interpolate smoothly between the unbiased measure over all distributions in the version space (Γ = 0) and the pointwise measure concentrated at the maximum entropy distribution (Γ → ∞). Using the replica method we compute the volume of the version space and other quantities of interest, such as the distance R between the target distribution and the center-of-mass distribution over the version space, as functions of α = (log M)/N and Γ for large N. Phase transitions at critical values of α are found, corresponding to qualitative improvements in the learning of the target distribution and to the decrease of the distance R. However, for fixed α the distance R does not vary with Γ, which means that the maximum entropy distribution is not closer to the target distribution than any other distribution compatible with the observable values. Our results are confirmed by Monte Carlo sampling of the version space for small system sizes (N ≤ 10).

  10. Microplastic pollution in lakes and lake shoreline sediments - A case study on Lake Bolsena and Lake Chiusi (central Italy).

    PubMed

    Fischer, Elke Kerstin; Paglialonga, Lisa; Czech, Elisa; Tamminga, Matthias

    2016-06-01

    Rivers and effluents have been identified as major pathways for microplastics of terrestrial sources. Moreover, lakes of different dimensions, even in remote locations, contain microplastics in striking abundances. This study investigates concentrations of microplastic particles at two lakes in central Italy (Lake Bolsena, Lake Chiusi). A total of six Manta Trawl samples were carried out, two of them one day after heavy winds occurred on Lake Bolsena, showing effects on the particle distribution of fragments and fibers of varying size categories. Additionally, 36 sediment samples from lakeshores were analyzed for microplastic content. In the surface waters, 2.68 to 3.36 particles/m³ (Lake Chiusi) and 0.82 to 4.42 particles/m³ (Lake Bolsena) were detected, respectively. The main differences between the lakes are attributed to lake characteristics such as surface and catchment area, depth, and the presence of local wind patterns and tide range at Lake Bolsena. An event of heavy winds and moderate rainfall prior to one sampling led to an increase of concentrations at Lake Bolsena, which is most probably related to lateral land-based and sewage effluent inputs. The abundances of microplastic particles in sediments vary from mean values of 112 (Lake Bolsena) to 234 particles/kg dry weight (Lake Chiusi). The Lake Chiusi results reveal elevated fiber concentrations compared with those of Lake Bolsena, which might be a result of higher organic content and a shift in grain size distribution towards the silt and clay fraction at the shallow and highly eutrophic Lake Chiusi. The distribution of particles along different beach levels revealed no significant differences. Copyright © 2016 Elsevier Ltd. All rights reserved.

  11. Stochastic multicomponent reactive transport analysis of low quality drainage release from waste rock piles: Controls of the spatial distribution of acid generating and neutralizing minerals

    NASA Astrophysics Data System (ADS)

    Pedretti, Daniele; Mayer, K. Ulrich; Beckie, Roger D.

    2017-06-01

    In mining environmental applications, it is important to assess water quality from waste rock piles (WRPs) and estimate the likelihood of acid rock drainage (ARD) over time. The mineralogical heterogeneity of WRPs is a source of uncertainty in this assessment, undermining the reliability of traditional bulk indicators used in the industry. We focused in this work on the bulk neutralizing potential ratio (NPR), which is defined as the ratio of the content of non-acid-generating minerals (typically reactive carbonates such as calcite) to the content of potentially acid-generating minerals (typically sulfides such as pyrite). We used a streamtube-based Monte-Carlo method to show why and to what extent bulk NPR can be a poor indicator of ARD occurrence. We simulated ensembles of WRPs identical in their geometry and bulk NPR, which differed only in their initial distribution of the acid-generating and acid-neutralizing minerals that control NPR. All models simulated the same principal acid-producing, acid-neutralizing and secondary mineral-forming processes. We show that small differences in the distribution of local NPR values or in the number of flow paths that generate acidity strongly influence drainage pH. The results indicate that the likelihood of ARD (epitomized by the probability of occurrence of pH < 4 at a mixing boundary) within the first 100 years can be as high as 75% for an NPR = 2 and 40% for an NPR = 4. The latter is traditionally considered a "universally safe" threshold to ensure non-acidic waters in practical applications. Our results suggest that new methods that explicitly account for mineralogical heterogeneity must be sought when computing effective (upscaled) NPR values at the scale of the piles.

  12. Statistical Characterization of the Mechanical Parameters of Intact Rock Under Triaxial Compression: An Experimental Proof of the Jinping Marble

    NASA Astrophysics Data System (ADS)

    Jiang, Quan; Zhong, Shan; Cui, Jie; Feng, Xia-Ting; Song, Leibo

    2016-12-01

    We investigated the statistical characteristics and probability distributions of the mechanical parameters of natural rock using triaxial compression tests. Twenty cores of Jinping marble were tested at each of five levels of confining stress (5, 10, 20, 30, and 40 MPa). From these full stress-strain data, we summarized the numerical characteristics and determined the probability distribution forms of several important mechanical parameters, including deformational parameters, characteristic strengths, characteristic strains, and failure angle. The statistical results presented new information about the marble's probabilistic distribution characteristics: the normal and log-normal distributions were appropriate for describing random strengths of rock; the coefficients of variation of the peak strengths had no relationship to the confining stress; the only acceptable random distribution for both Young's elastic modulus and Poisson's ratio was the log-normal function; and the cohesive strength had a different probability distribution pattern than the frictional angle. The triaxial tests and statistical analysis also provided experimental evidence for deciding the minimum reliable number of experimental samples and for picking appropriate parameter distributions to use in reliability calculations for rock engineering.

  13. Modeling potential distribution of Oligoryzomys longicaudatus, the Andes virus (Genus: Hantavirus) reservoir, in Argentina.

    PubMed

    Andreo, Verónica; Glass, Gregory; Shields, Timothy; Provensal, Cecilia; Polop, Jaime

    2011-09-01

    We constructed a model to predict the potential distribution of Oligoryzomys longicaudatus, the reservoir of Andes virus (Genus: Hantavirus), in Argentina. We developed an extensive database of occurrence records from published studies and our own surveys and compared two methods to model the probability of O. longicaudatus presence: logistic regression and the MaxEnt algorithm. The environmental variables used were tree, grass, and bare soil cover from MODIS imagery, together with altitude and 19 bioclimatic variables from the WorldClim database. The models' performances were evaluated and compared by both threshold-dependent and threshold-independent measures. The best models included tree and grass cover, mean diurnal temperature range, and precipitation of the warmest and coldest seasons. The potential distribution maps for O. longicaudatus predicted the highest occurrence probabilities along the Andes range, from 32°S and narrowing southwards. They also predicted high probabilities for the south-central area of Argentina, reaching the Atlantic coast. The Hantavirus Pulmonary Syndrome (HPS) cases coincided with mean occurrence probabilities of 95 and 77% for the logistic and MaxEnt models, respectively. HPS transmission zones in Argentine Patagonia matched the areas with the highest probability of presence. Therefore, the presence probability of colilargos may provide an approximate risk of transmission and act as an early tool to guide control and prevention plans.

  14. Computation of marginal distributions of peak-heights in electropherograms for analysing single source and mixture STR DNA samples.

    PubMed

    Cowell, Robert G

    2018-05-04

    Current models for single source and mixture samples, and probabilistic genotyping software based on them used for analysing STR electropherogram data, assume simple probability distributions, such as the gamma distribution, to model the allelic peak height variability given the initial amount of DNA prior to PCR amplification. Here we illustrate how amplicon number distributions, for a model of the process of sample DNA collection and PCR amplification, may be efficiently computed by evaluating probability generating functions using discrete Fourier transforms. Copyright © 2018 Elsevier B.V. All rights reserved.
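
    The DFT inversion the abstract describes is easy to sketch: evaluate the PGF at the N-th roots of unity and apply a discrete Fourier transform to recover the probability mass function. The toy per-cycle duplication model of PCR below is an assumption for illustration, not the paper's amplification model.

```python
import numpy as np

def pmf_from_pgf(G, N):
    """Recover p_0..p_{N-1} from a probability generating function G by
    evaluating it at the N-th roots of unity and inverting with a DFT:
    p_k = (1/N) sum_j G(w^j) w^(-j*k), with w = exp(2*pi*i/N)."""
    roots = np.exp(2j * np.pi * np.arange(N) / N)
    vals = np.array([G(z) for z in roots])
    # np.fft.fft computes sum_j a_j exp(-2*pi*i*j*k/N), which is exactly
    # the inversion sum; dividing by N yields the probabilities.
    return np.fft.fft(vals).real / N

def pcr_pgf(z, n0=3, p=0.7, cycles=4):
    """Toy amplification model (an assumption, not the paper's): each
    molecule duplicates with probability p in every cycle, starting
    from n0 molecules."""
    g = z
    for _ in range(cycles):
        g = (1 - p) * g + p * g**2   # one-cycle pgf f(z) = (1-p)z + p z^2
    return g**n0

probs = pmf_from_pgf(pcr_pgf, 64)    # support (max 3 * 2^4 = 48) fits in 64
print("total mass:", probs.sum().round(6))
print("mean amplicons:", (np.arange(64) * probs).sum().round(3))  # n0*(1+p)^4
```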

  15. An Integrated Framework for Model-Based Distributed Diagnosis and Prognosis

    DTIC Science & Technology

    2012-09-01

    y(0 : t) denotes all measurements observed up to time t. The goal of prognosis is to determine the end of (useful) life (EOL) of a system, and/or its remaining useful life (RUL). For a given fault f, using the fault estimate p(x_f(t), θ_f(t) | y(0 : t)), a probability distribution of EOL, p(EOL_f(t_P)…. …is stochastic; EOL/RUL are random variables and we represent them by probability distributions. The acceptable behavior of the system is expressed…

  16. Secure and Robust Overlay Content Distribution

    ERIC Educational Resources Information Center

    Kang, Hun Jeong

    2010-01-01

    With the success of applications spurring the tremendous increase in the volume of data transfer, efficient and reliable content distribution has become a key issue. Peer-to-peer (P2P) technology has gained popularity as a promising approach to large-scale content distribution due to its benefits including self-organizing, load-balancing, and…

  17. Estimating Bayesian Phylogenetic Information Content

    PubMed Central

    Lewis, Paul O.; Chen, Ming-Hui; Kuo, Lynn; Lewis, Louise A.; Fučíková, Karolina; Neupane, Suman; Wang, Yu-Bo; Shi, Daoyuan

    2016-01-01

    Measuring the phylogenetic information content of data has a long history in systematics. Here we explore a Bayesian approach to information content estimation. The entropy of the posterior distribution compared with the entropy of the prior distribution provides a natural way to measure information content. If the data have no information relevant to ranking tree topologies beyond the information supplied by the prior, the posterior and prior will be identical. Information in data discourages consideration of some hypotheses allowed by the prior, resulting in a posterior distribution that is more concentrated (has lower entropy) than the prior. We focus on measuring information about tree topology using marginal posterior distributions of tree topologies. We show that both the accuracy and the computational efficiency of topological information content estimation improve with use of the conditional clade distribution, which also allows topological information content to be partitioned by clade. We explore two important applications of our method: providing a compelling definition of saturation and detecting conflict among data partitions that can negatively affect analyses of concatenated data. [Bayesian; concatenation; conditional clade distribution; entropy; information; phylogenetics; saturation.] PMID:27155008
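
    The prior-versus-posterior entropy idea reduces to a short calculation; the topology probabilities below are invented for illustration (15 topologies corresponds to the unrooted trees on five taxa).

```python
import numpy as np

def entropy_bits(p):
    """Shannon entropy in bits, ignoring zero-probability entries."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Flat prior over the 15 unrooted tree topologies for five taxa.
prior = np.full(15, 1 / 15)
# Hypothetical posterior concentrated on a few topologies after the data.
posterior = np.array([0.70, 0.20, 0.05, 0.03, 0.02] + [0.0] * 10)

info = entropy_bits(prior) - entropy_bits(posterior)
print(f"prior H = {entropy_bits(prior):.2f} bits, "
      f"posterior H = {entropy_bits(posterior):.2f} bits, "
      f"information ~ {info:.2f} bits")
```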

  18. Factors controlling spatial distribution patterns of biocrusts in a heterogeneous and topographically complex semiarid area

    NASA Astrophysics Data System (ADS)

    Chamizo, Sonia; Rodríguez-Caballero, Emilio; Roncero, Beatriz; Raúl Román, José; Cantón, Yolanda

    2016-04-01

    Biocrusts are widespread soil components in drylands all over the world. They are known to play key roles in the functioning of these regions by fixing carbon and nitrogen, regulating hydrological processes, and protecting against water and wind erosion, thus reducing the loss of soil resources and increasing soil fertility. The rate and magnitude of services provided by biocrusts greatly depend on their composition and developmental stage. Late-successional biocrusts such as lichens and mosses have higher carbon and nitrogen fixation rates, and confer greater protection against erosion and the loss of sediments and nutrients, than early-successional algae and cyanobacteria biocrusts. Knowledge of the spatial distribution patterns of different biocrust types, and of the factors that control their distribution, is important to assess ecosystem services provided by biocrusts at large spatial scales and to improve modelling of biogeochemical processes and water and carbon balances in drylands. Some of the factors that condition biocrust cover and composition are incoming solar radiation, terrain attributes, vegetation distribution patterns, microclimatic variables, and soil properties such as soil pH, texture, soil organic matter, soil nutrients, and gypsum and CaCO3 content. However, the factors that govern biocrust distribution may vary from one site to another depending on site characteristics. In this study, we examined the influence of abiotic attributes on the spatial distribution of biocrust types in a complex heterogeneous badland system (Tabernas, SE Spain) where biocrusts cover up to 50% of the soil surface. From the analysis of relationships between terrain attributes and the proportional abundance of biocrust types, it was found that topography exerted a main control on the spatial distribution of biocrust types in this area. SW-facing slopes were dominated by physical soil crusts and were practically devoid of vegetation and biocrusts. Biocrusts mainly occupied the pediments and NE-facing slopes. Cyanobacteria biocrusts were predominant in the pediments, probably because of their higher capacity to produce UV-protective pigments such as carotenoids and to survive in zones of higher incident solar radiation. Lichen biocrusts showed a preference for NE-facing slopes which, despite being less stable than the pediments, were exposed to less insolation and probably maintained moisture availability longer. Moreover, some differences were observed between lichen species. While Diploschistes diacapsis and Squamarina lentigera were widely distributed from gentle to steep NE-facing slopes, the distribution of Lepraria sp. was restricted to steep N-facing slopes, where the predominance of shade extended the periods of soil moisture availability.

  19. Encounter risk analysis of rainfall and reference crop evapotranspiration in the irrigation district

    NASA Astrophysics Data System (ADS)

    Zhang, Jinping; Lin, Xiaomin; Zhao, Yong; Hong, Yang

    2017-09-01

    Rainfall and reference crop evapotranspiration are random but mutually dependent variables in an irrigation district, and their encounter situation determines water shortage risks in the context of natural water supply and demand. However, in reality, rainfall and reference crop evapotranspiration may have different marginal distributions, and their relations are nonlinear. In this study, based on annual rainfall and reference crop evapotranspiration data series from 1970 to 2013 in the Luhun irrigation district of China, the joint probability distribution of rainfall and reference crop evapotranspiration is developed with the Frank copula function. Using the joint probability distribution, the synchronous-asynchronous encounter risk, conditional joint probability, and conditional return period of different combinations of rainfall and reference crop evapotranspiration are analyzed. The results show that the copula-based joint probability distributions of rainfall and reference crop evapotranspiration are reasonable. The asynchronous encounter probability of rainfall and reference crop evapotranspiration is greater than their synchronous encounter probability, and the water shortage risk associated with meteorological drought (i.e. rainfall variability) is more prone to appear. Compared with other states, there is a higher conditional joint probability and a lower conditional return period under either low rainfall or high reference crop evapotranspiration. For a specified high reference crop evapotranspiration with a certain frequency, the encounter risk of low rainfall and high reference crop evapotranspiration increases as the frequency decreases. For a specified low rainfall with a certain frequency, the encounter risk of low rainfall and high reference crop evapotranspiration decreases as the frequency decreases. When either the high reference crop evapotranspiration exceeds a certain frequency or the low rainfall does not exceed a certain frequency, the higher conditional joint probability and lower conditional return period of various combinations are likely to cause a water shortage, though not a severe one.
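
    A sketch of the synchronous/asynchronous encounter calculation, assuming an illustrative Frank copula parameter and the common 37.5%/62.5% wet-normal-dry split rather than the paper's fitted values:

```python
import numpy as np

def frank_cdf(u, v, theta):
    """Frank copula C(u, v; theta)."""
    num = np.expm1(-theta * u) * np.expm1(-theta * v)
    return -np.log1p(num / np.expm1(-theta)) / theta

theta = -3.0        # negative theta: rainfall and ET tend to vary oppositely
a, b = 0.375, 0.625 # category cut points on the uniform-transformed margins

def rect(u1, u2, v1, v2):
    """P(u1 < U <= u2, v1 < V <= v2) by copula inclusion-exclusion."""
    return (frank_cdf(u2, v2, theta) - frank_cdf(u1, v2, theta)
            - frank_cdf(u2, v1, theta) + frank_cdf(u1, v1, theta))

# Synchronous encounter: both variables fall in the same category.
sync = rect(0, a, 0, a) + rect(a, b, a, b) + rect(b, 1, b, 1)
print(f"synchronous = {sync:.3f}, asynchronous = {1 - sync:.3f}")
```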

  20. Tsallis non-extensive statistical mechanics in the ionospheric detrended total electron content during quiet and storm periods

    NASA Astrophysics Data System (ADS)

    Ogunsua, B. O.; Laoye, J. A.

    2018-05-01

    In this paper, Tsallis non-extensive q-statistics in ionospheric dynamics were investigated using the total electron content (TEC) obtained from two Global Positioning System (GPS) receiver stations. The investigation was carried out for both geomagnetically quiet and storm periods. The micro-density variation of the ionospheric total electron content was extracted from the TEC data by detrending. The detrended total electron content, which represents the variation in the internal dynamics of the system, was further analyzed within non-extensive statistical mechanics using q-Gaussian methods. Our results reveal that for all the analyzed data sets a Tsallis Gaussian probability distribution (q-Gaussian) with q > 1 was obtained. It was observed that there is no distinct difference in pattern between the values of q_quiet and q_storm. However, the values of q vary with geophysical conditions and possibly with local dynamics for the two stations. Also observed are the asymmetric pattern of the q-Gaussian and a highly significant level of correlation between the q-index values obtained for the storm periods, compared with the quiet periods, at the two GPS receiver stations where the TEC was measured. The factors responsible for this variation can mostly be attributed to the varying mechanisms resulting in the self-reorganization of the system dynamics during storm periods. The results show the existence of long-range correlation for both quiet and storm periods at the two stations.

  1. Bayesian inversion of data from effusive volcanic eruptions using physics-based models: Application to Mount St. Helens 2004--2008

    USGS Publications Warehouse

    Anderson, Kyle; Segall, Paul

    2013-01-01

    Physics-based models of volcanic eruptions can directly link magmatic processes with diverse, time-varying geophysical observations, and when used in an inverse procedure make it possible to bring all available information to bear on estimating properties of the volcanic system. We develop a technique for inverting geodetic, extrusive flux, and other types of data using a physics-based model of an effusive silicic volcanic eruption to estimate the geometry, pressure, depth, and volatile content of a magma chamber, and properties of the conduit linking the chamber to the surface. A Bayesian inverse formulation makes it possible to easily incorporate independent information into the inversion, such as petrologic estimates of melt water content, and yields probabilistic estimates for model parameters and other properties of the volcano. Probability distributions are sampled using a Markov-Chain Monte Carlo algorithm. We apply the technique using GPS and extrusion data from the 2004–2008 eruption of Mount St. Helens. In contrast to more traditional inversions such as those involving geodetic data alone in combination with kinematic forward models, this technique is able to provide constraint on properties of the magma, including its volatile content, and on the absolute volume and pressure of the magma chamber. Results suggest a large chamber of >40 km³ with a centroid depth of 11–18 km and a dissolved water content at the top of the chamber of 2.6–4.9 wt%.
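
    The Bayesian sampling machinery can be illustrated with a toy random-walk Metropolis sampler; the exponentially decaying extrusion-rate model and its parameters below are stand-ins for the paper's physics-based forward model, chosen only to keep the sketch self-contained.

```python
import numpy as np

rng = np.random.default_rng(4)

def log_posterior(params, t, obs, sigma=0.3):
    """Toy log-posterior: decaying extrusion rate with unknown scale q0
    and time constant tau (stand-ins for chamber pressure/volume
    parameters), with flat priors on positive values."""
    q0, tau = params
    if q0 <= 0 or tau <= 0:
        return -np.inf
    pred = q0 * np.exp(-t / tau)
    return -0.5 * np.sum(((obs - pred) / sigma)**2)

# Synthetic "extrusion rate" observations.
t = np.linspace(0, 4, 25)
obs = 2.0 * np.exp(-t / 1.5) + rng.normal(0, 0.3, t.size)

# Random-walk Metropolis sampling of the posterior.
chain, x = [], np.array([1.0, 1.0])
lp = log_posterior(x, t, obs)
for _ in range(20000):
    prop = x + rng.normal(0, 0.05, 2)
    lp_prop = log_posterior(prop, t, obs)
    if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis accept step
        x, lp = prop, lp_prop
    chain.append(x.copy())
posterior = np.array(chain[5000:])             # drop burn-in
print("q0, tau posterior means:", posterior.mean(axis=0).round(2))
```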

  2. Study on probability distribution of prices in electricity market: A case study of zhejiang province, china

    NASA Astrophysics Data System (ADS)

    Zhou, H.; Chen, B.; Han, Z. X.; Zhang, F. Q.

    2009-05-01

    The study of the probability density function and distribution function of electricity prices helps power suppliers and purchasers estimate their positions accurately, and helps the regulator monitor periods deviating from the normal distribution. Based on the assumption of normally distributed load and the non-linear characteristic of the aggregate supply curve, this paper derives the distribution of electricity prices as a function of the random load variable. The conclusion has been validated with electricity price data from the Zhejiang market. The results show that electricity prices obey an approximately normal distribution only when the supply-demand relationship is loose, whereas the prices deviate from the normal distribution and exhibit a strong right-skewness characteristic when supply is tight. Finally, real electricity markets also display a narrow-peak characteristic when undersupply occurs.
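
    The mechanism is easy to reproduce by Monte Carlo: push a normally distributed load through a convex aggregate supply curve, and the price distribution stays near-normal while the curve is locally flat but becomes strongly right-skewed once the mean load reaches the steep part. The curve and load parameters below are illustrative assumptions, not the Zhejiang market's.

```python
import numpy as np
from scipy.stats import skew

rng = np.random.default_rng(5)

def supply_price(load):
    """Illustrative convex aggregate supply curve: nearly linear at low
    load, steepening sharply as load approaches the capacity region."""
    return 20.0 + 0.02 * load + 0.5 * np.exp((load - 2000.0) / 100.0)

for mean_load, label in [(1200.0, "loose supply-demand"),
                         (2100.0, "tight supply-demand")]:
    load = rng.normal(mean_load, 150.0, 100_000)  # normally distributed load
    price = supply_price(load)
    print(f"{label}: mean price = {price.mean():8.1f}, "
          f"skewness = {skew(price):5.2f}")
```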

  3. Classic Maximum Entropy Recovery of the Average Joint Distribution of Apparent FRET Efficiency and Fluorescence Photons for Single-molecule Burst Measurements

    PubMed Central

    DeVore, Matthew S.; Gull, Stephen F.; Johnson, Carey K.

    2012-01-01

    We describe a method for analysis of single-molecule Förster resonance energy transfer (FRET) burst measurements using classic maximum entropy. Classic maximum entropy determines the Bayesian inference for the joint probability describing the total fluorescence photons and the apparent FRET efficiency. The method was tested with simulated data and then with DNA labeled with fluorescent dyes. The most probable joint distribution can be marginalized to obtain both the overall distribution of fluorescence photons and the apparent FRET efficiency distribution. This method proves to be ideal for determining the distance distribution of FRET-labeled biomolecules, and it successfully predicts the shape of the recovered distributions. PMID:22338694

  4. The effect of microscopic friction and size distributions on conditional probability distributions in soft particle packings

    NASA Astrophysics Data System (ADS)

    Saitoh, Kuniyasu; Magnanimo, Vanessa; Luding, Stefan

    2017-10-01

    Employing two-dimensional molecular dynamics (MD) simulations of soft particles, we study their non-affine responses to quasi-static isotropic compression where the effects of microscopic friction between the particles in contact and particle size distributions are examined. To quantify complicated restructuring of force-chain networks under isotropic compression, we introduce the conditional probability distributions (CPDs) of particle overlaps such that a master equation for distribution of overlaps in the soft particle packings can be constructed. From our MD simulations, we observe that the CPDs are well described by q-Gaussian distributions, where we find that the correlation for the evolution of particle overlaps is suppressed by microscopic friction, while it significantly increases with the increase of poly-dispersity.

  5. Ordinal probability effect measures for group comparisons in multinomial cumulative link models.

    PubMed

    Agresti, Alan; Kateri, Maria

    2017-03-01

    We consider simple ordinal model-based probability effect measures for comparing distributions of two groups, adjusted for explanatory variables. An "ordinal superiority" measure summarizes the probability that an observation from one distribution falls above an independent observation from the other distribution, adjusted for explanatory variables in a model. The measure applies directly to normal linear models and to a normal latent variable model for ordinal response variables. It equals Φ(β/√2) for the corresponding ordinal model that applies a probit link function to cumulative multinomial probabilities, for standard normal cdf Φ and effect β that is the coefficient of the group indicator variable. For the more general latent variable model for ordinal responses that corresponds to a linear model with other possible error distributions and corresponding link functions for cumulative multinomial probabilities, the ordinal superiority measure equals exp(β)/[1+exp(β)] with the log-log link and equals approximately exp(β/√2)/[1+exp(β/√2)] with the logit link, where β is the group effect. Another ordinal superiority measure generalizes the difference of proportions from binary to ordinal responses. We also present related measures directly for ordinal models for the observed response that need not assume corresponding latent response models. We present confidence intervals for the measures and illustrate with an example. © 2016, The International Biometric Society.

  6. Sampling--how big a sample?

    PubMed

    Aitken, C G

    1999-07-01

    It is thought that, in a consignment of discrete units, a certain proportion of the units contain illegal material. A sample of the consignment is to be inspected. Various methods for the determination of the sample size are compared. The consignment will be considered as a random sample from some super-population of units, a certain proportion of which contain drugs. For large consignments, a probability distribution, known as the beta distribution, for the proportion of the consignment which contains illegal material is obtained. This distribution is based on prior beliefs about the proportion. Under certain specific conditions the beta distribution gives the same numerical results as an approach based on the binomial distribution. The binomial distribution provides a probability for the number of units in a sample which contain illegal material, conditional on knowing the proportion of the consignment which contains illegal material. This is in contrast to the beta distribution which provides probabilities for the proportion of a consignment which contains illegal material, conditional on knowing the number of units in the sample which contain illegal material. The interpretation when the beta distribution is used is much more intuitively satisfactory. It is also much more flexible in its ability to cater for prior beliefs which may vary given the different circumstances of different crimes. For small consignments, a distribution, known as the beta-binomial distribution, for the number of units in the consignment which are found to contain illegal material, is obtained, based on prior beliefs about the number of units in the consignment which are thought to contain illegal material. As with the beta and binomial distributions for large samples, it is shown that, in certain specific conditions, the beta-binomial and hypergeometric distributions give the same numerical results. However, the beta-binomial distribution, as with the beta distribution, has a more intuitively satisfactory interpretation and greater flexibility. The beta and the beta-binomial distributions provide methods for the determination of the minimum sample size to be taken from a consignment in order to satisfy a certain criterion. The criterion requires the specification of a proportion and a probability.
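
    For the large-consignment case with a uniform prior, the beta-posterior sample-size rule has a closed form; the sketch below assumes all inspected units test positive, the textbook special case, so the posterior for the proportion is Beta(n+1, 1).

```python
import math

def min_sample_size(theta0=0.5, confidence=0.95):
    """Smallest n such that, with a uniform Beta(1, 1) prior and all n
    inspected units found to contain drugs, the posterior Beta(n+1, 1)
    gives P(proportion > theta0) >= confidence; here
    P(theta > theta0) = 1 - theta0**(n + 1)."""
    n = math.ceil(math.log(1 - confidence) / math.log(theta0) - 1)
    return max(n, 1)

# To be 95% sure more than half the consignment contains drugs:
print(min_sample_size(0.5, 0.95))   # -> 4 (if all 4 test positive)
# To be 99% sure more than 90% of the consignment contains drugs:
print(min_sample_size(0.9, 0.99))   # -> 43
```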

  7. Organic-rich sediments in ventilated deep-sea environments: Relationship to climate, sea level, and trophic changes

    NASA Astrophysics Data System (ADS)

    Bertrand, P.; Pedersen, T. F.; Schneider, R.; Shimmield, G.; Lallier-Verges, E.; Disnar, J. R.; Massias, D.; Villanueva, J.; Tribovillard, N.; Huc, A. Y.; Giraud, X.; Pierre, C.; VéNec-Peyré, M.-T.

    2003-02-01

    Sediments on the Namibian Margin in the SE Atlantic between water depths of ~1000 and ~3600 m are highly enriched in hydrocarbon-prone organic matter. Such sedimentation has occurred for more than 2 million years and is geographically distributed over hundreds of kilometers along the margin, so that the sediments of this region contain a huge concentrated stock of organic carbon. It is shown here that most of the variability in organic content is due to relative dilution by buried carbonates. This reflects both export productivity and diagenetic dissolution, not differences in either water column or bottom water anoxia and related enhanced preservation of organic matter. These observations offer a new mechanism for the formation of potential source rocks in a well-ventilated open ocean, in this case the South Atlantic. The organic richness is discussed in terms of a suite of probable controls including local wind-driven productivity (upwelling), trophic conditions, transfer efficiency, diagenetic processes, and climate-related sea level and deep circulation. The probability of past occurrences of such organic-rich facies in equivalent oceanographic settings at the edge of large oceanic basins should be carefully considered in deep offshore exploration.

  8. A Study of Strengthening Secondary Mathematics Teachers' Knowledge of Statistics and Probability via Professional Development

    ERIC Educational Resources Information Center

    DeVaul, Lina

    2017-01-01

    A professional development program (PSPD) was implemented to improve in-service secondary mathematics teachers' content knowledge, pedagogical knowledge, and self-efficacy in teaching secondary school statistics and probability. Participants generated a teaching resource website at the conclusion of the PSPD program. Participants' content…

  9. The maximum entropy method of moments and Bayesian probability theory

    NASA Astrophysics Data System (ADS)

    Bretthorst, G. Larry

    2013-08-01

    The problem of density estimation occurs in many disciplines. For example, in MRI it is often necessary to classify the types of tissues in an image. To perform this classification one must first identify the characteristics of the tissues to be classified. These characteristics might be the intensity of a T1-weighted image, and in MRI many other types of characteristic weightings (classifiers) may be generated. In a given tissue type there is no single intensity that characterizes the tissue; rather, there is a distribution of intensities. Often these distributions can be characterized by a Gaussian, but just as often they are much more complicated. Either way, estimating the distribution of intensities is an inference problem. In the case of a Gaussian distribution, one must estimate the mean and standard deviation. However, in the non-Gaussian case the shape of the density function itself must be inferred. Three common techniques for estimating density functions are binned histograms [1, 2], kernel density estimation [3, 4], and the maximum entropy method of moments [5, 6]. In the introduction, the maximum entropy method of moments is reviewed, along with some of its problems and the conditions under which it fails. In later sections, the functional form of the maximum entropy method of moments probability distribution is incorporated into Bayesian probability theory. It is shown that Bayesian probability theory solves all of the problems with the maximum entropy method of moments: one obtains posterior probabilities for the Lagrange multipliers and, finally, one can put error bars on the resulting estimated density function.
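
    A minimal sketch of the maximum entropy method of moments, solving for the Lagrange multipliers of p(x) ∝ exp(Σ_k λ_k x^k) on a grid; matching a zero mean and unit second moment should recover a (truncated) standard Gaussian. The grid, moment set, and solver choice are assumptions for illustration.

```python
import numpy as np
from scipy.optimize import fsolve

def maxent_density(moments, x):
    """Maximum-entropy density on grid x matching the raw moments
    m_k = E[x^k], k = 1..K, via p(x) proportional to exp(sum_k lam_k x^k)."""
    K = len(moments)

    def unnormalized(lam):
        expo = sum(l * x**(k + 1) for k, l in enumerate(lam))
        return np.exp(np.clip(expo, -700, 700))   # guard against overflow

    def residuals(lam):
        p = unnormalized(lam)
        p /= np.trapz(p, x)                       # normalize on the grid
        return [np.trapz(p * x**(k + 1), x) - m
                for k, m in enumerate(moments)]

    lam = fsolve(residuals, x0=np.zeros(K))       # solve moment equations
    p = unnormalized(lam)
    return p / np.trapz(p, x)

x = np.linspace(-5, 5, 801)
p = maxent_density([0.0, 1.0], x)                 # match E[x]=0, E[x^2]=1
print("mean:", np.trapz(p * x, x).round(3),
      " second moment:", np.trapz(p * x**2, x).round(3))
```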

  10. Precipitation intensity probability distribution modelling for hydrological and construction design purposes

    NASA Astrophysics Data System (ADS)

    Koshinchanov, Georgy; Dimitrov, Dobri

    2008-11-01

    The characteristics of rainfall intensity are important for many purposes, including the design of sewage and drainage systems, tuning flood warning procedures, etc. Those estimates are usually statistical estimates of the intensity of precipitation realized over a certain period of time (e.g. 5, 10 min, etc.) with different return periods (e.g. 20, 100 years, etc.). The traditional approach to evaluating these precipitation intensities is to process the pluviometer records and fit probability distributions to samples of intensities valid for certain locations or regions. Those estimates then become part of the state regulations used for various economic activities. Two problems occur with this approach: 1. Due to various factors, climate conditions change, and the precipitation intensity estimates need regular updates; 2. As the extremes of the probability distribution are of particular importance in practice, the methodology of the distribution fitting needs specific attention to those parts of the distribution. The aim of this paper is to review the existing methodologies for processing intensive rainfalls and to refresh some of the statistical estimates for the studied areas. The methodologies used in Bulgaria for analyzing intensive rainfalls and producing the relevant statistical estimates are: the method of maximum intensity, used in the National Institute of Meteorology and Hydrology to process and decode the pluviometer records, followed by distribution fitting for each precipitation duration period; the same, but with separate modelling of the probability distribution for the middle and high probability quantiles; a method similar to the first one, but with a threshold of 0.36 mm/min of intensity; a method proposed by the Russian hydrologist G. A. Aleksiev for regionalization of estimates over a territory, improved and adapted by S. Gerasimov for Bulgaria; and a method considering only the intensive rainfalls (if any) during the day with the maximal annual daily precipitation total for a given year. Conclusions are drawn on the relevance and adequacy of the applied methods.

  11. Streamflow distribution maps for the Cannon River drainage basin, southeast Minnesota, and the St. Louis River drainage basin, northeast Minnesota

    USGS Publications Warehouse

    Smith, Erik A.; Sanocki, Chris A.; Lorenz, David L.; Jacobsen, Katrin E.

    2017-12-27

    Streamflow distribution maps for the Cannon River and St. Louis River drainage basins were developed by the U.S. Geological Survey, in cooperation with the Legislative-Citizen Commission on Minnesota Resources, to illustrate relative and cumulative streamflow distributions. The Cannon River was selected to provide baseline data to assess the effects of potential surficial sand mining, and the St. Louis River was selected to determine the effects of ongoing Mesabi Iron Range mining. Each drainage basin (Cannon, St. Louis) was subdivided into nested drainage basins: the Cannon River was subdivided into 152 nested drainage basins, and the St. Louis River was subdivided into 353 nested drainage basins. For each smaller drainage basin, the estimated volumes of groundwater discharge (as base flow) and surface runoff flowing into all surface-water features were displayed under the following conditions: (1) extreme low-flow conditions, comparable to an exceedance-probability quantile of 0.95; (2) low-flow conditions, comparable to an exceedance-probability quantile of 0.90; (3) a median condition, comparable to an exceedance-probability quantile of 0.50; and (4) a high-flow condition, comparable to an exceedance-probability quantile of 0.02. Streamflow distribution maps were developed using flow-duration curve exceedance-probability quantiles in conjunction with Soil-Water-Balance model outputs; both the flow-duration curve and Soil-Water-Balance models were built upon previously published U.S. Geological Survey reports. The selected streamflow distribution maps provide a proactive water management tool for State cooperators by illustrating flow rates during a range of hydraulic conditions. Furthermore, after the nested drainage basins are highlighted in terms of surface-water flows, the streamflows can be evaluated in the context of meeting specific ecological flows under different flow regimes and potentially assist with decisions regarding groundwater and surface-water appropriations. Presented streamflow distribution maps are foundational work intended to support the development of additional streamflow distribution maps that include statistical constraints on the selected flow conditions.
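
    The exceedance-probability quantiles used in these maps can be read directly off a flow-duration curve; the sketch below shows the correspondence on synthetic daily flows (values and units are placeholders).

        import numpy as np

        # Flow-duration curve: the streamflow equaled or exceeded a given
        # fraction of the time. Exceedance probability p corresponds to the
        # (1 - p) quantile of the daily-flow record.
        daily_flow = np.random.default_rng(2).lognormal(mean=3.0, sigma=0.8,
                                                        size=3650)

        for p in (0.95, 0.90, 0.50, 0.02):
            q = np.quantile(daily_flow, 1 - p)
            print(f"exceedance {p:.2f}: flow >= {q:.1f} (same units as input)")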

  12. Four-dimensional symmetry from a broad viewpoint. II Invariant distribution of quantized field oscillators and questions on infinities

    NASA Technical Reports Server (NTRS)

    Hsu, J. P.

    1983-01-01

    The foundation of the quantum field theory is changed by introducing a new universal probability principle into field operators: one single inherent and invariant probability distribution P(|k|) is postulated for boson and fermion field oscillators. This can be accomplished only when one treats the four-dimensional symmetry from a broad viewpoint. Special relativity is too restrictive to allow such a universal probability principle. A radical length, R, appears in physics through the probability distribution P(|k|). The force between two point particles vanishes when their relative distance tends to zero. This appears to be a general property for all forces and resembles the property of asymptotic freedom. The usual infinities in vacuum fluctuations and in local interactions, however complicated they may be, are all removed from quantum field theories. In Appendix A a simple finite and unitary theory of unified electroweak interactions is discussed without assuming Higgs scalar bosons.

  13. Probabilistic sensitivity analysis for decision trees with multiple branches: use of the Dirichlet distribution in a Bayesian framework.

    PubMed

    Briggs, Andrew H; Ades, A E; Price, Martin J

    2003-01-01

    In structuring decision models of medical interventions, it is commonly recommended that only 2 branches be used for each chance node to avoid logical inconsistencies that can arise during sensitivity analyses if the branching probabilities do not sum to 1. However, information may be naturally available in an unconditional form, and structuring a tree in conditional form may complicate rather than simplify the sensitivity analysis of the unconditional probabilities. Current guidance emphasizes probabilistic sensitivity analysis, and a method is required to generate probabilities over multiple branches that appropriately represents uncertainty while satisfying the requirement that mutually exclusive event probabilities sum to 1. The authors argue that the Dirichlet distribution, the multivariate equivalent of the beta distribution, is appropriate for this purpose and illustrate its use for generating a fully probabilistic transition matrix for a Markov model. Furthermore, they demonstrate that by adopting a Bayesian approach, the problem of observing zero counts for transitions of interest can be overcome.
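
    A brief sketch of the approach the authors advocate, with invented transition counts: Dirichlet draws give branch probabilities that always sum to 1, and adding a flat prior handles zero counts in the Bayesian spirit described above.

        import numpy as np

        # Probabilistic-sensitivity-analysis samples for a 3-branch chance node
        # drawn from a Dirichlet posterior. With a flat prior, observed
        # transition counts simply become Dirichlet parameters (+1), so a zero
        # count still yields a nonzero branch probability.
        counts = np.array([43, 12, 0])             # hypothetical observed transitions
        rng = np.random.default_rng(3)
        samples = rng.dirichlet(counts + 1, size=10_000)

        assert np.allclose(samples.sum(axis=1), 1.0)   # branch probabilities sum to 1
        print(samples.mean(axis=0))                    # posterior mean of each branch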

  14. Moment and maximum likelihood estimators for Weibull distributions under length- and area-biased sampling

    Treesearch

    Jeffrey H. Gove

    2003-01-01

    Many of the most popular sampling schemes used in forestry are probability proportional to size methods. These methods are also referred to as size biased because sampling is actually from a weighted form of the underlying population distribution. Length- and area-biased sampling are special cases of size-biased sampling where the probability weighting comes from a...

  15. Surface Impact Simulations of Helium Nanodroplets

    DTIC Science & Technology

    2015-06-30

    mechanical delocalization of the individual helium atoms in the droplet and the quantum statistical effects that accompany the interchange of identical...incorporates the effects of atomic delocalization by treating individual atoms as smeared-out probability distributions that move along classical...probability density distributions to give effective interatomic potential energy curves that have zero-point averaging effects built into them [25

  16. Classroom Research: Assessment of Student Understanding of Sampling Distributions of Means and the Central Limit Theorem in Post-Calculus Probability and Statistics Classes

    ERIC Educational Resources Information Center

    Lunsford, M. Leigh; Rowell, Ginger Holmes; Goodson-Espy, Tracy

    2006-01-01

    We applied a classroom research model to investigate student understanding of sampling distributions of sample means and the Central Limit Theorem in post-calculus introductory probability and statistics courses. Using a quantitative assessment tool developed by previous researchers and a qualitative assessment tool developed by the authors, we…

  17. Misinterpretation of statistical distance in security of quantum key distribution shown by simulation

    NASA Astrophysics Data System (ADS)

    Iwakoshi, Takehisa; Hirota, Osamu

    2014-10-01

    This study tests an interpretation in quantum key distribution (QKD) that the trace distance between the distributed quantum state and the ideal mixed state is a maximum failure probability of the protocol. Around 2004, this interpretation was proposed and standardized to satisfy both the key uniformity in the context of universal composability and the operational meaning of the failure probability of the key extraction. However, this proposal has not been verified concretely for many years, while H. P. Yuen and O. Hirota have cast doubt on this interpretation since 2009. To test the interpretation, a physical random number generator was employed to evaluate key uniformity in QKD. We calculated the statistical distance, which corresponds to the trace distance in quantum theory once a quantum measurement has been made, and compared it with the failure probability to check whether universal composability was obtained. As a result, the statistical distance between the probability distribution of the physical random numbers and the ideal uniform distribution was very large. It is also explained why trace distance is not suitable to guarantee the security in QKD from the viewpoint of quantum binary decision theory.
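
    For the classical side of this comparison, the statistical (total-variation) distance between an empirical distribution and the uniform one can be computed directly; the block size and sample count below are arbitrary illustrations, not the paper's setup.

        import numpy as np

        # Statistical (total-variation) distance between an empirical key-block
        # distribution and the ideal uniform distribution, the classical
        # counterpart of trace distance after measurement.
        def total_variation(p, q):
            return 0.5 * np.abs(p - q).sum()

        rng = np.random.default_rng(4)
        keys = rng.integers(0, 16, size=100_000)            # 4-bit blocks from an RNG
        emp = np.bincount(keys, minlength=16) / keys.size   # empirical distribution
        uniform = np.full(16, 1 / 16)
        print(f"TV distance from uniform: {total_variation(emp, uniform):.4f}")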

  18. Option volatility and the acceleration Lagrangian

    NASA Astrophysics Data System (ADS)

    Baaquie, Belal E.; Cao, Yang

    2014-01-01

    This paper develops a volatility formula for an option on an asset from an acceleration Lagrangian model, and the formula is calibrated with market data. The Black-Scholes model is a simpler case that has a velocity-dependent Lagrangian. The acceleration Lagrangian is defined, and the classical solution of the system in Euclidean time is solved by choosing proper boundary conditions. The conditional probability distribution of the final position given the initial position is obtained from the transition amplitude. The volatility is the standard deviation of the conditional probability distribution. Using the conditional probability and the path integral method, the martingale condition is applied, and one of the parameters in the Lagrangian is fixed. The call option price is then obtained using the conditional probability and the path integral method.

  19. A SUBMILLIMETER CONTINUUM SURVEY OF LOCAL DUST-OBSCURED GALAXIES

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Jong Chul; Hwang, Ho Seong; Lee, Gwang-Ho, E-mail: jclee@kasi.re.kr

    We conduct a 350 μm dust continuum emission survey of 17 dust-obscured galaxies (DOGs) at z = 0.05–0.08 with the Caltech Submillimeter Observatory (CSO). We detect 14 DOGs with S_350μm = 114–650 mJy and signal-to-noise > 3. By including two additional DOGs with submillimeter data in the literature, we are able to study the dust content for a sample of 16 local DOGs, which consist of 12 bump and four power-law types. We determine their physical parameters with a two-component modified blackbody function model. The derived dust temperatures are in the range 57–122 K and 22–35 K for the warm and cold dust components, respectively. The total dust mass and the mass fraction of the warm dust component are 3–34 × 10^7 M_⊙ and 0.03%–2.52%, respectively. We compare these results with those of other submillimeter-detected infrared luminous galaxies. The bump DOGs, the majority of the DOG sample, show similar distributions of dust temperatures and total dust mass to the comparison sample. The power-law DOGs show a hint of smaller dust masses than other samples, but this needs to be tested with a larger sample. These findings support that the reason DOGs show heavy dust obscuration is not an overall amount of dust content, but probably the spatial distribution of dust therein.

  20. On the Structure of a Best Possible Crossover Selection Strategy in Genetic Algorithms

    NASA Astrophysics Data System (ADS)

    Lässig, Jörg; Hoffmann, Karl Heinz

    The paper considers the problem of selecting individuals in the current population in genetic algorithms for crossover to find a solution with high fitness for a given optimization problem. Many different schemes have been described in the literature as possible strategies for this task but so far comparisons have been predominantly empirical. It is shown that if one wishes to maximize any linear function of the final state probabilities, e.g. the fitness of the best individual in the final population of the algorithm, then a best probability distribution for selecting an individual in each generation is a rectangular distribution over the individuals sorted in descending sequence by their fitness values. This means uniform probabilities have to be assigned to a group of the best individuals of the population but probabilities equal to zero to individuals with lower fitness, assuming that the probability distribution to choose individuals from the current population can be chosen independently for each iteration and each individual. This result is then generalized also to typical practically applied performance measures, such as maximizing the expected fitness value of the best individual seen in any generation.
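
    A sketch of the rectangular selection rule the paper derives, with invented population size and cutoff: uniform probability over the k fittest individuals and zero for the rest.

        import numpy as np

        # Rectangular selection: sort individuals by descending fitness, assign
        # uniform probability 1/k to the k best, and zero to everyone else.
        def rectangular_selection(fitness, k, size, rng):
            order = np.argsort(fitness)[::-1]       # indices by descending fitness
            probs = np.zeros(len(fitness))
            probs[order[:k]] = 1.0 / k              # uniform over the k best
            return rng.choice(len(fitness), size=size, p=probs)

        rng = np.random.default_rng(5)
        fitness = rng.random(20)                    # toy population of 20 individuals
        parents = rectangular_selection(fitness, k=5, size=10, rng=rng)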

  1. Observations on the distribution of freshwater mollusca and chemistry of the natural waters in the south-eastern Transvaal and adjacent northern Swaziland*

    PubMed Central

    Schutte, C. H. J.; Frank, G. H.

    1964-01-01

    An extensive survey of the molluscan fauna and of the chemistry of the freshwaters of the Eastern Transvaal Lowveld has revealed no simple correlation between the two. The waters fall into four fairly distinct and geographically associated groups chiefly characterized by their calcium and magnesium content. The frequency of the two intermediate hosts of bilharziasis was found to be roughly proportional to the hardness of the water but as the latter, in this area, is associated with altitude and this again with temperature and stream gradient it is thought highly probable that the distribution of these snails is the result of the interaction of a complex of factors. None of the individual chemical constituents in any of the waters examined is regarded as outside the tolerance range of these snails. It is also concluded that under natural conditions this area would have had few waterbodies suitable for colonization by these snails but that the expansion of irrigation schemes has created ideal conditions for their rapid establishment throughout the area. PMID:14163962

  2. Vitamin D and male reproductive system.

    PubMed

    Costanzo, Pablo R; Knoblovits, Pablo

    2016-12-01

    Vitamin D deficiency is a highly prevalent worldwide condition and affects people of all ages. The most important role of vitamin D is the regulation of intestinal calcium absorption and the metabolism of calcium and phosphorus to maintain muscle and bone homeostasis. Furthermore, in recent years it has been discovered that the vitamin D receptor (VDR) is widely distributed in many organs and tissues where vitamin D performs other actions, including modulation of the immune response, insulin secretion, an anti-proliferative effect on vascular smooth muscle cells, modulation of the renin-angiotensin-aldosterone system, and regulation of cell growth in several organs. The VDR is widely distributed in the male reproductive system. Vitamin D induces changes in the spermatozoa's calcium and cholesterol content and in protein phosphorylation at tyrosine/threonine residues. These changes could be involved in sperm capacitation. Vitamin D also seems to regulate aromatase expression in different tissues. Studies analyzing seasonal variations of sex steroids in male populations yield conflicting results, probably due to the wide heterogeneity of the populations included with respect to age, systemic diseases and obesity.

  3. Supply chain risk management of newspaper industry: A quantitative study

    NASA Astrophysics Data System (ADS)

    Sartika, Viny; Hisjam, Muh.; Sutopo, Wahyudi

    2018-02-01

    The newspaper industry has several distinctive features that make it stand out from other industries. The strict delivery deadline and zero inventory lead to a very short time frame for production and distribution. On one side, there is pressure from the newsroom to delay the start of production as long as possible in order to include late-breaking news, while there is pressure from production and distribution to start production as early as possible. Supply chain risk management is needed to determine the best strategy for dealing with possible risks in the newspaper industry. In a case study of a newspaper in Surakarta, a quantitative approach to supply chain risk management is taken by calculating the expected cost of each risk from the magnitude of its impact and the probability of the risk event. The calculations show that the five highest-ranked risks are newspaper delays to the end customer, broken plates, misprints, machine downtime, and delayed delivery of newspaper content. Appropriate mitigation strategies are then analyzed to cope with these risk events.
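
    The expected-cost calculation is simple enough to sketch; the risk names, impacts, and probabilities below are invented placeholders, not the study's data.

        # Expected cost of each risk = impact x probability, then rank.
        risks = {
            "late delivery to end customer": (8_000_000, 0.20),  # (impact, probability)
            "broken plate":                  (3_000_000, 0.10),
            "misprint":                      (2_000_000, 0.15),
            "machine down":                  (9_000_000, 0.05),
            "late content from newsroom":    (1_500_000, 0.25),
        }

        expected_cost = {name: impact * p for name, (impact, p) in risks.items()}
        for name, cost in sorted(expected_cost.items(), key=lambda kv: -kv[1]):
            print(f"{name:32s} expected cost: {cost:,.0f}")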

  4. Occurrence of Toxic Cyanobacterial Blooms in Rio de la Plata Estuary, Argentina: Field Study and Data Analysis

    PubMed Central

    Giannuzzi, L.; Carvajal, G.; Corradini, M. G.; Araujo Andrade, C.; Echenique, R.; Andrinolo, D.

    2012-01-01

    Water samples were collected during 3 years (2004–2007) at three sampling sites in the Rio de la Plata estuary. Thirteen biological, physical, and chemical parameters were determined for the water samples. The presence of microcystin-LR in the reservoir samples, and also in domestic water samples, was confirmed and quantified. Microcystin-LR concentration ranged between 0.02 and 8.6 μg.L−1. Principal components analysis was used to identify the factors promoting cyanobacteria growth. The proliferation of cyanobacteria was accompanied by high counts of total and fecal coliform bacteria (>1500 MPN/100 mL), temperature ≥25°C, and total phosphorus content ≥1.24 mg·L−1. The observed fluctuating patterns of Microcystis aeruginosa, total coliforms, and microcystin-LR were also described by probabilistic models based on the log-normal and extreme value distributions. The sampling sites were compared in terms of the distribution parameters and the probability of observing high concentrations of Microcystis aeruginosa, total coliforms, and microcystin-LR. PMID:22523486

  5. Mapping the distribution of vesicular textures on silicic lavas using the Thermal Infrared Multispectral Scanner

    NASA Technical Reports Server (NTRS)

    Ondrusek, Jaime; Christensen, Philip R.; Fink, Jonathan H.

    1993-01-01

    To investigate the effect of vesicularity on TIMS (Thermal Infrared Multispectral Scanner) imagery independent of chemical variations, we studied a large rhyolitic flow of uniform composition but textural heterogeneity. The imagery was recalibrated so that the digital number values for a lake in the scene matched a calculated ideal spectrum for water. TIMS spectra for the lava show useful differences in coarsely and finely vesicular pumice data, particularly in TIMS bands 3 and 4. Images generated by ratioing these bands accurately map out those areas known from field studies to be coarsely vesicular pumice. These texture-related emissivity variations are probably due to the larger vesicles being relatively deeper and separated by smaller septa leaving less smooth glass available to give the characteristic emission of the lava. In studies of inaccessible lava flows (as on Mars) areas of coarsely vesicular pumice must be identified and avoided before chemical variations can be interpreted. Remotely determined distributions of vesicular and glassy textures can also be related to the volatile contents and potential hazards associated with the emplacement of silicic lava flows on Earth.

  6. High throughput nonparametric probability density estimation.

    PubMed

    Farmer, Jenny; Jacobs, Donald

    2018-01-01

    In high throughput applications, such as those found in bioinformatics and finance, it is important to determine accurate probability distribution functions despite only minimal information about data characteristics, and without using human subjectivity. Such an automated process for univariate data is implemented to achieve this goal by merging the maximum entropy method with single order statistics and maximum likelihood. The only required properties of the random variables are that they are continuous and that they are, or can be approximated as, independent and identically distributed. A quasi-log-likelihood function based on single order statistics for sampled uniform random data is used to empirically construct a sample size invariant universal scoring function. Then a probability density estimate is determined by iteratively improving trial cumulative distribution functions, where better estimates are quantified by the scoring function that identifies atypical fluctuations. This criterion resists under- and overfitting the data as an alternative to employing the Bayesian or Akaike information criterion. Multiple estimates for the probability density reflect uncertainties due to statistical fluctuations in random samples. Scaled quantile residual plots are also introduced as an effective diagnostic to visualize the quality of the estimated probability densities. Benchmark tests show that estimates for the probability density function (PDF) converge to the true PDF as sample size increases on particularly difficult test probability densities that include cases with discontinuities, multi-resolution scales, heavy tails, and singularities. These results indicate the method has general applicability for high throughput statistical inference.

  8. Estimation of the lower and upper bounds on the probability of failure using subset simulation and random set theory

    NASA Astrophysics Data System (ADS)

    Alvarez, Diego A.; Uribe, Felipe; Hurtado, Jorge E.

    2018-02-01

    Random set theory is a general framework which comprises uncertainty in the form of probability boxes, possibility distributions, cumulative distribution functions, Dempster-Shafer structures or intervals; in addition, the dependence between the input variables can be expressed using copulas. In this paper, the lower and upper bounds on the probability of failure are calculated by means of random set theory. In order to accelerate the calculation, a well-known and efficient probability-based reliability method known as subset simulation is employed. This method is especially useful for finding small failure probabilities in both low- and high-dimensional spaces, disjoint failure domains and nonlinear limit state functions. The proposed methodology represents a drastic reduction of the computational labor implied by plain Monte Carlo simulation for problems defined with a mixture of representations for the input variables, while delivering similar results. Numerical examples illustrate the efficiency of the proposed approach.
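
    The random-set bounding idea can be illustrated with plain Monte Carlo (subset simulation is omitted for brevity); the limit state, interval width, and input distribution below are toy choices, not the paper's examples.

        import numpy as np

        # Each Monte Carlo draw yields an interval (focal element) for the
        # uncertain input rather than a point. Evaluating the limit state
        # g(x) <= 0 at the interval extremes gives lower/upper bounds on the
        # failure probability (belief and plausibility of the failure event).
        rng = np.random.default_rng(6)
        n = 100_000

        centers = rng.normal(5.0, 1.0, size=n)    # probabilistic part of the input
        half_width = 0.5                          # epistemic interval around each draw
        g = lambda x: 7.0 - x                     # failure when g(x) <= 0, i.e. x >= 7

        g_lo = np.minimum(g(centers - half_width), g(centers + half_width))
        g_hi = np.maximum(g(centers - half_width), g(centers + half_width))

        p_lower = np.mean(g_hi <= 0)   # fails even at the most favorable endpoint
        p_upper = np.mean(g_lo <= 0)   # fails at the least favorable endpoint
        print(f"P_f in [{p_lower:.4f}, {p_upper:.4f}]")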

  9. Properties of the probability density function of the non-central chi-squared distribution

    NASA Astrophysics Data System (ADS)

    András, Szilárd; Baricz, Árpád

    2008-10-01

    In this paper we consider the probability density function (pdf) of a non-central χ2 distribution with an arbitrary number of degrees of freedom. For this function we prove that it can be represented as a finite sum and we deduce a partial derivative formula. Moreover, we show that the pdf is log-concave when the number of degrees of freedom is greater than or equal to 2. At the end of this paper we present some Turán-type inequalities for this function, and an elegant application of the monotone form of l'Hospital's rule in probability theory is given.
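
    A quick numerical companion to the log-concavity claim, using scipy's noncentral chi-square pdf and a discrete second-difference check; the grid and parameters are arbitrary.

        import numpy as np
        from scipy import stats

        # For df >= 2 the log-pdf should be concave; check the discrete second
        # derivative of log f on a sample grid.
        df, nc = 4, 2.5
        x = np.linspace(0.1, 20, 200)
        logf = stats.ncx2.logpdf(x, df, nc)

        second_diff = np.diff(logf, 2)       # discrete second derivative of log f
        print("log-concave on grid:", np.all(second_diff <= 1e-10))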

  10. Combining Probability Distributions of Wind Waves and Sea Level Variations to Assess Return Periods of Coastal Floods

    NASA Astrophysics Data System (ADS)

    Leijala, U.; Bjorkqvist, J. V.; Pellikka, H.; Johansson, M. M.; Kahma, K. K.

    2017-12-01

    Predicting the behaviour of the joint effect of sea level and wind waves is of great significance due to the major impact of flooding events in densely populated coastal regions. As mean sea level rises, the effect of sea level variations accompanied by the waves will be even more harmful in the future. The main challenge when evaluating the joint effect of waves and sea level variations is that long time series of both variables rarely exist. Wave statistics are also highly location-dependent, thus requiring wave buoy measurements and/or high-resolution wave modelling. As an initial approximation of the joint effect, the variables may be treated as independent random variables, to obtain the probability distribution of their sum. We present results of a case study based on three probability distributions: 1) wave run-up constructed from individual wave buoy measurements, 2) short-term sea level variability based on tide gauge data, and 3) mean sea level projections based on up-to-date regional scenarios. The wave measurements were conducted during 2012-2014 on the coast of the city of Helsinki, located in the Gulf of Finland in the Baltic Sea. The short-term sea level distribution contains the last 30 years (1986-2015) of hourly data from the Helsinki tide gauge, and the mean sea level projections are scenarios adjusted for the Gulf of Finland. Additionally, we present a sensitivity test based on six different theoretical wave height distributions representing different wave behaviour in relation to sea level variations. As these wave distributions are merged with one common sea level distribution, we can study how the different shapes of the wave height distribution affect the distribution of the sum, and which one of the components dominates under different wave conditions. As an outcome of the method, we obtain a probability distribution of the maximum elevation of the continuous water mass, which enables a flexible tool for evaluating different risk levels in the current and future climate.
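
    Under the independence approximation described above, the density of the sum is the convolution of the two densities; the Gaussian placeholders below stand in for the empirical wave run-up and sea level distributions.

        import numpy as np

        # Density of (sea level + wave run-up) as a numerical convolution.
        z = np.linspace(-3, 3, 1201)                 # elevation grid (m), step dz
        dz = z[1] - z[0]

        pdf_sea = np.exp(-0.5 * (z / 0.5) ** 2) / (0.5 * np.sqrt(2 * np.pi))
        pdf_wave = np.exp(-0.5 * ((z - 0.8) / 0.3) ** 2) / (0.3 * np.sqrt(2 * np.pi))

        pdf_sum = np.convolve(pdf_sea, pdf_wave) * dz      # density of the sum
        z_sum = np.linspace(2 * z[0], 2 * z[-1], pdf_sum.size)
        print("total mass:", np.trapz(pdf_sum, z_sum))     # ~1.0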

  11. An efficient distribution method for nonlinear transport problems in highly heterogeneous stochastic porous media

    NASA Astrophysics Data System (ADS)

    Ibrahima, Fayadhoi; Meyer, Daniel; Tchelepi, Hamdi

    2016-04-01

    Because geophysical data are inexorably sparse and incomplete, stochastic treatments of simulated responses are crucial to explore possible scenarios and assess risks in subsurface problems. In particular, nonlinear two-phase flows in porous media are essential, yet challenging, in reservoir simulation and hydrology. Adding highly heterogeneous and uncertain input, such as the permeability and porosity fields, transforms the estimation of the flow response into a tough stochastic problem for which computationally expensive Monte Carlo (MC) simulations remain the preferred option. We propose an alternative approach to evaluate the probability distribution of the (water) saturation for the stochastic Buckley-Leverett problem when the probability distributions of the permeability and porosity fields are available. We give a computationally efficient and numerically accurate method to estimate the one-point probability density function (PDF) and cumulative distribution function (CDF) of the (water) saturation. The distribution method draws inspiration from a Lagrangian approach to the stochastic transport problem and expresses the saturation PDF and CDF essentially in terms of a deterministic mapping and the distribution and statistics of scalar random fields. In a large class of applications these random fields can be estimated at low computational cost (a few MC runs), thus making the distribution method attractive. Even though the method relies on a key assumption of fixed streamlines, we show that it performs well for high input variances, which is the case of interest. Once the saturation distribution is determined, any one-point statistics thereof can be obtained, especially the saturation average and standard deviation. Moreover, the probability of rare events and saturation quantiles (e.g. P10, P50 and P90) can be efficiently derived from the distribution method. These statistics can then be used for risk assessment, as well as data assimilation and uncertainty reduction in the prior knowledge of input distributions. We provide various examples and comparisons with MC simulations to illustrate the performance of the method.

  12. Content analysis in information flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grusho, Alexander A.; Grusho, Nick A.

    The paper deals with the architecture of a content recognition system. To analyze the problem, a stochastic model of content recognition in information flows was built. We prove that under certain conditions it is possible to correctly solve part of the problem with probability 1 by viewing a finite section of the information flow. This means that a good architecture consists of two steps: the first step correctly determines certain subsets of contents, while the second step may demand much more time for a true decision.

  13. Distribution of rain height over subtropical region: Durban, South Africa for satellite communication systems

    NASA Astrophysics Data System (ADS)

    Olurotimi, E. O.; Sokoya, O.; Ojo, J. S.; Owolawi, P. A.

    2018-03-01

    Rain height is one of the significant parameters for the prediction of rain attenuation on Earth-space telecommunication links, especially those operating at frequencies above 10 GHz. This study examines the three-parameter Dagum distribution of rain height over Durban, South Africa. Five years of data were used to study the monthly, seasonal, and annual variations using parameters estimated by maximum likelihood. The performance of the distribution was assessed using statistical goodness-of-fit measures. The three-parameter Dagum distribution proves appropriate for modeling rain height over Durban, with a root-mean-square error of 0.26. The shape and scale parameters of the distribution show a wide variation. The rain height exceeded for 0.01% of the time indicates a high probability of rain attenuation at higher frequencies.

  14. Radial particle distributions in PARMILA simulation beams

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boicourt, G.P.

    1984-03-01

    The estimation of beam spill in particle accelerators is becoming of greater importance as higher-current designs are being funded. To the present, no numerical method for predicting beam spill has been available. In this paper, we present an approach to the loss-estimation problem that uses probability distributions fitted to particle-simulation beams. The properties of the PARMILA code's radial particle distribution are discussed, and a broad class of probability distributions are examined to check their ability to fit it. The possibility that the PARMILA distribution is a mixture is discussed, and a fitting distribution consisting of a mixture of two generalized gamma distributions is found. An efficient algorithm to accomplish the fit is presented. Examples of the relative prediction of beam spill are given. 26 references, 18 figures, 1 table.
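
    A sketch of how a fitted two-component generalized-gamma mixture could be used to predict beam spill beyond an aperture; the weights and shape parameters are invented, and scipy's gengamma parameterization is assumed rather than PARMILA's.

        import numpy as np
        from scipy import stats

        # Beam radial density modeled as a two-component generalized-gamma
        # mixture; the spill is the mixture's survival function at the
        # aperture radius.
        w = 0.8                                        # weight of the core component
        core = stats.gengamma(a=2.0, c=1.5, scale=1.0)
        halo = stats.gengamma(a=1.2, c=1.0, scale=3.0)

        aperture = 5.0
        spill = w * core.sf(aperture) + (1 - w) * halo.sf(aperture)
        print(f"predicted fractional beam spill beyond r={aperture}: {spill:.2e}")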

  15. Geodesic Monte Carlo on Embedded Manifolds

    PubMed Central

    Byrne, Simon; Girolami, Mark

    2013-01-01

    Markov chain Monte Carlo methods explicitly defined on the manifold of probability distributions have recently been established. These methods are constructed from diffusions across the manifold and the solution of the equations describing geodesic flows in the Hamilton–Jacobi representation. This paper takes the differential geometric basis of Markov chain Monte Carlo further by considering methods to simulate from probability distributions that themselves are defined on a manifold, with common examples being classes of distributions describing directional statistics. Proposal mechanisms are developed based on the geodesic flows over the manifolds of support for the distributions, and illustrative examples are provided for the hypersphere and Stiefel manifold of orthonormal matrices. PMID:25309024

  16. Reconstructing the equilibrium Boltzmann distribution from well-tempered metadynamics.

    PubMed

    Bonomi, M; Barducci, A; Parrinello, M

    2009-08-01

    Metadynamics is a widely used and successful method for reconstructing the free-energy surface of complex systems as a function of a small number of suitably chosen collective variables. This is achieved by biasing the dynamics of the system. The bias acting on the collective variables distorts the probability distribution of the other variables. Here we present a simple reweighting algorithm for recovering the unbiased probability distribution of any variable from a well-tempered metadynamics simulation. We show the efficiency of the reweighting procedure by reconstructing the distribution of the four backbone dihedral angles of alanine dipeptide from two- and even one-dimensional metadynamics simulations. 2009 Wiley Periodicals, Inc.

  17. Unified nano-mechanics based probabilistic theory of quasibrittle and brittle structures: II. Fatigue crack growth, lifetime and scaling

    NASA Astrophysics Data System (ADS)

    Le, Jia-Liang; Bažant, Zdeněk P.

    2011-07-01

    This paper extends the theoretical framework presented in the preceding Part I to the lifetime distribution of quasibrittle structures failing at the fracture of one representative volume element under constant amplitude fatigue. The probability distribution of the critical stress amplitude is derived for a given number of cycles and a given minimum-to-maximum stress ratio. The physical mechanism underlying the Paris law for fatigue crack growth is explained under certain plausible assumptions about the damage accumulation in the cyclic fracture process zone at the tip of subcritical crack. This law is then used to relate the probability distribution of critical stress amplitude to the probability distribution of fatigue lifetime. The theory naturally yields a power-law relation for the stress-life curve (S-N curve), which agrees with Basquin's law. Furthermore, the theory indicates that, for quasibrittle structures, the S-N curve must be size dependent. Finally, physical explanation is provided to the experimentally observed systematic deviations of lifetime histograms of various ceramics and bones from the Weibull distribution, and their close fits by the present theory are demonstrated.

  18. Voronoi cell patterns: Theoretical model and applications

    NASA Astrophysics Data System (ADS)

    González, Diego Luis; Einstein, T. L.

    2011-11-01

    We use a simple fragmentation model to describe the statistical behavior of the Voronoi cell patterns generated by a homogeneous and isotropic set of points in 1D and in 2D. In particular, we are interested in the distribution of sizes of these Voronoi cells. Our model is completely defined by two probability distributions in 1D and again in 2D, the probability to add a new point inside an existing cell and the probability that this new point is at a particular position relative to the preexisting point inside this cell. In 1D the first distribution depends on a single parameter while the second distribution is defined through a fragmentation kernel; in 2D both distributions depend on a single parameter. The fragmentation kernel and the control parameters are closely related to the physical properties of the specific system under study. We use our model to describe the Voronoi cell patterns of several systems. Specifically, we study the island nucleation with irreversible attachment, the 1D car-parking problem, the formation of second-level administrative divisions, and the pattern formed by the Paris Métro stations.

  19. Voronoi Cell Patterns: theoretical model and application to submonolayer growth

    NASA Astrophysics Data System (ADS)

    González, Diego Luis; Einstein, T. L.

    2012-02-01

    We use a simple fragmentation model to describe the statistical behavior of the Voronoi cell patterns generated by a homogeneous and isotropic set of points in 1D and in 2D. In particular, we are interested in the distribution of sizes of these Voronoi cells. Our model is completely defined by two probability distributions in 1D and again in 2D, the probability to add a new point inside an existing cell and the probability that this new point is at a particular position relative to the preexisting point inside this cell. In 1D the first distribution depends on a single parameter while the second distribution is defined through a fragmentation kernel; in 2D both distributions depend on a single parameter. The fragmentation kernel and the control parameters are closely related to the physical properties of the specific system under study. We apply our model to describe the Voronoi cell patterns of island nucleation for critical island sizes i=0,1,2,3. Experimental results for the Voronoi cells of InAs/GaAs quantum dots are also described by our model.

  20. Origin and spatial distribution of metals in moss samples in Albania: A hotspot of heavy metal contamination in Europe.

    PubMed

    Lazo, Pranvera; Steinnes, Eiliv; Qarri, Flora; Allajbeu, Shaniko; Kane, Sonila; Stafilov, Trajce; Frontasyeva, Marina V; Harmens, Harry

    2018-01-01

    This study presents the spatial distribution of 37 elements in 48 moss samples collected over the whole territory of Albania and provides information on sources and factors controlling the concentrations of elements in the moss. High variations of trace metals indicate that the concentrations of elements are affected by different factors. Relations between the elements in moss, geochemical interpretation of the data, and secondary effects such as redox conditions generated from local soil and/or long-distance atmospheric transport of the pollutants are discussed. Zr-normalized data and the ratios of different elements are calculated to assess the origin of elements present in the current moss samples with respect to different geogenic and anthropogenic inputs. Factor analysis (FA) is used to identify the most probable sources of the elements. Four dominant factors are identified, i.e. natural contamination; dust emission from local mining operations; atmospheric transport of contaminants from local and long-distance sources; and contributions from airborne marine salts. Mineral particle dust from local emission sources is classified as the most important factor affecting the atmospheric deposition of elements accumulated in the current moss samples. The open slag dumps of mining operations in Albania are probably the main factor contributing to high contents of Cr, Ni, Fe, Ti and Al in the moss. Enrichment factors (EF) were calculated to clarify whether the elements in the present moss samples mainly originate from atmospheric deposition and/or local substrate materials. Copyright © 2017 Elsevier Ltd. All rights reserved.

  1. Bayesian Model Selection in Geophysics: The evidence

    NASA Astrophysics Data System (ADS)

    Vrugt, J. A.

    2016-12-01

    Bayesian inference has found widespread application and use in science and engineering to reconcile Earth system models with data, including prediction in space (interpolation), prediction in time (forecasting), assimilation of observations and deterministic/stochastic model output, and inference of the model parameters. Per Bayes' theorem, the posterior probability, P(H|D), of a hypothesis, H, given the data D, is equivalent to the product of its prior probability, P(H), and likelihood, L(H|D), divided by a normalization constant, P(D). In geophysics, the hypothesis, H, often constitutes a description (parameterization) of the subsurface for some entity of interest (e.g. porosity, moisture content). The normalization constant, P(D), is not required for inference of the subsurface structure, yet is of great value for model selection. Unfortunately, it is not particularly easy to estimate P(D) in practice. Here, I will introduce the various building blocks of a general-purpose method which provides robust and unbiased estimates of the evidence, P(D). This method uses multi-dimensional numerical integration of the posterior (parameter) distribution. I will then illustrate this new estimator by application to three competing subsurface models (hypotheses) using GPR travel time data from the South Oyster Bacterial Transport Site in Virginia, USA. The three subsurface models differ in their treatment of the porosity distribution and use (a) horizontal layering with fixed layer thicknesses, (b) vertical layering with fixed layer thicknesses and (c) a multi-Gaussian field. The results of the new estimator are compared against the brute-force Monte Carlo method and the Laplace-Metropolis method.
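
    In one dimension the evidence can be computed by brute-force quadrature, which makes the quantity concrete; the data and priors below are toy values, not the GPR case study.

        import numpy as np
        from scipy import stats

        # Evidence P(D) = integral of likelihood x prior over the parameter;
        # quadrature is feasible here because the parameter is one-dimensional.
        data = np.array([1.2, 0.8, 1.5, 1.1])
        theta = np.linspace(-5, 5, 2001)              # grid over the unknown mean

        prior = stats.norm.pdf(theta, 0, 2)
        loglike = np.array([stats.norm.logpdf(data, t, 1.0).sum() for t in theta])
        evidence = np.trapz(np.exp(loglike) * prior, theta)
        print(f"P(D) = {evidence:.3e}")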

  2. The probability distribution model of air pollution index and its dominants in Kuala Lumpur

    NASA Astrophysics Data System (ADS)

    AL-Dhurafi, Nasr Ahmed; Razali, Ahmad Mahir; Masseran, Nurulkamal; Zamzuri, Zamira Hasanah

    2016-11-01

    This paper focuses on statistical modeling of the distributions of the air pollution index (API) and its sub-index data observed at Kuala Lumpur in Malaysia. The measured pollutants or sub-indexes include carbon monoxide (CO), sulphur dioxide (SO2), nitrogen dioxide (NO2), and particulate matter (PM10). Four probability distributions are considered, namely log-normal, exponential, Gamma and Weibull, in search of the best-fitting distribution for the Malaysian air pollutant data. In order to determine the best distribution for describing the air pollutant data, five goodness-of-fit criteria are applied. This will help in minimizing the uncertainty in pollution resource estimates and improving the assessment phase of planning. The conflict in criterion results for selecting the best distribution was overcome by using the weight-of-ranks method. We found that the Gamma distribution is the best distribution for the majority of air pollutant data in Kuala Lumpur.
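
    A compact sketch of the candidate-fitting step, on synthetic data and with the Kolmogorov-Smirnov statistic standing in for the five criteria the paper combines.

        import numpy as np
        from scipy import stats

        # Fit several candidate distributions to (synthetic) pollutant
        # concentrations and rank them by the K-S statistic.
        rng = np.random.default_rng(7)
        data = rng.gamma(shape=2.0, scale=15.0, size=500)   # stand-in sub-index data

        candidates = {
            "lognormal": stats.lognorm,
            "exponential": stats.expon,
            "gamma": stats.gamma,
            "weibull": stats.weibull_min,
        }
        for name, dist in candidates.items():
            params = dist.fit(data)
            ks = stats.kstest(data, dist.cdf, args=params).statistic
            print(f"{name:12s} K-S statistic: {ks:.4f}")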

  3. Probability distribution of extreme share returns in Malaysia

    NASA Astrophysics Data System (ADS)

    Zin, Wan Zawiah Wan; Safari, Muhammad Aslam Mohd; Jaaman, Saiful Hafizah; Yie, Wendy Ling Shin

    2014-09-01

    The objective of this study is to investigate the suitable probability distribution to model the extreme share returns in Malaysia. To achieve this, weekly and monthly maximum daily share returns are derived from share price data obtained from Bursa Malaysia over the period of 2000 to 2012. The study starts with summary statistics of the data, which provide a clue to the likely candidates for the best-fitting distribution. Next, the suitability of six extreme value distributions, namely the Gumbel, Generalized Extreme Value (GEV), Generalized Logistic (GLO), Generalized Pareto (GPA), Lognormal (GNO) and Pearson type III (PE3) distributions, is evaluated. The method of L-moments is used in parameter estimation. Based on several goodness-of-fit tests and the L-moment diagram test, the Generalized Pareto distribution and the Pearson type III distribution are found to be the best-fitting distributions for representing the weekly and monthly maximum share returns in the Malaysian stock market during the studied period, respectively.

  4. Maximizing a Probability: A Student Workshop on an Application of Continuous Distributions

    ERIC Educational Resources Information Center

    Griffiths, Martin

    2010-01-01

    For many students meeting, say, the gamma distribution for the first time, it may well turn out to be a rather fruitless encounter unless they are immediately able to see an application of this probability model to some real-life situation. With this in mind, we pose here an appealing problem that can be used as the basis for a workshop activity…

  5. The role of lower-hybrid-wave collapse in the auroral ionosphere.

    PubMed

    Schuck, P W; Ganguli, G I; Kintner, P M

    2002-08-05

    In regions where lower-hybrid solitary structures (LHSS) are observed, the character of auroral lower-hybrid turbulence (LHT) (0-20 kHz) is investigated using the amplitude probability distribution of the electric field. The observed probability distributions are accurately described by a Rayleigh distribution with two degrees of freedom. The statistics of the LHT exhibit no evidence of the global modulational instability or self-similar wave collapse. We conclude that nucleation and resonant scattering in preexisting density depletions are the processes responsible for LHSS in auroral LHT.

  6. Bivariate Rainfall and Runoff Analysis Using Shannon Entropy Theory

    NASA Astrophysics Data System (ADS)

    Rahimi, A.; Zhang, L.

    2012-12-01

    Rainfall-runoff analysis is a key component of many hydrological and hydraulic designs in which the dependence of rainfall and runoff needs to be studied. It is known that convenient bivariate distributions are often unable to model the rainfall-runoff variables because they either constrain the range of the dependence or fix the form of the marginal distributions. Thus, this paper presents an approach to derive the entropy-based joint rainfall-runoff distribution using Shannon entropy theory. The distribution derived can model the full range of dependence and allows different specified marginals. The modeling and estimation proceed as: (i) univariate analysis of marginal distributions, which includes two steps, (a) using a nonparametric statistical approach to detect modes and the underlying probability density, and (b) fitting appropriate parametric probability density functions; (ii) definition of the constraints based on the univariate analysis and the dependence structure; (iii) derivation and validation of the entropy-based joint distribution. To validate the method, rainfall-runoff data were collected from the small agricultural experimental watersheds located in the semi-arid region near Riesel (Waco), Texas, maintained by the USDA. The results of the univariate analysis show that the rainfall variables follow the gamma distribution, whereas the runoff variables have a mixed structure and follow the mixed-gamma distribution. With this information, the entropy-based joint distribution is derived using the first moments, the first moments of the logarithm-transformed rainfall and runoff, and the covariance between rainfall and runoff. The results indicate: (1) the joint distribution derived successfully preserves the dependence between rainfall and runoff, and (2) K-S goodness-of-fit tests confirm that the re-derived marginal distributions reveal the underlying univariate probability densities, which further assures that the entropy-based joint rainfall-runoff distribution is satisfactorily derived. Overall, the study shows that Shannon entropy theory can be satisfactorily applied to model the dependence between rainfall and runoff, and that the entropy-based joint distribution is an appropriate approach for capturing dependence structure that cannot be captured by convenient bivariate joint distributions. [Figure: joint rainfall-runoff entropy-based PDF with corresponding marginal PDFs and histogram for the W12 watershed. Table: K-S test results and RMSE for univariate distributions derived from the maximum-entropy-based joint probability distribution.]

  7. Sampling considerations for disease surveillance in wildlife populations

    USGS Publications Warehouse

    Nusser, S.M.; Clark, W.R.; Otis, D.L.; Huang, L.

    2008-01-01

    Disease surveillance in wildlife populations involves detecting the presence of a disease, characterizing its prevalence and spread, and subsequent monitoring. A probability sample of animals selected from the population and corresponding estimators of disease prevalence and detection provide estimates with quantifiable statistical properties, but this approach is rarely used. Although wildlife scientists often assume probability sampling and random disease distributions to calculate sample sizes, convenience samples (i.e., samples of readily available animals) are typically used, and disease distributions are rarely random. We demonstrate how landscape-based simulation can be used to explore properties of estimators from convenience samples in relation to probability samples. We used simulation methods to model what is known about the habitat preferences of the wildlife population, the disease distribution, and the potential biases of the convenience-sample approach. Using chronic wasting disease in free-ranging deer (Odocoileus virginianus) as a simple illustration, we show that using probability sample designs with appropriate estimators provides unbiased surveillance parameter estimates but that the selection bias and coverage errors associated with convenience samples can lead to biased and misleading results. We also suggest practical alternatives to convenience samples that mix probability and convenience sampling. For example, a sample of land areas can be selected using a probability design that oversamples areas with larger animal populations, followed by harvesting of individual animals within sampled areas using a convenience sampling method.
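
    The selection bias the authors warn about is easy to reproduce in a toy simulation: oversampling a high-prevalence habitat inflates the estimated prevalence, while a probability sample stays unbiased. All numbers below are invented.

        import numpy as np

        # Disease clusters in one habitat; a convenience sample that
        # over-represents that habitat biases the prevalence estimate.
        rng = np.random.default_rng(8)
        n_animals = 10_000
        habitat = rng.random(n_animals) < 0.3          # 30% live near roads (True)
        prev = np.where(habitat, 0.10, 0.02)           # higher prevalence near roads
        infected = rng.random(n_animals) < prev

        # Probability sample: every animal equally likely to be selected.
        idx_prob = rng.choice(n_animals, 500, replace=False)

        # Convenience sample: roadside animals 5x as likely to be sampled.
        w = np.where(habitat, 5.0, 1.0)
        idx_conv = rng.choice(n_animals, 500, replace=False, p=w / w.sum())

        print("true prevalence:       ", infected.mean())
        print("probability sample est:", infected[idx_prob].mean())
        print("convenience sample est:", infected[idx_conv].mean())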

  8. Superthermal photon bunching in terms of simple probability distributions

    NASA Astrophysics Data System (ADS)

    Lettau, T.; Leymann, H. A. M.; Melcher, B.; Wiersig, J.

    2018-05-01

    We analyze the second-order photon autocorrelation function g^(2) with respect to the photon probability distribution and discuss the generic features of a distribution that results in superthermal photon bunching [g^(2)(0) > 2]. Superthermal photon bunching has been reported for a number of optical microcavity systems that exhibit processes such as superradiance or mode competition. We show that a superthermal photon number distribution cannot be constructed from the principle of maximum entropy if only the intensity and the second-order autocorrelation are given. However, for bimodal systems, an unbiased superthermal distribution can be constructed from second-order correlations and the intensities alone. Our findings suggest modeling superthermal single-mode distributions by a mixture of a thermal and a lasinglike state and thus reveal a generic mechanism in the photon probability distribution responsible for creating superthermal photon bunching. We relate our general considerations to a physical system, i.e., a (single-emitter) bimodal laser, and show that its statistics can be approximated and understood within our proposed model. Furthermore, the excellent agreement of the statistics of the bimodal laser and our model reveals that the bimodal laser is an ideal source of bunched photons, in the sense that it can generate statistics that contain no other features but the superthermal bunching.
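
    The proposed thermal-plus-lasing mixture can be checked numerically: for a mixed photon-number distribution, g^(2)(0) = <n(n-1)>/<n>^2, and suitable mixtures exceed 2. The weights and mean photon numbers below are arbitrary illustrations.

        import numpy as np
        from scipy import stats

        # Mixture of thermal (Bose-Einstein) statistics with probability eps and
        # coherent (Poissonian) statistics otherwise; values above 2 are
        # superthermal.
        def g2_mixture(eps, n_th, n_coh, nmax=500):
            n = np.arange(nmax)
            p_th = np.exp(n * np.log(n_th) - (n + 1) * np.log1p(n_th))
            p_coh = stats.poisson.pmf(n, n_coh)
            p = eps * p_th + (1 - eps) * p_coh
            mean = (n * p).sum()
            return (n * (n - 1) * p).sum() / mean ** 2

        print(g2_mixture(eps=0.5, n_th=20.0, n_coh=2.0))  # ~3.3 > 2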

  9. A novel method for correcting scanline-observational bias of discontinuity orientation

    PubMed Central

    Huang, Lei; Tang, Huiming; Tan, Qinwen; Wang, Dingjian; Wang, Liangqing; Ez Eldin, Mutasim A. M.; Li, Changdong; Wu, Qiong

    2016-01-01

    Scanline observation is known to introduce an angular bias into the probability distribution of orientation in three-dimensional space. In this paper, numerical solutions expressing the functional relationship between the scanline-observational distribution (in one-dimensional space) and the inherent distribution (in three-dimensional space) are derived using probability theory and calculus under the independence hypothesis of dip direction and dip angle. Based on these solutions, a novel method for obtaining the inherent distribution (also for correcting the bias) is proposed, an approach which includes two procedures: 1) Correcting the cumulative probabilities of orientation according to the solutions, and 2) Determining the distribution of the corrected orientations using approximation methods such as the one-sample Kolmogorov-Smirnov test. The inherent distribution corrected by the proposed method can be used for discrete fracture network (DFN) modelling, which is applied to such areas as rockmass stability evaluation, rockmass permeability analysis, rockmass quality calculation and other related fields. To maximize the correction capacity of the proposed method, the observed sample size is suggested through effectiveness tests for different distribution types, dispersions and sample sizes. The performance of the proposed method and the comparison of its correction capacity with existing methods are illustrated with two case studies. PMID:26961249

  10. Desensitization shortens the high-quantal-content endplate current time course in frog muscle with intact cholinesterase.

    PubMed

    Giniatullin, R A; Talantova, M; Vyskocil, F

    1997-08-01

    1. The desensitization induced by bath-applied carbachol or acetylcholine (ACh) and potentiated by proadifen (SKF 525A) was studied in the frog sartorius with intact synaptic acetylcholinesterase (AChE). 2. The reduction in the density and number of postsynaptic receptors produced by desensitization lowered the amplitude of the endplate currents (EPCs) and shortened the EPC decay when the quantal content (m) of the EPC was about 170 and when multiple release of quanta at single active zones was highly probable. The shortening of high-quantal-content EPCs persisted for at least 15 min after the wash-out of agonists, at a time when the amplitude had recovered fully. 3. The decay times of the low-quantal-content EPCs recorded from preparations pretreated with 5 mM Mg2+ (m approximately 70) and single-quantum miniature endplate currents (MEPCs) were not affected by carbachol, ACh or proadifen. 4. The desensitization of ACh receptors potentiated by proadifen completely prevented the 6- to 8-fold prolongation of the EPC induced by neostigmine inhibition of synaptic AChE. 5. It is assumed that high-quantal-content EPCs increase the incidence of multiple quanta release at single active zones and the probability of repetitive binding of ACh molecules, which leads to EPC prolongation. The shortening that persists after complete recovery of the amplitude during wash-out of the exogenous agonist is probably due to 'trapping' of ACh molecules onto rapidly desensitized receptors and the reduced density of functional AChRs during the quantum action.

  11. Fish debris record the hydrothermal activity in the Atlantis II deep sediments (Red Sea)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oudin, E.; Cocherie, A.

    1988-01-01

    The REE and U, Th, Zr, Hf, Sc have been analyzed in samples from the Atlantis II and Shaban/Jean Charcot Deeps in the Red Sea. The high Zr/Hf ratio in some sediments indicates the presence of fish debris or of finely crystallized apatite. The positive ΣREE vs P2O5 and ΣREE vs Zr/Hf correlations show that fish debris and finely crystallized apatite are the main REE sink in Atlantis II Deep sediments, as in other marine environments. The hydrothermal sediments and the fish debris concentrates have similar REE patterns, characterized by a LREE enrichment and a large positive Eu anomaly. This REE pattern is also observed in E.P.R. hydrothermal solutions. Fish debris from marine environments acquire their REE content and signature mostly from sea water during early diagenesis. The hydrothermal REE signature of Atlantis II Deep fish debris indicates that they probably record the REE signature of their hydrothermal sedimentation and diagenetic environment. The different REE signatures of the Shaban/Jean Charcot and Atlantis II Deep hydrothermal sediments suggest a sea-water-dominated brine in the Shaban/Jean Charcot Deep as opposed to the predominantly hydrothermal brine in the Atlantis II Deep. Atlantis II Deep fish debris are also characterized by their high U but low Th contents. Their low Th contents probably reflect the low Th content of the various possible sources (sea water, brine, sediments). Their U contents are probably controlled by the redox conditions of sedimentation.

  12. Stable isotope, chemical, and mineral compositions of the Middle Proterozoic Lijiaying Mn deposit, Shaanxi Province, China

    USGS Publications Warehouse

    Yeh, Hsueh-Wen; Hein, James R.; Ye, Jie; Fan, Delian

    1999-01-01

    The Lijiaying Mn deposit, located about 250 km southwest of Xian, is a high-quality ore characterized by low P and Fe contents and a mean Mn content of about 23%. The ore deposit occurs in shallow-water marine sedimentary rocks of probable Middle Proterozoic age. Carbonate minerals in the ore deposit include kutnahorite, calcite, Mn calcite, and Mg calcite. Carbon (−0.4 to −4.0‰) and oxygen (−3.7 to −12.9‰) isotopes show that, with a few exceptions, those carbonate minerals are not pristine low-temperature marine precipitates. All samples are depleted in rare earth elements (REEs) relative to shale and have negative Eu and positive Ce anomalies on chondrite-normalized plots. The Fe/Mn ratios of representative ore samples range from about 0.034 to <0.008 and P/Mn from 0.0023 to <0.001. Based on mineralogical data, the low ends of those ranges of ratios are probably close to ratios for the pure Mn minerals. Manganese contents have a strong positive correlation with Ce anomaly values and a moderate correlation with total REE contents. Compositional data indicate that kutnahorite is a metamorphic mineral and that most calcites formed as low-temperature marine carbonates that were subsequently metamorphosed. The braunite ore precursor mineral was probably a Mn oxyhydroxide, similar to those that formed on the deep ocean-floor during the Cenozoic. Because the Lijiaying precursor mineral formed in a shallow-water marine environment, the atmospheric oxygen content during the Middle Proterozoic may have been lower than it has been during the Cenozoic.

  13. Probabilistic Models For Earthquakes With Large Return Periods In Himalaya Region

    NASA Astrophysics Data System (ADS)

    Chaudhary, Chhavi; Sharma, Mukat Lal

    2017-12-01

    Determination of the frequency of large earthquakes is of paramount importance for seismic risk assessment, as large events contribute a significant fraction of the total deformation, and these long-return-period events with low probability of occurrence are not easily captured by classical distributions. Generally, with a small catalogue these larger events follow a different distribution function from the smaller and intermediate events. It is thus of special importance to use statistical methods that analyse as closely as possible the range of extreme values, or the tail of the distributions, in addition to the main distributions. The generalised Pareto distribution family is widely used for modelling events which cross a specified threshold value. The Pareto, Truncated Pareto, and Tapered Pareto are special cases of the generalised Pareto family. In this work, the probability of earthquake occurrence has been estimated using the Pareto, Truncated Pareto, and Tapered Pareto distributions. As a case study, the Himalaya, an ongoing orogeny that generates large earthquakes and one of the most seismically active zones of the world, has been considered. The whole Himalayan region has been divided into five seismic source zones according to seismotectonics and the clustering of events. Estimated probabilities of occurrence of earthquakes have also been compared with the modified Gutenberg-Richter distribution and the characteristic recurrence distribution. The statistical analysis reveals that the Tapered Pareto distribution better describes seismicity for the seismic source zones in comparison to the other distributions considered in the present study.
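
    As a sketch of how one of these models can be fitted to exceedances, the code below maximizes a tapered Pareto (Kagan-type) likelihood numerically; the density form, the threshold, and the synthetic catalogue are assumptions for illustration, not the authors' data.

```python
# Sketch: maximum-likelihood fit of a tapered Pareto distribution to values
# exceeding a threshold x_t (all numbers synthetic).
import numpy as np
from scipy.optimize import minimize

x_t = 1.0  # observation threshold

def neg_log_lik(params, x):
    beta, theta = params
    if beta <= 0 or theta <= 0:
        return np.inf
    # assumed density: f(x) = (beta/x + 1/theta) * (x_t/x)**beta * exp((x_t - x)/theta)
    log_f = np.log(beta / x + 1.0 / theta) + beta * np.log(x_t / x) + (x_t - x) / theta
    return -np.sum(log_f)

rng = np.random.default_rng(1)
# crude (approximate) synthetic sample: plain Pareto draws thinned by a taper
x = x_t / rng.random(500) ** (1.0 / 1.2)
x = x[rng.random(x.size) < np.exp((x_t - x) / 50.0)]

res = minimize(neg_log_lik, x0=[1.0, 10.0], args=(x,), method="Nelder-Mead")
beta_hat, theta_hat = res.x
print(f"beta = {beta_hat:.2f}, corner parameter theta = {theta_hat:.1f}")
```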

  14. Void probability as a function of the void's shape and scale-invariant models. [in studies of spatial galactic distribution]

    NASA Technical Reports Server (NTRS)

    Elizalde, E.; Gaztanaga, E.

    1992-01-01

    The dependence of counts in cells on the shape of the cell for the large scale galaxy distribution is studied. A very concrete prediction can be made concerning the void distribution for scale-invariant models. The prediction is tested on a sample of the CfA catalog, and good agreement is found. It is observed that the probability of a cell being occupied is larger for some elongated cells. A phenomenological scale-invariant model for the observed distribution of the counts in cells, an extension of the negative binomial distribution, is presented in order to illustrate how this dependence can be quantitatively determined. An original, intuitive derivation of this model is presented.

  15. Probability elicitation to inform early health economic evaluations of new medical technologies: a case study in heart failure disease management.

    PubMed

    Cao, Qi; Postmus, Douwe; Hillege, Hans L; Buskens, Erik

    2013-06-01

    Early estimates of the commercial headroom available to a new medical device can assist producers of health technology in making appropriate product investment decisions. The purpose of this study was to illustrate how this quantity can be captured probabilistically by combining probability elicitation with early health economic modeling. The technology considered was a novel point-of-care testing device in heart failure disease management. First, we developed a continuous-time Markov model to represent the patients' disease progression under the current care setting. Next, we identified the model parameters that are likely to change after the introduction of the new device and interviewed three cardiologists to capture the probability distributions of these parameters. Finally, we obtained the probability distribution of the commercial headroom available per measurement by propagating the uncertainty in the model inputs to uncertainty in modeled outcomes. For a willingness-to-pay value of €10,000 per life-year, the median headroom available per measurement was €1.64 (interquartile range €0.05-€3.16) when the measurement frequency was assumed to be daily. In the subsequently conducted sensitivity analysis, this median value increased to a maximum of €57.70 for different combinations of the willingness-to-pay threshold and the measurement frequency. Probability elicitation can successfully be combined with early health economic modeling to obtain the probability distribution of the headroom available to a new medical technology. Subsequently feeding this distribution into a product investment evaluation method enables stakeholders to make more informed decisions regarding the markets to which a currently available product prototype should be targeted. Copyright © 2013. Published by Elsevier Inc.
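
    A minimal sketch of the uncertainty-propagation step, with entirely hypothetical elicited distributions and parameter names (`delta_ly`, `delta_cost`), might look as follows.

```python
# Sketch: propagating elicited parameter uncertainty to a headroom
# distribution (all distributions and values below are hypothetical).
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

wtp = 10_000.0                           # willingness to pay per life-year (EUR)
delta_ly = rng.beta(2.0, 50.0, n) * 0.1  # elicited gain in life-years
delta_cost = rng.normal(150.0, 50.0, n)  # elicited change in other costs (EUR)
measurements = 365.0                     # daily measurements for one year

# headroom per measurement = (monetized benefit - other costs) / number of tests
headroom = (wtp * delta_ly - delta_cost) / measurements
q25, q50, q75 = np.percentile(headroom, [25, 50, 75])
print(f"median EUR {q50:.2f} (IQR {q25:.2f}-{q75:.2f})")
```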

  16. A microcomputer program for energy assessment and aggregation using the triangular probability distribution

    USGS Publications Warehouse

    Crovelli, R.A.; Balay, R.H.

    1991-01-01

    A general risk-analysis method was developed for petroleum-resource assessment and other applications. The triangular probability distribution is used as a model with an analytic aggregation methodology based on probability theory rather than Monte-Carlo simulation. Among the advantages of the analytic method are its computational speed and flexibility, and the saving of time and cost on a microcomputer. The input into the model consists of a set of components (e.g. geologic provinces) and, for each component, three potential resource estimates: minimum, most likely (mode), and maximum. Assuming a triangular probability distribution, the mean, standard deviation, and seven fractiles (F100, F95, F75, F50, F25, F5, and F0) are computed for each component, where, for example, the probability of exceeding F95 is equal to 0.95. The components are aggregated by combining the means, standard deviations, and respective fractiles under three possible situations: (1) perfect positive correlation, (2) complete independence, and (3) any degree of dependence between these two polar situations. A package of computer programs named the TRIAGG system was written in the Turbo Pascal 4.0 language for performing the analytic probabilistic methodology. The system consists of a program for processing triangular probability distribution assessments and aggregations, and a separate aggregation routine for aggregating aggregations. The user's documentation and program diskette of the TRIAGG system are available from USGS Open File Services. TRIAGG requires an IBM-PC/XT/AT compatible microcomputer with 256 kbytes of main memory, MS-DOS 3.1 or later, either two diskette drives or a fixed disk, and a 132-column printer. A graphics adapter and color display are optional. © 1991.
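
    The TRIAGG source is not reproduced in the abstract, but the standard triangular-distribution formulas it describes can be sketched as follows; the two-province input is hypothetical.

```python
# Sketch: moments and exceedance fractiles of a triangular(min, mode, max)
# assessment, aggregated under independence or perfect positive correlation.
import math

def tri_stats(a, m, b):
    mean = (a + m + b) / 3.0
    var = (a*a + m*m + b*b - a*m - a*b - m*b) / 18.0
    return mean, math.sqrt(var)

def tri_fractile(a, m, b, exceed_p):
    """Value exceeded with probability exceed_p (e.g. F95 -> exceed_p = 0.95)."""
    p = 1.0 - exceed_p                     # cumulative probability
    split = (m - a) / (b - a)              # CDF at the mode
    if p <= split:
        return a + math.sqrt(p * (b - a) * (m - a))
    return b - math.sqrt((1.0 - p) * (b - a) * (b - m))

provinces = [(0.0, 2.0, 10.0), (1.0, 3.0, 6.0)]    # (min, mode, max) estimates
stats = [tri_stats(*p) for p in provinces]
mean_sum = sum(s[0] for s in stats)
sd_indep = math.sqrt(sum(s[1] ** 2 for s in stats))  # complete independence
sd_corr = sum(s[1] for s in stats)                   # perfect positive correlation
print(mean_sum, sd_indep, sd_corr)
print(tri_fractile(0.0, 2.0, 10.0, 0.95))            # F95 of the first province
```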

  17. Geotechnical parameter spatial distribution stochastic analysis based on multi-precision information assimilation

    NASA Astrophysics Data System (ADS)

    Wang, C.; Rubin, Y.

    2014-12-01

    The spatial distribution of the compression modulus Es, an important geotechnical parameter, contributes considerably to the understanding of the underlying geological processes and to an adequate assessment of the mechanical effects of Es on the differential settlement of large continuous structure foundations. Such analyses should be derived using an assimilating approach that combines in-situ static cone penetration tests (CPT) with borehole experiments. To this end, the Es distribution of a silty clay stratum in region A of the China Expo Center (Shanghai) is studied using the Bayesian maximum entropy method. This method rigorously and efficiently integrates geotechnical investigations of different precision and their sources of uncertainty. Individual CPT soundings were modeled as rational probability density curves by maximum entropy theory. A spatial prior multivariate probability density function (PDF) and a likelihood PDF at the CPT positions were built from borehole experiments and the potential value at the prediction point; then, after numerical integration over the CPT probability density curves, the posterior probability density curve at the prediction point was calculated within a Bayesian inverse interpolation framework. The results of the Gaussian sequential stochastic simulation and the Bayesian methods were compared, and the differences between normally distributed single CPT soundings and the simulated probability density curves based on maximum entropy theory were also discussed. It is shown that the study of Es spatial distributions can be improved by properly incorporating CPT sampling variation into the interpolation process, and that more informative estimates are generated by considering CPT uncertainty at the estimation points. The calculations illustrate the significance of stochastic Es characterization in a stratum and identify limitations associated with inadequate geostatistical interpolation techniques. These characterization results provide a multi-precision information assimilation method for other geotechnical parameters.

  18. Ensemble Kalman filtering in presence of inequality constraints

    NASA Astrophysics Data System (ADS)

    van Leeuwen, P. J.

    2009-04-01

    Kalman filtering in the presence of constraints is an active area of research. Based on the Gaussian assumption for the probability-density functions, it appears hard to bring extra constraints into the formalism. On the other hand, in geophysical systems we often encounter constraints related to e.g. the underlying physics or chemistry, which are violated by the Gaussian assumption. For instance, concentrations are always non-negative, model layers have non-negative thickness, and sea-ice concentration is between 0 and 1. Several methods to bring inequality constraints into the Kalman-filter formalism have been proposed. One of them is probability density function (pdf) truncation, in which the Gaussian mass from the non-allowed part of the variables is simply distributed equally over the region where the variables are allowed, as proposed by Shimada et al. (1998). However, a problem with this method is that the probability that e.g. the sea-ice concentration is exactly zero is zero! The new method proposed here does not have this drawback. It assumes that the probability-density function is a truncated Gaussian, but the truncated mass is not distributed equally over all allowed values of the variables; instead, it is put into a delta distribution at the truncation point. This delta distribution can easily be handled within Bayes' theorem, leading to posterior probability density functions that are also truncated Gaussians with delta distributions at the truncation location. In this way a much better representation of the system is obtained, while still keeping most of the benefits of the Kalman-filter formalism. The full Kalman filter is prohibitively expensive in large-scale systems, but efficient implementation is possible in ensemble variants of the Kalman filter. Applications to low-dimensional systems and large-scale systems will be discussed.
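
    A one-dimensional sketch of the proposed update, assuming a non-negativity constraint and a scalar Gaussian observation, is given below; the multivariate ensemble formulation of the talk is not reproduced.

```python
# Sketch: the state pdf is a Gaussian truncated at zero plus a point mass
# (delta) at zero; a Gaussian observation y = x + noise updates both parts
# via Bayes' theorem (illustrative one-dimensional version only).
import numpy as np
from scipy import stats

def bayes_update(w0, mu, var, y, r2):
    """Posterior for prior = w0*delta(0) + (1-w0)*TruncNorm(mu, var) on x > 0,
    given a Gaussian observation y = x + e, e ~ N(0, r2)."""
    # Gaussian product: N(x; mu, var) * N(y; x, r2) = N(y; mu, var+r2) * N(x; mu_p, var_p)
    var_p = 1.0 / (1.0 / var + 1.0 / r2)
    mu_p = var_p * (mu / var + y / r2)
    z_prior = stats.norm.sf(0.0, mu, np.sqrt(var))     # truncation normalizer
    # component evidences under the observation likelihood
    ev_delta = w0 * stats.norm.pdf(y, 0.0, np.sqrt(r2))
    ev_gauss = (1.0 - w0) * stats.norm.pdf(y, mu, np.sqrt(var + r2)) \
               * stats.norm.sf(0.0, mu_p, np.sqrt(var_p)) / z_prior
    w_delta = ev_delta / (ev_delta + ev_gauss)         # posterior delta weight
    # the continuous part stays a truncated Gaussian with parameters (mu_p, var_p)
    return w_delta, mu_p, var_p

print(bayes_update(w0=0.3, mu=0.2, var=0.04, y=0.15, r2=0.01))
```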

  19. Probing the statistics of transport in the Hénon Map

    NASA Astrophysics Data System (ADS)

    Alus, O.; Fishman, S.; Meiss, J. D.

    2016-09-01

    The phase space of an area-preserving map typically contains infinitely many elliptic islands embedded in a chaotic sea. Orbits near the boundary of a chaotic region have been observed to stick for long times, strongly influencing their transport properties. The boundary is composed of invariant "boundary circles." We briefly report recent results on the distribution of rotation numbers of boundary circles for the Hénon quadratic map and show that the probability of occurrence of small integer entries in their continued fraction expansions is larger than would be expected for a number chosen at random. However, large integer entries occur with probabilities distributed proportionally to the random case. The probability distributions of ratios of fluxes through island chains are reported as well. These island chains are neighbours in the sense of the Meiss-Ott Markov-tree model. Two distinct universality families are found. The distributions of the ratio between the flux and the orbital period are also presented. All of these results have implications for models of transport in mixed phase space.

  20. Perceptual salience affects the contents of working memory during free-recollection of objects from natural scenes

    PubMed Central

    Pedale, Tiziana; Santangelo, Valerio

    2015-01-01

    One of the most important issues in the study of cognition is to understand which factors determine the internal representation of the external world. Previous literature has started to highlight the impact of low-level sensory features (indexed by saliency-maps) in driving attention selection, hence increasing the probability for objects presented in complex and natural scenes to be successfully encoded into working memory (WM) and then correctly remembered. Here we asked whether the probability of retrieving high-saliency objects modulates the overall contents of WM, by decreasing the probability of retrieving other, lower-saliency objects. We presented pictures of natural scenes for 4 s. After a retention period of 8 s, we asked participants to verbally report as many objects/details as possible of the previous scenes. We then computed how many times the objects located at either the peak of maximal or minimal saliency in the scene (as indexed by a saliency-map; Itti et al., 1998) were recollected by participants. Results showed that maximal-saliency objects were recollected more often and earlier in the stream of successfully reported items than minimal-saliency objects. This indicates that bottom-up sensory salience increases the recollection probability and facilitates access to memory representations at retrieval, respectively. Moreover, recollection of the maximal- (but not the minimal-) saliency objects predicted the overall amount of successfully recollected objects: the higher the probability of having successfully reported the most-salient object in the scene, the lower the number of recollected objects. These findings highlight that bottom-up sensory saliency modulates the current contents of WM during recollection of objects from natural scenes, most likely by reducing the resources available to encode and then retrieve other (lower-saliency) objects. PMID:25741266

  1. Aging ballistic Lévy walks

    NASA Astrophysics Data System (ADS)

    Magdziarz, Marcin; Zorawik, Tomasz

    2017-02-01

    Aging can be observed for numerous physical systems. In such systems statistical properties [like the probability distribution, mean square displacement (MSD), or first-passage time] depend on the time span ta between the initialization and the beginning of observations. In this paper we study aging properties of ballistic Lévy walks and two closely related jump models: wait-first and jump-first. We calculate explicitly their probability distributions and MSDs. It turns out that, despite similarities, these models react very differently to the delay ta. Aging weakly affects the shape of the probability density function and MSD of standard Lévy walks. For the jump models the shape of the probability density function is changed drastically. Moreover, for the wait-first jump model we observe different behavior of the MSD when ta ≪ t and when ta ≫ t.

  2. Chronology of Postglacial Eruptive Activity and Calculation of Eruption Probabilities for Medicine Lake Volcano, Northern California

    USGS Publications Warehouse

    Nathenson, Manuel; Donnelly-Nolan, Julie M.; Champion, Duane E.; Lowenstern, Jacob B.

    2007-01-01

    Medicine Lake volcano has had 4 eruptive episodes in its postglacial history (since 13,000 years ago) comprising 16 eruptions. Time intervals between events within the episodes are relatively short, whereas time intervals between the episodes are much longer. An updated radiocarbon chronology for these eruptions is presented that uses paleomagnetic data to constrain the choice of calibrated ages. This chronology is used with exponential, Weibull, and mixed-exponential probability distributions to model the data for time intervals between eruptions. The mixed exponential distribution is the best match to the data and provides estimates for the conditional probability of a future eruption given the time since the last eruption. The probability of an eruption at Medicine Lake volcano within the next year is 0.00028.
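
    A sketch of the conditional-probability calculation from a mixed-exponential interval model is shown below; the mixture parameters are placeholders, not the values fitted by the authors.

```python
# Sketch: conditional probability of a future eruption from a
# mixed-exponential model of inter-eruption intervals.
import math

def survival(t, w, lam1, lam2):
    return w * math.exp(-lam1 * t) + (1 - w) * math.exp(-lam2 * t)

def p_eruption(tau, dt, w, lam1, lam2):
    """P(eruption within dt years | tau years since the last eruption)."""
    return 1.0 - survival(tau + dt, w, lam1, lam2) / survival(tau, w, lam1, lam2)

# hypothetical mixture: short within-episode intervals and long between-episode gaps
print(p_eruption(tau=950.0, dt=1.0, w=0.8, lam1=1 / 100, lam2=1 / 3000))
```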

  3. SUBMICROSCOPIC (<1 μm) MINERAL CONTENTS OF VITRINITES IN SELECTED BITUMINOUS COAL BEDS.

    USGS Publications Warehouse

    Minkin, J.A.; Chao, E.C.T.; Thompson, C.L.; Wandless, M.-V.; Dulong, F.T.; Larson, R.R.; Neuzil, S.G.; ,

    1983-01-01

    An important aspect of the petrographic description of coal is the characterization of coal quality, including chemical attributes. For geologic investigations, data on the concentrations, distribution, and modes of occurrence of minor and trace elements provide a basis for reconstructing the probable geochemical environment of the swamp material that was converted into peat, and the geochemical conditions that prevailed during and subsequent to coalification. We have been using electron (EPMA) and proton (PIXE) microprobe analytical methods to obtain data on the chemical characteristics of specific coal constituents in their original associations within coal samples. The present study is aimed at evaluation of the nature of mineral occurrences and heterogeneous elemental concentrations within vitrinites. Vitrinites are usually the most abundant, and therefore most important, maceral group in bituminous coal. 8 refs.

  4. The subcellular distribution and biosynthesis of castaprenols and plastoquinone in the leaves of Aesculus hippocastanum

    PubMed Central

    Wellburn, A. R.; Hemming, F. W.

    1967-01-01

    Intact chloroplasts and cell walls were prepared from horse-chestnut leaves that had previously metabolized [2-14C]mevalonate. The bulk of the castaprenols and plastoquinone-9 was found within the chloroplasts. The remaining portion of the castaprenols was associated with the cell-wall preparation whereas that of the plastoquinone-9 was probably localized in the soluble fraction of the plant cell. The 14C content of these compounds of different cell fractions indicated the presence of polyisoprenoid-synthesizing activity both inside and outside the chloroplasts. This was confirmed by the relative incorporation of 14C when ultrasonically treated and intact chloroplasts were incubated with [2-14C]mevalonate. As the leaves aged (on the tree) an increase in extraplastidic castaprenols and plastoquinone-9, together with associated synthesizing activities, was observed. PMID:6068175

  5. [Enhanced phytoextraction of heavy metal contaminated soil by chelating agents and auxin indole-3-acetic acid].

    PubMed

    Zhou, Jian-min; Dang, Zhi; Chen, Neng-chang; Xu, Sheng-guang; Xie, Zhi-yi

    2007-09-01

    The environmental risk of applying chelating agents such as EDTA to heavy-metal-polluted soils, and the stress on plant roots due to the abrupt increase in metal concentration, limit the wide commercial use of chelate-induced phytoextraction. The chelating agents ethylenediaminetetraacetic acid (EDTA) and nitrilotriacetic acid (NTA) and the auxin indole-3-acetic acid (IAA) were used to enhance heavy metal uptake from soils by Zea mays L. (corn) in pot experiments. The metal content in plant tissues was quantified using an inductively coupled plasma mass spectrometer (ICP-MS). The results showed that the combination of IAA and EDTA increased the biomass by about 40.0% and the contents of Cu, Zn, Cd and Pb in corn shoots by 27.0%, 26.8%, 27.5% and 32.8% respectively, as compared to those in the EDTA treatment, while the NTA+IAA treatment increased the biomass by about 29.9% and the contents of Cu, Zn, Cd and Pb in corn shoots by 31.8%, 27.6%, 17.0% and 26.9% respectively, as compared to those in the NTA treatment. These results indicated that corn growth was promoted, and the biomass and the accumulation of heavy metals in plant shoots were increased significantly, with the addition of IAA, which probably helps to change the cell membrane properties and the biomass distribution, resulting in the alleviation of the phytotoxicity of the metals and the chelating agents.

  6. Microbial Functioning and Community Structure Variability in the Mesopelagic and Epipelagic Waters of the Subtropical Northeast Atlantic Ocean

    PubMed Central

    Arístegui, Javier; Gasol, Josep M.; Herndl, Gerhard J.

    2012-01-01

    We analyzed the regional distribution of bulk heterotrophic prokaryotic activity (leucine incorporation) and selected single-cell parameters (cell viability and nucleic acid content) as parameters for microbial functioning, as well as bacterial and archaeal community structure in the epipelagic (0 to 200 m) and mesopelagic (200 to 1,000 m) subtropical Northeast Atlantic Ocean. We selectively sampled three contrasting regions covering a wide range of surface productivity and oceanographic properties within the same basin: (i) the eddy field south of the Canary Islands, (ii) the open-ocean NE Atlantic Subtropical Gyre, and (iii) the upwelling filament off Cape Blanc. In the epipelagic waters, a high regional variation in hydrographic parameters and bacterial community structure was detected, accompanied, however, by a low variability in microbial functioning. In contrast, mesopelagic microbial functioning was highly variable between the studied regions despite the homogeneous abiotic conditions found therein. More microbial functioning parameters indicated differences among the three regions within the mesopelagic (i.e., viability of cells, nucleic acid content, cell-specific heterotrophic activity, nanoflagellate abundance, prokaryote-to-nanoflagellate abundance ratio) than within the epipelagic (i.e., bulk activity, nucleic acid content, and nanoflagellate abundance) waters. Our results show that the mesopelagic realm in the Northeast Atlantic is, in terms of microbial activity, more heterogeneous than its epipelagic counterpart, probably linked to mesoscale hydrographical variations. PMID:22344670

  7. Uncertainty quantification for nuclear density functional theory and information content of new measurements.

    PubMed

    McDonnell, J D; Schunck, N; Higdon, D; Sarich, J; Wild, S M; Nazarewicz, W

    2015-03-27

    Statistical tools of uncertainty quantification can be used to assess the information content of measured observables with respect to present-day theoretical models, to estimate model errors and thereby improve predictive capability, to extrapolate beyond the regions reached by experiment, and to provide meaningful input to applications and planned measurements. To showcase new opportunities offered by such tools, we make a rigorous analysis of theoretical statistical uncertainties in nuclear density functional theory using Bayesian inference methods. By considering the recent mass measurements from the Canadian Penning Trap at Argonne National Laboratory, we demonstrate how the Bayesian analysis and a direct least-squares optimization, combined with high-performance computing, can be used to assess the information content of the new data with respect to a model based on the Skyrme energy density functional approach. Employing the posterior probability distribution computed with a Gaussian process emulator, we apply the Bayesian framework to propagate theoretical statistical uncertainties in predictions of nuclear masses, two-neutron dripline, and fission barriers. Overall, we find that the new mass measurements do not impose a constraint that is strong enough to lead to significant changes in the model parameters. The example discussed in this study sets the stage for quantifying and maximizing the impact of new measurements with respect to current modeling and guiding future experimental efforts, thus enhancing the experiment-theory cycle in the scientific method.

  8. Probability distributions for multimeric systems.

    PubMed

    Albert, Jaroslav; Rooman, Marianne

    2016-01-01

    We propose a fast and accurate method of obtaining the equilibrium mono-modal joint probability distributions for multimeric systems. The method requires only two assumptions: that the copy number of all molecular species may be treated as continuous, and that the probability density functions (pdfs) are well approximated by multivariate skew normal distributions (MSNDs). Starting from the master equation, we convert the problem into a set of equations for the statistical moments, which are then expressed in terms of the parameters intrinsic to the MSND. Using an optimization package in Mathematica, we minimize a Euclidean distance function comprising the sum of squared differences between the left- and right-hand sides of these equations. Comparison of results obtained via our method with those rendered by the Gillespie algorithm demonstrates our method to be highly accurate as well as efficient.
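
    The flavor of the moment-matching step can be sketched in one dimension, replacing the multivariate skew normal with its univariate counterpart and the master-equation moments with fixed targets; all numbers are assumptions.

```python
# Sketch: choose skew-normal parameters so that its first three moments match
# target moments, by minimizing a squared-distance function.
import numpy as np
from scipy.optimize import minimize

def sn_moments(xi, omega, alpha):
    delta = alpha / np.sqrt(1.0 + alpha**2)
    b = delta * np.sqrt(2.0 / np.pi)
    mean = xi + omega * b
    var = omega**2 * (1.0 - b**2)
    skew = (4.0 - np.pi) / 2.0 * b**3 / (1.0 - b**2) ** 1.5
    return np.array([mean, var, skew])

target = np.array([10.0, 4.0, 0.5])   # stand-in for moment-equation targets

def distance(params):
    xi, omega, alpha = params
    if omega <= 0:
        return np.inf
    return np.sum((sn_moments(xi, omega, alpha) - target) ** 2)

res = minimize(distance, x0=[10.0, 2.0, 1.0], method="Nelder-Mead")
print(res.x)   # fitted (xi, omega, alpha)
```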

  9. Probability of lensing magnification by cosmologically distributed galaxies

    NASA Technical Reports Server (NTRS)

    Pei, Yichuan C.

    1993-01-01

    We present analytical formulae for computing the magnification probability caused by cosmologically distributed galaxies. The galaxies are assumed to be singular, truncated isothermal spheres with neither evolution nor clustering in redshift. We find that, for a fixed total mass, extended galaxies produce a broader shape in the magnification probability distribution and hence are less efficient as gravitational lenses than compact galaxies. The high-magnification tail caused by large galaxies is well approximated by an A^(-3) form, while the tail due to small galaxies is slightly shallower. The mean magnification as a function of redshift is, however, found to be independent of the size of the lensing galaxies. In terms of flux conservation, our formulae for the isothermal galaxy model predict a mean magnification that agrees to within a few percent with the Dyer-Roeder model of a clumpy universe.

  10. Lignin characteristics in soil profiles of different plant communities in a subtropical mixed forest in Central China

    NASA Astrophysics Data System (ADS)

    Liu, F.; Wang, X.

    2016-12-01

    Lignin is widely considered a major source of stable soil carbon; its content and degradation state are important indicators of soil carbon quality and stability. Few studies have explored the effects of plant communities on lignin characteristics in soils, and studies of lignin characteristics across soil depths have produced contradictory findings. In this study, we investigated lignin contents and degradation states in soil aggregates across three soil depths for four major plant communities in a subtropical mixed forest in central China. We found that lignin content in the litter of two deciduous species (Carpinus fargesii, CF, and Fagus lucida, FL) is higher than in two evergreen species (Cyclobalanopsis multinervis, CM, and Schima parviflora, SP). These differences were maintained in the soil, at a diminished scale. Lignin content showed a decreasing trend down the soil profiles of all plant communities, but no significant differences in degradation state were observed. The distribution of aggregate fractions differed significantly among plant communities: the SP community had a higher percentage of the >2000 μm fraction (50.46%) and a lower percentage of the <0.25 μm fraction (12.87%) than the CF community (40.05% and 21.90%, respectively). Lignin content increased with decreasing aggregate size; however, no significant differences in lignin degradation state were observed among the four aggregate size classes. These results collectively reveal the influence of plant communities on lignin characteristics in soil, probably through litter input. Similar lignin degradation states across the soil profile and the different aggregate sizes emphasize the association of lignin movement with soil water. This knowledge of lignin characteristics across the soil profile can improve our understanding of soil carbon stability at different depths and of how it may respond to changes in soil conditions.

  11. SU-F-T-450: The Investigation of Radiotherapy Quality Assurance and Automatic Treatment Planning Based On the Kernel Density Estimation Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fan, J; Fan, J; Hu, W

    Purpose: To develop a fast automatic algorithm based on two-dimensional kernel density estimation (2D KDE) to predict the dose-volume histogram (DVH), which can be employed for the investigation of radiotherapy quality assurance and automatic treatment planning. Methods: We propose a machine learning method that uses previous treatment plans to predict the DVH. The key to the approach is the framing of the DVH in a probabilistic setting. The training consists of estimating, from the patients in the training set, the joint probability distribution of the dose and the predictive features. The joint distribution provides an estimation of the conditional probability of the dose given the values of the predictive features. For the new patient, the prediction consists of estimating the distribution of the predictive features and marginalizing the conditional probability from the training over this. Integrating the resulting probability distribution for the dose yields an estimation of the DVH. The 2D KDE is implemented to predict the joint probability distribution of the training set and the distribution of the predictive features for the new patient. Two variables, the signed minimal distance from each OAR (organ at risk) voxel to the target boundary and its opening angle with respect to the origin of the voxel coordinate system, are considered as the predictive features to represent the OAR-target spatial relationship. The feasibility of our method has been demonstrated with rectum, breast, and head-and-neck cancer cases by comparing the predicted DVHs with the planned ones. Results: Consistent results were found between these two DVHs for each cancer site, and the average of the relative point-wise differences is about 5%, within the clinically acceptable extent. Conclusion: According to the results of this study, our method can be used to predict clinically acceptable DVHs and has the ability to evaluate the quality and consistency of treatment planning.
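
    A toy sketch of the KDE step, with a single hypothetical distance feature standing in for the paper's two predictive features, could look like this.

```python
# Toy sketch: joint KDE of (feature, dose) from training voxels, conditioned
# on a new patient's feature distribution and integrated into a DVH.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(3)
dist = rng.uniform(0.0, 5.0, 2000)             # distance to target boundary
dose = np.clip(60.0 * np.exp(-0.6 * dist) + rng.normal(0, 3, 2000), 0, None)
joint = gaussian_kde(np.vstack([dist, dose]))  # estimate p(distance, dose)

new_dist = rng.uniform(0.0, 5.0, 100)          # new patient's voxel features
dose_grid = np.linspace(0.0, 70.0, 71)

pdf = np.zeros_like(dose_grid)
for d in new_dist:                             # marginalize p(dose | distance)
    cond = joint(np.vstack([np.full_like(dose_grid, d), dose_grid]))
    pdf += cond / cond.sum()
pdf /= pdf.sum()

dvh = 1.0 - np.cumsum(pdf)                     # fraction receiving >= each dose
print(np.round(dvh[::10], 3))
```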

  12. Categorical Indicator Kriging for assessing the risk of groundwater nitrate pollution: the case of Vega de Granada aquifer (SE Spain).

    PubMed

    Chica-Olmo, Mario; Luque-Espinar, Juan Antonio; Rodriguez-Galiano, Victor; Pardo-Igúzquiza, Eulogio; Chica-Rivas, Lucía

    2014-02-01

    Groundwater nitrate pollution associated with agricultural activity is an important environmental problem in the management of this natural resource, as acknowledged by the European Water Framework Directive. Therefore, specific measures aimed at controlling the risk of water pollution by nitrates must be implemented to minimise its impact on the environment and the potential risk to human health. The spatial probability distribution of nitrate contents exceeding a threshold or limit value, established within the quality standard, will be helpful to managers and decision-makers. A methodology based on the non-parametric and non-linear methods of Indicator Kriging was used to elaborate a nitrate pollution categorical map for the aquifer of Vega de Granada (SE Spain). The map has been obtained from the local estimation of the probability that the nitrate content at an unsampled location belongs to one of the three categories established by the European Water Framework Directive: CL. 1 good quality [Min - 37.5 ppm], CL. 2 intermediate quality [37.5-50 ppm] and CL. 3 poor quality [50 ppm - Max]. The results show that areas exceeding nitrate concentrations of 50 ppm (poor quality waters) occupy more than 50% of the aquifer area. A great proportion of the area's municipalities are located in these poor quality water areas. The intermediate and good quality areas correspond to 21% and 28%, respectively, but have the highest population density. These results are coherent with the experimental data, which show an average nitrate concentration of 72 ppm, significantly higher than the quality standard limit of 50 ppm. Consequently, the results suggest the importance of planning actions to control and monitor aquifer nitrate pollution.
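
    The indicator coding at the heart of the method can be sketched as below; real indicator kriging derives weights from indicator variograms, so the inverse-distance weights here are an explicit stand-in, and all data are synthetic.

```python
# Sketch: code observations against the WFD thresholds and estimate local
# category probabilities (inverse-distance weights substitute for kriging).
import numpy as np

rng = np.random.default_rng(4)
xy = rng.uniform(0, 10, (50, 2))            # well locations (km), synthetic
no3 = rng.lognormal(np.log(60), 0.5, 50)    # nitrate (ppm), synthetic

thresholds = [37.5, 50.0]                   # class limits (ppm)
# indicator coding: I_k = 1 if value <= threshold k
ind = np.stack([(no3 <= t).astype(float) for t in thresholds], axis=1)

def local_probs(target, power=2.0):
    d = np.linalg.norm(xy - target, axis=1)
    w = 1.0 / np.maximum(d, 1e-6) ** power
    w /= w.sum()
    p1, p2 = w @ ind                        # P(value <= 37.5), P(value <= 50)
    return p1, p2 - p1, 1.0 - p2            # good, intermediate, poor

print(local_probs(np.array([5.0, 5.0])))
```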

  13. Race Socialization Messages across Historical Time

    ERIC Educational Resources Information Center

    Brown, Tony N.; Lesane-Brown, Chase L.

    2006-01-01

    In this study we investigated whether the content of race socialization messages varied by birth cohort, using data from a national probability sample. Most respondents recalled receiving messages about what it means to be black from their parents or guardians; these messages were coded into five mutually exclusive content categories: individual…

  14. Investigation of Bose-Einstein Condensates in q-Deformed Potentials with First Order Perturbation Theory

    NASA Astrophysics Data System (ADS)

    Nutku, Ferhat; Aydıner, Ekrem

    2018-02-01

    The Gross-Pitaevskii equation, which is the governing equation of Bose-Einstein condensates, is solved by a first-order perturbation expansion under various q-deformed potentials. Stationary probability distributions reveal one- and two-soliton behavior depending on the type of the q-deformed potential. Additionally, a spatial shift of the probability distribution is found for the dark soliton solution when the q parameter is changed.

  15. Benchmarking PARTISN with Analog Monte Carlo: Moments of the Neutron Number and the Cumulative Fission Number Probability Distributions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Rourke, Patrick Francis

    The purpose of this report is to provide the reader with an understanding of how a Monte Carlo neutron transport code was written, developed, and evolved to calculate the probability distribution functions (PDFs) and their moments for the neutron number at a final time as well as the cumulative fission number, along with introducing several basic Monte Carlo concepts.

  16. The Poisson Random Process. Applications of Probability Theory to Operations Research. Modules and Monographs in Undergraduate Mathematics and Its Applications Project. UMAP Unit 340.

    ERIC Educational Resources Information Center

    Wilde, Carroll O.

    The Poisson probability distribution is seen to provide a mathematical model from which useful information can be obtained in practical applications. The distribution and some situations to which it applies are studied, and ways to find answers to practical questions are noted. The unit includes exercises and a model exam, and provides answers to…
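
    A small worked example of the kind of question such a unit addresses (numbers invented): if calls arrive at an average rate of 4 per minute, the probability of at most 2 calls in a minute is the sum of the first three Poisson terms.

```python
# Sketch: P(N <= 2) for a Poisson process with rate lambda = 4 per minute.
from math import exp, factorial

lam = 4.0
p_at_most_2 = sum(lam**k * exp(-lam) / factorial(k) for k in range(3))
print(f"P(N <= 2) = {p_at_most_2:.4f}")   # ~0.2381
```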

  17. Randomized path optimization for the mitigated counter detection of UAVs

    DTIC Science & Technology

    2017-06-01

    A recursive Bayesian filtering scheme is used to assimilate noisy measurements of the UAV's position and predict its terminal location. The Kullback-Leibler (KL) divergence is used to compare the probability density of aircraft termination to a normal distribution around the true terminal location, providing a measure of the algorithm's success.

  18. Return probabilities and hitting times of random walks on sparse Erdős-Rényi graphs.

    PubMed

    Martin, O C; Sulc, P

    2010-03-01

    We consider random walks on random graphs, focusing on return probabilities and hitting times for sparse Erdős-Rényi graphs. Using the tree approach, which is expected to be exact in the large-graph limit, we show how to solve for the distribution of these quantities, and we find that these distributions exhibit a form of self-similarity.

  19. Automatic colorimetric calibration of human wounds

    PubMed Central

    2010-01-01

    Background: Digital photography in medicine is now considered an acceptable tool in many clinical domains, e.g. wound care. Although ever higher resolutions are available, reproducibility is still poor and visual comparison of images remains difficult. This is even more the case for measurements performed on such images (colour, area, etc.). This problem is often neglected and images are freely compared and exchanged without further thought. Methods: The first experiment checked whether camera settings or lighting conditions could negatively affect the quality of colorimetric calibration. Digital images plus a calibration chart were exposed to a variety of conditions. Precision and accuracy of colours after calibration were quantitatively assessed with a probability distribution for perceptual colour differences (dE_ab). The second experiment was designed to assess the impact of the automatic calibration procedure (i.e. chart detection) on real-world measurements. Forty different images of real wounds were acquired and a region of interest was selected in each image. Three rotated versions of each image were automatically calibrated and colour differences were calculated. Results: First experiment: colour differences between the measurements and real spectrophotometric measurements reveal median dE_ab values of 6.40 for the proper patches of calibrated normal images and 17.75 for uncalibrated images, demonstrating an important improvement in accuracy after calibration. The reproducibility, visualized by the probability distribution of the dE_ab errors between two measurements of the image patches, has a median of 3.43 dE_ab for all calibrated images and 23.26 dE_ab for all uncalibrated images. If we restrict ourselves to the proper patches of normal calibrated images, the median is only 2.58 dE_ab! Wilcoxon rank-sum tests between uncalibrated normal images and calibrated normal images with proper patches yielded p-values of effectively zero (p < 0.05), demonstrating a highly significant improvement in reproducibility. In the second experiment, the reproducibility of the chart detection during automatic calibration is presented using a probability distribution of dE_ab errors between two measurements of the same ROI. Conclusion: The investigators proposed an automatic colour calibration algorithm that ensures reproducible colour content of digital images. Evidence was provided that images taken with commercially available digital cameras can be calibrated independently of any camera settings and illumination features. PMID:20298541
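
    The dE_ab metric reported above is the Euclidean distance between two colours in CIELAB space; a minimal computation (with arbitrary sample colours) is shown below.

```python
# Sketch: the perceptual colour difference dE_ab between two CIELAB colours.
import math

def delta_e_ab(lab1, lab2):
    return math.dist(lab1, lab2)   # Euclidean distance in (L*, a*, b*)

wound_before = (52.1, 38.4, 21.0)  # arbitrary (L*, a*, b*) values
wound_after = (50.3, 35.9, 19.2)
print(f"dE_ab = {delta_e_ab(wound_before, wound_after):.2f}")
```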

  20. Occupancy modeling of autonomously recorded vocalizations to predict distribution of rallids in tidal wetlands

    USGS Publications Warehouse

    Stiffler, Lydia L.; Anderson, James T.; Katzner, Todd

    2018-01-01

    Conservation and management for a species requires reliable information on its status, distribution, and habitat use. We identified occupancy and distributions of king (Rallus elegans) and clapper (R. crepitans) rail populations in marsh complexes along the Pamunkey and Mattaponi Rivers in Virginia, USA by modeling data on vocalizations recorded from autonomous recording units (ARUs). Occupancy probability for both species combined was 0.64 (95% CI: 0.53, 0.75) in marshes along the Pamunkey and 0.59 (0.45, 0.72) in marshes along the Mattaponi. Occupancy probability along the Pamunkey was strongly influenced by salinity, increasing logistically by a factor of 1.62 (0.6, 2.65) per part per thousand of salinity. In contrast, there was not a strong salinity gradient on the Mattaponi, and therefore vegetative community structure determined occupancy probability on that river. Estimated detection probability across both marshes was 0.63 (0.62, 0.65), but detection rates decreased as the season progressed. Monitoring wildlife within wetlands presents unique challenges for conservation managers. Our findings provide insight not only into how rails responded to environmental variation but also into the general utility of ARUs for occupancy modeling of the distribution and habitat associations of rails within tidal marsh systems.

  1. Correlation between discrete probability and reaction front propagation rate in heterogeneous mixtures

    NASA Astrophysics Data System (ADS)

    Naine, Tarun Bharath; Gundawar, Manoj Kumar

    2017-09-01

    We demonstrate a very powerful correlation between the discrete probability of the distances of neighboring cells and the thermal wave propagation rate for a system of cells spread along a one-dimensional chain. A gamma distribution is employed to model the distances of neighboring cells. In the absence of an analytical solution, and because the differences in ignition times of adjacent reaction cells follow non-Markovian statistics, the thermal wave propagation rate for a one-dimensional system with randomly distributed cells is invariably obtained by numerical simulation. However, such Monte Carlo simulations require repeated calculations over different realizations of the distribution of adjacent cells. For several one-dimensional systems differing in the shape parameter of the gamma distribution, we show that the average reaction front propagation rates obtained from a discrete probability between two limits agree excellently with those obtained numerically. With the upper limit at 1.3, the lower limit depends on the non-dimensional ignition temperature. Additionally, this approach also facilitates the prediction of burning limits of heterogeneous thermal mixtures. The proposed method completely eliminates the need for laborious, time-intensive numerical calculations: thermal wave propagation rates can now be calculated from the macroscopic quantity of discrete probability alone.
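
    The Monte Carlo baseline that the discrete-probability method replaces can be sketched as follows; the quadratic ignition-delay law is a stand-in, not the paper's combustion model.

```python
# Sketch: sample gamma-distributed gaps between cells, assign each gap an
# ignition delay, and average the front speed over realizations.
import numpy as np

rng = np.random.default_rng(5)

def front_speed(shape, scale, n_cells=1000, n_runs=200):
    speeds = np.empty(n_runs)
    for i in range(n_runs):
        gaps = rng.gamma(shape, scale, n_cells)   # neighbor distances
        delays = 1.0 + gaps**2                    # toy ignition-delay law
        speeds[i] = gaps.sum() / delays.sum()     # distance / total time
    return speeds.mean()

print(front_speed(shape=2.0, scale=0.5))
```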

  2. Characteristics of Tables for Disseminating Biobehavioral Results.

    PubMed

    Schneider, Barbara St Pierre; Nagelhout, Ed; Feng, Du

    2018-01-01

    To report the complexity and richness of study variables within biological nursing research, authors often use tables; however, the ease with which consumers understand, synthesize, evaluate, and build upon findings depends partly upon table design. To assess and compare table characteristics within research and review articles published in Biological Research for Nursing and Nursing Research. A total of 10 elements in tables from 48 biobehavioral or biological research or review articles were analyzed. To test six hypotheses, a two-level hierarchical linear model was used for each of the continuous table elements, and a two-level hierarchical generalized linear model was used for each of the categorical table elements. Additionally, the inclusion of probability values in statistical tables was examined. The mean number of tables per article was 3. Tables in research articles were more likely to contain quantitative content, while tables in review articles were more likely to contain both quantitative and qualitative content. Tables in research articles had a greater number of rows, columns, and column-heading levels than tables in review articles. More than one half of statistical tables in research articles had a separate probability column or had probability values within the table, whereas approximately one fourth had probability notes. Authors and journal editorial staff may be generating tables that better depict biobehavioral content than those identified in specific style guidelines. However, authors and journal editorial staff may want to consider table design in terms of audience, including alternative visual displays.

  3. Energetics and Birth Rates of Supernova Remnants in the Large Magellanic Cloud

    NASA Astrophysics Data System (ADS)

    Leahy, D. A.

    2017-03-01

    Published X-ray emission properties for a sample of 50 supernova remnants (SNRs) in the Large Magellanic Cloud (LMC) are used as input for SNR evolution modeling calculations. The forward shock emission is modeled to obtain the initial explosion energy, age, and circumstellar medium density for each SNR in the sample. The resulting age distribution yields a SNR birthrate of 1/(500 yr) for the LMC. The explosion energy distribution is well fit by a log-normal distribution, with a most-probable explosion energy of 0.5 × 10^51 erg and a 1σ dispersion of a factor of 3 in energy. The circumstellar medium density distribution is broader than the explosion energy distribution, with a most-probable density of ~0.1 cm^-3. The shape of the density distribution can be fit with a log-normal distribution, with incompleteness at high density caused by the shorter evolution times of SNRs.

  4. Characterization of autoregressive processes using entropic quantifiers

    NASA Astrophysics Data System (ADS)

    Traversaro, Francisco; Redelico, Francisco O.

    2018-01-01

    The aim of this contribution is to introduce a novel information plane, the causal-amplitude informational plane. As previous work seems to indicate, the Bandt and Pompe methodology for estimating entropy does not allow one to distinguish between probability distributions, which could be fundamental for simulation or probability analysis purposes. Once a time series is identified as stochastic by the causal complexity-entropy informational plane, the novel causal-amplitude plane gives a deeper understanding of the time series, quantifying both the autocorrelation strength and the probability distribution of the data extracted from the generating processes. Two examples are presented, one from a climate change model and the other from financial markets.
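
    Both planes rest on the Bandt-Pompe ordinal-pattern distribution; a compact permutation-entropy estimator is sketched below, with the embedding order and delay chosen arbitrarily.

```python
# Sketch: Bandt-Pompe ordinal patterns and normalized permutation entropy.
import math
from collections import Counter
import numpy as np

def permutation_entropy(x, order=4, delay=1):
    patterns = Counter()
    for i in range(len(x) - (order - 1) * delay):
        window = x[i : i + order * delay : delay]
        patterns[tuple(np.argsort(window))] += 1   # ordinal pattern of the window
    total = sum(patterns.values())
    h = -sum((c / total) * math.log(c / total) for c in patterns.values())
    return h / math.log(math.factorial(order))     # normalized to [0, 1]

rng = np.random.default_rng(6)
print(permutation_entropy(rng.normal(size=5000)))           # near 1 for noise
print(permutation_entropy(np.sin(np.arange(5000) * 0.05)))  # much lower
```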

  5. Individual heterogeneity and identifiability in capture-recapture models

    USGS Publications Warehouse

    Link, W.A.

    2004-01-01

    Individual heterogeneity in detection probabilities is a far more serious problem for capture-recapture modeling than has previously been recognized. In this note, I illustrate that population size is not an identifiable parameter under the general closed population mark-recapture model Mh. The problem of identifiability is obvious if the population includes individuals with p_i = 0, but persists even when it is assumed that individual detection probabilities are bounded away from zero. Identifiability may be attained within parametric families of distributions for p_i, but not among parametric families of distributions. Consequently, in the presence of individual heterogeneity in detection probability, capture-recapture analysis is strongly model dependent.

  6. Erratum: Three-cluster dynamics within an ab initio framework [Phys. Rev. C 88, 034320 (2013)]

    DOE PAGES

    Quaglioni, Sofia; Romero-Redondo, Carolina; Navrátil, Petr

    2016-07-14

    In this study, we have discovered a typographical error in the portion of code used in our original article to compute the probability distribution of Figs. 7–9 of Sec. III B 2. The correct results are shown in the figures below. The correct probability distribution now shows the characteristic prominence of the “dineutron” over the “cigar” configuration. In addition, the most-probable distance between the two neutrons in the latter configuration is close to 4 fm (rather than the erroneously reported 5 fm). The other results and conclusion of the original paper remain unaffected.

  7. Experimental investigation of the intensity fluctuation joint probability and conditional distributions of the twin-beam quantum state.

    PubMed

    Zhang, Yun; Kasai, Katsuyuki; Watanabe, Masayoshi

    2003-01-13

    We give the intensity fluctuation joint probability of the twin-beam quantum state, which was generated with an optical parametric oscillator operating above threshold. Then we present what to our knowledge is the first measurement of the intensity fluctuation conditional probability distributions of twin beams. The measured inference variance of the twin beams, 0.62 ± 0.02, which is less than the standard quantum limit of unity, indicates inference with a precision better than that of separable states. The measured photocurrent variance exhibits a quantum correlation of as much as −4.9 ± 0.2 dB between the signal and the idler.

  8. Kullback-Leibler information function and the sequential selection of experiments to discriminate among several linear models

    NASA Technical Reports Server (NTRS)

    Sidik, S. M.

    1972-01-01

    The error variance of the process, prior multivariate normal distributions of the parameters of the models, and prior probabilities of each model being correct are assumed to be specified. A rule for termination of sampling is proposed; upon termination, the model with the largest posterior probability is chosen as correct. If sampling is not terminated, posterior probabilities of the models and posterior distributions of the parameters are computed, and the next experiment is chosen to maximize the expected Kullback-Leibler information function. Monte Carlo simulation experiments were performed to investigate the large- and small-sample behavior of the sequential adaptive procedure.
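
    The Kullback-Leibler information between two multivariate normal predictive distributions, the quantity maximized when selecting the next experiment, has a closed form that can be sketched as follows (dimension and covariances are illustrative).

```python
# Sketch: KL( N(mu0, cov0) || N(mu1, cov1) ) for multivariate normals.
import numpy as np

def kl_mvn(mu0, cov0, mu1, cov1):
    k = mu0.size
    inv1 = np.linalg.inv(cov1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(inv1 @ cov0) + diff @ inv1 @ diff - k
                  + np.log(np.linalg.det(cov1) / np.linalg.det(cov0)))

mu0, cov0 = np.zeros(2), np.eye(2)
mu1, cov1 = np.array([1.0, 0.0]), np.diag([2.0, 0.5])
print(kl_mvn(mu0, cov0, mu1, cov1))
```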

  9. Probabilistic Cloning of Three Real States with Optimal Success Probabilities

    NASA Astrophysics Data System (ADS)

    Rui, Pin-shu

    2017-06-01

    We investigate the probabilistic quantum cloning (PQC) of three real states with an average probability distribution. To get the analytic forms of the optimal success probabilities we assume that the three states have only two pairwise inner products. Based on the optimal success probabilities, we derive the explicit form of 1→2 PQC for cloning three real states. The unitary operation needed in the PQC process is worked out too. The optimal success probabilities are also generalized to the M→N PQC case.

  10. Calibration of micromechanical parameters for DEM simulations by using the particle filter

    NASA Astrophysics Data System (ADS)

    Cheng, Hongyang; Shuku, Takayuki; Thoeni, Klaus; Yamamoto, Haruyuki

    2017-06-01

    The calibration of DEM models is typically accomplished by trial and error. However, this procedure lacks objectivity and involves several uncertainties. To deal with these issues, the particle filter is employed as a novel approach to calibrate DEM models of granular soils. The posterior probability distribution of the micro-parameters that give numerical results in good agreement with the experimental response of a Toyoura sand specimen is approximated by independent model trajectories, referred to as 'particles', based on Monte Carlo sampling. The soil specimen is modeled by polydisperse packings with different numbers of spherical grains. Prepared in 'stress-free' states, the packings are subjected to quasistatic triaxial loading. Given the experimental data, the posterior probability distribution is incrementally updated until convergence is reached. The resulting 'particles' with higher weights are identified as the calibration results. The evolution of the weighted averages and the posterior probability distribution of the micro-parameters are plotted to show the advantage of using a particle filter, i.e., multiple solutions are identified for each parameter, with known probabilities of reproducing the experimental response.
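
    One weighting-and-resampling pass of such a particle filter can be sketched as below, with a hypothetical surrogate function standing in for the DEM triaxial simulation.

```python
# Sketch: weight sampled micro-parameter sets by a Gaussian likelihood of the
# simulated vs. measured response, then resample.
import numpy as np

rng = np.random.default_rng(7)

def simulate(params, strain):
    """Stand-in for a DEM triaxial simulation (hypothetical surrogate)."""
    stiffness, friction = params
    return stiffness * strain / (1.0 + strain / friction)

strain = np.linspace(0.001, 0.05, 20)
observed = simulate((80.0, 0.02), strain) + rng.normal(0, 0.05, strain.size)

n = 500
particles = np.column_stack([rng.uniform(20, 200, n), rng.uniform(0.005, 0.05, n)])
sigma = 0.05                                   # assumed measurement noise level

residuals = np.array([observed - simulate(p, strain) for p in particles])
log_w = -0.5 * np.sum((residuals / sigma) ** 2, axis=1)
w = np.exp(log_w - log_w.max())
w /= w.sum()

print("posterior mean:", w @ particles)        # weighted-average estimate
idx = rng.choice(n, size=n, p=w)               # resampling step
particles = particles[idx]
```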

  11. Unit-Sphere Anisotropic Multiaxial Stochastic-Strength Model Probability Density Distribution for the Orientation of Critical Flaws

    NASA Technical Reports Server (NTRS)

    Nemeth, Noel

    2013-01-01

    Models that predict the failure probability of monolithic glass and ceramic components under multiaxial loading have been developed by authors such as Batdorf, Evans, and Matsuo. These "unit-sphere" failure models assume that the strength-controlling flaws are randomly oriented, noninteracting planar microcracks of specified geometry but of variable size. This report develops a formulation to describe the probability density distribution of the orientation of critical strength-controlling flaws that results from an applied load. This distribution is a function of the multiaxial stress state, the shear sensitivity of the flaws, the Weibull modulus, and the strength anisotropy. Examples are provided showing the predicted response on the unit sphere for various stress states for isotropic and transversely isotropic (anisotropic) materials, including the most probable orientation of critical flaws for offset uniaxial loads with strength anisotropy. The author anticipates that this information could be used to determine anisotropic stiffness degradation or anisotropic damage evolution for individual brittle (or quasi-brittle) composite material constituents within finite element or micromechanics-based software.

  12. Complex growing networks with intrinsic vertex fitness

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bedogne, C.; Rodgers, G. J.

    2006-10-15

    One of the major questions in complex network research is to identify the range of mechanisms by which a complex network can self-organize into a scale-free state. In this paper we investigate the interplay between a fitness linking mechanism and both random and preferential attachment. In our models, each vertex is assigned a fitness x, drawn from a probability distribution ρ(x). In Model A, at each time step a vertex is added and joined to an existing vertex, selected at random, with probability p; with probability 1-p, an edge is introduced between vertices with fitnesses x and y at a rate f(x,y). Model B differs from Model A in that, with probability p, edges are added with preferential attachment rather than randomly. The analysis of Model A shows that, for every fixed fitness x, the network's degree distribution decays exponentially. In Model B we recover instead a power-law degree distribution whose exponent depends only on p, and we show how this result can be generalized. The properties of a number of particular networks are examined.

  13. Product of Ginibre matrices: Fuss-Catalan and Raney distributions

    NASA Astrophysics Data System (ADS)

    Penson, Karol A.; Życzkowski, Karol

    2011-06-01

    Squared singular values of a product of s square random Ginibre matrices are asymptotically characterized by probability distributions Ps(x), such that their moments are equal to the Fuss-Catalan numbers of order s. We find a representation of the Fuss-Catalan distributions Ps(x) in terms of a combination of s hypergeometric functions of the type sFs-1. The explicit formula derived here is exact for an arbitrary positive integer s, and for s=1 it reduces to the Marchenko-Pastur distribution. Using similar techniques, involving the Mellin transform and the Meijer G function, we find exact expressions for the Raney probability distributions, the moments of which are given by a two-parameter generalization of the Fuss-Catalan numbers. These distributions can also be considered as a two-parameter generalization of the Wigner semicircle law.
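
    The moments referred to here are the Fuss-Catalan numbers FC_s(n) = C((s+1)n, n) / (sn+1); a few of them can be generated directly, and s = 1 recovers the Catalan numbers (the moments of the Marchenko-Pastur law).

```python
# Sketch: Fuss-Catalan numbers of order s, the moments of P_s(x).
from math import comb

def fuss_catalan(n, s):
    return comb((s + 1) * n, n) // (s * n + 1)

print([fuss_catalan(n, 1) for n in range(6)])  # 1, 1, 2, 5, 14, 42 (Catalan)
print([fuss_catalan(n, 2) for n in range(6)])  # 1, 1, 3, 12, 55, 273
```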

  14. Product of Ginibre matrices: Fuss-Catalan and Raney distributions.

    PubMed

    Penson, Karol A; Zyczkowski, Karol

    2011-06-01

    Squared singular values of a product of s square random Ginibre matrices are asymptotically characterized by probability distributions P(s)(x), such that their moments are equal to the Fuss-Catalan numbers of order s. We find a representation of the Fuss-Catalan distributions P(s)(x) in terms of a combination of s hypergeometric functions of the type (s)F(s-1). The explicit formula derived here is exact for an arbitrary positive integer s, and for s=1 it reduces to the Marchenko-Pastur distribution. Using similar techniques, involving the Mellin transform and the Meijer G function, we find exact expressions for the Raney probability distributions, the moments of which are given by a two-parameter generalization of the Fuss-Catalan numbers. These distributions can also be considered as a two-parameter generalization of the Wigner semicircle law.

  15. In favor of general probability distributions: lateral prefrontal and insular cortices respond to stimulus inherent, but irrelevant differences.

    PubMed

    Mestres-Missé, Anna; Trampel, Robert; Turner, Robert; Kotz, Sonja A

    2016-04-01

    A key aspect of optimal behavior is the ability to predict what will come next. To achieve this, we must have a fairly good idea of the probability of occurrence of possible outcomes. This is based both on prior knowledge about a particular or similar situation and on immediately relevant new information. One question that arises is: when considering converging prior probability and external evidence, is the most probable outcome selected or does the brain represent degrees of uncertainty, even highly improbable ones? Using functional magnetic resonance imaging, the current study explored these possibilities by contrasting words that differ in their probability of occurrence, namely, unbalanced ambiguous words and unambiguous words. Unbalanced ambiguous words have a strong frequency-based bias towards one meaning, while unambiguous words have only one meaning. The current results reveal larger activation in lateral prefrontal and insular cortices in response to dominant ambiguous compared to unambiguous words even when prior and contextual information biases one interpretation only. These results suggest a probability distribution, whereby all outcomes and their associated probabilities of occurrence--even if very low--are represented and maintained.

  16. Long-term trends of metal content and water quality in the Belaya River Basin

    NASA Astrophysics Data System (ADS)

    Fashchevskaia, Tatiana; Motovilov, Yuri

    2017-04-01

    The aim of this research is to identify spatiotemporal regularities in the iron, copper and zinc contents of streams in the Belaya River basin. The Belaya River, situated in the South Ural region, is one of the largest tributaries in the Volga River basin, with a catchment area of 142,000 km2. Diverse economic activities have been carried out in the basin for more than sixty years, with strongly varying intensity over time. The leading industries in the region are metallurgy, oil production, petroleum processing, chemistry and petrochemistry, mechanical engineering, and power generation. The dynamics of human activities in the catchment and intra- and inter-annual changes in water quality were analyzed for the period 1969-2007. Inter-annual dynamics of the metal content in the river waters were identified from long-term hydrological monitoring statistics at 32 sites. The changing intensity of economic activities in the basin caused statistically significant changes in the metal content of the river network. Statistically homogeneous time intervals were established for each monitoring site, and reliable averaged quantitative estimates of water quality were obtained within these intervals. Calculations showed that the iron, copper and zinc contents did not change over the analyzed period at sites located in the mountain and foothill parts of the basin. At other sites, on the plains of the basin and near large industrial facilities, the metal content varied. Metal concentrations were elevated from the second half of the 1970s until the end of the 1990s. From the end of the 1990s to 2007, the long-term average metal content of the river waters decreased relative to the previous period: by a factor of up to 7.4 for iron, 6.7 for copper, and 15 for zinc. As a result, by the end of the study period the long-term average metal content in the river waters was 0.07-1.21 mg/l for iron, 0.9-7.0 μg/l for copper, and 2.0-12.5 μg/l for zinc. Empirical probability distributions of iron, copper and zinc concentrations for the various phases of the water regime at all monitoring sites were approximated by Pearson type III curves, and the mean concentrations, coefficients of variation and asymmetry, and concentrations in the 1-95% frequency range were estimated. By the end of the study period, the long-term average concentrations of iron and copper exceeded the maximum allowable concentration (MAC) for fishery use, while zinc fell below the MAC in many streams of the basin. The probability of iron and copper exceeding the MAC increases during floods, and that of zinc during the winter low-flow period. Acknowledgements. The work was financially supported by the Russian Foundation for Basic Research (Grant 15-05-09022).
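    A hedged sketch of the distribution-fitting step described above, using SciPy's Pearson type III implementation on synthetic concentration data (the gamma-shaped stand-in sample is an assumption; real input would be the long-term monitoring series):

```python
import numpy as np
from scipy import stats

# hypothetical copper concentrations (ug/L) standing in for monitoring data
rng = np.random.default_rng(1)
cu = rng.gamma(shape=2.0, scale=2.5, size=200)

skew, loc, scale = stats.pearson3.fit(cu)         # Pearson type III fit
mean = stats.pearson3.mean(skew, loc, scale)
cv = stats.pearson3.std(skew, loc, scale) / mean  # coefficient of variation

# concentrations at selected exceedance frequencies in the 1-95% range
freq = np.array([0.01, 0.10, 0.50, 0.90, 0.95])
quantiles = stats.pearson3.ppf(1 - freq, skew, loc, scale)
print(mean, cv, quantiles)
```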

  17. Skill of Ensemble Seasonal Probability Forecasts

    NASA Astrophysics Data System (ADS)

    Smith, Leonard A.; Binter, Roman; Du, Hailiang; Niehoerster, Falk

    2010-05-01

    In operational forecasting, the computational complexity of large simulation models is, ideally, justified by enhanced performance over simpler models. We will consider probability forecasts and contrast the skill of ENSEMBLES-based seasonal probability forecasts of interest to the finance sector (specifically temperature forecasts for Nino 3.4 and the Atlantic Main Development Region (MDR)). The ENSEMBLES model simulations will be contrasted against forecasts from statistical models based on the observations (climatological distributions) and empirical dynamics based on the observations but conditioned on the current state (dynamical climatology). For some start dates, individual ENSEMBLES models yield significant skill even at a lead-time of 14 months. The nature of this skill is discussed, and prospects for application are noted. Questions surrounding the interpretation of probability forecasts based on these multi-model ensemble simulations are then considered; the distributions considered are formed by kernel dressing the ensemble and blending with the climatology. The sources of apparent (RMS) skill in distributions based on multi-model simulations are discussed, and it is demonstrated that the inclusion of "zero-skill" models in the long range can improve Root-Mean-Square-Error scores, casting some doubt on the common justification for the claim that all models should be included in forming an operational probability forecast. It is argued that the rational response varies with lead time.
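    A minimal sketch of kernel dressing blended with a climatological distribution; the Gaussian kernels, the Silverman-type bandwidth, and the blend weight alpha are illustrative choices, not the configuration used in the study.

```python
import numpy as np
from scipy import stats

def blended_forecast_pdf(ensemble, climatology, alpha=0.8, h=None):
    """Kernel-dress the ensemble with Gaussian kernels of bandwidth h and
    blend with a KDE of the climatology; alpha and h would normally be
    fitted on past forecast-outcome pairs."""
    ensemble = np.asarray(ensemble, dtype=float)
    if h is None:
        h = 1.06 * ensemble.std() * len(ensemble) ** -0.2  # Silverman-type rule
    clim_kde = stats.gaussian_kde(climatology)
    def pdf(x):
        x = np.atleast_1d(x)
        dressed = stats.norm.pdf((x[:, None] - ensemble) / h).mean(axis=1) / h
        return alpha * dressed + (1 - alpha) * clim_kde(x)
    return pdf

# toy 9-member temperature ensemble vs. a 30-year climatology
clim = np.random.default_rng(0).normal(26.0, 0.8, 30)
pdf = blended_forecast_pdf([26.1, 26.4, 25.9, 26.8, 26.2,
                            26.0, 26.5, 26.3, 26.7], clim)
print(pdf([25.0, 26.0, 27.0]))
```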

  18. The concept of entropy in landscape evolution

    USGS Publications Warehouse

    Leopold, Luna Bergere; Langbein, Walter Basil

    1962-01-01

    The concept of entropy is expressed in terms of probability of various states. Entropy treats of the distribution of energy. The principle is introduced that the most probable condition exists when energy in a river system is as uniformly distributed as may be permitted by physical constraints. From these general considerations equations for the longitudinal profiles of rivers are derived that are mathematically comparable to those observed in the field. The most probable river profiles approach the condition in which the downstream rate of production of entropy per unit mass is constant. Hydraulic equations are insufficient to determine the velocity, depths, and slopes of rivers that are themselves authors of their own hydraulic geometries. A solution becomes possible by introducing the concept that the distribution of energy tends toward the most probable. This solution leads to a theoretical definition of the hydraulic geometry of river channels that agrees closely with field observations. The most probable state for certain physical systems can also be illustrated by random-walk models. Average longitudinal profiles and drainage networks were so derived and these have the properties implied by the theory. The drainage networks derived from random walks have some of the principal properties demonstrated by the Horton analysis; specifically, the logarithms of stream length and stream numbers are proportional to stream order.
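    A toy version of such a random-walk profile model is sketched below; the rule that the chance of a unit drop at each step is proportional to the remaining elevation is one minimal reading of the approach, not the paper's exact construction.

```python
import numpy as np

def random_walk_profile(h0=100, n_walks=2000, seed=0):
    """Average longitudinal profile from many random walks in which the
    probability of a unit elevation drop at each downstream step is
    proportional to the remaining elevation (an illustrative rule)."""
    rng = np.random.default_rng(seed)
    profiles = []
    for _ in range(n_walks):
        h, path = h0, [h0]
        while h > 0:
            if rng.random() < h / h0:  # drop probability ~ elevation
                h -= 1
            path.append(h)
        profiles.append(path)
    longest = max(len(p) for p in profiles)
    padded = np.array([p + [0] * (longest - len(p)) for p in profiles], float)
    return padded.mean(axis=0)  # concave, roughly exponential mean profile

profile = random_walk_profile()
```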

  19. Void probability as a function of the void's shape and scale-invariant models

    NASA Technical Reports Server (NTRS)

    Elizalde, E.; Gaztanaga, E.

    1991-01-01

    The dependence of counts in cells on the shape of the cell for the large-scale galaxy distribution is studied. A very concrete prediction can be made concerning the void distribution for scale-invariant models. The prediction is tested on a sample of the CfA catalog, and good agreement is found. It is observed that the probability of a cell being occupied is larger for certain elongated cells. A phenomenological scale-invariant model for the observed distribution of the counts in cells, an extension of the negative binomial distribution, is presented in order to illustrate how this dependence can be quantitatively determined. An original, intuitive derivation of this model is presented.
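    For the plain negative binomial counts-in-cells model, the void probability already has a simple closed form, which the extension mentioned above generalizes. A small sketch (the parameterization by a clustering parameter g is an assumption of this illustration):

```python
import numpy as np

def void_probability_nb(n_mean, g):
    """Void probability P0 for negative-binomial counts in cells,
    P0 = (1 + g * n_mean) ** (-1/g); g -> 0 recovers the Poisson
    limit exp(-n_mean)."""
    g = np.asarray(g, dtype=float)
    return np.where(g > 0,
                    (1 + g * n_mean) ** (-1 / np.maximum(g, 1e-12)),
                    np.exp(-n_mean))

for g in [0.0, 0.5, 1.0]:  # stronger clustering -> empty cells are likelier
    print(g, void_probability_nb(5.0, g))
```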

  20. The Darzi-Vali bauxite deposit, West-Azarbaidjan Province, Iran: Critical metals distribution and parental affinities

    NASA Astrophysics Data System (ADS)

    Khosravi, Maryam; Abedini, Ali; Alipour, Samad; Mongelli, Giovanni

    2017-05-01

    The Darzi-Vali bauxite deposit, located 20 km east of Bukan in northwestern Iran, occurs as discontinuous layers and lenses within the Upper Permian carbonate rocks of the Ruteh Formation. These layers extend laterally for over ∼1 km and vary in thickness from 2 to 17 m. We studied the chemical variations in a selected stratigraphic section throughout the deposit, focusing in particular on a number of selected critical metals that make the deposit of potential economic importance. The critical elements Co, Ga, Nb, Ta, LREEs, and HREEs, along with the transition metal Ni, are variously depleted throughout the deposit with respect to Ti, which is assumed to be a less mobile element. Among the critical elements, only Cr has demonstrated conservative behavior. Factor analysis suggests that the factors controlling the distribution of LREEs and HREEs in the ore, which most likely depend on the local composition of groundwater during weathering, are different from those controlling the distribution of the other critical elements. Further, the Darzi-Vali ore has ΣREE contents (773 ppm) much higher than those of other deposits in northwestern Iran, making this deposit worthy of further investigation. As for parental affinity, the Eu anomalies show negligible fluctuations (0.82-0.94) all along the deposit, confirming that bauxitization does not affect the effectiveness of this provenance proxy. The average Eu/Eu* value (0.89) of the ore is relatively far from that of the average carbonate bedrock (1.3) and close to that of the average mafic protolith (0.94), and similar results are also obtained using the Sm/Nd and Tb/Tb* proxies. Bivariate plots of the Eu anomaly versus the Sm/Nd and Tb anomalies further support the idea that the mafic rocks are probably related to the volcanic activities that affected the Iranian platform during the Upper Permian, as proposed for other bauxite deposits in northwestern Iran. These mafic rocks were the probable precursor of the Darzi-Vali bauxite ore.

  1. Pattern Storage, Bifurcations, and Groupwise Correlation Structure of an Exactly Solvable Asymmetric Neural Network Model.

    PubMed

    Fasoli, Diego; Cattani, Anna; Panzeri, Stefano

    2018-05-01

    Despite their biological plausibility, neural network models with asymmetric weights are rarely solved analytically, and closed-form solutions are available only in some limiting cases or in some mean-field approximations. We found exact analytical solutions of an asymmetric spin model of neural networks with arbitrary size without resorting to any approximation, and we comprehensively studied its dynamical and statistical properties. The network had discrete time evolution equations and binary firing rates, and it could be driven by noise with any distribution. We found analytical expressions of the conditional and stationary joint probability distributions of the membrane potentials and the firing rates. By manipulating the conditional probability distribution of the firing rates, we extended to stochastic networks the associative learning rule previously introduced by Personnaz and coworkers. The new learning rule allowed the safe storage, in the presence of noise, of point and cyclic attractors, with useful implications for content-addressable memories. Furthermore, we studied the bifurcation structure of the network dynamics in the zero-noise limit. We analytically derived examples of the codimension 1 and codimension 2 bifurcation diagrams of the network, which describe how the neuronal dynamics changes with the external stimuli. This showed that the network may undergo transitions among multistable regimes, oscillatory behavior elicited by asymmetric synaptic connections, and various forms of spontaneous symmetry breaking. We also calculated analytically groupwise correlations of neural activity in the network in the stationary regime. This revealed neuronal regimes where, statistically, the membrane potentials and the firing rates are either synchronous or asynchronous. Our results are valid for networks with any number of neurons, although our equations can be realistically solved only for small networks. For completeness, we also derived the network equations in the thermodynamic limit of infinite network size and analytically studied their local bifurcations. All the analytical results were extensively validated by numerical simulations.
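    A generic sketch of this model class, with discrete time, binary firing rates, asymmetric weights, and additive noise, is shown below; the threshold update rule and all numerical values are illustrative assumptions, not the paper's exact equations.

```python
import numpy as np

def simulate_binary_network(W, stim, steps=100, noise_std=0.5, seed=0):
    """Discrete-time binary-rate network: membrane potentials follow
    v(t+1) = W @ r(t) + stim + noise, with rates r = step(v)."""
    rng = np.random.default_rng(seed)
    n = W.shape[0]
    v = np.zeros(n)
    rates = np.zeros((steps, n))
    for t in range(steps):
        r = (v > 0).astype(float)                      # binary firing rates
        v = W @ r + stim + rng.normal(0.0, noise_std, n)
        rates[t] = r
    return rates

# asymmetric synaptic weights (W != W.T) and a constant external stimulus
W = np.array([[0.0, 1.2, -0.7],
              [0.4, 0.0, 0.9],
              [-1.1, 0.3, 0.0]])
rates = simulate_binary_network(W, stim=np.array([0.1, -0.2, 0.05]))
```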

  2. Probability density function of non-reactive solute concentration in heterogeneous porous formations

    Treesearch

    Alberto Bellin; Daniele Tonina

    2007-01-01

    Available models of solute transport in heterogeneous formations lack in providing complete characterization of the predicted concentration. This is a serious drawback especially in risk analysis where confidence intervals and probability of exceeding threshold values are required. Our contribution to fill this gap of knowledge is a probability distribution model for...

  3. Systems Approach to Defeating Maritime Improvised Explosive Devices in U.S. Ports

    DTIC Science & Technology

    2008-12-01

    [Garbled DTIC record excerpt; recoverable front-matter fragments: acronym list (Pfi, probability of false identification; PHPK, probability of hit/probability of kill; PMA, post mission analysis; PNNL, Pacific...); a citation of Naval Warfare Publication 27-2 (Rev. B), Section 1.8.4.1 (unclassified); and the statement "Naval Postgraduate School, Monterey, California. Approved for public release; distribution is unlimited."]

  4. Predictions of malaria vector distribution in Belize based on multispectral satellite data.

    PubMed

    Roberts, D R; Paris, J F; Manguin, S; Harbach, R E; Woodruff, R; Rejmankova, E; Polanco, J; Wullschleger, B; Legters, L J

    1996-03-01

    Use of multispectral satellite data to predict arthropod-borne disease trouble spots is dependent on clear understandings of environmental factors that determine the presence of disease vectors. A blind test of remote sensing-based predictions for the spatial distribution of a malaria vector, Anopheles pseudopunctipennis, was conducted as a follow-up to two years of studies on vector-environmental relationships in Belize. Four of eight sites that were predicted to be high probability locations for presence of An. pseudopunctipennis were positive and all low probability sites (0 of 12) were negative. The absence of An. pseudopunctipennis at four high probability locations probably reflects the low densities that seem to characterize field populations of this species, i.e., the population densities were below the threshold of our sampling effort. Another important malaria vector, An. darlingi, was also present at all high probability sites and absent at all low probability sites. Anopheles darlingi, like An. pseudopunctipennis, is a riverine species. Prior to these collections at ecologically defined locations, this species was last detected in Belize in 1946.

  5. Predictions of malaria vector distribution in Belize based on multispectral satellite data

    NASA Technical Reports Server (NTRS)

    Roberts, D. R.; Paris, J. F.; Manguin, S.; Harbach, R. E.; Woodruff, R.; Rejmankova, E.; Polanco, J.; Wullschleger, B.; Legters, L. J.

    1996-01-01

    Use of multispectral satellite data to predict arthropod-borne disease trouble spots is dependent on clear understandings of environmental factors that determine the presence of disease vectors. A blind test of remote sensing-based predictions for the spatial distribution of a malaria vector, Anopheles pseudopunctipennis, was conducted as a follow-up to two years of studies on vector-environmental relationships in Belize. Four of eight sites that were predicted to be high probability locations for presence of An. pseudopunctipennis were positive and all low probability sites (0 of 12) were negative. The absence of An. pseudopunctipennis at four high probability locations probably reflects the low densities that seem to characterize field populations of this species, i.e., the population densities were below the threshold of our sampling effort. Another important malaria vector, An. darlingi, was also present at all high probability sites and absent at all low probability sites. Anopheles darlingi, like An. pseudopunctipennis, is a riverine species. Prior to these collections at ecologically defined locations, this species was last detected in Belize in 1946.

  6. Using hidden Markov models to align multiple sequences.

    PubMed

    Mount, David W

    2009-07-01

    A hidden Markov model (HMM) is a probabilistic model of a multiple sequence alignment (msa) of proteins. In the model, each column of symbols in the alignment is represented by a frequency distribution of the symbols (called a "state"), and insertions and deletions are represented by other states. One moves through the model along a particular path from state to state in a Markov chain (i.e., random choice of next move), trying to match a given sequence. The next matching symbol is chosen from each state, recording its probability (frequency) and also the probability of going to that state from a previous one (the transition probability). State and transition probabilities are multiplied to obtain a probability of the given sequence. The hidden nature of the HMM is due to the lack of information about the value of a specific state, which is instead represented by a probability distribution over all possible values. This article discusses the advantages and disadvantages of HMMs in msa and presents algorithms for calculating an HMM and the conditions for producing the best HMM.
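    A toy illustration of the bookkeeping described above, restricted to match states only; all probabilities are invented for the example.

```python
def path_probability(sequence, emissions, transitions, path):
    """Probability of a sequence along one state path of a profile HMM:
    the product of each transition probability into a state and that
    state's emission probability for the matched symbol."""
    p, prev = 1.0, "begin"
    for symbol, state in zip(sequence, path):
        p *= transitions[(prev, state)] * emissions[state][symbol]
        prev = state
    return p

# two match-state columns with symbol frequencies from a toy alignment
emissions = {"M1": {"A": 0.8, "G": 0.2}, "M2": {"C": 0.6, "T": 0.4}}
transitions = {("begin", "M1"): 1.0, ("M1", "M2"): 0.9}
print(path_probability("AC", emissions, transitions, ["M1", "M2"]))  # 0.432
```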

  7. Science communication podcasting in Brazil: the potential and challenges depicted by two podcasts.

    PubMed

    Dantas-Queiroz, Marcos V; Wentzel, Lia C P; Queiroz, Luciano L

    2018-01-01

    Podcasts - audio files distributed online - are media that are easy to access and produce and can be used for science communication (SC), but few are available in Portuguese. The objective of this work is to perform a case study with survey data from two Brazilian SC podcasts (Dragões de Garagem and Fronteiras da Ciência) to evaluate the growth of science podcasting in Brazil, its potential, advantages, shortcomings, and perspectives. We noted an increase in listeners over the years, probably due to the popularization of the internet and the massive spread of mobile phones. Scientific content remains underexplored, despite great public interest. Humorous and informal podcasts are the most appealing, and listeners usually find them through informal educational sites. The majority of the audience is from the South and Southeast regions and consists of young male adults with undergraduate or graduate degrees. Despite their potential to communicate science, SC podcasts still have shortcomings to overcome. Nevertheless, independent initiatives can address these difficulties, making it possible for the medium to reach a varied audience and to engage groups that previously had no interest in, or no access to, scientific knowledge.

  8. Chemical quenching of positronium in Fe2O3/Al2O3 catalysts

    NASA Astrophysics Data System (ADS)

    Li, C.; Zhang, H. J.; Chen, Z. Q.

    2010-09-01

    Fe2O3/Al2O3 catalysts were prepared by a solid-state reaction method using α-Fe2O3 and γ-Al2O3 nanopowders. The microstructure and surface properties of the catalysts were studied using positron lifetime and coincidence Doppler broadening annihilation radiation measurements. The positron lifetime spectrum shows four components. The two long lifetimes τ3 and τ4 are attributed to positronium annihilation in two types of pores, distributed inside the Al2O3 grains and between the grains, respectively. With increasing Fe2O3 content from 3 wt% to 40 wt%, the lifetime τ3 remains nearly unchanged, while the longest lifetime τ4 decreases from 96 ns to 64 ns and its intensity drops drastically from 24% to less than 8%. The Doppler broadening S parameter also shows a continuous decrease. Further analysis of the Doppler broadening spectra reveals a decrease in the p-Ps intensity with increasing Fe2O3 content, which rules out the possibility of spin-conversion of positronium. Therefore, the decrease of τ4 is most probably due to the chemical quenching reaction of positronium with Fe ions on the surface of the large pores.

  9. Weighing Clinical Evidence Using Patient Preferences: An Application of Probabilistic Multi-Criteria Decision Analysis.

    PubMed

    Broekhuizen, Henk; IJzerman, Maarten J; Hauber, A Brett; Groothuis-Oudshoorn, Catharina G M

    2017-03-01

    The need for patient engagement has been recognized by regulatory agencies, but there is no consensus about how to operationalize this. One approach is the formal elicitation and use of patient preferences for weighing clinical outcomes. The aim of this study was to demonstrate how patient preferences can be used to weigh clinical outcomes when both preferences and clinical outcomes are uncertain by applying a probabilistic value-based multi-criteria decision analysis (MCDA) method. Probability distributions were used to model random variation and parameter uncertainty in preferences, and parameter uncertainty in clinical outcomes. The posterior value distributions and rank probabilities for each treatment were obtained using Monte-Carlo simulations. The probability of achieving the first rank is the probability that a treatment represents the highest value to patients. We illustrated our methodology for a simplified case on six HIV treatments. Preferences were modeled with normal distributions and clinical outcomes were modeled with beta distributions. The treatment value distributions showed the rank order of treatments according to patients and illustrate the remaining decision uncertainty. This study demonstrated how patient preference data can be used to weigh clinical evidence using MCDA. The model takes into account uncertainty in preferences and clinical outcomes. The model can support decision makers during the aggregation step of the MCDA process and provides a first step toward preference-based personalized medicine, yet requires further testing regarding its appropriate use in real-world settings.
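    A minimal sketch of the Monte Carlo aggregation step, with normally distributed preference weights and beta-distributed outcome probabilities; the outcome names, counts, and all parameter values are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(42)
n_sims, treatments = 10_000, ["A", "B", "C"]

# preference weights for two criteria (e.g. efficacy, tolerability),
# drawn from normals and renormalized per draw
w = rng.normal(loc=[0.7, 0.3], scale=[0.10, 0.10], size=(n_sims, 2))
w = np.abs(w) / np.abs(w).sum(axis=1, keepdims=True)

# clinical outcome probabilities as betas (alpha/beta would come from
# trial event counts)
efficacy = rng.beta([45, 38, 50], [15, 12, 30], size=(n_sims, 3))
tolerability = rng.beta([50, 30, 55], [10, 10, 25], size=(n_sims, 3))

value = w[:, [0]] * efficacy + w[:, [1]] * tolerability
first_rank = (value.argmax(axis=1)[:, None] == np.arange(3)).mean(axis=0)
for t, p in zip(treatments, first_rank):
    print(f"P({t} has the highest value) = {p:.3f}")
```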

  10. Predicting bottlenose dolphin distribution along Liguria coast (northwestern Mediterranean Sea) through different modeling techniques and indirect predictors.

    PubMed

    Marini, C; Fossa, F; Paoli, C; Bellingeri, M; Gnone, G; Vassallo, P

    2015-03-01

    Habitat modeling is an important tool to investigate the quality of the habitat for a species within a certain area, to predict species distribution and to understand the ecological processes behind it. Many species have been investigated by means of habitat modeling techniques, mainly to support effective management and protection policies, and cetaceans play an important role in this context. The bottlenose dolphin (Tursiops truncatus) has been investigated with habitat modeling techniques since 1997. The objectives of this work were to predict the distribution of the bottlenose dolphin in a coastal area through the use of static morphological features and to compare the prediction performances of three different modeling techniques: Generalized Linear Model (GLM), Generalized Additive Model (GAM) and Random Forest (RF). Four static variables were tested: depth, bottom slope, distance from the 100 m bathymetric contour and distance from the coast. RF proved to be both the most accurate and the most precise modeling technique, with very high distribution probabilities predicted in presence cells (mean predicted probability of 90.4%) and with 66.7% of presence cells having a predicted probability between 90% and 100%. The bottlenose distribution obtained with RF allowed the identification of specific areas with particularly high presence probability along the coastal zone; the recognition of these core areas may be the starting point for developing effective management practices to improve T. truncatus protection. Copyright © 2014 Elsevier Ltd. All rights reserved.
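    A hedged sketch of a Random Forest presence model on the four static predictors, using scikit-learn; the synthetic grid cells and the toy presence rule are assumptions, not the study's survey data.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

# columns: depth (m), bottom slope (deg), distance to the 100 m
# bathymetric contour (km), distance to coast (km) -- synthetic cells
rng = np.random.default_rng(7)
X = np.column_stack([rng.uniform(5, 500, 300), rng.uniform(0, 15, 300),
                     rng.uniform(0, 30, 300), rng.uniform(0, 20, 300)])
# toy presence/absence labels favouring shallow cells near the coast
y = ((X[:, 0] < 120) & (X[:, 3] < 8)).astype(int)

rf = RandomForestClassifier(n_estimators=500, oob_score=True, random_state=0)
rf.fit(X, y)
presence_prob = rf.predict_proba(X)[:, 1]  # per-cell presence probability
print(rf.oob_score_, presence_prob[:5])
```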

  11. Probability Distributions over Cryptographic Protocols

    DTIC Science & Technology

    2009-06-01

    [Garbled DTIC record excerpt; recoverable table-of-contents fragments: Artificial Immune Algorithm; Design Decisions; Common Ground...; message creation algorithms for the unbounded and unbounded naive distributions; protocol creation algorithm for intended-run distributions; protocol and message creation algorithm for the realistic distribution.]

  12. Probability Distribution Extraction from TEC Estimates based on Kernel Density Estimation

    NASA Astrophysics Data System (ADS)

    Demir, Uygar; Toker, Cenk; Çenet, Duygu

    2016-07-01

    Statistical analysis of the ionosphere, specifically the Total Electron Content (TEC), may reveal important information about its temporal and spatial characteristics. One of the core metrics that express the statistical properties of a stochastic process is its Probability Density Function (pdf). Furthermore, statistical parameters such as mean, variance and kurtosis, which can be derived from the pdf, may provide information about the spatial uniformity or clustering of the electron content. For example, the variance differentiates between a quiet ionosphere and a disturbed one, whereas kurtosis differentiates between a geomagnetic storm and an earthquake. Therefore, valuable information about the state of the ionosphere (and the natural phenomena that cause the disturbance) can be obtained by looking at the statistical parameters. In the literature, there are publications which try to fit the histogram of TEC estimates to well-known pdfs such as Gaussian, Exponential, etc. However, constraining a histogram to fit a function with a fixed shape increases estimation error, and all information extracted from such a pdf will carry this error. Such techniques are highly likely to produce artificial characteristics in the estimated pdf that are not present in the original data. In the present study, we use the Kernel Density Estimation (KDE) technique to estimate the pdf of the TEC. KDE is a non-parametric approach which does not impose a specific form on the TEC. As a result, better pdf estimates that almost perfectly fit the observed TEC values can be obtained compared to the techniques mentioned above. KDE is particularly good at representing tail probabilities and outliers. We also calculate the mean, variance and kurtosis of the measured TEC values. The technique is applied to the ionosphere over Turkey, where the TEC values are estimated from GNSS measurements from the TNPGN-Active (Turkish National Permanent GNSS Network) network. This study is supported by TUBITAK 115E915 and joint TUBITAK 114E092 and AS CR14/001 projects.
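    A minimal sketch of the KDE step and the derived statistics, on synthetic values standing in for GNSS-based TEC estimates:

```python
import numpy as np
from scipy import stats

# synthetic TEC sample (TECU) with a secondary mode, standing in for
# estimates derived from GNSS measurements
rng = np.random.default_rng(3)
tec = np.concatenate([rng.normal(25, 4, 800), rng.normal(45, 6, 200)])

kde = stats.gaussian_kde(tec)       # non-parametric pdf estimate
grid = np.linspace(tec.min(), tec.max(), 400)
pdf = kde(grid)                     # no fixed functional form imposed

print(f"mean={tec.mean():.1f} TECU, var={tec.var():.1f}, "
      f"kurtosis={stats.kurtosis(tec):.2f}")
```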

  13. Probability distribution functions for unit hydrographs with optimization using genetic algorithm

    NASA Astrophysics Data System (ADS)

    Ghorbani, Mohammad Ali; Singh, Vijay P.; Sivakumar, Bellie; H. Kashani, Mahsa; Atre, Atul Arvind; Asadi, Hakimeh

    2017-05-01

    A unit hydrograph (UH) of a watershed may be viewed as the unit pulse response function of a linear system. In recent years, the use of probability distribution functions (pdfs) for determining a UH has received much attention. In this study, a nonlinear optimization model is developed to transmute a UH into a pdf. The potential of six popular pdfs, namely the two-parameter gamma, two-parameter Gumbel, two-parameter log-normal, two-parameter normal, three-parameter Pearson, and two-parameter Weibull distributions, is tested on data from the Lighvan catchment in Iran. The probability distribution parameters are determined using the nonlinear least squares optimization method in two ways: (1) optimization by programming in Mathematica; and (2) optimization by applying a genetic algorithm. The results are compared with those obtained by the traditional linear least squares method. The results show comparable capability and performance of the two nonlinear methods. The gamma and Pearson distributions are the most successful models in preserving the rising and recession limbs of the unit hydrographs. The log-normal distribution has a high ability in predicting both the peak flow and time to peak of the unit hydrograph. The nonlinear optimization method does not outperform the linear least squares method in determining the UH (especially for excess rainfall of one pulse), but is comparable.
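    A sketch of fitting one of these pdfs (the two-parameter gamma) to UH ordinates by nonlinear least squares; the ordinates are invented, and SciPy's curve_fit stands in for the optimization schemes used in the study.

```python
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import gamma

def gamma_uh(t, shape, scale):
    """Two-parameter gamma pdf used as a unit-area UH shape."""
    return gamma.pdf(t, a=shape, scale=scale)

# hypothetical observed UH ordinates, normalized to unit area
t_obs = np.arange(1, 13, dtype=float)
u_obs = np.array([0.02, 0.10, 0.19, 0.21, 0.17, 0.12,
                  0.08, 0.05, 0.03, 0.02, 0.007, 0.003])

(shape, scale), _ = curve_fit(gamma_uh, t_obs, u_obs, p0=(3.0, 1.5))
time_to_peak = (shape - 1) * scale  # mode of the fitted gamma pdf
print(f"shape={shape:.2f}, scale={scale:.2f}, time to peak={time_to_peak:.2f}")
```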

  14. Reward skewness coding in the insula independent of probability and loss

    PubMed Central

    Tobler, Philippe N.

    2011-01-01

    Rewards in the natural environment are rarely predicted with complete certainty. Uncertainty relating to future rewards has typically been defined as the variance of the potential outcomes. However, the asymmetry of predicted reward distributions, known as skewness, constitutes a distinct but neuroscientifically underexplored risk term that may also have an impact on preference. By changing only reward magnitudes, we study skewness processing in equiprobable ternary lotteries involving only gains and constant probabilities, thus excluding probability distortion or loss aversion as mechanisms for skewness preference formation. We show that individual preferences are sensitive to not only the mean and variance but also to the skewness of predicted reward distributions. Using neuroimaging, we show that the insula, a structure previously implicated in the processing of reward-related uncertainty, responds to the skewness of predicted reward distributions. Some insula responses increased in a monotonic fashion with skewness (irrespective of individual skewness preferences), whereas others were similarly elevated to both negative and positive as opposed to no reward skew. These data support the notion that the asymmetry of reward distributions is processed in the brain and, taken together with replicated findings of mean coding in the striatum and variance coding in the cingulate, suggest that the brain codes distinct aspects of reward distributions in a distributed fashion. PMID:21849610

  15. A mathematical model for evolution and SETI.

    PubMed

    Maccone, Claudio

    2011-12-01

    Darwinian evolution theory may be regarded as a part of SETI theory in that the factor f(l) in the Drake equation represents the fraction of planets suitable for life on which life actually arose. In this paper we first provide a statistical generalization of the Drake equation where the factor f(l) is shown to follow the lognormal probability distribution. This lognormal distribution is a consequence of the Central Limit Theorem (CLT) of statistics, which states that the product of a number of independent random variables, whose probability densities are unknown and independent of each other, approaches the lognormal distribution as the number of factors increases to infinity. In addition we show that the exponential growth of the number of species typical of Darwinian evolution may be regarded as the geometric locus of the peaks of a one-parameter family of lognormal distributions (b-lognormals) constrained between the time axis and the exponential growth curve. Finally, since each b-lognormal distribution in the family may in turn be regarded as the product of a large number (actually "an infinity") of independent lognormal probability distributions, the mathematical way is paved to further cast Darwinian evolution into a mathematical theory in agreement with both its typical exponential growth in the number of living species and the Statistical Drake Equation.
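    The CLT argument is easy to demonstrate numerically: the sketch below multiplies many iid positive random variables of arbitrary (here uniform) shape and checks that the product is approximately lognormal.

```python
import numpy as np
from scipy import stats

# product of many positive iid factors: log(product) is a sum of logs,
# so the CLT drives the product toward a lognormal distribution
rng = np.random.default_rng(0)
n_factors, n_samples = 200, 50_000
factors = rng.uniform(0.5, 1.5, size=(n_samples, n_factors))
product = factors.prod(axis=1)

log_prod = np.log(product)
print(stats.shapiro(log_prod[:500]))       # logs look close to normal
print(stats.lognorm.fit(product, floc=0))  # fitted lognormal parameters
```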

  16. Modelling Evolution and SETI Mathematically

    NASA Astrophysics Data System (ADS)

    Maccone, Claudio

    2012-05-01

    Darwinian evolution theory may be regarded as a part of SETI theory in that the factor fl in the Drake equation represents the fraction of planets suitable for life on which life actually arose. In this paper we first provide a statistical generalization of the Drake equation where the factor fl is shown to follow the lognormal probability distribution. This lognormal distribution is a consequence of the Central Limit Theorem (CLT) of statistics, which states that the product of a number of independent random variables, whose probability densities are unknown and independent of each other, approaches the lognormal distribution as the number of factors increases to infinity. In addition we show that the exponential growth of the number of species typical of Darwinian evolution may be regarded as the geometric locus of the peaks of a one-parameter family of lognormal distributions constrained between the time axis and the exponential growth curve. Finally, since each lognormal distribution in the family may in turn be regarded as the product of a large number (actually "an infinity") of independent lognormal probability distributions, the mathematical way is paved to further cast Darwinian evolution into a mathematical theory in agreement with both its typical exponential growth in the number of living species and the Statistical Drake Equation.

  17. A Mathematical Model for Evolution and SETI

    NASA Astrophysics Data System (ADS)

    Maccone, Claudio

    2011-12-01

    Darwinian evolution theory may be regarded as a part of SETI theory in that the factor fl in the Drake equation represents the fraction of planets suitable for life on which life actually arose. In this paper we first provide a statistical generalization of the Drake equation where the factor fl is shown to follow the lognormal probability distribution. This lognormal distribution is a consequence of the Central Limit Theorem (CLT) of statistics, which states that the product of a number of independent random variables, whose probability densities are unknown and independent of each other, approaches the lognormal distribution as the number of factors increases to infinity. In addition we show that the exponential growth of the number of species typical of Darwinian evolution may be regarded as the geometric locus of the peaks of a one-parameter family of lognormal distributions (b-lognormals) constrained between the time axis and the exponential growth curve. Finally, since each b-lognormal distribution in the family may in turn be regarded as the product of a large number (actually "an infinity") of independent lognormal probability distributions, the mathematical way is paved to further cast Darwinian evolution into a mathematical theory in agreement with both its typical exponential growth in the number of living species and the Statistical Drake Equation.

  18. An efficient distribution method for nonlinear transport problems in stochastic porous media

    NASA Astrophysics Data System (ADS)

    Ibrahima, F.; Tchelepi, H.; Meyer, D. W.

    2015-12-01

    Because geophysical data are inexorably sparse and incomplete, stochastic treatments of simulated responses are convenient to explore possible scenarios and assess risks in subsurface problems. In particular, understanding how uncertainties propagate in porous media with nonlinear two-phase flow is essential, yet challenging, in reservoir simulation and hydrology. We give a computationally efficient and numerically accurate method to estimate the one-point probability density function (PDF) and cumulative distribution function (CDF) of the water saturation for the stochastic Buckley-Leverett problem when the probability distributions of the permeability and porosity fields are available. The method draws inspiration from the streamline approach and expresses the distributions of interest essentially in terms of an analytically derived mapping and the distribution of the time of flight. In a large class of applications the latter can be estimated at low computational cost (even via conventional Monte Carlo). Once the water saturation distribution is determined, any one-point statistics thereof can be obtained, especially its average and standard deviation. Moreover, crucial information that is rarely available in other approaches, such as the probability of rare events and saturation quantiles (e.g., P10, P50 and P90), can be derived from the method. We provide various examples and comparisons with Monte Carlo simulations to illustrate the performance of the method.
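    A sketch of the change-of-variables idea behind such a method: if saturation is a monotone function of the time of flight, the one-point saturation CDF follows directly from the time-of-flight distribution. The mapping and the lognormal time-of-flight samples below are placeholders, not the analytically derived quantities of the paper.

```python
import numpy as np

def saturation_cdf(s_grid, tof_samples, inverse_mapping):
    """One-point saturation CDF via a monotone mapping: with saturation a
    decreasing function s = M(tau) of the time of flight tau,
    P(S <= s) = P(tau >= M^-1(s))."""
    tau = np.asarray(tof_samples, dtype=float)
    return np.array([np.mean(tau >= inverse_mapping(s)) for s in s_grid])

# illustrative example: lognormal times of flight and M(tau) = 1/(1 + tau),
# whose inverse is M^-1(s) = 1/s - 1
rng = np.random.default_rng(5)
tof = rng.lognormal(mean=0.0, sigma=0.5, size=20_000)
s_grid = np.linspace(0.05, 0.95, 10)
cdf = saturation_cdf(s_grid, tof, inverse_mapping=lambda s: 1.0 / s - 1.0)
print(np.c_[s_grid, cdf])
```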

  19. Two statistical mechanics aspects of complex networks

    NASA Astrophysics Data System (ADS)

    Thurner, Stefan; Biely, Christoly

    2006-12-01

    By adopting an ensemble interpretation of non-growing rewiring networks, network theory can be reduced to a counting problem of possible network states and an identification of their associated probabilities. We present two scenarios of how different rewiring schemes can be used to control the state probabilities of the system. In particular, we review how, by generalizing the linking rules of random graphs in combination with superstatistics and quantum mechanical concepts, one can establish an exact relation between the degree distribution of any given network and the nodes' linking probability distributions. In a second approach, we control state probabilities by a network Hamiltonian, whose characteristics are motivated by biological and socio-economical statistical systems. We demonstrate that a thermodynamics of networks becomes a fully consistent concept, allowing one to study, e.g., 'phase transitions' and to compute entropies through thermodynamic relations.

  20. Computing exact bundle compliance control charts via probability generating functions.

    PubMed

    Chen, Binchao; Matis, Timothy; Benneyan, James

    2016-06-01

    Compliance to evidence-based practices, individually and in 'bundles', remains an important focus of healthcare quality improvement for many clinical conditions. The exact probability distribution of composite bundle compliance measures used to develop corresponding control charts and other statistical tests is based on a fairly large convolution whose direct calculation can be computationally prohibitive. Various series expansions and other approximation approaches have been proposed, each with computational and accuracy tradeoffs, especially in the tails. This same probability distribution also arises in other important healthcare applications, such as risk-adjusted outcomes and bed demand prediction, with the same computational difficulties. As an alternative, we use probability generating functions to rapidly obtain exact results and illustrate the improved accuracy and detection over other methods. Numerical testing across a wide range of applications demonstrates the computational efficiency and accuracy of this approach.
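    One concrete instance of the PGF approach is the Poisson-binomial convolution, whose exact pmf falls out of expanding the generating function; the sketch below is a simplified stand-in for the composite bundle measure treated in the paper.

```python
import numpy as np

def poisson_binomial_pmf(probs):
    """Exact pmf of a sum of independent Bernoulli(p_i) variables via its
    probability generating function G(z) = prod_i (1 - p_i + p_i * z):
    expanding the polynomial yields the pmf as its coefficients."""
    coeffs = np.array([1.0])               # polynomial in z, starts at G = 1
    for p in probs:
        coeffs = np.convolve(coeffs, [1.0 - p, p])
    return coeffs                          # coeffs[k] = P(sum == k)

# per-patient bundle compliance probabilities for one day's census
pmf = poisson_binomial_pmf([0.95, 0.90, 0.85, 0.99, 0.80])
print(pmf, pmf.sum())  # exact distribution; probabilities sum to 1
```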
