Sample records for expected statistical properties

  1. Statistical Properties of Generalized Gini Coefficient with Application to Health Inequality Measurement

    ERIC Educational Resources Information Center

    Lai, Dejian; Huang, Jin; Risser, Jan M.; Kapadia, Asha S.

    2008-01-01

    In this article, we report statistical properties of two classes of generalized Gini coefficients (G1 and G2). The theoretical results were assessed via Monte Carlo simulations. Further, we used G1 and G2 on life expectancy to measure health inequalities among the provinces of China and the states of the United States. For China, the results…
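
    The standard Gini coefficient is the baseline member of the generalized family (G1, G2) studied here. Below is a minimal sketch of how such a coefficient is computed and how its sampling behaviour can be checked by Monte Carlo, as in the paper's simulation study; the data are synthetic stand-ins for regional life-expectancy values, not the paper's data:

```python
import numpy as np

def gini(x):
    """Standard Gini coefficient via the sorted-index (Sen) formula."""
    x = np.sort(np.asarray(x, dtype=float))
    n = x.size
    i = np.arange(1, n + 1)
    return (2.0 * np.sum(i * x) / (n * np.sum(x))) - (n + 1.0) / n

rng = np.random.default_rng(0)
# Synthetic "life expectancy" values for 50 regions (illustrative only)
le = rng.normal(75.0, 3.0, size=50)
print("Gini:", gini(le))

# Monte Carlo assessment of the estimator's sampling variability,
# in the spirit of the paper's simulation study
reps = np.array([gini(rng.normal(75.0, 3.0, size=50)) for _ in range(2000)])
print("mean +/- sd over 2000 replicates:", reps.mean(), reps.std())
```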

  2. A Statistics-Based Material Property Analysis to Support TPS Characterization

    NASA Technical Reports Server (NTRS)

    Copeland, Sean R.; Cozmuta, Ioana; Alonso, Juan J.

    2012-01-01

    Accurate characterization of entry capsule heat shield material properties is a critical component in modeling and simulating Thermal Protection System (TPS) response in a prescribed aerothermal environment. The thermal decomposition of the TPS material during the pyrolysis and charring processes is poorly characterized and typically results in large uncertainties in material properties as inputs for ablation models. These material property uncertainties contribute to large design margins on flight systems and cloud reconstruction efforts for data collected during flight and ground testing, making revision to existing models for entry systems more challenging. The analysis presented in this work quantifies how material property uncertainties propagate through an ablation model and guides an experimental test regimen aimed at reducing these uncertainties and characterizing the dependencies between properties in the virgin and charred states for a Phenolic Impregnated Carbon Ablator (PICA) based TPS. A sensitivity analysis identifies how the high-fidelity model behaves in the expected flight environment, while a Monte Carlo based uncertainty propagation strategy is used to quantify the expected spread in the in-depth temperature response of the TPS. An examination of how perturbations to the input probability density functions affect output temperature statistics is accomplished using a Kriging response surface of the high-fidelity model. Simulations are based on capsule configuration and aerothermal environments expected during the Mars Science Laboratory (MSL) entry sequence. We identify and rank primary sources of uncertainty from material properties in a flight-relevant environment, show how those uncertainty contributions depend on spatial orientation and in-depth location, and quantify how sensitive the expected results are.
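
    The Monte Carlo propagation strategy described above is generic: draw the uncertain material properties from assumed input distributions, run the model, and summarize the spread of the output. Here is a minimal sketch with a hypothetical one-line stand-in for the ablation solver (the real study uses a high-fidelity code plus a Kriging surrogate); the distributions and parameter values are assumptions for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)

def ablation_model(conductivity, char_density):
    # Stand-in for the high-fidelity ablation solver: returns a fake
    # in-depth peak temperature [K]. Purely illustrative.
    return 600.0 + 900.0 * conductivity / char_density

n = 10_000
# Assumed (hypothetical) input distributions for two material properties
k   = rng.normal(0.5, 0.05, n)      # virgin conductivity, W/(m.K)
rho = rng.normal(280.0, 15.0, n)    # char density, kg/m^3

temps = ablation_model(k, rho)
lo, hi = np.percentile(temps, [2.5, 97.5])
print(f"peak temperature: mean={temps.mean():.1f} K, 95% band=({lo:.1f}, {hi:.1f}) K")
```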

  3. Information-Theoretic Properties of Auditory Sequences Dynamically Influence Expectation and Memory

    ERIC Educational Resources Information Center

    Agres, Kat; Abdallah, Samer; Pearce, Marcus

    2018-01-01

    A basic function of cognition is to detect regularities in sensory input to facilitate the prediction and recognition of future events. It has been proposed that these implicit expectations arise from an internal predictive coding model, based on knowledge acquired through processes such as statistical learning, but it is unclear how different…

  4. Probabilities and statistics for backscatter estimates obtained by a scatterometer with applications to new scatterometer design data

    NASA Technical Reports Server (NTRS)

    Pierson, Willard J., Jr.

    1989-01-01

    The values of the Normalized Radar Backscattering Cross Section (NRCS), sigma (o), obtained by a scatterometer are random variables whose variance is a known function of the expected value. The probability density function can be obtained from the normal distribution. Models for the expected value express it as a function of the properties of the waves on the ocean and the winds that generated the waves. Point estimates of the expected value were found from various statistics, given the parameters that define the probability density function for each value. Random intervals were derived with a preassigned probability of containing that value. A statistical test to determine whether or not successive values of sigma (o) are truly independent was derived. The maximum likelihood estimates for wind speed and direction were found, given a model for backscatter as a function of the properties of the waves on the ocean. These estimates are biased as a result of the terms in the equation that involve natural logarithms; calculations of the point estimates of the maximum likelihood values are used to show that the contributions of the logarithmic terms are negligible and that they can be omitted.

  5. The development of principled connections and kind representations.

    PubMed

    Haward, Paul; Wagner, Laura; Carey, Susan; Prasada, Sandeep

    2018-07-01

    Kind representations draw an important distinction between properties that are understood as existing in instances of a kind by virtue of their being the kind of thing they are and properties that are not understood in this manner. For example, the property of barking for the kind dog is understood as being had by dogs by virtue of the fact that they are dogs. These properties are said to have a principled connection to the kind. In contrast, the property of wearing a collar is not understood as existing in instances by virtue of their being dogs, despite the fact that a large percentage of dogs wear collars. Such properties are said to have a statistical connection to the kind. Two experiments tested two signatures of principled connections in 4- to 7-year-olds and adults: (i) that principled connections license normative expectations (e.g., we judge there to be something wrong with a dog that does not bark), and (ii) that principled connections license formal explanations, which explain the existence of a property by reference to the kind (e.g., it barks because it is a dog). Experiment 1 showed that both children and adults have normative expectations for properties that have a principled connection to a kind, but not for those that have a mere statistical connection to a kind. Experiment 2 showed that both children and adults are more likely to provide a formal explanation when explaining the existence of properties with a principled connection to a kind than properties with statistical connections to their kinds. Both experiments showed no effect of age (over ages 4, 7, and adulthood) on the extent to which participants differentiated principled and statistical connections. We discuss the implications of the results for theories of conceptual representation and for the structure of explanation.

  6. Expected Monotonicity – A Desirable Property for Evidence Measures?

    PubMed Central

    Hodge, Susan E.; Vieland, Veronica J.

    2010-01-01

    We consider here the principle of ‘evidential consistency’ – that as one gathers more data, any well-behaved evidence measure should, in some sense, approach the true answer. Evidential consistency is essential for the genome-scan design (GWAS or linkage), where one selects the most promising locus(i) for follow-up, expecting that new data will increase evidence for the correct hypothesis. Earlier work [Vieland, Hum Hered 2006;61:144–156] showed that many popular statistics do not satisfy this principle; Vieland concluded that the problem stems from fundamental difficulties in how we measure evidence and argued for determining criteria to evaluate evidence measures. Here, we investigate in detail one proposed consistency criterion – expected monotonicity (ExpM) – for a simple statistical model (binomial) and four likelihood ratio (LR)-based evidence measures. We show that, with one limited exception, none of these measures displays ExpM; what they do display is sometimes counterintuitive. We conclude that ExpM is not a reasonable requirement for evidence measures; moreover, no requirement based on expected values seems feasible. We demonstrate certain desirable properties of the simple LR and demonstrate a connection between the simple and integrated LRs. We also consider an alternative version of consistency, which is satisfied by certain forms of the integrated LR and posterior probability of linkage. PMID:20664208
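
    The flavour of the ExpM question can be probed in the paper's binomial setting with a toy simulation: track the expected likelihood ratio and expected log-likelihood ratio under the true hypothesis as the sample size grows. This is an illustrative probe, not the paper's analysis; the hypotheses and parameter values are assumptions:

```python
import numpy as np
from scipy.stats import binom

rng = np.random.default_rng(2)
p0, p1, p_true = 0.5, 0.6, 0.6   # simple H0, simple H1, truth = H1

for n in (10, 50, 100, 500):
    x = rng.binomial(n, p_true, size=20_000)
    lr = binom.pmf(x, n, p1) / binom.pmf(x, n, p0)
    # E[LR] and E[log LR] under the true hypothesis; only the latter is
    # guaranteed to grow steadily (at Kullback-Leibler rate n*KL(p1||p0))
    print(f"n={n:4d}  E[LR]={lr.mean():10.3g}  E[log LR]={np.log(lr).mean():8.3f}")
```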

  7. Interval-based reconstruction for uncertainty quantification in PET

    NASA Astrophysics Data System (ADS)

    Kucharczak, Florentin; Loquin, Kevin; Buvat, Irène; Strauss, Olivier; Mariano-Goulart, Denis

    2018-02-01

    A new directed interval-based tomographic reconstruction algorithm, called non-additive interval-based expectation maximization (NIBEM), is presented. It uses non-additive modeling of the forward operator that provides intervals instead of single-valued projections. The detailed approach is an extension of the maximum-likelihood expectation-maximization algorithm based on intervals. The main motivation for this extension is that the resulting intervals have appealing properties for estimating the statistical uncertainty associated with the reconstructed activity values. After reviewing previously published theoretical concepts related to interval-based projectors, this paper describes the NIBEM algorithm and gives examples that highlight the properties and advantages of this interval-valued reconstruction.
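
    For orientation, NIBEM extends the classical (single-valued) ML-EM reconstruction, whose multiplicative update is x_j <- x_j * [A^T (y / Ax)]_j / [A^T 1]_j. Here is a toy sketch of that baseline iteration with a random dense system matrix; it is illustrative only, and NIBEM itself replaces the single-valued projections with intervals:

```python
import numpy as np

def mlem(A, y, n_iter=50):
    """Classical ML-EM for y ~ Poisson(A @ x); NIBEM replaces the
    single-valued projections A @ x with intervals."""
    x = np.ones(A.shape[1])            # flat initial activity
    sens = A.sum(axis=0)               # sensitivity image A^T 1
    for _ in range(n_iter):
        proj = A @ x                   # forward projection
        ratio = y / np.maximum(proj, 1e-12)
        x *= (A.T @ ratio) / np.maximum(sens, 1e-12)
    return x

rng = np.random.default_rng(3)
A = rng.random((64, 16))               # toy system matrix
x_true = rng.random(16) * 10
y = rng.poisson(A @ x_true)            # noisy projection data
x_hat = mlem(A, y)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```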

  8. Accurate quantification of magnetic particle properties by intra-pair magnetophoresis for nanobiotechnology

    NASA Astrophysics Data System (ADS)

    van Reenen, Alexander; Gao, Yang; Bos, Arjen H.; de Jong, Arthur M.; Hulsen, Martien A.; den Toonder, Jaap M. J.; Prins, Menno W. J.

    2013-07-01

    The application of magnetic particles in biomedical research and in-vitro diagnostics requires accurate characterization of their magnetic properties, with single-particle resolution and good statistics. Here, we report intra-pair magnetophoresis as a method to accurately quantify the field-dependent magnetic moments of magnetic particles and to rapidly generate histograms of the magnetic moments with good statistics. We demonstrate our method with particles of different sizes and from different sources, with a measurement precision of a few percent. We expect that intra-pair magnetophoresis will be a powerful tool for the characterization and improvement of particles for the upcoming field of particle-based nanobiotechnology.

  9. Halo models of HI selected galaxies

    NASA Astrophysics Data System (ADS)

    Paul, Niladri; Choudhury, Tirthankar Roy; Paranjape, Aseem

    2018-06-01

    Modelling the distribution of neutral hydrogen (HI) in dark matter halos is important for studying galaxy evolution in the cosmological context. We use a novel approach to infer the HI-dark matter connection at the massive end (m_{HI} > 10^{9.8} M_{⊙}) from radio HI emission surveys, using optical properties of low-redshift galaxies as an intermediary. In particular, we use a previously calibrated optical HOD describing the luminosity- and colour-dependent clustering of SDSS galaxies and describe the HI content using a statistical scaling relation between the optical properties and HI mass. This allows us to compute the abundance and clustering properties of HI-selected galaxies and compare with data from the ALFALFA survey. We apply an MCMC-based statistical analysis to constrain the free parameters related to the scaling relation. The resulting best-fit scaling relation identifies massive HI galaxies primarily with optically faint blue centrals, consistent with expectations from galaxy formation models. We compare the HI-stellar mass relation predicted by our model with independent observations from matched HI-optical galaxy samples, finding reasonable agreement. As a further application, we make some preliminary forecasts for future observations of HI and optical galaxies in the expected overlap volume of SKA and Euclid/LSST.

  10. Gaussian orthogonal ensemble statistics in graphene billiards with the shape of classically integrable billiards.

    PubMed

    Yu, Pei; Li, Zi-Yuan; Xu, Hong-Ya; Huang, Liang; Dietz, Barbara; Grebogi, Celso; Lai, Ying-Cheng

    2016-12-01

    A crucial result in quantum chaos, which has been established for a long time, is that the spectral properties of classically integrable systems generically are described by Poisson statistics, whereas those of time-reversal symmetric, classically chaotic systems coincide with those of random matrices from the Gaussian orthogonal ensemble (GOE). Does this result hold for two-dimensional Dirac material systems? To address this fundamental question, we investigate the spectral properties in a representative class of graphene billiards with shapes of classically integrable circular-sector billiards. Naively one may expect to observe Poisson statistics, which is indeed true for energies close to the band edges where the quasiparticle obeys the Schrödinger equation. However, for energies near the Dirac point, where the quasiparticles behave like massless Dirac fermions, Poisson statistics is extremely rare in the sense that it emerges only under quite strict symmetry constraints on the straight boundary parts of the sector. An arbitrarily small amount of imperfection of the boundary results in GOE statistics. This implies that, for circular-sector confinements with arbitrary angle, the spectral properties will generically be GOE. These results are corroborated by extensive numerical computation. Furthermore, we provide a physical understanding for our results.
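
    The Poisson/GOE dichotomy invoked here is easy to probe numerically: nearest-neighbor spacings of GOE eigenvalues approximately follow the Wigner surmise P(s) = (pi s/2) exp(-pi s^2/4), while uncorrelated levels give the Poisson form exp(-s). Below is a rough sketch using bulk eigenvalues of random real symmetric matrices, without full spectral unfolding, so the agreement with the surmise is only approximate:

```python
import numpy as np

rng = np.random.default_rng(4)

def goe_central_spacings(dim=200, n_mat=200):
    s = []
    for _ in range(n_mat):
        m = rng.normal(size=(dim, dim))
        h = (m + m.T) / 2.0                      # GOE-type real symmetric matrix
        ev = np.linalg.eigvalsh(h)
        mid = ev[dim // 4: 3 * dim // 4]         # bulk, where density is ~flat
        s.extend(np.diff(mid))
    s = np.array(s)
    return s / s.mean()                          # normalize mean spacing to 1

s = goe_central_spacings()
# Wigner surmise for GOE: P(s) = (pi*s/2) exp(-pi*s^2/4); Poisson: exp(-s)
hist, edges = np.histogram(s, bins=30, range=(0, 3), density=True)
mids = 0.5 * (edges[:-1] + edges[1:])
wigner = (np.pi * mids / 2) * np.exp(-np.pi * mids**2 / 4)
# small residual expected since no unfolding is applied
print("max |empirical - Wigner|:", np.abs(hist - wigner).max())
```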

  11. Squeezed states in the theory of primordial gravitational waves

    NASA Technical Reports Server (NTRS)

    Grishchuk, Leonid P.

    1992-01-01

    It is shown that squeezed states of primordial gravitational waves are inevitably produced in the course of cosmological evolution. The theory of squeezed gravitons is very similar to the theory of squeezed light. Squeezed parameters and statistical properties of the expected relic gravity-wave radiation are described.

  12. Statistical Learning in Emerging Lexicons: The Case of Danish

    ERIC Educational Resources Information Center

    Stokes, Stephanie F.; Bleses, Dorthe; Basboll, Hans; Lambertsen, Claus

    2012-01-01

    Purpose: This research explored the impact of neighborhood density (ND), word frequency (WF), and word length (WL) on the vocabulary size of Danish-speaking children. Given the particular phonological properties of Danish, the impact was expected to differ from that reported in studies on English and French. Method: The monosyllabic words in the…

  13. Expected p-values in light of an ROC curve analysis applied to optimal multiple testing procedures.

    PubMed

    Vexler, Albert; Yu, Jihnhee; Zhao, Yang; Hutson, Alan D; Gurevich, Gregory

    2017-01-01

    Many statistical studies report p-values for inferential purposes. In several scenarios, the stochastic aspect of p-values is neglected, which may contribute to drawing wrong conclusions in real data experiments. The stochastic nature of p-values makes it difficult to use them to examine the performance of given testing procedures or the associations between investigated factors. We turn our focus to the modern statistical literature to address the expected p-value (EPV) as a measure of the performance of decision-making rules. During the course of our study, we prove that the EPV can be considered in the context of receiver operating characteristic (ROC) curve analysis, a well-established biostatistical methodology. The ROC-based framework provides a new and efficient methodology for investigating and constructing statistical decision-making procedures, including: (1) evaluation and visualization of properties of the testing mechanisms, considering, e.g., partial EPVs; (2) development of optimal tests via the minimization of EPVs; (3) creation of novel methods for optimally combining multiple test statistics. We demonstrate that the proposed EPV-based approach allows us to maximize the integrated power of testing algorithms with respect to various significance levels. In an application, we use the proposed method to construct the optimal test and analyze a myocardial infarction disease dataset. We outline the usefulness of the "EPV/ROC" technique for evaluating different decision-making procedures, their constructions and properties, with an eye towards practical applications.
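
    For the textbook one-sided normal-shift alternative, the EPV/ROC link can be made concrete: with test statistic X ~ N(0,1) under H0 and X ~ N(delta,1) under H1, the expected p-value equals P(X0 > X1) = 1 - AUC of the binormal ROC. A simulation sketch of that identity (the shift delta is an arbitrary assumption):

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(5)
delta, n = 1.5, 200_000

x0 = rng.normal(0.0, 1.0, n)          # test statistic under H0 (not used for EPV)
x1 = rng.normal(delta, 1.0, n)        # test statistic under H1

epv = norm.sf(x1).mean()              # expected p-value under the alternative
auc = norm.cdf(delta / np.sqrt(2.0))  # binormal AUC, P(X1 > X0)
print(f"EPV = {epv:.4f},  1 - AUC = {1 - auc:.4f}")   # these should agree
```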

  14. Gene Identification Algorithms Using Exploratory Statistical Analysis of Periodicity

    NASA Astrophysics Data System (ADS)

    Mukherjee, Shashi Bajaj; Sen, Pradip Kumar

    2010-10-01

    Studying periodic patterns is a standard line of attack for recognizing DNA sequence structure in gene identification and similar problems, yet surprisingly little significant work has been done in this direction. This paper studies statistical properties of DNA sequences of a complete genome using a new technique. A DNA sequence is converted to a numeric sequence using various types of mappings, and the standard Fourier technique is applied to study the periodicity. Distinct statistical behaviour of the periodicity parameters is found in coding and non-coding sequences, which can be used to distinguish between these parts. Here, DNA sequences of Drosophila melanogaster were analyzed with significant accuracy.
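
    The mapping-plus-Fourier pipeline can be sketched directly: convert the sequence to four binary indicator tracks (the Voss mapping, one of the mapping types alluded to above), sum their power spectra, and inspect the period-3 bin (k = N/3) that is known to be enhanced in coding regions. A toy version on synthetic sequences:

```python
import numpy as np

def spectral_content(seq):
    """Sum of indicator-sequence power spectra (Voss mapping)."""
    n = len(seq)
    power = np.zeros(n)
    for base in "ACGT":
        u = np.array([1.0 if c == base else 0.0 for c in seq])
        power += np.abs(np.fft.fft(u)) ** 2
    return power

rng = np.random.default_rng(6)
n = 999
random_seq = "".join(rng.choice(list("ACGT"), n))
coding_like = "ATG" * (n // 3)                 # exaggerated period-3 pattern

for name, s in [("random", random_seq), ("coding-like", coding_like)]:
    p = spectral_content(s)
    k3 = len(s) // 3                           # period-3 frequency bin
    print(f"{name:12s} P(N/3)/mean(P) = {p[k3] / p[1:len(s)//2].mean():.1f}")
```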

  15. Data-adaptive test statistics for microarray data.

    PubMed

    Mukherjee, Sach; Roberts, Stephen J; van der Laan, Mark J

    2005-09-01

    An important task in microarray data analysis is the selection of genes that are differentially expressed between different tissue samples, such as healthy and diseased. However, microarray data contain an enormous number of dimensions (genes) and very few samples (arrays), a mismatch which poses fundamental statistical problems for the selection process that have defied easy resolution. In this paper, we present a novel approach to the selection of differentially expressed genes in which test statistics are learned from data using a simple notion of reproducibility in selection results as the learning criterion. Reproducibility, as we define it, can be computed without any knowledge of the 'ground-truth', but takes advantage of certain properties of microarray data to provide an asymptotically valid guide to expected loss under the true data-generating distribution. We are therefore able to indirectly minimize expected loss, and obtain results substantially more robust than conventional methods. We apply our method to simulated and oligonucleotide array data. Availability: by request to the corresponding author.

  16. The effects of neuron morphology on graph theoretic measures of network connectivity: the analysis of a two-level statistical model.

    PubMed

    Aćimović, Jugoslava; Mäki-Marttunen, Tuomo; Linne, Marja-Leena

    2015-01-01

    We developed a two-level statistical model that addresses the question of how properties of neurite morphology shape the large-scale network connectivity. We adopted a low-dimensional statistical description of neurites. From the neurite model description we derived the expected number of synapses, node degree, and the effective radius, the maximal distance between two neurons expected to form at least one synapse. We related these quantities to the network connectivity described using standard measures from graph theory, such as motif counts, clustering coefficient, minimal path length, and small-world coefficient. These measures are used in a neuroscience context to study phenomena from synaptic connectivity in small neuronal networks to large-scale functional connectivity in the cortex. For these measures we provide analytical solutions that clearly relate different model properties. Neurites that sparsely cover space lead to a small effective radius. If the effective radius is small compared to the overall neuron size, the obtained networks share similarities with uniform random networks, as each neuron connects to a small number of distant neurons. Large neurites with densely packed branches lead to a large effective radius. If this effective radius is large compared to the neuron size, the obtained networks have many local connections. In between these extremes, the networks maximize the variability of connection repertoires. The presented approach connects the properties of neuron morphology with large-scale network properties without requiring heavy simulations with many model parameters. The two-step procedure provides an easier interpretation of the role of each modeled parameter. The model is flexible and each of its components can be further expanded. We identified a range of model parameters that maximizes variability in network connectivity, the property that might affect network capacity to exhibit different dynamical regimes.
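
    The network-level endpoint of such a model can be reproduced with standard graph tools. Here is a sketch, assuming networkx, that wires nodes with a connection probability decaying over a hypothetical "effective radius" and then computes the clustering coefficient, mean path length, and a small-world coefficient against an edge-matched random graph:

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(7)
n, radius = 200, 0.15                   # neurons on a unit square; effective radius

pos = rng.random((n, 2))
g = nx.Graph()
g.add_nodes_from(range(n))
for i in range(n):
    for j in range(i + 1, n):
        d = np.linalg.norm(pos[i] - pos[j])
        # connect with probability decaying over the effective radius
        if rng.random() < np.exp(-(d / radius) ** 2):
            g.add_edge(i, j)

giant = g.subgraph(max(nx.connected_components(g), key=len))
C = nx.average_clustering(giant)
L = nx.average_shortest_path_length(giant)
print(f"clustering={C:.3f}, mean path length={L:.2f}")

# small-world coefficient: compare against an edge-matched random graph
rand = nx.gnm_random_graph(giant.number_of_nodes(), giant.number_of_edges(), seed=0)
rg = rand.subgraph(max(nx.connected_components(rand), key=len))
sigma = (C / nx.average_clustering(rg)) / (L / nx.average_shortest_path_length(rg))
print("small-world coefficient (C/C_rand)/(L/L_rand):", round(sigma, 2))
```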

  17. Adaptive Colour Contrast Coding in the Salamander Retina Efficiently Matches Natural Scene Statistics

    PubMed Central

    Vasserman, Genadiy; Schneidman, Elad; Segev, Ronen

    2013-01-01

    The visual system continually adjusts its sensitivity to the statistical properties of the environment through an adaptation process that starts in the retina. Colour perception and processing is commonly thought to occur mainly in high visual areas, and indeed most evidence for chromatic colour contrast adaptation comes from cortical studies. We show that colour contrast adaptation starts in the retina where ganglion cells adjust their responses to the spectral properties of the environment. We demonstrate that the ganglion cells match their responses to red-blue stimulus combinations according to the relative contrast of each of the input channels by rotating their functional response properties in colour space. Using measurements of the chromatic statistics of natural environments, we show that the retina balances inputs from the two (red and blue) stimulated colour channels, as would be expected from theoretical optimal behaviour. Our results suggest that colour is encoded in the retina based on the efficient processing of spectral information that matches spectral combinations in natural scenes on the colour processing level. PMID:24205373

  18. Consumer expectations of nonprescription medications according to location of sale.

    PubMed

    Taylor, Jeffrey G; Lo, Ya-Ning; Dobson, Roy; Suveges, Linda G

    2007-01-01

    Objective: To determine whether the public has different expectations of nonprescription medications based on location of sale. Design: Cross-sectional, descriptive. Setting: Saskatoon, Saskatchewan, Canada, during the summer of 2003. Participants: 2,102 randomly selected citizens. Intervention: Mail survey. Main outcome measures: Differences in expectations for potency, safety, adverse effects, effectiveness, and package information of products sold in pharmacies versus convenience stores. Results: The response rate was 57.2%. Most participants (81.2%) were aware that nonprescription medications could be purchased in convenience stores, but far fewer (42.3%) had done so. As one potential resource during purchases, pharmacists were held in reasonably high regard. Expectations with the greatest difference were of a merchandising nature: respondents expected pharmacies to have a better quality and selection of products and lower prices. For drug-related attributes, differences were minimal but statistically significant. Conclusion: Location of sale does not appear to have any practical influence on consumer expectations of the drug-related attributes of nonprescription medications. Buyers of such products expect similar properties to be present regardless of location.

  19. The Statistics of Urban Scaling and Their Connection to Zipf’s Law

    PubMed Central

    Gomez-Lievano, Andres; Youn, HyeJin; Bettencourt, Luís M. A.

    2012-01-01

    Urban scaling relations characterizing how diverse properties of cities vary on average with their population size have recently been shown to be a general quantitative property of many urban systems around the world. However, in previous studies the statistics of urban indicators were not analyzed in detail, raising important questions about the full characterization of urban properties and how scaling relations may emerge in these larger contexts. Here, we build a self-consistent statistical framework that characterizes the joint probability distributions of urban indicators and city population sizes across an urban system. To develop this framework empirically we use one of the most granular and stochastic urban indicators available, specifically measuring homicides in cities of Brazil, Colombia and Mexico, three nations with high and fast changing rates of violent crime. We use these data to derive the conditional probability of the number of homicides per year given the population size of a city. To do this we use Bayes’ rule together with the estimated conditional probability of city size given their number of homicides and the distribution of total homicides. We then show that scaling laws emerge as expectation values of these conditional statistics. Knowledge of these distributions implies, in turn, a relationship between scaling and population size distribution exponents that can be used to predict Zipf’s exponent from urban indicator statistics. Our results also suggest how a general statistical theory of urban indicators may be constructed from the stochastic dynamics of social interaction processes in cities. PMID:22815745
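
    The central object here, a scaling law emerging as a conditional expectation E[Y|N] = a N^beta, can be illustrated with a quick log-log fit; the exponent and noise model below are assumptions chosen only to mimic a superlinear urban indicator:

```python
import numpy as np

rng = np.random.default_rng(8)
beta_true, a_true = 1.15, 1e-4

# Synthetic city populations (log-uniform) and a noisy indicator Y
N = 10 ** rng.uniform(4, 7, 500)
Y = a_true * N ** beta_true * rng.lognormal(0.0, 0.4, 500)

# Scaling exponent from an ordinary least-squares fit in log-log space
beta_hat, log_a_hat = np.polyfit(np.log(N), np.log(Y), 1)
print(f"fitted beta = {beta_hat:.3f} (true {beta_true})")
```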

  1. Galaxy mergers and gravitational lens statistics

    NASA Technical Reports Server (NTRS)

    Rix, Hans-Walter; Maoz, Dan; Turner, Edwin L.; Fukugita, Masataka

    1994-01-01

    We investigate the impact of hierarchical galaxy merging on the statistics of gravitational lensing of distant sources. Since no definite theoretical predictions for the merging history of luminous galaxies exist, we adopt a parameterized prescription, which allows us to adjust the expected number of pieces comprising a typical present-day galaxy at z approximately 0.65. The existence of global parameter relations for elliptical galaxies and constraints on the evolution of the phase space density in dissipationless mergers allow us to limit the possible evolution of galaxy lens properties under merging. We draw two lessons from implementing this lens evolution into statistical lens calculations: (1) The total optical depth to multiple imaging (e.g., of quasars) is quite insensitive to merging. (2) Merging leads to a smaller mean separation of observed multiple images. Because merging does not drastically reduce the expected lensing frequency, it cannot make lambda-dominated cosmologies compatible with the existing lensing observations. A comparison with the data from the Hubble Space Telescope (HST) Snapshot Survey shows that models with little or no evolution of the lens population are statistically favored over strong merging scenarios. A specific merging scenario proposed by Toomre can be rejected (95% level) by such a comparison. Some versions of the scenario proposed by Broadhurst, Ellis, & Glazebrook are statistically acceptable.

  2. Integer lattice gas with Monte Carlo collision operator recovers the lattice Boltzmann method with Poisson-distributed fluctuations

    NASA Astrophysics Data System (ADS)

    Blommel, Thomas; Wagner, Alexander J.

    2018-02-01

    We examine a new kind of lattice gas that closely resembles modern lattice Boltzmann methods. This new kind of lattice gas, which we call a Monte Carlo lattice gas, has interesting properties that shed light on the origin of the multirelaxation-time collision operator, and it reproduces the equilibrium distribution of an entropic lattice Boltzmann method. Furthermore, these lattice gas methods have Galilean-invariant fluctuations given by Poisson statistics, giving further insight into the properties that we should expect of fluctuating lattice Boltzmann methods.

  3. The effect of rare variants on inflation of the test statistics in case-control analyses.

    PubMed

    Pirie, Ailith; Wood, Angela; Lush, Michael; Tyrer, Jonathan; Pharoah, Paul D P

    2015-02-20

    The detection of bias due to cryptic population structure is an important step in the evaluation of findings of genetic association studies. The standard method of measuring this bias in a genetic association study is to compare the observed median association test statistic to the expected median test statistic. This ratio is inflated in the presence of cryptic population structure. However, inflation may also be caused by the properties of the association test itself particularly in the analysis of rare variants. We compared the properties of the three most commonly used association tests: the likelihood ratio test, the Wald test and the score test when testing rare variants for association using simulated data. We found evidence of inflation in the median test statistics of the likelihood ratio and score tests for tests of variants with less than 20 heterozygotes across the sample, regardless of the total sample size. The test statistics for the Wald test were under-inflated at the median for variants below the same minor allele frequency. In a genetic association study, if a substantial proportion of the genetic variants tested have rare minor allele frequencies, the properties of the association test may mask the presence or absence of bias due to population structure. The use of either the likelihood ratio test or the score test is likely to lead to inflation in the median test statistic in the absence of population structure. In contrast, the use of the Wald test is likely to result in under-inflation of the median test statistic which may mask the presence of population structure.
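
    The inflation measure discussed here is the ratio of the observed median association test statistic to its null expectation, the median of a 1-df chi-square (about 0.4549). A sketch of that computation, with null-simulated statistics standing in for real genome-wide results:

```python
import numpy as np
from scipy.stats import chi2

def inflation_lambda(chisq_stats):
    """Ratio of observed to expected median of 1-df chi-square statistics."""
    return np.median(chisq_stats) / chi2.ppf(0.5, df=1)   # denominator ~ 0.4549

rng = np.random.default_rng(9)
null_stats = rng.chisquare(df=1, size=100_000)
print("lambda under the null:", round(inflation_lambda(null_stats), 3))  # ~1.0
# Values far from 1 suggest population structure or, per this paper, the
# small-sample behaviour of the LRT/score/Wald tests at rare variants.
```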

  4. Manifestations of Influence of Solar Activity and Cosmic Ray Intensity on the Wheat Price in the Medieval England (1259-1703 Years)

    NASA Astrophysics Data System (ADS)

    Pustil'Nik, Lev A.; Dorman, L. I.; Yom Din, G.

    2003-07-01

    The database of Professor Rogers, with wheat prices in England in the Middle Ages (1249-1703), was used to search for possible manifestations of solar activity and cosmic ray variations. The main object of the statistical analysis is the investigation of price bursts. We present a conceptual model of possible modes for the sensitivity of wheat prices to weather conditions, caused by solar cycle variations in cosmic rays, and compare the expected price fluctuations with wheat price variations recorded in Medieval England. We compared statistical properties of the intervals between price bursts with statistical properties of the intervals between extremes (minimums) of solar cycles during the years 1700-2000. The statistical properties of these two samples are similar both in the averaged/median values of the intervals and in their standard deviations. We show that the histograms of the interval distributions for price bursts and solar minimums coincide with a high confidence level. We analyzed direct links between wheat prices and solar activity in the 17th Century, for which wheat prices and solar activity data as well as cosmic ray intensity (from the 10Be isotope) are available. We show that for all seven solar activity minimums the observed prices were higher than the prices for the nine intervals of maximal solar activity preceding the minimums. This result, combined with the conclusion on the similarity of statistical properties of the price bursts and solar activity extremes, we consider direct evidence of a causal connection between wheat price bursts and solar activity.

  5. Reliability and Validity of the Self-Efficacy for Exercise in Epilepsy and the Outcome Expectations for Exercise in Epilepsy Scales.

    PubMed

    Dustin, Irene; Resnick, Barbara; Galik, Elizabeth; Klinedinst, N Jennifer; Michael, Kathleen; Wiggs, Edythe

    2017-04-01

    The purpose of this study was to test the psychometric properties of the revised Self-Efficacy for Exercise With Epilepsy (SEE-E) and Outcome Expectations for Exercise with Epilepsy (OEE-E) when used with people with epilepsy. The SEE-E and OEE-E were given in face-to-face interviews to 26 persons with epilepsy in an epilepsy clinic. There was some evidence of validity based on Rasch analysis INFIT and OUTFIT statistics. There was some evidence of reliability for the SEE-E and OEE-E based on person and item separation reliability indexes. These measures can be used to identify persons with epilepsy who have low self-efficacy and outcome expectations for exercise and guide design of interventions to strengthen these expectations and thereby improve exercise behavior.

  6. Asymptotic formulae for likelihood-based tests of new physics

    NASA Astrophysics Data System (ADS)

    Cowan, Glen; Cranmer, Kyle; Gross, Eilam; Vitells, Ofer

    2011-02-01

    We describe likelihood-based statistical tests for use in high energy physics for the discovery of new phenomena and for construction of confidence intervals on model parameters. We focus on the properties of the test procedures that allow one to account for systematic uncertainties. Explicit formulae for the asymptotic distributions of test statistics are derived using results of Wilks and Wald. We motivate and justify the use of a representative data set, called the "Asimov data set", which provides a simple method to obtain the median experimental sensitivity of a search or measurement as well as fluctuations about this expectation.
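
    The best-known product of this formalism is the Asimov estimate of the median discovery significance for a counting experiment with expected signal s on background b, Z_A = sqrt(2[(s+b) ln(1+s/b) - s]). A direct transcription:

```python
import numpy as np

def asimov_significance(s, b):
    """Median discovery significance from the Asimov data set
    (Cowan, Cranmer, Gross & Vitells; counting experiment, known b)."""
    return np.sqrt(2.0 * ((s + b) * np.log(1.0 + s / b) - s))

print(asimov_significance(10.0, 100.0))   # ~0.98, close to s/sqrt(b) = 1.0
print(asimov_significance(10.0, 2.0))     # ~4.8; naive s/sqrt(b) = 7.1 overestimates
# For s << b this expression reduces to the familiar s/sqrt(b) approximation.
```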

  7. Exclusion probabilities and likelihood ratios with applications to kinship problems.

    PubMed

    Slooten, Klaas-Jan; Egeland, Thore

    2014-05-01

    In forensic genetics, DNA profiles are compared in order to make inferences, paternity cases being a standard example. The statistical evidence can be summarized and reported in several ways. For example, in a paternity case, the likelihood ratio (LR) and the probability of not excluding a random man as father (RMNE) are two common summary statistics. There has been a long debate on the merits of the two statistics, also in the context of DNA mixture interpretation, and no general consensus has been reached. In this paper, we show that the RMNE is a certain weighted average of inverse likelihood ratios. This is true in any forensic context. We show that the likelihood ratio in favor of the correct hypothesis is, in expectation, bigger than the reciprocal of the RMNE probability. However, with the exception of pathological cases, it is also possible to obtain smaller likelihood ratios. We illustrate this result for paternity cases. Moreover, some theoretical properties of the likelihood ratio for a large class of general pairwise kinship cases, including expected value and variance, are derived. The practical implications of the findings are discussed and exemplified.

  8. A new statistical approach to climate change detection and attribution

    NASA Astrophysics Data System (ADS)

    Ribes, Aurélien; Zwiers, Francis W.; Azaïs, Jean-Marc; Naveau, Philippe

    2017-01-01

    We propose here a new statistical approach to climate change detection and attribution that is based on additive decomposition and simple hypothesis testing. Most current statistical methods for detection and attribution rely on linear regression models where the observations are regressed onto expected response patterns to different external forcings. These methods do not use physical information provided by climate models regarding the expected response magnitudes to constrain the estimated responses to the forcings. Climate modelling uncertainty is difficult to take into account with regression based methods and is almost never treated explicitly. As an alternative to this approach, our statistical model is only based on the additivity assumption; the proposed method does not regress observations onto expected response patterns. We introduce estimation and testing procedures based on likelihood maximization, and show that climate modelling uncertainty can easily be accounted for. Some discussion is provided on how to practically estimate the climate modelling uncertainty based on an ensemble of opportunity. Our approach is based on the "models are statistically indistinguishable from the truth" paradigm, where the difference between any given model and the truth has the same distribution as the difference between any pair of models, but other choices might also be considered. The properties of this approach are illustrated and discussed based on synthetic data. Lastly, the method is applied to the linear trend in global mean temperature over the period 1951-2010. Consistent with the last IPCC assessment report, we find that most of the observed warming over this period (+0.65 K) is attributable to anthropogenic forcings (+0.67 ± 0.12 K, 90% confidence range), with a very limited contribution from natural forcings (-0.01 ± 0.02 K).

  9. Computational and Statistical Analyses of Amino Acid Usage and Physico-Chemical Properties of the Twelve Late Embryogenesis Abundant Protein Classes

    PubMed Central

    Jaspard, Emmanuel; Macherel, David; Hunault, Gilles

    2012-01-01

    Late Embryogenesis Abundant Proteins (LEAPs) are ubiquitous proteins expected to play major roles in desiccation tolerance. Little is known about their structure–function relationships because of the scarcity of 3-D structures for LEAPs. The previous building of LEAPdb, a database dedicated to LEAPs from plants and other organisms, led to the classification of 710 LEAPs into 12 non-overlapping classes with distinct properties. Using this resource, numerous physico-chemical properties of LEAPs and amino acid usage by LEAPs have been computed and statistically analyzed, revealing distinctive features for each class. This unprecedented analysis allowed a rigorous characterization of the 12 LEAP classes, which differed also in multiple structural and physico-chemical features. Although most LEAPs can be predicted as intrinsically disordered proteins, the analysis indicates that LEAP class 7 (PF03168) and probably LEAP class 11 (PF04927) are natively folded proteins. This study thus provides a detailed description of the structural properties of this protein family, opening the path toward further LEAP structure–function analysis. Finally, since each LEAP class can be clearly characterized by a unique set of physico-chemical properties, this will allow development of software to predict proteins as LEAPs. PMID:22615859

  10. Statistics of initial density perturbations in heavy ion collisions and their fluid dynamic response

    NASA Astrophysics Data System (ADS)

    Floerchinger, Stefan; Wiedemann, Urs Achim

    2014-08-01

    An interesting opportunity to determine thermodynamic and transport properties in more detail is to identify generic statistical properties of initial density perturbations. Here we study event-by-event fluctuations in terms of correlation functions for two models that can be solved analytically. The first assumes Gaussian fluctuations around a distribution that is fixed by the collision geometry but leads to non-Gaussian features after averaging over the reaction plane orientation at non-zero impact parameter. In this context, we derive a three-parameter extension of the commonly used Bessel-Gaussian event-by-event distribution of harmonic flow coefficients. Secondly, we study a model of N independent point sources for which connected n-point correlation functions of initial perturbations scale like 1/N^{n-1}. This scaling is violated for non-central collisions in a way that can be characterized by its impact parameter dependence. We discuss to what extent these are generic properties that can be expected to hold for any model of initial conditions, and how this can improve the fluid dynamical analysis of heavy ion collisions.

  11. Stochastic Partial Differential Equation Solver for Hydroacoustic Modeling: Improvements to Paracousti Sound Propagation Solver

    NASA Astrophysics Data System (ADS)

    Preston, L. A.

    2017-12-01

    Marine hydrokinetic (MHK) devices offer a clean, renewable alternative energy source for the future. Responsible utilization of MHK devices, however, requires that the effects of acoustic noise produced by these devices on marine life and marine-related human activities be well understood. Paracousti is a 3-D full waveform acoustic modeling suite that can accurately propagate MHK noise signals in the complex bathymetry found in the near-shore to open ocean environment and considers real properties of the seabed, water column, and air-surface interface. However, this is a deterministic simulation that assumes the environment and source are exactly known. In reality, environmental and source characteristics are often only known in a statistical sense. Thus, to fully characterize the expected noise levels within the marine environment, this uncertainty in environmental and source factors should be incorporated into the acoustic simulations. One method is to use Monte Carlo (MC) techniques where simulation results from a large number of deterministic solutions are aggregated to provide statistical properties of the output signal. However, MC methods can be computationally prohibitive since they can require tens of thousands or more simulations to build up an accurate representation of those statistical properties. An alternative method, using the technique of stochastic partial differential equations (SPDE), allows computation of the statistical properties of output signals at a small fraction of the computational cost of MC. We are developing a SPDE solver for the 3-D acoustic wave propagation problem called Paracousti-UQ to help regulators and operators assess the statistical properties of environmental noise produced by MHK devices. In this presentation, we present the SPDE method and compare statistical distributions of simulated acoustic signals in simple models to MC simulations to show the accuracy and efficiency of the SPDE method. Sandia National Laboratories is a multimission laboratory managed and operated by National Technology and Engineering Solutions of Sandia LLC, a wholly owned subsidiary of Honeywell International Inc. for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-NA0003525.

  12. An Unbiased Estimator of Gene Diversity with Improved Variance for Samples Containing Related and Inbred Individuals of any Ploidy

    PubMed Central

    Harris, Alexandre M.; DeGiorgio, Michael

    2016-01-01

    Gene diversity, or expected heterozygosity (H), is a common statistic for assessing genetic variation within populations. Estimation of this statistic decreases in accuracy and precision when individuals are related or inbred, due to increased dependence among allele copies in the sample. The original unbiased estimator of expected heterozygosity underestimates true population diversity in samples containing relatives, as it only accounts for sample size. More recently, a general unbiased estimator of expected heterozygosity was developed that explicitly accounts for related and inbred individuals in samples. Though unbiased, this estimator’s variance is greater than that of the original estimator. To address this issue, we introduce a general unbiased estimator of gene diversity for samples containing related or inbred individuals, which employs the best linear unbiased estimator of allele frequencies, rather than the commonly used sample proportion. We examine the properties of this estimator, H̃_BLUE, relative to alternative estimators using simulations and theoretical predictions, and show that it predominantly has the smallest mean squared error relative to others. Further, we empirically assess the performance of H̃_BLUE on a global human microsatellite dataset of 5795 individuals, from 267 populations, genotyped at 645 loci. Additionally, we show that the improved variance of H̃_BLUE leads to improved estimates of the population differentiation statistic, F_ST, which employs measures of gene diversity within its calculation. Finally, we provide an R script, BestHet, to compute this estimator from genomic and pedigree data. PMID:28040781
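
    The baseline being improved upon is Nei's small-sample-corrected gene diversity. Here is a sketch of that standard estimator; the paper's H̃_BLUE additionally replaces the sample allele frequencies with best linear unbiased estimates built from the kinship matrix, which is not reproduced here:

```python
import numpy as np
from collections import Counter

def gene_diversity(alleles):
    """Nei's unbiased expected heterozygosity for n allele copies:
    H = n/(n-1) * (1 - sum p_i^2). Assumes unrelated, non-inbred samples;
    the paper's BLUE-based estimator relaxes exactly that assumption."""
    n = len(alleles)
    freqs = np.array(list(Counter(alleles).values())) / n
    return n / (n - 1) * (1.0 - np.sum(freqs ** 2))

# 20 allele copies at one microsatellite locus (toy data)
sample = list("AAABBBBCCCCCDDDDDDEE")
print(round(gene_diversity(sample), 4))   # ~0.8158
```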

  13. Developing a statistically powerful measure for quartet tree inference using phylogenetic identities and Markov invariants.

    PubMed

    Sumner, Jeremy G; Taylor, Amelia; Holland, Barbara R; Jarvis, Peter D

    2017-12-01

    Recently there has been renewed interest in phylogenetic inference methods based on phylogenetic invariants, alongside the related Markov invariants. Broadly speaking, both these approaches give rise to polynomial functions of sequence site patterns that, in expectation value, either vanish for particular evolutionary trees (in the case of phylogenetic invariants) or have well understood transformation properties (in the case of Markov invariants). While both approaches have been valued for their intrinsic mathematical interest, it is not clear how they relate to each other, and to what extent they can be used as practical tools for inference of phylogenetic trees. In this paper, by focusing on the special case of binary sequence data and quartets of taxa, we are able to view these two different polynomial-based approaches within a common framework. To motivate the discussion, we present three desirable statistical properties that we argue any invariant-based phylogenetic method should satisfy: (1) sensible behaviour under reordering of input sequences; (2) stability as the taxa evolve independently according to a Markov process; and (3) explicit dependence on the assumption of a continuous-time process. Motivated by these statistical properties, we develop and explore several new phylogenetic inference methods. In particular, we develop a statistically bias-corrected version of the Markov invariants approach which satisfies all three properties. We also extend previous work by showing that the phylogenetic invariants can be implemented in such a way as to satisfy property (3). A simulation study shows that, in comparison to other methods, our new proposed approach based on bias-corrected Markov invariants is extremely powerful for phylogenetic inference. The binary case is of particular theoretical interest as, in this case only, the Markov invariants can be expressed as linear combinations of the phylogenetic invariants. A wider implication of this is that, for models with more than two states (for example, DNA sequence alignments with four-state models), we find that methods which rely on phylogenetic invariants are incapable of satisfying all three of the stated statistical properties. This is because in these cases the relevant Markov invariants belong to a class of polynomials independent from the phylogenetic invariants.

  14. Low and High Frequency Models of Response Statistics of a Cylindrical Orthogrid Vehicle Panel to Acoustic Excitation

    NASA Technical Reports Server (NTRS)

    Smith, Andrew; LaVerde, Bruce; Teague, David; Gardner, Bryce; Cotoni, Vincent

    2010-01-01

    This presentation further develops the orthogrid vehicle panel work, employing Hybrid Module capabilities to assess both low/mid-frequency and high-frequency models in the VA One simulation environment. The response estimates from three modeling approaches are compared to ground test measurements: (1) a detailed finite element model of the test article, expected to capture both the global panel modes and the local pocket-mode response, but at considerable analysis expense (time and resources); (2) a composite layered construction equivalent global stiffness approximation using SEA, expected to capture the response of the global panel modes only; and (3) an SEA approximation using the periodic subsystem formulation, in which a finite element model of a single periodic cell is used to derive the vibroacoustic properties of the entire periodic structure (modal density, radiation efficiency, etc.), expected to capture the response at various locations on the panel (on the skin and on the ribs) with less analysis expense.

  15. Rapid Expectation Adaptation during Syntactic Comprehension

    PubMed Central

    Fine, Alex B.; Jaeger, T. Florian; Farmer, Thomas A.; Qian, Ting

    2013-01-01

    When we read or listen to language, we are faced with the challenge of inferring intended messages from noisy input. This challenge is exacerbated by considerable variability between and within speakers. Focusing on syntactic processing (parsing), we test the hypothesis that language comprehenders rapidly adapt to the syntactic statistics of novel linguistic environments (e.g., speakers or genres). Two self-paced reading experiments investigate changes in readers’ syntactic expectations based on repeated exposure to sentences with temporary syntactic ambiguities (so-called “garden path sentences”). These sentences typically lead to a clear expectation violation signature when the temporary ambiguity is resolved to an a priori less expected structure (e.g., based on the statistics of the lexical context). We find that comprehenders rapidly adapt their syntactic expectations to converge towards the local statistics of novel environments. Specifically, repeated exposure to a priori unexpected structures can reduce, and even completely undo, their processing disadvantage (Experiment 1). The opposite is also observed: a priori expected structures become less expected (even eliciting garden paths) in environments where they are hardly ever observed (Experiment 2). Our findings suggest that, when changes in syntactic statistics are to be expected (e.g., when entering a novel environment), comprehenders can rapidly adapt their expectations, thereby overcoming the processing disadvantage that mistaken expectations would otherwise cause. Our findings take a step towards unifying insights from research in expectation-based models of language processing, syntactic priming, and statistical learning. PMID:24204909

  16. What kind of noise is brain noise: anomalous scaling behavior of the resting brain activity fluctuations

    PubMed Central

    Fraiman, Daniel; Chialvo, Dante R.

    2012-01-01

    The study of spontaneous fluctuations of brain activity, often referred to as brain noise, is getting increasing attention in functional magnetic resonance imaging (fMRI) studies. Despite important efforts, many of the statistical properties of such fluctuations remain largely unknown. This work scrutinizes these fluctuations, looking at specific statistical properties which are relevant to clarifying their dynamical origins. Here, three statistical features which clearly differentiate brain data from naive expectations for random processes are uncovered: First, the variance of the fMRI mean signal as a function of the number of averaged voxels remains constant across a wide range of observed cluster sizes. Second, the anomalous behavior of the variance originates in bursts of synchronized activity across regions, regardless of their widely different sizes. Finally, the correlation length (i.e., the length at which the correlation strength between two regions vanishes) as well as the mutual information diverges with the cluster size considered, such that arbitrarily large clusters exhibit the same collective dynamics as smaller ones. These three properties are known to be exclusive of complex systems exhibiting critical dynamics, where the spatio-temporal dynamics show this peculiar type of fluctuation. Thus, these findings are fully consistent with previous reports of brain critical dynamics, and are relevant for the interpretation of the role of fluctuations and variability in brain function in health and disease. PMID:22934058
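
    The first signature, the variance of the mean signal not shrinking as voxels are averaged, is easy to demonstrate: independent voxels give Var(mean) ~ 1/n, while a shared global component puts a floor under it. A toy contrast of the two cases (the mixing weights are arbitrary assumptions):

```python
import numpy as np

rng = np.random.default_rng(10)
t, n_vox = 2000, 1024

common = rng.normal(size=(t, 1))                 # burst-like shared component
independent = rng.normal(size=(t, n_vox))
correlated = 0.7 * common + 0.7 * independent    # voxels sharing global activity

for name, data in [("independent", independent), ("correlated", correlated)]:
    # variance of the cluster-mean time course over n averaged voxels
    out = [data[:, :n].mean(axis=1).var() for n in (1, 4, 16, 64, 256, 1024)]
    print(name, [round(v, 3) for v in out])
# Independent voxels: variance falls ~1/n. Correlated voxels: it plateaus
# near the shared-component variance, the anomalous behaviour reported above.
```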

  17. Analysis of the Einstein sample of early-type galaxies

    NASA Technical Reports Server (NTRS)

    Eskridge, Paul B.; Fabbiano, Giuseppina

    1993-01-01

    The EINSTEIN galaxy catalog contains x-ray data for 148 early-type (E and S0) galaxies. A detailed analysis of the global properties of this sample is presented. By comparing the x-ray properties with other tracers of the ISM, as well as with observables related to the stellar dynamics and populations of the sample, we expect to determine more clearly the physical relationships that determine the evolution of early-type galaxies. Previous studies with smaller samples have explored the relationships between x-ray luminosity (L(sub X)) and luminosities in other bands. Using our larger sample and the statistical techniques of survival analysis, a number of these earlier analyses were repeated. For our full sample, a strong statistical correlation is found between L(sub X) and L(sub B) (the probability that the null hypothesis is upheld is P less than 10(exp -4) from a variety of rank correlation tests). Regressions with several algorithms yield consistent results.

  18. Diffraction based Hanbury Brown and Twiss interferometry at a hard x-ray free-electron laser

    DOE PAGES

    Gorobtsov, O. Yu.; Mukharamova, N.; Lazarev, S.; ...

    2018-02-02

    X-ray free-electron lasers (XFELs) provide extremely bright and highly spatially coherent x-ray radiation with femtosecond pulse duration. Currently, they are widely used in biology and material science. Knowledge of the XFEL statistical properties during an experiment may be vitally important for the accurate interpretation of the results. Here, for the first time, we demonstrate Hanbury Brown and Twiss (HBT) interferometry performed in diffraction mode at an XFEL source. It allowed us to determine the XFEL statistical properties directly from the Bragg peaks originating from colloidal crystals. This approach is different from the traditional one when HBT interferometry is performed in the direct beam without a sample. Our analysis has demonstrated nearly full (80%) global spatial coherence of the XFEL pulses and an average pulse duration on the order of ten femtoseconds for the monochromatized beam, which is significantly shorter than expected from the electron bunch measurements.

  19. Statistical properties of Galactic CMB foregrounds: dust and synchrotron

    NASA Astrophysics Data System (ADS)

    Kandel, D.; Lazarian, A.; Pogosyan, D.

    2018-07-01

    Recent Planck observations have revealed some of the important statistical properties of synchrotron and dust polarization, namely, the ratio of B- to E-mode power and the temperature-E-mode (TE) cross-correlation. In this paper, we extend our analysis in Kandel et al., which studied the B- to E-mode power ratio for polarized dust emission, to include the TE cross-correlation, and we develop an analogous formalism for the synchrotron signal, all using a realistic model of magnetohydrodynamical turbulence. Our results suggest that the Planck results for both synchrotron and dust polarization can be understood if the turbulence in the Galaxy is sufficiently sub-Alfvénic. Making use of the observed poor magnetic field-density correlation, we show that the observed positive TE correlation for dust corresponds to our theoretical expectations. We also show how the B- to E-mode ratio as well as the TE cross-correlation can be used to study media magnetization, compressibility, and the level of density-magnetic field correlation.
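
    On a small flat-sky patch, the B-to-E power ratio invoked above can be computed by rotating the Fourier transforms of the Stokes Q and U maps by twice the wave-vector angle; a minimal sketch on hypothetical Gaussian maps (for which the ratio is close to 1, unlike the roughly 0.5 observed for dust):

      import numpy as np

      rng = np.random.default_rng(2)
      n = 256
      Q = rng.normal(size=(n, n))   # stand-in Stokes Q map
      U = rng.normal(size=(n, n))   # stand-in Stokes U map

      # Flat-sky E/B decomposition: rotate (Q, U) by 2*phi in Fourier space.
      lx = np.fft.fftfreq(n)[None, :]
      ly = np.fft.fftfreq(n)[:, None]
      phi = np.arctan2(ly, lx)
      Qk, Uk = np.fft.fft2(Q), np.fft.fft2(U)
      Ek = Qk * np.cos(2 * phi) + Uk * np.sin(2 * phi)
      Bk = -Qk * np.sin(2 * phi) + Uk * np.cos(2 * phi)

      print("B-to-E power ratio:", np.sum(np.abs(Bk)**2) / np.sum(np.abs(Ek)**2))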

  1. Study of Statistical Variations of Load Spectra and Material Properties on Aircraft Fatigue Life

    DTIC Science & Technology

    1992-09-01

    requirement for the structure of the aircraft would result in decreased weight and increased performance. The range of a levels studied for that... Chapter III. The expected number of g exceedances at each g level for the different a levels was summarized in Table 4. The results were also... Figure 21: Life Remaining for Various a Levels (40 KSI).

  2. Statistical properties of a utility measure of observer performance compared to area under the ROC curve

    NASA Astrophysics Data System (ADS)

    Abbey, Craig K.; Samuelson, Frank W.; Gallas, Brandon D.; Boone, John M.; Niklason, Loren T.

    2013-03-01

    The receiver operating characteristic (ROC) curve has become a common tool for evaluating diagnostic imaging technologies, and the primary endpoint of such evaluations is the area under the curve (AUC), which integrates sensitivity over the entire false positive range. An alternative figure of merit for ROC studies is expected utility (EU), which focuses on the relevant region of the ROC curve as defined by disease prevalence and the relative utility of the task. However, if this measure is to be used, it must also have desirable statistical properties to keep the burden of observer performance studies as low as possible. Here, we evaluate effect size and variability for EU and AUC. We use two observer performance studies recently submitted to the FDA to compare the EU and AUC endpoints. The studies were conducted using the multi-reader multi-case methodology, in which all readers score all cases in all modalities. ROC curves from the study were used to generate both the AUC and EU values for each reader and modality. The EU measure was computed assuming an iso-utility slope of 1.03. We find mean effect sizes, the reader-averaged difference between modalities, to be roughly 2.0 times as large for EU as for AUC. The standard deviation across readers is roughly 1.4 times as large, suggesting better statistical properties for the EU endpoint. In a simple power analysis of paired comparisons across readers, the utility measure required 36% fewer readers on average to achieve 80% statistical power compared to AUC.
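
    To make the two endpoints concrete, the sketch below computes AUC by trapezoidal integration of an empirical ROC curve and EU in one common form, the utility achieved at the best operating point for a given iso-utility slope (here the 1.03 quoted above; the rating data are invented):

      import numpy as np

      rng = np.random.default_rng(3)
      scores_dis = rng.normal(1.2, 1.0, size=200)   # hypothetical diseased-case ratings
      scores_non = rng.normal(0.0, 1.0, size=200)   # hypothetical non-diseased ratings

      thresholds = np.sort(np.concatenate([scores_dis, scores_non]))[::-1]
      tpf = np.array([(scores_dis >= t).mean() for t in thresholds])
      fpf = np.array([(scores_non >= t).mean() for t in thresholds])

      auc = np.trapz(tpf, fpf)          # integrates sensitivity over all FPF
      eu = np.max(tpf - 1.03 * fpf)     # focuses on the optimal operating point
      print(f"AUC = {auc:.3f}, EU = {eu:.3f}")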

  3. From regular text to artistic writing and artworks: Fourier statistics of images with low and high aesthetic appeal

    PubMed Central

    Melmer, Tamara; Amirshahi, Seyed A.; Koch, Michael; Denzler, Joachim; Redies, Christoph

    2013-01-01

    The spatial characteristics of letters and their influence on readability and letter identification have been intensely studied during the last decades. There have been few studies, however, on statistical image properties that reflect more global aspects of text, for example properties that may relate to its aesthetic appeal. It has been shown that natural scenes and a large variety of visual artworks possess a scale-invariant Fourier power spectrum that falls off linearly with increasing frequency in log-log plots. We asked whether images of text share this property. As expected, the Fourier spectrum of images of regular typed or handwritten text is highly anisotropic, i.e., the spectral image properties in vertical, horizontal, and oblique orientations differ. Moreover, the spatial frequency spectra of text images are not scale-invariant in any direction. The decline is shallower in the low-frequency part of the spectrum for text than for aesthetic artworks, whereas, in the high-frequency part, it is steeper. These results indicate that, in general, images of regular text contain less global structure (low spatial frequencies) relative to fine detail (high spatial frequencies) than images of aesthetic artworks. Moreover, we studied images of text with artistic claim (ornate print and calligraphy) and ornamental art. For some measures, these images assume average values intermediate between regular text and aesthetic artworks. Finally, to answer the question of whether the statistical properties measured by us are universal amongst humans or are subject to intercultural differences, we compared images from three different cultural backgrounds (Western, East Asian, and Arabic). Results for different categories (regular text, aesthetic writing, ornamental art, and fine art) were similar across cultures. PMID:23554592
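
    The scale-invariance test amounts to a straight-line fit to the radially averaged Fourier power spectrum in log-log coordinates; a compact sketch for any grayscale image array (a noise image is used as a stand-in):

      import numpy as np

      rng = np.random.default_rng(4)
      img = rng.normal(size=(256, 256))     # replace with a grayscale image array

      power = np.abs(np.fft.fftshift(np.fft.fft2(img)))**2
      y, x = np.indices(power.shape)
      r = np.hypot(x - 128, y - 128).astype(int)   # integer radial-frequency bins

      # Radially averaged spectrum, then the log-log slope; natural scenes and
      # artworks give a slope of roughly -2, while white noise gives roughly 0.
      radial = np.bincount(r.ravel(), weights=power.ravel()) / np.bincount(r.ravel())
      freqs = np.arange(1, 128)
      slope = np.polyfit(np.log(freqs), np.log(radial[1:128]), 1)[0]
      print(f"log-log spectral slope = {slope:.2f}")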

  4. Measuring the supernova unknowns at the next-generation neutrino telescopes through the diffuse neutrino background

    NASA Astrophysics Data System (ADS)

    Møller, Klaes; Suliga, Anna M.; Tamborra, Irene; Denton, Peter B.

    2018-05-01

    The detection of the diffuse supernova neutrino background (DSNB) will provide valuable constraints on the properties of the core-collapse supernova population. We estimate the DSNB event rate in the next-generation neutrino detectors, Hyper-Kamiokande enriched with Gadolinium, JUNO, and DUNE. The determination of the supernova unknowns through the DSNB will be heavily driven by Hyper-Kamiokande, given its higher expected event rate, and complemented by DUNE, which will help in reducing the parameter uncertainties. Meanwhile, JUNO will be sensitive to the DSNB signal over the largest energy range. A joint statistical analysis of the expected rates in 20 years of data taking from the above detectors suggests that we will be sensitive to the local supernova rate at most at a 20-33% level. A non-zero fraction of supernovae forming black holes will be confirmed at a 90% CL if the true value of that fraction is ≳20%. On the other hand, the DSNB events show extremely poor statistical sensitivity to the nuclear equation of state and the mass accretion rate of the progenitors forming black holes.

  5. Problem Based Learning and the scientific process

    NASA Astrophysics Data System (ADS)

    Schuchardt, Daniel Shaner

    This research project was developed to inspire students to constructively use problem based learning and the scientific process to learn middle school science content. The student population in this study consisted of male and female seventh grade students. Students were presented with authentic problems connected to physical and chemical properties of matter. The intent of the study was to have students use the scientific process of looking at existing knowledge, generating learning issues or questions about the problems, and then developing a course of action to research and design experiments to model resolutions to the authentic problems. It was expected that students would improve their ability to actively engage with others in a problem solving process to achieve a deeper understanding of Michigan's 7th Grade Level Content Expectations, the Next Generation Science Standards, and a scientific process. Problem based learning was statistically effective in students' learning of the scientific process: students showed statistically significant improvement from pretest to posttest scores. The teaching method of problem based learning was effective for seventh grade science students at Dowagiac Middle School.

  6. Exploring Factors Related to Completion of an Online Undergraduate-Level Introductory Statistics Course

    ERIC Educational Resources Information Center

    Zimmerman, Whitney Alicia; Johnson, Glenn

    2017-01-01

    Data were collected from 353 online undergraduate introductory statistics students at the beginning of a semester using the Goals and Outcomes Associated with Learning Statistics (GOALS) instrument and an abbreviated form of the Statistics Anxiety Rating Scale (STARS). Data included a survey of expected grade, expected time commitment, and the…

  7. The statistical distribution of aerosol properties in southern West Africa

    NASA Astrophysics Data System (ADS)

    Haslett, Sophie; Taylor, Jonathan; Flynn, Michael; Bower, Keith; Dorsey, James; Crawford, Ian; Brito, Joel; Denjean, Cyrielle; Bourrianne, Thierry; Burnet, Frederic; Batenburg, Anneke; Schulz, Christiane; Schneider, Johannes; Borrmann, Stephan; Sauer, Daniel; Duplissy, Jonathan; Lee, James; Vaughan, Adam; Coe, Hugh

    2017-04-01

    The population and economy in southern West Africa have been growing at an exceptional rate in recent years and this trend is expected to continue, with the population projected to more than double to 800 million by 2050. This will result in a dramatic increase in anthropogenic pollutants, already estimated to have tripled between 1950 and 2000 (Lamarque et al., 2010). It is known that aerosols can modify the radiative properties of clouds. As such, the entrainment of anthropogenic aerosol into the large banks of clouds forming during the onset of the West African Monsoon could have a substantial impact on the region's response to climate change. Such projections, however, are greatly limited by the scarcity of observations in this part of the world. As part of the Dynamics-Aerosol-Chemistry-Cloud Interactions in West Africa (DACCIWA) project, three research aircraft were deployed, each carrying equipment capable of measuring aerosol properties in-situ. Instrumentation included Aerosol Mass Spectrometers (AMS), Single Particle Soot Photometers (SP2), Condensation Particle Counters (CPC) and Scanning Mobility Particle Sizers (SMPS). Throughout the intensive aircraft campaign, 155 hours of scientific flights covered an area including large parts of Benin, Togo, Ghana and parts of Côte D'Ivoire. Approximately 70 hours were dedicated to the measurement of cloud-aerosol interactions, with many other flights producing data contributing towards this objective. Using datasets collected during this campaign period, it is possible to build a robust statistical understanding of aerosol properties in this region for the first time, including size distributions and optical and chemical properties. Here, we describe preliminary results from aerosol measurements on board the three aircraft. These have been used to describe aerosol properties throughout the region and time period encompassed by the DACCIWA aircraft campaign. Such statistics will be invaluable for improving future projections of cloud properties and radiative effects in the region.

  8. Predictive uncertainty in auditory sequence processing

    PubMed Central

    Hansen, Niels Chr.; Pearce, Marcus T.

    2014-01-01

    Previous studies of auditory expectation have focused on the expectedness perceived by listeners retrospectively in response to events. In contrast, this research examines predictive uncertainty—a property of listeners' prospective state of expectation prior to the onset of an event. We examine the information-theoretic concept of Shannon entropy as a model of predictive uncertainty in music cognition. This is motivated by the Statistical Learning Hypothesis, which proposes that schematic expectations reflect probabilistic relationships between sensory events learned implicitly through exposure. Using probability estimates from an unsupervised, variable-order Markov model, 12 melodic contexts high in entropy and 12 melodic contexts low in entropy were selected from two musical repertoires differing in structural complexity (simple and complex). Musicians and non-musicians listened to the stimuli and provided explicit judgments of perceived uncertainty (explicit uncertainty). We also examined an indirect measure of uncertainty computed as the entropy of expectedness distributions obtained using a classical probe-tone paradigm where listeners rated the perceived expectedness of the final note in a melodic sequence (inferred uncertainty). Finally, we simulate listeners' perception of expectedness and uncertainty using computational models of auditory expectation. A detailed model comparison indicates which model parameters maximize fit to the data and how they compare to existing models in the literature. The results show that listeners experience greater uncertainty in high-entropy musical contexts than low-entropy contexts. This effect is particularly apparent for inferred uncertainty and is stronger in musicians than non-musicians. Consistent with the Statistical Learning Hypothesis, the results suggest that increased domain-relevant training is associated with an increasingly accurate cognitive model of probabilistic structure in music. PMID:25295018
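
    Predictive uncertainty in this sense is simply the Shannon entropy of the model's next-event distribution; a minimal sketch with hypothetical distributions over twelve chromatic pitch classes:

      import numpy as np

      def shannon_entropy(p):
          """Entropy in bits of a discrete probability distribution."""
          p = np.asarray(p, dtype=float)
          p = p[p > 0]                      # treat 0 * log(0) as 0
          return -np.sum(p * np.log2(p))

      flat = np.full(12, 1 / 12)                       # maximally uncertain context
      peaked = np.array([0.85] + [0.15 / 11] * 11)     # one strongly expected note
      print(shannon_entropy(flat))     # ~3.58 bits: high predictive uncertainty
      print(shannon_entropy(peaked))   # ~1.13 bits: low predictive uncertainty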

  9. The Affective Impact of Financial Skewness on Neural Activity and Choice

    PubMed Central

    Wu, Charlene C.; Bossaerts, Peter; Knutson, Brian

    2011-01-01

    Few finance theories consider the influence of “skewness” (or large and asymmetric but unlikely outcomes) on financial choice. We investigated the impact of skewed gambles on subjects' neural activity, self-reported affective responses, and subsequent preferences using functional magnetic resonance imaging (FMRI). Neurally, skewed gambles elicited more anterior insula activation than symmetric gambles equated for expected value and variance, and positively skewed gambles also specifically elicited more nucleus accumbens (NAcc) activation than negatively skewed gambles. Affectively, positively skewed gambles elicited more positive arousal and negatively skewed gambles elicited more negative arousal than symmetric gambles equated for expected value and variance. Subjects also preferred positively skewed gambles more, but negatively skewed gambles less than symmetric gambles of equal expected value. Individual differences in both NAcc activity and positive arousal predicted preferences for positively skewed gambles. These findings support an anticipatory affect account in which statistical properties of gambles—including skewness—can influence neural activity, affective responses, and ultimately, choice. PMID:21347239

  10. On the Least-Squares Fitting of Correlated Data: a Priori vs a Posteriori Weighting

    NASA Astrophysics Data System (ADS)

    Tellinghuisen, Joel

    1996-10-01

    One of the methods in common use for analyzing large data sets is a two-step procedure, in which subsets of the full data are first least-squares fitted to a preliminary set of parameters, and the latter are subsequently merged to yield the final parameters. The second step of this procedure is properly a correlated least-squares fit and requires the variance-covariance matrices from the first step to construct the weight matrix for the merge. There is, however, an ambiguity concerning the manner in which the first-step variance-covariance matrices are assessed, which leads to different statistical properties for the quantities determined in the merge. The issue is one of a priori vs a posteriori assessment of weights, which is an application of what was originally called internal vs external consistency by Birge [Phys. Rev. 40, 207-227 (1932)] and Deming ("Statistical Adjustment of Data." Dover, New York, 1964). In the present work the simplest case of a merge fit, that of an average as obtained from a global fit vs a two-step fit of partitioned data, is used to illustrate that only in the case of a priori weighting do the results have the usually expected and desired statistical properties: normal distributions for residuals, t distributions for parameters assessed a posteriori, and χ² distributions for variances.
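
    The distinction is easiest to see for the merge of subset averages: a priori weights come from the assumed data variance, while a posteriori weights rescale each subset's variance estimate by its own scatter. A small sketch contrasting the two (data invented):

      import numpy as np

      rng = np.random.default_rng(5)
      sigma = 1.0                                     # assumed (a priori) data sigma
      subsets = [rng.normal(10.0, sigma, size=5) for _ in range(8)]

      means = np.array([s.mean() for s in subsets])
      var_prior = np.array([sigma**2 / len(s) for s in subsets])       # a priori
      var_post = np.array([s.var(ddof=1) / len(s) for s in subsets])   # a posteriori

      # Only the a priori version is expected to reproduce the clean normal,
      # t, and chi-square sampling distributions discussed above.
      for name, var in (("a priori", var_prior), ("a posteriori", var_post)):
          w = 1.0 / var
          print(name, "merged mean:", np.sum(w * means) / np.sum(w))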

  11. The importance of topographically corrected null models for analyzing ecological point processes.

    PubMed

    McDowall, Philip; Lynch, Heather J

    2017-07-01

    Analyses of point process patterns and related techniques (e.g., MaxEnt) make use of the expected number of occurrences per unit area and second-order statistics based on the distance between occurrences. Ecologists working with point process data often assume that points exist on a two-dimensional x-y plane or within a three-dimensional volume, when in fact many observed point patterns are generated on a two-dimensional surface existing within three-dimensional space. For many surfaces, however, such as the topography of landscapes, the projection from the surface to the x-y plane preserves neither area nor distance. As such, when these point patterns are implicitly projected to and analyzed in the x-y plane, our expectations of the point pattern's statistical properties may not be met. When used in hypothesis testing, we find that the failure to account for the topography of the generating surface may bias statistical tests that incorrectly identify clustering and, furthermore, may bias coefficients in inhomogeneous point process models that incorporate slope as a covariate. We demonstrate the circumstances under which this bias is significant, and present simple methods that allow point processes to be simulated with corrections for topography. These point patterns can then be used to generate "topographically corrected" null models against which observed point processes can be compared. © 2017 by the Ecological Society of America.
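
    A topographically corrected null model can be simulated by placing points with probability proportional to true surface area rather than projected area, i.e., weighting each grid cell of an elevation model by sqrt(1 + |grad z|^2); a sketch with a toy elevation grid:

      import numpy as np

      rng = np.random.default_rng(6)
      z = rng.normal(0.0, 1.0, size=(100, 100)).cumsum(0).cumsum(1) * 0.01  # toy DEM

      dzdy, dzdx = np.gradient(z)                   # slopes along rows and columns
      area = np.sqrt(1.0 + dzdx**2 + dzdy**2)       # true surface area per unit cell

      # Complete spatial randomness *on the surface*: sample cells by area weight.
      p = (area / area.sum()).ravel()
      cells = rng.choice(area.size, size=500, p=p)
      rows, cols = np.unravel_index(cells, area.shape)  # corrected null point pattern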

  12. Developing Human-Machine Interfaces to Support Appropriate Trust and Reliance on Automated Combat Identification Systems (Developpement d’Interfaces Homme-Machine Pour Appuyer la Confiance dans les Systemes Automatises d’Identification au Combat)

    DTIC Science & Technology

    2008-03-31

    on automation; the ‘response bias’ approach. This new approach is based on Signal Detection Theory (SDT) (Macmillan & Creelman, 1991; Wickens... SDT), response bias will vary with the expectation of the target probability, whereas their sensitivity will stay constant (Macmillan & Creelman... measures, C has the simplest statistical properties (Macmillan & Creelman, 1991, p. 273), and it was also the measure used in Dzindolet et al.’s study

  13. Statistics of the geomagnetic secular variation for the past 5 Ma

    NASA Technical Reports Server (NTRS)

    Constable, C. G.; Parker, R. L.

    1986-01-01

    A new statistical model is proposed for the geomagnetic secular variation over the past 5Ma. Unlike previous models, the model makes use of statistical characteristics of the present day geomagnetic field. The spatial power spectrum of the non-dipole field is consistent with a white source near the core-mantle boundary with Gaussian distribution. After a suitable scaling, the spherical harmonic coefficients may be regarded as statistical samples from a single giant Gaussian process; this is the model of the non-dipole field. The model can be combined with an arbitrary statistical description of the dipole and probability density functions and cumulative distribution functions can be computed for declination and inclination that would be observed at any site on Earth's surface. Global paleomagnetic data spanning the past 5Ma are used to constrain the statistics of the dipole part of the field. A simple model is found to be consistent with the available data. An advantage of specifying the model in terms of the spherical harmonic coefficients is that it is a complete statistical description of the geomagnetic field, enabling us to test specific properties for a general description. Both intensity and directional data distributions may be tested to see if they satisfy the expected model distributions.
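
    A toy version of the "giant Gaussian process" idea can be written in a few lines: draw each spherical-harmonic coefficient from a zero-mean normal whose degree-dependent variance is scaled so that the expected Lowes power spectrum is flat (white) at the core radius. The scaling below is an illustrative assumption, not the paper's fitted model:

      import numpy as np

      rng = np.random.default_rng(7)
      a, c = 6371.0, 3485.0      # Earth and core radii, km
      K = 1.0e9                  # arbitrary power level at the core surface

      for l in range(1, 9):
          n_coeff = 2 * l + 1
          # Variance chosen so E[R_l] = K when the spectrum is continued to r = c.
          var = K * (c / a) ** (2 * l + 4) / ((l + 1) * n_coeff)
          g = rng.normal(0.0, np.sqrt(var), size=n_coeff)
          R_l_cmb = (l + 1) * (a / c) ** (2 * l + 4) * np.sum(g**2)
          print(l, round(R_l_cmb / K, 2))   # scatters around 1, i.e. a white source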

  14. Randomizing Genome-Scale Metabolic Networks

    PubMed Central

    Samal, Areejit; Martin, Olivier C.

    2011-01-01

    Networks coming from protein-protein interactions, transcriptional regulation, signaling, or metabolism may appear to have “unusual” properties. To quantify this, it is appropriate to randomize the network and test the hypothesis that the network is not statistically different from expected in a motivated ensemble. However, when dealing with metabolic networks, the randomization of the network using edge exchange generates fictitious reactions that are biochemically meaningless. Here we provide several natural ensembles of randomized metabolic networks. A first constraint is to use valid biochemical reactions. Further constraints correspond to imposing appropriate functional constraints. We explain how to perform these randomizations with the help of Markov Chain Monte Carlo (MCMC) and show that they allow one to approach the properties of biological metabolic networks. The implication of the present work is that the observed global structural properties of real metabolic networks are likely to be the consequence of simple biochemical and functional constraints. PMID:21779409
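
    The constrained randomization can be phrased as a Markov chain over reaction sets: repeatedly propose swapping one reaction for another drawn from the universe of valid biochemical reactions, and accept the move only if the functional constraint still holds. A schematic sketch in which the constraint check is a hypothetical placeholder (e.g., a flux-balance biomass test):

      import random

      def randomize_network(network, universe, is_functional, n_steps=10000):
          """MCMC-style randomization preserving validity and function.

          network       -- set of reaction identifiers (the starting network)
          universe      -- list of all valid biochemical reactions
          is_functional -- hypothetical predicate enforcing the constraint
          """
          current = set(network)
          for _ in range(n_steps):
              out_rxn = random.choice(sorted(current))
              in_rxn = random.choice(universe)
              if in_rxn in current:
                  continue
              proposal = (current - {out_rxn}) | {in_rxn}
              if is_functional(proposal):   # reject moves that break function
                  current = proposal
          return current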

  15. Predicting spiral wave patterns from cell properties in a model of biological self-organization.

    PubMed

    Geberth, Daniel; Hütt, Marc-Thorsten

    2008-09-01

    In many biological systems, biological variability (i.e., systematic differences between the system components) can be expected to outrank statistical fluctuations in the shaping of self-organized patterns. In principle, the distribution of single-element properties should thus allow predicting features of such patterns. For a mathematical model of a paradigmatic and well-studied pattern formation process, spiral waves of cAMP signaling in colonies of the slime mold Dictyostelium discoideum, we explore this possibility and observe a pronounced anticorrelation between spiral waves and cell properties (namely, the firing rate) and particularly a clustering of spiral wave tips in regions devoid of spontaneously firing (pacemaker) cells. Furthermore, we observe local inhomogeneities in the distribution of spiral chiralities, again induced by the pacemaker distribution. We show that these findings can be explained by a simple geometrical model of spiral wave generation.

  16. Nondeterministic data base for computerized visual perception

    NASA Technical Reports Server (NTRS)

    Yakimovsky, Y.

    1976-01-01

    A description is given of the knowledge representation data base in the perception subsystem of the Mars robot vehicle prototype. Two types of information are stored. The first is generic information that represents general rules that are conformed to by structures in the expected environments. The second kind of information is a specific description of a structure, i.e., the properties and relations of objects in the specific case being analyzed. The generic knowledge is represented so that it can be applied to extract and infer the description of specific structures. The generic model of the rules is substantially a Bayesian representation of the statistics of the environment, which means it is geared to representation of nondeterministic rules relating properties of, and relations between, objects. The description of a specific structure is also nondeterministic in the sense that all properties and relations may take a range of values with an associated probability distribution.

  17. Scattering and transport statistics at the metal-insulator transition: A numerical study of the power-law banded random-matrix model

    NASA Astrophysics Data System (ADS)

    Méndez-Bermúdez, J. A.; Gopar, Victor A.; Varga, Imre

    2010-09-01

    We study numerically the scattering and transport statistical properties of the one-dimensional Anderson model at the metal-insulator transition described by the power-law banded random matrix (PBRM) model at criticality. Within a scattering approach to electronic transport, we concentrate on the case of a small number of single-channel attached leads. We observe a smooth crossover from localized to delocalized behavior in the average-scattering matrix elements, the conductance probability distribution, the variance of the conductance, and the shot noise power by varying b (the effective bandwidth of the PBRM model) from small (b≪1) to large (b>1) values. We contrast our results with analytic random matrix theory predictions which are expected to be recovered in the limit b→∞. We also compare our results for the PBRM model with those for the three-dimensional (3D) Anderson model at criticality, finding that the PBRM model with b ∈ [0.2,0.4] reproduces well the scattering and transport properties of the 3D Anderson model.

  18. Statistical and dynamical properties of a dissipative kicked rotator

    NASA Astrophysics Data System (ADS)

    Oliveira, Diego F. M.; Leonel, Edson D.

    2014-11-01

    Some dynamical and statistical properties of a conservative as well as a dissipative problem of relativistic particles in a waveguide are considered. For the first time, two different types of dissipation, namely (i) due to viscosity and (ii) due to inelastic collision (upon the kick), are considered individually and acting together. For the first case, and contrary to what is expected for the original Zaslavsky's relativistic model, we show there is a critical parameter where a transition from local to global chaos occurs. On the other hand, after considering the introduction of dissipation also on the kick, the structure of the phase space changes in the sense that chaotic and periodic attractors appear. We also study the chaotic sea by using scaling arguments and we propose an analytical argument to reinforce the validity of the scaling exponents obtained numerically. In principle such an approach can be extended to any two-dimensional map. Finally, based on the Lyapunov exponent, we show that the parameter space exhibits infinite families of self-similar shrimp-shaped structures, corresponding to periodic attractors, embedded in a large region corresponding to chaotic attractors.

  1. Probabilities and statistics for backscatter estimates obtained by a scatterometer

    NASA Technical Reports Server (NTRS)

    Pierson, Willard J., Jr.

    1989-01-01

    Methods for the recovery of winds near the surface of the ocean from measurements of the normalized radar backscattering cross section must recognize and make use of the statistics (i.e., the sampling variability) of the backscatter measurements. Radar backscatter values from a scatterometer are random variables with expected values given by a model. A model relates backscatter to properties of the waves on the ocean, which are in turn generated by the winds in the atmospheric marine boundary layer. The effective wind speed and direction at a known height for a neutrally stratified atmosphere are the values to be recovered from the model. The probability density function for the backscatter values is a normal probability distribution with the notable feature that the variance is a known function of the expected value. The sources of signal variability, the effects of this variability on the wind speed estimation, and criteria for the acceptance or rejection of models are discussed. A modified maximum likelihood method for estimating wind vectors is described. Ways to make corrections for the kinds of errors found for the Seasat SASS model function are described, and applications to a new scatterometer are given.
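
    The distinctive feature of the estimation problem is that the variance is a known function of the expected value, so the Gaussian log-likelihood keeps its log-variance term and the maximum likelihood wind speed is not a simple least-squares solution. A toy sketch with an invented one-parameter model function (real model functions also depend on incidence angle and wind direction):

      import numpy as np
      from scipy.optimize import minimize_scalar

      def model_sigma0(wind_speed):
          return 1e-3 * wind_speed**1.8      # hypothetical power-law model function

      def variance(mean, kp=0.1):
          return (kp * mean) ** 2            # variance as a known function of the mean

      def neg_log_likelihood(wind_speed, obs):
          mu = model_sigma0(wind_speed)
          var = variance(mu)
          return np.sum(0.5 * np.log(2 * np.pi * var) + (obs - mu) ** 2 / (2 * var))

      rng = np.random.default_rng(8)
      true_mu = model_sigma0(12.0)
      obs = rng.normal(true_mu, np.sqrt(variance(true_mu)), size=4)
      fit = minimize_scalar(neg_log_likelihood, bounds=(1.0, 40.0),
                            args=(obs,), method="bounded")
      print(f"recovered wind speed = {fit.x:.1f} m/s")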

  2. Confidence intervals for expected moments algorithm flood quantile estimates

    USGS Publications Warehouse

    Cohn, Timothy A.; Lane, William L.; Stedinger, Jery R.

    2001-01-01

    Historical and paleoflood information can substantially improve flood frequency estimates if appropriate statistical procedures are properly applied. However, the Federal guidelines for flood frequency analysis, set forth in Bulletin 17B, rely on an inefficient “weighting” procedure that fails to take advantage of historical and paleoflood information. This has led researchers to propose several more efficient alternatives including the Expected Moments Algorithm (EMA), which is attractive because it retains Bulletin 17B's statistical structure (method of moments with the Log Pearson Type 3 distribution) and thus can be easily integrated into flood analyses employing the rest of the Bulletin 17B approach. The practical utility of EMA, however, has been limited because no closed‐form method has been available for quantifying the uncertainty of EMA‐based flood quantile estimates. This paper addresses that concern by providing analytical expressions for the asymptotic variance of EMA flood‐quantile estimators and confidence intervals for flood quantile estimates. Monte Carlo simulations demonstrate the properties of such confidence intervals for sites where a 25‐ to 100‐year streamgage record is augmented by 50 to 150 years of historical information. The experiments show that the confidence intervals, though not exact, should be acceptable for most purposes.

  3. Open Source Tools for Seismicity Analysis

    NASA Astrophysics Data System (ADS)

    Powers, P.

    2010-12-01

    The spatio-temporal analysis of seismicity plays an important role in earthquake forecasting and is integral to research on earthquake interactions and triggering. For instance, the third version of the Uniform California Earthquake Rupture Forecast (UCERF), currently under development, will use Epidemic Type Aftershock Sequences (ETAS) as a model for earthquake triggering. UCERF will be a "living" model and therefore requires robust, tested, and well-documented ETAS algorithms to ensure transparency and reproducibility. Likewise, as earthquake aftershock sequences unfold, real-time access to high quality hypocenter data makes it possible to monitor the temporal variability of statistical properties such as the parameters of the Omori law and the Gutenberg-Richter b-value. Such statistical properties are valuable as they provide a measure of how much a particular sequence deviates from expected behavior and can be used when assigning probabilities of aftershock occurrence. To address these demands and provide public access to standard methods employed in statistical seismology, we present well-documented, open-source JavaScript and Java software libraries for the on- and off-line analysis of seismicity. The JavaScript classes facilitate web-based asynchronous access to earthquake catalog data and provide a framework for in-browser display, analysis, and manipulation of catalog statistics; implementations of this framework will be made available on the USGS Earthquake Hazards website. The Java classes, in addition to providing tools for seismicity analysis, provide tools for modeling seismicity and generating synthetic catalogs. These tools are extensible and will be released as part of the open-source OpenSHA Commons library.

  4. Using Relative Statistics and Approximate Disease Prevalence to Compare Screening Tests.

    PubMed

    Samuelson, Frank; Abbey, Craig

    2016-11-01

    Schatzkin et al. and other authors demonstrated that the ratios of some conditional statistics such as the true positive fraction are equal to the ratios of unconditional statistics, such as disease detection rates, and therefore we can calculate these ratios between two screening tests on the same population even if negative test patients are not followed with a reference procedure and the true and false negative rates are unknown. We demonstrate that this same property applies to an expected utility metric. We also demonstrate that while simple estimates of relative specificities and relative areas under ROC curves (AUC) do depend on the unknown negative rates, we can write these ratios in terms of disease prevalence, and the dependence of these ratios on a posited prevalence is often weak particularly if that prevalence is small or the performance of the two screening tests is similar. Therefore we can estimate relative specificity or AUC with little loss of accuracy, if we use an approximate value of disease prevalence.
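
    The algebra behind the prevalence step is short: with no follow-up of test negatives, the false positive fraction can still be written from the observed call rate, the observed detection rate, and a posited prevalence, FPF = (call rate - detection rate) / (1 - prevalence). The sketch below (rates invented) shows how weakly a relative specificity depends on the posited prevalence when that prevalence is small:

      def specificity(call_rate, detection_rate, prevalence):
          """Specificity from observable rates plus a posited prevalence."""
          fpf = (call_rate - detection_rate) / (1.0 - prevalence)
          return 1.0 - fpf

      # Hypothetical call and detection rates for two screening tests.
      for prev in (0.005, 0.01, 0.02):
          rel = specificity(0.10, 0.0040, prev) / specificity(0.12, 0.0045, prev)
          print(f"posited prevalence {prev:.3f}: relative specificity = {rel:.4f}")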

  5. Canonical Statistical Model for Maximum Expected Immission of Wire Conductor in an Aperture Enclosure

    NASA Technical Reports Server (NTRS)

    Bremner, Paul G.; Vazquez, Gabriel; Christiano, Daniel J.; Trout, Dawn H.

    2016-01-01

    Prediction of the maximum expected electromagnetic pick-up of conductors inside a realistic shielding enclosure is an important canonical problem for system-level EMC design of spacecraft, launch vehicles, aircraft and automobiles. This paper introduces a simple statistical power balance model for prediction of the maximum expected current in a wire conductor inside an aperture enclosure. It calculates both the statistical mean and variance of the immission from the physical design parameters of the problem. Familiar probability density functions can then be used to predict the maximum expected immission for design purposes. The statistical power balance model requires minimal EMC design information and solves orders of magnitude faster than existing numerical models, making it ultimately viable for scaled-up, full system-level modeling. Both experimental test results and full wave simulation results are used to validate the foundational model.
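
    The final step of such a model, turning the statistical mean and variance of the induced current into a design maximum, can be sketched directly: pick a familiar distribution family, match its moments, and read off a high quantile. A lognormal form is assumed here purely for illustration, with invented numbers:

      import numpy as np
      from scipy import stats

      mean_i, var_i = 2.0e-3, 1.0e-6    # hypothetical mean (A) and variance (A^2)

      # Moment-match a lognormal distribution to the power-balance statistics.
      sigma2 = np.log(1.0 + var_i / mean_i**2)
      mu = np.log(mean_i) - 0.5 * sigma2
      i_max = stats.lognorm.ppf(0.99, s=np.sqrt(sigma2), scale=np.exp(mu))
      print(f"99th-percentile design current = {i_max * 1e3:.2f} mA")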

  6. Using the Expectancy Value Model of Motivation to Understand the Relationship between Student Attitudes and Achievement in Statistics

    ERIC Educational Resources Information Center

    Hood, Michelle; Creed, Peter A.; Neumann, David L.

    2012-01-01

    We tested a model of the relationship between attitudes toward statistics and achievement based on Eccles' Expectancy Value Model (1983). Participants (n = 149; 83% female) were second-year Australian university students in a psychology statistics course (mean age = 23.36 years, SD = 7.94 years). We obtained demographic details, past performance,…

  7. Accuracy of Orthognathic Surgical Outcomes Using 2- and 3-Dimensional Landmarks-The Case for Apples and Oranges?

    PubMed

    Borba, Alexandre Meireles; José da Silva, Everton; Fernandes da Silva, André Luis; Han, Michael D; da Graça Naclério-Homem, Maria; Miloro, Michael

    2018-01-12

    To verify predicted versus obtained surgical movements in 2-dimensional (2D) and 3-dimensional (3D) measurements and compare the equivalence between these methods. A retrospective observational study of bimaxillary orthognathic surgeries was performed. Postoperative cone-beam computed tomographic (CBCT) scans were superimposed on preoperative scans and a lateral cephalometric radiograph was generated from each CBCT scan. After identification of the sella, nasion, and upper central incisor tip landmarks on 2D and 3D images, actual and planned movements were compared by cephalometric measurements. One-sample t test was used to statistically evaluate results, with expected mean discrepancy values ranging from 0 to 2 mm. Equivalence of 2D and 3D values was compared using paired t test. The final sample of 46 cases showed by 2D cephalometry that differences between actual and planned movements in the horizontal axis were statistically relevant for expected means of 0, 0.5, and 2 mm without relevance for expected means of 1 and 1.5 mm; vertical movements were statistically relevant for expected means of 0 and 0.5 mm without relevance for expected means of 1, 1.5, and 2 mm. For 3D cephalometry in the horizontal axis, there were statistically relevant differences for expected means of 0, 1.5, and 2 mm without relevance for expected means of 0.5 and 1 mm; vertical movements showed statistically relevant differences for expected means of 0, 0.5, 1.5 and 2 mm without relevance for the expected mean of 1 mm. Comparison of 2D and 3D values displayed statistical differences for the horizontal and vertical axes. Comparison of 2D and 3D surgical outcome assessments should be performed with caution because there seems to be a difference in acceptable levels of accuracy between these 2 methods of evaluation. Moreover, 3D accuracy studies should no longer rely on a 2-mm level of discrepancy but on a 1-mm level. Copyright © 2018 American Association of Oral and Maxillofacial Surgeons. Published by Elsevier Inc. All rights reserved.

  8. How often should we expect to be wrong? Statistical power, P values, and the expected prevalence of false discoveries.

    PubMed

    Marino, Michael J

    2018-05-01

    There is a clear perception in the literature that there is a crisis in reproducibility in the biomedical sciences. Many underlying factors contributing to the prevalence of irreproducible results have been highlighted with a focus on poor design and execution of experiments along with the misuse of statistics. While these factors certainly contribute to irreproducibility, relatively little attention outside of the specialized statistical literature has focused on the expected prevalence of false discoveries under idealized circumstances. In other words, when everything is done correctly, how often should we expect to be wrong? Using a simple simulation of an idealized experiment, it is possible to show the central role of sample size and the related quantity of statistical power in determining the false discovery rate, and in accurate estimation of effect size. According to our calculations, based on current practice many subfields of biomedical science may expect their discoveries to be false at least 25% of the time, and the only viable course to correct this is to require the reporting of statistical power and a minimum of 80% power (1 - β = 0.80) for all studies. Copyright © 2017 Elsevier Inc. All rights reserved.
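
    Both the closed form and the idealized simulation are easy to reproduce; in the sketch below (parameters chosen for illustration) the false discovery rate among significant results follows FDR = alpha(1 - pi) / (alpha(1 - pi) + power * pi), where pi is the fraction of tested hypotheses that are genuinely true:

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(9)
      alpha, pi, d, n = 0.05, 0.25, 0.8, 26   # n = 26/group gives ~80% power at d = 0.8

      discoveries = false_discoveries = 0
      for _ in range(20000):
          is_real = rng.random() < pi
          b_mean = d if is_real else 0.0
          a = rng.normal(0.0, 1.0, n)
          b = rng.normal(b_mean, 1.0, n)
          if stats.ttest_ind(a, b).pvalue < alpha:
              discoveries += 1
              false_discoveries += not is_real

      print("simulated FDR:", false_discoveries / discoveries)
      print("analytic FDR:", alpha * (1 - pi) / (alpha * (1 - pi) + 0.80 * pi))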

  9. Enhanced backscattering through a deep random phase screen

    NASA Astrophysics Data System (ADS)

    Jakeman, E.

    1988-10-01

    The statistical properties of radiation scattered by a system consisting of a plane mirror placed in the Fresnel region behind a smoothly varying deep random-phase screen with off-axis beam illumination are studied. It is found that two mechanisms cause enhanced scattering around the backward direction, according to the mirror position with respect to the focusing plane of the screen. In all of the plane mirror geometries considered, the scattered field remains a complex Gaussian process with a spatial coherence function identical to that expected for a single screen, and a speckle size smaller than the width of backscatter enhancement.

  10. Duality and topology

    NASA Astrophysics Data System (ADS)

    Sacramento, P. D.; Vieira, V. R.

    2018-04-01

    Mappings between models may be obtained by unitary transformations with preservation of the spectra but in general a change in the states. Non-canonical transformations in general also change the statistics of the operators involved. In these cases one may expect a change of topological properties as a consequence of the mapping. Here we consider some dualities resulting from mappings, by systematically using a Majorana fermion representation of spin and fermionic problems. We focus on the change of topological invariants that results from unitary transformations taking as examples the mapping between a spin system and a topological superconductor, and between different fermionic systems.

  11. Discriminative Random Field Models for Subsurface Contamination Uncertainty Quantification

    NASA Astrophysics Data System (ADS)

    Arshadi, M.; Abriola, L. M.; Miller, E. L.; De Paolis Kaluza, C.

    2017-12-01

    Application of flow and transport simulators for prediction of the release, entrapment, and persistence of dense non-aqueous phase liquids (DNAPLs) and associated contaminant plumes is a computationally intensive process that requires specification of a large number of material properties and hydrologic/chemical parameters. Given its computational burden, this direct simulation approach is particularly ill-suited for quantifying both the expected performance and uncertainty associated with candidate remediation strategies under real field conditions. Prediction uncertainties primarily arise from limited information about contaminant mass distributions, as well as the spatial distribution of subsurface hydrologic properties. Application of direct simulation to quantify uncertainty would, thus, typically require simulating multiphase flow and transport for a large number of permeability and release scenarios to collect statistics associated with remedial effectiveness, a computationally prohibitive process. The primary objective of this work is to develop and demonstrate a methodology that employs measured field data to produce equi-probable stochastic representations of a subsurface source zone that capture the spatial distribution and uncertainty associated with key features that control remediation performance (i.e., permeability and contamination mass). Here we employ probabilistic models known as discriminative random fields (DRFs) to synthesize stochastic realizations of initial mass distributions consistent with known, and typically limited, site characterization data. Using a limited number of full scale simulations as training data, a statistical model is developed for predicting the distribution of contaminant mass (e.g., DNAPL saturation and aqueous concentration) across a heterogeneous domain. Monte-Carlo sampling methods are then employed, in conjunction with the trained statistical model, to generate realizations conditioned on measured borehole data. Performance of the statistical model is illustrated through comparisons of generated realizations with the `true' numerical simulations. Finally, we demonstrate how these realizations can be used to determine statistically optimal locations for further interrogation of the subsurface.

  12. Learning what to expect (in visual perception)

    PubMed Central

    Seriès, Peggy; Seitz, Aaron R.

    2013-01-01

    Expectations are known to greatly affect our experience of the world. A growing theory in computational neuroscience is that perception can be successfully described using Bayesian inference models and that the brain is “Bayes-optimal” under some constraints. In this context, expectations are particularly interesting, because they can be viewed as prior beliefs in the statistical inference process. A number of questions remain unsolved, however, for example: How fast do priors change over time? Are there limits in the complexity of the priors that can be learned? How do an individual’s priors compare to the true scene statistics? Can we unlearn priors that are thought to correspond to natural scene statistics? Where and what are the neural substrate of priors? Focusing on the perception of visual motion, we here review recent studies from our laboratories and others addressing these issues. We discuss how these data on motion perception fit within the broader literature on perceptual Bayesian priors, perceptual expectations, and statistical and perceptual learning and review the possible neural basis of priors. PMID:24187536

  13. [Completeness of mortality statistics in Navarra, Spain].

    PubMed

    Moreno-Iribas, Conchi; Guevara, Marcela; Díaz-González, Jorge; Álvarez-Arruti, Nerea; Casado, Itziar; Delfrade, Josu; Larumbe, Emilia; Aguirre, Jesús; Floristán, Yugo

    2013-01-01

    Women in the region of Navarra, Spain, have one of the highest life expectancies at birth in Europe. The aim of this study is to assess the completeness of the official mortality statistics of Navarra in 2009 and the impact of the under-registration of deaths on life expectancy estimates. We compared the number of deaths in Navarra using the official statistics from the Instituto Nacional de Estadística (INE) and the data derived from multiple-source case-finding: the electronic health record, the Instituto Navarro de Medicina Legal, and INE data received late. 5,249 deaths were identified, of which 103 were not included in the official mortality statistics. Taking into account only deaths that occurred in Spain, which are the only ones considered for the official statistics, the completeness was 98.4%. Estimated life expectancy at birth in 2009 decreased from 86.6 to 86.4 years in women and from 80.0 to 79.6 years in men after correcting for the undercount. The results of this study ruled out the existence of significant under-registration in the official mortality statistics, confirming the exceptional longevity of women in Navarra, who are in the top position in Europe with a life expectancy at birth of 86.4 years.

  14. Smaller = Denser, and the Brain Knows It: Natural Statistics of Object Density Shape Weight Expectations

    PubMed Central

    Peters, Megan A. K.; Balzer, Jonathan; Shams, Ladan

    2015-01-01

    If one nondescript object’s volume is twice that of another, is it necessarily twice as heavy? As larger objects are typically heavier than smaller ones, one might assume humans use such heuristics in preparing to lift novel objects if other informative cues (e.g., material, previous lifts) are unavailable. However, it is also known that humans are sensitive to statistical properties of our environments, and that such sensitivity can bias perception. Here we asked whether statistical regularities in properties of liftable, everyday objects would bias human observers’ predictions about objects’ weight relationships. We developed state-of-the-art computer vision techniques to precisely measure the volume of everyday objects, and also measured their weight. We discovered that for liftable man-made objects, “twice as large” doesn’t mean “twice as heavy”: Smaller objects are typically denser, following a power function of volume. Interestingly, this “smaller is denser” relationship does not hold for natural or unliftable objects, suggesting some ideal density range for objects designed to be lifted. We then asked human observers to predict weight relationships between novel objects without lifting them; crucially, these weight predictions quantitatively match typical weight relationships shown by similarly-sized objects in everyday environments. These results indicate that the human brain represents the statistics of everyday objects and that this representation can be quantitatively abstracted and applied to novel objects. Finally, that the brain possesses and can use precise knowledge of the nonlinear association between size and weight carries important implications for implementation of forward models of motor control in artificial systems. PMID:25768977
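
    The regularity reported here is a power law, density proportional to volume^k with k < 0, so mass scales as volume^(1+k) and doubling volume multiplies mass by less than two. A sketch fitting the exponent by log-log regression (the exponent used to generate the toy data is an invented stand-in, not the paper's value):

      import numpy as np

      rng = np.random.default_rng(10)
      volume = 10 ** rng.uniform(-4, -1, size=300)   # object volumes, m^3
      density = 800.0 * volume**-0.2 * np.exp(rng.normal(0.0, 0.2, size=300))

      k, log_c = np.polyfit(np.log(volume), np.log(density), 1)
      print(f"fitted exponent k = {k:.2f}")
      print(f"doubling volume multiplies mass by {2 ** (1 + k):.2f}")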

  15. Separate and Simultaneous Adjustment of Light Qualities in a Real Scene

    PubMed Central

    Pont, Sylvia C.; Heynderick, Ingrid

    2017-01-01

    Humans are able to estimate light field properties in a scene in that they have expectations of the objects’ appearance inside it. Previously, we probed such expectations in a real scene by asking whether a “probe object” fitted a real scene with regard to its lighting. But how well are observers able to interactively adjust the light properties on a “probe object” to its surrounding real scene? Image ambiguities can result in perceptual interactions between light properties. Such interactions formed a major problem for the “readability” of the illumination direction and diffuseness on a matte smooth spherical probe. We found that light direction and diffuseness judgments using a rough sphere as probe were slightly more accurate than when using a smooth sphere, due to the three-dimensional (3D) texture. We here extended the previous work by testing independent and simultaneous (i.e., the light field properties separated one by one or blended together) adjustments of light intensity, direction, and diffuseness using a rough probe. Independently inferred light intensities were close to the veridical values, and the simultaneously inferred light intensity interacted somewhat with the light direction and diffuseness. The independently inferred light directions showed no statistical difference with the simultaneously inferred directions. The light diffuseness inferences correlated with but contracted around medium veridical values. In summary, observers were able to adjust the basic light properties through both independent and simultaneous adjustments. The light intensity, direction, and diffuseness are well “readable” from our rough probe. Our method allows “tuning the light” (adjustment of its spatial distribution) in interfaces for lighting design or perception research. PMID:28203350

  16. Sensory Evaluation of Pralines Containing Different Honey Products

    PubMed Central

    Popov-Raljić, Jovanka V.; Laličić-Petronijević, Jovanka G.; Georgijev, Aneta S.; Popov, Vladimir S.; Mladenović, Mića A.

    2010-01-01

    In this study, pralines manufactured by hand were evaluated sensorially. These pralines were obtained from dark chocolate containing 60% cocoa components, filled with Apis mellifera carnica Poll drone larvae, blossom honey and a blossom honey/pollen mixture from the protected region of Stara Planina-Eastern Serbia (a specific botanical region). The objectives of this study were investigations related to the use of sensory analysis for quality assessment of new functional products with potential benefits for human health, in particular of desserts based on dark chocolate pralines filled with different bee products characterized by a specific botanical and geographic origin, as well as of their storage properties and expected shelf life. Sensory quality (appearance, texture, odor and taste) was evaluated by a group of experienced panelists immediately after production (day 0), and then after 30, 90 and 180 days of storage under ambient conditions (temperature 18–20 °C). The results were statistically analyzed by two-factorial analysis of variance (MANOVA) and the LSD test. It is possible to conclude that the storage time and composition of dark chocolate pralines containing different honey-bee products have a statistically highly significant (p < 0.01) influence on the sensorially evaluated properties of the pralines. PMID:22163633

  17. Predicting structural classes of proteins by incorporating their global and local physicochemical and conformational properties into general Chou's PseAAC.

    PubMed

    Contreras-Torres, Ernesto

    2018-06-02

    In this study, I introduce novel global and local 0D-protein descriptors based on a statistical quantity named Total Sum of Squares (TSS). This quantity represents the sum of the squares differences of amino acid properties from the arithmetic mean property. As an extension, the amino acid-types and amino acid-groups formalisms are used for describing zones of interest in proteins. To assess the effectiveness of the proposed descriptors, a Nearest Neighbor model for predicting the major four protein structural classes was built. This model has a success rate of 98.53% on the jackknife cross-validation test; this performance being superior to other reported methods despite the simplicity of the predictor. Additionally, this predictor has an average success rate of 98.35% in different cross-validation tests performed. A value of 0.98 for the Kappa statistic clearly discriminates this model from a random predictor. The results obtained by the Nearest Neighbor model demonstrated the ability of the proposed descriptors not only to reflect relevant biochemical information related to the structural classes of proteins but also to allow appropriate interpretability. It can thus be expected that the current method may play a supplementary role to other existing approaches for protein structural class prediction and other protein attributes. Copyright © 2018 Elsevier Ltd. All rights reserved.
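
    The core quantity is elementary: for a chosen amino-acid property scale, TSS is the sum of squared deviations of residue property values from their arithmetic mean, computed either over the whole sequence or over a residue group. A minimal sketch using published Kyte-Doolittle hydropathy values (only the residues used are listed; the sequence and grouping are illustrative):

      def total_sum_of_squares(values):
          """Sum of squared differences from the arithmetic mean property."""
          mean = sum(values) / len(values)
          return sum((v - mean) ** 2 for v in values)

      # Kyte-Doolittle hydropathy values for the residues appearing below.
      kd = {"A": 1.8, "R": -4.5, "N": -3.5, "D": -3.5, "G": -0.4, "L": 3.8}

      sequence = "ALGRAND"
      global_tss = total_sum_of_squares([kd[aa] for aa in sequence])
      # Amino-acid-group variant: restrict the descriptor to charged residues.
      charged_tss = total_sum_of_squares([kd[aa] for aa in sequence if aa in "RD"])
      print(global_tss, charged_tss)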

  18. From molecular noise to behavioural variability in a single bacterium

    NASA Astrophysics Data System (ADS)

    Korobkova, Ekaterina; Emonet, Thierry; Vilar, Jose M. G.; Shimizu, Thomas S.; Cluzel, Philippe

    2004-04-01

    The chemotaxis network that governs the motion of Escherichia coli has long been studied to gain a general understanding of signal transduction. Although this pathway is composed of just a few components, it exhibits some essential characteristics of biological complexity, such as adaptation and response to environmental signals. In studying intracellular networks, most experiments and mathematical models have assumed that network properties can be inferred from population measurements. However, this approach masks underlying temporal fluctuations of intracellular signalling events. We have inferred fundamental properties of the chemotaxis network from a noise analysis of behavioural variations in individual bacteria. Here we show that certain properties established by population measurements, such as adapted states, are not conserved at the single-cell level: for timescales ranging from seconds to several minutes, the behaviour of non-stimulated cells exhibits temporal variations much larger than the expected statistical fluctuations. We find that the signalling network itself causes this noise and identify the molecular events that produce it. Small changes in the concentration of one key network component suppress temporal behavioural variability, suggesting that such variability is a selected property of this adaptive system.

  20. The Youth Psychopathic Traits Inventory: Measurement Invariance and Psychometric Properties among Portuguese Youths

    PubMed Central

    Pechorro, Pedro; Ribeiro da Silva, Diana; Andershed, Henrik; Rijo, Daniel; Abrunhosa Gonçalves, Rui

    2016-01-01

    The aim of the present study was to examine the psychometric properties of the Youth Psychopathic Traits Inventory (YPI) among a mixed-gender sample of 782 Portuguese youth (M = 15.87 years; SD = 1.72), in a school context. Confirmatory factor analysis revealed the expected three-factor first-order structure. Cross-gender measurement invariance and cross-sample measurement invariance using a forensic sample of institutionalized males were also confirmed. The Portuguese version of the YPI demonstrated generally adequate psychometric properties in terms of internal consistency, mean inter-item correlation, convergent validity, discriminant validity, and criterion-related validity, showing statistically significant associations with conduct disorder symptoms, alcohol abuse, drug use, and unprotected sex. In terms of known-groups validity, males scored higher than females, and males from the school sample scored lower than institutionalized males. The use of the YPI among the Portuguese male and female youth population is psychometrically justified, and it can be a useful measure to identify adolescents with high levels of psychopathic traits. PMID:27571095

  1. Maximum entropy approach to statistical inference for an ocean acoustic waveguide.

    PubMed

    Knobles, D P; Sagers, J D; Koch, R A

    2012-02-01

    A conditional probability distribution suitable for estimating the statistical properties of ocean seabed parameter values inferred from acoustic measurements is derived from a maximum entropy principle. The specification of the expectation value for an error function constrains the maximization of an entropy functional. This constraint determines the sensitivity factor (β) to the error function of the resulting probability distribution, which is a canonical form that provides a conservative estimate of the uncertainty of the parameter values. From the conditional distribution, marginal distributions for individual parameters can be determined by integration over the other parameters. The approach is an alternative way of obtaining the posterior probability distribution without an intermediary determination of the likelihood function followed by an application of Bayes' rule. In this paper the expectation value that specifies the constraint is determined from the values of the error function for the model solutions obtained from a small number of data samples. The method is applied to ocean acoustic measurements taken on the New Jersey continental shelf. The marginal probability distributions for the values of the sound speed ratio at the surface of the seabed and the source levels of a towed source are examined for different geoacoustic model representations. © 2012 Acoustical Society of America
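    A minimal numerical sketch of the canonical form described above, assuming a toy one-dimensional parameter grid and a quadratic error function: the sensitivity factor β is solved so that the distribution reproduces a specified expectation of the error, and the resulting p(θ) plays the role of the conditional distribution. All values are illustrative.

```python
# Hedged sketch: a canonical (maximum-entropy) distribution over a 1-D
# parameter grid, constrained by the expected value of an error function E.
# The error model, grid, and target expectation are illustrative assumptions.
import numpy as np
from scipy.optimize import brentq

theta = np.linspace(0.9, 1.3, 401)      # e.g., seabed sound-speed ratio
E = (theta - 1.05) ** 2                 # toy error function E(theta)
E_target = 0.005                        # constraint: <E> from data samples

def mean_E(beta):
    w = np.exp(-beta * (E - E.min()))   # shift exponent for stability
    p = w / w.sum()
    return np.sum(p * E)

# Solve for the sensitivity factor beta matching the constraint.
beta = brentq(lambda b: mean_E(b) - E_target, 1e-6, 1e6)
p = np.exp(-beta * (E - E.min())); p /= p.sum()
print(beta, theta[np.argmax(p)])        # beta and the most probable theta
```

    With more parameters, marginals follow by summing p over the grid axes of the remaining parameters.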

  2. Optimization of Sinter Plant Operating Conditions Using Advanced Multivariate Statistics: Intelligent Data Processing

    NASA Astrophysics Data System (ADS)

    Fernández-González, Daniel; Martín-Duarte, Ramón; Ruiz-Bustinza, Íñigo; Mochón, Javier; González-Gasca, Carmen; Verdeja, Luis Felipe

    2016-08-01

    Blast furnace operators expect to get sinter with homogeneous and regular properties (chemical and mechanical), necessary to ensure regular blast furnace operation. Blends for sintering also include several iron by-products and other wastes that are obtained in different processes inside the steelworks. Due to their source, the availability of such materials is not always consistent, but their total production should be consumed in the sintering process, to both save money and recycle wastes. The main scope of this paper is to obtain the least expensive iron ore blend for the sintering process that will provide suitable chemical and mechanical features for the homogeneous and regular operation of the blast furnace. Statistical tools were systematically employed to analyze historical data: linear and partial correlations applied to the data, and fuzzy clustering based on the Sugeno Fuzzy Inference System to establish relationships among the available variables.

  3. Some Probabilistic and Statistical Properties of the Seismic Regime of Zemmouri (Algeria) Seismoactive Zone

    NASA Astrophysics Data System (ADS)

    Baddari, Kamel; Bellalem, Fouzi; Baddari, Ibtihel; Makdeche, Said

    2016-10-01

    Statistical tests have been used to adjust the Zemmouri seismic data using a distribution function. The Pareto law has been used and the probabilities of various expected earthquakes were computed. A mathematical expression giving the quantiles was established. The extreme values limiting law confirmed the accuracy of the adjustment method. Using the moment magnitude scale, a probabilistic model was made to predict the occurrences of strong earthquakes. The seismic structure has been characterized by the slope of the recurrence plot γ, fractal dimension D, concentration parameter K_sr, and Hurst exponents H_r and H_t. The values of D, γ, K_sr, H_r, and H_t diminished many months before the principal seismic shock (M = 6.9) of the studied seismoactive zone occurred. Three stages of the deformation of the geophysical medium are manifested in the variation of the coefficient G% of the clustering of minor seismic events.

  4. Interlot variations of transition temperature range and force delivery in copper-nickel-titanium orthodontic wires.

    PubMed

    Pompei-Reynolds, Renée C; Kanavakis, Georgios

    2014-08-01

    The manufacturing process for copper-nickel-titanium archwires is technique-sensitive. The primary aim of this investigation was to examine the interlot consistency of the mechanical properties of copper-nickel-titanium wires from 2 manufacturers. Wires of 2 sizes (0.016 and 0.016 × 0.022 in) and 3 advertised austenite finish temperatures (27°C, 35°C, and 40°C) from 2 manufacturers were tested for transition temperature ranges and force delivery using differential scanning calorimetry and the 3-point bend test, respectively. Variations of these properties were analyzed for statistical significance by calculating the F statistic for equality of variances for transition temperature and force delivery in each group of wires. All statistical analyses were performed at the 0.05 level of significance. Statistically significant interlot variations in austenite finish were found for the 0.016 in/27°C (P = 0.041) and 0.016 × 0.022 in/35°C (P = 0.048) wire categories, and in austenite start for the 0.016 × 0.022 in/35°C wire category (P = 0.01). In addition, significant variations in force delivery were found between the 2 manufacturers for the 0.016 in/27°C (P = 0.002), 0.016 in/35.0°C (P = 0.049), and 0.016 × 0.022 in/35°C (P = 0.031) wires. Orthodontic wires of the same material, dimension, and manufacturer but from different production lots do not always have similar mechanical properties. Clinicians should be aware that copper-nickel-titanium wires might not always deliver the expected force, even when they come from the same manufacturer, because of interlot variations in the performance of the material. Copyright © 2014 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.
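    A minimal sketch of the equality-of-variances F statistic used per wire group; the austenite-finish temperatures for the two hypothetical lots below are invented for illustration.

```python
# Two-sided F test for equality of variances between two production lots.
import numpy as np
from scipy import stats

lot_a = np.array([27.1, 27.4, 26.8, 27.6, 27.2])  # Af temps, lot A (toy data)
lot_b = np.array([26.2, 28.9, 27.8, 25.9, 28.4])  # Af temps, lot B (toy data)

F = np.var(lot_a, ddof=1) / np.var(lot_b, ddof=1)
dfn, dfd = lot_a.size - 1, lot_b.size - 1
p = 2 * min(stats.f.cdf(F, dfn, dfd), stats.f.sf(F, dfn, dfd))  # two-sided p
print(F, p)
```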

  5. Space weather influence on the agriculture technology and wheat prices in the medieval England (1259-1703) through cosmic ray/solar activity cycle variations

    NASA Astrophysics Data System (ADS)

    Dorman, L. I.; Pustil'Nik, L. A.; Yom Din, G.

    2003-04-01

    The database of Professor Rogers (1887), which includes wheat prices in England in the Middle Ages (1249-1703), was used to search for possible manifestations of solar activity and cosmic ray intensity variations. The main object of our statistical analysis is the investigation of bursts of prices. Our study shows that bursts and troughs of wheat prices take place at extreme states (maxima or minima) of solar activity cycles. We present a conceptual model of possible modes for the sensitivity of wheat prices to weather conditions, caused by cosmic ray intensity solar cycle variations, and compare the expected price fluctuations with wheat price variations recorded in medieval England. We compared the statistical properties of the intervals between price bursts with the statistical properties of the intervals between extremes (minima) of solar cycles during the years 1700-2000. The medians of the two samples are 11.00 and 10.7 years; the standard deviations are 1.44 and 1.53 years for prices and for solar activity, respectively. The hypothesis that the frequency distributions are the same for both samples has a significance level >95%. In the next step, we analyzed direct links between wheat prices and cosmic ray cycle variations in the 17th century, for which both wheat prices and cosmic ray intensity (derived from Be-10 isotope data) are available. We show that for all seven solar activity minima (cosmic ray intensity maxima) the observed prices were higher than the prices for the seven intervals of maximal solar activity (100% sign correlation). This result, combined with the similarity of the statistical properties of the price and solar activity extremes, can be considered direct evidence of a causal connection between wheat price bursts and solar activity/cosmic ray intensity extremes.
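    A small sketch of the kind of interval comparison described above, assuming synthetic samples of inter-burst and inter-minimum intervals drawn with the quoted medians and standard deviations; a two-sample Kolmogorov-Smirnov test stands in here for the distribution-equality test.

```python
# Comparing the distributions of two interval samples (invented stand-ins
# for the historical price-burst and solar-minimum interval series).
import numpy as np
from scipy import stats

rng = np.random.default_rng(10)
price_burst_gaps = rng.normal(11.0, 1.44, size=40)  # years between bursts
solar_min_gaps = rng.normal(10.7, 1.53, size=27)    # years between minima

print(np.median(price_burst_gaps), np.median(solar_min_gaps))
print(stats.ks_2samp(price_burst_gaps, solar_min_gaps))  # same-distribution test
```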

  6. [Assessment of psychometric properties of the academic involvement questionnaire, expectations version].

    PubMed

    Pérez V, Cristhian; Ortiz M, Liliana; Fasce H, Eduardo; Parra P, Paula; Matus B, Olga; McColl C, Peter; Torres A, Graciela; Meyer K, Andrea; Márquez U, Carolina; Ortega B, Javiera

    2015-11-01

    The Academic Involvement Questionnaire, Expectations version (CIA-A), assesses expectations of involvement in studies, a relevant predictor of student success. However, evidence of its validity and reliability in Chile is scarce, and for Medical students there is no evidence at all. The aim was to evaluate the factorial structure and internal consistency of the CIA-A in Chilean Medical school freshmen. The survey was applied to 340 Medicine freshmen, chosen by non-probability quota sampling. They answered a version of the CIA-A back-translated from Portuguese to Spanish, plus a sociodemographic questionnaire. For the psychometric analysis of the CIA-A, an exploratory factor analysis was carried out, the reliability of the factors was calculated, a descriptive analysis was conducted, and the correlations among factors were assessed. Five factors were identified: vocational, institutional and social involvement, use of resources, and student participation. Their reliabilities ranged between Cronbach's alpha values of 0.71 and 0.87. The factors also showed statistically significant correlations with each other. The identified factor structure is theoretically consistent with that of the original version, disagreeing in only one factor. In addition, the factors' internal consistency was adequate for their use in research. This supports the construct validity and reliability of the CIA-A for assessing involvement expectations in medical school freshmen.

  7. Lagrangian single-particle turbulent statistics through the Hilbert-Huang transform.

    PubMed

    Huang, Yongxiang; Biferale, Luca; Calzavarini, Enrico; Sun, Chao; Toschi, Federico

    2013-04-01

    The Hilbert-Huang transform is applied to analyze single-particle Lagrangian velocity data from numerical simulations of hydrodynamic turbulence. The velocity trajectory is described in terms of a set of intrinsic mode functions C_i(t) and of their instantaneous frequency ω_i(t). On the basis of this decomposition we define the ω-conditioned statistical moments of the C_i modes, named q-order Hilbert spectra (HS). We show that such quantities have enhanced scaling properties as compared to traditional Fourier transform- or correlation-based (structure functions) statistical indicators, thus providing better insights into the turbulent energy transfer process. We present clear empirical evidence that the energy-like quantity, i.e., the second-order HS, displays a linear scaling in time in the inertial range, as expected from a dimensional analysis. We also measure high-order moment scaling exponents in a direct way, without resorting to the extended self-similarity procedure. This leads to an estimate of the Lagrangian structure function exponents which are consistent with the multifractal prediction in the Lagrangian frame as proposed by Biferale et al. [Phys. Rev. Lett. 93, 064502 (2004)].
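    For readers unfamiliar with the Hilbert step, the sketch below computes the instantaneous frequency of a single mode via the analytic signal; in the actual method the modes C_i(t) come from empirical mode decomposition, which is not reproduced here, so a synthetic chirp stands in.

```python
# Instantaneous frequency of one "mode" from its analytic signal.
# The chirp is a stand-in for an intrinsic mode function C_i(t).
import numpy as np
from scipy.signal import hilbert

fs = 1000.0
t = np.arange(0, 2.0, 1 / fs)
mode = np.cos(2 * np.pi * (5 * t + 2 * t**2))  # chirp: 5 Hz sweeping upward

analytic = hilbert(mode)
phase = np.unwrap(np.angle(analytic))
inst_freq = np.gradient(phase, 1 / fs) / (2 * np.pi)  # omega_i(t) / 2*pi

print(inst_freq[100], inst_freq[-100])  # ~5 + 4t Hz at the two times
```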

  8. Revising the personality disorder diagnostic criteria for the Diagnostic and Statistical Manual of Mental Disorders-Fifth Edition (DSM-V): consider the later life context.

    PubMed

    Balsis, Steve; Segal, Daniel L; Donahue, Cailin

    2009-10-01

    The categorical measurement approach implemented by the Diagnostic and Statistical Manual of Mental Disorders-Fourth Edition (DSM-IV) personality disorder (PD) diagnostic system is theoretically and pragmatically limited. As a result, many prominent psychologists now advocate for a shift away from this approach in favor of more conceptually sound dimensional measurement. This shift is expected to improve the psychometric properties of the personality disorder (PD) diagnostic system and make it more useful for clinicians and researchers. The current article suggests that despite the probable benefits of such a change, several limitations will remain if the new diagnostic system does not closely consider the context of later life. A failure to address the unique challenges associated with the assessment of personality in older adults likely will result in the continued limited validity, reliability, and utility of the Diagnostic and Statistical Manual of Mental Disorders (DSM) system for this growing population. This article discusses these limitations and their possible implications. (c) 2009 APA, all rights reserved.

  9. A generalized K statistic for estimating phylogenetic signal from shape and other high-dimensional multivariate data.

    PubMed

    Adams, Dean C

    2014-09-01

    Phylogenetic signal is the tendency for closely related species to display similar trait values due to their common ancestry. Several methods have been developed for quantifying phylogenetic signal in univariate traits and for sets of traits treated simultaneously, and the statistical properties of these approaches have been extensively studied. However, methods for assessing phylogenetic signal in high-dimensional multivariate traits like shape are less well developed, and their statistical performance is not well characterized. In this article, I describe a generalization of the K statistic of Blomberg et al. that is useful for quantifying and evaluating phylogenetic signal in high-dimensional multivariate data. The method (K_mult) is found from the equivalence between statistical methods based on covariance matrices and those based on distance matrices. Using computer simulations based on Brownian motion, I demonstrate that the expected value of K_mult remains at 1.0 as trait variation among species is increased or decreased, and as the number of trait dimensions is increased. By contrast, estimates of phylogenetic signal found with a squared-change parsimony procedure for multivariate data change with increasing trait variation among species and with increasing numbers of trait dimensions, confounding biological interpretations. I also evaluate the statistical performance of hypothesis testing procedures based on K_mult and find that the method displays appropriate Type I error and high statistical power for detecting phylogenetic signal in high-dimensional data. Statistical properties of K_mult were consistent for simulations using bifurcating and random phylogenies, for simulations using different numbers of species, for simulations that varied the number of trait dimensions, and for different underlying models of trait covariance structure. Overall these findings demonstrate that K_mult provides a useful means of evaluating phylogenetic signal in high-dimensional multivariate traits. Finally, I illustrate the utility of the new approach by evaluating the strength of phylogenetic signal for head shape in a lineage of Plethodon salamanders. © The Author(s) 2014. Published by Oxford University Press, on behalf of the Society of Systematic Biologists. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
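    For orientation, here is a sketch of the univariate K of Blomberg et al., which K_mult generalizes to multivariate data; y is a trait vector and C the phylogenetic covariance matrix expected under Brownian motion, both invented for the example.

```python
# Univariate Blomberg's K: (observed MSE0/MSE) / (expected MSE0/MSE).
# This is the published univariate statistic, not the paper's K_mult.
import numpy as np

def blomberg_K(y, C):
    n = len(y)
    Cinv = np.linalg.inv(C)
    one = np.ones(n)
    a = (one @ Cinv @ y) / (one @ Cinv @ one)  # GLS phylogenetic mean
    r = y - a
    mse0 = (r @ r) / (n - 1)                   # ordinary mean squares
    mse = (r @ Cinv @ r) / (n - 1)             # phylogenetically corrected
    expected = (np.trace(C) - n / (one @ Cinv @ one)) / (n - 1)
    return (mse0 / mse) / expected             # expectation 1.0 under BM

# Toy check: a star phylogeny (C = I) gives K = 1 exactly.
rng = np.random.default_rng(0)
print(blomberg_K(rng.normal(size=8), np.eye(8)))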

  10. Assessing the independent contribution of maternal educational expectations to children's educational attainment in early adulthood: a propensity score matching analysis.

    PubMed

    Pingault, Jean Baptiste; Côté, Sylvana M; Petitclerc, Amélie; Vitaro, Frank; Tremblay, Richard E

    2015-01-01

    Parental educational expectations have been associated with children's educational attainment in a number of long-term longitudinal studies, but whether this relationship is causal has long been debated. The aims of this prospective study were twofold: 1) test whether low maternal educational expectations contributed to failure to graduate from high school; and 2) compare the results obtained using different strategies for accounting for confounding variables (i.e. multivariate regression and propensity score matching). The study sample included 1,279 participants from the Quebec Longitudinal Study of Kindergarten Children. Maternal educational expectations were assessed when the participants were aged 12 years. High school graduation—measuring educational attainment—was determined through the Quebec Ministry of Education when the participants were aged 22-23 years. Findings show that when using the most common statistical approach (i.e. multivariate regressions to adjust for a restricted set of potential confounders) the contribution of low maternal educational expectations to failure to graduate from high school was statistically significant. However, when using propensity score matching, the contribution of maternal expectations was reduced and remained statistically significant only for males. The results of this study are consistent with the possibility that the contribution of parental expectations to educational attainment is overestimated in the available literature. This may be explained by the use of a restricted range of potential confounding variables as well as the dearth of studies using appropriate statistical techniques and study designs in order to minimize confounding. Each of these techniques and designs, including propensity score matching, has its strengths and limitations: A more comprehensive understanding of the causal role of parental expectations will stem from a convergence of findings from studies using different techniques and designs.
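    A compact sketch of the propensity-score-matching strategy contrasted above with multivariate regression; the confounders, treatment flag, and outcome are simulated stand-ins for the study's variables, not its data.

```python
# Propensity-score matching: model treatment from confounders, match each
# treated unit to its nearest-propensity control, compare outcomes.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 5))                   # confounders (toy data)
t = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))  # "low expectations" flag
y = rng.binomial(1, 1 / (1 + np.exp(-(0.5 * t + X[:, 1]))))  # non-graduation

ps = LogisticRegression(max_iter=1000).fit(X, t).predict_proba(X)[:, 1]
treated, control = np.where(t == 1)[0], np.where(t == 0)[0]
nn = NearestNeighbors(n_neighbors=1).fit(ps[control, None])
_, idx = nn.kneighbors(ps[treated, None])
matched = control[idx.ravel()]
att = y[treated].mean() - y[matched].mean()      # matched risk difference
print(att)
```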

  11. Use of Statistical Analyses in the Ophthalmic Literature

    PubMed Central

    Lisboa, Renato; Meira-Freitas, Daniel; Tatham, Andrew J.; Marvasti, Amir H.; Sharpsten, Lucie; Medeiros, Felipe A.

    2014-01-01

    Purpose: To identify the most commonly used statistical analyses in the ophthalmic literature and to determine the likely gain in comprehension of the literature that readers could expect if they were to sequentially add knowledge of more advanced techniques to their statistical repertoire. Design: Cross-sectional study. Methods: All articles published from January 2012 to December 2012 in Ophthalmology, American Journal of Ophthalmology and Archives of Ophthalmology were reviewed. A total of 780 peer-reviewed articles were included. Two reviewers examined each article and assigned categories to each one depending on the type of statistical analyses used. Discrepancies between reviewers were resolved by consensus. Main Outcome Measures: Total number and percentage of articles containing each category of statistical analysis were obtained. Additionally, we estimated the accumulated number and percentage of articles that a reader would be expected to be able to interpret depending on their statistical repertoire. Results: Readers with little or no statistical knowledge would be expected to be able to interpret the statistical methods presented in only 20.8% of articles. In order to understand more than half (51.4%) of the articles published, readers were expected to be familiar with at least 15 different statistical methods. Knowledge of 21 categories of statistical methods was necessary to comprehend 70.9% of articles, while knowledge of more than 29 categories was necessary to comprehend more than 90% of articles. Articles in retina and glaucoma subspecialties showed a tendency for using more complex analysis when compared to cornea. Conclusions: Readers of clinical journals in ophthalmology need to have substantial knowledge of statistical methodology to understand the results of published studies in the literature. The frequency of use of complex statistical analyses also indicates that those involved in the editorial peer-review process must have sound statistical knowledge in order to critically appraise articles submitted for publication. The results of this study could provide guidance to direct the statistical learning of clinical ophthalmologists, researchers and educators involved in the design of courses for residents and medical students. PMID:24612977
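    The accumulated-comprehension estimate can be made concrete with a small sketch: given the set of statistical methods used in each article, add methods in order of frequency and count the articles that become fully interpretable at each step. The article data below are invented.

```python
# Cumulative fraction of articles interpretable as a reader's statistical
# repertoire grows, one (most frequent) method at a time.
from collections import Counter

articles = [{"t-test"}, {"t-test", "ANOVA"}, {"Kaplan-Meier"},
            {"t-test", "regression"}, set()]  # set() = no statistics used

freq = Counter(m for a in articles for m in a)
known = set()
for method, _ in freq.most_common():
    known |= {method}
    covered = sum(a <= known for a in articles)  # subset test per article
    print(f"{method:>12}: {covered / len(articles):.0%} of articles readable")
```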

  12. The dynamics of integrate-and-fire: mean versus variance modulations and dependence on baseline parameters.

    PubMed

    Pressley, Joanna; Troyer, Todd W

    2011-05-01

    The leaky integrate-and-fire (LIF) is the simplest neuron model that captures the essential properties of neuronal signaling. Yet common intuitions are inadequate to explain basic properties of LIF responses to sinusoidal modulations of the input. Here we examine responses to low and moderate frequency modulations of both the mean and variance of the input current and quantify how these responses depend on baseline parameters. Across parameters, responses to modulations in the mean current are low pass, approaching zero in the limit of high frequencies. For very low baseline firing rates, the response cutoff frequency matches that expected from membrane integration. However, the cutoff shows a rapid, supralinear increase with firing rate, with a steeper increase in the case of lower noise. For modulations of the input variance, the gain at high frequency remains finite. Here, we show that the low-frequency responses depend strongly on baseline parameters and derive an analytic condition specifying the parameters at which responses switch from being dominated by low versus high frequencies. Additionally, we show that the resonant responses for variance modulations have properties not expected for common oscillatory resonances: they peak at frequencies higher than the baseline firing rate and persist when oscillatory spiking is disrupted by high noise. Finally, the responses to mean and variance modulations are shown to have a complementary dependence on baseline parameters at higher frequencies, resulting in responses to modulations of Poisson input rates that are independent of baseline input statistics.
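    A minimal simulation sketch of an LIF neuron driven by a sinusoidally modulated mean current with additive noise, in the spirit of the analysis above; all parameter values are illustrative, not those of the paper.

```python
# Euler-Maruyama simulation of a noisy LIF neuron with a sinusoidally
# modulated mean input; counts spikes to estimate the modulated rate.
import numpy as np

dt, T = 1e-4, 5.0                    # time step and duration, s
tau, v_th, v_reset = 0.02, 1.0, 0.0  # membrane time constant, threshold, reset
mu0, mu1, f = 1.2, 0.2, 5.0          # baseline mean, modulation depth, Hz
sigma = 0.5                          # input noise amplitude

rng = np.random.default_rng(2)
t = np.arange(0, T, dt)
mu = mu0 + mu1 * np.sin(2 * np.pi * f * t)

v, spikes = 0.0, []
for i, ti in enumerate(t):
    v += (-v + mu[i]) * dt / tau + sigma * np.sqrt(dt / tau) * rng.normal()
    if v >= v_th:
        spikes.append(ti)
        v = v_reset

print(len(spikes) / T, "spikes/s")   # mean firing rate under modulation
```

    Sweeping f and measuring the modulation of the instantaneous rate would trace out the low-pass transfer curves discussed above.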

  13. Modeling of the dielectric permittivity of porous soil media with water using statistical-physical models

    NASA Astrophysics Data System (ADS)

    Usowicz, Boguslaw; Marczewski, Wojciech; Usowicz, Jerzy B.; Łukowski, Mateusz; Lipiec, Jerzy; Stankiewicz, Krystyna

    2013-04-01

    Radiometric observations with SMOS rely on the Radiative Transfer Equations (RTE), which determine the Brightness Temperature (BT) in two linear polarization components (H, V) satisfying the Fresnel principle of propagation in horizontally layered target media on the ground. The RTE contain variables that bind the equations, expressed in electromagnetic (EM) terms of the intensity BT, to the physical reality expressed by non-EM variables (Soil Moisture (SM), vegetation indexes, fractional coverage with many different properties, and boundary conditions such as optical thickness, layer definitions, and roughness), bridging the EM domain to other physical aspects by means of the so-called tau-omega method. This method enables joining a variety of valuable models, including specific empirical estimates of physical properties in relation to the volumetric water content. The RTE are in fact expressed through propagation, reflection, and losses or attenuation along the considered propagation path. Electromagnetic propagation is expressed in the propagation constant, and for target media on the ground the dielectric constant dominates the propagation effects. Therefore, despite the many physical parameters involved, one must rely primarily on the dielectric constant, treated as a complex quantity: its real part represents the apparent shortening of the propagation path and refraction, while its imaginary part accounts for attenuation or losses. This work applies statistical-physical modeling of soil properties, treating the medium as a mixture of solid grains with gas or liquid filling the pores, and with contact bridges between compounds handled statistically. This modeling approach makes it possible to characterize porosity by general statistical means and is applicable to various physical properties (thermal and electrical conductivity, and dielectric properties) that depend on the composition of the compounds. The method was developed independently of the SMOS method; the two meet in the RTE, at the dielectric constant. The dielectric constant is retrieved by SMOS regardless of other properties such as soil porosity, and without a direct relation to the thermal properties of soils, whereas the relations between the thermal properties of soil and the water content are very consistent. We therefore introduce the effects of soil porosity and of soil thermal properties into the complex representation of the dielectric constant, thereby gaining new abilities for capturing porosity effects in SMOS observations. Currently we can present a few examples of the relations between thermal properties and soil moisture content, from the Biebrza and Polesie wetlands in Poland, and search for correlations between SM from SMOS and the moisture content known from the ground. The correlations are poor for SMOS L2 data processed with the retrieval version using the Dobson model (501), but we expect better correlation for the version using the Mironov model (551). If this supposition is confirmed, it will encourage employing the statistical-physical modeling of the dielectric constant and thermal properties in the RTE and the tau-omega method. Treating soil porosity directly as a research target is less strongly motivated than exploiting its effects on the SM observable by SMOS.

  14. Anchored LH2 complexes in 2D polarization imaging.

    PubMed

    Tubasum, Sumera; Sakai, Shunsuke; Dewa, Takehisa; Sundström, Villy; Scheblykin, Ivan G; Nango, Mamoru; Pullerits, Tõnu

    2013-09-26

    Protein is a soft material with inherently large structural disorder. Consequently, the bulk spectroscopies of photosynthetic pigment protein complexes provide averaged information where many details are lost. Here we report spectroscopy of single light-harvesting complexes where fluorescence excitation and detection polarizations are both independently rotated. Two samples of peripheral antenna (LH2) complexes from Rhodopseudomonas acidophila were studied. In one, the complexes were embedded in polyvinyl alcohol (PVA) film; in the other, they were anchored on the glass surface and covered by the PVA film. LH2 contains two rings of pigment molecules: B800 and B850. The B800 excitation polarization properties of the two samples were found to be very similar, indicating that the orientation statistics of LH2s are the same in these two very different preparations. At the same time, we found a significant difference in B850 emission polarization statistics. We conclude that the B850 band of the anchored sample is substantially more disordered. We argue that both B800 excitation and B850 emission polarization properties can be explained by the tilt of the anchored LH2s due to the spin-casting of the PVA film on top of the complexes and related shear forces. Due to the tilt, the orientation statistics of the two samples become similar. Anchoring is expected to orient the LH2s so that B850 is closer to the substrate. Consequently, the tilt-related strain leads to larger deformation and disorder in B850 than in B800.

  15. Mammographic texture synthesis using genetic programming and clustered lumpy background

    NASA Astrophysics Data System (ADS)

    Castella, Cyril; Kinkel, Karen; Descombes, François; Eckstein, Miguel P.; Sottas, Pierre-Edouard; Verdun, Francis R.; Bochud, François O.

    2006-03-01

    In this work we investigated the digital synthesis of images which mimic real textures observed in mammograms. Such images could be produced in unlimited numbers with tunable statistical properties in order to study human performance and model observer performance in perception experiments. We used the previously developed clustered lumpy background (CLB) technique and optimized its parameters with a genetic algorithm (GA). In order to maximize the realism of the textures, we combined the GA objective approach with psychophysical experiments involving the judgments of radiologists. Thirty-six statistical features were computed and averaged over 1000 regions of interest from real mammograms. The same features were measured for the synthetic textures, and the Mahalanobis distance was used to quantify the similarity of the features between the real and synthetic textures. The similarity, as measured by the Mahalanobis distance, was used as the GA fitness function for evolving the free CLB parameters. In the psychophysical approach, experienced radiologists were asked to qualify the realism of synthetic images by considering typical structures that are expected to be found on real mammograms: glandular and fatty areas, and fiber crossings. Results show that CLB images found via optimization with the GA are significantly closer to real mammograms than previously published images. Moreover, the psychophysical experiments confirm that all the above-mentioned structures are reproduced well in the generated images. This means that we can generate an arbitrarily large database of textures mimicking mammograms with traceable statistical properties.
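    A sketch of the Mahalanobis-distance fitness used to compare synthetic-texture features against the real-mammogram feature statistics; the 36-feature matrices here are random stand-ins for the measured features.

```python
# Mahalanobis distance of a synthetic texture's feature vector from the
# mean and covariance of features measured on real regions of interest.
import numpy as np

rng = np.random.default_rng(3)
real = rng.normal(size=(1000, 36))       # 36 features over 1000 real ROIs
mu, cov = real.mean(axis=0), np.cov(real, rowvar=False)
cov_inv = np.linalg.inv(cov)

def fitness(synthetic_features):
    """Smaller Mahalanobis distance = more realistic texture."""
    d = synthetic_features - mu
    return float(np.sqrt(d @ cov_inv @ d))

print(fitness(rng.normal(size=36)))
```

    In the GA, this distance would be minimized over the free CLB parameters, each candidate parameter set being scored through its generated texture's features.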

  16. Primordial non-Gaussianity and reionization

    NASA Astrophysics Data System (ADS)

    Lidz, Adam; Baxter, Eric J.; Adshead, Peter; Dodelson, Scott

    2013-07-01

    The statistical properties of the primordial perturbations contain clues about their origins. Although the Planck collaboration has recently obtained tight constraints on primordial non-Gaussianity from cosmic microwave background measurements, it is still worthwhile to mine upcoming data sets in an effort to place independent or competitive limits. The ionized bubbles that formed at redshift z ~ 6-20 during the epoch of reionization were seeded by primordial overdensities, and so the statistics of the ionization field at high redshift are related to the statistics of the primordial field. Here we model the effect of primordial non-Gaussianity on the reionization field. The epoch and duration of reionization are affected, as are the sizes of the ionized bubbles, but these changes are degenerate with variations in the properties of the ionizing sources and the surrounding intergalactic medium. A more promising signature is the power spectrum of the spatial fluctuations in the ionization field, which may be probed by upcoming 21 cm surveys. This has the expected 1/k^2 dependence on large scales, characteristic of a biased tracer of the matter field. We project how well upcoming 21 cm observations will be able to disentangle this signal from foreground contamination. Although foreground cleaning inevitably removes the large-scale modes most impacted by primordial non-Gaussianity, we find that primordial non-Gaussianity can be separated from foreground contamination for a narrow range of length scales. In principle, futuristic redshifted 21 cm surveys may allow constraints competitive with Planck.

  17. Within-individual variation in bullfrog vocalizations: implications for a vocally mediated social recognition system.

    PubMed

    Bee, Mark A

    2004-12-01

    Acoustic signals provide a basis for social recognition in a wide range of animals. Few studies, however, have attempted to relate the patterns of individual variation in signals to behavioral discrimination thresholds used by receivers to discriminate among individuals. North American bullfrogs (Rana catesbeiana) discriminate among familiar and unfamiliar individuals based on individual variation in advertisement calls. The sources, patterns, and magnitudes of variation in eight acoustic properties of multiple-note advertisement calls were examined to understand how patterns of within-individual variation might either constrain, or provide additional cues for, vocal recognition. Six of eight acoustic properties exhibited significant note-to-note variation within multiple-note calls. Despite this source of within-individual variation, all call properties varied significantly among individuals, and multivariate analyses indicated that call notes were individually distinct. Fine-temporal and spectral call properties exhibited less within-individual variation compared to gross-temporal properties and contributed most toward statistically distinguishing among individuals. Among-individual differences in the patterns of within-individual variation in some properties suggest that within-individual variation could also function as a recognition cue. The distributions of among-individual and within-individual differences were used to generate hypotheses about the expected behavioral discrimination thresholds of receivers.

  18. An Initial Design of ISO 19152:2012 LADM Based Valuation and Taxation Data Model

    NASA Astrophysics Data System (ADS)

    Çağdaş, V.; Kara, A.; van Oosterom, P.; Lemmen, C.; Işıkdağ, Ü.; Kathmann, R.; Stubkjær, E.

    2016-10-01

    A fiscal registry or database is supposed to record geometric, legal, physical, economic, and environmental characteristics in relation to property units, which are subject to immovable property valuation and taxation. Apart from procedural standards, there is no internationally accepted data standard that defines the semantics of fiscal databases. The ISO 19152:2012 Land Administration Domain Model (LADM), as an international land administration standard focuses on legal requirements, but considers out of scope specifications of external information systems including valuation and taxation databases. However, it provides a formalism which allows for an extension that responds to the fiscal requirements. This paper introduces an initial version of a LADM - Fiscal Extension Module for the specification of databases used in immovable property valuation and taxation. The extension module is designed to facilitate all stages of immovable property taxation, namely the identification of properties and taxpayers, assessment of properties through single or mass appraisal procedures, automatic generation of sales statistics, and the management of tax collection, dealing with arrears and appeals. It is expected that the initial version will be refined through further activities held by a possible joint working group under FIG Commission 7 (Cadastre and Land Management) and FIG Commission 9 (Valuation and the Management of Real Estate) in collaboration with other relevant international bodies.

  19. Assessing the resolution-dependent utility of tomograms for geostatistics

    USGS Publications Warehouse

    Day-Lewis, F. D.; Lane, J.W.

    2004-01-01

    Geophysical tomograms are used increasingly as auxiliary data for geostatistical modeling of aquifer and reservoir properties. The correlation between tomographic estimates and hydrogeologic properties is commonly based on laboratory measurements, co-located measurements at boreholes, or petrophysical models. The inferred correlation is assumed uniform throughout the interwell region; however, tomographic resolution varies spatially due to acquisition geometry, regularization, data error, and the physics underlying the geophysical measurements. Blurring and inversion artifacts are expected in regions traversed by few or only low-angle raypaths. In the context of radar traveltime tomography, we derive analytical models for (1) the variance of tomographic estimates, (2) the spatially variable correlation with a hydrologic parameter of interest, and (3) the spatial covariance of tomographic estimates. Synthetic examples demonstrate that tomograms of qualitative value may have limited utility for geostatistics; moreover, the imprint of regularization may preclude inference of meaningful spatial statistics from tomograms.

  20. DIFFERENCES BETWEEN RADIO-LOUD AND RADIO-QUIET γ-RAY PULSARS AS REVEALED BY FERMI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hui, C. Y.; Lee, Jongsu; Takata, J.

    By comparing the properties of non-recycled radio-loud γ-ray pulsars and radio-quiet γ-ray pulsars, we have searched for the differences between these two populations. We found that the γ-ray spectral curvature of radio-quiet pulsars can be larger than that of radio-loud pulsars. Based on the full sample of non-recycled γ-ray pulsars, their distributions of the magnetic field strength at the light cylinder are also found to be different. We note that this might result from an observational bias. By reexamining the previously reported difference of γ-ray-to-X-ray flux ratios, we found that the significance can be hampered by their statistical uncertainties. In the context of the outer gap model, we discuss the expected properties of these two populations and compare with the possible differences that are identified in our analysis.

  1. Sojourning with the Homogeneous Poisson Process.

    PubMed

    Liu, Piaomu; Peña, Edsel A

    2016-01-01

    In this pedagogical article, distributional properties, some surprising, pertaining to the homogeneous Poisson process (HPP), when observed over a possibly random window, are presented. Properties of the gap-time that covered the termination time and the correlations among gap-times of the observed events are obtained. Inference procedures, such as estimation and model validation, based on event occurrence data over the observation window, are also presented. We envision that through the results in this paper, a better appreciation of the subtleties involved in the modeling and analysis of recurrent events data will ensue, since the HPP is arguably one of the simplest among recurrent event models. In addition, the use of the theorem of total probability, Bayes theorem, the iterated rules of expectation, variance and covariance, and the renewal equation could be illustrative when teaching distribution theory, mathematical statistics, and stochastic processes at both the undergraduate and graduate levels. This article is targeted towards both instructors and students.
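    One of the "surprising" properties alluded to above, the inflated gap-time covering the termination time, is easy to reproduce by simulation; the sketch below observes an HPP over a window [0, τ] and shows the mean covering gap approaching 2/λ rather than 1/λ. The rate and window are arbitrary choices.

```python
# Inspection paradox for a homogeneous Poisson process: the gap straddling
# a fixed time tau is, on average, about twice the typical gap.
import numpy as np

rng = np.random.default_rng(4)
lam, tau, reps = 1.0, 10.0, 100_000
covering_gaps = []
for _ in range(reps):
    gaps = rng.exponential(1 / lam, size=50)  # 50 gaps >> tau almost surely
    times = np.cumsum(gaps)
    k = np.searchsorted(times, tau)           # first event after tau
    covering_gaps.append(gaps[k])             # the gap that covers tau

print(np.mean(covering_gaps))                 # close to 2/lam, not 1/lam
```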

  2. Atomistic study of two-level systems in amorphous silica

    NASA Astrophysics Data System (ADS)

    Damart, T.; Rodney, D.

    2018-01-01

    Internal friction is analyzed in an atomic-scale model of amorphous silica. The potential energy landscape of more than 100 glasses is explored to identify a sample of about 700 two-level systems (TLSs). We discuss the properties of TLSs, particularly their energy asymmetry and barrier as well as their deformation potential, computed as longitudinal and transverse averages of the full deformation potential tensors. The discrete sampling is used to predict dissipation in the classical regime. Comparison with experimental data shows a better agreement with poorly relaxed thin films than well relaxed vitreous silica, as expected from the large quench rates used to produce numerical glasses. The TLSs are categorized in three types that are shown to affect dissipation in different temperature ranges. The sampling is also used to discuss critically the usual approximations employed in the literature to represent the statistical properties of TLSs.

  3. Anthropic selection and the habitability of planets orbiting M and K dwarfs

    NASA Astrophysics Data System (ADS)

    Waltham, Dave

    2011-10-01

    The Earth may have untypical characteristics which were necessary preconditions for the emergence of life and, ultimately, intelligent observers. This paper presents a rigorous procedure for quantifying such "anthropic selection" effects by comparing Earth's properties to those of exoplanets. The hypothesis that there is anthropic selection for stellar mass (i.e. planets orbiting stars with masses within a particular range are more favourable for the emergence of observers) is then tested. The results rule out the expected strong selection for low mass stars which would result, all else being equal, if the typical timescale for the emergence of intelligent observers is very long. This indicates that the habitable zone of small stars may be less hospitable for intelligent life than the habitable zone of solar-mass stars. Additional planetary properties can also be analyzed, using the approach introduced here, once relatively complete and unbiased statistics are made available by current and planned exoplanet characterization projects.

  4. Modelling plant species distribution in alpine grasslands using airborne imaging spectroscopy

    PubMed Central

    Pottier, Julien; Malenovský, Zbyněk; Psomas, Achilleas; Homolová, Lucie; Schaepman, Michael E.; Choler, Philippe; Thuiller, Wilfried; Guisan, Antoine; Zimmermann, Niklaus E.

    2014-01-01

    Remote sensing using airborne imaging spectroscopy (AIS) is known to retrieve fundamental optical properties of ecosystems. However, the value of these properties for predicting plant species distribution remains unclear. Here, we assess whether such data can add value to topographic variables for predicting plant distributions in French and Swiss alpine grasslands. We fitted statistical models with high spectral and spatial resolution reflectance data and tested four optical indices sensitive to leaf chlorophyll content, leaf water content and leaf area index. We found moderate added-value of AIS data for predicting alpine plant species distribution. Contrary to expectations, differences between species distribution models (SDMs) were not linked to their local abundance or phylogenetic/functional similarity. Moreover, spectral signatures of species were found to be partly site-specific. We discuss current limits of AIS-based SDMs, highlighting issues of scale and informational content of AIS data. PMID:25079495

  5. Representing the thermal state in time-dependent density functional theory

    DOE PAGES

    Modine, N. A.; Hatcher, R. M.

    2015-05-28

    Classical molecular dynamics (MD) provides a powerful and widely used approach to determining thermodynamic properties by integrating the classical equations of motion of a system of atoms. Time-Dependent Density Functional Theory (TDDFT) provides a powerful and increasingly useful approach to integrating the quantum equations of motion for a system of electrons. TDDFT efficiently captures the unitary evolution of a many-electron state by mapping the system into a fictitious non-interacting system. In analogy to MD, one could imagine obtaining the thermodynamic properties of an electronic system from a TDDFT simulation in which the electrons are excited from their ground state by a time-dependent potential and then allowed to evolve freely in time while statistical data are captured from periodic snapshots of the system. For a variety of systems (e.g., many metals), the electrons reach an effective state of internal equilibrium due to electron-electron interactions on a time scale that is short compared to electron-phonon equilibration. During the initial time-evolution of such systems following electronic excitation, electron-phonon interactions should be negligible, and therefore, TDDFT should successfully capture the internal thermalization of the electrons. However, it is unclear how TDDFT represents the resulting thermal state. In particular, the thermal state is usually represented in quantum statistical mechanics as a mixed state, while the occupations of the TDDFT wave functions are fixed by the initial state in TDDFT. Two key questions involve (1) reformulating quantum statistical mechanics so that thermodynamic expectations can be obtained as an unweighted average over a set of many-body pure states and (2) constructing a family of non-interacting (single determinant) TDDFT states that approximate the required many-body states for the canonical ensemble. In Section II, we will address these questions by first demonstrating that thermodynamic expectations can be evaluated by averaging over certain many-body pure states, which we will call thermal states, and then constructing TDDFT states that approximate these thermal states. In Section III, we will present some numerical tests of the resulting theory, and in Section IV, we will summarize our main results and discuss some possible future directions for this work.

  6. Predictive Coding: A Fresh View of Inhibition in the Retina

    NASA Astrophysics Data System (ADS)

    Srinivasan, M. V.; Laughlin, S. B.; Dubs, A.

    1982-11-01

    Interneurons exhibiting centre-surround antagonism within their receptive fields are commonly found in peripheral visual pathways. We propose that this organization enables the visual system to encode spatial detail in a manner that minimizes the deleterious effects of intrinsic noise, by exploiting the spatial correlation that exists within natural scenes. The antagonistic surround takes a weighted mean of the signals in neighbouring receptors to generate a statistical prediction of the signal at the centre. The predicted value is subtracted from the actual centre signal, thus minimizing the range of outputs transmitted by the centre. In this way the entire dynamic range of the interneuron can be devoted to encoding a small range of intensities, thus rendering fine detail detectable against intrinsic noise injected at later stages in processing. This predictive encoding scheme also reduces spatial redundancy, thereby enabling the array of interneurons to transmit a larger number of distinguishable images, taking into account the expected structure of the visual world. The profile of the required inhibitory field is derived from statistical estimation theory. This profile depends strongly upon the signal:noise ratio and weakly upon the extent of lateral spatial correlation. The receptive fields that are quantitatively predicted by the theory resemble those of X-type retinal ganglion cells and show that the inhibitory surround should become weaker and more diffuse at low intensities. The latter property is unequivocally demonstrated in the first-order interneurons of the fly's compound eye. The theory is extended to the time domain to account for the phasic responses of fly interneurons. These comparisons suggest that, in the early stages of processing, the visual system is concerned primarily with coding the visual image to protect against subsequent intrinsic noise, rather than with reconstructing the scene or extracting specific features from it. The treatment emphasizes that a neuron's dynamic range should be matched to both its receptive field and the statistical properties of the visual pattern expected within this field. Finally, the analysis is synthetic because it is an extension of the background suppression hypothesis (Barlow & Levick 1976), satisfies the redundancy reduction hypothesis (Barlow 1961 a, b) and is equivalent to deblurring under certain conditions (Ratliff 1965).
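    A toy sketch of the core operation described above: the surround takes a weighted mean of neighbouring signals as a statistical prediction of the centre, and only the residual is transmitted. The 1-D "scene" and the surround weights are illustrative choices.

```python
# Predictive coding in 1-D: predict each centre from its neighbours and
# transmit the residual, whose variance is far smaller for smooth scenes.
import numpy as np

rng = np.random.default_rng(5)
# Spatially correlated 1-D intensity profile (smoothed noise).
scene = np.convolve(rng.normal(size=256), np.ones(9) / 9, mode="same")

surround = np.array([0.25, 0.25, 0.0, 0.25, 0.25])  # neighbours only
prediction = np.convolve(scene, surround, mode="same")
residual = scene - prediction                        # centre minus surround

print(scene.var(), residual.var())  # residual needs far less dynamic range
```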

  7. Challenges in Mathematics and Statistics Teaching Underpinned by Student-Lecturer Expectations

    ERIC Educational Resources Information Center

    Parashar, Deepak

    2014-01-01

    This study is motivated by the desire to address some of the enormous challenges faced by the students as well as the lecturer in fulfilling their respective expectations and duties demanded by the process of learning--teaching of mathematics and statistics within the framework of the constraining schedules laid down by the academic institutions…

  8. Exploratory analysis of rainfall events in Coimbra, Portugal: variability of raindrop characteristics

    NASA Astrophysics Data System (ADS)

    Carvalho, S. C. P.; de Lima, M. I. P.; de Lima, J. L. M. P.

    2012-04-01

    Laser disdrometers can efficiently monitor rainfall characteristics at small temporal scales, providing data on rain intensity, raindrop diameter and fall speed, and raindrop counts over time. This type of data allows for an increased understanding of the rainfall structure at small time scales. Of particular interest for many hydrological applications is the characterization of the properties of extreme events, including the intra-event variability, which are affected by different factors (e.g. geographical location, rainfall generating mechanisms). These properties depend on the microphysical, dynamical and kinetic processes that interact to produce rain. In this study we explore rainfall data obtained during two years with a laser disdrometer installed in the city of Coimbra, in the centre region of mainland Portugal. The equipment was developed by Thies Clima. The data temporal resolution is one minute. Descriptive statistics of time series of raindrop diameter (D), fall speed, kinetic energy, and rain rate were studied at the event scale; for the different variables, the average, maximum, minimum, median, variance, standard deviation, quartiles, coefficient of variation, skewness and kurtosis were determined. The empirical raindrop size distribution, N(D), was also calculated. Additionally, the parameterization of rainfall was attempted by investigating the applicability of different theoretical statistical distributions to the empirical data (e.g. exponential, gamma and lognormal distributions). As expected, preliminary results show that rainfall properties and structure vary with rainfall type and weather conditions over the year. Although only two years were investigated, some insight into the structure of different rain events was already obtained.
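    A brief sketch of the distribution-fitting step, assuming a synthetic sample of drop diameters in place of disdrometer data; each candidate law is fitted by maximum likelihood and screened with a Kolmogorov-Smirnov statistic.

```python
# Fit exponential, gamma, and lognormal laws to drop diameters and rank
# them by KS goodness of fit. The sample is synthetic, not Coimbra data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)
diam = rng.gamma(shape=2.5, scale=0.5, size=2000)  # toy drop diameters, mm

for name, dist in [("expon", stats.expon),
                   ("gamma", stats.gamma),
                   ("lognorm", stats.lognorm)]:
    params = dist.fit(diam, floc=0)                # fix location at zero
    ks = stats.kstest(diam, dist.name, args=params)
    print(f"{name:>8}: KS={ks.statistic:.3f} p={ks.pvalue:.3f}")
```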

  9. Contemporaneous disequilibrium of bio-optical properties in the Southern Ocean

    NASA Astrophysics Data System (ADS)

    Kahru, Mati; Lee, Zhongping; Mitchell, B. Greg

    2017-03-01

    Significant changes in satellite-detected net primary production (NPP, mg C m^-2 d^-1) were observed in the Southern Ocean during 2011-2016: an increase in the Pacific sector and a decrease in the Atlantic sector. While no clear physical forcing was identified, we hypothesize that the changes in NPP were associated with changes in the phytoplankton community and reflected in the concomitant bio-optical properties. Satellite algorithms for chlorophyll a concentration (Chl a, mg m^-3) use a combination of estimates of the remote sensing reflectance Rrs(λ) that are statistically fitted to a global reference data set. In any particular region or point in space/time the estimate produced by the global "mean" algorithm can deviate from the true value. Reflectance anomaly (RA) is supposed to remove the first-order variability in Rrs(λ) associated with Chl a and reveal bio-optical properties that are due to the composition of phytoplankton and associated materials. Time series of RA showed variability at multiple scales, including the sensor's life span, multiyear, and annual scales. Models of plankton functional types using estimated Chl a as input cannot be expected to correctly resolve regional and seasonal anomalies due to biases in the Chl a estimate that they are based on. While a statistical model using RA(λ) time series can predict the time series of NPP with high accuracy (R2 = 0.82) in both the Pacific and Atlantic regions, the underlying mechanisms in terms of phytoplankton groups and the associated materials remain elusive.

  10. Psychometric evaluation of the Korean Version of the Self-Efficacy for Exercise Scale for older adults.

    PubMed

    Choi, Mona; Ahn, Sangwoo; Jung, Dukyoo

    2015-01-01

    We evaluated the psychometric properties of the Korean version of the Self-Efficacy for Exercise Scale (SEE-K). The SEE-K consists of nine items and was translated into Korean using the forward-backward translation method. We administered it to 212 community-dwelling older adults along with measures of outcome expectation for exercise, quality of life, and physical activity. The validity was determined using confirmatory factor analysis and Rasch analysis with INFIT and OUTFIT statistics, which showed acceptable model fit. The concurrent validity was confirmed according to positive correlations between the SEE-K, outcome expectation for exercise, and quality of life. Furthermore, the high physical activity group had higher SEE-K scores. Finally, the reliability of the SEE-K was deemed acceptable based on Cronbach's alpha, coefficients of determination, and person and item separation indices with reliability. Thus, the SEE-K appears to have satisfactory validity and reliability among older adults in South Korea. Copyright © 2015 Elsevier Inc. All rights reserved.
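    For reference, the sketch below computes Cronbach's alpha for a k-item scale on simulated responses shaped like the 212 x 9 SEE-K data; the responses are invented, not the study's.

```python
# Cronbach's alpha: (k/(k-1)) * (1 - sum of item variances / total variance).
import numpy as np

rng = np.random.default_rng(7)
ability = rng.normal(size=(212, 1))
items = ability + rng.normal(scale=0.8, size=(212, 9))  # toy 9-item responses

k = items.shape[1]
item_vars = items.var(axis=0, ddof=1).sum()
total_var = items.sum(axis=1).var(ddof=1)
alpha = (k / (k - 1)) * (1 - item_vars / total_var)
print(round(alpha, 3))
```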

  11. Redefining the lower statistical limit in x-ray phase-contrast imaging

    NASA Astrophysics Data System (ADS)

    Marschner, M.; Birnbacher, L.; Willner, M.; Chabior, M.; Fehringer, A.; Herzen, J.; Noël, P. B.; Pfeiffer, F.

    2015-03-01

    Phase-contrast x-ray computed tomography (PCCT) is currently investigated and developed as a potentially very interesting extension of conventional CT, because it promises to provide high soft-tissue contrast for weakly absorbing samples. For data acquisition, several images at different grating positions are combined to obtain a phase-contrast projection. For short exposure times, which are necessary for lower radiation dose, the photon counts in a single stepping position are very low. In this case, the currently used phase retrieval does not provide reliable results for some pixels. This uncertainty results in statistical phase wrapping, which leads to a higher standard deviation in the phase-contrast projections than theoretically expected. For even lower statistics, the phase retrieval breaks down completely and the phase information is lost. New measurement procedures rely on a linear approximation of the sinusoidal phase stepping curve around the zero crossings; in this case only two images are acquired to obtain the phase-contrast projection. The approximation is only valid for small phase values, but typically nearly all pixels are within this regime due to the differential nature of the signal. We examine the statistical properties of a linear approximation method and illustrate by simulation and experiment that the lower statistical limit can be redefined using this method. This means that the phase signal can be retrieved even with very low photon counts and statistical phase wrapping can be avoided. This is an important step towards enhanced image quality in PCCT with very low photon counts.
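    A small sketch contrasting the two retrievals discussed above: the full phase-stepping estimate from the first Fourier component of the stepping curve, and the two-image linear approximation valid for small phases near the zero crossings. The stepping curve is synthetic and noiseless, and the modulation amplitude b is assumed known (in practice it would come from reference scans).

```python
# Phase retrieval from a sinusoidal stepping curve I_k = a + b*sin(x_k + phi):
# (1) full N-step fit via the first Fourier component,
# (2) two-point linear approximation, sin(phi) ~ phi near the zero crossings.
import numpy as np

N, phi_true, a, b = 8, 0.15, 100.0, 40.0  # steps, rad, offset, amplitude
x = 2 * np.pi * np.arange(N) / N
counts = a + b * np.sin(x + phi_true)      # ideal stepping curve

# Full phase stepping: first Fourier component of the curve.
phi_full = np.arctan2(np.sum(counts * np.cos(x)), np.sum(counts * np.sin(x)))

# Two-image approximation with samples at x = 0 and x = pi:
phi_lin = (counts[0] - counts[N // 2]) / (2 * b)

print(phi_true, phi_full, phi_lin)
```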

  12. Galaxy Evolution in the Radio Band: The Role of Star-forming Galaxies and Active Galactic Nuclei

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mancuso, C.; Prandoni, I.; Lapi, A.

    We investigate the astrophysics of radio-emitting star-forming galaxies and active galactic nuclei (AGNs) and elucidate their statistical properties in the radio band, including luminosity functions, redshift distributions, and number counts at sub-mJy flux levels, which will be crucially probed by next-generation radio continuum surveys. Specifically, we exploit the model-independent approach by Mancuso et al. to compute the star formation rate functions, the AGN duty cycles, and the conditional probability of a star-forming galaxy to host an AGN with given bolometric luminosity. Coupling these ingredients with the radio emission properties associated with star formation and nuclear activity, we compute relevant statistics at different radio frequencies and disentangle the relative contribution of star-forming galaxies and AGNs in different radio luminosity, radio flux, and redshift ranges. Finally, we highlight that radio-emitting star-forming galaxies and AGNs are expected to host supermassive black holes accreting with different Eddington ratio distributions and to occupy different loci in the galaxy main-sequence diagrams. These specific predictions are consistent with current data sets but need to be tested with larger statistics via future radio data with multiband coverage on wide areas, as will become routinely achievable with the advent of the Square Kilometre Array and its precursors.

  13. Correlation between solar flare productivity and photospheric vector magnetic fields

    NASA Astrophysics Data System (ADS)

    Cui, Yanmei; Wang, Huaning

    2008-11-01

    Studying the statistical correlation between solar flare productivity and photospheric magnetic fields is important and necessary. It is helpful for setting up a practical flare forecast model based on magnetic properties and for improving the physical understanding of solar flare eruptions. In a previous study ([Cui, Y.M., Li, R., Zhang, L.Y., He, Y.L., Wang, H.N. Correlation between solar flare productivity and photospheric magnetic field properties 1. Maximum horizontal gradient, length of neutral line, number of singular points. Sol. Phys. 237, 45-59, 2006]; hereafter 'Paper I'), three measures (the maximum horizontal gradient, the length of the neutral line, and the number of singular points) were computed from 23990 SOHO/MDI longitudinal magnetograms. The statistical relationship between solar flare productivity and these three measures is well fitted with sigmoid functions. In the current work, three further measures (the length of the strong-shear neutral line, the total unsigned current, and the total unsigned current helicity) are computed from 1353 vector magnetograms observed at Huairou Solar Observing Station. The relationship between solar flare productivity and these three measures can also be well fitted with sigmoid functions. These results are expected to be beneficial to future operational flare forecasting models.
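
    A hedged sketch of fitting a sigmoid to (measure, productivity) pairs, with synthetic data standing in for the magnetogram-derived measures:

      import numpy as np
      from scipy.optimize import curve_fit

      def sigmoid(x, a, b, x0):
          # flare productivity rises and saturates as the magnetic measure grows
          return a / (1.0 + np.exp(-b * (x - x0)))

      # Hypothetical (measure, productivity) pairs, e.g. length of the strong-shear
      # neutral line versus mean flare productivity
      x = np.linspace(0, 10, 50)
      y = sigmoid(x, 2.0, 1.2, 5.0) + np.random.default_rng(2).normal(scale=0.05, size=x.size)

      params, _ = curve_fit(sigmoid, x, y, p0=[1.0, 1.0, float(np.median(x))])
      print("fitted a, b, x0:", params.round(2))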

  14. Detector noise statistics in the non-linear regime

    NASA Technical Reports Server (NTRS)

    Shopbell, P. L.; Bland-Hawthorn, J.

    1992-01-01

    The statistical behavior of an idealized linear detector in the presence of threshold and saturation levels is examined. It is assumed that the noise is governed by the statistical fluctuations in the number of photons emitted by the source during an exposure. Since physical detectors cannot have infinite dynamic range, our model illustrates that all devices have non-linear regimes, particularly at high count rates. The primary effect is a decrease in the statistical variance about the mean signal due to a portion of the expected noise distribution being removed via clipping. Higher order statistical moments are also examined, in particular, skewness and kurtosis. In principle, the expected distortion in the detector noise characteristics can be calibrated using flatfield observations with count rates matched to the observations. For this purpose, some basic statistical methods that utilize Fourier analysis techniques are described.
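
    The variance-reduction effect of clipping is easy to reproduce; a minimal simulation (hypothetical count rate and saturation level):

      import numpy as np
      from scipy.stats import skew, kurtosis

      rng = np.random.default_rng(3)
      counts = rng.poisson(lam=5000, size=100000)    # photon counts per pixel

      saturation = 5100                              # detector full-well / ADC limit
      clipped = np.minimum(counts, saturation)

      # Clipping removes part of the upper tail, so the variance drops below the
      # Poisson expectation (variance = mean) and higher moments are distorted
      print("variance:", counts.var().round(1), "->", clipped.var().round(1))
      print("skewness:", round(skew(counts), 3), "->", round(skew(clipped), 3))
      print("excess kurtosis:", round(kurtosis(counts), 3), "->", round(kurtosis(clipped), 3))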

  15. Spinal appearance questionnaire: factor analysis, scoring, reliability, and validity testing.

    PubMed

    Carreon, Leah Y; Sanders, James O; Polly, David W; Sucato, Daniel J; Parent, Stefan; Roy-Beaudry, Marjolaine; Hopkins, Jeffrey; McClung, Anna; Bratcher, Kelly R; Diamond, Beverly E

    2011-08-15

    Cross-sectional study. This study presents the factor analysis of the Spinal Appearance Questionnaire (SAQ) and its psychometric properties. Although the SAQ has been administered to a large sample of patients with adolescent idiopathic scoliosis (AIS) treated surgically, its psychometric properties have not been fully evaluated. This study presents the factor analysis and scoring of the SAQ and evaluates its psychometric properties. The SAQ and the Scoliosis Research Society-22 (SRS-22) were administered to AIS patients who were being observed, braced, or scheduled for surgery. Standard demographic data and radiographic measures, including Lenke type and curve magnitude, were also collected. Of the 1802 patients, 83% were female, with a mean age of 14.8 years and a mean initial Cobb angle of 55.8° (range, 0°-123°). Of the 32 items of the SAQ, 15 loaded on two factors with consistent and significant correlations across all Lenke types: an Appearance factor (items 1-10) and an Expectations factor (items 12-15). Responses are summed, giving a range of 5 to 50 for the Appearance domain and 5 to 20 for the Expectations domain. Cronbach's α was 0.88 for both domains and the Total score, with a test-retest reliability of 0.81 for Appearance and 0.91 for Expectations. Correlations with major curve magnitude were higher for the SAQ Appearance and SAQ Total scores than for the SRS Appearance and SRS Total scores. The SAQ and SRS-22 scores were statistically significantly different in patients who were scheduled for surgery compared with those who were observed or braced. The SAQ is a valid measure of self-image in patients with AIS, with greater correlation to curve magnitude than the SRS Appearance and Total scores. It also discriminates patients who require surgery from those who do not.

  16. The statistical average of optical properties for alumina particle cluster in aircraft plume

    NASA Astrophysics Data System (ADS)

    Li, Jingying; Bai, Lu; Wu, Zhensen; Guo, Lixin

    2018-04-01

    We establish a model for the lognormal distribution of the monomer radius and number of alumina particle clusters in a plume. Based on Multi-Sphere T-Matrix (MSTM) theory, we provide a method for finding the statistical average of the optical properties of alumina particle clusters in a plume, analyze the effect of different distributions and different detection wavelengths on this statistical average, and compare the statistical average optical properties under the alumina particle cluster model established in this study with those under three simplified alumina particle models. The calculation results show that the monomer number of an alumina particle cluster and its size distribution have a considerable effect on its statistical average optical properties. The statistical averages of the optical properties at common detection wavelengths exhibit clear differences, which have a great effect on modeling the IR and UV radiation properties of the plume. Compared with the three simplified models, the alumina particle cluster model presented here features both higher extinction and scattering efficiencies. An accurate description of the scattering properties of alumina particles in an aircraft plume is therefore of great significance in the study of plume radiation properties.
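
    A sketch of the statistical-averaging step only: the per-cluster extinction efficiency below is a toy stand-in, since the paper obtains that quantity from MSTM computations, and all distribution parameters are hypothetical:

      import numpy as np

      rng = np.random.default_rng(5)

      # Lognormal monomer radius distribution (microns); parameters hypothetical
      radii = rng.lognormal(mean=np.log(0.5), sigma=0.4, size=100000)

      def q_ext(r, wavelength=2.0):
          # Toy stand-in for the per-cluster extinction efficiency; the paper
          # obtains this from MSTM computations, not a closed form
          x = 2 * np.pi * r / wavelength             # size parameter
          return 2 * (1 - np.sinc(x / np.pi))        # np.sinc(t) = sin(pi t)/(pi t)

      # Statistical average over the size distribution
      print("ensemble <Q_ext> =", round(q_ext(radii).mean(), 3))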

  17. How patient and carer expectations of orthodontic treatment vary with ethnicity.

    PubMed

    Sadek, Sarah; Newton, Tim; Sayers, Mark

    2015-09-01

    To investigate whether the orthodontic treatment expectations of Black British children and their primary carers differ from those of White British children and their primary carers. A hospital orthodontic department (Queen Mary's Hospital, Sidcup, London, UK). Patients aged between 12 and 14 years who had not received fixed orthodontic appliance treatment, and their accompanying primary carers. Informed consent was obtained from 100 patients and their primary carers, who completed a psychometrically validated questionnaire to measure their expectations before a new-patient orthodontic consultation. This cohort consisted of 50 Black British patients and their primary carers and 50 White British patients and their primary carers. Mean responses from patients and their primary carers for each ethnic group were compared using the independent-groups t-test. Statistically significant differences were found between the two ethnic groups. The greatest differences occurred between Black British patients and their primary carers, and between Black British and White British primary carers. Patients tended to have similar orthodontic expectations. There were no statistically significant differences in expectations between White British children and their primary carers. Differences in expectations of orthodontic treatment were more common between Black British and White British primary carers than between their children. White British primary carers had higher expectations at their child's initial appointment and expected dental extractions to be part of the orthodontic treatment plan. These differences have implications for the provision of orthodontic care. A clinician's understanding of patients' and their primary carers' expectations at the start of treatment can help in the quality and delivery of the orthodontic care provided.

  18. On global solutions of the random Hamilton-Jacobi equations and the KPZ problem

    NASA Astrophysics Data System (ADS)

    Bakhtin, Yuri; Khanin, Konstantin

    2018-04-01

    In this paper, we discuss possible qualitative approaches to the problem of KPZ universality. Throughout the paper, our point of view is based on the geometrical and dynamical properties of minimisers and shocks forming interlacing tree-like structures. We believe that KPZ universality can be explained in terms of the statistics of these structures evolving in time. The paper is focussed on the setting of the random Hamilton-Jacobi equations. We formulate several conjectures concerning global solutions and discuss how their properties are connected to the KPZ scalings in dimension 1+1. In the case of general viscous Hamilton-Jacobi equations with non-quadratic Hamiltonians, we define generalised directed polymers. We expect that their behaviour is similar to the behaviour of classical directed polymers, and present arguments in favour of this conjecture. We also define a new renormalisation transformation in purely geometrical terms and discuss conjectural properties of the corresponding fixed points. Most of our conjectures are wide open, supported only by partial rigorous results for particular models.

  19. On the Evolution of Dark Matter Halo Properties Following Major and Minor Mergers

    NASA Astrophysics Data System (ADS)

    Wu, Peter; Zhang, Shawn; Lee, Christoph; Primack, Joel

    2018-01-01

    We conducted an analysis of dark matter halo properties following major and minor mergers to advance our understanding of halo evolution. In this work, we analyzed ~80,000 dark matter halos from the Bolshoi-Planck cosmological simulation and studied halo evolution during relaxation after major mergers. We then applied a Gaussian filter to the property evolution and characterized peak distributions, frequencies, and variabilities for several halo properties, including centering, spin, shape (prolateness), scale radius, and virial ratio. However, some halos experienced relaxation without the presence of major mergers. We hypothesized that this was due to minor mergers unrecorded by the simulation analysis. By using property peaks to create a novel merger detection algorithm, we attempted to find minor mergers and match them to the unaccounted-for relaxed halos. Not only did we find evidence that minor mergers were the cause, but we also found similarities between major and minor merger effects, showing the significance of minor mergers for future studies. We expect our dark matter merger statistics to ultimately serve as vital parameters for better understanding galaxy formation and evolution. Most of this work was carried out by high school students working under the auspices of the Science Internship Program (SIP) at UC Santa Cruz.

  20. Predictability of spatio-temporal patterns in a lattice of coupled FitzHugh–Nagumo oscillators

    PubMed Central

    Grace, Miriam; Hütt, Marc-Thorsten

    2013-01-01

    In many biological systems, variability of the components can be expected to outrank statistical fluctuations in the shaping of self-organized patterns. In pioneering work in the late 1990s, it was hypothesized that a drift of cellular parameters (along a ‘developmental path’), together with differences in cell properties (‘desynchronization’ of cells on the developmental path) can establish self-organized spatio-temporal patterns (in their example, spiral waves of cAMP in a colony of Dictyostelium discoideum cells) starting from a homogeneous state. Here, we embed a generic model of an excitable medium, a lattice of diffusively coupled FitzHugh–Nagumo oscillators, into a developmental-path framework. In this minimal model of spiral wave generation, we can now study the predictability of spatio-temporal patterns from cell properties as a function of desynchronization (or ‘spread’) of cells along the developmental path and the drift speed of cell properties on the path. As a function of drift speed and desynchronization, we observe systematically different routes towards fully established patterns, as well as strikingly different correlations between cell properties and pattern features. We show that the predictability of spatio-temporal patterns from cell properties contains important information on the pattern formation process as well as on the underlying dynamical system. PMID:23349439
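
    For readers unfamiliar with the underlying model, a minimal Euler-stepped lattice of diffusively coupled FitzHugh-Nagumo units (parameter values illustrative, not those of the paper):

      import numpy as np

      n, dt, d = 64, 0.05, 0.1          # lattice size, time step, diffusion
      eps, a = 0.08, 0.7                # FitzHugh-Nagumo time scale and offset
      rng = np.random.default_rng(15)
      v = rng.normal(scale=0.1, size=(n, n))    # activator
      w = np.zeros((n, n))                      # inhibitor

      def laplacian(u):
          # discrete Laplacian with periodic boundaries
          return (np.roll(u, 1, 0) + np.roll(u, -1, 0)
                  + np.roll(u, 1, 1) + np.roll(u, -1, 1) - 4 * u)

      for _ in range(2000):
          dv = v - v ** 3 / 3 - w + d * laplacian(v)
          dw = eps * (v + a - 0.8 * w)
          v, w = v + dt * dv, w + dt * dw

      print(f"activator range: {v.min():.2f} to {v.max():.2f}")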

  1. Assessing the Independent Contribution of Maternal Educational Expectations to Children’s Educational Attainment in Early Adulthood: A Propensity Score Matching Analysis

    PubMed Central

    Pingault, Jean Baptiste; Côté, Sylvana M.; Petitclerc, Amélie; Vitaro, Frank; Tremblay, Richard E.

    2015-01-01

    Background Parental educational expectations have been associated with children’s educational attainment in a number of long-term longitudinal studies, but whether this relationship is causal has long been debated. The aims of this prospective study were twofold: 1) test whether low maternal educational expectations contributed to failure to graduate from high school; and 2) compare the results obtained using different strategies for accounting for confounding variables (i.e. multivariate regression and propensity score matching). Methodology/Principal Findings The study sample included 1,279 participants from the Quebec Longitudinal Study of Kindergarten Children. Maternal educational expectations were assessed when the participants were aged 12 years. High school graduation – measuring educational attainment – was determined through the Quebec Ministry of Education when the participants were aged 22–23 years. Findings show that when using the most common statistical approach (i.e. multivariate regressions to adjust for a restricted set of potential confounders) the contribution of low maternal educational expectations to failure to graduate from high school was statistically significant. However, when using propensity score matching, the contribution of maternal expectations was reduced and remained statistically significant only for males. Conclusions/Significance The results of this study are consistent with the possibility that the contribution of parental expectations to educational attainment is overestimated in the available literature. This may be explained by the use of a restricted range of potential confounding variables as well as the dearth of studies using appropriate statistical techniques and study designs in order to minimize confounding. Each of these techniques and designs, including propensity score matching, has its strengths and limitations: A more comprehensive understanding of the causal role of parental expectations will stem from a convergence of findings from studies using different techniques and designs. PMID:25803867
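
    A minimal sketch of propensity score matching on simulated data (hypothetical confounders and effect sizes, not the study's covariate set):

      import numpy as np
      from sklearn.linear_model import LogisticRegression
      from sklearn.neighbors import NearestNeighbors

      rng = np.random.default_rng(6)
      n = 1000
      conf = rng.normal(size=(n, 4))                       # hypothetical confounders
      logit = conf @ np.array([0.8, 0.5, -0.3, 0.2])
      low_expect = rng.binomial(1, 1 / (1 + np.exp(-logit)))          # "treatment"
      outcome_logit = 0.5 * low_expect + conf @ np.array([0.6, 0.4, 0.1, -0.2])
      no_diploma = rng.binomial(1, 1 / (1 + np.exp(-outcome_logit)))  # outcome

      # 1) estimate propensity scores from the confounders
      ps = LogisticRegression().fit(conf, low_expect).predict_proba(conf)[:, 1]

      # 2) match each "treated" unit to the control with the closest propensity
      treated = np.flatnonzero(low_expect == 1)
      control = np.flatnonzero(low_expect == 0)
      nn = NearestNeighbors(n_neighbors=1).fit(ps[control, None])
      matched = control[nn.kneighbors(ps[treated, None])[1].ravel()]

      # 3) compare outcomes across matched pairs
      print("matched effect estimate:",
            no_diploma[treated].mean() - no_diploma[matched].mean())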

  2. Agent-based model to rural urban migration analysis

    NASA Astrophysics Data System (ADS)

    Silveira, Jaylson J.; Espíndola, Aquino L.; Penna, T. J. P.

    2006-05-01

    In this paper, we analyze the rural-urban migration phenomenon as it is usually observed in economies in the early stages of industrialization. The analysis is conducted by means of a statistical mechanics approach that builds a computational agent-based model. Agents are placed on a lattice and the connections among them are described via an Ising-like model. Simulations of this computational model show some emergent properties that are common in developing economies, such as transitional dynamics characterized by continuous growth of the urban population, followed by the equalization of expected wages between the rural and urban sectors (the Harris-Todaro equilibrium condition), urban concentration, and an increase in per capita income.

  3. Development and Psychometric Assessment of the Healthcare Provider Cultural Competence Instrument

    PubMed Central

    Schwarz, Joshua L.; Witte, Raymond; Sellers, Sherrill L.; Luzadis, Rebecca A.; Weiner, Judith L.; Domingo-Snyder, Eloiza; Page, James E.

    2015-01-01

    This study presents the measurement properties of 5 scales used in the Healthcare Provider Cultural Competence Instrument (HPCCI). The HPCCI measures a health care provider’s cultural competence along 5 primary dimensions: (1) awareness/sensitivity, (2) behaviors, (3) patient-centered communication, (4) practice orientation, and (5) self-assessment. Exploratory factor analysis demonstrated that the 5 scales were distinct, and within each scale items loaded as expected. Reliability statistics indicated a high level of internal consistency within each scale. The results indicate that the HPCCI effectively measures the cultural competence of health care providers and can provide useful professional feedback for practitioners and organizations seeking to increase a practitioner’s cultural competence. PMID:25911617

  4. Computing the Expected Cost of an Appointment Schedule for Statistically Identical Customers with Probabilistic Service Times

    PubMed Central

    Dietz, Dennis C.

    2014-01-01

    A cogent method is presented for computing the expected cost of an appointment schedule where customers are statistically identical, the service time distribution has known mean and variance, and customer no-shows occur with time-dependent probability. The approach is computationally efficient and can be easily implemented to evaluate candidate schedules within a schedule optimization algorithm. PMID:24605070
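
    The paper's method is analytic; as a point of comparison, here is a Monte Carlo sketch that evaluates the expected cost of one candidate schedule under assumed cost weights and a time-dependent no-show probability (all numbers hypothetical):

      import numpy as np

      rng = np.random.default_rng(7)

      slots = np.arange(0, 180, 20)            # 9 appointment times (minutes)
      p_show = 0.9 - 0.001 * slots             # time-dependent show probability
      mean_s, sd_s = 18.0, 6.0                 # known service-time mean and std
      c_wait, c_idle = 1.0, 2.0                # assumed relative cost weights

      def expected_cost(n_rep=20000):
          total = 0.0
          for _ in range(n_rep):
              t = wait = idle = 0.0
              for arr, p in zip(slots, p_show):
                  if rng.random() > p:         # customer is a no-show
                      continue
                  idle += max(arr - t, 0.0)    # server idle until this arrival
                  start = max(t, arr)
                  wait += start - arr          # customer waiting time
                  t = start + max(rng.normal(mean_s, sd_s), 0.0)
              total += c_wait * wait + c_idle * idle
          return total / n_rep

      print("estimated expected cost:", round(expected_cost(), 1))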

  5. Robust transceiver design for reciprocal M × N interference channel based on statistical linearization approximation

    NASA Astrophysics Data System (ADS)

    Mayvan, Ali D.; Aghaeinia, Hassan; Kazemi, Mohammad

    2017-12-01

    This paper focuses on robust transceiver design for throughput enhancement on the interference channel (IC), under imperfect channel state information (CSI). In this paper, two algorithms are proposed to improve the throughput of the multi-input multi-output (MIMO) IC. Each transmitter and receiver has, respectively, M and N antennas and IC operates in a time division duplex mode. In the first proposed algorithm, each transceiver adjusts its filter to maximize the expected value of signal-to-interference-plus-noise ratio (SINR). On the other hand, the second algorithm tries to minimize the variances of the SINRs to hedge against the variability due to CSI error. Taylor expansion is exploited to approximate the effect of CSI imperfection on mean and variance. The proposed robust algorithms utilize the reciprocity of wireless networks to optimize the estimated statistical properties in two different working modes. Monte Carlo simulations are employed to investigate sum rate performance of the proposed algorithms and the advantage of incorporating variation minimization into the transceiver design.
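
    The Taylor (statistical linearization) step can be illustrated in isolation: first-order propagation of CSI error through a toy SINR-like nonlinearity, checked against Monte Carlo (the function and parameters are invented, not the paper's SINR expression):

      import numpy as np

      rng = np.random.default_rng(16)
      mu, sigma = 2.0, 0.1                        # nominal channel gain, CSI error std

      def f(h):
          # toy SINR-like nonlinearity standing in for the true SINR expression
          return h ** 2 / (h ** 2 + 1.0)

      # First-order Taylor approximation: mean(f) ~ f(mu), var(f) ~ f'(mu)^2 sigma^2
      fp = (f(mu + 1e-6) - f(mu - 1e-6)) / 2e-6   # numerical derivative f'(mu)
      approx_mean, approx_var = f(mu), fp ** 2 * sigma ** 2

      # Monte Carlo reference
      h = rng.normal(mu, sigma, size=200000)
      print("Taylor mean/var:", round(approx_mean, 5), round(approx_var, 7))
      print("MC mean/var:    ", round(f(h).mean(), 5), round(f(h).var(), 7))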

  6. The shape of CMB temperature and polarization peaks on the sphere

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marcos-Caballero, A.; Fernández-Cobos, R.; Martínez-González, E.

    2016-04-01

    We present a theoretical study of CMB temperature peaks, including their effect on the polarization field, and allowing nonzero eccentricity. The formalism is developed in harmonic space and using the covariant derivative on the sphere, which guarantees that the expressions obtained are completely valid at large scales (i.e., no flat approximation). The expected patterns induced by the peak, either in temperature or polarization, are calculated, as well as their covariances. It is found that the eccentricity introduces a quadrupolar dependence in the peak shape, which is proportional to a complex bias parameter b_ε, characterizing the peak asymmetry and orientation. In addition, the one-point statistics of the variables defining the peak on the sphere is reviewed, finding some differences with respect to the flat case for large peaks. Finally, we present a mechanism to simulate constrained CMB maps with a particular peak on the field, which is an interesting tool for analysing the statistical properties of the peaks present in the data.

  7. Dephasing in a 5/2 quantum Hall Mach-Zehnder interferometer due to the presence of neutral edge modes

    NASA Astrophysics Data System (ADS)

    Dinaii, Yehuda; Goldstein, Moshe; Gefen, Yuval

    Non-Abelian statistics is an intriguing feature predicted to characterize quasiparticles in certain topological phases of matter. This property is both fascinating on the theoretical side and the key ingredient for the implementation of future topological quantum computers. A smoking gun manifestation of non-Abelian statistics consists of demonstrating that braiding of quasiparticles leads to transitions among different states in the relevant degenerate Hilbert manifold. This can be achieved utilizing a Mach-Zehnder interferometer, where Coulomb effects can be neglected, and the electric current is expected to carry clear signatures of non-Abelianity. Here we argue that attempts to measure non-Abelian statistics in the prominent quantum Hall fraction of 5/2 may fail; this can be understood by studying the corresponding edge theory at finite temperatures and bias. We find that the presence of neutral modes imposes stronger limitations on the experimental conditions as compared to quantum Hall states that do not support neutral edge modes. We discuss how to overcome this hindrance. Interestingly, neutral-mode-induced dephasing can be quite different in the Pfaffian state as compared to the anti-Pfaffian state, if the neutral and charge velocities are comparable.

  8. Rank Dynamics of Word Usage at Multiple Scales

    NASA Astrophysics Data System (ADS)

    Morales, José A.; Colman, Ewan; Sánchez, Sergio; Sánchez-Puig, Fernanda; Pineda, Carlos; Iñiguez, Gerardo; Cocho, Germinal; Flores, Jorge; Gershenson, Carlos

    2018-05-01

    The recent dramatic increase in online data availability has allowed researchers to explore human culture with unprecedented detail, such as the growth and diversification of language. In particular, it provides statistical tools to explore whether word use is similar across languages, and if so, whether these generic features appear at different scales of language structure. Here we use the Google Books N-grams dataset to analyze the temporal evolution of word usage in several languages. We apply measures proposed recently to study rank dynamics, such as the diversity of N-grams in a given rank, the probability that an N-gram changes rank between successive time intervals, the rank entropy, and the rank complexity. Using different methods, results show that there are generic properties for different languages at different scales, such as a core of words necessary to minimally understand a language. We also propose a null model to explore the relevance of linguistic structure across multiple scales, concluding that N-gram statistics cannot be reduced to word statistics. We expect our results to be useful in improving text prediction algorithms, as well as in shedding light on the large-scale features of language use, beyond linguistic and cultural differences across human populations.
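
    A toy sketch of one of the rank-dynamics measures, rank diversity (the fraction of distinct words seen at a given rank across time intervals); the tiny frequency tables below are invented:

      # Toy ranked word-frequency tables for consecutive time intervals
      tables = [
          ["the", "of", "and", "to", "in"],
          ["the", "and", "of", "in", "to"],
          ["the", "of", "in", "and", "to"],
      ]

      # Rank diversity: number of distinct words occupying rank k over time,
      # normalized by the number of intervals
      for k in range(5):
          words_at_k = {t[k] for t in tables}
          print(f"rank {k + 1}: diversity = {len(words_at_k) / len(tables):.2f}")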

  9. Weak Lensing Peaks in Simulated Light-Cones: Investigating the Coupling between Dark Matter and Dark Energy

    NASA Astrophysics Data System (ADS)

    Giocoli, Carlo; Moscardini, Lauro; Baldi, Marco; Meneghetti, Massimo; Metcalf, Robert B.

    2018-05-01

    In this paper, we study the statistical properties of weak lensing peaks in light-cones generated from cosmological simulations. In order to assess the prospects of this observable as a cosmological probe, we consider simulations that include interacting Dark Energy (hereafter DE) models with a coupling term between DE and Dark Matter. Cosmological models that produce a larger population of massive clusters have more numerous high signal-to-noise peaks; among models with comparable numbers of clusters, those with more concentrated haloes produce more peaks. The most extreme model under investigation shows a difference in peak counts of about 20% with respect to the reference ΛCDM model. We find that peak statistics can be used to distinguish a coupled DE model from a reference one with the same power spectrum normalisation. The differences in the expansion history and the growth rate of structure formation are reflected in their halo counts, non-linear scale features and, through them, in the properties of the lensing peaks. For a source redshift distribution consistent with the expectations of future space-based wide-field surveys, we find that typically seventy percent of the cluster population contributes to weak-lensing peaks with signal-to-noise ratios larger than two, and that the fraction of clusters in peaks approaches one hundred percent for haloes with redshift z ≤ 0.5. Our analysis demonstrates that peak statistics are an important tool for disentangling DE models by accurately tracing the structure formation processes as a function of cosmic time.

  10. Reflection on Training, Experience, and Introductory Statistics: A Mini-Survey of Tertiary Level Statistics Instructors

    ERIC Educational Resources Information Center

    Hassad, Rossi A.

    2006-01-01

    Instructors of statistics who teach non-statistics majors possess varied academic backgrounds, and hence it is reasonable to expect variability in their content knowledge and pedagogical approach. The aim of this study was to determine the specific course(s) that contributed most to instructors' understanding of statistics. Courses reported…

  11. What Should Researchers Expect When They Replicate Studies? A Statistical View of Replicability in Psychological Science.

    PubMed

    Patil, Prasad; Peng, Roger D; Leek, Jeffrey T

    2016-07-01

    A recent study of the replicability of key psychological findings is a major contribution toward understanding the human side of the scientific process. Despite the careful and nuanced analysis reported, the simple narrative disseminated by the mass, social, and scientific media was that in only 36% of the studies were the original results replicated. In the current study, however, we showed that 77% of the replication effect sizes reported were within a 95% prediction interval calculated using the original effect size. Our analysis suggests two critical issues in understanding replication of psychological studies. First, researchers' intuitive expectations for what a replication should show do not always match statistical estimates of replication. Second, when the results of original studies are very imprecise, they create wide prediction intervals, and a broad range of replication effects that are consistent with the original estimates. This may lead to effects that replicate successfully, in that replication results are consistent with statistical expectations, but do not provide much information about the size (or existence) of the true effect. In this light, the results of the Reproducibility Project: Psychology can be viewed as statistically consistent with what one might expect when performing a large-scale replication experiment.
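
    One common construction of such a prediction interval for correlation-type effect sizes uses the Fisher z-transform; a sketch assuming this form (the authors' computation may differ in detail):

      import numpy as np

      def replication_prediction_interval(r_orig, n_orig, n_rep):
          # Fisher z-transform; the variance of the difference between the
          # original and replication z-values is the sum of their sampling variances
          z = np.arctanh(r_orig)
          se = np.sqrt(1 / (n_orig - 3) + 1 / (n_rep - 3))
          lo, hi = z - 1.96 * se, z + 1.96 * se
          return np.tanh(lo), np.tanh(hi)

      # An imprecise original (small n) yields a wide interval for the replication
      print(replication_prediction_interval(r_orig=0.30, n_orig=30, n_rep=80))
      print(replication_prediction_interval(r_orig=0.30, n_orig=300, n_rep=80))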

  12. Predicting Success in an Online Course Using Expectancies, Values, and Typical Mode of Instruction

    ERIC Educational Resources Information Center

    Zimmerman, Whitney Alicia

    2017-01-01

    Expectancies of success and values were used to predict success in an online undergraduate-level introductory statistics course. Students who identified as primarily face-to-face learners were compared to students who identified as primarily online learners. Expectancy value theory served as a model. Expectancies of success were operationalized as…

  13. DNA mutation motifs in the genes associated with inherited diseases.

    PubMed

    Růžička, Michal; Kulhánek, Petr; Radová, Lenka; Čechová, Andrea; Špačková, Naďa; Fajkusová, Lenka; Réblová, Kamila

    2017-01-01

    Mutations in human genes can be responsible for inherited genetic disorders and cancer. Mutations can arise due to environmental factors or spontaneously. It has been shown that certain DNA sequences are more prone to mutate. These sites are termed hotspots and exhibit a higher mutation frequency than expected by chance. In contrast, DNA sequences with lower mutation frequencies than expected by chance are termed coldspots. Mutation hotspots are usually derived from a mutation spectrum, which reflects a particular population in which the effect of a common ancestor plays a role. To detect coldspots and hotspots unaffected by population bias, we analysed the presence of germline mutations obtained from the HGMD database in the 5-nucleotide segments repeatedly occurring in genes associated with common inherited disorders, in particular the PAH, LDLR, CFTR, F8, and F9 genes. Statistically significant sequences (mutational motifs) rarely associated with mutations (coldspots) and frequently associated with mutations (hotspots) exhibited characteristic sequence patterns; for example, coldspots contained a purine tract while hotspots showed alternating purine-pyrimidine bases, often with the presence of a CpG dinucleotide. Using molecular dynamics simulations and free energy calculations, we analysed the global bending properties of two selected coldspots and two hotspots with a G/T mismatch. We observed that the coldspots were inherently more flexible than the hotspots. We assume that this property might be critical for effective mismatch repair, as DNA with a mutation recognized by the MutSα protein is noticeably bent.
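
    A toy sketch of the segment-counting step: tally 5-nucleotide windows and how often each co-occurs with a mutation site (the sequence and positions below are invented):

      from collections import Counter

      seq = "ATGCGCGTACGCGCGTATATAAGGCGCGAATTGCGCG"   # toy gene sequence
      mutated_positions = {5, 12, 19, 26}              # toy mutation sites

      # Count every 5-nucleotide window and how often its centre is mutated
      windows, hits = Counter(), Counter()
      for i in range(len(seq) - 4):
          w = seq[i:i + 5]
          windows[w] += 1
          if i + 2 in mutated_positions:
              hits[w] += 1

      for w, n in windows.most_common(3):
          print(w, "freq:", n, "mutated:", hits[w])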

  14. A sub-ensemble theory of ideal quantum measurement processes

    NASA Astrophysics Data System (ADS)

    Allahverdyan, Armen E.; Balian, Roger; Nieuwenhuizen, Theo M.

    2017-01-01

    In order to elucidate the properties currently attributed to ideal measurements, one must explain how the concept of an individual event with a well-defined outcome may emerge from quantum theory, which deals with statistical ensembles, and how different runs issued from the same initial state may end up with different final states. This so-called "measurement problem" is tackled with two guidelines. On the one hand, the dynamics of the macroscopic apparatus A coupled to the tested system S is described mathematically within a standard quantum formalism, where "q-probabilities" remain devoid of interpretation. On the other hand, interpretative principles, aimed to be minimal, are introduced to account for the expected features of ideal measurements. Most of the five principles stated here, which relate the quantum formalism to physical reality, are straightforward and refer to macroscopic variables. The process can be identified with a relaxation of S + A to thermodynamic equilibrium, not only for a large ensemble E of runs but even for its sub-ensembles. The different mechanisms of quantum statistical dynamics that ensure these types of relaxation are exhibited, and the required properties of the Hamiltonian of S + A are indicated. The additional theoretical information provided by the study of sub-ensembles removes Schrödinger's quantum ambiguity of the final density operator for E, which hinders its direct interpretation, and brings out a commutative behaviour of the pointer observable at the final time. The latter property supports the introduction of a last interpretative principle, needed to switch from the statistical ensembles and sub-ensembles described by quantum theory to individual experimental events. It amounts to identifying some formal "q-probabilities" with ordinary frequencies, but only those which refer to the final indications of the pointer. The desired properties of ideal measurements, in particular the uniqueness of the result for each individual run of the ensemble and von Neumann's reduction, are thereby recovered with economical interpretations. The status of Born's rule involving both A and S is re-evaluated, and the contextuality of quantum measurements is made obvious.

  15. Influence of solar activity on the state of the wheat market in medieval England

    NASA Astrophysics Data System (ADS)

    Pustil'Nik, Lev A.; Din, Gregory Yom

    2004-09-01

    The database of Professor Rogers (1887), which includes wheat prices in England in the Middle Ages, was used to search for a possible influence of solar activity on the wheat market. Our approach was based on the following: (1) the existence of a correlation between the cosmic ray flux entering the terrestrial atmosphere and the cloudiness of the atmosphere; (2) the fact that cosmic ray intensity in the solar system changes with solar activity; (3) the dependence of wheat production on weather conditions as a nonlinear function with threshold transitions; and (4) the observation that a wheat market with a limited supply (as in medieval England) has a highly nonlinear sensitivity to variations in wheat production, with boundary states where small changes in wheat supply could lead to bursts of prices or to falling prices. We present a conceptual model of possible modes for the sensitivity of wheat prices to weather conditions caused by solar cycle variations, and compare expected price fluctuations with price variations recorded in medieval England. We compared the statistical properties of the intervals between wheat price bursts during the years 1249-1703 with the statistical properties of the intervals between the minima of solar cycles during the years 1700-2000. We show that the statistical properties of these two samples are similar, both for characteristics of the distributions and for histograms of the distributions. We also analyze a direct link between wheat prices and solar activity in the 17th century, for which wheat prices and solar activity data (derived from the 10Be isotope) are available. We show that for all 10 time points of solar activity minima the observed prices were higher than prices at the corresponding time points of maximal solar activity (100% sign correlation, at a significance level < 0.2%). We consider these results direct evidence of a causal connection between wheat price bursts and solar activity.
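
    The paper compares distribution characteristics and histograms; as one illustrative formalization of such a comparison, a two-sample Kolmogorov-Smirnov test on hypothetical interval samples:

      import numpy as np
      from scipy.stats import ks_2samp

      rng = np.random.default_rng(8)
      # Hypothetical interval samples (years): between wheat-price bursts and
      # between solar-cycle minima; the real values come from Rogers' price
      # records and the sunspot record
      burst_intervals = rng.gamma(shape=4.0, scale=2.5, size=40)
      minima_intervals = rng.gamma(shape=4.2, scale=2.4, size=27)

      # A two-sample test of whether the two interval distributions are alike
      stat, p = ks_2samp(burst_intervals, minima_intervals)
      print(f"KS statistic = {stat:.2f}, p = {p:.2f}")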

  16. Should policy-makers and managers trust PSI? An empirical validation study of five patient safety indicators in a national health service

    PubMed Central

    2012-01-01

    Background Patient Safety Indicators (PSI) are only modestly used in Spain, partly due to concerns about their empirical properties. This paper provides evidence by answering three questions: a) Are PSI differences across hospitals systematic, rather than random? b) Do PSI measure differences among hospital providers, as opposed to differences among patients? c) Are the measurements able to detect hospitals with a higher than "expected" number of cases? Methods An empirical validation study on administrative data was carried out. All 2005 and 2006 publicly funded hospital discharges were used to retrieve eligible cases for five PSI: death in low-mortality DRGs (MLM); decubitus ulcer (DU); postoperative pulmonary embolism or deep-vein thrombosis (PE-DVT); catheter-related infections (CRI); and postoperative sepsis (PS). The Empirical Bayes statistic (EB) was used to estimate whether the variation was systematic; logistic multilevel modelling determined what proportion of the variation was explained by the hospital; and shrunken residuals, as provided by multilevel modelling, were plotted to flag hospitals performing worse than expected. Results Variation across hospitals was observed to be systematic in all indicators, with EB values ranging from 0.19 (CI95%: 0.12 to 0.28) in PE-DVT to 0.34 (CI95%: 0.25 to 0.45) in DU. A significant proportion of the variance was explained by the hospital once patient case-mix was adjusted: from 6% in MLM (CI95%: 3% to 11%) to 24% (CI95%: 20% to 30%) in CRI. All PSI were able to flag hospitals with rates over the expected, although this capacity decreased when the largest hospitals were analysed. Conclusion The five PSI showed reasonable empirical properties for screening healthcare performance in Spanish hospitals, particularly the largest ones. PMID:22369291
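
    A much-simplified sketch of the shrinkage idea (moment-based rather than the paper's logistic multilevel model) used to flag hospitals worse than expected; all counts and rates are simulated:

      import numpy as np

      rng = np.random.default_rng(9)
      n_hosp = 50
      cases = rng.integers(200, 5000, size=n_hosp)       # discharges per hospital
      true_rate = rng.normal(0.01, 0.003, size=n_hosp).clip(0.001)
      events = rng.binomial(cases, true_rate)

      raw = events / cases
      overall = events.sum() / cases.sum()

      # Empirical-Bayes style shrinkage: small hospitals are pulled toward the
      # overall mean, large hospitals keep rates close to their own data
      within_var = (overall * (1 - overall) / cases)
      between_var = max(raw.var() - within_var.mean(), 1e-8)
      weight = between_var / (between_var + within_var)
      shrunken = overall + weight * (raw - overall)

      flagged = np.where(shrunken > overall + 2 * np.sqrt(between_var))[0]
      print("hospitals flagged as worse than expected:", flagged)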

  17. Sources of computer self-efficacy: The relationship to outcome expectations, computer anxiety, and intention to use computers

    NASA Astrophysics Data System (ADS)

    Antoine, Marilyn V.

    2011-12-01

    The purpose of this research was to extend earlier research on sources of self-efficacy (Lent, Lopez, & Biechke, 1991; Usher & Pajares, 2009) to the information technology domain. The principal investigator examined how Bandura's (1977) sources of self-efficacy information (mastery experience, vicarious experience, verbal persuasion, and physiological states) shape computer self-efficacy beliefs and influence the decision to use or not use computers. The study took place at a mid-sized Historically Black College or University in the South. A convenience sample of 105 undergraduates was drawn from students enrolled in multiple sections of two introductory computer courses. There were 67 females and 38 males. This research was a correlational study of the following variables: sources of computer self-efficacy, general computer self-efficacy, outcome expectations, computer anxiety, and intention to use computers. The principal investigator administered a survey questionnaire containing 52 Likert items to measure the major study variables. Additionally, the survey instrument collected demographic variables such as gender, age, race, intended major, classification, technology use, technology adoption category, and whether the student owns a computer. The results reveal the following: (1) Mastery experience and verbal persuasion had statistically significant relationships to general computer self-efficacy, while vicarious experience and physiological states had non-significant relationships. Mastery experience had the strongest correlation to general computer self-efficacy. (2) All of the sources of computer self-efficacy had statistically significant relationships to personal outcome expectations. Vicarious experience had the strongest correlation to personal outcome expectations. (3) All of the sources of self-efficacy had statistically significant relationships to performance outcome expectations. Vicarious experience had the strongest correlation to performance outcome expectations. (4) Mastery experience and physiological states had statistically significant relationships to computer anxiety, while vicarious experience and verbal persuasion had non-significant relationships. Physiological states had the strongest correlation to computer anxiety. (5) Mastery experience, vicarious experience, and physiological states had statistically significant relationships to intention to use computers, while verbal persuasion had a non-significant relationship. Mastery experience had the strongest correlation to intention to use computers. Gender-related findings indicate that females reported higher average mastery experience, vicarious experience, physiological states, and intention to use computers than males. Females reported lower average general computer self-efficacy, computer anxiety, verbal persuasion, personal outcome expectations, and performance outcome expectations than males. The results of this study can be used to develop strategies for increasing general computer self-efficacy, outcome expectations, and intention to use computers. The results can also be used to develop strategies for reducing computer anxiety.

  18. High-throughput stochastic tensile performance of additively manufactured stainless steel

    DOE PAGES

    Salzbrenner, Bradley C.; Rodelas, Jeffrey M.; Madison, Jonathan D.; ...

    2016-10-29

    An adage within the Additive Manufacturing (AM) community is that “complexity is free”. Complicated geometric features that normally drive manufacturing cost and limit design options are not typically problematic in AM. While geometric complexity is usually viewed from the perspective of part design, this advantage of AM also opens up new options in rapid, efficient material property evaluation and qualification. In the current work, an array of 100 miniature tensile bars are produced and tested for a comparable cost and in comparable time to a few conventional tensile bars. With this technique, it is possible to evaluate the stochastic nature of mechanical behavior. The current study focuses on stochastic yield strength, ultimate strength, and ductility as measured by strain at failure (elongation). However, this method can be used to capture the statistical nature of many mechanical properties including the full stress-strain constitutive response, elastic modulus, work hardening, and fracture toughness. Moreover, the technique could extend to strain-rate and temperature dependent behavior. As a proof of concept, the technique is demonstrated on a precipitation hardened stainless steel alloy, commonly known as 17-4PH, produced by two commercial AM vendors using a laser powder bed fusion process, also commonly known as selective laser melting. Using two different commercial powder bed platforms, the vendors produced material that exhibited slightly lower strength and markedly lower ductility compared to wrought sheet. Moreover, the properties were much less repeatable in the AM materials as analyzed in the context of a Weibull distribution, and the properties did not consistently meet minimum allowable requirements for the alloy as established by AMS. The diminished, stochastic properties were examined in the context of major contributing factors such as surface roughness and internal lack-of-fusion porosity. Lastly, this high-throughput capability is expected to be useful for follow-on extensive parametric studies of factors that affect the statistical reliability of AM components.
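
    A sketch of the Weibull step on simulated elongation data (parameter values hypothetical):

      import numpy as np
      from scipy.stats import weibull_min

      # Hypothetical elongation-at-failure data for 100 miniature tensile bars
      elongation = weibull_min.rvs(c=3.0, scale=0.05, size=100, random_state=1)

      # Fit a two-parameter Weibull (location fixed at zero); a low shape
      # parameter c signals high scatter, i.e. poor repeatability
      c, loc, scale = weibull_min.fit(elongation, floc=0)
      print(f"Weibull shape = {c:.2f}, scale = {scale:.3f}")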

  20. Intellectual Property Needs and Expectations of Traditional Knowledge Holders. WIPO Report on Fact-Finding Missions on Intellectual Property and Traditional Knowledge (1998-1999).

    ERIC Educational Resources Information Center

    2001

    This report presents information compiled by the World Intellectual Property Organization (WIPO) from nine fact-finding missions conducted by WIPO in 1998 and 1999 on the intellectual property (IP) needs and expectations of holders of traditional knowledge. The fact-finding missions (FFMs) were designed to enable WIPO to identify the IP needs and…

  1. Image Statistics and the Representation of Material Properties in the Visual Cortex

    PubMed Central

    Baumgartner, Elisabeth; Gegenfurtner, Karl R.

    2016-01-01

    We explored perceived material properties (roughness, texturedness, and hardness) with a novel approach that compares perception, image statistics and brain activation, as measured with fMRI. We initially asked participants to rate 84 material images with respect to the above mentioned properties, and then scanned 15 of the participants with fMRI while they viewed the material images. The images were analyzed with a set of image statistics capturing their spatial frequency and texture properties. Linear classifiers were then applied to the image statistics as well as the voxel patterns of visually responsive voxels and early visual areas to discriminate between images with high and low perceptual ratings. Roughness and texturedness could be classified above chance level based on image statistics. Roughness and texturedness could also be classified based on the brain activation patterns in visual cortex, whereas hardness could not. Importantly, the agreement in classification based on image statistics and brain activation was also above chance level. Our results show that information about visual material properties is to a large degree contained in low-level image statistics, and that these image statistics are also partially reflected in brain activity patterns induced by the perception of material images. PMID:27582714

  3. Summary goodness-of-fit statistics for binary generalized linear models with noncanonical link functions.

    PubMed

    Canary, Jana D; Blizzard, Leigh; Barry, Ronald P; Hosmer, David W; Quinn, Stephen J

    2016-05-01

    Generalized linear models (GLM) with a canonical logit link function are the primary modeling technique used to relate a binary outcome to predictor variables. However, noncanonical links can offer more flexibility, producing convenient analytical quantities (e.g., probit GLMs in toxicology) and desired measures of effect (e.g., relative risk from log GLMs). Many summary goodness-of-fit (GOF) statistics exist for logistic GLM. Their properties make the development of GOF statistics relatively straightforward, but it can be more difficult under noncanonical links. Although GOF tests for logistic GLM with continuous covariates (GLMCC) have been applied to GLMCCs with log links, we know of no GOF tests in the literature specifically developed for GLMCCs that can be applied regardless of the link function chosen. We generalize the Tsiatis GOF statistic originally developed for logistic GLMCCs (TG) so that it can be applied under any link function. Further, we show that the algebraically related Hosmer-Lemeshow (HL) and Pigeon-Heyse (J2) statistics can be applied directly. In a simulation study, TG, HL, and J2 were used to evaluate the fit of probit, log-log, complementary log-log, and log models, all calculated with a common grouping method. The TG statistic consistently maintained Type I error rates, while those of HL and J2 were often lower than expected if terms with little influence were included. Generally, the statistics had similar power to detect an incorrect model. An exception occurred when a log GLMCC was incorrectly fit to data generated from a logistic GLMCC; in this case, TG had more power than HL or J2.
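
    For concreteness, a minimal implementation of the Hosmer-Lemeshow statistic on simulated data (the standard textbook form; the paper's TG generalization is not reproduced here):

      import numpy as np
      from scipy.stats import chi2

      def hosmer_lemeshow(y, p_hat, g=10):
          # Group observations into g bins by fitted probability, then compare
          # observed and expected event counts per bin
          order = np.argsort(p_hat)
          y, p_hat = y[order], p_hat[order]
          bins = np.array_split(np.arange(len(y)), g)
          obs = np.array([y[b].sum() for b in bins], dtype=float)
          exp = np.array([p_hat[b].sum() for b in bins])
          n = np.array([len(b) for b in bins], dtype=float)
          stat = np.sum((obs - exp) ** 2 / (exp * (1 - exp / n)))
          return stat, chi2.sf(stat, df=g - 2)

      rng = np.random.default_rng(11)
      p_true = 1 / (1 + np.exp(-rng.normal(size=500)))   # well-specified model
      y = rng.binomial(1, p_true)
      print(hosmer_lemeshow(y, p_true))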

  4. Statistical Methods and Sampling Design for Estimating Step Trends in Surface-Water Quality

    USGS Publications Warehouse

    Hirsch, Robert M.

    1988-01-01

    This paper addresses two components of the problem of estimating the magnitude of step trends in surface water quality. The first is finding a robust estimator appropriate to the data characteristics expected in water-quality time series. The J. L. Hodges-E. L. Lehmann class of estimators is found to be robust in comparison to other nonparametric and moment-based estimators. A seasonal Hodges-Lehmann estimator is developed and shown to have desirable properties. Second, the effectiveness of various sampling strategies is examined using Monte Carlo simulation coupled with application of this estimator. The simulation is based on a large set of total phosphorus data from the Potomac River. To assure that the simulated records have realistic properties, the data are modeled in a multiplicative fashion incorporating flow, hysteresis, seasonal, and noise components. The results demonstrate the importance of balancing the length of the two sampling periods and balancing the number of data values between the two periods.
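
    A sketch of one plausible seasonal Hodges-Lehmann-type step-trend estimator (pooled within-season pairwise differences; the paper's exact definition may differ), applied to simulated before/after records:

      import numpy as np

      def seasonal_step_trend(before, after, seas_before, seas_after):
          # Pool all pairwise (after - before) differences computed within the
          # same season, then take their median
          diffs = []
          for s in np.unique(seas_before):
              a = after[seas_after == s]
              b = before[seas_before == s]
              diffs.append((a[:, None] - b[None, :]).ravel())
          return np.median(np.concatenate(diffs))

      rng = np.random.default_rng(12)
      months = np.tile(np.arange(12), 5)                    # 5 years of monthly data
      before = rng.lognormal(mean=0.0, sigma=0.4, size=60)
      after = rng.lognormal(mean=0.25, sigma=0.4, size=60)  # upward step in median
      print("estimated step:", round(seasonal_step_trend(before, after, months, months), 3))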

  5. Correction for faking in self-report personality tests.

    PubMed

    Sjöberg, Lennart

    2015-10-01

    Faking is a common problem in testing with self-report personality tests, especially in high-stakes situations. A possible way to correct for it is statistical control on the basis of social desirability scales. Two such scales were developed and applied in the present paper. It is stressed that statistical models of faking need to be adapted to the properties of the particular personality scales, since such scales correlate with faking to different extents. In four empirical studies of self-report personality tests, correction for faking was investigated. One of the studies was experimental and asked participants to fake or to be honest. In the other studies, job or school applicants were investigated. It was found that the approach to correcting for the effects of faking in self-report personality tests advocated in this paper removed a large share of the effects, about 90%. One study found that faking varied as a function of how important the consequences of the test results could be expected to be, with more high-stakes situations associated with more faking. The latter finding is incompatible with the claim that social desirability scales measure a general personality trait. It is concluded that faking can be measured and that correction for faking, based on such measures, can be expected to remove about 90% of its effects.
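
    The statistical-control idea can be sketched as residualizing the personality score on the social desirability score (simulated data; not the paper's scales or models):

      import numpy as np

      rng = np.random.default_rng(13)
      n = 400
      sd_score = rng.normal(size=n)                  # social desirability scale
      trait = rng.normal(size=n)                     # true personality trait
      observed = trait + 0.6 * sd_score + rng.normal(scale=0.3, size=n)

      # Statistical control: regress the observed scale on the social
      # desirability score and keep the residual as the corrected score
      cov = np.cov(observed, sd_score)
      slope = cov[0, 1] / cov[1, 1]
      corrected = observed - slope * sd_score

      print("corr with faking, before:", round(np.corrcoef(observed, sd_score)[0, 1], 2))
      print("corr with faking, after: ", round(np.corrcoef(corrected, sd_score)[0, 1], 2))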

  6. A statistical approach to recognize source classes for unassociated sources in the first Fermi-LAT catalog

    DOE PAGES

    Ackermann, M.; Ajello, M.; Allafort, A.; ...

    2012-06-15

    The Fermi Large Area Telescope (LAT) First Source Catalog (1FGL) provided spatial, spectral, and temporal properties for a large number of γ-ray sources using a uniform analysis method. After correlating with the most-complete catalogs of source types known to emit γ rays, 630 of these sources are "unassociated" (i.e., have no obvious counterparts at other wavelengths). We employ two statistical analyses of the primary γ-ray characteristics for these unassociated sources in an effort to correlate their γ-ray properties with the active galactic nucleus (AGN) and pulsar populations in 1FGL. Based on the correlation results, we classify 221 AGN-like and 134 pulsar-like sources in the 1FGL unassociated sources. Furthermore, the results of these source "classifications" appear to match the expected source distributions, especially at high Galactic latitudes. While useful for planning future multiwavelength follow-up observations, these analyses use limited inputs, and their predictions should not be considered equivalent to "probable source classes" for these sources. We also discuss multiwavelength results and catalog cross-correlations to date, and provide new source associations for 229 Fermi-LAT sources that had no association listed in the 1FGL catalog. By validating the source classifications against these new associations, we find that the new association matches the predicted source class in ~80% of the sources.
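
    The flavor of such a classification can be sketched with a logistic classifier on two invented gamma-ray features (the actual analyses and features used in the paper differ):

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(14)
      # Invented gamma-ray features: spectral curvature and variability index;
      # pulsars tend to be curved and steady, AGNs power-law-like and variable
      curv = np.r_[rng.normal(1.0, 0.3, 300), rng.normal(0.2, 0.3, 300)]
      var_idx = np.r_[rng.normal(0.0, 0.5, 300), rng.normal(1.5, 0.5, 300)]
      label = np.r_[np.zeros(300), np.ones(300)]     # 0 = pulsar-like, 1 = AGN-like

      clf = LogisticRegression().fit(np.column_stack([curv, var_idx]), label)

      # "Unassociated" sources drawn from a mixture of both populations
      unassoc = rng.normal(loc=0.6, scale=0.7, size=(10, 2))
      print(clf.predict(unassoc))                    # predicted class per source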

  7. A STATISTICAL APPROACH TO RECOGNIZING SOURCE CLASSES FOR UNASSOCIATED SOURCES IN THE FIRST FERMI-LAT CATALOG

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ackermann, M.; Ajello, M.; Allafort, A.

    The Fermi Large Area Telescope (LAT) First Source Catalog (1FGL) provided spatial, spectral, and temporal properties for a large number of {gamma}-ray sources using a uniform analysis method. After correlating with the most-complete catalogs of source types known to emit {gamma} rays, 630 of these sources are 'unassociated' (i.e., have no obvious counterparts at other wavelengths). Here, we employ two statistical analyses of the primary {gamma}-ray characteristics for these unassociated sources in an effort to correlate their {gamma}-ray properties with the active galactic nucleus (AGN) and pulsar populations in 1FGL. Based on the correlation results, we classify 221 AGN-like andmore » 134 pulsar-like sources in the 1FGL unassociated sources. The results of these source 'classifications' appear to match the expected source distributions, especially at high Galactic latitudes. While useful for planning future multiwavelength follow-up observations, these analyses use limited inputs, and their predictions should not be considered equivalent to 'probable source classes' for these sources. We discuss multiwavelength results and catalog cross-correlations to date, and provide new source associations for 229 Fermi-LAT sources that had no association listed in the 1FGL catalog. By validating the source classifications against these new associations, we find that the new association matches the predicted source class in {approx}80% of the sources.« less

  8. A Statistical Approach to Recognizing Source Classes for Unassociated Sources in the First FERMI-LAT Catalog

    NASA Technical Reports Server (NTRS)

    Ackermann, M.; Ajello, M.; Allafort, A.; Antolini, E.; Baldini, L.; Ballet, J.; Barbiellini, G.; Bastieri, D.; Bellazzini, R.; Berenji, B.; ...

    2012-01-01

    The Fermi Large Area Telescope (LAT) First Source Catalog (1FGL) provided spatial, spectral, and temporal properties for a large number of gamma-ray sources using a uniform analysis method. After correlating with the most-complete catalogs of source types known to emit gamma rays, 630 of these sources are "unassociated" (i.e., have no obvious counterparts at other wavelengths). Here, we employ two statistical analyses of the primary gamma-ray characteristics for these unassociated sources in an effort to correlate their gamma-ray properties with the active galactic nucleus (AGN) and pulsar populations in 1FGL. Based on the correlation results, we classify 221 AGN-like and 134 pulsar-like sources in the 1FGL unassociated sources. The results of these source "classifications" appear to match the expected source distributions, especially at high Galactic latitudes. While useful for planning future multiwavelength follow-up observations, these analyses use limited inputs, and their predictions should not be considered equivalent to "probable source classes" for these sources. We discuss multiwavelength results and catalog cross-correlations to date, and provide new source associations for 229 Fermi-LAT sources that had no association listed in the 1FGL catalog. By validating the source classifications against these new associations, we find that the new association matches the predicted source class in approximately 80% of the sources.

  9. A statistical approach to recognize source classes for unassociated sources in the first Fermi-LAT catalog

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ackermann, M.; Ajello, M.; Allafort, A.

    The Fermi Large Area Telescope (LAT) First Source Catalog (1FGL) provided spatial, spectral, and temporal properties for a large number of γ-ray sources using a uniform analysis method. After correlating with the most-complete catalogs of source types known to emit γ rays, 630 of these sources are "unassociated" (i.e., have no obvious counterparts at other wavelengths). We employ two statistical analyses of the primary γ-ray characteristics for these unassociated sources in an effort to correlate their γ-ray properties with the active galactic nucleus (AGN) and pulsar populations in 1FGL. Based on the correlation results, we classify 221 AGN-like and 134 pulsar-like sources in the 1FGL unassociated sources. Furthermore, the results of these source "classifications" appear to match the expected source distributions, especially at high Galactic latitudes. While useful for planning future multiwavelength follow-up observations, these analyses use limited inputs, and their predictions should not be considered equivalent to "probable source classes" for these sources. We also discuss multiwavelength results and catalog cross-correlations to date, and provide new source associations for 229 Fermi-LAT sources that had no association listed in the 1FGL catalog. By validating the source classifications against these new associations, we find that the new association matches the predicted source class in ~80% of the sources.
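
    The statistical analyses in this paper assign tentative classes from γ-ray properties alone. A hedged sketch of that general idea, using a logistic regression on hypothetical features (a spectral curvature and a variability index) with synthetic training labels, not the 1FGL data:

        import numpy as np
        from sklearn.linear_model import LogisticRegression

        rng = np.random.default_rng(0)
        n = 300
        # synthetic training classes: AGN-like sources variable with little spectral
        # curvature, pulsar-like sources curved and steady (illustrative only)
        curvature = np.r_[rng.normal(0.2, 0.1, n), rng.normal(0.8, 0.1, n)]
        variability = np.r_[rng.normal(0.8, 0.15, n), rng.normal(0.2, 0.15, n)]
        X = np.c_[curvature, variability]
        y = np.r_[np.zeros(n), np.ones(n)]            # 0 = AGN-like, 1 = pulsar-like

        clf = LogisticRegression().fit(X, y)
        # "classify" an unassociated source from its measured properties
        p_pulsar = clf.predict_proba([[0.7, 0.25]])[0, 1]
        print(f"P(pulsar-like) = {p_pulsar:.2f}")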

  10. Infrared small target detection with kernel Fukunaga Koontz transform

    NASA Astrophysics Data System (ADS)

    Liu, Rui-ming; Liu, Er-qi; Yang, Jie; Zhang, Tian-hao; Wang, Fang-lin

    2007-09-01

    The Fukunaga-Koontz transform (FKT) was proposed many years ago and can be used to solve two-pattern classification problems successfully. However, few researchers have explicitly extended FKT to a kernel FKT (KFKT). In this paper, we first complete this task. A method based on KFKT is then developed to detect infrared small targets. KFKT is a supervised learning algorithm, so the construction of the training sets is very important; for automatic target detection, synthetic target images and real background images are used to train KFKT. Because KFKT can represent the higher-order statistical properties of images, better detection performance is expected from KFKT than from FKT. Well-designed experiments verify that KFKT outperforms FKT in detecting infrared small targets.
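
    The FKT itself is compact: jointly whiten the two class correlation matrices so that they share an eigenbasis, in which the strongest eigenvectors of one class are the weakest of the other. A numpy sketch on synthetic two-class data (the kernel extension replaces these matrices with their kernel-space counterparts):

        import numpy as np

        rng = np.random.default_rng(0)
        X1 = rng.normal(size=(500, 4)) @ np.diag([3.0, 1.0, 0.5, 0.2])  # class 1
        X2 = rng.normal(size=(500, 4)) @ np.diag([0.2, 0.5, 1.0, 3.0])  # class 2

        R1 = X1.T @ X1 / len(X1)        # class correlation matrices
        R2 = X2.T @ X2 / len(X2)

        # joint whitening: R1 + R2 = Phi D Phi^T, W = Phi D^(-1/2)
        d, Phi = np.linalg.eigh(R1 + R2)
        W = Phi @ np.diag(d ** -0.5)

        S1 = W.T @ R1 @ W               # now S1 + S2 = I, so S1 and S2 share eigenvectors
        lam, V = np.linalg.eigh(S1)
        F = W @ V                       # FKT basis; project data with X @ F
        print(lam)                      # eigenvalue lam_i for class 1 pairs with 1 - lam_i for class 2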

  11. Testing Properties of Boolean Functions

    DTIC Science & Technology

    2012-01-01

    Applying the Hermite decomposition of f and linearity of expectation, E_{x,y}[f(x) f(y) ⟨x, y⟩] = Σ_{i=1}^{n} Σ_{S,T ∈ N^n} f̂(S) f̂(T) E_x[H_S(x) x_i] E_y[H_T(y) y_i], where E_x[H_S(x) x_i] = 1 if S = e_i and 0 otherwise; similarly, E_y[H_T(y) y_i] = 1 iff T = e_i. (The remainder of this record is full-text snippet residue from chapters on exact query complexity and testing juntas, including a tail bound of the form 1 − 2e^{−O(√n)} and a U-statistic estimate ν̃ with kernel ψ*_f.)

  12. Agent-Based Model Approach to Complex Phenomena in Real Economy

    NASA Astrophysics Data System (ADS)

    Iyetomi, H.; Aoyama, H.; Fujiwara, Y.; Ikeda, Y.; Souma, W.

    An agent-based model for firms' dynamics is developed. The model consists of firm agents with identical characteristic parameters and a bank agent. The dynamics of those agents are described by their balance sheets. Each firm tries to maximize its expected profit under possible market risks. Infinite growth of a firm directed by the "profit maximization" principle is suppressed by the concept of a "going concern". The possibility of bankruptcy of firms is also introduced by incorporating a retardation effect of information on firms' decisions. The firms, mutually interacting through the monopolistic bank, become heterogeneous in the course of temporal evolution. Statistical properties of firms' dynamics obtained by simulations based on the model are discussed in light of observations in the real economy.
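
    A deliberately simplified sketch of balance-sheet dynamics in this spirit (update rules and parameters are illustrative, not the authors' model):

        import numpy as np

        rng = np.random.default_rng(0)
        n_firms, T = 100, 200
        equity = np.ones(n_firms)             # identical initial balance sheets
        for t in range(T):
            # each firm commits its equity and realizes a risky profit
            profit = equity * rng.normal(loc=0.03, scale=0.15, size=n_firms)
            equity += profit
            # bankruptcy: firms with non-positive equity exit and are replaced
            equity[equity <= 0] = 1.0
        # heterogeneity emerges despite identical initial conditions
        print(equity.min(), np.median(equity), equity.max())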

  13. Scaling in non-stationary time series. (II). Teen birth phenomenon

    NASA Astrophysics Data System (ADS)

    Ignaccolo, M.; Allegrini, P.; Grigolini, P.; Hamilton, P.; West, B. J.

    2004-05-01

    This paper is devoted to the problem of statistical mechanics raised by the analysis of an issue of sociological interest: the teen birth phenomenon. These data are expected to be characterized by correlated fluctuations, reflecting the cooperative properties of the process. However, the assessment of the anomalous scaling generated by these correlations is made difficult, and ambiguous as well, by the non-stationary nature of the data, which show a clear dependence on seasonal periodicity (periodic component) as well as an average changing slowly in time (slow component). We use the detrending techniques described in the earlier companion paper (I) to safely remove all the biases and to derive the genuine scaling of the teen birth phenomenon.
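
    A sketch of the two-step detrending the paper relies on (remove the periodic seasonal profile, then a slowly varying mean), applied to a synthetic monthly series:

        import numpy as np

        rng = np.random.default_rng(0)
        t = np.arange(480)                    # 40 years of monthly counts
        series = (100 + 0.05 * t + 10 * np.sin(2 * np.pi * t / 12)
                  + rng.normal(0, 3, t.size))

        # 1) periodic component: mean profile over the 12 calendar months
        seasonal = np.array([series[t % 12 == m].mean() for m in range(12)])
        deseasoned = series - seasonal[t % 12]

        # 2) slow component: multi-year moving average (window edges are approximate)
        win = 61
        slow = np.convolve(deseasoned, np.ones(win) / win, mode="same")
        fluct = deseasoned - slow             # residual fluctuations for scaling analysis
        print(fluct.std())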

  14. Simulating future residential property losses from wildfire in Flathead County, Montana: Chapter 1

    USGS Publications Warehouse

    Prato, Tony; Paveglio, Travis B; Barnett, Yan; Silverstein, Robin; Hardy, Michael; Keane, Robert; Loehman, Rachel A.; Clark, Anthony; Fagre, Daniel B.; Venn, Tyron; Stockmann, Keith

    2014-01-01

    Wildfire damages to private residences in the United States and elsewhere have increased as a result of expansion of the wildland-urban interface (WUI) and other factors. Understanding this unwelcome trend requires analytical frameworks that simulate how various interacting social, economic, and biophysical factors influence those damages. A methodological framework is developed for simulating expected residential property losses from wildfire [E(RLW)], which is a probabilistic monetary measure of wildfire risk to residential properties in the WUI. E(RLW) is simulated for Flathead County, Montana, for five 10-year subperiods covering the period 2010-2059, under various assumptions about future climate change, economic growth, land use policy, and forest management. Results show statistically significant increases in the spatial extent of WUI properties, the number of residential structures at risk from wildfire, and E(RLW) over the 50-year evaluation period for both the county and smaller subareas (i.e., neighborhoods and parcels). The E(RLW) simulation framework presented here advances the field of wildfire risk assessment by providing a finer-scale tool that incorporates a set of dynamic, interacting processes. The framework can be applied using other scenarios for climate change, economic growth, land use policy, and forest management, and in other areas.
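
    The expectation at the core of E(RLW) can be sketched as a probability-weighted Monte Carlo sum over properties; all numbers below are hypothetical placeholders, not Flathead County inputs:

        import numpy as np

        rng = np.random.default_rng(0)
        n_props, n_sims = 500, 10_000
        value = rng.uniform(1e5, 8e5, n_props)    # structure values, dollars
        p_burn = rng.uniform(0.0, 0.02, n_props)  # per-property annual burn probability
        vulnerability = 0.6                       # damage fraction if burned

        burned = rng.random((n_sims, n_props)) < p_burn
        annual_loss = (burned * value * vulnerability).sum(axis=1)
        print(f"expected residential loss: {annual_loss.mean():,.0f} dollars/year")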

  15. Isotropic microscale mechanical properties of coral skeletons

    PubMed Central

    Pasquini, Luca; Molinari, Alan; Fantazzini, Paola; Dauphen, Yannicke; Cuif, Jean-Pierre; Levy, Oren; Dubinsky, Zvy; Caroselli, Erik; Prada, Fiorella; Goffredo, Stefano; Di Giosia, Matteo; Reggi, Michela; Falini, Giuseppe

    2015-01-01

    Scleractinian corals are a major source of biogenic calcium carbonate, yet the relationship between their skeletal microstructure and mechanical properties has been scarcely studied. In this work, the skeletons of two coral species, the solitary Balanophyllia europaea and the colonial Stylophora pistillata, were investigated by nanoindentation. The hardness HIT and Young's modulus EIT were determined from the analysis of several load–depth curves on two perpendicular sections of the skeletons: longitudinal (parallel to the main growth axis) and transverse. Within the experimental and statistical uncertainty, the average values of the mechanical parameters are independent of the section's orientation. The hydration state of the skeletons did not affect the mechanical properties. The measured values, EIT in the 76–77 GPa range and HIT in the 4.9–5.1 GPa range, are close to those expected for polycrystalline pure aragonite. Notably, a small difference in HIT is observed between the species. Unlike the corals, single-crystal aragonite and the nacreous layer of the seashell Atrina rigida exhibit clearly orientation-dependent mechanical properties. The homogeneous and isotropic mechanical behaviour of the coral skeletons at the microscale is correlated with the microstructure, observed by electron microscopy and atomic force microscopy, and with the X-ray diffraction patterns of the longitudinal and transverse sections. PMID:25977958

  16. The likelihood ratio as a random variable for linked markers in kinship analysis.

    PubMed

    Egeland, Thore; Slooten, Klaas

    2016-11-01

    The likelihood ratio is the fundamental quantity that summarizes the evidence in forensic cases. Therefore, it is important to understand the theoretical properties of this statistic. This paper is the last in a series of three, and the first to study linked markers. We show that for all non-inbred pairwise kinship comparisons, the expected likelihood ratio in favor of a type of relatedness depends on the allele frequencies only via the number of alleles, also for linked markers, and also if the true relationship is another one than is tested for by the likelihood ratio. Exact expressions for the expectation and variance are derived for all these cases. Furthermore, we show that the expected likelihood ratio is a non-increasing function of the recombination rate as it increases between 0 and 0.5, when the actual relationship is the one investigated by the LR. Besides being of theoretical interest, exact expressions such as those obtained here can be used for software validation, as they allow one to verify correctness to arbitrary precision. The paper also presents results and advice of practical importance. For example, we argue that the logarithm of the likelihood ratio behaves in a fundamentally different way from the likelihood ratio itself in terms of expectation and variance, in agreement with its interpretation as a weight of evidence. Equipped with the results presented and freely available software, one may check calculations and software and also perform power calculations.
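
    One exact property underlying this line of work is easy to check numerically: when the evidence is generated under the hypothesis in the denominator, the expected likelihood ratio is exactly 1, while the expected log-likelihood ratio is negative. A toy discrete-marker check (probabilities are illustrative):

        import numpy as np

        rng = np.random.default_rng(0)
        p_h1 = np.array([0.5, 0.3, 0.2])   # genotype probabilities under H1 (related)
        p_h2 = np.array([0.2, 0.3, 0.5])   # genotype probabilities under H2 (unrelated)

        obs = rng.choice(3, size=200_000, p=p_h2)   # evidence generated under H2
        lr = p_h1[obs] / p_h2[obs]
        print(lr.mean())            # -> 1.0, since sum(p2 * p1/p2) = 1
        print(np.log(lr).mean())    # negative: the weight of evidence favours H2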

  17. Modeling the Test-Taking Motivation Construct through Investigation of Psychometric Properties of an Expectancy-Value-Based Questionnaire

    ERIC Educational Resources Information Center

    Knekta, Eva; Eklöf, Hanna

    2015-01-01

    The aim of this study was to evaluate the psychometric properties of an expectancy-value-based questionnaire measuring five aspects of test-taking motivation (effort, expectancies, importance, interest, and test anxiety). The questionnaire was distributed to a sample of Swedish Grade 9 students taking a low-stakes (n = 1,047) or a high-stakes (n =…

  18. On the Determination of Poisson Statistics for Haystack Radar Observations of Orbital Debris

    NASA Technical Reports Server (NTRS)

    Stokely, Christopher L.; Benbrook, James R.; Horstman, Matt

    2007-01-01

    A convenient and powerful method is used to determine whether radar detections of orbital debris occur according to Poisson statistics. This is done by analyzing the time interval between detection events. For Poisson statistics, the probability distribution of the time interval between events is shown to be an exponential distribution. This distribution is a special case of the Erlang distribution that is used in estimating traffic loads on telecommunication networks. Poisson statistics form the basis of many orbital debris models, but the statistical basis of these models had not been clearly demonstrated empirically until now. Interestingly, during the fiscal year 2003 observations with the Haystack radar in a fixed staring mode, no statistically significant deviations from the expected Poisson statistics are observed, whether or not the data are segregated by altitude or inclination. One might expect some significant clustering of events in time as a result of satellite breakups, but the presence of Poisson statistics indicates that such debris disperse rapidly with respect to Haystack's very narrow radar beam. An exception to Poisson statistics is observed in the months following the intentional breakup of the Fengyun satellite in January 2007.
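
    The inter-event check used here is straightforward to reproduce: under Poisson statistics the gaps between detection times are exponentially distributed. A sketch with synthetic detection times:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        events = np.sort(rng.uniform(0, 1000, size=500))  # Poisson-like arrival times
        gaps = np.diff(events)

        # Kolmogorov-Smirnov test against an exponential with the observed mean gap
        # (strictly, estimating the scale from the data calls for a Lilliefors-type
        # correction, so treat the p-value as indicative)
        d, p = stats.kstest(gaps, "expon", args=(0, gaps.mean()))
        print(f"KS statistic {d:.3f}, p-value {p:.3f}")  # large p: consistent with Poisson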

  19. Detection of material property errors in handbooks and databases using artificial neural networks with hidden correlations

    NASA Astrophysics Data System (ADS)

    Zhang, Y. M.; Evans, J. R. G.; Yang, S. F.

    2010-11-01

    The authors have discovered a systematic, intelligent and potentially automatic method to detect errors in handbooks and stop their transmission using unrecognised relationships between materials properties. The scientific community relies on the veracity of scientific data in handbooks and databases, some of which have a long pedigree covering several decades. Although various outlier-detection procedures are employed to detect and, where appropriate, remove contaminated data, errors, which had not been discovered by established methods, were easily detected by our artificial neural network in tables of properties of the elements. We started using neural networks to discover unrecognised relationships between materials properties and quickly found that they were very good at finding inconsistencies in groups of data. They reveal variations from 10 to 900% in tables of property data for the elements and point out those that are most probably correct. Compared with the statistical method adopted by Ashby and co-workers [Proc. R. Soc. Lond. Ser. A 454 (1998) p. 1301, 1323], this method locates more inconsistencies and could be embedded in database software for automatic self-checking. We anticipate that our suggestion will be a starting point to deal with this basic problem that affects researchers in every field. The authors believe it may eventually moderate the current expectation that data field error rates will persist at between 1 and 5%.
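
    The mechanism can be sketched with any small regressor: predict each tabulated property from the others and flag entries whose residuals dwarf the typical spread. The toy table below is synthetic, with one planted transcription-style error:

        import numpy as np
        from sklearn.neural_network import MLPRegressor

        rng = np.random.default_rng(0)
        x = rng.normal(size=(80, 2))                  # two correlated properties
        y = 2.0 * x[:, 0] - 1.0 * x[:, 1] + rng.normal(0, 0.05, 80)
        y[7] += 4.0                                   # planted gross error in row 7

        net = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
        net.fit(x, y)
        resid = np.abs(y - net.predict(x))
        print(int(np.argmax(resid)))                  # row 7 should top the list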

  20. Classical subjective expected utility.

    PubMed

    Cerreia-Vioglio, Simone; Maccheroni, Fabio; Marinacci, Massimo; Montrucchio, Luigi

    2013-04-23

    We consider decision makers who know that payoff-relevant observations are generated by a process that belongs to a given class M, as postulated in Wald [Wald A (1950) Statistical Decision Functions (Wiley, New York)]. We incorporate this Waldean piece of objective information within an otherwise subjective setting à la Savage [Savage LJ (1954) The Foundations of Statistics (Wiley, New York)] and show that this leads to a two-stage subjective expected utility model that accounts for both state and model uncertainty.

  1. Precision growth index using the clustering of cosmic structures and growth data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pouri, Athina; Basilakos, Spyros; Plionis, Manolis, E-mail: athpouri@phys.uoa.gr, E-mail: svasil@academyofathens.gr, E-mail: mplionis@physics.auth.gr

    2014-08-01

    We use the clustering properties of Luminous Red Galaxies (LRGs) and the growth rate data provided by the various galaxy surveys in order to constrain the growth index (γ) of the linear matter fluctuations. We perform a standard χ²-minimization procedure between theoretical expectations and data, followed by a joint likelihood analysis, and we find a value of γ = 0.56 ± 0.05, perfectly consistent with the expectations of the ΛCDM model, and Ω_m0 = 0.29 ± 0.01, in very good agreement with the latest Planck results. Our analysis provides significantly more stringent growth index constraints with respect to previous studies, as indicated by the fact that the corresponding uncertainty is only ∼0.09γ. Finally, allowing γ to vary with redshift in two manners (Taylor expansion around z = 0, and Taylor expansion around the scale factor), we find that the combined statistical analysis between our clustering and literature growth data alleviates the degeneracy and obtains more stringent constraints with respect to other recent studies.
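
    The fitting step is a one-parameter χ² minimization of the parametrized growth rate f(z) = Ω_m(z)^γ against growth measurements; a sketch with hypothetical data points standing in for the survey compilation:

        import numpy as np
        from scipy.optimize import minimize_scalar

        Om0 = 0.29
        def f_model(z, gamma):
            om_z = Om0 * (1 + z) ** 3 / (Om0 * (1 + z) ** 3 + 1 - Om0)
            return om_z ** gamma

        # hypothetical growth-rate measurements: (z, f_obs, sigma)
        data = np.array([(0.15, 0.51, 0.11), (0.35, 0.70, 0.18),
                         (0.55, 0.75, 0.18), (0.77, 0.91, 0.36)])

        def chi2(gamma):
            z, f_obs, sig = data.T
            return np.sum(((f_obs - f_model(z, gamma)) / sig) ** 2)

        res = minimize_scalar(chi2, bounds=(0.1, 1.2), method="bounded")
        print(f"best-fit gamma = {res.x:.2f}")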

  2. Probabilistic performance estimators for computational chemistry methods: The empirical cumulative distribution function of absolute errors

    NASA Astrophysics Data System (ADS)

    Pernot, Pascal; Savin, Andreas

    2018-06-01

    Benchmarking studies in computational chemistry use reference datasets to assess the accuracy of a method through error statistics. The commonly used error statistics, such as the mean signed and mean unsigned errors, do not inform end-users on the expected amplitude of prediction errors attached to these methods. We show that, the distributions of model errors being neither normal nor zero-centered, these error statistics cannot be used to infer prediction error probabilities. To overcome this limitation, we advocate for the use of more informative statistics, based on the empirical cumulative distribution function of unsigned errors, namely, (1) the probability for a new calculation to have an absolute error below a chosen threshold and (2) the maximal amplitude of errors one can expect with a chosen high confidence level. Those statistics are also shown to be well suited for benchmarking and ranking studies. Moreover, the standard error on all benchmarking statistics depends on the size of the reference dataset. Systematic publication of these standard errors would be very helpful to assess the statistical reliability of benchmarking conclusions.
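
    Both advocated statistics are direct reads off the empirical CDF of absolute errors; with a skewed synthetic error sample standing in for benchmark results:

        import numpy as np

        rng = np.random.default_rng(0)
        abs_err = rng.lognormal(mean=0.0, sigma=1.0, size=500)  # non-normal |errors|

        eta = 1.0                                  # chosen accuracy threshold
        p_ok = np.mean(abs_err < eta)              # (1) P(|error| < eta)
        q95 = np.quantile(abs_err, 0.95)           # (2) error amplitude at 95% confidence
        se_p = np.sqrt(p_ok * (1 - p_ok) / abs_err.size)  # standard error shrinks with set size
        print(f"P = {p_ok:.2f} +/- {se_p:.2f}, Q95 = {q95:.2f}")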

  3. Design of order statistics filters using feedforward neural networks

    NASA Astrophysics Data System (ADS)

    Maslennikova, Yu. S.; Bochkarev, V. V.

    2016-08-01

    In recent years significant progress has been made in the development of nonlinear data processing techniques. Such techniques are widely used in digital data filtering and image enhancement. Many of the most effective nonlinear filters are based on order statistics; the widely used median filter is the best-known order statistic filter. A generalized form of these filters can be constructed on the basis of Lloyd's statistics. Filters based on order statistics have excellent robustness properties in the presence of impulsive noise. In this paper, we present a special approach to the synthesis of order statistics filters using artificial neural networks. Optimal Lloyd's statistics are used to select initial weights for the neural network. The adaptive properties of neural networks provide opportunities to optimize order statistics filters for data with asymmetric distribution functions. Different examples demonstrate the properties and performance of the presented approach.
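
    An order statistics (L-estimator) filter applies fixed weights to the sorted samples of each sliding window; the median filter is the special case with all weight on the middle sample. A minimal sketch:

        import numpy as np

        def order_statistic_filter(x, weights):
            """Weighted combination of the sorted samples in each window."""
            w = np.asarray(weights, dtype=float)
            k = w.size
            return np.array([np.sort(x[i:i + k]) @ w
                             for i in range(x.size - k + 1)])

        rng = np.random.default_rng(0)
        clean = np.sin(np.linspace(0, 6, 300))
        noisy = clean + np.where(rng.random(300) < 0.05, 5.0, 0.0)      # impulsive noise

        median5 = order_statistic_filter(noisy, [0, 0, 1, 0, 0])        # median filter
        trimmed = order_statistic_filter(noisy, [0, 1/3, 1/3, 1/3, 0])  # trimmed mean

    In the paper's approach, such weights are not fixed by hand but initialized from Lloyd's statistics and then adapted by the neural network.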

  4. Urban scaling in Europe

    PubMed Central

    Bettencourt, Luís M. A.; Lobo, José

    2016-01-01

    Over the last few decades, in disciplines as diverse as economics, geography and complex systems, a perspective has arisen proposing that many properties of cities are quantitatively predictable due to agglomeration or scaling effects. Using new harmonized definitions for functional urban areas, we examine to what extent these ideas apply to European cities. We show that while most large urban systems in Western Europe (France, Germany, Italy, Spain, UK) approximately agree with theoretical expectations, the small number of cities in each nation and their natural variability preclude drawing strong conclusions. We demonstrate how this problem can be overcome so that cities from different urban systems can be pooled together to construct larger datasets. This leads to a simple statistical procedure to identify urban scaling relations, which then clearly emerge as a property of European cities. We compare the predictions of urban scaling to Zipf's law for the size distribution of cities and show that while the former holds well, the latter is a poor descriptor of European cities. We conclude with scenarios for the size and properties of future pan-European megacities and their implications for the economic productivity, technological sophistication and regional inequalities of an integrated European urban system. PMID:26984190
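
    Scaling relations of this kind are estimated as power laws Y = Y0 N^β, i.e., straight-line fits in log-log space; a sketch with hypothetical city populations and outputs:

        import numpy as np

        rng = np.random.default_rng(0)
        pop = 10 ** rng.uniform(4.5, 7.0, 200)                     # city populations
        gdp = 2e3 * pop ** 1.12 * np.exp(rng.normal(0, 0.2, 200))  # super-linear output

        beta, log_y0 = np.polyfit(np.log(pop), np.log(gdp), 1)
        print(f"scaling exponent beta = {beta:.2f}")  # > 1 signals agglomeration effects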

  5. Statistical palaeomagnetic field modelling and dynamo numerical simulation

    NASA Astrophysics Data System (ADS)

    Bouligand, C.; Hulot, G.; Khokhlov, A.; Glatzmaier, G. A.

    2005-06-01

    By relying on two numerical dynamo simulations for which such investigations are possible, we test the validity and sensitivity of a statistical palaeomagnetic field modelling approach known as the giant Gaussian process (GGP) modelling approach. This approach is currently used to analyse palaeomagnetic data at times of stable polarity and infer some information about the way the main magnetic field (MF) of the Earth has been behaving in the past and has possibly been influenced by core-mantle boundary (CMB) conditions. One simulation has been run with homogeneous CMB conditions, the other with more realistic non-homogeneous symmetry breaking CMB conditions. In both simulations, it is found that, as required by the GGP approach, the field behaves as a short-term memory process. Some severe non-stationarity is however found in the non-homogeneous case, leading to very significant departures of the Gauss coefficients from a Gaussian distribution, in contradiction with the assumptions underlying the GGP approach. A similar but less severe non-stationarity is found in the case of the homogeneous simulation, which happens to display a more Earth-like temporal behaviour than the non-homogeneous case. This suggests that a GGP modelling approach could nevertheless be applied to try to estimate the mean μ and covariance matrix γ(τ) (first- and second-order statistical moments) of the field produced by the geodynamo. A detailed study of both simulations is carried out to assess the possibility of detecting statistical symmetry breaking properties of the underlying dynamo process by inspection of estimates of μ and γ(τ). As expected (because of the role of the rotation of the Earth in the dynamo process), those estimates reveal spherical symmetry breaking properties. Equatorial symmetry breaking properties are also detected in both simulations, showing that such symmetry breaking properties can occur spontaneously under homogeneous CMB conditions. By contrast, axial symmetry breaking is detected only in the non-homogeneous simulation, testifying to the constraints imposed by the CMB conditions. The signature of this axial symmetry breaking is however found to be much weaker than the signature of equatorial symmetry breaking. We note that this could be the reason why only equatorial symmetry breaking properties (in the form of the well-known axial quadrupole term in the time-averaged field) have unambiguously been found so far by analysing the real data. However, this could also be because those analyses have all assumed too simple a form for γ(τ) when attempting to estimate μ. Suggestions are provided to make sure future attempts at GGP modelling with real data are carried out in a more consistent and perhaps more efficient way.

  6. Statistical properties of several models of fractional random point processes

    NASA Astrophysics Data System (ADS)

    Bendjaballah, C.

    2011-08-01

    Statistical properties of several models of fractional random point processes have been analyzed from the counting and time interval statistics points of view. Based on the criterion of the reduced variance, it is seen that such processes exhibit nonclassical properties. The conditions for these processes to be treated as conditional Poisson processes are examined. Numerical simulations illustrate part of the theoretical calculations.
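
    The reduced-variance criterion mentioned here is the Fano factor F = Var(N)/E[N] of the counts: F = 1 for a Poisson process, and F < 1 in the sub-Poissonian, "nonclassical" regime. A quick illustration:

        import numpy as np

        rng = np.random.default_rng(0)
        poisson_counts = rng.poisson(lam=5.0, size=2000)
        sub_poisson = rng.binomial(n=10, p=0.5, size=2000)  # variance < mean

        def fano(n):
            return n.var(ddof=1) / n.mean()

        print(fano(poisson_counts), fano(sub_poisson))      # ~1.0 versus ~0.5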

  7. Explorations in Statistics: Confidence Intervals

    ERIC Educational Resources Information Center

    Curran-Everett, Douglas

    2009-01-01

    Learning about statistics is a lot like learning about science: the learning is more meaningful if you can actively explore. This third installment of "Explorations in Statistics" investigates confidence intervals. A confidence interval is a range that we expect, with some level of confidence, to include the true value of a population parameter…
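
    For a concrete instance of the concept explored there, a 95% t-interval for a population mean from a small sample:

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        sample = rng.normal(loc=10.0, scale=2.0, size=25)
        lo, hi = stats.t.interval(0.95, sample.size - 1,
                                  loc=sample.mean(), scale=stats.sem(sample))
        print(f"95% CI for the mean: ({lo:.2f}, {hi:.2f})")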

  8. Raising Courtney Just Like Her Sisters: Forging High Expectations

    ERIC Educational Resources Information Center

    Stelmack, Amie A.

    2014-01-01

    Shortly after discovering that her youngest daughter was deaf, Amie Stelmack began hearing frightening statistics about children who were deaf. Many people--including professionals--cited shocking statistics indicating that deaf children often did not read beyond an upper elementary level. They seemed to suggest that this…

  9. The Development of Statistical Literacy at School

    ERIC Educational Resources Information Center

    Callingham, Rosemary; Watson, Jane M.

    2017-01-01

    Statistical literacy increasingly is considered an important outcome of schooling. There is little information, however, about appropriate expectations of students at different stages of schooling. Some progress towards this goal was made by Watson and Callingham (2005), who identified an empirical 6-level hierarchy of statistical literacy and the…

  10. Race, Sex, and Their Influences on Introductory Statistics Education

    ERIC Educational Resources Information Center

    van Es, Cindy; Weaver, Michelle M.

    2018-01-01

    The Survey of Attitudes Toward Statistics or SATS was administered for three consecutive years to students in an Introductory Statistics course at Cornell University. Questions requesting demographic information and expected final course grade were added. Responses were analyzed to investigate possible differences between sexes and racial/ethnic…

  11. The Development of a Professional Statistics Teaching Identity

    ERIC Educational Resources Information Center

    Whitaker, Douglas

    2016-01-01

    Motivated by the increased statistics expectations for students and their teachers because of the widespread adoption of the Common Core State Standards for Mathematics, this study explores exemplary, in-service statistics teachers' professional identities using a theoretical framework informed by Gee (2000) and communities of practice (Lave &…

  12. Use of Groundwater Lifetime Expectancy for the Performance Assessment of Deep Geologic Radioactive Waste Repositories.

    NASA Astrophysics Data System (ADS)

    Cornaton, F.; Park, Y.; Normani, S.; Sudicky, E.; Sykes, J.

    2005-12-01

    Long-term solutions for the disposal of toxic wastes usually involve isolation of the wastes in a deep subsurface geologic environment. In the case of spent nuclear fuel, the safety of the host repository depends on two main barriers: the engineered barrier and the natural geological barrier. If radionuclide leakage occurs from the engineered barrier, the geological medium represents the ultimate barrier that is relied upon to ensure safety. Consequently, an evaluation of radionuclide travel times from the repository to the biosphere is critically important in a performance assessment analysis. In this study, we develop a travel time framework based on the concept of groundwater lifetime expectancy as a safety indicator. Lifetime expectancy characterizes the time radionuclides will spend in the subsurface after their release from the repository and prior to discharging into the biosphere. The probability density function of lifetime expectancy is computed throughout the host rock by solving the backward-in-time solute transport equation subject to a properly posed set of boundary conditions. It can then be used to define optimal repository locations. In a second step, the risk associated with selected sites can be evaluated by simulating an appropriate contaminant release history. The proposed methodology is applied in the context of a typical Canadian Shield environment. Based on a statistically generated three-dimensional network of fracture zones embedded in the granitic host rock, the sensitivity and uncertainty of lifetime expectancy to the hydraulic and dispersive properties of the fracture network, including the impact of conditioning via their surface expressions, are computed in order to demonstrate the utility of the methodology.

  13. General Public Expectation from the Communication Process with their Healthcare Providers

    PubMed Central

    Hassali, MA; Shafie, AA; Khan, TM

    2012-01-01

    The current study aimed to explore the public's views and expectations about a successful communication process between healthcare providers/physicians and patients in Penang Island, Malaysia. A cross-sectional study was conducted in Penang Island using a 14-item questionnaire. Statistical Package for Social Sciences (SPSS) software version 15.0® was used to analyze the collected data. Nonparametric statistics were applied; the Chi-square test was used to measure the association among the variables. P-values less than 0.05 were considered statistically significant. A total of 500 respondents were willing to participate in the study, a response rate of 83.3%. The majority, 319 (63.9%), reported communicating with their healthcare providers in the Malay language, and about 401 (80.4%) of the respondents were found to be satisfied with the information provided by the physician. A common expectation across most of the sample was for the physician to focus more on the patient's history before prescribing any medicine. Moreover, about 60.0% of the respondents expected healthcare providers to show patience with patients' queries. The level of satisfaction with the information shared by the healthcare providers was higher among respondents with a higher education level. Furthermore, patients with a higher level of education expect the physician to understand their views and medical history well in order to prescribe a better therapeutic regimen. PMID:23112539

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gibbs, Zachary M.; Kim, Hyun-Sik; Materials Research Center, Samsung Advanced Institute of Technology, Samsung Electronics, Suwon 443-803

    In characterizing thermoelectric materials, electrical and thermal transport measurements are often used to estimate electronic band structure properties such as the effective mass and band gap. The Goldsmid-Sharp band gap, E_g = 2e|S|_max T_max, is a tool widely employed to estimate the band gap from temperature-dependent Seebeck coefficient measurements. However, significant deviations of more than a factor of two are now known to occur. We find that this happens when either the majority-to-minority weighted mobility ratio (A) becomes very different from 1.0 or the band gap (E_g) becomes significantly smaller than 10 k_BT. For narrow gaps (E_g ≲ 6 k_BT), the Maxwell-Boltzmann statistics applied by Goldsmid-Sharp break down and Fermi-Dirac statistics are required. We generate a chart that can be used to quickly estimate the expected correction to the Goldsmid-Sharp band gap depending on A and S_max; however, additional errors can occur for S < 150 μV/K due to degenerate behavior.
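
    The Goldsmid-Sharp estimate is a one-line calculation from the peak of the Seebeck curve; a worked example with illustrative values S_max = 250 μV/K at T_max = 600 K:

        # Goldsmid-Sharp band gap: E_g = 2 e |S|_max T_max
        S_max = 250e-6                 # peak Seebeck coefficient, V/K (illustrative)
        T_max = 600.0                  # temperature of the peak, K
        E_g = 2 * S_max * T_max        # in eV, since e * (V/K) * K gives eV
        print(E_g)                     # 0.30 eV

        # the paper's caveat applies when E_g is not large compared to k_B * T:
        k_B = 8.617e-5                 # eV/K
        print(E_g / (k_B * T_max))     # ~5.8, i.e. a narrow gap needing the correction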

  15. Rates of profit as correlated sums of random variables

    NASA Astrophysics Data System (ADS)

    Greenblatt, R. E.

    2013-10-01

    Profit realization is the dominant feature of market-based economic systems, determining their dynamics to a large extent. Rather than attaining an equilibrium, profit rates vary widely across firms, and the variation persists over time. Differing definitions of profit result in differing empirical distributions. To study the statistical properties of profit rates, I used data from a publicly available database for the US Economy for 2009-2010 (Risk Management Association). For each of three profit rate measures, the sample space consists of 771 points. Each point represents aggregate data from a small number of US manufacturing firms of similar size and type (NAICS code of principal product). When comparing the empirical distributions of profit rates, significant ‘heavy tails’ were observed, corresponding principally to a number of firms with larger profit rates than would be expected from simple models. An apparently novel correlated sum of random variables statistical model was used to model the data. In the case of operating and net profit rates, a number of firms show negative profits (losses), ruling out simple gamma or lognormal distributions as complete models for these data.

  16. UniEnt: uniform entropy model for the dynamics of a neuronal population

    NASA Astrophysics Data System (ADS)

    Hernandez Lahme, Damian; Nemenman, Ilya

    Sensory information and motor responses are encoded in the brain in a collective spiking activity of a large number of neurons. Understanding the neural code requires inferring statistical properties of such collective dynamics from multicellular neurophysiological recordings. Questions of whether synchronous activity or silence of multiple neurons carries information about the stimuli or the motor responses are especially interesting. Unfortunately, detection of such high order statistical interactions from data is especially challenging due to the exponentially large dimensionality of the state space of neural collectives. Here we present UniEnt, a method for the inference of strengths of multivariate neural interaction patterns. The method is based on the Bayesian prior that makes no assumptions (uniform a priori expectations) about the value of the entropy of the observed multivariate neural activity, in contrast to popular approaches that maximize this entropy. We then study previously published multi-electrode recordings data from salamander retina, exposing the relevance of higher order neural interaction patterns for information encoding in this system. This work was supported in part by Grants JSMF/220020321 and NSF/IOS/1208126.

  17. Mathematical problems in the application of multilinear models to facial emotion processing experiments

    NASA Astrophysics Data System (ADS)

    Andersen, Anders H.; Rayens, William S.; Li, Ren-Cang; Blonder, Lee X.

    2000-10-01

    In this paper we describe the enormous potential that multilinear models hold for the analysis of data from neuroimaging experiments that rely on functional magnetic resonance imaging (MRI) or other imaging modalities. A case is made for why one might fully expect that the successful introduction of these models to the neuroscience community could define the next generation of structure-seeking paradigms in the area. In spite of the potential for immediate application, there is much to do from the perspective of statistical science. That is, although multilinear models have already been particularly successful in chemistry and psychology, relatively little is known about their statistical properties. To that end, our research group at the University of Kentucky has made significant progress. In particular, we are in the process of developing formal influence measures for multilinear methods as well as associated classification models and effective implementations. We believe that these problems will be among the most important and useful to the scientific community. Details are presented herein and an application is given in the context of facial emotion processing experiments.

  18. Paracousti-UQ: A Stochastic 3-D Acoustic Wave Propagation Algorithm.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Preston, Leiph

    Acoustic full waveform algorithms, such as Paracousti, provide deterministic solutions in complex, 3-D variable environments. In reality, environmental and source characteristics are often only known in a statistical sense. Thus, to fully characterize the expected sound levels within an environment, this uncertainty in environmental and source factors should be incorporated into the acoustic simulations. Performing Monte Carlo (MC) simulations is one method of assessing this uncertainty, but it can quickly become computationally intractable for realistic problems. An alternative method, using the technique of stochastic partial differential equations (SPDE), allows computation of the statistical properties of output signals at a fraction of the computational cost of MC. Paracousti-UQ solves the SPDE system of 3-D acoustic wave propagation equations and provides estimates of the uncertainty of the output simulated wave field (e.g., amplitudes, waveforms) based on estimated probability distributions of the input medium and source parameters. This report describes the derivation of the stochastic partial differential equations, their implementation, and comparison of Paracousti-UQ results with MC simulations using simple models.

  19. Appropriate Domain Size for Groundwater Flow Modeling with a Discrete Fracture Network Model.

    PubMed

    Ji, Sung-Hoon; Koh, Yong-Kwon

    2017-01-01

    When a discrete fracture network (DFN) is constructed from statistical conceptualization, uncertainty in simulating the hydraulic characteristics of a fracture network can arise due to the domain size. In this study, the appropriate domain size, where less significant uncertainty in the stochastic DFN model is expected, was suggested for the Korea Atomic Energy Research Institute Underground Research Tunnel (KURT) site. The stochastic DFN model for the site was established, and the appropriate domain size was determined with the density of the percolating cluster and the percolation probability using the stochastically generated DFNs for various domain sizes. The applicability of the appropriate domain size to our study site was evaluated by comparing the statistical properties of stochastically generated fractures of varying domain sizes and estimating the uncertainty in the equivalent permeability of the generated DFNs. Our results show that the uncertainty of the stochastic DFN model is acceptable when the modeling domain is larger than the determined appropriate domain size, and the appropriate domain size concept is applicable to our study site. © 2016, National Ground Water Association.

  20. Correlation of Thermally Induced Pores with Microstructural Features Using High Energy X-rays

    NASA Astrophysics Data System (ADS)

    Menasche, David B.; Shade, Paul A.; Lind, Jonathan; Li, Shiu Fai; Bernier, Joel V.; Kenesei, Peter; Schuren, Jay C.; Suter, Robert M.

    2016-11-01

    Combined application of a near-field High Energy Diffraction Microscopy measurement of crystal lattice orientation fields and a tomographic measurement of pore distributions in a sintered nickel-based superalloy sample allows pore locations to be correlated with microstructural features. Measurements were carried out at the Advanced Photon Source beamline 1-ID using an X-ray energy of 65 keV for each of the measurement modes. The nickel superalloy sample was prepared in such a way as to generate significant thermally induced porosity. A three-dimensionally resolved orientation map is directly overlaid with the tomographically determined pore map through a careful registration procedure. The data are shown to reliably reproduce the expected correlations between specific microstructural features (triple lines and quadruple nodes) and pore positions. With the statistics afforded by the 3D data set, we conclude that within statistical limits, pore formation does not depend on the relative orientations of the grains. The experimental procedures and analysis tools illustrated are being applied to a variety of materials problems in which local heterogeneities can affect materials properties.

  1. Covariate analysis of bivariate survival data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bennett, L.E.

    1992-01-01

    The methods developed are used to analyze the effects of covariates on bivariate survival data when censoring and ties are present. The proposed method provides models for bivariate survival data that include differential covariate effects and censored observations. The proposed models are based on an extension of the univariate Buckley-James estimators which replace censored data points by their expected values, conditional on the censoring time and the covariates. For the bivariate situation, it is necessary to determine the expectation of the failure times for one component conditional on the failure or censoring time of the other component. Two different methods have been developed to estimate these expectations. In the semiparametric approach these expectations are determined from a modification of Burke's estimate of the bivariate empirical survival function. In the parametric approach censored data points are also replaced by their conditional expected values where the expected values are determined from a specified parametric distribution. The model estimation will be based on the revised data set, comprised of uncensored components and expected values for the censored components. The variance-covariance matrix for the estimated covariate parameters has also been derived for both the semiparametric and parametric methods. Data from the Demographic and Health Survey was analyzed by these methods. The two outcome variables are post-partum amenorrhea and breastfeeding; education and parity were used as the covariates. Both the covariate parameter estimates and the variance-covariance estimates for the semiparametric and parametric models will be compared. In addition, a multivariate test statistic was used in the semiparametric model to examine contrasts. The significance of the statistic was determined from a bootstrap distribution of the test statistic.

  2. Observers Exploit Stochastic Models of Sensory Change to Help Judge the Passage of Time

    PubMed Central

    Ahrens, Misha B.; Sahani, Maneesh

    2011-01-01

    Sensory stimulation can systematically bias the perceived passage of time [1–5], but why and how this happens is mysterious. In this report, we provide evidence that such biases may ultimately derive from an innate and adaptive use of stochastically evolving dynamic stimuli to help refine estimates derived from internal timekeeping mechanisms [6–15]. A simplified statistical model based on probabilistic expectations of stimulus change derived from the second-order temporal statistics of the natural environment [16, 17] makes three predictions. First, random noise-like stimuli whose statistics violate natural expectations should induce timing bias. Second, a previously unexplored obverse of this effect is that similar noise stimuli with natural statistics should reduce the variability of timing estimates. Finally, this reduction in variability should scale with the interval being timed, so as to preserve the overall Weber law of interval timing. All three predictions are borne out experimentally. Thus, in the context of our novel theoretical framework, these results suggest that observers routinely rely on sensory input to augment their sense of the passage of time, through a process of Bayesian inference based on expectations of change in the natural environment. PMID:21256018

  3. POWDERED ACTIVATED CARBON FROM NORTH DAKOTA LIGNITE: AN OPTION FOR DISINFECTION BY-PRODUCT CONTROL IN WATER TREATMENT PLANTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Daniel J. Stepan; Thomas A. Moe; Melanie D. Hetland

    New federal drinking water regulations have been promulgated to restrict the levels of disinfection by-products (DBPs) in finished public water supplies. DBPs are suspected carcinogens and are formed when organic material is partially oxidized by disinfectants commonly used in the water treatment industry. Additional federal mandates are expected in the near future that will further affect public water suppliers with respect to DBPs. Powdered activated carbon (PAC) has traditionally been used by the water treatment industry for the removal of compounds contributing to taste and odor problems. PAC also has the potential to remove naturally occurring organic matter (NOM) from raw waters prior to disinfection, thus controlling the formation of regulated DBPs. Many small water systems are currently using PAC for taste and odor control and have the potential to use PAC for controlling DBPs. This project, a cooperative effort between the Energy & Environmental Research Center (EERC), the Grand Forks Water Treatment Plant, and the University of North Dakota Department of Civil Engineering, consists of several interrelated tasks. The objective of the research was to evaluate a cost-effective PAC produced from North Dakota lignite for removing NOM from water and reducing trihalomethane formation potential. The research approach was to develop a statistically valid testing protocol that can be used to compare dose-response relationships between North Dakota lignite-derived PAC and commercially available PAC products. A statistical analysis was performed to determine whether significant correlations exist between operating conditions, water properties, PAC properties, and dose-response behavior. Pertinent physical and chemical properties were also measured for each of the waters and each of the PACs.

  4. #AltPlanets: Exploring the Exoplanet Catalogue with Neural Networks

    NASA Astrophysics Data System (ADS)

    Laneuville, M.; Tasker, E. J.; Guttenberg, N.

    2017-12-01

    The launch of Kepler in 2009 brought the number of known exoplanets into the thousands, in a growth explosion that shows no sign of abating. While the data available for individual planets is presently typically restricted to orbital and bulk properties, the quantity of data points allows the potential for meaningful statistical analysis. It is not clear how planet mass, radius, orbital path, stellar properties and neighbouring planets influence one another, therefore it seems inevitable that patterns will be missed simply due to the difficulty of including so many dimensions. Even simple trends may be overlooked if they fall outside our expectation of planet formation; a strong risk in a field where new discoveries have destroyed theories from the first observations of hot Jupiters. A possible way forward is to take advantage of the capabilities of neural network autoencoders. The idea of such algorithms is to learn a representation (encoding) of the data in a lower dimension space, without a priori knowledge about links between the elements. This encoding space can then be used to discover the strongest correlations in the original dataset.The key point is that trends identified by a neural network are independent of any previous analysis and pre-conceived ideas about physical processes. Results can reveal new relationships between planet properties and verify existing trends. We applied this concept to study data from the NASA Exoplanet Archive and while we have begun to explore the potential use of neural networks for exoplanet data, there are many possible extensions. For example, the network can produce a large number of 'alternative planets' whose statistics should match the current distribution. This larger dataset could highlight gaps in the parameter space or indicate observations are missing particular regimes. This could guide instrument proposals towards objects liable to yield the most information.
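
    The encoding idea has a closed-form sketch in the linear case: the optimal linear autoencoder with a k-dimensional bottleneck spans the top-k principal components of the feature matrix. Feature names and data below are hypothetical; a nonlinear autoencoder replaces the matrices with trained networks, letting the encoding capture correlations this linear version cannot:

        import numpy as np

        rng = np.random.default_rng(0)
        # hypothetical standardized planet features, e.g. log mass, log radius,
        # log period, eccentricity, stellar luminosity
        X = rng.normal(size=(3000, 5))
        Xc = X - X.mean(axis=0)

        k = 2                                 # encoding dimension
        U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
        W = Vt[:k].T                          # encoder weights (decoder is W.T here)
        Z = Xc @ W                            # 2-D encoding of each planet
        Xhat = Z @ W.T                        # decoded reconstruction
        print(np.mean((Xc - Xhat) ** 2))      # reconstruction error of the encoding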

  5. Spatial statistical analysis of basal stem root disease under natural field epidemic of oil palm

    NASA Astrophysics Data System (ADS)

    Kamu, Assis; Phin, Chong Khim; Seman, Idris Abu; Wan, Hoong Hak; Mun, Ho Chong

    2015-02-01

    Oil palm, known scientifically as Elaeis guineensis Jacq., is the most important commodity crop in Malaysia and has greatly contributed to the economic growth of the country. As far as disease is concerned in the industry, basal stem rot (BSR), caused by Ganoderma boninense, remains the most important disease and is the most widely studied oil palm disease in Malaysia. However, there is still limited study of the spatial and temporal pattern or distribution of the disease, especially under natural field epidemic conditions in oil palm plantations. The objective of this study is to spatially identify the pattern of BSR disease under a natural field epidemic using two geospatial analytical techniques: quadrat analysis for the first-order properties of point pattern analysis and nearest-neighbour analysis (NNA) for the second-order properties. Two study sites with trees of different ages were selected; both sites are located in Tawau, Sabah, and are managed by the same company. The results showed that at least one of the point pattern analyses used, the NNA (i.e., the second-order properties of point pattern analysis), confirmed that the disease exhibits complete spatial randomness. This suggests the disease does not spread from tree to tree, and that the age of the palms does not play a significant role in determining the spatial pattern of the disease. Knowledge of the spatial pattern of the disease should help in disease management programs and benefit the industry in the future. Statistical modelling is expected to help identify the right model to estimate the yield loss of oil palm due to BSR disease in the future.
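
    The second-order test used here, nearest-neighbour analysis, compares the observed mean nearest-neighbour distance with the value 0.5/sqrt(λ) expected under complete spatial randomness (a Clark-Evans ratio R near 1). A sketch, ignoring edge corrections:

        import numpy as np
        from scipy.spatial import cKDTree

        rng = np.random.default_rng(0)
        pts = rng.uniform(0, 100, size=(150, 2))   # infected palm positions, 100 m square

        d, _ = cKDTree(pts).query(pts, k=2)        # nearest neighbour is column 1
        observed = d[:, 1].mean()
        density = len(pts) / (100 * 100)
        expected = 0.5 / np.sqrt(density)          # CSR expectation

        R = observed / expected
        print(R)   # ~1: random; <1: clustered; >1: regular/dispersed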

  6. Dual-cycle dielectrophoretic collection rates for probing the dielectric properties of nanoparticles

    PubMed Central

    Bakewell, David J; Holmes, David

    2013-01-01

    A new DEP spectroscopy method and supporting theoretical model is developed to systematically quantify the dielectric properties of nanoparticles using continuously pulsed DEP collection rates. Initial DEP collection rates, that are dependent on the nanoparticle dielectric properties, are an attractive alternative to the crossover frequency method for determining dielectric properties. The new method introduces dual-cycle amplitude modulated and frequency-switched DEP (dual-cycle DEP) where the first collection rate with a fixed frequency acts as a control, and the second collection rate frequency is switched to a chosen value, such that, it can effectively probe the dielectric properties of the nanoparticles. The application of the control means that measurement variation between DEP collection experiments is reduced so that the frequency-switched probe collection is more effective. A mathematical model of the dual-cycle method is developed that simulates the temporal dynamics of the dual-cycle DEP nanoparticle collection system. A new statistical method is also developed that enables systematic bivariate fitting of the multifrequency DEP collection rates to the Clausius–Mossotti function, and is instrumental for determining dielectric properties. A Monte-Carlo simulation validates that collection rates improve estimation of the dielectric properties, compared with the crossover method, by exploiting a larger number of independent samples. Experiments using 200 nm diameter latex nanospheres suspended in 0.2 mS/m KCl buffer yield a nanoparticle conductivity of 26 mS/m that lies within 8% of the expected value. The results show that the dual-frequency method has considerable promise particularly for automated DEP investigations and associated technologies. PMID:23172363
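
    The dielectric properties enter through the real part of the Clausius-Mossotti factor, K = (ε*p - ε*m)/(ε*p + 2ε*m) with complex permittivities ε* = εε0 - iσ/ω. A sketch of this fitting target, with values loosely following the paper's latex-in-KCl system (assumed, not fitted):

        import numpy as np

        EPS0 = 8.854e-12

        def cm_real(f, eps_p, sig_p, eps_m, sig_m):
            """Real part of the Clausius-Mossotti factor at frequency f (Hz)."""
            w = 2 * np.pi * f
            ep = eps_p * EPS0 - 1j * sig_p / w     # complex particle permittivity
            em = eps_m * EPS0 - 1j * sig_m / w     # complex medium permittivity
            return ((ep - em) / (ep + 2 * em)).real

        f = np.logspace(3, 8, 400)
        k = cm_real(f, eps_p=2.55, sig_p=26e-3, eps_m=78.5, sig_m=0.2e-3)

        # positive K: positive DEP (collection); the sign change is the crossover
        crossover = f[np.argmin(np.abs(k))]
        print(f"crossover near {crossover:.2e} Hz")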

  7. The Effects of the Recession on Child Poverty: Poverty Statistics for 2008 and Growth in Need during 2009

    ERIC Educational Resources Information Center

    Isaacs, Julia B.

    2009-01-01

    Nearly one in five children under age 18 lived in poor families in 2008, according to poverty statistics released by the Census Bureau in September 2009. Though high, this statistic does not capture the full impact of the economic downturn, which is expected to drive poverty even higher in 2009. However, updated poverty statistics will not be…

  8. RELATIONSHIP BETWEEN STRUCTURAL AND STRENGTH CHARACTERISTICS OF FIBER-GLASS LAMINATES,

    DTIC Science & Technology

    REINFORCED PLASTICS, STRUCTURAL PROPERTIES, LAMINATES, EPOXY RESINS, GLASS TEXTILES, LOADS(FORCES), TENSILE PROPERTIES, COMPRESSIVE PROPERTIES, LIFE EXPECTANCY(SERVICE LIFE), USSR, MECHANICAL PROPERTIES.

  9. Statistical Handbook on Aging Americans. 1994 Edition. Statistical Handbook Series Number 5.

    ERIC Educational Resources Information Center

    Schick, Frank L., Ed.; Schick, Renee, Ed.

    This statistical handbook contains 378 tables and charts illustrating the changes in the United States' aging population based on data collected during the 1990 census and several other surveys. The tables and charts are organized by topic as follows: demographics (age and sex distribution, life expectancy, race and ethnicity, geographic…

  10. Validation of Scores from a New Measure of Preservice Teachers' Self-Efficacy to Teach Statistics in the Middle Grades

    ERIC Educational Resources Information Center

    Harrell-Williams, Leigh M.; Sorto, M. Alejandra; Pierce, Rebecca L.; Lesser, Lawrence M.; Murphy, Teri J.

    2014-01-01

    The influential "Common Core State Standards for Mathematics" (CCSSM) expect students to start statistics learning during middle grades. Thus teacher education and professional development programs are advised to help preservice and in-service teachers increase their knowledge and confidence to teach statistics. Although existing…

  11. Using R in Introductory Statistics Courses with the pmg Graphical User Interface

    ERIC Educational Resources Information Center

    Verzani, John

    2008-01-01

    The pmg add-on package for the open source statistics software R is described. This package provides a simple to use graphical user interface (GUI) that allows introductory statistics students, without advanced computing skills, to quickly create the graphical and numeric summaries expected of them. (Contains 9 figures.)

  12. Analysis of statistical properties of laser speckles, forming in skin and mucous of colon: potential application in laser surgery

    NASA Astrophysics Data System (ADS)

    Rubtsov, Vladimir; Kapralov, Sergey; Chalyk, Iuri; Ulianova, Onega; Ulyanov, Sergey

    2013-02-01

    Statistical properties of laser speckles formed in skin and in the mucosa of the colon have been analyzed and compared. It is demonstrated that the first- and second-order statistics of "skin" speckles and "mucosal" speckles are quite different, and that speckles formed in the mucosa are not Gaussian. The layered structure of the colonic mucosa causes the formation of speckled speckles ("speckled biospeckles"). First- and second-order statistics of speckled speckles are reviewed in this paper, and the statistical properties of Fresnel and Fraunhofer doubly scattered and cascade speckles are described. The non-Gaussian statistics of biospeckles may lead to high localization of the intensity of coherent light in human tissue during laser surgery. A way of suppressing highly localized non-Gaussian speckles is suggested.
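
    The first-order statistics at issue are those of the intensity: for fully developed (Gaussian) speckle the intensity is negative-exponential and the contrast σ_I/⟨I⟩ equals 1, so measured contrast is a quick Gaussian versus non-Gaussian diagnostic:

        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000
        field = rng.normal(size=n) + 1j * rng.normal(size=n)  # circular Gaussian field
        I = np.abs(field) ** 2                                # fully developed speckle

        print(I.std() / I.mean())   # ~1.0; departures from 1 flag non-Gaussian speckle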

  13. Amphoteric doping of praseodymium Pr3+ in SrTiO3 grain boundaries

    DOE PAGES

    Yang, H.; Lee, H. S.; Kotula, P. G.; ...

    2015-03-26

    Charge compensation in rare-earth praseodymium (Pr3+) doped SrTiO3 plays an important role in determining the overall photoluminescence properties of the system. Here, the Pr3+ doping behavior in SrTiO3 grain boundaries (GBs) is analyzed using aberration-corrected scanning transmission electron microscopy (STEM). The presence of Pr3+ induces structure variations and changes the statistical prevalence of GB structures. In contrast to the assumption that Pr3+ substitutes for the A site as expected in the bulk, Pr3+ is found to substitute both Sr and Ti sites inside GBs, with the highest concentration in the Ti sites. As a result, this amphoteric doping behavior in the boundary plane is further confirmed by first-principles theoretical calculations.

  14. Asymptotic Normality of the Maximum Pseudolikelihood Estimator for Fully Visible Boltzmann Machines.

    PubMed

    Nguyen, Hien D; Wood, Ian A

    2016-04-01

    Boltzmann machines (BMs) are a class of binary neural networks for which there have been numerous proposed methods of estimation. Recently, it has been shown that in the fully visible case of the BM, the method of maximum pseudolikelihood estimation (MPLE) results in parameter estimates, which are consistent in the probabilistic sense. In this brief, we investigate the properties of MPLE for the fully visible BMs further, and prove that MPLE also yields an asymptotically normal parameter estimator. These results can be used to construct confidence intervals and to test statistical hypotheses. These constructions provide a closed-form alternative to the current methods that require Monte Carlo simulation or resampling. We support our theoretical results by showing that the estimator behaves as expected in simulation studies.
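
    A minimal sketch of how such an asymptotic-normality result is used downstream: with a parameter estimate and its estimated asymptotic covariance in hand, Wald confidence intervals and two-sided tests follow in closed form. The estimate and covariance below are hypothetical placeholders, not MPLE output from a fitted Boltzmann machine.

    ```python
    # Wald intervals/tests from an asymptotically normal estimator.
    import numpy as np
    from scipy import stats

    theta_hat = np.array([0.8, -0.3])          # hypothetical parameter estimates
    cov_hat = np.array([[0.04, 0.01],
                        [0.01, 0.09]]) / 500.0 # estimated asymptotic cov. / n

    se = np.sqrt(np.diag(cov_hat))
    z = stats.norm.ppf(0.975)                  # 95% two-sided critical value

    for j, (est, s) in enumerate(zip(theta_hat, se)):
        lo, hi = est - z * s, est + z * s      # 95% Wald confidence interval
        p = 2 * stats.norm.sf(abs(est) / s)    # two-sided test of theta_j = 0
        print(f"theta[{j}] = {est:+.3f}, CI [{lo:+.3f}, {hi:+.3f}], p = {p:.2g}")
    ```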

  15. Reaction-diffusion on the fully-connected lattice: A+A\\rightarrow A

    NASA Astrophysics Data System (ADS)

    Turban, Loïc; Fortin, Jean-Yves

    2018-04-01

    Diffusion-coagulation can be simply described by dynamics in which particles perform a random walk on a lattice and coalesce with probability one when meeting on the same site. Such processes display non-equilibrium properties with strong fluctuations in low dimensions. In this work we study this problem on the fully-connected lattice, an infinite-dimensional system in the thermodynamic limit, for which mean-field behaviour is expected. Exact expressions for the particle density distribution at a given time and the survival time distribution for a given number of particles are obtained. In particular, we show that the time needed to reach a finite number of surviving particles (vanishing density in the scaling limit) displays strong fluctuations and extreme value statistics, characterized by a universal class of non-Gaussian distributions with singular behaviour.
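
    A short Monte Carlo sketch of the process (the paper treats it exactly): N sites, initially one particle per site; particles hop to uniformly chosen sites at unit rate and coalesce on contact. Repeated runs expose the strong fluctuations in the time needed to reach a fixed number of survivors.

    ```python
    # A + A -> A on the fully-connected lattice: survival-time fluctuations.
    import numpy as np

    def survival_time(N, n_target, rng):
        """Time until only n_target particles survive, starting from N."""
        pos = list(range(N))            # one particle per site initially
        occupied = set(pos)
        t = 0.0
        while len(pos) > n_target:
            n = len(pos)
            t += 1.0 / n                # n particles hopping at unit rate
            i = int(rng.integers(n))
            occupied.discard(pos[i])    # the mover vacates its site
            target = int(rng.integers(N))
            if target in occupied:      # coalescence: A + A -> A
                pos[i] = pos[-1]
                pos.pop()
            else:
                pos[i] = target
                occupied.add(target)
        return t

    rng = np.random.default_rng(1)
    times = [survival_time(1000, 1, rng) for _ in range(200)]
    print("mean:", np.mean(times), "std:", np.std(times))  # std is O(mean)
    ```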

  16. Self-perpetuating Spiral Arms in Disk Galaxies

    NASA Astrophysics Data System (ADS)

    D'Onghia, Elena; Vogelsberger, Mark; Hernquist, Lars

    2013-03-01

    The causes of spiral structure in galaxies remain uncertain. Leaving aside the grand bisymmetric spirals with their own well-known complications, here we consider the possibility that multi-armed spiral features originate from density inhomogeneities orbiting within disks. Using high-resolution N-body simulations, we follow the motions of stars under the influence of gravity, and show that mass concentrations with properties similar to those of giant molecular clouds can induce the development of spiral arms through a process termed swing amplification. However, unlike in earlier work, we demonstrate that the eventual response of the disk can be highly non-linear, significantly modifying the formation and longevity of the resulting patterns. Contrary to expectations, ragged spiral structures can thus survive, at least in a statistical sense, long after the original perturbing influence has been removed.

  17. Amphoteric Doping of Praseodymium Pr3+ in SrTiO3 Grain Boundaries

    DOE PAGES

    Yang, Hao; Lee, H. S.; Kotula, Paul G.; ...

    2015-03-23

    Charge compensation in rare-earth praseodymium (Pr3+) doped SrTiO3 plays an important role in determining the overall photoluminescence properties of the system. Here, the Pr3+ doping behavior in SrTiO3 grain boundaries (GBs) is analyzed using aberration-corrected scanning transmission electron microscopy (STEM). The presence of Pr3+ induces structure variations and changes the statistical prevalence of GB structures. In contrast to the assumption that Pr3+ substitutes for the A site as expected in the bulk, Pr3+ is found to substitute both Sr and Ti sites inside GBs, with the highest concentration in the Ti sites. As a result, this amphoteric doping behavior in the boundary plane is further confirmed by first-principles theoretical calculations.

  18. The statistics of gravitational lenses. III - Astrophysical consequences of quasar lensing

    NASA Technical Reports Server (NTRS)

    Ostriker, J. P.; Vietri, M.

    1986-01-01

    The method of Schmidt and Green (1983) for calculating the luminosity function of quasars is combined with gravitational-lensing theory to compute expected properties of lensed systems. Multiple quasar images produced by galaxies are of order 0.001 of the observed quasars, with the numbers over the whole sky calculated to be (0.86, 120, 1600) to limiting B magnitudes of (16, 19, 22). The amount of 'false evolution' is small except for an interesting subset of apparently bright, large-redshift objects for which minilensing by starlike objects may be important. Some of the BL Lac objects may be in this category, with the galaxy identified as the parent object really a foreground object within which stars have lensed a background optically violent variable quasar.

  19. Simulated cosmic microwave background maps at 0.5 deg resolution: Basic results

    NASA Technical Reports Server (NTRS)

    Hinshaw, G.; Bennett, C. L.; Kogut, A.

    1995-01-01

    We have simulated full-sky maps of the cosmic microwave background (CMB) anisotropy expected from cold dark matter (CDM) models at 0.5 deg and 1.0 deg angular resolution. Statistical properties of the maps are presented as a function of sky coverage, angular resolution, and instrument noise, and the implications of these results for observability of the Doppler peak are discussed. The rms fluctuations in a map are not a particularly robust probe of the existence of a Doppler peak; however, a full correlation analysis can provide reasonable sensitivity. We find that sensitivity to the Doppler peak depends primarily on the fraction of sky covered, and only secondarily on the angular resolution and noise level. Color plates of the simulated maps are presented to illustrate the anisotropies.

  20. Property Differencing for Incremental Checking

    NASA Technical Reports Server (NTRS)

    Yang, Guowei; Khurshid, Sarfraz; Person, Suzette; Rungta, Neha

    2014-01-01

    This paper introduces iProperty, a novel approach that facilitates incremental checking of programs based on a property differencing technique. Specifically, iProperty aims to reduce the cost of checking properties as they are initially developed and as they co-evolve with the program. The key novelty of iProperty is to compute the differences between the new and old versions of expected properties to reduce the number and size of the properties that need to be checked during the initial development of the properties. Furthermore, property differencing is used in synergy with program behavior differencing techniques to optimize common regression scenarios, such as detecting regression errors or checking feature additions for conformance to new expected properties. Experimental results in the context of symbolic execution of Java programs annotated with properties written as assertions show the effectiveness of iProperty in utilizing change information to enable more efficient checking.

  1. Statistical Compression for Climate Model Output

    NASA Astrophysics Data System (ADS)

    Hammerling, D.; Guinness, J.; Soh, Y. J.

    2017-12-01

    Numerical climate model simulations run at high spatial and temporal resolutions generate massive quantities of data. As our computing capabilities continue to increase, storing all of the data is not sustainable, and thus it is important to develop methods for representing the full datasets by smaller compressed versions. We propose a statistical compression and decompression algorithm based on storing a set of summary statistics as well as a statistical model describing the conditional distribution of the full dataset given the summary statistics. We decompress the data by computing conditional expectations and conditional simulations from the model given the summary statistics. Conditional expectations represent our best estimate of the original data but are subject to oversmoothing in space and time. Conditional simulations introduce realistic small-scale noise so that the decompressed fields are neither too smooth nor too rough compared with the original data. Considerable attention is paid to accurately modeling the original dataset (one year of daily mean temperature data), particularly with regard to the inherent spatial nonstationarity in global fields, and to determining the statistics to be stored, so that the variation in the original data can be closely captured while allowing for fast decompression and conditional emulation on modest computers.
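
    The compress/decompress cycle can be sketched in a small Gaussian toy problem: store block means as the summary statistics, then decompress either with the conditional mean (the oversmoothed best estimate) or with a conditional simulation that restores small-scale variability. The AR(1)-style covariance below is an assumed stand-in for the paper's nonstationary spatial model.

    ```python
    # Toy statistical compression: summaries + conditional simulation.
    import numpy as np

    rng = np.random.default_rng(2)
    n, k = 240, 12                       # data length, number of summaries

    # Assumed model: zero-mean Gaussian with AR(1)-type covariance.
    idx = np.arange(n)
    Sigma = 0.9 ** np.abs(idx[:, None] - idx[None, :])
    x = rng.multivariate_normal(np.zeros(n), Sigma)   # "original data"

    # Compression: keep only k block means, s = A x.
    A = np.zeros((k, n))
    for b in range(k):
        A[b, b * (n // k):(b + 1) * (n // k)] = k / n
    s = A @ x

    # Decompression: conditional distribution of x given s under the model.
    S_xs = Sigma @ A.T                       # Cov(x, s)
    S_ss = A @ Sigma @ A.T                   # Cov(s, s)
    gain = S_xs @ np.linalg.inv(S_ss)
    cond_mean = gain @ s                     # best estimate (oversmoothed)
    cond_cov = Sigma - gain @ S_xs.T + 1e-9 * np.eye(n)  # jitter keeps it PSD
    sim = rng.multivariate_normal(cond_mean, cond_cov)   # realistic roughness

    print("RMSE, conditional mean:", np.sqrt(np.mean((cond_mean - x) ** 2)))
    print("RMSE, conditional sim.:", np.sqrt(np.mean((sim - x) ** 2)))
    ```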

  2. Modeling Ka-band low elevation angle propagation statistics

    NASA Technical Reports Server (NTRS)

    Russell, Thomas A.; Weinfield, John; Pearson, Chris; Ippolito, Louis J.

    1995-01-01

    The statistical variability of the secondary atmospheric propagation effects on satellite communications cannot be ignored at frequencies of 20 GHz or higher, particularly if the propagation margin allocation is such that link availability falls below 99 percent. The secondary effects considered in this paper are gaseous absorption, cloud absorption, and tropospheric scintillation; rain attenuation is the primary effect. Techniques and example results are presented for estimation of the overall combined impact of the atmosphere on satellite communications reliability. Statistical methods are employed throughout and the most widely accepted models for the individual effects are used wherever possible. The degree of correlation between the effects is addressed and some bounds on the expected variability in the combined effects statistics are derived from the expected variability in correlation. Example estimates of combined effects statistics are presented for the Washington D.C. area at 20 GHz and a 5 deg elevation angle. The statistics of water vapor are shown to be sufficient for estimation of the statistics of gaseous absorption at 20 GHz. A computer model based on monthly surface weather is described and tested. Significant improvement in prediction of absorption extremes is demonstrated with the use of path weather data instead of surface data.

  3. Physics of Life: A Model for Non-Newtonian Properties of Living Systems

    NASA Technical Reports Server (NTRS)

    Zak, Michail

    2010-01-01

    This innovation proposes the reconciliation of the evolution of life with the second law of thermodynamics via the introduction of the First Principle for modeling behavior of living systems. The structure of the model is quantum-inspired: it acquires the topology of the Madelung equation in which the quantum potential is replaced with the information potential. As a result, the model captures the most fundamental property of life: progressive evolution, i.e. the ability to evolve from disorder to order without any external interference. The mathematical structure of the model can be obtained from the Newtonian equations of motion (representing the motor dynamics) coupled with the corresponding Liouville equation (representing the mental dynamics) via information forces. All these specific non-Newtonian properties equip the model with levels of complexity that match the complexity of life, and that make the model applicable for describing the behavior of ecological, social, and economic systems. Rather than addressing the six aspects of life (organization, metabolism, growth, adaptation, response to stimuli, and reproduction), this work focuses only on the biosignature, i.e. the mechanical invariants of life, and in particular the geometry and kinematics of the behavior of living things. Living things obey the First Principles of Newtonian mechanics. One main objective of this model is to extend the First Principles of classical physics to include phenomenological behavior of living systems; to develop a new mathematical formalism within the framework of classical dynamics that would allow one to capture the specific properties of natural or artificial living systems, such as formation of the collective mind based upon abstract images of the selves and non-selves; exploitation of this collective mind for communications and predictions of future expected characteristics of evolution; and for making decisions and implementing the corresponding corrections if the expected scenario is different from the originally planned one. This approach postulates that even a primitive living species possesses additional, non-Newtonian properties that are not included in the laws of Newtonian or statistical mechanics. These properties follow from a privileged ability of living systems to possess a self-image (a concept introduced in psychology) and to interact with it. The proposed mathematical system is based on the coupling of the classical dynamical system representing the motor dynamics with the corresponding Liouville equation describing the evolution of initial uncertainties in terms of the probability density and representing the mental dynamics. The coupling is implemented by the information-based supervising forces that can be associated with self-awareness. These forces fundamentally change the pattern of the probability evolution and therefore lead to a major departure of the behavior of living systems from the patterns of both Newtonian and statistical mechanics. This innovation is meant to capture the signature of life based only on observable behavior, not on any biochemistry. This will not prevent the use of this model for developing artificial living systems, as well as for studying some general properties of the behavior of natural, living systems.

  4. Modeling Tropical Cyclone Storm Surge and Wind Induced Risk Along the Bay of Bengal Coastline Using a Statistical Copula

    NASA Astrophysics Data System (ADS)

    Bushra, N.; Trepanier, J. C.; Rohli, R. V.

    2017-12-01

    High winds, torrential rain, and storm surges from tropical cyclones (TCs) cause massive destruction to property and cost the lives of many people. The coastline of the Bay of Bengal (BoB) ranks as one of the most susceptible to TC storm surges in the world due to low-lying elevation and a high frequency of occurrence. Bangladesh suffers the most due to its geographical setting and population density. Various models have been developed to predict storm surge in this region, but none of them quantifies statistical risk with empirical data. This study describes the relationship and dependency between empirical TC storm surge and peak reported wind speed at the BoB using a bivariate statistical copula and data from 1885-2011. An Archimedean, Gumbel copula with margins defined by the empirical distributions is specified as the most appropriate choice for the BoB. The model provides return periods for pairs of TC storm surge and peak wind along the BoB coastline. The BoB can expect a TC with peak reported winds of at least 24 m s^-1 and surge heights of at least 4.0 m, on average, once every 3.2 years, with a quartile pointwise confidence interval of 2.7-3.8 years. In addition, the BoB can expect peak reported winds of 62 m s^-1 and surge heights of at least 8.0 m, on average, once every 115.4 years, with a quartile pointwise confidence interval of 55.8-381.1 years. The purpose of the analysis is to increase the understanding of these dangerous TC characteristics to reduce fatalities and monetary losses into the future. Application of the copula will mitigate future threats of storm surge impacts on coastal communities of the BoB.
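
    The return-period computation can be sketched as follows: estimate the Gumbel copula parameter by inverting Kendall's tau, evaluate the copula at the empirical margins of the event of interest, and convert the joint exceedance probability into a return period using the mean TC arrival rate. The data and the arrival rate below are synthetic placeholders, not the 1885-2011 BoB record.

    ```python
    # Joint return period from a Gumbel copula with empirical margins.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    n = 400
    wind = rng.gumbel(40.0, 10.0, n)               # synthetic peak winds (m/s)
    surge = 0.08 * wind + rng.gumbel(1.5, 0.8, n)  # synthetic surges (m)

    tau, _ = stats.kendalltau(wind, surge)
    theta = 1.0 / (1.0 - tau)                      # Gumbel parameter by tau inversion

    def gumbel_copula(u, v, theta):
        return np.exp(-(((-np.log(u)) ** theta + (-np.log(v)) ** theta) ** (1.0 / theta)))

    # Empirical margins at the event of interest (wind >= 62, surge >= 8).
    u = np.mean(wind <= 62.0)
    v = np.mean(surge <= 8.0)
    p_joint = 1.0 - u - v + gumbel_copula(u, v, theta)   # P(W > 62, S > 8)

    events_per_year = n / 127.0        # hypothetical mean TC rate over 127 years
    print("joint return period (years):", 1.0 / (p_joint * events_per_year))
    ```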

  5. On the stability and dynamics of stochastic spiking neuron models: Nonlinear Hawkes process and point process GLMs

    PubMed Central

    Truccolo, Wilson

    2017-01-01

    Point process generalized linear models (PP-GLMs) provide an important statistical framework for modeling spiking activity in single-neurons and neuronal networks. Stochastic stability is essential when sampling from these models, as done in computational neuroscience to analyze statistical properties of neuronal dynamics and in neuro-engineering to implement closed-loop applications. Here we show, however, that despite passing common goodness-of-fit tests, PP-GLMs estimated from data are often unstable, leading to divergent firing rates. The inclusion of absolute refractory periods is not a satisfactory solution since the activity then typically settles into unphysiological rates. To address these issues, we derive a framework for determining the existence and stability of fixed points of the expected conditional intensity function (CIF) for general PP-GLMs. Specifically, in nonlinear Hawkes PP-GLMs, the CIF is expressed as a function of the previous spike history and exogenous inputs. We use a mean-field quasi-renewal (QR) approximation that decomposes spike history effects into the contribution of the last spike and an average of the CIF over all spike histories prior to the last spike. Fixed points for stationary rates are derived as self-consistent solutions of integral equations. Bifurcation analysis and the number of fixed points predict that the original models can show stable, divergent, and metastable (fragile) dynamics. For fragile models, fluctuations of the single-neuron dynamics predict expected divergence times after which rates approach unphysiologically high values. This metric can be used to estimate the probability of rates to remain physiological for given time periods, e.g., for simulation purposes. We demonstrate the use of the stability framework using simulated single-neuron examples and neurophysiological recordings. Finally, we show how to adapt PP-GLM estimation procedures to guarantee model stability. Overall, our results provide a stability framework for data-driven PP-GLMs and shed new light on the stochastic dynamics of state-of-the-art statistical models of neuronal spiking activity. PMID:28234899

  6. On the stability and dynamics of stochastic spiking neuron models: Nonlinear Hawkes process and point process GLMs.

    PubMed

    Gerhard, Felipe; Deger, Moritz; Truccolo, Wilson

    2017-02-01

    Point process generalized linear models (PP-GLMs) provide an important statistical framework for modeling spiking activity in single-neurons and neuronal networks. Stochastic stability is essential when sampling from these models, as done in computational neuroscience to analyze statistical properties of neuronal dynamics and in neuro-engineering to implement closed-loop applications. Here we show, however, that despite passing common goodness-of-fit tests, PP-GLMs estimated from data are often unstable, leading to divergent firing rates. The inclusion of absolute refractory periods is not a satisfactory solution since the activity then typically settles into unphysiological rates. To address these issues, we derive a framework for determining the existence and stability of fixed points of the expected conditional intensity function (CIF) for general PP-GLMs. Specifically, in nonlinear Hawkes PP-GLMs, the CIF is expressed as a function of the previous spike history and exogenous inputs. We use a mean-field quasi-renewal (QR) approximation that decomposes spike history effects into the contribution of the last spike and an average of the CIF over all spike histories prior to the last spike. Fixed points for stationary rates are derived as self-consistent solutions of integral equations. Bifurcation analysis and the number of fixed points predict that the original models can show stable, divergent, and metastable (fragile) dynamics. For fragile models, fluctuations of the single-neuron dynamics predict expected divergence times after which rates approach unphysiologically high values. This metric can be used to estimate the probability of rates to remain physiological for given time periods, e.g., for simulation purposes. We demonstrate the use of the stability framework using simulated single-neuron examples and neurophysiological recordings. Finally, we show how to adapt PP-GLM estimation procedures to guarantee model stability. Overall, our results provide a stability framework for data-driven PP-GLMs and shed new light on the stochastic dynamics of state-of-the-art statistical models of neuronal spiking activity.
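
    The self-consistency idea is easy to illustrate on a one-neuron toy model with link function g and integrated history-kernel weight w: the stationary rate solves lambda = g(b0 + w*lambda), and the fixed point is locally stable when |g'(b0 + w*lambda) * w| < 1. This sketch only captures the concept; the paper's quasi-renewal treatment of spike history is considerably richer.

    ```python
    # Stationary rate of a toy nonlinear Hawkes neuron as a fixed point.
    import numpy as np

    b0, w = 1.2, -0.08     # hypothetical baseline and integrated kernel weight

    def g(x):
        return np.exp(x)   # exponential link for the conditional intensity

    lam = 1.0
    for _ in range(200):                   # damped fixed-point iteration
        lam = 0.5 * lam + 0.5 * g(b0 + w * lam)

    slope = g(b0 + w * lam) * w            # d/dlam of g(b0 + w*lam) at the fixed point
    print(f"stationary rate ~ {lam:.3f}, locally stable: {abs(slope) < 1}")
    ```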

  7. Isotropy analyses of the Planck convergence map

    NASA Astrophysics Data System (ADS)

    Marques, G. A.; Novaes, C. P.; Bernui, A.; Ferreira, I. S.

    2018-01-01

    The presence of matter in the path of relic photons causes distortions in the angular pattern of the cosmic microwave background (CMB) temperature fluctuations, modifying their properties in a slight but measurable way. Recently, the Planck Collaboration released the estimated convergence map, an integrated measure of the large-scale matter distribution that produced the weak gravitational lensing (WL) phenomenon observed in Planck CMB data. We perform exhaustive analyses of this convergence map, calculating the variance in small and large regions of the sky, but excluding the area masked due to Galactic contaminations, and compare them with the features expected in the set of simulated convergence maps also released by the Planck Collaboration. Our goal is to search for sky directions or regions where the WL imprints anomalous signatures on the variance estimator, revealed through a χ2 analysis at a statistically significant level. In the local analysis of the Planck convergence map, we identified eight patches of the sky in disagreement, at more than 2σ, with what is observed in the average of the simulations. In contrast, in the large-regions analysis we found no statistically significant discrepancies, but, interestingly, the regions with the highest χ2 values are surrounding the ecliptic poles. Thus, our results show a good agreement with the features expected by the Λ cold dark matter concordance model, as given by the simulations. Yet, the outlier regions found here could suggest that the data still contain residual contamination, like noise, due to over- or underestimation of systematic effects in the simulation data set.

  8. Life expectancy in elderly patients following burns injury.

    PubMed

    Sepehripour, Sarvnaz; Duggineni, Sirisha; Shahsavari, Somaya; Dheansa, Baljit

    2018-05-18

    Burn injuries commonly occur in vulnerable age and social groups. Previous research has shown that frailty may represent a more important marker of adverse outcome in healthcare than chronological age (Roberts et al., 2012). In this paper we determined the relationship between burn injury, frailty, co-morbidities and long-term survival. Retrospective data were collected from patients aged 75 with burns injuries, treated and discharged at Queen Victoria Hospital. The Clinical Frailty Scale (Rockwood et al., 2005) was used to calculate frailty at the time of admission. The expected mortality age (life expectancy) of deceased patients was obtained from two survival predictors. The data show a statistically significant correlation between frailty score and complications, and a statistically significant correlation between total body surface area percentage and complications. No significant difference was found between the expected and observed age of death or life expectancy amongst the deceased (p value of 0.109). Based on the data from our unit, sustaining a burn as an elderly person does not reduce life expectancy. Medical and surgical complications (immediate, early and late), although more frequent with greater frailty and burn TBSA, do not adversely affect survival in this population. Copyright © 2018 Elsevier Ltd. All rights reserved.

  9. Repeated Measurements on Distinct Scales With Censoring—A Bayesian Approach Applied to Microarray Analysis of Maize

    PubMed Central

    Love, Tanzy; Carriquiry, Alicia

    2009-01-01

    We analyze data collected in a somatic embryogenesis experiment carried out on Zea mays at Iowa State University. The main objective of the study was to identify the set of genes in maize that actively participate in embryo development. Embryo tissue was sampled and analyzed at various time periods and under different mediums and light conditions. As is the case in many microarray experiments, the operator scanned each slide multiple times to find the slide-specific ‘optimal’ laser and sensor settings. The multiple readings of each slide are repeated measurements on different scales with differing censoring; they cannot be considered to be replicate measurements in the traditional sense. Yet it has been shown that the choice of reading can have an impact on genetic inference. We propose a hierarchical modeling approach to estimating gene expression that combines all available readings on each spot and accounts for censoring in the observed values. We assess the statistical properties of the proposed expression estimates using a simulation experiment. As expected, combining all available scans using an approach with good statistical properties results in expression estimates with noticeably lower bias and root mean squared error relative to other approaches that have been proposed in the literature. Inferences drawn from the somatic embryogenesis experiment, which motivated this work, changed drastically depending on whether the data were analyzed using the standard approaches or the methodology we propose. PMID:19960120

  10. 36 CFR 902.12 - Maintenance of statistics; annual report to Congress.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    Title 36, Parks, Forests, and Public Property (2010-07-01): PENNSYLVANIA AVENUE DEVELOPMENT CORPORATION, FREEDOM OF INFORMATION ACT, General Administration, § 902.12 Maintenance of statistics; annual report to Congress…

  11. Spatial and Temporal Distribution of Cloud Properties Observed by MODIS: Preliminary Level-3 Results from the Collection 5 Reprocessing

    NASA Technical Reports Server (NTRS)

    King, Michael D.; Platnick, Steven; Hubanks, Paul; Pincus, Robert

    2006-01-01

    The Moderate Resolution Imaging Spectroradiometer (MODIS) was developed by NASA and launched onboard the Terra spacecraft on December 18, 1999 and Aqua spacecraft on May 4, 2002. It achieved its final orbit and began Earth observations on February 24, 2000 for Terra and June 24, 2002 for Aqua. A comprehensive set of operational algorithms for the retrieval of cloud physical and optical properties (optical thickness, effective particle radius, water path, thermodynamic phase) have recently been updated and are being used in the new "Collection 5" processing stream being produced by the MODIS Adaptive Processing System (MODAPS) at NASA GSFC. All Terra and Aqua data are undergoing Collection 5 reprocessing with an expected completion date by the end of 2006. The archived products from these algorithms include 1 km pixel-level (Level-2) and global gridded Level-3 products. The cloud products have applications in climate change studies, climate modeling, numerical weather prediction, as well as fundamental atmospheric research. In this talk, we will summarize the available Level-3 cloud properties and their associated statistical data sets, and show preliminary Terra and Aqua results from the available Collection 5 reprocessing effort. Anticipated results include the latitudinal distribution of cloud optical and radiative properties for both liquid water and ice clouds, as well as joint histograms of cloud optical thickness and effective radius for selected geographical locations around the world.

  12. Radar observations of individual rain drops in the free atmosphere

    PubMed Central

    Schmidt, Jerome M.; Flatau, Piotr J.; Harasti, Paul R.; Yates, Robert D.; Littleton, Ricky; Pritchard, Michael S.; Fischer, Jody M.; Fischer, Erin J.; Kohri, William J.; Vetter, Jerome R.; Richman, Scott; Baranowski, Dariusz B.; Anderson, Mark J.; Fletcher, Ed; Lando, David W.

    2012-01-01

    Atmospheric remote sensing has played a pivotal role in the increasingly sophisticated representation of clouds in the numerical models used to assess global and regional climate change. This has been accomplished because the underlying bulk cloud properties can be derived from a statistical analysis of the returned microwave signals scattered by a diverse ensemble comprised of numerous cloud hydrometeors. A new Doppler radar, previously used to track small debris particles shed from the NASA space shuttle during launch, is shown to also have the capacity to detect individual cloud hydrometeors in the free atmosphere. Similar to the traces left behind on film by subatomic particles, larger cloud particles were observed to leave a well-defined radar signature (or streak), which could be analyzed to infer the underlying particle properties. We examine the unique radar and environmental conditions leading to the formation of the radar streaks and develop a theoretical framework which reveals the regulating role of the background radar reflectivity on their observed characteristics. This main expectation from theory is examined through an analysis of the drop properties inferred from radar and in situ aircraft measurements obtained in two contrasting regions of an observed multicellular storm system. The observations are placed in context of the parent storm circulation through the use of the radar’s unique high-resolution waveforms, which allow the bulk and individual hydrometeor properties to be inferred at the same time. PMID:22652569

  13. Radar observations of individual rain drops in the free atmosphere.

    PubMed

    Schmidt, Jerome M; Flatau, Piotr J; Harasti, Paul R; Yates, Robert D; Littleton, Ricky; Pritchard, Michael S; Fischer, Jody M; Fischer, Erin J; Kohri, William J; Vetter, Jerome R; Richman, Scott; Baranowski, Dariusz B; Anderson, Mark J; Fletcher, Ed; Lando, David W

    2012-06-12

    Atmospheric remote sensing has played a pivotal role in the increasingly sophisticated representation of clouds in the numerical models used to assess global and regional climate change. This has been accomplished because the underlying bulk cloud properties can be derived from a statistical analysis of the returned microwave signals scattered by a diverse ensemble comprised of numerous cloud hydrometeors. A new Doppler radar, previously used to track small debris particles shed from the NASA space shuttle during launch, is shown to also have the capacity to detect individual cloud hydrometeors in the free atmosphere. Similar to the traces left behind on film by subatomic particles, larger cloud particles were observed to leave a well-defined radar signature (or streak), which could be analyzed to infer the underlying particle properties. We examine the unique radar and environmental conditions leading to the formation of the radar streaks and develop a theoretical framework which reveals the regulating role of the background radar reflectivity on their observed characteristics. This main expectation from theory is examined through an analysis of the drop properties inferred from radar and in situ aircraft measurements obtained in two contrasting regions of an observed multicellular storm system. The observations are placed in context of the parent storm circulation through the use of the radar's unique high-resolution waveforms, which allow the bulk and individual hydrometeor properties to be inferred at the same time.

  14. Tandem mass spectrometry of human tryptic blood peptides calculated by a statistical algorithm and captured by a relational database with exploration by a general statistical analysis system.

    PubMed

    Bowden, Peter; Beavis, Ron; Marshall, John

    2009-11-02

    A goodness of fit test may be used to assign tandem mass spectra of peptides to amino acid sequences and to directly calculate the expected probability of mis-identification. The product of the peptide expectation values directly yields the probability that the parent protein has been mis-identified. A relational database could capture the mass spectral data, the best fit results, and permit subsequent calculations by a general statistical analysis system. The many files of the Hupo blood protein data correlated by X!TANDEM against the proteins of ENSEMBL were collected into a relational database. A redundant set of 247,077 proteins and peptides were correlated by X!TANDEM, and that was collapsed to a set of 34,956 peptides from 13,379 distinct proteins. About 6875 distinct proteins were only represented by a single distinct peptide, 2866 proteins showed 2 distinct peptides, and 3454 proteins showed at least three distinct peptides by X!TANDEM. More than 99% of the peptides were associated with proteins that had cumulative expectation values, i.e. probability of false positive identification, of one in one hundred or less. The distribution of peptides per protein from X!TANDEM was significantly different than those expected from random assignment of peptides.
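
    The protein-level calculation described above is a one-liner once the peptide expectation values are known; summing logarithms avoids floating-point underflow when many peptides are combined. The values below are hypothetical.

    ```python
    # Protein mis-identification probability as the product of peptide
    # expectation values, computed in log space.
    import math

    peptide_evalues = [2e-4, 7e-3, 5e-2]   # hypothetical per-peptide values

    log10_protein = sum(math.log10(e) for e in peptide_evalues)
    print(f"protein mis-identification probability ~ 10^{log10_protein:.1f}")
    ```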

  15. The Interplay between Spoken Language and Informal Definitions of Statistical Concepts

    ERIC Educational Resources Information Center

    Lavy, Ilana; Mashiach-Eizenberg, Michal

    2009-01-01

    Various terms are used to describe mathematical concepts, in general, and statistical concepts, in particular. Regarding statistical concepts in the Hebrew language, some of these terms have the same meaning both in their everyday use and in mathematics, such as Mode; some of them have a different meaning, such as Expected value and Life…

  16. Linking Science and Statistics: Curriculum Expectations in Three Countries

    ERIC Educational Resources Information Center

    Watson, Jane M.

    2017-01-01

    This paper focuses on the curriculum links between statistics and science that teachers need to understand and apply in order to be effective teachers of the two fields of study. Meaningful statistics does not exist without context and science is the context for this paper. Although curriculum documents differ from country to country, this paper…

  17. Positive smoking outcome expectancies mediate the association between negative affect and smoking urge among women during a quit attempt.

    PubMed

    Cano, Miguel Ángel; Lam, Cho Y; Chen, Minxing; Adams, Claire E; Correa-Fernández, Virmarie; Stewart, Diana W; McClure, Jennifer B; Cinciripini, Paul M; Wetter, David W

    2014-08-01

    Ecological momentary assessment was used to examine associations between negative affect, positive smoking outcome expectancies, and smoking urge during the first 7 days of a smoking quit attempt. Participants were 302 female smokers who enrolled in an individually tailored smoking cessation treatment study. Multilevel mediation analysis was used to examine the temporal relationship among the following: (a) the effects of negative affect and positive smoking outcome expectancies at 1 assessment point (e.g., time j) on smoking urge at the subsequent time point (e.g., time j + 1) in Model 1; and, (b) the effects of negative affect and smoking urge at time j on positive smoking outcome expectancies at time j + 1 in Model 2. The results from Model 1 showed a statistically significant effect of negative affect at time j on smoking urge at time j + 1, and this effect was mediated by positive smoking outcome expectancies at time j, both within- and between-participants. In Model 2, the within-participant indirect effect of negative affect at time j on positive smoking outcome expectancies at time j + 1 through smoking urge at time j was nonsignificant. However, a statistically significant indirect between-participants effect was found in Model 2. The findings support the hypothesis that urge and positive smoking outcome expectancies increase as a function of negative affect, and suggest a stronger effect of expectancies on urge as opposed to the effect of urge on expectancies.

  18. Positive Smoking Outcome Expectancies Mediate the Association between Negative Affect and Smoking Urge among Women During a Quit Attempt

    PubMed Central

    Cano, Miguel Ángel; Lam, Cho Y.; Chen, Minxing; Adams, Claire E.; Correa-Fernández, Virmarie; Stewart, Diana W.; McClure, Jennifer B.; Cinciripini, Paul M.; Wetter, David W.

    2014-01-01

    Ecological momentary assessment was used to examine associations between negative affect, positive smoking outcome expectancies, and smoking urge during the first 7 days of a smoking quit attempt. Participants were 302 female smokers who enrolled in an individually tailored smoking cessation treatment study. Multilevel mediation analysis was used to examine the temporal relationship among: 1) the effects of negative affect and positive smoking outcome expectancies at one assessment point (e.g., time j) on smoking urge at the subsequent time point (e.g., time j + 1) in Model 1; and, 2) the effects of negative affect and smoking urge at time j on positive smoking outcome expectancies at time j + 1 in Model 2. The results from Model 1 showed a statistically significant effect of negative affect at time j on smoking urge at time j + 1, and this effect was mediated by positive smoking outcome expectancies at time j, both within- and between-participant. In Model 2, the within-participant indirect effect of negative affect at time j on positive smoking outcome expectancies at time j + 1 through smoking urge at time j was nonsignificant. However, a statistically significant indirect between-participant effect was found in Model 2. The findings support the hypothesis that urge and positive smoking outcome expectancies increase as a function of negative affect, and suggest a stronger effect of expectancies on urge as opposed to the effect of urge on expectancies. PMID:24796849

  19. Comparing geological and statistical approaches for element selection in sediment tracing research

    NASA Astrophysics Data System (ADS)

    Laceby, J. Patrick; McMahon, Joe; Evrard, Olivier; Olley, Jon

    2015-04-01

    Elevated suspended sediment loads reduce reservoir capacity and significantly increase the cost of operating water treatment infrastructure, making the management of sediment supply to reservoirs of increasing importance. Sediment fingerprinting techniques can be used to determine the relative contributions of different sources of sediment accumulating in reservoirs. The objective of this research is to compare geological and statistical approaches to element selection for sediment fingerprinting modelling. Time-integrated samplers (n=45) were used to obtain source samples from four major subcatchments flowing into the Baroon Pocket Dam in South East Queensland, Australia. The geochemistry of potential sources was compared to the geochemistry of sediment cores (n=12) sampled in the reservoir. The geochemical approach selected elements for modelling that provided expected, observed and statistical discrimination between sediment sources. Two statistical approaches selected elements for modelling with the Kruskal-Wallis H-test and Discriminatory Function Analysis (DFA). In particular, two different significance levels (0.05 & 0.35) for the DFA were included to investigate the importance of element selection on modelling results. A distribution model determined the relative contributions of different sources to sediment sampled in the Baroon Pocket Dam. Elemental discrimination was expected between one subcatchment (Obi Obi Creek) and the remaining subcatchments (Lexys, Falls and Bridge Creek). Six major elements were expected to provide discrimination. Of these six, only Fe2O3 and SiO2 provided expected, observed and statistical discrimination. Modelling results with this geological approach indicated 36% (+/- 9%) of sediment sampled in the reservoir cores were from mafic-derived sources and 64% (+/- 9%) were from felsic-derived sources. The geological and the first statistical approach (DFA0.05) differed by only 1% (σ 5%) for 5 out of 6 model groupings, with only the Lexys Creek modelling results differing significantly (35%). The statistical model with expanded elemental selection (DFA0.35) differed from the geological model by an average of 30% for all 6 models. Elemental selection for sediment fingerprinting therefore has the potential to impact modelling results. Accordingly, it is important to incorporate both robust geological and statistical approaches when selecting elements for sediment fingerprinting. For the Baroon Pocket Dam, management should focus on reducing the supply of sediments derived from felsic sources in each of the subcatchments.
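
    The Kruskal-Wallis screening step can be sketched as below: for each element, test whether its concentration distributions differ across the source subcatchments, and retain the elements significant at the chosen level (0.05 in the stricter run; the subsequent DFA step is not shown). The concentration arrays are synthetic placeholders.

    ```python
    # Element screening by Kruskal-Wallis H-test across sediment sources.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)
    elements = ["Fe2O3", "SiO2", "CaO", "K2O"]
    sources = ["Obi Obi", "Lexys", "Falls", "Bridge"]

    # Hypothetical concentrations: 12 samples per source and element.
    data = {s: {e: rng.normal(loc, 1.0, 12)
                for e, loc in zip(elements, rng.uniform(5, 15, len(elements)))}
            for s in sources}

    selected = []
    for e in elements:
        h, p = stats.kruskal(*(data[s][e] for s in sources))
        if p < 0.05:
            selected.append(e)
        print(f"{e}: H = {h:.2f}, p = {p:.3g}")
    print("elements retained for the mixing model:", selected)
    ```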

  20. Simulating flaring events in complex active regions driven by observed magnetograms

    NASA Astrophysics Data System (ADS)

    Dimitropoulou, M.; Isliker, H.; Vlahos, L.; Georgoulis, M. K.

    2011-05-01

    Context. We interpret solar flares as events originating in active regions that have reached the self organized critical state, by using a refined cellular automaton model with initial conditions derived from observations. Aims: We investigate whether the system, with its imposed physical elements, reaches a self organized critical state and whether well-known statistical properties of flares, such as scaling laws observed in the distribution functions of characteristic parameters, are reproduced after this state has been reached. Methods: To investigate whether the distribution functions of total energy, peak energy and event duration follow the expected scaling laws, we first applied a nonlinear force-free extrapolation that reconstructs the three-dimensional magnetic fields from two-dimensional vector magnetograms. We then locate magnetic discontinuities exceeding a threshold in the Laplacian of the magnetic field. These discontinuities are relaxed in local diffusion events, implemented in the form of cellular automaton evolution rules. Subsequent loading and relaxation steps lead the system to self organized criticality, after which the statistical properties of the simulated events are examined. Physical requirements, such as the divergence-free condition for the magnetic field vector, are approximately imposed on all elements of the model. Results: Our results show that self organized criticality is indeed reached when applying specific loading and relaxation rules. Power-law indices obtained from the distribution functions of the modeled flaring events are in good agreement with observations. Single power laws (peak and total flare energy) are obtained, as are power laws with exponential cutoff and double power laws (flare duration). The results are also compared with observational X-ray data from the GOES satellite for our active-region sample. Conclusions: We conclude that well-known statistical properties of flares are reproduced after the system has reached self organized criticality. A significant enhancement of our refined cellular automaton model is that it commences the simulation from observed vector magnetograms, thus facilitating energy calculation in physical units. The model described in this study remains consistent with fundamental physical requirements, and imposes physically meaningful driving and redistribution rules.
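
    The loading/relaxation cycle has the same skeleton as the classic Lu-Hamilton automaton: drive a field slowly, flag sites whose Laplacian excess crosses a threshold, and redistribute that excess conservatively until the grid is stable, recording the avalanche size. The minimal 2-D sketch below is generic; the paper's rules, divergence-free constraints, and magnetogram-derived initial conditions are considerably more elaborate.

    ```python
    # Minimal 2-D Lu-Hamilton-style avalanche automaton.
    import numpy as np

    rng = np.random.default_rng(5)
    L, Bc, steps = 32, 1.0, 20000
    B = np.zeros((L, L))
    sizes = []

    def excess(B):
        nn = (np.roll(B, 1, 0) + np.roll(B, -1, 0) +
              np.roll(B, 1, 1) + np.roll(B, -1, 1))
        return B - nn / 4.0                 # discrete Laplacian excess dB

    for _ in range(steps):
        i, j = rng.integers(L, size=2)
        B[i, j] += rng.uniform(0.0, 0.2)    # slow, localized driving
        size = 0
        while True:
            dB = excess(B)
            unstable = np.abs(dB) > Bc
            if not unstable.any():
                break
            size += int(unstable.sum())
            shed = np.where(unstable, dB, 0.0)
            B -= 0.8 * shed                 # each site sheds 4/5 of its excess...
            for ax, sh in [(0, 1), (0, -1), (1, 1), (1, -1)]:
                B += 0.2 * np.roll(shed, sh, axis=ax)  # ...neighbours gain 1/5
            B[0, :] = B[-1, :] = B[:, 0] = B[:, -1] = 0.0  # open boundaries
        if size:
            sizes.append(size)

    # After the transient, avalanche sizes should approach a power law.
    print("avalanches:", len(sizes), "largest:", max(sizes))
    ```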

  1. Application of real rock pore-throat statistics to a regular pore network model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rakibul, M.; Sarker, H.; McIntyre, D.

    2011-01-01

    This work reports the application of real rock statistical data to a previously developed regular pore network model in an attempt to produce an accurate simulation tool with low computational overhead. A core plug from the St. Peter Sandstone formation in Indiana was scanned with a high resolution micro CT scanner. The pore-throat statistics of the three-dimensional reconstructed rock were extracted and the distribution of the pore-throat sizes was applied to the regular pore network model. In order to keep the equivalent model regular, only the throat area or the throat radius was varied. Ten realizations of randomly distributed throat sizes were generated to simulate the drainage process, and relative permeability was calculated and compared with the experimentally determined values of the original rock sample. The numerical and experimental procedures are explained in detail and the performance of the model in relation to the experimental data is discussed and analyzed. Petrophysical properties such as relative permeability are important in many applied fields such as production of petroleum fluids, enhanced oil recovery, carbon dioxide sequestration, and ground water flow. Relative permeability data are used for a wide range of conventional reservoir engineering calculations and in numerical reservoir simulation. Two-phase oil-water relative permeability data were generated on the same core plug from both the pore network model and the experimental procedure. The shape and size of the relative permeability curves were compared and analyzed; a good match was observed for the wetting-phase relative permeability, but for the non-wetting phase the simulation results deviated from the experimental ones. Efforts to determine petrophysical properties of rocks using numerical techniques aim to eliminate the necessity of regular core analysis, which can be time consuming and expensive, so a numerical technique is expected to be fast and to produce reliable results. In applied engineering, a quick result with reasonable accuracy is sometimes preferable to more time-consuming results. The present work is an effort to check the accuracy and validity of a previously developed pore network model for obtaining important petrophysical properties of rocks based on cutting-sized sample data.
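
    The statistics-to-network mapping can be sketched by drawing throat radii for a regular lattice from a distribution fitted to the micro-CT pore-throat histogram, then converting radii to hydraulic conductances via Hagen-Poiseuille. The lognormal parameters here are illustrative assumptions, not the St. Peter Sandstone values.

    ```python
    # Assign statistically sampled throat radii to a regular pore network.
    import numpy as np

    rng = np.random.default_rng(6)
    nx, ny = 20, 20                        # regular 2-D lattice of pores

    # Hypothetical lognormal fit to the measured pore-throat radii.
    mu, sigma = np.log(8e-6), 0.5          # ~8 micron median radius
    r_h = rng.lognormal(mu, sigma, (nx, ny - 1))   # horizontal throats
    r_v = rng.lognormal(mu, sigma, (nx - 1, ny))   # vertical throats

    length, visc = 40e-6, 1e-3             # throat length (m), viscosity (Pa s)
    g_h = np.pi * r_h ** 4 / (8 * visc * length)   # Hagen-Poiseuille conductance
    g_v = np.pi * r_v ** 4 / (8 * visc * length)

    radii = np.concatenate([r_h.ravel(), r_v.ravel()])
    print("mean throat radius (um):", 1e6 * radii.mean())  # one of ten realizations
    ```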

  2. Application of real rock pore-throat statistics to a regular pore network model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sarker, M.R.; McIntyre, D.; Ferer, M.

    2011-01-01

    This work reports the application of real rock statistical data to a previously developed regular pore network model in an attempt to produce an accurate simulation tool with low computational overhead. A core plug from the St. Peter Sandstone formation in Indiana was scanned with a high resolution micro CT scanner. The pore-throat statistics of the three-dimensional reconstructed rock were extracted and the distribution of the pore-throat sizes was applied to the regular pore network model. In order to keep the equivalent model regular, only the throat area or the throat radius was varied. Ten realizations of randomly distributed throat sizes were generated to simulate the drainage process, and relative permeability was calculated and compared with the experimentally determined values of the original rock sample. The numerical and experimental procedures are explained in detail and the performance of the model in relation to the experimental data is discussed and analyzed. Petrophysical properties such as relative permeability are important in many applied fields such as production of petroleum fluids, enhanced oil recovery, carbon dioxide sequestration, and ground water flow. Relative permeability data are used for a wide range of conventional reservoir engineering calculations and in numerical reservoir simulation. Two-phase oil-water relative permeability data were generated on the same core plug from both the pore network model and the experimental procedure. The shape and size of the relative permeability curves were compared and analyzed; a good match was observed for the wetting-phase relative permeability, but for the non-wetting phase the simulation results deviated from the experimental ones. Efforts to determine petrophysical properties of rocks using numerical techniques aim to eliminate the necessity of regular core analysis, which can be time consuming and expensive, so a numerical technique is expected to be fast and to produce reliable results. In applied engineering, a quick result with reasonable accuracy is sometimes preferable to more time-consuming results. The present work is an effort to check the accuracy and validity of a previously developed pore network model for obtaining important petrophysical properties of rocks based on cutting-sized sample data.

  3. Correcting power and p-value calculations for bias in diffusion tensor imaging.

    PubMed

    Lauzon, Carolyn B; Landman, Bennett A

    2013-07-01

    Diffusion tensor imaging (DTI) provides quantitative parametric maps sensitive to tissue microarchitecture (e.g., fractional anisotropy, FA). These maps are estimated through computational processes and subject to random distortions including variance and bias. Traditional statistical procedures commonly used for study planning (including power analyses and p-value/alpha-rate thresholds) specifically model variability, but neglect potential impacts of bias. Herein, we quantitatively investigate the impacts of bias in DTI on hypothesis test properties (power and alpha-rate) using a two-sided hypothesis testing framework. We present theoretical evaluation of bias on hypothesis test properties, evaluate the bias estimation technique SIMEX for DTI hypothesis testing using simulated data, and evaluate the impacts of bias on spatially varying power and alpha rates in an empirical study of 21 subjects. Bias is shown to inflate alpha rates, distort the power curve, and cause significant power loss even in empirical settings where the expected difference in bias between groups is zero. These adverse effects can be attenuated by properly accounting for bias in the calculation of power and p-values. Copyright © 2013 Elsevier Inc. All rights reserved.
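
    The core effect is easy to reproduce for a two-sided z-test: a bias b shifts the noncentrality of the test statistic by b/se, which inflates the alpha-rate under the null and distorts the power curve under the alternative. The numbers below are illustrative, not values from the DTI study.

    ```python
    # Rejection probability of a two-sided z-test with a biased estimate.
    from scipy.stats import norm

    alpha, se = 0.05, 0.02      # nominal level, std. error of the group difference
    z = norm.ppf(1 - alpha / 2)

    def reject_prob(delta, b):
        """P(reject) when the true difference is delta and the bias is b."""
        shift = (delta + b) / se
        return norm.cdf(-z - shift) + norm.sf(z - shift)

    print("alpha, no bias:       ", reject_prob(0.00, 0.00))   # ~0.05
    print("alpha, biased groups: ", reject_prob(0.00, 0.01))   # inflated
    print("power, no bias:       ", reject_prob(0.05, 0.00))
    print("power, opposing bias: ", reject_prob(0.05, -0.03))  # power loss
    ```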

  4. Psychometric properties of a suicide screen for adjudicated youth in residential care.

    PubMed

    Langhinrichsen-Rohling, Jennifer; Hudson, Kenneth; Lamis, Dorian A; Carr, Nicole

    2012-04-01

    There is a need to efficiently and effectively screen adjudicated youth residing within the juvenile justice system for suicide proneness. Accordingly, in the current study, the psychometric properties of the Life Attitude Schedule: Short Form (LAS:S), a 24-item risk assessment for suicide proneness, were assessed using data from adjudicated youth residing in an alternative sentencing facility (n = 130). As predicted, statistically significant correlations were obtained between total LAS:S suicide proneness scores and reports of recent suicide ideation and hopelessness. Contrary to expectation, the previously reported 2-factor model for the LAS:S, with Factor 1 representing physical unhealthiness and Factor 2 representing psychological death, poorly fit the data. In adjudicated youth, we found that a single factor model derived from the 4 LAS:S subscales produced a better fit to the data than the 2-factor model. The death-related, self-related, injury-related, and negative health-related behaviors contained on the LAS:S shared common variance in these youth. A clinical implication is that practitioners can effectively use the total LAS:S score when screening adjudicated youth for suicide proneness.

  5. Portable handheld diffuse reflectance spectroscopy system for clinical evaluation of skin: a pilot study in psoriasis patients

    PubMed Central

    Tzeng, Shih-Yu; Guo, Jean-Yan; Yang, Chao-Chun; Hsu, Chao-Kai; Huang, Hung Ji; Chou, Shih-Jie; Hwang, Chi-Hung; Tseng, Sheng-Hao

    2016-01-01

    Diffuse reflectance spectroscopy (DRS) has been utilized to study biological tissues for a variety of applications. However, many DRS systems are not designed for handheld use and/or are relatively expensive, which limits the extensive clinical use of this technique. In this paper, we report a handheld, low-cost DRS system, consisting of a light source, optical switch, and a spectrometer, that can precisely quantify the optical properties of tissue samples in the clinical setting. The handheld DRS system was employed to determine the skin chromophore concentrations and the absorption and scattering properties of 11 patients with psoriasis. The measurement results were compared to the clinical severity of psoriasis as evaluated by dermatologists using PASI (Psoriasis Area and Severity Index) scores. Our statistical analyses indicated that the handheld DRS system could be a useful non-invasive tool for objective evaluation of the severity of psoriasis. It is expected that the handheld system can be used for the objective evaluation and monitoring of various skin diseases such as keloid and psoriasis. PMID:26977366

  6. Stress transmission through a model system of cohesionless elastic grains

    NASA Astrophysics Data System (ADS)

    Da Silva, Miguel; Rajchenbach, Jean

    2000-08-01

    Understanding the mechanical properties of granular materials is important for applications in civil and chemical engineering, geophysical sciences and the food industry, as well as for the control or prevention of avalanches and landslides. Unlike continuous media, granular materials lack cohesion, and cannot resist tensile stresses. Current descriptions of the mechanical properties of collections of cohesionless grains have relied either on elasto-plastic models classically used in civil engineering, or on a recent model involving hyperbolic equations. The former models suggest that collections of elastic grains submitted to a compressive load will behave elastically. Here we present the results of an experiment on a two-dimensional model system (made of discrete square cells submitted to a point load) in which the region in which the stress is confined is photoelastically visualized as a parabola. These results, which can be interpreted within a statistical framework, demonstrate that the collective response of the pile contradicts the standard elastic predictions and supports a diffusive description of stress transmission. We expect that these findings will be applicable to problems in soil mechanics, such as the behaviour of cohesionless soils or sand piles.

  7. Galactic dual population models of gamma-ray bursts

    NASA Technical Reports Server (NTRS)

    Higdon, J. C.; Lingenfelter, R. E.

    1994-01-01

    We investigate in more detail the properties of two-population models for gamma-ray bursts in the galactic disk and halo. We calculate the gamma-ray burst statistical properties ⟨V/V_max⟩, ⟨cos Θ⟩, and ⟨sin² b⟩ as functions of the detection flux threshold for bursts coming from both Galactic disk and massive halo populations. We consider halo models inferred from the observational constraints on the large-scale Galactic structure and we compare the expected values of ⟨V/V_max⟩, ⟨cos Θ⟩, and ⟨sin² b⟩ with those measured by the Burst and Transient Source Experiment (BATSE) and other detectors. We find that the measured values are consistent with solely Galactic populations having a range of halo distributions, mixed with local disk distributions, which can account for as much as approximately 25% of the observed BATSE bursts. M31 does not contribute to these modeled bursts. We also demonstrate, contrary to recent arguments, that the size-frequency distributions of dual population models are quite consistent with the BATSE observations.

  8. Chaotic Lagrangian models for turbulent relative dispersion.

    PubMed

    Lacorata, Guglielmo; Vulpiani, Angelo

    2017-04-01

    A deterministic multiscale dynamical system is introduced and discussed as a prototype model for relative dispersion in stationary, homogeneous, and isotropic turbulence. Unlike stochastic diffusion models, here trajectory transport and mixing properties are entirely controlled by Lagrangian chaos. The anomalous "sweeping effect," a known drawback common to kinematic simulations, is removed through the use of quasi-Lagrangian coordinates. Lagrangian dispersion statistics of the model are accurately analyzed by computing the finite-scale Lyapunov exponent (FSLE), which is the optimal measure of the scaling properties of dispersion. FSLE scaling exponents provide a severe test to decide whether model simulations are in agreement with theoretical expectations and/or observation. The results of our numerical experiments cover a wide range of "Reynolds numbers" and show that chaotic deterministic flows can be very efficient, and numerically low-cost, models of turbulent trajectories in stationary, homogeneous, and isotropic conditions. The mathematics of the model is relatively simple, and, in a geophysical context, potential applications may regard small-scale parametrization issues in general circulation models, mixed layer, and/or boundary layer turbulence models as well as Lagrangian predictability studies.
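
    The FSLE itself is straightforward to estimate from pair trajectories: fix a geometric ladder of separation thresholds delta_n = r^n * delta_0, record the mean time tau needed to grow from delta_n to delta_{n+1}, and set lambda(delta_n) = ln(r) / <tau>. The toy separation dynamics below (exponential growth at rate 0.5 plus multiplicative noise) is an assumed stand-in for the paper's multiscale system.

    ```python
    # Finite-scale Lyapunov exponent from an ensemble of pair separations.
    import numpy as np

    rng = np.random.default_rng(7)
    dt, r, lam_true = 0.01, 1.4, 0.5
    thresholds = 1e-6 * r ** np.arange(25)

    def crossing_times(delta0=1e-6, tmax=200.0):
        """First times at which one pair's separation exceeds each threshold."""
        d, t, out, k = delta0, 0.0, [], 0
        while t < tmax and k < len(thresholds):
            d *= np.exp((lam_true + 0.3 * rng.standard_normal()) * dt)
            t += dt
            while k < len(thresholds) and d >= thresholds[k]:
                out.append(t)
                k += 1
        return out

    ens = [crossing_times() for _ in range(300)]
    for k in range(0, len(thresholds) - 1, 6):
        taus = [e[k + 1] - e[k] for e in ens if len(e) > k + 1]
        fsle = np.log(r) / np.mean(taus)   # should recover ~lam_true
        print(f"delta = {thresholds[k]:.1e}:  FSLE = {fsle:.2f}")
    ```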

  9. Chaotic Lagrangian models for turbulent relative dispersion

    NASA Astrophysics Data System (ADS)

    Lacorata, Guglielmo; Vulpiani, Angelo

    2017-04-01

    A deterministic multiscale dynamical system is introduced and discussed as a prototype model for relative dispersion in stationary, homogeneous, and isotropic turbulence. Unlike stochastic diffusion models, here trajectory transport and mixing properties are entirely controlled by Lagrangian chaos. The anomalous "sweeping effect," a known drawback common to kinematic simulations, is removed through the use of quasi-Lagrangian coordinates. Lagrangian dispersion statistics of the model are accurately analyzed by computing the finite-scale Lyapunov exponent (FSLE), which is the optimal measure of the scaling properties of dispersion. FSLE scaling exponents provide a severe test to decide whether model simulations are in agreement with theoretical expectations and/or observations. The results of our numerical experiments cover a wide range of "Reynolds numbers" and show that chaotic deterministic flows can be very efficient, and numerically low-cost, models of turbulent trajectories in stationary, homogeneous, and isotropic conditions. The mathematics of the model is relatively simple, and, in a geophysical context, potential applications may concern small-scale parametrization issues in general circulation models, mixed-layer and/or boundary-layer turbulence models, as well as Lagrangian predictability studies.

  10. Statistical properties of measures of association and the Kappa statistic for assessing the accuracy of remotely sensed data using double sampling

    Treesearch

    Mohammed A. Kalkhan; Robin M. Reich; Raymond L. Czaplewski

    1996-01-01

    A Monte Carlo simulation was used to evaluate the statistical properties of measures of association and the Kappa statistic under double sampling with replacement. Three error matrices, representing three levels of classification accuracy of Landsat TM data for four forest cover types in North Carolina, were evaluated. The overall accuracy of the five indices ranged from 0.35...

  11. Large deformation analysis of axisymmetric inhomogeneities including coupled elastic and plastic anisotropy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brannon, R.M.

    1996-12-31

    A mathematical framework is developed for the study of materials containing axisymmetric inclusions or flaws such as ellipsoidal voids, penny-shaped cracks, or fibers of circular cross-section. The general case of nonuniform statistical distributions of such heterogeneities is attacked by first considering a spatially uniform distribution of flaws that are all oriented in the same direction. Assuming an isotropic substrate, the macroscopic material properties of this simpler microstructure naturally should be transversely isotropic. An orthogonal basis for the linear subspace consisting of all double-symmetric transversely-isotropic fourth-order tensors associated with a given material vector is applied to deduce the explicit functional dependence of the material properties of these aligned materials on the shared symmetry axis. The aligned and uniform microstructure seems geometrically simple enough that the macroscopic transversely isotropic properties could be derived in closed form. Since the resulting properties are transversely isotropic, the analyst must therefore be able to identify the appropriate coefficients of the transverse basis. Once these functions are identified, a principle of superposition of strain rates may be applied to define an expectation integral for the composite properties of a material containing arbitrary anisotropic distributions of axisymmetric inhomogeneities. A proposal for coupling plastic anisotropy to the elastic anisotropy is presented in which the composite yield surface is interpreted as a distortion of the isotropic substrate yield surface; the distortion directions are coupled to the elastic anisotropy directions. Finally, some commonly assumed properties (such as major symmetry) of the Cauchy tangent stiffness tensor are shown to be inappropriate for large distortions of anisotropic materials.

  12. Some Statistical Properties of Tonality, 1650-1900

    ERIC Educational Resources Information Center

    White, Christopher Wm.

    2013-01-01

    This dissertation investigates the statistical properties present within corpora of common practice music, involving a data set of more than 8,000 works spanning from 1650 to 1900, and focusing specifically on the properties of the chord progressions contained therein. In the first chapter, methodologies concerning corpus analysis are presented…

  13. Occupational Skin Disease Prevention: An Educational Intervention for Hairdresser Cosmetology Students.

    PubMed

    Haughtigan, Kara; Main, Eve; Bragg-Underwood, Tonya; Watkins, Cecilia

    2017-11-01

    Cosmetologists frequently develop occupational skin disease related to workplace exposures. The purpose of this study was to evaluate an educational intervention to increase cosmetology students' occupational skin disease knowledge and use of preventive practices. A quasi-experimental design was used to evaluate students' knowledge, behaviors, intentions, expectancies, and expectations. A 20-minute verbal presentation and printed two-page educational handout were provided for participants. Statistically significant increases in knowledge, frequency of glove use, and frequency of moisturizer use were found, but the frequency of handwashing did not increase. In addition, the Behavioral Strategies subscale, the Intention subscale, and the Expectancies subscale showed statistically significant improvements. The results of this study suggest an educational intervention can increase cosmetology students' knowledge of occupational skin diseases and their use of preventive strategies.

  14. USBM (United States Bureau of Mines) borehole deformation gage absolute stress measurement test procedure: Final draft

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1986-12-01

    The technique described herein for determining the magnitudes and directions of the in situ principal stresses utilizes the stress relief in a small volume of rock when it is physically isolated from the surrounding rock mass. Measurements of deformation are related to stress magnitudes through an understanding of the constitutive behavior of the rock. The behaviors of the non-salt strata around the ESF are expected to conform approximately to those of uniform homogeneous linear-elastic materials having either isotropic or transverse isotropic properties, for which constitutive relations are developed. The constitutive behavior of the salt strata is not well understood, and so the overcoring technique yields information of only very limited use. For this reason the overcoring technique will not be used in the salt strata. The technique also has limited application in rocks containing joints spaced less than 8 in. (0.2 m) apart, unless a large number of tests can be performed to obtain a good statistical average. However, such unfavorably discontinuous rocks are not expected as a norm at the Deaf Smith County site. 7 refs., 22 figs., 4 tabs.

  15. Survey of Magnetosheath Plasma Properties at Saturn and Inference of Upstream Flow Conditions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thomsen, M. F.; Coates, A. J.; Jackman, C. M.

    A new Cassini magnetosheath data set is introduced that is based on a comprehensive survey of intervals in which the observed magnetosheath flow was encompassed within the plasma analyzer field of view and for which the computed numerical moments are therefore expected to be accurate. The data extend from 2004 day 299 to 2012 day 151 and comprise 19,155 416-s measurements. In addition to the plasma ion moments (density, temperature, and flow velocity), merged values of the plasma electron density and temperature, the energetic particle pressure, and the magnetic field vector are included in the data set. Statistical properties of various magnetosheath parameters, including dependence on local time, are presented. The magnetosheath field and flow are found to be only weakly aligned, primarily because of a relatively large z-component of the magnetic field, attributable to the field being pulled out of the equatorial orientation by flows at higher latitudes. A new procedure for using magnetosheath properties to estimate the upstream solar wind speed is proposed and used to determine that the amount of electron heating at Saturn's high Mach-number bow shock is ~4% of the dissipated flow energy. The data set is available as an electronic supplement to this paper.

  16. Survey of Magnetosheath Plasma Properties at Saturn and Inference of Upstream Flow Conditions

    DOE PAGES

    Thomsen, M. F.; Coates, A. J.; Jackman, C. M.; ...

    2018-03-01

    A new Cassini magnetosheath data set is introduced that is based on a comprehensive survey of intervals in which the observed magnetosheath flow was encompassed within the plasma analyzer field of view and for which the computed numerical moments are therefore expected to be accurate. The data extend from 2004 day 299 to 2012 day 151 and comprise 19,155 416-s measurements. In addition to the plasma ion moments (density, temperature, and flow velocity), merged values of the plasma electron density and temperature, the energetic particle pressure, and the magnetic field vector are included in the data set. Statistical properties of various magnetosheath parameters, including dependence on local time, are presented. The magnetosheath field and flow are found to be only weakly aligned, primarily because of a relatively large z-component of the magnetic field, attributable to the field being pulled out of the equatorial orientation by flows at higher latitudes. A new procedure for using magnetosheath properties to estimate the upstream solar wind speed is proposed and used to determine that the amount of electron heating at Saturn's high Mach-number bow shock is ~4% of the dissipated flow energy. The data set is available as an electronic supplement to this paper.

  17. Notes on quantitative structure-properties relationships (QSPR) (1): A discussion on a QSPR dimensionality paradox (QSPR DP) and its quantum resolution.

    PubMed

    Carbó-Dorca, Ramon; Gallegos, Ana; Sánchez, Angel J

    2009-05-01

    Classical quantitative structure-properties relationship (QSPR) statistical techniques unavoidably present an inherent paradoxical computational context. They rely on the definition of a Gram matrix in descriptor spaces, which is used afterwards to reduce the original dimension via several possible kinds of algebraic manipulations. From there, effective models for the computation of unknown properties of known molecular structures are obtained. However, the reduced descriptor dimension causes linear dependence within the set of discrete vector molecular representations, leading to positive semi-definite Gram matrices in molecular spaces. To resolve this QSPR dimensionality paradox (QSPR DP), it is proposed here to adopt as a starting point the quantum QSPR (QQSPR) computational framework, where density functions act as infinite-dimensional descriptors. The fundamental QQSPR equation, deduced from employing quantum expectation value numerical evaluation, can be approximately solved in order to obtain models exempt from the QSPR DP. Substituting the quantum similarity matrix by an empirical Gram matrix in molecular spaces, built from the original, unmanipulated discrete molecular descriptor vectors, permits obtaining classical QSPR models with the same characteristics as in QQSPR, that is, models possessing a certain degree of causality and explicitly independent of the descriptor dimension. 2008 Wiley Periodicals, Inc.
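
    The Gram-matrix construction described above can be sketched numerically as follows (Python, with invented descriptor data; a sketch, not the authors' method in full). The point is that the model is solved entirely in molecular space, so the descriptor dimension never enters the model equations directly.

        import numpy as np

        rng = np.random.default_rng(1)
        D = rng.normal(size=(8, 50))    # 8 molecules x 50 descriptors (invented)
        p = rng.normal(size=8)          # known property values (invented)

        Z = D @ D.T                     # empirical Gram matrix in molecular space
        w = np.linalg.lstsq(Z, p, rcond=None)[0]    # least-squares model weights

        # property estimate for a new molecule: only inner products enter,
        # so the model is explicitly independent of the descriptor dimension
        d_new = rng.normal(size=50)
        p_hat = (D @ d_new) @ w
        print(p_hat)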

  18. Comparison of Virginia's College and Career Ready Mathematics Performance Expectations with the Common Core State Standards for Mathematics

    ERIC Educational Resources Information Center

    Virginia Department of Education, 2010

    2010-01-01

    This paper presents a comparison of Virginia's mathematics performance expectations with the common core state standards for mathematics. The comparison focuses on number and quantity, algebra, functions, geometry, and statistics and probability. (Contains 1 footnote.)

  19. Perceptual basis of evolving Western musical styles

    PubMed Central

    Rodriguez Zivic, Pablo H.; Shifres, Favio; Cecchi, Guillermo A.

    2013-01-01

    The brain processes temporal statistics to predict future events and to categorize perceptual objects. These statistics, called expectancies, are found in music perception, and they span a variety of different features and time scales. Specifically, there is evidence that music perception involves strong expectancies regarding the distribution of a melodic interval, namely, the distance between two consecutive notes within the context of another. The recent availability of a large Western music dataset, consisting of the historical record condensed as melodic interval counts, has opened new possibilities for data-driven analysis of musical perception. In this context, we present an analytical approach that, based on cognitive theories of music expectation and machine learning techniques, recovers a set of factors that accurately identifies historical trends and stylistic transitions between the Baroque, Classical, Romantic, and Post-Romantic periods. We also offer a plausible musicological and cognitive interpretation of these factors, allowing us to propose them as data-driven principles of melodic expectation. PMID:23716669

  20. A space-time scan statistic for detecting emerging outbreaks.

    PubMed

    Tango, Toshiro; Takahashi, Kunihiko; Kohriyama, Kazuaki

    2011-03-01

    As a major analytical method for outbreak detection, Kulldorff's space-time scan statistic (2001, Journal of the Royal Statistical Society, Series A 164, 61-72) has been implemented in many syndromic surveillance systems. Since, however, it is based on circular windows in space, it has difficulty correctly detecting actual noncircular clusters. Takahashi et al. (2008, International Journal of Health Geographics 7, 14) proposed a flexible space-time scan statistic with the capability of detecting noncircular areas. It seems to us, however, that the detection of the most likely cluster defined in these space-time scan statistics is not the same as the detection of localized emerging disease outbreaks, because the former compares the observed number of cases with the conditional expected number of cases. In this article, we propose a new space-time scan statistic which compares the observed number of cases with the unconditional expected number of cases, takes a time-to-time variation of the Poisson mean into account, and implements an outbreak model to capture localized emerging disease outbreaks more promptly and accurately. The proposed models are illustrated with data from weekly surveillance of the number of absentees in primary schools in Kitakyushu-shi, Japan, 2006. © 2010, The International Biometric Society.
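
    As a toy illustration of the basic ingredient being modified (a sketch only; the window, baseline counts, and Poisson mean are invented, and the authors' statistic involves maximization over many candidate windows), a Poisson log-likelihood ratio for one candidate space-time window compares the observed count with the expected count:

        import numpy as np
        from scipy import stats

        def poisson_llr(observed, expected):
            """Poisson log-likelihood ratio for an excess of cases in a window,
            comparing the observed count with the (unconditional) expected count."""
            o, e = float(observed), float(expected)
            if o <= e:
                return 0.0
            return o * np.log(o / e) - (o - e)

        # one candidate window: 3 areas x 2 weeks of case counts (invented)
        obs = np.array([[4, 7], [2, 5], [3, 6]]).sum()
        exp = 1.8 * 6       # assumed baseline mean per area-week, times 6 cells
        print(poisson_llr(obs, exp), stats.poisson.sf(obs - 1, exp))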

  1. Psychometric Properties of Adaptation of the Social Efficacy and Outcome Expectations Scale to Turkish

    ERIC Educational Resources Information Center

    Bakioglu, Fuad; Turkum, Ayse Sibel

    2017-01-01

    The aim of this study was to examine the psychometric properties of the Turkish adaptation of the Social Efficacy and Outcome Expectations Scale (SEOES). The sample comprised two groups of university students (ns = 440 and 359). The validity of the scale was assessed using exploratory factor analysis, confirmatory factor analysis and concurrent validity, and…

  2. On the Q-dependence of the lowest-order QED corrections and other properties of the ground 11S-states in the two-electron ions

    NASA Astrophysics Data System (ADS)

    Frolov, Alexei M.

    2015-10-01

    Formulas and expectation values which are needed to determine the lowest-order QED corrections (∼α³) and corresponding recoil (or finite-mass) corrections in the two-electron helium-like ions are presented. Other important properties of the two-electron ions are also determined to high accuracy, including the expectation values of the quasi-singular Vinti operator and the ⟨r_eN⁻²⟩ and ⟨r_ee⁻²⟩ expectation values. Elastic scattering of fast electrons by the two-electron ions in the Born approximation is considered. Interpolation formulas are derived for the bound state properties of the two-electron ions as functions of the nuclear electric charge Q.

  3. Statistical properties of filtered pseudorandom digital sequences formed from the sum of maximum-length sequences

    NASA Technical Reports Server (NTRS)

    Wallace, G. R.; Weathers, G. D.; Graf, E. R.

    1973-01-01

    The statistics of filtered pseudorandom digital sequences called hybrid-sum sequences, formed from the modulo-two sum of several maximum-length sequences, are analyzed. The results indicate that a relation exists between the statistics of the filtered sequence and the characteristic polynomials of the component maximum length sequences. An analysis procedure is developed for identifying a large group of sequences with good statistical properties for applications requiring the generation of analog pseudorandom noise. By use of the analysis approach, the filtering process is approximated by the convolution of the sequence with a sum of unit step functions. A parameter reflecting the overall statistical properties of filtered pseudorandom sequences is derived. This parameter is called the statistical quality factor. A computer algorithm to calculate the statistical quality factor for the filtered sequences is presented, and the results for two examples of sequence combinations are included. The analysis reveals that the statistics of the signals generated with the hybrid-sum generator are potentially superior to the statistics of signals generated with maximum-length generators. Furthermore, fewer calculations are required to evaluate the statistics of a large group of hybrid-sum generators than are required to evaluate the statistics of the same size group of approximately equivalent maximum-length sequences.
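
    To make the construction concrete, the following Python sketch (not from the paper) forms a hybrid-sum sequence as the modulo-two sum of two maximum-length sequences generated by Fibonacci LFSRs; the degree-7 feedback polynomials used here are standard primitive choices, and the analog filtering step is omitted.

        import numpy as np

        def lfsr(taps, state, n):
            """n output bits of a maximum-length sequence from a Fibonacci LFSR.

            taps:  tap positions (1-indexed) of a primitive feedback polynomial
            state: nonzero initial register contents
            """
            state = list(state)
            out = []
            for _ in range(n):
                out.append(state[-1])                  # output the last stage
                fb = 0
                for t in taps:
                    fb ^= state[t - 1]                 # modulo-two feedback sum
                state = [fb] + state[:-1]              # shift the register
            return np.array(out)

        n = 127                                        # period of a degree-7 m-sequence
        s1 = lfsr([7, 6], [1, 0, 0, 0, 0, 0, 0], n)    # x^7 + x^6 + 1
        s2 = lfsr([7, 4], [1, 0, 0, 0, 0, 0, 0], n)    # x^7 + x^4 + 1
        hybrid = s1 ^ s2                               # modulo-two (hybrid) sum
        levels = 1 - 2 * hybrid                        # map bits {0,1} to {+1,-1}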

  4. Bayesian statistics and Monte Carlo methods

    NASA Astrophysics Data System (ADS)

    Koch, K. R.

    2018-03-01

    The Bayesian approach allows an intuitive way to derive the methods of statistics. Probability is defined as a measure of the plausibility of statements or propositions. Three rules are sufficient to obtain the laws of probability. If the statements refer to the numerical values of variables, the so-called random variables, univariate and multivariate distributions follow. They lead to the point estimation by which unknown quantities, i.e. unknown parameters, are computed from measurements. The unknown parameters are random variables; in traditional statistics, which is not founded on Bayes' theorem, they are fixed quantities. Bayesian statistics therefore recommends itself for Monte Carlo methods, which generate random variates from given distributions. Monte Carlo methods, of course, can also be applied in traditional statistics. The unknown parameters are introduced as functions of the measurements, and the Monte Carlo methods give the covariance matrix and the expectation of these functions. A confidence region is derived where the unknown parameters are situated with a given probability. Following a method of traditional statistics, hypotheses are tested by determining whether a value for an unknown parameter lies inside or outside the confidence region. The error propagation of a random vector by the Monte Carlo methods is presented as an application. If the random vector results from a nonlinearly transformed vector, its covariance matrix and its expectation follow from the Monte Carlo estimate. This saves computing a considerable number of derivatives, and errors of the linearization are avoided. The Monte Carlo method is therefore efficient. If the functions of the measurements are given by a sum of two or more random vectors with different multivariate distributions, the resulting distribution is generally not known. The Monte Carlo methods are then needed to obtain the covariance matrix and the expectation of the sum.
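
    The error-propagation application can be sketched in a few lines of Python (the distribution and the nonlinear transformation are invented for illustration): sample the measurement vector, push the samples through the transformation, and read off the expectation and covariance with no linearization step.

        import numpy as np

        rng = np.random.default_rng(2)

        # measurement vector with assumed mean and covariance
        mu = np.array([1.0, 2.0])
        cov = np.array([[0.04, 0.01],
                        [0.01, 0.09]])
        x = rng.multivariate_normal(mu, cov, size=100_000)

        # nonlinear transformation of the measurements (illustrative choice)
        y = np.column_stack([np.hypot(x[:, 0], x[:, 1]),
                             np.arctan2(x[:, 1], x[:, 0])])

        print("expectation:", y.mean(axis=0))            # E[f(x)] by Monte Carlo
        print("covariance:", np.cov(y, rowvar=False))    # no derivatives needed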

  5. Variety and volatility in financial markets

    NASA Astrophysics Data System (ADS)

    Lillo, Fabrizio; Mantegna, Rosario N.

    2000-11-01

    We study the price dynamics of stocks traded in a financial market by considering the statistical properties of both a single time series and an ensemble of stocks traded simultaneously. We use the n stocks traded on the New York Stock Exchange to form a statistical ensemble of daily stock returns. For each trading day of our database, we study the ensemble return distribution. We find that a typical ensemble return distribution exists in most of the trading days, with the exception of crash and rally days and of the days following these extreme events. We analyze each ensemble return distribution by extracting its first two central moments. We observe that these moments fluctuate in time and are themselves stochastic processes. We characterize the statistical properties of ensemble return distribution central moments by investigating their probability density functions and temporal correlation properties. In general, time-averaged and portfolio-averaged price returns have different statistical properties. We infer from these differences information about the relative strength of correlation between stocks and between different trading days. Lastly, we compare our empirical results with those predicted by the single-index model and we conclude that this simple model cannot explain the statistical properties of the second moment of the ensemble return distribution.
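
    A compact sketch of the ensemble construction (Python, with synthetic returns standing in for the NYSE data): each trading day contributes one cross-sectional distribution, whose first two central moments are then studied as time series.

        import numpy as np

        rng = np.random.default_rng(3)
        # synthetic daily returns: rows = trading days, columns = stocks
        returns = rng.standard_t(df=4, size=(250, 500)) * 0.02

        ensemble_mean = returns.mean(axis=1)         # first central moment per day
        ensemble_std = returns.std(axis=1, ddof=1)   # second central moment per day

        # time-averaged vs portfolio-averaged statistics differ in general
        time_avg_std = returns.std(axis=0, ddof=1).mean()   # mean single-stock volatility
        portfolio_avg_std = ensemble_std.mean()             # mean cross-sectional width
        print(time_avg_std, portfolio_avg_std)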

  6. Reversed inverse regression for the univariate linear calibration and its statistical properties derived using a new methodology

    NASA Astrophysics Data System (ADS)

    Kang, Pilsang; Koo, Changhoi; Roh, Hokyu

    2017-11-01

    Since simple linear regression theory was established at the beginning of the 1900s, it has been used in a variety of fields. Unfortunately, it cannot be used directly for calibration. In practical calibrations, the observed measurements (the inputs) are subject to errors, and hence they vary, thus violating the assumption that the inputs are fixed. Therefore, in the case of calibration, the regression line fitted using the method of least squares is not consistent with the statistical properties of simple linear regression as already established based on this assumption. To resolve this problem, "classical regression" and "inverse regression" have been proposed. However, they do not completely resolve the problem. As a fundamental solution, we introduce "reversed inverse regression" along with a new methodology for deriving its statistical properties. In this study, the statistical properties of this regression are derived using the "error propagation rule" and the "method of simultaneous error equations" and are compared with those of the existing regression approaches. The accuracy of the statistical properties thus derived is investigated in a simulation study. We conclude that the newly proposed regression and methodology constitute the complete regression approach for univariate linear calibrations.
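
    For context, the two standard calibration approaches that the paper contrasts (not its new reversed inverse regression, whose estimator is given in the paper itself) can be sketched in Python with invented calibration data:

        import numpy as np

        rng = np.random.default_rng(5)
        x_true = np.linspace(0.0, 10.0, 30)          # reference standards (invented)
        y = 2.0 * x_true + 1.0 + rng.normal(0.0, 0.3, x_true.size)   # readings

        # classical calibration: fit y = a + b*x on the standards, then invert
        # the fitted line for a new instrument reading y0
        b, a = np.polyfit(x_true, y, 1)
        y0 = 11.0
        x_classical = (y0 - a) / b

        # inverse calibration: regress x directly on y and predict
        d, c = np.polyfit(y, x_true, 1)
        x_inverse = c + d * y0
        print(x_classical, x_inverse)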

  7. Statistical properties of the radiation from SASE FEL operating in the linear regime

    NASA Astrophysics Data System (ADS)

    Saldin, E. L.; Schneidmiller, E. A.; Yurkov, M. V.

    1998-02-01

    The paper presents a comprehensive analysis of the statistical properties of the radiation from a self-amplified spontaneous emission (SASE) free electron laser operating in the linear regime. The investigation has been performed in a one-dimensional approximation, assuming the electron pulse length to be much larger than the coherence length of the radiation. The following statistical properties of the SASE FEL radiation have been studied: field correlations, the distribution of the radiation energy after a monochromator installed at the FEL amplifier exit, and the photoelectric counting statistics of the SASE FEL radiation. It is shown that the radiation from a SASE FEL operating in the linear regime possesses all the features of completely chaotic polarized radiation.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kraljic, David; Sarkar, Subir, E-mail: David.Kraljic@physics.ox.ac.uk, E-mail: Subir.Sarkar@physics.ox.ac.uk

    It has been observed [1,2] that the locally measured Hubble parameter converges quickest to the background value and the dipole structure of the velocity field is smallest in the reference frame of the Local Group of galaxies. We study the statistical properties of Lorentz boosts with respect to the Cosmic Microwave Background frame which make the Hubble flow look most uniform around a particular observer. We use a very large N-body simulation to extract the dependence of the boost velocities on the local environment, such as underdensities, overdensities, and bulk flows. We find that the observation [1,2] is not unexpected if we are located in an underdensity, which is indeed the case for our position in the universe. The amplitude of the measured boost velocity for our location is consistent with the expectation in the standard cosmology.

  9. An experimental investigation of the force network ensemble

    NASA Astrophysics Data System (ADS)

    Kollmer, Jonathan E.; Daniels, Karen E.

    2017-06-01

    We present an experiment in which a horizontal quasi-2D granular system with a fixed neighbor network is cyclically compressed and decompressed over 1000 cycles. We remove basal friction by floating the particles on a thin air cushion, so that particles only interact in-plane. As expected for a granular system, the applied load is not distributed uniformly, but is instead concentrated in force chains which form a network throughout the system. To visualize the structure of these networks, we use particles made from photoelastic material. The experimental setup and a new data-processing pipeline allow us to map out the evolution of these networks under the cyclic compressions. We characterize several statistical properties of the packing, including the probability density function of the contact force, and compare them with theoretical and numerical predictions from the force network ensemble theory.

  10. The evolution of void-filled cosmological structures

    NASA Technical Reports Server (NTRS)

    Regos, Eniko; Geller, Margaret J.

    1991-01-01

    1D, 2D, and 3D simulations are used here to investigate the salient features in the evolution of void-filled cosmological structures in universes with arbitrary values of Omega. It is found that the growth of a void as a function of time decreases significantly at the time corresponding to Omega = 0.5. In models constructed in 2D and 3D, suitable initial conditions lead to cellular structure with faceted voids similar to those observed in redshift surveys. Matter compressed to planes flows more rapidly toward condensations at the intersections than would be expected for spherical infall. The peculiar streaming velocities for void diameters of 5000 km/s should be observable. The simulations provide a more physical basis and dynamics for the bubbly and Voronoi tessellation models used to derive statistical properties of cellular large-scale structure.

  11. Force-free electrodynamics in dynamical curved spacetimes

    NASA Astrophysics Data System (ADS)

    McWilliams, Sean

    2015-04-01

    We present results from our study of force-free electrodynamics in curved spacetimes. Specifically, we present several improvements to what has become the established set of evolution equations, and we apply these to study the nonlinear stability of analytically known force-free solutions for the first time. We implement our method in a new pseudo-spectral code built on top of the SpEC code for evolving dynamic spacetimes. We also revisit these known solutions and attempt to clarify some interesting properties that render them analytically tractable. Finally, we preview some new work that similarly revisits the established approach to solving another problem in numerical relativity: the post-merger recoil from asymmetric gravitational-wave emission. These new results may have significant implications for the parameter dependence of recoils, and consequently for the statistical expectations for recoil velocities of merged systems.

  12. Implementation of Statistics Textbook Support with ICT and Portfolio Assessment Approach to Improve Students Teacher Mathematical Connection Skills

    NASA Astrophysics Data System (ADS)

    Hendikawati, P.; Dewi, N. R.

    2017-04-01

    Statistics is needed for data analysis and is applied comprehensively in daily life, so students must master the statistical material well. The use of Statistics textbook support with an ICT and portfolio assessment approach was expected to help the students improve their mathematical connection skills. The subjects of this research were 30 student teachers taking Statistics courses. The results show that the use of Statistics textbook support with an ICT and portfolio assessment approach can improve student teachers' mathematical connection skills.

  13. SU-D-9A-02: Relative Effects of Threshold Choice and Spatial Resolution Modeling On SUV and Volume Quantification in F18-FDG PET Imaging of Anal Cancer Patients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, F; Shandong Cancer Hospital and Institute, Jinan, Shandong; Bowsher, J

    2014-06-01

    Purpose: PET imaging with F18-FDG is utilized for treatment planning, treatment assessment, and prognosis. A region of interest (ROI) encompassing the tumor may be determined on the PET image, often by a threshold T on the PET standard uptake values (SUVs). Several studies have shown prognostic value for relevant ROI properties including maximum SUV value (SUVmax), metabolic tumor volume (MTV), and total glycolytic activity (TGA). The choice of threshold T may affect mean SUV value (SUVmean), MTV, and TGA. Recently spatial resolution modeling (SRM) has been introduced on many PET systems. SRM may also affect these ROI properties. The purpose of this work is to investigate the relative influence of SRM and threshold choice T on SUVmean, MTV, TGA, and SUVmax. Methods: For 9 anal cancer patients, 18F-FDG PET scans were performed prior to treatment. PET images were reconstructed by 2 iterations of Ordered Subsets Expectation Maximization (OSEM), with and without SRM. ROI contours were generated by 5 different SUV threshold values T: 2.5, 3.0, 30%, 40%, and 50% of SUVmax. Paired-samples t tests were used to compare SUVmean, MTV, and TGA (a) for SRM on versus off and (b) between each pair of threshold values T. SUVmax was also compared for SRM on versus off. Results: For almost all (57/60) comparisons of 2 different threshold values, SUVmean, MTV, and TGA showed statistically significant variation. For comparison of SRM on versus off, there were no statistically significant changes in SUVmax and TGA, but there were statistically significant changes in MTV for T=2.5 and T=3.0 and in SUVmean for all T. Conclusion: The near-universal statistical significance of threshold choice T suggests that, regarding harmonization across sites, threshold choice may be a greater concern than choice of SRM. However, broader study is warranted, e.g. other iterations of OSEM should be considered.
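
    A minimal sketch of the ROI quantities involved (Python; the threshold handling and the voxel volume are assumptions, not details taken from the study):

        import numpy as np

        def roi_metrics(suv, T, relative=False, voxel_ml=0.064):
            """SUVmax, SUVmean, MTV, and TGA for a thresholded ROI.

            suv:      array of standard uptake values around one lesion
            T:        absolute SUV cutoff, or fraction of SUVmax if relative=True
            voxel_ml: voxel volume in milliliters (scanner-dependent assumption)
            """
            cutoff = T * suv.max() if relative else T
            mask = suv >= cutoff
            suv_mean = suv[mask].mean()
            mtv = voxel_ml * mask.sum()                  # metabolic tumor volume
            return {"SUVmax": suv.max(), "SUVmean": suv_mean,
                    "MTV": mtv, "TGA": mtv * suv_mean}   # total glycolytic activity

        # e.g. roi_metrics(img, 2.5) or roi_metrics(img, 0.40, relative=True)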

  14. ISCED Handbook: United Kingdom (England and Wales).

    ERIC Educational Resources Information Center

    United Nations Educational, Scientific, and Cultural Organization, Paris (France). Div. of Statistics on Education.

    The International Standard Classification of Education (ISCED) has been designed as an instrument suitable for assembling, compiling, and presenting statistics of education both within individual countries and internationally. It is expected to facilitate international compilation and comparison of education statistics as such, and also their use…

  15. Pearson-type goodness-of-fit test with bootstrap maximum likelihood estimation.

    PubMed

    Yin, Guosheng; Ma, Yanyuan

    2013-01-01

    The Pearson test statistic is constructed by partitioning the data into bins and computing the difference between the observed and expected counts in these bins. If the maximum likelihood estimator (MLE) of the original data is used, the statistic generally does not follow a chi-squared distribution or any explicit distribution. We propose a bootstrap-based modification of the Pearson test statistic to recover the chi-squared distribution. We compute the observed and expected counts in the partitioned bins by using the MLE obtained from a bootstrap sample. This bootstrap-sample MLE adds exactly the right amount of randomness to the test statistic, and recovers the chi-squared distribution. The bootstrap chi-squared test is easy to implement, as it only requires fitting exactly the same model to the bootstrap data to obtain the corresponding MLE, and then constructing the bin counts based on the original data. We examine the test size and power of the new model diagnostic procedure using simulation studies and illustrate it with a real data set.
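
    A sketch of the procedure for a normal model (Python; the bin count and the chi-squared degrees of freedom used here are illustrative assumptions, and the paper should be consulted for the exact reference distribution):

        import numpy as np
        from scipy import stats

        def bootstrap_pearson(data, n_bins=10, seed=0):
            """Pearson statistic with the MLE taken from a bootstrap sample.

            The model (here a normal distribution) is fitted to a bootstrap
            resample; the observed and expected bin counts are then formed
            from the ORIGINAL data using that bootstrap-sample MLE.
            """
            rng = np.random.default_rng(seed)
            data = np.asarray(data)
            n = data.size
            boot = rng.choice(data, size=n, replace=True)
            mu, sigma = boot.mean(), boot.std()          # bootstrap-sample MLE
            # bins chosen equiprobable under the fitted model
            cuts = stats.norm.ppf(np.linspace(0, 1, n_bins + 1)[1:-1], mu, sigma)
            observed = np.bincount(np.searchsorted(cuts, data), minlength=n_bins)
            expected = np.full(n_bins, n / n_bins)
            chi2 = ((observed - expected) ** 2 / expected).sum()
            # df = n_bins - 1 assumed here; see the paper for the exact choice
            return chi2, stats.chi2.sf(chi2, df=n_bins - 1)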

  16. An Efficient Statistical Method to Compute Molecular Collisional Rate Coefficients

    NASA Astrophysics Data System (ADS)

    Loreau, Jérôme; Lique, François; Faure, Alexandre

    2018-01-01

    Our knowledge about the “cold” universe often relies on molecular spectra. A general property of such spectra is that the energy level populations are rarely at local thermodynamic equilibrium. Solving the radiative transfer thus requires the availability of collisional rate coefficients with the main colliding partners over the temperature range ∼10–1000 K. These rate coefficients are notoriously difficult to measure and expensive to compute. In particular, very few reliable collisional data exist for inelastic collisions involving reactive radicals or ions. In this Letter, we explore the use of a fast quantum statistical method to determine molecular collisional excitation rate coefficients. The method is benchmarked against accurate (but costly) rigid-rotor close-coupling calculations. For collisions proceeding through the formation of a strongly bound complex, the method is found to be highly satisfactory up to room temperature. Its accuracy decreases with decreasing potential well depth and with increasing temperature, as expected. This new method opens the way to the determination of accurate inelastic collisional data involving key reactive species such as H3+, H2O+, and H3O+, for which exact quantum calculations are currently not feasible.

  17. Quantum state reconstruction and photon number statistics for low dimensional semiconductor opto-electronic devices

    NASA Astrophysics Data System (ADS)

    Böhm, Fabian; Grosse, Nicolai B.; Kolarczik, Mirco; Herzog, Bastian; Achtstein, Alexander; Owschimikow, Nina; Woggon, Ulrike

    2017-09-01

    Quantum state tomography and the reconstruction of the photon number distribution are techniques to extract the properties of a light field from measurements of its mean and fluctuations. These techniques are particularly useful when dealing with macroscopic or mesoscopic systems, where a description limited to the second order autocorrelation soon becomes inadequate. In particular, the emission of nonclassical light is expected from mesoscopic quantum dot systems strongly coupled to a cavity or in systems with large optical nonlinearities. We analyze the emission of a quantum dot-semiconductor optical amplifier system by quantifying the modifications of a femtosecond laser pulse propagating through the device. Using a balanced detection scheme in a self-heterodyning setup, we achieve precise measurements of the quadrature components and their fluctuations at the quantum noise limit. We resolve the photon number distribution and the thermal-to-coherent evolution in the photon statistics of the emission. The interferometric detection achieves a high sensitivity in the few-photon limit. From our data, we can also reconstruct the second order autocorrelation function with higher precision and time resolution compared with classical Hanbury Brown-Twiss experiments.

  18. Men, Women, and Life Annuities

    ERIC Educational Resources Information Center

    King, Francis P.

    1976-01-01

    A senior research officer of the Teachers Insurance and Annuity Association (TIAA) and College Retirement Equities Fund (CREF) discusses the issue of different life annuity benefits for men and women, concluding that age and sex are two objective and statistically reliable factors used in determining life expectancy and thus the expected duration of…

  19. Willingness to Pay for Environmental Health Risk Reductions When There are Varying Degrees of Life Expectancy: A White Paper (2006)

    EPA Pesticide Factsheets

    Existing value of statistical life (VSL) estimates used in benefit-cost analysis relate to relatively small changes in life expectancy, raising the question of how to value risk reductions when the degree of life expectancy at stake varies. The authors' strategy for addressing this question is to briefly survey the existing economics literature.

  20. Customer satisfaction assessment at the Pacific Northwest National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    DN Anderson; ML Sours

    2000-03-23

    The Pacific Northwest National Laboratory (PNNL) is developing and implementing a customer satisfaction assessment program (CSAP) to assess the quality of research and development provided by the laboratory. This report presents the customer survey component of the PNNL CSAP. The customer survey questionnaire is composed of two major sections: Strategic Value and Project Performance. Both sections contain a set of questions that can be answered with a 5-point Likert scale response. The Strategic Value section consists of five questions that are designed to determine if a project directly contributes to critical future national needs. The Project Performance section consists of nine questions designed to determine PNNL performance in meeting customer expectations. A statistical model for customer survey data is developed and this report discusses how to analyze the data with this model. The properties of the statistical model can be used to establish a gold standard or performance expectation for the laboratory, and then to assess progress. The gold standard is defined using laboratory management input, namely answers to four questions posed in terms of the information obtained from the customer survey: (1) What should the average Strategic Value be for the laboratory project portfolio? (2) What Strategic Value interval should include most of the projects in the laboratory portfolio? (3) What should average Project Performance be for projects with a Strategic Value of about 2? (4) What should average Project Performance be for projects with a Strategic Value of about 4? To be able to provide meaningful answers to these questions, the PNNL customer survey will need to be fully implemented for several years, thus providing a link between management perceptions of laboratory performance and customer survey data.

  1. Customer Satisfaction Assessment at the Pacific Northwest National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Dale N.; Sours, Mardell L.

    2000-03-20

    The Pacific Northwest National Laboratory (PNNL) is developing and implementing a customer satisfaction assessment program (CSAP) to assess the quality of research and development provided by the laboratory. We present the customer survey component of the PNNL CSAP. The customer survey questionnaire is composed of 2 major sections, Strategic Value and Project Performance. The Strategic Value section of the questionnaire consists of 5 questions that can be answered with a 5-point Likert scale response. These questions are designed to determine if a project is directly contributing to critical future national needs. The Project Performance section of the questionnaire consists of 9 questions that can be answered with a 5-point Likert scale response. These questions determine PNNL performance in meeting customer expectations. Many approaches could be used to analyze customer survey data. We present a statistical model that can accurately capture the random behavior of customer survey data. The properties of this statistical model can be used to establish a "gold standard" or performance expectation for the laboratory, and then assess progress. The gold standard is defined from laboratory management input; answers to 4 simple questions, posed in terms of the information obtained from the CSAP customer survey, define the standard: (1) What should the average Strategic Value be for the laboratory project portfolio? (2) What Strategic Value interval should include most of the projects in the laboratory portfolio? (3) What should average Project Performance be for projects with a Strategic Value of about 2? (4) What should average Project Performance be for projects with a Strategic Value of about 4? We discuss how to analyze CSAP customer survey data with this model, including "lessons learned" and issues that can invalidate this type of assessment.

  2. Fixations on objects in natural scenes: dissociating importance from salience

    PubMed Central

    't Hart, Bernard M.; Schmidt, Hannah C. E. F.; Roth, Christine; Einhäuser, Wolfgang

    2013-01-01

    The relation of selective attention to understanding of natural scenes has been subject to intense behavioral research and computational modeling, and gaze is often used as a proxy for such attention. The probability of an image region to be fixated typically correlates with its contrast. However, this relation does not imply a causal role of contrast. Rather, contrast may relate to an object's “importance” for a scene, which in turn drives attention. Here we operationalize importance by the probability that an observer names the object as characteristic for a scene. We modify luminance contrast of either a frequently named (“common”/“important”) or a rarely named (“rare”/“unimportant”) object, track the observers' eye movements during scene viewing and ask them to provide keywords describing the scene immediately after. When no object is modified relative to the background, important objects draw more fixations than unimportant ones. Increases of contrast make an object more likely to be fixated, irrespective of whether it was important for the original scene, while decreases in contrast have little effect on fixations. Any contrast modification makes originally unimportant objects more important for the scene. Finally, important objects are fixated more centrally than unimportant objects, irrespective of contrast. Our data suggest a dissociation between object importance (relevance for the scene) and salience (relevance for attention). If an object obeys natural scene statistics, important objects are also salient. However, when natural scene statistics are violated, importance and salience are differentially affected. Object salience is modulated by the expectation about object properties (e.g., formed by context or gist), and importance by the violation of such expectations. In addition, the dependence of fixated locations within an object on the object's importance suggests an analogy to the effects of word frequency on landing positions in reading. PMID:23882251

  3. Study Designs and Statistical Analyses for Biomarker Research

    PubMed Central

    Gosho, Masahiko; Nagashima, Kengo; Sato, Yasunori

    2012-01-01

    Biomarkers are becoming increasingly important for streamlining drug discovery and development. In addition, biomarkers are widely expected to be used as a tool for disease diagnosis, personalized medication, and surrogate endpoints in clinical research. In this paper, we highlight several important aspects related to study design and statistical analysis for clinical research incorporating biomarkers. We describe the typical and current study designs for exploring, detecting, and utilizing biomarkers. Furthermore, we introduce statistical issues such as confounding and multiplicity for statistical tests in biomarker research. PMID:23012528

  4. Inference of Markovian properties of molecular sequences from NGS data and applications to comparative genomics.

    PubMed

    Ren, Jie; Song, Kai; Deng, Minghua; Reinert, Gesine; Cannon, Charles H; Sun, Fengzhu

    2016-04-01

    Next-generation sequencing (NGS) technologies generate large amounts of short read data for many different organisms. The fact that NGS reads are generally short makes it challenging to assemble the reads and reconstruct the original genome sequence. For clustering genomes using such NGS data, word-count based alignment-free sequence comparison is a promising approach, but for this approach the underlying expected word counts are essential. A plausible model for this underlying distribution of word counts is given through modeling the DNA sequence as a Markov chain (MC). For single long sequences, efficient statistics are available to estimate the order of MCs and the transition probability matrix for the sequences. As NGS data do not provide a single long sequence, inference methods on Markovian properties of sequences based on single long sequences cannot be directly used for NGS short read data. Here we derive a normal approximation for such word counts. We also show that the traditional Chi-square statistic has an approximate gamma distribution, using the Lander-Waterman model for physical mapping. We propose several methods to estimate the order of the MC based on NGS reads and evaluate those using simulations. We illustrate the applications of our results by clustering genomic sequences of several vertebrate and tree species based on NGS reads using alignment-free sequence dissimilarity measures. We find that the estimated order of the MC has a considerable effect on the clustering results, and that the clustering results that use an MC of the estimated order give a plausible clustering of the species. Our implementation of the statistics developed here is available as the R package 'NGS.MC' at http://www-rcf.usc.edu/~fsun/Programs/NGS-MC/NGS-MC.html. © The Author 2015. Published by Oxford University Press. All rights reserved.
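
    A bare-bones sketch of the counting step behind such estimates (Python; read handling is simplified, edge effects at read boundaries are ignored, and the paper's order-selection statistics are omitted):

        from itertools import product

        def kmer_counts(reads, k):
            """Aggregate k-mer counts over a collection of short reads."""
            counts = {''.join(p): 0 for p in product('ACGT', repeat=k)}
            for read in reads:
                for i in range(len(read) - k + 1):
                    kmer = read[i:i + k]
                    if kmer in counts:       # skips k-mers with ambiguous bases
                        counts[kmer] += 1
            return counts

        def transition_mle(reads, order):
            """MLE of order-r transition probabilities from reads (order >= 1)."""
            num = kmer_counts(reads, order + 1)
            den = kmer_counts(reads, order)
            return {w: (num[w] / den[w[:-1]] if den[w[:-1]] else 0.0)
                    for w in num}

        reads = ["ACGTACGGT", "CGTACGTT", "GGTACGTA"]   # invented toy reads
        print(transition_mle(reads, 1)["AC"])           # estimate of P(C | A)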

  5. Characterization of protein folding by a Φ-value calculation with a statistical-mechanical model.

    PubMed

    Wako, Hiroshi; Abe, Haruo

    2016-01-01

    The Φ-value analysis approach provides information about transition-state structures along the folding pathway of a protein by measuring the effects of an amino acid mutation on folding kinetics. Here we compared the theoretically calculated Φ values of 27 proteins with their experimentally observed Φ values; the theoretical values were calculated using a simple statistical-mechanical model of protein folding. The theoretically calculated Φ values reflected the corresponding experimentally observed Φ values with reasonable accuracy for many of the proteins, but not for all. The correlation between the theoretically calculated and experimentally observed Φ values strongly depends on whether the protein-folding mechanism assumed in the model holds true in real proteins. In other words, the correlation coefficient can be expected to illuminate the folding mechanisms of proteins, providing the answer to the question of which model more accurately describes protein folding: the framework model or the nucleation-condensation model. In addition, we tried to characterize protein folding with respect to various properties of each protein apart from the size and fold class, such as the free-energy profile, contact-order profile, and sensitivity to the parameters used in the Φ-value calculation. The results showed that any one of these properties alone was not enough to explain protein folding, although each one played a significant role in it. We have confirmed the importance of characterizing protein folding from various perspectives. Our findings have also highlighted that protein folding is highly variable and unique across different proteins, and this should be considered while pursuing a unified theory of protein folding.

  6. Characterization of protein folding by a Φ-value calculation with a statistical-mechanical model

    PubMed Central

    Wako, Hiroshi; Abe, Haruo

    2016-01-01

    The Φ-value analysis approach provides information about transition-state structures along the folding pathway of a protein by measuring the effects of an amino acid mutation on folding kinetics. Here we compared the theoretically calculated Φ values of 27 proteins with their experimentally observed Φ values; the theoretical values were calculated using a simple statistical-mechanical model of protein folding. The theoretically calculated Φ values reflected the corresponding experimentally observed Φ values with reasonable accuracy for many of the proteins, but not for all. The correlation between the theoretically calculated and experimentally observed Φ values strongly depends on whether the protein-folding mechanism assumed in the model holds true in real proteins. In other words, the correlation coefficient can be expected to illuminate the folding mechanisms of proteins, providing the answer to the question of which model more accurately describes protein folding: the framework model or the nucleation-condensation model. In addition, we tried to characterize protein folding with respect to various properties of each protein apart from the size and fold class, such as the free-energy profile, contact-order profile, and sensitivity to the parameters used in the Φ-value calculation. The results showed that any one of these properties alone was not enough to explain protein folding, although each one played a significant role in it. We have confirmed the importance of characterizing protein folding from various perspectives. Our findings have also highlighted that protein folding is highly variable and unique across different proteins, and this should be considered while pursuing a unified theory of protein folding. PMID:28409079

  7. Binary black hole mergers within the LIGO horizon: statistical properties and prospects for detecting electromagnetic counterparts

    NASA Astrophysics Data System (ADS)

    Perna, Rosalba; Chruslinska, Martyna; Corsi, Alessandra; Belczynski, Krzysztof

    2018-07-01

    Binary black holes (BBHs) are one of the endpoints of isolated binary evolution, and their mergers a leading channel for gravitational wave events. Here, using the evolutionary code STARTRACK, we study the statistical properties of the BBH population from isolated binary evolution for a range of progenitor star metallicities and BH natal kicks. We compute the mass function and the distribution of the primary BH spin a as a result of mass accretion during the binary evolution, and find that this is not an efficient process to spin up BHs, producing an increase by at most a ˜ 0.2-0.3 for very low natal BH spins. We further compute the distribution of merger sites within the host galaxy, after tracking the motion of the binaries in the potentials of a massive spiral, a massive elliptical, and a dwarf galaxy. We find that a fraction of 70-90 per cent of mergers in massive galaxies and of 40-60 per cent in dwarfs (range mostly sensitive to the natal kicks) is expected to occur inside of their hosts. The number density distribution at the merger sites further allows us to estimate the broad-band luminosity distribution that BBH mergers would produce, if associated with a kinetic energy release in an outflow, which, as a reference, we assume at the level inferred for the Fermi GBM counterpart to GW150914, with the understanding that current limits from the O1 and O2 runs would require such emission to be produced within a jet of angular size within ≲50°.

  8. Binary Black Hole Mergers within the LIGO Horizon: Statistical Properties and prospects for detecting Electromagnetic Counterparts

    NASA Astrophysics Data System (ADS)

    Perna, Rosalba; Chruslinska, Martyna; Corsi, Alessandra; Belczynski, Krzysztof

    2018-03-01

    Binary black holes (BBHs) are one of the endpoints of isolated binary evolution, and their mergers a leading channel for gravitational wave events. Here, using the evolutionary code STARTRACK, we study the statistical properties of the BBH population from isolated binary evolution for a range of progenitor star metallicities and BH natal kicks. We compute the mass function and the distribution of the primary BH spin a as a result of mass accretion during the binary evolution, and find that this is not an efficient process to spin up BHs, producing an increase by at most a ˜ 0.2-0.3 for very low natal BH spins. We further compute the distribution of merger sites within the host galaxy, after tracking the motion of the binaries in the potentials of a massive spiral, a massive elliptical, and a dwarf galaxy. We find that a fraction of 70-90% of mergers in massive galaxies and of 40-60% in dwarfs (range mostly sensitive to the natal kicks) is expected to occur inside of their hosts. The number density distribution at the merger sites further allows us to estimate the broadband luminosity distribution that BBH mergers would produce, if associated with a kinetic energy release in an outflow, which, as a reference, we assume at the level inferred for the Fermi GBM counterpart to GW150914, with the understanding that current limits from the O1 and O2 runs would require such emission to be produced within a jet of angular size within ≲ 50°.

  9. Value of Information Analysis for Time-lapse Seismic Data by Simulation-Regression

    NASA Astrophysics Data System (ADS)

    Dutta, G.; Mukerji, T.; Eidsvik, J.

    2016-12-01

    A novel method to estimate the Value of Information (VOI) of time-lapse seismic data in the context of reservoir development is proposed. VOI is a decision analytic metric quantifying the incremental value that would be created by collecting information prior to making a decision under uncertainty. The VOI has to be computed before collecting the information and can be used to justify its collection. Previous work on estimating the VOI of geophysical data has involved explicitly approximating the posterior distribution of reservoir properties given the data and then evaluating the prospect values for that posterior distribution. Here, we propose to directly estimate the prospect values given the data by building a statistical relationship between them using regression. Various regression techniques such as Partial Least Squares Regression (PLSR), Multivariate Adaptive Regression Splines (MARS) and k-Nearest Neighbors (k-NN) are used to estimate the VOI, and the results compared. For a univariate Gaussian case, the VOI obtained from simulation-regression has been shown to be close to the analytical solution. Estimating VOI by simulation-regression is much less computationally expensive, since the posterior distribution of reservoir properties given each possible dataset need not be modeled and the prospect values need not be evaluated for each such posterior. This method is flexible, since it does not require a rigid model specification of the posterior but rather fits conditional expectations non-parametrically from samples of values and data.
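
    A toy end-to-end version of the workflow (Python; the prospect values, data model, and the use of ordinary least squares in place of PLSR/MARS/k-NN are all illustrative assumptions):

        import numpy as np

        rng = np.random.default_rng(4)
        n = 20_000

        # simulate the uncertain reservoir property, prospect values, and data
        x = rng.normal(0.0, 1.0, n)                 # e.g. scaled average porosity
        value = np.column_stack([5.0 * x - 1.0,     # value of "develop"
                                 np.zeros(n)])      # value of "do nothing"
        data = x + rng.normal(0.0, 0.5, n)          # noisy time-lapse attribute

        # prior value: best alternative with no new information
        pv = value.mean(axis=0).max()

        # regress each alternative's value directly on the simulated data
        X = np.column_stack([np.ones(n), data])
        cond = np.column_stack([X @ np.linalg.lstsq(X, value[:, j], rcond=None)[0]
                                for j in range(value.shape[1])])

        # posterior value: best alternative for each simulated dataset
        pov = cond.max(axis=1).mean()
        print("VOI estimate:", pov - pv)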

  10. The effects of multiple repairs on Inconel 718 weld mechanical properties

    NASA Technical Reports Server (NTRS)

    Russell, C. K.; Nunes, A. C., Jr.; Moore, D.

    1991-01-01

    Inconel 718 weldments were repaired 3, 6, 9, and 13 times using the gas tungsten arc welding process. The welded panels were machined into mechanical test specimens, postweld heat treated, and nondestructively tested. Tensile properties and high cycle fatigue life were evaluated and the results compared to unrepaired weld properties. Mechanical property data were analyzed using the statistical methods of difference in means for tensile properties, and difference in log means and Weibull analysis for high cycle fatigue properties. Statistical analysis performed on the data did not show a significant decrease in tensile or high cycle fatigue properties due to the repeated repairs. Some degradation was observed in all properties; however, it was minimal.

  11. Chandra Survey Of Galactic Coronae Around Nearby Edge-on Disk Galaxies

    NASA Astrophysics Data System (ADS)

    Li, Jiang-Tao; Wang, D.

    2012-01-01

    The X-ray emitting coronae in nearby galaxies are expected to be produced either by accretion from the IGM or by various galactic feedback processes. It is already well known that the total hot gas luminosity of these galaxies is correlated with the stellar mass for early-type galaxies and with SFR for star forming galaxies. However, such relations always have large scatter, indicating that various other processes must be involved in regulating the coronal properties. In this work, we conduct a systematic analysis of the Chandra data of 53 nearby edge-on disk galaxies. The data are reduced in a uniform manner. Various coronal properties, such as the luminosity, temperature, emission measure, electron number density, total mass, thermal energy, radiative cooling timescale, vertical and horizontal extension, elongation, and steepness of the vertical distribution, are characterized for most of the sample galaxies. For some galaxies with high enough counting statistics, we also study the thermal and chemical states of the coronal gas. We then compare these hot gas properties to other galactic properties to further study the role of different processes in producing and/or maintaining the coronae. The soft X-ray luminosity of the coronae generally correlates well with the SF activity for our sample galaxies over more than 3 orders of magnitude in SFR or Lx. In addition, including other galactic properties can significantly tighten the SFR-Lx relation. The SN feedback efficiency is at most 10% for all the sample galaxies. We also find evidence for the effectiveness of old stellar feedback, gravitation, environmental effects, and cold-hot gas interaction in regulating the coronal properties.

  12. Nonlinear wave chaos: statistics of second harmonic fields.

    PubMed

    Zhou, Min; Ott, Edward; Antonsen, Thomas M; Anlage, Steven M

    2017-10-01

    Concepts from the field of wave chaos have been shown to successfully predict the statistical properties of linear electromagnetic fields in electrically large enclosures. The Random Coupling Model (RCM) describes these properties by incorporating both universal features described by Random Matrix Theory and the system-specific features of particular system realizations. In an effort to extend this approach to the nonlinear domain, we add an active nonlinear frequency-doubling circuit to an otherwise linear wave chaotic system, and we measure the statistical properties of the resulting second harmonic fields. We develop an RCM-based model of this system as two linear chaotic cavities coupled by means of a nonlinear transfer function. The harmonic field strengths are predicted to be the product of two statistical quantities and the nonlinearity characteristics. Statistical results from measurement-based calculation, RCM-based simulation, and direct experimental measurements are compared and show good agreement over many decades of power.

  13. Isolation and characterization of microsatellite loci in the whale shark (Rhincodon typus)

    USGS Publications Warehouse

    Ramirez-Macias, D.; Shaw, K.; Ward, R.; Galvan-Magana, F.; Vazquez-Juarez, R.

    2009-01-01

    In preparation for a study on population structure of the whale shark (Rhincodon typus), nine species-specific polymorphic microsatellite DNA markers were developed. An initial screening of 50 individuals from Holbox Island, Mexico found all nine loci to be polymorphic, with two to 17 alleles observed per locus. Observed and expected heterozygosity per locus ranged from 0.200 to 0.826 and from 0.213 to 0.857, respectively. Neither statistically significant deviations from Hardy–Weinberg expectations nor statistically significant linkage disequilibrium between loci were observed. These microsatellite loci appear suitable for examining population structure, kinship assessment and other applications.
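
    Observed and expected (Hardy-Weinberg) heterozygosity are straightforward to compute from genotype counts; a minimal sketch with made-up genotypes at a single locus, using Nei's small-sample correction:

    ```python
    import numpy as np

    # Hypothetical genotypes (pairs of allele ids) at one microsatellite locus
    genotypes = [(1, 1), (1, 2), (2, 3), (2, 2), (1, 3), (3, 3), (1, 2), (2, 3)]
    n = len(genotypes)

    alleles = np.array([a for pair in genotypes for a in pair])
    _, counts = np.unique(alleles, return_counts=True)
    p = counts / counts.sum()                        # allele frequencies

    ho = np.mean([a != b for a, b in genotypes])     # observed heterozygosity
    he = (1.0 - np.sum(p**2)) * 2 * n / (2 * n - 1)  # unbiased expected heterozygosity
    print(f"Ho = {ho:.3f}, He = {he:.3f}")
    ```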

  14. Optical Parametric Amplification of Single Photon: Statistical Properties and Quantum Interference

    NASA Astrophysics Data System (ADS)

    Xu, Xue-Xiang; Yuan, Hong-Chun

    2014-05-01

    Using the phase-space method, we theoretically investigate the quantum statistical properties and quantum interference of optical parametric amplification of a single photon. The statistical properties, such as the Wigner function (WF), average photon number, photon number distribution, and parity, are derived analytically for the fields at the two output ports. The results indicate that the fields in the output ports are multiphoton states rather than a single-photon state, due to the amplification of the optical parametric amplifier (OPA). In addition, the phase sensitivity is also examined using the detection scheme of parity measurement.

  15. Generalized statistical mechanics approaches to earthquakes and tectonics.

    PubMed

    Vallianatos, Filippos; Papadakis, Giorgos; Michas, Georgios

    2016-12-01

    Despite the extreme complexity that characterizes the mechanism of the earthquake generation process, simple empirical scaling relations apply to the collective properties of earthquakes and faults in a variety of tectonic environments and scales. The physical characterization of those properties and the scaling relations that describe them attract a wide scientific interest and are incorporated in the probabilistic forecasting of seismicity in local, regional and planetary scales. Considerable progress has been made in the analysis of the statistical mechanics of earthquakes, which, based on the principle of entropy, can provide a physical rationale to the macroscopic properties frequently observed. The scale-invariant properties, the (multi) fractal structures and the long-range interactions that have been found to characterize fault and earthquake populations have recently led to the consideration of non-extensive statistical mechanics (NESM) as a consistent statistical mechanics framework for the description of seismicity. The consistency between NESM and observations has been demonstrated in a series of publications on seismicity, faulting, rock physics and other fields of geosciences. The aim of this review is to present in a concise manner the fundamental macroscopic properties of earthquakes and faulting and how these can be derived by using the notions of statistical mechanics and NESM, providing further insights into earthquake physics and fault growth processes.
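
    For reference, NESM rests on the Tsallis entropy and the associated q-exponential function (standard definitions, not specific to this review):

    $$ S_q = k\,\frac{1 - \sum_i p_i^{\,q}}{q - 1}, \qquad \exp_q(x) = \bigl[1 + (1 - q)\,x\bigr]^{1/(1-q)}, $$

    with $S_q$ recovering the Boltzmann-Gibbs entropy as $q \to 1$; cumulative distributions of quantities such as seismic moment, inter-event time, or fault length are then described by q-exponential forms.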

  16. Generalized statistical mechanics approaches to earthquakes and tectonics

    PubMed Central

    Papadakis, Giorgos; Michas, Georgios

    2016-01-01

    Despite the extreme complexity that characterizes the mechanism of the earthquake generation process, simple empirical scaling relations apply to the collective properties of earthquakes and faults in a variety of tectonic environments and scales. The physical characterization of those properties and the scaling relations that describe them attract a wide scientific interest and are incorporated in the probabilistic forecasting of seismicity in local, regional and planetary scales. Considerable progress has been made in the analysis of the statistical mechanics of earthquakes, which, based on the principle of entropy, can provide a physical rationale to the macroscopic properties frequently observed. The scale-invariant properties, the (multi) fractal structures and the long-range interactions that have been found to characterize fault and earthquake populations have recently led to the consideration of non-extensive statistical mechanics (NESM) as a consistent statistical mechanics framework for the description of seismicity. The consistency between NESM and observations has been demonstrated in a series of publications on seismicity, faulting, rock physics and other fields of geosciences. The aim of this review is to present in a concise manner the fundamental macroscopic properties of earthquakes and faulting and how these can be derived by using the notions of statistical mechanics and NESM, providing further insights into earthquake physics and fault growth processes. PMID:28119548

  17. Instantaneous polarization statistic property of EM waves incident on time-varying reentry plasma

    NASA Astrophysics Data System (ADS)

    Bai, Bowen; Liu, Yanming; Li, Xiaoping; Yao, Bo; Shi, Lei

    2018-06-01

    An analytical method is proposed in this paper to study the effect of a time-varying reentry plasma sheath on the instantaneous polarization statistical properties of electromagnetic (EM) waves. Based on the disturbance properties of the hypersonic fluid, a spatial-temporal model of the time-varying reentry plasma sheath is established. An analytical technique referred to as the transmission line analogy is developed to calculate the instantaneous transmission coefficient of EM wave propagation in time-varying plasma. The instantaneous polarization statistical theory of EM wave propagation in the time-varying plasma sheath is then developed. Taking an S-band telemetry right-hand circularly polarized wave as an example, the effects of incidence angle and plasma parameters, including electron density and collision frequency, on the wave's polarization statistics are studied systematically. The statistical results indicate that the lower the collision frequency and the larger the electron density and incidence angle, the greater the deterioration of the polarization properties. Moreover, for critical combinations of electron density, collision frequency, and incidence angle, the transmitted wave contains both right- and left-hand polarization modes, and the polarization mode can reverse. These results could provide useful information for adaptive polarization receiving during a spacecraft's reentry communication.
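
    A transmission-line (transfer-matrix) calculation of this kind can be sketched for normal incidence through a stratified, cold, collisional plasma; the profile numbers below are illustrative, not the paper's sheath model, and oblique incidence and time variation are omitted.

    ```python
    import numpy as np

    C0 = 299792458.0                       # speed of light, m/s
    QE, ME, EPS0 = 1.602e-19, 9.109e-31, 8.854e-12

    def plasma_index(ne, nu, omega):
        """Complex refractive index of a cold collisional plasma."""
        wp2 = ne * QE**2 / (ME * EPS0)     # plasma frequency squared
        return np.sqrt(1.0 - wp2 / (omega * (omega - 1j * nu)))

    def transmission(ne_profile, d, nu, f):
        """Normal-incidence power transmission through stacked uniform slabs."""
        omega = 2.0 * np.pi * f
        M = np.eye(2, dtype=complex)
        for ne in ne_profile:
            n = plasma_index(ne, nu, omega)
            delta = omega / C0 * n * d     # phase thickness of one slab
            M = M @ np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                              [1j * n * np.sin(delta), np.cos(delta)]])
        t = 2.0 / (M[0, 0] + M[0, 1] + M[1, 0] + M[1, 1])  # vacuum on both sides
        return abs(t)**2

    # Illustrative sheath: 10 slabs with a Gaussian electron-density profile
    x = np.linspace(-1.0, 1.0, 10)
    ne_profile = 1e17 * np.exp(-4.0 * x**2)                   # electrons per m^3
    print(transmission(ne_profile, d=0.01, nu=1e9, f=2.3e9))  # S-band carrier
    ```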

  18. Statistical Study between Solar Wind, Magnetosheath and Plasma Sheet Fluctuation Properties and Correlation with Magnetotail Bursty Bulk Flows

    NASA Astrophysics Data System (ADS)

    Chu, C. S.; Nykyri, K.; Dimmock, A. P.

    2017-12-01

    In this paper we test the hypothesis that magnetotail reconnection in the thin current sheet could be initiated by external fluctuations. The Kelvin-Helmholtz instability (KHI) has been observed during southward IMF, and it can produce cold, dense plasma transport and compressional fluctuations that can move further into the magnetosphere. The properties of the KHI depend on the magnetosheath seed fluctuation spectrum (Nykyri et al., JGR, 2017). We present a statistical correlation study between solar wind, magnetosheath and plasma sheet fluctuation properties using 9+ years of THEMIS data in an aberrated GSM frame, and in a normalized coordinate system that takes into account the changes of the magnetopause and bow shock location with respect to changing solar wind conditions. We present statistical results of the plasma sheet fluctuation properties (dn, dV and dB) and their dependence on IMF orientation and fluctuation properties and the resulting magnetosheath state. These statistical maps are compared with the spatial distribution of magnetotail Bursty Bulk Flows to study possible correlations between these fluctuations and magnetotail reconnection.

  19. Subjective Ratings of Beauty and Aesthetics: Correlations With Statistical Image Properties in Western Oil Paintings

    PubMed Central

    Lehmann, Thomas; Redies, Christoph

    2017-01-01

    For centuries, oil paintings have been a major segment of the visual arts. The JenAesthetics data set consists of a large number of high-quality images of oil paintings of Western provenance from different art periods. With this database, we studied the relationship between objective image measures and subjective evaluations of the images, especially evaluations on aesthetics (defined as artistic value) and beauty (defined as individual liking). The objective measures represented low-level statistical image properties that have been associated with aesthetic value in previous research. Subjective rating scores on aesthetics and beauty correlated not only with each other but also with different combinations of the objective measures. Furthermore, we found that paintings from different art periods vary with regard to the objective measures, that is, they exhibit specific patterns of statistical image properties. In addition, clusters of participants preferred different combinations of these properties. In conclusion, the results of the present study provide evidence that statistical image properties vary between art periods and subject matters and, in addition, they correlate with the subjective evaluation of paintings by the participants. PMID:28694958
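
    One commonly computed low-level statistic of this kind is the slope of the radially averaged log-log Fourier power spectrum; a self-contained sketch (a random array stands in for a grayscale painting):

    ```python
    import numpy as np

    def spectral_slope(img):
        """Slope of the radially averaged log-log Fourier power spectrum,
        one of the low-level statistics used in empirical aesthetics."""
        f = np.abs(np.fft.fftshift(np.fft.fft2(img)))**2
        y, x = np.indices(f.shape)
        cy, cx = f.shape[0] // 2, f.shape[1] // 2
        r = np.hypot(y - cy, x - cx).astype(int)
        radial = np.bincount(r.ravel(), f.ravel()) / np.bincount(r.ravel())
        k = np.arange(1, min(cy, cx))
        slope, _ = np.polyfit(np.log(k), np.log(radial[k]), 1)
        return slope

    img = np.random.rand(256, 256)   # stand-in for a grayscale painting
    print(spectral_slope(img))       # natural scenes cluster near a slope of -2
    ```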

  20. Min and Max Exponential Extreme Interval Values and Statistics

    ERIC Educational Resources Information Center

    Jance, Marsha; Thomopoulos, Nick

    2009-01-01

    The extreme interval values and statistics (expected value, median, mode, standard deviation, and coefficient of variation) for the smallest (min) and largest (max) values of exponentially distributed variables with parameter λ = 1 are examined for different observation (sample) sizes. An extreme interval value g[subscript a] is defined as a…
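
    The underlying closed forms are standard: for $n$ independent $X_i \sim \mathrm{Exp}(\lambda = 1)$,

    $$ \min_i X_i \sim \mathrm{Exp}(n), \qquad E\!\left[\min_i X_i\right] = \frac{1}{n}, \qquad F_{\max}(x) = \left(1 - e^{-x}\right)^{n}, \qquad E\!\left[\max_i X_i\right] = \sum_{k=1}^{n} \frac{1}{k}, $$

    so, for example, the median of the maximum is $-\ln\bigl(1 - 2^{-1/n}\bigr)$.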

  1. Students' Achievements in a Statistics Course in Relation to Motivational Aspects and Study Behaviour

    ERIC Educational Resources Information Center

    Bude, Luc; Van De Wiel, Margaretha W. J.; Imbos, Tjaart; Candel, Math J. J. M.; Broers, Nick J.; Berger, Martijn P. F.

    2007-01-01

    The present study focuses on motivational constructs and their effect on students' academic achievement within an existing statistics course. First-year Health Sciences students completed a questionnaire that measures several motivational constructs: dimensions of causal attributions, outcome expectancy, affect, and study behaviour, all with…

  2. Farkle Fundamentals and Fun. Activities for Students

    ERIC Educational Resources Information Center

    Hooley, Donald E.

    2014-01-01

    The dice game Farkle provides an excellent basis for four activities that reinforce probability and expected value concepts for students in an introductory statistics class. These concepts appear in the increasingly popular AP statistics course (Peck 2011) and are used in analyzing ethical issues from insurance and gambling (COMAP 2009; Woodward…

  3. Quantifying economic fluctuations by adapting methods of statistical physics

    NASA Astrophysics Data System (ADS)

    Plerou, Vasiliki

    2001-09-01

    The first focus of this thesis is the investigation of cross-correlations between the price fluctuations of different stocks using the conceptual framework of random matrix theory (RMT), developed in physics to describe the statistical properties of energy-level spectra of complex nuclei. RMT makes predictions for the statistical properties of matrices that are universal, i.e., do not depend on the interactions between the elements comprising the system. In physical systems, deviations from the predictions of RMT provide clues regarding the mechanisms controlling the dynamics of a given system, so this framework is of potential value if applied to economic systems. This thesis compares the statistics of the cross-correlation matrix C, whose elements C_ij are the correlation coefficients of price fluctuations of stocks i and j, against the "null hypothesis" of a random matrix having the same symmetry properties. It is shown that comparison of the eigenvalue statistics of C with RMT results can be used to distinguish the random and non-random parts of C. The non-random part of C, which deviates from RMT results, provides information regarding genuine cross-correlations between stocks. The interpretations and potential practical utility of these deviations are also investigated. The second focus is the characterization of the dynamics of stock price fluctuations. The statistical properties of the changes G_Δt in price over a time interval Δt are quantified, and the statistical relation between G_Δt and the trading activity, measured by the number of transactions N_Δt in the interval Δt, is investigated. The statistical properties of the volatility, i.e., the time-dependent standard deviation of price fluctuations, are related to two microscopic quantities: N_Δt and the variance W²_Δt of the price changes for all transactions in the interval Δt. In addition, the statistical relationship between G_Δt and the number of shares Q_Δt traded in Δt is investigated.
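
    The RMT comparison can be reproduced in a few lines: under the null of uncorrelated returns, the eigenvalues of the empirical correlation matrix fall inside the Marchenko-Pastur band, while a common "market" factor (added synthetically here) appears as an eigenvalue far above it.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    N, T = 100, 500                        # stocks, time points
    Q = T / N

    # Surrogate returns: noise plus one common "market" factor
    market = rng.normal(0, 1, T)
    G = rng.normal(0, 1, (N, T)) + 0.4 * market
    G = (G - G.mean(axis=1, keepdims=True)) / G.std(axis=1, keepdims=True)

    C = G @ G.T / T                        # empirical cross-correlation matrix
    eig = np.linalg.eigvalsh(C)

    # Marchenko-Pastur bounds for the uncorrelated null
    lam_min = (1 - np.sqrt(1 / Q))**2
    lam_max = (1 + np.sqrt(1 / Q))**2
    print(f"RMT band [{lam_min:.2f}, {lam_max:.2f}]")
    print("deviating eigenvalues:", eig[(eig < lam_min) | (eig > lam_max)])
    ```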

  4. Parent Expectations and Planning for College. Statistical Analysis Report. NCES 2008-079

    ERIC Educational Resources Information Center

    Lippman, Laura; Guzman, Lina; Keith, Julie Dombrowski; Kinukawa, Akemi; Shwalb, Rebecca; Tice, Peter

    2008-01-01

    This report uses data from the 2003 National Household Education Surveys Program (NHES) Parent and Family Involvement Survey (PFI) to examine the characteristics associated with the educational expectations parents had for their children and the postsecondary education planning practices families and schools engaged in. The results presented in…

  5. Graduate Level Research Methods and Statistics Courses: The Perspective of an Instructor

    ERIC Educational Resources Information Center

    Mulvenon, Sean W.; Wang, Victor C. X.

    2015-01-01

    The goal of an educational system or degree program is to "educate" students. This immediately raises the question of what does it mean to "educate" students. All academic institutions, degree programs and content areas are typically expected to answer this question and establish appropriate academic expectations both within…

  6. Parent and teen agreement on driving expectations prior to teen licensure.

    PubMed

    Hamann, Cara J; Ramirez, Marizen; Yang, Jingzhen; Chande, Vidya; Peek-Asa, Corinne

    2014-01-01

    To examine pre-licensure agreement on driving expectations and predictors of teen driving expectations among parent-teen dyads. Cross-sectional survey of 163 parent-teen dyads. Descriptive statistics, weighted Kappa coefficients, and linear regression were used to examine expectations about post-licensure teen driving. Teens reported high pre-licensure unsupervised driving (N = 79, 48.5%) and regular access to a car (N = 130, 81.8%). Parents and teens had low agreement on teen driving expectations (e.g., after dark, κw = 0.23). Each time teens currently drove to/from school, their expectation of driving in risky conditions post-licensure increased (β = 0.21, p = .02). Pre-licensure improvement of parent-teen agreement on driving expectations is needed to have the greatest impact on preventing teens from driving in high-risk conditions.
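
    Weighted kappa for ordinal parent-teen agreement of this kind is available off the shelf; a toy sketch (scale and ratings made up for illustration):

    ```python
    from sklearn.metrics import cohen_kappa_score

    # Hypothetical parent vs. teen ratings of expected driving after dark (0-3)
    parent = [0, 1, 2, 3, 1, 0, 2, 1, 3, 2]
    teen   = [1, 2, 2, 3, 0, 1, 3, 1, 2, 2]

    kappa_w = cohen_kappa_score(parent, teen, weights="linear")
    print(f"weighted kappa = {kappa_w:.2f}")
    ```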

  7. Form-To-Expectation Matching Effects on First-Pass Eye Movement Measures During Reading

    PubMed Central

    Farmer, Thomas A.; Yan, Shaorong; Bicknell, Klinton; Tanenhaus, Michael K.

    2015-01-01

    Recent EEG/MEG studies suggest that when contextual information is highly predictive of some property of a linguistic signal, expectations generated from context can be translated into surprisingly low-level estimates of the physical form-based properties likely to occur in subsequent portions of the unfolding signal. Whether form-based expectations are generated and assessed during natural reading, however, remains unclear. We monitored eye movements while participants read phonologically typical and atypical nouns in noun-predictive contexts (Experiment 1), demonstrating that when a noun is strongly expected, fixation durations on first-pass eye movement measures, including first fixation duration, gaze duration, and go-past times, are shorter for nouns with category typical form-based features. In Experiments 2 and 3, typical and atypical nouns were placed in sentential contexts normed to create expectations of variable strength for a noun. Context and typicality interacted significantly at gaze duration. These results suggest that during reading, form-based expectations that are translated from higher-level category-based expectancies can facilitate the processing of a word in context, and that their effect on lexical processing is graded based on the strength of category expectancy. PMID:25915072

  8. A two-component rain model for the prediction of attenuation statistics

    NASA Technical Reports Server (NTRS)

    Crane, R. K.

    1982-01-01

    A two-component rain model has been developed for calculating attenuation statistics. In contrast to most other attenuation prediction models, the two-component model calculates the occurrence probability for volume cells or debris attenuation events. The model performed significantly better than the International Radio Consultative Committee model when used for predictions on earth-satellite paths. It is expected that the model will have applications in modeling the joint statistics required for space diversity system design, the statistics of interference due to rain scatter at attenuating frequencies, and the duration statistics for attenuation events.

  9. The quality assessment of family physician service in rural regions, Northeast of Iran in 2012

    PubMed Central

    Vafaee-Najar, Ali; Nejatzadegan, Zohreh; Pourtaleb, Arefeh; Kaffashi, Shahnaz; Vejdani, Marjan; Molavi-Taleghani, Yasamin; Ebrahimipour, Hosein

    2014-01-01

    Background: Following the implementation of the family physician plan in rural areas, the quantity of services provided has increased, but the quality of those services must also meet patients' expectations. The present study aims at determining the gap between patients' expectations and perceptions of the quality of services provided by family physicians during the spring and summer of 2012. Methods: This was a cross-sectional study in which 480 patients who referred to family physician centers were selected with cluster and simple randomized sampling. Data were collected through the SERVQUAL standard questionnaire and were analyzed with descriptive statistics, using the T-test, Kruskal-Wallis, and Wilcoxon signed-rank tests in SPSS 16 at a significance level of 0.05. Results: The difference between the mean scores of expectation and perception was about -0.93, a statistically significant difference (P ≤ 0.05). The gaps in the five dimensions of quality were as follows: tangibles -1.10, reliability -0.87, responsiveness -1.06, assurance -0.83, and empathy -0.82. Findings showed a significant difference between expectation and perception in all five dimensions of the provided services (P ≤ 0.05). Conclusion: There was a gap between the ideal situation and the current situation of family physician service quality. Our suggestion is maintaining a strong focus on patients, creating a medical practice that would exceed patients' expectations, providing high-quality healthcare services, and realizing the continuous improvement of all processes. The gaps in the tangibles and responsiveness dimensions were greater than in the other dimensions, so more attention should be paid to the physical appearance of the health center environment and the availability of staff and employees. PMID:24757691
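
    The SERVQUAL gap analysis reduces to paired differences per dimension; a minimal sketch on simulated Likert scores (values are made up, not the study's data):

    ```python
    import numpy as np
    from scipy.stats import wilcoxon

    rng = np.random.default_rng(3)
    # Hypothetical 5-point Likert scores for one SERVQUAL dimension (100 patients)
    expectation = rng.integers(3, 6, 100).astype(float)
    perception = np.clip(expectation - rng.poisson(1.0, 100), 1, 5)

    gap = perception - expectation       # negative gap = unmet expectations
    stat, p = wilcoxon(perception, expectation)
    print(f"mean gap = {gap.mean():.2f}, Wilcoxon p = {p:.4f}")
    ```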

  10. 41 CFR 102-35.20 - What definitions apply to GSA's personal property regulations?

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... surveillance of personal property throughout its complete life cycle using various property management tools... includes nonexpendable personal property whose expected useful life is two years or longer and whose...) Statement of Federal Financial Accounting Standards No. 6 Accounting for Property, Plant and Equipment...

  11. 41 CFR 102-35.20 - What definitions apply to GSA's personal property regulations?

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... surveillance of personal property throughout its complete life cycle using various property management tools... includes nonexpendable personal property whose expected useful life is two years or longer and whose...) Statement of Federal Financial Accounting Standards No. 6 Accounting for Property, Plant and Equipment...

  12. 41 CFR 102-35.20 - What definitions apply to GSA's personal property regulations?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... surveillance of personal property throughout its complete life cycle using various property management tools... includes nonexpendable personal property whose expected useful life is two years or longer and whose...) Statement of Federal Financial Accounting Standards No. 6 Accounting for Property, Plant and Equipment...

  13. Lagrangian statistics of mesoscale turbulence in a natural environment: The Agulhas return current.

    PubMed

    Carbone, Francesco; Gencarelli, Christian N; Hedgecock, Ian M

    2016-12-01

    The properties of mesoscale geophysical turbulence in an oceanic environment have been investigated through the Lagrangian statistics of sea surface temperature measured by a drifting buoy within the Agulhas return current, where strong temperature mixing produces locally sharp temperature gradients. By disentangling the large-scale forcing which affects the small-scale statistics, we found that the statistical properties of intermittency are identical to those obtained from the multifractal prediction in the Lagrangian frame for the velocity trajectory. The results suggest a possible universality of turbulence scaling.
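
    The multifractal framework invoked here is usually phrased through structure functions: for increments of the measured scalar over a time lag $\tau$,

    $$ S_q(\tau) = \langle \lvert \theta(t + \tau) - \theta(t) \rvert^{\,q} \rangle \sim \tau^{\zeta(q)}, $$

    where a nonlinear, concave $\zeta(q)$ is the signature of intermittency, whereas self-similar scaling would make $\zeta(q)$ linear in $q$.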

  14. Bayesian and “Anti-Bayesian” Biases in Sensory Integration for Action and Perception in the Size–Weight Illusion

    PubMed Central

    Brayanov, Jordan B.

    2010-01-01

    Which is heavier: a pound of lead or a pound of feathers? This classic trick question belies a simple but surprising truth: when lifted, the pound of lead feels heavier—a phenomenon known as the size–weight illusion. To estimate the weight of an object, our CNS combines two imperfect sources of information: a prior expectation, based on the object's appearance, and direct sensory information from lifting it. Bayes' theorem (or Bayes' law) defines the statistically optimal way to combine multiple information sources for maximally accurate estimation. Here we asked whether the mechanisms for combining these information sources produce statistically optimal weight estimates for both perceptions and actions. We first studied the ability of subjects to hold one hand steady when the other removed an object from it, under conditions in which sensory information about the object's weight sometimes conflicted with prior expectations based on its size. Since the ability to steady the supporting hand depends on the generation of a motor command that accounts for lift timing and object weight, hand motion can be used to gauge biases in weight estimation by the motor system. We found that these motor system weight estimates reflected the integration of prior expectations with real-time proprioceptive information in a Bayesian, statistically optimal fashion that discounted unexpected sensory information. This produces a motor size–weight illusion that consistently biases weight estimates toward prior expectations. In contrast, when subjects compared the weights of two objects, their perceptions defied Bayes' law, exaggerating the value of unexpected sensory information. This produces a perceptual size–weight illusion that biases weight perceptions away from prior expectations. We term this effect “anti-Bayesian” because the bias is opposite that seen in Bayesian integration. Our findings suggest that two fundamentally different strategies for the integration of prior expectations with sensory information coexist in the nervous system for weight estimation. PMID:20089821
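
    For two Gaussian cues, the statistically optimal (Bayesian) combination referred to above is the precision-weighted average, stated here for concreteness:

    $$ \hat{w} = \frac{\sigma_s^2\,\mu_p + \sigma_p^2\,w_s}{\sigma_p^2 + \sigma_s^2}, \qquad \sigma_{\hat{w}}^2 = \frac{\sigma_p^2\,\sigma_s^2}{\sigma_p^2 + \sigma_s^2}, $$

    where $\mu_p, \sigma_p^2$ describe the size-based prior and $w_s, \sigma_s^2$ the sensory estimate of weight; the motor bias toward the prior follows this form, while the reported perceptual bias runs in the opposite direction.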

  15. Probabilistic Fiber Composite Micromechanics

    NASA Technical Reports Server (NTRS)

    Stock, Thomas A.

    1996-01-01

    Probabilistic composite micromechanics methods are developed that simulate expected uncertainties in unidirectional fiber composite properties. These methods are in the form of computational procedures using Monte Carlo simulation. The variables in which uncertainties are accounted for include constituent and void volume ratios, constituent elastic properties and strengths, and fiber misalignment. A graphite/epoxy unidirectional composite (ply) is studied to demonstrate fiber composite material property variations induced by random changes expected at the material micro level. Regression results are presented to show the relative correlation between predictor and response variables in the study. These computational procedures make possible a formal description of anticipated random processes at the intra-ply level, and the related effects of these on composite properties.
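
    The flavor of such a procedure can be sketched with a rule-of-mixtures Monte Carlo (constituent statistics below are illustrative, not the report's values):

    ```python
    import numpy as np

    rng = np.random.default_rng(4)
    n = 10_000

    # Constituent properties with assumed scatter (graphite/epoxy, illustrative)
    Ef = rng.normal(230e9, 10e9, n)   # fiber modulus, Pa
    Em = rng.normal(3.5e9, 0.3e9, n)  # matrix modulus, Pa
    Vf = rng.normal(0.60, 0.03, n)    # fiber volume fraction

    E1 = Vf * Ef + (1 - Vf) * Em             # rule of mixtures, longitudinal
    E2 = 1.0 / (Vf / Ef + (1 - Vf) / Em)     # inverse rule, transverse
    print(f"E1 = {E1.mean()/1e9:.1f} +/- {E1.std()/1e9:.1f} GPa")
    print(f"E2 = {E2.mean()/1e9:.2f} +/- {E2.std()/1e9:.2f} GPa")
    ```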

  16. Psychology, Science, and Knowledge Construction: Broadening Perspectives from the Replication Crisis.

    PubMed

    Shrout, Patrick E; Rodgers, Joseph L

    2018-01-04

    Psychology advances knowledge by testing statistical hypotheses using empirical observations and data. The expectation is that most statistically significant findings can be replicated in new data and in new laboratories, but in practice many findings have replicated less often than expected, leading to claims of a replication crisis. We review recent methodological literature on questionable research practices, meta-analysis, and power analysis to explain the apparently high rates of failure to replicate. Psychologists can improve research practices to advance knowledge in ways that improve replicability. We recommend that researchers adopt open science conventions of preregistration and full disclosure and that replication efforts be based on multiple studies rather than on a single replication attempt. We call for more sophisticated power analyses, careful consideration of the various influences on effect sizes, and more complete disclosure of nonsignificant as well as statistically significant findings.
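
    The power analyses being recommended are routine to run; a sketch with statsmodels (effect sizes chosen purely for illustration):

    ```python
    from statsmodels.stats.power import TTestIndPower

    analysis = TTestIndPower()
    n = analysis.solve_power(effect_size=0.4, alpha=0.05, power=0.80)
    print(f"n per group for d = 0.4 at 80% power: {n:.0f}")

    # Power of a single replication attempt at that n if the true effect is smaller
    power = analysis.solve_power(effect_size=0.25, alpha=0.05, nobs1=n)
    print(f"power if the true effect is d = 0.25: {power:.2f}")
    ```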

  17. Ten Years of Cloud Properties from MODIS: Global Statistics and Use in Climate Model Evaluation

    NASA Technical Reports Server (NTRS)

    Platnick, Steven E.

    2011-01-01

    The NASA Moderate Resolution Imaging Spectroradiometer (MODIS), launched onboard the Terra and Aqua spacecraft, began Earth observations on February 24, 2000 and June 24, 2002, respectively. Among the algorithms developed and applied to this sensor, a suite of cloud products includes cloud masking/detection, cloud-top properties (temperature, pressure), and optical properties (optical thickness, effective particle radius, water path, and thermodynamic phase). All cloud algorithms underwent numerous changes and enhancements for the latest Collection 5 production version; this process continues with the current Collection 6 development. We will show example MODIS Collection 5 cloud climatologies derived from global spatial and temporal aggregations provided in the archived gridded Level-3 MODIS atmosphere team product (product names MOD08 and MYD08 for MODIS Terra and Aqua, respectively). Data sets in this Level-3 product include scalar statistics as well as 1- and 2-D histograms of many cloud properties, allowing for higher order information and correlation studies. In addition to these statistics, we will show trends and statistical significance in annual and seasonal means for a variety of the MODIS cloud properties, as well as the time required for detection given assumed trends. To assist in climate model evaluation, we have developed a MODIS cloud simulator with an accompanying netCDF file containing subsetted monthly Level-3 statistical data sets that correspond to the simulator output. Correlations of cloud properties with ENSO offer the potential to evaluate model cloud sensitivity; initial results will be discussed.

  18. Properties of Earth's temporarily-captured flybys

    NASA Astrophysics Data System (ADS)

    Fedorets, Grigori; Granvik, Mikael

    2014-11-01

    In addition to the Moon, a population of small temporarily-captured NEOs is predicted to orbit the Earth. The definition of a natural Earth satellite is that it is on an elliptic geocentric orbit within 0.03 au from the Earth. The population is further divided into temporarily-captured orbiters (TCOs, or minimoons, making at least one full revolution around the Earth in a coordinate system co-rotating with the Sun) and temporarily-captured flybys (TCFs), which fail to make a full revolution but are temporarily on an elliptic orbit around the Earth. Only one minimoon has been discovered to date, but it is expected that next generation surveys will be able to detect these objects regularly. Granvik et al. (2012) performed an extensive analysis of the behaviour of these temporarily-captured objects. One of the main results was that at any given moment there is at least one 1-meter-diameter minimoon in orbit around the Earth. However, the results of Granvik et al. (2012) raised questions concerning the NES population, such as the bimodality of the capture duration distribution and a distinctive lack of test particles within Earth's Hill sphere, which requires investigating the statistical properties of the TCF population as well. In this work we confirm the population characteristics for minimoons described by Granvik et al. (2012), and extend the analysis to TCFs. For the calculations we use a Bulirsch-Stoer integrator implemented in the OpenOrb software package (Granvik et al. 2009). We study, e.g., the capture statistics, residence-time distributions, and steady-state properties of TCFs. Our preliminary results indicate that TCFs may be suitable targets for asteroid-redirect missions. More detailed knowledge of the TCF population will also improve our understanding of the link between temporarily-captured objects and NEOs in general. References: Granvik et al. (2009) MPS 44(12), 1853-1861; Granvik et al. (2012) Icarus 218, 262-277.

  19. A state space approach for piecewise-linear recurrent neural networks for identifying computational dynamics from neural measurements.

    PubMed

    Durstewitz, Daniel

    2017-06-01

    The computational and cognitive properties of neural systems are often thought to be implemented in terms of their (stochastic) network dynamics. Hence, recovering the system dynamics from experimentally observed neuronal time series, like multiple single-unit recordings or neuroimaging data, is an important step toward understanding its computations. Ideally, one would not only seek a (lower-dimensional) state space representation of the dynamics, but would wish to have access to its statistical properties and their generative equations for in-depth analysis. Recurrent neural networks (RNNs) are a computationally powerful and dynamically universal formal framework which has been extensively studied from both the computational and the dynamical systems perspective. Here we develop a semi-analytical maximum-likelihood estimation scheme for piecewise-linear RNNs (PLRNNs) within the statistical framework of state space models, which accounts for noise in both the underlying latent dynamics and the observation process. The Expectation-Maximization algorithm is used to infer the latent state distribution, through a global Laplace approximation, and the PLRNN parameters iteratively. After validating the procedure on toy examples, and using inference through particle filters for comparison, the approach is applied to multiple single-unit recordings from the rodent anterior cingulate cortex (ACC) obtained during performance of a classical working memory task, delayed alternation. Models estimated from kernel-smoothed spike time data were able to capture the essential computational dynamics underlying task performance, including stimulus-selective delay activity. The estimated models were rarely multi-stable, however, but rather were tuned to exhibit slow dynamics in the vicinity of a bifurcation point. In summary, the present work advances a semi-analytical (thus reasonably fast) maximum-likelihood estimation framework for PLRNNs that may enable recovery of relevant aspects of the nonlinear dynamics underlying observed neuronal time series, and may directly link these to computational properties.
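
    For orientation, a PLRNN state space model of this type is commonly written as follows (a generic formulation consistent with the description above, not necessarily the paper's exact parameterization):

    $$ z_t = A z_{t-1} + W\,\phi(z_{t-1}) + h + \varepsilon_t, \qquad x_t = B\,\phi(z_t) + \eta_t, \qquad \phi(z) = \max(0, z), $$

    with Gaussian noise terms $\varepsilon_t$ and $\eta_t$; the piecewise-linear $\phi$ is what keeps a semi-analytical E-step (via a global Laplace approximation) tractable.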

  20. An examination of the challenges influencing science instruction in Florida elementary classrooms

    NASA Astrophysics Data System (ADS)

    North, Stephanie Gwinn

    It has been shown that the mechanical properties of thin films tend to differ from their bulk counterparts. Specifically, the bulge and microtensile testing of thin films used in MEMS have revealed that these films demonstrate an inverse relationship between thickness and strength. A film dimension is not a material property, but it evidently does affect the mechanical performance of materials at very small thicknesses. A hypothetical explanation for this phenomenon is that as the thickness dimension of the film decreases, it is statistically less likely that imperfections exist in the material. It would require a very small thickness (or volume) to limit imperfections in a material, which is why this phenomenon is seen in films with thicknesses on the order of 100 nm to a few microns. Another hypothesized explanation is that the surface tension that exists in bulk material also exists in thin films but has a greater impact at such a small scale. The goal of this research is to identify a theoretical prediction of the strength of thin films based on their microstructural properties, such as grain size and film thickness. This would minimize the need for expensive and complicated tests such as the bulge and microtensile tests. In this research, data were collected from the bulge and microtensile testing of copper, aluminum, gold, and polysilicon free-standing thin films. Statistical testing of these data revealed a definitive inverse relationship between thickness and strength, as well as between grain size and strength, as expected. However, due to a lack of a standardized method for either test, there were significant variations in the data. This research compares and analyzes the methods used by other researchers to develop a suggested set of instructions for a standardized bulge test and a standardized microtensile test. The most important parameters to be controlled in each test were found to be strain rate, temperature, film deposition method, film length, and strain measurement.

  1. A Comparison of the Flexural and Impact Strengths and Flexural Modulus of CAD/CAM and Conventional Heat-Cured Polymethyl Methacrylate (PMMA).

    PubMed

    Al-Dwairi, Ziad N; Tahboub, Kawkab Y; Baba, Nadim Z; Goodacre, Charles J

    2018-06-13

    The introduction of computer-aided design/computer-aided manufacturing (CAD/CAM) technology to the field of removable prosthodontics has recently made it possible to fabricate complete dentures from prepolymerized polymethyl methacrylate (PMMA) blocks, which are claimed to have better mechanical properties; however, no published reports have evaluated the mechanical properties of CAD/CAM PMMA. The purpose of this study was to compare the flexural strength, impact strength, and flexural modulus of two brands of CAD/CAM PMMA and a conventional heat-cured PMMA. Forty-five rectangular specimens (65 mm × 10 mm × 3 mm) were fabricated (15 CAD/CAM AvaDent PMMA specimens from AvaDent, 15 CAD/CAM Tizian PMMA specimens from Schütz Dental, 15 conventional Meliodent PMMA specimens from Heraeus Kulzer) and stored in distilled water at 37 ± 1°C for 7 days. Specimens (N = 15) in each group were subjected to the three-point bending test and to the impact strength test, employing the Charpy configuration on unnotched specimens. The morphology of the fractured specimens was studied under a scanning electron microscope (SEM). Statistical analysis was performed using one-way ANOVA and Tukey pairwise multiple comparisons with a 95% confidence interval. The Schütz Dental specimens showed the highest mean flexural strength (130.67 MPa) and impact strength (29.56 kJ/m²). The highest mean flexural modulus was recorded in the AvaDent group (2519.6 MPa). The conventional heat-cured group showed the lowest mean flexural strength (93.33 MPa), impact strength (14.756 kJ/m²), and flexural modulus (2117.2 MPa). Differences in means of flexural properties between AvaDent and Schütz Dental specimens were not statistically significant (p > 0.05). As the CAD/CAM PMMA specimens exhibited improved flexural strength, flexural modulus, and impact strength in comparison to the conventional heat-cured group, CAD/CAM dentures are expected to be more durable. Different brands of CAD/CAM PMMA may have inherent variations in mechanical properties. © 2018 by the American College of Prosthodontists.
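
    For reference, the three-point bending quantities compared here follow the standard beam formulas (support span $L$, width $b$, thickness $d$, failure load $F$, load-deflection slope $m$):

    $$ \sigma_f = \frac{3 F L}{2 b d^2}, \qquad E_f = \frac{L^3 m}{4 b d^3}, $$

    while unnotched Charpy impact strength is the absorbed energy divided by the specimen cross-section, which is why it is quoted per unit area.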

  2. Expectation maximization for hard X-ray count modulation profiles

    NASA Astrophysics Data System (ADS)

    Benvenuto, F.; Schwartz, R.; Piana, M.; Massone, A. M.

    2013-07-01

    Context. This paper is concerned with the image reconstruction problem when the measured data are solar hard X-ray modulation profiles obtained from the Reuven Ramaty High Energy Solar Spectroscopic Imager (RHESSI) instrument. Aims: Our goal is to demonstrate that a statistical iterative method classically applied to the image deconvolution problem is very effective when utilized to analyze count modulation profiles in solar hard X-ray imaging based on rotating modulation collimators. Methods: The algorithm described in this paper solves the maximum likelihood problem iteratively and encodes a positivity constraint into the iterative optimization scheme. The result is therefore a classical expectation maximization method, this time applied not to an image deconvolution problem but to image reconstruction from count modulation profiles. The technical reason that makes our implementation particularly effective in this application is the use of a very reliable stopping rule, which is able to regularize the solution while providing, at the same time, a very satisfactory Cash statistic (C-statistic). Results: The method is applied both to reproduce synthetic flaring configurations and to reconstruct images from experimental data corresponding to three real events. In the second case, expectation maximization, when compared to Pixon image reconstruction, shows comparable accuracy and a notably reduced computational burden; when compared to CLEAN, it shows better fidelity with respect to the measurements with comparable computational effectiveness. Conclusions: If optimally stopped, expectation maximization represents a very reliable method for image reconstruction in the RHESSI context when count modulation profiles are used as input data.
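
    The core of such an ML-EM scheme is a multiplicative update that preserves positivity; a generic Poisson-likelihood sketch (toy system matrix, not RHESSI's modulation response):

    ```python
    import numpy as np

    def mlem(A, y, n_iter=50):
        """ML-EM for y ~ Poisson(A @ x), x >= 0 (Richardson-Lucy form)."""
        x = np.ones(A.shape[1])
        sens = A.sum(axis=0)                 # sensitivity, A^T 1
        for _ in range(n_iter):
            proj = A @ x
            ratio = np.where(proj > 0, y / proj, 0.0)
            x *= (A.T @ ratio) / sens        # multiplicative update keeps x >= 0
        return x

    # Toy system: 40 "modulation profile" bins, 20 image pixels
    rng = np.random.default_rng(5)
    A = rng.random((40, 20))
    x_true = np.zeros(20)
    x_true[[4, 12]] = [50.0, 80.0]
    y = rng.poisson(A @ x_true)
    print(np.round(mlem(A, y), 1))
    ```

    In practice the number of iterations acts as the regularization knob, which is why the stopping rule matters so much here.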

  3. Statistical iterative reconstruction for streak artefact reduction when using multidetector CT to image the dento-alveolar structures.

    PubMed

    Dong, J; Hayakawa, Y; Kober, C

    2014-01-01

    When metallic prosthetic appliances and dental fillings are present in the oral cavity, metal-induced streak artefacts are unavoidable in CT images. The aim of this study was to develop a method for artefact reduction using statistical reconstruction on multidetector row CT images. Adjacent CT images often depict similar anatomical structures. Therefore, images with weak artefacts were first reconstructed using projection data from an artefact-free image in a neighbouring thin slice. Images with moderate and strong artefacts were then processed in sequence by successive iterative restoration, where the projection data were generated from the adjacent reconstructed slice. First, the basic maximum likelihood-expectation maximization algorithm was applied. Next, the ordered subset-expectation maximization algorithm was examined. Alternatively, a small region of interest setting was designated. Finally, a general-purpose graphics processing unit was applied in both situations. The algorithms reduced the metal-induced streak artefacts on multidetector row CT images when the sequential processing method was applied. The ordered subset-expectation maximization and small region of interest settings reduced the processing duration without apparent detriment. The general-purpose graphics processing unit realized high performance. A statistical reconstruction method was applied for streak artefact reduction. The alternative algorithms applied were effective. Both the software and hardware tools, such as ordered subset-expectation maximization, a small region of interest and a general-purpose graphics processing unit, achieved fast artefact correction.

  4. Social Facilitation Expectancies for Smoking: Psychometric Properties of a New Measure

    ERIC Educational Resources Information Center

    Schweizer, C. Amanda; Doran, Neal; Myers, Mark G.

    2014-01-01

    Objective: Expectancies about social outcomes for smoking are relevant to college student smokers, who frequently report "social smoking." A new measure, the Social Facilitation Expectancies (SFE) scale, was developed to assess these beliefs. Participants: The SFE was administered to undergraduate college student smokers ("N" =…

  5. Statistical and Measurement Properties of Features Used in Essay Assessment. Research Report. ETS RR-04-21

    ERIC Educational Resources Information Center

    Haberman, Shelby J.

    2004-01-01

    Statistical and measurement properties are examined for features used in essay assessment to determine the generalizability of the features across populations, prompts, and individuals. Data are employed from TOEFL® and GMAT® examinations and from writing for Criterion℠.

  6. Probabilistic Finite Elements (PFEM) structural dynamics and fracture mechanics

    NASA Technical Reports Server (NTRS)

    Liu, Wing-Kam; Belytschko, Ted; Mani, A.; Besterfield, G.

    1989-01-01

    The purpose of this work is to develop computationally efficient methodologies for assessing the effects of randomness in loads, material properties, and other aspects of a problem by a finite element analysis. The resulting group of methods is called probabilistic finite elements (PFEM). The overall objective of this work is to develop methodologies whereby the lifetime of a component can be predicted, accounting for the variability in the material and geometry of the component, the loads, and other aspects of the environment; and the range of response expected in a particular scenario can be presented to the analyst in addition to the response itself. Emphasis has been placed on methods which are not statistical in character; that is, they do not involve Monte Carlo simulations. The reason for this choice of direction is that Monte Carlo simulations of complex nonlinear response require a tremendous amount of computation. The focus of efforts so far has been on nonlinear structural dynamics. However, in the continuation of this project, emphasis will be shifted to probabilistic fracture mechanics so that the effect of randomness in crack geometry and material properties can be studied interactively with the effect of random load and environment.

  7. Phase Transition to Exact Susy

    NASA Astrophysics Data System (ADS)

    Clavelli, L.

    2007-04-01

    The anthropic principle is based on the observation that, within narrow bounds, the laws of physics are such as to have allowed the evolution of life. The string theoretic approach to understanding this observation is based on the expectation that the effective potential has an enormous number of local minima with different particle masses and perhaps totally different fundamental couplings and space-time topology. The vast majority of these alternative universes are totally inhospitable to life, having, for example, vacuum energies near the natural (Planck) scale. The statistics, however, are assumed to be such that a few of these local minima (and not more) have a low enough vacuum energy and suitable other properties to support life. In the inflationary era, the "multiverse" made successive transitions between the available minima until arriving at our current state of low vacuum energy. String theory, however, also suggests that the absolute minimum of the effective potential is exactly supersymmetric. Questions then arise as to why the inflationary era did not end by a transition to one of these, when will the universe make the phase transition to the exactly supersymmetric ground state, and what will be the properties of this final state.

  8. Homogeneity Pursuit

    PubMed Central

    Ke, Tracy; Fan, Jianqing; Wu, Yichao

    2014-01-01

    This paper explores the homogeneity of coefficients in high-dimensional regression, which extends the sparsity concept and is more general and suitable for many applications. Homogeneity arises when regression coefficients corresponding to neighboring geographical regions or a similar cluster of covariates are expected to be approximately the same. Sparsity corresponds to a special case of homogeneity with a large cluster known to be at zero. In this article, we propose a new method called clustering algorithm in regression via data-driven segmentation (CARDS) to explore homogeneity. New mathematical results are provided on the gain that can be achieved by exploring homogeneity. Statistical properties of two versions of CARDS are analyzed. In particular, the asymptotic normality of our proposed CARDS estimator is established, which reveals better estimation accuracy for homogeneous parameters than that obtained without homogeneity exploration. When our methods are combined with sparsity exploration, further efficiency can be achieved beyond the exploration of sparsity alone. This provides additional insights into the power of exploring low-dimensional structures in high-dimensional regression: homogeneity and sparsity. Our results also shed light on the properties of the fused Lasso. The newly developed method is further illustrated by simulation studies and applications to real data. Supplementary materials for this article are available online. PMID:26085701
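
    For context, the fused Lasso mentioned above encourages exactly this kind of homogeneity by penalizing differences between neighboring coefficients:

    $$ \min_{\beta}\; \tfrac{1}{2}\,\lVert y - X\beta \rVert_2^2 + \lambda_1 \sum_{j} \lvert \beta_j \rvert + \lambda_2 \sum_{j} \lvert \beta_{j+1} - \beta_j \rvert, $$

    whereas CARDS lets the grouping of coefficients be driven by the data rather than by a fixed neighbor ordering.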

  9. Physical soil quality indicators for monitoring British soils

    NASA Astrophysics Data System (ADS)

    Corstanje, Ron; Mercer, Theresa G.; Rickson, Jane R.; Deeks, Lynda K.; Newell-Price, Paul; Holman, Ian; Kechavarsi, Cedric; Waine, Toby W.

    2017-09-01

    Soil condition or quality determines its ability to deliver a range of functions that support ecosystem services, human health and wellbeing. The increasing policy imperative to implement successful soil monitoring programmes has resulted in the demand for reliable soil quality indicators (SQIs) for physical, biological and chemical soil properties. The selection of these indicators needs to ensure that they are sensitive and responsive to pressure and change, e.g. they change across space and time in relation to natural perturbations and land management practices. Using a logical sieve approach based on key policy-related soil functions, this research assessed whether physical soil properties can be used to indicate the quality of British soils in terms of their capacity to deliver ecosystem goods and services. The resultant prioritised list of physical SQIs was tested for robustness, spatial and temporal variability, and expected rate of change using statistical analysis and modelling. Seven SQIs were prioritised: soil packing density, soil water retention characteristics, aggregate stability, rate of soil erosion, depth of soil, soil structure (assessed by visual soil evaluation) and soil sealing. These all have direct relevance to current and likely future soil and environmental policy and are appropriate for implementation in soil monitoring programmes.

  10. Brownian Carnot engine

    PubMed Central

    Dinis, L.; Petrov, D.; Parrondo, J. M. R.; Rica, R. A.

    2016-01-01

    The Carnot cycle imposes a fundamental upper limit to the efficiency of a macroscopic motor operating between two thermal baths [1]. However, this bound needs to be reinterpreted at microscopic scales, where molecular bio-motors [2] and some artificial micro-engines [3-5] operate. As described by stochastic thermodynamics [6,7], energy transfers in microscopic systems are random and thermal fluctuations induce transient decreases of entropy, allowing for possible violations of the Carnot limit [8]. Here we report an experimental realization of a Carnot engine with a single optically trapped Brownian particle as the working substance. We present an exhaustive study of the energetics of the engine and analyse the fluctuations of the finite-time efficiency, showing that the Carnot bound can be surpassed for a small number of non-equilibrium cycles. As its macroscopic counterpart, the energetics of our Carnot device exhibits basic properties that one would expect to observe in any microscopic energy transducer operating with baths at different temperatures [9-11]. Our results characterize the sources of irreversibility in the engine and the statistical properties of the efficiency, an insight that could inspire new strategies in the design of efficient nano-motors. PMID:27330541

  11. Changes in diffusion path length with old age in diffuse optical tomography

    NASA Astrophysics Data System (ADS)

    Bonnéry, Clément; Leclerc, Paul-Olivier; Desjardins, Michèle; Hoge, Rick; Bherer, Louis; Pouliot, Philippe; Lesage, Frédéric

    2012-05-01

    Diffuse optical near-infrared imaging is increasingly being used in various neurocognitive contexts where changes in optical signals are interpreted through activation maps. Statistical population comparison of different age or clinical groups relies on the relatively homogeneous distribution of measurements across subjects in order to infer changes in brain function. In the context of an increasing use of diffuse optical imaging with older adult populations, changes in tissue properties and anatomy with age add further confounds. Few studies have investigated these changes with age. Duncan et al. measured the so-called diffusion path length factor (DPF) in a large population but did not explore beyond the age of 51, after which physiological and anatomical changes are expected to occur [Pediatr. Res. 39(5), 889-894 (1996)]. With increasing interest in studying the geriatric population with optical imaging, we studied changes in tissue properties in young and old subjects using both magnetic resonance imaging (MRI)-guided Monte Carlo simulations and time-domain diffuse optical imaging. Our results, measured in the frontal cortex, show changes in DPF that are smaller than those previously measured by Duncan et al. in a younger population.
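
    The DPF enters through the modified Beer-Lambert law, which is why it matters for activation mapping: for source-detector separation $d$ and chromophore extinction coefficient $\varepsilon$, a change in optical density obeys

    $$ \Delta \mathrm{OD} = \varepsilon \, \Delta c \, d \, \mathrm{DPF}, $$

    so any age-related change in DPF rescales every inferred concentration change $\Delta c$.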

  12. Predicting Constraints on Ultra-Light Axion Parameters due to LSST Observations

    NASA Astrophysics Data System (ADS)

    Given, Gabriel; Grin, Daniel

    2018-01-01

    Ultra-light axions (ULAs) are a type of dark matter or dark energy candidate (depending on the mass) that are predicted to have a mass between $10^{-33}$ and $10^{-18}$ eV. The Large Synoptic Survey Telescope (LSST) is expected to provide a large number of weak lensing observations, which will lower the statistical uncertainty on the convergence power spectrum. I began work with Daniel Grin to predict how accurately data from the LSST will be able to constrain ULA properties. I wrote Python code that takes a matter power spectrum calculated by axionCAMB and converts it to a convergence power spectrum. My code then takes derivatives of the convergence power spectrum with respect to several cosmological parameters; these derivatives will be used in Fisher matrix analysis to determine the sensitivity of LSST observations to axion parameters.
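
    A Fisher forecast of the kind described reduces to numerical derivatives of the spectrum plus a matrix inverse; a toy sketch in which a two-parameter power law stands in for the axionCAMB-derived convergence spectrum:

    ```python
    import numpy as np

    def fisher_matrix(model, theta0, steps, cov_inv):
        """Gaussian Fisher matrix F_ij = dC/dtheta_i . Cinv . dC/dtheta_j
        from central finite differences of a spectrum model."""
        derivs = []
        for i, h in enumerate(steps):
            tp, tm = np.array(theta0, float), np.array(theta0, float)
            tp[i] += h
            tm[i] -= h
            derivs.append((model(tp) - model(tm)) / (2 * h))
        D = np.array(derivs)
        return D @ cov_inv @ D.T

    # Toy "convergence power spectrum": amplitude and tilt over 10 ell-bands
    ells = np.linspace(100, 1000, 10)
    model = lambda th: th[0] * (ells / 500.0) ** th[1]
    C0 = model([1.0, -1.0])
    cov_inv = np.diag(1.0 / (0.05 * C0)**2)      # 5% errors per band

    F = fisher_matrix(model, [1.0, -1.0], [1e-4, 1e-4], cov_inv)
    print(np.sqrt(np.diag(np.linalg.inv(F))))    # marginalized 1-sigma forecasts
    ```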

  13. Premelting, fluctuations, and coarse-graining of water-ice interfaces.

    PubMed

    Limmer, David T; Chandler, David

    2014-11-14

    Using statistical field theory supplemented with molecular dynamics simulations, we consider premelting on the surface of ice as a generic consequence of broken hydrogen bonds at the boundary between the condensed and gaseous phases. A procedure for coarse-graining molecular configurations onto a continuous scalar order parameter field is discussed, which provides a convenient representation of the interface between locally crystal-like and locally liquid-like regions. A number of interfacial properties are straightforwardly evaluated using this procedure such as the average premelting thickness and surface tension. The temperature and system size dependence of the premelting layer thickness calculated in this way confirms the characteristic logarithmic growth expected for the scalar field theory that the system is mapped onto through coarse-graining, though remains finite due to long-ranged interactions. Finally, from explicit simulations the existence of a premelting layer is shown to be insensitive to bulk lattice geometry, exposed crystal face, and curvature.

  14. Premelting, fluctuations, and coarse-graining of water-ice interfaces

    NASA Astrophysics Data System (ADS)

    Limmer, David T.; Chandler, David

    2014-11-01

    Using statistical field theory supplemented with molecular dynamics simulations, we consider premelting on the surface of ice as a generic consequence of broken hydrogen bonds at the boundary between the condensed and gaseous phases. A procedure for coarse-graining molecular configurations onto a continuous scalar order parameter field is discussed, which provides a convenient representation of the interface between locally crystal-like and locally liquid-like regions. A number of interfacial properties, such as the average premelting thickness and surface tension, are straightforwardly evaluated using this procedure. The temperature and system size dependence of the premelting layer thickness calculated in this way confirms the characteristic logarithmic growth expected for the scalar field theory that the system is mapped onto through coarse-graining, though the thickness remains finite due to long-ranged interactions. Finally, from explicit simulations the existence of a premelting layer is shown to be insensitive to bulk lattice geometry, exposed crystal face, and curvature.

  15. Significance testing testate amoeba water table reconstructions

    NASA Astrophysics Data System (ADS)

    Payne, Richard J.; Babeshko, Kirill V.; van Bellen, Simon; Blackford, Jeffrey J.; Booth, Robert K.; Charman, Dan J.; Ellershaw, Megan R.; Gilbert, Daniel; Hughes, Paul D. M.; Jassey, Vincent E. J.; Lamentowicz, Łukasz; Lamentowicz, Mariusz; Malysheva, Elena A.; Mauquoy, Dmitri; Mazei, Yuri; Mitchell, Edward A. D.; Swindles, Graeme T.; Tsyganov, Andrey N.; Turner, T. Edward; Telford, Richard J.

    2016-04-01

    Transfer functions are valuable tools in palaeoecology, but their output may not always be meaningful. A recently developed statistical test ('randomTF') offers the potential to distinguish between reconstructions that are more likely to be useful and those that are less so. We applied this test to a large number of reconstructions of peatland water table depth based on testate amoebae. Contrary to our expectations, a substantial majority (25 of 30) of these reconstructions gave non-significant results (P > 0.05). The underlying reasons for this outcome are unclear. We found no significant correlation between the randomTF P-value and transfer function performance, the properties of the training set and reconstruction, or measures of transfer function fit. These results give cause for concern, but we believe it would be extremely premature to discount the results of non-significant reconstructions. We stress the need for more critical assessment of transfer function output, replication of results and ecologically informed interpretation of palaeoecological data.
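
    For orientation, the logic of a randomTF-style test can be sketched in a few lines: the variance in the fossil data explained by the actual reconstruction is compared with the null distribution obtained when transfer functions are trained on random environmental variables. This is a schematic of the idea, not the published implementation:

    ```python
    import numpy as np

    def random_tf_pvalue(observed_r2, fit_random_r2, n_sim=999):
        """observed_r2: variance explained by the real reconstruction;
        fit_random_r2: callable returning the variance explained when the
        transfer function is trained on one simulated random variable."""
        null = np.array([fit_random_r2() for _ in range(n_sim)])
        # One-sided Monte Carlo P-value with the usual +1 correction.
        return (1 + np.sum(null >= observed_r2)) / (n_sim + 1)
    ```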

  16. Retrograde spins of near-Earth asteroids from the Yarkovsky effect.

    PubMed

    La Spina, A; Paolicchi, P; Kryszczyńska, A; Pravec, P

    2004-03-25

    Dynamical resonances in the asteroid belt are the gateway for the production of near-Earth asteroids (NEAs). To generate the observed number of NEAs, however, requires the injection of many asteroids into those resonant regions. Collisional processes have long been claimed as a possible source, but difficulties with that idea have led to the suggestion that orbital drift arising from the Yarkovsky effect dominates the injection process. (The Yarkovsky effect is a force arising from differential heating: the 'afternoon' side of an asteroid is warmer than the 'morning' side.) The two models predict different rotational properties of NEAs: the usual collisional theories are consistent with a nearly isotropic distribution of rotation vectors, whereas the 'Yarkovsky model' predicts an excess of retrograde rotations. Here we report that the spin vectors of NEAs show a strong and statistically significant excess of retrograde rotations, quantitatively consistent with the theoretical expectations of the Yarkovsky model.
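
    The headline result, an excess of retrograde rotators over the isotropic expectation, amounts to a one-sided binomial test; a sketch with hypothetical counts (not the paper's actual sample):

    ```python
    from scipy.stats import binomtest

    # Suppose 15 of 21 NEAs with known spin vectors rotate retrograde;
    # test against the isotropic expectation p = 0.5.
    result = binomtest(15, n=21, p=0.5, alternative="greater")
    print(f"one-sided P = {result.pvalue:.4f}")
    ```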

  17. Parity Nonconservation in Proton-Proton and Proton-Water Scattering at 1.5 GeV/c

    DOE R&D Accomplishments Database

    Mischke, R. E.; Bowman, J. D.; Carlini, R.; MacArthur, D.; Nagle, D. E.; Frauenfelder, H.; Harper, R. W.; Yuan, V.; McDonald, A. B.; Talaga, R. L.

    1984-07-01

    Experiments searching for parity nonconservation in the scattering of 1.5 GeV/c (800 MeV) polarized protons from an unpolarized water target and a liquid hydrogen target are described. The intensity of the incident proton beam was measured upstream and downstream of the target by a pair of ionization detectors. The beam helicity was reversed at a 30-Hz rate. Auxiliary detectors monitored beam properties that could give rise to false effects. The result for the longitudinal asymmetry from the water is A_L = (1.7 ± 3.3 ± 1.4) × 10^{-7}, where the first error is statistical and the second is an estimate of systematic effects. The hydrogen data yield a preliminary result of A_L = (1.0 ± 1.6) × 10^{-7}. The systematic errors for p-p are expected to be < 1 × 10^{-7}.
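
    The generic estimator behind such measurements is the helicity-correlated asymmetry A_L = (Y+ − Y−)/(Y+ + Y−) averaged over helicity-reversal pairs; a minimal sketch (array names and the error-from-scatter convention are assumptions):

    ```python
    import numpy as np

    def longitudinal_asymmetry(yield_plus, yield_minus):
        """Pairwise asymmetries from detector yields in the two beam
        helicity states; the statistical error is taken from the
        scatter of the pairwise asymmetries."""
        a = (yield_plus - yield_minus) / (yield_plus + yield_minus)
        return a.mean(), a.std(ddof=1) / np.sqrt(a.size)
    ```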

  18. Time-integrated sampling of fluvial suspended sediment: a simple methodology for small catchments

    NASA Astrophysics Data System (ADS)

    Phillips, J. M.; Russell, M. A.; Walling, D. E.

    2000-10-01

    Fine-grained (<62.5 µm) suspended sediment transport is a key component of the geochemical flux in most fluvial systems. The highly episodic nature of suspended sediment transport imposes a significant constraint on the design of sampling strategies aimed at characterizing the biogeochemical properties of such sediment. A simple sediment sampler, utilizing ambient flow to induce sedimentation by settling, is described. The sampler can be deployed unattended in small streams to collect time-integrated suspended sediment samples. In laboratory tests involving chemically dispersed sediment, the sampler collected a maximum of 71% of the input sample mass. However, under natural conditions, the existence of composite particles or flocs can be expected to increase the trapping efficiency significantly. Field trials confirmed that the particle size composition and total carbon content of the sediment collected by the sampler were statistically representative of the ambient suspended sediment.

  19. The Coalescent Process in Models with Selection

    PubMed Central

    Kaplan, N. L.; Darden, T.; Hudson, R. R.

    1988-01-01

    Statistical properties of the process describing the genealogical history of a random sample of genes are obtained for a class of population genetics models with selection. For models with selection, in contrast to models without selection, the distribution of this process, the coalescent process, depends on the distribution of the frequencies of alleles in the ancestral generations. If the ancestral frequency process can be approximated by a diffusion, then the mean and the variance of the number of segregating sites due to selectively neutral mutations in random samples can be numerically calculated. The calculations are greatly simplified if the frequencies of the alleles are tightly regulated. If the mutation rates between alleles maintained by balancing selection are low, then the number of selectively neutral segregating sites in a random sample of genes is expected to substantially exceed the number predicted under a neutral model. PMID:3066685
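
    For reference, the neutral baseline against which that excess is measured is Watterson's expectation E[S] = θ Σ_{i=1}^{n−1} 1/i for a sample of n genes; a one-line sketch (a textbook result, not code from the paper):

    ```python
    import numpy as np

    def expected_segregating_sites(n, theta):
        # Neutral-model expectation for the number of segregating
        # sites in a sample of n genes (theta = 4*N*mu for diploids).
        return theta * np.sum(1.0 / np.arange(1, n))
    ```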

  20. Towards simulating and quantifying the light-cone EoR 21-cm signal

    NASA Astrophysics Data System (ADS)

    Mondal, Rajesh; Bharadwaj, Somnath; Datta, Kanan K.

    2018-02-01

    The light-cone (LC) effect causes the Epoch of Reionization (EoR) 21-cm signal T_b(\hat{n}, ν) to evolve significantly along the line-of-sight (LoS) direction ν. In the first part of this paper, we present a method to properly incorporate the LC effect in simulations of the EoR 21-cm signal that includes peculiar velocities. Subsequently, we discuss how to quantify the second-order statistics of the EoR 21-cm signal in the presence of the LC effect. We demonstrate that the 3D power spectrum P(k) fails to quantify the entire information because it assumes the signal to be ergodic and periodic, whereas the LC effect breaks these conditions along the LoS. Considering a LC simulation centred at redshift 8 where the mean neutral fraction drops from 0.65 to 0.35 across the box, we find that P(k) misses out ~40 per cent of the information at the two ends of the 17.41 MHz simulation bandwidth. The multifrequency angular power spectrum (MAPS) C_{ℓ}(ν_1, ν_2) quantifies the statistical properties of T_b(\hat{n}, ν) without assuming the signal to be ergodic and periodic along the LoS. We expect this to quantify the entire statistical information of the EoR 21-cm signal. We apply MAPS to our LC simulation and present preliminary results for the EoR 21-cm signal.
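
    A flat-sky sketch of the MAPS estimator may help fix ideas: C_ℓ(ν_1, ν_2) correlates the angular Fourier modes of each pair of frequency channels without averaging along the line of sight. Normalization conventions and binning choices below are illustrative assumptions, not the paper's code:

    ```python
    import numpy as np

    def maps_flat_sky(cube, n_bins=20):
        """cube: (n_nu, n_x, n_y) brightness-temperature slices.
        Returns C[bin, nu1, nu2], the binned cross-power of angular modes."""
        n_nu, nx, ny = cube.shape
        ft = np.fft.fftn(cube, axes=(1, 2))
        kx = np.fft.fftfreq(nx)[:, None]
        ky = np.fft.fftfreq(ny)[None, :]
        k = np.hypot(kx, ky).ravel()
        edges = np.linspace(0.0, k.max(), n_bins + 1)
        which = np.digitize(k, edges) - 1
        modes = ft.reshape(n_nu, -1)
        C = np.zeros((n_bins, n_nu, n_nu))
        for b in range(n_bins):
            sel = which == b
            if sel.any():
                m = modes[:, sel]
                C[b] = (m @ m.conj().T).real / sel.sum()
        return C
    ```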

  1. Statistics of baryon correlation functions in lattice QCD

    NASA Astrophysics Data System (ADS)

    Wagman, Michael L.; Savage, Martin J.; Nplqcd Collaboration

    2017-12-01

    A systematic analysis of the structure of single-baryon correlation functions calculated with lattice QCD is performed, with a particular focus on characterizing the structure of the noise associated with quantum fluctuations. The signal-to-noise problem in these correlation functions is shown, as long suspected, to result from a sign problem. The log-magnitude and complex phase are found to be approximately described by normal and wrapped normal distributions respectively. Properties of circular statistics are used to understand the emergence of a large time noise region where standard energy measurements are unreliable. Power-law tails in the distribution of baryon correlation functions, associated with stable distributions and "Lévy flights," are found to play a central role in their time evolution. A new method of analyzing correlation functions is considered for which the signal-to-noise ratio of energy measurements is constant, rather than exponentially degrading, with increasing source-sink separation time. This new method includes an additional systematic uncertainty that can be removed by performing an extrapolation, and the signal-to-noise problem reemerges in the statistics of this extrapolation. It is demonstrated that this new method allows accurate results for the nucleon mass to be extracted from the large-time noise region inaccessible to standard methods. The observations presented here are expected to apply to quantum Monte Carlo calculations more generally. Similar methods to those introduced here may lead to practical improvements in analysis of noisier systems.
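
    The circular-statistics diagnostics mentioned above are easy to state: for the complex phases of the correlation functions, the mean resultant length R measures phase coherence, and for a wrapped normal distribution R = exp(−σ²/2). A minimal sketch (standard definitions, not the collaboration's analysis code):

    ```python
    import numpy as np

    def circular_moments(phases):
        """Mean resultant length R, circular variance 1 - R, and the
        circular mean of a sample of phases (in radians)."""
        z = np.exp(1j * phases).mean()
        return np.abs(z), 1.0 - np.abs(z), np.angle(z)
    ```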

  2. A finer view of the conditional galaxy luminosity function and magnitude-gap statistics

    NASA Astrophysics Data System (ADS)

    Trevisan, M.; Mamon, G. A.

    2017-10-01

    The gap between first- and second-ranked galaxy magnitudes in groups is often considered a tracer of their merger histories, which in turn may affect galaxy properties, and also serves to test galaxy luminosity functions (LFs). We remeasure the conditional luminosity function (CLF) of the Main Galaxy Sample of the SDSS in an appropriately cleaned subsample of groups from the Yang catalogue. We find that, at low group masses, our best-fitting CLF has steeper satellite high ends, yet higher ratios of characteristic satellite to central luminosities in comparison with the CLF of Yang et al. The observed fractions of groups with large and small magnitude gaps as well as the Tremaine & Richstone statistics are not compatible with either a single Schechter LF or with a Schechter-like satellite plus lognormal central LF. These gap statistics, which naturally depend on the size of the subsamples, and also on the maximum projected radius, Rmax, for defining the second brightest galaxy, can only be reproduced with two-component CLFs if we allow small gap groups to preferentially have two central galaxies, as expected when groups merge. Finally, we find that the trend of higher gap for higher group velocity dispersion, σv, at a given richness, discovered by Hearin et al., is strongly reduced when we consider σv in bins of richness, and virtually disappears when we use group mass instead of σv. This limits the applicability of gaps in refining cosmographic studies based on cluster counts.

  3. Statistical deprojection of galaxy pairs

    NASA Astrophysics Data System (ADS)

    Nottale, Laurent; Chamaraux, Pierre

    2018-06-01

    Aims: The purpose of the present paper is to provide methods of statistical analysis of the physical properties of galaxy pairs. We perform this study in order to apply it later to catalogs of isolated pairs of galaxies, especially two new catalogs we recently constructed that contain ≈1000 and ≈13 000 pairs, respectively. We are particularly interested in the dynamics of those pairs, including the determination of their masses. Methods: We could not compute the dynamical parameters directly since the necessary data are incomplete. Indeed, we only have at our disposal one component of the intervelocity between the members, namely along the line of sight, and two components of their interdistance, i.e., the projection on the sky plane. Moreover, we know only one point of each galaxy orbit. Hence we need statistical methods to find the probability distribution of 3D interdistances and 3D intervelocities from their projections; we designate these methods by the term deprojection. Results: We proceed in two steps to determine and use the deprojection methods. First we derive the probability distributions expected for the various relevant projected quantities, namely the intervelocity v_z, the interdistance r_p, their ratio, and the product r_p v_z^2, which is involved in mass determination. In a second step, we propose various methods of deprojection of those parameters based on the previous analysis. We start from a histogram of the projected data and we apply inversion formulae to obtain the deprojected distributions; lastly, we test the methods by numerical simulations, which also allow us to determine the uncertainties involved.
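
    A forward Monte Carlo of the projection step is a useful check on such inversion formulae: draw isotropic orientations, project, and compare the simulated histograms with the analytic distributions. The sketch below assumes, for illustration only, independent isotropic orientations for the separation and velocity vectors:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    def project_pairs(r3d, v3d, n=100_000):
        """Sky-plane separation r_p and line-of-sight speed |v_z| for a
        pair with true separation r3d and relative speed v3d."""
        mu_r = rng.uniform(-1.0, 1.0, n)      # cos(angle to line of sight)
        r_p = r3d * np.sqrt(1.0 - mu_r**2)    # projection on the sky plane
        mu_v = rng.uniform(-1.0, 1.0, n)
        v_z = np.abs(v3d * mu_v)              # line-of-sight component
        return r_p, v_z
    ```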

  4. Predicting Cortical Dark/Bright Asymmetries from Natural Image Statistics and Early Visual Transforms

    PubMed Central

    Cooper, Emily A.; Norcia, Anthony M.

    2015-01-01

    The nervous system has evolved in an environment with structure and predictability. One of the ubiquitous principles of sensory systems is the creation of circuits that capitalize on this predictability. Previous work has identified predictable non-uniformities in the distributions of basic visual features in natural images that are relevant to the encoding tasks of the visual system. Here, we report that the well-established statistical distributions of visual features, such as visual contrast, spatial scale, and depth, differ between bright and dark image components. Following this analysis, we go on to trace how these differences in natural images translate into different patterns of cortical input that arise from the separate bright (ON) and dark (OFF) pathways originating in the retina. We use models of these early visual pathways to transform natural images into statistical patterns of cortical input. The models include the receptive fields and non-linear response properties of the magnocellular (M) and parvocellular (P) pathways, with their ON and OFF pathway divisions. The results indicate that there are regularities in visual cortical input beyond those that have previously been appreciated from the direct analysis of natural images. In particular, several dark/bright asymmetries provide a potential account for recently discovered asymmetries in how the brain processes visual features, such as violations of classic energy-type models. On the basis of our analysis, we expect that the dark/bright dichotomy in natural images plays a key role in the generation of both cortical and perceptual asymmetries. PMID:26020624

  5. Optimism bias leads to inconclusive results - an empirical study

    PubMed Central

    Djulbegovic, Benjamin; Kumar, Ambuj; Magazin, Anja; Schroen, Anneke T.; Soares, Heloisa; Hozo, Iztok; Clarke, Mike; Sargent, Daniel; Schell, Michael J.

    2010-01-01

    Objective: Optimism bias refers to unwarranted belief in the efficacy of new therapies. We assessed the impact of optimism bias on the proportion of trials that did not answer their research question successfully, and explored whether poor accrual or optimism bias is responsible for inconclusive results. Study Design: Systematic review. Setting: Retrospective analysis of a consecutive series of phase III randomized controlled trials (RCTs) performed under the aegis of National Cancer Institute Cooperative groups. Results: 359 trials (374 comparisons) enrolling 150,232 patients were analyzed. 70% (262/374) of the trials generated conclusive results according to the statistical criteria. Investigators made definitive statements related to the treatment preference in 73% (273/374) of studies. Investigators' judgments and statistical inferences were concordant in 75% (279/374) of trials. Investigators consistently overestimated their expected treatment effects, but to a significantly larger extent for inconclusive trials. The median ratio of expected over observed hazard ratio or odds ratio was 1.34 (range 0.19-15.40) in conclusive trials compared to 1.86 (range 1.09-12.00) in inconclusive studies (p<0.0001). Only 17% of the trials had treatment effects that matched the original researchers' expectations. Conclusion: Formal statistical inference is sufficient to answer the research question in 75% of RCTs. The answers to the other 25% depend mostly on subjective judgments, which at times are in conflict with statistical inference. Optimism bias significantly contributes to inconclusive results. PMID:21163620

  6. Optimism bias leads to inconclusive results-an empirical study.

    PubMed

    Djulbegovic, Benjamin; Kumar, Ambuj; Magazin, Anja; Schroen, Anneke T; Soares, Heloisa; Hozo, Iztok; Clarke, Mike; Sargent, Daniel; Schell, Michael J

    2011-06-01

    Optimism bias refers to unwarranted belief in the efficacy of new therapies. We assessed the impact of optimism bias on the proportion of trials that did not answer their research question successfully and explored whether poor accrual or optimism bias is responsible for inconclusive results. Systematic review. Retrospective analysis of a consecutive series of phase III randomized controlled trials (RCTs) performed under the aegis of National Cancer Institute Cooperative groups. Three hundred fifty-nine trials (374 comparisons) enrolling 150,232 patients were analyzed. Seventy percent (262 of 374) of the trials generated conclusive results according to the statistical criteria. Investigators made definitive statements related to the treatment preference in 73% (273 of 374) of studies. Investigators' judgments and statistical inferences were concordant in 75% (279 of 374) of trials. Investigators consistently overestimated their expected treatment effects but to a significantly larger extent for inconclusive trials. The median ratio of expected and observed hazard ratio or odds ratio was 1.34 (range: 0.19-15.40) in conclusive trials compared with 1.86 (range: 1.09-12.00) in inconclusive studies (P<0.0001). Only 17% of the trials had treatment effects that matched the original researchers' expectations. Formal statistical inference is sufficient to answer the research question in 75% of RCTs. The answers to the other 25% depend mostly on subjective judgments, which at times are in conflict with statistical inference. Optimism bias significantly contributes to inconclusive results.

  7. Thermodynamics and statistical mechanics. [thermodynamic properties of gases

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The basic thermodynamic properties of gases are reviewed and the relations between them are derived from the first and second laws. The elements of statistical mechanics are then formulated and the partition function is derived. The classical form of the partition function is used to obtain the Maxwell-Boltzmann distribution of kinetic energies in the gas phase and the equipartition of energy theorem is given in its most general form. The thermodynamic properties are all derived as functions of the partition function. Quantum statistics are reviewed briefly and the differences between the Boltzmann distribution function for classical particles and the Fermi-Dirac and Bose-Einstein distributions for quantum particles are discussed.
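
    For reference, the standard relations alluded to here, with Z(T, V, N) the canonical partition function (a textbook summary, not a quotation from the report):

    \[
    F = -k_B T \ln Z, \qquad
    U = k_B T^2 \left(\frac{\partial \ln Z}{\partial T}\right)_{V,N}, \qquad
    S = \frac{U - F}{T}, \qquad
    p = k_B T \left(\frac{\partial \ln Z}{\partial V}\right)_{T,N}.
    \]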

  8. Students' Appreciation of Expectation and Variation as a Foundation for Statistical Understanding

    ERIC Educational Resources Information Center

    Watson, Jane M.; Callingham, Rosemary A.; Kelly, Ben A.

    2007-01-01

    This study presents the results of a partial credit Rasch analysis of in-depth interview data exploring statistical understanding of 73 school students in 6 contextual settings. The use of Rasch analysis allowed the exploration of a single underlying variable across contexts, which included probability sampling, representation of temperature…

  9. Excluding Institutionalized Elderly from Surveys: Consequences for Income and Poverty Statistics

    ERIC Educational Resources Information Center

    Peeters, Hans; Debels, Annelies; Verpoorten, Rika

    2013-01-01

    Growing life expectancy and changes in financial, marriage and labour markets have placed the income position of the elderly at the center of scientific and political discourse. As a consequence, the last decades witnessed the publication of various influential reports that contained comparative statistics on old age income inequalities on the…

  10. Gender Differences in Public Relations Students' Career Attitudes: A Benchmark Study.

    ERIC Educational Resources Information Center

    Farmer, Betty; Waugh, Lisa

    1999-01-01

    Explores students' perceptions of gender issues in public relations. Finds that there were no statistically significant differences in male and female students' desires to perform managerial activities, but there were statistically significant differences in several areas (i.e. female students expect to earn less money starting out and to be…

  11. Advance Report of Final Mortality Statistics, 1985.

    ERIC Educational Resources Information Center

    Monthly Vital Statistics Report, 1987

    1987-01-01

    This document presents mortality statistics for 1985 for the entire United States. Data analysis and discussion of these factors is included: death and death rates; death rates by age, sex, and race; expectation of life at birth and at specified ages; causes of death; infant mortality; and maternal mortality. Highlights reported include: (1) the…

  12. Business Statistics and Management Science Online: Teaching Strategies and Assessment of Student Learning

    ERIC Educational Resources Information Center

    Sebastianelli, Rose; Tamimi, Nabil

    2011-01-01

    Given the expected rise in the number of online business degrees, issues regarding quality and assessment in online courses will become increasingly important. The authors focus on the suitability of online delivery for quantitative business courses, specifically business statistics and management science. They use multiple approaches to assess…

  13. What Does Average Really Mean? Making Sense of Statistics

    ERIC Educational Resources Information Center

    DeAngelis, Karen J.; Ayers, Steven

    2009-01-01

    The recent shift toward greater accountability has put many educational leaders in a position where they are expected to collect and use increasing amounts of data to inform their decision making. Yet, because many programs that prepare administrators, including school business officials, do not require a statistics course or a course that is more…

  14. Assessing the Disconnect between Grade Expectation and Achievement in a Business Statistics Course

    ERIC Educational Resources Information Center

    Berenson, Mark L.; Ramnarayanan, Renu; Oppenheim, Alan

    2015-01-01

    In an institutional review board--approved study aimed at evaluating differences in learning between a large-sized introductory business statistics course section using courseware assisted examinations compared with small-sized sections using traditional paper-and-pencil examinations, there appeared to be a severe disconnect between the final…

  15. New Standards Require Teaching More Statistics: Are Preservice Secondary Mathematics Teachers Ready?

    ERIC Educational Resources Information Center

    Lovett, Jennifer N.; Lee, Hollylynne S.

    2017-01-01

    Mathematics teacher education programs often need to respond to changing expectations and standards for K-12 curriculum and accreditation. New standards for high school mathematics in the United States include a strong emphasis in statistics. This article reports results from a mixed methods cross-institutional study examining the preparedness of…

  16. Nonparametric Bayesian predictive distributions for future order statistics

    Treesearch

    Richard A. Johnson; James W. Evans; David W. Green

    1999-01-01

    We derive the predictive distribution for a specified order statistic, determined from a future random sample, under a Dirichlet process prior. Two variants of the approach are treated and some limiting cases studied. A practical application to monitoring the strength of lumber is discussed including choices of prior expectation and comparisons made to a Bayesian...

  17. Dynamical Constraints On The Galaxy-Halo Connection

    NASA Astrophysics Data System (ADS)

    Desmond, Harry

    2017-07-01

    Dark matter halos comprise the bulk of the universe's mass, yet must be probed by the luminous galaxies that form within them. A key goal of modern astrophysics, therefore, is to robustly relate the visible and dark mass, which to first order means relating the properties of galaxies and halos. This may be expected not only to improve our knowledge of galaxy formation, but also to enable high-precision cosmological tests using galaxies and hence maximise the utility of future galaxy surveys. As halos are inaccessible to observations (as galaxies are to N-body simulations), this relation requires an additional modelling step. The aim of this thesis is to develop and evaluate models of the galaxy-halo connection using observations of galaxy dynamics. In particular, I build empirical models based on the technique of halo abundance matching for five key dynamical scaling relations of galaxies (the Tully-Fisher, Faber-Jackson, mass-size and mass discrepancy-acceleration relations, and the Fundamental Plane), which relate their baryon distributions and rotation or velocity dispersion profiles. I then develop a statistical scheme based on approximate Bayesian computation to compare the predicted and measured values of a number of summary statistics describing the relations' important features. This not only provides quantitative constraints on the free parameters of the models, but also allows absolute goodness-of-fit measures to be formulated. I find some features to be naturally accounted for by an abundance matching approach and others to impose new constraints on the galaxy-halo connection; the remainder are challenging to account for and may imply galaxy-halo correlations beyond the scope of basic abundance matching. Besides providing concrete statistical tests of specific galaxy formation theories, these results will be of use for guiding the inputs of empirical and semi-analytic galaxy formation models, which require galaxy-halo correlations to be imposed by hand. As galaxy datasets become larger and more precise in the future, we may expect these methods to continue providing insight into the relation between the visible and dark matter content of the universe and the physical processes that underlie it.
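
    The approximate Bayesian computation scheme mentioned above is, in its simplest rejection form, only a few lines; the sketch below is generic (the threshold, distance metric and function names are assumptions, not the thesis pipeline):

    ```python
    import numpy as np

    def abc_rejection(observed, simulate, prior_draw, n_draws=50_000, eps=0.1):
        """Keep prior draws whose simulated summary statistics land
        within a relative distance eps of the observed ones.
        simulate(theta) -> summary-statistic vector;
        prior_draw() -> one parameter vector theta."""
        scale = np.abs(observed) + 1e-12
        kept = [theta for theta in (prior_draw() for _ in range(n_draws))
                if np.max(np.abs(simulate(theta) - observed) / scale) < eps]
        return np.array(kept)  # samples from the approximate posterior
    ```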

  18. Estimation of the geochemical threshold and its statistical significance

    USGS Publications Warehouse

    Miesch, A.T.

    1981-01-01

    A statistic is proposed for estimating the geochemical threshold and its statistical significance, or it may be used to identify a group of extreme values that can be tested for significance by other means. The statistic is the maximum gap between adjacent values in an ordered array after each gap has been adjusted for the expected frequency. The values in the ordered array are geochemical values transformed by either ln(x - α) or ln(β - x) and then standardized so that the mean is zero and the variance is unity. The expected frequency is taken from a fitted normal curve with unit area. The midpoint of an adjusted gap that exceeds the corresponding critical value may be taken as an estimate of the geochemical threshold, and the associated probability indicates the likelihood that the threshold separates two geochemical populations. The adjusted gap test may fail to identify threshold values if the variation tends to be continuous from background values to the higher values that reflect mineralized ground. However, the test will serve to identify other anomalies that may be too subtle to have been noted by other means. © 1981.
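
    The adjusted-gap computation described above can be sketched directly (the transformation parameters are assumed to have been chosen beforehand; this is an illustration of the procedure, not Miesch's program):

    ```python
    import numpy as np
    from scipy.stats import norm

    def largest_adjusted_gap(values):
        """Standardize, order, and weight each gap between neighbours by
        the expected unit-area normal frequency at the gap midpoint, so
        that sparse tails are not over-counted."""
        z = np.sort((values - values.mean()) / values.std(ddof=1))
        gaps = np.diff(z)
        mids = 0.5 * (z[1:] + z[:-1])
        adjusted = gaps * norm.pdf(mids)
        i = np.argmax(adjusted)
        return adjusted[i], mids[i]  # statistic and threshold estimate
    ```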

  19. Are there ergodic limits to evolution? Ergodic exploration of genome space and convergence

    PubMed Central

    McLeish, Tom C. B.

    2015-01-01

    We examine the analogy between evolutionary dynamics and statistical mechanics to include the fundamental question of ergodicity—the representative exploration of the space of possible states (in the case of evolution this is genome space). Several properties of evolutionary dynamics are identified that allow a generalization of the ergodic dynamics, familiar in dynamical systems theory, to evolution. Two classes of evolved biological structure then arise, differentiated by the qualitative duration of their evolutionary time scales. The first class has an ergodicity time scale (the time required for representative genome exploration) longer than available evolutionary time, and has incompletely explored the genotypic and phenotypic space of its possibilities. This case generates no expectation of convergence to an optimal phenotype or possibility of its prediction. The second, more interesting, class exhibits an evolutionary form of ergodicity—essentially all of the structural space within the constraints of slower evolutionary variables have been sampled; the ergodicity time scale for the system evolution is less than the evolutionary time. In this case, some convergence towards similar optima may be expected for equivalent systems in different species where both possess ergodic evolutionary dynamics. When the fitness maximum is set by physical, rather than co-evolved, constraints, it is additionally possible to make predictions of some properties of the evolved structures and systems. We propose four structures that emerge from evolution within genotypes whose fitness is induced from their phenotypes. Together, these result in an exponential speeding up of evolution, when compared with complete exploration of genomic space. We illustrate a possible case of application and a prediction of convergence together with attaining a physical fitness optimum in the case of invertebrate compound eye resolution. PMID:26640648

  20. Combined electronic and thermodynamic approaches for enhancing the thermoelectric properties of Ti-doped PbTe.

    PubMed

    Komisarchik, G; Gelbstein, Y; Fuks, D

    2016-11-30

    Lead telluride based compounds are of great interest due to their enhanced thermoelectric transport properties. Nevertheless, the available donor-type impurities in this class of materials are currently limited, and alternative donor impurities are still required for optimizing the thermoelectric performance. In the current research, titanium is examined as a donor impurity in PbTe. Although titanium is known to form resonant levels above the conduction band in PbTe, it does not enhance the thermopower beyond the classical predictions. Recent experiments showed that alloying with a small amount of Ti (∼0.1 at%) gives a significant increase in the figure of merit. In the current research, ab initio calculations were applied in order to correlate the reported experimental results with a thermoelectric optimization model. It was found that a Ti concentration of ∼1.4 at% in the Pb sublattice is expected to maximize the thermoelectric power factor. Using a statistical thermodynamic approach, and in agreement with the previously reported appearance of a secondary intermetallic phase, the actual Ti solubility limit in PbTe is found to be ∼0.3 at%. Based on the proposed model, the mechanism for the formation of the previously observed secondary phase is attributed to phase separation reactions, characterized by a positive enthalpy of formation in the system. By extrapolation of the obtained ab initio results, it is demonstrated that Ti-doping concentrations lower than those previously reported experimentally are expected to provide power factor values close to the maximal one, making doping with Ti a promising opportunity for the generation of highly efficient n-type PbTe-based thermoelectric materials.

  1. Are there ergodic limits to evolution? Ergodic exploration of genome space and convergence.

    PubMed

    McLeish, Tom C B

    2015-12-06

    We examine the analogy between evolutionary dynamics and statistical mechanics to include the fundamental question of ergodicity: the representative exploration of the space of possible states (in the case of evolution this is genome space). Several properties of evolutionary dynamics are identified that allow a generalization of the ergodic dynamics, familiar in dynamical systems theory, to evolution. Two classes of evolved biological structure then arise, differentiated by the qualitative duration of their evolutionary time scales. The first class has an ergodicity time scale (the time required for representative genome exploration) longer than available evolutionary time, and has incompletely explored the genotypic and phenotypic space of its possibilities. This case generates no expectation of convergence to an optimal phenotype or possibility of its prediction. The second, more interesting, class exhibits an evolutionary form of ergodicity: essentially all of the structural space within the constraints of slower evolutionary variables have been sampled; the ergodicity time scale for the system evolution is less than the evolutionary time. In this case, some convergence towards similar optima may be expected for equivalent systems in different species where both possess ergodic evolutionary dynamics. When the fitness maximum is set by physical, rather than co-evolved, constraints, it is additionally possible to make predictions of some properties of the evolved structures and systems. We propose four structures that emerge from evolution within genotypes whose fitness is induced from their phenotypes. Together, these result in an exponential speeding up of evolution, when compared with complete exploration of genomic space. We illustrate a possible case of application and a prediction of convergence together with attaining a physical fitness optimum in the case of invertebrate compound eye resolution.

  2. X-RAY ABSORPTION BY THE WARM-HOT INTERGALACTIC MEDIUM IN THE HERCULES SUPERCLUSTER

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ren, Bin; Fang, Taotao; Buote, David A., E-mail: fangt@xmu.edu.cn

    2014-02-10

    "Missing baryons", in the form of warm-hot intergalactic medium (WHIM), are expected to reside in cosmic filamentary structures that can be traced by signposts such as large-scale galaxy superstructures. The clear detection of an X-ray absorption line in the Sculptor Wall demonstrated the success of using galaxy superstructures as a signpost to search for the WHIM. Here we present an XMM-Newton Reflection Grating Spectrometer observation of the blazar Mkn 501, located in the Hercules Supercluster. We detected an O VII Kα absorption line at the 98.7% level (2.5σ) at the redshift of the foreground Hercules Supercluster. The derived properties of the absorber are consistent with theoretical expectations of the WHIM. We discuss the implication of our detection for the search for the "missing baryons". While this detection shows again that using signposts is a very effective strategy to search for the WHIM, follow-up observations are crucial both to strengthen the statistical significance of the detection and to rule out other interpretations. A local, z ∼ 0 O VII Kα absorption line was also clearly detected at the 4σ level, and we discuss its implications for our understanding of the hot gas content of our Galaxy.

  3. Development and evaluation of social cognitive measures related to adolescent physical activity.

    PubMed

    Dewar, Deborah L; Lubans, David Revalds; Morgan, Philip James; Plotnikoff, Ronald C

    2013-05-01

    This study aimed to develop and evaluate the construct validity and reliability of modernized social cognitive measures relating to physical activity behaviors in adolescents. An instrument was developed based on constructs from Bandura's Social Cognitive Theory and included the following scales: self-efficacy, situation (perceived physical environment), social support, behavioral strategies, and outcome expectations and expectancies. The questionnaire was administered in a sample of 171 adolescents (age = 13.6 ± 1.2 years, females = 61%). Confirmatory factor analysis was employed to examine model-fit for each scale using multiple indices, including chi-square index, comparative-fit index (CFI), goodness-of-fit index (GFI), and the root mean square error of approximation (RMSEA). Reliability properties were also examined (ICC and Cronbach's alpha). Each scale represented a statistically sound measure: fit indices indicated each model to be an adequate-to-exact fit to the data; internal consistency was acceptable to good (α = 0.63-0.79); rank order repeatability was strong (ICC = 0.82-0.91). Results support the validity and reliability of social cognitive scales relating to physical activity among adolescents. As such, the developed scales have utility for the identification of potential social cognitive correlates of youth physical activity, mediators of physical activity behavior changes and the testing of theoretical models based on Social Cognitive Theory.
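
    The internal-consistency figures quoted (Cronbach's alpha) have a simple closed form; a minimal sketch (the respondents-by-items array layout is an assumption):

    ```python
    import numpy as np

    def cronbach_alpha(items):
        """items: (n_respondents, k_items) array of item scores.
        alpha = k/(k-1) * (1 - sum of item variances / variance of total)."""
        k = items.shape[1]
        item_var = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return k / (k - 1) * (1.0 - item_var / total_var)
    ```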

  4. Patient (customer) expectations in hospitals.

    PubMed

    Bostan, Sedat; Acuner, Taner; Yilmaz, Gökhan

    2007-06-01

    Patients' expectations are one of the determining factors of healthcare service. The purpose of this study is to measure patients' expectations, based on patients' rights. The study was conducted with a Likert-type survey in the population of Trabzon. The analyses showed that the level of patient expectation was high for the factor of receiving information and at an acceptable level for the other factors. Statistically significant associations were found between patients' expectations and age, sex, education, health insurance, and family income (p<0.05). According to this study, the current legal regulations have higher standards than the expectations of the patients. The high level of patient satisfaction is interpreted as a consequence of this low level of expectation. It is suggested that educational and public-awareness work on patients' rights should be undertaken in order to raise patients' expectations.

  5. The level crossing rates and associated statistical properties of a random frequency response function

    NASA Astrophysics Data System (ADS)

    Langley, Robin S.

    2018-03-01

    This work is concerned with the statistical properties of the frequency response function of the energy of a random system. Earlier studies have considered the statistical distribution of the function at a single frequency, or alternatively the statistics of a band-average of the function. In contrast the present analysis considers the statistical fluctuations over a frequency band, and results are obtained for the mean rate at which the function crosses a specified level (or equivalently, the average number of times the level is crossed within the band). Results are also obtained for the probability of crossing a specified level at least once, the mean rate of occurrence of peaks, and the mean trough-to-peak height. The analysis is based on the assumption that the natural frequencies and mode shapes of the system have statistical properties that are governed by the Gaussian Orthogonal Ensemble (GOE), and the validity of this assumption is demonstrated by comparison with numerical simulations for a random plate. The work has application to the assessment of the performance of dynamic systems that are sensitive to random imperfections.
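
    The GOE assumption underlying the analysis is easy to reproduce numerically: natural-frequency statistics are modelled by the eigenvalues of a random real symmetric matrix. A minimal sketch (normalization conventions vary; this is illustrative only):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def goe_eigenvalues(n):
        """One member of the Gaussian Orthogonal Ensemble: symmetrize a
        matrix of independent normal entries and take its eigenvalues."""
        a = rng.standard_normal((n, n))
        return np.linalg.eigvalsh((a + a.T) / 2.0)
    ```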

  6. Response properties of ON-OFF retinal ganglion cells to high-order stimulus statistics.

    PubMed

    Xiao, Lei; Gong, Han-Yan; Gong, Hai-Qing; Liang, Pei-Ji; Zhang, Pu-Ming

    2014-10-17

    Visual stimulus statistics are the fundamental parameters providing the reference for studying visual coding rules. In this study, multi-electrode extracellular recording experiments were designed and implemented on bullfrog retinal ganglion cells to explore the neural response properties to changes in stimulus statistics. Changes in low-order stimulus statistics, such as intensity and contrast, were clearly reflected in the neuronal firing rate. However, it was difficult to distinguish changes in high-order statistics, such as skewness and kurtosis, based on the neuronal firing rate alone. The neuronal temporal filtering and sensitivity characteristics were further analyzed. We observed that the peak-to-peak amplitude of the temporal filter and the neuronal sensitivity, obtained from either neuronal ON spikes or OFF spikes, could exhibit significant changes when the high-order stimulus statistics were changed. These results indicate that in the retina, the neuronal response properties may be reliable and powerful in carrying some complex and subtle visual information.

  7. Using the Modification Index and Standardized Expected Parameter Change for Model Modification

    ERIC Educational Resources Information Center

    Whittaker, Tiffany A.

    2012-01-01

    Model modification is oftentimes conducted after discovering a badly fitting structural equation model. During the modification process, the modification index (MI) and the standardized expected parameter change (SEPC) are 2 statistics that may be used to aid in the selection of parameters to add to a model to improve the fit. The purpose of this…

  8. Long-range correlations, geometrical structure, and transport properties of macromolecular solutions. The equivalence of configurational statistics and geometrodynamics of large molecules.

    PubMed

    Mezzasalma, Stefano A

    2007-12-04

    A special theory of Brownian relativity was previously proposed to describe the universal picture arising in ideal polymer solutions. In brief, it redefines a Gaussian macromolecule in a 4-dimensional diffusive spacetime, establishing a (weak) Lorentz-Poincaré invariance between liquid and polymer Einstein's laws for Brownian movement. Here, aimed at inquiring into the effect of correlations, we deepen the extension of the special theory to a general formulation. The previous statistical equivalence, for dynamic trajectories of liquid molecules and static configurations of macromolecules, rather obvious in uncorrelated systems, is enlarged by a more general principle of equivalence, for configurational statistics and geometrodynamics. Accordingly, the three geodesic motion, continuity, and field equations could be rewritten, and a number of scaling behaviors were recovered in a spacetime endowed with a general static isotropic metric (i.e., for equilibrium polymer solutions). We also dealt with universality in the volume fraction and, unexpectedly, found that a hyperscaling relation of the form (average size) × (diffusivity) × (viscosity)^{1/2} ∼ f(N^0, φ^0) is fulfilled in several regimes, both in the chain monomer number (N) and the polymer volume fraction (φ). Entangled macromolecular dynamics was treated as a geodesic light deflection, entanglements acting in close analogy to the field generated by a spherically symmetric mass source, where length fluctuations of the chain primitive path behave as azimuth fluctuations of its shape. Finally, the general transformation rule for translational and diffusive frames gives a coordinate gauge invariance, suggesting a widened Lorentz-Poincaré symmetry for Brownian statistics. We expect this approach to find effective applications to solutions of arbitrarily large molecules displaying a variety of structures, where the effect of geometry is more explicit and significant in itself (e.g., surfactants, lipids, proteins).

  9. Measuring the Sensitivity of Single-locus “Neutrality Tests” Using a Direct Perturbation Approach

    PubMed Central

    Garrigan, Daniel; Lewontin, Richard; Wakeley, John

    2010-01-01

    A large number of statistical tests have been proposed to detect natural selection based on a sample of variation at a single genetic locus. These tests measure the deviation of the allelic frequency distribution observed within populations from the distribution expected under a set of assumptions that includes both neutral evolution and equilibrium population demography. The present study considers a new way to assess the statistical properties of these tests of selection, by their behavior in response to direct perturbations of the steady-state allelic frequency distribution, unconstrained by any particular nonequilibrium demographic scenario. Results from Monte Carlo computer simulations indicate that most tests of selection are more sensitive to perturbations of the allele frequency distribution that increase the variance in allele frequencies than to perturbations that decrease the variance. Simulations also demonstrate that it requires, on average, 4N generations (N is the diploid effective population size) for tests of selection to relax to their theoretical, steady-state distributions following different perturbations of the allele frequency distribution to its extremes. This relatively long relaxation time highlights the fact that these tests are not robust to violations of the other assumptions of the null model besides neutrality. Lastly, genetic variation arising under an example of a regularly cycling demographic scenario is simulated. Tests of selection performed on this last set of simulated data confirm the confounding nature of these tests for the inference of natural selection, under a demographic scenario that likely holds for many species. The utility of using empirical, genomic distributions of test statistics, instead of the theoretical steady-state distribution, is discussed as an alternative for improving the statistical inference of natural selection. PMID:19744997

  10. Expected values and variances of Bragg peak intensities measured in a nanocrystalline powder diffraction experiment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Öztürk, Hande; Noyan, I. Cevdet

    A rigorous study of sampling and intensity statistics applicable to a powder diffraction experiment as a function of crystallite size is presented. Our analysis yields approximate equations for the expected value, variance and standard deviation of both the number of diffracting grains and the corresponding diffracted intensity for a given Bragg peak. The classical formalism published in 1948 by Alexander, Klug & Kummer [J. Appl. Phys. (1948), 19, 742-753] appears here as a special case, limited to large crystallite sizes. It is observed that both the Lorentz probability expression and the statistics equations used in the classical formalism are inapplicable for nanocrystalline powder samples.

  11. Expected values and variances of Bragg peak intensities measured in a nanocrystalline powder diffraction experiment

    DOE PAGES

    Öztürk, Hande; Noyan, I. Cevdet

    2017-08-24

    A rigorous study of sampling and intensity statistics applicable to a powder diffraction experiment as a function of crystallite size is presented. Our analysis yields approximate equations for the expected value, variance and standard deviation of both the number of diffracting grains and the corresponding diffracted intensity for a given Bragg peak. The classical formalism published in 1948 by Alexander, Klug & Kummer [J. Appl. Phys. (1948), 19, 742-753] appears here as a special case, limited to large crystallite sizes. It is observed that both the Lorentz probability expression and the statistics equations used in the classical formalism are inapplicable for nanocrystalline powder samples.

  12. How to feed environmental studies with soil information to address SDG 'Zero hunger'

    NASA Astrophysics Data System (ADS)

    Hendriks, Chantal; Stoorvogel, Jetse; Claessens, Lieven

    2017-04-01

    As pledged by UN Sustainable Development Goal (SDG) 2, there should be zero hunger, food security, improved food nutrition and sustainable agriculture by 2030. Environmental studies are essential to reach SDG 2, and soils play a crucial role, especially in addressing 'Zero hunger'. This study aims to discuss the connection between the supply and demand of soil data for environmental studies, and how this connection can be improved, illustrating different methods. As many studies are resource constrained, the options to collect new soil data are limited. Therefore, it is essential to use existing soil information, auxiliary data and collected field data efficiently. Existing soil data are criticised in the literature as i) being dominantly qualitative, ii) being often outdated, iii) not being spatially exhaustive, iv) being available only at general scales, v) being inconsistent, and vi) lacking quality assessments. Additional field data can help to overcome some of these problems. Outdated maps can, for example, be improved by collecting additional soil data in areas where changes in soil properties are expected. Existing soil data can also provide insight into the expected soil variability and, as such, these data can be used for the design of sampling schemes. Existing soil data are also crucial input for studies on digital soil mapping because they give information on parent material and the relative age of soils. Digital soil mapping is commonly applied as an efficient method to quantitatively predict the spatial variation of soil properties. However, the efficiency of digital soil mapping may increase if we look at functional soil properties (e.g. nutrient availability, available water capacity) of the soil profile that vary in two-dimensional space rather than at basic soil properties of individual soil layers (e.g. texture, organic matter content, nitrogen content) that vary in three-dimensional space. Digital soil mapping techniques are based on statistical relations between soil properties and environmental variables. However, in some cases a more mechanistic approach, based on pedological knowledge, might be more convincing for predicting soil properties. This study showed that the soil science community is able to provide the required soil information for environmental studies. However, there is no single solution that provides the required soil data. Case studies are needed to prove that certain methods meet the data requirements, after which these case studies can serve as beacons for other studies. We illustrate data availability and methodological innovations for a case study in Kenya, where the CGIAR Research Program on Climate Change, Agriculture and Food Security (CCAFS) aims to contribute to SDG 2.

  13. Influence of nonlinear effects on statistical properties of the radiation from SASE FEL

    NASA Astrophysics Data System (ADS)

    Saldin, E. L.; Schneidmiller, E. A.; Yurkov, M. V.

    1998-02-01

    The paper presents an analysis of the statistical properties of the radiation from a self-amplified spontaneous emission (SASE) free-electron laser operating in the nonlinear mode. The present approach allows one to calculate the following statistical properties of the SASE FEL radiation: time and spectral field correlation functions, the distribution of the fluctuations of the instantaneous radiation power, the distribution of the energy in the electron bunch, the distribution of the radiation energy after a monochromator installed at the FEL amplifier exit, and the radiation spectrum. It has been observed that the statistics of the instantaneous radiation power from a SASE FEL operating in the nonlinear regime change significantly with respect to the linear regime. All numerical results presented in the paper have been calculated for the 70 nm SASE FEL at the TESLA Test Facility under construction at DESY.

  14. Statistical mechanics based on fractional classical and quantum mechanics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Korichi, Z.; Meftah, M. T., E-mail: mewalid@yahoo.com

    2014-03-15

    The purpose of this work is to study some problems in statistical mechanics based on fractional classical and quantum mechanics. At the first stage, we present the thermodynamic properties of the classical ideal gas and of a system of N classical oscillators; in both cases, the Hamiltonian contains fractional exponents of the phase space variables (position and momentum). At the second stage, in the context of fractional quantum mechanics, we calculate the thermodynamic properties of black-body radiation, study Bose-Einstein statistics with the related problem of condensation, and study Fermi-Dirac statistics.

  15. Statistical behavior of the tensile property of heated cotton fiber

    USDA-ARS?s Scientific Manuscript database

    The temperature dependence of the tensile property of single cotton fibers was studied in the range of 160-300°C using the Favimat test, and its statistical behavior was interpreted in terms of structural changes. The tenacity of control cotton fiber was well described by the single Weibull distribution,...
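
    Fitting the single (two-parameter) Weibull distribution mentioned above is routine with scipy; a sketch with hypothetical tenacity values (the data and the zero-location constraint are assumptions for illustration):

    ```python
    import numpy as np
    from scipy.stats import weibull_min

    tenacity = np.array([28.1, 31.4, 25.9, 33.0, 29.7, 27.2, 30.5, 26.8])
    # Fix the location at zero for the standard two-parameter form.
    shape, loc, scale = weibull_min.fit(tenacity, floc=0.0)
    ```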

  16. A critique of Rasch residual fit statistics.

    PubMed

    Karabatsos, G

    2000-01-01

    In test analysis involving the Rasch model, a large degree of importance is placed on the "objective" measurement of individual abilities and item difficulties. The degree to which the objectivity properties are attained, of course, depends on the degree to which the data fit the Rasch model. It is therefore important to utilize fit statistics that accurately and reliably detect the person-item response inconsistencies that threaten the measurement objectivity of persons and items. Given this argument, it is somewhat surprising that far more emphasis is placed on the objective measurement of persons and items than on the measurement quality of Rasch fit statistics. This paper provides a critical analysis of the residual fit statistics of the Rasch model, arguably the most often used fit statistics, in an effort to illustrate that the task of Rasch fit analysis is not as simple and straightforward as it appears to be. The faulty statistical properties of the residual fit statistics do not allow either a convenient or a straightforward approach to Rasch fit analysis. For instance, given a residual fit statistic, the use of a single minimum critical value for misfit diagnosis across different testing situations, where the situations vary in sample and test properties, leads to both the overdetection and underdetection of misfit. To improve this situation, it is argued that psychometricians need to implement residual-free Rasch fit statistics that are based on the number of Guttman response errors, or use indices that are statistically optimal in detecting measurement disturbances.
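
    For readers unfamiliar with the statistics under critique: for dichotomous Rasch data the standardized residuals and the usual outfit/infit mean squares are computed as below (standard definitions; the null distributions of these summaries are exactly what the paper argues cannot be trusted across testing situations):

    ```python
    import numpy as np

    def rasch_residual_fit(x, theta, b):
        """x: (persons, items) 0/1 responses; theta: person abilities;
        b: item difficulties (both on the logit scale)."""
        p = 1.0 / (1.0 + np.exp(-(theta[:, None] - b[None, :])))
        w = p * (1.0 - p)                 # model variance of each response
        z = (x - p) / np.sqrt(w)          # standardized residuals
        outfit = (z**2).mean(axis=0)      # unweighted item mean square
        infit = (w * z**2).sum(axis=0) / w.sum(axis=0)  # information-weighted
        return z, outfit, infit
    ```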

  17. Quantum formalism for classical statistics

    NASA Astrophysics Data System (ADS)

    Wetterich, C.

    2018-06-01

    In static classical statistical systems the problem of information transport from a boundary to the bulk finds a simple description in terms of wave functions or density matrices. While the transfer matrix formalism is a type of Heisenberg picture for this problem, we develop here the associated Schrödinger picture that keeps track of the local probabilistic information. The transport of the probabilistic information between neighboring hypersurfaces obeys a linear evolution equation, and therefore the superposition principle for the possible solutions. Operators are associated to local observables, with rules for the computation of expectation values similar to quantum mechanics. We discuss how non-commutativity naturally arises in this setting. Also other features characteristic of quantum mechanics, such as complex structure, change of basis or symmetry transformations, can be found in classical statistics once formulated in terms of wave functions or density matrices. We construct for every quantum system an equivalent classical statistical system, such that time in quantum mechanics corresponds to the location of hypersurfaces in the classical probabilistic ensemble. For suitable choices of local observables in the classical statistical system one can, in principle, compute all expectation values and correlations of observables in the quantum system from the local probabilistic information of the associated classical statistical system. Realizing a static memory material as a quantum simulator for a given quantum system is not a matter of principle, but rather of practical simplicity.

  18. Free-energy and the brain

    PubMed Central

    Friston, Karl J.; Stephan, Klaas E.

    2009-01-01

    If one formulates Helmholtz’s ideas about perception in terms of modern-day theories one arrives at a model of perceptual inference and learning that can explain a remarkable range of neurobiological facts. Using constructs from statistical physics it can be shown that the problems of inferring what causes our sensory input and learning causal regularities in the sensorium can be resolved using exactly the same principles. Furthermore, inference and learning can proceed in a biologically plausible fashion. The ensuing scheme rests on Empirical Bayes and hierarchical models of how sensory information is generated. The use of hierarchical models enables the brain to construct prior expectations in a dynamic and context-sensitive fashion. This scheme provides a principled way to understand many aspects of the brain’s organisation and responses. In this paper, we suggest that these perceptual processes are just one emergent property of systems that conform to a free-energy principle. The free-energy considered here represents a bound on the surprise inherent in any exchange with the environment, under expectations encoded by its state or configuration. A system can minimise free-energy by changing its configuration to change the way it samples the environment, or to change its expectations. These changes correspond to action and perception, respectively, and lead to an adaptive exchange with the environment that is characteristic of biological systems. This treatment implies that the system’s state and structure encode an implicit and probabilistic model of the environment. We will look at models entailed by the brain and how minimisation of free-energy can explain its dynamics and structure. PMID:19325932

  19. Observations and modeling of EMIC wave properties in the presence of multiple ion species as function of magnetic local time

    NASA Astrophysics Data System (ADS)

    Lee, Justin H.; Angelopoulos, Vassilis

    2014-11-01

    Electromagnetic ion cyclotron (EMIC) wave generation and propagation in Earth's magnetosphere depend on readily measurable hot (a few to tens of keV) plasma sheet ions, elusive plasmaspheric or ionospheric cold (sub-eV to a few eV) ions, and partially heated warm ions (tens to hundreds of eV). Previous work has assumed all low-energy ions are cold and has not considered possible effects of warm ions. Using measurements by multiple Time History of Events and Macroscale Interactions during Substorms spacecraft, we analyze four typical EMIC wave events in the four magnetic local time sectors and consider the properties of both cold and warm ions supplied from previous statistical studies to interpret the wave observations using linear theory. As expected, we find that dusk EMIC waves grow due to the presence of drifting hot anisotropic protons and cold plasmaspheric ions with a dominant cold proton component. Near midnight, EMIC waves are less common because warm heavy ions that suppress wave growth are more abundant there. The waves can grow when cold, plume-like density enhancements are present, however. Dawn EMIC waves, known for their peculiar properties, are generated away from the equator and change polarization during propagation through the warm plasma cloak. Noon EMIC waves can also be generated nonlocally and their properties modified during propagation by a plasmaspheric plume combined with low-energy ions from solar and terrestrial sources. Accounting for multiple ion species, measured wave dispersion, and propagation characteristics can explain previously elusive EMIC wave properties and is therefore important for future studies of EMIC wave effects on energetic particle depletion.

  20. Cannabis expectancies in substance misusers: French validation of the Marijuana Effect Expectancy Questionnaire.

    PubMed

    Guillem, Eric; Notides, Christine; Vorspan, Florence; Debray, Marcel; Nieto, Isabel; Leroux, Mayliss; Lépine, Jean-Pierre

    2011-01-01

    The aim of this study was to evaluate the psychometric properties of the French version of the Marijuana Effect Expectancy Questionnaire (48 items) and to study cannabis expectancies according to patterns of substance use and psychiatric disorders (DSM-IV). A sample of 263 subjects (average age 33.1 years [SD = 8.7], 56% men) consisting of cannabis users (n = 64), psychiatric inpatients (n = 175, most of whom were hospitalized for withdrawal), and a control group (n = 24) completed the questionnaire. Internal reliability was good (α = .87) and temporal reliability was satisfactory, with 24 of 48 items having a significant κ ≥ .41. Factor analysis showed four main factors that explained 42.1% of the total variance. Women feared Cognitive Impairment and Negative Effects, and Negative Behavioral Effects, more than men did. Onset age of cannabis use, onset age of abuse, and abuse and dependence were associated with fewer negative expectancies. Cannabis-dependent subjects differed from abusers in reporting stronger Relaxation and Social Facilitation expectancies. Patients with major depressive episodes, panic disorder, social anxiety disorder, or posttraumatic stress disorder feared negative effects the most. Schizophrenic patients expected more Perceptual Enhancement and Craving. The French version of the Marijuana Effect Expectancy Questionnaire has good psychometric properties and is valid for assessing cannabis expectancies in adolescents and adults with substance use disorders. Copyright © American Academy of Addiction Psychiatry.

  1. Correlation between the different therapeutic properties of Chinese medicinal herbs and delayed luminescence.

    PubMed

    Pang, Jingxiang; Fu, Jialei; Yang, Meina; Zhao, Xiaolei; van Wijk, Eduard; Wang, Mei; Fan, Hua; Han, Jinxiang

    2016-03-01

    In the practice and principle of Chinese medicine, herbal materials are classified according to their therapeutic properties. 'Cold' and 'heat' are the most important classes of Chinese medicinal herbs according to the theory of traditional Chinese medicine (TCM). In this work, delayed luminescence (DL) was measured for different samples of Chinese medicinal herbs using a sensitive photomultiplier detection system. A comparison of DL parameters, including mean intensity and statistical entropy, was undertaken to discriminate between the 'cold' and 'heat' properties of Chinese medicinal herbs. The results suggest that there are significant differences in mean intensity and statistical entropy, and that this method combined with statistical analysis may provide novel parameters for the characterization of Chinese medicinal herbs in relation to their energetic properties. Copyright © 2015 John Wiley & Sons, Ltd.

  2. Weighted analysis methods for mapped plot forest inventory data: Tables, regressions, maps and graphs

    Treesearch

    Paul C. Van Deusen; Linda S. Heath

    2010-01-01

    Weighted estimation methods for analysis of mapped plot forest inventory data are discussed. The appropriate weighting scheme can vary depending on the type of analysis and graphical display. Both statistical issues and user expectations need to be considered in these methods. A weighting scheme is proposed that balances statistical considerations and the logical...

  3. 76 FR 28483 - Self-Regulatory Organizations; NYSE Arca, Inc.; Notice of Filing of Proposed Rule Change To List...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-17

    ... given day. In seeking to achieve each Fund's daily investment objective, the Sponsor uses a mathematical... Registration Statements. The Sponsor expects the Funds to have a statistical correlation \\15\\ over time of -.95... the underlying Index or Benchmark. The statistical measure of correlation is known as the...

  4. Are Young Children with Cochlear Implants Sensitive to the Statistics of Words in the Ambient Spoken Language?

    ERIC Educational Resources Information Center

    Guo, Ling-Yu; McGregor, Karla K.; Spencer, Linda J.

    2015-01-01

    Purpose: The purpose of this study was to determine whether children with cochlear implants (CIs) are sensitive to statistical characteristics of words in the ambient spoken language, whether that sensitivity changes in expected ways as their spoken lexicon grows, and whether that sensitivity varies with unilateral or bilateral implantation.…

  5. Development of computer-assisted instruction application for statistical data analysis android platform as learning resource

    NASA Astrophysics Data System (ADS)

    Hendikawati, P.; Arifudin, R.; Zahid, M. Z.

    2018-03-01

    This study aims to design an Android statistical data analysis application that can be accessed through mobile devices, making it easier for users to access. The application covers various topics in basic statistics along with parametric statistical data analysis. The output of the application system is a parametric statistical data analysis that can be used by students, lecturers, and other users who need the results of statistical calculations quickly and in an easily understood form. The Android application is developed using the Java programming language. The server-side programming language is PHP with the CodeIgniter framework, and the database used is MySQL. The system development methodology used is the Waterfall methodology, with stages of analysis, design, coding, testing, implementation, and system maintenance. This statistical data analysis application is expected to support statistics lecturing activities and to make it easier for students to understand statistical analysis on mobile devices.

  6. Who took the "x" out of expectancy-value theory? A psychological mystery, a substantive-methodological synergy, and a cross-national generalization.

    PubMed

    Nagengast, Benjamin; Marsh, Herbert W; Scalas, L Francesca; Xu, Man K; Hau, Kit-Tai; Trautwein, Ulrich

    2011-08-01

    Expectancy-value theory (EVT) is a dominant theory of human motivation. Historically, the Expectancy × Value interaction, in which motivation is high only if both expectancy and value are high, was central to EVT. However, the Expectancy × Value interaction mysteriously disappeared from published research more than 25 years ago. Using large representative samples of 15-year-olds (N = 398,750) from 57 diverse countries, we attempted to solve this mystery by testing Expectancy × Value interactions using latent-variable models with interactions. Expectancy (science self-concept), value (enjoyment of science), and the Expectancy × Value interaction all had statistically significant positive effects on both engagement in science activities and intentions of pursuing scientific careers; these results were similar for the total sample and for nearly all of the 57 countries considered separately. This study, apparently the strongest cross-national test of EVT ever undertaken, supports the generalizability of EVT predictions--including the "lost" Expectancy × Value interaction.
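
    The paper itself tests the interaction with latent-variable models; as a simplified observed-score illustration of what an Expectancy × Value interaction means, one can fit a regression with a product term (all data below are synthetic, assuming NumPy and statsmodels):

    ```python
    # Sketch: an Expectancy x Value interaction via a product term in OLS.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    n = 2000
    expectancy = rng.normal(size=n)    # stand-in for science self-concept
    value = rng.normal(size=n)         # stand-in for enjoyment of science
    # Simulate an outcome that genuinely contains an interaction:
    engagement = (0.4 * expectancy + 0.3 * value
                  + 0.15 * expectancy * value + rng.normal(size=n))

    X = sm.add_constant(np.column_stack([expectancy, value, expectancy * value]))
    fit = sm.OLS(engagement, X).fit()
    print(fit.summary(xname=["const", "expectancy", "value", "exp_x_value"]))
    # A positive, significant product term is the "lost" interaction:
    # motivation is highest when expectancy AND value are both high.
    ```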

  7. Extending Landauer's bound from bit erasure to arbitrary computation

    NASA Astrophysics Data System (ADS)

    Wolpert, David

    The minimal thermodynamic work required to erase a bit, known as Landauer's bound, has been extensively investigated both theoretically and experimentally. However, when viewed as a computation that maps inputs to outputs, bit erasure has a very special property: the output does not depend on the input. Existing analyses of the thermodynamics of bit erasure implicitly exploit this property, and thus cannot be directly extended to analyze the computation of arbitrary input-output maps. Here we show how to extend these earlier analyses of bit erasure to analyze the thermodynamics of arbitrary computations. Doing this establishes a formal connection between the thermodynamics of computers and much of theoretical computer science. We use this extension to analyze the thermodynamics of the canonical ``general purpose computer'' considered in computer science theory: a universal Turing machine (UTM). We consider a UTM which maps input programs to output strings, where inputs are drawn from an ensemble of random binary sequences, and prove: i) The minimal work needed by a UTM to run some particular input program X and produce output Y is the Kolmogorov complexity of Y minus the log of the ``algorithmic probability'' of Y. This minimal amount of thermodynamic work has a finite upper bound, which is independent of the output Y, depending only on the details of the UTM. ii) The expected work needed by a UTM to compute some given output Y is infinite. As a corollary, the overall expected work to run a UTM is infinite. iii) The expected work needed by an arbitrary Turing machine T (not necessarily universal) to compute some given output Y can be either infinite or finite, depending on Y and the details of T. To derive these results we must combine ideas from nonequilibrium statistical physics with fundamental results from computer science, such as Levin's coding theorem and other theorems about universal computation. I would like to acknowledge the Santa Fe Institute, Grant No. TWCF0079/AB47 from the Templeton World Charity Foundation, Grant No. FQXi-RHl3-1349 from the FQXi foundation, and Grant No. CHE-1648973 from the U.S. National Science Foundation.
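
    Result (i) can be restated symbolically; the notation below is ours (K_U for prefix Kolmogorov complexity, m_U for algorithmic probability), and the paper's exact normalizations may differ:

    ```latex
    % Our notation, a hedged restatement of result (i), not the paper's equations.
    \[
      W_{\min}(Y) \;\propto\; K_U(Y) - \log_2 m_U(Y),
      \qquad
      m_U(Y) \;=\; \sum_{p \,:\, U(p) = Y} 2^{-|p|} .
    \]
    % Levin's coding theorem gives K_U(Y) = -\log_2 m_U(Y) + O(1), so the
    % difference is bounded by a UTM-dependent constant independent of Y,
    % consistent with the finite, output-independent upper bound stated above.
    ```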

  8. Fast Prediction of Blast Damage from Airbursts: An Empirical Monte Carlo approach

    NASA Astrophysics Data System (ADS)

    Brown, Peter G.; Stokan, Ed

    2016-10-01

    The February 15, 2013 Chelyabinsk airburst was the first modern bolide whose associated shockwave caused blast damage at the ground (Popova et al., 2013). Near-Earth Object (NEO) impacts in the Chelyabinsk-size range (~20 m) are expected to occur every few decades (Boslough et al., 2015), and we therefore expect ground damage from meteoric airbursts to be the next planetary defense threat to be confronted. With pre-impact detections of small NEOs certain to become more common, decision makers will be faced with estimating blast damage from impactors with uncertain physical properties on short timescales. High-fidelity numerical bolide entry models have been developed in recent years (e.g., Boslough and Crawford, 2008; Shuvalov et al., 2013), but the wide range in a priori data about strength, fragmentation behavior, and other physical properties for a specific impactor makes predictions of bolide behavior difficult. The long computational running times for hydrocode models make the exploration of a wide parameter space challenging in the days to hours before an actual impact. Our approach to this problem is to use an analytical bolide entry model, the triggered-progressive fragmentation model (TPFM) developed by ReVelle (2005), within a Monte Carlo formalism. In particular, we couple this model with empirical constraints on the statistical spread in strength for meter-scale impactors from Brown et al. (2015) based on the observed height at maximum bolide brightness. We also use the correlation of peak bolide brightness with total energy as given by Brown (2016) as a proxy for fragmentation behavior. Using these constraints, we are able to quickly generate a large set of realizations of probable bolide energy deposition curves and produce simple estimates of expected blast damage using existing analytical relations. We validate this code with the known parameters of the Chelyabinsk airburst and explore how changes to the entry conditions of the observed bolide may have modified the blast damage at the ground. We will also present how this approach could be used in an actual short-warning impact scenario.

  9. [Diabetes mellitus: Contribution to changes in the life expectancy in Mexico 1990, 2000, and 2010].

    PubMed

    Dávila-Cervantes, Claudio A; Pardo Montaño, Ana M

    2014-01-01

    To analyze the level and trend of diabetes mellitus (DM) in Mexico, and its contribution to changes in temporary life expectancy between 20 and 100 years of age, in the period 1990-2010. Data come from the National Mortality Vital Statistics and from the Population Census of the Mexican National Institute of Statistics and Geography (INEGI). We calculated standardized mortality rates. To analyze the impact of DM on the temporary life expectancy (80e20) we used Pollard's method. Between 1990 and 2010, the standardized mortality rate for people 20 years and older increased by 224%. The contribution of DM to the change in life expectancy during 1990-2000 was a reduction of 0.31 years for men and of 0.32 years for women; in the period 2000-2010 the reduction continued for both men and women (0.34 and 0.12 years, respectively). Mortality from DM continues to increase, especially for men, while a modest reduction was observed for women. It is essential to implement health services and programs aimed at reducing mortality from this cause, focused on prevention, early detection and timely treatment, with concrete actions for vulnerable groups.

  10. A study of the H I and optical properties of Low Surface Brightness galaxies: spirals, dwarfs, and irregulars

    NASA Astrophysics Data System (ADS)

    Honey, M.; van Driel, W.; Das, M.; Martin, J.-M.

    2018-06-01

    We present a study of the H I and optical properties of nearby (z ≤ 0.1) Low Surface Brightness galaxies (LSBGs). We started with a literature sample of ˜900 LSBGs and divided them into three morphological classes: spirals, irregulars, and dwarfs. Of these, we could use ˜490 LSBGs to study their H I and stellar masses, colours, colour-magnitude diagrams, and local environment, compare them with normal, High Surface Brightness (HSB) galaxies, and determine the differences between the three morphological classes. We found that LSB and HSB galaxies span a similar range in H I and stellar masses, and have a similar M_{H I}/M⋆-M⋆ relationship. Among the LSBGs, as expected, the spirals have the highest average H I and stellar masses, both of about 10^9.8 M⊙. The LSBGs' (g - r) integrated colour is nearly constant as a function of H I mass for all classes. In the colour-magnitude diagram, the spirals are spread over the red and blue regions whereas the irregulars and dwarfs are confined to the blue region. The spirals also exhibit a steeper slope in the M_{H I}/M⋆-M⋆ plane. Within their local environment, we confirmed that LSBGs are more isolated than HSB galaxies, and LSB spirals more isolated than irregulars and dwarfs. Kolmogorov-Smirnov statistical tests on the H I mass, stellar mass, and number of neighbours indicate that the spirals are a statistically different population from the dwarfs and irregulars. This suggests that the spirals may have a different formation and H I evolution than the dwarfs and irregulars.
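
    The Kolmogorov-Smirnov comparisons mentioned are of the standard two-sample kind; a minimal sketch (assuming SciPy; synthetic H I masses, not the paper's sample):

    ```python
    # Sketch: two-sample KS test comparing spirals with dwarfs/irregulars.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    log_mhi_spirals = rng.normal(9.8, 0.4, size=150)   # synthetic log10(M_HI)
    log_mhi_dwarfs = rng.normal(8.9, 0.5, size=200)

    res = stats.ks_2samp(log_mhi_spirals, log_mhi_dwarfs)
    print(f"D = {res.statistic:.3f}, p = {res.pvalue:.2e}")
    # A small p-value rejects the hypothesis that both samples come from the
    # same underlying distribution, i.e. the populations differ statistically.
    ```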

  11. Measurement of Fatigue in Cancer, Stroke, and HIV Using the Functional Assessment of Chronic Illness Therapy – Fatigue (FACIT-F) Scale

    PubMed Central

    Butt, Zeeshan; Lai, Jin-shei; Rao, Deepa; Heinemann, Allen W.; Bill, Alex; Cella, David

    2012-01-01

    Objective: Given the importance of fatigue in cancer, stroke and HIV, we sought to assess the measurement properties of a single, well-described fatigue scale in these populations. We hypothesized that the psychometric properties of the Functional Assessment of Chronic Illness Therapy – Fatigue (FACIT-F) subscale would be favorable and that the scale could serve as a useful indicator of fatigue in these populations. Methods: Patients were eligible for the study if they were outpatients, aged 18 or older, with a diagnosis of cancer (n=297), stroke (n=51), or HIV/AIDS (n=51). All participants were able to understand and speak English. Patients answered study-related questions, including the FACIT-F using a touch-screen laptop, assisted by the research assistant as necessary. Clinical information was abstracted from patients’ medical records. Results: Item-level statistics on the FACIT-F were similar across the groups and internal consistency reliability was uniformly high (α>0.91). Correlations with performance status ratings were statistically significant across the groups (range r=−0.28 to −0.80). Fatigue scores were moderately to highly correlated with general quality of life (range r=0.66–0.80) in patients with cancer, stroke, and HIV. Divergent validity was supported in low correlations with variables not expected to correlate with fatigue. Conclusions: Originally developed to assess cancer-related fatigue, the FACIT-F has utility as a measure of fatigue in other populations, such as stroke and HIV. Ongoing research will soon allow for comparison of FACIT-F scores to those obtained using the fatigue measures from the Patient-Reported Outcomes Measurement Information System (PROMIS®; www.nihpromis.org) initiative. PMID:23272990
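
    The internal consistency figure (α > 0.91) is Cronbach's alpha; a minimal sketch of its computation on a synthetic item-response matrix (13 items, mimicking the FACIT-F length):

    ```python
    # Sketch: Cronbach's alpha for an item-response matrix.
    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """items: (n_respondents, k_items) matrix of item scores."""
        k = items.shape[1]
        item_var_sum = items.var(axis=0, ddof=1).sum()
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1.0 - item_var_sum / total_var)

    rng = np.random.default_rng(4)
    true_fatigue = rng.normal(size=300)                       # latent severity
    items = true_fatigue[:, None] + rng.normal(scale=0.6, size=(300, 13))
    print(f"alpha = {cronbach_alpha(items):.3f}")   # high, like the paper's >0.91
    ```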

  12. Halo Coronal Mass Ejections during Solar Cycle 24: reconstruction of the global scenario and geoeffectiveness

    NASA Astrophysics Data System (ADS)

    Scolini, Camilla; Messerotti, Mauro; Poedts, Stefaan; Rodriguez, Luciano

    2018-02-01

    In this study we present a statistical analysis of 53 fast Earth-directed halo CMEs observed by the SOHO/LASCO instrument during the period Jan. 2009-Sep. 2015, and we use this CME sample to test the capabilities of a Sun-to-Earth prediction scheme for CME geoeffectiveness. First, we investigate the CME association with other solar activity features by means of multi-instrument observations of the solar magnetic and plasma properties. Second, using coronagraphic images to derive the CME kinematical properties at 0.1 AU, we propagate the events to 1 AU by means of the WSA-ENLIL+Cone model. Simulation results at Earth are compared with in-situ observations at L1. By applying the pressure balance condition at the magnetopause and a solar wind-Kp index coupling function, we estimate the expected magnetospheric compression and geomagnetic activity level, and compare them with global data records. The analysis indicates that 82% of the CMEs arrived at Earth within the next 4 days. Almost all of them compressed the magnetopause below geosynchronous orbit and triggered a geomagnetic storm. Complex sunspot-rich active regions associated with energetic flares are the most favourable configurations from which geoeffective CMEs originate. The analysis of related SEP events shows that 74% of the CMEs associated with major SEPs were geoeffective. Moreover, SEP production is enhanced in the case of fast and interacting CMEs. In this work we present a first attempt at applying a Sun-to-Earth geoeffectiveness prediction scheme - based on 3D simulations and solar wind-geomagnetic activity coupling functions - to a statistical set of potentially geoeffective halo CMEs. The results of the prediction scheme are in good agreement with geomagnetic activity data records, although further studies to fine-tune the scheme are needed.

  13. Sensitivity of super-efficient data envelopment analysis results to individual decision-making units: an example of surgical workload by specialty.

    PubMed

    Dexter, Franklin; O'Neill, Liam; Xin, Lei; Ledolter, Johannes

    2008-12-01

    We use resampling of data to explore the basic statistical properties of super-efficient data envelopment analysis (DEA) when used as a benchmarking tool by the manager of a single decision-making unit. Our focus is the gaps in the outputs (i.e., slacks adjusted for upward bias), as they reveal which outputs can be increased. The numerical experiments show that the estimates of the gaps fail to exhibit asymptotic consistency, a property expected for standard statistical inference. Specifically, increased sample sizes were not always associated with more accurate forecasts of the output gaps. The baseline DEA's gaps equaled the mode of the jackknife and the mode of resampling with/without replacement from any subset of the population; usually, the baseline DEA's gaps also equaled the median. The quartile deviations of gaps were close to zero when few decision-making units were excluded from the sample and the study unit happened to have few other units contributing to its benchmark. The results for the quartile deviations can be explained in terms of the effective combinations of decision-making units that contribute to the DEA solution. The jackknife can provide all the combinations contributing to the quartile deviation and only needs to be performed for those units that are part of the benchmark set. These results show that there is a strong rationale for examining DEA results with a sensitivity analysis that excludes one benchmark hospital at a time. This analysis enhances the quality of decision support using DEA estimates for the potential of a decision-making unit to grow one or more of its outputs.
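
    The leave-one-out sensitivity analysis recommended here has a simple structure. The sketch below illustrates it with a deliberately simplified single-input, single-output benchmarking gap standing in for a real super-efficient DEA solver (all numbers invented):

    ```python
    # Sketch: leave-one-out sensitivity analysis of a benchmarking gap.
    def toy_gap(study, peers):
        """Largest extra output achieved by any peer using no more input.
        A crude stand-in for the DEA output gap, not the real linear program."""
        feasible = [out for inp, out in peers if inp <= study[0]]
        return max(0.0, max(feasible, default=study[1]) - study[1])

    study_unit = (100.0, 50.0)                     # (input, output)
    peers = [(90.0, 70.0), (100.0, 65.0), (120.0, 90.0), (80.0, 40.0)]

    baseline = toy_gap(study_unit, peers)
    jack = [toy_gap(study_unit, peers[:i] + peers[i + 1:])
            for i in range(len(peers))]
    print(f"baseline gap = {baseline}, leave-one-out gaps = {jack}")
    # Large swings across `jack` flag peers that dominate the benchmark,
    # which is exactly the instability in gap estimates the paper reports.
    ```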

  14. Prediction of CpG-island function: CpG clustering vs. sliding-window methods

    PubMed Central

    2010-01-01

    Background: Unmethylated stretches of CpG dinucleotides (CpG islands) are an outstanding property of mammalian genomes. Conventionally, these regions are detected by sliding window approaches using %G + C, CpG observed/expected ratio and length thresholds as main parameters. Recently, clustering methods directly detect clusters of CpG dinucleotides as a statistical property of the genome sequence. Results: We compare sliding-window to clustering (i.e. CpGcluster) predictions by applying new ways to detect putative functionality of CpG islands. Analyzing the co-localization with several genomic regions as a function of window size vs. statistical significance (p-value), CpGcluster shows a higher overlap with promoter regions and highly conserved elements, at the same time showing less overlap with Alu retrotransposons. The major difference in the predictions was found for short islands (CpG islets), often exclusively predicted by CpGcluster. Many of these islets seem to be functional, as they are unmethylated, highly conserved and/or located within the promoter region. Finally, we show that window-based islands can spuriously overlap several, differentially regulated promoters as well as different methylation domains, which might indicate a wrong merge of several CpG islands into a single, very long island. The shorter CpGcluster islands seem to be much more specific concerning the overlap with alternative transcription start sites or the detection of homogeneous methylation domains. Conclusions: The main difference between sliding-window approaches and clustering methods is the length of the predicted islands. Short islands, often differentially methylated, are almost exclusively predicted by CpGcluster. This suggests that CpGcluster may be the algorithm of choice to explore the function of these short, but putatively functional, CpG islands. PMID:20500903
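
    For concreteness, the conventional sliding-window parameters mentioned above (%G + C and the CpG observed/expected ratio) can be computed as follows; the thresholds shown are the widely used Gardiner-Garden-style values, assumed here for illustration:

    ```python
    # Sketch: per-window CpG-island statistics.
    def cpg_stats(seq: str):
        seq = seq.upper()
        n = len(seq)
        c, g = seq.count("C"), seq.count("G")
        cpg = seq.count("CG")
        gc_frac = (c + g) / n
        expected = c * g / n          # expected CpG count if C,G were independent
        oe_ratio = cpg / expected if expected > 0 else 0.0
        return gc_frac, oe_ratio

    window = "CGGCGCGAATCGCGTACGCGCGGGCATCGCA"   # toy window, not real genome data
    gc, oe = cpg_stats(window)
    print(f"GC = {gc:.2f}, CpG obs/exp = {oe:.2f}, "
          f"island-like: {gc > 0.5 and oe > 0.6}")
    ```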

  15. The Longterm Centimeter-band Total Flux and Linear Polarization Properties of the Pearson-Readhead Survey Sources

    NASA Astrophysics Data System (ADS)

    Aller, M. F.; Aller, H. D.; Hughes, P. A.

    2001-12-01

    Using centimeter-band total flux and linear polarization observations of the Pearson-Readhead sample sources systematically obtained with the UMRAO 26-m radio telescope during the past 16 years, we identify the range of variability properties and their temporal changes as functions of both optical and radio morphological classification. We find that our earlier statistical analysis, based on a time window of 6.4 years, did not delineate the full amplitude range of the total flux variability; further, several galaxies exhibit longterm, systematic changes or rather infrequent outbursts requiring long term observations for detection. Using radio classification as a delineator, we confirm, and find additional evidence, that significant changes in flux density can occur in steep spectrum and lobe-dominated objects as well as in compact, flat-spectrum objects. We find that statistically the time-averaged total flux density spectra steepen when longer time windows are included, which we attribute to a selection effect in the source sample. We have identified preferred orientations of the electric vector of the polarized emission (EVPA) in an unbiased manner in several sources, including several QSOs which have exhibited large variations in total flux while maintaining stable EVPAs, and compared these with orientations of the flow direction indicated by VLB morphology. We have looked for systematic, monotonic changes in EVPA which might be expected in the emission from a precessing jet, but none were identified. A Scargle periodogram analysis found no strong evidence for periodicity in any of the sample sources. We thank the NSF for grants AST-8815678, AST-9120224, AST-9421979, and AST-9900723 which provided partial support for this research. The operation of the 26-meter telescope is supported by the University of Michigan Department of Astronomy.
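
    Periodicity searches of this kind on unevenly sampled monitoring data are commonly done with a Lomb-Scargle periodogram; a minimal sketch on a synthetic light curve (not UMRAO data; SciPy assumed):

    ```python
    # Sketch: Lomb-Scargle period search on an unevenly sampled light curve.
    import numpy as np
    from scipy.signal import lombscargle

    rng = np.random.default_rng(5)
    t = np.sort(rng.uniform(0, 16 * 365.25, size=400))   # days, ~16 yr of monitoring
    flux = 1.0 + 0.3 * np.sin(2 * np.pi * t / 900.0) + rng.normal(0, 0.2, t.size)

    periods = np.linspace(100, 3000, 2000)               # trial periods in days
    ang_freqs = 2 * np.pi / periods
    power = lombscargle(t, flux - flux.mean(), ang_freqs, normalize=True)
    print(f"peak at ~{periods[np.argmax(power)]:.0f} days")
    # Any real claim of periodicity also needs a significance assessment,
    # e.g. against the power distribution obtained from shuffled data.
    ```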

  16. To t-Test or Not to t-Test? A p-Values-Based Point of View in the Receiver Operating Characteristic Curve Framework.

    PubMed

    Vexler, Albert; Yu, Jihnhee

    2018-04-13

    A common statistical doctrine supported by many introductory courses and textbooks is that t-test type procedures based on normally distributed data points are anticipated to provide a standard in decision-making. In order to motivate scholars to examine this convention, we introduce a simple approach based on graphical tools of receiver operating characteristic (ROC) curve analysis, a well-established biostatistical methodology. In this context, we propose employing a p-values-based method, taking into account the stochastic nature of p-values. We focus on the modern statistical literature to address the expected p-value (EPV) as a measure of the performance of decision-making rules. During the course of our study, we extend the EPV concept to be considered in terms of the ROC curve technique. This provides expressive evaluations and visualizations of a wide spectrum of testing mechanisms' properties. We show that the conventional power characterization of tests is a partial aspect of the presented EPV/ROC technique. We hope this explanation convinces researchers of the usefulness of the EPV/ROC approach for depicting different characteristics of decision-making procedures, in light of the growing interest in correct p-values-based applications.
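
    The expected p-value of a test under a fixed alternative can be estimated directly by simulation; a minimal sketch for a two-sample t-test (effect sizes and sample size assumed for illustration):

    ```python
    # Sketch: estimating the expected p-value (EPV) by Monte Carlo.
    import numpy as np
    from scipy import stats

    def expected_p_value(effect, n, reps=5000, seed=6):
        rng = np.random.default_rng(seed)
        pvals = np.empty(reps)
        for i in range(reps):
            x = rng.normal(effect, 1.0, size=n)   # alternative: shifted mean
            y = rng.normal(0.0, 1.0, size=n)
            pvals[i] = stats.ttest_ind(x, y).pvalue
        return pvals.mean()

    for effect in (0.0, 0.3, 0.6):
        print(f"effect {effect}: EPV ~= {expected_p_value(effect, n=30):.3f}")
    # Under H0 the p-value is uniform, so EPV ~= 0.5; a smaller EPV means a
    # better-performing test, the quantity the authors connect to ROC curves.
    ```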

  17. The span of correlations in dolphin whistle sequences

    NASA Astrophysics Data System (ADS)

    Ferrer-i-Cancho, Ramon; McCowan, Brenda

    2012-06-01

    Long-range correlations are found in symbolic sequences from human language, music and DNA. Determining the span of correlations in dolphin whistle sequences is crucial for shedding light on their communicative complexity. Dolphin whistles share various statistical properties with human words, i.e. Zipf's law for word frequencies (namely that the probability of the ith most frequent word of a text is about i^(-α)) and a parallel of the tendency of more frequent words to have more meanings. The finding of Zipf's law for word frequencies in dolphin whistles has been the topic of an intense debate on its implications. One of the major arguments against the relevance of Zipf's law in dolphin whistles is that it is not possible to distinguish the outcome of a die-rolling experiment from that of a linguistic or communicative source producing Zipf's law for word frequencies. Here we show that statistically significant whistle-whistle correlations extend back to the second previous whistle in the sequence, using a global randomization test, and to the fourth previous whistle, using a local randomization test. None of these correlations are expected by a die-rolling experiment and other simple explanations of Zipf's law for word frequencies, such as Simon's model, that produce sequences of unpredictable elements.
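
    A global randomization test of the kind described compares an observed lag statistic against its distribution under full shuffling; a minimal sketch on a synthetic symbolic sequence (a repeat-prone source, not real whistle data):

    ```python
    # Sketch: global randomization test for dependence at lag k.
    import numpy as np

    def lag_match_rate(seq, k):
        """Fraction of positions whose symbol repeats k steps earlier."""
        return np.mean(seq[k:] == seq[:-k])

    rng = np.random.default_rng(7)
    seq = [0]
    for _ in range(999):   # short-range structure: symbols tend to repeat
        seq.append(seq[-1] if rng.random() < 0.3 else rng.integers(0, 10))
    seq = np.array(seq)

    for k in (1, 2, 3, 6):
        observed = lag_match_rate(seq, k)
        null = [lag_match_rate(rng.permutation(seq), k) for _ in range(2000)]
        p = np.mean([x >= observed for x in null])
        print(f"lag {k}: match rate {observed:.3f}, permutation p = {p:.3f}")
    # Small p at short lags but not long ones indicates a finite correlation
    # span, the quantity the paper estimates for whistle sequences.
    ```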

  18. Experimental evaluation of nonclassical correlations between measurement outcomes and target observable in a quantum measurement

    NASA Astrophysics Data System (ADS)

    Iinuma, Masataka; Suzuki, Yutaro; Nii, Taiki; Kinoshita, Ryuji; Hofmann, Holger F.

    2016-03-01

    In general, it is difficult to evaluate measurement errors when the initial and final conditions of the measurement make it impossible to identify the correct value of the target observable. Ozawa proposed a solution based on the operator algebra of observables which has recently been used in experiments investigating the error-disturbance trade-off of quantum measurements. Importantly, this solution makes surprisingly detailed statements about the relations between measurement outcomes and the unknown target observable. In the present paper, we investigate this relation by performing a sequence of two measurements on the polarization of a photon, so that the first measurement commutes with the target observable and the second measurement is sensitive to a complementary observable. While the initial measurement can be evaluated using classical statistics, the second measurement introduces the effects of quantum correlations between the noncommuting physical properties. By varying the resolution of the initial measurement, we can change the relative contribution of the nonclassical correlations and identify their role in the evaluation of the quantum measurement. It is shown that the most striking deviation from classical expectations is obtained at the transition between weak and strong measurements, where the competition between different statistical effects results in measurement values well outside the range of possible eigenvalues.

  19. On improvement to the Shock Propagation Model (SPM) applied to interplanetary shock transit time forecasting

    NASA Astrophysics Data System (ADS)

    Li, H. J.; Wei, F. S.; Feng, X. S.; Xie, Y. Q.

    2008-09-01

    This paper investigates methods to improve the predictions of Shock Arrival Time (SAT) of the original Shock Propagation Model (SPM). According to the classical blast wave theory adopted in the SPM, the shock propagating speed is determined by the total energy of the original explosion together with the background solar wind speed. Noting that there exists an intrinsic limit to the transit times computed by the SPM for a specified ambient solar wind, we present a statistical analysis of the forecasting capability of the SPM using this intrinsic property. Two facts about the SPM are found: (1) the error in shock energy estimation is not the only cause of the prediction errors, and we should not expect the accuracy of the SPM to be improved drastically by an exact shock energy input; and (2) there are systematic differences in prediction results both for strong shocks propagating into a slow ambient solar wind and for weak shocks into a fast medium. Statistical analyses indicate the physical details of shock propagation and thus clearly point out directions for future improvement of the SPM. A simple modification is presented here, which shows that there is room for improvement of the SPM and thus that the original SPM is worthy of further development.

  20. An experimental study of the surface elevation probability distribution and statistics of wind-generated waves

    NASA Technical Reports Server (NTRS)

    Huang, N. E.; Long, S. R.

    1980-01-01

    Laboratory experiments were performed to measure the surface elevation probability density function and associated statistical properties for a wind-generated wave field. The laboratory data along with some limited field data were compared. The statistical properties of the surface elevation were processed for comparison with the results derived from the Longuet-Higgins (1963) theory. It is found that, even for the highly non-Gaussian cases, the distribution function proposed by Longuet-Higgins still gives good approximations.

  1. A Probabilistic Model of Local Sequence Alignment That Simplifies Statistical Significance Estimation

    PubMed Central

    Eddy, Sean R.

    2008-01-01

    Sequence database searches require accurate estimation of the statistical significance of scores. Optimal local sequence alignment scores follow Gumbel distributions, but determining an important parameter of the distribution (λ) requires time-consuming computational simulation. Moreover, optimal alignment scores are less powerful than probabilistic scores that integrate over alignment uncertainty (“Forward” scores), but the expected distribution of Forward scores remains unknown. Here, I conjecture that both expected score distributions have simple, predictable forms when full probabilistic modeling methods are used. For a probabilistic model of local sequence alignment, optimal alignment bit scores (“Viterbi” scores) are Gumbel-distributed with constant λ = log 2, and the high scoring tail of Forward scores is exponential with the same constant λ. Simulation studies support these conjectures over a wide range of profile/sequence comparisons, using 9,318 profile-hidden Markov models from the Pfam database. This enables efficient and accurate determination of expectation values (E-values) for both Viterbi and Forward scores for probabilistic local alignments. PMID:18516236
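
    Under the conjecture above, converting a bit score into an E-value needs only the Gumbel tail with λ = log 2; in the sketch below the location parameter μ and the number of comparisons are assumed values for illustration only:

    ```python
    # Sketch: E-value from a bit score under a Gumbel null with lambda = ln 2.
    import math

    def gumbel_evalue(score_bits, mu, n_comparisons, lam=math.log(2.0)):
        """Expected number of chance hits scoring >= score_bits."""
        p_tail = 1.0 - math.exp(-math.exp(-lam * (score_bits - mu)))
        return n_comparisons * p_tail

    # A hypothetical profile search over 50 million sequences, mu fitted per model:
    for s in (20.0, 30.0, 40.0):
        e = gumbel_evalue(s, mu=-8.0, n_comparisons=5e7)
        print(f"score {s:.0f} bits -> E = {e:.3g}")
    ```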

  2. Statistical properties of alternative national forest inventory area estimators

    Treesearch

    Francis Roesch; John Coulston; Andrew D. Hill

    2012-01-01

    The statistical properties of potential estimators of forest area for the USDA Forest Service's Forest Inventory and Analysis (FIA) program are presented and discussed. The current FIA area estimator is compared and contrasted with a weighted mean estimator and an estimator based on the Polya posterior, in the presence of nonresponse. Estimator optimality is...

  3. Financial Statistics of Institutions of Higher Education: Property, 1969-70.

    ERIC Educational Resources Information Center

    Mertins, Paul F.; Brandt, Norman J.

    This publication presents a part of the data provided by institutions of higher education in response to a questionnaire entitled "Financial Statistics of Institutions of Higher Education, 1969-70," which was included in the fifth annual Higher Education General Information Survey (HEGIS). This publication deals with the property related data.…

  4. Degree of cure and fracture properties of experimental acid-resin modified composites under wet and dry conditions

    PubMed Central

    López-Suevos, Francisco; Dickens, Sabine H.

    2008-01-01

    Objective: Evaluate the effects of core structure and storage conditions on the mechanical properties of acid-resin modified composites and a control material by three-point bending and conversion measurements 15 min and 24 h after curing. Methods: The monomers pyromellitic dimethacrylate (PMDM), biphenyldicarboxylic-acid dimethacrylate (BPDM), (isopropylidene-diphenoxy)bis(phthalic-acid) dimethacrylate (IPDM), oxydiphthalic-acid dimethacrylate (ODPDM), and Bis-GMA were mixed with triethyleneglycol dimethacrylate (TEGDMA) in a 40/60 molar ratio, and photo-activated. Composite bars (Barium-oxide-glass/resin = 3/1 mass ratio, (2 × 2 × 25) mm, n = 5) were light-cured for 1 min per side. Flexural strength (FS), elastic modulus (E), and work-of-fracture (WoF) were determined in three-point bending after 15 min (stored dry); and after 24 h under dry and wet storage conditions at 37 °C. Corresponding degrees of conversion (DC) were evaluated by Fourier transform infrared spectroscopy. Data was statistically analyzed (2-way analysis of variance, ANOVA, Holm-Sidak, p < 0.05). Results: Post-curing significantly increased FS, E and DC in nearly all cases. WoF did not change, or even decreased with time. For all properties ANOVA found significant differences and interactions of time and material. Wet storage reduced the moduli and the other properties measured with the exception of FS and WoF of ODPDM; DC only decreased in BPDM and IPDM composites. Significance: Differences in core structure resulted in significantly different physical properties of the composites studied with two phenyl rings connected by one ether linkage as in ODPDM having superior FS, WoF and DC especially after 24 h under wet conditions. As expected, post-curing significantly contributed to the final mechanical properties of the composites, while wet storage generally reduced the mechanical properties. PMID:17980422

  5. Vortex phase-induced changes of the statistical properties of a partially coherent radially polarized beam.

    PubMed

    Guo, Lina; Chen, Yahong; Liu, Xianlong; Liu, Lin; Cai, Yangjian

    2016-06-27

    A partially coherent radially polarized (PCRP) beam was introduced and generated in recent years. In this paper, we investigate the statistical properties of a PCRP beam embedded with a vortex phase (i.e., a PCRP vortex beam). We derive the analytical formula for the cross-spectral density matrix of a PCRP vortex beam propagating through a paraxial ABCD optical system and analyze the statistical properties of a PCRP vortex beam focused by a thin lens. It is found that the statistical properties of a PCRP vortex beam on propagation are much different from those of a PCRP beam. The vortex phase induces not only rotation of the beam spot, but also changes in the beam shape, the degree of polarization and the state of polarization. We also find that the vortex phase plays a role in resisting the coherence-induced degradation of the intensity distribution and the coherence-induced depolarization. Furthermore, we report experimental generation of a PCRP vortex beam for the first time. Our results will be useful for trapping and rotating particles, free-space optical communications and detection of phase objects.

  6. Semi-Poisson statistics in quantum chaos.

    PubMed

    García-García, Antonio M; Wang, Jiao

    2006-03-01

    We investigate the quantum properties of a nonrandom Hamiltonian with a steplike singularity. It is shown that the eigenfunctions are multifractals and, in a certain range of parameters, the level statistics is described exactly by semi-Poisson statistics (SP) typical of pseudointegrable systems. It is also shown that our results are universal, namely, they depend exclusively on the presence of the steplike singularity and are not modified by smooth perturbations of the potential or the addition of a magnetic flux. Although the quantum properties of our system are similar to those of a disordered conductor at the Anderson transition, we report important quantitative differences in both the level statistics and the multifractal dimensions controlling the transition. Finally, the study of quantum transport properties suggests that the classical singularity induces quantum anomalous diffusion. We discuss how these findings may be experimentally corroborated by using ultracold atoms techniques.
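
    For reference, the three level-spacing distributions at play (unit mean spacing) are easy to write down; the semi-Poisson form is the intermediate statistics the paper reports:

    ```python
    # Sketch: reference nearest-neighbor level-spacing distributions.
    import numpy as np

    def poisson_ps(s):        # integrable systems
        return np.exp(-s)

    def semi_poisson_ps(s):   # pseudointegrable / critical statistics
        return 4.0 * s * np.exp(-2.0 * s)

    def wigner_ps(s):         # chaotic systems (GOE surmise)
        return (np.pi / 2.0) * s * np.exp(-np.pi * s**2 / 4.0)

    s = np.linspace(0.0, 3.0, 7)
    for name, f in (("Poisson", poisson_ps), ("semi-Poisson", semi_poisson_ps),
                    ("Wigner", wigner_ps)):
        print(f"{name:>12}: {np.round(f(s), 3)}")
    # Semi-Poisson shares level repulsion (P(0) = 0) with Wigner but keeps
    # the exponential tail of Poisson.
    ```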

  7. Statistical studies of animal response data from USF toxicity screening test method

    NASA Technical Reports Server (NTRS)

    Hilado, C. J.; Machado, A. M.

    1978-01-01

    Statistical examination of animal response data obtained using Procedure B of the USF toxicity screening test method indicates that the data deviate only slightly from a normal or Gaussian distribution. This slight departure from normality is not expected to invalidate conclusions based on theoretical statistics. Comparison of times to staggering, convulsions, collapse, and death as endpoints shows that time to death appears to be the most reliable endpoint because it offers the lowest probability of missed observations and premature judgements.

  8. Southeast Atlantic Cloud Properties in a Multivariate Statistical Model - How Relevant is Air Mass History for Local Cloud Properties?

    NASA Astrophysics Data System (ADS)

    Fuchs, Julia; Cermak, Jan; Andersen, Hendrik

    2017-04-01

    This study aims at untangling the impacts of external dynamics and local conditions on cloud properties in the Southeast Atlantic (SEA) by combining satellite and reanalysis data using multivariate statistics. The understanding of clouds and their determinants at different scales is important for constraining the Earth's radiative budget, and thus prominent in climate-system research. In this study, SEA stratocumulus cloud properties are observed not only as the result of local environmental conditions but also as affected by external dynamics and spatial origins of air masses entering the study area. In order to assess to what extent cloud properties are impacted by aerosol concentration, air mass history, and meteorology, a multivariate approach is conducted using satellite observations of aerosol and cloud properties (MODIS, SEVIRI), information on aerosol species composition (MACC) and meteorological context (ERA-Interim reanalysis). To account for the often-neglected but important role of air mass origin, information on air mass history based on HYSPLIT modeling is included in the statistical model. This multivariate approach is intended to lead to a better understanding of the physical processes behind observed stratocumulus cloud properties in the SEA.

  9. [Prediction of life expectancy for prostate cancer patients based on the kinetic theory of aging of living systems].

    PubMed

    Viktorov, A A; Zharinov, G M; Neklasova, N Ju; Morozova, E E

    2017-01-01

    The article presents a methodical approach for predicting life expectancy for people diagnosed with prostate cancer based on the kinetic theory of aging of living systems. The life expectancy is calculated by solving the differential equation for the rate of aging for three different stages of life - «normal» life, life with prostate cancer, and life after combination therapy for prostate cancer. The mathematical model of aging for each stage of life has its own parameters, identified by statistical analysis of healthcare data from Zharinov's databank and the Rosstat CDR NES databank. The core of the methodical approach is the statistical correlation between the growth rate of the prostate-specific antigen level (PSA level), or the PSA doubling time (PSA DT), before therapy and lifespan: the higher the PSA DT, the greater the lifespan. The patients were grouped under the «fast PSA DT» and «slow PSA DT» categories. Satisfactory agreement between calculations and experiment is shown. The prediction error of group life expectancy is due to the completeness and reliability of the main data source. Detailed monitoring of the basic health indicators throughout each person's life in each analyzed group is required; the absence of this particular information makes it impossible to predict individual life expectancy.
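
    The PSA doubling time driving the grouping follows from assuming exponential PSA growth between two measurements; a minimal sketch with invented values:

    ```python
    # Sketch: PSA doubling time (PSA DT) from two pre-therapy measurements.
    import math

    def psa_doubling_time(psa1, psa2, dt_months):
        """Doubling time in months, assuming exponential growth."""
        return dt_months * math.log(2.0) / math.log(psa2 / psa1)

    fast = psa_doubling_time(4.0, 9.0, dt_months=6.0)   # rapid riser
    slow = psa_doubling_time(4.0, 4.8, dt_months=6.0)   # slow riser
    print(f"'fast' patient: PSA DT ~ {fast:.1f} months")
    print(f"'slow' patient: PSA DT ~ {slow:.1f} months")
    # Per the paper's correlation, a longer PSA DT predicts a longer lifespan.
    ```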

  10. A statistical study of merging galaxies: Theory and observations

    NASA Technical Reports Server (NTRS)

    Chatterjee, Tapan K.

    1990-01-01

    A study of the expected frequency of merging galaxies is conducted using the impulsive approximation. Results indicate that if we consider mergers involving galaxy pairs without halos in a single crossing time or orbital period, the expected frequency of mergers is two orders of magnitude below the observed value for the present epoch. If we consider mergers involving several orbital periods or crossing times, the expected frequency goes up by an order of magnitude. Preliminary calculations indicate that if we consider galaxy mergers between pairs with massive halos, the merger is very much hastened.

  11. Leads Detection Using Mixture Statistical Distribution Based CRF Algorithm from Sentinel-1 Dual Polarization SAR Imagery

    NASA Astrophysics Data System (ADS)

    Zhang, Yu; Li, Fei; Zhang, Shengkai; Zhu, Tingting

    2017-04-01

    Synthetic Aperture Radar (SAR) is important for polar remote sensing since it provides continuous observations day and night and in all weather. SAR can be used to extract surface roughness information characterized by the variance of dielectric properties and different polarization channels, which makes it possible to observe different ice types and surface structure for deformation analysis. In November 2016, the 33rd Chinese National Antarctic Research Expedition (CHINARE) cruise set sail for the Antarctic sea ice zone. An accurate map of the spatial distribution of leads in the sea ice zone is essential for routine ship navigation planning. In this study, the semantic relationship between leads and sea ice categories is described by a Conditional Random Field (CRF) model, and lead characteristics are modeled by statistical distributions of the SAR imagery. In the proposed algorithm, a mixture-statistical-distribution-based CRF is developed by considering the contextual information and the statistical characteristics of sea ice to improve lead detection in Sentinel-1A dual polarization SAR imagery. The unary and pairwise potentials in the CRF model are constructed by integrating the posterior probabilities estimated from the statistical distributions. For mixture-distribution parameter estimation, the Method of Logarithmic Cumulants (MoLC) is used for single-distribution parameters, and an iterative Expectation Maximization (EM) algorithm is used to calculate the parameters of the mixture-statistical-distribution-based CRF model. In the posterior probability inference, a graph-cut energy minimization method is adopted for the initial lead detection. Post-processing procedures, including an aspect ratio constraint and spatial smoothing, are utilized to improve the visual result. The proposed method is validated on Sentinel-1A SAR C-band Extra Wide Swath (EW) Ground Range Detected (GRD) imagery with a pixel spacing of 40 meters near the Prydz Bay area, East Antarctica. The main contributions are as follows: 1) a mixture-statistical-distribution-based CRF algorithm has been developed for lead detection from Sentinel-1A dual polarization images; 2) the proposed mixture-based CRF and the single-distribution-based CRF algorithm have been assessed against each other; and 3) preferred parameter sets, including the statistical distributions, the aspect ratio threshold, and the spatial smoothing window size, have been provided. In the future, the proposed algorithm will be developed for operational processing of the Sentinel series data sets owing to its low computational cost and high accuracy in lead detection.
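
    As a simplified stand-in for the mixture-estimation step (the paper fits SAR-specific distributions via MoLC and EM), the sketch below separates "lead" from "ice" backscatter with an EM-fitted Gaussian mixture on synthetic dB values (scikit-learn assumed):

    ```python
    # Sketch: two-component mixture for lead vs. ice backscatter.
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(8)
    ice = rng.normal(-12.0, 1.5, size=4000)      # brighter rough ice (synthetic)
    leads = rng.normal(-20.0, 2.0, size=1000)    # darker open water / thin ice
    backscatter = np.concatenate([ice, leads]).reshape(-1, 1)

    gmm = GaussianMixture(n_components=2, random_state=0).fit(backscatter)  # EM fit
    labels = gmm.predict(backscatter)
    for k in range(2):
        print(f"component {k}: mean {gmm.means_[k, 0]:.1f} dB, "
              f"weight {gmm.weights_[k]:.2f}, n = {(labels == k).sum()}")
    # The per-pixel posteriors (gmm.predict_proba) would feed the unary
    # potential of the CRF in a pipeline like the paper's.
    ```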

  12. Properties of Nonabelian Quantum Hall States

    NASA Astrophysics Data System (ADS)

    Simon, Steven H.

    2004-03-01

    The quantum statistics of particles refers to the behavior of a multiparticle wavefunction under adiabatic interchange of two identical particles. While a three dimensional world affords the possibilities of Bosons or Fermions, the two dimensional world has more exotic possibilities such as Fractional and Nonabelian statistics (J. Frölich, in ``Nonperturbative Quantum Field Theory'', ed. G. t'Hooft, 1988). The latter is perhaps the most interesting, where the wavefunction obeys a ``nonabelian'' representation of the braid group - meaning that braiding A around B then B around C is not the same as braiding B around C then A around B. This property enables one to think about using these exotic systems for robust topological quantum computation (M. Freedman, A. Kitaev, et al, Bull Am Math Soc 40, 31 (2003)). Surprisingly, it is thought that quasiparticle excitations with such nonabelian statistics may actually exist in certain quantum Hall states that have already been observed. The most likely such candidate is the quantum Hall ν=5/2 state (R. L. Willett et al, Phys. Rev. Lett. 59, 1776-1779 (1987)), thought to be a so-called Moore-Read Pfaffian state (G. Moore and N. Read, Nucl Phys. B360 362 (1991)), which can be thought of as a p-wave paired superconducting state of composite fermions (M. Greiter, X. G. Wen, and F. Wilczek, PRL 66, 3205 (1991)). Using this superconducting analogy, we use a Chern-Simons field theory approach to make a number of predictions as to what experimental signatures one should expect for this state if it really is this Moore-Read state (K. Foster, N. Bonesteel, and S. H. Simon, PRL 91 046804 (2003)). We will then discuss how the nonabelian statistics can be explored in detail using a quantum monte-carlo approach (Y. Tserkovnyak and S. H. Simon, PRL 90 106802 (2003); I. Finkler, Y. Tserkovnyak, and S. H. Simon, work in progress) that allows one to explicitly drag one particle around another and observe the change in the wavefunctions. Unfortunately, it turns out that the Moore-Read state is not suited for topological quantum computation (M. Freedman, A. Kitaev, et al, Bull Am Math Soc 40, 31 (2003)), so we will turn our attention to the so-called parafermionic states (E. Rezayi and N. Read, Phys. Rev. B 59, 8084-8092 (1999)), which may also exist in nature.

  13. The influence of various test plans on mission reliability. [for Shuttle Spacelab payloads

    NASA Technical Reports Server (NTRS)

    Stahle, C. V.; Gongloff, H. R.; Young, J. P.; Keegan, W. B.

    1977-01-01

    Methods have been developed for the evaluation of cost effective vibroacoustic test plans for Shuttle Spacelab payloads. The shock and vibration environments of components have been statistically represented, and statistical decision theory has been used to evaluate the cost effectiveness of five basic test plans with structural test options for two of the plans. Component, subassembly, and payload testing have been performed for each plan along with calculations of optimum test levels and expected costs. The tests have been ranked according to both minimizing expected project costs and vibroacoustic reliability. It was found that optimum costs may vary up to $6 million with the lowest plan eliminating component testing and maintaining flight vibration reliability via subassembly tests at high acoustic levels.

  14. Efficient estimation of Pareto model: Some modified percentile estimators.

    PubMed

    Bhatti, Sajjad Haider; Hussain, Shahzad; Ahmad, Tanvir; Aslam, Muhammad; Aftab, Muhammad; Raza, Muhammad Ali

    2018-01-01

    The article proposes three modified percentile estimators for parameter estimation of the Pareto distribution. These modifications are based on median, geometric mean and expectation of empirical cumulative distribution function of first-order statistic. The proposed modified estimators are compared with traditional percentile estimators through a Monte Carlo simulation for different parameter combinations with varying sample sizes. Performance of different estimators is assessed in terms of total mean square error and total relative deviation. It is determined that modified percentile estimator based on expectation of empirical cumulative distribution function of first-order statistic provides efficient and precise parameter estimates compared to other estimators considered. The simulation results were further confirmed using two real life examples where maximum likelihood and moment estimators were also considered.
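
    For context, the traditional percentile estimator that the modifications build on inverts the Pareto quantile function at two sample percentiles; a minimal sketch (quartiles chosen arbitrarily):

    ```python
    # Sketch: classical percentile estimator for Pareto(x_m, alpha).
    import numpy as np

    def pareto_percentile_estimate(sample, p1=0.25, p2=0.75):
        """Invert Q(p) = x_m * (1 - p)**(-1/alpha) at two sample percentiles."""
        q1, q2 = np.quantile(sample, [p1, p2])
        alpha = np.log((1 - p1) / (1 - p2)) / np.log(q2 / q1)
        x_m = q1 * (1 - p1) ** (1 / alpha)
        return x_m, alpha

    rng = np.random.default_rng(9)
    sample = 2.0 * (1 - rng.random(5000)) ** (-1 / 3.0)   # Pareto(x_m=2, alpha=3)
    x_m_hat, alpha_hat = pareto_percentile_estimate(sample)
    print(f"x_m ~= {x_m_hat:.3f}, alpha ~= {alpha_hat:.3f}")
    # The paper's modifications replace the plain percentiles with the median,
    # geometric mean, or the expected first-order statistic.
    ```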

  15. Grain-Size Based Additivity Models for Scaling Multi-rate Uranyl Surface Complexation in Subsurface Sediments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhang, Xiaoying; Liu, Chongxuan; Hu, Bill X.

    This study statistically analyzed a grain-size based additivity model that has been proposed to scale reaction rates and parameters from laboratory to field. The additivity model assumed that reaction properties in a sediment including surface area, reactive site concentration, reaction rate, and extent can be predicted from field-scale grain size distribution by linearly adding reaction properties for individual grain size fractions. This study focused on the statistical analysis of the additivity model with respect to reaction rate constants using multi-rate uranyl (U(VI)) surface complexation reactions in a contaminated sediment as an example. Experimental data of rate-limited U(VI) desorption in a stirred flow-cell reactor were used to estimate the statistical properties of multi-rate parameters for individual grain size fractions. The statistical properties of the rate constants for the individual grain size fractions were then used to analyze the statistical properties of the additivity model to predict rate-limited U(VI) desorption in the composite sediment, and to evaluate the relative importance of individual grain size fractions to the overall U(VI) desorption. The result indicated that the additivity model provided a good prediction of the U(VI) desorption in the composite sediment. However, the rate constants were not directly scalable using the additivity model, and U(VI) desorption in individual grain size fractions had to be simulated in order to apply the additivity model. An approximate additivity model for directly scaling rate constants was subsequently proposed and evaluated. The result found that the approximate model provided a good prediction of the experimental results within statistical uncertainty. This study also found that a gravel size fraction (2-8 mm), which is often ignored in modeling U(VI) sorption and desorption, is statistically significant to the U(VI) desorption in the sediment.
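
    The additivity idea itself is a mass-fraction-weighted sum over grain-size fractions; a minimal sketch with invented numbers (not the study's measurements):

    ```python
    # Sketch: grain-size additivity for a composite-sediment reaction property.
    import numpy as np

    mass_fractions = np.array([0.15, 0.25, 0.35, 0.25])   # sums to 1
    # Per-fraction reactive site concentration (mol/g), finest to coarsest;
    # the gravel fraction (last entry) is small but, per the paper, not negligible.
    site_conc = np.array([4.0e-6, 2.5e-6, 1.2e-6, 0.3e-6])

    composite = np.dot(mass_fractions, site_conc)
    print(f"predicted composite site concentration: {composite:.2e} mol/g")
    # The paper finds extents and site concentrations scale this way, but rate
    # constants do not: each fraction must be simulated, or an approximate
    # additivity form used, to predict composite desorption kinetics.
    ```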

  16. A statistical study of EMIC waves observed by Cluster: 1. Wave properties

    DOE PAGES

    Allen, R. C.; Zhang, J. -C.; Kistler, L. M.; ...

    2015-07-23

    Electromagnetic ion cyclotron (EMIC) waves are an important mechanism for particle energization and losses inside the magnetosphere. In order to better understand the effects of these waves on particle dynamics, detailed information about the occurrence rate, wave power, ellipticity, normal angle, energy propagation angle distributions, and local plasma parameters are required. Previous statistical studies have used in situ observations to investigate the distribution of these parameters in the magnetic local time versus L-shell (MLT-L) frame within a limited magnetic latitude (MLAT) range. In our study, we present a statistical analysis of EMIC wave properties using 10 years (2001–2010) of data from Cluster, totaling 25,431 min of wave activity. Due to the polar orbit of Cluster, we are able to investigate EMIC waves at all MLATs and MLTs. This allows us to further investigate the MLAT dependence of various wave properties inside different MLT sectors and further explore the effects of Shabansky orbits on EMIC wave generation and propagation. Thus, the statistical analysis is presented in two papers. Our paper focuses on the wave occurrence distribution as well as the distribution of wave properties. The companion paper focuses on local plasma parameters during wave observations as well as wave generation proxies.

  17. All words are not created equal: Expectations about word length guide infant statistical learning

    PubMed Central

    Lew-Williams, Casey; Saffran, Jenny R.

    2011-01-01

    Infants have been described as ‘statistical learners’ capable of extracting structure (such as words) from patterned input (such as language). Here, we investigated whether prior knowledge influences how infants track transitional probabilities in word segmentation tasks. Are infants biased by prior experience when engaging in sequential statistical learning? In a laboratory simulation of learning across time, we exposed 9- and 10-month-old infants to a list of either bisyllabic or trisyllabic nonsense words, followed by a pause-free speech stream composed of a different set of bisyllabic or trisyllabic nonsense words. Listening times revealed successful segmentation of words from fluent speech only when words were uniformly bisyllabic or trisyllabic throughout both phases of the experiment. Hearing trisyllabic words during the pre-exposure phase derailed infants’ abilities to segment speech into bisyllabic words, and vice versa. We conclude that prior knowledge about word length equips infants with perceptual expectations that facilitate efficient processing of subsequent language input. PMID:22088408

  18. From Wald to Savage: homo economicus becomes a Bayesian statistician.

    PubMed

    Giocoli, Nicola

    2013-01-01

    Bayesian rationality is the paradigm of rational behavior in neoclassical economics. An economic agent is deemed rational when she maximizes her subjective expected utility and consistently revises her beliefs according to Bayes's rule. The paper raises the question of how, when and why this characterization of rationality came to be endorsed by mainstream economists. Though no definitive answer is provided, it is argued that the question is of great historiographic importance. The story begins with Abraham Wald's behaviorist approach to statistics and culminates with Leonard J. Savage's elaboration of subjective expected utility theory in his 1954 classic The Foundations of Statistics. Savage's acknowledged failure to achieve a reinterpretation of traditional inference techniques along subjectivist and behaviorist lines raises the puzzle of how a failed project in statistics could turn into such a big success in economics. Possible answers call into play the emphasis on consistency requirements in neoclassical theory and the impact of the postwar transformation of U.S. business schools. © 2012 Wiley Periodicals, Inc.

  19. Cancer Incidence of 2,4-D Production Workers

    PubMed Central

    Burns, Carol; Bodner, Kenneth; Swaen, Gerard; Collins, James; Beard, Kathy; Lee, Marcia

    2011-01-01

    Despite showing no evidence of carcinogenicity in laboratory animals, the herbicide 2,4-dichlorophenoxyacetic acid (2,4-D) has been associated with non-Hodgkin lymphoma (NHL) in some human epidemiology studies, albeit inconsistently. We matched an existing cohort of 2,4-D manufacturing employees with cancer registries in three US states, resulting in 244 observed cancers compared to 276 expected cases. The Standardized Incidence Ratio (SIR) for the 14 NHL cases was 1.36 (95% Confidence Interval (CI) 0.74–2.29). Risk estimates were higher in the upper cumulative exposure and duration subgroups, but were not statistically significant. There were no clear patterns of NHL risk by period of hire or histology subtype. Statistically significant results were observed for prostate cancer (SIR = 0.74, 95% CI 0.57–0.94) and “other respiratory” cancers (SIR = 3.79, 95% CI 1.22–8.84; 4 of 5 cases were mesotheliomas). Overall, we observed fewer cancer cases than expected, and a statistically nonsignificant increase in the number of NHL cases. PMID:22016704
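
    The SIR arithmetic in this abstract can be reproduced directly: an SIR is the observed case count divided by the expected count, with an exact confidence interval obtained from the Poisson distribution. In the sketch below the expected NHL count is back-calculated from the reported SIR of 1.36 (about 10.3 cases), since the abstract does not state it.

      from scipy.stats import chi2

      def sir_with_ci(observed, expected, level=0.95):
          """Standardized Incidence Ratio with an exact Poisson confidence interval."""
          a = 1 - level
          lo = chi2.ppf(a / 2, 2 * observed) / 2 / expected if observed > 0 else 0.0
          hi = chi2.ppf(1 - a / 2, 2 * (observed + 1)) / 2 / expected
          return observed / expected, (lo, hi)

      # 14 NHL cases observed; expected count inferred as 14 / 1.36.
      print(sir_with_ci(14, 14 / 1.36))  # SIR = 1.36, CI close to (0.74, 2.29)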

  20. Systematic Biases in Parameter Estimation of Binary Black-Hole Mergers

    NASA Technical Reports Server (NTRS)

    Littenberg, Tyson B.; Baker, John G.; Buonanno, Alessandra; Kelly, Bernard J.

    2012-01-01

    Parameter estimation of binary-black-hole merger events in gravitational-wave data relies on matched filtering techniques, which, in turn, depend on accurate model waveforms. Here we characterize the systematic biases introduced in measuring astrophysical parameters of binary black holes by applying the currently most accurate effective-one-body templates to simulated data containing non-spinning numerical-relativity waveforms. For advanced ground-based detectors, we find that the systematic biases are well within the statistical error for realistic signal-to-noise ratios (SNR). These biases grow to be comparable to the statistical errors at high signal-to-noise ratios for ground-based instruments (SNR approximately 50) but never dominate the error budget. At the much larger signal-to-noise ratios expected for space-based detectors, these biases will become large compared to the statistical errors but are small enough (at most a few percent in the black-hole masses) that we expect they should not affect broad astrophysical conclusions that may be drawn from the data.

  1. Evaluation of high-resolution sea ice models on the basis of statistical and scaling properties of Arctic sea ice drift and deformation

    NASA Astrophysics Data System (ADS)

    Girard, L.; Weiss, J.; Molines, J. M.; Barnier, B.; Bouillon, S.

    2009-08-01

    Sea ice drift and deformation from models are evaluated on the basis of statistical and scaling properties. These properties are derived from two observational data sets: the RADARSAT Geophysical Processor System (RGPS) and buoy trajectories from the International Arctic Buoy Program (IABP). Two simulations obtained with the Louvain-la-Neuve Ice Model (LIM) coupled to a high-resolution ocean model and a simulation obtained with the Los Alamos Sea Ice Model (CICE) were analyzed. Model ice drift compares well with observations in terms of the large-scale velocity field and the distributions of velocity fluctuations, although a significant bias in the mean ice speed is noted. On the other hand, the statistical properties of ice deformation are not well simulated by the models: (1) The distributions of strain rates are incorrect: RGPS distributions of strain rates are power-law tailed, i.e., exhibit "wild randomness," whereas the model distributions remain in the Gaussian attraction basin, i.e., exhibit "mild randomness." (2) The models are unable to reproduce the spatial and temporal correlations of the deformation fields: In the observations, ice deformation follows spatial and temporal scaling laws that express the heterogeneity and the intermittency of deformation. These relations do not appear in simulated ice deformation; mean deformation in the models is almost scale independent. The statistical properties of ice deformation are a signature of the ice mechanical behavior. The present work therefore suggests that the mechanical framework currently used by the models is inappropriate. A different modeling framework, based on elastic interactions, could improve the representation of the statistical and scaling properties of ice deformation.

  2. Assessing Attitudes towards Statistics among Medical Students: Psychometric Properties of the Serbian Version of the Survey of Attitudes Towards Statistics (SATS)

    PubMed Central

    Stanisavljevic, Dejana; Trajkovic, Goran; Marinkovic, Jelena; Bukumiric, Zoran; Cirkovic, Andja; Milic, Natasa

    2014-01-01

    Background Medical statistics has become important and relevant for future doctors, enabling them to practice evidence based medicine. Recent studies report that students’ attitudes towards statistics play an important role in their statistics achievements. The aim of the study was to test the psychometric properties of the Serbian version of the Survey of Attitudes Towards Statistics (SATS) in order to acquire a valid instrument to measure attitudes inside the Serbian educational context. Methods The validation study was performed on a cohort of 417 medical students who were enrolled in an obligatory introductory statistics course. The SATS adaptation was based on an internationally accepted methodology for translation and cultural adaptation. Psychometric properties of the Serbian version of the SATS were analyzed through the examination of factorial structure and internal consistency. Results Most medical students held positive attitudes towards statistics. The average total SATS score was above neutral (4.3±0.8), and varied from 1.9 to 6.2. Confirmatory factor analysis validated the six-factor structure of the questionnaire (Affect, Cognitive Competence, Value, Difficulty, Interest and Effort). Values for fit indices TLI (0.940) and CFI (0.961) were above the cut-off of ≥0.90. The RMSEA value of 0.064 (0.051–0.078) was below the suggested value of ≤0.08. Cronbach’s alpha of the entire scale was 0.90, indicating scale reliability. In a multivariate regression model, self-rating of ability in mathematics and current grade point average were significantly associated with the total SATS score after adjusting for age and gender. Conclusion The present study provided evidence of the appropriate metric properties of the Serbian version of the SATS. Confirmatory factor analysis validated the six-factor structure of the scale. The SATS may be a reliable and valid instrument for identifying medical students’ attitudes towards statistics in the Serbian educational context. PMID:25405489

  3. Assessing attitudes towards statistics among medical students: psychometric properties of the Serbian version of the Survey of Attitudes Towards Statistics (SATS).

    PubMed

    Stanisavljevic, Dejana; Trajkovic, Goran; Marinkovic, Jelena; Bukumiric, Zoran; Cirkovic, Andja; Milic, Natasa

    2014-01-01

    Medical statistics has become important and relevant for future doctors, enabling them to practice evidence based medicine. Recent studies report that students' attitudes towards statistics play an important role in their statistics achievements. The aim of the study was to test the psychometric properties of the Serbian version of the Survey of Attitudes Towards Statistics (SATS) in order to acquire a valid instrument to measure attitudes inside the Serbian educational context. The validation study was performed on a cohort of 417 medical students who were enrolled in an obligatory introductory statistics course. The SATS adaptation was based on an internationally accepted methodology for translation and cultural adaptation. Psychometric properties of the Serbian version of the SATS were analyzed through the examination of factorial structure and internal consistency. Most medical students held positive attitudes towards statistics. The average total SATS score was above neutral (4.3±0.8), and varied from 1.9 to 6.2. Confirmatory factor analysis validated the six-factor structure of the questionnaire (Affect, Cognitive Competence, Value, Difficulty, Interest and Effort). Values for fit indices TLI (0.940) and CFI (0.961) were above the cut-off of ≥0.90. The RMSEA value of 0.064 (0.051-0.078) was below the suggested value of ≤0.08. Cronbach's alpha of the entire scale was 0.90, indicating scale reliability. In a multivariate regression model, self-rating of ability in mathematics and current grade point average were significantly associated with the total SATS score after adjusting for age and gender. The present study provided evidence of the appropriate metric properties of the Serbian version of the SATS. Confirmatory factor analysis validated the six-factor structure of the scale. The SATS may be a reliable and valid instrument for identifying medical students' attitudes towards statistics in the Serbian educational context.
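
    Of the psychometric quantities reported in these two records, Cronbach's alpha is the simplest to reproduce: alpha = k/(k-1) * (1 - sum of item variances / variance of the total score). The sketch below applies it to a simulated response matrix; the respondent count matches the study (417), but the item scores are synthetic and the 28-item count is an assumption.

      import numpy as np

      def cronbach_alpha(items):
          """Cronbach's alpha for an (n_respondents, n_items) score matrix."""
          items = np.asarray(items, dtype=float)
          k = items.shape[1]
          item_var = items.var(axis=0, ddof=1).sum()
          total_var = items.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1 - item_var / total_var)

      rng = np.random.default_rng(1)
      latent = rng.normal(size=(417, 1))                   # one attitude factor
      scores = latent + 0.8 * rng.normal(size=(417, 28))   # hypothetical items
      print(cronbach_alpha(scores))                        # high for these settings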

  4. Statistics of optical and geometrical properties of cirrus cloud over tibetan plateau measured by lidar and radiosonde

    NASA Astrophysics Data System (ADS)

    Dai, Guangyao; Wu, Songhua; Song, Xiaoquan; Zhai, Xiaochun

    2018-04-01

    Cirrus clouds affect the energy budget and hydrological cycle of the earth's atmosphere. The Tibetan Plateau (TP) plays a significant role in the global and regional climate. Optical and geometrical properties of cirrus clouds in the TP were measured in July-August 2014 by lidar and radiosonde. The statistics and temperature dependences of the corresponding properties are analyzed. The cirrus cloud formations are discussed with respect to temperature deviation and dynamic processes.

  5. Evaluation of the Kinetic Property of Single-Molecule Junctions by Tunneling Current Measurements.

    PubMed

    Harashima, Takanori; Hasegawa, Yusuke; Kiguchi, Manabu; Nishino, Tomoaki

    2018-01-01

    We investigated the formation and breaking of single-molecule junctions of two kinds of dithiol molecules by time-resolved tunneling current measurements in a metal nanogap. The resulting current trajectory was statistically analyzed to determine the single-molecule conductance and, more importantly, to reveal the kinetic property of the single-molecular junction. These results suggested that combining a measurement of the single-molecule conductance and statistical analysis is a promising method to uncover the kinetic properties of the single-molecule junction.

  6. Statistically based material properties: A military handbook-17 perspective

    NASA Technical Reports Server (NTRS)

    Neal, Donald M.; Vangel, Mark G.

    1990-01-01

    The statistical procedures and their importance in obtaining composite material property values for designing aircraft structures and military combat systems are described. The property value is defined such that the material strength exceeds it with a prescribed probability, asserted with 95 percent confidence. The survival probabilities are 99 percent and 90 percent for the A- and B-basis values, respectively. The basis values for strain-to-failure measurements are defined in a similar manner. The B value is the primary concern.
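
    For the normal-distribution model, the basis values described here are one-sided lower tolerance bounds computable from the noncentral t distribution. The sketch below follows that standard construction; the coupon strengths are hypothetical, and MIL-HDBK-17 also prescribes procedures for non-normal models that this ignores.

      import numpy as np
      from scipy.stats import nct, norm

      def basis_value(x, content=0.90, confidence=0.95):
          """One-sided lower tolerance bound assuming normally distributed strength.

          content=0.90 gives a B-basis value (90 percent survival), content=0.99
          an A-basis value (99 percent survival), each with 95 percent confidence.
          """
          x = np.asarray(x, dtype=float)
          n = x.size
          k = nct.ppf(confidence, df=n - 1, nc=norm.ppf(content) * np.sqrt(n)) / np.sqrt(n)
          return x.mean() - k * x.std(ddof=1)

      rng = np.random.default_rng(2)
      strengths = rng.normal(100.0, 8.0, size=30)    # hypothetical coupon data, ksi
      print(basis_value(strengths))                  # B-basis
      print(basis_value(strengths, content=0.99))    # A-basis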

  7. 40 CFR Appendix K to Part 50 - Interpretation of the National Ambient Air Quality Standards for Particulate Matter

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., other techniques, such as the use of statistical models or the use of historical data could be..., mathematical techniques should be applied to account for the trends to ensure that the expected annual values... emission patterns, either the most recent representative year(s) could be used or statistical techniques or...

  8. Understanding regulatory networks requires more than computing a multitude of graph statistics. Comment on "Drivers of structural features in gene regulatory networks: From biophysical constraints to biological function" by O.C. Martin et al.

    NASA Astrophysics Data System (ADS)

    Tkačik, Gašper

    2016-07-01

    The article by O. Martin and colleagues provides a much needed systematic review of a body of work that relates the topological structure of genetic regulatory networks to evolutionary selection for function. This connection is very important. Using the current wealth of genomic data, statistical features of regulatory networks (e.g., degree distributions, motif composition, etc.) can be quantified rather easily; it is, however, often unclear how to interpret the results. On a graph theoretic level the statistical significance of the results can be evaluated by comparing observed graphs to "randomized" ones (bravely ignoring the issue of how precisely to randomize!) and comparing the frequency of appearance of a particular network structure relative to a randomized null expectation. While this is a convenient operational test for statistical significance, its biological meaning is questionable. In contrast, an in-silico genotype-to-phenotype model makes explicit the assumptions about the network function, and thus clearly defines the expected network structures that can be compared to the case of no selection for function and, ultimately, to data.
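
    The operational test described, comparing an observed network statistic to its distribution over randomized graphs, can be sketched in a few lines. The example below uses degree-preserving edge swaps, one common (and, as the comment notes, debatable) randomization choice, and a stand-in graph rather than a real regulatory network.

      import networkx as nx
      import numpy as np

      def clustering_z_score(G, n_null=200, seed=0):
          """Z-score of average clustering against a degree-preserving null model."""
          rng = np.random.default_rng(seed)
          obs = nx.average_clustering(G)
          null = []
          for _ in range(n_null):
              H = G.copy()
              nx.double_edge_swap(H, nswap=4 * H.number_of_edges(),
                                  max_tries=100 * H.number_of_edges(),
                                  seed=int(rng.integers(1 << 31)))
              null.append(nx.average_clustering(H))
          null = np.asarray(null)
          return (obs - null.mean()) / null.std(ddof=1)

      G = nx.karate_club_graph()   # stand-in for a regulatory network
      print(clustering_z_score(G))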

  9. GeneLab

    NASA Technical Reports Server (NTRS)

    Berrios, Daniel C.; Thompson, Terri G.

    2015-01-01

    NASA GeneLab is expected to capture and distribute omics data, along with the experimental and process conditions most relevant to the research community for its statistical and theoretical analyses of NASA's omics data.

  10. Comparing Data Sets: Implicit Summaries of the Statistical Properties of Number Sets

    ERIC Educational Resources Information Center

    Morris, Bradley J.; Masnick, Amy M.

    2015-01-01

    Comparing datasets, that is, sets of numbers in context, is a critical skill in higher order cognition. Although much is known about how people compare single numbers, little is known about how number sets are represented and compared. We investigated how subjects compared datasets that varied in their statistical properties, including ratio of…

  11. Best PhD thesis Prize: Statistical analysis of ALFALFA galaxies: insights in galaxy

    NASA Astrophysics Data System (ADS)

    Papastergis, E.

    2013-09-01

    We use the rich dataset of local universe galaxies detected by the ALFALFA 21cm survey to study the statistical properties of gas-bearing galaxies. In particular, we measure the number density of galaxies as a function of their baryonic mass ("baryonic mass function") and rotational velocity ("velocity width function"), and we characterize their clustering properties ("two-point correlation function"). These statistical distributions are determined by both the properties of dark matter on small scales, as well as by the complex baryonic processes through which galaxies form over cosmic time. We interpret the ALFALFA measurements with the aid of publicly available cosmological N-body simulations and we present some key results related to galaxy formation and small-scale cosmology.

  12. Primary expectations of secondary metabolites

    USDA-ARS?s Scientific Manuscript database

    Plant secondary metabolites (e.g., phenolics) are important for human health, in addition to the organoleptic properties they impart to fresh and processed foods. Consumer expectations such as appearance, taste, or texture influence their purchasing decisions. Thorough identification of phenolic com...

  13. Primary expectations of secondary metabolites

    USDA-ARS?s Scientific Manuscript database

    My program examines the plant secondary metabolites (i.e. phenolics) important for human health, and which impart the organoleptic properties that are quality indicators for fresh and processed foods. Consumer expectations such as appearance, taste, or texture influence their purchasing decisions; a...

  14. 24 CFR 581.4 - Suitability determination.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... earlier than six months prior to the expected date when the property will become unutilized or... determination for a particular piece of property, and the reasons for that determination; and (2) The...

  15. Statistical properties of Chinese phonemic networks

    NASA Astrophysics Data System (ADS)

    Yu, Shuiyuan; Liu, Haitao; Xu, Chunshan

    2011-04-01

    The study of the properties of speech sound systems is of great significance in understanding the human cognitive mechanism and the working principles of speech sound systems. Some properties of speech sound systems, such as the listener-oriented feature and the talker-oriented feature, have been unveiled through the statistical study of phonemes in human languages and research on the interrelations between human articulatory gestures and the corresponding acoustic parameters. Treating all the phonemes of a speech sound system as a coherent whole, our research, which focuses on the dynamic properties of speech sound systems in operation, investigates some statistical parameters of Chinese phoneme networks based on real text and dictionaries. The findings are as follows: phonemic networks have high connectivity degrees and short average distances; the degrees follow a normal distribution and the weighted degrees follow a power-law distribution; vowels enjoy higher priority than consonants in the actual operation of speech sound systems; and the phonemic networks have high robustness against targeted attacks and random errors. In addition, to investigate the structural properties of a speech sound system, a statistical study of dictionaries is conducted, which shows the higher frequency of shorter words and syllables and the tendency that the longer a word is, the shorter the syllables composing it are. From these structural and dynamic properties one can draw the following conclusion: the static structure of a speech sound system tends to promote communication efficiency and save articulation effort, while the dynamic operation of this system gives preference to reliable transmission and easy recognition. In short, a speech sound system is an effective, efficient and reliable communication system optimized in many aspects.

  16. Optimal periodic proof test based on cost-effective and reliability criteria

    NASA Technical Reports Server (NTRS)

    Yang, J.-N.

    1976-01-01

    An exploratory study for the optimization of periodic proof tests for fatigue-critical structures is presented. The optimal proof load level and the optimal number of periodic proof tests are determined by minimizing the total expected (statistical average) cost, while the constraint on the allowable level of structural reliability is satisfied. The total expected cost consists of the expected cost of proof tests, the expected cost of structures destroyed by proof tests, and the expected cost of structural failure in service. It is demonstrated by numerical examples that significant cost saving and reliability improvement for fatigue-critical structures can be achieved by the application of the optimal periodic proof test. The present study is relevant to the establishment of optimal maintenance procedures for fatigue-critical structures.
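
    The structure of the optimization can be illustrated with a toy Monte Carlo model: pick a proof load level, pay for units destroyed by the test, and trade that against the expected cost of in-service failures among the survivors. All distributions and costs below are hypothetical stand-ins for the paper's fatigue reliability model, and the number of periodic tests is not optimized here.

      import numpy as np

      rng = np.random.default_rng(3)
      n = 200_000
      strength = rng.lognormal(mean=0.0, sigma=0.15, size=n)       # normalized strength
      load = rng.lognormal(mean=np.log(0.6), sigma=0.20, size=n)   # service load

      c_test, c_destroyed, c_failure = 1.0, 50.0, 5000.0           # relative costs

      def expected_cost(proof_level):
          destroyed = strength < proof_level                  # unit fails the proof test
          survive = ~destroyed
          p_fail = (load[survive] > strength[survive]).mean() # failure in service
          cost = c_test + destroyed.mean() * c_destroyed + p_fail * c_failure
          return cost, p_fail

      for r in [0.0, 0.6, 0.7, 0.8, 0.9]:
          cost, pf = expected_cost(r)
          print(f"proof level {r:.1f}: expected cost {cost:8.2f}, P(fail) {pf:.5f}")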

  17. Prospects for Probing Strong Gravity with a Pulsar-Black Hole System

    NASA Technical Reports Server (NTRS)

    Wex, N.; Liu, K.; Eatough, R. P.; Kramer, M.; Cordes, J. M.; Lazio, T. J. W.

    2012-01-01

    The discovery of a pulsar (PSR) in orbit around a black hole (BH) is expected to provide a superb new probe of relativistic gravity and BH properties. Apart from a precise mass measurement for the BH, one could expect a clean verification of the dragging of space-time caused by the BH spin. In order to measure the quadrupole moment of the BH for testing the no-hair theorem of general relativity (GR), one has to hope for a sufficiently massive BH. In this respect, a PSR orbiting the super-massive BH in the center of our Galaxy would be the ultimate laboratory for gravity tests with PSRs. But even for gravity theories that predict the same properties for BHs as GR, a PSR-BH system would constitute an excellent test system, due to the high grade of asymmetry in the strong field properties of these two components. Here we highlight some of the potential gravity tests that one could expect from different PSR-BH systems.

  18. On the linearity of tracer bias around voids

    NASA Astrophysics Data System (ADS)

    Pollina, Giorgia; Hamaus, Nico; Dolag, Klaus; Weller, Jochen; Baldi, Marco; Moscardini, Lauro

    2017-07-01

    The large-scale structure of the Universe can be observed only via luminous tracers of the dark matter. However, the clustering statistics of tracers are biased and depend on various properties, such as their host-halo mass and assembly history. On very large scales, this tracer bias results in a constant offset in the clustering amplitude, known as linear bias. Towards smaller non-linear scales, this is no longer the case and tracer bias becomes a complicated function of scale and time. We focus on tracer bias centred on cosmic voids, i.e. depressions of the density field that spatially dominate the Universe. We consider three types of tracers: galaxies, galaxy clusters and active galactic nuclei, extracted from the hydrodynamical simulation Magneticum Pathfinder. In contrast to common clustering statistics that focus on auto-correlations of tracers, we find that void-tracer cross-correlations are successfully described by a linear bias relation. The tracer-density profile of voids can thus be related to their matter-density profile by a single number. We show that it coincides with the linear tracer bias extracted from the large-scale auto-correlation function and expectations from theory, if sufficiently large voids are considered. For smaller voids we observe a shift towards higher values. This has important consequences for cosmological parameter inference, as the problem of unknown tracer bias is alleviated up to a constant number. The smallest scales in existing data sets become accessible to simpler models, providing numerous modes of the density field that have been disregarded so far, but may help to further reduce statistical errors in constraining cosmology.

  19. Study of aluminum particle combustion in solid propellant plumes using digital in-line holography and imaging pyrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, Yi; Guildenbecher, Daniel R.; Hoffmeister, Kathryn N. G.

    The combustion of molten metals is an important area of study with applications ranging from solid aluminized rocket propellants to fireworks displays. Our work uses digital in-line holography (DIH) to experimentally quantify the three-dimensional position, size, and velocity of aluminum particles during combustion of ammonium perchlorate (AP) based solid-rocket propellants. Additionally, spatially resolved particle temperatures are simultaneously measured using two-color imaging pyrometry. To allow for fast characterization of the properties of tens of thousands of particles, automated data processing routines are proposed. Using these methods, statistics from aluminum particles with diameters ranging from 15 to 900 µm are collected at an ambient pressure of 83 kPa. In the first set of DIH experiments, increasing initial propellant temperature is shown to enhance the agglomeration of nascent aluminum at the burning surface, resulting in ejection of large molten aluminum particles into the exhaust plume. The resulting particle number and volume distributions are quantified. In the second set of simultaneous DIH and pyrometry experiments, particle size and velocity relationships as well as temperature statistics are explored. The average measured temperatures are found to be 2640 ± 282 K, which compares well with previous estimates of the range of particle and gas-phase temperatures. The novel methods proposed here represent new capabilities for simultaneous quantification of the joint size, velocity, and temperature statistics during the combustion of molten metal particles. The proposed techniques are expected to be useful for detailed performance assessment of metalized solid-rocket propellants.

  20. Combining optical remote sensing, agricultural statistics and field observations for culture recognition over a peri-urban region

    NASA Astrophysics Data System (ADS)

    Delbart, Nicolas; Emmanuelle, Vaudour; Fabienne, Maignan; Catherine, Ottlé; Jean-Marc, Gilliot

    2017-04-01

    This study explores the potential of multi-temporal optical remote sensing, with high revisit frequency, to derive missing information on the agricultural calendar and crop types over the agricultural lands of the Versailles plain in the western Paris suburbs. It complements past and ongoing studies on the use of radar and high-spatial-resolution optical remote sensing to monitor agricultural practices in this study area (e.g. Vaudour et al. 2014). Agricultural statistics, such as the Land Parcel Identification System (LPIS) for France, make it possible to know the nature of annual crops for each digitized declared field of this land parcel registry. However, within each declared field several cropped plots and a diversity of practices may exist, marked by agricultural rotations which vary both spatially and temporally within it and differ from one year to the next. Even though the new LPIS to be released in 2016 is expected to describe individual plots within declared fields, its attributes may not make it possible to discriminate between winter and spring crops. Here we evaluate the potential of high-observation-frequency remote sensing to differentiate seasonal crops, based essentially on the seasonality of their spectral properties. In particular, we use Landsat data to spatially disaggregate the LPIS statistical data, on the basis of an analysis of the remote sensing spectral seasonality measured on a number of ground-observed fields. This work is carried out in the framework of the CNES TOSCA-PLEIADES-CO project of the French Space Agency.

  1. Study of aluminum particle combustion in solid propellant plumes using digital in-line holography and imaging pyrometry

    DOE PAGES

    Chen, Yi; Guildenbecher, Daniel R.; Hoffmeister, Kathryn N. G.; ...

    2017-05-05

    The combustion of molten metals is an important area of study with applications ranging from solid aluminized rocket propellants to fireworks displays. Our work uses digital in-line holography (DIH) to experimentally quantify the three-dimensional position, size, and velocity of aluminum particles during combustion of ammonium perchlorate (AP) based solid-rocket propellants. Additionally, spatially resolved particle temperatures are simultaneously measured using two-color imaging pyrometry. To allow for fast characterization of the properties of tens of thousands of particles, automated data processing routines are proposed. Using these methods, statistics from aluminum particles with diameters ranging from 15 to 900 µm are collected at an ambient pressure of 83 kPa. In the first set of DIH experiments, increasing initial propellant temperature is shown to enhance the agglomeration of nascent aluminum at the burning surface, resulting in ejection of large molten aluminum particles into the exhaust plume. The resulting particle number and volume distributions are quantified. In the second set of simultaneous DIH and pyrometry experiments, particle size and velocity relationships as well as temperature statistics are explored. The average measured temperatures are found to be 2640 ± 282 K, which compares well with previous estimates of the range of particle and gas-phase temperatures. The novel methods proposed here represent new capabilities for simultaneous quantification of the joint size, velocity, and temperature statistics during the combustion of molten metal particles. The proposed techniques are expected to be useful for detailed performance assessment of metalized solid-rocket propellants.

  2. Statistical properties of two sine waves in Gaussian noise.

    NASA Technical Reports Server (NTRS)

    Esposito, R.; Wilson, L. R.

    1973-01-01

    A detailed study is presented of some statistical properties of a stochastic process consisting of the sum of two sine waves of unknown relative phase and a normal process. Since none of the statistics investigated seems to yield a closed-form expression, all the derivations are cast in a form that is particularly suitable for machine computation. Specifically, results are presented for the probability density function (pdf) of the envelope and of the instantaneous value, the moments of these distributions, and the corresponding cumulative distribution function (cdf).
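
    Since the paper emphasizes forms suitable for machine computation, a Monte Carlo cross-check of the envelope pdf is natural. The sketch below uses a complex-baseband formulation with hypothetical amplitudes and noise level: two phasors of random phase plus circular Gaussian noise.

      import numpy as np

      rng = np.random.default_rng(4)
      n = 1_000_000
      a1, a2, sigma = 1.0, 0.7, 0.5        # hypothetical amplitudes and noise std

      phi1 = rng.uniform(0, 2 * np.pi, n)
      phi2 = rng.uniform(0, 2 * np.pi, n)
      noise = sigma * (rng.standard_normal(n) + 1j * rng.standard_normal(n))
      z = a1 * np.exp(1j * phi1) + a2 * np.exp(1j * phi2) + noise

      envelope = np.abs(z)
      hist, edges = np.histogram(envelope, bins=200, density=True)  # empirical pdf
      print("mean envelope:", envelope.mean())
      print("pdf mode near:", edges[np.argmax(hist)])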

  3. The halo Boltzmann equation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Biagetti, Matteo; Desjacques, Vincent; Kehagias, Alex

    2016-04-01

    Dark matter halos are the building blocks of the universe as they host galaxies and clusters. The knowledge of the clustering properties of halos is therefore essential for the understanding of the galaxy statistical properties. We derive an effective halo Boltzmann equation which can be used to describe the halo clustering statistics. In particular, we show how the halo Boltzmann equation encodes a statistically biased gravitational force which generates a bias in the peculiar velocities of virialized halos with respect to the underlying dark matter, as recently observed in N-body simulations.

  4. Information Entropy Production of Maximum Entropy Markov Chains from Spike Trains

    NASA Astrophysics Data System (ADS)

    Cofré, Rodrigo; Maldonado, Cesar

    2018-01-01

    We consider the maximum entropy Markov chain inference approach to characterize the collective statistics of neuronal spike trains, focusing on the statistical properties of the inferred model. We review large deviations techniques useful in this context to describe properties of accuracy and convergence in terms of sampling size. We use these results to study the statistical fluctuation of correlations, distinguishability and irreversibility of maximum entropy Markov chains. We illustrate these applications using simple examples where the large deviation rate function is explicitly obtained for maximum entropy models of relevance in this field.

  5. 41 CFR 101-27.203 - Program objectives.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ...) Conduct inventory management analyses to determine if shelf-life stocks are expected to be utilized prior... 41 Public Contracts and Property Management 2 2010-07-01 2010-07-01 true Program objectives. 101-27.203 Section 101-27.203 Public Contracts and Property Management Federal Property Management...

  6. Modelling 1-minute directional observations of the global irradiance.

    NASA Astrophysics Data System (ADS)

    Thejll, Peter; Pagh Nielsen, Kristian; Andersen, Elsa; Furbo, Simon

    2016-04-01

    Direct and diffuse irradiances from the sky have been collected at 1-minute intervals for about a year at the experimental station of the Technical University of Denmark for the IEA project "Solar Resource Assessment and Forecasting". These data were gathered by pyrheliometers tracking the Sun, as well as by apertured pyranometers gathering 1/8th and 1/16th of the light from the sky in 45 degree azimuthal ranges pointed around the compass. The data are gathered in order to develop detailed models of the potentially available solar energy and its variations at high temporal resolution, and thereby to gain a more detailed understanding of the solar resource. This is important for a better understanding of the sub-grid scale cloud variation that cannot be resolved with climate and weather models. It is also important for optimizing the operation of active solar energy systems such as photovoltaic plants and thermal solar collector arrays, and for passive solar energy and lighting in buildings. We present regression-based modelling of the observed data and focus, here, on the statistical properties of the model fits. Using models based on the one hand on what is found in the literature and on physical expectations, and on the other hand on purely statistical models, we find solutions that can explain up to 90% of the variance in global radiation. The models leaning on physical insights include terms for the direct solar radiation, a term for the circum-solar radiation, a diffuse term and a term for the horizon brightening/darkening. The purely statistical model is found using data- and formula-validation approaches that pick model expressions from a general catalogue of possible formulae. The method allows nesting of expressions, and the results found are dependent on and heavily constrained by the cross-validation carried out on statistically independent testing and training data-sets. Slightly better fits, in terms of variance explained, are found using the purely statistical fitting/searching approach. We describe the methods applied, the results found, and the different potentials of the physics- and statistics-only based model searches.
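
    A stripped-down version of the physics-motivated regression can be written as ordinary least squares on a handful of irradiance terms, scored on held-out data as in the cross-validation described. Everything below, the term definitions, coefficients, and noise level, is synthetic; it only illustrates the fitting and validation mechanics.

      import numpy as np

      rng = np.random.default_rng(5)
      n = 5000
      direct  = rng.uniform(0, 900, n)                 # W/m^2, synthetic
      circum  = 0.1 * direct + rng.normal(0, 5, n)     # circum-solar term
      diffuse = rng.uniform(0, 300, n)
      horizon = rng.normal(0, 20, n)                   # horizon brightening/darkening
      y = 0.9 * direct + 0.5 * circum + 0.8 * diffuse + 0.2 * horizon + rng.normal(0, 15, n)

      X = np.column_stack([direct, circum, diffuse, horizon, np.ones(n)])
      train, test = slice(0, 4000), slice(4000, None)  # independent train/test split
      beta, *_ = np.linalg.lstsq(X[train], y[train], rcond=None)
      resid = y[test] - X[test] @ beta
      print(f"variance explained on held-out data: {1 - resid.var() / y[test].var():.3f}")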

  7. Structural and molecular basis of starch viscosity in hexaploid wheat.

    PubMed

    Ral, J-P; Cavanagh, C R; Larroque, O; Regina, A; Morell, M K

    2008-06-11

    Wheat starch is considered to have a low paste viscosity relative to other starches. Consequently, wheat starch is not preferred for many applications as compared to other high paste viscosity starches. Increasing the viscosity of wheat starch is expected to increase the functionality of a range of wheat flour-based products in which the texture is an important aspect of consumer acceptance (e.g., pasta, and instant and yellow alkaline noodles). To understand the molecular basis of starch viscosity, we have undertaken a comprehensive structural and rheological analysis of starches from a genetically diverse set of wheat genotypes, which revealed significant variation in starch traits including starch granule protein content, starch-associated lipid content and composition, phosphate content, and the structures of the amylose and amylopectin fractions. Statistical analysis highlighted the association between amylopectin chains of 18-25 glucose residues and starch pasting properties. Principal component analysis also identified an association between monoesterified phosphate and starch pasting properties in wheat despite the low starch-phosphate level in wheat as compared to tuber starches. We also found a strong negative correlation between the phosphate ester content and the starch content in flour. Previously observed associations between internal starch granule fatty acids and the swelling peak time and pasting temperature have been confirmed. This study has highlighted a range of parameters associated with increased starch viscosity that could be used in prebreeding/breeding programs to modify wheat starch pasting properties.

  8. The Dutch-Flemish PROMIS Physical Function item bank exhibited strong psychometric properties in patients with chronic pain.

    PubMed

    Crins, Martine H P; Terwee, Caroline B; Klausch, Thomas; Smits, Niels; de Vet, Henrica C W; Westhovens, Rene; Cella, David; Cook, Karon F; Revicki, Dennis A; van Leeuwen, Jaap; Boers, Maarten; Dekker, Joost; Roorda, Leo D

    2017-07-01

    The objective of this study was to assess the psychometric properties of the Dutch-Flemish Patient-Reported Outcomes Measurement Information System (PROMIS) Physical Function item bank in Dutch patients with chronic pain. A bank of 121 items was administered to 1,247 Dutch patients with chronic pain. Unidimensionality was assessed by fitting a one-factor confirmatory factor analysis and evaluating the resulting fit statistics. Items were calibrated with the graded response model and its fit was evaluated. Cross-cultural validity was assessed by testing items for differential item functioning (DIF) based on language (Dutch vs. English). Construct validity was evaluated by calculating correlations between scores on the Dutch-Flemish PROMIS Physical Function measure and scores on generic and disease-specific measures. Results supported the Dutch-Flemish PROMIS Physical Function item bank's unidimensionality (Comparative Fit Index = 0.976, Tucker Lewis Index = 0.976) and model fit. Item thresholds targeted a wide range of the physical function construct (threshold-parameters range: -4.2 to 5.6). Cross-cultural validity was good, as only four items showed DIF for language and their impact on item scores was minimal. Physical Function scores were strongly associated with scores on all other measures (all correlations ≤ -0.60 as expected). The Dutch-Flemish PROMIS Physical Function item bank exhibited good psychometric properties. Development of a computer adaptive test based on the large bank is warranted. Copyright © 2017 Elsevier Inc. All rights reserved.

  9. Architecture of marine food webs: To be or not be a 'small-world'.

    PubMed

    Marina, Tomás Ignacio; Saravia, Leonardo A; Cordone, Georgina; Salinas, Vanesa; Doyle, Santiago R; Momo, Fernando R

    2018-01-01

    The search for general properties in network structure has been a central issue for food web studies in recent years. One such property is the small-world topology, which combines high clustering with a small distance between nodes of the network. This property may increase food web resilience, but it may also make food webs more sensitive to the extinction of connected species. Food web theory has been developed principally from freshwater and terrestrial ecosystems, largely omitting marine habitats. Whether theory needs to be modified to accommodate observations from marine ecosystems, given major differences in several topological characteristics, is still under debate. Here we investigated whether the small-world topology is a common structural pattern in marine food webs. We developed a novel, simple and statistically rigorous method to examine the largest set of complex marine food webs to date. More than half of the analyzed marine networks exhibited a similar or lower characteristic path length than the random expectation, whereas 39% of the webs presented a significantly higher clustering than their random counterparts. Our method showed that 5 out of 28 networks fulfilled both features of the small-world topology: short path length and high clustering. This work represents the first rigorous analysis of the small-world topology and its associated features in high-quality marine networks. We conclude that such topology is a structural pattern that is not maximized in marine food webs; thus it is probably not an effective model to study robustness, stability and feasibility of marine ecosystems.
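
    The two ingredients of the small-world test, clustering and characteristic path length compared against a random expectation, can be sketched as follows. This uses same-size Erdos-Renyi graphs as the null, a simpler choice than the statistically rigorous method the paper develops, and a stand-in graph rather than a marine food web.

      import networkx as nx
      import numpy as np

      def small_world_check(G, n_null=100, seed=0):
          """Observed vs. null-model clustering C and characteristic path length L."""
          rng = np.random.default_rng(seed)
          C_obs = nx.average_clustering(G)
          L_obs = nx.average_shortest_path_length(G)
          C_null, L_null = [], []
          n, m = G.number_of_nodes(), G.number_of_edges()
          while len(C_null) < n_null:
              H = nx.gnm_random_graph(n, m, seed=int(rng.integers(1 << 31)))
              if not nx.is_connected(H):        # path length needs connectedness
                  continue
              C_null.append(nx.average_clustering(H))
              L_null.append(nx.average_shortest_path_length(H))
          return (C_obs, float(np.mean(C_null))), (L_obs, float(np.mean(L_null)))

      (C, C0), (L, L0) = small_world_check(nx.karate_club_graph())
      print(f"C={C:.3f} vs null {C0:.3f}; L={L:.3f} vs null {L0:.3f}")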

  10. 41 CFR 101-27.101 - General.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... FEDERAL PROPERTY MANAGEMENT REGULATIONS SUPPLY AND PROCUREMENT 27-INVENTORY MANAGEMENT 27.1-Stock... inventory which is that portion carried to satisfy average expected demand, and safety stock which is that portion carried for protection against stock depletion occurring when demand exceeds average expected...

  11. Global Health Observatory (GHO): Life Expectancy

    MedlinePlus


  12. Tables of square-law signal detection statistics for Hann spectra with 50 percent overlap

    NASA Technical Reports Server (NTRS)

    Deans, Stanley R.; Cullers, D. Kent

    1991-01-01

    The Search for Extraterrestrial Intelligence, currently being planned by NASA, will require that an enormous amount of data be analyzed in real time by special purpose hardware. It is expected that overlapped Hann data windows will play an important role in this analysis. In order to understand the statistical implications of this approach, it has been necessary to compute detection statistics for overlapped Hann spectra. Tables of signal detection statistics are given for false alarm rates from 10^-14 to 10^-1 and signal detection probabilities from 0.50 to 0.99; the number of computed spectra ranges from 4 to 2000.
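
    In the textbook model of a square-law detector, the statistic summed over N independent spectra is chi-square with 2N degrees of freedom under noise alone and noncentral chi-square when a signal is present, which is enough to reproduce threshold/detection pairs of the kind tabulated. The sketch below ignores the correlation introduced by 50 percent overlapped Hann windows, which is precisely what the paper's tables account for, so its numbers are only indicative.

      from scipy.stats import chi2, ncx2

      def detection_stats(n_spectra, pfa, snr_per_spectrum):
          """Square-law threshold for a given false alarm rate, plus detection prob."""
          df = 2 * n_spectra                      # 2 dof per complex spectral bin
          threshold = chi2.isf(pfa, df)           # set by the false alarm rate
          nc = 2 * n_spectra * snr_per_spectrum   # noncentrality from the signal
          return threshold, ncx2.sf(threshold, df, nc)

      print(detection_stats(n_spectra=100, pfa=1e-14, snr_per_spectrum=0.5))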

  13. Technical Note: Statistical dependences between channels in radiochromic film readings. Implications in multichannel dosimetry.

    PubMed

    González-López, Antonio; Vera-Sánchez, Juan Antonio; Ruiz-Morales, Carmen

    2016-05-01

    This note studies the statistical relationships between color channels in radiochromic film readings with flatbed scanners. The same relationships are studied for noise. Finally, their implications for multichannel film dosimetry are discussed. Radiochromic films exposed to wedged fields of 6 MV energy were read in a flatbed scanner. The joint histograms of pairs of color channels were used to obtain the joint and conditional probability density functions between channels. Then, the conditional expectations and variances of one channel given another channel were obtained. Noise was extracted from film readings by means of a multiresolution analysis. Two different dose ranges were analyzed, the first one ranging from 112 to 473 cGy and the second one from 52 to 1290 cGy. For the smallest dose range, the conditional expectations of one channel given another channel can be approximated by linear functions, while the conditional variances are fairly constant. The slopes of the linear relationships between channels can be used to simplify the expression that estimates the dose by means of the multichannel method. The slopes of the linear relationships between each channel and the red one can also be interpreted as weights in the final contribution to dose estimation. However, for the largest dose range, the conditional expectations of one channel given another channel are no longer linear functions. Finally, noises in different channels were found to correlate weakly. Signals present in different channels of radiochromic film readings show a strong statistical dependence. By contrast, noise correlates weakly between channels. For the smallest dose range analyzed, the linear behavior between the conditional expectation of one channel given another channel can be used to simplify calculations in multichannel film dosimetry.
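
    The channel-to-channel conditional expectations described here come straight from the joint histogram. The sketch below estimates E[channel B | channel A] and the slope of its linear part on synthetic correlated "readings"; real film scans would replace the simulated arrays.

      import numpy as np

      def conditional_expectation(ch_a, ch_b, bins=64):
          """E[ch_b | ch_a] estimated from the joint histogram of two channels."""
          H, a_edges, b_edges = np.histogram2d(ch_a.ravel(), ch_b.ravel(), bins=bins)
          a_centers = 0.5 * (a_edges[:-1] + a_edges[1:])
          b_centers = 0.5 * (b_edges[:-1] + b_edges[1:])
          with np.errstate(invalid="ignore"):
              cond_mean = (H * b_centers).sum(axis=1) / H.sum(axis=1)
          return a_centers, cond_mean

      rng = np.random.default_rng(6)
      red = rng.uniform(0.3, 0.9, 100_000)                         # synthetic channel
      green = 0.6 * red + 0.2 + 0.01 * rng.standard_normal(red.size)
      a, m = conditional_expectation(red, green)
      ok = ~np.isnan(m)
      print("fitted slope:", np.polyfit(a[ok], m[ok], 1)[0])       # near 0.6 here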

  14. A new approach to process control using Instability Index

    NASA Astrophysics Data System (ADS)

    Weintraub, Jeffrey; Warrick, Scott

    2016-03-01

    The merits of a robust Statistical Process Control (SPC) methodology have long been established. In response to the numerous SPC rule combinations, processes, and the high cost of containment, the Instability Index (ISTAB) is presented as a tool for managing these complexities. ISTAB focuses limited resources on key issues and provides a window into the stability of manufacturing operations. ISTAB takes advantage of the statistical nature of processes by comparing the observed average run length (OARL) to the expected run length (ARL), resulting in a gap value called the ISTAB index. The ISTAB index has three characteristic behaviors that are indicative of defects in an SPC instance. Case 1: The observed average run length is excessively long relative to expectation; ISTAB > 0 indicates the possibility that the limits are too wide. Case 2: The observed average run length is consistent with expectation; ISTAB near zero indicates that the process is stable. Case 3: The observed average run length is inordinately short relative to expectation; ISTAB < 0 indicates that the limits are too tight, that the process is unstable, or both. The probability distribution of run length is the basis for establishing an ARL. We demonstrate that the geometric distribution is a good approximation to run length across a wide variety of rule sets. Excessively long run lengths are associated with one kind of defect in an SPC instance; inordinately short run lengths are associated with another. A sampling distribution is introduced as a way to quantify excessively long and inordinately short observed run lengths. This paper provides detailed guidance for action limits on these run lengths. ISTAB as a statistical method of review facilitates automated instability detection. This paper proposes a management system based on ISTAB as an enhancement to more traditional SPC approaches.
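
    A minimal version of the run-length comparison is sketched below, under two assumptions flagged explicitly: run lengths are modeled as geometric (which the paper argues is a good approximation across rule sets), and the index is taken to be a simple relative gap between observed and expected average run length, which may differ from the paper's exact definition.

      import numpy as np

      def istab_index(run_lengths, p_signal):
          """Relative gap between observed and expected average run length.

          Assumes geometric run lengths, so the expected ARL is 1 / p_signal,
          where p_signal is the per-sample probability that the rule set signals.
          The relative-gap form here is an illustrative assumption.
          """
          oarl = np.mean(run_lengths)
          return (oarl - 1.0 / p_signal) * p_signal

      rng = np.random.default_rng(7)
      runs = rng.geometric(p=1 / 370.0, size=50)    # in-control 3-sigma chart, ARL ~ 370
      print(istab_index(runs, p_signal=1 / 370.0))  # near 0 for a stable process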

  15. Induced beliefs about a fictive energy drink influences 200-m sprint performance.

    PubMed

    de la Vega, Ricardo; Alberti, Sara; Ruíz-Barquín, Roberto; Soós, István; Szabo, Attila

    2017-09-01

    Placebo and nocebo effects occur in response to subjective expectations and their subsequent neural actions. Research shows that information shapes expectations that, consequently, influence people's behaviour. In this study, we examined the effects of a fictive and inert green-coloured energy drink provided to three groups (n = 20/group) with different information. The first group was led to expect that the drink augments running performance (positive information), the second group was led to expect that the drink may or may not improve performance (partial-positive information), while the third group was told that earlier research could not demonstrate that the drink improves performance (neutral/control). At baseline, the three groups did not differ in their 200-m sprint performance (p > .05). One week later, 20 min after ingesting the drink, all participants again ran 200 m. The positive information group improved its performance by 2.41 s, a statistically significant change (p < .001), and also perceived its sprint time as shorter (p < .05) than did the other two groups. A better performance (0.97 s) that approached but did not reach statistical significance was also noted in the partial-positive information group, and a smaller, statistically nonsignificant change (0.72 s) was noted in the neutral information control group. These results reveal that drinking an inert liquid, primed with positive information, changes both the actual and the self-perceived time on a 200-m sprint. The current findings also suggest that the level of certainty of the information might be linked to the magnitude of change in performance.

  16. Technical Note: Statistical dependences between channels in radiochromic film readings. Implications in multichannel dosimetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    González-López, Antonio, E-mail: antonio.gonzalez7@carm.es; Vera-Sánchez, Juan Antonio; Ruiz-Morales, Carmen

    Purpose: This note studies the statistical relationships between color channels in radiochromic film readings with flatbed scanners. The same relationships are studied for noise. Finally, their implications for multichannel film dosimetry are discussed. Methods: Radiochromic films exposed to wedged fields of 6 MV energy were read in a flatbed scanner. The joint histograms of pairs of color channels were used to obtain the joint and conditional probability density functions between channels. Then, the conditional expectations and variances of one channel given another channel were obtained. Noise was extracted from film readings by means of a multiresolution analysis. Two different dose ranges were analyzed, the first one ranging from 112 to 473 cGy and the second one from 52 to 1290 cGy. Results: For the smallest dose range, the conditional expectations of one channel given another channel can be approximated by linear functions, while the conditional variances are fairly constant. The slopes of the linear relationships between channels can be used to simplify the expression that estimates the dose by means of the multichannel method. The slopes of the linear relationships between each channel and the red one can also be interpreted as weights in the final contribution to dose estimation. However, for the largest dose range, the conditional expectations of one channel given another channel are no longer linear functions. Finally, noises in different channels were found to correlate weakly. Conclusions: Signals present in different channels of radiochromic film readings show a strong statistical dependence. By contrast, noise correlates weakly between channels. For the smallest dose range analyzed, the linear behavior between the conditional expectation of one channel given another channel can be used to simplify calculations in multichannel film dosimetry.

  17. Development and computer implementation of design/analysis techniques for multilayered composite structures. Probabilistic fiber composite micromechanics. M.S. Thesis, Mar. 1987 Final Report, 1 Sep. 1984 - 1 Oct. 1990

    NASA Technical Reports Server (NTRS)

    Stock, Thomas A.

    1995-01-01

    Probabilistic composite micromechanics methods are developed that simulate expected uncertainties in unidirectional fiber composite properties. These methods are in the form of computational procedures using Monte Carlo simulation. The variables in which uncertainties are accounted for include constituent and void volume ratios, constituent elastic properties and strengths, and fiber misalignment. A graphite/epoxy unidirectional composite (ply) is studied to demonstrate fiber composite material property variations induced by random changes expected at the material micro level. Regression results are presented to show the relative correlation between predictor and response variables in the study. These computational procedures make possible a formal description of anticipated random processes at the intraply level, and the related effects of these on composite properties.
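
    The flavor of these procedures can be conveyed with a rule-of-mixtures stand-in for the report's micromechanics equations: sample the constituent properties and volume ratios, push each draw through the model, and read off the statistics of the resulting ply property. All distributions below are hypothetical.

      import numpy as np

      rng = np.random.default_rng(8)
      n = 100_000
      Ef = rng.normal(230e9, 12e9, n)                   # fiber modulus, Pa
      Em = rng.normal(3.5e9, 0.3e9, n)                  # matrix modulus, Pa
      vf = np.clip(rng.normal(0.60, 0.03, n), 0, 1)     # fiber volume ratio
      vv = np.clip(rng.normal(0.02, 0.01, n), 0, 0.1)   # void volume ratio

      # Longitudinal ply modulus by rule of mixtures, degraded by voids.
      E11 = (Ef * vf + Em * (1 - vf)) * (1 - vv)
      print(f"E11: mean {E11.mean()/1e9:.1f} GPa, cov {E11.std()/E11.mean():.3f}")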

  18. A Simplified Algorithm for Statistical Investigation of Damage Spreading

    NASA Astrophysics Data System (ADS)

    Gecow, Andrzej

    2009-04-01

    On the way to simulating the adaptive evolution of a complex system describing a living object or a human-developed project, a fitness should be defined on node states or network external outputs. Feedbacks lead to circular attractors of these states or outputs, which make it difficult to define a fitness. The main statistical effects of the adaptive condition result from a tendency toward small changes; to appear, they only require a statistically correct size of the damage initiated by an evolutionary change of the system. This observation allows loops of feedback to be cut and, in effect, a particular statistically correct state to be obtained instead of the long circular attractor which the quenched model predicts for a chaotic network with feedback. Defining fitness on such states is simple. We calculate only damaged nodes, and only once. Such an algorithm is optimal for investigating damage spreading, i.e., the statistical connections between the structural parameters of the initial change and the size of the resulting damage. It is a reversed-annealed method: functions and states (signals) may be randomly substituted, but connections are important and are preserved. The small damages important for adaptive evolution are correctly depicted, in contrast to the Derrida annealed approximation, which expects equilibrium levels for large networks. The algorithm indicates these levels correctly. The relevant program in Pascal, which executes the algorithm for a wide range of parameters, can be obtained from the author.
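
    For orientation, damage spreading itself is easy to measure by brute force on a random Boolean network: run a trajectory and a one-bit-perturbed copy in parallel and count differing nodes. The sketch below does exactly that with feedbacks intact, which is the expensive approach the paper's loop-cutting algorithm is designed to avoid.

      import numpy as np

      rng = np.random.default_rng(9)
      N, K, T = 500, 2, 50                              # nodes, inputs per node, steps
      inputs = rng.integers(0, N, size=(N, K))          # wiring of the random network
      tables = rng.integers(0, 2, size=(N, 2 ** K))     # random Boolean functions

      def step(state):
          # Index each node's truth table by the binary word of its input states.
          idx = (state[inputs] * (2 ** np.arange(K))).sum(axis=1)
          return tables[np.arange(N), idx]

      a = rng.integers(0, 2, size=N)
      b = a.copy()
      b[0] ^= 1                                         # the initiating change
      for _ in range(T):
          a, b = step(a), step(b)
      print("damage size after", T, "steps:", int((a != b).sum()))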

  19. Fully Bayesian tests of neutrality using genealogical summary statistics.

    PubMed

    Drummond, Alexei J; Suchard, Marc A

    2008-10-31

    Many data summary statistics have been developed to detect departures from the neutral expectations of evolutionary models. However, the neutrality of the evolution of genetic loci within natural populations remains difficult to assess. One critical cause of this difficulty is that most methods for testing neutrality make simplifying assumptions simultaneously about the mutational model and the population size model. Consequently, rejecting the null hypothesis of neutrality under these methods could result from violations of either or both assumptions, making interpretation troublesome. Here we harness posterior predictive simulation to exploit summary statistics of both the data and the model parameters in testing the goodness-of-fit of standard models of evolution. We apply the method to test the selective neutrality of molecular evolution in non-recombining gene genealogies, and we demonstrate its utility on four real data sets, identifying significant departures from neutrality in human influenza A virus even after controlling for variation in population size. Importantly, by employing a full model-based Bayesian analysis, our method separates the effects of demography from the effects of selection. The method also allows multiple summary statistics to be used in concert, thus potentially increasing sensitivity. Furthermore, our method remains useful in situations where analytical expectations and variances of summary statistics are not available. This aspect has great potential for the analysis of temporally spaced data, an expanding area previously neglected owing to the limited availability of theory and methods.
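
    The posterior predictive logic is model-agnostic and can be shown with a toy conjugate model in place of the paper's coalescent-based genealogical models: draw parameters from the posterior, simulate replicate data, and locate the observed summary statistic within the replicate distribution.

      import numpy as np

      rng = np.random.default_rng(10)
      data = rng.poisson(4.0, size=40)           # stand-in "observed" data
      obs_stat = data.var() / data.mean()        # dispersion index as the summary

      # Conjugate Gamma posterior for the Poisson rate (prior Gamma(a0, b0)).
      a0, b0, n = 1.0, 1.0, data.size
      post_rate = rng.gamma(a0 + data.sum(), 1.0 / (b0 + n), size=5000)

      rep_stats = []
      for lam in post_rate:                      # posterior predictive replicates
          rep = rng.poisson(lam, size=n)
          rep_stats.append(rep.var() / rep.mean())
      print("posterior predictive p-value:", (np.asarray(rep_stats) >= obs_stat).mean())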

  20. Gaia DR2 documentation Chapter 1: Introduction

    NASA Astrophysics Data System (ADS)

    de Bruijne, J. H. J.; Abreu, A.; Brown, A. G. A.; Castañeda, J.; Cheek, N.; Crowley, C.; De Angeli, F.; Drimmel, R.; Fabricius, C.; Fleitas, J.; Gracia-Abril, G.; Guerra, R.; Hutton, A.; Messineo, R.; Mora, A.; Nienartowicz, K.; Panem, C.; Siddiqui, H.

    2018-04-01

    This chapter of the Gaia DR2 documentation describes the Gaia mission, the Gaia spacecraft, and the organisation of the Gaia Data Processing and Analysis Consortium (DPAC), which is responsible for the processing and analysis of the Gaia data. Furthermore, various properties of the data release are summarised, including statistical properties, object statistics, completeness, selection and filtering criteria, and limitations of the data.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Özel, Gamze

    The bivariate Kumaraswamy (BK) distribution, whose marginals are Kumaraswamy distributions, has recently been introduced. However, its statistical properties have not been studied in detail. In this study, the statistical properties of the BK distribution are investigated. We suggest that the BK distribution could provide a suitable description of the earthquake characteristics of Turkey. We support this argument using earthquakes that occurred in Turkey between 1900 and 2009. We also find that the BK distribution simulates earthquakes well.
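
    For reference, a Kumaraswamy(a, b) variate on (0, 1) has CDF F(x) = 1 - (1 - x^a)^b, which makes inverse-transform sampling trivial; the sketch below draws a sample and checks its mean against the known moment formula E[X^n] = b B(1 + n/a, b). The parameter values are illustrative, not fitted to earthquake data.

    ```python
    import numpy as np
    from scipy.special import beta

    def rkumaraswamy(a, b, size, rng):
        # Inverse-CDF sampling from F(x) = 1 - (1 - x**a)**b
        u = rng.random(size)
        return (1.0 - (1.0 - u) ** (1.0 / b)) ** (1.0 / a)

    rng = np.random.default_rng(42)
    a, b = 2.0, 5.0
    x = rkumaraswamy(a, b, 100_000, rng)

    # The sample mean should match E[X] = b * Beta(1 + 1/a, b)
    print(x.mean(), b * beta(1.0 + 1.0 / a, b))
    ```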

  2. A statistical study of EMIC waves observed by Cluster: 1. Wave properties

    NASA Astrophysics Data System (ADS)

    Allen, R. C.; Zhang, J.-C.; Kistler, L. M.; Spence, H. E.; Lin, R.-L.; Klecker, B.; Dunlop, M. W.; André, M.; Jordanova, V. K.

    2015-07-01

    Electromagnetic ion cyclotron (EMIC) waves are an important mechanism for particle energization and losses inside the magnetosphere. In order to better understand the effects of these waves on particle dynamics, detailed information about the occurrence rate, wave power, ellipticity, wave normal angle, energy propagation angle distributions, and local plasma parameters is required. Previous statistical studies have used in situ observations to investigate the distribution of these parameters in the magnetic local time versus L-shell (MLT-L) frame within a limited magnetic latitude (MLAT) range. In this study, we present a statistical analysis of EMIC wave properties using 10 years (2001-2010) of data from Cluster, totaling 25,431 min of wave activity. Due to the polar orbit of Cluster, we are able to investigate EMIC waves at all MLATs and MLTs. This allows us to further investigate the MLAT dependence of various wave properties inside different MLT sectors and to further explore the effects of Shabansky orbits on EMIC wave generation and propagation. The statistical analysis is presented in two papers. This paper focuses on the wave occurrence distribution as well as the distribution of wave properties. The companion paper focuses on local plasma parameters during wave observations as well as wave generation proxies.

  3. Statistical properties of radiation from VUV and X-ray free electron laser

    NASA Astrophysics Data System (ADS)

    Saldin, E. L.; Schneidmiller, E. A.; Yurkov, M. V.

    1998-03-01

    The paper presents a comprehensive analysis of the statistical properties of the radiation from a self-amplified spontaneous emission (SASE) free electron laser operating in the linear and nonlinear modes. The investigation has been performed in a one-dimensional approximation, assuming the electron pulse length to be much larger than the coherence length of the radiation. The following statistical properties of the SASE FEL radiation have been studied in detail: time and spectral field correlations, the distribution of the fluctuations of the instantaneous radiation power, the distribution of the energy in the electron bunch, the distribution of the radiation energy after a monochromator installed at the FEL amplifier exit, and the radiation spectrum. The linear high-gain limit is studied analytically. It is shown that the radiation from a SASE FEL operating in the linear regime possesses all the features of completely chaotic polarized radiation. A detailed study of the statistical properties of the radiation from a SASE FEL operating in the linear and nonlinear regimes has been performed by means of time-dependent simulation codes. All numerical results presented in the paper have been calculated for the 70 nm SASE FEL at the TESLA Test Facility, under construction at DESY.

  4. Statistical variances of diffusional properties from ab initio molecular dynamics simulations

    NASA Astrophysics Data System (ADS)

    He, Xingfeng; Zhu, Yizhou; Epstein, Alexander; Mo, Yifei

    2018-12-01

    Ab initio molecular dynamics (AIMD) simulation is widely employed in studying diffusion mechanisms and in quantifying diffusional properties of materials. However, AIMD simulations are often limited to a few hundred atoms and a short, sub-nanosecond physical timescale, which leads to models that include only a limited number of diffusion events. As a result, the diffusional properties obtained from AIMD simulations are often plagued by poor statistics. In this paper, we re-examine the process to estimate diffusivity and ionic conductivity from the AIMD simulations and establish the procedure to minimize the fitting errors. In addition, we propose methods for quantifying the statistical variance of the diffusivity and ionic conductivity from the number of diffusion events observed during the AIMD simulation. Since an adequate number of diffusion events must be sampled, AIMD simulations should be sufficiently long and can only be performed on materials with reasonably fast diffusion. We chart the ranges of materials and physical conditions that can be accessible by AIMD simulations in studying diffusional properties. Our work provides the foundation for quantifying the statistical confidence levels of diffusion results from AIMD simulations and for correctly employing this powerful technique.
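
    A toy version of the estimation step, assuming the Einstein relation MSD = 2 d D t and using a plain random walk in place of an AIMD trajectory; as the paper stresses, the statistical error of such an estimate shrinks roughly as one over the square root of the number of observed diffusion events.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    n_steps, dt, dim = 20_000, 1.0, 3
    jump = 0.1                        # per-dimension step size (arbitrary units)

    # Random walk standing in for an ion trajectory extracted from AIMD
    traj = np.cumsum(rng.normal(0.0, jump, (n_steps, dim)), axis=0)

    lags = np.arange(1, 200)
    msd = np.array([np.mean(np.sum((traj[m:] - traj[:-m]) ** 2, axis=1))
                    for m in lags])

    # Einstein relation MSD = 2*dim*D*t: the fitted slope gives D
    slope = np.polyfit(lags * dt, msd, 1)[0]
    print(f"D_fit = {slope / (2 * dim):.5f}, D_true = {jump**2 / (2 * dt):.5f}")
    ```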

  5. How Attitudes towards Statistics Courses and the Field of Statistics Predicts Statistics Anxiety among Undergraduate Social Science Majors: A Validation of the Statistical Anxiety Scale

    ERIC Educational Resources Information Center

    O'Bryant, Monique J.

    2017-01-01

    The aim of this study was to validate an instrument that can be used by instructors or social scientists who are interested in evaluating statistics anxiety. The psychometric properties of the English version of the Statistical Anxiety Scale (SAS) were examined through a confirmatory factor analysis of scores from a sample of 323 undergraduate…

  6. Remineralization Property of an Orthodontic Primer Containing a Bioactive Glass with Silver and Zinc

    PubMed Central

    Lee, Seung-Min; Kim, In-Ryoung; Park, Bong-Soo; Ko, Ching-Chang; Son, Woo-Sung; Kim, Yong-Il

    2017-01-01

    White spot lesions (WSLs) are irreversible damage arising in orthodontic treatment from excessive etching or demineralization by microorganisms. In this study, we conducted mechanical and cell viability tests to examine the antibacterial properties of 0.2% and 1% bioactive glass (BAG) and of silver-doped and zinc-doped BAGs in a primer, and we evaluated their clinical applicability for preventing WSLs. The microhardness increased statistically significantly in the adhesive containing BAG, while the other samples showed no statistically significant difference compared with the control group. The shear bond strength of all samples increased compared with that of the control group. The cell viability of the control and sample groups was similar within 24 h, but decreased slightly over 48 h. All samples showed antibacterial properties. Regarding remineralization, the group containing 0.2% of the samples showed remineralization compared with the control group, but the difference was not statistically significant; the group containing 1% of the samples showed a significant difference compared with the control group. Among them, the orthodontic bonding primer containing 1% silver-doped BAG showed the highest remineralization. The new orthodontic bonding primer used in this study showed an antimicrobial effect, a chemical remineralization effect, and WSL prevention, as well as clinically applicable properties, both physically and biologically. PMID:29088092

  7. Stationary variational estimates for the effective response and field fluctuations in nonlinear composites

    NASA Astrophysics Data System (ADS)

    Ponte Castañeda, Pedro

    2016-11-01

    This paper presents a variational method for estimating the effective constitutive response of composite materials with nonlinear constitutive behavior. The method is based on a stationary variational principle for the macroscopic potential in terms of the corresponding potential of a linear comparison composite (LCC) whose properties are the trial fields in the variational principle. When used in combination with estimates for the LCC that are exact to second order in the heterogeneity contrast, the resulting estimates for the nonlinear composite are also guaranteed to be exact to second order in the contrast. In addition, the new method allows full optimization with respect to the properties of the LCC, leading to estimates that are fully stationary and exhibit no duality gaps. As a result, the effective response and field statistics of the nonlinear composite can be estimated directly from the appropriately optimized linear comparison composite. By way of illustration, the method is applied to a porous, isotropic, power-law material, and the results are found to compare favorably with earlier bounds and estimates. However, the basic ideas of the method are expected to work for broad classes of composite materials whose effective response can be given appropriate variational representations, including more general elasto-plastic and soft hyperelastic composites and polycrystals.

  8. Do In Situ Observations Contain Signatures of Intermittent Fast Solar Wind Acceleration?

    NASA Astrophysics Data System (ADS)

    Matteini, L.; Horbury, T. S.; Stansby, D.

    2017-12-01

    Disentangling local plasma properties and structures of solar origin in in situ data is crucial for understanding solar wind acceleration and evolution. This is particularly challenging at 1 AU and beyond, where structures of various origins have had time to interact and merge, smoothing out their main characteristics. Observations of more pristine plasma closer to the Sun are therefore needed. In preparation for the forthcoming Solar Orbiter and Parker Solar Probe missions, Helios observations from as close as 0.3 AU (although old, not yet fully exploited) can be used to test our expectations and make new predictions. Recent observations (Matteini et al. 2014, 2015) have outlined the presence of intense (up to 1000 km/s) and short-lived velocity peaks that ubiquitously characterize the typical profile of the fast solar wind at 0.3 AU, suggesting that these features could be remnants of processes occurring in the solar atmosphere and a signature of intermittent solar wind acceleration from coronal holes. We discuss results on the statistics of these events, characterizing their physical properties and linking them with typical solar temporal and spatial scales. Finally, we also discuss how these velocity peaks will likely affect the future in situ exploration of the inner heliosphere by Solar Orbiter and the Parker Solar Probe.

  9. Effects of pressure and electrical charge on macromolecular transport across bovine lens basement membrane.

    PubMed

    Ferrell, Nicholas; Cameron, Kathleen O; Groszek, Joseph J; Hofmann, Christina L; Li, Lingyan; Smith, Ross A; Bian, Aihua; Shintani, Ayumi; Zydney, Andrew L; Fissell, William H

    2013-04-02

    Molecular transport through the basement membrane is important for a number of physiological functions, and dysregulation of basement membrane architecture can have serious pathological consequences. The structure-function relationships that govern molecular transport in basement membranes are not fully understood. The basement membrane from the lens capsule of the eye is a collagen IV-rich matrix that can easily be extracted and manipulated in vitro. As such, it provides a convenient model for studying the functional relationships that govern molecular transport in basement membranes. Here we investigate the effects of increased transmembrane pressure and solute electrical charge on the transport properties of the lens basement membrane (LBM) from the bovine eye. Pressure-permeability relationships in LBM transport were governed primarily by changes in diffusive and convective contributions to solute flux and not by pressure-dependent changes in intrinsic membrane properties. The solute electrical charge had a minimal but statistically significant effect on solute transport through the LBM that was opposite of the expected electrokinetic behavior. The observed transport characteristics of the LBM are discussed in the context of established membrane transport modeling and previous work on the effects of pressure and electrical charge in other basement membrane systems. Copyright © 2013 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  10. The Bologna complete sample of nearby radio sources. II. Phase referenced observations of faint nuclear sources

    NASA Astrophysics Data System (ADS)

    Liuzzo, E.; Giovannini, G.; Giroletti, M.; Taylor, G. B.

    2009-10-01

    Aims: To study statistical properties of different classes of sources, it is necessary to observe a sample that is free of selection effects. To do this, we initiated a project to observe a complete sample of radio galaxies selected from the B2 Catalogue of Radio Sources and the Third Cambridge Revised Catalogue (3CR), with no selection constraint on the nuclear properties. We named this sample “the Bologna Complete Sample” (BCS). Methods: We present new VLBI observations at 5 and 1.6 GHz for 33 sources drawn from a sample not biased toward orientation. By combining these data with those in the literature, information on the parsec-scale morphology is available for a total of 76 of 94 radio sources with a range in radio power and kiloparsec-scale morphologies. Results: The fraction of two-sided sources at milliarcsecond resolution is high (30%), compared to the fraction found in VLBI surveys selected at centimeter wavelengths, as expected from the predictions of unified models. The parsec-scale jets are generally found to be straight and to line up with the kiloparsec-scale jets. A few peculiar sources are discussed in detail. Tables 1-4 are only available in electronic form at http://www.aanda.org

  11. A Psychometric Review of the Personality Inventory for DSM-5 (PID-5): Current Status and Future Directions.

    PubMed

    Al-Dajani, Nadia; Gralnick, Tara M; Bagby, R Michael

    2016-01-01

    The paradigm of personality psychopathology is shifting from one that is purely categorical in nature to one grounded in dimensional individual differences. Section III (Emerging Measures and Models) of the Diagnostic and Statistical Manual of Mental Disorders (5th ed. [DSM-5]; American Psychiatric Association, 2013), for example, includes a hybrid categorical/dimensional model of personality disorder classification. To inform the hybrid model, the DSM-5 Personality and Personality Disorders Work Group developed a self-report instrument to assess pathological personality traits-the Personality Inventory for the DSM-5 (PID-5). Since its recent introduction, 30 papers (39 samples) have been published examining various aspects of its psychometric properties. In this article, we review the psychometric characteristics of the PID-5 using the Standards for Educational and Psychological Testing as our framework. The PID-5 demonstrates adequate psychometric properties, including a replicable factor structure, convergence with existing personality instruments, and expected associations with broadly conceptualized clinical constructs. More research is needed with specific consideration to clinical utility, additional forms of reliability and validity, relations with psychopathological personality traits using clinical samples, alternative methods of criterion validation, effective employment of cut scores, and the inclusion of validity scales to propel this movement forward.

  12. Caffeine Expectancy Questionnaire (CaffEQ): Construction, Psychometric Properties, and Associations with Caffeine Use, Caffeine Dependence, and Other Related Variables

    ERIC Educational Resources Information Center

    Huntley, Edward D.; Juliano, Laura M.

    2012-01-01

    Expectancies for drug effects predict drug initiation, use, cessation, and relapse, and may play a causal role in drug effects (i.e., placebo effects). Surprisingly little is known about expectancies for caffeine even though it is the most widely used psychoactive drug in the world. In a series of independent studies, the nature and scope of…

  13. Quantifying specific capacity and salinity variability in Amman Zarqa Basin, Central Jordan, using empirical statistical and geostatistical techniques.

    PubMed

    Shaqour, F; Taany, R; Rimawi, O; Saffarini, G

    2016-01-01

    Modeling groundwater properties is an important tool by means of which water resources management can judge whether these properties are within safe limits or not. This is usually done regularly and in the aftermath of crises that are expected to reflect negatively on groundwater properties, as occurred in Jordan due to crises in neighboring countries. In this study, the specific capacity and salinity of groundwater in the B2/A7 aquifer in the Amman Zarqa Basin were evaluated to determine the effect of the population increase in this basin resulting from the refugee flux from neighboring countries into this heavily populated basin after the Gulf crises of 1990 and 2003. Both properties were found to exhibit a three-parameter lognormal distribution. The empirically calculated β parameter of this distribution reached 0.39 m³/h/min for specific capacity and 238 ppm for salinity. This parameter is suggested to account for the global changes that took place all over the basin during the entire period of observation, and not for local changes at every well or at certain localities in the basin. It can be considered an exploratory result of the data analysis. Formal and implicit evaluation followed this step, using structural analysis and the construction of experimental semivariograms that represent the spatial variability of both properties. The adopted semivariograms were then used to construct maps illustrating the spatial variability of the properties under consideration using kriging interpolation techniques. The semivariograms show that specific capacity and salinity values are spatially dependent within 14,529 and 16,309 m, respectively. The specific capacity semivariogram exhibits a nugget effect on a small scale (324 m), which can be attributed to heterogeneity or to inadequacies in measurement. The specific capacity and salinity maps show that the major changes follow a northwest-southeast trend near the As-Samra Wastewater Treatment Plant. The results of this study suggest proper management practices.
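
    A minimal sketch of the structural-analysis step: the classical Matheron estimator averages half the squared increments of all point pairs falling in each distance bin. The coordinates, values, and bin edges below are hypothetical stand-ins for the basin well data.

    ```python
    import numpy as np

    def empirical_semivariogram(coords, values, bin_edges):
        """Matheron estimator: gamma(h) = mean squared increment / 2 per bin."""
        dx = coords[:, None, :] - coords[None, :, :]
        h = np.sqrt((dx ** 2).sum(-1))                  # pairwise distances
        dv2 = (values[:, None] - values[None, :]) ** 2
        iu = np.triu_indices(len(values), k=1)          # use each pair once
        h, dv2 = h[iu], dv2[iu]
        idx = np.digitize(h, bin_edges)
        return np.array([dv2[idx == i].mean() / 2.0 if (idx == i).any() else np.nan
                         for i in range(1, len(bin_edges))])

    rng = np.random.default_rng(3)
    coords = rng.uniform(0, 20_000, (150, 2))   # hypothetical well locations, m
    values = rng.normal(1000, 238, 150)         # hypothetical salinity, ppm
    print(empirical_semivariogram(coords, values, np.linspace(0, 20_000, 11)))
    ```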

  14. Use Trends Indicated by Statistically Calibrated Recreational Sites in the National Forest System

    Treesearch

    Gary L. Tyre

    1971-01-01

    Trends in statistically sampled use of developed sites in the National Forest system indicate an average annual increase of 6.0 percent in the period 1966-69. The high variability of the measure precludes its use for projecting expected future use, but it can be important in gauging the credibility of annual use changes at both sampled and unsampled locations.

  15. Internal construct validity of the Shirom-Melamed Burnout Questionnaire (SMBQ)

    PubMed Central

    2012-01-01

    Background: Burnout is a mental condition defined as a result of continuous and long-term stress exposure, particularly related to psychosocial factors at work. This paper seeks to examine the psychometric properties of the Shirom-Melamed Burnout Questionnaire (SMBQ) for validation of its use in a clinical setting. Methods: Data from both a clinical sample (n = 319) and a general population sample (n = 319) of health care and social insurance workers were included in the study. Data were analysed using both classical and modern test theory approaches, including Confirmatory Factor Analysis (CFA) and Rasch analysis. Results: Of the 638 people recruited into the study, 416 (65%) were working full or part time. Data from the SMBQ failed a CFA and initially failed to satisfy Rasch model expectations. After the removal of 4 of the original items measuring tension, and after accommodating local dependency in the data, model expectations were met. As such, the total score from the revised scale is a sufficient statistic for ascertaining burnout, and an interval scale transformation is available. The scale as a whole was perfectly targeted to the joint sample. A cut point of 4.4 for severe burnout was chosen at the intersection of the distributions of the clinical and general population samples. Conclusion: A revised 18-item version of the SMBQ satisfies modern measurement standards. Using its cut point, it offers the opportunity to identify potential clinical cases of burnout. PMID:22214479

  16. Seeing in the dark - I. Multi-epoch alchemy

    NASA Astrophysics Data System (ADS)

    Huff, Eric M.; Hirata, Christopher M.; Mandelbaum, Rachel; Schlegel, David; Seljak, Uroš; Lupton, Robert H.

    2014-05-01

    Weak lensing by large-scale structure is an invaluable cosmological tool given that most of the energy density of the concordance cosmology is invisible. Several large ground-based imaging surveys will attempt to measure this effect over the coming decade, but reliable control of the spurious lensing signal introduced by atmospheric turbulence and telescope optics remains a challenging problem. We address this challenge with a demonstration that point spread function (PSF) effects on measured galaxy shapes in the Sloan Digital Sky Survey (SDSS) can be corrected with existing analysis techniques. In this work, we co-add existing SDSS imaging on the equatorial stripe in order to build a data set with the statistical power to measure cosmic shear, while using a rounding kernel method to null out the effects of the anisotropic PSF. We build a galaxy catalogue from the combined imaging, characterize its photometric properties and show that the spurious shear remaining in this catalogue after the PSF correction is negligible compared to the expected cosmic shear signal. We identify a new source of systematic error in the shear-shear autocorrelations arising from selection biases related to masking. Finally, we discuss the circumstances in which this method is expected to be useful for upcoming ground-based surveys that have lensing as one of the science goals, and identify the systematic errors that can reduce its efficacy.

  17. Analysis and modelling of surface Urban Heat Island in 20 Canadian cities under climate and land-cover change.

    PubMed

    Gaur, Abhishek; Eichenbaum, Markus Kalev; Simonovic, Slobodan P

    2018-01-15

    Surface Urban Heat Island (SUHI) is an urban climate phenomenon that is expected to respond to future climate and land-use land-cover change. It is important to further our understanding of the physical mechanisms that govern the SUHI phenomenon to enhance our ability to model future SUHI characteristics under changing geophysical conditions. In this study, the SUHI phenomenon is quantified and modelled at 20 cities distributed across Canada. By analyzing MODerate Resolution Imaging Spectroradiometer (MODIS) sensed surface temperature at the cities over 2002-2012, it is found that 16 of the 20 selected cities have experienced a positive SUHI phenomenon, while 4 cities located in the prairies region and at high-elevation locations have experienced a negative SUHI phenomenon in the past. A statistically significant relationship between observed SUHI magnitude and city elevation is also recorded over the observational period. A Physical Scaling downscaling model is then validated and used to downscale future surface temperature projections from 3 GCMs and 2 extreme Representative Concentration Pathways in the urban and rural areas of the cities. Future changes in SUHI magnitudes between the historical period (2006-2015) and future timelines: the 2030s (2026-2035), 2050s (2046-2055), and 2090s (2091-2100), are estimated. Analysis of the future projected changes indicates that 15 (13) out of 20 cities can be expected to experience increases in SUHI magnitudes in the future under RCP 2.6 (RCP 8.5). A statistically significant relationship between projected future SUHI change and the current size of the cities is also obtained. The study highlights the role of city properties (i.e., size, elevation, and surrounding land-cover) in shaping current and future SUHI characteristics. The results of this analysis will help decision-makers manage Canadian cities more efficiently under rapidly changing geophysical and demographic conditions. Copyright © 2017 Elsevier Ltd. All rights reserved.

  18. Effects of force fields on the conformational and dynamic properties of amyloid β(1-40) dimer explored by replica exchange molecular dynamics simulations.

    PubMed

    Watts, Charles R; Gregory, Andrew; Frisbie, Cole; Lovas, Sándor

    2018-03-01

    The conformational space and structural ensembles of amyloid beta (Aβ) peptides and their oligomers in solution are inherently disordered and have proven challenging to study. Optimum force field selection for molecular dynamics (MD) simulations and the biophysical relevance of the results are still unknown. We compared the conformational space of Aβ(1-40) dimers in 300 ns replica exchange MD simulations at physiological temperature (310 K) using the AMBER-ff99sb-ILDN, AMBER-ff99sb*-ILDN, AMBER-ff99sb-NMR, and CHARMM22* force fields. Statistical comparisons of the simulation results to experimental data and to previously published simulations utilizing the CHARMM22* and CHARMM36 force fields were performed. All force fields yield sampled ensembles of conformations with collision cross-sectional areas for the dimer that are statistically significantly larger than experimental results. All force fields, with the exception of AMBER-ff99sb-ILDN (8.8 ± 6.4%) and CHARMM36 (2.7 ± 4.2%), tend to overestimate the α-helical content compared with experimental CD (5.3 ± 5.2%). Using the AMBER-ff99sb-NMR force field resulted in the greatest degree of variance (41.3 ± 12.9%). Except for the AMBER-ff99sb-NMR force field, the others tended to underestimate the expected amount of β-sheet and to overestimate the amount of turn/bend/random coil conformations. All force fields, with the exception of AMBER-ff99sb-NMR, reproduce a theoretically expected β-sheet-turn-β-sheet conformational motif; however, only the CHARMM22* and CHARMM36 force fields yield results compatible with collapse of the central and C-terminal hydrophobic cores from residues 17-21 and 30-36. Although analyses of essential subspace sampling showed only minor variations between force fields, the secondary structures of the lowest energy conformers are different. © 2017 Wiley Periodicals, Inc.

  19. Mathematics pre-service teachers’ statistical reasoning about meaning

    NASA Astrophysics Data System (ADS)

    Kristanto, Y. D.

    2018-01-01

    This article offers a descriptive qualitative analysis of three second-year pre-service teachers' statistical reasoning about the mean. Twenty-six pre-service teachers were tested using an open-ended problem in which they were expected to analyze a method for finding the mean of a data set. Three of their test results were selected for analysis. The results suggest that the pre-service teachers did not use context to develop an interpretation of the mean. Therefore, this article also offers strategies to promote statistical reasoning about the mean using various contexts.

  20. A comparative analysis of the statistical properties of large mobile phone calling networks.

    PubMed

    Li, Ming-Xia; Jiang, Zhi-Qiang; Xie, Wen-Jie; Miccichè, Salvatore; Tumminello, Michele; Zhou, Wei-Xing; Mantegna, Rosario N

    2014-05-30

    Mobile phone calling is one of the most widely used communication methods in modern society. The records of calls among mobile phone users provide a valuable proxy for understanding the human communication patterns embedded in social networks. Mobile phone users call each other, forming a directed calling network. If only reciprocal calls are considered, we obtain an undirected mutual calling network. The preferential communication behavior between two connected users can be statistically tested, and it results in two Bonferroni networks with statistically validated edges. We perform a comparative analysis of the statistical properties of these four networks, which are constructed from the calling records of more than nine million individuals in Shanghai over a period of 110 days. We find that these networks share many common structural properties but also exhibit idiosyncratic features when compared with previously studied large mobile calling networks. The empirical findings provide an intriguing picture of a representative large social network that might shed new light on the modelling of large social networks.
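
    A minimal sketch of the two basic constructions, assuming the networkx library: a directed calling network built from (caller, callee) records, and a mutual calling network that keeps an undirected edge only when calls were reciprocated. The toy call list stands in for the Shanghai records.

    ```python
    import networkx as nx

    # Toy call records (caller, callee)
    calls = [("a", "b"), ("b", "a"), ("a", "c"), ("c", "d"), ("d", "c"), ("b", "c")]

    directed = nx.DiGraph()
    directed.add_edges_from(calls)

    # Keep an undirected edge only if both call directions are present
    mutual = nx.Graph()
    mutual.add_edges_from((u, v) for u, v in directed.edges()
                          if directed.has_edge(v, u))

    print(directed.number_of_edges(), mutual.number_of_edges())  # 6 and 2
    ```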

  1. Statistical properties of the anomalous scaling exponent estimator based on time-averaged mean-square displacement

    NASA Astrophysics Data System (ADS)

    Sikora, Grzegorz; Teuerle, Marek; Wyłomańska, Agnieszka; Grebenkov, Denis

    2017-08-01

    The most common way of estimating the anomalous scaling exponent from single-particle trajectories consists of a linear fit of the dependence of the time-averaged mean-square displacement on the lag time at the log-log scale. We investigate the statistical properties of this estimator in the case of fractional Brownian motion (FBM). We determine the mean value, the variance, and the distribution of the estimator. Our theoretical results are confirmed by Monte Carlo simulations. In the limit of long trajectories, the estimator is shown to be asymptotically unbiased, consistent, and with vanishing variance. These properties ensure an accurate estimation of the scaling exponent even from a single (long enough) trajectory. As a consequence, we prove that the usual way to estimate the diffusion exponent of FBM is correct from the statistical point of view. Moreover, the knowledge of the estimator distribution is the first step toward new statistical tests of FBM and toward a more reliable interpretation of the experimental histograms of scaling exponents in microbiology.
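
    The estimator reduces to one line once the time-averaged MSD is computed: fit a straight line to log TA-MSD versus log lag. The sketch below recovers alpha = 1 from ordinary Brownian motion; a dedicated generator (e.g., Davies-Harte) would be needed to test genuine FBM with H != 1/2.

    ```python
    import numpy as np

    rng = np.random.default_rng(11)
    x = np.cumsum(rng.standard_normal(10_000))   # Brownian path, true alpha = 1

    lags = np.unique(np.logspace(0, 2.5, 25).astype(int))
    tamsd = np.array([np.mean((x[m:] - x[:-m]) ** 2) for m in lags])

    # Linear fit at the log-log scale: the slope estimates alpha = 2H
    alpha = np.polyfit(np.log(lags), np.log(tamsd), 1)[0]
    print(f"estimated alpha = {alpha:.3f} (expected 1.0)")
    ```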

  2. Assessment of corneal properties based on statistical modeling of OCT speckle.

    PubMed

    Jesus, Danilo A; Iskander, D Robert

    2017-01-01

    A new approach to assessing the properties of the corneal micro-structure in vivo, based on statistical modeling of speckle from Optical Coherence Tomography (OCT), is presented. A number of statistical models were proposed to fit the corneal speckle data obtained from raw OCT images. Short-term changes in corneal properties were studied by inducing corneal swelling, whereas age-related changes were observed by analyzing data from sixty-five subjects aged between twenty-four and seventy-three years. The Generalized Gamma distribution was shown to be the best model, in terms of Akaike's Information Criterion, for the OCT corneal speckle. Its parameters showed statistically significant differences (Kruskal-Wallis, p < 0.001) for short-term and age-related corneal changes. In addition, it was observed that age-related changes influence corneal biomechanical behaviour when corneal swelling is induced. This study shows that the Generalized Gamma distribution can be utilized to model corneal speckle in OCT in vivo, providing complementary quantitative information where the micro-structure of corneal tissue is of the essence.
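
    A rough sketch of the model-selection step, assuming scipy: fit candidate distributions (including scipy.stats.gengamma) by maximum likelihood and rank them by Akaike's Information Criterion. The gamma-distributed sample is a stand-in for real OCT speckle amplitudes.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(5)
    speckle = rng.gamma(3.0, 2.0, 5000)   # stand-in for OCT speckle amplitudes

    for name, dist in [("gengamma", stats.gengamma),
                       ("gamma", stats.gamma),
                       ("rayleigh", stats.rayleigh)]:
        params = dist.fit(speckle)                  # maximum likelihood fit
        loglik = dist.logpdf(speckle, *params).sum()
        aic = 2 * len(params) - 2 * loglik          # lower is better
        print(f"{name:9s} AIC = {aic:.1f}")
    ```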

  3. Brain Aneurysm Statistics and Facts

    MedlinePlus


  4. 41 CFR 105-50.202-1 - Copies of statistical or other studies.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 41 Public Contracts and Property Management 3 2011-01-01 2011-01-01 false Copies of statistical or... Services Administration § 105-50.202-1 Copies of statistical or other studies. This material includes a copy of any existing statistical or other studies and compilations, results of technical tests and...

  5. 41 CFR 105-50.202-1 - Copies of statistical or other studies.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 41 Public Contracts and Property Management 3 2010-07-01 2010-07-01 false Copies of statistical or... Services Administration § 105-50.202-1 Copies of statistical or other studies. This material includes a copy of any existing statistical or other studies and compilations, results of technical tests and...

  6. Why Nations Become Wealthy: The Effects of Adult Longevity on Saving

    PubMed Central

    Kinugasa, Tomoko; Mason, Andrew

    2007-01-01

    We analyze steady state and out-of-steady-state effects of the transition in adult longevity on the national saving rate using historical data and international panel data. The rise in adult life expectancy has a large and statistically significant effect on aggregate saving. The effects have been especially pronounced in East Asia because its mortality transition was very rapid. Gains in life expectancy are much more important than declines in child dependency. Population aging may not lead to lower saving rates in the future if life expectancy and the duration of retirement continue to increase. PMID:18167514

  7. The method of expected number of deaths, 1786-1886-1986.

    PubMed

    Keiding, N

    1987-04-01

    "The method of expected number of deaths is an integral part of standardization of vital rates, which is one of the oldest statistical techniques. The expected number of deaths was calculated in 18th century actuarial mathematics...but the method seems to have been forgotten, and was reinvented in connection with 19th century studies of geographical and occupational variations of mortality.... It is noted that standardization of rates is intimately connected to the study of relative mortality, and a short description of very recent developments in the methodology of that area is included." (SUMMARY IN FRE) excerpt

  8. Statistics for clinical nursing practice: an introduction.

    PubMed

    Rickard, Claire M

    2008-11-01

    Difficulty in understanding statistics is one of the most frequently reported barriers to nurses applying research results in their practice. Yet the amount of nursing research published each year continues to grow, as does the expectation that nurses will undertake practice based on this evidence. Critical care nurses do not need to be statisticians, but they do need to develop a working knowledge of statistics so they can be informed consumers of research and so practice can evolve and improve. For those undertaking a research project, statistical literacy is required to interact with other researchers and statisticians, so as to best design and undertake the project. This article is the first in a series that guides critical care nurses through statistical terms and concepts relevant to their practice.

  9. The "serendipitous brain": Low expectancy and timing uncertainty of conscious events improve awareness of unconscious ones (evidence from the Attentional Blink).

    PubMed

    Lasaponara, Stefano; Dragone, Alessio; Lecce, Francesca; Di Russo, Francesco; Doricchi, Fabrizio

    2015-10-01

    To anticipate upcoming sensory events, the brain picks up and exploits statistical regularities in the sensory environment. However, it is untested whether cumulated predictive knowledge about consciously seen stimuli improves the access to awareness of stimuli that usually go unseen. To explore this issue, we exploited the Attentional Blink (AB) effect, where conscious processing of a first visual target (T1) hinders detection of early following targets (T2). We report that timing uncertainty and low expectancy about the occurrence of consciously seen T2s presented outside the AB period improve detection of early and otherwise often unseen T2s presented inside the AB. Recording of high-resolution Event Related Potentials (ERPs) and the study of their intracranial sources showed that the brain achieves this improvement by initially amplifying and extending the pre-conscious storage of T2 traces signalled by the N2 wave originating in the extra-striate cortex. This enhancement in the N2 wave is followed by specific changes in the latency and amplitude of later components in the P3 wave (P3a and P3b), signalling access of the sensory trace to the network of parietal and frontal areas modulating conscious processing. These findings show that the interaction between conscious and unconscious processing changes adaptively as a function of the probabilistic properties of the sensory environment, and that the combination of an active attentional state with loose probabilistic and temporal expectancies about forthcoming conscious events favors the emergence to awareness of otherwise unnoticed visual events. This likely provides insight into the attentional conditions that predispose an active observer to unexpected "serendipitous" findings. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Rasch-built Overall Disability Scale for patients with chemotherapy-induced peripheral neuropathy (CIPN-R-ODS).

    PubMed

    Binda, D; Vanhoutte, E K; Cavaletti, G; Cornblath, D R; Postma, T J; Frigeni, B; Alberti, P; Bruna, J; Velasco, R; Argyriou, A A; Kalofonos, H P; Psimaras, D; Ricard, D; Pace, A; Galiè, E; Briani, C; Dalla Torre, C; Lalisang, R I; Boogerd, W; Brandsma, D; Koeppen, S; Hense, J; Storey, D; Kerrigan, S; Schenone, A; Fabbri, S; Rossi, E; Valsecchi, M G; Faber, C G; Merkies, I S J; Galimberti, S; Lanzani, F; Mattavelli, L; Piatti, M L; Bidoli, P; Cazzaniga, M; Cortinovis, D; Lucchetta, M; Campagnolo, M; Bakkers, M; Brouwer, B; Boogerd, W; Grant, R; Reni, L; Piras, B; Pessino, A; Padua, L; Granata, G; Leandri, M; Ghignotti, I; Plasmati, R; Pastorelli, F; Heimans, J J; Eurelings, M; Meijer, R J; Grisold, W; Lindeck Pozza, E; Mazzeo, A; Toscano, A; Russo, M; Tomasello, C; Altavilla, G; Penas Prado, M; Dominguez Gonzalez, C; Dorsey, S G

    2013-09-01

    Chemotherapy-induced peripheral neuropathy (CIPN) is a common neurological side-effect of cancer treatment and may lead to declines in patients' daily functioning and quality of life. To date, there are no modern, clinimetrically well-evaluated outcome measures available to assess disability in CIPN patients. The objective of the study was to develop an interval-weighted scale to capture activity limitations and participation restrictions in CIPN patients using the Rasch methodology, and to determine its validity and reliability properties. A preliminary Rasch-built Overall Disability Scale (pre-R-ODS) comprising 146 items was assessed twice (interval: 2-3 weeks; test-retest reliability) in 281 CIPN patients with a stable clinical condition. The obtained data were subjected to Rasch analyses to determine whether model expectations would be met, and, if necessary, adaptations were made to obtain proper model fit (internal validity). External validity was obtained by correlating the CIPN-R-ODS with the National Cancer Institute-Common Toxicity Criteria (NCI-CTC) neuropathy scales and the Pain-Intensity Numeric-Rating-Scale (PI-NRS). The preliminary R-ODS did not meet the Rasch model's expectations. Items displaying misfit statistics, disordered thresholds, item bias, or local dependency were systematically removed. The final CIPN-R-ODS, consisting of 28 items, fulfilled all the model's expectations with proper validity and reliability, and was unidimensional. The final CIPN-R-ODS is a Rasch-built, disease-specific, interval measure suitable for detecting disability in CIPN patients, and it bypasses the shortcomings of ordinal measures based on classical test theory. Its use is recommended in future clinical trials in CIPN. Copyright © 2013 Elsevier Ltd. All rights reserved.

  11. HARMONIC IN-PAINTING OF COSMIC MICROWAVE BACKGROUND SKY BY CONSTRAINED GAUSSIAN REALIZATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Jaiseung; Naselsky, Pavel; Mandolesi, Nazzareno, E-mail: jkim@nbi.dk

    The presence of astrophysical emissions between the last scattering surface and our vantage point requires us to apply a foreground mask to cosmic microwave background (CMB) sky maps, leading to large cuts around the Galactic equator and numerous holes. Since many CMB analyses, in particular on the largest angular scales, may be performed on a whole-sky map in a more straightforward and reliable manner, it is of utmost importance to develop an efficient method to fill in the masked pixels in a way consistent with the expected statistical properties and with the unmasked pixels. In this Letter, we consider Monte Carlo simulation of a constrained Gaussian field and derive it for the CMB anisotropy in harmonic space, where a feasible implementation is possible with good approximation. We applied our method to simulated data, which shows that our method produces a plausible whole-sky map, given the unmasked pixels and a theoretical expectation. Subsequently, we applied our method to the Wilkinson Microwave Anisotropy Probe foreground-reduced maps and investigated the anomalous alignment between the quadrupole and octupole components. From our investigation, we find that the alignment in the foreground-reduced maps is even higher than in the Internal Linear Combination map. We also find that the V-band map has higher alignment than the other bands, despite the expectation that the V-band map has less foreground contamination than the other bands. We therefore find it hard to attribute the alignment to residual foregrounds. Our method is complementary to other efforts on in-painting or reconstructing the masked CMB data, and of great use to the Planck surveyor and future missions.
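
    The constrained-realization idea can be shown in miniature on an ordinary Gaussian field: masked values are drawn from their exact conditional distribution given the unmasked ones. The toy below uses a small 1D field with an assumed squared-exponential covariance rather than CMB harmonics, but the conditioning algebra is the same.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n = 60
    t = np.arange(n)
    cov = np.exp(-0.5 * ((t[:, None] - t[None, :]) / 5.0) ** 2)  # toy covariance

    x = rng.multivariate_normal(np.zeros(n), cov, check_valid="ignore")
    mask = (t > 20) & (t < 35)          # "foreground" cut to fill in
    o, m = ~mask, mask

    Soo, Smo, Smm = cov[np.ix_(o, o)], cov[np.ix_(m, o)], cov[np.ix_(m, m)]
    mean = Smo @ np.linalg.solve(Soo, x[o])             # conditional mean
    cond = Smm - Smo @ np.linalg.solve(Soo, Smo.T)      # conditional covariance
    fill = rng.multivariate_normal(mean, cond + 1e-10 * np.eye(mask.sum()),
                                   check_valid="ignore")

    inpainted = x.copy()
    inpainted[mask] = fill              # constrained Gaussian realization
    ```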

  12. Service quality in healthcare institutions: establishing the gaps for policy action.

    PubMed

    Abuosi, Aaron A; Atinga, Roger A

    2013-01-01

    The authors seek to examine two key issues: to assess patients' hospital service quality perceptions and expectations using SERVQUAL, and to outline the distinct concepts used to assess patient perceptions. Questionnaires were administered to 250 patients on admission and follow-up visits. The 22 paired SERVQUAL expectation and perception items were adopted. Repeated-measures t-tests and factor analysis with Varimax rotation were used to analyse the data. Results showed that patient expectations were not being met during medical treatment. Perceived service quality was rated lower than expectations for all variables. The mean difference between perceptions and expectations was statistically significant. Contrary to the SERVQUAL five-factor model, four service-quality factors were identified in the study. The findings have practical implications for hospital managers, who should consider stepping up staffing levels backed by client-centred training programmes to help clinicians deliver care that meets patients' expectations. Few studies address patients' service-quality perceptions and expectations in Ghanaian hospitals. The findings therefore provide valuable information for policy and practice.
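
    A minimal sketch of the gap analysis, assuming scipy: each respondent's perception and expectation scores for an item give a gap score, and a paired t-test checks whether the mean difference departs from zero. The Likert scores below are simulated, not the study's data.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(9)
    n = 250
    # Simulated 1-7 Likert scores for one paired SERVQUAL item
    expectation = np.clip(rng.normal(6.2, 0.8, n).round(), 1, 7)
    perception = np.clip(rng.normal(5.4, 1.1, n).round(), 1, 7)

    gap = perception - expectation      # negative gap = expectations unmet
    t, p = stats.ttest_rel(perception, expectation)
    print(f"mean gap = {gap.mean():.2f}, t = {t:.2f}, p = {p:.2g}")
    ```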

  13. Computing physical properties with quantum Monte Carlo methods with statistical fluctuations independent of system size.

    PubMed

    Assaraf, Roland

    2014-12-01

    We show that the recently proposed correlated sampling without reweighting procedure extends the locality (asymptotic independence of the system size) of a physical property to the statistical fluctuations of its estimator. This makes the approach potentially vastly more efficient for computing space-localized properties in large systems compared with standard correlated methods. A proof is given for a large collection of noninteracting fragments. Calculations on hydrogen chains suggest that this behavior holds not only for systems displaying short-range correlations, but also for systems with long-range correlations.

  14. Application of Ontology Technology in Health Statistic Data Analysis.

    PubMed

    Guo, Minjiang; Hu, Hongpu; Lei, Xingyun

    2017-01-01

    Research purpose: to establish a health management ontology for the analysis of health statistics data. Proposed methods: this paper established a health management ontology based on an analysis of the concepts in the China Health Statistics Yearbook and used Protégé to define the syntactic and semantic structure of health statistical data. Six classes of top-level ontology concepts and their subclasses were extracted, and object properties and data properties were defined to establish the construction of these classes. Through ontology instantiation, we can integrate multi-source, heterogeneous data and enable administrators to gain an overall understanding and analysis of the health statistics data. Ontology technology provides a comprehensive and unified information integration structure for the health management domain and lays a foundation for the efficient analysis of multi-source, heterogeneous health system management data and for the enhancement of management efficiency.

  15. Assessing positive and negative experiences: validation of a new measure of well-being in an Italian population.

    PubMed

    Corno, Giulia; Molinari, Guadalupe; Baños, Rosa Maria

    2016-01-01

    The aim of this study is to explore the psychometric properties of an affect scale, the Scale of Positive and Negative Experience (SPANE), in an Italian-speaking population. The results of the Confirmatory Factor Analysis support the expected two-factor structure, positive and negative feelings, which characterized the previous versions. As expected, measures of negative affect, anxiety, negative future expectancies, and depression correlated positively with the negative experiences SPANE subscale and negatively with the positive experiences SPANE subscale. The results of this study demonstrate that the Italian version of the SPANE has psychometric properties similar to those shown by the original and previous versions, presenting satisfactory reliability and factorial validity. The use of this instrument provides clinically useful information about a person's overall emotional experience, and it is an indicator of well-being. Although further studies are required to confirm the psychometric characteristics of the scale, the SPANE Italian version is expected to improve theoretical and empirical research on the well-being of the Italian population.

  16. Adaptations in a hierarchical food web of southeastern Lake Michigan

    USGS Publications Warehouse

    Krause, Ann E.; Frank, Ken A.; Jones, Michael L.; Nalepa, Thomas F.; Barbiero, Richard P.; Madenjian, Charles P.; Agy, Megan; Evans, Marlene S.; Taylor, William W.; Mason, Doran M.; Léonard, Nancy J.

    2009-01-01

    Two issues in ecological network theory are: (1) how to construct an ecological network model and (2) how do entire networks (as opposed to individual species) adapt to changing conditions? We present a novel method for constructing an ecological network model for the food web of southeastern Lake Michigan (USA), and we identify changes in key system properties that are large relative to their uncertainty as this ecological network adapts from one time point to a second in response to multiple perturbations. To construct our food web for southeastern Lake Michigan, we followed the list of seven recommendations outlined in Cohen et al. [Cohen, J.E., et al., 1993. Improving food webs. Ecology 74, 252–258] for improving food webs. We explored two inter-related extensions of hierarchical system theory with our food web: first, that subsystems react to perturbations independently in the short term, and second, that a system's properties change at a slower rate than its subsystems' properties. We used Shannon's equations to provide quantitative versions of the basic food web properties: number of prey, number of predators, number of feeding links, and connectance (or density). We then compared these properties between the two time periods by developing distributions of each property for each time period that took uncertainty about the property into account. We compared these distributions and concluded that non-overlapping distributions indicated changes in these properties that were large relative to their uncertainty. Two subsystems were identified within our food web system structure (p < 0.001). One subsystem had more non-overlapping distributions of food web properties between Time 1 and Time 2 than the other subsystem. The overall system had all overlapping distributions of food web properties between Time 1 and Time 2. These results supported both extensions of hierarchical systems theory. Interestingly, the subsystem with more non-overlapping distributions of food web properties was the subsystem containing primarily benthic taxa, contrary to expectations that the identified major perturbations (lower phosphorus inputs and invasive species) would more greatly affect the subsystem containing primarily pelagic taxa. Future food-web research should employ rigorous statistical analysis and incorporate uncertainty in food web properties for a better understanding of how ecological networks adapt.

  17. 24 CFR 35.115 - Exemptions.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... apply if a child less than age 6 resides or is expected to reside in the dwelling unit (see definitions... life, health or safety, or to protect property from further structural damage (such as when a property...

  18. 24 CFR 35.115 - Exemptions.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... apply if a child less than age 6 resides or is expected to reside in the dwelling unit (see definitions... life, health or safety, or to protect property from further structural damage (such as when a property...

  19. Approximate Model Checking of PCTL Involving Unbounded Path Properties

    NASA Astrophysics Data System (ADS)

    Basu, Samik; Ghosh, Arka P.; He, Ru

    We study the problem of applying statistical methods for approximate model checking of probabilistic systems against properties encoded as PCTL formulas. Such approximate methods have been proposed primarily to deal with the state-space explosion that makes exact model checking by numerical methods practically infeasible for large systems. However, the existing statistical methods either consider a restricted subset of PCTL, specifically the subset that can only express bounded until properties, or rely on a user-specified finite bound on the sample path length. We propose a new method that does not have such restrictions and can be effectively used to reason about unbounded until properties. We approximate the probabilistic characteristics of an unbounded until property by those of a bounded until property for a suitably chosen value of the bound. In essence, our method is a two-phase process: (a) the first phase is concerned with identifying the bound k0; (b) the second phase computes the probability of satisfying the k0-bounded until property as an estimate of the probability of satisfying the corresponding unbounded until property. In both phases, it is sufficient to verify bounded until properties, which can be done effectively using existing statistical techniques. We prove the correctness of our technique and present its prototype implementations. We empirically show the practical applicability of our method by considering different case studies, including a simple infinite-state model and large finite-state models such as the IPv4 zeroconf protocol and the dining philosophers protocol modeled as Discrete Time Markov Chains.
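
    The second phase is plain Monte Carlo over bounded path prefixes. The sketch below estimates P(safe U<=k goal) on a hypothetical four-state discrete-time Markov chain and shows the estimate stabilizing as k grows, which is how a suitable k0 reveals itself in the first phase.

    ```python
    import random

    # Hypothetical DTMC: states 0 and 1 are safe, 2 is bad, 3 is the goal
    P = {0: [(0.6, 0), (0.3, 1), (0.1, 2)],
         1: [(0.5, 1), (0.4, 3), (0.1, 2)],
         2: [(1.0, 2)],
         3: [(1.0, 3)]}

    def step(s, rng):
        r, acc = rng.random(), 0.0
        for prob, nxt in P[s]:
            acc += prob
            if r < acc:
                return nxt
        return P[s][-1][1]

    def estimate(k, n_samples=100_000, seed=0):
        rng, hits = random.Random(seed), 0
        for _ in range(n_samples):
            s = 0
            for _ in range(k):
                if s in (2, 3):          # absorbed: the path formula is decided
                    break
                s = step(s, rng)
            hits += (s == 3)
        return hits / n_samples

    for k in (5, 20, 80):                # estimates plateau for large enough k
        print(k, estimate(k))
    ```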

  20. Applying the Anderson-Darling test to suicide clusters: evidence of contagion at U. S. universities?

    PubMed

    MacKenzie, Donald W

    2013-01-01

    Suicide clusters at Cornell University and the Massachusetts Institute of Technology (MIT) prompted popular and expert speculation about suicide contagion. However, some clustering is to be expected in any random process. This work tested whether the suicide clusters at these two universities differed significantly from those expected under a homogeneous Poisson process, in which suicides occur randomly and independently of one another. Suicide dates were collected for MIT and Cornell for 1990-2012. The Anderson-Darling statistic was used to test the goodness-of-fit of the intervals between suicides to the distribution expected under the Poisson process. Suicides at MIT were consistent with a homogeneous Poisson process, while those at Cornell showed clustering inconsistent with such a process (p = .05). The Anderson-Darling test provides a statistically powerful means to identify suicide clustering in small samples. Practitioners can use this method to test for clustering in relevant communities. The difference in clustering behavior between the two institutions suggests that more institutions should be studied to determine the prevalence of suicide clustering at universities and its causes.
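
    Under a homogeneous Poisson process, the intervals between events are exponentially distributed, so the test reduces to an Anderson-Darling goodness-of-fit check of the intervals against the exponential family, which scipy supports directly. The interval data below are simulated, not the universities' records.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(4)
    # Simulated inter-event intervals in days (truly Poisson in this toy)
    intervals = rng.exponential(scale=180.0, size=23)

    result = stats.anderson(intervals, dist="expon")
    print(f"A^2 = {result.statistic:.3f}")
    for crit, sig in zip(result.critical_values, result.significance_level):
        print(f"  reject exponentiality at {sig}% if A^2 > {crit:.3f}")
    ```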

  1. Multiplicative Modeling of Children's Growth and Its Statistical Properties

    NASA Astrophysics Data System (ADS)

    Kuninaka, Hiroto; Matsushita, Mitsugu

    2014-03-01

    We develop a numerical growth model that can predict the statistical properties of the height distribution of Japanese children. Our previous studies have clarified that the height distribution of schoolchildren shows a transition from the lognormal distribution to the normal distribution during puberty. In this study, we demonstrate by simulation that the transition occurs owing to the variability of the onset of puberty.
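
    The mechanism is easy to demonstrate: if each year's height is multiplied by an independent random factor, log-height is a sum of random terms, so heights come out right-skewed (lognormal-like) while log-heights are approximately normal. The growth factors below are invented for illustration, not fitted to the Japanese data.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(8)
    n_children, n_years = 50_000, 10

    h = np.full(n_children, 110.0)                # illustrative starting height, cm
    for _ in range(n_years):
        h *= rng.normal(1.03, 0.005, n_children)  # multiplicative annual growth

    print("skewness of heights:    ", stats.skew(h))          # > 0, lognormal-like
    print("skewness of log-heights:", stats.skew(np.log(h)))  # ~ 0, normal-like
    ```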

  2. Statistical Properties of SEE Rate Calculation in the Limits of Large and Small Event Counts

    NASA Technical Reports Server (NTRS)

    Ladbury, Ray

    2007-01-01

    This viewgraph presentation reviews the statistical properties of Single Event Effects (SEE) rate calculations. The goal of an SEE rate calculation is to bound the SEE rate, though the question is by how much. The presentation covers: (1) understanding errors on SEE cross sections, (2) methodology: maximum likelihood and confidence contours, (3) tests with simulated data, and (4) applications.

  3. Statistical analysis of influence of soil source on leaching of arsenic and copper from CCA-C treated wood

    Treesearch

    Patricia Lebow; Richard Ziobro; Linda Sites; Tor Schultz; David Pettry; Darrel Nicholas; Stan Lebow; Pascal Kamdem; Roger Fox; Douglas Crawford

    2006-01-01

    Leaching of wood preservatives affects the long-term efficacy and environmental impact of treated wood. Soil properties and wood characteristics can affect leaching of wood preservatives, but these effects are not well understood. This paper reports a statistical analysis of the effects of soil and wood properties on leaching of arsenic (As) and copper (Cu) from southern...

  4. Ex Vivo Characterization of Canine Liver Tissue Viscoelasticity Following High Intensity Focused Ultrasound (HIFU) Ablation

    PubMed Central

    Shahmirzadi, Danial; Hou, Gary Y.; Chen, Jiangang; Konofagou, Elisa E.

    2014-01-01

    Elasticity imaging has shown great promise in detecting High Intensity Focused Ultrasound (HIFU) lesions based on their distinct biomechanical properties. However, quantitative mechanical properties of the tissue and the optimal intensity for obtaining the best contrast parameters remain scarce. In this study, fresh canine livers were ablated ex vivo using combinations of ISPTA intensities of 5.55, 7.16, and 9.07 kW/cm² and durations of 10 and 30 s, leading to six groups of ablated tissues. Biopsy samples were then interrogated using dynamic shear mechanical testing within the range of 0.1-10 Hz to characterize the post-ablation tissue viscoelastic properties. All mechanical parameters were found to be frequency dependent. Compared to the unablated cases, all six groups of ablated tissues showed statistically significantly higher complex shear modulus and shear viscosity. However, among the ablated groups, both complex shear modulus and shear viscosity were found to monotonically increase in groups 1-4 (5.55 kW/cm² for 10 s, 7.16 kW/cm² for 10 s, 9.07 kW/cm² for 10 s, and 5.55 kW/cm² for 30 s, respectively), but to decrease in groups 5 and 6 (7.16 kW/cm² for 30 s and 9.07 kW/cm² for 30 s, respectively). For groups 5 and 6, the temperature was expected to exceed the boiling point, and therefore the decreased stiffening could be due to the compromised integrity of the tissue microstructure. Future studies are needed to estimate the tissue mechanical properties in vivo and to perform real-time monitoring of tissue alterations during ablation. PMID:24315395

  5. A comparison of positive and negative alcohol expectancy and value and their multiplicative composite as predictors of post-treatment abstinence survivorship.

    PubMed

    Jones, B T; McMahon, J

    1996-01-01

    Within social learning theory, positive alcohol expectancies represent motivation to drink and negative expectancies, motivation to restrain. It is also recognized that a subjective evaluation of expectancies ought to moderate their impact, although the evidence for this in social drinkers is problematic. This paper addresses the speculation that the moderating effect will be more evident in clinical populations. This study shows that (i) both expectancy and value reliably, independently and equally predict clients' abstinence survivorship following discharge from a treatment programme (and that this is almost entirely confined to the negative rather than the positive terms); (ii) when expectancy evaluations are combined with expectancy through multiplicative composites (i.e. expectancy x value), the composite's predictive power is only equivalent to that of either expectancy or value on its own; and (iii) when the multiplicative composite is assessed following the statistical guidelines advocated by Evans (1991) (i.e. within the same model as its constituents, expectancy and value), the increase in outcome variance explained by its inclusion is negligible, which casts doubt upon its use in alcohol research. This does not appear to apply to value, however, and its possible role in treatment is discussed.
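
    Evans' (1991) guideline can be made concrete with a small simulation. The sketch below uses ordinary least squares on invented data as a stand-in (the study itself modeled abstinence survivorship): the multiplicative composite is entered in the same model as its constituents, so it is credited only with the increment in explained variance beyond the main effects.

    ```python
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 300
    expectancy = rng.normal(size=n)
    value = rng.normal(size=n)
    outcome = 0.5 * expectancy + 0.5 * value + rng.normal(size=n)  # no true interaction

    X_main = sm.add_constant(np.column_stack([expectancy, value]))
    X_full = sm.add_constant(np.column_stack([expectancy, value, expectancy * value]))

    fit_main = sm.OLS(outcome, X_main).fit()
    fit_full = sm.OLS(outcome, X_full).fit()

    # Increment in variance explained by the composite, and its nested F-test:
    print(fit_full.rsquared - fit_main.rsquared)   # negligible here
    print(fit_full.compare_f_test(fit_main))       # (F, p-value, df_diff)
    ```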

  6. Self-Averaging Property of Minimal Investment Risk of Mean-Variance Model.

    PubMed

    Shinzato, Takashi

    2015-01-01

    In portfolio optimization problems, the minimum expected investment risk is not always smaller than the expected minimal investment risk. That is, using a well-known approach from operations research, it is possible to derive a strategy that minimizes the expected investment risk, but this strategy does not always result in the best rate of return on assets. Prior to making investment decisions, it is important for an investor to know the potential minimal investment risk (or the expected minimal investment risk) and to determine the strategy that will maximize the return on assets. We use the self-averaging property to analyze the potential minimal investment risk and the concentrated investment level for the strategy that gives the best rate of return. We compare the results from our method with those obtained by the operations research approach and by a numerical simulation using the optimal portfolio. The results of our method and the numerical simulation are in agreement, but they differ from those of the operations research approach.
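
    The gap between the minimal expected risk and the expected minimal risk can be reproduced numerically. The sketch below assumes a common formulation of this model (minimize H(w) = ||Xw||^2 / (2N) over portfolios w for i.i.d. simulated returns X, subject to the budget constraint sum(w) = N); averaging the risk first and then minimizing (the operations-research route) gives a strictly larger per-asset value than minimizing each realization and then averaging.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    N, p, trials = 50, 100, 200          # assets, return scenarios, Monte Carlo runs
    alpha = p / N

    min_risks = []
    for _ in range(trials):
        X = rng.normal(size=(p, N))      # return of asset i in scenario mu
        C = X.T @ X
        w = np.linalg.solve(C, np.ones(N))
        w *= N / w.sum()                 # enforce the budget constraint sum(w) = N
        min_risks.append(0.5 / N * np.linalg.norm(X @ w) ** 2)

    print(np.mean(min_risks) / N)        # expected *minimal* risk per asset, ~ (alpha - 1)/2
    print(alpha / 2)                     # *minimal expected* risk per asset (equal weights)
    ```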

  7. Statistical lamb wave localization based on extreme value theory

    NASA Astrophysics Data System (ADS)

    Harley, Joel B.

    2018-04-01

    Guided wave localization methods based on delay-and-sum imaging, matched field processing, and other techniques have been designed and researched to create images that locate and describe structural damage. The maximum value of these images typically represents an estimated damage location. Yet, it is often unclear whether this maximum value, or any other value in the image, is a statistically significant indicator of damage. Furthermore, there are currently few, if any, approaches to assess the statistical significance of guided wave localization images. As a result, we present statistical delay-and-sum and statistical matched field processing localization methods to create statistically significant images of damage. Our framework uses constant false alarm rate statistics and extreme value theory to detect damage with little prior information. We demonstrate our methods with in situ guided wave data from an aluminum plate to detect two 0.75 cm diameter holes. Our results show an expected improvement in statistical significance as the number of sensors increases. With seventeen sensors, both methods successfully detect damage with statistical significance.
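
    A hedged sketch of the statistical framing (not the paper's code): fit an extreme value distribution to the maxima of damage-free localization images, then declare damage only when a new image's peak exceeds the quantile matching the desired false-alarm rate, in the spirit of a constant-false-alarm-rate threshold. All data below are synthetic stand-ins.

    ```python
    import numpy as np
    from scipy.stats import genextreme

    rng = np.random.default_rng(3)

    # Stand-ins for delay-and-sum image maxima from baseline (no-damage) scans:
    baseline_maxima = rng.gumbel(loc=1.0, scale=0.1, size=200)

    shape, loc, scale = genextreme.fit(baseline_maxima)
    threshold = genextreme.ppf(1 - 1e-3, shape, loc=loc, scale=scale)  # P_fa = 1e-3

    new_image_peak = 1.9                  # hypothetical peak from a new scan
    print(new_image_peak > threshold)     # True -> statistically significant damage
    ```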

  8. Turbulence of Weak Gravitational Waves in the Early Universe.

    PubMed

    Galtier, Sébastien; Nazarenko, Sergey V

    2017-12-01

    We study the statistical properties of an ensemble of weak gravitational waves interacting nonlinearly in a flat space-time. We show that the resonant three-wave interactions are absent and develop a theory for four-wave interactions in the reduced case of a 2.5+1 diagonal metric tensor. In this limit, where only plus-polarized gravitational waves are present, we derive the interaction Hamiltonian and consider the asymptotic regime of weak gravitational wave turbulence. Both direct and inverse cascades are found for the energy and the wave action, respectively, and the corresponding wave spectra are derived. The inverse cascade is characterized by a finite-time propagation of the metric excitations, a process similar to an explosive nonequilibrium Bose-Einstein condensation, which provides an efficient mechanism for ironing out small-scale inhomogeneities. The direct cascade leads to an accumulation of the radiation energy in the system. These processes might be important for understanding the early Universe, where a background of weak nonlinear gravitational waves is expected.

  9. Investigations on colour dependent photo induced microactuation effect of FSMA and proposing suitable mechanisms to control the effect

    NASA Astrophysics Data System (ADS)

    Bagchi, A.; Sarkar, S.; Mukhopadhyay, P. K.

    2018-02-01

    Three different coloured focused laser beams were used to study the photo-induced microactuation effect found in some ferromagnetic shape memory alloys. Besides trying to uncover the basic causes of this unique and as yet unexplained effect, these studies are intended to help find other conditions under which to characterize the effect further for practical use. In this study, mechanisms are proposed to control the amplitude of actuation of the sample. Actuation of the FSMA sample was controlled both linearly, with a continuously variable neutral density filter, and periodically, with a linear polarizer. Statistical analysis of the experimental data using ANOVA provided conclusive evidence of the relationship between the actuation of the sample and the various controlling factors. This study is expected to pave the way to using this property of the sample in fabricating and operating useful micro-mechanical systems in the near future.
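
    As an illustration of the kind of ANOVA analysis described above, the sketch below runs a one-way ANOVA on simulated actuation amplitudes grouped by laser colour; the amplitudes and group means are hypothetical, not the study's data.

    ```python
    import numpy as np
    from scipy.stats import f_oneway

    rng = np.random.default_rng(4)
    red   = rng.normal(10.0, 1.0, 30)   # hypothetical actuation amplitudes per colour (um)
    green = rng.normal(12.0, 1.0, 30)
    blue  = rng.normal(14.0, 1.0, 30)

    f_stat, p_value = f_oneway(red, green, blue)
    print(f_stat, p_value)              # small p -> colour affects actuation amplitude
    ```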

  10. Clustering change patterns using Fourier transformation with time-course gene expression data.

    PubMed

    Kim, Jaehee

    2011-01-01

    To understand the behavior of genes, it is important to explore how the patterns of gene expression change over a period of time, because biologically related gene groups can share the same change patterns. In this study, the problem of finding similar change patterns is reduced to clustering with derivative Fourier coefficients. This work is aimed at discovering gene groups with similar change patterns which share similar biological properties. We developed a statistical model using derivative Fourier coefficients to identify similar change patterns of gene expression. We used a model-based method to cluster the Fourier series estimation of derivatives. We applied our model to cluster change patterns of yeast cell cycle microarray expression data with alpha-factor synchronization. Because the method clusters probabilistically neighboring data, model-based clustering with our proposed model yielded biologically interpretable results. We expect that our proposed Fourier analysis with suitably chosen smoothing parameters could serve as a useful tool in classifying genes and interpreting possible biological change patterns.
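
    A minimal sketch of the idea (not the paper's estimator): represent each gene's time course by Fourier coefficients of its derivative (the k-th coefficient scaled by i*2*pi*k/T), then apply model-based clustering to those features so that genes sharing a change pattern group together. The data and the choice of a Gaussian mixture are assumptions for illustration.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    rng = np.random.default_rng(5)
    T, n_genes = 18, 120                        # time points, genes (synthetic)
    t = np.arange(T)
    phase = rng.choice([0.0, np.pi], size=n_genes)   # two underlying change patterns
    series = np.cos(2 * np.pi * t / T + phase[:, None]) + 0.3 * rng.normal(size=(n_genes, T))

    coeffs = np.fft.rfft(series, axis=1)
    k = np.arange(coeffs.shape[1])
    deriv_coeffs = coeffs * (2j * np.pi * k / T)     # differentiate in Fourier space
    features = np.column_stack([deriv_coeffs[:, 1:4].real, deriv_coeffs[:, 1:4].imag])

    labels = GaussianMixture(n_components=2, random_state=0).fit_predict(features)
    print(labels[:10])                               # genes grouped by change pattern
    ```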

  11. Statistical properties of color-signal spaces.

    PubMed

    Lenz, Reiner; Bui, Thanh Hai

    2005-05-01

    In applications of principal component analysis (PCA) it has often been observed that the eigenvector with the largest eigenvalue has only nonnegative entries when the vectors of the underlying stochastic process have only nonnegative values. This observation has been used to show that the coordinate vectors in PCA are all located in a cone. We prove that the nonnegativity of the first eigenvector follows from Perron-Frobenius (and Krein-Rutman) theory. Experiments also show that for stochastic processes with nonnegative signals the mean vector is often very similar to the first eigenvector. This is not true in general, but we first give a heuristic explanation of why such a similarity can be expected. We then derive a connection between the dominance of the first eigenvalue and the similarity between the mean and the first eigenvector, and show how to check the relative size of the first eigenvalue without actually computing it. In the last part of the paper we discuss the implications of the theoretical results for multispectral color processing.
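
    Both observations are easy to check numerically. The sketch below (synthetic data, uncentred second-moment matrix assumed as the relevant operator) verifies the Perron-Frobenius sign pattern of the leading eigenvector and its closeness to the mean vector when the first eigenvalue dominates.

    ```python
    import numpy as np

    rng = np.random.default_rng(6)
    signals = rng.uniform(0.0, 1.0, size=(5000, 31))        # nonnegative "spectra"

    second_moment = signals.T @ signals / len(signals)
    eigvals, eigvecs = np.linalg.eigh(second_moment)        # ascending eigenvalues
    first = eigvecs[:, -1] * np.sign(eigvecs[:, -1].sum())  # fix the overall sign

    mean = signals.mean(axis=0)
    print((first >= 0).all())                         # nonnegative entries, as predicted
    print(first @ mean / np.linalg.norm(mean))        # cosine similarity to the mean, near 1
    ```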

  12. Statistical properties of color-signal spaces

    NASA Astrophysics Data System (ADS)

    Lenz, Reiner; Hai Bui, Thanh

    2005-05-01

    In applications of principal component analysis (PCA) it has often been observed that the eigenvector with the largest eigenvalue has only nonnegative entries when the vectors of the underlying stochastic process have only nonnegative values. This observation has been used to show that the coordinate vectors in PCA are all located in a cone. We prove that the nonnegativity of the first eigenvector follows from Perron-Frobenius (and Krein-Rutman) theory. Experiments also show that for stochastic processes with nonnegative signals the mean vector is often very similar to the first eigenvector. This is not true in general, but we first give a heuristic explanation of why such a similarity can be expected. We then derive a connection between the dominance of the first eigenvalue and the similarity between the mean and the first eigenvector, and show how to check the relative size of the first eigenvalue without actually computing it. In the last part of the paper we discuss the implications of the theoretical results for multispectral color processing.

  13. Investigations on colour dependent photo induced microactuation effect of FSMA and proposing suitable mechanisms to control the effect

    NASA Astrophysics Data System (ADS)

    Bagchi, A.; Sarkar, S.; Mukhopadhyay, P. K.

    2018-07-01

    Three different coloured focused laser beams were used to study the photo-induced microactuation effect found in some ferromagnetic shape memory alloys. Besides trying to uncover the basic causes of this unique and as yet unexplained effect, these studies are intended to help find other conditions under which to characterize the effect further for practical use. In this study, mechanisms are proposed to control the amplitude of actuation of the sample. Actuation of the FSMA sample was controlled both linearly, with a continuously variable neutral density filter, and periodically, with a linear polarizer. Statistical analysis of the experimental data using ANOVA provided conclusive evidence of the relationship between the actuation of the sample and the various controlling factors. This study is expected to pave the way to using this property of the sample in fabricating and operating useful micro-mechanical systems in the near future.

  14. Software-defined Quantum Networking Ecosystem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Humble, Travis S.; Sadlier, Ronald

    The software enables a user to perform modeling and simulation of software-defined quantum networks. The software addresses the problem of how to synchronize transmission of quantum and classical signals through multi-node networks and to demonstrate quantum information protocols such as quantum teleportation. The software approaches this problem by generating a graphical model of the underlying network and attributing properties to each node and link in the graph. The graphical model is then simulated using a combination of discrete-event simulators to calculate the expected state of each node and link in the graph at a future time. A user interacts with the software by providing an initial network model and instantiating methods for the nodes to transmit information with each other. This includes writing application scripts in Python that make use of the software library interfaces. A user then initiates the application scripts, which invokes the software simulation. The user then uses the built-in diagnostic tools to query the state of the simulation and to collect statistics on synchronization.
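
    A toy sketch of the described approach, with interfaces assumed for illustration (this is not the package's actual API): model the network as a graph with per-link latencies for the classical and quantum channels, then step a discrete-event queue to see when each signal of a teleportation-style protocol arrives and whether the two channels stay synchronized.

    ```python
    import heapq
    import networkx as nx

    net = nx.Graph()
    net.add_edge("Alice", "Bob", classical_delay=2.0, quantum_delay=5.0)

    events = []  # (time, description) min-heap of network events
    pair_time = net["Alice"]["Bob"]["quantum_delay"]
    heapq.heappush(events, (pair_time, "entangled pair shared"))
    # Classical correction bits are sent only after the pair is shared and measured:
    heapq.heappush(events, (pair_time + net["Alice"]["Bob"]["classical_delay"],
                            "classical correction bits arrive"))

    while events:
        time, what = heapq.heappop(events)
        print(f"t={time}: {what}")   # Bob's state is usable only after the corrections
    ```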

  15. Statistical Mechanical Proof of the Second Law of Thermodynamics based on Volume Entropy

    NASA Astrophysics Data System (ADS)

    Campisi, Michele

    2007-10-01

    As pointed out in [M. Campisi, Stud. Hist. Phil. Mod. Phys. 36 (2005) 275-290], the volume entropy (that is, the logarithm of the volume of phase space enclosed by the constant-energy hypersurface) provides a good mechanical analogue of thermodynamic entropy, because it satisfies the heat theorem and is an adiabatic invariant. This property explains the "equal" sign in Clausius' principle (S_f >= S_i) in a purely mechanical way and suggests that the volume entropy might explain the "larger than" sign (i.e., the Law of Entropy Increase) if non-adiabatic transformations were considered. Based on the principles of quantum mechanics, here we prove that, provided the initial equilibrium satisfies the natural condition of decreasing ordering of probabilities, the expectation value of the volume entropy cannot decrease for arbitrary transformations performed by external sources of work on an insulated system. This can be regarded as a rigorous quantum mechanical proof of the Second Law.
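
    In symbols, the volume entropy described above is (standard notation, assumed rather than quoted from the abstract):

    ```latex
    S(E) \;=\; k_{\mathrm{B}} \ln \Omega(E), \qquad
    \Omega(E) \;=\; \int \theta\bigl(E - H(q, p)\bigr)\, \mathrm{d}q\, \mathrm{d}p ,
    ```

    with theta the Heaviside step function, so that Omega(E) is the phase-space volume enclosed by the constant-energy hypersurface H(q, p) = E.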

  16. Adapting bioinformatics curricula for big data.

    PubMed

    Greene, Anna C; Giffin, Kristine A; Greene, Casey S; Moore, Jason H

    2016-01-01

    Modern technologies are capable of generating enormous amounts of data that measure complex biological systems. Computational biologists and bioinformatics scientists are increasingly being asked to use these data to reveal key systems-level properties. We review the extent to which curricula are changing in the era of big data. We identify key competencies that scientists dealing with big data are expected to possess across fields, and we use this information to propose courses to meet these growing needs. While bioinformatics programs have traditionally trained students in data-intensive science, we identify areas of particular biological, computational and statistical emphasis important for this era that can be incorporated into existing curricula. For each area, we propose a course structured around these topics, which can be adapted in whole or in parts into existing curricula. In summary, specific challenges associated with big data provide an important opportunity to update existing curricula, but we do not foresee a wholesale redesign of bioinformatics training programs. © The Author 2015. Published by Oxford University Press.

  17. An experimental limit on the charge of antihydrogen

    PubMed Central

    Amole, C.; Ashkezari, M. D.; Baquero-Ruiz, M.; Bertsche, W.; Butler, E.; Capra, A.; Cesar, C. L.; Charlton, M.; Eriksson, S.; Fajans, J.; Friesen, T.; Fujiwara, M. C.; Gill, D. R.; Gutierrez, A.; Hangst, J. S.; Hardy, W. N.; Hayden, M. E.; Isaac, C. A.; Jonsell, S.; Kurchaninov, L.; Little, A.; Madsen, N.; McKenna, J. T. K.; Menary, S.; Napoli, S. C.; Nolan, P.; Olchanski, K.; Olin, A.; Povilus, A.; Pusa, P.; Rasmussen, C.Ø.; Robicheaux, F.; Sarid, E.; Silveira, D. M.; So, C.; Tharp, T. D.; Thompson, R. I.; van der Werf, D. P.; Vendeiro, Z.; Wurtele, J. S.; Zhmoginov, A. I.; Charman, A. E.

    2014-01-01

    The properties of antihydrogen are expected to be identical to those of hydrogen, and any differences would constitute a profound challenge to the fundamental theories of physics. The most commonly discussed antiatom-based tests of these theories are searches for antihydrogen-hydrogen spectral differences (tests of CPT (charge-parity-time) invariance) or gravitational differences (tests of the weak equivalence principle). Here we, the ALPHA Collaboration, report a different and somewhat unusual test of CPT and of quantum anomaly cancellation. A retrospective analysis of the influence of electric fields on antihydrogen atoms released from the ALPHA trap finds a mean axial deflection of 4.1±3.4 mm for an average axial electric field of 0.51 V mm^-1. Combined with extensive numerical modelling, this measurement leads to a bound on the charge Qe of antihydrogen of Q = (-1.3±1.1±0.4) × 10^-8. Here, e is the unit charge, and the errors are from statistics and systematic effects. PMID:24892800

  18. Chromosomal Thermal Index: a comprehensive way to integrate the thermal adaptation of Drosophila subobscura whole karyotype.

    PubMed

    Arenas, Conxita; Zivanovic, Goran; Mestres, Francesc

    2018-02-01

    Drosophila has proven to be an excellent model for studying the adaptation of organisms to global warming, with chromosomal inversion polymorphism playing a key role in this adaptation. Here, we introduce a new index (Chromosomal Thermal Index, or CTI) to quantify the thermal adaptation of a population according to its composition of "warm"- and "cold"-adapted inversions. This index is intuitive, has good statistical properties, and can be used to form hypotheses about the effect of global warming on natural populations. We show the usefulness of the CTI using data from European populations of D. subobscura sampled in different years. Out of 15 comparisons over time, nine showed a significant increase in CTI, in accordance with global warming expectations. Although large regions of the genome outside inversions contain thermal adaptation genes, our results show that the total amount of warm or cold inversions in populations seems to be directly involved in thermal adaptation, whereas the interactions between the inversion content of homologous and non-homologous chromosomes are not relevant.
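
    The abstract does not give the formula for the CTI, so the sketch below is only a plausible stand-in for the kind of quantity involved: the share of inversions classified as warm-adapted among all classified inversions in a population sample. The arrangement names, counts, and the warm/cold classification are all hypothetical.

    ```python
    # Hypothetical inversion counts for one population sample:
    warm_counts = {"E(1+2)": 48, "O(3+4)": 31}   # arrangements classed as warm-adapted
    cold_counts = {"E(st)": 12, "O(st)": 9}      # arrangements classed as cold-adapted

    warm = sum(warm_counts.values())
    cold = sum(cold_counts.values())
    cti = warm / (warm + cold)                   # 0 = fully cold-adapted, 1 = fully warm
    print(round(cti, 3))                         # expected to rise under global warming
    ```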

  19. Adapting bioinformatics curricula for big data

    PubMed Central

    Greene, Anna C.; Giffin, Kristine A.; Greene, Casey S.

    2016-01-01

    Modern technologies are capable of generating enormous amounts of data that measure complex biological systems. Computational biologists and bioinformatics scientists are increasingly being asked to use these data to reveal key systems-level properties. We review the extent to which curricula are changing in the era of big data. We identify key competencies that scientists dealing with big data are expected to possess across fields, and we use this information to propose courses to meet these growing needs. While bioinformatics programs have traditionally trained students in data-intensive science, we identify areas of particular biological, computational and statistical emphasis important for this era that can be incorporated into existing curricula. For each area, we propose a course structured around these topics, which can be adapted in whole or in parts into existing curricula. In summary, specific challenges associated with big data provide an important opportunity to update existing curricula, but we do not foresee a wholesale redesign of bioinformatics training programs. PMID:25829469

  20. Probing the statistics of transport in the Hénon Map

    NASA Astrophysics Data System (ADS)

    Alus, O.; Fishman, S.; Meiss, J. D.

    2016-09-01

    The phase space of an area-preserving map typically contains infinitely many elliptic islands embedded in a chaotic sea. Orbits near the boundary of a chaotic region have been observed to stick for long times, strongly influencing their transport properties. The boundary is composed of invariant "boundary circles." We briefly report recent results on the distribution of rotation numbers of boundary circles for the Hénon quadratic map and show that the probability of occurrence of small integer entries in their continued fraction expansions is larger than would be expected for a number chosen at random. However, large integer entries occur with probabilities distributed proportionally to the random case. The probability distributions of ratios of fluxes through island chains are reported as well. These island chains are neighbours in the sense of the Meiss-Ott Markov-tree model. Two distinct universality families are found. The distributions of the ratio between flux and orbital period are also presented. All of these results have implications for models of transport in mixed phase space.
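
    The "number chosen at random" baseline invoked here is the Gauss-Kuzmin law, under which continued-fraction entry k appears with probability -log2(1 - 1/(k+1)^2). The sketch below checks uniformly random numbers against that law; the paper's finding is that boundary-circle rotation numbers over-represent small entries relative to this baseline.

    ```python
    import numpy as np

    def cf_entries(x, depth=15):
        """Continued-fraction entries a_1..a_depth of x in (0, 1)."""
        entries = []
        for _ in range(depth):
            if x < 1e-12:
                break
            x = 1.0 / x
            a = int(x)
            entries.append(a)
            x -= a
        return entries

    rng = np.random.default_rng(7)
    samples = rng.uniform(0.0, 1.0, 10_000)
    all_entries = np.concatenate([cf_entries(x) for x in samples])

    for k in range(1, 6):
        empirical = np.mean(all_entries == k)
        gauss_kuzmin = -np.log2(1.0 - 1.0 / (k + 1) ** 2)
        print(k, round(empirical, 4), round(gauss_kuzmin, 4))   # columns should agree
    ```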
