Sample records for statistical complexity measure

  1. Unifying Complexity and Information

    NASA Astrophysics Data System (ADS)

    Ke, Da-Guan

    2013-04-01

    Complex systems, arising in many contexts in the computer, life, social, and physical sciences, have not shared a generally accepted complexity measure playing a role as fundamental as that of the Shannon entropy H in statistical mechanics. Superficially conflicting criteria of complexity measurement, i.e. complexity-randomness (C-R) relations, have given rise to a special measure intrinsically adaptable to more than one criterion. However, the deeper causes of both the conflict and the adaptability remain unclear. Here I trace the root of each representative or adaptable measure to its particular universal data-generating or -regenerating model (UDGM or UDRM). A representative measure for deterministic dynamical systems is found to be the counterpart of H for random processes, clearly redefining the boundary between the different criteria. A specific UDRM achieving the intrinsic adaptability enables a general information measure that ultimately resolves the major disputes. This work encourages a single framework covering deterministic systems, statistical mechanics and real-world living organisms.

  2. Generalized statistical complexity measures: Geometrical and analytical properties

    NASA Astrophysics Data System (ADS)

    Martin, M. T.; Plastino, A.; Rosso, O. A.

    2006-09-01

    We discuss bounds on the values adopted by the generalized statistical complexity measures [M.T. Martin et al., Phys. Lett. A 311 (2003) 126; P.W. Lamberti et al., Physica A 334 (2004) 119] introduced by López Ruiz et al. [Phys. Lett. A 209 (1995) 321] and Shiner et al. [Phys. Rev. E 59 (1999) 1459]. Several new theorems are proved and illustrated with reference to the celebrated logistic map.
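
    The record itself gives no code, but the LMC-type construction it refers to can be sketched as the product of a normalized Shannon entropy and a "disequilibrium" term (here the Euclidean distance to the uniform distribution, following López-Ruiz et al.). The Python sketch below is illustrative only; the binning of a logistic-map orbit is an assumption, not the paper's procedure.

```python
# Illustrative sketch (not the authors' code): an LMC-style statistical
# complexity C = H_norm * D for a discrete probability vector, where H_norm is
# the normalized Shannon entropy and D the Euclidean disequilibrium.
import numpy as np

def lmc_complexity(p):
    """Return (H_norm, D, C) for a vector of counts or probabilities."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()
    n = p.size
    nz = p[p > 0]
    h = -np.sum(nz * np.log(nz)) / np.log(n)      # normalized entropy in [0, 1]
    d = np.sum((p - 1.0 / n) ** 2)                # distance to the uniform distribution
    return h, d, h * d

# Toy example: histogram of an orbit of the fully chaotic logistic map
x, orbit = 0.4, []
for _ in range(10000):
    x = 4.0 * x * (1.0 - x)
    orbit.append(x)
counts, _ = np.histogram(orbit, bins=32)
print(lmc_complexity(counts))
```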

  3. Applied statistics in agricultural, biological, and environmental sciences.

    USDA-ARS's Scientific Manuscript database

    Agronomic research often involves measurement and collection of multiple response variables in an effort to understand the more complex nature of the system being studied. Multivariate statistical methods encompass the simultaneous analysis of all random variables measured on each experimental or s...

  4. Statistical complexity measure of pseudorandom bit generators

    NASA Astrophysics Data System (ADS)

    González, C. M.; Larrondo, H. A.; Rosso, O. A.

    2005-08-01

    Pseudorandom number generators (PRNG) are extensively used in Monte Carlo simulations, gambling machines and cryptography as substitutes for ideal random number generators (RNG). Each application imposes different statistical requirements on PRNGs. As L’Ecuyer clearly states, “the main goal for Monte Carlo methods is to reproduce the statistical properties on which these methods are based whereas for gambling machines and cryptology, observing the sequence of output values for some time should provide no practical advantage for predicting the forthcoming numbers better than by just guessing at random”. In accordance with these different applications, several statistical test suites have been developed to analyze the sequences generated by PRNGs. In a recent paper a new statistical complexity measure [Phys. Lett. A 311 (2003) 126] was defined. Here we propose this measure as a randomness quantifier for PRNGs. The test is applied to three well-known and widely tested PRNGs available in the literature, all of them based on mathematical algorithms. Another PRNG, based on the Lorenz 3D chaotic dynamical system, is also analyzed. PRNGs based on chaos may be considered as models for physical noise sources, and important new results have recently been reported. All the design steps of this PRNG are described, and each stage increases the PRNG randomness using a different strategy. It is shown that the MPR statistical complexity measure is capable of quantifying this randomness improvement. The PRNG based on the chaotic 3D Lorenz dynamical system is also evaluated using traditional digital signal processing tools for comparison.
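
    As a rough illustration of how such a quantifier could be applied to a bit stream, the sketch below estimates the distribution of overlapping m-bit words produced by a generator and passes it to the `lmc_complexity` helper from the sketch shown in record 2 above. The word length m = 8 and the use of NumPy's default generator as a stand-in PRNG are illustrative assumptions, not the paper's setup.

```python
# Hedged sketch, not the authors' test: probability of each overlapping m-bit
# word in a 0/1 sequence. An ideal generator gives near-maximal entropy and
# near-zero LMC-style complexity for this distribution.
import numpy as np

def word_distribution(bits, m=8):
    bits = np.asarray(bits, dtype=int)
    words = np.lib.stride_tricks.sliding_window_view(bits, m)
    codes = words @ (1 << np.arange(m)[::-1])     # m-bit word -> integer code
    counts = np.bincount(codes, minlength=2 ** m)
    return counts / counts.sum()

bits = np.random.default_rng(1).integers(0, 2, size=100_000)   # stand-in PRNG output
p = word_distribution(bits, m=8)
# p can now be scored, e.g. with lmc_complexity(p) from the sketch above.
```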

  5. Statistical Analysis of Big Data on Pharmacogenomics

    PubMed Central

    Fan, Jianqing; Liu, Han

    2013-01-01

    This paper discusses statistical methods for estimating complex correlation structure from large pharmacogenomic datasets. We selectively review several prominent statistical methods for estimating a large covariance matrix for understanding correlation structure, the inverse covariance matrix for network modeling, large-scale simultaneous tests for selecting significantly differentially expressed genes and proteins and genetic markers for complex diseases, and high-dimensional variable selection for identifying important molecules for understanding molecular mechanisms in pharmacogenomics. Their applications to gene network estimation and biomarker selection are used to illustrate the methodological power. Several new challenges of Big Data analysis, including complex data distribution, missing data, measurement error, spurious correlation, endogeneity, and the need for robust statistical methods, are also discussed. PMID:23602905

  6. Multivariate analysis: greater insights into complex systems

    USDA-ARS's Scientific Manuscript database

    Many agronomic researchers measure and collect multiple response variables in an effort to understand the more complex nature of the system being studied. Multivariate (MV) statistical methods encompass the simultaneous analysis of all random variables (RV) measured on each experimental or sampling ...

  7. Dissecting the genetics of complex traits using summary association statistics.

    PubMed

    Pasaniuc, Bogdan; Price, Alkes L

    2017-02-01

    During the past decade, genome-wide association studies (GWAS) have been used to successfully identify tens of thousands of genetic variants associated with complex traits and diseases. These studies have produced extensive repositories of genetic variation and trait measurements across large numbers of individuals, providing tremendous opportunities for further analyses. However, privacy concerns and other logistical considerations often limit access to individual-level genetic data, motivating the development of methods that analyse summary association statistics. Here, we review recent progress on statistical methods that leverage summary association data to gain insights into the genetic basis of complex traits and diseases.

  8. Dissecting the genetics of complex traits using summary association statistics

    PubMed Central

    Pasaniuc, Bogdan; Price, Alkes L.

    2017-01-01

    During the past decade, genome-wide association studies (GWAS) have successfully identified tens of thousands of genetic variants associated with complex traits and diseases. These studies have produced extensive repositories of genetic variation and trait measurements across large numbers of individuals, providing tremendous opportunities for further analyses. However, privacy concerns and other logistical considerations often limit access to individual-level genetic data, motivating the development of methods that analyze summary association statistics. Here we review recent progress on statistical methods that leverage summary association data to gain insights into the genetic basis of complex traits and diseases. PMID:27840428

  9. On the Way to Appropriate Model Complexity

    NASA Astrophysics Data System (ADS)

    Höge, M.

    2016-12-01

    When statistical models are used to represent natural phenomena they are often too simple or too complex - this is known. But what exactly is model complexity? Among many other definitions, the complexity of a model can be conceptualized as a measure of statistical dependence between observations and parameters (Van der Linde, 2014). However, several issues remain when working with model complexity: A unique definition for model complexity is missing. Assuming a definition is accepted, how can model complexity be quantified? How can we use quantified complexity to the benefit of modeling? Generally defined, "complexity is a measure of the information needed to specify the relationships between the elements of organized systems" (Bawden & Robinson, 2015). The complexity of a system changes as the knowledge about the system changes. For models this means that complexity is not a static concept: With more data or higher spatio-temporal resolution of parameters, the complexity of a model changes. There are essentially three categories into which all commonly used complexity measures can be classified: (1) An explicit representation of model complexity as "degrees of freedom" of a model, e.g. the effective number of parameters. (2) Model complexity as code length, a.k.a. "Kolmogorov complexity": The longer the shortest model code, the higher its complexity (e.g. in bits). (3) Complexity defined via the information entropy of parametric or predictive uncertainty. Preliminary results show that Bayes' theorem allows for incorporating all parts of the non-static concept of model complexity, such as data quality and quantity or parametric uncertainty. Therefore, we test how different approaches for measuring model complexity perform in comparison to a fully Bayesian model selection procedure. Ultimately, we want to find a measure that helps to assess the most appropriate model.
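
    One concrete instance of the "degrees of freedom" category is the effective number of parameters p_D = mean(D(theta)) - D(mean(theta)) used in the deviance information criterion, where D is the deviance -2 log-likelihood. The sketch below computes it for a toy one-parameter Gaussian model; the model, the synthetic data, and the posterior sampling scheme are illustrative assumptions, not taken from this abstract.

```python
# Hedged illustration of an effective-number-of-parameters complexity measure
# (DIC-style p_D) for a toy Gaussian model with known sigma.
import numpy as np

def deviance(mu, y, sigma=1.0):
    """D = -2 log likelihood of the data y under mean mu."""
    return -2.0 * np.sum(-0.5 * ((y - mu) / sigma) ** 2
                         - np.log(sigma * np.sqrt(2.0 * np.pi)))

rng = np.random.default_rng(0)
y = rng.normal(0.3, 1.0, size=50)                          # synthetic observations
# Approximate posterior of mu under a flat prior: N(mean(y), sigma^2 / n)
posterior_mu = rng.normal(y.mean(), 1.0 / np.sqrt(len(y)), size=4000)

d_bar = np.mean([deviance(m, y) for m in posterior_mu])    # mean deviance
p_d = d_bar - deviance(posterior_mu.mean(), y)             # effective no. of parameters
print("p_D ~", p_d)                                        # close to 1 for this one-parameter model
```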

  10. Statistics of multi-look AIRSAR imagery: A comparison of theory with measurements

    NASA Technical Reports Server (NTRS)

    Lee, J. S.; Hoppel, K. W.; Mango, S. A.

    1993-01-01

    The intensity and amplitude statistics of SAR images, such as L-Band HH for SEASAT and SIR-B, and C-Band VV for ERS-1, have been extensively investigated for various terrain, ground cover and ocean surfaces. Less well known are the statistics between multiple channels of polarimetric or interferometric SARs, especially for multi-look processed data. In this paper, we investigate the probability density functions (PDFs) of phase differences, the magnitude of complex products and the amplitude ratios between polarization channels (i.e. HH, HV, and VV) using 1-look and 4-look AIRSAR polarimetric data. Measured histograms are compared with theoretical PDFs which were recently derived based on a complex Gaussian model.

  11. Characterizing and locating air pollution sources in a complex industrial district using optical remote sensing technology and multivariate statistical modeling.

    PubMed

    Chang, Pao-Erh Paul; Yang, Jen-Chih Rena; Den, Walter; Wu, Chang-Fu

    2014-09-01

    Emissions of volatile organic compounds (VOCs) are among the most frequent causes of environmental nuisance complaints in urban areas, especially where industrial districts are nearby. Unfortunately, identifying the emission sources responsible for VOCs is an inherently difficult task. In this study, we proposed a dynamic approach to gradually confine the location of potential VOC emission sources in an industrial complex, by combining multi-path open-path Fourier transform infrared spectrometry (OP-FTIR) measurement and the statistical method of principal component analysis (PCA). Closed-cell FTIR was further used to verify the VOC emission sources by measuring emitted VOCs from selected exhaust stacks at factories in the confined areas. Multiple open-path monitoring lines were deployed during a 3-month monitoring campaign in a complex industrial district. The emission patterns were identified and the locations of emissions were confined using the wind data collected simultaneously. N,N-Dimethylformamide (DMF), 2-butanone, toluene, and ethyl acetate, with mean concentrations of 80.0 ± 1.8, 34.5 ± 0.8, 103.7 ± 2.8, and 26.6 ± 0.7 ppbv, respectively, were identified as the major VOC mixture at all times of the day around the receptor site. As a toxic air pollutant, the concentrations of DMF in air samples were found to exceed the ambient standard despite the path-averaging effect of OP-FTIR on concentration levels. The PCA identified three major emission sources, including the PU coating, chemical packaging, and lithographic printing industries. By combining instrumental measurement and statistical modeling, this study has established a systematic approach for locating emission sources. Statistical modeling (PCA) plays an important role in reducing the dimensionality of a large measured dataset and identifying underlying emission sources. Instrumental measurement, in turn, helps verify the outcomes of the statistical modeling. The field study has demonstrated the feasibility of using multi-path OP-FTIR measurement, and the wind data incorporated with the statistical modeling (PCA) successfully identified the major emission sources in a complex industrial district.
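
    The PCA step described above can be sketched generically: arrange the path-averaged concentrations as a matrix (rows = averaging intervals, columns = compounds), standardize, and inspect the leading components and their loadings for source signatures. The data below are synthetic and the compound and source counts are assumptions for illustration only.

```python
# Hedged sketch of the PCA source-identification idea (not the study's pipeline).
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
sources = rng.lognormal(size=(200, 3))            # 3 hypothetical emission sources over time
mixing = np.abs(rng.normal(size=(3, 4)))          # source -> compound loading matrix
X = sources @ mixing + rng.normal(scale=0.1, size=(200, 4))   # 4 measured compounds

pca = PCA(n_components=3)
scores = pca.fit_transform(StandardScaler().fit_transform(X))
print("explained variance ratios:", pca.explained_variance_ratio_)
print("loadings (components x compounds):\n", pca.components_)
```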

  12. Quantifying networks complexity from information geometry viewpoint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Felice, Domenico, E-mail: domenico.felice@unicam.it; Mancini, Stefano; INFN-Sezione di Perugia, Via A. Pascoli, I-06123 Perugia

    We consider a Gaussian statistical model whose parameter space is given by the variances of random variables. Underlying this model we identify networks by interpreting random variables as sitting on vertices and their correlations as weighted edges among vertices. We then associate to the parameter space a statistical manifold endowed with a Riemannian metric structure (that of Fisher-Rao). Going on, in analogy with the microcanonical definition of entropy in Statistical Mechanics, we introduce an entropic measure of networks complexity. We prove that it is invariant under network isomorphism. Above all, considering networks as simplicial complexes, we evaluate this entropy on simplexes and find that it monotonically increases with their dimension.

  13. Information geometric methods for complexity

    NASA Astrophysics Data System (ADS)

    Felice, Domenico; Cafaro, Carlo; Mancini, Stefano

    2018-03-01

    Research on the use of information geometry (IG) in modern physics has witnessed significant advances recently. In this review article, we report on the utilization of IG methods to define measures of complexity in both classical and, whenever available, quantum physical settings. A paradigmatic example of a dramatic change in complexity is given by phase transitions (PTs). Hence, we review both global and local aspects of PTs described in terms of the scalar curvature of the parameter manifold and the components of the metric tensor, respectively. We also report on the behavior of geodesic paths on the parameter manifold used to gain insight into the dynamics of PTs. Going further, we survey measures of complexity arising in the geometric framework. In particular, we quantify complexity of networks in terms of the Riemannian volume of the parameter space of a statistical manifold associated with a given network. We are also concerned with complexity measures that account for the interactions of a given number of parts of a system that cannot be described in terms of a smaller number of parts of the system. Finally, we investigate complexity measures of entropic motion on curved statistical manifolds that arise from a probabilistic description of physical systems in the presence of limited information. The Kullback-Leibler divergence, the distance to an exponential family and volumes of curved parameter manifolds, are examples of essential IG notions exploited in our discussion of complexity. We conclude by discussing strengths, limits, and possible future applications of IG methods to the physics of complexity.

  14. Modeling Complex Phenomena Using Multiscale Time Sequences

    DTIC Science & Technology

    2009-08-24

    Complex phenomena can be characterized at different scales, and the relation between these scales described, by combining a set of statistical fractal measures based on Hurst and Hölder exponents, auto-regressive methods, and Fourier and wavelet decomposition methods. The applications for this technology...

  15. Ionospheric scintillation studies

    NASA Technical Reports Server (NTRS)

    Rino, C. L.; Freemouw, E. J.

    1973-01-01

    The diffracted field of a monochromatic plane wave was characterized by two complex correlation functions. For a Gaussian complex field, these quantities suffice to completely define the statistics of the field. Thus, one can in principle calculate the statistics of any measurable quantity in terms of the model parameters. The best data fits were achieved for intensity statistics derived under the Gaussian statistics hypothesis. The signal structure that achieved the best fit was nearly invariant with scintillation level and irregularity source (ionosphere or solar wind). It was characterized by the fact that more than 80% of the scattered signal power is in phase quadrature with the undeviated or coherent signal component. Thus, the Gaussian-statistics hypothesis is both convenient and accurate for channel modeling work.

  16. Evaluating measurement models in clinical research: covariance structure analysis of latent variable models of self-conception.

    PubMed

    Hoyle, R H

    1991-02-01

    Indirect measures of psychological constructs are vital to clinical research. On occasion, however, the meaning of indirect measures of psychological constructs is obfuscated by statistical procedures that do not account for the complex relations between items and latent variables and among latent variables. Covariance structure analysis (CSA) is a statistical procedure for testing hypotheses about the relations among items that indirectly measure a psychological construct and relations among psychological constructs. This article introduces clinical researchers to the strengths and limitations of CSA as a statistical procedure for conceiving and testing structural hypotheses that are not tested adequately with other statistical procedures. The article is organized around two empirical examples that illustrate the use of CSA for evaluating measurement models with correlated error terms, higher-order factors, and measured and latent variables.

  17. Statistical similarity measures for link prediction in heterogeneous complex networks

    NASA Astrophysics Data System (ADS)

    Shakibian, Hadi; Charkari, Nasrollah Moghadam

    2018-07-01

    The majority of link prediction measures in heterogeneous complex networks rely on the nodes' connectivities, while less attention has been paid to the importance of the nodes and paths. In this paper, we propose some new meta-path based statistical similarity measures to properly perform the link prediction task. The main idea in the proposed measures is to derive co-occurrence events, collected in a number of co-occurrence matrices, between the nodes visited along a meta-path. The extracted co-occurrence matrices are analyzed in terms of the energy, inertia, local homogeneity, correlation, and information measure of correlation to determine various information-theoretic measures. We evaluate the proposed measures, denoted as link energy, link inertia, link local homogeneity, link correlation, and link information measure of correlation, using a standard DBLP network data set. The results of the AUC score and Precision rate indicate the validity and accuracy of the proposed measures in comparison to popular meta-path based similarity measures.
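
    The matrix statistics named in the abstract (energy, inertia, local homogeneity, correlation) are classical co-occurrence-matrix features; a minimal sketch for a single normalized co-occurrence matrix is given below. How the matrices are built from meta-path traversals is specific to the paper and is not reproduced here.

```python
# Hedged sketch: standard statistics of a normalized co-occurrence matrix P.
import numpy as np

def cooccurrence_measures(P):
    P = P / P.sum()
    i, j = np.indices(P.shape)
    mu_i, mu_j = (i * P).sum(), (j * P).sum()
    sd_i = np.sqrt(((i - mu_i) ** 2 * P).sum())
    sd_j = np.sqrt(((j - mu_j) ** 2 * P).sum())
    return {
        "energy": (P ** 2).sum(),
        "inertia": ((i - j) ** 2 * P).sum(),                  # a.k.a. contrast
        "local_homogeneity": (P / (1.0 + (i - j) ** 2)).sum(),
        "correlation": (((i - mu_i) * (j - mu_j) * P).sum() / (sd_i * sd_j + 1e-12)),
    }

P = np.random.default_rng(0).random((8, 8))                   # toy co-occurrence counts
print(cooccurrence_measures(P))
```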

  18. Using Carbon Emissions Data to "Heat Up" Descriptive Statistics

    ERIC Educational Resources Information Center

    Brooks, Robert

    2012-01-01

    This article illustrates using carbon emissions data in an introductory statistics assignment. The carbon emissions data have desirable characteristics, including choice of measure, skewness, and outliers. These complexities allow research and public policy debate to be introduced. (Contains 4 figures and 2 tables.)

  19. Analysis of spontaneous MEG activity in mild cognitive impairment and Alzheimer's disease using spectral entropies and statistical complexity measures

    NASA Astrophysics Data System (ADS)

    Bruña, Ricardo; Poza, Jesús; Gómez, Carlos; García, María; Fernández, Alberto; Hornero, Roberto

    2012-06-01

    Alzheimer's disease (AD) is the most common cause of dementia. Over the last few years, a considerable effort has been devoted to exploring new biomarkers. Nevertheless, a better understanding of brain dynamics is still required to optimize therapeutic strategies. In this regard, the characterization of mild cognitive impairment (MCI) is crucial, due to the high conversion rate from MCI to AD. However, only a few studies have focused on the analysis of magnetoencephalographic (MEG) rhythms to characterize AD and MCI. In this study, we assess the ability of several parameters derived from information theory to describe spontaneous MEG activity from 36 AD patients, 18 MCI subjects and 26 controls. Three entropies (Shannon, Tsallis and Rényi entropies), one disequilibrium measure (based on the Euclidean distance, ED) and three statistical complexities (based on the López-Ruiz-Mancini-Calbet complexity, LMC) were used to estimate the irregularity and statistical complexity of MEG activity. Statistically significant differences between AD patients and controls were obtained with all parameters (p < 0.01). In addition, statistically significant differences between MCI subjects and controls were achieved by ED and LMC (p < 0.05). In order to assess the diagnostic ability of the parameters, a linear discriminant analysis with a leave-one-out cross-validation procedure was applied. The accuracies reached 83.9% and 65.9% to discriminate AD and MCI subjects from controls, respectively. Our findings suggest that MCI subjects exhibit an intermediate pattern of abnormalities between normal aging and AD. Furthermore, the proposed parameters provide a new description of brain dynamics in AD and MCI.
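
    The entropy family used above can be written compactly for a normalized distribution p (for example, a normalized power spectrum of one MEG channel). The entropic index q = 2 and the toy input below are assumptions for illustration; the paper's preprocessing of MEG spectra is not reproduced.

```python
# Hedged sketch of the Shannon, Tsallis and Rényi entropies plus the Euclidean
# disequilibrium ED, evaluated on a normalized distribution p.
import numpy as np

def shannon(p):
    nz = p[p > 0]
    return -np.sum(nz * np.log(nz))

def tsallis(p, q=2.0):
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def renyi(p, q=2.0):
    return np.log(np.sum(p ** q)) / (1.0 - q)

def euclidean_disequilibrium(p):
    return np.sum((p - 1.0 / p.size) ** 2)

p = np.abs(np.random.default_rng(0).normal(size=64))   # stand-in spectral magnitudes
p /= p.sum()
print(shannon(p), tsallis(p), renyi(p), euclidean_disequilibrium(p))
# An LMC-style statistical complexity then follows as normalized entropy times disequilibrium.
```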

  20. Evaluating Abstract Art: Relation between Term Usage, Subjective Ratings, Image Properties and Personality Traits.

    PubMed

    Lyssenko, Nathalie; Redies, Christoph; Hayn-Leichsenring, Gregor U

    2016-01-01

    One of the major challenges in experimental aesthetics is the uncertainty of the terminology used in experiments. In this study, we recorded terms that are spontaneously used by participants to describe abstract artworks and studied their relation to the second-order statistical image properties of the same artworks (Experiment 1). We found that the usage frequency of some structure-describing terms correlates with statistical image properties, such as PHOG Self-Similarity, Anisotropy and Complexity. Additionally, emotion-associated terms correlate with measured color values. Next, based on the most frequently used terms, we created five different rating scales (Experiment 2) and obtained ratings of participants for the abstract paintings on these scales. We found significant correlations between descriptive score ratings (e.g., between structure and subjective complexity), between evaluative and descriptive score ratings (e.g., between preference and subjective complexity/interest) and between descriptive score ratings and statistical image properties (e.g., between interest and PHOG Self-Similarity, Complexity and Anisotropy). Additionally, we determined the participants' personality traits as described in the 'Big Five Inventory' (Goldberg, 1990; Rammstedt and John, 2005) and correlated them with the ratings and preferences of individual participants. Participants with higher scores for Neuroticism showed preferences for objectively more complex images, as well as a different notion of the term complex when compared with participants with lower scores for Neuroticism. In conclusion, this study demonstrates an association between objectively measured image properties and the subjective terms that participants use to describe or evaluate abstract artworks. Moreover, our results suggest that the description of abstract artworks, their evaluation and the preference of participants for their low-level statistical properties are linked to personality traits.

  1. Evaluating Abstract Art: Relation between Term Usage, Subjective Ratings, Image Properties and Personality Traits

    PubMed Central

    Lyssenko, Nathalie; Redies, Christoph; Hayn-Leichsenring, Gregor U.

    2016-01-01

    One of the major challenges in experimental aesthetics is the uncertainty of the terminology used in experiments. In this study, we recorded terms that are spontaneously used by participants to describe abstract artworks and studied their relation to the second-order statistical image properties of the same artworks (Experiment 1). We found that the usage frequency of some structure-describing terms correlates with statistical image properties, such as PHOG Self-Similarity, Anisotropy and Complexity. Additionally, emotion-associated terms correlate with measured color values. Next, based on the most frequently used terms, we created five different rating scales (Experiment 2) and obtained ratings of participants for the abstract paintings on these scales. We found significant correlations between descriptive score ratings (e.g., between structure and subjective complexity), between evaluative and descriptive score ratings (e.g., between preference and subjective complexity/interest) and between descriptive score ratings and statistical image properties (e.g., between interest and PHOG Self-Similarity, Complexity and Anisotropy). Additionally, we determined the participants’ personality traits as described in the ‘Big Five Inventory’ (Goldberg, 1990; Rammstedt and John, 2005) and correlated them with the ratings and preferences of individual participants. Participants with higher scores for Neuroticism showed preferences for objectively more complex images, as well as a different notion of the term complex when compared with participants with lower scores for Neuroticism. In conclusion, this study demonstrates an association between objectively measured image properties and the subjective terms that participants use to describe or evaluate abstract artworks. Moreover, our results suggest that the description of abstract artworks, their evaluation and the preference of participants for their low-level statistical properties are linked to personality traits. PMID:27445933

  2. Mapping and discrimination of networks in the complexity-entropy plane

    NASA Astrophysics Data System (ADS)

    Wiedermann, Marc; Donges, Jonathan F.; Kurths, Jürgen; Donner, Reik V.

    2017-10-01

    Complex networks are usually characterized in terms of their topological, spatial, or information-theoretic properties and combinations of the associated metrics are used to discriminate networks into different classes or categories. However, even with the present variety of characteristics at hand it still remains a subject of current research to appropriately quantify a network's complexity and correspondingly discriminate between different types of complex networks, like infrastructure or social networks, on such a basis. Here we explore the possibility to classify complex networks by means of a statistical complexity measure that has formerly been successfully applied to distinguish different types of chaotic and stochastic time series. It is composed of a network's averaged per-node entropic measure characterizing the network's information content and the associated Jensen-Shannon divergence as a measure of disequilibrium. We study 29 real-world networks and show that networks of the same category tend to cluster in distinct areas of the resulting complexity-entropy plane. We demonstrate that within our framework, connectome networks exhibit among the highest complexity while, e.g., transportation and infrastructure networks display significantly lower values. Furthermore, we demonstrate the utility of our framework by applying it to families of random scale-free and Watts-Strogatz model networks. We then show in a second application that the proposed framework is useful to objectively construct threshold-based networks, such as functional climate networks or recurrence networks, by choosing the threshold such that the statistical network complexity is maximized.
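
    A rough sketch of placing a network in a complexity-entropy plane is given below: it averages a per-node entropy of normalized connection weights and multiplies it by a Jensen-Shannon disequilibrium against the uniform distribution. This only loosely follows the abstract; the exact per-node entropic measure and normalization used by the authors are not reproduced here.

```python
# Hedged, loose sketch of a network complexity-entropy point (H, C).
import numpy as np

def js_divergence(p, q):
    m = 0.5 * (p + q)
    kl = lambda a, b: np.sum(a[a > 0] * np.log(a[a > 0] / b[a > 0]))
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

def complexity_entropy_point(A):
    """A: symmetric non-negative weight matrix with no isolated nodes."""
    n = A.shape[0]
    P = A / A.sum(axis=1, keepdims=True)          # per-node neighbor distributions
    node_entropy = np.array([-np.sum(r[r > 0] * np.log(r[r > 0])) for r in P])
    H = node_entropy.mean() / np.log(n)           # normalized average node entropy
    D = js_divergence(P.mean(axis=0), np.full(n, 1.0 / n))
    return H, H * D

A = np.random.default_rng(0).random((30, 30))     # toy weighted network
A = 0.5 * (A + A.T)
np.fill_diagonal(A, 0.0)
print(complexity_entropy_point(A))
```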

  3. An Overview of Particle Sampling Bias

    NASA Technical Reports Server (NTRS)

    Meyers, James F.; Edwards, Robert V.

    1984-01-01

    The complex relation between particle arrival statistics and the interarrival statistics is explored. It is known that the mean interarrival time given an initial velocity is generally not the inverse of the mean rate corresponding to that velocity. Necessary conditions for the measurement of the conditional rate are given.

  4. Automated information and control complex of hydro-gas endogenous mine processes

    NASA Astrophysics Data System (ADS)

    Davkaev, K. S.; Lyakhovets, M. V.; Gulevich, T. M.; Zolin, K. A.

    2017-09-01

    An automated information and control complex is considered, designed to prevent accidents related to the aerological situation in underground workings, to keep account of individual devices issued and returned, to transmit and display measurement data, and to form preemptive solutions. Examples of the automated workstation of an air-gas control operator using individual devices are given. The statistical characteristics of field data characterizing the aerological situation in the mine are obtained. The conducted studies of statistical characteristics confirm the feasibility of creating a subsystem of controlled gas distribution with an adaptive arrangement of points for gas control. An adaptive (multivariant) algorithm has been developed for processing measurement information on continuous multidimensional quantities and influencing factors.

  5. Habitat Complexity in Aquatic Microcosms Affects Processes Driven by Detritivores

    PubMed Central

    Flores, Lorea; Bailey, R. A.; Elosegi, Arturo; Larrañaga, Aitor; Reiss, Julia

    2016-01-01

    Habitat complexity can influence predation rates (e.g. by providing refuge) but other ecosystem processes and species interactions might also be modulated by the properties of habitat structure. Here, we focussed on how complexity of artificial habitat (plastic plants), in microcosms, influenced short-term processes driven by three aquatic detritivores. The effects of habitat complexity on leaf decomposition, production of fine organic matter and pH levels were explored by measuring complexity in three ways: 1. as the presence vs. absence of habitat structure; 2. as the amount of structure (3 or 4.5 g of plastic plants); and 3. as the spatial configuration of structures (measured as fractal dimension). The experiment also addressed potential interactions among the consumers by running all possible species combinations. In the experimental microcosms, habitat complexity influenced how species performed, especially when comparing structure present vs. structure absent. Treatments with structure showed higher fine particulate matter production and lower pH compared to treatments without structures and this was probably due to higher digestion and respiration when structures were present. When we explored the effects of the different complexity levels, we found that the amount of structure added explained more than the fractal dimension of the structures. We give a detailed overview of the experimental design, statistical models and R codes, because our statistical analysis can be applied to other study systems (and disciplines such as restoration ecology). We further make suggestions of how to optimise statistical power when artificially assembling, and analysing, ‘habitat complexity’ by not confounding complexity with the amount of structure added. In summary, this study highlights the importance of habitat complexity for energy flow and the maintenance of ecosystem processes in aquatic ecosystems. PMID:27802267

  6. Magnetic resonance imaging features of complex Chiari malformation variant of Chiari 1 malformation.

    PubMed

    Moore, Hannah E; Moore, Kevin R

    2014-11-01

    Complex Chiari malformation is a subgroup of Chiari 1 malformation with distinct imaging features. Children with complex Chiari malformation are reported to have a more severe clinical phenotype and sometimes require more extensive surgical treatment than those with uncomplicated Chiari 1 malformation. We describe reported MR imaging features of complex Chiari malformation and evaluate the utility of craniometric parameters and qualitative anatomical observations for distinguishing complex Chiari malformation from uncomplicated Chiari 1 malformation. We conducted a retrospective search of the institutional imaging database using the keywords "Chiari" and "Chiari 1" to identify children imaged during the 2006-2011 time period. Children with Chiari 2 malformation were excluded after imaging review. We used the first available diagnostic brain or cervical spine MR study for data measurement. Standard measurements and observations were made of obex level (mm), cerebellar tonsillar descent (mm), perpendicular distance to basion-C2 line (pB-C2, mm), craniocervical angle (degrees), clivus length, and presence or absence of syringohydromyelia, basilar invagination and congenital craniovertebral junction osseous anomalies. After imaging review, we accessed the institutional health care clinical database to determine whether each subject clinically met criteria for Chiari 1 malformation or complex Chiari malformation. Obex level and craniocervical angle measurements showed statistically significant differences between the populations with complex Chiari malformation and uncomplicated Chiari 1 malformation. Cerebellar tonsillar descent and perpendicular distance to basion-C2 line measurements trended toward but did not meet statistical significance. Odontoid retroflexion, craniovertebral junction osseous anomalies, and syringohydromyelia were all observed proportionally more often in children with complex Chiari malformation than in those with Chiari 1 malformation. Characteristic imaging features of complex Chiari malformation, especially obex level, permit its distinction from the more common uncomplicated Chiari 1 malformation.

  7. Statistical process control: A feasibility study of the application of time-series measurement in early neurorehabilitation after acquired brain injury.

    PubMed

    Markovic, Gabriela; Schult, Marie-Louise; Bartfai, Aniko; Elg, Mattias

    2017-01-31

    Progress in early cognitive recovery after acquired brain injury is uneven and unpredictable, and thus the evaluation of rehabilitation is complex. The use of time-series measurements is susceptible to statistical change due to process variation. To evaluate the feasibility of using a time-series method, statistical process control, in early cognitive rehabilitation. Participants were 27 patients with acquired brain injury undergoing interdisciplinary rehabilitation of attention within 4 months post-injury. The outcome measure, the Paced Auditory Serial Addition Test, was analysed using statistical process control. Statistical process control identifies if and when change occurs in the process according to 3 patterns: rapid, steady or stationary performers. The statistical process control method was adjusted, in terms of constructing the baseline and the total number of measurement points, in order to measure a process in change. Statistical process control methodology is feasible for use in early cognitive rehabilitation, since it provides information about change in a process, thus enabling adjustment of the individual treatment response. Together with the results indicating discernible subgroups that respond differently to rehabilitation, statistical process control could be a valid tool in clinical decision-making. This study is a starting-point in understanding the rehabilitation process using a real-time-measurements approach.
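
    A minimal sketch of the kind of chart used in statistical process control is shown below: an individuals (X) chart whose centre line and 3-sigma limits come from a baseline segment, with later points outside the limits flagged as signals of change. The baseline length, the moving-range sigma estimate, and the example scores are illustrative assumptions, not the study's protocol.

```python
# Hedged sketch of an individuals control chart on a repeated test score series.
import numpy as np

def individuals_chart(scores, baseline=6):
    scores = np.asarray(scores, dtype=float)
    base = scores[:baseline]
    centre = base.mean()
    sigma = np.abs(np.diff(base)).mean() / 1.128      # moving-range estimate (d2 for n = 2)
    ucl, lcl = centre + 3 * sigma, centre - 3 * sigma
    signals = np.where((scores > ucl) | (scores < lcl))[0]   # indices outside the limits
    return centre, lcl, ucl, signals

scores = [31, 33, 30, 34, 32, 33, 36, 41, 44, 47, 49, 52]    # hypothetical time series
print(individuals_chart(scores))
```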

  8. SU-E-J-261: Statistical Analysis and Chaotic Dynamics of Respiratory Signal of Patients in BodyFix

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michalski, D; Huq, M; Bednarz, G

    Purpose: To quantify the respiratory signal of patients in BodyFix undergoing 4DCT scans with and without the immobilization cover. Methods: 20 pairs of respiratory tracks recorded with the RPM system during 4DCT scans were analyzed. Descriptive statistics were applied to selected parameters of the exhale-inhale decomposition. Standardized signals were used with the delay method to build orbits in embedded space. Nonlinear behavior was tested with surrogate data. Sample entropy (SE), Lempel-Ziv complexity (LZC) and the largest Lyapunov exponents (LLE) were compared. Results: Statistical tests show a difference between scans for inspiration time and its variability, which is bigger for scans without the cover. The same is true for the variability of the end of exhalation and inhalation. Other parameters fail to show the difference. For both scans the respiratory signals show determinism and nonlinear stationarity. Statistical tests on surrogate data reveal their nonlinearity. LLEs show the signals' chaotic nature and its correlation with the breathing period and its embedding delay time. SE, LZC and LLE measure respiratory signal complexity. Nonlinear characteristics do not differ between scans. Conclusion: Contrary to expectation, the cover applied to patients in BodyFix appears to have a limited effect on signal parameters. Analysis based on trajectories of delay vectors shows the respiratory system's nonlinear character and its sensitive dependence on initial conditions. Reproducibility of the respiratory signal can be evaluated with measures of signal complexity and its predictability window. A longer respiratory period is conducive to signal reproducibility as shown by these gauges. Statistical independence of the exhale and inhale times is also supported by the magnitude of LLE. The nonlinear parameters seem more appropriate for gauging respiratory signal complexity, given its deterministic chaotic nature. This contrasts with measures based on harmonic analysis, which are blind to nonlinear features. The dynamics of breathing, so crucial for 4D-based clinical technologies, can be better controlled if a nonlinear-based methodology, which reflects the respiration characteristics, is applied. Funding provided by Varian Medical Systems via an Investigator Initiated Research Project.

  9. Observation of non-classical correlations in sequential measurements of photon polarization

    NASA Astrophysics Data System (ADS)

    Suzuki, Yutaro; Iinuma, Masataka; Hofmann, Holger F.

    2016-10-01

    A sequential measurement of two non-commuting quantum observables results in a joint probability distribution for all output combinations that can be explained in terms of an initial joint quasi-probability of the non-commuting observables, modified by the resolution errors and back-action of the initial measurement. Here, we show that the error statistics of a sequential measurement of photon polarization performed at different measurement strengths can be described consistently by an imaginary correlation between the statistics of resolution and back-action. The experimental setup was designed to realize variable strength measurements with well-controlled imaginary correlation between the statistical errors caused by the initial measurement of diagonal polarizations, followed by a precise measurement of the horizontal/vertical polarization. We perform the experimental characterization of an elliptically polarized input state and show that the same complex joint probability distribution is obtained at any measurement strength.

  10. Monitoring the soil degradation by Metastatistical Analysis

    NASA Astrophysics Data System (ADS)

    Oleschko, K.; Gaona, C.; Tarquis, A.

    2009-04-01

    The effectiveness of the fractal toolbox to capture the critical behavior of soil structural patterns during chemical and physical degradation was documented by our numerous experiments (Oleschko et al., 2008a; 2008b). The spatio-temporal dynamics of these patterns was measured and mapped with high precision in terms of fractal descriptors. All tested fractal techniques were able to detect the statistically significant differences in structure between the perfect spongy and massive patterns of uncultivated and sodium-saline agricultural soils, respectively. For instance, the Hurst exponent, extracted from the Chernozem micromorphological images and from the time series of its physical and mechanical properties measured in situ, detected the roughness decrease (and therefore the increase in H from 0.17 to 0.30 for images) derived from the loss of original structure complexity. The combined use of different fractal descriptors brings statistical precision into the quantification of natural system degradation and provides a means for objective soil structure comparison (Oleschko et al., 2000). The ability of fractal parameters to capture critical behavior and phase transitions was documented for contrasting situations, ranging from deforestation and erosion of Andosols to high fracturing and consolidation of Vertisols. The Hurst exponent is used to measure the type of persistence and degree of complexity of structure dynamics. We conclude that there is an urgent need to select and adopt a standardized toolbox for fractal analysis and complexity measures in Earth Sciences. We propose to use second-order (meta-) statistics as subtle measures of complexity (Atmanspacher et al., 1997). A high degree of correlation was documented between the fractal and high-order statistical descriptors (the four central moments of a stochastic variable's distribution) used for the analysis of system heterogeneity and variability. We propose to call this combined fractal/statistical toolbox Metastatistical Analysis and recommend it for projects directed at soil degradation monitoring. References: 1. Oleschko, K., B.S. Figueroa, M.E. Miranda, M.A. Vuelvas and E.R. Solleiro, Soil & Till. Res. 55, 43 (2000). 2. Oleschko, K., Korvin, G., Figueroa, S.B., Vuelvas, M.A., Balankin, A., Flores, L., Carreño, D. Fractal radar scattering from soil. Physical Review E 67, 041403 (2003). 3. Zamora-Castro, S., Oleschko, K., Flores, L., Ventura, E. Jr., Parrot, J.-F., 2008. Fractal mapping of pore and solid attributes. Vadose Zone Journal 7(2): 473-492. 4. Oleschko, K., Korvin, G., Muñoz, A., Velásquez, J., Miranda, M.E., Carreon, D., Flores, L., Martínez, M., Velásquez-Valle, M., Brambilla, F., Parrot, J.-F., Ronquillo, G., 2008. Fractal mapping of soil moisture content from remotely sensed multi-scale data. Nonlinear Processes in Geophysics 15: 711-725. 5. Atmanspacher, H., Räth, Ch., Wiedenmann, G., 1997. Statistics and meta-statistics in the concept of complexity. Physica A 234: 819-829.
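
    Of the fractal descriptors mentioned, the Hurst exponent is the most commonly coded; one standard estimator is rescaled-range (R/S) analysis, sketched below on a synthetic series. This is a generic textbook estimator, not the toolbox used in the abstract.

```python
# Hedged sketch: Hurst exponent estimate via classical rescaled-range analysis.
import numpy as np

def hurst_rs(x, min_chunk=8):
    x = np.asarray(x, dtype=float)
    n = len(x)
    sizes, rs = [], []
    size = min_chunk
    while size <= n // 2:
        ratios = []
        for start in range(0, n - size + 1, size):
            seg = x[start:start + size]
            dev = np.cumsum(seg - seg.mean())
            r, s = dev.max() - dev.min(), seg.std()
            if s > 0:
                ratios.append(r / s)
        sizes.append(size)
        rs.append(np.mean(ratios))
        size *= 2
    slope, _ = np.polyfit(np.log(sizes), np.log(rs), 1)   # log R/S vs log window size
    return slope

print(hurst_rs(np.random.default_rng(0).normal(size=4096)))   # near 0.5 for white noise
```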

  11. Sample Skewness as a Statistical Measurement of Neuronal Tuning Sharpness

    PubMed Central

    Samonds, Jason M.; Potetz, Brian R.; Lee, Tai Sing

    2014-01-01

    We propose using the statistical measurement of the sample skewness of the distribution of mean firing rates of a tuning curve to quantify sharpness of tuning. For some features, like binocular disparity, tuning curves are best described by relatively complex and sometimes diverse functions, making it difficult to quantify sharpness with a single function and parameter. Skewness provides a robust nonparametric measure of tuning curve sharpness that is invariant with respect to the mean and variance of the tuning curve and is straightforward to apply to a wide range of tuning, including simple orientation tuning curves and complex object tuning curves that often cannot even be described parametrically. Because skewness does not depend on a specific model or function of tuning, it is especially appealing to cases of sharpening where recurrent interactions among neurons produce sharper tuning curves that deviate in a complex manner from the feedforward function of tuning. Since tuning curves for all neurons are not typically well described by a single parametric function, this model independence additionally allows skewness to be applied to all recorded neurons, maximizing the statistical power of a set of data. We also compare skewness with other nonparametric measures of tuning curve sharpness and selectivity. Compared to these other nonparametric measures tested, skewness is best used for capturing the sharpness of multimodal tuning curves defined by narrow peaks (maxima) and broad valleys (minima). Finally, we provide a more formal definition of sharpness using a shape-based information gain measure and derive and show that skewness is correlated with this definition. PMID:24555451
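
    The measure itself reduces to the ordinary sample skewness of the vector of mean firing rates across stimulus conditions; the toy tuning curves below (a shallow cosine versus a narrow Gaussian bump) are assumptions used only to show the expected direction of the effect.

```python
# Hedged illustration: sample skewness of mean firing rates as a sharpness index.
import numpy as np
from scipy.stats import skew

theta = np.linspace(0, 2 * np.pi, 36, endpoint=False)
broad_tuning = 10 + 5 * np.cos(theta)                             # shallow modulation
sharp_tuning = 2 + 20 * np.exp(-((theta - np.pi) ** 2) / 0.1)     # narrow peak, broad valley

print("broad tuning skewness:", skew(broad_tuning))   # near zero
print("sharp tuning skewness:", skew(sharp_tuning))   # strongly positive
```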

  12. Statistical Physics in the Era of Big Data

    ERIC Educational Resources Information Center

    Wang, Dashun

    2013-01-01

    With the wealth of data provided by a wide range of high-throughput measurement tools and technologies, statistical physics of complex systems is entering a new phase, impacting in a meaningful fashion a wide range of fields, from cell biology to computer science to economics. In this dissertation, by applying tools and techniques developed in…

  13. Shearlet-based measures of entropy and complexity for two-dimensional patterns

    NASA Astrophysics Data System (ADS)

    Brazhe, Alexey

    2018-06-01

    New spatial entropy and complexity measures for two-dimensional patterns are proposed. The approach is based on the notion of disequilibrium and is built on statistics of directional multiscale coefficients of the fast finite shearlet transform. Shannon entropy and Jensen-Shannon divergence measures are employed. Both local and global spatial complexity and entropy estimates can be obtained, thus allowing for spatial mapping of complexity in inhomogeneous patterns. The algorithm is validated in numerical experiments with a gradually decaying periodic pattern and Ising surfaces near critical state. It is concluded that the proposed algorithm can be instrumental in describing a wide range of two-dimensional imaging data, textures, or surfaces, where an understanding of the level of order or randomness is desired.

  14. Statistical and sampling issues when using multiple particle tracking

    NASA Astrophysics Data System (ADS)

    Savin, Thierry; Doyle, Patrick S.

    2007-08-01

    Video microscopy can be used to simultaneously track several microparticles embedded in a complex material. The trajectories are used to extract a sample of displacements at random locations in the material. From this sample, averaged quantities characterizing the dynamics of the probes are calculated to evaluate structural and/or mechanical properties of the assessed material. However, the sampling of measured displacements in heterogeneous systems is singular because the volume of observation with video microscopy is finite. By carefully characterizing the sampling design in the experimental output of the multiple particle tracking technique, we derive estimators for the mean and variance of the probes’ dynamics that are independent of these peculiar statistical characteristics. We present stringent tests of these estimators using simulated and experimental complex systems with a known heterogeneous structure. Up to a certain fundamental limitation, which we characterize through a material degree of sampling by the embedded probe tracking, these estimators can be applied to quantify the heterogeneity of a material, providing an original and intelligible kind of information on complex fluid properties. More generally, we show that the precise assessment of the statistics in the multiple particle tracking output sample of observations is essential in order to provide accurate unbiased measurements.

  15. Quantifying the statistical complexity of low-frequency fluctuations in semiconductor lasers with optical feedback

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tiana-Alsina, J.; Torrent, M. C.; Masoller, C.

    Low-frequency fluctuations (LFFs) represent a dynamical instability that occurs in semiconductor lasers when they are operated near the lasing threshold and subject to moderate optical feedback. LFFs consist of sudden power dropouts followed by gradual, stepwise recoveries. We analyze experimental time series of intensity dropouts and quantify the complexity of the underlying dynamics employing two tools from information theory, namely, Shannon's entropy and the Martin, Plastino, and Rosso statistical complexity measure. These measures are computed using a method based on ordinal patterns, by which the relative length and ordering of consecutive interdropout intervals (i.e., the time intervals between consecutive intensity dropouts) are analyzed, disregarding the precise timing of the dropouts and the absolute durations of the interdropout intervals. We show that this methodology is suitable for quantifying subtle characteristics of the LFFs, and in particular the transition to fully developed chaos that takes place when the laser's pump current is increased. Our method shows that the statistical complexity of the laser does not increase continuously with the pump current, but levels off before reaching the coherence collapse regime. This behavior coincides with that of the first- and second-order correlations of the interdropout intervals, suggesting that these correlations, and not the chaotic behavior, are what determine the level of complexity of the laser's dynamics. These results hold for two different dynamical regimes, namely, sustained LFFs and coexistence between LFFs and steady-state emission.
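
    The ordinal-pattern construction and the Jensen-Shannon-based complexity can be sketched generically as below; the embedding dimension d = 4 and the exponential stand-in for interdropout intervals are assumptions for illustration, not the experimental data or the authors' exact normalization.

```python
# Hedged sketch: Bandt-Pompe ordinal-pattern probabilities, permutation entropy
# and an MPR-style Jensen-Shannon statistical complexity.
import numpy as np
from itertools import permutations

def ordinal_probabilities(x, d=4):
    counts = {p: 0 for p in permutations(range(d))}
    for i in range(len(x) - d + 1):
        counts[tuple(np.argsort(x[i:i + d]))] += 1
    p = np.array(list(counts.values()), dtype=float)
    return p / p.sum()

def mpr_complexity(p):
    n = p.size
    u = np.full(n, 1.0 / n)
    H = lambda q: -np.sum(q[q > 0] * np.log(q[q > 0]))
    h = H(p) / np.log(n)                                   # normalized permutation entropy
    js = H(0.5 * (p + u)) - 0.5 * H(p) - 0.5 * H(u)        # Jensen-Shannon divergence to uniform
    js_max = -0.5 * ((n + 1) / n * np.log(n + 1) + np.log(n) - 2 * np.log(2 * n))
    return h, h * js / js_max

intervals = np.random.default_rng(0).exponential(size=5000)   # stand-in interdropout intervals
print(mpr_complexity(ordinal_probabilities(intervals, d=4)))
```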

  16. Quantitative Measures for Software Independent Verification and Validation

    NASA Technical Reports Server (NTRS)

    Lee, Alice

    1996-01-01

    As software is maintained or reused, it undergoes an evolution which tends to increase the overall complexity of the code. To understand the effects of this, we brought in statistics experts and leading researchers in software complexity, reliability, and their interrelationships. These experts' project has resulted in our ability to statistically correlate specific code complexity attributes, in orthogonal domains, to errors found over time in the HAL/S flight software which flies in the Space Shuttle. Although only a prototype-tools experiment, the result of this research appears to be extendable to all other NASA software, given appropriate data similar to that logged for the Shuttle onboard software. Our research has demonstrated that a more complete domain coverage can be mathematically demonstrated with the approach we have applied, thereby ensuring full insight into the cause-and-effects relationship between the complexity of a software system and the fault density of that system. By applying the operational profile we can characterize the dynamic effects of software path complexity under this same approach. We now have the ability to measure specific attributes which have been statistically demonstrated to correlate to increased error probability, and to know which actions to take, for each complexity domain. Shuttle software verifiers can now monitor the changes in the software complexity, assess the added or decreased risk of software faults in modified code, and determine necessary corrections. The reports, tool documentation, user's guides, and new approach that have resulted from this research effort represent advances in the state of the art of software quality and reliability assurance. Details describing how to apply this technique to other NASA code are contained in this document.

  17. A multi-factor Rasch scale for artistic judgment.

    PubMed

    Bezruczko, Nikolaus

    2002-01-01

    Measurement properties are reported for a combined scale of abstract and figurative artistic judgment aptitude items. Abstract items are synthetic, rule-based images from the Visual Designs Test (VDT), which implements a statistical algorithm to control design complexity and redundancy, and figurative items are canvas paintings in five styles (Fauvism, Post-Impressionism, Surrealism, Renaissance, and Baroque) created especially for this research. The paintings integrate syntactic structure from the VDT Abstract designs with thematic content for each style at four levels of complexity while controlling redundancy. Trained test administrators collected preferences for the synthetic abstract designs and the authentic figurative art from 462 examinees in Johnson O'Connor Research Foundation testing offices in Boston, New York, Chicago, and Dallas. The Rasch model replicated measurement properties for the VDT Abstract items and identified an item hierarchy that was statistically invariant between genders and generally stable across age for the new, authentic figurative items. Further examination of the figurative item hierarchy revealed that complexity interacts with style and meaning. The sound measurement properties of a combined VDT Abstract and Figurative scale show promise for a comprehensive artistic judgment construct.

  18. Control entropy identifies differential changes in complexity of walking and running gait patterns with increasing speed in highly trained runners

    NASA Astrophysics Data System (ADS)

    McGregor, Stephen J.; Busa, Michael A.; Skufca, Joseph; Yaggie, James A.; Bollt, Erik M.

    2009-06-01

    Regularity statistics have been previously applied to walking gait measures in the hope of gaining insight into the complexity of gait under different conditions and in different populations. Traditional regularity statistics are subject to the requirement of stationarity, a limitation for examining changes in complexity under dynamic conditions such as exhaustive exercise. Using a novel measure, control entropy (CE), applied to triaxial continuous accelerometry, we report changes in the complexity of walking and running at increasing speeds up to exhaustion in highly trained runners. We further apply Karhunen-Loève analysis in a novel way to the patterns of CE responses in each of the three axes to identify dominant modes of CE responses in the vertical, mediolateral, and anterior/posterior planes. The differential CE responses observed between the different axes in this select population provide insight into the constraints of walking and running in those who may have optimized locomotion. Future comparisons between athletes, healthy untrained, and clinical populations using this approach may help elucidate differences between optimized and diseased locomotor control.

  19. Robust Strategy for Rocket Engine Health Monitoring

    NASA Technical Reports Server (NTRS)

    Santi, L. Michael

    2001-01-01

    Monitoring the health of rocket engine systems is essentially a two-phase process. The acquisition phase involves sensing physical conditions at selected locations, converting physical inputs to electrical signals, conditioning the signals as appropriate to establish scale or filter interference, and recording results in a form that is easy to interpret. The inference phase involves analysis of results from the acquisition phase, comparison of analysis results to established health measures, and assessment of health indications. A variety of analytical tools may be employed in the inference phase of health monitoring. These tools can be separated into three broad categories: statistical, rule based, and model based. Statistical methods can provide excellent comparative measures of engine operating health. They require well-characterized data from an ensemble of "typical" engines, or "golden" data from a specific test assumed to define the operating norm in order to establish reliable comparative measures. Statistical methods are generally suitable for real-time health monitoring because they do not deal with the physical complexities of engine operation. The utility of statistical methods in rocket engine health monitoring is hindered by practical limits on the quantity and quality of available data. This is due to the difficulty and high cost of data acquisition, the limited number of available test engines, and the problem of simulating flight conditions in ground test facilities. In addition, statistical methods incur a penalty for disregarding flow complexity and are therefore limited in their ability to define performance shift causality. Rule based methods infer the health state of the engine system based on comparison of individual measurements or combinations of measurements with defined health norms or rules. This does not mean that rule based methods are necessarily simple. Although binary yes-no health assessment can sometimes be established by relatively simple rules, the causality assignment needed for refined health monitoring often requires an exceptionally complex rule base involving complicated logical maps. Structuring the rule system to be clear and unambiguous can be difficult, and the expert input required to maintain a large logic network and associated rule base can be prohibitive.

  20. Genetic programming approach to evaluate complexity of texture images

    NASA Astrophysics Data System (ADS)

    Ciocca, Gianluigi; Corchs, Silvia; Gasparini, Francesca

    2016-11-01

    We adopt genetic programming (GP) to define a measure that can predict complexity perception of texture images. We perform psychophysical experiments on three different datasets to collect data on the perceived complexity. The subjective data are used for training, validation, and testing of the proposed measure. These data are also used to evaluate several possible candidate measures of texture complexity related to both low-level and high-level image features. We select four of them (namely roughness, number of regions, chroma variance, and memorability) to be combined in a GP framework. This approach allows a nonlinear combination of the measures and could give hints on how the related image features interact in complexity perception. The proposed complexity measure M exhibits Pearson correlation coefficients of 0.890 on the training set, 0.728 on the validation set, and 0.724 on the test set. M outperforms each of the single measures considered. From the statistical analysis of different GP candidate solutions, we found that the roughness measure evaluated on the gray level image is the most dominant one, followed by the memorability, the number of regions, and finally the chroma variance.

  1. Complexity quantification of dense array EEG using sample entropy analysis.

    PubMed

    Ramanand, Pravitha; Nampoori, V P N; Sreenivasan, R

    2004-09-01

    In this paper, a time series complexity analysis of dense array electroencephalogram signals is carried out using the recently introduced Sample Entropy (SampEn) measure. This statistic quantifies the regularity in signals recorded from systems that can vary from the purely deterministic to purely stochastic realm. The present analysis is conducted with an objective of gaining insight into complexity variations related to changing brain dynamics for EEG recorded from the three cases of passive, eyes closed condition, a mental arithmetic task and the same mental task carried out after a physical exertion task. It is observed that the statistic is a robust quantifier of complexity suited for short physiological signals such as the EEG and it points to the specific brain regions that exhibit lowered complexity during the mental task state as compared to a passive, relaxed state. In the case of mental tasks carried out before and after the performance of a physical exercise, the statistic can detect the variations brought in by the intermediate fatigue inducing exercise period. This enhances its utility in detecting subtle changes in the brain state that can find wider scope for applications in EEG based brain studies.
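
    Sample Entropy is defined as SampEn(m, r) = -ln(A/B), where B is the number of pairs of length-m templates that match within a tolerance r (usually a fraction of the signal's standard deviation) and A is the corresponding count for length m+1, excluding self-matches. A minimal, unoptimized reference implementation is sketched below on synthetic signals; the parameter values are illustrative only.

    ```python
    import numpy as np

    def sample_entropy(x, m=2, r=0.2):
        """SampEn = -ln(A/B): B counts matching template pairs of length m, A of
        length m+1, within tolerance r*std(x); self-matches excluded. O(N^2) sketch."""
        x = np.asarray(x, dtype=float)
        tol = r * x.std()

        def count_matches(length):
            templates = np.array([x[i:i + length] for i in range(len(x) - length)])
            count = 0
            for i in range(len(templates)):
                dist = np.max(np.abs(templates[i + 1:] - templates[i]), axis=1)
                count += np.sum(dist <= tol)
            return count

        b = count_matches(m)
        a = count_matches(m + 1)
        return -np.log(a / b) if a > 0 and b > 0 else np.inf

    rng = np.random.default_rng(1)
    print(sample_entropy(rng.normal(size=1000)))              # irregular: higher SampEn
    print(sample_entropy(np.sin(np.linspace(0, 50, 1000))))   # regular: lower SampEn
    ```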

  2. Complex degree of mutual anisotropy in diagnostics of biological tissues physiological changes

    NASA Astrophysics Data System (ADS)

    Ushenko, Yu. A.; Dubolazov, O. V.; Karachevtcev, A. O.; Zabolotna, N. I.

    2011-05-01

    To characterize the degree of consistency among the parameters of the optically uniaxial, birefringent protein networks of blood plasma, a new parameter, the complex degree of mutual anisotropy, is suggested. The technique of polarization measurement of the coordinate distributions of the complex degree of mutual anisotropy of blood plasma is developed. It is shown that a statistical approach to the analysis of the complex degree of mutual anisotropy distributions of blood plasma is effective in the diagnosis and differentiation of acute inflammation (acute and gangrenous appendicitis).

  3. Complex degree of mutual anisotropy in diagnostics of biological tissues physiological changes

    NASA Astrophysics Data System (ADS)

    Ushenko, Yu. A.; Dubolazov, A. V.; Karachevtcev, A. O.; Zabolotna, N. I.

    2011-09-01

    To characterize the degree of consistency among the parameters of the optically uniaxial, birefringent protein networks of blood plasma, a new parameter, the complex degree of mutual anisotropy, is suggested. The technique of polarization measurement of the coordinate distributions of the complex degree of mutual anisotropy of blood plasma is developed. It is shown that a statistical approach to the analysis of the complex degree of mutual anisotropy distributions of blood plasma is effective in the diagnosis and differentiation of acute inflammation (acute and gangrenous appendicitis).

  4. SPICE: exploration and analysis of post-cytometric complex multivariate datasets.

    PubMed

    Roederer, Mario; Nozzi, Joshua L; Nason, Martha C

    2011-02-01

    Polychromatic flow cytometry results in complex, multivariate datasets. To date, tools for the aggregate analysis of these datasets across multiple specimens grouped by different categorical variables, such as demographic information, have not been optimized. Often, the exploration of such datasets is accomplished by visualization of patterns with pie charts or bar charts, without easy access to statistical comparisons of measurements that comprise multiple components. Here we report on algorithms and a graphical interface we developed for these purposes. In particular, we discuss thresholding necessary for accurate representation of data in pie charts, the implications for display and comparison of normalized versus unnormalized data, and the effects of averaging when samples with significant background noise are present. Finally, we define a statistic for the nonparametric comparison of complex distributions to test for difference between groups of samples based on multi-component measurements. While originally developed to support the analysis of T cell functional profiles, these techniques are amenable to a broad range of datatypes. Published 2011 Wiley-Liss, Inc.

  5. Multilevel Modeling for Research in Group Work

    ERIC Educational Resources Information Center

    Selig, James P.; Trott, Arianna; Lemberger, Matthew E.

    2017-01-01

    Researchers in group counseling often encounter complex data from individual clients who are members of a group. Clients in the same group may be more similar than clients from different groups and this can lead to violations of statistical assumptions. The complexity of the data also means that predictors and outcomes can be measured at both the…

  6. Technical Note: The Initial Stages of Statistical Data Analysis

    PubMed Central

    Tandy, Richard D.

    1998-01-01

    Objective: To provide an overview of several important data-related considerations in the design stage of a research project and to review the levels of measurement and their relationship to the statistical technique chosen for the data analysis. Background: When planning a study, the researcher must clearly define the research problem and narrow it down to specific, testable questions. The next steps are to identify the variables in the study, decide how to group and treat subjects, and determine how to measure, and the underlying level of measurement of, the dependent variables. Then the appropriate statistical technique can be selected for data analysis. Description: The four levels of measurement in increasing complexity are nominal, ordinal, interval, and ratio. Nominal data are categorical or “count” data, and the numbers are treated as labels. Ordinal data can be ranked in a meaningful order by magnitude. Interval data possess the characteristics of ordinal data and also have equal distances between levels. Ratio data have a natural zero point. Nominal and ordinal data are analyzed with nonparametric statistical techniques and interval and ratio data with parametric statistical techniques. Advantages: Understanding the four levels of measurement and when it is appropriate to use each is important in determining which statistical technique to use when analyzing data. PMID:16558489

  7. Combining data visualization and statistical approaches for interpreting measurements and meta-data: Integrating heatmaps, variable clustering, and mixed regression models

    EPA Science Inventory

    The advent of new higher throughput analytical instrumentation has put a strain on interpreting and explaining the results from complex studies. Contemporary human, environmental, and biomonitoring data sets are comprised of tens or hundreds of analytes, multiple repeat measures...

  8. The Complexity of Solar and Geomagnetic Indices

    NASA Astrophysics Data System (ADS)

    Pesnell, W. Dean

    2017-08-01

    How far in advance can the sunspot number be predicted with any degree of confidence? Solar cycle predictions are needed to plan long-term space missions. Fleets of satellites circle the Earth collecting science data, protecting astronauts, and relaying information. All of these satellites are sensitive at some level to solar cycle effects. Statistical and time-series analyses of the sunspot number are often used to predict solar activity. These methods have not been completely successful, as the solar dynamo changes over time and one cycle's sunspots are not a faithful predictor of the next cycle's activity. In some ways, using these techniques is similar to asking whether the stock market can be predicted. It has been shown that the Dow Jones Industrial Average (DJIA) can be more accurately predicted during periods when it obeys certain statistical properties than at other times. The Hurst exponent is one such way to partition the data. Another measure of the complexity of a time series is the fractal dimension. We can use these measures of complexity to compare the sunspot number with other solar and geomagnetic indices. Our focus is on how trends are removed by the various techniques, either internally or externally. Comparisons of the statistical properties of the various solar indices may guide us in understanding how the dynamo manifests in the various indices and the Sun.
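
    The Hurst exponent mentioned above is commonly estimated by rescaled-range (R/S) analysis: for windows of increasing size, compute the range of the cumulative mean-adjusted series divided by its standard deviation, and fit the slope of log(R/S) against log(window size). A rough sketch, with synthetic data standing in for a sunspot or geomagnetic index, is given below.

    ```python
    import numpy as np

    def hurst_rs(x, min_window=10):
        """Hurst exponent via rescaled-range (R/S) analysis: slope of the
        log-log fit of the average R/S value against window size."""
        x = np.asarray(x, dtype=float)
        n = len(x)
        sizes = np.unique(np.logspace(np.log10(min_window), np.log10(n // 2), 20).astype(int))
        log_rs, log_n = [], []
        for size in sizes:
            rs_vals = []
            for start in range(0, n - size + 1, size):
                seg = x[start:start + size]
                dev = np.cumsum(seg - seg.mean())    # cumulative deviation from the mean
                r = dev.max() - dev.min()            # range of the cumulative deviation
                s = seg.std()
                if s > 0:
                    rs_vals.append(r / s)
            if rs_vals:
                log_rs.append(np.log(np.mean(rs_vals)))
                log_n.append(np.log(size))
        slope, _ = np.polyfit(log_n, log_rs, 1)      # slope of the log-log fit = H
        return slope

    rng = np.random.default_rng(2)
    print(hurst_rs(rng.normal(size=4096)))           # white noise: H close to 0.5
    ```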

  9. PLASS: Protein-ligand affinity statistical score a knowledge-based force-field model of interaction derived from the PDB

    NASA Astrophysics Data System (ADS)

    Ozrin, V. D.; Subbotin, M. V.; Nikitin, S. M.

    2004-04-01

    We have developed PLASS (Protein-Ligand Affinity Statistical Score), a pair-wise potential of mean-force for rapid estimation of the binding affinity of a ligand molecule to a protein active site. This scoring function is derived from the frequency of occurrence of atom-type pairs in crystallographic complexes taken from the Protein Data Bank (PDB). Statistical distributions are converted into distance-dependent contributions to the Gibbs free interaction energy for 10 atomic types using the Boltzmann hypothesis, with only one adjustable parameter. For a representative set of 72 protein-ligand structures, PLASS scores correlate well with the experimentally measured dissociation constants: a correlation coefficient R of 0.82 and RMS error of 2.0 kcal/mol. Such high accuracy results from our novel treatment of the volume correction term, which takes into account the inhomogeneous properties of the protein-ligand complexes. PLASS is able to rank reliably the affinity of complexes which have as much diversity as in the PDB.
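
    The core of such knowledge-based scores is a Boltzmann inversion: the observed distance distribution for an atom-type pair, relative to a reference distribution, is converted into a distance-dependent free-energy contribution, E(r) = -kT ln[g_obs(r)/g_ref(r)]. The sketch below illustrates this inversion on a synthetic histogram; it is not the PLASS parameterization itself (in particular, the volume correction discussed above is omitted).

    ```python
    import numpy as np

    # Boltzmann inversion of a pair-distance histogram into a potential of mean force:
    #   E(r) = -kT * ln( g_obs(r) / g_ref(r) ).
    # The histograms here are synthetic placeholders, not PDB-derived statistics.
    kT = 0.593  # kcal/mol at ~298 K

    r = np.linspace(2.0, 10.0, 81)                        # distance bins (Angstrom)
    g_obs = np.exp(-0.5 * ((r - 3.5) / 0.6) ** 2) + 0.05  # hypothetical observed pair frequency
    g_ref = np.full_like(r, g_obs.mean())                 # flat reference state

    energy = -kT * np.log(g_obs / g_ref)                  # free-energy contribution per bin
    print("most favorable contact distance:", r[np.argmin(energy)])
    ```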

  10. Reversibility in Quantum Models of Stochastic Processes

    NASA Astrophysics Data System (ADS)

    Gier, David; Crutchfield, James; Mahoney, John; James, Ryan

    Natural phenomena such as time series of neural firing, orientation of layers in crystal stacking and successive measurements in spin-systems are inherently probabilistic. The provably minimal classical models of such stochastic processes are ɛ-machines, which consist of internal states, transition probabilities between states and output values. The topological properties of the ɛ-machine for a given process characterize the structure, memory and patterns of that process. However, ɛ-machines are often not ideal because their statistical complexity (Cμ) is demonstrably greater than the excess entropy (E) of the processes they represent. Quantum models (q-machines) of the same processes can do better in that their statistical complexity (Cq) obeys the relation Cμ >= Cq >= E. q-machines can be constructed to consider longer lengths of strings, resulting in greater compression. With codewords of sufficient length, the statistical complexity becomes time-symmetric, a feature apparently novel to this quantum representation. This result has ramifications for compression of classical information in quantum computing and quantum communication technology.
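
    For a classical ɛ-machine, the statistical complexity Cμ is simply the Shannon entropy of the stationary distribution over its causal states. A minimal sketch, for a hypothetical two-state machine specified by its state-to-state transition matrix, is given below; the quantum quantity Cq requires the full q-machine construction and is not reproduced here.

    ```python
    import numpy as np

    def statistical_complexity(T):
        """Shannon entropy (bits) of the stationary distribution of the causal-state
        transition matrix T (rows sum to 1): the classical statistical complexity C_mu."""
        evals, evecs = np.linalg.eig(T.T)
        pi = np.real(evecs[:, np.argmin(np.abs(evals - 1.0))])
        pi = pi / pi.sum()                     # stationary distribution over causal states
        pi = pi[pi > 0]
        return -np.sum(pi * np.log2(pi))

    # Hypothetical two-state machine (an even-process-like structure).
    T = np.array([[0.5, 0.5],
                  [1.0, 0.0]])
    print(statistical_complexity(T))           # ~0.918 bits
    ```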

  11. Potentiation Following Ballistic and Nonballistic Complexes: The Effect of Strength Level.

    PubMed

    Suchomel, Timothy J; Sato, Kimitake; DeWeese, Brad H; Ebben, William P; Stone, Michael H

    2016-07-01

    Suchomel, TJ, Sato, K, DeWeese, BH, Ebben, WP, and Stone, MH. Potentiation following ballistic and nonballistic complexes: the effect of strength level. J Strength Cond Res 30(7): 1825-1833, 2016-The purpose of this study was to compare the temporal profile of strong and weak subjects during ballistic and nonballistic potentiation complexes. Eight strong (relative back squat = 2.1 ± 0.1 times body mass) and 8 weak (relative back squat = 1.6 ± 0.2 times body mass) males performed squat jumps immediately and every minute up to 10 minutes following potentiation complexes that included ballistic or nonballistic concentric-only half-squat (COHS) performed at 90% of their 1 repetition maximum COHS. Jump height (JH) and allometrically scaled peak power (PPa) were compared using a series of 2 × 12 repeated measures analyses of variance. No statistically significant strength level main effects for JH (p = 0.442) or PPa (p = 0.078) existed during the ballistic condition. In contrast, statistically significant main effects for time existed for both JH (p = 0.014) and PPa (p < 0.001); however, no statistically significant pairwise comparisons were present (p > 0.05). Statistically significant strength level main effects existed for PPa (p = 0.039) but not for JH (p = 0.137) during the nonballistic condition. Post hoc analysis revealed that the strong subjects produced statistically greater PPa than the weaker subjects (p = 0.039). Statistically significant time main effects existed for PPa (p = 0.015), but not for JH (p = 0.178). No statistically significant strength level × time interaction effects for JH (p = 0.319) or PPa (p = 0.203) were present for the ballistic or nonballistic conditions. Practical significance indicated by effect sizes and the relationships between maximum potentiation and relative strength suggest that stronger subjects potentiate earlier and to a greater extent than weaker subjects during ballistic and nonballistic potentiation complexes.

  12. Bayesian statistics in radionuclide metrology: measurement of a decaying source

    NASA Astrophysics Data System (ADS)

    Bochud, François O.; Bailat, Claude J.; Laedermann, Jean-Pascal

    2007-08-01

    The most intuitive way of defining a probability is perhaps through the frequency at which it appears when a large number of trials are realized in identical conditions. The probability derived from the obtained histogram characterizes the so-called frequentist or conventional statistical approach. In this sense, probability is defined as a physical property of the observed system. By contrast, in Bayesian statistics, a probability is not a physical property or a directly observable quantity, but a degree of belief or an element of inference. The goal of this paper is to show how Bayesian statistics can be used in radionuclide metrology and what its advantages and disadvantages are compared with conventional statistics. This is performed through the example of a yttrium-90 source typically encountered in environmental surveillance measurement. Because of the very low activity of this kind of source and the short half-life of the radionuclide, this measurement takes several days, during which the source decays significantly. Several methods are proposed to compute simultaneously the number of unstable nuclei at a given reference time, the decay constant and the background. Asymptotically, all approaches give the same result. However, Bayesian statistics produces coherent estimates and confidence intervals in a much smaller number of measurements. Apart from the conceptual understanding of statistics, the main difficulty that could deter radionuclide metrologists from using Bayesian statistics is the complexity of the computation.
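
    As a concrete illustration of the Bayesian route, the sketch below uses a one-dimensional grid approximation: the expected counts in each counting interval combine a decaying source term with a constant background, and a flat prior on the initial number of nuclei is updated with Poisson likelihoods. All numbers (efficiency, background, binning) are hypothetical placeholders, and the decay constant is treated as known to keep the example short, unlike the joint estimation described in the paper.

    ```python
    import numpy as np
    from scipy.stats import poisson

    # Grid-approximation posterior for the number of unstable nuclei N0 at a
    # reference time, from counts in successive time bins of width dt:
    #   mu_i = N0 * eps * (exp(-lam*t_i) - exp(-lam*(t_i+dt))) + b*dt
    lam = np.log(2) / 64.0          # decay constant (1/h) for a ~64 h half-life (Y-90)
    eps, b, dt = 0.3, 0.5, 6.0      # hypothetical efficiency, background rate, bin width
    t = np.arange(0, 72, dt)

    rng = np.random.default_rng(3)
    true_N0 = 2000.0
    mu_true = true_N0 * eps * (np.exp(-lam * t) - np.exp(-lam * (t + dt))) + b * dt
    counts = rng.poisson(mu_true)   # simulated measurement

    N0_grid = np.linspace(500, 4000, 2000)
    mu = N0_grid[:, None] * eps * (np.exp(-lam * t) - np.exp(-lam * (t + dt))) + b * dt
    log_post = poisson.logpmf(counts, mu).sum(axis=1)     # flat prior on N0
    post = np.exp(log_post - log_post.max())
    post /= post.sum()
    print("posterior mean of N0:", np.sum(N0_grid * post))
    ```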

  13. Rescore protein-protein docked ensembles with an interface contact statistics.

    PubMed

    Mezei, Mihaly

    2017-02-01

    The recently developed statistical measure for the type of residue-residue contact at protein complex interfaces, based on a parameter-free definition of contact, has been used to define a contact score that is correlated with the likelihood of correctness of a proposed complex structure. Comparing the proposed contact scores on the native structure and on a set of model structures, the proposed measure was shown to generally favor the native structure but in itself was not able to reliably score the native structure to be the best. Adjusting the scores of redocking experiments with the contact score showed that the adjusted score was able to move up the ranking of the native-like structure among the proposed complexes when the native-like structure was not ranked the best by the respective program. Tests on docking of unbound proteins compared the contact scores of the complexes with the contact score of the crystal structure, again showing the tendency of the contact score to favor native-like conformations. The possibility of using the contact score to improve the determination of biological dimers in a crystal structure was also explored. Proteins 2017; 85:235-241. © 2016 Wiley Periodicals, Inc.

  14. Quantum communication complexity advantage implies violation of a Bell inequality

    PubMed Central

    Buhrman, Harry; Czekaj, Łukasz; Grudka, Andrzej; Horodecki, Michał; Horodecki, Paweł; Markiewicz, Marcin; Speelman, Florian; Strelchuk, Sergii

    2016-01-01

    We obtain a general connection between a large quantum advantage in communication complexity and Bell nonlocality. We show that given any protocol offering a sufficiently large quantum advantage in communication complexity, there exists a way of obtaining measurement statistics that violate some Bell inequality. Our main tool is port-based teleportation. If the gap between quantum and classical communication complexity can grow arbitrarily large, the ratio of the quantum value to the classical value of the Bell quantity becomes unbounded with the increase in the number of inputs and outputs. PMID:26957600

  15. Family Environment and Cognitive Development: Twelve Analytic Models

    ERIC Educational Resources Information Center

    Walberg, Herbert J.; Marjoribanks, Kevin

    1976-01-01

    The review indicates that refined measures of the family environment and the use of complex statistical models increase the understanding of the relationships between socioeconomic status, sibling variables, family environment, and cognitive development. (RC)

  16. Measuring Circulation Desk Activities Using a Random Alarm Mechanism.

    ERIC Educational Resources Information Center

    Mosborg, Stella Frank

    1980-01-01

    Reports a job analysis methodology to gather meaningful data related to circulation desk activity. The technique is designed to give librarians statistical data on actual time expenditures for complex and varying activities. (Author/RAA)

  17. How weak values emerge in joint measurements on cloned quantum systems.

    PubMed

    Hofmann, Holger F

    2012-07-13

    A statistical analysis of optimal universal cloning shows that it is possible to identify an ideal (but nonpositive) copying process that faithfully maps all properties of the original Hilbert space onto two separate quantum systems, resulting in perfect correlations for all observables. The joint probabilities for noncommuting measurements on separate clones then correspond to the real parts of the complex joint probabilities observed in weak measurements on a single system, where the measurements on the two clones replace the corresponding sequence of weak measurement and postselection. The imaginary parts of weak measurement statistics can be obtained by replacing the cloning process with a partial swap operation. A controlled-swap operation combines both processes, making the complete weak measurement statistics accessible as a well-defined contribution to the joint probabilities of fully resolved projective measurements on the two output systems.

  18. Single-molecule conductance studies of photo-active and photochromic molecules

    NASA Astrophysics Data System (ADS)

    Tam, E. S.; Parks, J. J.; Santiago-Berrios, M. B.; Zhong, Y.-W.; Abruna, H. D.; Ralph, D. C.

    2010-03-01

    We perform statistical measurements of single molecule conductance in repeatedly-formed metal-molecule-metal junctions at room temperature. Our results on diaminoalkanes are consistent with those reported by the Venkataraman group. We focus on photo-active and photochromic molecules, including a series of transition-metal complexes with different metal centers and endgroups. We compare the trend in conductance across the family of complexes with that expected from electrochemical measurements. We will also report initial results on the voltage dependence of single-molecule conductances and the effects of optical excitations.

  19. Evaluating Cellular Polyfunctionality with a Novel Polyfunctionality Index

    PubMed Central

    Larsen, Martin; Sauce, Delphine; Arnaud, Laurent; Fastenackels, Solène; Appay, Victor; Gorochov, Guy

    2012-01-01

    Functional evaluation of naturally occurring or vaccination-induced T cell responses in mice, men and monkeys has in recent years advanced from single-parameter (e.g. IFN-γ-secretion) to much more complex multidimensional measurements. Co-secretion of multiple functional molecules (such as cytokines and chemokines) at the single-cell level is now measurable due primarily to major advances in multiparametric flow cytometry. The very extensive and complex datasets generated by this technology raise the demand for proper analytical tools that enable the analysis of combinatorial functional properties of T cells, hence polyfunctionality. Presently, multidimensional functional measures are analysed either by evaluating all combinations of parameters individually or by summing frequencies of combinations that include the same number of simultaneous functions. Often these evaluations are visualized as pie charts. Whereas pie charts effectively represent and compare average polyfunctionality profiles of particular T cell subsets or patient groups, they do not document the degree or variation of polyfunctionality within a group, nor do they allow more sophisticated statistical analysis. Here we propose a novel polyfunctionality index that numerically evaluates the degree and variation of polyfunctionality, and enables comparative and correlative parametric and non-parametric statistical tests. Moreover, it allows the use of more advanced statistical approaches, such as cluster analysis. We believe that the polyfunctionality index will render polyfunctionality an appropriate end-point measure in future studies of T cell responsiveness. PMID:22860124

  20. Statistical benchmark for BosonSampling

    NASA Astrophysics Data System (ADS)

    Walschaers, Mattia; Kuipers, Jack; Urbina, Juan-Diego; Mayer, Klaus; Tichy, Malte Christopher; Richter, Klaus; Buchleitner, Andreas

    2016-03-01

    Boson samplers—set-ups that generate complex many-particle output states through the transmission of elementary many-particle input states across a multitude of mutually coupled modes—promise the efficient quantum simulation of a classically intractable computational task, and challenge the extended Church-Turing thesis, one of the fundamental dogmas of computer science. However, as in all experimental quantum simulations of truly complex systems, one crucial problem remains: how to certify that a given experimental measurement record unambiguously results from enforcing the claimed dynamics, on bosons, fermions or distinguishable particles? Here we offer a statistical solution to the certification problem, identifying an unambiguous statistical signature of many-body quantum interference upon transmission across a multimode, random scattering device. We show that statistical analysis of only partial information on the output state allows one to characterise the imparted dynamics through particle type-specific features of the emerging interference patterns. The relevant statistical quantifiers are classically computable, define a falsifiable benchmark for BosonSampling, and reveal distinctive features of many-particle quantum dynamics, which go much beyond mere bunching or anti-bunching effects.

  1. Minimized state complexity of quantum-encoded cryptic processes

    NASA Astrophysics Data System (ADS)

    Riechers, Paul M.; Mahoney, John R.; Aghamohammadi, Cina; Crutchfield, James P.

    2016-05-01

    The predictive information required for proper trajectory sampling of a stochastic process can be more efficiently transmitted via a quantum channel than a classical one. This recent discovery allows quantum information processing to drastically reduce the memory necessary to simulate complex classical stochastic processes. It also points to a new perspective on the intrinsic complexity that nature must employ in generating the processes we observe. The quantum advantage increases with codeword length: the length of process sequences used in constructing the quantum communication scheme. In analogy with the classical complexity measure, statistical complexity, we use this reduced communication cost as an entropic measure of state complexity in the quantum representation. Previously difficult to compute, the quantum advantage is expressed here in closed form using spectral decomposition. This allows for efficient numerical computation of the quantum-reduced state complexity at all encoding lengths, including infinite. Additionally, it makes clear how finite-codeword reduction in state complexity is controlled by the classical process's cryptic order, and it allows asymptotic analysis of infinite-cryptic-order processes.

  2. Best Phd thesis Prize: Statistical analysis of ALFALFA galaxies: insights in galaxy

    NASA Astrophysics Data System (ADS)

    Papastergis, E.

    2013-09-01

    We use the rich dataset of local universe galaxies detected by the ALFALFA 21cm survey to study the statistical properties of gas-bearing galaxies. In particular, we measure the number density of galaxies as a function of their baryonic mass ("baryonic mass function") and rotational velocity ("velocity width function"), and we characterize their clustering properties ("two-point correlation function"). These statistical distributions are determined by both the properties of dark matter on small scales, as well as by the complex baryonic processes through which galaxies form over cosmic time. We interpret the ALFALFA measurements with the aid of publicly available cosmological N-body simulations and we present some key results related to galaxy formation and small-scale cosmology.

  3. Functional constraints on tooth morphology in carnivorous mammals

    PubMed Central

    2012-01-01

    Background The range of potential morphologies resulting from evolution is limited by complex interacting processes, ranging from development to function. Quantifying these interactions is important for understanding adaptation and convergent evolution. Using three-dimensional reconstructions of carnivoran and dasyuromorph tooth rows, we compared statistical models of the relationship between tooth row shape and the opposing tooth row, a static feature, as well as measures of mandibular motion during chewing (occlusion), which are kinetic features. This is a new approach to quantifying functional integration because we use measures of movement and displacement, such as the amount the mandible translates laterally during occlusion, as opposed to conventional morphological measures, such as mandible length and geometric landmarks. By sampling two distantly related groups of ecologically similar mammals, we study carnivorous mammals in general rather than a specific group of mammals. Results Statistical model comparisons demonstrate that the best performing models always include some measure of mandibular motion, indicating that functional and statistical models of tooth shape as purely a function of the opposing tooth row are too simple and that increased model complexity provides a better understanding of tooth form. The predictors of the best performing models always included the opposing tooth row shape and a relative linear measure of mandibular motion. Conclusions Our results provide quantitative support of long-standing hypotheses of tooth row shape as being influenced by mandibular motion in addition to the opposing tooth row. Additionally, this study illustrates the utility and necessity of including kinetic features in analyses of morphological integration. PMID:22899809

  4. Advanced functional network analysis in the geosciences: The pyunicorn package

    NASA Astrophysics Data System (ADS)

    Donges, Jonathan F.; Heitzig, Jobst; Runge, Jakob; Schultz, Hanna C. H.; Wiedermann, Marc; Zech, Alraune; Feldhoff, Jan; Rheinwalt, Aljoscha; Kutza, Hannes; Radebach, Alexander; Marwan, Norbert; Kurths, Jürgen

    2013-04-01

    Functional networks are a powerful tool for analyzing large geoscientific datasets such as global fields of climate time series originating from observations or model simulations. pyunicorn (pythonic unified complex network and recurrence analysis toolbox) is an open-source, fully object-oriented and easily parallelizable package written in the language Python. It allows for constructing functional networks (aka climate networks) representing the structure of statistical interrelationships in large datasets and, subsequently, investigating this structure using advanced methods of complex network theory such as measures for networks of interacting networks, node-weighted statistics or network surrogates. Additionally, pyunicorn allows one to study the complex dynamics of geoscientific systems as recorded by time series by means of recurrence networks and visibility graphs. The range of possible applications of the package is outlined, drawing on several examples from climatology.

  5. Are weather models better than gridded observations for precipitation in the mountains? (Invited)

    NASA Astrophysics Data System (ADS)

    Gutmann, E. D.; Rasmussen, R.; Liu, C.; Ikeda, K.; Clark, M. P.; Brekke, L. D.; Arnold, J.; Raff, D. A.

    2013-12-01

    Mountain snowpack is a critical storage component in the water cycle, and it provides drinking water for tens of millions of people in the Western US alone. This water store is susceptible to climate change both because warming temperatures are likely to lead to earlier melt and a temporal shift of the hydrograph, and because changing atmospheric conditions are likely to change the precipitation patterns that produce the snowpack. Current measurements of snowfall in complex terrain are limited in number due in part to the logistics of installing equipment in complex terrain. We show that this limitation leads to statistical artifacts in gridded observations of current climate including errors in precipitation season totals of a factor of two or more, increases in wet day fraction, and decreases in storm intensity. In contrast, a high-resolution numerical weather model (WRF) is able to reproduce observed precipitation patterns, leading to confidence in its predictions for areas without measurements and new observations support this. Running WRF for a future climate scenario shows substantial changes in the spatial patterns of precipitation in the mountains related to the physics of hydrometeor production and detrainment that are not captured by statistical downscaling products. The stationarity in statistical downscaling products is likely to lead to important errors in our estimation of future precipitation in complex terrain.

  6. Spectral Entropies as Information-Theoretic Tools for Complex Network Comparison

    NASA Astrophysics Data System (ADS)

    De Domenico, Manlio; Biamonte, Jacob

    2016-10-01

    Any physical system can be viewed from the perspective that information is implicitly represented in its state. However, the quantification of this information when it comes to complex networks has remained largely elusive. In this work, we use techniques inspired by quantum statistical mechanics to define an entropy measure for complex networks and to develop a set of information-theoretic tools, based on network spectral properties, such as Rényi q entropy, generalized Kullback-Leibler and Jensen-Shannon divergences, the latter allowing us to define a natural distance measure between complex networks. First, we show that by minimizing the Kullback-Leibler divergence between an observed network and a parametric network model, inference of model parameter(s) by means of maximum-likelihood estimation can be achieved and model selection can be performed with appropriate information criteria. Second, we show that the information-theoretic metric quantifies the distance between pairs of networks and we can use it, for instance, to cluster the layers of a multilayer system. By applying this framework to networks corresponding to sites of the human microbiome, we perform hierarchical cluster analysis and recover with high accuracy existing community-based associations. Our results imply that spectral-based statistical inference in complex networks results in demonstrably superior performance as well as a conceptual backbone, filling a gap towards a network information theory.
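
    A common way to define such a spectral entropy is to rescale the graph Laplacian by its trace so that it behaves like a density matrix and then take the Von Neumann entropy of its eigenvalues. The sketch below uses that convention with networkx and numpy; the paper's exact normalization and its Rényi-q generalization may differ in detail.

    ```python
    import numpy as np
    import networkx as nx

    def von_neumann_entropy(G):
        """Von Neumann entropy of a graph: treat rho = L / trace(L) as a density
        matrix built from the Laplacian L and compute -Tr(rho log2 rho)."""
        L = nx.laplacian_matrix(G).toarray().astype(float)
        rho = L / np.trace(L)
        evals = np.linalg.eigvalsh(rho)
        evals = evals[evals > 1e-12]
        return -np.sum(evals * np.log2(evals))

    G1 = nx.erdos_renyi_graph(50, 0.1, seed=4)       # homogeneous random graph
    G2 = nx.barabasi_albert_graph(50, 3, seed=4)     # heterogeneous scale-free graph
    print(von_neumann_entropy(G1), von_neumann_entropy(G2))
    ```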

  7. Characterization of time series via Rényi complexity-entropy curves

    NASA Astrophysics Data System (ADS)

    Jauregui, M.; Zunino, L.; Lenzi, E. K.; Mendes, R. S.; Ribeiro, H. V.

    2018-05-01

    One of the most useful tools for distinguishing between chaotic and stochastic time series is the so-called complexity-entropy causality plane. This diagram involves two complexity measures: the Shannon entropy and the statistical complexity. Recently, this idea has been generalized by considering the Tsallis monoparametric generalization of the Shannon entropy, yielding complexity-entropy curves. These curves have proven to enhance the discrimination among different time series related to stochastic and chaotic processes of numerical and experimental nature. Here we further explore these complexity-entropy curves in the context of the Rényi entropy, which is another monoparametric generalization of the Shannon entropy. By combining the Rényi entropy with the proper generalization of the statistical complexity, we associate a parametric curve (the Rényi complexity-entropy curve) with a given time series. We explore this approach in a series of numerical and experimental applications, demonstrating the usefulness of this new technique for time series analysis. We show that the Rényi complexity-entropy curves enable the differentiation among time series of chaotic, stochastic, and periodic nature. In particular, time series of stochastic nature are associated with curves displaying positive curvature in a neighborhood of their initial points, whereas curves related to chaotic phenomena have a negative curvature; finally, periodic time series are represented by vertical straight lines.
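
    In the Shannon (q = 1) limit this construction reduces to the familiar complexity-entropy causality plane: the ordinal-pattern (Bandt-Pompe) distribution of the series gives a normalized permutation entropy H, and the statistical complexity is H multiplied by the normalized Jensen-Shannon divergence between that distribution and the uniform one. The sketch below implements this base case; the Rényi curves studied in the paper generalize the entropy and complexity definitions beyond it.

    ```python
    import numpy as np
    from itertools import permutations
    from math import factorial

    def ordinal_distribution(x, d=4):
        """Bandt-Pompe distribution of ordinal patterns of embedding dimension d."""
        patterns = {p: 0 for p in permutations(range(d))}
        for i in range(len(x) - d + 1):
            patterns[tuple(np.argsort(x[i:i + d]).tolist())] += 1
        p = np.array(list(patterns.values()), dtype=float)
        return p / p.sum()

    def complexity_entropy(x, d=4):
        """Normalized permutation entropy H and statistical complexity C = H * Q,
        where Q is the Jensen-Shannon divergence to the uniform distribution,
        normalized by its maximum value (the Shannon version of the plane)."""
        p = ordinal_distribution(x, d)
        n = factorial(d)
        u = np.full(n, 1.0 / n)

        def shannon(q):
            q = q[q > 0]
            return -np.sum(q * np.log(q))

        H = shannon(p) / np.log(n)
        js = shannon(0.5 * (p + u)) - 0.5 * shannon(p) - 0.5 * shannon(u)
        js_max = -0.5 * (((n + 1.0) / n) * np.log(n + 1) + np.log(n) - 2 * np.log(2 * n))
        return H, H * js / js_max

    rng = np.random.default_rng(5)
    print(complexity_entropy(rng.normal(size=5000)))   # noise: high H, low C
    x = [0.4]
    for _ in range(5000):
        x.append(4.0 * x[-1] * (1.0 - x[-1]))          # logistic map: lower H, higher C
    print(complexity_entropy(np.array(x)))
    ```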

  8. A generalized complexity measure based on Rényi entropy

    NASA Astrophysics Data System (ADS)

    Sánchez-Moreno, Pablo; Angulo, Juan Carlos; Dehesa, Jesus S.

    2014-08-01

    The intrinsic statistical complexities of finite many-particle systems (i.e., those defined in terms of the single-particle density) quantify the degree of structure or patterns, far beyond the entropy measures. They are intuitively constructed to be minima at the opposite extremes of perfect order and maximal randomness. Starting from the pioneering LMC measure, which satisfies these requirements, some extensions of LMC-Rényi type have been published in the literature. The latter measures were shown to describe a variety of physical aspects of the internal disorder in atomic and molecular systems (e.g., quantum phase transitions, atomic shell filling) which are not grasped by their mother LMC quantity. However, they are not minimal for maximal randomness in general. In this communication, we propose a generalized LMC-Rényi complexity which overcomes this problem. Some applications which illustrate this fact are given.
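
    The starting point for this family is the original LMC measure for a discrete distribution: C = H · D, the product of the normalized Shannon entropy H and the disequilibrium D (the squared distance from the uniform distribution), which vanishes at both perfect order and maximal randomness. The sketch below shows this base case only; the LMC-Rényi variants discussed above replace the entropic ingredient by Rényi-parameterized quantities.

    ```python
    import numpy as np

    def lmc_complexity(p):
        """Original LMC statistical complexity of a discrete distribution p:
        C = H * D, with H the normalized Shannon entropy and D the disequilibrium
        (squared Euclidean distance from the uniform distribution)."""
        p = np.asarray(p, dtype=float)
        p = p / p.sum()
        n = len(p)
        nz = p[p > 0]
        H = -np.sum(nz * np.log(nz)) / np.log(n)   # normalized Shannon entropy
        D = np.sum((p - 1.0 / n) ** 2)             # disequilibrium
        return H * D

    print(lmc_complexity([1, 0, 0, 0]))              # perfect order: C = 0
    print(lmc_complexity([0.25, 0.25, 0.25, 0.25]))  # maximal randomness: C = 0
    print(lmc_complexity([0.6, 0.2, 0.1, 0.1]))      # intermediate structure: C > 0
    ```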

  9. Cerebral blood flow during paroxysmal EEG activation induced by sleep in patients with complex partial seizures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gozukirmizi, E.; Meyer, J.S.; Okabe, T.

    1982-01-01

    Cerebral blood flow (CBF) measurements were combined with sleep polysomnography in nine patients with complex partial seizures. Two methods were used: the 133Xe method for measuring regional (rCBF) and the stable xenon CT method for local (LCBF). Compared to nonepileptic subjects, who show diffuse CBF decreases during stages I-II, non-REM sleep onset, patients with complex partial seizures show statistically significant increases in CBF which are maximal in regions where the EEG focus is localized and are predominantly seen in one temporal region but are also propagated to other cerebral areas. Both CBF methods gave comparable results, but greater statistical significance was achieved by stable xenon CT methodology. CBF increases are more diffuse than predicted by EEG paroxysmal activity recorded from scalp electrodes. An advantage of the 133Xe inhalation method was achievement of reliable data despite movement of the head. This was attributed to the use of a helmet which maintained the probes approximated to the scalp. Disadvantages were poor resolution (7 cm3) and two-dimensional information. The advantage of stable xenon CT method is excellent resolution (80 mm3) in three dimensions, but a disadvantage is that movement of the head in patients with seizure disorders may limit satisfactory measurements.

  10. The Problem of Auto-Correlation in Parasitology

    PubMed Central

    Pollitt, Laura C.; Reece, Sarah E.; Mideo, Nicole; Nussey, Daniel H.; Colegrave, Nick

    2012-01-01

    Explaining the contribution of host and pathogen factors in driving infection dynamics is a major ambition in parasitology. There is increasing recognition that analyses based on single summary measures of an infection (e.g., peak parasitaemia) do not adequately capture infection dynamics and so, the appropriate use of statistical techniques to analyse dynamics is necessary to understand infections and, ultimately, control parasites. However, the complexities of within-host environments mean that tracking and analysing pathogen dynamics within infections and among hosts poses considerable statistical challenges. Simple statistical models make assumptions that will rarely be satisfied in data collected on host and parasite parameters. In particular, model residuals (unexplained variance in the data) should not be correlated in time or space. Here we demonstrate how failure to account for such correlations can result in incorrect biological inference from statistical analysis. We then show how mixed effects models can be used as a powerful tool to analyse such repeated measures data in the hope that this will encourage better statistical practices in parasitology. PMID:22511865
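
    As a concrete illustration of the recommended approach, the sketch below fits a linear mixed-effects model with a random intercept per host to simulated repeated parasitaemia measurements, so that the within-host correlation is absorbed by the random effect rather than leaking into the residuals. Column names and the simulated data are hypothetical; this is not the authors' analysis.

    ```python
    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Simulated repeated measures: 20 hosts, 10 sampling days each, with a
    # host-level random intercept inducing within-host correlation.
    rng = np.random.default_rng(6)
    hosts = np.repeat(np.arange(20), 10)
    day = np.tile(np.arange(10), 20)
    host_effect = rng.normal(0, 1.0, 20)[hosts]
    log_parasitaemia = 2.0 + 0.3 * day + host_effect + rng.normal(0, 0.5, len(day))

    df = pd.DataFrame({"host": hosts, "day": day, "log_parasitaemia": log_parasitaemia})
    model = smf.mixedlm("log_parasitaemia ~ day", df, groups=df["host"])  # random intercept per host
    result = model.fit()
    print(result.summary())
    ```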

  11. Online incidental statistical learning of audiovisual word sequences in adults: a registered report.

    PubMed

    Kuppuraj, Sengottuvel; Duta, Mihaela; Thompson, Paul; Bishop, Dorothy

    2018-02-01

    Statistical learning has been proposed as a key mechanism in language learning. Our main goal was to examine whether adults are capable of simultaneously extracting statistical dependencies in a task where stimuli include a range of structures amenable to statistical learning within a single paradigm. We devised an online statistical learning task using real word auditory-picture sequences that vary in two dimensions: (i) predictability and (ii) adjacency of dependent elements. This task was followed by an offline recall task to probe learning of each sequence type. We registered three hypotheses with specific predictions. First, adults would extract regular patterns from continuous stream (effect of grammaticality). Second, within grammatical conditions, they would show differential speeding up for each condition as a factor of statistical complexity of the condition and exposure. Third, our novel approach to measure online statistical learning would be reliable in showing individual differences in statistical learning ability. Further, we explored the relation between statistical learning and a measure of verbal short-term memory (STM). Forty-two participants were tested and retested after an interval of at least 3 days on our novel statistical learning task. We analysed the reaction time data using a novel regression discontinuity approach. Consistent with prediction, participants showed a grammaticality effect, agreeing with the predicted order of difficulty for learning different statistical structures. Furthermore, a learning index from the task showed acceptable test-retest reliability ( r  = 0.67). However, STM did not correlate with statistical learning. We discuss the findings noting the benefits of online measures in tracking the learning process.

  12. Online incidental statistical learning of audiovisual word sequences in adults: a registered report

    PubMed Central

    Duta, Mihaela; Thompson, Paul

    2018-01-01

    Statistical learning has been proposed as a key mechanism in language learning. Our main goal was to examine whether adults are capable of simultaneously extracting statistical dependencies in a task where stimuli include a range of structures amenable to statistical learning within a single paradigm. We devised an online statistical learning task using real word auditory–picture sequences that vary in two dimensions: (i) predictability and (ii) adjacency of dependent elements. This task was followed by an offline recall task to probe learning of each sequence type. We registered three hypotheses with specific predictions. First, adults would extract regular patterns from continuous stream (effect of grammaticality). Second, within grammatical conditions, they would show differential speeding up for each condition as a factor of statistical complexity of the condition and exposure. Third, our novel approach to measure online statistical learning would be reliable in showing individual differences in statistical learning ability. Further, we explored the relation between statistical learning and a measure of verbal short-term memory (STM). Forty-two participants were tested and retested after an interval of at least 3 days on our novel statistical learning task. We analysed the reaction time data using a novel regression discontinuity approach. Consistent with prediction, participants showed a grammaticality effect, agreeing with the predicted order of difficulty for learning different statistical structures. Furthermore, a learning index from the task showed acceptable test–retest reliability (r = 0.67). However, STM did not correlate with statistical learning. We discuss the findings noting the benefits of online measures in tracking the learning process. PMID:29515876

  13. On the Measurement of Morphology and Its Change.

    DTIC Science & Technology

    1982-03-01

    that transitions within the ceratopsian dinosaurs necessitated overly complex or impossible grid deformations. THETA-RHO ANALYSIS An alternative...1982, A robust comparison of the three dimensional configurations of protein molecules. Tech. Rpt. 224, Ser. 2, Dept. Statistics Princeton Univ., 18 p

  14. Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package

    NASA Astrophysics Data System (ADS)

    Donges, Jonathan F.; Heitzig, Jobst; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik V.; Marwan, Norbert; Dijkstra, Henk A.; Kurths, Jürgen

    2015-11-01

    We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in the language Python. It allows for the construction of functional networks such as climate networks in climatology or functional brain networks in neuroscience representing the structure of statistical interrelationships in large data sets of time series and, subsequently, investigating this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence quantification analysis, recurrence networks, visibility graphs, and construction of surrogate time series. The range of possible applications of the library is outlined, drawing on several examples mainly from the field of climatology.

  15. Non-extensivity and complexity in the earthquake activity at the West Corinth rift (Greece)

    NASA Astrophysics Data System (ADS)

    Michas, Georgios; Vallianatos, Filippos; Sammonds, Peter

    2013-04-01

    Earthquakes exhibit complex phenomenology that is revealed by the fractal structure in space, time and magnitude. For that reason, tools other than simple Poissonian statistics seem more appropriate to describe the statistical properties of the phenomenon. Here we use Non-Extensive Statistical Physics [NESP] to investigate the inter-event time distribution of the earthquake activity at the west Corinth rift (central Greece). This area is one of the most seismotectonically active areas in Europe, with an important continental N-S extension and high seismicity rates. The NESP concept refers to the non-additive Tsallis entropy Sq that includes Boltzmann-Gibbs entropy as a particular case. This concept has been successfully used for the analysis of a variety of complex dynamic systems including earthquakes, where fractality and long-range interactions are important. The analysis indicates that the cumulative inter-event time distribution can be successfully described with NESP, implying the complexity that characterizes the temporal occurrences of earthquakes. Further on, we use the Tsallis entropy (Sq) and the Fisher Information Measure (FIM) to investigate the complexity that characterizes the inter-event time distribution through different time windows along the evolution of the seismic activity at the West Corinth rift. The results of this analysis reveal a different level of organization and clustering of the seismic activity in time. Acknowledgments. GM wishes to acknowledge the partial support of the Greek State Scholarships Foundation (IKY).
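
    In NESP analyses of this kind the cumulative (survival) distribution of inter-event times is typically described by a Tsallis q-exponential, exp_q(-t/τ) = [1 - (1-q) t/τ]^(1/(1-q)), with q > 1 signalling departure from Poissonian (q = 1) behaviour. The sketch below fits such a function to a synthetic inter-event time sample; the data and starting values are placeholders, not the Corinth rift catalogue.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def q_exponential(t, q, tau):
        """Tsallis q-exponential exp_q(-t/tau) = [1 - (1-q) t/tau]^(1/(1-q))."""
        base = np.clip(1.0 - (1.0 - q) * t / tau, 1e-12, None)
        return np.power(base, 1.0 / (1.0 - q))

    # Synthetic heavy-tailed inter-event times (placeholder for a real catalogue).
    rng = np.random.default_rng(7)
    times = rng.pareto(2.5, 5000) * 10.0

    t_sorted = np.sort(times)
    ccdf = 1.0 - np.arange(1, len(t_sorted) + 1) / len(t_sorted)   # empirical P(T > t)
    mask = ccdf > 0
    popt, _ = curve_fit(q_exponential, t_sorted[mask], ccdf[mask],
                        p0=(1.2, 10.0), maxfev=10000)
    print("q =", popt[0], "tau =", popt[1])
    ```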

  16. Primer of statistics in dental research: part I.

    PubMed

    Shintani, Ayumi

    2014-01-01

    Statistics play essential roles in evidence-based dentistry (EBD) practice and research. Their role ranges widely from formulating scientific questions, designing studies, collecting and analyzing data to interpreting, reporting, and presenting study findings. Mastering statistical concepts appears to be an unreachable goal among many dental researchers, in part due to the difficulty of explaining statistical principles to health researchers without invoking complex mathematical concepts. This series of 2 articles aims to introduce dental researchers to 9 essential topics in statistics to conduct EBD with intuitive examples. Part I of the series covers the first 5 topics: (1) statistical graph, (2) how to deal with outliers, (3) p-value and confidence interval, (4) testing equivalence, and (5) multiplicity adjustment. Part II will follow to cover the remaining topics including (6) selecting the proper statistical tests, (7) repeated measures analysis, (8) epidemiological consideration for causal association, and (9) analysis of agreement. Copyright © 2014. Published by Elsevier Ltd.

  17. Kolmogorov complexity, statistical regularization of inverse problems, and Birkhoff's formalization of beauty

    NASA Astrophysics Data System (ADS)

    Kreinovich, Vladik; Longpre, Luc; Koshelev, Misha

    1998-09-01

    Most practical applications of statistical methods are based on the implicit assumption that if an event has a very small probability, then it cannot occur. For example, the probability that a kettle placed on a cold stove would start boiling by itself is not 0; it is positive, but so small that physicists conclude that such an event is simply impossible. This assumption is difficult to formalize in traditional probability theory, because this theory only describes measures on sets and does not allow us to divide functions into 'random' and non-random ones. This distinction was made possible by the idea of algorithmic randomness, introduced by Kolmogorov and his student Martin-Löf in the 1960s. We show that this idea can also be used for inverse problems. In particular, we prove that for every probability measure, the corresponding set of random functions is compact, and, therefore, the corresponding restricted inverse problem is well-defined. The resulting technique turns out to be interestingly related to the qualitative aesthetic measure introduced by G. Birkhoff as order/complexity.

  18. A new universality class in corpus of texts; A statistical physics study

    NASA Astrophysics Data System (ADS)

    Najafi, Elham; Darooneh, Amir H.

    2018-05-01

    Text can be regarded as a complex system. There are some methods in statistical physics which can be used to study this system. In this work, by means of statistical physics methods, we reveal new universal behaviors of texts associated with the fractality values of words in a text. The fractality measure indicates the importance of words in a text by considering the distribution pattern of words throughout the text. We observed a power law relation between the fractality of a text and its vocabulary size for texts and corpora. We also observed this behavior in studying biological data.

  19. Building out a Measurement Model to Incorporate Complexities of Testing in the Language Domain

    ERIC Educational Resources Information Center

    Wilson, Mark; Moore, Stephen

    2011-01-01

    This paper provides a summary of a novel and integrated way to think about the item response models (most often used in measurement applications in social science areas such as psychology, education, and especially testing of various kinds) from the viewpoint of the statistical theory of generalized linear and nonlinear mixed models. In addition,…

  20. Statistical Model of Dynamic Markers of the Alzheimer's Pathological Cascade.

    PubMed

    Balsis, Steve; Geraci, Lisa; Benge, Jared; Lowe, Deborah A; Choudhury, Tabina K; Tirso, Robert; Doody, Rachelle S

    2018-05-05

    Alzheimer's disease (AD) is a progressive disease reflected in markers across assessment modalities, including neuroimaging, cognitive testing, and evaluation of adaptive function. Identifying a single continuum of decline across assessment modalities in a single sample is statistically challenging because of the multivariate nature of the data. To address this challenge, we implemented advanced statistical analyses designed specifically to model complex data across a single continuum. We analyzed data from the Alzheimer's Disease Neuroimaging Initiative (ADNI; N = 1,056), focusing on indicators from the assessments of magnetic resonance imaging (MRI) volume, fluorodeoxyglucose positron emission tomography (FDG-PET) metabolic activity, cognitive performance, and adaptive function. Item response theory was used to identify the continuum of decline. Then, through a process of statistical scaling, indicators across all modalities were linked to that continuum and analyzed. Findings revealed that measures of MRI volume, FDG-PET metabolic activity, and adaptive function added measurement precision beyond that provided by cognitive measures, particularly in the relatively mild range of disease severity. More specifically, MRI volume, and FDG-PET metabolic activity become compromised in the very mild range of severity, followed by cognitive performance and finally adaptive function. Our statistically derived models of the AD pathological cascade are consistent with existing theoretical models.

  1. How Good Are Statistical Models at Approximating Complex Fitness Landscapes?

    PubMed Central

    du Plessis, Louis; Leventhal, Gabriel E.; Bonhoeffer, Sebastian

    2016-01-01

    Fitness landscapes determine the course of adaptation by constraining and shaping evolutionary trajectories. Knowledge of the structure of a fitness landscape can thus predict evolutionary outcomes. Empirical fitness landscapes, however, have so far only offered limited insight into real-world questions, as the high dimensionality of sequence spaces makes it impossible to exhaustively measure the fitness of all variants of biologically meaningful sequences. We must therefore revert to statistical descriptions of fitness landscapes that are based on a sparse sample of fitness measurements. It remains unclear, however, how much data are required for such statistical descriptions to be useful. Here, we assess the ability of regression models accounting for single and pairwise mutations to correctly approximate a complex quasi-empirical fitness landscape. We compare approximations based on various sampling regimes of an RNA landscape and find that the sampling regime strongly influences the quality of the regression. On the one hand it is generally impossible to generate sufficient samples to achieve a good approximation of the complete fitness landscape, and on the other hand systematic sampling schemes can only provide a good description of the immediate neighborhood of a sequence of interest. Nevertheless, we obtain a remarkably good and unbiased fit to the local landscape when using sequences from a population that has evolved under strong selection. Thus, current statistical methods can provide a good approximation to the landscape of naturally evolving populations. PMID:27189564
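
    The regression models compared above expand fitness in single-site (additive) and pairwise (epistatic) terms of a binary genotype vector. The sketch below builds such a design matrix and fits it with ordinary least squares on a small synthetic landscape; sequence length, effect sizes, and sampling are hypothetical placeholders rather than the RNA landscape used in the paper.

    ```python
    import numpy as np
    from itertools import combinations
    from sklearn.linear_model import LinearRegression

    rng = np.random.default_rng(8)
    L, n_samples = 8, 300
    genotypes = rng.integers(0, 2, size=(n_samples, L))   # mutation present/absent

    # Hypothetical "true" landscape: additive effects plus two pairwise interactions.
    additive = rng.normal(0, 1, L)
    fitness = (genotypes @ additive
               + 1.5 * genotypes[:, 0] * genotypes[:, 3]
               - 2.0 * genotypes[:, 2] * genotypes[:, 5]
               + rng.normal(0, 0.3, n_samples))

    # Design matrix: single-site columns plus all pairwise products.
    pairs = list(combinations(range(L), 2))
    X = np.hstack([genotypes] +
                  [(genotypes[:, i] * genotypes[:, j])[:, None] for i, j in pairs])

    model = LinearRegression().fit(X, fitness)
    print("R^2 of the single + pairwise model:", model.score(X, fitness))
    ```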

  2. Multi-frequency complex network from time series for uncovering oil-water flow structure.

    PubMed

    Gao, Zhong-Ke; Yang, Yu-Xuan; Fang, Peng-Cheng; Jin, Ning-De; Xia, Cheng-Yi; Hu, Li-Dan

    2015-02-04

    Uncovering complex oil-water flow structure represents a challenge in diverse scientific disciplines. This challenge stimulates us to develop a new distributed conductance sensor for measuring local flow signals at different positions and then propose a novel approach based on multi-frequency complex networks to uncover the flow structures from experimental multivariate measurements. In particular, based on the Fast Fourier transform, we demonstrate how to derive a multi-frequency complex network from multivariate time series. We construct complex networks at different frequencies and then detect community structures. Our results indicate that the community structures faithfully represent the structural features of oil-water flow patterns. Furthermore, we investigate network statistics at different frequencies for each derived network and find that the frequency clustering coefficient enables us to uncover the evolution of flow patterns and yields deep insights into the formation of flow structures. Current results present a first step towards a network visualization of complex flow patterns from a community structure perspective.
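
    One generic way to realize this idea is to band-filter each sensor signal with the FFT, build a correlation network of the filtered signals for a given frequency band, and run a community-detection algorithm on each band's network. The sketch below does that with numpy and networkx on synthetic signals; the sensor count, sampling rate, band edges, and correlation threshold are hypothetical, and this is an illustration of the general idea rather than the authors' implementation.

    ```python
    import numpy as np
    import networkx as nx
    from networkx.algorithms.community import greedy_modularity_communities

    def band_filter(x, fs, f_lo, f_hi):
        """Keep only Fourier components of x within [f_lo, f_hi] Hz."""
        spec = np.fft.rfft(x)
        freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
        spec[(freqs < f_lo) | (freqs > f_hi)] = 0.0
        return np.fft.irfft(spec, n=len(x))

    def frequency_network(signals, fs, f_lo, f_hi, threshold=0.3):
        """Correlation network of band-filtered sensor signals: nodes are sensors,
        edges connect pairs whose filtered signals correlate above the threshold."""
        filtered = np.array([band_filter(s, fs, f_lo, f_hi) for s in signals])
        corr = np.corrcoef(filtered)
        G = nx.Graph()
        G.add_nodes_from(range(len(signals)))
        for i in range(len(signals)):
            for j in range(i + 1, len(signals)):
                if abs(corr[i, j]) >= threshold:
                    G.add_edge(i, j, weight=abs(corr[i, j]))
        return G

    # Two groups of four hypothetical conductance sensors sharing distinct sources.
    rng = np.random.default_rng(9)
    base1, base2 = rng.normal(size=(2, 2048))
    signals = np.array([base1 + 0.8 * rng.normal(size=2048) for _ in range(4)] +
                       [base2 + 0.8 * rng.normal(size=2048) for _ in range(4)])

    G = frequency_network(signals, fs=100.0, f_lo=1.0, f_hi=5.0)
    print([sorted(c) for c in greedy_modularity_communities(G)])
    ```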

  3. Neuronal Correlation Parameter and the Idea of Thermodynamic Entropy of an N-Body Gravitationally Bounded System.

    PubMed

    Haranas, Ioannis; Gkigkitzis, Ioannis; Kotsireas, Ilias; Austerlitz, Carlos

    2017-01-01

    Understanding how the brain encodes information and performs computation requires statistical and functional analysis. Given the complexity of the human brain, simple methods that facilitate the interpretation of statistical correlations among different brain regions can be very useful. In this report we introduce a numerical correlation measure that may support the interpretation of correlational neuronal data and may assist in the evaluation of different brain states. The description of the dynamical brain system through a global numerical measure may indicate the presence of an action principle, which could facilitate the application of physics principles in the study of the human brain and cognition.

  4. Cortical Sensitivity to Guitar Note Patterns: EEG Entrainment to Repetition and Key.

    PubMed

    Bridwell, David A; Leslie, Emily; McCoy, Dakarai Q; Plis, Sergey M; Calhoun, Vince D

    2017-01-01

    Music is ubiquitous throughout recent human culture, and many individuals have an innate ability to appreciate and understand music. Our appreciation of music likely emerges from the brain's ability to process a series of repeated complex acoustic patterns. In order to understand these processes further, cortical responses were measured to a series of guitar notes presented with a musical pattern or without a pattern. ERP responses to individual notes were measured using a 24-electrode Bluetooth mobile EEG system (Smarting mBrainTrain) while 13 healthy non-musicians listened to structured (i.e., within musical keys and with repetition) or random sequences of guitar notes for 10 min each. We demonstrate an increased amplitude of the ERP that appears at ~200 ms for notes presented within the musical sequence. This amplitude difference between random notes and patterned notes likely reflects individuals' cortical sensitivity to guitar note patterns. These amplitudes were compared to ERP responses to a rare note embedded within a stream of frequent notes to determine whether the sensitivity to complex musical structure overlaps with the sensitivity to simple irregularities reflected in traditional auditory oddball experiments. Response amplitudes to the negative peak at ~175 ms are statistically correlated with the mismatch negativity (MMN) response measured to a rare note presented among a series of frequent notes (i.e., in a traditional oddball sequence), but responses to the subsequent positive peak at ~200 ms do not show a statistical relationship with the P300 response. Thus, the sensitivity to musical structure identified for 4 Hz note patterns appears somewhat distinct from the sensitivity to statistical regularities reflected in the traditional "auditory oddball" sequence. Overall, we suggest that this is a promising approach to examine individuals' sensitivity to complex acoustic patterns, which may overlap with higher level cognitive processes, including language.

  5. Probing the Fluctuations of Optical Properties in Time-Resolved Spectroscopy

    NASA Astrophysics Data System (ADS)

    Randi, Francesco; Esposito, Martina; Giusti, Francesca; Misochko, Oleg; Parmigiani, Fulvio; Fausti, Daniele; Eckstein, Martin

    2017-11-01

    We show that, in optical pump-probe experiments on bulk samples, the statistical distribution of the intensity of ultrashort light pulses after interaction with a nonequilibrium complex material can be used to measure the time-dependent noise of the current in the system. We illustrate the general arguments for a photoexcited Peierls material. The transient noise spectroscopy allows us to measure to what extent electronic degrees of freedom dynamically obey the fluctuation-dissipation theorem, and how well they thermalize during the coherent lattice vibrations. The proposed statistical measurement developed here provides a new general framework to retrieve dynamical information on the excited distributions in nonequilibrium experiments, which could be extended to other degrees of freedom of magnetic or vibrational origin.

  6. Avalanches and generalized memory associativity in a network model for conscious and unconscious mental functioning

    NASA Astrophysics Data System (ADS)

    Siddiqui, Maheen; Wedemann, Roseli S.; Jensen, Henrik Jeldtoft

    2018-01-01

    We explore statistical characteristics of avalanches associated with the dynamics of a complex-network model, where two modules corresponding to sensorial and symbolic memories interact, representing unconscious and conscious mental processes. The model illustrates Freud's ideas regarding the neuroses and that consciousness is related to symbolic and linguistic memory activity in the brain. It incorporates the Stariolo-Tsallis generalization of the Boltzmann Machine in order to model memory retrieval and associativity. In the present work, we define and measure avalanche size distributions during memory retrieval, in order to gain insight regarding basic aspects of the functioning of these complex networks. The avalanche sizes defined for our model should be related to the time consumed and also to the size of the neuronal region that is activated during memory retrieval. This allows the qualitative comparison of the behaviour of the distribution of cluster sizes, obtained during fMRI measurements of the propagation of signals in the brain, with the distribution of avalanche sizes obtained in our simulation experiments. This comparison corroborates the indication that the Nonextensive Statistical Mechanics formalism may indeed be better suited to model the complex networks which constitute brain and mental structure.

  7. Complex network theory for the identification and assessment of candidate protein targets.

    PubMed

    McGarry, Ken; McDonald, Sharon

    2018-06-01

    In this work we use complex network theory to provide a statistical model of the connectivity patterns of human proteins and their interaction partners. Our intention is to identify important proteins that may be predisposed to be potential candidates as drug targets for therapeutic interventions. Target proteins usually have more interaction partners than non-target proteins, but there are no hard-and-fast rules for defining the actual number of interactions. We devise a statistical measure for identifying hub proteins and score our target proteins with gene ontology annotations. The important druggable protein targets are likely to have similar biological functions that can be assessed for their potential therapeutic value. Our system provides a statistical analysis of the local and distant neighborhood protein interactions of the potential targets using complex network measures. This approach builds a more accurate model of drug-to-target activity and therefore the likely impact on treating diseases. We integrate high quality protein interaction data from the HINT database and disease associated proteins from the DrugTarget database. Other sources include biological knowledge from Gene Ontology and drug information from DrugBank. The problem is a very challenging one since the data is highly imbalanced between target proteins and the more numerous nontargets. We use undersampling on the training data and build Random Forest classifier models which are used to identify previously unclassified target proteins. We validate and corroborate these findings from the available literature. Copyright © 2018 Elsevier Ltd. All rights reserved.
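
    A minimal sketch of the imbalance-handling step named above (undersampling followed by a Random Forest), using placeholder features and labels rather than the HINT/DrugTarget-derived data from the paper:

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      rng = np.random.default_rng(2)
      X = rng.normal(size=(2000, 12))              # placeholder network/GO features
      y = (rng.random(2000) < 0.05).astype(int)    # ~5% targets: heavily imbalanced

      targets = np.where(y == 1)[0]
      nontargets = rng.choice(np.where(y == 0)[0], size=len(targets), replace=False)
      idx = np.concatenate([targets, nontargets])  # balanced training subset

      clf = RandomForestClassifier(n_estimators=200, random_state=0)
      clf.fit(X[idx], y[idx])
      print("fraction of proteins predicted as targets:", clf.predict(X).mean())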

  8. The ICF has made a difference to functioning and disability measurement and statistics.

    PubMed

    Madden, Rosamond H; Bundy, Anita

    2018-02-12

    Fifteen years after the publication of the International Classification of Functioning, Disability and Health (ICF), we investigated: How ICF applications align with ICF aims, contents and principles, and how the ICF has been used to improve measurement of functioning and related statistics. In a scoping review, we investigated research published 2001-2015 relating to measurement and statistics for evidence of: a change in thinking; alignment of applications with ICF specifications and philosophy; and the emergence of new knowledge. The ICF is used in diverse applications, settings and countries, with processes largely aligned with the ICF and intended to improve measurement and statistics: new national surveys, information systems and ICF-based instruments; and international efforts to improve disability data. Knowledge is growing about the components and interactions of the ICF model, the diverse effects of the environment on functioning, and the meaning and measurement of participation. The ICF provides specificity and a common language in the complex world of functioning and disability and is stimulating new thinking, new applications in measurement and statistics, and the assembling of new knowledge. Nevertheless, the field needs to mature. Identified gaps suggest ways to improve measurement and statistics to underpin policies, services and outcomes. Implications for Rehabilitation The ICF offers a conceptualization of functioning and disability that can underpin assessment and documentation in rehabilitation, with a growing body of experience to draw on for guidance. Experience with the ICF reminds practitioners to consider all the domains of participation, the effect of the environment on participation and the importance of involving clients/patients in assessment and service planning. Understanding the variability of functioning within everyday environments and designing interventions for removing barriers in various environments is a vital part of rehabilitation planning.

  9. Characterizing time series via complexity-entropy curves

    NASA Astrophysics Data System (ADS)

    Ribeiro, Haroldo V.; Jauregui, Max; Zunino, Luciano; Lenzi, Ervin K.

    2017-06-01

    The search for patterns in time series is a very common task when dealing with complex systems. This is usually accomplished by employing a complexity measure such as entropies and fractal dimensions. However, such measures usually only capture a single aspect of the system dynamics. Here, we propose a family of complexity measures for time series based on a generalization of the complexity-entropy causality plane. By replacing the Shannon entropy by a monoparametric entropy (Tsallis q entropy) and after considering the proper generalization of the statistical complexity (q complexity), we build up a parametric curve (the q -complexity-entropy curve) that is used for characterizing and classifying time series. Based on simple exact results and numerical simulations of stochastic processes, we show that these curves can distinguish among different long-range, short-range, and oscillating correlated behaviors. Also, we verify that simulated chaotic and stochastic time series can be distinguished based on whether these curves are open or closed. We further test this technique in experimental scenarios related to chaotic laser intensity, stock price, sunspot, and geomagnetic dynamics, confirming its usefulness. Finally, we prove that these curves enhance the automatic classification of time series with long-range correlations and interbeat intervals of healthy subjects and patients with heart disease.
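
    A compact sketch of the q = 1 end point of such a curve, i.e. the ordinary complexity-entropy pair built from Bandt-Pompe ordinal probabilities (normalized permutation entropy H and Jensen-Shannon statistical complexity C); replacing the Shannon quantities with the Tsallis q entropy and its generalized divergence and sweeping q would trace the q-complexity-entropy curve described above. The embedding dimension and test signal are arbitrary choices.

      import numpy as np
      from itertools import permutations

      def ordinal_probs(x, d=4):
          # Bandt-Pompe ordinal-pattern probabilities for embedding dimension d
          counts = dict.fromkeys(permutations(range(d)), 0)
          for i in range(len(x) - d + 1):
              counts[tuple(np.argsort(x[i:i + d]).tolist())] += 1
          p = np.array(list(counts.values()), dtype=float)
          return p / p.sum()

      def shannon(p):
          p = p[p > 0]
          return -np.sum(p * np.log(p))

      def complexity_entropy(p):
          n = len(p)
          u = np.full(n, 1.0 / n)                      # uniform reference
          delta = np.zeros(n); delta[0] = 1.0          # fully ordered reference
          js = lambda a, b: shannon(0.5 * (a + b)) - 0.5 * shannon(a) - 0.5 * shannon(b)
          H = shannon(p) / np.log(n)                   # normalized permutation entropy
          C = (js(p, u) / js(delta, u)) * H            # statistical complexity
          return H, C

      x = np.cumsum(np.random.default_rng(3).normal(size=5000))   # correlated noise
      print(complexity_entropy(ordinal_probs(x, d=4)))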

  10. Mathematics. Exceptional Child Education Curriculum K-12.

    ERIC Educational Resources Information Center

    Jordon, Thelma; And Others

    The mathematics curriculum provides a framework of instruction for exceptional child education in grades K-12. Content areas include: numeration, whole numbers, rational numbers, real/complex numbers, calculator literacy, measurement, geometry, statistics, functions/relations, computer literacy, and pre-algebra. The guide is organized by content…

  11. Statistical methodology for the analysis of dye-switch microarray experiments

    PubMed Central

    Mary-Huard, Tristan; Aubert, Julie; Mansouri-Attia, Nadera; Sandra, Olivier; Daudin, Jean-Jacques

    2008-01-01

    Background In individually dye-balanced microarray designs, each biological sample is hybridized on two different slides, once with Cy3 and once with Cy5. While this strategy ensures an automatic correction of the gene-specific labelling bias, it also induces dependencies between log-ratio measurements that must be taken into account in the statistical analysis. Results We present two original statistical procedures for the statistical analysis of individually balanced designs. These procedures are compared with the usual ML and REML mixed model procedures proposed in most statistical toolboxes, on both simulated and real data. Conclusion The UP procedure we propose as an alternative to usual mixed model procedures is more efficient and significantly faster to compute. This result provides some useful guidelines for the analysis of complex designs. PMID:18271965

  12. Examining Complexity across Domains: Relating Subjective and Objective Measures of Affective Environmental Scenes, Paintings and Music

    PubMed Central

    Marin, Manuela M.; Leder, Helmut

    2013-01-01

    Subjective complexity has been found to be related to hedonic measures of preference, pleasantness and beauty, but there is no consensus about the nature of this relationship in the visual and musical domains. Moreover, the affective content of stimuli has been largely neglected so far in the study of complexity but is crucial in many everyday contexts and in aesthetic experiences. We thus propose a cross-domain approach that acknowledges the multidimensional nature of complexity and that uses a wide range of objective complexity measures combined with subjective ratings. In four experiments, we employed pictures of affective environmental scenes, representational paintings, and Romantic solo and chamber music excerpts. Stimuli were pre-selected to vary in emotional content (pleasantness and arousal) and complexity (low versus high number of elements). For each set of stimuli, in a between-subjects design, ratings of familiarity, complexity, pleasantness and arousal were obtained for a presentation time of 25 s from 152 participants. In line with Berlyne’s collative-motivation model, statistical analyses controlling for familiarity revealed a positive relationship between subjective complexity and arousal, and the highest correlations were observed for musical stimuli. Evidence for a mediating role of arousal in the complexity-pleasantness relationship was demonstrated in all experiments, but was only significant for females with regard to music. The direction and strength of the linear relationship between complexity and pleasantness depended on the stimulus type and gender. For environmental scenes, the root mean square contrast measures and measures of compressed file size correlated best with subjective complexity, whereas only edge detection based on phase congruency yielded equivalent results for representational paintings. Measures of compressed file size and event density also showed positive correlations with complexity and arousal in music, which is relevant for the discussion on which aspects of complexity are domain-specific and which are domain-general. PMID:23977295
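
    Two of the objective measures named above are easy to approximate on a grayscale array; the sketch below uses a simplified reading of RMS contrast (standard deviation of normalized luminance) and the zlib-compressed buffer length as a stand-in for compressed file size. Both are illustrative proxies, not the study's exact pipeline.

      import zlib
      import numpy as np

      def rms_contrast(img):
          # one common reading of RMS contrast: SD of luminance scaled to [0, 1]
          return (img.astype(float) / 255.0).std()

      def compressed_size(img):
          # length of the zlib-compressed pixel buffer as a crude complexity proxy
          return len(zlib.compress(np.ascontiguousarray(img, dtype=np.uint8).tobytes()))

      rng = np.random.default_rng(4)
      noisy = rng.integers(0, 256, size=(128, 128))    # visually "busy" image
      flat = np.full((128, 128), 128)                  # nearly empty image
      for name, img in (("noisy", noisy), ("flat", flat)):
          print(name, round(rms_contrast(img), 3), compressed_size(img))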

  13. Examining complexity across domains: relating subjective and objective measures of affective environmental scenes, paintings and music.

    PubMed

    Marin, Manuela M; Leder, Helmut

    2013-01-01

    Subjective complexity has been found to be related to hedonic measures of preference, pleasantness and beauty, but there is no consensus about the nature of this relationship in the visual and musical domains. Moreover, the affective content of stimuli has been largely neglected so far in the study of complexity but is crucial in many everyday contexts and in aesthetic experiences. We thus propose a cross-domain approach that acknowledges the multidimensional nature of complexity and that uses a wide range of objective complexity measures combined with subjective ratings. In four experiments, we employed pictures of affective environmental scenes, representational paintings, and Romantic solo and chamber music excerpts. Stimuli were pre-selected to vary in emotional content (pleasantness and arousal) and complexity (low versus high number of elements). For each set of stimuli, in a between-subjects design, ratings of familiarity, complexity, pleasantness and arousal were obtained for a presentation time of 25 s from 152 participants. In line with Berlyne's collative-motivation model, statistical analyses controlling for familiarity revealed a positive relationship between subjective complexity and arousal, and the highest correlations were observed for musical stimuli. Evidence for a mediating role of arousal in the complexity-pleasantness relationship was demonstrated in all experiments, but was only significant for females with regard to music. The direction and strength of the linear relationship between complexity and pleasantness depended on the stimulus type and gender. For environmental scenes, the root mean square contrast measures and measures of compressed file size correlated best with subjective complexity, whereas only edge detection based on phase congruency yielded equivalent results for representational paintings. Measures of compressed file size and event density also showed positive correlations with complexity and arousal in music, which is relevant for the discussion on which aspects of complexity are domain-specific and which are domain-general.

  14. Temporal Comparisons of Internet Topology

    DTIC Science & Technology

    2014-06-01

    [Fragmentary abstract excerpts] … the CAIDA data. Our methods include analysis of graph theoretical measures as well as complex network and statistical measures that will quantify the … tool that probes the Internet for topology analysis and performance [26]. Scamper uses network diagnostic tools, such as traceroute and ping, to probe …

  15. Inhomogeneous point-process entropy: An instantaneous measure of complexity in discrete systems

    NASA Astrophysics Data System (ADS)

    Valenza, Gaetano; Citi, Luca; Scilingo, Enzo Pasquale; Barbieri, Riccardo

    2014-05-01

    Measures of entropy have been widely used to characterize complexity, particularly in physiological dynamical systems modeled in discrete time. Current approaches associate these measures to finite single values within an observation window, thus not being able to characterize the system evolution at each moment in time. Here, we propose a new definition of approximate and sample entropy based on the inhomogeneous point-process theory. The discrete time series is modeled through probability density functions, which characterize and predict the time until the next event occurs as a function of the past history. Laguerre expansions of the Wiener-Volterra autoregressive terms account for the long-term nonlinear information. As the proposed measures of entropy are instantaneously defined through probability functions, the novel indices are able to provide instantaneous tracking of the system complexity. The new measures are tested on synthetic data, as well as on real data gathered from heartbeat dynamics of healthy subjects and patients with heart failure and gait recordings from short walks of young and elderly subjects. Results show that instantaneous complexity is able to effectively track the system dynamics and is not affected by statistical noise properties.

  16. Education research: a case-based bioethics curriculum for neurology residents.

    PubMed

    Tolchin, Benjamin; Willey, Joshua Z; Prager, Kenneth

    2015-03-31

    In 2012, the American Academy of Neurology (AAN) updated and expanded its ethics curriculum into Practical Ethics in Clinical Neurology, a case-based ethics curriculum for neurologists. We piloted a case-based bioethics curriculum for neurology residents using the framework and topics recommended by the AAN, matched to clinical cases drawn from Columbia's neurologic services. Our primary outcome was residents' ability to analyze and manage ethically complex cases as measured on precurriculum and postcurriculum multiple-choice quizzes. Secondary outcomes included precurriculum and postcurriculum self-assessed comfort in discussing and managing ethically complex cases, as well as attendance at ethics discussion sessions as compared to attendance at other didactic sessions. Resident performance on quizzes improved from 75.8% to 86.7% (p = 0.02). Comfort in discussing ethically complex cases improved from 6.4 to 7.4 on a 10-point scale (p = 0.03). Comfort in managing such cases trended toward improvement but did not reach statistical significance. Attendance was significantly better at ethics discussions (73.5%) than at other didactic sessions (61.7%, p = 0.04). Our formal case-based ethics curriculum for neurology residents, based on core topics drawn from the AAN's published curricula, was successfully piloted. Our study showed a statistically significant improvement in residents' ability to analyze and manage ethically complex cases as measured by multiple-choice tests and self-assessments. © 2015 American Academy of Neurology.

  17. Pattern-Based Inverse Modeling for Characterization of Subsurface Flow Models with Complex Geologic Heterogeneity

    NASA Astrophysics Data System (ADS)

    Golmohammadi, A.; Jafarpour, B.; M Khaninezhad, M. R.

    2017-12-01

    Calibration of heterogeneous subsurface flow models leads to ill-posed nonlinear inverse problems, where too many unknown parameters are estimated from limited response measurements. When the underlying parameters form complex (non-Gaussian) structured spatial connectivity patterns, classical variogram-based geostatistical techniques cannot describe the underlying connectivity patterns. Modern pattern-based geostatistical methods that incorporate higher-order spatial statistics are more suitable for describing such complex spatial patterns. Moreover, when the underlying unknown parameters are discrete (geologic facies distribution), conventional model calibration techniques that are designed for continuous parameters cannot be applied directly. In this paper, we introduce a novel pattern-based model calibration method to reconstruct discrete and spatially complex facies distributions from dynamic flow response data. To reproduce complex connectivity patterns during model calibration, we impose a feasibility constraint to ensure that the solution follows the expected higher-order spatial statistics. For model calibration, we adopt a regularized least-squares formulation, involving data mismatch, pattern connectivity, and feasibility constraint terms. Using an alternating directions optimization algorithm, the regularized objective function is divided into a continuous model calibration problem, followed by mapping the solution onto the feasible set. The feasibility constraint to honor the expected spatial statistics is implemented using a supervised machine learning algorithm. The two steps of the model calibration formulation are repeated until the convergence criterion is met. Several numerical examples are used to evaluate the performance of the developed method.
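
    A heavily simplified toy version of the alternating idea (not the paper's supervised-learning feasibility constraint): a continuous, damped least-squares update toward the data, followed by projection onto a discrete two-facies set and relaxation toward that feasible point. The linear forward model and all sizes are invented.

      import numpy as np

      rng = np.random.default_rng(5)
      n_cells, n_obs = 50, 30
      G = rng.normal(size=(n_obs, n_cells))               # toy linear forward model
      m_true = (rng.random(n_cells) > 0.5).astype(float)  # "true" two-facies field
      d_obs = G @ m_true + rng.normal(scale=0.01, size=n_obs)

      step = 1.0 / np.linalg.norm(G, 2) ** 2              # safe gradient step size
      m = np.full(n_cells, 0.5)
      for _ in range(200):
          m = m + step * G.T @ (d_obs - G @ m)            # continuous data-fitting step
          m_feasible = (m > 0.5).astype(float)            # project onto facies set {0, 1}
          m = 0.5 * (m + m_feasible)                      # relax toward the feasible point

      misfit = np.linalg.norm(d_obs - G @ (m > 0.5).astype(float))
      print("data misfit of the projected solution:", round(misfit, 3))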

  18. A Backscatter-Lidar Forward-Operator

    NASA Astrophysics Data System (ADS)

    Geisinger, Armin; Behrendt, Andreas; Wulfmeyer, Volker; Vogel, Bernhard; Mattis, Ina; Flentje, Harald; Förstner, Jochen; Potthast, Roland

    2015-04-01

    We have developed a forward-operator which is capable of calculating virtual lidar profiles from atmospheric state simulations. The operator allows us to compare lidar measurements and model simulations based on the same measurement parameter: the lidar backscatter profile. This method simplifies qualitative comparisons and also makes quantitative comparisons possible, including statistical error quantification. Implemented into an aerosol-capable model system, the operator will act as a component to assimilate backscatter-lidar measurements. As many weather services maintain already networks of backscatter-lidars, such data are acquired already in an operational manner. To estimate and quantify errors due to missing or uncertain aerosol information, we started sensitivity studies on several scattering parameters, such as the aerosol size and the real and imaginary parts of the complex index of refraction. Furthermore, quantitative and statistical comparisons between measurements and virtual measurements are shown in this study, i.e., by applying the backscatter-lidar forward-operator to model output.

  19. Systematic sampling for suspended sediment

    Treesearch

    Robert B. Thomas

    1991-01-01

    Because of high costs or complex logistics, scientific populations cannot be measured entirely and must be sampled. Accepted scientific practice holds that sample selection be based on statistical principles to assure objectivity when estimating totals and variances. Probability sampling--obtaining samples with known probabilities--is the only method that...

  20. Generating realistic environments for cyber operations development, testing, and training

    NASA Astrophysics Data System (ADS)

    Berk, Vincent H.; Gregorio-de Souza, Ian; Murphy, John P.

    2012-06-01

    Training effective cyber operatives requires realistic network environments that incorporate the structural and social complexities representative of the real world. Network traffic generators facilitate repeatable experiments for the development, training and testing of cyber operations. However, current network traffic generators, ranging from simple load testers to complex frameworks, fail to capture the realism inherent in actual environments. In order to improve the realism of network traffic generated by these systems, it is necessary to quantitatively measure the level of realism in generated traffic with respect to the environment being mimicked. We categorize realism measures into statistical, content, and behavioral measurements, and propose various metrics that can be applied at each level to indicate how effectively the generated traffic mimics the real world.
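
    At the statistical level, one simple hedged example of such a metric is a two-sample Kolmogorov-Smirnov comparison of a traffic feature (here, inter-arrival times) between a real trace and a generated trace; the exponential placeholders below stand in for actual captures.

      import numpy as np
      from scipy.stats import ks_2samp

      rng = np.random.default_rng(6)
      real_gaps = rng.exponential(scale=0.010, size=5000)       # placeholder "real" trace
      generated_gaps = rng.exponential(scale=0.012, size=5000)  # placeholder generator output

      stat, p = ks_2samp(real_gaps, generated_gaps)
      print(f"KS statistic = {stat:.3f}, p-value = {p:.3g}")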

  1. The degree of mutual anisotropy of biological liquids polycrystalline nets as a parameter in diagnostics and differentiations of hominal inflammatory processes

    NASA Astrophysics Data System (ADS)

    Angelsky, O. V.; Ushenko, Yu. A.; Balanetska, V. O.

    2011-09-01

    To characterize the degree of consistency of the parameters of the optically uniaxial birefringent protein nets of blood plasma, a new parameter, the complex degree of mutual anisotropy, is suggested. A technique for polarization measurement of the coordinate distributions of the complex degree of mutual anisotropy of blood plasma is developed. It is shown that a statistical approach to the analysis of the complex degree of mutual anisotropy distributions of blood plasma is effective for the diagnostics and differentiation of acute inflammatory processes as well as of acute and gangrenous appendicitis.

  2. Detecting the chaotic nature in a transitional boundary layer using symbolic information-theory quantifiers.

    PubMed

    Zhang, Wen; Liu, Peiqing; Guo, Hao; Wang, Jinjun

    2017-11-01

    The permutation entropy and the statistical complexity are employed to study the boundary-layer transition induced by the surface roughness. The velocity signals measured in the transition process are analyzed with these symbolic quantifiers, as well as the complexity-entropy causality plane, and the chaotic nature of the instability fluctuations is identified. The frequency of the dominant fluctuations has been found according to the time scales corresponding to the extreme values of the symbolic quantifiers. The laminar-turbulent transition process is accompanied by the evolution in the degree of organization of the complex eddy motions, which is also characterized with the growing smaller and flatter circles in the complexity-entropy causality plane. With the help of the permutation entropy and the statistical complexity, the differences between the chaotic fluctuations detected in the experiments and the classical Tollmien-Schlichting wave are shown and discussed. It is also found that the chaotic features of the instability fluctuations can be approximated with a number of regular sine waves superimposed on the fluctuations of the undisturbed laminar boundary layer. This result is related to the physical mechanism in the generation of the instability fluctuations, which is the noise-induced chaos.

  3. Detecting the chaotic nature in a transitional boundary layer using symbolic information-theory quantifiers

    NASA Astrophysics Data System (ADS)

    Zhang, Wen; Liu, Peiqing; Guo, Hao; Wang, Jinjun

    2017-11-01

    The permutation entropy and the statistical complexity are employed to study the boundary-layer transition induced by the surface roughness. The velocity signals measured in the transition process are analyzed with these symbolic quantifiers, as well as the complexity-entropy causality plane, and the chaotic nature of the instability fluctuations is identified. The frequency of the dominant fluctuations has been found according to the time scales corresponding to the extreme values of the symbolic quantifiers. The laminar-turbulent transition process is accompanied by the evolution in the degree of organization of the complex eddy motions, which is also characterized with the growing smaller and flatter circles in the complexity-entropy causality plane. With the help of the permutation entropy and the statistical complexity, the differences between the chaotic fluctuations detected in the experiments and the classical Tollmien-Schlichting wave are shown and discussed. It is also found that the chaotic features of the instability fluctuations can be approximated with a number of regular sine waves superimposed on the fluctuations of the undisturbed laminar boundary layer. This result is related to the physical mechanism in the generation of the instability fluctuations, which is the noise-induced chaos.

  4. The role of shape complexity in the detection of closed contours.

    PubMed

    Wilder, John; Feldman, Jacob; Singh, Manish

    2016-09-01

    The detection of contours in noise has been extensively studied, but the detection of closed contours, such as the boundaries of whole objects, has received relatively little attention. Closed contours pose substantial challenges not present in the simple (open) case, because they form the outlines of whole shapes and thus take on a range of potentially important configural properties. In this paper we consider the detection of closed contours in noise as a probabilistic decision problem. Previous work on open contours suggests that contour complexity, quantified as the negative log probability (Description Length, DL) of the contour under a suitably chosen statistical model, impairs contour detectability; more complex (statistically surprising) contours are harder to detect. In this study we extended this result to closed contours, developing a suitable probabilistic model of whole shapes that gives rise to several distinct though interrelated measures of shape complexity. We asked subjects to detect either natural shapes (Exp. 1) or experimentally manipulated shapes (Exp. 2) embedded in noise fields. We found systematic effects of global shape complexity on detection performance, demonstrating how aspects of global shape and form influence the basic process of object detection. Copyright © 2015 Elsevier Ltd. All rights reserved.

  5. The Hanford Thyroid Disease Study: an alternative view of the findings.

    PubMed

    Hoffman, F Owen; Ruttenber, A James; Apostoaei, A Iulian; Carroll, Raymond J; Greenland, Sander

    2007-02-01

    The Hanford Thyroid Disease Study (HTDS) is one of the largest and most complex epidemiologic studies of the relation between environmental exposures to iodine-131 and thyroid disease. The study detected no dose-response relation using a 0.05 level for statistical significance. The results for thyroid cancer appear inconsistent with those from other studies of populations with similar exposures, and either reflect inadequate statistical power, bias, or unique relations between exposure and disease risk. In this paper, we explore these possibilities, and present evidence that the HTDS statistical power was inadequate due to complex uncertainties associated with the mathematical models and assumptions used to reconstruct individual doses. We conclude that, at the very least, the confidence intervals reported by the HTDS for thyroid cancer and other thyroid diseases are too narrow because they fail to reflect key uncertainties in the measurement-error structure. We recommend that the HTDS results be interpreted as inconclusive rather than as evidence for little or no disease risk from Hanford exposures.

  6. Constructing Noise-Invariant Representations of Sound in the Auditory Pathway

    PubMed Central

    Rabinowitz, Neil C.; Willmore, Ben D. B.; King, Andrew J.; Schnupp, Jan W. H.

    2013-01-01

    Identifying behaviorally relevant sounds in the presence of background noise is one of the most important and poorly understood challenges faced by the auditory system. An elegant solution to this problem would be for the auditory system to represent sounds in a noise-invariant fashion. Since a major effect of background noise is to alter the statistics of the sounds reaching the ear, noise-invariant representations could be promoted by neurons adapting to stimulus statistics. Here we investigated the extent of neuronal adaptation to the mean and contrast of auditory stimulation as one ascends the auditory pathway. We measured these forms of adaptation by presenting complex synthetic and natural sounds, recording neuronal responses in the inferior colliculus and primary fields of the auditory cortex of anaesthetized ferrets, and comparing these responses with a sophisticated model of the auditory nerve. We find that the strength of both forms of adaptation increases as one ascends the auditory pathway. To investigate whether this adaptation to stimulus statistics contributes to the construction of noise-invariant sound representations, we also presented complex, natural sounds embedded in stationary noise, and used a decoding approach to assess the noise tolerance of the neuronal population code. We find that the code for complex sounds in the periphery is affected more by the addition of noise than the cortical code. We also find that noise tolerance is correlated with adaptation to stimulus statistics, so that populations that show the strongest adaptation to stimulus statistics are also the most noise-tolerant. This suggests that the increase in adaptation to sound statistics from auditory nerve to midbrain to cortex is an important stage in the construction of noise-invariant sound representations in the higher auditory brain. PMID:24265596

  7. Using complexity metrics with R-R intervals and BPM heart rate measures.

    PubMed

    Wallot, Sebastian; Fusaroli, Riccardo; Tylén, Kristian; Jegindø, Else-Marie

    2013-01-01

    Lately, growing attention in the health sciences has been paid to the dynamics of heart rate as indicator of impending failures and for prognoses. Likewise, in social and cognitive sciences, heart rate is increasingly employed as a measure of arousal, emotional engagement and as a marker of interpersonal coordination. However, there is no consensus about which measurements and analytical tools are most appropriate in mapping the temporal dynamics of heart rate and quite different metrics are reported in the literature. As complexity metrics of heart rate variability depend critically on variability of the data, different choices regarding the kind of measures can have a substantial impact on the results. In this article we compare linear and non-linear statistics on two prominent types of heart beat data, beat-to-beat intervals (R-R interval) and beats-per-min (BPM). As a proof-of-concept, we employ a simple rest-exercise-rest task and show that non-linear statistics, namely fractal (DFA) and recurrence (RQA) analyses, reveal information about heart beat activity above and beyond the simple level of heart rate. Non-linear statistics unveil sustained post-exercise effects on heart rate dynamics, but their power to do so critically depends on the type of data that is employed: While R-R intervals are very susceptible to non-linear analyses, the success of non-linear methods for BPM data critically depends on their construction. Generally, "oversampled" BPM time-series can be recommended as they retain most of the information about non-linear aspects of heart beat dynamics.
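
    A compact detrended fluctuation analysis (DFA) sketch of the kind applied to R-R series above; the window sizes and the synthetic "R-R" input are illustrative only (uncorrelated noise should give an exponent near 0.5).

      import numpy as np

      def dfa_alpha(x, scales=(4, 8, 16, 32, 64)):
          y = np.cumsum(x - np.mean(x))                 # integrated profile
          F = []
          for n in scales:
              rms = []
              for k in range(len(y) // n):
                  seg = y[k * n:(k + 1) * n]
                  t = np.arange(n)
                  trend = np.polyval(np.polyfit(t, seg, 1), t)
                  rms.append(np.sqrt(np.mean((seg - trend) ** 2)))
              F.append(np.mean(rms))
          slope, _ = np.polyfit(np.log(scales), np.log(F), 1)
          return slope                                  # DFA scaling exponent alpha

      rr = np.random.default_rng(7).normal(0.8, 0.05, size=2000)   # fake R-R intervals (s)
      print("DFA alpha:", round(dfa_alpha(rr), 2))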

  8. Image statistics for surface reflectance perception.

    PubMed

    Sharan, Lavanya; Li, Yuanzhen; Motoyoshi, Isamu; Nishida, Shin'ya; Adelson, Edward H

    2008-04-01

    Human observers can distinguish the albedo of real-world surfaces even when the surfaces are viewed in isolation, contrary to the Gelb effect. We sought to measure this ability and to understand the cues that might underlie it. We took photographs of complex surfaces such as stucco and asked observers to judge their diffuse reflectance by comparing them to a physical Munsell scale. Their judgments, while imperfect, were highly correlated with the true reflectance. The judgments were also highly correlated with certain image statistics, such as moment and percentile statistics of the luminance and subband histograms. When we digitally manipulated these statistics in an image, human judgments were correspondingly altered. Moreover, linear combinations of such statistics allow a machine vision system (operating within the constrained world of single surfaces) to estimate albedo with an accuracy similar to that of human observers. Taken together, these results indicate that some simple image statistics have a strong influence on the judgment of surface reflectance.
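
    The kind of histogram statistics mentioned above are straightforward to compute on a grayscale array; the sketch below reports moments and percentiles of the luminance distribution of a synthetic surface. The link between these statistics and perceived albedo established in the study is not reproduced here.

      import numpy as np
      from scipy.stats import skew, kurtosis

      def luminance_stats(img):
          lum = img.astype(float).ravel()
          return {
              "mean": lum.mean(),
              "std": lum.std(),
              "skewness": skew(lum),
              "kurtosis": kurtosis(lum),
              "p10": np.percentile(lum, 10),
              "p90": np.percentile(lum, 90),
          }

      surface = np.random.default_rng(8).gamma(shape=2.0, scale=40.0, size=(256, 256))
      print(luminance_stats(np.clip(surface, 0, 255)))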

  9. The correlation of initial radiographic characteristics of distal radius fractures and injuries of the triangular fibrocartilage complex.

    PubMed

    Kasapinova, K; Kamiloski, V

    2016-06-01

    Our purpose was to determine the correlation of initial radiographic parameters of a distal radius fracture with an injury of the triangular fibrocartilage complex. In a prospective study, 85 patients with surgically treated distal radius fractures were included. Wrist arthroscopy was used to identify and classify triangular fibrocartilage complex lesions. The initial radial length and angulation, dorsal angulation, ulnar variance and distal radioulnar distance were measured. Wrist arthroscopy identified a triangular fibrocartilage complex lesion in 45 patients. Statistical analysis did not identify a correlation with any single radiographic parameter of the distal radius fractures with the associated triangular fibrocartilage complex injuries. The initial radiograph of a distal radius fracture does not predict a triangular fibrocartilage complex injury. III. © The Author(s) 2016.

  10. US EPA'S LANDSCAPE ECOLOGY RESEARCH: ASSESSING TRENDS FOR WETLANDS AND SURFACE WATERS USING REMOTE SENSING, GIS, AND FIELD-BASED TECHNIQUES

    EPA Science Inventory

    The US EPA, Environmental Sciences Division-Las Vegas is using a variety of geospatial and statistical modeling approaches to locate and assess the complex functions of wetland ecosystems. These assessments involve measuring landscape characteristics and change, at multiple s...

  11. Averaging Models: Parameters Estimation with the R-Average Procedure

    ERIC Educational Resources Information Center

    Vidotto, G.; Massidda, D.; Noventa, S.

    2010-01-01

    The Functional Measurement approach, proposed within the theoretical framework of Information Integration Theory (Anderson, 1981, 1982), can be a useful multi-attribute analysis tool. Compared to the majority of statistical models, the averaging model can account for interaction effects without adding complexity. The R-Average method (Vidotto &…

  12. Specifying and Refining a Complex Measurement Model.

    ERIC Educational Resources Information Center

    Levy, Roy; Mislevy, Robert J.

    This paper aims to describe a Bayesian approach to modeling and estimating cognitive models both in terms of statistical machinery and actual instrument development. Such a method taps the knowledge of experts to provide initial estimates for the probabilistic relationships among the variables in a multivariate latent variable model and refines…

  13. Mixed Effects Models for Resampled Network Statistics Improves Statistical Power to Find Differences in Multi-Subject Functional Connectivity

    PubMed Central

    Narayan, Manjari; Allen, Genevera I.

    2016-01-01

    Many complex brain disorders, such as autism spectrum disorders, exhibit a wide range of symptoms and disability. To understand how brain communication is impaired in such conditions, functional connectivity studies seek to understand individual differences in brain network structure in terms of covariates that measure symptom severity. In practice, however, functional connectivity is not observed but estimated from complex and noisy neural activity measurements. Imperfect subject network estimates can compromise subsequent efforts to detect covariate effects on network structure. We address this problem in the case of Gaussian graphical models of functional connectivity, by proposing novel two-level models that treat both subject level networks and population level covariate effects as unknown parameters. To account for imperfectly estimated subject level networks when fitting these models, we propose two related approaches—R2 based on resampling and random effects test statistics, and R3 that additionally employs random adaptive penalization. Simulation studies using realistic graph structures reveal that R2 and R3 have superior statistical power to detect covariate effects compared to existing approaches, particularly when the number of within subject observations is comparable to the size of subject networks. Using our novel models and methods to study parts of the ABIDE dataset, we find evidence of hypoconnectivity associated with symptom severity in autism spectrum disorders, in frontoparietal and limbic systems as well as in anterior and posterior cingulate cortices. PMID:27147940

  14. Statistical Image Properties in Large Subsets of Traditional Art, Bad Art, and Abstract Art

    PubMed Central

    Redies, Christoph; Brachmann, Anselm

    2017-01-01

    Several statistical image properties have been associated with large subsets of traditional visual artworks. Here, we investigate some of these properties in three categories of art that differ in artistic claim and prestige: (1) Traditional art of different cultural origin from established museums and art collections (oil paintings and graphic art of Western provenance, Islamic book illustration and Chinese paintings), (2) Bad Art from two museums that collect contemporary artworks of lesser importance (© Museum Of Bad Art [MOBA], Somerville, and Official Bad Art Museum of Art [OBAMA], Seattle), and (3) twentieth century abstract art of Western provenance from two prestigious museums (Tate Gallery and Kunstsammlung Nordrhein-Westfalen). We measured the following four statistical image properties: the fractal dimension (a measure relating to subjective complexity); self-similarity (a measure of how much the sections of an image resemble the image as a whole), 1st-order entropy of edge orientations (a measure of how uniformly different orientations are represented in an image); and 2nd-order entropy of edge orientations (a measure of how independent edge orientations are across an image). As shown previously, traditional artworks of different styles share similar values for these measures. The values for Bad Art and twentieth century abstract art show a considerable overlap with those of traditional art, but we also identified numerous examples of Bad Art and abstract art that deviate from traditional art. By measuring statistical image properties, we quantify such differences in image composition for the first time. PMID:29118692
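
    One of the four measures, the 1st-order entropy of edge orientations, can be sketched directly from image gradients; the bin count, gradient operator, and magnitude threshold below are illustrative assumptions rather than the authors' exact settings.

      import numpy as np

      def edge_orientation_entropy(img, n_bins=16, mag_thresh=1e-3):
          gy, gx = np.gradient(img.astype(float))
          mag = np.hypot(gx, gy)
          ori = np.arctan2(gy, gx)[mag > mag_thresh]    # orientations at actual edges
          hist, _ = np.histogram(ori, bins=n_bins, range=(-np.pi, np.pi))
          p = hist / hist.sum()
          p = p[p > 0]
          return -np.sum(p * np.log2(p))                # bits; maximum is log2(n_bins)

      img = np.random.default_rng(9).random((256, 256))
      print("edge-orientation entropy (bits):", round(edge_orientation_entropy(img), 2))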

  15. Statistical Image Properties in Large Subsets of Traditional Art, Bad Art, and Abstract Art.

    PubMed

    Redies, Christoph; Brachmann, Anselm

    2017-01-01

    Several statistical image properties have been associated with large subsets of traditional visual artworks. Here, we investigate some of these properties in three categories of art that differ in artistic claim and prestige: (1) Traditional art of different cultural origin from established museums and art collections (oil paintings and graphic art of Western provenance, Islamic book illustration and Chinese paintings), (2) Bad Art from two museums that collect contemporary artworks of lesser importance (© Museum Of Bad Art [MOBA], Somerville, and Official Bad Art Museum of Art [OBAMA], Seattle), and (3) twentieth century abstract art of Western provenance from two prestigious museums (Tate Gallery and Kunstsammlung Nordrhein-Westfalen). We measured the following four statistical image properties: the fractal dimension (a measure relating to subjective complexity); self-similarity (a measure of how much the sections of an image resemble the image as a whole), 1st-order entropy of edge orientations (a measure of how uniformly different orientations are represented in an image); and 2nd-order entropy of edge orientations (a measure of how independent edge orientations are across an image). As shown previously, traditional artworks of different styles share similar values for these measures. The values for Bad Art and twentieth century abstract art show a considerable overlap with those of traditional art, but we also identified numerous examples of Bad Art and abstract art that deviate from traditional art. By measuring statistical image properties, we quantify such differences in image composition for the first time.

  16. Systems Engineering Metrics: Organizational Complexity and Product Quality Modeling

    NASA Technical Reports Server (NTRS)

    Mog, Robert A.

    1997-01-01

    Innovative organizational complexity and product quality models applicable to performance metrics for NASA-MSFC's Systems Analysis and Integration Laboratory (SAIL) missions and objectives are presented. An intensive research effort focuses on the synergistic combination of stochastic process modeling, nodal and spatial decomposition techniques, organizational and computational complexity, systems science and metrics, chaos, and proprietary statistical tools for accelerated risk assessment. This is followed by the development of a preliminary model, which is uniquely applicable and robust for quantitative purposes. Exercise of the preliminary model using a generic system hierarchy and the AXAF-I architectural hierarchy is provided. The Kendall test for positive dependence provides an initial verification and validation of the model. Finally, the research and development of the innovation is revisited, prior to peer review. This research and development effort results in near-term, measurable SAIL organizational and product quality methodologies, enhanced organizational risk assessment and evolutionary modeling results, and improved statistical quantification of SAIL productivity interests.

  17. Statistical methods for thermonuclear reaction rates and nucleosynthesis simulations

    NASA Astrophysics Data System (ADS)

    Iliadis, Christian; Longland, Richard; Coc, Alain; Timmes, F. X.; Champagne, Art E.

    2015-03-01

    Rigorous statistical methods for estimating thermonuclear reaction rates and nucleosynthesis are becoming increasingly established in nuclear astrophysics. The main challenge being faced is that experimental reaction rates are highly complex quantities derived from a multitude of different measured nuclear parameters (e.g., astrophysical S-factors, resonance energies and strengths, particle and γ-ray partial widths). We discuss the application of the Monte Carlo method to two distinct, but related, questions. First, given a set of measured nuclear parameters, how can one best estimate the resulting thermonuclear reaction rates and associated uncertainties? Second, given a set of appropriate reaction rates, how can one best estimate the abundances from nucleosynthesis (i.e., reaction network) calculations? The techniques described here provide probability density functions that can be used to derive statistically meaningful reaction rates and final abundances for any desired coverage probability. Examples are given for applications to s-process neutron sources, core-collapse supernovae, classical novae, and Big Bang nucleosynthesis.
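
    In the same Monte Carlo spirit (but with invented numbers and none of the physics bookkeeping), the relative uncertainty of a single narrow-resonance rate can be propagated by sampling the resonance energy and strength and evaluating only the dimensionless shape rate ∝ ωγ exp(-11.605 E_r / T9), so that overall normalization constants cancel.

      import numpy as np

      rng = np.random.default_rng(10)
      T9 = 0.1                                   # temperature in GK
      n = 100_000
      E_r = rng.normal(0.150, 0.005, size=n)                 # resonance energy (MeV)
      wg = np.exp(rng.normal(np.log(1e-6), 0.3, size=n))     # lognormal strength (MeV)

      rate = wg * np.exp(-11.605 * E_r / T9)     # shape of the narrow-resonance rate
      lo, med, hi = np.percentile(rate, [16, 50, 84])
      print("rate/median at the 16th and 84th percentiles:", lo / med, hi / med)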

  18. Constructing networks with correlation maximization methods.

    PubMed

    Mellor, Joseph C; Wu, Jie; Delisi, Charles

    2004-01-01

    Problems of inference in systems biology are ideally reduced to formulations which can efficiently represent the features of interest. In the case of predicting gene regulation and pathway networks, an important feature which describes connected genes and proteins is the relationship between active and inactive forms, i.e. between the "on" and "off" states of the components. While not optimal at the limits of resolution, these logical relationships between discrete states can often yield good approximations of the behavior in larger complex systems, where exact representation of measurement relationships may be intractable. We explore techniques for extracting binary state variables from measurement of gene expression, and go on to describe robust measures for statistical significance and information that can be applied to many such types of data. We show how statistical strength and information are equivalent criteria in limiting cases, and demonstrate the application of these measures to simple systems of gene regulation.
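
    A minimal sketch of the discretize-then-score idea: binarize two expression profiles at their per-gene medians ("on"/"off") and quantify their dependence with mutual information. The data and the induced dependence are placeholders.

      import numpy as np
      from sklearn.metrics import mutual_info_score

      rng = np.random.default_rng(11)
      expr = rng.normal(size=(2, 200))                 # 2 genes x 200 conditions
      expr[1] += 0.8 * expr[0]                         # make gene 1 track gene 0

      states = (expr > np.median(expr, axis=1, keepdims=True)).astype(int)
      mi = mutual_info_score(states[0], states[1])     # mutual information (nats)
      print("MI between binary gene states:", round(mi, 3))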

  19. Evaluation of single and multiple Doppler lidar techniques to measure complex flow during the XPIA field campaign

    NASA Astrophysics Data System (ADS)

    Choukulkar, Aditya; Brewer, W. Alan; Sandberg, Scott P.; Weickmann, Ann; Bonin, Timothy A.; Hardesty, R. Michael; Lundquist, Julie K.; Delgado, Ruben; Valerio Iungo, G.; Ashton, Ryan; Debnath, Mithu; Bianco, Laura; Wilczak, James M.; Oncley, Steven; Wolfe, Daniel

    2017-01-01

    Accurate three-dimensional information of wind flow fields can be an important tool in not only visualizing complex flow but also understanding the underlying physical processes and improving flow modeling. However, a thorough analysis of the measurement uncertainties is required to properly interpret results. The XPIA (eXperimental Planetary boundary layer Instrumentation Assessment) field campaign conducted at the Boulder Atmospheric Observatory (BAO) in Erie, CO, from 2 March to 31 May 2015 brought together a large suite of in situ and remote sensing measurement platforms to evaluate complex flow measurement strategies. In this paper, measurement uncertainties for different single and multi-Doppler strategies using simple scan geometries (conical, vertical plane and staring) are investigated. The tradeoffs (such as time-space resolution vs. spatial coverage) among the different measurement techniques are evaluated using co-located measurements made near the BAO tower. Sensitivity of the single-/multi-Doppler measurement uncertainties to averaging period is investigated using the sonic anemometers installed on the BAO tower as the standard reference. Finally, the radiometer measurements are used to partition the measurement periods as a function of atmospheric stability to determine their effect on measurement uncertainty. It was found that with an increase in spatial coverage and measurement complexity, the uncertainty in the wind measurement also increased. For multi-Doppler techniques, the increase in uncertainty for temporally uncoordinated measurements is possibly due to requiring additional assumptions of stationarity along with horizontal homogeneity and less representative line-of-sight velocity statistics. It was also found that wind speed measurement uncertainty was lower during stable conditions compared to unstable conditions.

  20. Derivative Free Optimization of Complex Systems with the Use of Statistical Machine Learning Models

    DTIC Science & Technology

    2015-09-12

    Report AFRL-AFOSR-VA-TR-2015-0278, "Derivative Free Optimization of Complex Systems with the Use of Statistical Machine Learning Models" (PI: Katya Scheinberg; grant FA9550-11-1-0239). Only report-form fragments of the abstract survive, e.g. "... developed, which has been the focus of our research." Subject terms: optimization, Derivative-Free Optimization, Statistical Machine Learning.

  1. An example of complex modelling in dentistry using Markov chain Monte Carlo (MCMC) simulation.

    PubMed

    Helfenstein, Ulrich; Menghini, Giorgio; Steiner, Marcel; Murati, Francesca

    2002-09-01

    In the usual regression setting one regression line is computed for a whole data set. In a more complex situation, each person may be observed for example at several points in time and thus a regression line might be calculated for each person. Additional complexities, such as various forms of errors in covariables may make a straightforward statistical evaluation difficult or even impossible. During recent years methods have been developed allowing convenient analysis of problems where the data and the corresponding models show these and many other forms of complexity. The methodology makes use of a Bayesian approach and Markov chain Monte Carlo (MCMC) simulations. The methods allow the construction of increasingly elaborate models by building them up from local sub-models. The essential structure of the models can be represented visually by directed acyclic graphs (DAG). This attractive property allows communication and discussion of the essential structure and the substantial meaning of a complex model without needing algebra. After presentation of the statistical methods an example from dentistry is presented in order to demonstrate their application and use. The dataset of the example had a complex structure; each of a set of children was followed up over several years. The number of new fillings in permanent teeth had been recorded at several ages. The dependent variables were markedly different from the normal distribution and could not be transformed to normality. In addition, explanatory variables were assumed to be measured with different forms of error. Illustration of how the corresponding models can be estimated conveniently via MCMC simulation, in particular, 'Gibbs sampling', using the freely available software BUGS is presented. In addition, how the measurement error may influence the estimates of the corresponding coefficients is explored. It is demonstrated that the effect of the independent variable on the dependent variable may be markedly underestimated if the measurement error is not taken into account ('regression dilution bias'). Markov chain Monte Carlo methods may be of great value to dentists in allowing analysis of data sets which exhibit a wide range of different forms of complexity.
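
    The 'regression dilution bias' mentioned at the end is easy to reproduce in a few lines: when the covariate is observed with error and that error is ignored, the naive slope is attenuated by roughly var(x) / (var(x) + var(error)). All numbers below are illustrative.

      import numpy as np

      rng = np.random.default_rng(12)
      n, true_slope = 5000, 1.0
      x = rng.normal(size=n)                        # true covariate
      y = true_slope * x + rng.normal(scale=0.5, size=n)
      x_obs = x + rng.normal(scale=1.0, size=n)     # covariate measured with error

      naive_slope = np.polyfit(x_obs, y, 1)[0]
      attenuation = x.var() / (x.var() + 1.0)       # classical attenuation factor
      print(f"naive slope {naive_slope:.2f} vs predicted {true_slope * attenuation:.2f}")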

  2. Modeling Conditional Probabilities in Complex Educational Assessments. CSE Technical Report.

    ERIC Educational Resources Information Center

    Mislevy, Robert J.; Almond, Russell; Dibello, Lou; Jenkins, Frank; Steinberg, Linda; Yan, Duanli; Senturk, Deniz

    An active area in psychometric research is coordinated task design and statistical analysis built around cognitive models. Compared with classical test theory and item response theory, there is often less information from observed data about the measurement-model parameters. On the other hand, there is more information from the grounding…

  3. Neuronal couplings between retinal ganglion cells inferred by efficient inverse statistical physics methods

    PubMed Central

    Cocco, Simona; Leibler, Stanislas; Monasson, Rémi

    2009-01-01

    Complexity of neural systems often makes impracticable explicit measurements of all interactions between their constituents. Inverse statistical physics approaches, which infer effective couplings between neurons from their spiking activity, have been so far hindered by their computational complexity. Here, we present 2 complementary, computationally efficient inverse algorithms based on the Ising and “leaky integrate-and-fire” models. We apply those algorithms to reanalyze multielectrode recordings in the salamander retina in darkness and under random visual stimulus. We find strong positive couplings between nearby ganglion cells common to both stimuli, whereas long-range couplings appear under random stimulus only. The uncertainty on the inferred couplings due to limitations in the recordings (duration, small area covered on the retina) is discussed. Our methods will allow real-time evaluation of couplings for large assemblies of neurons. PMID:19666487
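
    One member of this family of fast inverse approaches is the naive mean-field inverse Ising estimate, in which couplings are read off the inverse covariance matrix of the binarized activity; the sketch below (with synthetic spike trains, not retinal recordings) illustrates that estimate only, not the authors' specific algorithms.

      import numpy as np

      rng = np.random.default_rng(13)
      spikes = (rng.random((5000, 20)) < 0.1).astype(float)   # time bins x neurons
      spikes[:, 1] = np.maximum(spikes[:, 1], spikes[:, 0])   # inject a correlation

      s = 2 * spikes - 1                   # map {0, 1} activity to Ising spins {-1, +1}
      C = np.cov(s, rowvar=False)
      J = -np.linalg.inv(C)                # naive mean-field coupling estimate
      np.fill_diagonal(J, 0.0)
      print("inferred coupling J[0, 1]:", round(J[0, 1], 3))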

  4. Application of higher-order cepstral techniques in problems of fetal heart signal extraction

    NASA Astrophysics Data System (ADS)

    Sabry-Rizk, Madiha; Zgallai, Walid; Hardiman, P.; O'Riordan, J.

    1996-10-01

    Recently, cepstral analysis based on second-order statistics and homomorphic filtering techniques has been used in the adaptive decomposition of overlapping (or otherwise) and noise-contaminated ECG complexes of mothers and fetuses, obtained via transabdominal surface electrodes connected to a monitoring instrument, an interface card, and a PC. Differential time delays of fetal heart beats, measured from a reference point located on the maternal complex after transformation to the cepstral domain, are first obtained; this is followed by fetal heart rate variability computations. Homomorphic filtering in the complex cepstral domain and the subsequent transformation to the time domain results in fetal complex recovery. However, three problems have been identified with second-order based cepstral techniques that needed rectification in this paper. These are (1) errors resulting from the phase unwrapping algorithms and leading to fetal complex perturbation, (2) the unavoidable conversion of noise statistics from Gaussianity to non-Gaussianity due to the highly non-linear nature of the homomorphic transform, which warrants stringent noise cancellation routines, and (3) due to the aforementioned problems in (1) and (2), the difficulty of adaptively optimizing windows to include all individual fetal complexes in the time domain based on amplitude thresholding routines in the complex cepstral domain (i.e. the task of 'zooming' in on weak fetal complexes requires more processing time). The use of a third-order based high-resolution differential cepstrum technique results in recovery of delays of the order of 120 milliseconds.
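
    As a rough illustration of the second-order machinery the abstract starts from, the sketch below computes a complex cepstrum (FFT, log magnitude plus unwrapped phase, inverse FFT) and recovers an echo delay; it is not the authors' third-order differential cepstrum, and the toy signal is a synthetic pulse-plus-echo rather than an ECG:

```python
import numpy as np

def complex_cepstrum(x):
    """Complex cepstrum: inverse FFT of (log magnitude + j * unwrapped phase)."""
    X = np.fft.fft(x)
    log_X = np.log(np.abs(X) + 1e-12) + 1j * np.unwrap(np.angle(X))
    return np.fft.ifft(log_X).real

# Toy signal: a pulse plus an attenuated echo; the echo delay shows up as a cepstral peak
delay = 120
x = np.zeros(1024)
x[0] = 1.0
x[delay] = 0.5
cep = complex_cepstrum(x)
print(np.argmax(np.abs(cep[1:400])) + 1)   # ≈ 120 samples, i.e. the echo delay
```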

  5. Approximate Entropy in the Electroencephalogram During Wake and Sleep

    PubMed Central

    Burioka, Naoto; Miyata, Masanori; Cornélissen, Germaine; Halberg, Franz; Takeshima, Takao; Kaplan, Daniel T.; Suyama, Hisashi; Endo, Masanori; Maegaki, Yoshihiro; Nomura, Takashi; Tomita, Yutaka; Nakashima, Kenji; Shimizu, Eiji

    2006-01-01

    Entropy measurement can discriminate among complex systems, including deterministic, stochastic and composite systems. We evaluated the changes of approximate entropy (ApEn) in signals of the electroencephalogram (EEG) during sleep. EEG signals were recorded from eight healthy volunteers during nightly sleep. We estimated the values of ApEn in EEG signals in each sleep stage. The ApEn values for EEG signals (mean ± SD) were 0.896 ± 0.264 during eyes-closed waking state, 0.738 ± 0.089 during Stage I, 0.615 ± 0.107 during Stage II, 0.487 ± 0.101 during Stage III, 0.397 ± 0.078 during Stage IV and 0.789 ± 0.182 during REM sleep. The ApEn values were found to differ with statistical significance among the six different stages of consciousness (ANOVA, p<0.001). ApEn of EEG was statistically significantly lower during Stage IV and higher during wake and REM sleep. We conclude that ApEn measurement can be useful to estimate sleep stages and the complexity in brain activity. PMID:15683194
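
    A minimal sketch of the approximate entropy statistic used in the study, following the standard Pincus formulation with tolerance r = 0.2 × SD and embedding dimension m = 2; the input signals here are synthetic, not EEG:

```python
import numpy as np

def approximate_entropy(x, m=2, r_factor=0.2):
    """Approximate entropy ApEn(m, r) with tolerance r = r_factor * std(x)."""
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()

    def phi(m):
        n = len(x) - m + 1
        templates = np.array([x[i:i + m] for i in range(n)])
        # Chebyshev distance between every pair of templates (self-matches included)
        dist = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        counts = (dist <= r).mean(axis=1)
        return np.log(counts).mean()

    return phi(m) - phi(m + 1)

rng = np.random.default_rng(0)
print(approximate_entropy(np.sin(np.linspace(0, 20 * np.pi, 1000))))  # regular signal: low ApEn
print(approximate_entropy(rng.normal(size=1000)))                     # noisy signal: higher ApEn
```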

  6. Inferring Master Painters' Esthetic Biases from the Statistics of Portraits

    PubMed Central

    Aleem, Hassan; Correa-Herran, Ivan; Grzywacz, Norberto M.

    2017-01-01

    The Processing Fluency Theory posits that the ease of sensory information processing in the brain facilitates esthetic pleasure. Accordingly, the theory would predict that master painters should display biases toward visual properties such as symmetry, balance, and moderate complexity. Have these biases been occurring, and if so, have painters been optimizing these properties (fluency variables)? Here, we address these questions with statistics of portrait paintings from the Early Renaissance period. To do this, we first developed different computational measures for each of the aforementioned fluency variables. Then, we measured their statistics in 153 portraits from 26 master painters, in 27 photographs of people in three controlled poses, and in 38 quickly snapped photographs of individual persons. A statistical comparison between Early Renaissance portraits and quickly snapped photographs revealed that painters showed a bias toward balance, symmetry, and moderate complexity. However, a comparison between portraits and controlled-pose photographs showed that painters did not optimize each of these properties. Instead, different painters presented biases toward different, narrow ranges of fluency variables. Further analysis suggested that the painters' individuality stemmed in part from having to resolve the tension between complexity vs. symmetry and balance. We additionally found that constraints on the use of different painting materials by distinct painters modulated these fluency variables systematically. In conclusion, the Processing Fluency Theory of Esthetic Pleasure would need expansion if we were to apply it to the history of visual art since it cannot explain the lack of optimization of each fluency variable. To expand the theory, we propose the existence of a Neuroesthetic Space, which encompasses the possible values that each of the fluency variables can reach in any given art period. We discuss the neural mechanisms of this Space and propose that it has a distributed representation in the human brain. We further propose that different artists reside in different, small sub-regions of the Space. This Neuroesthetic-Space hypothesis raises the question of how painters and their paintings evolve across art periods. PMID:28337133

  7. Complex analysis of neuronal spike trains of deep brain nuclei in patients with Parkinson's disease.

    PubMed

    Chan, Hsiao-Lung; Lin, Ming-An; Lee, Shih-Tseng; Tsai, Yu-Tai; Chao, Pei-Kuang; Wu, Tony

    2010-04-05

    Deep brain stimulation (DBS) of the subthalamic nucleus (STN) has been used to alleviate symptoms of Parkinson's disease. During image-guided stereotactic surgery, signals from microelectrode recordings are used to distinguish the STN from adjacent areas, particularly from the substantia nigra pars reticulata (SNr). Neuronal firing patterns based on interspike intervals (ISI) are commonly used. In the present study, arrival time-based measures, including Lempel-Ziv complexity and deviation-from-Poisson index were employed. Our results revealed significant differences in the arrival time-based measures among non-motor STN, motor STN and SNr and better discrimination than the ISI-based measures. The larger deviations from the Poisson process in the SNr implied less complex dynamics of neuronal discharges. If spike classification was not used, the arrival time-based measures still produced statistical differences among STN subdivisions and SNr, but the ISI-based measures only showed significant differences between motor and non-motor STN. Arrival time-based measures are less affected by spike misclassifications, and may be used as an adjunct for the identification of the STN during microelectrode targeting. Copyright 2010 Elsevier Inc. All rights reserved.
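
    A minimal sketch of the Lempel-Ziv complexity measure named above, using the classic Kaspar-Schuster parsing of a binary symbol sequence; the "spike trains" below are synthetic, and the deviation-from-Poisson index is not shown:

```python
import numpy as np

def lempel_ziv_complexity(s):
    """LZ76 complexity (Kaspar-Schuster parsing): number of distinct phrases in s."""
    n = len(s)
    c, l, i, k, k_max = 1, 1, 0, 1, 1
    while True:
        if s[i + k - 1] == s[l + k - 1]:
            k += 1
            if l + k > n:
                c += 1
                break
        else:
            k_max = max(k, k_max)
            i += 1
            if i == l:
                c += 1
                l += k_max
                if l + 1 > n:
                    break
                i, k, k_max = 0, 1, 1
            else:
                k = 1
    return c

# Example: a regular vs. a random binarized "spike train"
rng = np.random.default_rng(0)
regular = "01" * 500
random_train = "".join(str(b) for b in rng.integers(0, 2, 1000))
print(lempel_ziv_complexity(regular), lempel_ziv_complexity(random_train))
```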

  8. Integrative approaches for large-scale transcriptome-wide association studies

    PubMed Central

    Gusev, Alexander; Ko, Arthur; Shi, Huwenbo; Bhatia, Gaurav; Chung, Wonil; Penninx, Brenda W J H; Jansen, Rick; de Geus, Eco JC; Boomsma, Dorret I; Wright, Fred A; Sullivan, Patrick F; Nikkola, Elina; Alvarez, Marcus; Civelek, Mete; Lusis, Aldons J.; Lehtimäki, Terho; Raitoharju, Emma; Kähönen, Mika; Seppälä, Ilkka; Raitakari, Olli T.; Kuusisto, Johanna; Laakso, Markku; Price, Alkes L.; Pajukanta, Päivi; Pasaniuc, Bogdan

    2016-01-01

    Many genetic variants influence complex traits by modulating gene expression, thus altering the abundance levels of one or multiple proteins. Here, we introduce a powerful strategy that integrates gene expression measurements with summary association statistics from large-scale genome-wide association studies (GWAS) to identify genes whose cis-regulated expression is associated with complex traits. We leverage expression imputation to perform a transcriptome-wide association scan (TWAS) to identify significant expression-trait associations. We applied our approaches to expression data from blood and adipose tissue measured in ~3,000 individuals overall. We imputed gene expression into GWAS data from over 900,000 phenotype measurements to identify 69 novel genes significantly associated with obesity-related traits (BMI, lipids, and height). Many of the novel genes are associated with relevant phenotypes in the Hybrid Mouse Diversity Panel. Our results showcase the power of integrating genotype, gene expression and phenotype to gain insights into the genetic basis of complex traits. PMID:26854917
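
    A hedged sketch of the summary-statistics form of such an expression-trait test: a weighted combination of SNP-level GWAS z-scores, normalized by the LD structure of a reference panel. The weights, z-scores and LD matrix below are made-up numbers, and details of the published TWAS implementation may differ:

```python
import numpy as np

def twas_z(weights, snp_z, ld):
    """Expression-trait association z-score from GWAS summary statistics.

    weights : cis-eQTL weights from the gene's expression prediction model
    snp_z   : GWAS z-scores for the same SNPs
    ld      : SNP-SNP correlation (LD) matrix from a reference panel
    """
    return (weights @ snp_z) / np.sqrt(weights @ ld @ weights)

# Toy example with 3 SNPs (illustrative numbers only)
w = np.array([0.4, -0.2, 0.1])
z = np.array([3.1, -2.5, 0.8])
ld = np.array([[1.0, 0.3, 0.1],
               [0.3, 1.0, 0.2],
               [0.1, 0.2, 1.0]])
print(twas_z(w, z, ld))
```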

  9. Examining the locus of age effects on complex span tasks.

    PubMed

    McCabe, Jennifer; Hartman, Marilyn

    2003-09-01

    To investigate the locus of age effects on complex span tasks, the authors evaluated the contributions of working memory functions and processing speed. Age differences were found in measures of storage capacity, language processing speed, and lower level speed. Statistically controlling for each of these in hierarchical regressions substantially reduced, but did not eliminate, the complex span age effect. Accounting for lower level speed and storage, however, removed essentially the entire age effect, suggesting that both functions play important and independent roles. Additional evidence for the role of storage capacity was the absence of complex span age differences with span size calibrated to individual word span performance. Explanations for age differences based on inhibition and concurrent task performance were not supported.

  10. Interference in the classical probabilistic model and its representation in complex Hilbert space

    NASA Astrophysics Data System (ADS)

    Khrennikov, Andrei Yu.

    2005-10-01

    The notion of a context (complex of physical conditions, that is to say: specification of the measurement setup) is basic in this paper. We show that the main structures of quantum theory (interference of probabilities, Born's rule, complex probabilistic amplitudes, Hilbert state space, representation of observables by operators) are present already in a latent form in the classical Kolmogorov probability model. However, this model should be considered as a calculus of contextual probabilities. In our approach it is forbidden to consider abstract context-independent probabilities: “first context and only then probability”. We construct the representation of the general contextual probabilistic dynamics in the complex Hilbert space. Thus the dynamics of the wave function (in particular, Schrödinger's dynamics) can be considered as Hilbert space projections of a realistic dynamics in a “prespace”. The basic condition for representing the prespace dynamics is the law of statistical conservation of energy, that is, conservation of probabilities. In general the Hilbert space projection of the “prespace” dynamics can be nonlinear and even irreversible (but it is always unitary). Methods developed in this paper can be applied not only to quantum mechanics, but also to classical statistical mechanics. The main quantum-like structures (e.g., interference of probabilities) might be found in some models of classical statistical mechanics. Quantum-like probabilistic behavior can be demonstrated by biological systems. In particular, it was recently found in some psychological experiments.

  11. Polarization-correlation optical microscopy of anisotropic biological layers

    NASA Astrophysics Data System (ADS)

    Ushenko, A. G.; Dubolazov, A. V.; Ushenko, V. A.; Ushenko, Yu. A.; Sakhnovskiy, M. Y.; Balazyuk, V. N.; Khukhlina, O.; Viligorska, K.; Bykov, A.; Doronin, A.; Meglinski, I.

    2016-09-01

    The theoretical background of an azimuthally stable method of Jones-matrix mapping of histological sections of biopsy of myocardium tissue, on the basis of spatial frequency selection of the mechanisms of linear and circular birefringence, is presented. The diagnostic application of a new correlation parameter, the complex degree of mutual anisotropy, is analytically substantiated. The method of measuring coordinate distributions of the complex degree of mutual anisotropy with further spatial filtration of their high- and low-frequency components is developed. The interconnections of such distributions with parameters of linear and circular birefringence of myocardium tissue histological sections are found. The comparative results of measuring the coordinate distributions of the complex degree of mutual anisotropy formed by fibrillar networks of myosin fibrils of myocardium tissue in different necrotic states (death due to coronary heart disease versus acute coronary insufficiency) are shown. The values and ranges of change of the statistical parameters (moments of the 1st to 4th order) of the complex degree of mutual anisotropy coordinate distributions are studied. The objective criteria of differentiation of cause of death are determined.

  12. Approach to determine measurement uncertainty in complex nanosystems with multiparametric dependencies and multivariate output quantities

    NASA Astrophysics Data System (ADS)

    Hampel, B.; Liu, B.; Nording, F.; Ostermann, J.; Struszewski, P.; Langfahl-Klabes, J.; Bieler, M.; Bosse, H.; Güttler, B.; Lemmens, P.; Schilling, M.; Tutsch, R.

    2018-03-01

    In many cases, the determination of the measurement uncertainty of complex nanosystems provides unexpected challenges. This is particularly true for complex systems with many degrees of freedom, i.e. nanosystems with multiparametric dependencies and multivariate output quantities. The aim of this paper is to address specific questions arising during the uncertainty calculation of such systems. This includes the division of the measurement system into subsystems and the distinction between systematic and statistical influences. We demonstrate that, even if the physical systems under investigation are very different, the corresponding uncertainty calculation can always be realized in a similar manner. This is exemplarily shown in detail for two experiments, namely magnetic nanosensors and ultrafast electro-optical sampling of complex time-domain signals. For these examples the approach for uncertainty calculation following the guide to the expression of uncertainty in measurement (GUM) is explained, in which correlations between multivariate output quantities are captured. To illustrate the versatility of the proposed approach, its application to other experiments, namely nanometrological instruments for terahertz microscopy, dimensional scanning probe microscopy, and measurement of concentration of molecules using surface enhanced Raman scattering, is briefly discussed in the appendix. We believe that the proposed approach provides a simple but comprehensive orientation for uncertainty calculation in the discussed measurement scenarios and can also be applied to similar or related situations.

  13. Applied immuno-epidemiological research: an approach for integrating existing knowledge into the statistical analysis of multiple immune markers.

    PubMed

    Genser, Bernd; Fischer, Joachim E; Figueiredo, Camila A; Alcântara-Neves, Neuza; Barreto, Mauricio L; Cooper, Philip J; Amorim, Leila D; Saemann, Marcus D; Weichhart, Thomas; Rodrigues, Laura C

    2016-05-20

    Immunologists often measure several correlated immunological markers, such as concentrations of different cytokines produced by different immune cells and/or measured under different conditions, to draw insights from complex immunological mechanisms. Although there have been recent methodological efforts to improve the statistical analysis of immunological data, a framework is still needed for the simultaneous analysis of multiple, often correlated, immune markers. This framework would allow the immunologists' hypotheses about the underlying biological mechanisms to be integrated. We present an analytical approach for statistical analysis of correlated immune markers, such as those commonly collected in modern immuno-epidemiological studies. We demonstrate i) how to deal with interdependencies among multiple measurements of the same immune marker, ii) how to analyse association patterns among different markers, iii) how to aggregate different measures and/or markers to immunological summary scores, iv) how to model the inter-relationships among these scores, and v) how to use these scores in epidemiological association analyses. We illustrate the application of our approach to multiple cytokine measurements from 818 children enrolled in a large immuno-epidemiological study (SCAALA Salvador), which aimed to quantify the major immunological mechanisms underlying atopic diseases or asthma. We demonstrate how to aggregate systematically the information captured in multiple cytokine measurements to immunological summary scores aimed at reflecting the presumed underlying immunological mechanisms (Th1/Th2 balance and immune regulatory network). We show how these aggregated immune scores can be used as predictors in regression models with outcomes of immunological studies (e.g. specific IgE) and compare the results to those obtained by a traditional multivariate regression approach. The proposed analytical approach may be especially useful to quantify complex immune responses in immuno-epidemiological studies, where investigators examine the relationship among epidemiological patterns, immune response, and disease outcomes.

  14. Improving single-molecule FRET measurements by confining molecules in nanopipettes

    NASA Astrophysics Data System (ADS)

    Vogelsang, J.; Doose, S.; Sauer, M.; Tinnefeld, P.

    2007-07-01

    In recent years Fluorescence Resonance Energy Transfer (FRET) has been widely used to determine distances, observe distance dynamics, and monitor molecular binding at the single-molecule level. A basic constraint of single-molecule FRET studies is the limited distance resolution owing to low photon statistics. We demonstrate that by confining molecules in nanopipettes (50-100 nm diameter) smFRET can be measured with improved photon statistics reducing the width of FRET proximity ratio distributions (PRD). This increase in distance resolution makes it possible to reveal subpopulations and dynamics in biomolecular complexes. Our data indicate that the width of PRD is not only determined by photon statistics (shot noise) and distance distributions between the chromophores but that photoinduced dark states of the acceptor also contribute to the PRD width. Furthermore, acceptor dark states such as triplet states influence the accuracy of determined mean FRET values. In this context, we present a strategy for the correction of the shift of the mean PR that is related to triplet induced blinking of the acceptor using reference FCS measurements.

  15. Local image statistics: maximum-entropy constructions and perceptual salience

    PubMed Central

    Victor, Jonathan D.; Conte, Mary M.

    2012-01-01

    The space of visual signals is high-dimensional and natural visual images have a highly complex statistical structure. While many studies suggest that only a limited number of image statistics are used for perceptual judgments, a full understanding of visual function requires analysis not only of the impact of individual image statistics, but also, how they interact. In natural images, these statistical elements (luminance distributions, correlations of low and high order, edges, occlusions, etc.) are intermixed, and their effects are difficult to disentangle. Thus, there is a need for construction of stimuli in which one or more statistical elements are introduced in a controlled fashion, so that their individual and joint contributions can be analyzed. With this as motivation, we present algorithms to construct synthetic images in which local image statistics—including luminance distributions, pair-wise correlations, and higher-order correlations—are explicitly specified and all other statistics are determined implicitly by maximum-entropy. We then apply this approach to measure the sensitivity of the human visual system to local image statistics and to sample their interactions. PMID:22751397

  16. Statistical Features of Complex Systems ---Toward Establishing Sociological Physics---

    NASA Astrophysics Data System (ADS)

    Kobayashi, Naoki; Kuninaka, Hiroto; Wakita, Jun-ichi; Matsushita, Mitsugu

    2011-07-01

    Complex systems have recently attracted much attention, both in natural sciences and in sociological sciences. Members constituting a complex system evolve through nonlinear interactions among each other. This means that in a complex system the multiplicative experience or, so to speak, the history of each member produces its present characteristics. If attention is paid to any statistical property in any complex system, the lognormal distribution is the most natural and appropriate among the standard or "normal" statistics to overview the whole system. In fact, the lognormality emerges rather conspicuously when we examine, as familiar and typical examples of statistical aspects in complex systems, the nursing-care period for the aged, populations of prefectures and municipalities, and our body height and weight. Many other examples are found in nature and society. On the basis of these observations, we discuss the possibility of sociological physics.
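
    A small simulation illustrating the point about lognormality: when each member's size is the product of many independent multiplicative growth factors, the logarithm of size is approximately normal while the size itself is right-skewed. The growth process and parameters below are illustrative only:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Each "member" grows by many independent multiplicative factors (Gibrat-like process)
n_members, n_steps = 10000, 200
factors = rng.uniform(0.9, 1.1, size=(n_members, n_steps))
sizes = factors.prod(axis=1)

# Sizes are strongly right-skewed, whereas log(sizes) is roughly symmetric (≈ normal)
print(stats.skew(sizes), stats.skew(np.log(sizes)))
```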

  17. Three lessons for genetic toxicology from baseball analytics.

    PubMed

    Dertinger, Stephen D

    2017-07-01

    In many respects the evolution of baseball statistics mirrors advances made in the field of genetic toxicology. From its inception, baseball and statistics have been inextricably linked. Generations of players and fans have used a number of relatively simple measurements to describe team and individual player's current performance, as well as for historical record-keeping purposes. Over the years, baseball analytics has progressed in several important ways. Early advances were based on deriving more meaningful metrics from simpler forerunners. Now, technological innovations are delivering much deeper insights. Videography, radar, and other advances that include automatic player recognition capabilities provide the means to measure more complex and useful factors. Fielders' reaction times, efficiency of the route taken to reach a batted ball, and pitch-framing effectiveness come to mind. With the current availability of complex measurements from multiple data streams, multifactorial analyses occurring via machine learning algorithms have become necessary to make sense of the terabytes of data that are now being captured in every Major League Baseball game. Collectively, these advances have transformed baseball statistics from being largely descriptive in nature to serving data-driven, predictive roles. Whereas genetic toxicology has charted a somewhat parallel course, a case can be made that greater utilization of baseball's mindset and strategies would serve our scientific field well. This paper describes three useful lessons for genetic toxicology, courtesy of the field of baseball analytics: seek objective knowledge; incorporate multiple data streams; and embrace machine learning. Environ. Mol. Mutagen. 58:390-397, 2017. © 2017 Wiley Periodicals, Inc.

  18. Geo-statistical analysis of Culicoides spp. distribution and abundance in Sicily, Italy.

    PubMed

    Blanda, Valeria; Blanda, Marcellocalogero; La Russa, Francesco; Scimeca, Rossella; Scimeca, Salvatore; D'Agostino, Rosalia; Auteri, Michelangelo; Torina, Alessandra

    2018-02-01

    Biting midges belonging to Culicoides imicola, Culicoides obsoletus complex and Culicoides pulicaris complex (Diptera: Ceratopogonidae) are increasingly implicated as vectors of bluetongue virus in Palaearctic regions. Culicoides obsoletus complex includes C. obsoletus (sensu stricto), C. scoticus, C. dewulfi and C. chiopterus. Culicoides pulicaris and C. lupicaris belong to the Culicoides pulicaris complex. The aim of this study was a geo-statistical analysis of the abundance and spatial distribution of Culicoides spp. involved in bluetongue virus transmission. As part of the national bluetongue surveillance plan 7081 catches were collected in 897 Sicilian farms from 2000 to 2013. Onderstepoort-type blacklight traps were used for sample collection and each catch was analysed for the presence of Culicoides spp. and for the presence and abundance of Culicoides vector species (C. imicola, C. pulicaris / C. obsoletus complexes). A geo-statistical analysis was carried out monthly via the interpolation of measured values based on the Inverse Distance Weighted method, using a GIS tool. Raster maps were reclassified into seven classes according to the presence and abundance of Culicoides, in order to obtain suitable maps for Map Algebra operations. Sicilian provinces showing a very high abundance of Culicoides vector species were Messina (80% of the whole area), Palermo (20%) and Catania (12%). A total of 5654 farms fell within the very high risk area for bluetongue (21% of the 26,676 farms active in Sicily); of these, 3483 farms were in Messina, 1567 in Palermo and 604 in Catania. Culicoides imicola was prevalent in Palermo, C. pulicaris in Messina and C. obsoletus complex was very abundant over the whole island with the highest abundance value in Messina. Our study reports the results of a geo-statistical analysis concerning the abundance and spatial distribution of Culicoides spp. in Sicily throughout the fourteen year study. It provides useful decision support in the field of epidemiology, allowing the identification of areas to be monitored as bases for improved surveillance plans. Moreover, this knowledge can become a tool for the evaluation of virus transmission risks, especially if related to vector competence.
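
    A minimal sketch of the Inverse Distance Weighted interpolation used in the geo-statistical analysis; the trap coordinates and counts are invented, and the GIS reclassification of the raster into abundance classes is not reproduced:

```python
import numpy as np

def idw(xy_known, values, xy_query, power=2.0, eps=1e-12):
    """Inverse Distance Weighted interpolation of point measurements onto query locations."""
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
    w = 1.0 / (d + eps) ** power
    return (w * values).sum(axis=1) / w.sum(axis=1)

# Toy example: trap locations with Culicoides counts, interpolated onto a small grid
traps = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
counts = np.array([5.0, 50.0, 10.0, 200.0])
gx, gy = np.meshgrid(np.linspace(0, 1, 5), np.linspace(0, 1, 5))
grid = np.column_stack([gx.ravel(), gy.ravel()])
surface = idw(traps, counts, grid).reshape(gx.shape)
print(surface.round(1))
```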

  19. Effects of head-down bed rest on complex heart rate variability: Response to LBNP testing

    NASA Technical Reports Server (NTRS)

    Goldberger, Ary L.; Mietus, Joseph E.; Rigney, David R.; Wood, Margie L.; Fortney, Suzanne M.

    1994-01-01

    Head-down bed rest is used to model physiological changes during spaceflight. We postulated that bed rest would decrease the degree of complex physiological heart rate variability. We analyzed continuous heart rate data from digitized Holter recordings in eight healthy female volunteers (age 28-34 yr) who underwent a 13-day 6 deg head-down bed rest study with serial lower body negative pressure (LBNP) trials. Heart rate variability was measured on 4-min data sets using conventional time and frequency domain measures as well as with a new measure of signal 'complexity' (approximate entropy). Data were obtained pre-bed rest (control), during bed rest (day 4 and day 9 or 11), and 2 days post-bed rest (recovery). Tolerance to LBNP was significantly reduced on both bed rest days vs. pre-bed rest. Heart rate variability was assessed at peak LBNP. Heart rate approximate entropy was significantly decreased at day 4 and day 9 or 11, returning toward normal during recovery. Heart rate standard deviation and the ratio of high- to low-power frequency did not change significantly. We conclude that short-term bed rest is associated with a decrease in the complex variability of heart rate during LBNP testing in healthy young adult women. Measurement of heart rate complexity, using a method derived from nonlinear dynamics ('chaos theory'), may provide a sensitive marker of this loss of physiological variability, complementing conventional time and frequency domain statistical measures.

  20. Evaluation of single and multiple Doppler lidar techniques to measure complex flow during the XPIA field campaign

    DOE PAGES

    Choukulkar, Aditya; Brewer, W. Alan; Sandberg, Scott P.; ...

    2017-01-23

    Accurate three-dimensional information of wind flow fields can be an important tool in not only visualizing complex flow but also understanding the underlying physical processes and improving flow modeling. However, a thorough analysis of the measurement uncertainties is required to properly interpret results. The XPIA (eXperimental Planetary boundary layer Instrumentation Assessment) field campaign conducted at the Boulder Atmospheric Observatory (BAO) in Erie, CO, from 2 March to 31 May 2015 brought together a large suite of in situ and remote sensing measurement platforms to evaluate complex flow measurement strategies. In this paper, measurement uncertainties for different single and multi-Doppler strategies using simple scan geometries (conical, vertical plane and staring) are investigated. The tradeoffs (such as time–space resolution vs. spatial coverage) among the different measurement techniques are evaluated using co-located measurements made near the BAO tower. Sensitivity of the single-/multi-Doppler measurement uncertainties to averaging period is investigated using the sonic anemometers installed on the BAO tower as the standard reference. Finally, the radiometer measurements are used to partition the measurement periods as a function of atmospheric stability to determine their effect on measurement uncertainty. It was found that with an increase in spatial coverage and measurement complexity, the uncertainty in the wind measurement also increased. For multi-Doppler techniques, the increase in uncertainty for temporally uncoordinated measurements is possibly due to requiring additional assumptions of stationarity along with horizontal homogeneity and less representative line-of-sight velocity statistics. Lastly, it was also found that wind speed measurement uncertainty was lower during stable conditions compared to unstable conditions.

  1. Permutation entropy and statistical complexity analysis of turbulence in laboratory plasmas and the solar wind.

    PubMed

    Weck, P J; Schaffner, D A; Brown, M R; Wicks, R T

    2015-02-01

    The Bandt-Pompe permutation entropy and the Jensen-Shannon statistical complexity are used to analyze fluctuating time series of three different turbulent plasmas: the magnetohydrodynamic (MHD) turbulence in the plasma wind tunnel of the Swarthmore Spheromak Experiment (SSX), drift-wave turbulence of ion saturation current fluctuations in the edge of the Large Plasma Device (LAPD), and fully developed turbulent magnetic fluctuations of the solar wind taken from the Wind spacecraft. The entropy and complexity values are presented as coordinates on the CH plane for comparison among the different plasma environments and other fluctuation models. The solar wind is found to have the highest permutation entropy and lowest statistical complexity of the three data sets analyzed. Both laboratory data sets have larger values of statistical complexity, suggesting that these systems have fewer degrees of freedom in their fluctuations, with SSX magnetic fluctuations having slightly less complexity than the LAPD edge I(sat). The CH plane coordinates are compared to the shape and distribution of a spectral decomposition of the wave forms. These results suggest that fully developed turbulence (solar wind) occupies the lower-right region of the CH plane, and that other plasma systems considered to be turbulent have less permutation entropy and more statistical complexity. This paper presents use of this statistical analysis tool on solar wind plasma, as well as on an MHD turbulent experimental plasma.
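
    A minimal sketch of the two quantifiers used here: the Bandt-Pompe permutation entropy and a Jensen-Shannon statistical complexity (normalized entropy times the normalized Jensen-Shannon divergence from the uniform pattern distribution). The embedding dimension and test signals are illustrative, and details of the published normalization may differ:

```python
import numpy as np
from itertools import permutations
from math import factorial

def ordinal_pattern_probs(x, d=5):
    """Relative frequencies of Bandt-Pompe ordinal patterns of embedding dimension d."""
    patterns = {p: 0 for p in permutations(range(d))}
    for i in range(len(x) - d + 1):
        patterns[tuple(np.argsort(x[i:i + d]))] += 1
    p = np.array(list(patterns.values()), dtype=float)
    return p / p.sum()

def shannon(p):
    p = p[p > 0]
    return -(p * np.log(p)).sum()

def js_divergence(p, q):
    return shannon((p + q) / 2) - shannon(p) / 2 - shannon(q) / 2

def entropy_complexity(x, d=5):
    """Normalized permutation entropy H and Jensen-Shannon statistical complexity C."""
    p = ordinal_pattern_probs(x, d)
    n = factorial(d)
    uniform = np.full(n, 1.0 / n)
    h = shannon(p) / np.log(n)
    delta = np.zeros(n); delta[0] = 1.0                 # distribution maximizing JS vs. uniform
    c = h * js_divergence(p, uniform) / js_divergence(delta, uniform)
    return h, c

rng = np.random.default_rng(0)
t = np.arange(20000)
print(entropy_complexity(rng.normal(size=20000)))                            # white noise: H near 1, C near 0
print(entropy_complexity(np.sin(0.05 * t) + 0.1 * rng.normal(size=20000)))   # more structure: lower H, higher C
```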

  2. Confidence intervals and hypothesis testing for the Permutation Entropy with an application to epilepsy

    NASA Astrophysics Data System (ADS)

    Traversaro, Francisco; O. Redelico, Francisco

    2018-04-01

    In nonlinear dynamics, and to a lesser extent in other fields, a widely used measure of complexity is the Permutation Entropy. But there is still no known method to determine the accuracy of this measure. There has been little research on the statistical properties of this quantity as it is used to characterize time series. The literature describes some resampling methods for quantities used in nonlinear dynamics, such as the largest Lyapunov exponent, but these seem to fail. In this contribution, we propose a parametric bootstrap methodology using a symbolic representation of the time series to obtain the distribution of the Permutation Entropy estimator. We perform several time series simulations from well-known stochastic processes (the 1/fα noise family) and show in each case that the proposed accuracy measure is as efficient as the one obtained by the frequentist approach of repeating the experiment. The complexity of brain electrical activity, measured by the Permutation Entropy, has been extensively used in epilepsy research for detecting dynamical changes in the electroencephalogram (EEG) signal with no consideration of the variability of this complexity measure. The parametric bootstrap methodology is then applied to compare normal and pre-ictal EEG signals.
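
    A compact sketch of the idea described above, assuming the parametric bootstrap resamples ordinal-pattern counts from a multinomial fitted to the observed pattern frequencies; the authors' procedure may differ in details, and the "EEG segment" below is simulated noise:

```python
import numpy as np
from math import factorial

def pattern_counts(x, d=4):
    """Counts of Bandt-Pompe ordinal patterns (embedding dimension d)."""
    idx, counts = {}, np.zeros(factorial(d))
    for i in range(len(x) - d + 1):
        key = tuple(np.argsort(x[i:i + d]))
        counts[idx.setdefault(key, len(idx))] += 1
    return counts

def perm_entropy(counts):
    p = counts[counts > 0] / counts.sum()
    return -(p * np.log(p)).sum() / np.log(len(counts))

rng = np.random.default_rng(0)
x = rng.normal(size=5000)                        # stand-in for an EEG segment
counts = pattern_counts(x)
h_hat = perm_entropy(counts)

# Parametric bootstrap: resample pattern counts from a multinomial fitted to the data
n, p_hat = int(counts.sum()), counts / counts.sum()
boot = [perm_entropy(rng.multinomial(n, p_hat).astype(float)) for _ in range(2000)]
lo, hi = np.quantile(boot, [0.025, 0.975])
print(f"PE = {h_hat:.4f}, 95% bootstrap CI = ({lo:.4f}, {hi:.4f})")
```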

  3. Electrophysiological Measures of Resting State Functional Connectivity and Their Relationship with Working Memory Capacity in Childhood

    ERIC Educational Resources Information Center

    Barnes, Jessica J.; Woolrich, Mark W.; Baker, Kate; Colclough, Giles L.; Astle, Duncan E.

    2016-01-01

    Functional connectivity is the statistical association of neuronal activity time courses across distinct brain regions, supporting specific cognitive processes. This coordination of activity is likely to be highly important for complex aspects of cognition, such as the communication of fluctuating task goals from higher-order control regions to…

  4. Adult Literacy. Cuyahoga County Data Brief

    ERIC Educational Resources Information Center

    Center on Urban Poverty and Community Development (NJ1), 2010

    2010-01-01

    There are no direct measures of adult literacy in Cuyahoga County. Instead, this report uses estimates based on a statistical model derived from the National Survey of Adult Literacy. Adult literacy levels range from Level 1 (the most basic) to Level 5 (the most complex). People with Level 1 literacy are at a severe disadvantage in the sense that…

  5. The effects of an energy efficiency retrofit on indoor air quality.

    PubMed

    Frey, S E; Destaillats, H; Cohn, S; Ahrentzen, S; Fraser, M P

    2015-04-01

    To investigate the impacts of an energy efficiency retrofit, indoor air quality and resident health were evaluated at a low-income senior housing apartment complex in Phoenix, Arizona, before and after a green energy building renovation. Indoor and outdoor air quality sampling was carried out simultaneously with a questionnaire to characterize personal habits and general health of residents. Measured indoor formaldehyde levels before the building retrofit routinely exceeded reference exposure limits, but in the long-term follow-up sampling, indoor formaldehyde decreased for the entire study population by a statistically significant margin. Indoor PM levels were dominated by fine particles and showed a statistically significant decrease in the long-term follow-up sampling within certain resident subpopulations (i.e. residents who report smoking and residents who had lived longer at the apartment complex). © 2014 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  6. The value and cost of complexity in predictive modelling: role of tissue anisotropic conductivity and fibre tracts in neuromodulation

    NASA Astrophysics Data System (ADS)

    Salman Shahid, Syed; Bikson, Marom; Salman, Humaira; Wen, Peng; Ahfock, Tony

    2014-06-01

    Objectives. Computational methods are increasingly used to optimize transcranial direct current stimulation (tDCS) dose strategies and yet complexities of existing approaches limit their clinical access. Since predictive modelling indicates the relevance of subject/pathology based data and hence the need for subject specific modelling, the incremental clinical value of increasingly complex modelling methods must be balanced against the computational and clinical time and costs. For example, the incorporation of multiple tissue layers and measured diffusion tensor (DTI) based conductivity estimates increase model precision but at the cost of clinical and computational resources. Costs related to such complexities aggregate when considering individual optimization and the myriad of potential montages. Here, rather than considering if additional details change current-flow prediction, we consider when added complexities influence clinical decisions. Approach. Towards developing quantitative and qualitative metrics of value/cost associated with computational model complexity, we considered field distributions generated by two 4 × 1 high-definition montages (m1 = 4 × 1 HD montage with anode at C3 and m2 = 4 × 1 HD montage with anode at C1) and a single conventional (m3 = C3-Fp2) tDCS electrode montage. We evaluated statistical methods, including residual error (RE) and relative difference measure (RDM), to consider the clinical impact and utility of increased complexities, namely the influence of skull, muscle and brain anisotropic conductivities in a volume conductor model. Main results. Anisotropy modulated current-flow in a montage and region dependent manner. However, significant statistical changes, produced within montage by anisotropy, did not change qualitative peak and topographic comparisons across montages. Thus for the examples analysed, clinical decision on which dose to select would not be altered by the omission of anisotropic brain conductivity. Significance. Results illustrate the need to rationally balance the role of model complexity, such as anisotropy in detailed current flow analysis versus value in clinical dose design. However, when extending our analysis to include axonal polarization, the results provide presumably clinically meaningful information. Hence the importance of model complexity may be more relevant with cellular level predictions of neuromodulation.
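
    The RE and RDM metrics named in the abstract are standard field-comparison measures; one common convention is sketched below (relative residual norm, and the norm of the difference of unit-normalized fields as a topography measure). The "isotropic" and "anisotropic" field vectors are synthetic stand-ins, not outputs of a volume conductor model:

```python
import numpy as np

def rdm(a, b):
    """Relative difference measure between two field vectors (0 = identical topography, max 2)."""
    a, b = np.ravel(a), np.ravel(b)
    return np.linalg.norm(a / np.linalg.norm(a) - b / np.linalg.norm(b))

def residual_error(a, b):
    """Relative residual error of field b with respect to reference field a."""
    a, b = np.ravel(a), np.ravel(b)
    return np.linalg.norm(a - b) / np.linalg.norm(a)

# Toy example: an isotropic reference field vs. a slightly rescaled, perturbed field
rng = np.random.default_rng(0)
e_iso = rng.normal(size=1000)
e_aniso = 1.1 * e_iso + 0.05 * rng.normal(size=1000)
print(rdm(e_iso, e_aniso), residual_error(e_iso, e_aniso))
```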

  7. Modeling the complexity of acoustic emission during intermittent plastic deformation: Power laws and multifractal spectra

    NASA Astrophysics Data System (ADS)

    Kumar, Jagadish; Ananthakrishna, G.

    2018-01-01

    Scale-invariant power-law distributions for acoustic emission signals are ubiquitous in several plastically deforming materials. However, power-law distributions for acoustic emission energies are reported in distinctly different plastically deforming situations such as hcp and fcc single and polycrystalline samples exhibiting smooth stress-strain curves and in dilute metallic alloys exhibiting discontinuous flow. This is surprising since the underlying dislocation mechanisms in these two types of deformations are very different. So far, there have been no models that predict the power-law statistics for discontinuous flow. Furthermore, the statistics of the acoustic emission signals in jerky flow is even more complex, requiring multifractal measures for a proper characterization. There has been no model that explains the complex statistics either. Here we address the problem of statistical characterization of the acoustic emission signals associated with the three types of the Portevin-Le Chatelier bands. Following our recently proposed general framework for calculating acoustic emission, we set up a wave equation for the elastic degrees of freedom with a plastic strain rate as a source term. The energy dissipated during acoustic emission is represented by the Rayleigh-dissipation function. Using the plastic strain rate obtained from the Ananthakrishna model for the Portevin-Le Chatelier effect, we compute the acoustic emission signals associated with the three Portevin-Le Chatelier bands and the Lüders-like band. The so-calculated acoustic emission signals are used for further statistical characterization. Our results show that the model predicts power-law statistics for all the acoustic emission signals associated with the three types of Portevin-Le Chatelier bands with the exponent values increasing with increasing strain rate. The calculated multifractal spectra corresponding to the acoustic emission signals associated with the three band types have a maximum spread for the type C bands and decreasing with types B and A. We further show that the acoustic emission signals associated with Lüders-like band also exhibit a power-law distribution and multifractality.
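
    A minimal check related to the power-law statistics discussed above, using the standard continuous maximum-likelihood estimator for the exponent; the "acoustic emission energies" here are sampled from a known power law rather than generated by the Ananthakrishna model:

```python
import numpy as np

def power_law_alpha(x, x_min):
    """Maximum-likelihood exponent of a continuous power law p(x) ~ x^(-alpha) for x >= x_min."""
    x = np.asarray(x, dtype=float)
    x = x[x >= x_min]
    return 1.0 + len(x) / np.log(x / x_min).sum()

# Synthetic "acoustic emission energies" drawn from a known power law (alpha = 2.5, x_min = 1)
rng = np.random.default_rng(0)
u = rng.random(100000)
energies = (1.0 - u) ** (-1.0 / (2.5 - 1.0))   # inverse-CDF sampling
print(power_law_alpha(energies, x_min=1.0))     # ≈ 2.5
```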

  8. Modeling the complexity of acoustic emission during intermittent plastic deformation: Power laws and multifractal spectra.

    PubMed

    Kumar, Jagadish; Ananthakrishna, G

    2018-01-01

    Scale-invariant power-law distributions for acoustic emission signals are ubiquitous in several plastically deforming materials. However, power-law distributions for acoustic emission energies are reported in distinctly different plastically deforming situations such as hcp and fcc single and polycrystalline samples exhibiting smooth stress-strain curves and in dilute metallic alloys exhibiting discontinuous flow. This is surprising since the underlying dislocation mechanisms in these two types of deformations are very different. So far, there have been no models that predict the power-law statistics for discontinuous flow. Furthermore, the statistics of the acoustic emission signals in jerky flow is even more complex, requiring multifractal measures for a proper characterization. There has been no model that explains the complex statistics either. Here we address the problem of statistical characterization of the acoustic emission signals associated with the three types of the Portevin-Le Chatelier bands. Following our recently proposed general framework for calculating acoustic emission, we set up a wave equation for the elastic degrees of freedom with a plastic strain rate as a source term. The energy dissipated during acoustic emission is represented by the Rayleigh-dissipation function. Using the plastic strain rate obtained from the Ananthakrishna model for the Portevin-Le Chatelier effect, we compute the acoustic emission signals associated with the three Portevin-Le Chatelier bands and the Lüders-like band. The so-calculated acoustic emission signals are used for further statistical characterization. Our results show that the model predicts power-law statistics for all the acoustic emission signals associated with the three types of Portevin-Le Chatelier bands with the exponent values increasing with increasing strain rate. The calculated multifractal spectra corresponding to the acoustic emission signals associated with the three band types have a maximum spread for the type C bands and decreasing with types B and A. We further show that the acoustic emission signals associated with Lüders-like band also exhibit a power-law distribution and multifractality.

  9. Information and complexity measures in the interface of a metal and a superconductor

    NASA Astrophysics Data System (ADS)

    Moustakidis, Ch. C.; Panos, C. P.

    2018-06-01

    Fisher information, Shannon information entropy and Statistical Complexity are calculated for the interface of a normal metal and a superconductor, as a function of the temperature for several materials. The order parameter Ψ(r) derived from the Ginzburg-Landau theory is used as an input together with experimental values of the critical transition temperature Tc and the superconducting coherence length ξ0. Analytical expressions are obtained for information and complexity measures. Thus Tc is directly related in a simple way to disorder and complexity. An analytical relation is found between the Fisher information and the energy profile of superconductivity, i.e. the ratio of the surface free energy to the bulk free energy. We verify that a simple relation holds between the Shannon and Fisher information, i.e. a decomposition of a global information quantity (Shannon) in terms of two local ones (Fisher information), previously derived and verified for atoms and molecules by Liu et al. Finally, we find analytical expressions for generalized information measures like the Tsallis entropy and Fisher information. We conclude that the proper value of the non-extensivity parameter is q ≃ 1, in agreement with previous work using a different model, where q ≃ 1.005.
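
    For orientation, textbook forms of the quantities named above, written for a density built from the order parameter; the paper's exact conventions (normalization of Ψ, choice of complexity definition) may differ:

```latex
% Common conventions; the density is built from the Ginzburg-Landau order parameter
% and normalized over the interface region.
\begin{align}
  \rho(x) &= \frac{|\Psi(x)|^{2}}{\int |\Psi(x')|^{2}\,\mathrm{d}x'}, \\
  S &= -\int \rho(x)\,\ln \rho(x)\,\mathrm{d}x
      && \text{(Shannon information entropy)}, \\
  I &= \int \frac{\left[\rho'(x)\right]^{2}}{\rho(x)}\,\mathrm{d}x
      && \text{(Fisher information)}, \\
  C &= e^{S}\,D, \qquad D = \int \rho^{2}(x)\,\mathrm{d}x
      && \text{(statistical complexity as information content} \times \text{disequilibrium)}.
\end{align}
```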

  10. Analyzing complex networks evolution through Information Theory quantifiers

    NASA Astrophysics Data System (ADS)

    Carpi, Laura C.; Rosso, Osvaldo A.; Saco, Patricia M.; Ravetti, Martín Gómez

    2011-01-01

    A methodology to analyze dynamical changes in complex networks based on Information Theory quantifiers is proposed. The square root of the Jensen-Shannon divergence, a measure of dissimilarity between two probability distributions, and the MPR Statistical Complexity are used to quantify states in the network evolution process. Three cases are analyzed, the Watts-Strogatz model, a gene network during the progression of Alzheimer's disease and a climate network for the Tropical Pacific region to study the El Niño/Southern Oscillation (ENSO) dynamic. We find that the proposed quantifiers are able not only to capture changes in the dynamics of the processes but also to quantify and compare states in their evolution.

  11. Experimental Determination of Dynamical Lee-Yang Zeros

    NASA Astrophysics Data System (ADS)

    Brandner, Kay; Maisi, Ville F.; Pekola, Jukka P.; Garrahan, Juan P.; Flindt, Christian

    2017-05-01

    Statistical physics provides the concepts and methods to explain the phase behavior of interacting many-body systems. Investigations of Lee-Yang zeros—complex singularities of the free energy in systems of finite size—have led to a unified understanding of equilibrium phase transitions. The ideas of Lee and Yang, however, are not restricted to equilibrium phenomena. Recently, Lee-Yang zeros have been used to characterize nonequilibrium processes such as dynamical phase transitions in quantum systems after a quench or dynamic order-disorder transitions in glasses. Here, we experimentally realize a scheme for determining Lee-Yang zeros in such nonequilibrium settings. We extract the dynamical Lee-Yang zeros of a stochastic process involving Andreev tunneling between a normal-state island and two superconducting leads from measurements of the dynamical activity along a trajectory. From the short-time behavior of the Lee-Yang zeros, we predict the large-deviation statistics of the activity which is typically difficult to measure. Our method paves the way for further experiments on the statistical mechanics of many-body systems out of equilibrium.

  12. The African Origin of Complex Projectile Technology: An Analysis Using Tip Cross-Sectional Area and Perimeter

    PubMed Central

    Sisk, Matthew L.; Shea, John J.

    2011-01-01

    Despite a body of literature focusing on the functionality of modern and stylistically distinct projectile points, comparatively little attention has been paid to quantifying the functionality of the early stages of projectile use. Previous work identified a simple ballistics measure, the Tip Cross-Sectional Area, as a way of determining if a given class of stone points could have served as effective projectile armatures. Here we use this in combination with an alternate measure, the Tip Cross-Sectional Perimeter, a more accurate proxy of the force needed to penetrate a target to a lethal depth. The current study discusses this measure and uses it to analyze a collection of measurements from African Middle Stone Age pointed stone artifacts. Several point types that were rejected in previous studies are statistically indistinguishable from ethnographic projectile points using this new measure. The ramifications of this finding for a Middle Stone Age origin of complex projectile technology are discussed. PMID:21755048
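
    A minimal sketch of the two ballistics proxies, assuming the usual rhomboid cross-section conventions (TCSA as half the product of maximum width and thickness, TCSP approximated by the perimeter of a rhombus with those diagonals); the point dimensions below are illustrative, not measurements from the study:

```python
import math

def tip_cross_sectional_area(width_mm, thickness_mm):
    """TCSA for a rhomboid cross-section: half the product of maximum width and thickness."""
    return 0.5 * width_mm * thickness_mm

def tip_cross_sectional_perimeter(width_mm, thickness_mm):
    """TCSP approximated as the perimeter of a rhombus whose diagonals are width and thickness."""
    return 4.0 * math.hypot(width_mm / 2.0, thickness_mm / 2.0)

# Illustrative values only (mm); in practice these are compared against published
# thresholds for dart vs. arrow armatures.
for name, w, t in [("wide thin point", 25.0, 5.0), ("narrow thick point", 15.0, 6.0)]:
    print(name,
          round(tip_cross_sectional_area(w, t), 1), "mm^2,",
          round(tip_cross_sectional_perimeter(w, t), 1), "mm")
```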

  13. Exploratory study on a statistical method to analyse time resolved data obtained during nanomaterial exposure measurements

    NASA Astrophysics Data System (ADS)

    Clerc, F.; Njiki-Menga, G.-H.; Witschger, O.

    2013-04-01

    Most of the measurement strategies that are suggested at the international level to assess workplace exposure to nanomaterials rely on devices measuring, in real time, airborne particle concentrations (according to different metrics). Since none of the instruments used to measure aerosols can distinguish a particle of interest from the background aerosol, the statistical analysis of time resolved data requires special attention. So far, very few approaches have been used for statistical analysis in the literature. These range from simple qualitative analysis of graphs to the implementation of more complex statistical models. To date, there is still no consensus on a particular approach, and the search for an appropriate and robust method continues. In this context, this exploratory study investigates a statistical method to analyse time resolved data based on a Bayesian probabilistic approach. To investigate and illustrate the use of this statistical method, particle number concentration data from a workplace study that investigated the potential for exposure via inhalation from cleanout operations by sandpapering of a reactor producing nanocomposite thin films have been used. In this workplace study, the background issue has been addressed through the near-field and far-field approaches and several size integrated and time resolved devices have been used. The analysis of the results presented here focuses only on data obtained with two handheld condensation particle counters. While one was measuring at the source of the released particles, the other one was measuring in parallel far-field. The Bayesian probabilistic approach allows a probabilistic modelling of data series, and the observed task is modelled in the form of probability distributions. The probability distributions issuing from time resolved data obtained at the source can be compared with the probability distributions issuing from the time resolved data obtained far-field, leading to a quantitative estimate of the airborne particles released at the source when the task is performed. Beyond the results obtained, this exploratory study indicates that the analysis of the results requires specific experience in statistics.
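
    One simple way to make the near-field/far-field comparison probabilistic, assuming Poisson counts with conjugate Gamma posteriors for the two rates; this is only a sketch of the Bayesian idea, not the modelling actually used in the study, and the counts are simulated:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative particle counts per second from two condensation particle counters
near_field = rng.poisson(lam=250, size=300)     # at the source during the task
far_field = rng.poisson(lam=180, size=300)      # simultaneous background far from the source

# Jeffreys-type Gamma(0.5, 0) prior on each Poisson rate -> Gamma posterior (shape = sum + 0.5, rate = n)
post_near = rng.gamma(shape=near_field.sum() + 0.5, scale=1.0 / len(near_field), size=20000)
post_far = rng.gamma(shape=far_field.sum() + 0.5, scale=1.0 / len(far_field), size=20000)

released = post_near - post_far                  # posterior for the rate attributable to the task
print(np.quantile(released, [0.025, 0.5, 0.975]))
```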

  14. Right-Sizing Statistical Models for Longitudinal Data

    PubMed Central

    Wood, Phillip K.; Steinley, Douglas; Jackson, Kristina M.

    2015-01-01

    Arguments are proposed that researchers using longitudinal data should consider more and less complex statistical model alternatives to their initially chosen techniques in an effort to “right-size” the model to the data at hand. Such model comparisons may alert researchers who use poorly fitting overly parsimonious models to more complex better fitting alternatives, and, alternatively, may identify more parsimonious alternatives to overly complex (and perhaps empirically under-identified and/or less powerful) statistical models. A general framework is proposed for considering (often nested) relationships between a variety of psychometric and growth curve models. A three-step approach is proposed in which models are evaluated based on the number and patterning of variance components prior to selection of better-fitting growth models that explain both mean and variation/covariation patterns. The orthogonal, free-curve slope-intercept (FCSI) growth model is considered as a general model which includes, as special cases, many models including the Factor Mean model (FM, McArdle & Epstein, 1987), McDonald's (1967) linearly constrained factor model, Hierarchical Linear Models (HLM), Repeated Measures MANOVA, and the Linear Slope Intercept (LinearSI) Growth Model. The FCSI model, in turn, is nested within the Tuckerized factor model. The approach is illustrated by comparing alternative models in a longitudinal study of children's vocabulary and by comparison of several candidate parametric growth and chronometric models in a Monte Carlo study. PMID:26237507

  15. Right-sizing statistical models for longitudinal data.

    PubMed

    Wood, Phillip K; Steinley, Douglas; Jackson, Kristina M

    2015-12-01

    Arguments are proposed that researchers using longitudinal data should consider more and less complex statistical model alternatives to their initially chosen techniques in an effort to "right-size" the model to the data at hand. Such model comparisons may alert researchers who use poorly fitting, overly parsimonious models to more complex, better-fitting alternatives and, alternatively, may identify more parsimonious alternatives to overly complex (and perhaps empirically underidentified and/or less powerful) statistical models. A general framework is proposed for considering (often nested) relationships between a variety of psychometric and growth curve models. A 3-step approach is proposed in which models are evaluated based on the number and patterning of variance components prior to selection of better-fitting growth models that explain both mean and variation-covariation patterns. The orthogonal free curve slope intercept (FCSI) growth model is considered a general model that includes, as special cases, many models, including the factor mean (FM) model (McArdle & Epstein, 1987), McDonald's (1967) linearly constrained factor model, hierarchical linear models (HLMs), repeated-measures multivariate analysis of variance (MANOVA), and the linear slope intercept (linearSI) growth model. The FCSI model, in turn, is nested within the Tuckerized factor model. The approach is illustrated by comparing alternative models in a longitudinal study of children's vocabulary and by comparing several candidate parametric growth and chronometric models in a Monte Carlo study. (c) 2015 APA, all rights reserved).

  16. The Effect of Electroencephalogram (EEG) Reference Choice on Information-Theoretic Measures of the Complexity and Integration of EEG Signals

    PubMed Central

    Trujillo, Logan T.; Stanfield, Candice T.; Vela, Ruben D.

    2017-01-01

    Converging evidence suggests that human cognition and behavior emerge from functional brain networks interacting on local and global scales. We investigated two information-theoretic measures of functional brain segregation and integration—interaction complexity CI(X), and integration I(X)—as applied to electroencephalographic (EEG) signals and how these measures are affected by choice of EEG reference. CI(X) is a statistical measure of the system entropy accounted for by interactions among its elements, whereas I(X) indexes the overall deviation from statistical independence of the individual elements of a system. We recorded 72 channels of scalp EEG from human participants who sat in a wakeful resting state (interleaved counterbalanced eyes-open and eyes-closed blocks). CI(X) and I(X) of the EEG signals were computed using four different EEG references: linked-mastoids (LM) reference, average (AVG) reference, a Laplacian (LAP) “reference-free” transformation, and an infinity (INF) reference estimated via the Reference Electrode Standardization Technique (REST). Fourier-based power spectral density (PSD), a standard measure of resting state activity, was computed for comparison and as a check of data integrity and quality. We also performed dipole source modeling in order to assess the accuracy of neural source CI(X) and I(X) estimates obtained from scalp-level EEG signals. CI(X) was largest for the LAP transformation, smallest for the LM reference, and at intermediate values for the AVG and INF references. I(X) was smallest for the LAP transformation, largest for the LM reference, and at intermediate values for the AVG and INF references. Furthermore, across all references, CI(X) and I(X) reliably distinguished between resting-state conditions (larger values for eyes-open vs. eyes-closed). These findings occurred in the context of the overall expected pattern of resting state PSD. Dipole modeling showed that simulated scalp EEG-level CI(X) and I(X) reflected changes in underlying neural source dependencies, but only for higher levels of integration and with highest accuracy for the LAP transformation. Our observations suggest that the Laplacian-transformation should be preferred for the computation of scalp-level CI(X) and I(X) due to its positive impact on EEG signal quality and statistics, reduction of volume-conduction, and the higher accuracy this provides when estimating scalp-level EEG complexity and integration. PMID:28790884
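
    A minimal sketch of the integration measure I(X) (sum of marginal entropies minus joint entropy) under a Gaussian approximation, where it reduces to a log-determinant of the channel correlation matrix; interaction complexity CI(X) would additionally require averaging over subsets or bipartitions and is not shown. The channels below are simulated, not EEG:

```python
import numpy as np

def integration_gaussian(x):
    """Integration I(X) = sum of marginal entropies minus joint entropy; under a
    Gaussian approximation it equals -0.5 * log det of the correlation matrix (in nats)."""
    r = np.corrcoef(x, rowvar=False)
    sign, logdet = np.linalg.slogdet(r)
    return -0.5 * logdet

# Toy "channels": independent signals vs. signals sharing a common source
rng = np.random.default_rng(0)
independent = rng.normal(size=(10000, 8))
source = rng.normal(size=(10000, 1))
coupled = 0.7 * source + 0.7 * rng.normal(size=(10000, 8))
print(integration_gaussian(independent))   # ≈ 0 nats
print(integration_gaussian(coupled))       # > 0 nats, reflecting shared structure
```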

  17. Nonlinear digital signal processing in mental health: characterization of major depression using instantaneous entropy measures of heartbeat dynamics.

    PubMed

    Valenza, Gaetano; Garcia, Ronald G; Citi, Luca; Scilingo, Enzo P; Tomaz, Carlos A; Barbieri, Riccardo

    2015-01-01

    Nonlinear digital signal processing methods that address system complexity have provided useful computational tools for helping in the diagnosis and treatment of a wide range of pathologies. More specifically, nonlinear measures have been successful in characterizing patients with mental disorders such as Major Depression (MD). In this study, we propose the use of instantaneous measures of entropy, namely the inhomogeneous point-process approximate entropy (ipApEn) and the inhomogeneous point-process sample entropy (ipSampEn), to describe a novel characterization of MD patients undergoing affective elicitation. Because these measures are built within a nonlinear point-process model, they allow for the assessment of complexity in cardiovascular dynamics at each moment in time. Heartbeat dynamics were characterized from 48 healthy controls and 48 patients with MD while emotionally elicited through either neutral or arousing audiovisual stimuli. Experimental results coming from the arousing tasks show that ipApEn measures are able to instantaneously track heartbeat complexity as well as discern between healthy subjects and MD patients. Conversely, standard heart rate variability (HRV) analysis performed in both time and frequency domains did not show any statistical significance. We conclude that measures of entropy based on nonlinear point-process models might contribute to devising useful computational tools for care in mental health.
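
    A minimal sketch of standard (static) sample entropy; the ipSampEn used in the study is an instantaneous, point-process-model-based variant, so this only illustrates the underlying quantity, with simplified template counting and synthetic RR intervals.

```python
# Minimal sketch: standard sample entropy (SampEn) of an RR-interval series.
# Simplified template counting; m and r follow common (assumed) defaults.
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    x = np.asarray(x, dtype=float)
    r = r_factor * np.std(x)
    n = len(x)

    def match_count(length):
        templates = np.array([x[i:i + length] for i in range(n - length)])
        count = 0
        for i in range(len(templates)):
            d = np.max(np.abs(templates - templates[i]), axis=1)
            count += np.sum(d <= r) - 1          # exclude the self-match
        return count

    b = match_count(m)
    a = match_count(m + 1)
    return -np.log(a / b) if a > 0 and b > 0 else np.inf

rng = np.random.default_rng(1)
rr = 0.8 + 0.05 * rng.standard_normal(300)       # synthetic RR intervals (s)
print("SampEn =", sample_entropy(rr))
```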

  18. Use of Statistical Analyses in the Ophthalmic Literature

    PubMed Central

    Lisboa, Renato; Meira-Freitas, Daniel; Tatham, Andrew J.; Marvasti, Amir H.; Sharpsten, Lucie; Medeiros, Felipe A.

    2014-01-01

    Purpose: To identify the most commonly used statistical analyses in the ophthalmic literature and to determine the likely gain in comprehension of the literature that readers could expect if they were to sequentially add knowledge of more advanced techniques to their statistical repertoire. Design: Cross-sectional study. Methods: All articles published from January 2012 to December 2012 in Ophthalmology, American Journal of Ophthalmology and Archives of Ophthalmology were reviewed. A total of 780 peer-reviewed articles were included. Two reviewers examined each article and assigned categories to each one depending on the type of statistical analyses used. Discrepancies between reviewers were resolved by consensus. Main Outcome Measures: Total number and percentage of articles containing each category of statistical analysis were obtained. Additionally we estimated the accumulated number and percentage of articles that a reader would be expected to be able to interpret depending on their statistical repertoire. Results: Readers with little or no statistical knowledge would be expected to be able to interpret the statistical methods presented in only 20.8% of articles. In order to understand more than half (51.4%) of the articles published, readers were expected to be familiar with at least 15 different statistical methods. Knowledge of 21 categories of statistical methods was necessary to comprehend 70.9% of articles, while knowledge of more than 29 categories was necessary to comprehend more than 90% of articles. Articles in retina and glaucoma subspecialties showed a tendency for using more complex analysis when compared to cornea. Conclusions: Readers of clinical journals in ophthalmology need to have substantial knowledge of statistical methodology to understand the results of published studies in the literature. The frequency of use of complex statistical analyses also indicates that those involved in the editorial peer-review process must have sound statistical knowledge in order to critically appraise articles submitted for publication. The results of this study could provide guidance to direct the statistical learning of clinical ophthalmologists, researchers and educators involved in the design of courses for residents and medical students. PMID:24612977

  19. Statistical methods used in articles published by the Journal of Periodontal and Implant Science.

    PubMed

    Choi, Eunsil; Lyu, Jiyoung; Park, Jinyoung; Kim, Hae-Young

    2014-12-01

    The purposes of this study were to assess the trend of use of statistical methods including parametric and nonparametric methods and to evaluate the use of complex statistical methodology in recent periodontal studies. This study analyzed 123 articles published in the Journal of Periodontal & Implant Science (JPIS) between 2010 and 2014. Frequencies and percentages were calculated according to the number of statistical methods used, the type of statistical method applied, and the type of statistical software used. Most of the published articles considered (64.4%) used statistical methods. Since 2011, the percentage of JPIS articles using statistics has increased. On the basis of multiple counting, we found that the percentage of studies in JPIS using parametric methods was 61.1%. Further, complex statistical methods were applied in only 6 of the published studies (5.0%), and nonparametric statistical methods were applied in 77 of the published studies (38.9% of a total of 198 studies considered). We found an increasing trend towards the application of statistical methods and nonparametric methods in recent periodontal studies and thus, concluded that increased use of complex statistical methodology might be preferred by the researchers in the fields of study covered by JPIS.

  20. Development of a Statistical Validation Methodology for Fire Weather Indices

    Treesearch

    Brian E. Potter; Scott Goodrick; Tim Brown

    2003-01-01

    Fire managers and forecasters must have tools, such as fire indices, to summarize large amounts of complex information. These tools allow them to identify and plan for periods of elevated risk and/or wildfire potential. This need was once met using simple measures like relative humidity or maximum daily temperature (e.g., Gisborne, 1936) to describe fire weather, and...

  1. An online sleep apnea detection method based on recurrence quantification analysis.

    PubMed

    Nguyen, Hoa Dinh; Wilkins, Brek A; Cheng, Qi; Benjamin, Bruce Allen

    2014-07-01

    This paper introduces an online sleep apnea detection method based on heart rate complexity as measured by recurrence quantification analysis (RQA) statistics of heart rate variability (HRV) data. RQA statistics can capture the nonlinear dynamics of a complex cardiorespiratory system during obstructive sleep apnea. In order to obtain a more robust measurement of the nonstationarity of the cardiorespiratory system, we use several fixed-amount-of-neighbors thresholds for the recurrence plot calculation. We integrate a feature selection algorithm based on conditional mutual information to select the most informative RQA features for classification and, hence, to speed up the real-time classification process without degrading the performance of the system. Two types of binary classifiers, i.e., a support vector machine and a neural network, are used to differentiate apnea from normal sleep. A soft decision fusion rule is developed to combine the results of these classifiers in order to improve the classification performance of the whole system. Experimental results show that our proposed method achieves better classification results than the previous recurrence analysis-based approach. We also show that our method is flexible and a strong candidate for a practical, efficient sleep apnea detection system.
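
    A minimal sketch of one ingredient named above, a recurrence matrix built with a fixed amount of nearest neighbours per point rather than a fixed distance threshold; the embedding parameters, neighbour count, and data are illustrative assumptions, and the feature selection and classifier stages are not shown.

```python
# Minimal sketch: fixed-amount-of-nearest-neighbours (FAN) recurrence matrix
# from a delay-embedded RR-interval series; all parameter values are assumed.
import numpy as np

def fan_recurrence_matrix(x, dim=3, tau=2, k=15):
    x = np.asarray(x, dtype=float)
    n = len(x) - (dim - 1) * tau
    emb = np.array([x[i:i + (dim - 1) * tau + 1:tau] for i in range(n)])
    dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=2)
    rp = np.zeros((n, n), dtype=int)
    for i in range(n):
        neighbours = np.argsort(dist[i])[1:k + 1]    # skip the point itself
        rp[i, neighbours] = 1
    return rp

rng = np.random.default_rng(2)
rr = 0.8 + 0.05 * np.sin(0.3 * np.arange(400)) + 0.01 * rng.standard_normal(400)
rp = fan_recurrence_matrix(rr)
print("recurrence rate:", rp.mean())                 # fixed by k by construction
```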

  2. Seeking a fingerprint: analysis of point processes in actigraphy recording

    NASA Astrophysics Data System (ADS)

    Gudowska-Nowak, Ewa; Ochab, Jeremi K.; Oleś, Katarzyna; Beldzik, Ewa; Chialvo, Dante R.; Domagalik, Aleksandra; Fąfrowicz, Magdalena; Marek, Tadeusz; Nowak, Maciej A.; Ogińska, Halszka; Szwed, Jerzy; Tyburczyk, Jacek

    2016-05-01

    Motor activity of humans displays complex temporal fluctuations which can be characterised by scale-invariant statistics, thus demonstrating that structure and fluctuations of such kinetics remain similar over a broad range of time scales. Previous studies on humans regularly deprived of sleep or suffering from sleep disorders predicted a change in the invariant scale parameters with respect to those for healthy subjects. In this study we investigate the signal patterns from actigraphy recordings by means of characteristic measures of fractional point processes. We analyse spontaneous locomotor activity of healthy individuals recorded during a week of regular sleep and a week of chronic partial sleep deprivation. Behavioural symptoms of lack of sleep can be evaluated by analysing statistics of duration times during active and resting states, and alteration of behavioural organisation can be assessed by analysis of power laws detected in the event count distribution, distribution of waiting times between consecutive movements and detrended fluctuation analysis of recorded time series. We claim that among different measures characterising complexity of the actigraphy recordings and their variations implied by chronic sleep distress, the exponents characterising slopes of survival functions in resting states are the most effective biomarkers distinguishing between healthy and sleep-deprived groups.
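
    A minimal sketch of one of the measures named above, detrended fluctuation analysis of an activity-count series; the scale range, detrending order, and data are illustrative assumptions.

```python
# Minimal sketch: first-order detrended fluctuation analysis (DFA-1) and its
# scaling exponent for a synthetic actigraphy-like count series.
import numpy as np

def dfa(x, scales):
    y = np.cumsum(np.asarray(x, dtype=float) - np.mean(x))   # integrated profile
    fluct = []
    for s in scales:
        n_seg = len(y) // s
        f2 = []
        for seg in range(n_seg):
            window = y[seg * s:(seg + 1) * s]
            t = np.arange(s)
            trend = np.polyval(np.polyfit(t, window, 1), t)  # linear detrending
            f2.append(np.mean((window - trend) ** 2))
        fluct.append(np.sqrt(np.mean(f2)))
    alpha = np.polyfit(np.log(scales), np.log(fluct), 1)[0]  # log-log slope
    return alpha, np.array(fluct)

rng = np.random.default_rng(3)
activity = rng.poisson(5, size=10000)
alpha, _ = dfa(activity, scales=[16, 32, 64, 128, 256, 512])
print("DFA exponent:", alpha)        # ~0.5 for uncorrelated counts
```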

  3. Statistical complexity without explicit reference to underlying probabilities

    NASA Astrophysics Data System (ADS)

    Pennini, F.; Plastino, A.

    2018-06-01

    We show that extremely simple systems of a not too large number of particles can be simultaneously thermally stable and complex. To such an end, we extend the statistical complexity's notion to simple configurations of non-interacting particles, without appeal to probabilities, and discuss configurational properties.

  4. Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package

    NASA Astrophysics Data System (ADS)

    Donges, Jonathan; Heitzig, Jobst; Beronov, Boyan; Wiedermann, Marc; Runge, Jakob; Feng, Qing Yi; Tupikina, Liubov; Stolbova, Veronika; Donner, Reik; Marwan, Norbert; Dijkstra, Henk; Kurths, Jürgen

    2016-04-01

    We introduce the pyunicorn (Pythonic unified complex network and recurrence analysis toolbox) open source software package for applying and combining modern methods of data analysis and modeling from complex network theory and nonlinear time series analysis. pyunicorn is a fully object-oriented and easily parallelizable package written in the language Python. It allows for the construction of functional networks such as climate networks in climatology or functional brain networks in neuroscience representing the structure of statistical interrelationships in large data sets of time series and, subsequently, investigating this structure using advanced methods of complex network theory such as measures and models for spatial networks, networks of interacting networks, node-weighted statistics, or network surrogates. Additionally, pyunicorn provides insights into the nonlinear dynamics of complex systems as recorded in uni- and multivariate time series from a non-traditional perspective by means of recurrence quantification analysis, recurrence networks, visibility graphs, and construction of surrogate time series. The range of possible applications of the library is outlined, drawing on several examples mainly from the field of climatology. pyunicorn is available online at https://github.com/pik-copan/pyunicorn. Reference: J.F. Donges, J. Heitzig, B. Beronov, M. Wiedermann, J. Runge, Q.-Y. Feng, L. Tupikina, V. Stolbova, R.V. Donner, N. Marwan, H.A. Dijkstra, and J. Kurths, Unified functional network and nonlinear time series analysis for complex systems science: The pyunicorn package, Chaos 25, 113101 (2015), DOI: 10.1063/1.4934554, Preprint: arxiv.org:1507.01571 [physics.data-an].
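
    A minimal plain-numpy sketch of the kind of recurrence quantification that pyunicorn automates (in practice the package's own recurrence classes should be preferred; this is not its API), applied to the logistic map; threshold and embedding values are illustrative.

```python
# Minimal sketch: recurrence matrix, recurrence rate, and determinism for a
# logistic-map series; parameters are illustrative, not pyunicorn defaults.
import numpy as np

def recurrence_matrix(x, dim=2, tau=1, eps=0.1):
    n = len(x) - (dim - 1) * tau
    emb = np.array([x[i:i + (dim - 1) * tau + 1:tau] for i in range(n)])
    dist = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
    return (dist <= eps).astype(int)

def determinism(rp, l_min=2):
    """Fraction of recurrent points on diagonal lines of length >= l_min,
    excluding the main diagonal."""
    n = rp.shape[0]
    hist = {}
    for k in range(-(n - 1), n):
        if k == 0:
            continue
        run = 0
        for v in list(np.diagonal(rp, offset=k)) + [0]:
            if v:
                run += 1
            elif run:
                hist[run] = hist.get(run, 0) + 1
                run = 0
    total = sum(l * c for l, c in hist.items())
    long_lines = sum(l * c for l, c in hist.items() if l >= l_min)
    return long_lines / total if total else 0.0

x = np.empty(1000)
x[0] = 0.4
for i in range(999):
    x[i + 1] = 3.9 * x[i] * (1 - x[i])       # chaotic logistic map
rp = recurrence_matrix(x)
print("recurrence rate:", rp.mean(), " determinism:", determinism(rp))
```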

  5. Effective control of complex turbulent dynamical systems through statistical functionals.

    PubMed

    Majda, Andrew J; Qi, Di

    2017-05-30

    Turbulent dynamical systems characterized by both a high-dimensional phase space and a large number of instabilities are ubiquitous among complex systems in science and engineering, including climate, material, and neural science. Control of these complex systems is a grand challenge, for example, in mitigating the effects of climate change or safe design of technology with fully developed shear turbulence. Control of flows in the transition to turbulence, where there is a small dimension of instabilities about a basic mean state, is an important and successful discipline. In complex turbulent dynamical systems, it is impossible to track and control the large dimension of instabilities, which strongly interact and exchange energy, and new control strategies are needed. The goal of this paper is to propose an effective statistical control strategy for complex turbulent dynamical systems based on a recent statistical energy principle and statistical linear response theory. We illustrate the potential practical efficiency and verify this effective statistical control strategy on the 40D Lorenz 1996 model in forcing regimes with various types of fully turbulent dynamics with nearly one-half of the phase space unstable.
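
    A minimal sketch of the 40-dimensional Lorenz 1996 model mentioned above (the standard equations, not the paper's control strategy), integrated with a fourth-order Runge-Kutta step; the forcing and step size are common illustrative choices.

```python
# Minimal sketch: the Lorenz 1996 model dx_i/dt = (x_{i+1} - x_{i-2}) x_{i-1}
# - x_i + F with cyclic indices, integrated by classical RK4.
import numpy as np

def l96_rhs(x, forcing):
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

def integrate_l96(n=40, forcing=8.0, dt=0.01, steps=5000):
    x = forcing * np.ones(n)
    x[0] += 0.01                              # perturbation to trigger chaos
    traj = np.empty((steps, n))
    for t in range(steps):
        k1 = l96_rhs(x, forcing)
        k2 = l96_rhs(x + 0.5 * dt * k1, forcing)
        k3 = l96_rhs(x + 0.5 * dt * k2, forcing)
        k4 = l96_rhs(x + dt * k3, forcing)
        x = x + (dt / 6.0) * (k1 + 2 * k2 + 2 * k3 + k4)
        traj[t] = x
    return traj

traj = integrate_l96()
print("mean statistical energy:", 0.5 * np.mean(np.sum(traj ** 2, axis=1)))
```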

  6. FIRST RESULTS FROM THE RAPID-RESPONSE SPECTROPHOTOMETRIC CHARACTERIZATION OF NEAR-EARTH OBJECTS USING UKIRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mommert, M.; Trilling, D. E.; Petersen, E.

    2016-04-15

    Using the Wide Field Camera for the United Kingdom Infrared Telescope (UKIRT), we measure the near-infrared colors of near-Earth objects (NEOs) in order to put constraints on their taxonomic classifications. The rapid-response character of our observations allows us to observe NEOs when they are close to the Earth and bright. Here we present near-infrared color measurements of 86 NEOs, most of which were observed within a few days of their discovery, allowing us to characterize NEOs with diameters of only a few meters. Using machine-learning methods, we compare our measurements to existing asteroid spectral data and provide probabilistic taxonomic classifications for our targets. Our observations allow us to distinguish between S-complex, C/X-complex, D-type, and V-type asteroids. Our results suggest that the fraction of S-complex asteroids in the whole NEO population is lower than the fraction of ordinary chondrites in the meteorite fall statistics. Future data obtained with UKIRT will be used to investigate the significance of this discrepancy.

  7. Various complexity measures in confined hydrogen atom

    NASA Astrophysics Data System (ADS)

    Majumdar, Sangita; Mukherjee, Neetik; Roy, Amlan K.

    2017-11-01

    Several well-known statistical measures similar to the LMC and Fisher-Shannon complexity have been computed for the confined hydrogen atom in both position (r) and momentum (p) spaces. Further, a more generalized form of these quantities based on the Rényi entropy (R) is explored here. The role of the scaling parameter in the exponential part is also pursued. R is evaluated taking the orders of the entropic moments (α, β) as (2/3, 3) in r and p spaces. Detailed systematic results of these measures with respect to variation of the confinement radius rc are presented for low-lying states such as 1s-3d, 4f, and 5g. For nodal states such as 2s, 3s, and 3p, as rc progresses there appears a maximum followed by a minimum in r space for certain values of the scaling parameter. However, the corresponding p-space results lack such distinct patterns. This study reveals many other interesting features.
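
    A minimal sketch of the Rényi entropies of orders (2/3, 3) for a discretized radial density; the free-atom 1s radial weight is used only as a stand-in, since the confined-hydrogen wavefunctions are not computed here.

```python
# Minimal sketch: Rényi entropy R_alpha of a discrete probability vector,
# evaluated at the entropic-moment orders (2/3, 3) mentioned above.
import numpy as np

def renyi_entropy(p, alpha):
    p = p[p > 0]
    if np.isclose(alpha, 1.0):
        return -np.sum(p * np.log(p))              # Shannon limit
    return np.log(np.sum(p ** alpha)) / (1.0 - alpha)

r = np.linspace(0.01, 5.0, 500)
density = r ** 2 * np.exp(-2 * r)                  # free-atom 1s radial weight
p = density / density.sum()                        # normalized discrete pmf
print("R_2/3 =", renyi_entropy(p, 2 / 3), "  R_3 =", renyi_entropy(p, 3))
```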

  8. Overarching framework for data-based modelling

    NASA Astrophysics Data System (ADS)

    Schelter, Björn; Mader, Malenka; Mader, Wolfgang; Sommerlade, Linda; Platt, Bettina; Lai, Ying-Cheng; Grebogi, Celso; Thiel, Marco

    2014-02-01

    One of the main modelling paradigms for complex physical systems is the network. When estimating the network structure from measured signals, several assumptions, such as stationarity, are typically made in the estimation process. Violating these assumptions renders standard analysis techniques fruitless. We here propose a framework to estimate the network structure from measurements of arbitrary non-linear, non-stationary, stochastic processes. To this end, we propose a rigorous mathematical theory that underlies this framework. Based on this theory, we present a highly efficient algorithm and the corresponding statistics that are readily applicable to measured signals. We demonstrate its performance in a simulation study. In experiments on transitions between vigilance stages in rodents, we infer small network structures with complex, time-dependent interactions; this suggests biomarkers for such transitions, which are key to understanding and diagnosing numerous diseases such as dementia. We argue that the suggested framework combines features that other approaches to date have lacked.

  9. Characterizing Tityus discrepans scorpion venom from a fractal perspective: Venom complexity, effects of captivity, sexual dimorphism, differences among species.

    PubMed

    D'Suze, Gina; Sandoval, Moisés; Sevcik, Carlos

    2015-12-15

    A characteristic of venom elution patterns, shared with many other complex systems, is that many of their features cannot be properly described with statistical or Euclidean concepts. The understanding of such systems became possible with Mandelbrot's fractal analysis. Venom elution patterns were produced using reversed-phase high-performance liquid chromatography (HPLC) with 1 mg of venom. One reason for the lack of quantitative analyses of the sources of venom variability has been the difficulty of parametrizing the complexity of venom chromatograms. We quantify this complexity by means of an algorithm which estimates the contortedness (Q) of a waveform. Fractal analysis was used to compare venoms and to measure inter- and intra-specific venom variability. We studied variations in venom complexity derived from gender, seasonal and environmental factors, duration of captivity in the laboratory, and the technique used to milk venom. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. Ranking streamflow model performance based on Information theory metrics

    NASA Astrophysics Data System (ADS)

    Martinez, Gonzalo; Pachepsky, Yakov; Pan, Feng; Wagener, Thorsten; Nicholson, Thomas

    2016-04-01

    Accuracy-based model performance metrics do not necessarily reflect the qualitative correspondence between simulated and measured streamflow time series. The objective of this work was to determine whether information theory-based metrics can serve as a complementary tool for hydrologic model evaluation and selection. We simulated 10-year streamflow time series in five watersheds located in Texas, North Carolina, Mississippi, and West Virginia. Eight models of different complexity were applied. The information theory-based metrics were obtained after representing each time series as a string of symbols drawn from a fixed alphabet, where different symbols corresponded to different quantiles of the probability distribution of streamflow. Three metrics were computed for those strings: mean information gain, which measures the randomness of the signal; effective measure complexity, which characterizes predictability; and fluctuation complexity, which characterizes the presence of a pattern in the signal. The observed streamflow time series had smaller information content and larger complexity metrics than the precipitation time series; streamflow was less random and more complex than precipitation, reflecting the fact that the watershed acts as an information filter in the hydrologic conversion from precipitation to streamflow. The Nash-Sutcliffe efficiency increased with model complexity, but in many cases several models had efficiency values that were not statistically distinguishable from each other. In such cases, ranking models by the closeness of the information theory-based parameters of simulated and measured streamflow time series can provide an additional criterion for the evaluation of hydrologic model performance.
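
    A minimal sketch (one-symbol memory, assumed alphabet size) of the quantile symbolization and the mean information gain named above, applied to a synthetic streamflow-like series; the other two metrics are not shown.

```python
# Minimal sketch: symbolize a series by quantiles and compute the mean
# information gain H(s_t, s_{t+1}) - H(s_t), i.e. the average information
# gained from the next symbol; alphabet size and data are illustrative.
import numpy as np

def symbolize(x, n_symbols=4):
    cuts = np.quantile(x, np.linspace(0, 1, n_symbols + 1)[1:-1])
    return np.digitize(x, cuts)

def entropy_bits(counts):
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mean_information_gain(s, n_symbols=4):
    h1 = entropy_bits(np.bincount(s, minlength=n_symbols))
    pairs = s[:-1] * n_symbols + s[1:]
    h2 = entropy_bits(np.bincount(pairs, minlength=n_symbols ** 2))
    return h2 - h1

rng = np.random.default_rng(4)
flow = np.cumsum(rng.standard_normal(3650)) + 50   # autocorrelated stand-in
print("mean information gain (bits/symbol):",
      mean_information_gain(symbolize(flow)))
```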

  11. Sparse approximation of currents for statistics on curves and surfaces.

    PubMed

    Durrleman, Stanley; Pennec, Xavier; Trouvé, Alain; Ayache, Nicholas

    2008-01-01

    Computing, processing, and visualizing statistics on shapes like curves or surfaces is a real challenge, with many applications ranging from medical image analysis to computational geometry. Modelling such geometrical primitives with currents avoids both feature-based approaches and point-correspondence methods. This framework has proved powerful for registering brain surfaces and for measuring geometrical invariants. However, while state-of-the-art methods perform pairwise registrations efficiently, new numerical schemes are required to process groupwise statistics, due to the increasing complexity as the size of the database grows. Statistics such as the mean and principal modes of a set of shapes often have a heavy and highly redundant representation. We therefore propose to find an adapted basis on which the mean and principal modes have a sparse decomposition. Besides the computational improvement, this sparse representation offers a way to visualize and interpret statistics on currents. Experiments show the relevance of the approach on 34 sets of 70 sulcal lines and on 50 sets of 10 meshes of deep brain structures.

  12. Quantifying and reducing statistical uncertainty in sample-based health program costing studies in low- and middle-income countries.

    PubMed

    Rivera-Rodriguez, Claudia L; Resch, Stephen; Haneuse, Sebastien

    2018-01-01

    In many low- and middle-income countries, the costs of delivering public health programs such as for HIV/AIDS, nutrition, and immunization are not routinely tracked. A number of recent studies have sought to estimate program costs on the basis of detailed information collected on a subsample of facilities. While unbiased estimates can be obtained via accurate measurement and appropriate analyses, they are subject to statistical uncertainty. Quantification of this uncertainty, for example, via standard errors and/or 95% confidence intervals, provides important contextual information for decision-makers and for the design of future costing studies. While other forms of uncertainty, such as that due to model misspecification, are considered and can be investigated through sensitivity analyses, statistical uncertainty is often not reported in studies estimating the total program costs. This may be due to a lack of awareness/understanding of (1) the technical details regarding uncertainty estimation and (2) the availability of software with which to calculate uncertainty for estimators resulting from complex surveys. We provide an overview of statistical uncertainty in the context of complex costing surveys, emphasizing the various potential specific sources that contribute to overall uncertainty. We describe how analysts can compute measures of uncertainty, either via appropriately derived formulae or through resampling techniques such as the bootstrap. We also provide an overview of calibration as a means of using additional auxiliary information that is readily available for the entire program, such as the total number of doses administered, to decrease uncertainty and thereby improve decision-making and the planning of future studies. A recent study of the national program for routine immunization in Honduras shows that uncertainty can be reduced by using information available prior to the study. This method can not only be used when estimating the total cost of delivering established health programs but also to decrease uncertainty when the interest lies in assessing the incremental effect of an intervention. Measures of statistical uncertainty associated with survey-based estimates of program costs, such as standard errors and 95% confidence intervals, provide important contextual information for health policy decision-making and key inputs for the design of future costing studies. Such measures are often not reported, possibly because of technical challenges associated with their calculation and a lack of awareness of appropriate software. Modern statistical analysis methods for survey data, such as calibration, provide a means to exploit additional information that is readily available but was not used in the design of the study to significantly improve the estimation of total cost through the reduction of statistical uncertainty.
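
    A minimal sketch (synthetic numbers, simple random sampling assumed rather than the complex design or calibration discussed above) of how a bootstrap can attach a standard error and confidence interval to a scaled-up total-cost estimate.

```python
# Minimal sketch: bootstrap standard error and 95% CI for a total program cost
# estimated by scaling up a facility sample; all figures are synthetic.
import numpy as np

rng = np.random.default_rng(10)
sampled_costs = rng.gamma(shape=2.0, scale=15000.0, size=40)   # 40 facilities
n_total_facilities = 400
scale = n_total_facilities / len(sampled_costs)

point_estimate = scale * sampled_costs.sum()
boot_totals = np.array([
    scale * rng.choice(sampled_costs, size=len(sampled_costs), replace=True).sum()
    for _ in range(5000)
])
se = boot_totals.std()
ci_low, ci_high = np.percentile(boot_totals, [2.5, 97.5])
print(f"total cost: {point_estimate:,.0f}, SE: {se:,.0f}, "
      f"95% CI: ({ci_low:,.0f}, {ci_high:,.0f})")
```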

  13. Quantifying and reducing statistical uncertainty in sample-based health program costing studies in low- and middle-income countries

    PubMed Central

    Resch, Stephen

    2018-01-01

    Objectives: In many low- and middle-income countries, the costs of delivering public health programs such as for HIV/AIDS, nutrition, and immunization are not routinely tracked. A number of recent studies have sought to estimate program costs on the basis of detailed information collected on a subsample of facilities. While unbiased estimates can be obtained via accurate measurement and appropriate analyses, they are subject to statistical uncertainty. Quantification of this uncertainty, for example, via standard errors and/or 95% confidence intervals, provides important contextual information for decision-makers and for the design of future costing studies. While other forms of uncertainty, such as that due to model misspecification, are considered and can be investigated through sensitivity analyses, statistical uncertainty is often not reported in studies estimating the total program costs. This may be due to a lack of awareness/understanding of (1) the technical details regarding uncertainty estimation and (2) the availability of software with which to calculate uncertainty for estimators resulting from complex surveys. We provide an overview of statistical uncertainty in the context of complex costing surveys, emphasizing the various potential specific sources that contribute to overall uncertainty. Methods: We describe how analysts can compute measures of uncertainty, either via appropriately derived formulae or through resampling techniques such as the bootstrap. We also provide an overview of calibration as a means of using additional auxiliary information that is readily available for the entire program, such as the total number of doses administered, to decrease uncertainty and thereby improve decision-making and the planning of future studies. Results: A recent study of the national program for routine immunization in Honduras shows that uncertainty can be reduced by using information available prior to the study. This method can not only be used when estimating the total cost of delivering established health programs but also to decrease uncertainty when the interest lies in assessing the incremental effect of an intervention. Conclusion: Measures of statistical uncertainty associated with survey-based estimates of program costs, such as standard errors and 95% confidence intervals, provide important contextual information for health policy decision-making and key inputs for the design of future costing studies. Such measures are often not reported, possibly because of technical challenges associated with their calculation and a lack of awareness of appropriate software. Modern statistical analysis methods for survey data, such as calibration, provide a means to exploit additional information that is readily available but was not used in the design of the study to significantly improve the estimation of total cost through the reduction of statistical uncertainty. PMID:29636964

  14. Statistics of Optical Coherence Tomography Data From Human Retina

    PubMed Central

    de Juan, Joaquín; Ferrone, Claudia; Giannini, Daniela; Huang, David; Koch, Giorgio; Russo, Valentina; Tan, Ou; Bruni, Carlo

    2010-01-01

    Optical coherence tomography (OCT) has recently become one of the primary methods for noninvasive probing of the human retina. The pseudoimage formed by OCT (the so-called B-scan) varies probabilistically across pixels due to complexities in the measurement technique. Hence, sensitive automatic procedures of diagnosis using OCT may exploit statistical analysis of the spatial distribution of reflectance. In this paper, we perform a statistical study of retinal OCT data. We find that the stretched exponential probability density function can model well the distribution of intensities in OCT pseudoimages. Moreover, we show a small but significant correlation between neighboring pixels when measuring OCT intensities with pixels of about 5 µm. We then develop a simple joint probability model for the OCT data consistent with known retinal features. This model fits well the stretched exponential distribution of intensities and their spatial correlation. In normal retinas, the fit parameters of this model are relatively constant along retinal layers but vary across layers. However, in retinas with diabetic retinopathy, large spikes of parameter modulation interrupt the constancy within layers, exactly where pathologies are visible. We argue that these results give hope for improvement in statistical pathology-detection methods even when the disease is in its early stages. PMID:20304733
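
    A minimal sketch of fitting the stretched-exponential form exp(-(I/I0)^beta) to the empirical survival function of synthetic intensity samples; the data and starting values are illustrative, not OCT measurements.

```python
# Minimal sketch: least-squares fit of a stretched exponential to an empirical
# survival function; Weibull samples are used because their survival function
# has exactly this form.
import numpy as np
from scipy.optimize import curve_fit

def stretched_exp(intensity, scale, beta):
    return np.exp(-(intensity / scale) ** beta)

rng = np.random.default_rng(5)
samples = rng.weibull(0.7, size=20000) * 40.0      # synthetic "intensities"
sorted_i = np.sort(samples)
survival = 1.0 - np.arange(1, len(sorted_i) + 1) / len(sorted_i)
mask = survival > 1e-3                             # drop the extreme tail
params, _ = curve_fit(stretched_exp, sorted_i[mask], survival[mask],
                      p0=[30.0, 0.8])
print("scale, beta =", params)                     # ~40 and ~0.7 expected
```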

  15. Statistical inference with quantum measurements: methodologies for nitrogen vacancy centers in diamond

    NASA Astrophysics Data System (ADS)

    Hincks, Ian; Granade, Christopher; Cory, David G.

    2018-01-01

    The analysis of photon count data from the standard nitrogen vacancy (NV) measurement process is treated as a statistical inference problem. This has applications toward gaining better and more rigorous error bars for tasks such as parameter estimation (e.g. magnetometry), tomography, and randomized benchmarking. We start by providing a summary of the standard phenomenological model of the NV optical process in terms of Lindblad jump operators. This model is used to derive random variables describing emitted photons during measurement, to which finite visibility, dark counts, and imperfect state preparation are added. NV spin-state measurement is then stated as an abstract statistical inference problem consisting of an underlying biased coin obstructed by three Poisson rates. Relevant frequentist and Bayesian estimators are provided, discussed, and quantitatively compared. We show numerically that the risk of the maximum likelihood estimator is well approximated by the Cramér-Rao bound, for which we provide a simple formula. Of the estimators, we in particular promote the Bayes estimator, owing to its slightly better risk performance, and straightforward error propagation into more complex experiments. This is illustrated on experimental data, where quantum Hamiltonian learning is performed and cross-validated in a fully Bayesian setting, and compared to a more traditional weighted least squares fit.
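
    A minimal sketch (hypothetical rates, flat prior) of the kind of inference described above: a grid-based posterior for the spin-state "coin" bias p when the observed photon count is Poisson with a p-dependent rate.

```python
# Minimal sketch: posterior over the coin bias p given a total photon count
# modelled as Poisson(n_shots * (p*lam_bright + (1-p)*lam_dark + background)).
# All rate values are hypothetical placeholders.
import numpy as np
from scipy import stats

lam_bright, lam_dark, background = 120.0, 80.0, 5.0   # assumed counts per shot
p_grid = np.linspace(0.0, 1.0, 1001)
dp = p_grid[1] - p_grid[0]

def posterior(total_counts, n_shots):
    rate = n_shots * (p_grid * lam_bright + (1 - p_grid) * lam_dark + background)
    log_like = stats.poisson.logpmf(total_counts, rate)
    post = np.exp(log_like - log_like.max())            # flat prior on p
    return post / (post.sum() * dp)

post = posterior(total_counts=5210, n_shots=50)
p_mean = np.sum(p_grid * post) * dp                     # posterior-mean (Bayes) estimate
print("posterior mean of p:", round(p_mean, 3))
```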

  16. Adaptation in Coding by Large Populations of Neurons in the Retina

    NASA Astrophysics Data System (ADS)

    Ioffe, Mark L.

    A comprehensive theory of neural computation requires an understanding of the statistical properties of the neural population code. The focus of this work is the experimental study and theoretical analysis of the statistical properties of neural activity in the tiger salamander retina. This is an accessible yet complex system, for which we control the visual input and record from a substantial portion--greater than a half--of the ganglion cell population generating the spiking output. Our experiments probe adaptation of the retina to visual statistics: a central feature of sensory systems which have to adjust their limited dynamic range to a far larger space of possible inputs. In Chapter 1 we place our work in context with a brief overview of the relevant background. In Chapter 2 we describe the experimental methodology of recording from 100+ ganglion cells in the tiger salamander retina. In Chapter 3 we first present the measurements of adaptation of individual cells to changes in stimulation statistics and then investigate whether pairwise correlations in fluctuations of ganglion cell activity change across different stimulation conditions. We then transition to a study of the population-level probability distribution of the retinal response captured with maximum-entropy models. Convergence of the model inference is presented in Chapter 4. In Chapter 5 we first test the empirical presence of a phase transition in such models fitting the retinal response to different experimental conditions, and then proceed to develop other characterizations which are sensitive to complexity in the interaction matrix. This includes an analysis of the dynamics of sampling at finite temperature, which demonstrates a range of subtle attractor-like properties in the energy landscape. These are largely conserved when ambient illumination is varied 1000-fold, a result not necessarily apparent from the measured low-order statistics of the distribution. Our results form a consistent picture which is discussed at the end of Chapter 5. We conclude with a few future directions related to this thesis.

  17. Methods and means of Fourier-Stokes polarimetry and the spatial frequency filtering of phase anisotropy manifestations

    NASA Astrophysics Data System (ADS)

    Novakovskaya, O. Yu.; Ushenko, A. G.; Dubolazov, A. V.; Ushenko, V. A.; Ushenko, Yu. A.; Sakhnovskiy, M. Yu.; Soltys, I. V.; Zhytaryuk, V. H.; Olar, O. V.; Sidor, M.; Gorsky, M. P.

    2016-12-01

    The theoretical background of an azimuthally stable method of Jones-matrix mapping of histological sections of myocardium tissue biopsies, based on the spatial frequency selection of the mechanisms of linear and circular birefringence, is presented. The diagnostic application of a new correlation parameter, the complex degree of mutual anisotropy, is analytically substantiated. A method of measuring coordinate distributions of the complex degree of mutual anisotropy with subsequent spatial filtration of their high- and low-frequency components is developed. The interconnections of such distributions with the parameters of linear and circular birefringence of myocardium tissue histological sections are established. Comparative results are shown for the coordinate distributions of the complex degree of mutual anisotropy formed by fibrillar networks of myosin fibrils of myocardium tissue in different necrotic states, namely death due to coronary heart disease and death due to acute coronary insufficiency. The values and ranges of change of the statistical parameters (moments of the 1st to 4th order) of the complex degree of mutual anisotropy coordinate distributions are studied. Objective criteria for differentiating the cause of death are determined.

  18. Use of Multivariate Linkage Analysis for Dissection of a Complex Cognitive Trait

    PubMed Central

    Marlow, Angela J.; Fisher, Simon E.; Francks, Clyde; MacPhie, I. Laurence; Cherny, Stacey S.; Richardson, Alex J.; Talcott, Joel B.; Stein, John F.; Monaco, Anthony P.; Cardon, Lon R.

    2003-01-01

    Replication of linkage results for complex traits has been exceedingly difficult, owing in part to the inability to measure the precise underlying phenotype, small sample sizes, genetic heterogeneity, and statistical methods employed in analysis. Often, in any particular study, multiple correlated traits have been collected, yet these have been analyzed independently or, at most, in bivariate analyses. Theoretical arguments suggest that full multivariate analysis of all available traits should offer more power to detect linkage; however, this has not yet been evaluated on a genomewide scale. Here, we conduct multivariate genomewide analyses of quantitative-trait loci that influence reading- and language-related measures in families affected with developmental dyslexia. The results of these analyses are substantially clearer than those of previous univariate analyses of the same data set, helping to resolve a number of key issues. These outcomes highlight the relevance of multivariate analysis for complex disorders for dissection of linkage results in correlated traits. The approach employed here may aid positional cloning of susceptibility genes in a wide spectrum of complex traits. PMID:12587094

  19. Measurement and Statistics of Application Business in Complex Internet

    NASA Astrophysics Data System (ADS)

    Wang, Lei; Li, Yang; Li, Yipeng; Wu, Shuhang; Song, Shiji; Ren, Yong

    Owing to their independent topologies and autonomic routing mechanisms, the logical networks formed by Internet application business behavior exert a significant influence on the physical networks. In this paper, the backbone traffic of TUNET (Tsinghua University Networks) is measured and, furthermore, the two most important application businesses, HTTP and P2P, are analyzed at the IP-packet level. It is shown that uplink HTTP and P2P packet behavior presents spatio-temporal power-law characteristics with exponents 1.25 and 1.53, respectively. Downlink HTTP packet behavior also presents power-law characteristics, but with a smaller exponent γ = 0.82, which differs from typical results in complex network research. Moreover, the downlink P2P packet distribution presents an approximate power law, which suggests that flow equilibrium actually profits little from the distributed peer-to-peer mechanism.
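
    A minimal sketch of a maximum-likelihood (Hill-type) estimate of a power-law exponent of the kind reported above; the lower cutoff and the synthetic data are illustrative.

```python
# Minimal sketch: MLE of the power-law exponent alpha for samples with
# p(x) ~ x^(-alpha) above x_min; synthetic data drawn with alpha ~ 1.53.
import numpy as np

def power_law_exponent(x, x_min):
    tail = x[x >= x_min]
    return 1.0 + len(tail) / np.sum(np.log(tail / x_min))

rng = np.random.default_rng(11)
samples = rng.pareto(0.53, size=50000) + 1.0       # tail exponent 0.53 + 1
print("estimated exponent:", power_law_exponent(samples, x_min=1.0))
```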

  20. Water Quality Sensing and Spatio-Temporal Monitoring Structure with Autocorrelation Kernel Methods.

    PubMed

    Vizcaíno, Iván P; Carrera, Enrique V; Muñoz-Romero, Sergio; Cumbal, Luis H; Rojo-Álvarez, José Luis

    2017-10-16

    Pollution of water resources is usually analyzed with monitoring campaigns, which consist of programmed sampling, measurement, and recording of the most representative water quality parameters. These campaign measurements yield a non-uniform spatio-temporal sampled data structure for characterizing complex dynamic phenomena. In this work, we propose an enhanced statistical interpolation method to provide water quality managers with statistically interpolated representations of spatial-temporal dynamics. Specifically, our proposal makes efficient use of the a priori available information on the quality parameter measurements through Support Vector Regression (SVR) based on Mercer's kernels. The methods are benchmarked against previously proposed methods in three segments of the Machángara River and one segment of the San Pedro River in Ecuador, and their different dynamics are shown by statistically interpolated spatial-temporal maps. The best interpolation performance in terms of mean absolute error was obtained by the SVR with Mercer's kernel given by either the Mahalanobis spatial-temporal covariance matrix or by the bivariate estimated autocorrelation function. In particular, the autocorrelation kernel provides a significant improvement in estimation quality, consistently for all six water quality variables, which points out the relevance of including a priori knowledge of the problem.
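
    A minimal sketch of SVR interpolation with a precomputed kernel over irregular space-time samples; a plain Gaussian spatio-temporal kernel stands in for the Mahalanobis and autocorrelation kernels of the study, and all data and length scales are synthetic.

```python
# Minimal sketch: Support Vector Regression with a precomputed spatio-temporal
# kernel (a Gaussian stand-in, not the paper's autocorrelation kernel).
import numpy as np
from sklearn.svm import SVR

def gaussian_kernel(a, b, length_scales):
    d = (a[:, None, :] - b[None, :, :]) / length_scales
    return np.exp(-0.5 * np.sum(d ** 2, axis=2))

rng = np.random.default_rng(6)
# Columns: distance along the river (km), time (days); target: dissolved oxygen.
coords = rng.uniform([0.0, 0.0], [10.0, 90.0], size=(120, 2))
oxygen = (8.0 - 0.2 * coords[:, 0] + 0.5 * np.sin(coords[:, 1] / 10.0)
          + 0.1 * rng.standard_normal(120))

scales = np.array([2.0, 15.0])                   # assumed km / day length scales
gram = gaussian_kernel(coords, coords, scales)
model = SVR(kernel="precomputed", C=10.0, epsilon=0.05).fit(gram, oxygen)

query = np.array([[5.0, 45.0]])                  # interpolate at a new point
print(model.predict(gaussian_kernel(query, coords, scales)))
```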

  1. Water Quality Sensing and Spatio-Temporal Monitoring Structure with Autocorrelation Kernel Methods

    PubMed Central

    Vizcaíno, Iván P.; Muñoz-Romero, Sergio; Cumbal, Luis H.

    2017-01-01

    Pollution of water resources is usually analyzed with monitoring campaigns, which consist of programmed sampling, measurement, and recording of the most representative water quality parameters. These campaign measurements yield a non-uniform spatio-temporal sampled data structure for characterizing complex dynamic phenomena. In this work, we propose an enhanced statistical interpolation method to provide water quality managers with statistically interpolated representations of spatial-temporal dynamics. Specifically, our proposal makes efficient use of the a priori available information on the quality parameter measurements through Support Vector Regression (SVR) based on Mercer’s kernels. The methods are benchmarked against previously proposed methods in three segments of the Machángara River and one segment of the San Pedro River in Ecuador, and their different dynamics are shown by statistically interpolated spatial-temporal maps. The best interpolation performance in terms of mean absolute error was obtained by the SVR with Mercer’s kernel given by either the Mahalanobis spatial-temporal covariance matrix or by the bivariate estimated autocorrelation function. In particular, the autocorrelation kernel provides a significant improvement in estimation quality, consistently for all six water quality variables, which points out the relevance of including a priori knowledge of the problem. PMID:29035333

  2. The High Cost of Complexity in Experimental Design and Data Analysis: Type I and Type II Error Rates in Multiway ANOVA.

    ERIC Educational Resources Information Center

    Smith, Rachel A.; Levine, Timothy R.; Lachlan, Kenneth A.; Fediuk, Thomas A.

    2002-01-01

    Notes that the availability of statistical software packages has led to a sharp increase in use of complex research designs and complex statistical analyses in communication research. Reports a series of Monte Carlo simulations which demonstrate that this complexity may come at a heavier cost than many communication researchers realize. Warns…

  3. Testing for independence in J×K contingency tables with complex sample survey data.

    PubMed

    Lipsitz, Stuart R; Fitzmaurice, Garrett M; Sinha, Debajyoti; Hevelone, Nathanael; Giovannucci, Edward; Hu, Jim C

    2015-09-01

    The test of independence of row and column variables in a (J×K) contingency table is a widely used statistical test in many areas of application. For complex survey samples, use of the standard Pearson chi-squared test is inappropriate due to correlation among units within the same cluster. Rao and Scott (1981, Journal of the American Statistical Association 76, 221-230) proposed an approach in which the standard Pearson chi-squared statistic is multiplied by a design effect to adjust for the complex survey design. Unfortunately, this test fails to exist when one of the observed cell counts equals zero. Even with the large samples typical of many complex surveys, zero cell counts can occur for rare events, small domains, or contingency tables with a large number of cells. Here, we propose Wald and score test statistics for independence based on weighted least squares estimating equations. In contrast to the Rao-Scott test statistic, the proposed Wald and score test statistics always exist. In simulations, the score test is found to perform best with respect to type I error. The proposed method is motivated by, and applied to, post surgical complications data from the United States' Nationwide Inpatient Sample (NIS) complex survey of hospitals in 2008. © 2015, The International Biometric Society.
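
    A minimal sketch of a first-order design-effect correction to the Pearson chi-squared statistic for a clustered survey table; this illustrates the problem being addressed, not the Wald or score statistics proposed in the paper, and the counts and design effect are hypothetical.

```python
# Minimal sketch: Pearson chi-squared test of independence for a J x K table,
# deflated by an assumed design effect to account for within-cluster correlation.
import numpy as np
from scipy import stats

def design_corrected_chi2(table, design_effect):
    table = np.asarray(table, dtype=float)
    expected = (table.sum(axis=1, keepdims=True)
                * table.sum(axis=0, keepdims=True) / table.sum())
    x2 = np.sum((table - expected) ** 2 / expected) / design_effect
    dof = (table.shape[0] - 1) * (table.shape[1] - 1)
    return x2, stats.chi2.sf(x2, dof)

observed = [[40, 60, 30],
            [55, 45, 20]]                    # hypothetical weighted counts
print(design_corrected_chi2(observed, design_effect=1.8))
```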

  4. Recurrence Density Enhanced Complex Networks for Nonlinear Time Series Analysis

    NASA Astrophysics Data System (ADS)

    Costa, Diego G. De B.; Reis, Barbara M. Da F.; Zou, Yong; Quiles, Marcos G.; Macau, Elbert E. N.

    We introduce a new method, entitled Recurrence Density Enhanced Complex Network (RDE-CN), to properly analyze nonlinear time series. Our method first transforms a recurrence plot into a figure with a reduced number of points that nevertheless preserves the main and fundamental recurrence properties of the original plot. This resulting figure is then reinterpreted as a complex network, which is further characterized by network statistical measures. We illustrate the computational power of the RDE-CN approach with time series from both the logistic map and experimental fluid flows, which show that our method distinguishes different dynamics as well as traditional recurrence analysis does. Therefore, the proposed methodology characterizes the recurrence matrix adequately, while using a reduced set of points from the original recurrence plots.

  5. Trajectory and Relative Dispersion Case Studies and Statistics from the Green River Mesoscale Deformation, Dispersion, and Dissipation Program

    NASA Astrophysics Data System (ADS)

    Niemann, Brand Lee

    A major field program to study beta-mesoscale transport and dispersion over complex mountainous terrain was conducted during 1969 with the cooperation of three government agencies at the White Sands Missile Range in central Utah. The purpose of the program was to measure simultaneously on a large number of days the synoptic and mesoscale wind fields, the relative dispersion between pairs of particle trajectories and the rate of small scale turbulence dissipation. The field program included measurements during more than 60 days in the months of March, June, and November. The large quantity of data generated from this program has been processed and analyzed to provide case studies and statistics to evaluate and refine Lagrangian variable trajectory models. The case studies selected to illustrate the complexities of mesoscale transport and dispersion over complex terrain include those with terrain blocking, lee waves, and stagnation, as well as those with large vertical wind shears and horizontal wind field deformation. The statistics of relative particle dispersion were computed and compared to the classical theories of Richardson and Batchelor and the more recent theories of Lin and Kao among others. The relative particle dispersion was generally found to increase with travel time in the alongwind and crosswind directions, but in a more oscillatory than sustained or even accelerated manner as predicted by most theories, unless substantial wind shears or finite vertical separations between particles were present. The relative particle dispersion in the vertical was generally found to be small and bounded even when substantial vertical motions due to lee waves were present because of the limiting effect of stable temperature stratification. The data show that velocity shears have a more significant effect than turbulence on relative particle dispersion and that sufficient turbulence may not always be present above the planetary boundary layer for "wind direction shear induced dispersion" to become effective horizontal dispersion by vertical mixing over the shear layer. The statistics of relative particle dispersion in the three component directions have been summarized and stratified by flow parameters for use in practical prediction problems.

  6. Profilometry In The Angstrom Region

    NASA Astrophysics Data System (ADS)

    Politch, Jacob

    1989-01-01

    An interferometric system based on the heterodyne principle is described which enables profile measurements of a surface with high accuracy. It is possible to measure height variations of 4 Angstroms with a spatial resolution of 1 micrometer. From the surface height measurements, statistical properties such as the R of the heights and of the slopes were calculated, as well as the spectral density. The latter identifies the spatial frequencies of the surface, caused for example by the diamond turning machine and also by the measuring machine. For an electromagnetic wave with a Gaussian profile incident on the surface under test, the reflected complex field amplitude (CFA) near the focal region was calculated. We have defined the "Macroscopic wavelength" Λ, which was found to be constant for variations Δz of the focal distance from the plane under test and for variations of the beam diameter w0 in the focal region, while the complex index of refraction of the surface under test was kept constant.

  7. Complexity-entropy causality plane: A useful approach for distinguishing songs

    NASA Astrophysics Data System (ADS)

    Ribeiro, Haroldo V.; Zunino, Luciano; Mendes, Renio S.; Lenzi, Ervin K.

    2012-04-01

    Nowadays we are often faced with huge databases resulting from the rapid growth of data storage technologies. This is particularly true when dealing with music databases. In this context, it is essential to have techniques and tools able to discriminate properties from these massive sets. In this work, we report on a statistical analysis of more than ten thousand songs aiming to obtain a complexity hierarchy. Our approach is based on the estimation of the permutation entropy combined with an intensive complexity measure, building up the complexity-entropy causality plane. The results obtained indicate that this representation space is very promising to discriminate songs as well as to allow a relative quantitative comparison among songs. Additionally, we believe that the here-reported method may be applied in practical situations since it is simple, robust and has a fast numerical implementation.
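
    A minimal sketch of the two coordinates of the complexity-entropy causality plane, normalized permutation entropy and Jensen-Shannon statistical complexity, computed from ordinal (Bandt-Pompe) probabilities of a white-noise signal; the embedding order is an illustrative choice.

```python
# Minimal sketch: ordinal-pattern probabilities, normalized permutation
# entropy H, and the Jensen-Shannon statistical complexity C.
import numpy as np
from itertools import permutations

def ordinal_probabilities(x, order=4):
    patterns = {p: i for i, p in enumerate(permutations(range(order)))}
    counts = np.zeros(len(patterns))
    for i in range(len(x) - order + 1):
        counts[patterns[tuple(np.argsort(x[i:i + order]))]] += 1
    return counts / counts.sum()

def shannon(p):
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def complexity_entropy(p):
    n = len(p)
    h_norm = shannon(p) / np.log(n)
    uniform = np.ones(n) / n
    js = shannon(0.5 * (p + uniform)) - 0.5 * shannon(p) - 0.5 * shannon(uniform)
    # Maximum Jensen-Shannon divergence from the uniform distribution.
    js_max = -0.5 * ((n + 1) / n * np.log(n + 1) - 2 * np.log(2 * n) + np.log(n))
    return h_norm, h_norm * js / js_max

rng = np.random.default_rng(7)
h, c = complexity_entropy(ordinal_probabilities(rng.standard_normal(20000)))
print(f"white noise: H = {h:.3f}, C = {c:.3f}")    # high H, near-zero C
```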

  8. A Longitudinal Analysis of the Influence of a Peer Run Warm Line Phone Service on Psychiatric Recovery.

    PubMed

    Dalgin, Rebecca Spirito; Dalgin, M Halim; Metzger, Scott J

    2018-05-01

    This article focuses on the impact of a peer run warm line as part of the psychiatric recovery process. It utilized data including the Recovery Assessment Scale, community integration measures and crisis service usage. Longitudinal statistical analysis was completed on 48 sets of data from 2011, 2012, and 2013. Although no statistically significant differences were observed for the RAS score, community integration data showed increases in visits to primary care doctors, leisure/recreation activities and socialization with others. This study highlights the complexity of psychiatric recovery and that nonclinical peer services like peer run warm lines may be critical to the process.

  9. Stokes-correlometry of polarization-inhomogeneous objects

    NASA Astrophysics Data System (ADS)

    Ushenko, O. G.; Dubolazov, A.; Bodnar, G. B.; Bachynskiy, V. T.; Vanchulyak, O.

    2018-01-01

    The paper consists of two parts. The first part presents the short theoretical basics of the Stokes-correlometry method for describing the optical anisotropy of biological tissues. Experimentally measured coordinate distributions of the modulus (MSV) and phase (PhSV) of the complex Stokes vector of skeletal muscle tissue are provided, and the values and ranges of change of the statistical moments of the 1st to 4th orders that characterize the distributions of the MSV and PhSV values are defined. The second part presents the data of a statistical analysis of the MSV and PhSV distributions. Objective criteria for the differentiation of samples with urinary incontinence are defined.
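
    A minimal sketch of the first- to fourth-order statistical moments used above, computed here for an arbitrary synthetic map of Stokes-vector modulus values.

```python
# Minimal sketch: 1st-4th order statistical moments of a coordinate
# distribution (synthetic MSV map; all values are illustrative only).
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
msv_map = np.abs(rng.normal(0.6, 0.15, size=(256, 256))).ravel()

print("M1 (mean):    ", np.mean(msv_map))
print("M2 (std):     ", np.std(msv_map))
print("M3 (skewness):", stats.skew(msv_map))
print("M4 (kurtosis):", stats.kurtosis(msv_map))
```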

  10. Measuring and improving the quality of postoperative epidural analgesia for major abdominal surgery using statistical process control charts.

    PubMed

    Duncan, Fiona; Haigh, Carol

    2013-10-01

    To explore and improve the quality of continuous epidural analgesia for pain relief using Statistical Process Control tools. Measuring the quality of pain management interventions is complex. Intermittent audits do not accurately capture the results of quality improvement initiatives. The failure rate for one intervention, epidural analgesia, is approximately 30% in everyday practice, so it is an important area for improvement. Continuous measurement and analysis are required to understand the multiple factors involved in providing effective pain relief. The design was process control and quality improvement. Routine, prospectively acquired data collection started in 2006. Patients were asked about their pain and side effects of treatment. Statistical Process Control methods were applied for continuous data analysis. A multidisciplinary group worked together to identify reasons for variation in the data and instigated ideas for improvement. The key measure for improvement was a reduction in the percentage of patients with an epidural in severe pain. The baseline control charts illustrated the recorded variation in the rate of several processes and outcomes for 293 surgical patients. The mean visual analogue pain score (VNRS) was four. There was no special cause variation when data were stratified by surgeons, clinical area, or patients who had experienced pain before surgery. Fifty-seven per cent of patients were hypotensive on the first day after surgery. We were able to demonstrate a significant improvement in the failure rate of epidurals as the project continued with quality improvement interventions. Statistical Process Control is a useful tool for measuring and improving the quality of pain management. The applications of Statistical Process Control methods offer the potential to learn more about the process of change and outcomes in an Acute Pain Service, both locally and nationally. We have been able to develop measures for improvement and benchmarking in routine care that have led to the establishment of a national pain registry. © 2013 Blackwell Publishing Ltd.
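
    A minimal sketch (synthetic audit data, binomial assumptions) of the three-sigma limits for a p-chart tracking the monthly proportion of epidural patients reporting severe pain, the kind of control chart described above.

```python
# Minimal sketch: p-chart centre line and three-sigma control limits for the
# monthly proportion of patients in severe pain; the data are simulated.
import numpy as np

rng = np.random.default_rng(9)
patients_per_month = rng.integers(20, 35, size=24)
severe_pain = rng.binomial(patients_per_month, 0.18)

p = severe_pain / patients_per_month
p_bar = severe_pain.sum() / patients_per_month.sum()          # centre line
sigma = np.sqrt(p_bar * (1 - p_bar) / patients_per_month)
upper = p_bar + 3 * sigma
lower = np.clip(p_bar - 3 * sigma, 0.0, None)

for month, (prop, lo, hi) in enumerate(zip(p, lower, upper), start=1):
    flag = "  <- special cause" if prop > hi or prop < lo else ""
    print(f"month {month:2d}: p = {prop:.2f} (limits {lo:.2f}-{hi:.2f}){flag}")
```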

  11. Experimental Analysis and Measurement of Situation Awareness

    DTIC Science & Technology

    1995-11-01

    the participant is interacting that can be characterized uniquely by a set of information, knowledge and response options. However, the concept of a...should receive attention is when the interruption or the surprise creates a statistical interaction between two or more of the other variables of...Awareness in Complex Systems. Daytona Beach, Fl: Embry-Riddle Aeronautical University Press. Sarter, N.B., and Woods, D.D. (1994). Pilot interaction

  12. UAV Swarm Mission Planning Development Using Evolutionary Algorithms - Part I

    DTIC Science & Technology

    2008-05-01

    desired behaviors in autonomous vehicles is a difficult problem at best and in general prob- ably impossible to completely resolve in complex dynamic...associated behaviors. Various techniques inspired by biological self-organized systems as found in forging insects and flocking birds, revolve around...swarms of heterogeneous vehicles in a distributed simulation system with animated graphics. Statistical measurements and observations indicate that bio

  13. An approach for the assessment of the statistical aspects of the SEA coupling loss factors and the vibrational energy transmission in complex aircraft structures: Experimental investigation and methods benchmark

    NASA Astrophysics Data System (ADS)

    Bouhaj, M.; von Estorff, O.; Peiffer, A.

    2017-09-01

    In the application of Statistical Energy Analysis (SEA) to complex assembled structures, a purely predictive model often exhibits errors. These errors are mainly due to a lack of accurate modelling of the power transmission mechanism described through the Coupling Loss Factors (CLFs). Experimental SEA (ESEA) is practically used by the automotive and aerospace industry to verify and update the model or to derive the CLFs for use in an SEA predictive model when analytical estimates cannot be made. This work is particularly motivated by the lack of procedures that allow an estimate to be made of the variance and confidence intervals of the statistical quantities when using the ESEA technique. The aim of this paper is to introduce procedures enabling a statistical description of measured power input, vibration energies and the derived SEA parameters. Particular emphasis is placed on the identification of structural CLFs of complex built-up structures by comparing different methods. By adopting a Stochastic Energy Model (SEM), the ensemble average in ESEA is also addressed. For this purpose, expressions are obtained to randomly perturb the energy matrix elements and generate individual samples for the Monte Carlo (MC) technique applied to derive the ensemble averaged CLF. From results of ESEA tests conducted on an aircraft fuselage section, the SEM approach provides better performance for the estimated CLFs than classical matrix inversion methods. The expected range of CLF values and the synthesized energy are used as quality criteria of the matrix inversion, allowing critical SEA subsystems to be assessed, which might require a more refined statistical description of the excitation and the response fields. Moreover, the impact of the variance of the normalized vibration energy on the uncertainty of the derived CLFs is outlined.
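
    A minimal sketch (one common ESEA matrix-inversion convention with synthetic two-subsystem numbers, not the paper's Stochastic Energy Model) of recovering a coupling loss factor from a measured energy matrix and propagating measurement scatter by Monte Carlo perturbation.

```python
# Minimal sketch: CLFs from the power balance P_diag = omega * L * E, i.e.
# L = P_diag * inv(E) / omega, with Monte Carlo perturbation of the measured
# energy matrix; all values are synthetic and the convention is assumed.
import numpy as np

omega = 2 * np.pi * 1000.0                      # band centre frequency (rad/s)
input_power = np.diag([1.0, 1.0])               # unit input power per excitation
energy = np.array([[2.0e-4, 3.0e-5],            # E[k, j]: energy of subsystem k
                   [4.0e-5, 1.5e-4]])           # when subsystem j is excited

def loss_factor_matrix(energy_matrix):
    return input_power @ np.linalg.inv(energy_matrix) / omega

rng = np.random.default_rng(12)
clf_samples = []
for _ in range(2000):
    perturbed = energy * rng.normal(1.0, 0.05, size=energy.shape)   # 5% scatter
    clf_samples.append(-loss_factor_matrix(perturbed)[0, 1])        # eta for 2 -> 1
clf_samples = np.array(clf_samples)
print("eta_21 mean:", clf_samples.mean(),
      "95% range:", np.percentile(clf_samples, [2.5, 97.5]))
```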

  14. Statistical assessment of the learning curves of health technologies.

    PubMed

    Ramsay, C R; Grant, A M; Wallace, S A; Garthwaite, P H; Monk, A F; Russell, I T

    2001-01-01

    (1) To describe systematically studies that directly assessed the learning curve effect of health technologies. (2) Systematically to identify 'novel' statistical techniques applied to learning curve data in other fields, such as psychology and manufacturing. (3) To test these statistical techniques in data sets from studies of varying designs to assess health technologies in which learning curve effects are known to exist. METHODS - STUDY SELECTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): For a study to be included, it had to include a formal analysis of the learning curve of a health technology using a graphical, tabular or statistical technique. METHODS - STUDY SELECTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): For a study to be included, it had to include a formal assessment of a learning curve using a statistical technique that had not been identified in the previous search. METHODS - DATA SOURCES: Six clinical and 16 non-clinical biomedical databases were searched. A limited amount of handsearching and scanning of reference lists was also undertaken. METHODS - DATA EXTRACTION (HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW): A number of study characteristics were abstracted from the papers such as study design, study size, number of operators and the statistical method used. METHODS - DATA EXTRACTION (NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH): The new statistical techniques identified were categorised into four subgroups of increasing complexity: exploratory data analysis; simple series data analysis; complex data structure analysis, generic techniques. METHODS - TESTING OF STATISTICAL METHODS: Some of the statistical methods identified in the systematic searches for single (simple) operator series data and for multiple (complex) operator series data were illustrated and explored using three data sets. The first was a case series of 190 consecutive laparoscopic fundoplication procedures performed by a single surgeon; the second was a case series of consecutive laparoscopic cholecystectomy procedures performed by ten surgeons; the third was randomised trial data derived from the laparoscopic procedure arm of a multicentre trial of groin hernia repair, supplemented by data from non-randomised operations performed during the trial. RESULTS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: Of 4571 abstracts identified, 272 (6%) were later included in the study after review of the full paper. Some 51% of studies assessed a surgical minimal access technique and 95% were case series. The statistical method used most often (60%) was splitting the data into consecutive parts (such as halves or thirds), with only 14% attempting a more formal statistical analysis. The reporting of the studies was poor, with 31% giving no details of data collection methods. RESULTS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: Of 9431 abstracts assessed, 115 (1%) were deemed appropriate for further investigation and, of these, 18 were included in the study. All of the methods for complex data sets were identified in the non-clinical literature. These were discriminant analysis, two-stage estimation of learning rates, generalised estimating equations, multilevel models, latent curve models, time series models and stochastic parameter models. In addition, eight new shapes of learning curves were identified. RESULTS - TESTING OF STATISTICAL METHODS: No one particular shape of learning curve performed significantly better than another. 
The performance of 'operation time' as a proxy for learning differed between the three procedures. Multilevel modelling using the laparoscopic cholecystectomy data demonstrated and measured surgeon-specific and confounding effects. The inclusion of non-randomised cases, despite the possible limitations of the method, enhanced the interpretation of learning effects. CONCLUSIONS - HEALTH TECHNOLOGY ASSESSMENT LITERATURE REVIEW: The statistical methods used for assessing learning effects in health technology assessment have been crude and the reporting of studies poor. CONCLUSIONS - NON-HEALTH TECHNOLOGY ASSESSMENT LITERATURE SEARCH: A number of statistical methods for assessing learning effects were identified that had not hitherto been used in health technology assessment. There was a hierarchy of methods for the identification and measurement of learning, and the more sophisticated methods for both have had little if any use in health technology assessment. This demonstrated the value of considering fields outside clinical research when addressing methodological issues in health technology assessment. CONCLUSIONS - TESTING OF STATISTICAL METHODS: It has been demonstrated that the portfolio of techniques identified can enhance investigations of learning curve effects. (ABSTRACT TRUNCATED)
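
    As a rough illustration of the multilevel ("complex data structure") methods named in this record, the sketch below fits a random-intercept, random-slope learning-curve model on the log-log scale using statsmodels. The synthetic data, the power-law curve shape and the variable names (op_time, case_no, surgeon) are illustrative assumptions, not the review's data or code.

      # Minimal sketch of a multilevel (random-effects) learning-curve model.
      # All data below are synthetic; the formula and variable names are illustrative.
      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      rng = np.random.default_rng(0)
      rows = []
      for surgeon in range(10):                      # ten operators, loosely echoing the cholecystectomy series
          baseline = rng.normal(90, 10)              # surgeon-specific starting operation time (minutes)
          rate = rng.normal(-0.15, 0.03)             # surgeon-specific learning rate (log-log slope)
          for case_no in range(1, 51):
              op_time = baseline * case_no ** rate * np.exp(rng.normal(0, 0.08))
              rows.append({"surgeon": surgeon, "case_no": case_no, "op_time": op_time})
      df = pd.DataFrame(rows)

      # A power-law learning curve becomes linear on the log-log scale;
      # random intercepts and slopes capture between-surgeon variation.
      df["log_time"] = np.log(df["op_time"])
      df["log_case"] = np.log(df["case_no"])
      model = smf.mixedlm("log_time ~ log_case", df, groups=df["surgeon"],
                          re_formula="~log_case")
      fit = model.fit()
      print(fit.summary())                           # fixed effect of log_case = average learning rate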

  15. Globalizing Air Pollution

    NASA Astrophysics Data System (ADS)

    Lin, J.

    2017-12-01

    Recent studies have revealed the issue of globalizing air pollution through complex coupling of atmospheric transport (physical route) and economic trade (socioeconomic route). Recognition of such globalizing air pollution has important implications for understanding the impacts of regional and global consumption (of goods and services) on air quality, public health, climate and ecosystems. Addressing these questions often requires improved modeling, measurements and economic-emission statistics. This talk will introduce the concept and mechanism of globalizing air pollution, followed by demonstrations based on recent work on modeling, satellite measurement and multi-disciplinary assessment.

  16. An inferentialist perspective on the coordination of actions and reasons involved in making a statistical inference

    NASA Astrophysics Data System (ADS)

    Bakker, Arthur; Ben-Zvi, Dani; Makar, Katie

    2017-12-01

    To understand how statistical and other types of reasoning are coordinated with actions to reduce uncertainty, we conducted a case study in vocational education that involved statistical hypothesis testing. We analyzed an intern's research project in a hospital laboratory in which reducing uncertainties was crucial to make a valid statistical inference. In his project, the intern, Sam, investigated whether patients' blood could be sent through pneumatic post without influencing the measurement of particular blood components. We asked, in the process of making a statistical inference, how are reasons and actions coordinated to reduce uncertainty? For the analysis, we used the semantic theory of inferentialism, specifically, the concept of webs of reasons and actions—complexes of interconnected reasons for facts and actions; these reasons include premises and conclusions, inferential relations, implications, motives for action, and utility of tools for specific purposes in a particular context. Analysis of interviews with Sam, his supervisor and teacher as well as video data of Sam in the classroom showed that many of Sam's actions aimed to reduce variability, rule out errors, and thus reduce uncertainties so as to arrive at a valid inference. Interestingly, the decisive factor was not the outcome of a t test but of the reference change value, a clinical chemical measure of analytic and biological variability. With insights from this case study, we expect that students can be better supported in connecting statistics with context and in dealing with uncertainty.

  17. Probing the Statistical Properties of Unknown Texts: Application to the Voynich Manuscript

    PubMed Central

    Amancio, Diego R.; Altmann, Eduardo G.; Rybski, Diego; Oliveira, Osvaldo N.; Costa, Luciano da F.

    2013-01-01

    While the use of statistical physics methods to analyze large corpora has been useful to unveil many patterns in texts, no comprehensive investigation has been performed on the interdependence between syntactic and semantic factors. In this study we propose a framework for determining whether a text (e.g., written in an unknown alphabet) is compatible with a natural language and to which language it could belong. The approach is based on three types of statistical measurements, i.e. obtained from first-order statistics of word properties in a text, from the topology of complex networks representing texts, and from intermittency concepts where text is treated as a time series. Comparative experiments were performed with the New Testament in 15 different languages and with distinct books in English and Portuguese in order to quantify the dependency of the different measurements on the language and on the story being told in the book. The metrics found to be informative in distinguishing real texts from their shuffled versions include assortativity, degree and selectivity of words. As an illustration, we analyze an undeciphered medieval manuscript known as the Voynich Manuscript. We show that it is mostly compatible with natural languages and incompatible with random texts. We also obtain candidates for keywords of the Voynich Manuscript which could be helpful in the effort of deciphering it. Because we were able to identify statistical measurements that are more dependent on the syntax than on the semantics, the framework may also serve for text analysis in language-dependent applications. PMID:23844002
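
    A minimal sketch of two of the three measurement types mentioned above is given below: first-order word statistics and a network-topology measure (degree assortativity) compared between a text and its shuffled version. The toy sentence, the adjacency-graph construction and the use of networkx are illustrative assumptions, not the authors' corpus or pipeline.

      # Toy sketch: word-frequency statistics and co-occurrence network assortativity
      # for a text versus its shuffled counterpart. The text below is a stand-in corpus.
      import random
      from collections import Counter
      import networkx as nx

      text = ("the quick brown fox jumps over the lazy dog and the dog barks "
              "at the fox while the fox runs away from the dog").split()

      def word_adjacency_graph(words):
          """Link words that appear next to each other at least once."""
          g = nx.Graph()
          for a, b in zip(words, words[1:]):
              if a != b:
                  g.add_edge(a, b)
          return g

      def report(words, label):
          freq = Counter(words)
          g = word_adjacency_graph(words)
          print(f"{label}: vocabulary={len(freq)}, "
                f"most frequent={freq.most_common(1)[0]}, "
                f"assortativity={nx.degree_assortativity_coefficient(g):.3f}")

      report(text, "original")
      shuffled = text[:]
      random.shuffle(shuffled)
      report(shuffled, "shuffled")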

  18. Probing the statistical properties of unknown texts: application to the Voynich Manuscript.

    PubMed

    Amancio, Diego R; Altmann, Eduardo G; Rybski, Diego; Oliveira, Osvaldo N; Costa, Luciano da F

    2013-01-01

    While the use of statistical physics methods to analyze large corpora has been useful to unveil many patterns in texts, no comprehensive investigation has been performed on the interdependence between syntactic and semantic factors. In this study we propose a framework for determining whether a text (e.g., written in an unknown alphabet) is compatible with a natural language and to which language it could belong. The approach is based on three types of statistical measurements, i.e. obtained from first-order statistics of word properties in a text, from the topology of complex networks representing texts, and from intermittency concepts where text is treated as a time series. Comparative experiments were performed with the New Testament in 15 different languages and with distinct books in English and Portuguese in order to quantify the dependency of the different measurements on the language and on the story being told in the book. The metrics found to be informative in distinguishing real texts from their shuffled versions include assortativity, degree and selectivity of words. As an illustration, we analyze an undeciphered medieval manuscript known as the Voynich Manuscript. We show that it is mostly compatible with natural languages and incompatible with random texts. We also obtain candidates for keywords of the Voynich Manuscript which could be helpful in the effort of deciphering it. Because we were able to identify statistical measurements that are more dependent on the syntax than on the semantics, the framework may also serve for text analysis in language-dependent applications.

  19. Teaching Statistics--Despite Its Applications

    ERIC Educational Resources Information Center

    Ridgway, Jim; Nicholson, James; McCusker, Sean

    2007-01-01

    Evidence-based policy requires sophisticated modelling and reasoning about complex social data. The current UK statistics curricula do not equip tomorrow's citizens to understand such reasoning. We advocate radical curriculum reform, designed to require students to reason from complex data.

  20. Autonomous Modeling, Statistical Complexity and Semi-annealed Treatment of Boolean Networks

    NASA Astrophysics Data System (ADS)

    Gong, Xinwei

    This dissertation presents three studies on Boolean networks. Boolean networks are a class of mathematical systems consisting of interacting elements with binary state variables. Each element is a node with a Boolean logic gate, and the presence of interactions between any two nodes is represented by directed links. Boolean networks that implement the logic structures of real systems are studied as coarse-grained models of the real systems. Large random Boolean networks are studied with mean field approximations and used to provide a baseline of possible behaviors of large real systems. This dissertation presents one study of the former type, concerning the stable oscillation of a yeast cell-cycle oscillator, and two studies of the latter type, respectively concerning the statistical complexity of large random Boolean networks and an extension of traditional mean field techniques that accounts for the presence of short loops. In the cell-cycle oscillator study, a novel autonomous update scheme is introduced to study the stability of oscillations in small networks. A motif that corrects pulse-growing perturbations and a motif that grows pulses are identified. A combination of the two motifs is capable of sustaining stable oscillations. Examining a Boolean model of the yeast cell-cycle oscillator using an autonomous update scheme yields evidence that it is endowed with such a combination. Random Boolean networks are classified as ordered, critical or disordered based on their response to small perturbations. In the second study, random Boolean networks are taken as prototypical cases for the evaluation of two measures of complexity based on a criterion for optimal statistical prediction. One measure, defined for homogeneous systems, does not distinguish between the static spatial inhomogeneity in the ordered phase and the dynamical inhomogeneity in the disordered phase. A modification in which complexities of individual nodes are calculated yields vanishing complexity values for networks in the ordered and critical phases and for highly disordered networks, peaking somewhere in the disordered phase. Individual nodes with high complexity have, on average, a larger influence on the system dynamics. Lastly, a semi-annealed approximation that preserves the correlation between states at neighboring nodes is introduced to study a social game-inspired network model in which all links are bidirectional and all nodes have a self-input. The technique developed here is shown to yield accurate predictions of distribution of players' states, and accounts for some nontrivial collective behavior of game theoretic interest.
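
    The sketch below illustrates the basic object discussed here: a random Boolean network with K inputs per node, updated synchronously, and classified by its response to a one-bit perturbation. The network size, connectivity and number of steps are illustrative choices, not parameters from the dissertation.

      # Minimal sketch of a random Boolean network and its perturbation response,
      # the criterion used to classify networks as ordered, critical or disordered.
      import numpy as np

      rng = np.random.default_rng(1)
      N, K, STEPS = 100, 2, 50                       # K = 2 inputs per node is the critical connectivity

      inputs = np.array([rng.choice(N, size=K, replace=False) for _ in range(N)])
      tables = rng.integers(0, 2, size=(N, 2 ** K))  # one random Boolean function per node

      def step(state):
          # index each node's truth table by the binary word formed by its inputs
          idx = np.zeros(N, dtype=int)
          for k in range(K):
              idx = idx * 2 + state[inputs[:, k]]
          return tables[np.arange(N), idx]

      state_a = rng.integers(0, 2, size=N)
      state_b = state_a.copy()
      state_b[0] ^= 1                                # flip a single node

      for _ in range(STEPS):
          state_a, state_b = step(state_a), step(state_b)

      print("normalized Hamming distance after", STEPS, "steps:",
            np.mean(state_a != state_b))             # ~0 in the ordered phase, O(1) when disordered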

  1. Learning Predictive Statistics: Strategies and Brain Mechanisms.

    PubMed

    Wang, Rui; Shen, Yuan; Tino, Peter; Welchman, Andrew E; Kourtzi, Zoe

    2017-08-30

    When immersed in a new environment, we are challenged to decipher initially incomprehensible streams of sensory information. However, quite rapidly, the brain finds structure and meaning in these incoming signals, helping us to predict and prepare ourselves for future actions. This skill relies on extracting the statistics of event streams in the environment that contain regularities of variable complexity from simple repetitive patterns to complex probabilistic combinations. Here, we test the brain mechanisms that mediate our ability to adapt to the environment's statistics and predict upcoming events. By combining behavioral training and multisession fMRI in human participants (male and female), we track the corticostriatal mechanisms that mediate learning of temporal sequences as they change in structure complexity. We show that learning of predictive structures relates to individual decision strategy; that is, selecting the most probable outcome in a given context (maximizing) versus matching the exact sequence statistics. These strategies engage distinct human brain regions: maximizing engages dorsolateral prefrontal, cingulate, sensory-motor regions, and basal ganglia (dorsal caudate, putamen), whereas matching engages occipitotemporal regions (including the hippocampus) and basal ganglia (ventral caudate). Our findings provide evidence for distinct corticostriatal mechanisms that facilitate our ability to extract behaviorally relevant statistics to make predictions. SIGNIFICANCE STATEMENT Making predictions about future events relies on interpreting streams of information that may initially appear incomprehensible. Past work has studied how humans identify repetitive patterns and associative pairings. However, the natural environment contains regularities that vary in complexity from simple repetition to complex probabilistic combinations. Here, we combine behavior and multisession fMRI to track the brain mechanisms that mediate our ability to adapt to changes in the environment's statistics. We provide evidence for an alternate route for learning complex temporal statistics: extracting the most probable outcome in a given context is implemented by interactions between executive and motor corticostriatal mechanisms compared with visual corticostriatal circuits (including hippocampal cortex) that support learning of the exact temporal statistics. Copyright © 2017 Wang et al.

  2. Relationship of multiscale entropy to task difficulty and sway velocity in healthy young adults.

    PubMed

    Lubetzky, Anat V; Price, Robert; Ciol, Marcia A; Kelly, Valerie E; McCoy, Sarah W

    2015-01-01

    Multiscale entropy (MSE) is a nonlinear measure of postural control that quantifies how complex the postural sway is by assigning a complexity index to the center of pressure (COP) oscillations. While complexity has been shown to be task dependent, the relationship between sway complexity and level of task challenge is currently unclear. This study tested whether MSE can detect short-term changes in postural control in response to increased standing balance task difficulty in healthy young adults and compared this response to that of a traditional measure of postural steadiness, root mean square of velocity (VRMS). COP data from 20 s of quiet stance were analyzed when 30 healthy young adults stood on the following surfaces: on floor and foam with eyes open and closed and on the compliant side of a Both Sides Up (BOSU) ball with eyes open. Complexity index (CompI) was derived from MSE curves. Repeated measures analysis of variance across standing conditions showed a statistically significant effect of condition (p < 0.001) in both the anterior-posterior and medio-lateral directions for both CompI and VRMS. In the medio-lateral direction there was a gradual increase in CompI and VRMS with increased standing challenge. In the anterior-posterior direction, VRMS showed a gradual increase whereas CompI showed significant differences between the BOSU and all other conditions. CompI was moderately and significantly correlated with VRMS. Both nonlinear and traditional measures of postural control were sensitive to the task and increased with increasing difficulty of standing balance tasks in healthy young adults.
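
    A minimal sketch of the multiscale entropy calculation follows: coarse-grain the signal at several scales, compute sample entropy at each scale, and sum the values into a complexity index. The synthetic signal and the parameter choices (m = 2, r = 0.15 SD, scales 1-6) are conventional illustrations, not necessarily those of the study.

      # Minimal sketch of multiscale entropy (MSE) and a complexity index (CompI).
      import numpy as np

      def sample_entropy(x, m=2, r=0.15):
          x = np.asarray(x, dtype=float)
          r *= x.std()
          n = len(x)
          def count_matches(mm):
              templates = np.array([x[i:i + mm] for i in range(n - mm)])
              count = 0
              for i in range(len(templates)):
                  dist = np.max(np.abs(templates - templates[i]), axis=1)
                  count += np.sum(dist <= r) - 1    # exclude the self-match
              return count
          b, a = count_matches(m), count_matches(m + 1)
          return -np.log(a / b) if a > 0 and b > 0 else np.nan

      def coarse_grain(x, tau):
          n = len(x) // tau
          return x[:n * tau].reshape(n, tau).mean(axis=1)

      rng = np.random.default_rng(0)
      cop = np.cumsum(rng.normal(size=2000)) * 0.01  # toy sway-like signal, not real COP data

      comp_i = sum(sample_entropy(coarse_grain(cop, tau)) for tau in range(1, 7))
      print("Complexity index (scales 1-6):", round(comp_i, 3))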

  3. Spectroscopic and Statistical Techniques for Information Recovery in Metabonomics and Metabolomics

    NASA Astrophysics Data System (ADS)

    Lindon, John C.; Nicholson, Jeremy K.

    2008-07-01

    Methods for generating and interpreting metabolic profiles based on nuclear magnetic resonance (NMR) spectroscopy, mass spectrometry (MS), and chemometric analysis methods are summarized and the relative strengths and weaknesses of NMR and chromatography-coupled MS approaches are discussed. Given that all data sets measured to date only probe subsets of complex metabolic profiles, we describe recent developments for enhanced information recovery from the resulting complex data sets, including integration of NMR- and MS-based metabonomic results and combination of metabonomic data with data from proteomics, transcriptomics, and genomics. We summarize the breadth of applications, highlight some current activities, discuss the issues relating to metabonomics, and identify future trends.

  4. Spectroscopic and statistical techniques for information recovery in metabonomics and metabolomics.

    PubMed

    Lindon, John C; Nicholson, Jeremy K

    2008-01-01

    Methods for generating and interpreting metabolic profiles based on nuclear magnetic resonance (NMR) spectroscopy, mass spectrometry (MS), and chemometric analysis methods are summarized and the relative strengths and weaknesses of NMR and chromatography-coupled MS approaches are discussed. Given that all data sets measured to date only probe subsets of complex metabolic profiles, we describe recent developments for enhanced information recovery from the resulting complex data sets, including integration of NMR- and MS-based metabonomic results and combination of metabonomic data with data from proteomics, transcriptomics, and genomics. We summarize the breadth of applications, highlight some current activities, discuss the issues relating to metabonomics, and identify future trends.

  5. Fractal structure enables temporal prediction in music.

    PubMed

    Rankin, Summer K; Fink, Philip W; Large, Edward W

    2014-10-01

    1/f serial correlations and statistical self-similarity (fractal structure) have been measured in various dimensions of musical compositions. Musical performances also display 1/f properties in expressive tempo fluctuations, and listeners predict tempo changes when synchronizing. Here the authors show that the 1/f structure is sufficient for listeners to predict the onset times of upcoming musical events. These results reveal what information listeners use to anticipate events in complex, non-isochronous acoustic rhythms, and this will entail innovative models of temporal synchronization. This finding could improve therapies for Parkinson's and related disorders and inform deeper understanding of how endogenous neural rhythms anticipate events in complex, temporally structured communication signals.
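
    As a generic illustration of the 1/f structure discussed here, the sketch below synthesizes a 1/f fluctuation series and recovers its spectral exponent from a log-log fit of the power spectrum. The synthesis and fit are illustrative assumptions, not the authors' stimuli or analysis code.

      # Minimal sketch: generate 1/f noise and estimate its spectral exponent.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 4096
      freqs = np.fft.rfftfreq(n, d=1.0)

      # build a spectrum with power ~ 1/f and random phases, then invert to a time series
      amp = np.zeros_like(freqs)
      amp[1:] = freqs[1:] ** -0.5                    # amplitude ~ f^-1/2 gives power ~ 1/f
      phases = np.exp(1j * rng.uniform(0, 2 * np.pi, size=freqs.size))
      series = np.fft.irfft(amp * phases, n=n)

      # estimate the exponent beta in power ~ 1/f^beta
      power = np.abs(np.fft.rfft(series)) ** 2
      mask = freqs > 0
      beta = -np.polyfit(np.log(freqs[mask]), np.log(power[mask]), 1)[0]
      print(f"recovered spectral exponent beta = {beta:.2f} (target 1.0)")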

  6. Arsenic distribution and valence state variation studied by fast hierarchical length-scale morphological, compositional, and speciation imaging at the Nanoscopium, Synchrotron Soleil

    NASA Astrophysics Data System (ADS)

    Somogyi, Andrea; Medjoubi, Kadda; Sancho-Tomas, Maria; Visscher, P. T.; Baranton, Gil; Philippot, Pascal

    2017-09-01

    The understanding of real complex geological, environmental and geo-biological processes depends increasingly on in-depth non-invasive study of chemical composition and morphology. In this paper we used scanning hard X-ray nanoprobe techniques in order to study the elemental composition, morphology and As speciation in complex, highly heterogeneous geological samples. Multivariate statistical analytical techniques, such as principal component analysis and clustering, were used for data interpretation. These measurements revealed the quantitative and valence state inhomogeneity of As and its relation to the total compositional and morphological variation of the sample at sub-μm scales.
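
    A minimal sketch of the multivariate step described above (principal component analysis followed by clustering of per-pixel spectra) is given below. The synthetic "pixel" spectra mixing two reference signatures, and the scikit-learn objects and parameters, are illustrative assumptions, not the beamline's actual processing chain.

      # Minimal sketch: PCA + k-means clustering of synthetic per-pixel spectra.
      import numpy as np
      from sklearn.decomposition import PCA
      from sklearn.cluster import KMeans

      rng = np.random.default_rng(0)
      n_pixels, n_channels = 500, 40
      ref1 = np.exp(-0.5 * ((np.arange(n_channels) - 10) / 3.0) ** 2)   # one hypothetical spectral signature
      ref2 = np.exp(-0.5 * ((np.arange(n_channels) - 25) / 3.0) ** 2)   # a second hypothetical signature

      mix = rng.uniform(0, 1, size=n_pixels)[:, None]
      spectra = mix * ref1 + (1 - mix) * ref2 + rng.normal(0, 0.02, (n_pixels, n_channels))

      scores = PCA(n_components=2).fit_transform(spectra)   # most variance captured by the first component
      labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(scores)
      print("pixels per cluster:", np.bincount(labels))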

  7. [Cone-beam CT evaluation of nasomaxillary complex and upper airway following rapid maxillary expansion].

    PubMed

    Li, Lei; Qi, Suqing; Wang, Hongwei; Ren, Sufeng; Ban, Jiandong

    2015-07-01

    To evaluate the naso-maxillary complex width and pharyngeal airway volume changes after rapid maxillary expansion (RME). Thirty-five patients were selected (18 males, 17 females; mean age 12.1 ± 1.1 years). All patients underwent orthodontic treatment with Hyrax palatal expanders. Cone-beam CT (CBCT) scans were taken before treatment (T0), 16 days (T1) and three months (T3) after RME. Naso-maxillary complex width and pharyngeal airway volume were measured. After treatment, the width of the piriform aperture and the maxillary width were significantly increased compared with those before treatment (P < 0.05). Three months after RME, no statistical difference was found in maxillary width compared with that before treatment. The nasopharyngeal volume significantly increased by 29.9% compared with that before treatment (P < 0.05), and the volume remained relatively stable after three months. RME resulted in a significant increase in naso-maxillary complex width and nasopharyngeal volume.

  8. A New Metrics for Countries' Fitness and Products' Complexity

    NASA Astrophysics Data System (ADS)

    Tacchella, Andrea; Cristelli, Matthieu; Caldarelli, Guido; Gabrielli, Andrea; Pietronero, Luciano

    2012-10-01

    Classical economic theories prescribe specialization of countries' industrial production. Inspection of the country databases of exported products shows that this is not the case: successful countries are extremely diversified, in analogy with biosystems evolving in a competitive dynamical environment. The challenge is assessing quantitatively the non-monetary competitive advantage of diversification, which represents the hidden potential for development and growth. Here we develop a new statistical approach based on coupled non-linear maps, whose fixed point defines a new metric for country Fitness and product Complexity. We show that a non-linear iteration is necessary to bound the complexity of products by the fitness of the less competitive countries exporting them. We show that, given the paradigm of economic complexity, the correct and simplest approach to measure the competitiveness of countries is the one presented in this work. Furthermore, our metric appears to be economically well-grounded.
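
    A minimal sketch of the coupled non-linear maps is given below. The toy binary export matrix, the iteration count and the normalization-by-the-mean at each step follow the description above but are illustrative choices, not the authors' code or trade data.

      # Minimal sketch of the Fitness-Complexity iteration on a toy export matrix.
      # Rows = countries, columns = products; entries are 1 if the country exports the product.
      import numpy as np

      M = np.array([[1, 1, 1, 1],    # a diversified country
                    [1, 1, 0, 0],
                    [0, 1, 0, 0],
                    [0, 0, 0, 1]], dtype=float)

      n_c, n_p = M.shape
      F = np.ones(n_c)               # country Fitness
      Q = np.ones(n_p)               # product Complexity

      for _ in range(100):
          F_new = M @ Q                          # fitness: sum of complexities of exported products
          Q_new = 1.0 / (M.T @ (1.0 / F))        # complexity bounded by the least fit exporters
          F = F_new / F_new.mean()               # normalize at every iteration
          Q = Q_new / Q_new.mean()

      print("Fitness:   ", np.round(F, 3))
      print("Complexity:", np.round(Q, 3))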

  9. Experimentally modeling stochastic processes with less memory by the use of a quantum processor

    PubMed Central

    Palsson, Matthew S.; Gu, Mile; Ho, Joseph; Wiseman, Howard M.; Pryde, Geoff J.

    2017-01-01

    Computer simulation of observable phenomena is an indispensable tool for engineering new technology, understanding the natural world, and studying human society. However, the most interesting systems are often so complex that simulating their future behavior demands storing immense amounts of information regarding how they have behaved in the past. For increasingly complex systems, simulation becomes increasingly difficult and is ultimately constrained by resources such as computer memory. Recent theoretical work shows that quantum theory can reduce this memory requirement beyond ultimate classical limits, as measured by a process’ statistical complexity, C. We experimentally demonstrate this quantum advantage in simulating stochastic processes. Our quantum implementation observes a memory requirement of Cq = 0.05 ± 0.01, far below the ultimate classical limit of C = 1. Scaling up this technique would substantially reduce the memory required in simulations of more complex systems. PMID:28168218

  10. Combining measurements to estimate properties and characterization extent of complex biochemical mixtures; applications to Heparan Sulfate

    PubMed Central

    Pradines, Joël R.; Beccati, Daniela; Lech, Miroslaw; Ozug, Jennifer; Farutin, Victor; Huang, Yongqing; Gunay, Nur Sibel; Capila, Ishan

    2016-01-01

    Complex mixtures of molecular species, such as glycoproteins and glycosaminoglycans, have important biological and therapeutic functions. Characterization of these mixtures with analytical chemistry measurements is an important step when developing generic drugs such as biosimilars. Recent developments have focused on analytical methods and statistical approaches to test similarity between mixtures. The question of how much uncertainty on mixture composition is reduced by combining several measurements still remains mostly unexplored. Mathematical frameworks to combine measurements, estimate mixture properties, and quantify remaining uncertainty, i.e. a characterization extent, are introduced here. Constrained optimization and mathematical modeling are applied to a set of twenty-three experimental measurements on heparan sulfate, a mixture of linear chains of disaccharides having different levels of sulfation. While this mixture has potentially over two million molecular species, mathematical modeling and the small set of measurements establish the existence of nonhomogeneity of sulfate level along chains and the presence of abundant sulfate repeats. Constrained optimization yields not only estimations of sulfate repeats and sulfate level at each position in the chains but also bounds on these levels, thereby estimating the extent of characterization of the sulfation pattern which is achieved by the set of measurements. PMID:27112127

  11. Combining measurements to estimate properties and characterization extent of complex biochemical mixtures; applications to Heparan Sulfate.

    PubMed

    Pradines, Joël R; Beccati, Daniela; Lech, Miroslaw; Ozug, Jennifer; Farutin, Victor; Huang, Yongqing; Gunay, Nur Sibel; Capila, Ishan

    2016-04-26

    Complex mixtures of molecular species, such as glycoproteins and glycosaminoglycans, have important biological and therapeutic functions. Characterization of these mixtures with analytical chemistry measurements is an important step when developing generic drugs such as biosimilars. Recent developments have focused on analytical methods and statistical approaches to test similarity between mixtures. The question of how much uncertainty on mixture composition is reduced by combining several measurements still remains mostly unexplored. Mathematical frameworks to combine measurements, estimate mixture properties, and quantify remaining uncertainty, i.e. a characterization extent, are introduced here. Constrained optimization and mathematical modeling are applied to a set of twenty-three experimental measurements on heparan sulfate, a mixture of linear chains of disaccharides having different levels of sulfation. While this mixture has potentially over two million molecular species, mathematical modeling and the small set of measurements establish the existence of nonhomogeneity of sulfate level along chains and the presence of abundant sulfate repeats. Constrained optimization yields not only estimations of sulfate repeats and sulfate level at each position in the chains but also bounds on these levels, thereby estimating the extent of characterization of the sulfation pattern which is achieved by the set of measurements.

  12. Combining measurements to estimate properties and characterization extent of complex biochemical mixtures; applications to Heparan Sulfate

    NASA Astrophysics Data System (ADS)

    Pradines, Joël R.; Beccati, Daniela; Lech, Miroslaw; Ozug, Jennifer; Farutin, Victor; Huang, Yongqing; Gunay, Nur Sibel; Capila, Ishan

    2016-04-01

    Complex mixtures of molecular species, such as glycoproteins and glycosaminoglycans, have important biological and therapeutic functions. Characterization of these mixtures with analytical chemistry measurements is an important step when developing generic drugs such as biosimilars. Recent developments have focused on analytical methods and statistical approaches to test similarity between mixtures. The question of how much uncertainty on mixture composition is reduced by combining several measurements still remains mostly unexplored. Mathematical frameworks to combine measurements, estimate mixture properties, and quantify remaining uncertainty, i.e. a characterization extent, are introduced here. Constrained optimization and mathematical modeling are applied to a set of twenty-three experimental measurements on heparan sulfate, a mixture of linear chains of disaccharides having different levels of sulfation. While this mixture has potentially over two million molecular species, mathematical modeling and the small set of measurements establish the existence of nonhomogeneity of sulfate level along chains and the presence of abundant sulfate repeats. Constrained optimization yields not only estimations of sulfate repeats and sulfate level at each position in the chains but also bounds on these levels, thereby estimating the extent of characterization of the sulfation pattern which is achieved by the set of measurements.

  13. Evidence of the non-extensive character of Earth's ambient noise.

    NASA Astrophysics Data System (ADS)

    Koutalonis, Ioannis; Vallianatos, Filippos

    2017-04-01

    Investigation of the dynamical features of ambient seismic noise is one of the important scientific and practical research challenges. At the same time, there is growing interest in an approach to studying Earth physics based on the science of complex systems and non-extensive statistical mechanics, a generalization of Boltzmann-Gibbs statistical physics (Vallianatos et al., 2016). This seems to be a promising framework for studying complex systems exhibiting phenomena such as long-range interactions and memory effects. In this work we use non-extensive statistical mechanics and signal analysis methods to explore the nature of ambient noise as measured at the stations of the HSNC in the South Aegean (Chatzopoulos et al., 2016). We analyzed the de-trended increment time series of ambient seismic noise X(t), in time windows of 20 minutes to 10 seconds within "calm time zones" where human-induced noise is at a minimum. Following the non-extensive statistical physics approach, the probability distribution of the increments of ambient noise was investigated. Analysis of the probability density function (PDF) p(X), normalized to zero mean and unit variance, shows that the fluctuations of Earth's ambient noise follow a q-Gaussian distribution as defined in the framework of non-extensive statistical mechanics, indicating the possible existence of memory effects in Earth's ambient noise. References: F. Vallianatos, G. Papadakis, G. Michas, Generalized statistical mechanics approaches to earthquakes and tectonics. Proc. R. Soc. A, 472, 20160497, 2016. G. Chatzopoulos, I. Papadopoulos, F. Vallianatos, The Hellenic Seismological Network of Crete (HSNC): Validation and results of the 2013 aftershock, Advances in Geosciences, 41, 65-72, 2016.
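
    A minimal sketch of fitting a q-Gaussian to normalized increments by maximum likelihood follows. The synthetic heavy-tailed data (Student-t draws), the optimizer and the starting values are illustrative assumptions, not the authors' noise records or procedure; q -> 1 recovers the ordinary Gaussian.

      # Minimal sketch: maximum-likelihood fit of a q-Gaussian (1 < q < 3) to toy increments.
      import numpy as np
      from scipy.optimize import minimize
      from scipy.special import gammaln

      def neg_log_likelihood(params, x):
          q, beta = params
          if not (1.0 < q < 3.0) or beta <= 0:
              return np.inf
          # normalization constant valid for 1 < q < 3
          log_norm = (0.5 * np.log(beta * (q - 1) / np.pi)
                      + gammaln(1.0 / (q - 1))
                      - gammaln((3.0 - q) / (2.0 * (q - 1))))
          log_pdf = log_norm - np.log1p((q - 1) * beta * x ** 2) / (q - 1)
          return -np.sum(log_pdf)

      rng = np.random.default_rng(0)
      x = rng.standard_t(df=4, size=20000)          # heavy-tailed toy "increments", not real noise
      x = (x - x.mean()) / x.std()                  # zero mean, unit variance

      res = minimize(neg_log_likelihood, x0=[1.5, 1.0], args=(x,), method="Nelder-Mead")
      q_hat, beta_hat = res.x
      print(f"fitted q = {q_hat:.2f}, beta = {beta_hat:.2f} (q = 1 would be Gaussian)")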

  14. PHOG analysis of self-similarity in aesthetic images

    NASA Astrophysics Data System (ADS)

    Amirshahi, Seyed Ali; Koch, Michael; Denzler, Joachim; Redies, Christoph

    2012-03-01

    In recent years, there have been efforts in defining the statistical properties of aesthetic photographs and artworks using computer vision techniques. However, it is still an open question how to distinguish aesthetic from non-aesthetic images with a high recognition rate. This is possibly because aesthetic perception is also influenced by a large number of cultural variables. Nevertheless, the search for statistical properties of aesthetic images has not been futile. For example, we have shown that the radially averaged power spectrum of monochrome artworks of Western and Eastern provenance falls off according to a power law with increasing spatial frequency (1/f^2 characteristics). This finding implies that this particular subset of artworks possesses a Fourier power spectrum that is self-similar across different scales of spatial resolution. Other types of aesthetic images, such as cartoons, comics and mangas, also display this type of self-similarity, as do photographs of complex natural scenes. Since the human visual system is adapted to encode images of natural scenes in a particularly efficient way, we have argued that artists imitate these statistics in their artworks. In support of this notion, we presented results showing that artists portray human faces with the self-similar Fourier statistics of complex natural scenes, although real-world photographs of faces are not self-similar. In view of these previous findings, we investigated other statistical measures of self-similarity to characterize aesthetic and non-aesthetic images. In the present work, we propose a novel measure of self-similarity that is based on the Pyramid Histogram of Oriented Gradients (PHOG). For every image, we first calculate PHOG up to pyramid level 3. The similarity between the histograms of each section at a particular level is then calculated relative to the parent section at the previous level (or to the histogram at the ground level). The proposed approach is tested on datasets of aesthetic and non-aesthetic categories of monochrome images. The aesthetic image datasets comprise a large variety of artworks of Western provenance. Other man-made aesthetically pleasing images, such as comics, cartoons and mangas, were also studied. For comparison, a database of natural scene photographs is used, as well as datasets of photographs of plants, simple objects and faces that are in general of low aesthetic value. As expected, natural scenes exhibit the highest degree of PHOG self-similarity. Images of artworks also show high self-similarity values, followed by cartoons, comics and mangas. On average, other (non-aesthetic) image categories are less self-similar in the PHOG analysis. A measure of scale-invariant self-similarity (PHOG) allows a good separation of the different aesthetic and non-aesthetic image categories. Our results provide further support for the notion that, like complex natural scenes, images of artworks display a higher degree of self-similarity across different scales of resolution than other image categories. Whether the high degree of self-similarity is the basis for the perception of beauty in both complex natural scenery and artworks remains to be investigated.

  15. Full-Counting Many-Particle Dynamics: Nonlocal and Chiral Propagation of Correlations

    NASA Astrophysics Data System (ADS)

    Ashida, Yuto; Ueda, Masahito

    2018-05-01

    The ability to measure single quanta allows the complete characterization of small quantum systems known as full-counting statistics. Quantum gas microscopy enables one to observe many-body systems at the single-atom precision. We extend the idea of full-counting statistics to nonequilibrium open many-particle dynamics and apply it to discuss the quench dynamics. By way of illustration, we consider an exactly solvable model to demonstrate the emergence of unique phenomena such as nonlocal and chiral propagation of correlations, leading to a concomitant oscillatory entanglement growth. We find that correlations can propagate beyond the conventional maximal speed, known as the Lieb-Robinson bound, at the cost of probabilistic nature of quantum measurement. These features become most prominent at the real-to-complex spectrum transition point of an underlying parity-time-symmetric effective non-Hermitian Hamiltonian. A possible experimental situation with quantum gas microscopy is discussed.

  16. A functional U-statistic method for association analysis of sequencing data.

    PubMed

    Jadhav, Sneha; Tong, Xiaoran; Lu, Qing

    2017-11-01

    Although sequencing studies hold great promise for uncovering novel variants predisposing to human diseases, the high dimensionality of the sequencing data brings tremendous challenges to data analysis. Moreover, for many complex diseases (e.g., psychiatric disorders) multiple related phenotypes are collected. These phenotypes can be different measurements of an underlying disease, or measurements characterizing multiple related diseases for studying common genetic mechanism. Although jointly analyzing these phenotypes could potentially increase the power of identifying disease-associated genes, the different types of phenotypes pose challenges for association analysis. To address these challenges, we propose a nonparametric method, functional U-statistic method (FU), for multivariate analysis of sequencing data. It first constructs smooth functions from individuals' sequencing data, and then tests the association of these functions with multiple phenotypes by using a U-statistic. The method provides a general framework for analyzing various types of phenotypes (e.g., binary and continuous phenotypes) with unknown distributions. Fitting the genetic variants within a gene using a smoothing function also allows us to capture complexities of gene structure (e.g., linkage disequilibrium, LD), which could potentially increase the power of association analysis. Through simulations, we compared our method to the multivariate outcome score test (MOST), and found that our test attained better performance than MOST. In a real data application, we apply our method to the sequencing data from Minnesota Twin Study (MTS) and found potential associations of several nicotine receptor subunit (CHRN) genes, including CHRNB3, associated with nicotine dependence and/or alcohol dependence. © 2017 WILEY PERIODICALS, INC.

  17. Equivalence Testing of Complex Particle Size Distribution Profiles Based on Earth Mover's Distance.

    PubMed

    Hu, Meng; Jiang, Xiaohui; Absar, Mohammad; Choi, Stephanie; Kozak, Darby; Shen, Meiyu; Weng, Yu-Ting; Zhao, Liang; Lionberger, Robert

    2018-04-12

    Particle size distribution (PSD) is an important property of particulates in drug products. In the evaluation of generic drug products formulated as suspensions, emulsions, and liposomes, the PSD comparisons between a test product and the branded product can provide useful information regarding in vitro and in vivo performance. Historically, the FDA has recommended the population bioequivalence (PBE) statistical approach to compare the PSD descriptors D50 and SPAN from test and reference products to support product equivalence. In this study, the earth mover's distance (EMD) is proposed as a new metric for comparing PSDs, particularly when the PSD profile exhibits a complex distribution (e.g., multiple peaks) that is not accurately described by the D50 and SPAN descriptors. EMD is a statistical metric that measures the discrepancy (distance) between size distribution profiles without a prior assumption about the distribution. PBE is then adopted to perform a statistical test to establish equivalence based on the calculated EMD distances. Simulations show that the proposed EMD-based approach is effective in comparing test and reference profiles for equivalence testing and is superior to commonly used distance measures, e.g., Euclidean and Kolmogorov-Smirnov distances. The proposed approach was demonstrated by evaluating the equivalence of cyclosporine ophthalmic emulsion PSDs that were manufactured under different conditions. Our results show that the proposed approach can effectively pass an equivalent product (e.g., reference product against itself) and reject an inequivalent product (e.g., reference product against negative control), thus suggesting its usefulness in supporting bioequivalence determination of a test product against the reference product when both possess multimodal PSDs.
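
    A minimal sketch of the distance calculation is given below, using scipy's one-dimensional Wasserstein distance as a stand-in for the EMD between binned PSD profiles (bin centres as locations, relative frequencies as weights). The bimodal profiles are synthetic illustrations, not product data, and the full PBE test on top of the distances is omitted.

      # Minimal sketch: earth mover's distance between binned particle size distributions.
      import numpy as np
      from scipy.stats import wasserstein_distance

      size_bins = np.linspace(0.1, 2.0, 40)                    # particle size in micrometres

      def bimodal(mu1, mu2, w):
          p = (w * np.exp(-0.5 * ((size_bins - mu1) / 0.10) ** 2)
               + (1 - w) * np.exp(-0.5 * ((size_bins - mu2) / 0.15) ** 2))
          return p / p.sum()

      reference = bimodal(0.5, 1.2, 0.6)
      test_similar = bimodal(0.52, 1.18, 0.58)                 # slight shift: should be "close"
      test_different = bimodal(0.7, 1.5, 0.4)                  # larger shift: should be "far"

      for label, profile in [("similar", test_similar), ("different", test_different)]:
          emd = wasserstein_distance(size_bins, size_bins,
                                     u_weights=reference, v_weights=profile)
          print(f"EMD(reference, {label}) = {emd:.4f} um")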

  18. Estimating statistical isotropy violation in CMB due to non-circular beam and complex scan in minutes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pant, Nidhi; Das, Santanu; Mitra, Sanjit

    Mild, unavoidable deviations from circular symmetry of instrumental beams, along with the scan strategy, can give rise to measurable Statistical Isotropy (SI) violation in Cosmic Microwave Background (CMB) experiments. If not accounted for properly, this spurious signal can complicate the extraction of other SI violation signals (if any) in the data. However, estimation of this effect through exact numerical simulation is computationally intensive and time consuming. A generalized analytical formalism not only provides a quick way of estimating this signal, but also gives a detailed understanding connecting the leading beam anisotropy components to a measurable BipoSH characterisation of SI violation. In this paper, we provide an approximate generic analytical method for estimating the SI violation generated due to a non-circular (NC) beam and arbitrary scan strategy, in terms of the Bipolar Spherical Harmonic (BipoSH) spectra. Our analytical method can predict almost all the features introduced by a NC beam in a complex scan and thus reduces the need for extensive numerical simulations worth tens of thousands of CPU hours to minutes-long calculations. As an illustrative example, we use WMAP beams and scanning strategy to demonstrate the ease, usability and efficiency of our method. We test all our analytical results against those from exact numerical simulations.

  19. Causes and correlations in cambium phenology: towards an integrated framework of xylogenesis.

    PubMed

    Rossi, Sergio; Morin, Hubert; Deslauriers, Annie

    2012-03-01

    Although habitually considered as a whole, xylogenesis is a complex process of division and maturation of a pool of cells where the relationship between the phenological phases generating such a growth pattern remains essentially unknown. This study investigated the causal relationships in cambium phenology of black spruce [Picea mariana (Mill.) BSP] monitored for 8 years on four sites of the boreal forest of Quebec, Canada. The dependency links connecting the timing of xylem cell differentiation and cell production were defined and the resulting causal model was analysed with d-sep tests and generalized mixed models with repeated measurements, and tested with Fisher's C statistics to determine whether and how causality propagates through the measured variables. The higher correlations were observed between the dates of emergence of the first developing cells and between the ending of the differentiation phases, while the number of cells was significantly correlated with all phenological phases. The model with eight dependency links was statistically valid for explaining the causes and correlations between the dynamics of cambium phenology. Causal modelling suggested that the phenological phases involved in xylogenesis are closely interconnected by complex relationships of cause and effect, with the onset of cell differentiation being the main factor directly or indirectly triggering all successive phases of xylem maturation.

  20. Causes and correlations in cambium phenology: towards an integrated framework of xylogenesis

    PubMed Central

    Rossi, Sergio; Morin, Hubert; Deslauriers, Annie

    2012-01-01

    Although habitually considered as a whole, xylogenesis is a complex process of division and maturation of a pool of cells where the relationship between the phenological phases generating such a growth pattern remains essentially unknown. This study investigated the causal relationships in cambium phenology of black spruce [Picea mariana (Mill.) BSP] monitored for 8 years on four sites of the boreal forest of Quebec, Canada. The dependency links connecting the timing of xylem cell differentiation and cell production were defined and the resulting causal model was analysed with d-sep tests and generalized mixed models with repeated measurements, and tested with Fisher’s C statistics to determine whether and how causality propagates through the measured variables. The higher correlations were observed between the dates of emergence of the first developing cells and between the ending of the differentiation phases, while the number of cells was significantly correlated with all phenological phases. The model with eight dependency links was statistically valid for explaining the causes and correlations between the dynamics of cambium phenology. Causal modelling suggested that the phenological phases involved in xylogenesis are closely interconnected by complex relationships of cause and effect, with the onset of cell differentiation being the main factor directly or indirectly triggering all successive phases of xylem maturation. PMID:22174441

  1. Calculations of proton-binding thermodynamics in proteins.

    PubMed

    Beroza, P; Case, D A

    1998-01-01

    Computational models of proton binding can range from the chemically complex and statistically simple (as in the quantum calculations) to the chemically simple and statistically complex. Much progress has been made in the multiple-site titration problem. Calculations have improved with the inclusion of more flexibility in regard to both the geometry of the proton binding and the larger scale protein motions associated with titration. This article concentrated on the principles of current calculations, but did not attempt to survey their quantitative performance. This is (1) because such comparisons are given in the cited papers and (2) because continued developments in understanding conformational flexibility and interaction energies will be needed to develop robust methods with strong predictive power. Nevertheless, the advances achieved over the past few years should not be underestimated: serious calculations of protonation behavior and its coupling to conformational change can now be confidently pursued against a backdrop of increasing understanding of the strengths and limitations of such models. It is hoped that such theoretical advances will also spur renewed experimental interest in measuring both overall titration curves and individual pKa values or pKa shifts. Exploration of the shapes of individual titration curves (as measured by Hill coefficients and other parameters) would also be useful in assessing the accuracy of computations and in drawing connections to functional behavior.

  2. Ariadne's Thread: A Robust Software Solution Leading to Automated Absolute and Relative Quantification of SRM Data.

    PubMed

    Nasso, Sara; Goetze, Sandra; Martens, Lennart

    2015-09-04

    Selected reaction monitoring (SRM) MS is a highly selective and sensitive technique to quantify protein abundances in complex biological samples. To enhance the pace of large SRM studies, a validated, robust method to fully automate absolute quantification and to substitute for interactive evaluation would be valuable. To address this demand, we present Ariadne, a Matlab software tool. To quantify monitored targets, Ariadne exploits metadata imported from the transition lists, and targets can be filtered according to mProphet output. Signal processing and statistical learning approaches are combined to compute peptide quantifications. To robustly estimate absolute abundances, the external calibration curve method is applied, ensuring linearity over the measured dynamic range. Ariadne was benchmarked against mProphet and Skyline by comparing its quantification performance on three different dilution series, featuring either noisy/smooth traces without background or smooth traces with complex background. Results, evaluated as efficiency, linearity, accuracy, and precision of quantification, showed that Ariadne's performance is independent of data smoothness and complex background presence and that Ariadne outperforms mProphet on the noisier data set and improves Skyline's accuracy and precision 2-fold for the lowest-abundance dilution with complex background. Remarkably, Ariadne could statistically distinguish all the different abundances from each other, discriminating dilutions as low as 0.1 and 0.2 fmol. These results suggest that Ariadne offers reliable and automated analysis of large-scale SRM differential expression studies.
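
    A minimal sketch of the external calibration curve idea follows: fit measured response against known spiked amounts, check linearity, and invert the fit to obtain absolute abundances for unknowns. The numbers are synthetic and the single linear fit is a simplification of Ariadne's actual pipeline.

      # Minimal sketch of an external calibration curve for absolute quantification.
      import numpy as np

      spiked_fmol = np.array([0.1, 0.2, 0.5, 1.0, 2.0, 5.0, 10.0])        # calibration points (synthetic)
      peak_area = np.array([2.1e3, 4.0e3, 9.8e3, 2.05e4, 4.1e4, 1.02e5, 2.0e5])

      slope, intercept = np.polyfit(spiked_fmol, peak_area, 1)
      predicted = slope * spiked_fmol + intercept
      r2 = 1 - np.sum((peak_area - predicted) ** 2) / np.sum((peak_area - peak_area.mean()) ** 2)
      print(f"calibration: area = {slope:.3g} * fmol + {intercept:.3g}  (R^2 = {r2:.4f})")

      unknown_area = np.array([3.2e3, 7.5e4])                             # measured in unknown samples
      print("estimated amounts (fmol):", (unknown_area - intercept) / slope)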

  3. In vivo and in situ measurement and modelling of intra-body effective complex permittivity

    PubMed Central

    Blanes-Vidal, Victoria; Harslund, Jakob L.F.; Ramezani, Mohammad H.; Kjeldsen, Jens; Johansen, Per Michael; Thiel, David; Tarokh, Vahid

    2015-01-01

    Radio frequency tracking of medical micro-robots in minimally invasive medicine is usually investigated upon the assumption that the human body is a homogeneous propagation medium. In this Letter, the authors conducted various trial programs to measure and model the effective complex permittivity ε in terms of refraction ε′, absorption ε″ and their variations in gastrointestinal (GI) tract organs (i.e. oesophagus, stomach, small intestine and large intestine) and the porcine abdominal wall under in vivo and in situ conditions. They further investigated the effects of irregular and unsynchronised contractions and simulated peristaltic movements of the GI tract organs inside the abdominal cavity and in the presence of the abdominal wall on the measurements and variations of ε′ and ε′′. They advanced the previous models of effective complex permittivity of a multilayer inhomogeneous medium, by estimating an analytical model that accounts for reflections between the layers and calculates the attenuation that the wave encounters as it traverses the GI tract and the abdominal wall. They observed that deviation from the specified nominal layer thicknesses due to non-geometric boundaries of GI tract morphometric variables has an impact on the performance of the authors’ model. Therefore, they derived statistical-based models for ε′ and ε′′ using their experimental measurements. PMID:26713157

  4. In vivo and in situ measurement and modelling of intra-body effective complex permittivity.

    PubMed

    Nadimi, Esmaeil S; Blanes-Vidal, Victoria; Harslund, Jakob L F; Ramezani, Mohammad H; Kjeldsen, Jens; Johansen, Per Michael; Thiel, David; Tarokh, Vahid

    2015-12-01

    Radio frequency tracking of medical micro-robots in minimally invasive medicine is usually investigated upon the assumption that the human body is a homogeneous propagation medium. In this Letter, the authors conducted various trial programs to measure and model the effective complex permittivity ε in terms of refraction ε', absorption ε″ and their variations in gastrointestinal (GI) tract organs (i.e. oesophagus, stomach, small intestine and large intestine) and the porcine abdominal wall under in vivo and in situ conditions. They further investigated the effects of irregular and unsynchronised contractions and simulated peristaltic movements of the GI tract organs inside the abdominal cavity and in the presence of the abdominal wall on the measurements and variations of ε' and ε''. They advanced the previous models of effective complex permittivity of a multilayer inhomogeneous medium, by estimating an analytical model that accounts for reflections between the layers and calculates the attenuation that the wave encounters as it traverses the GI tract and the abdominal wall. They observed that deviation from the specified nominal layer thicknesses due to non-geometric boundaries of GI tract morphometric variables has an impact on the performance of the authors' model. Therefore, they derived statistical-based models for ε' and ε'' using their experimental measurements.

  5. Financial instability from local market measures

    NASA Astrophysics Data System (ADS)

    Bardoscia, Marco; Livan, Giacomo; Marsili, Matteo

    2012-08-01

    We study the emergence of instabilities in a stylized model of a financial market, when different market actors calculate prices according to different (local) market measures. We derive typical properties for ensembles of large random markets using techniques borrowed from statistical mechanics of disordered systems. We show that, depending on the number of financial instruments available and on the heterogeneity of local measures, the market moves from an arbitrage-free phase to an unstable one, where the complexity of the market—as measured by the diversity of financial instruments—increases, and arbitrage opportunities arise. A sharp transition separates the two phases. Focusing on two different classes of local measures inspired by real market strategies, we are able to analytically compute the critical lines, corroborating our findings with numerical simulations.

  6. Using complexity metrics with R-R intervals and BPM heart rate measures

    PubMed Central

    Wallot, Sebastian; Fusaroli, Riccardo; Tylén, Kristian; Jegindø, Else-Marie

    2013-01-01

    Lately, growing attention in the health sciences has been paid to the dynamics of heart rate as an indicator of impending failures and for prognoses. Likewise, in the social and cognitive sciences, heart rate is increasingly employed as a measure of arousal and emotional engagement and as a marker of interpersonal coordination. However, there is no consensus about which measurements and analytical tools are most appropriate for mapping the temporal dynamics of heart rate, and quite different metrics are reported in the literature. As complexity metrics of heart rate variability depend critically on the variability of the data, different choices regarding the kind of measures can have a substantial impact on the results. In this article we compare linear and non-linear statistics on two prominent types of heart beat data, beat-to-beat intervals (R-R interval) and beats-per-min (BPM). As a proof-of-concept, we employ a simple rest-exercise-rest task and show that non-linear statistics—fractal (DFA) and recurrence (RQA) analyses—reveal information about heart beat activity above and beyond the simple level of heart rate. Non-linear statistics unveil sustained post-exercise effects on heart rate dynamics, but their power to do so critically depends on the type of data employed: while R-R intervals are very susceptible to non-linear analyses, the success of non-linear methods for BPM data critically depends on their construction. Generally, “oversampled” BPM time-series can be recommended as they retain most of the information about non-linear aspects of heart beat dynamics. PMID:23964244
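
    A minimal sketch of detrended fluctuation analysis (DFA), one of the two non-linear methods named above, is given below; the scaling exponent alpha summarizes fractal structure (roughly 0.5 for uncorrelated data, near 1.0 for 1/f-like dynamics). The R-R series, box sizes and linear detrending order are illustrative assumptions, not the study's data or settings.

      # Minimal sketch of DFA applied to a synthetic R-R interval series.
      import numpy as np

      def dfa_alpha(x, box_sizes):
          y = np.cumsum(x - np.mean(x))               # integrated, mean-removed profile
          fluctuations = []
          for n in box_sizes:
              n_boxes = len(y) // n
              f2 = []
              for i in range(n_boxes):
                  seg = y[i * n:(i + 1) * n]
                  t = np.arange(n)
                  trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear detrending
                  f2.append(np.mean((seg - trend) ** 2))
              fluctuations.append(np.sqrt(np.mean(f2)))
          return np.polyfit(np.log(box_sizes), np.log(fluctuations), 1)[0]

      rng = np.random.default_rng(0)
      rr = 800 + np.cumsum(rng.normal(0, 2, size=3000)) * 0.2 + rng.normal(0, 10, size=3000)

      boxes = np.unique(np.logspace(np.log10(4), np.log10(300), 15).astype(int))
      print(f"DFA alpha = {dfa_alpha(rr, boxes):.2f}")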

  7. Towards an automatic wind speed and direction profiler for Wide Field adaptive optics systems

    NASA Astrophysics Data System (ADS)

    Sivo, G.; Turchi, A.; Masciadri, E.; Guesalaga, A.; Neichel, B.

    2018-05-01

    Wide Field Adaptive Optics (WFAO) systems are among the most sophisticated adaptive optics (AO) systems available today on large telescopes. Knowledge of the vertical spatio-temporal distribution of wind speed (WS) and direction (WD) is fundamental to optimize the performance of such systems. Previous studies already proved that the Gemini Multi-Conjugated AO system (GeMS) is able to retrieve measurements of the WS and WD stratification using the SLOpe Detection And Ranging (SLODAR) technique and to store measurements in the telemetry data. In order to assess the reliability of these estimates and of the SLODAR technique applied to such complex AO systems, in this study we compared WS and WD values retrieved from GeMS with those obtained with the atmospheric model Meso-NH on a rich statistical sample of nights. It has previously been proved that the latter technique provided excellent agreement with a large sample of radiosoundings, both in statistical terms and on individual flights. It can be considered, therefore, as an independent reference. The excellent agreement between GeMS measurements and the model that we find in this study proves the robustness of the SLODAR approach. To bypass the complex procedures necessary to achieve automatic measurements of the wind with GeMS, we propose a simple automatic method to monitor nightly WS and WD using Meso-NH model estimates. Such a method can be applied to whatever present or new-generation facilities are supported by WFAO systems. The interest of this study is, therefore, well beyond the optimization of GeMS performance.

  8. Admixture, Population Structure, and F-Statistics.

    PubMed

    Peter, Benjamin M

    2016-04-01

    Many questions about human genetic history can be addressed by examining the patterns of shared genetic variation between sets of populations. A useful methodological framework for this purpose is F-statistics, which measure shared genetic drift between sets of two, three, and four populations and can be used to test simple and complex hypotheses about admixture between populations. This article provides context from phylogenetic and population genetic theory. I review how F-statistics can be interpreted as branch lengths or paths and derive new interpretations, using coalescent theory. I further show that the admixture tests can be interpreted as testing general properties of phylogenies, allowing the extension of some of these ideas and applications to arbitrary phylogenetic trees. The new results are used to investigate the behavior of the statistics under different models of population structure and show how population substructure complicates inference. The results lead to simplified estimators in many cases, and I recommend replacing F3 with the average number of pairwise differences for estimating population divergence. Copyright © 2016 by the Genetics Society of America.
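
    A minimal sketch of the f3 admixture statistic, f3(C; A, B) = E[(c - a)(c - b)] over many SNP allele frequencies, is given below; a significantly negative value is evidence that C is admixed between sources related to A and B. The simulated frequencies and the naive standard error are illustrative; real analyses correct for sampling noise in C and use a block jackknife.

      # Minimal sketch of the f3 statistic on simulated allele frequencies.
      import numpy as np

      rng = np.random.default_rng(0)
      n_snps = 50_000

      a = rng.uniform(0.05, 0.95, n_snps)                         # allele frequencies in source A
      b = np.clip(a + rng.normal(0, 0.15, n_snps), 0.01, 0.99)    # a diverged source B
      c = 0.5 * a + 0.5 * b + rng.normal(0, 0.02, n_snps)         # C: a 50/50 admixture of A and B

      f3_values = (c - a) * (c - b)
      f3 = f3_values.mean()
      se = f3_values.std(ddof=1) / np.sqrt(n_snps)                # naive SE, for illustration only
      print(f"f3(C; A, B) = {f3:.5f}  (Z = {f3 / se:.1f}; negative => admixture signal)")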

  9. THE MEASUREMENT OF BONE QUALITY USING GRAY LEVEL CO-OCCURRENCE MATRIX TEXTURAL FEATURES.

    PubMed

    Shirvaikar, Mukul; Huang, Ning; Dong, Xuanliang Neil

    2016-10-01

    In this paper, statistical methods for the estimation of bone quality to predict the risk of fracture are reported. Bone mineral density and bone architecture properties are the main contributors to bone quality. Dual-energy X-ray Absorptiometry (DXA) is the traditional clinical measurement technique for bone mineral density, but it does not include architectural information to enhance the prediction of bone fragility. Other modalities are not practical due to cost and access considerations. This study investigates statistical parameters based on the Gray Level Co-occurrence Matrix (GLCM) extracted from two-dimensional projection images and explores links with architectural properties and bone mechanics. Data analysis was conducted on Micro-CT images of 13 trabecular bones (with an in-plane spatial resolution of about 50 μm). Ground truth data for bone volume fraction (BV/TV), bone strength and modulus were available based on complex 3D analysis and mechanical tests. Correlation between the statistical parameters and biomechanical test results was studied using regression analysis. The results showed that Cluster-Shade was strongly correlated with the microarchitecture of the trabecular bone and related to mechanical properties. Once the principal thesis of utilizing second-order statistics is established, it can be extended to other modalities, providing cost and convenience advantages for patients and doctors.
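    The sketch below illustrates the kind of second-order (GLCM) texture statistics mentioned above, for a single pixel offset on a grayscale image. It is a simplified stand-in, not the study's feature pipeline; the quantization level, the offset, and the cluster-shade formula used here are common textbook choices and are assumptions on my part.

```python
import numpy as np

def glcm_features(img, levels=16, dx=1, dy=0):
    """Simplified GLCM statistics (contrast, cluster shade) for one pixel offset."""
    # quantize the image to a small number of grey levels
    q = np.floor(levels * (img - img.min()) / (np.ptp(img) + 1e-12)).astype(int)
    q = np.clip(q, 0, levels - 1)
    glcm = np.zeros((levels, levels))
    a = q[:q.shape[0] - dy, :q.shape[1] - dx]
    b = q[dy:, dx:]
    np.add.at(glcm, (a.ravel(), b.ravel()), 1)   # accumulate co-occurrence counts
    p = glcm / glcm.sum()
    i, j = np.indices(p.shape)
    mu_i, mu_j = (i * p).sum(), (j * p).sum()
    contrast = ((i - j) ** 2 * p).sum()
    cluster_shade = (((i + j - mu_i - mu_j) ** 3) * p).sum()
    return contrast, cluster_shade

# toy usage on a random "projection image"
rng = np.random.default_rng(0)
print(glcm_features(rng.random((64, 64))))
```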

  10. THE MEASUREMENT OF BONE QUALITY USING GRAY LEVEL CO-OCCURRENCE MATRIX TEXTURAL FEATURES

    PubMed Central

    Shirvaikar, Mukul; Huang, Ning; Dong, Xuanliang Neil

    2016-01-01

    In this paper, statistical methods for the estimation of bone quality to predict the risk of fracture are reported. Bone mineral density and bone architecture properties are the main contributors to bone quality. Dual-energy X-ray Absorptiometry (DXA) is the traditional clinical measurement technique for bone mineral density, but it does not include architectural information to enhance the prediction of bone fragility. Other modalities are not practical due to cost and access considerations. This study investigates statistical parameters based on the Gray Level Co-occurrence Matrix (GLCM) extracted from two-dimensional projection images and explores links with architectural properties and bone mechanics. Data analysis was conducted on Micro-CT images of 13 trabecular bones (with an in-plane spatial resolution of about 50 μm). Ground truth data for bone volume fraction (BV/TV), bone strength and modulus were available based on complex 3D analysis and mechanical tests. Correlation between the statistical parameters and biomechanical test results was studied using regression analysis. The results showed that Cluster-Shade was strongly correlated with the microarchitecture of the trabecular bone and related to mechanical properties. Once the principal thesis of utilizing second-order statistics is established, it can be extended to other modalities, providing cost and convenience advantages for patients and doctors. PMID:28042512

  11. A Not-So-Fundamental Limitation on Studying Complex Systems with Statistics: Comment on Rabin (2011)

    NASA Astrophysics Data System (ADS)

    Thomas, Drew M.

    2012-12-01

    Although living organisms are affected by many interrelated and unidentified variables, this complexity does not automatically impose a fundamental limitation on statistical inference. Nor need one invoke such complexity as an explanation of the "Truth Wears Off" or "decline" effect; similar "decline" effects occur with far simpler systems studied in physics. Selective reporting and publication bias, and scientists' biases in favor of reporting eye-catching results (in general) or conforming to others' results (in physics) better explain this feature of the "Truth Wears Off" effect than Rabin's suggested limitation on statistical inference.

  12. Airflows and turbulent flux measurements in mountainous terrain: Part 1. Canopy and local effects

    USGS Publications Warehouse

    Turnipseed, Andrew A.; Anderson, Dean E.; Blanken, Peter D.; Baugh, William M.; Monson, Russell K.

    2003-01-01

    We have studied the effects of local topography and canopy structure on turbulent flux measurements at a site located in mountainous terrain within a subalpine, coniferous forest. Our primary aim was to determine whether the complex terrain of the site affects the accuracy of eddy flux measurements from a practical perspective. We observed displacement heights, roughness lengths, spectral peaks, turbulent length scales, and profiles of turbulent intensities that were comparable in magnitude and pattern to those reported for forest canopies in simpler terrain. We conclude that in many of these statistical measures, the local canopy exerts considerably more influence than does topographical complexity. Lack of vertical flux divergence and modeling suggests that the flux footprints for the site are within the standards acceptable for the application of flux statistics. We investigated three different methods of coordinate rotation: double rotation (DR), triple rotation (TR), and planar-fit rotation (PF). Significant variability in rotation angles at low wind speeds was encountered with the commonly used DR and TR methods, as opposed to the PF method, causing some overestimation of the fluxes. However, these differences in fluxes were small when applied to large datasets involving sensible heat and CO2 fluxes. We observed evidence of frequent drainage flows near the ground during stable, stratified conditions at night. Concurrent with the appearance of these flows, we observed a positive bias in the mean vertical wind speed, presumably due to subtle topographic variations inducing a flow convergence below the measurement sensors. In the presence of such drainage flows, advection of scalars and non-zero bias in the mean vertical wind speed can complicate closure of the mass conservation budget at the site.

  13. Improved biovolume estimation of Microcystis aeruginosa colonies: A statistical approach.

    PubMed

    Alcántara, I; Piccini, C; Segura, A M; Deus, S; González, C; Martínez de la Escalera, G; Kruk, C

    2018-05-27

    The Microcystis aeruginosa complex (MAC) clusters many of the most common freshwater and brackish bloom-forming cyanobacteria. In monitoring protocols, biovolume estimation is a common approach to determining the biomass of MAC colonies and is useful for prediction purposes. Biovolume (μm³ mL⁻¹) is calculated by multiplying organism abundance (org L⁻¹) by colonial volume (μm³ org⁻¹). Colonial volume is estimated based on geometric shapes and requires accurate measurements of dimensions using optical microscopy. This poses a trade-off between easy-to-measure but low-accuracy simple shapes (e.g. sphere) and time-costly but high-accuracy complex shapes (e.g. ellipsoid). The effects of overestimation on ecological studies and on management decisions associated with harmful blooms are significant because of the large size of MAC colonies. In this work, we aimed to increase the precision of MAC biovolume estimations by developing a statistical model based on two easy-to-measure dimensions. We analyzed field data from a wide environmental gradient (800 km) spanning freshwater to estuarine and seawater. We measured length, width and depth from ca. 5700 colonies under an inverted microscope and estimated colonial volume using three different recommended geometrical shapes (sphere, prolate spheroid and ellipsoid). Because of the non-spherical shape of MAC, the ellipsoid gave the most accurate approximation, whereas the sphere overestimated colonial volume (3-80), especially for large colonies (MLD higher than 300 μm). The ellipsoid, however, requires measuring three dimensions and is time-consuming. Therefore, we constructed different statistical models to predict organism depth based on length and width. Splitting the data into training (2/3) and test (1/3) sets, all models resulted in low average training (1.41-1.44%) and testing (1.3-2.0%) errors. The models were also evaluated using three other independent datasets. The multiple linear model was finally selected to calculate MAC volume as an ellipsoid based on length and width. This work contributes to a better estimation of MAC volume applicable to monitoring programs as well as to ecological research. Copyright © 2017. Published by Elsevier B.V.
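    The following sketch shows the overall idea in code: fit a multiple linear model predicting colony depth from length and width, then use the predicted depth in the ellipsoid volume formula V = (π/6)·L·W·D. The training data, coefficients, and the assumed depth-width relation are synthetic placeholders, not the authors' fitted model.

```python
import numpy as np

# hypothetical training data: measured length, width, depth (in micrometres) of colonies
rng = np.random.default_rng(2)
L = rng.uniform(50, 400, 300)
W = L * rng.uniform(0.5, 0.9, L.size)
D = 0.7 * W + rng.normal(0, 10, L.size)      # assumed "true" relation for the toy data

# fit depth ~ a + b*length + c*width (a multiple linear model, as in the abstract)
X = np.column_stack([np.ones_like(L), L, W])
coef, *_ = np.linalg.lstsq(X, D, rcond=None)

def ellipsoid_volume(length, width):
    depth = coef @ np.array([1.0, length, width])   # predicted third dimension
    return np.pi / 6.0 * length * width * depth     # ellipsoid volume in cubic micrometres

print(ellipsoid_volume(200.0, 150.0))
```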

  14. Power-law statistics of neurophysiological processes analyzed using short signals

    NASA Astrophysics Data System (ADS)

    Pavlova, Olga N.; Runnova, Anastasiya E.; Pavlov, Alexey N.

    2018-04-01

    We discuss the problem of quantifying power-law statistics of complex processes from short signals. Based on the analysis of electroencephalograms (EEG), we compare three interrelated approaches that enable characterization of the power spectral density (PSD) and show that application of detrended fluctuation analysis (DFA) or the wavelet-transform modulus maxima (WTMM) method represents a useful way of indirectly characterizing PSD features from short data sets. We conclude that although DFA- and WTMM-based measures can be obtained from the estimated PSD, these tools outperform standard spectral analysis when the analyzed regime must be characterized from a very limited amount of data.
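    For comparison with the indirect (DFA/WTMM) route, the sketch below shows the direct route criticized for short records: fit the log-PSD slope from a Welch estimate. The function name and the expectation that the exponent relates to the DFA exponent roughly as beta = 2*alpha - 1 for fractional-noise-like signals are stated as assumptions, not results from this paper.

```python
import numpy as np
from scipy.signal import welch

def psd_exponent(x, fs=1.0):
    """Fit log-PSD vs log-frequency to estimate the power-law exponent beta, S(f) ~ f**(-beta)."""
    f, pxx = welch(x, fs=fs, nperseg=min(256, len(x)))
    keep = f > 0
    return -np.polyfit(np.log(f[keep]), np.log(pxx[keep]), 1)[0]

# white noise should give beta near 0; correlated (fractal) noise gives larger beta
rng = np.random.default_rng(3)
print(psd_exponent(rng.standard_normal(1024)))
```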

  15. The non-statistical dynamics of the 18O + 32O2 isotope exchange reaction at two energies

    NASA Astrophysics Data System (ADS)

    Van Wyngarden, Annalise L.; Mar, Kathleen A.; Quach, Jim; Nguyen, Anh P. Q.; Wiegel, Aaron A.; Lin, Shi-Ying; Lendvay, Gyorgy; Guo, Hua; Lin, Jim J.; Lee, Yuan T.; Boering, Kristie A.

    2014-08-01

    The dynamics of the 18O(3P) + 32O2 isotope exchange reaction were studied using crossed atomic and molecular beams at collision energies (Ecoll) of 5.7 and 7.3 kcal/mol, and experimental results were compared with quantum statistical (QS) and quasi-classical trajectory (QCT) calculations on the O3(X1A') potential energy surface (PES) of Babikov et al. [D. Babikov, B. K. Kendrick, R. B. Walker, R. T. Pack, P. Fleurat-Lesard, and R. Schinke, J. Chem. Phys. 118, 6298 (2003)]. In both QS and QCT calculations, agreement with experiment was markedly improved by performing calculations with the experimental distribution of collision energies instead of fixed at the average collision energy. At both collision energies, the scattering displayed a forward bias, with a smaller bias at the lower Ecoll. Comparisons with the QS calculations suggest that 34O2 is produced with a non-statistical rovibrational distribution that is hotter than predicted, and the discrepancy is larger at the lower Ecoll. If this underprediction of rovibrational excitation by the QS method is not due to PES errors and/or to non-adiabatic effects not included in the calculations, then this collision energy dependence is opposite to what might be expected based on collision complex lifetime arguments and opposite to that measured for the forward bias. While the QCT calculations captured the experimental product vibrational energy distribution better than the QS method, the QCT results underpredicted rotationally excited products, overpredicted forward-bias and predicted a trend in the strength of forward-bias with collision energy opposite to that measured, indicating that it does not completely capture the dynamic behavior measured in the experiment. Thus, these results further underscore the need for improvement in theoretical treatments of dynamics on the O3(X1A') PES and perhaps of the PES itself in order to better understand and predict non-statistical effects in this reaction and in the formation of ozone (in which the intermediate O3* complex is collisionally stabilized by a third body). The scattering data presented here at two different collision energies provide important benchmarks to guide these improvements.

  16. Acute effect of a complex training protocol of back squats on 30-m sprint times of elite male military athletes

    PubMed Central

    Ojeda, Álvaro Huerta; Ríos, Luis Chirosa; Barrilao, Rafael Guisado; Serrano, Pablo Cáceres

    2016-01-01

    [Purpose] The aim of this study was to determine the acute temporal effect of a complex training protocol on 30-meter sprint times. A secondary objective was to evaluate the fatigue indexes of military athletes. [Subjects and Methods] Seven military athletes were the subjects of this study. The variables measured were 30-meter sprint times and the average power and peak power of squats. The intervention session with complex training consisted of 4 sets of 5 repetitions at 30% 1RM + 4 repetitions at 60% 1RM + 3 repetitions of 30 meters, with 120-second rests. Repeated-measures ANOVA was used for the statistical analysis, and Student’s t-test for the post hoc analysis. [Results] Times in the 30-meter sprint showed a significant reduction between the control set and the four experimental sets, but the average power and peak power of squats did not show significant changes. [Conclusion] The results of the study show the acute positive effect of complex training, over time, on 30-meter sprint times in military athletes. This effect is due to post-activation potentiation of the lower-limb muscles in the 30-meter sprint. PMID:27134353

  17. Acute effect of a complex training protocol of back squats on 30-m sprint times of elite male military athletes.

    PubMed

    Ojeda, Álvaro Huerta; Ríos, Luis Chirosa; Barrilao, Rafael Guisado; Serrano, Pablo Cáceres

    2016-03-01

    [Purpose] The aim of this study was to determine the acute temporal effect of a complex training protocol on 30-meter sprint times. A secondary objective was to evaluate the fatigue indexes of military athletes. [Subjects and Methods] Seven military athletes were the subjects of this study. The variables measured were 30-meter sprint times and the average power and peak power of squats. The intervention session with complex training consisted of 4 sets of 5 repetitions at 30% 1RM + 4 repetitions at 60% 1RM + 3 repetitions of 30 meters, with 120-second rests. Repeated-measures ANOVA was used for the statistical analysis, and Student's t-test for the post hoc analysis. [Results] Times in the 30-meter sprint showed a significant reduction between the control set and the four experimental sets, but the average power and peak power of squats did not show significant changes. [Conclusion] The results of the study show the acute positive effect of complex training, over time, on 30-meter sprint times in military athletes. This effect is due to post-activation potentiation of the lower-limb muscles in the 30-meter sprint.

  18. A Bayesian approach for parameter estimation and prediction using a computationally intensive model

    DOE PAGES

    Higdon, Dave; McDonnell, Jordan D.; Schunck, Nicolas; ...

    2015-02-05

    Bayesian methods have been successful in quantifying uncertainty in physics-based problems in parameter estimation and prediction. In these cases, physical measurements y are modeled as the best fit of a physics-based model η(θ), where θ denotes the uncertain, best input setting. Hence the statistical model is of the form y = η(θ) + ε, where ε accounts for measurement, and possibly other, error sources. When nonlinearity is present in η(·), the resulting posterior distribution for the unknown parameters in the Bayesian formulation is typically complex and nonstandard, requiring computationally demanding approaches such as Markov chain Monte Carlo (MCMC) to produce multivariate draws from the posterior. Although generally applicable, MCMC requires thousands (or even millions) of evaluations of the physics model η(·). This requirement is problematic if the model takes hours or days to evaluate. To overcome this computational bottleneck, we present an approach adapted from Bayesian model calibration. This approach combines output from an ensemble of computational model runs with physical measurements, within a statistical formulation, to carry out inference. A key component of this approach is a statistical response surface, or emulator, estimated from the ensemble of model runs. We demonstrate this approach with a case study in estimating parameters for a density functional theory model, using experimental mass/binding energy measurements from a collection of atomic nuclei. Lastly, we also demonstrate how this approach produces uncertainties in predictions for recent mass measurements obtained at Argonne National Laboratory.
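    A toy end-to-end sketch of the calibration idea described above: an ensemble of "expensive" model runs is summarized by a cheap emulator, and MCMC is run against the emulator rather than the model itself. The stand-in model, the quadratic emulator, and the Metropolis settings are all simplifying assumptions; the paper's emulator is a statistical response surface, not this polynomial.

```python
import numpy as np

rng = np.random.default_rng(4)

def expensive_model(theta, x):
    """Stand-in for a slow physics model eta(theta); here just an analytic curve."""
    return np.exp(-theta * x)

# 1) ensemble of model runs at a handful of design settings
x_obs = np.linspace(0, 2, 15)
design = np.linspace(0.2, 2.0, 8)
runs = np.array([expensive_model(t, x_obs) for t in design])

# 2) cheap emulator: quadratic-in-theta fit at every output location
A = np.column_stack([np.ones_like(design), design, design ** 2])
coef = np.linalg.lstsq(A, runs, rcond=None)[0]            # shape (3, len(x_obs))
emulate = lambda t: np.array([1.0, t, t * t]) @ coef

# 3) synthetic data and a simple Metropolis sampler over the emulator
y = expensive_model(0.9, x_obs) + rng.normal(0, 0.02, x_obs.size)

def log_post(t, sigma=0.02):
    if not 0.2 <= t <= 2.0:                               # flat prior on a bounded range
        return -np.inf
    return -0.5 * np.sum((y - emulate(t)) ** 2) / sigma ** 2

theta, samples = 1.0, []
for _ in range(5000):
    prop = theta + rng.normal(0, 0.05)
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop
    samples.append(theta)
print(np.mean(samples[1000:]), np.std(samples[1000:]))    # posterior roughly centred near 0.9
```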

  19. Assessing dynamics, spatial scale, and uncertainty in task-related brain network analyses

    PubMed Central

    Stephen, Emily P.; Lepage, Kyle Q.; Eden, Uri T.; Brunner, Peter; Schalk, Gerwin; Brumberg, Jonathan S.; Guenther, Frank H.; Kramer, Mark A.

    2014-01-01

    The brain is a complex network of interconnected elements, whose interactions evolve dynamically in time to cooperatively perform specific functions. A common technique to probe these interactions involves multi-sensor recordings of brain activity during a repeated task. Many techniques exist to characterize the resulting task-related activity, including establishing functional networks, which represent the statistical associations between brain areas. Although functional network inference is commonly employed to analyze neural time series data, techniques to assess the uncertainty—both in the functional network edges and the corresponding aggregate measures of network topology—are lacking. To address this, we describe a statistically principled approach for computing uncertainty in functional networks and aggregate network measures in task-related data. The approach is based on a resampling procedure that utilizes the trial structure common in experimental recordings. We show in simulations that this approach successfully identifies functional networks and associated measures of confidence emergent during a task in a variety of scenarios, including dynamically evolving networks. In addition, we describe a principled technique for establishing functional networks based on predetermined regions of interest using canonical correlation. Doing so provides additional robustness to the functional network inference. Finally, we illustrate the use of these methods on example invasive brain voltage recordings collected during an overt speech task. The general strategy described here—appropriate for static and dynamic network inference and different statistical measures of coupling—permits the evaluation of confidence in network measures in a variety of settings common to neuroscience. PMID:24678295
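    A minimal sketch of the trial-resampling idea described above, assuming correlation-based functional network edges. The data shapes, the use of plain correlation as the coupling measure, and the percentile confidence intervals are illustrative assumptions rather than the authors' exact procedure.

```python
import numpy as np

def edge_ci(trials, n_boot=1000, alpha=0.05, seed=0):
    """Bootstrap-over-trials confidence intervals for correlation-based network edges.

    trials: array of shape (n_trials, n_sensors, n_samples)."""
    rng = np.random.default_rng(seed)
    n_trials, n_sensors, _ = trials.shape
    boots = np.empty((n_boot, n_sensors, n_sensors))
    for b in range(n_boot):
        idx = rng.integers(0, n_trials, n_trials)         # resample whole trials with replacement
        x = trials[idx].transpose(1, 0, 2).reshape(n_sensors, -1)
        boots[b] = np.corrcoef(x)
    lo = np.percentile(boots, 100 * alpha / 2, axis=0)
    hi = np.percentile(boots, 100 * (1 - alpha / 2), axis=0)
    return lo, hi   # an edge may be declared present when its interval excludes 0

# toy usage: 50 trials, 4 sensors, 200 samples of noise sharing a common component
rng = np.random.default_rng(5)
base = rng.standard_normal((50, 1, 200))
data = np.concatenate([base + 0.5 * rng.standard_normal((50, 1, 200)) for _ in range(4)], axis=1)
lo, hi = edge_ci(data)
print(np.round(lo, 2))
print(np.round(hi, 2))
```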

  20. Assessing dynamics, spatial scale, and uncertainty in task-related brain network analyses.

    PubMed

    Stephen, Emily P; Lepage, Kyle Q; Eden, Uri T; Brunner, Peter; Schalk, Gerwin; Brumberg, Jonathan S; Guenther, Frank H; Kramer, Mark A

    2014-01-01

    The brain is a complex network of interconnected elements, whose interactions evolve dynamically in time to cooperatively perform specific functions. A common technique to probe these interactions involves multi-sensor recordings of brain activity during a repeated task. Many techniques exist to characterize the resulting task-related activity, including establishing functional networks, which represent the statistical associations between brain areas. Although functional network inference is commonly employed to analyze neural time series data, techniques to assess the uncertainty-both in the functional network edges and the corresponding aggregate measures of network topology-are lacking. To address this, we describe a statistically principled approach for computing uncertainty in functional networks and aggregate network measures in task-related data. The approach is based on a resampling procedure that utilizes the trial structure common in experimental recordings. We show in simulations that this approach successfully identifies functional networks and associated measures of confidence emergent during a task in a variety of scenarios, including dynamically evolving networks. In addition, we describe a principled technique for establishing functional networks based on predetermined regions of interest using canonical correlation. Doing so provides additional robustness to the functional network inference. Finally, we illustrate the use of these methods on example invasive brain voltage recordings collected during an overt speech task. The general strategy described here-appropriate for static and dynamic network inference and different statistical measures of coupling-permits the evaluation of confidence in network measures in a variety of settings common to neuroscience.

  1. Bulk measurements of messy chemistries are needed for a theory of the origins of life

    NASA Astrophysics Data System (ADS)

    Guttenberg, Nicholas; Virgo, Nathaniel; Chandru, Kuhan; Scharf, Caleb; Mamajanov, Irena

    2017-11-01

    A feature of many of the chemical systems plausibly involved in the origins of terrestrial life is that they are complex and messy, producing a wide range of compounds via a wide range of mechanisms. However, the fundamental behaviour of such systems is currently not well understood; we do not have the tools to make statistical predictions about such complex chemical networks. This is, in part, due to a lack of quantitative data from which such a theory could be built; specifically, functional measurements of messy chemical systems. Here, we propose that the pantheon of experimental approaches to the origins of life should be expanded to include the study of 'functional measurements': the direct study of bulk properties of chemical systems and their interactions with other compounds, the formation of structures and other behaviours, even in cases where the precise composition and mechanisms are unknown. This article is part of the themed issue 'Reconceptualizing the origins of life'.

  2. CMM Data Analysis Tool

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Due to the increase in the use of Coordinate Measuring Machines (CMMs) to measure fine details and complex geometries in manufacturing, many programs have been made to compile and analyze the data. These programs typically require extensive setup to determine the expected results in order to not only track the pass/fail of a dimension, but also to use statistical process control (SPC). These extra steps and setup times have been addressed through the CMM Data Analysis Tool, which only requires the output of the CMM to provide both pass/fail analysis on all parts run to the same inspection program as well as graphs which help visualize where the part measures within the allowed tolerances. This provides feedback not only to the customer for approval of a part during development, but also to machining process engineers to identify when any dimension is drifting towards an out-of-tolerance condition during production. This program can handle hundreds of parts with complex dimensions and will provide an analysis within minutes.

  3. The glassy random laser: replica symmetry breaking in the intensity fluctuations of emission spectra

    PubMed Central

    Antenucci, Fabrizio; Crisanti, Andrea; Leuzzi, Luca

    2015-01-01

    The behavior of a newly introduced overlap parameter, measuring the correlation between intensity fluctuations of waves in random media, is analyzed in different physical regimes, with varying amounts of disorder and non-linearity. This order parameter allows the identification of the laser transition in random media and describes its possible glassy nature in terms of emission spectra data, the only data so far accessible in random laser measurements. The theoretical analysis is performed in terms of the complex spherical spin-glass model, a statistical mechanical model describing the onset and the behavior of random lasers in open cavities. Replica Symmetry Breaking theory makes it possible to discern different kinds of randomness in the high pumping regime, including the most complex and intriguing glassy randomness. The outcome of the theoretical study is then compared with recent intensity fluctuation overlap measurements, demonstrating the validity of the theory and providing a straightforward interpretation of qualitatively different spectral behaviors in different random lasers. PMID:26616194

  4. Biological markers of intermediate outcomes in studies of indoor air and other complex mixtures.

    PubMed Central

    Wilcosky, T C

    1993-01-01

    Biological markers of intermediate health outcomes sometimes provide a superior alternative to traditional measures of pollutant-related disease. Some opportunities and methodologic issues associated with using markers are discussed in the context of exposures to four complex mixtures: environmental tobacco smoke and nitrogen dioxide, acid aerosols and oxidant outdoor pollution, environmental tobacco smoke and radon, and volatile organic compounds. For markers of intermediate health outcomes, the most important property is the positive predictive value for clinical outcomes of interest. Unless the marker has a known relationship with disease, a marker response conveys no information about disease risk. Most markers are nonspecific in that various exposures cause the same marker response. Although nonspecificity can be an asset in studies of complex mixtures, it leads to problems with confounding and dilution of exposure-response associations in the presence of other exposures. The timing of a marker's measurement in relation to the occurrence of exposure influences the ability to detect a response; measurements made too early or too late may underestimate the response's magnitude. Noninvasive markers, such as those measured in urine, blood, or nasal lavage fluid, are generally more useful for field studies than are invasive markers. However, invasive markers, such as those measured in bronchoalveolar lavage fluid or lung specimens from autopsies, provide the most direct evidence of pulmonary damage from exposure to air pollutants. Unfortunately, the lack of basic information about marker properties (e.g., sensitivity, variability, statistical link with disease) currently precludes the effective use of most markers in studies of complex mixtures. PMID:8206030

  5. Three faces of entropy for complex systems: Information, thermodynamics, and the maximum entropy principle

    NASA Astrophysics Data System (ADS)

    Thurner, Stefan; Corominas-Murtra, Bernat; Hanel, Rudolf

    2017-09-01

    There are at least three distinct ways to conceptualize entropy: entropy as an extensive thermodynamic quantity of physical systems (Clausius, Boltzmann, Gibbs), entropy as a measure for information production of ergodic sources (Shannon), and entropy as a means for statistical inference on multinomial processes (Jaynes maximum entropy principle). Even though these notions represent fundamentally different concepts, the functional form of the entropy for thermodynamic systems in equilibrium, for ergodic sources in information theory, and for independent sampling processes in statistical systems, is degenerate, H(p) = -∑_i p_i log p_i. For many complex systems, which are typically history-dependent, nonergodic, and nonmultinomial, this is no longer the case. Here we show that for such processes, the three entropy concepts lead to different functional forms of entropy, which we will refer to as S_EXT for extensive entropy, S_IT for the source information rate in information theory, and S_MEP for the entropy functional that appears in the so-called maximum entropy principle, which characterizes the most likely observable distribution functions of a system. We explicitly compute these three entropy functionals for three concrete examples: for Pólya urn processes, which are simple self-reinforcing processes, for sample-space-reducing (SSR) processes, which are simple history-dependent processes that are associated with power-law statistics, and finally for multinomial mixture processes.

  6. Mutual information and phase dependencies: measures of reduced nonlinear cardiorespiratory interactions after myocardial infarction.

    PubMed

    Hoyer, Dirk; Leder, Uwe; Hoyer, Heike; Pompe, Bernd; Sommer, Michael; Zwiener, Ulrich

    2002-01-01

    Heart rate variability (HRV) is related to several mechanisms of complex autonomic functioning, such as respiratory heart rate modulation and phase dependencies between heart beat cycles and breathing cycles. The underlying processes are basically nonlinear. In order to understand and quantitatively assess those physiological interactions, an adequate coupling analysis is necessary. We hypothesized that nonlinear measures of HRV and cardiorespiratory interdependencies are superior to the standard HRV measures in classifying patients after acute myocardial infarction. We introduced mutual information measures, which provide access to nonlinear interdependencies as a counterpart to classical linear correlation analysis. The nonlinear statistical autodependencies of HRV were quantified by auto mutual information, and the respiratory heart rate modulation by cardiorespiratory cross mutual information. The phase interdependencies between heart beat cycles and breathing cycles were assessed based on histograms of the frequency ratios of the instantaneous heart beat and respiratory cycles. Furthermore, the relative duration of phase-synchronized intervals was acquired. We investigated 39 patients after acute myocardial infarction versus 24 controls. The discrimination of these groups was improved by cardiorespiratory cross mutual information measures and phase interdependency measures in comparison to the linear standard HRV measures. This result was statistically confirmed by means of logistic regression models of particular variable subsets and their receiver operating characteristics.
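    As a rough illustration of the cross mutual information idea above, the following sketch estimates mutual information between an R-R interval series and a respiratory signal with a simple 2-D histogram. The bin count, the synthetic signals, and the lack of any delay embedding are illustrative assumptions; the study's estimators are more elaborate.

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Histogram estimate of the mutual information I(X;Y) in bits."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return np.sum(pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz]))

# toy usage: an R-R series weakly modulated by a "respiratory" oscillation
rng = np.random.default_rng(6)
resp = np.sin(2 * np.pi * 0.25 * np.arange(1000))
rr = 0.8 + 0.02 * resp + 0.03 * rng.standard_normal(1000)
print(mutual_information(rr, resp))                          # larger for coupled signals
print(mutual_information(rng.standard_normal(1000), resp))   # small for unrelated signals
```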

  7. Forecasting daily source air quality using multivariate statistical analysis and radial basis function networks.

    PubMed

    Sun, Gang; Hoff, Steven J; Zelle, Brian C; Nelson, Minda A

    2008-12-01

    It is vital to forecast gas and particulate matter concentrations and emission rates (GPCER) from livestock production facilities to assess the impact of airborne pollutants on human health, the ecological environment, and global warming. Modeling source air quality is a complex process because of abundant nonlinear interactions between GPCER and other factors. The objective of this study was to introduce statistical methods and radial basis function (RBF) neural networks to predict daily source air quality in Iowa swine deep-pit finishing buildings. The results show that four variables (outdoor and indoor temperature, animal units, and ventilation rates) were identified as relatively important model inputs using statistical methods. It can further be demonstrated that only two factors, the environment factor and the animal factor, were capable of explaining more than 94% of the total variability after performing principal component analysis. Introducing fewer, uncorrelated variables to the neural network reduces the complexity of the model structure, minimizes computational cost, and mitigates model overfitting. The obtained results of the RBF network prediction were in good agreement with the actual measurements, with values of the correlation coefficient between 0.741 and 0.995 and very low values of systemic performance indexes for all the models. These good results indicated that the RBF network could be trained to model these highly nonlinear relationships. Thus, RBF neural network technology combined with multivariate statistical methods is a promising tool for modeling air pollutant emissions.
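    The sketch below shows the two-stage idea in miniature: reduce correlated predictors to a couple of principal components, then fit a Gaussian RBF network on the reduced inputs. The latent "environment" and "animal" factors, the variable names, the number of centres, and the kernel width rule are all assumed for illustration, not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(7)

# hypothetical predictors driven by two latent factors (an "environment" and an "animal" factor)
env, animal = rng.standard_normal(500), rng.standard_normal(500)
X = np.column_stack([env + 0.1 * rng.standard_normal(500),              # outdoor temperature
                     env + 0.1 * rng.standard_normal(500),              # indoor temperature
                     animal + 0.1 * rng.standard_normal(500),           # animal units
                     env - animal + 0.1 * rng.standard_normal(500)])    # ventilation rate
y = np.sin(env) + 0.5 * animal + 0.1 * rng.standard_normal(500)         # toy "emission rate"

# 1) principal component analysis: keep the two leading components
Xc = X - X.mean(axis=0)
_, _, vt = np.linalg.svd(Xc, full_matrices=False)
Z = Xc @ vt[:2].T

# 2) radial basis function network: Gaussian basis at random centres + linear readout
centres = Z[rng.choice(len(Z), 30, replace=False)]
width = np.median(np.linalg.norm(Z[:, None] - centres[None], axis=2))

def design(Zin):
    d = np.linalg.norm(Zin[:, None] - centres[None], axis=2)
    return np.exp(-(d / width) ** 2)

w, *_ = np.linalg.lstsq(design(Z), y, rcond=None)
pred = design(Z) @ w
print(np.corrcoef(pred, y)[0, 1])   # in-sample correlation of the RBF fit
```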

  8. Characterization of doctor-patient communication using heartbeat nonlinear dynamics: A preliminary study using Lagged Poincaré Plots.

    PubMed

    Nardelli, M; Del Piccolo, L; Danzi, Op; Perlini, C; Tedeschi, F; Greco, A; Scilingo, Ep; Valenza, G

    2017-07-01

    Empathic doctor-patient communication has been associated with improved psycho-physiological well-being involving cardiovascular and neuroendocrine responses. Nevertheless, a comprehensive assessment of heartbeat linear and nonlinear/complex dynamics throughout the communication of a life-threatening disease has not been performed yet. To this extent, we here study heart rate variability (HRV) series gathered from 17 subjects while watching a video in which an oncologist discloses the diagnosis of a cancer metastasis to a patient. A further 17 subjects watched the same video including additional affective empathic content. For the assessment of the two groups, linear heartbeat dynamics was quantified through measures defined in the time and frequency domains, whereas nonlinear/complex dynamics referred to measures of entropy, and combined Lagged Poincaré Plot (LPP) and symbolic analyses. Considering differences between the beginning and the end of the video, results from non-parametric statistical tests demonstrated that the group watching empathic content showed HRV changes in the LF/HF ratio exclusively. Conversely, the group watching the purely informative video showed changes in vagal activity (i.e., HF power), the LF/HF ratio, as well as LPP measures. Additionally, a Support Vector Machine algorithm including HRV nonlinear/complex information was able to automatically discern between groups with an accuracy of 76.47%. We therefore propose the use of heartbeat nonlinear/complex dynamics to objectively assess the empathy level of healthy women.
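    The standard descriptors extracted from a (lagged) Poincaré plot are SD1 and SD2, the dispersions perpendicular and parallel to the identity line. The sketch below computes them for arbitrary lags; the function name, the lags tried, and the synthetic R-R series are illustrative assumptions, not the study's full LPP feature set.

```python
import numpy as np

def lagged_poincare(rr, lag=1):
    """SD1/SD2 descriptors of the lag-m Poincare plot of an R-R interval series."""
    rr = np.asarray(rr, dtype=float)
    x, y = rr[:-lag], rr[lag:]
    sd1 = np.std((y - x) / np.sqrt(2))   # short-term (beat-to-beat) variability
    sd2 = np.std((y + x) / np.sqrt(2))   # long-term variability
    return sd1, sd2, sd1 / sd2

# toy usage: slow modulation plus beat-to-beat noise, evaluated at several lags
rng = np.random.default_rng(8)
rr = 0.8 + 0.03 * np.sin(2 * np.pi * 0.01 * np.arange(500)) + 0.02 * rng.standard_normal(500)
for m in (1, 2, 5):
    print(m, np.round(lagged_poincare(rr, m), 4))
```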

  9. Electroencephalography signatures of attention-deficit/hyperactivity disorder: clinical utility.

    PubMed

    Alba, Guzmán; Pereda, Ernesto; Mañas, Soledad; Méndez, Leopoldo D; González, Almudena; González, Julián J

    2015-01-01

    This work reviews the techniques and the most important results on the use of electroencephalography (EEG) to extract measures that can be clinically useful for studying subjects with attention-deficit/hyperactivity disorder (ADHD). First, we discuss briefly and in simple terms the EEG analysis and processing techniques most used in the context of ADHD. We review techniques that analyze individual EEG channels (univariate measures) as well as techniques that study the statistical interdependence between different EEG channels (multivariate measures), the so-called functional brain connectivity. Among the former, we review the classical indices of absolute and relative spectral power and estimations of the complexity of the channels, such as the approximate entropy and the Lempel-Ziv complexity. Among the latter, we focus on the magnitude squared coherence and on different measures based on the concept of generalized synchronization and its estimation in state space. Second, from a historical point of view, we present the most important results achieved with these techniques and their clinical utility (sensitivity, specificity, and accuracy) in diagnosing ADHD. Finally, we propose future research lines based on these results.
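    One of the univariate complexity measures named above, the Lempel-Ziv complexity, can be computed from a median-binarized channel as in the sketch below. The parsing scheme (an LZ76-style phrase count), the normalization, and the synthetic signals are common textbook choices and assumptions on my part, not the review's specific implementation.

```python
import numpy as np

def lempel_ziv_complexity(binary_seq):
    """Number of distinct phrases in an LZ76-style parsing of a binary sequence."""
    s = "".join(map(str, binary_seq))
    i, c, n = 0, 0, len(s)
    while i < n:
        k = 1
        # extend the current phrase while it already occurs earlier in the sequence
        while i + k <= n and s[:i + k - 1].find(s[i:i + k]) != -1:
            k += 1
        c += 1
        i += k
    return c

def lz_from_eeg(x):
    """Binarize a channel around its median, then return a normalized LZ complexity."""
    b = (np.asarray(x) > np.median(x)).astype(int)
    n = len(b)
    return lempel_ziv_complexity(b) * np.log2(n) / n   # ~1 for random binary sequences

rng = np.random.default_rng(9)
print(lz_from_eeg(rng.standard_normal(2000)))        # noise: close to 1
print(lz_from_eeg(np.sin(np.arange(2000) * 0.1)))    # regular oscillation: much lower
```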

  10. Growing complex network of citations of scientific papers: Modeling and measurements

    NASA Astrophysics Data System (ADS)

    Golosovsky, Michael; Solomon, Sorin

    2017-01-01

    We consider the network of citations of scientific papers and use a combination of the theoretical and experimental tools to uncover microscopic details of this network growth. Namely, we develop a stochastic model of citation dynamics based on the copying-redirection-triadic closure mechanism. In a complementary and coherent way, the model accounts both for statistics of references of scientific papers and for their citation dynamics. Originating in empirical measurements, the model is cast in such a way that it can be verified quantitatively in every aspect. Such validation is performed by measuring citation dynamics of physics papers. The measurements revealed nonlinear citation dynamics, the nonlinearity being intricately related to network topology. The nonlinearity has far-reaching consequences including nonstationary citation distributions, diverging citation trajectories of similar papers, runaways or "immortal papers" with infinite citation lifetime, etc. Thus nonlinearity in complex network growth is our most important finding. In a more specific context, our results can be a basis for quantitative probabilistic prediction of citation dynamics of individual papers and of the journal impact factor.

  11. Refractive errors in patients with newly diagnosed diabetes mellitus.

    PubMed

    Yarbağ, Abdülhekim; Yazar, Hayrullah; Akdoğan, Mehmet; Pekgör, Ahmet; Kaleli, Suleyman

    2015-01-01

    Diabetes mellitus is a complex metabolic disorder that involves the small blood vessels, often causing widespread damage to tissues, including changes in the eye's refractive error. In patients with newly diagnosed diabetes mellitus who have unstable blood glucose levels, refraction measurements may be unreliable. We aimed to investigate refraction in patients who were recently diagnosed with diabetes and treated at our centre. This prospective study was performed from February 2013 to January 2014. Patients were diagnosed with diabetes mellitus using laboratory biochemical tests and clinical examination. Venous fasting plasma glucose (fpg) levels were measured along with refractive errors. Two measurements were taken: initially and after four weeks. The difference between the initial and final refractive measurements was evaluated. Our patients were 100 males and 30 females who had been newly diagnosed with type II DM. The refractive and fpg levels were measured twice in all patients. The average values of the initial measurements were as follows: fpg level, 415 mg/dl; average refractive value, +2.5 D (dioptres). The average end-of-period measurements were fpg, 203 mg/dl; average refractive value, +0.75 D. There was a statistically significant difference between the initial and four-week measurements of fasting plasma glucose (fpg) levels (p<0.05), there was a statistically significant relationship between changes in fpg and changes in glasses ID (p<0.05), and the disappearance of blurred vision (a success rate greater than 50%) was also statistically significant (p<0.05). In addition, no age or sex effects were detected in any of these results (p>0.05). Refractive error is affected in patients with newly diagnosed diabetes mellitus; therefore, plasma glucose levels should be considered in the selection of glasses.

  12. Cost-Effectiveness Analysis: a proposal of new reporting standards in statistical analysis

    PubMed Central

    Bang, Heejung; Zhao, Hongwei

    2014-01-01

    Cost-effectiveness analysis (CEA) is a method for evaluating the outcomes and costs of competing strategies designed to improve health, and has been applied to a variety of different scientific fields. Yet, there are inherent complexities in cost estimation and CEA from statistical perspectives (e.g., skewness, bi-dimensionality, and censoring). The incremental cost-effectiveness ratio that represents the additional cost per one unit of outcome gained by a new strategy has served as the most widely accepted methodology in the CEA. In this article, we call for expanded perspectives and reporting standards reflecting a more comprehensive analysis that can elucidate different aspects of available data. Specifically, we propose that mean and median-based incremental cost-effectiveness ratios and average cost-effectiveness ratios be reported together, along with relevant summary and inferential statistics as complementary measures for informed decision making. PMID:24605979

  13. Robust Statistical Detection of Power-Law Cross-Correlation.

    PubMed

    Blythe, Duncan A J; Nikulin, Vadim V; Müller, Klaus-Robert

    2016-06-02

    We show that widely used approaches in statistical physics incorrectly indicate the existence of power-law cross-correlations between financial stock market fluctuations measured over several years and the neuronal activity of the human brain lasting for only a few minutes. While such cross-correlations are nonsensical, no current methodology allows them to be reliably discarded, leaving researchers at greater risk when the spurious nature of cross-correlations is not clear from the unrelated origin of the time series and rather requires careful statistical estimation. Here we propose a theory and method (PLCC-test) which allows us to rigorously and robustly test for power-law cross-correlations, correctly detecting genuine and discarding spurious cross-correlations, thus establishing meaningful relationships between processes in complex physical systems. Our method reveals for the first time the presence of power-law cross-correlations between amplitudes of the alpha and beta frequency ranges of the human electroencephalogram.

  14. Robust Statistical Detection of Power-Law Cross-Correlation

    PubMed Central

    Blythe, Duncan A. J.; Nikulin, Vadim V.; Müller, Klaus-Robert

    2016-01-01

    We show that widely used approaches in statistical physics incorrectly indicate the existence of power-law cross-correlations between financial stock market fluctuations measured over several years and the neuronal activity of the human brain lasting for only a few minutes. While such cross-correlations are nonsensical, no current methodology allows them to be reliably discarded, leaving researchers at greater risk when the spurious nature of cross-correlations is not clear from the unrelated origin of the time series and rather requires careful statistical estimation. Here we propose a theory and method (PLCC-test) which allows us to rigorously and robustly test for power-law cross-correlations, correctly detecting genuine and discarding spurious cross-correlations, thus establishing meaningful relationships between processes in complex physical systems. Our method reveals for the first time the presence of power-law cross-correlations between amplitudes of the alpha and beta frequency ranges of the human electroencephalogram. PMID:27250630

  15. Experimental Procedure for Determination of the Dielectric Properties of Biological Samples in the 2-50 GHz Range

    PubMed Central

    Odelstad, Elias; Raman, Sujith; Rydberg, Anders

    2014-01-01

    The objective of this paper was to test and evaluate an experimental procedure for providing data on the complex permittivity of different cell lines in the 2–50-GHz range at room temperature, for the purpose of future dosimetric studies. The complex permittivity measurements were performed on cells suspended in culture medium using an open-ended coaxial probe. Maxwell’s mixture equation then allows the calculation of the permittivity profiles of the cells from the difference in permittivity between the cell suspensions and pure culture medium. The open-ended coaxial probe turned out to be very sensitive to disturbances affecting the measurements, resulting in poor precision. Permittivity differences were not large in relation to the spread of the measurements, and repeated measurements were performed to improve statistics. The 95% confidence intervals were computed for the arithmetic means of the measured permittivity differences in order to test the statistical significance. The results showed that for bone cells at the lowest tested concentration (33 500/ml), there was significance in the real part of the permittivity at frequencies above 30 GHz, and no significance in the imaginary part. For the second lowest concentration (67 000/ml) there was no significance at all. For a medium concentration of bone cells (135 000/ml) there was no significance in the real part, but there was significance in the imaginary part at frequencies below about 25 GHz. The cell suspension with a concentration of 1 350 000/ml had significance in the real part for both high (above 30 GHz) and low (below 15 GHz) frequencies. The imaginary part showed significance for frequencies below 25 GHz. In the case of an osteosarcoma cell line with a concentration of 2 700 000/ml, only the imaginary part showed significance, and only for frequencies below 15 GHz. For muscle cells at a concentration of 743 450/ml, there was only significance in the imaginary part for frequencies below 5 GHz. The experimental data indicated that the complex permittivity of the culture medium may be used for modeling of cell suspensions. PMID:27170886
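    The abstract does not state which form of the mixture equation was used; a common choice for dilute suspensions of spherical inclusions is the Maxwell (Maxwell-Garnett) form sketched below, together with its inversion to recover the inclusion permittivity from suspension and medium measurements. The formula choice, the volume fraction, and the numerical permittivities are assumptions for illustration only.

```python
import numpy as np

def maxwell_mixture(eps_p, eps_m, f):
    """Effective permittivity of a dilute suspension of spheres (Maxwell/Maxwell-Garnett form)."""
    beta = (eps_p - eps_m) / (eps_p + 2 * eps_m)
    return eps_m * (1 + 2 * f * beta) / (1 - f * beta)

def invert_mixture(eps_mix, eps_m, f):
    """Back out the inclusion (cell) permittivity from suspension and medium measurements."""
    beta = (eps_mix - eps_m) / (f * (eps_mix + 2 * eps_m))
    return eps_m * (1 + 2 * beta) / (1 - beta)

# round-trip check with complex permittivities (real part and dielectric loss)
eps_cell, eps_medium, volume_fraction = 50 - 15j, 75 - 20j, 0.01
eps_suspension = maxwell_mixture(eps_cell, eps_medium, volume_fraction)
print(eps_suspension, invert_mixture(eps_suspension, eps_medium, volume_fraction))
```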

  16. Complexity control algorithm based on adaptive mode selection for interframe coding in high efficiency video coding

    NASA Astrophysics Data System (ADS)

    Chen, Gang; Yang, Bing; Zhang, Xiaoyun; Gao, Zhiyong

    2017-07-01

    The latest high efficiency video coding (HEVC) standard significantly increases encoding complexity in order to improve coding efficiency. Due to the limited computational capability of handheld devices, complexity-constrained video coding has drawn great attention in recent years. A complexity control algorithm based on adaptive mode selection is proposed for interframe coding in HEVC. Given the direct proportionality between encoding time and computational complexity, computational complexity is measured in terms of encoding time. First, complexity is mapped to a target in terms of prediction modes. Then, an adaptive mode selection algorithm is proposed for the mode decision process. Specifically, the optimal mode combination scheme, chosen through offline statistics, is developed at low complexity. If the complexity budget has not been used up, an adaptive mode sorting method is employed to further improve coding efficiency. The experimental results show that the proposed algorithm achieves a very large complexity control range (as low as 10%) for the HEVC encoder while maintaining good rate-distortion performance. For the low-delay P condition, compared with the direct resource allocation method and the state-of-the-art method, an average gain of 0.63 and 0.17 dB in BD-PSNR is observed for 18 sequences when the target complexity is around 40%.

  17. Extracting features of Gaussian self-similar stochastic processes via the Bandt-Pompe approach.

    PubMed

    Rosso, O A; Zunino, L; Pérez, D G; Figliola, A; Larrondo, H A; Garavaglia, M; Martín, M T; Plastino, A

    2007-12-01

    By recourse to appropriate information theory quantifiers (the normalized Shannon entropy and the Martín-Plastino-Rosso intensive statistical complexity measure), we revisit the characterization of Gaussian self-similar stochastic processes from a Bandt-Pompe viewpoint. We show that the ensuing approach exhibits considerable advantages with respect to other treatments. In particular, clear gaps in the quantifiers are found in the transition between the continuous processes and their associated noises.
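    The quantifiers named above can be sketched as follows: ordinal-pattern (Bandt-Pompe) probabilities feed a normalized Shannon entropy H, and the intensive statistical complexity is C = Q_J[P, P_e]·H, with Q_J the Jensen-Shannon divergence to the uniform distribution normalized by its maximum. This is a minimal illustration under common choices (embedding dimension 4, delay 1, synthetic signals), not the paper's exact experimental setup.

```python
import itertools
import numpy as np

def ordinal_distribution(x, d=4, tau=1):
    """Bandt-Pompe probabilities of ordinal patterns (embedding dimension d, delay tau)."""
    x = np.asarray(x)
    patterns = {p: i for i, p in enumerate(itertools.permutations(range(d)))}
    counts = np.zeros(len(patterns))
    for i in range(len(x) - (d - 1) * tau):
        window = x[i:i + d * tau:tau]
        counts[patterns[tuple(np.argsort(window).tolist())]] += 1
    return counts / counts.sum()

def mpr_complexity(p):
    """Normalized permutation entropy H and the intensive statistical complexity C = Q_J * H."""
    n = len(p)
    shannon = lambda q: -np.sum(q[q > 0] * np.log(q[q > 0]))
    h = shannon(p) / np.log(n)
    pe = np.full(n, 1.0 / n)
    js = shannon((p + pe) / 2) - shannon(p) / 2 - shannon(pe) / 2
    js_max = -0.5 * ((n + 1) / n * np.log(n + 1) - 2 * np.log(2 * n) + np.log(n))
    return h, (js / js_max) * h

# toy usage: noise gives high entropy and low complexity; chaos gives intermediate values
rng = np.random.default_rng(0)
noise = rng.standard_normal(5000)
logistic = np.empty(5000)
logistic[0] = 0.4
for i in range(1, 5000):
    logistic[i] = 4 * logistic[i - 1] * (1 - logistic[i - 1])
for name, sig in (("white noise", noise), ("logistic map", logistic)):
    h, c = mpr_complexity(ordinal_distribution(sig))
    print(f"{name}: H = {h:.3f}, C = {c:.3f}")
```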

  18. Evaluation of the nephrotoxicity of complex mixtures containing organics and metals: advantages and disadvantages of the use of real-world complex mixtures.

    PubMed

    Simmons, J E; Yang, R S; Berman, E

    1995-02-01

    As part of a multidisciplinary health effects study, the nephrotoxicity of complex industrial waste mixtures was assessed. Adult, male Fischer 344 rats were gavaged with samples of complex industrial waste and nephrotoxicity evaluated 24 hr later. Of the 10 tested samples, 4 produced increased absolute or relative kidney weight, or both, coupled with a statistically significant alteration in at least one of the measured serum parameters (urea nitrogen (BUN), creatinine (CREAT), and BUN/CREAT ratio). Although the waste samples had been analyzed for a number of organic chemicals and 7 of the 10 samples were analyzed also for 12 elemental metals and metalloids, their nephrotoxicity was not readily predicted from the partial chemical characterization data. Because the chemical form or speciation of the metals was unknown, it was not possible to estimate their contribution to the observed biological response. Various experimental approaches, including use of real-world complex mixtures, chemically defined synthetic mixtures, and simple mixtures, will be necessary to adequately determine the potential human health risk from exposure to complex chemical mixtures.

  19. Cryo-Scanning Electron Microscopy of Captured Cirrus Ice Particles

    NASA Astrophysics Data System (ADS)

    Magee, N. B.; Boaggio, K.; Bandamede, M.; Bancroft, L.; Hurler, K.

    2016-12-01

    We present the latest collection of high-resolution cryo-scanning electron microscopy images and microanalysis of cirrus ice particles captured by high-altitude balloon (ICE-Ball, see abstracts by K. Boaggio and M. Bandamede). Ice particle images and sublimation residues are derived from particles captured during approximately 15 balloon flights conducted in Pennsylvania and New Jersey over the past 12 months. Measurements include 3D digital elevation model reconstructions of ice particles and associated statistical analyses of entire particles as well as particle sub-facets and surfaces. This 3D analysis reveals that the morphologies of most captured ice particles deviate significantly from ideal habits and display geometric complexity and surface roughness at multiple measurable scales, ranging from hundreds of nanometers to hundreds of microns. The presentation suggests a potential path forward for representing scattering from a realistically complex array of ice particle shapes and surfaces.

  20. Diffusion tensor imaging in children with tuberous sclerosis complex: tract-based spatial statistics assessment of brain microstructural changes.

    PubMed

    Zikou, Anastasia K; Xydis, Vasileios G; Astrakas, Loukas G; Nakou, Iliada; Tzarouchi, Loukia C; Tzoufi, Meropi; Argyropoulou, Maria I

    2016-07-01

    There is evidence of microstructural changes in normal-appearing white matter of patients with tuberous sclerosis complex. To evaluate major white matter tracts in children with tuberous sclerosis complex using tract-based spatial statistics diffusion tensor imaging (DTI) analysis. Eight children (mean age ± standard deviation: 8.5 ± 5.5 years) with an established diagnosis of tuberous sclerosis complex and 8 age-matched controls were studied. The imaging protocol consisted of T1-weighted high-resolution 3-D spoiled gradient-echo sequence and a spin-echo, echo-planar diffusion-weighted sequence. Differences in the diffusion indices were evaluated using tract-based spatial statistics. Tract-based spatial statistics showed increased axial diffusivity in the children with tuberous sclerosis complex in the superior and anterior corona radiata, the superior longitudinal fascicle, the inferior fronto-occipital fascicle, the uncinate fascicle and the anterior thalamic radiation. No significant differences were observed in fractional anisotropy, mean diffusivity and radial diffusivity between patients and control subjects. No difference was found in the diffusion indices between the baseline and follow-up examination in the patient group. Patients with tuberous sclerosis complex have increased axial diffusivity in major white matter tracts, probably related to reduced axonal integrity.

  1. A novel approach to simulate gene-environment interactions in complex diseases.

    PubMed

    Amato, Roberto; Pinelli, Michele; D'Andrea, Daniel; Miele, Gennaro; Nicodemi, Mario; Raiconi, Giancarlo; Cocozza, Sergio

    2010-01-05

    Complex diseases are multifactorial traits caused by both genetic and environmental factors. They represent the major part of human diseases and include those with the largest prevalence and mortality (cancer, heart disease, obesity, etc.). Despite the large amount of information that has been collected about both genetic and environmental risk factors, there are few examples of studies on their interactions in the epidemiological literature. One reason may be incomplete knowledge of the power of the statistical methods designed to search for risk factors and their interactions in these data sets. An improvement in this direction would lead to a better understanding and description of gene-environment interactions. To this aim, a possible strategy is to challenge the different statistical methods against data sets where the underlying phenomenon is completely known and fully controllable, for example simulated ones. We present a mathematical approach that models gene-environment interactions. With this method it is possible to generate simulated populations having gene-environment interactions of any form, involving any number of genetic and environmental factors and also allowing non-linear interactions such as epistasis. In particular, we implemented a simple version of this model in a Gene-Environment iNteraction Simulator (GENS), a tool designed to simulate case-control data sets where a one gene-one environment interaction influences the disease risk. The main aim has been to allow the input of population characteristics by using standard epidemiological measures and to implement constraints that make the simulator behaviour biologically meaningful. With the multi-logistic model implemented in GENS it is possible to simulate case-control samples of complex diseases where gene-environment interactions influence the disease risk. The user has full control of the main characteristics of the simulated population, and a Monte Carlo process allows for random variability. A knowledge-based approach reduces the complexity of the mathematical model by using reasonable biological constraints and makes the simulation more understandable in biological terms. Simulated data sets can be used for the assessment of novel statistical methods or for the evaluation of statistical power when designing a study.
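    The simulation idea can be illustrated with a bare-bones logistic model containing a single gene-environment product term, followed by case-control sampling. The allele frequency, exposure prevalence, effect sizes, and the stratified odds-ratio check are all illustrative assumptions; this is not the GENS tool or its multi-logistic model.

```python
import numpy as np

rng = np.random.default_rng(11)
n = 20000

# hypothetical population: one biallelic SNP (risk-allele frequency 0.3) and one binary
# exposure (prevalence 0.4), interacting multiplicatively on the odds scale
g = rng.binomial(2, 0.3, n)            # genotype coded as 0/1/2 risk alleles
e = rng.binomial(1, 0.4, n)            # environmental exposure
beta0, beta_g, beta_e, beta_ge = -2.0, 0.1, 0.2, 0.6
p = 1 / (1 + np.exp(-(beta0 + beta_g * g + beta_e * e + beta_ge * g * e)))
disease = rng.binomial(1, p)

# assemble a case-control sample and inspect the carrier odds ratio within exposure strata
cases = np.flatnonzero(disease == 1)[:1000]
controls = np.flatnonzero(disease == 0)[:1000]
idx = np.concatenate([cases, controls])
for exposure in (0, 1):
    sel = idx[e[idx] == exposure]
    carrier, dis = g[sel] > 0, disease[sel]
    a = np.sum((dis == 1) & carrier);  b = np.sum((dis == 0) & carrier)
    c = np.sum((dis == 1) & ~carrier); d = np.sum((dis == 0) & ~carrier)
    print(f"exposure={exposure}: carrier odds ratio = {a * d / (b * c):.2f}")
```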

  2. Certification of highly complex safety-related systems.

    PubMed

    Reinert, D; Schaefer, M

    1999-01-01

    The BIA now has 15 years of experience with the certification of complex electronic systems for safety-related applications in the machinery sector. Using the example of machining centres, this presentation shows the systematic procedure for verifying and validating control systems that use Application Specific Integrated Circuits (ASICs) and microcomputers for safety functions. One section describes the control structure of machining centres with control systems using "integrated safety." A diverse redundant architecture combined with cross-monitoring and forced dynamization is explained. In the main section, the steps of the systematic certification procedure are explained, showing some results of the certification of drilling machines. Specification reviews, design reviews with test case specification, statistical analysis, and walk-throughs are the analytical measures in the testing process. Systematic tests based on the test case specification, electromagnetic interference (EMI) and environmental testing, and site acceptance tests on the machines are the testing measures for validation. A complex software-driven system is always undergoing modification. Most of the changes are not safety-relevant, but this has to be proven. A systematic procedure for certifying software modifications is presented in the last section of the paper.

  3. A comparison of spectral magnitude and phase-locking value analyses of the frequency-following response to complex tones

    PubMed Central

    Zhu, Li; Bharadwaj, Hari; Xia, Jing; Shinn-Cunningham, Barbara

    2013-01-01

    Two experiments, both presenting diotic, harmonic tone complexes (100 Hz fundamental), were conducted to explore the envelope-related component of the frequency-following response (FFRENV), a measure of synchronous, subcortical neural activity evoked by a periodic acoustic input. Experiment 1 directly compared two common analysis methods, computing the magnitude spectrum and the phase-locking value (PLV). Bootstrapping identified which FFRENV frequency components were statistically above the noise floor for each metric and quantified the statistical power of the approaches. Across listeners and conditions, the two methods produced highly correlated results. However, PLV analysis required fewer processing stages to produce readily interpretable results. Moreover, at the fundamental frequency of the input, PLVs were farther above the metric's noise floor than spectral magnitudes. Having established the advantages of PLV analysis, the efficacy of the approach was further demonstrated by investigating how different acoustic frequencies contribute to FFRENV, analyzing responses to complex tones composed of different acoustic harmonics of 100 Hz (Experiment 2). Results show that the FFRENV response is dominated by peripheral auditory channels responding to unresolved harmonics, although low-frequency channels driven by resolved harmonics also contribute. These results demonstrate the utility of the PLV for quantifying the strength of FFRENV across conditions. PMID:23862815
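    The phase-locking value compared above can be computed per frequency bin as the magnitude of the across-trial average of unit phasors, in contrast to averaging spectral magnitudes. The sketch below assumes simple FFT-based phases and a synthetic 100-Hz phase-locked response; it is an illustration of the metric, not the study's analysis chain.

```python
import numpy as np

def plv_spectrum(trials, fs):
    """Across-trial phase-locking value at each FFT bin (cf. spectral magnitude averaging)."""
    spec = np.fft.rfft(trials, axis=1)                     # trials: (n_trials, n_samples)
    freqs = np.fft.rfftfreq(trials.shape[1], 1 / fs)
    plv = np.abs(np.mean(spec / np.abs(spec), axis=0))     # unit phasors averaged over trials
    mag = np.mean(np.abs(spec), axis=0)
    return freqs, plv, mag

# toy FFR-like data: a phase-locked 100-Hz component buried in trial-to-trial noise
fs, n, n_trials = 2000, 400, 200
t = np.arange(n) / fs
rng = np.random.default_rng(12)
trials = 0.2 * np.sin(2 * np.pi * 100 * t) + rng.standard_normal((n_trials, n))
freqs, plv, mag = plv_spectrum(trials, fs)
print(freqs[np.argmax(plv)], plv.max())   # the PLV peak should sit at 100 Hz
```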

  4. A weighted generalized score statistic for comparison of predictive values of diagnostic tests.

    PubMed

    Kosinski, Andrzej S

    2013-03-15

    Positive and negative predictive values are important measures of a medical diagnostic test performance. We consider testing equality of two positive or two negative predictive values within a paired design in which all patients receive two diagnostic tests. The existing statistical tests for testing equality of predictive values are either Wald tests based on the multinomial distribution or the empirical Wald and generalized score tests within the generalized estimating equations (GEE) framework. As presented in the literature, these test statistics have considerably complex formulas without clear intuitive insight. We propose re-formulations that are mathematically equivalent but algebraically simple and intuitive. As is clearly seen with a new re-formulation we present, the generalized score statistic does not always reduce to the commonly used score statistic in the independent samples case. To alleviate this, we introduce a weighted generalized score (WGS) test statistic that incorporates the empirical covariance matrix with newly proposed weights. This statistic is simple to compute, always reduces to the score statistic in the independent samples situation, and preserves type I error better than the other statistics, as demonstrated by simulations. Thus, we believe that the proposed WGS statistic is the preferred statistic for testing equality of two predictive values and for corresponding sample size computations. The new formulas for the Wald statistics may be useful for easy computation of confidence intervals for the difference of predictive values. The introduced concepts have the potential to lead to development of the WGS test statistic in a general GEE setting. Copyright © 2012 John Wiley & Sons, Ltd.

  5. A weighted generalized score statistic for comparison of predictive values of diagnostic tests

    PubMed Central

    Kosinski, Andrzej S.

    2013-01-01

    Positive and negative predictive values are important measures of a medical diagnostic test performance. We consider testing equality of two positive or two negative predictive values within a paired design in which all patients receive two diagnostic tests. The existing statistical tests for testing equality of predictive values are either Wald tests based on the multinomial distribution or the empirical Wald and generalized score tests within the generalized estimating equations (GEE) framework. As presented in the literature, these test statistics have considerably complex formulas without clear intuitive insight. We propose re-formulations which are mathematically equivalent but algebraically simple and intuitive. As is clearly seen with a new re-formulation we present, the generalized score statistic does not always reduce to the commonly used score statistic in the independent samples case. To alleviate this, we introduce a weighted generalized score (WGS) test statistic which incorporates the empirical covariance matrix with newly proposed weights. This statistic is simple to compute, it always reduces to the score statistic in the independent samples situation, and it preserves type I error better than the other statistics as demonstrated by simulations. Thus, we believe the proposed WGS statistic is the preferred statistic for testing equality of two predictive values and for corresponding sample size computations. The new formulas of the Wald statistics may be useful for easy computation of confidence intervals for difference of predictive values. The introduced concepts have potential to lead to development of the weighted generalized score test statistic in a general GEE setting. PMID:22912343

  6. Capturing rogue waves by multi-point statistics

    NASA Astrophysics Data System (ADS)

    Hadjihosseini, A.; Wächter, Matthias; Hoffmann, N. P.; Peinke, J.

    2016-01-01

    As an example of a complex system with extreme events, we investigate ocean wave states exhibiting rogue waves. We present a statistical method of data analysis based on multi-point statistics which for the first time allows the grasping of extreme rogue wave events in a highly satisfactory statistical manner. The key to the success of the approach is mapping the complexity of multi-point data onto the statistics of hierarchically ordered height increments for different time scales, for which we can show that a stochastic cascade process with Markov properties is governed by a Fokker-Planck equation. Conditional probabilities as well as the Fokker-Planck equation itself can be estimated directly from the available observational data. With this stochastic description surrogate data sets can in turn be generated, which makes it possible to work out arbitrary statistical features of the complex sea state in general, and extreme rogue wave events in particular. The results also open up new perspectives for forecasting the occurrence probability of extreme rogue wave events, and even for forecasting the occurrence of individual rogue waves based on precursory dynamics.
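    The core quantities in this kind of multi-point analysis are the statistics of height increments at nested time scales and their conditional distributions. The sketch below, assuming a synthetic surface-elevation record and histogram-based density estimates, is only a schematic of that first step, not the authors' full Markov/Fokker-Planck reconstruction.

```python
import numpy as np

def increment(h, tau):
    """Height increment series h(t + tau) - h(t) for a lag of tau samples."""
    return h[tau:] - h[:-tau]

def conditional_pdf(x_small, x_large, bins=41):
    """Histogram estimate of p(x_small | x_large), the conditional statistic
    on which the Markov / Fokker-Planck description rests.
    A schematic estimator, not the authors' full multi-point analysis."""
    joint, xe, ye = np.histogram2d(x_small, x_large, bins=bins, density=True)
    marg = joint.sum(axis=0) * np.diff(xe)[0]     # marginal of the conditioning variable
    with np.errstate(invalid="ignore", divide="ignore"):
        cond = joint / marg
    return cond, xe, ye

# Example with a synthetic surface-elevation record (random walk as a stand-in).
rng = np.random.default_rng(1)
h = np.cumsum(rng.standard_normal(100_000))
tau1, tau2 = 10, 40                               # two nested time scales
x2 = increment(h, tau2)
x1 = increment(h, tau1)[: x2.size]
cond, xe, ye = conditional_pdf(x1, x2)
print("conditional PDF estimated on a", cond.shape, "grid")
```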

  7. System of multifunctional laser polarimetry of phase and amplitude anisotropy in the diagnosis of endometriosis

    NASA Astrophysics Data System (ADS)

    Ushenko, Yu. O.; Dubolazov, O. V.; Olar, O. V.

    2015-11-01

    The theoretical background of an azimuthally stable method of Jones-matrix mapping of histological sections of biopsy of the uterine neck, based on spatial-frequency selection of the mechanisms of linear and circular birefringence, is presented. The comparative results of measuring the coordinate distributions of the complex degree of mutual anisotropy formed by polycrystalline networks of blood plasma layers of donors (group 1) and patients with endometriosis (group 2) are given. The values and ranges of variation of the statistical parameters (moments of the 1st to 4th order) of the coordinate distributions of the complex degree of mutual anisotropy are studied. Objective criteria for diagnosing the pathology and differentiating its severity degree are determined.

  8. Multifunctional polarization tomography of optical anisotropy of biological layers in diagnosis of endometriosis

    NASA Astrophysics Data System (ADS)

    Ushenko, O. G.; Koval, L. D.; Dubolazov, O. V.; Ushenko, Yu. O.; Savich, V. O.; Sidor, M. I.; Marchuk, Yu. F.

    2015-09-01

    The theoretical background of an azimuthally stable method of Jones-matrix mapping of histological sections of biopsy of the uterine neck, based on spatial-frequency selection of the mechanisms of linear and circular birefringence, is presented. The comparative results of measuring the coordinate distributions of the complex degree of mutual anisotropy formed by polycrystalline networks of blood plasma layers of donors (group 1) and patients with endometriosis (group 2) are given. The values and ranges of variation of the statistical parameters (moments of the 1st to 4th order) of the coordinate distributions of the complex degree of mutual anisotropy are studied. Objective criteria for diagnosing the pathology and differentiating its severity degree are determined.

  9. Parameterization of light absorption by components of seawater in optically complex coastal waters of the Crimea Peninsula (Black Sea).

    PubMed

    Dmitriev, Egor V; Khomenko, Georges; Chami, Malik; Sokolov, Anton A; Churilova, Tatyana Y; Korotaev, Gennady K

    2009-03-01

    The absorption of sunlight by oceanic constituents significantly contributes to the spectral distribution of the water-leaving radiance. Here it is shown that current parameterizations of absorption coefficients do not apply to the optically complex waters of the Crimea Peninsula. Based on in situ measurements, parameterizations of phytoplankton, nonalgal, and total particulate absorption coefficients are proposed. Their performance is evaluated using a log-log regression combined with a low-pass filter and the nonlinear least-square method. Statistical significance of the estimated parameters is verified using the bootstrap method. The parameterizations are relevant for chlorophyll a concentrations ranging from 0.45 up to 2 mg/m³.

  10. Stan: Statistical inference

    NASA Astrophysics Data System (ADS)

    Stan Development Team

    2018-01-01

    Stan facilitates statistical inference at the frontiers of applied statistics and provides both a modeling language for specifying complex statistical models and a library of statistical algorithms for computing inferences with those models. These components are exposed through interfaces in environments such as R, Python, and the command line.

  11. [Factor Analysis: Principles to Evaluate Measurement Tools for Mental Health].

    PubMed

    Campo-Arias, Adalberto; Herazo, Edwin; Oviedo, Heidi Celina

    2012-09-01

    The validation of a measurement tool in mental health is a complex process that usually starts by estimating reliability, to later approach its validity. Factor analysis is a way to know the number of dimensions, domains or factors of a measuring tool, generally related to the construct validity of the scale. The analysis could be exploratory or confirmatory, and helps in the selection of the items with better performance. For an acceptable factor analysis, it is necessary to follow some steps and recommendations, conduct some statistical tests, and rely on a proper sample of participants. Copyright © 2012 Asociación Colombiana de Psiquiatría. Publicado por Elsevier España. All rights reserved.
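    As a hedged illustration of the exploratory step, the Python sketch below fits a two-factor model to synthetic item scores with scikit-learn's FactorAnalysis; the item data, number of factors, and use of varimax rotation (available in scikit-learn 0.24 and later) are assumptions for demonstration only.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Illustrative only: X stands in for an (n_respondents, n_items) matrix of scale scores.
rng = np.random.default_rng(0)
latent = rng.standard_normal((500, 2))                 # two underlying dimensions
loadings = rng.uniform(0.5, 1.0, size=(2, 10))
X = latent @ loadings + 0.5 * rng.standard_normal((500, 10))

# Exploratory factor analysis with varimax rotation (scikit-learn >= 0.24).
fa = FactorAnalysis(n_components=2, rotation="varimax")
fa.fit(X)
print(fa.components_.round(2))   # item loadings on each extracted factor
```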

  12. Evaluation of selected recurrence measures in discriminating pre-ictal and inter-ictal periods from epileptic EEG data

    NASA Astrophysics Data System (ADS)

    Ngamga, Eulalie Joelle; Bialonski, Stephan; Marwan, Norbert; Kurths, Jürgen; Geier, Christian; Lehnertz, Klaus

    2016-04-01

    We investigate the suitability of selected measures of complexity based on recurrence quantification analysis and recurrence networks for an identification of pre-seizure states in multi-day, multi-channel, invasive electroencephalographic recordings from five epilepsy patients. We employ several statistical techniques to avoid spurious findings due to various influencing factors and due to multiple comparisons and observe precursory structures in three patients. Our findings indicate a high congruence among measures in identifying seizure precursors and emphasize the current notion of seizure generation in large-scale epileptic networks. A final judgment of the suitability for field studies, however, requires evaluation on a larger database.
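    A compact illustration of the recurrence-based quantities such studies build on: the sketch below embeds a scalar series, thresholds the pairwise distance matrix into a recurrence matrix, and computes the recurrence rate and determinism. Embedding dimension, lag, and threshold are placeholder choices and would need careful tuning for real EEG.

```python
import numpy as np

def embed(x, dim=3, lag=5):
    """Time-delay embedding of a scalar series."""
    n = len(x) - (dim - 1) * lag
    return np.column_stack([x[i * lag: i * lag + n] for i in range(dim)])

def recurrence_measures(x, dim=3, lag=5, eps=None, lmin=2):
    """Recurrence rate and determinism from a thresholded recurrence matrix.
    A compact sketch of RQA-style measures; dim, lag, and eps are placeholders."""
    X = embed(np.asarray(x, float), dim, lag)
    d = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)
    if eps is None:
        eps = 0.1 * d.max()
    R = (d <= eps).astype(int)
    rr = R.mean()
    # Determinism: fraction of recurrent points lying on diagonal lines >= lmin.
    diag_points = 0
    for k in range(1, R.shape[0]):
        run = 0
        for v in np.diagonal(R, offset=k):
            if v:
                run += 1
                if run == lmin:
                    diag_points += lmin
                elif run > lmin:
                    diag_points += 1
            else:
                run = 0
    det = 2 * diag_points / max(R.sum() - np.trace(R), 1)
    return rr, det

rr, det = recurrence_measures(np.sin(np.linspace(0, 20 * np.pi, 1000)))
print(f"recurrence rate = {rr:.3f}, determinism = {det:.3f}")
```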

  13. Spectroscopic studies of Np(V) complexation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stout, B.E.

    The complexation of Np(V) with aliphatic (oxalic, malonic, succinic, glutaric, and maleic) and aromatic (phthalic, pyromellitic, hemimellitic, trimellitic, and mellitic) polycarboxylic acids was studied by spectrophotometry at 1 M ionic strength (NaClO₄) and 23 °C. For the aliphatic systems, the stability of the neptunyl complexes was found to decrease as the carbon chain length of the ligand increased, which was attributed to an entropy effect. In polycarboxylate systems, the stability constant decreased in the order hemimellitate > mellitate > pyromellitate > trimellitate, phthalate. With the exception of hemimellitate, this trend follows the order of decreasing basicity of the ligand. After correction of the stability constants for statistical effects, the stabilities of the mellitate, pyromellitate, trimellitate, and phthalate complexes were approximately the same. The unexpected strength of the hemimellitate complexation was attributed to an increase in electron density at the binding site from the non-chelating carboxylate group through induction. The complexation of phthalate, trimellitate, hemimellitate, and mellitate was studied as a function of pH. Trimellitate and mellitate were found to form ML as well as ML2 complexes, while for phthalate and hemimellitate only ML species were observed. The stability constants of the cation-cation complexes Np(V)-U(VI) and Np(V)-Np(V), measured at 6 M ionic strength (HClO₄) and 25 °C, were found to be 2.45 ± 0.05 and 1.41 ± 0.14, respectively. The change in enthalpy for the Np(V)-U(VI) system, as determined by measuring the stability constant as a function of temperature, was -14.3 ± 1.6 kJ/mol.

  14. Sedimentological analysis and bed thickness statistics from a Carboniferous deep-water channel-levee complex: Myall Trough, SE Australia

    NASA Astrophysics Data System (ADS)

    Palozzi, Jason; Pantopoulos, George; Maravelis, Angelos G.; Nordsvan, Adam; Zelilidis, Avraam

    2018-02-01

    This investigation presents an outcrop-based integrated study of internal division analysis and statistical treatment of turbidite bed thickness applied to a Carboniferous deep-water channel-levee complex in the Myall Trough, southeast Australia. Turbidite beds of the studied succession are characterized by a range of sedimentary structures grouped into two main associations, a thick-bedded and a thin-bedded one, that reflect channel-fill and overbank/levee deposits, respectively. Three vertically stacked channel-levee cycles have been identified. Results of statistical analysis of bed thickness, grain-size and internal division patterns applied on the studied channel-levee succession, indicate that turbidite bed thickness data seem to be well characterized by a bimodal lognormal distribution, which is possibly reflecting the difference between deposition from lower-density flows (in a levee/overbank setting) and very high-density flows (in a channel fill setting). Power law and exponential distributions were observed to hold only for the thick-bedded parts of the succession and cannot characterize the whole bed thickness range of the studied sediments. The succession also exhibits non-random clustering of bed thickness and grain-size measurements. The studied sediments are also characterized by the presence of statistically detected fining-upward sandstone packets. A novel quantitative approach (change-point analysis) is proposed for the detection of those packets. Markov permutation statistics also revealed the existence of order in the alternation of internal divisions in the succession expressed by an optimal internal division cycle reflecting two main types of gravity flow events deposited within both thick-bedded conglomeratic and thin-bedded sandstone associations. The analytical methods presented in this study can be used as additional tools for quantitative analysis and recognition of depositional environments in hydrocarbon-bearing research of ancient deep-water channel-levee settings.
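    Since a bimodal lognormal in thickness is simply a two-component Gaussian mixture in log-thickness, a quick way to reproduce that part of the analysis is sketched below with scikit-learn; the synthetic "thin" and "thick" bed populations are stand-ins for field measurements, not data from the Myall Trough.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Illustrative bed-thickness values in cm (synthetic stand-ins for field data).
rng = np.random.default_rng(3)
thin = rng.lognormal(mean=1.0, sigma=0.4, size=300)    # levee/overbank beds
thick = rng.lognormal(mean=3.0, sigma=0.5, size=200)   # channel-fill beds
thickness = np.concatenate([thin, thick])

# A bimodal lognormal in thickness is a two-component Gaussian mixture in log space.
logt = np.log(thickness).reshape(-1, 1)
gm = GaussianMixture(n_components=2, random_state=0).fit(logt)
for w, mu, var in zip(gm.weights_, gm.means_.ravel(), gm.covariances_.ravel()):
    print(f"weight={w:.2f}  median={np.exp(mu):.1f} cm  sigma_log={np.sqrt(var):.2f}")
```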

  15. Visualizing Teacher Education as a Complex System: A Nested Simplex System Approach

    ERIC Educational Resources Information Center

    Ludlow, Larry; Ell, Fiona; Cochran-Smith, Marilyn; Newton, Avery; Trefcer, Kaitlin; Klein, Kelsey; Grudnoff, Lexie; Haigh, Mavis; Hill, Mary F.

    2017-01-01

    Our purpose is to provide an exploratory statistical representation of initial teacher education as a complex system comprised of dynamic influential elements. More precisely, we reveal what the system looks like for differently-positioned teacher education stakeholders based on our framework for gathering, statistically analyzing, and graphically…

  16. Impulse Response Operators for Structural Complexes

    DTIC Science & Technology

    1990-05-12

    systems of the complex. The statistical energy analysis (SEA) is one such device [13, 14]. The rendering of SEA from equation (21) and/or (25) lies...Propagation.] 13. L. Cremer, M. Heckl, and E.E. Ungar 1973 Structure-Borne Sound (Springer Verlag). 14. R. H. Lyon 1975 Statistical Energy Analysis of

  17. Selecting the "Best" Factor Structure and Moving Measurement Validation Forward: An Illustration.

    PubMed

    Schmitt, Thomas A; Sass, Daniel A; Chappelle, Wayne; Thompson, William

    2018-04-09

    Despite the broad literature base on factor analysis best practices, research seeking to evaluate a measure's psychometric properties frequently fails to consider or follow these recommendations. This leads to incorrect factor structures, numerous and often overly complex competing factor models and, perhaps most harmful, biased model results. Our goal is to demonstrate a practical and actionable process for factor analysis through (a) an overview of six statistical and psychometric issues and approaches to be aware of, investigate, and report when engaging in factor structure validation, along with a flowchart for recommended procedures to understand latent factor structures; (b) demonstrating these issues to provide a summary of the updated Posttraumatic Stress Disorder Checklist (PCL-5) factor models and a rationale for validation; and (c) conducting a comprehensive statistical and psychometric validation of the PCL-5 factor structure to demonstrate all the issues we described earlier. Considering previous research, the PCL-5 was evaluated using a sample of 1,403 U.S. Air Force remotely piloted aircraft operators with high levels of battlefield exposure. Previously proposed PCL-5 factor structures were not supported by the data, but instead a bifactor model is arguably more statistically appropriate.

  18. Measurements and analysis in imaging for biomedical applications

    NASA Astrophysics Data System (ADS)

    Hoeller, Timothy L.

    2009-02-01

    A Total Quality Management (TQM) approach can be used to analyze data from biomedical optical and imaging platforms of tissues. A shift from individuals to teams, partnerships, and total participation is necessary from health care groups for improved prognostics using measurement analysis. Proprietary measurement analysis software is available for calibrated, pixel-to-pixel measurements of angles and distances in digital images. Feature size, count, and color are determinable on an absolute and comparative basis. Although changes in histological images depend on numerous complex factors, correlations between image-analysis measurements and the time course, extent, and progression of illness can be derived. Statistical methods are preferred. Applications of the proprietary measurement software are available for any imaging platform. Quantification of results provides improved categorization of illness towards better health. As health care practitioners try to use quantified measurement data for patient diagnosis, the techniques reported can be used to track and isolate causes better. Comparisons, norms, and trends are available from the processing of measurement data, which is obtained easily and quickly from Scientific Software and methods. Example results for the class actions of Preventative and Corrective Care in Ophthalmology and Dermatology, respectively, are provided. Improved and quantified diagnosis can lead to better health and lower costs associated with health care. Systems support improvements towards Lean and Six Sigma affecting all branches of biology and medicine. As an example of the use of statistics, the major types of variation involved in a study of Bone Mineral Density (BMD) are examined. Typically, special causes in medicine relate to illness and activities, whereas common causes are known to be associated with gender, race, size, and genetic make-up. Such a strategy of Continuous Process Improvement (CPI) involves comparison of patient results to baseline data using F-statistics. Self-pairings over time are also useful. Special and common causes are identified apart from aging when applying the statistical methods. In the future, implementation of imaging measurement methods by research staff, doctors, and concerned patient partners will result in improved health diagnosis, reporting, and cause determination. The long-term prospects for quantified measurements are better quality in imaging analysis, with applications of higher utility for health care providers.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sen, Satyabrata; Rao, Nageswara S; Wu, Qishi

    There have been increasingly large deployments of radiation detection networks that require computationally fast algorithms to produce prompt results over ad-hoc sub-networks of mobile devices, such as smart-phones. These algorithms are in sharp contrast to complex network algorithms that require all measurements to be sent to powerful central servers. In this work, at individual sensors, we employ Wald-statistic-based detection algorithms, which are computationally very fast and are implemented as one of three Z-tests and four chi-square tests. At the fusion center, we apply K-out-of-N fusion to combine the sensors' hard decisions. We characterize the performance of the detection methods by deriving analytical expressions for the distributions of the underlying test statistics, and by analyzing the fusion performance in terms of K, N, and the false-alarm rates of individual detectors. We experimentally validate our methods using measurements from indoor and outdoor characterization tests of the Intelligence Radiation Sensors Systems (IRSS) program. In particular, utilizing the outdoor measurements, we construct two important real-life scenarios, boundary surveillance and portal monitoring, and present the results of our algorithms.
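    A toy version of the two-stage scheme, assuming Gaussian background counts: each sensor makes a Z-test decision against its background statistics, and the fusion center applies a K-out-of-N rule. The thresholds, background parameters, and counts below are illustrative; the paper's actual detectors include several Z- and chi-square variants.

```python
import numpy as np
from scipy.stats import norm

def z_test_decision(counts, bkg_mean, bkg_std, alpha=1e-3):
    """Per-sensor Z-type decision: flag a source if the observed count rate
    exceeds the background by more than the (1 - alpha) normal quantile.
    A simplified stand-in for the paper's Z- and chi-square tests."""
    z = (counts - bkg_mean) / bkg_std
    return z > norm.ppf(1 - alpha)

def k_out_of_n_fusion(decisions, k):
    """Fusion center: declare detection if at least k of the N sensors fire."""
    return np.sum(decisions) >= k

# Example: N = 5 sensors, background 100 counts/s with sigma 10, K = 2.
rng = np.random.default_rng(7)
counts = rng.normal(loc=[100, 103, 141, 98, 146], scale=10)
local = z_test_decision(counts, bkg_mean=100.0, bkg_std=10.0)
print(local, "->", k_out_of_n_fusion(local, k=2))
```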

  20. Statistical analysis and ANN modeling for predicting hydrological extremes under climate change scenarios: the example of a small Mediterranean agro-watershed.

    PubMed

    Kourgialas, Nektarios N; Dokou, Zoi; Karatzas, George P

    2015-05-01

    The purpose of this study was to create a modeling management tool for the simulation of extreme flow events under current and future climatic conditions. This tool is a combination of different components and can be applied in complex hydrogeological river basins, where frequent flood and drought phenomena occur. The first component is the statistical analysis of the available hydro-meteorological data. Specifically, principal components analysis was performed in order to quantify the importance of the hydro-meteorological parameters that affect the generation of extreme events. The second component is a prediction-forecasting artificial neural network (ANN) model that simulates, accurately and efficiently, river flow on an hourly basis. This model is based on a methodology that attempts to resolve a very difficult problem related to the accurate estimation of extreme flows. For this purpose, the available measurements (5 years of hourly data) were divided in two subsets: one for the dry and one for the wet periods of the hydrological year. This way, two ANNs were created, trained, tested and validated for a complex Mediterranean river basin in Crete, Greece. As part of the second management component a statistical downscaling tool was used for the creation of meteorological data according to the higher and lower emission climate change scenarios A2 and B1. These data are used as input in the ANN for the forecasting of river flow for the next two decades. The final component is the application of a meteorological index on the measured and forecasted precipitation and flow data, in order to assess the severity and duration of extreme events. Copyright © 2015 Elsevier Ltd. All rights reserved.
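    A minimal sketch of the statistical-analysis-plus-ANN pipeline, assuming synthetic hourly predictors and flows: PCA reduces the hydro-meteorological inputs and a small multilayer perceptron maps the leading components to river flow. Predictor scalings, network size, and the train/test split are illustrative choices, not those of the published model.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

# Synthetic hourly predictors (rainfall, temperature, ...) and river flow;
# purely illustrative stand-ins for the hydro-meteorological record.
rng = np.random.default_rng(10)
scale = np.array([3.0, 2.0, 1.5, 0.2, 0.2, 0.2])       # a few dominant drivers
X = rng.normal(size=(2 * 365 * 24, 6)) * scale
flow = 2.0 * X[:, 0] + 0.5 * X[:, 1] ** 2 + rng.normal(scale=0.3, size=X.shape[0])

# Step 1: PCA ranks the drivers; step 2: an ANN maps the components to flow.
Xp = PCA(n_components=3).fit_transform(X)
X_tr, X_te, y_tr, y_te = train_test_split(Xp, flow, test_size=0.25, random_state=0)
ann = MLPRegressor(hidden_layer_sizes=(20,), max_iter=500, random_state=0).fit(X_tr, y_tr)
print("test R^2:", round(ann.score(X_te, y_te), 3))
```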

  1. Measuring word complexity in speech screening: single-word sampling to identify phonological delay/disorder in preschool children.

    PubMed

    Anderson, Carolyn; Cohen, Wendy

    2012-01-01

    Children's speech sound development is assessed by comparing speech production with the typical development of speech sounds based on a child's age and developmental profile. One widely used method of sampling is to elicit a single-word sample along with connected speech. Words produced spontaneously rather than imitated may give a more accurate indication of a child's speech development. A published word complexity measure can be used to score later-developing speech sounds and more complex word patterns. There is a need for a screening word list that is quick to administer and reliably differentiates children with typically developing speech from children with patterns of delayed/disordered speech. To identify a short word list based on word complexity that could be spontaneously named by most typically developing children aged 3;00-5;05 years. One hundred and five children aged between 3;00 and 5;05 years from three local authority nursery schools took part in the study. Items from a published speech assessment were modified and extended to include a range of phonemic targets in different word positions in 78 monosyllabic and polysyllabic words. The 78 words were ranked both by phonemic/phonetic complexity as measured by word complexity and by ease of spontaneous production. The ten most complex words (hereafter Triage 10) were named spontaneously by more than 90% of the children. There was no significant difference between the complexity measures for five identified age groups when the data were examined in 6-month groups. A qualitative analysis revealed eight children with profiles of phonological delay or disorder. When these children were considered separately, there was a statistically significant difference (p < 0.005) between the mean word complexity measure of the group compared with the mean for the remaining children in all other age groups. The Triage 10 words reliably differentiated children with typically developing speech from those with delayed or disordered speech patterns. The Triage 10 words can be used as a screening tool for triage and general assessment and have the potential to monitor progress during intervention. Further testing is being undertaken to establish reliability with children referred to speech and language therapy services. © 2012 Royal College of Speech and Language Therapists.

  2. Digital Reef Rugosity Estimates Coral Reef Habitat Complexity

    PubMed Central

    Dustan, Phillip; Doherty, Orla; Pardede, Shinta

    2013-01-01

    Ecological habitats with greater structural complexity contain more species due to increased niche diversity. This is especially apparent on coral reefs where individual coral colonies aggregate to give a reef its morphology, species zonation, and three dimensionality. Structural complexity is classically measured with a reef rugosity index, which is the ratio of a straight line transect to the distance a flexible chain of equal length travels when draped over the reef substrate; yet, other techniques from visual categories to remote sensing have been used to characterize structural complexity at scales from microhabitats to reefscapes. Reef-scale methods either lack quantitative precision or are too time consuming to be routinely practical, while remotely sensed indices are mismatched to the finer scale morphology of coral colonies and reef habitats. In this communication a new digital technique, Digital Reef Rugosity (DRR) is described which utilizes a self-contained water level gauge enabling a diver to quickly and accurately characterize rugosity with non-invasive millimeter scale measurements of coral reef surface height at decimeter intervals along meter scale transects. The precise measurements require very little post-processing and are easily imported into a spreadsheet for statistical analyses and modeling. To assess its applicability we investigated the relationship between DRR and fish community structure at four coral reef sites on Menjangan Island off the northwest corner of Bali, Indonesia and one on mainland Bali to the west of Menjangan Island; our findings show a positive relationship between DRR and fish diversity. Since structural complexity drives key ecological processes on coral reefs, we consider that DRR may become a useful quantitative community-level descriptor to characterize reef complexity. PMID:23437380
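    The chain-method rugosity index itself is easy to compute from a profile of surface heights, which is essentially what DRR measures at decimetre spacing. The sketch below is a simple numerical analogue with a synthetic profile; the transect length and sampling interval are assumptions.

```python
import numpy as np

def chain_rugosity(heights, dx):
    """Rugosity index: draped (chain) length divided by straight-line length,
    from surface heights sampled every dx metres along a transect.
    A numerical analogue of the chain method / DRR profile measurements."""
    dh = np.diff(np.asarray(heights, float))
    chain_length = np.sum(np.sqrt(dx ** 2 + dh ** 2))
    straight_length = dx * dh.size
    return chain_length / straight_length

# Example: heights (m) sampled every 0.1 m along a 10 m transect.
rng = np.random.default_rng(5)
profile = 0.15 * np.sin(np.linspace(0, 12 * np.pi, 101)) + 0.02 * rng.standard_normal(101)
print(f"rugosity index: {chain_rugosity(profile, dx=0.1):.2f}")
```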

  3. Digital reef rugosity estimates coral reef habitat complexity.

    PubMed

    Dustan, Phillip; Doherty, Orla; Pardede, Shinta

    2013-01-01

    Ecological habitats with greater structural complexity contain more species due to increased niche diversity. This is especially apparent on coral reefs where individual coral colonies aggregate to give a reef its morphology, species zonation, and three dimensionality. Structural complexity is classically measured with a reef rugosity index, which is the ratio of a straight line transect to the distance a flexible chain of equal length travels when draped over the reef substrate; yet, other techniques from visual categories to remote sensing have been used to characterize structural complexity at scales from microhabitats to reefscapes. Reef-scale methods either lack quantitative precision or are too time consuming to be routinely practical, while remotely sensed indices are mismatched to the finer scale morphology of coral colonies and reef habitats. In this communication a new digital technique, Digital Reef Rugosity (DRR) is described which utilizes a self-contained water level gauge enabling a diver to quickly and accurately characterize rugosity with non-invasive millimeter scale measurements of coral reef surface height at decimeter intervals along meter scale transects. The precise measurements require very little post-processing and are easily imported into a spreadsheet for statistical analyses and modeling. To assess its applicability we investigated the relationship between DRR and fish community structure at four coral reef sites on Menjangan Island off the northwest corner of Bali, Indonesia and one on mainland Bali to the west of Menjangan Island; our findings show a positive relationship between DRR and fish diversity. Since structural complexity drives key ecological processes on coral reefs, we consider that DRR may become a useful quantitative community-level descriptor to characterize reef complexity.

  4. The improved degree of urban road traffic network: A case study of Xiamen, China

    NASA Astrophysics Data System (ADS)

    Wang, Shiguang; Zheng, Lili; Yu, Dexin

    2017-03-01

    The complex network theory is applied to the study of urban road traffic network topology, and we construct a new measure to characterize an urban road network. It is inspiring to quantify more appropriately the interaction between nodes in complex networks, especially in the field of traffic. The measure takes into account properties of lanes (e.g. number of lanes, width, traffic direction). As such, it is a more comprehensive measure in comparison to previous network measures and can be used to grasp the features of an urban street network more clearly. We applied this measure to the road network in Xiamen, China. Based on a standard method from statistical physics, we examined in more detail the distribution of this new measure and found that (1) owing to the limitations of spatial geographic attributes, traditional conclusions obtained by applying the original definition of degree to urban street networks modeled with the primal approach are not very persuasive; (2) both the direction of the network connections and the odd or even classification of the degree need to be analyzed specifically; (3) the improved degree distribution presents an obvious hierarchy, the hierarchical values conform to a power-law distribution, and the correlation of our new measure shows some significant segmentation of the urban road network.

  5. High precision mass measurements for wine metabolomics

    PubMed Central

    Roullier-Gall, Chloé; Witting, Michael; Gougeon, Régis D.; Schmitt-Kopplin, Philippe

    2014-01-01

    An overview of the critical steps for the non-targeted Ultra-High Performance Liquid Chromatography coupled with Quadrupole Time-of-Flight Mass Spectrometry (UPLC-Q-ToF-MS) analysis of wine chemistry is given, ranging from the study design, data preprocessing and statistical analyses, to markers identification. UPLC-Q-ToF-MS data was enhanced by the alignment of exact mass data from FTICR-MS, and marker peaks were identified using UPLC-Q-ToF-MS2. In combination with multivariate statistical tools and the annotation of peaks with metabolites from relevant databases, this analytical process provides a fine description of the chemical complexity of wines, as exemplified in the case of red (Pinot noir) and white (Chardonnay) wines from various geographic origins in Burgundy. PMID:25431760

  6. High precision mass measurements for wine metabolomics

    NASA Astrophysics Data System (ADS)

    Roullier-Gall, Chloé; Witting, Michael; Gougeon, Régis; Schmitt-Kopplin, Philippe

    2014-11-01

    An overview of the critical steps for the non-targeted Ultra-High Performance Liquid Chromatography coupled with Quadrupole Time-of-Flight Mass Spectrometry (UPLC-Q-ToF-MS) analysis of wine chemistry is given, ranging from the study design, data preprocessing and statistical analyses, to markers identification. UPLC-Q-ToF-MS data was enhanced by the alignment of exact mass data from FTICR-MS, and marker peaks were identified using UPLC-Q-ToF-MS². In combination with multivariate statistical tools and the annotation of peaks with metabolites from relevant databases, this analytical process provides a fine description of the chemical complexity of wines, as exemplified in the case of red (Pinot noir) and white (Chardonnay) wines from various geographic origins in Burgundy.

  7. Current algebra, statistical mechanics and quantum models

    NASA Astrophysics Data System (ADS)

    Vilela Mendes, R.

    2017-11-01

    Results obtained in the past for free boson systems at zero and nonzero temperatures are revisited to clarify the physical meaning of current algebra reducible functionals which are associated to systems with density fluctuations, leading to observable effects on phase transitions. To use current algebra as a tool for the formulation of quantum statistical mechanics amounts to the construction of unitary representations of diffeomorphism groups. Two mathematical equivalent procedures exist for this purpose. One searches for quasi-invariant measures on configuration spaces, the other for a cyclic vector in Hilbert space. Here, one argues that the second approach is closer to the physical intuition when modelling complex systems. An example of application of the current algebra methodology to the pairing phenomenon in two-dimensional fermion systems is discussed.

  8. 3D shape recovery from image focus using gray level co-occurrence matrix

    NASA Astrophysics Data System (ADS)

    Mahmood, Fahad; Munir, Umair; Mehmood, Fahad; Iqbal, Javaid

    2018-04-01

    Recovering a precise and accurate 3-D shape of the target object using a robust 3-D shape recovery algorithm is an ultimate objective of the computer vision community. The focus measure algorithm plays an important role in this architecture; it converts the color values of each pixel of the acquired 2-D image dataset into corresponding focus values. After convolving the focus measure filter with the input 2-D image dataset, a 3-D shape recovery approach is applied to recover the depth map. In this document, we propose the Gray Level Co-occurrence Matrix (GLCM) along with its statistical features for computing the focus information of the image dataset. The Gray Level Co-occurrence Matrix quantifies the texture present in the image using statistical features derived from the joint probability distribution of gray-level pairs in the input image. Finally, we quantify the focus value of the input image using a Gaussian Mixture Model. Due to its low computational complexity, sharp focus measure curve, robustness to random noise sources, and accuracy, it is considered a superior alternative to most recently proposed 3-D shape recovery approaches. The algorithm is investigated in depth on real image sequences and a synthetic image dataset. The efficiency of the proposed scheme is also compared with state-of-the-art 3-D shape recovery approaches. Finally, by means of two global statistical measures, root mean square error and correlation, we claim that this approach, in spite of its simplicity, generates accurate results.
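    A hedged sketch of a GLCM-based focus value using scikit-image (graycomatrix/graycoprops in skimage 0.19 and later; older releases spell them greycomatrix/greycoprops). Using only the contrast feature as the sharpness proxy is a simplification of the paper's approach, which combines several GLCM statistics with a Gaussian Mixture Model.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_focus_value(gray_patch):
    """Focus value of an 8-bit grayscale patch from GLCM statistics.
    GLCM contrast as the sharpness proxy is an illustrative choice only."""
    glcm = graycomatrix(gray_patch, distances=[1], angles=[0, np.pi / 2],
                        levels=256, symmetric=True, normed=True)
    # Contrast rises with high-frequency texture, i.e. with sharper focus.
    return graycoprops(glcm, "contrast").mean()

# A sharply textured patch should score higher than a low-contrast one.
rng = np.random.default_rng(2)
sharp = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
blurred = (sharp.astype(float) * 0.2 + 128).astype(np.uint8)   # low-contrast stand-in
print(glcm_focus_value(sharp), ">", glcm_focus_value(blurred))
```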

  9. Parameters Selection for Bivariate Multiscale Entropy Analysis of Postural Fluctuations in Fallers and Non-Fallers Older Adults.

    PubMed

    Ramdani, Sofiane; Bonnet, Vincent; Tallon, Guillaume; Lagarde, Julien; Bernard, Pierre Louis; Blain, Hubert

    2016-08-01

    Entropy measures are often used to quantify the regularity of postural sway time series. Recent methodological developments provided both multivariate and multiscale approaches allowing the extraction of complexity features from physiological signals; see "Dynamical complexity of human responses: A multivariate data-adaptive framework," in Bulletin of Polish Academy of Science and Technology, vol. 60, p. 433, 2012. The resulting entropy measures are good candidates for the analysis of bivariate postural sway signals exhibiting nonstationarity and multiscale properties. These methods are dependent on several input parameters such as embedding parameters. Using two data sets collected from institutionalized frail older adults, we numerically investigate the behavior of a recent multivariate and multiscale entropy estimator; see "Multivariate multiscale entropy: A tool for complexity analysis of multichannel data," Physics Review E, vol. 84, p. 061918, 2011. We propose criteria for the selection of the input parameters. Using these optimal parameters, we statistically compare the multivariate and multiscale entropy values of postural sway data of non-faller subjects to those of fallers. These two groups are discriminated by the resulting measures over multiple time scales. We also demonstrate that the typical parameter settings proposed in the literature lead to entropy measures that do not distinguish the two groups. This last result confirms the importance of the selection of appropriate input parameters.
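    For orientation, the sketch below implements plain univariate multiscale sample entropy: coarse-grain the series at each scale, then compute SampEn with tolerance r times the standard deviation. The cited method is a multivariate, data-adaptive extension of this idea, so the code is a conceptual illustration only; m, r, and the scale range are placeholder settings.

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """Univariate sample entropy (SampEn) with tolerance r * std(x)."""
    x = np.asarray(x, float)
    tol = r * x.std()
    def count_matches(mm):
        templ = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        d = np.max(np.abs(templ[:, None, :] - templ[None, :, :]), axis=-1)
        # count unordered template pairs within tolerance, excluding self-matches
        return (np.sum(d <= tol) - len(templ)) / 2
    b, a = count_matches(m), count_matches(m + 1)
    return np.inf if a == 0 else -np.log(a / b)

def multiscale_entropy(x, scales=(1, 2, 3, 4, 5), m=2, r=0.2):
    """Coarse-grain the series at each scale, then compute SampEn."""
    out = []
    for s in scales:
        n = len(x) // s
        coarse = np.asarray(x[: n * s], float).reshape(n, s).mean(axis=1)
        out.append(sample_entropy(coarse, m, r))
    return out

rng = np.random.default_rng(4)
print(np.round(multiscale_entropy(rng.standard_normal(1500)), 3))
```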

  10. A Statistical Physics Characterization of the Complex Systems Dynamics: Quantifying Complexity from Spatio-Temporal Interactions

    PubMed Central

    Koorehdavoudi, Hana; Bogdan, Paul

    2016-01-01

    Biological systems are frequently categorized as complex systems due to their capabilities of generating spatio-temporal structures from apparent random decisions. In spite of research on analyzing biological systems, we lack a quantifiable framework for measuring their complexity. To fill this gap, in this paper, we develop a new paradigm to study a collective group of N agents moving and interacting in a three-dimensional space. Our paradigm helps to identify the spatio-temporal states of the motion of the group and their associated transition probabilities. This framework enables the estimation of the free energy landscape corresponding to the identified states. Based on the energy landscape, we quantify missing information, emergence, self-organization and complexity for a collective motion. We show that the collective motion of the group of agents evolves to reach the most probable state with relatively lowest energy level and lowest missing information compared to other possible states. Our analysis demonstrates that the natural group of animals exhibit a higher degree of emergence, self-organization and complexity over time. Consequently, this algorithm can be integrated into new frameworks to engineer collective motions to achieve certain degrees of emergence, self-organization and complexity. PMID:27297496

  11. Nonlinear Complexity Analysis of Brain fMRI Signals in Schizophrenia

    PubMed Central

    Sokunbi, Moses O.; Gradin, Victoria B.; Waiter, Gordon D.; Cameron, George G.; Ahearn, Trevor S.; Murray, Alison D.; Steele, Douglas J.; Staff, Roger T.

    2014-01-01

    We investigated the differences in brain fMRI signal complexity in patients with schizophrenia while performing the Cyberball social exclusion task, using measures of Sample entropy and Hurst exponent (H). 13 patients meeting Diagnostic and Statistical Manual of Mental Disorders, 4th Edition (DSM-IV) criteria for schizophrenia and 16 healthy controls underwent fMRI scanning at 1.5 T. The fMRI data of both groups of participants were pre-processed, the entropy characterized and the Hurst exponent extracted. Whole brain entropy and H maps of the groups were generated and analysed. The results after adjusting for age and sex differences together show that patients with schizophrenia exhibited higher complexity than healthy controls, at mean whole brain and regional levels. Also, both Sample entropy and Hurst exponent agree that patients with schizophrenia have more complex fMRI signals than healthy controls. These results suggest that schizophrenia is associated with more complex signal patterns when compared to healthy controls, supporting the increase in complexity hypothesis, where system complexity increases with age or disease, and also consistent with the notion that schizophrenia is characterised by a dysregulation of the nonlinear dynamics of underlying neuronal systems. PMID:24824731

  12. A Statistical Physics Characterization of the Complex Systems Dynamics: Quantifying Complexity from Spatio-Temporal Interactions

    NASA Astrophysics Data System (ADS)

    Koorehdavoudi, Hana; Bogdan, Paul

    2016-06-01

    Biological systems are frequently categorized as complex systems due to their capabilities of generating spatio-temporal structures from apparent random decisions. In spite of research on analyzing biological systems, we lack a quantifiable framework for measuring their complexity. To fill this gap, in this paper, we develop a new paradigm to study a collective group of N agents moving and interacting in a three-dimensional space. Our paradigm helps to identify the spatio-temporal states of the motion of the group and their associated transition probabilities. This framework enables the estimation of the free energy landscape corresponding to the identified states. Based on the energy landscape, we quantify missing information, emergence, self-organization and complexity for a collective motion. We show that the collective motion of the group of agents evolves to reach the most probable state with relatively lowest energy level and lowest missing information compared to other possible states. Our analysis demonstrates that the natural group of animals exhibit a higher degree of emergence, self-organization and complexity over time. Consequently, this algorithm can be integrated into new frameworks to engineer collective motions to achieve certain degrees of emergence, self-organization and complexity.

  13. Development of local complexity metrics to quantify the effect of anatomical noise on detectability of lung nodules in chest CT imaging

    NASA Astrophysics Data System (ADS)

    Solomon, Justin; Rubin, Geoffrey; Smith, Taylor; Harrawood, Brian; Choudhury, Kingshuk Roy; Samei, Ehsan

    2017-03-01

    The purpose of this study was to develop metrics of local anatomical complexity and compare them with detectability of lung nodules in CT. Data were drawn retrospectively from a published perception experiment in which detectability was assessed in cases enriched with virtual nodules (13 radiologists x 157 total nodules = 2041 responses). A local anatomical complexity metric called the distractor index was developed, defined as the Gaussian-weighted proportion (i.e., average) of distracting local voxels (50 voxels in-plane, 5 slices). A distracting voxel was classified by thresholding image data that had been selectively filtered to enhance nodule-like features. The distractor index was measured for each nodule location in the nodule-free images. The local pixel standard deviation (STD) was also measured for each nodule. Other confounding factors of search fraction (proportion of lung voxels to total voxels in the given slice) and peripheral distance (defined as the 3D distance of the nodule from the trachea bifurcation) were measured. A generalized linear mixed-effects statistical model (no interaction terms, probit link function, random reader term) was fit to the data to determine the influence of each metric on detectability. In order of decreasing effect size: distractor index, STD, and search fraction all significantly affected detectability (P < 0.001). Distance to the trachea did not have a significant effect (P > 0.05). These data demonstrate that local lung complexity degrades detection of lung nodules and the distractor index could serve as a good surrogate metric to quantify anatomical complexity.
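    The distractor index is described as a Gaussian-weighted proportion of "distracting" voxels in the selectively filtered, nodule-free data. A minimal interpretation is sketched below: threshold a local block of filtered image data and read off a Gaussian-smoothed average at the candidate location. The threshold, kernel widths, and synthetic sub-volume are assumptions, not the published parameters.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def distractor_index(slab, threshold, sigma=(1.0, 8.0, 8.0)):
    """Gaussian-weighted proportion of 'distracting' voxels around a candidate
    nodule location. `slab` is a (slices, rows, cols) block of nodule-enhanced
    image data centred on the location; threshold and sigma are placeholders."""
    distracting = (slab > threshold).astype(float)
    # Gaussian weighting = smoothed local average of the binary distractor mask.
    weighted = gaussian_filter(distracting, sigma=sigma)
    centre = tuple(s // 2 for s in slab.shape)
    return weighted[centre]

rng = np.random.default_rng(6)
slab = rng.normal(size=(5, 50, 50))      # stand-in for a filtered CT sub-volume
print(f"distractor index: {distractor_index(slab, threshold=1.5):.3f}")
```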

  14. Deep Generative Models of Galaxy Images for the Calibration of the Next Generation of Weak Lensing Surveys

    NASA Astrophysics Data System (ADS)

    Lanusse, Francois; Ravanbakhsh, Siamak; Mandelbaum, Rachel; Schneider, Jeff; Poczos, Barnabas

    2017-01-01

    Weak gravitational lensing has long been identified as one of the most powerful probes to investigate the nature of dark energy. As such, weak lensing is at the heart of the next generation of cosmological surveys such as LSST, Euclid or WFIRST. One particularly critical source of systematic errors in these surveys comes from the shape measurement algorithms tasked with estimating galaxy shapes. GREAT3, the last community challenge to assess the quality of state-of-the-art shape measurement algorithms, has in particular demonstrated that all current methods are biased to various degrees and, more importantly, that these biases depend on the details of the galaxy morphologies. These biases can be measured and calibrated by generating mock observations where a known lensing signal has been introduced and comparing the resulting measurements to the ground truth. Producing these mock observations, however, requires input galaxy images of higher resolution and S/N than the simulated survey, which typically implies acquiring extremely expensive space-based observations. The goal of this work is to train a deep generative model on already available Hubble Space Telescope data which can then be used to sample new galaxy images conditioned on parameters such as magnitude, size or redshift and exhibiting complex morphologies. Such a model allows us to inexpensively produce large sets of realistic images for calibration purposes. We implement a conditional generative model based on state-of-the-art deep learning methods and fit it to deep galaxy images from the COSMOS survey. The quality of the model is assessed by computing an extensive set of galaxy morphology statistics on the generated images. Beyond simple second-moment statistics such as size and ellipticity, we apply more complex statistics specifically designed to be sensitive to disturbed galaxy morphologies. We find excellent agreement between the morphologies of real and model-generated galaxies. Our results suggest that such deep generative models represent a reliable alternative to the acquisition of expensive high-quality observations for generating the calibration data needed by the next generation of weak lensing surveys.

  15. Texture functions in image analysis: A computationally efficient solution

    NASA Technical Reports Server (NTRS)

    Cox, S. C.; Rose, J. F.

    1983-01-01

    A computationally efficient means for calculating texture measurements from digital images by use of the co-occurrence technique is presented. The calculation of the statistical descriptors of image texture and a solution that circumvents the need for calculating and storing a co-occurrence matrix are discussed. The results show that existing efficient algorithms for calculating sums, sums of squares, and cross products can be used to compute complex co-occurrence relationships directly from the digital image input.

  16. The Role of IQGAP1 in Breast Carcinoma

    DTIC Science & Technology

    2012-10-01

    and"-tubulin expression was measured as described above. Statistical Analysis —All experiments were repeated inde- pendently at least three times...IQGAP1 Binds HER2—In vitro analysis with pure proteins was used to examine a possible interaction between IQGAP1 and HER2. GST alone or GST-HER2 was...incubated with puri- fied IQGAP1, and complexes were isolated with glutathione- Sepharose. Analysis by Western blotting reveals that IQGAP1 bindsHER2

  17. A Study of Gaps in Cyber Defense Automation

    DTIC Science & Technology

    2016-10-13

    converting all characters to lowercase. Next, the normalized file is tokenized using an n-length window. These n-tokens are then hashed into a Bloom filter...expensive and not readily available to most developers. However, with a complexity of O(N^2), where N is the number of files (or the number of...prioritized by statistics that measure the impact of each feature on the website's chances of becoming compromised, and the top N features are submitted to

  18. About my Child: measuring 'Complexity' in neurodisability. Evidence of reliability and validity.

    PubMed

    Ritzema, A M; Lach, L M; Rosenbaum, P; Nicholas, D

    2016-05-01

    About my Child, 26-item version (AMC-26) was developed as a measure of child health 'complexity' and has been proposed as a tool for understanding the functional needs of children and the priorities of families. The current study investigated the reliability and validity of AMC-26 with a sample of caregivers of children with neurodevelopmental disorders (NDD; n = 258) who completed AMC-26 as part of a larger study on parenting children with NDD. A subsample of children from the larger study (n = 49) were assessed using standardized measures of cognitive and adaptive functioning. Factor analysis revealed that a four-component model explained 51.12% of the variance. Cronbach's alpha was calculated for each of the four factors and for the scale as a whole, and ranged from 0.75 to 0.85, suggesting a high level of internal consistency. Construct validity was tested through comparisons with the results of standardized measures of child functioning. Predicted relationships for factors one, two and three were statistically significant and in the expected directions. Predictions for factor four were partially supported. AMC-26 was also expected to serve as an indicator of caregiver distress. Drawing on a sample of caregivers from the larger study (n = 251), the model was found to be significant and explained 23% of the variance in caregiver depressive symptoms (R² = .053, F(1, 249) = 14.06, P < .001). Based on these observations, the authors contend that AMC-26 may be used by clinicians and researchers as a tool to capture child function and child health complexity. Such a measure may help elucidate the relationships between child complexity and family well-being. This is an important avenue for further investigation. © 2016 John Wiley & Sons Ltd.
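    Internal consistency of each factor was summarised with Cronbach's alpha; the standard formula is short enough to sketch directly. The synthetic six-item scores below merely stand in for AMC-26 responses.

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix.
    Standard formula; the data below are synthetic stand-ins for AMC-26 items."""
    items = np.asarray(items, float)
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

rng = np.random.default_rng(8)
base = rng.normal(size=(258, 1))
scores = base + 0.8 * rng.normal(size=(258, 6))   # six correlated items
print(f"alpha = {cronbach_alpha(scores):.2f}")
```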

  19. Floodplain complexity and surface metrics: influences of scale and geomorphology

    USGS Publications Warehouse

    Scown, Murray W.; Thoms, Martin C.; DeJager, Nathan R.

    2015-01-01

    Many studies of fluvial geomorphology and landscape ecology examine a single river or landscape, thus lack generality, making it difficult to develop a general understanding of the linkages between landscape patterns and larger-scale driving variables. We examined the spatial complexity of eight floodplain surfaces in widely different geographic settings and determined how patterns measured at different scales relate to different environmental drivers. Floodplain surface complexity is defined as having highly variable surface conditions that are also highly organised in space. These two components of floodplain surface complexity were measured across multiple sampling scales from LiDAR-derived DEMs. The surface character and variability of each floodplain were measured using four surface metrics; namely, standard deviation, skewness, coefficient of variation, and standard deviation of curvature from a series of moving window analyses ranging from 50 to 1000 m in radius. The spatial organisation of each floodplain surface was measured using spatial correlograms of the four surface metrics. Surface character, variability, and spatial organisation differed among the eight floodplains; and random, fragmented, highly patchy, and simple gradient spatial patterns were exhibited, depending upon the metric and window size. Differences in surface character and variability among the floodplains became statistically stronger with increasing sampling scale (window size), as did their associations with environmental variables. Sediment yield was consistently associated with differences in surface character and variability, as were flow discharge and variability at smaller sampling scales. Floodplain width was associated with differences in the spatial organization of surface conditions at smaller sampling scales, while valley slope was weakly associated with differences in spatial organisation at larger scales. A comparison of floodplain landscape patterns measured at different scales would improve our understanding of the role that different environmental variables play at different scales and in different geomorphic settings.

  20. Comparison of robotics, functional electrical stimulation, and motor learning methods for treatment of persistent upper extremity dysfunction after stroke: a randomized controlled trial.

    PubMed

    McCabe, Jessica; Monkiewicz, Michelle; Holcomb, John; Pundik, Svetlana; Daly, Janis J

    2015-06-01

    To compare response to upper-limb treatment using robotics plus motor learning (ML) versus functional electrical stimulation (FES) plus ML versus ML alone, according to a measure of complex functional everyday tasks for chronic, severely impaired stroke survivors. Single-blind, randomized trial. Medical center. Enrolled subjects (N=39) were >1 year post single stroke (attrition rate=10%; 35 completed the study). All groups received treatment 5d/wk for 5h/d (60 sessions), with unique treatment as follows: ML alone (n=11) (5h/d partial- and whole-task practice of complex functional tasks), robotics plus ML (n=12) (3.5h/d of ML and 1.5h/d of shoulder/elbow robotics), and FES plus ML (n=12) (3.5h/d of ML and 1.5h/d of FES wrist/hand coordination training). Primary measure: Arm Motor Ability Test (AMAT), with 13 complex functional tasks; secondary measure: upper-limb Fugl-Meyer coordination scale (FM). There was no significant difference found in treatment response across groups (AMAT: P≥.584; FM coordination: P≥.590). All 3 treatment groups demonstrated clinically and statistically significant improvement in response to treatment (AMAT and FM coordination: P≤.009). A group treatment paradigm of 1:3 (therapist/patient) ratio proved feasible for provision of the intensive treatment. No adverse effects. Severely impaired stroke survivors with persistent (>1y) upper-extremity dysfunction can make clinically and statistically significant gains in coordination and functional task performance in response to robotics plus ML, FES plus ML, and ML alone in an intensive and long-duration intervention; no group differences were found. Additional studies are warranted to determine the effectiveness of these methods in the clinical setting. Copyright © 2015 American Congress of Rehabilitation Medicine. Published by Elsevier Inc. All rights reserved.

  1. The new challenges of multiplex networks: Measures and models

    NASA Astrophysics Data System (ADS)

    Battiston, Federico; Nicosia, Vincenzo; Latora, Vito

    2017-02-01

    What do societies, the Internet, and the human brain have in common? They are all examples of complex relational systems, whose emerging behaviours are largely determined by the non-trivial networks of interactions among their constituents, namely individuals, computers, or neurons, rather than only by the properties of the units themselves. In the last two decades, network scientists have proposed models of increasing complexity to better understand real-world systems. Only recently we have realised that multiplexity, i.e. the coexistence of several types of interactions among the constituents of a complex system, is responsible for substantial qualitative and quantitative differences in the type and variety of behaviours that a complex system can exhibit. As a consequence, multilayer and multiplex networks have become a hot topic in complexity science. Here we provide an overview of some of the measures proposed so far to characterise the structure of multiplex networks, and a selection of models aiming at reproducing those structural properties and quantifying their statistical significance. Focusing on a subset of relevant topics, this brief review is a quite comprehensive introduction to the most basic tools for the analysis of multiplex networks observed in the real-world. The wide applicability of multiplex networks as a framework to model complex systems in different fields, from biology to social sciences, and the colloquial tone of the paper will make it an interesting read for researchers working on both theoretical and experimental analysis of networked systems.

  2. Random forests for classification in ecology

    USGS Publications Warehouse

    Cutler, D.R.; Edwards, T.C.; Beard, K.H.; Cutler, A.; Hess, K.T.; Gibson, J.; Lawler, J.J.

    2007-01-01

    Classification procedures are some of the most widely used statistical methods in ecology. Random forests (RF) is a new and powerful statistical classifier that is well established in other disciplines but is relatively unknown in ecology. Advantages of RF compared to other statistical classifiers include (1) very high classification accuracy; (2) a novel method of determining variable importance; (3) ability to model complex interactions among predictor variables; (4) flexibility to perform several types of statistical data analysis, including regression, classification, survival analysis, and unsupervised learning; and (5) an algorithm for imputing missing values. We compared the accuracies of RF and four other commonly used statistical classifiers using data on invasive plant species presence in Lava Beds National Monument, California, USA, rare lichen species presence in the Pacific Northwest, USA, and nest sites for cavity nesting birds in the Uinta Mountains, Utah, USA. We observed high classification accuracy in all applications as measured by cross-validation and, in the case of the lichen data, by independent test data, when comparing RF to other common classification methods. We also observed that the variables that RF identified as most important for classifying invasive plant species coincided with expectations based on the literature. © 2007 by the Ecological Society of America.
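
    A minimal sketch of the workflow the abstract describes (cross-validated classification accuracy plus variable importances from a random forest), using synthetic presence/absence data in place of the paper's field datasets; all parameter values here are arbitrary choices, not the authors' settings.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

# Synthetic stand-in for species presence/absence with environmental predictors.
X, y = make_classification(n_samples=500, n_features=8, n_informative=4,
                           random_state=0)

rf = RandomForestClassifier(n_estimators=500, oob_score=True, random_state=0)
print("10-fold CV accuracy:", cross_val_score(rf, X, y, cv=10).mean())

rf.fit(X, y)
print("out-of-bag accuracy:", rf.oob_score_)
print("variable importances:", np.round(rf.feature_importances_, 3))
```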

  3. Disequilibrium, complexity, the Schottky effect, and q-entropies, in paramagnetism

    NASA Astrophysics Data System (ADS)

    Pennini, F.; Plastino, A.

    2017-12-01

    We investigate connections between statistical quantifiers and paramagnetism. More concretely, we apply the notions of (i) disequilibrium and (ii) statistical complexity, to a paramagnetic system of non-coupled dipoles. Interesting insights are thereby obtained. In particular, we encounter a kind of criticality, not associated to the temperature but to the disequilibrium.
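
    For reference, one common form of the statistical complexity used in studies of this kind is the product of a normalized Shannon entropy and a disequilibrium term; the exact functional applied to the paramagnetic system in this paper may differ in detail.

```latex
% Normalized entropy H, disequilibrium D (distance from equiprobability),
% and the statistical complexity C as their product, for a discrete
% distribution P = {p_1, ..., p_N}.
\[
  H[P] = -\frac{1}{\ln N}\sum_{i=1}^{N} p_i \ln p_i, \qquad
  D[P] = \sum_{i=1}^{N}\left(p_i - \frac{1}{N}\right)^{2}, \qquad
  C[P] = H[P]\, D[P].
\]
```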

  4. Statistical Assessment of Variability of Terminal Restriction Fragment Length Polymorphism Analysis Applied to Complex Microbial Communities

    PubMed Central

    Rossi, Pierre; Gillet, François; Rohrbach, Emmanuelle; Diaby, Nouhou; Holliger, Christof

    2009-01-01

    The variability of terminal restriction fragment length polymorphism analysis applied to complex microbial communities was assessed statistically. Recent technological improvements were implemented in the successive steps of the procedure, resulting in a standardized procedure which provided a high level of reproducibility. PMID:19749066

  5. Wigner surmises and the two-dimensional homogeneous Poisson point process.

    PubMed

    Sakhr, Jamal; Nieminen, John M

    2006-04-01

    We derive a set of identities that relate the higher-order interpoint spacing statistics of the two-dimensional homogeneous Poisson point process to the Wigner surmises for the higher-order spacing distributions of eigenvalues from the three classical random matrix ensembles. We also report a remarkable identity that equates the second-nearest-neighbor spacing statistics of the points of the Poisson process and the nearest-neighbor spacing statistics of complex eigenvalues from Ginibre's ensemble of 2 x 2 complex non-Hermitian random matrices.
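
    For orientation, the simplest ingredients behind such identities are the nearest-neighbour distance density of a two-dimensional homogeneous Poisson process of intensity ρ and the GOE Wigner surmise for unit mean spacing; the paper's results concern higher-order generalizations of relations of this type.

```latex
% Nearest-neighbour distance density of a 2D homogeneous Poisson process
% (a Rayleigh distribution) and the GOE Wigner surmise, both of the generic
% form  a s exp(-b s^2).
\[
  P_{\mathrm{Poisson,2D}}(r) = 2\pi\rho\, r\, e^{-\pi\rho r^{2}}, \qquad
  P_{\mathrm{GOE}}(s) = \frac{\pi}{2}\, s\, e^{-\pi s^{2}/4}.
\]
```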

  6. The relationship between the INTERMED patient complexity instrument and Level of Care Utilisation System (LOCUS).

    PubMed

    Thurber, Steven; Wilson, Ann; Realmuto, George; Specker, Sheila

    2018-03-01

    To investigate the concurrent and criterion validity of two independently developed measurement instruments, INTERMED and LOCUS, designed to improve the treatment and clinical management of patients with complex symptom manifestations. Participants (N = 66) were selected from hospital records based on the complexity of presenting symptoms, with tripartite diagnoses across biological, psychiatric and addiction domains. Biopsychosocial information from hospital records was submitted to INTERMED and LOCUS grids. In addition, Global Assessment of Functioning (GAF) ratings were gathered for statistical analyses. The product moment correlation between INTERMED and LOCUS was 0.609 (p = .01). Inverse zero-order correlations for INTERMED and LOCUS total score and GAF were obtained. However, only the beta weight for LOCUS and GAF was significant. An exploratory principal components analysis further illuminated areas of convergence between the instruments. INTERMED and LOCUS demonstrated shared variance. INTERMED appeared more sensitive to complex medical conditions and severe physiological reactions, whereas LOCUS findings were more strongly related to psychiatric symptoms. Implications are discussed.

  7. Ganglion cell complex scan in the early prediction of glaucoma.

    PubMed

    Ganekal, S

    2012-01-01

    To compare the macular ganglion cell complex (GCC) with the peripapillary retinal nerve fiber layer (RNFL) thickness map in glaucoma suspects and patients. Forty participants (20 glaucoma suspects and 20 glaucoma patients) were enrolled. Macular GCC and RNFL thickness maps were performed in both eyes of each participant in the same visit. The sensitivity and specificity of a color code less than 5% (red or yellow) for glaucoma diagnosis were calculated. Standard Automated Perimetry was performed with the Octopus 3.1.1 Dynamic 24-2 program. The statistical analysis was performed with SPSS 10.1 (SPSS Inc., Chicago, IL, USA). Results were expressed as mean +/- standard deviation and a p value of 0.05 or less was considered significant. There was a statistically significant difference in average RNFL thickness (p=0.004), superior RNFL thickness (p=0.006), inferior RNFL thickness (p=0.0005) and average GCC (p=0.03) between the suspects and glaucoma patients. There was no difference in optic disc area (p=0.35) and vertical cup/disc ratio (p=0.234) in both groups. While 38% of eyes had an abnormal GCC and 13% had an abnormal RNFL thickness in the glaucoma suspect group, 98% had an abnormal GCC and 90% had an abnormal RNFL thickness in the glaucoma group. The ability to diagnose glaucoma with macular GCC thickness is comparable to that with peripapillary RNFL thickness. Macular GCC thickness measurements may be a good alternative or a complementary measurement to RNFL thickness assessment in the clinical evaluation of glaucoma. © NEPjOPH.

  8. Charge carrier effective mass and concentration derived from combination of Seebeck coefficient and Te 125 NMR measurements in complex tellurides

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Levin, E. M.

    Thermoelectric materials utilize the Seebeck effect to convert heat to electrical energy. The Seebeck coefficient (thermopower), S, depends on the free (mobile) carrier concentration, n, and effective mass, m*, as S ~ m*/n^(2/3). The carrier concentration in tellurides can be derived from 125Te nuclear magnetic resonance (NMR) spin-lattice relaxation measurements. The NMR spin-lattice relaxation rate, 1/T1, depends on both n and m* as 1/T1 ~ (m*)^(3/2)·n (within classical Maxwell-Boltzmann statistics) or as 1/T1 ~ (m*)^2·n^(2/3) (within quantum Fermi-Dirac statistics), which challenges the correct determination of the carrier concentration in some materials by NMR. Here it is shown that the combination of the Seebeck coefficient and 125Te NMR spin-lattice relaxation measurements in complex tellurides provides a unique opportunity to derive the carrier effective mass and then to calculate the carrier concentration. This approach was used to study Ag_xSb_xGe_(50-2x)Te_50, well-known GeTe-based high-efficiency tellurium-antimony-germanium-silver thermoelectric materials, where the replacement of Ge by [Ag+Sb] results in significant enhancement of the Seebeck coefficient. Thus, values of both m* and n derived using this combination show that the enhancement of thermopower can be attributed primarily to an increase of the carrier effective mass and partially to a decrease of the carrier concentration when the [Ag+Sb] content increases.
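
    Taking the two proportionalities quoted above at face value (classical Maxwell-Boltzmann case) and omitting prefactors, combining the two measurements eliminates the carrier concentration and isolates the effective mass, which is the essence of the approach:

```latex
% Combining S and 1/T1 (classical case); prefactors omitted.
\[
  S \propto \frac{m^{*}}{n^{2/3}}, \qquad
  \frac{1}{T_{1}} \propto (m^{*})^{3/2}\, n
  \;\;\Longrightarrow\;\;
  S^{3/2}\cdot\frac{1}{T_{1}} \propto (m^{*})^{3},
  \qquad
  m^{*} \propto S^{1/2}\, T_{1}^{-1/3}.
\]
```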

  9. Charge carrier effective mass and concentration derived from combination of Seebeck coefficient and Te 125 NMR measurements in complex tellurides

    DOE PAGES

    Levin, E. M.

    2016-06-27

    Thermoelectric materials utilize the Seebeck effect to convert heat to electrical energy. The Seebeck coefficient (thermopower), S, depends on the free (mobile) carrier concentration, n, and effective mass, m*, as S ~ m*/n^(2/3). The carrier concentration in tellurides can be derived from 125Te nuclear magnetic resonance (NMR) spin-lattice relaxation measurements. The NMR spin-lattice relaxation rate, 1/T1, depends on both n and m* as 1/T1 ~ (m*)^(3/2)·n (within classical Maxwell-Boltzmann statistics) or as 1/T1 ~ (m*)^2·n^(2/3) (within quantum Fermi-Dirac statistics), which challenges the correct determination of the carrier concentration in some materials by NMR. Here it is shown that the combination of the Seebeck coefficient and 125Te NMR spin-lattice relaxation measurements in complex tellurides provides a unique opportunity to derive the carrier effective mass and then to calculate the carrier concentration. This approach was used to study Ag_xSb_xGe_(50-2x)Te_50, well-known GeTe-based high-efficiency tellurium-antimony-germanium-silver thermoelectric materials, where the replacement of Ge by [Ag+Sb] results in significant enhancement of the Seebeck coefficient. Thus, values of both m* and n derived using this combination show that the enhancement of thermopower can be attributed primarily to an increase of the carrier effective mass and partially to a decrease of the carrier concentration when the [Ag+Sb] content increases.

  10. Refined generalized multiscale entropy analysis for physiological signals

    NASA Astrophysics Data System (ADS)

    Liu, Yunxiao; Lin, Youfang; Wang, Jing; Shang, Pengjian

    2018-01-01

    Multiscale entropy analysis has become a prevalent complexity measure and has been successfully applied in various fields. However, it only takes into account the information of mean values (first moment) in the coarse-graining procedure. Generalized multiscale entropy (MSEn), which uses higher moments to coarse-grain a time series, was therefore proposed, and MSEσ2 has been implemented. However, MSEσ2 may sometimes yield an imprecise estimation of entropy or undefined entropy, and it reduces the statistical reliability of sample entropy estimation as the scale factor increases. To address this, we developed a refined model, RMSEσ2, to improve MSEσ2. Simulations on both white noise and 1/f noise show that RMSEσ2 provides higher entropy reliability and reduces the occurrence of undefined entropy, making it especially suitable for short time series. We also discuss how outliers, data loss and other signal-processing issues affect RMSEσ2 analysis. We apply the proposed model to evaluate the complexity of heartbeat interval time series derived from healthy young and elderly subjects, patients with congestive heart failure and patients with atrial fibrillation, respectively, compared to several popular complexity metrics. The results demonstrate that RMSEσ2-measured complexity (a) decreases with aging and disease, and (b) gives significant discrimination between different physiological/pathological states, which may facilitate clinical application.
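
    A minimal sketch of the second-moment coarse-graining step used by MSEσ2-type analyses (each coarse-grained point is the variance of a non-overlapping window); the refined RMSEσ2 procedure of the paper modifies how the entropy estimation handles this and is not reproduced here.

```python
import numpy as np

def coarse_grain_variance(x, scale):
    """Variance (second-moment) coarse-graining: at scale tau, each
    coarse-grained point is the variance of a non-overlapping window
    of length tau from the original series.
    """
    x = np.asarray(x, dtype=float)
    n = len(x) // scale
    windows = x[:n * scale].reshape(n, scale)
    return windows.var(axis=1, ddof=1)

# Example: coarse-grain white noise at scale 5.
rng = np.random.default_rng(0)
noise = rng.standard_normal(1000)
print(coarse_grain_variance(noise, 5)[:10])
```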

  11. Improving a complex finite-difference ground water flow model through the use of an analytic element screening model

    USGS Publications Warehouse

    Hunt, R.J.; Anderson, M.P.; Kelson, V.A.

    1998-01-01

    This paper demonstrates that analytic element models have potential as powerful screening tools that can facilitate or improve calibration of more complicated finite-difference and finite-element models. We demonstrate how a two-dimensional analytic element model was used to identify errors in a complex three-dimensional finite-difference model caused by incorrect specification of boundary conditions. An improved finite-difference model was developed using boundary conditions developed from a far-field analytic element model. Calibration of a revised finite-difference model was achieved using fewer zones of hydraulic conductivity and lake bed conductance than the original finite-difference model. Calibration statistics were also improved in that simulated base-flows were much closer to measured values. The improved calibration is due mainly to improved specification of the boundary conditions made possible by first solving the far-field problem with an analytic element model.

  12. Some new results on the statistics of radio wave scintillation. I - Empirical evidence for Gaussian statistics

    NASA Technical Reports Server (NTRS)

    Rino, C. L.; Livingston, R. C.; Whitney, H. E.

    1976-01-01

    This paper presents an analysis of ionospheric scintillation data which shows that the underlying statistical structure of the signal can be accurately modeled by the additive complex Gaussian perturbation predicted by the Born approximation in conjunction with an application of the central limit theorem. By making use of this fact, it is possible to estimate the in-phase, phase quadrature, and cophased scattered power by curve fitting to measured intensity histograms. By using this procedure, it is found that typically more than 80% of the scattered power is in phase quadrature with the undeviated signal component. Thus, the signal is modeled by a Gaussian, but highly non-Rician process. From simultaneous UHF and VHF data, only a weak dependence of this statistical structure on changes in the Fresnel radius is deduced. The signal variance is found to have a nonquadratic wavelength dependence. It is hypothesized that this latter effect is a subtle manifestation of locally homogeneous irregularity structures, a mathematical model proposed by Kolmogorov (1941) in his early studies of incompressible fluid turbulence.

  13. SAP- FORTRAN STATIC SOURCE CODE ANALYZER PROGRAM (IBM VERSION)

    NASA Technical Reports Server (NTRS)

    Manteufel, R.

    1994-01-01

    The FORTRAN Static Source Code Analyzer program, SAP, was developed to automatically gather statistics on the occurrences of statements and structures within a FORTRAN program and to provide for the reporting of those statistics. Provisions have been made for weighting each statistic and to provide an overall figure of complexity. Statistics, as well as figures of complexity, are gathered on a module by module basis. Overall summed statistics are also accumulated for the complete input source file. SAP accepts as input syntactically correct FORTRAN source code written in the FORTRAN 77 standard language. In addition, code written using features in the following languages is also accepted: VAX-11 FORTRAN, IBM S/360 FORTRAN IV Level H Extended; and Structured FORTRAN. The SAP program utilizes two external files in its analysis procedure. A keyword file allows flexibility in classifying statements and in marking a statement as either executable or non-executable. A statistical weight file allows the user to assign weights to all output statistics, thus allowing the user flexibility in defining the figure of complexity. The SAP program is written in FORTRAN IV for batch execution and has been implemented on a DEC VAX series computer under VMS and on an IBM 370 series computer under MVS. The SAP program was developed in 1978 and last updated in 1985.
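
    A toy sketch of the weighted figure-of-complexity idea described above: count statement keywords in a source file and combine the counts with user-assigned weights. The keyword list and weights below are placeholders, not SAP's actual keyword or weight files.

```python
# Minimal stand-in for a keyword-and-weight driven complexity figure.
from collections import Counter

WEIGHTS = {"IF": 2.0, "DO": 2.0, "GOTO": 4.0, "CALL": 1.5, "ASSIGN": 1.0}

def figure_of_complexity(source_lines):
    """Count weighted statement keywords and return (counts, weighted score)."""
    counts = Counter()
    for line in source_lines:
        words = line.strip().split()
        first_word = words[0].upper() if words else ""
        if first_word in WEIGHTS:
            counts[first_word] += 1
    score = sum(WEIGHTS[k] * n for k, n in counts.items())
    return counts, score

demo = ["      DO 10 I = 1, N",
        "      IF (A(I) .GT. 0) CALL SUB(A(I))",
        "   10 CONTINUE",
        "      GOTO 20"]
print(figure_of_complexity(demo))
```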

  14. SAP- FORTRAN STATIC SOURCE CODE ANALYZER PROGRAM (DEC VAX VERSION)

    NASA Technical Reports Server (NTRS)

    Merwarth, P. D.

    1994-01-01

    The FORTRAN Static Source Code Analyzer program, SAP, was developed to automatically gather statistics on the occurrences of statements and structures within a FORTRAN program and to provide for the reporting of those statistics. Provisions have been made for weighting each statistic and to provide an overall figure of complexity. Statistics, as well as figures of complexity, are gathered on a module by module basis. Overall summed statistics are also accumulated for the complete input source file. SAP accepts as input syntactically correct FORTRAN source code written in the FORTRAN 77 standard language. In addition, code written using features in the following languages is also accepted: VAX-11 FORTRAN, IBM S/360 FORTRAN IV Level H Extended; and Structured FORTRAN. The SAP program utilizes two external files in its analysis procedure. A keyword file allows flexibility in classifying statements and in marking a statement as either executable or non-executable. A statistical weight file allows the user to assign weights to all output statistics, thus allowing the user flexibility in defining the figure of complexity. The SAP program is written in FORTRAN IV for batch execution and has been implemented on a DEC VAX series computer under VMS and on an IBM 370 series computer under MVS. The SAP program was developed in 1978 and last updated in 1985.

  15. Measuring the intangibles: a metrics for the economic complexity of countries and products.

    PubMed

    Cristelli, Matthieu; Gabrielli, Andrea; Tacchella, Andrea; Caldarelli, Guido; Pietronero, Luciano

    2013-01-01

    We investigate a recent methodology we have proposed to extract valuable information on the competitiveness of countries and complexity of products from trade data. Standard economic theories predict a high level of specialization of countries in specific industrial sectors. However, a direct analysis of the official databases of exported products by all countries shows that the actual situation is very different. Countries commonly considered as developed ones are extremely diversified, exporting a large variety of products from very simple to very complex. At the same time countries generally considered as less developed export only the products also exported by the majority of countries. This situation calls for the introduction of a non-monetary and non-income-based measure for country economy complexity which uncovers the hidden potential for development and growth. The statistical approach we present here consists of coupled non-linear maps relating the competitiveness/fitness of countries to the complexity of their products. The fixed point of this transformation defines a metrics for the fitness of countries and the complexity of products. We argue that the key point to properly extract the economic information is the non-linearity of the map which is necessary to bound the complexity of products by the fitness of the less competitive countries exporting them. We present a detailed comparison of the results of this approach directly with those of the Method of Reflections by Hidalgo and Hausmann, showing the better performance of our method and a more solid economic, scientific and consistent foundation.
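
    A sketch of the kind of coupled non-linear map the abstract describes, iterated to its fixed point on a toy binary country-product export matrix; the per-step normalization and the illustrative matrix are choices made here, not taken from the paper.

```python
import numpy as np

def fitness_complexity(M, n_iter=50):
    """Coupled non-linear map of the type described above.

    M: binary country-by-product matrix (1 if country c exports product p).
    A country's fitness sums the complexity of its products, while a product's
    complexity is bounded by the fitness of the *least* fit countries that
    export it (hence the harmonic-like, non-linear form).
    """
    n_c, n_p = M.shape
    F, Q = np.ones(n_c), np.ones(n_p)
    for _ in range(n_iter):
        F_new = M @ Q
        Q_new = 1.0 / (M.T @ (1.0 / F))
        F = F_new / F_new.mean()          # normalize at each step
        Q = Q_new / Q_new.mean()
    return F, Q

# Toy example: 3 countries x 4 products.
M = np.array([[1, 1, 1, 1],   # diversified country
              [0, 1, 1, 0],
              [0, 0, 1, 0]])  # exports only the most ubiquitous product
F, Q = fitness_complexity(M)
print("country fitness:", np.round(F, 3))
print("product complexity:", np.round(Q, 3))
```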

  16. Measuring the Intangibles: A Metrics for the Economic Complexity of Countries and Products

    PubMed Central

    Cristelli, Matthieu; Gabrielli, Andrea; Tacchella, Andrea; Caldarelli, Guido; Pietronero, Luciano

    2013-01-01

    We investigate a recent methodology we have proposed to extract valuable information on the competitiveness of countries and complexity of products from trade data. Standard economic theories predict a high level of specialization of countries in specific industrial sectors. However, a direct analysis of the official databases of exported products by all countries shows that the actual situation is very different. Countries commonly considered as developed ones are extremely diversified, exporting a large variety of products from very simple to very complex. At the same time countries generally considered as less developed export only the products also exported by the majority of countries. This situation calls for the introduction of a non-monetary and non-income-based measure for country economy complexity which uncovers the hidden potential for development and growth. The statistical approach we present here consists of coupled non-linear maps relating the competitiveness/fitness of countries to the complexity of their products. The fixed point of this transformation defines a metrics for the fitness of countries and the complexity of products. We argue that the key point to properly extract the economic information is the non-linearity of the map which is necessary to bound the complexity of products by the fitness of the less competitive countries exporting them. We present a detailed comparison of the results of this approach directly with those of the Method of Reflections by Hidalgo and Hausmann, showing the better performance of our method and a more solid economic, scientific and consistent foundation. PMID:23940633

  17. Estimation of usual occasion-based individual drinking patterns using diary survey data.

    PubMed

    Hill-McManus, Daniel; Angus, Colin; Meng, Yang; Holmes, John; Brennan, Alan; Sylvia Meier, Petra

    2014-01-01

    In order to successfully address excessive alcohol consumption it is essential to have a means of measuring the drinking patterns of a nation. Owing to the multi-dimensional nature of drinking patterns, usual survey methods have their limitations. The aim of this study was to make use of extremely detailed diary survey data to demonstrate a method of combining different survey measures of drinking in order to reduce these limitations. Data for 1724 respondents of the 2000/01 National Diet and Nutrition Survey was used to obtain a drinking occasion dataset, by plotting the respondent's blood alcohol content over time. Drinking frequency, level and variation measures were chosen to characterise drinking behaviour and usual behaviour was estimated via statistical methods. Complex patterns in drinking behaviour were observed amongst population subgroups using the chosen consumption measures. The predicted drinking distribution combines diary data equivalent coverage with a more accurate proportion of non-drinkers. This statistical analysis provides a means of obtaining average consumption measures from diary data and thus reducing the main limitation of this type of data for many applications. We hope that this will facilitate the use of such data in a wide range of applications such as risk modelling, especially for acute harms, and burden of disease studies. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  18. Nanosecond to submillisecond dynamics in dye-labeled single-stranded DNA, as revealed by ensemble measurements and photon statistics at single-molecule level.

    PubMed

    Kaji, Takahiro; Ito, Syoji; Iwai, Shigenori; Miyasaka, Hiroshi

    2009-10-22

    Single-molecule and ensemble time-resolved fluorescence measurements were applied to investigate the conformational dynamics of single-stranded DNA, ssDNA, connected with a fluorescein dye by a C6 linker, where the motions of both the DNA and the C6 linker affect the geometry of the system. From the ensemble measurement of the fluorescence quenching via photoinduced electron transfer with a guanine base in the DNA sequence, three main conformations were found in aqueous solution: a conformation unaffected by the guanine base within the excited-state lifetime of fluorescein, a conformation in which the fluorescence is dynamically quenched within the excited-state lifetime, and a conformation leading to rapid quenching via a nonfluorescent complex. Analysis of the interphoton time distribution histograms and FCS autocorrelations from the single-molecule measurements, using the parameters acquired from the ensemble measurements, revealed that interconversion among these three conformations took place with two characteristic time constants of several hundreds of nanoseconds and tens of microseconds. The advantage of combining ensemble measurements with single-molecule detection for such complex dynamic motions is discussed by integrating the experimental results with those obtained by molecular dynamics simulation.

  19. Physics-based statistical model and simulation method of RF propagation in urban environments

    DOEpatents

    Pao, Hsueh-Yuan; Dvorak, Steven L.

    2010-09-14

    A physics-based statistical model and simulation/modeling method and system of electromagnetic wave propagation (wireless communication) in urban environments. In particular, the model is a computationally efficient close-formed parametric model of RF propagation in an urban environment which is extracted from a physics-based statistical wireless channel simulation method and system. The simulation divides the complex urban environment into a network of interconnected urban canyon waveguides which can be analyzed individually; calculates spectral coefficients of modal fields in the waveguides excited by the propagation using a database of statistical impedance boundary conditions which incorporates the complexity of building walls in the propagation model; determines statistical parameters of the calculated modal fields; and determines a parametric propagation model based on the statistical parameters of the calculated modal fields from which predictions of communications capability may be made.

  20. Effect of two complex training protocols of back squats in blood indicators of muscular damage in military athletes

    PubMed Central

    Ojeda, Álvaro Huerta; Ríos, Luis Chirosa; Barrilao, Rafael Guisado; Ríos, Ignacio Chirosa; Serrano, Pablo Cáceres

    2016-01-01

    [Purpose] The aim of this study was to determine the variations in the blood muscular damage indicators post application of two complex training programs for back squats. [Subjects and Methods] Seven military athletes were the subjects of this study. The study had a quasi-experimental cross-over intra-subject design. Two complex training protocols were applied, and the variables to be measured were cortisol, metabolic creatine kinase, and total creatine kinase. For the statistical analysis, Student’s t-test was used. [Results] Twenty-four hours post effort, a significant decrease in cortisol level was shown for both protocols; however, the metabolic creatine kinase and total creatine kinase levels showed a significant increase. [Conclusion] Both protocols lowered the indicator of main muscular damage in the blood supply (cortisol). This proved that the work weight did not generate significant muscular damage in the 24-hour post-exercise period. PMID:27313356

  1. Epistemic View of Quantum States and Communication Complexity of Quantum Channels

    NASA Astrophysics Data System (ADS)

    Montina, Alberto

    2012-09-01

    The communication complexity of a quantum channel is the minimal amount of classical communication required for classically simulating a process of state preparation, transmission through the channel and subsequent measurement. It establishes a limit on the power of quantum communication in terms of classical resources. We show that classical simulations employing a finite amount of communication can be derived from a special class of hidden variable theories where quantum states represent statistical knowledge about the classical state and not an element of reality. This special class has attracted strong interest very recently. The communication cost of each derived simulation is given by the mutual information between the quantum state and the classical state of the parent hidden variable theory. Finally, we find that the communication complexity for single qubits is smaller than 1.28 bits. The previous known upper bound was 1.85 bits.

  2. Effect of two complex training protocols of back squats in blood indicators of muscular damage in military athletes.

    PubMed

    Ojeda, Álvaro Huerta; Ríos, Luis Chirosa; Barrilao, Rafael Guisado; Ríos, Ignacio Chirosa; Serrano, Pablo Cáceres

    2016-05-01

    [Purpose] The aim of this study was to determine the variations in the blood muscular damage indicators post application of two complex training programs for back squats. [Subjects and Methods] Seven military athletes were the subjects of this study. The study had a quasi-experimental cross-over intra-subject design. Two complex training protocols were applied, and the variables to be measured were cortisol, metabolic creatine kinase, and total creatine kinase. For the statistical analysis, Student's t-test was used. [Results] Twenty-four hours post effort, a significant decrease in cortisol level was shown for both protocols; however, the metabolic creatine kinase and total creatine kinase levels showed a significant increase. [Conclusion] Both protocols lowered the indicator of main muscular damage in the blood supply (cortisol). This proved that the work weight did not generate significant muscular damage in the 24-hour post-exercise period.

  3. Canadian Health Measures Survey pre-test: design, methods, results.

    PubMed

    Tremblay, Mark; Langlois, Renée; Bryan, Shirley; Esliger, Dale; Patterson, Julienne

    2007-01-01

    The Canadian Health Measures Survey (CHMS) pre-test was conducted to provide information about the challenges and costs associated with administering a physical health measures survey in Canada. To achieve the specific objectives of the pre-test, protocols were developed and tested, and methods for household interviewing and clinic testing were designed and revised. The cost, logistics and suitability of using fixed sites for the CHMS were assessed. Although data collection, transfer and storage procedures are complex, the pre-test experience confirmed Statistics Canada's ability to conduct a direct health measures survey and the willingness of Canadians to participate in such a health survey. Many operational and logistical procedures worked well and, with minor modifications, are being employed in the main survey. Fixed sites were problematic, and survey costs were higher than expected.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cipolla, C.L.; Mayerhofer, M.

    The paper details the acquisition of detailed core and pressure data and the subsequent reservoir modeling in the Ozona Gas Field, Crockett County, Texas. The Canyon formation is the focus of the study and consists of complex turbidite sands characterized by numerous lenticular gas bearing members. The sands cannot be characterized using indirect measurements (logs) and no reliable porosity-permeability relationship could be developed. The reservoir simulation results illustrate the problems associated with interpreting typical pressure and production data in tight gas sands and details procedures to identify incremental reserves. Reservoir layering was represented by five model layers and layer permeabilities were estimated based on statistical distributions from core measurements.

  5. Design of a Ka-Band Propagation Terminal for Atmospheric Measurements in Polar Regions

    NASA Technical Reports Server (NTRS)

    Houts, Jacquelynne R.; Nessel, James A.; Zemba, Michael J.

    2016-01-01

    This paper describes the design and performance of a Ka-Band beacon receiver developed at NASA Glenn Research Center (GRC) that will be installed alongside an existing Ka-Band Radiometer [2] located at the east end of the Svalbard Near Earth Network (NEN) complex. The goal of this experiment is to characterize rain fade attenuation to improve the performance of existing statistical rain attenuation models. The ground terminal developed by NASA GRC utilizes an FFT-based frequency estimation [3] receiver capable of characterizing total path attenuation effects due to gaseous absorption, clouds, rain, and scintillation by directly measuring the propagated signal from the satellite Thor 7.

  6. Design of a Ka-band Propagation Terminal for Atmospheric Measurements in Polar Regions

    NASA Technical Reports Server (NTRS)

    Houts, Jacquelynne R.; Nessel, James A.; Zemba, Michael J.

    2016-01-01

    This paper describes the design and performance of a Ka-Band beacon receiver developed at NASA Glenn Research Center (GRC) that will be installed alongside an existing Ka-Band Radiometer located at the east end of the Svalbard Near Earth Network (NEN) complex. The goal of this experiment is to characterize rain fade attenuation to improve the performance of existing statistical rain attenuation models. The ground terminal developed by NASA GRC utilizes an FFT-based frequency estimation receiver capable of characterizing total path attenuation effects due to gaseous absorption, clouds, rain, and scintillation by directly measuring the propagated signal from the satellite Thor 7.

  7. Identifying protein complex by integrating characteristic of core-attachment into dynamic PPI network.

    PubMed

    Shen, Xianjun; Yi, Li; Jiang, Xingpeng; He, Tingting; Yang, Jincai; Xie, Wei; Hu, Po; Hu, Xiaohua

    2017-01-01

    How to identify protein complexes is an important and challenging task in proteomics. It would make a great contribution to our knowledge of molecular mechanisms in cell life activities. However, the inherent organization and dynamic characteristics of the cell system have rarely been incorporated into the existing algorithms for detecting protein complexes because of the limitation of protein-protein interaction (PPI) data produced by high throughput techniques. The availability of time course gene expression profiles enables us to uncover the dynamics of molecular networks and improve the detection of protein complexes. In order to achieve this goal, this paper proposes a novel algorithm DCA (Dynamic Core-Attachment). It detects a protein-complex core comprising continually expressed and highly connected proteins in the dynamic PPI network, and then the protein complex is formed by including the attachments with high adhesion into the core. The integration of the core-attachment feature into the dynamic PPI network is responsible for the superiority of our algorithm. DCA has been applied on two different yeast dynamic PPI networks and the experimental results show that it performs significantly better than the state-of-the-art techniques in terms of prediction accuracy, hF-measure and statistical significance in biology. In addition, the identified complexes with strong biological significance provide potential candidate complexes for biologists to validate.
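
    A much-simplified, static sketch of the core-attachment idea (dense cores plus neighbours with sufficient adhesion); the DCA algorithm additionally builds a dynamic PPI network from time-course expression data, which is not reproduced here, and the threshold values and toy graph are arbitrary.

```python
import networkx as nx

def core_attachment_complexes(G, min_core_size=3, adhesion=0.5):
    """Simplified core-attachment detection on a static PPI graph:
    treat maximal cliques as candidate cores, then attach any neighbour
    connected to at least `adhesion` of the core's members.  Overlapping
    cores may yield duplicate complexes in this toy version.
    """
    complexes = []
    for core in nx.find_cliques(G):
        if len(core) < min_core_size:
            continue
        core = set(core)
        attachments = {
            v for v in set(G) - core
            if sum(1 for u in core if G.has_edge(u, v)) >= adhesion * len(core)
        }
        complexes.append(core | attachments)
    return complexes

# Toy PPI network: dense core A-B-C, D attaches to it, E hangs off D.
G = nx.Graph([("A", "B"), ("A", "C"), ("B", "C"),
              ("C", "D"), ("B", "D"),
              ("D", "E")])
print(core_attachment_complexes(G))
```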

  8. AAS and spectrophotometric methods for the determination metoprolol tartrate in tablets

    NASA Astrophysics Data System (ADS)

    Alpdoğan, Güzin; Sungur, Sidika

    1999-11-01

    Sensitive and specific atomic absorption spectroscopy (AAS) and spectrophotometric methods have been developed for the determination of the beta-adrenergic blocking drug metoprolol tartrate. The method is based on the formation of a Cu(II) dithiocarbamate complex by derivatization of the secondary amino group of metoprolol with CS2 and CuCl2 in the presence of ammonia. The copper-bis(dithiocarbamate) complex was extracted into chloroform and the concentration of metoprolol tartrate was determined directly by spectrophotometric measurement and indirectly by AAS measurement of copper. The two methods developed were applied to the assay of metoprolol tartrate in commercial tablet formulations. The methods were compared statistically with each other and with the high performance liquid chromatography (HPLC) method of USP XXII using t- and F-tests.

  9. A statistical learning strategy for closed-loop control of fluid flows

    NASA Astrophysics Data System (ADS)

    Guéniat, Florimond; Mathelin, Lionel; Hussaini, M. Yousuff

    2016-12-01

    This work discusses a closed-loop control strategy for complex systems utilizing scarce and streaming data. A discrete embedding space is first built using hash functions applied to the sensor measurements, from which a Markov process model is derived, approximating the complex system's dynamics. A control strategy is then learned using reinforcement learning once rewards relevant to the control objective are identified. This method is designed for experimental configurations, requiring neither computations nor prior knowledge of the system, and enjoys intrinsic robustness. It is illustrated on two systems: the control of the transitions of a Lorenz'63 dynamical system, and the control of the drag of a cylinder flow. The method is shown to perform well.
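
    An illustrative sketch of the ingredients described above, discretizing (hashing) a sensor measurement into states and learning a policy by tabular Q-learning from a scalar reward; the toy dynamics and all constants are stand-ins, not the Lorenz or cylinder-flow configurations of the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
n_states, n_actions = 16, 3
Q = np.zeros((n_states, n_actions))
alpha, gamma, eps = 0.1, 0.95, 0.1

def discretize(measurement):
    """Bin (hash) a scalar sensor measurement into one of n_states states."""
    return int(np.clip((measurement + 2.0) / 4.0 * n_states, 0, n_states - 1))

def step(state, action):
    """Toy dynamics plus reward; the reward encodes the control objective."""
    drift = (action - 1) * 0.2 + rng.normal(scale=0.1)
    new_meas = (state / n_states * 4.0 - 2.0) + drift
    reward = -abs(new_meas)            # objective: keep the measurement near 0
    return discretize(new_meas), reward

state = discretize(rng.normal())
for _ in range(20000):
    action = rng.integers(n_actions) if rng.random() < eps else int(Q[state].argmax())
    nxt, r = step(state, action)
    Q[state, action] += alpha * (r + gamma * Q[nxt].max() - Q[state, action])
    state = nxt

print("greedy action per state:", Q.argmax(axis=1))
```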

  10. Towards a Phylogenetic Approach to the Composition of Species Complexes in the North and Central American Triatoma, Vectors of Chagas Disease

    PubMed Central

    de la Rúa, Nicholas M.; Bustamante, Dulce M.; Menes, Marianela; Stevens, Lori; Monroy, Carlota; Kilpatrick, William; Rizzo, Donna; Klotz, Stephen A.; Schmidt, Justin; Axen, Heather J.; Dorn, Patricia L.

    2014-01-01

    Phylogenetic relationships of insect vectors of parasitic diseases are important for understanding the evolution of epidemiologically relevant traits, and may be useful in vector control. The subfamily Triatominae (Hemiptera:Reduviidae) includes ~140 extant species arranged in five tribes comprised of 15 genera. The genus Triatoma is the most species-rich and contains important vectors of Trypanosoma cruzi, the causative agent of Chagas disease. Triatoma species were grouped into complexes originally by morphology and more recently with the addition of information from molecular phylogenetics (the four-complex hypothesis); however, without a strict adherence to monophyly. To date, the validity of proposed species complexes has not been tested by statistical tests of topology. The goal of this study was to clarify the systematics of 19 Triatoma species from North and Central America. We inferred their evolutionary relatedness using two independent data sets: the complete nuclear Internal Transcribed Spacer-2 ribosomal DNA (ITS-2 rDNA) and head morphometrics. In addition, we used the Shimodaira-Hasegawa statistical test of topology to assess the fit of the data to a set of competing systematic hypotheses (topologies). An unconstrained topology inferred from the ITS-2 data was compared to topologies constrained based on the four-complex hypothesis or one inferred from our morphometry results. The unconstrained topology represents a statistically significant better fit of the molecular data than either the four-complex or the morphometric topology. We propose an update to the composition of species complexes in the North and Central American Triatoma, based on a phylogeny inferred from ITS-2 as a first step towards updating the phylogeny of the complexes based on monophyly and statistical tests of topologies. PMID:24681261

  11. Fish: A New Computer Program for Friendly Introductory Statistics Help

    ERIC Educational Resources Information Center

    Brooks, Gordon P.; Raffle, Holly

    2005-01-01

    All introductory statistics students must master certain basic descriptive statistics, including means, standard deviations and correlations. Students must also gain insight into such complex concepts as the central limit theorem and standard error. This article introduces and describes the Friendly Introductory Statistics Help (FISH) computer…

  12. Computer Science and Statistics. Proceedings of the Symposium on the Interface (18th) Held on March 19-21, 1986 in Fort Collins, Colorado.

    DTIC Science & Technology

    1987-08-26

    Expert systems research would benefit if it could attract statisticians to assist in research projects; examples include the Acute Renal Failure [15] system and the INTERNIST-1 [22] system for diagnosis. A section on explaining complex reasoning discusses an entropy-based discrimination measure used in the MEDAS and Acute Renal Failure systems.

  13. Sensor Analytics: Radioactive gas Concentration Estimation and Error Propagation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, Dale N.; Fagan, Deborah K.; Suarez, Reynold

    2007-04-15

    This paper develops the mathematical statistics of a radioactive gas quantity measurement and associated error propagation. The probabilistic development is a different approach to deriving attenuation equations and offers easy extensions to more complex gas analysis components through simulation. The mathematical development assumes a sequential process of three components; I) the collection of an environmental sample, II) component gas extraction from the sample through the application of gas separation chemistry, and III) the estimation of radioactivity of component gases.
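
    A hedged illustration of simulation-based error propagation through a three-stage chain of the kind described (sample collection, gas separation, activity counting); the efficiencies, uncertainties and count rates below are made-up placeholders, not values from the report.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

volume    = rng.normal(1.00, 0.02, n)   # sampled air volume (m^3), placeholder
sep_eff   = rng.normal(0.85, 0.03, n)   # gas-separation chemistry yield, placeholder
count_eff = rng.normal(0.30, 0.01, n)   # detector counting efficiency, placeholder
counts    = rng.poisson(1200, n)        # observed counts (Poisson counting noise)

# Back-calculated activity concentration (arbitrary units per m^3).
activity = counts / (count_eff * sep_eff * volume)

print(f"mean = {activity.mean():.1f}, std = {activity.std():.1f}")
```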

  14. Advanced Artificial Intelligence Technology Testbed

    NASA Technical Reports Server (NTRS)

    Anken, Craig S.

    1993-01-01

    The Advanced Artificial Intelligence Technology Testbed (AAITT) is a laboratory testbed for the design, analysis, integration, evaluation, and exercising of large-scale, complex, software systems, composed of both knowledge-based and conventional components. The AAITT assists its users in the following ways: configuring various problem-solving application suites; observing and measuring the behavior of these applications and the interactions between their constituent modules; gathering and analyzing statistics about the occurrence of key events; and flexibly and quickly altering the interaction of modules within the applications for further study.

  15. Is Statistical Learning Constrained by Lower Level Perceptual Organization?

    PubMed Central

    Emberson, Lauren L.; Liu, Ran; Zevin, Jason D.

    2013-01-01

    In order for statistical information to aid in complex developmental processes such as language acquisition, learning from higher-order statistics (e.g. across successive syllables in a speech stream to support segmentation) must be possible while perceptual abilities (e.g. speech categorization) are still developing. The current study examines how perceptual organization interacts with statistical learning. Adult participants were presented with multiple exemplars from novel, complex sound categories designed to reflect some of the spectral complexity and variability of speech. These categories were organized into sequential pairs and presented such that higher-order statistics, defined based on sound categories, could support stream segmentation. Perceptual similarity judgments and multi-dimensional scaling revealed that participants only perceived three perceptual clusters of sounds and thus did not distinguish the four experimenter-defined categories, creating a tension between lower level perceptual organization and higher-order statistical information. We examined whether the resulting pattern of learning is more consistent with statistical learning being “bottom-up,” constrained by the lower levels of organization, or “top-down,” such that higher-order statistical information of the stimulus stream takes priority over the perceptual organization, and perhaps influences perceptual organization. We consistently find evidence that learning is constrained by perceptual organization. Moreover, participants generalize their learning to novel sounds that occupy a similar perceptual space, suggesting that statistical learning occurs based on regions of or clusters in perceptual space. Overall, these results reveal a constraint on learning of sound sequences, such that statistical information is determined based on lower level organization. These findings have important implications for the role of statistical learning in language acquisition. PMID:23618755

  16. Introduction to the Special Series: Current Directions for Measuring Parenting Constructs to Inform Prevention Science.

    PubMed

    Lindhiem, Oliver; Shaffer, Anne

    2017-04-01

    Parenting behaviors are multifaceted and dynamic and therefore challenging to quantify. Measurement methods have critical implications for study results, particularly for prevention trials designed to modify parenting behaviors. Although multiple approaches can complement one another and contribute to a more complete understanding of prevention trials, the assumptions and implications of each approach are not always clearly addressed. Greater attention to the measurement of complex constructs such as parenting is needed to advance the field of prevention science. This series examines the challenges of measuring changes in parenting behaviors in the context of prevention trials. All manuscripts in the special series address measurement issues and make practical recommendations for prevention researchers. Manuscripts in this special series include (1) empirical studies that demonstrate novel measurement approaches, (2) re-analyses of prevention trial outcome data directly comparing and contrasting two or more methods, and (3) a statistical primer and practical guide to analyzing proportion data.

  17. Using entropy measures to characterize human locomotion.

    PubMed

    Leverick, Graham; Szturm, Tony; Wu, Christine Q

    2014-12-01

    Entropy measures have been widely used to quantify the complexity of theoretical and experimental dynamical systems. In this paper, the value of using entropy measures to characterize human locomotion is demonstrated based on their construct validity, predictive validity in a simple model of human walking and convergent validity in an experimental study. Results show that four of the five considered entropy measures increase meaningfully with the increased probability of falling in a simple passive bipedal walker model. The same four entropy measures also experienced statistically significant increases in response to increasing age and gait impairment caused by cognitive interference in an experimental study. Of the considered entropy measures, the proposed quantized dynamical entropy (QDE) and quantization-based approximation of sample entropy (QASE) offered the best combination of sensitivity to changes in gait dynamics and computational efficiency. Based on these results, entropy appears to be a viable candidate for assessing the stability of human locomotion.
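
    For context, a compact implementation of standard sample entropy, one of the classical measures this line of work builds on; the paper's quantized dynamical entropy (QDE) and QASE variants are defined differently and are not reproduced here.

```python
import numpy as np

def sample_entropy(x, m=2, r_factor=0.2):
    """Sample entropy SampEn(m, r) of a 1-D series, with r = r_factor * std.
    Lower values indicate a more regular (less complex) signal.
    """
    x = np.asarray(x, dtype=float)
    r = r_factor * x.std()

    def count_matches(length):
        templates = np.array([x[i:i + length] for i in range(len(x) - length)])
        d = np.max(np.abs(templates[:, None, :] - templates[None, :, :]), axis=2)
        return (d <= r).sum() - len(templates)   # exclude self-matches

    B, A = count_matches(m), count_matches(m + 1)
    return -np.log(A / B) if A > 0 and B > 0 else np.inf

rng = np.random.default_rng(0)
print("white noise:", round(sample_entropy(rng.standard_normal(400)), 3))
print("sine wave  :", round(sample_entropy(np.sin(np.linspace(0, 20 * np.pi, 400))), 3))
```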

  18. Information properties of morphologically complex words modulate brain activity during word reading

    PubMed Central

    Hultén, Annika; Lehtonen, Minna; Lagus, Krista; Salmelin, Riitta

    2018-01-01

    Neuroimaging studies of the reading process point to functionally distinct stages in word recognition. Yet, current understanding of the operations linked to those various stages is mainly descriptive in nature. Approaches developed in the field of computational linguistics may offer a more quantitative approach for understanding brain dynamics. Our aim was to evaluate whether a statistical model of morphology, with well‐defined computational principles, can capture the neural dynamics of reading, using the concept of surprisal from information theory as the common measure. The Morfessor model, created for unsupervised discovery of morphemes, is based on the minimum description length principle and attempts to find optimal units of representation for complex words. In a word recognition task, we correlated brain responses to word surprisal values derived from Morfessor and from other psycholinguistic variables that have been linked with various levels of linguistic abstraction. The magnetoencephalography data analysis focused on spatially, temporally and functionally distinct components of cortical activation observed in reading tasks. The early occipital and occipito‐temporal responses were correlated with parameters relating to visual complexity and orthographic properties, whereas the later bilateral superior temporal activation was correlated with whole‐word based and morphological models. The results show that the word processing costs estimated by the statistical Morfessor model are relevant for brain dynamics of reading during late processing stages. PMID:29524274

  19. The statistical mechanics of complex signaling networks: nerve growth factor signaling

    NASA Astrophysics Data System (ADS)

    Brown, K. S.; Hill, C. C.; Calero, G. A.; Myers, C. R.; Lee, K. H.; Sethna, J. P.; Cerione, R. A.

    2004-10-01

    The inherent complexity of cellular signaling networks and their importance to a wide range of cellular functions necessitates the development of modeling methods that can be applied toward making predictions and highlighting the appropriate experiments to test our understanding of how these systems are designed and function. We use methods of statistical mechanics to extract useful predictions for complex cellular signaling networks. A key difficulty with signaling models is that, while significant effort is being made to experimentally measure the rate constants for individual steps in these networks, many of the parameters required to describe their behavior remain unknown or at best represent estimates. To establish the usefulness of our approach, we have applied our methods toward modeling the nerve growth factor (NGF)-induced differentiation of neuronal cells. In particular, we study the actions of NGF and mitogenic epidermal growth factor (EGF) in rat pheochromocytoma (PC12) cells. Through a network of intermediate signaling proteins, each of these growth factors stimulates extracellular regulated kinase (Erk) phosphorylation with distinct dynamical profiles. Using our modeling approach, we are able to predict the influence of specific signaling modules in determining the integrated cellular response to the two growth factors. Our methods also raise some interesting insights into the design and possible evolution of cellular systems, highlighting an inherent property of these systems that we call 'sloppiness.'

  20. Information properties of morphologically complex words modulate brain activity during word reading.

    PubMed

    Hakala, Tero; Hultén, Annika; Lehtonen, Minna; Lagus, Krista; Salmelin, Riitta

    2018-06-01

    Neuroimaging studies of the reading process point to functionally distinct stages in word recognition. Yet, current understanding of the operations linked to those various stages is mainly descriptive in nature. Approaches developed in the field of computational linguistics may offer a more quantitative approach for understanding brain dynamics. Our aim was to evaluate whether a statistical model of morphology, with well-defined computational principles, can capture the neural dynamics of reading, using the concept of surprisal from information theory as the common measure. The Morfessor model, created for unsupervised discovery of morphemes, is based on the minimum description length principle and attempts to find optimal units of representation for complex words. In a word recognition task, we correlated brain responses to word surprisal values derived from Morfessor and from other psycholinguistic variables that have been linked with various levels of linguistic abstraction. The magnetoencephalography data analysis focused on spatially, temporally and functionally distinct components of cortical activation observed in reading tasks. The early occipital and occipito-temporal responses were correlated with parameters relating to visual complexity and orthographic properties, whereas the later bilateral superior temporal activation was correlated with whole-word based and morphological models. The results show that the word processing costs estimated by the statistical Morfessor model are relevant for brain dynamics of reading during late processing stages. © 2018 The Authors Human Brain Mapping Published by Wiley Periodicals, Inc.

  1. Implementation of cross correlation for energy discrimination on the time-of-flight spectrometer CORELLI.

    PubMed

    Ye, Feng; Liu, Yaohua; Whitfield, Ross; Osborn, Ray; Rosenkranz, Stephan

    2018-04-01

    The CORELLI instrument at Oak Ridge National Laboratory is a statistical chopper spectrometer designed and optimized to probe complex disorder in crystalline materials through diffuse scattering experiments. On CORELLI, the high efficiency of white-beam Laue diffraction combined with elastic discrimination has enabled an unprecedented data collection rate to obtain both the total and the elastic-only scattering over a large volume of reciprocal space from a single measurement. To achieve this, CORELLI is equipped with a statistical chopper to modulate the incoming neutron beam quasi-randomly, and then the cross-correlation method is applied to reconstruct the elastic component from the scattering data. Details of the implementation of the cross-correlation method on CORELLI are given and its performance is discussed.
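
    A toy numerical illustration of the cross-correlation reconstruction used with a statistical (pseudo-random) chopper: the detector signal is modelled as the circular convolution of an instrument response with the chopper's open/closed sequence, and cross-correlating with that sequence recovers the response. The sequence length, peak position and noise level are arbitrary choices, not CORELLI parameters.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 255
chopper = rng.integers(0, 2, N).astype(float)   # quasi-random open/closed pattern

response = np.zeros(N)
response[40] = 10.0                             # "elastic" peak
response += 0.05                                # flat background

# Detector counts: circular convolution of the response with the chopper, plus noise.
measured = np.real(np.fft.ifft(np.fft.fft(response) * np.fft.fft(chopper)))
measured += rng.normal(scale=0.1, size=N)

# Cross-correlate with the mean-subtracted chopper sequence to recover the response.
c0 = chopper - chopper.mean()
recovered = np.real(np.fft.ifft(np.fft.fft(measured) * np.conj(np.fft.fft(c0))))
recovered /= (c0 ** 2).sum()

print("true peak position:", 40, " recovered:", int(recovered.argmax()))
```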

  2. Implementation of cross correlation for energy discrimination on the time-of-flight spectrometer CORELLI

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ye, Feng; Liu, Yaohua; Whitfield, Ross

    The CORELLI instrument at Oak Ridge National Laboratory is a statistical chopper spectrometer designed and optimized to probe complex disorder in crystalline materials through diffuse scattering experiments. On CORELLI, the high efficiency of white-beam Laue diffraction combined with elastic discrimination has enabled an unprecedented data collection rate to obtain both the total and the elastic-only scattering over a large volume of reciprocal space from a single measurement. To achieve this, CORELLI is equipped with a statistical chopper to modulate the incoming neutron beam quasi-randomly, and then the cross-correlation method is applied to reconstruct the elastic component from the scattering data. Lastly, details of the implementation of the cross-correlation method on CORELLI are given and its performance is discussed.

  3. Implementation of cross correlation for energy discrimination on the time-of-flight spectrometer CORELLI

    DOE PAGES

    Ye, Feng; Liu, Yaohua; Whitfield, Ross; ...

    2018-03-26

    The CORELLI instrument at Oak Ridge National Laboratory is a statistical chopper spectrometer designed and optimized to probe complex disorder in crystalline materials through diffuse scattering experiments. On CORELLI, the high efficiency of white-beam Laue diffraction combined with elastic discrimination has enabled an unprecedented data collection rate to obtain both the total and the elastic-only scattering over a large volume of reciprocal space from a single measurement. To achieve this, CORELLI is equipped with a statistical chopper to modulate the incoming neutron beam quasi-randomly, and then the cross-correlation method is applied to reconstruct the elastic component from the scattering data. Lastly, details of the implementation of the cross-correlation method on CORELLI are given and its performance is discussed.

  4. Statistics without Tears: Complex Statistics with Simple Arithmetic

    ERIC Educational Resources Information Center

    Smith, Brian

    2011-01-01

    One of the often overlooked aspects of modern statistics is the analysis of time series data. Modern introductory statistics courses tend to rush to probabilistic applications involving risk and confidence. Rarely does the first level course linger on such useful and fascinating topics as time series decomposition, with its practical applications…

  5. Subband Image Coding with Jointly Optimized Quantizers

    NASA Technical Reports Server (NTRS)

    Kossentini, Faouzi; Chung, Wilson C.; Smith, Mark J. T.

    1995-01-01

    An iterative design algorithm for the joint design of complexity- and entropy-constrained subband quantizers and associated entropy coders is proposed. Unlike conventional subband design algorithms, the proposed algorithm does not require the use of various bit allocation algorithms. Multistage residual quantizers are employed here because they provide greater control of the complexity-performance tradeoffs, and also because they allow efficient and effective high-order statistical modeling. The resulting subband coder exploits statistical dependencies within subbands, across subbands, and across stages, mainly through complexity-constrained high-order entropy coding. Experimental results demonstrate that the complexity-rate-distortion performance of the new subband coder is exceptional.

  6. Robust approaches to quantification of margin and uncertainty for sparse data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hund, Lauren; Schroeder, Benjamin B.; Rumsey, Kelin

    Characterizing the tails of probability distributions plays a key role in quantification of margins and uncertainties (QMU), where the goal is characterization of low probability, high consequence events based on continuous measures of performance. When data are collected using physical experimentation, probability distributions are typically fit using statistical methods based on the collected data, and these parametric distributional assumptions are often used to extrapolate about the extreme tail behavior of the underlying probability distribution. In this project, we characterize the risk associated with such tail extrapolation. Specifically, we conducted a scaling study to demonstrate the large magnitude of the risk; then, we developed new methods for communicating risk associated with tail extrapolation from unvalidated statistical models; lastly, we proposed a Bayesian data-integration framework to mitigate tail extrapolation risk through integrating additional information. We conclude that decision-making using QMU is a complex process that cannot be achieved using statistical analyses alone.
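
    The tail-extrapolation risk described above can be illustrated with a small Monte Carlo sketch; the distributions and sample size are hypothetical, not the project's. A normal distribution is fitted to a small sample drawn from a lognormal "truth" and its 99.9th percentile is extrapolated, showing how strongly the extreme-quantile estimate scatters and how biased it can be.

      import numpy as np

      rng = np.random.default_rng(1)

      # Hypothetical setting: the true performance measure is lognormal, but the analyst
      # fits a normal distribution to a small sample and extrapolates the 99.9th percentile.
      n_samples, n_trials = 20, 2000
      z999 = 3.0902                              # standard-normal 99.9th percentile
      true_q = np.exp(0.5 * z999)                # true 99.9th percentile of lognormal(mu=0, sigma=0.5)

      estimates = []
      for _ in range(n_trials):
          x = rng.lognormal(mean=0.0, sigma=0.5, size=n_samples)
          mu_hat, sd_hat = x.mean(), x.std(ddof=1)
          estimates.append(mu_hat + z999 * sd_hat)   # normal-theory extrapolation
      estimates = np.array(estimates)

      print(f"true 99.9th percentile          : {true_q:.2f}")
      print(f"median extrapolated estimate    : {np.median(estimates):.2f}")
      print(f"5th-95th percentile of estimates: "
            f"{np.percentile(estimates, 5):.2f} - {np.percentile(estimates, 95):.2f}")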

  7. Quantifying the Energy Landscape Statistics in Proteins - a Relaxation Mode Analysis

    NASA Astrophysics Data System (ADS)

    Cai, Zhikun; Zhang, Yang

    Energy landscape, the hypersurface in the configurational space, has been a useful concept in describing complex processes that occur over a very long time scale, such as the multistep slow relaxations of supercooled liquids and folding of polypeptide chains into structured proteins. Despite extensive simulation studies, its experimental characterization still remains a challenge. To address this challenge, we developed a relaxation mode analysis (RMA) for liquids under a framework analogous to the normal mode analysis for solids. Using RMA, important statistics of the activation barriers of the energy landscape becomes accessible from experimentally measurable two-point correlation functions, e.g. using quasi-elastic and inelastic scattering experiments. We observed a prominent coarsening effect of the energy landscape. The results were further confirmed by direct sampling of the energy landscape using a metadynamics-like adaptive autonomous basin climbing computation. We first demonstrate RMA in a supercooled liquid when dynamical cooperativity emerges in the landscape-influenced regime. Then we show this framework reveals encouraging energy landscape statistics when applied to proteins.

  8. On entropy, financial markets and minority games

    NASA Astrophysics Data System (ADS)

    Zapart, Christopher A.

    2009-04-01

    The paper builds upon an earlier statistical analysis of financial time series with Shannon information entropy, published in [L. Molgedey, W. Ebeling, Local order, entropy and predictability of financial time series, European Physical Journal B-Condensed Matter and Complex Systems 15/4 (2000) 733-737]. A novel generic procedure is proposed for making multistep-ahead predictions of time series by building a statistical model of entropy. The approach is first demonstrated on the chaotic Mackey-Glass time series and later applied to Japanese Yen/US dollar intraday currency data. The paper also reinterprets Minority Games [E. Moro, The minority game: An introductory guide, Advances in Condensed Matter and Statistical Physics (2004)] within the context of physical entropy, and uses models derived from minority game theory as a tool for measuring the entropy of a model in response to time series. This entropy conditional upon a model is subsequently used in place of information-theoretic entropy in the proposed multistep prediction algorithm.

  9. Diagnostic Statistics for the Assessment and Characterization of Complex Turbulent Flows

    NASA Technical Reports Server (NTRS)

    Ristorcelli, J. R.

    1995-01-01

    A simple parameterization scheme for a complex turbulent flow using nondimensional parameters coming from the Reynolds stress equations is given. Definitions and brief descriptions of the physical significance of several nondimensional parameters that are used to characterize turbulence from the viewpoint of single-point turbulence closures are given. These nondimensional parameters reflect measures of (1) the spectral band width of the turbulence; (2) deviations from the ideal Kolmogorov behavior; (3) the relative magnitude, orientation, and temporal duration of the deformation to which the turbulence is subjected; (4) one- and two-point measures of the large- and small-scale anisotropy of the turbulence; and (5) inhomogeneity. This is an attempt to create a more systematic methodology for the diagnosis and classification of turbulent flows, as well as for the development, validation, and application of turbulence model strategies. The parameters also serve to indicate the adequacy of various assumptions made in single-point turbulence models and to suggest the appropriate turbulence strategy for a particular complex flow. The compilation will be of interest to experimentalists and to those involved in computing turbulent flows or in verifying the adequacy of the phenomenological beliefs used in turbulence closures.

  10. Measuring pair-wise molecular interactions in a complex mixture

    NASA Astrophysics Data System (ADS)

    Chakraborty, Krishnendu; Varma, Manoj M.; Venkatapathi, Murugesan

    2016-03-01

    Complex biological samples such as serum contain thousands of proteins and other molecules spanning up to 13 orders of magnitude in concentration. Present measurement techniques do not permit the analysis of all pair-wise interactions between the components of such a complex mixture and a given target molecule. In this work we explore the use of nanoparticle tags that encode the identity of the molecule to obtain the statistical distribution of pair-wise interactions from their Localized Surface Plasmon Resonance (LSPR) signals. The nanoparticle tags are chosen such that the binding between two molecules conjugated to the respective nanoparticle tags can be recognized by the coupling of their LSPR signals. This approach is investigated by DDA numerical simulation of a reduced system consisting of three nanoparticles (a gold ellipsoid with aspect ratio 2.5 and short axis 16 nm, and two silver ellipsoids with aspect ratios 3 and 2 and short axes 8 nm and 10 nm, respectively) and the set of all possible dimers formed between them. Incident light was circularly polarized and all possible particle and dimer orientations were considered. We observed that the minimum peak separation between two spectra is 5 nm, while the maximum is 184 nm.

  11. Exploring the practicing-connections hypothesis: using gesture to support coordination of ideas in understanding a complex statistical concept.

    PubMed

    Son, Ji Y; Ramos, Priscilla; DeWolf, Melissa; Loftus, William; Stigler, James W

    2018-01-01

    In this article, we begin to lay out a framework and approach for studying how students come to understand complex concepts in rich domains. Grounded in theories of embodied cognition, we advance the view that understanding of complex concepts requires students to practice, over time, the coordination of multiple concepts, and the connection of this system of concepts to situations in the world. Specifically, we explore the role that a teacher's gesture might play in supporting students' coordination of two concepts central to understanding in the domain of statistics: mean and standard deviation. In Study 1 we show that university students who have just taken a statistics course nevertheless have difficulty taking both mean and standard deviation into account when thinking about a statistical scenario. In Study 2 we show that presenting the same scenario with an accompanying gesture to represent variation significantly impacts students' interpretation of the scenario. Finally, in Study 3 we present evidence that instructional videos on the internet fail to leverage gesture as a means of facilitating understanding of complex concepts. Taken together, these studies illustrate an approach to translating current theories of cognition into principles that can guide instructional design.

  12. Towards Cost-Effective Operational Monitoring Systems for Complex Waters: Analyzing Small-Scale Coastal Processes with Optical Transmissometry

    PubMed Central

    Gonçalves-Araujo, Rafael; Wiegmann, Sonja; Torrecilla, Elena; Bardaji, Raul; Röttgers, Rüdiger; Bracher, Astrid; Piera, Jaume

    2017-01-01

    The detection and prediction of changes in coastal ecosystems require a better understanding of the complex physical, chemical and biological interactions, which implies that observations must be performed continuously. For this reason, there is an increasing demand for small, simple and cost-effective in situ sensors to analyze complex coastal waters at a broad range of scales. In this context, this study seeks to explore the potential of beam attenuation spectra, c(λ), measured in situ with an advanced-technology optical transmissometer, for assessing temporal and spatial patterns in the complex estuarine waters of Alfacs Bay (NW Mediterranean) as a test site. In particular, the information contained in the spectral beam attenuation coefficient was assessed and linked with different biogeochemical variables. The attenuation at λ = 710 nm was used as a proxy for particle concentration, TSM, whereas a novel parameter was adopted as an optical indicator for chlorophyll a (Chl-a) concentration, based on the local maximum of c(λ) observed at the long-wavelength side of the red band Chl-a absorption peak. In addition, since coloured dissolved organic matter (CDOM) has an important influence on the beam attenuation spectral shape and complementary measurements of particle size distribution were available, the beam attenuation spectral slope was used to analyze the CDOM content. Results were successfully compared with optical and biogeochemical variables from laboratory analysis of collocated water samples, and statistically significant correlations were found between the attenuation proxies and the biogeochemical variables TSM, Chl-a and CDOM. This outcome highlighted the potential of high-frequency beam attenuation measurements as a simple, continuous and cost-effective approach for rapid detection of changes and patterns in biogeochemical properties in complex coastal environments. PMID:28107539

  13. Syntactic and Story Structure Complexity in the Narratives of High- and Low-Language Ability Children with Autism Spectrum Disorder

    PubMed Central

    Peristeri, Eleni; Andreou, Maria; Tsimpli, Ianthi M.

    2017-01-01

    Although language impairment is commonly associated with the autism spectrum disorder (ASD), the Diagnostic Statistical Manual no longer includes language impairment as a necessary component of an ASD diagnosis (American Psychiatric Association, 2013). However, children with ASD and no comorbid intellectual disability struggle with some aspects of language whose precise nature is still outstanding. Narratives have been extensively used as a tool to examine lexical and syntactic abilities, as well as pragmatic skills in children with ASD. This study contributes to this literature by investigating the narrative skills of 30 Greek-speaking children with ASD and normal non-verbal IQ, 16 with language skills in the upper end of the normal range (ASD-HL), and 14 in the lower end of the normal range (ASD-LL). The control group consisted of 15 age-matched typically-developing (TD) children. Narrative performance was measured in terms of both microstructural and macrostructural properties. Microstructural properties included lexical and syntactic measures of complexity such as subordinate vs. coordinate clauses and types of subordinate clauses. Macrostructure was measured in terms of the diversity in the use of internal state terms (ISTs) and story structure complexity, i.e., children's ability to produce important units of information that involve the setting, characters, events, and outcomes of the story, as well as the characters' thoughts and feelings. The findings demonstrate that high language ability and syntactic complexity pattern together in ASD children's narrative performance and that language ability compensates for autistic children's pragmatic deficit associated with the production of Theory of Mind-related ISTs. Nevertheless, both groups of children with ASD (high and low language ability) scored lower than the TD controls in the production of Theory of Mind-unrelated ISTs, modifier clauses and story structure complexity. PMID:29209258

  14. Syntactic and Story Structure Complexity in the Narratives of High- and Low-Language Ability Children with Autism Spectrum Disorder.

    PubMed

    Peristeri, Eleni; Andreou, Maria; Tsimpli, Ianthi M

    2017-01-01

    Although language impairment is commonly associated with the autism spectrum disorder (ASD), the Diagnostic Statistical Manual no longer includes language impairment as a necessary component of an ASD diagnosis (American Psychiatric Association, 2013). However, children with ASD and no comorbid intellectual disability struggle with some aspects of language whose precise nature is still outstanding. Narratives have been extensively used as a tool to examine lexical and syntactic abilities, as well as pragmatic skills in children with ASD. This study contributes to this literature by investigating the narrative skills of 30 Greek-speaking children with ASD and normal non-verbal IQ, 16 with language skills in the upper end of the normal range (ASD-HL), and 14 in the lower end of the normal range (ASD-LL). The control group consisted of 15 age-matched typically-developing (TD) children. Narrative performance was measured in terms of both microstructural and macrostructural properties. Microstructural properties included lexical and syntactic measures of complexity such as subordinate vs. coordinate clauses and types of subordinate clauses. Macrostructure was measured in terms of the diversity in the use of internal state terms (ISTs) and story structure complexity, i.e., children's ability to produce important units of information that involve the setting, characters, events, and outcomes of the story, as well as the characters' thoughts and feelings. The findings demonstrate that high language ability and syntactic complexity pattern together in ASD children's narrative performance and that language ability compensates for autistic children's pragmatic deficit associated with the production of Theory of Mind-related ISTs. Nevertheless, both groups of children with ASD (high and low language ability) scored lower than the TD controls in the production of Theory of Mind-unrelated ISTs, modifier clauses and story structure complexity.

  15. Fluctuating asymmetry in broiler chickens: a decision protocol for trait selection in seven measuring methods.

    PubMed

    Van Nuffel, A; Tuyttens, F A M; Van Dongen, S; Talloen, W; Van Poucke, E; Sonck, B; Lens, L

    2007-12-01

    Nonidentical development of bilateral traits due to disturbing genetic or developmental factors is called fluctuating asymmetry (FA) if such deviations are continuously distributed. Fluctuating asymmetry is believed to be a reliable indicator of the fitness and welfare of an animal. Despite an increasing body of research, the link between FA and animal performance or welfare is reported to be inconsistent, possibly, among other reasons, due to inaccurate measuring protocols or incorrect statistical analyses. This paper reviews problems of interpreting FA results in poultry and provides guidelines for the measurement and analysis of FA, applied to broilers. A wide range of morphological traits were measured by 7 different techniques (ranging from measurements on living broilers or intact carcasses to X-rays, bones, and digital images) and evaluated for their applicability to estimate FA. Following 4 selection criteria (significant FA, absence of directional asymmetry or antisymmetry, absence of between-trait correlation in signed FA values, and high signal-to-noise ratio), from 3 to 14 measurements per method were found suitable for estimating the degree of FA. The accuracy of FA estimates was positively related to the complexity and time investment of the measuring method. In addition, our study clearly shows the importance of securing adequate statistical power when designing FA studies. Repeatability analyses of FA estimates indicated the need for larger sample sizes, more repeated measurements, or both, than are commonly used in FA studies.
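
    Two of the selection criteria listed above (absence of directional asymmetry, and FA variation clearly exceeding measurement error, i.e. a high signal-to-noise ratio) can be sketched in Python as follows. This is a simplified stand-in for the mixed-model ANOVA usually used in FA studies, and the trait values, replicate structure and error magnitudes are invented for illustration.

      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(2)

      # Hypothetical bilateral trait (e.g. tarsus length, mm) in 40 broilers,
      # each side measured twice to estimate measurement error.
      n = 40
      left_true = rng.normal(70.0, 1.5, n)
      right_true = left_true + rng.normal(0.0, 0.25, n)      # fluctuating asymmetry
      meas_err = 0.05
      L = np.stack([left_true + rng.normal(0, meas_err, n) for _ in range(2)])
      R = np.stack([right_true + rng.normal(0, meas_err, n) for _ in range(2)])

      signed_fa = L.mean(axis=0) - R.mean(axis=0)            # signed L - R per bird

      # Criterion 1: no directional asymmetry (mean signed FA should not differ from 0).
      t, p_da = stats.ttest_1samp(signed_fa, 0.0)
      print(f"directional asymmetry: t = {t:.2f}, p = {p_da:.3f}")

      # Criterion 2: FA variation should exceed measurement error (signal-to-noise ratio).
      var_fa = signed_fa.var(ddof=1)
      # Error variance of the L-R difference of side means (each side averaged over 2 replicates).
      var_err = ((L[0] - L[1]).var(ddof=1) + (R[0] - R[1]).var(ddof=1)) / 4.0
      print(f"between-sides variance = {var_fa:.4f}, measurement-error variance = {var_err:.4f}")
      print(f"signal-to-noise ratio  = {var_fa / var_err:.1f}  (trait retained if clearly > 1)")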

  16. Constructing three emotion knowledge tests from the invariant measurement approach

    PubMed Central

    Prieto, Gerardo; Burin, Debora I.

    2017-01-01

    Background Psychological constructionist models like the Conceptual Act Theory (CAT) postulate that complex states such as emotions are composed of basic psychological ingredients that are more clearly respected by the brain than basic emotions. The objective of this study was the construction and initial validation of Emotion Knowledge measures within the CAT framework by means of an invariant measurement approach, the Rasch Model (RM). Psychological distance theory was used to inform item generation. Methods Three EK tests—emotion vocabulary (EV), close emotional situations (CES) and far emotional situations (FES)—were constructed and tested with the RM in a community sample of 100 females and 100 males (age range: 18–65), both separately and conjointly. Results It was corroborated that data-RM fit was sufficient. Then, the effect of type of test and emotion on Rasch-modelled item difficulty was tested. Significant effects of emotion on EK item difficulty were found, but the only statistically significant difference was that between “happiness” and the remaining emotions; neither the type of test nor the interaction effects on EK item difficulty were statistically significant. The testing of gender differences was carried out after corroborating that differential item functioning (DIF) would not be a plausible alternative hypothesis for the results. No statistically significant sex-related differences were found in EV, CES, FES, or total EK. However, the sign of d indicates that female participants were consistently better than male ones, a result that will be of interest for future meta-analyses. Discussion The three EK tests are ready to be used as components of a higher-level measurement process. PMID:28929013

  17. Network-Physics (NP) BEC DIGITAL(#)-VULNERABILITY; ``Q-Computing"=Simple-Arithmetic;Modular-Congruences=SignalXNoise PRODUCTS=Clock-model;BEC-Factorization;RANDOM-# Definition;P=/=NP TRIVIAL Proof!!!

    NASA Astrophysics Data System (ADS)

    Pi, E. I.; Siegel, E.

    2010-03-01

    Siegel[AMS Natl.Mtg.(2002)-Abs.973-60-124] digits logarithmic-law inversion to ONLY BEQS BEC:Quanta/Bosons=#: EMP-like SEVERE VULNERABILITY of ONLY #-networks(VS.ANALOG INvulnerability) via Barabasi NP(VS.dynamics[Not.AMS(5/2009)] critique);(so called)``quantum-computing''(QC) = simple-arithmetic (sans division);algorithmic complexities:INtractibility/UNdecidability/INefficiency/NONcomputability/HARDNESS(so MIScalled) ``noise''-induced-phase-transition(NIT)ACCELERATION:Cook-Levin theorem Reducibility = RG fixed-points; #-Randomness DEFINITION via WHAT? Query(VS. Goldreich[Not.AMS(2002)] How? mea culpa)= ONLY MBCS hot-plasma v #-clumping NON-random BEC; Modular-Arithmetic Congruences = Signal x Noise PRODUCTS = clock-model; NON-Shor[Physica A,341,586(04)]BEC logarithmic-law inversion factorization: Watkins #-theory U statistical-physics); P=/=NP C-S TRIVIAL Proof: Euclid!!! [(So Miscalled) computational-complexity J-O obviation(3 millennia AGO geometry: NO:CC,``CS'';``Feet of Clay!!!'']; Query WHAT?:Definition: (so MIScalled)``complexity''=UTTER-SIMPLICITY!! v COMPLICATEDNESS MEASURE(S).

  18. A new test method for the evaluation of total antioxidant activity of herbal products.

    PubMed

    Zaporozhets, Olga A; Krushynska, Olena A; Lipkovska, Natalia A; Barvinchenko, Valentina N

    2004-01-14

    A new test method for measuring the antioxidant power of herbal products, based on solid-phase spectrophotometry using tetrabenzo-[b,f,j,n][1,5,9,13]-tetraazacyclohexadecine-Cu(II) complex immobilized on silica gel, is proposed. The absorbance of the modified sorbent (lambda(max) = 712 nm) increases proportionally to the total antioxidant activity of the sample solution. The method represents an attractive alternative to the most commonly used radical scavenging capacity assays, because those generally require complex, long-lasting stages to be carried out. The proposed test method is simple (a "drop and measure" procedure is applied), rapid (10 min/sample), requires only the monitoring of time and absorbance, and provides good statistical parameters (s(r)

  19. A practical tool for maximal information coefficient analysis.

    PubMed

    Albanese, Davide; Riccadonna, Samantha; Donati, Claudio; Franceschi, Pietro

    2018-04-01

    The ability of finding complex associations in large omics datasets, assessing their significance, and prioritizing them according to their strength can be of great help in the data exploration phase. Mutual information-based measures of association are particularly promising, in particular after the recent introduction of the TICe and MICe estimators, which combine computational efficiency with superior bias/variance properties. An open-source software implementation of these two measures providing a complete procedure to test their significance would be extremely useful. Here, we present MICtools, a comprehensive and effective pipeline that combines TICe and MICe into a multistep procedure that allows the identification of relationships of various degrees of complexity. MICtools calculates their strength assessing statistical significance using a permutation-based strategy. The performances of the proposed approach are assessed by an extensive investigation in synthetic datasets and an example of a potential application on a metagenomic dataset is also illustrated. We show that MICtools, combining TICe and MICe, is able to highlight associations that would not be captured by conventional strategies.
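
    The permutation-based significance strategy mentioned above can be sketched generically in Python. MICtools itself implements the TICe/MICe estimators and its own multistep pipeline; here a crude binned mutual information is used only as a stand-in association measure, and the data are synthetic.

      import numpy as np

      rng = np.random.default_rng(3)

      def binned_mi(x, y, bins=8):
          """Plug-in mutual information from a 2D histogram (a crude stand-in for TICe/MICe)."""
          pxy, _, _ = np.histogram2d(x, y, bins=bins)
          pxy = pxy / pxy.sum()
          px = pxy.sum(axis=1, keepdims=True)
          py = pxy.sum(axis=0, keepdims=True)
          nz = pxy > 0
          return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

      def perm_pvalue(x, y, stat, n_perm=999):
          """Permutation p-value for an arbitrary association statistic."""
          observed = stat(x, y)
          null = np.array([stat(x, rng.permutation(y)) for _ in range(n_perm)])
          return observed, (1 + np.sum(null >= observed)) / (n_perm + 1)

      # A nonlinear (quadratic) relationship that a linear correlation would miss.
      x = rng.uniform(-1.0, 1.0, 300)
      y = x**2 + rng.normal(0.0, 0.1, 300)

      mi, p = perm_pvalue(x, y, binned_mi)
      print(f"binned MI = {mi:.3f}, permutation p = {p:.3f}")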

  20. Analysis strategies for longitudinal attachment loss data.

    PubMed

    Beck, J D; Elter, J R

    2000-02-01

    The purpose of this invited review is to describe and discuss methods currently in use to quantify the progression of attachment loss in epidemiological studies of periodontal disease, and to make recommendations for specific analytic methods based upon the particular design of the study and structure of the data. The review concentrates on the definition of incident attachment loss (ALOSS) and its component parts; measurement issues including thresholds and regression to the mean; methods of accounting for longitudinal change, including changes in means, changes in proportions of affected sites, incidence density, the effect of tooth loss and reversals, and repeated events; statistical models of longitudinal change, including the incorporation of the time element, use of linear, logistic or Poisson regression or survival analysis, and statistical tests; site vs person level of analysis, including statistical adjustment for correlated data; the strengths and limitations of ALOSS data. Examples from the Piedmont 65+ Dental Study are used to illustrate specific concepts. We conclude that incidence density is the preferred methodology to use for periodontal studies with more than one period of follow-up and that the use of studies not employing methods for dealing with complex samples, correlated data, and repeated measures does not take advantage of our current understanding of the site- and person-level variables important in periodontal disease and may generate biased results.
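
    A minimal sketch of the incidence-density calculation recommended above, events divided by accumulated time at risk, is shown below; the site-level follow-up records are invented for illustration.

      from dataclasses import dataclass

      @dataclass
      class SiteFollowUp:
          events: int       # attachment-loss events (>= threshold) observed at the site
          years: float      # time the site was under observation

      # Hypothetical follow-up records for a handful of periodontal sites.
      sites = [SiteFollowUp(1, 3.0), SiteFollowUp(0, 5.0), SiteFollowUp(2, 4.5),
               SiteFollowUp(0, 2.0), SiteFollowUp(1, 5.0)]

      events = sum(s.events for s in sites)
      site_time = sum(s.years for s in sites)
      incidence_density = events / site_time

      print(f"{events} events over {site_time:.1f} site-years")
      print(f"incidence density = {incidence_density:.3f} events per site-year")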

  1. The effects of organizational flexibility on nurse utilization and vacancy statistics in Ontario hospitals.

    PubMed

    Fisher, Anita; Baumann, Andrea; Blythe, Jennifer

    2007-01-01

    Social and economic changes in industrial societies during the past quarter-century encouraged organizations to develop greater flexibility in their employment systems in order to adapt to organizational restructuring and labour market shifts (Kallenberg 2003). During the 1990s this trend became evident in healthcare organizations. Before healthcare restructuring, employment in the acute hospital sector was more stable, with higher levels of full-time staff. However, in the downsizing era, employers favoured more flexible, contingent workforces (Zeytinoglu 1999). As healthcare systems evolved, staffing patterns became more chaotic and predicting staffing requirements more complex. Increased use of casual and part-time staff, overtime and agency nurses, as well as alterations in skills mix, masked vacancy counts and thus rendered this measurement of nursing demand increasingly difficult. This study explores flexible nurse staffing practices and demonstrates how data such as nurse vacancy statistics, considered in isolation from nurse utilization information, are inaccurate indicators of nursing demand and nurse shortage. It develops an algorithm that provides a standard methodology for improved monitoring and management of nurse utilization data and better quantification of vacancy statistics. Use of standard methodology promotes more accurate measurement of nurse utilization and shortage. Furthermore, it provides a solid base for improved nursing workforce planning, production and management.

  2. Learning predictive statistics from temporal sequences: Dynamics and strategies.

    PubMed

    Wang, Rui; Shen, Yuan; Tino, Peter; Welchman, Andrew E; Kourtzi, Zoe

    2017-10-01

    Human behavior is guided by our expectations about the future. Often, we make predictions by monitoring how event sequences unfold, even though such sequences may appear incomprehensible. Event structures in the natural environment typically vary in complexity, from simple repetition to complex probabilistic combinations. How do we learn these structures? Here we investigate the dynamics of structure learning by tracking human responses to temporal sequences that change in structure unbeknownst to the participants. Participants were asked to predict the upcoming item following a probabilistic sequence of symbols. Using a Markov process, we created a family of sequences, from simple frequency statistics (e.g., some symbols are more probable than others) to context-based statistics (e.g., symbol probability is contingent on preceding symbols). We demonstrate the dynamics with which individuals adapt to changes in the environment's statistics-that is, they extract the behaviorally relevant structures to make predictions about upcoming events. Further, we show that this structure learning relates to individual decision strategy; faster learning of complex structures relates to selection of the most probable outcome in a given context (maximizing) rather than matching of the exact sequence statistics. Our findings provide evidence for alternate routes to learning of behaviorally relevant statistics that facilitate our ability to predict future events in variable environments.
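
    The two kinds of structure described above, simple frequency statistics versus context-based statistics, can be sketched by generating symbol sequences from a zero-order distribution and from a first-order Markov chain. The alphabet and probabilities below are illustrative, not the study's stimuli.

      import numpy as np

      rng = np.random.default_rng(4)
      symbols = np.array(list("ABCD"))

      # Zero-order structure: some symbols are simply more probable than others.
      freq = np.array([0.5, 0.3, 0.15, 0.05])
      seq_frequency = rng.choice(symbols, size=30, p=freq)

      # First-order (context-based) structure: the next symbol depends on the current one.
      transition = np.array([[0.1, 0.8, 0.05, 0.05],   # after A, B is likely
                             [0.05, 0.1, 0.8, 0.05],   # after B, C is likely
                             [0.05, 0.05, 0.1, 0.8],   # after C, D is likely
                             [0.8, 0.05, 0.05, 0.1]])  # after D, A is likely
      state = 0
      seq_context = []
      for _ in range(30):
          state = rng.choice(4, p=transition[state])
          seq_context.append(symbols[state])

      print("frequency-based :", "".join(seq_frequency))
      print("context-based   :", "".join(seq_context))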

  3. Effects of molecular composition of natural organic matter on ferric iron complexation at circumneutral pH.

    PubMed

    Fujii, Manabu; Imaoka, Akira; Yoshimura, Chihiro; Waite, T D

    2014-04-15

    Thermodynamic and kinetic parameters for ferric iron (Fe[III]) complexation by well-characterized humic substances (HS) from various origins were determined by a competitive ligand method with 5-sulfosalicylic acid at circumneutral pH (6.0-8.0) and an ionic strength of ∼0.06 M. The measured Fe binding properties including conditional stability constants and complexation capacities ranged over more than 2 orders of magnitude, depending on the origin and the particular operationally defined fraction of HS examined. Statistical comparison of the complexation parameters to a range of chemical properties of the HS indicated a strong positive correlation between Fe(III) complexation capacity and aromatic carbon content in the HS at all pHs examined. In contrast, the complexation capacity was determined to be up to a few orders of magnitude smaller than the concentration of carboxylic and phenolic groups present. Therefore, specific functional groups including those resident in the proximity of aromatic structures within the HS are likely preferable for Fe(III) coordination under the conditions examined. Overall, our results suggest that the concentration of dissolved Fe(III) complexes in natural waters is substantially influenced by variation in HS characteristics in addition to other well-known factors such as HS concentration and nature and concentration of competing cations present.

  4. Reconciling statistical and systems science approaches to public health.

    PubMed

    Ip, Edward H; Rahmandad, Hazhir; Shoham, David A; Hammond, Ross; Huang, Terry T-K; Wang, Youfa; Mabry, Patricia L

    2013-10-01

    Although systems science has emerged as a set of innovative approaches to study complex phenomena, many topically focused researchers, including clinicians and scientists working in public health, are somewhat befuddled by this methodology that at times appears to be radically different from analytic methods, such as statistical modeling, to which the researchers are accustomed. There also appear to be conflicts between complex systems approaches and traditional statistical methodologies, both in terms of their underlying strategies and the languages they use. We argue that the conflicts are resolvable, and the sooner the better for the field. In this article, we show how statistical and systems science approaches can be reconciled, and how together they can advance solutions to complex problems. We do this by comparing the methods within a theoretical framework based on the work of population biologist Richard Levins. We present different types of models as representing different tradeoffs among the four desiderata of generality, realism, fit, and precision.

  5. Reconciling Statistical and Systems Science Approaches to Public Health

    PubMed Central

    Ip, Edward H.; Rahmandad, Hazhir; Shoham, David A.; Hammond, Ross; Huang, Terry T.-K.; Wang, Youfa; Mabry, Patricia L.

    2016-01-01

    Although systems science has emerged as a set of innovative approaches to study complex phenomena, many topically focused researchers, including clinicians and scientists working in public health, are somewhat befuddled by this methodology that at times appears to be radically different from analytic methods, such as statistical modeling, to which the researchers are accustomed. There also appear to be conflicts between complex systems approaches and traditional statistical methodologies, both in terms of their underlying strategies and the languages they use. We argue that the conflicts are resolvable, and the sooner the better for the field. In this article, we show how statistical and systems science approaches can be reconciled, and how together they can advance solutions to complex problems. We do this by comparing the methods within a theoretical framework based on the work of population biologist Richard Levins. We present different types of models as representing different tradeoffs among the four desiderata of generality, realism, fit, and precision. PMID:24084395

  6. Heuristic Identification of Biological Architectures for Simulating Complex Hierarchical Genetic Interactions

    PubMed Central

    Moore, Jason H; Amos, Ryan; Kiralis, Jeff; Andrews, Peter C

    2015-01-01

    Simulation plays an essential role in the development of new computational and statistical methods for the genetic analysis of complex traits. Most simulations start with a statistical model using methods such as linear or logistic regression that specify the relationship between genotype and phenotype. This is appealing due to its simplicity and because these statistical methods are commonly used in genetic analysis. It is our working hypothesis that simulations need to move beyond simple statistical models to more realistically represent the biological complexity of genetic architecture. The goal of the present study was to develop a prototype genotype–phenotype simulation method and software that are capable of simulating complex genetic effects within the context of a hierarchical biology-based framework. Specifically, our goal is to simulate multilocus epistasis or gene–gene interaction where the genetic variants are organized within the framework of one or more genes, their regulatory regions and other regulatory loci. We introduce here the Heuristic Identification of Biological Architectures for simulating Complex Hierarchical Interactions (HIBACHI) method and prototype software for simulating data in this manner. This approach combines a biological hierarchy, a flexible mathematical framework, a liability threshold model for defining disease endpoints, and a heuristic search strategy for identifying high-order epistatic models of disease susceptibility. We provide several simulation examples using genetic models exhibiting independent main effects and three-way epistatic effects. PMID:25395175

  7. Inference with viral quasispecies diversity indices: clonal and NGS approaches.

    PubMed

    Gregori, Josep; Salicrú, Miquel; Domingo, Esteban; Sanchez, Alex; Esteban, Juan I; Rodríguez-Frías, Francisco; Quer, Josep

    2014-04-15

    Given the inherent dynamics of a viral quasispecies, we are often interested in the comparison of diversity indices of sequential samples of a patient, or in the comparison of diversity indices of virus in groups of patients in a treated versus control design. It is then important to make sure that the diversity measures from each sample may be compared with no bias and within a consistent statistical framework. In the present report, we review some indices often used as measures for viral quasispecies complexity and provide means for statistical inference, applying procedures taken from the ecology field. In particular, we examine the Shannon entropy and the mutation frequency, and we discuss the appropriateness of different normalization methods of the Shannon entropy found in the literature. By taking amplicons ultra-deep pyrosequencing (UDPS) raw data as a surrogate of a real hepatitis C virus viral population, we study through in-silico sampling the statistical properties of these indices under two methods of viral quasispecies sampling, classical cloning followed by Sanger sequencing (CCSS) and next-generation sequencing (NGS) such as UDPS. We propose solutions specific to each of the two sampling methods-CCSS and NGS-to guarantee statistically conforming conclusions as free of bias as possible.
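
    Two of the reviewed indices, the Shannon entropy (with two of the normalizations discussed in the literature) and the mutation frequency, can be sketched in Python from a toy set of aligned reads; the haplotypes and counts below are invented and not from the study.

      import numpy as np
      from collections import Counter

      # Hypothetical aligned reads (haplotypes) from a viral quasispecies sample.
      reads = ["ACGTACGT"] * 60 + ["ACGTACGA"] * 25 + ["ACCTACGT"] * 10 + ["ACCTACGA"] * 5

      counts = Counter(reads)
      n_reads = len(reads)
      freqs = np.array([c / n_reads for c in counts.values()])

      # Shannon entropy and two normalizations discussed in the literature.
      H = -np.sum(freqs * np.log(freqs))
      Hn_haplotypes = H / np.log(len(counts))   # normalized by number of distinct haplotypes
      Hn_reads = H / np.log(n_reads)            # normalized by number of reads (sample-size dependent)

      # Mutation frequency: substitutions per site relative to the dominant (master) sequence.
      master = max(counts, key=counts.get)
      mismatches = sum(cnt * sum(a != b for a, b in zip(read, master))
                       for read, cnt in counts.items())
      mut_freq = mismatches / (n_reads * len(master))

      print(f"H = {H:.3f}, Hn (haplotypes) = {Hn_haplotypes:.3f}, Hn (reads) = {Hn_reads:.3f}")
      print(f"mutation frequency = {mut_freq:.4f} substitutions per site")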

  8. Ka-Band Atmospheric Phase Stability Measurements in Goldstone, CA; White Sands, NM; and Guam

    NASA Technical Reports Server (NTRS)

    Zemba, Michael J.; Morse, Jacquelynne Rose; Nessel, James A.

    2014-01-01

    As spacecraft communication links are driven to higher frequencies (e.g. Ka-band) both by spectrum congestion and the appeal of higher data rates, the propagation phenomena at these frequencies must be well characterized for effective system design. In particular, the phase stability of a site at a given frequency will govern whether or not the site is a practical location for an antenna array, particularly if uplink capabilities are desired. Propagation studies to characterize such phenomena must be done on a site-by-site basis due to the wide variety of climates and weather conditions at each ground terminal. Accordingly, in order to statistically characterize the atmospheric effects on Ka-Band links, site test interferometers (STIs) have been deployed at three of NASA's operational sites to directly measure each site's tropospheric phase stability. Using three years of results from these experiments, this paper will statistically characterize the simultaneous atmospheric phase noise measurements recorded by the STIs deployed at the following ground station sites: the Goldstone Deep Space Communications Complex near Barstow, CA; the White Sands Ground Terminal near Las Cruces, NM; and the Guam Remote Ground Terminal on the island of Guam.

  9. Observability of ionospheric space-time structure with ISR: A simulation study

    NASA Astrophysics Data System (ADS)

    Swoboda, John; Semeter, Joshua; Zettergren, Matthew; Erickson, Philip J.

    2017-02-01

    The sources of error from electronically steerable array (ESA) incoherent scatter radar (ISR) systems are investigated both theoretically and with use of an open-source ISR simulator, developed by the authors, called Simulator for ISR (SimISR). The main sources of error incorporated in the simulator include statistical uncertainty, which arises due to nature of the measurement mechanism and the inherent space-time ambiguity from the sensor. SimISR can take a field of plasma parameters, parameterized by time and space, and create simulated ISR data at the scattered electric field (i.e., complex receiver voltage) level, subsequently processing these data to show possible reconstructions of the original parameter field. To demonstrate general utility, we show a number of simulation examples, with two cases using data from a self-consistent multifluid transport model. Results highlight the significant influence of the forward model of the ISR process and the resulting statistical uncertainty on plasma parameter measurements and the core experiment design trade-offs that must be made when planning observations. These conclusions further underscore the utility of this class of measurement simulator as a design tool for more optimal experiment design efforts using flexible ESA class ISR systems.

  10. Evaluation of Assays for Measurement of Serum (Anti)oxidants in Hemodialysis Patients

    PubMed Central

    Jansen, Eugene H. J. M.; Antarorov, Risto

    2014-01-01

    Background. Various biomarkers and assays have been used for assessment of (anti)oxidant status in hemodialysis patients, including those intended for measurement of serum total (anti)oxidants, most often as part of a panel of biomarkers. Methods. Serum (anti)oxidant status was measured in 32 chronically hemodialyzed patients and in 47 healthy persons, using two oxidation assays and three antioxidant assays. Results. The patients before the hemodialysis session had higher values of total oxidants in comparison to the healthy persons, with a further increase during the hemodialysis. These findings were confirmed with both oxidation assays, but the assays differ in the percentage of increase and the statistical significance. All three antioxidant assays showed significantly higher values of the total serum antioxidants in the patients before the hemodialysis session in comparison to the healthy persons, and their significant decrease during the hemodialysis. However, the assays differ in the percentage of decrease, its statistical significance, and the correlations with uric acid. Conclusion. The variability of total (anti)oxidant results obtained using different assays should be taken into account when interpreting data from clinical studies of oxidative stress, especially in complex pathologies such as chronic hemodialysis. PMID:24982909

  11. Concentric network symmetry grasps authors' styles in word adjacency networks

    NASA Astrophysics Data System (ADS)

    Amancio, Diego R.; Silva, Filipi N.; Costa, Luciano da F.

    2015-06-01

    Several characteristics of written texts have been inferred from statistical analysis derived from networked models. Even though many network measurements have been adapted to study textual properties at several levels of complexity, some textual aspects have been disregarded. In this paper, we study the symmetry of word adjacency networks, a well-known representation of text as a graph. A statistical analysis of the symmetry distribution performed in several novels showed that most of the words do not display symmetric patterns of connectivity. More specifically, the merged symmetry displayed a distribution similar to the ubiquitous power-law distribution. Our experiments also revealed that the studied metrics do not correlate with other traditional network measurements, such as the degree or the betweenness centrality. The discriminability power of the symmetry measurements was verified in the authorship attribution task. Interestingly, we found that specific authors prefer particular types of symmetric motifs. As a consequence, the authorship of books could be accurately identified in 82.5% of the cases, in a dataset comprising books written by 8 authors. Because the proposed measurements for text analysis are complementary to the traditional approach, they can be used to improve the characterization of text networks, which might be useful for applications based on stylistic classification.

  12. Using Statistical Natural Language Processing for Understanding Complex Responses to Free-Response Tasks

    ERIC Educational Resources Information Center

    DeMark, Sarah F.; Behrens, John T.

    2004-01-01

    Whereas great advances have been made in the statistical sophistication of assessments in terms of evidence accumulation and task selection, relatively little statistical work has explored the possibility of applying statistical techniques to data for the purposes of determining appropriate domain understanding and to generate task-level scoring…

  13. Preictal dynamics of EEG complexity in intracranially recorded epileptic seizure: a case report.

    PubMed

    Bob, Petr; Roman, Robert; Svetlak, Miroslav; Kukleta, Miloslav; Chladek, Jan; Brazdil, Milan

    2014-11-01

    Recent findings suggest that neural complexity, reflecting a number of independent processes in the brain, may characterize typical changes during epileptic seizures and may make it possible to describe preictal dynamics. With respect to previously reported findings suggesting specific changes in neural complexity during the preictal period, we have used the pointwise correlation dimension (PD2) as a sensitive indicator of nonstationary changes in complexity of the electroencephalogram (EEG) signal. Although this measure of complexity in epileptic patients was previously reported by Feucht et al (Applications of correlation dimension and pointwise dimension for non-linear topographical analysis of focal onset seizures. Med Biol Comput. 1999;37:208-217), it was not used to study changes in preictal dynamics. With the aim of studying preictal changes in EEG complexity, we examined signals from 11 multicontact depth (intracerebral) EEG electrodes located in 108 cortical and subcortical brain sites, and from 3 scalp EEG electrodes in a patient with intractable epilepsy, who underwent preoperative evaluation before epilepsy surgery. From those 108 EEG contacts, records related to 44 electrode contacts implanted into lesional structures and white matter were not included in the experimental analysis. The results show that, in comparison to the interictal period (at about 8-6 minutes before seizure onset), there was a statistically significant decrease in PD2 complexity in the preictal period at about 2 minutes before seizure onset in all 64 intracranial channels included in the analysis, localized in various brain sites, and in the 3 scalp EEG channels as well. The presented results suggest that using PD2 in EEG analysis may have significant implications for research on preictal dynamics and prediction of epileptic seizures.
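
    The correlation-sum idea underlying PD2 can be sketched in Python as below. This is a crude pointwise estimate from a time-delay embedding, not the full PD2 algorithm with its stationarity and convergence criteria; the embedding parameters, radius range, and test signals are illustrative assumptions.

      import numpy as np

      def delay_embed(x, dim, tau):
          """Time-delay embedding of a scalar series."""
          n = len(x) - (dim - 1) * tau
          return np.column_stack([x[i * tau: i * tau + n] for i in range(dim)])

      def pointwise_corr_dim(x, dim=6, tau=5, ref=None):
          """Crude pointwise correlation-dimension estimate at one reference point."""
          emb = delay_embed(np.asarray(x, dtype=float), dim, tau)
          ref = len(emb) // 2 if ref is None else ref
          dists = np.linalg.norm(emb - emb[ref], axis=1)
          dists = dists[dists > 0]
          radii = np.logspace(np.log10(np.percentile(dists, 5)),
                              np.log10(np.percentile(dists, 50)), 10)
          corr_sum = np.array([np.mean(dists < r) for r in radii])  # local correlation sum C(r)
          slope, _ = np.polyfit(np.log(radii), np.log(corr_sum), 1)  # local scaling exponent
          return slope

      # A noisy sine (a 1-D closed orbit) gives a low estimate; white noise gives a much higher one.
      rng = np.random.default_rng(5)
      t = np.arange(5000)
      print("sine  :", round(pointwise_corr_dim(np.sin(0.05 * t) + 0.01 * rng.normal(size=t.size)), 2))
      print("noise :", round(pointwise_corr_dim(rng.normal(size=t.size)), 2))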

  14. Relating Complexity and Error Rates of Ontology Concepts. More Complex NCIt Concepts Have More Errors.

    PubMed

    Min, Hua; Zheng, Ling; Perl, Yehoshua; Halper, Michael; De Coronado, Sherri; Ochs, Christopher

    2017-05-18

    Ontologies are knowledge structures that lend support to many health-information systems. A study is carried out to assess the quality of ontological concepts based on a measure of their complexity. The results show a relation between complexity of concepts and error rates of concepts. A measure of lateral complexity defined as the number of exhibited role types is used to distinguish between more complex and simpler concepts. Using a framework called an area taxonomy, a kind of abstraction network that summarizes the structural organization of an ontology, concepts are divided into two groups along these lines. Various concepts from each group are then subjected to a two-phase QA analysis to uncover and verify errors and inconsistencies in their modeling. A hierarchy of the National Cancer Institute thesaurus (NCIt) is used as our test-bed. A hypothesis pertaining to the expected error rates of the complex and simple concepts is tested. Our study was done on the NCIt's Biological Process hierarchy. Various errors, including missing roles, incorrect role targets, and incorrectly assigned roles, were discovered and verified in the two phases of our QA analysis. The overall findings confirmed our hypothesis by showing a statistically significant difference between the amounts of errors exhibited by more laterally complex concepts vis-à-vis simpler concepts. QA is an essential part of any ontology's maintenance regimen. In this paper, we reported on the results of a QA study targeting two groups of ontology concepts distinguished by their level of complexity, defined in terms of the number of exhibited role types. The study was carried out on a major component of an important ontology, the NCIt. The findings suggest that more complex concepts tend to have a higher error rate than simpler concepts. These findings can be utilized to guide ongoing efforts in ontology QA.
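
    The group comparison behind the hypothesis test described above can be illustrated with a contingency-table test in Python; the QA counts below are invented (the actual NCIt counts are reported in the paper).

      from scipy.stats import chi2_contingency

      # Hypothetical QA counts: rows = concept group, columns = (erroneous, error-free).
      table = [[28, 72],   # more laterally complex concepts
               [11, 89]]   # simpler concepts

      chi2, p, dof, expected = chi2_contingency(table)
      print(f"complex-group error rate: {28 / 100:.0%}, simple-group error rate: {11 / 100:.0%}")
      print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p:.4f}")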

  15. Automated Selection of Regions of Interest for Intensity-based FRET Analysis of Transferrin Endocytic Trafficking in Normal vs. Cancer Cells

    PubMed Central

    Talati, Ronak; Vanderpoel, Andrew; Eladdadi, Amina; Anderson, Kate; Abe, Ken; Barroso, Margarida

    2013-01-01

    The overexpression of certain membrane-bound receptors is a hallmark of cancer progression and it has been suggested to affect the organization, activation, recycling and down-regulation of receptor-ligand complexes in human cancer cells. Thus, comparing receptor trafficking pathways in normal vs. cancer cells requires the ability to image cells expressing dramatically different receptor levels. Here, we have presented a significant technical advance in the analysis and processing of images collected using intensity-based Förster resonance energy transfer (FRET) confocal microscopy. An automated ImageJ macro was developed to select regions of interest (ROIs) based on intensity- and statistics-based thresholds within cellular images with reduced FRET signal. Furthermore, SSMD (strictly standardized mean differences), a statistical signal-to-noise ratio (SNR) evaluation parameter, was used to validate the quality of FRET analysis, in particular of ROI database selection. The ImageJ ROI selection macro, together with SSMD as an evaluation parameter of SNR levels, was used to investigate the endocytic recycling of Tfn-TFR complexes at nanometer range resolution in human normal vs. breast cancer cells expressing significantly different levels of endogenous TFR. Here, the FRET-based assay demonstrates that Tfn-TFR complexes in normal epithelial vs. breast cancer cells show a significantly different E% behavior during their endocytic recycling pathway. Since E% is a relative measure of distance, we propose that these changes in E% levels represent conformational changes in Tfn-TFR complexes during the endocytic pathway. Thus, our results indicate that Tfn-TFR complexes undergo different conformational changes in normal vs. cancer cells, indicating that the organization of Tfn-TFR complexes at the nanometer range is significantly altered during the endocytic recycling pathway in cancer cells. In summary, improvements in the automated selection of FRET ROI datasets allowed us to detect significant changes in E% with potential biological significance in human normal vs. cancer cells. PMID:23994873
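
    The SSMD quantity used above as a signal-to-noise evaluation parameter is straightforward to compute; a minimal Python sketch with invented E% values for the two cell types follows.

      import numpy as np

      def ssmd(a, b):
          """Strictly standardized mean difference between two independent samples."""
          a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
          return (a.mean() - b.mean()) / np.sqrt(a.var(ddof=1) + b.var(ddof=1))

      rng = np.random.default_rng(6)
      # Hypothetical FRET efficiency (E%) values from ROIs in normal vs. cancer cells.
      e_normal = rng.normal(18.0, 3.0, 50)
      e_cancer = rng.normal(24.0, 4.0, 50)

      print(f"SSMD = {ssmd(e_cancer, e_normal):.2f}  (|SSMD| of roughly 2 or more is usually read as a strong effect)")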

  16. Multivariate statistical data analysis methods for detecting baroclinic wave interactions in the thermally driven rotating annulus

    NASA Astrophysics Data System (ADS)

    von Larcher, Thomas; Harlander, Uwe; Alexandrov, Kiril; Wang, Yongtai

    2010-05-01

    Experiments on baroclinic wave instabilities in a rotating cylindrical gap have long been performed, e.g., to reveal regular waves of different zonal wave numbers, to better understand the transition to the quasi-chaotic regime, and to reveal the underlying dynamical processes of complex wave flows. We present the application of appropriate multivariate data analysis methods to time series data sets acquired with non-intrusive measurement techniques of quite different natures. While highly accurate Laser-Doppler velocimetry (LDV) is used for measurements of the radial velocity component at equidistant azimuthal positions, a highly sensitive thermographic camera measures the surface temperature field. The measurements are performed at particular parameter points where our former studies show that complex wave patterns occur [1, 2]. Obviously, the temperature data set has much more information content than the velocity data set due to the particular measurement techniques. Both sets of time series data are analyzed using multivariate statistical techniques. While the LDV data sets are studied by applying Multi-Channel Singular Spectrum Analysis (M-SSA), the temperature data sets are analyzed by applying Empirical Orthogonal Functions (EOF). Our goal is (a) to verify the results yielded by the analysis of the velocity data and (b) to compare the data analysis methods. Therefore, the temperature data are processed so as to become comparable to the LDV data, i.e. the size of the data set is reduced as if the temperature measurements had been performed only at equidistant azimuthal positions. This approach initially results in a great loss of information, but applying M-SSA to the reduced temperature data sets enables us to compare the methods. [1] Th. von Larcher and C. Egbers, Experiments on transitions of baroclinic waves in a differentially heated rotating annulus, Nonlinear Processes in Geophysics, 2005, 12, 1033-1041, NPG Print: ISSN 1023-5809, NPG Online: ISSN 1607-7946 [2] U. Harlander, Th. von Larcher, Y. Wang and C. Egbers, PIV- and LDV-measurements of baroclinic wave interactions in a thermally driven rotating annulus, Experiments in Fluids, 2009, DOI: 10.1007/s00348-009-0792-5
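
    An EOF decomposition of a space-time field reduces to a singular value decomposition of the anomaly matrix; the Python sketch below uses a synthetic azimuthally drifting wave, not the annulus measurements, to show the basic computation.

      import numpy as np

      rng = np.random.default_rng(7)

      # Hypothetical space-time field: 200 time steps x 64 azimuthal positions,
      # a wavenumber-3 mode drifting in azimuth plus noise.
      n_t, n_x = 200, 64
      t = np.arange(n_t)[:, None]
      phi = np.linspace(0, 2 * np.pi, n_x, endpoint=False)[None, :]
      field = np.cos(3 * phi - 0.05 * t) + 0.3 * rng.normal(size=(n_t, n_x))

      # EOF analysis: remove the time mean, then SVD of the anomaly matrix.
      anom = field - field.mean(axis=0)
      u, s, vt = np.linalg.svd(anom, full_matrices=False)
      var_frac = s**2 / np.sum(s**2)

      # A drifting wave appears as a pair of EOFs (spatial patterns in vt, in quadrature)
      # with similar variance; the principal components u[:, k] * s[k] carry the time evolution.
      print("variance explained by the leading EOFs:", np.round(var_frac[:4], 3))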

  17. Modeling Particle Exposure in US Trucking Terminals

    PubMed Central

    Davis, ME; Smith, TJ; Laden, F; Hart, JE; Ryan, LM; Garshick, E

    2007-01-01

    Multi-tiered sampling approaches are common in environmental and occupational exposure assessment, where exposures for a given individual are often modeled based on simultaneous measurements taken at multiple indoor and outdoor sites. The monitoring data from such studies is hierarchical by design, imposing a complex covariance structure that must be accounted for in order to obtain unbiased estimates of exposure. Statistical methods such as structural equation modeling (SEM) represent a useful alternative to simple linear regression in these cases, providing simultaneous and unbiased predictions of each level of exposure based on a set of covariates specific to the exposure setting. We test the SEM approach using data from a large exposure assessment of diesel and combustion particles in the US trucking industry. The exposure assessment includes data from 36 different trucking terminals across the United States sampled between 2001 and 2005, measuring PM2.5 and its elemental carbon (EC), organic carbon (OC) components, by personal monitoring, and sampling at two indoor work locations and an outdoor “background” location. Using the SEM method, we predict: 1) personal exposures as a function of work related exposure and smoking status; 2) work related exposure as a function of terminal characteristics, indoor ventilation, job location, and background exposure conditions; and 3) background exposure conditions as a function of weather, nearby source pollution, and other regional differences across terminal sites. The primary advantage of SEMs in this setting is the ability to simultaneously predict exposures at each of the sampling locations, while accounting for the complex covariance structure among the measurements and descriptive variables. The statistically significant results and high R2 values observed from the trucking industry application supports the broader use of this approach in exposure assessment modeling. PMID:16856739

  18. Comparison of AERMOD and CALPUFF models for simulating SO2 concentrations in a gas refinery.

    PubMed

    Atabi, Farideh; Jafarigol, Farzaneh; Moattar, Faramarz; Nouri, Jafar

    2016-09-01

    In this study, the concentration of SO2 from a gas refinery located in complex terrain was calculated with the steady-state AERMOD model and the non-steady-state CALPUFF model. First, in four seasons, SO2 concentrations emitted from 16 refinery stacks were obtained by field measurements at nine receptors, and then the performance of both models was evaluated. The simulated ambient SO2 concentrations from each model were compared with the observed concentrations, and the model results were also compared with each other. The evaluation of the two models in simulating SO2 concentrations was based on statistical analysis and Q-Q plots. Review of the statistical parameters and Q-Q plots showed that the performance of both models in simulating SO2 concentrations in the region can be considered acceptable. The composite ratio of simulated to observed values across the receptors for all four averaging times is 0.72 for AERMOD, whereas CALPUFF's ratio is 0.89. However, under the complex topographic conditions, CALPUFF offers better agreement with the observed concentrations.
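
    The composite simulated-to-observed ratio and the quantile pairing behind a Q-Q comparison can be sketched in Python with invented receptor concentrations (not the study's data); the two synthetic "models" are constructed so that one underpredicts more than the other.

      import numpy as np

      rng = np.random.default_rng(8)

      # Hypothetical SO2 concentrations (ug/m3) at nine receptors: observed and two model runs.
      observed = rng.lognormal(mean=2.0, sigma=0.5, size=9)
      model_a = observed * rng.lognormal(mean=-0.3, sigma=0.3, size=9)   # tends to underpredict
      model_b = observed * rng.lognormal(mean=-0.1, sigma=0.3, size=9)   # closer to observations

      for name, sim in [("model A", model_a), ("model B", model_b)]:
          ratio = sim.mean() / observed.mean()                       # composite simulated/observed ratio
          qq = np.column_stack([np.sort(sim), np.sort(observed)])    # paired quantiles for a Q-Q plot
          print(f"{name}: composite ratio = {ratio:.2f}; "
                f"largest quantile pair = ({qq[-1, 0]:.1f}, {qq[-1, 1]:.1f})")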

  19. Maternal periodontal disease and preeclampsia in Jaipur population

    PubMed Central

    Jaiman, Girija; Nayak, Prathibha Anand; Sharma, Sanu; Nagpal, Kiran

    2018-01-01

    Background: Preeclampsia is identified as an important cause of maternal and newborn mortality. In spite of extensive research, the exact etiological relations have not been established. Hence, an attempt has been made in this study to evaluate the relationship between preeclampsia and maternal periodontal disease. Materials and Methods: The case–control study comprised thirty pregnant women distributed equally between the case (preeclampsia) and control (healthy) groups. Gingival index, plaque index, bleeding on probing, clinical probing depth, and clinical attachment level were measured in both groups. Microbiologic examination for identification of one red complex organism, Porphyromonas gingivalis, and one orange complex organism, Fusobacterium nucleatum, was done in plaque and placental blood of cases and controls. The clinical examinations and collection of placental blood were done 24 h before delivery. Results: The periodontal condition in the preeclamptic women was statistically worse compared with the normotensive women. There was no statistically significant association of microorganisms in plaque and placental blood between normotensive control and preeclamptic pregnant women. The preeclamptic women had significantly higher chances of having newborns weighing <2.5 kg than the normotensive women. Conclusion: The preeclamptic women had significantly more severe periodontitis and lower fetal birth weight than the normotensive women. PMID:29568173

  20. Maternal periodontal disease and preeclampsia in Jaipur population.

    PubMed

    Jaiman, Girija; Nayak, Prathibha Anand; Sharma, Sanu; Nagpal, Kiran

    2018-01-01

    Preeclampsia is identified as an important cause of maternal and newborn mortality. Despite extensive research, the exact etiological relations have not been established. Hence, an attempt has been made in this study to evaluate the relationship between preeclampsia and maternal periodontal disease. The case-control study comprised thirty pregnant women distributed equally between the case (preeclampsia) and control (healthy) groups. Gingival index, plaque index, bleeding on probing, clinical probing depth, and clinical attachment level were measured in both groups. Microbiologic examination for identification of one red complex organism, Porphyromonas gingivalis, and one orange complex organism, Fusobacterium nucleatum, was done in plaque and placental blood of cases and controls. The clinical examinations and collection of placental blood were done 24 h before delivery. The periodontal condition of the preeclamptic women was statistically worse than that of the normotensive women. There was no statistically significant association between microorganisms in plaque and placental blood between the normotensive controls and the preeclamptic pregnant women. The preeclamptic women had significantly higher chances of having newborns weighing <2.5 kg than the normotensive women. The preeclamptic women had significantly more severe periodontitis and lower fetal birth weight than the normotensive women.

  1. Concepts and their dynamics: a quantum-theoretic modeling of human thought.

    PubMed

    Aerts, Diederik; Gabora, Liane; Sozzo, Sandro

    2013-10-01

    We analyze different aspects of our quantum modeling approach to human concepts and, more specifically, focus on the quantum effects of contextuality, interference, entanglement, and emergence, illustrating how each of them makes its appearance in specific situations of the dynamics of human concepts and their combinations. We point out the relation of our approach, which is based on an ontology of a concept as an entity in a state changing under the influence of a context, with the main traditional concept theories, that is, prototype theory, exemplar theory, and theory theory. We consider the question of why quantum theory performs so well in its modeling of human concepts, and we shed light on this question by analyzing the role of complex amplitudes, showing how they allow us to describe interference in the statistics of measurement outcomes, whereas in the traditional theories the statistics of outcomes originate in classical probability weights, without the possibility of interference. The relevance of complex numbers, the appearance of entanglement, and the role of Fock space in explaining contextual emergence, all as unique features of the quantum modeling, are explicitly revealed in this article by analyzing human concepts and their dynamics. © 2013 Cognitive Science Society, Inc.
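
    A toy numerical example of the interference effect mentioned above: with complex amplitudes, the membership probability of a superposed (combined) concept deviates from the classical average of the component probabilities by an interference term. The amplitudes and relative phase below are arbitrary, chosen only for illustration.

```python
# Toy illustration: interference term from complex membership amplitudes.
# The amplitudes and relative phase below are arbitrary.
import numpy as np

psi_a = np.sqrt(0.4) * np.exp(1j * 0.0)   # |psi_a|^2 = P(A) = 0.4
psi_b = np.sqrt(0.7) * np.exp(1j * 1.2)   # |psi_b|^2 = P(B) = 0.7, phase 1.2 rad

p_a, p_b = abs(psi_a) ** 2, abs(psi_b) ** 2

# Classical mixture prediction: plain average of the two probabilities
p_classical = 0.5 * (p_a + p_b)

# Quantum prediction for the superposed state (psi_a + psi_b) / sqrt(2):
# |(psi_a + psi_b)/sqrt(2)|^2 = (P(A) + P(B))/2 + Re(conj(psi_a) * psi_b)
p_quantum = 0.5 * (p_a + p_b) + (np.conj(psi_a) * psi_b).real

print(f"P(A) = {p_a:.2f}, P(B) = {p_b:.2f}")
print(f"classical average      : {p_classical:.3f}")
print(f"with interference term : {p_quantum:.3f}")
```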

  2. The challenging use and interpretation of circulating biomarkers of exposure to persistent organic pollutants in environmental health: Comparison of lipid adjustment approaches in a case study related to endometriosis.

    PubMed

    Cano-Sancho, German; Labrune, Léa; Ploteau, Stéphane; Marchand, Philippe; Le Bizec, Bruno; Antignac, Jean-Philippe

    2018-06-01

    The gold-standard matrix for measuring internal levels of persistent organic pollutants (POPs) is adipose tissue; however, in epidemiological studies serum is preferred because of its lower cost and greater accessibility. The interpretation of serum biomarkers is tightly linked to the understanding of the underlying causal structure relating the POPs, serum lipids and the disease. Considering the practical benefits of serum biomarkers, we aimed to examine whether statistical modelling could improve their use and interpretation in the study of endometriosis. Hence, we conducted a systematic comparison of statistical approaches commonly used to lipid-adjust circulating biomarkers of POPs, based on existing methods, using data from a pilot case-control study focused on severe deep infiltrating endometriosis. The odds ratios (ORs) obtained from unconditional regression for the models based on serum biomarkers were further compared to those obtained from adipose tissue. The results of this exploratory study did not support the use of blood biomarkers as proxy estimates of POPs in adipose tissue in risk models for endometriosis with the available statistical approaches to correct for lipids. The statistical approaches currently used to lipid-adjust circulating POPs do not fully represent the underlying biological complexity between POPs, lipids and disease (especially those directly or indirectly affecting or affected by lipid metabolism). Hence, further investigations are warranted to improve the use and interpretation of blood biomarkers under complex scenarios of lipid dynamics. Copyright © 2018 Elsevier Ltd. All rights reserved.

  3. An ensemble Kalman filter for statistical estimation of physics constrained nonlinear regression models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harlim, John, E-mail: jharlim@psu.edu; Mahdi, Adam, E-mail: amahdi@ncsu.edu; Majda, Andrew J., E-mail: jonjon@cims.nyu.edu

    2014-01-15

    A central issue in contemporary science is the development of nonlinear data driven statistical–dynamical models for time series of noisy partial observations from nature or a complex model. It has been established recently that ad-hoc quadratic multi-level regression models can have finite-time blow-up of statistical solutions and/or pathological behavior of their invariant measure. Recently, a new class of physics constrained nonlinear regression models were developed to ameliorate this pathological behavior. Here a new finite ensemble Kalman filtering algorithm is developed for estimating the state, the linear and nonlinear model coefficients, the model and the observation noise covariances from available partial noisy observations of the state. Several stringent tests and applications of the method are developed here. In the most complex application, the perfect model has 57 degrees of freedom involving a zonal (east–west) jet, two topographic Rossby waves, and 54 nonlinearly interacting Rossby waves; the perfect model has significant non-Gaussian statistics in the zonal jet with blocked and unblocked regimes and a non-Gaussian skewed distribution due to interaction with the other 56 modes. We only observe the zonal jet contaminated by noise and apply the ensemble filter algorithm for estimation. Numerically, we find that a three dimensional nonlinear stochastic model with one level of memory mimics the statistical effect of the other 56 modes on the zonal jet in an accurate fashion, including the skew non-Gaussian distribution and autocorrelation decay. On the other hand, a similar stochastic model with zero memory levels fails to capture the crucial non-Gaussian behavior of the zonal jet from the perfect 57-mode model.
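
    For readers unfamiliar with the ensemble filtering machinery referred to above, the following is a minimal sketch of a stochastic ensemble Kalman filter analysis step for a generic linear observation operator; it is not the paper's algorithm (which also estimates model coefficients and noise covariances), only the basic ensemble update that such algorithms build on.

```python
# Minimal stochastic ensemble Kalman filter (EnKF) analysis step for a linear
# observation operator H. Illustration only; not the paper's full algorithm.
import numpy as np

rng = np.random.default_rng(1)

def enkf_analysis(ensemble, y_obs, H, R):
    """ensemble: (n_members, n_state); y_obs: (n_obs,);
    H: (n_obs, n_state) observation operator; R: (n_obs, n_obs) obs error covariance."""
    n_members = ensemble.shape[0]
    x_mean = ensemble.mean(axis=0)
    X = ensemble - x_mean                      # state anomalies
    Y = X @ H.T                                # observed-space anomalies
    P_yy = Y.T @ Y / (n_members - 1) + R       # innovation covariance
    P_xy = X.T @ Y / (n_members - 1)           # state-observation cross covariance
    K = P_xy @ np.linalg.inv(P_yy)             # Kalman gain
    # Perturbed observations (stochastic EnKF variant)
    y_pert = y_obs + rng.multivariate_normal(np.zeros(len(y_obs)), R, size=n_members)
    innovations = y_pert - ensemble @ H.T
    return ensemble + innovations @ K.T

# Example: 3-state system, only the first component observed (cf. the zonal jet)
ensemble = rng.normal(size=(50, 3))
H = np.array([[1.0, 0.0, 0.0]])
R = np.array([[0.1]])
updated = enkf_analysis(ensemble, np.array([0.8]), H, R)
print("prior mean:    ", ensemble.mean(axis=0).round(3))
print("posterior mean:", updated.mean(axis=0).round(3))
```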

  4. A clinicomicrobiological study to evaluate the efficacy of manual and powered toothbrushes among autistic patients

    PubMed Central

    Vajawat, Mayuri; Deepika, P. C.; Kumar, Vijay; Rajeshwari, P.

    2015-01-01

    Aim: To compare the efficacy of powered toothbrushes with that of manual toothbrushes in improving gingival health and reducing salivary red complex counts among autistic individuals. Materials and Methods: Forty autistic individuals were selected. The test group received powered toothbrushes, and the control group received manual toothbrushes. Plaque index and gingival index were recorded. Unstimulated saliva was collected for analysis of red complex organisms using polymerase chain reaction. Results: A statistically significant reduction in the plaque scores was seen over a period of 12 weeks in both groups (P < 0.001 for tests and P = 0.002 for controls). This reduction was statistically more significant in the test group (P = 0.024). A statistically significant reduction in the gingival scores was seen over a period of 12 weeks in both groups (P < 0.001 for tests and P = 0.001 for controls). This reduction was statistically more significant in the test group (P = 0.042). No statistically significant reduction in the detection rate of red complex organisms was seen at 4 weeks in either group. Conclusion: Powered toothbrushes result in a significant overall improvement in gingival health when constant reinforcement of oral hygiene instructions is given. PMID:26681855

  5. A statistical physics perspective on criticality in financial markets

    NASA Astrophysics Data System (ADS)

    Bury, Thomas

    2013-11-01

    Stock markets are complex systems exhibiting collective phenomena and particular features such as synchronization, fluctuations distributed as power-laws, non-random structures and similarity to neural networks. Such specific properties suggest that markets operate at a very special point. Financial markets are believed to be critical by analogy to physical systems, but little statistically founded evidence has been given. Through a data-based methodology and comparison to simulations inspired by the statistical physics of complex systems, we show that the Dow Jones and index sets are not rigorously critical. However, financial systems are closer to criticality in the crash neighborhood.

  6. Statistical Analysis of Complexity Generators for Cost Estimation

    NASA Technical Reports Server (NTRS)

    Rowell, Ginger Holmes

    1999-01-01

    Predicting the cost of cutting edge new technologies involved with spacecraft hardware can be quite complicated. A new feature of the NASA Air Force Cost Model (NAFCOM), called the Complexity Generator, is being developed to model the complexity factors that drive the cost of space hardware. This parametric approach is also designed to account for the differences in cost, based on factors that are unique to each system and subsystem. The cost driver categories included in this model are weight, inheritance from previous missions, technical complexity, and management factors. This paper explains the Complexity Generator framework, the statistical methods used to select the best model within this framework, and the procedures used to find the region of predictability and the prediction intervals for the cost of a mission.

  7. Imprints of magnetic power and helicity spectra on radio polarimetry statistics

    NASA Astrophysics Data System (ADS)

    Junklewitz, H.; Enßlin, T. A.

    2011-06-01

    The statistical properties of turbulent magnetic fields in radio-synchrotron sources should be imprinted on the statistics of polarimetric observables. In search of these imprints, i.e. characteristic modifications of the polarimetry statistics caused by magnetic field properties, we calculate correlation and cross-correlation functions from a set of observables that contain total intensity I, polarized intensity P, and Faraday depth φ. The correlation functions are evaluated for all combinations of observables up to fourth order in magnetic field B. We derive these analytically as far as possible and from first principles using only some basic assumptions, such as Gaussian statistics for the underlying magnetic field in the observed region and statistical homogeneity. We further assume some simplifications to reduce the complexity of the calculations, because for a start we were interested in a proof of concept. Using this statistical approach, we show that it is possible to gain information about the helical part of the magnetic power spectrum via the correlation functions $\langle P(k_\perp)\,\phi(k'_\perp)\,\phi(k''_\perp)\rangle_B$ and $\langle I(k_\perp)\,\phi(k'_\perp)\,\phi(k''_\perp)\rangle_B$. Using this insight, we construct an easy-to-use test for helicity called LITMUS (Local Inference Test for Magnetic fields which Uncovers heliceS), which gives a spectrally integrated measure of helicity. For now, all calculations are given in a Faraday-free case, but set up so that Faraday rotational effects can be included later.

  8. Remote sensing of environmental particulate pollutants - Optical methods for determinations of size distribution and complex refractive index

    NASA Technical Reports Server (NTRS)

    Fymat, A. L.

    1978-01-01

    A unifying approach, based on a generalization of Pearson's differential equation of statistical theory, is proposed for both the representation of particulate size distribution and the interpretation of radiometric measurements in terms of this parameter. A single-parameter gamma-type distribution is introduced, and it is shown that inversion can only provide the dimensionless parameter, r/ab (where r = particle radius, a = effective radius, b = effective variance), at least when the distribution vanishes at both ends. The basic inversion problem in reconstructing the particle size distribution is analyzed, and the existing methods are reviewed (with emphasis on their capabilities) and classified. A two-step strategy is proposed for simultaneously determining the complex refractive index and reconstructing the size distribution of atmospheric particulates.

  9. Occam’s Quantum Strop: Synchronizing and Compressing Classical Cryptic Processes via a Quantum Channel

    NASA Astrophysics Data System (ADS)

    Mahoney, John R.; Aghamohammadi, Cina; Crutchfield, James P.

    2016-02-01

    A stochastic process’ statistical complexity stands out as a fundamental property: the minimum information required to synchronize one process generator to another. How much information is required, though, when synchronizing over a quantum channel? Recent work demonstrated that representing causal similarity as quantum state-indistinguishability provides a quantum advantage. We generalize this to synchronization and offer a sequence of constructions that exploit extended causal structures, finding a substantial increase of the quantum advantage. We demonstrate that maximum compression is determined by the process’ cryptic order, a classical, topological property closely allied to Markov order, itself a measure of historical dependence. We introduce an efficient algorithm that computes the quantum advantage and close by noting that the advantage comes at a cost: one trades off prediction for generation complexity.

  10. Task Oriented Evaluation of Module Extraction Techniques

    NASA Astrophysics Data System (ADS)

    Palmisano, Ignazio; Tamma, Valentina; Payne, Terry; Doran, Paul

    Ontology Modularization techniques identify coherent and often reusable regions within an ontology. The ability to identify such modules, and thus potentially reduce the size or complexity of an ontology for a given task or set of concepts, is increasingly important in the Semantic Web as domain ontologies increase in terms of size, complexity and expressivity. To date, many techniques have been developed, but evaluation of the results of these techniques is sketchy and somewhat ad hoc. Theoretical properties of modularization algorithms have only been studied in a small number of cases. This paper presents an empirical analysis of a number of modularization techniques, and the modules they identify over a number of diverse ontologies, by utilizing objective, task-oriented measures to evaluate the fitness of the modules for a number of statistical classification problems.

  11. Occam's Quantum Strop: Synchronizing and Compressing Classical Cryptic Processes via a Quantum Channel.

    PubMed

    Mahoney, John R; Aghamohammadi, Cina; Crutchfield, James P

    2016-02-15

    A stochastic process' statistical complexity stands out as a fundamental property: the minimum information required to synchronize one process generator to another. How much information is required, though, when synchronizing over a quantum channel? Recent work demonstrated that representing causal similarity as quantum state-indistinguishability provides a quantum advantage. We generalize this to synchronization and offer a sequence of constructions that exploit extended causal structures, finding a substantial increase of the quantum advantage. We demonstrate that maximum compression is determined by the process' cryptic order, a classical, topological property closely allied to Markov order, itself a measure of historical dependence. We introduce an efficient algorithm that computes the quantum advantage and close by noting that the advantage comes at a cost: one trades off prediction for generation complexity.

  12. Information loss method to measure node similarity in networks

    NASA Astrophysics Data System (ADS)

    Li, Yongli; Luo, Peng; Wu, Chong

    2014-09-01

    Similarity measurement for the network node has been paid increasing attention in the field of statistical physics. In this paper, we propose an entropy-based information loss method to measure the node similarity. The whole model is established based on this idea that less information loss is caused by seeing two more similar nodes as the same. The proposed new method has relatively low algorithm complexity, making it less time-consuming and more efficient to deal with the large scale real-world network. In order to clarify its availability and accuracy, this new approach was compared with some other selected approaches on two artificial examples and synthetic networks. Furthermore, the proposed method is also successfully applied to predict the network evolution and predict the unknown nodes' attributions in the two application examples.
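
    The abstract does not spell out the exact functional form of the information-loss measure, so the sketch below only illustrates the general idea under an assumed formulation: treat each node's normalized connection weights as a probability distribution and score a pair of nodes by the information lost when they are merged (a Jensen-Shannon-type quantity), with smaller loss meaning higher similarity.

```python
# Hedged sketch: entropy-based "information loss" similarity between nodes.
# The exact measure in the paper may differ; this uses a Jensen-Shannon-type
# merge loss on the nodes' normalized connection-weight distributions.
import numpy as np

def row_distribution(adj, i):
    row = adj[i].astype(float)
    return row / row.sum()

def entropy(p):
    p = p[p > 0]
    return float(-(p * np.log2(p)).sum())

def merge_information_loss(adj, i, j):
    # Information lost when nodes i and j are treated as the same node
    p, q = row_distribution(adj, i), row_distribution(adj, j)
    m = 0.5 * (p + q)
    return entropy(m) - 0.5 * (entropy(p) + entropy(q))

def similarity(adj, i, j):
    # Less information loss -> more similar (maps loss in [0, 1] to [0.5, 1])
    return 1.0 / (1.0 + merge_information_loss(adj, i, j))

# Tiny undirected example: nodes 0 and 1 have identical neighborhoods
adj = np.array([[0, 0, 1, 1],
                [0, 0, 1, 1],
                [1, 1, 0, 0],
                [1, 1, 0, 0]])
print("similarity(0, 1):", round(similarity(adj, 0, 1), 3))  # 1.0 (identical rows)
print("similarity(0, 2):", round(similarity(adj, 0, 2), 3))  # 0.5 (disjoint rows)
```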

  13. A SURVEY OF LABORATORY AND STATISTICAL ISSUES RELATED TO FARMWORKER EXPOSURE STUDIES

    EPA Science Inventory

    Developing internally valid, and perhaps generalizable, farmworker exposure studies is a complex process that involves many statistical and laboratory considerations. Statistics are an integral component of each study beginning with the design stage and continuing to the final da...

  14. Analysis of Immune Complex Structure by Statistical Mechanics and Light Scattering Techniques.

    NASA Astrophysics Data System (ADS)

    Busch, Nathan Adams

    1995-01-01

    The size and structure of immune complexes determine their behavior in the immune system. The chemical physics of complex formation is not well understood; this is due in part to inadequate characterization of the proteins involved, and in part to the lack of sufficiently well-developed theoretical techniques. Understanding complex formation will permit rational design of strategies for inhibiting tissue deposition of the complexes. A statistical mechanical model of the proteins based upon the theory of associating fluids was developed. The multipole electrostatic potential for each protein used in this study was characterized for net protein charge, dipole moment magnitude, and dipole moment direction. The binding sites between the model antigen and antibodies were characterized for their net surface area, energy, and position relative to the dipole moment of the protein. The equilibrium binding graphs generated with the protein statistical mechanical model compare favorably with experimental data obtained from radioimmunoassay results. The isothermal compressibility predicted by the model agrees with results obtained from dynamic light scattering. The statistical mechanics model was used to investigate association between the model antigen and selected pairs of antibodies. It was found that, in accordance with expectations from thermodynamic arguments, the highest total binding energy yielded complex distributions skewed toward larger complex sizes. From examination of the simulated formation of ring structures from linear chain complexes, and from the joint shape probability surfaces, it was found that ring configurations formed by the "folding" of linear chains until the ends were within binding distance. By comparing single antigen/two antibody systems which differ only in their binding site locations, it was found that binding site location influences complex size and shape distributions only when ring formation occurs. The internal potential energy of a ring complex is considerably less than that of the non-associating system; therefore the ring complexes are quite stable and show no evidence of breaking and collapsing into smaller complexes. Ring formation will occur only in systems where the total free energy of each complex may be minimized. Thus, ring formation will occur, even though entropically unfavorable conformations result, if the total free energy can be minimized by doing so.

  15. Accounting for isotopic clustering in Fourier transform mass spectrometry data analysis for clinical diagnostic studies.

    PubMed

    Kakourou, Alexia; Vach, Werner; Nicolardi, Simone; van der Burgt, Yuri; Mertens, Bart

    2016-10-01

    Mass spectrometry based clinical proteomics has emerged as a powerful tool for high-throughput protein profiling and biomarker discovery. Recent improvements in mass spectrometry technology have boosted the potential of proteomic studies in biomedical research. However, the complexity of the proteomic expression introduces new statistical challenges in summarizing and analyzing the acquired data. Statistical methods for optimally processing proteomic data are currently a growing field of research. In this paper we present simple, yet appropriate methods to preprocess, summarize and analyze high-throughput MALDI-FTICR mass spectrometry data, collected in a case-control fashion, while dealing with the statistical challenges that accompany such data. The known statistical properties of the isotopic distribution of the peptide molecules are used to preprocess the spectra and translate the proteomic expression into a condensed data set. Information on either the intensity level or the shape of the identified isotopic clusters is used to derive summary measures on which diagnostic rules for disease status allocation will be based. Results indicate that both the shape of the identified isotopic clusters and the overall intensity level carry information on the class outcome and can be used to predict the presence or absence of the disease.

  16. Evidential evaluation of DNA profiles using a discrete statistical model implemented in the DNA LiRa software.

    PubMed

    Puch-Solis, Roberto; Clayton, Tim

    2014-07-01

    The high sensitivity of the technology for producing profiles means that it has become routine to produce profiles from relatively small quantities of DNA. The profiles obtained from low template DNA (LTDNA) are affected by several phenomena which must be taken into consideration when interpreting and evaluating this evidence. Furthermore, many of the same phenomena affect profiles from higher amounts of DNA (e.g. where complex mixtures have been revealed). In this article we present a statistical model, which forms the basis of the DNA LiRa software, that is able to calculate likelihood ratios where one to four donors are postulated, for any number of replicates. The model can take into account drop-in and allelic dropout for different contributors, template degradation and uncertain allele designations. In this statistical model, unknown parameters are treated following the empirical Bayesian paradigm. The performance of LiRa is tested using examples, and the outputs are compared with those generated using two other statistical software packages, likeLTD and LRmix. The concept of ban efficiency is introduced as a measure for assessing model sensitivity. Copyright © 2014. Published by Elsevier Ireland Ltd.

  17. The Manipulative Complexity of Lower Paleolithic Stone Toolmaking

    PubMed Central

    Faisal, Aldo; Stout, Dietrich; Apel, Jan; Bradley, Bruce

    2010-01-01

    Background Early stone tools provide direct evidence of human cognitive and behavioral evolution that is otherwise unavailable. Proper interpretation of these data requires a robust interpretive framework linking archaeological evidence to specific behavioral and cognitive actions. Methodology/Principal Findings Here we employ a data glove to record manual joint angles in a modern experimental toolmaker (the 4th author) replicating ancient tool forms in order to characterize and compare the manipulative complexity of two major Lower Paleolithic technologies (Oldowan and Acheulean). To this end we used a principled and general measure of behavioral complexity based on the statistics of joint movements. Conclusions/Significance This allowed us to confirm that previously observed differences in brain activation associated with Oldowan versus Acheulean technologies reflect higher-level behavior organization rather than lower-level differences in manipulative complexity. This conclusion is consistent with a scenario in which the earliest stages of human technological evolution depended on novel perceptual-motor capacities (such as the control of joint stiffness) whereas later developments increasingly relied on enhanced mechanisms for cognitive control. This further suggests possible links between toolmaking and language evolution. PMID:21072164

  18. Circularly-symmetric complex normal ratio distribution for scalar transmissibility functions. Part I: Fundamentals

    NASA Astrophysics Data System (ADS)

    Yan, Wang-Ji; Ren, Wei-Xin

    2016-12-01

    Recent advances in signal processing and structural dynamics have spurred the adoption of transmissibility functions in academia and industry alike. Due to the inherent randomness of measurement and variability of environmental conditions, uncertainty impacts its applications. This study is focused on statistical inference for raw scalar transmissibility functions modeled as complex ratio random variables. The goal is achieved through companion papers. This paper (Part I) is dedicated to dealing with a formal mathematical proof. New theorems on multivariate circularly-symmetric complex normal ratio distribution are proved on the basis of principle of probabilistic transformation of continuous random vectors. The closed-form distributional formulas for multivariate ratios of correlated circularly-symmetric complex normal random variables are analytically derived. Afterwards, several properties are deduced as corollaries and lemmas to the new theorems. Monte Carlo simulation (MCS) is utilized to verify the accuracy of some representative cases. This work lays the mathematical groundwork to find probabilistic models for raw scalar transmissibility functions, which are to be expounded in detail in Part II of this study.
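
    A small Monte Carlo sketch in the spirit of the verification mentioned above: draw correlated circularly-symmetric complex normal pairs and form their ratio, the kind of complex ratio random variable that models a raw scalar transmissibility. The covariance values are arbitrary, and the analytical distribution from the paper is not reproduced here.

```python
# Monte Carlo sketch: ratio of correlated circularly-symmetric complex normals.
# The covariance values below are arbitrary illustration values.
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

var_x, var_y, cov_xy = 1.0, 2.0, 0.6 + 0.3j
C = np.array([[var_x, cov_xy],
              [np.conj(cov_xy), var_y]])   # Hermitian positive-definite covariance

# Draw circularly-symmetric complex normal pairs via a Cholesky factor of C
L = np.linalg.cholesky(C)
w = (rng.normal(size=(2, n)) + 1j * rng.normal(size=(2, n))) / np.sqrt(2)
x, y = L @ w

ratio = x / y   # complex ratio random variable (raw scalar transmissibility analogue)

print("median Re(ratio), Im(ratio):",
      round(float(np.median(ratio.real)), 3), round(float(np.median(ratio.imag)), 3))
print("median |ratio|             :", round(float(np.median(np.abs(ratio))), 3))
print("P(|ratio| < 1)             :", round(float(np.mean(np.abs(ratio) < 1)), 3))
```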

  19. PREFACE: Counting Complexity: An international workshop on statistical mechanics and combinatorics

    NASA Astrophysics Data System (ADS)

    de Gier, Jan; Warnaar, Ole

    2006-07-01

    On 10-15 July 2005 the conference `Counting Complexity: An international workshop on statistical mechanics and combinatorics' was held on Dunk Island, Queensland, Australia in celebration of Tony Guttmann's 60th birthday. Dunk Island provided the perfect setting for engaging in almost all of Tony's life-long passions: swimming, running, food, wine and, of course, plenty of mathematics and physics. The conference was attended by many of Tony's close scientific friends from all over the world, and most talks were presented by his past and present collaborators. This volume contains the proceedings of the meeting and consists of 24 refereed research papers in the fields of statistical mechanics, condensed matter physics and combinatorics. These papers provide an excellent illustration of the breadth and scope of Tony's work. The very first contribution, written by Stu Whittington, contains an overview of the many scientific achievements of Tony over the past 40 years in mathematics and physics. The organizing committee, consisting of Richard Brak, Aleks Owczarek, Jan de Gier, Emma Lockwood, Andrew Rechnitzer and Ole Warnaar, gratefully acknowledges the Australian Mathematical Society (AustMS), the Australian Mathematical Sciences Institute (AMSI), the ARC Centre of Excellence for Mathematics and Statistics of Complex Systems (MASCOS), the ARC Complex Open Systems Research Network (COSNet), the Institute of Physics (IOP) and the Department of Mathematics and Statistics of The University of Melbourne for financial support in organizing the conference. Tony, we hope that your future years in mathematics will be numerous. Count yourself lucky! Tony Guttman

  20. [Evaluation of using statistical methods in selected national medical journals].

    PubMed

    Sych, Z

    1996-01-01

    The paper evaluates the frequency with which statistical methods were applied in papers published in six selected national medical journals in the years 1988-1992. The following journals were chosen for analysis: Klinika Oczna, Medycyna Pracy, Pediatria Polska, Polski Tygodnik Lekarski, Roczniki Państwowego Zakładu Higieny and Zdrowie Publiczne. A number of papers corresponding to the average in the remaining medical journals was randomly selected from the respective volumes of Pol. Tyg. Lek. The analysis did not include papers in which no statistical analysis was implemented, for both national and international publications; review papers, case reports, reviews of books, handbooks and monographs, reports from scientific congresses, and papers on historical topics were likewise excluded. The number of papers was determined in each volume. Next, the mode of selecting the study sample was established for each paper, distinguishing two categories: random and targeted selection. Attention was also paid to the presence of a control sample and to the completeness of the sample characteristics, using three categories: complete, partial and lacking. The presentation of results in tables and figures was also assessed (Tab. 1, 3). The rate of use of statistical methods was determined for the relevant volumes of the six selected national medical journals for the years 1988-1992, together with the number of papers in which no statistical methods were used, and the frequency with which individual statistical methods were applied. Prominence was given to fundamental methods of descriptive statistics (measures of position, measures of dispersion) as well as to the most important methods of mathematical statistics, such as parametric tests of significance, analysis of variance (in single and dual classifications), non-parametric tests of significance, and correlation and regression. Papers using multiple correlation, multiple regression or more complex methods of studying relationships between two or more variables were counted among those using correlation and regression, alongside other methods, e.g. statistical methods used in epidemiology (coefficients of incidence and morbidity, standardization of coefficients, survival tables), factor analysis by the Jacobi-Hotelling method, taxonomic methods and others. On the basis of the performed analysis it was established that the frequency of use of statistical methods in the six selected national medical journals in the years 1988-1992 was 61.1-66.0% of the analyzed papers (Tab. 3), generally similar to the frequencies reported for English-language medical journals. On the whole, no significant differences were found in the frequency of the applied statistical methods (Tab. 4) or in the frequency of random sampling (Tab. 3) across the respective years 1988-1992. The most frequently used statistical methods in the analyzed papers for 1988-1992 were measures of position (44.2-55.6%), measures of dispersion (32.5-38.5%) and parametric tests of significance (26.3-33.1% of the analyzed papers) (Tab. 4). To increase the frequency and reliability of the statistical methods used, the teaching of biostatistics should be expanded in medical studies and in postgraduate training for physicians and scientific-didactic staff.

  1. Transfer Entropy as a Log-Likelihood Ratio

    NASA Astrophysics Data System (ADS)

    Barnett, Lionel; Bossomaier, Terry

    2012-09-01

    Transfer entropy, an information-theoretic measure of time-directed information transfer between joint processes, has steadily gained popularity in the analysis of complex stochastic dynamics in diverse fields, including the neurosciences, ecology, climatology, and econometrics. We show that for a broad class of predictive models, the log-likelihood ratio test statistic for the null hypothesis of zero transfer entropy is a consistent estimator for the transfer entropy itself. For finite Markov chains, furthermore, no explicit model is required. In the general case, an asymptotic χ2 distribution is established for the transfer entropy estimator. The result generalizes the equivalence in the Gaussian case of transfer entropy and Granger causality, a statistical notion of causal influence based on prediction via vector autoregression, and establishes a fundamental connection between directed information transfer and causality in the Wiener-Granger sense.
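
    The equivalence stated above can be checked numerically for a simple case. For a pair of binary time series with one lag of history, the plug-in transfer entropy estimate (in nats) multiplied by 2N is exactly the log-likelihood-ratio (G) statistic comparing the models p(x_t | x_{t-1}) and p(x_t | x_{t-1}, y_{t-1}); the sketch below simulates such a pair and computes both.

```python
# Plug-in transfer entropy for binary series with one lag, and its equivalent
# log-likelihood-ratio (G) statistic: G = 2 * N * TE_hat (TE in nats).
import numpy as np
from collections import Counter

rng = np.random.default_rng(3)
T = 20_000

# Simulate: y is i.i.d.; x copies y's previous value with probability 0.7
y = rng.integers(0, 2, size=T)
x = np.empty(T, dtype=int)
x[0] = 0
for t in range(1, T):
    x[t] = y[t - 1] if rng.random() < 0.7 else rng.integers(0, 2)

# Joint counts over (x_t, x_{t-1}, y_{t-1})
triples = Counter(zip(x[1:], x[:-1], y[:-1]))
N = T - 1

def prob(counter, fixed):
    """Marginal probability with the listed (index, value) pairs fixed."""
    return sum(c for k, c in counter.items()
               if all(k[i] == v for i, v in fixed)) / N

te = 0.0
for (xt, xp, yp), c in triples.items():
    p_joint = c / N
    p_full = p_joint / prob(triples, [(1, xp), (2, yp)])                 # p(x_t | x_{t-1}, y_{t-1})
    p_restricted = prob(triples, [(0, xt), (1, xp)]) / prob(triples, [(1, xp)])  # p(x_t | x_{t-1})
    te += p_joint * np.log(p_full / p_restricted)

G = 2 * N * te   # log-likelihood ratio statistic; ~ chi^2 with 2 dof under H0: TE = 0
print(f"plug-in transfer entropy (nats): {te:.4f}")
print(f"G = 2*N*TE                     : {G:.1f}  (chi^2, 2 dof, 5% critical value: 5.99)")
```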

  2. Transfer entropy as a log-likelihood ratio.

    PubMed

    Barnett, Lionel; Bossomaier, Terry

    2012-09-28

    Transfer entropy, an information-theoretic measure of time-directed information transfer between joint processes, has steadily gained popularity in the analysis of complex stochastic dynamics in diverse fields, including the neurosciences, ecology, climatology, and econometrics. We show that for a broad class of predictive models, the log-likelihood ratio test statistic for the null hypothesis of zero transfer entropy is a consistent estimator for the transfer entropy itself. For finite Markov chains, furthermore, no explicit model is required. In the general case, an asymptotic χ2 distribution is established for the transfer entropy estimator. The result generalizes the equivalence in the Gaussian case of transfer entropy and Granger causality, a statistical notion of causal influence based on prediction via vector autoregression, and establishes a fundamental connection between directed information transfer and causality in the Wiener-Granger sense.

  3. Characterizing chaotic melodies in automatic music composition

    NASA Astrophysics Data System (ADS)

    Coca, Andrés E.; Tost, Gerard O.; Zhao, Liang

    2010-09-01

    In this paper, we initially present an algorithm for automatic composition of melodies using chaotic dynamical systems. Afterward, we characterize chaotic music in a comprehensive way as comprising three perspectives: musical discrimination, dynamical influence on musical features, and musical perception. With respect to the first perspective, the coherence between generated chaotic melodies (continuous as well as discrete chaotic melodies) and a set of classical reference melodies is characterized by statistical descriptors and melodic measures. The significant differences among the three types of melodies are determined by discriminant analysis. Regarding the second perspective, the influence of dynamical features of chaotic attractors, e.g., Lyapunov exponent, Hurst coefficient, and correlation dimension, on melodic features is determined by canonical correlation analysis. The last perspective is related to perception of originality, complexity, and degree of melodiousness (Euler's gradus suavitatis) of chaotic and classical melodies by nonparametric statistical tests.

  4. Can't Count or Won't Count? Embedding Quantitative Methods in Substantive Sociology Curricula: A Quasi-Experiment.

    PubMed

    Williams, Malcolm; Sloan, Luke; Cheung, Sin Yi; Sutton, Carole; Stevens, Sebastian; Runham, Libby

    2016-06-01

    This paper reports on a quasi-experiment in which quantitative methods (QM) are embedded within a substantive sociology module. Through measuring student attitudes before and after the intervention alongside control group comparisons, we illustrate the impact that embedding has on the student experience. Our findings are complex and even contradictory. Whilst the experimental group were less likely to be distrustful of statistics and appreciate how QM inform social research, they were also less confident about their statistical abilities, suggesting that through 'doing' quantitative sociology the experimental group are exposed to the intricacies of method and their optimism about their own abilities is challenged. We conclude that embedding QM in a single substantive module is not a 'magic bullet' and that a wider programme of content and assessment diversification across the curriculum is preferential.

  5. Dendrimer-paclitaxel complexes for efficient treatment in ovarian cancer: study on OVCAR-3 and HEK293T cells.

    PubMed

    Yao, Hua; Ma, Jinqi

    2018-01-01

    The present paper investigates the enhancement of the therapeutic effect of Paclitaxel (a potent anticancer drug) by increasing its cellular uptake in cancerous cells, with a subsequent reduction in its cytotoxic effects. To fulfill these goals, Paclitaxel (PTX)-biotinylated PAMAM dendrimer complexes were prepared using a biotinylation method. The primary parameter of the biotinylated PAMAM dendrimer with a terminal NH2 group, the degree of biotinylation, was evaluated using the HABA assay. The basic integrity of the complex was studied using DSC. The drug loading (DL) and drug release (DR) parameters of the biotinylated PAMAM dendrimer-PTX complexes were also examined. A cellular uptake study was performed in OVCAR-3 and HEK293T cells using a fluorescence technique. Statistical analysis was also performed to support the experimental data. The results obtained from the HABA assay showed complete biotinylation of the PAMAM dendrimer. The DSC study confirmed the integrity of the complex as compared with the pure drug, the biotinylated complex and their physical mixture. Batch 9 showed the highest DL (12.09%) and DR (70%) over 72 h as compared to different concentrations of drug and biotinylated complex. The OVCAR-3 (cancerous) cells showed more intensive cellular uptake of the complexes than the HEK293T (normal) cells. The experimental results were supported by the statistical data. The results obtained from both experimental and statistical evaluation confirmed that the biotinylated PAMAM-NH2 dendrimer-PTX complex not only displays increased cellular uptake but also enhanced release for up to 72 h, with a reduction in cytotoxicity.

  6. A range of complex probabilistic models for RNA secondary structure prediction that includes the nearest-neighbor model and more.

    PubMed

    Rivas, Elena; Lang, Raymond; Eddy, Sean R

    2012-02-01

    The standard approach for single-sequence RNA secondary structure prediction uses a nearest-neighbor thermodynamic model with several thousand experimentally determined energy parameters. An attractive alternative is to use statistical approaches with parameters estimated from growing databases of structural RNAs. Good results have been reported for discriminative statistical methods using complex nearest-neighbor models, including CONTRAfold, Simfold, and ContextFold. Little work has been reported on generative probabilistic models (stochastic context-free grammars [SCFGs]) of comparable complexity, although probabilistic models are generally easier to train and to use. To explore a range of probabilistic models of increasing complexity, and to directly compare probabilistic, thermodynamic, and discriminative approaches, we created TORNADO, a computational tool that can parse a wide spectrum of RNA grammar architectures (including the standard nearest-neighbor model and more) using a generalized super-grammar that can be parameterized with probabilities, energies, or arbitrary scores. Using TORNADO, we find that probabilistic nearest-neighbor models perform comparably to (but not significantly better than) discriminative methods. We find that complex statistical models are prone to overfitting RNA structure and that evaluations should use structurally nonhomologous training and test data sets. Overfitting has affected at least one published method (ContextFold). The most important barrier to improving statistical approaches for RNA secondary structure prediction is the lack of diversity of well-curated single-sequence RNA secondary structures in current RNA databases.
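
    As a deliberately minimal contrast to the nearest-neighbor and SCFG models discussed above, the sketch below implements the classic Nussinov base-pair maximization dynamic program for single-sequence folding; it has no stacking energies and no probabilistic grammar, but it shows the kind of recursion that this family of methods elaborates.

```python
# Classic Nussinov base-pair maximization for a single RNA sequence.
# Far simpler than nearest-neighbor or SCFG models; illustration only.

PAIRS = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"), ("G", "U"), ("U", "G")}

def nussinov(seq, min_loop=3):
    """Return the maximum number of nested base pairs, with a minimum hairpin
    loop of min_loop unpaired bases."""
    n = len(seq)
    dp = [[0] * n for _ in range(n)]
    for span in range(min_loop + 1, n):
        for i in range(0, n - span):
            j = i + span
            best = dp[i][j - 1]                                  # j left unpaired
            for k in range(i, j - min_loop):
                if (seq[k], seq[j]) in PAIRS:                    # k pairs with j
                    left = dp[i][k - 1] if k > i else 0
                    best = max(best, left + dp[k + 1][j - 1] + 1)
            dp[i][j] = best
    return dp[0][n - 1]

seq = "GGGAAAUCCC"
print(f"{seq}: maximum number of base pairs = {nussinov(seq)}")
```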

  7. A range of complex probabilistic models for RNA secondary structure prediction that includes the nearest-neighbor model and more

    PubMed Central

    Rivas, Elena; Lang, Raymond; Eddy, Sean R.

    2012-01-01

    The standard approach for single-sequence RNA secondary structure prediction uses a nearest-neighbor thermodynamic model with several thousand experimentally determined energy parameters. An attractive alternative is to use statistical approaches with parameters estimated from growing databases of structural RNAs. Good results have been reported for discriminative statistical methods using complex nearest-neighbor models, including CONTRAfold, Simfold, and ContextFold. Little work has been reported on generative probabilistic models (stochastic context-free grammars [SCFGs]) of comparable complexity, although probabilistic models are generally easier to train and to use. To explore a range of probabilistic models of increasing complexity, and to directly compare probabilistic, thermodynamic, and discriminative approaches, we created TORNADO, a computational tool that can parse a wide spectrum of RNA grammar architectures (including the standard nearest-neighbor model and more) using a generalized super-grammar that can be parameterized with probabilities, energies, or arbitrary scores. By using TORNADO, we find that probabilistic nearest-neighbor models perform comparably to (but not significantly better than) discriminative methods. We find that complex statistical models are prone to overfitting RNA structure and that evaluations should use structurally nonhomologous training and test data sets. Overfitting has affected at least one published method (ContextFold). The most important barrier to improving statistical approaches for RNA secondary structure prediction is the lack of diversity of well-curated single-sequence RNA secondary structures in current RNA databases. PMID:22194308

  8. The best motivator priorities parents choose via analytical hierarchy process

    NASA Astrophysics Data System (ADS)

    Farah, R. N.; Latha, P.

    2015-05-01

    Motivation is probably the most important factor that educators can target in order to improve learning. Numerous cross-disciplinary theories have been postulated to explain motivation. While each of these theories has some truth, no single theory seems to adequately explain all human motivation. The fact is that human beings in general, and pupils in particular, are complex creatures with complex needs and desires. In this paper, the Analytic Hierarchy Process (AHP) is proposed as a solution to the large, dynamic and complex real-world multi-criteria decision-making problem parents face in selecting the most suitable school for their children. Data were analyzed using SPSS 17.0 ("Statistical Package for the Social Sciences"). Both descriptive and inferential statistics were used: descriptive statistics to characterize the demographic factors of the respondent pupils and parents, and inferential statistics to determine the pupils' and parents' highest motivator priorities, with AHP used to weigh the criteria considered by parents, namely school principals, teachers, pupils and parents. The moderating factor was the set of schools in Ampang selected on the basis of "Standard Kualiti Pendidikan Malaysia" (SKPM). Inferential statistics such as one-way ANOVA were used to test significance, and the data were used to calculate the AHP weights. School principals were found to be the strongest motivator for parents when choosing a school for their children, followed by teachers, parents and pupils.
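
    A small sketch of the AHP computation described above: priority weights are obtained from a pairwise comparison matrix via its principal eigenvector, and the consistency ratio checks whether the judgments are acceptably consistent. The comparison values below are invented, not the study's data.

```python
# AHP priority weights from a pairwise comparison matrix (Saaty's method).
# The comparison values are invented for illustration.
import numpy as np

# Pairwise comparisons among four criteria:
# school principals, teachers, parents, pupils (Saaty's 1-9 scale)
A = np.array([
    [1.0, 3.0, 5.0, 7.0],
    [1/3, 1.0, 3.0, 5.0],
    [1/5, 1/3, 1.0, 3.0],
    [1/7, 1/5, 1/3, 1.0],
])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)                      # principal eigenvalue index
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()                         # normalized priority vector

n = A.shape[0]
lambda_max = eigvals.real[k]
CI = (lambda_max - n) / (n - 1)                  # consistency index
RI = 0.90                                        # Saaty's random index for n = 4
CR = CI / RI                                     # consistency ratio (< 0.10 acceptable)

for name, w in zip(["principals", "teachers", "parents", "pupils"], weights):
    print(f"{name:12s} {w:.3f}")
print(f"lambda_max = {lambda_max:.3f}, CR = {CR:.3f}")
```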

  9. Statistical tools for analysis and modeling of cosmic populations and astronomical time series: CUDAHM and TSE

    NASA Astrophysics Data System (ADS)

    Loredo, Thomas; Budavari, Tamas; Scargle, Jeffrey D.

    2018-01-01

    This presentation provides an overview of open-source software packages addressing two challenging classes of astrostatistics problems. (1) CUDAHM is a C++ framework for hierarchical Bayesian modeling of cosmic populations, leveraging graphics processing units (GPUs) to enable applying this computationally challenging paradigm to large datasets. CUDAHM is motivated by measurement error problems in astronomy, where density estimation and linear and nonlinear regression must be addressed for populations of thousands to millions of objects whose features are measured with possibly complex uncertainties, potentially including selection effects. An example calculation demonstrates accurate GPU-accelerated luminosity function estimation for simulated populations of $10^6$ objects in about two hours using a single NVIDIA Tesla K40c GPU. (2) Time Series Explorer (TSE) is a collection of software in Python and MATLAB for exploratory analysis and statistical modeling of astronomical time series. It comprises a library of stand-alone functions and classes, as well as an application environment for interactive exploration of times series data. The presentation will summarize key capabilities of this emerging project, including new algorithms for analysis of irregularly-sampled time series.

  10. Comparison of RF spectrum prediction methods for dynamic spectrum access

    NASA Astrophysics Data System (ADS)

    Kovarskiy, Jacob A.; Martone, Anthony F.; Gallagher, Kyle A.; Sherbondy, Kelly D.; Narayanan, Ram M.

    2017-05-01

    Dynamic spectrum access (DSA) refers to the adaptive utilization of today's busy electromagnetic spectrum. Cognitive radio/radar technologies require DSA to intelligently transmit and receive information in changing environments. Predicting radio frequency (RF) activity reduces sensing time and energy consumption for identifying usable spectrum. Typical spectrum prediction methods involve modeling spectral statistics with Hidden Markov Models (HMM) or various neural network structures. HMMs describe the time-varying state probabilities of Markov processes as a dynamic Bayesian network. Neural Networks model biological brain neuron connections to perform a wide range of complex and often non-linear computations. This work compares HMM, Multilayer Perceptron (MLP), and Recurrent Neural Network (RNN) algorithms and their ability to perform RF channel state prediction. Monte Carlo simulations on both measured and simulated spectrum data evaluate the performance of these algorithms. Generalizing spectrum occupancy as an alternating renewal process allows Poisson random variables to generate simulated data while energy detection determines the occupancy state of measured RF spectrum data for testing. The results suggest that neural networks achieve better prediction accuracy and prove more adaptable to changing spectral statistics than HMMs given sufficient training data.
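
    A minimal sketch of Markov-style channel-state prediction of the kind compared above: estimate a first-order transition matrix from a binary occupancy history (0 = idle, 1 = busy) and predict the next state. The occupancy trace is simulated from a simple two-state renewal-like process; the paper's HMM, MLP and RNN predictors are not reproduced here.

```python
# First-order Markov prediction of binary channel occupancy (0 = idle, 1 = busy).
# Simulated trace; much simpler than the HMM/MLP/RNN predictors compared above.
import numpy as np

rng = np.random.default_rng(4)

# Simulate an occupancy trace with persistent busy/idle periods
p_stay = {0: 0.9, 1: 0.8}          # probability of remaining in the current state
states = [0]
for _ in range(5000):
    s = states[-1]
    states.append(s if rng.random() < p_stay[s] else 1 - s)
states = np.array(states)

# Maximum-likelihood estimate of the 2x2 transition matrix
counts = np.zeros((2, 2))
for a, b in zip(states[:-1], states[1:]):
    counts[a, b] += 1
P = counts / counts.sum(axis=1, keepdims=True)

def predict_next(current_state):
    """Predict the more probable next channel state given the current one."""
    return int(np.argmax(P[current_state]))

print("estimated transition matrix:\n", P.round(3))
print("predicted next state after idle(0):", predict_next(0))
print("predicted next state after busy(1):", predict_next(1))

# One-step-ahead prediction accuracy on the same trace (optimistic, in-sample)
preds = np.array([predict_next(s) for s in states[:-1]])
print("in-sample prediction accuracy:", round(float((preds == states[1:]).mean()), 3))
```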

  11. Nonparametric method for failures diagnosis in the actuating subsystem of aircraft control system

    NASA Astrophysics Data System (ADS)

    Terentev, M. N.; Karpenko, S. S.; Zybin, E. Yu; Kosyanchuk, V. V.

    2018-02-01

    In this paper we design a nonparametric method for failure diagnosis in the aircraft control system that uses only measurements of the control signals and the aircraft states. It does not require a priori information about the aircraft model parameters, training or statistical calculations, and is based on an analytical nonparametric one-step-ahead state prediction approach. This makes it possible to predict the behavior of unidentified and faulty dynamic systems, to weaken the requirements on the control signals, and to reduce the diagnostic time and problem complexity.

  12. ACIRF User’s Guide for the General Model (Version 3.5)

    DTIC Science & Technology

    1992-06-01

    Example ACIRF formatted output is given for the frozen-in model, with a summary of measured realization statistics. The quantity $z$ must be delta correlated in angle, delay, and Doppler frequency, $\langle z(\mathbf{K}_\perp,\tau,\omega_D)\, z^*(\mathbf{K}'_\perp,\tau',\omega'_D)\rangle = S(\mathbf{K}_\perp,\tau,\omega_D)\,\delta(\mathbf{K}_\perp-\mathbf{K}'_\perp)\,\delta(\tau-\tau')\,\delta(\omega_D-\omega'_D)$, and the central limit theorem can then be invoked to argue that $h(\rho,\tau,t)$ and $h_A(\rho,\tau,t)$ are zero-mean, normally-distributed complex quantities.

  13. Experimental Design and Power Calculation for RNA-seq Experiments.

    PubMed

    Wu, Zhijin; Wu, Hao

    2016-01-01

    Power calculation is a critical component of RNA-seq experimental design. The flexibility of RNA-seq experiment and the wide dynamic range of transcription it measures make it an attractive technology for whole transcriptome analysis. These features, in addition to the high dimensionality of RNA-seq data, bring complexity in experimental design, making an analytical power calculation no longer realistic. In this chapter we review the major factors that influence the statistical power of detecting differential expression, and give examples of power assessment using the R package PROPER.

  14. Water-refined solution structure of the human Grb7-SH2 domain in complex with the erbB2 receptor peptide pY1139.

    PubMed

    Pias, Sally C; Johnson, Dennis L; Smith, David E; Lyons, Barbara A

    2012-08-01

    We report a refinement in implicit water of the previously published solution structure of the Grb7-SH2 domain bound to the erbB2 receptor peptide pY1139. Structure quality measures indicate substantial improvement, with residues in the most favored regions of the Ramachandran plot increasing by 14 % and with WHAT IF statistics (Vriend, G. J. Mol. Graph., 1990, 8(1), 52-56) falling closer to expected values for well-refined structures.

  15. PDB-wide collection of binding data: current status of the PDBbind database.

    PubMed

    Liu, Zhihai; Li, Yan; Han, Li; Li, Jie; Liu, Jie; Zhao, Zhixiong; Nie, Wei; Liu, Yuchen; Wang, Renxiao

    2015-02-01

    Molecular recognition between biological macromolecules and organic small molecules plays an important role in various life processes. Both structural information and binding data of biomolecular complexes are indispensable for depicting the underlying mechanism in such an event. The PDBbind database was created to collect experimentally measured binding data for the biomolecular complexes throughout the Protein Data Bank (PDB). It thus provides the linkage between structural information and energetic properties of biomolecular complexes, which is especially desirable for computational studies or statistical analyses. Since its first public release in 2004, the PDBbind database has been updated on an annual basis. The latest release (version 2013) provides experimental binding affinity data for 10,776 biomolecular complexes in PDB, including 8302 protein-ligand complexes and 2474 other types of complexes. In this article, we will describe the current methods used for compiling PDBbind and the updated status of this database. We will also review some typical applications of PDBbind published in the scientific literature. All contents of this database are freely accessible at the PDBbind-CN Web server at http://www.pdbbind-cn.org/. wangrx@mail.sioc.ac.cn. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  16. Comparison of Measured and WRF-LES Turbulence Statistics in a Real Convective Boundary Layer over Complex Terrain

    NASA Astrophysics Data System (ADS)

    Rai, R. K.; Berg, L. K.; Kosovic, B.; Mirocha, J. D.; Pekour, M. S.; Shaw, W. J.

    2015-12-01

    Resolving the finest turbulent scales present in the lower atmosphere using numerical simulations helps to study the processes that occur in the atmospheric boundary layer, such as the turbulent inflow condition to a wind plant and the generation of the wake behind wind turbines. This work employs several nested domains in the WRF-LES framework to simulate conditions in a convectively driven, cloud-free boundary layer at an instrumented field site in complex terrain. The innermost LES domain (30 m spatial resolution) receives the boundary forcing from two coarser-resolution LES outer domains, which in turn receive boundary conditions from two WRF mesoscale domains. Wind and temperature records from sonic anemometers mounted at two vertical levels (30 m and 60 m) are compared with the LES results in terms of first and second statistical moments as well as power spectra and distributions of wind velocity. For the two most widely used boundary layer parameterizations (MYNN and YSU) tested in the WRF mesoscale domains, the MYNN scheme shows slightly better agreement with the observations for some quantities, such as time-averaged velocity and Turbulent Kinetic Energy (TKE). However, LES runs driven by WRF mesoscale simulations using either parameterization have similar velocity spectra and distributions of velocity. For each component of the wind velocity, the WRF-LES power spectra are found to be comparable to the spectra derived from the measured data (for the frequencies that are accurately represented by WRF-LES). Furthermore, the analysis of the LES results shows a noticeable variability of the mean and variance even over small horizontal distances that would be considered sub-grid scale in mesoscale simulations. This observed statistical variability in space and time can be utilized to further analyze turbulence quantities over a heterogeneous surface and to improve the turbulence parameterization in the mesoscale model.

  17. Statistical Modelling and Characterization of Experimental mm-Wave Indoor Channels for Future 5G Wireless Communication Networks

    PubMed Central

    Al-Samman, A. M.; Rahman, T. A.; Azmi, M. H.; Hindia, M. N.; Khan, I.; Hanafi, E.

    2016-01-01

    This paper presents an experimental characterization of millimeter-wave (mm-wave) channels in the 6.5 GHz, 10.5 GHz, 15 GHz, 19 GHz, 28 GHz and 38 GHz frequency bands in an indoor corridor environment. More than 4,000 power delay profiles were measured across the bands using an omnidirectional transmitter antenna and a highly directional horn receiver antenna for both co- and cross-polarized antenna configurations. This paper develops a new path-loss model to account for the frequency attenuation with distance, which we term the frequency attenuation (FA) path-loss model and introduce a frequency-dependent attenuation factor. The large-scale path loss was characterized based on both new and well-known path-loss models. A general and less complex method is also proposed to estimate the cross-polarization discrimination (XPD) factor of close-in reference distance with the XPD (CIX) and ABG with the XPD (ABGX) path-loss models to avoid the computational complexity of minimum mean square error (MMSE) approach. Moreover, small-scale parameters such as root mean square (RMS) delay spread, mean excess (MN-EX) delay, dispersion factors and maximum excess (MAX-EX) delay parameters were used to characterize the multipath channel dispersion. Multiple statistical distributions for RMS delay spread were also investigated. The results show that our proposed models are simpler and more physically-based than other well-known models. The path-loss exponents for all studied models are smaller than that of the free-space model by values in the range of 0.1 to 1.4 for all measured frequencies. The RMS delay spread values varied between 0.2 ns and 13.8 ns, and the dispersion factor values were less than 1 for all measured frequencies. The exponential and Weibull probability distribution models best fit the RMS delay spread empirical distribution for all of the measured frequencies in all scenarios. PMID:27654703
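
    Two of the standard quantities used in characterizations like the one above can be computed with a few lines: RMS delay spread and mean excess delay from a power delay profile, and a close-in (CI) reference-distance path-loss exponent fit. The numbers below are invented, and the paper's FA, CIX and ABGX models are not reproduced here.

```python
# RMS delay spread from a power delay profile, and a close-in (CI) path-loss fit.
# All numeric values are invented for illustration.
import numpy as np

# --- (1) Delay dispersion from a power delay profile (linear power units) ---
delays_ns = np.array([0.0, 5.0, 12.0, 20.0, 35.0])
powers = np.array([1.00, 0.40, 0.15, 0.05, 0.01])

mean_excess = (powers * delays_ns).sum() / powers.sum()
second_moment = (powers * delays_ns**2).sum() / powers.sum()
rms_delay_spread = np.sqrt(second_moment - mean_excess**2)
print(f"mean excess delay : {mean_excess:.2f} ns")
print(f"RMS delay spread  : {rms_delay_spread:.2f} ns")

# --- (2) Close-in (CI) path-loss model: PL(d) = FSPL(d0) + 10 n log10(d/d0) ---
c = 3e8
f = 28e9                      # 28 GHz band
d0 = 1.0                      # 1 m reference distance
fspl_d0 = 20 * np.log10(4 * np.pi * d0 * f / c)

d = np.array([2.0, 5.0, 10.0, 20.0, 40.0])                  # Tx-Rx separations (m)
pl_measured = np.array([66.5, 73.5, 78.5, 83.5, 88.5])      # measured path loss (dB)

# Least-squares estimate of the path-loss exponent n
x = 10 * np.log10(d / d0)
n_hat = np.sum(x * (pl_measured - fspl_d0)) / np.sum(x**2)
print(f"FSPL at d0 = {fspl_d0:.1f} dB, fitted path-loss exponent n = {n_hat:.2f}")
```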

  18. Statistical Modelling and Characterization of Experimental mm-Wave Indoor Channels for Future 5G Wireless Communication Networks.

    PubMed

    Al-Samman, A M; Rahman, T A; Azmi, M H; Hindia, M N; Khan, I; Hanafi, E

    This paper presents an experimental characterization of millimeter-wave (mm-wave) channels in the 6.5 GHz, 10.5 GHz, 15 GHz, 19 GHz, 28 GHz and 38 GHz frequency bands in an indoor corridor environment. More than 4,000 power delay profiles were measured across the bands using an omnidirectional transmitter antenna and a highly directional horn receiver antenna for both co- and cross-polarized antenna configurations. This paper develops a new path-loss model to account for the frequency attenuation with distance, which we term the frequency attenuation (FA) path-loss model and which introduces a frequency-dependent attenuation factor. The large-scale path loss was characterized based on both new and well-known path-loss models. A general and less complex method is also proposed to estimate the cross-polarization discrimination (XPD) factor of the close-in reference distance with XPD (CIX) and ABG with XPD (ABGX) path-loss models, avoiding the computational complexity of the minimum mean square error (MMSE) approach. Moreover, small-scale parameters such as root mean square (RMS) delay spread, mean excess (MN-EX) delay, dispersion factors and maximum excess (MAX-EX) delay parameters were used to characterize the multipath channel dispersion. Multiple statistical distributions for RMS delay spread were also investigated. The results show that our proposed models are simpler and more physically based than other well-known models. The path-loss exponents for all studied models are smaller than that of the free-space model by values in the range of 0.1 to 1.4 for all measured frequencies. The RMS delay spread values varied between 0.2 ns and 13.8 ns, and the dispersion factor values were less than 1 for all measured frequencies. The exponential and Weibull probability distribution models best fit the RMS delay spread empirical distribution for all of the measured frequencies in all scenarios.

  19. Comparison of three sampling and analytical methods for the determination of airborne hexavalent chromium.

    PubMed

    Boiano, J M; Wallace, M E; Sieber, W K; Groff, J H; Wang, J; Ashley, K

    2000-08-01

    A field study was conducted with the goal of comparing the performance of three recently developed or modified sampling and analytical methods for the determination of airborne hexavalent chromium (Cr(VI)). The study was carried out in a hard chrome electroplating facility and in a jet engine manufacturing facility where airborne Cr(VI) was expected to be present. The analytical methods evaluated included two laboratory-based procedures (OSHA Method ID-215 and NIOSH Method 7605) and a field-portable method (NIOSH Method 7703). These three methods employ an identical sampling methodology: collection of Cr(VI)-containing aerosol on a polyvinyl chloride (PVC) filter housed in a sampling cassette, which is connected to a personal sampling pump calibrated at an appropriate flow rate. The analytical basis of all three methods involves extraction of the PVC filter in alkaline buffer solution, chemical isolation of the Cr(VI) ion, complexation of the Cr(VI) ion with 1,5-diphenylcarbazide, and spectrometric measurement of the violet chromium diphenylcarbazone complex at 540 nm. However, there are notable differences in the sample preparation procedures used in the three methods. To assess the comparability of the three measurement protocols, a total of 20 side-by-side air samples were collected, equally divided between a chromic acid electroplating operation and a spray paint operation where water-soluble forms of Cr(VI) were used. A range of Cr(VI) concentrations from 0.6 to 960 microg m(-3), with Cr(VI) mass loadings ranging from 0.4 to 32 microg, was measured at the two operations. The equivalence of the means of the log-transformed Cr(VI) concentrations obtained from the different analytical methods was assessed. Based on analysis of variance (ANOVA) results, no statistically significant differences were observed between mean values measured using each of the three methods. Small but statistically significant differences were observed between results obtained from performance evaluation samples for the NIOSH field method and the OSHA laboratory method.
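
    A hedged sketch of the statistical comparison described above, using a one-way ANOVA on log-transformed concentrations; the values below are illustrative placeholders, not data from the study, and the paired (side-by-side) structure of the real design is ignored for simplicity.

      import numpy as np
      from scipy.stats import f_oneway

      # Placeholder side-by-side Cr(VI) concentrations (microgram per cubic metre)
      osha_id215 = np.array([0.8, 1.5, 3.2, 12.0, 45.0, 120.0])
      niosh_7605 = np.array([0.7, 1.6, 3.0, 11.5, 47.0, 118.0])
      niosh_7703 = np.array([0.9, 1.4, 3.5, 12.5, 43.0, 125.0])

      # One-way ANOVA on the log-transformed concentrations
      f_stat, p_value = f_oneway(np.log(osha_id215), np.log(niosh_7605), np.log(niosh_7703))
      print(f"F = {f_stat:.3f}, p = {p_value:.3f}")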

  20. The statistical power to detect cross-scale interactions at macroscales

    USGS Publications Warehouse

    Wagner, Tyler; Fergus, C. Emi; Stow, Craig A.; Cheruvelil, Kendra S.; Soranno, Patricia A.

    2016-01-01

    Macroscale studies of ecological phenomena are increasingly common because stressors such as climate and land-use change operate at large spatial and temporal scales. Cross-scale interactions (CSIs), where ecological processes operating at one spatial or temporal scale interact with processes operating at another scale, have been documented in a variety of ecosystems and contribute to complex system dynamics. However, studies investigating CSIs are often dependent on compiling multiple data sets from different sources to create multithematic, multiscaled data sets, which results in structurally complex, and sometimes incomplete, data sets. The statistical power to detect CSIs needs to be evaluated because of their importance and the challenge of quantifying CSIs using data sets with complex structures and missing observations. We studied this problem using a spatially hierarchical model that measures CSIs between regional agriculture and its effects on the relationship between lake nutrients and lake productivity. We used an existing large multithematic, multiscaled database, the LAke multi-scaled GeOSpatial and temporal database (LAGOS), to parameterize the power analysis simulations. We found that the power to detect CSIs was more strongly related to the number of regions in the study than to the number of lakes nested within each region. CSI power analyses will not only help ecologists design large-scale studies aimed at detecting CSIs, but will also focus attention on CSI effect sizes and the degree to which they are ecologically relevant and detectable with large data sets.
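
    The power-analysis idea can be sketched with a simulation such as the one below: data with a known cross-scale interaction are generated repeatedly, and the fraction of replicates in which the interaction term is detected is recorded. Ordinary least squares replaces the spatially hierarchical model of the study, and all effect sizes, sample sizes and variable names are invented for illustration.

      import numpy as np
      from scipy import stats

      def csi_power(n_regions, lakes_per_region, beta_int=0.2, n_sim=500, alpha=0.05, seed=1):
          """Fraction of simulated data sets in which the region-by-lake
          (cross-scale) interaction coefficient is statistically significant."""
          rng = np.random.default_rng(seed)
          hits = 0
          for _ in range(n_sim):
              region_ag = rng.normal(size=n_regions)                  # region-level predictor
              ag = np.repeat(region_ag, lakes_per_region)
              tp = rng.normal(size=n_regions * lakes_per_region)      # lake-level nutrient
              y = 0.5 * tp + 0.1 * ag + beta_int * tp * ag + rng.normal(size=tp.size)
              X = np.column_stack([np.ones_like(tp), tp, ag, tp * ag])
              beta = np.linalg.lstsq(X, y, rcond=None)[0]
              resid = y - X @ beta
              dof = len(y) - X.shape[1]
              cov = (resid @ resid / dof) * np.linalg.inv(X.T @ X)
              t_int = beta[3] / np.sqrt(cov[3, 3])
              hits += int(2 * stats.t.sf(abs(t_int), dof) < alpha)
          return hits / n_sim

      # Vary n_regions and lakes_per_region to explore the design trade-off
      print(csi_power(n_regions=20, lakes_per_region=30))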

  1. Morphometry of white muscle fibers and performance of Nile tilapia (Oreochromis niloticus) fingerlings treated with methyltestosterone or a homeopathic complex.

    PubMed

    Júnior, R P; Vargas, L; Valentim-Zabott, M; Ribeiro, R P; da Silva, A V; Otutumi, L K

    2012-07-01

    Nile tilapia (Oreochromis niloticus) are widely used in fish farming, where hormonal treatments are applied to increase productivity. Studies of the characteristics of the fiber types are important in species that have well-developed muscle mass, such as Nile tilapia. A total of 4800 post-larval fish were randomly assigned by tank to receive one of three treatments: Control (30°GL alcohol), Homeopathic complex (Homeopatila RS) or Hormone (17-α-methyltestosterone), supplemented in the feed for 28 days. Survival and morphological parameters were measured at day 45, when the survival rates were 54.1% (Control), 87.8% (Homeopathy) and 50.3% (Hormone). The mean final weight for Homeopathy was statistically significantly lower (1.07 g) than in the other two groups: Control (1.81 g) and Hormone (2.04 g). Mean total lengths for Control (4.75 cm) and Hormone (4.49 cm) were statistically significantly different from Homeopathy (3.83 cm). Average partial length, trunk length, height and body width were significantly lower for Homeopathy than for Control or Hormone (p<0.05). Homeopathy-treated fish had significantly greater muscle fiber diameter than the other two groups. Fish treated with the homeopathic complex had improved survival and muscle fiber hypertrophy, but were smaller (probably related to increased survival and overcrowding) compared to fingerlings treated with the synthetic hormone or control. Copyright © 2012 The Faculty of Homeopathy. Published by Elsevier Ltd. All rights reserved.

  2. Aesthetic Responses to Exact Fractals Driven by Physical Complexity

    PubMed Central

    Bies, Alexander J.; Blanc-Goldhammer, Daryn R.; Boydston, Cooper R.; Taylor, Richard P.; Sereno, Margaret E.

    2016-01-01

    Fractals are physically complex due to their repetition of patterns at multiple size scales. Whereas the statistical characteristics of the patterns repeat for fractals found in natural objects, computers can generate patterns that repeat exactly. Are these exact fractals processed differently, visually and aesthetically, than their statistical counterparts? We investigated the human aesthetic response to the complexity of exact fractals by manipulating fractal dimensionality, symmetry, recursion, and the number of segments in the generator. Across two studies, a variety of fractal patterns were visually presented to human participants to determine the typical response to exact fractals. In the first study, we found that preference ratings for exact midpoint displacement fractals can be described by a linear trend with preference increasing as fractal dimension increases. For the majority of individuals, preference increased with dimension. We replicated these results for other exact fractal patterns in a second study. In the second study, we also tested the effects of symmetry and recursion by presenting asymmetric dragon fractals, symmetric dragon fractals, and Sierpinski carpets and Koch snowflakes, which have radial and mirror symmetry. We found a strong interaction among recursion, symmetry and fractal dimension. Specifically, at low levels of recursion, the presence of symmetry was enough to drive high preference ratings for patterns with moderate to high levels of fractal dimension. Most individuals required a much higher level of recursion to recover this level of preference in a pattern that lacked mirror or radial symmetry, while others were less discriminating. This suggests that exact fractals are processed differently than their statistical counterparts. We propose a set of four factors that influence complexity and preference judgments in fractals that may extend to other patterns: fractal dimension, recursion, symmetry and the number of segments in a pattern. Conceptualizations such as Berlyne’s and Redies’ theories of aesthetics also provide a suitable framework for interpretation of our data with respect to the individual differences that we detect. Future studies that incorporate physiological methods to measure the human aesthetic response to exact fractal patterns would further elucidate our responses to such timeless patterns. PMID:27242475
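
    The construction of exact midpoint-displacement fractals with a chosen fractal dimension can be sketched as below; this is a generic 1-D construction with invented parameters, not the stimulus-generation code used in the study (setting exact=False gives the statistical counterpart with random displacements).

      import numpy as np

      def midpoint_displacement(levels, D=1.5, exact=True, seed=0):
          """1-D midpoint-displacement profile with fractal dimension D (1 < D < 2).
          exact=True uses fixed-magnitude, alternating-sign displacements (deterministic);
          exact=False draws Gaussian displacements (statistical fractal)."""
          H = 2.0 - D                          # Hurst exponent of a 1-D profile
          rng = np.random.default_rng(seed)
          y = np.array([0.0, 0.0])
          scale = 1.0
          for _ in range(levels):
              mid = 0.5 * (y[:-1] + y[1:])
              if exact:
                  disp = scale * np.where(np.arange(mid.size) % 2 == 0, 1.0, -1.0)
              else:
                  disp = rng.normal(0.0, scale, size=mid.size)
              new = np.empty(y.size + mid.size)
              new[0::2] = y
              new[1::2] = mid + disp
              y = new
              scale *= 2.0 ** (-H)             # shrink displacements at each recursion level
          return y

      profile = midpoint_displacement(levels=8, D=1.3, exact=True)
      print(profile.size)                      # 2**8 + 1 = 257 points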

  3. Statistical analysis of dimer formation in supersaturated metal vapor based on molecular dynamics simulation

    NASA Astrophysics Data System (ADS)

    Korenchenko, Anna E.; Vorontsov, Alexander G.; Gelchinski, Boris R.; Sannikov, Grigorii P.

    2018-04-01

    We discuss the problem of dimer formation during the homogeneous nucleation of atomic metal vapor in an inert gas environment. We simulated nucleation with molecular dynamics and carried out a statistical analysis of double- and triple-atomic collisions as the two routes to long-lived diatomic complex formation. A close pair of atoms with a lifetime greater than the mean time interval between atom-atom collisions is called a long-lived diatomic complex. We found that double- and triple-atomic collisions gave approximately the same probabilities of long-lived diatomic complex formation, but the internal energy of the resulting state was substantially lower in the second case. Some diatomic complexes formed in three-particle collisions are stable enough to act as a critical nucleus.

  4. Detecting changes in dynamic and complex acoustic environments

    PubMed Central

    Boubenec, Yves; Lawlor, Jennifer; Górska, Urszula; Shamma, Shihab; Englitz, Bernhard

    2017-01-01

    Natural sounds, such as wind or rain, are characterized by the statistical occurrence of their constituents. Despite their complexity, listeners readily detect changes in these contexts. Here we address the neural basis of statistical decision-making using a combination of psychophysics, EEG and modelling. In a texture-based, change-detection paradigm, human performance and reaction times improved with longer pre-change exposure, consistent with improved estimation of baseline statistics. Change-locked and decision-related EEG responses were found in a centro-parietal scalp location, whose slope depended on change size, consistent with sensory evidence accumulation. The potential's amplitude scaled with the duration of pre-change exposure, suggesting a time-dependent decision threshold. Auditory cortex-related potentials showed no response to the change. A dual-timescale statistical estimation model accounted for subjects' performance. Furthermore, a decision-augmented auditory cortex model accounted for performance and reaction times, suggesting that the primary cortical representation requires little post-processing to enable change-detection in complex acoustic environments. DOI: http://dx.doi.org/10.7554/eLife.24910.001 PMID:28262095

  5. Unraveling multiple changes in complex climate time series using Bayesian inference

    NASA Astrophysics Data System (ADS)

    Berner, Nadine; Trauth, Martin H.; Holschneider, Matthias

    2016-04-01

    Change points in time series are perceived as heterogeneities in the statistical or dynamical characteristics of observations. Unraveling such transitions yields essential information for the understanding of the observed system. The precise detection and basic characterization of underlying changes is therefore of particular importance in environmental sciences. We present a kernel-based Bayesian inference approach to investigate direct as well as indirect climate observations for multiple generic transition events. In order to develop a diagnostic approach designed to capture a variety of natural processes, the basic statistical features of central tendency and dispersion are used to locally approximate a complex time series by a generic transition model. A Bayesian inversion approach is developed to robustly infer the location and the generic pattern of such a transition. To systematically investigate time series for multiple changes occurring at different temporal scales, the Bayesian inversion is extended to a kernel-based inference approach. By introducing basic kernel measures, the kernel inference results are combined into a proxy for the posterior distribution of multiple transitions. Thus, based on a generic transition model, a probability expression is derived that is capable of indicating multiple changes within a complex time series. We discuss the method's performance by investigating direct and indirect climate observations. The approach is applied to an environmental time series (spanning about 100 years) from the weather station in Tuscaloosa, Alabama, and confirms documented instrumentation changes. Moreover, the approach is used to investigate a set of complex terrigenous dust records from the ODP sites 659, 721/722 and 967, interpreted as climate indicators of the African region of the Plio-Pleistocene period (about 5 Ma). The detailed inference unravels multiple transitions underlying the indirect climate observations, coinciding with established global climate events.
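
    A much-reduced sketch of change-point inference in the same spirit is shown below: the posterior over a single shift in the mean of a Gaussian series is approximated from segment residuals. The kernel-based machinery for multiple transitions in both central tendency and dispersion is not reproduced, and the series is synthetic.

      import numpy as np

      def changepoint_posterior(x, sigma=1.0):
          """Approximate posterior over a single change point in the mean of a
          Gaussian series with known noise std sigma (profile-likelihood sketch:
          segment means are maximised rather than integrated out)."""
          n = len(x)
          logp = np.full(n, -np.inf)
          for k in range(2, n - 2):                      # at least two points per segment
              left, right = x[:k], x[k:]
              rss = np.sum((left - left.mean()) ** 2) + np.sum((right - right.mean()) ** 2)
              logp[k] = -rss / (2.0 * sigma ** 2)
          post = np.exp(logp - logp.max())
          return post / post.sum()

      rng = np.random.default_rng(3)
      x = np.concatenate([rng.normal(0.0, 1.0, 120), rng.normal(1.2, 1.0, 80)])  # shift at 120
      post = changepoint_posterior(x)
      print("most probable change point:", int(np.argmax(post)))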

  6. Functional Logistic Regression Approach to Detecting Gene by Longitudinal Environmental Exposure Interaction in a Case-Control Study

    PubMed Central

    Wei, Peng; Tang, Hongwei; Li, Donghui

    2014-01-01

    Most complex human diseases are likely the consequence of the joint actions of genetic and environmental factors. Identification of gene-environment (GxE) interactions not only contributes to a better understanding of the disease mechanisms, but also improves disease risk prediction and targeted intervention. In contrast to the large number of genetic susceptibility loci discovered by genome-wide association studies, there have been very few successes in identifying GxE interactions which may be partly due to limited statistical power and inaccurately measured exposures. While existing statistical methods only consider interactions between genes and static environmental exposures, many environmental/lifestyle factors, such as air pollution and diet, change over time, and cannot be accurately captured at one measurement time point or by simply categorizing into static exposure categories. There is a dearth of statistical methods for detecting gene by time-varying environmental exposure interactions. Here we propose a powerful functional logistic regression (FLR) approach to model the time-varying effect of longitudinal environmental exposure and its interaction with genetic factors on disease risk. Capitalizing on the powerful functional data analysis framework, our proposed FLR model is capable of accommodating longitudinal exposures measured at irregular time points and contaminated by measurement errors, commonly encountered in observational studies. We use extensive simulations to show that the proposed method can control the Type I error and is more powerful than alternative ad hoc methods. We demonstrate the utility of this new method using data from a case-control study of pancreatic cancer to identify the windows of vulnerability of lifetime body mass index on the risk of pancreatic cancer as well as genes which may modify this association. PMID:25219575
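
    To make the modelling idea concrete, the hedged sketch below represents each subject's longitudinal exposure by a few basis scores and enters those scores, the genotype, and their products into an ordinary logistic regression. It is a simplified stand-in for the functional logistic regression of the paper (no measurement-error correction, regular time grid), and all simulated data and names are illustrative.

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      rng = np.random.default_rng(7)
      n, n_time, n_basis = 400, 25, 3
      t = np.linspace(0.0, 1.0, n_time)

      # Small Fourier basis evaluated on the (here regular) measurement grid
      basis = np.column_stack([np.ones_like(t)] +
                              [np.sin(2 * np.pi * k * t) for k in range(1, n_basis)])

      # Simulated exposure trajectories (e.g. BMI over time) and genotypes (0/1/2)
      traj = rng.normal(25.0, 2.0, (n, n_time)) + np.outer(rng.normal(0.0, 1.0, n), t)
      geno = rng.integers(0, 3, n)

      # Functional scores: least-squares projection of each trajectory onto the basis
      scores = np.linalg.lstsq(basis, traj.T, rcond=None)[0].T        # shape (n, n_basis)

      # Simulated case-control outcome with a genotype-by-trajectory interaction
      lin = -1.0 + 0.05 * scores[:, 1] * geno
      y = rng.random(n) < 1.0 / (1.0 + np.exp(-lin))

      # Logistic regression on scores, genotype, and score-by-genotype interactions
      X = np.column_stack([scores, geno[:, None], scores * geno[:, None]])
      model = LogisticRegression(max_iter=2000).fit(X, y)
      print("interaction coefficients:", model.coef_[0][-n_basis:])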

  7. Functional logistic regression approach to detecting gene by longitudinal environmental exposure interaction in a case-control study.

    PubMed

    Wei, Peng; Tang, Hongwei; Li, Donghui

    2014-11-01

    Most complex human diseases are likely the consequence of the joint actions of genetic and environmental factors. Identification of gene-environment (G × E) interactions not only contributes to a better understanding of the disease mechanisms, but also improves disease risk prediction and targeted intervention. In contrast to the large number of genetic susceptibility loci discovered by genome-wide association studies, there have been very few successes in identifying G × E interactions, which may be partly due to limited statistical power and inaccurately measured exposures. Although existing statistical methods only consider interactions between genes and static environmental exposures, many environmental/lifestyle factors, such as air pollution and diet, change over time, and cannot be accurately captured at one measurement time point or by simply categorizing into static exposure categories. There is a dearth of statistical methods for detecting gene by time-varying environmental exposure interactions. Here, we propose a powerful functional logistic regression (FLR) approach to model the time-varying effect of longitudinal environmental exposure and its interaction with genetic factors on disease risk. Capitalizing on the powerful functional data analysis framework, our proposed FLR model is capable of accommodating longitudinal exposures measured at irregular time points and contaminated by measurement errors, commonly encountered in observational studies. We use extensive simulations to show that the proposed method can control the Type I error and is more powerful than alternative ad hoc methods. We demonstrate the utility of this new method using data from a case-control study of pancreatic cancer to identify the windows of vulnerability of lifetime body mass index on the risk of pancreatic cancer as well as genes that may modify this association. © 2014 Wiley Periodicals, Inc.

  8. Morphometricity as a measure of the neuroanatomical signature of a trait.

    PubMed

    Sabuncu, Mert R; Ge, Tian; Holmes, Avram J; Smoller, Jordan W; Buckner, Randy L; Fischl, Bruce

    2016-09-27

    Complex physiological and behavioral traits, including neurological and psychiatric disorders, often associate with distributed anatomical variation. This paper introduces a global metric, called morphometricity, as a measure of the anatomical signature of different traits. Morphometricity is defined as the proportion of phenotypic variation that can be explained by macroscopic brain morphology. We estimate morphometricity via a linear mixed-effects model that uses an anatomical similarity matrix computed based on measurements derived from structural brain MRI scans. We examined over 3,800 unique MRI scans from nine large-scale studies to estimate the morphometricity of a range of phenotypes, including clinical diagnoses such as Alzheimer's disease, and nonclinical traits such as measures of cognition. Our results demonstrate that morphometricity can provide novel insights about the neuroanatomical correlates of a diverse set of traits, revealing associations that might not be detectable through traditional statistical techniques.
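
    The paper estimates morphometricity with a linear mixed-effects (REML-type) model; the sketch below uses a simpler moment-based (Haseman-Elston-style) regression of phenotype cross-products on anatomical-similarity entries, with a simulated similarity matrix and trait, purely to illustrate the quantity being estimated.

      import numpy as np

      rng = np.random.default_rng(11)
      n, p, m2_true = 500, 300, 0.4         # subjects, morphological features, true value

      # Anatomical similarity matrix built from standardized morphological features
      Z = rng.normal(size=(n, p))
      Z = (Z - Z.mean(axis=0)) / Z.std(axis=0)
      K = Z @ Z.T / p

      # Simulate a trait whose morphometricity is m2_true
      g = Z @ rng.normal(0.0, np.sqrt(m2_true / p), p)        # cov(g) = m2_true * K
      y = g + rng.normal(0.0, np.sqrt(1.0 - m2_true), n)
      y = (y - y.mean()) / y.std()

      # Regress y_i * y_j on K_ij over distinct subject pairs
      iu = np.triu_indices(n, k=1)
      prod, k_off = np.outer(y, y)[iu], K[iu]
      m2_hat = np.sum(k_off * prod) / np.sum(k_off ** 2)
      print(f"estimated morphometricity: {m2_hat:.2f} (true value {m2_true})")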

  9. Dissimilarity measure based on ordinal pattern for physiological signals

    NASA Astrophysics Data System (ADS)

    Wang, Jing; Shang, Pengjian; Shi, Wenbin; Cui, Xingran

    2016-08-01

    Complex physiologic signals may carry information about their underlying mechanisms. In this paper, we introduce a dissimilarity measure to capture the features of the underlying dynamics from various types of physiologic signals based on rank-order statistics of ordinal patterns. Simulated 1/f noise and white noise are used to evaluate the effect of data length, embedding dimension and time delay on this measure. We then apply this measure to different physiologic signals. The method can successfully characterize the unique underlying patterns of subjects at similar physiologic states. It can also serve as a good discriminative tool for the healthy young, healthy elderly, congestive heart failure, atrial fibrillation and white noise groups. Furthermore, when the underlying ordinal patterns of each group are examined in detail, the distributions of ordinal patterns are found to vary significantly between healthy and pathologic states, as well as with aging.
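
    The core ingredients (ordinal-pattern distributions and a dissimilarity between them) can be sketched as follows; the Jensen-Shannon divergence below is a stand-in for the paper's dissimilarity measure, and the white-noise and random-walk surrogates are illustrative only.

      import numpy as np
      from itertools import permutations

      def ordinal_distribution(x, dim=3, delay=1):
          """Relative frequencies of ordinal patterns of embedding dimension dim."""
          counts = {p: 0 for p in permutations(range(dim))}
          for i in range(len(x) - (dim - 1) * delay):
              window = x[i:i + dim * delay:delay]
              counts[tuple(np.argsort(window))] += 1
          freqs = np.array(list(counts.values()), dtype=float)
          return freqs / freqs.sum()

      def jensen_shannon(p, q):
          """Jensen-Shannon divergence between two pattern distributions."""
          m = 0.5 * (p + q)
          kl = lambda a, b: np.sum(a[a > 0] * np.log(a[a > 0] / b[a > 0]))
          return 0.5 * kl(p, m) + 0.5 * kl(q, m)

      rng = np.random.default_rng(5)
      white = rng.normal(size=5000)                 # uncorrelated surrogate
      walk = np.cumsum(rng.normal(size=5000))       # strongly correlated surrogate
      d = jensen_shannon(ordinal_distribution(white, dim=4), ordinal_distribution(walk, dim=4))
      print(f"dissimilarity (JSD) = {d:.3f}")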

  10. Morphometricity as a measure of the neuroanatomical signature of a trait

    PubMed Central

    Sabuncu, Mert R.; Ge, Tian; Holmes, Avram J.; Smoller, Jordan W.; Buckner, Randy L.; Fischl, Bruce

    2016-01-01

    Complex physiological and behavioral traits, including neurological and psychiatric disorders, often associate with distributed anatomical variation. This paper introduces a global metric, called morphometricity, as a measure of the anatomical signature of different traits. Morphometricity is defined as the proportion of phenotypic variation that can be explained by macroscopic brain morphology. We estimate morphometricity via a linear mixed-effects model that uses an anatomical similarity matrix computed based on measurements derived from structural brain MRI scans. We examined over 3,800 unique MRI scans from nine large-scale studies to estimate the morphometricity of a range of phenotypes, including clinical diagnoses such as Alzheimer’s disease, and nonclinical traits such as measures of cognition. Our results demonstrate that morphometricity can provide novel insights about the neuroanatomical correlates of a diverse set of traits, revealing associations that might not be detectable through traditional statistical techniques. PMID:27613854

  11. Observation-Driven Configuration of Complex Software Systems

    NASA Astrophysics Data System (ADS)

    Sage, Aled

    2010-06-01

    The ever-increasing complexity of software systems makes them hard to comprehend, predict and tune due to emergent properties and non-deterministic behaviour. Complexity arises from the size of software systems and the wide variety of possible operating environments: the increasing choice of platforms and communication policies leads to ever more complex performance characteristics. In addition, software systems exhibit different behaviour under different workloads. Many software systems are designed to be configurable so that policies can be chosen to meet the needs of various stakeholders. For complex software systems it can be difficult to accurately predict the effects of a change and to know which configuration is most appropriate. This thesis demonstrates that it is useful to run automated experiments that measure a selection of system configurations. Experiments can find configurations that meet the stakeholders' needs, find interesting behavioural characteristics, and help produce predictive models of the system's behaviour. The design and use of ACT (Automated Configuration Tool) for running such experiments is described, in combination with a number of search strategies for deciding on the configurations to measure. Design Of Experiments (DOE) is discussed, with emphasis on Taguchi Methods. These statistical methods have been used extensively in manufacturing, but have not previously been used for configuring software systems. The novel contribution here is an industrial case study, applying the combination of ACT and Taguchi Methods to DC-Directory, a product from Data Connection Ltd (DCL). The case study investigated the applicability of Taguchi Methods for configuring complex software systems. Taguchi Methods were found to be useful for modelling and configuring DC-Directory, making them a valuable addition to the techniques available to system administrators and developers.
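
    The kind of main-effects analysis used with such designed experiments can be sketched as below. A 2^3 full factorial stands in for the Taguchi orthogonal arrays discussed in the thesis, and the factor names and latency figures are invented configuration parameters, not DC-Directory settings or measurements.

      import numpy as np
      from itertools import product

      # Three hypothetical two-level configuration factors
      factors = ["cache_size", "thread_pool", "batch_writes"]
      design = np.array(list(product([0, 1], repeat=3)))      # 2^3 full factorial, 8 runs

      # Invented response for each run (e.g. mean request latency in ms)
      response = np.array([41.0, 38.5, 35.2, 33.0, 40.1, 37.9, 34.8, 32.1])

      # Main effect of each factor: mean response at the high level minus at the low level
      for j, name in enumerate(factors):
          effect = response[design[:, j] == 1].mean() - response[design[:, j] == 0].mean()
          print(f"{name:>12}: effect = {effect:+.2f} ms")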

  12. Benchmarking in pathology: development of a benchmarking complexity unit and associated key performance indicators.

    PubMed

    Neil, Amanda; Pfeffer, Sally; Burnett, Leslie

    2013-01-01

    This paper details the development of a new type of pathology laboratory productivity unit, the benchmarking complexity unit (BCU). The BCU provides a comparative index of laboratory efficiency, regardless of test mix. It also enables estimation of a measure of how much complex pathology a laboratory performs, and the identification of peer organisations for the purposes of comparison and benchmarking. The BCU is based on the theory that wage rates reflect productivity at the margin. A weighting factor for the ratio of medical to technical staff time was dynamically calculated based on actual participant site data. Given this weighting, a complexity value for each test, at each site, was calculated. The median complexity value (number of BCUs) for that test across all participating sites was taken as its complexity value for the Benchmarking in Pathology Program. The BCU allowed implementation of an unbiased comparison unit and test listing that was found to be a robust indicator of the relative complexity for each test. Employing the BCU data, a number of Key Performance Indicators (KPIs) were developed, including three that address comparative organisational complexity, analytical depth and performance efficiency, respectively. Peer groups were also established using the BCU combined with simple organisational and environmental metrics. The BCU has enabled productivity statistics to be compared between organisations. The BCU corrects for differences in test mix and workload complexity of different organisations and also allows for objective stratification into peer groups.

  13. STUDY OF TURBULENT ENERGY OVER COMPLEX TERRAIN: STATE, 1978

    EPA Science Inventory

    The complex structure of the earth's surface influenced atmospheric parameters pertinent to modeling the diffusion process during the 1978 'STATE' field study. The Information Theory approach of statistics proved useful for analyzing the complex structures observed in the radiome...

  14. Genotype-environment interaction and sociology: contributions and complexities.

    PubMed

    Seabrook, Jamie A; Avison, William R

    2010-05-01

    Genotype-environment interaction (G x E) refers to situations in which genetic effects connected to a phenotype are dependent upon variability in the environment, or when genes modify an organism's sensitivity to particular environmental features. Using a typology suggested in the G x E literature, we provide an overview of recent papers that show how social context can trigger a genetic vulnerability, compensate for a genetic vulnerability, control behaviors for which a genetic vulnerability exists, and improve adaptation via proximal causes. We argue that to improve their understanding of social structure, sociologists can take advantage of research in behavior genetics by assessing the impact of within-group variance of various health outcomes and complex human behaviors that are explainable by genotype, environment and their interaction. Insights from life course sociology can aid in ensuring that the dynamic nature of the environment in G x E has been accounted for. Identification of an appropriate entry point for sociologists interested in G x E research could begin with the choice of an environmental feature of interest, a genetic factor of interest, and/or behavior of interest. Optimizing measurement in order to capture the complexity of G x E is critical. Examining the interaction between poorly measured environmental factors and well measured genetic variables will overestimate the effects of genetic variables while underestimating the effect of environmental influences, thereby distorting the interaction between genotype and environment. Although the expense of collecting environmental data is very high, reliable and precise measurement of an environmental pathogen enhances a study's statistical power. Copyright 2010 Elsevier Ltd. All rights reserved.

  15. Introduction of a Journal Excerpt Activity Improves Undergraduate Students' Performance in Statistics

    ERIC Educational Resources Information Center

    Rabin, Laura A.; Nutter-Upham, Katherine E.

    2010-01-01

    We describe an active learning exercise intended to improve undergraduate students' understanding of statistics by grounding complex concepts within a meaningful, applied context. Students in a journal excerpt activity class read brief excerpts of statistical reporting from published research articles, answered factual and interpretive questions,…

  16. The structure and resilience of financial market networks

    NASA Astrophysics Data System (ADS)

    Kauê Dal'Maso Peron, Thomas; da Fontoura Costa, Luciano; Rodrigues, Francisco A.

    2012-03-01

    Financial markets can be viewed as a highly complex evolving system that is very sensitive to economic instabilities. The complex organization of the market can be represented in a suitable fashion in terms of complex networks, which can be constructed from stock prices such that each pair of stocks is connected by a weighted edge that encodes the distance between them. In this work, we propose an approach to analyze the topological and dynamic evolution of financial networks based on the stock correlation matrices. An entropy-related measurement is adopted to quantify the robustness of the evolving financial market organization. It is verified that the network topological organization undergoes strong variation during financial instabilities and the networks in such periods become less robust. A robust statistical regression model is proposed to quantify the relationship between the network structure and resilience. The coefficients obtained from this model indicate that the average shortest path length is the measurement most strongly related to the network resilience coefficient. This result indicates that a collective behavior is observed between stocks during financial crises. More specifically, stocks tend to synchronize their price evolution, leading to a high correlation between pairs of stock prices, which contributes to the increase in distance between them and, consequently, to a decrease in network resilience.
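
    The standard construction of such a network can be sketched as below: correlations are mapped to edge distances, the graph is filtered with a minimum spanning tree, and the average shortest path length is computed on the result. The price series are simulated and the entropy-related robustness measure of the paper is not reproduced.

      import numpy as np
      import networkx as nx

      rng = np.random.default_rng(42)
      n_stocks, n_days = 20, 500

      # Simulated log-returns with a common "market" factor to induce correlations
      market = rng.normal(0.0, 0.01, n_days)
      returns = 0.7 * market + rng.normal(0.0, 0.01, (n_stocks, n_days))

      corr = np.corrcoef(returns)
      dist = np.sqrt(2.0 * (1.0 - corr))        # usual correlation-to-distance mapping

      G = nx.Graph()
      for i in range(n_stocks):
          for j in range(i + 1, n_stocks):
              G.add_edge(i, j, weight=dist[i, j])

      mst = nx.minimum_spanning_tree(G, weight="weight")       # filtered market network
      aspl = nx.average_shortest_path_length(mst, weight="weight")
      print(f"average shortest path length (MST): {aspl:.3f}")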

  17. A wind tunnel study on the effects of complex topography on wind turbine performance

    NASA Astrophysics Data System (ADS)

    Howard, Kevin; Hu, Stephen; Chamorro, Leonardo; Guala, Michele

    2012-11-01

    A set of wind tunnel experiments was conducted to study the response of a wind turbine under flow conditions typically observed at the wind farm scale, in complex terrain. A scale-model wind turbine was placed in a fully developed turbulent boundary layer flow obtained in the SAFL Wind Tunnel. Experiments focused on the performance of a turbine model under the effects induced by a second upwind turbine or by a three-dimensional, sinusoidal hill peaking at the turbine hub height. High frequency measurements of fluctuating streamwise and wall-normal velocities were obtained with an X-wire anemometer simultaneously with the rotor angular velocity and the turbine(s) voltage output. Velocity measurements in the wake of the first turbine and of the hill were used to determine the inflow conditions for the downwind test turbine. Turbine performance was inferred from the mean and fluctuating voltage statistics. Specific experiments were devoted to relating the mean voltage to the mean hub velocity, and the fluctuating voltage to the unsteadiness in the rotor kinematics induced by the perturbed (hill or turbine) or unperturbed (boundary layer) large scales of the incoming turbulent flow. Results show that the voltage signal can be used to assess turbine performance in complex flows.

  18. Learning predictive statistics from temporal sequences: Dynamics and strategies

    PubMed Central

    Wang, Rui; Shen, Yuan; Tino, Peter; Welchman, Andrew E.; Kourtzi, Zoe

    2017-01-01

    Human behavior is guided by our expectations about the future. Often, we make predictions by monitoring how event sequences unfold, even though such sequences may appear incomprehensible. Event structures in the natural environment typically vary in complexity, from simple repetition to complex probabilistic combinations. How do we learn these structures? Here we investigate the dynamics of structure learning by tracking human responses to temporal sequences that change in structure unbeknownst to the participants. Participants were asked to predict the upcoming item following a probabilistic sequence of symbols. Using a Markov process, we created a family of sequences, from simple frequency statistics (e.g., some symbols are more probable than others) to context-based statistics (e.g., symbol probability is contingent on preceding symbols). We demonstrate the dynamics with which individuals adapt to changes in the environment's statistics—that is, they extract the behaviorally relevant structures to make predictions about upcoming events. Further, we show that this structure learning relates to individual decision strategy; faster learning of complex structures relates to selection of the most probable outcome in a given context (maximizing) rather than matching of the exact sequence statistics. Our findings provide evidence for alternate routes to learning of behaviorally relevant statistics that facilitate our ability to predict future events in variable environments. PMID:28973111
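
    The maximizing-versus-matching distinction can be illustrated with a toy first-order Markov sequence; the transition matrix below is invented and the two strategies assume the statistics have already been learned perfectly.

      import numpy as np

      rng = np.random.default_rng(2)
      # Invented context-based (first-order Markov) statistics; rows = current symbol
      T = np.array([[0.7, 0.1, 0.1, 0.1],
                    [0.1, 0.7, 0.1, 0.1],
                    [0.1, 0.1, 0.7, 0.1],
                    [0.1, 0.1, 0.1, 0.7]])

      # Generate a sequence from the Markov process
      n = 20000
      seq = np.zeros(n, dtype=int)
      for t in range(1, n):
          seq[t] = rng.choice(4, p=T[seq[t - 1]])

      # Two prediction strategies applied to the same sequence
      maximizing = T.argmax(axis=1)[seq[:-1]]                          # pick the most probable symbol
      matching = np.array([rng.choice(4, p=T[s]) for s in seq[:-1]])   # sample the conditional statistics

      print("maximizing accuracy:", np.mean(maximizing == seq[1:]))
      print("matching accuracy:  ", np.mean(matching == seq[1:]))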

  19. A Powerful Approach to Estimating Annotation-Stratified Genetic Covariance via GWAS Summary Statistics.

    PubMed

    Lu, Qiongshi; Li, Boyang; Ou, Derek; Erlendsdottir, Margret; Powles, Ryan L; Jiang, Tony; Hu, Yiming; Chang, David; Jin, Chentian; Dai, Wei; He, Qidu; Liu, Zefeng; Mukherjee, Shubhabrata; Crane, Paul K; Zhao, Hongyu

    2017-12-07

    Despite the success of large-scale genome-wide association studies (GWASs) on complex traits, our understanding of their genetic architecture is far from complete. Jointly modeling multiple traits' genetic profiles has provided insights into the shared genetic basis of many complex traits. However, large-scale inference sets a high bar for both statistical power and biological interpretability. Here we introduce a principled framework to estimate annotation-stratified genetic covariance between traits using GWAS summary statistics. Through theoretical and numerical analyses, we demonstrate that our method provides accurate covariance estimates, thereby enabling researchers to dissect both the shared and distinct genetic architecture across traits to better understand their etiologies. Among 50 complex traits with publicly accessible GWAS summary statistics (N total ≈ 4.5 million), we identified more than 170 pairs with statistically significant genetic covariance. In particular, we found strong genetic covariance between late-onset Alzheimer disease (LOAD) and amyotrophic lateral sclerosis (ALS), two major neurodegenerative diseases, in single-nucleotide polymorphisms (SNPs) with high minor allele frequencies and in SNPs located in the predicted functional genome. Joint analysis of LOAD, ALS, and other traits highlights LOAD's correlation with cognitive traits and hints at an autoimmune component for ALS. Copyright © 2017 American Society of Human Genetics. Published by Elsevier Inc. All rights reserved.

  20. Age- and sex-related growth patterns of the craniofacial complex in European children aged 3-6 years.

    PubMed

    Tutkuviene, Janina; Cattaneo, Cristina; Obertová, Zuzana; Ratnayake, Melanie; Poppa, Pasquale; Barkus, Arunas; Khalaj-Hedayati, Kerstin; Schroeder, Inge; Ritz-Timme, Stefanie

    2016-11-01

    Craniofacial growth changes in young children are not yet completely understood. Up-to-date references for craniofacial measurements are crucial for clinical assessment of orthodontic anomalies, craniofacial abnormalities and subsequent planning of interventions. To provide normal reference data and to identify growth patterns for craniofacial dimensions of European boys and girls aged 3-6 years. Using standard anthropometric methodology, body weight, body height and 23 craniofacial measurements were acquired for a cross-sectional sample of 681 healthy children (362 boys and 319 girls) aged 3-6 years from Germany, Italy and Lithuania. Descriptive statistics, correlation coefficients, percentage annual changes and percentage growth rates were used to analyse the dataset. Between the ages of 3-6 years, craniofacial measurements showed age- and sex-related patterns independent from patterns observed for body weight and body height. Sex-related differences were observed in the majority of craniofacial measurements. In both sexes, face heights and face depths showed the strongest correlation with age. Growth patterns differed by craniofacial measurement and can be summarised into eight distinct age- and sex-related patterns. This study provided reference data and identified sex- and age-related growth patterns of the craniofacial complex of young European children, which may be used for detailed assessment of normal growth in paediatrics, maxillofacial reconstructive surgery and possibly for forensic age assessment.

  1. Acceptance Probability (Pa) Analysis for Process Validation Lifecycle Stages.

    PubMed

    Alsmeyer, Daniel; Pazhayattil, Ajay; Chen, Shu; Munaretto, Francesco; Hye, Maksuda; Sanghvi, Pradeep

    2016-04-01

    This paper introduces an innovative statistical approach towards understanding how variation impacts the acceptance criteria of quality attributes. Because of more complex stage-wise acceptance criteria, traditional process capability measures are inadequate for general application in the pharmaceutical industry. The probability of acceptance concept provides a clear measure, derived from specific acceptance criteria for each quality attribute. In line with the 2011 FDA Guidance, this approach systematically evaluates data and scientifically establishes evidence that a process is capable of consistently delivering quality product. The probability of acceptance provides a direct and readily understandable indication of product risk. As with traditional capability indices, the acceptance probability approach assumes that underlying data distributions are normal. The computational solutions for dosage uniformity and dissolution acceptance criteria are readily applicable. For dosage uniformity, the expected AV range may be determined using the s_lo and s_hi values along with the worst-case estimates of the mean. This approach permits a risk-based assessment of future batch performance of the critical quality attributes. The concept is also readily applicable to sterile/non-sterile liquid dose products. Quality attributes such as deliverable volume and assay per spray have stage-wise acceptance criteria that can be converted into an acceptance probability. Accepted statistical guidelines indicate processes with Cpk > 1.33 as performing well within statistical control and those with Cpk < 1.0 as "incapable" (1). A Cpk > 1.33 is associated with a centered process that will statistically produce less than 63 defective units per million. This is equivalent to an acceptance probability of >99.99%.
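
    The quoted link between Cpk and the per-unit defect rate (and hence acceptance probability) for a centered, normally distributed attribute can be reproduced with a few lines; this is the textbook relationship, not the stage-wise dosage-uniformity or dissolution computations developed in the paper.

      from scipy.stats import norm

      def defect_rate_ppm(cpk):
          """Two-sided defect rate (ppm) of a centered normal process with the given Cpk.
          Cpk = (nearest specification limit - mean) / (3 sigma)."""
          return 2.0 * norm.sf(3.0 * cpk) * 1e6

      for cpk in (1.0, 1.33, 1.67):
          ppm = defect_rate_ppm(cpk)
          pa = 1.0 - ppm / 1e6           # probability that a single unit is acceptable
          print(f"Cpk = {cpk:4.2f}: {ppm:9.1f} ppm defective, per-unit Pa = {pa:.6f}")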

  2. Simultaneous Ka-Band Site Characterization: Goldstone, CA, White Sands, NM, and Guam, USA

    NASA Technical Reports Server (NTRS)

    Acosta, Roberto; Morse, Jacquelynne; Zemba, Michael; Nessel, James; Morabito, David; Caroglanian, Armen

    2011-01-01

    To statistically characterize atmospheric effects on Ka-band links at NASA operational sites, NASA has constructed site test interferometers (STIs) which directly measure the tropospheric phase stability and rain attenuation. These instruments observe an unmodulated beacon signal broadcast from a geostationary satellite (e.g., Anik F2) and measure the phase difference between the signals received by the two antennas as well as the signal attenuation. Three STIs have been deployed so far: the first at the NASA Deep Space Network Tracking Complex in Goldstone, California (May 2007); the second at the NASA White Sands Complex in Las Cruces, New Mexico (February 2009); and the third at the NASA Tracking and Data Relay Satellite (TDRS) Remote Ground Terminal (GRGT) complex in Guam (May 2010). Two station-years of simultaneous atmospheric phase fluctuation data have been collected at Goldstone and White Sands, while one year of data has been collected in Guam. With identical instruments operating simultaneously, we can directly compare the phase stability and rain attenuation at the three sites. Phase stability is analyzed statistically in terms of the root-mean-square (rms) of the tropospheric induced time delay fluctuations over 10 minute blocks. For two years, the time delay fluctuations at the DSN site in Goldstone, CA, have been better than 2.5 picoseconds (ps) for 90% of the time (with reference to zenith), while at the White Sands, New Mexico, site the time delay fluctuations have been better than 2.2 ps (with reference to zenith) for 90% of the time. For Guam, the time delay fluctuations have been better than 12 ps (with reference to zenith) for 90% of the time; the higher fluctuations are expected for a high-humidity tropical rain zone. This type of data analysis, as well as many other site quality characteristics (e.g., rain attenuation, infrastructure, etc.), will be used to determine the suitability of all the sites for NASA's future communication services at Ka-band.
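
    The block-wise rms statistic used above is straightforward to compute; the sketch below applies it to a synthetic one-day delay record and reports the 90th percentile, purely as an illustration of the processing, not of STI data.

      import numpy as np

      def blockwise_rms(delay_ps, fs_hz, block_s=600.0):
          """RMS of mean-removed time-delay fluctuations over consecutive blocks (default 10 min)."""
          block_len = int(block_s * fs_hz)
          n_blocks = len(delay_ps) // block_len
          rms = np.empty(n_blocks)
          for k in range(n_blocks):
              block = delay_ps[k * block_len:(k + 1) * block_len]
              rms[k] = np.sqrt(np.mean((block - block.mean()) ** 2))
          return rms

      # Synthetic one-day record sampled at 1 Hz (picoseconds)
      rng = np.random.default_rng(8)
      delay = np.cumsum(rng.normal(0.0, 0.05, 86400))

      rms = blockwise_rms(delay, fs_hz=1.0)
      print(f"90th-percentile 10-min rms delay: {np.percentile(rms, 90):.2f} ps")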

  3. Statistical downscaling rainfall using artificial neural network: significantly wetter Bangkok?

    NASA Astrophysics Data System (ADS)

    Vu, Minh Tue; Aribarg, Thannob; Supratid, Siriporn; Raghavan, Srivatsan V.; Liong, Shie-Yui

    2016-11-01

    Artificial neural network (ANN) is an established technique with a flexible mathematical structure that is capable of identifying complex nonlinear relationships between input and output data. The present study utilizes ANN as a method of statistically downscaling global climate models (GCMs) during the rainy season at meteorological site locations in Bangkok, Thailand. The study illustrates the applications of the feed-forward back-propagation network using large-scale predictor variables derived from both the ERA-Interim reanalysis data and present-day/future GCM data. The predictors are first selected over different grid boxes surrounding the Bangkok region and then screened by using principal component analysis (PCA) to filter the best-correlated predictors for ANN training. The reanalysis-downscaled results for the present-day climate show good agreement with station precipitation, with a correlation coefficient of 0.8 and a Nash-Sutcliffe efficiency of 0.65. The final downscaled results for four GCMs show an increasing trend of precipitation for the rainy season over Bangkok by the end of the twenty-first century. The extreme values of precipitation determined using statistical indices show strong increases of wetness. These findings will be useful for policy makers considering flood-adaptation measures, such as whether the current drainage network system is sufficient to cope with the changing climate, and for planning a range of related adaptation/mitigation measures.
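
    The two skill scores quoted above are computed as in the sketch below; the "observed" and "downscaled" series are synthetic placeholders, not the Bangkok station data or ANN output.

      import numpy as np

      def nash_sutcliffe(obs, sim):
          """Nash-Sutcliffe efficiency: 1 - sum of squared errors / variance of the observations."""
          obs, sim = np.asarray(obs, float), np.asarray(sim, float)
          return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

      rng = np.random.default_rng(6)
      observed = rng.gamma(2.0, 5.0, 365)                     # synthetic daily rainfall (mm)
      downscaled = observed + rng.normal(0.0, 4.0, 365)       # stand-in for downscaled output

      r = np.corrcoef(observed, downscaled)[0, 1]
      print(f"correlation = {r:.2f}, NSE = {nash_sutcliffe(observed, downscaled):.2f}")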

  4. Time, frequency, and time-varying Granger-causality measures in neuroscience.

    PubMed

    Cekic, Sezen; Grandjean, Didier; Renaud, Olivier

    2018-05-20

    This article proposes a systematic methodological review and an objective criticism of existing methods enabling the derivation of time, frequency, and time-varying Granger-causality statistics in neuroscience. The capacity to describe the causal links between signals recorded at different brain locations during a neuroscience experiment is indeed of primary interest for neuroscientists, who often have very precise prior hypotheses about the relationships between recorded brain signals. The increasing interest in, and the huge number of publications related to, this topic call for this systematic review, which describes the very complex methodological aspects underlying the derivation of these statistics. In this article, we first present a general framework that allows us to review and compare Granger-causality statistics in the time domain, and the link with transfer entropy. Then, the spectral and the time-varying extensions are exposed and discussed together with their estimation and distributional properties. Although not the focus of this article, partial and conditional Granger causality, dynamic causal modelling, directed transfer function, directed coherence, partial directed coherence, and their variants are also mentioned. Copyright © 2018 John Wiley & Sons, Ltd.
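
    A minimal time-domain example of the Granger-causality logic reviewed here is sketched below: an autoregressive model of one signal is compared with and without lagged values of the other via an F-test. The bivariate system, fixed model order, and noise levels are illustrative choices only.

      import numpy as np
      from scipy import stats

      def granger_f_test(x, y, order=2):
          """Does y Granger-cause x? F-test comparing AR models of x with and without lags of y."""
          n = len(x)
          target = x[order:]
          own = np.array([x[t - order:t][::-1] for t in range(order, n)])    # lags of x
          other = np.array([y[t - order:t][::-1] for t in range(order, n)])  # lags of y
          X_r = np.column_stack([np.ones(len(target)), own])                 # restricted model
          X_f = np.column_stack([X_r, other])                                # full model
          rss_r = np.sum((target - X_r @ np.linalg.lstsq(X_r, target, rcond=None)[0]) ** 2)
          rss_f = np.sum((target - X_f @ np.linalg.lstsq(X_f, target, rcond=None)[0]) ** 2)
          df1, df2 = order, len(target) - X_f.shape[1]
          F = ((rss_r - rss_f) / df1) / (rss_f / df2)
          return F, stats.f.sf(F, df1, df2)

      rng = np.random.default_rng(4)
      n = 2000
      y = rng.normal(size=n)
      x = np.zeros(n)
      for t in range(1, n):                          # x is driven by the past of y
          x[t] = 0.4 * x[t - 1] + 0.5 * y[t - 1] + 0.1 * rng.normal()

      print("y -> x:", granger_f_test(x, y))         # expected to be strongly significant
      print("x -> y:", granger_f_test(y, x))         # expected not to be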

  5. Predicting Slag Generation in Sub-Scale Test Motors Using a Neural Network

    NASA Technical Reports Server (NTRS)

    Wiesenberg, Brent

    1999-01-01

    Generation of slag (aluminum oxide) is an important issue for the Reusable Solid Rocket Motor (RSRM). Thiokol performed testing to quantify the relationship between raw material variations and slag generation in solid propellants by testing sub-scale motors cast with propellant containing various combinations of aluminum fuel and ammonium perchlorate (AP) oxidizer particle sizes. The test data were analyzed using statistical methods and an artificial neural network. This paper primarily addresses the neural network results with some comparisons to the statistical results. The neural network showed that the particle sizes of both the aluminum and unground AP have a measurable effect on slag generation. The neural network analysis showed that aluminum particle size is the dominant driver in slag generation, about 40% more influential than AP. The network predictions of the amount of slag produced during firing of sub-scale motors were 16% better than the predictions of a statistically derived empirical equation. Another neural network successfully characterized the slag generated during full-scale motor tests. The success is attributable to the ability of neural networks to characterize multiple complex factors including interactions that affect slag generation.

  6. Regression modeling of ground-water flow

    USGS Publications Warehouse

    Cooley, R.L.; Naff, R.L.

    1985-01-01

    Nonlinear multiple regression methods are developed to model and analyze groundwater flow systems. Complete descriptions of regression methodology as applied to groundwater flow models allow scientists and engineers engaged in flow modeling to apply the methods to a wide range of problems. Organization of the text proceeds from an introduction that discusses the general topic of groundwater flow modeling, to a review of basic statistics necessary to properly apply regression techniques, and then to the main topic: exposition and use of linear and nonlinear regression to model groundwater flow. Statistical procedures are given to analyze and use the regression models. A number of exercises and answers are included to exercise the student on nearly all the methods that are presented for modeling and statistical analysis. Three computer programs implement the more complex methods. These three are a general two-dimensional, steady-state regression model for flow in an anisotropic, heterogeneous porous medium, a program to calculate a measure of model nonlinearity with respect to the regression parameters, and a program to analyze model errors in computed dependent variables such as hydraulic head. (USGS)

  7. Statistical limitations in functional neuroimaging. I. Non-inferential methods and statistical models.

    PubMed Central

    Petersson, K M; Nichols, T E; Poline, J B; Holmes, A P

    1999-01-01

    Functional neuroimaging (FNI) provides experimental access to the intact living brain making it possible to study higher cognitive functions in humans. In this review and in a companion paper in this issue, we discuss some common methods used to analyse FNI data. The emphasis in both papers is on assumptions and limitations of the methods reviewed. There are several methods available to analyse FNI data indicating that none is optimal for all purposes. In order to make optimal use of the methods available it is important to know the limits of applicability. For the interpretation of FNI results it is also important to take into account the assumptions, approximations and inherent limitations of the methods used. This paper gives a brief overview over some non-inferential descriptive methods and common statistical models used in FNI. Issues relating to the complex problem of model selection are discussed. In general, proper model selection is a necessary prerequisite for the validity of the subsequent statistical inference. The non-inferential section describes methods that, combined with inspection of parameter estimates and other simple measures, can aid in the process of model selection and verification of assumptions. The section on statistical models covers approaches to global normalization and some aspects of univariate, multivariate, and Bayesian models. Finally, approaches to functional connectivity and effective connectivity are discussed. In the companion paper we review issues related to signal detection and statistical inference. PMID:10466149

  8. Dissociation kinetics of metal clusters on multiple electronic states including electronic level statistics into the vibronic soup

    NASA Astrophysics Data System (ADS)

    Shvartsburg, Alexandre A.; Siu, K. W. Michael

    2001-06-01

    Modeling the delayed dissociation of clusters has been, over the last decade, a frontline development area in chemical physics. It is of fundamental interest how statistical kinetics methods previously validated for regular molecules and atomic nuclei may apply to clusters, as this would help to understand the transferability of statistical models for the disintegration of complex systems across various classes of physical objects. From a practical perspective, accurate simulation of unimolecular decomposition is critical for the extraction of true thermochemical values from measurements on the decay of energized clusters. Metal clusters are particularly challenging because of the multitude of low-lying electronic states that are coupled to vibrations. This has previously been accounted for by assuming the average electronic structure of a conducting cluster, approximated by the levels of an electron in a cavity. While this provides a reasonable time-averaged description, it ignores the distribution of instantaneous electronic structures in a "boiling" cluster around that average. Here we set up a new treatment that incorporates the statistical distribution of electronic levels around the average picture using random matrix theory. This approach faithfully reflects the completely chaotic "vibronic soup" nature of hot metal clusters. We found that the consideration of electronic level statistics significantly promotes electronic excitation and thus increases the magnitude of its effect. As this excitation always depresses the decay rates, the inclusion of level statistics results in slower dissociation of metal clusters.

  9. Rainfall runoff modelling of the Upper Ganga and Brahmaputra basins using PERSiST.

    PubMed

    Futter, M N; Whitehead, P G; Sarkar, S; Rodda, H; Crossman, J

    2015-06-01

    There are ongoing discussions about the appropriate level of complexity and sources of uncertainty in rainfall runoff models. Simulations for operational hydrology, flood forecasting or nutrient transport all warrant different levels of complexity in the modelling approach. More complex model structures are appropriate for simulations of land-cover dependent nutrient transport, while more parsimonious model structures may be adequate for runoff simulation. The appropriate level of complexity is also dependent on data availability. Here, we use PERSiST, a simple, semi-distributed dynamic rainfall-runoff modelling toolkit, to simulate flows in the Upper Ganges and Brahmaputra rivers. We present two sets of simulations driven by single time series of daily precipitation and temperature using simple (A) and complex (B) model structures based on uniform and hydrochemically relevant land covers, respectively. Models were compared based on ensembles of Bayesian Information Criterion (BIC) statistics. Equifinality was observed for parameters but not for model structures. Model performance was better for the more complex (B) structural representations than for the parsimonious model structures. The results show that structural uncertainty is more important than parameter uncertainty. The ensembles of BIC statistics suggested that neither structural representation was preferable in a statistical sense. Simulations presented here confirm that relatively simple models with limited data requirements can be used to credibly simulate flows and water balance components needed for nutrient flux modelling in large, data-poor basins.
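
    The BIC comparison of model structures can be illustrated on synthetic data as follows; the Gaussian-error BIC expression and the stand-in "models" (polynomials of different order) are generic illustrations, not the PERSiST structures A and B.

      import numpy as np

      def bic(observed, simulated, n_params):
          """Gaussian-error BIC (up to an additive constant) for a model with n_params parameters."""
          n = len(observed)
          rss = np.sum((observed - simulated) ** 2)
          return n * np.log(rss / n) + n_params * np.log(n)

      rng = np.random.default_rng(9)
      n = 730                                               # two years of daily values
      precip = rng.gamma(2.0, 3.0, n)
      flow = 0.6 * precip + 5.0 + rng.normal(0.0, 2.0, n)   # synthetic "observed" flow

      # Parsimonious (2-parameter) versus more complex (6-parameter) stand-in models
      sim_a = np.polyval(np.polyfit(precip, flow, 1), precip)
      sim_b = np.polyval(np.polyfit(precip, flow, 5), precip)
      print("BIC A:", round(bic(flow, sim_a, 2), 1))
      print("BIC B:", round(bic(flow, sim_b, 6), 1))        # the lower BIC is preferred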

  10. Mathematical modelling of tumour volume dynamics in response to stereotactic ablative radiotherapy for non-small cell lung cancer

    NASA Astrophysics Data System (ADS)

    Tariq, Imran; Humbert-Vidan, Laia; Chen, Tao; South, Christopher P.; Ezhil, Veni; Kirkby, Norman F.; Jena, Rajesh; Nisbet, Andrew

    2015-05-01

    This paper reports a modelling study of tumour volume dynamics in response to stereotactic ablative radiotherapy (SABR). The main objective was to develop a model that adequately describes the tumour volume changes measured during SABR while not being so complex that it lacks support from the clinical data. To this end, various modelling options were explored, and a rigorous statistical method, the Akaike information criterion, was used to help determine a trade-off between model accuracy and complexity. The models were calibrated to the data from 11 non-small cell lung cancer patients treated with SABR. The results showed that it is feasible to model the tumour volume dynamics during SABR, opening up the potential for using such models in a clinical environment in the future.

  11. Revealing physical interaction networks from statistics of collective dynamics

    PubMed Central

    Nitzan, Mor; Casadiego, Jose; Timme, Marc

    2017-01-01

    Revealing physical interactions in complex systems from observed collective dynamics constitutes a fundamental inverse problem in science. Current reconstruction methods require access to a system’s model or dynamical data at a level of detail often not available. We exploit changes in invariant measures, in particular distributions of sampled states of the system in response to driving signals, and use compressed sensing to reveal physical interaction networks. Dynamical observations following driving suffice to infer physical connectivity even if they are temporally disordered, are acquired at large sampling intervals, and stem from different experiments. Testing various nonlinear dynamic processes emerging on artificial and real network topologies indicates high reconstruction quality for existence as well as type of interactions. These results advance our ability to reveal physical interaction networks in complex synthetic and natural systems. PMID:28246630
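
    The sparse-reconstruction idea can be caricatured with a linear driven system and an L1-penalized (Lasso) regression, as in the sketch below. This is a schematic stand-in for the compressed-sensing procedure of the paper: the dynamics, sample sizes and thresholds are invented for illustration.

      import numpy as np
      from sklearn.linear_model import Lasso

      rng = np.random.default_rng(12)
      n_nodes, n_samples = 20, 100

      # Sparse ground-truth coupling matrix (roughly 10% of possible links)
      A = np.zeros((n_nodes, n_nodes))
      mask = rng.random((n_nodes, n_nodes)) < 0.1
      A[mask] = rng.normal(0.0, 0.4, mask.sum())
      np.fill_diagonal(A, 0.0)

      # Driving-response samples: steady responses x = A x + u, so x = (I - A)^(-1) u
      U = rng.normal(0.0, 1.0, (n_samples, n_nodes))
      X = U @ np.linalg.inv(np.eye(n_nodes) - A).T

      # Recover each node's incoming links by sparse regression of (x_i - u_i) on the other states
      A_hat = np.zeros_like(A)
      for i in range(n_nodes):
          predictors = X.copy()
          predictors[:, i] = 0.0                             # exclude self-coupling
          lasso = Lasso(alpha=0.02, fit_intercept=False, max_iter=10000)
          A_hat[i] = lasso.fit(predictors, X[:, i] - U[:, i]).coef_

      true_links = np.abs(A) > 1e-9
      found_links = np.abs(A_hat) > 0.05
      print("true links recovered:", int(np.sum(true_links & found_links)), "of", int(true_links.sum()))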

  12. Artificial Intelligence Methods: Choice of algorithms, their complexity, and appropriateness within the context of hydrology and water resources. (Invited)

    NASA Astrophysics Data System (ADS)

    Bastidas, L. A.; Pande, S.

    2009-12-01

    Pattern analysis deals with the automatic detection of patterns in the data and there are a variety of algorithms available for the purpose. These algorithms are commonly called Artificial Intelligence (AI) or data driven algorithms, and have been applied lately to a variety of problems in hydrology and are becoming extremely popular. When confronting such a range of algorithms, the question of which one is the “best” arises. Some algorithms may be preferred because of the lower computational complexity; others take into account prior knowledge of the form and the amount of the data; others are chosen based on a version of the Occam’s razor principle that a simple classifier performs better. Popper has argued, however, that Occam’s razor is without operational value because there is no clear measure or criterion for simplicity. An example of measures that can be used for this purpose are: the so called algorithmic complexity - also known as Kolmogorov complexity or Kolmogorov (algorithmic) entropy; the Bayesian information criterion; or the Vapnik-Chervonenkis dimension. On the other hand, the No Free Lunch Theorem states that there is no best general algorithm, and that specific algorithms are superior only for specific problems. It should be noted also that the appropriate algorithm and the appropriate complexity are constrained by the finiteness of the available data and the uncertainties associated with it. Thus, there is compromise between the complexity of the algorithm, the data properties, and the robustness of the predictions. We discuss the above topics; briefly review the historical development of applications with particular emphasis on statistical learning theory (SLT), also known as machine learning (ML) of which support vector machines and relevant vector machines are the most commonly known algorithms. We present some applications of such algorithms for distributed hydrologic modeling; and introduce an example of how the complexity measure can be applied for appropriate model choice within the context of applications in hydrologic modeling intended for use in studies about water resources and water resources management and their direct relation to extreme conditions or natural hazards.

  13. Testing alternative ground water models using cross-validation and other methods

    USGS Publications Warehouse

    Foglia, L.; Mehl, S.W.; Hill, M.C.; Perona, P.; Burlando, P.

    2007-01-01

    Many methods can be used to test alternative ground water models. Of concern in this work are methods able to (1) rank alternative models (also called model discrimination) and (2) identify observations important to parameter estimates and predictions (equivalent to the purpose served by some types of sensitivity analysis). Some of the measures investigated are computationally efficient; others are computationally demanding. The latter are generally needed to account for model nonlinearity. The efficient model discrimination methods investigated include the information criteria: the corrected Akaike information criterion, Bayesian information criterion, and generalized cross-validation. The efficient sensitivity analysis measures used are dimensionless scaled sensitivity (DSS), composite scaled sensitivity, and parameter correlation coefficient (PCC); the other statistics are DFBETAS, Cook's D, and observation-prediction statistic. Acronyms are explained in the introduction. Cross-validation (CV) is a computationally intensive nonlinear method that is used for both model discrimination and sensitivity analysis. The methods are tested using up to five alternative parsimoniously constructed models of the ground water system of the Maggia Valley in southern Switzerland. The alternative models differ in their representation of hydraulic conductivity. A new method for graphically representing CV and sensitivity analysis results for complex models is presented and used to evaluate the utility of the efficient statistics. The results indicate that for model selection, the information criteria produce similar results at much smaller computational cost than CV. For identifying important observations, the only obviously inferior linear measure is DSS; the poor performance was expected because DSS does not include the effects of parameter correlation and PCC reveals large parameter correlations. © 2007 National Ground Water Association.
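
    A minimal sketch of cross-validation used for model discrimination is given below; the synthetic data and the polynomial "alternative models" of increasing complexity are only stand-ins for the hydraulic-conductivity parameterisations compared in the study.

    ```python
    import numpy as np
    from sklearn.model_selection import KFold, cross_val_score
    from sklearn.linear_model import LinearRegression
    from sklearn.preprocessing import PolynomialFeatures
    from sklearn.pipeline import make_pipeline

    # Hypothetical head observations along a transect
    rng = np.random.default_rng(6)
    x = np.sort(rng.uniform(0, 10, 60)).reshape(-1, 1)
    h = 100 - 2.0 * x.ravel() + 0.15 * x.ravel() ** 2 + rng.normal(0, 0.5, 60)

    # Alternative models differing in complexity (number of fitted parameters)
    cv = KFold(n_splits=5, shuffle=True, random_state=0)
    for degree in (1, 2, 5):
        model = make_pipeline(PolynomialFeatures(degree), LinearRegression())
        scores = cross_val_score(model, x, h, cv=cv, scoring="neg_mean_squared_error")
        print(f"degree {degree}: CV mean squared error = {-scores.mean():.3f}")
    ```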

  14. [Influence of the duration of recording in the reproducibility of the signal averaged electrocardiogram].

    PubMed

    Copie, X; Blankoff, I; Hnatkova, K; Fei, L; Camm, A J; Malik, M

    1996-06-01

    The authors studied the possibility of improving the reproducibility of the signal averaged ECG by increasing the number of averaged QRS complexes. One hundred patients were included in the study. In each case, 400 QRS complexes were recorded twice, consecutively, under strictly identical conditions. During each recording, the total duration of the amplified and averaged QRS complex (tQRS), the duration of the terminal signal below 40 microV (LAS) and the root mean square of the amplitude of the last 40 ms (RMS) were determined for 100, 200, 300 and 400 recorded QRS complexes. The presence of late potentials was defined as the positivity of two of the following criteria: tQRS > 114 ms, LAS > 38 ms, RMS < 20 microV. The number of contradictory diagnostic conclusions between two successive recordings of the same duration decreased progressively with the number of averaged QRS complexes: 10 for 100 QRS, 10 for 200 QRS, 9 for 300 QRS and 6 for 400 QRS complexes, but this improvement was not statistically significant. The absolute differences of tQRS and RMS between two successive recordings of the same duration were statistically different for the four durations of recording (p = 0.05) and there was a tendency towards statistical significance for LAS (p = 0.09). The best quantitative reproducibility of the 3 parameters was obtained with the recording of 300 QRS complexes. In conclusion, the reproducibility of the signal averaged ECG is improved when the number of averaged QRS complexes is increased. The authors' results suggest that reproducibility is optimal with the amplification and averaging of 300 QRS complexes.

  15. The efficacy of two oral hygiene regimens in reducing oral malodour: a randomised clinical trial.

    PubMed

    Feres, Magda; Figueiredo, Luciene Cristina; Faveri, Marcelo; Guerra, Marcelo C; Mateo, Luis R; Stewart, Bernal; Williams, Malcolm; Panagakos, Foti

    2015-12-01

    This study compared the efficacy of two oral hygiene regimens in reducing oral malodour and the proportions of bacterial species involved in the production of volatile sulphur compounds. Seventy subjects who participated in a halitosis-induction phase and achieved an organoleptic score of ≥ 3.0 [time point 0 (T0)] were randomised into two groups: brushing with regular fluoride toothpaste alone (control group) or brushing with regular fluoride toothpaste followed by rinsing with a 0.075% cetylpyridinium chloride (CPC) mouthwash (CPC group). Subjects followed their assigned oral hygiene regimen for 21 days. They then underwent an organoleptic examination and measurement of volatile sulphur compounds (VSCs) using a portable gas chromatograph, 12 hours after their last oral hygiene procedure (T1) and 4 hours after an on-site oral hygiene procedure (T2). Microbiological samples (supragingival biofilm, tongue coating and saliva) were analysed using checkerboard DNA-DNA hybridisation. Both therapies statistically significantly improved the organoleptic scores (P < 0.05), but the VSC levels and/or concentrations were reduced only in the CPC group (P < 0.05). In subjects rinsing with CPC, oral malodour scores were reduced by 49% at the 4-hour assessment (T2) compared with those not rinsing (P < 0.05). Red-complex pathogens were reduced more effectively in the CPC group than in the control group. Brushing followed by rinsing with a 0.075% CPC mouthwash provided statistically significantly greater reductions in oral malodour, measured organoleptically and instrumentally, and in the proportions of red-complex species when compared with brushing alone. © 2015 FDI World Dental Federation.

  16. [Effects of natural factors of Niska Banja spa on indexes of mobility of vertebral column in patients with ankylosing spondylitis].

    PubMed

    Nedović, Jovan; Stamenković, Bojana; Stojanović, Sonja; Stanković, Aleksandra; Dimić, Aleksandar

    2009-01-01

    Ankylosing spondylitis (AS) is a disease from the group of seronegative spondyloarthropathies with a prevalence of 0.1%, affecting mainly young males, which also gives sociomedical significance to the disease. Among all inflammatory arthropathies, AS is the most suitable for balneotherapy. Thermomineral water of the Niska Banja spa is homeothermic, oligomineral, alkaline, low radioactive radon water and also, in conjunction with mineral peloid, is considered to be optimal for this indication. Our aim was to investigate the effects of natural factors of the Niska Banja spa as a part of complex treatment on the indexes of mobility of the vertebral column in patients with AS. The study enrolled 40 patients with an average age of 48.0 +/- 14.82 years and an average duration of disease of 16.9 +/- 6.42 years. Patients were treated with hydro- and peloidotherapy for an average of 17.23 +/- 2.71 days. At the beginning and at the end of treatment, a number of indexes of spinal mobility were measured. The statistical significance of differences was calculated using the Student's t-test. All of the measured indexes were better after balneotherapy, reaching statistically significant differences for the wall-to-occiput distance (p < 0.05) and the index of sagittal mobility of the cervical (p < 0.05) and lumbar (p < 0.005) spine. The application of natural factors of the Niska Banja spa during complex treatment of patients with AS is accompanied by an objective increase in spinal mobility.

  17. Uncovering beat deafness: detecting rhythm disorders with synchronized finger tapping and perceptual timing tasks.

    PubMed

    Dalla Bella, Simone; Sowiński, Jakub

    2015-03-16

    A set of behavioral tasks for assessing perceptual and sensorimotor timing abilities in the general population (i.e., non-musicians) is presented here with the goal of uncovering rhythm disorders, such as beat deafness. Beat deafness is characterized by poor performance in perceiving durations in auditory rhythmic patterns or poor synchronization of movement with auditory rhythms (e.g., with musical beats). These tasks include the synchronization of finger tapping to the beat of simple and complex auditory stimuli and the detection of rhythmic irregularities (anisochrony detection task) embedded in the same stimuli. These tests, which are easy to administer, include an assessment of both perceptual and sensorimotor timing abilities under different conditions (e.g., beat rates and types of auditory material) and are based on the same auditory stimuli, ranging from a simple metronome to a complex musical excerpt. The analysis of synchronized tapping data is performed with circular statistics, which provide reliable measures of synchronization accuracy (e.g., the difference between the timing of the taps and the timing of the pacing stimuli) and consistency. Circular statistics on tapping data are particularly well-suited for detecting individual differences in the general population. Synchronized tapping and anisochrony detection are sensitive measures for identifying profiles of rhythm disorders and have been used with success to uncover cases of poor synchronization with spared perceptual timing. This systematic assessment of perceptual and sensorimotor timing can be extended to populations of patients with brain damage, neurodegenerative diseases (e.g., Parkinson's disease), and developmental disorders (e.g., Attention Deficit Hyperactivity Disorder).
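
    The circular statistics mentioned above can be sketched in a few lines: tap asynchronies are mapped to phases on the beat cycle, and the mean resultant vector yields synchronization accuracy (mean angle) and consistency (vector length). The asynchrony values and beat period below are hypothetical.

    ```python
    import numpy as np

    def circular_stats(asynchronies_ms, period_ms):
        """Circular mean asynchrony and resultant vector length (consistency)
        of tap-to-beat asynchronies expressed as phases on the beat cycle."""
        phases = 2 * np.pi * np.asarray(asynchronies_ms, dtype=float) / period_ms
        r_vec = np.mean(np.exp(1j * phases))
        mean_async_ms = np.angle(r_vec) * period_ms / (2 * np.pi)  # back to ms
        return mean_async_ms, np.abs(r_vec)

    # Hypothetical tap asynchronies (ms) relative to a 600 ms metronome beat
    taps = [-25, -18, -30, -12, -22, -28, -15, -20]
    mean_async, consistency = circular_stats(taps, period_ms=600)
    print(f"mean asynchrony = {mean_async:.1f} ms, consistency R = {consistency:.2f}")
    ```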

  18. Shedding Light on the Etiology of Sports Injuries: A Look Behind the Scenes of Time-to-Event Analyses.

    PubMed

    Nielsen, Rasmus Østergaard; Malisoux, Laurent; Møller, Merete; Theisen, Daniel; Parner, Erik Thorlund

    2016-04-01

    The etiological mechanism underpinning any sports-related injury is complex and multifactorial. Frequently, athletes perceive "excessive training" as the principal factor in their injury, an observation that is biologically plausible yet somewhat ambiguous. If the applied training load is suddenly increased, this may increase the risk for sports injury development, irrespective of the absolute amount of training. Indeed, little to no rigorous scientific evidence exists to support the hypothesis that fluctuations in training load, compared to absolute training load, are more important in explaining sports injury development. One reason for this could be that prospective data from scientific studies should be analyzed in a different manner. Time-to-event analysis is a useful statistical tool with which to analyze the influence of changing exposures on injury risk. However, the potential of time-to-event analysis remains insufficiently exploited in sports injury research. Therefore, the purpose of the present article was to present and discuss measures of association used in time-to-event analyses and to present the advanced concept of time-varying exposures and outcomes. In the paper, different measures of association, such as cumulative relative risk, cumulative risk difference, and the classical hazard rate ratio, are presented in a nontechnical manner, and suggestions for interpretation of study results are provided. To summarize, time-to-event analysis complements the statistical arsenal of sports injury prevention researchers, because it enables them to analyze the complex and highly dynamic reality of injury etiology, injury recurrence, and time to recovery across a range of sporting contexts.

  19. Detecting transitions in protein dynamics using a recurrence quantification analysis based bootstrap method.

    PubMed

    Karain, Wael I

    2017-11-28

    Proteins undergo conformational transitions over different time scales. These transitions are closely intertwined with the protein's function. Numerous standard techniques such as principal component analysis are used to detect these transitions in molecular dynamics simulations. In this work, we add a new method that has the ability to detect transitions in dynamics based on the recurrences in the dynamical system. It combines bootstrapping and recurrence quantification analysis. We start from the assumption that a protein has a "baseline" recurrence structure over a given period of time. Any statistically significant deviation from this recurrence structure, as inferred from complexity measures provided by recurrence quantification analysis, is considered a transition in the dynamics of the protein. We apply this technique to a 132 ns long molecular dynamics simulation of the β-Lactamase Inhibitory Protein BLIP. We are able to detect conformational transitions in the nanosecond range in the recurrence dynamics of the BLIP protein during the simulation. The results compare favorably to those extracted using the principal component analysis technique. The recurrence quantification analysis based bootstrap technique is able to detect transitions between different dynamical states of a protein over different time scales. It is not limited to linear dynamics regimes, and can be generalized to any time scale. It also has the potential to be used to cluster frames in molecular dynamics trajectories according to the nature of their recurrence dynamics. One shortcoming of this method is the need to have large enough time windows to ensure good statistical quality for the recurrence complexity measures needed to detect the transitions.
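
    A minimal sketch of the underlying recurrence idea, assuming a one-dimensional series and a fixed distance threshold, is shown below: the recurrence rate is computed in successive windows, and a marked change between windows would flag a candidate transition. This is not the paper's bootstrap procedure, and the trajectory is a toy signal rather than molecular dynamics data.

    ```python
    import numpy as np

    def recurrence_rate(x, eps):
        """Fraction of off-diagonal pairs of states closer than eps (recurrence rate)."""
        x = np.asarray(x, dtype=float)
        d = np.abs(x[:, None] - x[None, :])       # pairwise distances
        r = (d < eps).astype(int)
        n = r.shape[0]
        return (r.sum() - n) / (n * (n - 1))      # exclude the main diagonal

    # Toy trajectory whose dynamics change halfway through
    rng = np.random.default_rng(2)
    x = np.concatenate([np.sin(0.3 * np.arange(200)),
                        np.sin(0.3 * np.arange(200)) + rng.normal(0, 0.5, 200)])

    # Windowed recurrence rate; a clear drop marks the change in dynamics
    for start in range(0, 400, 100):
        print(start, round(recurrence_rate(x[start:start + 100], eps=0.2), 3))
    ```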

  20. EFFECT OF INTRAVITREAL RANIBIZUMAB ON GANGLION CELL COMPLEX AND PERIPAPILLARY RETINAL NERVE FIBER LAYER IN NEOVASCULAR AGE-RELATED MACULAR DEGENERATION USING SPECTRAL DOMAIN OPTICAL COHERENCE TOMOGRAPHY.

    PubMed

    Zucchiatti, Ilaria; Cicinelli, Maria V; Parodi, Maurizio Battaglia; Pierro, Luisa; Gagliardi, Marco; Accardo, Agostino; Bandello, Francesco

    2017-07-01

    To analyze the changes in ganglion cell complex and peripapillary retinal nerve fiber layer thickness, in central macular thickness and in choroidal thickness on spectral domain optical coherence tomography in patients with neovascular age-related macular degeneration treated with intravitreal ranibizumab injections. All consecutive patients with untreated neovascular age-related macular degeneration received a loading phase of three monthly intravitreal ranibizumab injections, followed by retreatments on a pro re nata protocol for 12 months. Primary outcome: changes in ganglion cell complex and retinal nerve fiber layer thickness at the end of follow-up. Secondary outcome: changes in best-corrected visual acuity, central macular thickness, and choroidal thickness at the end of follow-up. Choroidal thickness was measured at 500 μm, 1,000 μm, and 1,500 μm intervals nasally, temporally, superiorly, and inferiorly to the fovea, respectively, on horizontal and vertical line scans centered on the fovea. Twenty-four eyes were included. Ganglion cell complex and peripapillary retinal nerve fiber layer thickness did not show statistically significant changes through 12 months (55.6 ± 18.5 and 81.9 ± 9.9 μm at baseline, 52.7 ± 19.3 and 84.6 ± 15.5 μm at month 12, P > 0.05). Central macular thickness showed a progressive decrease from baseline to month 12, with maximum reduction at month 3 (P < 0.001). A statistically significant reduction in choroidal thickness was registered in the nasal 500, 1,000, and 1,500 μm from the fovea, corresponding to the papillomacular region (from 169.6 ± 45.3 to 153.9 ± 46.9, P < 0.001). Intravitreal ranibizumab injections did not affect retinal nerve fiber layer and ganglion cell complex thickness over the 1-year follow-up. Choroidal thickness in the papillomacular area and central macular thickness were significantly reduced at the end of treatment. Further studies, with a larger sample, longer follow-up, and a greater number of injections, are warranted.

  1. Measurements of thermal updraft intensity over complex terrain using American white pelicans and a simple boundary-layer forecast model

    USGS Publications Warehouse

    Shannon, H.D.; Young, G.S.; Yates, M.; Fuller, Mark R.; Seegar, W.

    2003-01-01

    An examination of boundary-layer meteorological and avian aerodynamic theories suggests that soaring birds can be used to measure the magnitude of vertical air motions within the boundary layer. These theories are applied to obtain mixed-layer normalized thermal updraft intensity over both flat and complex terrain from the climb rates of soaring American white pelicans and from diagnostic boundary-layer model-produced estimates of the boundary-layer depth zi and the convective velocity scale w*. Comparison of the flatland data with the profiles of normalized updraft velocity obtained from previous studies reveals that the pelican-derived measurements of thermal updraft intensity are in close agreement with those obtained using traditional research aircraft and large eddy simulation (LES) in the height range of 0.2 to 0.8 zi. Given the success of this method, the profiles of thermal vertical velocity over the flatland and the nearby mountains are compared. This comparison shows that these profiles are statistically indistinguishable over this height range, indicating that the profile for thermal updraft intensity varies little over this sample of complex terrain. These observations support the findings of a recent LES study that explored the turbulent structure of the boundary layer using a range of terrain specifications. For terrain similar in scale to that encountered in this study, results of the LES suggest that the terrain caused less than an 11% variation in the standard deviation of vertical velocity.

  2. Structure and Randomness of Continuous-Time, Discrete-Event Processes

    NASA Astrophysics Data System (ADS)

    Marzen, Sarah E.; Crutchfield, James P.

    2017-10-01

    Loosely speaking, the Shannon entropy rate is used to gauge a stochastic process' intrinsic randomness; the statistical complexity gives the cost of predicting the process. We calculate, for the first time, the entropy rate and statistical complexity of stochastic processes generated by finite unifilar hidden semi-Markov models—memoryful, state-dependent versions of renewal processes. Calculating these quantities requires introducing novel mathematical objects (ε-machines of hidden semi-Markov processes) and new information-theoretic methods to stochastic processes.

  3. Asymptotic Distribution of the Likelihood Ratio Test Statistic for Sphericity of Complex Multivariate Normal Distribution.

    DTIC Science & Technology

    1981-08-01

    RATIO TEST STATISTIC FOR SPHERICITY OF COMPLEX MULTIVARIATE NORMAL DISTRIBUTION* C. Fang, P. R. Krishnaiah, B. N. Nagarsenker** August 1981 Technical... and their applications in time series, the reader is referred to Krishnaiah (1976). Motivated by the applications in the area of inference on multiple... for practical purposes. Here, we note that Krishnaiah, Lee and Chang (1976) approximated the null distribution of certain power of the likeli

  4. Statistical complex fatigue data for SAE 4340 steel and its use in design by reliability

    NASA Technical Reports Server (NTRS)

    Kececioglu, D.; Smith, J. L.

    1970-01-01

    A brief description of the complex fatigue machines used in the test program is presented. The data generated from these machines are given and discussed. Two methods of obtaining strength distributions from the data are also discussed. Then follows a discussion of the construction of statistical fatigue diagrams and their use in designing by reliability. Finally, some of the problems encountered in the test equipment and a corrective modification are presented.

  5. Evidence of non-extensivity and complexity in the seismicity observed during 2011-2012 at the Santorini volcanic complex, Greece

    NASA Astrophysics Data System (ADS)

    Vallianatos, F.; Tzanis, A.; Michas, G.; Papadakis, G.

    2012-04-01

    Since the middle of summer 2011, an increase in the seismicity rates of the volcanic complex system of Santorini Island, Greece, was observed. In the present work, the temporal distribution of seismicity, as well as the magnitude distribution of earthquakes, have been studied using the concept of Non-Extensive Statistical Physics (NESP; Tsallis, 2009) along with the evolution of the Shannon entropy H (also called information entropy). The analysis is based on the earthquake catalogue of the Geodynamic Institute of the National Observatory of Athens for the period July 2011-January 2012 (http://www.gein.noa.gr/). Non-Extensive Statistical Physics, which is a generalization of Boltzmann-Gibbs statistical physics, seems a suitable framework for studying complex systems. The observed distributions of seismicity rates at Santorini can be described (fitted) exceptionally well with NESP models. This implies the inherent complexity of the Santorini volcanic seismicity, the applicability of NESP concepts to volcanic earthquake activity and the usefulness of NESP in investigating phenomena exhibiting multifractality and long-range coupling effects. Acknowledgments. This work was supported in part by the THALES Program of the Ministry of Education of Greece and the European Union in the framework of the project entitled "Integrated understanding of Seismicity, using innovative Methodologies of Fracture mechanics along with Earthquake and non extensive statistical physics - Application to the geodynamic system of the Hellenic Arc. SEISMO FEAR HELLARC". GM and GP wish to acknowledge the partial support of the Greek State Scholarships Foundation (ΙΚΥ).

  6. Broadband classification and statistics of echoes from aggregations of fish measured by long-range, mid-frequency sonar.

    PubMed

    Jones, Benjamin A; Stanton, Timothy K; Colosi, John A; Gauss, Roger C; Fialkowski, Joseph M; Michael Jech, J

    2017-06-01

    For horizontal-looking sonar systems operating at mid-frequencies (1-10 kHz), scattering by fish with resonant gas-filled swimbladders can dominate seafloor and surface reverberation at long ranges (i.e., distances much greater than the water depth). This source of scattering, which can be difficult to distinguish from other sources of scattering in the water column or at the boundaries, can add spatio-temporal variability to an already complex acoustic record. Sparsely distributed, spatially compact fish aggregations were measured in the Gulf of Maine using a long-range broadband sonar with continuous spectral coverage from 1.5 to 5 kHz. Observed echoes that are at least 15 decibels above background levels in the horizontal-looking sonar data are classified spectrally, by their resonance features, as due to swimbladder-bearing fish. Contemporaneous multi-frequency echosounder measurements (18, 38, and 120 kHz) and net samples are used in conjunction with physics-based acoustic models to validate this approach. Furthermore, the fish aggregations are statistically characterized in the long-range data by highly non-Rayleigh distributions of the echo magnitudes. These distributions are accurately predicted by a computationally efficient, physics-based model. The model accounts for beam-pattern and waveguide effects as well as the scattering response of aggregations of fish.

  7. Characterisation of turbulence downstream of a linear compressor cascade

    NASA Astrophysics Data System (ADS)

    di Mare, Luca; Jelly, Thomas; Day, Ivor

    2014-11-01

    Characterisation of turbulence in turbomachinery remains one of the most complex tasks in fluid mechanics. In addition, current closure models required for Reynolds-averaged Navier-Stokes computations do not accurately represent the action of turbulent forces against the mean flow. Therefore, the statistical properties of turbulence in turbomachinery are of significant interest. In the current work, single- and two-point hot-wire measurements have been acquired downstream of a linear compressor cascade in order to examine the properties of large-scale turbulent structures and to assess how they affect turbulent momentum and energy transfer in compressor passages. The cascade has seven controlled-diffusion blades, which are representative of high-pressure stator blades found in turbofan engines. Blade chord, thickness and camber are 0.1515 m, 9.3% and 42 degrees, respectively. Measurements were acquired at a chord Reynolds number of 6.92 × 10⁵. Single-point statistics highlight differences in turbulence structure when comparing mid-span and end-wall regions. Evaluation of two-point correlations and their corresponding spectra reveals the length-scales of the energy-bearing eddies in the cascade. Ultimately, these measurements can be used to calibrate future computational models. The authors gratefully acknowledge Rolls-Royce plc for funding this work and granting permission for its publication.

  8. Discovering functional interdependence relationship in PPI networks for protein complex identification.

    PubMed

    Lam, Winnie W M; Chan, Keith C C

    2012-04-01

    Protein molecules interact with each other in protein complexes to perform many vital functions, and different computational techniques have been developed to identify protein complexes in protein-protein interaction (PPI) networks. These techniques are developed to search for subgraphs of high connectivity in PPI networks under the assumption that the proteins in a protein complex are highly interconnected. While these techniques have been shown to be quite effective, the matching rate between the protein complexes they discover and those previously determined experimentally can be relatively low, and the "false-alarm" rate can be relatively high. This is especially the case when the assumption that proteins in protein complexes are more highly interconnected does not hold. To increase the matching rate and reduce the false-alarm rate, we have developed a technique that can work effectively without having to make this assumption. The technique, called protein complex identification by discovering functional interdependence (PCIFI), searches for protein complexes in PPI networks by taking into consideration both the functional interdependence relationship between protein molecules and the topology of the network. PCIFI works in several steps. The first step is to construct a multiple-function protein network graph by labeling each vertex with one or more of the molecular functions it performs. The second step is to filter out protein interactions between protein pairs that are not functionally interdependent of each other in the statistical sense. The third step is to make use of an information-theoretic measure to determine the strength of the functional interdependence between all remaining interacting protein pairs. Finally, the last step is to try to form protein complexes based on the measure of the strength of functional interdependence and the connectivity between proteins. For performance evaluation, PCIFI was used to identify protein complexes in real PPI network data, and the protein complexes it found were matched against those previously known in MIPS. The results show that PCIFI can be an effective technique for the identification of protein complexes. The protein complexes it found match more known protein complexes with a smaller false-alarm rate and can provide useful insights into the understanding of the functional interdependence relationships between proteins in protein complexes.
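
    An information-theoretic interdependence measure of the kind used in the third step can be sketched under the simplifying assumption that each protein's co-purification profile is a binary presence/absence vector across pull-down experiments. The profiles below are hypothetical, and this is not the exact PCIFI statistic.

    ```python
    import numpy as np
    from math import log2

    def mutual_information(a, b):
        """Mutual information (bits) between two binary co-purification profiles."""
        a, b = np.asarray(a), np.asarray(b)
        mi = 0.0
        for va in (0, 1):
            for vb in (0, 1):
                p_ab = np.mean((a == va) & (b == vb))
                p_a, p_b = np.mean(a == va), np.mean(b == vb)
                if p_ab > 0:
                    mi += p_ab * log2(p_ab / (p_a * p_b))
        return mi

    # Hypothetical presence/absence of two proteins across 12 pull-down experiments
    prot_x = [1, 1, 0, 1, 0, 1, 1, 0, 1, 0, 1, 1]
    prot_y = [1, 1, 0, 1, 0, 1, 0, 0, 1, 0, 1, 1]
    print(round(mutual_information(prot_x, prot_y), 3), "bits")
    ```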

  9. graph-GPA: A graphical model for prioritizing GWAS results and investigating pleiotropic architecture.

    PubMed

    Chung, Dongjun; Kim, Hang J; Zhao, Hongyu

    2017-02-01

    Genome-wide association studies (GWAS) have identified tens of thousands of genetic variants associated with hundreds of phenotypes and diseases, which have provided clinical and medical benefits to patients with novel biomarkers and therapeutic targets. However, identification of risk variants associated with complex diseases remains challenging as they are often affected by many genetic variants with small or moderate effects. There has been accumulating evidence suggesting that different complex traits share a common risk basis, namely pleiotropy. Recently, several statistical methods have been developed to improve statistical power to identify risk variants for complex traits through a joint analysis of multiple GWAS datasets by leveraging pleiotropy. While these methods were shown to improve statistical power for association mapping compared to separate analyses, they are still limited in the number of phenotypes that can be integrated. In order to address this challenge, in this paper, we propose a novel statistical framework, graph-GPA, to integrate a large number of GWAS datasets for multiple phenotypes using a hidden Markov random field approach. Application of graph-GPA to a joint analysis of GWAS datasets for 12 phenotypes shows that graph-GPA improves statistical power to identify risk variants compared to statistical methods based on a smaller number of GWAS datasets. In addition, graph-GPA also promotes better understanding of genetic mechanisms shared among phenotypes, which can potentially be useful for the development of improved diagnosis and therapeutics. The R implementation of graph-GPA is currently available at https://dongjunchung.github.io/GGPA/.

  10. Can’t Count or Won’t Count? Embedding Quantitative Methods in Substantive Sociology Curricula: A Quasi-Experiment

    PubMed Central

    Williams, Malcolm; Sloan, Luke; Cheung, Sin Yi; Sutton, Carole; Stevens, Sebastian; Runham, Libby

    2015-01-01

    This paper reports on a quasi-experiment in which quantitative methods (QM) are embedded within a substantive sociology module. Through measuring student attitudes before and after the intervention alongside control group comparisons, we illustrate the impact that embedding has on the student experience. Our findings are complex and even contradictory. Whilst the experimental group were less likely to be distrustful of statistics and appreciate how QM inform social research, they were also less confident about their statistical abilities, suggesting that through ‘doing’ quantitative sociology the experimental group are exposed to the intricacies of method and their optimism about their own abilities is challenged. We conclude that embedding QM in a single substantive module is not a ‘magic bullet’ and that a wider programme of content and assessment diversification across the curriculum is preferential. PMID:27330225

  11. Shielding Effectiveness in a Two-Dimensional Reverberation Chamber Using Finite-Element Techniques

    NASA Technical Reports Server (NTRS)

    Bunting, Charles F.

    2006-01-01

    Reverberation chambers are attaining an increased importance in determination of electromagnetic susceptibility of avionics equipment. Given the nature of the variable boundary condition, the ability of a given source to couple energy into certain modes and the passband characteristic due to the chamber Q, the fields are typically characterized by statistical means. The emphasis of this work is to apply finite-element techniques at cutoff to the analysis of a two-dimensional structure to examine the notion of shielding-effectiveness issues in a reverberating environment. Simulated mechanical stirring will be used to obtain the appropriate statistical field distribution. The shielding effectiveness (SE) in a simulated reverberating environment is compared to measurements in a reverberation chamber. A log-normal distribution for the SE is observed with implications for system designers. The work is intended to provide further refinement in the consideration of SE in a complex electromagnetic environment.

  12. Complex polarization-phase and spatial-frequency selections of laser images of blood-plasma films in diagnostics of changes in their polycrystalline structure

    NASA Astrophysics Data System (ADS)

    Ushenko, Yu. A.; Angelskii, P. O.; Dubolazov, A. V.; Karachevtsev, A. O.; Sidor, M. I.; Mintser, O. P.; Oleinichenko, B. P.; Bizer, L. I.

    2013-10-01

    We present a theoretical formalism of correlation phase analysis of laser images of human blood plasma with spatial-frequency selection of manifestations of mechanisms of linear and circular birefringence of albumin and globulin polycrystalline networks. Comparative results of the measurement of coordinate distributions of the correlation parameter—the modulus of the degree of local correlation of amplitudes—of laser images of blood plasma taken from patients of three groups—healthy patients (donors), rheumatoid-arthritis patients, and breast-cancer patients—are presented. We investigate values and ranges of change of statistical (the first to fourth statistical moments), correlation (excess of autocorrelation functions), and fractal (slopes of approximating curves and dispersion of extrema of logarithmic dependences of power spectra) parameters of coordinate distributions of the degree of local correlation of amplitudes. Objective criteria for diagnostics of occurrence and differentiation of inflammatory and oncological states are determined.

  13. Temperature- and composition-dependent hydrogen diffusivity in palladium from statistically-averaged molecular dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhou, Xiaowang; Heo, Tae Wook; Wood, Brandon C.

    Solid-state hydrogen storage materials undergo complex phase transformations whose kinetics is often limited by hydrogen diffusion. Among metal hydrides, palladium hydride undergoes a diffusional phase transformation upon hydrogen uptake, during which the hydrogen diffusivity varies with hydrogen composition and temperature. Here we perform robust statistically-averaged molecular dynamics simulations to obtain a well-converged analytical expression for hydrogen diffusivity in bulk palladium that is valid throughout all stages of the reaction. Our studies confirm significant dependence of the diffusivity on composition and temperature that elucidate key trends in the available experimental measurements. Whereas at low hydrogen compositions a single process dominates, at high hydrogen compositions diffusion is found to exhibit behavior consistent with multiple hopping barriers. Further analysis, supported by nudged elastic band computations, suggests that the multi-barrier diffusion can be interpreted as two distinct mechanisms corresponding to hydrogen-rich and hydrogen-poor local environments.

  14. Enhanced Higgs boson to τ(+)τ(-) search with deep learning.

    PubMed

    Baldi, P; Sadowski, P; Whiteson, D

    2015-03-20

    The Higgs boson is thought to provide the interaction that imparts mass to the fundamental fermions, but while measurements at the Large Hadron Collider (LHC) are consistent with this hypothesis, current analysis techniques lack the statistical power to cross the traditional 5σ significance barrier without more data. Deep learning techniques have the potential to increase the statistical power of this analysis by automatically learning complex, high-level data representations. In this work, deep neural networks are used to detect the decay of the Higgs boson to a pair of tau leptons. A Bayesian optimization algorithm is used to tune the network architecture and training algorithm hyperparameters, resulting in a deep network of eight nonlinear processing layers that improves upon the performance of shallow classifiers even without the use of features specifically engineered by physicists for this application. The improvement in discovery significance is equivalent to an increase in the accumulated data set of 25%.

  15. Temperature- and composition-dependent hydrogen diffusivity in palladium from statistically-averaged molecular dynamics

    DOE PAGES

    Zhou, Xiaowang; Heo, Tae Wook; Wood, Brandon C.; ...

    2018-03-09

    Solid-state hydrogen storage materials undergo complex phase transformations whose kinetics is often limited by hydrogen diffusion. Among metal hydrides, palladium hydride undergoes a diffusional phase transformation upon hydrogen uptake, during which the hydrogen diffusivity varies with hydrogen composition and temperature. Here we perform robust statistically-averaged molecular dynamics simulations to obtain a well-converged analytical expression for hydrogen diffusivity in bulk palladium that is valid throughout all stages of the reaction. Our studies confirm significant dependence of the diffusivity on composition and temperature that elucidate key trends in the available experimental measurements. Whereas at low hydrogen compositions a single process dominates, at high hydrogen compositions diffusion is found to exhibit behavior consistent with multiple hopping barriers. Further analysis, supported by nudged elastic band computations, suggests that the multi-barrier diffusion can be interpreted as two distinct mechanisms corresponding to hydrogen-rich and hydrogen-poor local environments.

  16. Focused Belief Measures for Uncertainty Quantification in High Performance Semantic Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joslyn, Cliff A.; Weaver, Jesse R.

    In web-scale semantic data analytics there is a great need for methods which aggregate uncertainty claims, on the one hand respecting the information provided as accurately as possible, while on the other still being tractable. Traditional statistical methods are more robust, but only represent distributional, additive uncertainty. Generalized information theory methods, including fuzzy systems and Dempster-Shafer (DS) evidence theory, represent multiple forms of uncertainty, but are computationally and methodologically difficult. We require methods which provide an effective balance between the complete representation of the full complexity of uncertainty claims in their interaction, while satisfying the needs of both computational complexity and human cognition. Here we build on Jøsang's subjective logic to posit methods in focused belief measures (FBMs), where a full DS structure is focused to a single event. The resulting ternary logical structure is posited to be able to capture the minimal amount of generalized complexity needed at a maximum of computational efficiency. We demonstrate the efficacy of this approach in a web ingest experiment over the 2012 Billion Triple dataset from the Semantic Web Challenge.

  17. Temporal complexity in emission from Anderson localized lasers

    NASA Astrophysics Data System (ADS)

    Kumar, Randhir; Balasubrahmaniyam, M.; Alee, K. Shadak; Mujumdar, Sushil

    2017-12-01

    Anderson localization lasers exploit resonant cavities formed due to structural disorder. The inherent randomness in the structure of these cavities realizes a probability distribution in all cavity parameters such as quality factors, mode volumes, mode structures, and so on, implying resultant statistical fluctuations in the temporal behavior. Here we provide direct experimental measurements of temporal width distributions of Anderson localization lasing pulses in intrinsically and extrinsically disordered coupled-microresonator arrays. We first illustrate signature exponential decays in the spatial intensity distributions of the lasing modes that quantify their localized character, and then measure the temporal width distributions of the pulsed emission over several configurations. We observe a dependence of temporal widths on the disorder strength, wherein the widths show a single-peaked, left-skewed distribution in extrinsic disorder and a dual-peaked distribution in intrinsic disorder. We propose a model based on coupled rate equations for an emitter and an Anderson cavity with a random mode structure, which gives excellent quantitative and qualitative agreement with the experimental observations. The experimental and theoretical analyses bring to the fore the temporal complexity in Anderson-localization-based lasing systems.

  18. Using time-delayed mutual information to discover and interpret temporal correlation structure in complex populations

    NASA Astrophysics Data System (ADS)

    Albers, D. J.; Hripcsak, George

    2012-03-01

    This paper addresses how to calculate and interpret the time-delayed mutual information (TDMI) for a complex, diversely and sparsely measured, possibly non-stationary population of time-series of unknown composition and origin. The primary vehicle used for this analysis is a comparison between the time-delayed mutual information averaged over the population and the time-delayed mutual information of an aggregated population (here, aggregation implies the population is conjoined before any statistical estimates are implemented). Through the use of information theoretic tools, a sequence of practically implementable calculations are detailed that allow for the average and aggregate time-delayed mutual information to be interpreted. Moreover, these calculations can also be used to understand the degree of homo or heterogeneity present in the population. To demonstrate that the proposed methods can be used in nearly any situation, the methods are applied and demonstrated on the time series of glucose measurements from two different subpopulations of individuals from the Columbia University Medical Center electronic health record repository, revealing a picture of the composition of the population as well as physiological features.
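
    A histogram-based estimate of the time-delayed mutual information for a single series can be sketched as follows; the binning, lags, and toy signal are illustrative choices rather than the estimator settings used in the paper.

    ```python
    import numpy as np

    def tdmi(x, lag, bins=16):
        """Time-delayed mutual information (bits) between x(t) and x(t + lag),
        estimated from a joint histogram."""
        x = np.asarray(x, dtype=float)
        a, b = x[:-lag], x[lag:]
        joint, _, _ = np.histogram2d(a, b, bins=bins)
        p_xy = joint / joint.sum()
        p_x = p_xy.sum(axis=1, keepdims=True)
        p_y = p_xy.sum(axis=0, keepdims=True)
        nz = p_xy > 0
        return float(np.sum(p_xy[nz] * np.log2(p_xy[nz] / (p_x @ p_y)[nz])))

    # Toy signal: a random walk retains information over long delays
    rng = np.random.default_rng(3)
    sig = np.cumsum(rng.normal(size=5000))
    for lag in (1, 10, 100):
        print(f"lag {lag:>3}: TDMI = {tdmi(sig, lag):.3f} bits")
    ```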

  19. Network analysis in detection of early-stage mild cognitive impairment

    NASA Astrophysics Data System (ADS)

    Ni, Huangjing; Qin, Jiaolong; Zhou, Luping; Zhao, Zhigen; Wang, Jun; Hou, Fengzhen

    2017-07-01

    The detection of and intervention for early-stage mild cognitive impairment (EMCI) is of vital importance. However, the pathology of EMCI remains largely unknown, making clinical diagnosis challenging. In this paper, resting-state functional magnetic resonance imaging (rs-fMRI) data derived from EMCI patients and normal controls are analyzed using complex network theory. We construct the functional connectivity (FC) networks and employ the local false discovery rate approach to detect the abnormal functional connectivities that appear in EMCI patients. Our results demonstrate that abnormal functional connectivities appear in EMCI patients, and that the affected brain regions are mainly distributed in the frontal and temporal lobes. In addition, to quantitatively characterize the statistical properties of FCs in the complex network, we employ the entropy of the degree distribution (EDD) index and other well-established measures, i.e., the clustering coefficient (CC) and the efficiency of the graph (EG). We found that the EDD index, better than the widely used CC and EG measures, may serve as an assistive and potential marker for the detection of EMCI.
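
    A sketch of the entropy of the degree distribution (EDD), alongside the clustering coefficient (CC) and graph efficiency (EG) it is compared against, is given below for two toy graphs; the graph models are stand-ins for thresholded functional-connectivity networks, not the rs-fMRI data of the study.

    ```python
    import numpy as np
    import networkx as nx

    def degree_entropy(g):
        """Shannon entropy (bits) of the degree distribution of graph g."""
        degrees = np.array([d for _, d in g.degree()])
        _, counts = np.unique(degrees, return_counts=True)
        p = counts / counts.sum()
        return float(-np.sum(p * np.log2(p)))

    # Two toy networks with different degree heterogeneity
    g_regular = nx.watts_strogatz_graph(90, k=6, p=0.0, seed=1)   # homogeneous degrees
    g_random = nx.gnp_random_graph(90, p=6 / 89, seed=1)          # heterogeneous degrees

    for name, g in [("regular", g_regular), ("random ", g_random)]:
        print(name,
              "EDD =", round(degree_entropy(g), 3),
              "CC =", round(nx.average_clustering(g), 3),
              "EG =", round(nx.global_efficiency(g), 3))
    ```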

  20. Revisiting the European sovereign bonds with a permutation-information-theory approach

    NASA Astrophysics Data System (ADS)

    Fernández Bariviera, Aurelio; Zunino, Luciano; Guercio, María Belén; Martinez, Lisana B.; Rosso, Osvaldo A.

    2013-12-01

    In this paper we study the evolution of the informational efficiency in its weak form for seventeen European sovereign bond time series. We aim to assess the impact of two specific economic situations on the hypothetical random behavior of these time series: the establishment of a common currency and a wide and deep financial crisis. In order to evaluate the informational efficiency we use permutation quantifiers derived from information theory. Specifically, time series are ranked according to two metrics that measure the intrinsic structure of their correlations: permutation entropy and permutation statistical complexity. These measures provide the rectangular coordinates of the complexity-entropy causality plane; the planar location of the time series in this representation space reveals the degree of informational efficiency. According to our results, the currency union contributed to homogenize the stochastic characteristics of the time series and produced synchronization in their random behavior. Additionally, the 2008 financial crisis uncovered differences within the apparently homogeneous European sovereign markets and revealed country-specific characteristics that were partially hidden during the monetary union heyday.
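
    The entropy coordinate of the complexity-entropy causality plane can be sketched directly from the Bandt-Pompe ordinal-pattern distribution; the permutation statistical complexity would then be obtained from the same distribution via a disequilibrium (Jensen-Shannon) term. The embedding order, delay, and test signals below are illustrative choices.

    ```python
    import numpy as np
    from itertools import permutations
    from math import factorial, log

    def permutation_entropy(x, order=4, delay=1):
        """Normalised permutation entropy (Bandt-Pompe) of a 1-D time series."""
        x = np.asarray(x, dtype=float)
        counts = {p: 0 for p in permutations(range(order))}
        n = len(x) - (order - 1) * delay
        for i in range(n):
            window = x[i:i + order * delay:delay]
            counts[tuple(np.argsort(window))] += 1     # ordinal pattern of the window
        p = np.array([c for c in counts.values() if c > 0], dtype=float) / n
        return float(-np.sum(p * np.log(p)) / log(factorial(order)))

    rng = np.random.default_rng(4)
    print("white noise:", round(permutation_entropy(rng.normal(size=5000)), 3))           # near 1
    print("sine wave  :", round(permutation_entropy(np.sin(0.05 * np.arange(5000))), 3))  # low
    ```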

  1. Impact of lipid rafts on the T -cell-receptor and peptide-major-histocompatibility-complex interactions under different measurement conditions

    NASA Astrophysics Data System (ADS)

    Li, Long; Xu, Guang-Kui; Song, Fan

    2017-01-01

    The interactions between T-cell receptor (TCR) and peptide-major-histocompatibility complex (pMHC), which enable T-cell development and initiate adaptive immune responses, have been intensively studied. However, a central issue of how lipid rafts affect the TCR-pMHC interactions remains unclear. Here, by using a statistical-mechanical membrane model, we show that the binding affinity of TCR and pMHC anchored on two apposing cell membranes is significantly enhanced because of the lipid raft-induced signaling protein aggregation. This finding may provide an alternative insight into the mechanism of T-cell activation triggered by very low densities of pMHC. In the case of cell-substrate adhesion, our results indicate that the loss of lateral mobility of the proteins on the solid substrate leads to the inhibitory effect of lipid rafts on TCR-pMHC interactions. Our findings help to understand why different experimental methods for measuring the impact of lipid rafts on the receptor-ligand interactions have led to contradictory conclusions.

  2. Prospective impact of illness uncertainty on outcomes in chronic lung disease.

    PubMed

    Hoth, Karin F; Wamboldt, Frederick S; Strand, Matthew; Ford, Dee W; Sandhaus, Robert A; Strange, Charlie; Bekelman, David B; Holm, Kristen E

    2013-11-01

    To determine which aspect of illness uncertainty (i.e., ambiguity or complexity) has a stronger association with psychological and clinical outcomes over a 2-year period among individuals with a genetic subtype of chronic obstructive pulmonary disease (COPD). Ambiguity reflects uncertainty about physical cues and symptoms, and complexity reflects uncertainty about treatment and the medical system. Four hundred and seven individuals with alpha-1 antitrypsin deficiency-associated COPD completed questionnaires at baseline, 1- and 2-year follow-up. Uncertainty was measured using the Mishel Uncertainty in Illness Scale. Outcomes were measured using the Hospital Anxiety and Depression Scale, St. George's Respiratory Questionnaire, and MMRC Dyspnea Scale. Ambiguity and complexity were examined as predictors of depressive symptoms, anxiety, quality of life, and breathlessness using linear mixed models adjusting for demographic and health characteristics. Ambiguity was associated with more depressive symptoms (b = 0.09, SE = 0.02, p < .001) and anxiety (b = 0.13, SE = 0.02, p < .001), worse quality of life (b = 0.57, SE = 0.10, p < .001), and more breathlessness (b = 0.02, SE = 0.006, p < .001). Complexity did not have an independent effect on any outcome. Interactions between ambiguity and time since diagnosis were not statistically significant. Ambiguity was prospectively associated with worse mood, quality of life, and breathlessness. Thus, ambiguity should be targeted in psychosocial interventions. Time since diagnosis did not affect the association between ambiguity and outcomes, suggesting that the impact of ambiguity is equally strong throughout the course of COPD.

  3. Complex Dynamical Networks Constructed with Fully Controllable Nonlinear Nanomechanical Oscillators.

    PubMed

    Fon, Warren; Matheny, Matthew H; Li, Jarvis; Krayzman, Lev; Cross, Michael C; D'Souza, Raissa M; Crutchfield, James P; Roukes, Michael L

    2017-10-11

    Control of the global parameters of complex networks has been explored experimentally in a variety of contexts. Yet, the more difficult prospect of realizing arbitrary network architectures, especially analog physical networks that provide dynamical control of individual nodes and edges, has remained elusive. Given the vast hierarchy of time scales involved, it also proves challenging to measure a complex network's full internal dynamics. These span from the fastest nodal dynamics to very slow epochs over which emergent global phenomena, including network synchronization and the manifestation of exotic steady states, eventually emerge. Here, we demonstrate an experimental system that satisfies these requirements. It is based upon modular, fully controllable, nonlinear radio frequency nanomechanical oscillators, designed to form the nodes of complex dynamical networks with edges of arbitrary topology. The dynamics of these oscillators and their surrounding network are analog and continuous-valued and can be fully interrogated in real time. They comprise a piezoelectric nanomechanical membrane resonator, which serves as the frequency-determining element within an electrical feedback circuit. This embodiment permits network interconnections entirely within the electrical domain and provides unprecedented node and edge control over a vast region of parameter space. Continuous measurement of the instantaneous amplitudes and phases of every constituent oscillator node are enabled, yielding full and detailed network data without reliance upon statistical quantities. We demonstrate the operation of this platform through the real-time capture of the dynamics of a three-node ring network as it evolves from the uncoupled state to full synchronization.

  4. Statistical Literacy in the Data Science Workplace

    ERIC Educational Resources Information Center

    Grant, Robert

    2017-01-01

    Statistical literacy, the ability to understand and make use of statistical information including methods, has particular relevance in the age of data science, when complex analyses are undertaken by teams from diverse backgrounds. Not only is it essential to communicate to the consumers of information but also within the team. Writing from the…

  5. Key Success Factors for Statistical Literacy Poster Competitions

    ERIC Educational Resources Information Center

    MacFeely, Steve; Campos, Pedro; Helenius, Reija

    2017-01-01

    Statistical literacy is complex and multifaceted. In every country, education and numeracy are a function of a multitude of factors including culture, history, and societal norms. Nevertheless, since the launch of the International Statistical Poster Competition (ISLP) in 1994, a number of patterns have emerged to suggest there are some common or…

  6. Deconstructing Statistical Analysis

    ERIC Educational Resources Information Center

    Snell, Joel

    2014-01-01

    Using a very complex statistical analysis and research method for the sake of enhancing the prestige of an article, or of making a new product or service appear legitimate, needs to be monitored and questioned for accuracy. 1) The more complicated the statistical analysis and research, the fewer learned readers can understand it. This adds a…

  7. Complex chromosomal rearrangements induced in vivo by heavy ions.

    PubMed

    Durante, M; Ando, K; Furusawa, Y; Obe, G; George, K; Cucinotta, F A

    2004-01-01

    It has been suggested that the ratio complex/simple exchanges can be used as a biomarker of exposure to high-LET radiation. We tested this hypothesis in vivo, by considering data from several studies that measured complex exchanges in peripheral blood from humans exposed to mixed fields of low- and high-LET radiation. In particular, we studied data from astronauts involved in long-term missions in low-Earth-orbit, and uterus cancer patients treated with accelerated carbon ions. Data from two studies of chromosomal aberrations in astronauts used blood samples obtained before and after space flight, and a third study used blood samples from patients before and after radiotherapy course. Similar methods were used in each study, where lymphocytes were stimulated to grow in vitro, and collected after incubation in either colcemid or calyculin A. Slides were painted with whole-chromosome DNA fluorescent probes (FISH), and complex and simple chromosome exchanges in the painted genome were classified separately. Complex-type exchanges were observed at low frequencies in control subjects, and in our test subjects before the treatment. No statistically significant increase in the yield of complex-type exchanges was induced by the space flight. Radiation therapy induced a high fraction of complex exchanges, but no significant differences could be detected between patients treated with accelerated carbon ions or X-rays. Complex chromosomal rearrangements do not represent a practical biomarker of radiation quality in our test subjects. Copyright 2003 S. Karger AG, Basel

  8. Complex Chromosomal Rearrangements Induced in Vivo by Heavy Ions

    NASA Technical Reports Server (NTRS)

    Durante, M.; Ando, K.; Furusawa, G.; Obe, G.; George, K.; Cucinotta, F. A.

    2004-01-01

    It has been suggested that the ratio complex/simple exchanges can be used as a biomarker of exposure to high-LET radiation. We tested this hypothesis in vivo, by considering data from several studies that measured complex exchanges in peripheral blood from humans exposed to mixed fields of low- and high-LET radiation. In particular, we studied data from astronauts involved in long-term missions in low-Earth-orbit, and uterus cancer patients treated with accelerated carbon ions. Data from two studies of chromosomal aberrations in astronauts used blood samples obtained before and after space flight, and a third study used blood samples from patients before and after radiotherapy course. Similar methods were used in each study, where lymphocytes were stimulated to grow in vitro, and collected after incubation in either colcemid or calyculin A. Slides were painted with whole-chromosome DNA fluorescent probes (FISH), and complex and simple chromosome exchanges in the painted genome were classified separately. Complex-type exchanges were observed at low frequencies in control subjects, and in our test subjects before the treatment. No statistically significant increase in the yield of complex-type exchanges was induced by the space flight. Radiation therapy induced a high fraction of complex exchanges, but no significant differences could be detected between patients treated with accelerated carbon ions or X-rays. Complex chromosomal rearrangements do not represent a practical biomarker of radiation quality in our test subjects. Copyright 2003 S. Karger AG, Basel.

  9. From pull-down data to protein interaction networks and complexes with biological relevance.

    PubMed

    Zhang, Bing; Park, Byung-Hoon; Karpinets, Tatiana; Samatova, Nagiza F

    2008-04-01

    Recent improvements in high-throughput Mass Spectrometry (MS) technology have expedited genome-wide discovery of protein-protein interactions by providing a capability of detecting protein complexes in a physiological setting. Computational inference of protein interaction networks and protein complexes from MS data is challenging. Advances are required in developing robust and seamlessly integrated procedures for assessment of protein-protein interaction affinities, mathematical representation of protein interaction networks, discovery of protein complexes and evaluation of their biological relevance. A multi-step but easy-to-follow framework for identifying protein complexes from MS pull-down data is introduced. It assesses interaction affinity between two proteins based on similarity of their co-purification patterns derived from MS data. It constructs a protein interaction network by adopting a knowledge-guided threshold selection method. Based on the network, it identifies protein complexes and infers their core components using a graph-theoretical approach. It deploys a statistical evaluation procedure to assess the biological relevance of each found complex. On Saccharomyces cerevisiae pull-down data, the framework outperformed other more complicated schemes by at least 10% in F1-measure and identified 610 protein complexes with high functional homogeneity based on enrichment in Gene Ontology (GO) annotation. Manual examination of the complexes brought forward hypotheses on the causes of false identifications. Namely, co-purification of different protein complexes mediated by a common non-protein molecule, such as DNA, might be a source of false positives. Protein identification bias in pull-down technology, such as hydrophilic bias, could result in false negatives.
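
    As an illustration of the kind of procedure described above, the sketch below (not the authors' actual pipeline) builds a protein interaction network from a pairwise co-purification similarity matrix, applies a fixed affinity threshold in place of the paper's knowledge-guided threshold selection, and reads off candidate complexes as connected components; the proteins, similarity values and threshold are hypothetical.

    ```python
    # Illustrative sketch: thresholded co-purification similarity -> network -> complexes.
    import numpy as np
    import networkx as nx

    def complexes_from_similarity(sim, proteins, threshold=0.7, min_size=3):
        """sim: symmetric (n x n) co-purification similarity matrix with values in [0, 1]."""
        g = nx.Graph()
        g.add_nodes_from(proteins)
        n = len(proteins)
        for i in range(n):
            for j in range(i + 1, n):
                # A fixed threshold stands in for the paper's knowledge-guided selection.
                if sim[i, j] >= threshold:
                    g.add_edge(proteins[i], proteins[j], weight=sim[i, j])
        # Candidate complexes: connected components of the thresholded network.
        return [sorted(c) for c in nx.connected_components(g) if len(c) >= min_size]

    # Toy usage with a hypothetical 4-protein similarity matrix.
    prots = ["P1", "P2", "P3", "P4"]
    sim = np.array([[1.0, 0.9, 0.8, 0.1],
                    [0.9, 1.0, 0.75, 0.2],
                    [0.8, 0.75, 1.0, 0.15],
                    [0.1, 0.2, 0.15, 1.0]])
    print(complexes_from_similarity(sim, prots))  # [['P1', 'P2', 'P3']]
    ```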

  10. Exploiting Complexity Information for Brain Activation Detection

    PubMed Central

    Zhang, Yan; Liang, Jiali; Lin, Qiang; Hu, Zhenghui

    2016-01-01

    We present a complexity-based approach for the analysis of fMRI time series, in which sample entropy (SampEn) is introduced as a quantification of voxel complexity. Under this hypothesis, voxel complexity is modulated by pertinent cognitive tasks and changes across experimental paradigms. We calculate the complexity of the sequential fMRI data for each voxel in two distinct experimental paradigms and use a nonparametric statistical strategy, the Wilcoxon signed rank test, to evaluate the difference in complexity between them. The results are compared with the well-known general linear model based Statistical Parametric Mapping package (SPM12), and a marked difference was observed. This is because the SampEn method detects changes in brain complexity between the two experimental conditions and, being data-driven, evaluates only the complexity of the specific fMRI time series. Also, larger and smaller SampEn values carry different meanings, and the neutral-blank design produces higher predictability than the threat-neutral design. Complexity information can be considered a complementary method to existing fMRI analysis strategies, and it may help improve the understanding of human brain function from a different perspective. PMID:27045838
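
    As a rough sketch of the analysis described (not the authors' code), the snippet below computes sample entropy for per-voxel time series and compares two conditions with the Wilcoxon signed rank test; the embedding dimension m, tolerance r and the synthetic data are illustrative assumptions.

    ```python
    # Sketch: per-voxel sample entropy (SampEn) plus a paired Wilcoxon signed-rank test.
    import numpy as np
    from scipy.stats import wilcoxon

    def sampen(x, m=2, r=0.2):
        """Sample entropy of a 1-D series; r is given as a fraction of the series SD."""
        x = np.asarray(x, dtype=float)
        tol = r * x.std()

        def count_matches(length):
            templates = np.array([x[i:i + length] for i in range(len(x) - length)])
            count = 0
            for i in range(len(templates)):
                dist = np.max(np.abs(templates - templates[i]), axis=1)  # Chebyshev distance
                count += np.sum(dist <= tol) - 1  # exclude the self-match
            return count

        b, a = count_matches(m), count_matches(m + 1)
        return -np.log(a / b) if a > 0 and b > 0 else np.inf

    rng = np.random.default_rng(0)
    cond_a = [sampen(rng.standard_normal(128)) for _ in range(30)]  # 30 voxels, condition A
    cond_b = [sampen(rng.standard_normal(128)) for _ in range(30)]  # same voxels, condition B
    stat, p = wilcoxon(cond_a, cond_b)  # paired, nonparametric comparison
    print(f"Wilcoxon W = {stat:.1f}, p = {p:.3f}")
    ```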

  11. Comparison between Thermal Desorption Tubes and Stainless Steel Canisters Used for Measuring Volatile Organic Compounds in Petrochemical Factories

    PubMed Central

    Chang, Cheng-Ping; Lin, Tser-Cheng; Lin, Yu-Wen; Hua, Yi-Chun; Chu, Wei-Ming; Lin, Tzu-Yu; Lin, Yi-Wen; Wu, Jyun-De

    2016-01-01

    Objective: The purpose of this study was to compare thermal desorption tubes and stainless steel canisters for measuring volatile organic compounds (VOCs) emitted from petrochemical factories. Methods: Twelve petrochemical factories in the Mailiao Industrial Complex were recruited for the measurements of VOCs. Thermal desorption tubes and 6-l specially prepared stainless steel canisters were used simultaneously for active sampling of environmental air. The sampling time was set at 6 h, close to a full work shift of the workers. A total of 94 pairwise air samples were collected using the thermal desorption tubes and stainless steel canisters in the 12 factories of the petrochemical industrial complex. To maximize the number of comparative data points, the measurements from all factories and sampling times were pooled to perform a linear regression analysis for each selected VOC. The Pearson product–moment correlation coefficient was used to examine the correlation between the pairwise measurements of the two sampling methods. A paired t-test was also performed to examine whether the difference in the concentrations of each selected VOC measured by the two methods was statistically significant. Results: The correlation coefficients of seven compounds, including acetone, n-hexane, benzene, toluene, 1,2-dichloroethane, 1,3-butadiene, and styrene, were >0.80, indicating that the two sampling methods had high consistency for these VOCs. The paired t-tests for the measurements of n-hexane, benzene, m/p-xylene, o-xylene, 1,2-dichloroethane, and 1,3-butadiene showed statistically significant differences (P-value < 0.05), indicating that the two sampling methods had various degrees of systematic error. For these six chemicals, the systematic errors probably resulted from differences in the detection limits of the two sampling methods. Conclusions: The comparison between the concentrations of each of the 10 selected VOCs measured by the two sampling methods indicated that the thermal desorption tubes provided measurements of high accuracy and precision for acetone, benzene, and 1,3-butadiene. The accuracy and precision of thermal desorption tubes for measuring VOCs can be improved further by new developments in sorbent materials, multi-sorbent designs, and thermal desorption instrumentation. More applications of thermal desorption tubes for measuring occupational and environmental hazardous agents can be anticipated. PMID:26585828

  12. Comparison between Thermal Desorption Tubes and Stainless Steel Canisters Used for Measuring Volatile Organic Compounds in Petrochemical Factories.

    PubMed

    Chang, Cheng-Ping; Lin, Tser-Cheng; Lin, Yu-Wen; Hua, Yi-Chun; Chu, Wei-Ming; Lin, Tzu-Yu; Lin, Yi-Wen; Wu, Jyun-De

    2016-04-01

    The purpose of this study was to compare thermal desorption tubes and stainless steel canisters for measuring volatile organic compounds (VOCs) emitted from petrochemical factories. Twelve petrochemical factories in the Mailiao Industrial Complex were recruited for the measurements of VOCs. Thermal desorption tubes and 6-l specially prepared stainless steel canisters were used simultaneously for active sampling of environmental air. The sampling time was set at 6 h, close to a full work shift of the workers. A total of 94 pairwise air samples were collected using the thermal desorption tubes and stainless steel canisters in the 12 factories of the petrochemical industrial complex. To maximize the number of comparative data points, the measurements from all factories and sampling times were pooled to perform a linear regression analysis for each selected VOC. The Pearson product-moment correlation coefficient was used to examine the correlation between the pairwise measurements of the two sampling methods. A paired t-test was also performed to examine whether the difference in the concentrations of each selected VOC measured by the two methods was statistically significant. The correlation coefficients of seven compounds, including acetone, n-hexane, benzene, toluene, 1,2-dichloroethane, 1,3-butadiene, and styrene, were >0.80, indicating that the two sampling methods had high consistency for these VOCs. The paired t-tests for the measurements of n-hexane, benzene, m/p-xylene, o-xylene, 1,2-dichloroethane, and 1,3-butadiene showed statistically significant differences (P-value < 0.05), indicating that the two sampling methods had various degrees of systematic error. For these six chemicals, the systematic errors probably resulted from differences in the detection limits of the two sampling methods. The comparison between the concentrations of each of the 10 selected VOCs measured by the two sampling methods indicated that the thermal desorption tubes provided measurements of high accuracy and precision for acetone, benzene, and 1,3-butadiene. The accuracy and precision of thermal desorption tubes for measuring VOCs can be improved further by new developments in sorbent materials, multi-sorbent designs, and thermal desorption instrumentation. More applications of thermal desorption tubes for measuring occupational and environmental hazardous agents can be anticipated. © The Author 2015. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
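
    The core statistical comparison described above reduces, for each VOC, to a Pearson correlation and a paired t-test on the pooled pairwise measurements. The sketch below shows the calculation on made-up concentrations; it is not the study's data or code.

    ```python
    # Sketch: consistency (Pearson r) and systematic difference (paired t-test)
    # between two sampling methods for one VOC.
    import numpy as np
    from scipy.stats import pearsonr, ttest_rel

    tube = np.array([1.2, 0.8, 2.5, 3.1, 0.6, 1.9])      # ppb, thermal desorption tubes (toy values)
    canister = np.array([1.0, 0.9, 2.2, 2.8, 0.7, 1.7])  # ppb, stainless steel canisters (toy values)

    r, p_corr = pearsonr(tube, canister)     # how consistent are the two methods?
    t, p_paired = ttest_rel(tube, canister)  # is there a systematic offset between them?

    print(f"Pearson r = {r:.2f} (p = {p_corr:.3f})")
    print(f"paired t = {t:.2f} (p = {p_paired:.3f})")
    ```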

  13. Improving Evaluation of Dental Hygiene Students' Cultural Competence with a Mixed-Methods Approach.

    PubMed

    Flynn, Priscilla; Sarkarati, Nassim

    2018-02-01

    Most dental hygiene educational programs include cultural competence education but may not evaluate student outcomes. The aim of this study was to design and implement a mixed-methods evaluation to measure dental hygiene students' progression toward cultural competence. Two cohorts consisting of consecutive classes in one U.S. dental hygiene program participated in the study. A total of 47 dental hygiene students (100% response rate) completed self-assessments to measure their attitudes and knowledge at three time points between 2014 and 2016. Mean scores were calculated for three domains: Physical Environment, Communication, and Values. Qualitative analysis of the students' cultural diversity papers was also conducted to further evaluate students' knowledge and skills. Bennett's five-level conceptual framework was used to code phrases or sentences and to place students in the general categories of ethnocentric or ethno-relative. The quantitative and qualitative results yielded different outcomes for Cohort 1 but not for Cohort 2. Cohort 1 students rated themselves statistically significantly lower over time in one of the three measured domains, whereas Cohort 2 students rated themselves statistically significantly more culturally competent in all three domains. Qualitative results placed 72% of Cohort 1 students and 83% of Cohort 2 students in the more desirable ethno-relative category. Since quantitative methods consisting of student self-assessments may not adequately measure students' cultural competence, adding qualitative methods that measure skills specific to patient care added a robust dimension to evaluating this complex competence in dental hygiene students.

  14. Selection of nontarget arthropod taxa for field research on transgenic insecticidal crops: using empirical data and statistical power.

    PubMed

    Prasifka, J R; Hellmich, R L; Dively, G P; Higgins, L S; Dixon, P M; Duan, J J

    2008-02-01

    One of the possible adverse effects of transgenic insecticidal crops is the unintended decline in the abundance of nontarget arthropods. Field trials designed to evaluate potential nontarget effects can be more complex than expected because decisions to conduct field trials and the selection of taxa to include are not always guided by the results of laboratory tests. Also, recent studies emphasize the potential for indirect effects (adverse impacts to nontarget arthropods without feeding directly on plant tissues), which are difficult to predict because of interactions among nontarget arthropods, target pests, and transgenic crops. As a consequence, field studies may attempt to monitor expansive lists of arthropod taxa, making the design of such broad studies more difficult and reducing the likelihood of detecting any negative effects that might be present. To improve the taxonomic focus and statistical rigor of future studies, existing field data and corresponding power analysis may provide useful guidance. Analysis of control data from several nontarget field trials using repeated-measures designs suggests that while detection of small effects may require considerable increases in replication, there are taxa from different ecological roles that are sampled effectively using standard methods. The use of statistical power to guide selection of taxa for nontarget trials reflects scientists' inability to predict the complex interactions among arthropod taxa, particularly when laboratory trials fail to provide guidance on which groups are more likely to be affected. However, scientists still may exercise judgment, including taxa that are not included in or supported by power analyses.
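
    To make the role of statistical power concrete, the sketch below solves for the replication needed to detect effects of different sizes with a simple two-group t-test power calculation; this is a simplification of the repeated-measures designs analyzed in the study, and the effect sizes, alpha and target power are assumptions, not values from the paper.

    ```python
    # Sketch: replicates per treatment needed to detect small/medium/large effects
    # at alpha = 0.05 with 80% power (independent-samples t-test approximation).
    from statsmodels.stats.power import TTestIndPower

    analysis = TTestIndPower()
    for effect_size in (0.2, 0.5, 0.8):  # Cohen's d: small, medium, large
        n = analysis.solve_power(effect_size=effect_size, alpha=0.05, power=0.8)
        print(f"d = {effect_size}: about {n:.0f} replicates per treatment")
    ```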

  15. Energy-density field approach for low- and medium-frequency vibroacoustic analysis of complex structures using a statistical computational model

    NASA Astrophysics Data System (ADS)

    Kassem, M.; Soize, C.; Gagliardini, L.

    2009-06-01

    In this paper, an energy-density field approach applied to the vibroacoustic analysis of complex industrial structures in the low- and medium-frequency ranges is presented. This approach uses a statistical computational model. The analyzed system consists of an automotive vehicle structure coupled with its internal acoustic cavity. The objective of this paper is to make use of the statistical properties of the frequency response functions of the vibroacoustic system observed from previous experimental and numerical work. The frequency response functions are expressed in terms of a dimensionless matrix which is estimated using the proposed energy approach. Using this dimensionless matrix, a simplified vibroacoustic model is proposed.

  16. The instinctual nation-state: non-Darwinian theories, state science and ultra-nationalism in Oka Asajirō's Evolution and Human Life.

    PubMed

    Sullivan, Gregory

    2011-01-01

    In his anthology of socio-political essays, Evolution and Human Life, Oka Asajirō (1868-1944), early twentieth century Japan's foremost advocate of evolutionism, developed a biological vision of the nation-state as super-organism that reflected the concerns and aims of German-inspired Meiji statism and anticipated aspects of radical ultra-nationalism. Drawing on non-Darwinian doctrines, Oka attempted to realize such a fused or organic state by enhancing social instincts that would bind the minzoku (ethnic nation) and state into a single living entity. Though mobilization during the Russo-Japanese War seemed to evince this super-organism, the increasingly contentious and complex society that emerged in the war's aftermath caused Oka to turn first to Lamarckism and eventually to orthogenesis in the hopes of preserving the instincts needed for a viable nation-state. It is especially in the state interventionist measures that Oka finally came to endorse in order to forestall orthogenetically-driven degeneration that the technocratic proclivities of his statist orientation become most apparent. The article concludes by suggesting that Oka's emphasis on degeneration, autarkic expansion, and, most especially, totalitarian submersion of individuals into the statist collectivity indicates a complex relationship between his evolutionism and fascist ideology, what recent scholarship has dubbed radical Shinto ultra-nationalism.

  17. Central Limit Theorem for Exponentially Quasi-local Statistics of Spin Models on Cayley Graphs

    NASA Astrophysics Data System (ADS)

    Reddy, Tulasi Ram; Vadlamani, Sreekar; Yogeshwaran, D.

    2018-04-01

    Central limit theorems for linear statistics of lattice random fields (including spin models) are usually proven under suitable mixing conditions or quasi-associativity. Many interesting examples of spin models do not satisfy mixing conditions, and, on the other hand, it does not seem easy to show a central limit theorem for local statistics via quasi-associativity. In this work, we prove general central limit theorems for local statistics and exponentially quasi-local statistics of spin models on discrete Cayley graphs with polynomial growth. Further, we supplement these results by proving similar central limit theorems for random fields on discrete Cayley graphs taking values in a countable space, but under the stronger assumptions of α-mixing (for local statistics) and exponential α-mixing (for exponentially quasi-local statistics). All our central limit theorems assume a suitable variance lower bound, like many others in the literature. We illustrate our general central limit theorem with specific examples of lattice spin models and statistics arising in computational topology, statistical physics and random networks. Examples of clustering spin models include quasi-associated spin models with fast decaying covariances like the off-critical Ising model, level sets of Gaussian random fields with fast decaying covariances like the massive Gaussian free field, and determinantal point processes with fast decaying kernels. Examples of local statistics include intrinsic volumes, face counts, and component counts of random cubical complexes, while exponentially quasi-local statistics include nearest neighbour distances in spin models and Betti numbers of sub-critical random cubical complexes.
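
    Schematically (and omitting the paper's precise locality, clustering and variance hypotheses), the central limit theorems in question concern statistics of the following form:

    ```latex
    % Schematic statement only; see the paper for the exact hypotheses.
    \[
      S_n \;=\; \sum_{v \in B_n} f\bigl(\sigma|_{N(v)}\bigr),
      \qquad
      \operatorname{Var}(S_n) \;\ge\; c\,|B_n|
      \;\;\Longrightarrow\;\;
      \frac{S_n - \mathbb{E}[S_n]}{\sqrt{\operatorname{Var}(S_n)}}
      \;\xrightarrow{\,d\,}\; \mathcal{N}(0,1),
    \]
    where $B_n$ is a growing ball in the Cayley graph, $N(v)$ is a (possibly exponentially quasi-local)
    neighbourhood of $v$, and $\sigma$ is the spin configuration.
    ```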

  18. A 20-year period of orthotopic liver transplantation activity in a single center: a time series analysis performed using the R Statistical Software.

    PubMed

    Santori, G; Andorno, E; Morelli, N; Casaccia, M; Bottino, G; Di Domenico, S; Valente, U

    2009-05-01

    In many Western countries a "minimum volume rule" policy has been adopted as a quality measure for complex surgical procedures. In Italy, the National Transplant Centre set the minimum number of orthotopic liver transplantation (OLT) procedures per year at 25 per center. OLT procedures performed in a single center over a reasonably long period may be treated as a time series to evaluate trend, seasonal cycles, and nonsystematic fluctuations. Between January 1, 1987 and December 31, 2006, we performed 563 cadaveric-donor OLTs in adult recipients, and a further 28 procedures were performed during 2007. The greatest numbers of OLTs per year were performed in 2001 (n = 51), 2005 (n = 50), and 2004 (n = 49). A time series analysis performed using the R Statistical Software (R Foundation for Statistical Computing, Vienna, Austria), a free software environment for statistical computing and graphics, showed an incremental trend after exponential smoothing as well as after seasonal decomposition. The predicted number of OLTs per month for 2007, calculated with Holt-Winters exponential smoothing applied to the 1987-2006 period, helped to identify the months with a major difference between predicted and performed procedures. The time series approach may be helpful in establishing a minimum yearly volume at the single-center level.
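
    The study used R's Holt-Winters routine; the sketch below is a rough Python equivalent using statsmodels, fitted on synthetic monthly OLT counts for 1987-2006 and forecasting 12 months ahead, purely to show the shape of the analysis.

    ```python
    # Sketch: Holt-Winters (triple exponential smoothing) fit and 12-month forecast
    # on synthetic monthly transplant counts.
    import numpy as np
    import pandas as pd
    from statsmodels.tsa.holtwinters import ExponentialSmoothing

    rng = np.random.default_rng(1)
    months = pd.date_range("1987-01", "2006-12", freq="MS")
    olt_per_month = pd.Series(
        rng.poisson(2.5, len(months)) + np.linspace(0.0, 2.0, len(months)),  # mild upward trend
        index=months)

    model = ExponentialSmoothing(olt_per_month, trend="add",
                                 seasonal="add", seasonal_periods=12).fit()
    forecast_2007 = model.forecast(12)  # predicted OLTs per month for 2007
    print(forecast_2007.round(1))
    ```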

  19. In vivo and in vitro measurements of complex-type chromosomal exchanges induced by heavy ions.

    PubMed

    George, K; Durante, M; Wu, H; Willingham, V; Cucinotta, F A

    2003-01-01

    Heavy ions are more efficient than sparsely ionizing radiation in producing complex-type chromosome exchanges, and this can potentially be used as a biomarker of radiation quality. We measured the induction of complex-type chromosomal aberrations in human peripheral blood lymphocytes exposed in vitro to accelerated H, He, C, Ar, Fe and Au ions in the LET range of approximately 0.4-1400 keV/μm. Chromosomes were analyzed either at the first post-irradiation mitosis or in interphase, following premature condensation induced by phosphatase inhibitors. Selected chromosomes were then visualized after FISH painting. The dose-response curve for the induction of complex-type exchanges by heavy ions was linear in the dose range 0.2-1.5 Gy, while gamma rays did not produce a significant increase in the yield of complex rearrangements in this dose range. The yield of complex aberrations after 1 Gy of heavy ions increased up to an LET of around 100 keV/μm and then declined at higher LET values. When mitotic cells were analyzed, the frequency of complex rearrangements after 1 Gy was about 10 times higher for Ar or Fe ions (the most effective ions, with LET around 100 keV/μm) than for 250 MeV protons, and values were about 35 times higher in prematurely condensed chromosomes. These results suggest that complex rearrangements may be detected in astronauts' blood lymphocytes after long-term space flight, because crews are exposed to HZE particles from galactic cosmic radiation. However, in a cytogenetic study of ten astronauts after long-term missions on the Mir or International Space Station, we found a very low frequency of complex rearrangements, and a significant post-flight increase was detected in only one of the ten crewmembers. It appears that the use of complex-type exchanges as a biomarker of radiation quality in vivo after low-dose chronic exposure in mixed radiation fields is hampered by statistical uncertainties. © 2003 COSPAR. Published by Elsevier Science Ltd. All rights reserved.

  20. Neuropsychological study of FASD in a sample of American Indian children: processing simple versus complex information.

    PubMed

    Aragón, Alfredo S; Kalberg, Wendy O; Buckley, David; Barela-Scott, Lindsey M; Tabachnick, Barbara G; May, Philip A

    2008-12-01

    Although a large body of literature exists on cognitive functioning in alcohol-exposed children, it is unclear whether there is a signature neuropsychological profile in children with Fetal Alcohol Spectrum Disorders (FASD). This study assesses cognitive functioning in children with FASD from several American Indian reservations in the Northern Plains States, and it applies a hierarchical model of simple versus complex information processing to further examine cognitive function. We hypothesized that complex tests would discriminate between children with FASD and culturally similar controls, while children with FASD would perform similarly to controls on relatively simple tests. Our sample includes 32 control children and 24 children with a form of FASD [fetal alcohol syndrome (FAS) = 10, partial fetal alcohol syndrome (PFAS) = 14]. The test battery measures general cognitive ability, verbal fluency, executive functioning, memory, and fine-motor skills. Many of the neuropsychological tests produced results consistent with a hierarchical model of simple versus complex processing. The complexity of the tests was determined a priori, based on the number of cognitive processes involved. Multidimensional scaling was used to statistically analyze the accuracy of classifying the neurocognitive tests into a simple versus complex dichotomy. Hierarchical logistic regression models were then used to quantify the contribution made by complex versus simple tests in predicting the differences between children with FASD and controls. Complex test items discriminated better than simple test items. The tests that conformed well to the model were Verbal Fluency, the Progressive Planning Test (PPT), the Lhermitte memory tasks, and the Grooved Pegboard Test (GPT). The children with FASD, when compared with controls, demonstrated impaired performance on letter fluency, while their performance was similar on category fluency. The FASD group performed worst on the more complex PPT trials (problems 5 to 8) and on the Lhermitte logical tasks. The differential performance between children with FASD and controls was evident across various neuropsychological measures, and the children with FASD performed significantly more poorly on the complex tasks than did the controls. The identification of a neurobehavioral profile in children with prenatal alcohol exposure will help clinicians identify and diagnose children with FASD.
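
    The hierarchical logistic regression idea described above can be sketched as follows: fit a model predicting FASD versus control from simple-test scores alone, then add the complex-test scores as a second block and test whether the fit improves. The data below are simulated and the group sizes merely mirror the sample; this is not the study's analysis code.

    ```python
    # Sketch: two-step (hierarchical) logistic regression with a likelihood-ratio
    # test for the added block of complex-test scores.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(2)
    n = 56                                 # 24 FASD + 32 controls, as in the sample
    group = np.array([1] * 24 + [0] * 32)  # 1 = FASD, 0 = control
    simple = rng.normal(0.0, 1.0, n) - 0.2 * group    # simple tests: small group effect (assumed)
    complex_ = rng.normal(0.0, 1.0, n) - 1.0 * group  # complex tests: larger group effect (assumed)

    m1 = sm.Logit(group, sm.add_constant(simple)).fit(disp=0)   # step 1: simple tests only
    m2 = sm.Logit(group, sm.add_constant(
        np.column_stack([simple, complex_]))).fit(disp=0)       # step 2: add complex tests
    lr_stat = 2 * (m2.llf - m1.llf)  # likelihood-ratio statistic for the added block
    print(f"LR chi-square for adding complex tests: {lr_stat:.2f}")
    ```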
