Budiyono, Agung; Rohrlich, Daniel
2017-11-03
Where does quantum mechanics part ways with classical mechanics? How does quantum randomness differ fundamentally from classical randomness? We cannot fully explain how the theories differ until we can derive them within a single axiomatic framework, allowing an unambiguous account of how one theory is the limit of the other. Here we derive non-relativistic quantum mechanics and classical statistical mechanics within a common framework. The common axioms include conservation of average energy and conservation of probability current. But two axioms distinguish quantum mechanics from classical statistical mechanics: an "ontic extension" defines a nonseparable (global) random variable that generates physical correlations, and an "epistemic restriction" constrains allowed phase space distributions. The ontic extension and epistemic restriction, with strength on the order of Planck's constant, imply quantum entanglement and uncertainty relations. This framework suggests that the wave function is epistemic, yet it does not provide an ontic dynamics for individual systems.
Domain generality vs. modality specificity: The paradox of statistical learning
Frost, Ram; Armstrong, Blair C.; Siegelman, Noam; Christiansen, Morten H.
2015-01-01
Statistical learning is typically considered to be a domain-general mechanism by which cognitive systems discover the underlying distributional properties of the input. Recent studies examining whether there are commonalities in the learning of distributional information across different domains or modalities consistently reveal, however, modality and stimulus specificity. An important question is, therefore, how and why a hypothesized domain-general learning mechanism systematically produces such effects. We offer a theoretical framework according to which statistical learning is not a unitary mechanism but a set of domain-general computational principles that operate in different modalities and are therefore subject to the specific constraints characteristic of their respective brain regions. This framework offers testable predictions and we discuss its computational and neurobiological plausibility. PMID:25631249
Statistical mechanics framework for static granular matter.
Henkes, Silke; Chakraborty, Bulbul
2009-06-01
The physical properties of granular materials have been extensively studied in recent years. So far, however, there exists no theoretical framework which can explain the observations in a unified manner beyond the phenomenological jamming diagram. This work focuses on the case of static granular matter, where we have constructed a statistical ensemble which mirrors equilibrium statistical mechanics. This ensemble, which is based on the conservation properties of the stress tensor, is distinct from the original Edwards ensemble and applies to packings of deformable grains. We combine it with a field theoretical analysis of the packings, where the field is the Airy stress function derived from the force and torque balance conditions. In this framework, Point J is characterized by a diverging stiffness of the pressure fluctuations. Separately, we present a phenomenological mean-field theory of the jamming transition, which incorporates the mean contact number as a variable. We link both approaches in the context of the marginal rigidity picture proposed by Wyart and others.
Statistical label fusion with hierarchical performance models
Asman, Andrew J.; Dagley, Alexander S.; Landman, Bennett A.
2014-01-01
Label fusion is a critical step in many image segmentation frameworks (e.g., multi-atlas segmentation) as it provides a mechanism for generalizing a collection of labeled examples into a single estimate of the underlying segmentation. In the multi-label case, typical label fusion algorithms treat all labels equally – fully neglecting the known, yet complex, anatomical relationships exhibited in the data. To address this problem, we propose a generalized statistical fusion framework using hierarchical models of rater performance. Building on the seminal work in statistical fusion, we reformulate the traditional rater performance model from a multi-tiered hierarchical perspective. This new approach provides a natural framework for leveraging known anatomical relationships and accurately modeling the types of errors that raters (or atlases) make within a hierarchically consistent formulation. Herein, we describe several contributions. First, we derive a theoretical advancement to the statistical fusion framework that enables the simultaneous estimation of multiple (hierarchical) performance models within the statistical fusion context. Second, we demonstrate that the proposed hierarchical formulation is highly amenable to the state-of-the-art advancements that have been made to the statistical fusion framework. Lastly, in an empirical whole-brain segmentation task we demonstrate substantial qualitative and significant quantitative improvement in overall segmentation accuracy. PMID:24817809
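As a rough illustration of the rater-performance idea behind statistical fusion, the following Python sketch runs a flat (non-hierarchical) EM estimate in the spirit of binary STAPLE-style fusion; the hierarchical performance models proposed in the abstract are not reproduced, and the function name, initial values, and toy data are assumptions made for illustration.

```python
import numpy as np

def binary_staple(D, prior=0.5, n_iter=50):
    """Flat EM label fusion for binary segmentations.

    D     : (n_voxels, n_raters) array of 0/1 rater decisions
    prior : assumed prior probability that a voxel is foreground
    returns (posterior foreground probability per voxel,
             per-rater sensitivity p, per-rater specificity q)
    """
    n_vox, n_raters = D.shape
    p = np.full(n_raters, 0.9)          # initial sensitivities (assumed)
    q = np.full(n_raters, 0.9)          # initial specificities (assumed)
    for _ in range(n_iter):
        # E-step: posterior that the true label is 1, given all raters
        a = prior * np.prod(p**D * (1 - p)**(1 - D), axis=1)
        b = (1 - prior) * np.prod((1 - q)**D * q**(1 - D), axis=1)
        w = a / (a + b)
        # M-step: re-estimate each rater's performance parameters
        p = (w[:, None] * D).sum(axis=0) / w.sum()
        q = ((1 - w)[:, None] * (1 - D)).sum(axis=0) / (1 - w).sum()
    return w, p, q

# Toy usage: 3 raters labelling 1000 voxels with different error rates.
rng = np.random.default_rng(0)
truth = rng.random(1000) < 0.3
D = np.column_stack([(truth ^ (rng.random(1000) < err)).astype(int)
                     for err in (0.05, 0.10, 0.25)])
w, p, q = binary_staple(D, prior=0.3)
print(np.round(p, 2), np.round(q, 2))
```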
Complete integrability of information processing by biochemical reactions
Agliari, Elena; Barra, Adriano; Dello Schiavo, Lorenzo; Moro, Antonio
2016-01-01
Statistical mechanics provides an effective framework to investigate information processing in biochemical reactions. Within such framework far-reaching analogies are established among (anti-) cooperative collective behaviors in chemical kinetics, (anti-)ferromagnetic spin models in statistical mechanics and operational amplifiers/flip-flops in cybernetics. The underlying modeling – based on spin systems – has been proved to be accurate for a wide class of systems matching classical (e.g. Michaelis–Menten, Hill, Adair) scenarios in the infinite-size approximation. However, the current research in biochemical information processing has been focusing on systems involving a relatively small number of units, where this approximation is no longer valid. Here we show that the whole statistical mechanical description of reaction kinetics can be re-formulated via a mechanical analogy – based on completely integrable hydrodynamic-type systems of PDEs – which provides explicit finite-size solutions, matching recently investigated phenomena (e.g. noise-induced cooperativity, stochastic bi-stability, quorum sensing). The resulting picture, successfully tested against a broad spectrum of data, constitutes a neat rationale for a numerically effective and theoretically consistent description of collective behaviors in biochemical reactions. PMID:27812018
Complete integrability of information processing by biochemical reactions
NASA Astrophysics Data System (ADS)
Agliari, Elena; Barra, Adriano; Dello Schiavo, Lorenzo; Moro, Antonio
2016-11-01
Statistical mechanics provides an effective framework to investigate information processing in biochemical reactions. Within such framework far-reaching analogies are established among (anti-) cooperative collective behaviors in chemical kinetics, (anti-)ferromagnetic spin models in statistical mechanics and operational amplifiers/flip-flops in cybernetics. The underlying modeling - based on spin systems - has been proved to be accurate for a wide class of systems matching classical (e.g. Michaelis-Menten, Hill, Adair) scenarios in the infinite-size approximation. However, the current research in biochemical information processing has been focusing on systems involving a relatively small number of units, where this approximation is no longer valid. Here we show that the whole statistical mechanical description of reaction kinetics can be re-formulated via a mechanical analogy - based on completely integrable hydrodynamic-type systems of PDEs - which provides explicit finite-size solutions, matching recently investigated phenomena (e.g. noise-induced cooperativity, stochastic bi-stability, quorum sensing). The resulting picture, successfully tested against a broad spectrum of data, constitutes a neat rationale for a numerically effective and theoretically consistent description of collective behaviors in biochemical reactions.
Complete integrability of information processing by biochemical reactions.
Agliari, Elena; Barra, Adriano; Dello Schiavo, Lorenzo; Moro, Antonio
2016-11-04
Statistical mechanics provides an effective framework to investigate information processing in biochemical reactions. Within such framework far-reaching analogies are established among (anti-) cooperative collective behaviors in chemical kinetics, (anti-)ferromagnetic spin models in statistical mechanics and operational amplifiers/flip-flops in cybernetics. The underlying modeling - based on spin systems - has been proved to be accurate for a wide class of systems matching classical (e.g. Michaelis-Menten, Hill, Adair) scenarios in the infinite-size approximation. However, the current research in biochemical information processing has been focusing on systems involving a relatively small number of units, where this approximation is no longer valid. Here we show that the whole statistical mechanical description of reaction kinetics can be re-formulated via a mechanical analogy - based on completely integrable hydrodynamic-type systems of PDEs - which provides explicit finite-size solutions, matching recently investigated phenomena (e.g. noise-induced cooperativity, stochastic bi-stability, quorum sensing). The resulting picture, successfully tested against a broad spectrum of data, constitutes a neat rationale for a numerically effective and theoretically consistent description of collective behaviors in biochemical reactions.
Confounding in statistical mediation analysis: What it is and how to address it.
Valente, Matthew J; Pelham, William E; Smyth, Heather; MacKinnon, David P
2017-11-01
Psychology researchers are often interested in mechanisms underlying how randomized interventions affect outcomes such as substance use and mental health. Mediation analysis is a common statistical method for investigating psychological mechanisms that has benefited from exciting new methodological improvements over the last 2 decades. One of the most important new developments is methodology for estimating causal mediated effects using the potential outcomes framework for causal inference. Potential outcomes-based methods developed in epidemiology and statistics have important implications for understanding psychological mechanisms. We aim to provide a concise introduction to and illustration of these new methods and emphasize the importance of confounder adjustment. First, we review the traditional regression approach for estimating mediated effects. Second, we describe the potential outcomes framework. Third, we define what a confounder is and how the presence of a confounder can provide misleading evidence regarding mechanisms of interventions. Fourth, we describe experimental designs that can help rule out confounder bias. Fifth, we describe new statistical approaches to adjust for measured confounders of the mediator-outcome relation and sensitivity analyses to probe effects of unmeasured confounders on the mediated effect. All approaches are illustrated with application to a real counseling intervention dataset. Counseling psychologists interested in understanding the causal mechanisms of their interventions can benefit from incorporating the most up-to-date techniques into their mediation analyses. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
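A minimal sketch of the traditional regression (product-of-coefficients) approach reviewed above, run on synthetic data; the variable names and effect sizes are illustrative assumptions, and measured confounders of the mediator-outcome relation could be appended as additional regressors.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
x = rng.integers(0, 2, n).astype(float)      # randomized intervention (0/1)
m = 0.5 * x + rng.normal(size=n)             # mediator
y = 0.4 * m + 0.2 * x + rng.normal(size=n)   # outcome

def ols(predictors, target):
    """Ordinary least squares with an intercept; returns the coefficient vector."""
    X = np.column_stack([np.ones(len(target))] + list(predictors))
    return np.linalg.lstsq(X, target, rcond=None)[0]

a = ols([x], m)[1]            # effect of X on M
b = ols([x, m], y)[2]         # effect of M on Y, adjusting for X
c_prime = ols([x, m], y)[1]   # direct effect of X on Y
print("mediated effect a*b =", round(a * b, 3),
      " direct effect c' =", round(c_prime, 3))
```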
Reconstructing constructivism: causal models, Bayesian learning mechanisms, and the theory theory.
Gopnik, Alison; Wellman, Henry M
2012-11-01
We propose a new version of the "theory theory" grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and nontechnical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and the psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists.
Selective gas capture via kinetic trapping
Kundu, Joyjit; Pascal, Tod; Prendergast, David; ...
2016-07-13
Conventional approaches to the capture of CO2 by metal-organic frameworks focus on equilibrium conditions, and frameworks that contain little CO2 in equilibrium are often rejected as carbon-capture materials. Here we use a statistical mechanical model, parameterized by quantum mechanical data, to suggest that metal-organic frameworks can be used to separate CO2 from a typical flue gas mixture when used under nonequilibrium conditions. The origin of this selectivity is an emergent gas-separation mechanism that results from the acquisition by different gas types of different mobilities within a crowded framework. The resulting distribution of gas types within the framework is in general spatially and dynamically heterogeneous. Our results suggest that relaxing the requirement of equilibrium can substantially increase the parameter space of conditions and materials for which selective gas capture can be effected.
Gautestad, Arild O
2012-09-07
Animals moving under the influence of spatio-temporal scaling and long-term memory generate a kind of space-use pattern that has proved difficult to model within a coherent theoretical framework. An extended kind of statistical mechanics is needed, accounting for both the effects of spatial memory and scale-free space use, and put into a context of ecological conditions. Simulations illustrating the distinction between scale-specific and scale-free locomotion are presented. The results show how observational scale (time lag between relocations of an individual) may critically influence the interpretation of the underlying process. In this respect, a novel protocol is proposed as a method to distinguish between some main movement classes. For example, the 'power law in disguise' paradox-from a composite Brownian motion consisting of a superposition of independent movement processes at different scales-may be resolved by shifting the focus from pattern analysis at one particular temporal resolution towards a more process-oriented approach involving several scales of observation. A more explicit consideration of system complexity within a statistical mechanical framework, supplementing the more traditional mechanistic modelling approach, is advocated.
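A minimal simulation sketch of the 'power law in disguise' idea: a composite Brownian process whose step lengths come from a superposition of exponential modes at different scales can show an apparently straight log-log tail over a limited range. This is not the paper's actual simulation protocol; the mode scales, weights, and fitting range are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Composite Brownian motion: step lengths drawn from a superposition of
# independent movement modes operating at different characteristic scales.
scales = np.array([1.0, 10.0, 100.0])     # assumed mode scales
weights = np.array([0.7, 0.25, 0.05])     # assumed mode weights
mode = rng.choice(len(scales), size=200_000, p=weights)
steps = rng.exponential(scales[mode])

# Empirical survival function (CCDF) of step lengths.
s = np.sort(steps)
ccdf = 1.0 - np.arange(1, len(s) + 1) / len(s)

# Fit an apparent power-law exponent over an intermediate range of scales;
# the roughly straight log-log segment is the 'power law in disguise'.
mask = (s > 2) & (s < 200) & (ccdf > 0)
slope, _ = np.polyfit(np.log(s[mask]), np.log(ccdf[mask]), 1)
print("apparent power-law exponent over the range 2-200:", round(-slope, 2))
```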
Phenomenology of small violations of Fermi and Bose statistics
NASA Astrophysics Data System (ADS)
Greenberg, O. W.; Mohapatra, Rabindra N.
1989-04-01
In a recent paper, we proposed a ``paronic'' field-theory framework for possible small deviations from the Pauli exclusion principle. This theory cannot be represented in a positive-metric (Hilbert) space. Nonetheless, the issue of possible small violations of the exclusion principle can be addressed in the framework of quantum mechanics, without being connected with a local quantum field theory. In this paper, we discuss the phenomenology of small violations of both Fermi and Bose statistics. We consider the implications of such violations in atomic, nuclear, particle, and condensed-matter physics and in astrophysics and cosmology. We also discuss experiments that can detect small violations of Fermi and Bose statistics or place stringent bounds on their validity.
Saffran, Jenny R.; Kirkham, Natasha Z.
2017-01-01
Perception involves making sense of a dynamic, multimodal environment. In the absence of mechanisms capable of exploiting the statistical patterns in the natural world, infants would face an insurmountable computational problem. Infant statistical learning mechanisms facilitate the detection of structure. These abilities allow the infant to compute across elements in their environmental input, extracting patterns for further processing and subsequent learning. In this selective review, we summarize findings that show that statistical learning is both a broad and flexible mechanism (supporting learning from different modalities across many different content areas) and input specific (shifting computations depending on the type of input and goal of learning). We suggest that statistical learning not only provides a framework for studying language development and object knowledge in constrained laboratory settings, but also allows researchers to tackle real-world problems, such as multilingualism, the role of ever-changing learning environments, and differential developmental trajectories. PMID:28793812
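A minimal sketch of the kind of computation such learning mechanisms are thought to perform, using transitional probabilities over a toy syllable stream; the invented words and stream length are assumptions for illustration rather than material from the article.

```python
import random
from collections import Counter

random.seed(3)

# Toy continuous syllable stream built from three invented "words"; in the
# classic paradigm, transitional probabilities are higher within words than
# across word boundaries.
words = ["bidaku", "padoti", "golabu"]
stream = "".join(random.choice(words) for _ in range(300))
syllables = [stream[i:i + 2] for i in range(0, len(stream), 2)]

pair_counts = Counter(zip(syllables, syllables[1:]))
first_counts = Counter(syllables[:-1])

def transitional_probability(a, b):
    """TP(b | a) = frequency of the pair ab / frequency of a."""
    return pair_counts[(a, b)] / first_counts[a]

print("within-word  TP(da | bi):", round(transitional_probability("bi", "da"), 2))
print("across-words TP(pa | ku):", round(transitional_probability("ku", "pa"), 2))
```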
Reconstructing constructivism: Causal models, Bayesian learning mechanisms and the theory theory
Gopnik, Alison; Wellman, Henry M.
2012-01-01
We propose a new version of the “theory theory” grounded in the computational framework of probabilistic causal models and Bayesian learning. Probabilistic models allow a constructivist but rigorous and detailed approach to cognitive development. They also explain the learning of both more specific causal hypotheses and more abstract framework theories. We outline the new theoretical ideas, explain the computational framework in an intuitive and non-technical way, and review an extensive but relatively recent body of empirical results that supports these ideas. These include new studies of the mechanisms of learning. Children infer causal structure from statistical information, through their own actions on the world and through observations of the actions of others. Studies demonstrate these learning mechanisms in children from 16 months to 4 years old and include research on causal statistical learning, informal experimentation through play, and imitation and informal pedagogy. They also include studies of the variability and progressive character of intuitive theory change, particularly theory of mind. These studies investigate both the physical and psychological and social domains. We conclude with suggestions for further collaborative projects between developmental and computational cognitive scientists. PMID:22582739
Gautestad, Arild O.
2012-01-01
Animals moving under the influence of spatio-temporal scaling and long-term memory generate a kind of space-use pattern that has proved difficult to model within a coherent theoretical framework. An extended kind of statistical mechanics is needed, accounting for both the effects of spatial memory and scale-free space use, and put into a context of ecological conditions. Simulations illustrating the distinction between scale-specific and scale-free locomotion are presented. The results show how observational scale (time lag between relocations of an individual) may critically influence the interpretation of the underlying process. In this respect, a novel protocol is proposed as a method to distinguish between some main movement classes. For example, the ‘power law in disguise’ paradox—from a composite Brownian motion consisting of a superposition of independent movement processes at different scales—may be resolved by shifting the focus from pattern analysis at one particular temporal resolution towards a more process-oriented approach involving several scales of observation. A more explicit consideration of system complexity within a statistical mechanical framework, supplementing the more traditional mechanistic modelling approach, is advocated. PMID:22456456
Krug, Klaus-Peter; Knauber, Andreas W; Nothdurft, Frank P
2015-03-01
The aim of this study was to investigate the fracture behavior of metal-ceramic bridges with frameworks from cobalt-chromium-molybdenum (CoCrMo), which are manufactured using conventional casting or a new computer-aided design/computer-aided manufacturing (CAD/CAM) milling and sintering technique. A total of 32 metal-ceramic fixed dental prostheses (FDPs), which are based on a nonprecious metal framework, was produced using a conventional casting process (n = 16) or a new CAD/CAM milling and sintering process (n = 16). Eight unveneered frameworks were manufactured using each of the techniques. After thermal and mechanical aging of half of the restorations, all samples were subjected to a static loading test in a universal testing machine, in which acoustic emission monitoring was performed. Three different critical forces were revealed: the fracture force (F max), the force at the first reduction in force (F decr1), and the force at the critical acoustic event (F acoust1). With the exception of the veneered restorations with cast or sintered metal frameworks without artificial aging, which presented a statistically significant but slightly different F max, no statistically significant differences between cast and CAD/CAM sintered and milled FDPs were detected. Thermal and mechanical loading did not significantly affect the resulting forces. Cast and CAD/CAM milled and sintered metal-ceramic bridges were determined to be comparable with respect to the fracture behavior. FDPs based on CAD/CAM milled and sintered frameworks may be an applicable and less technique-sensitive alternative to frameworks that are based on conventionally cast frameworks.
Mechanics and statistics of the worm-like chain
NASA Astrophysics Data System (ADS)
Marantan, Andrew; Mahadevan, L.
2018-02-01
The worm-like chain model is a simple continuum model for the statistical mechanics of a flexible polymer subject to an external force. We offer a tutorial introduction to it using three approaches. First, we use a mesoscopic view, treating a long polymer (in two dimensions) as though it were made of many groups of correlated links or "clinks," allowing us to calculate its average extension as a function of the external force via scaling arguments. We then provide a standard statistical mechanics approach, obtaining the average extension by two different means: the equipartition theorem and the partition function. Finally, we work in a probabilistic framework, taking advantage of the Gaussian properties of the chain in the large-force limit to improve upon the previous calculations of the average extension.
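A minimal numerical sketch of the large-force behaviour discussed above, assuming the commonly quoted three-dimensional asymptote <x>/L ~ 1 - (1/2) sqrt(kBT/(F lp)); the persistence length and force values are illustrative, and the tutorial's own prefactors (including its two-dimensional treatment) are not reproduced here.

```python
import numpy as np

kBT = 4.11e-21      # thermal energy at ~298 K, in joules
lp = 50e-9          # assumed persistence length (~50 nm, dsDNA-like)

def relative_extension_large_force(F):
    """Large-force (Gaussian-fluctuation) limit of the 3D worm-like chain:
    <x>/L ~ 1 - (1/2) * sqrt(kBT / (F * lp))."""
    return 1.0 - 0.5 * np.sqrt(kBT / (F * lp))

for F in (0.1e-12, 1e-12, 10e-12):   # forces in newtons (0.1, 1, 10 pN)
    print(f"F = {F*1e12:4.1f} pN  ->  <x>/L = {relative_extension_large_force(F):.3f}")
```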
A κ-generalized statistical mechanics approach to income analysis
NASA Astrophysics Data System (ADS)
Clementi, F.; Gallegati, M.; Kaniadakis, G.
2009-02-01
This paper proposes a statistical mechanics approach to the analysis of income distribution and inequality. A new distribution function, having its roots in the framework of κ-generalized statistics, is derived that is particularly suitable for describing the whole spectrum of incomes, from the low-middle income region up to the high income Pareto power-law regime. Analytical expressions for the shape, moments and some other basic statistical properties are given. Furthermore, several well-known econometric tools for measuring inequality, which all exist in a closed form, are considered. A method for parameter estimation is also discussed. The model is shown to fit remarkably well the data on personal income for the United States, and the analysis of inequality performed in terms of its parameters is revealed as very powerful.
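A minimal sketch of the κ-exponential and the resulting survival function of income, with exponential-like behaviour at low and middle incomes crossing over to a Pareto power-law tail; the parameter values alpha, beta, and kappa are illustrative assumptions, not the values fitted to the United States data.

```python
import numpy as np

def exp_kappa(x, kappa):
    """kappa-exponential: reduces to the ordinary exponential as kappa -> 0."""
    return (np.sqrt(1.0 + kappa**2 * x**2) + kappa * x) ** (1.0 / kappa)

def income_ccdf(x, alpha, beta, kappa):
    """Survival function of the kappa-generalized income distribution:
    exponential-like at low/middle incomes, power-law (~ x**(-alpha/kappa))
    in the upper tail."""
    return exp_kappa(-beta * x**alpha, kappa)

x = np.array([0.5, 1.0, 2.0, 5.0, 10.0])   # income in units of the mean (assumed)
print(np.round(income_ccdf(x, alpha=2.0, beta=0.8, kappa=0.7), 4))
```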
Statistical mechanics and scaling of fault populations with increasing strain in the Corinth Rift
NASA Astrophysics Data System (ADS)
Michas, Georgios; Vallianatos, Filippos; Sammonds, Peter
2015-12-01
Scaling properties of fracture/fault systems are studied in order to characterize the mechanical properties of rocks and to provide insight into the mechanisms that govern fault growth. A comprehensive image of the fault network in the Corinth Rift, Greece, obtained through numerous field studies and marine geophysical surveys, allows for the first time such a study over the entire area of the Rift. We compile a detailed fault map of the area and analyze the scaling properties of fault trace-lengths by using a statistical mechanics model, derived in the framework of generalized statistical mechanics and associated maximum entropy principle. By using this framework, a range of asymptotic power-law to exponential-like distributions are derived that can well describe the observed scaling patterns of fault trace-lengths in the Rift. Systematic variations and in particular a transition from asymptotic power-law to exponential-like scaling are observed to be a function of increasing strain in distinct strain regimes in the Rift, providing quantitative evidence for such crustal processes in a single tectonic setting. These results indicate the organization of the fault system as a function of brittle strain in the Earth's crust and suggest there are different mechanisms for fault growth in the distinct parts of the Rift. In addition, other factors such as fault interactions and the thickness of the brittle layer affect how the fault system evolves in time. The results suggest that regional strain, fault interactions and the boundary condition of the brittle layer may control fault growth and the fault network evolution in the Corinth Rift.
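A minimal sketch of the q-exponential form that interpolates between asymptotic power-law and exponential-like scaling of fault trace-lengths; the entropic index q and the scale parameter are illustrative assumptions, not the values fitted to the Corinth Rift data.

```python
import numpy as np

def q_exponential(x, q):
    """Tsallis q-exponential; recovers exp(x) as q -> 1. Valid where
    1 + (1 - q) * x > 0, which holds for all the calls below."""
    if abs(q - 1.0) < 1e-12:
        return np.exp(x)
    return (1.0 + (1.0 - q) * x) ** (1.0 / (1.0 - q))

def fault_length_ccdf(length, q, scale):
    """Probability that a fault trace-length exceeds `length` (km)."""
    return q_exponential(-length / scale, q)

lengths = np.array([0.5, 1, 2, 5, 10, 20])   # km, illustrative
print("q = 1.3 (power-law-like):", np.round(fault_length_ccdf(lengths, 1.3, 2.0), 4))
print("q -> 1  (exponential)   :", np.round(fault_length_ccdf(lengths, 1.0, 2.0), 4))
```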
Generalized statistical mechanics approaches to earthquakes and tectonics.
Vallianatos, Filippos; Papadakis, Giorgos; Michas, Georgios
2016-12-01
Despite the extreme complexity that characterizes the mechanism of the earthquake generation process, simple empirical scaling relations apply to the collective properties of earthquakes and faults in a variety of tectonic environments and scales. The physical characterization of those properties and the scaling relations that describe them attract a wide scientific interest and are incorporated in the probabilistic forecasting of seismicity in local, regional and planetary scales. Considerable progress has been made in the analysis of the statistical mechanics of earthquakes, which, based on the principle of entropy, can provide a physical rationale to the macroscopic properties frequently observed. The scale-invariant properties, the (multi) fractal structures and the long-range interactions that have been found to characterize fault and earthquake populations have recently led to the consideration of non-extensive statistical mechanics (NESM) as a consistent statistical mechanics framework for the description of seismicity. The consistency between NESM and observations has been demonstrated in a series of publications on seismicity, faulting, rock physics and other fields of geosciences. The aim of this review is to present in a concise manner the fundamental macroscopic properties of earthquakes and faulting and how these can be derived by using the notions of statistical mechanics and NESM, providing further insights into earthquake physics and fault growth processes.
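As a concrete example of the simple empirical scaling relations referred to above, the following sketch generates a synthetic Gutenberg-Richter catalogue and recovers the b-value with Aki's maximum-likelihood estimator; the catalogue parameters are assumptions and the binning correction for discretized magnitudes is omitted.

```python
import numpy as np

rng = np.random.default_rng(4)

# Synthetic catalogue obeying the Gutenberg-Richter relation
# log10 N(>=M) = a - b * M: magnitudes above the completeness threshold Mc
# are exponentially distributed with rate b * ln(10).
b_true, Mc = 1.0, 2.0
mags = Mc + rng.exponential(1.0 / (b_true * np.log(10)), size=5000)

# Aki's maximum-likelihood estimate of the b-value (continuous magnitudes).
b_hat = np.log10(np.e) / (mags.mean() - Mc)
print("estimated b-value:", round(b_hat, 3))
```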
Generalized statistical mechanics approaches to earthquakes and tectonics
Vallianatos, Filippos; Papadakis, Giorgos; Michas, Georgios
2016-01-01
Despite the extreme complexity that characterizes the mechanism of the earthquake generation process, simple empirical scaling relations apply to the collective properties of earthquakes and faults in a variety of tectonic environments and scales. The physical characterization of those properties and the scaling relations that describe them attract a wide scientific interest and are incorporated in the probabilistic forecasting of seismicity in local, regional and planetary scales. Considerable progress has been made in the analysis of the statistical mechanics of earthquakes, which, based on the principle of entropy, can provide a physical rationale to the macroscopic properties frequently observed. The scale-invariant properties, the (multi) fractal structures and the long-range interactions that have been found to characterize fault and earthquake populations have recently led to the consideration of non-extensive statistical mechanics (NESM) as a consistent statistical mechanics framework for the description of seismicity. The consistency between NESM and observations has been demonstrated in a series of publications on seismicity, faulting, rock physics and other fields of geosciences. The aim of this review is to present in a concise manner the fundamental macroscopic properties of earthquakes and faulting and how these can be derived by using the notions of statistical mechanics and NESM, providing further insights into earthquake physics and fault growth processes. PMID:28119548
Statistical physics approach to earthquake occurrence and forecasting
NASA Astrophysics Data System (ADS)
de Arcangelis, Lucilla; Godano, Cataldo; Grasso, Jean Robert; Lippiello, Eugenio
2016-04-01
There is striking evidence that the dynamics of the Earth crust is controlled by a wide variety of mutually dependent mechanisms acting at different spatial and temporal scales. The interplay of these mechanisms produces instabilities in the stress field, leading to abrupt energy releases, i.e., earthquakes. As a consequence, the evolution towards instability before a single event is very difficult to monitor. On the other hand, collective behavior in stress transfer and relaxation within the Earth crust leads to emergent properties described by stable phenomenological laws for a population of many earthquakes in size, time and space domains. This observation has stimulated a statistical mechanics approach to earthquake occurrence, applying ideas and methods such as scaling laws, universality, fractal dimension, and the renormalization group to characterize the physics of earthquakes. In this review we first present a description of the phenomenological laws of earthquake occurrence which represent the frame of reference for a variety of statistical mechanical models, ranging from the spring-block to more complex fault models. Next, we discuss the problem of seismic forecasting in the general framework of stochastic processes, where seismic occurrence can be described as a branching process implementing space-time-energy correlations between earthquakes. In this context we show how correlations originate from dynamical scaling relations between time and energy, able to account for universality and provide a unifying description for the phenomenological power laws. Then we discuss how branching models can be implemented to forecast the temporal evolution of the earthquake occurrence probability and to discriminate among different physical mechanisms responsible for earthquake triggering. In particular, the forecasting problem will be presented in a rigorous mathematical framework, discussing the relevance of the processes acting at different temporal scales for different levels of prediction. In this review we also briefly discuss how the statistical mechanics approach can be applied to non-tectonic earthquakes and to other natural stochastic processes, such as volcanic eruptions and solar flares.
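A minimal sketch of a temporal branching (ETAS-like) conditional intensity of the kind referred to above, combining Omori-Utsu decay with exponential magnitude productivity; the functional form is the standard one from the ETAS literature, and all parameter values and the toy catalogue are illustrative assumptions.

```python
import numpy as np

def conditional_intensity(t, event_times, event_mags, mu=0.1, K=0.02,
                          alpha=1.0, c=0.01, p=1.1, Mc=2.0):
    """ETAS-like occurrence rate at time t (events/day):
    lambda(t) = mu + sum_i K * exp(alpha * (M_i - Mc)) / (t - t_i + c)**p,
    summed over past events only. All parameter values are illustrative."""
    past = event_times < t
    dt = t - event_times[past]
    return mu + np.sum(K * np.exp(alpha * (event_mags[past] - Mc)) / (dt + c) ** p)

# Toy catalogue: a mainshock at t = 10 days followed by two aftershocks.
times = np.array([10.0, 10.5, 12.0])
mags = np.array([5.5, 3.1, 2.8])
for t in (10.1, 11.0, 20.0, 100.0):
    print(f"t = {t:6.1f} d   lambda = {conditional_intensity(t, times, mags):.3f}")
```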
Manifold parametrization of the left ventricle for a statistical modelling of its complete anatomy
NASA Astrophysics Data System (ADS)
Gil, D.; Garcia-Barnes, J.; Hernández-Sabate, A.; Marti, E.
2010-03-01
Distortion of Left Ventricle (LV) external anatomy is related to some dysfunctions, such as hypertrophy. The architecture of myocardial fibers determines LV electromechanical activation patterns as well as mechanics. Thus, their joined modelling would allow the design of specific interventions (such as peacemaker implantation and LV remodelling) and therapies (such as resynchronization). On one hand, accurate modelling of external anatomy requires either a dense sampling or a continuous infinite dimensional approach, which requires non-Euclidean statistics. On the other hand, computation of fiber models requires statistics on Riemannian spaces. Most approaches compute separate statistical models for external anatomy and fibers architecture. In this work we propose a general mathematical framework based on differential geometry concepts for computing a statistical model including, both, external and fiber anatomy. Our framework provides a continuous approach to external anatomy supporting standard statistics. We also provide a straightforward formula for the computation of the Riemannian fiber statistics. We have applied our methodology to the computation of complete anatomical atlas of canine hearts from diffusion tensor studies. The orientation of fibers over the average external geometry agrees with the segmental description of orientations reported in the literature.
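A minimal sketch of statistics on Riemannian spaces for fiber data, using the standard log-Euclidean mean of symmetric positive-definite diffusion tensors as a stand-in; the paper's own formula for the Riemannian fiber statistics is not reproduced, and the toy tensors are assumptions.

```python
import numpy as np

def spd_log(T):
    """Matrix logarithm of a symmetric positive-definite tensor via eigendecomposition."""
    w, V = np.linalg.eigh(T)
    return V @ np.diag(np.log(w)) @ V.T

def spd_exp(S):
    """Matrix exponential of a symmetric tensor via eigendecomposition."""
    w, V = np.linalg.eigh(S)
    return V @ np.diag(np.exp(w)) @ V.T

def log_euclidean_mean(tensors):
    """Log-Euclidean mean of diffusion tensors: log, average, exponentiate."""
    return spd_exp(np.mean([spd_log(T) for T in tensors], axis=0))

# Two toy diffusion tensors with different principal fiber orientations.
T1 = np.diag([1.5, 0.3, 0.3])
R = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])          # 90-degree rotation about z
T2 = R @ T1 @ R.T
print(np.round(log_euclidean_mean([T1, T2]), 3))
```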
A Framework for Assessing High School Students' Statistical Reasoning.
Chan, Shiau Wei; Ismail, Zaleha; Sumintono, Bambang
2016-01-01
Based on a synthesis of literature, earlier studies, analyses and observations on high school students, this study developed an initial framework for assessing students' statistical reasoning about descriptive statistics. Framework descriptors were established across five levels of statistical reasoning and four key constructs. The former consisted of idiosyncratic reasoning, verbal reasoning, transitional reasoning, procedural reasoning, and integrated process reasoning. The latter include describing data, organizing and reducing data, representing data, and analyzing and interpreting data. In contrast to earlier studies, this initial framework formulated a complete and coherent statistical reasoning framework. A statistical reasoning assessment tool was then constructed from this initial framework. The tool was administered to 10 tenth-grade students in a task-based interview. The initial framework was refined, and the statistical reasoning assessment tool was revised. The ten students then participated in the second task-based interview, and the data obtained were used to validate the framework. The findings showed that the students' statistical reasoning levels were consistent across the four constructs, and this result confirmed the framework's cohesion. Developed to contribute to statistics education, this newly developed statistical reasoning framework provides a guide for planning learning goals and designing instruction and assessments.
Collective behaviours: from biochemical kinetics to electronic circuits.
Agliari, Elena; Barra, Adriano; Burioni, Raffaella; Di Biasio, Aldo; Uguzzoni, Guido
2013-12-10
In this work we aim to highlight a close analogy between cooperative behaviors in chemical kinetics and cybernetics; this is realized by using a common language for their description, that is mean-field statistical mechanics. First, we perform a one-to-one mapping between paradigmatic behaviors in chemical kinetics (i.e., non-cooperative, cooperative, ultra-sensitive, anti-cooperative) and in mean-field statistical mechanics (i.e., paramagnetic, high and low temperature ferromagnetic, anti-ferromagnetic). Interestingly, the statistical mechanics approach allows a unified, broad theory for all scenarios and, in particular, Michaelis-Menten, Hill and Adair equations are consistently recovered. This framework is then tested against experimental biological data with an overall excellent agreement. One step forward, we consistently read the whole mapping from a cybernetic perspective, highlighting deep structural analogies between the above-mentioned kinetics and fundamental bricks in electronics (i.e. operational amplifiers, flashes, flip-flops), so to build a clear bridge linking biochemical kinetics and cybernetics.
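A minimal sketch of the saturation functions recovered in this mapping, with the Hill coefficient n playing the role that cooperativity (the spin coupling) plays in the statistical-mechanics picture; the correspondence is indicated only qualitatively in the comments, and the parameter values are illustrative.

```python
import numpy as np

def hill(substrate, K, n):
    """Fractional saturation theta = S**n / (K**n + S**n).
    n = 1 recovers Michaelis-Menten (non-cooperative / 'paramagnetic'),
    n > 1 is cooperative ('ferromagnetic-like'),
    0 < n < 1 is anti-cooperative ('anti-ferromagnetic-like')."""
    return substrate**n / (K**n + substrate**n)

S = np.array([0.1, 0.5, 1.0, 2.0, 10.0])   # substrate concentration in units of K
for n in (0.5, 1.0, 4.0):
    print(f"n = {n}:", np.round(hill(S, K=1.0, n=n), 3))
```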
Collective behaviours: from biochemical kinetics to electronic circuits
NASA Astrophysics Data System (ADS)
Agliari, Elena; Barra, Adriano; Burioni, Raffaella; di Biasio, Aldo; Uguzzoni, Guido
2013-12-01
In this work we aim to highlight a close analogy between cooperative behaviors in chemical kinetics and cybernetics; this is realized by using a common language for their description, that is mean-field statistical mechanics. First, we perform a one-to-one mapping between paradigmatic behaviors in chemical kinetics (i.e., non-cooperative, cooperative, ultra-sensitive, anti-cooperative) and in mean-field statistical mechanics (i.e., paramagnetic, high and low temperature ferromagnetic, anti-ferromagnetic). Interestingly, the statistical mechanics approach allows a unified, broad theory for all scenarios and, in particular, Michaelis-Menten, Hill and Adair equations are consistently recovered. This framework is then tested against experimental biological data with an overall excellent agreement. One step forward, we consistently read the whole mapping from a cybernetic perspective, highlighting deep structural analogies between the above-mentioned kinetics and fundamental bricks in electronics (i.e. operational amplifiers, flashes, flip-flops), so to build a clear bridge linking biochemical kinetics and cybernetics.
Context-Aware Generative Adversarial Privacy
NASA Astrophysics Data System (ADS)
Huang, Chong; Kairouz, Peter; Chen, Xiao; Sankar, Lalitha; Rajagopal, Ram
2017-12-01
Preserving the utility of published datasets while simultaneously providing provable privacy guarantees is a well-known challenge. On the one hand, context-free privacy solutions, such as differential privacy, provide strong privacy guarantees, but often lead to a significant reduction in utility. On the other hand, context-aware privacy solutions, such as information theoretic privacy, achieve an improved privacy-utility tradeoff, but assume that the data holder has access to dataset statistics. We circumvent these limitations by introducing a novel context-aware privacy framework called generative adversarial privacy (GAP). GAP leverages recent advancements in generative adversarial networks (GANs) to allow the data holder to learn privatization schemes from the dataset itself. Under GAP, learning the privacy mechanism is formulated as a constrained minimax game between two players: a privatizer that sanitizes the dataset in a way that limits the risk of inference attacks on the individuals' private variables, and an adversary that tries to infer the private variables from the sanitized dataset. To evaluate GAP's performance, we investigate two simple (yet canonical) statistical dataset models: (a) the binary data model, and (b) the binary Gaussian mixture model. For both models, we derive game-theoretically optimal minimax privacy mechanisms, and show that the privacy mechanisms learned from data (in a generative adversarial fashion) match the theoretically optimal ones. This demonstrates that our framework can be easily applied in practice, even in the absence of dataset statistics.
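For the binary data model, a toy randomized-response-style privatizer and a Bayes-optimal adversary illustrate the privacy-distortion tension described above; this is an illustration only, not the game-theoretically optimal mechanism derived in the paper nor the learned GAN mechanism, and the flip probability is an assumption.

```python
import numpy as np

rng = np.random.default_rng(5)
n = 100_000
delta = 0.3                         # privatizer flip probability (assumed)

x = rng.integers(0, 2, n)           # private binary variable, uniform prior
x_hat = np.where(rng.random(n) < delta, 1 - x, x)   # sanitized release

# Bayes-optimal adversary under a uniform prior: guess the released value
# whenever delta < 0.5 (the most likely private value given the release).
guess = x_hat
inference_accuracy = np.mean(guess == x)   # privacy-loss proxy
distortion = np.mean(x_hat != x)           # utility-loss proxy
print(f"adversary accuracy ~ {inference_accuracy:.3f} (theory: {1 - delta})")
print(f"distortion         ~ {distortion:.3f} (theory: {delta})")
```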
A Framework for Assessing High School Students' Statistical Reasoning
2016-01-01
Based on a synthesis of literature, earlier studies, analyses and observations on high school students, this study developed an initial framework for assessing students’ statistical reasoning about descriptive statistics. Framework descriptors were established across five levels of statistical reasoning and four key constructs. The former consisted of idiosyncratic reasoning, verbal reasoning, transitional reasoning, procedural reasoning, and integrated process reasoning. The latter include describing data, organizing and reducing data, representing data, and analyzing and interpreting data. In contrast to earlier studies, this initial framework formulated a complete and coherent statistical reasoning framework. A statistical reasoning assessment tool was then constructed from this initial framework. The tool was administered to 10 tenth-grade students in a task-based interview. The initial framework was refined, and the statistical reasoning assessment tool was revised. The ten students then participated in the second task-based interview, and the data obtained were used to validate the framework. The findings showed that the students’ statistical reasoning levels were consistent across the four constructs, and this result confirmed the framework’s cohesion. Developed to contribute to statistics education, this newly developed statistical reasoning framework provides a guide for planning learning goals and designing instruction and assessments. PMID:27812091
SHARE: system design and case studies for statistical health information release
Gardner, James; Xiong, Li; Xiao, Yonghui; Gao, Jingjing; Post, Andrew R; Jiang, Xiaoqian; Ohno-Machado, Lucila
2013-01-01
Objectives We present SHARE, a new system for statistical health information release with differential privacy. We present two case studies that evaluate the software on real medical datasets and demonstrate the feasibility and utility of applying the differential privacy framework on biomedical data. Materials and Methods SHARE releases statistical information in electronic health records with differential privacy, a strong privacy framework for statistical data release. It includes a number of state-of-the-art methods for releasing multidimensional histograms and longitudinal patterns. We performed a variety of experiments on two real datasets, the surveillance, epidemiology and end results (SEER) breast cancer dataset and the Emory electronic medical record (EeMR) dataset, to demonstrate the feasibility and utility of SHARE. Results Experimental results indicate that SHARE can deal with heterogeneous data present in medical data, and that the released statistics are useful. The Kullback–Leibler divergence between the released multidimensional histograms and the original data distribution is below 0.5 and 0.01 for seven-dimensional and three-dimensional data cubes generated from the SEER dataset, respectively. The relative error for longitudinal pattern queries on the EeMR dataset varies between 0 and 0.3. While the results are promising, they also suggest that challenges remain in applying statistical data release using the differential privacy framework for higher dimensional data. Conclusions SHARE is one of the first systems to provide a mechanism for custodians to release differentially private aggregate statistics for a variety of use cases in the medical domain. This proof-of-concept system is intended to be applied to large-scale medical data warehouses. PMID:23059729
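A minimal sketch of the basic differentially private primitive underlying histogram release, the Laplace mechanism; SHARE's actual multidimensional-histogram and longitudinal-pattern algorithms are more elaborate, and the epsilon values, counts, and sensitivity convention (add/remove one record) used here are assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

def laplace_histogram(counts, epsilon):
    """Release a histogram with epsilon-differential privacy by adding Laplace
    noise of scale sensitivity/epsilon to each cell; for a disjoint histogram,
    adding or removing one record changes one cell by 1, so sensitivity = 1."""
    noisy = counts + rng.laplace(0.0, 1.0 / epsilon, size=counts.shape)
    return np.clip(np.round(noisy), 0, None)   # optional post-processing

true_counts = np.array([120, 45, 8, 301, 17])   # illustrative cohort counts
print("epsilon = 1.0 :", laplace_histogram(true_counts, 1.0))
print("epsilon = 0.1 :", laplace_histogram(true_counts, 0.1))
```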
Radiation from quantum weakly dynamical horizons in loop quantum gravity.
Pranzetti, Daniele
2012-07-06
We provide a statistical mechanical analysis of quantum horizons near equilibrium in the grand canonical ensemble. By matching the description of the nonequilibrium phase in terms of weakly dynamical horizons with a local statistical framework, we implement loop quantum gravity dynamics near the boundary. The resulting radiation process provides a quantum gravity description of the horizon evaporation. For large black holes, the spectrum we derive presents a discrete structure which could be potentially observable.
Edwards statistical mechanics for jammed granular matter
NASA Astrophysics Data System (ADS)
Baule, Adrian; Morone, Flaviano; Herrmann, Hans J.; Makse, Hernán A.
2018-01-01
In 1989, Sir Sam Edwards made the visionary proposition to treat jammed granular materials using a volume ensemble of equiprobable jammed states in analogy to thermal equilibrium statistical mechanics, despite their inherent athermal features. Since then, the statistical mechanics approach for jammed matter—one of the very few generalizations of Gibbs-Boltzmann statistical mechanics to out-of-equilibrium matter—has garnered an extraordinary amount of attention by both theorists and experimentalists. Its importance stems from the fact that jammed states of matter are ubiquitous in nature appearing in a broad range of granular and soft materials such as colloids, emulsions, glasses, and biomatter. Indeed, despite being one of the simplest states of matter—primarily governed by the steric interactions between the constitutive particles—a theoretical understanding based on first principles has proved exceedingly challenging. Here a systematic approach to jammed matter based on the Edwards statistical mechanical ensemble is reviewed. The construction of microcanonical and canonical ensembles based on the volume function, which replaces the Hamiltonian in jammed systems, is discussed. The importance of approximation schemes at various levels is emphasized leading to quantitative predictions for ensemble averaged quantities such as packing fractions and contact force distributions. An overview of the phenomenology of jammed states and experiments, simulations, and theoretical models scrutinizing the strong assumptions underlying Edwards approach is given including recent results suggesting the validity of Edwards ergodic hypothesis for jammed states. A theoretical framework for packings whose constitutive particles range from spherical to nonspherical shapes such as dimers, polymers, ellipsoids, spherocylinders or tetrahedra, hard and soft, frictional, frictionless and adhesive, monodisperse, and polydisperse particles in any dimensions is discussed providing insight into a unifying phase diagram for all jammed matter. Furthermore, the connection between the Edwards ensemble of metastable jammed states and metastability in spin glasses is established. This highlights the fact that the packing problem can be understood as a constraint satisfaction problem for excluded volume and force and torque balance leading to a unifying framework between the Edwards ensemble of equiprobable jammed states and out-of-equilibrium spin glasses.
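A schematic statement of the Edwards canonical ensemble in the notation commonly used in this literature (the volume function W replacing the Hamiltonian, the compactivity X replacing temperature, and lambda the analogue of Boltzmann's constant); exact conventions vary across the works reviewed.

```latex
% Edwards canonical ensemble (schematic): the volume function W_nu replaces
% the Hamiltonian and the compactivity X replaces temperature.
P_\nu = \frac{\Theta_{\mathrm{jam}}(\nu)\, e^{-W_\nu/(\lambda X)}}{Z(X)},
\qquad
Z(X) = \sum_\nu \Theta_{\mathrm{jam}}(\nu)\, e^{-W_\nu/(\lambda X)},
\qquad
\frac{1}{X} = \frac{\partial S}{\partial V},
```

where the indicator Theta_jam restricts the sum to mechanically stable (jammed) configurations nu and S is the entropy of jammed states at volume V.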
Effects of Peer Tutoring on Reading Self-Concept
ERIC Educational Resources Information Center
Flores, Marta; Duran, David
2013-01-01
This study investigates the development of the Reading Self-Concept and of the mechanisms underlying it, within a framework of a reading programme based on peer tutoring. The multiple methodological design adopted allowed for a quantitative approach which showed statistically significant changes in the Reading Self-Concept of those students who…
PREFACE: Mathematical Aspects of Generalized Entropies and their Applications
NASA Astrophysics Data System (ADS)
Suyari, Hiroki; Ohara, Atsumi; Wada, Tatsuaki
2010-01-01
In the recent increasing interests in power-law behaviors beyond the usual exponential ones, there have been some concrete attempts in statistical physics to generalize the standard Boltzmann-Gibbs statistics. Among such generalizations, nonextensive statistical mechanics has been well studied for about the last two decades with many modifications and refinements. The generalization has provided not only a theoretical framework but also many applications such as chaos, multi-fractal, complex systems, nonequilibrium statistical mechanics, biophysics, econophysics, information theory and so on. At the same time as the developments in the generalization of statistical mechanics, the corresponding mathematical structures have also been required and uncovered. In particular, some deep connections to mathematical sciences such as q-analysis, information geometry, information theory and quantum probability theory have been revealed recently. These results obviously indicate an existence of the generalized mathematical structure including the mathematical framework for the exponential family as a special case, but the whole structure is still unclear. In order to make an opportunity to discuss the mathematical structure induced from generalized entropies by scientists in many fields, the international workshop 'Mathematical Aspects of Generalized Entropies and their Applications' was held on 7-9 July 2009 at Kyoto TERRSA, Kyoto, Japan. This volume is the proceedings of the workshop which consisted of 6 invited speakers, 14 oral presenters, 7 poster presenters and 63 other participants. The topics of the workshop cover the nonextensive statistical mechanics, chaos, cosmology, information geometry, divergence theory, econophysics, materials engineering, molecular dynamics and entropy theory, information theory and so on. The workshop was organized as the first attempt to discuss these mathematical aspects with leading experts in each area. We would like to express special thanks to all the invited speakers, the contributors and the participants at the workshop. We are also grateful to RIMS (Research Institute for Mathematical Science) in Kyoto University and the Ministry of Education, Science, Sports and Culture, Grant-in-Aid for Scientific Research (B), 18300003, 2009 for their support. Organizing Committee Editors of the Proceedings Hiroki Suyari (Chiba University, Japan) Atsumi Ohara (Osaka University, Japan) Tatsuaki Wada (Ibaraki University, Japan) Conference photograph
Effect of different aging methods on the mechanical behavior of multi-layered ceramic structures.
Borba, Márcia; de Araújo, Maico D; Fukushima, Karen A; Yoshimura, Humberto N; Griggs, Jason A; Della Bona, Álvaro; Cesar, Paulo F
2016-12-01
To evaluate the effect of two aging methods (mechanical cycling and autoclave) on the mechanical behavior of veneer and framework ceramic specimens with different configurations (monolithic, two and three-layers). Three ceramics used as framework for fixed dental prostheses (YZ-Vita In-Ceram YZ; IZ-Vita In-Ceram Zirconia; AL-Vita In-Ceram AL) and two veneering porcelains (VM7 and VM9) were studied. Bar-shaped specimens were produced in three different designs: monolithic, two layers (porcelain-framework) and three layers (porcelain-framework-porcelain). Specimens were tested for three-point flexural strength at 1MPa/s in 37°C artificial saliva. Three different experimental conditions were evaluated (n=10): control; mechanical cycling (2Hz, 37°C artificial saliva); and autoclave aging (134°C, 2 bars, 5h). Bi-layered specimens were tested in both conditions: with porcelain or framework ceramic under tension. Fracture surfaces were analyzed using stereomicroscope and scanning electron microscopy. Results were statistically analyzed using Kruskal-Wallis and Student-Newman-Keuls tests. Only for AL group, mechanical cycling and autoclave aging significantly decreased the flexural strength values in comparison to the control (p<0.01). YZ, AL, VM7 and VM9 monolithic groups showed no strength degradation. For multi-layered specimens, when the porcelain layer was tested in tension (bi and tri-layers), the aging methods evaluated also had no effect on strength (p≥0.05). Total and partial failure modes were identified. Mechanical cycling and autoclave aging protocols had no effect on the flexural strength values and failure behavior of YZ and IZ ceramic structures. Yet, AL monolithic structures showed a significant decrease in flexural strength with any of the aging methods. Copyright © 2016. Published by Elsevier Ltd.
Øilo, Marit; Nesse, Harald; Lundberg, Odd Johan; Gjerdet, Nils Roar
2018-04-25
New additive manufacturing techniques for nonprecious alloys have made the fabrication of metal-ceramic fixed partial dentures (FPDs) less expensive and less time-consuming. However, whether the mechanical properties produced by these techniques are comparable is unclear. The purpose of this in vitro study was to evaluate the mechanical properties of cobalt-chromium frameworks for FPDs fabricated by 3 different techniques. Thirty frameworks for 3-unit FPDs were fabricated by traditional casting, computer-aided design and computer-aided manufacturing (CAD-CAM) milling, and selective laser melting (SLM), with n=10 in each group. The frameworks were weighed, and distal and mesial connector areas measured. The frameworks were cemented and loaded centrally (0.5 mm/s) until deformation above 1 mm occurred. Stiffness was measured as the slope of the axis between 500 and 2000 N. Microhardness was measured on sectioned specimens by Vickers indentation. The microstructure was also analyzed by scanning electron microscopy. One-way ANOVA with Tukey post hoc analysis was used to compare the groups (α=.05). The framework design differed among the groups, making a comparison of strength impossible. The milled frameworks appeared bulky, while the cast and SLM frameworks were more slender. Statistically significant differences were found in microhardness, stiffness, wall thickness, weight, and connector size (P<.05), and a significant correlation was found between hardness and stiffness (-0.4, P<.005). Fabrication method affects the design, stiffness, microhardness, and microstructure of cobalt-chromium FPD frameworks. The SLM frameworks were stiffer and harder than the cast and milled specimens. Copyright © 2018 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
Statistical foundations of liquid-crystal theory
Seguin, Brian; Fried, Eliot
2013-01-01
We develop a mechanical theory for systems of rod-like particles. Central to our approach is the assumption that the external power expenditure for any subsystem of rods is independent of the underlying frame of reference. This assumption is used to derive the basic balance laws for forces and torques. By considering inertial forces on par with other forces, these laws hold relative to any frame of reference, inertial or noninertial. Finally, we introduce a simple set of constitutive relations to govern the interactions between rods and find restrictions necessary and sufficient for these laws to be consistent with thermodynamics. Our framework provides a foundation for a statistical mechanical derivation of the macroscopic balance laws governing liquid crystals. PMID:23772091
A statistical framework for applying RNA profiling to chemical hazard detection.
Kostich, Mitchell S
2017-12-01
Use of 'omics technologies in environmental science is expanding. However, application is mostly restricted to characterizing molecular steps leading from toxicant interaction with molecular receptors to apical endpoints in laboratory species. Use in environmental decision-making is limited, due to difficulty in elucidating mechanisms in sufficient detail to make quantitative outcome predictions in any single species or in extending predictions to aquatic communities. Here we introduce a mechanism-agnostic statistical approach, supplementing mechanistic investigation by allowing probabilistic outcome prediction even when understanding of molecular pathways is limited, and facilitating extrapolation from results in laboratory test species to predictions about aquatic communities. We use concepts familiar to environmental managers, supplemented with techniques employed for clinical interpretation of 'omics-based biomedical tests. We describe the framework in step-wise fashion, beginning with single test replicates of a single RNA variant, then extending to multi-gene RNA profiling, collections of test replicates, and integration of complementary data. In order to simplify the presentation, we focus on using RNA profiling for distinguishing presence versus absence of chemical hazards, but the principles discussed can be extended to other types of 'omics measurements, multi-class problems, and regression. We include a supplemental file demonstrating many of the concepts using the open source R statistical package. Published by Elsevier Ltd.
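A minimal sketch of the mechanism-agnostic idea: treat hazard detection as supervised classification of RNA profiles and report clinical-test-style operating characteristics estimated by cross-validation. The paper's supplemental material uses R; this Python sketch with scikit-learn, synthetic data, and an L2-penalized logistic classifier is only a parallel illustration, not the author's workflow.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedKFold, cross_val_predict
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(7)

# Synthetic RNA profiles: 80 samples x 200 transcripts; exposed samples get a
# small shift in the first 10 transcripts (purely illustrative).
n, p = 80, 200
X = rng.normal(size=(n, p))
y = np.repeat([0, 1], n // 2)                  # 0 = reference, 1 = chemical hazard
X[y == 1, :10] += 1.0

clf = LogisticRegression(penalty="l2", C=0.5, max_iter=1000)
pred = cross_val_predict(clf, X, y, cv=StratifiedKFold(5, shuffle=True, random_state=0))

tn, fp, fn, tp = confusion_matrix(y, pred).ravel()
print(f"sensitivity = {tp / (tp + fn):.2f}, specificity = {tn / (tn + fp):.2f}")
```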
Collective behaviours: from biochemical kinetics to electronic circuits
Agliari, Elena; Barra, Adriano; Burioni, Raffaella; Di Biasio, Aldo; Uguzzoni, Guido
2013-01-01
In this work we aim to highlight a close analogy between cooperative behaviors in chemical kinetics and cybernetics; this is realized by using a common language for their description, that is mean-field statistical mechanics. First, we perform a one-to-one mapping between paradigmatic behaviors in chemical kinetics (i.e., non-cooperative, cooperative, ultra-sensitive, anti-cooperative) and in mean-field statistical mechanics (i.e., paramagnetic, high and low temperature ferromagnetic, anti-ferromagnetic). Interestingly, the statistical mechanics approach allows a unified, broad theory for all scenarios and, in particular, Michaelis-Menten, Hill and Adair equations are consistently recovered. This framework is then tested against experimental biological data with an overall excellent agreement. One step forward, we consistently read the whole mapping from a cybernetic perspective, highlighting deep structural analogies between the above-mentioned kinetics and fundamental bricks in electronics (i.e. operational amplifiers, flashes, flip-flops), so to build a clear bridge linking biochemical kinetics and cybernetics. PMID:24322327
NASA Astrophysics Data System (ADS)
Barra, Adriano; Contucci, Pierluigi; Sandell, Rickard; Vernia, Cecilia
2014-02-01
How does immigrant integration in a country change with immigration density? Guided by a statistical mechanics perspective, we propose a novel approach to this problem. The analysis focuses on classical integration quantifiers such as the percentage of jobs (temporary and permanent) given to immigrants, mixed marriages, and newborns with parents of mixed origin. We find that the average values of different quantifiers may exhibit either linear or non-linear growth with immigrant density, and we suggest that social action, a concept identified by Max Weber, causes the observed non-linearity. Using the statistical mechanics notion of interaction to quantitatively emulate social action, a unified mathematical model for integration is proposed and shown to explain both growth behaviors observed. A linear theory, by contrast, which ignores the possibility of interaction effects, would underestimate the quantifiers by up to 30% when immigrant densities are low and overestimate them by as much when densities are high. The capacity to quantitatively isolate different types of integration mechanisms makes our framework a suitable tool in the quest for more efficient integration policies.
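To make the linear-versus-nonlinear contrast concrete, the sketch below fits a straight line and a square-root law to synthetic quantifier-versus-density data. The square-root form is used here purely as a representative concave nonlinear shape and the numbers are invented; this is not the paper's calibrated model.

```python
# Illustrative only: compare a linear fit with a square-root fit on synthetic
# "quantifier vs immigrant density" data. Data and functional form are assumed.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)
density = np.linspace(0.01, 0.20, 25)
quantifier = 0.5 * np.sqrt(density) + rng.normal(0, 0.005, density.size)

lin = lambda x, a: a * x
sqr = lambda x, a: a * np.sqrt(x)
(a_lin,), _ = curve_fit(lin, density, quantifier)
(a_sqr,), _ = curve_fit(sqr, density, quantifier)

for name, model, a in [("linear", lin, a_lin), ("sqrt", sqr, a_sqr)]:
    resid = quantifier - model(density, a)
    print(f"{name:6s} fit: coefficient={a:.3f}, RMS residual={np.sqrt(np.mean(resid**2)):.4f}")
```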
Yamamoto, Takeshi
2008-12-28
Conventional quantum chemical solvation theories are based on the mean-field embedding approximation. That is, the electronic wavefunction is calculated in the presence of the mean field of the environment. In this paper a direct quantum mechanical/molecular mechanical (QM/MM) analog of such a mean-field theory is formulated based on variational and perturbative frameworks. In the variational framework, an appropriate QM/MM free energy functional is defined and is minimized in terms of the trial wavefunction that best approximates the true QM wavefunction in a statistically averaged sense. An analytical free energy gradient is obtained, which takes the form of the gradient of the effective QM energy calculated in the averaged MM potential. In the perturbative framework, the above variational procedure is shown to be equivalent to the first-order expansion of the QM energy (in the exact free energy expression) about the self-consistent reference field. This helps clarify the relation between the variational procedure and the exact QM/MM free energy as well as existing QM/MM theories. Based on this, several ways are discussed for evaluating non-mean-field effects (i.e., statistical fluctuations of the QM wavefunction) that are neglected in the mean-field calculation. As an illustration, the method is applied to an SN2 Menshutkin reaction in water, NH₃ + CH₃Cl → NH₃CH₃⁺ + Cl⁻, for which free energy profiles are obtained at the Hartree-Fock, MP2, B3LYP, and BHHLYP levels by integrating the free energy gradient. Non-mean-field effects are evaluated to be <0.5 kcal/mol using a Gaussian fluctuation model for the environment, which suggests that those effects are rather small for the present reaction in water.
PGT: A Statistical Approach to Prediction and Mechanism Design
NASA Astrophysics Data System (ADS)
Wolpert, David H.; Bono, James W.
One of the biggest challenges facing behavioral economics is the lack of a single theoretical framework that is capable of directly utilizing all types of behavioral data. One of the biggest challenges of game theory is the lack of a framework for making predictions and designing markets in a manner that is consistent with the axioms of decision theory. An approach in which solution concepts are distribution-valued rather than set-valued (i.e. equilibrium theory) has both capabilities. We call this approach Predictive Game Theory (or PGT). This paper outlines a general Bayesian approach to PGT. It also presents one simple example to illustrate the way in which this approach differs from equilibrium approaches in both prediction and mechanism design settings.
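A toy example of what a distribution-valued (rather than set-valued) prediction can look like: a Beta posterior over an opponent's probability of playing one of two actions, updated from observed play. This is only a schematic Bayesian illustration with invented data, not the PGT formalism of Wolpert and Bono.

```python
# Toy distribution-valued prediction: Beta posterior over the probability that
# a player chooses action "A", given observed choices. Not the PGT formalism.
from scipy.stats import beta

observed = ["A", "A", "B", "A", "B", "A", "A"]   # hypothetical behavioral data
n_A = observed.count("A")
n_B = observed.count("B")

posterior = beta(1 + n_A, 1 + n_B)               # uniform prior, binomial likelihood
lo, hi = posterior.interval(0.90)
print(f"posterior mean P(A) = {posterior.mean():.2f}, "
      f"90% credible interval = ({lo:.2f}, {hi:.2f})")
```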
Markov Random Fields, Stochastic Quantization and Image Analysis
1990-01-01
Markov random fields based on the lattice Z² have been extensively used in image analysis in a Bayesian framework as a-priori models for the … of Image Analysis can be given some fundamental justification, then there is a remarkable connection between Probabilistic Image Analysis, Statistical Mechanics and Lattice-based Euclidean Quantum Field Theory.
ERIC Educational Resources Information Center
Grenn, Michael W.
2013-01-01
This dissertation introduces a theory of information quality to explain macroscopic behavior observed in the systems engineering process. The theory extends principles of Shannon's mathematical theory of communication [1948] and statistical mechanics to information development processes concerned with the flow, transformation, and meaning of…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Kandler; Shi, Ying; Santhanagopalan, Shriram
Predictive models of Li-ion battery lifetime must consider a multiplicity of electrochemical, thermal, and mechanical degradation modes experienced by batteries in application environments. To complicate matters, Li-ion batteries can experience different degradation trajectories that depend on storage and cycling history of the application environment. Rates of degradation are controlled by factors such as temperature history, electrochemical operating window, and charge/discharge rate. We present a generalized battery life prognostic model framework for battery systems design and control. The model framework consists of trial functions that are statistically regressed to Li-ion cell life datasets wherein the cells have been aged under different levels of stress. Degradation mechanisms and rate laws dependent on temperature, storage, and cycling condition are regressed to the data, with multiple model hypotheses evaluated and the best model down-selected based on statistics. The resulting life prognostic model, implemented in state variable form, is extensible to arbitrary real-world scenarios. The model is applicable in real-time control algorithms to maximize battery life and performance. We discuss efforts to reduce lifetime prediction error and accommodate its inevitable impact in controller design.
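A minimal sketch of the "trial functions regressed to life data" idea: fit a commonly assumed capacity-fade form (calendar fade proportional to the square root of time plus cycling fade proportional to cycle count) to synthetic aging data. The functional form, coefficients, and data are assumptions for illustration; the actual model framework involves many more stress factors and competing hypotheses.

```python
# Illustrative regression of a capacity-fade trial function to synthetic data.
# q(t, N) = 1 - a*sqrt(t) - b*N  (calendar + cycling fade); the form is assumed.
import numpy as np
from scipy.optimize import curve_fit

def trial_function(tN, a, b):
    t, N = tN
    return 1.0 - a * np.sqrt(t) - b * N

rng = np.random.default_rng(2)
t = np.linspace(0, 1000, 40)                 # days of aging
N = np.linspace(0, 2000, 40)                 # equivalent full cycles
q_meas = trial_function((t, N), 2e-3, 5e-5) + rng.normal(0, 0.002, t.size)

(a_fit, b_fit), _ = curve_fit(trial_function, (t, N), q_meas, p0=[1e-3, 1e-5])
print(f"fitted calendar-fade coefficient a = {a_fit:.2e} per sqrt(day)")
print(f"fitted cycling-fade  coefficient b = {b_fit:.2e} per cycle")
```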
Dunne, Lawrence J; Manos, George
2018-03-13
Although crucial for designing separation processes, little is known experimentally about multi-component adsorption isotherms in comparison with pure single components. Very few binary mixture adsorption isotherms are to be found in the literature, and information about isotherms over a wide range of gas-phase composition, mechanical pressures and temperature is lacking. Here, we present a quasi-one-dimensional statistical mechanical model of binary mixture adsorption in metal-organic frameworks (MOFs) treated exactly by a transfer matrix method in the osmotic ensemble. The experimental parameter space may be very complex, and investigations into multi-component mixture adsorption may be guided by theoretical insights. The approach successfully models breathing structural transitions induced by adsorption, giving a good account of the shape of adsorption isotherms of CO2 and CH4 adsorption in MIL-53(Al). Binary mixture isotherms and co-adsorption phase diagrams are also calculated and found to give a good description of the experimental trends in these properties; because a wide range of model parameters reproduces this behaviour, it appears to be generic to MOFs. Finally, a study is made of the influence of mechanical pressure on the shape of CO2 and CH4 adsorption isotherms in MIL-53(Al). Quite modest mechanical pressures can induce significant changes to isotherm shapes in MOFs, with implications for binary mixture separation processes. This article is part of the theme issue 'Modern theoretical chemistry'. © 2018 The Author(s).
NASA Astrophysics Data System (ADS)
Dunne, Lawrence J.; Manos, George
2018-03-01
Although crucial for designing separation processes, little is known experimentally about multi-component adsorption isotherms in comparison with pure single components. Very few binary mixture adsorption isotherms are to be found in the literature, and information about isotherms over a wide range of gas-phase composition, mechanical pressures and temperature is lacking. Here, we present a quasi-one-dimensional statistical mechanical model of binary mixture adsorption in metal-organic frameworks (MOFs) treated exactly by a transfer matrix method in the osmotic ensemble. The experimental parameter space may be very complex, and investigations into multi-component mixture adsorption may be guided by theoretical insights. The approach successfully models breathing structural transitions induced by adsorption, giving a good account of the shape of adsorption isotherms of CO2 and CH4 adsorption in MIL-53(Al). Binary mixture isotherms and co-adsorption phase diagrams are also calculated and found to give a good description of the experimental trends in these properties; because a wide range of model parameters reproduces this behaviour, it appears to be generic to MOFs. Finally, a study is made of the influence of mechanical pressure on the shape of CO2 and CH4 adsorption isotherms in MIL-53(Al). Quite modest mechanical pressures can induce significant changes to isotherm shapes in MOFs, with implications for binary mixture separation processes. This article is part of the theme issue 'Modern theoretical chemistry'.
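The osmotic-ensemble transfer-matrix treatment in this work is specific to breathing MOFs and binary mixtures; as a much simpler stand-in, the sketch below uses the textbook 2x2 transfer matrix of a one-dimensional lattice gas in the grand-canonical ensemble to produce an adsorption isotherm, just to show the mechanics of the method. All parameters are arbitrary and the model is not the authors' MOF model.

```python
# Textbook 1D lattice-gas transfer matrix (grand canonical): coverage vs chemical
# potential. A minimal analog of the transfer-matrix machinery only.
# Energy: -eps per occupied nearest-neighbor pair, chemical potential mu per site.
import numpy as np

def coverage(mu, eps=1.0, beta=1.0, d_mu=1e-5):
    def log_lambda(m):
        T = np.array([[1.0,                   np.exp(beta * m / 2)],
                      [np.exp(beta * m / 2),  np.exp(beta * (m + eps))]])
        return np.log(np.max(np.linalg.eigvalsh(T)))
    # coverage per site = (1/beta) * d(ln lambda_max)/d(mu), by central difference
    return (log_lambda(mu + d_mu) - log_lambda(mu - d_mu)) / (2 * beta * d_mu)

for mu in (-3.0, -1.5, 0.0, 1.5, 3.0):
    print(f"mu = {mu:+.1f}  ->  coverage = {coverage(mu):.3f}")
```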
Statistical foundations of liquid-crystal theory: I. Discrete systems of rod-like molecules.
Seguin, Brian; Fried, Eliot
2012-12-01
We develop a mechanical theory for systems of rod-like particles. Central to our approach is the assumption that the external power expenditure for any subsystem of rods is independent of the underlying frame of reference. This assumption is used to derive the basic balance laws for forces and torques. By considering inertial forces on par with other forces, these laws hold relative to any frame of reference, inertial or noninertial. Finally, we introduce a simple set of constitutive relations to govern the interactions between rods and find restrictions necessary and sufficient for these laws to be consistent with thermodynamics. Our framework provides a foundation for a statistical mechanical derivation of the macroscopic balance laws governing liquid crystals.
Fairchild, Amanda J.; Abara, Winston E.; Gottschall, Amanda C.; Tein, Jenn-Yun; Prinz, Ronald J.
2015-01-01
The purpose of this article is to introduce and describe a statistical model that researchers can use to evaluate underlying mechanisms of behavioral onset and other event occurrence outcomes. Specifically, the article develops a framework for estimating mediation effects with outcomes measured in discrete-time epochs by integrating the statistical mediation model with discrete-time survival analysis. The methodology has the potential to help strengthen health research by targeting prevention and intervention work more effectively as well as by improving our understanding of discretized periods of risk. The model is applied to an existing longitudinal data set to demonstrate its use, and programming code is provided to facilitate its implementation. PMID:24296470
Origin of the spike-timing-dependent plasticity rule
NASA Astrophysics Data System (ADS)
Cho, Myoung Won; Choi, M. Y.
2016-08-01
A biological synapse changes its efficacy depending on the difference between pre- and post-synaptic spike timings. Formulating spike-timing-dependent interactions in terms of the path integral, we establish a neural-network model, which makes it possible to predict relevant quantities rigorously by means of standard methods in statistical mechanics and field theory. In particular, the biological synaptic plasticity rule is shown to emerge as the optimal form for minimizing the free energy. It is further revealed that maximization of the entropy of neural activities gives rise to the competitive behavior of biological learning. This demonstrates that statistical mechanics helps to understand rigorously key characteristic behaviors of a neural network, thus providing the possibility of physics serving as a useful and relevant framework for probing life.
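For readers unfamiliar with the rule in question, the sketch below evaluates the standard asymmetric exponential STDP window (potentiation when the presynaptic spike precedes the postsynaptic one, depression otherwise). The amplitudes and time constants are generic textbook-style values, not those derived in the paper's free-energy argument.

```python
# Standard (empirical) STDP window: weight change as a function of the spike-time
# difference dt = t_post - t_pre. Amplitudes/time constants are generic values.
import numpy as np

def stdp(dt, a_plus=1.0, a_minus=0.5, tau_plus=20.0, tau_minus=20.0):
    dt = np.asarray(dt, dtype=float)
    return np.where(dt >= 0,
                    a_plus * np.exp(-dt / tau_plus),     # pre before post: potentiation
                    -a_minus * np.exp(dt / tau_minus))   # post before pre: depression

for dt in (-40.0, -10.0, 10.0, 40.0):                    # milliseconds
    print(f"dt = {dt:+5.1f} ms  ->  dW = {float(stdp(dt)):+.3f}")
```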
On a full Bayesian inference for force reconstruction problems
NASA Astrophysics Data System (ADS)
Aucejo, M.; De Smet, O.
2018-05-01
In a previous paper, the authors introduced a flexible methodology for reconstructing mechanical sources in the frequency domain from prior local information on both their nature and location over a linear and time-invariant structure. The proposed approach was derived from Bayesian statistics, because of its ability to account mathematically for the experimenter's prior knowledge. However, since only the Maximum a Posteriori estimate was computed, the posterior uncertainty about the regularized solution given the measured vibration field, the mechanical model and the regularization parameter was not assessed. To answer this legitimate question, this paper fully exploits the Bayesian framework to provide, from a Markov Chain Monte Carlo algorithm, credible intervals and other statistical measures (mean, median, mode) for all the parameters of the force reconstruction problem.
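As a concrete, self-contained illustration of the "full posterior rather than just the MAP" point, the sketch below runs a random-walk Metropolis-Hastings sampler on a toy one-parameter force-identification problem (a single unknown force amplitude observed through a known linear response with Gaussian noise) and reports a posterior mean and a 95% credible interval. The toy model is invented here and is far simpler than the paper's vibration-field formulation.

```python
# Toy Bayesian force identification with random-walk Metropolis-Hastings.
# Model: y_i = H_i * F + noise, F unknown. Invented example, not the paper's model.
import numpy as np

rng = np.random.default_rng(3)
H = rng.uniform(0.5, 1.5, size=30)           # known "transfer" coefficients
F_true, sigma = 2.0, 0.2
y = H * F_true + rng.normal(0, sigma, H.size)

def log_post(F):                             # flat prior on F, Gaussian likelihood
    return -0.5 * np.sum((y - H * F) ** 2) / sigma**2

samples, F = [], 0.0
for _ in range(20000):
    F_prop = F + rng.normal(0, 0.1)
    if np.log(rng.uniform()) < log_post(F_prop) - log_post(F):
        F = F_prop
    samples.append(F)
samples = np.array(samples[5000:])           # discard burn-in

lo, hi = np.percentile(samples, [2.5, 97.5])
print(f"posterior mean = {samples.mean():.3f}, "
      f"95% credible interval = ({lo:.3f}, {hi:.3f})")
```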
Statistical mechanics of ribbons under bending and twisting torques.
Sinha, Supurna; Samuel, Joseph
2013-11-20
We present an analytical study of ribbons subjected to an external torque. We first describe the elastic response of a ribbon within a purely mechanical framework. We then study the role of thermal fluctuations in modifying its elastic response. We predict the moment-angle relation of bent and twisted ribbons. Such a study is expected to shed light on the role of twist in DNA looping and on bending elasticity of twisted graphene ribbons. Our quantitative predictions can be tested against future single molecule experiments.
A statistical framework for biomedical literature mining.
Chung, Dongjun; Lawson, Andrew; Zheng, W Jim
2017-09-30
In systems biology, it is of great interest to identify new genes that were not previously reported to be associated with biological pathways related to various functions and diseases. Identification of these new pathway-modulating genes not only promotes understanding of pathway regulation mechanisms but also allows identification of novel targets for therapeutics. Recently, biomedical literature has been considered a valuable resource to investigate pathway-modulating genes. While the majority of currently available approaches are based on the co-occurrence of genes within an abstract, it has been reported that these approaches show only sub-optimal performance because 70% of abstracts contain information for only a single gene. To overcome this limitation, we propose a novel statistical framework based on the concept of the ontology fingerprint that uses gene ontology to extract information from large biomedical literature data. The proposed framework simultaneously identifies pathway-modulating genes and facilitates interpreting the functions of these new genes. We also propose a computationally efficient posterior inference procedure based on a Metropolis-Hastings-within-Gibbs sampler for parameter updates and the poor man's reversible jump Markov chain Monte Carlo approach for model selection. We evaluate the proposed statistical framework with simulation studies, experimental validation, and an application to studies of pathway-modulating genes in yeast. The R implementation of the proposed model is currently available at https://dongjunchung.github.io/bayesGO/. Copyright © 2017 John Wiley & Sons, Ltd.
Rahman, Md Mahmudur; Bhattacharya, Prabir; Desai, Bipin C
2007-01-01
A content-based image retrieval (CBIR) framework for a diverse collection of medical images of different imaging modalities, anatomic regions with different orientations, and biological systems is proposed. Organization of images in such a database (DB) is well defined with predefined semantic categories; hence, it can be useful for category-specific searching. The proposed framework consists of machine learning methods for image prefiltering, similarity matching using statistical distance measures, and a relevance feedback (RF) scheme. To narrow the semantic gap and increase retrieval efficiency, we investigate both supervised and unsupervised learning techniques to associate low-level global image features (e.g., color, texture, and edge) in the projected PCA-based eigenspace with their high-level semantic and visual categories. Specifically, we explore the use of a probabilistic multiclass support vector machine (SVM) and fuzzy c-means (FCM) clustering for categorization and prefiltering of images to reduce the search space. A category-specific statistical similarity matching is then performed at a finer level on the prefiltered images. To better incorporate perception subjectivity, an RF mechanism is also added to update the query parameters dynamically and adjust the proposed matching functions. Experiments are based on a ground-truth DB consisting of 5000 diverse medical images of 20 predefined categories. Analysis of results based on cross-validation (CV) accuracy and precision-recall for image categorization and retrieval is reported. It demonstrates the improvement, effectiveness, and efficiency achieved by the proposed framework.
The extraction and integration framework: a two-process account of statistical learning.
Thiessen, Erik D; Kronstein, Alexandra T; Hufnagle, Daniel G
2013-07-01
The term statistical learning in infancy research originally referred to sensitivity to transitional probabilities. Subsequent research has demonstrated that statistical learning contributes to infant development in a wide array of domains. The range of statistical learning phenomena necessitates a broader view of the processes underlying statistical learning. Learners are sensitive to a much wider range of statistical information than the conditional relations indexed by transitional probabilities, including distributional and cue-based statistics. We propose a novel framework that unifies learning about all of these kinds of statistical structure. From our perspective, learning about conditional relations outputs discrete representations (such as words). Integration across these discrete representations yields sensitivity to cues and distributional information. To achieve sensitivity to all of these kinds of statistical structure, our framework combines processes that extract segments of the input with processes that compare across these extracted items. In this framework, the items extracted from the input serve as exemplars in long-term memory. The similarity structure of those exemplars in long-term memory leads to the discovery of cues and categorical structure, which guides subsequent extraction. The extraction and integration framework provides a way to explain sensitivity to both conditional statistical structure (such as transitional probabilities) and distributional statistical structure (such as item frequency and variability), and also a framework for thinking about how these different aspects of statistical learning influence each other. 2013 APA, all rights reserved
NASA Astrophysics Data System (ADS)
Girard, L.; Weiss, J.; Molines, J. M.; Barnier, B.; Bouillon, S.
2009-08-01
Sea ice drift and deformation from models are evaluated on the basis of statistical and scaling properties. These properties are derived from two observation data sets: the RADARSAT Geophysical Processor System (RGPS) and buoy trajectories from the International Arctic Buoy Program (IABP). Two simulations obtained with the Louvain-la-Neuve Ice Model (LIM) coupled to a high-resolution ocean model and a simulation obtained with the Los Alamos Sea Ice Model (CICE) were analyzed. Model ice drift compares well with observations in terms of large-scale velocity field and distributions of velocity fluctuations, although a significant bias in the mean ice speed is noted. On the other hand, the statistical properties of ice deformation are not well simulated by the models: (1) The distributions of strain rates are incorrect: RGPS distributions of strain rates are power-law tailed, i.e., exhibit "wild randomness," whereas model distributions remain in the Gaussian attraction basin, i.e., exhibit "mild randomness." (2) The models are unable to reproduce the spatial and temporal correlations of the deformation fields: In the observations, ice deformation follows spatial and temporal scaling laws that express the heterogeneity and the intermittency of deformation. These relations do not appear in simulated ice deformation. Mean deformation in models is almost scale independent. The statistical properties of ice deformation are a signature of the ice mechanical behavior. The present work therefore suggests that the mechanical framework currently used by models is inappropriate. A different modeling framework based on elastic interactions could improve the representation of the statistical and scaling properties of ice deformation.
Statistics of dislocation pinning at localized obstacles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dutta, A.; Bhattacharya, M., E-mail: mishreyee@vecc.gov.in; Barat, P.
2014-10-14
Pinning of dislocations at nanosized obstacles like precipitates, voids, and bubbles is a crucial mechanism in the context of phenomena like hardening and creep. The interaction between such an obstacle and a dislocation is often studied at a fundamental level by means of analytical tools, atomistic simulations, and finite element methods. Nevertheless, the information extracted from such studies cannot be utilized to its maximum extent on account of insufficient information about the underlying statistics of this process comprising a large number of dislocations and obstacles in a system. Here, we propose a new statistical approach, where the statistics of pinning of dislocations by idealized spherical obstacles is explored by taking into account the generalized size-distribution of the obstacles along with the dislocation density within a three-dimensional framework. Starting with a minimal set of material parameters, the framework employs the method of geometrical statistics with a few simple assumptions compatible with the real physical scenario. The application of this approach, in combination with the knowledge of fundamental dislocation-obstacle interactions, has successfully been demonstrated for dislocation pinning at nanovoids in neutron-irradiated type 316 stainless steel in regard to the non-conservative motion of dislocations. An interesting phenomenon of transition from rare pinning to multiple pinning regimes with increasing irradiation temperature is revealed.
Theory of the Sea Ice Thickness Distribution
NASA Astrophysics Data System (ADS)
Toppaladoddi, Srikanth; Wettlaufer, J. S.
2015-10-01
We use concepts from statistical physics to transform the original evolution equation for the sea ice thickness distribution g(h) from Thorndike et al. into a Fokker-Planck-like conservation law. The steady solution is g(h) = N(q) h^q e^(-h/H), where q and H are expressible in terms of moments over the transition probabilities between thickness categories. The solution exhibits the functional form used in observational fits and shows that for h ≪ 1, g(h) is controlled by both thermodynamics and mechanics, whereas for h ≫ 1 only mechanics controls g(h). Finally, we derive the underlying Langevin equation governing the dynamics of the ice thickness h, from which we predict the observed g(h). The genericity of our approach provides a framework for studying the geophysical-scale structure of the ice pack using methods of broad relevance in statistical mechanics.
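For completeness, the normalization constant implicit in the quoted steady solution follows from requiring the thickness distribution to integrate to one; assuming g(h) is normalized on 0 < h < ∞ and q > -1, a short calculation gives

```latex
% Normalization of the steady solution g(h) = N(q) h^q e^{-h/H}
\int_0^\infty g(h)\,dh
  = N(q)\int_0^\infty h^{q}\,e^{-h/H}\,dh
  = N(q)\,\Gamma(q+1)\,H^{\,q+1} = 1
\quad\Longrightarrow\quad
N(q) = \frac{1}{\Gamma(q+1)\,H^{\,q+1}}.
```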
Theory of the Sea Ice Thickness Distribution.
Toppaladoddi, Srikanth; Wettlaufer, J S
2015-10-02
We use concepts from statistical physics to transform the original evolution equation for the sea ice thickness distribution g(h) from Thorndike et al. into a Fokker-Planck-like conservation law. The steady solution is g(h) = N(q) h^q e^(-h/H), where q and H are expressible in terms of moments over the transition probabilities between thickness categories. The solution exhibits the functional form used in observational fits and shows that for h ≪ 1, g(h) is controlled by both thermodynamics and mechanics, whereas for h ≫ 1 only mechanics controls g(h). Finally, we derive the underlying Langevin equation governing the dynamics of the ice thickness h, from which we predict the observed g(h). The genericity of our approach provides a framework for studying the geophysical-scale structure of the ice pack using methods of broad relevance in statistical mechanics.
Non-equilibrium Statistical Mechanics and the Sea Ice Thickness Distribution
NASA Astrophysics Data System (ADS)
Wettlaufer, John; Toppaladoddi, Srikanth
We use concepts from non-equilibrium statistical physics to transform the original evolution equation for the sea ice thickness distribution g(h) due to Thorndike et al. (1975) into a Fokker-Planck-like conservation law. The steady solution is g(h) = N(q) h^q e^(-h/H), where q and H are expressible in terms of moments over the transition probabilities between thickness categories. The solution exhibits the functional form used in observational fits and shows that for h ≪ 1, g(h) is controlled by both thermodynamics and mechanics, whereas for h ≫ 1 only mechanics controls g(h). Finally, we derive the underlying Langevin equation governing the dynamics of the ice thickness h, from which we predict the observed g(h). This allows us to demonstrate that the ice thickness field is ergodic. The genericity of our approach provides a framework for studying the geophysical-scale structure of the ice pack using methods of broad relevance in statistical mechanics. Support: Swedish Research Council Grant No. 638-2013-9243, NASA Grant NNH13ZDA001N-CRYO, and the National Science Foundation and the Office of Naval Research under OCE-1332750.
NASA Astrophysics Data System (ADS)
Adib, Artur B.
In the last two decades or so, a collection of results in nonequilibrium statistical mechanics that departs from the traditional near-equilibrium framework introduced by Lars Onsager in 1931 has been derived, yielding new fundamental insights into far-from-equilibrium processes in general. Apart from offering a more quantitative statement of the second law of thermodynamics, some of these results---typified by the so-called "Jarzynski equality"---have also offered novel means of estimating equilibrium quantities from nonequilibrium processes, such as free energy differences from single-molecule "pulling" experiments. This thesis contributes to such efforts by offering three novel results in nonequilibrium statistical mechanics: (a) The entropic analog of the Jarzynski equality; (b) A methodology for estimating free energies from "clamp-and-release" nonequilibrium processes; and (c) A directly measurable symmetry relation in chemical kinetics similar to (but more general than) chemical detailed balance. These results share in common the feature of remaining valid outside Onsager's near-equilibrium regime, and bear direct applicability in protein folding kinetics as well as in single-molecule free energy estimation.
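The Jarzynski equality mentioned above, ⟨e^(-βW)⟩ = e^(-βΔF), can be checked numerically on a trivially small example: an instantaneous quench of a harmonic potential's stiffness, for which ΔF is known exactly. The sketch below is that toy consistency check only; it is not any of the thesis's three results.

```python
# Numerical check of the Jarzynski equality for an instantaneous stiffness quench
# of a 1D harmonic oscillator U(x) = k*x^2/2, with k0 -> k1.
# Work for an instantaneous switch: W = (k1 - k0)*x^2/2, x drawn from equilibrium at k0.
# Exact free energy difference: dF = ln(k1/k0) / (2*beta), from Z = sqrt(2*pi/(beta*k)).
import numpy as np

beta, k0, k1 = 1.0, 1.0, 2.0
rng = np.random.default_rng(4)
x = rng.normal(0.0, np.sqrt(1.0 / (beta * k0)), size=200_000)  # equilibrium at k0
W = 0.5 * (k1 - k0) * x**2                                     # instantaneous quench

dF_exact = 0.5 * np.log(k1 / k0) / beta
dF_jarzynski = -np.log(np.mean(np.exp(-beta * W))) / beta

print(f"exact     dF = {dF_exact:.4f}")
print(f"Jarzynski dF = {dF_jarzynski:.4f}")
```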
Sequential geophysical and flow inversion to characterize fracture networks in subsurface systems
Mudunuru, Maruti Kumar; Karra, Satish; Makedonska, Nataliia; ...
2017-09-05
Subsurface applications, including geothermal, geological carbon sequestration, and oil and gas, typically involve maximizing either the extraction of energy or the storage of fluids. Fractures form the main pathways for flow in these systems, and locating these fractures is critical for predicting flow. However, fracture characterization is a highly uncertain process, and data from multiple sources, such as flow and geophysical data, are needed to reduce this uncertainty. We present a nonintrusive, sequential inversion framework for integrating data from geophysical and flow sources to constrain fracture networks in the subsurface. In this framework, we first estimate bounds on the statistics for the fracture orientations using microseismic data. These bounds are estimated through a combination of a focal mechanism (physics-based approach) and clustering analysis (statistical approach) of seismic data. Then, the fracture lengths are constrained using flow data. In conclusion, the efficacy of this inversion is demonstrated through a representative example.
Sequential geophysical and flow inversion to characterize fracture networks in subsurface systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mudunuru, Maruti Kumar; Karra, Satish; Makedonska, Nataliia
Subsurface applications, including geothermal, geological carbon sequestration, and oil and gas, typically involve maximizing either the extraction of energy or the storage of fluids. Fractures form the main pathways for flow in these systems, and locating these fractures is critical for predicting flow. However, fracture characterization is a highly uncertain process, and data from multiple sources, such as flow and geophysical data, are needed to reduce this uncertainty. We present a nonintrusive, sequential inversion framework for integrating data from geophysical and flow sources to constrain fracture networks in the subsurface. In this framework, we first estimate bounds on the statistics for the fracture orientations using microseismic data. These bounds are estimated through a combination of a focal mechanism (physics-based approach) and clustering analysis (statistical approach) of seismic data. Then, the fracture lengths are constrained using flow data. In conclusion, the efficacy of this inversion is demonstrated through a representative example.
An Observation-Driven Agent-Based Modeling and Analysis Framework for C. elegans Embryogenesis.
Wang, Zi; Ramsey, Benjamin J; Wang, Dali; Wong, Kwai; Li, Husheng; Wang, Eric; Bao, Zhirong
2016-01-01
With cutting-edge live microscopy and image analysis, biologists can now systematically track individual cells in complex tissues and quantify cellular behavior over extended time windows. Computational approaches that utilize the systematic and quantitative data are needed to understand how cells interact in vivo to give rise to the different cell types and 3D morphology of tissues. An agent-based, minimum descriptive modeling and analysis framework is presented in this paper to study C. elegans embryogenesis. The framework is designed to incorporate the large amounts of experimental observations on cellular behavior and reserve data structures/interfaces that allow regulatory mechanisms to be added as more insights are gained. Observed cellular behaviors are organized into lineage identity, timing and direction of cell division, and path of cell movement. The framework also includes global parameters such as the eggshell and a clock. Division and movement behaviors are driven by statistical models of the observations. Data structures/interfaces are reserved for gene list, cell-cell interaction, cell fate and landscape, and other global parameters until the descriptive model is replaced by a regulatory mechanism. This approach provides a framework to handle the ongoing experiments of single-cell analysis of complex tissues where mechanistic insights lag data collection and need to be validated on complex observations.
Flexural strength and failure modes of layered ceramic structures.
Borba, Márcia; de Araújo, Maico D; de Lima, Erick; Yoshimura, Humberto N; Cesar, Paulo F; Griggs, Jason A; Della Bona, Alvaro
2011-12-01
The aim was to evaluate the effect of specimen design on the flexural strength (σ(f)) and failure mode of ceramic structures, testing the hypothesis that the ceramic material under tension controls the mechanical performance of the structure. Three ceramics used as framework materials for fixed partial dentures (YZ--Vita In-Ceram YZ; IZ--Vita In-Ceram Zirconia; AL--Vita In-Ceram AL) and two veneering porcelains (VM7 and VM9) were studied. Bar-shaped specimens were produced in three different designs (n=10): monolithic, two layers (porcelain-framework) and three layers (TRI) (porcelain-framework-porcelain). Specimens were tested for three-point flexural strength at 1 MPa/s in 37°C artificial saliva. For the bi-layered design, the specimens were tested in both conditions: with the porcelain (PT) or the framework ceramic (FT) layer under tension. Fracture surfaces were analyzed using a stereomicroscope and scanning electron microscopy (SEM). Young's modulus (E) and Poisson's ratio (ν) were determined using the ultrasonic pulse-echo method. Results were statistically analyzed by Kruskal-Wallis and Student-Newman-Keuls tests. Except for VM7 and VM9, significant differences were observed in E values among the materials. YZ showed the highest ν value, followed by IZ and AL. YZ presented the highest σ(f). There was no statistical difference in the σ(f) value between IZ and IZ-FT or between AL and AL-FT. σ(f) values for YZ-PT, IZ-PT, IZ-TRI, AL-PT, and AL-TRI were similar to the results obtained for VM7 and VM9. Two types of fracture mode were identified: total and partial failure. The mechanical performance of the specimens was determined by the material under tension during testing, confirming the study hypothesis. Copyright © 2011 Academy of Dental Materials. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Müller, Tobias M.; Gurevich, Boris
2005-04-01
An important dissipation mechanism for waves in randomly inhomogeneous poroelastic media is the effect of wave-induced fluid flow. In the framework of Biot's theory of poroelasticity, this mechanism can be understood as scattering from fast into slow compressional waves. To describe this conversion scattering effect in poroelastic random media, the dynamic characteristics of the coherent wavefield are analyzed using the theory of statistical wave propagation. In particular, the method of statistical smoothing is applied to Biot's equations of poroelasticity. Within the accuracy of first-order statistical smoothing, an effective wave number of the coherent field, which accounts for the effect of wave-induced flow, is derived. This wave number is complex and involves an integral over the correlation function of the medium's fluctuations. It is shown that the known one-dimensional (1-D) result can be obtained as a special case of the present 3-D theory. The expression for the effective wave number allows a model for elastic attenuation and dispersion due to wave-induced fluid flow to be derived. These wavefield attributes are analyzed in a companion paper.
Random matrices and condensation into multiple states
NASA Astrophysics Data System (ADS)
Sadeghi, Sina; Engel, Andreas
2018-03-01
In the present work, we employ methods from statistical mechanics of disordered systems to investigate static properties of condensation into multiple states in a general framework. We aim at showing how typical properties of random interaction matrices play a vital role in manifesting the statistics of condensate states. In particular, an analytical expression for the fraction of condensate states in the thermodynamic limit is provided that confirms the result of the mean number of coexisting species in a random tournament game. We also study the interplay between the condensation problem and zero-sum games with correlated random payoff matrices.
Addressing the statistical mechanics of planet orbits in the solar system
NASA Astrophysics Data System (ADS)
Mogavero, Federico
2017-10-01
The chaotic nature of planet dynamics in the solar system suggests the relevance of a statistical approach to planetary orbits. In such a statistical description, the time-dependent position and velocity of the planets are replaced by the probability density function (PDF) of their orbital elements. It is natural to set up this kind of approach in the framework of statistical mechanics. In the present paper, I focus on the collisionless excitation of eccentricities and inclinations via gravitational interactions in a planetary system. The future planet trajectories in the solar system constitute the prototype of this kind of dynamics. I thus address the statistical mechanics of the solar system planet orbits and try to reproduce the PDFs numerically constructed by Laskar (2008, Icarus, 196, 1). I show that the microcanonical ensemble of the Laplace-Lagrange theory accurately reproduces the statistics of the giant planet orbits. To model the inner planets I then investigate the ansatz of equiprobability in the phase space constrained by the secular integrals of motion. The eccentricity and inclination PDFs of Earth and Venus are reproduced with no free parameters. Within the limitations of a stationary model, the predictions also show a reasonable agreement with Mars PDFs and that of Mercury inclination. The eccentricity of Mercury demands in contrast a deeper analysis. I finally revisit the random walk approach of Laskar to the time dependence of the inner planet PDFs. Such a statistical theory could be combined with direct numerical simulations of planet trajectories in the context of planet formation, which is likely to be a chaotic process.
Dittmer, Marc Philipp; Nensa, Moritz; Stiesch, Meike; Kohorst, Philipp
2013-01-01
Implant-supported screw-retained fixed dental prostheses (FDPs) produced by CAD/CAM have been introduced in recent years for the rehabilitation of partially or totally edentulous jaws. However, there is a lack of data about their long-term mechanical characteristics. The aim of this study was to investigate the failure mode and the influence of extended cyclic mechanical loading on the load-bearing capacity of these frameworks. Ten five-unit FDP frameworks simulating a free-end situation in the mandibular jaw were manufactured according to the I-Bridge®2 concept (I-Bridge®2, Biomain AB, Helsingborg, Sweden) and each was screw-retained on three differently angulated Astra Tech implants (30° buccal angulation/0° angulation/30° lingual angulation). One half of the specimens was tested for static load-bearing capacity without any further treatment (control), whereas the other half underwent five million cycles of mechanical loading with 100 N as the upper load limit (test). All specimens were loaded until failure in a universal testing machine with an occlusal force applied at the pontics. Load-displacement curves were recorded and the failure mode was macro- and microscopically analyzed. The statistical analysis was performed using a t-test (p=0.05). All the specimens survived cyclic mechanical loading and no obvious failure could be observed. Due to the cyclic mechanical loading, the load-bearing capacity decreased from 8,496 N±196 N (control) to 7,592 N±901 N (test). The cyclic mechanical loading did not significantly influence the load-bearing capacity (p=0.060). The failure mode was almost identical in all specimens: large deformations of the framework at the implant connection area were obvious. The load-bearing capacity of the I-Bridge®2 frameworks is much higher than clinically relevant occlusal forces, even with considerably angulated implants. However, the performance under functional loading in vivo depends on additional aspects. Further studies are needed to address these aspects.
DITTMER, Marc Philipp; NENSA, Moritz; STIESCH, Meike; KOHORST, Philipp
2013-01-01
Implant-supported screw-retained fixed dental prostheses (FDPs) produced by CAD/CAM have been introduced in recent years for the rehabilitation of partially or totally edentulous jaws. However, there is a lack of data about their long-term mechanical characteristics. Objective: The aim of this study was to investigate the failure mode and the influence of extended cyclic mechanical loading on the load-bearing capacity of these frameworks. Material and Methods: Ten five-unit FDP frameworks simulating a free-end situation in the mandibular jaw were manufactured according to the I-Bridge®2 concept (I-Bridge®2, Biomain AB, Helsingborg, Sweden) and each was screw-retained on three differently angulated Astra Tech implants (30° buccal angulation/0° angulation/30° lingual angulation). One half of the specimens was tested for static load-bearing capacity without any further treatment (control), whereas the other half underwent five million cycles of mechanical loading with 100 N as the upper load limit (test). All specimens were loaded until failure in a universal testing machine with an occlusal force applied at the pontics. Load-displacement curves were recorded and the failure mode was macro- and microscopically analyzed. The statistical analysis was performed using a t-test (p=0.05). Results: All the specimens survived cyclic mechanical loading and no obvious failure could be observed. Due to the cyclic mechanical loading, the load-bearing capacity decreased from 8,496 N±196 N (control) to 7,592 N±901 N (test). The cyclic mechanical loading did not significantly influence the load-bearing capacity (p=0.060). The failure mode was almost identical in all specimens: large deformations of the framework at the implant connection area were obvious. Conclusion: The load-bearing capacity of the I-Bridge®2 frameworks is much higher than clinically relevant occlusal forces, even with considerably angulated implants. However, the performance under functional loading in vivo depends on additional aspects. Further studies are needed to address these aspects. PMID:24037068
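The reported borderline p-value can be reproduced, at least approximately, from the summary statistics quoted in the abstract (means and standard deviations of the control and cyclically loaded groups, five specimens each), assuming a standard pooled-variance two-sample t-test. The calculation below is only a consistency check on those published numbers, not a reanalysis of raw data.

```python
# Consistency check of the reported p = 0.060 using the abstract's summary stats
# (assumes a pooled-variance two-sample t-test with n = 5 specimens per group).
from scipy.stats import ttest_ind_from_stats

t, p = ttest_ind_from_stats(mean1=8496, std1=196, nobs1=5,
                            mean2=7592, std2=901, nobs2=5,
                            equal_var=True)
print(f"t = {t:.2f}, two-sided p = {p:.3f}")   # expected to land close to 0.060
```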
Genser, Bernd; Fischer, Joachim E; Figueiredo, Camila A; Alcântara-Neves, Neuza; Barreto, Mauricio L; Cooper, Philip J; Amorim, Leila D; Saemann, Marcus D; Weichhart, Thomas; Rodrigues, Laura C
2016-05-20
Immunologists often measure several correlated immunological markers, such as concentrations of different cytokines produced by different immune cells and/or measured under different conditions, to draw insights from complex immunological mechanisms. Although there have been recent methodological efforts to improve the statistical analysis of immunological data, a framework is still needed for the simultaneous analysis of multiple, often correlated, immune markers. This framework would allow the immunologists' hypotheses about the underlying biological mechanisms to be integrated. We present an analytical approach for statistical analysis of correlated immune markers, such as those commonly collected in modern immuno-epidemiological studies. We demonstrate i) how to deal with interdependencies among multiple measurements of the same immune marker, ii) how to analyse association patterns among different markers, iii) how to aggregate different measures and/or markers to immunological summary scores, iv) how to model the inter-relationships among these scores, and v) how to use these scores in epidemiological association analyses. We illustrate the application of our approach to multiple cytokine measurements from 818 children enrolled in a large immuno-epidemiological study (SCAALA Salvador), which aimed to quantify the major immunological mechanisms underlying atopic diseases or asthma. We demonstrate how to aggregate systematically the information captured in multiple cytokine measurements to immunological summary scores aimed at reflecting the presumed underlying immunological mechanisms (Th1/Th2 balance and immune regulatory network). We show how these aggregated immune scores can be used as predictors in regression models with outcomes of immunological studies (e.g. specific IgE) and compare the results to those obtained by a traditional multivariate regression approach. The proposed analytical approach may be especially useful to quantify complex immune responses in immuno-epidemiological studies, where investigators examine the relationship among epidemiological patterns, immune response, and disease outcomes.
Statistical Extremes of Turbulence and a Cascade Generalisation of Euler's Gyroscope Equation
NASA Astrophysics Data System (ADS)
Tchiguirinskaia, Ioulia; Scherzer, Daniel
2016-04-01
Turbulence refers to a rather well defined hydrodynamical phenomenon uncovered by Reynolds. Nowadays, the word turbulence is used to designate the loss of order in many different geophysical fields and the related fundamental extreme variability of environmental data over a wide range of scales. Classical statistical techniques for estimating extremes, being largely limited to statistical distributions, do not take into account the mechanisms generating such extreme variability. An alternative approach to nonlinear variability is based on a fundamental property of the non-linear equations: scale invariance, which means that these equations are formally invariant under given scale transforms. Its specific framework is that of multifractals. In this framework, extreme variability builds up scale by scale, leading to non-classical statistics. Although multifractals are increasingly understood as a basic framework for handling such variability, there is still a gap between their potential and their actual use. In this presentation we discuss how to deal with highly theoretical problems of mathematical physics together with a wide range of geophysical applications. We use Euler's gyroscope equation as a basic element in constructing a complex deterministic system that preserves not only the scale symmetry of the Navier-Stokes equations, but also some more of their symmetries. Euler's equation has not only been the object of many theoretical investigations of the gyroscope device, but has also been generalised enough to become the basic equation of fluid mechanics. It is therefore no surprise that a cascade generalisation of this equation can be used to characterise the intermittency of turbulence, and to better understand the links between the multifractal exponents and the structure of a simplified, but not simplistic, version of the Navier-Stokes equations. In a certain sense, this approach is similar to that of Lorenz, who studied how the flap of a butterfly's wing could generate a cyclone with the help of a 3D ordinary differential system. Being well supported by extensive numerical results, the cascade generalisation of Euler's gyroscope equation opens new horizons for the predictability and prediction of processes having long-range dependences.
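For reference, the "gyroscope equation" used as the elementary building block of the cascade is Euler's rigid-body equation, which in the principal-axis (body) frame with moments of inertia I_1, I_2, I_3 and applied torque M reads as below; how the abstract's cascade couples copies of it across scales is not specified there and is not reproduced here.

```latex
% Euler's rigid-body (gyroscope) equations in the principal-axis frame
I_1\,\dot{\omega}_1 = (I_2 - I_3)\,\omega_2\,\omega_3 + M_1,\qquad
I_2\,\dot{\omega}_2 = (I_3 - I_1)\,\omega_3\,\omega_1 + M_2,\qquad
I_3\,\dot{\omega}_3 = (I_1 - I_2)\,\omega_1\,\omega_2 + M_3,
\qquad\text{i.e.}\qquad
\mathbf{I}\,\dot{\boldsymbol{\omega}} + \boldsymbol{\omega}\times(\mathbf{I}\boldsymbol{\omega}) = \mathbf{M}.
```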
Notes on stochastic (bio)-logic gates: computing with allosteric cooperativity
Agliari, Elena; Altavilla, Matteo; Barra, Adriano; Dello Schiavo, Lorenzo; Katz, Evgeny
2015-01-01
Recent experimental breakthroughs have finally made it possible to implement in-vitro reaction kinetics (the so-called enzyme-based logic) that code for two-input logic gates and mimic the stochastic AND (and NAND) as well as the stochastic OR (and NOR). This accomplishment, together with the already-known single-input gates (performing as YES and NOT), provides a logic base and paves the way to the development of powerful biotechnological devices. However, as biochemical systems are always affected by the presence of noise (e.g. thermal), standard logic is not the correct theoretical reference framework; rather, we show that statistical mechanics can serve this purpose: here we formulate a complete statistical mechanical description of the Monod-Wyman-Changeux allosteric model for both single- and double-ligand systems, with the purpose of exploring their practical capabilities to express noisy logical operators and/or perform stochastic logical operations. Mixing statistical mechanics with logic, and testing quantitatively the resulting findings on the available biochemical data, we successfully revise the concept of cooperativity (and anti-cooperativity) for allosteric systems, with particular emphasis on its computational capabilities, the related ranges and scaling of the involved parameters, and its differences with classical cooperativity (and anti-cooperativity). PMID:25976626
Notes on stochastic (bio)-logic gates: computing with allosteric cooperativity.
Agliari, Elena; Altavilla, Matteo; Barra, Adriano; Dello Schiavo, Lorenzo; Katz, Evgeny
2015-05-15
Recent experimental breakthroughs have finally made it possible to implement in-vitro reaction kinetics (the so-called enzyme-based logic) that code for two-input logic gates and mimic the stochastic AND (and NAND) as well as the stochastic OR (and NOR). This accomplishment, together with the already-known single-input gates (performing as YES and NOT), provides a logic base and paves the way to the development of powerful biotechnological devices. However, as biochemical systems are always affected by the presence of noise (e.g. thermal), standard logic is not the correct theoretical reference framework; rather, we show that statistical mechanics can serve this purpose: here we formulate a complete statistical mechanical description of the Monod-Wyman-Changeux allosteric model for both single- and double-ligand systems, with the purpose of exploring their practical capabilities to express noisy logical operators and/or perform stochastic logical operations. Mixing statistical mechanics with logic, and testing quantitatively the resulting findings on the available biochemical data, we successfully revise the concept of cooperativity (and anti-cooperativity) for allosteric systems, with particular emphasis on its computational capabilities, the related ranges and scaling of the involved parameters, and its differences with classical cooperativity (and anti-cooperativity).
Notes on stochastic (bio)-logic gates: computing with allosteric cooperativity
NASA Astrophysics Data System (ADS)
Agliari, Elena; Altavilla, Matteo; Barra, Adriano; Dello Schiavo, Lorenzo; Katz, Evgeny
2015-05-01
Recent experimental breakthroughs have finally made it possible to implement in-vitro reaction kinetics (the so-called enzyme-based logic) that code for two-input logic gates and mimic the stochastic AND (and NAND) as well as the stochastic OR (and NOR). This accomplishment, together with the already-known single-input gates (performing as YES and NOT), provides a logic base and paves the way to the development of powerful biotechnological devices. However, as biochemical systems are always affected by the presence of noise (e.g. thermal), standard logic is not the correct theoretical reference framework; rather, we show that statistical mechanics can serve this purpose: here we formulate a complete statistical mechanical description of the Monod-Wyman-Changeux allosteric model for both single- and double-ligand systems, with the purpose of exploring their practical capabilities to express noisy logical operators and/or perform stochastic logical operations. Mixing statistical mechanics with logic, and testing quantitatively the resulting findings on the available biochemical data, we successfully revise the concept of cooperativity (and anti-cooperativity) for allosteric systems, with particular emphasis on its computational capabilities, the related ranges and scaling of the involved parameters, and its differences with classical cooperativity (and anti-cooperativity).
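A minimal numerical sketch of the Monod-Wyman-Changeux saturation function referenced in the preceding abstracts (single ligand, n binding sites, allosteric constant L, affinity ratio c), just to show the sigmoidal, gate-like response that cooperativity produces. The parameter values are arbitrary and this is not the papers' full noisy-logic construction.

```python
# Monod-Wyman-Changeux (MWC) fractional saturation for a single ligand with n
# sites, allosteric constant L, affinity ratio c = K_R/K_T. Parameters arbitrary.
import numpy as np

def mwc_saturation(alpha, n=4, L=1000.0, c=0.01):
    # alpha = [ligand]/K_R (dimensionless concentration)
    num = alpha * (1 + alpha) ** (n - 1) + L * c * alpha * (1 + c * alpha) ** (n - 1)
    den = (1 + alpha) ** n + L * (1 + c * alpha) ** n
    return num / den

for a in (0.1, 0.5, 1.0, 2.0, 5.0, 20.0):
    print(f"alpha = {a:5.1f}  ->  saturation = {mwc_saturation(a):.3f}")
```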
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jamshidian, M., E-mail: jamshidian@cc.iut.ac.ir; Institute of Structural Mechanics, Bauhaus-University Weimar, Marienstrasse 15, 99423 Weimar; Thamburaja, P., E-mail: prakash.thamburaja@gmail.com
A previously-developed finite-deformation- and crystal-elasticity-based constitutive theory for stressed grain growth in cubic polycrystalline bodies has been augmented to include a description of excess surface energy and grain-growth stagnation mechanisms through the use of surface effect state variables in a thermodynamically-consistent manner. The constitutive theory was also implemented into a multiscale coupled finite-element and phase-field computational framework. With the material parameters in the constitutive theory suitably calibrated, our three-dimensional numerical simulations show that the constitutive model is able to accurately predict the experimentally-determined evolution of crystallographic texture and grain size statistics in polycrystalline copper thin films deposited on polyimide substrate and annealed at high homologous temperatures. In particular, our numerical analyses show that the broad texture transition observed in the annealing experiments of polycrystalline thin films is caused by grain growth stagnation mechanisms. Highlights: • Developing a theory for stressed grain growth in polycrystalline thin films. • Implementation into a multiscale coupled finite-element and phase-field framework. • Quantitative reproduction of the experimental grain growth data by simulations. • Revealing the cause of texture transition to be due to the stagnation mechanisms.
Spin Glass a Bridge Between Quantum Computation and Statistical Mechanics
NASA Astrophysics Data System (ADS)
Ohzeki, Masayuki
2013-09-01
In this chapter, we present two fascinating topics lying between quantum information processing and statistical mechanics. First, we introduce an elaborate technique, the surface code, to prepare a particular quantum state with robustness against decoherence. Interestingly, the theoretical limitation of the surface code, the accuracy threshold below which the quantum state can be restored, has a close connection with the problem of the phase transition in a special model known as a spin glass, one of the most active research areas in statistical mechanics. The phase transition in spin glasses is an intractable problem, since we must deal with a many-body system with complicated interactions whose signs change depending on the distance between spins. Fortunately, recent progress in spin-glass theory enables us to predict the precise location of the critical point at which the phase transition occurs. This means that statistical mechanics can be used to reveal one of the most interesting parts of quantum information processing. We show how to import this special tool of statistical mechanics into the problem of the accuracy threshold in quantum computation. Second, we show another interesting technique that employs quantum nature, quantum annealing. The purpose of quantum annealing is to search for the most favored solution of a multivariable function, that is, to solve an optimization problem. The most typical instance is the traveling salesman problem of finding the minimum tour that visits all the cities. In quantum annealing, we introduce quantum fluctuations to drive a particular system with an artificial Hamiltonian in which the ground state represents the optimal solution of the specific problem we desire to solve. Induction of the quantum fluctuations gives rise to the quantum tunneling effect, which allows nontrivial hopping from state to state. We then sketch a strategy to control the quantum fluctuations so as to reach the ground state efficiently. Such a generic framework is called quantum annealing. The most typical instance is quantum adiabatic computation, based on the adiabatic theorem. Quantum adiabatic computation, as discussed in another chapter, unfortunately has a crucial bottleneck for some optimization problems. We here introduce several recent attempts to overcome such a weak point by using developments in statistical mechanics. Through both topics, we shed light on the birth of the interdisciplinary field between quantum mechanics and statistical mechanics.
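The generic annealing schedule sketched above can be written in the usual transverse-field form shown below, where H_problem encodes the cost function (typically an Ising form) whose ground state is the desired optimum and the transverse-field term supplies the quantum fluctuations; the specific schedules and improvements discussed in the chapter go beyond this standard expression.

```latex
% Standard transverse-field quantum annealing Hamiltonian, schedule s: 0 -> 1
H(s) \;=\; -\,(1-s)\sum_{i}\sigma^{x}_{i} \;+\; s\,H_{\mathrm{problem}},
\qquad s = t/T,\quad 0 \le s \le 1 .
```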
Stability estimation of autoregulated genes under Michaelis-Menten-type kinetics
NASA Astrophysics Data System (ADS)
Arani, Babak M. S.; Mahmoudi, Mahdi; Lahti, Leo; González, Javier; Wit, Ernst C.
2018-06-01
Feedback loops are typical motifs appearing in gene regulatory networks. In some well-studied model organisms, including Escherichia coli, autoregulated genes, i.e., genes that activate or repress themselves through their protein products, are the only feedback interactions. For these types of interactions, the Michaelis-Menten (MM) formulation is a suitable and widely used approach, which always leads to stable steady-state solutions representative of homeostatic regulation. However, in many other biological phenomena, such as cell differentiation, cancer progression, and catastrophes in ecosystems, one might expect to observe bistable switchlike dynamics in the case of strong positive autoregulation. To capture this complex behavior we use the generalized family of MM kinetic models. We give a full analysis regarding the stability of autoregulated genes. We show that the autoregulation mechanism has the capability to exhibit diverse cellular dynamics including hysteresis, a typical characteristic of bistable systems, as well as irreversible transitions between bistable states. We also introduce a statistical framework to estimate the kinetics parameters and probability of different stability regimes given observational data. Empirical data for the autoregulated gene SCO3217 in the SOS system in Streptomyces coelicolor are analyzed. The coupling of a statistical framework and the mathematical model can give further insight into understanding the evolutionary mechanisms toward different cell fates in various systems.
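A minimal numerical sketch of the bistability discussed above: a positively autoregulated gene with generalized (Hill-type) MM kinetics, dx/dt = a0 + beta x^n/(K^n + x^n) - gamma x, can have three steady states (stable low, unstable middle, stable high). The parameter values below are invented purely to exhibit the bistable regime and are not estimates from the SCO3217 data.

```python
# Steady states of a positively autoregulated gene with Hill-type kinetics:
#   dx/dt = a0 + beta * x^n / (K^n + x^n) - gamma * x
# Parameters are chosen only to illustrate bistability (three steady states).
from scipy.optimize import brentq

a0, beta, K, n, gamma = 0.05, 2.0, 1.0, 4, 1.0
f = lambda x: a0 + beta * x**n / (K**n + x**n) - gamma * x

# brackets chosen so that f changes sign in each interval
roots = [brentq(f, lo, hi) for lo, hi in [(0.0, 0.5), (0.5, 1.5), (1.5, 3.0)]]
for r in roots:
    slope = (f(r + 1e-6) - f(r - 1e-6)) / 2e-6
    kind = "stable" if slope < 0 else "unstable"
    print(f"steady state x* = {r:.3f}  ({kind})")
```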
NASA Astrophysics Data System (ADS)
Nold, Andreas; Goddard, Ben; Sibley, David; Kalliadasis, Serafim
2014-03-01
Multiscale effects play a predominant role in wetting phenomena such as the moving contact line. An accurate description is of paramount interest for a wide range of industrial applications, yet it is a matter of ongoing research, due to the difficulty of incorporating different physical effects in one model. Important small-scale phenomena are corrections to the attractive fluid-fluid and wall-fluid forces in inhomogeneous density distributions, which often previously have been accounted for by the disjoining pressure in an ad-hoc manner. We systematically derive a novel model for the description of a single-component liquid-vapor multiphase system which inherently incorporates these nonlocal effects. This derivation, which is inspired by statistical mechanics in the framework of colloidal density functional theory, is critically discussed with respect to its assumptions and restrictions. The model is then employed numerically to study a moving contact line of a liquid fluid displacing its vapor phase. We show how nonlocal physical effects are inherently incorporated by the model and describe how classical macroscopic results for the contact line motion are retrieved. We acknowledge financial support from ERC Advanced Grant No. 247031 and Imperial College through a DTG International Studentship.
Gray, Alistair; Veale, Jaimie F.; Binson, Diane; Sell, Randell L.
2013-01-01
Objective. Effectively addressing health disparities experienced by sexual minority populations requires high-quality official data on sexual orientation. We developed a conceptual framework of sexual orientation to improve the quality of sexual orientation data in New Zealand's Official Statistics System. Methods. We reviewed conceptual and methodological literature, culminating in a draft framework. To improve the framework, we held focus groups and key-informant interviews with sexual minority stakeholders and producers and consumers of official statistics. An advisory board of experts provided additional guidance. Results. The framework proposes working definitions of the sexual orientation topic and measurement concepts, describes dimensions of the measurement concepts, discusses variables framing the measurement concepts, and outlines conceptual grey areas. Conclusion. The framework proposes standard definitions and concepts for the collection of official sexual orientation data in New Zealand. It presents a model for producers of official statistics in other countries, who wish to improve the quality of health data on their citizens. PMID:23840231
Putz, Mihai V.
2009-01-01
The density matrix theory, the ancestor of density functional theory, provides the immediate framework for Path Integral (PI) development, allowing the canonical density to be extended for the many-electronic systems through the density functional closure relationship. Yet, the use of path integral formalism for electronic density prescription presents several advantages: it assures the inner quantum mechanical description of the system by parameterized paths; averages the quantum fluctuations; behaves as the propagator for time-space evolution of quantum information; resembles the Schrödinger equation; and allows a quantum statistical description of the system through partition function computing. In this framework, four levels of path integral formalism were presented: the Feynman quantum mechanical, the semiclassical, the Feynman-Kleinert effective classical, and the Fokker-Planck non-equilibrium ones. In each case the density matrix and/or the canonical density were rigorously defined and presented. The practical specializations for quantum free and harmonic motions, for statistical high and low temperature limits, the smearing justification for Bohr's quantum stability postulate with the paradigmatic Hydrogen atomic excursion, along with the quantum chemical calculation of semiclassical electronegativity and hardness, of chemical action and Mulliken electronegativity, as well as the Markovian generalizations of Becke-Edgecombe electronic focalization functions – all advocate for the reliability of assuming the PI formalism of quantum mechanics as a versatile one, suited for analytical and/or computational modeling of a variety of fundamental physical and chemical reactivity concepts characterizing the (density driving) many-electronic systems. PMID:20087467
A multiple hold-out framework for Sparse Partial Least Squares.
Monteiro, João M; Rao, Anil; Shawe-Taylor, John; Mourão-Miranda, Janaina
2016-09-15
Supervised classification machine learning algorithms may have limitations when studying brain diseases with heterogeneous populations, as the labels might be unreliable. More exploratory approaches, such as Sparse Partial Least Squares (SPLS), may provide insights into the brain's mechanisms by finding relationships between neuroimaging and clinical/demographic data. The identification of these relationships has the potential to improve the current understanding of disease mechanisms, refine clinical assessment tools, and stratify patients. SPLS finds multivariate associative effects in the data by computing pairs of sparse weight vectors, where each pair is used to remove its corresponding associative effect from the data by matrix deflation, before computing additional pairs. We propose a novel SPLS framework which selects the adequate number of voxels and clinical variables to describe each associative effect, and tests their reliability by fitting the model to different splits of the data. As a proof of concept, the approach was applied to find associations between grey matter probability maps and individual items of the Mini-Mental State Examination (MMSE) in a clinical sample with various degrees of dementia. The framework found two statistically significant associative effects between subsets of brain voxels and subsets of the questions/tasks. SPLS was compared with its non-sparse version (PLS). The use of projection deflation versus a classical PLS deflation was also tested in both PLS and SPLS. SPLS outperformed PLS, finding statistically significant effects and providing higher correlation values in hold-out data. Moreover, projection deflation provided better results. Copyright © 2016 The Author(s). Published by Elsevier B.V. All rights reserved.
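The basic SPLS computation described above, a pair of sparse weight vectors obtained from the cross-covariance of X and Y followed by deflation, can be sketched generically as a thresholded power iteration. This is an illustrative sketch under assumed sparsity levels, not the authors' algorithm, their projection-deflation variant, or their multiple hold-out procedure.

```python
import numpy as np

def sparsify(v, k):
    """Keep the k largest-magnitude entries of v, zero the rest, and renormalise."""
    w = np.zeros_like(v)
    idx = np.argsort(np.abs(v))[-k:]
    w[idx] = v[idx]
    return w / (np.linalg.norm(w) + 1e-12)

def spls_pairs(X, Y, n_pairs=2, k_x=10, k_y=3, n_iter=100):
    """Sparse weight-vector pairs (u, v) via thresholded power iteration on X'Y,
    with matrix deflation between pairs (a generic sketch, not a published variant)."""
    X, Y = X.copy(), Y.copy()
    pairs = []
    for _ in range(n_pairs):
        M = X.T @ Y
        v = np.ones(Y.shape[1]) / np.sqrt(Y.shape[1])
        for _ in range(n_iter):
            u = sparsify(M @ v, k_x)
            v = sparsify(M.T @ u, k_y)
        pairs.append((u, v))
        # Deflate: remove the captured associative effect from X and Y
        t = X @ u
        X = X - np.outer(t, t) @ X / (t @ t)
        Y = Y - np.outer(t, t) @ Y / (t @ t)
    return pairs

# Toy data: 40 subjects, 200 "voxels", 8 clinical items, one planted association
rng = np.random.default_rng(0)
z = rng.normal(size=40)
X = rng.normal(size=(40, 200)); X[:, :10] += z[:, None]
Y = rng.normal(size=(40, 8));   Y[:, :3]  += z[:, None]
(u1, v1), _ = spls_pairs(X, Y)
print("voxels selected :", np.flatnonzero(u1))
print("items selected  :", np.flatnonzero(v1))
```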
A quantum framework for likelihood ratios
NASA Astrophysics Data System (ADS)
Bond, Rachael L.; He, Yang-Hui; Ormerod, Thomas C.
The ability to calculate precise likelihood ratios is fundamental to science, from Quantum Information Theory through to Quantum State Estimation. However, there is no assumption-free statistical methodology to achieve this. For instance, in the absence of data relating to covariate overlap, the widely used Bayes’ theorem either defaults to the marginal probability driven “naive Bayes’ classifier”, or requires the use of compensatory expectation-maximization techniques. This paper takes an information-theoretic approach in developing a new statistical formula for the calculation of likelihood ratios based on the principles of quantum entanglement, and demonstrates that Bayes’ theorem is a special case of a more general quantum mechanical expression.
Galatzer-Levy, Isaac R.; Ruggles, Kelly; Chen, Zhe
2017-01-01
Diverse environmental and biological systems interact to influence individual differences in response to environmental stress. Understanding the nature of these complex relationships can enhance the development of methods to: (1) identify risk, (2) classify individuals as healthy or ill, (3) understand mechanisms of change, and (4) develop effective treatments. The Research Domain Criteria (RDoC) initiative provides a theoretical framework to understand health and illness as the product of multiple inter-related systems but does not provide a framework to characterize or statistically evaluate such complex relationships. Characterizing and statistically evaluating models that integrate multiple levels (e.g. synapses, genes, environmental factors) as they relate to outcomes that are free from prior diagnostic benchmarks represents a challenge requiring new computational tools that are capable of capturing complex relationships and identifying clinically relevant populations. In the current review, we will summarize machine learning methods that can achieve these goals. PMID:29527592
A Statistical Framework for Analyzing Cyber Threats
defender cares most about the attacks against certain ports or services). The grey-box statistical framework formulates a new methodology of Cybersecurity ...the design of prediction models. Our research showed that the grey-box framework is effective in predicting cybersecurity situational awareness.
Quantum work statistics of charged Dirac particles in time-dependent fields
Deffner, Sebastian; Saxena, Avadh
2015-09-28
The quantum Jarzynski equality is an important theorem of modern quantum thermodynamics. We show that the Jarzynski equality readily generalizes to relativistic quantum mechanics described by the Dirac equation. After establishing the conceptual framework we solve a pedagogical, yet experimentally relevant, system analytically. As a main result we obtain the exact quantum work distributions for charged particles traveling through a time-dependent vector potential evolving under Schrödinger as well as under Dirac dynamics, and for which the Jarzynski equality is verified. Thus, special emphasis is put on the conceptual and technical subtleties arising from relativistic quantum mechanics.
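The Jarzynski equality itself, ⟨e^{-βW}⟩ = Z_f/Z_i, is easy to verify numerically in the simplest two-point-measurement setting, a sudden quench of a two-level system (a nonrelativistic toy, not the driven Dirac problem solved in the paper); the Hamiltonians below are arbitrary Hermitian matrices chosen for illustration.

```python
import numpy as np

beta = 1.3  # inverse temperature (illustrative)

# Initial and final Hamiltonians of a two-level system (arbitrary Hermitian choices)
H_i = np.array([[1.0, 0.4], [0.4, -0.5]])
H_f = np.array([[0.2, 0.9], [0.9, 0.7]])

Ei, Vi = np.linalg.eigh(H_i)   # spectrum before the quench
Ef, Vf = np.linalg.eigh(H_f)   # spectrum after the quench

# Thermal occupation of the initial eigenstates
p_i = np.exp(-beta * Ei) / np.exp(-beta * Ei).sum()

# Two-point measurement scheme for a sudden quench:
# work W = Ef[m] - Ei[n] occurs with probability p_i[n] * |overlap(f_m, i_n)|^2
overlap2 = np.abs(Vf.conj().T @ Vi) ** 2      # overlap2[m, n] = |inner(f_m, i_n)|^2
P_W = overlap2 * p_i[None, :]                 # joint probability of the transition n -> m
W = Ef[:, None] - Ei[None, :]                 # corresponding work values

lhs = np.sum(P_W * np.exp(-beta * W))                        # average of exp(-beta*W)
rhs = np.exp(-beta * Ef).sum() / np.exp(-beta * Ei).sum()    # Z_f / Z_i
print("E[exp(-beta*W)] =", lhs)
print("Z_f / Z_i       =", rhs)   # equal up to floating-point error: the Jarzynski equality
```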
Exploiting Data Missingness in Bayesian Network Modeling
NASA Astrophysics Data System (ADS)
Rodrigues de Morais, Sérgio; Aussem, Alex
This paper proposes a framework built on the use of Bayesian networks (BN) for representing statistical dependencies between the existing random variables and additional dummy boolean variables, which represent the presence/absence of the respective random variable value. We show how augmenting the BN with these additional variables helps pinpoint the mechanism through which missing data contributes to the classification task. The missing data mechanism is thus explicitly taken into account to predict the class variable using the data at hand. Extensive experiments on synthetic and real-world incomplete data sets reveals that the missingness information improves classification accuracy.
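Outside the Bayesian-network machinery of the paper, the core idea of adding dummy presence/absence variables can be illustrated with any classifier: append binary is-missing indicators to a mean-imputed feature matrix so that the missingness mechanism itself becomes evidence. The toy data and the choice of logistic regression below are assumptions for illustration.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)

# Toy data where missingness is informative: feature 0 of class-1 samples
# is often unrecorded, so the "is missing" flag itself predicts the class.
n = 500
y = rng.integers(0, 2, size=n)
X = rng.normal(size=(n, 4))
mask = (y == 1) & (rng.random(n) < 0.6)
X[mask, 0] = np.nan

def with_indicators(X):
    """Mean-impute and append one boolean 'missing?' column per original feature."""
    miss = np.isnan(X)
    X_imp = np.where(miss, np.nanmean(X, axis=0), X)
    return np.hstack([X_imp, miss.astype(float)])

def impute_only(X):
    miss = np.isnan(X)
    return np.where(miss, np.nanmean(X, axis=0), X)

clf = LogisticRegression(max_iter=1000)
print("imputation only      :", cross_val_score(clf, impute_only(X), y, cv=5).mean())
print("+ missing indicators :", cross_val_score(clf, with_indicators(X), y, cv=5).mean())
```

On this toy set the indicator-augmented classifier scores well above the imputation-only one, mirroring the paper's point that modeling the missingness mechanism improves classification.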
Conceptualizing a Framework for Advanced Placement Statistics Teaching Knowledge
ERIC Educational Resources Information Center
Haines, Brenna
2015-01-01
The purpose of this article is to sketch a conceptualization of a framework for Advanced Placement (AP) Statistics Teaching Knowledge. Recent research continues to problematize the lack of knowledge and preparation among secondary level statistics teachers. The College Board's AP Statistics course continues to grow and gain popularity, but is a…
Wind Generated Rogue Waves in an Annular Wave Flume.
Toffoli, A; Proment, D; Salman, H; Monbaliu, J; Frascoli, F; Dafilis, M; Stramignoni, E; Forza, R; Manfrin, M; Onorato, M
2017-04-07
We investigate experimentally the statistical properties of a wind-generated wave field and the spontaneous formation of rogue waves in an annular flume. Unlike many experiments on rogue waves where waves are mechanically generated, here the wave field is forced naturally by wind as it is in the ocean. What is unique about the present experiment is that the annular geometry of the tank makes waves propagating circularly in an unlimited-fetch condition. Within this peculiar framework, we discuss the temporal evolution of the statistical properties of the surface elevation. We show that rogue waves and heavy-tail statistics may develop naturally during the growth of the waves just before the wave height reaches a stationary condition. Our results shed new light on the formation of rogue waves in a natural environment.
Toughness and strength of nanocrystalline graphene
Shekhawat, Ashivni; Ritchie, Robert O.
2016-01-28
Pristine monocrystalline graphene is claimed to be the strongest material known with remarkable mechanical and electrical properties. However, graphene made with scalable fabrication techniques is polycrystalline and contains inherent nanoscale line and point defects—grain boundaries and grain-boundary triple junctions—that lead to significant statistical fluctuations in toughness and strength. These fluctuations become particularly pronounced for nanocrystalline graphene where the density of defects is high. Here we use large-scale simulation and continuum modelling to show that the statistical variation in toughness and strength can be understood with ‘weakest-link’ statistics. We develop the first statistical theory of toughness in polycrystalline graphene, and elucidate the nanoscale origins of the grain-size dependence of its strength and toughness. Lastly, our results should lead to more reliable graphene device design, and provide a framework to interpret experimental results in a broad class of two-dimensional materials.
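The 'weakest-link' statistics invoked above can be sketched in a few lines: a specimen fails at its weakest element, so its strength is the minimum over many independent element strengths, and for Weibull-distributed elements the minimum is again Weibull with a size-dependent scale. Parameters below are arbitrary.

```python
import numpy as np
from math import gamma

rng = np.random.default_rng(2)
m, s0 = 5.0, 1.0          # Weibull modulus and scale of a single "link" (illustrative)
n_samples = 20000

for N in (10, 100, 1000):                  # number of weakest-link elements per specimen
    link_strengths = s0 * rng.weibull(m, size=(n_samples, N))
    specimen_strength = link_strengths.min(axis=1)     # the weakest link sets the strength
    # Theory: the minimum of N Weibull(m, s0) variables is Weibull(m, s0 * N**(-1/m)),
    # so the mean specimen strength drops with size as N**(-1/m).
    predicted_mean = s0 * N ** (-1.0 / m) * gamma(1.0 + 1.0 / m)
    print(f"N={N:5d}  simulated mean={specimen_strength.mean():.3f}  "
          f"predicted mean={predicted_mean:.3f}")
```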
Multilayer Statistical Intrusion Detection in Wireless Networks
NASA Astrophysics Data System (ADS)
Hamdi, Mohamed; Meddeb-Makhlouf, Amel; Boudriga, Noureddine
2008-12-01
The rapid proliferation of mobile applications and services has introduced new vulnerabilities that do not exist in fixed wired networks. Traditional security mechanisms, such as access control and encryption, turn out to be inefficient in modern wireless networks. Given the shortcomings of the protection mechanisms, an important research focuses in intrusion detection systems (IDSs). This paper proposes a multilayer statistical intrusion detection framework for wireless networks. The architecture is adequate to wireless networks because the underlying detection models rely on radio parameters and traffic models. Accurate correlation between radio and traffic anomalies allows enhancing the efficiency of the IDS. A radio signal fingerprinting technique based on the maximal overlap discrete wavelet transform (MODWT) is developed. Moreover, a geometric clustering algorithm is presented. Depending on the characteristics of the fingerprinting technique, the clustering algorithm permits to control the false positive and false negative rates. Finally, simulation experiments have been carried out to validate the proposed IDS.
Unifying Complexity and Information
NASA Astrophysics Data System (ADS)
Ke, Da-Guan
2013-04-01
Complex systems, arising in many contexts in the computer, life, social, and physical sciences, have not shared a generally-accepted complexity measure playing a fundamental role as the Shannon entropy H in statistical mechanics. Superficially-conflicting criteria of complexity measurement, i.e. complexity-randomness (C-R) relations, have given rise to a special measure intrinsically adaptable to more than one criterion. However, deep causes of the conflict and the adaptability are not much clear. Here I trace the root of each representative or adaptable measure to its particular universal data-generating or -regenerating model (UDGM or UDRM). A representative measure for deterministic dynamical systems is found as a counterpart of the H for random process, clearly redefining the boundary of different criteria. And a specific UDRM achieving the intrinsic adaptability enables a general information measure that ultimately solves all major disputes. This work encourages a single framework coving deterministic systems, statistical mechanics and real-world living organisms.
Multi-fidelity machine learning models for accurate bandgap predictions of solids
Pilania, Ghanshyam; Gubernatis, James E.; Lookman, Turab
2016-12-28
Here, we present a multi-fidelity co-kriging statistical learning framework that combines variable-fidelity quantum mechanical calculations of bandgaps to generate a machine-learned model that enables low-cost accurate predictions of the bandgaps at the highest fidelity level. Additionally, the adopted Gaussian process regression formulation allows us to predict the underlying uncertainties as a measure of our confidence in the predictions. In using a set of 600 elpasolite compounds as an example dataset and using semi-local and hybrid exchange correlation functionals within density functional theory as two levels of fidelities, we demonstrate the excellent learning performance of the method against actual high fidelity quantum mechanical calculations of the bandgaps. The presented statistical learning method is not restricted to bandgaps or electronic structure methods and extends the utility of high throughput property predictions in a significant way.
Gardeux, Vincent; Achour, Ikbel; Li, Jianrong; Maienschein-Cline, Mark; Li, Haiquan; Pesce, Lorenzo; Parinandi, Gurunadh; Bahroos, Neil; Winn, Robert; Foster, Ian; Garcia, Joe G N; Lussier, Yves A
2014-01-01
Background The emergence of precision medicine allowed the incorporation of individual molecular data into patient care. Indeed, DNA sequencing predicts somatic mutations in individual patients. However, these genetic features overlook dynamic epigenetic and phenotypic response to therapy. Meanwhile, accurate personal transcriptome interpretation remains an unmet challenge. Further, N-of-1 (single-subject) efficacy trials are increasingly pursued, but are underpowered for molecular marker discovery. Method ‘N-of-1-pathways’ is a global framework relying on three principles: (i) the statistical universe is a single patient; (ii) significance is derived from geneset/biomodules powered by paired samples from the same patient; and (iii) similarity between genesets/biomodules assesses commonality and differences, within-study and cross-studies. Thus, patient gene-level profiles are transformed into deregulated pathways. From RNA-Seq of 55 lung adenocarcinoma patients, N-of-1-pathways predicts the deregulated pathways of each patient. Results Cross-patient N-of-1-pathways obtains comparable results with conventional genesets enrichment analysis (GSEA) and differentially expressed gene (DEG) enrichment, validated in three external evaluations. Moreover, heatmap and star plots highlight both individual and shared mechanisms ranging from molecular to organ-systems levels (eg, DNA repair, signaling, immune response). Patients were ranked based on the similarity of their deregulated mechanisms to those of an independent gold standard, generating unsupervised clusters of diametric extreme survival phenotypes (p=0.03). Conclusions The N-of-1-pathways framework provides a robust statistical and relevant biological interpretation of individual disease-free survival that is often overlooked in conventional cross-patient studies. It enables mechanism-level classifiers with smaller cohorts as well as N-of-1 studies. Software http://lussierlab.org/publications/N-of-1-pathways PMID:25301808
Gardeux, Vincent; Achour, Ikbel; Li, Jianrong; ...
2014-11-01
Background: The emergence of precision medicine allowed the incorporation of individual molecular data into patient care. Indeed, DNA sequencing predicts somatic mutations in individual patients. However, these genetic features overlook dynamic epigenetic and phenotypic response to therapy. Meanwhile, accurate personal transcriptome interpretation remains an unmet challenge. Further, N-of-1 (single-subject) efficacy trials are increasingly pursued, but are underpowered for molecular marker discovery. Method: ‘N-of-1-pathways’ is a global framework relying on three principles: (i) the statistical universe is a single patient; (ii) significance is derived from geneset/biomodules powered by paired samples from the same patient; and (iii) similarity between genesets/biomodules assesses commonality and differences, within-study and cross-studies. Thus, patient gene-level profiles are transformed into deregulated pathways. From RNA-Seq of 55 lung adenocarcinoma patients, N-of-1-pathways predicts the deregulated pathways of each patient. Results: Cross-patient N-of-1-pathways obtains comparable results with conventional genesets enrichment analysis (GSEA) and differentially expressed gene (DEG) enrichment, validated in three external evaluations. Moreover, heatmap and star plots highlight both individual and shared mechanisms ranging from molecular to organ-systems levels (eg, DNA repair, signaling, immune response). Patients were ranked based on the similarity of their deregulated mechanisms to those of an independent gold standard, generating unsupervised clusters of diametric extreme survival phenotypes (p=0.03). Conclusions: The N-of-1-pathways framework provides a robust statistical and relevant biological interpretation of individual disease-free survival that is often overlooked in conventional cross-patient studies. It enables mechanism-level classifiers with smaller cohorts as well as N-of-1 studies.
Unperturbed Schelling Segregation in Two or Three Dimensions
NASA Astrophysics Data System (ADS)
Barmpalias, George; Elwes, Richard; Lewis-Pye, Andrew
2016-09-01
Schelling's models of segregation, first described in 1969 (Am Econ Rev 59:488-493, 1969) are among the best known models of self-organising behaviour. Their original purpose was to identify mechanisms of urban racial segregation. But his models form part of a family which arises in statistical mechanics, neural networks, social science, and beyond, where populations of agents interact on networks. Despite extensive study, unperturbed Schelling models have largely resisted rigorous analysis, prior results generally focusing on variants in which noise is introduced into the dynamics, the resulting system being amenable to standard techniques from statistical mechanics or stochastic evolutionary game theory (Young in Individual strategy and social structure: an evolutionary theory of institutions, Princeton University Press, Princeton, 1998). A series of recent papers (Brandt et al. in: Proceedings of the 44th annual ACM symposium on theory of computing (STOC 2012), 2012); Barmpalias et al. in: 55th annual IEEE symposium on foundations of computer science, Philadelphia, 2014, J Stat Phys 158:806-852, 2015), has seen the first rigorous analyses of 1-dimensional unperturbed Schelling models, in an asymptotic framework largely unknown in statistical mechanics. Here we provide the first such analysis of 2- and 3-dimensional unperturbed models, establishing most of the phase diagram, and answering a challenge from Brandt et al. in: Proceedings of the 44th annual ACM symposium on theory of computing (STOC 2012), 2012).
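For readers unfamiliar with the model analysed above, a minimal unperturbed (noise-free) 2-dimensional Schelling simulation looks roughly like the sketch below; the grid size, vacancy fraction, tolerance, and relocation rule are illustrative assumptions rather than the exact dynamics studied in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)
L, empty_frac, tolerance = 40, 0.1, 0.5   # grid size, vacancy fraction, required like-neighbour share

# 0 = empty cell, 1 / 2 = the two agent types, placed uniformly at random
cells = rng.choice([0, 1, 2], size=L * L,
                   p=[empty_frac, (1 - empty_frac) / 2, (1 - empty_frac) / 2])
grid = cells.reshape(L, L)

def like_share(i, j):
    """(same-type neighbours, occupied neighbours) of the agent at (i, j), periodic boundaries."""
    t, same, occupied = grid[i, j], 0, 0
    for di in (-1, 0, 1):
        for dj in (-1, 0, 1):
            if di == dj == 0:
                continue
            n = grid[(i + di) % L, (j + dj) % L]
            if n != 0:
                occupied += 1
                same += (n == t)
    return same, occupied

def unhappy(i, j):
    same, occ = like_share(i, j)
    return occ > 0 and same / occ < tolerance

for sweep in range(200):
    movers = [(i, j) for i in range(L) for j in range(L) if grid[i, j] != 0 and unhappy(i, j)]
    if not movers:
        break
    empties = list(zip(*np.nonzero(grid == 0)))
    for (i, j) in movers:                     # each unhappy agent jumps to a random empty cell
        k = int(rng.integers(len(empties)))
        ei, ej = empties[k]
        grid[ei, ej], grid[i, j] = grid[i, j], 0
        empties[k] = (i, j)                   # the vacated cell becomes available

shares = [s / o for s, o in
          (like_share(i, j) for i in range(L) for j in range(L) if grid[i, j] != 0) if o > 0]
print(f"stopped after {sweep + 1} sweeps; mean like-neighbour share = {np.mean(shares):.2f}")
```

Even with a modest tolerance of 0.5, the final like-neighbour share far exceeds the initial value of about 0.5, the self-organised segregation the abstract refers to.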
Wang, Xue; Bi, Dao-wei; Ding, Liang; Wang, Sheng
2007-01-01
The recent availability of low cost and miniaturized hardware has allowed wireless sensor networks (WSNs) to retrieve audio and video data in real world applications, which has fostered the development of wireless multimedia sensor networks (WMSNs). Resource constraints and challenging multimedia data volume make development of efficient algorithms to perform in-network processing of multimedia contents imperative. This paper proposes solving problems in the domain of WMSNs from the perspective of multi-agent systems. The multi-agent framework enables flexible network configuration and efficient collaborative in-network processing. The focus is placed on target classification in WMSNs where audio information is retrieved by microphones. To deal with the uncertainties related to audio information retrieval, the statistical approaches of power spectral density estimates, principal component analysis and Gaussian process classification are employed. A multi-agent negotiation mechanism is specially developed to efficiently utilize limited resources and simultaneously enhance classification accuracy and reliability. The negotiation is composed of two phases, where an auction based approach is first exploited to allocate the classification task among the agents and then individual agent decisions are combined by the committee decision mechanism. Simulation experiments with real world data are conducted and the results show that the proposed statistical approaches and negotiation mechanism not only reduce memory and computation requirements in WMSNs but also significantly enhance classification accuracy and reliability. PMID:28903223
A Hierarchical Approach to Fracture Mechanics
NASA Technical Reports Server (NTRS)
Saether, Erik; Taasan, Shlomo
2004-01-01
Recent research conducted under NASA LaRC's Creativity and Innovation Program has led to the development of an initial approach for a hierarchical fracture mechanics. This methodology unites failure mechanisms occurring at different length scales and provides a framework for a physics-based theory of fracture. At the nanoscale, parametric molecular dynamic simulations are used to compute the energy associated with atomic level failure mechanisms. This information is used in a mesoscale percolation model of defect coalescence to obtain statistics of fracture paths and energies through Monte Carlo simulations. The mathematical structure of predicted crack paths is described using concepts of fractal geometry. The non-integer fractal dimension relates geometric and energy measures between meso- and macroscales. For illustration, a fractal-based continuum strain energy release rate is derived for inter- and transgranular fracture in polycrystalline metals.
Thiessen, Erik D
2017-01-05
Statistical learning has been studied in a variety of different tasks, including word segmentation, object identification, category learning, artificial grammar learning and serial reaction time tasks (e.g. Saffran et al. 1996 Science 274: , 1926-1928; Orban et al. 2008 Proceedings of the National Academy of Sciences 105: , 2745-2750; Thiessen & Yee 2010 Child Development 81: , 1287-1303; Saffran 2002 Journal of Memory and Language 47: , 172-196; Misyak & Christiansen 2012 Language Learning 62: , 302-331). The difference among these tasks raises questions about whether they all depend on the same kinds of underlying processes and computations, or whether they are tapping into different underlying mechanisms. Prior theoretical approaches to statistical learning have often tried to explain or model learning in a single task. However, in many cases these approaches appear inadequate to explain performance in multiple tasks. For example, explaining word segmentation via the computation of sequential statistics (such as transitional probability) provides little insight into the nature of sensitivity to regularities among simultaneously presented features. In this article, we will present a formal computational approach that we believe is a good candidate to provide a unifying framework to explore and explain learning in a wide variety of statistical learning tasks. This framework suggests that statistical learning arises from a set of processes that are inherent in memory systems, including activation, interference, integration of information and forgetting (e.g. Perruchet & Vinter 1998 Journal of Memory and Language 39: , 246-263; Thiessen et al. 2013 Psychological Bulletin 139: , 792-814). From this perspective, statistical learning does not involve explicit computation of statistics, but rather the extraction of elements of the input into memory traces, and subsequent integration across those memory traces that emphasize consistent information (Thiessen and Pavlik 2013 Cognitive Science 37: , 310-343).This article is part of the themed issue 'New frontiers for statistical learning in the cognitive sciences'. © 2016 The Author(s).
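The 'computation of sequential statistics (such as transitional probability)' discussed above is easy to make concrete: the forward transitional probability is P(Y|X) = freq(XY)/freq(X), and word boundaries in a syllable stream show up as dips in TP. A small sketch on an invented stream (the three-syllable 'words' below are made up for illustration):

```python
from collections import Counter
import random

random.seed(0)
words = ["tupiro", "golabu", "bidaku"]                  # invented three-syllable "words"
stream = []
for _ in range(300):
    w = random.choice(words)
    stream.extend([w[i:i + 2] for i in range(0, 6, 2)])  # split into syllables: tu-pi-ro, ...

pair_counts = Counter(zip(stream, stream[1:]))
syll_counts = Counter(stream[:-1])

def tp(a, b):
    """Forward transitional probability P(b | a) = freq(ab) / freq(a)."""
    return pair_counts[(a, b)] / syll_counts[a]

# Within-word transitions have TP close to 1; transitions across word boundaries
# have TP near 1/3, so dips in TP mark the word boundaries.
for a, b in [("tu", "pi"), ("pi", "ro"), ("ro", "go"), ("ro", "bi"), ("ku", "tu")]:
    print(f"TP({a} -> {b}) = {tp(a, b):.2f}")
```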
Tuckerman, Mark E; Chandra, Amalendu; Marx, Dominik
2010-09-28
Extraction of relaxation times, lifetimes, and rates associated with the transport of topological charge defects in hydrogen-bonded networks from molecular dynamics simulations is a challenge because proton transfer reactions continually change the identity of the defect core. In this paper, we present a statistical mechanical theory that allows these quantities to be computed in an unbiased manner. The theory employs a set of suitably defined indicator or population functions for locating a defect structure and their associated correlation functions. These functions are then used to develop a chemical master equation framework from which the rates and lifetimes can be determined. Furthermore, we develop an integral equation formalism for connecting various types of population correlation functions and derive an iterative solution to the equation, which is given a graphical interpretation. The chemical master equation framework is applied to the problems of both hydronium and hydroxide transport in bulk water. For each case it is shown that the theory establishes direct links between the defect's dominant solvation structures, the kinetics of charge transfer, and the mechanism of structural diffusion. A detailed analysis is presented for aqueous hydroxide, examining both reorientational time scales and relaxation of the rotational anisotropy, which is correlated with recent experimental results for these quantities. Finally, for OH(-)(aq) it is demonstrated that the "dynamical hypercoordination mechanism" is consistent with available experimental data while other mechanistic proposals are shown to fail. As a means of going beyond the linear rate theory valid from short up to intermediate time scales, a fractional kinetic model is introduced in the Appendix in order to describe the nonexponential long-time behavior of time-correlation functions. Within the mathematical framework of fractional calculus the power law decay ∼t(-σ), where σ is a parameter of the model and depends on the dimensionality of the system, is obtained from Mittag-Leffler functions due to their long-time asymptotics, whereas (stretched) exponential behavior is found for short times.
Jamming II: Edwards’ statistical mechanics of random packings of hard spheres
NASA Astrophysics Data System (ADS)
Wang, Ping; Song, Chaoming; Jin, Yuliang; Makse, Hernán A.
2011-02-01
The problem of finding the most efficient way to pack spheres has an illustrious history, dating back to the crystalline arrays conjectured by Kepler and the random geometries explored by Bernal in the 1960s. This problem finds applications spanning from the mathematician’s pencil, the processing of granular materials, the jamming and glass transitions, all the way to fruit packing in every grocery. There are presently numerous experiments showing that the loosest way to pack spheres gives a density of ∼55% (named random loose packing, RLP) while filling all the loose voids results in a maximum density of ∼63%-64% (named random close packing, RCP). While those values seem robustly true, to this date there is no well-accepted physical explanation or theoretical prediction for them. Here we develop a common framework for understanding the random packings of monodisperse hard spheres whose limits can be interpreted as the experimentally observed RLP and RCP. The reason for these limits arises from a statistical picture of jammed states in which the RCP can be interpreted as the ground state of the ensemble of jammed matter with zero compactivity, while the RLP arises in the infinite compactivity limit. We combine an extended statistical mechanics approach ‘a la Edwards’ (where the role traditionally played by the energy and temperature in thermal systems is substituted by the volume and compactivity) with a constraint on mechanical stability imposed by the isostatic condition. We show how such approaches can bring results that can be compared to experiments and allow for an exploitation of the statistical mechanics framework. The key result is the use of a relation between the local Voronoi volumes of the constituent grains (denoted the volume function) and the number of neighbors in contact that permits us to simply combine the two approaches to develop a theory of volume fluctuations in jammed matter. Ultimately, our results lead to a phase diagram that provides a unifying view of the disordered hard sphere packing problem and further sheds light on a diverse spectrum of data, including the RLP state. Theoretical results are well reproduced by numerical simulations that confirm the essential role played by friction in determining both the RLP and RCP limits. The RLP values depend on friction, explaining why varied experimental results can be obtained.
ERIC Educational Resources Information Center
Thompson, Bruce
Web-based statistical instruction, like all statistical instruction, ought to focus on teaching the essence of the research endeavor: the exercise of reflective judgment. Using the framework of the recent report of the American Psychological Association (APA) Task Force on Statistical Inference (Wilkinson and the APA Task Force on Statistical…
Evidence of the non-extensive character of Earth's ambient noise.
NASA Astrophysics Data System (ADS)
Koutalonis, Ioannis; Vallianatos, Filippos
2017-04-01
Investigation of dynamical features of ambient seismic noise is one of the important scientific and practical research challenges. At the same time there is growing interest concerning an approach to study Earth Physics based on the science of complex systems and non-extensive statistical mechanics, which is a generalization of Boltzmann-Gibbs statistical physics (Vallianatos et al., 2016). This seems to be a promising framework for studying complex systems exhibiting phenomena such as long-range interactions and memory effects. In this work we use non-extensive statistical mechanics and signal analysis methods to explore the nature of ambient noise as measured in the stations of the HSNC in the South Aegean (Chatzopoulos et al., 2016). In the present work we analyzed the de-trended increments time series of ambient seismic noise X(t), in time windows of 20 minutes to 10 seconds within "calm time zones" where the human-induced noise presents a minimum. Following the non-extensive statistical physics approach, the probability distribution function of the increments of ambient noise is investigated. Analysis of the probability density function (PDF) p(X), normalized to zero mean and unit variance, shows that the fluctuations of Earth's ambient noise follow a q-Gaussian distribution as defined in the frame of non-extensive statistical mechanics, indicating the possible existence of memory effects in Earth's ambient noise. References: F. Vallianatos, G. Papadakis, G. Michas, Generalized statistical mechanics approaches to earthquakes and tectonics. Proc. R. Soc. A, 472, 20160497, 2016. G. Chatzopoulos, I. Papadopoulos, F. Vallianatos, The Hellenic Seismological Network of Crete (HSNC): Validation and results of the 2013 aftershock, Advances in Geosciences, 41, 65-72, 2016.
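The q-Gaussian mentioned above generalizes the Gaussian within non-extensive (Tsallis) statistics, p(x) ∝ [1 - (1 - q)βx²]^{1/(1-q)} with the bracket truncated at zero, recovering the ordinary Gaussian as q → 1 and developing heavy tails for q > 1. A small numerical sketch (arbitrary parameters, numerical normalization) of how the tail weight grows with q:

```python
import numpy as np

def q_gaussian(x, q, beta):
    """Unnormalised q-Gaussian; q -> 1 reduces to exp(-beta*x^2)."""
    if abs(q - 1.0) < 1e-9:
        return np.exp(-beta * x ** 2)
    base = np.maximum(1.0 - (1.0 - q) * beta * x ** 2, 0.0)
    return base ** (1.0 / (1.0 - q))

x = np.linspace(-10, 10, 20001)
dx = x[1] - x[0]
for q in (1.0, 1.3, 1.6):            # q above 1 gives the heavy tails reported for noise increments
    p = q_gaussian(x, q, beta=1.0)
    p /= p.sum() * dx                # numerical normalisation on the grid
    tail = p[np.abs(x) > 3].sum() * dx
    print(f"q={q:.1f}  P(|X| > 3) = {tail:.4f}")
```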
NASA Astrophysics Data System (ADS)
Miga, Michael I.; Weis, Jared A.; Granero-Molto, Froilan; Spagnoli, Anna
2010-03-01
Understanding bone remodeling and mechanical property characteristics is important for assessing treatments to accelerate healing or in developing diagnostics to evaluate successful return to function. The murine system, whereby mid-diaphyseal tibia fractures are imparted on the subject and fracture healing is assessed at different time points and under different therapeutic conditions, is a particularly useful model to study. In this work, a novel inverse geometric nonlinear elasticity modeling framework is proposed that can reconstruct multiple mechanical properties from uniaxial testing data. To test this framework, the Lamé constants were reconstructed within the context of a murine cohort (n=6) where there were no differences in treatment post tibia fracture except that half of the mice were allowed to heal 4 days longer (10 day and 14 day healing time points, respectively). The properties reconstructed were a shear modulus of G=511.2 +/- 295.6 kPa and 833.3 +/- 352.3 kPa for the 10 day and 14 day time points, respectively. The second Lamé constant reconstructed at λ=1002.9 +/- 42.9 kPa and 14893.7 +/- 863.3 kPa for the 10 day and 14 day time points, respectively. An unpaired Student t-test was used to test for statistically significant differences among the groups. While the shear modulus did not meet our criteria for significance, the second Lamé constant did, at a value p<0.0001. Traditional metrics that are commonly used within the bone fracture healing research community were not found to be statistically significant.
Lakatos, Eszter; Salehi-Reyhani, Ali; Barclay, Michael; Stumpf, Michael P H; Klug, David R
2017-01-01
We determine p53 protein abundances and cell to cell variation in two human cancer cell lines with single cell resolution, and show that the fractional width of the distributions is the same in both cases despite a large difference in average protein copy number. We developed a computational framework to identify dominant mechanisms controlling the variation of protein abundance in a simple model of gene expression from the summary statistics of single cell steady state protein expression distributions. Our results, based on single cell data analysed in a Bayesian framework, lends strong support to a model in which variation in the basal p53 protein abundance may be best explained by variations in the rate of p53 protein degradation. This is supported by measurements of the relative average levels of mRNA which are very similar despite large variation in the level of protein.
Quantifying economic fluctuations by adapting methods of statistical physics
NASA Astrophysics Data System (ADS)
Plerou, Vasiliki
2001-09-01
The first focus of this thesis is the investigation of cross-correlations between the price fluctuations of different stocks using the conceptual framework of random matrix theory (RMT), developed in physics to describe the statistical properties of energy-level spectra of complex nuclei. RMT makes predictions for the statistical properties of matrices that are universal, i.e., do not depend on the interactions between the elements comprising the system. In physical systems, deviations from the predictions of RMT provide clues regarding the mechanisms controlling the dynamics of a given system so this framework is of potential value if applied to economic systems. This thesis compares the statistics of cross-correlation matrix
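The RMT benchmark underlying this comparison is straightforward to reproduce: for T observations of N uncorrelated series, the eigenvalues of the empirical correlation matrix fall asymptotically inside the Marchenko-Pastur band λ± = (1 ± √(N/T))², so eigenvalues outside the band signal genuine cross-correlations. A toy version with one planted common factor (synthetic data, not the stock returns studied in the thesis):

```python
import numpy as np

rng = np.random.default_rng(4)
N, T = 100, 500                       # number of "stocks" and number of "daily returns"

market = rng.normal(size=T)           # one common factor planted in every series
returns = rng.normal(size=(T, N)) + 0.3 * market[:, None]
returns = (returns - returns.mean(0)) / returns.std(0)

C = (returns.T @ returns) / T         # empirical correlation matrix
eig = np.linalg.eigvalsh(C)

q = N / T
lam_minus, lam_plus = (1 - np.sqrt(q)) ** 2, (1 + np.sqrt(q)) ** 2   # Marchenko-Pastur edges
print(f"MP band for pure noise : [{lam_minus:.2f}, {lam_plus:.2f}]")
print(f"largest eigenvalues    : {np.sort(eig)[-3:]}")
print(f"eigenvalues above band : {(eig > lam_plus).sum()}")   # the planted factor escapes the band
```

The single large eigenvalue well above the MP edge plays the role of the "market mode" type deviation that such analyses use as a clue to the system's dynamics.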
Statistical Mechanics of the Delayed Reward-Based Learning with Node Perturbation
NASA Astrophysics Data System (ADS)
Saito, Hiroshi; Katahira, Kentaro; Okanoya, Kazuo; Okada, Masato
2010-06-01
In reward-based learning, reward is typically given with some delay after a behavior that causes the reward. In the machine learning literature, the framework of the eligibility trace has been used as one of the solutions to handle the delayed reward in reinforcement learning. In recent studies, the eligibility trace is implied to be important for a difficult neuroscience problem known as the "distal reward problem". Node perturbation is one of the stochastic gradient methods among the many kinds of reinforcement learning implementations, and it searches for the approximate gradient by introducing perturbations into a network. Since the stochastic gradient method does not require the differential of an objective function, it is expected to be able to account for the learning mechanism of a complex system, like the brain. We study node perturbation with the eligibility trace as a specific example of delayed reward-based learning, and analyze it using a statistical mechanics approach. As a result, we show the optimal time constant of the eligibility trace with respect to the reward delay and the existence of unlearnable parameter configurations.
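As a rough sketch of the learning rule analysed above (not the authors' formulation or their order-parameter analysis): a linear student receives a perturbed output, the reward difference is delivered with a delay, an exponentially decaying eligibility trace stores recent perturbation-weighted inputs, and the weight update correlates the late-arriving reward difference with that trace. All parameter values are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)
D = 20
w_teacher = rng.normal(size=D) / np.sqrt(D)   # target ("teacher") weights
w = np.zeros(D)                               # student weights, learned by node perturbation

sigma = 0.1      # perturbation amplitude on the output unit
eta = 0.05       # learning rate
delay = 5        # the reward difference arrives `delay` trials after the behaviour
lam = 0.8        # eligibility-trace decay per trial

trace = np.zeros(D)   # eligibility trace of perturbation-weighted inputs
pending = []          # reward differences waiting to be delivered

print("weight error before learning:", np.mean((w - w_teacher) ** 2))
for t in range(20000):
    x = rng.normal(size=D) / np.sqrt(D)
    target = w_teacher @ x
    xi = sigma * rng.normal()                  # node perturbation
    reward_diff = -(w @ x + xi - target) ** 2 + (w @ x - target) ** 2
    pending.append(reward_diff)

    trace = lam * trace + xi * x               # trace remembers recent perturbations
    if len(pending) > delay:                   # the delayed reward finally arrives
        w += eta * pending.pop(0) * trace / sigma ** 2
print("weight error after learning :", np.mean((w - w_teacher) ** 2))
```

The trace decay lam plays the role of the time constant whose optimal value, relative to the delay, is the quantity the paper computes analytically.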
Particle Acceleration in a Statistically Modeled Solar Active-Region Corona
NASA Astrophysics Data System (ADS)
Toutounzi, A.; Vlahos, L.; Isliker, H.; Dimitropoulou, M.; Anastasiadis, A.; Georgoulis, M.
2013-09-01
Elaborating a statistical approach to describe the spatiotemporally intermittent electric field structures formed inside a flaring solar active region, we investigate the efficiency of such structures in accelerating charged particles (electrons). The large-scale magnetic configuration in the solar atmosphere responds to the strong turbulent flows that convey perturbations across the active region by initiating avalanche-type processes. The resulting unstable structures correspond to small-scale dissipation regions hosting strong electric fields. Previous research on particle acceleration in strongly turbulent plasmas provides a general framework for addressing such a problem. This framework combines various electromagnetic field configurations obtained by magnetohydrodynamical (MHD) or cellular automata (CA) simulations, or by employing a statistical description of the field's strength and configuration with test particle simulations. Our objective is to complement previous work done on the subject. As in previous efforts, a set of three probability distribution functions describes our ad-hoc electromagnetic field configurations. In addition, we work on data-driven 3D magnetic field extrapolations. A collisional relativistic test-particle simulation traces each particle's guiding center within these configurations. We also find that an interplay between different electron populations (thermal/non-thermal, ambient/injected) in our simulations may also address, via a re-acceleration mechanism, the so called `number problem'. Using the simulated particle-energy distributions at different heights of the cylinder we test our results against observations, in the framework of the collisional thick target model (CTTM) of solar hard X-ray (HXR) emission. The above work is supported by the Hellenic National Space Weather Research Network (HNSWRN) via the THALIS Programme.
The non-equilibrium allele frequency spectrum in a Poisson random field framework.
Kaj, Ingemar; Mugal, Carina F
2016-10-01
In population genetic studies, the allele frequency spectrum (AFS) efficiently summarizes genome-wide polymorphism data and shapes a variety of allele frequency-based summary statistics. While existing theory typically features equilibrium conditions, emerging methodology requires an analytical understanding of the build-up of the allele frequencies over time. In this work, we use the framework of Poisson random fields to derive new representations of the non-equilibrium AFS for the case of a Wright-Fisher population model with selection. In our approach, the AFS is a scaling-limit of the expectation of a Poisson stochastic integral and the representation of the non-equilibrium AFS arises in terms of a fixation time probability distribution. The known duality between the Wright-Fisher diffusion process and a birth and death process generalizing Kingman's coalescent yields an additional representation. The results carry over to the setting of a random sample drawn from the population and provide the non-equilibrium behavior of sample statistics. Our findings are consistent with and extend a previous approach where the non-equilibrium AFS solves a partial differential forward equation with a non-traditional boundary condition. Moreover, we provide a bridge to previous coalescent-based work, and hence tie several frameworks together. Since frequency-based summary statistics are widely used in population genetics, for example, to identify candidate loci of adaptive evolution, to infer the demographic history of a population, or to improve our understanding of the underlying mechanics of speciation events, the presented results are potentially useful for a broad range of topics. Copyright © 2016 Elsevier Inc. All rights reserved.
ERIC Educational Resources Information Center
Nitko, Anthony J.; Hsu, Tse-chi
Item analysis procedures appropriate for domain-referenced classroom testing are described. A conceptual framework within which item statistics can be considered and promising statistics in light of this framework are presented. The sampling fluctuations of the more promising item statistics for sample sizes comparable to the typical classroom…
Tsallis non-extensive statistics and solar wind plasma complexity
NASA Astrophysics Data System (ADS)
Pavlos, G. P.; Iliopoulos, A. C.; Zastenker, G. N.; Zelenyi, L. M.; Karakatsanis, L. P.; Riazantseva, M. O.; Xenakis, M. N.; Pavlos, E. G.
2015-03-01
This article presents novel results revealing non-equilibrium phase transition processes in the solar wind plasma during a strong shock event, which took place on 26th September 2011. Solar wind plasma is a typical case of stochastic spatiotemporal distribution of physical state variables such as force fields (B→, E→) and matter fields (particle and current densities or bulk plasma distributions). This study shows clearly the non-extensive and non-Gaussian character of the solar wind plasma and the existence of multi-scale strong correlations from the microscopic to the macroscopic level. It also underlines the inefficiency of classical magneto-hydro-dynamic (MHD) or plasma statistical theories, based on the classical central limit theorem (CLT), to explain the complexity of the solar wind dynamics, since these theories include smooth and differentiable spatial-temporal functions (MHD theory) or Gaussian statistics (Boltzmann-Maxwell statistical mechanics). On the contrary, the results of this study indicate the presence of non-Gaussian non-extensive statistics with heavy-tailed probability distribution functions, which are related to the q-extension of the CLT. Finally, the results of this study can be understood in the framework of modern theoretical concepts such as non-extensive statistical mechanics (Tsallis, 2009), fractal topology (Zelenyi and Milovanov, 2004), turbulence theory (Frisch, 1996), strange dynamics (Zaslavsky, 2002), percolation theory (Milovanov, 1997), anomalous diffusion theory and anomalous transport theory (Milovanov, 2001), fractional dynamics (Tarasov, 2013) and non-equilibrium phase transition theory (Chang, 1992).
Statistical testing of association between menstruation and migraine.
Barra, Mathias; Dahl, Fredrik A; Vetvik, Kjersti G
2015-02-01
To repair and refine a previously proposed method for statistical analysis of association between migraine and menstruation. Menstrually related migraine (MRM) affects about 20% of female migraineurs in the general population. The exact pathophysiological link from menstruation to migraine is hypothesized to be through fluctuations in female reproductive hormones, but the exact mechanisms remain unknown. Therefore, the main diagnostic criterion today is concurrency of migraine attacks with menstruation. Methods aiming to exclude spurious associations are wanted, so that further research into these mechanisms can be performed on a population with a true association. The statistical method is based on a simple two-parameter null model of MRM (which allows for simulation modeling), and Fisher's exact test (with mid-p correction) applied to standard 2 × 2 contingency tables derived from the patients' headache diaries. Our method is a corrected version of a previously published flawed framework. To our best knowledge, no other published methods for establishing a menstruation-migraine association by statistical means exist today. The probabilistic methodology shows good performance when subjected to receiver operator characteristic curve analysis. Quick reference cutoff values for the clinical setting were tabulated for assessing association given a patient's headache history. In this paper, we correct a proposed method for establishing association between menstruation and migraine by statistical methods. We conclude that the proposed standard of 3-cycle observations prior to setting an MRM diagnosis should be extended with at least one perimenstrual window to obtain sufficient information for statistical processing. © 2014 American Headache Society.
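The diary-based analysis described above reduces to a 2 × 2 contingency table, Fisher's exact test, and a mid-p correction (subtract half the probability of the observed table itself). A sketch with invented counts:

```python
import numpy as np
from scipy.stats import fisher_exact, hypergeom

# Hypothetical diary counts over several cycles:
#                      migraine   no migraine
# perimenstrual days        8           12
# other days               15           85
table = np.array([[8, 12], [15, 85]])

_, p_exact = fisher_exact(table, alternative="greater")   # one-sided: attacks enriched perimenstrually

# Mid-p correction: subtract half the probability of the observed table itself.
a = table[0, 0]
row1, col1, total = table[0].sum(), table[:, 0].sum(), table.sum()
p_observed = hypergeom.pmf(a, total, col1, row1)
p_midp = p_exact - 0.5 * p_observed

print(f"Fisher exact (one-sided) p = {p_exact:.4f}")
print(f"mid-p corrected          p = {p_midp:.4f}")
```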
Wingate, Peter H; Thornton, George C; McIntyre, Kelly S; Frame, Jennifer H
2003-02-01
The present study examined relationships between reduction-in-force (RIF) personnel practices, presentation of statistical evidence, and litigation outcomes. Policy capturing methods were utilized to analyze the components of 115 federal district court opinions involving age discrimination disparate treatment allegations and organizational downsizing. Univariate analyses revealed meaningful links between RIF personnel practices, use of statistical evidence, and judicial verdict. The defendant organization was awarded summary judgment in 73% of the claims included in the study. Judicial decisions in favor of the defendant organization were found to be significantly related to such variables as formal performance appraisal systems, termination decision review within the organization, methods of employee assessment and selection for termination, and the presence of a concrete layoff policy. The use of statistical evidence in ADEA disparate treatment litigation was investigated and found to be a potentially persuasive type of indirect evidence. Legal, personnel, and evidentiary ramifications are reviewed, and a framework of downsizing mechanics emphasizing legal defensibility is presented.
Reconciling intuitive physics and Newtonian mechanics for colliding objects.
Sanborn, Adam N; Mansinghka, Vikash K; Griffiths, Thomas L
2013-04-01
People have strong intuitions about the influence objects exert upon one another when they collide. Because people's judgments appear to deviate from Newtonian mechanics, psychologists have suggested that people depend on a variety of task-specific heuristics. This leaves open the question of how these heuristics could be chosen, and how to integrate them into a unified model that can explain human judgments across a wide range of physical reasoning tasks. We propose an alternative framework, in which people's judgments are based on optimal statistical inference over a Newtonian physical model that incorporates sensory noise and intrinsic uncertainty about the physical properties of the objects being viewed. This noisy Newton framework can be applied to a multitude of judgments, with people's answers determined by the uncertainty they have for physical variables and the constraints of Newtonian mechanics. We investigate a range of effects in mass judgments that have been taken as strong evidence for heuristic use and show that they are well explained by the interplay between Newtonian constraints and sensory uncertainty. We also consider an extended model that handles causality judgments, and obtain good quantitative agreement with human judgments across tasks that involve different judgment types with a single consistent set of parameters.
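The noisy Newton idea can be sketched for the classic mass-judgment task: observe noisy pre- and post-collision velocities, assume Newtonian (here, perfectly elastic, 1-D) collision dynamics, and compute a posterior over the mass ratio on a grid. The noise level, velocities, and the simplification of treating the pre-collision velocities as known are all illustrative assumptions, not the authors' model.

```python
import numpy as np

def elastic_outcome(m1, m2, u1, u2):
    """Final velocities of a 1-D perfectly elastic collision (Newtonian mechanics)."""
    v1 = ((m1 - m2) * u1 + 2 * m2 * u2) / (m1 + m2)
    v2 = ((m2 - m1) * u2 + 2 * m1 * u1) / (m1 + m2)
    return v1, v2

sigma = 0.2                     # sensory noise on each observed velocity (assumption)
u1, u2 = 1.0, 0.0               # true pre-collision velocities
true_ratio = 2.0                # object 1 is twice as heavy as object 2
v1, v2 = elastic_outcome(true_ratio, 1.0, u1, u2)

rng = np.random.default_rng(6)
obs = np.array([u1, u2, v1, v2]) + sigma * rng.normal(size=4)   # what the observer "sees"

ratios = np.linspace(0.2, 5.0, 241)        # hypotheses about the mass ratio m1/m2
log_post = []
for r in ratios:
    pred_v1, pred_v2 = elastic_outcome(r, 1.0, u1, u2)
    pred = np.array([u1, u2, pred_v1, pred_v2])
    log_post.append(-np.sum((obs - pred) ** 2) / (2 * sigma ** 2))   # flat prior over ratios
log_post = np.array(log_post)
post = np.exp(log_post - log_post.max())
post /= post.sum()

print("true mass ratio      :", true_ratio)
print("posterior mean ratio :", float(np.sum(ratios * post)))
```

Larger sensory noise broadens the posterior and biases judgments toward the prior, which is how this framework reproduces the apparent "heuristic" errors discussed in the abstract.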
NASA Astrophysics Data System (ADS)
Rubin, D.; Aldering, G.; Barbary, K.; Boone, K.; Chappell, G.; Currie, M.; Deustua, S.; Fagrelius, P.; Fruchter, A.; Hayden, B.; Lidman, C.; Nordin, J.; Perlmutter, S.; Saunders, C.; Sofiatti, C.; Supernova Cosmology Project, The
2015-11-01
While recent supernova (SN) cosmology research has benefited from improved measurements, current analysis approaches are not statistically optimal and will prove insufficient for future surveys. This paper discusses the limitations of current SN cosmological analyses in treating outliers, selection effects, shape- and color-standardization relations, unexplained dispersion, and heterogeneous observations. We present a new Bayesian framework, called UNITY (Unified Nonlinear Inference for Type-Ia cosmologY), that incorporates significant improvements in our ability to confront these effects. We apply the framework to real SN observations and demonstrate smaller statistical and systematic uncertainties. We verify earlier results that SNe Ia require nonlinear shape and color standardizations, but we now include these nonlinear relations in a statistically well-justified way. This analysis was primarily performed blinded, in that the basic framework was first validated on simulated data before transitioning to real data. We also discuss possible extensions of the method.
Inverse tissue mechanics of cell monolayer expansion.
Kondo, Yohei; Aoki, Kazuhiro; Ishii, Shin
2018-03-01
Living tissues undergo deformation during morphogenesis. In this process, cells generate mechanical forces that drive the coordinated cell motion and shape changes. Recent advances in experimental and theoretical techniques have enabled in situ measurement of the mechanical forces, but the characterization of mechanical properties that determine how these forces quantitatively affect tissue deformation remains challenging, and this represents a major obstacle for the complete understanding of morphogenesis. Here, we proposed a non-invasive reverse-engineering approach for the estimation of the mechanical properties, by combining tissue mechanics modeling and statistical machine learning. Our strategy is to model the tissue as a continuum mechanical system and to use passive observations of spontaneous tissue deformation and force fields to statistically estimate the model parameters. This method was applied to the analysis of the collective migration of Madin-Darby canine kidney cells, and the tissue flow and force were simultaneously observed by the phase contrast imaging and traction force microscopy. We found that our monolayer elastic model, whose elastic moduli were reverse-engineered, enabled a long-term forecast of the traction force fields when given the tissue flow fields, indicating that the elasticity contributes to the evolution of the tissue stress. Furthermore, we investigated the tissues in which myosin was inhibited by blebbistatin treatment, and observed a several-fold reduction in the elastic moduli. The obtained results validate our framework, which paves the way to the estimation of mechanical properties of living tissues during morphogenesis.
When mechanism matters: Bayesian forecasting using models of ecological diffusion
Hefley, Trevor J.; Hooten, Mevin B.; Russell, Robin E.; Walsh, Daniel P.; Powell, James A.
2017-01-01
Ecological diffusion is a theory that can be used to understand and forecast spatio-temporal processes such as dispersal, invasion, and the spread of disease. Hierarchical Bayesian modelling provides a framework to make statistical inference and probabilistic forecasts, using mechanistic ecological models. To illustrate, we show how hierarchical Bayesian models of ecological diffusion can be implemented for large data sets that are distributed densely across space and time. The hierarchical Bayesian approach is used to understand and forecast the growth and geographic spread in the prevalence of chronic wasting disease in white-tailed deer (Odocoileus virginianus). We compare statistical inference and forecasts from our hierarchical Bayesian model to phenomenological regression-based methods that are commonly used to analyse spatial occurrence data. The mechanistic statistical model based on ecological diffusion led to important ecological insights, obviated a commonly ignored type of collinearity, and was the most accurate method for forecasting.
Topologically protected modes in non-equilibrium stochastic systems.
Murugan, Arvind; Vaikuntanathan, Suriyanarayanan
2017-01-10
Non-equilibrium driving of biophysical processes is believed to enable their robust functioning despite the presence of thermal fluctuations and other sources of disorder. Such robust functions include sensory adaptation, enhanced enzymatic specificity and maintenance of coherent oscillations. Elucidating the relation between energy consumption and organization remains an important and open question in non-equilibrium statistical mechanics. Here we report that steady states of systems with non-equilibrium fluxes can support topologically protected boundary modes that resemble similar modes in electronic and mechanical systems. Akin to their electronic and mechanical counterparts, topologically protected boundary steady states in non-equilibrium systems are robust and are largely insensitive to local perturbations. We argue that our work provides a framework for how biophysical systems can use non-equilibrium driving to achieve robust function.
Thermodynamic Model of Spatial Memory
NASA Astrophysics Data System (ADS)
Kaufman, Miron; Allen, P.
1998-03-01
We develop and test a thermodynamic model of spatial memory. Our model is an application of statistical thermodynamics to cognitive science. It is related to applications of the statistical mechanics framework in parallel distributed processes research. Our macroscopic model allows us to evaluate an entropy associated with spatial memory tasks. We find that older adults exhibit higher levels of entropy than younger adults. Thurstone's Law of Categorical Judgment, according to which the discriminal processes along the psychological continuum produced by presentations of a single stimulus are normally distributed, is explained by using a Hooke spring model of spatial memory. We have also analyzed a nonlinear modification of the ideal spring model of spatial memory. This work is supported by NIH/NIA grant AG09282-06.
Temperature in and out of equilibrium: A review of concepts, tools and attempts
NASA Astrophysics Data System (ADS)
Puglisi, A.; Sarracino, A.; Vulpiani, A.
2017-11-01
We review the general aspects of the concept of temperature in equilibrium and non-equilibrium statistical mechanics. Although temperature is an old and well-established notion, it still presents controversial facets. After a short historical survey of the key role of temperature in thermodynamics and statistical mechanics, we tackle a series of issues which have been recently reconsidered. In particular, we discuss different definitions and their relevance for energy fluctuations. The interest in such a topic has been triggered by the recent observation of negative temperatures in condensed matter experiments. Moreover, the ability to manipulate systems at the micro- and nano-scale urges us to understand and clarify some aspects related to the statistical properties of small systems (such as the issue of temperature "fluctuations"). We also discuss the notion of temperature in a dynamical context, within the theory of linear response for Hamiltonian systems at equilibrium and stochastic models with detailed balance, and the generalized fluctuation-response relations, which provide a hint for an extension of the definition of temperature in far-from-equilibrium systems. To conclude we consider non-Hamiltonian systems, such as granular materials, turbulence and active matter, where a general theoretical framework is still lacking.
NASA Astrophysics Data System (ADS)
Militello, F.; Farley, T.; Mukhi, K.; Walkden, N.; Omotani, J. T.
2018-05-01
A statistical framework was introduced in Militello and Omotani [Nucl. Fusion 56, 104004 (2016)] to correlate the dynamics and statistics of L-mode and inter-ELM plasma filaments with the radial profiles of thermodynamic quantities they generate in the Scrape Off Layer. This paper extends the framework to cases in which the filaments are emitted from the separatrix at different toroidal positions and with a finite toroidal velocity. It is found that the toroidal velocity does not affect the profiles, while the toroidal distribution of filament emission renormalises the waiting time between two events. Experimental data collected by visual camera imaging are used to evaluate the statistics of the fluctuations, to inform the choice of the probability distribution functions used in the application of the framework. It is found that the toroidal separation of the filaments is exponentially distributed, thus suggesting the lack of a toroidal modal structure. Finally, using these measurements, the framework is applied to an experimental case and good agreement is found.
Do the Modified Uncertainty Principle and Polymer Quantization predict same physics?
NASA Astrophysics Data System (ADS)
Majumder, Barun; Sen, Sourav
2012-10-01
In this Letter we study the effects of the Modified Uncertainty Principle as proposed in Ali et al. (2009) [5] in simple quantum mechanical systems and examine their thermodynamic properties. We have assumed that the quantum particles follow Maxwell-Boltzmann statistics with no spin. We compare our results with those found in the GUP and polymer quantum mechanical frameworks. Interestingly, we find that the corrected thermodynamic quantities are exactly the same as the polymer results, but the length scale considered has a theoretically different origin. Hence we note the need for further study to investigate whether these two approaches are conceptually connected at a fundamental level.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aldemir, Tunc; Denning, Richard; Catalyurek, Umit
Reduction in safety margin can be expected as passive structures and components undergo degradation with time. Limitations in the traditional probabilistic risk assessment (PRA) methodology constrain its value as an effective tool to address the impact of aging effects on risk and for quantifying the impact of aging management strategies in maintaining safety margins. A methodology has been developed to address multiple aging mechanisms involving large numbers of components (with possibly statistically dependent failures) within the PRA framework in a computationally feasible manner when the sequencing of events is conditioned on the physical conditions predicted in a simulation environment, such as the New Generation System Code (NGSC) concept. Both epistemic and aleatory uncertainties can be accounted for within the same phenomenological framework and maintenance can be accounted for in a coherent fashion. The framework accommodates the prospective impacts of various intervention strategies such as testing, maintenance, and refurbishment. The methodology is illustrated with several examples.
Alvarez, Stéphanie; Timler, Carl J.; Michalscheck, Mirja; Paas, Wim; Descheemaeker, Katrien; Tittonell, Pablo; Andersson, Jens A.; Groot, Jeroen C. J.
2018-01-01
Creating typologies is a way to summarize the large heterogeneity of smallholder farming systems into a few farm types. Various methods exist, commonly using statistical analysis, to create these typologies. We demonstrate that the methodological decisions on data collection, variable selection, data-reduction and clustering techniques can bear a large impact on the typology results. We illustrate the effects of analysing the diversity from different angles, using different typology objectives and different hypotheses, on typology creation by using an example from Zambia’s Eastern Province. Five separate typologies were created with principal component analysis (PCA) and hierarchical clustering analysis (HCA), based on three different expert-informed hypotheses. The greatest overlap between typologies was observed for the larger, wealthier farm types but for the remainder of the farms there were no clear overlaps between typologies. Based on these results, we argue that the typology development should be guided by a hypothesis on the local agriculture features and the drivers and mechanisms of differentiation among farming systems, such as biophysical and socio-economic conditions. That hypothesis is based both on the typology objective and on prior expert knowledge and theories of the farm diversity in the study area. We present a methodological framework that aims to integrate participatory and statistical methods for hypothesis-based typology construction. This is an iterative process whereby the results of the statistical analysis are compared with the reality of the target population as hypothesized by the local experts. Using a well-defined hypothesis and the presented methodological framework, which consolidates the hypothesis through local expert knowledge for the creation of typologies, warrants development of less subjective and more contextualized quantitative farm typologies. PMID:29763422
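The core statistical step (dimension reduction followed by hierarchical clustering) can be sketched as below. The data, number of components, and number of farm types are placeholders, and the hypothesis-driven variable selection that the authors emphasize is not shown.

```python
# Minimal sketch of the PCA + hierarchical clustering step used to build farm
# typologies; inputs and the cluster count are hypothetical.
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

# rows = farms, columns = selected structural/functional variables
X = np.random.rand(100, 8)            # placeholder for survey data

scores = PCA(n_components=3).fit_transform(StandardScaler().fit_transform(X))
Z = linkage(scores, method="ward")    # Ward's hierarchical clustering
farm_type = fcluster(Z, t=5, criterion="maxclust")   # e.g. five farm types
print(np.bincount(farm_type)[1:])     # number of farms per type
```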
Koike, Mari; Hummel, Susan K; Ball, John D; Okabe, Toru
2012-06-01
Although pure titanium is known to have good biocompatibility, a titanium alloy with better strength is needed for fabricating clinically acceptable, partial removable dental prosthesis (RDP) frameworks. The mechanical properties of an experimental Ti-5Al-5Cu alloy cast with a 2-step investment technique were examined for RDP framework applications. Patterns for tests for various properties and denture frameworks for a preliminary trial casting were invested with a 2-step coating method using 2 types of mold materials: a less reactive spinel compound (Al(2)O(3)·MgO) and a less expensive SiO(2)-based material. The yield and tensile strength (n=5), modulus of elasticity (n=5), elongation (n=5), and hardness (n=8) of the cast Ti-5Al-5Cu alloy were determined. The external appearance and internal porosities of the preliminary trial castings of denture frameworks (n=2) were examined with a conventional dental radiographic unit. Cast Ti-6Al-4V alloy and commercially pure titanium (CP Ti) were used as controls. The data for the mechanical properties were statistically analyzed with 1-way ANOVA (α=.05). The yield strength of the cast Ti-5Al-5Cu alloy was 851 MPa and the hardness was 356 HV. These properties were comparable to those of the cast Ti-6Al-4V and were higher than those of CP Ti (P<.05). One of the acrylic resin-retention areas of the Ti-5Al-5Cu frameworks was found to have been incompletely cast. The cast biocompatible experimental Ti-5Al-5Cu alloy exhibited high strength when cast with a 2-step coating method. With a dedicated study to determine the effect of sprue design on the quality of castings, biocompatible Ti-5Al-5Cu RDP frameworks for a clinical trial can be produced. Copyright © 2012 The Editorial Council of the Journal of Prosthetic Dentistry. Published by Mosby, Inc. All rights reserved.
Fit reduced GUTS models online: From theory to practice.
Baudrot, Virgile; Veber, Philippe; Gence, Guillaume; Charles, Sandrine
2018-05-20
Mechanistic modeling approaches, such as the toxicokinetic-toxicodynamic (TKTD) framework, are promoted by international institutions such as the European Food Safety Authority and the Organization for Economic Cooperation and Development to assess the environmental risk of chemical products generated by human activities. TKTD models can encompass a large set of mechanisms describing the kinetics of compounds inside organisms (e.g., uptake and elimination) and their effect at the level of individuals (e.g., damage accrual, recovery, and death mechanism). Compared to classical dose-response models, TKTD approaches have many advantages, including accounting for temporal aspects of exposure and toxicity, considering data points all along the experiment and not only at the end, and making predictions for untested situations such as realistic exposure scenarios. Among TKTD models, the general unified threshold model of survival (GUTS) is one of the most recent and innovative frameworks but is still underused in practice, especially by risk assessors, because specialist programming and statistical skills are necessary to run it. Making GUTS models easier to use through a new module freely available from the web platform MOSAIC (standing for MOdeling and StAtistical tools for ecotoxICology) should promote GUTS operability in support of the daily work of environmental risk assessors. This paper presents the main features of MOSAIC_GUTS: uploading of the experimental data, GUTS fitting analysis, and LCx estimates with their uncertainty. These features are exemplified with literature data. Integr Environ Assess Manag 2018;00:000-000. © 2018 SETAC.
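For orientation, a minimal forward simulation of one reduced GUTS variant (GUTS-RED-SD) is sketched below. The parameter values are invented, and the Bayesian fitting that MOSAIC_GUTS provides is not reproduced here.

```python
# Hedged sketch of the reduced GUTS stochastic-death (GUTS-RED-SD) model:
# scaled damage follows first-order kinetics and the hazard rate grows
# linearly above a threshold. Parameters are illustrative only.
import numpy as np

def guts_red_sd_survival(t, conc, kd=0.5, z=2.0, kk=0.3, hb=0.01):
    """t: time grid; conc: external concentration at each time point.
    Returns the predicted survival probability S(t) (Euler integration)."""
    D = np.zeros_like(t)          # scaled damage
    H = np.zeros_like(t)          # cumulative hazard
    for i in range(1, len(t)):
        dt = t[i] - t[i - 1]
        D[i] = D[i - 1] + kd * (conc[i - 1] - D[i - 1]) * dt
        hazard = kk * max(D[i] - z, 0.0) + hb
        H[i] = H[i - 1] + hazard * dt
    return np.exp(-H)

t = np.linspace(0, 10, 201)
conc = np.full_like(t, 5.0)       # constant-exposure scenario
print(guts_red_sd_survival(t, conc)[-1])
```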
ERIC Educational Resources Information Center
LeMire, Steven D.
2010-01-01
This paper proposes an argument framework for the teaching of null hypothesis statistical testing and its application in support of research. Elements of the Toulmin (1958) model of argument are used to illustrate the use of p values and Type I and Type II error rates in support of claims about statistical parameters and subject matter research…
The Role of Discrete Global Grid Systems in the Global Statistical Geospatial Framework
NASA Astrophysics Data System (ADS)
Purss, M. B. J.; Peterson, P.; Minchin, S. A.; Bermudez, L. E.
2016-12-01
The United Nations Committee of Experts on Global Geospatial Information Management (UN-GGIM) has proposed the development of a Global Statistical Geospatial Framework (GSGF) as a mechanism for the establishment of common analytical systems that enable the integration of statistical and geospatial information. Conventional coordinate reference systems address the globe with a continuous field of points suitable for repeatable navigation and analytical geometry. While this continuous field is represented on a computer in a digitized and discrete fashion by tuples of fixed-precision floating point values, it is a non-trivial exercise to relate point observations spatially referenced in this way to areal coverages on the surface of the Earth. The GSGF states the need to move to gridded data delivery and the importance of using common geographies and geocoding. The challenges associated with meeting these goals are not new and there has been a significant effort within the geospatial community to develop nested gridding standards to tackle these issues over many years. These efforts have recently culminated in a Discrete Global Grid Systems (DGGS) standard developed under the auspices of the Open Geospatial Consortium (OGC). DGGS provide a fixed, areal-based geospatial reference frame for the persistent location of measured Earth observations, feature interpretations, and modelled predictions. DGGS address the entire planet by partitioning it into a discrete hierarchical tessellation of progressively finer resolution cells, which are referenced by a unique index that facilitates rapid computation, query and analysis. The geometry and location of the cell is the principal aspect of a DGGS. Data integration, decomposition, and aggregation are optimised in the DGGS hierarchical structure and can be exploited for efficient multi-source data processing, storage, discovery, transmission, visualization, computation, analysis, and modelling. During the 6th Session of the UN-GGIM in August 2016 the role of DGGS in the context of the GSGF was formally acknowledged. This paper proposes to highlight the synergies and role of DGGS in the Global Statistical Geospatial Framework and to show examples of the use of DGGS to combine geospatial statistics with traditional geoscientific data.
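A toy example of the hierarchical cell-indexing idea is sketched below: a simple lat/lon quadtree, not an implementation of the OGC DGGS standard, intended only to show how a hierarchical index supports aggregation by prefix truncation.

```python
# Toy illustration of hierarchical cell indexing (a plain lat/lon quadtree),
# not an OGC DGGS implementation: each refinement level splits a cell into
# four children, and the digit string is the cell index.
def cell_index(lat, lon, depth=8):
    lat_lo, lat_hi, lon_lo, lon_hi = -90.0, 90.0, -180.0, 180.0
    digits = []
    for _ in range(depth):
        lat_mid = (lat_lo + lat_hi) / 2
        lon_mid = (lon_lo + lon_hi) / 2
        d = 0
        if lat >= lat_mid:
            d += 2
            lat_lo = lat_mid
        else:
            lat_hi = lat_mid
        if lon >= lon_mid:
            d += 1
            lon_lo = lon_mid
        else:
            lon_hi = lon_mid
        digits.append(str(d))
    return "".join(digits)

# Nearby points share a common index prefix, so aggregation to coarser cells
# is a simple string truncation.
print(cell_index(-35.3, 149.1), cell_index(-35.4, 149.2))
```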
Yong, Paul J
2017-10-01
Endometriosis is a common chronic disease affecting 1 in 10 women of reproductive age, with half of women with endometriosis experiencing deep dyspareunia. A review of research studies on endometriosis indicates a need for a validated question or questionnaire for deep dyspareunia. Moreover, placebo-controlled randomized trials have yet to demonstrate a clear benefit for traditional treatments of endometriosis for the outcome of deep dyspareunia. The reason some patients might not respond to traditional treatments is the multifactorial nature of deep dyspareunia in endometriosis, which can include comorbid conditions (eg, interstitial cystitis and bladder pain syndrome) and central sensitization underlying genito-pelvic pain penetration disorder. In general, there is a lack of a framework that integrates these multifactorial causes to provide a standardized approach to deep dyspareunia in endometriosis. To propose a clinical framework for deep dyspareunia based on a synthesis of pain mechanisms with genito-pelvic pain penetration disorder according to the Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition. Narrative review after literature search with the terms (endometriosis AND dyspareunia) OR (dyspareunia AND deep) and after analysis of placebo-controlled randomized trials. Deep dyspareunia presence or absence or deep dyspareunia severity on a numeric rating scale or visual analog scale. Four types of deep dyspareunia are proposed in women with endometriosis: type I that is directly due to endometriosis; type II that is related to a comorbid condition; type III in which genito-pelvic pain penetration disorder is primary; and type IV that is secondary to a combination of types I to III. Four types of deep dyspareunia in endometriosis are proposed, which can be used as a framework in research studies and in clinical practice. Research trials could phenotype or stratify patients by each type. The framework also could give rise to more personalized care for patients by targeting appropriate treatments to each deep dyspareunia type. Yong PJ. Deep Dyspareunia in Endometriosis: A Proposed Framework Based on Pain Mechanisms and Genito-Pelvic Pain Penetration Disorder. Sex Med Rev 2017;5:495-507. Copyright © 2017 International Society for Sexual Medicine. Published by Elsevier Inc. All rights reserved.
NASA Astrophysics Data System (ADS)
Luzzi, R.; Vasconcellos, A. R.; Ramos, J. G.; Rodrigues, C. G.
2018-01-01
We describe the formalism of statistical irreversible thermodynamics constructed based on Zubarev's nonequilibrium statistical operator (NSO) method, which is a powerful and universal tool for investigating the most varied physical phenomena. We present brief overviews of the statistical ensemble formalism and statistical irreversible thermodynamics. The first can be constructed either based on a heuristic approach or in the framework of information theory in the Jeffreys-Jaynes scheme of scientific inference; Zubarev and his school used both approaches in formulating the NSO method. We describe the main characteristics of statistical irreversible thermodynamics and discuss some particular considerations of several authors. We briefly describe how Rosenfeld, Bohr, and Prigogine proposed to derive a thermodynamic uncertainty principle.
NASA Astrophysics Data System (ADS)
Beretta, Gian Paolo
2014-10-01
By suitable reformulations, we cast the mathematical frameworks of several well-known different approaches to the description of nonequilibrium dynamics into a unified formulation valid in all these contexts, which extends to such frameworks the concept of steepest entropy ascent (SEA) dynamics introduced by the present author in previous works on quantum thermodynamics. Actually, the present formulation also constitutes a generalization of the quantum thermodynamics framework. The analysis emphasizes that in the SEA modeling principle a key role is played by the geometrical metric with respect to which the length of a trajectory in state space is measured. In the near-thermodynamic-equilibrium limit, the metric tensor is directly related to Onsager's generalized resistivity tensor. Therefore, through the identification of a suitable metric field which generalizes the Onsager generalized resistance to the arbitrarily far-nonequilibrium domain, most of the existing theories of nonequilibrium thermodynamics can be cast in such a way that the state exhibits the spontaneous tendency to evolve in state space along the path of SEA compatible with the conservation constraints and the boundary conditions. The resulting unified family of SEA dynamical models is intrinsically and strongly consistent with the second law of thermodynamics. The non-negativity of the entropy production is a general and readily proved feature of SEA dynamics. In several of the different approaches to nonequilibrium description we consider here, the SEA concept has not been investigated before. We believe it defines the precise meaning and the domain of general validity of the so-called maximum entropy production principle. Therefore, it is hoped that the present unifying approach may prove useful in providing a fresh basis for effective, thermodynamically consistent, numerical models and theoretical treatments of irreversible conservative relaxation towards equilibrium from far nonequilibrium states. The mathematical frameworks we consider are the following: (A) statistical or information-theoretic models of relaxation; (B) small-scale and rarefied gas dynamics (i.e., kinetic models for the Boltzmann equation); (C) rational extended thermodynamics, macroscopic nonequilibrium thermodynamics, and chemical kinetics; (D) mesoscopic nonequilibrium thermodynamics, continuum mechanics with fluctuations; and (E) quantum statistical mechanics, quantum thermodynamics, mesoscopic nonequilibrium quantum thermodynamics, and intrinsic quantum thermodynamics.
A Framework for Thinking about Informal Statistical Inference
ERIC Educational Resources Information Center
Makar, Katie; Rubin, Andee
2009-01-01
Informal inferential reasoning has shown some promise in developing students' deeper understanding of statistical processes. This paper presents a framework to think about three key principles of informal inference--generalizations "beyond the data," probabilistic language, and data as evidence. The authors use primary school classroom…
NASA Astrophysics Data System (ADS)
Vallianatos, F.; Tzanis, A.; Michas, G.; Papadakis, G.
2012-04-01
Since the middle of summer 2011, an increase in the seismicity rates of the volcanic complex system of Santorini Island, Greece, has been observed. In the present work, the temporal distribution of seismicity, as well as the magnitude distribution of earthquakes, has been studied using the concept of Non-Extensive Statistical Physics (NESP; Tsallis, 2009) along with the evolution of Shannon entropy H (also called information entropy). The analysis is based on the earthquake catalogue of the Geodynamic Institute of the National Observatory of Athens for the period July 2011-January 2012 (http://www.gein.noa.gr/). Non-Extensive Statistical Physics, which is a generalization of Boltzmann-Gibbs statistical physics, seems to be a suitable framework for studying complex systems. The observed distributions of seismicity rates at Santorini can be described (fitted) exceptionally well with NESP models. This implies the inherent complexity of the Santorini volcanic seismicity, the applicability of NESP concepts to volcanic earthquake activity and the usefulness of NESP in investigating phenomena exhibiting multifractality and long-range coupling effects. Acknowledgments. This work was supported in part by the THALES Program of the Ministry of Education of Greece and the European Union in the framework of the project entitled "Integrated understanding of Seismicity, using innovative Methodologies of Fracture mechanics along with Earthquake and non extensive statistical physics - Application to the geodynamic system of the Hellenic Arc. SEISMO FEAR HELLARC". GM and GP wish to acknowledge the partial support of the Greek State Scholarships Foundation (ΙΚΥ).
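For reference, NESP analyses of this kind typically fit q-exponential (Tsallis) forms such as the one below to inter-event times; τ0 and the entropic index q are fitted parameters, and the expression reduces to an ordinary exponential (the Boltzmann-Gibbs limit) as q → 1.

```latex
% q-exponential (Tsallis) form commonly fitted to inter-event times tau in
% NESP analyses; tau_0 > 0 and q are fitted parameters, and [x]_+ = max(x, 0).
P(\tau) \propto \exp_q\!\left(-\tau/\tau_0\right),
\qquad
\exp_q(x) \equiv \left[\,1 + (1-q)\,x\,\right]_+^{1/(1-q)}
```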
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gevorkyan, A. S., E-mail: g-ashot@sci.am; Sahakyan, V. V.
We study classical 1D Heisenberg spin glasses in the framework of the nearest-neighbor model. Based on the Hamilton equations, we obtain a system of recurrence equations which allows node-by-node calculation of a spin chain. It is shown that calculation from the first principles of classical mechanics leads to an ℕℙ-hard problem, which, however, in the limit of statistical equilibrium can be solved by a ℙ (polynomial-time) algorithm. For the partition function of the ensemble, a new representation is offered in the form of a one-dimensional integral over the spin chain's energy distribution.
Controlling species richness in spin-glass model ecosystems
NASA Astrophysics Data System (ADS)
Poderoso, Fábio C.; Fontanari, José F.
2006-11-01
Within the framework of the random replicator model of ecosystems, we use equilibrium statistical mechanics tools to study the effect of manipulating the ecosystem so as to guarantee that a fixed fraction of the surviving species at equilibrium display a predefined set of characters (e.g., characters of economic value). Provided that the intraspecies competition is not too weak, we find that the consequence of such intervention on the ecosystem composition is a significant increase in the number of species that become extinct, and thus an impoverishment of the ecosystem.
Structure-Specific Statistical Mapping of White Matter Tracts
Yushkevich, Paul A.; Zhang, Hui; Simon, Tony; Gee, James C.
2008-01-01
We present a new model-based framework for the statistical analysis of diffusion imaging data associated with specific white matter tracts. The framework takes advantage of the fact that several of the major white matter tracts are thin sheet-like structures that can be effectively modeled by medial representations. The approach involves segmenting major tracts and fitting them with deformable geometric medial models. The medial representation makes it possible to average and combine tensor-based features along directions locally perpendicular to the tracts, thus reducing data dimensionality and accounting for errors in normalization. The framework enables the analysis of individual white matter structures, and provides a range of possibilities for computing statistics and visualizing differences between cohorts. The framework is demonstrated in a study of white matter differences in pediatric chromosome 22q11.2 deletion syndrome. PMID:18407524
A Survey of Statistical Models for Reverse Engineering Gene Regulatory Networks
Huang, Yufei; Tienda-Luna, Isabel M.; Wang, Yufeng
2009-01-01
Statistical models for reverse engineering gene regulatory networks are surveyed in this article. To provide readers with a system-level view of the modeling issues in this research, a graphical modeling framework is proposed. This framework serves as the scaffolding on which the review of different models can be systematically assembled. Based on the framework, we review many existing models for many aspects of gene regulation; the pros and cons of each model are discussed. In addition, network inference algorithms are also surveyed under the graphical modeling framework by the categories of point solutions and probabilistic solutions and the connections and differences among the algorithms are provided. This survey has the potential to elucidate the development and future of reverse engineering GRNs and bring statistical signal processing closer to the core of this research. PMID:20046885
Statistical Mechanics of Viral Entry
NASA Astrophysics Data System (ADS)
Zhang, Yaojun; Dudko, Olga K.
2015-01-01
Viruses that have lipid-membrane envelopes infect cells by fusing with the cell membrane to release viral genes. Membrane fusion is known to be hindered by high kinetic barriers associated with drastic structural rearrangements—yet viral infection, which occurs by fusion, proceeds on remarkably short time scales. Here, we present a quantitative framework that captures the principles behind the invasion strategy shared by all enveloped viruses. The key to this strategy—ligand-triggered conformational changes in the viral proteins that pull the membranes together—is treated as a set of concurrent, bias field-induced activated rate processes. The framework results in analytical solutions for experimentally measurable characteristics of virus-cell fusion and enables us to express the efficiency of the viral strategy in quantitative terms. The predictive value of the theory is validated through simulations and illustrated through recent experimental data on influenza virus infection.
A unifying framework for quantifying the nature of animal interactions.
Potts, Jonathan R; Mokross, Karl; Lewis, Mark A
2014-07-06
Collective phenomena, whereby agent-agent interactions determine spatial patterns, are ubiquitous in the animal kingdom. On the other hand, movement and space use are also greatly influenced by the interactions between animals and their environment. Despite both types of interaction fundamentally influencing animal behaviour, there has hitherto been no unifying framework for the models proposed in both areas. Here, we construct a general method for inferring population-level spatial patterns from underlying individual movement and interaction processes, a key ingredient in building a statistical mechanics for ecological systems. We show that resource selection functions, as well as several examples of collective motion models, arise as special cases of our framework, thus bringing together resource selection analysis and collective animal behaviour into a single theory. In particular, we focus on combining the various mechanistic models of territorial interactions in the literature with step selection functions, by incorporating interactions into the step selection framework and demonstrating how to derive territorial patterns from the resulting models. We demonstrate the efficacy of our model by application to a population of insectivore birds in the Amazon rainforest. © 2014 The Author(s) Published by the Royal Society. All rights reserved.
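A minimal sketch of the step selection function component referred to above is given below. The conditional-logistic form is standard in this literature, but the covariates, coefficients, and data here are hypothetical.

```python
# Illustrative step selection function (SSF) log-likelihood: each observed
# step is compared against a set of available candidate steps via a
# conditional-logistic form. Covariate names and values are invented.
import numpy as np

def ssf_log_likelihood(beta, steps):
    """steps: list of (x_obs, X_avail) pairs, where x_obs is the covariate
    vector of the chosen step and X_avail stacks the available alternatives
    (including the chosen one)."""
    ll = 0.0
    for x_obs, X_avail in steps:
        scores = X_avail @ beta
        ll += x_obs @ beta - np.log(np.sum(np.exp(scores)))
    return ll

# One observed step with two covariates (e.g. resource quality, foreign
# scent-mark density) and four available alternatives
steps = [(np.array([0.8, 0.1]),
          np.array([[0.8, 0.1], [0.2, 0.9], [0.5, 0.4], [0.1, 0.2]]))]
print(ssf_log_likelihood(np.array([1.0, -2.0]), steps))
```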
Quantum approach to classical statistical mechanics.
Somma, R D; Batista, C D; Ortiz, G
2007-07-20
We present a new approach to study the thermodynamic properties of d-dimensional classical systems by reducing the problem to the computation of ground state properties of a d-dimensional quantum model. This classical-to-quantum mapping allows us to extend the scope of standard optimization methods by unifying them under a general framework. The quantum annealing method is naturally extended to simulate classical systems at finite temperatures. We derive the rates to assure convergence to the optimal thermodynamic state using the adiabatic theorem of quantum mechanics. For simulated and quantum annealing, we obtain the asymptotic rates T(t) ≈ pN/(k_B log t) and γ(t) ≈ (Nt)^(-c/N) for the temperature and magnetic field, respectively. Other annealing strategies are also discussed.
Wickham, Hadley; Hofmann, Heike
2011-12-01
We propose a new framework for visualising tables of counts, proportions and probabilities. We call our framework product plots, alluding to the computation of area as a product of height and width, and the statistical concept of generating a joint distribution from the product of conditional and marginal distributions. The framework, with extensions, is sufficient to encompass over 20 visualisations previously described in fields of statistical graphics and infovis, including bar charts, mosaic plots, treemaps, equal area plots and fluctuation diagrams. © 2011 IEEE
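A small sketch of the "product" computation is shown below: rectangle widths come from the marginal of one variable and heights from the conditional of the other, so each rectangle's area equals a joint proportion. The counts are invented and the plotting step is omitted.

```python
# Sketch of the product-of-marginal-and-conditional layout behind product
# plots (mosaic-style rectangles); counts are made up.
import numpy as np

counts = np.array([[30, 10],      # rows: levels of variable A
                   [20, 40]])     # cols: levels of variable B

def product_rects(counts):
    total = counts.sum()
    widths = counts.sum(axis=1) / total          # marginal P(A)
    x = 0.0
    rects = []
    for i, w in enumerate(widths):
        heights = counts[i] / counts[i].sum()    # conditional P(B | A=i)
        y = 0.0
        for j, h in enumerate(heights):
            rects.append((i, j, x, y, w, h))     # area w*h = P(A=i, B=j)
            y += h
        x += w
    return rects

for rect in product_rects(counts):
    print(rect)
```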
Bayesian Parameter Inference and Model Selection by Population Annealing in Systems Biology
Murakami, Yohei
2014-01-01
Parameter inference and model selection are very important for mathematical modeling in systems biology. Bayesian statistics can be used to conduct both parameter inference and model selection. In particular, the framework named approximate Bayesian computation is often used for parameter inference and model selection in systems biology. However, Monte Carlo methods need to be used to compute Bayesian posterior distributions. In addition, the posterior distributions of parameters are sometimes almost uniform or very similar to their prior distributions. In such cases, it is difficult to choose one specific parameter value with high credibility as the representative value of the distribution. To overcome these problems, we introduced one of the population Monte Carlo algorithms, population annealing. Although population annealing is usually used in statistical mechanics, we showed that population annealing can be used to compute Bayesian posterior distributions in the approximate Bayesian computation framework. To deal with the non-identifiability of representative parameter values, we proposed running the simulations with the parameter ensemble sampled from the posterior distribution, named the “posterior parameter ensemble”. We showed that population annealing is an efficient and convenient algorithm for generating a posterior parameter ensemble. We also showed that simulations with the posterior parameter ensemble can not only reproduce the data used for parameter inference, but also capture and predict data that were not used for parameter inference. Lastly, we introduced the marginal likelihood in the approximate Bayesian computation framework for Bayesian model selection. We showed that population annealing enables us to compute the marginal likelihood in the approximate Bayesian computation framework and to conduct model selection based on the Bayes factor. PMID:25089832
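To make the "posterior parameter ensemble" idea concrete, the sketch below uses plain ABC rejection sampling on a toy model; population annealing replaces the single tolerance with a decreasing schedule plus resampling and reweighting, which is not shown here. The model, tolerance, and summary statistic are all invented for illustration.

```python
# Minimal ABC rejection sketch (not the paper's population annealing scheme):
# draw parameters from the prior, simulate, and keep draws whose simulated
# summary statistic is close to the observed one.
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(2.0, 1.0, size=50)           # "observed" data

def simulate(theta, n=50):
    return rng.normal(theta, 1.0, size=n)

def distance(x, y):
    return abs(x.mean() - y.mean())            # summary-statistic distance

ensemble = []
while len(ensemble) < 500:
    theta = rng.uniform(-5, 5)                 # prior draw
    if distance(simulate(theta), data) < 0.2:  # tolerance epsilon
        ensemble.append(theta)

ensemble = np.array(ensemble)                  # posterior parameter ensemble
print(ensemble.mean(), ensemble.std())
```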
Burnecki, Krzysztof; Kepten, Eldad; Janczura, Joanna; Bronshtein, Irena; Garini, Yuval; Weron, Aleksander
2012-01-01
We present a systematic statistical analysis of the recently measured individual trajectories of fluorescently labeled telomeres in the nucleus of living human cells. The experiments were performed in the U2OS cancer cell line. We propose an algorithm for identification of the telomere motion. By expanding the previously published data set, we are able to explore the dynamics in six time orders, a task not possible earlier. As a result, we establish a rigorous mathematical characterization of the stochastic process and identify the basic mathematical mechanisms behind the telomere motion. We find that the increments of the motion are stationary, Gaussian, ergodic, and even more chaotic—mixing. Moreover, the obtained memory parameter estimates, as well as the ensemble average mean square displacement reveal subdiffusive behavior at all time spans. All these findings statistically prove a fractional Brownian motion for the telomere trajectories, which is confirmed by a generalized p-variation test. Taking into account the biophysical nature of telomeres as monomers in the chromatin chain, we suggest polymer dynamics as a sufficient framework for their motion with no influence of other models. In addition, these results shed light on other studies of telomere motion and the alternative telomere lengthening mechanism. We hope that identification of these mechanisms will allow the development of a proper physical and biological model for telomere subdynamics. This array of tests can be easily implemented to other data sets to enable quick and accurate analysis of their statistical characteristics. PMID:23199912
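One of the simplest diagnostics used in analyses of this kind, the time-averaged mean square displacement and its anomalous exponent α (α < 1 indicates subdiffusion; for fractional Brownian motion α = 2H), can be sketched as follows on a synthetic track; the trajectory below is a placeholder, not telomere data.

```python
# Sketch: time-averaged MSD of a 2D trajectory and a log-log fit of the
# anomalous exponent alpha. The track is synthetic (ordinary Brownian-like).
import numpy as np

rng = np.random.default_rng(1)
traj = np.cumsum(rng.normal(size=(1000, 2)), axis=0)   # placeholder 2D track

def time_averaged_msd(traj, max_lag=100):
    lags = np.arange(1, max_lag + 1)
    msd = [np.mean(np.sum((traj[lag:] - traj[:-lag])**2, axis=1)) for lag in lags]
    return lags, np.array(msd)

lags, msd = time_averaged_msd(traj)
alpha, _ = np.polyfit(np.log(lags), np.log(msd), 1)    # log-log slope
print("estimated anomalous exponent alpha =", alpha)
```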
Molecular system identification for enzyme directed evolution and design
NASA Astrophysics Data System (ADS)
Guan, Xiangying; Chakrabarti, Raj
2017-09-01
The rational design of chemical catalysts requires methods for the measurement of free energy differences in the catalytic mechanism for any given catalyst Hamiltonian. The scope of experimental learning algorithms that can be applied to catalyst design would also be expanded by the availability of such methods. Methods for catalyst characterization typically either estimate apparent kinetic parameters that do not necessarily correspond to free energy differences in the catalytic mechanism or measure individual free energy differences that are not sufficient for establishing the relationship between the potential energy surface and catalytic activity. Moreover, in order to enhance the duty cycle of catalyst design, statistically efficient methods for the estimation of the complete set of free energy differences relevant to the catalytic activity based on high-throughput measurements are preferred. In this paper, we present a theoretical and algorithmic system identification framework for the optimal estimation of free energy differences in solution phase catalysts, with a focus on one- and two-substrate enzymes. This framework, which can be automated using programmable logic, prescribes a choice of feasible experimental measurements and manipulated input variables that identify the complete set of free energy differences relevant to the catalytic activity and minimize the uncertainty in these free energy estimates for each successive Hamiltonian design. The framework also employs decision-theoretic logic to determine when model reduction can be applied to improve the duty cycle of high-throughput catalyst design. Automation of the algorithm using fluidic control systems is proposed, and applications of the framework to the problem of enzyme design are discussed.
NASA Astrophysics Data System (ADS)
Sundberg, R.; Moberg, A.; Hind, A.
2012-08-01
A statistical framework for comparing the output of ensemble simulations from global climate models with networks of climate proxy and instrumental records has been developed, focusing on near-surface temperatures for the last millennium. This framework includes the formulation of a joint statistical model for proxy data, instrumental data and simulation data, which is used to optimize a quadratic distance measure for ranking climate model simulations. An essential underlying assumption is that the simulations and the proxy/instrumental series have a shared component of variability that is due to temporal changes in external forcing, such as volcanic aerosol load, solar irradiance or greenhouse gas concentrations. Two statistical tests have been formulated. Firstly, a preliminary test establishes whether a significant temporal correlation exists between instrumental/proxy and simulation data. Secondly, the distance measure is expressed in the form of a test statistic of whether a forced simulation is closer to the instrumental/proxy series than unforced simulations. The proposed framework allows any number of proxy locations to be used jointly, with different seasons, record lengths and statistical precision. The goal is to objectively rank several competing climate model simulations (e.g. with alternative model parameterizations or alternative forcing histories) by means of their goodness of fit to the unobservable true past climate variations, as estimated from noisy proxy data and instrumental observations.
Improved analyses using function datasets and statistical modeling
John S. Hogland; Nathaniel M. Anderson
2014-01-01
Raster modeling is an integral component of spatial analysis. However, conventional raster modeling techniques can require a substantial amount of processing time and storage space and have limited statistical functionality and machine learning algorithms. To address this issue, we developed a new modeling framework using C# and ArcObjects and integrated that framework...
How Does Teacher Knowledge in Statistics Impact on Teacher Listening?
ERIC Educational Resources Information Center
Burgess, Tim
2012-01-01
For teaching statistics investigations at primary school level, teacher knowledge has been identified using a framework developed from a classroom based study. Through development of the framework, three types of teacher listening problems were identified, each of which had potential impact on the students' learning. The three types of problems…
Noecker, Cecilia; Eng, Alexander; Srinivasan, Sujatha; Theriot, Casey M; Young, Vincent B; Jansson, Janet K; Fredricks, David N; Borenstein, Elhanan
2016-01-01
Multiple molecular assays now enable high-throughput profiling of the ecology, metabolic capacity, and activity of the human microbiome. However, to date, analyses of such multi-omic data typically focus on statistical associations, often ignoring extensive prior knowledge of the mechanisms linking these various facets of the microbiome. Here, we introduce a comprehensive framework to systematically link variation in metabolomic data with community composition by utilizing taxonomic, genomic, and metabolic information. Specifically, we integrate available and inferred genomic data, metabolic network modeling, and a method for predicting community-wide metabolite turnover to estimate the biosynthetic and degradation potential of a given community. Our framework then compares variation in predicted metabolic potential with variation in measured metabolites' abundances to evaluate whether community composition can explain observed shifts in the community metabolome, and to identify key taxa and genes contributing to the shifts. Focusing on two independent vaginal microbiome data sets, each pairing 16S community profiling with large-scale metabolomics, we demonstrate that our framework successfully recapitulates observed variation in 37% of metabolites. Well-predicted metabolite variation tends to result from disease-associated metabolism. We further identify several disease-enriched species that contribute significantly to these predictions. Interestingly, our analysis also detects metabolites for which the predicted variation negatively correlates with the measured variation, suggesting environmental control points of community metabolism. Applying this framework to gut microbiome data sets reveals similar trends, including prediction of bile acid metabolite shifts. This framework is an important first step toward a system-level multi-omic integration and an improved mechanistic understanding of the microbiome activity and dynamics in health and disease. Studies characterizing both the taxonomic composition and metabolic profile of various microbial communities are becoming increasingly common, yet new computational methods are needed to integrate and interpret these data in terms of known biological mechanisms. Here, we introduce an analytical framework to link species composition and metabolite measurements, using a simple model to predict the effects of community ecology on metabolite concentrations and evaluating whether these predictions agree with measured metabolomic profiles. We find that a surprisingly large proportion of metabolite variation in the vaginal microbiome can be predicted based on species composition (including dramatic shifts associated with disease), identify putative mechanisms underlying these predictions, and evaluate the roles of individual bacterial species and genes. Analysis of gut microbiome data using this framework recovers similar community metabolic trends. This framework lays the foundation for model-based multi-omic integrative studies, ultimately improving our understanding of microbial community metabolism.
General Aviation Avionics Statistics : 1975
DOT National Transportation Integrated Search
1978-06-01
This report presents avionics statistics for the 1975 general aviation (GA) aircraft fleet and updates a previous publication, General Aviation Avionics Statistics: 1974. The statistics are presented in a capability group framework which enables one ...
Synaptic Transmission Optimization Predicts Expression Loci of Long-Term Plasticity.
Costa, Rui Ponte; Padamsey, Zahid; D'Amour, James A; Emptage, Nigel J; Froemke, Robert C; Vogels, Tim P
2017-09-27
Long-term modifications of neuronal connections are critical for reliable memory storage in the brain. However, their locus of expression (pre- or postsynaptic) is highly variable. Here we introduce a theoretical framework in which long-term plasticity performs an optimization of the postsynaptic response statistics toward a given mean with minimal variance. Consequently, the state of the synapse at the time of plasticity induction determines the ratio of pre- and postsynaptic modifications. Our theory explains the experimentally observed expression loci of the hippocampal and neocortical synaptic potentiation studies we examined. Moreover, the theory predicts presynaptic expression of long-term depression, consistent with experimental observations. At inhibitory synapses, the theory suggests a statistically efficient excitatory-inhibitory balance in which changes in inhibitory postsynaptic response statistics specifically target the mean excitation. Our results provide a unifying theory for understanding the expression mechanisms and functions of long-term synaptic transmission plasticity. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
Origins and properties of kappa distributions in space plasmas
NASA Astrophysics Data System (ADS)
Livadiotis, George
2016-07-01
Classical particle systems reside at thermal equilibrium with their velocity distribution function stabilized into a Maxwell distribution. On the contrary, collisionless and correlated particle systems, such as the space and astrophysical plasmas, are characterized by a non-Maxwellian behavior, typically described by the so-called kappa distributions. Empirical kappa distributions have become increasingly widespread across space and plasma physics. However, a breakthrough in the field came with the connection of kappa distributions to the solid statistical framework of Tsallis non-extensive statistical mechanics. Understanding the statistical origin of kappa distributions was the cornerstone of further theoretical developments and applications, some of which will be presented in this talk: (i) The physical meaning of thermal parameters, e.g., temperature and kappa index; (ii) the multi-particle description of kappa distributions; (iii) the phase-space kappa distribution of a Hamiltonian with non-zero potential; (iv) the Sackur-Tetrode entropy for kappa distributions, and (v) the new quantization constant, h_* ≈ 10^{-22} J s.
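For reference, the kappa distribution referred to above is commonly written in the form below (one common convention; the normalization and the exact relation between the thermal speed θ and the temperature T depend on the convention chosen, so this is an orientation rather than the talk's own definition):

```latex
% Isotropic kappa velocity distribution (normalization omitted; conventions for \theta vary)
f(\mathbf{v}) \;\propto\; \left[\, 1 + \frac{(\mathbf{v}-\mathbf{u}_b)^2}{\kappa\,\theta^{2}} \,\right]^{-(\kappa+1)},
\qquad
f(\mathbf{v}) \;\xrightarrow[\kappa \to \infty]{}\; \exp\!\left[-\frac{(\mathbf{v}-\mathbf{u}_b)^2}{\theta^{2}}\right],
% i.e., the Maxwellian is recovered in the limit of infinite kappa index.
```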
Superstatistical Energy Distributions of an Ion in an Ultracold Buffer Gas
NASA Astrophysics Data System (ADS)
Rouse, I.; Willitsch, S.
2017-04-01
An ion in a radio frequency ion trap interacting with a buffer gas of ultracold neutral atoms is a driven dynamical system which has been found to develop a nonthermal energy distribution with a power law tail. The exact analytical form of this distribution is unknown, but has often been represented empirically by q-exponential (Tsallis) functions. Based on the concepts of superstatistics, we introduce a framework for the statistical mechanics of an ion trapped in an rf field subject to collisions with a buffer gas. We derive analytic ion secular energy distributions from first principles both neglecting and including the effects of the thermal energy of the buffer gas. For a buffer gas with a finite temperature, we prove that Tsallis statistics emerges from the combination of a constant heating term and multiplicative energy fluctuations. We show that the resulting distributions essentially depend on experimentally controllable parameters, paving the way for an accurate control of the statistical properties of ion-atom hybrid systems.
Calha, Nuno; Messias, Ana; Guerra, Fernando; Martinho, Beatriz; Neto, Maria Augusta; Nicolau, Pedro
2017-04-01
To evaluate the effect of geometry on the displacement and the strain distribution of anterior implant-supported zirconia frameworks under static load using the 3D digital image correlation method. Two groups (n=5) of 4-unit zirconia frameworks were produced by CAD/CAM for the implant-abutment assembly. Group 1 comprised five straight configuration frameworks and group 2 consisted of five curved configuration frameworks. Specimens were cemented and submitted to static load up to 200N. Displacements were captured with two high-speed photographic cameras and analyzed with a video correlation system in three spatial axes U, V, W. Statistical analysis was performed using the nonparametric Mann-Whitney test. Up to 150N loads, the vertical displacements (V axis) were statistically higher for curved frameworks (-267.83±23.76μm), when compared to the straight frameworks (-120.73±36.17μm) (p=0.008), as well as anterior displacements in the W transformed axis (589.55±64.51μm vs 224.29±50.38μm for the curved and straight frameworks, respectively) (p=0.008). The mean von Mises strains over the framework surfaces were statistically higher for the curved frameworks under any load. Within the limitations of this in vitro study, it is possible to conclude that the geometric configuration influences the deformation of 4-unit anterior frameworks under static load. The higher strain distribution and micro-movements of the curved frameworks reflect less rigidity and increased risk of fractures associated with FPDs. Copyright © 2016 Japan Prosthodontic Society. Published by Elsevier Ltd. All rights reserved.
Introduction to the topical issue: Nonadditive entropy and nonextensive statistical mechanics
NASA Astrophysics Data System (ADS)
Sugiyama, Masaru
Dear CMT readers, it is my pleasure to introduce you to this topical issue dealing with a new research field of great interest, nonextensive statistical mechanics. This theory was initiated by Constantino Tsallis' work in 1988, as a possible generalization of Boltzmann-Gibbs thermostatistics. It is based on a nonadditive entropy, nowadays referred to as the Tsallis entropy. Nonextensive statistical mechanics is expected to be a consistent and unified theoretical framework for describing the macroscopic properties of complex systems that are anomalous in view of ordinary thermostatistics. In such systems, the long-standing problem regarding the relationship between statistical and dynamical laws becomes highlighted, since ergodicity and mixing may not be well realized in situations such as the edge of chaos. The phase space appears to self-organize in a structure that is not simply Euclidean but (multi)fractal. Due to this nontrivial structure, the concept of homogeneity of the system, which is the basic premise in ordinary thermodynamics, is violated and accordingly the additivity postulate for the thermodynamic quantities such as the internal energy and entropy may not be justified, in general. (Physically, nonadditivity is deeply relevant to nonextensivity of a system, in which the thermodynamic quantities do not scale with size in a simple way. Typical examples are systems with long-range interactions like self-gravitating systems as well as nonneutral charged ones.) A point of crucial importance here is that, phenomenologically, such an exotic phase-space structure has a fairly long lifetime. Therefore, this state, referred to as a metaequilibrium state or a nonequilibrium stationary state, appears to be described by a generalized entropic principle different from the traditional Boltzmann-Gibbs form, even though it may eventually approach the Boltzmann-Gibbs equilibrium state. The limits t → ∞ and N → ∞ do not commute, where t and N are time and the number of particles, respectively. The present topical issue is devoted to summarizing the current status of nonextensive statistical mechanics from various perspectives. It is my hope that this issue can inform the reader of one of the foremost research areas in thermostatistics. This issue consists of eight articles. The first one by Tsallis and Brigatti presents a general introduction and an overview of nonextensive statistical mechanics. At first glance, generalization of the ordinary Boltzmann-Gibbs-Shannon entropy might seem completely arbitrary. But Abe's article explains how Tsallis' generalization of the statistical entropy can uniquely be characterized by both physical and mathematical principles. Then, the article by Pluchino, Latora, and Rapisarda presents strong evidence that nonextensive statistical mechanics is in fact relevant to nonextensive systems with long-range interactions. The articles by Rajagopal, by Wada, and by Plastino, Miller, and Plastino are concerned with the macroscopic thermodynamic properties of nonextensive statistical mechanics. Rajagopal discusses the first and second laws of thermodynamics. Wada develops a discussion about the condition under which the nonextensive statistical-mechanical formalism is thermodynamically stable. The work of Plastino, Miller, and Plastino addresses the thermodynamic Legendre-transform structure and its robustness for generalizations of entropy. After these fundamental investigations, Sakagami and Taruya examine the theory for self-gravitating systems.
Finally, Beck presents a novel idea of the so-called superstatistics, which provides nonextensive statistical mechanics with a physical interpretation based on nonequilibrium concepts including temperature fluctuations. Its applications to hydrodynamic turbulence and pattern formation in thermal convection states are also discussed. Nonextensive statistical mechanics is already a well-studied field, and a number of works are available in the literature. It is recommended that the interested reader visit the URL http://tsallis.cat.cbpf.br/TEMUCO.pdf. There, one can find a comprehensive list of references to more than one thousand papers including important results that, due to lack of space, have not been mentioned in the present issue. Though there are so many published works, nonextensive statistical mechanics is still a developing field. This can naturally be understood, since the program that has been undertaken is an extremely ambitious one that makes a serious attempt to enlarge the horizons of the realm of statistical mechanics. The possible influence of nonextensive statistical mechanics on continuum mechanics and thermodynamics seems to be wide and deep. I will therefore be happy if this issue contributes to attracting the interest of researchers and stimulates research activities not only in the field of nonextensive statistical mechanics itself but also in the field of continuum mechanics and thermodynamics in a wider context. As the editor of the present topical issue, I would like to express my sincere thanks to all those who contributed to this issue. I cordially thank Professor S. Abe for advising me on the editorial policy. Without his help, the present topical issue would never have been brought out.
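For orientation, the nonadditive entropy at the center of this issue is the standard one-parameter Tsallis generalization of the Boltzmann-Gibbs-Shannon entropy (textbook form, included here for reference rather than quoted from any single article in the issue):

```latex
% Tsallis entropy for a discrete distribution {p_i}; k is Boltzmann's constant
S_q = k\,\frac{1 - \sum_i p_i^{\,q}}{q - 1},
\qquad
\lim_{q \to 1} S_q = -k \sum_i p_i \ln p_i ,
% Nonadditivity for independent subsystems A and B:
% S_q(A{+}B) = S_q(A) + S_q(B) + \frac{(1-q)}{k}\, S_q(A)\, S_q(B).
```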
Visual aftereffects and sensory nonlinearities from a single statistical framework
Laparra, Valero; Malo, Jesús
2015-01-01
When adapted to a particular scenery our senses may fool us: colors are misinterpreted, certain spatial patterns seem to fade out, and static objects appear to move in reverse. A mere empirical description of the mechanisms tuned to color, texture, and motion may tell us where these visual illusions come from. However, such empirical models of gain control do not explain why these mechanisms work in this apparently dysfunctional manner. Current normative explanations of aftereffects based on scene statistics derive gain changes by (1) invoking decorrelation and linear manifold matching/equalization, or (2) using nonlinear divisive normalization obtained from parametric scene models. These principled approaches have different drawbacks: the first is not compatible with the known saturation nonlinearities in the sensors and it cannot fully accomplish information maximization due to its linear nature. In the second, gain change is almost determined a priori by the assumed parametric image model linked to divisive normalization. In this study we show that both the response changes that lead to aftereffects and the nonlinear behavior can be simultaneously derived from a single statistical framework: the Sequential Principal Curves Analysis (SPCA). As opposed to mechanistic models, SPCA is not intended to describe how physiological sensors work, but it is focused on explaining why they behave as they do. Nonparametric SPCA has two key advantages as a normative model of adaptation: (i) it is better than linear techniques as it is a flexible equalization that can be tuned for more sensible criteria other than plain decorrelation (either full information maximization or error minimization); and (ii) it makes no a priori functional assumption regarding the nonlinearity, so the saturations emerge directly from the scene data and the goal (and not from the assumed function). It turns out that the optimal responses derived from these more sensible criteria and SPCA are consistent with dysfunctional behaviors such as aftereffects. PMID:26528165
Combining statistical inference and decisions in ecology
Williams, Perry J.; Hooten, Mevin B.
2016-01-01
Statistical decision theory (SDT) is a sub-field of decision theory that formally incorporates statistical investigation into a decision-theoretic framework to account for uncertainties in a decision problem. SDT provides a unifying analysis of three types of information: statistical results from a data set, knowledge of the consequences of potential choices (i.e., loss), and prior beliefs about a system. SDT links the theoretical development of a large body of statistical methods including point estimation, hypothesis testing, and confidence interval estimation. The theory and application of SDT have mainly been developed and published in the fields of mathematics, statistics, operations research, and other decision sciences, but have had limited exposure in ecology. Thus, we provide an introduction to SDT for ecologists and describe its utility for linking the conventionally separate tasks of statistical investigation and decision making in a single framework. We describe the basic framework of both Bayesian and frequentist SDT, its traditional use in statistics, and discuss its application to decision problems that occur in ecology. We demonstrate SDT with two types of decisions: Bayesian point estimation, and an applied management problem of selecting a prescribed fire rotation for managing a grassland bird species. Central to SDT, and decision theory in general, are loss functions. Thus, we also provide basic guidance and references for constructing loss functions for an SDT problem.
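A minimal sketch of the central SDT idea, choosing a Bayesian point estimate by minimizing posterior expected loss, is shown below (toy posterior draws and illustrative names, not the paper's example):

```python
import numpy as np

# Toy sketch: Bayesian point estimation as a decision problem. Given posterior
# draws for a parameter, the optimal point estimate minimizes posterior
# expected loss; the loss function determines which estimator is optimal.
rng = np.random.default_rng(1)
posterior_draws = rng.gamma(shape=3.0, scale=2.0, size=10_000)  # stand-in posterior sample

candidates = np.linspace(0.0, 20.0, 401)

def expected_loss(loss_fn):
    """Posterior expected loss of each candidate estimate."""
    return np.array([loss_fn(posterior_draws, a).mean() for a in candidates])

squared_loss  = lambda theta, a: (theta - a) ** 2   # minimized by the posterior mean
absolute_loss = lambda theta, a: np.abs(theta - a)  # minimized by the posterior median

print("argmin squared loss :", candidates[expected_loss(squared_loss).argmin()])
print("posterior mean      :", posterior_draws.mean())
print("argmin absolute loss:", candidates[expected_loss(absolute_loss).argmin()])
print("posterior median    :", np.median(posterior_draws))
```

The same machinery extends to applied decisions such as the prescribed-fire example: the loss then encodes management consequences rather than estimation error.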
(Finite) statistical size effects on compressive strength.
Weiss, Jérôme; Girard, Lucas; Gimbert, Florent; Amitrano, David; Vandembroucq, Damien
2014-04-29
The larger structures are, the lower their mechanical strength. Already discussed by Leonardo da Vinci and Edmé Mariotte several centuries ago, size effects on strength remain of crucial importance in modern engineering for the elaboration of safety regulations in structural design or the extrapolation of laboratory results to geophysical field scales. Under tensile loading, statistical size effects are traditionally modeled with a weakest-link approach. One of its prominent results is a prediction of vanishing strength at large scales that can be quantified in the framework of extreme value statistics. Despite a frequent use outside its range of validity, this approach remains the dominant tool in the field of statistical size effects. Here we focus on compressive failure, which concerns a wide range of geophysical and geotechnical situations. We show on historical and recent experimental data that weakest-link predictions are not obeyed. In particular, the mechanical strength saturates at a nonzero value toward large scales. Accounting explicitly for the elastic interactions between defects during the damage process, we build a formal analogy of compressive failure with the depinning transition of an elastic manifold. This critical transition interpretation naturally entails finite-size scaling laws for the mean strength and its associated variability. Theoretical predictions are in remarkable agreement with measurements reported for various materials such as rocks, ice, coal, or concrete. This formalism, which can also be extended to the flowing instability of granular media under multiaxial compression, has important practical consequences for future design rules.
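For context, the weakest-link (Weibull) prediction that the abstract argues against can be stated compactly; this is the standard textbook form of the argument, not the authors' depinning result:

```latex
% Weakest-link (Weibull) statistics under tension: failure probability of a volume V
P_f(\sigma, V) = 1 - \exp\!\left[ -\frac{V}{V_0} \left( \frac{\sigma}{\sigma_0} \right)^{m} \right],
\qquad
\langle \sigma_f \rangle \propto V^{-1/m} \xrightarrow[V \to \infty]{} 0 ,
% i.e., mean strength vanishes at large scales; the paper shows compressive data do not follow this.
```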
Diagnosis and Prognosis of Weapon Systems
NASA Technical Reports Server (NTRS)
Nolan, Mary; Catania, Rebecca; deMare, Gregory
2005-01-01
The Prognostics Framework is a set of software tools with an open architecture that affords a capability to integrate various prognostic software mechanisms and to provide information for operational and battlefield decision-making and logistical planning pertaining to weapon systems. The Prognostics Framework is also a system-level health-management software system that (1) receives data from performance-monitoring and built-in-test sensors and from other prognostic software and (2) processes the received data to derive a diagnosis and a prognosis for a weapon system. This software relates the diagnostic and prognostic information to the overall health of the system, to the ability of the system to perform specific missions, and to needed maintenance actions and maintenance resources. In the development of the Prognostics Framework, effort was focused primarily on extending previously developed model-based diagnostic-reasoning software to add prognostic reasoning capabilities, including capabilities to perform statistical analyses and to utilize information pertaining to deterioration of parts, failure modes, time sensitivity of measured values, mission criticality, historical data, and trends in measurement data. As thus extended, the software offers an overall health-monitoring capability.
2014-01-01
Background Striking a balance between the degree of model complexity and parameter identifiability, while still producing biologically feasible simulations using modelling is a major challenge in computational biology. While these two elements of model development are closely coupled, parameter fitting from measured data and analysis of model mechanisms have traditionally been performed separately and sequentially. This process produces potential mismatches between model and data complexities that can compromise the ability of computational frameworks to reveal mechanistic insights or predict new behaviour. In this study we address this issue by presenting a generic framework for combined model parameterisation, comparison of model alternatives and analysis of model mechanisms. Results The presented methodology is based on a combination of multivariate metamodelling (statistical approximation of the input–output relationships of deterministic models) and a systematic zooming into biologically feasible regions of the parameter space by iterative generation of new experimental designs and look-up of simulations in the proximity of the measured data. The parameter fitting pipeline includes an implicit sensitivity analysis and analysis of parameter identifiability, making it suitable for testing hypotheses for model reduction. Using this approach, under-constrained model parameters, as well as the coupling between parameters within the model are identified. The methodology is demonstrated by refitting the parameters of a published model of cardiac cellular mechanics using a combination of measured data and synthetic data from an alternative model of the same system. Using this approach, reduced models with simplified expressions for the tropomyosin/crossbridge kinetics were found by identification of model components that can be omitted without affecting the fit to the parameterising data. Our analysis revealed that model parameters could be constrained to a standard deviation of on average 15% of the mean values over the succeeding parameter sets. Conclusions Our results indicate that the presented approach is effective for comparing model alternatives and reducing models to the minimum complexity replicating measured data. We therefore believe that this approach has significant potential for reparameterising existing frameworks, for identification of redundant model components of large biophysical models and to increase their predictive capacity. PMID:24886522
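A toy sketch of the metamodelling idea follows (illustrative only, with a stand-in "simulator" and made-up tolerances; the authors' pipeline uses multivariate metamodels and iteratively refined experimental designs):

```python
import numpy as np
from sklearn.preprocessing import PolynomialFeatures
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline

# Approximate an expensive deterministic simulator with a cheap statistical
# surrogate, then keep only parameter sets whose surrogate output lies close
# to the measured data (a crude version of "zooming" into feasible regions).
def simulator(params):                      # stand-in for an expensive model
    a, b = params[:, 0], params[:, 1]
    return a * np.exp(-b) + 0.5 * a * b

rng = np.random.default_rng(0)
design  = rng.uniform(0, 2, size=(200, 2))  # experimental design in parameter space
outputs = simulator(design)

surrogate = make_pipeline(PolynomialFeatures(degree=2), LinearRegression())
surrogate.fit(design, outputs)

measured   = 1.3                            # hypothetical measured quantity
candidates = rng.uniform(0, 2, size=(10_000, 2))
preds      = surrogate.predict(candidates)
feasible   = candidates[np.abs(preds - measured) < 0.05]
print("feasible fraction of parameter space:", len(feasible) / len(candidates))
```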
A generalized statistical model for the size distribution of wealth
NASA Astrophysics Data System (ADS)
Clementi, F.; Gallegati, M.; Kaniadakis, G.
2012-12-01
In a recent paper in this journal (Clementi et al 2009 J. Stat. Mech. P02037), we proposed a new, physically motivated, distribution function for modeling individual incomes, having its roots in the framework of the κ-generalized statistical mechanics. The performance of the κ-generalized distribution was checked against real data on personal income for the United States in 2003. In this paper we extend our previous model so as to be able to account for the distribution of wealth. Probabilistic functions and inequality measures of this generalized model for wealth distribution are obtained in closed form. In order to check the validity of the proposed model, we analyze the US household wealth distributions from 1984 to 2009 and find excellent agreement with the data, superior to that of any other model already known in the literature.
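For orientation, the Kaniadakis κ-exponential underlying the κ-generalized family has the standard form below; the survival function shown is the generic shape used in this family of models and may differ in detail from the paper's exact expression:

```latex
% Kaniadakis \kappa-exponential and a generic \kappa-generalized survival function
\exp_{\kappa}(x) = \left( \sqrt{1 + \kappa^{2} x^{2}} + \kappa x \right)^{1/\kappa},
\qquad
P(X > x) = \exp_{\kappa}\!\left( -\beta x^{\alpha} \right),
% which reduces to the ordinary stretched exponential \exp(-\beta x^{\alpha}) as \kappa \to 0.
```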
Study of pre-seismic kHz EM emissions by means of complex systems
NASA Astrophysics Data System (ADS)
Balasis, Georgios; Papadimitriou, Constantinos; Eftaxias, Konstantinos
2010-05-01
The field of study of complex systems holds that the dynamics of complex systems are founded on universal principles that may be used to describe disparate problems ranging from particle physics to economies of societies. A corollary is that transferring ideas and results from investigators in hitherto disparate areas will cross-fertilize and lead to important new results. It is well-known that the Boltzmann-Gibbs statistical mechanics works best in dealing with systems composed of subsystems that are either independent or interact via short-range forces, and whose subsystems can access all the available phase space. For systems exhibiting long-range correlations, memory, or fractal properties, non-extensive Tsallis statistical mechanics becomes the most appropriate mathematical framework. A central property of the magnetic storm, solar flare, and earthquake preparation process is the possible occurrence of coherent large-scale collective behavior with a very rich structure, resulting from the repeated nonlinear interactions among its constituents. Consequently, non-extensive statistical mechanics is an appropriate framework to investigate universality, if any, in magnetic storm, solar flare, earthquake and pre-failure EM emission occurrence. A model for earthquake dynamics coming from a non-extensive Tsallis formulation, starting from first principles, has been recently introduced. This approach leads to a Gutenberg-Richter type law for the magnitude distribution of earthquakes which provides an excellent fit to seismicities generated in various large geographic areas usually identified as "seismic regions". We examine whether the Gutenberg-Richter law corresponding to a non-extensive Tsallis statistics is able to describe the distribution of amplitude of earthquakes, pre-seismic kHz EM emissions (electromagnetic earthquakes), solar flares, and magnetic storms. The analysis shows that the introduced non-extensive model provides an excellent fit to the experimental data, incorporating the characteristics of universality by means of non-extensive statistics into the extreme events under study.
ERIC Educational Resources Information Center
Martin, James L.
This paper reports on attempts by the author to construct a theoretical framework of adult education participation using a theory development process and the corresponding multivariate statistical techniques. Two problems are identified: the lack of theoretical framework in studying problems, and the limiting of statistical analysis to univariate…
Teaching Introductory Business Statistics Using the DCOVA Framework
ERIC Educational Resources Information Center
Levine, David M.; Stephan, David F.
2011-01-01
Introductory business statistics students often receive little guidance on how to apply the methods they learn to further business objectives they may one day face. And those students may fail to see the continuity among the topics taught in an introductory course if they learn those methods outside a context that provides a unifying framework.…
Computational Approaches to the Chemical Equilibrium Constant in Protein-ligand Binding.
Montalvo-Acosta, Joel José; Cecchini, Marco
2016-12-01
The physiological role played by protein-ligand recognition has motivated the development of several computational approaches to the ligand binding affinity. Some of them, termed rigorous, have a strong theoretical foundation but involve too much computation to be generally useful. Some others alleviate the computational burden by introducing strong approximations and/or empirical calibrations, which also limit their general use. Most importantly, there is no straightforward correlation between the predictive power and the level of approximation introduced. Here, we present a general framework for the quantitative interpretation of protein-ligand binding based on statistical mechanics. Within this framework, we re-derive self-consistently the fundamental equations of some popular approaches to the binding constant and pinpoint the inherent approximations. Our analysis represents a first step towards the development of variants with optimum accuracy/efficiency ratio for each stage of the drug discovery pipeline. © 2016 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
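The thermodynamic anchor common to all such approaches is the standard relation between the association constant and the standard-state binding free energy (textbook form, included for reference):

```latex
% Standard-state binding free energy from the association constant K_b (C^\circ = 1 M)
\Delta G^{\circ}_{\mathrm{bind}} = -k_{B} T \,\ln\!\left( K_{b}\, C^{\circ} \right),
\qquad
K_{b} = \frac{[PL]}{[P]\,[L]} ,
% rigorous methods estimate \Delta G^{\circ}_{\mathrm{bind}} from configurational
% integrals over the bound and free states; approximate methods shortcut this step.
```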
The importance of stress percolation patterns in rocks and other polycrystalline materials.
Burnley, P C
2013-01-01
A new framework for thinking about the deformation behavior of rocks and other heterogeneous polycrystalline materials is proposed, based on understanding the patterns of stress transmission through these materials. Here, using finite element models, I show that stress percolates through polycrystalline materials that have heterogeneous elastic and plastic properties of the same order as those found in rocks. The pattern of stress percolation is related to the degree of heterogeneity in and statistical distribution of the elastic and plastic properties of the constituent grains in the aggregate. The development of these stress patterns leads directly to shear localization, and their existence provides insight into the formation of rhythmic features such as compositional banding and foliation in rocks that are reacting or dissolving while being deformed. In addition, this framework provides a foundation for understanding and predicting the macroscopic rheology of polycrystalline materials based on single-crystal elastic and plastic mechanical properties.
The importance of stress percolation patterns in rocks and other polycrystalline materials
Burnley, P.C.
2013-01-01
A new framework for thinking about the deformation behavior of rocks and other heterogeneous polycrystalline materials is proposed, based on understanding the patterns of stress transmission through these materials. Here, using finite element models, I show that stress percolates through polycrystalline materials that have heterogeneous elastic and plastic properties of the same order as those found in rocks. The pattern of stress percolation is related to the degree of heterogeneity in and statistical distribution of the elastic and plastic properties of the constituent grains in the aggregate. The development of these stress patterns leads directly to shear localization, and their existence provides insight into the formation of rhythmic features such as compositional banding and foliation in rocks that are reacting or dissolving while being deformed. In addition, this framework provides a foundation for understanding and predicting the macroscopic rheology of polycrystalline materials based on single-crystal elastic and plastic mechanical properties. PMID:23823992
Faunus: An object oriented framework for molecular simulation
Lund, Mikael; Trulsson, Martin; Persson, Björn
2008-01-01
Background We present a C++ class library for Monte Carlo simulation of molecular systems, including proteins in solution. The design is generic and highly modular, enabling multiple developers to easily implement additional features. The statistical mechanical methods are documented by extensive use of code comments that – subsequently – are collected to automatically build a web-based manual. Results We show how an object oriented design can be used to create an intuitively appealing coding framework for molecular simulation. This is exemplified in a minimalistic C++ program that can calculate protein protonation states. We further discuss performance issues related to high level coding abstraction. Conclusion C++ and the Standard Template Library (STL) provide a high-performance platform for generic molecular modeling. Automatic generation of code documentation from inline comments has proven particularly useful in that no separate manual needs to be maintained. PMID:18241331
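For readers unfamiliar with the underlying statistical mechanical method, a generic Metropolis acceptance step looks like the following (plain Python for illustration only; Faunus itself is a C++ library and this is not its API):

```python
import math, random

# Generic Metropolis criterion: accept a trial move with probability min(1, exp(-beta*dU)).
def metropolis_step(energy, state, propose, beta=1.0):
    trial = propose(state)
    dU = energy(trial) - energy(state)
    if dU <= 0 or random.random() < math.exp(-beta * dU):
        return trial
    return state

# Toy example: particle in a harmonic well, U(x) = x^2
energy  = lambda x: x * x
propose = lambda x: x + random.uniform(-0.5, 0.5)

x, samples = 2.0, []
for _ in range(20_000):
    x = metropolis_step(energy, x, propose, beta=1.0)
    samples.append(x)
# After equilibration, <U> should approach kT/2 = 0.5 for beta = 1
print("mean U =", sum(s * s for s in samples[5_000:]) / len(samples[5_000:]))
```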
General Aviation Avionics Statistics : 1976
DOT National Transportation Integrated Search
1979-11-01
This report presents avionics statistics for the 1976 general aviation (GA) aircraft fleet and is the third in a series titled "General Aviation Avionics Statistics." The statistics are presented in a capability group framework which enables one to r...
General Aviation Avionics Statistics : 1978 Data
DOT National Transportation Integrated Search
1980-12-01
The report presents avionics statistics for the 1978 general aviation (GA) aircraft fleet and is the fifth in a series titled "General Aviation Avionics Statistics." The statistics are presented in a capability group framework which enables one to relate airb...
General Aviation Avionics Statistics : 1979 Data
DOT National Transportation Integrated Search
1981-04-01
This report presents avionics statistics for the 1979 general aviation (GA) aircraft fleet and is the sixth in a series titled General Aviation Avionics Statistics. The statistics are presented in a capability group framework which enables one to relate...
A Stochastic Framework for Modeling the Population Dynamics of Convective Clouds
Hagos, Samson; Feng, Zhe; Plant, Robert S.; ...
2018-02-20
A stochastic prognostic framework for modeling the population dynamics of convective clouds and representing them in climate models is proposed. The framework follows the nonequilibrium statistical mechanical approach to constructing a master equation for representing the evolution of the number of convective cells of a specific size and their associated cloud-base mass flux, given a large-scale forcing. In this framework, referred to as STOchastic framework for Modeling Population dynamics of convective clouds (STOMP), the evolution of convective cell size is predicted from three key characteristics of convective cells: (i) the probability of growth, (ii) the probability of decay, and (iii) the cloud-base mass flux. STOMP models are constructed and evaluated against CPOL radar observations at Darwin and convection permitting model (CPM) simulations. Multiple models are constructed under various assumptions regarding these three key parameters and the realism of these models is evaluated. It is shown that in a model where convective plumes prefer to aggregate spatially and the cloud-base mass flux is a nonlinear function of convective cell area, the mass flux manifests a recharge-discharge behavior under steady forcing. Such a model also produces observed behavior of convective cell populations and CPM simulated cloud-base mass flux variability under diurnally varying forcing. Finally, in addition to its use in developing understanding of convection processes and the controls on convective cell size distributions, this modeling framework is also designed to serve as a nonequilibrium closure formulation for spectral mass flux parameterizations.
A Stochastic Framework for Modeling the Population Dynamics of Convective Clouds
NASA Astrophysics Data System (ADS)
Hagos, Samson; Feng, Zhe; Plant, Robert S.; Houze, Robert A.; Xiao, Heng
2018-02-01
A stochastic prognostic framework for modeling the population dynamics of convective clouds and representing them in climate models is proposed. The framework follows the nonequilibrium statistical mechanical approach to constructing a master equation for representing the evolution of the number of convective cells of a specific size and their associated cloud-base mass flux, given a large-scale forcing. In this framework, referred to as STOchastic framework for Modeling Population dynamics of convective clouds (STOMP), the evolution of convective cell size is predicted from three key characteristics of convective cells: (i) the probability of growth, (ii) the probability of decay, and (iii) the cloud-base mass flux. STOMP models are constructed and evaluated against CPOL radar observations at Darwin and convection permitting model (CPM) simulations. Multiple models are constructed under various assumptions regarding these three key parameters and the realism of these models is evaluated. It is shown that in a model where convective plumes prefer to aggregate spatially and the cloud-base mass flux is a nonlinear function of convective cell area, the mass flux manifests a recharge-discharge behavior under steady forcing. Such a model also produces observed behavior of convective cell populations and CPM simulated cloud-base mass flux variability under diurnally varying forcing. In addition to its use in developing understanding of convection processes and the controls on convective cell size distributions, this modeling framework is also designed to serve as a nonequilibrium closure formulation for spectral mass flux parameterizations.
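A generic birth-death master equation of the kind referred to above is shown below for orientation (illustrative form only; the STOMP equations couple cell size and cloud-base mass flux and are given in the paper):

```latex
% Birth-death master equation for the probability P(n,t) of finding n convective cells
\frac{\partial P(n,t)}{\partial t}
 = \lambda_{n-1}\, P(n-1,t) + \mu_{n+1}\, P(n+1,t) - \left( \lambda_{n} + \mu_{n} \right) P(n,t),
% \lambda_n: growth (birth) rate, \mu_n: decay (death) rate, both set by the large-scale forcing.
```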
A Stochastic Framework for Modeling the Population Dynamics of Convective Clouds
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hagos, Samson; Feng, Zhe; Plant, Robert S.
A stochastic prognostic framework for modeling the population dynamics of convective clouds and representing them in climate models is proposed. The framework follows the nonequilibrium statistical mechanical approach to constructing a master equation for representing the evolution of the number of convective cells of a specific size and their associated cloud-base mass flux, given a large-scale forcing. In this framework, referred to as STOchastic framework for Modeling Population dynamics of convective clouds (STOMP), the evolution of convective cell size is predicted from three key characteristics of convective cells: (i) the probability of growth, (ii) the probability of decay, and (iii) the cloud-base mass flux. STOMP models are constructed and evaluated against CPOL radar observations at Darwin and convection permitting model (CPM) simulations. Multiple models are constructed under various assumptions regarding these three key parameters and the realism of these models is evaluated. It is shown that in a model where convective plumes prefer to aggregate spatially and the cloud-base mass flux is a nonlinear function of convective cell area, the mass flux manifests a recharge-discharge behavior under steady forcing. Such a model also produces observed behavior of convective cell populations and CPM simulated cloud-base mass flux variability under diurnally varying forcing. Finally, in addition to its use in developing understanding of convection processes and the controls on convective cell size distributions, this modeling framework is also designed to serve as a nonequilibrium closure formulation for spectral mass flux parameterizations.
The visual system’s internal model of the world
Lee, Tai Sing
2015-01-01
The Bayesian paradigm has provided a useful conceptual theory for understanding perceptual computation in the brain. While the detailed neural mechanisms of Bayesian inference are not fully understood, recent computational and neurophysiological works have illuminated the underlying computational principles and representational architecture. The fundamental insights are that the visual system is organized as a modular hierarchy to encode an internal model of the world, and that perception is realized by statistical inference based on such internal model. In this paper, I will discuss and analyze the varieties of representational schemes of these internal models and how they might be used to perform learning and inference. I will argue for a unified theoretical framework for relating the internal models to the observed neural phenomena and mechanisms in the visual cortex. PMID:26566294
[Scale Relativity Theory in living beings morphogenesis: fractal, determinism and chance].
Chaline, J
2012-10-01
The Scale Relativity Theory has many biological applications, from linear to non-linear and from classical mechanics to quantum mechanics. Self-similar laws have been used as model for the description of a huge number of biological systems. These laws may explain the origin of basal life structures. Log-periodic behaviors of acceleration or deceleration can be applied to branching macroevolution, to the time sequences of major evolutionary leaps. The existence of such a law does not mean that the role of chance in evolution is reduced, but instead that randomness and contingency may occur within a framework which may itself be structured in a partly statistical way. The scale relativity theory can open new perspectives in evolution. Copyright © 2012 Elsevier Masson SAS. All rights reserved.
A functional model of sensemaking in a neurocognitive architecture.
Lebiere, Christian; Pirolli, Peter; Thomson, Robert; Paik, Jaehyon; Rutledge-Taylor, Matthew; Staszewski, James; Anderson, John R
2013-01-01
Sensemaking is the active process of constructing a meaningful representation (i.e., making sense) of some complex aspect of the world. In relation to intelligence analysis, sensemaking is the act of finding and interpreting relevant facts amongst the sea of incoming reports, images, and intelligence. We present a cognitive model of core information-foraging and hypothesis-updating sensemaking processes applied to complex spatial probability estimation and decision-making tasks. While the model was developed in a hybrid symbolic-statistical cognitive architecture, its correspondence to neural frameworks in terms of both structure and mechanisms provided a direct bridge between rational and neural levels of description. Compared against data from two participant groups, the model correctly predicted both the presence and degree of four biases: confirmation, anchoring and adjustment, representativeness, and probability matching. It also favorably predicted human performance in generating probability distributions across categories, assigning resources based on these distributions, and selecting relevant features given a prior probability distribution. This model provides a constrained theoretical framework describing cognitive biases as arising from three interacting factors: the structure of the task environment, the mechanisms and limitations of the cognitive architecture, and the use of strategies to adapt to the dual constraints of cognition and the environment.
Toward Model Building for Visual Aesthetic Perception
Lughofer, Edwin; Zeng, Xianyi
2017-01-01
Several models of visual aesthetic perception have been proposed in recent years. Such models have drawn on investigations into the neural underpinnings of visual aesthetics, utilizing neurophysiological techniques and brain imaging techniques including functional magnetic resonance imaging, magnetoencephalography, and electroencephalography. The neural mechanisms underlying the aesthetic perception of the visual arts have been explained from the perspectives of neuropsychology, brain and cognitive science, informatics, and statistics. Although corresponding models have been constructed, the majority of these models contain elements that are difficult to be simulated or quantified using simple mathematical functions. In this review, we discuss the hypotheses, conceptions, and structures of six typical models for human aesthetic appreciation in the visual domain: the neuropsychological, information processing, mirror, quartet, and two hierarchical feed-forward layered models. Additionally, the neural foundation of aesthetic perception, appreciation, or judgement for each model is summarized. The development of a unified framework for the neurobiological mechanisms underlying the aesthetic perception of visual art and the validation of this framework via mathematical simulation is an interesting challenge in neuroaesthetics research. This review aims to provide information regarding the most promising proposals for bridging the gap between visual information processing and brain activity involved in aesthetic appreciation. PMID:29270194
A Functional Model of Sensemaking in a Neurocognitive Architecture
Lebiere, Christian; Paik, Jaehyon; Rutledge-Taylor, Matthew; Staszewski, James; Anderson, John R.
2013-01-01
Sensemaking is the active process of constructing a meaningful representation (i.e., making sense) of some complex aspect of the world. In relation to intelligence analysis, sensemaking is the act of finding and interpreting relevant facts amongst the sea of incoming reports, images, and intelligence. We present a cognitive model of core information-foraging and hypothesis-updating sensemaking processes applied to complex spatial probability estimation and decision-making tasks. While the model was developed in a hybrid symbolic-statistical cognitive architecture, its correspondence to neural frameworks in terms of both structure and mechanisms provided a direct bridge between rational and neural levels of description. Compared against data from two participant groups, the model correctly predicted both the presence and degree of four biases: confirmation, anchoring and adjustment, representativeness, and probability matching. It also favorably predicted human performance in generating probability distributions across categories, assigning resources based on these distributions, and selecting relevant features given a prior probability distribution. This model provides a constrained theoretical framework describing cognitive biases as arising from three interacting factors: the structure of the task environment, the mechanisms and limitations of the cognitive architecture, and the use of strategies to adapt to the dual constraints of cognition and the environment. PMID:24302930
Mimura, Yasuhiro; Takemoto, Satoko; Tachibana, Taro; Ogawa, Yutaka; Nishimura, Masaomi; Yokota, Hideo; Imamoto, Naoko
2017-11-24
Nuclear pore complexes (NPCs) maintain cellular homeostasis by mediating nucleocytoplasmic transport. Although cyclin-dependent kinases (CDKs) regulate NPC assembly in interphase, the location of NPC assembly on the nuclear envelope is not clear. CDKs also regulate the disappearance of pore-free islands, which are nuclear envelope subdomains; this subdomain gradually disappears with increase in homogeneity of the NPC in response to CDK activity. However, a causal relationship between pore-free islands and NPC assembly remains unclear. Here, we elucidated mechanisms underlying NPC assembly from a new perspective by focusing on pore-free islands. We proposed a novel framework for image-based analysis to automatically determine the detailed 'landscape' of pore-free islands from a large quantity of images, leading to the identification of NPC intermediates that appear in pore-free islands with increased frequency in response to CDK activity. Comparison of the spatial distribution between simulated and the observed NPC intermediates within pore-free islands showed that their distribution was spatially biased. These results suggested that the disappearance of pore-free islands is highly related to de novo NPC assembly and indicated the existence of specific regulatory mechanisms for the spatial arrangement of NPC assembly on nuclear envelopes.
Thermodynamic equilibrium with acceleration and the Unruh effect
NASA Astrophysics Data System (ADS)
Becattini, F.
2018-04-01
We address the problem of thermodynamic equilibrium with constant acceleration along the velocity field lines in a quantum relativistic statistical mechanics framework. We show that for a free scalar quantum field, after vacuum subtraction, all mean values vanish when the local temperature T is as low as the Unruh temperature T_U = A/2π, where A is the magnitude of the acceleration four-vector. We argue that the Unruh temperature is an absolute lower bound for the temperature of any accelerated fluid at global thermodynamic equilibrium. We discuss the conditions of this bound to be applicable in a local thermodynamic equilibrium situation.
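Restoring physical constants, the Unruh temperature quoted above in natural units reads:

```latex
% Unruh temperature with constants restored (T_U = A/2\pi in natural units \hbar = c = k_B = 1)
T_U = \frac{\hbar\, a}{2 \pi\, c\, k_B}
      \;\approx\; 4 \times 10^{-21}\,\mathrm{K} \;\times\; \frac{a}{1\ \mathrm{m\,s^{-2}}} ,
% i.e., enormous proper accelerations are required for T_U to be non-negligible.
```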
Analyzing Data for Systems Biology: Working at the Intersection of Thermodynamics and Data Analytics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cannon, William R.; Baxter, Douglas J.
2012-08-15
Many challenges in systems biology have to do with analyzing data within the framework of molecular phenomena and cellular pathways. How does this relate to thermodynamics that we know govern the behavior of molecules? Making progress in relating data analysis to thermodynamics is essential in systems biology if we are to build predictive models that enable the field of synthetic biology. This report discusses work at the crossroads of thermodynamics and data analysis, and demonstrates that statistical mechanical free energy is a multinomial log likelihood. Applications to systems biology are presented.
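One way to make the stated connection explicit is the standard algebra below (a sketch, not the report's own derivation): for N independent observations with occupation numbers n_i of states having Boltzmann probabilities p_i, the multinomial log-likelihood contains the Helmholtz free energy directly:

```latex
% Multinomial log-likelihood under Boltzmann state probabilities p_i = e^{-\beta E_i}/Z
\ln \mathcal{L} = \sum_i n_i \ln p_i
               = -\beta \sum_i n_i E_i \;+\; N \beta F ,
\qquad F = -k_B T \ln Z ,
% so evaluating the likelihood of observed state occupancies amounts to comparing
% their total energy against the free energy, linking data analysis to thermodynamics.
```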
Workflow based framework for life science informatics.
Tiwari, Abhishek; Sekhar, Arvind K T
2007-10-01
Workflow technology is a generic mechanism to integrate diverse types of available resources (databases, servers, software applications and different services) which facilitate knowledge exchange within traditionally divergent fields such as molecular biology, clinical research, computational science, physics, chemistry and statistics. Researchers can easily incorporate and access diverse, distributed tools and data to develop their own research protocols for scientific analysis. Application of workflow technology has been reported in areas like drug discovery, genomics, large-scale gene expression analysis, proteomics, and system biology. In this article, we have discussed the existing workflow systems and the trends in applications of workflow based systems.
NASA Astrophysics Data System (ADS)
Shahtahmassebi, Amir Reza; Song, Jie; Zheng, Qing; Blackburn, George Alan; Wang, Ke; Huang, Ling Yan; Pan, Yi; Moore, Nathan; Shahtahmassebi, Golnaz; Sadrabadi Haghighi, Reza; Deng, Jing Song
2016-04-01
A substantial body of literature has accumulated on the topic of using remotely sensed data to map impervious surfaces which are widely recognized as an important indicator of urbanization. However, the remote sensing of impervious surface growth has not been successfully addressed. This study proposes a new framework for deriving and summarizing urban expansion and re-densification using time series of impervious surface fractions (ISFs) derived from remotely sensed imagery. This approach integrates multiple endmember spectral mixture analysis (MESMA), analysis of regression residuals, spatial statistics (Getis_Ord) and urban growth theories; hence, the framework is abbreviated as MRGU. The performance of MRGU was compared with commonly used change detection techniques in order to evaluate the effectiveness of the approach. The results suggested that the ISF regression residuals were optimal for detecting impervious surface changes while Getis_Ord was effective for mapping hotspot regions in the regression residuals image. Moreover, the MRGU outputs agreed with the mechanisms proposed in several existing urban growth theories, but importantly the outputs enable the refinement of such models by explicitly accounting for the spatial distribution of both expansion and re-densification mechanisms. Based on Landsat data, the MRGU is somewhat restricted in its ability to measure re-densification in the urban core but this may be improved through the use of higher spatial resolution satellite imagery. The paper ends with an assessment of the present gaps in remote sensing of impervious surface growth and suggests some solutions. The application of impervious surface fractions in urban change detection is a stimulating new research idea which is driving future research with new models and algorithms.
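The spectral unmixing step at the core of MESMA is a constrained linear mixture model (standard form shown below; MESMA additionally tests many candidate endmember sets per pixel):

```latex
% Linear spectral mixture model for pixel reflectance R_\lambda in band \lambda
R_{\lambda} = \sum_{i=1}^{N} f_{i}\, R_{i,\lambda} + \varepsilon_{\lambda},
\qquad
\sum_{i=1}^{N} f_{i} = 1, \quad f_{i} \ge 0 ,
% f_i: fractional cover of endmember i (e.g., impervious surface); \varepsilon_\lambda: residual.
```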
A consistent framework for Horton regression statistics that leads to a modified Hack's law
Furey, P.R.; Troutman, B.M.
2008-01-01
A statistical framework is introduced that resolves important problems with the interpretation and use of traditional Horton regression statistics. The framework is based on a univariate regression model that leads to an alternative expression for Horton ratio, connects Horton regression statistics to distributional simple scaling, and improves the accuracy in estimating Horton plot parameters. The model is used to examine data for drainage area A and mainstream length L from two groups of basins located in different physiographic settings. Results show that confidence intervals for the Horton plot regression statistics are quite wide. Nonetheless, an analysis of covariance shows that regression intercepts, but not regression slopes, can be used to distinguish between basin groups. The univariate model is generalized to include n > 1 dependent variables. For the case where the dependent variables represent ln A and ln L, the generalized model performs somewhat better at distinguishing between basin groups than two separate univariate models. The generalized model leads to a modification of Hack's law where L depends on both A and Strahler order ω. Data show that ω plays a statistically significant role in the modified Hack's law expression. © 2008 Elsevier B.V.
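For orientation, Horton plot statistics and the classical Hack's law take the forms below; the paper's exact modified Hack's law, with its Strahler-order term, is given there and is only indicated schematically here:

```latex
% Horton plots: geometric growth of mean area and length with Strahler order \omega
\overline{A}_{\omega} \approx \overline{A}_{1}\, R_{A}^{\,\omega-1}, \qquad
\overline{L}_{\omega} \approx \overline{L}_{1}\, R_{L}^{\,\omega-1},
\qquad
% Classical Hack's law, with the exponent expressed via the Horton ratios
L \propto A^{h}, \quad h \approx \frac{\ln R_{L}}{\ln R_{A}} ,
% the modified law adds an explicit dependence of L on \omega as well as on A.
```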
ERIC Educational Resources Information Center
Dong, Nianbo; Spybrook, Jessaca; Kelcey, Ben
2016-01-01
The purpose of this study is to propose a general framework for power analyses to detect the moderator effects in two- and three-level cluster randomized trials (CRTs). The study specifically aims to: (1) develop the statistical formulations for calculating statistical power, minimum detectable effect size (MDES) and its confidence interval to…
ERIC Educational Resources Information Center
McGrath, April L.; Ferns, Alyssa; Greiner, Leigh; Wanamaker, Kayla; Brown, Shelley
2015-01-01
In this study we assessed the usefulness of a multifaceted teaching framework in an advanced statistics course. We sought to expand on past findings by using this framework to assess changes in anxiety and self-efficacy, and we collected focus group data to ascertain whether students attribute such changes to a multifaceted teaching approach.…
A data fusion framework for meta-evaluation of intelligent transportation system effectiveness
DOT National Transportation Integrated Search
This study presents a framework for the meta-evaluation of Intelligent Transportation System effectiveness. The framework is based on data fusion approaches that adjust for data biases and violations of other standard statistical assumptions. Operati...
Applications of statistics to medical science (1) Fundamental concepts.
Watanabe, Hiroshi
2011-01-01
The conceptual framework of statistical tests and statistical inferences are discussed, and the epidemiological background of statistics is briefly reviewed. This study is one of a series in which we survey the basics of statistics and practical methods used in medical statistics. Arguments related to actual statistical analysis procedures will be made in subsequent papers.
Robust detection of multiple sclerosis lesions from intensity-normalized multi-channel MRI
NASA Astrophysics Data System (ADS)
Karpate, Yogesh; Commowick, Olivier; Barillot, Christian
2015-03-01
Multiple sclerosis (MS) is a disease with heterogeneous evolution among patients. Quantitative analysis of longitudinal Magnetic Resonance Images (MRI) provides a spatial analysis of the brain tissues which may lead to the discovery of biomarkers of disease evolution. A better understanding of the disease will aid the discovery of pathogenic mechanisms, allowing for patient-adapted therapeutic strategies. To characterize MS lesions, we propose a novel paradigm to detect white matter lesions based on a statistical framework. It aims at studying the benefits of using multi-channel MRI to detect statistically significant differences between each individual MS patient and a database of control subjects. This framework consists of two components. First, intensity standardization is conducted to minimize the inter-subject intensity difference arising from variability of the acquisition process and different scanners. The intensity normalization relies on parameters obtained with a robust Gaussian Mixture Model (GMM) estimation that is not affected by the presence of MS lesions. The second part compares multi-channel MRI of MS patients with an atlas built from the control subjects, thereby allowing us to look for differences in normal-appearing white matter, in and around the lesions of each patient. Experimental results demonstrate that our technique accurately detects significant differences in lesions, consequently improving the results of MS lesion detection.
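A minimal sketch of GMM-based intensity standardization in the spirit described above is given below (synthetic intensities and a simplified piecewise-linear mapping; not the authors' robust estimator):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Fit tissue intensity modes in a subject image and map them onto the
# corresponding modes of a reference/atlas image (simplified illustration).
def tissue_means(intensities, n_tissues=3, seed=0):
    gmm = GaussianMixture(n_components=n_tissues, random_state=seed)
    gmm.fit(intensities.reshape(-1, 1))
    return np.sort(gmm.means_.ravel())

rng = np.random.default_rng(0)
# Synthetic stand-ins for brain-masked voxel intensities (CSF/GM/WM-like modes)
reference = np.concatenate([rng.normal(m, 10, 5000) for m in (30, 80, 120)])
subject   = np.concatenate([rng.normal(m, 12, 5000) for m in (50, 110, 160)])

ref_modes, sub_modes = tissue_means(reference), tissue_means(subject)
# Piecewise-linear intensity mapping anchored at the matched tissue modes
normalized = np.interp(subject, sub_modes, ref_modes)
print("subject modes :", sub_modes.round(1))
print("after mapping :", tissue_means(normalized).round(1))  # should approach the reference modes
```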
Combining statistical inference and decisions in ecology.
Williams, Perry J; Hooten, Mevin B
2016-09-01
Statistical decision theory (SDT) is a sub-field of decision theory that formally incorporates statistical investigation into a decision-theoretic framework to account for uncertainties in a decision problem. SDT provides a unifying analysis of three types of information: statistical results from a data set, knowledge of the consequences of potential choices (i.e., loss), and prior beliefs about a system. SDT links the theoretical development of a large body of statistical methods, including point estimation, hypothesis testing, and confidence interval estimation. The theory and application of SDT have mainly been developed and published in the fields of mathematics, statistics, operations research, and other decision sciences, but have had limited exposure in ecology. Thus, we provide an introduction to SDT for ecologists and describe its utility for linking the conventionally separate tasks of statistical investigation and decision making in a single framework. We describe the basic framework of both Bayesian and frequentist SDT, its traditional use in statistics, and discuss its application to decision problems that occur in ecology. We demonstrate SDT with two types of decisions: Bayesian point estimation and an applied management problem of selecting a prescribed fire rotation for managing a grassland bird species. Central to SDT, and decision theory in general, are loss functions. Thus, we also provide basic guidance and references for constructing loss functions for an SDT problem. © 2016 by the Ecological Society of America.
Multiscale Modeling of Damage Processes in Aluminum Alloys: Grain-Scale Mechanisms
NASA Technical Reports Server (NTRS)
Hochhalter, J. D.; Veilleux, M. G.; Bozek, J. E.; Glaessgen, E. H.; Ingraffea, A. R.
2008-01-01
This paper has two goals related to the development of a physically-grounded methodology for modeling the initial stages of fatigue crack growth in an aluminum alloy. The aluminum alloy, AA 7075-T651, is susceptible to fatigue cracking that nucleates from cracked second phase iron-bearing particles. Thus, the first goal of the paper is to validate an existing framework for the prediction of the conditions under which the particles crack. The observed statistics of particle cracking (defined as incubation for this alloy) must be accurately predicted to simulate the stochastic nature of microstructurally small fatigue crack (MSFC) formation. Also, only by simulating incubation of damage in a statistically accurate manner can subsequent stages of crack growth be accurately predicted. To maintain fidelity and computational efficiency, a filtering procedure was developed to eliminate particles that were unlikely to crack. The particle filter considers the distributions of particle sizes and shapes, grain texture, and the configuration of the surrounding grains. This filter helps substantially reduce the number of particles that need to be included in the microstructural models and forms the basis of the future work on the subsequent stages of MSFC, crack nucleation and microstructurally small crack propagation. A physics-based approach to simulating fracture should ultimately begin at nanometer length scale, in which atomistic simulation is used to predict the fundamental damage mechanisms of MSFC. These mechanisms include dislocation formation and interaction, interstitial void formation, and atomic diffusion. However, atomistic simulations quickly become computationally intractable as the system size increases, especially when directly linking to the already large microstructural models. Therefore, the second goal of this paper is to propose a method that will incorporate atomistic simulation and small-scale experimental characterization into the existing multiscale framework. At the microscale, the nanoscale mechanics are represented within cohesive zones where appropriate, i.e. where the mechanics observed at the nanoscale can be represented as occurring on a plane such as at grain boundaries or slip planes at a crack front. Important advancements that are yet to be made include: 1. an increased fidelity in cohesive zone modeling; 2. a means to understand how atomistic simulation scales with time; 3. a new experimental methodology for generating empirical models for CZMs and emerging materials; and 4. a validation of simulations of the damage processes at the nano-micro scale. With ever-increasing computer power, the long-term ability to employ atomistic simulation for the prognosis of structural components will not be limited by computation power, but by our lack of knowledge in incorporating atomistic models into simulations of MSFC into a multiscale framework.
Burnecki, Krzysztof; Kepten, Eldad; Janczura, Joanna; Bronshtein, Irena; Garini, Yuval; Weron, Aleksander
2012-11-07
We present a systematic statistical analysis of the recently measured individual trajectories of fluorescently labeled telomeres in the nucleus of living human cells. The experiments were performed in the U2OS cancer cell line. We propose an algorithm for identification of the telomere motion. By expanding the previously published data set, we are able to explore the dynamics over six orders of magnitude in time, a task not possible earlier. As a result, we establish a rigorous mathematical characterization of the stochastic process and identify the basic mathematical mechanisms behind the telomere motion. We find that the increments of the motion are stationary, Gaussian, ergodic, and even mixing, a stronger, more chaotic property. Moreover, the obtained memory parameter estimates, as well as the ensemble-average mean square displacement, reveal subdiffusive behavior at all time spans. All these findings statistically prove a fractional Brownian motion for the telomere trajectories, which is confirmed by a generalized p-variation test. Taking into account the biophysical nature of telomeres as monomers in the chromatin chain, we suggest polymer dynamics as a sufficient framework for their motion, without invoking other models. In addition, these results shed light on other studies of telomere motion and the alternative telomere lengthening mechanism. We hope that identification of these mechanisms will allow the development of a proper physical and biological model for telomere subdynamics. This array of tests can be easily applied to other data sets to enable quick and accurate analysis of their statistical characteristics. Copyright © 2012 Biophysical Society. Published by Elsevier Inc. All rights reserved.
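As a rough illustration of the ensemble-averaged MSD analysis mentioned above, the sketch below generates surrogate trajectories and fits an anomalous-diffusion exponent. The data, parameters, and the simple log-log fit are illustrative stand-ins, not the authors' pipeline (which also includes memory-parameter estimation, ergodicity checks, and the p-variation test).

```python
import numpy as np

rng = np.random.default_rng(2)

# Surrogate 2D trajectories: cumulative sums of Gaussian steps stand in
# for measured telomere tracks (purely illustrative data).
n_tracks, n_steps, dt = 50, 500, 0.1
tracks = np.cumsum(rng.normal(size=(n_tracks, n_steps, 2)), axis=1)

# Ensemble-averaged mean square displacement as a function of lag time.
lags = np.arange(1, 100)
msd = np.array([np.mean(np.sum((tracks[:, lag:] - tracks[:, :-lag])**2, axis=2))
                for lag in lags])

# For fractional Brownian motion, MSD ~ t^alpha with alpha = 2H;
# alpha < 1 indicates subdiffusion.  Fit the exponent on log-log axes.
alpha, log_prefactor = np.polyfit(np.log(lags * dt), np.log(msd), 1)
print(f"anomalous exponent alpha ≈ {alpha:.2f}, Hurst exponent H ≈ {alpha/2:.2f}")
```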
Failure of cement hydrates: freeze-thaw and fracture
NASA Astrophysics Data System (ADS)
Ioannidou, Katerina; Del Gado, Emanuela; Ulm, Franz-Josef; Pellenq, Roland
The mechanical and viscoelastic behavior of concrete crucially depends on cement hydrates, the ``glue'' of cement. Even more than the atomistic structure, the mesoscale amorphous texture of cement hydrates over hundreds of nanometers plays a key role in material properties. We use simulations that combine information on the nano-scale building units of cement hydrates and on their effective interactions, obtained from atomistic simulations and experiments, into a statistical physics framework for aggregating nanoparticles. Our mesoscale model was able to reconcile different experimental results, ranging from small-angle neutron scattering, SEM, and adsorption/desorption of N2 and water to nanoindentation, and to gain new fundamental insights into the microscopic origin of the properties measured. Our results suggest that heterogeneities developed during the early stages of hydration persist in the structure of C-S-H, impacting the rheological and mechanical performance of the hardened cement paste. In this talk I discuss recent investigations of mesoscale failure mechanisms of hardened cement paste, such as freeze-thaw and fracture. Using correlations between local volume fractions and local stress, we provide a link between structural and mechanical heterogeneities during the failure mechanisms.
A Classification of Statistics Courses (A Framework for Studying Statistical Education)
ERIC Educational Resources Information Center
Turner, J. C.
1976-01-01
A classification of statistics courses is presented, with main categories of "course type," "methods of presentation," "objectives," and "syllabus." Examples and suggestions for uses of the classification are given. (DT)
Statistical Model Analysis of (n,p) Cross Sections and Average Energy For Fission Neutron Spectrum
DOE Office of Scientific and Technical Information (OSTI.GOV)
Odsuren, M.; Khuukhenkhuu, G.
2011-06-28
Investigation of charged particle emission reaction cross sections for fast neutrons is important to both nuclear reactor technology and the understanding of nuclear reaction mechanisms. In particular, the study of (n,p) cross sections is necessary to estimate radiation damage due to hydrogen production, nuclear heating and transmutations in the structural materials of fission and fusion reactors. On the other hand, it is often necessary in practice to evaluate the neutron cross sections of the nuclides for which no experimental data are available. Because of this, we carried out a systematic analysis of known experimental (n,p) and (n,α) cross sections for fast neutrons and observed a systematic regularity in the wide energy interval of 6-20 MeV and for a broad mass range of target nuclei. To explain this effect, some formulae were deduced using the compound, pre-equilibrium, and direct reaction mechanisms. In this paper, known experimental (n,p) cross sections averaged over the thermal fission neutron spectrum of U-235 are analyzed in the framework of the statistical model. It was shown that the experimental data are satisfactorily described by the statistical model. Also, in the case of (n,p) cross sections, the effective average neutron energy for the fission spectrum of U-235 was found to be around 3 MeV.
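To make the idea of a spectrum-averaged cross section concrete, here is a small illustrative calculation. The Maxwellian form of the U-235 fission spectrum (with temperature T ≈ 1.29 MeV) and the toy threshold-shaped excitation function are assumptions for demonstration only; they do not reproduce the evaluated data analyzed in the paper.

```python
import numpy as np

# Maxwellian approximation to the U-235 fission neutron spectrum:
# chi(E) ~ sqrt(E) * exp(-E/T), normalized to unit area (T is illustrative).
T = 1.29                                    # MeV
E = np.linspace(0.01, 20.0, 4000)           # MeV
chi = np.sqrt(E) * np.exp(-E / T)
chi /= np.trapz(chi, E)

# Toy (n,p) excitation function: a smooth threshold curve standing in for
# evaluated data (purely illustrative, not a real cross section).
sigma = np.where(E > 2.0, 50.0 * (1 - np.exp(-(E - 2.0) / 3.0)), 0.0)  # mb

sigma_avg = np.trapz(sigma * chi, E)        # spectrum-averaged cross section
E_eff = np.trapz(E * sigma * chi, E) / np.trapz(sigma * chi, E)
print(f"<sigma> = {sigma_avg:.1f} mb, effective average energy = {E_eff:.2f} MeV")
```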
Observers Exploit Stochastic Models of Sensory Change to Help Judge the Passage of Time
Ahrens, Misha B.; Sahani, Maneesh
2011-01-01
Summary Sensory stimulation can systematically bias the perceived passage of time [1–5], but why and how this happens is mysterious. In this report, we provide evidence that such biases may ultimately derive from an innate and adaptive use of stochastically evolving dynamic stimuli to help refine estimates derived from internal timekeeping mechanisms [6–15]. A simplified statistical model based on probabilistic expectations of stimulus change derived from the second-order temporal statistics of the natural environment [16, 17] makes three predictions. First, random noise-like stimuli whose statistics violate natural expectations should induce timing bias. Second, a previously unexplored obverse of this effect is that similar noise stimuli with natural statistics should reduce the variability of timing estimates. Finally, this reduction in variability should scale with the interval being timed, so as to preserve the overall Weber law of interval timing. All three predictions are borne out experimentally. Thus, in the context of our novel theoretical framework, these results suggest that observers routinely rely on sensory input to augment their sense of the passage of time, through a process of Bayesian inference based on expectations of change in the natural environment. PMID:21256018
Daikoku, Tatsuya
2018-06-19
Statistical learning (SL) is a method of learning based on the transitional probabilities embedded in sequential phenomena such as music and language. It has been considered an implicit and domain-general mechanism that is innate in the human brain and that functions independently of intention to learn and awareness of what has been learned. SL is an interdisciplinary notion that incorporates information technology, artificial intelligence, musicology, and linguistics, as well as psychology and neuroscience. A body of recent studies has suggested that SL can be reflected in neurophysiological responses based on the framework of information theory. This paper reviews a range of work on SL in adults and children that suggests overlapping and independent neural correlates in music and language, and that indicates impairments of SL. Furthermore, this article discusses the relationships between the order of transitional probabilities (TPs) (i.e., hierarchy of local statistics) and entropy (i.e., global statistics) regarding SL strategies in the human brain; argues for the importance of information-theoretic approaches to understanding domain-general, higher-order, and global SL covering both real-world music and language; and proposes promising approaches for the application of therapy and pedagogy from various perspectives of psychology, neuroscience, computational studies, musicology, and linguistics.
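A minimal sketch of the two quantities discussed here, first-order transitional probabilities and the entropy of the distribution that follows each element, computed on a made-up syllable stream (the stimuli and probabilities are illustrative, not drawn from the reviewed studies):

```python
from collections import Counter
import math

# Toy "syllable" stream with embedded triplets, in the spirit of classic
# statistical-learning stimuli (the sequence itself is invented).
stream = "tu-pi-ro-go-la-bu-bi-da-ku-tu-pi-ro-bi-da-ku-go-la-bu".split("-")

# First-order transitional probabilities P(next | current).
pair_counts = Counter(zip(stream[:-1], stream[1:]))
first_counts = Counter(stream[:-1])
tp = {(a, b): c / first_counts[a] for (a, b), c in pair_counts.items()}

# Shannon entropy of the conditional distribution following each syllable,
# a simple stand-in for the "global statistics" discussed in the review.
def entropy_after(sym):
    probs = [p for (a, _), p in tp.items() if a == sym]
    return -sum(p * math.log2(p) for p in probs)

print("P(pi | tu) =", tp[("tu", "pi")])
print("entropy after 'tu' =", entropy_after("tu"), "bits")
print("entropy after 'ku' =", entropy_after("ku"), "bits")
```

Within-"word" transitions have high TP and low entropy, while boundary syllables have lower TP and higher entropy, which is the contrast SL is thought to exploit.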
Ho, Lap; Cheng, Haoxiang; Wang, Jun; Simon, James E; Wu, Qingli; Zhao, Danyue; Carry, Eileen; Ferruzzi, Mario G; Faith, Jeremiah; Valcarcel, Breanna; Hao, Ke; Pasinetti, Giulio M
2018-03-05
The development of a given botanical preparation for eventual clinical application requires extensive, detailed characterizations of the chemical composition, as well as the biological availability, biological activity, and safety profiles of the botanical. These issues are typically addressed using diverse experimental protocols and model systems. Based on this consideration, in this study we established a comprehensive database and analysis framework for the collection, collation, and integrative analysis of diverse, multiscale data sets. Using this framework, we conducted an integrative analysis of heterogeneous data from in vivo and in vitro investigations of a complex bioactive dietary polyphenol-rich preparation (BDPP) and built an integrated network linking data sets generated from this multitude of diverse experimental paradigms. We established a comprehensive database and analysis framework as well as a systematic and logical means to catalogue and collate the diverse array of information gathered, which is securely stored and added to in a standardized manner to enable fast querying. We demonstrated the utility of the database in (1) a statistical ranking scheme to prioritize responses to treatments and (2) an in-depth reconstruction of functionality studies. By examination of these data sets, the system allows analytical querying of heterogeneous data and access to information related to interactions, mechanisms of action, functions, etc., which ultimately provides a global overview of complex biological responses. Collectively, we present an integrative analysis framework that leads to novel insights into the biological activities of a complex botanical such as BDPP, based on data-driven characterizations of interactions between BDPP-derived phenolic metabolites and their mechanisms of action, as well as synergism and/or potential cancellation of biological functions. Our integrative analytical approach provides novel means for a systematic integrative analysis of heterogeneous data types in the development of complex botanicals such as polyphenols for eventual clinical and translational applications.
Tiossi, Rodrigo; Rodrigues, Renata Cristina Silveira; de Mattos, Maria da Glória Chiarello; Ribeiro, Ricardo Faria
2008-01-01
This study compared the vertical misfit of 3-unit implant-supported nickel-chromium (Ni-Cr) and cobalt-chromium (Co-Cr) alloy and commercially pure titanium (cpTi) frameworks after casting as 1 piece, after sectioning and laser welding, and after simulated porcelain firings. The results on the tightened side showed no statistically significant differences. On the opposite side, statistically significant differences were found for Co-Cr alloy (118.64 microm [SD: 91.48] to 39.90 microm [SD: 27.13]) and cpTi (118.56 microm [SD: 51.35] to 27.87 microm [SD: 12.71]) when comparing 1-piece to laser-welded frameworks. With both sides tightened, only Co-Cr alloy showed statistically significant differences after laser welding. Ni-Cr alloy showed the lowest misfit values, though the differences were not statistically significant. Simulated porcelain firings revealed no significant differences.
A statistical framework for protein quantitation in bottom-up MS-based proteomics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Karpievitch, Yuliya; Stanley, Jeffrey R.; Taverner, Thomas
2009-08-15
Motivation: Quantitative mass spectrometry-based proteomics requires protein-level estimates and confidence measures. Challenges include the presence of low-quality or incorrectly identified peptides and widespread, informative missing data. Furthermore, models are required for rolling peptide-level information up to the protein level. Results: We present a statistical model for protein abundance in terms of peptide peak intensities, applicable to both label-based and label-free quantitation experiments. The model allows for both random and censoring missingness mechanisms and provides naturally for protein-level estimates and confidence measures. The model is also used to derive automated filtering and imputation routines. Three LC-MS datasets are used to illustrate the methods. Availability: The software has been made available in the open-source proteomics platform DAnTE (Polpitiya et al., 2008) (http://omics.pnl.gov/software/). Contact: adabney@stat.tamu.edu
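For orientation only, the fragment below shows the most naive way to roll peptide-level log intensities up to a protein-level estimate in the presence of missing values. It deliberately ignores the censoring-aware likelihood, filtering, and imputation machinery that the model described above provides; the toy matrix is invented.

```python
import numpy as np

# Toy peptide-by-sample matrix of log2 intensities for one protein;
# NaN marks missing observations (values are illustrative).
peptides = np.array([
    [22.1, 22.4, 23.0, 23.3],
    [20.5, 20.9, 21.6, np.nan],    # partially missing peptide
    [18.2, np.nan, np.nan, 19.5],  # low-abundance peptide, possibly censored
])

# Median-centre each peptide to remove peptide-specific ionization effects,
# then average across peptides to get a per-sample protein-level profile.
centered = peptides - np.nanmedian(peptides, axis=1, keepdims=True)
protein_profile = np.nanmean(centered, axis=0)
print("relative protein abundance per sample:", protein_profile)
```

A censored-missingness model would instead treat the NaNs in low-abundance peptides as informative (below a detection limit) rather than ignorable, which is the key distinction the abstract draws.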
GAFFE: a gaze-attentive fixation finding engine.
Rajashekar, U; van der Linde, I; Bovik, A C; Cormack, L K
2008-04-01
The ability to automatically detect visually interesting regions in images has many practical applications, especially in the design of active machine vision and automatic visual surveillance systems. Analysis of the statistics of image features at observers' gaze can provide insights into the mechanisms of fixation selection in humans. Using a foveated analysis framework, we studied the statistics of four low-level local image features: luminance, contrast, and bandpass outputs of both luminance and contrast, and discovered that image patches around human fixations had, on average, higher values of each of these features than image patches selected at random. Contrast-bandpass showed the greatest difference between human and random fixations, followed by luminance-bandpass, RMS contrast, and luminance. Using these measurements, we present a new algorithm that selects image regions as likely candidates for fixation. These regions are shown to correlate well with fixations recorded from human observers.
NASA Astrophysics Data System (ADS)
Zaccaria, A.; Cristelli, M.; Alfi, V.; Ciulla, F.; Pietronero, L.
2010-06-01
We show that the statistics of spreads in real order books is characterized by an intrinsic asymmetry due to discreteness effects for even or odd values of the spread. An analysis of data from the New York Stock Exchange (NYSE) order book points out that traders’ strategies contribute to this asymmetry. We also investigate this phenomenon in the framework of a microscopic model and, by introducing a nonuniform deposition mechanism for limit orders, we are able to quantitatively reproduce the asymmetry found in the experimental data. Simulations of our model also show a realistic dynamics with a sort of intermittent behavior characterized by long periods in which the order book is compact and liquid interrupted by volatile configurations. The order placement strategies produce a nontrivial behavior of the spread relaxation dynamics which is similar to the one observed in real markets.
A Framework for Understanding the Patterns of Student Difficulties in Quantum Mechanics
NASA Astrophysics Data System (ADS)
Singh, Chandralekha
2015-04-01
Compared with introductory physics, relatively little is known about the development of expertise in advanced physics courses, especially in the case of quantum mechanics. We describe a theoretical framework for understanding the patterns of student reasoning difficulties and how students develop expertise in quantum mechanics. The framework posits that the challenges many students face in developing expertise in quantum mechanics are analogous to the challenges introductory students face in developing expertise in introductory classical mechanics. This framework incorporates the effects of diversity in students' prior preparation, goals and motivation for taking upper-level physics courses in general as well as the ``paradigm shift'' from classical mechanics to quantum mechanics. The framework is based on empirical investigations demonstrating that the patterns of reasoning, problem-solving, and self-monitoring difficulties in quantum mechanics bear a striking resemblance to those found in introductory classical mechanics. Examples from research in quantum mechanics and introductory classical mechanics will be discussed to illustrate how the patterns of difficulties are analogous as students learn to unpack the respective principles and grasp the formalism in each knowledge domain during the development of expertise. Embracing such a theoretical framework and contemplating the parallels between the difficulties in these two knowledge domains can enable researchers to leverage the extensive literature for introductory physics education research to guide the design of teaching and learning tools for helping students develop expertise in quantum mechanics. Support from the National Science Foundation is gratefully acknowledged.
NASA Astrophysics Data System (ADS)
Glushak, P. A.; Markiv, B. B.; Tokarchuk, M. V.
2018-01-01
We present a generalization of Zubarev's nonequilibrium statistical operator method based on the principle of maximum Renyi entropy. In the framework of this approach, we obtain transport equations for the basic set of parameters of the reduced description of nonequilibrium processes in a classical system of interacting particles using Liouville equations with fractional derivatives. For a classical system of particles in a medium with a fractal structure, we obtain a non-Markovian diffusion equation with fractional spatial derivatives. For a concrete model of the frequency dependence of a memory function, we obtain a generalized Cattaneo-type diffusion equation with the spatial and temporal fractality taken into account. We present a generalization of nonequilibrium thermofield dynamics in Zubarev's nonequilibrium statistical operator method in the framework of Renyi statistics.
ERIC Educational Resources Information Center
Haas, Stephanie W.; Pattuelli, Maria Cristina; Brown, Ron T.
2003-01-01
Describes the Statistical Interactive Glossary (SIG), an enhanced glossary of statistical terms supported by the GovStat ontology of statistical concepts. Presents a conceptual framework whose components articulate different aspects of a term's basic explanation that can be manipulated to produce a variety of presentations. The overarching…
Active contours on statistical manifolds and texture segmentation
Sang-Mook Lee; A. Lynn Abbott; Neil A. Clark; Philip A. Araman
2005-01-01
A new approach to active contours on statistical manifolds is presented. The statistical manifolds are 2- dimensional Riemannian manifolds that are statistically defined by maps that transform a parameter domain onto a set of probability density functions. In this novel framework, color or texture features are measured at each image point and their statistical...
Framework for Understanding the Patterns of Student Difficulties in Quantum Mechanics
ERIC Educational Resources Information Center
Marshman, Emily; Singh, Chandralekha
2015-01-01
Compared with introductory physics, relatively little is known about the development of expertise in advanced physics courses, especially in the case of quantum mechanics. Here, we describe a framework for understanding the patterns of student reasoning difficulties and how students develop expertise in quantum mechanics. The framework posits that…
SOCR: Statistics Online Computational Resource
Dinov, Ivo D.
2011-01-01
The need for hands-on computer laboratory experience in undergraduate and graduate statistics education has been firmly established in the past decade. As a result a number of attempts have been undertaken to develop novel approaches for problem-driven statistical thinking, data analysis and result interpretation. In this paper we describe an integrated educational web-based framework for: interactive distribution modeling, virtual online probability experimentation, statistical data analysis, visualization and integration. Following years of experience in statistical teaching at all college levels using established licensed statistical software packages, like STATA, S-PLUS, R, SPSS, SAS, Systat, etc., we have attempted to engineer a new statistics education environment, the Statistics Online Computational Resource (SOCR). This resource performs many of the standard types of statistical analysis, much like other classical tools. In addition, it is designed in a plug-in object-oriented architecture and is completely platform independent, web-based, interactive, extensible and secure. Over the past 4 years we have tested, fine-tuned and reanalyzed the SOCR framework in many of our undergraduate and graduate probability and statistics courses and have evidence that SOCR resources build student’s intuition and enhance their learning. PMID:21451741
Entropy for Mechanically Vibrating Systems
NASA Astrophysics Data System (ADS)
Tufano, Dante
The research contained within this thesis deals with the subject of entropy as defined for and applied to mechanically vibrating systems. This work begins with an overview of entropy as it is understood in the fields of classical thermodynamics, information theory, statistical mechanics, and statistical vibroacoustics. Khinchin's definition of entropy, which is the primary definition used for the work contained in this thesis, is introduced in the context of vibroacoustic systems. The main goal of this research is to establish a mathematical framework for the application of Khinchin's entropy in the field of statistical vibroacoustics by examining the entropy context of mechanically vibrating systems. The introduction of this thesis provides an overview of statistical energy analysis (SEA), a modeling approach to vibroacoustics that motivates this work on entropy. The objective of this thesis is given, followed by a discussion of the intellectual merit of this work as well as a literature review of relevant material. Following the introduction, an entropy analysis of systems of coupled oscillators is performed utilizing Khinchin's definition of entropy. This analysis expands upon the mathematical theory relating to mixing entropy, which is generated by the coupling of vibroacoustic systems. The mixing entropy is shown to provide insight into the qualitative behavior of such systems. Additionally, it is shown that the entropy inequality property of Khinchin's entropy can be reduced to an equality using the mixing entropy concept. This equality can be interpreted as a facet of the second law of thermodynamics for vibroacoustic systems. Following this analysis, an investigation of continuous systems is performed using Khinchin's entropy. It is shown that entropy analyses using Khinchin's entropy are valid for continuous systems that can be decomposed into a finite number of modes. The results are shown to be analogous to those obtained for simple oscillators, which demonstrates the applicability of entropy-based approaches to real-world systems. Three systems are considered to demonstrate these findings: 1) a rod end-coupled to a simple oscillator, 2) two end-coupled rods, and 3) two end-coupled beams. The aforementioned work utilizes the weak coupling assumption to determine the entropy of composite systems. Following this discussion, a direct method of finding entropy is developed which does not rely on this limiting assumption. The resulting entropy provides a useful benchmark for evaluating the accuracy of the weak coupling approach, and is validated using systems of coupled oscillators. The later chapters of this work discuss Khinchin's entropy as applied to nonlinear and nonconservative systems, respectively. The discussion of entropy for nonlinear systems is motivated by the desire to expand the applicability of SEA techniques beyond the linear regime. The discussion of nonconservative systems is also crucial, since real-world systems interact with their environment, and it is necessary to confirm the validity of an entropy approach for systems that are relevant in the context of SEA. Having developed a mathematical framework for determining entropy in a number of previously unexplored cases, the relationship between thermodynamics and statistical vibroacoustics can be better understood. Specifically, vibroacoustic temperatures can be obtained for systems that are not necessarily linear or weakly coupled.
In this way, entropy provides insight into how the power flow proportionality of statistical energy analysis (SEA) can be applied to a broader class of vibroacoustic systems. As such, entropy is a useful tool for both justifying and expanding the foundational results of SEA.
NASA Astrophysics Data System (ADS)
Potirakis, Stelios M.; Zitis, Pavlos I.; Eftaxias, Konstantinos
2013-07-01
The field of study of complex systems considers that the dynamics of complex systems are founded on universal principles that may be used to describe a great variety of scientific and technological approaches of different types of natural, artificial, and social systems. Several authors have suggested that earthquake dynamics and the dynamics of economic (financial) systems can be analyzed within similar mathematical frameworks. We apply concepts of the nonextensive statistical physics, on time-series data of observable manifestations of the underlying complex processes ending up with these different extreme events, in order to support the suggestion that a dynamical analogy exists between a financial crisis (in the form of share or index price collapse) and a single earthquake. We also investigate the existence of such an analogy by means of scale-free statistics (the Gutenberg-Richter distribution of event sizes). We show that the populations of: (i) fracto-electromagnetic events rooted in the activation of a single fault, emerging prior to a significant earthquake, (ii) the trade volume events of different shares/economic indices, prior to a collapse, and (iii) the price fluctuation (considered as the difference of maximum minus minimum price within a day) events of different shares/economic indices, prior to a collapse, follow both the traditional Gutenberg-Richter law as well as a nonextensive model for earthquake dynamics, with similar parameter values. The obtained results imply the existence of a dynamic analogy between earthquakes and economic crises, which moreover follow the dynamics of seizures, magnetic storms and solar flares.
Structured statistical models of inductive reasoning.
Kemp, Charles; Tenenbaum, Joshua B
2009-01-01
Everyday inductive inferences are often guided by rich background knowledge. Formal models of induction should aim to incorporate this knowledge and should explain how different kinds of knowledge lead to the distinctive patterns of reasoning found in different inductive contexts. This article presents a Bayesian framework that attempts to meet both goals and describes [corrected] 4 applications of the framework: a taxonomic model, a spatial model, a threshold model, and a causal model. Each model makes probabilistic inferences about the extensions of novel properties, but the priors for the 4 models are defined over different kinds of structures that capture different relationships between the categories in a domain. The framework therefore shows how statistical inference can operate over structured background knowledge, and the authors argue that this interaction between structure and statistics is critical for explaining the power and flexibility of human reasoning.
Multiresolution multiscale active mask segmentation of fluorescence microscope images
NASA Astrophysics Data System (ADS)
Srinivasa, Gowri; Fickus, Matthew; Kovačević, Jelena
2009-08-01
We propose an active mask segmentation framework that combines the advantages of statistical modeling, smoothing, speed and flexibility offered by the traditional methods of region-growing, multiscale, multiresolution and active contours respectively. At the crux of this framework is a paradigm shift from evolving contours in the continuous domain to evolving multiple masks in the discrete domain. Thus, the active mask framework is particularly suited to segment digital images. We demonstrate the use of the framework in practice through the segmentation of punctate patterns in fluorescence microscope images. Experiments reveal that statistical modeling helps the multiple masks converge from a random initial configuration to a meaningful one. This obviates the need for an involved initialization procedure germane to most of the traditional methods used to segment fluorescence microscope images. While we provide the mathematical details of the functions used to segment fluorescence microscope images, this is only an instantiation of the active mask framework. We suggest some other instantiations of the framework to segment different types of images.
Retrieval Capabilities of Hierarchical Networks: From Dyson to Hopfield
NASA Astrophysics Data System (ADS)
Agliari, Elena; Barra, Adriano; Galluzzi, Andrea; Guerra, Francesco; Tantari, Daniele; Tavani, Flavia
2015-01-01
We consider statistical-mechanics models for spin systems built on hierarchical structures, which provide a simple example of non-mean-field framework. We show that the coupling decay with spin distance can give rise to peculiar features and phase diagrams much richer than their mean-field counterpart. In particular, we consider the Dyson model, mimicking ferromagnetism in lattices, and we prove the existence of a number of metastabilities, beyond the ordered state, which become stable in the thermodynamic limit. Such a feature is retained when the hierarchical structure is coupled with the Hebb rule for learning, hence mimicking the modular architecture of neurons, and gives rise to an associative network able to perform single pattern retrieval as well as multiple-pattern retrieval, depending crucially on the external stimuli and on the rate of interaction decay with distance; however, those emergent multitasking features reduce the network capacity with respect to the mean-field counterpart. The analysis is accomplished through statistical mechanics, Markov chain theory, signal-to-noise ratio technique, and numerical simulations in full consistency. Our results shed light on the biological complexity shown by real networks, and suggest future directions for understanding more realistic models.
A computational DFT study of structural transitions in textured solid-fluid interfaces
NASA Astrophysics Data System (ADS)
Yatsyshin, Petr; Parry, Andrew O.; Kalliadasis, Serafim
2015-11-01
Fluids adsorbed at walls, in capillary pores and slits, and in more exotic, sculpted geometries such as grooves and wedges can exhibit many new phase transitions, including wetting, pre-wetting, capillary-condensation and filling, compared to their bulk counterparts. As well as being of fundamental interest to the modern statistical mechanical theory of inhomogeneous fluids, these are also relevant to nanofluidics, chemical- and bioengineering. In this talk we will show using a microscopic Density Functional Theory (DFT) for fluids how novel, continuous, interfacial transitions associated with the first-order prewetting line, can occur on steps, in grooves and in wedges, that are sensitive to both the range of the intermolecular forces and interfacial fluctuation effects. These transitions compete with wetting, filling and condensation producing very rich phase diagrams even for relatively simple geometries. We will also discuss practical aspects of DFT calculations, and demonstrate how this statistical-mechanical framework is capable of yielding complex fluid structure, interfacial tensions, and regions of thermodynamic stability of various fluid configurations. As a side note, this demonstrates that DFT is an excellent tool for the investigations of complex multiphase systems. We acknowledge financial support from the European Research Council via Advanced Grant No. 247031.
General aviation avionics statistics : 1977.
DOT National Transportation Integrated Search
1980-06-01
This report presents avionics statistics for the 1977 general aviation (GA) aircraft fleet and is the fourth in a series. The statistics are presented in a capability group framework which enables one to relate airborne avionics equipment to the capa...
Mechanical properties and ultrastructural characteristics of a glass fiber-reinforced composite.
García Barbero, Alvaro Enrique; Vera González, Vicente; García Barbero, Ernesto; Aliaga Vera, Ignacio
2015-06-01
This study examined the ultrastructural characteristics of a fiber-reinforced composite (FRC) and its in vitro behavior as a framework for fixed partial dentures (FPDs). A total of 40 specimens were prepared using extracted teeth fixed in methacrylate blocks as supports for the FPD; the specimens were then divided into four groups depending on whether a retaining box was used to fix the FPD to the support teeth and whether a composite pontic was assembled on top of the fibers. Fracture testing was performed in a universal testing machine (1 mm/minute). Fracture strength values and failure types were statistically compared for each group. Using retaining boxes did not improve the mechanical behavior of the restorative system. The weakest element of the system was the composite tooth constructed on top of the FRC.
The computational nature of memory modification.
Gershman, Samuel J; Monfils, Marie-H; Norman, Kenneth A; Niv, Yael
2017-03-15
Retrieving a memory can modify its influence on subsequent behavior. We develop a computational theory of memory modification, according to which modification of a memory trace occurs through classical associative learning, but which memory trace is eligible for modification depends on a structure learning mechanism that discovers the units of association by segmenting the stream of experience into statistically distinct clusters (latent causes). New memories are formed when the structure learning mechanism infers that a new latent cause underlies current sensory observations. By the same token, old memories are modified when old and new sensory observations are inferred to have been generated by the same latent cause. We derive this framework from probabilistic principles, and present a computational implementation. Simulations demonstrate that our model can reproduce the major experimental findings from studies of memory modification in the Pavlovian conditioning literature.
Jarzynski equality in the context of maximum path entropy
NASA Astrophysics Data System (ADS)
González, Diego; Davis, Sergio
2017-06-01
In the global framework of finding an axiomatic derivation of nonequilibrium Statistical Mechanics from fundamental principles, such as the maximum path entropy - also known as Maximum Caliber principle -, this work proposes an alternative derivation of the well-known Jarzynski equality, a nonequilibrium identity of great importance today due to its applications to irreversible processes: biological systems (protein folding), mechanical systems, among others. This equality relates the free energy differences between two equilibrium thermodynamic states with the work performed when going between those states, through an average over a path ensemble. In this work the analysis of Jarzynski's equality will be performed using the formalism of inference over path space. This derivation highlights the wide generality of Jarzynski's original result, which could even be used in non-thermodynamical settings such as social systems, financial and ecological systems.
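A standard numerical check of the equality (not from the paper, which works analytically via Maximum Caliber) uses an overdamped Brownian particle in a harmonic trap dragged at constant speed, for which ΔF = 0, so the average of exp(-βW) should converge to 1 even though the mean work is positive. All parameter values below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(3)

# Overdamped Langevin dynamics in a harmonic trap whose centre lambda moves
# at constant speed v.  Units with beta = gamma = 1; parameters illustrative.
k, v, beta, gamma = 2.0, 1.0, 1.0, 1.0
dt, n_steps, n_traj = 1e-3, 1000, 20000
D = 1.0 / (beta * gamma)

x = rng.normal(0.0, np.sqrt(1.0 / (beta * k)), n_traj)  # equilibrium start
work = np.zeros(n_traj)
lam = 0.0
for _ in range(n_steps):
    # dW = (dH/dlambda) dlambda, with H = (k/2)(x - lambda)^2
    work += -k * (x - lam) * v * dt
    x += -(k / gamma) * (x - lam) * dt + np.sqrt(2 * D * dt) * rng.normal(size=n_traj)
    lam += v * dt

# Translating the trap leaves the free energy unchanged (dF = 0), so the
# Jarzynski average should be close to 1 while the mean work stays positive.
print("<W> =", work.mean(), "  <exp(-beta W)> =", np.exp(-beta * work).mean())
```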
Studying Weather and Climate Extremes in a Non-stationary Framework
NASA Astrophysics Data System (ADS)
Wu, Z.
2010-12-01
The study of weather and climate extremes often uses the theory of extreme values. Such a detection method has a major problem: to obtain the probability distribution of extremes, one has to implicitly assume the Earth's climate is stationary over a long period within which the climatology is defined. While such detection makes some sense in a purely statistical view of stationary processes, it can lead to misleading statistical properties of weather and climate extremes caused by long-term climate variability and change, and may also cause enormous difficulty in attributing and predicting these extremes. To alleviate this problem, here we report a novel non-stationary framework for studying weather and climate extremes. In this new framework, the weather and climate extremes will be defined as timescale-dependent quantities derived from the anomalies with respect to non-stationary climatologies of different timescales. With this non-stationary framework, the non-stationary and nonlinear nature of the climate system will be taken into account, and the attribution and prediction of weather and climate extremes can then be separated into 1) the change of the statistical properties of the weather and climate extremes themselves and 2) the background climate variability and change. The new non-stationary framework will use the ensemble empirical mode decomposition (EEMD) method, which is a recent major improvement of the Hilbert-Huang Transform for time-frequency analysis. Using this tool, we will adaptively decompose various weather and climate data from observations and climate models in terms of the components of the various natural timescales contained in the data. With such decompositions, the non-stationary statistical properties (both spatial and temporal) of weather and climate anomalies and of their corresponding climatologies will be analyzed and documented.
Nonequilibrium thermodynamics in sheared hard-sphere materials.
Lieou, Charles K C; Langer, J S
2012-06-01
We combine the shear-transformation-zone (STZ) theory of amorphous plasticity with Edwards' statistical theory of granular materials to describe shear flow in a disordered system of thermalized hard spheres. The equations of motion for this system are developed within a statistical thermodynamic framework analogous to that which has been used in the analysis of molecular glasses. For hard spheres, the system volume V replaces the internal energy U as a function of entropy S in conventional statistical mechanics. In place of the effective temperature, the compactivity X=∂V/∂S characterizes the internal state of disorder. We derive the STZ equations of motion for a granular material accordingly, and predict the strain rate as a function of the ratio of the shear stress to the pressure for different values of a dimensionless, temperature-like variable near a jamming transition. We use a simplified version of our theory to interpret numerical simulations by Haxton, Schmiedeberg, and Liu, and in this way are able to obtain useful insights about internal rate factors and relations between jamming and glass transitions.
Bayesian Sensitivity Analysis of Statistical Models with Missing Data
ZHU, HONGTU; IBRAHIM, JOSEPH G.; TANG, NIANSHENG
2013-01-01
Methods for handling missing data depend strongly on the mechanism that generated the missing values, such as missing completely at random (MCAR) or missing at random (MAR), as well as other distributional and modeling assumptions at various stages. It is well known that the resulting estimates and tests may be sensitive to these assumptions as well as to outlying observations. In this paper, we introduce various perturbations to modeling assumptions and individual observations, and then develop a formal sensitivity analysis to assess these perturbations in the Bayesian analysis of statistical models with missing data. We develop a geometric framework, called the Bayesian perturbation manifold, to characterize the intrinsic structure of these perturbations. We propose several intrinsic influence measures to perform sensitivity analysis and quantify the effect of various perturbations to statistical models. We use the proposed sensitivity analysis procedure to systematically investigate the tenability of the non-ignorable missing at random (NMAR) assumption. Simulation studies are conducted to evaluate our methods, and a dataset is analyzed to illustrate the use of our diagnostic measures. PMID:24753718
Giordano, Bruno L.; Kayser, Christoph; Rousselet, Guillaume A.; Gross, Joachim; Schyns, Philippe G.
2016-01-01
Abstract We begin by reviewing the statistical framework of information theory as applicable to neuroimaging data analysis. A major factor hindering wider adoption of this framework in neuroimaging is the difficulty of estimating information theoretic quantities in practice. We present a novel estimation technique that combines the statistical theory of copulas with the closed form solution for the entropy of Gaussian variables. This results in a general, computationally efficient, flexible, and robust multivariate statistical framework that provides effect sizes on a common meaningful scale, allows for unified treatment of discrete, continuous, unidimensional and multidimensional variables, and enables direct comparisons of representations from behavioral and brain responses across any recording modality. We validate the use of this estimate as a statistical test within a neuroimaging context, considering both discrete stimulus classes and continuous stimulus features. We also present examples of analyses facilitated by these developments, including application of multivariate analyses to MEG planar magnetic field gradients, and pairwise temporal interactions in evoked EEG responses. We show the benefit of considering the instantaneous temporal derivative together with the raw values of M/EEG signals as a multivariate response, how we can separately quantify modulations of amplitude and direction for vector quantities, and how we can measure the emergence of novel information over time in evoked responses. Open‐source Matlab and Python code implementing the new methods accompanies this article. Hum Brain Mapp 38:1541–1573, 2017. © 2016 Wiley Periodicals, Inc. PMID:27860095
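The core of the estimator can be sketched in a few lines for a single pair of continuous variables: rank-transform each variable to its empirical copula, map it through the inverse Gaussian CDF, and apply the Gaussian mutual-information formula to the resulting correlation. The toy stimulus-response data below are invented, and the published toolbox additionally handles multivariate, discrete, and bias-corrected cases.

```python
import numpy as np
from scipy.stats import norm, rankdata

rng = np.random.default_rng(4)

def copula_normalize(x):
    """Map each sample to its empirical-copula Gaussian value."""
    ranks = rankdata(x)
    return norm.ppf(ranks / (len(x) + 1))

# Two toy "stimulus" and "response" variables with a nonlinear dependence.
stim = rng.uniform(-1, 1, 5000)
resp = np.tanh(3 * stim) + 0.3 * rng.normal(size=5000)

gx, gy = copula_normalize(stim), copula_normalize(resp)
r = np.corrcoef(gx, gy)[0, 1]
mi_bits = -0.5 * np.log2(1 - r**2)   # Gaussian MI of the copula-transformed variables
print(f"Gaussian-copula MI ≈ {mi_bits:.2f} bits")
```

Because the copula step discards the marginal distributions, the estimate is robust to monotonic nonlinearities in either variable, which is the practical advantage highlighted in the abstract.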
A Stochastic Framework for Modeling the Population Dynamics of Convective Clouds
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hagos, Samson; Feng, Zhe; Plant, Robert S.
A stochastic prognostic framework for modeling the population dynamics of convective clouds and representing them in climate models is proposed. The approach follows non-equilibrium statistical mechanics through a master equation. The aim is to represent the evolution of the number of convective cells of a specific size and their associated cloud-base mass flux, given a large-scale forcing. In this framework, referred to as the STOchastic framework for Modeling Population dynamics of convective clouds (STOMP), the evolution of convective cell size is predicted from three key characteristics: (i) the probability of growth, (ii) the probability of decay, and (iii) the cloud-base mass flux. STOMP models are constructed and evaluated against CPOL radar observations at Darwin and convection-permitting model (CPM) simulations. Multiple models are constructed under various assumptions regarding these three key parameters, and the realism of each model is evaluated. It is shown that in a model where convective plumes prefer to aggregate spatially and mass flux is a non-linear function of convective cell area, mass flux manifests a recharge-discharge behavior under steady forcing. Such a model also reproduces the observed behavior of convective cell populations and CPM-simulated mass flux variability under diurnally varying forcing. Besides its use in developing understanding of convection processes and the controls on convective cell size distributions, this modeling framework is also designed to be capable of providing alternative, non-equilibrium, closure formulations for spectral mass flux parameterizations.
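As a toy analogue of the master-equation approach (not the STOMP model itself, which tracks cell sizes and cloud-base mass flux), the sketch below runs an exact Gillespie simulation of a birth-death process for the number of convective cells under steady forcing; the rates are arbitrary illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def gillespie_birth_death(birth_rate, death_rate, n0, t_end):
    """Exact stochastic simulation of a cell-count birth-death process."""
    t, n = 0.0, n0
    times, counts = [t], [n]
    while t < t_end:
        total = birth_rate + death_rate * n   # total event rate
        t += rng.exponential(1.0 / total)
        if rng.random() < birth_rate / total:
            n += 1                            # a new convective cell forms
        else:
            n -= 1                            # an existing cell decays
        times.append(t)
        counts.append(n)
    return np.array(times), np.array(counts)

# Steady "forcing": constant birth rate, per-cell decay rate.
# Expected steady-state mean cell count is birth_rate / death_rate = 20.
times, counts = gillespie_birth_death(birth_rate=2.0, death_rate=0.1, n0=10, t_end=200.0)
print("final count:", counts[-1], " mean over events:", counts.mean())
```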
A theoretical framework for the associations between identity and psychopathology.
Klimstra, Theo A; Denissen, Jaap J A
2017-11-01
Identity research largely emerged from clinical observations. Decades of empirical work advanced the field in refining existing approaches and adding new approaches. Furthermore, the existence of linkages of identity with psychopathology is now well established. Unfortunately, both the directionality of effects between identity aspects and psychopathology symptoms, and the mechanisms underlying associations are unclear. In the present paper, we present a new framework to inspire hypothesis-driven empirical research to overcome this limitation. The framework has a basic resemblance to theoretical models for the study of personality and psychopathology, so we provide examples of how these might apply to the study of identity. Next, we explain that unique features of identity may come into play in individuals suffering from psychopathology that are mostly related to the content of one's identity. These include pros and cons of identifying with one's diagnostic label. Finally, inspired by Hermans' dialogical self theory and principles derived from Piaget's, Swann's and Kelly's work, we delineate a framework with identity at the core of an individual multidimensional space. In this space, psychopathology symptoms have a known distance (representing relevance) to one's identity, and individual multidimensional spaces are connected to those of other individuals in one's social network. We discuss methodological (quantitative and qualitative, idiographic and nomothetic) and statistical procedures (multilevel models and network models) to test the framework. Resulting evidence can boost the field of identity research in demonstrating its high practical relevance for the emergence and conservation of psychopathology. (PsycINFO Database Record (c) 2017 APA, all rights reserved).
To b or not to b? A nonextensive view of b-value in the Gutenberg-Richter law.
NASA Astrophysics Data System (ADS)
Vallianatos, Filippos
2014-05-01
The Gutenberg-Richter (GR) law (Gutenberg and Richter, 1944), one of the cornerstones of modern seismology, has been considered a paradigm of self-organized criticality, since the dependence of the cumulative number of earthquakes on energy, i.e., the number of earthquakes with energy greater than E, behaves as a power law with the b value related to the critical exponent. A great number of seismic hazard studies have originated from this law. The Gutenberg-Richter (GR) law is an empirical relationship, which recent efforts relate to general physical principles (Kagan and Knopoff, 1981; Wesnousky, 1999; Sarlis et al., 2010; Telesca, 2012; Vallianatos and Sammonds, 2013). Nonextensive statistical mechanics, pioneered by Tsallis (Tsallis, 2009), provides a consistent theoretical framework for the study of complex systems in their nonequilibrium stationary states, systems with multifractal and self-similar structures, long-range interacting systems, etc. The Earth is such a system. In the present work we analyze the different pathways (originating in Sotolongo-Costa and Posadas, 2004; Silva et al., 2006) to extract the generalization of the G-R law as obtained in the framework of nonextensive statistical physics. We estimate the b-value and discuss its underlying physics. This research has been funded by the European Union (European Social Fund) and Greek national resources under the framework of the "THALES Program: SEISMO FEAR HELLARC" project of the "Education & Lifelong Learning" Operational Programme. References: Gutenberg, B. and C. F. Richter (1944). Bull. Seismol. Soc. Am. 34, 185-188. Kagan, Y. Y. and L. Knopoff (1981). J. Geophys. Res. 86, 2853-2862. Sarlis, N., E. Skordas and P. Varotsos (2010). Phys. Rev. E 82(2), 021110. Silva, R., G. Franca, C. Vilar and J. Alcaniz (2006). Phys. Rev. E 73, 026102. Sotolongo-Costa, O. and A. Posadas (2004). Phys. Rev. Lett. 92, 048501. Telesca, L. (2012). Bull. Seismol. Soc. Am. 102, 886-891. Tsallis, C. (2009). Introduction to Nonextensive Statistical Mechanics: Approaching a Complex World. Springer, New York. Vallianatos, F. and P. Sammonds (2013). Tectonophysics 590, 52-58. Wesnousky, S. G. (1999). Bull. Seismol. Soc. Am. 89, 1131-1137.
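For reference, the conventional (extensive) maximum-likelihood estimate of the b value, due to Aki, can be computed as below on a synthetic catalogue. This is the standard GR fit, not the nonextensive (Tsallis) formulation analyzed in the work above, and the catalogue parameters are invented.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic catalogue drawn from a Gutenberg-Richter distribution with
# b = 1.0 above a completeness magnitude Mc (values are illustrative).
b_true, Mc, n = 1.0, 2.0, 5000
beta = b_true * np.log(10)                      # GR exponent in natural log
mags = Mc + rng.exponential(1.0 / beta, size=n)

# Aki maximum-likelihood estimator, with the usual half-bin correction
# when magnitudes are reported binned to dm.
dm = 0.1
mags_binned = np.round(mags / dm) * dm
b_hat = np.log10(np.e) / (mags_binned.mean() - (Mc - dm / 2))
print(f"estimated b ≈ {b_hat:.2f} (true value {b_true})")
```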
Framework for understanding the patterns of student difficulties in quantum mechanics
NASA Astrophysics Data System (ADS)
Marshman, Emily; Singh, Chandralekha
2015-12-01
[This paper is part of the Focused Collection on Upper Division Physics Courses.] Compared with introductory physics, relatively little is known about the development of expertise in advanced physics courses, especially in the case of quantum mechanics. Here, we describe a framework for understanding the patterns of student reasoning difficulties and how students develop expertise in quantum mechanics. The framework posits that the challenges many students face in developing expertise in quantum mechanics are analogous to the challenges introductory students face in developing expertise in introductory classical mechanics. This framework incorporates both the effects of diversity in upper-level students' prior preparation, goals, and motivation in general (i.e., the facts that even in upper-level courses, students may be inadequately prepared, have unclear goals, and have insufficient motivation to excel) as well as the "paradigm shift" from classical mechanics to quantum mechanics. The framework is based on empirical investigations demonstrating that the patterns of reasoning, problem-solving, and self-monitoring difficulties in quantum mechanics bear a striking resemblance to those found in introductory classical mechanics. Examples from research in quantum mechanics and introductory classical mechanics are discussed to illustrate how the patterns of difficulties are analogous as students learn to unpack the respective principles and grasp the formalism in each knowledge domain during the development of expertise. Embracing such a framework and contemplating the parallels between the difficulties in these two knowledge domains can enable researchers to leverage the extensive literature for introductory physics education research to guide the design of teaching and learning tools for helping students develop expertise in quantum mechanics.
ERIC Educational Resources Information Center
Kohli, Nidhi; Koran, Jennifer; Henn, Lisa
2015-01-01
There are well-defined theoretical differences between the classical test theory (CTT) and item response theory (IRT) frameworks. It is understood that in the CTT framework, person and item statistics are test- and sample-dependent. This is not the perception with IRT. For this reason, the IRT framework is considered to be theoretically superior…
Schneebeli, Esther; Brägger, Urs; Scherrer, Susanne S; Keller, Andrea; Wittneben, Julia G; Hicklin, Stefan P
2017-07-01
The aim of this study was to assess and compare quality as well as economic aspects of CAD/CAM high strength ceramic three-unit FDP frameworks ordered from dental laboratories located in emerging countries and Switzerland. The master casts of six cases were sent to five dental laboratories located in Thailand (Bangkok), China (Peking and Shenzhen), Turkey (Izmir), and Switzerland (Bern). Each laboratory was using a different CAD/CAM system. The clinical fit of the frameworks was qualitatively assessed, and the thickness of the framework material, the connector height, the width, and the diameter were evaluated using a measuring sensor. The analysis of the internal fit of the frameworks was performed by means of a replica technique, whereas the inner and outer surfaces of the frameworks were evaluated for traces of postprocessing and damage to the intaglio surface with light and electronic microscopes. Groups (dental laboratories and cases) were compared for statistically significant differences using Mann-Whitney U-tests after Bonferroni correction. An acceptable clinical fit was found at 97.9% of the margins produced in laboratory E, 87.5% in B, 93.7% in C, 79.2% in A, and 62.5% in D. The mean framework thicknesses were not statistically significantly different for the premolar regions; however, for the molar area 4/8 of the evaluated sites were statistically significantly different. Circumference, surface, and width of the connectors produced in the different laboratories were statistically significantly different but not the height. There were great differences in the designs for the pontic and connector regions, and some of the frameworks would not be recommended for clinical use. Traces of heavy postprocessing were found in frameworks from some of the laboratories. The prices per framework ranged from US$177 to US$896. By ordering laboratory work in developing countries, a considerable price reduction was obtained compared to the price level in Switzerland. Despite the use of the standardized CAD/CAM chains of production in all laboratories, a large variability in the quality aspects, such as clinical marginal fit, connector and pontic design, as well as postprocessing traces was noted. Recommended sound handling of postprocessing was not applied in all laboratories. Dentists should be aware of the true and factitious advantages of CAD/CAM production chains and not lose control over the process. © 2015 by the American College of Prosthodontists.
ERIC Educational Resources Information Center
Leavy, Aisling; Hourigan, Mairead
2016-01-01
We argue that the development of statistical literacy is greatly supported by engaging students in carrying out statistical investigations. We describe the use of driving questions and interesting contexts to motivate two statistical investigations. The PPDAC cycle is used as an organizing framework to support the process of statistical investigation.
From classical to quantum mechanics: ``How to translate physical ideas into mathematical language''
NASA Astrophysics Data System (ADS)
Bergeron, H.
2001-09-01
Following previous works by E. Prugovečki [Physica A 91A, 202 (1978) and Stochastic Quantum Mechanics and Quantum Space-time (Reidel, Dordrecht, 1986)] on common features of classical and quantum mechanics, we develop a unified mathematical framework for classical and quantum mechanics (based on L²-spaces over classical phase space), in order to investigate to what extent quantum mechanics can be obtained as a simple modification of classical mechanics (on both logical and analytical levels). To obtain this unified framework, we split quantum theory into two parts: (i) general quantum axiomatics (a system is described by a state in a Hilbert space, observables are self-adjoint operators, and so on) and (ii) quantum mechanics proper, which specifies the Hilbert space as L²(Rⁿ), the Heisenberg rule [pᵢ, qⱼ] = -iℏδᵢⱼ with p = -iℏ∇, the free Hamiltonian H = -ℏ²Δ/2m, and so on. We show that general quantum axiomatics (up to a supplementary "axiom of classicity") can be used as a nonstandard mathematical ground to formulate physical ideas and equations of ordinary classical statistical mechanics. So, the question of a "true quantization" with "ℏ" must be seen as an independent physical problem, not directly related to the quantum formalism. At this stage, we show that this nonstandard formulation of classical mechanics exhibits a new kind of operation that has no classical counterpart: this operation is related to the "quantization process," and we show why quantization physically depends on group theory (the Galilei group). This analytical procedure of quantization replaces the "correspondence principle" (or canonical quantization) and allows us to map classical mechanics into quantum mechanics, giving all operators of quantum dynamics and the Schrödinger equation. The great advantage of this point of view is that quantization is based on concrete physical arguments and not derived from some "pure algebraic rule" (we also exhibit some limits of the correspondence principle). Moreover, spins for particles are naturally generated, including an approximation of their interaction with magnetic fields. We also recover by this approach the semi-classical formalism developed by E. Prugovečki [Stochastic Quantum Mechanics and Quantum Space-time (Reidel, Dordrecht, 1986)].
Entropy Production and Fluctuation Theorems for Active Matter
NASA Astrophysics Data System (ADS)
Mandal, Dibyendu; Klymko, Katherine; DeWeese, Michael R.
2017-12-01
Active biological systems reside far from equilibrium, dissipating heat even in their steady state, thus requiring an extension of conventional equilibrium thermodynamics and statistical mechanics. In this Letter, we have extended the emerging framework of stochastic thermodynamics to active matter. In particular, for the active Ornstein-Uhlenbeck model, we have provided consistent definitions of thermodynamic quantities such as work, energy, heat, entropy, and entropy production at the level of single, stochastic trajectories and derived related fluctuation relations. We have developed a generalization of the Clausius inequality, which is valid even in the presence of the non-Hamiltonian dynamics underlying active matter systems. We have illustrated our results with explicit numerical studies.
Generalized Dynamic Equations Related to Condensation and Freezing Processes
NASA Astrophysics Data System (ADS)
Wang, Xingrong; Huang, Yong
2018-01-01
The generalized thermodynamic equation related to condensation and freezing processes was derived by introducing the condensation and freezing probability function into the dynamic framework based on the statistical thermodynamic fluctuation theory. As a result, the physical mechanism of some weather phenomena covered by using
Portraits of self-organization in fish schools interacting with robots
NASA Astrophysics Data System (ADS)
Aureli, M.; Fiorilli, F.; Porfiri, M.
2012-05-01
In this paper, we propose an enabling computational and theoretical framework for the analysis of experimental instances of collective behavior in response to external stimuli. In particular, this work addresses the characterization of aggregation and interaction phenomena in robot-animal groups through the exemplary analysis of fish schooling in the vicinity of a biomimetic robot. We adapt global observables from statistical mechanics to capture the main features of the shoal collective motion and its response to the robot from experimental observations. We investigate the shoal behavior by using a diffusion mapping analysis performed on these global observables that also informs the definition of relevant portraits of self-organization.
Mechanics of Lipid Bilayer Membranes
NASA Astrophysics Data System (ADS)
Powers, Thomas R.
All cells have membranes. The plasma membrane encapsulates the cell's interior, acting as a barrier against the outside world. In cells with nuclei (eukaryotic cells), membranes also form internal compartments (organelles) which carry out specialized tasks, such as protein modification and sorting in the case of the Golgi apparatus, and ATP production in the case of mitochondria. The main components of membranes are lipids and proteins. The proteins can be channels, carriers, receptors, catalysts, signaling molecules, or structural elements, and typically contribute a substantial fraction of the total membrane dry weight. The equilibrium properties of pure lipid membranes are relatively well-understood, and will be the main focus of this article. The framework of elasticity theory and statistical mechanics that we will develop will serve as the foundation for understanding biological phenomena such as the nonequilibrium behavior of membranes laden with ion pumps, the role of membrane elasticity in ion channel gating, and the dynamics of vesicle fission and fusion. Understanding the mechanics of lipid membranes is also important for drug encapsulation and delivery.
Mapping the Energy Cascade in the North Atlantic Ocean: The Coarse-graining Approach
Aluie, Hussein; Hecht, Matthew; Vallis, Geoffrey K.
2017-11-14
A coarse-graining framework is implemented to analyze nonlinear processes, measure energy transfer rates and map out the energy pathways from simulated global ocean data. Traditional tools to measure the energy cascade from turbulence theory, such as spectral flux or spectral transfer, rely on the assumption of statistical homogeneity, or at least a large separation between the scales of motion and the scales of statistical inhomogeneity. The coarse-graining framework allows for probing the fully nonlinear dynamics simultaneously in scale and in space, and is not restricted by those assumptions. This study describes how the framework can be applied to ocean flows.
Software for Data Analysis with Graphical Models
NASA Technical Reports Server (NTRS)
Buntine, Wray L.; Roy, H. Scott
1994-01-01
Probabilistic graphical models are being used widely in artificial intelligence and statistics, for instance, in diagnosis and expert systems, as a framework for representing and reasoning with probabilities and independencies. They come with corresponding algorithms for performing statistical inference. This offers a unifying framework for prototyping and/or generating data analysis algorithms from graphical specifications. This paper illustrates the framework with an example and then presents some basic techniques for the task: problem decomposition and the calculation of exact Bayes factors. Other tools already developed, such as automatic differentiation, Gibbs sampling, and use of the EM algorithm, make this a broad basis for the generation of data analysis software.
Metrics and Mappings: A Framework for Understanding Real-World Quantitative Estimation.
ERIC Educational Resources Information Center
Brown, Norman R.; Siegler, Robert S.
1993-01-01
A metrics and mapping framework is proposed to account for how heuristics, domain-specific reasoning, and intuitive statistical induction processes are integrated to generate estimates. Results of 4 experiments involving 188 undergraduates illustrate framework usefulness and suggest when people use heuristics and when they emphasize…
A Statistical Framework for the Functional Analysis of Metagenomes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sharon, Itai; Pati, Amrita; Markowitz, Victor
2008-10-01
Metagenomic studies consider the genetic makeup of microbial communities as a whole, rather than their individual member organisms. The functional and metabolic potential of microbial communities can be analyzed by comparing the relative abundance of gene families in their collective genomic sequences (metagenome) under different conditions. Such comparisons require accurate estimation of gene family frequencies. They present a statistical framework for assessing these frequencies based on the Lander-Waterman theory developed originally for Whole Genome Shotgun (WGS) sequencing projects. They also provide a novel method for assessing the reliability of the estimations which can be used for removing seemingly unreliable measurements. They tested their method on a wide range of datasets, including simulated genomes and real WGS data from sequencing projects of whole genomes. Results suggest that their framework corrects inherent biases in accepted methods and provides a good approximation to the true statistics of gene families in WGS projects.
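For context, the textbook Lander-Waterman coverage statistics on which the abstract says the framework is built (the variable names here are assumptions): with N reads of length L sampled uniformly from a target of total length G,

```latex
\begin{equation}
  c = \frac{N L}{G}, \qquad
  P(\text{a given base is not covered}) \approx e^{-c},
\end{equation}
```

where c is the expected per-base coverage and the second relation follows from the Poisson approximation to the read-sampling process.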
Towards a resource-based habitat approach for spatial modelling of vector-borne disease risks.
Hartemink, Nienke; Vanwambeke, Sophie O; Purse, Bethan V; Gilbert, Marius; Van Dyck, Hans
2015-11-01
Given the veterinary and public health impact of vector-borne diseases, there is a clear need to assess the suitability of landscapes for the emergence and spread of these diseases. Current approaches for predicting disease risks neglect key features of the landscape as components of the functional habitat of vectors or hosts, and hence of the pathogen. Empirical-statistical methods do not explicitly incorporate biological mechanisms, whereas current mechanistic models are rarely spatially explicit; both methods ignore the way animals use the landscape (i.e. movement ecology). We argue that applying a functional concept for habitat, i.e. the resource-based habitat concept (RBHC), can solve these issues. The RBHC offers a framework to identify systematically the different ecological resources that are necessary for the completion of the transmission cycle and to relate these resources to (combinations of) landscape features and other environmental factors. The potential of the RBHC as a framework for identifying suitable habitats for vector-borne pathogens is explored and illustrated with the case of bluetongue virus, a midge-transmitted virus affecting ruminants. The concept facilitates the study of functional habitats of the interacting species (vectors as well as hosts) and provides new insight into spatial and temporal variation in transmission opportunities and exposure that ultimately determine disease risks. It may help to identify knowledge gaps and control options arising from changes in the spatial configuration of key resources across the landscape. The RBHC framework may act as a bridge between existing mechanistic and statistical modelling approaches. © 2014 The Authors. Biological Reviews published by John Wiley & Sons Ltd on behalf of Cambridge Philosophical Society.
A Unifying Framework for Teaching Nonparametric Statistical Tests
ERIC Educational Resources Information Center
Bargagliotti, Anna E.; Orrison, Michael E.
2014-01-01
Increased importance is being placed on statistics at both the K-12 and undergraduate level. Research divulging effective methods to teach specific statistical concepts is still widely sought after. In this paper, we focus on best practices for teaching topics in nonparametric statistics at the undergraduate level. To motivate the work, we…
ERIC Educational Resources Information Center
Lesser, Lawrence M.; Wagler, Amy E.; Esquinca, Alberto; Valenzuela, M. Guadalupe
2013-01-01
The framework of linguistic register and case study research on Spanish-speaking English language learners (ELLs) learning statistics informed the construction of a quantitative instrument, the Communication, Language, And Statistics Survey (CLASS). CLASS aims to assess whether ELLs and non-ELLs approach the learning of statistics differently with…
ERIC Educational Resources Information Center
Metz, Mary Louise
2010-01-01
Statistics education has become an increasingly important component of the mathematics education of today's citizens. In part to address the call for a more statistically literate citizenship, The "Guidelines for Assessment and Instruction in Statistics Education (GAISE)" were developed in 2005 by the American Statistical Association. These…
Inferring Demographic History Using Two-Locus Statistics.
Ragsdale, Aaron P; Gutenkunst, Ryan N
2017-06-01
Population demographic history may be learned from contemporary genetic variation data. Methods based on aggregating the statistics of many single loci into an allele frequency spectrum (AFS) have proven powerful, but such methods ignore potentially informative patterns of linkage disequilibrium (LD) between neighboring loci. To leverage such patterns, we developed a composite-likelihood framework for inferring demographic history from aggregated statistics of pairs of loci. Using this framework, we show that two-locus statistics are more sensitive to demographic history than single-locus statistics such as the AFS. In particular, two-locus statistics escape the notorious confounding of depth and duration of a bottleneck, and they provide a means to estimate effective population size based on the recombination rather than mutation rate. We applied our approach to a Zambian population of Drosophila melanogaster. Notably, using both single- and two-locus statistics, we inferred a substantially lower ancestral effective population size than previous works and did not infer a bottleneck history. Together, our results demonstrate the broad potential for two-locus statistics to enable powerful population genetic inference. Copyright © 2017 by the Genetics Society of America.
Sub-grid scale models for discontinuous Galerkin methods based on the Mori-Zwanzig formalism
NASA Astrophysics Data System (ADS)
Parish, Eric; Duraisamy, Karthik
2017-11-01
The optimal prediction framework of Chorin et al., which is a reformulation of the Mori-Zwanzig (M-Z) formalism of non-equilibrium statistical mechanics, provides a framework for the development of mathematically derived closure models. The M-Z formalism provides a methodology to reformulate a high-dimensional Markovian dynamical system as a lower-dimensional, non-Markovian (non-local) system. In this lower-dimensional system, the effects of the unresolved scales on the resolved scales are non-local and appear as a convolution integral. The non-Markovian system is an exact statement of the original dynamics and is used as a starting point for model development. In this work, we investigate the development of M-Z-based closure models within the context of the Variational Multiscale Method (VMS). The method relies on a decomposition of the solution space into two orthogonal subspaces. The impact of the unresolved subspace on the resolved subspace is shown to be non-local in time and is modeled through the M-Z formalism. The models are applied to hierarchical discontinuous Galerkin discretizations. Commonalities between the M-Z closures and conventional flux schemes are explored. This work was supported in part by AFOSR under the project "LES Modeling of Non-local effects using Statistical Coarse-graining" with Dr. Jean-Luc Cambier as the technical monitor.
Quantifying risks with exact analytical solutions of derivative pricing distribution
NASA Astrophysics Data System (ADS)
Zhang, Kun; Liu, Jing; Wang, Erkang; Wang, Jin
2017-04-01
Derivative (i.e. option) pricing is essential for modern financial instruments. Despite previous efforts, the exact analytical forms of the derivative pricing distributions are still challenging to obtain. In this study, we established a quantitative framework using path integrals to obtain the exact analytical solutions of the statistical distribution for bond and bond option pricing under the Vasicek model. We discuss the importance of statistical fluctuations away from the expected option pricing, characterized by the distribution tail, and their association with value at risk (VaR). The framework established here is general and can be applied to other financial derivatives for quantifying the underlying statistical distributions.
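For reference, the Vasicek short-rate model named in the abstract, in its standard stochastic-differential form (the parameter symbols are an assumed convention):

```latex
\begin{equation}
  dr_t = \kappa\,(\theta - r_t)\,dt + \sigma\,dW_t ,
\end{equation}
```

with mean-reversion speed κ, long-run level θ, volatility σ, and a standard Wiener process W_t; bond and bond-option prices are then functionals of paths of r_t, which is what makes a path-integral treatment natural.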
2013-03-18
Stability and degradation mechanisms of metal–organic frameworks containing the Zr6O4(OH)4 secondary building unit. Report title: see publication. Approved for public release; distribution is unlimited.
NASA Astrophysics Data System (ADS)
Jajcay, N.; Kravtsov, S.; Tsonis, A.; Palus, M.
2017-12-01
A better understanding of dynamics in complex systems, such as the Earth's climate, is one of the key challenges for contemporary science and society. A large amount of experimental data requires new mathematical and computational approaches. Natural complex systems vary on many temporal and spatial scales, often exhibiting recurring patterns and quasi-oscillatory phenomena. The statistical inference of causal interactions and synchronization between dynamical phenomena evolving on different temporal scales is of vital importance for a better understanding of the underlying mechanisms and a key for the modeling and prediction of such systems. This study introduces and applies information-theoretic diagnostics to phase and amplitude time series of different wavelet components of the observed data that characterize El Niño. A suite of significant interactions between processes operating on different time scales was detected, and intermittent synchronization among different time scales has been associated with extreme El Niño events. The mechanisms of these nonlinear interactions were further studied in conceptual low-order and state-of-the-art dynamical, as well as statistical, climate models. Observed and simulated interactions exhibit substantial discrepancies, whose understanding may be the key to an improved prediction. Moreover, the statistical framework which we apply here is suitable for directly inferring cross-scale interactions in nonlinear time series from complex systems such as the terrestrial magnetosphere, solar-terrestrial interactions, seismic activity or even human brain dynamics.
Chapter two: Phenomenology of tsunamis II: scaling, event statistics, and inter-event triggering
Geist, Eric L.
2012-01-01
Observations related to tsunami catalogs are reviewed and described in a phenomenological framework. An examination of scaling relationships between earthquake size (as expressed by scalar seismic moment and mean slip) and tsunami size (as expressed by mean and maximum local run-up and maximum far-field amplitude) indicates that scaling is significant at the 95% confidence level, although there is uncertainty in how well earthquake size can predict tsunami size (R2 ~ 0.4-0.6). In examining tsunami event statistics, current methods used to estimate the size distribution of earthquakes and landslides and the inter-event time distribution of earthquakes are first reviewed. These methods are adapted to estimate the size and inter-event distribution of tsunamis at a particular recording station. Using a modified Pareto size distribution, the best-fit power-law exponents of tsunamis recorded at nine Pacific tide-gauge stations exhibit marked variation, in contrast to the approximately constant power-law exponent for inter-plate thrust earthquakes. With regard to the inter-event time distribution, significant temporal clustering of tsunami sources is demonstrated. For tsunami sources occurring in close proximity to other sources in both space and time, a physical triggering mechanism, such as static stress transfer, is a likely cause for the anomalous clustering. Mechanisms of earthquake-to-earthquake and earthquake-to-landslide triggering are reviewed. Finally, a modification of statistical branching models developed for earthquake triggering is introduced to describe triggering among tsunami sources.
Dynamic and thermodynamic processes driving the January 2014 precipitation record in southern UK
NASA Astrophysics Data System (ADS)
Oueslati, B.; Yiou, P.; Jezequel, A.
2017-12-01
Regional extreme precipitation is projected to intensify in response to planetary climate change, with important impacts on societies. Understanding and anticipating these events remain a major challenge. In this study, we revisit the mechanisms of the record winter precipitation that occurred in the southern United Kingdom in January 2014. The physical drivers of this event are analyzed using the water vapor budget. Precipitation changes are decomposed into dynamic contributions, related to changes in atmospheric circulation, and thermodynamic contributions, related to changes in water vapor. We attempt to quantify the relative importance of the two contributions during this event and examine the applicability of Clausius-Clapeyron scaling. This work provides a physical interpretation of the mechanisms associated with southern UK's wettest event, which is complementary to other studies based on statistical approaches (Schaller et al., 2016; Yiou et al., 2017). The analysis is carried out using the ERA-Interim reanalysis, motivated by the horizontal resolution of this dataset. It is then applied to present-day simulations and future projections of CMIP5 models for selected extreme precipitation events in southern UK that are comparable to January 2014 in terms of atmospheric circulation. References: Schaller, N. et al. Human influence on climate in the 2014 southern England winter floods and their impacts, Nature Clim. Change, 2016, 6, 627-634; Yiou, P., et al. A statistical framework for conditional extreme event attribution, Advances in Statistical Climatology, Meteorology and Oceanography, 2017, 3, 17-31.
Ince, Robin A A; Giordano, Bruno L; Kayser, Christoph; Rousselet, Guillaume A; Gross, Joachim; Schyns, Philippe G
2017-03-01
We begin by reviewing the statistical framework of information theory as applicable to neuroimaging data analysis. A major factor hindering wider adoption of this framework in neuroimaging is the difficulty of estimating information theoretic quantities in practice. We present a novel estimation technique that combines the statistical theory of copulas with the closed form solution for the entropy of Gaussian variables. This results in a general, computationally efficient, flexible, and robust multivariate statistical framework that provides effect sizes on a common meaningful scale, allows for unified treatment of discrete, continuous, unidimensional and multidimensional variables, and enables direct comparisons of representations from behavioral and brain responses across any recording modality. We validate the use of this estimate as a statistical test within a neuroimaging context, considering both discrete stimulus classes and continuous stimulus features. We also present examples of analyses facilitated by these developments, including application of multivariate analyses to MEG planar magnetic field gradients, and pairwise temporal interactions in evoked EEG responses. We show the benefit of considering the instantaneous temporal derivative together with the raw values of M/EEG signals as a multivariate response, how we can separately quantify modulations of amplitude and direction for vector quantities, and how we can measure the emergence of novel information over time in evoked responses. Open-source Matlab and Python code implementing the new methods accompanies this article. Hum Brain Mapp 38:1541-1573, 2017. © 2016 The Authors. Human Brain Mapping published by Wiley Periodicals, Inc.
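The abstract mentions accompanying open-source Matlab and Python code; the snippet below is not that code but an independent minimal sketch of the Gaussian-copula idea it describes: rank-transform each variable, map the ranks to standard-normal quantiles, and estimate mutual information from Gaussian covariance log-determinants. The function names and the absence of bias correction are assumptions.

```python
import numpy as np
from scipy.stats import norm, rankdata

def copula_normalize(x):
    """Rank-transform each row of x and map the ranks to standard-normal quantiles."""
    x = np.atleast_2d(np.asarray(x, dtype=float))
    ranks = np.apply_along_axis(rankdata, 1, x)
    return norm.ppf(ranks / (x.shape[1] + 1))

def gaussian_copula_mi(x, y):
    """Mutual information (in bits) estimated from Gaussian entropies of the
    copula-normalized data, via covariance log-determinants."""
    gx, gy = copula_normalize(x), copula_normalize(y)
    gxy = np.vstack([gx, gy])
    logdet = lambda d: np.linalg.slogdet(np.atleast_2d(np.cov(d)))[1]
    return 0.5 * (logdet(gx) + logdet(gy) - logdet(gxy)) / np.log(2)

rng = np.random.default_rng(0)
a = rng.normal(size=2000)
b = 0.5 * a + rng.normal(size=2000)   # weakly coupled signal
print(gaussian_copula_mi(a, b))       # positive MI estimate, in bits
```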
A Framework for Authenticity in the Mathematics and Statistics Classroom
ERIC Educational Resources Information Center
Garrett, Lauretta; Huang, Li; Charleton, Maria Calhoun
2016-01-01
Authenticity is a term commonly used in reference to pedagogical and curricular qualities of mathematics teaching and learning, but its use lacks a coherent framework. The work of researchers in engineering education provides such a framework. Authentic qualities of mathematics teaching and learning are fit within a model described by Strobel,…
Folding thermodynamics of pseudoknotted chain conformations
Kopeikin, Zoia; Chen, Shi-Jie
2008-01-01
We develop a statistical mechanical framework for the folding thermodynamics of pseudoknotted structures. As applications of the theory, we investigate the folding stability and the free energy landscapes for both the thermal and the mechanical unfolding of pseudoknotted chains. For the mechanical unfolding process, we predict the force-extension curves, from which we can obtain the information about structural transitions in the unfolding process. In general, a pseudoknotted structure unfolds through multiple structural transitions. The interplay between the helix stems and the loops plays an important role in the folding stability of pseudoknots. For instance, variations in loop sizes can lead to the destabilization of some intermediate states and change the (equilibrium) folding pathways (e.g., two helix stems unfold either cooperatively or sequentially). In both thermal and mechanical unfolding, depending on the nucleotide sequence, misfolded intermediate states can emerge in the folding process. In addition, thermal and mechanical unfoldings often have different (equilibrium) pathways. For example, for certain sequences, the misfolded intermediates, which generally have longer tails, can fold, unfold, and refold again in the pulling process, which means that these intermediates can switch between two different average end-end extensions. PMID:16674261
Watanabe, Hiroshi C; Kubillus, Maximilian; Kubař, Tomáš; Stach, Robert; Mizaikoff, Boris; Ishikita, Hiroshi
2017-07-21
In the condensed phase, quantum chemical properties such as many-body effects and intermolecular charge fluctuations are critical determinants of the solvation structure and dynamics. Thus, a quantum mechanical (QM) molecular description is required for both solute and solvent to incorporate these properties. However, it is challenging to conduct molecular dynamics (MD) simulations for condensed systems of sufficient scale when adapting QM potentials. To overcome this problem, we recently developed the size-consistent multi-partitioning (SCMP) quantum mechanics/molecular mechanics (QM/MM) method and realized stable and accurate MD simulations using the QM potential for a benchmark system. In the present study, as the first application of the SCMP method, we have investigated the structures and dynamics of Na+, K+, and Ca2+ solutions based on nanosecond-scale sampling, a sampling 100 times longer than that of conventional QM-based samplings. Furthermore, we have evaluated two dynamic properties, the diffusion coefficient and difference spectra, with high statistical certainty; the calculation of these properties has not previously been possible within the conventional QM/MM framework. Based on our analysis, we have quantitatively evaluated the quantum chemical solvation effects, which show distinct differences between the cations.
The computational nature of memory modification
Gershman, Samuel J; Monfils, Marie-H; Norman, Kenneth A; Niv, Yael
2017-01-01
Retrieving a memory can modify its influence on subsequent behavior. We develop a computational theory of memory modification, according to which modification of a memory trace occurs through classical associative learning, but which memory trace is eligible for modification depends on a structure learning mechanism that discovers the units of association by segmenting the stream of experience into statistically distinct clusters (latent causes). New memories are formed when the structure learning mechanism infers that a new latent cause underlies current sensory observations. By the same token, old memories are modified when old and new sensory observations are inferred to have been generated by the same latent cause. We derive this framework from probabilistic principles, and present a computational implementation. Simulations demonstrate that our model can reproduce the major experimental findings from studies of memory modification in the Pavlovian conditioning literature. DOI: http://dx.doi.org/10.7554/eLife.23763.001 PMID:28294944
NASA Technical Reports Server (NTRS)
Rodriguez, Guillermo (Editor)
1990-01-01
Various papers on intelligent control and adaptive systems are presented. Individual topics addressed include: control architecture for a Mars walking vehicle, representation for error detection and recovery in robot task plans, real-time operating system for robots, execution monitoring of a mobile robot system, statistical mechanics models for motion and force planning, global kinematics for manipulator planning and control, exploration of unknown mechanical assemblies through manipulation, low-level representations for robot vision, harmonic functions for robot path construction, simulation of dual behavior of an autonomous system. Also discussed are: control framework for hand-arm coordination, neural network approach to multivehicle navigation, electronic neural networks for global optimization, neural network for L1 norm linear regression, planning for assembly with robot hands, neural networks in dynamical systems, control design with iterative learning, improved fuzzy process control of spacecraft autonomous rendezvous using a genetic algorithm.
Waste management CDM projects barriers NVivo 10® qualitative dataset.
Bufoni, André Luiz; de Sousa Ferreira, Aracéli Cristina; Oliveira, Luciano Basto
2017-12-01
This article contains one NVivo 10® file with the complete 432 project design documents (PDDs) of seven waste management sector industries registered as Clean Development Mechanism (CDM) projects under the United Nations Framework Convention on Climate Change (UNFCCC) Kyoto Protocol initiative from 2004 to 2014. All data analyses and sample statistics made during the research remain in the file. We coded the PDDs into 890 fragments of text, classified in five categories of barriers (nodes): technological, financial, human resources, regulatory, and socio-political. The data support the findings of the author's thesis [1] and two other indexed publications in the Waste Management journal: "The financial attractiveness assessment of large waste management projects registered as clean development mechanism" and "The declared barriers of the large developing countries waste management projects: The STAR model" [2], [3]. The data allow any computer-assisted qualitative content analysis (CAQCA) on the sector and are available at Mendeley [4].
Chung, Dongjun; Kim, Hang J; Zhao, Hongyu
2017-02-01
Genome-wide association studies (GWAS) have identified tens of thousands of genetic variants associated with hundreds of phenotypes and diseases, which have provided clinical and medical benefits to patients through novel biomarkers and therapeutic targets. However, identification of risk variants associated with complex diseases remains challenging as they are often affected by many genetic variants with small or moderate effects. There has been accumulating evidence suggesting that different complex traits share a common risk basis, namely pleiotropy. Recently, several statistical methods have been developed to improve statistical power to identify risk variants for complex traits through a joint analysis of multiple GWAS datasets by leveraging pleiotropy. While these methods were shown to improve statistical power for association mapping compared to separate analyses, they are still limited in the number of phenotypes that can be integrated. In order to address this challenge, in this paper, we propose a novel statistical framework, graph-GPA, to integrate a large number of GWAS datasets for multiple phenotypes using a hidden Markov random field approach. Application of graph-GPA to a joint analysis of GWAS datasets for 12 phenotypes shows that graph-GPA improves statistical power to identify risk variants compared to statistical methods based on a smaller number of GWAS datasets. In addition, graph-GPA also promotes better understanding of genetic mechanisms shared among phenotypes, which can potentially be useful for the development of improved diagnosis and therapeutics. The R implementation of graph-GPA is currently available at https://dongjunchung.github.io/GGPA/.
Statistical Learning and Language: An Individual Differences Study
ERIC Educational Resources Information Center
Misyak, Jennifer B.; Christiansen, Morten H.
2012-01-01
Although statistical learning and language have been assumed to be intertwined, this theoretical presupposition has rarely been tested empirically. The present study investigates the relationship between statistical learning and language using a within-subject design embedded in an individual-differences framework. Participants were administered…
A Statistical Test for Comparing Nonnested Covariance Structure Models.
ERIC Educational Resources Information Center
Levy, Roy; Hancock, Gregory R.
While statistical procedures are well known for comparing hierarchically related (nested) covariance structure models, statistical tests for comparing nonhierarchically related (nonnested) models have proven more elusive. While isolated attempts have been made, none exists within the commonly used maximum likelihood estimation framework, thereby…
Characterization of protein folding by a Φ-value calculation with a statistical-mechanical model.
Wako, Hiroshi; Abe, Haruo
2016-01-01
The Φ-value analysis approach provides information about transition-state structures along the folding pathway of a protein by measuring the effects of an amino acid mutation on folding kinetics. Here we compared the theoretically calculated Φ values of 27 proteins with their experimentally observed Φ values; the theoretical values were calculated using a simple statistical-mechanical model of protein folding. The theoretically calculated Φ values reflected the corresponding experimentally observed Φ values with reasonable accuracy for many of the proteins, but not for all. The correlation between the theoretically calculated and experimentally observed Φ values strongly depends on whether the protein-folding mechanism assumed in the model holds true in real proteins. In other words, the correlation coefficient can be expected to illuminate the folding mechanisms of proteins, providing the answer to the question of which model more accurately describes protein folding: the framework model or the nucleation-condensation model. In addition, we tried to characterize protein folding with respect to various properties of each protein apart from the size and fold class, such as the free-energy profile, contact-order profile, and sensitivity to the parameters used in the Φ-value calculation. The results showed that any one of these properties alone was not enough to explain protein folding, although each one played a significant role in it. We have confirmed the importance of characterizing protein folding from various perspectives. Our findings have also highlighted that protein folding is highly variable and unique across different proteins, and this should be considered while pursuing a unified theory of protein folding.
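For readers unfamiliar with the quantity, the standard definition of the Φ-value underlying the comparison described above (notation assumed):

```latex
\begin{equation}
  \Phi \;=\; \frac{\Delta\Delta G_{\ddagger-\mathrm{U}}}{\Delta\Delta G_{\mathrm{N}-\mathrm{U}}},
\end{equation}
```

where ΔΔG‡–U is the mutation-induced change in the free-energy gap between the transition state and the unfolded state (obtained from folding kinetics) and ΔΔG N–U is the corresponding change between the native and unfolded states (obtained from equilibrium stability); Φ ≈ 1 indicates a native-like environment of the mutated residue in the transition state, while Φ ≈ 0 indicates an unfolded-like one.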
Accurate Black Hole Spin Measurements using ABC
NASA Astrophysics Data System (ADS)
Connolly, Andrew
Measuring the spin of black holes provides important insights into the supernova formation mechanism of stellar-mass black holes, galaxy merger scenarios for supermassive black holes, and the launching mechanisms of ballistic jets. It is therefore of crucial importance to measure black hole spins to a high degree of accuracy. Stellar-mass black holes in binary systems (BHBs) have two major advantages over Active Galactic Nuclei (AGN): (1) owing to their proximity and brightness, observations of BHBs are not as limited by counting statistics as their supermassive counterparts; (2) unlike in AGN, one can use two largely independent methods to measure the spin in BHBs, providing a check on spin measurements. However, the high flux that makes BHBs such excellent targets for spin measurements also proves to be their Achilles heel: modern CCD cameras are optimized for observing faint sources. Consequently, observations of bright BHBs with CCD cameras are subject to non-linear instrumental effects, among them pile-up and grade migration, that strongly distort the spectrum. Since spin measurements rely on a very precise model of both the continuum X-ray flux and the disc reflection signatures superimposed on top of it, these instrumental effects may cause inferred spin measurements to differ by a factor of two or more. Current mitigation strategies aim to remove instrumental effects either during the observations themselves, by requiring simultaneous observations with multiple telescopes, or in post-processing. Even when these techniques are employed, pile-up may remain unrecognized and still distort results, whereas mitigation strategies may introduce additional systematic biases, e.g. due to increased (cross-)calibration uncertainties. Advances in modern statistical methodology allow for efficient modeling of instrumental effects during the analysis stage, largely eliminating the requirement for observations with multiple instruments or increased observation time. In particular, a class of methods collectively called Approximate Bayesian Computation (ABC) is capable of exploiting the fact that it is possible to simulate instrumental effects to a high degree of accuracy in order to build reliable statistical models incorporating pile-up and related effects. With the loss of the Hitomi spacecraft, it is more important than ever to make full use of the data we collect with current instruments. We propose an ambitious program to estimate the spins of 13 black holes in X-ray binaries using observations with XMM-Newton's EPIC MOS and pn, Suzaku's XIS, and Chandra's ACIS and HETG instruments. We will build a general framework for dealing with pile-up in spectral modeling using ABC and refine current instrumental simulators for inclusion in this framework. Coupled with state-of-the-art sampling methods, this will allow us to take advantage of dozens of observations in the archives of all three instruments. We will be able to estimate spins to much better accuracy than ever before and test current models for black hole formation as well as jet launching mechanisms. The program will deliver a considerable legacy, because the statistical and methodological framework will be general. Application to other instruments suffering from photon pile-up, e.g. Swift/XRT, Fermi/GBM, ASCA/SIS, and GALEX, will only require a model capable of simulating the relevant instrumental effects.
This will enable other science cases beyond those proposed here that rely on precise spectral measurements, or cases where pile-up cannot be avoided, e.g. high-precision radius measurements in neutron stars, understanding X-ray dust scattering, and stellar evolution studies of globular clusters.
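As a hedged illustration of the Approximate Bayesian Computation idea described above (not the proposal's actual pipeline), a minimal rejection-ABC loop: draw parameters from a prior, push them through a forward simulator that includes instrumental distortion, and keep draws whose simulated summary statistics fall within a tolerance of the observed ones. The simulator, summary statistic, prior range, and tolerance below are placeholders.

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate_spectrum(spin, n_photons=2000):
    """Placeholder forward model: stands in for a simulator of the source spectrum
    plus instrumental effects such as pile-up (hypothetical, for illustration only)."""
    return rng.normal(loc=spin, scale=0.3, size=n_photons)

def summary(data):
    """Placeholder summary statistic (here simply the sample mean)."""
    return np.mean(data)

def rejection_abc(observed, n_draws=5000, tol=0.02):
    """Keep prior draws whose simulated summary lies within `tol` of the observed one."""
    s_obs = summary(observed)
    accepted = []
    for _ in range(n_draws):
        spin = rng.uniform(0.0, 0.998)   # flat prior on the dimensionless spin
        if abs(summary(simulate_spectrum(spin)) - s_obs) < tol:
            accepted.append(spin)
    return np.array(accepted)            # approximate posterior samples

posterior = rejection_abc(simulate_spectrum(0.7))
print(posterior.mean(), posterior.std())
```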
ON THE DYNAMICAL DERIVATION OF EQUILIBRIUM STATISTICAL MECHANICS
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prigogine, I.; Balescu, R.; Henin, F.
1960-12-01
Work on nonequilibrium statistical mechanics, which allows an extension of the kinetic proof to all results of equilibrium statistical mechanics involving a finite number of degrees of freedom, is summarized. As an introduction to the general N-body problem, the scattering theory in classical mechanics is considered. The general N-body problem is considered for the case of classical mechanics, quantum mechanics with Boltzmann statistics, and quantum mechanics including quantum statistics. Six basic diagrams, which describe the elementary processes of the dynamics of correlations, were obtained. (M.C.G.)
Applying Sociocultural Theory to Teaching Statistics for Doctoral Social Work Students
ERIC Educational Resources Information Center
Mogro-Wilson, Cristina; Reeves, Michael G.; Charter, Mollie Lazar
2015-01-01
This article describes the development of two doctoral-level multivariate statistics courses utilizing sociocultural theory, an integrative pedagogical framework. In the first course, the implementation of sociocultural theory helps to support the students through a rigorous introduction to statistics. The second course involves students…
Influence of preliminary damage on the load-bearing capacity of zirconia fixed dental prostheses.
Kohorst, Philipp; Butzheinen, Lutz Oliver; Dittmer, Marc Philipp; Heuer, Wieland; Borchers, Lothar; Stiesch, Meike
2010-12-01
The objective of this investigation was to evaluate the influence of differently shaped preliminary cuts in combination with artificial aging on the load-bearing capacity of four-unit zirconia fixed dental prostheses (FDPs). Forty frameworks were fabricated from white-stage zirconia blanks (InCeram YZ, Vita) by means of a computer-aided design/computer-aided manufacturing system (Cerec inLab, Sirona). Frameworks were divided into four homogeneous groups with ten specimens each. Prior to veneering, frameworks of two groups were "damaged" by defined saw cuts of different dimensions, to simulate accidental flaws generated during shape cutting. After the veneering process, FDPs, with the exception of a control group without preliminary damage, were subjected to thermal and mechanical cycling (TMC) during 200 days storage in distilled water at 36°C. Following the aging procedure, all specimens were loaded until fracture, and forces at fracture were recorded. The statistical analysis of force at fracture data was performed using two-way ANOVA, with the level of significance chosen at 0.05. Neither type of preliminary mechanical damage significantly affected the load-bearing capacity of FDPs. In contrast, artificial aging by TMC proved to have a significant influence on the load-bearing capacity of both the undamaged and the predamaged zirconia restorations (p < 0.001); however, even though load-bearing capacity decreased by about 20% due to simulated aging, the FDPs still showed mean load-bearing capacities of about 1600 N. The results of this study reveal that zirconia restorations have a high tolerance regarding mechanical damages. Irrespective of these findings, damage to zirconia ceramics during production or finishing should be avoided, as this may nevertheless lead to subcritical crack growth and, eventually, catastrophic failure. Furthermore, to ensure long-term clinical success, the design of zirconia restorations has to accommodate the decrease in load-bearing capacity due to TMC in the oral environment. © 2010 by The American College of Prosthodontists.
2009-08-05
Socio-cultural data acquisition, extraction, and management. Subject terms: human behavior, theoretical framework, hypothesis development, experimental design, ethical research, statistical power, human laboratory. Unclassified; approved for public release.
NASA Astrophysics Data System (ADS)
Potirakis, Stelios M.; Contoyiannis, Yiannis; Kopanas, John; Kalimeris, Anastasios; Antonopoulos, George; Peratzakis, Athanasios; Eftaxias, Konstantinos; Nomicos, Costantinos
2014-05-01
A phenomenon is considered "complex" when it refers to a system whose phenomenological laws, which describe the global behavior of the system, are not necessarily directly related to the "microscopic" laws that regulate the evolution of its elementary parts. The field of study of complex systems considers that the dynamics of complex systems are founded on universal principles that may be used to describe disparate problems ranging from particle physics to the economies of societies. Several authors have suggested that earthquake (EQ) dynamics can be analyzed within mathematical frameworks similar to those used for economic dynamics and neurodynamics. A central property of the EQ preparation process is the occurrence of coherent large-scale collective behavior with a very rich structure, resulting from repeated nonlinear interactions among the constituents of the system. As a result, nonextensive statistics is an appropriate, physically meaningful tool for the study of EQ dynamics. Since fracture-induced electromagnetic (EM) precursors are observable manifestations of the underlying EQ preparation process, the analysis of a fracture-induced EM precursor observed prior to the occurrence of a large EQ can also be conducted within the nonextensive statistics framework. Within the frame of the investigation of universal principles that may hold for different dynamical systems related to the genesis of extreme events, we present here statistical similarities of the pre-earthquake EM emissions related to an EQ with the pre-ictal electrical brain activity related to an epileptic seizure, and with the pre-crisis economic observables related to the collapse of a share. It is demonstrated that the observables of all three dynamical systems can be analyzed in the frame of nonextensive statistical mechanics, while the frequency-size relations of appropriately defined "events" that precede the extreme event related to each of these different systems present striking quantitative similarities. It is also demonstrated that, for the considered systems, the nonextensive parameter q increases as the extreme event approaches, which indicates that the strength of the long-memory / long-range interactions between the constituents of the system, which characterize its dynamics, increases.
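For completeness, the entropic form at the core of the nonextensive framework invoked above, with the Boltzmann-Gibbs entropy recovered in the limit q → 1 (standard Tsallis form; k denotes Boltzmann's constant):

```latex
\begin{equation}
  S_q = k\,\frac{1 - \sum_i p_i^{\,q}}{q - 1},
  \qquad
  \lim_{q \to 1} S_q = -k \sum_i p_i \ln p_i .
\end{equation}
```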
NASA Astrophysics Data System (ADS)
Moslehi, M.; de Barros, F.; Rajagopal, R.
2014-12-01
Hydrogeological models that represent flow and transport in subsurface domains are usually large-scale, with excessive computational complexity and uncertain characteristics. Uncertainty quantification for predicting flow and transport in heterogeneous formations often entails utilizing a numerical Monte Carlo framework, which repeatedly simulates the model according to a random field representing the hydrogeological characteristics of the field. The physical resolution (e.g. grid resolution associated with the physical space) for the simulation is customarily chosen based on recommendations in the literature, independent of the number of Monte Carlo realizations. This practice may lead to either excessive computational burden or inaccurate solutions. We propose an optimization-based methodology that considers the trade-off between the following conflicting objectives: the time associated with computational costs, the statistical convergence of the model predictions, and the physical errors corresponding to numerical grid resolution. In this research, we optimally allocate computational resources by developing a modeling framework for the overall error based on a joint statistical and numerical analysis and optimizing the error model subject to a given computational constraint. The derived expression for the overall error explicitly takes into account the joint dependence between the discretization error of the physical space and the statistical error associated with Monte Carlo realizations. The accuracy of the proposed framework is verified in this study by applying it to several computationally intensive examples. Having this framework at hand helps hydrogeologists achieve the optimum physical and statistical resolutions to minimize the error within a given computational budget. Moreover, the influence of the available computational resources and the geometric properties of the contaminant source zone on the optimum resolutions is investigated. We conclude that the computational cost associated with optimal allocation can be substantially reduced compared with prevalent recommendations in the literature.
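A schematic version of the trade-off described above, under assumed error scalings (the exponents, constants, and cost model below are illustrative placeholders, not the study's calibrated expressions): for grid spacing h in d dimensions and N Monte Carlo realizations,

```latex
\begin{equation}
  E(h, N) \;\approx\; C_1\,h^{p} \;+\; \frac{C_2}{\sqrt{N}},
  \qquad \text{subject to} \quad N\,h^{-d} \;\le\; B,
\end{equation}
```

so that minimizing E under the computational budget B selects the grid resolution and the number of realizations jointly rather than fixing the resolution independently of N.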
Ciani, Oriana; Davis, Sarah; Tappenden, Paul; Garside, Ruth; Stein, Ken; Cantrell, Anna; Saad, Everardo D; Buyse, Marc; Taylor, Rod S
2014-07-01
Licensing of, and coverage decisions on, new therapies should rely on evidence from patient-relevant endpoints such as overall survival (OS). Nevertheless, evidence from surrogate endpoints may also be useful, as it may not only expedite the regulatory approval of new therapies but also inform coverage decisions. It is, therefore, essential that candidate surrogate endpoints be properly validated. However, there is no consensus on statistical methods for such validation and on how the evidence thus derived should be applied by policy makers. We review current statistical approaches to surrogate-endpoint validation based on meta-analysis in various advanced-tumor settings. We assessed the suitability of two surrogates (progression-free survival [PFS] and time-to-progression [TTP]) using three current validation frameworks: Elston and Taylor's framework, the German Institute of Quality and Efficiency in Health Care's (IQWiG) framework, and the Biomarker-Surrogacy Evaluation Schema (BSES3). A wide variety of statistical methods have been used to assess surrogacy. The strength of the association between the two surrogates and OS was generally low. The level of evidence (observation-level versus treatment-level) available varied considerably by cancer type and by evaluation tool, and was not always consistent even within one specific cancer type. The treatment-level association between PFS or TTP and OS has not been investigated in all solid tumors. According to IQWiG's framework, only PFS achieved acceptable evidence of surrogacy in metastatic colorectal and ovarian cancer treated with cytotoxic agents. Our study emphasizes the challenges of surrogate-endpoint validation and the importance of building consensus on the development of evaluation frameworks.
1986-06-01
A Theoretical Framework for Examining Geographical Variability in the Microphysical Mechanisms of Precipitation Development. Energy and Natural Resources, SWS Contract Report 391, Final Report. Key parameters include the degree of entrainment and stability of the environment. Unclassified.
Statistical Mechanics of the Cytoskeleton
NASA Astrophysics Data System (ADS)
Wang, Shenshen
The mechanical integrity of eukaryotic cells along with their capability of dynamic remodeling depends on their cytoskeleton, a structural scaffold made up of a complex and dense network of filamentous proteins spanning the cytoplasm. Active force generation within the cytoskeletal networks by molecular motors is ultimately powered by the consumption of chemical energy and conversion of that energy into mechanical work. The resulting functional movements range from the collective cell migration in epithelial tissues responsible for wound healing to the changes of cell shape that occur during muscle contraction, as well as all the internal structural rearrangements essential for cell division. The role of the cytoskeleton as a dynamic versatile mesoscale "muscle", whose passive and active performance is both highly heterogeneous in space and time and intimately linked to diverse biological functions, allows it to serve as a sensitive indicator for the health and developmental state of the cell. By approaching this natural nonequilibrium many-body system from a variety of perspectives, researchers have made major progress toward understanding the cytoskeleton's unusual mechanical, dynamical and structural properties. Yet a unifying framework capable of capturing both the dynamics of active pattern formation and the emergence of spontaneous collective motion, that allows one to predict the dependence of the model's control parameters on motor properties, is still needed. In the following we construct a microscopic model and provide a theoretical framework to investigate the intricate interplay between local force generation, network architecture and collective motor action. This framework is able to accommodate both regular and heterogeneous pattern formation, as well as arrested coarsening and macroscopic contraction in a unified manner, through the notion of motor-driven effective interactions. Moreover a systematic expansion scheme combined with a variational stability analysis yields a threshold strength of motor kicking noise, below which the motorized system behaves as if it were at an effective equilibrium, but with a nontrivial effective temperature. Above the threshold, however, collective directed motion emerges spontaneously. Computer simulations support the theoretical predictions and highlight the essential role played in large-scale contraction by spatial correlation in motor kicking events.
The Development of a Professional Statistics Teaching Identity
ERIC Educational Resources Information Center
Whitaker, Douglas
2016-01-01
Motivated by the increased statistics expectations for students and their teachers because of the widespread adoption of the Common Core State Standards for Mathematics, this study explores exemplary, in-service statistics teachers' professional identities using a theoretical framework informed by Gee (2000) and communities of practice (Lave &…
English and Chinese languages as weighted complex networks
NASA Astrophysics Data System (ADS)
Sheng, Long; Li, Chunguang
2009-06-01
In this paper, we analyze statistical properties of English and Chinese written human language within the framework of weighted complex networks. The two language networks are based on an English novel and a Chinese biography, respectively, and both of the networks are constructed in the same way. By comparing the intensity and density of connections between the two networks, we find that high weight connections in Chinese language networks prevail more than those in English language networks. Furthermore, some of the topological and weighted quantities are compared. The results display some differences in the structural organizations between the two language networks. These observations indicate that the two languages may have different linguistic mechanisms and different combinatorial natures.
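As an illustration of the kind of construction the abstract describes, a minimal sketch of a weighted word-adjacency network; the tokenization, the adjacency rule, and the networkx-based implementation are assumptions, not the authors' procedure.

```python
from collections import Counter
import networkx as nx

def word_adjacency_network(text):
    """Build a weighted network in which adjacent words are linked and the edge
    weight counts how often the pair co-occurs (one assumed construction)."""
    words = text.lower().split()
    g = nx.Graph()
    for (u, v), w in Counter(zip(words, words[1:])).items():
        if u == v:
            continue
        prev = g.get_edge_data(u, v, {"weight": 0})["weight"]
        g.add_edge(u, v, weight=prev + w)
    return g

g = word_adjacency_network("the quick brown fox jumps over the lazy dog the quick fox")
strengths = dict(g.degree(weight="weight"))   # node strength = sum of incident weights
print(sorted(strengths.items(), key=lambda kv: -kv[1])[:3])
```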
On the (In)Validity of Tests of Simple Mediation: Threats and Solutions
Pek, Jolynn; Hoyle, Rick H.
2015-01-01
Mediation analysis is a popular framework for identifying underlying mechanisms in social psychology. In the context of simple mediation, we review and discuss the implications of three facets of mediation analysis: (a) conceptualization of the relations between the variables, (b) statistical approaches, and (c) relevant elements of design. We also highlight the issue of equivalent models that are inherent in simple mediation. The extent to which results are meaningful stem directly from choices regarding these three facets of mediation analysis. We conclude by discussing how mediation analysis can be better applied to examine causal processes, highlight the limits of simple mediation, and make recommendations for better practice. PMID:26985234
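A minimal sketch of the simple-mediation quantities discussed above: the indirect effect a·b estimated from two least-squares fits, with a percentile bootstrap interval (one common statistical approach in this literature; the synthetic data and function names are placeholders).

```python
import numpy as np

def indirect_effect(x, m, y):
    """Indirect effect a*b from two OLS fits on centered data:
    m = a*x + e1  and  y = b*m + c'*x + e2."""
    x, m, y = (v - v.mean() for v in (x, m, y))
    a = np.linalg.lstsq(x[:, None], m, rcond=None)[0][0]
    b = np.linalg.lstsq(np.column_stack([m, x]), y, rcond=None)[0][0]
    return a * b

def bootstrap_ci(x, m, y, n_boot=2000, alpha=0.05, seed=0):
    """Percentile bootstrap confidence interval for the indirect effect."""
    rng = np.random.default_rng(seed)
    n = len(x)
    stats = [indirect_effect(*(v[idx] for v in (x, m, y)))
             for idx in (rng.integers(0, n, n) for _ in range(n_boot))]
    return np.percentile(stats, [100 * alpha / 2, 100 * (1 - alpha / 2)])

rng = np.random.default_rng(1)
x = rng.normal(size=300)
m = 0.5 * x + rng.normal(size=300)
y = 0.4 * m + 0.1 * x + rng.normal(size=300)
print(bootstrap_ci(x, m, y))   # interval excluding zero suggests mediation
```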
Pressure calculation in hybrid particle-field simulations
NASA Astrophysics Data System (ADS)
Milano, Giuseppe; Kawakatsu, Toshihiro
2010-12-01
In the framework of a recently developed scheme for hybrid particle-field simulation techniques, in which self-consistent field (SCF) theory and particle models (molecular dynamics) are combined [J. Chem. Phys. 130, 214106 (2009)], we developed a general formulation for the calculation of the instantaneous pressure and stress tensor. The expressions have been derived from the statistical mechanical definition of the pressure, starting from the expression for the free energy functional in the SCF theory. An implementation of the derived formulation suitable for hybrid particle-field molecular dynamics/self-consistent field simulations is described. A series of test simulations on model systems is reported, comparing the calculated pressure with that obtained from standard molecular dynamics simulations based on pair potentials.
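The thermodynamic identity underlying such a derivation, written schematically (the terms specific to the hybrid particle-field scheme are not reproduced here):

```latex
\begin{equation}
  P \;=\; -\left(\frac{\partial F}{\partial V}\right)_{T,N},
  \qquad F = -k_{\mathrm{B}}T\,\ln Z ,
\end{equation}
```

so an instantaneous pressure expression follows from differentiating the free energy functional of the particle-field scheme with respect to volume at fixed temperature and particle number.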
Calculations of the surface tensions of liquid metals
NASA Technical Reports Server (NTRS)
Stroud, D. G.
1981-01-01
The understanding of the surface tension of liquid metals and alloys from as close to first principles as possible is discussed. The two ingredients combined in these calculations are the electron theory of metals and the classical theory of liquids, as worked out within the framework of statistical mechanics. The result is a new theory of surface tensions and surface density profiles derived purely from knowledge of the bulk properties of the coexisting liquid and vapor phases. It is found that the method works well for the pure liquid metals on which it was tested; the work is extended to mixtures of liquid metals, interfaces between immiscible liquid metals, and to the temperature derivative of the surface tension.
Adhesive loose packings of small dry particles.
Liu, Wenwei; Li, Shuiqing; Baule, Adrian; Makse, Hernán A
2015-08-28
We explore adhesive loose packings of small dry spherical particles of micrometer size using 3D discrete-element simulations with adhesive contact mechanics and statistical ensemble theory. A dimensionless adhesion parameter (Ad) successfully combines the effects of particle velocities, sizes and the work of adhesion, identifying a universal regime of adhesive packings for Ad > 1. The structural properties of the packings in this regime are well described by an ensemble approach based on a coarse-grained volume function that includes the correlation between bulk and contact spheres. Our theoretical and numerical results predict: (i) an equation of state for adhesive loose packings that appear as a continuation from the frictionless random close packing (RCP) point in the jamming phase diagram and (ii) the existence of an asymptotic adhesive loose packing point at a coordination number Z = 2 and a packing fraction ϕ = 1/2^3. Our results highlight that adhesion leads to a universal packing regime at packing fractions much smaller than the random loose packing (RLP), which can be described within a statistical mechanical framework. We present a general phase diagram of jammed matter comprising frictionless, frictional, adhesive as well as non-spherical particles, providing a classification of packings in terms of their continuation from the spherical frictionless RCP.
Discrete Model for the Structure and Strength of Cementitious Materials
NASA Astrophysics Data System (ADS)
Balopoulos, Victor D.; Archontas, Nikolaos; Pantazopoulou, Stavroula J.
2017-12-01
Cementitious materials are characterized by brittle behavior in direct tension and by transverse dilatation (due to microcracking) under compression. Microcracking causes increasingly larger transverse strains and a phenomenological Poisson's ratio that gradually increases to about ν = 0.5 and beyond, at the limit point in compression. This behavior is due to the underlying structure of cementitious pastes, which is simulated here with a discrete physical model. The computational model is generic, assembled from a statistically generated, continuous network of flaky dendrites consisting of cement hydrates that emanate from partially hydrated cement grains. In the actual amorphous material, the dendrites constitute the solid phase of the cement gel and interconnect to provide the strength and stiffness against load. The idealized dendrite solid is loaded in compression and tension to compute values for strength and Poisson's effects. Parametric studies are conducted to calibrate the statistical parameters of the discrete model with the physical and mechanical characteristics of the material, so that the familiar experimental trends may be reproduced. The model provides a framework for the study of the mechanical behavior of the material under various states of stress and strain and can be used to model the effects of additives (e.g., fibers) that may be explicitly simulated in the discrete structure.
Evaluation of Chest Injury Mechanisms in Nearside Oblique Frontal Impacts
Iraeus, Johan; Lindquist, Mats; Wistrand, Sofie; Sibgård, Elin; Pipkorn, Bengt
2013-01-01
Despite the use of seat belts and modern safety systems, many automobile occupants are still seriously injured or killed in car crashes. Common configurations in these crashes are oblique and small-overlap frontal impacts, which often lead to chest injuries. To evaluate the injury mechanism in these oblique impacts, an investigation was carried out using mathematical human body model simulations. A model of a simplified vehicle interior was developed and validated by means of mechanical sled tests with the Hybrid III dummy. The interior model was then combined with the human body model THUMS and validated by means of mechanical PMHS sled tests. Occupant kinematics as well as rib fracture patterns were predicted with reasonable accuracy. The final model was updated to conform to modern cars and a simulation matrix was run. In this matrix the boundary conditions, ΔV and PDOF, were varied, and rib fracture risk as a function of the boundary conditions was evaluated using a statistical framework. In oblique frontal impacts, two injury-producing mechanisms were found: (i) diagonal belt load and (ii) side structure impact. The second injury mechanism was found for PDOFs of 25°–35°, depending on ΔV. This means that for larger PDOFs, less ΔV is needed to cause a serious chest injury. PMID:24406957
Mourning dove hunting regulation strategy based on annual harvest statistics and banding data
Otis, D.L.
2006-01-01
Although managers should strive to base game bird harvest management strategies on mechanistic population models, the monitoring programs required to build and continuously update these models may not be in place. Alternatively, if estimates of total harvest and harvest rates are available, then population estimates derived from these harvest data can serve as the basis for making hunting regulation decisions based on population growth rates derived from these estimates. I present a statistically rigorous approach to regulation decision-making using a hypothesis-testing framework and an assumed set of 3 hunting regulation alternatives. I illustrate and evaluate the technique with historical data on the mid-continent mallard (Anas platyrhynchos) population. I evaluate the statistical properties of the hypothesis-testing framework using the best available data on mourning doves (Zenaida macroura). I use these results to discuss practical implementation of the technique as an interim harvest strategy for mourning doves until reliable mechanistic population models and associated monitoring programs are developed.
Zhi, Ruicong; Zhao, Lei; Xie, Nan; Wang, Houyin; Shi, Bolin; Shi, Jingye
2016-01-13
A framework for establishing a standard reference scale (texture) is proposed using multivariate statistical analysis of instrumental measurement and sensory evaluation. Multivariate statistical analysis is conducted to rapidly select typical reference samples with characteristics of universality, representativeness, stability, substitutability, and traceability. The reasonableness of the framework is verified by establishing a standard reference scale for a texture attribute (hardness) with well-known Chinese foods. More than 100 food products in 16 categories were tested using instrumental measurement (TPA test), and the results were analyzed with clustering analysis, principal component analysis, relative standard deviation, and analysis of variance. As a result, nine kinds of foods were determined to construct the hardness standard reference scale. The results indicate that the regression coefficient between the estimated sensory value and the instrumentally measured value is significant (R² = 0.9765), which fits well with Stevens's theory. The research provides a reliable theoretical basis and practical guide for establishing quantitative standard reference scales for food texture characteristics.
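A minimal sketch of the sample-selection step described above, assuming hypothetical replicate hardness measurements (this illustrates clustering plus a relative-standard-deviation filter, not the authors' exact procedure):

import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
hardness = rng.gamma(shape=5.0, scale=2.0, size=(100, 6))   # 100 products x 6 replicate TPA readings

means = hardness.mean(axis=1)
rsd = hardness.std(axis=1, ddof=1) / means * 100            # relative standard deviation (%)

# group products into 9 hardness levels, then keep the most stable (lowest-RSD) product per level
labels = KMeans(n_clusters=9, n_init=10, random_state=0).fit_predict(means.reshape(-1, 1))
reference = {k: int(np.where(labels == k)[0][np.argmin(rsd[labels == k])]) for k in range(9)}
print(reference)                                             # index of the product anchoring each scale point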
Convolutionless Nakajima-Zwanzig equations for stochastic analysis in nonlinear dynamical systems.
Venturi, D; Karniadakis, G E
2014-06-08
Determining the statistical properties of stochastic nonlinear systems is of major interest across many disciplines. Currently, there are no general efficient methods to deal with this challenging problem that involves high dimensionality, low regularity and random frequencies. We propose a framework for stochastic analysis in nonlinear dynamical systems based on goal-oriented probability density function (PDF) methods. The key idea stems from techniques of irreversible statistical mechanics, and it relies on deriving evolution equations for the PDF of quantities of interest, e.g. functionals of the solution to systems of stochastic ordinary and partial differential equations. Such quantities could be low-dimensional objects in infinite dimensional phase spaces. We develop the goal-oriented PDF method in the context of the time-convolutionless Nakajima-Zwanzig-Mori formalism. We address the question of approximation of reduced-order density equations by multi-level coarse graining, perturbation series and operator cumulant resummation. Numerical examples are presented for stochastic resonance and stochastic advection-reaction problems.
NASA Astrophysics Data System (ADS)
Barnea, A. Ronny; Cheshnovsky, Ori; Even, Uzi
2018-02-01
Interference experiments have been paramount in our understanding of quantum mechanics and are frequently the basis of testing the superposition principle in the framework of quantum theory. In recent years, several studies have challenged the nature of wave-function interference from the perspective of Born's rule, namely, the manifestation of so-called high-order interference terms in a superposition generated by diffraction of the wave functions. Here we present an experimental test of multipath interference in the diffraction of metastable helium atoms, with large-number counting statistics, comparable to photon-based experiments. We use a variation of the original triple-slit experiment and accurate single-event counting techniques to provide a new experimental bound of 2.9 × 10⁻⁵ on the statistical deviation from the commonly approximated null third-order interference term in Born's rule for matter waves. Our value is on the order of the maximal contribution predicted for multipath trajectories by Feynman path integrals.
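The null third-order term bounded above is commonly quantified by a Sorkin-type combination of the count rates recorded for the slit-opening combinations; a schematic computation with hypothetical counts and one common normalization (not the paper's analysis pipeline):

# hypothetical detected counts for the eight opening combinations ("0" = all paths blocked)
counts = {"ABC": 1.203e6, "AB": 8.10e5, "AC": 7.95e5, "BC": 8.02e5,
          "A": 4.01e5, "B": 3.98e5, "C": 4.05e5, "0": 1.2e3}

# third-order (Sorkin) interference term: zero if Born's rule holds exactly
eps = (counts["ABC"] - counts["AB"] - counts["AC"] - counts["BC"]
       + counts["A"] + counts["B"] + counts["C"] - counts["0"])

# normalize by the magnitude of the ordinary two-path interference terms
delta = sum(abs(counts[ij] - counts[i] - counts[j] + counts["0"])
            for ij, i, j in (("AB", "A", "B"), ("AC", "A", "C"), ("BC", "B", "C")))
print(f"normalized third-order term kappa = {eps / delta:.2e}")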
Selmke, Markus; Khadka, Utsab; Bregulla, Andreas P; Cichos, Frank; Yang, Haw
2018-04-18
Photon nudging is a new experimental method that enables the force-free manipulation and localization of individual self-propelled artificial micro-swimmers in fluidic environments. It uses a weak laser to stochastically and adaptively turn the swimmer's propulsion on and off when the swimmer, through rotational diffusion, points towards or away from its target, respectively. This contribution presents a theoretical framework for the statistics of both 2D and 3D controls. The main results are: the on- and off-time distributions for the controlling laser, the arrival time statistics for the swimmer to reach a remote target, and how the experimentally accessible control parameters influence the control, e.g., the optimal acceptance angle for directed transport. The results are general in that they are independent of the propulsion or actuation mechanisms. They provide a concrete physical picture for how a single artificial micro-swimmer could be navigated under thermal fluctuations, insights that could also be useful for understanding biological micro-swimmers.
On the Mathematical Consequences of Binning Spike Trains.
Cessac, Bruno; Le Ny, Arnaud; Löcherbach, Eva
2017-01-01
We initiate a mathematical analysis of hidden effects induced by binning spike trains of neurons. Assuming that the original spike train has been generated by a discrete Markov process, we show that binning generates a stochastic process that is no longer Markov but is instead a variable-length Markov chain (VLMC) with unbounded memory. We also show that the law of the binned raster is a Gibbs measure in the DLR (Dobrushin-Lanford-Ruelle) sense coined in mathematical statistical mechanics. This allows the derivation of several important consequences for the statistical properties of binned spike trains. In particular, we introduce the DLR framework as a natural setting to mathematically formalize anticipation, that is, to tell "how good" our nervous system is at making predictions. In a probabilistic sense, this corresponds to conditioning a process on its future, and we discuss how binning may affect our conclusions on this ability. We finally comment on the possible consequences of binning in the detection of spurious phase transitions or in the detection of incorrect evidence of criticality.
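For concreteness, the binning operation analyzed above can be written in a few lines (a sketch with hypothetical spike times and a 10 ms bin, not tied to the authors' data):

import numpy as np

spike_times = np.array([0.0031, 0.0154, 0.0162, 0.0478, 0.0931, 0.0942, 0.0957])  # seconds
bin_width = 0.010
edges = np.arange(0.0, 0.1 + bin_width, bin_width)

counts, _ = np.histogram(spike_times, bins=edges)   # spike counts per bin
binary = (counts > 0).astype(int)                   # binarized raster: 1 if the bin contains a spike
print(counts, binary)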
Convolutionless Nakajima–Zwanzig equations for stochastic analysis in nonlinear dynamical systems
Venturi, D.; Karniadakis, G. E.
2014-01-01
Determining the statistical properties of stochastic nonlinear systems is of major interest across many disciplines. Currently, there are no general efficient methods to deal with this challenging problem that involves high dimensionality, low regularity and random frequencies. We propose a framework for stochastic analysis in nonlinear dynamical systems based on goal-oriented probability density function (PDF) methods. The key idea stems from techniques of irreversible statistical mechanics, and it relies on deriving evolution equations for the PDF of quantities of interest, e.g. functionals of the solution to systems of stochastic ordinary and partial differential equations. Such quantities could be low-dimensional objects in infinite dimensional phase spaces. We develop the goal-oriented PDF method in the context of the time-convolutionless Nakajima–Zwanzig–Mori formalism. We address the question of approximation of reduced-order density equations by multi-level coarse graining, perturbation series and operator cumulant resummation. Numerical examples are presented for stochastic resonance and stochastic advection–reaction problems. PMID:24910519
DOE Office of Scientific and Technical Information (OSTI.GOV)
Makse, Hernan A.; Johnson, David L.
2014-09-03
This is the final report describing the results of DOE Grant # DE-FG02-03ER15458, with an original termination date of April 31, 2013, which was extended to April 31, 2014. The goal of this project is to develop a theoretical and experimental understanding of sound propagation, elasticity and dissipation in granular materials. The topic is relevant for the efficient production of hydrocarbon and for identifying and characterizing underground formations for storage of either CO2 or nuclear waste material. Furthermore, understanding the basic properties of acoustic propagation in granular media is of importance not only to the energy industry, but also to the pharmaceutical, chemical and agricultural industries. We employ a set of experimental, theoretical and computational tools to develop a study of acoustics and dissipation in granular media. These include the concept of effective mass of granular media, normal modes analysis, statistical mechanics frameworks and numerical simulations based on Discrete Element Methods. Effective mass measurements allow us to study the mechanisms of the elastic response and attenuation of acoustic modes in granular media. We perform experiments and simulations under varying conditions, including humidity and vacuum, and different interparticle force laws, to develop a fundamental understanding of the mechanisms of damping and acoustic propagation in granular media. A theoretical statistical approach studies the necessary phase space of configurations in pressure and volume fraction to classify granular materials.
Connectopic mapping with resting-state fMRI.
Haak, Koen V; Marquand, Andre F; Beckmann, Christian F
2018-04-15
Brain regions are often topographically connected: nearby locations within one brain area connect with nearby locations in another area. Mapping these connection topographies, or 'connectopies' in short, is crucial for understanding how information is processed in the brain. Here, we propose principled, fully data-driven methods for mapping connectopies using functional magnetic resonance imaging (fMRI) data acquired at rest by combining spectral embedding of voxel-wise connectivity 'fingerprints' with a novel approach to spatial statistical inference. We apply the approach in human primary motor and visual cortex, and show that it can trace biologically plausible, overlapping connectopies in individual subjects that follow these regions' somatotopic and retinotopic maps. As a generic mechanism to perform inference over connectopies, the new spatial statistics approach enables rigorous statistical testing of hypotheses regarding the fine-grained spatial profile of functional connectivity and whether that profile is different between subjects or between experimental conditions. The combined framework offers a fundamental alternative to existing approaches to investigating functional connectivity in the brain, from voxel- or seed-pair wise characterizations of functional association, towards a full, multivariate characterization of spatial topography. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
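A minimal sketch of the spectral-embedding step behind connectopic mapping, assuming hypothetical (random) time series in place of real fMRI data (this is not the authors' pipeline and omits preprocessing and the spatial statistical model):

import numpy as np
from sklearn.manifold import SpectralEmbedding

rng = np.random.default_rng(2)
roi = rng.normal(size=(200, 300))       # 200 voxels inside the region of interest x 300 volumes
rest = rng.normal(size=(1000, 300))     # 1000 voxels in the rest of the brain

# connectivity "fingerprint" of each ROI voxel: its correlation with every voxel outside the ROI
fingerprints = np.corrcoef(np.vstack([roi, rest]))[:200, 200:]

# similarity between fingerprints, rescaled to be non-negative, then spectral embedding
affinity = (np.corrcoef(fingerprints) + 1.0) / 2.0
connectopy = SpectralEmbedding(n_components=1, affinity="precomputed").fit_transform(affinity)
print(connectopy.shape)                 # (200, 1): the dominant mode of connectivity change across the ROI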
Molecular Retrofitting Adapts a Metal–Organic Framework to Extreme Pressure
Kapustin, Eugene A.; Lee, Seungkyu; Alshammari, Ahmad S.; ...
2017-06-07
Despite numerous studies of the chemical and thermal stability of metal-organic frameworks (MOFs), their mechanical stability remains largely unexplored. To date, no strategy exists to control the mechanical deformation of MOFs under ultrahigh pressure. We show that the mechanically unstable MOF-520 can be retrofitted by precise placement of a rigid 4,4'-biphenyldicarboxylate (BPDC) linker as a "girder" to afford a mechanically robust framework: MOF-520-BPDC. This retrofitting alters how the structure deforms under ultrahigh pressure and thus leads to a drastic enhancement of its mechanical robustness. While in the parent MOF-520 the pressure-transmitting medium molecules diffuse into the pore and expand the structure from the inside upon compression, the girder in the new retrofitted MOF-520-BPDC prevents the framework from expanding by linking two adjacent secondary building units together. As a result, the modified MOF is stable under hydrostatic compression in a diamond-anvil cell up to 5.5 gigapascals. The increased mechanical stability of MOF-520-BPDC prohibits the typical amorphization observed for MOFs in this pressure range. A direct correlation between the orientation of these girders within the framework and its linear strain was estimated, providing new insights for the design of MOFs with optimized mechanical properties.
FORESEE: Fully Outsourced secuRe gEnome Study basEd on homomorphic Encryption
2015-01-01
Background The increasing availability of genome data motivates massive research studies in personalized treatment and precision medicine. Public cloud services provide a flexible way to mitigate the storage and computation burden in conducting genome-wide association studies (GWAS). However, data privacy is a widespread concern when sharing such sensitive information in a cloud environment. Methods We presented a novel framework (FORESEE: Fully Outsourced secuRe gEnome Study basEd on homomorphic Encryption) to fully outsource GWAS (i.e., chi-square statistic computation) using homomorphic encryption. The proposed framework enables secure divisions over encrypted data. We introduced two division protocols (i.e., secure errorless division and secure approximation division) with a trade-off between complexity and accuracy in computing chi-square statistics. Results The proposed framework was evaluated for the task of chi-square statistic computation with two case-control datasets from the 2015 iDASH genome privacy protection challenge. Experimental results show that the performance of FORESEE can be significantly improved through algorithmic optimization and parallel computation. Remarkably, the secure approximation division provides a significant performance gain without missing any significant SNPs in the chi-square association test using the aforementioned datasets. Conclusions Unlike many existing HE-based studies, in which final results need to be computed by the data owner due to the lack of a secure division operation, the proposed FORESEE framework supports complete outsourcing to the cloud and outputs the final encrypted chi-square statistics. PMID:26733391
FORESEE: Fully Outsourced secuRe gEnome Study basEd on homomorphic Encryption.
Zhang, Yuchen; Dai, Wenrui; Jiang, Xiaoqian; Xiong, Hongkai; Wang, Shuang
2015-01-01
The increasing availability of genome data motivates massive research studies in personalized treatment and precision medicine. Public cloud services provide a flexible way to mitigate the storage and computation burden in conducting genome-wide association studies (GWAS). However, data privacy is a widespread concern when sharing such sensitive information in a cloud environment. We presented a novel framework (FORESEE: Fully Outsourced secuRe gEnome Study basEd on homomorphic Encryption) to fully outsource GWAS (i.e., chi-square statistic computation) using homomorphic encryption. The proposed framework enables secure divisions over encrypted data. We introduced two division protocols (i.e., secure errorless division and secure approximation division) with a trade-off between complexity and accuracy in computing chi-square statistics. The proposed framework was evaluated for the task of chi-square statistic computation with two case-control datasets from the 2015 iDASH genome privacy protection challenge. Experimental results show that the performance of FORESEE can be significantly improved through algorithmic optimization and parallel computation. Remarkably, the secure approximation division provides a significant performance gain without missing any significant SNPs in the chi-square association test using the aforementioned datasets. Unlike many existing HE-based studies, in which final results need to be computed by the data owner due to the lack of a secure division operation, the proposed FORESEE framework supports complete outsourcing to the cloud and outputs the final encrypted chi-square statistics.
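For reference, the chi-square statistic that FORESEE computes under encryption is, in the clear, the standard test on a case-control count table; a plaintext sketch with hypothetical allele counts shows what the encrypted protocol must reproduce:

# 2x2 allele-count table for one SNP: rows = case/control, columns = allele A/a (hypothetical counts)
table = [[620, 380],
         [540, 460]]

row = [sum(r) for r in table]
col = [sum(c) for c in zip(*table)]
total = sum(row)

# Pearson chi-square: sum over cells of (observed - expected)^2 / expected
chi2 = sum((table[i][j] - row[i] * col[j] / total) ** 2 / (row[i] * col[j] / total)
           for i in range(2) for j in range(2))
print(f"chi-square = {chi2:.3f}")   # compared against the 1-degree-of-freedom critical value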
Empirical Research of College Students' Alternative Frameworks of Particle Mechanics
ERIC Educational Resources Information Center
Wang, Hongmei
2010-01-01
Based on constructivist theory, about 300 grade-05 college students majoring in electronic information at Dezhou University are surveyed in this article regarding their alternative frameworks of particle mechanics in college physics. In the survey, questionnaires are used to identify the college students' alternative frameworks, and the…
Detecting Anomalous Insiders in Collaborative Information Systems
Chen, You; Nyemba, Steve; Malin, Bradley
2012-01-01
Collaborative information systems (CISs) are deployed within a diverse array of environments that manage sensitive information. Current security mechanisms detect insider threats, but they are ill-suited to monitor systems in which users function in dynamic teams. In this paper, we introduce the community anomaly detection system (CADS), an unsupervised learning framework to detect insider threats based on the access logs of collaborative environments. The framework is based on the observation that typical CIS users tend to form community structures based on the subjects accessed (e.g., patients' records viewed by healthcare providers). CADS consists of two components: 1) relational pattern extraction, which derives community structures, and 2) anomaly prediction, which leverages a statistical model to determine when users have sufficiently deviated from communities. We further extend CADS into MetaCADS to account for the semantics of subjects (e.g., patients' diagnoses). To empirically evaluate the framework, we perform an assessment with three months of access logs from a real electronic health record (EHR) system in a large medical center. The results illustrate that our models exhibit significant performance gains over state-of-the-art competitors. When the number of illicit users is low, MetaCADS is the best model, but as the number grows, commonly accessed semantics lead to hiding in a crowd, such that CADS is more prudent. PMID:24489520
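A minimal sketch of the anomaly-prediction idea described above, assuming a hypothetical user-by-record access matrix (this illustrates a nearest-neighbor deviation score, not the published CADS implementation):

import numpy as np

rng = np.random.default_rng(3)
access = (rng.random((50, 200)) < 0.05).astype(float)   # 50 users x 200 patient records, 1 = accessed
access[0, :40] = 1.0                                    # user 0 accesses an unusually broad set of records

# cosine similarity between users' access patterns
unit = access / (np.linalg.norm(access, axis=1, keepdims=True) + 1e-12)
sim = unit @ unit.T
np.fill_diagonal(sim, -np.inf)                          # ignore self-similarity

# deviation score: one minus the mean similarity to the k nearest neighbors (higher = more anomalous)
k = 5
score = 1.0 - np.sort(sim, axis=1)[:, -k:].mean(axis=1)
print("most anomalous users:", np.argsort(score)[::-1][:3])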
Word-level language modeling for P300 spellers based on discriminative graphical models
NASA Astrophysics Data System (ADS)
Delgado Saa, Jaime F.; de Pesters, Adriana; McFarland, Dennis; Çetin, Müjdat
2015-04-01
Objective. In this work we propose a probabilistic graphical model framework that uses language priors at the level of words as a mechanism to increase the performance of P300-based spellers. Approach. This paper is concerned with brain-computer interfaces based on P300 spellers. Motivated by P300 spelling scenarios involving communication based on a limited vocabulary, we propose a probabilistic graphical model framework and an associated classification algorithm that uses learned statistical models of language at the level of words. Exploiting such high-level contextual information helps reduce the error rate of the speller. Main results. Our experimental results demonstrate that the proposed approach offers several advantages over existing methods. Most importantly, it increases the classification accuracy while reducing the number of times the letters need to be flashed, increasing the communication rate of the system. Significance. The proposed approach models all the variables in the P300 speller in a unified framework and has the capability to correct errors in previous letters in a word, given the data for the current one. The structure of the model we propose allows the use of efficient inference algorithms, which in turn makes it possible to use this approach in real-time applications.
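As a toy illustration of combining letter-level classifier evidence with a word-level language prior, in the spirit of the framework above (the vocabulary, scores, and priors are hypothetical, and this is not the authors' graphical model or inference algorithm):

import numpy as np

vocab = ["cat", "car", "can"]
prior = np.array([0.5, 0.3, 0.2])                 # word-level language prior

letters = "abcdefghijklmnopqrstuvwxyz"
rng = np.random.default_rng(4)
loglik = rng.normal(0.0, 1.0, size=(3, 26))       # per-position classifier log-likelihoods
loglik[0, letters.index("c")] += 3.0              # EEG evidence is clear for "c" and "a" ...
loglik[1, letters.index("a")] += 3.0              # ... but ambiguous for the third letter

def word_loglik(word):
    return sum(loglik[pos, letters.index(ch)] for pos, ch in enumerate(word))

log_post = np.array([word_loglik(w) for w in vocab]) + np.log(prior)
post = np.exp(log_post - log_post.max())
post /= post.sum()
print(dict(zip(vocab, post.round(3))))            # the word prior disambiguates the final letter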
Zhou, Xiang
2017-12-01
Linear mixed models (LMMs) are among the most commonly used tools for genetic association studies. However, the standard method for estimating variance components in LMMs, the restricted maximum likelihood estimation method (REML), suffers from several important drawbacks: REML requires individual-level genotypes and phenotypes from all samples in the study, is computationally slow, and produces downward-biased estimates in case-control studies. To remedy these drawbacks, we present an alternative framework for variance component estimation, which we refer to as MQS. MQS is based on the method of moments (MoM) and the minimal norm quadratic unbiased estimation (MINQUE) criterion, and brings two seemingly unrelated methods, the renowned Haseman-Elston (HE) regression and the recent LD score regression (LDSC), into the same unified statistical framework. With this new framework, we provide an alternative but mathematically equivalent form of HE that allows for the use of summary statistics. We provide an exact estimation form of LDSC to yield unbiased and statistically more efficient estimates. A key feature of our method is its ability to pair marginal z-scores computed using all samples with SNP correlation information computed using a small random subset of individuals (or individuals from a proper reference panel), while producing estimates that can be almost as accurate as if both quantities were computed using the full data. As a result, our method produces unbiased and statistically efficient estimates, makes use of summary statistics, and is computationally efficient for large data sets. Using simulations and applications to 37 phenotypes from 8 real data sets, we illustrate the benefits of our method for estimating and partitioning SNP heritability in population studies as well as for heritability estimation in family studies. Our method is implemented in the GEMMA software package, freely available at www.xzlab.org/software.html.
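As a small illustration of the classical Haseman-Elston regression that the MQS framework above generalizes (simulated genotypes and phenotypes; this is not the MQS estimator and not GEMMA code), the heritability is the slope of a no-intercept regression of phenotype cross-products on genetic relatedness:

import numpy as np

rng = np.random.default_rng(5)
n, p, h2_true = 800, 1000, 0.4

geno = rng.binomial(2, 0.3, size=(n, p)).astype(float)
geno = (geno - geno.mean(0)) / geno.std(0)              # standardized genotypes
beta = rng.normal(0.0, np.sqrt(h2_true / p), size=p)
y = geno @ beta + rng.normal(0.0, np.sqrt(1 - h2_true), size=n)
y = (y - y.mean()) / y.std()

K = geno @ geno.T / p                                   # genetic relatedness matrix
iu = np.triu_indices(n, k=1)                            # pairs i < j
prod = np.outer(y, y)[iu]                               # phenotype cross-products
h2_hat = (K[iu] * prod).sum() / (K[iu] ** 2).sum()      # regression through the origin
print(f"estimated SNP heritability = {h2_hat:.2f} (true value {h2_true})")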
ERIC Educational Resources Information Center
Sweet, Shauna J.; Rupp, Andre A.
2012-01-01
The "evidence-centered design" (ECD) framework is a powerful tool that supports careful and critical thinking about the identification and accumulation of evidence in assessment contexts. In this paper, we demonstrate how the ECD framework provides critical support for designing simulation studies to investigate statistical methods…
Influence of laser-welding and electroerosion on passive fit of implant-supported prosthesis.
Silva, Tatiana Bernardon; De Arruda Nobilo, Mauro Antonio; Pessanha Henriques, Guilherme Elias; Mesquita, Marcelo Ferraz; Guimaraes, Magali Beck
2008-01-01
This study investigated the influence of laser welding and electroerosion procedures on the passive fit of interim fixed implant-supported titanium frameworks. Twenty frameworks were made from a master model with five parallel-placed implants in the interforaminal region and cast in commercially pure titanium. The frameworks were divided into 4 groups: 10 samples were tested before (G1) and after (G2) electroerosion application, and another 10 were sectioned into five pieces, laser welded, and tested before (G3) and after (G4) electroerosion application. The passive fit between the UCLA abutment of the framework and the implant was evaluated using an optical microscope Olympus STM (Olympus Optical Co., Tokyo, Japan) with 0.0005 mm accuracy. Statistical analyses showed significant differences between G1 and G2, G1 and G3, G1 and G4, and G2 and G4. However, no statistical difference was observed when comparing G2 and G3. These results indicate that frameworks may show a more precise adaptation if they are sectioned and laser welded. In the same way, electroerosion improves the precision of the framework adaptation.
NASA Astrophysics Data System (ADS)
Li, Wei; Henke, Sebastian; Cheetham, Anthony K.
2014-12-01
Metal-organic frameworks (MOFs), a young family of functional materials, have been attracting considerable attention from the chemistry, materials science, and physics communities. In the light of their potential applications in industry and technology, the fundamental mechanical properties of MOFs, which are of critical importance for manufacturing, processing, and performance, need to be addressed and understood. It has been widely accepted that the framework topology, which describes the overall connectivity pattern of the MOF building units, is of vital importance for the mechanical properties. However, recent advances in the area of MOF mechanics reveal that chemistry plays a major role as well. From the viewpoint of materials science, a deep understanding of the influence of chemical effects on MOF mechanics is not only highly desirable for the development of novel functional materials with targeted mechanical response, but also for a better understanding of important properties such as structural flexibility and framework breathing. The present work discusses the intrinsic connection between chemical effects and the mechanical behavior of MOFs through a number of prototypical examples.
Teaching Statistics with Technology
ERIC Educational Resources Information Center
Prodromou, Theodosia
2015-01-01
The Technological Pedagogical Content Knowledge (TPACK) conceptual framework for teaching mathematics, developed by Mishra and Koehler (2006), emphasises the importance of developing integrated and interdependent understanding of three primary forms of knowledge: technology, pedagogy, and content. The TPACK conceptual framework is based upon the…
Mechanical and fracture behavior of veneer-framework composites for all-ceramic dental bridges.
Studart, André R; Filser, Frank; Kocher, Peter; Lüthy, Heinz; Gauckler, Ludwig J
2007-01-01
High-strength ceramics are required in dental posterior restorations in order to withstand the excessive tensile stresses that occur during mastication. The aim of this study was to investigate the fracture behavior and the fast-fracture mechanical strength of three veneer-framework composites (Empress 2/IPS Eris, TZP/Cercon S and Inceram-Zirconia/Vita VM7) for all-ceramic dental bridges. The load-bearing capacity of the veneer-framework composites was evaluated using a bending mechanical apparatus. The stress distribution through the rectangular-shaped layered samples was assessed using simple beam calculations and used to estimate the fracture strength of the veneer layer. Optical microscopy of fractured specimens was employed to determine the origin of cracks and the fracture mode. Under fast-fracture conditions, cracks were observed to initiate on, or close to, the veneer outer surface and propagate towards the inner framework material. Crack deflection occurred at the veneer-framework interface of composites containing a tough framework material (TZP/Cercon S and Inceram-Zirconia/Vita VM7), as opposed to the straight propagation observed in the case of weaker frameworks (Empress 2/IPS Eris). The mechanical strength of dental composites containing a weak framework (K_IC < 3 MPa·m^(1/2)) is ultimately determined by the low fracture strength of the veneer layer, since no crack arresting occurs at the veneer-framework interface. Therefore, high-toughness ceramics (K_IC > 5 MPa·m^(1/2)) should be used as framework materials of posterior all-ceramic bridges, so that cracks propagating from the veneer layer do not lead to a premature failure of the prosthesis.
Statistical Research of Investment Development of Russian Regions
ERIC Educational Resources Information Center
Burtseva, Tatiana A.; Aleshnikova, Vera I.; Dubovik, Mayya V.; Naidenkova, Ksenya V.; Kovalchuk, Nadezda B.; Repetskaya, Natalia V.; Kuzmina, Oksana G.; Surkov, Anton A.; Bershadskaya, Olga I.; Smirennikova, Anna V.
2016-01-01
This article is concerned with substantiating procedures that ensure the implementation of statistical research and monitoring of the investment development of Russian regions, pertinent to the modern development of state statistics. The aim of the study is to develop the methodological framework in order to estimate…
Long-term strategy for the statistical design of a forest health monitoring system
Hans T. Schreuder; Raymond L. Czaplewski
1993-01-01
A conceptual framework is given for a broad-scale survey of forest health that accomplishes three objectives: generate descriptive statistics; detect changes in such statistics; and simplify analytical inferences that identify, and possibly establish cause-effect relationships. Our paper discusses the development of sampling schemes to satisfy these three objectives,...
The Gtr-Model a Universal Framework for Quantum-Like Measurements
NASA Astrophysics Data System (ADS)
Aerts, Diederik; Bianchi, Massimiliano Sassoli De
We present a very general geometrico-dynamical description of physical or more abstract entities, called the general tension-reduction (GTR) model, where not only states, but also measurement-interactions can be represented, and the associated outcome probabilities calculated. Underlying the model is the hypothesis that indeterminism manifests as a consequence of unavoidable fluctuations in the experimental context, in accordance with the hidden-measurements interpretation of quantum mechanics. When the structure of the state space is Hilbertian, and measurements are of the universal kind, i.e., are the result of an average over all possible ways of selecting an outcome, the GTR-model provides the same predictions as the Born rule, and therefore provides a natural completed version of quantum mechanics. However, when the structure of the state space is non-Hilbertian and/or not all possible ways of selecting an outcome are available to be actualized, the predictions of the model generally differ from the quantum ones, especially when sequential measurements are considered. Some paradigmatic examples will be discussed, taken from physics and human cognition. Particular attention will be given to some known psychological effects, like question order effects and response replicability, which we show are able to generate non-Hilbertian statistics. We also suggest a realistic interpretation of the GTR-model, when applied to human cognition and decision, which we think could become the generally adopted interpretative framework in quantum cognition research.
Stress transmission through a model system of cohesionless elastic grains
NASA Astrophysics Data System (ADS)
Da Silva, Miguel; Rajchenbach, Jean
2000-08-01
Understanding the mechanical properties of granular materials is important for applications in civil and chemical engineering, geophysical sciences and the food industry, as well as for the control or prevention of avalanches and landslides. Unlike continuous media, granular materials lack cohesion and cannot resist tensile stresses. Current descriptions of the mechanical properties of collections of cohesionless grains have relied either on elasto-plastic models classically used in civil engineering, or on a recent model involving hyperbolic equations. The former models suggest that collections of elastic grains submitted to a compressive load will behave elastically. Here we present the results of an experiment on a two-dimensional model system, made of discrete square cells submitted to a point load, in which the region where the stress is confined is visualized photoelastically as a parabola. These results, which can be interpreted within a statistical framework, demonstrate that the collective response of the pile contradicts the standard elastic predictions and supports a diffusive description of stress transmission. We expect that these findings will be applicable to problems in soil mechanics, such as the behaviour of cohesionless soils or sand piles.
New statistical scission-point model to predict fission fragment observables
NASA Astrophysics Data System (ADS)
Lemaître, Jean-François; Panebianco, Stefano; Sida, Jean-Luc; Hilaire, Stéphane; Heinrich, Sophie
2015-09-01
The development of high performance computing facilities makes possible a massive production of nuclear data in a full microscopic framework. Taking advantage of the individual potential calculations of more than 7000 nuclei, a new statistical scission-point model, called SPY, has been developed. It gives access to the absolute available energy at the scission point, which allows the use of a parameter-free microcanonical statistical description to calculate the distributions and the mean values of all fission observables. SPY uses the richness of microscopy in a rather simple theoretical framework, without any parameter except the scission-point definition, to draw clear answers based on perfect knowledge of the ingredients involved in the model, with very limited computing cost.
Equilibrium Fluctuation Relations for Voltage Coupling in Membrane Proteins
Kim, Ilsoo; Warshel, Arieh
2015-01-01
A general theoretical framework is developed to account for the effects of an external potential on the energetics of membrane proteins. The framework is based on the free energy relation between two (forward/backward) probability densities, which was recently generalized to non-equilibrium processes, culminating in the work-fluctuation theorem. Starting from the probability densities of the conformational states along the reaction coordinate of "voltage coupling", we investigate several interconnected free energy relations between these two conformational states, considering voltage activation of ion channels. The free energy difference at zero membrane potential (i.e., between the two "non-equilibrium" conformational states) is shown to be equivalent to the free energy difference between the two "equilibrium" conformational states along the one-dimensional reaction coordinate of voltage coupling. Furthermore, the requirement that applying the linear response approximation to the free energy functions of voltage coupling should satisfy the general free energy relations yields a novel expression for the gating charge in terms of other experimentally measurable quantities. This connection is familiar in statistical mechanics as the equilibrium fluctuation-response relation. The theory is illustrated by considering the movement of a unit charge within the membrane under the influence of an external potential, using a coarse-graining (CG) model of membrane proteins, which includes the membrane, the electrolytes and the electrodes. The CG model yields Marcus-type voltage-dependent free energy parabolas for the two conformational states, which allow quantitative estimation of an equilibrium free energy difference, a free energy barrier, and the voltage dependence of channel activation (Q-V curve) for the unit charge movement. In addition, our analysis offers a quantitative rationale for the correlation between the free energy landscapes (parabolas) and the Q-V curve upon site-directed mutagenesis or drug binding. Taken together, by introducing voltage coupling as a reaction coordinate of the energy gap, the present theory offers a firm physical foundation, grounded in the equilibrium theory of statistical mechanics, for thermodynamic models of voltage activation in voltage-sensitive membrane proteins. This formulation also provides a powerful bridge between the CG model and conventional macroscopic treatments, offering an intuitive and quantitative framework for a better understanding of the structure-function correlations of voltage gating in ion channels as well as electrogenic phenomena in ion pumps and transporters. PMID:26290960
PRECISE:PRivacy-prEserving Cloud-assisted quality Improvement Service in hEalthcare
Chen, Feng; Wang, Shuang; Mohammed, Noman; Cheng, Samuel; Jiang, Xiaoqian
2015-01-01
Quality improvement (QI) requires systematic and continuous efforts to enhance healthcare services. A healthcare provider might wish to compare local statistics with those from other institutions in order to identify problems and develop intervention to improve the quality of care. However, the sharing of institution information may be deterred by institutional privacy as publicizing such statistics could lead to embarrassment and even financial damage. In this article, we propose a PRivacy-prEserving Cloud-assisted quality Improvement Service in hEalthcare (PRECISE), which aims at enabling cross-institution comparison of healthcare statistics while protecting privacy. The proposed framework relies on a set of state-of-the-art cryptographic protocols including homomorphic encryption and Yao’s garbled circuit schemes. By securely pooling data from different institutions, PRECISE can rank the encrypted statistics to facilitate QI among participating institutes. We conducted experiments using MIMIC II database and demonstrated the feasibility of the proposed PRECISE framework. PMID:26146645
Steganalysis based on reducing the differences of image statistical characteristics
NASA Astrophysics Data System (ADS)
Wang, Ran; Niu, Shaozhang; Ping, Xijian; Zhang, Tao
2018-04-01
Compared with the embedding process, the image content makes a more significant impact on the differences in image statistical characteristics. This makes image steganalysis a classification problem with larger within-class scatter distances and smaller between-class scatter distances. As a result, the steganalysis features become inseparable because of the differences in image statistical characteristics. In this paper, a new steganalysis framework is proposed that can reduce the differences in image statistical characteristics caused by various contents and processing methods. The given images are segmented into several sub-images according to texture complexity. Steganalysis features are extracted separately from each subset with the same or similar texture complexity to build a classifier. The final steganalysis result is obtained through a weighted fusion process. The theoretical analysis and experimental results demonstrate the validity of the framework.
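A minimal sketch of the segmentation-by-texture-complexity idea above, using local variance as a stand-in complexity measure and hypothetical per-group classifier scores (not the authors' feature set or fusion weights):

import numpy as np

rng = np.random.default_rng(6)
image = rng.integers(0, 256, size=(512, 512)).astype(float)    # hypothetical grayscale image

# split into 64x64 sub-images and rank them by a simple texture-complexity proxy (local variance)
blocks = image.reshape(8, 64, 8, 64).swapaxes(1, 2).reshape(-1, 64, 64)
complexity = blocks.var(axis=(1, 2))
groups = np.digitize(complexity, np.quantile(complexity, [0.33, 0.66]))   # 3 complexity classes

# in the full framework a classifier is trained per complexity class; here the per-group
# "stego" probabilities are hypothetical, and the final decision is a weighted fusion
weights = np.bincount(groups, minlength=3) / len(groups)
group_scores = np.array([0.8, 0.55, 0.4])
print("fused score:", float(weights @ group_scores))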
PRECISE:PRivacy-prEserving Cloud-assisted quality Improvement Service in hEalthcare.
Chen, Feng; Wang, Shuang; Mohammed, Noman; Cheng, Samuel; Jiang, Xiaoqian
2014-10-01
Quality improvement (QI) requires systematic and continuous efforts to enhance healthcare services. A healthcare provider might wish to compare local statistics with those from other institutions in order to identify problems and develop intervention to improve the quality of care. However, the sharing of institution information may be deterred by institutional privacy as publicizing such statistics could lead to embarrassment and even financial damage. In this article, we propose a PRivacy-prEserving Cloud-assisted quality Improvement Service in hEalthcare (PRECISE), which aims at enabling cross-institution comparison of healthcare statistics while protecting privacy. The proposed framework relies on a set of state-of-the-art cryptographic protocols including homomorphic encryption and Yao's garbled circuit schemes. By securely pooling data from different institutions, PRECISE can rank the encrypted statistics to facilitate QI among participating institutes. We conducted experiments using MIMIC II database and demonstrated the feasibility of the proposed PRECISE framework.
Takano, Wataru; Kusajima, Ikuo; Nakamura, Yoshihiko
2016-08-01
It is desirable for robots to be able to linguistically understand human actions during human-robot interactions. Previous research has developed frameworks for encoding human full-body motion into model parameters and for classifying motion into specific categories. For full understanding, the motion categories need to be connected to natural language such that robots can interpret human motions as linguistic expressions. This paper proposes a novel framework for integrating observation of human motion with natural language. The framework consists of two models: the first statistically learns the relations between motions and their relevant words, and the second statistically learns sentence structures as word n-grams. Integrating these two models allows robots to generate sentences from human motions by searching for words relevant to the motion using the first model and then arranging these words in an appropriate order using the second model. This produces the sentences most likely to be generated from the motion. The proposed framework was tested on human full-body motion measured by an optical motion capture system; descriptive sentences were manually attached to the motions, and the validity of the system was demonstrated. Copyright © 2016 Elsevier Ltd. All rights reserved.
Hoch, Jeffrey S; Briggs, Andrew H; Willan, Andrew R
2002-07-01
Economic evaluation is often seen as a branch of health economics divorced from mainstream econometric techniques. Instead, it is perceived as relying on statistical methods for clinical trials. Furthermore, the statistic of interest in cost-effectiveness analysis, the incremental cost-effectiveness ratio, is not amenable to regression-based methods, hence the traditional reliance on comparing aggregate measures across the arms of a clinical trial. In this paper, we explore the potential for health economists undertaking cost-effectiveness analysis to exploit the plethora of established econometric techniques through the use of the net-benefit framework, a recently suggested reformulation of the cost-effectiveness problem that avoids the reliance on cost-effectiveness ratios and their associated statistical problems. This allows the cost-effectiveness problem to be formulated within a standard regression-type framework. We provide an example with empirical data to illustrate how a regression-type framework can enhance the net-benefit method. We go on to suggest that practical advantages of the net-benefit regression approach include being able to use established econometric techniques, adjust for imperfect randomisation, and identify important subgroups in order to estimate the marginal cost-effectiveness of an intervention. Copyright 2002 John Wiley & Sons, Ltd.
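A minimal sketch of the net-benefit regression idea described above, with simulated trial data and a hypothetical willingness-to-pay value (not the paper's empirical example):

import numpy as np

rng = np.random.default_rng(7)
n = 400
treat = rng.integers(0, 2, size=n)                        # 1 = new intervention, 0 = control
effect = 0.70 + 0.05 * treat + rng.normal(0, 0.10, n)     # effectiveness, e.g. QALYs
cost = 1000 + 1500 * treat + rng.normal(0, 300, n)        # patient-level cost

lam = 50000                                               # willingness to pay per unit of effect
nb = lam * effect - cost                                  # individual net benefit

# OLS of net benefit on treatment: the slope is the incremental net benefit at this lambda
X = np.column_stack([np.ones(n), treat])
coef, *_ = np.linalg.lstsq(X, nb, rcond=None)
print(f"incremental net benefit at lambda = {lam}: {coef[1]:.0f}")
# covariates (centre, baseline risk) can be appended to X to adjust for imperfect randomisation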
Marginal discrepancy of CAD-CAM complete-arch fixed implant-supported frameworks.
Yilmaz, Burak; Kale, Ediz; Johnston, William M
2018-02-21
Computer-aided design and computer-aided manufacturing (CAD-CAM) high-density polymers (HDPs) have recently been marketed for the fabrication of long-term interim implant-supported fixed prostheses. However, information regarding the precision of fit of CAD-CAM HDP implant-supported complete-arch screw-retained prostheses is scarce. The purpose of this in vitro study was to evaluate the marginal discrepancy of CAD-CAM HDP complete-arch implant-supported screw-retained fixed prosthesis frameworks and compare them with conventional titanium (Ti) and zirconia (Zir) frameworks. A screw-retained complete-arch acrylic resin prototype with multiunit abutments was fabricated on a typodont model with 2 straight implants in the anterior region and 2 implants with a 30-degree distal tilt in the posterior region. A 3-dimensional (3D) laboratory laser scanner was used to digitize the typodont model with scan bodies and the resin prototype to generate a virtual 3D CAD framework. A CAM milling unit was used to fabricate 5 frameworks from HDP, Ti, and Zir blocks. The 1-screw test was performed by tightening the prosthetic screw in the maxillary left first molar abutment (terminal location) when the frameworks were on the typodont model, and the marginal discrepancy of frameworks was evaluated using an industrial computed tomographic scanner and a 3D volumetric software. The 3D marginal discrepancy at the abutment-framework interface of the maxillary left canine (L1), right canine (L2), and right first molar (L3) sites was measured. The mean values for 3D marginal discrepancy were calculated for each location in a group with 95% confidence limits. The results were analyzed by repeated-measures 2-way ANOVA using the restricted maximum likelihood estimation and the Satterthwaite degrees of freedom methods, which do not require normality and homoscedasticity in the data. The between-subjects factor was material, the within-subjects factor was location, and the interaction was included in the model. Tukey tests were applied to resolve any statistically significant source of variation (overall α=.05). The 3D marginal discrepancy measurement was possible only for L2 and L3 because the L1 values were too small to detect. The mean discrepancy values at L2 were 60 μm for HDP, 74 μm for Ti, and 84 μm for Zir. At the L3 location, the mean discrepancy values were 55 μm for HDP, 102 μm for Ti, and 94 μm for Zir. The ANOVA did not find a statistically significant overall effect for implant location (P=.072) or a statistically significant interaction of location and material (P=.078), but it did find a statistically significant overall effect of material (P=.019). Statistical differences were found overall between HDP and the other 2 materials (P≤.037). When the tested materials were used with the CAD-CAM system, the 3D marginal discrepancy of CAD-CAM HDP frameworks was smaller than that of titanium or zirconia frameworks. Copyright © 2017 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
A Framework for Restructuring the Military Retirement System
2013-07-01
Associate Professor of Economics in the Social Sciences Department at West Point where he teaches econometrics and labor economics. His areas of...others worth considering, but each should be carefully benchmarked against our proposed framework.
Sarntivijai, Sirarat; Zhang, Shelley; Jagannathan, Desikan G.; Zaman, Shadia; Burkhart, Keith K.; Omenn, Gilbert S.; He, Yongqun; Athey, Brian D.; Abernethy, Darrell R.
2016-01-01
Introduction A translational bioinformatics challenge lies in connecting population and individual’s clinical phenotypes in various formats to biological mechanisms. The Medical Dictionary for Regulatory Activities (MedDRA®) is the default dictionary for Adverse Event (AE) reporting in the FDA Adverse Event Reporting System (FAERS). The Ontology of Adverse Events (OAE) represents AEs as pathological processes occurring after drug exposures. Objectives The aim is to establish a semantic framework to link biological mechanisms to phenotypes of AEs by combining OAE with MedDRA® in FAERS data analysis. We investigated the AEs associated with Tyrosine Kinase Inhibitors (TKIs) and monoclonal antibodies (mAbs) targeting tyrosine kinases. The selected 5 TKIs/mAbs (i.e., dasatinib, imatinib, lapatinib, cetuximab, and trastuzumab) are known to induce impaired ventricular function (non-QT) cardiotoxicity. Results Statistical analysis of FAERS data identified 1,053 distinct MedDRA® terms significantly associated with TKIs/mAbs, where 884 did not have corresponding OAE terms. We manually annotated these terms, added them to OAE by the standard OAE development strategy, and mapped them to MedDRA®. The data integration to provide insights into molecular mechanisms for drug-associated AEs is performed by including linkages in OAE for all related AE terms to MedDRA® and existing ontologies including Human Phenotype Ontology (HP), Uber Anatomy Ontology (UBERON), and Gene Ontology (GO). Sixteen AEs are shared by all 5 TKIs/mAbs, and each of 17 cardiotoxicity AEs was associated with at least one TKI/mAb. As an example, we analyzed ‘cardiac failure’ using the relations established in OAE with other ontologies, and demonstrated that one of the biological processes associated with cardiac failure maps to the genes associated with heart contraction. Conclusion By expanding existing OAE ontological design, our TKI use case demonstrates that the combination of OAE and MedDRA® provides a semantic framework to link clinical phenotypes of adverse drug events to biological mechanisms. PMID:27003817
Probabilistic Graphical Model Representation in Phylogenetics
Höhna, Sebastian; Heath, Tracy A.; Boussau, Bastien; Landis, Michael J.; Ronquist, Fredrik; Huelsenbeck, John P.
2014-01-01
Recent years have seen a rapid expansion of the model space explored in statistical phylogenetics, emphasizing the need for new approaches to statistical model representation and software development. Clear communication and representation of the chosen model is crucial for: (i) reproducibility of an analysis, (ii) model development, and (iii) software design. Moreover, a unified, clear and understandable framework for model representation lowers the barrier for beginners and nonspecialists to grasp complex phylogenetic models, including their assumptions and parameter/variable dependencies. Graphical modeling is a unifying framework that has gained in popularity in the statistical literature in recent years. The core idea is to break complex models into conditionally independent distributions. The strength lies in the comprehensibility, flexibility, and adaptability of this formalism, and the large body of computational work based on it. Graphical models are well-suited to teach statistical models, to facilitate communication among phylogeneticists and in the development of generic software for simulation and statistical inference. Here, we provide an introduction to graphical models for phylogeneticists and extend the standard graphical model representation to the realm of phylogenetics. We introduce a new graphical model component, tree plates, to capture the changing structure of the subgraph corresponding to a phylogenetic tree. We describe a range of phylogenetic models using the graphical model framework and introduce modules to simplify the representation of standard components in large and complex models. Phylogenetic model graphs can be readily used in simulation, maximum likelihood inference, and Bayesian inference using, for example, Metropolis–Hastings or Gibbs sampling of the posterior distribution. [Computation; graphical models; inference; modularization; statistical phylogenetics; tree plate.] PMID:24951559
Statistical emulation of landslide-induced tsunamis at the Rockall Bank, NE Atlantic
Guillas, S.; Georgiopoulou, A.; Dias, F.
2017-01-01
Statistical methods constitute a useful approach to understand and quantify the uncertainty that governs complex tsunami mechanisms. Numerical experiments may often have a high computational cost. This forms a limiting factor for performing uncertainty and sensitivity analyses, where numerous simulations are required. Statistical emulators, as surrogates of these simulators, can provide predictions of the physical process in a much faster and computationally inexpensive way. They can form a prominent solution to explore thousands of scenarios that would be otherwise numerically expensive and difficult to achieve. In this work, we build a statistical emulator of the deterministic codes used to simulate submarine sliding and tsunami generation at the Rockall Bank, NE Atlantic Ocean, in two stages. First we calibrate, against observations of the landslide deposits, the parameters used in the landslide simulations. This calibration is performed under a Bayesian framework using Gaussian Process (GP) emulators to approximate the landslide model, and the discrepancy function between model and observations. Distributions of the calibrated input parameters are obtained as a result of the calibration. In a second step, a GP emulator is built to mimic the coupled landslide-tsunami numerical process. The emulator propagates the uncertainties in the distributions of the calibrated input parameters inferred from the first step to the outputs. As a result, a quantification of the uncertainty of the maximum free surface elevation at specified locations is obtained. PMID:28484339
Multiaxis sensing using metal organic frameworks
DOE Office of Scientific and Technical Information (OSTI.GOV)
Talin, Albert Alec; Allendorf, Mark D.; Leonard, Francois
2017-01-17
A sensor device including a sensor substrate; and a thin film comprising a porous metal organic framework (MOF) on the substrate that presents more than one transduction mechanism when exposed to an analyte. A method including exposing a porous metal organic framework (MOF) on a substrate to an analyte; and identifying more than one transduction mechanism in response to the exposure to the analyte.
Quantum Behavior of an Autonomous Maxwell Demon
NASA Astrophysics Data System (ADS)
Chapman, Adrian; Miyake, Akimasa
2015-03-01
A Maxwell Demon is an agent that can exploit knowledge of a system's microstate to perform useful work. The second law of thermodynamics is only recovered upon taking into account the work required to irreversibly update the demon's memory, bringing information theoretic concepts into a thermodynamic framework. Recently, there has been interest in modeling a classical Maxwell demon as an autonomous physical system to study this information-work tradeoff explicitly. Motivated by the idea that states with non-local entanglement structure can be used as a computational resource, we ask whether these states have thermodynamic resource quality as well by generalizing a particular classical autonomous Maxwell demon to the quantum regime. We treat the full quantum description using a matrix product operator formalism, which allows us to handle quantum and classical correlations in a unified framework. Applying this, together with techniques from statistical mechanics, we are able to approximate nonlocal quantities such as the erasure performed on the demon's memory register when correlations are present. Finally, we examine how the demon may use these correlations as a resource to outperform its classical counterpart.
Chatterjee, Abhijit; Vlachos, Dionisios G
2007-07-21
While recently derived continuum mesoscopic equations successfully bridge the gap between microscopic and macroscopic physics, so far they have been derived only for simple lattice models. In this paper, general deterministic continuum mesoscopic equations are derived rigorously via nonequilibrium statistical mechanics to account for multiple interacting surface species and multiple processes on multiple site types and/or different crystallographic planes. Adsorption, desorption, reaction, and surface diffusion are modeled. It is demonstrated that contrary to conventional phenomenological continuum models, microscopic physics, such as the interaction potential, determines the final form of the mesoscopic equation. Models of single component diffusion and binary diffusion of interacting particles on single-type site lattice and of single component diffusion on complex microporous materials' lattices consisting of two types of sites are derived, as illustrations of the mesoscopic framework. Simplification of the diffusion mesoscopic model illustrates the relation to phenomenological models, such as the Fickian and Maxwell-Stefan transport models. It is demonstrated that the mesoscopic equations are in good agreement with lattice kinetic Monte Carlo simulations for several prototype examples studied.
A New Adaptive Framework for Collaborative Filtering Prediction
Almosallam, Ibrahim A.; Shang, Yi
2010-01-01
Collaborative filtering is one of the most successful techniques for recommendation systems and has been used in many commercial services provided by major companies including Amazon, TiVo and Netflix. In this paper we focus on memory-based collaborative filtering (CF). Existing CF techniques work well on dense data but poorly on sparse data. To address this weakness, we propose to use z-scores instead of explicit ratings and introduce a mechanism that adaptively combines global statistics with item-based values based on data density level. We present a new adaptive framework that encapsulates various CF algorithms and the relationships among them. An adaptive CF predictor is developed that can self adapt from user-based to item-based to hybrid methods based on the amount of available ratings. Our experimental results show that the new predictor consistently obtained more accurate predictions than existing CF methods, with the most significant improvement on sparse data sets. When applied to the Netflix Challenge data set, our method performed better than existing CF and singular value decomposition (SVD) methods and achieved 4.67% improvement over Netflix’s system. PMID:21572924
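A minimal sketch of the z-score idea follows. The adaptive blending rule in the paper is more elaborate; the density-based weight below is a simplified stand-in used only to show the mechanics.

```python
# Simplified sketch: convert ratings to per-user z-scores, then blend an
# item-level statistic with a global statistic according to data density.
import numpy as np

R = np.array([[5, 4, 0, 0],      # rows = users, cols = items, 0 marks "not rated"
              [4, 0, 0, 2],
              [0, 5, 4, 0]], dtype=float)
mask = R > 0

R_nan = np.where(mask, R, np.nan)
mu = np.nanmean(R_nan, axis=1, keepdims=True)          # per-user mean rating
sigma = np.nanstd(R_nan, axis=1, keepdims=True) + 1e-9
Z = np.where(mask, (R - mu) / sigma, 0.0)              # per-user z-scores

def predict(user, item):
    raters = mask[:, item]
    global_mean_z = Z[mask].mean()                     # mean of all observed z-scores
    if raters.any():
        item_mean_z = Z[raters, item].mean()
        density = raters.mean()                        # fraction of users who rated the item
        blended = density * item_mean_z + (1 - density) * global_mean_z
    else:
        blended = global_mean_z
    return mu[user, 0] + sigma[user, 0] * blended      # map back to the user's rating scale

print(predict(user=1, item=1))
```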
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Chao; Xu, Zhijie; Lai, Canhai
This report is prepared for the demonstration of hierarchical prediction of carbon capture efficiency of a solvent-based absorption column. A computational fluid dynamics (CFD) model is first developed to simulate the core phenomena of solvent-based carbon capture, i.e., the CO2 physical absorption and chemical reaction, on a simplified geometry of wetted wall column (WWC) at bench scale. Aqueous solutions of ethanolamine (MEA) are commonly selected as a CO2 stream scrubbing liquid. CO2 is captured by both physical and chemical absorption using highly CO2 soluble and reactive solvent, MEA, during the scrubbing process. In order to provide confidence bound on the computational predictions of this complex engineering system, a hierarchical calibration and validation framework is proposed. The overall goal of this effort is to provide a mechanism-based predictive framework with confidence bound for overall mass transfer coefficient of the wetted wall column (WWC) with statistical analyses of the corresponding WWC experiments with increasing physical complexity.
Charting molecular free-energy landscapes with an atlas of collective variables
NASA Astrophysics Data System (ADS)
Hashemian, Behrooz; Millán, Daniel; Arroyo, Marino
2016-11-01
Collective variables (CVs) are a fundamental tool to understand molecular flexibility, to compute free energy landscapes, and to enhance sampling in molecular dynamics simulations. However, identifying suitable CVs is challenging, and is increasingly addressed with systematic data-driven manifold learning techniques. Here, we provide a flexible framework to model molecular systems in terms of a collection of locally valid and partially overlapping CVs: an atlas of CVs. The specific motivation for such a framework is to enhance the applicability and robustness of CVs based on manifold learning methods, which fail in the presence of periodicities in the underlying conformational manifold. More generally, using an atlas of CVs rather than a single chart may help us better describe different regions of conformational space. We develop the statistical mechanics foundation for our multi-chart description and propose an algorithmic implementation. The resulting atlas of data-based CVs is then used to enhance sampling and compute free energy surfaces in two model systems, alanine dipeptide and β-D-glucopyranose, whose conformational manifolds have toroidal and spherical topologies.
Functional Genomics Assistant (FUGA): a toolbox for the analysis of complex biological networks
2011-01-01
Background Cellular constituents such as proteins, DNA, and RNA form a complex web of interactions that regulate biochemical homeostasis and determine the dynamic cellular response to external stimuli. It follows that detailed understanding of these patterns is critical for the assessment of fundamental processes in cell biology and pathology. Representation and analysis of cellular constituents through network principles is a promising and popular analytical avenue towards a deeper understanding of molecular mechanisms in a system-wide context. Findings We present Functional Genomics Assistant (FUGA) - an extensible and portable MATLAB toolbox for the inference of biological relationships, graph topology analysis, random network simulation, network clustering, and functional enrichment statistics. In contrast to conventional differential expression analysis of individual genes, FUGA offers a framework for the study of system-wide properties of biological networks and highlights putative molecular targets using concepts of systems biology. Conclusion FUGA offers a simple and customizable framework for network analysis in a variety of systems biology applications. It is freely available for individual or academic use at http://code.google.com/p/fuga. PMID:22035155
Putting Cognitive Science behind a Statistics Teacher's Intuition
ERIC Educational Resources Information Center
Jones, Karrie A.; Jones, Jennifer L.; Vermette, Paul J.
2011-01-01
Recent advances in cognitive science have led to an enriched understanding of how people learn. Using a framework presented by Willingham, this article examines instructional best practice from the perspective of conceptual understanding and its implications on statistics education.
Engineering Change Management Method Framework in Mechanical Engineering
NASA Astrophysics Data System (ADS)
Stekolschik, Alexander
2016-11-01
Engineering changes affect different process chains both inside and outside the company, and account for a large share of error costs and schedule shifts. In fact, 30 to 50 per cent of development costs result from technical changes. Controlling engineering change processes can help to avoid errors and risks, and contributes to cost optimization and a shorter time to market. This paper presents a method framework for controlling engineering changes at mechanical engineering companies. A classification of engineering changes, together with the process requirements derived from it, forms the basis of the method framework. The framework comprises two main areas: special data objects managed in different engineering IT tools, and a process framework. Objects from both areas are building blocks that can be selected into the overall business process based on the engineering process type and change classification. The process framework contains steps for the creation of change objects (both for the overall change and for individual parts), change implementation, and release. Companies can select single process building blocks from the framework, depending on the product development process and change impact. The change framework has been implemented at a division (10,000 employees) of a large German mechanical engineering company.
NASA Astrophysics Data System (ADS)
Chung, Moo K.; Kim, Seung-Goo; Schaefer, Stacey M.; van Reekum, Carien M.; Peschke-Schmitz, Lara; Sutterer, Matthew J.; Davidson, Richard J.
2014-03-01
The sparse regression framework has been widely used in medical image processing and analysis. However, it has been rarely used in anatomical studies. We present a sparse shape modeling framework using the Laplace-Beltrami (LB) eigenfunctions of the underlying shape and show its improvement of statistical power. Traditionally, the LB-eigenfunctions are used as a basis for intrinsically representing surface shapes as a form of Fourier descriptors. To reduce high frequency noise, only the first few terms are used in the expansion and higher frequency terms are simply thrown away. However, some lower frequency terms may not necessarily contribute significantly in reconstructing the surfaces. Motivated by this idea, we present a LB-based method to filter out only the significant eigenfunctions by imposing a sparse penalty. For dense anatomical data such as deformation fields on a surface mesh, the sparse regression behaves like a smoothing process, which will reduce the error of incorrectly detecting false negatives. Hence the statistical power improves. The sparse shape model is then applied in investigating the influence of age on amygdala and hippocampus shapes in the normal population. The advantage of the LB sparse framework is demonstrated by showing the increased statistical power.
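The core of the approach, selecting only the basis terms that matter rather than truncating at a fixed frequency, can be sketched with an L1-penalized regression. The example below uses a one-dimensional cosine basis as a stand-in for LB-eigenfunctions on a surface mesh; it is illustrative only.

```python
# Sketch of the sparse-basis idea: an L1 penalty keeps only the basis
# functions that matter, instead of keeping a fixed number of low-frequency terms.
import numpy as np
from sklearn.linear_model import Lasso

t = np.linspace(0, 1, 200)
basis = np.column_stack([np.cos(np.pi * k * t) for k in range(30)])
signal = 2.0 * basis[:, 2] - 1.5 * basis[:, 7] + 0.05 * np.random.randn(t.size)

lasso = Lasso(alpha=0.01)
lasso.fit(basis, signal)
kept = np.flatnonzero(np.abs(lasso.coef_) > 1e-3)
print("selected basis indices:", kept)    # ideally close to {2, 7}
```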
Assessing Cultural Competence in Graduating Students
ERIC Educational Resources Information Center
Kohli, Hermeet K.; Kohli, Amarpreet S.; Huber, Ruth; Faul, Anna C.
2010-01-01
Twofold purpose of this study was to develop a framework to understand cultural competence in graduating social work students, and test that framework for appropriateness and predictability using multivariate statistics. Scale and predictor variables were collected using an online instrument from a nationwide convenience sample of graduating…
Willems, Sander; Fraiture, Marie-Alice; Deforce, Dieter; De Keersmaecker, Sigrid C J; De Loose, Marc; Ruttink, Tom; Herman, Philippe; Van Nieuwerburgh, Filip; Roosens, Nancy
2016-02-01
Because the number and diversity of genetically modified (GM) crops have significantly increased, their analysis based on real-time PCR (qPCR) methods is becoming increasingly complex and laborious. While several pioneers already investigated Next Generation Sequencing (NGS) as an alternative to qPCR, its practical use has not been assessed for routine analysis. In this study a statistical framework was developed to predict the number of NGS reads needed to detect transgene sequences, to prove their integration into the host genome and to identify the specific transgene event in a sample with known composition. This framework was validated by applying it to experimental data from food matrices composed of pure GM rice, processed GM rice (noodles) or a 10% GM/non-GM rice mixture, revealing some influential factors. Finally, feasibility of NGS for routine analysis of GM crops was investigated by applying the framework to samples commonly encountered in routine analysis of GM crops. Copyright © 2015 The Authors. Published by Elsevier Ltd. All rights reserved.
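The flavor of calculation such a framework formalizes can be sketched with a simple binomial detection model. The model and all numbers below are illustrative assumptions, not values from the paper.

```python
# Simplified stand-in: probability of observing at least r reads from a
# transgene insert, given total read count N and the fraction p of sequenced
# bases that map to the insert. The paper's statistical framework is more detailed.
from scipy.stats import binom

def detection_probability(N, p, r=1):
    return float(binom.sf(r - 1, N, p))   # P(X >= r) with X ~ Binomial(N, p)

# Hypothetical numbers: a 3 kb construct in a ~400 Mb rice genome at 10% GM content.
p = 0.10 * 3e3 / 4e8
for N in (1e6, 5e6, 2e7):
    print(int(N), detection_probability(int(N), p, r=10))
```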
NASA Astrophysics Data System (ADS)
Schneider, Markus P. A.
This dissertation contributes to two areas in economics: to the understanding of the distribution of earned income and to Bayesian analysis of distributional data. Recently, physicists claimed that the distribution of earned income is exponential (see Yakovenko, 2009). The first chapter explores the perspective that the economy is a statistical mechanical system and the implication for labor market outcomes is considered critically. The robustness of the empirical results that lead to the physicists' claims, the significance of the exponential distribution in statistical mechanics, and the case for a conservation law in economics are discussed. The conclusion reached is that physicists' conception of the economy is too narrow even within their chosen framework, but that their overall approach is insightful. The dual labor market theory of segmented labor markets is invoked to understand why the observed distribution may be a mixture of distributional components, corresponding to different generating mechanisms described in Reich et al. (1973). The application of informational entropy in chapter II connects this work to Bayesian analysis and maximum entropy econometrics. The analysis follows E. T. Jaynes's treatment of Wolf's dice data, but is applied to the distribution of earned income based on CPS data. The results are calibrated to account for rounded survey responses using a simple simulation, and answer the graphical analyses by physicists. The results indicate that neither the income distribution of all respondents nor of the subpopulation used by physicists appears to be exponential. The empirics do support the claim that a mixture with exponential and log-normal distributional components fits the data. In the final chapter, a log-linear model is used to fit the exponential to the earned income distribution. Separating the CPS data by gender and marital status reveals that the exponential is only an appropriate model for a limited number of subpopulations, namely the never married and women. The estimated parameter for never-married men's incomes is significantly different from the parameter estimated for never-married women, implying that either the combined distribution is not exponential or that the individual distributions are not exponential. However, it substantiates the existence of a persistent gender income gap among the never-married. References: Reich, M., D. M. Gordon, and R. C. Edwards (1973). A Theory of Labor Market Segmentation. Quarterly Journal of Economics 63, 359-365. Yakovenko, V. M. (2009). Econophysics, Statistical Mechanics Approach to. In R. A. Meyers (Ed.), Encyclopedia of Complexity and System Science. Springer.
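The comparison logic, fitting competing distributions by maximum likelihood and contrasting their fit, can be sketched as follows on synthetic data. The dissertation works with CPS earnings data and a mixture model; this is only a toy demonstration of the comparison step.

```python
# Fit exponential and log-normal models by maximum likelihood and compare AIC
# on synthetic "earnings" data (illustrative only).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
income = rng.lognormal(mean=10.3, sigma=0.7, size=5000)

loc_e, scale_e = stats.expon.fit(income, floc=0)
s, loc_l, scale_l = stats.lognorm.fit(income, floc=0)

ll_exp = stats.expon.logpdf(income, loc_e, scale_e).sum()
ll_log = stats.lognorm.logpdf(income, s, loc_l, scale_l).sum()
aic = lambda ll, k: 2 * k - 2 * ll
print("AIC exponential:", aic(ll_exp, 1), " AIC log-normal:", aic(ll_log, 2))
```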
The Statistical Basis of Chemical Equilibria.
ERIC Educational Resources Information Center
Hauptmann, Siegfried; Menger, Eva
1978-01-01
Describes a machine which demonstrates the statistical bases of chemical equilibrium, and in doing so conveys insight into the connections among statistical mechanics, quantum mechanics, Maxwell Boltzmann statistics, statistical thermodynamics, and transition state theory. (GA)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kapustin, Eugene A.; Lee, Seungkyu; Alshammari, Ahmad S.
Despite numerous studies on chemical and thermal stability of metal-organic frameworks (MOFs), mechanical stability remains largely undeveloped. No strategy exists to control the mechanical deformation of MOFs under ultrahigh pressure, to date. We show that the mechanically unstable MOF-520 can be retrofitted by precise placement of a rigid 4,4'-biphenyldicarboxylate (BPDC) linker as a "girder" to afford a mechanically robust framework: MOF-520-BPDC. This retrofitting alters how the structure deforms under ultrahigh pressure and thus leads to a drastic enhancement of its mechanical robustness. While in the parent MOF-520 the pressure transmitting medium molecules diffuse into the pore and expand the structure from the inside upon compression, the girder in the new retrofitted MOF-520-BPDC prevents the framework from expansion by linking two adjacent secondary building units together. As a result, the modified MOF is stable under hydrostatic compression in a diamond-anvil cell up to 5.5 gigapascal. The increased mechanical stability of MOF-520-BPDC prohibits the typical amorphization observed for MOFs in this pressure range. Direct correlation between the orientation of these girders within the framework and its linear strain was estimated, providing new insights for the design of MOFs with optimized mechanical properties.
Validity criteria for Fermi's golden rule scattering rates applied to metallic nanowires.
Moors, Kristof; Sorée, Bart; Magnus, Wim
2016-09-14
Fermi's golden rule underpins the investigation of mobile carriers propagating through various solids, being a standard tool to calculate their scattering rates. As such, it provides a perturbative estimate under the implicit assumption that the effect of the interaction Hamiltonian which causes the scattering events is sufficiently small. To check the validity of this assumption, we present a general framework to derive simple validity criteria in order to assess whether the scattering rates can be trusted for the system under consideration, given its statistical properties such as average size, electron density, impurity density et cetera. We derive concrete validity criteria for metallic nanowires with conduction electrons populating a single parabolic band subjected to different elastic scattering mechanisms: impurities, grain boundaries and surface roughness.
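For reference, the scattering rate in question is the standard textbook golden-rule expression; the paper's contribution is a set of criteria for when this perturbative estimate can be trusted.

```latex
\Gamma_{i \to f} \;=\; \frac{2\pi}{\hbar}\,\bigl|\langle f \,|\, H' \,|\, i \rangle\bigr|^{2}\,\rho(E_f)
```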
Molecular modeling of polycarbonate materials: Glass transition and mechanical properties
NASA Astrophysics Data System (ADS)
Palczynski, Karol; Wilke, Andreas; Paeschke, Manfred; Dzubiella, Joachim
2017-09-01
Linking the experimentally accessible macroscopic properties of thermoplastic polymers to their microscopic static and dynamic properties is a key requirement for targeted material design. Classical molecular dynamics simulations enable us to study the structural and dynamic behavior of molecules on microscopic scales, and statistical physics provides a framework for relating these properties to the macroscopic properties. We take a first step toward creating an automated workflow for the theoretical prediction of thermoplastic material properties by developing an expeditious method for parameterizing a simple yet surprisingly powerful coarse-grained bisphenol-A polycarbonate model which goes beyond previous coarse-grained models and successfully reproduces the thermal expansion behavior, the glass transition temperature as a function of the molecular weight, and several elastic properties.
High-energy transformations of polyfluoroalkanes. IX. Pyrolysis of 1,1-difluoroethane
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mitin, P.V.; Golovin, A.V.; Grigor'eva, T.Yu.
1994-07-10
Kinetics of the unimolecular thermal dehydrofluorination of 1,1-difluoroethane in a flow reactor is reported. The first-order rate constant is determined: log k [1/s] = (−60,000 ± 2,000)/(4.569·T) + 13.33 ± 0.10. 1,1-Difluoroethylene, as a by-product of the pyrolysis of 1,1-difluoroethane, is formed by a radical mechanism, for which a heterogeneous initiation stage is proposed. MNDO calculations show the predominant formation of the CH3-CF2 radical at the initiation stage. For this radical, rate constants of unimolecular 1→2 and 2→1 hydrogen shifts are determined within the framework of RRKM statistical theory. 17 refs., 4 figs., 2 tabs.
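Evaluating the reported rate expression at a few temperatures, using the constants exactly as given in the abstract, illustrates its use:

```python
# Evaluate the reported first-order rate expression,
# log10 k = -60000/(4.569*T) + 13.33, at a few temperatures.
# Constants are taken verbatim from the abstract (activation energy in cal/mol,
# with 4.569 as reported, roughly 2.303*R in cal units); units of k as reported.
def rate_constant(T):
    return 10 ** (-60000.0 / (4.569 * T) + 13.33)

for T in (800.0, 900.0, 1000.0):
    print(f"T = {T:.0f} K  k ≈ {rate_constant(T):.3e} 1/s")
```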
Numerical test of the Edwards conjecture shows that all packings are equally probable at jamming
NASA Astrophysics Data System (ADS)
Martiniani, Stefano; Schrenk, K. Julian; Ramola, Kabir; Chakraborty, Bulbul; Frenkel, Daan
2017-09-01
In the late 1980s, Sam Edwards proposed a possible statistical-mechanical framework to describe the properties of disordered granular materials. A key assumption underlying the theory was that all jammed packings are equally likely. In the intervening years it has never been possible to test this bold hypothesis directly. Here we present simulations that provide direct evidence that at the unjamming point, all packings of soft repulsive particles are equally likely, even though generically, jammed packings are not. Typically, jammed granular systems are observed precisely at the unjamming point since grains are not very compressible. Our results therefore support Edwards’ original conjecture. We also present evidence that at unjamming the configurational entropy of the system is maximal.
Statistical Analysis of CFD Solutions from the Drag Prediction Workshop
NASA Technical Reports Server (NTRS)
Hemsch, Michael J.
2002-01-01
A simple, graphical framework is presented for robust statistical evaluation of results obtained from N-Version testing of a series of RANS CFD codes. The solutions were obtained by a variety of code developers and users for the June 2001 Drag Prediction Workshop sponsored by the AIAA Applied Aerodynamics Technical Committee. The aerodynamic configuration used for the computational tests is the DLR-F4 wing-body combination previously tested in several European wind tunnels and for which a previous N-Version test had been conducted. The statistical framework is used to evaluate code results for (1) a single cruise design point, (2) drag polars and (3) drag rise. The paper concludes with a discussion of the meaning of the results, especially with respect to predictability, Validation, and reporting of solutions.
EFFECTS-BASED CUMULATIVE RISK ASSESSMENT IN A LOW-INCOME URBAN COMMUNITY NEAR A SUPERFUND SITE
We will introduce into the cumulative risk assessment framework novel methods for non-cancer risk assessment, techniques for dose-response modeling that extend insights from chemical mixtures frameworks to non-chemical stressors, multilevel statistical methods used to address ...
Are dragon-king neuronal avalanches dungeons for self-organized brain activity?
NASA Astrophysics Data System (ADS)
de Arcangelis, L.
2012-05-01
Recent experiments have detected a novel form of spontaneous neuronal activity both in vitro and in vivo: neuronal avalanches. The statistical properties of this activity are typical of critical phenomena, with power laws characterizing the distributions of avalanche size and duration. A critical behaviour for the spontaneous brain activity has important consequences on stimulated activity and learning. Very interestingly, these statistical properties can be altered in significant ways in epilepsy and by pharmacological manipulations. In particular, there can be an increase in the number of large events anticipated by the power law, referred to herein as dragon-king avalanches. This behaviour, as verified by numerical models, can originate from a number of different mechanisms. For instance, it is observed experimentally that the emergence of a critical behaviour depends on the subtle balance between excitatory and inhibitory mechanisms acting in the system. Perturbing this balance, by increasing either synaptic excitation or the incidence of depolarized neuronal up-states causes frequent dragon-king avalanches. Conversely, an unbalanced GABAergic inhibition or long periods of low activity in the network give rise to sub-critical behaviour. Moreover, the existence of power laws, common to other stochastic processes, like earthquakes or solar flares, suggests that correlations are relevant in these phenomena. The dragon-king avalanches may then also be the expression of pathological correlations leading to frequent avalanches encompassing all neurons. We will review the statistics of neuronal avalanches in experimental systems. We then present numerical simulations of a neuronal network model introducing within the self-organized criticality framework ingredients from the physiology of real neurons, as the refractory period, synaptic plasticity and inhibitory synapses. The avalanche critical behaviour and the role of dragon-king avalanches will be discussed in relation to different drives, neuronal states and microscopic mechanisms of charge storage and release in neuronal networks.
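The basic avalanche statistics discussed above can be sketched as follows: sizes drawn from a power law and the exponent recovered with the standard maximum-likelihood estimator. This is a toy example, not the experimental data.

```python
# Draw avalanche sizes from P(s) ~ s^(-alpha) and recover the exponent with
# the continuous maximum-likelihood estimator (s >= s_min).
import numpy as np

rng = np.random.default_rng(1)
alpha_true, s_min, n = 1.5, 1.0, 20000
u = rng.random(n)
sizes = s_min * (1.0 - u) ** (-1.0 / (alpha_true - 1.0))   # inverse-CDF sampling

alpha_hat = 1.0 + n / np.sum(np.log(sizes / s_min))
print(f"estimated avalanche-size exponent: {alpha_hat:.3f}")
# Dragon-king behaviour would show up as an excess of very large events
# beyond what this fitted power law predicts.
```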
Martin, Jordan S; Suarez, Scott A
2017-08-01
Interest in quantifying consistent among-individual variation in primate behavior, also known as personality, has grown rapidly in recent decades. Although behavioral coding is the most frequently utilized method for assessing primate personality, limitations in current statistical practice prevent researchers from utilizing the full potential of their coding datasets. These limitations include the use of extensive data aggregation, not modeling biologically relevant sources of individual variance during repeatability estimation, not partitioning between-individual (co)variance prior to modeling personality structure, the misuse of principal component analysis, and an over-reliance upon exploratory statistical techniques to compare personality models across populations, species, and data collection methods. In this paper, we propose a statistical framework for primate personality research designed to address these limitations. Our framework synthesizes recently developed mixed-effects modeling approaches for quantifying behavioral variation with an information-theoretic model selection paradigm for confirmatory personality research. After detailing a multi-step analytic procedure for personality assessment and model comparison, we employ this framework to evaluate seven models of personality structure in zoo-housed bonobos (Pan paniscus). We find that differences between sexes, ages, zoos, time of observation, and social group composition contributed to significant behavioral variance. Independently of these factors, however, personality nonetheless accounted for a moderate to high proportion of variance in average behavior across observational periods. A personality structure derived from past rating research receives the strongest support relative to our model set. This model suggests that personality variation across the measured behavioral traits is best described by two correlated but distinct dimensions reflecting individual differences in affiliation and sociability (Agreeableness) as well as activity level, social play, and neophilia toward non-threatening stimuli (Openness). These results underscore the utility of our framework for quantifying personality in primates and facilitating greater integration between the behavioral ecological and comparative psychological approaches to personality research. © 2017 Wiley Periodicals, Inc.
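The repeatability idea at the heart of such analyses, the proportion of behavioral variance attributable to consistent among-individual differences, can be sketched with a balanced one-way variance decomposition. The paper's mixed-effects models additionally adjust for sex, age, group and other effects; this strips the idea to its core on simulated data.

```python
# Repeatability from a balanced one-way ANOVA decomposition (toy data).
import numpy as np

rng = np.random.default_rng(2)
n_ind, n_obs = 40, 10
ind_effect = rng.normal(0.0, 1.0, n_ind)                          # among-individual sd = 1
data = ind_effect[:, None] + rng.normal(0.0, 1.5, (n_ind, n_obs))  # within-individual sd = 1.5

grand = data.mean()
msb = n_obs * ((data.mean(axis=1) - grand) ** 2).sum() / (n_ind - 1)
msw = ((data - data.mean(axis=1, keepdims=True)) ** 2).sum() / (n_ind * (n_obs - 1))
var_among = (msb - msw) / n_obs
repeatability = var_among / (var_among + msw)
print(f"repeatability ≈ {repeatability:.2f}  (true value 1 / (1 + 1.5**2) ≈ 0.31)")
```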
ERIC Educational Resources Information Center
Lin, Mengquan; Chang, Kai; Gong, Le
2016-01-01
The higher education quality evaluation and assurance frameworks and their operating mechanisms of countries such as the United Kingdom, France, and the United States show that higher education systems, traditional culture, and social background all impact quality assurance operating mechanisms. A model analysis of these higher education quality…
A probabilistic, distributed, recursive mechanism for decision-making in the brain
Gurney, Kevin N.
2018-01-01
Decision formation recruits many brain regions, but the procedure they jointly execute is unknown. Here we characterize its essential composition, using as a framework a novel recursive Bayesian algorithm that makes decisions based on spike-trains with the statistics of those in sensory cortex (MT). Using it to simulate the random-dot-motion task, we demonstrate it quantitatively replicates the choice behaviour of monkeys, whilst predicting losses of otherwise usable information from MT. Its architecture maps to the recurrent cortico-basal-ganglia-thalamo-cortical loops, whose components are all implicated in decision-making. We show that the dynamics of its mapped computations match those of neural activity in the sensorimotor cortex and striatum during decisions, and forecast those of basal ganglia output and thalamus. This also predicts which aspects of neural dynamics are and are not part of inference. Our single-equation algorithm is probabilistic, distributed, recursive, and parallel. Its success at capturing anatomy, behaviour, and electrophysiology suggests that the mechanism implemented by the brain has these same characteristics. PMID:29614077
Faes, Luca; Nollo, Giandomenico; Krohova, Jana; Czippelova, Barbora; Turianikova, Zuzana; Javorka, Michal
2017-07-01
To fully elucidate the complex physiological mechanisms underlying the short-term autonomic regulation of heart period (H), systolic and diastolic arterial pressure (S, D) and respiratory (R) variability, the joint dynamics of these variables need to be explored using multivariate time series analysis. This study proposes the utilization of information-theoretic measures to measure causal interactions between nodes of the cardiovascular/cardiorespiratory network and to assess the nature (synergistic or redundant) of these directed interactions. Indexes of information transfer and information modification are extracted from the H, S, D and R series measured from healthy subjects in a resting state and during postural stress. Computations are performed in the framework of multivariate linear regression, using bootstrap techniques to assess on a single-subject basis the statistical significance of each measure and of its transitions across conditions. We find patterns of information transfer and modification which are related to specific cardiovascular and cardiorespiratory mechanisms in resting conditions and to their modification induced by the orthostatic stress.
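Under a multivariate linear regression framework, a directed information-transfer measure reduces to comparing residual variances of nested autoregressive models. The two-series sketch below shows that core computation; the study itself works with four series (H, S, D, R) and bootstrap significance tests.

```python
# Linear-Gaussian information transfer (Granger-style):
# TE(X->Y) = 0.5 * ln( var(residual, Y-past only) / var(residual, Y-past + X-past) ).
import numpy as np

def transfer_entropy_linear(x, y, lag=1):
    y_t = y[lag:]
    y_past = y[:-lag].reshape(-1, 1)
    x_past = x[:-lag].reshape(-1, 1)
    ones = np.ones_like(y_past)
    reduced = np.hstack([ones, y_past])
    full = np.hstack([ones, y_past, x_past])
    e_r = y_t - reduced @ np.linalg.lstsq(reduced, y_t, rcond=None)[0]
    e_f = y_t - full @ np.linalg.lstsq(full, y_t, rcond=None)[0]
    return 0.5 * np.log(e_r.var() / e_f.var())

rng = np.random.default_rng(3)
x = rng.normal(size=2000)
y = np.zeros_like(x)
for t in range(1, x.size):                      # y is driven by the past of x
    y[t] = 0.6 * y[t - 1] + 0.8 * x[t - 1] + rng.normal()
print("TE(x -> y):", transfer_entropy_linear(x, y))
print("TE(y -> x):", transfer_entropy_linear(y, x))
```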
Causal Modeling the Delayed-Choice Experiment
NASA Astrophysics Data System (ADS)
Chaves, Rafael; Lemos, Gabriela Barreto; Pienaar, Jacques
2018-05-01
Wave-particle duality has become one of the flagships of quantum mechanics. This counterintuitive concept is highlighted in a delayed-choice experiment, where the experimental setup that reveals either the particle or wave nature of a quantum system is decided after the system has entered the apparatus. Here we consider delayed-choice experiments from the perspective of device-independent causal models and show their equivalence to a prepare-and-measure scenario. Within this framework, we consider Wheeler's original proposal and its variant using a quantum control and show that a simple classical causal model is capable of reproducing the quantum mechanical predictions. Nonetheless, among other results, we show that, in a slight variant of Wheeler's gedanken experiment, a photon in an interferometer can indeed generate statistics incompatible with any nonretrocausal hidden variable model, whose dimensionality is the same as that of the quantum system it is supposed to mimic. Our proposal tolerates arbitrary losses and inefficiencies, making it specially suited to loophole-free experimental implementations.
The Impact of Climate Projection Method on the Analysis of Climate Change in Semi-arid Basins
NASA Astrophysics Data System (ADS)
Halper, E.; Shamir, E.
2016-12-01
In small basins with arid climates, rainfall characteristics are highly variable and stream flow is tightly coupled with the nuances of rainfall events (e.g., hourly precipitation patterns). Climate change assessments in these basins typically employ CMIP5 projections downscaled with Bias Corrected Statistical Downscaling and Bias Correction/Constructed Analogs (BCSD-BCCA) methods, but these products have drawbacks. Specifically, these BCSD-BCCA projections do not explicitly account for localized physical precipitation mechanisms (e.g. monsoon and snowfall) that are essential to many hydrological systems in the U.S. Southwest. An investigation of the impact of different types of precipitation projections for two kinds of hydrologic studies is being conducted under the U.S. Bureau of Reclamation's Science and Technology Grant Program. An innovative modeling framework consisting of a weather generator of likely hourly precipitation scenarios, coupled with rainfall-runoff, river routing and groundwater models, has been developed in the Nogales, Arizona area. This framework can simulate the impact of future climate on municipal water operations. It also allows the rigorous comparison of the BCSD-BCCA methods with alternative approaches including rainfall output from dynamically downscaled Regional Climate Models (RCM), a stochastic rainfall generator forced by either Global Climate Models (GCM) or RCM, and projections using historical records conditioned on either GCM or RCM. The results will provide guidance for incorporating climate change projections into hydrologic studies of semi-arid areas. The project extends this comparison to analyses of flood control. Large flows on the Bill Williams River are a concern for the operation of dams along the Lower Colorado River. After adapting the weather generator for this region, we will evaluate the model performance for rainfall and stream flow, with emphasis on statistical features important to the specific needs of flood management. The end product of the research is to develop a test to guide selection of a precipitation projection method (including downscaling procedure) for a given region and objective.
NASA Astrophysics Data System (ADS)
Mosby, Matthew; Matouš, Karel
2015-12-01
Three-dimensional simulations capable of resolving the large range of spatial scales, from the failure-zone thickness up to the size of the representative unit cell, in damage mechanics problems of particle reinforced adhesives are presented. We show that resolving this wide range of scales in complex three-dimensional heterogeneous morphologies is essential in order to apprehend fracture characteristics, such as strength, fracture toughness and shape of the softening profile. Moreover, we show that computations that resolve essential physical length scales capture the particle size-effect in fracture toughness, for example. In the vein of image-based computational materials science, we construct statistically optimal unit cells containing hundreds to thousands of particles. We show that these statistically representative unit cells are capable of capturing the first- and second-order probability functions of a given data-source with better accuracy than traditional inclusion packing techniques. In order to accomplish these large computations, we use a parallel multiscale cohesive formulation and extend it to finite strains including damage mechanics. The high-performance parallel computational framework is executed on up to 1024 processing cores. A mesh convergence and a representative unit cell study are performed. Quantifying the complex damage patterns in simulations consisting of tens of millions of computational cells and millions of highly nonlinear equations requires data-mining the parallel simulations, and we propose two damage metrics to quantify the damage patterns. A detailed study of volume fraction and filler size on the macroscopic traction-separation response of heterogeneous adhesives is presented.
Schaid, Daniel J
2010-01-01
Measures of genomic similarity are the basis of many statistical analytic methods. We review the mathematical and statistical basis of similarity methods, particularly based on kernel methods. A kernel function converts information for a pair of subjects to a quantitative value representing either similarity (larger values meaning more similar) or distance (smaller values meaning more similar), with the requirement that it must create a positive semidefinite matrix when applied to all pairs of subjects. This review emphasizes the wide range of statistical methods and software that can be used when similarity is based on kernel methods, such as nonparametric regression, linear mixed models and generalized linear mixed models, hierarchical models, score statistics, and support vector machines. The mathematical rigor for these methods is summarized, as is the mathematical framework for making kernels. This review provides a framework to move from intuitive and heuristic approaches to define genomic similarities to more rigorous methods that can take advantage of powerful statistical modeling and existing software. A companion paper reviews novel approaches to creating kernels that might be useful for genomic analyses, providing insights with examples [1]. Copyright © 2010 S. Karger AG, Basel.
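A minimal sketch of the kernel idea follows: a similarity function applied to all pairs of subjects must yield a positive semidefinite matrix, which can be checked directly. The toy genotype data below is invented for illustration; the kernels useful in practice are discussed in the companion paper cited above.

```python
# Build a Gaussian similarity kernel over toy genotype dosages (0/1/2) and
# verify positive semidefiniteness via its eigenvalues.
import numpy as np

rng = np.random.default_rng(4)
G = rng.integers(0, 3, size=(20, 100)).astype(float)   # 20 subjects x 100 SNPs

sq_dists = ((G[:, None, :] - G[None, :, :]) ** 2).sum(axis=2)
K = np.exp(-sq_dists / G.shape[1])                      # similarity in (0, 1]

eigvals = np.linalg.eigvalsh(K)
print("smallest eigenvalue:", eigvals.min())            # >= 0 up to round-off
```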
Nonuniform continuum model for solvatochromism based on frozen-density embedding theory.
Shedge, Sapana Vitthal; Wesolowski, Tomasz A
2014-10-20
Frozen-density embedding theory (FDET) provides the formal framework for multilevel numerical simulations, such that a selected subsystem is described at the quantum mechanical level, whereas its environment is described by means of the electron density (frozen density; $\rho_{\mathrm{B}}(\vec r)$). The frozen density $\rho_{\mathrm{B}}(\vec r)$ is usually obtained from some lower-level quantum mechanical methods applied to the environment, but FDET is not limited to such choices for $\rho_{\mathrm{B}}(\vec r)$. The present work concerns the application of FDET in which $\rho_{\mathrm{B}}(\vec r)$ is the statistically averaged electron density of the solvent, $\langle \rho_{\mathrm{B}}(\vec r) \rangle$. The specific solute-solvent interactions are represented in a statistical manner in $\langle \rho_{\mathrm{B}}(\vec r) \rangle$. A fully self-consistent treatment of the solvated chromophore thus involves a single geometry of the chromophore in a given state and the corresponding $\langle \rho_{\mathrm{B}}(\vec r) \rangle$. We show that the coupling between the two descriptors might be made in an approximate manner that is applicable for both absorption and emission. The proposed protocol leads to accurate (error in the range of 0.05 eV) descriptions of the solvatochromic shifts in both absorption and emission. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Quantum Mechanics and the Principle of Least Radix Economy
NASA Astrophysics Data System (ADS)
Garcia-Morales, Vladimir
2015-03-01
A new variational method, the principle of least radix economy, is formulated. The mathematical and physical relevance of the radix economy, also called digit capacity, is established, showing how physical laws can be derived from this concept in a unified way. The principle reinterprets and generalizes the principle of least action yielding two classes of physical solutions: least action paths and quantum wavefunctions. A new physical foundation of the Hilbert space of quantum mechanics is then accomplished and it is used to derive the Schrödinger and Dirac equations and the breaking of the commutativity of spacetime geometry. The formulation provides an explanation of how determinism and random statistical behavior coexist in spacetime and a framework is developed that allows dynamical processes to be formulated in terms of chains of digits. These methods lead to a new (pre-geometrical) foundation for Lorentz transformations and special relativity. The Parker-Rhodes combinatorial hierarchy is encompassed within our approach and this leads to an estimate of the interaction strength of the electromagnetic and gravitational forces that agrees with the experimental values to an error of less than one thousandth. Finally, it is shown how the principle of least-radix economy naturally gives rise to Boltzmann's principle of classical statistical thermodynamics. A new expression for a general (path-dependent) nonequilibrium entropy is proposed satisfying the Second Law of Thermodynamics.
Factors that enable and hinder the implementation of projects in the alcohol and other drug field.
MacLean, Sarah; Berends, Lynda; Hunter, Barbara; Roberts, Bridget; Mugavin, Janette
2012-02-01
Few studies systematically explore elements of successful project implementation across a range of alcohol and other drug (AOD) activities. This paper provides an evidence base to inform project implementation in the AOD field. We accessed records for 127 completed projects funded by the Alcohol, Education and Rehabilitation Foundation from 2002 to 2008. An adapted realist synthesis methodology enabled us to develop categories of enablers and barriers to successful project implementation, and to identify factors statistically associated with successful project implementation, defined as meeting all funding objectives. Thematic analysis of eight case study projects allowed detailed exploration of findings. Nine enabler and 10 barrier categories were identified. Those most frequently reported as both barriers and enablers concerned partnerships with external agencies and communities, staffing and project design. Achieving supportive relationships with partner agencies and communities, employing skilled staff and implementing consumer or participant input mechanisms were statistically associated with successful project implementation. The framework described here will support development of evidence-based project funding guidelines and project performance indicators. The study provides evidence that investing project hours and resources to develop robust relationships with project partners and communities, implementing mechanisms for consumer or participant input and attracting skilled staff are legitimate and important activities, not just in themselves but because they potentially influence achievement of project funding objectives. © 2012 The Authors. ANZJPH © 2012 Public Health Association of Australia.
ERIC Educational Resources Information Center
Turan, Fikret Korhan; Cetinkaya, Saadet; Ustun, Ceyda
2016-01-01
Building sustainable universities calls for participative management and collaboration among stakeholders. Combining analytic hierarchy and network processes (AHP/ANP) with statistical analysis, this research proposes a framework that can be used in higher education institutions for integrating stakeholder preferences into strategic decisions. The…
ERIC Educational Resources Information Center
Mano, Quintino R.
2016-01-01
Accumulating evidence suggests that literacy acquisition involves developing sensitivity to the statistical regularities of the textual environment. To organize accumulating evidence and help guide future inquiry, this article integrates data from disparate fields of study and formalizes a new two-process framework for developing sensitivity to…
76 FR 47533 - Fisheries of the Northeastern United States; Monkfish; Framework Adjustment 7
Federal Register 2010, 2011, 2012, 2013, 2014
2011-08-05
... FMP). The New England Fishery Management Council and Mid-Atlantic Fishery Management Council (Councils) developed Framework 7 to adjust the annual catch target (ACT) for the Northern Fishery Management... catch (ABC) for monkfish. The New England Council's Scientific and Statistical Committee (SSC) has...
Commentary: Using Potential Outcomes to Understand Causal Mediation Analysis
ERIC Educational Resources Information Center
Imai, Kosuke; Jo, Booil; Stuart, Elizabeth A.
2011-01-01
In this commentary, we demonstrate how the potential outcomes framework can help understand the key identification assumptions underlying causal mediation analysis. We show that this framework can lead to the development of alternative research design and statistical analysis strategies applicable to the longitudinal data settings considered by…
Graph embedding and extensions: a general framework for dimensionality reduction.
Yan, Shuicheng; Xu, Dong; Zhang, Benyu; Zhang, Hong-Jiang; Yang, Qiang; Lin, Stephen
2007-01-01
Over the past few decades, a large family of algorithms - supervised or unsupervised; stemming from statistics or geometry theory - has been designed to provide different solutions to the problem of dimensionality reduction. Despite the different motivations of these algorithms, we present in this paper a general formulation known as graph embedding to unify them within a common framework. In graph embedding, each algorithm can be considered as the direct graph embedding or its linear/kernel/tensor extension of a specific intrinsic graph that describes certain desired statistical or geometric properties of a data set, with constraints from scale normalization or a penalty graph that characterizes a statistical or geometric property that should be avoided. Furthermore, the graph embedding framework can be used as a general platform for developing new dimensionality reduction algorithms. By utilizing this framework as a tool, we propose a new supervised dimensionality reduction algorithm called Marginal Fisher Analysis in which the intrinsic graph characterizes the intraclass compactness and connects each data point with its neighboring points of the same class, while the penalty graph connects the marginal points and characterizes the interclass separability. We show that MFA effectively overcomes the limitations of the traditional Linear Discriminant Analysis algorithm due to data distribution assumptions and available projection directions. Real face recognition experiments show the superiority of our proposed MFA in comparison to LDA, also for corresponding kernel and tensor extensions.
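The unifying objective is commonly written as a Laplacian quadratic form over the intrinsic graph, with the constraint supplied either by scale normalization or by the Laplacian of the penalty graph. The notation below is a paraphrase of the standard formulation rather than a verbatim statement from the paper.

```latex
y^{*} \;=\; \arg\min_{y^{\top} B y = d}\; \sum_{i \neq j} \lVert y_i - y_j \rVert^{2}\, W_{ij}
      \;=\; \arg\min_{y^{\top} B y = d}\; y^{\top} L y ,
\qquad L = D - W,\quad D_{ii} = \textstyle\sum_{j} W_{ij}
```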
A segmentation editing framework based on shape change statistics
NASA Astrophysics Data System (ADS)
Mostapha, Mahmoud; Vicory, Jared; Styner, Martin; Pizer, Stephen
2017-02-01
Segmentation is a key task in medical image analysis because its accuracy significantly affects successive steps. Automatic segmentation methods often produce inadequate segmentations, which require the user to manually edit the produced segmentation slice by slice. Because editing is time-consuming, an editing tool that enables the user to produce accurate segmentations by only drawing a sparse set of contours would be needed. This paper describes such a framework as applied to a single object. Constrained by the additional information enabled by the manually segmented contours, the proposed framework utilizes object shape statistics to transform the failed automatic segmentation to a more accurate version. Instead of modeling the object shape, the proposed framework utilizes shape change statistics that were generated to capture the object deformation from the failed automatic segmentation to its corresponding correct segmentation. An optimization procedure was used to minimize an energy function that consists of two terms, an external contour match term and an internal shape change regularity term. The high accuracy of the proposed segmentation editing approach was confirmed by testing it on a simulated data set based on 10 in-vivo infant magnetic resonance brain data sets using four similarity metrics. Segmentation results indicated that our method can provide efficient and adequately accurate segmentations (Dice segmentation accuracy increase of 10%), with very sparse contours (only 10%), which is promising in greatly decreasing the work expected from the user.
Welvaert, Marijke; Caley, Peter
2016-01-01
Citizen science and crowdsourcing have been emerging as methods to collect data for surveillance and/or monitoring activities. They could be gathered under the overarching term citizen surveillance. The discipline, however, still struggles to be widely accepted in the scientific community, mainly because these activities are not embedded in a quantitative framework. This results in an ongoing discussion on how to analyze and make useful inference from these data. When considering the data collection process, we illustrate how citizen surveillance can be classified according to the nature of the underlying observation process, measured in two dimensions: the degree of observer reporting intention and the control over observer detection effort. By classifying the observation process in these dimensions we distinguish between crowdsourcing, unstructured citizen science and structured citizen science. This classification helps determine the data processing and statistical treatment of these data for making inference. Using our framework, it is apparent that published studies are overwhelmingly associated with structured citizen science, and there are well-developed statistical methods for the resulting data. In contrast, methods for making useful inference from purely crowd-sourced data remain under development, with the challenges of accounting for the unknown observation process considerable. Our quantitative framework for citizen surveillance calls for an integration of citizen science and crowdsourcing and provides a way forward to solve the statistical challenges inherent to citizen-sourced data.
ERIC Educational Resources Information Center
Games, Ivan Alex
2008-01-01
This article discusses a framework for the analysis and assessment of twenty-first-century language and literacy practices in game and design-based contexts. It presents the framework in the context of game design within "Gamestar Mechanic", an innovative game-based learning environment where children learn the Discourse of game design. It…
Numerical modeling of the fracture process in a three-unit all-ceramic fixed partial denture.
Kou, Wen; Kou, Shaoquan; Liu, Hongyuan; Sjögren, Göran
2007-08-01
The main objectives were to examine the fracture mechanism and process of a ceramic fixed partial denture (FPD) framework under simulated mechanical loading using a recently developed numerical modeling code, the R-T(2D) code, and also to evaluate the suitability of R-T(2D) code as a tool for this purpose. Using the recently developed R-T(2D) code the fracture mechanism and process of a 3U yttria-tetragonal zirconia polycrystal ceramic (Y-TZP) FPD framework was simulated under static loading. In addition, the fracture pattern obtained using the numerical simulation was compared with the fracture pattern obtained in a previous laboratory test. The result revealed that the framework fracture pattern obtained using the numerical simulation agreed with that observed in a previous laboratory test. Quasi-photoelastic stress fringe pattern and acoustic emission showed that the fracture mechanism was tensile failure and that the crack started at the lower boundary of the framework. The fracture process could be followed both in step-by-step and step-in-step. Based on the findings in the current study, the R-T(2D) code seems suitable for use as a complement to other tests and clinical observations in studying stress distribution, fracture mechanism and fracture processes in ceramic FPD frameworks.
Working toward integrated models of alpine plant distribution.
Carlson, Bradley Z; Randin, Christophe F; Boulangeat, Isabelle; Lavergne, Sébastien; Thuiller, Wilfried; Choler, Philippe
2013-10-01
Species distribution models (SDMs) have been frequently employed to forecast the response of alpine plants to global changes. Efforts to model alpine plant distribution have thus far been primarily based on a correlative approach, in which ecological processes are implicitly addressed through a statistical relationship between observed species occurrences and environmental predictors. Recent evidence, however, highlights the shortcomings of correlative SDMs, especially in alpine landscapes where plant species tend to be decoupled from atmospheric conditions in micro-topographic habitats and are particularly exposed to geomorphic disturbances. While alpine plants respond to the same limiting factors as plants found at lower elevations, alpine environments impose a particular set of scale-dependent and hierarchical drivers that shape the realized niche of species and that require explicit consideration in a modelling context. Several recent studies in the European Alps have successfully integrated both correlative and process-based elements into distribution models of alpine plants, but for the time being a single integrative modelling framework that includes all key drivers remains elusive. As a first step in working toward a comprehensive integrated model applicable to alpine plant communities, we propose a conceptual framework that structures the primary mechanisms affecting alpine plant distributions. We group processes into four categories, including multi-scalar abiotic drivers, gradient dependent species interactions, dispersal and spatial-temporal plant responses to disturbance. Finally, we propose a methodological framework aimed at developing an integrated model to better predict alpine plant distribution.
Fu, Guifang; Dai, Xiaotian; Symanzik, Jürgen; Bushman, Shaun
2017-01-01
Leaf shape traits have long been a focus of many disciplines, but the complex genetic and environmental interactive mechanisms regulating leaf shape variation have not yet been investigated in detail. The question of the respective roles of genes and environment and how they interact to modulate leaf shape is a thorny evolutionary problem, and sophisticated methodology is needed to address it. In this study, we investigated a framework-level approach that inputs shape image photographs and genetic and environmental data, and then outputs the relative importance ranks of all variables after integrating shape feature extraction, dimension reduction, and tree-based statistical models. The power of the proposed framework was confirmed by simulation and a Populus szechuanica var. tibetica data set. This new methodology resulted in the detection of novel shape characteristics, and also confirmed some previous findings. The quantitative modeling of a combination of polygenetic, plastic, epistatic, and gene-environment interactive effects, as investigated in this study, will improve the discernment of quantitative leaf shape characteristics, and the methods are ready to be applied to other leaf morphology data sets. Unlike the majority of approaches in the quantitative leaf shape literature, this framework-level approach is data-driven, without assuming any pre-known shape attributes, landmarks, or model structures. © 2016 The Authors. New Phytologist © 2016 New Phytologist Trust.
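As an illustrative aside, the pipeline described above (shape feature extraction, dimension reduction, then tree-based importance ranking) can be sketched in a few lines of Python. The placeholder arrays, the choice of PCA, and the random-forest permutation importances are assumptions made here for illustration with scikit-learn; they are not the authors' implementation.

```python
# Hedged sketch of a shape -> genetics/environment importance-ranking pipeline.
# All arrays below are synthetic placeholders standing in for image-derived
# descriptors, marker genotypes and environmental covariates.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.ensemble import RandomForestRegressor
from sklearn.inspection import permutation_importance

rng = np.random.default_rng(0)
n = 200
shape_features = rng.normal(size=(n, 50))   # placeholder image-derived shape descriptors
genetic = rng.integers(0, 3, size=(n, 10))  # placeholder marker genotypes coded 0/1/2
environment = rng.normal(size=(n, 3))       # placeholder environmental covariates

# Dimension reduction of the shape descriptors.
shape_pcs = PCA(n_components=5).fit_transform(shape_features)

# Tree-based model of the first shape component against genetic + environmental inputs.
X = np.hstack([genetic, environment])
y = shape_pcs[:, 0]
forest = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, y)

# Rank all input variables by permutation importance.
imp = permutation_importance(forest, X, y, n_repeats=20, random_state=0)
ranking = np.argsort(imp.importances_mean)[::-1]
print("variable ranking (most to least important):", ranking)
```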
Probabilistic models for reactive behaviour in heterogeneous condensed phase media
NASA Astrophysics Data System (ADS)
Baer, M. R.; Gartling, D. K.; DesJardin, P. E.
2012-02-01
This work presents statistically-based models to describe reactive behaviour in heterogeneous energetic materials. Mesoscale effects are incorporated in continuum-level reactive flow descriptions using probability density functions (pdfs) that are associated with thermodynamic and mechanical states. A generalised approach is presented that includes multimaterial behaviour by treating the volume fraction as a random kinematic variable. Model simplifications are then sought to reduce the complexity of the description without compromising the statistical approach. Reactive behaviour is first considered for non-deformable media having a random temperature field as an initial state. A pdf transport relationship is derived and an approximate moment approach is incorporated in finite element analysis to model an example application whereby a heated fragment impacts a reactive heterogeneous material which leads to a delayed cook-off event. Modelling is then extended to include deformation effects associated with shock loading of a heterogeneous medium whereby random variables of strain, strain-rate and temperature are considered. A demonstrative mesoscale simulation of a non-ideal explosive is discussed that illustrates the joint statistical nature of the strain and temperature fields during shock loading to motivate the probabilistic approach. This modelling is derived in a Lagrangian framework that can be incorporated in continuum-level shock physics analysis. Future work will consider particle-based methods for a numerical implementation of this modelling approach.
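A minimal sketch of the pdf-averaging idea for the non-deformable case follows; the Arrhenius rate form, the Gaussian temperature pdf, and all parameter values are illustrative assumptions rather than the paper's actual closure.

```python
# Hedged sketch: the mean reaction rate in a medium with a random initial
# temperature field is the integral of an Arrhenius rate over the temperature
# pdf, not the rate evaluated at the mean temperature. Parameters are illustrative.
import numpy as np

A, Ea, R = 1.0e10, 1.2e5, 8.314      # pre-exponential (1/s), activation energy (J/mol), gas constant
T_mean, T_std = 600.0, 40.0          # assumed Gaussian temperature pdf (K)

T = np.linspace(T_mean - 5 * T_std, T_mean + 5 * T_std, 2001)
pdf = np.exp(-0.5 * ((T - T_mean) / T_std) ** 2) / (T_std * np.sqrt(2 * np.pi))
rate = A * np.exp(-Ea / (R * T))

mean_rate_pdf = np.trapz(rate * pdf, T)          # <k> = integral of k(T) p(T) dT
rate_at_mean_T = A * np.exp(-Ea / (R * T_mean))
print(mean_rate_pdf, rate_at_mean_T)             # hot tail dominates: <k(T)> > k(<T>)
```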
The difference between a dynamic and mechanical approach to stroke treatment.
Helgason, Cathy M
2007-06-01
The current classification of stroke is based on causation, also called pathogenesis, and relies on binary logic faithful to the Aristotelian tradition. Accordingly, a pathology is or is not the cause of the stroke, is considered independent of others, and is the target for treatment. It is the subject for large double-blind randomized clinical therapeutic trials. The scientific view behind clinical trials is the fundamental concept that information is statistical, and causation is determined by probabilities. Therefore, the cause and effect relation will be determined by probability-theory-based statistics. This is the basis of evidence-based medicine, which calls for the results of such trials to be the basis for physician decisions regarding diagnosis and treatment. However, there are problems with the methodology behind evidence-based medicine. Calculations using probability-theory-based statistics regarding cause and effect are performed within an automatic system where there are known inputs and outputs. This method of research provides a framework of certainty with no surprise elements or outcomes. However, it is not a system or method that will come up with previously unknown variables, concepts, or universal principles; it is not a method that will give a new outcome; and it is not a method that allows for creativity, expertise, or new insight for problem solving.
Ortiz, Aurélie U; Boutin, Anne; Fuchs, Alain H; Coudert, François-Xavier
2013-06-06
We provide the first molecular dynamics study of the mechanical instability that is the cause of pressure-induced amorphization of zeolitic imidazolate framework ZIF-8. By measuring the elastic constants of ZIF-8 up to the amorphization pressure, we show that the crystal-to-amorphous transition is triggered by the mechanical instability of ZIF-8 under compression, due to shear mode softening of the material. No similar softening was observed under temperature increase, explaining the absence of temperature-induced amorphization in ZIF-8. We also demonstrate the large impact of the presence of adsorbate in the pores on the mechanical stability and compressibility of the framework, increasing its shear stability. This first molecular dynamics study of ZIF mechanical properties under variations of pressure, temperature, and pore filling opens the way to a more comprehensive understanding of their mechanical stability, structural transitions, and amorphization.
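For illustration, a Born-type stability check that tracks shear-mode softening from elastic constants might look like the following sketch; the cubic stability criteria are textbook relations, but the constants, pressures, and the assumption of cubic symmetry are placeholders, not values from the simulations.

```python
# Hedged sketch: Born mechanical-stability check for a cubic framework from its
# three independent elastic constants. Numerical values are placeholders.
def cubic_stability(C11, C12, C44):
    """Return the cubic Born criteria and the tetragonal shear modulus C'."""
    criteria = {
        "C11 - C12 > 0": C11 - C12 > 0,        # tetragonal shear stability
        "C11 + 2*C12 > 0": C11 + 2 * C12 > 0,  # bulk (spinodal) stability
        "C44 > 0": C44 > 0,                    # trigonal shear stability
    }
    C_prime = 0.5 * (C11 - C12)                # softening of C' or C44 signals instability
    return criteria, C_prime

# Track C' and C44 along a (hypothetical) pressure ramp; amorphization is
# expected where a shear modulus approaches zero.
for p, (C11, C12, C44) in {0.0: (9.5, 6.9, 1.0), 0.3: (9.0, 7.2, 0.5)}.items():
    crit, C_prime = cubic_stability(C11, C12, C44)
    print(f"P = {p} GPa: C' = {C_prime:.2f} GPa, C44 = {C44:.2f} GPa, stable = {all(crit.values())}")
```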
Alfadda, Sara A
2014-01-01
To use a novel approach to measure the amount of vertical marginal gap in computer numeric controlled (CNC)-milled titanium frameworks and conventional cast frameworks. Ten cast frameworks were fabricated on the mandibular master casts of 10 patients. Then, 10 CNC-milled titanium frameworks were fabricated by laser scanning the cast frameworks. The vertical marginal gap was measured and analyzed using the Contura-G2 coordinate measuring machine and special computer software. The CNC-milled titanium frameworks showed an overall reduced mean vertical gap compared with the cast frameworks in all five analogs. This difference was highly statistically significant in the distal analogs. The largest mean gap in the cast framework was recorded in the most distal analogs, and the least amount was in the middle analog. Neither of the two types of frameworks provided a completely gap-free superstructure. The CNC-milled titanium frameworks showed a significantly smaller vertical marginal gap than the cast frameworks.
Benmarhnia, Tarik; Huang, Jonathan Y.; Jones, Catherine M.
2017-01-01
Background: Calls for evidence-informed public health policy, with implicit promises of greater program effectiveness, have intensified recently. The methods to produce such policies are not self-evident, requiring a conciliation of values and norms between policy-makers and evidence producers. In particular, the translation of uncertainty from empirical research findings, particularly issues of statistical variability and generalizability, is a persistent challenge because of the incremental nature of research and the iterative cycle of advancing knowledge and implementation. This paper aims to assess how the concept of uncertainty is considered and acknowledged in World Health Organization (WHO) policy recommendations and guidelines. Methods: We selected four WHO policy statements published between 2008 and 2013 regarding maternal and child nutrient supplementation, infant feeding, heat action plans, and malaria control to represent topics with a spectrum of available evidence bases. Each of these four statements was analyzed using a novel framework to assess the treatment of statistical variability and generalizability. Results: WHO currently provides substantial guidance on addressing statistical variability through GRADE (Grading of Recommendations Assessment, Development, and Evaluation) ratings for precision and consistency in their guideline documents. Accordingly, our analysis showed that policy-informing questions were addressed by systematic reviews and representations of statistical variability (eg, with numeric confidence intervals). In contrast, the presentation of contextual or "background" evidence regarding etiology or disease burden showed little consideration for this variability. Moreover, generalizability or "indirectness" was uniformly neglected, with little explicit consideration of study settings or subgroups. Conclusion: In this paper, we found that the treatment of statistical variability and generalizability was not uniform and that other factors that may contribute to uncertainty regarding recommendations were neglected, including the state of evidence informing background questions (prevalence, mechanisms, or burden or distributions of health problems), generalizability, alternate interventions, and additional outcomes not captured by systematic review. These other factors often form a basis for providing policy recommendations, particularly in the absence of a strong evidence base for intervention effects. Consequently, they should also be subject to stringent and systematic evaluation criteria. We suggest that more effort is needed to systematically acknowledge (1) when evidence is missing, conflicting, or equivocal, (2) what normative considerations were also employed, and (3) how additional evidence may be accrued. PMID:29179291
Alternative Statistical Frameworks for Student Growth Percentile Estimation
ERIC Educational Resources Information Center
Lockwood, J. R.; Castellano, Katherine E.
2015-01-01
This article suggests two alternative statistical approaches for estimating student growth percentiles (SGP). The first is to estimate percentile ranks of current test scores conditional on past test scores directly, by modeling the conditional cumulative distribution functions, rather than indirectly through quantile regressions. This would…
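A hedged sketch of the first alternative (estimating the conditional cumulative distribution function directly on a grid of score thresholds) is given below; the synthetic scores, the threshold grid, and the logistic-regression form are assumptions for illustration with scikit-learn, not the authors' estimator.

```python
# Hedged sketch of the "direct conditional CDF" idea for student growth
# percentiles: estimate P(current <= y | prior score) on a threshold grid with
# logistic regressions, enforce monotonicity, and read off each student's
# conditional percentile rank.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
prior = rng.normal(500, 50, size=1000)                   # past test scores (synthetic)
current = 0.8 * prior + rng.normal(100, 30, size=1000)   # current test scores (synthetic)

grid = np.quantile(current, np.linspace(0.02, 0.98, 49))  # threshold grid
X = prior.reshape(-1, 1)

# For each threshold y, model P(current <= y | prior) with a logistic regression.
cdf = np.column_stack([
    LogisticRegression().fit(X, (current <= y).astype(int)).predict_proba(X)[:, 1]
    for y in grid
])
cdf = np.maximum.accumulate(cdf, axis=1)  # enforce a nondecreasing conditional CDF

# SGP = conditional percentile of the observed current score.
idx = np.searchsorted(grid, current).clip(1, len(grid)) - 1
sgp = 100 * cdf[np.arange(len(current)), idx]
print(sgp[:5].round(1))
```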
Statistical Teleodynamics: Toward a Theory of Emergence.
Venkatasubramanian, Venkat
2017-10-24
The central scientific challenge of the 21st century is developing a mathematical theory of emergence that can explain and predict phenomena such as consciousness and self-awareness. The most successful research program of the 20th century, reductionism, which goes from the whole to parts, seems unable to address this challenge. This is because addressing this challenge inherently requires an opposite approach, going from parts to the whole. In addition, reductionism, by the very nature of its inquiry, typically does not concern itself with teleology or purposeful behavior. Modeling emergence, in contrast, requires the addressing of teleology. Together, these two requirements present a formidable challenge in developing a successful mathematical theory of emergence. In this article, I describe a new theory of emergence, called statistical teleodynamics, that addresses certain aspects of the general problem. Statistical teleodynamics is a mathematical framework that unifies three seemingly disparate domains-purpose-free entities in statistical mechanics, human engineered teleological systems in systems engineering, and nature-evolved teleological systems in biology and sociology-within the same conceptual formalism. This theory rests on several key conceptual insights, the most important one being the recognition that entropy mathematically models the concept of fairness in economics and philosophy and, equivalently, the concept of robustness in systems engineering. These insights help prove that the fairest inequality of income is a log-normal distribution, which will emerge naturally at equilibrium in an ideal free market society. Similarly, the theory predicts the emergence of the three classes of network organization-exponential, scale-free, and Poisson-seen widely in a variety of domains. Statistical teleodynamics is the natural generalization of statistical thermodynamics, the most successful parts-to-whole systems theory to date, but this generalization is only a modest step toward a more comprehensive mathematical theory of emergence.
NASA Astrophysics Data System (ADS)
Donges, J. F.; Schleussner, C.-F.; Siegmund, J. F.; Donner, R. V.
2016-05-01
Studying event time series is a powerful approach for analyzing the dynamics of complex dynamical systems in many fields of science. In this paper, we describe the method of event coincidence analysis to provide a framework for quantifying the strength, directionality and time lag of statistical interrelationships between event series. Event coincidence analysis makes it possible to formulate and test null hypotheses on the origin of the observed interrelationships, including tests based on Poisson processes or, more generally, stochastic point processes with a prescribed inter-event time distribution and other higher-order properties. Applying the framework to country-level observational data yields evidence that flood events have acted as triggers of epidemic outbreaks globally since the 1950s. Facing projected future changes in the statistics of climatic extreme events, statistical techniques such as event coincidence analysis will be relevant for investigating the impacts of anthropogenic climate change on human societies and ecosystems worldwide.
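A minimal sketch of a precursor-type coincidence test with an analytical Poisson null follows; the window length, the synthetic event series, and the use of SciPy's binomial tail probability are assumptions for illustration and do not reproduce the exact formulation of the paper.

```python
# Hedged sketch of a precursor-type coincidence count with a Poisson null:
# for each event in series A, check whether at least one event of series B
# falls in the preceding window of length delta_t, then compare the count with
# a binomial expectation under an independent, homogeneous Poisson assumption.
import numpy as np
from scipy.stats import binom

def coincidence_test(a_times, b_times, delta_t, t_total):
    a = np.asarray(a_times)
    b = np.asarray(b_times)
    hits = sum(np.any((b >= t - delta_t) & (b <= t)) for t in a)
    rate_b = len(b) / t_total                      # Poisson rate of series B
    p_hit = 1.0 - np.exp(-rate_b * delta_t)        # P(>= 1 B event in a window)
    p_value = binom.sf(hits - 1, len(a), p_hit)    # P(at least `hits` coincidences)
    return hits / len(a), p_value

rng = np.random.default_rng(2)
floods = np.sort(rng.uniform(0, 60, size=40))      # illustrative event times (years)
outbreaks = np.sort(rng.uniform(0, 60, size=30))
# Do outbreaks tend to be preceded by a flood within one year?
print(coincidence_test(outbreaks, floods, delta_t=1.0, t_total=60.0))
```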
[The informational support of statistical observation related to children disability].
Son, I M; Polikarpov, A V; Ogrizko, E V; Golubeva, T Yu
2016-01-01
Within the framework of the Convention on the Rights of Persons with Disabilities, the criteria for identifying disability in children and the system of medical and social expert evaluation are being revised in accordance with international standards for indices of health and health-related indices. In this connection, it is important to consider the relationship between changes in the forms of Federal statistical monitoring concerning the registration of disabled children in the Russian Federation and the classifications of health indices and health-related indices applied in identifying disability. The article presents an analysis of the relationship between changes in the forms of Federal statistical monitoring concerning the registration of disabled children in the Russian Federation and the classifications applied in identifying disability: the International Classification of Impairments, Disabilities and Handicaps (ICIDH), the International Classification of Functioning, Disability and Health (ICF), and the ICF version for children and youth (ICF-CY). Intersectoral interaction is considered within the framework of statistics on childhood disability.
Statistical Analysis of CFD Solutions from the Fourth AIAA Drag Prediction Workshop
NASA Technical Reports Server (NTRS)
Morrison, Joseph H.
2010-01-01
A graphical framework is used for statistical analysis of the results from an extensive N-version test of a collection of Reynolds-averaged Navier-Stokes computational fluid dynamics codes. The solutions were obtained by code developers and users from the U.S., Europe, Asia, and Russia using a variety of grid systems and turbulence models for the June 2009 4th Drag Prediction Workshop sponsored by the AIAA Applied Aerodynamics Technical Committee. The aerodynamic configuration for this workshop was a new subsonic transport model, the Common Research Model, designed using a modern approach for the wing and included a horizontal tail. The fourth workshop focused on the prediction of both absolute and incremental drag levels for wing-body and wing-body-horizontal tail configurations. This work continues the statistical analysis begun in the earlier workshops and compares the results from the grid convergence study of the most recent workshop with earlier workshops using the statistical framework.
Environmental statistics and optimal regulation
NASA Astrophysics Data System (ADS)
Sivak, David; Thomson, Matt
2015-03-01
The precision with which an organism can detect its environment, and the timescale for and statistics of environmental change, will affect the suitability of different strategies for regulating protein levels in response to environmental inputs. We propose a general framework--here applied to the enzymatic regulation of metabolism in response to changing nutrient concentrations--to predict the optimal regulatory strategy given the statistics of fluctuations in the environment and measurement apparatus, and the costs associated with enzyme production. We find: (i) relative convexity of enzyme expression cost and benefit influences the fitness of thresholding or graded responses; (ii) intermediate levels of measurement uncertainty call for a sophisticated Bayesian decision rule; and (iii) in dynamic contexts, intermediate levels of uncertainty call for retaining memory of the past. Statistical properties of the environment, such as variability and correlation times, set optimal biochemical parameters, such as thresholds and decay rates in signaling pathways. Our framework provides a theoretical basis for interpreting molecular signal processing algorithms and a classification scheme that organizes known regulatory strategies and may help conceptualize heretofore unknown ones.
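A toy version of such a Bayesian decision rule, assuming a two-state (low/high nutrient) environment, Gaussian measurement noise, and arbitrary cost and benefit values, could look like this:

```python
# Hedged sketch of the Bayesian decision-rule idea: from a noisy nutrient
# readout, compute the posterior probability of the high-nutrient state and
# express the enzyme only if the expected benefit exceeds its cost.
# All numbers are illustrative, not fitted to any biological system.
import numpy as np

def express_enzyme(readout, prior_high=0.3, mu=(1.0, 4.0), sigma=1.0,
                   benefit_if_high=5.0, cost=1.0):
    """Return (posterior P(high | readout), decision to express)."""
    like_low = np.exp(-0.5 * ((readout - mu[0]) / sigma) ** 2)
    like_high = np.exp(-0.5 * ((readout - mu[1]) / sigma) ** 2)
    post_high = (prior_high * like_high) / (prior_high * like_high
                                            + (1 - prior_high) * like_low)
    expected_benefit = post_high * benefit_if_high
    return post_high, expected_benefit > cost

for r in (1.0, 2.5, 4.0):
    print(r, express_enzyme(r))
# Larger measurement noise (sigma) flattens the posterior, turning a sharp
# threshold into a more graded, uncertainty-aware response.
```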
Statistics, Formation and Stability of Exoplanetary Systems
NASA Astrophysics Data System (ADS)
Silburt, Ari
Over the past two decades, scientists have detected thousands of exoplanets, and their collective properties are now emerging. This thesis contributes to the exoplanet field by analyzing the statistics, formation and stability of exoplanetary systems. The first part of this thesis conducts a statistical reconstruction of the radius and period distributions of Kepler planets. Accounting for observation and detection biases, as well as measurement errors, we calculate the occurrence of planetary systems, including the prevalence of Earth-like planets. This calculation is compared to related works, finding both similarities and differences. Second, the formation of Kepler planets near mean motion resonance (MMR) is investigated. In particular, 27 Kepler systems near 2:1 MMR are analyzed to determine whether tides are a viable mechanism for transporting Kepler planets from MMR. We find that tides alone cannot transport near-resonant planets from exact 2:1 MMR to their observed locations, and other mechanisms must be invoked to explain their formation. Third, a new hybrid integrator HERMES is presented, which is capable of simulating N-bodies undergoing close encounters. HERMES is specifically designed for planets embedded in planetesimal disks, and includes an adaptive routine for optimizing the close encounter boundary to help maintain accuracy. We find the performance of HERMES comparable to other popular hybrid integrators. Fourth, the long-term stability of planetary systems is investigated using machine learning techniques. Typical studies of long-term stability require thousands of realizations to acquire statistically rigorous results, which can take weeks or months to perform. Here we find that a trained machine is capable of quickly and accurately classifying long-term planet stability. Finally, the planetary system HD155358, consisting of two Jovian-sized planets near 2:1 MMR, is investigated using previously collected radial velocity data. New orbital parameters are derived using a Bayesian framework, and we find a high likelihood that the planets are in MMR. In addition, formation and stability constraints are placed on the HD155358 system.
Statistical mechanics of complex neural systems and high dimensional data
NASA Astrophysics Data System (ADS)
Advani, Madhu; Lahiri, Subhaneil; Ganguli, Surya
2013-03-01
Recent experimental advances in neuroscience have opened new vistas into the immense complexity of neuronal networks. This proliferation of data challenges us on two parallel fronts. First, how can we form adequate theoretical frameworks for understanding how dynamical network processes cooperate across widely disparate spatiotemporal scales to solve important computational problems? Second, how can we extract meaningful models of neuronal systems from high dimensional datasets? To aid in these challenges, we give a pedagogical review of a collection of ideas and theoretical methods arising at the intersection of statistical physics, computer science and neurobiology. We introduce the interrelated replica and cavity methods, which originated in statistical physics as powerful ways to quantitatively analyze large highly heterogeneous systems of many interacting degrees of freedom. We also introduce the closely related notion of message passing in graphical models, which originated in computer science as a distributed algorithm capable of solving large inference and optimization problems involving many coupled variables. We then show how both the statistical physics and computer science perspectives can be applied in a wide diversity of contexts to problems arising in theoretical neuroscience and data analysis. Along the way we discuss spin glasses, learning theory, illusions of structure in noise, random matrices, dimensionality reduction and compressed sensing, all within the unified formalism of the replica method. Moreover, we review recent conceptual connections between message passing in graphical models, and neural computation and learning. Overall, these ideas illustrate how statistical physics and computer science might provide a lens through which we can uncover emergent computational functions buried deep within the dynamical complexities of neuronal networks.
Markov Logic Networks in the Analysis of Genetic Data
Sakhanenko, Nikita A.
2010-01-01
Complex, non-additive genetic interactions are common and can be critical in determining phenotypes. Genome-wide association studies (GWAS) and similar statistical studies of linkage data, however, assume additive models of gene interactions in looking for genotype-phenotype associations. These statistical methods view the compound effects of multiple genes on a phenotype as a sum of influences of each gene and often miss a substantial part of the heritable effect. Such methods do not use any biological knowledge about underlying mechanisms. Modeling approaches from the artificial intelligence (AI) field that incorporate deterministic knowledge into models to perform statistical analysis can be applied to include prior knowledge in genetic analysis. We chose to use the most general such approach, Markov Logic Networks (MLNs), for combining deterministic knowledge with statistical analysis. Using simple, logistic regression-type MLNs we can replicate the results of traditional statistical methods, but we also show that we are able to go beyond finding independent markers linked to a phenotype by using joint inference without an independence assumption. The method is applied to genetic data on yeast sporulation, a complex phenotype with gene interactions. In addition to detecting all of the previously identified loci associated with sporulation, our method identifies four loci with smaller effects. Since their effect on sporulation is small, these four loci were not detected with methods that do not account for dependence between markers due to gene interactions. We show how gene interactions can be detected using more complex models, which can be used as a general framework for incorporating systems biology with genetics. PMID:20958249
Truth, models, model sets, AIC, and multimodel inference: a Bayesian perspective
Barker, Richard J.; Link, William A.
2015-01-01
Statistical inference begins with viewing data as realizations of stochastic processes. Mathematical models provide partial descriptions of these processes; inference is the process of using the data to obtain a more complete description of the stochastic processes. Wildlife and ecological scientists have become increasingly concerned with the conditional nature of model-based inference: what if the model is wrong? Over the last 2 decades, Akaike's Information Criterion (AIC) has been widely and increasingly used in wildlife statistics for 2 related purposes, first for model choice and second to quantify model uncertainty. We argue that for the second of these purposes, the Bayesian paradigm provides the natural framework for describing uncertainty associated with model choice and provides the most easily communicated basis for model weighting. Moreover, Bayesian arguments provide the sole justification for interpreting model weights (including AIC weights) as coherent (mathematically self consistent) model probabilities. This interpretation requires treating the model as an exact description of the data-generating mechanism. We discuss the implications of this assumption, and conclude that more emphasis is needed on model checking to provide confidence in the quality of inference.
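For concreteness, Akaike weights are computed from AIC differences as in the short sketch below; the AIC values themselves are placeholders. Under the Bayesian reading discussed above, these weights are interpretable as model probabilities only if one model is treated as an exact description of the data-generating mechanism.

```python
# Hedged sketch: Akaike weights computed from a set of AIC values.
import numpy as np

def akaike_weights(aic_values):
    aic = np.asarray(aic_values, dtype=float)
    delta = aic - aic.min()            # AIC differences Delta_i
    w = np.exp(-0.5 * delta)
    return w / w.sum()                 # normalized model weights

print(akaike_weights([210.3, 212.1, 215.8]).round(3))
```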
Selvarasu, Suresh; Kim, Do Yun; Karimi, Iftekhar A; Lee, Dong-Yup
2010-10-01
We present an integrated framework for characterizing fed-batch cultures of mouse hybridoma cells producing monoclonal antibody (mAb). This framework systematically combines data preprocessing, elemental balancing and statistical analysis technique. Initially, specific rates of cell growth, glucose/amino acid consumptions and mAb/metabolite productions were calculated via curve fitting using logistic equations, with subsequent elemental balancing of the preprocessed data indicating the presence of experimental measurement errors. Multivariate statistical analysis was then employed to understand physiological characteristics of the cellular system. The results from principal component analysis (PCA) revealed three major clusters of amino acids with similar trends in their consumption profiles: (i) arginine, threonine and serine, (ii) glycine, tyrosine, phenylalanine, methionine, histidine and asparagine, and (iii) lysine, valine and isoleucine. Further analysis using partial least square (PLS) regression identified key amino acids which were positively or negatively correlated with the cell growth, mAb production and the generation of lactate and ammonia. Based on these results, the optimal concentrations of key amino acids in the feed medium can be inferred, potentially leading to an increase in cell viability and productivity, as well as a decrease in toxic waste production. The study demonstrated how the current methodological framework using multivariate statistical analysis techniques can serve as a potential tool for deriving rational medium design strategies. Copyright © 2010 Elsevier B.V. All rights reserved.
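A compact sketch of the multivariate step (PCA to group consumption profiles, then PLS regression against growth and production rates) is given below, assuming scikit-learn and random placeholder matrices in place of the actual culture data.

```python
# Hedged sketch of the multivariate analysis step: PCA loadings group amino
# acids with similar consumption trends, and PLS regression relates them to
# growth/mAb/byproduct rates. Matrices are random placeholders.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(3)
specific_rates = rng.normal(size=(30, 18))   # rows: culture time points, cols: amino acids
responses = rng.normal(size=(30, 3))         # e.g. growth, mAb and lactate specific rates

# PCA loadings group amino acids with similar consumption trends.
pca = PCA(n_components=2).fit(specific_rates)
print("PC1 loadings:", pca.components_[0].round(2))

# PLS coefficients indicate positive/negative association of each amino acid
# with the response variables.
pls = PLSRegression(n_components=2).fit(specific_rates, responses)
print("PLS coefficient matrix shape:", pls.coef_.shape)
```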
Moore, Jason H; Amos, Ryan; Kiralis, Jeff; Andrews, Peter C
2015-01-01
Simulation plays an essential role in the development of new computational and statistical methods for the genetic analysis of complex traits. Most simulations start with a statistical model using methods such as linear or logistic regression that specify the relationship between genotype and phenotype. This is appealing due to its simplicity and because these statistical methods are commonly used in genetic analysis. It is our working hypothesis that simulations need to move beyond simple statistical models to more realistically represent the biological complexity of genetic architecture. The goal of the present study was to develop a prototype genotype–phenotype simulation method and software that are capable of simulating complex genetic effects within the context of a hierarchical biology-based framework. Specifically, our goal is to simulate multilocus epistasis or gene–gene interaction where the genetic variants are organized within the framework of one or more genes, their regulatory regions and other regulatory loci. We introduce here the Heuristic Identification of Biological Architectures for simulating Complex Hierarchical Interactions (HIBACHI) method and prototype software for simulating data in this manner. This approach combines a biological hierarchy, a flexible mathematical framework, a liability threshold model for defining disease endpoints, and a heuristic search strategy for identifying high-order epistatic models of disease susceptibility. We provide several simulation examples using genetic models exhibiting independent main effects and three-way epistatic effects. PMID:25395175
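As a hedged illustration of the simulation idea, the snippet below generates a purely epistatic two-locus genotype-phenotype model with a liability threshold defining case/control status; the allele frequency, effect size, and prevalence are arbitrary placeholders, and the snippet does not attempt HIBACHI's hierarchical heuristic search.

```python
# Hedged sketch: two-locus epistatic genotype-phenotype simulation with a
# liability threshold model for a binary disease endpoint (illustrative only).
import numpy as np

rng = np.random.default_rng(4)
n = 5000
maf = 0.3
g1 = rng.binomial(2, maf, n)          # genotypes coded 0/1/2 at locus 1
g2 = rng.binomial(2, maf, n)          # genotypes coded 0/1/2 at locus 2

# Pure interaction: liability increases only when both loci carry minor alleles.
liability = 0.8 * (g1 > 0) * (g2 > 0) + rng.normal(0, 1, n)
threshold = np.quantile(liability, 0.9)    # roughly 10% prevalence
case = (liability > threshold).astype(int)
print("prevalence:", case.mean().round(3))
```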
NASA Astrophysics Data System (ADS)
Song, Seok Goo; Kwak, Sangmin; Lee, Kyungbook; Park, Donghee
2017-04-01
Predicting the intensity and variability of strong ground motions is a critical element of seismic hazard assessment. The characteristics and variability of the earthquake rupture process may be a dominant factor in determining the intensity and variability of near-source strong ground motions. Song et al. (2014) demonstrated that the variability of earthquake rupture scenarios can be effectively quantified in the framework of 1-point and 2-point statistics of earthquake source parameters, constrained by rupture dynamics and past events. The developed pseudo-dynamic source modeling schemes were also validated against recorded ground motion data from past events and empirical ground motion prediction equations (GMPEs) on the broadband platform (BBP) developed by the Southern California Earthquake Center (SCEC). Recently we improved the computational efficiency of the pseudo-dynamic source-modeling scheme by adopting the nonparametric co-regionalization algorithm originally introduced and applied in geostatistics. We also investigated the effect of the earthquake rupture process on near-source ground motion characteristics in the framework of 1-point and 2-point statistics, focusing in particular on the forward directivity region. Finally, we discuss whether pseudo-dynamic source modeling can reproduce the variability (standard deviation) of empirical GMPEs, and how efficiently 1-point and 2-point statistics capture the variability of ground motions.
Bredbenner, Todd L.; Eliason, Travis D.; Francis, W. Loren; McFarland, John M.; Merkle, Andrew C.; Nicolella, Daniel P.
2014-01-01
Cervical spinal injuries are a significant concern in all trauma injuries. Recent military conflicts have demonstrated the substantial risk of spinal injury for the modern warfighter. Finite element models used to investigate injury mechanisms often fail to examine the effects of variation in geometry or material properties on mechanical behavior. The goals of this study were to model geometric variation for a set of cervical spines, to extend this model to a parametric finite element model, and, as a first step, to validate the parametric model against experimental data for low-loading conditions. Individual finite element models were created using cervical spine (C3–T1) computed tomography data for five male cadavers. Statistical shape modeling (SSM) was used to generate a parametric finite element model incorporating variability of spine geometry, and soft-tissue material property variation was also included. The probabilistic loading response of the parametric model was determined under flexion-extension, axial rotation, and lateral bending and validated by comparison to experimental data. Based on qualitative and quantitative comparison of the experimental loading response and model simulations, we suggest that the model performs adequately under relatively low-level loading conditions in multiple loading directions. In conclusion, SSM methods coupled with finite element analyses within a probabilistic framework, along with the ability to statistically validate the overall model performance, provide innovative and important steps toward describing the differences in vertebral morphology, spinal curvature, and variation in material properties. We suggest that these methods, with additional investigation and validation under injurious loading conditions, will lead to understanding and mitigating the risks of injury in the spine and other musculoskeletal structures. PMID:25506051
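A minimal sketch of the statistical shape modeling step (PCA of aligned landmark coordinates, then sampling of mode weights to generate new geometries for the parametric model) follows; the landmark arrays are synthetic placeholders rather than the CT-derived spine data.

```python
# Hedged sketch of PCA-based statistical shape modeling: shape modes are
# extracted from a small set of aligned geometries and new plausible shape
# instances are generated by sampling the mode weights.
import numpy as np

rng = np.random.default_rng(10)
n_subjects, n_landmarks = 5, 300
shapes = rng.normal(size=(n_subjects, n_landmarks * 3))   # flattened (x, y, z) landmarks

mean_shape = shapes.mean(axis=0)
centered = shapes - mean_shape
# SVD-based PCA; with 5 subjects there are at most 4 nontrivial shape modes.
U, S, Vt = np.linalg.svd(centered, full_matrices=False)
modes, std_devs = Vt, S / np.sqrt(n_subjects - 1)

# Sample a new shape instance within +/- 2 standard deviations of each mode weight.
b = rng.uniform(-2, 2, size=len(std_devs)) * std_devs
new_shape = (mean_shape + b @ modes).reshape(n_landmarks, 3)
print(new_shape.shape)
```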
ERIC Educational Resources Information Center
Marmolejo-Ramos, Fernando; Cousineau, Denis
2017-01-01
The number of articles showing dissatisfaction with the null hypothesis statistical testing (NHST) framework has been progressively increasing over the years. Alternatives to NHST have been proposed and the Bayesian approach seems to have achieved the highest amount of visibility. In this last part of the special issue, a few alternative…
Segal, Leonie; Sara Opie, Rachelle; Dalziel, Kim
2012-01-01
Context: Home-visiting programs have been offered for more than sixty years to at-risk families of newborns and infants. But despite decades of experience with program delivery, more than sixty published controlled trials, and more than thirty published literature reviews, there is still uncertainty surrounding the performance of these programs. Our particular interest was the performance of home visiting in reducing child maltreatment. Methods: We developed a program logic framework to assist in understanding the neonate/infant home-visiting literature, identified through a systematic literature review. We tested whether success could be explained by the logic model using descriptive synthesis and statistical analysis. Findings: Having a stated objective of reducing child maltreatment—a theory or mechanism of change underpinning the home-visiting program consistent with the target population and their needs and program components that can deliver against the nominated theory of change—considerably increased the chance of success. We found that only seven of fifty-three programs demonstrated such consistency, all of which had a statistically significant positive outcome, whereas of the fifteen that had no match, none was successful. Programs with a partial match had an intermediate success rate. The relationship between program success and full, partial or no match was statistically significant. Conclusions: Employing a theory-driven approach provides a new way of understanding the disparate performance of neonate/infant home-visiting programs. Employing a similar theory-driven approach could also prove useful in the review of other programs that embody a diverse set of characteristics and may apply to diverse populations and settings. A program logic framework provides a rigorous approach to deriving policy-relevant meaning from effectiveness evidence of complex programs. For neonate/infant home-visiting programs, it means that in developing these programs, attention to consistency of objectives, theory of change, target population, and program components is critical. PMID:22428693
Segal, Leonie; Sara Opie, Rachelle; Dalziel, Kim
2012-03-01
Home-visiting programs have been offered for more than sixty years to at-risk families of newborns and infants. But despite decades of experience with program delivery, more than sixty published controlled trials, and more than thirty published literature reviews, there is still uncertainty surrounding the performance of these programs. Our particular interest was the performance of home visiting in reducing child maltreatment. We developed a program logic framework to assist in understanding the neonate/infant home-visiting literature, identified through a systematic literature review. We tested whether success could be explained by the logic model using descriptive synthesis and statistical analysis. Having a stated objective of reducing child maltreatment-a theory or mechanism of change underpinning the home-visiting program consistent with the target population and their needs and program components that can deliver against the nominated theory of change-considerably increased the chance of success. We found that only seven of fifty-three programs demonstrated such consistency, all of which had a statistically significant positive outcome, whereas of the fifteen that had no match, none was successful. Programs with a partial match had an intermediate success rate. The relationship between program success and full, partial or no match was statistically significant. Employing a theory-driven approach provides a new way of understanding the disparate performance of neonate/infant home-visiting programs. Employing a similar theory-driven approach could also prove useful in the review of other programs that embody a diverse set of characteristics and may apply to diverse populations and settings. A program logic framework provides a rigorous approach to deriving policy-relevant meaning from effectiveness evidence of complex programs. For neonate/infant home-visiting programs, it means that in developing these programs, attention to consistency of objectives, theory of change, target population, and program components is critical. © 2012 Milbank Memorial Fund.
Emergent mechanics of biological structures
Dumont, Sophie; Prakash, Manu
2014-01-01
Mechanical force organizes life at all scales, from molecules to cells and tissues. Although we have made remarkable progress unraveling the mechanics of life's individual building blocks, our understanding of how they give rise to the mechanics of larger-scale biological structures is still poor. Unlike the engineered macroscopic structures that we commonly build, biological structures are dynamic and self-organize: they sculpt themselves and change their own architecture, and they have structural building blocks that generate force and constantly come on and off. A description of such structures defies current traditional mechanical frameworks. It requires approaches that account for active force-generating parts and for the formation of spatial and temporal patterns utilizing a diverse array of building blocks. In this Perspective, we term this framework “emergent mechanics.” Through examples at molecular, cellular, and tissue scales, we highlight challenges and opportunities in quantitatively understanding the emergent mechanics of biological structures and the need for new conceptual frameworks and experimental tools on the way ahead. PMID:25368421
A scan statistic to extract causal gene clusters from case-control genome-wide rare CNV data.
Nishiyama, Takeshi; Takahashi, Kunihiko; Tango, Toshiro; Pinto, Dalila; Scherer, Stephen W; Takami, Satoshi; Kishino, Hirohisa
2011-05-26
Several statistical tests have been developed for analyzing genome-wide association data by incorporating gene pathway information in terms of gene sets. Using these methods, hundreds of gene sets are typically tested, and the tested gene sets often overlap. This overlapping greatly increases the probability of generating false positives, and the results obtained are difficult to interpret, particularly when many gene sets show statistical significance. We propose a flexible statistical framework to circumvent these problems. Inspired by spatial scan statistics for detecting clustering of disease occurrence in the field of epidemiology, we developed a scan statistic to extract disease-associated gene clusters from a whole gene pathway. Extracting one or a few significant gene clusters from a global pathway limits the overall false positive probability, which results in increased statistical power, and facilitates the interpretation of test results. In the present study, we applied our method to genome-wide association data for rare copy-number variations, which have been strongly implicated in common diseases. Application of our method to a simulated dataset demonstrated the high accuracy of this method in detecting disease-associated gene clusters in a whole gene pathway. The scan statistic approach proposed here shows a high level of accuracy in detecting gene clusters in a whole gene pathway. This study has provided a sound statistical framework for analyzing genome-wide rare CNV data by incorporating topological information on the gene pathway.
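A toy analogue of the window-scan idea on an ordered pathway is sketched below using a Kulldorff-style binomial log-likelihood ratio; the per-gene hit counts, the window width, and the planted cluster are illustrative assumptions, and in practice the maximum statistic would be referred to a permutation null based on case/control label swaps.

```python
# Hedged sketch of a window-scan on an ordered gene pathway: score each window
# for an excess of case CNV hits relative to controls with a binomial
# log-likelihood ratio and report the best-scoring window.
import numpy as np

def scan_pathway(case_hits, ctrl_hits, width):
    """Return (best LLR, best window) over all windows of `width` genes."""
    C = case_hits.sum()
    N = C + ctrl_hits.sum()
    p0, eps, best = C / N, 1e-12, (0.0, None)
    for s in range(len(case_hits) - width + 1):
        c = case_hits[s:s + width].sum()
        n = c + ctrl_hits[s:s + width].sum()
        if n == 0 or c / n <= p0:          # only score windows with an excess of case hits
            continue
        p_in = c / n
        p_out = (C - c) / max(N - n, 1)
        llr = (c * np.log(p_in + eps) + (n - c) * np.log(1 - p_in + eps)
               + (C - c) * np.log(p_out + eps) + (N - n - C + c) * np.log(1 - p_out + eps)
               - C * np.log(p0) - (N - C) * np.log(1 - p0))
        if llr > best[0]:
            best = (llr, (s, s + width))
    return best

rng = np.random.default_rng(5)
case = rng.poisson(0.2, 100)
ctrl = rng.poisson(0.2, 100)
case[40:48] += rng.poisson(1.0, 8)         # planted disease-associated gene cluster
print(scan_pathway(case, ctrl, width=8))
```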
Adams, Alyssa; Zenil, Hector; Davies, Paul C W; Walker, Sara Imari
2017-04-20
Open-ended evolution (OEE) is relevant to a variety of biological, artificial and technological systems, but has been challenging to reproduce in silico. Most theoretical efforts focus on key aspects of open-ended evolution as it appears in biology. We recast the problem as a more general one in dynamical systems theory, providing simple criteria for open-ended evolution based on two hallmark features: unbounded evolution and innovation. We define unbounded evolution as patterns that are non-repeating within the expected Poincaré recurrence time of an isolated system, and innovation as trajectories not observed in isolated systems. As a case study, we implement novel variants of cellular automata (CA) where the update rules are allowed to vary with time in three alternative ways. Each is capable of generating conditions for open-ended evolution, but they vary in their ability to do so. We find that state-dependent dynamics, regarded as a hallmark of life, statistically outperforms other candidate mechanisms, and is the only mechanism to produce open-ended evolution in a scalable manner, essential to the notion of ongoing evolution. This analysis suggests a new framework for unifying mechanisms for generating OEE with features distinctive to life and its artifacts, with broad applicability to biological and artificial systems.
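A minimal sketch of a state-dependent, time-varying elementary cellular automaton of the kind described above follows; the candidate rule set and the density-based rule-selection map are arbitrary choices for illustration, not the authors' exact variants.

```python
# Hedged sketch of a "state-dependent" time-varying CA: the elementary rule
# applied at each step is chosen as a function of the current global state
# (here, simply its density), so the update map itself evolves with the trajectory.
import numpy as np

def eca_step(state, rule):
    """One synchronous update of an elementary CA with periodic boundaries."""
    left, right = np.roll(state, 1), np.roll(state, -1)
    neighborhood = 4 * left + 2 * state + right        # 0..7 pattern index per cell
    table = (rule >> np.arange(8)) & 1                 # rule bits, LSB = pattern 000
    return table[neighborhood]

rng = np.random.default_rng(6)
state = rng.integers(0, 2, 101)
rules = [30, 90, 110, 184]                             # candidate rule set
history = [state]
for _ in range(200):
    rule = rules[int(state.mean() * len(rules)) % len(rules)]  # state-dependent rule choice
    state = eca_step(state, rule)
    history.append(state)
print(np.array(history).shape)
```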
Faheem, Muhammad; Heyden, Andreas
2014-08-12
We report the development of a quantum mechanics/molecular mechanics free energy perturbation (QM/MM-FEP) method for modeling chemical reactions at metal-water interfaces. This novel solvation scheme combines plane-wave density functional theory (DFT), periodic electrostatic embedded cluster method (PEECM) calculations using Gaussian-type orbitals, and classical molecular dynamics (MD) simulations to obtain a free energy description of a complex metal-water system. We derive a potential of mean force (PMF) of the reaction system within the QM/MM framework. A fixed-size, finite ensemble of MM conformations is used to permit precise evaluation of the PMF of QM coordinates and its gradient defined within this ensemble. Local conformations of adsorbed reaction moieties are optimized using sequential MD-sampling and QM-optimization steps. An approximate reaction coordinate is constructed using a number of interpolated states and the free energy difference between adjacent states is calculated using the QM/MM-FEP method. By avoiding on-the-fly QM calculations and by circumventing the challenges associated with statistical averaging during MD sampling, a computational speedup of multiple orders of magnitude is realized. The method is systematically validated against the results of ab initio QM calculations and demonstrated for C-C cleavage in double-dehydrogenated ethylene glycol on a Pt (111) model surface.
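The free energy difference between adjacent interpolated states can be illustrated with the Zwanzig (exponential-averaging) estimator evaluated over a fixed ensemble of MM conformations, as in the sketch below; the temperature and the synthetic energy differences are placeholders.

```python
# Hedged sketch of the free-energy-perturbation step: dF between two adjacent
# states is the exponential average of their energy difference over a fixed
# ensemble of MM conformations (Zwanzig relation). Energies are synthetic.
import numpy as np

kB_T = 0.0257  # roughly kT in eV at 298 K (illustrative)

def fep_delta_F(dE, kT=kB_T):
    """Zwanzig estimator dF = -kT ln <exp(-dE/kT)>, with a shift for numerical stability."""
    dE = np.asarray(dE)
    shift = dE.min()
    return shift - kT * np.log(np.mean(np.exp(-(dE - shift) / kT)))

rng = np.random.default_rng(7)
# Energy differences E_{i+1}(q) - E_i(q) evaluated on the same MM conformations.
dE_samples = rng.normal(0.05, 0.01, size=500)
print("dF between adjacent states (eV):", round(fep_delta_F(dE_samples), 4))
# The PMF along the reaction coordinate is the cumulative sum of such dF values.
```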
Kobayashi, Yutaka; Ohtsuki, Hisashi
2014-03-01
Learning abilities are categorized into social (learning from others) and individual learning (learning on one's own). Despite the typically higher cost of individual learning, there are mechanisms that allow stable coexistence of both learning modes in a single population. In this paper, we investigate by means of mathematical modeling how the effect of spatial structure on evolutionary outcomes of pure social and individual learning strategies depends on the mechanisms for coexistence. We model a spatially structured population based on the infinite-island framework and consider three scenarios that differ in coexistence mechanisms. Using the inclusive-fitness method, we derive the equilibrium frequency of social learners and the genetic load of social learning (defined as average fecundity reduction caused by the presence of social learning) in terms of some summary statistics, such as relatedness, for each of the three scenarios and compare the results. This comparative analysis not only reconciles previous models that made contradictory predictions as to the effect of spatial structure on the equilibrium frequency of social learners but also derives a simple mathematical rule that determines the sign of the genetic load (i.e. whether or not social learning contributes to the mean fecundity of the population). Copyright © 2013 Elsevier Inc. All rights reserved.
From mechanisms to function: an integrated framework of animal innovation
Tebbich, Sabine; Griffin, Andrea S.; Peschl, Markus F.; Sterelny, Kim
2016-01-01
Animal innovations range from the discovery of novel food types to the invention of completely novel behaviours. Innovations can give access to new opportunities, and thus enable innovating agents to invade and create novel niches. This in turn can pave the way for morphological adaptation and adaptive radiation. The mechanisms that make innovations possible are probably as diverse as the innovations themselves. So too are their evolutionary consequences. Perhaps because of this diversity, we lack a unifying framework that links mechanism to function. We propose a framework for animal innovation that describes the interactions between mechanism, fitness benefit and evolutionary significance, and which suggests an expanded range of experimental approaches. In doing so, we split innovation into factors (components and phases) that can be manipulated systematically, and which can be investigated both experimentally and with correlational studies. We apply this framework to a selection of cases, showing how it helps us ask more precise questions and design more revealing experiments. PMID:26926285
Al-Otaibi, Hanan Nejer; Akeel, Riyadh Fadul
2014-01-01
To determine the effect of increased torque of the abutment screw and retorquing after 10 minutes on implant-supported fixed prostheses. Two strain gauges (SGs) were attached to four implants stabilized on an acrylic resin mandible. Four implant-supported frameworks were constructed to represent passive fit (PF) and different amounts of misfit (MF1, MF2, and MF3). Vertical misfit was measured using a traveling microscope. Each framework was torqued to 35 Ncm (the manufacturer's recommendation) and 40 Ncm, and the preload was recorded immediately and again after retorquing 10 minutes later (torque stage). The smallest gap was observed under the PF framework. Three-way analysis of variance revealed significant effects of the framework, torque value, and torque stage on preload. The PF showed the highest mean preload under both torque values. An independent-sample t test between the torque values revealed a statistically significant difference only for MF1 and MF2. A dependent-sample t test of the torque stage revealed a statistically significant difference at a torque value of 35 Ncm under the PF and MF3 frameworks. Increasing the torque value beyond the manufacturer's recommended amount and retorquing of the screws at 10 minutes after the initial torque did not necessarily lead to a significant increase in preload in full-arch implant-supported fixed prostheses, particularly under non-passively fitting frameworks.
Scene-based nonuniformity correction and enhancement: pixel statistics and subpixel motion.
Zhao, Wenyi; Zhang, Chao
2008-07-01
We propose a framework for scene-based nonuniformity correction (NUC) and nonuniformity correction and enhancement (NUCE) that is required for focal-plane array-like sensors to obtain clean and enhanced-quality images. The core of the proposed framework is a novel registration-based nonuniformity correction super-resolution (NUCSR) method that is bootstrapped by statistical scene-based NUC methods. Based on a comprehensive imaging model and an accurate parametric motion estimation, we are able to remove severe/structured nonuniformity and in the presence of subpixel motion to simultaneously improve image resolution. One important feature of our NUCSR method is the adoption of a parametric motion model that allows us to (1) handle many practical scenarios where parametric motions are present and (2) carry out perfect super-resolution in principle by exploring available subpixel motions. Experiments with real data demonstrate the efficiency of the proposed NUCE framework and the effectiveness of the NUCSR method.
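A hedged sketch of a simple statistical scene-based NUC (the constant-statistics idea, of the kind that can bootstrap a registration-based method) is shown below with synthetic frames; the gain/offset model and the normalization step are illustrative and are not the paper's NUCSR algorithm.

```python
# Hedged sketch of constant-statistics NUC: per-pixel gain and offset are
# estimated from the temporal mean and standard deviation of a frame stack,
# assuming every pixel sees the same scene statistics over time. Synthetic data.
import numpy as np

rng = np.random.default_rng(8)
true = rng.uniform(0.2, 0.8, size=(200, 64, 64))   # moving-scene radiance stack
gain = rng.normal(1.0, 0.1, size=(64, 64))         # fixed-pattern gain nonuniformity
offset = rng.normal(0.0, 0.05, size=(64, 64))      # fixed-pattern offset nonuniformity
frames = gain * true + offset                      # observed raw frames

mu = frames.mean(axis=0)                           # per-pixel temporal mean
sigma = frames.std(axis=0)                         # per-pixel temporal std
corrected = (frames - mu) / (sigma + 1e-8)         # gain/offset-normalized output

print("residual nonuniformity (spread across pixels):",
      corrected.mean(axis=0).std().round(4), corrected.std(axis=0).std().round(4))
```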
NASA Astrophysics Data System (ADS)
Jacobson, Erik; Simpson, Amber
2018-04-01
Replication studies play a critical role in scientific accumulation of knowledge, yet replication studies in mathematics education are rare. In this study, the authors replicated Thanheiser's (Educational Studies in Mathematics 75:241-251, 2010) study of prospective elementary teachers' conceptions of multidigit number and examined the main claim that most elementary pre-service teachers think about digits incorrectly at least some of the time. Results indicated no statistically significant difference in the distribution of conceptions between the original and replication samples and, moreover, no statistically significant differences in the distribution of sub-conceptions among prospective teachers with the most common conception. These results suggest confidence is warranted both in the generality of the main claim and in the utility of the conceptions framework for describing prospective elementary teachers' conceptions of multidigit number. The report further contributes a framework for replication of mathematics education research adapted from the field of psychology.
A formal framework of scenario creation and analysis of extreme hydrological events
NASA Astrophysics Data System (ADS)
Lohmann, D.
2007-12-01
We are presenting a formal framework for hydrological risk analysis. Different measures of risk will be introduced, such as average annual loss or occurrence exceedance probability. These are important measures for, e.g., insurance companies to determine the cost of insurance. One key aspect of investigating the potential consequences of extreme hydrological events (floods and droughts) is the creation of meteorological scenarios that reflect realistic spatial and temporal patterns of precipitation and that also have correct local statistics. 100,000 years of these meteorological scenarios are used in a calibrated rainfall-runoff-flood-loss-risk model to produce flood and drought events that have never been observed. The results of this hazard model are statistically analyzed and linked to socio-economic data and vulnerability functions to show the impact of severe flood events. We show results from the Risk Management Solutions (RMS) Europe Flood Model to introduce this formal framework.
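For illustration, the two risk measures named above can be computed from a simulated event-loss table as sketched below; the event counts, the loss distribution, and the chosen return period are synthetic assumptions, not output of the RMS model.

```python
# Hedged sketch of the two risk measures: average annual loss (AAL) is the mean
# of yearly summed losses, and the occurrence exceedance probability (OEP) curve
# gives the annual probability that the largest single event loss exceeds a
# threshold. Values are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(9)
n_years = 100_000
years = rng.integers(0, n_years, size=300_000)           # simulation year of each event
losses = rng.lognormal(mean=12.0, sigma=1.5, size=300_000)

annual_total = np.bincount(years, weights=losses, minlength=n_years)
aal = annual_total.mean()                                 # average annual loss

annual_max = np.zeros(n_years)
np.maximum.at(annual_max, years, losses)                  # largest event loss per year
return_period = 200
oep_loss = np.quantile(annual_max, 1 - 1 / return_period)
print(f"AAL = {aal:,.0f}; 1-in-{return_period}-year occurrence loss = {oep_loss:,.0f}")
```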
A Data Analytical Framework for Improving Real-Time, Decision Support Systems in Healthcare
ERIC Educational Resources Information Center
Yahav, Inbal
2010-01-01
In this dissertation we develop a framework that combines data mining, statistics and operations research methods for improving real-time decision support systems in healthcare. Our approach consists of three main concepts: data gathering and preprocessing, modeling, and deployment. We introduce the notion of offline and semi-offline modeling to…
An Analysis of Variance Framework for Matrix Sampling.
ERIC Educational Resources Information Center
Sirotnik, Kenneth
Significant cost savings can be achieved with the use of matrix sampling in estimating population parameters from psychometric data. The statistical design is intuitively simple, using the framework of the two-way classification analysis of variance technique. For example, the mean and variance are derived from the performance of a certain grade…
Demographic Accounting and Model-Building. Education and Development Technical Reports.
ERIC Educational Resources Information Center
Stone, Richard
This report describes and develops a model for coordinating a variety of demographic and social statistics within a single framework. The framework proposed, together with its associated methods of analysis, serves both general and specific functions. The general aim of these functions is to give numerical definition to the pattern of society and…
Mediation Analysis in a Latent Growth Curve Modeling Framework
ERIC Educational Resources Information Center
von Soest, Tilmann; Hagtvet, Knut A.
2011-01-01
This article presents several longitudinal mediation models in the framework of latent growth curve modeling and provides a detailed account of how such models can be constructed. Logical and statistical challenges that might arise when such analyses are conducted are also discussed. Specifically, we discuss how the initial status (intercept) and…
VALUE - A Framework to Validate Downscaling Approaches for Climate Change Studies
NASA Astrophysics Data System (ADS)
Maraun, Douglas; Widmann, Martin; Gutiérrez, José M.; Kotlarski, Sven; Chandler, Richard E.; Hertig, Elke; Wibig, Joanna; Huth, Radan; Wilcke, Renate A. I.
2015-04-01
VALUE is an open European network to validate and compare downscaling methods for climate change research. VALUE aims to foster collaboration and knowledge exchange between climatologists, impact modellers, statisticians, and stakeholders to establish an interdisciplinary downscaling community. A key deliverable of VALUE is the development of a systematic validation framework to enable the assessment and comparison of both dynamical and statistical downscaling methods. Here, we present the key ingredients of this framework. VALUE's main approach to validation is user-focused: starting from a specific user problem, a validation tree guides the selection of relevant validation indices and performance measures. Several experiments have been designed to isolate specific points in the downscaling procedure where problems may occur: what is the isolated downscaling skill? How do statistical and dynamical methods compare? How do methods perform at different spatial scales? Do methods fail in representing regional climate change? How is the overall representation of regional climate, including errors inherited from global climate models? The framework will be the basis for a comprehensive community-open downscaling intercomparison study, but is intended also to provide general guidance for other validation studies.
VALUE: A framework to validate downscaling approaches for climate change studies
NASA Astrophysics Data System (ADS)
Maraun, Douglas; Widmann, Martin; Gutiérrez, José M.; Kotlarski, Sven; Chandler, Richard E.; Hertig, Elke; Wibig, Joanna; Huth, Radan; Wilcke, Renate A. I.
2015-01-01
VALUE is an open European network to validate and compare downscaling methods for climate change research. VALUE aims to foster collaboration and knowledge exchange between climatologists, impact modellers, statisticians, and stakeholders to establish an interdisciplinary downscaling community. A key deliverable of VALUE is the development of a systematic validation framework to enable the assessment and comparison of both dynamical and statistical downscaling methods. In this paper, we present the key ingredients of this framework. VALUE's main approach to validation is user- focused: starting from a specific user problem, a validation tree guides the selection of relevant validation indices and performance measures. Several experiments have been designed to isolate specific points in the downscaling procedure where problems may occur: what is the isolated downscaling skill? How do statistical and dynamical methods compare? How do methods perform at different spatial scales? Do methods fail in representing regional climate change? How is the overall representation of regional climate, including errors inherited from global climate models? The framework will be the basis for a comprehensive community-open downscaling intercomparison study, but is intended also to provide general guidance for other validation studies.
Non-equilibrium statistical mechanics theory for the large scales of geophysical flows
NASA Astrophysics Data System (ADS)
Eric, S.; Bouchet, F.
2010-12-01
The aim of any theory of turbulence is to understand the statistical properties of the velocity field. As a huge number of degrees of freedom is involved, statistical mechanics is a natural approach. The self-organization of two-dimensional and geophysical turbulent flows is addressed based on statistical mechanics methods. We discuss classical and recent works on this subject, from the statistical mechanics basis of the theory up to applications to Jupiter's troposphere and ocean vortices and jets. The equilibrium microcanonical measure is built from the Liouville theorem. Important statistical mechanics concepts (large deviations, mean field approach) and thermodynamic concepts (ensemble inequivalence, negative heat capacity) are briefly explained and used to predict statistical equilibria for turbulent flows. This is applied to make quantitative models of two-dimensional turbulence, the Great Red Spot and other Jovian vortices, ocean jets like the Gulf Stream, and ocean vortices. A detailed comparison between these statistical equilibria and real flow observations will be discussed. We also present recent results for non-equilibrium situations, for which forces and dissipation are in a statistical balance. As an example, the concept of phase transition allows us to describe drastic changes of the whole system when a few external parameters are changed. References: F. Bouchet and E. Simonnet, Random Changes of Flow Topology in Two-Dimensional and Geophysical Turbulence, Physical Review Letters 102 (2009), no. 9, 094504. F. Bouchet and J. Sommeria, Emergence of intense jets and Jupiter's Great Red Spot as maximum-entropy structures, Journal of Fluid Mechanics 464 (2002), 165-207. A. Venaille and F. Bouchet, Ocean rings and jets as statistical equilibrium states, submitted to JPO. F. Bouchet and A. Venaille, Statistical mechanics of two-dimensional and geophysical flows, submitted to Physics Reports. Figure: non-equilibrium phase transitions for the 2D Navier-Stokes equations with stochastic forces (time series and probability density functions (PDFs) of the modulus of the largest-scale Fourier component, showing bistability between dipole and unidirectional flows); this bistability is predicted by statistical mechanics.
Role of differential physical properties in the collective mechanics and dynamics of tissues
NASA Astrophysics Data System (ADS)
Das, Moumita
Living cells and tissues are highly mechanically sensitive and active. Mechanical stimuli influence the shape, motility, and functions of cells, modulate the behavior of tissues, and play a key role in several diseases. In this talk I will discuss how collective biophysical properties of tissues emerge from the interplay between differential mechanical properties and statistical physics of underlying components, focusing on two complementary tissue types whose properties are primarily determined by (1) the extracellular matrix (ECM), and (2) individual and collective cell properties. I will start with the structure-mechanics-function relationships in articular cartilage (AC), a soft tissue that has very few cells, and its mechanical response is primarily due to its ECM. AC is a remarkable tissue: it can support loads exceeding ten times our body weight and bear 60+ years of daily mechanical loading despite having minimal regenerative capacity. I will discuss the biophysical principles underlying this exceptional mechanical response using the framework of rigidity percolation theory, and compare our predictions with experiments done by our collaborators. Next I will discuss ongoing theoretical work on how the differences in cell mechanics, motility, adhesion, and proliferation in a co-culture of breast cancer cells and healthy breast epithelial cells may modulate experimentally observed differential migration and segregation. Our results may provide insights into the mechanobiology of tissues with cell populations with different physical properties present together such as during the formation of embryos or the initiation of tumors. This work was partially supported by a Cottrell College Science Award.
2017-06-01
Table-of-contents snippet from the indexed thesis: Table 2.1, Training time statistics from Jones' thesis; Table 2.2, Evaluation runtime statistics from Camp's thesis for a single image; Table 2.3, Training and evaluation runtime statistics from Sharpe's thesis; Table 2.4, Sharpe's screenshot detector results for combinations of ... training resources available and time required for each algorithm Jones tested.
Statistical mechanics based on fractional classical and quantum mechanics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Korichi, Z.; Meftah, M. T., E-mail: mewalid@yahoo.com
2014-03-15
The purpose of this work is to study some problems in statistical mechanics based on fractional classical and quantum mechanics. In the first stage, we present the thermodynamic properties of the classical ideal gas and of a system of N classical oscillators; in both cases, the Hamiltonian contains fractional exponents of the phase space variables (position and momentum). In the second stage, in the context of fractional quantum mechanics, we calculate the thermodynamic properties of black-body radiation, study Bose-Einstein statistics together with the related condensation problem, and study Fermi-Dirac statistics.
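For orientation, a single classical degree of freedom with a fractional momentum exponent (an illustrative Hamiltonian of the type considered here, not a formula quoted from the paper) obeys a generalized equipartition law:

\[
H = a\,|p|^{\alpha}, \qquad
\langle H \rangle = \frac{\displaystyle\int_{-\infty}^{\infty} a|p|^{\alpha}\, e^{-\beta a |p|^{\alpha}}\,\mathrm{d}p}{\displaystyle\int_{-\infty}^{\infty} e^{-\beta a |p|^{\alpha}}\,\mathrm{d}p} = \frac{k_B T}{\alpha},
\]

which reduces to the familiar \(k_B T/2\) per quadratic degree of freedom when \(\alpha = 2\).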
Hidden in the background: a local approach to CMB anomalies
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sánchez, Juan C. Bueno, E-mail: juan.c.bueno@correounivalle.edu.co
2016-09-01
We investigate a framework aiming to provide a common origin for the large-angle anomalies detected in the Cosmic Microwave Background (CMB), which are hypothesized as the result of the statistical inhomogeneity developed by different isocurvature fields of mass m ∼ H present during inflation. The inhomogeneity arises as the combined effect of (i) the initial conditions for isocurvature fields (obtained after a fast-roll stage finishing many e-foldings before cosmological scales exit the horizon), (ii) their inflationary fluctuations and (iii) their coupling to other degrees of freedom. Our case of interest is when these fields (interpreted as the precursors of large-angle anomalies) leave an observable imprint only in isolated patches of the Universe. When the latter intersect the last scattering surface, such imprints arise in the CMB. Nevertheless, due to their statistically inhomogeneous nature, these imprints are difficult to detect, for they become hidden in the background similarly to the Cold Spot. We then compute the probability that a single isocurvature field becomes inhomogeneous at the end of inflation and find that, if the appropriate conditions are given (which depend exclusively on the preexisting fast-roll stage), this probability is at the percent level. Finally, we discuss several mechanisms (including the curvaton and the inhomogeneous reheating) to investigate whether an initial statistically inhomogeneous isocurvature field fluctuation might give rise to some of the observed anomalies. In particular, we focus on the Cold Spot, the power deficit at low multipoles and the breaking of statistical isotropy.
Expected p-values in light of an ROC curve analysis applied to optimal multiple testing procedures.
Vexler, Albert; Yu, Jihnhee; Zhao, Yang; Hutson, Alan D; Gurevich, Gregory
2017-01-01
Many statistical studies report p-values for inferential purposes. In several scenarios, the stochastic aspect of p-values is neglected, which may contribute to drawing wrong conclusions in real data experiments. The stochastic nature of p-values makes it difficult to use them to examine the performance of given testing procedures or associations between investigated factors. We turn our focus to the modern statistical literature to address the expected p-value (EPV) as a measure of the performance of decision-making rules. During the course of our study, we prove that the EPV can be considered in the context of receiver operating characteristic (ROC) curve analysis, a well-established biostatistical methodology. The ROC-based framework provides a new and efficient methodology for investigating and constructing statistical decision-making procedures, including: (1) evaluation and visualization of properties of the testing mechanisms, considering, e.g., partial EPVs; (2) developing optimal tests via the minimization of EPVs; and (3) creating novel methods for optimally combining multiple test statistics. We demonstrate that the proposed EPV-based approach allows us to maximize the integrated power of testing algorithms with respect to various significance levels. In an application, we use the proposed method to construct the optimal test and analyze a myocardial infarction disease dataset. We outline the usefulness of the "EPV/ROC" technique for evaluating different decision-making procedures, their constructions and properties with an eye towards practical applications.
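A hedged illustration of the EPV/ROC connection described above (a synthetic one-sided z-test, not the authors' myocardial infarction data): for a continuous test statistic, the expected p-value under the alternative equals one minus the ROC area, EPV = P(T0 ≥ T1).

```python
# Monte Carlo check that EPV = 1 - AUC for a continuous one-sided test statistic.
# The sample size and effect size are illustrative assumptions.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n, effect = 25, 0.5                        # assumed sample size and standardized mean shift
delta = effect * np.sqrt(n)                # noncentrality of the one-sided z statistic

t_null = rng.normal(0.0, 1.0, 200_000)     # test statistic under H0
t_alt = rng.normal(delta, 1.0, 200_000)    # test statistic under H1

epv = np.mean(1.0 - norm.cdf(t_alt))       # expected p-value under the alternative
auc = np.mean(t_alt[:100_000] > t_null[:100_000])   # Monte Carlo ROC area

print(f"EPV   = {epv:.4f}")
print(f"1-AUC = {1.0 - auc:.4f}")          # should agree up to Monte Carlo error
```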
Li, Wei; Thirumurugan, A; Barton, Phillip T; Lin, Zheshuai; Henke, Sebastian; Yeung, Hamish H-M; Wharmby, Michael T; Bithell, Erica G; Howard, Christopher J; Cheetham, Anthony K
2014-06-04
Two analogous metal-organic frameworks (MOFs) with the perovskite architecture, [C(NH2)3][Mn(HCOO)3] (1) and [(CH2)3NH2][Mn(HCOO)3] (2), exhibit significantly different mechanical properties. The marked difference is attributed to their distinct modes of hydrogen bonding between the A-site amine cation and the anionic framework. The stronger cross-linking hydrogen bonding in 1 gives rise to Young's moduli and hardnesses that are up to twice those in 2, while the thermal expansion is substantially smaller. This study presents clear evidence that the mechanical properties of MOF materials can be substantially tuned via hydrogen-bonding interactions.
NASA Astrophysics Data System (ADS)
McMullen, Ryan; McKeon, Beverley
2017-11-01
It is well-known that small amounts of high-molecular weight polymers can drastically reduce turbulent drag in a liquid (Toms, 1948). Furthermore, recent work has shown that studying polymers in turbulence can shed light on the nature of the self-sustaining mechanisms of wall turbulence (White and Mungal, 2008; Graham, 2014). The focus of this talk is an investigation of the linear mechanisms at play in polymer drag-reduced turbulent channel flow. The resolvent framework introduced by McKeon and Sharma (2010) for Newtonian turbulence is extended to the viscoelastic case in order to study the most-amplified velocity and polymer stretching modes, explored in the case of creeping flow by Jovanović and coworkers (Jovanović and Kumar, 2010; Lieu et al., 2013). Particular attention is given to the role of critical layers, which have been shown to be important in the dynamics of Newtonian turbulence (McKeon and Sharma, 2010). Additionally, comparisons will be made with the lower branch of the P4 family of exact coherent states, which closely reproduce statistical features of polymer drag-reduced turbulence close to maximum drag reduction (Park and Graham, 2015). The support of the Dow Corporation is gratefully acknowledged.
A new framework to increase the efficiency of large-scale solar power plants.
NASA Astrophysics Data System (ADS)
Alimohammadi, Shahrouz; Kleissl, Jan P.
2015-11-01
A new framework to estimate the spatio-temporal behavior of solar power is introduced, which predicts the statistical behavior of power output at utility-scale photovoltaic (PV) power plants. The framework is based on spatio-temporal Gaussian process regression (Kriging) models, which incorporate satellite data with the UCSD version of the Weather Research and Forecasting model. This framework is designed to improve the efficiency of large-scale solar power plants. The results are validated against measurements from local pyranometer sensors, and improvements are observed in different scenarios.
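A minimal sketch of the spatial Kriging ingredient, assuming scikit-learn and synthetic sensor data; the actual framework is spatio-temporal and assimilates satellite data with the UCSD version of WRF, none of which is reproduced here.

```python
# Gaussian process regression (Kriging) on synthetic clear-sky-index-like data.
# Sensor locations, kernel hyperparameters, and the target function are assumptions.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(1)
X = rng.uniform(0.0, 10.0, size=(60, 2))                       # sensor locations (km)
y = np.sin(X[:, 0] / 2.0) * np.cos(X[:, 1] / 3.0) + rng.normal(0, 0.05, 60)  # irradiance proxy

kernel = 1.0 * RBF(length_scale=2.0) + WhiteKernel(noise_level=0.01)
gp = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(X, y)

X_new = np.array([[5.0, 5.0], [9.0, 1.0]])                     # hypothetical plant locations
mean, std = gp.predict(X_new, return_std=True)                 # spatial prediction with uncertainty
print(mean, std)
```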
Qiang, Zeng; Ning, Li; Yanan, Zhou; Jiazhen, Yan; Wenbo, Liu
2015-12-01
The effect of sandblasting on the bond strength between 3 mol% yttria-stabilized tetragonal zirconia polycrystal (3Y-TZP) zirconia frameworks and veneering porcelain was evaluated. A total of 21 specimens [(25 ± 1) mm × (3 ± 0.1) mm × (0.5 ± 0.05) mm] were prepared according to ISO 9693. The specimens were then randomly divided into 3 groups. Sandblasting was performed with 2 meshes of Al₂O₃ particles: group A with mesh 110 and group B with mesh 80. Group C, which was not sandblasted, was the control group. The surface roughness of the zirconia framework, as well as the bond strength between the 3Y-TZP zirconia framework and veneering porcelain, was measured. The interface microstructure was observed by scanning electron microscopy (SEM), and elemental distribution was detected by energy dispersive spectroscopy (EDS). Surface roughness values were (1.272 ± 0.149) μm for group A, (0.622 ± 0.113) μm for group B, and (0.221 ± 0.065) μm for group C; the differences among groups were statistically significant (P < 0.05). The bond strength values were (28.21 ± 1.52) MPa for group A, (27.71 ± 1.27) MPa for group B, and (24.87 ± 3.84) MPa for group C; the difference between group A and group C was statistically significant (P < 0.05), whereas the other comparisons were not (P > 0.05). Adhesion failure at the interface was the predominant failure mode. SEM images showed close interfacial bonding, and EDS showed that the interface had no obvious element penetration. Al₂O₃ sandblasting can slightly enhance the bond strength between the zirconia framework and veneering porcelain.
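A hedged sketch of the kind of group comparison reported above; the observations are synthetic numbers drawn from the quoted means and standard deviations (7 specimens per group, implied by 21 specimens in 3 groups), so this only illustrates the testing workflow, not the study's raw data.

```python
# One-way ANOVA across the three sandblasting groups, plus a pairwise A-vs-control test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)
bond_A = rng.normal(28.21, 1.52, 7)   # MPa, generated from the abstract's group mean/SD
bond_B = rng.normal(27.71, 1.27, 7)
bond_C = rng.normal(24.87, 3.84, 7)

print(stats.f_oneway(bond_A, bond_B, bond_C))             # overall group effect
print(stats.ttest_ind(bond_A, bond_C, equal_var=False))   # group A vs. control
```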
Predicting weak lensing statistics from halo mass reconstructions - Final Paper
DOE Office of Scientific and Technical Information (OSTI.GOV)
Everett, Spencer
2015-08-20
As dark matter does not absorb or emit light, its distribution in the universe must be inferred through indirect effects such as the gravitational lensing of distant galaxies. While most sources are only weakly lensed, the systematic alignment of background galaxies around a foreground lens can constrain the mass of the lens, which is largely in the form of dark matter. In this paper, I have implemented a framework to reconstruct all of the mass along lines of sight using a best-case dark matter halo model in which the halo mass is known. This framework is then used to make predictions of the weak lensing of 3,240 generated source galaxies through a 324 arcmin² field of the Millennium Simulation. The lensed source ellipticities are characterized by the ellipticity-ellipticity and galaxy-mass correlation functions and compared to the same statistics for the intrinsic and ray-traced ellipticities. In the ellipticity-ellipticity correlation function, I find that the framework systematically underpredicts the shear power by an average factor of 2.2 and fails to capture correlation from dark matter structure at scales larger than 1 arcminute. The model-predicted galaxy-mass correlation function is in agreement with the ray-traced statistic from scales 0.2 to 0.7 arcminutes, but systematically underpredicts shear power at scales larger than 0.7 arcminutes by an average factor of 1.2. Optimization of the framework code has reduced the mean CPU time per lensing prediction by 70% to 24 ± 5 ms. Physical and computational shortcomings of the framework are discussed, as well as potential improvements for upcoming work.
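A hedged sketch of a brute-force ellipticity-ellipticity correlation estimator; real weak-lensing analyses decompose ellipticities into tangential and cross components and use tree-based pair counting, so the scalar version below (on synthetic positions and ellipticities) only illustrates the pair-binning logic.

```python
# Estimate xi(theta) = <e_i e_j> in bins of angular separation for a synthetic field.
import numpy as np

rng = np.random.default_rng(2)
pos = rng.uniform(0.0, 18.0, size=(1500, 2))      # arcmin; 18 x 18 arcmin ~ 324 arcmin^2 field
e = rng.normal(0.0, 0.25, size=1500)              # stand-in scalar ellipticity component

d = np.linalg.norm(pos[:, None, :] - pos[None, :, :], axis=-1)   # pairwise separations
prod = e[:, None] * e[None, :]
iu = np.triu_indices(len(e), k=1)                 # unique pairs only

bins = np.linspace(0.1, 5.0, 15)
idx = np.digitize(d[iu], bins)
xi = np.array([prod[iu][idx == k].mean() for k in range(1, len(bins))])
theta = 0.5 * (bins[1:] + bins[:-1])
print(np.c_[theta, xi])                           # ~0 here, since the synthetic e are uncorrelated
```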
Density profiles in the Scrape-Off Layer interpreted through filament dynamics
NASA Astrophysics Data System (ADS)
Militello, Fulvio
2017-10-01
We developed a new theoretical framework to clarify the relation between radial Scrape-Off Layer density profiles and the fluctuations that generate them. The framework provides an interpretation of the experimental features of the profiles and of the turbulence statistics on the basis of simple properties of the filaments, such as their radial motion and their draining towards the divertor. L-mode and inter-ELM filaments are described as a Poisson process in which each event is independent and modelled with a wave function of amplitude and width statistically distributed according to experimental observations and evolving according to fluid equations. We will rigorously show that radially accelerating filaments, less efficient parallel exhaust and also a statistical distribution of their radial velocity can contribute to induce flatter profiles in the far SOL and therefore enhance plasma-wall interactions. A quite general result of our analysis is the resiliency of this non-exponential nature of the profiles and the increase of the relative fluctuation amplitude towards the wall, as experimentally observed. According to the framework, profile broadening at high fueling rates can be caused by interactions with neutrals (e.g. charge exchange) in the divertor or by a significant radial acceleration of the filaments. The framework assumptions were tested with 3D numerical simulations of seeded SOL filaments based on a two fluid model. In particular, filaments interact through the electrostatic field they generate only when they are in close proximity (separation comparable to their width in the drift plane), thus justifying our independence hypothesis. In addition, we will discuss how isolated filament motion responds to variations in the plasma conditions, and specifically divertor conditions. Finally, using the theoretical framework we will reproduce and interpret experimental results obtained on JET, MAST and HL-2A.
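A toy shot-noise model in the spirit of the framework summarized above (assumed parameter values, not the authors' code): filaments arrive as a Poisson process, propagate radially with a randomly drawn velocity, and drain exponentially toward the divertor; averaging over the velocity distribution produces the flattened, non-exponential far-SOL profile discussed in the abstract.

```python
# Time-averaged radial density profile from a superposition of independent filaments.
import numpy as np

rng = np.random.default_rng(3)
n_fil = 20_000
amp = rng.exponential(1.0, n_fil)             # filament amplitudes (arbitrary units)
v = rng.exponential(500.0, n_fil)             # radial velocities (m/s), assumed spread
tau = 5e-5                                    # parallel drainage time (s), assumed

x = np.linspace(0.0, 0.05, 60)                # radial distance into the SOL (m)
# each filament contributes amp * exp(-x / (v * tau)) to the time-averaged density
profile = (amp[:, None] * np.exp(-x[None, :] / (v[:, None] * tau))).mean(axis=0)
print(profile / profile[0])                   # heavier-than-exponential (flattened) far-SOL tail
```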
NASA Astrophysics Data System (ADS)
Weymer, Bradley A.; Wernette, Phillipe; Everett, Mark E.; Houser, Chris
2018-06-01
Shorelines exhibit long-range dependence (LRD) and have been shown in some environments to be described in the wave number domain by a power law characteristic of scale independence. Recent evidence suggests that the geomorphology of barrier islands can, however, exhibit scale dependence as a result of systematic variations in the underlying framework geology. The LRD of framework geology, which influences island geomorphology and its response to storms and sea level rise, has not been previously examined. Electromagnetic induction (EMI) surveys conducted along Padre Island National Seashore (PAIS), Texas, United States, reveal that the EMI apparent conductivity (σa) signal and, by inference, the framework geology exhibit LRD at scales of up to 10¹ to 10² km. Our study demonstrates the utility of describing EMI σa and lidar spatial series by a fractional autoregressive integrated moving average (ARIMA) process that specifically models LRD. This method offers a robust and compact way of quantifying the geological variations along a barrier island shoreline using three statistical parameters (p, d, q). We discuss how ARIMA models that use a single parameter d provide a quantitative measure for determining free and forced barrier island evolutionary behavior across different scales. Statistical analyses at regional, intermediate, and local scales suggest that the geologic framework within an area of paleo-channels exerts a first-order control on dune height. The exchange of sediment amongst nearshore, beach, and dune in areas outside this region is scale independent, implying that barrier islands like PAIS exhibit a combination of free and forced behaviors that affect the response of the island to sea level rise.
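One standard way to estimate the long-range-dependence parameter d that enters an ARFIMA(p, d, q) description is the Geweke-Porter-Hudak (GPH) log-periodogram regression; the sketch below applies it to a synthetic series with a prescribed low-frequency spectral exponent (the real σa transects are not reproduced, and the bandwidth choice is an assumption).

```python
# GPH estimator of the fractional differencing parameter d, tested on a synthetic series.
import numpy as np

def synth_lrd(n, d, rng):
    """Synthesize a series whose spectral density scales roughly as frequency**(-2d)."""
    freqs = np.fft.rfftfreq(n)[1:]
    amp = freqs ** (-d) * np.exp(1j * rng.uniform(0, 2 * np.pi, len(freqs)))
    return np.fft.irfft(np.r_[0.0, amp], n=n).real

def gph_estimate(x, frac=0.5):
    """Regress log periodogram on -log(4 sin^2(w/2)) over the lowest n**frac frequencies."""
    x = np.asarray(x, float) - np.mean(x)
    n = len(x)
    freqs = 2.0 * np.pi * np.arange(1, n // 2) / n
    I = np.abs(np.fft.rfft(x)[1:n // 2]) ** 2 / (2.0 * np.pi * n)   # periodogram
    m = int(n ** frac)                                              # low-frequency bandwidth
    y = np.log(I[:m])
    X = -np.log(4.0 * np.sin(freqs[:m] / 2.0) ** 2)
    return np.polyfit(X, y, 1)[0]                                   # slope ~ d

rng = np.random.default_rng(4)
series = synth_lrd(4096, d=0.3, rng=rng)
print(f"estimated d = {gph_estimate(series):.2f}")   # should be near 0.3 up to estimation error
```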
NASA Technical Reports Server (NTRS)
Mayer, Richard J.; Blinn, Thomas M.; Mayer, Paula S. D.; Reddy, Uday; Ackley, Keith; Futrell, Mike
1991-01-01
The Framework Programmable Software Development Platform (FPP) is a project aimed at combining effective tool and data integration mechanisms with a model of the software development process in an intelligent integrated software development environment. Guided by this model, this system development framework will take advantage of an integrated operating environment to effectively automate the management of the software development process, so that costly mistakes during the development phase can be eliminated.
Tougas, Michelle E.; Hayden, Jill A.; McGrath, Patrick J.; Huguet, Anna; Rozario, Sharlene
2015-01-01
Background: Theory is often recommended as a framework for guiding hypothesized mechanisms of treatment effect. However, there is limited guidance about how to use theory in intervention development. Methods: We conducted a systematic review to provide an exemplar review evaluating the extent to which use of theory is identified and incorporated within existing interventions. We searched the electronic databases PubMed, PsycINFO, CENTRAL, and EMBASE from inception to May 2014. We searched clinicaltrials.gov for registered protocols, reference lists of relevant systematic reviews and included studies, and conducted a citation search in Web of Science. We included peer-reviewed publications of interventions that referenced the social cognitive theory of self-regulation as a framework for interventions to manage chronic health conditions. Two reviewers independently assessed articles for eligibility. We contacted all authors of included studies for information detailing intervention content. We describe how often theory mechanisms were addressed by interventions, and report intervention characteristics used to address theory. Results: Of 202 articles that reported using the social cognitive theory of self-regulation, 52% failed to incorporate self-monitoring, a main theory component, and were therefore excluded. We included 35 interventions that adequately used the theory framework. Intervention characteristics were often poorly reported in peer-reviewed publications; 21 of 35 interventions incorporated characteristics that addressed each of the main theory components. Each intervention addressed, on average, six of eight self-monitoring mechanisms, two of five self-judgement mechanisms, and one of three self-evaluation mechanisms. The self-monitoring mechanisms ‘Feedback’ and ‘Consistency’ were addressed by all interventions, whereas the self-evaluation mechanisms ‘Self-incentives’ and ‘External rewards’ were addressed by six and four interventions, respectively. The present review establishes that systematic review is a feasible method of identifying use of theory as a conceptual framework for existing interventions. We identified the social cognitive theory of self-regulation as a feasible framework to guide intervention development for chronic health conditions. PMID:26252889
Many-Body Localization and Thermalization in Quantum Statistical Mechanics
NASA Astrophysics Data System (ADS)
Nandkishore, Rahul; Huse, David A.
2015-03-01
We review some recent developments in the statistical mechanics of isolated quantum systems. We provide a brief introduction to quantum thermalization, paying particular attention to the eigenstate thermalization hypothesis (ETH) and the resulting single-eigenstate statistical mechanics. We then focus on a class of systems that fail to quantum thermalize and whose eigenstates violate the ETH: These are the many-body Anderson-localized systems; their long-time properties are not captured by the conventional ensembles of quantum statistical mechanics. These systems can forever locally remember information about their local initial conditions and are thus of interest for possibilities of storing quantum information. We discuss key features of many-body localization (MBL) and review a phenomenology of the MBL phase. Single-eigenstate statistical mechanics within the MBL phase reveal dynamically stable ordered phases, and phase transitions among them, that are invisible to equilibrium statistical mechanics and can occur at high energy and low spatial dimensionality, where equilibrium ordering is forbidden.
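For reference, the eigenstate thermalization hypothesis discussed above is usually stated as an ansatz for matrix elements of a few-body observable \(\hat{O}\) in the energy eigenbasis (standard form, notation ours):

\[
\langle m | \hat{O} | n \rangle \;=\; O(\bar{E})\,\delta_{mn} \;+\; e^{-S(\bar{E})/2}\, f_O(\bar{E},\omega)\, R_{mn},
\qquad \bar{E}=\tfrac{E_m+E_n}{2},\quad \omega = E_m - E_n,
\]

where \(O(\bar{E})\) is the microcanonical expectation value, \(S\) the thermodynamic entropy, \(f_O\) a smooth function, and \(R_{mn}\) a random variable with zero mean and unit variance; many-body localized eigenstates violate this ansatz.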
A data colocation grid framework for big data medical image processing: backend design
NASA Astrophysics Data System (ADS)
Bao, Shunxing; Huo, Yuankai; Parvathaneni, Prasanna; Plassard, Andrew J.; Bermudez, Camilo; Yao, Yuang; Lyu, Ilwoo; Gokhale, Aniruddha; Landman, Bennett A.
2018-03-01
When processing large medical imaging studies, adopting high performance grid computing resources rapidly becomes important. We recently presented a "medical image processing-as-a-service" grid framework that offers promise in utilizing the Apache Hadoop ecosystem and HBase for data colocation by moving computation close to medical image storage. However, the framework has not yet proven to be easy to use in a heterogeneous hardware environment. Furthermore, the system has not yet been validated when considering the variety of multi-level analyses in medical imaging. Our target design criteria are (1) improving the framework's performance in a heterogeneous cluster, (2) performing population-based summary statistics on large datasets, and (3) introducing a table design scheme for rapid NoSQL queries. In this paper, we present a heuristic backend interface application program interface (API) design for Hadoop and HBase for Medical Image Processing (HadoopBase-MIP). The API includes: Upload, Retrieve, Remove, Load balancer (for heterogeneous clusters) and MapReduce templates. A dataset summary statistic model is discussed and implemented in the MapReduce paradigm. We introduce an HBase table scheme for fast data query to better utilize the MapReduce model. Briefly, 5153 T1 images were retrieved from a university secure, shared web database and used to empirically assess an in-house grid with 224 heterogeneous CPU cores. Results of three empirical experiments are presented and discussed: (1) a 1.5-fold load-balancer wall-time improvement compared with a framework with a built-in data allocation strategy, (2) a summary statistic model empirically verified on the grid framework and compared with the cluster when deployed with a standard Sun Grid Engine (SGE), reducing wall-clock time 8-fold and resource time 14-fold, and (3) the proposed HBase table scheme improving MapReduce computation with a 7-fold reduction of wall time compared with a naïve scheme when datasets are relatively small. The source code and interfaces have been made publicly available.
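A language-agnostic sketch of the MapReduce summary-statistic idea mentioned above (a Python stand-in, not the HadoopBase-MIP Java API): mappers emit per-image partial sums where the images are stored, and a reducer merges them into dataset-level statistics.

```python
# MapReduce-style dataset summary statistics: per-image partials, then a global merge.
import numpy as np

def map_image(voxels):
    """Per-image partial statistics; in the real system this runs close to the stored image."""
    v = np.asarray(voxels, float).ravel()
    return len(v), v.sum(), (v ** 2).sum()

def reduce_partials(partials):
    """Merge (count, sum, sum-of-squares) triples into dataset mean and variance."""
    n = sum(p[0] for p in partials)
    s = sum(p[1] for p in partials)
    ss = sum(p[2] for p in partials)
    mean = s / n
    var = ss / n - mean ** 2
    return mean, var

rng = np.random.default_rng(5)
images = [rng.normal(1000.0, 50.0, size=(8, 8, 8)) for _ in range(10)]  # stand-ins for T1 volumes
print(reduce_partials([map_image(img) for img in images]))
```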
NASA Astrophysics Data System (ADS)
Ramaswamy, Sriram
2007-03-01
Collections of self-driven or ``active'' particles are now recognised as a distinct kind of nonequilibrium matter, and an understanding of their phases, hydrodynamics, mechanical response, and correlations is a vital and rapidly developing part of the statistical physics of soft-matter systems far from equilibrium. My talk will review our recent results, from theory, simulation and experiment, on order, fluctuations, and flow instabilities in collections of active particles, in suspension or on a solid surface. Our work, which began by adapting theories of flocking to include the hydrodynamics of the ambient fluid, provides the theoretical framework for understanding active matter in all its diversity: contractile filaments in cell extracts, crawling or dividing cells, collectively swimming bacteria, fish schools, and agitated monolayers of orientable granular particles.
Validity criteria for Fermi’s golden rule scattering rates applied to metallic nanowires
NASA Astrophysics Data System (ADS)
Moors, Kristof; Sorée, Bart; Magnus, Wim
2016-09-01
Fermi’s golden rule underpins the investigation of mobile carriers propagating through various solids, being a standard tool to calculate their scattering rates. As such, it provides a perturbative estimate under the implicit assumption that the effect of the interaction Hamiltonian which causes the scattering events is sufficiently small. To check the validity of this assumption, we present a general framework to derive simple validity criteria in order to assess whether the scattering rates can be trusted for the system under consideration, given its statistical properties such as average size, electron density, impurity density et cetera. We derive concrete validity criteria for metallic nanowires with conduction electrons populating a single parabolic band subjected to different elastic scattering mechanisms: impurities, grain boundaries and surface roughness.
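For reference, the rate whose validity these criteria address is the standard golden-rule expression (our notation):

\[
\Gamma_{i\to f} \;=\; \frac{2\pi}{\hbar}\,\bigl|\langle f|\hat{H}'|i\rangle\bigr|^{2}\,\rho(E_f),
\]

with interaction Hamiltonian \(\hat{H}'\) and final density of states \(\rho(E_f)\); the criteria derived in the paper essentially bound when the perturbative assumption behind this formula holds for impurity, grain-boundary, and surface-roughness scattering in nanowires.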
Machine learning bandgaps of double perovskites
Pilania, G.; Mannodi-Kanakkithodi, A.; Uberuaga, B. P.; Ramprasad, R.; Gubernatis, J. E.; Lookman, T.
2016-01-01
The ability to make rapid and accurate predictions on bandgaps of double perovskites is of much practical interest for a range of applications. While quantum mechanical computations for high-fidelity bandgaps are enormously computation-time intensive and thus impractical in high throughput studies, informatics-based statistical learning approaches can be a promising alternative. Here we demonstrate a systematic feature-engineering approach and a robust learning framework for efficient and accurate predictions of electronic bandgaps of double perovskites. After evaluating a set of more than 1.2 million features, we identify lowest occupied Kohn-Sham levels and elemental electronegativities of the constituent atomic species as the most crucial and relevant predictors. The developed models are validated and tested using the best practices of data science and further analyzed to rationalize their prediction performance. PMID:26783247
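A hedged sketch of the informatics-based alternative described above (not the authors' pipeline or data): kernel ridge regression on two stand-in elemental descriptors, evaluated by cross-validation. The descriptor columns and the tiny synthetic dataset are illustrative assumptions.

```python
# Kernel ridge regression of a synthetic "bandgap" target on two elemental descriptors.
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(6)
# columns: lowest-occupied-level proxy, electronegativity difference (assumed features)
X = rng.uniform(size=(200, 2))
y = 1.5 * X[:, 0] + 0.8 * X[:, 1] ** 2 + rng.normal(0, 0.05, 200)   # synthetic bandgap (eV)

model = KernelRidge(kernel="rbf", alpha=1e-3, gamma=2.0)
scores = cross_val_score(model, X, y, cv=5, scoring="neg_mean_absolute_error")
print(f"cross-validated MAE ~ {-scores.mean():.3f} eV")
```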
Phenomenological approach to mechanical damage growth analysis.
Pugno, Nicola; Bosia, Federico; Gliozzi, Antonio S; Delsanto, Pier Paolo; Carpinteri, Alberto
2008-10-01
The problem of characterizing damage evolution in a generic material is addressed with the aim of tracing it back to existing growth models in other fields of research. Based on energetic considerations, a system evolution equation is derived for a generic damage indicator describing a material system subjected to an increasing external stress. The latter is found to fit into the framework of a recently developed phenomenological universality (PUN) approach and, more specifically, the so-called U2 class. Analytical results are confirmed by numerical simulations based on a fiber-bundle model and statistically assigned local strengths at the microscale. The fits with numerical data prove, with an excellent degree of reliability, that the typical evolution of the damage indicator belongs to the aforementioned PUN class. Applications of this result are briefly discussed and suggested.
Lazari, Priscilla Cardoso; Sotto-Maior, Bruno Salles; Rocha, Eduardo Passos; de Villa Camargos, Germana; Del Bel Cury, Altair Antoninha
2014-10-01
The chipping of ceramic veneers is a common problem for zirconia-based restorations and is due to the weak interface between the two structures. The purpose of this study was to evaluate the mechanical behavior of ceramic veneers on zirconia and metal frameworks under 2 different bond-integrity conditions. The groups were created to simulate framework-veneer bond integrity with the crowns partially debonded (frictional coefficient, 0.3) or completely bonded, as follows: crown with a silver-palladium framework cemented onto a natural tooth, ceramic crown with a zirconia framework cemented onto a natural tooth, crown with a silver-palladium framework cemented onto a Morse taper implant, and ceramic crown with a zirconia framework cemented onto a Morse taper implant. The test loads were 49 N applied to the palatal surface at 45 degrees to the long axis of the crown and 25.5 N applied perpendicular to the incisal edge of the crown. The maximum principal stress, shear stress, and deformation values were calculated for the ceramic veneer, and the von Mises stress was determined for the framework. Veneers partially debonded from the framework (frictional coefficient, 0.3) had greater stress concentrations in all structures compared with the completely bonded veneers. The metal ceramic crowns experienced lower stress values than the ceramic crowns in models that simulate a perfect bond between the ceramic and the framework. Frameworks cemented to a tooth exhibited greater stress values than frameworks cemented to implants, regardless of the material used. Incomplete bonding between the ceramic veneer and the prosthetic framework affects the mechanical performance of the ceramic veneer, making it susceptible to failure, independent of the framework material or complete crown support. Copyright © 2014 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
Library Statistical Data Base Formats and Definitions.
ERIC Educational Resources Information Center
Jones, Dennis; And Others
Presented here are the detailed data structures, categorization of information, terminology, and definitions employed in the design of the library statistical data base. The data base, or management information system, provides administrators with a framework of information and standardized data for library management, planning,…
Some Statistics for Assessing Person-Fit Based on Continuous-Response Models
ERIC Educational Resources Information Center
Ferrando, Pere Joan
2010-01-01
This article proposes several statistics for assessing individual fit based on two unidimensional models for continuous responses: linear factor analysis and Samejima's continuous response model. Both models are approached using a common framework based on underlying response variables and are formulated at the individual level as fixed regression…
Visualizing Teacher Education as a Complex System: A Nested Simplex System Approach
ERIC Educational Resources Information Center
Ludlow, Larry; Ell, Fiona; Cochran-Smith, Marilyn; Newton, Avery; Trefcer, Kaitlin; Klein, Kelsey; Grudnoff, Lexie; Haigh, Mavis; Hill, Mary F.
2017-01-01
Our purpose is to provide an exploratory statistical representation of initial teacher education as a complex system comprised of dynamic influential elements. More precisely, we reveal what the system looks like for differently-positioned teacher education stakeholders based on our framework for gathering, statistically analyzing, and graphically…
Statistical Framework for Recreational Water Quality Criteria and Monitoring
Discussions between the EPA Office of Research and Development (ORD) and the EPA Office of Water (OW), which is charged with setting criteria in accordance with the BEACH Act of 2000, have made it clear that in-depth statistical guidance for such criteria is needed. In January 20...
Theoretical Frameworks for Math Fact Fluency
ERIC Educational Resources Information Center
Arnold, Katherine
2012-01-01
Recent education statistics indicate persistent low math scores for our nation's students. This drop in math proficiency includes deficits in basic number sense and automaticity of math facts. The decrease has been recorded across all grade levels with the elementary levels showing the greatest loss (National Center for Education Statistics,…
A Conceptual Framework for Teaching Statistics from a Distance
ERIC Educational Resources Information Center
Mills, Jamie
2015-01-01
This article discusses important considerations for teachers who teach or may be thinking about teaching statistics online or in a hybrid/blended format. Suggestions from previous research and practical teaching experiences are examined. Moreover, the latest recommendations from the literature are considered in the context of teaching from a…
Cyber Mentoring in an Online Introductory Statistics Course
ERIC Educational Resources Information Center
Rashid, Mamunur; Sarkar, Jyotirmoy
2018-01-01
Students in an online statistics course were prone to become increasingly disengaged as the semester progressed. In Spring 2015, we took a proactive measure to retain student engagement by introducing a cyber mentoring session. We describe the framework, operation and effectiveness of cyber mentoring in improving students' learning experience and…
Commentary to Library Statistical Data Base.
ERIC Educational Resources Information Center
Jones, Dennis; And Others
The National Center for Higher Education Management Systems (NCHEMS) has developed a library statistical data base which concentrates on the management information needs of administrators of public and academic libraries. This document provides an overview of the framework and conceptual approach employed in the design of the data base. The data…
Compendium of Abstracts on Statistical Applications in Geotechnical Engineering.
1983-09-01
Research in the application of probabilistic and statistical methods to soil mechanics, rock mechanics, and engineering geology problems has grown markedly... probability, statistics, soil mechanics, rock mechanics, and engineering geology. The purpose of this report is to make available to the U. S... Subject areas covered include: deformation; dynamic response analysis; seepage, soil permeability, and piping; earthquake engineering, seismology; settlement and heave; seismic risk analysis.
NASA Astrophysics Data System (ADS)
Wallace, Jon Michael
2003-10-01
Reliability prediction of components operating in complex systems has historically been conducted in a statistically isolated manner. Current physics-based, i.e. mechanistic, component reliability approaches focus more on component-specific attributes and mathematical algorithms and not enough on the influence of the system. The result is that significant error can be introduced into the component reliability assessment process. The objective of this study is the development of a framework that infuses the needs and influence of the system into the process of conducting mechanistic-based component reliability assessments. The formulated framework consists of six primary steps. The first three steps, identification, decomposition, and synthesis, are primarily qualitative in nature and employ system reliability and safety engineering principles to construct an appropriate starting point for the component reliability assessment. The following two steps are the most unique. They involve a step to efficiently characterize and quantify the system-driven local parameter space and a subsequent step using this information to guide the reduction of the component parameter space. The local statistical space quantification step is accomplished using two proposed multivariate probability models: Multi-Response First Order Second Moment and Taylor-Based Inverse Transformation. Where existing joint probability models require preliminary distribution and correlation information of the responses, these models combine statistical information of the input parameters with an efficient sampling of the response analyses to produce the multi-response joint probability distribution. Parameter space reduction is accomplished using Approximate Canonical Correlation Analysis (ACCA) employed as a multi-response screening technique. The novelty of this approach is that each individual local parameter and even subsets of parameters representing entire contributing analyses can now be rank ordered with respect to their contribution to not just one response, but the entire vector of component responses simultaneously. The final step of the framework is the actual probabilistic assessment of the component. Although the same multivariate probability tools employed in the characterization step can be used for the component probability assessment, variations of this final step are given to allow for the utilization of existing probabilistic methods such as response surface Monte Carlo and Fast Probability Integration. The overall framework developed in this study is implemented to assess the finite-element based reliability prediction of a gas turbine airfoil involving several failure responses. Results of this implementation are compared to results generated using the conventional 'isolated' approach as well as a validation approach conducted through large sample Monte Carlo simulations. The framework resulted in a considerable improvement to the accuracy of the part reliability assessment and an improved understanding of the component failure behavior. Considerable statistical complexity in the form of joint non-normal behavior was found and accounted for using the framework. Future applications of the framework elements are discussed.
Probabilistic models in human sensorimotor control
Wolpert, Daniel M.
2009-01-01
Sensory and motor uncertainty form a fundamental constraint on human sensorimotor control. Bayesian decision theory (BDT) has emerged as a unifying framework to understand how the central nervous system performs optimal estimation and control in the face of such uncertainty. BDT has two components: Bayesian statistics and decision theory. Here we review Bayesian statistics and show how it applies to estimating the state of the world and our own body. Recent results suggest that when learning novel tasks we are able to learn the statistical properties of both the world and our own sensory apparatus so as to perform estimation using Bayesian statistics. We review studies which suggest that humans can combine multiple sources of information to form maximum likelihood estimates, can incorporate prior beliefs about possible states of the world so as to generate maximum a posteriori estimates and can use Kalman filter-based processes to estimate time-varying states. Finally, we review Bayesian decision theory in motor control and how the central nervous system processes errors to determine loss functions and optimal actions. We review results that suggest we plan movements based on statistics of our actions that result from signal-dependent noise on our motor outputs. Taken together these studies provide a statistical framework for how the motor system performs in the presence of uncertainty. PMID:17628731
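A standard worked example of the maximum-likelihood cue combination reviewed above (a textbook result, not a formula specific to this paper): two conditionally independent Gaussian cues \(x_1\) and \(x_2\) about the same state, with variances \(\sigma_1^2\) and \(\sigma_2^2\), combine as

\[
\hat{x}_{\mathrm{ML}} \;=\; \frac{\sigma_2^{2}\,x_1 + \sigma_1^{2}\,x_2}{\sigma_1^{2}+\sigma_2^{2}}
\;=\; w_1 x_1 + w_2 x_2,\qquad w_i \propto \frac{1}{\sigma_i^{2}},\qquad
\sigma_{\mathrm{ML}}^{2} \;=\; \frac{\sigma_1^{2}\sigma_2^{2}}{\sigma_1^{2}+\sigma_2^{2}} \;\le\; \min(\sigma_1^{2},\sigma_2^{2}),
\]

so each cue is weighted by its reliability (inverse variance) and the combined estimate is never less reliable than the better single cue; adding a Gaussian prior turns the same weighting rule into the maximum a posteriori estimate.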
NASA Technical Reports Server (NTRS)
Barth, Timothy J.
2016-01-01
This chapter discusses the ongoing development of combined uncertainty and error bound estimates for computational fluid dynamics (CFD) calculations subject to imposed random parameters and random fields. An objective of this work is the construction of computable error bound formulas for output uncertainty statistics that guide CFD practitioners in systematically determining how accurately CFD realizations should be approximated and how accurately uncertainty statistics should be approximated for output quantities of interest. Formal error bounds formulas for moment statistics that properly account for the presence of numerical errors in CFD calculations and numerical quadrature errors in the calculation of moment statistics have been previously presented in [8]. In this past work, hierarchical node-nested dense and sparse tensor product quadratures are used to calculate moment statistics integrals. In the present work, a framework has been developed that exploits the hierarchical structure of these quadratures in order to simplify the calculation of an estimate of the quadrature error needed in error bound formulas. When signed estimates of realization error are available, this signed error may also be used to estimate output quantity of interest probability densities as a means to assess the impact of realization error on these density estimates. Numerical results are presented for CFD problems with uncertainty to demonstrate the capabilities of this framework.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jin, Jun-Cheng (Technology Promotion Center of Nano Composite Material of Biomimetic Sensor and Detecting Technology, Preparation and Application, Anhui Provincial Laboratory, West Anhui University, Anhui 237012); Guo, Rui-Li
2016-11-15
A systematic study has been conducted on a novel luminescent metal-organic framework, ([Zn(bpyp)(L-OH)]·DMF·2H₂O)ₙ (1), to explore its sensing mechanisms for Fe³⁺. Structure analyses show that compound 1 presents pyridine N atoms and -OH groups on the pore surface for specific sensing of metal ions via Lewis acid-base interactions. On this basis, the quenching mechanisms are studied; the process is controlled by multiple mechanisms, in which the dynamic and static contributions are calculated, achieving a quantitative evaluation of the quenching process. This work not only achieves the quantitative evaluation of the luminescence quenching but also provides certain insights into the quenching process, and the possible mechanisms explored in this work may inspire future research and design of target luminescent metal-organic frameworks (LMOFs) with specific functions. - Graphical abstract: A systematic study has been conducted on a novel luminescent metal-organic framework to explore its sensing mechanisms for Fe³⁺. The quenching mechanisms are studied; the processes are controlled by multiple mechanisms, in which the dynamic and static contributions are calculated, achieving a quantitative evaluation of the quenching process. - Highlights: • A novel porous luminescent MOF containing uncoordinated groups in interlayer channels was successfully synthesized. • Compound 1 exhibits significant luminescence sensitivity to Fe³⁺, which makes it a good candidate as a luminescent sensor. • The corresponding dynamic and static quenching constants are calculated, achieving a quantitative evaluation of the quenching process.
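For context on the dynamic/static analysis mentioned above, the standard Stern-Volmer relations (our notation, a textbook form rather than equations quoted from the paper) are

\[
\frac{I_0}{I} \;=\; 1 + K_{\mathrm{SV}}\,[\mathrm{Fe}^{3+}] \quad\text{(single mechanism)},
\qquad
\frac{I_0}{I} \;=\; \bigl(1 + K_D\,[\mathrm{Fe}^{3+}]\bigr)\bigl(1 + K_S\,[\mathrm{Fe}^{3+}]\bigr) \quad\text{(combined)},
\]

where \(I_0\) and \(I\) are the emission intensities without and with quencher, and \(K_D\) and \(K_S\) are the dynamic and static quenching constants whose separate evaluation enables a quantitative description of the quenching process.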
A Fast Framework for Abrupt Change Detection Based on Binary Search Trees and Kolmogorov Statistic
Qi, Jin-Peng; Qi, Jie; Zhang, Qing
2016-01-01
Change-Point (CP) detection has attracted considerable attention in the fields of data mining and statistics; it is very meaningful to discuss how to quickly and efficiently detect abrupt change from large-scale bioelectric signals. Currently, most of the existing methods, like the Kolmogorov-Smirnov (KS) statistic and so forth, are time-consuming, especially for large-scale datasets. In this paper, we propose a fast framework for abrupt change detection based on binary search trees (BSTs) and a modified KS statistic, named BSTKS (binary search trees and Kolmogorov statistic). In this method, first, two binary search trees, termed BSTcA and BSTcD, are constructed by multilevel Haar Wavelet Transform (HWT); second, three search criteria are introduced in terms of the statistic and variance fluctuations in the diagnosed time series; last, an optimal search path is detected from the root to leaf nodes of the two BSTs. The studies on both the synthetic time series samples and the real electroencephalograph (EEG) recordings indicate that the proposed BSTKS can detect abrupt change more quickly and efficiently than the KS, t-statistic (t), and Singular-Spectrum Analysis (SSA) methods, with the shortest computation time, the highest hit rate, the smallest error, and the highest accuracy of the four methods. This study suggests that the proposed BSTKS is very helpful for useful information inspection on all kinds of bioelectric time series signals. PMID:27413364
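A hedged sketch of the ingredients BSTKS builds on, not the BSTKS algorithm itself: a Haar wavelet coarsening of the series followed by a two-sample KS scan for the most likely change point; the binary-search-tree organization that prunes this scan in the real method is omitted, and the synthetic series stands in for an EEG recording.

```python
# Haar coarsening plus a brute-force two-sample KS scan for a single change point.
import numpy as np
from scipy.stats import ks_2samp

def haar_approx(x):
    """One level of the Haar wavelet transform: pairwise approximation coefficients."""
    x = np.asarray(x, float)[: len(x) // 2 * 2]
    return (x[0::2] + x[1::2]) / np.sqrt(2.0)

def ks_change_point(x, min_seg=20):
    """Return the split index maximizing the two-sample KS statistic."""
    best_i, best_stat = None, -1.0
    for i in range(min_seg, len(x) - min_seg):
        stat = ks_2samp(x[:i], x[i:]).statistic
        if stat > best_stat:
            best_i, best_stat = i, stat
    return best_i, best_stat

rng = np.random.default_rng(7)
signal = np.r_[rng.normal(0.0, 1.0, 600), rng.normal(1.5, 1.0, 424)]  # toy series, shift at 600
coarse = haar_approx(haar_approx(signal))                             # 4x shorter series
i, stat = ks_change_point(coarse)
print(f"change detected near sample {i * 4} (KS = {stat:.2f})")
```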
NASA Astrophysics Data System (ADS)
Liu, Ruiming; Liu, Erqi; Yang, Jie; Zeng, Yong; Wang, Fanglin; Cao, Yuan
2007-11-01
Fukunaga-Koontz transform (FKT), stemming from principal component analysis (PCA), is used in many pattern recognition and image-processing fields. It cannot capture the higher-order statistical properties of natural images, so its detection performance is not satisfactory. PCA has been extended into kernel PCA in order to capture higher-order statistics. However, thus far no researchers have explicitly proposed a kernel FKT (KFKT) or investigated its detection performance. For accurately detecting potential small targets in infrared images, we first extend FKT into KFKT to capture the higher-order statistical properties of images. Then a framework based on Kalman prediction and KFKT, which can automatically detect and track small targets, is developed. Results of experiments show that KFKT outperforms FKT and that the proposed framework is capable of automatically detecting and tracking infrared point targets.
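A hedged numpy sketch of the linear Fukunaga-Koontz transform on which KFKT builds (the kernelization and the Kalman tracking stage are not reproduced); the data are random stand-ins for target and background samples.

```python
# Linear FKT: a shared basis in which target- and background-dominant directions separate.
import numpy as np

def fkt_basis(X_target, X_background):
    """Whiten R1 + R2, then diagonalize the whitened target correlation matrix."""
    R1 = X_target.T @ X_target / len(X_target)            # class autocorrelation matrices
    R2 = X_background.T @ X_background / len(X_background)
    d, P = np.linalg.eigh(R1 + R2)
    W = P @ np.diag(1.0 / np.sqrt(np.maximum(d, 1e-12)))  # whitening transform of R1 + R2
    lam, V = np.linalg.eigh(W.T @ R1 @ W)                 # eigenvalues lie in [0, 1]
    return W @ V, lam                                     # columns sorted by ascending lam

rng = np.random.default_rng(8)
target = rng.normal(size=(200, 25)) @ np.diag(np.linspace(2.0, 0.1, 25))
background = rng.normal(size=(500, 25))
basis, lam = fkt_basis(target, background)
print(lam[-5:])   # directions with lam near 1 are target-dominant, near 0 background-dominant
```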
Statistical Mechanics of Temporal and Interacting Networks
NASA Astrophysics Data System (ADS)
Zhao, Kun
In the last ten years important breakthroughs in the understanding of the topology of complexity have been made in the framework of network science. Indeed it has been found that many networks belong to the universality classes called small-world networks or scale-free networks. Moreover it was found that the complex architecture of real world networks strongly affects the critical phenomena defined on these structures. Nevertheless the main focus of the research has been the characterization of single and static networks. Recently, temporal networks and interacting networks have attracted considerable interest. Indeed many networks are interacting or formed by a multilayer structure. Examples of these networks are found in social networks, where an individual might be at the same time part of different social networks, in economic and financial networks, in physiology, and in infrastructure systems. Moreover, many networks are temporal, i.e., the links appear and disappear on fast time scales. Examples of these networks are social networks of contacts such as face-to-face interactions or mobile-phone communication, the time-dependent correlations in brain activity, and so on. Understanding the evolution of temporal and multilayer networks and characterizing critical phenomena in these systems is crucial if we want to describe, predict and control the dynamics of complex systems. In this thesis, we investigate several statistical mechanics models of temporal and interacting networks, to shed light on the dynamics of this new generation of complex networks. First, we investigate a model of temporal social networks aimed at characterizing human social interactions such as face-to-face interactions and phone-call communication. Indeed, thanks to the availability of data on these interactions, we are now in the position to compare the proposed model to the real data, finding good agreement. Second, we investigate the entropy of temporal networks and growing networks, to provide a new framework to quantify the information encoded in these networks and to answer a fundamental problem in network science: how complex are temporal and growing networks? Finally, we consider two examples of critical phenomena in interacting networks. In particular, on one side we investigate the percolation of interacting networks by introducing antagonistic interactions. On the other side, we investigate a model of political elections based on the percolation of antagonistic networks. The aim of this research is to show how antagonistic interactions change the physics of critical phenomena on interacting networks. We believe that the work presented in this thesis offers the possibility to appreciate the large variability of problems that can be addressed in the new framework of temporal and interacting networks.
Mechanical properties and negative thermal expansion of a dense rare earth formate framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Zhanrui; Jiang, Xingxing; Feng, Guoqiang
The fundamental mechanical properties of a dense metal–organic framework material, [NH₂CHNH₂][Er(HCOO)₄] (1), have been studied using the nanoindentation technique. The results demonstrate that the elastic moduli, hardnesses, and yield stresses on the (021)/(02−1) facets are 29.8/30.2, 1.80/1.83, and 0.93/1.01 GPa, respectively. Moreover, variable-temperature powder and single-crystal X-ray diffraction experiments reveal that framework 1 shows significant negative thermal expansion along its b axis, which can be explained by using a hinge–strut structural motif. - Graphical abstract: The structure of the framework, [NH₂CHNH₂][Er(HCOO)₄], and its indicatrix of thermal expansion. - Highlights: • The elastic modulus, hardness, and yield stress of a rare earth metal–organic framework material were studied via the nanoindentation technique. • Variable-temperature powder X-ray diffraction experiments reveal that this framework shows significant negative thermal expansion along its b axis. • Based on variable-temperature single-crystal X-ray diffraction experiments, the mechanism of negative thermal expansion can be explained by a hinge–strut structural motif.
Diagnosis and Threat Detection Capabilities of the SERENITY Monitoring Framework
NASA Astrophysics Data System (ADS)
Tsigkritis, Theocharis; Spanoudakis, George; Kloukinas, Christos; Lorenzoli, Davide
The SERENITY monitoring framework offers mechanisms for diagnosing the causes of violations of security and dependability (S&D) properties and detecting potential violations of such properties, called "Cthreats". Diagnostic information and threat detection are often necessary for deciding what an appropriate reaction to a violation is and taking pre-emptive actions against predicted violations, respectively. In this chapter, we describe the mechanisms of the SERENITY monitoring framework which generate diagnostic information for violations of S&D properties and detecting threats.
Teaching Classical Statistical Mechanics: A Simulation Approach.
ERIC Educational Resources Information Center
Sauer, G.
1981-01-01
Describes a one-dimensional model for an ideal gas to study development of disordered motion in Newtonian mechanics. A Monte Carlo procedure for simulation of the statistical ensemble of an ideal gas with fixed total energy is developed. Compares both approaches for a pseudoexperimental foundation of statistical mechanics. (Author/JN)
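A minimal sketch in the spirit of the simulation approach described above (our assumptions about the details, not the article's program): velocities of N one-dimensional ideal-gas particles are sampled uniformly on the constant-energy shell, and the resulting single-particle energy histogram approaches the canonical (Boltzmann) form as N grows.

```python
# Monte Carlo sampling of a 1D ideal gas at fixed total energy (microcanonical ensemble).
import numpy as np

rng = np.random.default_rng(9)
N, E, n_samples = 50, 50.0, 2000      # particles, total energy (m = 1), MC samples

energies = []
for _ in range(n_samples):
    v = rng.normal(size=N)
    v *= np.sqrt(2.0 * E / np.sum(v ** 2))     # project onto the shell sum(v_i^2)/2 = E
    energies.extend(0.5 * v ** 2)

hist, edges = np.histogram(energies, bins=30, range=(0.0, 8.0), density=True)
print(np.round(hist[:10], 3))   # approximately proportional to exp(-e/kT)/sqrt(e), with kT = 2E/N
```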
Pera, F; Pesce, P; Solimano, F; Tealdo, T; Pera, P; Menini, M
2017-05-01
Frameworks made of carbon fibre-reinforced composites (CFRC) seem to be a viable alternative to traditional metal frameworks in implant prosthodontics. CFRC provide stiffness, rigidity and optimal biocompatibility. The aim of the present prospective study was to compare carbon fibre frameworks versus metal frameworks used to rigidly splint implants in full-arch immediate loading rehabilitations. Forty-two patients (test group) were rehabilitated with full-arch immediate loading rehabilitations of the upper jaw (total: 170 implants) following the Columbus Bridge Protocol, using four to six implants including distally tilted implants. All patients were treated with resin screw-retained full-arch prostheses endowed with carbon fibre frameworks. The mean follow-up was 22 months (range: 18-24). Differences in the absolute change of bone resorption over time between the two implant sides (mesial and distal) were assessed using a Mann-Whitney U-test. The outcomes were statistically compared with those of patients rehabilitated following the same protocol but using metal frameworks (control group: 34 patients with 163 implants - data reported in Tealdo, Menini, Bevilacqua, Pera, Pesce, Signori, Pera, Int J Prosthodont, 27, 2014, 207). Ten implants failed in the control group (6·1%); none failed in the test group (P = 0·002). A statistically significant difference in the absolute change of bone resorption around the implants was found between the two groups (P = 0·004), with greater mean peri-implant bone resorption in the control group (1 mm) compared to the test group (0·8 mm). Carbon fibre frameworks may be considered a viable alternative to metal ones; they showed less marginal bone loss around implants and a greater implant survival rate during the observation period. © 2017 John Wiley & Sons Ltd.
NASA Astrophysics Data System (ADS)
Bianconi, Ginestra
2009-03-01
In this paper we generalize the concept of random networks to describe network ensembles with nontrivial features by a statistical mechanics approach. This framework is able to describe undirected and directed network ensembles as well as weighted network ensembles. These networks might have nontrivial community structure or, in the case of networks embedded in a given space, they might have a link probability with a nontrivial dependence on the distance between the nodes. These ensembles are characterized by their entropy, which evaluates the cardinality of networks in the ensemble. In particular, in this paper we define and evaluate the structural entropy, i.e., the entropy of the ensembles of undirected uncorrelated simple networks with given degree sequence. We stress the apparent paradox that scale-free degree distributions are characterized by having small structural entropy while they are so widely encountered in natural, social, and technological complex systems. We propose a solution to the paradox by proving that scale-free degree distributions are the most likely degree distribution with the corresponding value of the structural entropy. Finally, the general framework we present in this paper is able to describe microcanonical ensembles of networks as well as canonical or hidden-variable network ensembles with significant implications for the formulation of network-constructing algorithms.
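The entropy of a canonical (hidden-variable) network ensemble can be written down directly from the link probabilities. The sketch below computes that Shannon entropy for a Chung-Lu-style ensemble whose expected degrees match a given sequence; it is a simplified illustration of the general idea, not the microcanonical structural entropy defined in the paper, and the degree sequences are made up.

```python
import numpy as np

def canonical_ensemble_entropy(expected_degrees):
    """Shannon entropy of a canonical (hidden-variable) ensemble of simple
    undirected networks whose expected degrees match the given sequence.
    Uses the Chung-Lu approximation p_ij = k_i*k_j / (2m), clipped below 1."""
    k = np.asarray(expected_degrees, dtype=float)
    two_m = k.sum()
    p = np.clip(np.outer(k, k) / two_m, 0.0, 1.0 - 1e-12)
    i, j = np.triu_indices(len(k), 1)          # each pair of nodes counted once
    pij = p[i, j]
    with np.errstate(divide="ignore", invalid="ignore"):
        per_link = -(pij * np.log(pij) + (1 - pij) * np.log(1 - pij))
    return float(np.nansum(per_link))

# Toy comparison: a broad (scale-free-like) sequence vs. a homogeneous one
print(canonical_ensemble_entropy([20, 10, 5, 5, 3, 3, 2, 2, 2, 2]))
print(canonical_ensemble_entropy([5.4] * 10))
```

With these toy numbers the broad sequence comes out with a markedly lower entropy than the homogeneous sequence of the same mean degree, in the spirit of the small structural entropy of scale-free sequences noted above.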
NASA Astrophysics Data System (ADS)
Latypov, Marat I.; Kalidindi, Surya R.
2017-10-01
There is a critical need for the development and verification of practically useful multiscale modeling strategies for simulating the mechanical response of multiphase metallic materials with heterogeneous microstructures. In this contribution, we present data-driven reduced order models for effective yield strength and strain partitioning in such microstructures. These models are built employing the recently developed framework of Materials Knowledge Systems that employ 2-point spatial correlations (or 2-point statistics) for the quantification of the heterostructures and principal component analyses for their low-dimensional representation. The models are calibrated to a large collection of finite element (FE) results obtained for a diverse range of microstructures with various sizes, shapes, and volume fractions of the phases. The performance of the models is evaluated by comparing the predictions of yield strength and strain partitioning in two-phase materials with the corresponding predictions from a classical self-consistent model as well as results of full-field FE simulations. The reduced-order models developed in this work show an excellent combination of accuracy and computational efficiency, and therefore present an important advance towards computationally efficient microstructure-sensitive multiscale modeling frameworks.
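As a concrete illustration of the two ingredients named above (2-point statistics followed by principal component analysis), the sketch below computes periodic 2-point autocorrelations of synthetic two-phase microstructures with an FFT and projects them onto their leading principal components. It is a generic outline of that pipeline, not the authors' calibrated Materials Knowledge Systems code, and the 32x32 random microstructures are placeholders.

```python
import numpy as np

def two_point_autocorrelation(phase_map):
    """Periodic 2-point autocorrelation of one phase of a binary microstructure,
    computed with FFTs; the zero separation vector sits at the array center."""
    m = np.asarray(phase_map, dtype=float)          # 1 where the phase is present
    F = np.fft.fftn(m)
    corr = np.fft.ifftn(F * np.conj(F)).real / m.size
    return np.fft.fftshift(corr)

# Toy ensemble: random two-phase microstructures with varying volume fraction
rng = np.random.default_rng(0)
stats = [two_point_autocorrelation((rng.random((32, 32)) < vf).astype(float)).ravel()
         for vf in np.linspace(0.2, 0.6, 20)]
X = np.array(stats)

# Low-dimensional representation: PCA via SVD of the centered 2-point statistics
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
pc_scores = U[:, :3] * S[:3]      # first three principal component scores
print(pc_scores.shape)            # (20, 3): one low-dimensional point per microstructure
```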
Testolin, Alberto; Zorzi, Marco
2016-01-01
Connectionist models can be characterized within the more general framework of probabilistic graphical models, which make it possible to efficiently describe complex statistical distributions involving a large number of interacting variables. This integration allows building more realistic computational models of cognitive functions, which more faithfully reflect the underlying neural mechanisms while at the same time providing a useful bridge to higher-level descriptions in terms of Bayesian computations. Here we discuss a powerful class of graphical models that can be implemented as stochastic, generative neural networks. These models overcome many limitations associated with classic connectionist models, for example by exploiting unsupervised learning in hierarchical architectures (deep networks) and by taking into account top-down, predictive processing supported by feedback loops. We review some recent cognitive models based on generative networks, and we point out promising research directions to investigate neuropsychological disorders within this approach. Though further efforts are required in order to fill the gap between structured Bayesian models and more realistic, biophysical models of neuronal dynamics, we argue that generative neural networks have the potential to bridge these levels of analysis, thereby improving our understanding of the neural bases of cognition and of pathologies caused by brain damage. PMID:27468262
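One widely used building block of the stochastic, generative networks discussed here is the restricted Boltzmann machine. The sketch below trains a tiny RBM with one-step contrastive divergence on two toy binary patterns; sizes, learning rate, and data are illustrative assumptions rather than any model from the reviewed literature.

```python
import numpy as np

rng = np.random.default_rng(0)
sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

# Tiny restricted Boltzmann machine trained with 1-step contrastive divergence
n_vis, n_hid, lr = 6, 3, 0.1
W = 0.01 * rng.standard_normal((n_vis, n_hid))
b_v, b_h = np.zeros(n_vis), np.zeros(n_hid)

# Toy binary data: two repeated prototype patterns
data = np.array([[1, 1, 1, 0, 0, 0], [0, 0, 0, 1, 1, 1]] * 50, dtype=float)

for epoch in range(200):
    for v0 in data:
        ph0 = sigmoid(v0 @ W + b_h)                  # P(h = 1 | v0)
        h0 = (rng.random(n_hid) < ph0).astype(float)
        pv1 = sigmoid(h0 @ W.T + b_v)                # stochastic reconstruction
        v1 = (rng.random(n_vis) < pv1).astype(float)
        ph1 = sigmoid(v1 @ W + b_h)
        W += lr * (np.outer(v0, ph0) - np.outer(v1, ph1))
        b_v += lr * (v0 - v1)
        b_h += lr * (ph0 - ph1)

print(np.round(sigmoid(data[0] @ W + b_h), 2))  # hidden code for pattern A
print(np.round(sigmoid(data[1] @ W + b_h), 2))  # hidden code for pattern B
```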
Chen, Zhe; Purdon, Patrick L.; Brown, Emery N.; Barbieri, Riccardo
2012-01-01
In recent years, time-varying inhomogeneous point process models have been introduced for assessment of instantaneous heartbeat dynamics as well as specific cardiovascular control mechanisms and hemodynamics. Assessment of the model’s statistics is established through the Wiener-Volterra theory and a multivariate autoregressive (AR) structure. A variety of instantaneous cardiovascular metrics, such as heart rate (HR), heart rate variability (HRV), respiratory sinus arrhythmia (RSA), and baroreceptor-cardiac reflex (baroreflex) sensitivity (BRS), are derived within a parametric framework and instantaneously updated with adaptive and local maximum likelihood estimation algorithms. Inclusion of second-order non-linearities, with subsequent bispectral quantification in the frequency domain, further allows for definition of instantaneous metrics of non-linearity. We here present a comprehensive review of the devised methods as applied to experimental recordings from healthy subjects during propofol anesthesia. Collective results reveal interesting dynamic trends across the different pharmacological interventions operated within each anesthesia session, confirming the ability of the algorithm to track important changes in cardiorespiratory elicited interactions, and pointing at our mathematical approach as a promising monitoring tool for an accurate, non-invasive assessment in clinical practice. We also discuss the limitations and other alternative modeling strategies of our point process approach. PMID:22375120
Implementing Restricted Maximum Likelihood Estimation in Structural Equation Models
ERIC Educational Resources Information Center
Cheung, Mike W.-L.
2013-01-01
Structural equation modeling (SEM) is now a generic modeling framework for many multivariate techniques applied in the social and behavioral sciences. Many statistical models can be considered either as special cases of SEM or as part of the latent variable modeling framework. One popular extension is the use of SEM to conduct linear mixed-effects…
ERIC Educational Resources Information Center
Evangelista, Nancy; McLellan, Mary J.
2004-01-01
The expansion of early childhood services has brought increasing recognition of the need to address mental health disorders in young children. The transactional perspective of developmental psychopathology is the basis for review of diagnostic frameworks for young children. The Diagnostic and Statistical Manual of Mental Disorders (DSM-IV) is…
ERIC Educational Resources Information Center
Taylor, John
2008-01-01
One of the major tasks of the United Nations Permanent Forum on Indigenous Issues (UNPFII) following its establishment in 2000 has been to establish statistical profiles of the world's Indigenous peoples. As part of this broad task, it has recommended that the Millennium Development Goals and other global reporting frameworks should be assessed…
Towards sound epistemological foundations of statistical methods for high-dimensional biology.
Mehta, Tapan; Tanik, Murat; Allison, David B
2004-09-01
A sound epistemological foundation for biological inquiry comes, in part, from application of valid statistical procedures. This tenet is widely appreciated by scientists studying the new realm of high-dimensional biology, or 'omic' research, which involves multiplicity at unprecedented scales. Many papers aimed at the high-dimensional biology community describe the development or application of statistical techniques. The validity of many of these is questionable, and a shared understanding about the epistemological foundations of the statistical methods themselves seems to be lacking. Here we offer a framework in which the epistemological foundation of proposed statistical methods can be evaluated.
A general science-based framework for dynamical spatio-temporal models
Wikle, C.K.; Hooten, M.B.
2010-01-01
Spatio-temporal statistical models are increasingly being used across a wide variety of scientific disciplines to describe and predict spatially-explicit processes that evolve over time. Correspondingly, in recent years there has been a significant amount of research on new statistical methodology for such models. Although descriptive models that approach the problem from the second-order (covariance) perspective are important, and innovative work is being done in this regard, many real-world processes are dynamic, and it can be more efficient in some cases to characterize the associated spatio-temporal dependence by the use of dynamical models. The chief challenge with the specification of such dynamical models has been related to the curse of dimensionality. Even in fairly simple linear, first-order Markovian, Gaussian error settings, statistical models are often overparameterized. Hierarchical models have proven invaluable in their ability to deal to some extent with this issue by allowing dependency among groups of parameters. In addition, this framework has allowed for the specification of science-based parameterizations (and associated prior distributions) in which classes of deterministic dynamical models (e.g., partial differential equations (PDEs), integro-difference equations (IDEs), matrix models, and agent-based models) are used to guide specific parameterizations. Most of the focus for the application of such models in statistics has been in the linear case. The problems mentioned above with linear dynamic models are compounded in the case of nonlinear models. In this sense, the need for coherent and sensible model parameterizations is not only helpful, it is essential. Here, we present an overview of a framework for incorporating scientific information to motivate dynamical spatio-temporal models. First, we illustrate the methodology with the linear case. We then develop a general nonlinear spatio-temporal framework that we call general quadratic nonlinearity and demonstrate that it accommodates many different classes of science-based parameterizations as special cases. The model is presented in a hierarchical Bayesian framework and is illustrated with examples from ecology and oceanography. © 2010 Sociedad de Estadística e Investigación Operativa.
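For the linear, first-order Markovian case that the overview starts from, the process and data stages can be simulated in a few lines. The sketch below generates observations from a toy model X_t = M X_{t-1} + eta_t, Y_t = H X_t + eps_t on a one-dimensional grid; the propagator M, noise covariances, and grid size are arbitrary illustrative choices, and no hierarchical Bayesian fitting is shown.

```python
import numpy as np

rng = np.random.default_rng(1)

# Minimal linear, first-order Markovian dynamical spatio-temporal model (sketch):
#   process:  X_t = M X_{t-1} + eta_t,   eta_t ~ N(0, Q)
#   data:     Y_t = H X_t + eps_t,       eps_t ~ N(0, R)
n = 30                                   # spatial locations on a 1-D grid
M = 0.9 * np.eye(n) + 0.05 * np.eye(n, k=1) + 0.05 * np.eye(n, k=-1)  # local propagation
H = np.eye(n)
Q, R = 0.1 * np.eye(n), 0.5 * np.eye(n)

T = 50
X = np.zeros((T, n))
Y = np.zeros((T, n))
X[0, n // 2] = 5.0                       # initial pulse in the middle of the domain
for t in range(1, T):
    X[t] = M @ X[t - 1] + rng.multivariate_normal(np.zeros(n), Q)
    Y[t] = H @ X[t] + rng.multivariate_normal(np.zeros(n), R)

print(Y.shape)   # (50, 30) simulated observations to which a hierarchical model could be fit
```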
Patel, Vikram; Burns, Jonathan K.; Dhingra, Monisha; Tarver, Leslie; Kohrt, Brandon A.; Lund, Crick
2018-01-01
Most countries have witnessed a dramatic increase of income inequality in the past three decades. This paper addresses the question of whether income inequality is associated with the population prevalence of depression and, if so, the potential mechanisms and pathways which may explain this association. Our systematic review included 26 studies, mostly from high‐income countries. Nearly two‐thirds of all studies and five out of six longitudinal studies reported a statistically significant positive relationship between income inequality and risk of depression; only one study reported a statistically significant negative relationship. Twelve studies were included in a meta‐analysis with dichotomized inequality groupings. The pooled risk ratio was 1.19 (95% CI: 1.07‐1.31), demonstrating greater risk of depression in populations with higher income inequality relative to populations with lower inequality. Multiple studies reported subgroup effects, including greater impacts of income inequality among women and low‐income populations. We propose an ecological framework, with mechanisms operating at the national level (the neo‐material hypothesis), neighbourhood level (the social capital and the social comparison hypotheses) and individual level (psychological stress and social defeat hypotheses) to explain this association. We conclude that policy makers should actively promote actions to reduce income inequality, such as progressive taxation policies and a basic universal income. Mental health professionals should champion such policies, as well as promote the delivery of interventions which target the pathways and proximal determinants, such as building life skills in adolescents and provision of psychological therapies and packages of care with demonstrated effectiveness for settings of poverty and high income inequality. PMID:29352539
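The pooled risk ratio quoted above comes from a meta-analysis over dichotomized inequality groupings. The sketch below shows a generic DerSimonian-Laird random-effects pooling of study-level risk ratios reported with 95% confidence intervals; the four input studies are invented numbers for illustration, not the twelve studies included in the review.

```python
import numpy as np

def pooled_risk_ratio(rr, ci_low, ci_high):
    """DerSimonian-Laird random-effects pooling of study risk ratios reported
    with 95% CIs. Returns the pooled RR and its 95% CI."""
    y = np.log(rr)                                        # log risk ratios
    se = (np.log(ci_high) - np.log(ci_low)) / (2 * 1.96)  # back out standard errors
    w = 1.0 / se**2
    # between-study heterogeneity (tau^2) via the DerSimonian-Laird moment estimator
    y_fixed = np.sum(w * y) / np.sum(w)
    Q = np.sum(w * (y - y_fixed) ** 2)
    c = np.sum(w) - np.sum(w**2) / np.sum(w)
    tau2 = max(0.0, (Q - (len(y) - 1)) / c)
    w_re = 1.0 / (se**2 + tau2)
    y_re = np.sum(w_re * y) / np.sum(w_re)
    se_re = np.sqrt(1.0 / np.sum(w_re))
    return np.exp(y_re), np.exp(y_re - 1.96 * se_re), np.exp(y_re + 1.96 * se_re)

# Hypothetical study-level risk ratios (not the studies pooled in the review)
print(pooled_risk_ratio(rr=[1.10, 1.25, 1.30, 0.95],
                        ci_low=[0.95, 1.05, 1.10, 0.80],
                        ci_high=[1.27, 1.49, 1.54, 1.13]))
```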
A Cluster-Based Framework for the Security of Medical Sensor Environments
NASA Astrophysics Data System (ADS)
Klaoudatou, Eleni; Konstantinou, Elisavet; Kambourakis, Georgios; Gritzalis, Stefanos
The adoption of Wireless Sensor Networks (WSNs) in the healthcare sector poses many security issues, mainly because medical information is considered particularly sensitive. The security mechanisms employed are expected to be more efficient in terms of energy consumption and scalability in order to cope with the constrained capabilities of WSNs and patients’ mobility. Towards this goal, cluster-based medical WSNs can substantially improve efficiency and scalability. In this context, we have proposed a general framework for cluster-based medical environments on which security mechanisms can rely. This framework fully covers the varying needs of both in-hospital environments and environments formed ad hoc for medical emergencies. In this paper, we further elaborate on the security of our proposed solution. We specifically focus on key establishment mechanisms and investigate the group key agreement protocols that can best fit in our framework.
Welp, Annalena; Manser, Tanja
2016-07-19
There is growing evidence that teamwork in hospitals is related to both patient outcomes and clinician occupational well-being. Furthermore, clinician well-being is associated with patient safety. Despite considerable research activity, few studies include all three concepts, and their interrelations have not yet been investigated systematically. To advance our understanding of these potentially complex interrelations we propose an integrative framework taking into account current evidence and research gaps identified in a systematic review. We conducted a literature search in six major databases (Medline, PsycArticles, PsycInfo, Psyndex, ScienceDirect, and Web of Knowledge). Inclusion criteria were: peer-reviewed papers published between January 2000 and June 2015 investigating a statistical relationship between at least two of the three concepts (teamwork, patient safety, and clinician occupational well-being) in hospital settings, including practicing nurses and physicians. We assessed methodological quality using a standardized rating system and qualitatively appraised and extracted relevant data, such as instruments, analyses and outcomes. The 98 studies included in this review were highly diverse regarding quality, methodology and outcomes. We found support for the existence of independent associations between teamwork, clinician occupational well-being and patient safety. However, we identified several conceptual and methodological limitations. The main barrier to advancing our understanding of the causal relationships between teamwork, clinician well-being and patient safety is the lack of an integrative, theory-based, and methodologically thorough approach investigating the three concepts simultaneously and longitudinally. Based on psychological theory and our findings, we developed an integrative framework that addresses these limitations and proposes mechanisms by which these concepts might be linked. Knowledge about the mechanisms underlying the relationships between these concepts helps to identify avenues for future research, aimed at benefiting clinicians and patients by using the synergies between teamwork, clinician occupational well-being and patient safety.
Environmental statistics and optimal regulation.
Sivak, David A; Thomson, Matt
2014-09-01
Any organism is embedded in an environment that changes over time. The timescale for and statistics of environmental change, the precision with which the organism can detect its environment, and the costs and benefits of particular protein expression levels all will affect the suitability of different strategies, such as constitutive expression or graded response, for regulating protein levels in response to environmental inputs. We propose a general framework, here specifically applied to the enzymatic regulation of metabolism in response to changing concentrations of a basic nutrient, to predict the optimal regulatory strategy given the statistics of fluctuations in the environment and measurement apparatus, respectively, and the costs associated with enzyme production. We use this framework to address three fundamental questions: (i) when a cell should prefer thresholding to a graded response; (ii) when there is a fitness advantage to implementing a Bayesian decision rule; and (iii) when retaining memory of the past provides a selective advantage. We specifically find that: (i) relative convexity of enzyme expression cost and benefit influences the fitness of thresholding or graded responses; (ii) intermediate levels of measurement uncertainty call for a sophisticated Bayesian decision rule; and (iii) in dynamic contexts, intermediate levels of uncertainty call for retaining memory of the past. Statistical properties of the environment, such as variability and correlation times, set optimal biochemical parameters, such as thresholds and decay rates in signaling pathways. Our framework provides a theoretical basis for interpreting molecular signal processing algorithms and a classification scheme that organizes known regulatory strategies and may help conceptualize heretofore unknown ones.
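A toy version of the decision problem helps make the thresholding-versus-graded contrast concrete: a cell observes a noisy nutrient signal, forms a Bayesian posterior over whether the nutrient is abundant, and sets an enzyme expression level. All priors, noise levels, and cost/benefit values below are invented for illustration and are not the paper's parameterization.

```python
import numpy as np

prior_high = 0.5                         # prior P(nutrient abundant)
mu_low, mu_high, sigma = 1.0, 3.0, 1.0   # Gaussian measurement model per state
cost_of_expression = 0.3                 # fitness cost of full enzyme expression
benefit_if_high = 1.0                    # benefit of full expression when abundant

def posterior_high(measurement):
    """Bayes rule for P(nutrient abundant | noisy measurement)."""
    like_high = np.exp(-0.5 * ((measurement - mu_high) / sigma) ** 2)
    like_low = np.exp(-0.5 * ((measurement - mu_low) / sigma) ** 2)
    return like_high * prior_high / (like_high * prior_high + like_low * (1 - prior_high))

def graded_expression(measurement):
    """Graded response: expression level tracks the posterior probability."""
    return posterior_high(measurement)

def threshold_expression(measurement):
    """Thresholding: express fully only when expected benefit exceeds the cost."""
    return 1.0 if posterior_high(measurement) * benefit_if_high > cost_of_expression else 0.0

for m in (0.5, 2.0, 3.5):
    print(m, round(float(posterior_high(m)), 2),
          round(float(graded_expression(m)), 2), threshold_expression(m))
```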
Booth, Brian G; Keijsers, Noël L W; Sijbers, Jan; Huysmans, Toon
2018-05-03
Pedobarography produces large sets of plantar pressure samples that are routinely subsampled (e.g. using regions of interest) or aggregated (e.g. center of pressure trajectories, peak pressure images) in order to simplify statistical analysis and provide intuitive clinical measures. We hypothesize that these data reductions discard gait information that can be used to differentiate between groups or conditions. To test the hypothesis of null information loss, we created an implementation of statistical parametric mapping (SPM) for dynamic plantar pressure datasets (i.e. plantar pressure videos). Our SPM software framework brings all plantar pressure videos into anatomical and temporal correspondence, then performs statistical tests at each sampling location in space and time. Novelly, we introduce non-linear temporal registration into the framework in order to normalize for timing differences within the stance phase. We refer to our software framework as STAPP: spatiotemporal analysis of plantar pressure measurements. Using STAPP, we tested our hypothesis on plantar pressure videos from 33 healthy subjects walking at different speeds. As walking speed increased, STAPP was able to identify significant decreases in plantar pressure at mid-stance from the heel through the lateral forefoot. The extent of these plantar pressure decreases has not previously been observed using existing plantar pressure analysis techniques. We therefore conclude that the subsampling of plantar pressure videos - a task which led to the discarding of gait information in our study - can be avoided using STAPP. Copyright © 2018 Elsevier B.V. All rights reserved.
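The core statistical step described above, running a test at every registered sampling location in space and time, can be sketched with a paired t-test per (x, y, t) sample and a crude Bonferroni correction. The synthetic "slow" and "fast" pressure videos, array sizes, and injected mid-stance effect below are placeholders, and the spatial and temporal registration that STAPP performs is assumed to have been done already.

```python
import numpy as np
from scipy import stats

# Synthetic, already-registered plantar pressure "videos" for 33 subjects at two
# walking speeds; a mid-stance pressure decrease is injected into the fast trials.
rng = np.random.default_rng(3)
n_subj, H, W, T = 33, 16, 8, 20
slow = rng.normal(100, 15, size=(n_subj, H, W, T))
fast = slow.copy()
fast[:, 4:10, 2:6, 8:14] -= 20                    # injected effect (arbitrary units)
fast += rng.normal(0, 15, size=fast.shape)        # extra within-subject noise

# SPM-style mass testing: one paired t-test per spatiotemporal sample
t_map, p_map = stats.ttest_rel(fast, slow, axis=0)
alpha = 0.05 / (H * W * T)                        # crude Bonferroni correction
print("significant samples:", int(np.sum(p_map < alpha)))
```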
Detecting Statistically Significant Communities of Triangle Motifs in Undirected Networks
2016-04-26
Final report, covering 15 Oct 2014 to 14 Jan 2015. The work extends Perry et al. [6] by developing a statistical framework that supports the detection of triangle motif-based clusters in complex networks, and by developing an algorithm for clustering undirected networks based on the triangle configuration.
RooStatsCms: A tool for analysis modelling, combination and statistical studies
NASA Astrophysics Data System (ADS)
Piparo, D.; Schott, G.; Quast, G.
2010-04-01
RooStatsCms is an object oriented statistical framework based on the RooFit technology. Its scope is to allow the modelling, statistical analysis and combination of multiple search channels for new phenomena in High Energy Physics. It provides a variety of methods described in literature implemented as classes, whose design is oriented to the execution of multiple CPU intensive jobs on batch systems or on the Grid.
Exploring Tree Age & Diameter to Illustrate Sample Design & Inference in Observational Ecology
ERIC Educational Resources Information Center
Casady, Grant M.
2015-01-01
Undergraduate biology labs often explore the techniques of data collection but neglect the statistical framework necessary to express findings. Students can be confused about how to use their statistical knowledge to address specific biological questions. Growth in the area of observational ecology requires that students gain experience in…
2011-12-30
the term " superresolution "). The single-phase matched field statistic for a given template was also demonstrated to be a viable detection statistic... Superresolution with seismic arrays using empirical matched field processing, Geophys. J. Int. 182: 1455–1477. Kim, K.-H. and Park, Y. (2010): The 20
A Data Warehouse Architecture for DoD Healthcare Performance Measurements.
1999-09-01
Within the DoD healthcare framework, this thesis defines a methodology to design, develop, implement, and apply statistical analysis and data mining tools to a Data Warehouse of healthcare metrics.
Measuring the Impacts of ICT Using Official Statistics. OECD Digital Economy Papers, No. 136
ERIC Educational Resources Information Center
Roberts, Sheridan
2008-01-01
This paper describes the findings of an OECD project examining ICT impact measurement and analyses based on official statistics. Both economic and social impacts are covered and some results are presented. It attempts to place ICT impacts measurement into an Information Society conceptual framework, provides some suggestions for standardising…
Learning Axes and Bridging Tools in a Technology-Based Design for Statistics
ERIC Educational Resources Information Center
Abrahamson, Dor; Wilensky, Uri
2007-01-01
We introduce a design-based research framework, "learning axes and bridging tools," and demonstrate its application in the preparation and study of an implementation of a middle-school experimental computer-based unit on probability and statistics, "ProbLab" (Probability Laboratory, Abrahamson and Wilensky 2002 [Abrahamson, D., & Wilensky, U.…
Information Distribution Practices of Federal Statistical Agencies: The Census Bureau Example.
ERIC Educational Resources Information Center
Gey, Frederick C.
1993-01-01
Describes the current and historical distribution channels of the U.S. Bureau of the Census within a framework of distribution policies and practices for federal statistical information. The issues of reasonable distribution policies and the impact of technological change are discussed, and guidelines are offered. (Contains 26 references.) (EAM)
Complexity Framework for Sustainability: An Analysis of Five Papers
ERIC Educational Resources Information Center
Putnik, Goran D.
2009-01-01
Purpose: The purpose of this paper is to present an examination of the concepts and mechanisms of complexity and learning usability and applicability for management in turbulent environments as well as their examination through the Chaordic system thinking (CST) lenses and framework. Contributing to awareness of how different mechanisms could be…
Computational rationality: linking mechanism and behavior through bounded utility maximization.
Lewis, Richard L; Howes, Andrew; Singh, Satinder
2014-04-01
We propose a framework for including information-processing bounds in rational analyses. It is an application of bounded optimality (Russell & Subramanian, 1995) to the challenges of developing theories of mechanism and behavior. The framework is based on the idea that behaviors are generated by cognitive mechanisms that are adapted to the structure of not only the environment but also the mind and brain itself. We call the framework computational rationality to emphasize the incorporation of computational mechanism into the definition of rational action. Theories are specified as optimal program problems, defined by an adaptation environment, a bounded machine, and a utility function. Such theories yield different classes of explanation, depending on the extent to which they emphasize adaptation to bounds, and adaptation to some ecology that differs from the immediate local environment. We illustrate this variation with examples from three domains: visual attention in a linguistic task, manual response ordering, and reasoning. We explore the relation of this framework to existing "levels" approaches to explanation, and to other optimality-based modeling approaches. Copyright © 2014 Cognitive Science Society, Inc.
Working toward integrated models of alpine plant distribution
Carlson, Bradley Z.; Randin, Christophe F.; Boulangeat, Isabelle; Lavergne, Sébastien; Thuiller, Wilfried; Choler, Philippe
2014-01-01
Species distribution models (SDMs) have been frequently employed to forecast the response of alpine plants to global changes. Efforts to model alpine plant distribution have thus far been primarily based on a correlative approach, in which ecological processes are implicitly addressed through a statistical relationship between observed species occurrences and environmental predictors. Recent evidence, however, highlights the shortcomings of correlative SDMs, especially in alpine landscapes where plant species tend to be decoupled from atmospheric conditions in micro-topographic habitats and are particularly exposed to geomorphic disturbances. While alpine plants respond to the same limiting factors as plants found at lower elevations, alpine environments impose a particular set of scale-dependent and hierarchical drivers that shape the realized niche of species and that require explicit consideration in a modelling context. Several recent studies in the European Alps have successfully integrated both correlative and process-based elements into distribution models of alpine plants, but for the time being a single integrative modelling framework that includes all key drivers remains elusive. As a first step in working toward a comprehensive integrated model applicable to alpine plant communities, we propose a conceptual framework that structures the primary mechanisms affecting alpine plant distributions. We group processes into four categories, including multi-scalar abiotic drivers, gradient dependent species interactions, dispersal and spatial–temporal plant responses to disturbance. Finally, we propose a methodological framework aimed at developing an integrated model to better predict alpine plant distribution. PMID:24790594
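A minimal correlative SDM of the kind discussed above is simply a regression of presence/absence on environmental predictors. The sketch below fits a logistic-regression SDM to synthetic data for a hypothetical alpine plant; the two predictors, coefficients, and sample size are invented, and none of the process-based ingredients proposed in the framework are included.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(6)
n = 500
temperature = rng.normal(2.0, 3.0, n)        # mean summer temperature (deg C)
snow_days = rng.normal(180, 40, n)           # snow-cover duration (days)

# Synthetic "true" occurrence process used only to generate presence/absence data
logit = -1.0 + 0.8 * temperature - 0.02 * (snow_days - 180)
presence = rng.random(n) < 1 / (1 + np.exp(-logit))

# Correlative SDM: logistic regression of occurrence on the environmental predictors
X = np.column_stack([temperature, snow_days])
sdm = LogisticRegression().fit(X, presence)
print("coefficients:", sdm.coef_)
print("suitability at (4 degC, 150 d):", sdm.predict_proba([[4.0, 150.0]])[0, 1].round(2))
```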
Lim, Hojun; Abdeljawad, Fadi; Owen, Steven J.; ...
2016-04-25
Here, the mechanical properties of materials systems are highly influenced by various features at the microstructural level. The ability to capture these heterogeneities and incorporate them into continuum-scale frameworks of the deformation behavior is considered a key step in the development of complex non-local models of failure. In this study, we present a modeling framework that incorporates physically-based realizations of polycrystalline aggregates from a phase field (PF) model into a crystal plasticity finite element (CP-FE) framework. Simulated annealing via the PF model yields ensembles of materials microstructures with various grain sizes and shapes. With the aid of a novel FE meshing technique, FE discretizations of these microstructures are generated, where several key features, such as conformity to interfaces, and triple junction angles, are preserved. The discretizations are then used in the CP-FE framework to simulate the mechanical response of polycrystalline α-iron. It is shown that the conformal discretization across interfaces reduces artificial stress localization commonly observed in non-conformal FE discretizations. The work presented herein is a first step towards incorporating physically-based microstructures in lieu of the overly simplified representations that are commonly used. In broader terms, the proposed framework provides future avenues to explore bridging models of materials processes, e.g. additive manufacturing and microstructure evolution of multi-phase multi-component systems, into continuum-scale frameworks of the mechanical properties.
Wu, Zheyang; Zhao, Hongyu
2013-01-01
For more fruitful discoveries of genetic variants associated with diseases in genome-wide association studies, it is important to know whether joint analysis of multiple markers is more powerful than the commonly used single-marker analysis, especially in the presence of gene-gene interactions. This article provides a statistical framework to rigorously address this question through analytical power calculations for common model search strategies to detect binary trait loci: marginal search, exhaustive search, forward search, and two-stage screening search. Our approach incorporates linkage disequilibrium, random genotypes, and correlations among score test statistics of logistic regressions. We derive analytical results under two power definitions: the power of finding all the associated markers and the power of finding at least one associated marker. We also consider two types of error controls: the discovery number control and the Bonferroni type I error rate control. After demonstrating the accuracy of our analytical results by simulations, we apply them to consider a broad genetic model space to investigate the relative performances of different model search strategies. Our analytical study provides rapid computation as well as insights into the statistical mechanism of capturing genetic signals under different genetic models including gene-gene interactions. Even though we focus on genetic association analysis, our results on the power of model selection procedures are clearly very general and applicable to other studies. PMID:23956610
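The single-marker piece of such power calculations can be written down in closed form: with a 1-degree-of-freedom score (chi-square) test per marker and Bonferroni control over the genome, per-marker power follows from a non-central chi-square tail probability. The sketch below also combines per-marker power into the two power definitions mentioned above under the simplifying assumption of independent causal markers; the non-centrality and marker counts are illustrative, not values from the article.

```python
from scipy import stats

def search_power(ncp, n_markers, n_causal, alpha=0.05):
    """Power of a 1-df score (chi-square) test per causal marker under Bonferroni
    family-wise error control over n_markers tests, combined into the two power
    definitions in the text assuming independent causal markers."""
    crit = stats.chi2.isf(alpha / n_markers, df=1)      # Bonferroni threshold
    per_marker = stats.ncx2.sf(crit, df=1, nc=ncp)      # power for one causal marker
    power_all = per_marker ** n_causal                  # find all associated markers
    power_any = 1 - (1 - per_marker) ** n_causal        # find at least one
    return per_marker, power_all, power_any

# Non-centrality scales roughly as sample size times squared effect; values invented
print(search_power(ncp=30.0, n_markers=500_000, n_causal=5))
```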
Capillary fluctuations of surface steps: An atomistic simulation study for the model Cu(111) system
NASA Astrophysics Data System (ADS)
Freitas, Rodrigo; Frolov, Timofey; Asta, Mark
2017-10-01
Molecular dynamics (MD) simulations are employed to investigate the capillary fluctuations of steps on the surface of a model metal system. The fluctuation spectrum, characterized by the wave number (k) dependence of the mean squared capillary-wave amplitudes and associated relaxation times, is calculated for ⟨110⟩ and ⟨112⟩ steps on the {111} surface of elemental copper near the melting temperature of the classical potential model considered. Step stiffnesses are derived from the MD results, yielding values from the largest system sizes of (37 ± 1) meV/Å for the different line orientations, implying that the stiffness is isotropic within the statistical precision of the calculations. The fluctuation lifetimes are found to vary by approximately four orders of magnitude over the range of wave numbers investigated, displaying a k dependence consistent with kinetics governed by step-edge mediated diffusion. The values for step stiffness derived from these simulations are compared to step free energies for the same system and temperature obtained in a recent MD-based thermodynamic-integration (TI) study [Freitas, Frolov, and Asta, Phys. Rev. B 95, 155444 (2017), 10.1103/PhysRevB.95.155444]. Results from the capillary-fluctuation analysis and TI calculations yield statistically significant differences that are discussed within the framework of statistical-mechanical theories for configurational contributions to step free energies.
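The stiffness extraction described above rests on the equipartition result for capillary waves, <|A(k)|^2> = kB*T / (L*stiffness*k^2) for each Fourier mode of the step profile. The sketch below fits a stiffness to a synthetic fluctuation spectrum generated with that relation; the temperature, step length, number of modes, and frame count are stand-ins for the MD data rather than values from the study.

```python
import numpy as np

rng = np.random.default_rng(4)
kB = 8.617e-5            # Boltzmann constant, eV/K
T = 1300.0               # K, illustrative temperature near a model melting point
L = 200.0                # step length, angstrom
true_stiffness = 0.037   # eV/angstrom, of the order reported above

k = 2 * np.pi * np.arange(1, 20) / L
var_expected = kB * T / (L * true_stiffness * k**2)          # equipartition spectrum
# finite sampling: |A(k)|^2 of a Gaussian mode is exponentially distributed
n_frames = 500
var_measured = np.array([rng.exponential(v, n_frames).mean() for v in var_expected])

# least-squares estimate of the plateau <|A(k)|^2> * k^2 = kB*T / (L * stiffness)
plateau = np.mean(var_measured * k**2)
stiffness_fit = kB * T / (L * plateau)
print(f"fitted stiffness: {stiffness_fit * 1000:.1f} meV/angstrom")
```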
Johnson, Kari; Fleury, Julie; McClain, Darya
2018-08-01
To evaluate music listening for delirium prevention among patients admitted to a Trauma Intensive Care and Trauma Orthopaedic Unit. The Roy Adaptation Model provided the theoretical framework, focusing on modifying contextual stimuli. Randomised controlled trial of 40 patients aged 55 and older. Participants were randomly assigned to receive music listening or usual care for 60 minutes, twice a day, over three days. The intervention used pre-recorded self-selected music delivered via an iPod and headsets, with slow tempo, low pitch, and simple repetitive rhythms to alter physiologic responses. Outcome measures were heart rate, respiratory rate, systolic and diastolic blood pressure, and the confusion assessment method. Repeated measures ANOVA suggested statistically significant differences in heart rate pre/post music listening, F(4, 134) = 4.75, p = .001, and in systolic blood pressure pre/post music listening, F(1, 37) = 10.44, p = .003. Post-hoc analysis identified statistically significant changes in heart rate at three time periods (p = .010, p = .005, and p = .039) and a statistically significant change in systolic blood pressure pre/post music listening (p = .001). All participants screened negative for delirium. Music addresses pathophysiologic mechanisms that contribute to delirium: neurotransmitter imbalance, inflammation and acute physiologic stressors. This study of music to prevent delirium is one of few providing support for its use in a critical care setting. Copyright © 2018 Elsevier Ltd. All rights reserved.
Improving information retrieval in functional analysis.
Rodriguez, Juan C; González, Germán A; Fresno, Cristóbal; Llera, Andrea S; Fernández, Elmer A
2016-12-01
Transcriptome analysis is essential to understand the mechanisms regulating key biological processes and functions. The first step usually consists of identifying candidate genes; to find out which pathways are affected by those genes, however, functional analysis (FA) is mandatory. The most frequently used strategies for this purpose are Gene Set and Singular Enrichment Analysis (GSEA and SEA) over Gene Ontology. Several statistical methods have been developed and compared in terms of computational efficiency and/or statistical appropriateness. However, whether their results are similar or complementary, the sensitivity to parameter settings, or possible bias in the analyzed terms has not been addressed so far. Here, two GSEA and four SEA methods and their parameter combinations were evaluated in six datasets by comparing two breast cancer subtypes with well-known differences in genetic background and patient outcomes. We show that GSEA and SEA lead to different results depending on the chosen statistic, model and/or parameters. Both approaches provide complementary results from a biological perspective. Hence, an Integrative Functional Analysis (IFA) tool is proposed to improve information retrieval in FA. It provides a common gene expression analytic framework that grants a comprehensive and coherent analysis. Only a minimal user parameter setting is required, since the best SEA/GSEA alternatives are integrated. IFA utility was demonstrated by evaluating four prostate cancer and the TCGA breast cancer microarray datasets, which showed its biological generalization capabilities. Copyright © 2016 Elsevier Ltd. All rights reserved.
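A minimal example of the SEA side of the comparison is the over-representation (hypergeometric) test for a single term. The sketch below computes that p-value for made-up counts; it illustrates one common choice of enrichment statistic and is not the IFA tool or its integrated parameter settings.

```python
from scipy import stats

def sea_enrichment_pvalue(n_genes_total, n_in_term, n_selected, n_overlap):
    """Singular enrichment analysis (SEA) style over-representation test:
    hypergeometric P(X >= n_overlap) for one annotation term, given a candidate
    gene list drawn from a background of n_genes_total genes."""
    return stats.hypergeom.sf(n_overlap - 1, n_genes_total, n_in_term, n_selected)

# Example with invented counts: 20000 background genes, 150 annotated to the term,
# 300 candidate genes, 12 of which carry the annotation
print(sea_enrichment_pvalue(20000, 150, 300, 12))
```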
NASA Astrophysics Data System (ADS)
Shimizu, K.; Yagi, Y.; Okuwaki, R.; Kasahara, A.
2017-12-01
Kinematic earthquake rupture models are useful for deriving statistics and scaling properties of large and great earthquakes. However, the kinematic rupture models for the same earthquake are often different from one another. Such sensitivity of the modeling prevents us from understanding the statistics and scaling properties of earthquakes. Yagi and Fukahata (2011) introduce the uncertainty of Green's function into the tele-seismic waveform inversion and show that a stable spatiotemporal distribution of slip-rate can be obtained by using an empirical Bayesian scheme. One of the unsolved problems in the inversion arises from the modeling error originating from uncertainty in the fault-model setting. Green's function near the nodal plane of the focal mechanism is known to be sensitive to slight changes in the assumed fault geometry, and thus the spatiotemporal distribution of slip-rate can be distorted by the modeling error originating from the uncertainty of the fault model. We propose a new method accounting for the complexity of the fault geometry by additionally solving for the focal mechanism on each space knot. Since the solution of a finite source inversion becomes unstable with increasing flexibility of the model, we estimate a stable spatiotemporal distribution of focal mechanisms in the framework of Yagi and Fukahata (2011). We applied the proposed method to 52 tele-seismic P-waveforms of the 2013 Balochistan, Pakistan earthquake. The inverted-potency distribution shows unilateral rupture propagation toward the southwest of the epicenter, and the spatial variation of the focal mechanisms shares the same pattern as the fault curvature along the tectonic fabric. On the other hand, the broad pattern of the rupture process, including the direction of rupture propagation, cannot be reproduced by an inversion analysis under the assumption that the faulting occurred on a single flat plane. These results show that the modeling error caused by simplifying the fault model is non-negligible in the tele-seismic waveform inversion of the 2013 Balochistan, Pakistan earthquake.
ERIC Educational Resources Information Center
Cole, Patricia Ann
2013-01-01
This sequential explanatory mixed methods study investigated 24 college and university syllabi for content consisting of multicultural education that used the framework for multicultural education devised by James A. Banks (2006). This framework was used to analyze data collected using descriptive statistics for quantitative phase one. The four…
ERIC Educational Resources Information Center
Jovanovic, Aleksandar; Jankovic, Anita; Jovanovic, Snezana Markovic; Peric, Vladan; Vitosevic, Biljana; Pavlovic, Milos
2015-01-01
The paper describes the delivery of the courses in the framework of the project implementation and presents the effect the change in the methodology had on student performance as measured by final grade. Methodology: University of Pristina piloted blended courses in 2013 under the framework of the Tempus BLATT project. The blended learning…
Leeseberg Stamler, L; Cole, M M; Patrick, L J
2001-08-01
Strategies to delay or prevent complications from diabetes include diabetes patient education. Diabetes educators seek to provide education that meets the needs of clients and influences positive health outcomes. The aims were (1) to expand prior research exploring an enablement framework for patient education by examining perceptions of patient education by persons with diabetes and (2) to test the mastery of stress instrument (MSI) as a potential evaluative instrument for patient education. Triangulated data were collected from a convenience sample of adults taking diabetes education classes. Half the sample completed audio-taped semi-structured interviews before, during, and after education, and all completed the MSI after education. Qualitative data were analysed using latent content analysis, and descriptive statistics were computed. Qualitative analysis revealed content categories similar to previous work with prenatal participants, supporting the enablement framework. Statistical analyses noted congruence with psychometric findings from development of the MSI; secondary qualitative analyses revealed congruency between MSI scores and patient perceptions. Mastery is an outcome congruent with the enablement framework for patient education across content areas. The mastery of stress instrument may be an instrument for identifying patients who are coping well with diabetes self-management, as well as those who are not and who require further nursing interventions.
Integrated framework for developing search and discrimination metrics
NASA Astrophysics Data System (ADS)
Copeland, Anthony C.; Trivedi, Mohan M.
1997-06-01
This paper presents an experimental framework for evaluating target signature metrics as models of human visual search and discrimination. This framework is based on a prototype eye tracking testbed, the Integrated Testbed for Eye Movement Studies (ITEMS). ITEMS determines an observer's visual fixation point while he studies a displayed image scene, by processing video of the observer's eye. The utility of this framework is illustrated with an experiment using gray-scale images of outdoor scenes that contain randomly placed targets. Each target is a square region of a specific size containing pixel values from another image of an outdoor scene. The real-world analogy of this experiment is that of a military observer looking upon the sensed image of a static scene to find camouflaged enemy targets that are reported to be in the area. ITEMS provides the data necessary to compute various statistics for each target to describe how easily the observers located it, including the likelihood the target was fixated or identified and the time required to do so. The computed values of several target signature metrics are compared to these statistics, and a second-order metric based on a model of image texture was found to be the most highly correlated.
A Stochastic Framework for Evaluating Seizure Prediction Algorithms Using Hidden Markov Models
Wong, Stephen; Gardner, Andrew B.; Krieger, Abba M.; Litt, Brian
2007-01-01
Responsive, implantable stimulation devices to treat epilepsy are now in clinical trials. New evidence suggests that these devices may be more effective when they deliver therapy before seizure onset. Despite years of effort, prospective seizure prediction, which could improve device performance, remains elusive. In large part, this is explained by lack of agreement on a statistical framework for modeling seizure generation and a method for validating algorithm performance. We present a novel stochastic framework based on a three-state hidden Markov model (HMM) (representing interictal, preictal, and seizure states) with the feature that periods of increased seizure probability can transition back to the interictal state. This notion reflects clinical experience and may enhance interpretation of published seizure prediction studies. Our model accommodates clipped EEG segments and formalizes intuitive notions regarding statistical validation. We derive equations for type I and type II errors as a function of the number of seizures, duration of interictal data, and prediction horizon length and we demonstrate the model’s utility with a novel seizure detection algorithm that appeared to predict seizure onset. We propose this framework as a vital tool for designing and validating prediction algorithms and for facilitating collaborative research in this area. PMID:17021032
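The three-state structure described above, with the preictal state allowed to relapse to the interictal state, is easy to prototype. The sketch below simulates such an HMM with a one-dimensional Gaussian "EEG feature" per state and runs the forward algorithm to get filtered state probabilities; the transition matrix, emission means, and threshold are illustrative assumptions, not fitted clinical values.

```python
import numpy as np

# Toy 3-state HMM: interictal = 0, preictal = 1, seizure = 2.
# The preictal state can fall back to interictal, as in the framework above.
A = np.array([[0.990, 0.010, 0.000],
              [0.050, 0.930, 0.020],
              [0.400, 0.000, 0.600]])
means, sd = np.array([0.0, 1.0, 3.0]), 1.0   # Gaussian "EEG feature" per state

rng = np.random.default_rng(5)
T, states = 300, [0]
for _ in range(T - 1):
    states.append(rng.choice(3, p=A[states[-1]]))
obs = rng.normal(means[np.array(states)], sd)

# Forward algorithm: filtered posterior P(state_t | obs_1..t), started in interictal
alpha = np.zeros((T, 3))
alpha[0] = [1.0, 0.0, 0.0]
for t in range(1, T):
    emit = np.exp(-0.5 * ((obs[t] - means) / sd) ** 2)
    alpha[t] = emit * (alpha[t - 1] @ A)
    alpha[t] /= alpha[t].sum()

print("time steps flagged preictal:", int(np.sum(alpha[:, 1] > 0.5)))
```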
Baig, Mirza Rustum; Akbar, Jaber Hussain; Qudeimat, Muawia; Omar, Ridwaan
2018-02-15
To evaluate the effects of impression material, impression tray type, and type of partial edentulism (ie, Kennedy class) on the accuracy of fit of cobalt-chromium (Co-Cr) partial removable dental prostheses (PRDP) in terms of the number of fabricated frameworks required until the attainment of adequate fit. Electronic case documentations of 120 partially edentulous patients provided with Co-Cr PRDP treatment for one or both arches were examined. Statistical analyses of data were performed using analysis of variance and Tukey honest significant difference test to compare the relationships between the different factors and the number of frameworks that needed to be fabricated for each patient (α = .05). Statistical analysis of data derived from 143 records (69 maxillary and 74 mandibular) revealed no significant correlation between impression material, tray type, or Kennedy class and the number of construction attempts for the pooled or individual arch data (P ≥ .05). In PRDP treatment, alginate can be chosen as a first-choice material, and metal stock trays can be a preferred option for making final impressions to fabricate Co-Cr frameworks.
NASA Astrophysics Data System (ADS)
Lee, K. David; Wiesenfeld, Eric; Gelfand, Andrew
2007-04-01
One of the greatest challenges in modern combat is maintaining a high level of timely Situational Awareness (SA). In many situations, computational complexity and accuracy considerations make the development and deployment of real-time, high-level inference tools very difficult. An innovative hybrid framework that combines Bayesian inference, in the form of Bayesian Networks, and Possibility Theory, in the form of Fuzzy Logic systems, has recently been introduced to provide a rigorous framework for high-level inference. In previous research, the theoretical basis and benefits of the hybrid approach have been developed. However, lacking is a concrete experimental comparison of the hybrid framework with traditional fusion methods, to demonstrate and quantify this benefit. The goal of this research, therefore, is to provide a statistical analysis on the comparison of the accuracy and performance of hybrid network theory, with pure Bayesian and Fuzzy systems and an inexact Bayesian system approximated using Particle Filtering. To accomplish this task, domain specific models will be developed under these different theoretical approaches and then evaluated, via Monte Carlo Simulation, in comparison to situational ground truth to measure accuracy and fidelity. Following this, a rigorous statistical analysis of the performance results will be performed, to quantify the benefit of hybrid inference to other fusion tools.
Chance and time: Cutting the Gordian knot
NASA Astrophysics Data System (ADS)
Hagar, Amit
One of the recurrent problems in the foundations of physics is to explain why we rarely observe certain phenomena that are allowed by our theories and laws. In thermodynamics, for example, the spontaneous approach towards equilibrium is ubiquitous, yet the time-reversal-invariant laws that presumably govern thermal behaviour at the microscopic level equally allow spontaneous approach away from equilibrium to occur. Why are the former processes frequently observed while the latter are almost never reported? Another example comes from quantum mechanics where the formalism, if considered complete and universally applicable, predicts the existence of macroscopic superpositions---monstrous Schrodinger cats---and these are never observed: while electrons and atoms enjoy the cloudiness of waves, macroscopic objects are always localized to definite positions. A well-known explanatory framework due to Ludwig Boltzmann traces the rarity of "abnormal" thermodynamic phenomena to the scarcity of the initial conditions that lead to them. After all, physical laws are no more than algorithms and these are expected to generate different results according to different initial conditions, hence Boltzmann's insight that violations of thermodynamic laws are possible but highly improbable. Yet Boltzmann introduces probabilities into this explanatory scheme, and since the latter is couched in terms of classical mechanics, these probabilities must be interpreted as a result of ignorance of the exact state the system is in. Quantum mechanics has taught us otherwise. Here the attempts to explain why we never observe macroscopic superpositions have led to different interpretations of the formalism and to different solutions to the quantum measurement problem. These solutions introduce additional interpretations of the meaning of probability over and above ignorance of the definite state of the physical system: quantum probabilities may result from pure chance. Notwithstanding the success of the Boltzmannian framework in explaining the thermodynamic arrow in time, it leaves us with a foundational puzzle: how can ignorance play a role in scientific explanation of objective reality? It turns out that two opposing solutions to the quantum measurement problem in which probabilities arise from the stochastic character of the underlying dynamics may scratch this explanatory itch. By offering a dynamical justification for the probabilities employed in classical statistical mechanics, these two interpretations complete the Boltzmannian explanatory scheme and allow us to exorcize ignorance from scientific explanations of unobserved phenomena. In this thesis I argue that the puzzle of the thermodynamic arrow in time is closely related to the problem of interpreting quantum mechanics, i.e., to the measurement problem. We may solve one by fiat and thus solve the other, but it seems unwise to try solving them independently. I substantiate this claim by presenting two possible interpretations of non-relativistic quantum mechanics. Differing as they do on the meaning of the probabilities they introduce into the otherwise deterministic dynamics, these interpretations offer alternative explanatory schemes to the standard Boltzmannian statistical mechanical explanation of thermodynamic approach to equilibrium. I then show how, notwithstanding their current empirical equivalence, the two approaches diverge at the continental divide between scientific realism and anti-realism.
Chen, Xi; Cui, Qiang; Tang, Yuye; Yoo, Jejoong; Yethiraj, Arun
2008-01-01
A hierarchical simulation framework that integrates information from molecular dynamics (MD) simulations into a continuum model is established to study the mechanical response of mechanosensitive channel of large-conductance (MscL) using the finite element method (FEM). The proposed MD-decorated FEM (MDeFEM) approach is used to explore the detailed gating mechanisms of the MscL in Escherichia coli embedded in a palmitoyloleoylphosphatidylethanolamine lipid bilayer. In Part I of this study, the framework of MDeFEM is established. The transmembrane and cytoplasmic helices are taken to be elastic rods, the loops are modeled as springs, and the lipid bilayer is approximated by a three-layer sheet. The mechanical properties of the continuum components, as well as their interactions, are derived from molecular simulations based on atomic force fields. In addition, analytical closed-form continuum model and elastic network model are established to complement the MDeFEM approach and to capture the most essential features of gating. In Part II of this study, the detailed gating mechanisms of E. coli-MscL under various types of loading are presented and compared with experiments, structural model, and all-atom simulations, as well as the analytical models established in Part I. It is envisioned that such a hierarchical multiscale framework will find great value in the study of a variety of biological processes involving complex mechanical deformations such as muscle contraction and mechanotransduction. PMID:18390626
Special Features of Galactic Dynamics
NASA Astrophysics Data System (ADS)
Efthymiopoulos, Christos; Voglis, Nikos; Kalapotharakos, Constantinos
This is an introductory article to some basic notions and currently open problems of galactic dynamics. The focus is on topics mostly relevant to the so-called `new methods' of celestial mechanics or Hamiltonian dynamics, as applied to the ellipsoidal components of galaxies, i.e., to the elliptical galaxies and to the dark halos and bulges of disk galaxies. Traditional topics such as Jeans theorem, the role of a `third integral' of motion, Nekhoroshev theory, violent relaxation, and the statistical mechanics of collisionless stellar systems are first discussed. The emphasis is on modern extrapolations of these old topics. Recent results from orbital and global dynamical studies of galaxies are then shortly reviewed. The role of various families of orbits in supporting self-consistency, as well as the role of chaos in galaxies, are stressed. A description is then given of the main numerical techniques of integration of the N-body problem in the framework of stellar dynamics and of the results obtained via N-Body experiments. A final topic is the secular evolution and self-organization of galactic systems.
Ionic effects on the temperature-force phase diagram of DNA.
Amnuanpol, Sitichoke
2017-12-01
Double-stranded DNA (dsDNA) undergoes a structural transition to single-stranded DNA (ssDNA) in many biologically important processes such as replication and transcription. This strand separation arises in response either to thermal fluctuations or to external forces. The roles of ions are twofold, shortening the range of the interstrand potential and renormalizing the DNA elastic modulus. The dsDNA-to-ssDNA transition is studied on the basis that dsDNA is regarded as a bound state while ssDNA is regarded as an unbound state. The ground state energy of DNA is obtained by mapping the statistical mechanics problem to the imaginary time quantum mechanics problem. In the temperature-force phase diagram the critical force Fc(T) increases logarithmically with the Na+ concentration in the range from 32 to 110 mM. When this logarithmic dependence of Fc(T) is discussed within the framework of polyelectrolyte theory, it inevitably suggests a constraint on the difference between the interstrand separation and the length per unit charge during the dsDNA-to-ssDNA transition.
NASA Astrophysics Data System (ADS)
Ye, Fei; Marchetti, P. A.; Su, Z. B.; Yu, L.
2017-09-01
The relation between braid and exclusion statistics is examined in one-dimensional systems, within the framework of Chern-Simons statistical transmutation in gauge invariant form with an appropriate dimensional reduction. If the matter action is anomalous, as for chiral fermions, a relation between braid and exclusion statistics can be established explicitly for both mutual and nonmutual cases. However, if it is not anomalous, the exclusion statistics of emergent low energy excitations is not necessarily connected to the braid statistics of the physical charged fields of the system. Finally, we also discuss the bosonization of one-dimensional anyonic systems through T-duality. Dedicated to the memory of Mario Tonin.